Speaker 1: Welcome to Tech Stuff. This is The Story. I'm Oz Woloshyn, here with Karah Preiss.

Speaker 2: Hi.

Speaker 3: Ah, so I would like to show you something and just get your reaction, and I promise you it relates to today's story.

Speaker 2: Here we are: planet Earth. It's impossible not to appreciate the sheer grandeur of our world, and yet there remains one forest unexplored by humans, a forest filled with life, with creatures living in the burrows of the ferns, the branches of the trees, by the flowing streams, and the mossy bogs. I'm David Attenborough's neighbor Dennis, and welcome to a forest filled with little critters.

Speaker 1: So this is a sort of Pixar-style animated film with a parody of David Attenborough narration. The publication date is April tenth, twenty twenty three. So I am a little curious why you want me to watch this film, which is titled Critterz, with a Z.

Speaker 3: Critterz. What you just saw is the beginning of a five-minute animated movie that was created using OpenAI's text-to-image software back in twenty twenty three. Critterz with a Z was created using DALL-E. We're obviously now in twenty twenty five, and OpenAI, in conjunction with a creative agency and film production company, are turning Critterz into a feature-length film. Not only that, it is set to premiere at Cannes in twenty twenty six, so they have a drop-dead deadline.

Speaker 1: The twenty twenty three short is not terribly engaging. In the first forty-three seconds, it feels like a product demo, which of course it is. But what can we expect from the twenty twenty six version? Will this be a moment that has Pixar quaking in their boots?
Speaker 3: Well, you know, if OpenAI and their partners in this can deliver a feature-length hit, the very definition of "studio" could change, you know, especially given their goal, which is to make this movie for less than thirty million dollars, which is nothing really compared to the hundreds of millions it can cost a major studio to make a feature-length animated film. And then there's the timeline that we talked about. You know, OpenAI said they will wrap in nine months, while Pixar would probably take two to four years to make something similar.

Speaker 1: Assuming the film comes out in twenty twenty six and it's well received at Cannes, standing ovation, fifteen minutes, and then it drives real box office. I mean, there are a lot of big assumptions. If all of that happens, and it happens at a fraction of the cost, at a fraction of the time that Pixar would do it in, this would represent kind of a seismic moment for the whole of Hollywood.

Speaker 3: It would represent a seismic moment. I think that it's something that Hollywood definitely should be worried about. But there also are a lot of humans inside this loop. You know, OpenAI products like DALL-E and Sora will be used for production, but humans will be the voice actors, and there are some human artists who are even overseeing the project. I think most importantly, the screenplay is being written by two of the writers from the Paddington franchise.

Speaker 1: How is the industry responding? I mean, obviously generative AI was one of the huge issues of the writers' strike.

Speaker 3: Yeah, and I think there are a lot of people outside Hollywood that assume that generative AI is a no-go in the industry, especially because of how tense the strikes got. But that's what today's episode is about. I actually got to talk to Lila Shapiro, who's a features writer for New York Magazine, and her recent piece was titled "Hollywood Already Uses Generative AI (and Is Hiding It)."
Speaker 3: And in this piece she talks about how generative AI is viewed and used in Hollywood. She actually heard from studio execs, VFX artists, writers, actors, directors, and even up-and-coming generative AI studios hoping to work within the studio system.

Speaker 5: The victories of the strikes were largely about, like, what people can sort of be forced to do. But people are being sort of pressed to do it off the record, informally, by people lower down the production chain, because everyone is pressed for time, and here are some really efficient ways to do things. So it's like, there are these guidelines, but it's limited in how much it's going to really, like, stave off these greater pressures of needing to be more efficient, productive, and deal with tightening budgets across the industry.

Speaker 3: I think it's important to remember that AI already plays a role in Hollywood, and it has for over a decade. Lila told me every studio uses some form of AI. The thing being argued over in Hollywood is, like, generative art, like purely creative tasks, where it can take the artistry out of a job or threaten to replace jobs entirely. But there are a million uses for AI that are just replacing grunt work. I mean, the writers' strike resulted in a contract that says scripts can't be generated by AI, but if a writer wants to use AI, they can use it. And this is how AI is treated in most parts of the industry. Here's Lila again.

Speaker 5: Studios aren't advertising what they're doing with it, because the public response is so negative, and there aren't, like, tons of examples that we know about.
Speaker 5: To give you an example, like, you've looked at what happened with The Brutalist, where the director came under criticism when it emerged that they'd used generative AI to adjust the audio on some of the dialogue, you know. And I think it had to do with adjusting the actors', like, Hungarian accents. And so something like that, I think, is extremely widespread, and it's almost hard to argue with. You're like, well, there's not something sacred about, like, adjusting audio levels in a certain way, and now there's this technology that makes it much easier and faster. If you look at, like, specific little windows, you're going to find everyone is doing it. And why wouldn't they?

Speaker 3: But here's the thing. The only people motivated to scream about it from the rooftops are AI companies like OpenAI, or other creative studios that are dedicated to the use of generative AI. Which means that Lila had to really dig to understand the true role of AI in Hollywood right now, not to mention the industry's collective opinion on it.

Speaker 4: So what you're saying is, we're going to get an exposé today.

Speaker 3: Well, I started my conversation with Lila asking about a swanky Hollywood party she attended. There were a ton of high-powered people in attendance, but so many of the people Lila interviewed there refused to be named. And that's because the party was a launch party for a new generative AI studio. I'll let Lila take it from here.

Speaker 5: So the party was put on by one of these new AI studios. This one is called Asteria, and it had described itself as sort of the Pixar of AI. And part of the reason the studio had some buzz around it was that it was run partly by the actress and showrunner and writer Natasha Lyonne, who is sort of this, like, indie-world darling.
Speaker 5: And the other component of it is that this studio is sort of positioning themselves as the ethical AI studio, in that they are training their model, they say, only on licensed material. So there's this sense that a lot of artists feel, rightly so, that, like, their work is being scraped, they're not being compensated, and now they're being replaced by these models that can do what they did much more efficiently. So Asteria is trying to sort of get around that problem by saying that they're licensing all their footage, and they're building a model with a group of engineers who kind of come out of the AI scene in Silicon Valley. This party that I went to was the launch party for their model, and it's in this big room, and there are these images that are created by their model projected across the walls. Trippy things, of, like, a cloud turning into a man and the man falling into the clouds and turning back into them, or, like, undulating fields and galloping horses.

Speaker 6: Everything is very DALL-E-esque.

Speaker 5: Yes. There was no presentation or anything, so you don't really know how they made it or what the process was of producing those moving images. It was very, like, scene-y. They weren't leading with an explanation of sort of what they were and what they were about and how they were doing these things.

Speaker 3: There was sort of a weird thing that you described in your article, that there were a lot of people who didn't want to give their names. What was that about?

Speaker 5: Right. I think that people still feel that there is a kind of stigma, like, associated with using generative AI. When the article came out, a lot of people were really upset with Natasha, because they're like, this is a betrayal. So people would tell me, like, I think this is going to, like, alienate me from my friends and colleagues if they know that I'm here and that I'm interested and I'm looking into it.
Speaker 5: Even though it's becoming increasingly widespread and it's happening everywhere, people still don't like it.

Speaker 3: I understand that there's still a stigma around the use of AI, and around supporting companies that use AI, even though every company now uses AI.

Speaker 5: Yeah. A lot of what's being done with generative AI is replacing what CGI was, and it's not like there's something that's sacred about CGI, or, like, rotoscoping, where you're, like, very painstakingly cutting out a person's image and moving them into another background. That's not, like, a pleasurable activity. It's not a creative activity. It's like a rote task that has to be done over and over and over and over and over again. If you have a way to do that where it happens much faster, saving you, like, days or weeks of time, and looks the same, essentially, then, like, why wouldn't you want to do that? A lot of the people who are most interested in experimenting with it are, like, CGI and VFX people, because they're, like, tech people who are interested in emerging technologies, and their work itself was once an emerging technology, so it's not like they're, like, dead set against it. But it does have the effect that, no matter which way you sort of slice it, visual effects is now going to be much faster. Which means, from, like, a labor perspective, it takes fewer people to do it, and it takes those people a shorter amount of time. And that even leaves aside the ethical issue of, is this based on stolen work? Which Asteria is trying to get around with their system.

Speaker 3: So one of the studios you spoke to is Runway. Can you tell me a little bit more about them, and, like, what do they do exactly?

Speaker 5: Runway is this company that has designed a model that generates video, and they also, you know, have the ambitions to become a studio themselves. And they've now produced a pilot, and they run these film festivals that are AI film festivals.
Speaker 5: So they're really embedded in Hollywood at this point, and they've gotten a lot further there than the big tech companies have. Around when I started reporting, I think the picture had been that big tech doesn't know how to operate in Hollywood, whereas I think Runway came in and started having meetings, and at this point they're meeting with every studio. They had the first publicly announced partnership with a studio, which was Lionsgate. But they have a model which can do incredible special effects. When I was talking to the Lionsgate executive and other filmmakers who are working with Runway, they talked a lot about how, say you want to have a shot of ten thousand soldiers in the mountains during a snowstorm. To really shoot it, they'd have to go to the Himalayas, and it would take three days and cost many millions of dollars. And then, using Runway, they can just create that for ten thousand dollars. And it can also do, like, much more insignificant things. Like, say you have a shot, but you want an extra foot of image around every part of the scene? We can do that. So it can do a lot of things, with varying degrees of excellence, all of which are much faster than alternative ways of doing those things.

Speaker 3: In your piece, there was one paragraph that really struck me. It's a paraphrase of something a Lionsgate exec told you: with a library as large as Lionsgate's, they could use Runway to repackage and resell what the studio already owned. You know, I already feel like we're in the...

Speaker 6: We have sequelitis.

Speaker 3: We have sequelitis right now in Hollywood, and I think people who love movies, like myself, are often lamenting the fact that everything seems to be something that already existed. I guess my question for you is, do you feel like we are increasingly just pushing into repackaged, repurposed-content territory, that is, Hollywood? Or is that lament a little bit pollyannaish?

Speaker 5: Like, of course, you hear something like that and it's like, that's depressing.
Speaker 5: Now, I'd say the alternate argument to that, which I heard a lot from the people at Asteria, certainly Natasha Lyonne, is that, sure, that's one possibility. Another possibility is that this technology levels the playing field, right? Fresh voices, new people who would otherwise be unable to get a foothold in Hollywood, will be in a position to make films. And so there are certainly people who are arguing that this can lead to a new golden era of independent filmmaking. I don't think that's impossible, but the worst-case scenario doesn't seem that improbable either. Which is that, like, it'll just be all slop from here on out. The slop will just get more and more and more sloppy, like, because we're so focused on, like, churning out content. And if someone can just click a button and star in, like, seven seasons of a Game of Thrones knockoff, they're going to want to do that. What are the incentives for making quality content? But at the same time, like, I don't think it's all or nothing. I'm sure it'll just continue to be kind of both.

Speaker 3: It's a matter of whether people who care about movies, and the art form of movies, will invite these kinds of movies into their lives or worlds.

Speaker 5: Yeah. One thing I thought was really interesting in response to the piece is, at the end of the piece, Natasha talks about how her neighbor was David Lynch before he died. And she related to me, like, this conversation she had with him, asking him his thoughts, and his response was to pick up a pencil and say, this is AI. Basically, like, it's a tool. What matters is how you use it. And then all these people online were like, how dare she drag David Lynch's name through the mud like this?
Speaker 5: "He would never have used AI." But I thought it was actually very probable that he would have said that, because he was a constantly evolving filmmaker who did use many tools that were, at the time that he began using them, controversial, and something that others were saying would lead to the death of art and the death of filmmaking. So I do think it's true that it's just a tool, and that, like, of course people will make interesting things with it. And there are interesting artists right now who are experimenting with it, and we haven't seen their work yet, but I think once those projects start to come out, there will be a shift in attitude. I think Netflix's Ted Sarandos said this, like, he thinks that it's not just going to make movies much cheaper to make, but also ten percent better. And the reason it would make movies ten percent better is because, like, oh, we'll be more freed up to spend more time on the creativity. And to me, it's like, well, that's certainly not an inevitable outcome. It could easily be just that it's faster, and now you're expected to just churn out more and more and more and more, and you're never giving that time back to invest in the creativity.

Speaker 3: After the break: will AI ever completely replace human filmmakers? Stay with us.

Speaker 3: I want to pivot a little bit here to the thing that people are focused on most in any conversation about AI, which is jobs. It's no secret that creative workers in Hollywood are struggling, whether it be writers, actors... I don't know, it seems like there's a lot on the line right now. What were people saying to you, in your piece and in your reporting, about the future of the, quote unquote, Hollywood workforce?

Speaker 5: Yeah, I mean, I think there's no question that there has already been job loss, and that there will probably be more job loss. It's very hard, though, to really pin down the reasons why.
Speaker 5: When I was talking to unions in Hollywood for the piece, no one was sort of able or willing to say, like, our members lost this many jobs because of AI. There are also problems like, the box office is down and budgets are tighter. But all of that said, I think there are some specific areas where it's pretty clear that, like, AI can do as good a job as a person, which is not everywhere by any means. But, for example, the junior studio executive: a big part of that person's job was to write script analyses, and that's something that generative AI can do a really good job of, and it's not really a creative enterprise exactly. When the studio executive talked to me about it, that was how she came up, writing script summaries for her bosses, and now she doesn't need an assistant to do that for her. But at the same time, she was like, but I am worried about what is the next generation of studio executives. How are they going to learn when this core function of their job is no longer necessary? And so, who am I training now? No one.

Speaker 3: Is it fair to fear that AI will take over filmmaking altogether?

Speaker 5: I think that's a reasonable thing to be somewhat afraid of, although I do think that it's a pretty complicated question, actually. Because, do I think that people will lose the urge to create art? No. But it could be that we reach a time where there's no economic incentive to do it, because most people spend their time scrolling TikTok, and there are no more, like, eyeballs that want to engage in serious art. And I mean, that's kind of true for not just Hollywood, but also publishing, writing, podcasting, everything. Is there going to be money to keep making original work? But I don't think that we're in a place where AI could produce a good movie without human involvement.
Speaker 5: And I don't really know that we will ever be in a place where AI could make a good movie without human involvement. A core feature of AI is that it has no taste. I think sometimes, when you have these conversations with people who are, like, thinking about this vision where, like, AI is replacing everything, it's predicated on the idea that there's going to continue to be, like, an exponential leap forward in what it can do. And, like, right now, like, AI can't write a good script, for instance, and nobody thinks it can. I mean, that's part of why the writers' victory during the strikes wasn't that meaningful, because right now it really can't create a good script.

Speaker 6: But it could in the future.

Speaker 5: Maybe. It seems like a real question mark. Meaningful artwork has to reflect human choice. So you can imagine someone using ChatGPT as part of their writing process, but it would still have to be led by a person. Because I don't think that AI can produce meaningful work. It doesn't have any motivation. It doesn't have any desire to express anything.

Speaker 3: Right. And that's why taste seems to be the final frontier that would need to be entirely disrupted, which is, from where I stand, seemingly impossible.

Speaker 5: Yeah, it's just, like, a different order of business entirely. A lot of people that I talk to in AI don't think that that is going to happen. They more think that it will, in the realm of Hollywood, say, get better and better and better at doing these specific technical functions, which is so different than, give me seven seasons of whatever TV show.

Speaker 3: Did you get a sense at all, in reporting this, of, like, how lay people feel about more and more AI content coming out of Hollywood?

Speaker 5: I mean, most people, I don't think, feel great about it.
Speaker 5: There is, like, I think, a shift since the writers' strikes of just more and more acceptance, partly also because ChatGPT is so widely used now, and so more and more people are having some form of interaction with it and being like, this can be helpful; oh, I could do this thing, that would be helpful. So I think a big message right now is, like, if you want to be employed, then you have to use AI. And that's true for, like, every industry right now, not just Hollywood. It's like, if you take away the environmental problems, if you take away the theft of the world's artwork, and set those issues aside for a second...

Speaker 6: Those two small factors.

Speaker 5: Those two small factors. Then I do think that it is a tool, and there's nothing inherently frightening about it, and I think there are certain areas where it's just here to stay. So it's not that useful to be like, that's bad, because it's just the nature of the world right now. So I think some of the reactions to it don't make a ton of sense. Like, I don't think that using AI currently curdles the work, because I think there are tons of ways to use it that the viewer will never even be aware of. And I think that there is an increasing number of people lower down in the industry who come to their producers and are like, here's a good solution to this problem we've been dealing with; should we try it? And I think that is something that I've heard is increasingly common.

Speaker 6: Well, thank you so much, Lila. This was a great conversation.

Speaker 5: Thank you so much for having me.

Speaker 4: For Tech Stuff, I'm Karah Preiss.

Speaker 3: And I'm Oz Woloshyn. This episode was produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced by me, Oz Woloshyn, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song.
Speaker 4: Join us on Friday for The Week in Tech.

Speaker 1: Karah and I will run through the tech headlines you may have missed. And please do rate and review the show wherever you listen to your podcasts, and reach out to us at Tech Stuff podcast at gmail dot com.