Speaker 1: Welcome to another episode of Strictly Business, the podcast in which we speak with some of the brightest minds working in the media business today. I'm Andrew Wallenstein with Variety. The handwringing over what generative AI could mean for Hollywood is nothing new, certainly not since last year's strikes, but concerns have ratcheted up to a whole new level in recent weeks when OpenAI gave the world a glimpse of its upcoming text-to-video tool Sora. At first, the assessments that we were witnessing the end of the entertainment industry as we know it were limited to social media, but then last week mogul Tyler Perry upped the ante by publicly declaring Sora had prompted him to stop construction on an eight-hundred-million-dollar expansion of his studio in Atlanta, and he beseeched Congress to save the industry. So what's going on here? Overreaction, or is it even possibly underestimation? To help me make sense of it all, I've enlisted Steven Zeitchik, who has been closely tracking the intersection of AI and entertainment at his buzzworthy Substack, Mind and Iron. We will be back with him after these messages.
Speaker 1: And we are back with Steve Zeitchik, formerly of The Washington Post, The Los Angeles Times, and other august publications, where he has written extensively about media and tech, most recently on his Substack, Mind and Iron, which is available every Thursday. You can check it out at mindandiron dot com. I'm also lucky to call him a close friend for more than two decades. But this is our very first podcast together, and I can think of no better subject to finally bring us together than one that some say represents the dawn of a new era in entertainment, with profound implications. So Steve, I want to hear what you have to say. No pressure, but thanks for stopping by.
Speaker 2: Andy.
Speaker 3: It's great to be here, as you say, after all these decades. Of course, we did meet when we were four years old, so two decades, you know, doesn't take us back that far.
Speaker 2: Yeah, I mean, Sora.
Speaker 3: It has been a fascinating time, obviously, really starting with the release of ChatGPT all the way back in November twenty twenty-two. But the ante, as you note, was incredibly upped a couple of weeks ago when OpenAI released Sora.
Speaker 2: This is the text-to-video
Speaker 3: basically platform or program that OpenAI has not yet released commercially, but they're rolling it out, probably in the not-too-distant future. And essentially, what text-to-video means is you can put in a text input, the way you can with ChatGPT for text or the DALL-Es and Midjourneys of the world for images.
Speaker 2: You can do the
Speaker 3: same now for video and have incredibly realistic-looking video, with only some slight bugs, as I'm sure we'll get into. And as for the power of such a tool, we've heard about it for a long time. Sci-fi writers have kind of dreamed of it. I don't know if any of us thought we would actually see the day. Certainly, I think many of us did not imagine seeing it so soon. Here we are, very early in twenty twenty-four, already potentially with the capability to do this. But Sora's here, and it's only going to be coming on stronger.
Speaker 1: So what was it exactly that people found so mind-blowing? Because, as you said, there were other applications in the marketplace already. Text-to-video wasn't entirely new.
Speaker 3: Yeah, it's a good question, and I think, you know, these things are all sort of, you know, a little bit of that Fitzgerald, you know, how did you go broke, gradually at first and then all at once, or Hemingway.
Speaker 2: I always forget who it was. But
Speaker 3: that's certainly been the trajectory of AI, where it's kind of like, oh, it's very incremental, it's very slow, it's very slow, and then all of a sudden it seems to be here, and we ask how it happened so quickly.
So, yes, there certainly have been some kind of video applications, I just don't think nearly as smooth.
Speaker 2: And I think one of the reasons
Speaker 3: this took people aback, certainly for those of us with a long enough memory to go back to the earlier days of the Internet, is that, if you remember, the sort of distribution of a lot of these different forms of media really took a long time to evolve. I mean, you know, people were using email back in the early nineties. I don't think we were seeing video really distributed widely for, you know, ten to fifteen years after that. And so while on the one hand, AI of course has been in development for many decades, some of your listeners may know about ELIZA, which was a kind of text-based chatbot back in the sixties, the reality is, for most people and for most modern use cases, we really have not had any sort of widely deployed AI until just, you know, a little over a year ago with ChatGPT. And so to go from a text-based, you know, application to a video-based one in literally, you know, fourteen months, when you think about, you know, previous, uh, you know, evolutions that took fourteen years,
Speaker 2: I think that's partly why people are so shocked by this.
Speaker 1: I think there was something also about what was in market, uh, prior to Sora. It had just been clips that lasted about four seconds, and there was something about getting to that one-minute mark that Sora was able to do. I mean, that, I think, really woke people up to the fact that this technology was evolving very quickly. But see, here's the thing for me: I'm not so quick to presume that means that, going forward, there's going to be some uninterrupted hockey-stick, you know, evolution up and to the right, that means that, you know, by the end of the year, Sora is going to be spitting out a ninety-minute movie that's going to be ready for theatrical distribution.
What do you think in terms of the evolution of this stuff going forward from here?
Speaker 3: Yeah, no, I think the length is a very big question. I mean, certainly you're right that the quantum leap we've just had is nothing short of remarkable, whether you think this is a good thing or not. When I say remarkable, I don't mean that it's going to save humanity.
Speaker 2: It could also be its destruction.
Speaker 3: But certainly, in terms of the tech and the kind of form factor here, it's really remarkable that we've gone from either, as you say, very short videos or really even still images, to a full minute. But yeah, the technical challenges, without sort of getting into or boring anyone with the engineering or kind of processing requirements here, the kind of capabilities needed to go from a minute to, you know, let's say, I don't know, even a twenty-five-minute episode of television, it's not simply a factor of, you know, twenty-five, where you can just stack minute-long videos together. First of all, you'd have to stitch them together, and essentially that would not work. And then even doing a lot of minute-long videos, I mean, we know how much power even one minute takes. So I don't think we're going to see full-length, forget feature films, but even, you know, shorter, you know, sitcom-type episodes anytime soon. But I think that the future is now in the sense that it will only be a matter of time. I mean, to me, the question of time is not really a significant one. You know, again, a lot of this is about processing power, you know, basically how computers can handle this much data and sort of trawl this much data in a short amount of time. We will get there. I'm not concerned from a technical standpoint. I don't think there's any kind of hindrance there. I think the question is how good will it be when we do get there?
I mean, and I don't know if you want to get into the challenges now, but even in the one minute, we know about all the laws of physics that are being defied.
Speaker 2: You know, the cat with the fifth leg.
Speaker 3: Or the, you know, the hand key that never gets eaten, yeah, the extra hand, the person who's flying backwards. I mean, all of these are not small challenges. I mean, yes, you could have humans go in and fix that, as you can with any sort of animation, but that essentially defeats the purpose of having a machine do this. I mean, you could also have a human design this whole thing in an animation studio. So I think that will be the question. In terms of the democratization, it's not so much when we can get to the length, because, look, people will play with minute-long formats.
Speaker 2: We have TikTok videos that are ten seconds that go viral.
Speaker 3: And eventually we will get to the twenty-five-minute or ninety-five-minute mark. I think the question is how good will they be when we do?
Speaker 1: But still, I want to, like, just back up for a second. I'm just picturing, like, your mom, my mom listening to this podcast, and I just want to make sure they understand the implications of what we're talking about, because, you know, they love podcasts. Where, you know, the way I want to explain it is, you know, the way it's always worked, if you wanted to shoot a scene of something, was, you know, you had to have cameras and actors, or at least animation, or, you know, a set or a location. And now, thanks to this software, all you need is a computer, and then you, you know, you whisper some text instructions into, you know, Sora's ear, and Sora simply ushers it into existence. All that equipment and all the cost and time and manpower that comes with that, all of that is no longer necessary, and that is simply revolutionary.
Now, Sora as we currently know it, I think we've just made clear, is not ready for prime time. But it's not about where it's at now, it's about where it could be in the future. So if you are in, uh, you know, a traditional Hollywood production company or studio, how are you not freaking out?
Speaker 2: Yeah? Well, for better or worse,
Speaker 3: neither of us runs a studio, which I think is normally maybe for the worse, although on a day like today, after the release of Sora, maybe it's a good thing we're not dealing with this problem.
Speaker 2: But look, you're absolutely right.
Speaker 4: I mean, the transformation, the transformational moment we're in, is, uh, striking, and, you know, you talk about just being able to speak, and, you know, our mothers being able to speak, and a video is created.
Speaker 2: I mean, you know, think
Speaker 3: about just what that's meant for distribution for so many years, where it's like, you know, it used to take, you know, an engineering degree, if you could even do it, to send a video to basically, you know, tell your friends, hey, look at this cute thing my grandkid did, or my friend did, or my nephew did, or my child did. And now, of course, with a flick of a keystroke or a swipe of a screen, we can do that. And so essentially what we're now doing is porting that over to not just the distribution but the production. And I think that really dovetails right into your question about, you know, where Hollywood studios are going to take this and how worried they should be about where it's going to go. I mean, you know, you were there covering it right at the dawn of the YouTube age, where it's like the studios are sort of like, what are we going to do now that anyone can kind of upload and share videos? And how is that going to disrupt our business?
We know what Google did about it. They went out and bought YouTube, and we saw eventually, after Netflix and other companies kind of ate their lunch, how the legacy media outlets started to react. And I think that's a bit of a good template, maybe, for how to see this moment from a production standpoint if you're a studio, which is to say, massive fears of disruption and is this going to take away our business?
Speaker 2: And let's be real, you know, the Netflixes
Speaker 3: of the world, that the automation of distribution, or the ease of distribution, did, in many ways, you know, take away a lot of their business. And I think that can, and very much in some cases will, happen on the production side. That said, I don't think anyone feels like Hollywood studios have gone away because of streaming. In some ways, it's been a boon for them. It's been a revenue stream. It's disrupted them and it's been their salvation. And although I don't think we quite yet know how it's going to play out, I think using that as a sort of rubric to say, look, the automation of production, much like the automation of distribution, is going to be incredibly disruptive.
Speaker 2: It's going to create, you know, years, if
Speaker 3: not more, of, you know, new business models, of people having to learn new skills, of people losing their jobs, maybe some other people getting jobs. But ultimately, I don't think, from where I sit, and you know me, I'm not usually a Pollyanna about this stuff, I don't think it's going to bring down the traditional business any more than streaming, i.e. the distribution side of this, brought down Hollywood. I think it just fundamentally transformed Hollywood, and I think we're going to get to that point as well here.
Speaker 1: Well, enter Tyler Perry, or should I say Chicken Little. What I mean by that is he gives his interview to the Hollywood Reporter in which he declares the sky is falling.
You know, he makes some really, really bold statements. To quote, you know, just a few of them, here's one quote: "There's got to be some sort of regulations in order to protect us. If not, I just don't see how we survive."
Speaker 3: End quote.
Speaker 1: I love the "we," by the way, as if he's in the same tax bracket as the people he's trying to protect. But now, look, I mean, give the guy credit for doing at least what no one else in his tax bracket seems willing to do, which is to get out there and ring the alarm in a major way. The question, though, is, you know, is he being hysterical or responsible?
Speaker 3: Yeah, and I think maybe a little bit of both. And just to go back for a second to your "we" question, I think both tax bracket and job responsibility need to be delineated there, because he even said it himself. He's like, look, as an employer, as a boss, as an executive, this is really, you know, kind of my wildest fantasy. It's like, you can automate large parts of the assembly line, as it were, and if you're, you know, worried about costs, and even Tyler Perry has to worry about costs, suddenly you just saved on a whole bunch of animators, or certainly are able to scale back a lot of the sort of, uh, kind of spade work that had to be done, or grunt work that had to be done, by humans. So I think from that perspective, he's probably not freaking out too much. But as he also notes, as an actor, as a, uh, you know,
Speaker 2: as a, as a,
Speaker 3: as a craftsman, as a fellow, you know, employee of a lot of the people who work for him, clearly there's a lot of disruption there.
I mean, look, you know, I think he's hysterical in that, you know, the notion that, you know, somehow we shouldn't be building studios, or we should worry that the entire industry is just going to evaporate because, uh, you know, some, you know, teenager can suddenly, you know, create the next Madea or whatever franchise is going to resonate with people, I think that's an exaggeration. I mean, I think that, first of all, that's a long way off from happening. Even when it does happen, it'll probably be very much a second-class citizen. I mean, again, you know, is TikTok competing with Game of Thrones or with Succession? It's not. You know, maybe it's competing on the mindshare front, but clearly, you know, we're not going to be at the point where any of this technology can create the next Succession. I think he's being responsible in the sense that we need to be, you know, kind of plotting our course well in advance, in a way we haven't been doing. And you know, it's funny you mentioned the, you know, we need regulation, we need to get, you know, kind of our ducks in a row here. He also said something as he was making those comments. He said, we need to not just, like, do this one guild or one union at a time, one strike at a time.
Speaker 2: And I think he's absolutely right about that.
Speaker 3: I mean, I think as much as the writers and actors strikes, I think, put AI at the forefront, this sort of patchwork approach from a Hollywood industry perspective that we had over the last year, where the directors went and made their own deals, and the writers made a deal, and then the actors kind of got what they got, and then of course you've got all these other unions, the below-the-line
Speaker 2: folks, that's not really going to work.
Speaker 3: And I would argue that, not even from a labor standpoint, but from a studio and management and producerial standpoint, there needs to be a lot more alignment, a lot more talking, and, you know, you talk about regulation, a lot more negotiation with
Speaker 2: with Washington. Because let's be real about this too.
Speaker 3: You know, the Sam Altmans and Satya Nadellas and all the tech moguls, they are working at their hardest to make sure that, from a lobbying standpoint, they have as few regulations as possible. I know there's a big lawsuit, a few big lawsuits now, including the New York Times suing OpenAI for copyright infringement. But the reality is, to the degree this is going to be negotiated in the halls of Congress, tech is, you know, halfway through the marathon, and, you know, you've got Hollywood executives still debating what sneakers to buy. I mean, so we are incredibly behind as an industry, from a Hollywood standpoint, relative to where the tech industry is in terms of figuring this out regulatorily.
Speaker 2: So in all of those regards, I think Tyler Perry is exactly right.
Speaker 3: But I do think that in terms of the immediate fears, the immediate displacement, the idea that the quality of work can somehow start even remotely approaching what even very basic professionals do, I think we're a ways off of that.
Speaker 1: We will be back in just a moment with more with Steve Zeitchik. Stick around.
Speaker 1: And we are back with Steve Zeitchik, who writes frequently about issues regarding artificial intelligence at his Substack, Mind and Iron, which you could check out at mindandiron dot com. He puts out really good stuff every Thursday, so do check it out.
You know, Steve, you were just talking about, you know, you kind of panned back and gave us the big picture in terms of the battle lines being drawn here, and it just seems like such an impossible state of affairs when you think about the regulatory picture, the guild picture, and what Tyler Perry is calling for here in terms of this industry getting its act together and getting everyone on the same page. What hope can we really have for Hollywood to fight this battle in the right way?
Speaker 3: Well, if streaming is any indication, I would say zero.
Speaker 2: You know, it's like the whole thing.
Speaker 3: I mean, you follow all of this stuff, like, whatever we did with social media on the tech side, let's do the opposite now. And I think you can make a case that that's true with streaming as well. I mean, you know, we obviously saw the legacy companies way behind, we saw black-box data issues, we saw lack of revenue sharing. I mean, creators certainly don't want to repeat any of that, and, you know, I don't think it's a stretch to say we're in danger of doing that and then some. So, you know, I don't think the historical precedent here is terribly encouraging. And I think, as you've kind of been alluding to, in a lot of ways this is more transformative than streaming. You know, production and creation are always going to be more fundamental to the business than distribution, though distribution is of course very important.
Speaker 2: So I don't have a ton of hope.
Speaker 3: The only sort of glimmer I would offer listeners here in that regard, and you could tell me if you think I'm being too optimistic, is that I think we have learned some lessons. I do think, you know, I talk to executives, as I'm sure you do, who kind of say, look, you know, we are not going to get caught unawares the way we were.
You know, there were so many people ten, fifteen years ago who dismissed a lot of this stuff as just either, you know, kind of wannabe, you know, Hollywood content or user-generated stuff. You know, we're both old enough to remember the whole MySpace days and the frenzy about that, and that that wasn't going to really cannibalize the business. And I think Hollywood executives now, and labor guilds for that matter, are just too savvy. They know that they cannot underestimate this. Now, does that mean they're going to react to that in the right way? Are they going to align and get, you know... you know, can the guilds even get on the same page? Can management and the guilds, given some of the rancor, decide what's best for them? Because, you know, look, as much as I think the Hollywood studios and management are in some ways, you know, Tyler Perry being an example in this regard, the enemy of labor, I also think they're their best ally, because the tech companies, as we know, don't necessarily care that much about Hollywood studios preserving their business model. They care about maximizing their profits, as they should. So to the extent that this is going to be a battle between Big Tech and Hollywood writ large, then hopefully producers and executives and conglomerates can get on the same page with workers and creators, because the sooner they could do that, the better they can figure out, you know, how to either neutralize the threat or work in concert with the opportunity.
Speaker 2: But, you know, I don't mean to,
Speaker 3: like, sing a Kumbaya tune here, but I think the more kind of animus and tension you have between Hollywood management and Hollywood labor, the more likely it is that Big Tech is going to come and eat both their lunches.
Speaker 1: Yeah, my head is spinning just listening to this.
I mean, I'm still somewhat fixated on the notion that, you know, and I'm not pinning this on Tyler Perry, but, you know, he's obviously calling for Congress to have the studios protect labor, and I guess obviously not fire everyone in sight and just have computers crank out all the production needs, you know, from here on in, because obviously that would be tremendous cost savings. But, you know, when I wrap my head around that scenario, I sort of say to myself, so if you force the studios to keep employing people, okay, if there's some sort of protection there, how do you keep the studios' competition from then utilizing the technology and beating them at a fraction of the cost? And how do you keep the purveyors of that technology from, you know, deploying the technology? Like, I just don't understand how all that could even possibly work.
Speaker 2: You don't think Congress is in any way the answer here, is your point.
Speaker 1: I just don't know how you have Congress force the studios to keep people employed, even though that is the most humane solution. And I don't know how you keep the technology companies from deploying the technology. I just don't understand how that works.
Speaker 3: So, a couple of things. There's no way they're going to stop the technology companies from deploying the technology. I think what's going to happen is, and this is not really a congressional issue,
Speaker 2: this is a judicial issue.
Speaker 3: And I'm not an expert on regulation or legislation in this regard or in any regard, but my understanding from covering this a little bit is that really what can happen on that side is
Speaker 2: just the toughening up of existing
Speaker 3: copyright laws. They're pretty tough as it is, but they could be tougher in some respects.
And then the enforcement, of course, is going to be up to the courts, and again, we'll see where this New York Times lawsuit goes. But I think, insofar as that addresses kind of the second part of your question, which is what's going to stop people from just, you know, grabbing it, on the one hand, that, you know, depends on copyright, and I don't think that's a resolved issue yet. You know, can OpenAI just unleash a product that lets people, like, you know, take Brad Pitt from, I don't know, Seven, and drop him into their student film? I think that there are legal mechanisms that can prevent that from getting distributed. If someone does that, you know, I don't think OpenAI cares. I think they want people to use this tool. I also think there could be judicial restraint, or legislative and judicial restraints, put on, you know, how much OpenAI could actually train their models on this data to begin with.
Speaker 2: It was interesting,
Speaker 3: I don't know if you noticed this nuance, but when Sam Altman was giving his little spiel on Sora, he was talking about how this was all kind of trained on publicly available information. And so I think they're very aware now that the initial ChatGPT, which was a lot murkier in what it was grabbing, as we see from the New York Times lawsuit, they have to be a lot more careful about that. So I think that does address a little bit of what can be taken, and if you do take it, you have to pay for it. And that's of course one avenue this could all go as well, which is they do train on a lot of this data that's, you know, copyrighted, but the studios and hopefully the artists get compensated for it.
Speaker 2: You know,
Speaker 3: In terms of the jobs, I'm extremely, extremely pessimistic about the kind of protectionism that, I don't know if Tyler Perry is actively advocating for it, but to the degree he's kind of dangling this hope that Congress is going to pass a law that keeps people employed when there's technology that can automate their jobs.
Speaker 2: I mean, again, I am no labor
Speaker 3: historian by any stretch, but if you look at the history of automation in this country, in the auto industry and other industries, there's just not a lot of reason to think that's gonna work. You know, you can go back to the typewriter, you can go to, you know, Google, to search engines.
Speaker 2: I just don't think that's gonna work.
Speaker 3: I think what you can have, and I don't know if this happens legislatively or if it happens, you know, within the private sector,
Speaker 2: I think what you can have is retraining.
Speaker 3: You can basically say, yes, this is going to in some cases potentially take your jobs, but in other cases make them easier. We're going to help you do your job better with the help of this technology, or do a different job now that technology is doing this current one. And I think there is room for that.
Speaker 2: Now,
Speaker 3: I don't want to get too, like, shiny-optimist about that, because I think there's a limit to how much you can retrain someone if their jobs are now completely automated. But I think there is some avenue for hope there.
Speaker 1: Yeah, I do hope this isn't a matter of just simple displacement, but we'll have to see on that and on the copyright question.
Keep in mind, on copyright, I think we're getting a little ahead of ourselves, where we still need to see Sora be equipped with some degree of creative control where filmmakers will be able to, to use your example, insert Brad Pitt or insert, you know, whoever. Because at this point, you know, you can't even add sound to Sora, although ElevenLabs, another software tool out there, lets you layer sound on top of things. But, you know, there's so much that still needs to be demonstrated at this point. What I'm also wondering about at this time, especially as we see one-minute clips, is not even just the disruption that is going to happen in terms of, we're obviously focusing on, Hollywood premium long-form entertainment. I'm wondering whether the TikToks and YouTubes of the world have things to worry about here, because this kind of technology is also going to impact, you know, the social video, creator economy layer of the world. Because let's not forget that just because these tools can will great premium video into existence, that doesn't mean everyone is Steven Spielberg. Not everyone is gonna be able to turn this into ninety minutes or thirty minutes. But what they will be able to
Speaker 2: do is
Speaker 1: social video, which already has no barriers to entry. Why can't OpenAI become the next YouTube or the next TikTok? In other words, I think the existing platforms in social video could perhaps find a new platform emerge from these new players that already power this kind of video. You see what I'm saying?
Speaker 3: So you think that, because, right, well, right now, OpenAI, I mean, OpenAI is purely a tech company. In fact, they want developers. Forget distribution, they don't even want to, you know, they don't even want to be creating the apps here.
But you think that either OpenAI or a company that's responsible for the tech, by the way, Google would be the natural one, because of course they have both the AI capabilities and, you know, with YouTube, the distribution capabilities. But do you think there might be some kind of blurring of the lines between the tech that enables the creation and the distribution, where those companies get into the distribution?
Speaker 1: To me, it's a natural extension. I mean, and Google, by the way, could be right behind OpenAI here. They have Lumiere, something that is coming that supposedly is going to be very similar to Sora. I'm just saying, OpenAI, I think, why not open up a distribution platform and compete? Because you're going to have tons of video coming out as people play with this stuff. Why not get into that space as well? It could be a whole new play. And so I just wonder whether we should be thinking even a little more broadly here in terms of what disruption could come. If I was Sam Altman, that's how I'd be thinking.
Speaker 3: Yeah, he's not shown to date a lot of interest in, and, you know, you could say this is wise or not, I actually think it kind of is, he's not shown a lot of interest in being sort of front-facing in that way. I think he knows there's companies with massive footholds. I think he feels like if he does his job well and if OpenAI is able to create, or give developers, the tools, really, because they're not really even creating a lot of this, but they're giving developers, I mean, so it is a different case, but they're giving developers the tools to create this stuff, then they can go out and put it on those distribution platforms. Though, you know, OpenAI of course, as we know, has a very close relationship both spiritually and corporately with Microsoft. And, you know, Microsoft clearly has a lot of reach with Windows and Office and all that, so there's certainly some potential there.
599 00:31:41,360 --> 00:31:42,120 Speaker 2: I was curious. 600 00:31:42,120 --> 00:31:44,320 Speaker 3: There's something you said a second ago that's interesting, that 601 00:31:44,400 --> 00:31:46,080 Speaker 3: I was hoping maybe to circle back to for a minute. 602 00:31:46,120 --> 00:31:48,560 Speaker 3: Like, you talk about what it can do in terms 603 00:31:48,560 --> 00:31:51,400 Speaker 3: of the barriers to entry for creators. Do you think, 604 00:31:51,400 --> 00:31:54,120 Speaker 3: in terms of the creator economy, this is like an 605 00:31:54,200 --> 00:31:56,360 Speaker 3: unabashed... well, is this a good thing? 606 00:31:56,440 --> 00:31:57,080 Speaker 2: Is this a bad thing? 607 00:31:57,120 --> 00:32:00,480 Speaker 3: It would seem to only up the level of kind of, you know, 608 00:32:00,560 --> 00:32:03,040 Speaker 3: professional or semi-professional creators who are not, you know, 609 00:32:03,080 --> 00:32:06,120 Speaker 3: Hollywood types at all, your MrBeasts and, 610 00:32:06,200 --> 00:32:09,320 Speaker 3: you know, your TikTok kind of auteurs. This would 611 00:32:09,360 --> 00:32:12,440 Speaker 3: seem to really just give them tools that are almost unimaginable. 612 00:32:12,480 --> 00:32:16,000 Speaker 3: It levels the playing field for them, as I see it. 613 00:32:16,040 --> 00:32:17,800 Speaker 3: What do you think it does in 614 00:32:17,880 --> 00:32:20,960 Speaker 3: terms of that world and creativity and the monetization thereof? 615 00:32:21,880 --> 00:32:24,680 Speaker 1: I think you could see it level the playing field. 616 00:32:24,800 --> 00:32:29,960 Speaker 1: I think you could see a whole 617 00:32:30,000 --> 00:32:33,960 Speaker 1: new group of players come in that are maybe more 618 00:32:34,040 --> 00:32:39,800 Speaker 1: adept at the kind of premium entertainment that, you know, 619 00:32:39,920 --> 00:32:44,080 Speaker 1: maybe the MrBeasts of the world are not necessarily 620 00:32:44,360 --> 00:32:45,240 Speaker 1: adept at. 621 00:32:46,080 --> 00:32:46,920 Speaker 2: I don't know. 622 00:32:47,200 --> 00:32:51,000 Speaker 1: I mean, I just think that what we are talking 623 00:32:51,080 --> 00:32:54,760 Speaker 1: about here in general, whether we're talking about the creator economy or 624 00:32:55,320 --> 00:32:59,560 Speaker 1: premium entertainment, it's so hard to get my head around 625 00:33:00,280 --> 00:33:02,600 Speaker 1: the disruption that we're going to see. And I think 626 00:33:02,640 --> 00:33:06,280 Speaker 1: a lot of what's gonna be disrupted, like, we could 627 00:33:06,360 --> 00:33:10,160 Speaker 1: barely conceive of it. That's how huge this is. 628 00:33:11,120 --> 00:33:12,720 Speaker 3: Yeah. And I think just to add to that, I mean, 629 00:33:12,760 --> 00:33:15,480 Speaker 3: you talk about the whole new layer of creators 630 00:33:16,320 --> 00:33:19,160 Speaker 3: who can, you know, sort of up their game 631 00:33:19,320 --> 00:33:21,440 Speaker 3: or figure out how to master these tools in 632 00:33:21,440 --> 00:33:23,640 Speaker 3: a way that maybe this current or previous generation of 633 00:33:23,640 --> 00:33:24,560 Speaker 3: content creators haven't. 634 00:33:24,600 --> 00:33:27,440 Speaker 2: And you know, I would look at animation, I really would.
635 00:33:27,480 --> 00:33:30,400 Speaker 3: I mean, you know, animation, and this is where maybe, 636 00:33:30,640 --> 00:33:33,360 Speaker 3: you know, if I'm, you know, Illumination or I'm Pixar 637 00:33:33,440 --> 00:33:35,520 Speaker 3: or any of these companies that have 638 00:33:35,600 --> 00:33:38,400 Speaker 3: done this so well for so long, I'm worried, because 639 00:33:38,680 --> 00:33:41,280 Speaker 3: forget just the, you know, the animators on the floor, 640 00:33:41,320 --> 00:33:43,960 Speaker 3: who should be rightly worried here, but you 641 00:33:43,920 --> 00:33:45,640 Speaker 2: know, this is going to give tools. 642 00:33:45,640 --> 00:33:47,400 Speaker 3: I mean, and I think, you know, we both watched 643 00:33:47,440 --> 00:33:50,120 Speaker 3: these videos and these demos and sort of saw 644 00:33:50,320 --> 00:33:53,280 Speaker 3: the remarkable animations that were being created. I think I 645 00:33:53,320 --> 00:33:56,120 Speaker 3: was frankly more struck by that than 646 00:33:56,120 --> 00:33:58,280 Speaker 3: some of the live-action stuff. And to me, if you're 647 00:33:58,320 --> 00:34:01,040 Speaker 3: able to kind of create animation at that level, you know, 648 00:34:01,120 --> 00:34:05,720 Speaker 3: if your average or slightly above average content creator 649 00:34:05,760 --> 00:34:08,040 Speaker 3: could now create, you know, a one-minute video, and then 650 00:34:08,080 --> 00:34:09,840 Speaker 3: it gets a little bit longer, you 651 00:34:09,880 --> 00:34:13,320 Speaker 3: know, these animations like the shorts that, you know, Pixar and Illumination make, 652 00:34:13,320 --> 00:34:15,759 Speaker 3: and now suddenly there's like someone who's mastered these 653 00:34:15,719 --> 00:34:17,640 Speaker 2: tools, and maybe is an artist in their own right, 654 00:34:17,719 --> 00:34:18,160 Speaker 2: but is not 655 00:34:18,160 --> 00:34:21,359 Speaker 3: employed by a studio, has no affiliation with them, 656 00:34:21,680 --> 00:34:25,520 Speaker 3: is now suddenly creating Pixar- or Illumination-level animation. 657 00:34:26,160 --> 00:34:28,719 Speaker 2: You know, what does that do to both sides? Right? 658 00:34:28,719 --> 00:34:30,960 Speaker 3: What does that do to the Hollywood firmament, where it's like, 659 00:34:31,320 --> 00:34:34,840 Speaker 3: now suddenly we can't distinguish between 660 00:34:34,840 --> 00:34:37,480 Speaker 3: these studios? And what does it do to these 661 00:34:37,520 --> 00:34:40,359 Speaker 3: YouTube and TikTok-like platforms, where suddenly it's not just, you 662 00:34:40,360 --> 00:34:43,479 Speaker 3: know, somebody doing some fun karaoke video, 663 00:34:43,840 --> 00:34:45,399 Speaker 2: but I'm watching, you know, and I 664 00:34:45,360 --> 00:34:47,239 Speaker 3: know this is a bit of a stretch, but 665 00:34:47,280 --> 00:34:49,319 Speaker 3: not as much, not as big as it was a 666 00:34:49,320 --> 00:34:52,399 Speaker 3: month ago, where I'm watching someone do the next, you know, 667 00:34:52,800 --> 00:34:55,560 Speaker 2: Coco or Cars or Minions. 668 00:34:55,960 --> 00:34:58,479 Speaker 3: It just feels to me like there are so many 669 00:34:58,480 --> 00:35:01,920 Speaker 3: places here where the content creators could get this massive 670 00:35:01,920 --> 00:35:04,160 Speaker 3: boost, again, the ones who know how to use the tools.
671 00:35:04,520 --> 00:35:06,440 Speaker 3: And then, you know, how does that get monetized if 672 00:35:06,719 --> 00:35:10,960 Speaker 3: suddenly someone could do really high-level stuff without having 673 00:35:11,040 --> 00:35:13,680 Speaker 3: to go through the studios? Again, to your point, I 674 00:35:13,719 --> 00:35:15,920 Speaker 3: think that's an area where the level of disruption is 675 00:35:15,920 --> 00:35:16,680 Speaker 3: just mind-boggling. 676 00:35:17,360 --> 00:35:21,760 Speaker 1: And it just brings me back to Tyler Perry, 677 00:35:21,760 --> 00:35:28,800 Speaker 1: and it's so easy to dismiss what he is saying 678 00:35:28,920 --> 00:35:34,640 Speaker 1: and the profound implications of what he's raising as hysteria. 679 00:35:34,760 --> 00:35:37,200 Speaker 1: But the thing I keep coming back to is, and 680 00:35:37,239 --> 00:35:39,040 Speaker 1: I think we're going to end on this note, is 681 00:35:40,200 --> 00:35:45,839 Speaker 1: it's as if, you know, I'm typically allergic to hysteria, 682 00:35:45,920 --> 00:35:50,040 Speaker 1: but in this scenario, for once, I 683 00:35:50,080 --> 00:35:55,600 Speaker 1: find myself thinking hysteria is plausible, and if it's plausible, 684 00:35:55,960 --> 00:36:00,520 Speaker 1: isn't it truly kind of apocalyptic? And how disturbing 685 00:36:00,760 --> 00:36:01,279 Speaker 1: is that? 686 00:36:01,719 --> 00:36:02,640 Speaker 2: I would end on this note. 687 00:36:02,680 --> 00:36:05,319 Speaker 3: From my end, the slight bit of optimism I would 688 00:36:05,360 --> 00:36:07,640 Speaker 3: inject into that, and maybe leave listeners with from 689 00:36:07,680 --> 00:36:10,000 Speaker 3: my point of view, is that if you love Tyler 690 00:36:10,000 --> 00:36:12,000 Speaker 3: Perry and you love the Madea movies, or you love, 691 00:36:12,440 --> 00:36:15,640 Speaker 3: you know, Spielberg or James Cameron, whoever it is, the 692 00:36:15,680 --> 00:36:19,080 Speaker 3: reality is, not only can these models 693 00:36:19,239 --> 00:36:21,759 Speaker 3: not actually create that, I mean, they may be able 694 00:36:21,800 --> 00:36:24,440 Speaker 3: to emulate it, but they can't create something that distinctive. 695 00:36:24,840 --> 00:36:27,160 Speaker 3: It's going to make those people even more prized. And 696 00:36:27,200 --> 00:36:29,160 Speaker 3: I think, if you love, you know, 697 00:36:29,160 --> 00:36:32,640 Speaker 3: anyone from Tyler Perry to Steven Spielberg to Kathryn 698 00:36:32,680 --> 00:36:34,880 Speaker 3: Bigelow to Ava DuVernay, I mean, you name it. 699 00:36:35,080 --> 00:36:37,960 Speaker 3: If you love that filmmaker, you love that creator, you know, 700 00:36:38,080 --> 00:36:41,480 Speaker 3: in the sea of pseudo-professional content, that stuff is 701 00:36:41,520 --> 00:36:43,680 Speaker 3: going to be even more valuable. There's going to be 702 00:36:43,680 --> 00:36:45,960 Speaker 3: even more of a premium on people who 703 00:36:45,960 --> 00:36:48,799 Speaker 3: can bring their artistry and humanity. I don't think it's 704 00:36:48,800 --> 00:36:51,640 Speaker 3: gonna touch that high level of content. In fact, it 705 00:36:51,640 --> 00:36:53,759 Speaker 3: may even put more of an emphasis on it. So 706 00:36:54,200 --> 00:36:56,160 Speaker 3: I don't know if that tempers the apocalypse, but that's 707 00:36:56,160 --> 00:36:57,239 Speaker 3: the one thing I would say.
708 00:36:58,360 --> 00:37:01,760 Speaker 1: Score one for traditional Hollywood. Well, thanks, Steve, for taking 709 00:37:01,800 --> 00:37:04,400 Speaker 1: the time out. You, of course, can check out everything 710 00:37:04,440 --> 00:37:08,640 Speaker 1: he writes every week on his substack, Mind and Iron. 711 00:37:08,880 --> 00:37:11,480 Speaker 1: Appreciate you taking the time out. Great being here, Andy. 712 00:37:17,800 --> 00:37:20,160 Speaker 1: Thanks for listening. Be sure to leave us a review 713 00:37:20,200 --> 00:37:23,640 Speaker 1: at Apple Podcasts and Amazon Music. We love to hear 714 00:37:23,719 --> 00:37:26,759 Speaker 1: from listeners. Please go to Variety dot com to sign 715 00:37:26,840 --> 00:37:31,000 Speaker 1: up for the free weekly Strictly Business newsletter, and don't 716 00:37:31,000 --> 00:37:34,040 Speaker 1: forget to tune in next week for another episode of 717 00:37:34,160 --> 00:37:35,160 Speaker 1: Strictly Business.