1 00:00:01,040 --> 00:00:06,000 Speaker 1: This is the Most Dramatic Podcast Ever, an iHeartRadio podcast. 2 00:00:08,000 --> 00:00:10,360 Speaker 1: Chris Harrison and Lauren Zima coming to you from the 3 00:00:10,400 --> 00:00:13,319 Speaker 1: home office in Austin, Texas. But all eyes these days 4 00:00:13,320 --> 00:00:19,239 Speaker 1: are pointed out west to Hollywood and the SAG-AFTRA 5 00:00:19,640 --> 00:00:24,480 Speaker 1: union strike that has crippled and shut down Hollywood. This 6 00:00:24,640 --> 00:00:27,159 Speaker 1: is much publicized, been talked about quite a bit, and 7 00:00:27,200 --> 00:00:30,000 Speaker 1: the reason we are opening up the playbook for this 8 00:00:30,040 --> 00:00:32,839 Speaker 1: special episode of the Most Dramatic Podcast Ever is some 9 00:00:32,920 --> 00:00:35,080 Speaker 1: of you reached out to us and said, hey, can 10 00:00:35,240 --> 00:00:39,199 Speaker 1: LZ get on and do what she does best? She's brilliant, beautiful. 11 00:00:39,400 --> 00:00:41,880 Speaker 1: I added a couple of those things. But the way 12 00:00:41,920 --> 00:00:44,199 Speaker 1: you break things down, people really appreciate it. I know 13 00:00:44,240 --> 00:00:47,839 Speaker 1: you dove into this topic on your Instagram and a 14 00:00:47,880 --> 00:00:51,000 Speaker 1: lot of people responded, just how good you can kind 15 00:00:51,000 --> 00:00:53,239 Speaker 1: of put this into bite-sized little nuggets for us. 16 00:00:53,320 --> 00:00:53,880 Speaker 2: Well, thank you. 17 00:00:53,920 --> 00:00:55,960 Speaker 3: That makes me feel like I'm like doing a little 18 00:00:56,000 --> 00:00:57,480 Speaker 3: journalism again. It's been a minute for 19 00:00:57,480 --> 00:01:00,720 Speaker 2: me. Taking a break. Is that what I do? God, babe, 20 00:01:00,760 --> 00:01:01,560 Speaker 2: I thought I made you last. 21 00:01:01,600 --> 00:01:03,520 Speaker 1: Well, you do a lot of things really well, and 22 00:01:03,640 --> 00:01:06,480 Speaker 1: full disclosure, this is something near and dear to our hearts. 23 00:01:06,760 --> 00:01:09,480 Speaker 1: Lauren and I are both card-carrying members of the 24 00:01:09,640 --> 00:01:10,720 Speaker 1: SAG-AFTRA 25 00:01:10,840 --> 00:01:14,440 Speaker 3: guild. Yes, and look, part of the reason we 26 00:01:14,480 --> 00:01:17,600 Speaker 3: wanted to do this is because, look, when I watch CNBC, 27 00:01:18,000 --> 00:01:19,800 Speaker 3: I'm confused. I don't know what's going on in the 28 00:01:19,840 --> 00:01:22,160 Speaker 3: stock market. Like, I don't understand the world of finance. 29 00:01:22,200 --> 00:01:24,319 Speaker 3: If a doctor tries to explain something to me, I say, 30 00:01:24,319 --> 00:01:25,880 Speaker 3: can you give it to me in layman's terms. So 31 00:01:25,880 --> 00:01:29,280 Speaker 3: we all have our own fields and industries, but this 32 00:01:29,360 --> 00:01:33,160 Speaker 3: strike is historic. I'll explain why it's making national and 33 00:01:33,240 --> 00:01:36,840 Speaker 3: international news. And I think it's so important for everybody 34 00:01:36,840 --> 00:01:39,440 Speaker 3: to understand why, because I actually think what happens in 35 00:01:39,480 --> 00:01:43,039 Speaker 3: this strike could very much affect and set precedent in 36 00:01:43,160 --> 00:01:43,960 Speaker 3: other industries. 37 00:01:44,040 --> 00:01:45,720 Speaker 1: Oh for sure, in huge
38 00:01:45,400 --> 00:01:47,400 Speaker 3: part because, and we'll get into it, but one of 39 00:01:47,440 --> 00:01:52,240 Speaker 3: the major issues here is AI, artificial intelligence, and how 40 00:01:52,240 --> 00:01:54,160 Speaker 3: it could potentially take jobs. 41 00:01:54,560 --> 00:01:56,320 Speaker 2: So we're going to get into it all. 42 00:01:57,000 --> 00:01:58,560 Speaker 3: I also want to point out, you said, look, we 43 00:01:58,640 --> 00:02:02,960 Speaker 3: are SAG-AFTRA members, but specifically what the union is 44 00:02:03,000 --> 00:02:05,840 Speaker 3: striking about right now is a certain type of contract. 45 00:02:05,920 --> 00:02:08,280 Speaker 3: Like, some people have wondered, you know, how this will 46 00:02:08,280 --> 00:02:10,160 Speaker 3: affect the TV and movies they see. 47 00:02:10,440 --> 00:02:12,280 Speaker 2: It will affect networks' fall shows. 48 00:02:12,600 --> 00:02:14,600 Speaker 3: You know, the Writers Guild is on strike as well, 49 00:02:14,639 --> 00:02:16,320 Speaker 3: and you're not going to see some of those maybe 50 00:02:16,400 --> 00:02:19,960 Speaker 3: new TV shows in the fall. Things like Euphoria is 51 00:02:20,000 --> 00:02:23,320 Speaker 3: probably delayed, huge show for HBO, probably delayed well into 52 00:02:23,320 --> 00:02:26,240 Speaker 3: twenty twenty six by now, because the writers are on strike. 53 00:02:26,280 --> 00:02:28,000 Speaker 3: They can't write it. Now, the actors are on strike, 54 00:02:28,080 --> 00:02:31,840 Speaker 3: they can't act in it. But because of the specific 55 00:02:31,919 --> 00:02:36,960 Speaker 3: contract that's being struck over, reality shows, for example, 56 00:02:36,960 --> 00:02:40,239 Speaker 3: can still happen. So you and I, you know, we 57 00:02:40,600 --> 00:02:43,320 Speaker 3: mostly do hosting, or I was in news; news organizations 58 00:02:43,360 --> 00:02:44,079 Speaker 3: are still going. 59 00:02:44,360 --> 00:02:45,960 Speaker 2: Reality shows can still happen this way. 60 00:02:46,080 --> 00:02:48,320 Speaker 1: I think Love Island is in production right now; it's 61 00:02:48,360 --> 00:02:51,280 Speaker 1: okay to move forward. That deal I think has another 62 00:02:51,400 --> 00:02:53,600 Speaker 1: year for variety slash reality 63 00:02:53,240 --> 00:02:54,160 Speaker 2: shows, and that'll come up. 64 00:02:54,200 --> 00:02:57,320 Speaker 3: So there's specific contracts, because our union does encompass so 65 00:02:57,360 --> 00:03:00,360 Speaker 3: many people. SAG-AFTRA is radio hosts and TV and 66 00:03:00,400 --> 00:03:03,720 Speaker 3: actors and actresses and background actors and commercial actors and 67 00:03:03,760 --> 00:03:04,359 Speaker 3: movie stars. 68 00:03:04,400 --> 00:03:08,040 Speaker 2: So it's a lot of people. Now here it is, 69 00:03:08,400 --> 00:03:09,120 Speaker 2: simple terms. 70 00:03:09,160 --> 00:03:12,840 Speaker 3: Breaking it down, I would say there's three major things 71 00:03:13,360 --> 00:03:19,040 Speaker 3: that we're striking over. One, residuals. Two, just our wages, 72 00:03:19,160 --> 00:03:24,160 Speaker 3: period. And three, AI.
I'm going to start with residuals 73 00:03:24,200 --> 00:03:27,200 Speaker 3: because part of the reason this is historic is, do 74 00:03:27,240 --> 00:03:29,880 Speaker 3: you know the last time that SAG-AFTRA, babe, and 75 00:03:29,919 --> 00:03:31,920 Speaker 3: the Writers Guild, the Writers Guild have been striking about 76 00:03:31,919 --> 00:03:33,640 Speaker 3: two months, and so now it's a huge deal that 77 00:03:33,680 --> 00:03:36,320 Speaker 3: SAG-AFTRA has joined and is also striking. Do you 78 00:03:36,320 --> 00:03:38,320 Speaker 3: know the last time we went on strike together? 79 00:03:38,520 --> 00:03:40,520 Speaker 1: It was a while ago, before I was born. 80 00:03:40,880 --> 00:03:43,120 Speaker 2: Since I think nineteen sixty three, the sixties. 81 00:03:43,160 --> 00:03:45,480 Speaker 1: Okay, that was, by the way, I'm old, but yeah, 82 00:03:45,480 --> 00:03:46,160 Speaker 1: I'm not that old. 83 00:03:47,040 --> 00:03:50,480 Speaker 3: Back then, a guy named Ronald Reagan was the president 84 00:03:50,520 --> 00:03:52,560 Speaker 3: of SAG, getting it done for us. 85 00:03:52,680 --> 00:03:53,120 Speaker 2: Ronnie. 86 00:03:53,880 --> 00:03:56,360 Speaker 1: Another great orator. We're going to talk about Fran Drescher, 87 00:03:56,400 --> 00:03:59,840 Speaker 1: the current president, who is really on point and really 88 00:04:00,000 --> 00:04:02,279 Speaker 1: emotional and passionate and fantastic. 89 00:04:02,520 --> 00:04:04,960 Speaker 2: If you haven't, I would go watch her entire speech. 90 00:04:05,000 --> 00:04:06,120 Speaker 2: We'll play a clip of it later. 91 00:04:06,120 --> 00:04:09,600 Speaker 3: But Franny the Nanny, that's my president. Yes, the president 92 00:04:10,000 --> 00:04:14,840 Speaker 3: of the SAG-AFTRA, Screen Actors Guild, union is Fran Drescher, 93 00:04:15,320 --> 00:04:18,560 Speaker 3: known as the Nanny, and she even referenced The Nanny 94 00:04:18,560 --> 00:04:19,800 Speaker 3: in her speech announcing the strike. 95 00:04:20,000 --> 00:04:21,840 Speaker 2: So, number one, residuals. 96 00:04:22,120 --> 00:04:26,480 Speaker 3: Back in the sixties, we, both unions, went on strike 97 00:04:27,000 --> 00:04:29,600 Speaker 3: in large part for something called residuals. Now, what a 98 00:04:29,640 --> 00:04:32,640 Speaker 3: residual means basically is, like, at that time, we were 99 00:04:32,640 --> 00:04:37,160 Speaker 3: getting frustrated because networks and studios were selling and reselling 100 00:04:37,240 --> 00:04:40,520 Speaker 3: TV shows and movies and they were continuing to make money. 101 00:04:40,760 --> 00:04:43,240 Speaker 3: Like, picture that a TV show goes into reruns. 102 00:04:43,360 --> 00:04:46,000 Speaker 1: You had Gunsmoke on at the time. Yeah, Gunsmoke was 103 00:04:46,040 --> 00:04:48,640 Speaker 1: the most popular show in the world, I think definitely 104 00:04:48,640 --> 00:04:51,320 Speaker 1: in the United States, probably the world, and you could 105 00:04:51,320 --> 00:04:54,000 Speaker 1: just air that over and over and the cowboys would 106 00:04:54,000 --> 00:04:55,200 Speaker 1: only get paid that 107 00:04:55,160 --> 00:04:58,120 Speaker 3: one time, right. So you know, think about how many 108 00:04:58,160 --> 00:05:00,560 Speaker 3: times shows, like when I was younger on Nick at Nite, 109 00:05:00,680 --> 00:05:03,440 Speaker 3: I watched Laverne and Shirley, that was still in black, 110 00:05:03,520 --> 00:05:05,400 Speaker 3: that was in black and white.
That show was decades 111 00:05:05,440 --> 00:05:07,880 Speaker 3: old already at that point, but Nick at Nite is airing it 112 00:05:07,880 --> 00:05:11,400 Speaker 3: over and over again with commercials, making money. So 113 00:05:11,800 --> 00:05:14,719 Speaker 3: actors said, hey, if you're going to reuse our work, 114 00:05:15,160 --> 00:05:17,120 Speaker 3: we should get a little portion of that money. A 115 00:05:17,120 --> 00:05:19,240 Speaker 3: thing called a residual. Do you get residuals checks for 116 00:05:19,279 --> 00:05:19,760 Speaker 3: anything you do? 117 00:05:19,839 --> 00:05:23,360 Speaker 1: Yeah, for Bachelor, Bachelorette. Still, Who Wants to Be a Millionaire? 118 00:05:23,600 --> 00:05:25,479 Speaker 1: I did a bunch of movies back in 119 00:05:25,480 --> 00:05:28,360 Speaker 1: the day. Still get little checks for that. And it's funny. 120 00:05:28,600 --> 00:05:31,400 Speaker 1: Sometimes you get like a fourteen cent check. Seriously, and 121 00:05:31,440 --> 00:05:33,720 Speaker 1: I think to myself, it cost more to mail me 122 00:05:33,839 --> 00:05:36,760 Speaker 1: this fourteen cent check. There needs to be, we'll get to that 123 00:05:36,880 --> 00:05:39,360 Speaker 1: in the other negotiation, a deal where if it's 124 00:05:39,440 --> 00:05:40,880 Speaker 1: under a dollar, you just keep it and put it in 125 00:05:40,880 --> 00:05:43,520 Speaker 1: a big pool for, I don't know, a lottery or something. 126 00:05:43,600 --> 00:05:45,400 Speaker 2: I mean, it's probably not worth the paper it's printed on. 127 00:05:46,080 --> 00:05:48,760 Speaker 1: But I do get residuals, and they are important. 128 00:05:49,279 --> 00:05:52,760 Speaker 3: Well, back then we fought for residuals on the networks. 129 00:05:53,640 --> 00:05:58,559 Speaker 3: Now streaming, the streamers, have changed everything. Obviously, I'm talking 130 00:05:58,560 --> 00:06:00,720 Speaker 3: about Netflix and Hulu 131 00:06:00,200 --> 00:06:00,600 Speaker 2: and all that. 132 00:06:00,680 --> 00:06:03,920 Speaker 3: And the thing is, many networks have created their own streamers, right, 133 00:06:04,080 --> 00:06:08,880 Speaker 3: NBC now has Peacock. So the business model has changed now. 134 00:06:08,920 --> 00:06:11,159 Speaker 3: I think, I could be wrong, but I think Hulu 135 00:06:11,880 --> 00:06:13,880 Speaker 3: is or has at some point aired old seasons of 136 00:06:13,920 --> 00:06:16,720 Speaker 3: The Bachelor. Yeah, have you gotten residuals checks for that? 137 00:06:16,960 --> 00:06:19,120 Speaker 1: I don't think specifically I have. I would get it 138 00:06:19,240 --> 00:06:22,000 Speaker 1: when Hulu bought it, say they bought it for 139 00:06:22,040 --> 00:06:25,000 Speaker 1: one hundred dollars. I would get a little small portion 140 00:06:25,080 --> 00:06:27,680 Speaker 1: of that, okay, but I think what you're alluding to 141 00:06:27,800 --> 00:06:31,280 Speaker 1: is what happens when that is aired a million times exactly. 142 00:06:31,400 --> 00:06:36,000 Speaker 3: Actually, one of the stars, Sean Gunn, who played Kirk 143 00:06:36,080 --> 00:06:38,080 Speaker 3: on Gilmore Girls, okay, one of our faves. 144 00:06:38,080 --> 00:06:40,359 Speaker 2: You and I have watched Taylor. 145 00:06:40,440 --> 00:06:45,880 Speaker 3: Yes. So he was on the picket line and he said, look, 146 00:06:45,960 --> 00:06:49,400 Speaker 3: Gilmore Girls. Netflix bought the rights to air Gilmore Girls 147 00:06:50,000 --> 00:06:53,000 Speaker 3: to have it on Netflix.
He said, it's probably one 148 00:06:53,000 --> 00:06:55,760 Speaker 3: of their top shows. And think about that. The streamers 149 00:06:55,760 --> 00:06:59,080 Speaker 3: don't release their data. So when a TV show is in reruns, 150 00:06:59,480 --> 00:07:01,719 Speaker 3: you could know, oh hey, I know exactly how many 151 00:07:01,760 --> 00:07:03,960 Speaker 3: times the show I was on re-ran 152 00:07:04,240 --> 00:07:05,640 Speaker 3: and I'm going to get a check for every time 153 00:07:05,640 --> 00:07:07,760 Speaker 3: it was aired and every time that network made money from 154 00:07:07,680 --> 00:07:08,400 Speaker 2: commercials on it. 155 00:07:08,800 --> 00:07:12,119 Speaker 3: But the streamers don't release their data, so Gilmore Girls 156 00:07:12,200 --> 00:07:15,760 Speaker 3: could be, I mean, new generations have discovered Gilmore Girls 157 00:07:15,760 --> 00:07:18,960 Speaker 3: on Netflix, binge watching it. It's probably watched over and 158 00:07:19,000 --> 00:07:21,600 Speaker 3: over and over and over again. But Sean Gunn, the 159 00:07:21,640 --> 00:07:24,000 Speaker 3: actor who played Kirk, all these people are watching him, 160 00:07:24,040 --> 00:07:27,080 Speaker 3: but he's not seeing any of that, and the streamers 161 00:07:27,120 --> 00:07:29,520 Speaker 3: won't tell you how many times people are watching, and 162 00:07:29,480 --> 00:07:32,280 Speaker 1: they are very protective of those numbers and those metrics, 163 00:07:32,280 --> 00:07:36,080 Speaker 1: and you have no idea. And when it was ABC, NBC, CBS, 164 00:07:36,080 --> 00:07:40,560 Speaker 1: Fox and, you know, cable, we all used the same 165 00:07:40,360 --> 00:07:41,840 Speaker 2: system, Nielsen ratings. 166 00:07:41,920 --> 00:07:45,000 Speaker 1: Everyone uses Nielsen. So you know, okay, our show did 167 00:07:45,040 --> 00:07:47,920 Speaker 1: a point eight, it did three million viewers, what have 168 00:07:48,080 --> 00:07:51,160 Speaker 1: you. You know exactly what you're doing. You know, we can 169 00:07:51,240 --> 00:07:53,360 Speaker 1: argue if those Nielsen numbers are correct or not, but 170 00:07:53,440 --> 00:07:56,600 Speaker 1: they at least use the same metric. Nobody has any 171 00:07:56,600 --> 00:08:00,280 Speaker 1: idea what's going on at Amazon, Netflix, down the line. 172 00:08:00,480 --> 00:08:02,200 Speaker 3: So when it comes to this, it's like in the 173 00:08:02,240 --> 00:08:03,760 Speaker 3: sixties we fought for residuals. 174 00:08:03,840 --> 00:08:04,960 Speaker 2: Now we're fighting for it again. 175 00:08:05,120 --> 00:08:09,840 Speaker 3: But because streaming has changed everything, and the union representatives 176 00:08:09,840 --> 00:08:13,160 Speaker 3: for SAG-AFTRA have made this point many times: you've 177 00:08:13,280 --> 00:08:16,120 Speaker 3: changed the entire model of our business. You got to 178 00:08:16,200 --> 00:08:18,720 Speaker 3: change our contract that goes with it. You can't sit 179 00:08:18,760 --> 00:08:21,840 Speaker 3: here and say that, like, everything's different, but you want 180 00:08:21,920 --> 00:08:23,400 Speaker 3: us to operate in an old way. 181 00:08:24,040 --> 00:08:26,640 Speaker 2: And I completely agree with that. 182 00:08:26,720 --> 00:08:28,560 Speaker 3: And look, I think one good point is I've seen 183 00:08:28,600 --> 00:08:31,800 Speaker 3: some people in the comments on social media be like, oh, okay, Hollywood. 184 00:08:32,280 --> 00:08:32,959 Speaker 2: Look, here's the thing.
185 00:08:33,040 --> 00:08:37,840 Speaker 3: There's one hundred and sixty thousand members of SAG-AFTRA. Most 186 00:08:37,840 --> 00:08:40,959 Speaker 3: of those people are not the Angelina Jolies and the 187 00:08:41,080 --> 00:08:41,760 Speaker 3: Jennifer Lawrences. 188 00:08:41,800 --> 00:08:44,200 Speaker 1: That's the thing. We kind of think everyone's making millions. Yeah, 189 00:08:44,240 --> 00:08:46,400 Speaker 1: you think Brad Pitt, George Clooney as soon as we 190 00:08:46,440 --> 00:08:50,920 Speaker 1: say Hollywood, Leonardo DiCaprio, Jennifer Lawrence making twenty million dollars 191 00:08:50,960 --> 00:08:53,839 Speaker 1: in her last film. No, like, that's the point oh 192 00:08:53,880 --> 00:08:57,440 Speaker 1: oh one percent. You're not seeing the ninety nine percent of 193 00:08:57,480 --> 00:08:59,640 Speaker 1: that iceberg that is underwater, which is a bunch of 194 00:08:59,640 --> 00:09:03,320 Speaker 1: people grinding it out, just making rent, just like everybody 195 00:09:03,360 --> 00:09:05,160 Speaker 1: else in the world. They just love what they do 196 00:09:05,240 --> 00:09:07,000 Speaker 1: and they've chosen this profession. 197 00:09:06,640 --> 00:09:09,280 Speaker 2: Right, background actors, commercial actors. 198 00:09:09,360 --> 00:09:10,840 Speaker 1: And just bit-part actors. 199 00:09:10,840 --> 00:09:13,320 Speaker 2: I mean yeah. And then look, it's a difficult job. 200 00:09:13,360 --> 00:09:14,280 Speaker 2: There's a lot of rejection. 201 00:09:14,520 --> 00:09:16,079 Speaker 3: And part of the reason we have to have a 202 00:09:16,160 --> 00:09:18,920 Speaker 3: union and be protective is it's an inconsistent job. It's 203 00:09:18,960 --> 00:09:21,800 Speaker 3: not a nine to five. You can work, being in 204 00:09:21,920 --> 00:09:25,120 Speaker 3: like a great, you know, movie. Literally, you could be 205 00:09:25,120 --> 00:09:26,920 Speaker 3: a star in a movie one year and then 206 00:09:27,280 --> 00:09:30,319 Speaker 3: not work again for two or three years. So I 207 00:09:30,400 --> 00:09:35,559 Speaker 3: think the residuals thing is really important. And basically, you know, 208 00:09:35,840 --> 00:09:38,360 Speaker 3: these streamers don't want to, they don't want to 209 00:09:38,400 --> 00:09:40,600 Speaker 3: tell you how much your work is being used. Like, 210 00:09:40,679 --> 00:09:42,880 Speaker 3: think about that. It's kind of crazy when you think 211 00:09:42,880 --> 00:09:45,400 Speaker 3: about it. Like, a lawyer can have billable hours and 212 00:09:45,400 --> 00:09:47,400 Speaker 3: tell you, here's exactly how many hours I worked, and 213 00:09:47,440 --> 00:09:49,760 Speaker 3: I want to be paid for that. Actors are saying, 214 00:09:49,880 --> 00:09:51,920 Speaker 3: we want to know how many hours people are enjoying 215 00:09:51,920 --> 00:09:52,800 Speaker 3: our work and we want 216 00:09:52,679 --> 00:09:53,280 Speaker 2: to get paid for that. 217 00:10:05,480 --> 00:10:10,959 Speaker 3: So from residuals to point number two, which is wages, 218 00:10:11,160 --> 00:10:14,679 Speaker 3: simple living wages. On that note, this is pretty short 219 00:10:14,679 --> 00:10:18,280 Speaker 3: and sweet. What the union representatives are saying is that, 220 00:10:18,440 --> 00:10:20,800 Speaker 3: and keep in mind, you guys, the unions tried to 221 00:10:20,840 --> 00:10:23,679 Speaker 3: negotiate with, they tried to negotiate with the, let me 222 00:10:23,720 --> 00:10:28,079 Speaker 3: get this acronym right, the AMPTM.
223 00:10:28,360 --> 00:10:30,320 Speaker 2: Oh, I'm not going to get it right. Help me, babe. 224 00:10:30,520 --> 00:10:35,160 Speaker 1: Yes, it is the AMPTP, which is the Alliance of 225 00:10:35,200 --> 00:10:39,080 Speaker 1: Motion Picture and Television Producers. That is who essentially 226 00:10:39,120 --> 00:10:40,760 Speaker 1: SAG-AFTRA is striking against. 227 00:10:40,880 --> 00:10:42,480 Speaker 2: Because we've been negotiating with them. 228 00:10:42,559 --> 00:10:45,280 Speaker 3: Should point out the unions had been trying to negotiate 229 00:10:45,320 --> 00:10:48,200 Speaker 3: with them on a new contract for weeks on weeks, 230 00:10:48,200 --> 00:10:51,040 Speaker 3: if not months. We were actually supposed to negotiate our 231 00:10:51,080 --> 00:10:54,320 Speaker 3: contract before COVID hit, if you remember. And then COVID 232 00:10:54,400 --> 00:10:57,240 Speaker 3: hit and we said, okay, look, we'll allow a pause 233 00:10:57,280 --> 00:10:59,679 Speaker 3: on this. It's a wild time, it's unprecedented. We don't 234 00:10:59,679 --> 00:11:02,680 Speaker 3: know what's happening, so we'll delay. Well, then COVID changed 235 00:11:02,679 --> 00:11:04,640 Speaker 3: the industry even more, right, like people aren't going to 236 00:11:04,679 --> 00:11:07,319 Speaker 3: the movies as much, so all the more reason to 237 00:11:07,880 --> 00:11:13,320 Speaker 3: now strike, because even more's changed. So we have tried 238 00:11:13,360 --> 00:11:16,240 Speaker 3: to negotiate with them, and the whole living wages thing 239 00:11:16,240 --> 00:11:22,439 Speaker 3: of it is, per our union chief negotiator Duncan Crabtree-Ireland, 240 00:11:23,559 --> 00:11:27,680 Speaker 3: he said that they wanted actors to make less money 241 00:11:28,080 --> 00:11:30,120 Speaker 3: in the future than they have in the past. 242 00:11:30,520 --> 00:11:31,400 Speaker 2: He put it that simply. 243 00:11:31,440 --> 00:11:33,920 Speaker 3: He said, no, we would not agree; we would only 244 00:11:34,200 --> 00:11:37,040 Speaker 3: agree to a wage where in the future our union 245 00:11:37,080 --> 00:11:38,480 Speaker 3: members are going to be making more money than they 246 00:11:38,480 --> 00:11:40,120 Speaker 3: are now. We wanted them to factor in a little 247 00:11:40,120 --> 00:11:43,360 Speaker 3: thing called inflation. So just basic labor 248 00:11:43,440 --> 00:11:45,040 Speaker 2: yeah, desires. 249 00:11:45,160 --> 00:11:47,600 Speaker 1: But it's this third point that I think is really 250 00:11:47,720 --> 00:11:49,400 Speaker 1: at the crux of this and really at the middle, 251 00:11:49,440 --> 00:11:54,200 Speaker 1: and I have to, can I introduce it, please? It's AI. 252 00:11:55,000 --> 00:11:59,640 Speaker 1: Artificial intelligence is really a big part of this negotiation 253 00:11:59,760 --> 00:12:04,000 Speaker 1: and strike, and you broke this down very eloquently when 254 00:12:04,000 --> 00:12:04,360 Speaker 1: we were 255 00:12:04,280 --> 00:12:07,000 Speaker 2: talking about it. Well, really, I almost... 256 00:12:08,120 --> 00:12:12,520 Speaker 3: I was paraphrasing what our chief negotiator, Duncan, said. Let's 257 00:12:12,559 --> 00:12:15,760 Speaker 3: play a little piece from the press conference where he 258 00:12:16,280 --> 00:12:19,679 Speaker 3: and the other negotiating team announced that we'd be striking. 259 00:12:19,960 --> 00:12:21,240 Speaker 2: Here's what he said.
260 00:12:21,760 --> 00:12:26,400 Speaker 3: That is, what the studios and networks potentially want to 261 00:12:26,440 --> 00:12:29,360 Speaker 3: do in the future with artificial intelligence and actors. 262 00:12:29,520 --> 00:12:32,840 Speaker 4: They proposed that our background performers should be able to 263 00:12:32,840 --> 00:12:35,880 Speaker 4: be scanned, get paid for one day's pay, and their 264 00:12:35,880 --> 00:12:39,520 Speaker 4: company should own that scan, their image, their likeness, and 265 00:12:39,520 --> 00:12:41,480 Speaker 4: should be able to use it for the rest of 266 00:12:41,520 --> 00:12:44,600 Speaker 4: eternity in any project they want, with no consent and 267 00:12:44,640 --> 00:12:45,440 Speaker 4: no compensation. 268 00:12:46,160 --> 00:12:47,160 Speaker 2: So there you go. 269 00:12:47,320 --> 00:12:49,160 Speaker 3: I thought he explained it perfectly, and I got to 270 00:12:49,160 --> 00:12:51,160 Speaker 3: give a shout out to him and to Fran Drescher. 271 00:12:51,640 --> 00:12:54,720 Speaker 3: They killed it in that press conference. They have broken 272 00:12:54,800 --> 00:12:57,760 Speaker 3: things down in such clear terms for people to understand. 273 00:12:58,280 --> 00:12:59,800 Speaker 2: And how scary is that. 274 00:13:00,120 --> 00:13:02,480 Speaker 1: I went into that. I listened to that press conference 275 00:13:02,520 --> 00:13:04,800 Speaker 1: with you, and I thought, okay, look, I would like 276 00:13:04,840 --> 00:13:07,959 Speaker 1: to be open-minded and see both sides of this situation. 277 00:13:08,440 --> 00:13:11,120 Speaker 1: And I thought, in the coming days, I want to 278 00:13:11,160 --> 00:13:14,760 Speaker 1: hear how the Alliance of Motion Picture and Television Producers kind of 279 00:13:14,760 --> 00:13:16,959 Speaker 1: rebuts this whole thing. What is their take on it? 280 00:13:17,040 --> 00:13:19,920 Speaker 1: They have not, like, no one's denied anything that they 281 00:13:19,920 --> 00:13:22,360 Speaker 1: have said from this press conference, that all of this 282 00:13:22,640 --> 00:13:24,880 Speaker 1: was what was trying to be kind of forced down 283 00:13:24,920 --> 00:13:27,880 Speaker 1: the union's throat. I was shocked at that. 284 00:13:27,880 --> 00:13:29,240 Speaker 2: Right, you thought maybe there's a miscommunication. 285 00:13:29,400 --> 00:13:31,679 Speaker 1: Well, maybe they're, maybe they're, yeah, maybe they're trying to 286 00:13:31,720 --> 00:13:33,679 Speaker 1: exaggerate a little bit to make our case seem a 287 00:13:33,720 --> 00:13:36,079 Speaker 1: little stronger. You know, it's what you would do. 288 00:13:36,640 --> 00:13:39,440 Speaker 3: No, actually, on a podcast Duncan did, he 289 00:13:39,480 --> 00:13:42,160 Speaker 3: did our friend Matt Belloni's podcast The Town, and he 290 00:13:42,280 --> 00:13:44,040 Speaker 3: doubled down on that, because Matt asked him about it 291 00:13:44,080 --> 00:13:45,840 Speaker 3: and he said, no, no, no, I was like quoting 292 00:13:45,840 --> 00:13:51,440 Speaker 3: a document there. They want to scan an actor, pay 293 00:13:51,480 --> 00:13:54,000 Speaker 3: that actor for a day's work, and then use that 294 00:13:54,120 --> 00:14:02,120 Speaker 3: actor in perpetuity via artificial intelligence. So if 295 00:14:02,160 --> 00:14:05,080 Speaker 3: that doesn't scare everybody at home, I don't know what 296 00:14:05,160 --> 00:14:08,079 Speaker 3: does. That scares me.
That scares me, not just as 297 00:14:08,120 --> 00:14:11,320 Speaker 3: a performer. But we all have to be aware of 298 00:14:11,400 --> 00:14:14,760 Speaker 3: how these companies operate, because I think something critical that's happened 299 00:14:14,840 --> 00:14:18,000 Speaker 3: is studios kind of used to be run by creative people, 300 00:14:18,720 --> 00:14:20,680 Speaker 3: people who valued the art form, who wanted to make 301 00:14:20,680 --> 00:14:25,640 Speaker 3: good movies. In recent years, like many industries, studios have 302 00:14:25,680 --> 00:14:28,320 Speaker 3: gotten bought by bigger companies and now it's like numbers 303 00:14:28,360 --> 00:14:31,680 Speaker 3: cruncher money people at the top who are very focused 304 00:14:31,720 --> 00:14:34,240 Speaker 3: on their stock price and their bottom line. They're going 305 00:14:34,320 --> 00:14:36,440 Speaker 3: to want to go for what's cheap, for the cheaper chicken. 306 00:14:36,720 --> 00:14:38,840 Speaker 3: How can they make a movie cheaper? How can they 307 00:14:38,840 --> 00:14:41,840 Speaker 3: make more money? And I'm not saying that makes them 308 00:14:41,880 --> 00:14:44,280 Speaker 3: like villains, but I am saying that we have to 309 00:14:44,320 --> 00:14:46,880 Speaker 3: stand up, because no, you can't get paid for a 310 00:14:46,960 --> 00:14:47,480 Speaker 3: day's work. 311 00:14:47,520 --> 00:14:48,400 Speaker 2: And think about that. 312 00:14:48,640 --> 00:14:51,320 Speaker 3: They're saying they want to use your likeness in perpetuity, 313 00:14:51,640 --> 00:14:54,760 Speaker 3: so really what they're saying is potentially just eliminating 314 00:14:54,800 --> 00:14:56,840 Speaker 3: the job of the background actor. They're only going to 315 00:14:56,880 --> 00:15:00,120 Speaker 3: have artificial intelligence playing background actors in every movie. 316 00:15:00,160 --> 00:15:02,800 Speaker 1: And that, and that's enough, by the way, that's enough 317 00:15:02,800 --> 00:15:04,600 Speaker 1: to be scary, and that's enough to fight for. But 318 00:15:04,720 --> 00:15:08,880 Speaker 1: I see that going even further. And just 319 00:15:08,920 --> 00:15:15,160 Speaker 1: for reference, background acting, that day work, is vital to 320 00:15:15,800 --> 00:15:18,160 Speaker 1: the economy of Hollywood and to so many people getting 321 00:15:18,200 --> 00:15:21,280 Speaker 1: their start in this business. Getting your hours up, getting 322 00:15:21,280 --> 00:15:23,760 Speaker 1: your numbers up, being a part of the union, sometimes 323 00:15:23,840 --> 00:15:26,120 Speaker 1: leading to a speaking part that can change your life 324 00:15:26,160 --> 00:15:28,480 Speaker 1: and your career. That is what happens. I have never 325 00:15:28,520 --> 00:15:30,920 Speaker 1: been on a movie set, and I was lucky 326 00:15:31,000 --> 00:15:33,880 Speaker 1: enough to be in several movies, that didn't have extras. 327 00:15:34,840 --> 00:15:37,120 Speaker 1: The night of the strike, I was watching 328 00:15:37,120 --> 00:15:39,000 Speaker 1: one of my favorite shows on one of the streamers, 329 00:15:39,120 --> 00:15:42,480 Speaker 1: and there was a court scene and my immediate thought was, 330 00:15:43,040 --> 00:15:46,320 Speaker 1: they'll never have to fill a courtroom again.
They'll fill 331 00:15:46,320 --> 00:15:48,720 Speaker 1: it with AI figures, and you'll only pay the three 332 00:15:49,440 --> 00:15:51,480 Speaker 1: lawyer people, you know, the people that are the 333 00:15:51,480 --> 00:15:54,520 Speaker 1: main characters in this. You'll never pay the background. Take, 334 00:15:54,560 --> 00:15:57,240 Speaker 1: you know, one of my favorite movies, Moneyball. You 335 00:15:57,240 --> 00:15:59,920 Speaker 1: can fill up a stadium. A great movie like Rudy, 336 00:16:00,040 --> 00:16:01,880 Speaker 1: you can fill that stadium up. You don't have to 337 00:16:01,920 --> 00:16:04,040 Speaker 1: pay people anymore, and that is big work. 338 00:16:04,120 --> 00:16:07,520 Speaker 3: I will say, the union is not saying, the union 339 00:16:07,560 --> 00:16:09,800 Speaker 3: is actually not saying, you can't use AI at all. 340 00:16:09,920 --> 00:16:12,560 Speaker 3: SAG-AFTRA isn't saying that. But they're saying, we 341 00:16:12,600 --> 00:16:15,600 Speaker 3: can't allow you to pay someone for a day and 342 00:16:15,640 --> 00:16:18,040 Speaker 3: then you own their likeness in perpetuity. 343 00:16:18,120 --> 00:16:19,600 Speaker 2: That is wild. 344 00:16:19,840 --> 00:16:22,520 Speaker 3: Like, look, I'm sure they've CGI'd big crowds of people 345 00:16:22,560 --> 00:16:25,240 Speaker 3: at this point. That's not that different than artificial intelligence, 346 00:16:25,280 --> 00:16:27,880 Speaker 3: if you think about, like, how much CGI is done 347 00:16:27,920 --> 00:16:30,440 Speaker 3: in movies now. I'm not saying you got to, you know, 348 00:16:30,840 --> 00:16:34,240 Speaker 3: pay a million background actors, but you can't own someone's likeness. 349 00:16:34,280 --> 00:16:36,240 Speaker 1: You can't use you, and you can't use me, and 350 00:16:36,280 --> 00:16:38,840 Speaker 1: you can't use my, yeah, you can't use my expression, 351 00:16:38,920 --> 00:16:42,360 Speaker 1: my likeness. That's very different. And, I mean, what's 352 00:16:42,400 --> 00:16:46,560 Speaker 1: scarier is when you come in. I forget the actress 353 00:16:46,560 --> 00:16:48,800 Speaker 1: that came out and said, oh yeah, they scanned 354 00:16:48,840 --> 00:16:52,000 Speaker 1: our likeness and face when we came in, and our expressions, 355 00:16:52,480 --> 00:16:55,720 Speaker 1: and you can use that. You could use that for 356 00:16:55,760 --> 00:16:58,760 Speaker 1: an entire new show, you could use it for a movie. 357 00:16:59,160 --> 00:17:01,280 Speaker 1: I mean, think of this. This is on a crazy 358 00:17:01,280 --> 00:17:07,639 Speaker 1: scale of AI. Daniel Craig could be James Bond forever. Well, 359 00:17:08,280 --> 00:17:10,199 Speaker 1: he'll never age out of it. We'll just keep him 360 00:17:10,200 --> 00:17:11,080 Speaker 1: at this age now. 361 00:17:11,200 --> 00:17:12,760 Speaker 2: And by the way, I think that's very real. 362 00:17:12,800 --> 00:17:14,880 Speaker 3: I mean, you've heard they're recording new Beatles music using 363 00:17:14,920 --> 00:17:16,840 Speaker 3: AI to recreate John Lennon's voice. 364 00:17:16,720 --> 00:17:16,960 Speaker 1: Yeah. 365 00:17:17,160 --> 00:17:19,280 Speaker 3: Now, I think what our union is saying is, we 366 00:17:19,359 --> 00:17:21,560 Speaker 3: know that's coming. We're not rejecting the 367 00:17:21,480 --> 00:17:22,639 Speaker 1: future, we're embracing it 368 00:17:22,680 --> 00:17:25,160 Speaker 2: actually. But you have to
369 00:17:25,040 --> 00:17:28,000 Speaker 3: pay people for their skills, for their talents, for them. 370 00:17:28,680 --> 00:17:32,199 Speaker 3: And, you know, if you look at it 371 00:17:32,200 --> 00:17:34,840 Speaker 3: from the writers' perspective, the Writers Guild is actually being 372 00:17:34,840 --> 00:17:37,480 Speaker 3: a little more intense about it. They say, we want 373 00:17:37,520 --> 00:17:39,359 Speaker 3: a promise in our contract that AI is not going 374 00:17:39,400 --> 00:17:43,080 Speaker 3: to write scripts, because, like, one thing I didn't realize was, 375 00:17:43,160 --> 00:17:47,000 Speaker 3: for example, there's no, like, copyright law on AI right now, 376 00:17:47,080 --> 00:17:49,400 Speaker 3: or at least very little, right? So what a studio 377 00:17:49,440 --> 00:17:52,600 Speaker 3: could say is, like, all movies, if you've seen a movie, 378 00:17:52,640 --> 00:17:55,360 Speaker 3: the script's available to find online. I could google the Bridesmaids 379 00:17:55,440 --> 00:17:57,359 Speaker 3: script right now and I'd get the entire script in 380 00:17:57,400 --> 00:18:00,400 Speaker 3: a second. So what a studio could say is, we're 381 00:18:00,400 --> 00:18:04,760 Speaker 3: gonna have AI read all of Aaron Sorkin's scripts and 382 00:18:04,800 --> 00:18:07,639 Speaker 3: then we're going to tell that artificial intelligence to write 383 00:18:07,640 --> 00:18:11,639 Speaker 3: a movie in the style of Aaron Sorkin. So where's 384 00:18:11,680 --> 00:18:14,880 Speaker 3: the copyright protection there? Because you're kind of almost plagiarizing 385 00:18:14,960 --> 00:18:18,280 Speaker 3: Aaron Sorkin in a way. But they're going to try 386 00:18:18,320 --> 00:18:20,119 Speaker 3: to have the AI do it because it's cheaper there, 387 00:18:20,160 --> 00:18:20,480 Speaker 3: and then 388 00:18:20,359 --> 00:18:22,040 Speaker 1: you could bring in maybe a writer or two to 389 00:18:22,080 --> 00:18:25,440 Speaker 1: punch it up and make it worthy for the network. 390 00:18:25,280 --> 00:18:29,920 Speaker 3: And so there's copyright issues there. And also, from the 391 00:18:29,920 --> 00:18:34,000 Speaker 3: writers' perspective, I mean, they could just, and I worry, 392 00:18:34,080 --> 00:18:38,040 Speaker 3: like, for just the quality of it. I mean, think 393 00:18:38,080 --> 00:18:41,199 Speaker 3: about, think about how much we quote movie lines or 394 00:18:41,240 --> 00:18:43,520 Speaker 3: we quote TV shows, and how a TV show or 395 00:18:43,560 --> 00:18:46,640 Speaker 3: a movie emotionally connected you. I mean, I mentioned Bridesmaids, 396 00:18:46,720 --> 00:18:49,200 Speaker 3: which, as you know, I still quote Bridesmaids probably 397 00:18:49,200 --> 00:18:51,920 Speaker 3: once a week. I'll be like, there's something, there's someone 398 00:18:51,960 --> 00:18:53,879 Speaker 3: on the wing, she's churning butter on the wing, or 399 00:18:54,160 --> 00:18:56,359 Speaker 3: ready to party. 400 00:18:56,600 --> 00:18:58,119 Speaker 2: So I quote that movie all the time. 401 00:18:58,880 --> 00:19:03,000 Speaker 3: I'm sure studios intend to have AI straight up just 402 00:19:03,040 --> 00:19:05,639 Speaker 3: write scripts, even if you're not trying to do it 403 00:19:05,680 --> 00:19:08,439 Speaker 3: in the style of someone. They're gonna want AI to 404 00:19:08,560 --> 00:19:11,359 Speaker 3: just write scripts. And I can't even think of the 405 00:19:11,480 --> 00:19:14,239 Speaker 3: quality that we're going to start to get.
We're going 406 00:19:14,280 --> 00:19:17,200 Speaker 3: to become dumber as a society, I think, and really 407 00:19:17,280 --> 00:19:19,840 Speaker 3: living, like, I don't want to, I'm not trying to 408 00:19:19,840 --> 00:19:22,600 Speaker 3: reject the future, but I don't want to live in 409 00:19:22,640 --> 00:19:25,600 Speaker 3: a world that's lacking the human experience. 410 00:19:37,800 --> 00:19:39,760 Speaker 1: I think the biggest point out of this, to go 411 00:19:39,880 --> 00:19:42,320 Speaker 1: back to the actual strike, that you hit the nail 412 00:19:42,359 --> 00:19:45,040 Speaker 1: on the head: Fran Drescher, the president of the 413 00:19:45,119 --> 00:19:48,399 Speaker 1: union, as well as the chief negotiator, said the union 414 00:19:48,520 --> 00:19:52,960 Speaker 1: is encouraging AI. They believe in AI. They know that 415 00:19:52,960 --> 00:19:56,880 Speaker 1: that's where the future lies. It just can't be unfettered 416 00:19:57,040 --> 00:20:00,840 Speaker 1: and unchecked. We need some checks and balances, we need 417 00:20:00,880 --> 00:20:03,200 Speaker 1: some precautions taken. And you got to do it now 418 00:20:03,560 --> 00:20:06,359 Speaker 1: because if you don't, the cat's out of the bag, 419 00:20:06,480 --> 00:20:08,520 Speaker 1: or, as we like to say, the toothpaste is 420 00:20:08,560 --> 00:20:10,520 Speaker 1: out of the tube and this thing's over. And so 421 00:20:11,000 --> 00:20:13,199 Speaker 1: you got to make this stand now, and doing it 422 00:20:13,240 --> 00:20:15,000 Speaker 1: with the Writers Guild is the right thing to do. 423 00:20:15,359 --> 00:20:17,600 Speaker 1: But I like the fact they're not fighting it, they're 424 00:20:17,680 --> 00:20:20,320 Speaker 1: encouraging it. Just let's all put our arms together and 425 00:20:20,359 --> 00:20:20,560 Speaker 1: do this. 426 00:20:20,680 --> 00:20:22,240 Speaker 3: Right. You're seeing this, and this is why I bring 427 00:20:22,280 --> 00:20:24,760 Speaker 3: it up, because again we're seeing this across industries, right? 428 00:20:24,760 --> 00:20:27,200 Speaker 3: I mean, just in May of this year, it was 429 00:20:27,240 --> 00:20:29,399 Speaker 3: the headline everywhere that like three hundred and fifty of 430 00:20:29,440 --> 00:20:33,760 Speaker 3: the top experts on AI published a letter warning that 431 00:20:33,840 --> 00:20:36,680 Speaker 3: they think AI is potentially, what you said, a Pandora's 432 00:20:36,680 --> 00:20:40,840 Speaker 3: box, and could, quote, pose a risk of extinction to humans. When 433 00:20:40,880 --> 00:20:42,360 Speaker 3: are we going to sit up and take this seriously? 434 00:20:42,480 --> 00:20:43,879 Speaker 3: By the way, you know, an industry called all this 435 00:20:43,920 --> 00:20:47,359 Speaker 3: out: the movies. I mean, The Matrix and The Terminator. 436 00:20:47,400 --> 00:20:49,919 Speaker 3: And like, when are we gonna say we need to 437 00:20:49,920 --> 00:20:52,320 Speaker 3: put some regulations on this, because the problem is if 438 00:20:52,320 --> 00:20:56,440 Speaker 3: it goes unregulated, and if at any point somebody takes 439 00:20:56,480 --> 00:20:59,000 Speaker 3: it too far and experiments too much, we have a 440 00:20:59,000 --> 00:21:01,840 Speaker 3: serious problem on our hands. Probably in some small ways 441 00:21:01,840 --> 00:21:04,080 Speaker 3: that's already happened, like in government stuff that we don't 442 00:21:04,119 --> 00:21:04,639 Speaker 3: even know about.
443 00:21:04,760 --> 00:21:07,080 Speaker 1: Okay. So those are kind of the three pillars of 444 00:21:07,119 --> 00:21:08,560 Speaker 1: what everybody's fighting for. 445 00:21:08,680 --> 00:21:11,840 Speaker 3: And I want to point out, we are striking because 446 00:21:13,680 --> 00:21:18,280 Speaker 3: we couldn't, I mean, per our union, they tried, and 447 00:21:18,760 --> 00:21:23,040 Speaker 3: the studios and the networks just wouldn't give on things. 448 00:21:23,320 --> 00:21:25,360 Speaker 3: And you got to meet in the middle at some point, 449 00:21:25,480 --> 00:21:27,160 Speaker 3: or if you don't, that's how a strike happens. 450 00:21:27,160 --> 00:21:30,800 Speaker 1: And the studios walked away, and that's what caused this strike. 451 00:21:30,920 --> 00:21:34,080 Speaker 3: You've been in this business longer than me, you 452 00:21:34,119 --> 00:21:36,400 Speaker 3: know, not since the sixties, you weren't born then, but... 453 00:21:36,560 --> 00:21:38,679 Speaker 1: I remember working with John Wayne. 454 00:21:39,720 --> 00:21:41,679 Speaker 2: Did you ever do anything with Reagan? You and 455 00:21:41,720 --> 00:21:42,879 Speaker 2: Reagan hitting it off together? 456 00:21:44,359 --> 00:21:48,440 Speaker 3: What do you think, knowing, you know, some of these 457 00:21:49,240 --> 00:21:51,520 Speaker 3: bigwigs and stuff like that? Oh well, and I 458 00:21:51,560 --> 00:21:53,320 Speaker 3: should point out a huge part of the reason people 459 00:21:53,359 --> 00:21:57,080 Speaker 3: are frustrated. I'm sorry for backtracking, but it's just the amounts 460 00:21:57,119 --> 00:21:59,080 Speaker 3: of money. I mean, you and I have dealt with 461 00:21:59,119 --> 00:22:01,879 Speaker 3: this every time we go to negotiate, and everybody deals with 462 00:22:01,880 --> 00:22:04,360 Speaker 3: this at work. But for your job, for your contract, 463 00:22:04,480 --> 00:22:07,560 Speaker 3: you're told, you know, money's tight right now, gosh, COVID 464 00:22:07,600 --> 00:22:10,240 Speaker 3: made it hard and all that. But then how do 465 00:22:10,280 --> 00:22:14,040 Speaker 3: you argue with some of these CEOs making tens of millions. 466 00:22:13,680 --> 00:22:15,919 Speaker 1: Hundreds. No, no, hundreds of millions. 467 00:22:16,119 --> 00:22:16,719 Speaker 2: I'm not good at math. 468 00:22:16,920 --> 00:22:20,919 Speaker 1: Yeah, these CEOs are making hundreds of millions of dollars. 469 00:22:20,600 --> 00:22:23,879 Speaker 3: And so I think the unions are frustrated, because they're like, 470 00:22:24,040 --> 00:22:27,200 Speaker 3: our people can't make a living wage. And you've got 471 00:22:27,240 --> 00:22:29,879 Speaker 3: one person in your company, and if you could just 472 00:22:30,200 --> 00:22:32,760 Speaker 3: rearrange some of the money, reallocate it a little bit, 473 00:22:32,800 --> 00:22:33,320 Speaker 3: it would 474 00:22:33,080 --> 00:22:36,359 Speaker 1: be a rounding error on your tax return to fix 475 00:22:36,400 --> 00:22:37,320 Speaker 1: this problem. 476 00:22:37,040 --> 00:22:37,840 Speaker 2: One hundred percent. 477 00:22:37,960 --> 00:22:41,040 Speaker 3: I mean, I think about Amazon, and look, no 478 00:22:41,160 --> 00:22:43,600 Speaker 3: hate to the people who work at Amazon, but they 479 00:22:43,640 --> 00:22:45,840 Speaker 3: made that show, the most expensive show ever, the 480 00:22:45,880 --> 00:22:48,760 Speaker 3: Lord of the Rings show.
That show cost like four 481 00:22:48,880 --> 00:22:53,160 Speaker 3: hundred and fifty million dollars or something, and who really 482 00:22:53,240 --> 00:22:53,840 Speaker 3: wanted it? 483 00:22:53,800 --> 00:22:55,679 Speaker 1: You might as well have just burned it at the bottom 484 00:22:55,680 --> 00:22:56,600 Speaker 1: of a canyon. 485 00:22:56,840 --> 00:22:58,960 Speaker 3: If you took that four hundred and fifty million and 486 00:22:59,440 --> 00:23:01,040 Speaker 3: paid the writers better. 487 00:23:01,560 --> 00:23:03,240 Speaker 2: Some of their hit shows, like Jury Duty was a 488 00:23:03,240 --> 00:23:04,120 Speaker 2: big hit for Amazon. 489 00:23:04,160 --> 00:23:06,520 Speaker 3: I mean, just reallocate, just be a bit smarter about 490 00:23:06,560 --> 00:23:07,600 Speaker 3: how you're spending your money. 491 00:23:07,680 --> 00:23:12,480 Speaker 1: So right now, we are at an impasse. 492 00:23:12,119 --> 00:23:14,639 Speaker 3: And I was going to ask you, knowing these CEOs 493 00:23:14,640 --> 00:23:17,120 Speaker 3: and such, what do you think could happen from here? 494 00:23:17,400 --> 00:23:19,000 Speaker 1: Well, that's, that's what I was going to get 495 00:23:19,040 --> 00:23:20,880 Speaker 1: to with you, of where we are in all this. 496 00:23:22,320 --> 00:23:24,600 Speaker 1: And again, this is, I mean, you're 497 00:23:24,640 --> 00:23:29,560 Speaker 1: basically betting on a coin flip here. Is this going 498 00:23:29,600 --> 00:23:32,840 Speaker 1: to go longer? Yes. I mean, my guess is 499 00:23:32,840 --> 00:23:36,880 Speaker 1: the studios will back away and use the cover of summer, 500 00:23:37,000 --> 00:23:39,359 Speaker 1: which is the doldrums of production. 501 00:23:39,080 --> 00:23:40,800 Speaker 2: Anyway, there's usually not new TV shows. 502 00:23:40,840 --> 00:23:42,920 Speaker 1: Yeah, network-wise, you're not in production, you're not really doing 503 00:23:43,000 --> 00:23:45,200 Speaker 1: much, and there's no new TV shows, so you're 504 00:23:45,240 --> 00:23:48,320 Speaker 1: not going to really notice anything. So let's just let 505 00:23:48,320 --> 00:23:52,719 Speaker 1: this go a little bit longer, maybe into September, October, 506 00:23:53,040 --> 00:23:56,000 Speaker 1: when everybody has to re-engage, because that's when we're 507 00:23:56,000 --> 00:23:58,080 Speaker 1: going to start feeling the heat. And I say we, 508 00:23:58,520 --> 00:24:01,520 Speaker 1: television viewers, people that want to see their shows, will 509 00:24:01,560 --> 00:24:04,760 Speaker 1: start feeling this. But I think we're going to get 510 00:24:04,800 --> 00:24:07,879 Speaker 1: a couple of months of let's just see how this goes. 511 00:24:07,920 --> 00:24:11,040 Speaker 1: And that's, again, totally guessing. This is a 512 00:24:11,040 --> 00:24:13,520 Speaker 1: coin flip. They could come back tomorrow and this whole 513 00:24:13,520 --> 00:24:15,400 Speaker 1: thing could be solved, but I don't think it's gonna 514 00:24:15,400 --> 00:24:15,840 Speaker 1: happen either. 515 00:24:15,920 --> 00:24:18,000 Speaker 3: Well, we've seen, you know, we love drama here 516 00:24:18,000 --> 00:24:20,359 Speaker 3: at the Most Dramatic Podcast Ever. This is also bringing 517 00:24:20,440 --> 00:24:23,080 Speaker 3: some big-time, like, Hollywood insider drama.
There was an 518 00:24:23,160 --> 00:24:27,320 Speaker 3: article in Deadline, which is just an industry news publication, 519 00:24:28,080 --> 00:24:33,159 Speaker 3: and they quoted an executive anonymously, not on the record, 520 00:24:33,800 --> 00:24:38,520 Speaker 3: but this person said that the goal of the AMP, the, 521 00:24:38,680 --> 00:24:41,960 Speaker 3: what is it, the AMPTM, the alliance, oh 522 00:24:41,800 --> 00:24:43,600 Speaker 2: God, the alliance. That sounds, that makes it sound 523 00:24:43,600 --> 00:24:44,080 Speaker 2: like a Star 524 00:24:43,920 --> 00:24:46,280 Speaker 1: Wars thing, the Alliance. We're against the Alliance. 525 00:24:46,640 --> 00:24:52,480 Speaker 3: They quoted an executive who said that their goal 526 00:24:53,560 --> 00:24:56,880 Speaker 3: was to hold out and to make the writers suffer 527 00:24:56,960 --> 00:25:00,160 Speaker 3: so much that they would become homeless. 528 00:25:00,160 --> 00:25:01,560 Speaker 1: A really silly thing to say. And I know the 529 00:25:01,600 --> 00:25:04,760 Speaker 1: Bob Iger interview from Sun Valley was not a great 530 00:25:04,800 --> 00:25:05,359 Speaker 1: look either. 531 00:25:05,720 --> 00:25:09,399 Speaker 3: He even gave the interview from Sun Valley, Idaho, against 532 00:25:09,440 --> 00:25:13,280 Speaker 3: the mountains, talking about how the lowly writers aren't being reasonable. 533 00:25:14,040 --> 00:25:16,119 Speaker 1: So yeah, it hasn't been a good look. And so 534 00:25:16,240 --> 00:25:18,440 Speaker 1: far the Alliance has not done a good job of 535 00:25:19,800 --> 00:25:22,719 Speaker 1: getting their ducks in a row. And what's difficult 536 00:25:22,760 --> 00:25:24,920 Speaker 1: about this is it's not just this small group, as 537 00:25:24,920 --> 00:25:28,880 Speaker 1: you said, anymore. There are so many massive industries when 538 00:25:28,920 --> 00:25:32,280 Speaker 1: you're dealing with Amazon, Apple, Netflix, 539 00:25:31,840 --> 00:25:36,040 Speaker 3: entities with different motives and different financial situations. NBC, 540 00:25:36,119 --> 00:25:38,439 Speaker 3: even though they have a streamer in Peacock, they're in 541 00:25:38,440 --> 00:25:40,399 Speaker 3: a very different financial model than Apple. 542 00:25:40,480 --> 00:25:41,920 Speaker 2: Apple makes their money from iPhones. 543 00:25:42,119 --> 00:25:44,359 Speaker 1: Amazon wants you to buy underwear and Clorox. 544 00:25:44,400 --> 00:25:47,880 Speaker 3: Apple could sit with this strike forever because they're beyond 545 00:25:48,000 --> 00:25:51,440 Speaker 3: cash-positive from iPhones. Their main product is not TV 546 00:25:51,520 --> 00:25:53,480 Speaker 3: shows and movies. They have different priorities, and I think 547 00:25:53,480 --> 00:25:57,119 Speaker 3: that's going to create drama and infighting within the Alliance, 548 00:25:57,200 --> 00:25:58,680 Speaker 3: because I think they're going to get to the point 549 00:25:58,680 --> 00:26:01,280 Speaker 3: where the networks are like, we need new TV shows, 550 00:26:01,440 --> 00:26:02,960 Speaker 3: and Apple's like, oh, we don't really care. 551 00:26:03,119 --> 00:26:05,080 Speaker 1: And I think there is already that infighting. I 552 00:26:05,080 --> 00:26:07,359 Speaker 1: think there were people that were very unhappy about that 553 00:26:07,440 --> 00:26:09,480 Speaker 1: one quote about people becoming homeless. 554 00:26:09,520 --> 00:26:10,600 Speaker 2: Yes, the problem.
555 00:26:10,720 --> 00:26:14,240 Speaker 3: Apparently, allegedly, reportedly, there were phone calls back and forth: 556 00:26:14,280 --> 00:26:16,119 Speaker 3: who said this? This isn't how we're supposed to be 557 00:26:16,119 --> 00:26:19,560 Speaker 3: presenting ourselves. But here's one thing I don't think that 558 00:26:19,600 --> 00:26:21,000 Speaker 3: the Alliance was prepared for. 559 00:26:21,920 --> 00:26:23,760 Speaker 1: We need music done. 560 00:26:24,760 --> 00:26:25,239 Speaker 2: They were not. 561 00:26:25,560 --> 00:26:27,960 Speaker 3: You know what the music we need is? The Nanny 562 00:26:28,000 --> 00:26:33,359 Speaker 3: theme song. They were not prepared for Fran, our union 563 00:26:33,400 --> 00:26:37,760 Speaker 3: president Fran Drescher. Listen to just a snippet of the 564 00:26:37,800 --> 00:26:40,840 Speaker 3: speech that she gave after hours of negotiating. 565 00:26:40,840 --> 00:26:41,920 Speaker 2: The woman is exhausted. 566 00:26:41,920 --> 00:26:44,359 Speaker 3: And here is the fiery speech she gave at the 567 00:26:44,359 --> 00:26:45,960 Speaker 3: press conference announcing the strike. 568 00:26:46,240 --> 00:26:50,320 Speaker 5: How they plead poverty, that they're losing money left and 569 00:26:50,400 --> 00:26:54,960 Speaker 5: right when giving hundreds of millions of dollars to their CEOs. 570 00:26:56,200 --> 00:27:01,280 Speaker 5: It is disgusting. Shame on them. They stand on the 571 00:27:01,320 --> 00:27:05,760 Speaker 5: wrong side of history at this very moment. You cannot 572 00:27:05,920 --> 00:27:10,840 Speaker 5: change the business model as much as it has changed 573 00:27:11,440 --> 00:27:15,679 Speaker 5: and not expect the contract to change too. We're not 574 00:27:15,800 --> 00:27:19,439 Speaker 5: going to keep doing incremental changes on a contract that 575 00:27:19,640 --> 00:27:21,119 Speaker 5: no longer honors 576 00:27:21,720 --> 00:27:23,320 Speaker 5: what is happening right 577 00:27:23,119 --> 00:27:26,560 Speaker 5: now with this business model that was foisted upon us. 578 00:27:27,359 --> 00:27:30,840 Speaker 5: What are we doing moving around furniture on the Titanic? 579 00:27:32,320 --> 00:27:33,240 Speaker 4: It's crazy. 580 00:27:35,040 --> 00:27:37,240 Speaker 5: So the jig is up, AMPTP. 581 00:27:38,080 --> 00:27:39,240 Speaker 4: We stand tall. 582 00:27:40,160 --> 00:27:44,000 Speaker 5: You have to wake up and smell the coffee. We 583 00:27:44,480 --> 00:27:47,000 Speaker 5: are labor and we stand tall. 584 00:27:47,640 --> 00:27:49,160 Speaker 2: She killed it. 585 00:27:50,040 --> 00:27:52,520 Speaker 3: And I don't think the studios realized, like, 586 00:27:52,840 --> 00:27:55,600 Speaker 3: that's somebody people know. You know, our president isn't someone 587 00:27:55,600 --> 00:27:57,679 Speaker 3: you've never heard of. And I will say, it's different. 588 00:27:58,040 --> 00:28:00,480 Speaker 3: No shade to the Writers Guild, but they're eleven 589 00:28:00,480 --> 00:28:03,399 Speaker 3: thousand people; SAG-AFTRA is one hundred and sixty thousand people 590 00:28:03,440 --> 00:28:05,720 Speaker 3: and some very recognizable faces. 591 00:28:06,359 --> 00:28:08,800 Speaker 2: And look, I know the Nanny. 592 00:28:08,800 --> 00:28:11,040 Speaker 3: I care about the Nanny, and I think people are 593 00:28:11,040 --> 00:28:15,080 Speaker 3: gonna listen and sit up and pay attention.
And I 594 00:28:15,119 --> 00:28:17,240 Speaker 3: would definitely listen to the Nanny before I'd listen to 595 00:28:17,240 --> 00:28:20,280 Speaker 3: this studio executive who makes hundreds of millions sitting on 596 00:28:20,320 --> 00:28:21,040 Speaker 3: their private jet. 597 00:28:21,160 --> 00:28:23,680 Speaker 1: Well, one of the things that you can't do when 598 00:28:23,720 --> 00:28:26,840 Speaker 1: you are in this strike and a part of this union 599 00:28:27,320 --> 00:28:30,639 Speaker 1: is you can't go out and promote your work. You 600 00:28:30,680 --> 00:28:35,000 Speaker 1: can't promote your movie. So this dropped, this news dropped, 601 00:28:35,440 --> 00:28:37,880 Speaker 1: when they were doing the premiere of Oppenheimer in London, 602 00:28:38,200 --> 00:28:41,680 Speaker 1: and the actors walked off the red carpet. They left. 603 00:28:41,840 --> 00:28:44,480 Speaker 1: And so you will start to feel this: your late 604 00:28:44,560 --> 00:28:46,960 Speaker 1: night shows, when you see your favorite actors and actresses, 605 00:28:47,000 --> 00:28:48,880 Speaker 1: they're not gonna be on. They can't come on and 606 00:28:49,040 --> 00:28:50,880 Speaker 1: talk about their new movie or talk about their new 607 00:28:50,880 --> 00:28:53,880 Speaker 1: TV show. So it'll affect the late night shows. Those 608 00:28:53,920 --> 00:28:57,920 Speaker 1: will probably go dark, and the writers. So the ripple 609 00:28:57,960 --> 00:29:01,760 Speaker 1: effects are already being felt as Oppenheimer and Barbie are 610 00:29:01,800 --> 00:29:04,160 Speaker 1: about to release, and Mission Impossible, and you're not going 611 00:29:04,240 --> 00:29:06,400 Speaker 1: to see these stars out doing the talk shows anymore. 612 00:29:06,480 --> 00:29:08,360 Speaker 3: Well, and what that might mean is they're taking to 613 00:29:08,400 --> 00:29:11,120 Speaker 3: Instagram and hopefully the picket lines and talking about the 614 00:29:11,160 --> 00:29:14,040 Speaker 3: issues in a way that people understand and will listen. 615 00:29:14,160 --> 00:29:17,360 Speaker 3: And you know, I say, Fran Drescher, that's my president. 616 00:29:17,480 --> 00:29:20,400 Speaker 3: I really love her. I think she's very relatable, 617 00:29:20,960 --> 00:29:24,440 Speaker 3: and I think she made all this really easy to understand, 618 00:29:24,480 --> 00:29:26,560 Speaker 3: and I hope we've made it easy to understand today. 619 00:29:27,400 --> 00:29:29,520 Speaker 1: Fran, we want you on that wall. We need you 620 00:29:29,560 --> 00:29:33,600 Speaker 1: on that wall. See, plagiarizing a great movie again. 621 00:29:33,640 --> 00:29:36,720 Speaker 3: I think that this is going to potentially set precedent 622 00:29:37,200 --> 00:29:39,800 Speaker 3: for lots of industries, especially, like I said, on the 623 00:29:39,840 --> 00:29:43,840 Speaker 3: AI thing, and certainly everybody understands right now inflation, how 624 00:29:43,880 --> 00:29:47,960 Speaker 3: hard it is to make a living today, and that, 625 00:29:49,120 --> 00:29:50,880 Speaker 3: you know, we're fighting for our future. 626 00:29:51,040 --> 00:29:52,959 Speaker 2: That I really agree with. 627 00:29:53,440 --> 00:29:57,320 Speaker 3: I hate that we're striking, and Fran said this: look, we 628 00:29:57,400 --> 00:29:59,560 Speaker 3: don't want to be striking, because we don't want people 629 00:29:59,600 --> 00:30:01,440 Speaker 3: to be out of work.
We don't want to negatively 630 00:30:01,440 --> 00:30:07,360 Speaker 3: affect the economy of Hollywood and many places abroad. But 631 00:30:08,280 --> 00:30:12,360 Speaker 3: we have to fight for our future, because otherwise it's 632 00:30:12,440 --> 00:30:14,520 Speaker 3: only going to get worse from here. And so this 633 00:30:14,640 --> 00:30:17,120 Speaker 3: is a historic moment, just like the residuals thing was 634 00:30:17,120 --> 00:30:19,960 Speaker 3: in the sixties, just like when the Internet first started 635 00:30:20,040 --> 00:30:22,480 Speaker 3: and writers had to first fight to get paid for, 636 00:30:22,720 --> 00:30:24,560 Speaker 3: you know, their work being used on the internet and 637 00:30:24,560 --> 00:30:26,320 Speaker 3: that kind of thing. This is another one of those 638 00:30:26,440 --> 00:30:29,520 Speaker 3: mile marker moments where we have to stand up. 639 00:30:29,840 --> 00:30:33,280 Speaker 1: I hope this helped. I hope you understand a little 640 00:30:33,280 --> 00:30:36,880 Speaker 1: bit more of why this is going on and maybe 641 00:30:36,880 --> 00:30:40,360 Speaker 1: what's to come. And we'll follow this, and, you know, 642 00:30:40,400 --> 00:30:42,120 Speaker 1: we'd love to hear your comments; leave them in the 643 00:30:42,400 --> 00:30:45,160 Speaker 1: comment section below or shoot us a DM at the 644 00:30:45,200 --> 00:30:48,160 Speaker 1: Most Dramatic Pod Ever. But thank you so much, and LZ, 645 00:30:48,200 --> 00:30:49,800 Speaker 1: thank you for doing this. 646 00:30:50,080 --> 00:30:51,440 Speaker 2: Hey, thank you, Fran, for doing this. 647 00:30:51,520 --> 00:30:55,080 Speaker 1: Thank you, Fran, and I hope you enjoyed us opening 648 00:30:55,160 --> 00:30:57,200 Speaker 1: up the playbook once again. We'll talk to you next 649 00:30:57,200 --> 00:30:59,240 Speaker 1: time, because we have a lot more to talk about. 650 00:30:59,400 --> 00:31:02,000 Speaker 1: Thanks for listening. Follow us on Instagram at the Most 651 00:31:02,080 --> 00:31:04,600 Speaker 1: Dramatic Pod Ever and make sure to write us a 652 00:31:04,640 --> 00:31:07,280 Speaker 1: review and leave us five stars. I'll talk to you 653 00:31:07,320 --> 00:31:07,720 Speaker 1: next time.