Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. In honor of the Academy Awards this weekend, we have an extra special episode for you about the movie Bombshell. Bombshell is based on the sexual harassment scandal at Fox News that brought down CEO Roger Ailes. The movie is up for several awards: Best Actress for Charlize Theron, Best Supporting Actress for Margot Robbie, and Best Makeup and Hairstyling. I got to speak to the screenwriter, Charles Randolph, about how the movie came into existence, and about his views on the awards process in general and the complicated politics that surround them.

Charles, thank you so much for being here. This is a thrill for me.

Gladly, gladly.

I want to start by talking about the process that led you to write the screenplay for Bombshell. Take us back. Maybe it was right when you had just won your Academy Award for The Big Short, or maybe it was before that. What was the timing, in relation to that?
It was not long after.

So you're at the top of your career, you've just won an Academy Award, and you're thinking, what project do I do next? How did you gravitate to the Fox News harassment story?

Wow, what a good question. You know, it's a great story, right. The fact that a story of feminist determination came from these conservative women was really interesting and complicated. And what you're always looking for is strong characters with strong internal conflicts. We think of a movie being made about your life as an award that Hollywood gives you, but that's not the case. We are always looking for stories where people have a rich and unusual conflict, and you need both, that will drive the story forward. And Megyn and Gretchen both had that.

Megyn Kelly, Gretchen Carlson. Everybody knows their names, and first names are appropriate, but we should say it at least once.

Exactly. So that's what the appeal was: this is a fascinating story with strong characters, with strong conflicts. Also, I mean, I am a bit of a contrarian.
So the idea that this pre-Me Too moment with such illustrative power in our society came from inside Fox News was sort of fascinating.

Although Me Too hadn't happened yet when you sat down to write this.

No, it hadn't.

We'll get to that more in a second, but I just want to go back to this question of the conflicts, right. I mean, to see the film is to believe you, because you wrote them as characters with extraordinary conflicts. It's one of the reasons it's such a skillful script. But what conflicts did you see, as it were, before you started? I mean, when I hear that story, I don't think to myself, oh my goodness, what conflicts. This is maybe why I'm not a screenwriter. What made you think there were conflicts there? Did you know in advance there had to have been conflict?

Megyn Kelly was quiet for ten days. Why? Right? This thing happens, and it's in everyone's interest to go out and say, Roger's fine, these things never happened, all that. But she was silent, right. So what happened in that time? And that's where you start. You start with this thing of, hmm, that's really interesting.
She was quiet. So you really are starting from that, you're starting from, hmm, there's a tell there.

Let me ask a follow-up question about the tell. One of the remarkable things that you do is you tell stories that are true stories, right, but you're still in some way fictionalizing them for the screen. And that's an incredibly delicate dance in its own way. It's much harder than doing a pure truth documentary or pure made-up fiction. It's just a higher degree of difficulty, as it were, on the dive, right. So you imagined that those ten days embodied a conflict, and you made a movie where those ten days do embody a conflict. Would it matter to you if, in fact, the ten days were, you know, she called her lawyers and they were like, let's see how this plays out, we have a goal of getting more money at the end of the day? If it didn't reflect some internal conflict, would that matter?

It would, it would.
You know, she's watched the film, and she's come out and said that my portrayal of that internal conflict is too harsh in some ways against her, because she was always a strong advocate of women. And we have one character accuse her, in her silence, of having, you know, perpetuated the culture of harassment, and then she reacts quite defensively in the film, in that sort of classic Megyn Kelly way of saying, hey, that's how it is, snowflake, kind of thing. But at the end of the day, yeah, I want the conflict to reflect a real internal dynamic that's going on in that human being, and in this case it did. She even says, and I don't know if you saw the video of her watching the film, she says, I do feel like I could have done more. So while my expression of it didn't please her, the underlying fact of it I think was appropriate, and she admits that.
And so, yeah, I'm always hoping that those things you arrive at, those internal emotional conflicts, are true and will resonate, because if they are, this is a place where the truth will lead you to something that you can't really make up as effectively or efficiently. And so I always hope that, yes, those portrayals, and it's true of Gretchen as well, reflect an underlying reality.

Is that because you think it's, as you said, more efficient from an artistic perspective, or is it an ethical concern? I mean, I have in my mind the famous, maybe fictional, exchange between Gertrude Stein and Picasso, where she looks at his portrait of her and sends him a note saying, it doesn't look like me, and he sends her back a note saying, it will. I mean, that's the case here. Your depiction of this story is the one that everyone will remember, right? Not just Megyn Kelly, but her children and her grandchildren and the whole world.
She herself will probably come to remember it in this way, because that's the power of a film like this, of a successful, widely viewed film.

Yeah, I take that responsibility very seriously. I do. And it is largely an ethical concern to get it right. You know, there are things you have to cheat. Timeframes get truncated, for sure; dialogue is invented, without doubt; the emotional flavor of individual scenes changes. But I want the struggles that these human beings go through in this world to be as true as possible.

Let me ask you again about the background to that sort of remarkable video where you see Megyn Kelly responding to the film, right, in which she is a character. Very obviously, she did not know what was coming.

Yeah.

She had no say in your script.

None.

Technically, how does that happen? Is it that her story was sufficiently in the public domain that neither you nor the producers had to go to her to get any permission or any right to use her story?
Yeah. I mean, it's not the dominant part of the narrative, right, and the narrative is broader than just her story.

Sure.

So part of it's that. Part of it is she is a public figure who's, you know, influenced our culture in big ways, and our laws are structured to shield anyone who wants to comment on her presence, in a way. So a lot of this is obviously fair use. So technically it's not a problem, as long as you're being as truthful as possible. Also, we're using beats and moments that she's discussed a lot, she wrote a book, and that also have appeared in other media narratives, and appeared prior to her book as well. And, you know, obviously I'm talking to more people than her; I'm interviewing, you know, twenty people at Fox. So some of these perspectives on scenes with her are from people who were in the room with her, as opposed to coming from her directly.
So she's just pretty lucky that, you know, she gets a depiction that, she may not have thought it was sufficiently sympathetic, but it is in certain respects sympathetic to her.

I mean, part of it, no, there's a bigger issue here, which is that film is inherently humanizing. You know, the critique that a film about someone will normalize them, will platform them, is true. It does. It inherently does. The worst person depicted on film will somehow be comprehensible.

And if to understand all isn't maybe to forgive all, it is at least to humanize.

Yeah, exactly. So even someone like Roger gets humanized. And this is also, remember, happening in the shadow of a hyperpartisan culture, where so many of our narratives are driven by media which have been Twitterized to the degree that we already have a strong emotional reaction to these human beings we think we know. And so contrasting that is not difficult.
Let's be clear, right: because there's been such vilification of parties on both sides of the aisle, any kind of normal human discourse that's short of odious often feels very humanizing, you know. Strangely enough.

That's assuming that it's short of odious, right. To go back to your ethical point, that also must be partly your choice, to humanize, because that makes it greater art.

Yeah, I'm a big believer that in this particular moment especially, the art complicates. So part of it is that, in the shadow of partisanship, any kind of portrayal of a real human being feels grounded and humanizing in a way.

And it seems particularly significant because that human is Roger Ailes, who is held responsible by so many people, rightly or wrongly, for our culture of vilification.

Yeah, exactly. That's the irony.

I mean, that really is. That's irony in bright red capital letters.

Absolutely right. Yeah, so we are hopefully extending to Mr. Ailes something he didn't extend to the rest of the world, right. And that's okay.
Because, you know, I'm fully aware of the fact that I'm being empowered by a massive amount of money and a massive distribution network that will distribute my vision of these human beings around the planet, and certainly that comes with an obligation to be as judicious and human as possible.

One thing about humanizing a sexual harasser, though, which I think is actually overlooked in the public discourse around this, is, of course we're all eager to condemn everyone who engages in sexual harassment as evil, right. But if you think that it's only the most extreme, evil people who engage in sexual harassment, then it's really hard to explain why there's so much of it out there in the world. And explaining most of it means realizing that lots of people who don't think they're such terrible people, and who might in other domains of their lives not be so terrible, could still do this.

Absolutely. So the guy twirling his mustache, you know, is not the one who's going to create the most emotional damage in a woman's life.
It's the person she considers a friend, who on the business trip suddenly, you know, says, hey, let's go to the hotel bar, and then goes on from there. Or the man she considers a mentor. You know, those are the situations where you get in the most trouble. So with Roger, part of it was creating a portrait of him that allowed us to sympathize with him to the degree that those scenes could have real power.

You know, we've reached the point of the conversation where two white guys are now talking about sexual harassment. We can hear the little voice in our ears saying, what's wrong with this picture? And it won't be a sufficient answer to say, well, white guys do most of the sexual harassment, so why shouldn't we be the ones to speak about it?
In this contemporary era, where we're all very focused and concerned, appropriately, on agency, it seems like a violation of agency for the two of us to sit here and talk about sexual harassment from the standpoint, as it were, of the harasser, right, when there are human beings, women, being harassed who want to have the agency to speak about this topic. So I just want to ask you for your thoughts, starting with the critique that simply says, you know, why you?

Yeah. Well, I wrote the script before the Me Too movement began in its most popular form, so prior to Harvey Weinstein. I'd actually finished about three, well, two months before the Harvey case broke.

That was fast work.

Yeah, yeah. So I wrote it, you know, just after the election, over that spring. So, in a way, there was no one telling that story at the time. Had Harvey's situation occurred, I probably would not have written it, probably would not have sold it.
233 00:12:38,676 --> 00:12:40,356 Speaker 1: I'd have probably thought, oh, this issue is getting enough 234 00:12:40,396 --> 00:12:42,636 Speaker 1: traction that I don't have to So part of this 235 00:12:42,716 --> 00:12:44,596 Speaker 1: is just, you know, this feeling that no one was 236 00:12:44,636 --> 00:12:46,676 Speaker 1: saying telling the story and it needed to be told. 237 00:12:46,996 --> 00:12:49,156 Speaker 1: And that's, by the way, something that the public just 238 00:12:49,276 --> 00:12:51,636 Speaker 1: isn't that aware of. They see a film coming out 239 00:12:51,716 --> 00:12:54,756 Speaker 1: now a couple of years after Me Too, and they think, oh, 240 00:12:54,876 --> 00:12:56,796 Speaker 1: this was a me too movie, not realizing it just 241 00:12:56,836 --> 00:12:59,116 Speaker 1: takes a long time to put together a film like 242 00:12:59,156 --> 00:13:02,556 Speaker 1: that take forever. I mean, contracts alone take eight months, 243 00:13:02,596 --> 00:13:04,316 Speaker 1: you know, plus legal vetting on a film like this, 244 00:13:04,356 --> 00:13:06,316 Speaker 1: it just takes forever. So part of your answer is 245 00:13:06,676 --> 00:13:08,916 Speaker 1: when you started doing it, there wasn't in Me Too. Yeah, 246 00:13:08,916 --> 00:13:10,716 Speaker 1: But but but but that's you know, that's a cop 247 00:13:10,716 --> 00:13:12,676 Speaker 1: out of us. I think there's a you know, let's 248 00:13:12,716 --> 00:13:15,396 Speaker 1: let's address the real problem is should men be telling 249 00:13:15,396 --> 00:13:20,196 Speaker 1: these stories? And I go back and forth myself. Right, Partly, yes, there, 250 00:13:20,196 --> 00:13:23,196 Speaker 1: you know, I'm also writing men in these situations and 251 00:13:23,276 --> 00:13:27,156 Speaker 1: getting that right, getting that accurately matters for helping us 252 00:13:27,236 --> 00:13:32,956 Speaker 1: understand this. 
And certainly the other thing is, my greatest desire is to put men in those rooms, like the situation in the film between Margot Robbie's character Kayla and Roger Ailes, because if I can put men in those rooms, they can see how these situations are extremely complicated and can be utterly life-changing. I have found in my own life a tendency, when these narratives pop up, to be a little dismissive in an almost instinctual way. It's almost as though, you know, the minute I hear one of these stories, I have this Darwinian thing to say, yeah, but you know, she went into that office. I immediately have that thing that comes up, that voice in my head, that masculine protectiveness, that sort of gender defensiveness. And so I wanted to interrogate that part of myself. You know, as I say, I wanted to drag my own prejudices through, you know, a gauntlet of real lives and sort of examine them.
And so I guess I hope that in showing those situations to men around the world, I am doing the greater good, and that I am willing to accept the critique that there probably is someone who could have told this story better, who is a woman, but, you know, she had not yet emerged, and so I did what I could.

So it's interesting. There are sort of two strands. One is the educative part.

Yeah.

And strand two, I thought, was the internal part, the part where every artist is to some extent grappling with his or her own internal voices and internal demons. And I think what you say is true, surely, of the great majority of men, and very possibly of just about all men: that they will instinctively, even if their consciousness has been raised, identify with other men, or want to deny the depth and extremity of, you know, what reality tells us.
I mean, my version of that would be, just time after time during the Me Too process, saying, that guy? That guy? Now that guy? You know, and just thinking to myself, at a certain point I realized, no, you just don't know what men are capable of. You know, maybe your own superego has developed in a certain kind of way, a certain kind of guilt consciousness, but maybe not everybody's has, you know. But the point is just a recognition that I share that same kind of gender-protective instinct.

I think we all have it, right. I think, though, that the left critique from a strong feminist would go after both of these things by saying, well, education is great, but couldn't you have picked another story to do it? Or, you know, why does this have to be men educating men, using the story of women as their instrument? And then on the personal point, they would say, well...
They would accept that art always involves some self-exploration, right, but they want to see the film that would emerge if that self-exploration were from the women's perspective rather than the men's. And then they would point to the industry and say, well, maybe no one had emerged, but maybe that's because women were denied the opportunity, you know, for their voices to emerge.

And they would be right on both counts, in my book, right. Yeah. And I would like to think that our film is moving the ball down the field, right, and that someone else can come along and say, okay, here's this film, it was made for thirty, it's going to make, you know, fifty, sixty million worldwide. So there's a door open there; we can prove there's an audience for these stories. And so we are, you know, delicately laying the foundation for, you know, the quintessential Me Too movie to come along, you know, three, four, five, six years from now, totally done by women.
Right. And when someone 321 00:17:14,356 --> 00:17:18,276 Speaker 1: writes the Harvey Weinstein biopic, it will be a woman. Yeah. 322 00:17:18,316 --> 00:17:19,916 Speaker 1: And I think the two people 323 00:17:19,916 --> 00:17:22,116 Speaker 1: that are hired for that picture, I think there are two 324 00:17:22,156 --> 00:17:24,716 Speaker 1: projects, are both women too. You know, that makes sense. 325 00:17:24,996 --> 00:17:27,516 Speaker 1: I was speaking abstractly, but in real life, I 326 00:17:27,516 --> 00:17:29,796 Speaker 1: guess that's already happening. Yeah, Plan B's got one, and 327 00:17:29,796 --> 00:17:32,476 Speaker 1: I think there's another one out there too, right. And 328 00:17:32,476 --> 00:17:34,756 Speaker 1: there's also a film, an indie film, coming out about 329 00:17:34,996 --> 00:17:38,036 Speaker 1: an assistant in Harvey's life. So there are other things coming, 330 00:17:38,116 --> 00:17:40,476 Speaker 1: for sure, you know. But it's part of 331 00:17:40,476 --> 00:17:43,156 Speaker 1: a, you know, it's part of this broader conversation about 332 00:17:43,236 --> 00:17:46,956 Speaker 1: how we are going to reach some version of parity 333 00:17:47,636 --> 00:17:51,036 Speaker 1: behind the camera, and how we can do that as ruthlessly, 334 00:17:51,036 --> 00:17:54,356 Speaker 1: efficiently, and quickly as possible. You see that playing out, 335 00:17:54,436 --> 00:17:56,956 Speaker 1: you know, in the Oscar conversation right now. 336 00:17:56,996 --> 00:18:01,156 Speaker 1: But, you know, I'm more than willing to take whatever 337 00:18:01,196 --> 00:18:05,676 Speaker 1: criticism comes by virtue of just wanting to 338 00:18:05,716 --> 00:18:09,516 Speaker 1: do whatever we can to help the situation.
Were there 339 00:18:09,516 --> 00:18:13,356 Speaker 1: other critiques beyond the gender identity one that struck you 340 00:18:13,436 --> 00:18:16,796 Speaker 1: as meaningful ones? Well, what we get a 341 00:18:16,796 --> 00:18:20,916 Speaker 1: lot of is frustration that the film does not take 342 00:18:21,076 --> 00:18:25,676 Speaker 1: Megyn to task more for her complicity in the broader culture, 343 00:18:26,076 --> 00:18:29,076 Speaker 1: I mean at Fox, right. And that's a little difficult because 344 00:18:29,116 --> 00:18:31,116 Speaker 1: what I wanted to avoid was making it sound as 345 00:18:31,116 --> 00:18:36,836 Speaker 1: though right-wing ideology necessarily creates a context for harassment, 346 00:18:36,876 --> 00:18:38,836 Speaker 1: because we know that's not true. This is happening in 347 00:18:38,876 --> 00:18:42,356 Speaker 1: every office around the country, right. I think, really, at 348 00:18:42,356 --> 00:18:43,996 Speaker 1: the end of the day, a lot of people wanted 349 00:18:44,036 --> 00:18:47,316 Speaker 1: the race issue to be addressed, and I struggled with this. 350 00:18:48,116 --> 00:18:51,396 Speaker 1: You know, I had early drafts where, you know, I 351 00:18:51,476 --> 00:18:53,396 Speaker 1: tried to address it. There's a, you know, sort of 352 00:18:53,396 --> 00:18:56,676 Speaker 1: famous bookkeeping office where things got pretty dark over 353 00:18:56,716 --> 00:18:59,556 Speaker 1: at Fox, and there were a couple of lawsuits over 354 00:18:59,676 --> 00:19:02,596 Speaker 1: race-based harassment. I tried to have Roger walk into 355 00:19:02,596 --> 00:19:04,596 Speaker 1: one of them at one point, you know, all this, 356 00:19:04,716 --> 00:19:06,556 Speaker 1: and it just never worked. It felt like we 357 00:19:06,556 --> 00:19:10,956 Speaker 1: were undermining the issue of harassment itself, until I 358 00:19:10,956 --> 00:19:13,316 Speaker 1: came up with this thing I was convinced was genius.
359 00:19:13,356 --> 00:19:15,036 Speaker 1: And here's how I was going to solve this problem. 360 00:19:16,996 --> 00:19:19,396 Speaker 1: Four, five, six times in the film, an African 361 00:19:19,436 --> 00:19:23,836 Speaker 1: American individual would wander into frame, and they would see the camera, 362 00:19:23,876 --> 00:19:26,916 Speaker 1: they would have a sort of privileged gaze of the sort 363 00:19:26,916 --> 00:19:30,036 Speaker 1: of context of production, so to speak, and they'd go, 364 00:19:30,076 --> 00:19:32,996 Speaker 1: oh, sorry, and back out, right. And so the idea 365 00:19:33,076 --> 00:19:36,756 Speaker 1: was that anytime an African American person was introduced into 366 00:19:36,796 --> 00:19:39,116 Speaker 1: the frame, they would go, oh, oh, sorry, and just leave 367 00:19:39,156 --> 00:19:42,716 Speaker 1: the frame, right, to suggest that this is, subtly and 368 00:19:42,796 --> 00:19:48,396 Speaker 1: comedically, or not very subtly, comedically, an environment absolutely hostile 369 00:19:48,476 --> 00:19:52,796 Speaker 1: to that perspective. Now, in saying that, you already know 370 00:19:52,876 --> 00:19:54,956 Speaker 1: what I discovered in the editing room with Jay, 371 00:19:55,036 --> 00:19:58,556 Speaker 1: which is that this was a very bad idea. It made it 372 00:19:58,636 --> 00:20:01,076 Speaker 1: look like we were saying that the movie was a 373 00:20:01,116 --> 00:20:03,516 Speaker 1: white people's movie rather than that Fox News was a 374 00:20:03,516 --> 00:20:06,316 Speaker 1: white people's environment. It was hard to read the intent, 375 00:20:06,796 --> 00:20:09,276 Speaker 1: but the fact you're asking the audience to read the 376 00:20:09,316 --> 00:20:12,676 Speaker 1: intent in the moment already suggests problems, right, you know.
377 00:20:13,036 --> 00:20:14,636 Speaker 1: And it was one of those things that was either 378 00:20:14,716 --> 00:20:18,116 Speaker 1: confusing or it just felt like it was the 379 00:20:18,196 --> 00:20:21,476 Speaker 1: disease it purported to cure, you know. Yes. And so 380 00:20:21,476 --> 00:20:24,516 Speaker 1: at the end of the day, both the structural 381 00:20:24,516 --> 00:20:26,236 Speaker 1: things I tried to do to address race at 382 00:20:26,236 --> 00:20:28,756 Speaker 1: Fox and the playful things I tried to do just 383 00:20:28,836 --> 00:20:32,036 Speaker 1: didn't work, right. So, you know, you do what 384 00:20:32,076 --> 00:20:34,156 Speaker 1: you can. Thankfully, it's a, you know, it's a medium 385 00:20:34,196 --> 00:20:36,436 Speaker 1: that's slow, but we have the ability to 386 00:20:36,476 --> 00:20:38,836 Speaker 1: correct our mistakes. And we just took that out, and 387 00:20:39,036 --> 00:20:43,276 Speaker 1: you know, now you can, as critics do, complain 388 00:20:43,316 --> 00:20:46,796 Speaker 1: about it not addressing the race issue. So you mentioned 389 00:20:46,836 --> 00:20:49,516 Speaker 1: the Oscar context, right, and I think that's a great 390 00:20:49,716 --> 00:20:56,156 Speaker 1: topic to talk about here. Is there a revenge 391 00:20:56,356 --> 00:21:01,236 Speaker 1: or reaction or blowback moment coming from the men in 392 00:21:01,516 --> 00:21:04,236 Speaker 1: the Oscar nomination process? I think you're seeing that now, right. 393 00:21:04,276 --> 00:21:07,836 Speaker 1: I think this season is very male driven, with 394 00:21:07,876 --> 00:21:10,436 Speaker 1: a lot of films addressing sort of male identity 395 00:21:10,476 --> 00:21:14,236 Speaker 1: and revisiting male identity.
And part of that, I think, 396 00:21:14,516 --> 00:21:18,316 Speaker 1: is the fact that a certain core group of filmmakers 397 00:21:18,316 --> 00:21:23,036 Speaker 1: who obviously make movies about strongly male issues are making 398 00:21:23,076 --> 00:21:25,316 Speaker 1: their films. But yeah, I think we're in 399 00:21:25,396 --> 00:21:27,876 Speaker 1: a little bit of a broader cultural blowback against Me Too, 400 00:21:27,916 --> 00:21:31,156 Speaker 1: and I think the film season this year reflects that 401 00:21:31,276 --> 00:21:33,356 Speaker 1: in some ways. Yeah, I want to talk about 402 00:21:33,396 --> 00:21:37,236 Speaker 1: some concrete cases, like why didn't Greta Gerwig get nominated 403 00:21:37,316 --> 00:21:40,436 Speaker 1: for Best Director for Little Women? I mean, I thought 404 00:21:40,916 --> 00:21:45,036 Speaker 1: that Little Women was a terrific movie. But more importantly, 405 00:21:45,076 --> 00:21:49,396 Speaker 1: I perceived, maybe wrongly, that what made it distinctive and 406 00:21:49,436 --> 00:21:51,796 Speaker 1: different from all the other Little Womens that are out 407 00:21:51,796 --> 00:21:55,276 Speaker 1: there was the kind of direct hand of the director, 408 00:21:55,356 --> 00:21:57,556 Speaker 1: you know, the sense that the director had a vision, 409 00:21:57,676 --> 00:22:00,436 Speaker 1: was going a certain way. The presentation was distinctive, the 410 00:22:00,516 --> 00:22:03,996 Speaker 1: outcomes were distinctive, and I was genuinely surprised, in my 411 00:22:04,036 --> 00:22:07,836 Speaker 1: ignorance, that it didn't yield a Best Director nomination. 412 00:22:08,396 --> 00:22:10,396 Speaker 1: Explain to me what I'm missing, because I 413 00:22:10,436 --> 00:22:13,396 Speaker 1: think I am missing something. Yeah, several things.
Well, those of 414 00:22:13,596 --> 00:22:16,396 Speaker 1: us in the industry were not as surprised, 415 00:22:16,436 --> 00:22:19,356 Speaker 1: because it had not won any of the precursor awards 416 00:22:19,396 --> 00:22:23,076 Speaker 1: that you normally get. I think the broader, most interesting 417 00:22:23,076 --> 00:22:26,116 Speaker 1: thing for me to say about this stuff is, we 418 00:22:26,156 --> 00:22:31,156 Speaker 1: in the Academy are attempting to reach gender parity 419 00:22:31,196 --> 00:22:36,596 Speaker 1: and, to some degree, inclusion of ethnicity, but 420 00:22:36,756 --> 00:22:41,996 Speaker 1: that is coming with greater internationalization. Explain that. That 421 00:22:42,116 --> 00:22:45,116 Speaker 1: sounds really interesting. So, for example, in the directors branch, 422 00:22:45,276 --> 00:22:47,756 Speaker 1: the people we are inviting, the women we are inviting 423 00:22:47,796 --> 00:22:51,996 Speaker 1: in to join the Academy, exactly, tend to be international. 424 00:22:52,916 --> 00:22:55,516 Speaker 1: I think with a film like Greta's, what ended up 425 00:22:55,556 --> 00:23:01,316 Speaker 1: happening is that international cohort probably did not respond to 426 00:23:01,396 --> 00:23:04,316 Speaker 1: the film as powerfully as you did, because it is, 427 00:23:04,396 --> 00:23:06,996 Speaker 1: it's a very American story, sure, you know, and 428 00:23:07,076 --> 00:23:09,956 Speaker 1: it has classic Americana as well, and it has a 429 00:23:10,076 --> 00:23:12,516 Speaker 1: kind of Christmas-morning cheeriness to a lot of the 430 00:23:12,556 --> 00:23:16,236 Speaker 1: scenes that, I think, for some, 431 00:23:16,516 --> 00:23:19,796 Speaker 1: you know, for some directors outside of this country, seems 432 00:23:19,836 --> 00:23:22,836 Speaker 1: just very American, you know.
So there's that, and that's 433 00:23:22,836 --> 00:23:25,956 Speaker 1: the situation in which I think that internationalization maybe worked 434 00:23:25,996 --> 00:23:28,756 Speaker 1: against a film, but I think in terms of the 435 00:23:28,796 --> 00:23:33,196 Speaker 1: performances it helped. So SAG, for example, did not nominate 436 00:23:33,236 --> 00:23:35,836 Speaker 1: any of them. The Screen Actors Guild, for their awards, did 437 00:23:35,916 --> 00:23:39,796 Speaker 1: not nominate any of the women, uh, for ensemble, 438 00:23:39,836 --> 00:23:42,836 Speaker 1: and I don't think for any of the individual awards either. However, 439 00:23:43,836 --> 00:23:48,236 Speaker 1: the acting branch of the Academy did, so that group 440 00:23:48,636 --> 00:23:51,836 Speaker 1: felt those performances were worthy of recognition in a way 441 00:23:51,876 --> 00:23:54,916 Speaker 1: that the more exclusively American group did not, which, you know, that's 442 00:23:54,956 --> 00:23:56,836 Speaker 1: what SAG does: they have a committee of people, 443 00:23:56,996 --> 00:23:59,916 Speaker 1: sort of, I think it's by lottery, that nominates. 444 00:23:59,956 --> 00:24:01,596 Speaker 1: So, you know, there are going to be good 445 00:24:01,596 --> 00:24:03,516 Speaker 1: and bad things that come with that. On the whole, 446 00:24:03,516 --> 00:24:06,276 Speaker 1: of course, it's great that we're becoming an international academy. 447 00:24:06,716 --> 00:24:12,596 Speaker 1: What aspect of your experience, both in writing your film, 448 00:24:12,636 --> 00:24:17,276 Speaker 1: writing Bombshell, and also in pitching it to the world 449 00:24:17,356 --> 00:24:20,436 Speaker 1: and talking about it, do you wish someone had asked 450 00:24:20,436 --> 00:24:22,996 Speaker 1: you to speak about?
That no one has asked you 451 00:24:23,036 --> 00:24:25,396 Speaker 1: about, if any? Maybe every question has been asked and 452 00:24:25,396 --> 00:24:26,996 Speaker 1: you've said everything you want to say, but is there 453 00:24:27,076 --> 00:24:29,556 Speaker 1: something that hasn't been asked, or that you would like 454 00:24:29,716 --> 00:24:33,876 Speaker 1: to say about it? I mean, I think the thing 455 00:24:33,876 --> 00:24:40,276 Speaker 1: that's gone undertheorized is how we are going to 456 00:24:40,596 --> 00:24:45,076 Speaker 1: deal with the fact that men, particularly older men, but 457 00:24:45,556 --> 00:24:49,556 Speaker 1: men in the Academy and in the filmmaking business, have 458 00:24:50,396 --> 00:24:55,316 Speaker 1: risen on the power of believing in their taste, believing 459 00:24:55,356 --> 00:24:58,916 Speaker 1: that their gut instinct matters. And you see this in 460 00:24:59,276 --> 00:25:02,036 Speaker 1: Stephen King's recent tweet about the fact that he doesn't 461 00:25:02,396 --> 00:25:07,316 Speaker 1: consider diversity when making artistic evaluations and judgments about excellence, right, 462 00:25:07,516 --> 00:25:09,036 Speaker 1: and he got beaten up for it. And he is, 463 00:25:09,156 --> 00:25:12,116 Speaker 1: in a sense, the ultimate model of a person who's, 464 00:25:12,236 --> 00:25:15,276 Speaker 1: whatever his instincts are, somebody's buying them, right. I saw 465 00:25:15,356 --> 00:25:18,156 Speaker 1: recently a list of American authors and how many books 466 00:25:18,196 --> 00:25:21,076 Speaker 1: are purchased, might have been worldwide authors, and he was 467 00:25:21,156 --> 00:25:25,596 Speaker 1: just towering over people who sell millions of copies, towered 468 00:25:25,596 --> 00:25:27,636 Speaker 1: over all of them. Anyway. So go on.
So he's 469 00:25:27,676 --> 00:25:30,556 Speaker 1: got that well-honed instinct, which he privileges 470 00:25:30,636 --> 00:25:34,076 Speaker 1: as part of his core, part of his identity, right. Now 471 00:25:34,116 --> 00:25:37,876 Speaker 1: we want to come along and say, unfortunately, that instinct, 472 00:25:38,516 --> 00:25:40,716 Speaker 1: it's not misogynistic, I think it's wrong when 473 00:25:40,756 --> 00:25:43,436 Speaker 1: we use that phrase. It's more, you know, androcentric, I 474 00:25:43,476 --> 00:25:45,156 Speaker 1: suppose, is the word I'm looking for, sort of 475 00:25:45,236 --> 00:25:49,156 Speaker 1: male-centered. But we have to get people like that 476 00:25:50,116 --> 00:25:55,396 Speaker 1: to engage with the idea that diversity doesn't necessarily mean 477 00:25:55,516 --> 00:26:00,396 Speaker 1: throwing out, you know, that gut, but rather just doing 478 00:26:00,436 --> 00:26:03,276 Speaker 1: a series of checks. And we've got to find a 479 00:26:03,316 --> 00:26:07,236 Speaker 1: way to just somehow introduce this into the 480 00:26:07,236 --> 00:26:10,436 Speaker 1: discussion around diversity without this sort of naming and shaming, 481 00:26:10,956 --> 00:26:14,916 Speaker 1: and really get people to think through, okay, here's a 482 00:26:14,956 --> 00:26:20,836 Speaker 1: film like Little Women. Yes, that beautiful scene where 483 00:26:20,956 --> 00:26:24,636 Speaker 1: someone talks about, you know, I am so sick of 484 00:26:24,796 --> 00:26:26,956 Speaker 1: women being defined by who they love, yet I 485 00:26:26,956 --> 00:26:31,116 Speaker 1: want love. It's like a beautiful, beautiful moment, right.
That's 486 00:26:31,116 --> 00:26:33,116 Speaker 1: not going to resonate with a lot of guys, but 487 00:26:33,236 --> 00:26:35,036 Speaker 1: we have to get them in a position where they 488 00:26:35,076 --> 00:26:38,716 Speaker 1: can see its power, right. And I just feel like 489 00:26:38,876 --> 00:26:42,156 Speaker 1: we do not have a good way to do that yet. 490 00:26:42,636 --> 00:26:43,916 Speaker 1: One of the things I think we could do which 491 00:26:43,956 --> 00:26:47,756 Speaker 1: would help is we could expand the Best Director category 492 00:26:48,596 --> 00:26:51,596 Speaker 1: from five films to ten films, and the way that 493 00:26:51,716 --> 00:26:54,196 Speaker 1: ranked voting works, that would inherently be more inclusive. And 494 00:26:54,236 --> 00:26:56,836 Speaker 1: since that's already been done in the past for other categories, 495 00:26:56,996 --> 00:26:58,756 Speaker 1: it was done for Best Picture, it seems 496 00:26:58,796 --> 00:27:00,916 Speaker 1: like a no-brainer. Yeah. What's so interesting about the 497 00:27:00,956 --> 00:27:04,916 Speaker 1: example of the awards is that the awards are symbolic, 498 00:27:04,956 --> 00:27:07,356 Speaker 1: and if they were only symbolic, we wouldn't spend so 499 00:27:07,436 --> 00:27:09,476 Speaker 1: much time worrying about them. But as I understand it, 500 00:27:09,596 --> 00:27:16,036 Speaker 1: in your industry, winning those awards translates into people sitting 501 00:27:16,036 --> 00:27:18,436 Speaker 1: in the seats in the movie theater, which therefore translates 502 00:27:18,436 --> 00:27:21,556 Speaker 1: into money. So winning the awards is actually a crucial 503 00:27:21,676 --> 00:27:26,276 Speaker 1: component of how power gets deployed, and in that sense 504 00:27:26,636 --> 00:27:29,756 Speaker 1: it's super different from the way awards work in many 505 00:27:29,796 --> 00:27:32,796 Speaker 1: other domains of life. You're absolutely right.
Yeah, you're absolutely right. 506 00:27:32,836 --> 00:27:35,036 Speaker 1: And to answer your question, I feel like 507 00:27:35,076 --> 00:27:38,316 Speaker 1: the one conversation I'm not hearing as I've 508 00:27:38,316 --> 00:27:41,516 Speaker 1: done the press for this film is a more honest 509 00:27:41,556 --> 00:27:46,516 Speaker 1: conversation about gender politics and how gender politics influence our 510 00:27:46,596 --> 00:27:49,596 Speaker 1: choices and how we can sort of arrive at solutions. 511 00:27:49,676 --> 00:27:51,516 Speaker 1: I feel like what I'm getting a lot of is 512 00:27:51,716 --> 00:27:54,436 Speaker 1: people wanting to evoke the problem, wanting to say this, 513 00:27:54,516 --> 00:27:56,796 Speaker 1: but no one really wanting to kind of go through, 514 00:27:56,876 --> 00:27:59,156 Speaker 1: not no one, but not enough people wanting to 515 00:27:59,236 --> 00:28:03,116 Speaker 1: go through and have a candid conversation about what's 516 00:28:03,116 --> 00:28:05,716 Speaker 1: possible and what's not possible. Well, thank you for having 517 00:28:05,716 --> 00:28:09,196 Speaker 1: that candid conversation with me. I'm really grateful to you 518 00:28:09,316 --> 00:28:11,316 Speaker 1: for bringing this perspective. Thank you so much. Thank you, 519 00:28:11,396 --> 00:28:16,476 Speaker 1: thank you for having me. Deep Background is brought to 520 00:28:16,516 --> 00:28:19,876 Speaker 1: you by Pushkin Industries. Our producer is Lydia Jean Kott, 521 00:28:20,236 --> 00:28:23,396 Speaker 1: with studio recording by Joseph Friedman and mastering by Jason 522 00:28:23,436 --> 00:28:27,996 Speaker 1: Gambrell and Jason Roskowski. Our showrunner is Sophie McKibben. Our 523 00:28:28,036 --> 00:28:31,236 Speaker 1: theme music is composed by Luis Guerra. Special thanks to 524 00:28:31,236 --> 00:28:34,916 Speaker 1: the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel.
525 00:28:35,476 --> 00:28:38,716 Speaker 1: I'm Noah Feldman. You can follow me on Twitter at 526 00:28:38,796 --> 00:28:41,716 Speaker 1: Noah R. Feldman. This is Deep Background