Danielle Moody: Good morning, peeps, and welcome to Woke AF Daily with me, your girl Danielle Moody, once again recording live from the Brooklyn Bunker. You know, it's been probably a little over a week, maybe two, since Dave Chappelle's stand-up special on Netflix, The Closer, received considerable backlash. It contained what even the media are referring to as jokes at the expense of transgender people. But if you listen to Democracy-ish, you know that my co-host Touré and I discussed this in a bit of detail last week. And I had the opportunity to bring on one of the former trans employees at Netflix who recently made headlines because the media company decided to, yes, fire them. B. Pagels-Minor will join us in a bit to talk about their experience at Netflix as a Black trans person, and to talk about, just in general, their experience in the world of tech. Folks, I am so excited to welcome to Woke AF Daily for the first time B. Pagels-Minor, whose name you may have heard and know in affiliation with Netflix's current scandal.
With regard to the fact that there has been considerable pushback on Dave Chappelle's special The Closer, you have found yourself at somewhat of the center of that issue. But before we jump into where things are now, I want to talk about your experience at Netflix as a Black trans person, and what kind of environment and what kind of culture has been present at Netflix that brings us into this present moment.

B. Pagels-Minor: Yes, you know, it's actually really interesting, and I think I've said this a lot: I actually chose Netflix because I thought it was going to be one of the best possible experiences for me as a Black trans person. So before I decided to join Netflix, there were a few things on my mind. Obviously, you know, I've worked in tech for a very long time, so I wanted to make sure I was compensated well. But, like, you know, before I even started, I knew I wanted to have a child, and I knew I was going to have gender-confirming surgery, and so Netflix checked the boxes from a benefits perspective. And then the manager who actually recruited me, one of the things I liked best about my experience with him was that he asked me, for instance, at our very first meeting, you know, what are your pronouns? And I was like, my pronouns are they/them. And he also happened to invite me there on the same day as a Black@ event, so he took me down to get a plate of food, and I'm gonna tell you, that food was real good. You know, Black folks know how to eat. And I was just like, this is a very seamless recruiting experience. And then, obviously, just because I do my due diligence, I also asked to be introduced to people who had had my role, as well as other Black@ members and Trans* members. So I did talk to a number of people before I started, to figure out if it was going to be a very safe place for me. And I do think that, overall, my experience at Netflix was one of the best experiences I've ever had professionally.
You know, I felt much safer at Netflix than I typically felt anywhere else in my tech career, and I also hope that Netflix gets back to that experience.

Danielle Moody: How long were you at Netflix? So, from the moment of the good plate of food at the ERG to your, I guess, expulsion, firing, being let go, how long were you there?

B. Pagels-Minor: So I started in March 2020. I was actually part of the first COVID cohort, so I was in the office one day and the next day we were closed. But yes, I started in March 2020 and my last day was October 14, so approximately one year and seven months.

Danielle Moody: So you chose Netflix, like you're saying, because you felt safe, because you felt seen, right? And, you know, can you talk a bit too, because I'm an advisor at an organization called Lesbians Who Tech and Allies, and one of the things that is spoken about often is how little diversity there still is in tech, how difficult it is for people from the LGBTQ community, and for Black and brown folks, to navigate a very bro-ish culture, a culture that is not affirming in any way. And so the reason behind you choosing Netflix was because of their openness and just your experience overall. What was your experience overall in tech proper before you went to Netflix?

B. Pagels-Minor: Yeah, first of all, shout out to Lesbians Who Tech. That's actually one of the first places I ever spoke at, and so it was a very foundational experience for me. But so this is the thing that's interesting. One of the difficulties of working in tech is that I said it was safer than any other company I had worked at. And there's a nuance there, because just because it's safer doesn't mean that it's actually where it's supposed to be, if you actually want a place that supports LGBTQ+ people, Black people, Latinx people, Asian people.
Also, from an international perspective, the company was rapidly expanding. And so if you think about it, the UCAN culture, and UCAN stands for the United States and Canada, is drastically different from what's going on in LATAM, drastically different from what's going on in EMEA, drastically different from what's going on in APAC. And I was uniquely situated to understand that complexity, because I was the elected leader of Black@, which is one of the largest ERGs at Netflix, and I was also elected to be the co-lead for Trans*, which is one of the smaller ERGs but a very powerful voice. And so what would happen is that the people who were part of these ERGs would come to me constantly about their experiences. And so I would hear everything from, oh, this is the most heartwarming, best story ever, like, your manager is so great, to, holy crap, what is going on over there? Like, why is this happening here? And so that's part of this conversation too. You know, everyone seems to think that this is about the special, but it's not about the special. It's about, externally, what kind of content is being put out there, and how can we make sure that content represents diverse lives, and specifically for Trans*, trans lives. And if you were to ask me, from the Black perspective, also Black lives, right, especially dark-skinned Black people's lives. Every single employee who was a diverse person, I bet that if you actually asked them, they would say, racially, we're not really hitting the mark on some of our content, based on my experiences. So that's one part of it. And the second part of it is, from an internal perspective, it's like, oh, you know, Netflix is safer and better than pretty much every other company we've worked at. However, there are the Netflix culture values and concepts, like freedom and responsibility. Freedom and responsibility essentially says that you're supposed to do what's best for Netflix. Like, that's one hundred percent what it is. Even if it's not the beaten-path type of thing that someone else may do, you have the freedom and the responsibility to go out and do that thing, because it's the best thing for Netflix. However, one of the things I noticed in my capacity as ERG lead is that that concept was misapplied, or applied differently, depending on certain types of people. And in fact, one of the big arguments that I'd had with many people, that I'd brought up many times, is that it needs to be taken from freedom and responsibility to freedom and accountability. Right? You know, if one colleague does this and they're a white cisgender man, and another colleague does this and that, and she's a Black cisgender woman, and she gets fired because someone says that she used bad judgment, even though she was working within the freedom and responsibility spectrum, and he does not, then there's a problem with the company.
And so ultimately it's this internal battle, saying, if these cultural values are true, they need to be applied the same way to every single person who works at Netflix.

Danielle Moody: And that makes sense, you know. And that makes sense because I think that across industries there is a reckoning that is happening right now. I believe, and I continue to have this conversation with different folks, that there is a workers' revolution happening right now. There are so many different strikes, different walkouts being experienced right now, that large corporations are trying to figure out, well, what do we do? And, you know, what do we do? Do we treat people with dignity and respect? What do we do? Do we pay people what they're worth? Like, there's a reason why this is collectively happening as we're still grappling with a pandemic that cost people a lot, not only their lives but also their livelihoods, and people having the time, right, whether privileged time or not, to recognize, you know, I'm not treated well, I'm not paid enough for this type of abuse, right? And I think, too, particularly, we look to big tech, right, as the barometer by which different firms and companies should be acting. The open-campus model came across different industries because of big tech. You know, being able to give your employees different types of benefits, right, that would help them feel valued and seen, that came from Silicon Valley, and this approach to, how do we get the best talent, right? And then recognizing, well, when we say best talent, who were we actually talking about, when the diversity is not there across the board? I want to switch gears now to the present moment that Netflix finds itself in.
When you first saw or heard about this Chappelle special, what were your initial reactions?

B. Pagels-Minor: Yeah, so you know, a few days before the special came out, I got an email basically giving me a heads-up saying the special is coming out. And I was just like, oh, is it, like, typical, you know? I was like, you know, I know it's going to be some stuff that people in my community won't like. And also, to be clear, I identify as trans non-binary, but I was also socialized as a Black woman, right? And so I have a lot of empathy, and I get very upset when people disrespect Black women. And so that's another reason, for me, for the special, I was like, is he also going to be talking about women in a derogatory manner, those types of things? And they're like, yeah, kind of. And then the special came out on October 5, and I didn't watch it at the time. But on October 6, that's when we started seeing the news reports, and then, you know, about the TERF stuff, him mentioning TERFs, misgendering people. And so then a bunch of us actually looked at the transcript, because, you know, one of the things about working at a company like Netflix is you don't want to watch it, right, because you're feeding into the algorithm, you're feeding into the number of views. And so we read the transcript, and we were frankly horrified. Right? I mean, he actually starts off with something that's borderline antisemitic, and then, you know, he just kind of continues to go and offend as many different groups as he possibly can, and then spends a large portion of it on transphobic content. And my take was, wow, this is so much worse than I could have ever imagined. Like, you know, if the last special was not good, this special is just like, holy cow, how did we not think strategically about what this might actually do to the internal and external reputation of Netflix?

Danielle Moody: Mm-hmm. You know, there has been so much commentary since the special has aired.
The cover that many people want to provide comedians, because they are positioned, you know, in society as the jester, for, you know, for all intents and purposes, the jester that brings to light and softens the blow, in many ways, of a lot of societal ills, right? Things that we will not talk about, or that are uncomfortable to talk about. If you put a laugh track to it, it makes it easier to digest. What's your response to that argument around Chappelle's special?

B. Pagels-Minor: So, first and foremost, you know, I talked to other people who typically like his specials, and some of the things they said were, this didn't feel as funny, right? And that's why I hesitate to say it myself, because, like, the last time I really liked a lot of Chappelle's stuff was the first two seasons of Chappelle's Show. So, like, I'm obviously not the audience for his specials. But talking to other people, they were like, I really struggled with it. I was just like, can you make some jokes? Like, why are we still talking about, you know, trans people, right?
And these are people who like to laugh with him. And so, for me, I just thought, I was like, how are you missing the point here? And also, as a proud member of the LGBTQ+ community, I could go to a drag show every night and hear way more controversial, inappropriate things that will have me peeing in my seat because it's hilarious. Because we know that comedy can be great, especially when you know what you're talking about. When you know what you're talking about, you can take something so terrible and difficult and uncomfortable and say it in such a way that I will laugh, and at the same time question myself, like, why am I laughing? You know? But that's the thing, and that's what we want, right? Like, we want this comedy to be so amazingly done that we laugh, right? And so, you know, one of the biggest parts of the special that I thought was missing is that, you know, I'm Black and I'm trans. I'm Black and I'm a lesbian. You know, the special seems to have this supposition that the LGBTQ+ community is essentially adjacent to the white supremacist community, so LGBTQ+ means white, and that trans people, and then separately the civil rights movement and Black people, somehow are being hurt completely by the LGBTQ+ movement. And I'm like, well, in that case, then by the transitive property, I am being destroyed by both movements, right? Because at no point does he acknowledge that people like me even exist, right? And I think that's one of the things that, again, when you know what you're talking about, perhaps that could have changed his entire set. Because he would have been like, well, but, you know, I know people like B exist. And because of that, I probably need to think about this more strategically, about the overarching point that I'm trying to make, so that I can actually be funny and well spoken in this special, versus completely erasing this super small group of individuals who are harmed at a much higher rate than virtually every other community in this country.
Danielle Moody: Yeah. You know, as a Black lesbian, I'm watching this special, and there was one point where I was just like, so, do you not understand that people live at multiple intersections of identities? And, you know, I will say that he was valid in asking, can gay people be racist? But in his mind he was really thinking white gays: can white gays be racist? He was making gay synonymous with whiteness, which is largely part of the problem. And what invisibilizes the rest of us, right, is when that assumption is made from the jump. And, you know, what also stood out for me, and I want to ask you this, was this feeling that it was necessary to establish a hierarchy, like the oppression Olympics, right? And to do it, again, without any nuance or understanding of those that live at the intersection of multiple communities. And so I'm like, of course there's racism in the LGBTQ+ community. I have worked in this movement for well over a decade.
I can tell you firsthand stories about, like, where the money comes from, and who gets to decide what is important and who is important, and when money dries up in one area, why that is. And some of the most transphobic people that I've come into contact with are other members of the LGB community; they are some of the most transphobic people. And again, if you have a deep understanding, more so than the shallow puddle from which I believe he was operating, then you would be able to have a lot more, your jokes would be sharper, right?

B. Pagels-Minor: Exactly, exactly. Like, if nothing else, I would like to hang out with Dave Chappelle so that I could educate him, so that he really can make me laugh, right? Like, that's the whole goal. Like, you know, how do we educate people? And that's the whole point, because it was never about taking down the special. It was about creating parity in content, so that people could understand the context for which he was talking.
Like for instance, 310 00:18:44,680 --> 00:18:46,320 Speaker 1: you know, I've spent a lot of time explaining the 311 00:18:46,440 --> 00:18:49,160 Speaker 1: term turf to people, you know, over the past week, 312 00:18:49,560 --> 00:18:52,080 Speaker 1: and they were like what they were like, is this 313 00:18:52,080 --> 00:18:53,600 Speaker 1: the grass? I was like, no, it's not the grass. 314 00:18:54,000 --> 00:18:57,840 Speaker 1: It's not this is like these are trans exclusionary radical 315 00:18:57,880 --> 00:19:01,760 Speaker 1: feminists and what's dangerous about these people, and what's dangerous 316 00:19:01,800 --> 00:19:06,159 Speaker 1: about someone like Dave Chappelle endorsing them is the simple 317 00:19:06,200 --> 00:19:09,359 Speaker 1: fact of one, they don't think trans women or women. 318 00:19:09,880 --> 00:19:12,400 Speaker 1: They think many trans men and trans mont binary people 319 00:19:12,440 --> 00:19:15,520 Speaker 1: are just like confused women who have been harmed or 320 00:19:15,600 --> 00:19:18,800 Speaker 1: hurt in some way and can't make their own adult decisions, 321 00:19:18,840 --> 00:19:21,199 Speaker 1: which is like very strange to me. And then they 322 00:19:21,280 --> 00:19:27,080 Speaker 1: also have they literally tear women down to this concept 323 00:19:27,160 --> 00:19:30,840 Speaker 1: of like menistruation or being able to biologically have a child. 324 00:19:31,200 --> 00:19:34,480 Speaker 1: And we know every single day people who are born 325 00:19:34,560 --> 00:19:38,240 Speaker 1: as women are also particular, there's a lot of women 326 00:19:38,240 --> 00:19:40,760 Speaker 1: who are also born who can never have children. So 327 00:19:40,840 --> 00:19:44,560 Speaker 1: does that mean that any of those women are not women? Right? 
328 00:19:45,040 --> 00:19:47,320 Speaker 1: And so this ideology is not only harmful to the 329 00:19:47,320 --> 00:19:51,720 Speaker 1: trans community, it's harmful to the overall women's movement, right. 330 00:19:51,800 --> 00:19:56,040 Speaker 1: The women's movement is being set back by this unintelligent 331 00:19:56,080 --> 00:19:59,959 Speaker 1: conversation that's happening from this group of people who are 332 00:20:00,119 --> 00:20:03,119 Speaker 1: full of hate. And so that's the type of nuance. 333 00:20:03,160 --> 00:20:06,399 Speaker 1: Like I would love if Netflix, you know, funded, like 334 00:20:06,480 --> 00:20:09,359 Speaker 1: you know, found a documentary that maybe already exists, or 335 00:20:09,600 --> 00:20:13,200 Speaker 1: commissioned a documentary that explains terms. You know, I would love 336 00:20:13,280 --> 00:20:17,520 Speaker 1: to see a Pose-esque show on Netflix. Give me 337 00:20:17,600 --> 00:20:20,160 Speaker 1: six seasons of a show that's like Pose on Netflix, 338 00:20:20,440 --> 00:20:22,159 Speaker 1: and maybe I would just be like, 339 00:20:22,200 --> 00:20:24,359 Speaker 1: you know what, I wouldn't even get mad about the 340 00:20:24,440 --> 00:20:28,119 Speaker 1: Chappelle special, because there's another piece of content 341 00:20:28,680 --> 00:20:30,720 Speaker 1: that I can go to, that I can point people 342 00:20:30,760 --> 00:20:33,919 Speaker 1: to, that shows trans lives in the fullness of trans 343 00:20:34,000 --> 00:20:37,800 Speaker 1: lives right there. And that's the real point of this: 344 00:20:38,119 --> 00:20:40,080 Speaker 1: you cannot show one side of it that 345 00:20:40,200 --> 00:20:44,359 Speaker 1: is wrong, and I cannot say that enough, it's wrong, 346 00:20:44,560 --> 00:20:48,000 Speaker 1: it's inaccurate, it's not the right information, and then not 347 00:20:48,119 --> 00:20:50,399 Speaker 1: also show the competing side.
I mean, we do it 348 00:20:50,440 --> 00:20:53,919 Speaker 1: for presidential elections. You can't. You can't have one person 349 00:20:53,960 --> 00:20:57,520 Speaker 1: from one party, one person who's running for an office, 350 00:20:57,880 --> 00:21:00,679 Speaker 1: on one channel and not give the other person or 351 00:21:00,760 --> 00:21:03,640 Speaker 1: other people the chance to speak their piece. And all 352 00:21:03,680 --> 00:21:06,240 Speaker 1: we're asking is for media companies to invest in that. 353 00:21:07,480 --> 00:21:12,320 Speaker 1: So talk to me about, um, the walkout 354 00:21:12,640 --> 00:21:17,560 Speaker 1: that happened and the important list of demands that 355 00:21:17,600 --> 00:21:22,120 Speaker 1: were put up, um, to Netflix, 356 00:21:22,160 --> 00:21:26,119 Speaker 1: to the president of Netflix. Yeah, so first and foremost, um, 357 00:21:26,320 --> 00:21:28,600 Speaker 1: initially it was not going to be a walkout. So 358 00:21:28,640 --> 00:21:30,639 Speaker 1: initially it was just supposed to be a trans day 359 00:21:30,680 --> 00:21:34,320 Speaker 1: of rest, like an opportunity for, you know, the trans 360 00:21:34,359 --> 00:21:36,879 Speaker 1: people who had been deeply affected by this, because I 361 00:21:36,880 --> 00:21:40,320 Speaker 1: could tell that it was a major emotional, you know, 362 00:21:40,440 --> 00:21:43,760 Speaker 1: toll on people. Um, it was supposed to just be, 363 00:21:44,600 --> 00:21:47,280 Speaker 1: you know, a trans day of rest, and then 364 00:21:47,440 --> 00:21:50,439 Speaker 1: our colleagues who were not trans were supposed to educate 365 00:21:50,480 --> 00:21:55,520 Speaker 1: themselves and look at different ways to support the trans community. 366 00:21:56,040 --> 00:21:58,760 Speaker 1: But the problem is then all of Ted's emails came 367 00:21:58,760 --> 00:22:04,320 Speaker 1: out, and Ted's emails were progressively worse and worse and worse.
368 00:22:04,680 --> 00:22:08,000 Speaker 1: And so then I took a vote of the trans, 369 00:22:08,080 --> 00:22:13,360 Speaker 1: you know, community and asked them, hey, you know, I'm 370 00:22:13,400 --> 00:22:15,680 Speaker 1: starting to think maybe this should be a walkout. What 371 00:22:15,720 --> 00:22:19,560 Speaker 1: do you think? And it was an overwhelming vote saying yes, 372 00:22:20,000 --> 00:22:21,680 Speaker 1: you know, we think that, you know, we would 373 00:22:21,720 --> 00:22:24,960 Speaker 1: like to have this be a walkout. And then so 374 00:22:25,200 --> 00:22:27,919 Speaker 1: on October fourteenth, at about like, you know, three or 375 00:22:27,920 --> 00:22:30,880 Speaker 1: four pm, I posted in the public channel saying, hey, 376 00:22:30,920 --> 00:22:34,320 Speaker 1: FYI, this is now a walkout. This is why, 377 00:22:34,560 --> 00:22:36,320 Speaker 1: you know, we feel like we need to walk out. 378 00:22:37,480 --> 00:22:43,280 Speaker 1: And that's how it became a walkout. So why were 379 00:22:43,320 --> 00:22:45,640 Speaker 1: you fired, or why 380 00:22:45,680 --> 00:22:49,920 Speaker 1: do you believe that Netflix terminated you after a year 381 00:22:49,960 --> 00:22:53,480 Speaker 1: and seven months? So I will say that this is 382 00:22:53,520 --> 00:22:58,000 Speaker 1: what they told me. So October fourteenth, I said, hey, 383 00:22:58,359 --> 00:23:01,399 Speaker 1: let's have a walkout, and then by seven pm that 384 00:23:01,520 --> 00:23:06,159 Speaker 1: day I was terminated for essentially, what they said was, 385 00:23:06,480 --> 00:23:11,240 Speaker 1: the likelihood that I leaked sensitive and confidential information. And 386 00:23:11,440 --> 00:23:14,879 Speaker 1: was this the president's emails? No, this is the 387 00:23:17,080 --> 00:23:19,960 Speaker 1: data, the data in the Bloomberg article.
They were like, 388 00:23:20,000 --> 00:23:23,639 Speaker 1: it's likely that you did, because you accessed this information. 389 00:23:25,440 --> 00:23:29,000 Speaker 1: And my response was, yes, I did access that information 390 00:23:29,520 --> 00:23:32,000 Speaker 1: in support of the initiatives that we were putting together, 391 00:23:32,480 --> 00:23:37,600 Speaker 1: to support our argument for diversifying the content on Netflix. Now, 392 00:23:37,600 --> 00:23:41,040 Speaker 1: what's really interesting is, in that conversation they were like, well, 393 00:23:41,080 --> 00:23:43,399 Speaker 1: but we also saw that you forwarded other emails at 394 00:23:43,520 --> 00:23:46,720 Speaker 1: different times outside of Netflix. And I was like, are 395 00:23:46,720 --> 00:23:50,960 Speaker 1: those all to Alissa Pagels? And they were like, yeah, 396 00:23:51,000 --> 00:23:53,119 Speaker 1: well, or they didn't necessarily say yeah, but they were like, 397 00:23:53,320 --> 00:23:54,879 Speaker 1: I mean, we could double check. And I was like, 398 00:23:55,080 --> 00:23:59,479 Speaker 1: that's my wife. I was like, I forwarded those emails 399 00:23:59,520 --> 00:24:02,680 Speaker 1: to my wife. And at the time they 400 00:24:02,720 --> 00:24:04,960 Speaker 1: were like, oh, by the way, it's a fireable offense 401 00:24:05,000 --> 00:24:09,600 Speaker 1: to forward any email that Netflix sends, period. And it's like, 402 00:24:09,720 --> 00:24:12,080 Speaker 1: were you aware of that? I was not aware of 403 00:24:12,080 --> 00:24:16,399 Speaker 1: this. Aware? Not that I know of. Like, you know, 404 00:24:16,440 --> 00:24:18,359 Speaker 1: so I have an attorney.
Now my attorney is like, 405 00:24:18,520 --> 00:24:21,080 Speaker 1: I don't really understand how they came to this conclusion 406 00:24:21,080 --> 00:24:24,280 Speaker 1: as well. Because, like, 407 00:24:24,359 --> 00:24:26,560 Speaker 1: for instance, I even said back to them, 408 00:24:26,560 --> 00:24:28,840 Speaker 1: I was like, well, but like, you know, sometimes you 409 00:24:28,880 --> 00:24:33,280 Speaker 1: guys will invite us to events and, you know, you say, hey, 410 00:24:33,400 --> 00:24:35,119 Speaker 1: like, you can go to this movie premiere. So you're 411 00:24:35,160 --> 00:24:37,080 Speaker 1: saying I shouldn't forward that email to my wife 412 00:24:37,080 --> 00:24:39,840 Speaker 1: asking her if she wanted to go with me? You know, 413 00:24:40,359 --> 00:24:43,120 Speaker 1: and then, you know, uh, there were sometimes where 414 00:24:43,119 --> 00:24:44,960 Speaker 1: they would say, do you want to watch this show 415 00:24:44,960 --> 00:24:46,639 Speaker 1: and give us feedback to make sure the show is 416 00:24:46,640 --> 00:24:48,399 Speaker 1: going to be good? And I forwarded that to my 417 00:24:48,400 --> 00:24:49,680 Speaker 1: wife and said, do you want to watch this show, 418 00:24:49,680 --> 00:24:52,119 Speaker 1: because you had to sign up, and she would say 419 00:24:52,240 --> 00:24:53,679 Speaker 1: yes or no. I was like, so I wasn't supposed 420 00:24:53,680 --> 00:24:56,879 Speaker 1: to do that? You know, if a complimentary email came 421 00:24:57,080 --> 00:24:59,800 Speaker 1: from one of our executives and I thought it was great, 422 00:25:00,000 --> 00:25:01,639 Speaker 1: and I forwarded it to my wife and said, look at 423 00:25:01,680 --> 00:25:04,080 Speaker 1: how great this email was.
I wasn't supposed to do 424 00:25:04,119 --> 00:25:07,040 Speaker 1: that either? Like, it didn't make sense to me that 425 00:25:07,240 --> 00:25:10,160 Speaker 1: all of a sudden, all of these things that I 426 00:25:10,200 --> 00:25:12,359 Speaker 1: just did in my normal course of business, that, 427 00:25:12,480 --> 00:25:17,879 Speaker 1: again, were often complimentary of Netflix, were fireable, and that 428 00:25:17,920 --> 00:25:20,560 Speaker 1: there had never been an example of what you could 429 00:25:20,560 --> 00:25:24,280 Speaker 1: be fired for provided to me. And, you know, 430 00:25:24,359 --> 00:25:26,080 Speaker 1: I think this really comes back to this idea, like, 431 00:25:26,119 --> 00:25:31,280 Speaker 1: No Rules Rules is a great concept until things 432 00:25:31,280 --> 00:25:35,640 Speaker 1: come up. Because I talked to other people, and I 433 00:25:35,680 --> 00:25:37,200 Speaker 1: won't name names, but I talked to other people. I 434 00:25:37,240 --> 00:25:38,440 Speaker 1: was like, have you ever done any of these things? 435 00:25:38,440 --> 00:25:40,520 Speaker 1: And they were like, yeah. I was like, so basically 436 00:25:40,560 --> 00:25:44,000 Speaker 1: we all should be fired, because we were all breaking 437 00:25:44,040 --> 00:25:48,240 Speaker 1: these types of rules, you know. And I think that's 438 00:25:48,320 --> 00:25:51,359 Speaker 1: the issue, you know. Like, you know, and 439 00:25:51,400 --> 00:25:54,480 Speaker 1: this goes back to the freedom and responsibility, and changing that 440 00:25:54,520 --> 00:25:58,399 Speaker 1: to accountability.
You know, you can't tell someone that they 441 00:25:58,440 --> 00:26:02,800 Speaker 1: have the freedom to be transparent and do all these 442 00:26:02,840 --> 00:26:06,320 Speaker 1: different types of actions, but then when that person does 443 00:26:06,359 --> 00:26:09,040 Speaker 1: something that you don't like, all of a sudden they 444 00:26:09,119 --> 00:26:13,919 Speaker 1: become a problem. Because, I mean, so then the question 445 00:26:13,920 --> 00:26:18,359 Speaker 1: would be, was it then not within the scope of 446 00:26:18,400 --> 00:26:22,000 Speaker 1: your position to access the data that you accessed to 447 00:26:22,080 --> 00:26:26,520 Speaker 1: begin with? I mean, as far as I know, if 448 00:26:26,600 --> 00:26:31,520 Speaker 1: you have access to something, you can access it. Like, 449 00:26:31,880 --> 00:26:33,919 Speaker 1: I mean, that's as far as I know, you know. 450 00:26:34,520 --> 00:26:37,359 Speaker 1: And also, if it had been restricted, 451 00:26:37,640 --> 00:26:39,600 Speaker 1: then you wouldn't have been able to access it in 452 00:26:39,640 --> 00:26:42,840 Speaker 1: the first place. Correct. And, to be clear, 453 00:26:42,920 --> 00:26:45,520 Speaker 1: I'm not advocating for Netflix to change its policies to 454 00:26:45,720 --> 00:26:48,119 Speaker 1: lock things down. I'm not saying that at all. But 455 00:26:48,200 --> 00:26:50,919 Speaker 1: what I am saying is that this felt weird. 456 00:26:51,440 --> 00:26:54,159 Speaker 1: It felt really, really strange.
And actually, in my previous role, 457 00:26:54,200 --> 00:26:57,639 Speaker 1: so my most recent role was Game Launch Operations Program Manager, 458 00:26:57,800 --> 00:26:59,919 Speaker 1: but before that, I was a senior data product manager 459 00:27:00,200 --> 00:27:03,600 Speaker 1: and I managed finance and membership data, which are two 460 00:27:03,600 --> 00:27:05,920 Speaker 1: of the most important pieces of data that you could 461 00:27:06,000 --> 00:27:09,960 Speaker 1: possibly deal with in Netflix. And in fact, 462 00:27:09,960 --> 00:27:13,080 Speaker 1: that job was so stressful because I would know weeks 463 00:27:13,200 --> 00:27:16,000 Speaker 1: or months ahead of time whether we were probably going 464 00:27:16,040 --> 00:27:19,320 Speaker 1: to miss how many, you know, members we were supposed 465 00:27:19,320 --> 00:27:22,000 Speaker 1: to have, how much money we made. And that is 466 00:27:22,040 --> 00:27:25,520 Speaker 1: so much more sensitive than going on to a page 467 00:27:25,520 --> 00:27:28,280 Speaker 1: and seeing, you know, what movie are we going to 468 00:27:28,320 --> 00:27:36,680 Speaker 1: release, for instance. So, and I know, I don't want 469 00:27:36,680 --> 00:27:38,960 Speaker 1: to get into the legalities of it, because you obviously 470 00:27:39,040 --> 00:27:41,800 Speaker 1: have a case that is, you know, that is pending. 471 00:27:42,920 --> 00:27:49,040 Speaker 1: What are the feelings of, let's say, the employees who 472 00:27:49,600 --> 00:27:55,520 Speaker 1: are trans, and those who are allies? What happens if the 473 00:27:55,600 --> 00:27:59,080 Speaker 1: demands that are presented are not met? Where do 474 00:27:59,119 --> 00:28:03,160 Speaker 1: you think that this goes? Yeah, 475 00:28:03,320 --> 00:28:05,040 Speaker 1: so it's really interesting.
So one of the things I 476 00:28:05,320 --> 00:28:07,919 Speaker 1: really appreciated about the allies that were a part of 477 00:28:07,960 --> 00:28:10,600 Speaker 1: this movement is that they were a little scared, like, 478 00:28:10,520 --> 00:28:13,119 Speaker 1: they were a little pissed off too, right. And it 479 00:28:13,119 --> 00:28:16,199 Speaker 1: really came back to, like, a few different ideas. So 480 00:28:16,320 --> 00:28:20,200 Speaker 1: first and foremost, it was this idea that, you know, 481 00:28:20,760 --> 00:28:24,720 Speaker 1: the emails were very strange. They seemed to be shutting 482 00:28:24,720 --> 00:28:27,959 Speaker 1: down dissent, and that's not a core Netflix value. 483 00:28:29,000 --> 00:28:32,879 Speaker 1: They were also really upset, especially around those comments 484 00:28:32,880 --> 00:28:36,720 Speaker 1: around harmful content. Because, like, I think, no matter who 485 00:28:36,760 --> 00:28:39,760 Speaker 1: you are, like, we all agree that there are certain 486 00:28:39,760 --> 00:28:41,840 Speaker 1: types of content that you wouldn't show to certain types 487 00:28:41,880 --> 00:28:43,840 Speaker 1: of people. That's why you have PG-13, that's why 488 00:28:43,920 --> 00:28:46,160 Speaker 1: you have rated R, that's why you have rated NC-17. 489 00:28:46,480 --> 00:28:51,600 Speaker 1: So it's well litigated that content has an 490 00:28:51,640 --> 00:28:55,200 Speaker 1: impact in the real world, right. And I think the 491 00:28:55,760 --> 00:28:59,600 Speaker 1: third thing about that was about reputation, right. Like, it 492 00:28:59,640 --> 00:29:02,280 Speaker 1: is important for Netflix to think about the type of 493 00:29:02,320 --> 00:29:05,120 Speaker 1: content that we put out there.
So I do think 494 00:29:05,160 --> 00:29:07,560 Speaker 1: that no matter what, there's a little bit of a 495 00:29:07,600 --> 00:29:10,920 Speaker 1: cultural reckoning that has to happen, to discuss what does 496 00:29:10,960 --> 00:29:13,800 Speaker 1: it look like for Netflix to move forward and how 497 00:29:13,840 --> 00:29:17,120 Speaker 1: it will actually handle these types of situations in the future. 498 00:29:17,480 --> 00:29:20,040 Speaker 1: So that's first and foremost. Secondly, in terms of the 499 00:29:20,120 --> 00:29:24,920 Speaker 1: asks not being met, I'm not sure how that's gonna look, right. 500 00:29:24,960 --> 00:29:27,160 Speaker 1: I'm not sure how allies especially are going to feel 501 00:29:27,160 --> 00:29:29,640 Speaker 1: about that, because, like, almost everyone thinks the 502 00:29:29,640 --> 00:29:34,240 Speaker 1: asks are very reasonable, right. And Netflix has a history 503 00:29:34,320 --> 00:29:37,320 Speaker 1: of saying, we're going to invest in this 504 00:29:37,600 --> 00:29:43,360 Speaker 1: particular pipeline of talent to help grow this particular category 505 00:29:43,400 --> 00:29:46,720 Speaker 1: of content. For instance, like, there's one 506 00:29:46,880 --> 00:29:49,800 Speaker 1: for Black and Latinx people already. When COVID 507 00:29:49,840 --> 00:29:52,760 Speaker 1: first started, they started creating these huge funds for creators 508 00:29:52,880 --> 00:29:56,640 Speaker 1: across the world. So it's not that different from concepts that 509 00:29:56,720 --> 00:29:59,520 Speaker 1: already exist, right.
And then, like, one 510 00:29:59,520 --> 00:30:02,360 Speaker 1: of the other things is about actually, you know, investing 511 00:30:02,800 --> 00:30:07,080 Speaker 1: marketing into trans, you know, content. Like, so, for instance, 512 00:30:07,120 --> 00:30:09,680 Speaker 1: many people think that Disclosure had a chance of potentially 513 00:30:09,720 --> 00:30:12,720 Speaker 1: being nominated for an Oscar, but Netflix didn't actually put 514 00:30:12,760 --> 00:30:15,920 Speaker 1: a lot of power behind promoting it, right. So these 515 00:30:15,920 --> 00:30:19,200 Speaker 1: are all very reasonable, simple asks. And so I do 516 00:30:19,280 --> 00:30:22,080 Speaker 1: think, in my mind, the question is, why wouldn't you want to 517 00:30:22,080 --> 00:30:26,120 Speaker 1: do at least something here, right? Um, so that's 518 00:30:26,200 --> 00:30:30,600 Speaker 1: my take on it. I hope that, you know, 519 00:30:31,040 --> 00:30:35,520 Speaker 1: I'm being a little too pragmatic, and, like, Netflix actually 520 00:30:35,560 --> 00:30:38,920 Speaker 1: decides to take a dive in and really try to 521 00:30:38,960 --> 00:30:43,600 Speaker 1: do, um, some bigger lift there. But at 522 00:30:43,600 --> 00:30:47,360 Speaker 1: the same time, you know, I've been very shocked by 523 00:30:47,400 --> 00:30:49,440 Speaker 1: a lot of what Netflix has done over the past 524 00:30:49,800 --> 00:30:51,800 Speaker 1: couple of weeks. These are things that I didn't expect 525 00:30:51,800 --> 00:30:55,520 Speaker 1: from this company. Um, and, you know, I'm 526 00:30:55,640 --> 00:30:58,240 Speaker 1: usually pretty good at predicting things, and so it's 527 00:30:58,280 --> 00:31:01,400 Speaker 1: been very surprising to me the ways in which they've moved.
528 00:31:03,160 --> 00:31:07,400 Speaker 1: Do you feel like, because of this termination and because 529 00:31:07,440 --> 00:31:12,080 Speaker 1: of how public it is and has become, that you're 530 00:31:12,080 --> 00:31:14,600 Speaker 1: going to be in some way, shape or form 531 00:31:14,680 --> 00:31:18,240 Speaker 1: blacklisted in your industry? So it's really interesting. So first 532 00:31:18,240 --> 00:31:21,920 Speaker 1: and foremost, Netflix is a media company, but I'm a technologist, 533 00:31:22,320 --> 00:31:24,480 Speaker 1: right, right. So I do think that that helps. Like, 534 00:31:24,520 --> 00:31:26,600 Speaker 1: it could be that if I wanted to 535 00:31:26,640 --> 00:31:29,680 Speaker 1: work in another media company, there could be issues. But, 536 00:31:29,840 --> 00:31:32,000 Speaker 1: you know, I'm going to end up being, hopefully, a 537 00:31:32,040 --> 00:31:34,640 Speaker 1: product manager or program manager at a tech company, and I think 538 00:31:34,640 --> 00:31:37,240 Speaker 1: it's a little bit different there, right, you know, 539 00:31:37,320 --> 00:31:40,120 Speaker 1: from that perspective. And I have reached out, well, my 540 00:31:40,160 --> 00:31:42,480 Speaker 1: network has reached out to me, and there's a lot 541 00:31:42,520 --> 00:31:46,280 Speaker 1: of people who have nothing but complimentary things to say 542 00:31:46,360 --> 00:31:50,000 Speaker 1: about how much they loved working with me throughout my career.
543 00:31:50,400 --> 00:31:53,520 Speaker 1: So I'm very hopeful that this is just simply a 544 00:31:53,600 --> 00:31:59,840 Speaker 1: lesson for Netflix, and that people remember the quality of 545 00:32:00,960 --> 00:32:05,160 Speaker 1: my work for the past decade in technology, and that 546 00:32:05,280 --> 00:32:08,360 Speaker 1: I don't have issues, you know. And if I do 547 00:32:08,480 --> 00:32:13,240 Speaker 1: have issues, if nothing else, I get to loudly say, hey, 548 00:32:13,240 --> 00:32:16,920 Speaker 1: by the way, there's a lot of stuff going on 549 00:32:17,040 --> 00:32:19,680 Speaker 1: here that we should really talk about and address and 550 00:32:19,720 --> 00:32:22,520 Speaker 1: try to figure out how to fix, you know. I 551 00:32:22,600 --> 00:32:26,120 Speaker 1: just have to say that, one, I commend you 552 00:32:26,320 --> 00:32:30,200 Speaker 1: for the conversations that you have been having that go 553 00:32:30,280 --> 00:32:34,840 Speaker 1: so far beyond, oh, just take down the special, right, 554 00:32:34,920 --> 00:32:39,080 Speaker 1: because taking down the special isn't getting to the heart 555 00:32:39,120 --> 00:32:41,760 Speaker 1: of the matter, right, which is, to 556 00:32:41,880 --> 00:32:48,920 Speaker 1: your point initially, about accountability, about responsibility, about representation that 557 00:32:49,360 --> 00:32:51,959 Speaker 1: is on par with, you know, the good and 558 00:32:52,040 --> 00:32:56,160 Speaker 1: the bad, right, giving that exposure, creating those pipelines.
And 559 00:32:56,200 --> 00:33:00,640 Speaker 1: I think that in many of these large companies, from Facebook 560 00:33:00,640 --> 00:33:03,120 Speaker 1: to Netflix to Google and, you know, all of them, 561 00:33:03,760 --> 00:33:07,520 Speaker 1: they're very tight-lipped and you don't really know 562 00:33:07,600 --> 00:33:10,719 Speaker 1: what's going on internally. And then when you get to 563 00:33:10,760 --> 00:33:13,880 Speaker 1: see that, when the curtain is pulled back, you're just like, so, 564 00:33:13,960 --> 00:33:17,400 Speaker 1: this isn't great, right? And as a consumer, I'm saying 565 00:33:17,440 --> 00:33:20,680 Speaker 1: to myself, and especially as a black queer consumer, I'm like, 566 00:33:21,440 --> 00:33:23,760 Speaker 1: I don't really think that these asks are that hard, 567 00:33:24,040 --> 00:33:26,680 Speaker 1: right, for a multibillion-dollar company to be able to 568 00:33:26,680 --> 00:33:29,880 Speaker 1: put together. I know you hire consultants and, you know, 569 00:33:30,000 --> 00:33:33,440 Speaker 1: advocates and all of these different things. 570 00:33:33,480 --> 00:33:36,920 Speaker 1: So make more of a concerted effort, right? Like, show 571 00:33:36,960 --> 00:33:40,280 Speaker 1: people that you are beyond just... obviously it's a company, 572 00:33:40,360 --> 00:33:43,240 Speaker 1: it's capitalism. Dave Chappelle makes them a hell of a 573 00:33:43,240 --> 00:33:46,200 Speaker 1: lot of money, so they were never going to, you know, 574 00:33:46,320 --> 00:33:48,440 Speaker 1: take down that show. But it's like, where else 575 00:33:48,480 --> 00:33:53,360 Speaker 1: can you put investments? The last question for you is, 576 00:33:53,640 --> 00:33:57,960 Speaker 1: you know, what are your hopes for your former colleagues 577 00:33:58,160 --> 00:34:01,400 Speaker 1: now, who did put themselves out there, who did 578 00:34:01,480 --> 00:34:04,520 Speaker 1: do this walkout?
Um, what are your hopes for 579 00:34:04,680 --> 00:34:09,319 Speaker 1: how they're able to move forward in this climate? Yeah. 580 00:34:09,360 --> 00:34:11,360 Speaker 1: So I think there's two categories there. So for my 581 00:34:11,400 --> 00:34:14,319 Speaker 1: Trans* siblings, like, I'm so proud of them, right? 582 00:34:14,400 --> 00:34:17,239 Speaker 1: Like, it's so interesting, because Tara and I, like, 583 00:34:17,280 --> 00:34:19,360 Speaker 1: we talk about this all the time. Tara and I 584 00:34:19,400 --> 00:34:22,800 Speaker 1: are probably the two feistiest members of Trans*, and 585 00:34:23,080 --> 00:34:27,000 Speaker 1: the rest of the group, um, they're some of the 586 00:34:27,040 --> 00:34:29,000 Speaker 1: greatest people in the world. But it 587 00:34:29,040 --> 00:34:31,880 Speaker 1: would have been easy, um, after the two of us 588 00:34:32,000 --> 00:34:33,839 Speaker 1: had to go through what we went through, to take 589 00:34:33,880 --> 00:34:35,880 Speaker 1: a step back and say, do we really want to 590 00:34:35,880 --> 00:34:38,359 Speaker 1: do this? Do we want to risk this? And they did. 591 00:34:38,840 --> 00:34:41,839 Speaker 1: They pushed forward, they kept promoting things, they kept making 592 00:34:41,880 --> 00:34:44,560 Speaker 1: sure it happened. I'm so proud of them, and I 593 00:34:44,640 --> 00:34:47,160 Speaker 1: know for a fact that every single one of them 594 00:34:47,280 --> 00:34:49,400 Speaker 1: is the better for it, because, like, I know 595 00:34:49,480 --> 00:34:52,560 Speaker 1: that they now know that they have the strength to 596 00:34:52,560 --> 00:34:56,080 Speaker 1: fight no matter what.
And I also really firmly 597 00:34:56,120 --> 00:34:58,120 Speaker 1: believe that they can make this happen, like, they can 598 00:34:58,160 --> 00:35:02,280 Speaker 1: really continue to be the bug in Netflix's ear saying, hey, 599 00:35:02,280 --> 00:35:04,799 Speaker 1: where's that trans content? When's it gonna happen? When's it 600 00:35:04,840 --> 00:35:08,680 Speaker 1: gonna be here? For the allies to join, you know, 601 00:35:08,760 --> 00:35:11,239 Speaker 1: one of the things that's interesting to me, and this 602 00:35:11,320 --> 00:35:16,359 Speaker 1: goes to show, when you deal with quality people, it 603 00:35:16,360 --> 00:35:18,880 Speaker 1: does not matter what happens, they show up for you. 604 00:35:18,880 --> 00:35:22,160 Speaker 1: You know, I've gotten dozens upon dozens of messages from 605 00:35:22,239 --> 00:35:25,000 Speaker 1: my former Netflix colleagues. They also set up a diaper 606 00:35:25,040 --> 00:35:28,160 Speaker 1: fund for me, and honestly, at this point it's 607 00:35:28,200 --> 00:35:30,080 Speaker 1: a big enough fund that I'm just like, I'm 608 00:35:30,120 --> 00:35:32,480 Speaker 1: gonna have to donate some diapers to other people, because 609 00:35:32,560 --> 00:35:35,400 Speaker 1: it's like, this is too much for diapers. But, you know, 610 00:35:35,440 --> 00:35:37,080 Speaker 1: they set up all of this to make sure I 611 00:35:37,120 --> 00:35:40,400 Speaker 1: was taken care of, and so, like, I love and 612 00:35:40,440 --> 00:35:44,000 Speaker 1: appreciate them.
You know, I have no ill will towards Netflix, 613 00:35:44,120 --> 00:35:47,200 Speaker 1: primarily because I want them to be successful, right. 614 00:35:47,200 --> 00:35:50,880 Speaker 1: And a lot of those individuals, they have the ability, 615 00:35:51,000 --> 00:35:54,000 Speaker 1: they're so talented, they're so smart, and I know for 616 00:35:54,120 --> 00:35:56,920 Speaker 1: a fact that they're gonna be bugging people's ears too, 617 00:35:57,040 --> 00:36:00,359 Speaker 1: saying, hey, this was weird, this was strange. How can 618 00:36:00,400 --> 00:36:03,240 Speaker 1: we be better? Let's never have this situation happen again. 619 00:36:03,480 --> 00:36:07,000 Speaker 1: We should never be divided in this way as a company, 620 00:36:07,200 --> 00:36:09,879 Speaker 1: because, like, we are trying to fight all these other 621 00:36:09,880 --> 00:36:14,200 Speaker 1: competitive forces, why are we also fighting within ourselves? B, 622 00:36:14,680 --> 00:36:17,560 Speaker 1: you know, once again, thank you so much for making 623 00:36:17,600 --> 00:36:21,480 Speaker 1: the time for Woke AF today, but also for your courage 624 00:36:22,200 --> 00:36:26,800 Speaker 1: to speak out and be honest and vulnerable, right, because 625 00:36:26,840 --> 00:36:33,759 Speaker 1: oftentimes black trans people just in general are silenced, 626 00:36:34,560 --> 00:36:39,160 Speaker 1: are statistics and are victims and are made to be victims. 627 00:36:39,719 --> 00:36:43,279 Speaker 1: And in this instance, you're, you know, a warrior on 628 00:36:43,320 --> 00:36:47,120 Speaker 1: the front lines, bringing much light and needed attention to 629 00:36:47,200 --> 00:36:51,680 Speaker 1: a situation that frankly shouldn't have occurred. And hopefully people 630 00:36:51,719 --> 00:36:55,480 Speaker 1: will learn from this moving forward.
And I hope that, 631 00:36:55,560 --> 00:36:59,200 Speaker 1: as your case evolves and moves forward, you'll come 632 00:36:59,200 --> 00:37:01,240 Speaker 1: back and join us again and give us an update 633 00:37:01,280 --> 00:37:03,920 Speaker 1: as to where things go. I will be 634 00:37:03,960 --> 00:37:06,600 Speaker 1: happy to, and honestly, I really hope it's 635 00:37:06,640 --> 00:37:09,720 Speaker 1: a very positive update, you know. I'm putting 636 00:37:09,719 --> 00:37:13,080 Speaker 1: that energy and those vibrations out for you as well. Yes, 637 00:37:13,239 --> 00:37:21,799 Speaker 1: for sure, thank you, thank you. That is it for me, 638 00:37:22,040 --> 00:37:26,160 Speaker 1: dear friends, here on Woke AF. As always, power to the 639 00:37:26,200 --> 00:37:29,319 Speaker 1: people, and to all the people, power. Get woke and 640 00:37:29,440 --> 00:37:30,800 Speaker 1: stay woke as fuck.