Annie: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. Today we are once again thrilled to be joined by the amazing, award-winning, fantastic Bridget Todd, who we count ourselves so lucky to know and call our friend. Congratulations, Bridget, on your recent award for Best Technology Podcast at the iHeartRadio Podcast Awards. Well deserved.

Samantha: That's so awesome. Congratulations to you and your whole team.

Bridget: Oh my goodness, thank you so much. Yeah, and you know what I'm gonna say: it feels freaking good to win. I was kind of like, oh, you know, just being nominated is nice, but deep down... you ever have one of those times in your life where you just really need to win? That's where I was at. So I have to briefly shout out my team. Tari Harrison is my producer and engineer; she's phenomenally talented. Jonathan Strickland is our EP, phenomenally talented. Dr. Michael Amato is our chief science officer and producer, so talented. I could not have won without them. They're so awesome.
Samantha: Yes, thank you. Yes, I love a win, especially when it's our personal friends. Yeah, I said it, I'm claiming he is a personal friend.

Annie: I know. Bridget, thank you.

Bridget: I mean, I feel the same way about you. I feel very honored to know you all in real life.

Annie: Yes, yes. And in case you don't know, listeners, which I'm assuming you do, this is for the podcast There Are No Girls on the Internet, which is... you've got a new season coming, right?

Bridget: Yes, yeah. I probably should have said the name of my own show, but I'm trying to get better at self-promotion. Yes, the podcast is called There Are No Girls on the Internet. We just won an iHeart Award, and we're coming back for a brand new season on March 1. We've kind of been on hiatus for a bit while we've been retooling, and I'm so excited that we're finally launching. So it would mean a lot to me if you all checked it out. Thank you for mentioning it, Annie. I obviously cannot be trusted to remember to say the name of the thing I'm meant to be promoting, but yeah, please check it out.
Bridget: We have all kinds of interesting conversations about women and queer folks and trans folks and Black folks and other marginalized voices, how we show up or don't show up online and in technology. So yeah, please check it out. If that sounds like something you're interested in, you absolutely should.

Annie: I'm still waiting for the fan fiction episode. I'm ready.

Bridget: Oh yes, TBD.

Samantha: Also, just a fun fact about Tari, who's also a good friend of ours: she always sings sitcom jingles at karaoke. Yeah, it's the thing I adore about her. She refuses to sing anything else, and when someone interrupts her, she gets very annoyed. Rightly so, because this is her thing, and she wants it and has claimed it and has done well with it.

Bridget: How have I worked with Tari for two years and never seen this? I mean, I guess COVID and all, but I need to take her out for karaoke to see this in action.

Samantha: Or, you know what's better? Bridget, we need you to come to Atlanta and host a big karaoke night.
Samantha: There are several great locations that we love, including my favorite spot out toward Buford Highway, run by a Korean family who is as Korean as you think. We adore them, and they bring us fruit. The place is a delight while we're singing karaoke to our hearts' content.

Bridget: Yeah, please put that on the agenda. The last time I was in Atlanta, I went to... what is that spot called? Church?

Samantha: Church.

Bridget: Church. There was some karaoke happening. It was pretty fun. So yeah, Atlanta karaoke date. Literally anytime. I am in. I need to see producer Tari in action.

Samantha: I saw her singing the jingles one time, and she was like, "We're singing Cheers together." It's okay.

Annie: I love it, I love it, I love it, I love it. Okay. So this topic you're bringing to us today, Bridget, is extremely timely. We're very excited to talk about it, and we have a lot to get into. So let's get into it. What are we talking about today?

Bridget: So today is Friday, February 25th, and I want to talk about the just recently announced new nominee for the Supreme Court, Judge Ketanji Brown Jackson.
Bridget: So I have to admit, I was putting together all my notes for this episode last night, and the notes were all like, "oh, the potential nominee," "when we have the nominee." It was all very hypothetical. And then this morning the news dropped that she is indeed the White House's pick to be the first ever Black woman justice on our Supreme Court. So very historic, very important, very exciting. But we know that it also comes along with the racist attacks, sexist attacks, and mis- and disinformation that women of color in public office unfortunately tend to face. And so today I really wanted to talk about how we got to this place of having this historic Black woman nominated for the Supreme Court, what kind of attacks she's likely to be facing, what kind of attacks she's already been facing, and how we can all work together to create the conditions for a better conversation about her nomination.

Annie: Yes, and as you said, this is very ongoing. We're trying to get this episode out as quickly as possible because things are changing very quickly.
Annie: But we've already seen some of these attacks. I know we're gonna get into that in a minute, but before we do, can you give us some history and background on what's going on here?

Bridget: Absolutely. So here's a little bit of background about the call to nominate a Black woman for the Supreme Court. I have to shout out the She Will Rise campaign, organized by a great organization called Sister SCOTUS, and their whole coalition is full of dynamic Black women: women like April Reign, who created the hashtag #OscarsSoWhite; 1619 Project creator Nikole Hannah-Jones; Alencia Johnson, who I love, I used to work with her at Planned Parenthood; Broadway multi-Tony-Award winner Audra McDonald. So just a huge coalition of dynamic, badass Black women who have been advocating to put a Black woman on the Supreme Court. And in the over two-hundred-year history of the Supreme Court, not one Black woman has ever been confirmed, or even nominated, to serve on the Supreme Court.
Bridget: There have been one hundred and fifteen men and women who have served on the Supreme Court, and only three of them have been people of color. There have only been two Black American members of the Supreme Court: Justice Thurgood Marshall and Justice Clarence Thomas. And so that's obviously not a very inclusive track record in terms of representation. And you know, I thought this was kind of a new precedent, a presidential candidate saying, "oh, if I'm elected, I will put this kind of person on the Supreme Court." However, there is actually a long history and precedent of presidents pledging to pick a SCOTUS nominee who represents a certain demographic of our population. This is from a really great New York Times op-ed by Walter Dellinger, who was the acting Solicitor General of the United States under Bill Clinton.
Bridget: He writes: "There is a long and important tradition of presidents taking into consideration the demographic characteristics of prospective justices, including geographic background, religion, race, and sex, to ensure the Supreme Court is and remains a representative institution in touch with the varied facets of American life. More fundamentally, our history shows the process of reaching out to expand the personal backgrounds of the justices has often produced stellar jurists who make historic contributions to our court and judicial system." He goes on to describe how President Reagan promised to nominate a woman to the Supreme Court, and even though a bunch of his Republican colleagues were very vocally against it and kind of forced him to add some men to his shortlist, President Reagan was really adamant about picking a woman and eventually nominated Sandra Day O'Connor, making her the first ever female associate justice of the Supreme Court. And so that's some history that I didn't even know about, you know, how other presidents have set this precedent to make the Supreme Court more inclusive.
Samantha: Yeah, and I mean, call it a way to be critical of a lot of news organizations, and Republicans perhaps, but they would have you believe that this is a new thing, like, oh, this has never happened before and it's ridiculous, right? And this is not just an issue of representation, correct?

Bridget: Correct, absolutely. So representation is important. People often say representation matters, and it absolutely does; I'm not disagreeing with that. However, it's not just a problem because representation matters. It's also a problem because we deserve a representative democracy, a democracy where the people who are governing are actually able to meaningfully represent the people they are governing on behalf of. And you can't have that if your Supreme Court is mostly male and mostly white, because then you have a Supreme Court that is not actually able to meaningfully represent the people it is meant to be advocating on behalf of, or working on behalf of.
Bridget: A biographical database from the Federal Judicial Center shows that of the three thousand, eight hundred and forty-three federal judges, less than two percent have been Black women. And so again, it's not just the Supreme Court. It really does go to show that we have a long way to go in making sure that the people who are representing us in the courts are able to meaningfully represent us and look like the population they're actually meant to serve.

Annie: Yes, yes. And I think right now a lot of us, for good reason I would say, are on edge when it comes to the Supreme Court and the decisions they're tackling or thinking about, including abortion. So it is very, very, very important that we are accurately representing our country, the people in our country. So can we talk a little bit about the promise Biden made when it comes to nominating a Black woman to the Supreme Court?

Bridget: Absolutely. So this is one of those instances where Biden made a very clear campaign promise. Certainly the administration has made other campaign promises, and TBD if they will come to fruition.
Bridget: But this was a very clear one, so this is sort of making good on a very explicit promise. You might hear Black women referred to as the backbone of the Democratic Party, and that's because we are reliable voters who tend to vote Democrat, and we tend to go out there and organize our friends and our family and our communities to vote as well. If you look at the numbers, I think it was something like less than one percent of Black women voted for Trump. So we are pretty reliable voters for the Democrats, and an overwhelming majority of Black women supported Biden during his presidential bid. Right before voting began in South Carolina, during South Carolina's primary, Biden made a very clear campaign promise to nominate a Black woman to the Supreme Court.
Bridget: And so yeah, this was a very clear promise that he made to a constituency that is reliable, that really did a lot of the work and a lot of the ground game of getting out the vote and organizing our communities to get out the vote as well.

Annie: Right. And going back to what you mentioned, when it comes to all of these sexist, racist attacks that a lot of women, and women of color, and especially Black women face when it comes to elections or nominations like this, and then also misinformation and disinformation: it's one thing to say, okay, let's elect a Black woman, and another to provide the necessary support.

Bridget: Absolutely. So, you know, I have a shirt that says "Trust Black Women." We hear a lot of slogans about the importance of electing Black women and really amplifying our political leadership. And again, I feel like that is great; representation is so great. However, it really does need to go along with the work of creating the conditions so that these women will be supported.
They shouldn't just be, you know, fodder for unfair, sexist, racist attacks purely based on their identity, who they are. And so I don't want to just have Black women, or women, or women of color amplified as leaders if we're going to be set up to fail, if we're going to be set up to compete on a completely unequal playing field. I want to amplify the leadership of women, but I also want to create the conditions where we actually can thrive. And so I think that's really what I want to get into today: some of the ways that our media landscape is set up to ensure that this person will not get a fair shake. I have to say, I wasn't really thrilled watching Jen Psaki from the White House give this tepid acknowledgement of the kind of racism and sexism that would go along with picking a Black woman as the Supreme Court nominee.
Bridget: She said that Biden's intention to pick a Black woman for the Supreme Court presented, quote, "specific challenges." But in my opinion, that really doesn't go far enough in naming and lifting up the kind of racist, sexist attacks and media climate this person is being set up to face. And that can really be tricky, because as I'm sure a lot of people listening can attest, it can sometimes be difficult to call out the kind of unfair attacks that we face as marginalized people. So if you're a woman who is facing sexism, sometimes if you are the one to vocalize that, it only goes against you, because then you're the complainer, you're the nag. And so if you're in a climate where you can't really call out what you're facing, and the people around you aren't going to explicitly call out what you're facing, then it's just allowed to fester, uncalled out.

Samantha: Yeah. I found it interesting, because when you brought this to our attention, I hadn't realized he had made an announcement. We just knew the promise that President Biden had given.
Samantha: Finally seeing a name, and trying to look her up and see who she was, of course one of the top things I saw was a congressman going in on attacks, repeatedly attacking her, obviously already ready to go. Oddly enough, the same congressman had been one of three Republicans who voted her into her federal seat, and we were like, wait, so how was she qualified then and not now? But we know the answer, obviously, of why they are very upset about this pick. In general, in all of the game of politics, we understand that. But that is part of that conversation where I'm like, I don't even want to mention who it is, because I don't want to amplify that even more. Instead, let's talk about her qualifications and why she is qualified to lead us in this type of position.

Bridget: Absolutely. So even before Judge Jackson was named as the nominee, people were wasting no time lobbing these completely ridiculous, unfair, sexist attacks at her, when we didn't even know who she was; she was just a hypothetical person.
Bridget: Senator John Kennedy told Politico: "I want a nominee who knows a law book from a J.Crew catalog. I want a nominee who's not going to try to rewrite the Constitution every other Thursday to advance a woke agenda." And he was saying this when we didn't even know who this person was yet; they were already lobbing these racist, sexist attacks. Like, why do you assume that she's not going to know a legal brief from a J.Crew catalog? Reading between the lines, that is obviously meant to be a sexist swipe, because Biden had already expressed his intention to nominate a Black woman. And why do you assume this person is going to be pushing a "woke agenda"? What does that even mean? I think when you really pull apart some of these dog whistles that are used, it just reveals itself as unfair attacks rooted in identity, or just complete hypocrisy, just like what you were saying. Were you referencing Lindsey Graham? Because I think it was literal.
Bridget: So I've been up working on this all day, right? I watched the announcement come in, I watched all the responses. Lindsey Graham tweeted a literal minute after she was announced, saying, quote, "the radical left had won over Biden." Yet Graham also voted to confirm Judge Jackson to the DC federal appeals court, which is the second most important court in the country, just eight months ago, this past summer. So which is it? Is this a win for the woke left? And what was different eight months earlier when you voted to confirm her? So it is very interesting. At this point, the hypocrisy is so clear that it's almost not worth pointing out. It's like, of course you were only interested in a bad-faith assessment. You don't even expect people to look back at your own voting record to see where you actually stand on this issue. You're just counting on people not spending a little bit of time thinking about what you're saying.

Samantha: Right. Yeah, we're both shaking our heads.
It's true, though. It's so frustrating, because it's like every time I point out something that's hypocritical, it doesn't matter anymore. They're like, yeah. I mean, essentially it's kind of what we were talking about when Amy Coney Barrett was confirmed, as well as Kavanaugh: the ridiculous hypocrisy that happened between the Obama administration and the Trump administration. And now we're back again, here again at the same conversation. And literally half of the Republican candidates, or the conservative politicians, will agree: yeah, this is hypocritical, this is what we do. And they kind of just leave it at that and assume that nobody will notice. And typically, people who have already dug their heels in won't notice, because they want what they want in whatever agenda they have planted themselves firmly in. But yeah, let's talk about the fact, and this is a whole different conversation, that we have different types of terms, and words like "woke" that have been used as a positive have now flipped so hard to a negative that everyone is automatically like, oh, he said that, so it's bad.
332 00:18:17,040 --> 00:18:20,960 Speaker 1: Oh my gosh, I could talk all day, I'll say. So, 333 00:18:21,000 --> 00:18:23,520 Speaker 1: I have two things to say about this. One is 334 00:18:23,520 --> 00:18:26,920 Speaker 1: that you are so right. There are so many words 335 00:18:26,960 --> 00:18:30,560 Speaker 1: that have just become meaningless, right. So, if you're talking 336 00:18:30,600 --> 00:18:34,280 Speaker 1: about a hypothetical Supreme Court nominee and you're like, oh, 337 00:18:34,280 --> 00:18:36,600 Speaker 1: we don't want somebody who's woke, well, 338 00:18:36,680 --> 00:18:38,879 Speaker 1: this is a hypothetical person, so 339 00:18:38,960 --> 00:18:40,520 Speaker 1: you don't know anything about their record, you don't know 340 00:18:40,520 --> 00:18:42,680 Speaker 1: anything about, like, where they stand on the issues. So, 341 00:18:43,000 --> 00:18:46,200 Speaker 1: saying woke, when almost the only thing we 342 00:18:46,240 --> 00:18:47,440 Speaker 1: know about this person is that she's going to be 343 00:18:47,480 --> 00:18:49,920 Speaker 1: a black woman, and you're saying you don't want someone who 344 00:18:50,000 --> 00:18:52,800 Speaker 1: is woke. Reading between the lines, you're using that as 345 00:18:52,840 --> 00:18:55,280 Speaker 1: a stand-in for the word black.
Right. And so 346 00:18:55,320 --> 00:18:57,399 Speaker 1: I think that we see that time and time again, 347 00:18:57,480 --> 00:19:06,000 Speaker 1: where these words become stand-ins for identity, and they 348 00:19:06,040 --> 00:19:08,840 Speaker 1: also kind of become meaningless, right. Like, cancel culture is 349 00:19:08,880 --> 00:19:10,920 Speaker 1: another one. I remember reading, and this is sort 350 00:19:10,960 --> 00:19:13,600 Speaker 1: of silly, but there was this story a while ago 351 00:19:13,800 --> 00:19:17,879 Speaker 1: about this, um, guy who had been running a racehorse 352 00:19:17,880 --> 00:19:20,840 Speaker 1: in the Kentucky Derby. His racehorse had been, I don't 353 00:19:20,880 --> 00:19:22,720 Speaker 1: know, I'm sure there's more details to it, but 354 00:19:23,160 --> 00:19:27,640 Speaker 1: essentially his racehorse had been drug tested and had 355 00:19:27,640 --> 00:19:29,800 Speaker 1: been found to have drugs in its system, so 356 00:19:29,840 --> 00:19:32,520 Speaker 1: he was disqualified. And then in an interview he was like, oh, 357 00:19:32,560 --> 00:19:35,080 Speaker 1: this is cancel culture striking again. And I was like, 358 00:19:35,119 --> 00:19:36,960 Speaker 1: what are you even talking about? Like, how is that 359 00:19:37,000 --> 00:19:43,159 Speaker 1: cancel culture? Like, in what way? Yeah, you doped your horse 360 00:19:43,320 --> 00:19:47,399 Speaker 1: and your horse was disqualified. That's not cancel culture. Like, 361 00:19:47,600 --> 00:19:50,080 Speaker 1: I just have a lot of questions about the way 362 00:19:50,119 --> 00:19:52,399 Speaker 1: that these words are being used.
But I think, and 363 00:19:52,440 --> 00:19:54,080 Speaker 1: that's kind of the second point that I want to 364 00:19:54,119 --> 00:19:57,320 Speaker 1: make about this, is that one of the reasons why 365 00:19:57,440 --> 00:20:00,880 Speaker 1: I am so adamant about things like disinformation and misinformation, 366 00:20:01,119 --> 00:20:05,439 Speaker 1: and just having a healthier, more honest conversation and a 367 00:20:05,480 --> 00:20:08,359 Speaker 1: media landscape that facilitates those kinds of conversations, is that 368 00:20:09,400 --> 00:20:14,840 Speaker 1: we are no longer able to have substantive, thoughtful conversations 369 00:20:14,880 --> 00:20:19,879 Speaker 1: about the issue when our ecosystem is flooded with bad faith, 370 00:20:20,520 --> 00:20:24,760 Speaker 1: clearly hypocritical rhetoric and discourse. And so, you know, even 371 00:20:24,800 --> 00:20:27,040 Speaker 1: if you're someone, let's say that you're listening and you 372 00:20:27,080 --> 00:20:30,840 Speaker 1: are very conservative, you know, you probably have hated this 373 00:20:30,880 --> 00:20:33,840 Speaker 1: conversation that we've been having. But, you know, even 374 00:20:33,840 --> 00:20:36,439 Speaker 1: if you're someone who is very conservative, you deserve to 375 00:20:36,480 --> 00:20:39,480 Speaker 1: be able to, you know, talk about your issues, 376 00:20:39,760 --> 00:20:43,119 Speaker 1: to have a substantive conversation and a substantive debate 377 00:20:43,200 --> 00:20:45,719 Speaker 1: about where you stand on the issues.
And so I 378 00:20:45,800 --> 00:20:49,080 Speaker 1: believe that when the discourse and the space is just 379 00:20:49,200 --> 00:20:53,920 Speaker 1: flooded with, you know, charged rhetoric, where 380 00:20:53,920 --> 00:20:58,119 Speaker 1: we're talking about race or identity but using different words, 381 00:20:58,440 --> 00:21:01,520 Speaker 1: everybody loses, because you're not able to have a substantive 382 00:21:01,520 --> 00:21:04,679 Speaker 1: conversation about where you might agree or disagree with the 383 00:21:04,720 --> 00:21:07,760 Speaker 1: Supreme Court nominee, right. And so I think 384 00:21:07,760 --> 00:21:09,800 Speaker 1: that that's my biggest issue, is that we have a 385 00:21:09,880 --> 00:21:14,399 Speaker 1: media ecosystem that really amplifies the most extreme, the most 386 00:21:14,720 --> 00:21:18,000 Speaker 1: over-the-top statements, or the most nonsense statements, and 387 00:21:18,040 --> 00:21:23,480 Speaker 1: so everybody loses. Democrat, Republican, conservative, lefty, whoever. Everybody loses 388 00:21:23,640 --> 00:21:27,919 Speaker 1: when we have an ecosystem that amplifies the least substantive 389 00:21:28,000 --> 00:21:31,280 Speaker 1: takes, because that takes away from the ability to have 390 00:21:31,359 --> 00:21:36,240 Speaker 1: an actual substantive, thoughtful debate or conversation about the actual issues. 391 00:21:36,280 --> 00:21:38,880 Speaker 1: And so I don't want to create the conditions for 392 00:21:39,440 --> 00:21:44,080 Speaker 1: Judge Jackson to only be judged by racist, sexist tropes 393 00:21:44,160 --> 00:21:47,520 Speaker 1: or caricatures or unfair attacks, because I want to talk 394 00:21:47,560 --> 00:21:50,159 Speaker 1: about her actual credentials. I want to talk about her 395 00:21:50,160 --> 00:21:52,439 Speaker 1: actual record. I want to talk about her actual character.
396 00:21:52,680 --> 00:21:57,560 Speaker 1: But disinformation and misinformation do not allow the actual 397 00:21:57,680 --> 00:22:00,480 Speaker 1: issues to take the center stage that they should. Right, 398 00:22:00,880 --> 00:22:03,760 Speaker 1: you just kind of explained my whole conversation with my 399 00:22:03,840 --> 00:22:06,639 Speaker 1: parents over the holidays, but we won't get into that 400 00:22:06,760 --> 00:22:09,800 Speaker 1: right now. I'm just gonna put that there. Um, it 401 00:22:09,880 --> 00:22:12,719 Speaker 1: was interesting. But, you know, I've been thinking about this, 402 00:22:12,840 --> 00:22:15,720 Speaker 1: because when we talk about these terms automatically just 403 00:22:15,800 --> 00:22:19,600 Speaker 1: becoming used by the media as an ecosystem to bring in 404 00:22:19,760 --> 00:22:23,560 Speaker 1: the shock value, uh, it makes me also realize that, 405 00:22:23,680 --> 00:22:26,399 Speaker 1: in terms of what they're talking about with woke, it 406 00:22:26,600 --> 00:22:28,960 Speaker 1: is a black term that was created by the black 407 00:22:29,000 --> 00:22:33,919 Speaker 1: community to kind of gift non-black people with, hey, 408 00:22:34,040 --> 00:22:37,679 Speaker 1: you woke up. Congratulations, you're finally seeing what we have 409 00:22:37,760 --> 00:22:42,560 Speaker 1: been going through for all of our generations. Welcome, you are woke. Like, 410 00:22:42,640 --> 00:22:45,000 Speaker 1: that's kind of that term, and I hate that it 411 00:22:45,040 --> 00:22:49,320 Speaker 1: has been weaponized to this point of being used against people. 412 00:22:49,720 --> 00:22:52,920 Speaker 1: And when Kennedy used this, he 413 00:22:53,000 --> 00:22:56,600 Speaker 1: was weaponizing this term against a woman, an official, 414 00:22:56,720 --> 00:22:59,480 Speaker 1: a judge, a professional, who didn't need to 415 00:22:59,520 --> 00:23:02,720 Speaker 1: be woke. She was already there.
This was her life. 416 00:23:02,920 --> 00:23:05,400 Speaker 1: And not only that. If we do look at her 417 00:23:05,600 --> 00:23:08,960 Speaker 1: background and credentials, she has been doing this work. There's 418 00:23:09,000 --> 00:23:12,280 Speaker 1: no conversation about her being woke. She just is. So 419 00:23:12,560 --> 00:23:14,760 Speaker 1: with that, because I'm angry about this, can we talk 420 00:23:14,800 --> 00:23:18,520 Speaker 1: about some of those qualifications? Oh, absolutely. So, one of 421 00:23:18,560 --> 00:23:21,679 Speaker 1: the best ways to counter all the kinds of BS racist, 422 00:23:21,680 --> 00:23:24,680 Speaker 1: sexist attacks that you're definitely going to hear about Judge 423 00:23:24,760 --> 00:23:27,959 Speaker 1: Jackson is to flood the space 424 00:23:28,040 --> 00:23:31,120 Speaker 1: with accurate information. So I'm super excited to talk about her 425 00:23:31,240 --> 00:23:34,160 Speaker 1: actual qualifications. So, a little bit of background information about 426 00:23:34,200 --> 00:23:36,840 Speaker 1: Judge Jackson. She was born in Washington, D.C., shout 427 00:23:36,840 --> 00:23:39,320 Speaker 1: out to D.C., where I am also from. She 428 00:23:39,359 --> 00:23:42,439 Speaker 1: grew up in Miami, Florida. Her parents started their careers 429 00:23:42,480 --> 00:23:45,280 Speaker 1: as public school teachers and then later became administrators in 430 00:23:45,280 --> 00:23:47,960 Speaker 1: the Miami-Dade County public school system. I love this 431 00:23:48,000 --> 00:23:51,320 Speaker 1: little fun fact about her. Judge Jackson was a star student, 432 00:23:51,640 --> 00:23:53,639 Speaker 1: but she was told not to set her sights too 433 00:23:53,720 --> 00:23:56,320 Speaker 1: high by a guidance counselor when she told that guidance 434 00:23:56,320 --> 00:23:58,920 Speaker 1: counselor that she wanted to go to Harvard.
And guess 435 00:23:58,920 --> 00:24:07,000 Speaker 1: where she ended up going to college. Harvard. That's right, 436 00:24:07,000 --> 00:24:09,640 Speaker 1: so she definitely, like, shut that guidance counselor right up. 437 00:24:10,080 --> 00:24:13,439 Speaker 1: She studied government at Harvard University and attended Harvard Law School, 438 00:24:13,600 --> 00:24:16,520 Speaker 1: where she was the supervising editor of the Harvard Law Review. 439 00:24:16,720 --> 00:24:22,480 Speaker 1: So her educational credentials are pretty solid. Yeah, I would say so. 440 00:24:22,680 --> 00:24:27,160 Speaker 1: I love that too. I hope that the guidance counselor knows. 441 00:24:28,480 --> 00:24:33,520 Speaker 1: Oh, he knows now. So something else about her is 442 00:24:33,520 --> 00:24:36,600 Speaker 1: that she is absurdly qualified and experienced. So this is 443 00:24:36,640 --> 00:24:40,280 Speaker 1: from Steve Vladeck. Judge Jackson has eight point nine years 444 00:24:40,280 --> 00:24:43,160 Speaker 1: of prior judicial experience. That's more than four of our 445 00:24:43,200 --> 00:24:48,320 Speaker 1: current justices, Thomas, Roberts, Kagan, and Barrett, had combined. It's 446 00:24:48,320 --> 00:24:51,040 Speaker 1: also more than four of the last ten justices had 447 00:24:51,040 --> 00:24:53,800 Speaker 1: at their confirmations, nine of the last seventeen, and 448 00:24:53,880 --> 00:24:57,560 Speaker 1: forty-three of the fifty-eight appointed since nineteen hundred.
So 449 00:24:57,680 --> 00:25:01,639 Speaker 1: anybody who tells you she is not experienced, anybody that 450 00:25:01,720 --> 00:25:04,280 Speaker 1: tells you that she's not qualified, anybody that tells you 451 00:25:04,359 --> 00:25:07,480 Speaker 1: that she only got this position because she's a black woman, 452 00:25:07,560 --> 00:25:10,840 Speaker 1: is just misinformed, and they're spreading misinformation, because as we 453 00:25:10,880 --> 00:25:14,600 Speaker 1: can see, she's very qualified, more qualified than, you know, 454 00:25:14,720 --> 00:25:19,800 Speaker 1: some of the current Supreme Court justices. Right, which is infuriating. 455 00:25:19,960 --> 00:25:23,959 Speaker 1: But yes, also, as you mentioned, she's already gone through 456 00:25:24,000 --> 00:25:26,920 Speaker 1: a lot of vetting, right? Absolutely. So this is something 457 00:25:26,920 --> 00:25:29,719 Speaker 1: else that I think people really need to understand. She 458 00:25:29,840 --> 00:25:32,760 Speaker 1: has been vetted a ton. She has a proven track 459 00:25:32,840 --> 00:25:36,399 Speaker 1: record of attracting bipartisan support in the Senate. She's already 460 00:25:36,400 --> 00:25:39,600 Speaker 1: been confirmed three times on a bipartisan vote, so there 461 00:25:39,720 --> 00:25:42,320 Speaker 1: is no reason not to expect the same now that 462 00:25:42,320 --> 00:25:45,160 Speaker 1: she's being considered for the Supreme Court. Again, Lindsey Graham 463 00:25:45,200 --> 00:25:48,280 Speaker 1: voted to confirm her, Murkowski voted to confirm her, and 464 00:25:48,320 --> 00:25:50,920 Speaker 1: Collins voted to confirm her.
Right. So these are Republicans 465 00:25:50,920 --> 00:25:53,760 Speaker 1: who broke ranks with their party to side with Democrats 466 00:25:53,880 --> 00:25:56,920 Speaker 1: to vote to confirm her. And so, having already gone 467 00:25:56,960 --> 00:25:59,840 Speaker 1: through this process, I would really want, you know, someone 468 00:26:00,000 --> 00:26:01,720 Speaker 1: like Lindsey Graham to sit down with me and 469 00:26:01,760 --> 00:26:04,480 Speaker 1: explain what changed over the last eight months, from when he 470 00:26:04,560 --> 00:26:06,760 Speaker 1: voted to confirm her to the second most important court 471 00:26:06,760 --> 00:26:09,720 Speaker 1: in the country to now, if he is saying 472 00:26:09,760 --> 00:26:12,040 Speaker 1: that she is not an appropriate choice. Same 473 00:26:12,119 --> 00:26:15,040 Speaker 1: with Collins and Murkowski. There's no reason to expect that 474 00:26:15,119 --> 00:26:18,760 Speaker 1: she should not be able to be confirmed by the Senate, 475 00:26:19,400 --> 00:26:22,040 Speaker 1: considering she's been vetted and gone through this process three 476 00:26:22,040 --> 00:26:26,359 Speaker 1: different times before. There are certain people where, like, just thinking 477 00:26:26,359 --> 00:26:30,280 Speaker 1: about talking to them makes my skin crawl, and he's one of them. 478 00:26:30,359 --> 00:26:46,600 Speaker 1: But yes, I do find it interesting, and this is 479 00:26:46,600 --> 00:26:48,719 Speaker 1: a whole different conversation again.
You know how I love 480 00:26:48,760 --> 00:26:52,119 Speaker 1: to sidetrack. It is an interesting strategy that Biden 481 00:26:52,160 --> 00:26:55,199 Speaker 1: and his administration have pulled, not only because this is 482 00:26:55,200 --> 00:26:57,879 Speaker 1: obviously telling. It's like, okay, this is one of the 483 00:26:57,920 --> 00:27:00,440 Speaker 1: most qualified people that we could find, who we think 484 00:27:00,520 --> 00:27:03,800 Speaker 1: is deserving and has earned a place in this position. 485 00:27:04,440 --> 00:27:07,600 Speaker 1: If you, who oppose everything we do, are gonna 486 00:27:07,640 --> 00:27:10,560 Speaker 1: start arguing, it's gonna be a telltale sign of who 487 00:27:10,640 --> 00:27:14,840 Speaker 1: you are and what you're doing. So it's an interesting turn. Again, 488 00:27:14,880 --> 00:27:17,280 Speaker 1: that's a whole different conversation, because I've always been interested in 489 00:27:17,760 --> 00:27:20,280 Speaker 1: a good political thriller, and this feels like one. You know, I 490 00:27:20,320 --> 00:27:22,680 Speaker 1: have to make everything a little fictional, like a movie. Yeah, 491 00:27:24,320 --> 00:27:27,399 Speaker 1: it's so true, because, like, you know, 492 00:27:27,480 --> 00:27:30,200 Speaker 1: I was thinking about that this morning. I do think it's 493 00:27:30,240 --> 00:27:33,159 Speaker 1: clear to me that the administration picked someone who had 494 00:27:33,160 --> 00:27:36,040 Speaker 1: already gone through this process a few times, 495 00:27:36,080 --> 00:27:38,440 Speaker 1: just so that it would, you know, sort 496 00:27:38,480 --> 00:27:42,879 Speaker 1: of shut down any kind of, you know, consternation there 497 00:27:42,960 --> 00:27:45,200 Speaker 1: might be in terms of, like, you know, vetting 498 00:27:45,200 --> 00:27:47,440 Speaker 1: her and all of that. And I think it really 499 00:27:47,480 --> 00:27:51,000 Speaker 1: demonstrates something.
I'm not totally sure. This is not a 500 00:27:51,000 --> 00:27:53,159 Speaker 1: completely fleshed out thought, and I'm not totally sure how 501 00:27:53,200 --> 00:27:55,760 Speaker 1: it fits into this conversation, but I do think there 502 00:27:55,960 --> 00:27:59,240 Speaker 1: is this standard for women, women of color, and black 503 00:27:59,280 --> 00:28:03,560 Speaker 1: women where we have to be extra special qualified, kind 504 00:28:03,560 --> 00:28:06,359 Speaker 1: of that standard of women having to do a hundred 505 00:28:06,359 --> 00:28:10,960 Speaker 1: and thirty percent more than men in general. And then when 506 00:28:10,960 --> 00:28:14,240 Speaker 1: you're part of a marginalized community, add on another twenty percent. 507 00:28:14,320 --> 00:28:16,359 Speaker 1: And then if you're a black woman, add on another 508 00:28:17,119 --> 00:28:22,640 Speaker 1: level of exceeding the standard, exceeding the qualifications, because for some reason, 509 00:28:23,320 --> 00:28:26,199 Speaker 1: you have to be on that level in order to 510 00:28:26,240 --> 00:28:30,320 Speaker 1: be seen as, um, serious. So that's just in general. 511 00:28:30,600 --> 00:28:33,120 Speaker 1: So let's put that at this federal level, where they 512 00:28:33,160 --> 00:28:36,439 Speaker 1: are putting her under a microscope in every way. And 513 00:28:36,480 --> 00:28:38,800 Speaker 1: not only putting her under a microscope to 514 00:28:38,840 --> 00:28:41,440 Speaker 1: twist and turn the truth. They're going to tell flat-out lies. 515 00:28:41,960 --> 00:28:45,120 Speaker 1: We know this. It's already happening. That's what they're doing. Uh, 516 00:28:45,120 --> 00:28:47,800 Speaker 1: and we saw this with Kamala Harris, which I really 517 00:28:47,800 --> 00:28:50,720 Speaker 1: found fascinating, because Kamala Harris has a track record as 518 00:28:50,760 --> 00:28:55,280 Speaker 1: an Attorney General. They blasted her, rightfully.
It was for 519 00:28:55,320 --> 00:28:59,160 Speaker 1: her track record in criminalizing people in general. And when 520 00:28:59,200 --> 00:29:03,040 Speaker 1: I found out Judge Jackson comes from being a public defender... 521 00:29:04,400 --> 00:29:07,720 Speaker 1: That's right. So super, super exciting. Not only would she 522 00:29:07,800 --> 00:29:10,440 Speaker 1: be historic as the first black woman on the Supreme Court, 523 00:29:10,600 --> 00:29:13,320 Speaker 1: but also the first public defender to serve on the Court, 524 00:29:13,360 --> 00:29:16,440 Speaker 1: which is a big deal. She has a long personal 525 00:29:16,480 --> 00:29:18,640 Speaker 1: history of working as a public defender. While she was 526 00:29:18,680 --> 00:29:21,200 Speaker 1: at Harvard, a relative was sentenced to life in prison 527 00:29:21,200 --> 00:29:23,680 Speaker 1: for a non-violent drug offense, and she helped convince 528 00:29:23,720 --> 00:29:26,280 Speaker 1: a law firm to take his case pro bono, eventually 529 00:29:26,320 --> 00:29:29,000 Speaker 1: leading to President Obama commuting his sentence. And so 530 00:29:29,280 --> 00:29:31,480 Speaker 1: it is a big deal to have somebody on the 531 00:29:31,480 --> 00:29:35,240 Speaker 1: Supreme Court who has this legacy of public service in 532 00:29:35,240 --> 00:29:36,800 Speaker 1: this way. My brother is a lawyer. He got his 533 00:29:36,800 --> 00:29:39,480 Speaker 1: start as a public defender in Durham, North Carolina. I 534 00:29:39,520 --> 00:29:41,520 Speaker 1: just love the idea of having someone who has this 535 00:29:41,640 --> 00:29:44,520 Speaker 1: history, because it's important, you know.
I think that 536 00:29:44,920 --> 00:29:51,200 Speaker 1: it's often easy to forget that public service should be 537 00:29:51,320 --> 00:29:55,320 Speaker 1: a pipeline into bigger things, you know. Like, I want 538 00:29:55,360 --> 00:29:59,840 Speaker 1: to see more teachers, public educators, public servants be elevated 539 00:29:59,880 --> 00:30:02,200 Speaker 1: in a national way like this. So I love 540 00:30:02,240 --> 00:30:08,520 Speaker 1: that aspect of her. Amen. Former social worker, amen. 541 00:30:11,480 --> 00:30:13,920 Speaker 1: We have this attitude where it's like, if you choose 542 00:30:14,000 --> 00:30:17,400 Speaker 1: to serve the public, you're some kind of, like, I 543 00:30:17,440 --> 00:30:19,240 Speaker 1: don't know. I don't think we give 544 00:30:19,400 --> 00:30:23,600 Speaker 1: people who serve the public the props they deserve, 545 00:30:23,720 --> 00:30:26,480 Speaker 1: and they deserve a lot. And you know what, I 546 00:30:26,520 --> 00:30:30,280 Speaker 1: have been feeling some emotions for our people in Texas 547 00:30:30,280 --> 00:30:33,080 Speaker 1: who are social workers. We know what's happening there, 548 00:30:33,080 --> 00:30:35,680 Speaker 1: as most people do, I think, especially our listeners. And 549 00:30:35,720 --> 00:30:37,920 Speaker 1: my heart has been breaking, because I already know how 550 00:30:38,000 --> 00:30:41,160 Speaker 1: divisive that type of conversation is. But going back to 551 00:30:41,600 --> 00:30:44,920 Speaker 1: Judge Jackson and her public defense work. Like, yeah, I have 552 00:30:45,080 --> 00:30:48,080 Speaker 1: worked with many public defenders, and a good public defender 553 00:30:49,160 --> 00:30:52,520 Speaker 1: is invaluable in the way that they have to advocate 554 00:30:52,600 --> 00:30:56,560 Speaker 1: and push for truth and justice.
True justice, meaning that 555 00:30:56,680 --> 00:30:59,080 Speaker 1: we are hearing their side and that they are not 556 00:30:59,160 --> 00:31:04,560 Speaker 1: guilty until proven otherwise. And how oftentimes, especially in, I'm 557 00:31:04,600 --> 00:31:07,360 Speaker 1: assuming, DC, and Atlanta was very similar, how often that 558 00:31:07,400 --> 00:31:11,360 Speaker 1: goes wrong. And as a social 559 00:31:11,360 --> 00:31:14,320 Speaker 1: worker who worked in the judicial system, when I found 560 00:31:14,320 --> 00:31:17,280 Speaker 1: a good defender, I would go and talk to them personally 561 00:31:17,320 --> 00:31:19,440 Speaker 1: to try to get them onto cases for my kids 562 00:31:19,480 --> 00:31:21,960 Speaker 1: specifically, to make sure that they got what was fair. 563 00:31:23,280 --> 00:31:26,160 Speaker 1: And that is so huge. So my heart 564 00:31:26,200 --> 00:31:29,240 Speaker 1: is soaring to know that we have someone from that 565 00:31:29,320 --> 00:31:33,240 Speaker 1: field coming to this point. Yeah, it's a pretty 566 00:31:33,280 --> 00:31:37,640 Speaker 1: historic thing. And yeah, people who serve the public, like, 567 00:31:37,760 --> 00:31:41,680 Speaker 1: thoughtfully and meaningfully, they really care about people. And so 568 00:31:42,160 --> 00:31:45,120 Speaker 1: I applaud you for, you know, going the 569 00:31:45,160 --> 00:31:48,080 Speaker 1: extra mile to create the conditions for your kids to 570 00:31:48,160 --> 00:31:50,680 Speaker 1: get real justice and to have a real advocate, because 571 00:31:50,760 --> 00:31:54,000 Speaker 1: not everybody is like that. That's like a special thing. Yeah, 572 00:31:54,040 --> 00:31:55,880 Speaker 1: this is something that's been on my mind, and I 573 00:31:55,880 --> 00:31:57,760 Speaker 1: think a lot of our minds, lately.
Which is a 574 00:31:57,800 --> 00:32:00,560 Speaker 1: separate podcast, but I'll mention it here, is I feel 575 00:32:01,000 --> 00:32:03,720 Speaker 1: in this country we have a real problem with glorifying, 576 00:32:03,880 --> 00:32:10,160 Speaker 1: like, male assholery as being successful. Like, oh, you gamed 577 00:32:10,240 --> 00:32:13,720 Speaker 1: the system, get your money, like it doesn't matter. Kind 578 00:32:13,720 --> 00:32:15,440 Speaker 1: of like what we're talking about here, with the hypocritical 579 00:32:15,520 --> 00:32:18,080 Speaker 1: nature of all these arguments being made. And 580 00:32:18,240 --> 00:32:20,160 Speaker 1: to me, it feels more about the show, and like, oh, 581 00:32:20,160 --> 00:32:24,200 Speaker 1: I'm a politician, I'm on TV, like it's glorified. Whereas 582 00:32:24,240 --> 00:32:26,880 Speaker 1: we have people who are working for the public, and 583 00:32:26,920 --> 00:32:28,360 Speaker 1: in a lot of ways, these jobs are kind of 584 00:32:28,400 --> 00:32:33,400 Speaker 1: gendered as feminine and therefore lesser in nature, like not 585 00:32:33,480 --> 00:32:36,760 Speaker 1: worth the attention, not worth the accolades, not worth the money, 586 00:32:36,840 --> 00:32:44,960 Speaker 1: even though it's so critical to a functioning society. But yeah. 587 00:32:45,040 --> 00:32:47,120 Speaker 1: Oh my gosh, I mean, I could talk all day. 588 00:32:47,360 --> 00:32:52,000 Speaker 1: The way that we have harmed everyone, a deep, deep 589 00:32:52,080 --> 00:32:57,760 Speaker 1: societal harm, is our legacy of not respecting jobs that 590 00:32:57,800 --> 00:33:01,040 Speaker 1: we code as feminine, that are traditionally feminine. And so care work, 591 00:33:01,480 --> 00:33:05,120 Speaker 1: you know, social work, education, all of these things that 592 00:33:05,160 --> 00:33:09,280 Speaker 1: we have coded as feminine.
The deprioritization and just 593 00:33:09,400 --> 00:33:12,440 Speaker 1: like outright degradation and disrespect of those, I think that 594 00:33:12,480 --> 00:33:16,480 Speaker 1: the ramifications, we will never fully be able to contend 595 00:33:16,480 --> 00:33:18,640 Speaker 1: with as a society how much that has harmed us. 596 00:33:18,640 --> 00:33:21,640 Speaker 1: The way that, you know, people who do care work 597 00:33:21,720 --> 00:33:24,800 Speaker 1: are underpaid, if they're paid at all, just the 598 00:33:24,960 --> 00:33:27,240 Speaker 1: way that we don't even factor it in 599 00:33:27,440 --> 00:33:30,960 Speaker 1: and it just falls on unpaid women, right. Like, this 600 00:33:31,000 --> 00:33:32,760 Speaker 1: is what I mean when I say women are the 601 00:33:32,840 --> 00:33:36,120 Speaker 1: backbone of our country and our society, and oftentimes that 602 00:33:36,160 --> 00:33:40,600 Speaker 1: work is completely not just unpaid, but unseen, unacknowledged. 603 00:33:40,640 --> 00:33:44,440 Speaker 1: It's just like the cost of society functioning. And so 604 00:33:44,720 --> 00:33:47,880 Speaker 1: I think that in this country, sometimes the only thing that 605 00:33:47,960 --> 00:33:51,200 Speaker 1: is standing between our country and, like, complete collapse 606 00:33:51,440 --> 00:33:57,040 Speaker 1: is the effort and labor of some exhausted women. And, 607 00:33:57,240 --> 00:33:59,440 Speaker 1: let's be honest, at this point in time, it's black 608 00:33:59,440 --> 00:34:01,920 Speaker 1: women. Like, I've been thinking, like, when we talked about 609 00:34:01,920 --> 00:34:05,520 Speaker 1: the elections before Biden, the Biden administration, the amount of 610 00:34:05,600 --> 00:34:08,240 Speaker 1: work that has to be done.
When we talked about 611 00:34:08,320 --> 00:34:11,680 Speaker 1: cases like Ahmaud Arbery's and all of those, how women 612 00:34:11,719 --> 00:34:14,440 Speaker 1: have been at the forefront in protesting and bringing up all of 613 00:34:14,440 --> 00:34:17,600 Speaker 1: these conversations, and the continued work that they have to do. 614 00:34:18,120 --> 00:34:20,560 Speaker 1: But not only that. Like, not only are they working 615 00:34:20,560 --> 00:34:24,239 Speaker 1: to do good, but they're often having to combat all 616 00:34:24,239 --> 00:34:26,640 Speaker 1: of the disinformation and harmful things that are being said 617 00:34:26,680 --> 00:34:30,600 Speaker 1: about them because they are doing the hard work. Exactly. 618 00:34:30,640 --> 00:34:33,000 Speaker 1: I mean, that really leads us back to, you know, 619 00:34:33,160 --> 00:34:36,000 Speaker 1: Judge Jackson. So, Judge Jackson. I think I 620 00:34:36,040 --> 00:34:38,080 Speaker 1: got the email that she was going to be the 621 00:34:38,120 --> 00:34:41,879 Speaker 1: pick at, like, eight thirty this morning, and we're 622 00:34:41,880 --> 00:34:44,360 Speaker 1: talking at two p.m. So already, just in 623 00:34:44,440 --> 00:34:47,280 Speaker 1: a couple of hours, here are some of the unfair 624 00:34:47,480 --> 00:34:50,200 Speaker 1: or just completely misleading attacks I've seen on her. One 625 00:34:50,400 --> 00:34:52,759 Speaker 1: is that, you know, her decisions have been overturned, when 626 00:34:52,800 --> 00:34:55,840 Speaker 1: in reality, of her roughly six hundred decisions that she 627 00:34:55,920 --> 00:34:59,000 Speaker 1: has authored, she has been overruled just two percent of 628 00:34:59,000 --> 00:35:00,919 Speaker 1: the time, right. And so if someone is telling you 629 00:35:01,080 --> 00:35:04,040 Speaker 1: that in her career as a judge, her decisions have 630 00:35:04,160 --> 00:35:07,319 Speaker 1: been routinely overruled or overturned, that's just not true.
It's 631 00:35:07,320 --> 00:35:10,239 Speaker 1: happened less than two percent of the time. Uh, 632 00:35:10,280 --> 00:35:12,360 Speaker 1: the idea that she's an affirmative action hire, that she 633 00:35:12,360 --> 00:35:14,239 Speaker 1: only got the job because she's a black woman, when 634 00:35:14,280 --> 00:35:17,319 Speaker 1: in reality, she is clearly absurdly qualified, to the point 635 00:35:17,360 --> 00:35:20,920 Speaker 1: where even talking about it seems ridiculous. Um, this idea 636 00:35:20,960 --> 00:35:23,640 Speaker 1: that she's too woke or too radical. One thing I 637 00:35:23,680 --> 00:35:27,360 Speaker 1: would say is really be wary of people throwing things 638 00:35:27,400 --> 00:35:30,440 Speaker 1: like that around without being able to point to a 639 00:35:30,480 --> 00:35:34,040 Speaker 1: specific ruling or policy or argument that she makes. Right? So, 640 00:35:34,360 --> 00:35:37,880 Speaker 1: just saying so-and-so is too radical, and 641 00:35:37,920 --> 00:35:40,440 Speaker 1: then not having it be attached to any kind of 642 00:35:40,480 --> 00:35:43,880 Speaker 1: actual policy that you think demonstrates that? Be wary of 643 00:35:43,880 --> 00:35:46,520 Speaker 1: people who are saying that, because, in my opinion, 644 00:35:46,880 --> 00:35:48,520 Speaker 1: nine times out of ten, when someone is saying that, 645 00:35:48,520 --> 00:35:50,960 Speaker 1: what they actually mean is, this person is a black woman. 646 00:35:51,320 --> 00:35:54,080 Speaker 1: And then there's this idea of just sort of 647 00:35:54,160 --> 00:35:57,879 Speaker 1: connecting her to things for no real reason, with no 648 00:35:58,000 --> 00:36:01,120 Speaker 1: real explanation as to, like, why you are connecting her 649 00:36:01,160 --> 00:36:05,200 Speaker 1: to these things.
And so earlier today, Mitch McConnell tweeted: 650 00:36:05,400 --> 00:36:08,360 Speaker 1: the Senate must conduct a vigorous, exhaustive review of Judge 651 00:36:08,400 --> 00:36:11,360 Speaker 1: Jackson's nomination to the Supreme Court. This is especially crucial 652 00:36:11,400 --> 00:36:14,719 Speaker 1: as American families face major crises that connect directly to 653 00:36:14,719 --> 00:36:18,120 Speaker 1: our legal systems, such as skyrocketing violent crime and open borders. 654 00:36:18,360 --> 00:36:20,719 Speaker 1: So what does Judge Jackson have to do with open 655 00:36:20,760 --> 00:36:23,440 Speaker 1: borders, exactly? What does she have to do with violent crime, exactly? 656 00:36:23,760 --> 00:36:26,400 Speaker 1: You know, the fact that he's just throwing her in, 657 00:36:26,800 --> 00:36:30,040 Speaker 1: you know, connecting her to these things. I think, really, 658 00:36:30,440 --> 00:36:32,560 Speaker 1: this is a moment where we will really 659 00:36:32,600 --> 00:36:36,239 Speaker 1: be rewarded by thinking critically about what people are 660 00:36:36,280 --> 00:36:38,520 Speaker 1: saying and what they're saying in between the lines. And 661 00:36:38,520 --> 00:36:42,960 Speaker 1: I think that really, to me, demonstrates a tricky truth 662 00:36:43,120 --> 00:36:46,799 Speaker 1: about the nature of racialized and gendered disinformation. I think 663 00:36:46,800 --> 00:36:49,600 Speaker 1: most people can tell you that there is misleading or 664 00:36:49,600 --> 00:36:52,560 Speaker 1: inaccurate information out there about, like, COVID or vaccines.
665 00:36:52,920 --> 00:36:55,759 Speaker 1: But misinformation can be a lot trickier to spot and 666 00:36:55,800 --> 00:36:58,640 Speaker 1: talk about when it relies on dog whistles. You know, 667 00:36:58,680 --> 00:37:01,200 Speaker 1: the same way that someone might say, like, oh, a woman 668 00:37:01,400 --> 00:37:04,400 Speaker 1: is too emotional to lead, and that's just code for 669 00:37:04,440 --> 00:37:08,200 Speaker 1: being a woman. Oftentimes people are using dog whistles or 670 00:37:08,280 --> 00:37:12,799 Speaker 1: coded language to attack women, women of color, and other 671 00:37:12,840 --> 00:37:16,520 Speaker 1: marginalized folks, with this kind of, like, highly coded language 672 00:37:16,560 --> 00:37:18,839 Speaker 1: that can be so tough to really call out 673 00:37:18,880 --> 00:37:21,680 Speaker 1: and talk about for what it is. Yeah, and it's, 674 00:37:21,680 --> 00:37:24,680 Speaker 1: again, it's one of those things. It's such 675 00:37:24,680 --> 00:37:27,279 Speaker 1: a double standard and it's so hypocritical. But I think 676 00:37:27,280 --> 00:37:31,240 Speaker 1: back to Kavanaugh. He's, like, crying these angry tears at his 677 00:37:31,280 --> 00:37:36,640 Speaker 1: confirmation. Wait, so you're not going to 678 00:37:36,719 --> 00:37:40,960 Speaker 1: call him emotional? No, he's being rewarded for being emotional 679 00:37:40,960 --> 00:37:44,280 Speaker 1: as a man, because that's difficult for him, 680 00:37:44,320 --> 00:37:51,279 Speaker 1: because he's being accused of sexual assault. Right. Yes. Isn't 681 00:37:51,320 --> 00:37:55,319 Speaker 1: it funny how, like, we've branded anger as 682 00:37:55,400 --> 00:37:58,960 Speaker 1: not an emotion? So, like, women are too emotional? It's like, well, 683 00:37:59,239 --> 00:38:02,320 Speaker 1: if men get angry in public, is that not an emotion? 684 00:38:02,360 --> 00:38:05,960 Speaker 1: And how come the 'too emotional'
label, you know, is 685 00:38:05,960 --> 00:38:09,799 Speaker 1: not used to identify anger as an emotion. Yes, oh, 686 00:38:09,880 --> 00:38:14,759 Speaker 1: I could talk about that forever. What is the cartoon 687 00:38:15,480 --> 00:38:20,319 Speaker 1: with all the emotions? Inside Out? Well, Inside Out 688 00:38:20,360 --> 00:38:24,600 Speaker 1: taught me different. Anger's an emotion. I saw it. I'm surprised 689 00:38:24,640 --> 00:38:28,200 Speaker 1: you just saw that. You avoid those, you said. So do I. I cried. 690 00:38:28,520 --> 00:38:32,200 Speaker 1: Well, good. I was sad last night. My partner 691 00:38:32,200 --> 00:38:33,719 Speaker 1: made me watch Paddington again. I was like, how 692 00:38:33,800 --> 00:38:37,759 Speaker 1: dare you. You know, it was Paddington 2? I didn't trust 693 00:38:37,840 --> 00:38:43,800 Speaker 1: them after the first one. Anyway, that's fair. Something 694 00:38:43,840 --> 00:38:47,279 Speaker 1: else that has been on my mind, and it 695 00:38:47,280 --> 00:38:49,400 Speaker 1: has been a source of frustration for a lot of 696 00:38:49,440 --> 00:38:51,759 Speaker 1: us, and that we've talked about in your segments when 697 00:38:51,800 --> 00:38:53,759 Speaker 1: you come on, Bridget, is kind of what you 698 00:38:53,800 --> 00:39:00,640 Speaker 1: mentioned: not only is Judge Jackson facing all 699 00:39:00,680 --> 00:39:05,200 Speaker 1: of this, like, very intensive job interview, but then 700 00:39:05,239 --> 00:39:09,320 Speaker 1: there's all of these attacks online that are racist and sexist. 701 00:39:09,480 --> 00:39:14,480 Speaker 1: And we've talked about it; it's not unique to 702 00:39:14,560 --> 00:39:16,759 Speaker 1: this area, because we've talked about it in terms of 703 00:39:16,800 --> 00:39:18,640 Speaker 1: video games and we've talked about it in terms of entertainment. 704 00:39:18,680 --> 00:39:20,640 Speaker 1: But I feel like that's already a form of gatekeeping.
705 00:39:20,640 --> 00:39:23,040 Speaker 1: This is already an extra thing you're asking people, and 706 00:39:23,160 --> 00:39:25,400 Speaker 1: the people that know them, like their families and friends, 707 00:39:26,000 --> 00:39:29,719 Speaker 1: to deal with, and we just seem to accept that 708 00:39:29,840 --> 00:39:33,840 Speaker 1: as the status quo of being a marginalized person that 709 00:39:33,960 --> 00:39:40,200 Speaker 1: exists in our media landscape. Absolutely. You know, 710 00:39:40,200 --> 00:39:42,080 Speaker 1: I'm so glad that you put it that way, because 711 00:39:42,080 --> 00:39:44,719 Speaker 1: I think that we have this understanding that it's just 712 00:39:44,800 --> 00:39:47,640 Speaker 1: the cost of doing business as a marginalized person. And 713 00:39:47,680 --> 00:39:49,760 Speaker 1: if you don't want this kind of thing to happen 714 00:39:49,800 --> 00:39:52,960 Speaker 1: to you, then just don't speak, and don't express opinions, 715 00:39:53,000 --> 00:39:55,080 Speaker 1: and don't strive, and don't put yourself out there, and 716 00:39:55,120 --> 00:39:57,320 Speaker 1: don't become a public figure, and don't serve your public; 717 00:39:57,320 --> 00:39:59,360 Speaker 1: it's just a whole list of don'ts. 718 00:39:59,440 --> 00:40:02,440 Speaker 1: And so I really want to urge people to have 719 00:40:02,520 --> 00:40:04,759 Speaker 1: a shift around their understanding of that, and instead 720 00:40:04,800 --> 00:40:07,439 Speaker 1: say, like, we deserve to have a media landscape where 721 00:40:07,440 --> 00:40:10,160 Speaker 1: everybody can speak up, everybody can participate, and where you're 722 00:40:10,200 --> 00:40:13,000 Speaker 1: not going to be attacked unfairly based on your 723 00:40:13,040 --> 00:40:15,279 Speaker 1: identity just for putting yourself out there.
And so I 724 00:40:15,320 --> 00:40:17,719 Speaker 1: want to quickly talk about some research from the Institute 725 00:40:17,719 --> 00:40:20,759 Speaker 1: for Strategic Dialogue. They analyzed social media abuse of 726 00:40:20,800 --> 00:40:23,680 Speaker 1: candidates and found that women of color candidates are targeted 727 00:40:23,719 --> 00:40:26,440 Speaker 1: on social media at alarming rates. They analyzed all 728 00:40:26,480 --> 00:40:29,440 Speaker 1: these different messages that these candidates were getting, and 729 00:40:29,480 --> 00:40:32,280 Speaker 1: they found that abusive messages accounted for more than fifteen 730 00:40:32,360 --> 00:40:35,759 Speaker 1: percent of those directed at every female lawmaker they analyzed, 731 00:40:35,840 --> 00:40:39,120 Speaker 1: compared with around five to ten percent for the male candidates. 732 00:40:39,280 --> 00:40:42,640 Speaker 1: Women of color were particularly likely to be targeted. Representative 733 00:40:42,640 --> 00:40:45,840 Speaker 1: Ilhan Omar got the highest proportion, thirty nine percent, 734 00:40:45,880 --> 00:40:48,879 Speaker 1: of abusive messages of all the candidates, and AOC got 735 00:40:48,880 --> 00:40:51,960 Speaker 1: the highest ratio of abusive comments on Facebook. And when 736 00:40:51,960 --> 00:40:55,000 Speaker 1: you're talking about women, the abuse directed towards women 737 00:40:55,080 --> 00:40:57,319 Speaker 1: is much more likely to be about their gender than 738 00:40:57,400 --> 00:41:00,200 Speaker 1: the abuse targeting men. Abuse targeting men was much 739 00:41:00,239 --> 00:41:03,600 Speaker 1: more generalized and focused on their political stances, while messages 740 00:41:03,640 --> 00:41:06,560 Speaker 1: directed at women were much more likely to focus on 741 00:41:06,600 --> 00:41:10,680 Speaker 1: either their appearance or general competence.
And so, yeah, I mean, 742 00:41:10,840 --> 00:41:13,200 Speaker 1: if you are a woman or a marginalized person who 743 00:41:13,239 --> 00:41:15,600 Speaker 1: is putting yourself out there in this way, you deserve 744 00:41:15,680 --> 00:41:18,480 Speaker 1: to be judged on your merits, your record, your words, 745 00:41:18,560 --> 00:41:23,400 Speaker 1: your deeds, your values; not your identity, not racist, sexist tropes, 746 00:41:23,480 --> 00:41:26,600 Speaker 1: not, you know, nonsense about women or women of color 747 00:41:26,719 --> 00:41:29,759 Speaker 1: not being good leaders or being unqualified. You deserve to 748 00:41:29,800 --> 00:41:32,319 Speaker 1: really have your record speak for itself. And I have 749 00:41:32,440 --> 00:41:35,000 Speaker 1: to say, I talk a 750 00:41:35,040 --> 00:41:38,920 Speaker 1: lot about, you know, online bad actors, but it's not just 751 00:41:39,120 --> 00:41:43,160 Speaker 1: fringe extremists. We also see mainstream media outlets playing a 752 00:41:43,280 --> 00:41:47,719 Speaker 1: huge role in legitimizing and mainstreaming racist, sexist attacks on 753 00:41:47,800 --> 00:41:50,640 Speaker 1: women of color in public office. And so, you know, 754 00:41:50,719 --> 00:41:53,640 Speaker 1: you might see things like an article about some complete 755 00:41:53,719 --> 00:41:58,080 Speaker 1: racist nonsense, or a complete racist attack being quoted in 756 00:41:58,120 --> 00:42:00,800 Speaker 1: the headline of an article. So when people share it 757 00:42:00,840 --> 00:42:03,799 Speaker 1: on social media, it gives the impression that this is 758 00:42:03,800 --> 00:42:06,279 Speaker 1: a legitimate grievance that somebody might have instead of just 759 00:42:06,320 --> 00:42:09,320 Speaker 1: a racist attack.
Right. And so we are really calling 760 00:42:09,600 --> 00:42:13,480 Speaker 1: on media to not create the conditions for these kinds 761 00:42:13,560 --> 00:42:16,719 Speaker 1: of racist, sexist attacks to fester and spread. 762 00:42:16,719 --> 00:42:20,120 Speaker 1: I think that this is a time that really requires everybody, 763 00:42:20,160 --> 00:42:22,600 Speaker 1: whether you are a journalist or an editor or just 764 00:42:22,760 --> 00:42:25,880 Speaker 1: a regular person on social media, to really be careful 765 00:42:26,000 --> 00:42:30,960 Speaker 1: and thoughtful about how you are talking about this very 766 00:42:31,080 --> 00:42:33,879 Speaker 1: visible black woman who is going up for this very 767 00:42:34,000 --> 00:42:37,400 Speaker 1: visible position on the Supreme Court. Okay, I feel like 768 00:42:37,440 --> 00:42:39,759 Speaker 1: you just led us into this. So tell us, the 769 00:42:39,800 --> 00:42:44,040 Speaker 1: listeners and us, voters and constituents who are here 770 00:42:44,160 --> 00:42:48,160 Speaker 1: to not only look and see and view and be 771 00:42:48,200 --> 00:42:52,360 Speaker 1: the audience, but to participate in helping stop this harmful disinformation: 772 00:42:52,480 --> 00:42:55,839 Speaker 1: what do we need to do to make sure that 773 00:42:55,920 --> 00:42:58,120 Speaker 1: we are not engaging in that, not being 774 00:42:58,120 --> 00:43:00,480 Speaker 1: a part of that, not spreading that? I'm so 775 00:43:00,560 --> 00:43:02,839 Speaker 1: glad you asked. So first, it's just, you know, if 776 00:43:02,880 --> 00:43:06,880 Speaker 1: you see, like, a harmful racist, sexist attack or a 777 00:43:06,960 --> 00:43:10,160 Speaker 1: conspiracy theory, first and foremost, don't spread it.
Nine times 778 00:43:10,160 --> 00:43:11,960 Speaker 1: out of ten, when you see this kind of thing, if 779 00:43:12,000 --> 00:43:15,280 Speaker 1: you retweet it or, like, comment on it, you're actually 780 00:43:15,280 --> 00:43:17,960 Speaker 1: just helping it grow and spread, because of the algorithmic 781 00:43:18,000 --> 00:43:20,200 Speaker 1: nature of most of our social media platforms. And so 782 00:43:20,440 --> 00:43:22,520 Speaker 1: the platform is going to think, like, oh, this person 783 00:43:22,560 --> 00:43:24,400 Speaker 1: is engaging with this, it must be good content, I'm 784 00:43:24,400 --> 00:43:26,879 Speaker 1: going to surface it for more people. So don't do that. 785 00:43:27,200 --> 00:43:32,200 Speaker 1: Focus instead on sharing accurate, timely information about the issues 786 00:43:32,239 --> 00:43:35,920 Speaker 1: and the candidates. Right? So, talk to people about Judge 787 00:43:35,960 --> 00:43:39,560 Speaker 1: Jackson's actual record, talk to people about her actual positions 788 00:43:39,560 --> 00:43:41,960 Speaker 1: and where she actually stands. That will 789 00:43:41,960 --> 00:43:45,239 Speaker 1: take a little bit of the oxygen out of 790 00:43:45,280 --> 00:43:48,000 Speaker 1: the kinds of racist, sexist, gendered attacks we are sure 791 00:43:48,040 --> 00:43:50,360 Speaker 1: to see. You can go to weareultraviolet 792 00:43:50,360 --> 00:43:53,400 Speaker 1: dot org and check out our media kit, really asking 793 00:43:53,440 --> 00:43:56,200 Speaker 1: for the media to create the conditions for this person 794 00:43:56,239 --> 00:43:59,200 Speaker 1: to be, you know, fairly judged and fairly talked about 795 00:43:59,239 --> 00:44:02,799 Speaker 1: and fairly assessed by the American public.
And 796 00:44:02,840 --> 00:44:04,960 Speaker 1: the last one I would say is: just 797 00:44:05,120 --> 00:44:08,239 Speaker 1: really ask questions. I think this is a 798 00:44:08,239 --> 00:44:10,799 Speaker 1: time where we will really be rewarded if we are 799 00:44:10,920 --> 00:44:14,719 Speaker 1: interested in being critical thinkers. Right? So when someone says, like, oh, 800 00:44:14,840 --> 00:44:17,520 Speaker 1: she's too radical, but can't give you a single thing 801 00:44:17,560 --> 00:44:20,239 Speaker 1: that backs up what they're saying, or when someone says, like, oh, 802 00:44:20,280 --> 00:44:22,560 Speaker 1: she's just gonna be really woke, really being 803 00:44:22,600 --> 00:44:24,560 Speaker 1: able to ask, you know, what do you mean by that? 804 00:44:24,600 --> 00:44:26,120 Speaker 1: When you say woke, like, 805 00:44:26,160 --> 00:44:27,719 Speaker 1: what does that mean? What are you trying to say? 806 00:44:27,920 --> 00:44:30,279 Speaker 1: I have found that to be really useful when I'm 807 00:44:30,320 --> 00:44:33,720 Speaker 1: having conversations, particularly with people that may not be aligned 808 00:44:33,719 --> 00:44:37,120 Speaker 1: with me politically; you know, really getting down to what 809 00:44:37,280 --> 00:44:39,880 Speaker 1: is the substantive thing that you are trying to say. 810 00:44:39,960 --> 00:44:42,239 Speaker 1: And if the answer is, well, I just don't like 811 00:44:42,280 --> 00:44:43,680 Speaker 1: her because she's a black woman and I don't like 812 00:44:43,719 --> 00:44:46,279 Speaker 1: black women, then say that, so we 813 00:44:46,320 --> 00:44:50,160 Speaker 1: can, you know, address that for what it is. Be 814 00:44:50,320 --> 00:44:52,239 Speaker 1: loud about it, because you know you're already pretty much 815 00:44:52,239 --> 00:44:54,799 Speaker 1: saying it, right? Like, say it with your chest.
If 816 00:44:54,800 --> 00:44:58,080 Speaker 1: you're gonna say it, just say it; you're already doing it. 817 00:44:58,080 --> 00:45:00,400 Speaker 1: It feels hypocritical for me to ask this, so I 818 00:45:00,480 --> 00:45:04,880 Speaker 1: apologize from the jump. But as non-black women, and 819 00:45:05,120 --> 00:45:10,000 Speaker 1: for our male identifying listeners, what can we do? 820 00:45:10,239 --> 00:45:13,959 Speaker 1: Because we already know black women 821 00:45:13,960 --> 00:45:17,200 Speaker 1: are going to be targeted. But the fact of the matter is, 822 00:45:17,360 --> 00:45:19,879 Speaker 1: with this is going to come so many 823 00:45:19,920 --> 00:45:22,880 Speaker 1: more attacks; it's visceral. We know what's gonna happen. 824 00:45:23,080 --> 00:45:25,040 Speaker 1: We know it, and it's gonna be harmful and it's 825 00:45:25,080 --> 00:45:27,359 Speaker 1: gonna be gross. And not only should we do all 826 00:45:27,400 --> 00:45:29,000 Speaker 1: the things that you told us to do, but what 827 00:45:29,080 --> 00:45:34,000 Speaker 1: can we do to ease the load a bit and help fight 828 00:45:35,200 --> 00:45:40,280 Speaker 1: for y'all in general? I love this question. I would say, 829 00:45:40,440 --> 00:45:43,120 Speaker 1: first of all, it's probably a tall order, but I 830 00:45:43,120 --> 00:45:47,080 Speaker 1: would love to see a shift where we understand that 831 00:45:47,320 --> 00:45:50,359 Speaker 1: this fight is all of our fight. Like, 832 00:45:50,400 --> 00:45:52,480 Speaker 1: you know, the more marginalized you are, the more 833 00:45:52,480 --> 00:45:55,000 Speaker 1: you are a target.
But this kind of thing really 834 00:45:55,040 --> 00:45:58,080 Speaker 1: harms us all, and I think seeing this work as 835 00:45:58,200 --> 00:46:01,480 Speaker 1: all of our work to create a healthy democracy 836 00:46:01,680 --> 00:46:04,399 Speaker 1: is really key. And so these kinds of attacks, they 837 00:46:04,400 --> 00:46:06,680 Speaker 1: don't just hurt the women that are subjected to them. 838 00:46:07,000 --> 00:46:11,319 Speaker 1: They really have a meaningful impact on the health and 839 00:46:11,360 --> 00:46:13,960 Speaker 1: well being of our democracy, because we can't have a 840 00:46:14,600 --> 00:46:19,240 Speaker 1: fully functioning representative democracy if everybody is not able to 841 00:46:19,360 --> 00:46:21,800 Speaker 1: use their voice, if everybody is not able to participate. 842 00:46:21,840 --> 00:46:24,919 Speaker 1: And so step one of that really comes with having 843 00:46:24,960 --> 00:46:28,720 Speaker 1: a healthy media ecosystem and a healthy climate for everybody 844 00:46:28,760 --> 00:46:30,839 Speaker 1: to be able to participate. And so I would say 845 00:46:30,840 --> 00:46:35,399 Speaker 1: really work to see these attacks on marginalized people as 846 00:46:35,520 --> 00:46:37,360 Speaker 1: all of our fight, because all of us are invested 847 00:46:37,400 --> 00:46:39,960 Speaker 1: in having a healthy democracy. Whether you are a man, 848 00:46:40,080 --> 00:46:43,239 Speaker 1: a woman, black, white, like, it is all of our fight. 849 00:46:43,440 --> 00:46:48,239 Speaker 1: And so really be able to see this as 850 00:46:48,280 --> 00:46:51,080 Speaker 1: something that you're meaningfully invested in, not just because it's 851 00:46:51,080 --> 00:46:52,960 Speaker 1: the right thing to do, which it is, but because 852 00:46:53,440 --> 00:46:58,040 Speaker 1: we all deserve to have a functioning democracy.
Yes, I 853 00:46:58,120 --> 00:47:01,160 Speaker 1: love it. Well said, as always, Bridget. Any other resources 854 00:47:01,160 --> 00:47:04,520 Speaker 1: you want to shout out, any final thoughts? Yeah, I 855 00:47:04,560 --> 00:47:07,279 Speaker 1: have to just again shout out the 856 00:47:07,280 --> 00:47:10,520 Speaker 1: work of Sister SCOTUS. They have really been doing a 857 00:47:10,520 --> 00:47:13,440 Speaker 1: lot of the work of building this infrastructure to hold 858 00:47:13,560 --> 00:47:15,919 Speaker 1: Biden accountable for this campaign promise that he made. 859 00:47:15,960 --> 00:47:18,040 Speaker 1: So their website is awesome, the women who run 860 00:47:18,040 --> 00:47:20,000 Speaker 1: it are awesome, so definitely check them out. You can 861 00:47:20,080 --> 00:47:22,520 Speaker 1: check out UltraViolet's work. You know, we are doing 862 00:47:22,560 --> 00:47:26,320 Speaker 1: a lot of the work of trying to inoculate folks 863 00:47:26,320 --> 00:47:29,359 Speaker 1: against disinformation and, like, help people spot it when they 864 00:47:29,360 --> 00:47:31,239 Speaker 1: see it and identify it. So you can check us 865 00:47:31,239 --> 00:47:33,480 Speaker 1: out at weareultraviolet dot org. And of 866 00:47:33,520 --> 00:47:36,120 Speaker 1: course you can always check out my podcast, There Are 867 00:47:36,120 --> 00:47:38,720 Speaker 1: No Girls on the Internet. Check out our new season, 868 00:47:38,880 --> 00:47:42,120 Speaker 1: which is dropping March first, and yeah, we would love 869 00:47:42,120 --> 00:47:47,239 Speaker 1: to have you there. Yes, yes, award winning podcast. You 870 00:47:47,280 --> 00:47:53,000 Speaker 1: haven't hit that subscribe button already? Why? Yes, yes, I agree. 871 00:47:53,080 --> 00:47:59,359 Speaker 1: Why? What are you waiting for? What are you doing? Yes!
872 00:47:59,400 --> 00:48:03,520 Speaker 1: Which, as this episode releases, should just be airing, 873 00:48:03,600 --> 00:48:07,840 Speaker 1: so perfectly timely. Very exciting, and we can't wait to 874 00:48:07,920 --> 00:48:09,960 Speaker 1: talk to you again, Bridget. Thank you, thank you, thank 875 00:48:10,000 --> 00:48:12,319 Speaker 1: you so much, as always, for being here. You are 876 00:48:12,360 --> 00:48:14,600 Speaker 1: the best. We love you. Thank you so much for 877 00:48:14,640 --> 00:48:16,759 Speaker 1: having me. It's been kind of fun talking about this, 878 00:48:16,800 --> 00:48:19,680 Speaker 1: like, very timely issue that is, like, happening today, as 879 00:48:19,680 --> 00:48:22,200 Speaker 1: opposed to looking back on it. So thank you for 880 00:48:22,239 --> 00:48:25,960 Speaker 1: giving me the space to do that. Yes, absolutely, anytime, 881 00:48:26,520 --> 00:48:28,920 Speaker 1: and thank you, listeners, for listening. If you would like 882 00:48:28,960 --> 00:48:31,120 Speaker 1: to contact us, you can: our email is Stephania mom 883 00:48:31,160 --> 00:48:32,799 Speaker 1: stuff at iheartmedia dot com. You can find 884 00:48:32,840 --> 00:48:34,600 Speaker 1: us on Twitter at mom stuff podcast, or on Instagram 885 00:48:34,640 --> 00:48:36,480 Speaker 1: at stuff mom never told you. Thanks as always to 886 00:48:36,520 --> 00:48:39,319 Speaker 1: our super producer, Christina. We love you, Christina. 887 00:48:39,480 --> 00:48:42,200 Speaker 1: Yes, you're the best. We love you, and thanks to 888 00:48:42,239 --> 00:48:43,800 Speaker 1: you for listening. Stuff Mom Never Told You is a production of 889 00:48:43,880 --> 00:48:45,759 Speaker 1: iHeartRadio. For more podcasts from iHeartRadio, visit 890 00:48:45,800 --> 00:48:47,759 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen 891 00:48:47,760 --> 00:48:48,560 Speaker 1: to your favorite shows.