1 00:00:05,200 --> 00:00:07,600 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom 2 00:00:07,600 --> 00:00:19,159 Speaker 1: Never Told You, a production of iHeartRadio. And today we 3 00:00:19,200 --> 00:00:23,640 Speaker 1: are really lucky because we are once again joined by 4 00:00:23,640 --> 00:00:28,040 Speaker 1: the fabulous and delightful Bridget Todd, three times in this month. 5 00:00:28,160 --> 00:00:28,800 Speaker 2: That's amazing. 6 00:00:29,280 --> 00:0032,080 Speaker 3: I know, I love it. I'm like, okay, coming on, 7 00:00:32,240 --> 00:00:34,760 Speaker 3: how often can we have you on? Let's do more stuff. 8 00:00:34,800 --> 00:00:40,680 Speaker 4: I also love it. You know, the Hugh Hefner antics 9 00:00:40,720 --> 00:00:43,000 Speaker 4: never end. Also, can I just say thank you? And 10 00:00:43,240 --> 00:00:47,080 Speaker 4: if folks haven't listened to our Hugh Hefner conversation, I 11 00:00:47,200 --> 00:00:49,800 Speaker 4: got a lot out of it, and thank you for, like, 12 00:00:49,880 --> 00:00:51,960 Speaker 4: I don't know, I feel like it was a meaty 13 00:00:52,040 --> 00:00:53,920 Speaker 4: one and people should listen to it. 14 00:00:54,040 --> 00:00:57,160 Speaker 2: And yeah, I'm grateful that we had the space. 15 00:00:57,680 --> 00:01:00,000 Speaker 3: Yeah, I know. And of course we stretched it out 16 00:01:00,200 --> 00:01:00,600 Speaker 3: very long. 17 00:01:00,640 --> 00:01:03,480 Speaker 5: We still left out so many things at the end 18 00:01:03,520 --> 00:01:05,600 Speaker 5: that we do want to come back with, maybe like 19 00:01:05,640 --> 00:01:10,520 Speaker 5: a conversation about where Playboy and magazines like that are today, 20 00:01:10,760 --> 00:01:13,199 Speaker 5: and that will be a later, later conversation. 21 00:01:12,760 --> 00:01:15,920 Speaker 3: I swear to Goddess, later, because there's. 22 00:01:15,720 --> 00:01:18,320 Speaker 5: So much and it is so deep. And going 23 00:01:18,520 --> 00:01:20,800 Speaker 5: through all of that in-depth conversation, we were 24 00:01:20,840 --> 00:01:22,920 Speaker 5: kind of at one point like, this is abruptly ending, 25 00:01:23,800 --> 00:01:25,600 Speaker 5: and Annie had to fix us because she's like, yo, 26 00:01:25,680 --> 00:01:27,800 Speaker 5: we can't do it like that because that's too much. 27 00:01:29,760 --> 00:01:32,120 Speaker 5: But it was so much that there were no, like, 28 00:01:32,400 --> 00:01:34,039 Speaker 5: bits of it that were just like, okay, this is 29 00:01:34,040 --> 00:01:35,440 Speaker 5: an easy, easygoing part. 30 00:01:35,640 --> 00:01:37,320 Speaker 3: There's no easygoing part. 31 00:01:38,319 --> 00:01:43,000 Speaker 4: No. Yeah, it's so true. And like, what a... I 32 00:01:43,000 --> 00:01:45,199 Speaker 4: don't know. I've been listening to a lot of nostalgia 33 00:01:45,319 --> 00:01:47,120 Speaker 4: podcasts that are like, oh. 34 00:01:47,000 --> 00:01:48,440 Speaker 2: Remember when this, remember when that? 35 00:01:48,640 --> 00:01:50,440 Speaker 4: Like, you know, You're Wrong About, things like that, and 36 00:01:50,480 --> 00:01:52,800 Speaker 4: it's like, what a weird time for the culture. What 37 00:01:52,840 --> 00:01:56,720 Speaker 4: a weird thing that we all experienced together like it 38 00:01:56,760 --> 00:01:58,080 Speaker 4: was normal, and it really.
39 00:01:57,920 --> 00:02:04,200 Speaker 5: Was not, right? Though, because we are aging, like slowly aging, 40 00:02:04,480 --> 00:02:08,160 Speaker 5: y'all, slower than Maze fines great, but like looking back 41 00:02:08,280 --> 00:02:10,519 Speaker 5: on things, a lot of this conversation is because things 42 00:02:10,520 --> 00:02:14,120 Speaker 5: are revamping and we're seeing the younger generation doing 43 00:02:14,160 --> 00:02:17,519 Speaker 5: their thing, similar to what we already know, and putting 44 00:02:17,520 --> 00:02:19,720 Speaker 5: their own, like, style to it. But watching it is 45 00:02:19,800 --> 00:02:23,280 Speaker 5: kind of like, huh, that's not as original as you think. 46 00:02:23,400 --> 00:02:25,880 Speaker 5: It's cute and I like your version, but just know 47 00:02:25,960 --> 00:02:28,320 Speaker 5: that this was there before. But at the same time, 48 00:02:28,360 --> 00:02:30,840 Speaker 5: I'm like, I wonder how many times the people before 49 00:02:30,919 --> 00:02:33,200 Speaker 5: us also felt that way. We're just coming into that 50 00:02:33,240 --> 00:02:36,800 Speaker 5: same, like, generational glitch of like, yeah, there it is. 51 00:02:36,880 --> 00:02:39,360 Speaker 5: This is why now I understand that the older generation 52 00:02:39,400 --> 00:02:40,880 Speaker 5: would just look at us kind of funny and be like, 53 00:02:40,960 --> 00:02:44,240 Speaker 5: what are you doing? This has been done. Anyway, I've 54 00:02:44,280 --> 00:02:45,200 Speaker 5: had those moments lately. 55 00:02:47,000 --> 00:02:49,359 Speaker 4: I had one watching a group of, like, very young 56 00:02:49,400 --> 00:02:54,480 Speaker 4: people wearing the oversized jeans that are so long that 57 00:02:54,480 --> 00:02:56,359 Speaker 4: you are stepping on them, and it was like 58 00:02:56,400 --> 00:02:58,840 Speaker 4: a damp day and I was like, ooh, I remember 59 00:02:58,919 --> 00:03:02,480 Speaker 4: that feeling of, like, the JNCO jeans that went over 60 00:03:02,520 --> 00:03:04,639 Speaker 4: the back of the heels of your sneakers and get 61 00:03:04,680 --> 00:03:06,400 Speaker 4: wet and gross. Yuck. 62 00:03:07,000 --> 00:03:08,760 Speaker 3: You're gonna want the skinny jeans again one day. 63 00:03:08,840 --> 00:03:11,280 Speaker 4: Yeah, like, oh, y'all are back there again. Okay, cool, 64 00:03:11,320 --> 00:03:12,280 Speaker 4: let's see how that goes. 65 00:03:13,120 --> 00:03:14,640 Speaker 1: I just have to put in here, and I'm not 66 00:03:14,680 --> 00:03:17,480 Speaker 1: going to provide any more context than this, but JNCO 67 00:03:17,600 --> 00:03:23,480 Speaker 1: jeans are very important in my Dungeons and Dragons campaign. 68 00:03:23,560 --> 00:03:24,760 Speaker 1: Follow up? Or no, no, no. 69 00:03:27,000 --> 00:03:29,359 Speaker 3: I think Peaches has something to do with this. 70 00:03:30,960 --> 00:03:36,360 Speaker 1: Dog, Peaches is very much an inspiration in the campaign. Yeah, well, 71 00:03:36,080 --> 00:03:39,400 Speaker 1: I'll say this: as the one who runs the game, sometimes 72 00:03:39,440 --> 00:03:40,839 Speaker 1: you got to go with what the players are doing. 73 00:03:40,840 --> 00:03:43,320 Speaker 1: And we're all from kind of the same generation. 74 00:03:43,800 --> 00:03:46,520 Speaker 3: Is that millennial level of like.
75 00:03:46,880 --> 00:03:54,240 Speaker 6: Yeah, anyway, we really appreciate you joining us today, Bridget. 76 00:03:54,480 --> 00:03:57,400 Speaker 6: The topic you have for us is one 77 00:03:57,520 --> 00:04:00,440 Speaker 6: that I admit is giving me a lot of anxiety, 78 00:04:00,960 --> 00:04:02,600 Speaker 6: but I'm very glad that you're here to talk about 79 00:04:02,600 --> 00:04:05,000 Speaker 6: it, because it's so so important and there's a lot 80 00:04:05,080 --> 00:04:08,240 Speaker 6: to break down. So let's go ahead and get into it. 81 00:04:08,720 --> 00:04:09,000 Speaker 2: Yeah. 82 00:04:09,160 --> 00:04:12,560 Speaker 4: I feel the same anxiety and angst that you're feeling, 83 00:04:12,720 --> 00:04:16,240 Speaker 4: and that is that we have an upcoming election. 84 00:04:16,960 --> 00:04:19,800 Speaker 4: I've been doing this thing where I'm saying I'm not 85 00:04:19,839 --> 00:04:22,880 Speaker 4: going to even pay attention to the election until we're, 86 00:04:22,920 --> 00:04:25,640 Speaker 4: like, one hundred days out. Generally for me it's around then 87 00:04:25,720 --> 00:04:27,560 Speaker 4: that I'm like, oh God, I just want a campaign. 88 00:04:27,880 --> 00:04:31,400 Speaker 4: But I think, like, I need to confront the fact 89 00:04:31,400 --> 00:04:34,120 Speaker 4: that we're in an election year, and it's not just twenty 90 00:04:34,160 --> 00:04:34,600 Speaker 4: twenty four. 91 00:04:34,640 --> 00:04:35,760 Speaker 2: It's not just an election year. 92 00:04:35,760 --> 00:04:39,120 Speaker 4: It's probably the election year, not just here in the 93 00:04:39,200 --> 00:04:40,680 Speaker 4: United States, but also globally. 94 00:04:41,240 --> 00:04:41,960 Speaker 2: I did not know this. 95 00:04:42,080 --> 00:04:45,760 Speaker 4: Around the world, more voters than ever in history will vote this year, 96 00:04:46,080 --> 00:04:48,719 Speaker 4: with at least sixty four countries and the European Union 97 00:04:48,800 --> 00:04:53,120 Speaker 4: having elections. Combined, that's about forty nine percent of 98 00:04:53,160 --> 00:04:56,640 Speaker 4: people worldwide who are meant to be holding national elections in 99 00:04:56,680 --> 00:04:57,440 Speaker 4: twenty twenty four. 100 00:04:57,480 --> 00:04:58,480 Speaker 2: So it's a very big deal. 101 00:04:58,680 --> 00:05:01,200 Speaker 4: The stakes are high. You're not wrong to be feeling 102 00:05:01,600 --> 00:05:04,159 Speaker 4: a little anxiety or a little angst or a little 103 00:05:04,240 --> 00:05:07,920 Speaker 4: uncertainty around it, Annie. But given all that, I think 104 00:05:07,920 --> 00:05:11,320 Speaker 4: it really makes sense to think about whether or not 105 00:05:12,240 --> 00:05:16,359 Speaker 4: elections are fair to women who are running for office. Obviously, 106 00:05:16,400 --> 00:05:19,200 Speaker 4: this is, like, a global issue, but right here in 107 00:05:19,240 --> 00:05:21,680 Speaker 4: the United States things are also very bleak. 108 00:05:21,720 --> 00:05:24,440 Speaker 2: And it's important because we can't have, like, 109 00:05:24,400 --> 00:05:27,960 Speaker 4: an equitable, functional democracy if women are not able to 110 00:05:28,040 --> 00:05:31,599 Speaker 4: run for office without facing identity-motivated attacks simply for 111 00:05:31,800 --> 00:05:33,520 Speaker 4: showing up and daring to be women.
112 00:05:34,120 --> 00:05:37,560 Speaker 1: Yes, and you've talked about this a lot before, and 113 00:05:37,560 --> 00:05:42,240 Speaker 1: I think it's so important, because if you're considering running 114 00:05:42,240 --> 00:05:47,320 Speaker 1: for office as a woman or any intersection of marginalized identity, 115 00:05:48,080 --> 00:05:50,280 Speaker 1: that's just, like, an extra thing to deal with, and, 116 00:05:50,320 --> 00:05:51,880 Speaker 1: like, your family has to deal with, like, people who 117 00:05:51,920 --> 00:05:55,200 Speaker 1: know you have to deal with, and I honestly can't 118 00:05:55,200 --> 00:05:57,920 Speaker 1: blame people for being like, you know what, I don't 119 00:05:57,920 --> 00:05:59,640 Speaker 1: want to deal with that. And that's a huge problem. 120 00:06:00,120 --> 00:06:01,480 Speaker 1: It is a huge 121 00:06:01,200 --> 00:06:05,320 Speaker 4: problem, absolutely. And even for folks who are not running 122 00:06:05,360 --> 00:06:08,400 Speaker 4: for elected office, it's a problem, because more and more 123 00:06:08,400 --> 00:06:12,279 Speaker 4: we're seeing harassment and identity-based attacks being 124 00:06:12,360 --> 00:06:15,000 Speaker 4: part of showing up to public or civic life. And 125 00:06:15,040 --> 00:06:17,560 Speaker 4: so if you want to be a poll worker for 126 00:06:17,640 --> 00:06:20,600 Speaker 4: your elections in your town, or if you want to 127 00:06:20,640 --> 00:06:23,039 Speaker 4: speak up at a town hall meeting in your town, 128 00:06:23,440 --> 00:06:27,640 Speaker 4: just these everyday, not-exciting bits of democracy, these 129 00:06:27,680 --> 00:06:30,040 Speaker 4: everyday ways that folks should be able to show up 130 00:06:30,040 --> 00:06:32,800 Speaker 4: and make their voices heard and participate in their civic 131 00:06:32,800 --> 00:06:35,640 Speaker 4: and public life, women and people of color and other 132 00:06:35,720 --> 00:06:38,719 Speaker 4: marginalized people are facing attacks just for doing that. And, 133 00:06:39,240 --> 00:06:42,920 Speaker 4: you know, it makes me sad, because I've seen very 134 00:06:42,960 --> 00:06:48,080 Speaker 4: clear research that suggests that women are shying away from 135 00:06:48,120 --> 00:06:51,800 Speaker 4: public office and shying away from this kind of civic 136 00:06:51,800 --> 00:06:53,040 Speaker 4: participation because 137 00:06:52,760 --> 00:06:55,120 Speaker 2: of these attacks. And on the one hand, I get it. Like, 138 00:06:55,080 --> 00:06:57,640 Speaker 4: women are smart enough to know: if I'm not protected 139 00:06:57,640 --> 00:06:59,159 Speaker 4: when I do this, if I have to pay 140 00:06:59,560 --> 00:07:02,039 Speaker 4: lots and lots of money to protect myself 141 00:07:02,080 --> 00:07:05,039 Speaker 4: in a way that my male counterparts just don't even 142 00:07:05,040 --> 00:07:06,760 Speaker 4: have to think about, I don't want to do that. 143 00:07:06,839 --> 00:07:09,360 Speaker 4: And so as much as I want to have the 144 00:07:09,400 --> 00:07:13,720 Speaker 4: representative democracy that we all deserve, I understand why more 145 00:07:13,760 --> 00:07:16,360 Speaker 4: and more marginalized people are checking out and being like, yeah, 146 00:07:16,360 --> 00:07:17,720 Speaker 4: I'm just not going to sign up for this. Like, 147 00:07:17,720 --> 00:07:18,280 Speaker 4: why would you? 148 00:07:18,400 --> 00:07:18,880 Speaker 2: I get it.
149 00:07:19,800 --> 00:07:23,920 Speaker 1: Yeah. And I think one of the reasons for my anxiety, 150 00:07:23,920 --> 00:07:27,080 Speaker 1: a lot of our anxiety, as you said, you have 151 00:07:27,160 --> 00:07:29,280 Speaker 1: been on here to talk about, like, disinformation 152 00:07:29,360 --> 00:07:33,000 Speaker 1: and misinformation in these attacks, is that it feels like it's gotten 153 00:07:33,560 --> 00:07:37,400 Speaker 1: worse, for a lot of reasons. Is that in my 154 00:07:37,480 --> 00:07:40,960 Speaker 4: head, or... I am sorry to say it is not 155 00:07:41,200 --> 00:07:43,480 Speaker 4: in your head. You know, today, in twenty twenty four, 156 00:07:43,960 --> 00:07:47,360 Speaker 4: women in politics and positions of power are the targets of 157 00:07:47,560 --> 00:07:50,360 Speaker 2: just overwhelming volumes 158 00:07:49,800 --> 00:07:54,080 Speaker 4: of gendered disinformation and online abuse in the form of character assassinations, 159 00:07:54,080 --> 00:07:58,040 Speaker 4: fake stories, and humiliating or sexually charged images. Something that 160 00:07:58,120 --> 00:08:03,320 Speaker 4: is new is the proliferation of AI-enabled disinformation, visual disinformation, 161 00:08:03,400 --> 00:08:05,360 Speaker 4: so things like cheap fakes, deep fakes, all of that 162 00:08:05,440 --> 00:08:08,800 Speaker 4: kind of stuff that I think we really hadn't had 163 00:08:08,800 --> 00:08:11,640 Speaker 4: to wade through before, like we are now. 164 00:08:12,040 --> 00:08:14,240 Speaker 2: And I also just think, I mean, this is just 165 00:08:14,280 --> 00:08:14,880 Speaker 2: my opinion. 166 00:08:15,920 --> 00:08:19,320 Speaker 4: I think that the cultural landscape around these kinds of 167 00:08:19,400 --> 00:08:20,679 Speaker 4: attacks has changed. 168 00:08:21,120 --> 00:08:24,560 Speaker 2: I think that we're in a place where people 169 00:08:25,840 --> 00:08:29,000 Speaker 4: are really willing to believe things that adhere to their 170 00:08:29,560 --> 00:08:33,680 Speaker 4: worldviews already. And so if you see a piece of 171 00:08:33,679 --> 00:08:38,400 Speaker 4: disinformation or an AI-generated deepfake, even if there 172 00:08:38,400 --> 00:08:40,640 Speaker 4: are all the signs that would usually make you go, that's probably not 173 00:08:40,679 --> 00:08:45,040 Speaker 4: true, you'll believe it if it aligns with your worldview. Anecdotally, I believe 174 00:08:45,040 --> 00:08:47,560 Speaker 4: that I've seen more and more people being unwilling to 175 00:08:47,679 --> 00:08:51,040 Speaker 4: challenge things that are clearly not true if they align 176 00:08:51,040 --> 00:08:54,479 Speaker 4: with their worldview. And we're talking about misogyny and sexism; 177 00:08:54,880 --> 00:08:56,440 Speaker 4: so much of that is wrapped 178 00:08:56,160 --> 00:08:58,120 Speaker 2: up in how people think about the world.
179 00:08:58,200 --> 00:09:01,040 Speaker 4: And so if you already are somebody who is prone 180 00:09:01,040 --> 00:09:03,839 Speaker 4: to think that women are untrustworthy, or women are incapable, 181 00:09:03,920 --> 00:09:06,360 Speaker 4: or women are not qualified or not good leaders, when 182 00:09:06,440 --> 00:09:08,880 Speaker 4: you see something that confirms that, even if it's, like, 183 00:09:09,080 --> 00:09:13,400 Speaker 4: obviously AI-manipulated or otherwise not true, I do 184 00:09:13,440 --> 00:09:15,960 Speaker 4: think, here in twenty twenty four, people are going to 185 00:09:15,960 --> 00:09:17,840 Speaker 4: be willing to be like, oh yeah, that tracks, you know, 186 00:09:17,920 --> 00:09:20,880 Speaker 4: because it aligns to this, like, deeply held idea and 187 00:09:21,000 --> 00:09:25,520 Speaker 4: attitude about women that they already have, right, you know. 188 00:09:25,480 --> 00:09:27,040 Speaker 5: One of the things I was thinking about: the last 189 00:09:27,040 --> 00:09:30,360 Speaker 5: time we kind of had a similar conversation about this, 190 00:09:30,520 --> 00:09:32,800 Speaker 5: we talked a lot about, like, bots, and we talked 191 00:09:32,800 --> 00:09:36,520 Speaker 5: a lot about, like, how Russia really did target specific 192 00:09:36,800 --> 00:09:40,560 Speaker 5: women and marginalized people, even pretending to be, like, marginalized 193 00:09:40,559 --> 00:09:43,600 Speaker 5: people themselves in order to attack other people, or be like, yes, 194 00:09:43,800 --> 00:09:46,280 Speaker 5: I'm definitely voting for so-and-so because I am 195 00:09:46,360 --> 00:09:49,360 Speaker 5: this type of person who thinks that this candidate will 196 00:09:49,360 --> 00:09:51,559 Speaker 5: do the best for us, even though it's not true 197 00:09:51,559 --> 00:09:54,720 Speaker 5: at all. Which, you know, we've seen plants essentially everywhere, 198 00:09:54,760 --> 00:09:57,280 Speaker 5: but then these bots were this thing. Is there something 199 00:09:57,320 --> 00:10:00,199 Speaker 5: different this time around that we're seeing? Like, as 200 00:10:00,200 --> 00:10:02,760 Speaker 5: calculated as that was, obviously it's getting worse. 201 00:10:02,840 --> 00:10:03,880 Speaker 3: What are we seeing now? 202 00:10:04,320 --> 00:10:05,679 Speaker 2: Yeah, that's a great question. 203 00:10:05,880 --> 00:10:09,000 Speaker 4: I would say that I was somebody who really talked 204 00:10:09,000 --> 00:10:13,880 Speaker 4: a lot about the way that global forces were using 205 00:10:14,600 --> 00:10:20,880 Speaker 4: and exploiting our online ecosystem to, like, manufacture political conversation 206 00:10:21,320 --> 00:10:23,439 Speaker 4: in a way that was inauthentic, and I think that's 207 00:10:23,480 --> 00:10:25,880 Speaker 4: important to talk about. Like, there was an entire Senate 208 00:10:25,920 --> 00:10:29,280 Speaker 4: inquiry report that said that Russia specifically was interested in 209 00:10:29,320 --> 00:10:32,360 Speaker 4: targeting black people, black voters, and getting them to either 210 00:10:32,480 --> 00:10:34,640 Speaker 4: vote for Trump or to stay home in our last 211 00:10:34,640 --> 00:10:35,400 Speaker 4: presidential election. 212 00:10:35,480 --> 00:10:37,880 Speaker 2: Right. So, like, that's a thing.
213 00:10:38,280 --> 00:10:41,120 Speaker 4: However, I think in twenty twenty four the conversation really 214 00:10:41,160 --> 00:10:46,480 Speaker 4: has to be more involved, because it's not just foreign adversaries, 215 00:10:46,640 --> 00:10:50,080 Speaker 4: it is also people right here in the United States. 216 00:10:50,600 --> 00:10:51,839 Speaker 2: You have the front 217 00:10:51,600 --> 00:10:56,280 Speaker 4: runner for the Republican Party engaging in identity-based disinformation 218 00:10:56,320 --> 00:11:00,080 Speaker 4: attacks all the time, right? And so I think the 219 00:11:00,120 --> 00:11:03,240 Speaker 4: problem has gotten more involved, and I think there are 220 00:11:03,240 --> 00:11:04,439 Speaker 4: more parties involved. 221 00:11:04,480 --> 00:11:05,439 Speaker 2: And I also think it's 222 00:11:05,280 --> 00:11:07,960 Speaker 4: been more normalized, right? Like, I think, like, the idea 223 00:11:08,000 --> 00:11:11,600 Speaker 4: that someone would just lie about a candidate and tell 224 00:11:11,640 --> 00:11:15,839 Speaker 4: a lie that only works because it is supported 225 00:11:15,920 --> 00:11:20,239 Speaker 4: by bias or unfair attitudes or lies 226 00:11:20,000 --> 00:11:21,120 Speaker 2: about somebody's identity. 227 00:11:21,760 --> 00:11:23,680 Speaker 4: I think that we're in a place where people are 228 00:11:23,679 --> 00:11:26,360 Speaker 4: not calling that what it is, and so the whole 229 00:11:26,400 --> 00:11:29,160 Speaker 4: problem has, like, gotten worse. 230 00:11:29,520 --> 00:11:32,200 Speaker 1: Yes. And I think, like, when you do call it 231 00:11:32,200 --> 00:11:37,240 Speaker 1: what it is, we've just become so, like... we don't 232 00:11:37,240 --> 00:11:40,880 Speaker 1: trust anybody else's data. We don't trust where they're getting 233 00:11:40,880 --> 00:11:43,760 Speaker 1: their sources. That's what my experience has been: when I 234 00:11:43,800 --> 00:11:47,199 Speaker 1: get in fights with people, they're always like, where's your source? 235 00:11:48,760 --> 00:11:53,280 Speaker 4: Oh, don't even get me started. Like, where do 236 00:11:53,280 --> 00:11:53,839 Speaker 4: I even start? 237 00:11:53,880 --> 00:11:55,559 Speaker 2: Like, I have had that same experience. 238 00:11:55,559 --> 00:11:57,960 Speaker 4: It's part and parcel of why I've kind of, like, 239 00:11:58,280 --> 00:12:02,199 Speaker 4: given up on a certain kind of online discourse with strangers, 240 00:12:02,240 --> 00:12:04,480 Speaker 4: because I'm just like, it's not worth my time. If 241 00:12:04,480 --> 00:12:06,360 Speaker 4: you're somebody who wants to be in the trenches arguing 242 00:12:06,360 --> 00:12:09,240 Speaker 4: with somebody on a Reddit thread, more love to you, like, 243 00:12:09,320 --> 00:12:11,120 Speaker 4: good job, good luck. And it's 244 00:12:11,000 --> 00:12:13,320 Speaker 2: not my ministry. I don't have the energy, I don't 245 00:12:13,320 --> 00:12:13,600 Speaker 2: have the 246 00:12:13,559 --> 00:12:16,880 Speaker 4: time. Because, and I guess that's the thing about people 247 00:12:16,920 --> 00:12:21,080 Speaker 4: with these entrenched worldviews, it's very hard to have 248 00:12:21,160 --> 00:12:25,160 Speaker 4: conversations with them in some cases, because it's like, oh well, 249 00:12:25,960 --> 00:12:29,920 Speaker 4: here's a, like, reputable source, and it's like, oh, well, 250 00:12:29,960 --> 00:12:33,320 Speaker 4: the source is wrong. Here's why you can't trust whatever source.
251 00:12:33,360 --> 00:12:36,520 Speaker 4: And the problem is really that, like, a lot 252 00:12:36,559 --> 00:12:40,480 Speaker 4: of these institutions and sources really have given people legit 253 00:12:40,600 --> 00:12:43,079 Speaker 4: reasons to be sus about them and to question them, and 254 00:12:43,040 --> 00:12:44,840 Speaker 2: so it just makes it that much more complicated. 255 00:12:44,920 --> 00:12:48,480 Speaker 4: We're like, you know... well, yes, like, I don't want 256 00:12:48,480 --> 00:12:50,920 Speaker 4: to go out here and cape for, like, the CDC 257 00:12:51,280 --> 00:12:51,840 Speaker 4: or whatever. 258 00:12:52,480 --> 00:12:54,640 Speaker 2: But, like, we have to start somewhere. 259 00:12:54,679 --> 00:12:58,160 Speaker 4: We have to have, like, a baseline understanding of just 260 00:12:58,200 --> 00:13:01,520 Speaker 4: the reality that we're all sharing; otherwise everything breaks down. 261 00:13:12,040 --> 00:13:16,120 Speaker 1: You are someone who has been working in the world 262 00:13:16,160 --> 00:13:19,600 Speaker 1: of disinformation and misinformation for a while, and I think 263 00:13:19,600 --> 00:13:23,240 Speaker 1: it's interesting, because when we talk to you, there's always 264 00:13:23,320 --> 00:13:26,960 Speaker 1: kind of, like, almost like a game plan of, like, 265 00:13:27,000 --> 00:13:30,720 Speaker 1: how things work, or how people do things. And, yes, 266 00:13:30,800 --> 00:13:32,720 Speaker 1: as you've been saying, like, some of it relies on 267 00:13:32,720 --> 00:13:34,959 Speaker 1: what's baked in already, like the sexism baked in 268 00:13:35,000 --> 00:13:35,959 Speaker 1: already. Totally. 269 00:13:36,360 --> 00:13:38,600 Speaker 4: So that is a great way to put it, Annie, 270 00:13:38,640 --> 00:13:41,880 Speaker 4: because it really is like a game plan. And I really, 271 00:13:42,200 --> 00:13:44,880 Speaker 4: in the work that I do around mis- and disinformation, I 272 00:13:44,920 --> 00:13:47,719 Speaker 4: really encourage folks to, like, take a step back and 273 00:13:47,840 --> 00:13:51,440 Speaker 4: see the machinations of how this is working, to sort 274 00:13:51,440 --> 00:13:54,960 Speaker 4: of see the ways that we are being, essentially, like, 275 00:13:55,240 --> 00:13:58,120 Speaker 4: hoodwinked, and grifters are sort of, like, getting something out 276 00:13:58,160 --> 00:14:01,960 Speaker 4: of us by following a very specific kind of game plan. 277 00:14:02,040 --> 00:14:04,040 Speaker 4: And so you can really think about the way that 278 00:14:04,080 --> 00:14:07,679 Speaker 4: we're seeing misogyny-based attacks work today as a sort 279 00:14:07,720 --> 00:14:08,840 Speaker 4: of three-pronged attack. 280 00:14:09,120 --> 00:14:11,720 Speaker 2: Prong one is online violence. 281 00:14:11,760 --> 00:14:15,040 Speaker 4: So, like, threats of abuse, hate speech, harassment, smear campaigns. 282 00:14:15,160 --> 00:14:18,960 Speaker 4: Then you have prong two, disinformation by bad actors. So, 283 00:14:19,080 --> 00:14:22,240 Speaker 4: like, the kind of artificial activity that Sam was talking about, 284 00:14:22,240 --> 00:14:27,880 Speaker 4: like bots, coordinated influence operations, fake news, combined with algorithmic 285 00:14:27,960 --> 00:14:31,760 Speaker 4: preferences toward incendiary content.
Right. Like, a whole body of 286 00:14:31,760 --> 00:14:36,800 Speaker 4: research tells us that social media platforms, places like Twitter, Reddit, whatever, 287 00:14:37,160 --> 00:14:40,600 Speaker 4: they really do prioritize content 288 00:14:40,200 --> 00:14:41,359 Speaker 2: that is inflammatory. 289 00:14:41,440 --> 00:14:41,600 Speaker 5: Right. 290 00:14:41,600 --> 00:14:44,280 Speaker 4: And so if you are somebody who wants to tell 291 00:14:44,800 --> 00:14:48,920 Speaker 4: an inflammatory, identity-based lie about somebody, that 292 00:14:49,080 --> 00:14:52,240 Speaker 4: is going to get preferential treatment on algorithms. It's just 293 00:14:52,320 --> 00:14:54,840 Speaker 4: a fact. And then the last prong is what you 294 00:14:54,880 --> 00:14:59,760 Speaker 4: were talking about, Annie: just good old-fashioned everyday sexism, right? 295 00:14:59,840 --> 00:15:04,280 Speaker 4: Just the general suspicion around women, particularly women of color 296 00:15:04,320 --> 00:15:08,560 Speaker 4: and black women, suspicion around our motivations, our abilities, our, 297 00:15:08,880 --> 00:15:12,280 Speaker 4: like, motivations to lead. Like, why do we want to lead? 298 00:15:12,280 --> 00:15:13,800 Speaker 2: How do we get to a leadership position? 299 00:15:14,880 --> 00:15:18,920 Speaker 4: All of this really allows gender-based attacks to take 300 00:15:19,040 --> 00:15:22,800 Speaker 4: root in mainstream political discourse. So I wish that we 301 00:15:22,800 --> 00:15:26,440 Speaker 4: were talking about something that we only see festering in, 302 00:15:27,160 --> 00:15:32,160 Speaker 4: you know, random dudes blog dot biz. But a lot 303 00:15:32,200 --> 00:15:37,240 Speaker 4: of this conversation does get amplified to, like, mainstream platforms. 304 00:15:37,760 --> 00:15:41,080 Speaker 1: It does. And that's something I've been thinking about a 305 00:15:41,120 --> 00:15:44,400 Speaker 1: lot lately too, is, you've talked about before, like, 306 00:15:44,480 --> 00:15:50,320 Speaker 1: how, kind of, like, outrage gets more money, like 307 00:15:50,600 --> 00:15:55,320 Speaker 1: it's monetized in our social media and other platforms. And 308 00:15:56,240 --> 00:16:01,160 Speaker 1: I feel when women point that out, the criticisms come 309 00:16:01,200 --> 00:16:04,320 Speaker 1: again of, like, well, see, she can't handle it, or 310 00:16:04,320 --> 00:16:07,120 Speaker 1: what have you. But there are numbers to back this up. 311 00:16:07,320 --> 00:16:08,080 Speaker 2: Oh, totally. 312 00:16:08,080 --> 00:16:11,080 Speaker 4: And something about that is, like, men are not being 313 00:16:11,160 --> 00:16:15,960 Speaker 4: asked to handle identity-based criticisms when they run for 314 00:16:15,960 --> 00:16:18,480 Speaker 4: public office. They're just not. And so I would argue 315 00:16:18,520 --> 00:16:22,000 Speaker 4: that, like, a landscape where women are asked to handle 316 00:16:22,040 --> 00:16:25,040 Speaker 4: something that their counterparts simply do not have to deal 317 00:16:25,080 --> 00:16:27,080 Speaker 4: with is not a fair landscape. 318 00:16:27,080 --> 00:16:28,720 Speaker 2: And you shouldn't have to put up 319 00:16:28,560 --> 00:16:32,120 Speaker 4: with identity-based lies and smears to run for office. 320 00:16:32,400 --> 00:16:35,240 Speaker 4: You should be able to handle criticisms about your record, 321 00:16:35,280 --> 00:16:37,880 Speaker 4: about what you say, about your behavior, about your policy.
322 00:16:38,000 --> 00:16:40,360 Speaker 4: But we're not talking about that. We're talking about: you 323 00:16:40,400 --> 00:16:43,280 Speaker 4: can't lead because you're a woman. I am suspicious of 324 00:16:43,280 --> 00:16:45,720 Speaker 4: your motivations because you're a woman. That's a different thing 325 00:16:46,000 --> 00:16:49,400 Speaker 4: that men in public office simply do not have to 326 00:16:49,400 --> 00:16:49,760 Speaker 4: deal with. 327 00:16:49,920 --> 00:16:52,440 Speaker 2: So let's look at some of the data around this. 328 00:16:53,000 --> 00:16:56,040 Speaker 4: So discrediting and abusing women candidates in the twenty twenty 329 00:16:56,040 --> 00:16:59,600 Speaker 4: two midterm election echoed these patterns that really took shape 330 00:16:59,600 --> 00:17:03,920 Speaker 4: in twenty twenty, when racist and sexist and ableist abuse 331 00:17:04,359 --> 00:17:08,000 Speaker 4: was hurled at candidates like Representative Ilhan Omar and Senator 332 00:17:08,040 --> 00:17:12,040 Speaker 4: Tammy Duckworth, when multiple secretaries of state faced harassment and 333 00:17:12,440 --> 00:17:17,480 Speaker 4: legit physical threats for certifying the election results, and white 334 00:17:17,480 --> 00:17:22,560 Speaker 4: supremacists plotted to kidnap and hold ransom Governor Whitmer. Side note: 335 00:17:22,720 --> 00:17:24,879 Speaker 4: that is something that, like, I don't think that we 336 00:17:25,000 --> 00:17:29,040 Speaker 4: ever reckoned with. Like, the fact that white supremacists were foiled in 337 00:17:29,080 --> 00:17:34,120 Speaker 4: a plot to kidnap a publicly elected official? We did 338 00:17:34,160 --> 00:17:37,719 Speaker 4: not have nearly enough conversation about that. I believe if 339 00:17:37,720 --> 00:17:41,639 Speaker 4: that had happened in another country, we would be like, 340 00:17:41,760 --> 00:17:43,640 Speaker 4: we would be looking at it like, wow, look who 341 00:17:43,640 --> 00:17:46,040 Speaker 4: can't do democracy? Oh my god, what is going on 342 00:17:46,119 --> 00:17:48,280 Speaker 4: over there? It happened here, and I feel like it 343 00:17:48,320 --> 00:17:50,080 Speaker 4: barely made a blip. 344 00:17:50,560 --> 00:17:53,320 Speaker 5: I mean, it's kind of like any abuse situation or 345 00:17:53,359 --> 00:17:57,720 Speaker 5: stalking situation: unless they're dead, they don't care. Unless we die, 346 00:17:57,960 --> 00:18:00,639 Speaker 5: they're not going to listen. It's not an actual threat. 347 00:18:00,760 --> 00:18:02,960 Speaker 5: Which is the most absurd thing, that they need to 348 00:18:03,000 --> 00:18:05,119 Speaker 5: see it end in violence in order to give credit 349 00:18:05,119 --> 00:18:08,400 Speaker 5: to what happened. Like, that's unfortunately what's happening. 350 00:18:08,720 --> 00:18:11,359 Speaker 4: And that's sort of my point of this whole conversation, Sam: 351 00:18:11,440 --> 00:18:15,240 Speaker 4: that we shouldn't accept that we have to wait 352 00:18:15,400 --> 00:18:18,359 Speaker 4: until somebody is dead or in the hospital to have 353 00:18:18,520 --> 00:18:26,040 Speaker 4: these conversations. Right? Like, media platforms amplified violence and abuse 354 00:18:26,240 --> 00:18:30,040 Speaker 4: against Nancy Pelosi and her husband. Somebody broke into her house 355 00:18:30,080 --> 00:18:31,720 Speaker 4: and attacked her husband with a hammer and 356 00:18:31,680 --> 00:18:33,000 Speaker 2: he was hospitalized, right.
357 00:18:32,920 --> 00:18:36,119 Speaker 4: Like, it shouldn't take something like that to be like, 358 00:18:36,240 --> 00:18:39,400 Speaker 4: oh wait, is this a real thing? It's a real thing, 359 00:18:39,560 --> 00:18:42,520 Speaker 4: and it shouldn't have to get to a point of 360 00:18:43,320 --> 00:18:46,399 Speaker 4: violence for us to really take it seriously. And 361 00:18:46,480 --> 00:18:47,120 Speaker 4: these kinds of 362 00:18:47,040 --> 00:18:48,440 Speaker 2: attacks are really expanding. 363 00:18:48,480 --> 00:18:50,919 Speaker 4: A recent global study shows that women government 364 00:18:50,960 --> 00:18:53,879 Speaker 4: officials targeted with violence is one of the largest categories 365 00:18:53,880 --> 00:18:56,440 Speaker 4: of attacks in the US compared to other regions of 366 00:18:56,480 --> 00:18:59,760 Speaker 4: the world, which includes public and civil servants, local authorities, 367 00:18:59,760 --> 00:19:04,080 Speaker 4: and nonpartisan political appointments such as judges. So, like, yeah, 368 00:19:04,080 --> 00:19:07,840 Speaker 4: like, it's a global problem, but the problem is unique 369 00:19:07,880 --> 00:19:10,720 Speaker 4: in the United States. And especially given that we might 370 00:19:10,800 --> 00:19:14,560 Speaker 4: have two women running for vice president in twenty twenty four, 371 00:19:14,960 --> 00:19:17,679 Speaker 4: the way that we talk about gender and leadership, and, 372 00:19:17,720 --> 00:19:20,240 Speaker 4: you know, the kind of conversations that 373 00:19:20,280 --> 00:19:23,560 Speaker 4: we normalize and tolerate around women in leadership, we are 374 00:19:23,680 --> 00:19:25,520 Speaker 4: going to have to sort of get that straight, right? 375 00:19:25,560 --> 00:19:27,920 Speaker 4: Because it's certainly going to be something that comes up. 376 00:19:28,400 --> 00:19:30,240 Speaker 4: In fact, I would say that we've already seen this 377 00:19:30,280 --> 00:19:33,000 Speaker 4: when it comes to attacks on Vice President Harris. 378 00:19:34,160 --> 00:19:42,640 Speaker 1: Yes. Yeah, it's so sad, because I remember having these 379 00:19:42,680 --> 00:19:46,959 Speaker 1: thoughts when, like, Hillary Clinton was almost elected, and then 380 00:19:48,400 --> 00:19:51,520 Speaker 1: Vice President Kamala Harris. I was thinking that they're going 381 00:19:51,560 --> 00:19:54,600 Speaker 1: to be assassinated. I thought that, and it's horrific that 382 00:19:54,680 --> 00:19:58,960 Speaker 1: I thought it, but I was, like, worried for them. 383 00:19:59,119 --> 00:20:03,280 Speaker 1: And there's a reason to be, because if you're 384 00:20:03,359 --> 00:20:06,160 Speaker 1: living in a world where you're getting constant threats of, 385 00:20:06,200 --> 00:20:10,959 Speaker 1: like, death and violence and sexual violence, it's hard not 386 00:20:11,119 --> 00:20:15,800 Speaker 1: to think, like, one of these people might actually do something.
387 00:20:16,520 --> 00:20:21,080 Speaker 4: Yeah, especially in a country where guns are not really 388 00:20:21,080 --> 00:20:24,200 Speaker 4: that difficult to come by, in a country that normalizes 389 00:20:24,280 --> 00:20:27,520 Speaker 4: and amplifies this kind of violent rhetoric against women 390 00:20:27,560 --> 00:20:31,000 Speaker 4: and public officials. Like, the conditions are perfect for that 391 00:20:31,080 --> 00:20:34,679 Speaker 4: kind of violence to easily move from online violence to 392 00:20:34,760 --> 00:20:37,479 Speaker 4: offline violence, just like we saw on January sixth, right? 393 00:20:37,520 --> 00:20:40,119 Speaker 4: Like, I'm not, like, pulling this out of nowhere. 394 00:20:40,160 --> 00:20:40,960 Speaker 2: We have already seen this. 395 00:20:41,320 --> 00:20:46,920 Speaker 1: Yes, and again there are numbers behind 396 00:20:46,640 --> 00:20:48,000 Speaker 2: this. There sure are. 397 00:20:48,200 --> 00:20:52,000 Speaker 4: So shout out to the Institute for Strategic Dialogue. They 398 00:20:52,119 --> 00:20:54,520 Speaker 4: have a study that found that women of color candidates 399 00:20:54,560 --> 00:20:58,800 Speaker 4: are targeted by the right at really alarming rates online. 400 00:20:58,960 --> 00:21:01,919 Speaker 4: A couple of key findings. So, abusive messages accounted for 401 00:21:02,000 --> 00:21:04,720 Speaker 4: more than fifteen percent of those directed at every female 402 00:21:04,760 --> 00:21:08,520 Speaker 4: lawmaker analyzed, compared with around five to ten percent for 403 00:21:08,600 --> 00:21:12,640 Speaker 4: male candidates. Women of color are particularly likely to be targeted. 404 00:21:13,040 --> 00:21:15,640 Speaker 4: And this piece I think is really important. So, male 405 00:21:15,720 --> 00:21:20,119 Speaker 4: politicians who are ethnic minorities are not more vulnerable than 406 00:21:20,160 --> 00:21:23,000 Speaker 4: their white counterparts. For instance, Cory Booker and Tim Scott, 407 00:21:23,000 --> 00:21:25,800 Speaker 4: who are both black men, got similar levels of abuse 408 00:21:25,840 --> 00:21:29,640 Speaker 4: to white male candidates. However, the attacks against them were 409 00:21:29,640 --> 00:21:34,520 Speaker 4: more likely to be around race, ISD found. So abuse 410 00:21:34,560 --> 00:21:36,920 Speaker 4: directed toward women is more likely to be about their 411 00:21:37,000 --> 00:21:41,320 Speaker 4: gender than abuse targeting men. Abuse targeting men was more 412 00:21:41,400 --> 00:21:44,840 Speaker 4: generalized and focused on their political stances, while the messages 413 00:21:44,880 --> 00:21:47,200 Speaker 4: directed at women are more likely to focus on things 414 00:21:47,240 --> 00:21:52,120 Speaker 4: like their appearance or, like, general competence. Female Democrats received 415 00:21:52,200 --> 00:21:55,639 Speaker 4: ten times more abusive comments than their male counterparts on Facebook, 416 00:21:55,760 --> 00:21:58,760 Speaker 4: and Republican women received twice as many abusive comments as 417 00:21:58,800 --> 00:22:03,040 Speaker 4: Republican men. So it's one of those situations where women 418 00:22:03,080 --> 00:22:04,320 Speaker 4: of color and black women 419 00:22:04,119 --> 00:22:07,199 Speaker 2: are really shouldering the majority of this burden.
420 00:22:07,440 --> 00:22:10,639 Speaker 4: But it's not the same along gender and 421 00:22:10,760 --> 00:22:14,000 Speaker 4: race lines, right? Like, it is very different if you 422 00:22:14,040 --> 00:22:17,119 Speaker 4: are criticizing somebody because of their political stances or 423 00:22:17,119 --> 00:22:20,040 Speaker 4: their policy versus if you're criticizing somebody because of the 424 00:22:20,040 --> 00:22:23,040 Speaker 4: way they look, or because of, like, a 425 00:22:23,160 --> 00:22:26,960 Speaker 4: general suspicion of who they are based on their identity. Like, 426 00:22:27,000 --> 00:22:29,480 Speaker 4: those things are very different. And as this report from 427 00:22:29,520 --> 00:22:32,639 Speaker 4: the Institute for Strategic Dialogue points out, everybody is not 428 00:22:32,680 --> 00:22:34,000 Speaker 4: shouldering that burden equally. 429 00:22:34,480 --> 00:22:40,720 Speaker 1: Yes, and it's very frustrating for a lot of reasons. 430 00:22:40,760 --> 00:22:43,919 Speaker 1: But going back to, like, the whole showing- 431 00:22:43,960 --> 00:22:49,240 Speaker 1: your-sources thing, it's really hard to combat, right? It's 432 00:22:49,280 --> 00:22:53,000 Speaker 1: really hard to fight against this. 433 00:22:53,880 --> 00:22:55,840 Speaker 4: It really is. Like, this is work that I have 434 00:22:55,960 --> 00:23:00,879 Speaker 4: done professionally during various times of my career, and, you know, 435 00:23:01,000 --> 00:23:03,240 Speaker 4: being in the space, we talk a lot about things 436 00:23:03,280 --> 00:23:06,680 Speaker 4: like debunking, which I do think has its place. However, 437 00:23:06,920 --> 00:23:09,600 Speaker 4: it's a pretty frustrating endeavor, because it's like you're playing 438 00:23:09,640 --> 00:23:11,720 Speaker 4: whack-a-mole, right? Like, I used to be the 439 00:23:11,720 --> 00:23:15,560 Speaker 4: person who would have meetings with Facebook or Twitter or 440 00:23:15,640 --> 00:23:18,119 Speaker 4: Reddit and be like, oh well, here's a bunch of 441 00:23:18,160 --> 00:23:21,320 Speaker 4: examples of content that violates your terms of service and 442 00:23:21,520 --> 00:23:25,240 Speaker 4: seems to be violent threats or attacks on public officials 443 00:23:25,280 --> 00:23:27,080 Speaker 4: or women running for office, and they'd be 444 00:23:27,000 --> 00:23:28,040 Speaker 2: like, okay, we'll take those down. 445 00:23:28,080 --> 00:23:30,159 Speaker 4: And it's like, you can't do that forever, right? Like, 446 00:23:30,400 --> 00:23:33,800 Speaker 4: so it is a frustrating thing. And I think, you know, 447 00:23:34,960 --> 00:23:38,359 Speaker 4: you need more than just debunking to combat this stuff. 448 00:23:38,400 --> 00:23:41,119 Speaker 4: You know, gendered disinformation, I don't feel like it can 449 00:23:41,160 --> 00:23:45,640 Speaker 4: really be addressed through just fact-checking or debunking. These 450 00:23:45,680 --> 00:23:51,080 Speaker 4: attacks are often based in character assassination or just unverifiable information, 451 00:23:51,200 --> 00:23:55,399 Speaker 4: and so when that is magnified by the algorithmic preferences of 452 00:23:55,480 --> 00:23:59,120 Speaker 4: platforms that privilege fake and outrageous content 453 00:23:58,840 --> 00:24:01,800 Speaker 2: to enhance profit, we have a real problem.
454 00:24:02,240 --> 00:24:04,439 Speaker 4: It's a tech problem, it's a policy problem, it's a 455 00:24:04,440 --> 00:24:06,760 Speaker 4: democracy problem, it's a cultural problem. 456 00:24:06,920 --> 00:24:09,000 Speaker 2: It's a lot of problems all rolled into one big 457 00:24:09,040 --> 00:24:10,760 Speaker 2: ball, and, like, 458 00:24:10,600 --> 00:24:13,320 Speaker 4: playing whack-a-mole on individual pieces of content just 459 00:24:13,320 --> 00:24:15,560 Speaker 4: ain't going to cut it, right? And so I would 460 00:24:15,600 --> 00:24:20,320 Speaker 4: say that this really needs to be combated by changing 461 00:24:20,680 --> 00:24:24,359 Speaker 4: social media platform policies in the long term, and also 462 00:24:24,880 --> 00:24:28,199 Speaker 4: a cultural understanding of what's going on, right? So not 463 00:24:28,359 --> 00:24:30,680 Speaker 4: just being like, oh well, it's just a platform problem, but 464 00:24:31,080 --> 00:24:34,280 Speaker 4: really understanding that the way that we're 465 00:24:34,320 --> 00:24:37,080 Speaker 4: all thinking about and talking about women who run for 466 00:24:37,160 --> 00:24:40,359 Speaker 4: office and other marginalized people who run for office really 467 00:24:40,359 --> 00:24:42,720 Speaker 4: does play a role in pushing back against this. 468 00:24:42,960 --> 00:24:45,600 Speaker 5: I mean, a part of the conversation is that the 469 00:24:45,680 --> 00:24:49,400 Speaker 5: reason feminism is important, the reason intersectional feminism is important, 470 00:24:49,480 --> 00:24:53,520 Speaker 5: is to undo the misogyny and the conversations that happened 471 00:24:53,520 --> 00:24:57,200 Speaker 5: that allowed this to exist to start with, that allow this type 472 00:24:57,240 --> 00:25:00,320 Speaker 5: of rhetoric to begin with. Like, in this conversation, 473 00:25:00,320 --> 00:25:03,320 Speaker 5: I always go back to thinking how, immediately, 474 00:25:03,359 --> 00:25:06,320 Speaker 5: when a man specifically wants to threaten a woman, 475 00:25:06,480 --> 00:25:09,080 Speaker 5: it's always some kind of sexual violence. That's the immediate 476 00:25:09,119 --> 00:25:11,959 Speaker 5: part; it's, like, their go-to. And we know, when 477 00:25:12,000 --> 00:25:14,120 Speaker 5: it comes to war crimes, that's one of the number 478 00:25:14,160 --> 00:25:17,440 Speaker 5: one things that is used within these wars, which is 479 00:25:17,480 --> 00:25:19,720 Speaker 5: so, like, awful when we start thinking about it, like 480 00:25:19,840 --> 00:25:22,359 Speaker 5: the level of sexual violence that happened to show power 481 00:25:22,359 --> 00:25:24,919 Speaker 5: and control. And all of this has to 482 00:25:24,960 --> 00:25:28,960 Speaker 5: do with the misogynistic take on power. And it's kind 483 00:25:29,000 --> 00:25:30,919 Speaker 5: of one of those questions of, like, how do you 484 00:25:31,000 --> 00:25:35,680 Speaker 5: even begin, especially when it's so deeply rooted that for 485 00:25:36,000 --> 00:25:41,480 Speaker 5: almost... especially, like, I'm gonna say this, like, uneducated, ignorant people, 486 00:25:41,560 --> 00:25:43,600 Speaker 5: that's their go-to. Like, that's the first thing, is 487 00:25:43,640 --> 00:25:46,479 Speaker 5: that you deserve to be, like, raped. Like, that's their comment, 488 00:25:46,680 --> 00:25:50,360 Speaker 5: and you see it so many times that it becomes normalized.
489 00:25:50,440 --> 00:25:53,560 Speaker 4: Yeah, I mean, how many of us... like, I know 490 00:25:53,640 --> 00:25:54,600 Speaker 4: that I've experienced this. 491 00:25:54,960 --> 00:25:56,040 Speaker 2: I'm sure that you all have. 492 00:25:56,119 --> 00:25:58,320 Speaker 4: I'm sure that a lot of people listening... how many 493 00:25:58,359 --> 00:26:01,160 Speaker 4: of us have had some creep say, like, I hope 494 00:26:01,200 --> 00:26:03,600 Speaker 4: you get raped, you deserve to be raped. Like, the 495 00:26:03,640 --> 00:26:06,719 Speaker 4: fact that that is hurled at us as a threat 496 00:26:06,960 --> 00:26:09,840 Speaker 4: when we speak up and use our voices, I think 497 00:26:09,880 --> 00:26:12,560 Speaker 4: that that illustrates exactly what you're talking about. 498 00:26:12,560 --> 00:26:14,000 Speaker 2: Mm-hmm. 499 00:26:14,160 --> 00:26:23,680 Speaker 1: Yeah. So what can we do then to address 500 00:26:23,720 --> 00:26:26,359 Speaker 1: this online, perhaps? 501 00:26:26,840 --> 00:26:29,520 Speaker 4: So the first thing I would say is really understanding 502 00:26:29,680 --> 00:26:32,680 Speaker 4: the kinds of attacks that stick and letting that kind 503 00:26:32,680 --> 00:26:35,280 Speaker 4: of inform your thinking about how you respond. 504 00:26:35,359 --> 00:26:37,360 Speaker 2: Right, because not all of these 505 00:26:37,200 --> 00:26:39,720 Speaker 4: tactics are created equal. So let's look at the ones 506 00:26:39,720 --> 00:26:43,439 Speaker 4: that are actually effective at impacting voting behavior. Attacks that 507 00:26:43,480 --> 00:26:46,240 Speaker 4: seem to pack the biggest punch on voter behavior focus 508 00:26:46,280 --> 00:26:51,520 Speaker 4: on character, things like trust, qualifications, likability, and control, with 509 00:26:51,600 --> 00:26:54,840 Speaker 4: a constant backdrop of these, like, sexualized attacks that we were 510 00:26:54,840 --> 00:26:56,920 Speaker 4: just talking about. So you can sort of think of 511 00:26:56,960 --> 00:27:00,320 Speaker 4: these attacks as manifesting in the idea that 512 00:27:00,160 --> 00:27:01,880 Speaker 2: candidates are untrustworthy. 513 00:27:01,960 --> 00:27:06,679 Speaker 4: So women are liars or hypocrites or too ambitious, right? That, 514 00:27:06,840 --> 00:27:09,080 Speaker 4: like, the very fact that this woman is running for 515 00:27:09,160 --> 00:27:12,040 Speaker 4: office means that there is something off about her. Like, 516 00:27:12,119 --> 00:27:14,080 Speaker 4: you can't trust somebody who wants it too much, 517 00:27:14,200 --> 00:27:17,480 Speaker 2: right? And then meanwhile it's like, oh, so did, like, 518 00:27:17,880 --> 00:27:21,000 Speaker 4: the male candidate just, like, accidentally fill out the paperwork 519 00:27:21,000 --> 00:27:23,320 Speaker 4: to run and just, like, accidentally find themselves on a 520 00:27:23,359 --> 00:27:27,359 Speaker 4: podium? Like... And those attacks really work, because trust is 521 00:27:27,400 --> 00:27:30,960 Speaker 4: obviously a major factor in vote choice. Another one is 522 00:27:30,960 --> 00:27:35,480 Speaker 4: that women are unqualified: women are stupid, weak, incapable, and inexperienced. 523 00:27:35,560 --> 00:27:37,640 Speaker 2: Right. So historically, 524 00:27:37,760 --> 00:27:41,320 Speaker 4: women must meet a higher threshold to prove their qualifications. 525 00:27:42,800 --> 00:27:46,280 Speaker 2: Another is that women are just unlikable. I just don't 526 00:27:46,359 --> 00:27:46,640 Speaker 2: like her.
527 00:27:46,800 --> 00:27:48,359 Speaker 4: I don't like her face, I don't like what she 528 00:27:48,400 --> 00:27:50,720 Speaker 4: stands for, she's difficult to get along with. 529 00:27:50,880 --> 00:27:52,600 Speaker 2: This is one that, like, you see a lot with 530 00:27:52,640 --> 00:27:53,320 Speaker 2: women of color. 531 00:27:54,320 --> 00:27:57,639 Speaker 4: You know, they're angry, or, like, never satisfied, or, like, 532 00:27:57,760 --> 00:28:00,199 Speaker 4: what a sourpuss. No one likes them. They have 533 00:28:00,240 --> 00:28:03,680 Speaker 4: annoying personalities. Like, women leaders are expected to be agreeable. 534 00:28:04,160 --> 00:28:07,440 Speaker 4: I remember when Elizabeth Warren was running for president, one 535 00:28:07,480 --> 00:28:09,800 Speaker 4: of the attacks that her campaign really saw a lot 536 00:28:09,880 --> 00:28:15,320 Speaker 4: is that she was, like, too prepared and would evoke 537 00:28:15,119 --> 00:28:16,800 Speaker 2: plans for too many things. 538 00:28:16,800 --> 00:28:18,199 Speaker 4: So if you asked her a question, she'd be like, oh, 539 00:28:18,240 --> 00:28:20,520 Speaker 4: here's my plan on that, or, like, here's my policy 540 00:28:20,520 --> 00:28:23,800 Speaker 4: paper on that, and people were like, I don't like it. Too... 541 00:28:24,520 --> 00:28:26,080 Speaker 1: I can't be with her. 542 00:28:29,440 --> 00:28:30,879 Speaker 2: It really... I mean, like, if 543 00:28:30,760 --> 00:28:34,679 Speaker 4: you're... if you were a woman who, like, dominated and 544 00:28:34,840 --> 00:28:37,720 Speaker 4: led the group project in class, I don't need to 545 00:28:37,720 --> 00:28:40,440 Speaker 4: tell you that sometimes, when you're really good and really 546 00:28:40,520 --> 00:28:44,800 Speaker 4: qualified and know what you're doing, uh, that will be 547 00:28:44,840 --> 00:28:45,960 Speaker 4: seen as an attack on you, 548 00:28:46,040 --> 00:28:47,840 Speaker 2: or, like, used against 549 00:28:47,440 --> 00:28:48,800 Speaker 4: you, to be like, I just don't like her, even 550 00:28:48,840 --> 00:28:51,040 Speaker 4: though she, like, did all the work on this project, 551 00:28:51,080 --> 00:28:53,080 Speaker 4: and I would have gotten an F without her. I 552 00:28:53,080 --> 00:28:57,840 Speaker 4: don't like it. And lastly, that they're uncontrollable. They're too angry, 553 00:28:57,920 --> 00:29:02,520 Speaker 4: too crazy, too hormonal, too evil, all of that, that 554 00:29:02,560 --> 00:29:06,720 Speaker 4: they're too whatever to be voted into office. And you 555 00:29:06,720 --> 00:29:09,120 Speaker 4: can really see how these are, like, things that kind 556 00:29:09,120 --> 00:29:12,480 Speaker 4: of... there's really no way to win, because they 557 00:29:12,560 --> 00:29:17,240 Speaker 4: are sexually promiscuous or they're uneffable, right? So it's like, oh, 558 00:29:17,240 --> 00:29:21,480 Speaker 4: if you're too sexual or not sexual enough, both of 559 00:29:21,520 --> 00:29:23,480 Speaker 4: those are, like, attacks. So it's like, what are you 560 00:29:23,480 --> 00:29:25,040 Speaker 4: supposed to do? Like, there's no way to 561 00:29:25,000 --> 00:29:28,640 Speaker 1: win, right? A lot of these are conflicting 562 00:29:28,680 --> 00:29:30,520 Speaker 1: with each other. It's like, you're too ambitious, but you 563 00:29:30,600 --> 00:29:34,640 Speaker 1: don't know what you're doing. Like, it's like, too... okay, 564 00:29:36,080 --> 00:29:39,080 Speaker 1: I can't... Too prepared? I guess that's bad too. Okay.
565 00:29:38,760 --> 00:29:44,120 Speaker 4: Cool. Absolutely. And so, like, all of these attacks 566 00:29:44,240 --> 00:29:47,560 Speaker 4: work and they stick, despite being, like, contradictory or, like, 567 00:29:47,760 --> 00:29:51,440 Speaker 4: seemingly really silly. They work because of this larger system 568 00:29:51,560 --> 00:29:55,800 Speaker 4: of unchecked misogyny, misogynoir, and bias against women that 569 00:29:55,920 --> 00:29:58,680 Speaker 4: is worse for marginalized women. And so the reason why 570 00:29:58,800 --> 00:30:02,160 Speaker 4: these are effective, and actually, you know, not just existing 571 00:30:02,160 --> 00:30:06,160 Speaker 4: out there but actually impacting voting behavior, is because of 572 00:30:06,200 --> 00:30:10,160 Speaker 4: this larger cultural climate of bias against women and marginalized people. 573 00:30:10,360 --> 00:30:13,240 Speaker 1: Absolutely, and I know a lot of us have experienced 574 00:30:13,320 --> 00:30:26,000 Speaker 1: that just in our lives, just seen it. Do you 575 00:30:26,040 --> 00:30:29,360 Speaker 1: have any tips on how we can counter this or 576 00:30:29,360 --> 00:30:30,280 Speaker 1: fight back against this? 577 00:30:30,640 --> 00:30:31,120 Speaker 2: Totally. 578 00:30:31,200 --> 00:30:34,200 Speaker 4: So the biggest one, I think, is just to share positive, 579 00:30:34,360 --> 00:30:38,520 Speaker 4: accurate content about women's participation in civic and political life. 580 00:30:38,880 --> 00:30:41,440 Speaker 4: You know, I would say the most effective form of 581 00:30:41,440 --> 00:30:46,840 Speaker 4: inoculation against this is not, like, debunking each specific individual attack. 582 00:30:47,120 --> 00:30:50,800 Speaker 4: It is demonstrating that women belong in political life, that 583 00:30:50,840 --> 00:30:55,680 Speaker 4: women are good leaders. So really demonstrating women as trustworthy, qualified, 584 00:30:55,760 --> 00:30:58,240 Speaker 4: and competent. That's the kind of content that we need 585 00:30:58,280 --> 00:31:01,640 Speaker 4: to really correct the narrative and protect leaders and emphasize 586 00:31:01,640 --> 00:31:05,560 Speaker 4: the need for women and marginalized people's place in political involvement. 587 00:31:06,120 --> 00:31:07,920 Speaker 4: So take all of that to mean that probably the 588 00:31:07,960 --> 00:31:09,480 Speaker 4: worst thing that you can do, if you're trying to 589 00:31:09,480 --> 00:31:13,880 Speaker 4: combat this kind of thing, is to directly respond to or 590 00:31:13,960 --> 00:31:17,520 Speaker 4: engage with individual attacks, because that could amplify them or 591 00:31:17,600 --> 00:31:19,840 Speaker 4: legitimize them. This is something that I think is really 592 00:31:19,880 --> 00:31:22,320 Speaker 4: tough and I have to check myself on, because when 593 00:31:22,320 --> 00:31:25,760 Speaker 4: I'm scrolling social media and I see, like, a lie 594 00:31:25,800 --> 00:31:28,600 Speaker 4: about a woman, my first instinct is to reply and 595 00:31:28,640 --> 00:31:31,920 Speaker 4: be like, that's not true, blah blah blah. But because 596 00:31:31,960 --> 00:31:34,560 Speaker 4: of the way that algorithms work, you actually might be 597 00:31:34,640 --> 00:31:37,400 Speaker 4: amplifying that, because the algorithm is like, oh, you're engaging 598 00:31:37,440 --> 00:31:38,800 Speaker 4: with this, it must be good content. 599 00:31:38,920 --> 00:31:40,960 Speaker 2: Let me show it to more people.
So that is 600 00:31:41,080 --> 00:31:41,960 Speaker 2: not what you want to do. 601 00:31:43,040 --> 00:31:47,560 Speaker 4: Instead, push out positive, proactive counter messaging that supports women 602 00:31:47,640 --> 00:31:51,120 Speaker 4: leaders as qualified and also acknowledges the kinds of identity 603 00:31:51,160 --> 00:31:53,400 Speaker 4: based harms that they are facing. So if there's a 604 00:31:53,440 --> 00:31:56,120 Speaker 4: particular woman candidate that you're like, yay, I like 605 00:31:56,160 --> 00:32:00,720 Speaker 4: this person, emphasize that woman's credentials, her expertise, her background, 606 00:32:00,800 --> 00:32:04,480 Speaker 4: and her shared values. According to research from Gretchen Barton 607 00:32:04,520 --> 00:32:07,840 Speaker 4: of the Worthy Strategy Group, there are six intangible qualities 608 00:32:07,840 --> 00:32:09,920 Speaker 4: that Americans look for in their leaders. They want somebody 609 00:32:09,960 --> 00:32:13,400 Speaker 4: who is a challenger, a nurturer, an innovator, strong and stable, 610 00:32:13,440 --> 00:32:16,200 Speaker 4: visible, and who meets the moment, right? And so there 611 00:32:16,280 --> 00:32:19,760 Speaker 4: is a huge disparity, with women being underrepresented in 612 00:32:19,800 --> 00:32:23,560 Speaker 4: each of these categories that Americans cite when they're asked 613 00:32:23,560 --> 00:32:25,200 Speaker 4: to envision what they want in a leader. And so 614 00:32:25,240 --> 00:32:27,800 Speaker 4: we should really be working to flip the script a 615 00:32:27,840 --> 00:32:30,600 Speaker 4: little bit and show all the different ways that these 616 00:32:30,720 --> 00:32:35,240 Speaker 4: dynamic women leaders who dot our political landscape fit those 617 00:32:35,280 --> 00:32:37,880 Speaker 4: things that Americans say they want out of their leaders, right? 618 00:32:37,920 --> 00:32:42,040 Speaker 4: So highlighting the ways that women are innovators, highlighting the 619 00:32:42,040 --> 00:32:46,120 Speaker 4: ways that women are strong and stable and visible, highlighting 620 00:32:46,120 --> 00:32:48,680 Speaker 4: the ways that women really are the ones who are 621 00:32:49,000 --> 00:32:53,320 Speaker 4: rising up to meet the moment and challenging the status quo, right? 622 00:32:53,760 --> 00:32:57,080 Speaker 4: All of these things that we know make women great leaders. 623 00:32:57,600 --> 00:33:00,479 Speaker 4: A good way to challenge the lies that women are 624 00:33:00,520 --> 00:33:03,720 Speaker 4: facing in our current media climate is to highlight 625 00:33:03,280 --> 00:33:04,120 Speaker 2: the truth about the 626 00:33:04,040 --> 00:33:06,280 Speaker 4: fact that women do make really good leaders and that 627 00:33:06,360 --> 00:33:10,200 Speaker 4: we want a pluralistic society where women are well represented. 628 00:33:10,960 --> 00:33:15,880 Speaker 1: Yes, and it's really unfortunate, because there are examples of this, 629 00:33:15,920 --> 00:33:20,640 Speaker 1: plenty of examples of women being good leaders, but it 630 00:33:20,760 --> 00:33:27,000 Speaker 1: always feels like either they're ignored or, like, if I'm 631 00:33:27,040 --> 00:33:29,160 Speaker 1: thinking of cases from outside the United States, it's, well, that's 632 00:33:29,200 --> 00:33:32,720 Speaker 1: not the United States, though; like, she's a good leader from Europe. 633 00:33:34,120 --> 00:33:40,440 Speaker 1: Okay, New Zealand, honey. Okay, cool.
I don't like that. 634 00:33:40,840 --> 00:33:45,560 Speaker 1: One thing I have to say: I have had some 635 00:33:45,680 --> 00:33:51,560 Speaker 1: experiences where I've tried with liberal white men to do this, 636 00:33:52,240 --> 00:33:55,880 Speaker 1: and let me tell you how angry they got. I 637 00:33:56,120 --> 00:33:59,600 Speaker 1: was shocked. So that's, like, to me, 638 00:33:59,720 --> 00:34:01,920 Speaker 1: a wake-up call of how embedded this is, 639 00:34:01,960 --> 00:34:05,160 Speaker 1: that I thought they would at least hear me out, 640 00:34:06,120 --> 00:34:07,000 Speaker 1: and they did not. 641 00:34:07,920 --> 00:34:10,200 Speaker 2: Yeah, I mean, I hate to say it. 642 00:34:10,280 --> 00:34:12,799 Speaker 4: I have had the same experience, and I think there 643 00:34:12,800 --> 00:34:15,359 Speaker 4: are a lot of men in my life that think 644 00:34:15,360 --> 00:34:19,000 Speaker 4: of themselves as, like, progressive or radical or lefty or, 645 00:34:19,040 --> 00:34:23,399 Speaker 4: like, promoters of democracy. But then even they don't see 646 00:34:23,440 --> 00:34:28,680 Speaker 4: the ways that misogyny and sexism threaten our democracy, even 647 00:34:28,680 --> 00:34:30,360 Speaker 4: as those things are so clear to me. And so 648 00:34:30,560 --> 00:34:35,399 Speaker 4: I've really experienced what you're highlighting, that, like, even from 649 00:34:35,480 --> 00:34:38,440 Speaker 4: men who are, like, ostensibly with it. 650 00:34:38,440 --> 00:34:40,600 Speaker 2: It's just... I think... I don't think that 651 00:34:40,600 --> 00:34:41,719 Speaker 2: men see it that way all the time. 652 00:34:41,760 --> 00:34:44,160 Speaker 4: I just think that they're like, oh well, 653 00:34:44,520 --> 00:34:48,840 Speaker 4: sexism ended back in whatever, and, like, it's just not 654 00:34:48,920 --> 00:34:51,560 Speaker 4: a topic that impacts our lives anymore, when 655 00:34:51,560 --> 00:34:55,000 Speaker 4: it so impacts our lives. And I guess that's one 656 00:34:55,040 --> 00:34:57,759 Speaker 4: of the reasons why it's so insidious and why it 657 00:34:57,800 --> 00:35:00,440 Speaker 4: takes such intentionality to combat it, because these things are 658 00:35:00,480 --> 00:35:03,520 Speaker 4: so ingrained that people might not even realize the ways 659 00:35:03,560 --> 00:35:05,160 Speaker 4: that they are contributing to it. 660 00:35:05,719 --> 00:35:07,759 Speaker 1: Yep. I had a guy tell me there was no 661 00:35:07,880 --> 00:35:10,880 Speaker 1: sexism in the twenty sixteen election, and I was like, 662 00:35:11,760 --> 00:35:13,759 Speaker 1: you do know what I do for a living, right? 663 00:35:14,640 --> 00:35:17,520 Speaker 1: And he's like, I just don't see it. Okay, cool, I'm 664 00:35:17,520 --> 00:35:22,960 Speaker 1: gonna leave, because I'm gonna get real mad. But just... 665 00:35:23,480 --> 00:35:27,360 Speaker 1: I don't agree. And a part of that is this 666 00:35:27,480 --> 00:35:31,040 Speaker 1: idea that we shouldn't talk about it, or we should ignore it, 667 00:35:31,760 --> 00:35:34,320 Speaker 1: which is not really helpful. 668 00:35:34,600 --> 00:35:38,359 Speaker 2: No, yeah. I mean, I'm guilty of this too.
669 00:35:38,560 --> 00:35:41,080 Speaker 4: There used to be this attitude that, you know, the 670 00:35:41,120 --> 00:35:43,919 Speaker 4: only way to counter these unfair attacks was to ignore them, 671 00:35:44,000 --> 00:35:47,400 Speaker 4: right? Like, when they go low, we go high. Like, 672 00:35:47,560 --> 00:35:50,919 Speaker 4: don't mention it, it'll go away. I hate 673 00:35:50,960 --> 00:35:53,160 Speaker 4: to say that that is really a page from an old, 674 00:35:53,200 --> 00:35:57,720 Speaker 4: outdated playbook. Research really suggests now that ignoring these attacks 675 00:35:57,480 --> 00:35:58,560 Speaker 2: just doesn't make them go away. 676 00:35:58,640 --> 00:36:02,000 Speaker 4: It just allows them to fester, get amplified, and become 677 00:36:02,120 --> 00:36:05,360 Speaker 4: legitimized and become part of our political discourse. 678 00:36:06,840 --> 00:36:08,080 Speaker 2: And it could also kind of 679 00:36:08,160 --> 00:36:11,759 Speaker 4: backfire for women who are running for office, because voters 680 00:36:11,800 --> 00:36:15,080 Speaker 4: want to see, like, strength and backbone and a candidate 681 00:36:15,160 --> 00:36:16,799 Speaker 4: meeting the moment. And so if you're like, I'm not 682 00:36:16,800 --> 00:36:19,720 Speaker 4: going to respond to that, that can sort of backfire, 683 00:36:19,760 --> 00:36:21,720 Speaker 4: because it's like, oh well, why isn't she meeting the moment? 684 00:36:22,320 --> 00:36:25,840 Speaker 4: So that doesn't mean that, like, a woman candidate needs 685 00:36:25,880 --> 00:36:28,960 Speaker 4: to get mired down in the bog of, 686 00:36:29,000 --> 00:36:33,399 Speaker 4: like, defending herself against unfair lies and attacks. But it's 687 00:36:33,440 --> 00:36:38,120 Speaker 4: really about emphasizing your accomplishments and qualifications in a way 688 00:36:38,120 --> 00:36:41,839 Speaker 4: that corrects harmful narratives without repeating those narratives and without 689 00:36:41,920 --> 00:36:44,400 Speaker 4: legitimizing those narratives. It's a little bit of a 690 00:36:44,480 --> 00:36:47,239 Speaker 4: tightrope walk to do it correctly. And again, like, I 691 00:36:47,360 --> 00:36:49,520 Speaker 4: just want to emphasize that it's something that we should 692 00:36:49,560 --> 00:36:52,920 Speaker 4: not have to do. Like, certainly male candidates are not 693 00:36:53,040 --> 00:36:55,960 Speaker 4: having to prepare counter messaging in this way, to be like, well, 694 00:36:56,000 --> 00:36:57,920 Speaker 4: I've got to respond to this attack, but in a 695 00:36:57,960 --> 00:37:00,759 Speaker 4: way that emphasizes my values and doesn't legitimize it. 696 00:37:00,800 --> 00:37:01,760 Speaker 2: And also, like... 697 00:37:02,080 --> 00:37:06,160 Speaker 4: It's a minefield that not everybody is asked to wade through. 698 00:37:06,200 --> 00:37:07,880 Speaker 4: So I just want to acknowledge that we should not 699 00:37:07,960 --> 00:37:10,719 Speaker 4: have to deal with this. However, I have seen a 700 00:37:10,719 --> 00:37:14,720 Speaker 4: lot of candidates who use these kinds of unfair identity 701 00:37:14,760 --> 00:37:19,000 Speaker 4: based attacks as an opportunity to really create community and 702 00:37:19,120 --> 00:37:22,279 Speaker 4: contrast with the people who are hurling these attacks at them.
703 00:37:22,360 --> 00:37:25,759 Speaker 4: A good example is the former Canadian Minister of 704 00:37:25,800 --> 00:37:29,759 Speaker 4: Environment and Climate Change, Catherine McKenna, who exemplified this advice when she 705 00:37:29,800 --> 00:37:33,440 Speaker 4: pushed back against this hashtag that she was facing 706 00:37:34,239 --> 00:37:38,200 Speaker 4: that labeled her Climate Barbie, which resulted in death threats. 707 00:37:38,360 --> 00:37:41,720 Speaker 4: To respond, she tweeted: do you use that sexist 708 00:37:41,800 --> 00:37:44,880 Speaker 4: garbage about your daughter, mother, and sister? We need more 709 00:37:44,920 --> 00:37:48,279 Speaker 4: women in politics. Your sexist comments won't stop us. And 710 00:37:48,320 --> 00:37:51,239 Speaker 4: so that's a really interesting way she responds to 711 00:37:51,320 --> 00:37:55,879 Speaker 4: an unfair sexist attack that targets her gender: she makes 712 00:37:55,880 --> 00:37:58,239 Speaker 4: it about all the other women and girls who are 713 00:37:58,280 --> 00:38:00,640 Speaker 4: watching this kind of garbage, and being like, don't 714 00:38:00,680 --> 00:38:03,520 Speaker 4: we want a climate that supports these women and girls 715 00:38:03,719 --> 00:38:04,680 Speaker 4: to be in politics? 716 00:38:04,680 --> 00:38:05,880 Speaker 2: Like, you know, these 717 00:38:05,760 --> 00:38:08,080 Speaker 4: kinds of comments go against that, and they're not 718 00:38:08,080 --> 00:38:10,320 Speaker 4: going to stop us, right? So, like, I have seen 719 00:38:10,400 --> 00:38:14,560 Speaker 4: candidates that really challenge this in some unique ways. 720 00:38:15,320 --> 00:38:18,200 Speaker 1: Yeah, and going back to your point of, like, we 721 00:38:18,200 --> 00:38:21,359 Speaker 1: shouldn't have to be doing this at all. And this 722 00:38:21,440 --> 00:38:23,240 Speaker 1: is a topic I think about a lot in terms 723 00:38:23,239 --> 00:38:28,560 Speaker 1: of so many, so many things. But it's not just the 724 00:38:29,640 --> 00:38:33,400 Speaker 1: women who are being targeted that should have to speak 725 00:38:33,480 --> 00:38:34,640 Speaker 1: up or say something. 726 00:38:34,800 --> 00:38:37,319 Speaker 4: I'm so glad that you brought that up, because, I mean, 727 00:38:37,320 --> 00:38:39,440 Speaker 4: how many of us have been in a situation where 728 00:38:39,600 --> 00:38:42,279 Speaker 4: somebody says something or does something and you're like, oh, 729 00:38:42,440 --> 00:38:43,719 Speaker 4: I don't want to be the one that has to 730 00:38:43,760 --> 00:38:49,200 Speaker 4: be like, well, that's actually offensive, or, like, that's actually 731 00:38:49,239 --> 00:38:50,400 Speaker 4: not a cool thing 732 00:38:50,280 --> 00:38:51,920 Speaker 2: to say in the workplace. 733 00:38:52,120 --> 00:38:54,399 Speaker 4: And it really shouldn't just be on the person who 734 00:38:54,480 --> 00:38:58,080 Speaker 4: is targeted to respond to this, because challenging a sexist 735 00:38:58,080 --> 00:39:00,759 Speaker 4: climate really does take all of us speaking up. 736 00:39:01,000 --> 00:39:04,240 Speaker 4: So a good example of this is Fani Willis, who 737 00:39:04,400 --> 00:39:07,759 Speaker 4: I think y'all might know. I feel 738 00:39:07,760 --> 00:39:10,440 Speaker 4: like in Atlanta she might be somebody who's on y'all's radar.
739 00:39:10,640 --> 00:39:13,919 Speaker 5: Oh yes, she just killed it in the primary, so we're 740 00:39:14,040 --> 00:39:14,680 Speaker 5: very excited. 741 00:39:15,960 --> 00:39:20,040 Speaker 4: So Willis is the Fulton County District Attorney prosecuting the 742 00:39:20,040 --> 00:39:24,640 Speaker 4: Georgia Trump election case, constantly under attack, right? Really showing 743 00:39:24,719 --> 00:39:27,600 Speaker 4: how black women are attacked and undermined, and how that 744 00:39:27,680 --> 00:39:31,080 Speaker 4: is a strategy to undermine the democratic process. So it 745 00:39:31,120 --> 00:39:34,399 Speaker 4: was a really good example. On February fifteenth, Willis took 746 00:39:34,400 --> 00:39:38,680 Speaker 4: the witness stand, and right-wing actors, including Fox News, surprise, surprise, 747 00:39:39,120 --> 00:39:43,480 Speaker 4: circulated this meme that suggested that she had worn 748 00:39:43,800 --> 00:39:46,880 Speaker 4: her dress backward. So this is the caption that Fox 749 00:39:46,920 --> 00:39:49,520 Speaker 4: News put out: Fani set to take the hot seat 750 00:39:49,560 --> 00:39:53,080 Speaker 4: again after day one packed with yelling, a wink, and 751 00:39:53,120 --> 00:39:54,760 Speaker 2: surprise backward dress. 752 00:39:55,080 --> 00:39:57,520 Speaker 4: So basically they were saying, this black woman is so 753 00:39:57,680 --> 00:40:00,719 Speaker 4: stupid and incompetent that, can you believe, she wore 754 00:40:00,760 --> 00:40:01,720 Speaker 4: her dress backward? 755 00:40:01,880 --> 00:40:04,240 Speaker 2: What an idiot! Who would ever believe in her leadership? 756 00:40:04,320 --> 00:40:04,480 Speaker 4: Right? 757 00:40:04,520 --> 00:40:06,200 Speaker 2: Like, that was the substance of the attack. 758 00:40:06,600 --> 00:40:11,120 Speaker 4: However, a guy on TikTok delegitimized that attack, which had 759 00:40:11,160 --> 00:40:14,400 Speaker 4: broken into mainstream media, by pointing out in a TikTok 760 00:40:14,440 --> 00:40:17,040 Speaker 4: that, like, actually, here's the dress, and she was wearing 761 00:40:17,120 --> 00:40:19,920 Speaker 4: it correctly, right? So this was, like, nonsense that a 762 00:40:20,040 --> 00:40:24,759 Speaker 4: fairly mainstream news outlet amplified, just completely wrong. He found 763 00:40:24,760 --> 00:40:27,120 Speaker 4: the dress; she's wearing it perfectly correctly. 764 00:40:27,640 --> 00:40:28,480 Speaker 2: While he did 765 00:40:28,320 --> 00:40:32,680 Speaker 4: that, other institutions put out positive counter content, avoiding the 766 00:40:32,719 --> 00:40:36,479 Speaker 4: direct attacks while emphasizing Willis's leadership and qualifications in order 767 00:40:36,520 --> 00:40:39,040 Speaker 4: to counter them. So it was this, like, two-pronged 768 00:40:39,080 --> 00:40:42,439 Speaker 4: thing, where the TikTok messenger was able to be like, look 769 00:40:42,440 --> 00:40:46,200 Speaker 4: how ridiculous this attack is, y'all, while other institutions pushed 770 00:40:46,200 --> 00:40:50,719 Speaker 4: out positive messaging about Willis's leadership. And so really it 771 00:40:50,719 --> 00:40:54,040 Speaker 4: should not be on just her, the person targeted, to 772 00:40:54,640 --> 00:40:57,520 Speaker 4: respond to these garbage lies about her character and her 773 00:40:57,600 --> 00:41:00,840 Speaker 4: values and her leadership.
It really takes all of us to create 774 00:41:00,880 --> 00:41:03,440 Speaker 4: a culture where sexism and misogyny are not 775 00:41:03,560 --> 00:41:05,479 Speaker 4: able to thrive and take root in this way. 776 00:41:06,400 --> 00:41:08,520 Speaker 5: Yes, and I would like to say Atlanta backed her. 777 00:41:08,680 --> 00:41:10,759 Speaker 5: Like I said, she killed it in the primaries. 778 00:41:10,800 --> 00:41:11,600 Speaker 3: We all celebrated. 779 00:41:14,080 --> 00:41:16,560 Speaker 5: We love her at this point. Like, of course, you know, 780 00:41:16,600 --> 00:41:20,080 Speaker 5: when it comes to judicial systems in the South, it 781 00:41:20,080 --> 00:41:23,279 Speaker 5: can get ugly. But when you have a case like 782 00:41:23,320 --> 00:41:27,040 Speaker 5: this and you realize you have someone who is firm, 783 00:41:27,360 --> 00:41:31,120 Speaker 5: we love it. Like, it's representing what we 784 00:41:31,160 --> 00:41:33,440 Speaker 5: want to see in Atlanta, which is: we're not taking 785 00:41:33,440 --> 00:41:35,040 Speaker 5: that from you, so move on. 786 00:41:37,800 --> 00:41:41,960 Speaker 1: Yeah. And so many of these attacks that 787 00:41:41,960 --> 00:41:46,280 Speaker 1: we're seeing right now are also distractions from, like, real 788 00:41:46,320 --> 00:41:50,160 Speaker 1: things we should be talking about. And it's, I mean, 789 00:41:50,520 --> 00:41:53,680 Speaker 1: even if she were wearing her dress backwards, it does have, 790 00:41:53,760 --> 00:41:57,000 Speaker 1: as you said... what they're trying to signal is 791 00:41:57,040 --> 00:42:01,360 Speaker 1: she is incompetent. But it's also just so, like, okay? 792 00:42:02,480 --> 00:42:05,200 Speaker 1: But it's really important that we talk about this, because 793 00:42:05,200 --> 00:42:08,160 Speaker 1: we are in an election year, an extremely important election year, 794 00:42:08,760 --> 00:42:12,360 Speaker 1: and all of this stuff is very dangerous. 795 00:42:13,000 --> 00:42:16,200 Speaker 2: Yeah. And I think, like, if you're listening and you're like, 796 00:42:16,160 --> 00:42:19,200 Speaker 4: well, I don't like Willis, I don't like Harris, I 797 00:42:19,239 --> 00:42:24,160 Speaker 4: don't like Pelosi, that is your right. However, you should 798 00:42:24,200 --> 00:42:27,120 Speaker 4: be, more than anybody, wanting to have a climate where 799 00:42:27,719 --> 00:42:30,279 Speaker 4: we are able to talk about these leaders and their 800 00:42:30,360 --> 00:42:33,040 Speaker 4: qualifications and their records in a way that is fair 801 00:42:33,080 --> 00:42:33,600 Speaker 4: and honest. 802 00:42:33,680 --> 00:42:36,640 Speaker 2: Right. Like, when the climate 803 00:42:36,360 --> 00:42:40,000 Speaker 4: is cluttered up with all kinds of garbage sexist lies 804 00:42:40,040 --> 00:42:43,960 Speaker 4: and attacks, you can't actually have an honest conversation about 805 00:42:44,239 --> 00:42:47,120 Speaker 4: why these leaders are failing us or why these leaders 806 00:42:47,160 --> 00:42:49,920 Speaker 4: are not failing us. It just creates a dynamic where, 807 00:42:49,920 --> 00:42:53,160 Speaker 4: like, we're not having a conversation about the actual thing. 808 00:42:53,239 --> 00:42:57,800 Speaker 4: We're talking about backward dresses or whether or not somebody's 809 00:42:57,800 --> 00:42:59,719 Speaker 4: face looks weird, right? Like, we're just not having a 810 00:42:59,719 --> 00:43:03,000 Speaker 4: substantive conversation.
And it does not just threaten the women 811 00:43:03,040 --> 00:43:05,719 Speaker 4: who are targeted; it threatens all of us. Undermining women 812 00:43:05,800 --> 00:43:09,040 Speaker 4: leaders is part of a larger anti-democracy agenda, right? 813 00:43:09,040 --> 00:43:11,520 Speaker 4: It is not just about these individual women. It is 814 00:43:11,520 --> 00:43:15,680 Speaker 4: about threatening our democracy. And as you were saying earlier, Annie, like, 815 00:43:16,040 --> 00:43:19,240 Speaker 4: it is unfortunate that it is very effective, because people 816 00:43:19,480 --> 00:43:21,319 Speaker 4: might not always be able to see the ways that 817 00:43:21,360 --> 00:43:25,360 Speaker 4: sexism and misogyny and misogynoir really threaten the fabric 818 00:43:25,400 --> 00:43:28,640 Speaker 4: of our democracy. It is this insidious thing that can 819 00:43:28,680 --> 00:43:31,359 Speaker 4: operate in plain sight, even amongst people who think 820 00:43:31,400 --> 00:43:33,799 Speaker 4: of themselves as people who want to protect democracy. 821 00:43:33,800 --> 00:43:36,000 Speaker 2: And so we really need to get a handle on it. 822 00:43:36,040 --> 00:43:38,120 Speaker 4: We need to talk about it honestly, and we need 823 00:43:38,160 --> 00:43:40,200 Speaker 4: to listen to the people who are targeted and come 824 00:43:40,239 --> 00:43:42,520 Speaker 4: forward to talk about what they have dealt with just 825 00:43:42,560 --> 00:43:45,759 Speaker 4: for trying to serve their country or be part of 826 00:43:45,840 --> 00:43:47,320 Speaker 4: public or civic life. 827 00:43:47,840 --> 00:43:52,000 Speaker 1: Absolutely, absolutely. And I'm sure we'll be talking 828 00:43:52,000 --> 00:43:58,480 Speaker 1: about the election for several episodes. But it's also, I 829 00:43:58,880 --> 00:44:02,280 Speaker 1: would say sadly, plenty of my women friends 830 00:44:02,360 --> 00:44:03,680 Speaker 1: who are like, well, I don't want to vote for 831 00:44:03,719 --> 00:44:07,200 Speaker 1: her, because... honestly, it sounds like, I'm scared 832 00:44:07,239 --> 00:44:10,520 Speaker 1: to vote for her, because what if she's bad? Or, 833 00:44:10,680 --> 00:44:13,000 Speaker 1: like, I'm hearing all this stuff about her and she 834 00:44:13,120 --> 00:44:19,200 Speaker 1: must be bad. So it's a really, really toxic environment 835 00:44:19,320 --> 00:44:21,600 Speaker 1: for, like, actual conversations and democracy. 836 00:44:21,719 --> 00:44:27,040 Speaker 4: So yeah. I mean, I remember... like, I have felt that. 837 00:44:27,239 --> 00:44:29,959 Speaker 4: Like, I have been... I have been like, oh, I'm 838 00:44:30,040 --> 00:44:34,040 Speaker 4: worried about having XYZ marginalized person in public office, because 839 00:44:34,680 --> 00:44:37,520 Speaker 4: we have this insidious culture that's like, well, we 840 00:44:37,680 --> 00:44:40,920 Speaker 4: tried one out and it didn't work, so we can 841 00:44:40,960 --> 00:44:43,920 Speaker 4: never have another one. And that's just such a, like, 842 00:44:45,480 --> 00:44:48,240 Speaker 4: limiting mindset, and it's a mindset that can creep 843 00:44:48,280 --> 00:44:51,399 Speaker 4: in before you know it.
And we should really be 844 00:44:51,600 --> 00:44:55,560 Speaker 4: doing the intentional, sometimes internal, work of unlearning the 845 00:44:55,560 --> 00:44:58,440 Speaker 4: way that these things have really been internalized, sometimes by 846 00:44:58,520 --> 00:45:00,000 Speaker 4: women and other marginalized people. 847 00:45:01,320 --> 00:45:05,880 Speaker 1: Yes. Well, as always, Bridget, thank you so much for 848 00:45:06,040 --> 00:45:08,319 Speaker 1: doing this work, for coming on to share it with us. 849 00:45:09,000 --> 00:45:12,359 Speaker 1: We always love having you. Can you tell the good 850 00:45:12,400 --> 00:45:13,680 Speaker 1: listeners where to find you? 851 00:45:13,680 --> 00:45:15,440 Speaker 4: You can find me on my podcast, There Are 852 00:45:15,440 --> 00:45:17,279 Speaker 4: No Girls on the Internet, and you can find me 853 00:45:17,320 --> 00:45:18,919 Speaker 4: on Instagram at Bridget Marie in DC. 854 00:45:19,640 --> 00:45:24,600 Speaker 1: Yes, and go do that, listeners. So important, so important. 855 00:45:25,840 --> 00:45:28,560 Speaker 1: And thanks, as always, yes, to Bridget, and to you 856 00:45:28,640 --> 00:45:31,120 Speaker 1: for listening. If you would like to contact us, you 857 00:45:31,160 --> 00:45:33,880 Speaker 1: can: our email is momstuff at iHeartMedia dot com. You 858 00:45:34,040 --> 00:45:36,600 Speaker 1: can find us on Twitter at mom Stuff Podcast, and 859 00:45:36,800 --> 00:45:39,040 Speaker 1: on TikTok and Instagram at Stuff I Never Told You. 860 00:45:39,160 --> 00:45:41,480 Speaker 1: We're on YouTube, we have a TeePublic store, and we 861 00:45:41,480 --> 00:45:43,040 Speaker 1: have a book you can get wherever you get your books. 862 00:45:43,120 --> 00:45:45,440 Speaker 1: Thanks, as always, to our super producer Christina, our executive 863 00:45:45,440 --> 00:45:46,960 Speaker 1: producer Maya, and our contributor Joey. 864 00:45:47,160 --> 00:45:48,120 Speaker 3: Thank you, and 865 00:45:48,080 --> 00:45:50,000 Speaker 1: thanks to you for listening. Stuff Mom Never Told You 866 00:45:50,000 --> 00:45:51,600 Speaker 1: is a production of iHeartRadio. For more podcasts from 867 00:45:51,600 --> 00:45:53,040 Speaker 1: iHeartRadio, you can check out the iHeartRadio app, 868 00:45:53,080 --> 00:45:55,520 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows. 869 00:46:01,200 --> 00:46:01,720 Speaker 2: The Lone