Danielle Moody [00:00:11]: Good morning, peeps, and welcome to Woke AF Daily with me, your girl, Danielle Moody, recording from the Home Bunker. Folks, over the past several weeks, since that piece of shit incel Elon Musk has taken over Twitter and has basically, you know, thrown out the window all of the protections that were provided, and should be provided, by a social media platform of its stature and of its importance, we've seen a lot of think pieces and critical analysis around the importance of the Internet, around the importance of a global town square, and the fact that our government, by virtue of not really, truly understanding how technology works or the importance of social media platforms, and just, you know, sheer ignorance, has allowed a maniacal, egomaniacal billionaire to take over this platform and to do with it whatever he wants.

Today's guest on Woke AF is Bridget Todd, who you may know from the podcast There Are No Girls on the Internet and who now has a new podcast, The Internet Hate Machine. She and I get into a conversation about the importance of regulation, about the importance of seeing platforms like Twitter as a public utility, and about the missed opportunities that our government had, and is still missing, in regard to how it decides to regulate this force. The theme that I want folks to understand is that, you know, since the advent of social media truly took hold in the early two thousands (Facebook being, I think, the first, if I'm not wrong, outside of chat rooms and those things, followed by Twitter, and then, you know, later on Instagram and TikTok and all of the other ones that would bubble up in between), Twitter has stood alone in terms of the power that this platform has had: the power it has had for community organizers, the power it has had for activists, for potential elected officials, for elected officials, for the presidents of the United States.
It has been a place where, in real time, you go to find out what is happening, not only in this country but around the world. We learned about activations like the Arab Spring, which took place in the early twenty-tens. We learned about what was happening in Ukraine, in Iran, in China. Because of social media, you've been able to follow journalists in real time as they're on the ground in crisis situations, like in Ferguson. You know, Ferguson, and what happened there with the murder of Mike Brown, is a prime example: had it not been for on-the-ground eyewitness reporters tweeting, and people beginning to follow along as they were live-recording and Periscoping what was happening in Ferguson, it would never have been picked up by cable news and national news outlets. It would have been yet another violent act contained to the community it was happening in, as opposed to being blown up as a national crisis and incident. You look at George Floyd, right, and the fact that this begat an entire summer of activism because of the ability of those on the ground to film what was happening and then to send those videos out into the universe. Without a robust Twitter, we would never have had those activations. And so when people say things like, well, Elon Musk is stupid and he doesn't know what he's doing: Elon Musk knows exactly what the fuck he is doing. He is a white supremacist, incel, fucking misogynist, patriarchal piece of shit. I wish I could form a different opinion. He took over this platform because, like the greedy-ass billionaire he is, he, much like his ilk, does not want the people to organize, does not want the people to unite, does not want those who have traditionally been marginalized and oppressed to have a voice and to develop a community of followers, right? Because then you know what that means. It means that those at the top lose power, because the power is being taken back by the people.
And how do you stop activations like we saw in the summer of twenty twenty? How do you stop pushback against fascist regimes like the Republican Party and Donald Trump and MAGAdom? You take away the town square; you take away a key to their organizing, right? That is what Elon Musk has done. And so when you look at it through that lens, you recognize that he is being incredibly successful. Because when people like myself, and many, many other people with millions and millions of followers, decide that they no longer want to spend time on this platform, that they no longer want to deal with the inundation of racism and misogyny that is now part of Elon Musk's free speech plan, then it ceases to be the powerful machine that it could be, and people have to disperse and figure out where they're going to go next to unite. That was the point of Elon Musk's takeover. So my conversation coming up next with Bridget Todd delves into that, and into what it is, if anything, that the people can do to hold onto their voice and their power. That conversation is coming up next.

Folks, I am very excited to bring to Woke AF Bridget Todd, tech guru extraordinaire, host of the award-winning podcast There Are No Girls on the Internet and of the new pod, which, my God, couldn't have dropped at a better time: The Internet Hate Machine. Bridget, let me just tell you, first, congrats on all of the accolades with There Are No Girls on the Internet. It's so wonderful when I see Black women winning in spaces that we don't normally win in, and creating room and space for conversations about people who look like us who exist in this tech space. So I just want to say, like, shout-out for that.

But your new pod comes at a time when the billionaire troll Elon Musk has taken control over the world's town square.
And now, all of a sudden, it seems over the last several weeks that people are like, oh, my Twitter actually is important. It isn't just a social media platform, you know; it is really a source for organizing and activism and journalism. So before we dive in, I want to get a sense of why you started this new pod, right? Was it provoked by the impending, you know, sale of Twitter that has now actually happened?

Bridget Todd [00:08:53]: Yeah, thank you for having me, and thank you for that warm welcome to the podcast. So it was already in the works, because I feel like I could sort of see the writing on the wall. I was already, you know, pitching and thinking about some of the concepts that we discuss on Internet Hate Machine before Elon officially bought Twitter. But I could sort of see the ways that our social media platforms were primed for takeover in some kind of way, you know, whether it's by bad actors and, like, charlatans and extremists gamifying conversations and hijacking conversations about sensitive topics, or topics that people really need to have conversation on. Like, I could sort of see that happening, and I could sort of see the ways that these platforms should be designed and set up to have folks really be able to come to them, to organize, to connect with one another, to have community, and that makes them these spaces that folks do want to, you know, hijack and disrupt. So I could sort of see the writing on the wall even before Elon Musk actually bought Twitter. It was when we were in production for the podcast that the deal actually went through, so I was like, oh my god, perfect timing for the show.

Danielle Moody [00:10:02]: Right, right. You know, I think that for far too long, folks have said, you know, Twitter is toxic, Twitter is a toxic space, and there had been moments where major things would happen.
You profiled one of them on The Internet Hate Machine: Leslie Jones. And I want to allow you the space to tell the Woke AF audience about exactly that, which I'm sure everyone remembers. But Leslie Jones, you know, SNL, big, huge star, comedian, had Ghostbusters coming out, and she had been one of those people who were using Twitter in a way that was enjoyable, right? Like, we all wanted to follow her because, you know, her commentary was fun and funny, and we felt like we were watching, in real time, like, an SNL skit, right, whenever she would cover sporting events or, you know, just do something fun. Her presence at that time reminded us of, like, the good-naturedness of what it meant to build social media platforms. Then here comes along this motherfucker Milo, the white supremacist (I never pronounce his last name right, and I don't ever care about pronouncing his last name right), and creates an entire campaign to target her for no other reason than the fact that she's Black: she's a Black woman who is famous and wealthy. And so I just want you to go into it, using her story as kind of the anchor story for what regular people, regular, you know, folks who happen to be from marginalized communities, experience at the hands of white supremacists, of misogynists, of these horrible trolls.

Bridget Todd [00:12:12]: I'm so glad that you put it that way. I think that so often people hear the story of what happened with Leslie Jones on Twitter and say, well, she's a big star, she's a celebrity, who cares, like, what does this have to do with anything? But that's exactly the point that I posited in the episode about her, which is that Leslie Jones was someone who was, as you said, like a Twitter super-user. Everyone loved the way that Leslie Jones tweeted.
She got global fame for her hilarious live-tweeting of the Olympics that year, and she was, like, beloved by Twitter, and frankly got Twitter a lot of positive attention, right? And so if someone who is this big celebrity, wealthy, and also, like, a Twitter super-user who is beloved by Twitter, if she could be harassed and abused in these ways, and in ways that are criminal (right, so it wasn't just people saying mean things to her on Twitter; they hacked her website and put intimate photos of her on it, which is a crime, you know, so it goes so far beyond people saying mean things to her), if that can happen to someone as famous and well connected as Leslie Jones, and people can say, big deal, then what does that mean for the rest of us? What does that mean for all the other Black women and girls out there who are using social media platforms and are also being targeted for this kind of harassment, or LGBTQ folks who are being targeted for this kind of harassment? And so I'm so glad that that's how you put it. I also think that the story of Leslie Jones being harassed in a targeted harassment campaign by Milo actually doesn't start with Milo. It starts with Steve Bannon, because Steve Bannon was really one of the first people to identify this powerful force of, like, disaffected white male gamers and say, like, oh, this is a force that we can really, like, weaponize. And it was him who really empowered Milo, at the time working for Breitbart, his publication, to, you know, spearhead these kinds of attacks. And so again, it's wild to me how often this gets sort of sidelined as a celebrity story when, like, Steve Bannon went on to be in the White House, right? So, like, obviously this is a story that has deeper roots than just something bad happening to a celebrity on the internet.
Danielle Moody [00:14:20]: Yeah, and I think that that's right. And I'm so glad that you lifted that up on your episode, because what I think becomes really problematic is when we're being disingenuous about the actual problem. And the problem here isn't just, you know, a bunch of white guys who are sitting behind a computer screen, who, you know, never go outside and are just using their trolling as a way to, you know, lift up their masculinity. This is actually tied to a larger political movement that we have seen, where Twitter, for instance, in particular, of all the platforms, has been weaponized by really fragile white men such as Donald Trump, such as Elon Musk, and others who get on there, air their grievances, and stir up their incel, you know, bots to target and to shut down discourse. And so one of the questions that I want to ask you: over the past couple of weeks, I have gone on, you know, your former outlet, MSNBC, and we're talking about Elon Musk, and they're like, oh, Danielle, you know, he's a spectacular failure and all of these things. And I said, what is it that makes you all think that he's failing? Right? And so my question to you is, as you're watching the headlines daily, and following the headlines about Twitter, about advertisers pulling out, about, you know, different attack tweets that Musk is putting out targeting, whether it's Representative Alexandria Ocasio-Cortez or others, what do you make of the narrative that is being spun around what he is doing?

Bridget Todd [00:16:21]: Yeah, that's such a great question. I think that that idea, oh, can you believe this spectacular failure, is in part rooted in an idea that we need to challenge: that a white guy who's a billionaire is automatically a genius, that Musk is a genius.
I also think, and I agree with you here, that the reason why we see Twitter specifically as this battleground for who can have a voice and who can have power is because, for people who are traditionally marginalized, Twitter is our domain. Like, we have been able to build platforms and voices for ourselves using Twitter when other, you know, more traditional outlets and places have not always helped us amplify our voices. And so I think Elon Musk sees that: like, okay, this is a place where Black folks, brown folks, women, queer folks are able to build up a voice that can sometimes even challenge traditional power structures, right, things like #MeToo, where you can start a hashtag and have that be something that actually allows for accountability from power structures. I think that Elon Musk wants to cozy up with extremist right-wing types, and part of that is dismantling the power of Twitter that marginalized people can use. And so if you chip away at that power, if you make it a place where, you know, the kinds of people who are, like, creators and shapers of discourse aren't really there so much, if you make it so that, like, if you do hang out on Twitter, your mentions are going to be full of, like, Nazis tweeting garbage at you, and you make it a place where no one is going to want to hang out, I think that is part and parcel of how you take away some of a platform like Twitter's power. And I think it's not surprising that when Trump was deplatformed, the platform whose loss hurt him the hardest was Twitter, because I think that Twitter is a unique kind of platform where you can really get a message out there and get it to travel quickly.
And I think Elon Musk is trying to tip the scales so that more folks on the extremist side of things, on the right-wing side of things, have access to that power, because they just haven't been as good at it as we are, I guess. And so when people say, like, oh, he's really failing, he's really, like, you know, ruining Twitter, I'm not going to say that's not by design. I think that he wants to make it so that Twitter is a place where folks like you and I aren't gonna want to show up and aren't gonna be able to build up power and voices on the platform.

Danielle Moody [00:18:48]: Do you think that government, though, Bridget, failed here by not making Twitter a public utility? Do you think, because we have essentially (and I'll be super ageist and not care) octogenarians who don't actually understand, in a lot of ways, how social media works, that there have been missed opportunities? They would bring in, you know, Mark Zuckerberg and scold him with regard to allowing misinformation, whether it was about COVID or the twenty sixteen election, to fester on Facebook and depress, like, Democratic voices and ads and campaign ads and what have you. They bring them in to chastise them, but don't really understand how this platform is used. So do you think that government has missed an opportunity here?

Bridget Todd [00:19:50]: Yes, and this is just, I mean, this is my opinion, so speaking for myself: I do think that government has missed an opportunity, and I think that we're really seeing the impact of that. Listen, when your community wants to get information out there about an emergency, they don't go to Facebook, they don't go to Instagram, they don't go on Snapchat; they go to Twitter, right? So Twitter is how a lot of us get information about what's happening in our communities right now.
The fact that a billionaire could wake up one day and outright buy that platform, essentially on a whim, for a laugh, and, you know, do what he's done to it, is a real problem. I think that you're exactly right. It pains me when I see, you know, people who I know are not digital natives, who did not grow up online, who don't know the way that the Internet can really impact our real lived experiences, kind of dropping the ball when it comes to regulating and making sure that these platforms are actually able to be the kind of public utilities that we know they already serve as in our actual lives. The fact that we're having this conversation after the fact really does tell me that our elected officials have failed us and dropped the ball. And frankly, I don't know if I see it getting better. You know, in the hearings that you were just describing, I've seen some elected official chastising Mark Zuckerberg, and he was like, will you commit to ending finstas? And Mark Zuckerberg was like, what are you talking about? Right? Like, I think it's also a testament to the fact that we need younger elected officials, and the ones who aren't younger need to have more younger folks in their circles that they're actually listening to, because we've got people making legislation about technology who, frankly, I wonder if they truly understand the impact of that technology.

Danielle Moody [00:21:39]: No. And you know what was funny? I think that during the Obama years, people really took for granted how much Obama understood about how the world works, but also how to utilize technology. He created the Chief Technology Officer role. He courted people from Silicon Valley to come in as if they were doing their public service, right?
Don't take that, you know, half-a-million-dollar job at Google; come and work for the government for scraps, but you're going to be doing something good for America. Because he understood that we needed really bright, sharp minds to catapult us into the future. And I'm not quite sure that that energy, that thinking, is there now; even with this administration being a Democratic administration, I'm not sure they get that. And so I wonder what you think, Bridget, the implications are moving forward, when we have billionaires who are legitimately playing Monopoly with our lives and our voices, and a government that is ineffectual in its ability to stop that from happening. What does this look like down the road? What does the internet, this hate machine that it has become, as you cleverly have titled your pod, look like a couple of years from now?

Bridget Todd [00:23:06]: Oh, recipe for disaster. Like, I hate to put it like that, but absolute recipe for disaster. And I think, even looking at the Biden administration, there are some things where I'll say, like, oh, good job doing this or that, but it often feels like too little, too late, like we're playing catch-up on an issue that has gotten so far, you know, so far afield. And so I think unless you have elected officials who truly understand what's happening and are ready and willing to act in meaningful ways, it's a recipe for disaster. I especially look at, like, the younger generation: for so many of them, what it means to show up as, like, an activist, or somebody with a voice, or somebody who is really civically engaged, means showing up online, right? And so our online platforms need to be places where they can all do that meaningfully and safely, where, whether you're a Black woman or a white man, you can meaningfully show up and, like, make your voice heard and be civically engaged online.
If our platforms aren't places where that's possible, what do you think the next generation of, like, elected officials, or people who are civically engaged, is going to look like? We're leaving an incredibly unequal, unfair playing field for the next generation, and I think unless we do something meaningful to change that, I'm concerned about what it's going to look like down the road.

Danielle Moody [00:24:27]: What are your hopes, then, Bridget, for your new pod, and what do you hope to bring or highlight to your audiences about this really important space? Right, like, I think that, you know, at the beginning, at the advent of the Internet, it was considered, like, this kind of fun thing, right? You know, chat rooms and websites and blogs and all of these. It literally feels like, you know, Little House on the Prairie times, how it started compared to where we are right now. But where we're headed seems so bleak and stark. And I think that, you know, you raise such good alarms and questions. What are you hoping that people take from your pod?

Bridget Todd [00:25:16]: Yeah, I'm so glad that you asked. The biggest one is probably that the state and the well-being and the health of our internet is connected to the health of our democracy writ large. And so these are not issues that are just, quote, happening online, or things happening on Twitter. You can't tell someone, just turn off your computer, it'll be fine. When all of us are not meaningfully able to show up on our digital communications platforms and make our voices heard and, you know, express ourselves, that is connected to the health and well-being of our democracy, and folks are not going to show up online to, sort of, like, voice their opinions about politics or our democracy. We've already seen it.
The research is very clear: women, and women of color, and Black women especially, are much less likely to do things like run for office when they know that they can be harassed online just for trying to serve their communities, right? And so we can't have a representative democracy if that's the case; we can't have a fully functioning democracy. The health and well-being of our democracy is innately tied to the way that women, and women of color, and Black women specifically, are able to show up online. If we're not able to show up, then we don't have a healthy democracy. The two, in my mind, are very much connected.

Danielle Moody [00:26:27]: What do you think, and this is my last question for you: what do our elected officials (again, because these are the overseers, the ones in whom we place the importance of representing our voices and our needs) need to understand, coming out of this midterm election and going into a presidential election, where the Internet is, like, the most critical infrastructure for getting information out to as many people as possible? I mean, yes, I believe in door-knocking, and I believe in phone calls, but we know that tweets and ads and campaign videos, the ability to go viral, are what allow people to have a fighting chance in our democracy, right? So what would your advice, your message, your warning be to them?

Bridget Todd [00:27:21]: Yeah, I would say, I mean, what a good question: first and foremost, take it seriously.
We're coming out of it a little bit, but I do think that we're still in a place where, for a lot of people, like, what happens on the internet is not real life, and we need to really get away from that notion, because the internet and real life are one and the same in twenty twenty two. And I would say, really start being able to speak to it when a woman of color or a Black woman is harassed or abused on the internet, right? Like, case in point: I remember when Kamala Harris was first being talked about as Biden's VP pick. I was obviously excited about that, but I was a little disheartened to see that there weren't a lot of folks in the administration who were willing to openly talk about the kinds of harassment, like racist, sexist harassment, that she was going to face, right? They would just say, like, oh, well, we want to support her, we like her leadership, this and that. Talk openly about it; normalize talking openly about the reality that you are setting Black women up to face when you are supporting their leadership. Same thing with Ketanji Brown Jackson: I was, again, a little disheartened to see the kinds of obvious racialized, sexualized, gendered disinformation, misinformation, and harassment that she was facing online, and a little sad to see that folks in the administration weren't calling it out the way that they should have. And so I would say, first and foremost, start talking about it. Don't dance around it, don't use euphemisms; call it out directly, and normalize it. Normalize calling it out, and normalize saying, like, we're not going to tolerate this, we see what it is, your little dog whistles online aren't going to work, we're not going to stand here and allow for racist, sexist harassment of our elected officials. And so I think we're not even there yet. We're still in a place where people don't openly acknowledge it.
It's just, like, this thing that we all see happening and don't verbalize in a real way.

Danielle Moody [00:29:08]: Well, Bridget, I just, you know, I continue to commend you for the work that you do in this tech space to elevate the issues facing marginalized communities, to give voice to their needs, right, and also to engage more women, communities of color, and queer people to get involved and to take up space in this digital space. Folks, the pod is called The Internet Hate Machine, and you must check it out wherever you get your podcasts. Bridget, thank you so much for making the time for Woke AF. We appreciate you.

Bridget Todd [00:29:48]: Oh, I appreciate you so much, and thank you for all that you do, and thank you for your audience; you have a very engaged audience that you've built. So thanks all around.

Danielle Moody [00:30:00]: Thank you. That is it for me today, dear friends, on Woke AF. As always: power to the people, and to all the people, power. Get woke and stay woke as fuck.