Dr. Joy: Welcome to the Therapy for Black Girls podcast, a weekly conversation about mental health, personal development, and all the small decisions we can make to become the best possible versions of ourselves. I'm your host, Dr. Joy Harden Bradford, a licensed psychologist in Atlanta, Georgia. For more information or to find a therapist in your area, visit our website at therapyforblackgirls.com. While I hope you love listening to and learning from the podcast, it is not meant to be a substitute for a relationship with a licensed mental health professional.

Hey, y'all, thanks so much for joining me for this session of the Therapy for Black Girls podcast. We'll get right into the conversation after a word from our sponsors.

What would the Internet and social media be without Black women? Without the labor, brilliance, and talent of Black women, the Twitters and TikToks of the world wouldn't hold the same cultural weight. Despite our continuous contributions to these sites, these same platforms are often disproportionately more harmful for us. Joining me today to discuss the ways we can build spaces that center our humanity is Bridget Todd, digital activist and host and creator of the iHeart podcast There Are No Girls on the Internet. During our conversation, Bridget and I chatted about what it looks like to create change online, the importance of Black women sharing their online experiences, how misinformation and conspiracy theories online negatively impact marginalized communities, and creating a safer online ecosystem for everyone to participate in. If something resonates with you while enjoying our conversation, please share with us on social media using the hashtag #TBGInSession, or join us in the digital safe space we've created for Black women, the Sister Circle. To talk more in depth about the episode, you can join us at community.therapyforblackgirls.com. Here's our conversation.

Thank you so much for joining us, Bridget.

Bridget: Oh, I'm so excited to be here.
I'm such a big fan of your show and what you've built, so this is a dream come true.

Dr. Joy: Very excited, very excited to have you here. So you're a podcaster, activist, and digital strategist working to create change on the Internet. Tell me a little bit about how you got started with that work.

Bridget: Yeah, it really just came from being a lifelong lover of the Internet. I grew up in a small town in Virginia. Shout out, Midlothian, Virginia. I'm sure most folks haven't heard of it. So, you know, it's a little different now, but when I was coming of age, it was a very sleepy, small suburban town, and I grew up feeling incredibly out of place in this very sleepy, small suburban town, if not for the Internet. I will never forget the day that my parents brought home this clunky gray computer and set it up in what would be known as the computer room in our house, and nobody was allowed to drink soda in there, nobody was allowed to take food in there. And it was like my parents had bought me a pair of wings. It was the first time that I was able to connect with people outside of my little small town, really discover who I was, and do a lot of self-exploration that I would never have had the chance to do. And so I really saw the way that the Internet opened up this huge world of possibilities for me. And as I got older, I saw the way that for younger generations today, especially for marginalized people, so Black women, brown folks, queer folks, LGBTQ folks, I saw the ways that the Internet,
you know, I'm sure it's still this land of discovery and opportunity, but it can also be really tough out there. And so I wanted to make sure that we had healthy Internet ecosystems and tell the story of all the different amazing Black women who are contributing to making the Internet a safer and more inclusive place, because I know just how powerful the Internet can be when it's a safe place to have those kinds of experiences.

Dr. Joy: So, Bridget, I would love for you to share some of your earliest online community experiences, because when you said that, my brain immediately went to participating in, like, Yahoo chat rooms. And so I don't know your age, but that, I feel like, was my earliest go at talking to strangers on the Internet. So what was that for you? What was your earliest experience?

Bridget: Oh my god, I loved the Yahoo chat rooms. I loved America Online, like, old school. I'm sure I was doing things I shouldn't have been doing, in pockets of the Internet that I shouldn't have been in. But probably the community that was the most foundational place for me online: I moved from D.C. to San Francisco, California, where I knew no one. It was completely, like, a spur-of-the-moment move, and I decided that that would be the appropriate time to do the big chop, to cut off all my hair, because I was away from my family, you know, new me, new time in my life. And San Francisco, it's not like there were a lot of Black salons. And so if it was not for online message boards about how you maintain curly hair, how you have emotional resilience when you're going through the big chop, all of that, I don't know if I would have ever gotten through that time. And so that was probably the time when an online community felt like a real community, like people I had who could be in my corner and help me through this tough time in my life.
That was probably when it was the most impactful for me. Even though these are people I didn't know in real life, I came to really depend on them for this sense of community that I just didn't have IRL, because I had just moved to this new city that I didn't know anybody in.

Dr. Joy: Yeah. You know, I know we're gonna get into this a little later, just around how important it is for Black women to form community. But this natural hair movement, I feel like, is a hallmark of Black women's online community building. Because you're right, I think so many of us did the big chop and talked about products and all of this, and so much of that I think was happening online. Even though I was in bigger cities, I wasn't necessarily in, like, a Midlothian, but even in real life, I don't remember having as many of those conversations in person as I did on the message boards. So that's a very important point, and we're going to get back to that later. But you also do some work as the comms director of UltraViolet. Talk to me about your role there and how you approach making change through that organization.

Bridget: Yeah, I'm so glad that you brought this up. UltraViolet is very near and dear to my heart. If you don't know who we are, we are a gender justice advocacy organization with over one point two million gender justice advocates nationwide. We really want to create a cost for things like sexism and misogyny against Black women specifically. We're an intersectional feminist organization that is really lifting up some of the fights that we believe are sort of on the path to getting to the feminist, anti-racist future that I know we all want. And so a big part of that work at UltraViolet for me has looked like advocating for safer Internet experiences and communities, particularly for people who are traditionally marginalized.
So Black folks, brown folks, queer folks, LGBTQ folks, making sure that we are able to show up online safely and make our voices heard, because online experiences are so crucial to how we all show up in the world, especially these days.

Dr. Joy: Mm. And I know a large part of your work has also been research. So can you talk a little bit about some of the research that you've done, but also what you found in terms of how the Internet impacts women and people of color?

Bridget: Yeah. So, working with UltraViolet, we really are so lucky to be working in deep collaboration and community with all sorts of different organizations who are all researching the Internet and technology and the way it impacts people who are traditionally marginalized. And so a lot of the research that I do is just lifting up and amplifying and packaging and doing analysis around the research of other great researchers. We are in these different coalitions, like the Women's Disinformation Defense Project, which is a coalition of multiple different gender justice organizations who are all talking about things like disinformation and online safety and online communities from the perspective of rooting it in gender and feminism and an intersectional lens. That's how we approach it. And so I'm very lucky to have access to all this different research that I would never be able to get my hands on just working alone. So definitely it's been deep collaboration and partnership.

Dr. Joy: Can you share any of the findings from some of that research that have really jumped out to you as, like, most interesting or impactful?

Bridget: Yeah.
Just recently, we put together a report card grading all of the major social media platforms on how they fail women and people of color, and, perhaps unsurprisingly, most of the platforms didn't do great on this report card. And I think that's sort of been the, I won't say most surprising, but definitely most impactful thing I've seen from this research: just how badly and how deeply a lot of social media platforms are failing traditionally marginalized people. And so that work really was about making sure that these platforms really understood what's at stake and the root of the problem, so we can get to a place where we can figure out where we go from there and how we fix it.

Dr. Joy: I'm glad you brought that up, Bridget, because as you've alluded to, even in your introduction, there's research, I think even from Pew, that talks about how Black people and other traditionally marginalized people are, like, the main ones on a lot of these social media platforms. So it's really interesting to hear that even though we are sometimes the largest user base, we are not safe in these places, right? Like, we are the ones who typically make things pop on these platforms. You know, all these viral moments are a lot of people from traditionally marginalized communities, yet these are not typically safe spaces for us.

Bridget: Absolutely. You just articulated it so well, and that really, for me, was a big foundational shift in why and how I approach this work, because the Internet and social media would be nothing without Black folks, particularly Black women and Black queer folks. Right? What would Twitter be? Nothing. What would Instagram be? Nothing. What would TikTok be if it wasn't for our labor, our brilliance, our talent? Nothing, right? And so these platforms, they need us. We are the lifeblood of these platforms.
And so I think that for a long time, people who run platforms didn't think they had to be accountable to us, the people who are the lifeblood of why these things make them millions of dollars, like, every month or so, right? And so I really wanted to flip the script and say, no, platforms ought to be accountable to us. If we are the ones who are making your platform profitable, making your platform a good place to be, we ought to be able to expect that you're going to be accountable to us, to make sure that us being there feels good, that us being there feels safe, and that if we're going to be on these platforms, we can actually get paid for our labor and our brilliance, not just exploited and mined by others for them to get paid. And so I really wanted to start having some of those conversations that really center us and our power and the reality of what we actually do provide these platforms, which is a lot.

Dr. Joy: So what are some of the recommendations or things that these platforms might be able to put into place so that we can have safer and healthier experiences on the platforms?

Bridget: Such a great question. So I would say, first and foremost, really making sure that moderation decisions are being made by humans who know what they're doing. I think we rely a lot on algorithmic models that we know are often biased against us, and so I would say getting away from algorithmic moderation policies and really centering more human moderation policies. I can't tell you how many times I've seen, and I'm thinking of TikTok in particular, where a Black content creator will make a joke that is clearly an intra-community Black thing, and an algorithm-based content moderation model won't understand that nuance, and so will penalize this Black content creator for making completely appropriate content speaking to a Black audience, because that moderation system doesn't understand it.
So I would say making sure that we're not prioritizing moderation models that just continue to marginalize us and penalize us unfairly, that's one. Two, I would say really making sure that at these tech companies, the people who are making the decisions and holding the power look like the user base of those platforms. It's a huge problem, as I'm sure you know, that the people who are making the decisions, not to mention the money, at a lot of these major platforms are white men. That's just the way it is. And so I think making a shift toward hiring to make sure that the people who are actually making these decisions look a little bit more like the communities they are meant to be serving and protecting.

Dr. Joy: Such important work. I really appreciate you sharing it. So you also have this very cool podcast called There Are No Girls on the Internet, which I think is, like, the coolest name. First, I want to hear the story behind the naming, but also why and how you chose podcasting as an additional avenue for you to be able to share some of these messages.

Bridget: Oh my gosh, I love this question. So the name of the podcast, There Are No Girls on the Internet: I love the name, but it is a little bit of a mouthful. Every time I have to type it, I'm like, why did I pick something that was so long? The idea that there are no girls on the Internet is sort of an unstated rule of the Internet, and it kind of means two different things, both of which I think are not correct. One, that anybody you meet on the Internet who says they are a woman is not actually a woman; they're lying, because there are no girls on the Internet. Two, it's this idea that when we show up online, our identities don't actually matter, and so if you are a Black woman, you leave that identity at the door when you log on. Both of those things are not true, you know.
First of all, there are so many interesting, dynamic women who aren't just showing up online but actually changing what it means to be online, fighting and shaping the Internet to make it more inclusive and safer. So that's one. Two, this idea that our identities don't matter online just doesn't hold water for me. We know time and time again that the experiences we have out in the real world as traditionally marginalized people, as Black women, do show up online. And so I really wanted to create a platform that was centered in what I know to be the truth about our Internet experiences: that, one, it is Black and brown folks and LGBTQ folks who are really doing the work of making the Internet safer, more inclusive, and just a more fun, dynamic place to spend time, and, two, that our identities really do make a difference when it comes to the kinds of experiences we can expect online.

Dr. Joy: And what made you think that podcasting was a great way to have those conversations?

Bridget: Ooh, I have always been a podcast person. There's just something about it. I'm sure you feel the same way. I'll never forget, when I was a kid... My family is from North Carolina. We have one of those big Black Southern families, and my grandmother, she was this beautiful matriarch figure of our family, really, and I grew up at her knee hearing her stories about her life all the time. And after she passed, I actually found out that she had been part of a research project with the University of North Carolina, and they had taken all this archival audio of my grandmother telling her stories. And the first time that I heard her voice in my ear, telling the stories of her growing up in the South, through segregation, raising this family, it just hit me like something completely different. And that was the first time that I really understood the power of audio, the power of hearing someone tell their own story in their own words.
And so I've always been an audio person. I've been involved in podcasts from the very, very beginning, back before shows like Serial, when we didn't know what the heck we were doing, you know, we had no idea. I think that with podcasting, it's really a chance to hear people tell their own experience in their own words and their own voice, and it's just so intimate. There's something about the intimacy of someone's voice in your ears. And honestly, we deserve it. We deserve intimate, thoughtful, nuanced, beautiful stories about our experiences. We deserve that.

Dr. Joy: More from my conversation with Bridget after the break.

Yeah, and you know, it's like our earlier conversation about the natural hair community; it feels like it nicely dovetails with what you're sharing now. And even though it is beautiful and really important, it is also difficult, because it feels like there are so few safe spaces for Black women to be able to commune online, right? So even if you share a tweet and say that you're only talking to other Black women, undoubtedly other people will barge into the conversation. And so I wonder if you have any thoughts about how we can create spaces, and safe spaces, for Black women online.

Bridget: Yeah, what a great question. A tool or a platform that I've been really into lately has been Twitter Spaces, right? So, you know, you can say, like, I want to curate a conversation with a handful of people that I select. If other people come in, okay, that's fine, but I have control over who I want to have the stage and whose voice I want to be spotlighting. I love that. I've seen some really interesting use cases of how people are really using Twitter Spaces to curate the voices they want to be engaging with.

Dr. Joy: Yeah.
I agree with that. And it does feel like, in a lot of ways, Twitter Spaces feels like a Zoom but with a lower lift, right? Because you don't have to, like, set up a password and all of that stuff, and you, as the host, do have control over those things.

Bridget: I would agree with you there. Yeah, I love it. I've never hosted a Space myself, but I am always popping in, like, what's this, what's that, in the Twitter Spaces.

Dr. Joy: I think you might have been in it the night of the Oscars, where everybody was like, oh, we need a family debrief, what just happened?

Bridget: Yes!

Dr. Joy: And you know, you hear that joke, like, we'll talk about it at the next all-Black-people meeting, right? And I often think, like the night of the Oscars, it would be really cool if we did have these actual, all-hands-on-deck Black people meetings, like, there's a family emergency, we gotta come together. So it does feel like Twitter Spaces in some ways has kind of given us an opportunity to do some of that. And it also, you know... I don't know, were you on Clubhouse, Bridget? So it does feel like there's a different feeling for Twitter versus Clubhouse. Or do you feel that way also?

Bridget: I do feel that way. And part of the reason, and I haven't really thought about this, so this is not a fully fleshed-out thought, but I do think it comes from the fact that when Clubhouse first started, it had this layer of exclusivity, because you had to get an invite to join. And obviously everybody likes feeling special; everybody likes feeling they have a special invite to something. But I think the reason why, at least for me, Twitter Spaces hits a little bit different is because it's not about exclusivity. Like, if you see it at the top of your page, you can go in and listen, and you don't have to have any kind of special invite.
And so for me, I always like it when spaces are more accessible for everybody. I also think Clubhouse in the beginning was only for iPhone and not for Android, and so I think there's a little bit of a different user experience sort of baked into using that platform versus something like Twitter Spaces, which I just feel like, from the ground up, didn't have those same kinds of barriers.

Dr. Joy: Yeah, I agree, that does make a lot of sense. You know, in the past couple of years especially, kind of during the pandemic, I think the issue of online safety and online experiences has really been heightened, and so I'd love to hear you talk about how misinformation and conspiracy theories around COVID and all of these other things have really hurt marginalized communities.

Bridget: Oh, I could talk all day. It's one of the reasons why I turned my focus, in terms of my work and my podcast and my research, to mis- and disinformation, because I was seeing firsthand how it impacted my community. For a long time, I thought of all this Internet safety work as sort of my nine-to-five, and then I would show up in my community and with my family very differently. And it was around COVID that I started seeing, like, wait a minute, what's happening in my own family group chat? What are the conversations that are popping off there? And so I really had to turn my lens back to my own people and my own community to say, wait, how are we being targeted and impacted by things like misinformation? And something that I really wish people talked more about when they talk about disinformation and conspiracy theories and things like that is the way that bad actors are so savvy at targeting our, and by "our" I mean Black folks', legitimate existing tensions, traumas, and fears and using them against us.
Right. And so the thing about misinformation and conspiracy theories is that, more often than not, they are rooted in a grain of truth. I see the ways that bad actors really do exploit this harmful, traumatic legacy that we as Black folks have in this country. So, yeah, it is absolutely reasonable for a Black person in the United States to be a little bit skeptical of, you know, the medical industry, given our history. That's just a fact; it's a reality of our experience as Black folks in this country. And the way that bad actors seize on that reality to inflame it, to exploit it, has been the most damaging thing, I think, to our communities. And it really lets institutions and people with power who have let us down off the hook, because it makes it so much harder to talk about the reality of what our experiences and traumas have been like when bad actors are exploiting those conversations for their own gain.

Dr. Joy: Yeah. You actually had a miniseries on your podcast called Disinformed. Can you talk a little bit about how that came to be and some of the, like, common misconceptions that you explored in that series?

Bridget: Yeah. So that really came to be when I noticed the ways that Black and brown folks are specifically targeted. You might not know this to look at the space, because so many times when we're talking about conspiracy theories, we're talking about white people, but Black and brown communities are disproportionately targeted by things like disinformation and conspiracy theories, and disproportionately harmed, right? And I really didn't see that reality being reflected in the space where we talk about conspiracy theories.
So I really wanted to start a conversation about the harm that conspiracy theories and disinformation have done in our communities and who is profiting off those things, right? Because if you're a social media platform, maybe you don't love the fact that there are conspiracy theories being spread on your platform, but that engagement is making you money. And so I really wanted to have that conversation about what we do with the fact that platforms are benefiting materially from content that we know disproportionately harms our communities. That's really how that series came to be. I would say the biggest misconception about disinformation and conspiracy theories is that people who fall for them are, quote, stupid or uneducated. Like, I've been working in the disinformation space for a long time and I get fooled sometimes. That's because bad actors are very savvy, and we really don't have a media landscape that lends itself to the kinds of authentic, helpful, honest content that we should all have easy access to. And so when you look at things like the ways that Black and brown folks are underserved by traditional media writ large, it is not surprising that people would fall for conspiracy theories or bad actors or misinformation. I see a lot of people talk about people who fall for conspiracy theories as stupid or uneducated or ignorant, and nothing could be further from the truth. The fact is that bad actors are very, very savvy and very, very good, and the best of us can fall for it.

Dr. Joy: Can you give us a definition of what you mean by "bad actors," Bridget? That may not be a term a lot of people are familiar with.

Bridget: Absolutely, yeah. So I use "bad actors" as a way to talk about people who spread disinformation. Disinformation is untrue content that is being purposely spread with the intent to sow confusion or chaos. And so somebody who is spreading disinformation, they are a bad actor.
They are doing this knowingly, on purpose, to confuse and scare people, as opposed to somebody who is spreading misinformation, which is just inaccurate content that you might not even know is untrue. Spreading misinformation isn't great, but people aren't always doing it with bad intentions. And so I use the phrase "bad actors" to specify somebody who is spreading lies with the intent to cause fear and chaos and distrust and confusion.

Dr. Joy: Got it, thank you for this. So, like you just said, it's not always easy to know when you're getting misinformation or disinformation. What are some ways that we can identify extremism or misinformation, especially online?

Bridget: What a good question. So, honestly, I think of this as almost sort of a mindfulness practice. You can't see it right now, but on my laptop I have a little Post-it that just says the words "slow down," because oftentimes I find that when I'm online, I'm just moving very quickly. I'm getting, you know, tweet after tweet after tweet, and I see something that I like and I just smash share right away. And so I always just try to ground myself in this idea of slowing down a little bit, and noticing, when you're scrolling on your social media page, how the content is making you feel. Really be a little bit mindful about it. You don't have to share everything right away; you don't have to respond to everything right away. It's completely okay to take a minute, take a beat, when you see something. If you see something that gets your heart racing or makes you feel a little bit twitchy, which I see all the time, I always have to stop myself and say, no, no, take a minute. This piece of content obviously triggered some feelings; take a minute to see why that might be.
And so I would say slowing down online is probably the biggest piece of feedback I have, because when we're just moving really quickly, that's how we share things without reading the whole article. That's how we share things without looking at another source to see if it's true. That's how we share things that, if we took a minute and really thought about them, might seem a little bit fishy to us. So I would say the biggest thing I can tell people is to slow down before you share, and just be a little bit mindful when you're scrolling social media in general.

Dr. Joy: Yeah, I think that's really important too, because I think the other thing that has happened online, and I think I see this most often on Twitter, but I also spend the most time on Twitter, so my thoughts on this are probably biased, but it also feels like, and I think there's some legitimacy to this, organizations and people are using shock and racism as an engagement strategy, right? They know that if they say something off-the-wall racist, we are all going to be talking about it, we're all going to be sharing it. And really, again, the slowing down can help us to remember, like, okay, yes, I am enraged about this, but me sharing this article also gives them clicks, right? Me sharing this also gives them the engagement that they're looking for. So I think it is really helpful in that way too.

Bridget: Oh my gosh, you've said it. And I think we all know, if you think about your Instagram feed, I see that on Instagram a lot, we all know those big social media accounts that are using things they know are inflammatory to get us riled up. And when you use an algorithm-based social media platform like a Twitter or an Instagram, you are training that algorithm on what you want to see.
And so if somebody posts something that is intentionally extreme or inflammatory and you engage with it, that's going to make more people see that content, because that's how algorithms work. The algorithm is gonna think, oh, you want to see this kind of intensely negative or outrageous content. And so I actually had to unfollow a lot of Instagram accounts that would use, especially, Black queer youth and Black women; they would use content around them in this incredibly transparent way to clearly stoke outrage, and I just had to stop giving them my outrage. It wasn't productive after a while to see the worst of the worst and see these comments sections where Black women or Black queer youth would be getting dragged. It just wasn't good for me. And so I would invite people to really think about curating the kind of social media experience they want to have, one that is actually going to be productive for how they want to show up online.

Dr. Joy: More from my conversation with Bridget after the break.

Yeah, because, you know, to your point, that can become exhausting, right? I mean, there are enough actual bad things happening in the world for us to be adding additional layers of stress, like, oh, look at this latest online outrage. Part of what is happening is that, in a lot of ways, we are becoming desensitized to a lot of this, right? And so when you are constantly looking at the latest hate crime or, you know, all of these things, at some point your system can't process it all. So I think it is a good reminder to just kind of make sure we're paying attention to how we're curating our feeds and our experiences. We've both talked about kind of being very online. In what ways do you feel like the Internet, and really online behavior, has impacted Black women specifically?
Bridget: Oh, I think it's been such a double-edged sword for us. Because on the one hand, the thing I love about the Internet is the way that it democratized our voices. You didn't have to have a contact at a media company or a contact at a news company to get your voice out there. If you've got a Twitter account and a phone, great, you can make an entire movement. You can make a Black Lives Matter, or a Me Too, or any number of movements that we saw created by Black women that would go on to change the world. And so on the one hand, I think it's been incredibly powerful for us to have this democratized voice where we could really change the conversation, start a movement just by a tweet. That is what Black women have always done. There is nothing more powerful in this world than a Black woman with a cell phone and an Internet connection, right? And so I firmly believe that where we need to catch up is people with power, decision makers, really honoring how much we have brought to the Internet, how much we have brought to technology, and really honoring us as the innovators that we are. And so, you know, when I say it's a double-edged sword, I want us to feel like it's our right to take up more space online. I don't want anyone to be like, oh, you're a Black woman on the Internet, what an anomaly.
No, I want us to see 594 00:30:09,040 --> 00:30:12,720 Speaker 1: the internet and technology as our rightful domain where we 595 00:30:12,840 --> 00:30:14,800 Speaker 1: belong and where we have a right to take up 596 00:30:14,800 --> 00:30:17,000 Speaker 1: a lot of space and have a big voice in 597 00:30:17,040 --> 00:30:20,160 Speaker 1: the conversation. You know, Bridget, I firmly agree with you, 598 00:30:20,200 --> 00:30:23,040 Speaker 1: but I also feel like I have had enough conversations 599 00:30:23,080 --> 00:30:25,640 Speaker 1: here on the podcast and just in other places, right? Like, 600 00:30:25,880 --> 00:30:28,160 Speaker 1: so many sisters who are doing that kind of work 601 00:30:28,560 --> 00:30:31,880 Speaker 1: eventually feel like they have to disconnect from social media, right? So, 602 00:30:31,920 --> 00:30:34,520 Speaker 1: whether it be the trolls or, you know... so it's 603 00:30:34,560 --> 00:30:37,560 Speaker 1: almost like they will maybe sometimes keep their channels active 604 00:30:37,680 --> 00:30:40,040 Speaker 1: so they can just kind of share highlights, but they 605 00:30:40,080 --> 00:30:43,200 Speaker 1: really aren't personally engaging. And so I think that that's 606 00:30:43,200 --> 00:30:45,440 Speaker 1: the double edged sword that you're referring to, right? That 607 00:30:45,800 --> 00:30:49,120 Speaker 1: it is a powerful opportunity to kind of share your message, 608 00:30:49,120 --> 00:30:52,600 Speaker 1: but we also know, you know, that the online experiences 609 00:30:52,640 --> 00:30:55,120 Speaker 1: operate much like they do in real life, right, and 610 00:30:55,160 --> 00:30:57,840 Speaker 1: we are often the targets of people who want to 611 00:30:57,880 --> 00:30:59,760 Speaker 1: be hateful. If you were to go to my Twitter 612 00:30:59,880 --> 00:31:01,680 Speaker 1: right now, you would say, I don't think this person 613 00:31:01,760 --> 00:31:04,440 Speaker 1: really likes being online, because I'm sort of using my 614 00:31:04,480 --> 00:31:07,360 Speaker 1: Twitter as like a highlight reel, like here's my latest 615 00:31:07,360 --> 00:31:10,720 Speaker 1: podcast episode, here's my latest appearance, whatever, whatever. I'm not 616 00:31:10,800 --> 00:31:13,800 Speaker 1: spending a lot of time there having conversations because of 617 00:31:13,960 --> 00:31:16,600 Speaker 1: precisely what you just said. And I think that's part 618 00:31:16,600 --> 00:31:18,440 Speaker 1: of the double edged sword of showing up as a 619 00:31:18,440 --> 00:31:22,240 Speaker 1: black woman online, that every single piece of research will 620 00:31:22,600 --> 00:31:26,880 Speaker 1: make this point: that black women are disproportionately targeted for 621 00:31:26,960 --> 00:31:31,440 Speaker 1: things like disinformation campaigns, misinformation attacks, and harassment rooted in 622 00:31:31,440 --> 00:31:34,040 Speaker 1: attacks on our identity as black women. Like, that's just 623 00:31:34,160 --> 00:31:36,200 Speaker 1: what we have to deal with to show up online. 624 00:31:36,200 --> 00:31:39,160 Speaker 1: And so I can't blame anybody who was like, you know, 625 00:31:39,320 --> 00:31:40,680 Speaker 1: I think I'm gonna check out of this for a 626 00:31:40,720 --> 00:31:42,640 Speaker 1: little while, or I think I'm gonna use this 627 00:31:42,840 --> 00:31:44,560 Speaker 1: just to get my point across. I'm not gonna be 628 00:31:44,600 --> 00:31:47,440 Speaker 1: here for extended conversations or back and forth with people.
629 00:31:47,720 --> 00:31:49,640 Speaker 1: I completely get that, and I think that's part of 630 00:31:49,640 --> 00:31:51,280 Speaker 1: the double edged sword of what it means to show 631 00:31:51,360 --> 00:31:54,200 Speaker 1: up online as a black woman. Yeah, so, you know, 632 00:31:54,320 --> 00:31:57,560 Speaker 1: the other thing, and we've had a very, very public 633 00:31:57,640 --> 00:32:03,719 Speaker 1: example of this, around how like comedians and jokes, and 634 00:32:03,800 --> 00:32:07,160 Speaker 1: even a lot of the like internet comedians, I guess 635 00:32:07,280 --> 00:32:10,760 Speaker 1: is the best descriptor, how they actually make their careers 636 00:32:10,800 --> 00:32:13,840 Speaker 1: off of dressing up as like older black women, or 637 00:32:14,040 --> 00:32:16,600 Speaker 1: there tends to be like some black woman being the 638 00:32:16,640 --> 00:32:19,080 Speaker 1: butt of the joke, and that tends to be 639 00:32:19,200 --> 00:32:21,840 Speaker 1: what we see going viral, right? And I'd love to 640 00:32:21,880 --> 00:32:24,360 Speaker 1: hear your thoughts about like how that can impact us 641 00:32:24,400 --> 00:32:28,600 Speaker 1: and our experiences online. You're absolutely right. And now that 642 00:32:28,640 --> 00:32:30,680 Speaker 1: I do this work professionally and have a little bit 643 00:32:30,680 --> 00:32:32,680 Speaker 1: of research behind it, I can say that's not just 644 00:32:32,760 --> 00:32:37,240 Speaker 1: an anecdotal thing. We have a digital media ecosystem that 645 00:32:37,280 --> 00:32:40,320 Speaker 1: will always be ready and willing to amplify attacks on 646 00:32:40,360 --> 00:32:42,280 Speaker 1: black women. That is just the way that it is. 647 00:32:42,720 --> 00:32:46,280 Speaker 1: Comedians and podcasters and all of that, they traffic in 648 00:32:46,400 --> 00:32:49,400 Speaker 1: jokes about black women that make us the butt of the joke 649 00:32:49,480 --> 00:32:52,520 Speaker 1: because it is effective. It is an effective strategy to 650 00:32:52,600 --> 00:32:55,560 Speaker 1: build clout. We saw it over and over and over 651 00:32:55,600 --> 00:32:59,040 Speaker 1: again with our former president, who, when his back was 652 00:32:59,040 --> 00:33:03,719 Speaker 1: against the wall, often attacked prominent black women in politics 653 00:33:03,720 --> 00:33:06,240 Speaker 1: and journalism, because that was always going to be a 654 00:33:06,240 --> 00:33:08,440 Speaker 1: way to like rile up the base and get them going. 655 00:33:08,800 --> 00:33:11,680 Speaker 1: And it's been so heartbreaking to see our own community 656 00:33:11,760 --> 00:33:14,880 Speaker 1: do the same thing. But unfortunately that is the reality. 657 00:33:14,880 --> 00:33:17,280 Speaker 1: And I think when the way that you see yourself 658 00:33:17,320 --> 00:33:21,120 Speaker 1: as a black woman is these hurtful, cruel depictions that are 659 00:33:21,120 --> 00:33:24,960 Speaker 1: so dehumanizing, after a while you can start to internalize 660 00:33:24,960 --> 00:33:26,800 Speaker 1: it and believe it. And the research is very clear 661 00:33:26,840 --> 00:33:29,720 Speaker 1: that these kinds of attacks, particularly on prominent black women 662 00:33:29,720 --> 00:33:32,800 Speaker 1: in politics, they have a real world impact.
So they 663 00:33:32,880 --> 00:33:35,920 Speaker 1: keep black women from being civically engaged. They keep us 664 00:33:35,960 --> 00:33:38,200 Speaker 1: from doing things like running for office. They keep us 665 00:33:38,200 --> 00:33:40,200 Speaker 1: from doing things like just putting our opinions out there 666 00:33:40,200 --> 00:33:42,760 Speaker 1: and having a voice in our own discourse online. And 667 00:33:42,800 --> 00:33:46,800 Speaker 1: so these are not just jokes. They translate to very 668 00:33:46,800 --> 00:33:49,840 Speaker 1: clear real world harm for black women and for everybody, 669 00:33:49,880 --> 00:33:52,560 Speaker 1: because we are all better served when we have a 670 00:33:52,600 --> 00:33:55,520 Speaker 1: digital landscape where everybody can show up and everybody can 671 00:33:55,560 --> 00:33:59,320 Speaker 1: see themselves thoughtfully depicted, right? And so if we don't 672 00:33:59,360 --> 00:34:01,160 Speaker 1: have that, it's not just a threat to black women, 673 00:34:01,240 --> 00:34:03,080 Speaker 1: which it is, it is a threat to all of 674 00:34:03,160 --> 00:34:06,840 Speaker 1: us, to our democracy, and to having a functioning digital media landscape, 675 00:34:07,240 --> 00:34:09,560 Speaker 1: which is very important. Right, as we move forward with 676 00:34:09,600 --> 00:34:13,359 Speaker 1: an increasingly digital kind of reality, right? Like, so 677 00:34:13,400 --> 00:34:16,120 Speaker 1: many things are done online, so it is really important 678 00:34:16,120 --> 00:34:19,839 Speaker 1: to focus on that. Absolutely, especially since COVID, like, we're 679 00:34:19,880 --> 00:34:22,960 Speaker 1: all showing up online much more frequently. And so if 680 00:34:23,000 --> 00:34:27,040 Speaker 1: our online ecosystem is a dumpster fire where black women 681 00:34:27,080 --> 00:34:29,520 Speaker 1: don't feel safe and can't speak up, what kind of 682 00:34:29,520 --> 00:34:33,000 Speaker 1: online ecosystem is that? So you've already shared some of 683 00:34:33,000 --> 00:34:34,799 Speaker 1: these things, but I'd love to hear if there are other 684 00:34:34,840 --> 00:34:37,280 Speaker 1: parts of this strategy that you've put together for yourself. 685 00:34:37,520 --> 00:34:39,759 Speaker 1: Like, how do you protect your mental health as someone 686 00:34:39,840 --> 00:34:43,080 Speaker 1: who is very online? Oh, I love this question. So 687 00:34:43,200 --> 00:34:46,479 Speaker 1: one is that I'm offline quite a bit. I take 688 00:34:46,719 --> 00:34:49,839 Speaker 1: long breaks from social media. I'm not somebody who is like, hey, 689 00:34:49,920 --> 00:34:52,200 Speaker 1: y'all, I'm taking a social media break, but I do that 690 00:34:52,320 --> 00:34:55,719 Speaker 1: quite a lot too. I spend a lot of time outdoors. 691 00:34:55,960 --> 00:34:59,400 Speaker 1: That's how I like reconnect and plug back in and recharge, 692 00:34:59,400 --> 00:35:01,200 Speaker 1: just like going for a walk in the park 693 00:35:01,320 --> 00:35:04,080 Speaker 1: or going for a nice hike. I love to be outside. 694 00:35:04,120 --> 00:35:05,839 Speaker 1: I love to be out in the water. And then 695 00:35:05,840 --> 00:35:08,200 Speaker 1: I would say I also just really have a clear 696 00:35:08,320 --> 00:35:12,040 Speaker 1: understanding of where I plug in in my offline world. Right? 697 00:35:12,040 --> 00:35:14,440 Speaker 1: So I love showing up online.
I love the Internet, 698 00:35:14,560 --> 00:35:17,239 Speaker 1: but I'm also really clear that my people are my 699 00:35:17,360 --> 00:35:20,680 Speaker 1: community that I really am able to dial in with offline, 700 00:35:20,719 --> 00:35:24,000 Speaker 1: and so when things online start to feel not so good, 701 00:35:24,239 --> 00:35:26,759 Speaker 1: it's always good for me to like reconnect with my 702 00:35:26,840 --> 00:35:29,759 Speaker 1: friends and family, who I know have my back, got 703 00:35:29,800 --> 00:35:32,680 Speaker 1: me, and will always make me feel grounded when the online 704 00:35:32,680 --> 00:35:35,600 Speaker 1: space just feels too much. That is super important. Thank 705 00:35:35,640 --> 00:35:39,280 Speaker 1: you for sharing it. So as someone who is very online, 706 00:35:39,360 --> 00:35:42,960 Speaker 1: I'd love to hear some of your favorite podcasts, newsletters, 707 00:35:43,040 --> 00:35:45,640 Speaker 1: or digital communities that you feel like do a really 708 00:35:45,640 --> 00:35:48,560 Speaker 1: good job of creating safe spaces for black women. Well, 709 00:35:48,719 --> 00:35:51,440 Speaker 1: one is your own. I have to shout out Therapy 710 00:35:51,480 --> 00:35:53,760 Speaker 1: for Black Girls, and I'm sure all your listeners feel 711 00:35:53,760 --> 00:35:56,040 Speaker 1: the same, but I have to say, like, 712 00:35:56,239 --> 00:36:00,120 Speaker 1: what you're doing is so groundbreaking, and it's something that 713 00:36:00,640 --> 00:36:03,280 Speaker 1: I don't even think I realized was missing 714 00:36:03,360 --> 00:36:06,520 Speaker 1: until I found your work. And so definitely the community 715 00:36:06,560 --> 00:36:09,960 Speaker 1: that you've built around black women talking honestly about our 716 00:36:10,000 --> 00:36:12,719 Speaker 1: mental health and how we show up. Another podcast that 717 00:36:12,760 --> 00:36:15,279 Speaker 1: I really love is Still Processing. It is one of 718 00:36:15,280 --> 00:36:17,880 Speaker 1: my favorite culture podcasts, and it's a podcast where, 719 00:36:17,880 --> 00:36:19,120 Speaker 1: I guess, I feel like they just do a really 720 00:36:19,160 --> 00:36:21,160 Speaker 1: good job of making sure that black women and black 721 00:36:21,239 --> 00:36:24,800 Speaker 1: queer folks get lovingly depicted. Whenever I listen to that podcast, 722 00:36:24,800 --> 00:36:28,239 Speaker 1: I always feel very loved and seen afterward. And then 723 00:36:28,280 --> 00:36:30,080 Speaker 1: another one I have to shout out, I think it's 724 00:36:30,120 --> 00:36:33,080 Speaker 1: no longer active, but it's called Being Seen. It's about 725 00:36:33,120 --> 00:36:36,960 Speaker 1: the experience of being black and queer. And again, for me, 726 00:36:37,320 --> 00:36:41,560 Speaker 1: podcasts really allow me to feel lovingly depicted. Like, I 727 00:36:41,600 --> 00:36:44,120 Speaker 1: know I'm listening to a good podcast when afterward I 728 00:36:44,120 --> 00:36:46,359 Speaker 1: feel kind of like more in love with myself after 729 00:36:46,440 --> 00:36:47,920 Speaker 1: I listen. So those are some of the ones that 730 00:36:47,960 --> 00:36:50,120 Speaker 1: really make me feel that way. I love them. Yeah, I 731 00:36:50,160 --> 00:36:52,160 Speaker 1: just saw Still Processing is coming back for a new 732 00:36:52,200 --> 00:36:54,680 Speaker 1: season too, so I'm very excited because I've been missing them.
Right, 733 00:36:54,960 --> 00:36:57,719 Speaker 1: it feels like when there are big cultural moments, like 734 00:36:57,800 --> 00:36:59,800 Speaker 1: I often think about, like, oh, what would Jenna and 735 00:37:00,040 --> 00:37:01,920 Speaker 1: Wesley have to say about this. So Jenna is 736 00:37:01,920 --> 00:37:04,280 Speaker 1: gonna be on a break because I think she's still writing, 737 00:37:04,280 --> 00:37:06,759 Speaker 1: but Wesley is going to come back with some guest cohosts. 738 00:37:06,840 --> 00:37:10,000 Speaker 1: Oh my gosh, I can't wait. I know, I know, 739 00:37:10,040 --> 00:37:12,800 Speaker 1: I'm very excited they're coming back. So I appreciate that list. 740 00:37:13,480 --> 00:37:16,439 Speaker 1: So what advice would you have for other black women 741 00:37:16,480 --> 00:37:20,960 Speaker 1: who are seeking to build and create community digitally? Yeah, 742 00:37:21,080 --> 00:37:25,680 Speaker 1: I would say, we need more healthy communities where our 743 00:37:25,800 --> 00:37:29,400 Speaker 1: experiences and our voices are meaningfully centered. So if 744 00:37:29,400 --> 00:37:33,200 Speaker 1: you're thinking about starting that podcast or starting that newsletter 745 00:37:33,320 --> 00:37:36,640 Speaker 1: or starting that listserv, do it. We need you. We 746 00:37:36,719 --> 00:37:39,440 Speaker 1: need your voice. So yeah, that's the first one I 747 00:37:39,480 --> 00:37:42,839 Speaker 1: would say. And then I would also say, like, I 748 00:37:42,880 --> 00:37:45,120 Speaker 1: can't remember the writer, it might have been 749 00:37:45,160 --> 00:37:48,400 Speaker 1: Toni Morrison, who said that she wrote the books that 750 00:37:48,480 --> 00:37:50,759 Speaker 1: she needed when she was younger that she didn't see, 751 00:37:50,760 --> 00:37:53,000 Speaker 1: and that advice has always been so useful for me. 752 00:37:53,320 --> 00:37:57,400 Speaker 1: You know, figure out what that need is in yourself, 753 00:37:57,600 --> 00:38:00,560 Speaker 1: and I guarantee that somebody else out there has the same need. 754 00:38:00,600 --> 00:38:02,000 Speaker 1: And so if you're like, if only there was an 755 00:38:02,000 --> 00:38:05,880 Speaker 1: online community for us to talk about blank, I guarantee 756 00:38:05,920 --> 00:38:08,239 Speaker 1: that there's another black woman out there who is like, 757 00:38:08,360 --> 00:38:11,720 Speaker 1: I would join that community. So yeah, just really asking 758 00:38:11,719 --> 00:38:14,080 Speaker 1: yourself what it is that you need and not being 759 00:38:14,120 --> 00:38:16,359 Speaker 1: afraid to build the thing that you feel like you 760 00:38:16,400 --> 00:38:19,200 Speaker 1: need in the world. That is so important. Like you 761 00:38:19,200 --> 00:38:21,759 Speaker 1: mentioned earlier, right, like the democratizing of it, right? Like, 762 00:38:21,800 --> 00:38:24,600 Speaker 1: all you really need is a digital device and you 763 00:38:24,640 --> 00:38:27,000 Speaker 1: can create a community. And I do think that that 764 00:38:27,120 --> 00:38:29,000 Speaker 1: is one of the things that black women and, you know, 765 00:38:29,080 --> 00:38:31,880 Speaker 1: the other marginalized communities that you've talked about, do a 766 00:38:32,200 --> 00:38:35,520 Speaker 1: particularly great job at, and that is sharing our stories and 767 00:38:35,640 --> 00:38:37,640 Speaker 1: making room for other people to know that they're not 768 00:38:37,760 --> 00:38:40,480 Speaker 1: alone in their experiences.
So I think that that's one 769 00:38:40,480 --> 00:38:43,600 Speaker 1: of the things that's really cool about online communities. Yeah, 770 00:38:43,640 --> 00:38:46,480 Speaker 1: I love it too. And I think like sharing our experience, 771 00:38:47,040 --> 00:38:50,840 Speaker 1: the entire trajectory of those experiences, because the experience of 772 00:38:50,840 --> 00:38:53,799 Speaker 1: being a black woman online is sometimes difficult, sometimes hard, 773 00:38:53,880 --> 00:38:56,200 Speaker 1: sometimes full of pain, but it's also full of joy 774 00:38:56,400 --> 00:38:59,680 Speaker 1: and love and laughter and magic and light and beauty 775 00:38:59,800 --> 00:39:02,680 Speaker 1: and creativity. And so I really try to tell all 776 00:39:02,719 --> 00:39:05,680 Speaker 1: sides of the story, not just rooted in the tough parts, 777 00:39:05,719 --> 00:39:08,799 Speaker 1: but also buoyed by the happy parts as well. So 778 00:39:08,880 --> 00:39:11,680 Speaker 1: leaning into the entire experience that we show up with, 779 00:39:11,680 --> 00:39:14,920 Speaker 1: I think it's really important. Yeah, I agree. So where 780 00:39:14,960 --> 00:39:17,680 Speaker 1: can we stay connected with you, Bridget? What is your website? 781 00:39:17,719 --> 00:39:19,920 Speaker 1: Tell us more about where we can listen to the podcast, 782 00:39:19,960 --> 00:39:22,280 Speaker 1: as well as any social media handles you'd like to share. 783 00:39:22,480 --> 00:39:24,719 Speaker 1: Oh yeah, so you can find my work with Ultra 784 00:39:24,800 --> 00:39:26,880 Speaker 1: Violet and we are ultra violet dot org. We would 785 00:39:26,880 --> 00:39:28,400 Speaker 1: love to have you there. Check out some of our 786 00:39:28,400 --> 00:39:32,799 Speaker 1: cool campaigns, including our campaign, a Feminist Net, to make 787 00:39:32,800 --> 00:39:35,640 Speaker 1: a feminist, anti-racist Internet, so definitely check that out. 788 00:39:35,960 --> 00:39:37,960 Speaker 1: You can find my podcast on I Heart Radio. You 789 00:39:38,000 --> 00:39:40,799 Speaker 1: can find it on Spotify, Apple Podcasts, wherever you get 790 00:39:40,840 --> 00:39:42,560 Speaker 1: your podcasts. It's called There Are No Girls on 791 00:39:42,600 --> 00:39:45,319 Speaker 1: the Internet. You can follow me on social media. I'm 792 00:39:45,360 --> 00:39:48,359 Speaker 1: at Bridget Marie on Twitter and at Bridget Marie 793 00:39:48,440 --> 00:39:51,239 Speaker 1: in DC on Instagram. Perfect. Well, we will be sure 794 00:39:51,280 --> 00:39:53,000 Speaker 1: to include all of that in the show notes. Thank 795 00:39:53,000 --> 00:39:54,800 Speaker 1: you so much for sharing with us today, Bridget. 796 00:39:54,800 --> 00:39:57,680 Speaker 1: I really appreciate it. Oh, my pleasure. This was super fun, 797 00:39:57,800 --> 00:40:03,279 Speaker 1: dream come true. Thank you. I'm so glad Bridget was 798 00:40:03,320 --> 00:40:06,160 Speaker 1: able to share her expertise with us today. To learn 799 00:40:06,160 --> 00:40:09,080 Speaker 1: more about her and to check out her podcast, be 800 00:40:09,160 --> 00:40:11,319 Speaker 1: sure to visit the show notes at Therapy for Black 801 00:40:11,320 --> 00:40:15,239 Speaker 1: Girls dot com slash session to and be sure to 802 00:40:15,280 --> 00:40:17,160 Speaker 1: text two of your girls and tell them to check 803 00:40:17,160 --> 00:40:20,360 Speaker 1: out the episode right now.
If you're looking for a 804 00:40:20,400 --> 00:40:22,839 Speaker 1: therapist in your area, be sure to check out our 805 00:40:22,880 --> 00:40:26,760 Speaker 1: therapist directory at Therapy for Black Girls dot com slash directory. 806 00:40:27,280 --> 00:40:29,520 Speaker 1: And if you want to continue digging into this topic 807 00:40:29,920 --> 00:40:32,600 Speaker 1: or just be in community with other sisters, come on 808 00:40:32,680 --> 00:40:35,080 Speaker 1: over and join us in the Sister Circle. It's our 809 00:40:35,120 --> 00:40:37,840 Speaker 1: cozy corner of the Internet designed just for black women. 810 00:40:38,440 --> 00:40:40,879 Speaker 1: You can join us at community dot Therapy for Black 811 00:40:40,920 --> 00:40:45,120 Speaker 1: Girls dot com. This episode was produced by Fredia Lucas 812 00:40:45,120 --> 00:40:48,560 Speaker 1: and Ellice Ellis, and editing was done by Dennison Bradford. 813 00:40:49,520 --> 00:40:51,720 Speaker 1: Thank you all so much for joining me again this week. 814 00:40:52,160 --> 00:40:54,520 Speaker 1: I look forward to continuing this conversation with you 815 00:40:54,560 --> 00:41:01,240 Speaker 1: all real soon. Take good care.