1 00:00:11,760 --> 00:00:14,720 Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with me, 2 00:00:14,960 --> 00:00:19,920 Speaker 1: your girl, Danielle Moodie, recording from the home bunker today. 3 00:00:20,520 --> 00:00:24,160 Speaker 1: I woke up and did not want to get out 4 00:00:24,200 --> 00:00:28,320 Speaker 1: of bed, and so I laid there for at least 5 00:00:28,360 --> 00:00:33,960 Speaker 1: forty five minutes. I am feeling incredibly heavy, incredibly low, 6 00:00:34,880 --> 00:00:40,559 Speaker 1: and doing all that I can to muster the energy 7 00:00:41,479 --> 00:00:47,000 Speaker 1: to keep pushing forward. We've all been dealing with just 8 00:00:47,440 --> 00:00:52,880 Speaker 1: non stop havoc on our nervous systems for the last 9 00:00:52,920 --> 00:00:56,320 Speaker 1: two weeks. But if we're really honest, the last eight years, 10 00:00:57,480 --> 00:01:00,440 Speaker 1: and so, if you do find yourself like I do, 11 00:01:00,640 --> 00:01:05,600 Speaker 1: where you are at a very low point, you find 12 00:01:05,640 --> 00:01:08,680 Speaker 1: yourself at a very low moment. I want to offer 13 00:01:08,760 --> 00:01:13,720 Speaker 1: to you to just give in to what your body needs, 14 00:01:13,959 --> 00:01:17,640 Speaker 1: which is rest, which is a social media break, which 15 00:01:17,720 --> 00:01:21,680 Speaker 1: is a news break, and go and do things that 16 00:01:21,840 --> 00:01:26,559 Speaker 1: nourish you. Because what I think is happening right now 17 00:01:27,520 --> 00:01:33,839 Speaker 1: with this inundation of just bad judges, bad decisions, bad 18 00:01:34,040 --> 00:01:39,760 Speaker 1: people gaining power and reverence on social media, in corporate media, 19 00:01:40,800 --> 00:01:46,920 Speaker 1: the fall of our democracy, of places like MSNBC, who 20 00:01:47,040 --> 00:01:51,240 Speaker 1: now have pretty much shown and revealed who they are 21 00:01:52,120 --> 00:01:57,520 Speaker 1: with inviting Eric Trump on air, Katy Tur, really, with 22 00:01:58,400 --> 00:02:01,520 Speaker 1: removing Morning Joe from the air and lying to the 23 00:02:01,560 --> 00:02:05,880 Speaker 1: hosts about why. Really, like, that's what we're doing, all 24 00:02:06,000 --> 00:02:11,680 Speaker 1: to comply and be complicit in the anointing of Donald 25 00:02:11,680 --> 00:02:15,520 Speaker 1: Trump as king. It's vile and it's disgusting, and I 26 00:02:15,520 --> 00:02:19,079 Speaker 1: think that more and more people are recognizing that they 27 00:02:19,120 --> 00:02:23,040 Speaker 1: are not to be trusted. And so I just woke 28 00:02:23,160 --> 00:02:30,440 Speaker 1: up feeling really heavy and also more committed to telling 29 00:02:30,480 --> 00:02:36,520 Speaker 1: the truth, to doing the work, to uplifting the truth, 30 00:02:37,280 --> 00:02:41,680 Speaker 1: and trying to wake as many people up as humanly 31 00:02:41,800 --> 00:02:46,560 Speaker 1: fucking possible. The announcement of JD Vance as Donald Trump's 32 00:02:46,639 --> 00:02:50,040 Speaker 1: running mate tells you everything that you need to know 33 00:02:50,400 --> 00:02:54,639 Speaker 1: about Republicans. Donald Trump could have, if he was interested 34 00:02:54,680 --> 00:02:58,640 Speaker 1: in actually gaining voters, chosen Nikki Haley.
Now, I think 35 00:02:58,760 --> 00:03:03,160 Speaker 1: Nimarata is a piece of shit, But what it would 36 00:03:03,160 --> 00:03:07,360 Speaker 1: have indicated, at least to Republican women, is that you 37 00:03:07,480 --> 00:03:11,120 Speaker 1: believe in their power and their purpose as well. Donald 38 00:03:11,160 --> 00:03:14,040 Speaker 1: Trump said, yeah, nah, she don't have the complexion. 39 00:03:13,560 --> 00:03:14,160 Speaker 2: For that, right. 40 00:03:14,280 --> 00:03:18,560 Speaker 1: Remember, but you saw her just this week on the 41 00:03:18,600 --> 00:03:21,519 Speaker 1: stage at the RNC, bending the knee, kissing the ring, 42 00:03:22,560 --> 00:03:25,400 Speaker 1: and doing what she can to stay relevant inside of 43 00:03:25,440 --> 00:03:29,080 Speaker 1: a party that sees her as nothing more than the 44 00:03:29,120 --> 00:03:34,240 Speaker 1: fucking help. So choosing JD Vance, a man that has 45 00:03:34,280 --> 00:03:37,440 Speaker 1: said that he believes in banning abortion one hundred percent. No, 46 00:03:38,280 --> 00:03:43,640 Speaker 1: there is no cause in his mind that rape or 47 00:03:43,720 --> 00:03:48,280 Speaker 1: incest or anything horrific would constitute a second thought. He said, no, 48 00:03:48,680 --> 00:03:52,120 Speaker 1: just rape is inconvenient, right. But two wrongs don't make 49 00:03:52,160 --> 00:03:54,880 Speaker 1: a right, which is what he said. Tell that to 50 00:03:55,000 --> 00:03:59,680 Speaker 1: victims of sexual assault. He has said that women should 51 00:03:59,680 --> 00:04:03,280 Speaker 1: stay inside of abusive relationships because it's for 52 00:04:03,360 --> 00:04:07,160 Speaker 1: the betterment of the children. He has said that 53 00:04:07,360 --> 00:04:12,200 Speaker 1: LGBTQ plus people are an abomination and wants to roll 54 00:04:12,280 --> 00:04:17,520 Speaker 1: back any protections that the community has gained in the 55 00:04:17,520 --> 00:04:23,000 Speaker 1: past several decades. He is a white supremacist. He is 56 00:04:23,600 --> 00:04:25,880 Speaker 1: a liar. He is a man that said in twenty 57 00:04:25,920 --> 00:04:29,960 Speaker 1: sixteen that Donald Trump was America's Hitler, and now in 58 00:04:30,000 --> 00:04:33,400 Speaker 1: twenty twenty four is shuckin' and jivin' on the stage 59 00:04:33,400 --> 00:04:36,479 Speaker 1: with him. These people are not to be trusted, and 60 00:04:36,480 --> 00:04:38,240 Speaker 1: they're also to be taken. 61 00:04:38,040 --> 00:04:38,839 Speaker 2: At their word. 62 00:04:40,000 --> 00:04:42,640 Speaker 1: So do not sit this election out. Do not think 63 00:04:43,160 --> 00:04:44,760 Speaker 1: that you're going to get another bite at the apple. 64 00:04:44,800 --> 00:04:48,000 Speaker 1: Do not waste your vote on a third party. Use 65 00:04:48,120 --> 00:04:53,480 Speaker 1: your power, your platform, your treasure, your talents to secure 66 00:04:53,560 --> 00:04:57,120 Speaker 1: our democracy over the next three and a half months. 67 00:04:57,640 --> 00:05:02,080 Speaker 1: Because where who we have been with... wait for what's coming 68 00:05:02,160 --> 00:05:05,440 Speaker 1: up next, dear friends: my conversation with friend of the 69 00:05:05,480 --> 00:05:10,440 Speaker 1: show Bridget Todd, the host of the pod There Are 70 00:05:10,440 --> 00:05:13,360 Speaker 1: No Girls on the Internet. We get into a great 71 00:05:13,360 --> 00:05:20,640 Speaker 1: conversation about tech, about democracy, about this upcoming election, and more, folks.
72 00:05:20,760 --> 00:05:25,240 Speaker 1: I am so excited to welcome back to Woke AF Daily 73 00:05:25,279 --> 00:05:27,919 Speaker 1: Bridget Todd, the host of There Are No Girls on 74 00:05:28,000 --> 00:05:33,600 Speaker 1: the Internet, which is launching its fourth season on iHeart 75 00:05:33,680 --> 00:05:36,800 Speaker 1: and Outspoken, and I'm so excited to have you. Every 76 00:05:36,800 --> 00:05:39,320 Speaker 1: time that I have you on, we have such rich 77 00:05:39,360 --> 00:05:43,720 Speaker 1: conversations about the world, the culture, society that we are 78 00:05:43,760 --> 00:05:47,760 Speaker 1: living in right now. The last time we spoke last year, 79 00:05:47,880 --> 00:05:50,600 Speaker 1: these are some of the things that hadn't happened that 80 00:05:50,720 --> 00:05:53,880 Speaker 1: have happened in that timeframe, which is kind of wild 81 00:05:53,960 --> 00:05:57,320 Speaker 1: when I was like looking at this. So since then, 82 00:05:57,560 --> 00:06:03,000 Speaker 1: Twitter has become X, Threads hadn't taken off yet, and TikTok 83 00:06:03,240 --> 00:06:09,200 Speaker 1: was not facing a federally mandated ban. Just those three things, 84 00:06:09,200 --> 00:06:14,320 Speaker 1: and AI was still, I think, being talked about on 85 00:06:14,360 --> 00:06:18,320 Speaker 1: the margins, you know, within the industry, but not in 86 00:06:18,360 --> 00:06:22,120 Speaker 1: a mainstream type setting, and now it's everywhere and it's 87 00:06:22,120 --> 00:06:26,320 Speaker 1: in every conversation. So Bridget, jump in. Let's start with 88 00:06:26,480 --> 00:06:29,680 Speaker 1: Threads and the ban on TikTok and kind of the 89 00:06:29,720 --> 00:06:36,599 Speaker 1: politicization of information, disinformation, and kind of what were you 90 00:06:36,760 --> 00:06:40,600 Speaker 1: thinking about as these platforms were shifting and bans were 91 00:06:40,880 --> 00:06:41,720 Speaker 1: being announced. 92 00:06:42,160 --> 00:06:44,960 Speaker 3: Yeah, I do feel like we are in a really 93 00:06:45,000 --> 00:06:47,880 Speaker 3: specific moment for somebody that has been on the Internet 94 00:06:47,920 --> 00:06:50,719 Speaker 3: for a very long time, sort of covering and thinking 95 00:06:50,760 --> 00:06:53,600 Speaker 3: about the Internet and how we get our information, how 96 00:06:53,600 --> 00:06:56,160 Speaker 3: we connect. I can't recall a time that we were 97 00:06:56,160 --> 00:06:58,960 Speaker 3: ever in a moment like this one where you have 98 00:06:59,680 --> 00:07:04,080 Speaker 3: major, you know, communications platforms like Twitter essentially completely 99 00:07:04,160 --> 00:07:07,440 Speaker 3: being disrupted because of one billionaire. That's new. You have 100 00:07:07,640 --> 00:07:11,760 Speaker 3: other companies trying to sort of fill that gap with Threads.
101 00:07:12,120 --> 00:07:15,800 Speaker 3: You've got, you know, as you said, TikTok, which really 102 00:07:16,000 --> 00:07:18,440 Speaker 3: was taking up a lion's share of how people were 103 00:07:18,480 --> 00:07:22,280 Speaker 3: getting their information on social media, facing a ban. All 104 00:07:22,320 --> 00:07:24,640 Speaker 3: of this happening against the backdrop of the rise of 105 00:07:24,680 --> 00:07:27,880 Speaker 3: things like generative AI, which we know is really 106 00:07:27,920 --> 00:07:30,640 Speaker 3: poised to change our information ecosystem, and already has, with 107 00:07:30,800 --> 00:07:33,560 Speaker 3: things like, you know, politically motivated deepfakes and cheap 108 00:07:33,600 --> 00:07:35,760 Speaker 3: fakes and things like that. I don't know, it's such 109 00:07:35,760 --> 00:07:42,320 Speaker 3: a weird time. Personally, I really see people experiencing a 110 00:07:42,320 --> 00:07:45,200 Speaker 3: lot of fatigue, and I think I'm there myself with like, well, 111 00:07:45,200 --> 00:07:48,360 Speaker 3: what platform can I use? Or you know, when something 112 00:07:48,440 --> 00:07:51,320 Speaker 3: happens in my community. For the longest time, where you 113 00:07:51,360 --> 00:07:54,720 Speaker 3: would go for like a real time update was Twitter, right, Like, yep, 114 00:07:54,800 --> 00:07:56,280 Speaker 3: you know, I would go to Twitter to see what my. 115 00:07:56,280 --> 00:07:57,640 Speaker 2: Mayor was saying about something. 116 00:07:57,920 --> 00:07:59,960 Speaker 3: Now I have no idea, I don't know where to go, 117 00:08:00,480 --> 00:08:03,880 Speaker 3: and so I think that we're really in this no 118 00:08:04,040 --> 00:08:07,240 Speaker 3: man's land of how you just get access to information, 119 00:08:07,800 --> 00:08:11,040 Speaker 3: all of that in the lead up to a critical 120 00:08:11,120 --> 00:08:13,880 Speaker 3: election year, not just in the United States but globally, right, 121 00:08:13,920 --> 00:08:18,400 Speaker 3: And so I'm concerned, I guess, is the word I'd use. 122 00:08:18,480 --> 00:08:22,560 Speaker 3: I'm a little bit concerned about our information ecosystem online. 123 00:08:22,920 --> 00:08:26,000 Speaker 1: Yeah, I think that, particularly when you make the point 124 00:08:26,160 --> 00:08:35,520 Speaker 1: that our global town square was disrupted by one right wing, racist, misogynist, 125 00:08:35,720 --> 00:08:42,840 Speaker 1: transphobic billionaire, shows you how broken our systems of communication 126 00:08:43,440 --> 00:08:47,480 Speaker 1: and gathering in media truly are. Right when what we 127 00:08:47,520 --> 00:08:51,880 Speaker 1: had seen from, let's say, the murder of Trayvon Martin, 128 00:08:53,000 --> 00:08:56,200 Speaker 1: the murder of Mike Brown, and I'm just, I'm giving 129 00:08:56,240 --> 00:09:00,000 Speaker 1: like just specific time points, the murder of George Floyd. 130 00:09:00,520 --> 00:09:04,280 Speaker 1: So you're looking at, you know, almost seven eight years 131 00:09:04,480 --> 00:09:08,800 Speaker 1: of time. In that span, what we saw were people 132 00:09:08,880 --> 00:09:13,719 Speaker 1: being able to quickly gather, people being able to organize online, 133 00:09:13,880 --> 00:09:18,520 Speaker 1: and then transition that organizing online into organizing in real life.
134 00:09:19,400 --> 00:09:22,640 Speaker 1: Then because of the bubbling up that was happening on Twitter, 135 00:09:22,840 --> 00:09:26,760 Speaker 1: finally mainstream media was covering what was actually happening on 136 00:09:26,800 --> 00:09:30,720 Speaker 1: the ground in these communities where these murders took place. 137 00:09:31,840 --> 00:09:34,720 Speaker 1: That was, I think, a bat signal, Bridget, to the 138 00:09:34,840 --> 00:09:38,600 Speaker 1: right wing that was like, Oh, they can't have this, 139 00:09:39,160 --> 00:09:43,320 Speaker 1: they can't have this centralized organizing, and we need something 140 00:09:43,440 --> 00:09:46,920 Speaker 1: and someone to disrupt it. How do you look at 141 00:09:47,080 --> 00:09:53,000 Speaker 1: the devolving of an X versus the proposed ban on 142 00:09:53,080 --> 00:09:55,320 Speaker 1: a TikTok? How do you look at that on the 143 00:09:55,400 --> 00:09:56,800 Speaker 1: spectrum of where we are? 144 00:09:57,240 --> 00:09:59,480 Speaker 3: Yeah, I mean, so the way that you broke down 145 00:09:59,520 --> 00:10:03,880 Speaker 3: that timeline is so, I think, needed and necessary, because 146 00:10:04,080 --> 00:10:07,400 Speaker 3: I do think the reason why Twitter became this battle 147 00:10:07,400 --> 00:10:09,320 Speaker 3: ground is because it really was a place where people 148 00:10:09,320 --> 00:10:13,040 Speaker 3: who traditionally didn't have access to traditional means of power 149 00:10:13,120 --> 00:10:15,240 Speaker 3: were able to build up power. So people like Elon 150 00:10:15,400 --> 00:10:16,880 Speaker 3: Musk saw that and they were like, we need to 151 00:10:17,000 --> 00:10:20,840 Speaker 3: disrupt that. Right When I think about TikTok, I think 152 00:10:20,880 --> 00:10:24,720 Speaker 3: that TikTok is a platform that it's not surprising to 153 00:10:24,760 --> 00:10:26,720 Speaker 3: me that you have a lot of people kind of 154 00:10:26,720 --> 00:10:29,480 Speaker 3: fear mongering about TikTok, because I do think it's one 155 00:10:29,480 --> 00:10:31,199 Speaker 3: of the last places that people can go and just 156 00:10:31,240 --> 00:10:35,800 Speaker 3: like express themselves. I think it's interesting how the conversation 157 00:10:35,880 --> 00:10:40,319 Speaker 3: around the platform has been demonized. But other platforms that 158 00:10:40,360 --> 00:10:42,840 Speaker 3: I could tell you twenty different ways that 159 00:10:42,880 --> 00:10:46,200 Speaker 3: they're harming us, it's not similarly demonized. And 160 00:10:46,240 --> 00:10:48,160 Speaker 3: so I think that what lawmakers who are trying to 161 00:10:48,160 --> 00:10:51,800 Speaker 3: ban TikTok are really saying is that American owned companies 162 00:10:51,960 --> 00:10:54,040 Speaker 3: should be able to harm us in this way. 163 00:10:54,080 --> 00:10:56,560 Speaker 2: Like that's like, like that's my like big theory.
164 00:10:56,880 --> 00:11:00,439 Speaker 3: And I also think like there's a kind of a like, oh, 165 00:11:00,559 --> 00:11:03,400 Speaker 3: you kids with your phones kind of mentality about it, 166 00:11:03,440 --> 00:11:06,040 Speaker 3: where because it's a platform that has really taken off 167 00:11:06,040 --> 00:11:09,080 Speaker 3: with younger people, it's really easily demonized, and so you 168 00:11:09,080 --> 00:11:11,520 Speaker 3: have these lawmakers being like, oh, like, you know, young 169 00:11:11,559 --> 00:11:15,760 Speaker 3: people couldn't possibly feel xyz about Palestine, it has to 170 00:11:15,800 --> 00:11:19,040 Speaker 3: be TikTok is like rotting their brains, and then they 171 00:11:19,200 --> 00:11:22,480 Speaker 3: turn around and want that same demographic to vote for them. 172 00:11:22,480 --> 00:11:24,160 Speaker 3: It's like, on the one hand, they are both like 173 00:11:24,400 --> 00:11:28,439 Speaker 3: devaluing what young people have to say while also being like, oh, 174 00:11:28,440 --> 00:11:30,040 Speaker 3: but we want your votes. And so I just think 175 00:11:30,040 --> 00:11:33,559 Speaker 3: the whole conversation around TikTok, on the backdrop of how 176 00:11:33,559 --> 00:11:36,920 Speaker 3: it's unfolding with X, like, it just feels really disingenuous, 177 00:11:36,960 --> 00:11:41,640 Speaker 3: and I don't know that it's a conversation that is 178 00:11:41,679 --> 00:11:45,439 Speaker 3: really benefiting us and really making us safer. If lawmakers 179 00:11:45,440 --> 00:11:47,480 Speaker 3: want to have a conversation about how we make social 180 00:11:47,559 --> 00:11:51,480 Speaker 3: media platforms safer, how we stop social media platforms from 181 00:11:51,480 --> 00:11:54,720 Speaker 3: profiting from harming people and taking all of our data, 182 00:11:54,760 --> 00:11:57,560 Speaker 3: and have an actual conversation about like a robust data 183 00:11:57,600 --> 00:11:59,080 Speaker 3: privacy policy, I am. 184 00:11:59,000 --> 00:11:59,600 Speaker 2: all for it. 185 00:12:00,000 --> 00:12:03,160 Speaker 3: I don't think just banning one platform gets us there. 186 00:12:06,400 --> 00:12:08,760 Speaker 1: We can all list off the top of our 187 00:12:08,840 --> 00:12:13,760 Speaker 1: heads the ways in which American platforms, American run platforms, 188 00:12:13,800 --> 00:12:18,080 Speaker 1: have been abusing our data and abusing us since like 189 00:12:18,240 --> 00:12:22,680 Speaker 1: their beginning. What responsibility did Meta take for the twenty 190 00:12:22,679 --> 00:12:26,800 Speaker 1: sixteen election? They had to pay fines over in Europe, 191 00:12:26,840 --> 00:12:28,360 Speaker 1: like a slap on the wrist, but it was at 192 00:12:28,440 --> 00:12:32,000 Speaker 1: least like a billion or so. What was the responsibility? 193 00:12:32,040 --> 00:12:35,760 Speaker 1: What was the accountability that Meta had to face? YouTube 194 00:12:35,800 --> 00:12:38,920 Speaker 1: and others that were, you know, for Google, like in 195 00:12:39,040 --> 00:12:45,319 Speaker 1: terms of broadcasting and platforming disinformation, misinformation and white supremacy.
196 00:12:45,640 --> 00:12:48,720 Speaker 3: And in fact, I would argue that platforms like YouTube, 197 00:12:48,720 --> 00:12:51,600 Speaker 3: that is owned by Google, in the recent months, they've 198 00:12:51,640 --> 00:12:55,240 Speaker 3: actually walked back what very little guardrails they put up 199 00:12:55,280 --> 00:12:58,040 Speaker 3: after twenty sixteen and twenty twenty, they've walked back. And 200 00:12:58,080 --> 00:13:00,720 Speaker 3: so for a while after the twenty twenty two election they 201 00:13:00,720 --> 00:13:04,120 Speaker 3: were like, oh, we won't allow election denialism on our platform. 202 00:13:04,200 --> 00:13:05,880 Speaker 3: They walked that back a couple months ago ahead of 203 00:13:05,920 --> 00:13:08,439 Speaker 3: the upcoming election. So not only is 204 00:13:08,440 --> 00:13:11,240 Speaker 3: it a lack of accountability, I think that they're just like, 205 00:13:11,720 --> 00:13:13,560 Speaker 3: we can do whatever we want and 206 00:13:13,559 --> 00:13:15,480 Speaker 3: people are not gonna stop using these platforms, and so 207 00:13:15,520 --> 00:13:18,360 Speaker 3: I think the dynamic has really become that they don't 208 00:13:18,400 --> 00:13:21,320 Speaker 3: have to be accountable to us, the people who use 209 00:13:21,400 --> 00:13:23,920 Speaker 3: these platforms and make them billions, and we really have 210 00:13:24,000 --> 00:13:27,160 Speaker 3: to shift that because these platforms would be nothing without us. 211 00:13:27,240 --> 00:13:29,800 Speaker 2: They've got us and aren't being accountable to us. 212 00:13:29,600 --> 00:13:32,840 Speaker 1: You know? And I wonder, because to that point, how? 213 00:13:32,880 --> 00:13:38,280 Speaker 1: Because I feel like we have also become intellectually enslaved 214 00:13:38,360 --> 00:13:41,520 Speaker 1: to these platforms and their algorithms. If you are a 215 00:13:41,520 --> 00:13:44,640 Speaker 1: content creator, right, like you and I are content creators, 216 00:13:44,800 --> 00:13:47,880 Speaker 1: the only way for you to get pops is to 217 00:13:47,960 --> 00:13:51,640 Speaker 1: basically live on those apps, right, be able to put 218 00:13:51,760 --> 00:13:55,000 Speaker 1: up five, ten, fifteen videos a day, to be able 219 00:13:55,040 --> 00:13:58,040 Speaker 1: to push out content to meet your audience, and for 220 00:13:58,120 --> 00:14:01,640 Speaker 1: folks to stay on their screens all day every day. 221 00:14:01,920 --> 00:14:05,680 Speaker 1: And so because of that, I'm like, how? Like, is 222 00:14:05,760 --> 00:14:08,240 Speaker 1: it, is it a boycott? Because I don't see a 223 00:14:08,240 --> 00:14:11,080 Speaker 1: bunch of people saying, you know what, I'm good. So 224 00:14:11,360 --> 00:14:14,760 Speaker 1: in your mind, what does it look like for us, 225 00:14:15,160 --> 00:14:17,880 Speaker 1: as those that are the source of their power and 226 00:14:17,960 --> 00:14:22,800 Speaker 1: their wealth, to make them accountable even when the people 227 00:14:22,800 --> 00:14:25,760 Speaker 1: that we are electing to regulate them and hold them 228 00:14:25,760 --> 00:14:28,920 Speaker 1: accountable don't? What a good question. 229 00:14:29,320 --> 00:14:31,840 Speaker 3: I mean, I think part of it is almost like 230 00:14:31,840 --> 00:14:36,920 Speaker 3: a philosophical answer of really putting these platforms in an 231 00:14:36,960 --> 00:14:39,920 Speaker 3: appropriate place in our lives and in our media diets.
232 00:14:40,200 --> 00:14:42,320 Speaker 3: You know, I love technology and I love the Internet, 233 00:14:42,360 --> 00:14:44,280 Speaker 3: and I hate that we're in a place where when 234 00:14:44,280 --> 00:14:47,760 Speaker 3: I'm talking about technology and the Internet, invariably I'm talking 235 00:14:47,760 --> 00:14:52,120 Speaker 3: about like three different companies, Facebook, Twitter, TikTok, I guess 236 00:14:52,120 --> 00:14:54,800 Speaker 3: four with Google. And it's like, we've gotten to this place 237 00:14:54,840 --> 00:15:00,120 Speaker 3: where these four massive companies, mostly run by white male billionaires, 238 00:15:00,120 --> 00:15:02,280 Speaker 3: have become synonymous with the Internet. And so I think 239 00:15:02,320 --> 00:15:07,080 Speaker 3: that ultimately we deserve an Internet ecosystem that is more robust, richer, 240 00:15:07,160 --> 00:15:10,400 Speaker 3: more dynamic, has more players than like four people making 241 00:15:10,480 --> 00:15:12,800 Speaker 3: decisions for how we will get our content. And I 242 00:15:12,840 --> 00:15:16,080 Speaker 3: don't know, what you said about being a content creator 243 00:15:16,240 --> 00:15:19,760 Speaker 3: and what it feels like to exist in these platforms 244 00:15:20,000 --> 00:15:22,400 Speaker 3: really speaks to me, because it is so exhausting. Like 245 00:15:22,480 --> 00:15:27,640 Speaker 3: it's exhausting. Have you ever seen the guidelines from Adam Mosseri, 246 00:15:27,680 --> 00:15:29,440 Speaker 3: who runs Instagram, where it will be like, here's how 247 00:15:29,440 --> 00:15:33,280 Speaker 3: you succeed on Instagram? And it'll be like, post five 248 00:15:33,320 --> 00:15:35,960 Speaker 3: Reels a day, post five times on the grid, and 249 00:15:36,000 --> 00:15:38,080 Speaker 3: it's like, Okay, well, I have a job. My job 250 00:15:38,120 --> 00:15:40,960 Speaker 3: is not full time Instagram creator. I have a family, 251 00:15:41,040 --> 00:15:44,120 Speaker 3: I have a life, and maybe the kind of insight 252 00:15:44,120 --> 00:15:47,080 Speaker 3: and perspective that I'm able to bring, I need to 253 00:15:47,120 --> 00:15:49,840 Speaker 3: spend time not thinking about Instagram to do it. I 254 00:15:49,840 --> 00:15:50,760 Speaker 3: need to spend time in. 255 00:15:50,720 --> 00:15:55,800 Speaker 1: The world like thinking and existing. Right, yeah, yeah, Because 256 00:15:55,800 --> 00:15:57,840 Speaker 1: there was a point where I was really trying to 257 00:15:58,640 --> 00:16:01,920 Speaker 1: develop my TikTok presence, right, and I had gotten some 258 00:16:02,040 --> 00:16:06,520 Speaker 1: really good tutorials from much younger content creators who were 259 00:16:06,520 --> 00:16:08,920 Speaker 1: like, Danielle, I think that you could translate to TikTok, 260 00:16:09,160 --> 00:16:11,440 Speaker 1: and here are some ideas, right, and just 261 00:16:11,480 --> 00:16:15,040 Speaker 1: give it a go. And they're like, also, you need 262 00:16:15,080 --> 00:16:19,800 Speaker 1: to put up at least nine videos a day, And 263 00:16:20,000 --> 00:16:22,760 Speaker 1: I was like, in. 264 00:16:22,640 --> 00:16:25,840 Speaker 2: What world? It's not happening, like, not in what 265 00:16:25,960 --> 00:16:26,520 Speaker 2: world?
266 00:16:27,000 --> 00:16:29,080 Speaker 1: But there was like a span of time, Bridget, I 267 00:16:29,120 --> 00:16:31,600 Speaker 1: shit you not, where I was putting up like six 268 00:16:31,680 --> 00:16:34,960 Speaker 1: videos a day, and then at the end of like 269 00:16:35,080 --> 00:16:39,480 Speaker 1: that month, I was burnt out. I didn't want to 270 00:16:39,560 --> 00:16:42,240 Speaker 1: log onto TikTok, I didn't want to share a video. 271 00:16:42,560 --> 00:16:45,760 Speaker 1: I just didn't have it, you know. And so I 272 00:16:45,840 --> 00:16:48,840 Speaker 1: also want people to understand that those that try to 273 00:16:49,000 --> 00:16:51,520 Speaker 1: make a living and build a presence and a platform, 274 00:16:51,960 --> 00:16:55,240 Speaker 1: like again, we don't own any of the data. So 275 00:16:55,520 --> 00:16:58,280 Speaker 1: the tens of thousands of people that started following me 276 00:16:58,680 --> 00:17:01,560 Speaker 1: on TikTok, it's not like I have their email addresses, right, 277 00:17:01,680 --> 00:17:03,800 Speaker 1: It's not like I have access to them to bring 278 00:17:03,840 --> 00:17:07,520 Speaker 1: them into different formats and places, and so again it 279 00:17:07,560 --> 00:17:11,280 Speaker 1: is really about us looking and saying, am I really 280 00:17:11,320 --> 00:17:14,199 Speaker 1: in control of this? When I say that I'm a 281 00:17:14,240 --> 00:17:16,919 Speaker 1: content creator, I say that I'm, you know, an 282 00:17:17,160 --> 00:17:20,399 Speaker 1: entrepreneur and doing these things, but like I think that 283 00:17:20,520 --> 00:17:23,240 Speaker 1: I have just traded one master for another. 284 00:17:23,800 --> 00:17:26,360 Speaker 3: That is a word. I mean, that is really how 285 00:17:26,400 --> 00:17:29,600 Speaker 3: it feels. And you know, let's say that you are 286 00:17:29,640 --> 00:17:32,080 Speaker 3: somebody who built up a huge following on TikTok, then 287 00:17:32,119 --> 00:17:34,280 Speaker 3: it gets banned, or you're somebody who built up a 288 00:17:34,359 --> 00:17:37,159 Speaker 3: huge following on Instagram, then they tweak the algorithm, or 289 00:17:37,200 --> 00:17:39,679 Speaker 3: the kind of content that you were told succeeds on 290 00:17:39,720 --> 00:17:42,720 Speaker 3: the platform no longer pops off. Like it's just not 291 00:17:42,880 --> 00:17:45,320 Speaker 3: something that I think is healthy. It's not something that 292 00:17:45,320 --> 00:17:47,840 Speaker 3: I think is really possible. Like I really had to 293 00:17:48,119 --> 00:17:50,200 Speaker 3: have like a come to Jesus moment of like, what 294 00:17:50,320 --> 00:17:52,000 Speaker 3: is it that I do well? 295 00:17:52,160 --> 00:17:54,040 Speaker 2: What is my value add? What do I like doing? 296 00:17:54,280 --> 00:17:57,040 Speaker 3: And that's not making videos on TikTok, right, like, and 297 00:17:57,400 --> 00:17:59,280 Speaker 3: I have to sort of be okay with that, that 298 00:17:59,680 --> 00:18:02,400 Speaker 3: not every, like... Like I also did the TikTok thing. 299 00:18:02,440 --> 00:18:04,719 Speaker 3: It was a huge flop for me, partly because I 300 00:18:04,720 --> 00:18:06,600 Speaker 3: was not willing to make a million videos a day, 301 00:18:07,119 --> 00:18:09,280 Speaker 3: And I kind of have to be okay with that 302 00:18:09,600 --> 00:18:12,840 Speaker 3: and hope that the people who resonate with what I 303 00:18:12,880 --> 00:18:15,600 Speaker 3: have to say will find me in other ways.
304 00:18:16,200 --> 00:18:17,760 Speaker 2: But it's hard. It's really hard. 305 00:18:17,840 --> 00:18:20,760 Speaker 3: But as you said, increasingly, and I think especially for 306 00:18:21,080 --> 00:18:23,600 Speaker 3: black folks, for women, for people who are traditionally marginalized, 307 00:18:23,920 --> 00:18:26,359 Speaker 3: you do have to have that presence on TikTok or 308 00:18:26,359 --> 00:18:28,720 Speaker 3: on social media to get your point across. It's not 309 00:18:28,800 --> 00:18:31,240 Speaker 3: like you could, like you know, if you ever go 310 00:18:31,440 --> 00:18:33,320 Speaker 3: to try to get like a book deal, the first 311 00:18:33,320 --> 00:18:35,199 Speaker 3: thing they ask you is like, how many followers do 312 00:18:35,240 --> 00:18:35,480 Speaker 3: you have? 313 00:18:35,800 --> 00:18:37,280 Speaker 2: Exactly? And so I. 314 00:18:37,320 --> 00:18:39,960 Speaker 3: recognize it's easy for me to say like, oh, just 315 00:18:39,960 --> 00:18:42,879 Speaker 3: don't care about your TikTok presence or whatever, because 316 00:18:42,920 --> 00:18:45,400 Speaker 3: the reality is it does matter. But I guess I'm 317 00:18:45,440 --> 00:18:48,320 Speaker 3: trying to do the individual personal work of making it 318 00:18:48,359 --> 00:18:51,320 Speaker 3: matter less in my own mind and my own understanding 319 00:18:51,359 --> 00:18:53,800 Speaker 3: of my work and my values and my value. 320 00:18:54,280 --> 00:18:57,960 Speaker 1: It is so much work, and it's so much compartmentalization, 321 00:18:58,480 --> 00:19:02,040 Speaker 1: right, and understanding like that you are not the product 322 00:19:02,040 --> 00:19:04,600 Speaker 1: that you are putting out. It's a very tricky game 323 00:19:04,600 --> 00:19:06,840 Speaker 1: as a content creator. I want to switch gears with 324 00:19:06,880 --> 00:19:09,359 Speaker 1: a couple of minutes that we have to talk about 325 00:19:09,400 --> 00:19:12,960 Speaker 1: generative AI and to talk about where we are, how 326 00:19:13,000 --> 00:19:15,960 Speaker 1: it's being used, and whether or not. And I know 327 00:19:16,040 --> 00:19:18,280 Speaker 1: that it's not black or white, that it is not, 328 00:19:18,560 --> 00:19:21,479 Speaker 1: you know, inside of the binary. But is it going 329 00:19:21,560 --> 00:19:24,600 Speaker 1: to be for our betterment, in your humble opinion, or 330 00:19:24,760 --> 00:19:29,600 Speaker 1: are we just essentially setting ourselves up for a massive like 331 00:19:29,760 --> 00:19:32,560 Speaker 1: failure of humanity? Because I tell you, I think in 332 00:19:32,640 --> 00:19:36,000 Speaker 1: only existential thoughts when I think about AI, 333 00:19:36,440 --> 00:19:39,560 Speaker 1: and no one has changed my mind on this, so please. 334 00:19:39,440 --> 00:19:41,600 Speaker 3: Sounds like somebody has been staring into the void and, 335 00:19:41,760 --> 00:19:46,320 Speaker 3: like, grappling with existential thoughts. Okay, correct. Yeah, I 336 00:19:46,320 --> 00:19:49,040 Speaker 3: mean, this is what I think about AI, right? At 337 00:19:49,040 --> 00:19:52,960 Speaker 3: my core, I am a techno optimist. I think that technology, 338 00:19:53,160 --> 00:19:55,600 Speaker 3: I've seen the way that it can connect people, inform people, 339 00:19:55,680 --> 00:19:57,439 Speaker 3: all of that.
I think that we are at a 340 00:19:57,440 --> 00:20:01,480 Speaker 3: critical moment with generative AI where we are running out 341 00:20:01,480 --> 00:20:04,600 Speaker 3: of time to decide whether or not this is going 342 00:20:04,640 --> 00:20:09,199 Speaker 3: to be technology that helps us or harms us. I 343 00:20:09,240 --> 00:20:12,119 Speaker 3: think it's all about the people that we center, because 344 00:20:12,320 --> 00:20:14,240 Speaker 3: you know, it's easy to think about AI as like 345 00:20:14,960 --> 00:20:18,520 Speaker 3: hyper intelligent, hyper aware robots, but it's people. It's made 346 00:20:18,560 --> 00:20:20,919 Speaker 3: by people, it's trained by people, it learns from us. 347 00:20:20,920 --> 00:20:23,159 Speaker 3: So we're really talking about people. And I think that 348 00:20:23,359 --> 00:20:25,800 Speaker 3: right now we have to decide if we are going 349 00:20:25,800 --> 00:20:30,880 Speaker 3: to center and follow the same kind of tech billionaires 350 00:20:31,000 --> 00:20:32,560 Speaker 3: that led us to this place that we were just 351 00:20:32,600 --> 00:20:34,960 Speaker 3: talking about a moment ago, Danielle, where you know, it 352 00:20:35,040 --> 00:20:38,560 Speaker 3: feels like our information ecosystem is crumbling. Do we want 353 00:20:38,600 --> 00:20:40,879 Speaker 3: to listen to those same people that got us to 354 00:20:40,920 --> 00:20:44,520 Speaker 3: this moment that feels so scary and tough when it 355 00:20:44,560 --> 00:20:48,640 Speaker 3: comes to making decisions about AI, or do we want 356 00:20:48,680 --> 00:20:53,600 Speaker 3: to listen to the hundreds of mostly marginalized people, right, 357 00:20:53,680 --> 00:20:57,159 Speaker 3: black women, women who have been raising the alarm about AI, 358 00:20:57,640 --> 00:21:01,520 Speaker 3: really having the conversations about ethics, talking about how we 359 00:21:01,600 --> 00:21:04,760 Speaker 3: can make AI that truly is people centered, and the 360 00:21:04,840 --> 00:21:06,919 Speaker 3: need to do that. I think we can go one 361 00:21:06,920 --> 00:21:11,240 Speaker 3: of two ways, but we are losing time to decide 362 00:21:11,280 --> 00:21:13,520 Speaker 3: where we're gonna go. And so what I think when 363 00:21:13,520 --> 00:21:15,159 Speaker 3: it comes to AI is like we really need to 364 00:21:15,160 --> 00:21:18,480 Speaker 3: have a shift of who we listen to, who we 365 00:21:18,520 --> 00:21:21,640 Speaker 3: decide is an expert, who we center, because right now 366 00:21:21,680 --> 00:21:23,560 Speaker 3: we're doing the same thing that we're doing with the 367 00:21:23,560 --> 00:21:26,399 Speaker 3: platforms, right, where we've decided a couple of companies like 368 00:21:26,440 --> 00:21:30,840 Speaker 3: OpenAI and others, mostly headed by white cis men, 369 00:21:31,200 --> 00:21:32,879 Speaker 3: and we've just decided like they are the people who 370 00:21:32,920 --> 00:21:35,120 Speaker 3: are going to figure this out, and I want something 371 00:21:35,119 --> 00:21:38,280 Speaker 3: different for us. We have voices that are offering a 372 00:21:38,320 --> 00:21:40,919 Speaker 3: different story and a different future. We just have to 373 00:21:40,960 --> 00:21:42,720 Speaker 3: decide that those are the voices that we are going 374 00:21:42,760 --> 00:21:43,159 Speaker 3: to center.
375 00:21:46,359 --> 00:21:48,800 Speaker 1: Last question for you, Bridget Todd, and this one is, 376 00:21:48,920 --> 00:21:51,400 Speaker 1: you know, around Pride, as, you know, we've been 377 00:21:51,440 --> 00:21:54,240 Speaker 1: saying on the show this month. You know, Pride is 378 00:21:54,240 --> 00:21:56,520 Speaker 1: three hundred and sixty five days of the year, particularly 379 00:21:56,560 --> 00:22:00,000 Speaker 1: when our community is under attack in so many ways, 380 00:22:00,320 --> 00:22:03,280 Speaker 1: from so many spaces and places. And I wonder, when 381 00:22:03,280 --> 00:22:06,960 Speaker 1: you say, like, you're a techno optimist, how that translates 382 00:22:07,040 --> 00:22:12,280 Speaker 1: into like your own multitude of identities at a time 383 00:22:12,560 --> 00:22:16,280 Speaker 1: when America and the world feel fairly dark. 384 00:22:17,119 --> 00:22:19,760 Speaker 3: Yeah, I'm just remembering that the last time we talked 385 00:22:19,880 --> 00:22:22,800 Speaker 3: was also Pride this time last year, and I was 386 00:22:22,840 --> 00:22:25,200 Speaker 3: telling you that I had gone out to a Pride 387 00:22:25,240 --> 00:22:30,640 Speaker 3: party and it felt different, Like I was like, I'm 388 00:22:30,680 --> 00:22:32,240 Speaker 3: a little nervous being here. 389 00:22:32,440 --> 00:22:33,280 Speaker 2: Is this safe? 390 00:22:33,320 --> 00:22:35,240 Speaker 3: But then when I kind of like leaned into the moment, 391 00:22:35,240 --> 00:22:37,040 Speaker 3: I was like, Oh, this actually feels good, and we 392 00:22:37,080 --> 00:22:40,119 Speaker 3: need to have these spaces. I think it really comes 393 00:22:40,160 --> 00:22:43,800 Speaker 3: down to that, that, like, sometimes it feels like just existing, 394 00:22:44,200 --> 00:22:48,280 Speaker 3: just living, just thriving, just being here is its own 395 00:22:48,640 --> 00:22:51,880 Speaker 3: kind of quiet, radical act, and so I'm leaning into that. 396 00:22:51,880 --> 00:22:55,200 Speaker 3: That like resistance and showing up every day and being 397 00:22:55,240 --> 00:22:57,960 Speaker 3: an optimist and having hope and thinking, making plans for 398 00:22:58,000 --> 00:23:00,640 Speaker 3: your future, being hopeful about the future, in a kind 399 00:23:00,680 --> 00:23:02,800 Speaker 3: of way, that is a radical act, and I'm sort 400 00:23:02,800 --> 00:23:06,280 Speaker 3: of clinging to that, you know, as everything feels like 401 00:23:06,320 --> 00:23:08,520 Speaker 3: it's burning; sometimes that's all I have to cling to. 402 00:23:09,359 --> 00:23:10,359 Speaker 2: Yeah, I get it. 403 00:23:10,440 --> 00:23:13,679 Speaker 1: I feel like we have to lean into the moment 404 00:23:13,720 --> 00:23:16,040 Speaker 1: that we're in, because I feel every single day we 405 00:23:16,080 --> 00:23:18,600 Speaker 1: are living history.
You know, I thought for the first 406 00:23:18,600 --> 00:23:21,000 Speaker 1: time the other day, and I don't know why, but 407 00:23:21,760 --> 00:23:25,680 Speaker 1: I have all of these conversations, as you do, all 408 00:23:25,720 --> 00:23:29,280 Speaker 1: through COVID, right, that are pretty much like audio diaries, 409 00:23:29,320 --> 00:23:32,600 Speaker 1: all of these podcasts of this moment, and at some 410 00:23:32,760 --> 00:23:36,760 Speaker 1: point in time ten, twenty, thirty, forty years from now, 411 00:23:36,920 --> 00:23:40,239 Speaker 1: people will be listening and being like, they had no 412 00:23:40,359 --> 00:23:43,280 Speaker 1: idea what was coming, or like, oh my god, they 413 00:23:43,280 --> 00:23:46,560 Speaker 1: did know, but they really didn't know, but like what 414 00:23:46,680 --> 00:23:49,000 Speaker 1: a moment to be living in. The thing that I 415 00:23:49,000 --> 00:23:52,400 Speaker 1: will end on is that I think that Pride is necessary. 416 00:23:52,760 --> 00:23:56,119 Speaker 1: It is necessary, particularly in these times, to seek a 417 00:23:56,240 --> 00:24:00,119 Speaker 1: serious amount of joy, to center it, to guard it, 418 00:24:00,200 --> 00:24:04,480 Speaker 1: because everything is at stake, including our health and well being. 419 00:24:04,640 --> 00:24:08,600 Speaker 1: So with that, happy Pride to you, Bridget. Please tell 420 00:24:08,640 --> 00:24:12,480 Speaker 1: people how they can connect with your new season of There 421 00:24:12,520 --> 00:24:13,600 Speaker 1: Are No Girls on the Internet. 422 00:24:13,760 --> 00:24:16,600 Speaker 3: Well, thank you so much for having me, Danielle. Happy Pride. 423 00:24:16,720 --> 00:24:18,760 Speaker 3: You can find the podcast There Are No Girls on 424 00:24:18,800 --> 00:24:21,840 Speaker 3: the Internet, on iHeartRadio and Outspoken, wherever you get your pods. 425 00:24:21,840 --> 00:24:24,240 Speaker 3: You can follow me on Instagram at Bridget Marie DC, 426 00:24:24,840 --> 00:24:27,040 Speaker 3: or on TikTok, where I just got around. 427 00:24:27,240 --> 00:24:28,760 Speaker 2: But if you want to follow it, if you want 428 00:24:28,800 --> 00:24:29,320 Speaker 2: to follow. 429 00:24:29,160 --> 00:24:32,600 Speaker 3: Me, it's at Bridget Makes Podcasts. Yeah, I'd love to 430 00:24:32,600 --> 00:24:33,040 Speaker 3: have you there. 431 00:24:33,880 --> 00:24:37,879 Speaker 1: Amazing. As always, thank you so much for making the 432 00:24:37,920 --> 00:24:39,800 Speaker 1: time for Woke AF, really appreciate you. 433 00:24:40,560 --> 00:24:41,200 Speaker 2: Thanks so much. 434 00:24:47,200 --> 00:24:51,720 Speaker 1: That is it for me today, friends, on Woke AF. As always, 435 00:24:52,119 --> 00:24:55,240 Speaker 1: power to the people, and to all the people, power. 436 00:24:55,359 --> 00:24:57,760 Speaker 1: Get woke and stay woke as fuck.