1 00:00:01,960 --> 00:00:05,200 Speaker 1: Locatora Radio is a radiophonic novela, which 2 00:00:05,280 --> 00:00:08,719 Speaker 1: is just a very extra way of saying a podcast. 3 00:00:09,160 --> 00:00:13,800 Speaker 1: I'm Diosa Femme, and I am Mala. Locatora Radio 4 00:00:14,000 --> 00:00:17,560 Speaker 1: is your free, most favorite podcast, hosted by us, Mala 5 00:00:17,680 --> 00:00:21,720 Speaker 1: and Diosa. We're two IG friends turned podcast partners, 6 00:00:21,800 --> 00:00:26,640 Speaker 1: breaking down pop culture, feminism, sexual wellness, and offering fresh 7 00:00:26,760 --> 00:00:30,400 Speaker 1: takes on trending topics through nuanced interviews with up and 8 00:00:30,440 --> 00:00:34,880 Speaker 1: coming Latinx creatives, known as Las Locatoras, las mamis 9 00:00:34,960 --> 00:00:41,360 Speaker 1: of myth and bullshit next door. We've been podcasting 10 00:00:41,440 --> 00:00:46,320 Speaker 1: independently since 2016, and we're bringing our radiophonic novela to the 11 00:00:46,440 --> 00:00:50,600 Speaker 1: My Cultura network to continue sharing stories from the Latinx community. 12 00:00:51,200 --> 00:00:55,880 Speaker 1: Welcome to Locatora Radio, Season Seven. Take us to your network. 13 00:00:57,400 --> 00:01:01,680 Speaker 1: Hola, locamores! Welcome back to Locatora Radio. I'm 14 00:01:01,720 --> 00:01:08,280 Speaker 1: Diosa, and I am Mala. Thanks for tuning in to one 15 00:01:08,520 --> 00:01:12,720 Speaker 1: forty-two. So last time on Locatora Radio, we 16 00:01:12,760 --> 00:01:19,679 Speaker 1: talked about our señoras, our favorite señoras, um, both 17 00:01:19,720 --> 00:01:21,880 Speaker 1: living and dead. If you have not tuned in, check 18 00:01:21,920 --> 00:01:24,679 Speaker 1: it out, leave us a comment. Today, this is the 19 00:01:24,720 --> 00:01:30,600 Speaker 1: Twitter-cide. We are talking about Twitter's demise, the downfall 20 00:01:30,640 --> 00:01:35,160 Speaker 1: of Twitter, Elon Musk, and just everything in the Twitterverse. 21 00:01:35,360 --> 00:01:37,720 Speaker 1: You know, we love Twitter, we live on Twitter. We 22 00:01:37,760 --> 00:01:40,560 Speaker 1: also kind of hate Twitter. We have mixed feelings. So 23 00:01:40,600 --> 00:01:43,680 Speaker 1: we're just gonna like dive right in. Yeah, so just 24 00:01:43,760 --> 00:01:47,040 Speaker 1: to like set the stage a little bit and remind folks, 25 00:01:47,120 --> 00:01:49,760 Speaker 1: either you've heard, or maybe you don't want to know anything 26 00:01:49,760 --> 00:01:52,600 Speaker 1: about it, so you haven't been keeping up. But basically, 27 00:01:52,800 --> 00:01:57,600 Speaker 1: billionaire Elon Musk bought Twitter, and since then it's been, well, 28 00:01:57,600 --> 00:01:59,200 Speaker 1: it's always been a hellscape, but it's been an 29 00:01:59,200 --> 00:02:03,640 Speaker 1: extra hellscape since he bought Twitter. There were major layoffs; 30 00:02:04,080 --> 00:02:09,960 Speaker 1: many of the, like, equity initiatives that were started at Twitter, 31 00:02:10,160 --> 00:02:14,600 Speaker 1: like Twitter Alas, which was like the Latino, Latinx hub, 32 00:02:15,040 --> 00:02:19,520 Speaker 1: also like Black Twitter, and like disability, like their 33 00:02:20,120 --> 00:02:24,640 Speaker 1: sections within Twitter.
The company had experienced layoffs, and then 34 00:02:24,680 --> 00:02:27,960 Speaker 1: Twitter employees were also let go and then rehired, because 35 00:02:28,160 --> 00:02:31,080 Speaker 1: Elon Musk realized, oh wait, I need the software engineers. 36 00:02:31,639 --> 00:02:35,160 Speaker 1: So it's been a mess. Yeah. Famously, he like walked 37 00:02:35,280 --> 00:02:41,440 Speaker 1: into Twitter headquarters holding a sink and then tweeted about 38 00:02:41,440 --> 00:02:44,760 Speaker 1: it, something like, everything but the kitchen sink. I could 39 00:02:44,880 --> 00:02:47,280 Speaker 1: go look at the exact tweet, but it was very 40 00:02:47,320 --> 00:02:50,359 Speaker 1: bad and cringey, and it didn't land. The joke 41 00:02:50,400 --> 00:02:53,520 Speaker 1: didn't land the way he wanted it to. And people 42 00:02:53,560 --> 00:02:57,720 Speaker 1: are like making all these comparisons between Elon Musk and, 43 00:02:57,800 --> 00:03:02,880 Speaker 1: like, Kendall Roy on Succession, you know, like, 44 00:03:03,120 --> 00:03:08,320 Speaker 1: sons of billionaire tycoons who like fuck things up, basically. 45 00:03:09,120 --> 00:03:13,080 Speaker 1: And he's been kind of like, I don't know, like, 46 00:03:13,120 --> 00:03:15,880 Speaker 1: I feel like over the years, Elon Musk really wants 47 00:03:16,040 --> 00:03:18,160 Speaker 1: us to think that he's funny. Like, he really wants 48 00:03:18,440 --> 00:03:21,160 Speaker 1: to be a comedian and like loved by comedians, I 49 00:03:21,200 --> 00:03:25,399 Speaker 1: feel like, but he just like trolls everyone. Yeah, it's 50 00:03:25,440 --> 00:03:28,040 Speaker 1: like he's trying too hard and it's like not working, 51 00:03:28,120 --> 00:03:31,600 Speaker 1: and people are just very annoyed by him. And that's 52 00:03:31,680 --> 00:03:36,760 Speaker 1: like just the public perception part of this, because the 53 00:03:36,880 --> 00:03:40,680 Speaker 1: real issue is, like, you're talking about like the labor 54 00:03:40,720 --> 00:03:44,880 Speaker 1: stuff with Twitter employees and then what that ends up 55 00:03:44,920 --> 00:03:51,560 Speaker 1: meaning for the platform and users, and like protections and, 56 00:03:52,000 --> 00:03:56,480 Speaker 1: like, filtering content, or like dealing with like abuse and 57 00:03:56,480 --> 00:04:02,480 Speaker 1: harassment and doxing. And with the verification stuff, like, Elon 58 00:04:02,600 --> 00:04:06,320 Speaker 1: Musk, like, in this new Twitter era, like, changing up 59 00:04:06,760 --> 00:04:10,200 Speaker 1: how people get verified. And then folks, already in this 60 00:04:10,280 --> 00:04:12,760 Speaker 1: very short period of time, have been dealing with, like, 61 00:04:13,200 --> 00:04:16,160 Speaker 1: people impersonating them, because it's easier to get verified now, 62 00:04:16,320 --> 00:04:19,040 Speaker 1: like, you just have to pay, and the sort of, 63 00:04:19,080 --> 00:04:22,560 Speaker 1: like, fraud detection elements that I guess used to be 64 00:04:22,640 --> 00:04:26,320 Speaker 1: in place are like really not there. So recently there's 65 00:04:26,320 --> 00:04:31,000 Speaker 1: another podcaster, Laci Mosley, who has a podcast called Scam Goddess. 66 00:04:31,560 --> 00:04:35,479 Speaker 1: Ironically, her Twitter account, and I want to say 67 00:04:35,480 --> 00:04:38,719 Speaker 1: her Instagram account, were hacked, and she had a really, 68 00:04:38,760 --> 00:04:44,560 Speaker 1: really, really hard time getting those hacked accounts back.
And 69 00:04:44,880 --> 00:04:48,479 Speaker 1: she's somebody with like a big platform, you know. So, yeah, 70 00:04:48,600 --> 00:04:51,839 Speaker 1: it's just the Wild West. Her account was hacked, 71 00:04:52,160 --> 00:04:55,160 Speaker 1: and then her followers were scammed 72 00:04:55,240 --> 00:04:59,000 Speaker 1: out of thousands of dollars collectively, because the systems in 73 00:04:59,040 --> 00:05:01,960 Speaker 1: place were no longer there. There's 74 00:05:01,960 --> 00:05:04,840 Speaker 1: also been an increase in the N word being used 75 00:05:04,960 --> 00:05:08,720 Speaker 1: across the platform, like a huge increase. And so 76 00:05:08,800 --> 00:05:12,440 Speaker 1: all these different protections that took years to implement at 77 00:05:12,480 --> 00:05:16,640 Speaker 1: Twitter have now been dismantled; they no longer exist. So if 78 00:05:16,680 --> 00:05:22,080 Speaker 1: you'll remember, in 2021, Twitter suspended Trump's account after the insurrection, 79 00:05:22,560 --> 00:05:27,800 Speaker 1: after so many years of him tweeting nonsense, spreading misinformation 80 00:05:27,800 --> 00:05:30,960 Speaker 1: and disinformation. They finally decided enough was enough, a little 81 00:05:31,000 --> 00:05:33,800 Speaker 1: too late in my opinion. But his account was suspended 82 00:05:34,120 --> 00:05:38,560 Speaker 1: after he called the January 6th rioters patriots, which led to, 83 00:05:39,000 --> 00:05:43,640 Speaker 1: of course, this belief that the election was allegedly stolen, 84 00:05:43,960 --> 00:05:47,440 Speaker 1: and so they finally suspended him. Within a few weeks 85 00:05:47,480 --> 00:05:52,560 Speaker 1: of Elon Musk owning Twitter, Trump's account has been reinstated. 86 00:05:52,600 --> 00:05:54,760 Speaker 1: He's not back on Twitter yet, but he has the 87 00:05:54,800 --> 00:05:58,920 Speaker 1: option now. And so it's just, it's a mess. It's been 88 00:05:58,920 --> 00:06:01,559 Speaker 1: a mess in some ways, but it's also been really 89 00:06:01,600 --> 00:06:05,359 Speaker 1: wonderful, Twitter that is, in other ways. Oh yeah, like 90 00:06:05,480 --> 00:06:09,480 Speaker 1: in general, like across the board. So just us as 91 00:06:09,520 --> 00:06:16,200 Speaker 1: podcasters, and as formerly indie podcasters, we like met on Twitter, 92 00:06:17,440 --> 00:06:21,839 Speaker 1: we like built networks on Twitter, and grew our show 93 00:06:22,400 --> 00:06:26,919 Speaker 1: on Twitter and Instagram, and now on TikTok. But 94 00:06:26,960 --> 00:06:31,640 Speaker 1: it's mostly been Twitter and Instagram that are really where these 95 00:06:31,680 --> 00:06:35,400 Speaker 1: things truly, um, were able to take off. We got 96 00:06:35,400 --> 00:06:38,680 Speaker 1: a lot of our news, maybe not always directly from Twitter, 97 00:06:38,720 --> 00:06:41,960 Speaker 1: but Twitter will alert us that something is happening, so 98 00:06:42,120 --> 00:06:46,240 Speaker 1: go investigate further. People are talking about said issue, let's go. 99 00:06:46,800 --> 00:06:49,120 Speaker 1: And so it very much has informed the way we've 100 00:06:49,160 --> 00:06:53,320 Speaker 1: conducted or produced episodes over the years. Not every episode.
101 00:06:53,400 --> 00:06:57,239 Speaker 1: But there are some conversations that happened on Twitter that, 102 00:06:57,839 --> 00:07:01,720 Speaker 1: you know, become like nationwide, national conversations, right. A 103 00:07:01,720 --> 00:07:04,960 Speaker 1: lot, like, Black Twitter has led a lot of these conversations, 104 00:07:05,000 --> 00:07:10,560 Speaker 1: whether that be about violence, police violence, and police brutality. 105 00:07:10,600 --> 00:07:16,920 Speaker 1: Like, the Ferguson uprising definitely caught wind via Twitter, and 106 00:07:16,960 --> 00:07:20,200 Speaker 1: that's when folks outside of Ferguson were able to learn 107 00:07:20,240 --> 00:07:23,560 Speaker 1: about what was happening. And so it's been used for, 108 00:07:23,720 --> 00:07:28,200 Speaker 1: like, a lot of organizing, a lot of journalism, a 109 00:07:28,240 --> 00:07:32,840 Speaker 1: lot of resource sharing over the years. Yeah, and I 110 00:07:32,880 --> 00:07:37,400 Speaker 1: think that a lot of small business owners have been 111 00:07:37,440 --> 00:07:41,320 Speaker 1: able to like utilize the platform to get their work 112 00:07:41,360 --> 00:07:44,000 Speaker 1: out there, to get their art out there, to get 113 00:07:44,000 --> 00:07:47,440 Speaker 1: their businesses off the ground. A lot of people use 114 00:07:47,520 --> 00:07:52,960 Speaker 1: Twitter to crowdfund, especially like for like emergency situations in 115 00:07:53,040 --> 00:07:57,840 Speaker 1: the COVID age, for healthcare costs, for funeral costs, all 116 00:07:57,960 --> 00:08:02,080 Speaker 1: kinds of rent, rental assistance. So it's the type of 117 00:08:02,080 --> 00:08:06,560 Speaker 1: place where a lot of good can happen, but obviously 118 00:08:06,720 --> 00:08:09,720 Speaker 1: there's also all sorts of issues. Um, there's a journalist 119 00:08:09,760 --> 00:08:13,920 Speaker 1: that we follow, Taylor Lorenz, who I think typifies in 120 00:08:14,000 --> 00:08:16,800 Speaker 1: many ways, like, all of the issues with Twitter sort 121 00:08:16,800 --> 00:08:21,240 Speaker 1: of in one person, kind of, the poor thing. She 122 00:08:21,600 --> 00:08:27,720 Speaker 1: is like a tech reporter. She's like an internet culture journalist, 123 00:08:27,800 --> 00:08:31,080 Speaker 1: and she has written everywhere, like at the New 124 00:08:31,160 --> 00:08:34,160 Speaker 1: York Times. Well, she currently works 125 00:08:34,160 --> 00:08:36,839 Speaker 1: for the Washington Post. Yeah, previously she was at the 126 00:08:36,880 --> 00:08:40,280 Speaker 1: New York Times. And she is very prolific on Twitter, 127 00:08:40,880 --> 00:08:46,079 Speaker 1: and she also has been the target of much horrific 128 00:08:46,160 --> 00:08:50,240 Speaker 1: harassment and doxing. Tucker Carlson is like obsessed with her 129 00:08:50,400 --> 00:08:52,959 Speaker 1: and can't stop talking about her and putting her on 130 00:08:53,120 --> 00:08:56,960 Speaker 1: blast on his show, which, as a result, means that 131 00:08:57,200 --> 00:09:00,120 Speaker 1: she is at the receiving end of a lot of 132 00:09:01,120 --> 00:09:04,520 Speaker 1: violence and stalking. And she does a lot of reporting 133 00:09:04,520 --> 00:09:08,360 Speaker 1: on, well, what's happening on TikTok, what's happening on Twitter. 134 00:09:09,520 --> 00:09:14,320 Speaker 1: And she also has like gained a certain amount of notoriety, 135 00:09:14,559 --> 00:09:16,880 Speaker 1: you know, also because of all of these things.
And 136 00:09:16,960 --> 00:09:20,160 Speaker 1: she shares very, very openly when there's like targeted harassment 137 00:09:20,240 --> 00:09:23,160 Speaker 1: or when she's experiencing something like this. So she's been 138 00:09:23,240 --> 00:09:26,880 Speaker 1: kind of just like an interesting person just to 139 00:09:27,080 --> 00:09:30,720 Speaker 1: follow and see like how severe this stuff can get. 140 00:09:31,280 --> 00:09:33,160 Speaker 1: And I think the other thing is, like, yeah, a 141 00:09:33,160 --> 00:09:38,160 Speaker 1: lot of small creators, writers, journalists are able to amplify 142 00:09:38,240 --> 00:09:42,040 Speaker 1: their work. Like, we had Cerise Castle on the show 143 00:09:42,280 --> 00:09:47,120 Speaker 1: to talk about LA Sheriff's Department gangs and her 144 00:09:47,160 --> 00:09:49,800 Speaker 1: new podcast called A Tradition of Violence, about 145 00:09:49,920 --> 00:09:53,400 Speaker 1: LASD gangs. And so she is an independent journalist, 146 00:09:53,400 --> 00:09:56,719 Speaker 1: a freelance journalist, writing for a relatively small publication, Knock 147 00:09:57,040 --> 00:10:01,400 Speaker 1: LA. And that work spread like wildfire in large part, 148 00:10:01,480 --> 00:10:04,640 Speaker 1: I would say, because of Twitter and the way that 149 00:10:04,679 --> 00:10:07,880 Speaker 1: it allowed the work to reach a very global audience. 150 00:10:07,920 --> 00:10:11,600 Speaker 1: It got like hyper viral. That's a good episode. Tune 151 00:10:11,600 --> 00:10:16,760 Speaker 1: in if you haven't. But there is a place, well, 152 00:10:16,800 --> 00:10:18,520 Speaker 1: there's a couple of things I 153 00:10:18,559 --> 00:10:20,400 Speaker 1: want to say based on what you said, and, like, 154 00:10:21,240 --> 00:10:27,400 Speaker 1: you know, certain issues or movements getting like international coverage, 155 00:10:27,480 --> 00:10:29,480 Speaker 1: right, like we've seen over the years, like the Arab 156 00:10:29,559 --> 00:10:32,240 Speaker 1: Spring in two thousand eleven. There's been a lot of 157 00:10:32,240 --> 00:10:36,120 Speaker 1: resource sharing in the disabled community, um, citizen journalists just 158 00:10:36,160 --> 00:10:39,920 Speaker 1: documenting what's happening, either, you know, where they live abroad 159 00:10:40,240 --> 00:10:42,680 Speaker 1: or here in the US. Going back to what you 160 00:10:42,679 --> 00:10:46,840 Speaker 1: said about Taylor Lorenz, is that, you know, Elon Musk 161 00:10:48,040 --> 00:10:50,360 Speaker 1: pretends to be a proponent of like free speech, but 162 00:10:50,440 --> 00:10:53,439 Speaker 1: a point that she made is that actually he doesn't 163 00:10:53,440 --> 00:10:58,000 Speaker 1: care about free speech, because he's creating all these rules 164 00:10:58,240 --> 00:11:02,640 Speaker 1: on Twitter, specifically that if you tweet in a way 165 00:11:02,679 --> 00:11:05,560 Speaker 1: that he doesn't like, your account is going to be banned, 166 00:11:05,679 --> 00:11:07,920 Speaker 1: or if you tweet in a negative way, it's going 167 00:11:08,000 --> 00:11:11,240 Speaker 1: to be banned, your account will be suspended. But he's 168 00:11:11,280 --> 00:11:15,120 Speaker 1: not explaining what, like, quote, bad means, or there's 169 00:11:15,120 --> 00:11:17,600 Speaker 1: no rules for it. It's just up for interpretation.
And 170 00:11:17,640 --> 00:11:21,000 Speaker 1: so that alone tells you that, or shows you that, 171 00:11:21,040 --> 00:11:23,480 Speaker 1: he actually doesn't care about free speech. It's what he 172 00:11:23,559 --> 00:11:29,240 Speaker 1: wants to hear and what he deems okay to share online. Yeah, 173 00:11:29,280 --> 00:11:34,360 Speaker 1: because Elon is also definitely one of these, um, like, 174 00:11:34,440 --> 00:11:37,319 Speaker 1: cancel culture is the worst thing to ever happen, guys. 175 00:11:37,960 --> 00:11:42,160 Speaker 1: You know, he's like one of those dudes. And so 176 00:11:42,559 --> 00:11:51,079 Speaker 1: eliminating any acknowledgment that hate speech exists is somehow, to him, 177 00:11:51,240 --> 00:11:55,880 Speaker 1: supposedly a way to advocate for free speech. But 178 00:11:56,040 --> 00:12:01,439 Speaker 1: then, if we don't have the framework of hate speech 179 00:12:01,480 --> 00:12:07,000 Speaker 1: to help us identify and understand hateful speech, he's creating 180 00:12:07,040 --> 00:12:11,800 Speaker 1: new definitions instead, right. And so it goes from being 181 00:12:12,679 --> 00:12:17,000 Speaker 1: this language that is codified by law, defined 182 00:12:17,040 --> 00:12:22,080 Speaker 1: on a federal level, like protected groups and what language 183 00:12:22,480 --> 00:12:27,120 Speaker 1: is considered like legally hateful, right, and like incites violence. 184 00:12:27,679 --> 00:12:31,560 Speaker 1: Instead of like working with that, that's been established over 185 00:12:31,640 --> 00:12:36,280 Speaker 1: decades of like legislation and like civil rights work, now 186 00:12:36,280 --> 00:12:39,560 Speaker 1: it's just this one rich guy, like the son of, 187 00:12:39,600 --> 00:12:44,120 Speaker 1: like, diamond oligarchs basically, who's going to tell us 188 00:12:44,160 --> 00:12:51,719 Speaker 1: instead, this is acceptable communication. And it's not going well, right? 189 00:12:51,880 --> 00:12:53,880 Speaker 1: I mean, and I think, like, you know, when people 190 00:12:53,920 --> 00:12:57,880 Speaker 1: talk about free speech, like when the Constitution was written, 191 00:12:58,360 --> 00:13:01,480 Speaker 1: free speech meant like a public square, like literally a 192 00:13:01,559 --> 00:13:06,680 Speaker 1: public square. Now, like, online there's like this idea of 193 00:13:06,720 --> 00:13:10,800 Speaker 1: the digital public square, and so even then, like, there 194 00:13:10,840 --> 00:13:13,840 Speaker 1: are still rules like in place, because Twitter is, like, 195 00:13:14,160 --> 00:13:16,440 Speaker 1: as we know, like, a privately owned company. It's not 196 00:13:16,480 --> 00:13:20,760 Speaker 1: a federal company, so the government doesn't have a stake in it. 197 00:13:21,040 --> 00:13:23,840 Speaker 1: But it gets confusing because obviously, like, there's so much 198 00:13:24,640 --> 00:13:27,920 Speaker 1: destabilizing of democracy that also happens online, and so it 199 00:13:27,920 --> 00:13:30,520 Speaker 1: all overlaps, and it's all connected. 200 00:13:31,040 --> 00:13:34,360 Speaker 1: And so it's really not about free speech, as you 201 00:13:34,400 --> 00:13:37,960 Speaker 1: were saying, because the framework is not there. It's more 202 00:13:38,000 --> 00:13:41,920 Speaker 1: so about this one little man like trying to control everything.
203 00:13:42,440 --> 00:13:44,800 Speaker 1: And it's also like the elite stick together. Like, it's 204 00:13:44,800 --> 00:13:47,760 Speaker 1: not just Elon Musk. There's like this group of billionaires 205 00:13:47,800 --> 00:13:52,240 Speaker 1: that are all benefiting from, you know, potentially, Twitter's downfall, demise, 206 00:13:52,480 --> 00:13:57,600 Speaker 1: because where's the community building and organizing, the digital organizing, 207 00:13:57,600 --> 00:14:01,560 Speaker 1: going to happen now? Totally. So Mariah Castañeda, who is 208 00:14:01,600 --> 00:14:07,520 Speaker 1: also a podcaster and a journalist and tweets, she was 209 00:14:08,040 --> 00:14:13,880 Speaker 1: tweeting out, uh, some thoughts about the way that journalists 210 00:14:14,040 --> 00:14:16,840 Speaker 1: do use Twitter and the way that so many folks 211 00:14:18,000 --> 00:14:24,120 Speaker 1: follow local journalists and get news or updates or links 212 00:14:24,160 --> 00:14:28,200 Speaker 1: to articles on their feed. I know that's true for us, 213 00:14:28,400 --> 00:14:32,520 Speaker 1: we follow tons of journalists, and I agree, you know, like, 214 00:14:32,600 --> 00:14:37,080 Speaker 1: what's going to happen with the dissemination of news when 215 00:14:37,200 --> 00:14:41,920 Speaker 1: this apparatus is either just not functioning or it's just gone? 216 00:14:42,600 --> 00:14:46,240 Speaker 1: Life finds a way; the information will go somewhere else. 217 00:14:46,760 --> 00:14:50,040 Speaker 1: But right now, this is like a platform that has 218 00:14:50,080 --> 00:14:53,960 Speaker 1: been so huge for such a long time that to, 219 00:14:54,120 --> 00:15:00,000 Speaker 1: like, totally change over, or build something else that's the same, um, 220 00:15:00,080 --> 00:15:03,240 Speaker 1: I don't know how or when that would happen, right? 221 00:15:03,360 --> 00:15:06,200 Speaker 1: I mean, and there's, like, sometimes there are some days 222 00:15:06,200 --> 00:15:10,000 Speaker 1: where, me as like a social media user, it's 223 00:15:10,040 --> 00:15:14,640 Speaker 1: like, I don't necessarily want to see reels and photos, 224 00:15:15,240 --> 00:15:18,320 Speaker 1: but I do want to like laugh, or I do 225 00:15:18,480 --> 00:15:23,960 Speaker 1: want to read something hilarious or informative, or like a 226 00:15:24,000 --> 00:15:26,480 Speaker 1: great take on something. And so a lot of the 227 00:15:26,520 --> 00:15:29,680 Speaker 1: times I go to Twitter first. Like, if I see 228 00:15:29,720 --> 00:15:31,800 Speaker 1: a movie that I like, I want to hear like 229 00:15:31,840 --> 00:15:34,000 Speaker 1: what are other people thinking and saying about it. It 230 00:15:34,040 --> 00:15:37,120 Speaker 1: helps me like process things that I'm like really interested 231 00:15:37,160 --> 00:15:39,200 Speaker 1: in, or maybe I'm confused about, and I 232 00:15:39,200 --> 00:15:41,720 Speaker 1: want to read how other people are 233 00:15:41,760 --> 00:15:46,360 Speaker 1: processing aloud. And so Twitter has been really great for 234 00:15:46,600 --> 00:15:50,560 Speaker 1: that type of like culture development, or spread; it 235 00:15:51,240 --> 00:15:53,320 Speaker 1: has led to the success of a lot of either 236 00:15:53,480 --> 00:16:00,760 Speaker 1: indie or like non-mainstream media, culture, art.
Oh, and 237 00:16:00,840 --> 00:16:03,040 Speaker 1: like, over the years, people will tell us like, oh, 238 00:16:03,080 --> 00:16:06,080 Speaker 1: like, I found you on Twitter, or, I saw this 239 00:16:06,120 --> 00:16:11,800 Speaker 1: tweet, somebody tweeted about you. Like, that's not uncommon here, 240 00:16:12,600 --> 00:16:15,320 Speaker 1: you know. So it's a really effective platform for a 241 00:16:15,360 --> 00:16:18,400 Speaker 1: lot of things. At the same time, as a Twitter user, 242 00:16:18,520 --> 00:16:21,840 Speaker 1: like, as you said, talk about it, part of me 243 00:16:21,880 --> 00:16:25,400 Speaker 1: also wants to see the app die. Yeah, I'm like, 244 00:16:25,440 --> 00:16:29,600 Speaker 1: oh my god, am I actually going to delete this? Is 245 00:16:29,720 --> 00:16:32,800 Speaker 1: this app going to be no longer, you know? Yeah, 246 00:16:33,160 --> 00:16:35,040 Speaker 1: part of me is like, let's get it over with 247 00:16:35,160 --> 00:16:39,400 Speaker 1: so I can move on with my life. Jesus H. Christ, 248 00:16:39,560 --> 00:16:43,280 Speaker 1: it's been years. I wake up, I check Twitter. I 249 00:16:43,320 --> 00:16:46,240 Speaker 1: go to bed, I check Twitter first. You know, I 250 00:16:46,240 --> 00:16:49,120 Speaker 1: have a bit of insomnia. Where am I going? To Twitter. 251 00:16:50,440 --> 00:16:52,400 Speaker 1: And I mean, we've talked about some of the 252 00:16:52,440 --> 00:16:55,680 Speaker 1: negatives and some of the like incredible things that have 253 00:16:55,720 --> 00:17:00,920 Speaker 1: either happened community-wide, internationally, and also for ourselves. But 254 00:17:01,000 --> 00:17:03,920 Speaker 1: there have been some like insufferable things that have happened 255 00:17:03,920 --> 00:17:09,720 Speaker 1: on Twitter, you know. Like, the assumption that you 256 00:17:09,800 --> 00:17:13,720 Speaker 1: have to like explain yourself and include everything in a 257 00:17:13,800 --> 00:17:17,920 Speaker 1: hundred and forty characters is exhausting. I remember one time 258 00:17:18,000 --> 00:17:22,240 Speaker 1: I tweeted about, during the pandemic, you know, like, living 259 00:17:22,280 --> 00:17:24,719 Speaker 1: at home with my family, like, has allowed me to 260 00:17:24,760 --> 00:17:28,720 Speaker 1: practice community care, because if I didn't live with my family, like, 261 00:17:28,880 --> 00:17:31,000 Speaker 1: who was going to be taking care of them? 262 00:17:31,080 --> 00:17:33,600 Speaker 1: Who was going to be running all the errands to 263 00:17:33,680 --> 00:17:36,520 Speaker 1: make sure that we had a functional household during the pandemic, 264 00:17:36,680 --> 00:17:41,640 Speaker 1: as like the person that was, like, the non-senior, 265 00:17:41,840 --> 00:17:44,080 Speaker 1: right? Like, I needed to be the one out doing 266 00:17:44,119 --> 00:17:48,560 Speaker 1: those things. And I remember tweeting that and someone responding like, well, 267 00:17:48,560 --> 00:17:51,640 Speaker 1: what if you live in a toxic household?
And I'm like, okay, 268 00:17:51,760 --> 00:17:54,840 Speaker 1: well, I am not talking about you, and I'm not 269 00:17:54,880 --> 00:17:58,320 Speaker 1: talking about people that live in toxic households. Like, of course, 270 00:17:58,359 --> 00:18:00,960 Speaker 1: if you live in a toxic house, like, you shouldn't 271 00:18:00,960 --> 00:18:03,320 Speaker 1: be living there. Like, if you're able to move, move. 272 00:18:03,600 --> 00:18:07,760 Speaker 1: Like, nowhere in my tweet did I say, you know, 273 00:18:07,800 --> 00:18:10,960 Speaker 1: in spite of living in a toxic household, at least 274 00:18:10,960 --> 00:18:12,760 Speaker 1: I can take care of my family, you know. Like, 275 00:18:12,800 --> 00:18:16,240 Speaker 1: the assumption, the thread that was drawn out of thin 276 00:18:16,359 --> 00:18:20,080 Speaker 1: air, was so frustrating. And that's just one example, right? 277 00:18:20,400 --> 00:18:23,760 Speaker 1: Like, I have a million more, but just this 278 00:18:23,880 --> 00:18:28,879 Speaker 1: idea that you have to say everything. Yeah, when it's like, honey, 279 00:18:28,960 --> 00:18:31,280 Speaker 1: when I'm tweeting, and this is like 280 00:18:31,320 --> 00:18:34,760 Speaker 1: a note to self, nine times out of ten, I'm 281 00:18:34,800 --> 00:18:37,720 Speaker 1: thinking out loud. I'm just tweeting into the void, babe. 282 00:18:40,400 --> 00:18:46,200 Speaker 1: This is lies and deceit, this is hyperbole, this is comedy, 283 00:18:46,359 --> 00:18:48,880 Speaker 1: this is a tweet, baby. You know, like, 284 00:18:48,960 --> 00:18:52,760 Speaker 1: this is not an academic journal, this is 285 00:18:52,760 --> 00:18:57,159 Speaker 1: not a policy brief. Yeah, it's just not, right? I 286 00:18:57,640 --> 00:19:04,240 Speaker 1: remember, I remember when Noname started her little library, 287 00:19:04,640 --> 00:19:08,359 Speaker 1: her bookstore, and there were a lot of 288 00:19:08,359 --> 00:19:11,800 Speaker 1: people like saying it was like elitist or, 289 00:19:12,000 --> 00:19:16,479 Speaker 1: you know, all these different things, because she 290 00:19:16,560 --> 00:19:20,520 Speaker 1: was trying to bring literature to her community. And I 291 00:19:20,600 --> 00:19:24,480 Speaker 1: remember tweeting in support of it, and at the same time 292 00:19:24,560 --> 00:19:28,680 Speaker 1: you were having your own feud on Twitter, because people 293 00:19:28,720 --> 00:19:32,120 Speaker 1: were mad at you for talking about Jenny 69. 294 00:19:32,600 --> 00:19:35,080 Speaker 1: It was so funny, dude. And then they wanted me 295 00:19:35,160 --> 00:19:39,520 Speaker 1: to, like, they wanted me to take down Jenny, right? 296 00:19:39,600 --> 00:19:42,240 Speaker 1: And before we get back to the Jenny 69 thing, 297 00:19:42,240 --> 00:19:46,399 Speaker 1: like, because I tweeted in support of Noname, but 298 00:19:46,520 --> 00:19:48,840 Speaker 1: what I did was, I quote tweeted something and 299 00:19:48,880 --> 00:19:53,840 Speaker 1: I said, like, instead of like harping on 300 00:19:53,880 --> 00:19:57,120 Speaker 1: what Black women, women of color, are or aren't 301 00:19:57,200 --> 00:19:59,680 Speaker 1: doing, or not doing, like, take a look at yourself 302 00:20:00,000 --> 00:20:03,560 Speaker 1: before you make demands.
And someone thought that I was 303 00:20:03,640 --> 00:20:08,959 Speaker 1: talking about you and what was happening 304 00:20:09,040 --> 00:20:12,399 Speaker 1: on your Twitter timeline, and they were making these like really 305 00:20:12,400 --> 00:20:14,800 Speaker 1: intense connections, and I'm like, oh my god, I'm at 306 00:20:14,840 --> 00:20:16,800 Speaker 1: my day job, like, I don't know what's happening, I don't 307 00:20:16,920 --> 00:20:20,160 Speaker 1: have time for this. Yeah, and I was talking about 308 00:20:20,240 --> 00:20:23,400 Speaker 1: Jenny 69 on my Instagram story, and you were 309 00:20:23,440 --> 00:20:26,199 Speaker 1: on Twitter talking about this other thing, not even on 310 00:20:26,240 --> 00:20:30,200 Speaker 1: the same platform. So, like, okay, remember when La 69 311 00:20:30,280 --> 00:20:35,399 Speaker 1: came out, Jenny 69? Like, honestly, it was 312 00:20:35,480 --> 00:20:39,960 Speaker 1: like a cultural reset in some ways. Sorry. Yeah, I 313 00:20:39,960 --> 00:20:42,800 Speaker 1: mean, we've talked about Jenny 69 before; 314 00:20:43,000 --> 00:20:45,560 Speaker 1: we know she's problematic. Okay, go back to 315 00:20:45,680 --> 00:20:48,200 Speaker 1: the Fake Dinas episode. We don't call her Fake Dina, 316 00:20:48,400 --> 00:20:50,920 Speaker 1: but we do talk, we did talk about her. Yeah, 317 00:20:50,920 --> 00:20:52,399 Speaker 1: and this is part of it, right? Because it's like 318 00:20:53,359 --> 00:20:58,080 Speaker 1: super viral. Uh, she's on stages now, she's performing, 319 00:20:58,119 --> 00:21:03,080 Speaker 1: she's selling tickets, she has millions of followers, like literally millions, 320 00:21:04,040 --> 00:21:06,840 Speaker 1: and she was charting. She was charting at the time. 321 00:21:06,920 --> 00:21:10,400 Speaker 1: Even if you hated it, she was still charting. People 322 00:21:10,440 --> 00:21:14,760 Speaker 1: were probably hate listening, but, and everybody doing reels, using 323 00:21:14,800 --> 00:21:18,840 Speaker 1: the noise, using the sound, using the song on Instagram, 324 00:21:18,920 --> 00:21:22,360 Speaker 1: on TikTok, on Twitter. The LA Times wrote about her, 325 00:21:22,760 --> 00:21:25,880 Speaker 1: she got a lot of write-ups. So I, on 326 00:21:25,920 --> 00:21:30,439 Speaker 1: my story, in the midst of the virality across platforms, 327 00:21:30,480 --> 00:21:36,320 Speaker 1: across journalism, across everyday users tweeting and posting and creating 328 00:21:36,320 --> 00:21:40,440 Speaker 1: TikToks, little ol' me, I go to my story and I say, like, 329 00:21:40,600 --> 00:21:44,040 Speaker 1: you know, say what you will, do what you will. Um, 330 00:21:44,119 --> 00:21:49,240 Speaker 1: but if you, like, in your gut hate her and 331 00:21:49,320 --> 00:21:52,400 Speaker 1: think that she's horrible and like don't want to see 332 00:21:52,400 --> 00:21:56,560 Speaker 1: her thrive, then don't contribute to the viral moment, and 333 00:21:56,640 --> 00:21:59,359 Speaker 1: don't even hate tweet or hate post. You're gonna hate 334 00:21:59,359 --> 00:22:03,040 Speaker 1: post her right into celebrity, which is exactly what happened. 335 00:22:03,280 --> 00:22:06,760 Speaker 1: I mean, hello, that's what happens. That's just literally how 336 00:22:06,800 --> 00:22:09,920 Speaker 1: it works.
I had so many people in my 337 00:22:10,200 --> 00:22:15,600 Speaker 1: DMs and shit, like, sheesh, paragraphs, dude, about 338 00:22:15,640 --> 00:22:18,359 Speaker 1: every bad thing she has ever done. My thing is, 339 00:22:18,400 --> 00:22:22,080 Speaker 1: like, all those folks DMing me, my thing with 340 00:22:22,119 --> 00:22:24,879 Speaker 1: them is like, you guys clearly know a lot about 341 00:22:24,960 --> 00:22:30,040 Speaker 1: her and are very passionate and very knowledgeable. I just 342 00:22:30,119 --> 00:22:33,000 Speaker 1: simply don't know as much about Jenny 69 as 343 00:22:33,040 --> 00:22:35,040 Speaker 1: you guys seem to know. You know, do you have 344 00:22:35,080 --> 00:22:38,320 Speaker 1: a petition that you would like me to circulate? Are 345 00:22:38,520 --> 00:22:41,879 Speaker 1: you boycotting her? Is there a link I can share? 346 00:22:42,040 --> 00:22:45,840 Speaker 1: Can I amplify your movement in some way? And the 347 00:22:45,880 --> 00:22:51,000 Speaker 1: way people were like, uh, well, you know, that means 348 00:22:51,119 --> 00:22:55,200 Speaker 1: that you don't give a fuck and you like don't 349 00:22:55,200 --> 00:22:58,120 Speaker 1: believe in anything that you say you believe in; like, use 350 00:22:58,840 --> 00:23:01,520 Speaker 1: your platform to hold her accountable. I'm like, so you 351 00:23:01,560 --> 00:23:04,240 Speaker 1: want me to dedicate an episode to Jenny 69, 352 00:23:04,680 --> 00:23:08,720 Speaker 1: take her down? You want me, with my followers on Instagram, 353 00:23:09,520 --> 00:23:12,760 Speaker 1: uh, so, like, what, to do an exposé on Jenny 354 00:23:12,880 --> 00:23:17,159 Speaker 1: 69? What do you want to do? Because I think that 355 00:23:17,280 --> 00:23:19,600 Speaker 1: there are, like, so many, from what I've seen, like, 356 00:23:19,640 --> 00:23:23,359 Speaker 1: there are so many pages now, like, on Instagram or 357 00:23:23,400 --> 00:23:27,520 Speaker 1: on Twitter, that are dedicated to this type of exposé. 358 00:23:28,560 --> 00:23:30,800 Speaker 1: But that's not what we do on this podcast, and 359 00:23:30,840 --> 00:23:33,040 Speaker 1: that's not what we do on our personal Instagram. So 360 00:23:33,040 --> 00:23:37,080 Speaker 1: we're more than happy to do a reshare, do 361 00:23:37,160 --> 00:23:40,199 Speaker 1: a repost, that kind of thing. But, like, to 362 00:23:40,359 --> 00:23:43,479 Speaker 1: take someone down in that way, quote unquote, is like 363 00:23:43,640 --> 00:23:46,280 Speaker 1: not what we've ever done or not what we've been about. 364 00:23:47,040 --> 00:23:50,560 Speaker 1: And so, like, what does a 365 00:23:50,640 --> 00:23:54,400 Speaker 1: takedown even mean with someone of that caliber at 366 00:23:54,440 --> 00:23:56,560 Speaker 1: that point in the game? You want me to 367 00:23:56,560 --> 00:24:01,040 Speaker 1: send a strongly worded DM? Yes. You want me 368 00:24:01,080 --> 00:24:04,280 Speaker 1: to, like, post about her to my main feed? You 369 00:24:04,280 --> 00:24:05,960 Speaker 1: want me to make an episode? You want me to create 370 00:24:06,000 --> 00:24:08,200 Speaker 1: content about her is what you're asking me to do. Yes, 371 00:24:08,240 --> 00:24:11,920 Speaker 1: you're asking me to create content about this horrific, problematic person. 372 00:24:12,000 --> 00:24:15,480 Speaker 1: You're asking me to further platform her. That is the ask.
Yeah, 373 00:24:15,720 --> 00:24:18,199 Speaker 1: congratulations, because here we are talking about the bitch. So 374 00:24:18,840 --> 00:24:24,000 Speaker 1: there you go. Activism, activism accomplished, full circle. But yeah, 375 00:24:24,040 --> 00:24:26,520 Speaker 1: I mean, I think that, like, that goes back 376 00:24:26,600 --> 00:24:32,160 Speaker 1: to, like, the nature of Twitter, where there's no room 377 00:24:32,359 --> 00:24:36,240 Speaker 1: for like actual intellectual conversations. It's either you agree with 378 00:24:36,280 --> 00:24:40,199 Speaker 1: me or you don't. It's either you say everything in 379 00:24:40,200 --> 00:24:45,320 Speaker 1: these hundred and forty characters or you're not representing me, which probably 380 00:24:45,359 --> 00:24:47,280 Speaker 1: I'm not, because what I'm tweeting 381 00:24:47,280 --> 00:24:49,480 Speaker 1: about is my own experience, and I'm speaking, and, 382 00:24:49,960 --> 00:24:54,119 Speaker 1: you know, also, I'm not a representative. Nobody elected me 383 00:24:54,280 --> 00:25:02,240 Speaker 1: to any office. You elected me... kidding, co-founder. But no, no, go, 384 00:25:02,359 --> 00:25:05,160 Speaker 1: I'm sorry, go back. Yes, like, we just got 385 00:25:05,160 --> 00:25:08,200 Speaker 1: through our election cycle. We had an opportunity to vote 386 00:25:08,200 --> 00:25:12,359 Speaker 1: for elected officials, you know. Write your strongly worded emails 387 00:25:12,640 --> 00:25:16,879 Speaker 1: to them, you know what I mean. I'm a pothead podcaster. 388 00:25:17,320 --> 00:25:19,119 Speaker 1: What do you want me to do? You know, we 389 00:25:19,160 --> 00:25:21,439 Speaker 1: can talk about it, we can post about it, we 390 00:25:21,480 --> 00:25:25,199 Speaker 1: can discuss. Again, is there a GoFundMe we 391 00:25:25,280 --> 00:25:30,080 Speaker 1: can donate to? Is there a march we can attend? You know? 392 00:25:30,800 --> 00:25:35,399 Speaker 1: But otherwise, like, this is a chat cast. Okay, 393 00:25:35,400 --> 00:25:38,240 Speaker 1: don't say it like that, because it is a chat cast, 394 00:25:38,280 --> 00:25:40,879 Speaker 1: though. It is, it is a chat cast. That's also, 395 00:25:41,000 --> 00:25:43,720 Speaker 1: like, you know, I think, like, a resource share, and 396 00:25:43,720 --> 00:25:47,800 Speaker 1: there's so much like downplaying of like chat casts, right? 397 00:25:47,880 --> 00:25:52,720 Speaker 1: But it's like, no, like, there's impact. So absolutely. Yeah, 398 00:25:52,840 --> 00:25:57,719 Speaker 1: I think, like, going back, though, to Twitter specifically, you know, 399 00:25:58,280 --> 00:26:03,239 Speaker 1: there has just been, it's like been kind of, like, 400 00:26:04,240 --> 00:26:08,120 Speaker 1: done in some ways, right? But now it's 401 00:26:08,119 --> 00:26:09,960 Speaker 1: like we're seeing, and I think a lot of social 402 00:26:09,960 --> 00:26:14,399 Speaker 1: media platforms, Instagram too, Facebook definitely, um, it's kind of 403 00:26:14,400 --> 00:26:18,200 Speaker 1: like we've seen, they've seen their heyday, and now it's like, well, 404 00:26:18,240 --> 00:26:21,399 Speaker 1: what's next, what's the new thing? Yeah, I'm not ready 405 00:26:21,480 --> 00:26:24,560 Speaker 1: to, like, start all over again on a different platform, 406 00:26:24,920 --> 00:26:27,000 Speaker 1: although, I mean, I think it's just going to mean 407 00:26:27,160 --> 00:26:30,840 Speaker 1: that we are actually making TikToks.
Like, I think if 408 00:26:30,880 --> 00:26:34,480 Speaker 1: it's not Twitter, and like Instagram is like not doing 409 00:26:34,560 --> 00:26:38,119 Speaker 1: that much better than Twitter, it's really not, 410 00:26:38,560 --> 00:26:41,000 Speaker 1: if it's not Twitter and if it's not Instagram, do 411 00:26:41,080 --> 00:26:42,919 Speaker 1: we go to a whole new place, or do we 412 00:26:43,000 --> 00:26:46,560 Speaker 1: just have to seek refuge on TikTok, and 413 00:26:47,160 --> 00:26:51,600 Speaker 1: we got to get on camera and make videos? The youth, 414 00:26:51,640 --> 00:26:53,479 Speaker 1: the Gen Zers, have been telling us that we need 415 00:26:53,520 --> 00:26:57,440 Speaker 1: to get on TikTok. But every 416 00:26:57,440 --> 00:26:59,479 Speaker 1: single Gen Zer that we talk to is like, are 417 00:26:59,480 --> 00:27:02,720 Speaker 1: you on TikTok? I mean, yes, we are on TikTok. 418 00:27:02,880 --> 00:27:06,480 Speaker 1: We are, technically, locatora underscore radio. We're not active, but, 419 00:27:06,680 --> 00:27:09,720 Speaker 1: you know, we're there first, semi-active. I post 420 00:27:09,760 --> 00:27:12,240 Speaker 1: every once in a while, but it's not like the 421 00:27:12,280 --> 00:27:17,120 Speaker 1: way that TikTokers are posting, three times a day, 422 00:27:17,320 --> 00:27:21,720 Speaker 1: every day. Yeah, good for them. Good. We're very proud 423 00:27:21,800 --> 00:27:24,160 Speaker 1: of you. We are. We are actually very proud and impressed, 424 00:27:24,359 --> 00:27:29,760 Speaker 1: you know? Here's my deal, um, don't mistake my 425 00:27:29,840 --> 00:27:33,320 Speaker 1: cynicism for lack of caring, you know what I mean. 426 00:27:33,640 --> 00:27:36,920 Speaker 1: We've just been, we've been online for so 427 00:27:36,960 --> 00:27:39,600 Speaker 1: many, it's been many years. We've seen it. We've seen 428 00:27:39,640 --> 00:27:44,240 Speaker 1: it all, babe. Like, everything, it comes and goes in waves, 429 00:27:44,400 --> 00:27:47,880 Speaker 1: and it's never ending. And we are in our señora era. 430 00:27:48,080 --> 00:27:52,280 Speaker 1: So, I guess, do you remember when Instagram was only 431 00:27:52,440 --> 00:27:59,320 Speaker 1: for iPhone users? Yes, I remember, like, seeing the little 432 00:27:59,680 --> 00:28:02,680 Speaker 1: posts from, like, people; it would get reshared 433 00:28:02,720 --> 00:28:05,600 Speaker 1: on Facebook, and I was like, what is that? Like, 434 00:28:05,680 --> 00:28:08,320 Speaker 1: how do people get that photo? What 435 00:28:08,480 --> 00:28:10,320 Speaker 1: is this app? Right? And I didn't have an 436 00:28:10,320 --> 00:28:13,520 Speaker 1: iPhone at the time, so I was like, what is that? 437 00:28:13,640 --> 00:28:15,200 Speaker 1: And so when I got my first 438 00:28:15,240 --> 00:28:20,800 Speaker 1: iPhone in undergrad, I started using Instagram. So I've 439 00:28:20,840 --> 00:28:24,000 Speaker 1: been on here for almost ten years, over ten years, 440 00:28:24,000 --> 00:28:28,880 Speaker 1: and Twitter longer. Tumblr before that. I mean, MySpace, 441 00:28:28,960 --> 00:28:35,760 Speaker 1: as, like, a middle schooler on MySpace. 442 00:28:35,880 --> 00:28:38,840 Speaker 1: I loved social media. Like, from day one, I was 443 00:28:39,000 --> 00:28:42,400 Speaker 1: on MySpace. Dude, I was on there. I 444 00:28:43,400 --> 00:28:46,920 Speaker 1: loved MySpace, you know.
And then from there, 445 00:28:46,960 --> 00:28:50,960 Speaker 1: Facebook. That's again where the addiction started. Like, your 446 00:28:51,000 --> 00:28:54,520 Speaker 1: brain was still developing, and so it just changed. They 447 00:28:54,520 --> 00:28:59,640 Speaker 1: got you. My brain chemistry was forever altered. So 448 00:28:59,680 --> 00:29:01,800 Speaker 1: when you hear us talking about this stuff, just know, 449 00:29:02,040 --> 00:29:04,360 Speaker 1: we're deep in it. We've been deep in it. So 450 00:29:04,400 --> 00:29:08,120 Speaker 1: if you're not deep in it, you don't need to be. 451 00:29:08,400 --> 00:29:13,280 Speaker 1: We're sharing; we're reporting from the trenches, the digital trenches, 452 00:29:13,960 --> 00:29:16,320 Speaker 1: keeping you guys up to date, so you don't have 453 00:29:16,360 --> 00:29:18,800 Speaker 1: to suffer like we suffer. I mean, you even shared 454 00:29:18,920 --> 00:29:21,000 Speaker 1: something with me that I didn't know. Like, I didn't 455 00:29:21,000 --> 00:29:23,800 Speaker 1: know about the sink thing that Elon Musk did, because 456 00:29:24,000 --> 00:29:26,720 Speaker 1: once I saw the rumblings, the writing on the wall, 457 00:29:26,760 --> 00:29:28,800 Speaker 1: as they say, I was like, I'm not getting on Twitter. 458 00:29:28,880 --> 00:29:32,840 Speaker 1: Like, I cannot keep up with what's happening, with the updates. 459 00:29:32,880 --> 00:29:35,800 Speaker 1: Like, I'm going to wait for it to maybe settle 460 00:29:36,000 --> 00:29:39,080 Speaker 1: and then, like, go in and catch up. And so 461 00:29:39,160 --> 00:29:43,520 Speaker 1: I missed the sink update; I didn't know that. Well, 462 00:29:44,120 --> 00:29:49,479 Speaker 1: thank you for tuning in to another Locatora Radio. Next time, we're 463 00:29:49,520 --> 00:29:51,920 Speaker 1: going to have Jenny 69 in studio. No, we 464 00:29:52,000 --> 00:29:58,280 Speaker 1: will not. Exclusive! No, I will not let that happen. For 465 00:29:58,880 --> 00:30:09,320 Speaker 1: an intervention? We're gonna bring in an exorcist. 466 00:30:09,360 --> 00:30:13,520 Speaker 1: We'll catch you next time. Locatora 467 00:30:13,640 --> 00:30:16,600 Speaker 1: Radio is a production of Locatora Productions 468 00:30:16,720 --> 00:30:20,680 Speaker 1: in partnership with iHeart's My Cultura Podcast Network. For more 469 00:30:20,760 --> 00:30:24,400 Speaker 1: podcasts, listen to the iHeartRadio app, Apple Podcasts, 470 00:30:24,600 --> 00:30:30,640 Speaker 1: or wherever you listen to your favorite shows. 471 00:30:30,800 --> 00:30:43,720 Speaker 1: Locatora Radio, a radiophonic novela hosted by Mala 472 00:30:43,800 --> 00:31:30,280 Speaker 1: Muñoz and Diosa Femme. Take us to your network.