1 00:00:05,120 --> 00:00:07,560 Speaker 1: Hey, this is Annie and Samantha and welcome to Stuff 2 00:00:07,600 --> 00:00:19,120 Speaker 1: Mom Never Told You, a production of iHeartRadio, and today 3 00:00:19,160 --> 00:00:22,800 Speaker 1: we are once again thrilled to be joined by the fabulous, 4 00:00:22,880 --> 00:00:26,920 Speaker 1: fantastic Bridget Todd. Welcome, Bridget. We're always so happy to 5 00:00:26,960 --> 00:00:29,440 Speaker 1: have you. Thank you for having me. I'm so happy 6 00:00:29,440 --> 00:00:32,480 Speaker 1: to be back here, and happy, happy late Thanksgiving. Yes, 7 00:00:32,720 --> 00:00:35,879 Speaker 1: to you too. How have things been 8 00:00:35,920 --> 00:00:40,720 Speaker 1: going for you, Bridget? Things have been good. Um, yeah, 9 00:00:40,840 --> 00:00:44,840 Speaker 1: we're getting into the holidays, which, as a grinch, 10 00:00:45,520 --> 00:00:48,480 Speaker 1: is always a little tough, but things are good. Annie, 11 00:00:48,560 --> 00:00:50,640 Speaker 1: I have to ask you, I heard that you had COVID. 12 00:00:50,760 --> 00:00:55,760 Speaker 1: How are you feeling? Oh, thank you. I 13 00:00:55,800 --> 00:00:58,160 Speaker 1: feel totally fine. It was not that bad at all, 14 00:00:58,800 --> 00:01:01,080 Speaker 1: thanks to, you know, I've got my boosters, I've got my vaccine, 15 00:01:01,120 --> 00:01:03,560 Speaker 1: so it was not bad at all. And, uh, yeah, 16 00:01:03,600 --> 00:01:06,440 Speaker 1: I was just talking about how it's been so long 17 00:01:06,480 --> 00:01:09,199 Speaker 1: since I've been sick that it was interesting to see 18 00:01:09,240 --> 00:01:11,680 Speaker 1: what I view as a very sad reaction 19 00:01:11,680 --> 00:01:14,000 Speaker 1: of, oh, I can eat soup and, like, sit in 20 00:01:14,040 --> 00:01:16,080 Speaker 1: bed for a little bit. I think I should be 21 00:01:16,120 --> 00:01:18,959 Speaker 1: able to do those things without getting COVID. But anyway, 22 00:01:19,000 --> 00:01:24,160 Speaker 1: I'm fine. Thank you for asking. That's the best part 23 00:01:24,160 --> 00:01:26,119 Speaker 1: about being sick. It's like you can just 24 00:01:26,760 --> 00:01:30,360 Speaker 1: binge bad TV with no guilt, never leave bed, be 25 00:01:30,440 --> 00:01:34,040 Speaker 1: under fifteen blankets, and just have that be your life 26 00:01:34,080 --> 00:01:40,840 Speaker 1: for a while. Yeah, it was. I feel 27 00:01:40,959 --> 00:01:42,960 Speaker 1: so strange admitting it, but it is true. 28 00:01:43,000 --> 00:01:44,800 Speaker 1: It was like, oh, I can finally do these things 29 00:01:44,800 --> 00:01:47,840 Speaker 1: that I just want to do. But yeah, it 30 00:01:47,960 --> 00:01:50,280 Speaker 1: was all good. And it was really funny because, in 31 00:01:50,360 --> 00:01:54,000 Speaker 1: quarantine, Samantha and I are big, like, we don't go 32 00:01:54,040 --> 00:01:57,720 Speaker 1: anywhere in the pandemic, so I was like, oh, 33 00:01:57,840 --> 00:01:59,520 Speaker 1: this is easy. I don't really have to change my 34 00:01:59,720 --> 00:02:07,240 Speaker 1: routine at all, switch it up. That is so sad.
Like, 35 00:02:07,280 --> 00:02:09,280 Speaker 1: when I had COVID, the 36 00:02:09,320 --> 00:02:13,240 Speaker 1: only big change was, like, when I ordered delivery, 37 00:02:13,240 --> 00:02:15,399 Speaker 1: instead of greeting them at the door, I just had 38 00:02:15,440 --> 00:02:17,919 Speaker 1: them leave it and then I would wait behind the door. 39 00:02:18,680 --> 00:02:20,360 Speaker 1: I could sense when they dropped it off, and then 40 00:02:20,400 --> 00:02:22,680 Speaker 1: I would, like, quickly open the door and, like, grab 41 00:02:22,760 --> 00:02:27,320 Speaker 1: it, like some kind of, like, troll coming out of 42 00:02:27,360 --> 00:02:36,680 Speaker 1: her lair, and, like, run back upstairs. Yes, same, same, uh, yeah, yeah. 43 00:02:36,760 --> 00:02:40,880 Speaker 1: Oh gosh, I'm so glad you're here 44 00:02:40,919 --> 00:02:42,239 Speaker 1: and that you can make time for it, because the 45 00:02:42,240 --> 00:02:44,959 Speaker 1: holidays are very stressful, and I know Samantha is 46 00:02:45,000 --> 00:02:49,280 Speaker 1: also somebody who doesn't really like them. I hate it. 47 00:02:50,000 --> 00:02:52,480 Speaker 1: I don't, like, mind it. I like lights. I think 48 00:02:52,480 --> 00:02:57,079 Speaker 1: that's my big thing. I like lights. But lights, like, lights, 49 00:02:57,480 --> 00:03:03,040 Speaker 1: I do. That being said, the thing is, I'm 50 00:03:03,080 --> 00:03:04,880 Speaker 1: trying to change it up a little bit. I was like, 51 00:03:04,919 --> 00:03:08,160 Speaker 1: maybe I'll actually get an actual Christmas tree, maybe I'll 52 00:03:08,160 --> 00:03:11,000 Speaker 1: do some decorating and all of that. But then I 53 00:03:11,000 --> 00:03:13,519 Speaker 1: went to Target yesterday and looked around, and I got 54 00:03:13,520 --> 00:03:16,520 Speaker 1: the immediate sinking feeling of, like, I hate the holidays. So 55 00:03:16,560 --> 00:03:19,560 Speaker 1: it's like, I'm getting sad, I'm walking away. Like, that's kind 56 00:03:19,560 --> 00:03:21,959 Speaker 1: of the way I felt. And of course, I 57 00:03:22,320 --> 00:03:24,400 Speaker 1: have a house, but I don't have a fireplace, 58 00:03:24,520 --> 00:03:26,480 Speaker 1: so I feel like, with that missing, you, like, don't 59 00:03:26,480 --> 00:03:29,080 Speaker 1: have that mantel to put things on, which is a really 60 00:03:29,680 --> 00:03:32,240 Speaker 1: ridiculous conversation to have as the reason I'm like, this 61 00:03:32,280 --> 00:03:34,480 Speaker 1: is why I don't want to decorate my house or 62 00:03:34,520 --> 00:03:37,480 Speaker 1: whatnot. But yeah, it's just like, I walked in, 63 00:03:37,640 --> 00:03:39,880 Speaker 1: it was overwhelming, and I was like, yeah, I'm done, 64 00:03:40,000 --> 00:03:43,280 Speaker 1: I'm okay. So I did try, for like two seconds. 65 00:03:45,600 --> 00:03:49,360 Speaker 1: You know, perfect, thank you. If it's not for you, 66 00:03:49,400 --> 00:03:51,080 Speaker 1: it's not for you. There's nothing wrong with it.
And 67 00:03:51,200 --> 00:03:55,800 Speaker 1: also, okay, so Bridget, Samantha and I, the topic you 68 00:03:55,800 --> 00:03:58,000 Speaker 1: brought today, we're very excited you brought it 69 00:03:58,040 --> 00:04:01,160 Speaker 1: because we have a lot of questions, but also we've 70 00:04:01,160 --> 00:04:03,440 Speaker 1: been meaning to talk about it, but we were hopeful 71 00:04:03,960 --> 00:04:07,760 Speaker 1: that you would talk about it, because I feel like 72 00:04:07,840 --> 00:04:12,720 Speaker 1: you have a lot more knowledge and expertise in this arena. 73 00:04:12,960 --> 00:04:15,760 Speaker 1: So I'm so excited that this is what 74 00:04:15,800 --> 00:04:19,719 Speaker 1: we're talking about today, which is Twitter. Yeah, and you know, 75 00:04:19,960 --> 00:04:22,320 Speaker 1: the last time you were on, we actually talked about 76 00:04:22,320 --> 00:04:24,920 Speaker 1: the fact that it was the beginning of the end. We 77 00:04:25,080 --> 00:04:27,040 Speaker 1: kind of had that moment, because at that moment 78 00:04:27,040 --> 00:04:29,560 Speaker 1: we knew that Elon Musk was coming on. We're like, 79 00:04:29,680 --> 00:04:33,840 Speaker 1: oh no, here we go. Uh, so definitely, as 80 00:04:33,880 --> 00:04:36,200 Speaker 1: we were watching this, I feel like it was just yesterday 81 00:04:36,200 --> 00:04:39,360 Speaker 1: that we mentioned this, so I'm sure you've got 82 00:04:39,360 --> 00:04:42,320 Speaker 1: so much more in-depth information from the beginning up 83 00:04:42,480 --> 00:04:45,920 Speaker 1: to today. Yeah, I mean, it's so 84 00:04:46,279 --> 00:04:49,280 Speaker 1: weird, and just to level set, like, it does 85 00:04:49,360 --> 00:04:52,279 Speaker 1: feel like we're in a little bit of unprecedented waters. 86 00:04:52,640 --> 00:04:54,240 Speaker 1: You know, I have a lot of expertise when it 87 00:04:54,240 --> 00:04:56,800 Speaker 1: comes to technology and platforms and how they run 88 00:04:56,839 --> 00:05:00,000 Speaker 1: and how they're moderated, but this has never really happened. 89 00:05:00,120 --> 00:05:03,440 Speaker 1: Since I have been working in Internet spaces, 90 00:05:03,480 --> 00:05:06,440 Speaker 1: I have never seen a billionaire buy one of our 91 00:05:06,520 --> 00:05:11,320 Speaker 1: largest digital communications platforms and, like, change the culture pretty 92 00:05:11,400 --> 00:05:14,719 Speaker 1: much single-handedly overnight. So I am going to try 93 00:05:14,760 --> 00:05:18,599 Speaker 1: to bring whatever expertise I have and answer whatever questions 94 00:05:18,640 --> 00:05:21,280 Speaker 1: folks have. But I don't know a lot, because this 95 00:05:21,400 --> 00:05:24,280 Speaker 1: is just, like, a very new, weird experience. I am 96 00:05:24,400 --> 00:05:27,159 Speaker 1: learning right alongside all of y'all and figuring out what's 97 00:05:27,200 --> 00:05:30,080 Speaker 1: happening, um, in real time. It's kind of exciting, 98 00:05:30,120 --> 00:05:36,160 Speaker 1: kind of interesting, but also TBD. Yeah, which I 99 00:05:36,160 --> 00:05:40,000 Speaker 1: guess we should say, we're recording this in November, because 100 00:05:40,040 --> 00:05:43,640 Speaker 1: things are changing rapidly. Things are changing rapidly. A lot 101 00:05:43,680 --> 00:05:47,440 Speaker 1: has happened, as you said, Samantha, since Elon Musk came 102 00:05:47,480 --> 00:05:51,560 Speaker 1: on board. So yeah, we'll see.
I think there's a 103 00:05:51,560 --> 00:05:54,800 Speaker 1: lot more to come. Yeah, well, that's a great place 104 00:05:54,839 --> 00:05:58,120 Speaker 1: to start, you know, just the rapid changing of the situation. 105 00:05:58,480 --> 00:06:00,960 Speaker 1: I initially started putting together an outline, 106 00:06:01,240 --> 00:06:04,200 Speaker 1: I think, like, last week, and mostly all that stuff 107 00:06:04,240 --> 00:06:08,000 Speaker 1: that I wrote has now been changed, so 108 00:06:08,040 --> 00:06:11,000 Speaker 1: hopefully this will not be totally out of 109 00:06:11,080 --> 00:06:12,760 Speaker 1: date by the time folks hear it. Um, I have 110 00:06:12,839 --> 00:06:14,880 Speaker 1: to ask before we even start, like, have you sensed 111 00:06:14,960 --> 00:06:17,960 Speaker 1: that your relationship with Twitter has changed since Elon Musk 112 00:06:17,960 --> 00:06:23,520 Speaker 1: took over? Yeah, I would say so, um, because one 113 00:06:23,560 --> 00:06:27,920 Speaker 1: of the biggest things that I've experienced with Twitter since, 114 00:06:28,400 --> 00:06:32,480 Speaker 1: um, Elon Musk has been, um, either kind of a 115 00:06:32,520 --> 00:06:36,279 Speaker 1: sense of uncertainty or confusion, or people contacting me 116 00:06:36,320 --> 00:06:38,840 Speaker 1: to be like, I'm not sure what's going on, 117 00:06:38,880 --> 00:06:41,120 Speaker 1: so you can find me on this other social platform 118 00:06:41,200 --> 00:06:46,000 Speaker 1: or whatever. And then I have had several experiences 119 00:06:46,000 --> 00:06:50,760 Speaker 1: where I went to check on, um, something 120 00:06:50,880 --> 00:06:53,440 Speaker 1: somebody on Twitter that I follow is doing, and their 121 00:06:53,680 --> 00:06:57,599 Speaker 1: account is not there anymore, just deactivated. And kind of 122 00:06:57,640 --> 00:07:00,200 Speaker 1: just a general, like, I'm not sure if this is 123 00:07:00,279 --> 00:07:02,279 Speaker 1: coming from who I think this is coming from. I 124 00:07:02,279 --> 00:07:06,440 Speaker 1: guess, like, um, like a loss of trust, like, I'm 125 00:07:06,480 --> 00:07:10,440 Speaker 1: not sure what is going on and if I can 126 00:07:10,520 --> 00:07:14,440 Speaker 1: trust anything that is happening. Right. I mean, essentially, I 127 00:07:14,520 --> 00:07:17,200 Speaker 1: think that's the biggest thing, is I don't know if 128 00:07:17,240 --> 00:07:19,840 Speaker 1: this is a real account of someone at all. The 129 00:07:19,880 --> 00:07:22,360 Speaker 1: amount of ads I've been getting has gone up. It's 130 00:07:22,520 --> 00:07:24,680 Speaker 1: the same three ads, but I'm like, oh my god, 131 00:07:24,720 --> 00:07:27,880 Speaker 1: it is constantly there, which we all know why, but, 132 00:07:28,640 --> 00:07:31,080 Speaker 1: I mean, because they're losing so much money. But, um, 133 00:07:31,120 --> 00:07:32,520 Speaker 1: I think one of the big things is, like, I 134 00:07:32,520 --> 00:07:35,920 Speaker 1: will see parody accounts, which, because the verification system 135 00:07:35,960 --> 00:07:38,320 Speaker 1: has kind of completely gone out the window, I 136 00:07:38,360 --> 00:07:39,840 Speaker 1: don't know if it's real. I think there was one 137 00:07:39,880 --> 00:07:41,240 Speaker 1: at one point that I was like, oh, that 138 00:07:41,320 --> 00:07:44,160 Speaker 1: one is real. Oh, that's real.
Like, that's kind of 139 00:07:44,200 --> 00:07:47,040 Speaker 1: one of those moments. As well, the amount of 140 00:07:47,080 --> 00:07:49,920 Speaker 1: misinformation has grown. To me, at least, I knew if 141 00:07:49,920 --> 00:07:52,640 Speaker 1: it was coming from this specific account or type of account 142 00:07:52,960 --> 00:07:55,880 Speaker 1: what was happening. Now I don't know. I'm like, okay, wait, 143 00:07:55,920 --> 00:07:58,840 Speaker 1: did that actually happen? And I don't know anymore if 144 00:07:58,840 --> 00:08:01,040 Speaker 1: it's real or not. Like, so many things that are 145 00:08:01,080 --> 00:08:03,680 Speaker 1: aimed specifically at Elon Musk, and, like, is 146 00:08:03,720 --> 00:08:06,040 Speaker 1: that real? Is that really happening? Did he do these things? 147 00:08:06,160 --> 00:08:08,840 Speaker 1: Or are people saying this in order to fearmonger 148 00:08:08,960 --> 00:08:11,120 Speaker 1: or make it even bigger? And then he comes back 149 00:08:11,160 --> 00:08:13,360 Speaker 1: and says, just kidding, that didn't happen. Like, I don't 150 00:08:13,600 --> 00:08:16,040 Speaker 1: know at this point who to trust and 151 00:08:16,040 --> 00:08:18,440 Speaker 1: whom not to trust. Of course, on our Stuff 152 00:08:18,440 --> 00:08:21,480 Speaker 1: Mom Never Told You account, the followers just dropped because 153 00:08:21,520 --> 00:08:24,800 Speaker 1: so many people have left. I'm assuming, maybe they just left, 154 00:08:24,920 --> 00:08:27,880 Speaker 1: I don't know, but, like, the amount of people that 155 00:08:28,000 --> 00:08:31,080 Speaker 1: have left has, like, increased, and that 156 00:08:31,120 --> 00:08:34,560 Speaker 1: has impacted our numbers for a while. Not that 157 00:08:34,720 --> 00:08:36,679 Speaker 1: we do too much, because I am still fearful 158 00:08:36,720 --> 00:08:40,920 Speaker 1: of writing anything on any of the social media accounts 159 00:08:40,920 --> 00:08:44,040 Speaker 1: at all, but it's interesting to see how that is 160 00:08:44,080 --> 00:08:48,240 Speaker 1: fluctuating as well. The people who, again, I trusted are losing 161 00:08:48,240 --> 00:08:50,400 Speaker 1: their verification because they didn't pay for it, so therefore 162 00:08:50,400 --> 00:08:52,839 Speaker 1: I don't know if that's them anymore. There's so much 163 00:08:52,880 --> 00:08:57,600 Speaker 1: that, like, I don't quite know what's reality and what's fake. And 164 00:08:57,640 --> 00:08:59,680 Speaker 1: then the amount of disinformation, and people that, 165 00:08:59,800 --> 00:09:01,439 Speaker 1: like, have been banned are back on, and I'm like, 166 00:09:01,440 --> 00:09:04,400 Speaker 1: oh god, why. So it's really sad, because Twitter has 167 00:09:04,400 --> 00:09:06,160 Speaker 1: been something that I used to, like, let me know 168 00:09:06,200 --> 00:09:09,559 Speaker 1: what's going on. It's faster than, um, any other news platform, 169 00:09:09,600 --> 00:09:12,480 Speaker 1: and seeing, like, the truth of what's actually happening, 170 00:09:13,080 --> 00:09:17,880 Speaker 1: whether it's information about wrongful deaths or protests around the country, 171 00:09:17,960 --> 00:09:21,000 Speaker 1: around the world. So there's a lot that I'm like, uh, 172 00:09:21,000 --> 00:09:23,160 Speaker 1: I'm kind of lost, but also at the same time 173 00:09:23,240 --> 00:09:27,520 Speaker 1: entertained, because it's going down in flames. Yes, a thousand times.
Yes, 174 00:09:27,600 --> 00:09:30,160 Speaker 1: so all the things that y'all have 175 00:09:30,200 --> 00:09:32,240 Speaker 1: seen and felt and sensed, I have also seen and 176 00:09:32,280 --> 00:09:35,679 Speaker 1: felt and sensed. And for folks listening, even if you're 177 00:09:35,760 --> 00:09:39,280 Speaker 1: someone who doesn't use Twitter, like the majority of Americans, 178 00:09:39,360 --> 00:09:42,160 Speaker 1: or, like, will never use Twitter, not a social media person, 179 00:09:42,840 --> 00:09:45,640 Speaker 1: I am confident that I can help you see 180 00:09:45,640 --> 00:09:47,559 Speaker 1: why this is a big deal for all 181 00:09:47,600 --> 00:09:49,640 Speaker 1: of us, not just folks who are on Twitter. Because 182 00:09:49,640 --> 00:09:51,240 Speaker 1: I do think that a lot of the reporting on 183 00:09:51,280 --> 00:09:55,080 Speaker 1: this is like, oh, well, this matters because it's Twitter 184 00:09:55,120 --> 00:09:58,120 Speaker 1: and everybody uses Twitter. Not so, right? Like, when you 185 00:09:58,160 --> 00:10:01,839 Speaker 1: compare Twitter to other social media platforms like Facebook, only 186 00:10:01,880 --> 00:10:05,880 Speaker 1: a small share of Americans are on Twitter actively, 187 00:10:06,240 --> 00:10:10,000 Speaker 1: and even fewer Americans actually tweet and participate there, so 188 00:10:10,000 --> 00:10:12,760 Speaker 1: we're talking about a small amount of folks. However, 189 00:10:13,600 --> 00:10:16,400 Speaker 1: the reason why this actually, like, kind of matters is 190 00:10:16,400 --> 00:10:18,920 Speaker 1: because the people who are active on Twitter are kind 191 00:10:18,920 --> 00:10:21,040 Speaker 1: of, like, I guess you can sort of call them, 192 00:10:21,080 --> 00:10:27,240 Speaker 1: like, the tastemakers, right? Like journalists, researchers, activists, organizers, people 193 00:10:27,320 --> 00:10:31,319 Speaker 1: who are really able to shape conversation and shape our 194 00:10:31,400 --> 00:10:34,640 Speaker 1: public discourse and sort of have an influence on what 195 00:10:34,840 --> 00:10:38,440 Speaker 1: becomes part of our public kind of conversation, right? And so, Samantha, 196 00:10:38,480 --> 00:10:41,600 Speaker 1: exactly like what you were saying, when people need quick, 197 00:10:41,880 --> 00:10:45,840 Speaker 1: up-to-date, hopefully somewhat reliable information, they're not going 198 00:10:45,880 --> 00:10:48,800 Speaker 1: to Instagram. They're not going to Tumblr. They're not going 199 00:10:48,840 --> 00:10:51,880 Speaker 1: to Snapchat or TikTok. They're going to Twitter. You know, 200 00:10:51,920 --> 00:10:55,080 Speaker 1: there's a reason why, when there's a, you know, mass 201 00:10:55,120 --> 00:10:59,320 Speaker 1: shooting on a campus, people can follow updates quickly on Twitter. 202 00:10:59,360 --> 00:11:02,400 Speaker 1: And so Twitter is not just this place where public 203 00:11:02,400 --> 00:11:05,280 Speaker 1: discourse is shaped. It is also a place where you 204 00:11:05,360 --> 00:11:09,160 Speaker 1: just go to get easy, quick access to hopefully reliable 205 00:11:09,160 --> 00:11:11,880 Speaker 1: information about stuff happening near you.
And so you can 206 00:11:11,880 --> 00:11:14,320 Speaker 1: sort of get a sense of why this is a 207 00:11:14,360 --> 00:11:18,640 Speaker 1: platform where these kinds of power struggles happen, right? Like, 208 00:11:18,960 --> 00:11:21,840 Speaker 1: it's not surprising that, for a person like Donald Trump, 209 00:11:21,880 --> 00:11:24,680 Speaker 1: Twitter was his platform of choice, because you can have 210 00:11:24,960 --> 00:11:28,360 Speaker 1: such outsized influence in getting a message out there quickly 211 00:11:28,600 --> 00:11:30,360 Speaker 1: and effectively. And I just don't think that we have 212 00:11:30,400 --> 00:11:34,120 Speaker 1: another social media platform that mimics that so effectively. Like, 213 00:11:34,400 --> 00:11:36,720 Speaker 1: Facebook moves a little bit slower, even though more people 214 00:11:36,800 --> 00:11:41,320 Speaker 1: are there. Instagram, the timeline moves in such an algorithmic 215 00:11:41,360 --> 00:11:42,840 Speaker 1: way that, like, if I posted 216 00:11:42,880 --> 00:11:44,959 Speaker 1: something that was really important, it's not even a guarantee 217 00:11:44,960 --> 00:11:46,760 Speaker 1: that everybody who follows me would see it, right? And 218 00:11:46,760 --> 00:11:49,360 Speaker 1: so Twitter is such a fast-moving platform, and I 219 00:11:49,360 --> 00:11:51,600 Speaker 1: think it's one of the reasons why we're seeing it 220 00:11:51,679 --> 00:11:55,839 Speaker 1: spring up as a battleground in this, like, highly polarized 221 00:11:55,920 --> 00:12:12,559 Speaker 1: kind of culture war. I'm really glad that you're here 222 00:12:12,600 --> 00:12:15,040 Speaker 1: to make this case, because it was recently 223 00:12:15,040 --> 00:12:17,000 Speaker 1: Thanksgiving and I went home, and, um, a lot of 224 00:12:17,040 --> 00:12:19,720 Speaker 1: people were kind of, you know, laughing about Elon Musk 225 00:12:19,760 --> 00:12:22,320 Speaker 1: and Twitter, and there are definitely some funny aspects of it, 226 00:12:22,320 --> 00:12:23,719 Speaker 1: but they're all like, well, I hate Twitter, so I'm 227 00:12:23,720 --> 00:12:25,040 Speaker 1: not sad to see it go. And I was like, well, 228 00:12:26,000 --> 00:12:27,560 Speaker 1: I have a lot of problems with Twitter too, but 229 00:12:27,760 --> 00:12:30,680 Speaker 1: it's important, it does matter, and it is a shame 230 00:12:30,880 --> 00:12:35,280 Speaker 1: that we're seeing this happen. And there have been 231 00:12:36,320 --> 00:12:40,679 Speaker 1: examples of stuff that took off on Twitter and then 232 00:12:40,720 --> 00:12:45,920 Speaker 1: fundamentally changed our public discourse, whether you use it or not. 233 00:12:46,240 --> 00:12:48,360 Speaker 1: Totally right. And so this is the drum 234 00:12:48,400 --> 00:12:50,880 Speaker 1: I will always beat: even if you are not someone who 235 00:12:50,920 --> 00:12:54,360 Speaker 1: is actively using Twitter, you have felt the cultural impact 236 00:12:54,440 --> 00:12:57,840 Speaker 1: of movements and conversations that started on Twitter. You know, 237 00:12:57,880 --> 00:13:01,199 Speaker 1: if you think about the way that people who traditionally 238 00:13:01,200 --> 00:13:03,439 Speaker 1: have not necessarily had a lot of access to power 239 00:13:03,520 --> 00:13:07,079 Speaker 1: and influence, they can use Twitter to build up that 240 00:13:07,120 --> 00:13:08,920 Speaker 1: power and that voice and that influence.
I could give 241 00:13:08,920 --> 00:13:11,920 Speaker 1: you a million different examples of concrete changes that we're 242 00:13:11,960 --> 00:13:15,080 Speaker 1: all aware of and probably all felt directly that started 243 00:13:15,120 --> 00:13:17,920 Speaker 1: on Twitter. You know, it's not surprising to me 244 00:13:18,000 --> 00:13:20,280 Speaker 1: that Twitter has been used in this way to really 245 00:13:20,320 --> 00:13:23,559 Speaker 1: hold power structures accountable. When Tarana Burke started the Me 246 00:13:23,640 --> 00:13:26,679 Speaker 1: Too movement, you know, she had this movement that 247 00:13:26,720 --> 00:13:31,000 Speaker 1: she had started for black and brown women and girls 248 00:13:31,080 --> 00:13:34,320 Speaker 1: who were survivors of sexual violence. And when actor Alyssa 249 00:13:34,360 --> 00:13:38,320 Speaker 1: Milano tweeted about it on Twitter using the hashtag #MeToo, 250 00:13:38,559 --> 00:13:40,480 Speaker 1: that's when it really took off, right? And so you 251 00:13:40,559 --> 00:13:44,680 Speaker 1: see the power that, you know, generating conversations on 252 00:13:44,679 --> 00:13:47,520 Speaker 1: the platform can have. If not for Twitter, I don't 253 00:13:47,559 --> 00:13:49,320 Speaker 1: know that we would have the Me Too movement 254 00:13:49,360 --> 00:13:50,840 Speaker 1: the way that it was. I don't 255 00:13:50,840 --> 00:13:54,199 Speaker 1: know that, you know, all of these different powerful abusers 256 00:13:54,360 --> 00:13:56,920 Speaker 1: would eventually be held to account and that we would 257 00:13:56,920 --> 00:13:59,440 Speaker 1: be having a national conversation about things like gender and 258 00:13:59,520 --> 00:14:02,440 Speaker 1: sexual violence. You know, another great example is a 259 00:14:02,440 --> 00:14:05,840 Speaker 1: friend of mine, April Reign. She tweeted about how white 260 00:14:06,000 --> 00:14:09,320 Speaker 1: the Oscars nominees were and tweeted #OscarsSoWhite. It 261 00:14:09,400 --> 00:14:12,920 Speaker 1: completely took off. People were tweeting things like, #OscarsSoWhite, 262 00:14:13,040 --> 00:14:16,320 Speaker 1: it touches my hair without asking. #OscarsSoWhite, it, 263 00:14:17,000 --> 00:14:21,720 Speaker 1: you know, da da da da. And because of that conversation, 264 00:14:21,880 --> 00:14:26,000 Speaker 1: it fundamentally changed how the Oscars were that year. 265 00:14:26,160 --> 00:14:29,640 Speaker 1: Spike Lee won an Oscar, and he said that 266 00:14:29,680 --> 00:14:32,360 Speaker 1: he doesn't think it would ever have happened without that campaign, 267 00:14:32,440 --> 00:14:35,120 Speaker 1: which started as a tweet, a hashtag on Twitter. Look 268 00:14:35,120 --> 00:14:37,680 Speaker 1: at the way that it was instrumental during the Arab Spring, 269 00:14:37,760 --> 00:14:40,080 Speaker 1: right? And so, like, there are so many ways and 270 00:14:40,160 --> 00:14:43,600 Speaker 1: times where, particularly outside of the US, Twitter has been 271 00:14:43,720 --> 00:14:47,600 Speaker 1: used to document, you know, abuses of power and hold 272 00:14:47,640 --> 00:14:49,960 Speaker 1: that power to account. And I think, even if 273 00:14:49,960 --> 00:14:53,480 Speaker 1: you're not on Twitter, you've probably felt or seen the 274 00:14:53,600 --> 00:14:55,440 Speaker 1: impact of some kind of movement that 275 00:14:55,480 --> 00:14:58,960 Speaker 1: was started on Twitter, you know.
And as you were 276 00:14:58,960 --> 00:15:01,480 Speaker 1: talking about these movements, which are so huge, I 277 00:15:01,520 --> 00:15:03,200 Speaker 1: had to go back to the original idea of, like, 278 00:15:03,560 --> 00:15:07,920 Speaker 1: our lingo changed completely with what is now the hashtag 279 00:15:08,360 --> 00:15:11,320 Speaker 1: as well, because that actually originated in two thousand seven, 280 00:15:11,320 --> 00:15:13,720 Speaker 1: according to one of the resources that I looked at, 281 00:15:14,160 --> 00:15:17,880 Speaker 1: on Twitter. Like, it literally changed how we looked at 282 00:15:18,080 --> 00:15:21,880 Speaker 1: so much of our conversations online. Um, it brought up 283 00:15:21,920 --> 00:15:24,320 Speaker 1: a way for us to pass messages and bring up 284 00:15:24,400 --> 00:15:27,600 Speaker 1: big issues with what I would have known as the pound sign, 285 00:15:27,640 --> 00:15:31,280 Speaker 1: which, like, the millennials don't know what that is. And 286 00:15:31,280 --> 00:15:34,000 Speaker 1: that's fine. I'm fine with it. I'm okay with it. Um, 287 00:15:34,040 --> 00:15:37,440 Speaker 1: but, like, that in itself has begun a new conversation; 288 00:15:37,480 --> 00:15:40,560 Speaker 1: the zeitgeist in itself has changed because of things 289 00:15:40,560 --> 00:15:42,720 Speaker 1: like that on Twitter. And it's such a big, significant 290 00:15:42,720 --> 00:15:44,960 Speaker 1: thing that we don't often think about, and knowing that 291 00:15:45,000 --> 00:15:48,040 Speaker 1: before two thousand seven that didn't exist, and what a 292 00:15:48,120 --> 00:15:51,320 Speaker 1: powerful tool it is across all social media. Oh 293 00:15:51,320 --> 00:15:54,320 Speaker 1: my god, Sam, this is my thing. I will 294 00:15:54,400 --> 00:15:56,600 Speaker 1: keep my comments as brief as I can. This is, 295 00:15:56,640 --> 00:16:00,200 Speaker 1: like, my favorite topic. The way that things that 296 00:16:00,240 --> 00:16:02,920 Speaker 1: originated on different social media platforms, in this case Twitter, 297 00:16:03,160 --> 00:16:06,800 Speaker 1: have fundamentally changed the way that we communicate digitally 298 00:16:06,920 --> 00:16:09,560 Speaker 1: is fascinating to me. I don't remember when hashtags were 299 00:16:09,600 --> 00:16:13,680 Speaker 1: first rolled out, but previously there wasn't really an easy 300 00:16:13,720 --> 00:16:16,680 Speaker 1: way to quickly figure out what everybody was saying 301 00:16:16,680 --> 00:16:18,640 Speaker 1: on a particular topic. And I still remember, I was 302 00:16:18,680 --> 00:16:21,200 Speaker 1: working as a social media manager at the time, when 303 00:16:21,200 --> 00:16:23,520 Speaker 1: the conversation was like, oh, well, should we use hashtags 304 00:16:23,520 --> 00:16:25,960 Speaker 1: on Facebook? Like, does it work that way? And, you know, 305 00:16:26,080 --> 00:16:28,760 Speaker 1: people still use hashtags. You go to TikTok, like, one 306 00:16:28,800 --> 00:16:32,160 Speaker 1: of the biggest, fastest-growing platforms out there. When you 307 00:16:32,160 --> 00:16:34,200 Speaker 1: scroll down to the bottom of a TikTok, people tend to 308 00:16:34,200 --> 00:16:38,320 Speaker 1: include hashtags.
And so it's interesting how this 309 00:16:38,600 --> 00:16:41,200 Speaker 1: particular mode of digital communication did not just stay 310 00:16:41,240 --> 00:16:44,760 Speaker 1: on Twitter, how it really shaped other platforms as well, 311 00:16:45,000 --> 00:16:47,800 Speaker 1: and I think has become pivotal in terms 312 00:16:47,880 --> 00:16:50,160 Speaker 1: of, like, how we just think about the way that 313 00:16:50,240 --> 00:16:52,600 Speaker 1: discourse works online. I think before the hashtag, we didn't 314 00:16:52,640 --> 00:16:54,560 Speaker 1: think about it as, like, I should be able to 315 00:16:54,600 --> 00:16:56,920 Speaker 1: pull up a hashtag and get a sense of, like, 316 00:16:57,200 --> 00:16:59,960 Speaker 1: the diversity of thought and conversation at any one time, 317 00:17:00,200 --> 00:17:01,880 Speaker 1: and now it's kind of integral to the way 318 00:17:01,880 --> 00:17:04,959 Speaker 1: that we understand communications online. This is something I can 319 00:17:05,040 --> 00:17:07,480 Speaker 1: nerd out on for hours. It's probably not interesting to 320 00:17:07,520 --> 00:17:12,159 Speaker 1: anybody but me. Well, I found it fascinating. To 321 00:17:12,200 --> 00:17:14,800 Speaker 1: be fair, again, I was late in coming to social media. 322 00:17:14,840 --> 00:17:18,000 Speaker 1: I'm still late in understanding social media, to a certain extent. 323 00:17:18,240 --> 00:17:19,719 Speaker 1: So when I first started, I was like, what are 324 00:17:19,760 --> 00:17:21,760 Speaker 1: these hashtags? I'm just gonna write sentences on there, and 325 00:17:22,000 --> 00:17:25,040 Speaker 1: they make no sense. But I'm like, that makes them blue, 326 00:17:25,080 --> 00:17:27,040 Speaker 1: like, you can click on it, I'm gonna do it, 327 00:17:27,240 --> 00:17:31,200 Speaker 1: which is a whole different conversation in itself. I'm like, look, technology, 328 00:17:31,720 --> 00:17:34,600 Speaker 1: which is, again, how I react to most things. But 329 00:17:35,040 --> 00:17:37,840 Speaker 1: going back to where we started, because seeing something that's 330 00:17:37,840 --> 00:17:41,080 Speaker 1: coming to this point, which seems apocalyptic for a social 331 00:17:41,119 --> 00:17:43,560 Speaker 1: media brand, but one of the oldest ones that has 332 00:17:43,640 --> 00:17:45,760 Speaker 1: been out there, that's still, like, stood the test of 333 00:17:45,800 --> 00:17:49,840 Speaker 1: time outside of Facebook, it's interesting to see what is 334 00:17:49,880 --> 00:17:52,479 Speaker 1: happening, because it does feel like we have lived this 335 00:17:52,680 --> 00:17:56,560 Speaker 1: extension of a life, of a creation of something completely different. 336 00:17:57,040 --> 00:17:59,800 Speaker 1: So with all of that, how did we get here? 337 00:17:59,800 --> 00:18:03,359 Speaker 1: How did this happen? Great question. So this is something, 338 00:18:03,400 --> 00:18:06,600 Speaker 1: again, that I wish I saw more of in the reporting. 339 00:18:06,680 --> 00:18:08,760 Speaker 1: People kind of gloss over it, and I think it's 340 00:18:08,760 --> 00:18:12,600 Speaker 1: actually a really big part of the story, and it's 341 00:18:12,640 --> 00:18:14,919 Speaker 1: important to not just gloss over it, which is that, 342 00:18:15,000 --> 00:18:20,040 Speaker 1: you know, Musk's decision to buy Twitter, per his 343 00:18:20,160 --> 00:18:24,640 Speaker 1: own statements, was rooted in transphobia.
Um, he'd been talking 344 00:18:24,640 --> 00:18:29,360 Speaker 1: about buying Twitter for a while, and Twitter added intentionally 345 00:18:29,400 --> 00:18:32,520 Speaker 1: misgendering people to their list of prohibited behavior on the platform. 346 00:18:33,000 --> 00:18:37,119 Speaker 1: So, like, as a means of trying to 347 00:18:37,680 --> 00:18:41,720 Speaker 1: harm someone, you cannot misgender them. So the right-wing, 348 00:18:42,320 --> 00:18:44,960 Speaker 1: I guess you'd call it, like, a parody site, even 349 00:18:45,000 --> 00:18:48,119 Speaker 1: that doesn't seem quite correct, but we'll call it a 350 00:18:48,160 --> 00:18:51,240 Speaker 1: parody website for the sake of conversation, the Babylon Bee, 351 00:18:51,800 --> 00:18:55,560 Speaker 1: they violated this rule when they tweeted a transphobic "joke," 352 00:18:55,960 --> 00:18:59,080 Speaker 1: in scare quotes, about Dr. Rachel Levine, who was 353 00:18:59,160 --> 00:19:02,440 Speaker 1: the first openly trans four-star officer in the military and 354 00:19:02,600 --> 00:19:05,159 Speaker 1: currently the Assistant Secretary for Health in the U.S. 355 00:19:05,200 --> 00:19:08,480 Speaker 1: Department of Health and Human Services. They made a crack 356 00:19:08,520 --> 00:19:11,720 Speaker 1: on Twitter that she had been named, quote, Man of 357 00:19:11,760 --> 00:19:14,199 Speaker 1: the Year, right? And so, a lot of the reporting... 358 00:19:14,440 --> 00:19:18,159 Speaker 1: First of all, it's wild to me how, like, that 359 00:19:18,160 --> 00:19:21,160 Speaker 1: level of humor has not evolved since that movie 360 00:19:21,359 --> 00:19:25,760 Speaker 1: Ace Ventura: Pet Detective. Like, we're still in the nineties? Like, 361 00:19:25,800 --> 00:19:28,960 Speaker 1: oh my god, like, get a new joke, people. 362 00:19:29,200 --> 00:19:32,160 Speaker 1: It is so tired. But a lot of people 363 00:19:32,200 --> 00:19:35,040 Speaker 1: reported that the Babylon Bee was banned from Twitter 364 00:19:35,040 --> 00:19:37,479 Speaker 1: for this tweet, but that technically is not correct. 365 00:19:38,119 --> 00:19:41,080 Speaker 1: Twitter said that the Babylon Bee could have their account 366 00:19:41,200 --> 00:19:44,479 Speaker 1: back in twelve hours, but that that countdown could not 367 00:19:44,560 --> 00:19:47,879 Speaker 1: start until they deleted that particular tweet. They refused to 368 00:19:47,920 --> 00:19:50,080 Speaker 1: delete that tweet, so they were unable to tweet, so 369 00:19:50,080 --> 00:19:53,919 Speaker 1: they weren't technically banned. They kind of decided, like, we 370 00:19:53,960 --> 00:19:55,800 Speaker 1: are going to die on this hill 371 00:19:55,880 --> 00:19:58,160 Speaker 1: of this tweet, and if we can't tweet, that's fine. 372 00:19:58,240 --> 00:20:01,320 Speaker 1: So at this time, Musk was already the biggest shareholder 373 00:20:01,320 --> 00:20:03,240 Speaker 1: of Twitter, and he had been invited to join its 374 00:20:03,240 --> 00:20:06,480 Speaker 1: board of directors. Um, in April, Elon offered to buy Twitter, 375 00:20:06,520 --> 00:20:09,680 Speaker 1: and he said that it was specifically that Babylon Bee 376 00:20:09,960 --> 00:20:12,400 Speaker 1: situation that prompted him to do so, that, like, that 377 00:20:12,520 --> 00:20:15,919 Speaker 1: was the final straw, watching the Babylon Bee not be 378 00:20:16,000 --> 00:20:18,080 Speaker 1: able to tweet because they refused to delete this joke.
379 00:20:18,920 --> 00:20:21,520 Speaker 1: The Babylon Bee confirmed this to The Washington Times, saying, 380 00:20:22,119 --> 00:20:25,080 Speaker 1: "We've had some communication with Musk. He wanted to 381 00:20:25,119 --> 00:20:27,359 Speaker 1: confirm that we had in fact been suspended from Twitter. 382 00:20:27,400 --> 00:20:29,399 Speaker 1: He reached out to us before he publicly asked his 383 00:20:29,440 --> 00:20:32,760 Speaker 1: followers if they think Twitter vigorously adheres to the principle 384 00:20:32,760 --> 00:20:35,560 Speaker 1: of free expression. He even mused on that call that 385 00:20:35,640 --> 00:20:38,160 Speaker 1: he might need to buy Twitter." So, to me, 386 00:20:38,320 --> 00:20:42,479 Speaker 1: it's, like, pretty disappointing and also very important that Elon 387 00:20:42,600 --> 00:20:48,199 Speaker 1: Musk's, you know, tenure at Twitter really starts with his 388 00:20:48,280 --> 00:20:53,600 Speaker 1: desire to protect transphobic "jokes," in scare quotes, as protected 389 00:20:53,640 --> 00:20:56,120 Speaker 1: speech and free speech on the platform. I think that's 390 00:20:56,440 --> 00:20:58,800 Speaker 1: terribly disappointing, and I think it's related to the ways 391 00:20:58,800 --> 00:21:00,919 Speaker 1: that we're seeing him, you know, reign at 392 00:21:00,960 --> 00:21:04,320 Speaker 1: Twitter right now. So disappointing, because, from what I do understand, 393 00:21:04,520 --> 00:21:08,800 Speaker 1: he does have a trans child, and, like, obviously 394 00:21:08,800 --> 00:21:12,520 Speaker 1: the relationship is not good. They disowned Musk as their 395 00:21:12,640 --> 00:21:15,680 Speaker 1: parental figure. But it seems like just an attack 396 00:21:16,560 --> 00:21:18,600 Speaker 1: on them. If I were them, I 397 00:21:18,600 --> 00:21:21,640 Speaker 1: would think this as well. And it's just so disappointing 398 00:21:21,720 --> 00:21:26,080 Speaker 1: to know that you could really care so little about 399 00:21:26,160 --> 00:21:29,320 Speaker 1: your own child that you're willing to go into 400 00:21:29,320 --> 00:21:33,000 Speaker 1: forty-four billion dollars of debt for this. Yes, 401 00:21:33,200 --> 00:21:36,760 Speaker 1: it is. I mean, this is a weird aside, 402 00:21:36,800 --> 00:21:39,240 Speaker 1: but, like, I actually spend a lot of time thinking 403 00:21:39,240 --> 00:21:43,040 Speaker 1: about Elon Musk's, like, personal motivations. I obviously don't know 404 00:21:43,080 --> 00:21:45,640 Speaker 1: Elon Musk, like, at all. It's all speculation, 405 00:21:45,720 --> 00:21:51,560 Speaker 1: but I do think there's something about it, like, I 406 00:21:51,600 --> 00:21:55,600 Speaker 1: can't imagine having a trans child and going out of 407 00:21:55,640 --> 00:21:58,119 Speaker 1: my way and spending a lot of money to protect, 408 00:21:58,680 --> 00:22:02,120 Speaker 1: like, transphobic content, right? I cannot imagine it. It's 409 00:22:02,119 --> 00:22:04,040 Speaker 1: difficult for me to put myself in that position, I 410 00:22:04,080 --> 00:22:06,680 Speaker 1: guess I'll say. I also think that it really reflects 411 00:22:06,760 --> 00:22:12,919 Speaker 1: the ways that, particularly, trans people have unfortunately kind of 412 00:22:12,960 --> 00:22:16,920 Speaker 1: become, like, just the existence of trans people 413 00:22:16,960 --> 00:22:21,119 Speaker 1: trying to live their lives has become this, like, flash point.
414 00:22:21,400 --> 00:22:22,919 Speaker 1: On the one hand, it is surprising to me that 415 00:22:22,960 --> 00:22:25,440 Speaker 1: he would double down on this, having a trans child. 416 00:22:25,600 --> 00:22:27,560 Speaker 1: On the other hand, I do see the ways that 417 00:22:27,640 --> 00:22:30,040 Speaker 1: just, like, the existence of trans people trying to live 418 00:22:30,080 --> 00:22:34,240 Speaker 1: their lives has become this incredibly, like, politicized hot-button issue. 419 00:22:34,480 --> 00:22:36,840 Speaker 1: It certainly should not be. But on the other hand, 420 00:22:36,840 --> 00:22:40,200 Speaker 1: it's not terribly surprising to me that his bid at 421 00:22:40,200 --> 00:22:43,440 Speaker 1: Twitter starts with that transphobia, that 422 00:22:43,440 --> 00:22:46,840 Speaker 1: that's where it begins, right? It makes me want to 423 00:22:46,880 --> 00:22:50,479 Speaker 1: see him fail, but that's just me. But, you know, 424 00:22:50,680 --> 00:22:54,159 Speaker 1: with that, it's kind of odd that he also continues 425 00:22:54,359 --> 00:22:57,680 Speaker 1: to talk about free speech. I think it's really, really, 426 00:22:57,720 --> 00:23:00,639 Speaker 1: almost ironic. He talks about free speech, and several of 427 00:23:00,640 --> 00:23:03,000 Speaker 1: the people that I've seen banned are the people that 428 00:23:03,080 --> 00:23:07,720 Speaker 1: said something about him, not necessarily anything, uh, completely offhand. 429 00:23:07,760 --> 00:23:09,480 Speaker 1: It was just like, hey, he did this and this 430 00:23:09,560 --> 00:23:12,880 Speaker 1: is a bad thing, and he, like, banned them. Exactly. Okay, 431 00:23:12,920 --> 00:23:15,720 Speaker 1: so let's get into this, because Elon Musk, he has 432 00:23:15,760 --> 00:23:19,320 Speaker 1: called himself a free speech absolutist. He said at a 433 00:23:19,359 --> 00:23:23,520 Speaker 1: TED talk, like, free speech is the ability for someone 434 00:23:23,680 --> 00:23:26,159 Speaker 1: you don't like to say something you don't like, right? 435 00:23:26,160 --> 00:23:28,879 Speaker 1: And so you might be thinking, oh, well, he would 436 00:23:28,880 --> 00:23:31,919 Speaker 1: probably be working to foster a climate of open discourse 437 00:23:32,080 --> 00:23:35,080 Speaker 1: and discussion at Twitter, now that he's in charge. But 438 00:23:35,119 --> 00:23:36,600 Speaker 1: you would be wrong if you 439 00:23:36,680 --> 00:23:38,639 Speaker 1: thought that, right? Because that's not what 440 00:23:38,680 --> 00:23:40,800 Speaker 1: the vibe has been so far. And so, first of all, 441 00:23:40,880 --> 00:23:42,720 Speaker 1: I have to say, like, obviously, when we talk about 442 00:23:42,760 --> 00:23:46,480 Speaker 1: free speech, really what we're talking about is whether governments 443 00:23:46,520 --> 00:23:50,399 Speaker 1: can punish people or prevent people, um, from saying what 444 00:23:50,440 --> 00:23:53,520 Speaker 1: they want to say, right? And so the conversation around 445 00:23:53,520 --> 00:23:56,040 Speaker 1: free speech has gotten so muddled, with, like, oh, well, 446 00:23:56,080 --> 00:23:58,359 Speaker 1: this is censorship.
It's like, oh, not really. But 447 00:23:58,400 --> 00:24:02,080 Speaker 1: there are absolutely speech issues happening in the United States 448 00:24:02,200 --> 00:24:04,320 Speaker 1: right now. Like, look at, like, anti- 449 00:24:04,359 --> 00:24:07,919 Speaker 1: CRT bills or, like, "Don't Say Gay" legislation. Like, we 450 00:24:08,000 --> 00:24:12,119 Speaker 1: are in a climate where free expression is under attack, 451 00:24:12,400 --> 00:24:15,920 Speaker 1: specifically from government. So, like, that needs to be clear. 452 00:24:16,400 --> 00:24:19,120 Speaker 1: But for the sake of conversation, let's talk about free 453 00:24:19,160 --> 00:24:23,000 Speaker 1: speech as sort of generally fostering a climate of open 454 00:24:23,040 --> 00:24:25,959 Speaker 1: discourse and debate. If we use that definition of 455 00:24:26,000 --> 00:24:29,399 Speaker 1: free speech, you might be thinking, like, has Elon Musk 456 00:24:29,520 --> 00:24:33,440 Speaker 1: worked to create that kind of environment since taking over? Also no, 457 00:24:33,840 --> 00:24:37,159 Speaker 1: he has not done that. Uh, first, he got started 458 00:24:37,200 --> 00:24:40,359 Speaker 1: by overseeing a mass exodus of staff at Twitter. Um, 459 00:24:40,640 --> 00:24:43,520 Speaker 1: some employees resigned, which I totally get. Like, I would 460 00:24:43,560 --> 00:24:45,720 Speaker 1: have been out the door. Honestly, there are 461 00:24:45,720 --> 00:24:47,159 Speaker 1: not a lot of jobs that, if I was offered 462 00:24:47,200 --> 00:24:48,840 Speaker 1: three months' pay to not do them, I would 463 00:24:48,920 --> 00:24:51,240 Speaker 1: keep doing. I've 464 00:24:51,240 --> 00:24:52,439 Speaker 1: got a lot of jobs where that would be the 465 00:24:52,440 --> 00:24:54,280 Speaker 1: case for me. So I would totally have been on 466 00:24:54,320 --> 00:24:57,240 Speaker 1: the resignation train. So I totally get that. And some 467 00:24:57,280 --> 00:25:00,720 Speaker 1: workers were fired, and specifically they were fired for things 468 00:25:00,720 --> 00:25:04,520 Speaker 1: that they said about Elon Musk. Engineer Eric Frohnhoefer was 469 00:25:04,560 --> 00:25:07,480 Speaker 1: fired very publicly for going against Musk, and it happened 470 00:25:07,480 --> 00:25:10,919 Speaker 1: right on Twitter. Eric worked on Twitter's app for Android, 471 00:25:11,720 --> 00:25:14,840 Speaker 1: and after Musk tweeted that Twitter for Android had been 472 00:25:14,960 --> 00:25:18,840 Speaker 1: really slow, the engineer retweeted Musk, saying that 473 00:25:18,840 --> 00:25:22,639 Speaker 1: Musk's technical understanding of Twitter's app was wrong, 474 00:25:22,800 --> 00:25:25,120 Speaker 1: which I can kind of buy, because Elon Musk is 475 00:25:25,160 --> 00:25:28,600 Speaker 1: famously not an engineer, so he perhaps might not totally know 476 00:25:28,600 --> 00:25:31,080 Speaker 1: what he's talking about when talking to an actual engineer 477 00:25:31,200 --> 00:25:33,879 Speaker 1: who works on the thing that he is criticizing. And 478 00:25:33,920 --> 00:25:37,600 Speaker 1: so Musk replied and asked this engineer to elaborate, before 479 00:25:37,640 --> 00:25:40,600 Speaker 1: writing, "Twitter is super slow on Android. What have you 480 00:25:40,600 --> 00:25:42,760 Speaker 1: done to fix that?"
They kind of go back and forth, 481 00:25:42,880 --> 00:25:45,800 Speaker 1: and then somebody else comes in and is like, hey, engineer, 482 00:25:46,200 --> 00:25:48,800 Speaker 1: why don't you raise these issues privately with your boss? 483 00:25:49,080 --> 00:25:52,480 Speaker 1: And the engineer replies, maybe he should ask questions privately, 484 00:25:52,600 --> 00:25:56,440 Speaker 1: maybe via Slack or email. Elon then weighs in and says, oh, 485 00:25:56,520 --> 00:25:59,800 Speaker 1: that engineer has been fired. So I find it very 486 00:25:59,840 --> 00:26:03,360 Speaker 1: interesting, because a lot of people said, like, oh, if 487 00:26:03,400 --> 00:26:07,240 Speaker 1: you publicly disagree with your boss, you know, 488 00:26:07,800 --> 00:26:10,400 Speaker 1: like, pretty much anywhere, you would probably be fired. And 489 00:26:10,880 --> 00:26:13,879 Speaker 1: I guess that's a fair point. Like, in 490 00:26:13,960 --> 00:26:16,600 Speaker 1: most workplaces you can't, like, get on Twitter and call 491 00:26:16,640 --> 00:26:19,280 Speaker 1: out your boss. Totally get that. But I think the 492 00:26:19,520 --> 00:26:23,320 Speaker 1: larger point is, like, it seems directly at odds with 493 00:26:23,359 --> 00:26:26,840 Speaker 1: somebody who has called himself a free speech absolutist. And 494 00:26:26,880 --> 00:26:29,000 Speaker 1: I also think that engineer has a really good point. 495 00:26:29,080 --> 00:26:31,959 Speaker 1: Like, why is it that Elon, as the boss, 496 00:26:32,359 --> 00:26:35,720 Speaker 1: is allowed to publicly misrepresent his work on Twitter, and 497 00:26:35,760 --> 00:26:38,840 Speaker 1: the staffer, he is the one who has to 498 00:26:38,840 --> 00:26:42,760 Speaker 1: stay quiet and only set the record straight, like, in private? 499 00:26:42,800 --> 00:26:45,840 Speaker 1: Like, Elon Musk made it public 500 00:26:45,840 --> 00:26:47,359 Speaker 1: by tweeting about it. Why is it that when his 501 00:26:47,480 --> 00:26:51,800 Speaker 1: boss publicly, I think, craps on his work, he is 502 00:26:51,880 --> 00:26:54,800 Speaker 1: not able to publicly reply? I kind of totally get 503 00:26:54,840 --> 00:26:56,879 Speaker 1: why he did this, even if most people 504 00:26:56,920 --> 00:27:01,680 Speaker 1: might think, like, oh, that's clearly a fireable offense. Yes, agreed. 505 00:27:02,320 --> 00:27:05,840 Speaker 1: And also, there have been a lot of stories that, like, 506 00:27:06,000 --> 00:27:09,360 Speaker 1: even going privately might not have protected him. Exactly. 507 00:27:09,400 --> 00:27:11,960 Speaker 1: So even if this engineer had brought it up privately 508 00:27:12,080 --> 00:27:15,560 Speaker 1: via Slack or email, I think he probably still 509 00:27:15,560 --> 00:27:17,840 Speaker 1: would have been fired, because it's not just staff who 510 00:27:17,840 --> 00:27:20,719 Speaker 1: have publicly gone against Musk who have been terminated. ABC 511 00:27:20,880 --> 00:27:23,720 Speaker 1: reported that dozens of other staffers said that they were 512 00:27:23,720 --> 00:27:27,080 Speaker 1: fired for raising criticisms in internal Slack messages or email. 513 00:27:27,359 --> 00:27:30,040 Speaker 1: And that kind of tracks with the kind of climate 514 00:27:30,080 --> 00:27:32,640 Speaker 1: that Musk has run at his other companies.
I don't 515 00:27:32,640 --> 00:27:34,840 Speaker 1: know how he's been able to dub himself a 516 00:27:34,960 --> 00:27:38,760 Speaker 1: free speech warrior, because the record of 517 00:27:39,000 --> 00:27:42,280 Speaker 1: the climate at his other organizations just does not reflect that. 518 00:27:42,400 --> 00:27:45,640 Speaker 1: Like, at SpaceX, for instance, former staff filed an unfair 519 00:27:45,760 --> 00:27:49,280 Speaker 1: labor practice charge with the National Labor Relations Board, saying 520 00:27:49,280 --> 00:27:52,199 Speaker 1: that they were retaliated against for writing and organizing a 521 00:27:52,280 --> 00:27:54,879 Speaker 1: letter that was critical of the company when Musk had 522 00:27:54,960 --> 00:27:57,720 Speaker 1: been accused of sexual harassment. And so they wrote this 523 00:27:57,800 --> 00:28:02,119 Speaker 1: letter saying, Elon Musk's behavior, the sexual harassment, his, like, 524 00:28:02,240 --> 00:28:04,959 Speaker 1: crass jokes on Twitter, is making us look bad and 525 00:28:05,000 --> 00:28:07,200 Speaker 1: it needs to stop. And they 526 00:28:07,400 --> 00:28:10,040 Speaker 1: went to the NLRB saying that they 527 00:28:10,040 --> 00:28:13,720 Speaker 1: were unfairly retaliated against for that. And so, I don't know, 528 00:28:13,760 --> 00:28:17,560 Speaker 1: I have a hard time believing that Elon Musk is 529 00:28:17,600 --> 00:28:20,040 Speaker 1: this, like, champion for free speech when those are 530 00:28:20,040 --> 00:28:22,680 Speaker 1: the kinds of climates that he has fostered at his organizations. 531 00:28:22,680 --> 00:28:25,080 Speaker 1: There's so much to all of that, including the fact 532 00:28:25,119 --> 00:28:28,920 Speaker 1: that when we often hear people complaining about free speech, especially 533 00:28:28,920 --> 00:28:31,720 Speaker 1: today, as you were talking about, like, the anti-CRT 534 00:28:32,080 --> 00:28:35,080 Speaker 1: stuff, as well as so many other things, "Don't Say 535 00:28:35,080 --> 00:28:38,600 Speaker 1: Gay" bills, um, it seems very much to be coming from 536 00:28:38,760 --> 00:28:42,960 Speaker 1: the conservative side, because they feel like, since they think 537 00:28:42,960 --> 00:28:44,520 Speaker 1: they don't have the same privilege as they did two 538 00:28:44,600 --> 00:28:48,240 Speaker 1: years ago, they're now being imposed on and therefore 539 00:28:48,280 --> 00:28:50,479 Speaker 1: no longer have free speech. And we also know 540 00:28:50,480 --> 00:28:54,280 Speaker 1: that at Tesla there was a lawsuit where many black 541 00:28:54,320 --> 00:28:58,520 Speaker 1: employees talked about the racism within that company, 542 00:28:58,520 --> 00:29:02,040 Speaker 1: including being called the N-word and all of that. 543 00:29:02,360 --> 00:29:04,600 Speaker 1: And I'm thinking that's what he's talking about when he talks 544 00:29:04,680 --> 00:29:07,320 Speaker 1: about limiting free speech, like, okay, white people, you are now 545 00:29:07,360 --> 00:29:11,000 Speaker 1: allowed to say that here, because that's free speech, and 546 00:29:11,040 --> 00:29:14,000 Speaker 1: not really, like, crediting the fact that, 547 00:29:14,160 --> 00:29:16,200 Speaker 1: you know, it's racist. That's not an issue of 548 00:29:16,200 --> 00:29:19,760 Speaker 1: free speech. That's about racism.
But that's what I'm seeing 549 00:29:19,800 --> 00:29:21,240 Speaker 1: as a pattern, as well as the fact that 550 00:29:21,280 --> 00:29:24,680 Speaker 1: the people who have been defending Musk, like the dudes who are like, hey, dude, 551 00:29:24,720 --> 00:29:27,280 Speaker 1: don't be saying this to your boss, are absolutely at 552 00:29:27,320 --> 00:29:30,160 Speaker 1: the level of, like, how close in ratio are we to 553 00:29:30,400 --> 00:29:36,720 Speaker 1: the incel link here? I have so much 554 00:29:36,760 --> 00:29:40,560 Speaker 1: to say about this. So, one, I hadn't really thought 555 00:29:40,600 --> 00:29:43,360 Speaker 1: about this before, but you really laid 556 00:29:43,400 --> 00:29:46,160 Speaker 1: this out nicely. I do think that the way that 557 00:29:46,200 --> 00:29:49,360 Speaker 1: people talk about content moderation on a platform like Twitter 558 00:29:49,800 --> 00:29:55,800 Speaker 1: does kind of connect to conversations about, like, oh, because 559 00:29:55,800 --> 00:29:58,240 Speaker 1: I can't say what I used to be able to 560 00:29:58,280 --> 00:30:01,840 Speaker 1: say twenty years ago without repercussions, therefore I 561 00:30:01,880 --> 00:30:07,240 Speaker 1: am being censored; therefore, the only way to protect free 562 00:30:07,240 --> 00:30:13,840 Speaker 1: speech is to allow, specifically, like, racist or transphobic content 563 00:30:14,000 --> 00:30:16,400 Speaker 1: on these platforms. Like, I hadn't really thought about it 564 00:30:16,400 --> 00:30:18,480 Speaker 1: in that direct of a line, but I think that 565 00:30:18,520 --> 00:30:21,600 Speaker 1: you're absolutely onto something, that the reason why so many 566 00:30:21,600 --> 00:30:24,640 Speaker 1: people, when they talk about, like, free speech, free speech, 567 00:30:25,040 --> 00:30:28,160 Speaker 1: they're not talking about the free speech rights of, like, 568 00:30:28,440 --> 00:30:32,760 Speaker 1: you know, Palestinian activists, or, you know, sex workers, or 569 00:30:32,840 --> 00:30:38,160 Speaker 1: any of the other marginalized people who we know actually 570 00:30:38,200 --> 00:30:40,880 Speaker 1: are the recipients of crackdowns on their speech. They're talking 571 00:30:40,880 --> 00:30:43,040 Speaker 1: about the right to say the N-word. And 572 00:30:43,040 --> 00:30:46,560 Speaker 1: to your point about Tesla, I would implore folks to 573 00:30:47,000 --> 00:30:49,440 Speaker 1: read up on this, because some of the allegations, 574 00:30:49,840 --> 00:30:53,040 Speaker 1: it's not just like, oh, this person made 575 00:30:53,440 --> 00:30:57,280 Speaker 1: off-color jokes. The allegations, some of them, are against 576 00:30:57,400 --> 00:30:59,760 Speaker 1: Musk himself and the kind of climate that he ran, 577 00:30:59,840 --> 00:31:01,920 Speaker 1: and it's sickening, right?
Like, one of them that sticks in 578 00:31:01,960 --> 00:31:04,880 Speaker 1: my mind is that at the Tesla factory, one of 579 00:31:04,920 --> 00:31:08,680 Speaker 1: the lawsuits alleges that when Elon Musk would come to 580 00:31:08,760 --> 00:31:12,360 Speaker 1: tour the factories, staffers knew 581 00:31:12,840 --> 00:31:17,360 Speaker 1: that Elon Musk had a problem with seeing Black staffers 582 00:31:17,400 --> 00:31:19,600 Speaker 1: on the floor, and so on days when 583 00:31:19,640 --> 00:31:22,440 Speaker 1: he was coming, they would have the Black staffers 584 00:31:23,040 --> 00:31:26,280 Speaker 1: essentially hide, because they knew that it would 585 00:31:26,320 --> 00:31:29,960 Speaker 1: be a better walkthrough for Musk if Musk did 586 00:31:29,960 --> 00:31:32,239 Speaker 1: not see Black and brown faces on his floor. So 587 00:31:32,280 --> 00:31:34,600 Speaker 1: it's not just like, oh, someone made a joke 588 00:31:34,680 --> 00:31:37,240 Speaker 1: I didn't like; that would be bad enough. 589 00:31:37,240 --> 00:31:41,440 Speaker 1: Some of the allegations are really deep and really, I 590 00:31:41,480 --> 00:31:47,480 Speaker 1: think, reflect a deeply troubling culture. And it's 591 00:31:47,600 --> 00:31:50,360 Speaker 1: interesting, because I read this, I can't verify if this 592 00:31:50,480 --> 00:31:52,320 Speaker 1: is an accurate thing or not, but I read this, 593 00:31:52,520 --> 00:31:55,040 Speaker 1: um, thing where someone was like, oh, I used to 594 00:31:55,080 --> 00:31:58,200 Speaker 1: work at SpaceX, and everybody knew that part of 595 00:31:58,240 --> 00:32:00,680 Speaker 1: the deal when you worked for Elon Musk was that 596 00:32:01,600 --> 00:32:07,040 Speaker 1: so much management went into managing Elon Musk specifically, and 597 00:32:07,040 --> 00:32:09,160 Speaker 1: so they would have all these, like, different 598 00:32:09,160 --> 00:32:11,680 Speaker 1: window dressings, so that when Elon Musk was there, 599 00:32:11,960 --> 00:32:14,000 Speaker 1: he could feel certain kinds of ways. And so they 600 00:32:14,000 --> 00:32:15,880 Speaker 1: would have, he talked about how they would have, 601 00:32:16,040 --> 00:32:18,560 Speaker 1: a computer run code in a way that looks like 602 00:32:18,600 --> 00:32:21,440 Speaker 1: the Matrix, so that Elon Musk, who is famously not 603 00:32:21,480 --> 00:32:25,000 Speaker 1: an engineer, would think that really cool, super special tech 604 00:32:25,040 --> 00:32:27,600 Speaker 1: stuff was happening, even though it was just for show, right? 605 00:32:27,640 --> 00:32:29,600 Speaker 1: And so I think some of the 606 00:32:29,640 --> 00:32:33,400 Speaker 1: allegations coming out of other Elon Musk-run companies are 607 00:32:33,480 --> 00:32:36,560 Speaker 1: so deeply troubling, and I think that these are companies 608 00:32:36,600 --> 00:32:40,640 Speaker 1: that have an infrastructure to manage 609 00:32:40,720 --> 00:32:44,720 Speaker 1: these particular ways that Musk has. Twitter,
it's not 610 00:32:44,800 --> 00:32:46,520 Speaker 1: a company that has that. And so it's interesting 611 00:32:46,560 --> 00:32:49,600 Speaker 1: to think, like, what will the vibe at Twitter be 612 00:32:49,720 --> 00:32:52,000 Speaker 1: like when so much of the staff has been gutted 613 00:32:52,280 --> 00:32:54,720 Speaker 1: and it's not a company that has been built to 614 00:32:55,040 --> 00:32:59,440 Speaker 1: manage Elon Musk's quirks. Yeah, I've heard that too, 615 00:32:59,440 --> 00:33:02,600 Speaker 1: and that's so sad, that, like, people's mental energy 616 00:33:02,800 --> 00:33:07,080 Speaker 1: is, we gotta keep this guy calm, and here's 617 00:33:07,120 --> 00:33:24,600 Speaker 1: how we'll do it. I think, um, as we're discussing, 618 00:33:25,160 --> 00:33:27,120 Speaker 1: he and a lot of people on the right, very 619 00:33:27,120 --> 00:33:32,080 Speaker 1: conservative people, would have you believe that this sort 620 00:33:32,120 --> 00:33:35,560 Speaker 1: of hard-right, free speech in quotes, crowd, that they're being 621 00:33:35,640 --> 00:33:39,960 Speaker 1: policed more, that they're being more censored, 622 00:33:40,000 --> 00:33:43,160 Speaker 1: that they are very much the victims in this conversation. But 623 00:33:43,400 --> 00:33:46,720 Speaker 1: that isn't the case, is it? It is not the case. 624 00:33:46,800 --> 00:33:48,680 Speaker 1: And this is something that I think really points to 625 00:33:48,680 --> 00:33:50,400 Speaker 1: the kind of ship that Musk is going to be 626 00:33:50,480 --> 00:33:53,000 Speaker 1: running at Twitter: he gets on Twitter and 627 00:33:53,040 --> 00:33:58,240 Speaker 1: repeats this thoroughly debunked claim that Twitter has been, quote, 628 00:33:58,320 --> 00:34:01,520 Speaker 1: censoring the right more than the left, and there is 629 00:34:01,560 --> 00:34:04,680 Speaker 1: an entire body of research debunking this. Like, I could 630 00:34:04,720 --> 00:34:07,000 Speaker 1: do a whole episode digging into some of the studies 631 00:34:07,000 --> 00:34:08,960 Speaker 1: that have come out about this, it's fascinating to me, 632 00:34:09,000 --> 00:34:11,279 Speaker 1: but I'll just run through a couple. So researchers from 633 00:34:11,440 --> 00:34:14,240 Speaker 1: MIT, Yale, and the University of Exeter published 634 00:34:14,239 --> 00:34:16,839 Speaker 1: a study that found that while right-leaning accounts are 635 00:34:16,880 --> 00:34:19,359 Speaker 1: banned more frequently, it is not because of anti-right-wing 636 00:34:19,400 --> 00:34:22,080 Speaker 1: bias, but rather, as the researchers put it, they 637 00:34:22,080 --> 00:34:25,920 Speaker 1: found that users' misinformation sharing was as predictive of 638 00:34:25,960 --> 00:34:29,600 Speaker 1: suspension as was their political orientation. Thus, the observation that 639 00:34:29,640 --> 00:34:32,320 Speaker 1: Republicans were more likely to be suspended than Democrats provides 640 00:34:32,360 --> 00:34:34,920 Speaker 1: no support for the claim that Twitter showed political bias 641 00:34:34,960 --> 00:34:38,200 Speaker 1: in its suspension practices.
Instead, the observed asymmetry could be 642 00:34:38,239 --> 00:34:41,680 Speaker 1: explained entirely by the tendency of Republicans to share more misinformation. 643 00:34:41,840 --> 00:34:44,200 Speaker 1: And so basically the idea is that when they were 644 00:34:44,200 --> 00:34:46,800 Speaker 1: looking at misinformation and who was banned more for 645 00:34:46,840 --> 00:34:50,839 Speaker 1: spreading it, Republicans or right-leaning accounts were banned more, 646 00:34:51,160 --> 00:34:53,479 Speaker 1: but because those were the accounts that were more likely 647 00:34:53,520 --> 00:34:55,759 Speaker 1: to be spreading misinformation in the first place. Um, and 648 00:34:56,040 --> 00:34:58,680 Speaker 1: this is actually backed up by Twitter's own in-house 649 00:34:58,719 --> 00:35:01,600 Speaker 1: research team in the pre-Musk days. After Trump was 650 00:35:01,640 --> 00:35:05,400 Speaker 1: banned on Twitter, um, the Twitter internal team was facing 651 00:35:05,400 --> 00:35:07,600 Speaker 1: a lot of criticism that they were censoring the right, 652 00:35:07,640 --> 00:35:09,560 Speaker 1: so they put together a team to look into it, 653 00:35:09,600 --> 00:35:12,600 Speaker 1: and the internal research team at Twitter found that folks 654 00:35:12,640 --> 00:35:16,400 Speaker 1: on the right are actually amplified on Twitter more often globally. 655 00:35:17,000 --> 00:35:20,400 Speaker 1: From the report: our results reveal a remarkably consistent trend; 656 00:35:20,600 --> 00:35:23,040 Speaker 1: in six out of the seven countries studied, the mainstream 657 00:35:23,040 --> 00:35:27,719 Speaker 1: political right enjoys higher algorithmic amplification than the mainstream political left. Um, 658 00:35:27,760 --> 00:35:29,640 Speaker 1: you can read this report on their website; it's twenty-seven 659 00:35:29,640 --> 00:35:33,560 Speaker 1: pages long. Um, and so this idea that people 660 00:35:33,600 --> 00:35:37,319 Speaker 1: on the right have been censored by Twitter, or 661 00:35:37,360 --> 00:35:39,920 Speaker 1: are, you know, getting a raw deal, it just is 662 00:35:39,960 --> 00:35:43,440 Speaker 1: not borne out by the facts or 663 00:35:43,480 --> 00:35:45,680 Speaker 1: the, you know, the research that has been done. And I 664 00:35:45,680 --> 00:35:48,680 Speaker 1: also think that it really speaks to this wider 665 00:35:48,760 --> 00:35:53,040 Speaker 1: misconception about social media more generally, that there is 666 00:35:53,200 --> 00:35:56,680 Speaker 1: no real body of academic study or academic thought with 667 00:35:56,719 --> 00:35:58,960 Speaker 1: regards to social media, and that everything that we know 668 00:35:59,000 --> 00:36:03,080 Speaker 1: about it is either anecdotal or, like, unknowable, so 669 00:36:03,160 --> 00:36:05,839 Speaker 1: people can just say whatever and be like, oh, yeah, 670 00:36:05,840 --> 00:36:09,680 Speaker 1: you know how people on the right are censored on 671 00:36:09,719 --> 00:36:12,880 Speaker 1: social media platforms, and not really have it challenged. And 672 00:36:12,920 --> 00:36:17,080 Speaker 1: so I think that it really speaks to this, I 673 00:36:17,080 --> 00:36:19,359 Speaker 1: don't know,
I think it's a holdover from a time 674 00:36:19,360 --> 00:36:23,160 Speaker 1: when people just saw social media as the Internet and 675 00:36:23,280 --> 00:36:25,480 Speaker 1: not the real world, and thus it was not really 676 00:36:25,480 --> 00:36:30,880 Speaker 1: worthy of serious examination. But there are entire schools of research 677 00:36:30,920 --> 00:36:34,560 Speaker 1: and bodies of evidence about social media platforms, and all 678 00:36:34,600 --> 00:36:36,840 Speaker 1: of them say the same thing: that people on the 679 00:36:36,960 --> 00:36:40,640 Speaker 1: right are not being censored or cracked down on or 680 00:36:41,080 --> 00:36:43,279 Speaker 1: anything like that, and that if anybody tells you that 681 00:36:43,280 --> 00:36:46,080 Speaker 1: they are, maybe that's how they feel, 682 00:36:46,080 --> 00:36:49,040 Speaker 1: maybe that's the anecdotal evidence, but they're not. And also, 683 00:36:49,120 --> 00:36:52,320 Speaker 1: even beyond Twitter, people say the same thing about Facebook, 684 00:36:52,480 --> 00:36:57,560 Speaker 1: when in reality, you know, Mark Zuckerberg was personally intervening 685 00:36:57,600 --> 00:36:59,920 Speaker 1: to keep people like Alex Jones on the platform and 686 00:37:00,320 --> 00:37:03,279 Speaker 1: personally meeting with right-wing leaders, and so it's not 687 00:37:03,360 --> 00:37:06,439 Speaker 1: going both ways. And so, yeah, I think that Musk 688 00:37:06,600 --> 00:37:09,920 Speaker 1: repeating this claim really shows how easy it is to, 689 00:37:09,960 --> 00:37:13,680 Speaker 1: like, stoke this victim complex, like, of course 690 00:37:13,760 --> 00:37:16,239 Speaker 1: I am the one being victimized here, what else could 691 00:37:16,239 --> 00:37:19,280 Speaker 1: be happening? I'm not excited about what's coming up, because 692 00:37:19,360 --> 00:37:23,200 Speaker 1: there's conversations about how Facebook is prepping for the next election, 693 00:37:23,480 --> 00:37:27,040 Speaker 1: the presidential election, and they are way too excited for the 694 00:37:27,040 --> 00:37:31,600 Speaker 1: return of Donald Trump, and I hate it. But you know, 695 00:37:31,760 --> 00:37:35,120 Speaker 1: I find it interesting too, because even with the COVID misinformation, 696 00:37:35,440 --> 00:37:39,600 Speaker 1: it took forever for them to even recognize the misinformation 697 00:37:39,719 --> 00:37:41,279 Speaker 1: and even talk about, okay, how are we going to 698 00:37:41,320 --> 00:37:44,120 Speaker 1: do that without banning everybody, because they didn't want to, 699 00:37:44,360 --> 00:37:46,600 Speaker 1: and so first they put the little warning things on 700 00:37:46,719 --> 00:37:48,680 Speaker 1: all of it, and you're like, it took a lot 701 00:37:48,920 --> 00:37:51,160 Speaker 1: for a person to be banned. It took a lot 702 00:37:51,280 --> 00:37:53,879 Speaker 1: to even have any of that warning be put 703 00:37:53,920 --> 00:37:55,560 Speaker 1: on there, because we went through a full year, I 704 00:37:55,640 --> 00:37:59,960 Speaker 1: feel like, of so much misinformation, so much bad advice about 705 00:38:00,000 --> 00:38:02,160 Speaker 1: how you can treat yourself to get rid of COVID. 706 00:38:02,480 --> 00:38:06,120 Speaker 1: It was a full-on, ongoing battle, and it still kind of is.
707 00:38:06,160 --> 00:38:09,040 Speaker 1: And it's kind of ironic that they were like, they 708 00:38:09,080 --> 00:38:12,719 Speaker 1: absolutely hate us, when there was so much of a fight 709 00:38:12,800 --> 00:38:15,279 Speaker 1: being put up by the people, especially scientists, being like, 710 00:38:15,320 --> 00:38:17,759 Speaker 1: can you please control this, because this is really bad 711 00:38:18,080 --> 00:38:20,880 Speaker 1: for the health of our nation. Totally. And what's so 712 00:38:20,960 --> 00:38:24,640 Speaker 1: interesting about that is that something that people say a lot, 713 00:38:24,719 --> 00:38:26,440 Speaker 1: like, well-meaning people will say a lot, is like, 714 00:38:26,480 --> 00:38:28,880 Speaker 1: in the early days of COVID, we didn't know 715 00:38:28,920 --> 00:38:30,600 Speaker 1: about masks. Like, for a while it was like, oh, 716 00:38:30,640 --> 00:38:32,040 Speaker 1: only medical professionals need them, or only this, 717 00:38:32,080 --> 00:38:34,480 Speaker 1: and then eventually it was like, oh, masks, 718 00:38:34,520 --> 00:38:36,200 Speaker 1: everyone wear masks. People will, I think kind of understandably, 719 00:38:36,200 --> 00:38:40,040 Speaker 1: conflate medical professionals getting more familiar with how the virus 720 00:38:40,560 --> 00:38:45,800 Speaker 1: worked, and all of that, with misinformation, which is confusing, right? Like, 721 00:38:45,880 --> 00:38:48,960 Speaker 1: it's like one day having someone be like, oh, 722 00:38:48,960 --> 00:38:51,239 Speaker 1: you don't need to wear a mask, and the next 723 00:38:51,239 --> 00:38:52,359 Speaker 1: day having them be like, oh, wear a mask. 724 00:38:52,360 --> 00:38:53,640 Speaker 1: Like, I can understand why people are confused and why 725 00:38:53,680 --> 00:38:56,759 Speaker 1: that looks like, you know, like, oh, so why is that, 726 00:38:56,840 --> 00:39:00,560 Speaker 1: why is that not misinformation? You know? And it's like, 727 00:39:00,600 --> 00:39:02,719 Speaker 1: there's a difference between medical professionals, you know, getting more 728 00:39:03,160 --> 00:39:07,359 Speaker 1: up to date on a novel virus, and people who 729 00:39:07,440 --> 00:39:11,160 Speaker 1: are spreading clearly, provably false information. And I think that, 730 00:39:11,200 --> 00:39:15,880 Speaker 1: unfortunately, with COVID, because it was, like, a new virus, 731 00:39:15,960 --> 00:39:19,000 Speaker 1: it created a climate that really made it easy to 732 00:39:19,640 --> 00:39:22,319 Speaker 1: conflate those things, and so I think it made it 733 00:39:22,360 --> 00:39:24,279 Speaker 1: easy to not challenge provable, demonstrable lies about COVID, and 734 00:39:24,320 --> 00:39:29,839 Speaker 1: lies about COVID that were intentional, so, like, disinformation spread 735 00:39:29,960 --> 00:39:33,680 Speaker 1: with malicious intent by bad actors. I think that the 736 00:39:33,719 --> 00:39:37,080 Speaker 1: climate of having a novel virus, where people were still 737 00:39:37,120 --> 00:39:39,560 Speaker 1: getting up to date on what was going on with it, 738 00:39:39,560 --> 00:39:41,400 Speaker 1: made it easy 739 00:39:41,440 --> 00:39:45,200 Speaker 1: to not crack down on it.
And 740 00:39:45,280 --> 00:39:47,520 Speaker 1: I think it's just, like, a reality of the way 741 00:39:47,560 --> 00:39:50,719 Speaker 1: that inaccurate information on social media works, that there's 742 00:39:50,760 --> 00:39:53,560 Speaker 1: always gonna be that grain of truth. It's like, oh, well, 743 00:39:53,760 --> 00:39:56,560 Speaker 1: why is this not misinformation? And I can understand why 744 00:39:56,640 --> 00:39:59,279 Speaker 1: people raise that, but I think it did 745 00:39:59,280 --> 00:40:02,600 Speaker 1: a real disservice to cracking down on things that 746 00:40:02,640 --> 00:40:07,480 Speaker 1: we know are just provably incorrect statements. Does that make sense? 747 00:40:07,480 --> 00:40:09,719 Speaker 1: I feel like I'm kind of rambling, but hopefully you 748 00:40:09,719 --> 00:40:12,880 Speaker 1: know what I mean. I mean, that virus, we didn't know, 749 00:40:13,000 --> 00:40:15,000 Speaker 1: no one knew, no one knew where it came from, 750 00:40:15,000 --> 00:40:17,120 Speaker 1: what to do, how to handle it. And 751 00:40:17,520 --> 00:40:20,719 Speaker 1: we definitely were told, don't wear masks, please, 752 00:40:20,760 --> 00:40:22,759 Speaker 1: don't buy these masks, we're running out; that's a whole 753 00:40:22,760 --> 00:40:26,600 Speaker 1: different conversation. But then it went to, everybody wear a damn mask, please, 754 00:40:26,760 --> 00:40:29,279 Speaker 1: for the love of all things, and then having to 755 00:40:29,320 --> 00:40:33,359 Speaker 1: be like, hey, they learned new things because they're researching 756 00:40:33,760 --> 00:40:37,959 Speaker 1: new things, and then also, like, political stuff. But, yeah, 757 00:40:38,040 --> 00:40:39,880 Speaker 1: that makes perfect sense, that things would change. And of 758 00:40:40,000 --> 00:40:42,680 Speaker 1: course the naysayers would be like, do you remember this? 759 00:40:42,880 --> 00:40:45,000 Speaker 1: They're lying, because they didn't say this the first time around. 760 00:40:45,000 --> 00:40:46,920 Speaker 1: It's like, this is what it means to learn and 761 00:40:46,960 --> 00:40:49,160 Speaker 1: to research and to grow. But there was a whole 762 00:40:49,160 --> 00:40:51,400 Speaker 1: lot of stuff. Yeah, and I think that's one of 763 00:40:51,400 --> 00:40:53,960 Speaker 1: the things that makes me really sad about what's 764 00:40:53,960 --> 00:40:56,360 Speaker 1: happening to Twitter, as we were talking about at the beginning, 765 00:40:56,440 --> 00:41:00,719 Speaker 1: because, um, there was a tweet that went viral a 766 00:41:00,760 --> 00:41:03,239 Speaker 1: couple of weeks ago from a reporter that was like, 767 00:41:03,280 --> 00:41:05,040 Speaker 1: you know, World War Three could be breaking out, and 768 00:41:05,080 --> 00:41:07,800 Speaker 1: here I am, not sure what is reality on Twitter anymore, 769 00:41:07,840 --> 00:41:09,920 Speaker 1: because it is such a big source for 770 00:41:10,000 --> 00:41:12,680 Speaker 1: news. And that whole thing with COVID, it did get 771 00:41:12,719 --> 00:41:17,080 Speaker 1: really messy, and it was an interesting 772 00:41:17,120 --> 00:41:20,239 Speaker 1: and kind of terrifying example, I guess, of, like, 773 00:41:20,320 --> 00:41:25,160 Speaker 1: how does this platform manage this information, 774 00:41:25,760 --> 00:41:30,520 Speaker 1: perhaps misinformation and/or disinformation,
um, when, you know, the 775 00:41:30,560 --> 00:41:36,120 Speaker 1: scientists are researching and it is changing. And I think, um, 776 00:41:36,160 --> 00:41:39,000 Speaker 1: because so much of it gets convoluted, 777 00:41:39,000 --> 00:41:42,239 Speaker 1: and there's so many, like, bad-faith conversations that can 778 00:41:42,320 --> 00:41:47,719 Speaker 1: happen on platforms like that, um, it's 779 00:41:47,760 --> 00:41:49,720 Speaker 1: just, like, an interesting case study 780 00:41:50,200 --> 00:41:53,279 Speaker 1: to see how Twitter did deal with that, and 781 00:41:53,360 --> 00:41:56,359 Speaker 1: how long it took. And I think that's another thing 782 00:41:56,400 --> 00:42:01,120 Speaker 1: that's really upsetting about this: as you alluded to, Samantha, 783 00:42:01,600 --> 00:42:03,400 Speaker 1: it felt like we were making progress, and it was 784 00:42:03,440 --> 00:42:07,480 Speaker 1: so hard fought, and now we're seeing a 785 00:42:07,680 --> 00:42:13,000 Speaker 1: reversal of all of these things. It felt like, oh, 786 00:42:13,080 --> 00:42:15,040 Speaker 1: this is a step forward, it's going to be a 787 00:42:15,120 --> 00:42:18,279 Speaker 1: platform where a lot of this misinformation won't be 788 00:42:18,320 --> 00:42:20,800 Speaker 1: as prevalent; it will still be there, but not as prevalent; 789 00:42:22,080 --> 00:42:26,000 Speaker 1: but that's getting reversed. It's upsetting. I mean, not 790 00:42:26,040 --> 00:42:28,839 Speaker 1: to get too personal, but, like, in my day job, 791 00:42:29,080 --> 00:42:32,720 Speaker 1: I do a lot of work trying to make platforms safer. 792 00:42:32,760 --> 00:42:34,680 Speaker 1: And, you know, I've met with the team at Twitter, 793 00:42:35,400 --> 00:42:37,799 Speaker 1: I don't know who's still there, many times 794 00:42:38,200 --> 00:42:41,879 Speaker 1: to advise them, to be like, oh, here, 795 00:42:41,880 --> 00:42:43,440 Speaker 1: here are steps that you can take to make the platform 796 00:42:43,640 --> 00:42:47,160 Speaker 1: more hospitable for marginalized people, women, people of color, whatever. 797 00:42:47,200 --> 00:42:49,960 Speaker 1: And so it does feel like a lot of the 798 00:42:50,000 --> 00:42:52,839 Speaker 1: work that I have, like, personally done 799 00:42:52,880 --> 00:42:55,279 Speaker 1: in the last few years is now being undone. 800 00:42:55,719 --> 00:42:58,239 Speaker 1: So it's a little bit like, oh, well, 801 00:42:58,480 --> 00:43:01,759 Speaker 1: glad I spent the three years on this. But Musk 802 00:43:01,840 --> 00:43:05,080 Speaker 1: reinstated a bunch of previously suspended or banned accounts that 803 00:43:05,120 --> 00:43:07,120 Speaker 1: had been kicked off the platform for things like spreading 804 00:43:07,120 --> 00:43:10,440 Speaker 1: COVID misinformation or harassment, and he signaled that he's planning 805 00:43:10,480 --> 00:43:12,799 Speaker 1: on doing more of that in the following week.
So 806 00:43:13,040 --> 00:43:14,640 Speaker 1: some of the folks who have been welcomed back to 807 00:43:14,680 --> 00:43:16,799 Speaker 1: the platform include the Babylon Bee, which we were talking 808 00:43:16,840 --> 00:43:21,200 Speaker 1: about earlier; Jordan Peterson, who was banned after repeatedly 809 00:43:21,239 --> 00:43:24,760 Speaker 1: deadnaming and misgendering the actor Elliot Page; 810 00:43:25,000 --> 00:43:29,880 Speaker 1: Kanye West, whose account was locked after numerous antisemitic comments; 811 00:43:29,960 --> 00:43:32,239 Speaker 1: Andrew Tate, who is, if you don't know who he is, 812 00:43:32,280 --> 00:43:34,680 Speaker 1: like a, I don't even know what 813 00:43:34,680 --> 00:43:37,239 Speaker 1: you would call him, like a men's rights kind of 814 00:43:37,280 --> 00:43:42,000 Speaker 1: misogynistic influencer slash coach. Um, Andrew Tate was banned 815 00:43:42,040 --> 00:43:45,320 Speaker 1: from pretty much all platforms for saying things like women 816 00:43:45,320 --> 00:43:47,720 Speaker 1: need to be held accountable if they are raped. Um, 817 00:43:47,840 --> 00:43:51,840 Speaker 1: Marjorie Taylor Greene's personal account, not her official account, was 818 00:43:51,920 --> 00:43:56,080 Speaker 1: banned for spreading COVID misinformation. And of course Donald Trump. 819 00:43:56,360 --> 00:43:59,520 Speaker 1: You may recall that Donald Trump was banned after using 820 00:43:59,520 --> 00:44:04,120 Speaker 1: Twitter to foment insurrection, which is a weird thing to say. 821 00:44:04,320 --> 00:44:07,600 Speaker 1: Since having his account reinstated, Trump has actually not tweeted, 822 00:44:07,640 --> 00:44:10,160 Speaker 1: and he has signaled that he is perhaps not going 823 00:44:10,200 --> 00:44:12,000 Speaker 1: to be returning to Twitter, even though he 824 00:44:12,040 --> 00:44:15,080 Speaker 1: has his account back, um, saying instead that he's going 825 00:44:15,120 --> 00:44:17,920 Speaker 1: to stay on his own social media platform that he owns, 826 00:44:17,960 --> 00:44:20,879 Speaker 1: called Truth Social. I've heard that there might be some 827 00:44:21,000 --> 00:44:25,200 Speaker 1: kind of, like, contractual obligation there, that Trump might be 828 00:44:25,239 --> 00:44:29,880 Speaker 1: contractually obligated to only post on Truth Social, but I 829 00:44:29,880 --> 00:44:31,879 Speaker 1: actually don't know the ins and outs, that's why I can't really 830 00:44:31,880 --> 00:44:34,760 Speaker 1: speak to it personally. But, um, so all these people 831 00:44:34,880 --> 00:44:39,680 Speaker 1: who had to leave the platform, for things 832 00:44:39,719 --> 00:44:43,840 Speaker 1: that I would say are fairly serious offenses. Listen, 833 00:44:44,000 --> 00:44:49,239 Speaker 1: nobody is permanently suspended from Twitter for a one-time 834 00:44:49,280 --> 00:44:52,400 Speaker 1: offense unless it is something egregious, right? And I 835 00:44:52,400 --> 00:44:55,719 Speaker 1: can speak from personal experience: Marjorie Taylor Greene 836 00:44:56,000 --> 00:44:59,120 Speaker 1: had numerous warnings, and this was, like, a consistent thing 837 00:44:59,160 --> 00:45:01,360 Speaker 1: with her, right? Kanye West is the same. It doesn't 838 00:45:01,360 --> 00:45:04,759 Speaker 1: just come out of the blue unless you're doing something 839 00:45:04,800 --> 00:45:07,000 Speaker 1: really egregious.
And so these are the people that Elon 840 00:45:07,080 --> 00:45:09,120 Speaker 1: Musk is going to be welcoming back. And he says 841 00:45:09,160 --> 00:45:12,440 Speaker 1: that he's declaring amnesty next week for banned accounts. 842 00:45:12,480 --> 00:45:16,319 Speaker 1: But at the same time, he is also cracking down 843 00:45:16,360 --> 00:45:19,280 Speaker 1: on accounts that are, you know, associated with, like, lefty 844 00:45:19,320 --> 00:45:22,600 Speaker 1: politics as well. So, for instance, Chad Loder, the founder 845 00:45:22,640 --> 00:45:26,480 Speaker 1: of the cybersecurity company called Habitu8, his account was 846 00:45:26,600 --> 00:45:29,200 Speaker 1: banned from the site after he used Twitter on November 847 00:45:29,280 --> 00:45:32,440 Speaker 1: twenty-third to warn users about an alleged data breach 848 00:45:32,520 --> 00:45:35,839 Speaker 1: at Twitter. Um, Loder is known for, like, researching and 849 00:45:35,880 --> 00:45:39,279 Speaker 1: reporting on right-wing extremism, including unmasking a Proud Boys 850 00:45:39,280 --> 00:45:42,480 Speaker 1: member who attacked a police officer during the insurrection, 851 00:45:42,480 --> 00:45:44,880 Speaker 1: and that report was actually cited by the Department of Justice. 852 00:45:44,920 --> 00:45:49,479 Speaker 1: So, like, a fairly, you know, known person, who writes 853 00:45:49,480 --> 00:45:52,320 Speaker 1: about things like cybersecurity and how to be safe on Twitter, 854 00:45:52,680 --> 00:45:56,040 Speaker 1: and right-wing extremism, was one of the first accounts 855 00:45:56,080 --> 00:46:00,560 Speaker 1: banned while Musk was talking about this amnesty and, you know, 856 00:46:01,600 --> 00:46:04,799 Speaker 1: having Twitter be this place for free speech. Um, interestingly enough, 857 00:46:04,840 --> 00:46:07,800 Speaker 1: when Sam Harris asked Elon Musk about bringing back Alex Jones, 858 00:46:08,080 --> 00:46:11,640 Speaker 1: who you might remember made up egregious lies about babies 859 00:46:11,680 --> 00:46:14,239 Speaker 1: who died at Sandy Hook and the parents who grieved them, 860 00:46:14,320 --> 00:46:16,239 Speaker 1: you know, said that they were paid actors and all 861 00:46:16,239 --> 00:46:20,600 Speaker 1: of that, Elon Musk tweeted: my firstborn child died in 862 00:46:20,680 --> 00:46:23,160 Speaker 1: my arms, I felt his last heartbeat; I have no 863 00:46:23,280 --> 00:46:25,920 Speaker 1: mercy for anyone who would use the deaths of children 864 00:46:26,080 --> 00:46:29,600 Speaker 1: for gain, politics, or fame. And I actually think that 865 00:46:29,600 --> 00:46:33,960 Speaker 1: that was one of the more transparent, honest statements from 866 00:46:34,040 --> 00:46:37,040 Speaker 1: Musk about how he sees his role in moderation, right? 867 00:46:37,080 --> 00:46:40,960 Speaker 1: Like, he can personally identify with the pain of losing 868 00:46:40,960 --> 00:46:44,080 Speaker 1: a child, because Elon Musk unfortunately has lost a child, 869 00:46:44,160 --> 00:46:46,279 Speaker 1: and so it allows him to have a sense of 870 00:46:46,320 --> 00:46:48,120 Speaker 1: how painful it would be to be a grieving parent, 871 00:46:48,600 --> 00:46:51,320 Speaker 1: grieving that loss and then also being harassed and called 872 00:46:51,360 --> 00:46:53,480 Speaker 1: a liar on top of it. And so he's gonna 873 00:46:53,520 --> 00:46:57,319 Speaker 1: make those moderation decisions based on that lived experience.
It's 874 00:46:57,360 --> 00:46:59,840 Speaker 1: interesting, because I think it totally puts to rest the idea that 875 00:47:00,120 --> 00:47:02,759 Speaker 1: Elon Musk is, like, a free speech absolutist, because it 876 00:47:02,800 --> 00:47:05,719 Speaker 1: reveals what's pretty obvious: that these decisions are just made 877 00:47:05,800 --> 00:47:08,040 Speaker 1: up by him based on, like, what he does or 878 00:47:08,080 --> 00:47:10,920 Speaker 1: doesn't like. Which, fine. But I also think that, like, 879 00:47:11,600 --> 00:47:16,560 Speaker 1: because he can personally identify with losing a child, he 880 00:47:16,680 --> 00:47:19,839 Speaker 1: is going to moderate from that lived personal experience. But 881 00:47:19,920 --> 00:47:23,200 Speaker 1: what about experiences that Elon Musk has not personally had, right? So, 882 00:47:23,239 --> 00:47:26,640 Speaker 1: like, the experience of somebody like Elliot Page, or somebody 883 00:47:26,680 --> 00:47:29,080 Speaker 1: like Dr. Rachel Levine, who was just a trans person 884 00:47:29,120 --> 00:47:31,319 Speaker 1: trying to live their life, or the experience of being, 885 00:47:31,320 --> 00:47:33,680 Speaker 1: like, a woman of color on a platform like Twitter, 886 00:47:33,840 --> 00:47:37,360 Speaker 1: who we know are more harassed than our counterparts, 887 00:47:37,440 --> 00:47:39,520 Speaker 1: right? Or what about being a person with disabilities who 888 00:47:39,560 --> 00:47:43,440 Speaker 1: relies on the platform to build community, right? Like, because 889 00:47:43,440 --> 00:47:47,560 Speaker 1: Elon Musk has no personal, lived experience with those marginalized identities, 890 00:47:47,800 --> 00:47:51,200 Speaker 1: it seems like he really can't see those perspectives as 891 00:47:51,360 --> 00:47:54,160 Speaker 1: real as his own perspective. And it really 892 00:47:54,200 --> 00:47:56,920 Speaker 1: reminds me so much of this, like, classic George Carlin 893 00:47:57,000 --> 00:47:59,520 Speaker 1: joke that I've gotten so much mileage out of in 894 00:47:59,560 --> 00:48:02,560 Speaker 1: my life: have you ever noticed that other people's stuff 895 00:48:02,640 --> 00:48:05,719 Speaker 1: is shit and your shit is stuff? Right? Like, he can only 896 00:48:05,760 --> 00:48:10,240 Speaker 1: see, like, his lived experience as real, and everybody else's, 897 00:48:10,320 --> 00:48:14,319 Speaker 1: it's just theoretical or trivial or doesn't really matter. And 898 00:48:14,880 --> 00:48:18,759 Speaker 1: I don't think that platforms like Twitter should be moderated 899 00:48:18,800 --> 00:48:23,360 Speaker 1: based on what is within the 900 00:48:23,400 --> 00:48:27,279 Speaker 1: scope of a white billionaire's personal lived experience, because there's so many 901 00:48:27,320 --> 00:48:29,400 Speaker 1: lived experiences out there that he has no idea of. 902 00:48:29,960 --> 00:48:33,239 Speaker 1: I mean, not everybody gets to experience having people on 903 00:48:33,320 --> 00:48:36,400 Speaker 1: your staff to make you feel comfortable, including, like, avoiding 904 00:48:36,440 --> 00:48:40,360 Speaker 1: the color orange, I guess. Um, that's a whole different 905 00:48:40,400 --> 00:48:44,520 Speaker 1: level of experience. And, you know, not to diminish the 906 00:48:44,560 --> 00:48:46,680 Speaker 1: fact that he did lose a child, and that was hurtful, 907 00:48:47,000 --> 00:48:49,800 Speaker 1: but his ex-wife came back and was like, dude, 908 00:48:49,960 --> 00:48:53,719 Speaker 1: you didn't hold that child.
I held that child through 909 00:48:53,760 --> 00:48:57,360 Speaker 1: this entire process, and now who's using the death of 910 00:48:57,360 --> 00:49:01,080 Speaker 1: a child for publicity? That's interesting. I mean, I'm glad 911 00:49:01,160 --> 00:49:04,799 Speaker 1: he's acknowledging that's a bad thing, so I guess half 912 00:49:04,840 --> 00:49:07,479 Speaker 1: a point for that. But it's quite interesting that even 913 00:49:07,600 --> 00:49:10,400 Speaker 1: that he couldn't be truly honest about, and he couldn't 914 00:49:10,400 --> 00:49:12,840 Speaker 1: actually see beyond his own, like, I'm gonna make myself 915 00:49:12,880 --> 00:49:16,720 Speaker 1: look really big by doing this, when in actuality, again, 916 00:49:17,040 --> 00:49:19,560 Speaker 1: his motive in saying this is hurtful, for one, to the child, 917 00:49:19,760 --> 00:49:23,160 Speaker 1: and then to lie about that specific thing against his ex-wife, 918 00:49:23,160 --> 00:49:26,560 Speaker 1: who apparently had a lot of issues with Elon Musk 919 00:49:26,600 --> 00:49:30,799 Speaker 1: in general, and to literally dismiss her experience. There's 920 00:49:30,960 --> 00:49:33,000 Speaker 1: so many things to this man that I'm just like, 921 00:49:33,160 --> 00:49:35,480 Speaker 1: what is wrong with you, other than you are a 922 00:49:35,600 --> 00:49:38,520 Speaker 1: narcissistic sociopath? And I don't know what else to say 923 00:49:38,680 --> 00:49:41,799 Speaker 1: except for, oh my god, you're ruining this. Totally. So 924 00:49:42,040 --> 00:49:44,680 Speaker 1: if anybody out there, like, if you want to read 925 00:49:44,800 --> 00:49:48,799 Speaker 1: a heartbreaking account of a divorce where you're like, that 926 00:49:48,960 --> 00:49:53,600 Speaker 1: man is awful, look into Elon Musk's divorce. 927 00:49:54,000 --> 00:49:57,800 Speaker 1: I read it, and it haunts me; the things 928 00:49:57,840 --> 00:50:03,000 Speaker 1: that were alleged to have happened in that haunt me. Um, 929 00:50:03,040 --> 00:50:05,719 Speaker 1: so yes, yes to all of that. And I think, 930 00:50:05,760 --> 00:50:10,279 Speaker 1: like, the kind of person who requires their staffers 931 00:50:10,600 --> 00:50:15,319 Speaker 1: to spend so much labor and energy to have the 932 00:50:15,400 --> 00:50:19,319 Speaker 1: workplace be to their liking in these, like, 933 00:50:19,360 --> 00:50:22,440 Speaker 1: really particular ways, is the kind of person who sees 934 00:50:22,520 --> 00:50:25,239 Speaker 1: themselves as the main character of life and doesn't even 935 00:50:25,239 --> 00:50:27,080 Speaker 1: really question that. Like, I think that's really what's going 936 00:50:27,120 --> 00:50:29,920 Speaker 1: on here. I think that people like Elon Musk, you know, 937 00:50:29,960 --> 00:50:32,560 Speaker 1: I don't know him, this is my opinion, I think 938 00:50:32,600 --> 00:50:37,080 Speaker 1: it's hard for them to see other people's perspectives as real. 939 00:50:37,520 --> 00:50:39,960 Speaker 1: And so the kind of person who requires staffers to 940 00:50:40,000 --> 00:50:42,759 Speaker 1: go around making computers look super cool and science-y, 941 00:50:43,200 --> 00:50:46,920 Speaker 1: so that he, you know, gets warm fuzzies, and 942 00:50:47,080 --> 00:50:49,800 Speaker 1: doesn't even see that that is what's happening, 943 00:50:49,920 --> 00:50:51,880 Speaker 1: isn't even able to see that labor.
So he just 944 00:50:51,880 --> 00:50:54,160 Speaker 1: thinks, like, wow, my company is so great, and he 945 00:50:54,200 --> 00:50:57,080 Speaker 1: doesn't see the, like, twenty frustrated staffers. Like, I've been 946 00:50:57,160 --> 00:50:59,080 Speaker 1: the staffer who has to go out of my 947 00:50:59,160 --> 00:51:01,480 Speaker 1: way and do a lot of emotional labor to accommodate 948 00:51:01,600 --> 00:51:03,640 Speaker 1: a person in power who will never even see that 949 00:51:03,719 --> 00:51:05,919 Speaker 1: labor, doesn't even know that labor has happened, and it's 950 00:51:05,960 --> 00:51:09,600 Speaker 1: really hard. And it speaks to this perspective of really 951 00:51:09,640 --> 00:51:13,520 Speaker 1: not being able to see other people; everybody else is 952 00:51:13,520 --> 00:51:17,279 Speaker 1: just a side character to your main character. I guess 953 00:51:17,320 --> 00:51:32,920 Speaker 1: that's how I'd put it. Yeah, you 954 00:51:32,960 --> 00:51:36,840 Speaker 1: always do such an excellent job of pointing this out, Bridget, 955 00:51:36,960 --> 00:51:40,600 Speaker 1: but people forget sometimes that, you know, technology is not 956 00:51:41,000 --> 00:51:45,400 Speaker 1: without bias, because somebody programmed it, somebody's moderating it. So 957 00:51:45,600 --> 00:51:48,400 Speaker 1: I think with Elon Musk, without this team of people, 958 00:51:49,200 --> 00:51:52,839 Speaker 1: with all of these changes, um, his kind of 959 00:51:52,880 --> 00:51:56,239 Speaker 1: declaration that, oh well, I'm going to make it 960 00:51:56,360 --> 00:52:03,280 Speaker 1: somehow less biased, like, that's just false. That's such 961 00:52:03,320 --> 00:52:07,200 Speaker 1: a lie. And, um, I know we've been talking about 962 00:52:07,239 --> 00:52:10,279 Speaker 1: this throughout, but for, you know, everybody who's listening 963 00:52:10,320 --> 00:52:12,160 Speaker 1: who's like, oh, well, you know, I don't really use Twitter, 964 00:52:12,440 --> 00:52:14,480 Speaker 1: I always kind of didn't like it or whatever: what 965 00:52:14,840 --> 00:52:17,560 Speaker 1: do we stand to lose? Why does this matter? Yeah, 966 00:52:17,600 --> 00:52:20,600 Speaker 1: I'm so glad that you asked, because you might be thinking, okay, 967 00:52:20,640 --> 00:52:23,080 Speaker 1: I get it, but why do I care? I don't 968 00:52:23,080 --> 00:52:24,759 Speaker 1: work at Twitter, I'll never work at Twitter, 969 00:52:24,800 --> 00:52:26,680 Speaker 1: I don't use it, why do I care? Well, first is 970 00:52:26,719 --> 00:52:29,720 Speaker 1: the most basic, you know, just Twitter as a place 971 00:52:29,760 --> 00:52:32,640 Speaker 1: to get quick information, um, like we were talking about earlier. 972 00:52:32,880 --> 00:52:35,399 Speaker 1: You know, it has functioned in this way 973 00:52:35,440 --> 00:52:37,800 Speaker 1: for a long time. You know, when Joe Biden needed 974 00:52:37,840 --> 00:52:41,200 Speaker 1: to announce the specifics of, like, student loan debt relief, 975 00:52:41,239 --> 00:52:44,280 Speaker 1: he didn't go to Reddit or Tumblr or Instagram. Twitter 976 00:52:44,440 --> 00:52:46,600 Speaker 1: is how you get information out quickly. And so, like, 977 00:52:46,880 --> 00:52:50,400 Speaker 1: when there's an emergency, Twitter is how you get information 978 00:52:50,400 --> 00:52:51,879 Speaker 1: out about it. Like, I have a friend who works 979 00:52:51,880 --> 00:52:54,959 Speaker 1: for the State Department.
It's Twitter that is being used 980 00:52:55,040 --> 00:52:59,000 Speaker 1: right now to get important information out to Americans who 981 00:52:59,040 --> 00:53:01,360 Speaker 1: are traveling to the World Cup, for instance, right? And 982 00:53:01,360 --> 00:53:02,920 Speaker 1: so if the platform is going to be full of 983 00:53:03,000 --> 00:53:05,799 Speaker 1: bots and hate speech and accounts impersonating other people 984 00:53:05,840 --> 00:53:08,759 Speaker 1: and bad actors, that whole thing breaks down. Like, think 985 00:53:08,760 --> 00:53:13,360 Speaker 1: about it: if there was an emergency in your neighborhood 986 00:53:13,440 --> 00:53:15,479 Speaker 1: right now and you wanted up-to-date, real-time 987 00:53:15,560 --> 00:53:18,000 Speaker 1: information about it, we don't really have a place to 988 00:53:18,040 --> 00:53:19,879 Speaker 1: get that other than Twitter. And, like, I think 989 00:53:19,880 --> 00:53:22,680 Speaker 1: it really goes to show that we need better public 990 00:53:22,719 --> 00:53:26,000 Speaker 1: interest communications platforms. But, like, when there have 991 00:53:26,040 --> 00:53:29,080 Speaker 1: been emergencies in my neighborhood, I'm looking at Twitter to 992 00:53:29,120 --> 00:53:32,360 Speaker 1: see who is talking about it and what is developing, you know, quickly. 993 00:53:32,400 --> 00:53:35,879 Speaker 1: And so that's just the most, like, basic reason why 994 00:53:35,920 --> 00:53:38,759 Speaker 1: folks should care. But it's also more than that. You know, 995 00:53:38,800 --> 00:53:41,680 Speaker 1: earlier I was talking about the massive influence Twitter 996 00:53:41,880 --> 00:53:44,040 Speaker 1: has and sort of how it's been used to drive progress. 997 00:53:44,239 --> 00:53:46,279 Speaker 1: I could give you so many examples of the way 998 00:53:46,320 --> 00:53:49,520 Speaker 1: that Twitter has been used to push conversations forward, to 999 00:53:49,560 --> 00:53:51,400 Speaker 1: get us someplace better. You know, if not for Twitter, 1000 00:53:51,680 --> 00:53:53,839 Speaker 1: we wouldn't have movements like MeToo, you know, which 1001 00:53:53,920 --> 00:53:57,000 Speaker 1: was not just something happening on Twitter; it shifted our 1002 00:53:57,160 --> 00:54:01,000 Speaker 1: entire culture and, you know, the progress of 1003 00:54:01,000 --> 00:54:03,240 Speaker 1: our culture, I would say. For instance, you'll have probably 1004 00:54:03,280 --> 00:54:05,799 Speaker 1: heard of Shanquella Robinson, who was a twenty-five 1005 00:54:05,840 --> 00:54:08,719 Speaker 1: year old Black woman from North Carolina who traveled to 1006 00:54:08,760 --> 00:54:11,799 Speaker 1: Mexico on vacation with a group of people and ended 1007 00:54:11,880 --> 00:54:14,319 Speaker 1: up dead. The people that she traveled with tried 1008 00:54:14,360 --> 00:54:17,120 Speaker 1: to say that it was alcohol poisoning, but later a 1009 00:54:17,200 --> 00:54:20,160 Speaker 1: video emerged of one of them getting into a pretty 1010 00:54:20,200 --> 00:54:24,680 Speaker 1: brutal physical altercation with her. Now there's actual movement in 1011 00:54:24,719 --> 00:54:27,319 Speaker 1: the case, and her mother says that it's because of 1012 00:54:27,520 --> 00:54:30,320 Speaker 1: Black Twitter.
She says that if not for Black folks 1013 00:54:30,320 --> 00:54:33,160 Speaker 1: tweeting about it, raising the alarm about it, generating awareness 1014 00:54:33,200 --> 00:54:34,440 Speaker 1: about it, nothing would have happened; she said that she was having 1015 00:54:34,480 --> 00:54:37,480 Speaker 1: a really hard time getting any kind of national attention 1016 00:54:37,640 --> 00:54:39,600 Speaker 1: on the story of her daughter's death, and so she 1017 00:54:39,719 --> 00:54:41,920 Speaker 1: actually says it was Twitter that did that. And so 1018 00:54:42,280 --> 00:54:46,920 Speaker 1: in that case, this rapper, Amina Caine, shared Shanquella's 1019 00:54:46,920 --> 00:54:49,560 Speaker 1: photo on Twitter on November ninth. Her message went viral, 1020 00:54:49,719 --> 00:54:53,239 Speaker 1: got almost twenty thousand retweets, and then later spread to other 1021 00:54:53,280 --> 00:54:55,799 Speaker 1: social media platforms and got so much more attention to 1022 00:54:55,840 --> 00:54:58,400 Speaker 1: her death. And, um, Sherri Williams, who is a professor 1023 00:54:58,440 --> 00:55:01,040 Speaker 1: of race, media, and communication at American University, she 1024 00:55:01,120 --> 00:55:03,479 Speaker 1: put it really well in this interview with NBC News. 1025 00:55:03,480 --> 00:55:06,759 Speaker 1: She says: Black folks know that mainstream news media has 1026 00:55:06,760 --> 00:55:09,560 Speaker 1: a history of completely ignoring our stories, so we've been 1027 00:55:09,640 --> 00:55:12,680 Speaker 1: using these tools to amplify our stories ourselves, and it works. 1028 00:55:12,920 --> 00:55:16,360 Speaker 1: We see the cycle of mainstream news media basically following 1029 00:55:16,400 --> 00:55:19,279 Speaker 1: the chatter on Black social media like Twitter. And so 1030 00:55:19,400 --> 00:55:22,759 Speaker 1: it really goes to show the real-world impact that 1031 00:55:22,840 --> 00:55:26,360 Speaker 1: platforms like Twitter can have. Twitter was not perfect, it 1032 00:55:26,560 --> 00:55:28,520 Speaker 1: is not perfect, I have had my issues with it, 1033 00:55:28,640 --> 00:55:32,920 Speaker 1: but we lose so much if we don't have platforms 1034 00:55:32,960 --> 00:55:36,440 Speaker 1: like Twitter. Marginalized communities lose so much, and our ability 1035 00:55:36,480 --> 00:55:39,759 Speaker 1: to push that culture forward and have those conversations that 1036 00:55:39,840 --> 00:55:43,040 Speaker 1: get us someplace better. There is so much at 1037 00:55:43,040 --> 00:55:46,360 Speaker 1: stake if we do not have social media and digital 1038 00:55:46,400 --> 00:55:51,080 Speaker 1: communications platforms where those kinds of conversations can happen. Yes, yes, 1039 00:55:51,320 --> 00:55:55,080 Speaker 1: yes, um, I totally agree. And I think that's 1040 00:55:55,200 --> 00:55:57,360 Speaker 1: one thing that kind of gives me anxiety when I 1041 00:55:57,400 --> 00:55:59,000 Speaker 1: hear people like, well, I didn't like Twitter, 1042 00:55:59,040 --> 00:56:01,359 Speaker 1: because that can be a very privileged thing 1043 00:56:01,360 --> 00:56:03,080 Speaker 1: to just be like, oh, it goes away, because there 1044 00:56:03,080 --> 00:56:05,160 Speaker 1: are people in other countries that use it to organize 1045 00:56:05,200 --> 00:56:07,760 Speaker 1: and to communicate.
Like, it's a very powerful tool, 1046 00:56:08,200 --> 00:56:11,279 Speaker 1: especially if you're kind of isolated and maybe you don't 1047 00:56:11,320 --> 00:56:14,600 Speaker 1: have a lot of people in your community or, like, 1048 00:56:14,640 --> 00:56:16,919 Speaker 1: where you grew up to talk to, to share 1049 00:56:16,960 --> 00:56:22,799 Speaker 1: ideas with. Like, it's important, it's really, really important. Um, 1050 00:56:22,920 --> 00:56:25,319 Speaker 1: so I'm glad, I'm glad you brought 1051 00:56:25,360 --> 00:56:28,360 Speaker 1: this today, I'm glad that you brought these points, because 1052 00:56:28,400 --> 00:56:31,440 Speaker 1: I just feel like there is so much at stake, 1053 00:56:31,600 --> 00:56:34,279 Speaker 1: and a lot of it's getting kind of lost in 1054 00:56:34,360 --> 00:56:39,439 Speaker 1: the chaos; like, what's happening with Twitter and why 1055 00:56:39,480 --> 00:56:42,120 Speaker 1: it matters is getting lost in all of this chaos. 1056 00:56:42,120 --> 00:56:45,840 Speaker 1: So thank you, as always, Bridget. Oh, of course. And 1057 00:56:45,880 --> 00:56:51,040 Speaker 1: folks listening, like, if you have questions about Twitter alternatives, 1058 00:56:51,080 --> 00:56:55,359 Speaker 1: like, let's chat. Like, where can we do so? That's 1059 00:56:55,400 --> 00:57:01,240 Speaker 1: a great question. I guess the 1060 00:57:01,239 --> 00:57:05,719 Speaker 1: platforms that I see people going to are Hive and Mastodon. 1061 00:57:06,480 --> 00:57:08,759 Speaker 1: I'm still trying to figure them all out; they all 1062 00:57:08,800 --> 00:57:11,400 Speaker 1: have their issues. But yeah, people have been talking about 1063 00:57:11,480 --> 00:57:16,160 Speaker 1: moving to Hive, Mastodon, Post. All of these different platforms 1064 00:57:16,160 --> 00:57:18,880 Speaker 1: all have their issues; like, there's been big conversations about 1065 00:57:19,160 --> 00:57:23,200 Speaker 1: the funding structure and the moderation structure. So I will 1066 00:57:23,240 --> 00:57:26,440 Speaker 1: be trying them all out myself, and I'm happy to 1067 00:57:26,480 --> 00:57:28,440 Speaker 1: report back and let y'all know, like, what worked, 1068 00:57:28,440 --> 00:57:31,400 Speaker 1: what didn't work, what's a flop, what's a bop, ya know? 1069 00:57:31,560 --> 00:57:35,520 Speaker 1: TBD. That's amazing. Yes, please, please let us know, because, 1070 00:57:35,520 --> 00:57:39,400 Speaker 1: like, Mastodon seems super confusing and very specific, and Hive may 1071 00:57:39,400 --> 00:57:43,480 Speaker 1: not be ready for the mass movement, is what I'm reading. Um, 1072 00:57:43,520 --> 00:57:46,040 Speaker 1: but then also, you know, Discord is a thing, 1073 00:57:46,080 --> 00:57:49,040 Speaker 1: and I've been told that we should have one. So, Bridget, 1074 00:57:49,080 --> 00:57:51,840 Speaker 1: Annie, let's have a Discord for us, 1075 00:57:52,400 --> 00:57:55,560 Speaker 1: SMNTY with Bridget, and then have, like, questionnaires with the people 1076 00:57:55,600 --> 00:57:59,400 Speaker 1: and then decide where to go, maybe. I love this. Yes. 1077 00:57:59,600 --> 00:58:02,400 Speaker 1: So even before Twitter was bought by Musk, I've 1078 00:58:02,400 --> 00:58:06,200 Speaker 1: always loved Discord; it's super fun. Um, yeah, let's 1079 00:58:06,200 --> 00:58:08,720 Speaker 1: have a Discord channel. We can, like, hash all this out.
1080 00:58:09,320 --> 00:58:17,360 Speaker 1: Gosh, yes, let's make it a thing. 1081 00:58:17,440 --> 00:58:27,360 Speaker 1: Get online. We are the coolest, obviously. 1082 00:58:27,760 --> 00:58:30,440 Speaker 1: This reminds me of the Britney Spears interview where she's like, 1083 00:58:30,680 --> 00:58:39,960 Speaker 1: everyone's talking about the emails. Yeah, like Britney Spears, that's me. Yes, yes. 1084 00:58:40,400 --> 00:58:44,840 Speaker 1: Um, well, Bridget, the actual coolest, where can the listeners 1085 00:58:44,880 --> 00:58:48,200 Speaker 1: find you? You can find me on Instagram at Bridget 1086 00:58:48,280 --> 00:58:50,240 Speaker 1: Marie in DC. You can find me, I still 1087 00:58:50,280 --> 00:58:52,479 Speaker 1: have a Twitter account, but tweeting a lot less these days, 1088 00:58:52,520 --> 00:58:54,960 Speaker 1: at Bridget Marie. You can listen to me on my 1089 00:58:55,040 --> 00:58:57,280 Speaker 1: podcast, There Are No Girls on the Internet, and my 1090 00:58:57,360 --> 00:59:00,880 Speaker 1: new limited series with Cool Zone Media called Internet Hate Machine, 1091 00:59:00,960 --> 00:59:03,720 Speaker 1: where we dive into all the ways that the internet 1092 00:59:03,920 --> 00:59:06,120 Speaker 1: can be a not-so-fun place for women and 1093 00:59:06,160 --> 00:59:09,920 Speaker 1: people of color. Yes, super important to this conversation, and 1094 00:59:10,000 --> 00:59:14,480 Speaker 1: definitely, listeners, go check it out if you haven't already. Um, 1095 00:59:14,680 --> 00:59:17,040 Speaker 1: thank you again, Bridget, for being here. Always a delight. 1096 00:59:17,880 --> 00:59:20,800 Speaker 1: And thank you, listeners, for listening. If you would like 1097 00:59:20,840 --> 00:59:22,960 Speaker 1: to contact us, you can; our email is momstuff 1098 00:59:23,000 --> 00:59:24,240 Speaker 1: at iHeartMedia dot com. You can 1099 00:59:24,240 --> 00:59:27,120 Speaker 1: find us on Twitter at mom stuff podcast, or on 1100 00:59:27,160 --> 00:59:29,280 Speaker 1: Instagram at Stuff Mom Never Told You. Thanks as always 1101 00:59:29,320 --> 00:59:31,920 Speaker 1: to our super producer Christina. Thank you, and thanks to 1102 00:59:31,960 --> 00:59:33,960 Speaker 1: you for listening. Stuff Mom Never Told You is a production 1103 00:59:34,000 --> 00:59:35,680 Speaker 1: of iHeartRadio. For more podcasts from iHeartRadio, you 1104 00:59:35,720 --> 00:59:37,840 Speaker 1: can check out the iHeartRadio app, Apple Podcasts, or wherever 1105 00:59:37,880 --> 00:59:39,040 Speaker 1: you listen to your favorite shows.