1 00:00:05,200 --> 00:00:06,720 Speaker 1: Hey, this is Annie and Samantha. 2 00:00:06,880 --> 00:00:07,480 Speaker 2: Welcome to Stuff Mom 3 00:00:07,560 --> 00:00:19,279 Speaker 3: Never Told You, a production of iHeartRadio, and once 4 00:00:19,320 --> 00:00:23,680 Speaker 3: again we are so happy to be joined by the curious, 5 00:00:23,760 --> 00:00:25,240 Speaker 3: the charming Bridget Todd. 6 00:00:26,079 --> 00:00:26,880 Speaker 2: Welcome, Bridget. 7 00:00:27,280 --> 00:00:31,120 Speaker 4: Thanks for having me. I appreciate the C-adjective theme. 8 00:00:31,720 --> 00:00:32,559 Speaker 4: Thanks for having me. 9 00:00:33,840 --> 00:00:36,120 Speaker 2: Yes, thank you, thank you so much for being here. 10 00:00:36,840 --> 00:00:40,320 Speaker 3: We were talking a little bit before we started recording, 11 00:00:40,440 --> 00:00:43,800 Speaker 3: but I guess here comes a loaded question. How are 12 00:00:43,840 --> 00:00:44,839 Speaker 3: you doing, Bridget? 13 00:00:45,240 --> 00:00:49,040 Speaker 4: Oh, I'll just say I've been better. For folks who 14 00:00:49,080 --> 00:00:52,000 Speaker 4: don't know, I live in Washington, D.C., which you 15 00:00:52,080 --> 00:00:56,840 Speaker 4: might know is a crime-infested hellhole. No, just kidding. 16 00:00:56,840 --> 00:00:59,960 Speaker 4: It's actually a lovely place to live. But President Trump 17 00:01:00,200 --> 00:01:06,360 Speaker 4: has taken over our city's police force and has unleashed 18 00:01:06,440 --> 00:01:09,480 Speaker 4: the National Guard, along with federal agents, into the city. 19 00:01:09,720 --> 00:01:12,480 Speaker 4: When this first happened, for whatever reason, it was centralized 20 00:01:12,920 --> 00:01:17,760 Speaker 4: right on my street, which wasn't my favorite. It still 21 00:01:17,840 --> 00:01:23,440 Speaker 4: persists. Every day is a new surprise, I guess, that 22 00:01:23,520 --> 00:01:26,120 Speaker 4: he has in store for us.
If people are listening, 23 00:01:26,560 --> 00:01:29,320 Speaker 4: or following what's going on in DC, one of the 24 00:01:29,360 --> 00:01:31,200 Speaker 4: biggest, I mean, I'll get on my soapbox right now. 25 00:01:31,240 --> 00:01:33,640 Speaker 4: The reason why it's happening in DC is because DC 26 00:01:33,800 --> 00:01:36,760 Speaker 4: is not a state. DC has what's called home rule, 27 00:01:36,880 --> 00:01:40,800 Speaker 4: so we have some ability to make decisions for 28 00:01:40,840 --> 00:01:43,240 Speaker 4: ourselves, and our local leaders are 29 00:01:43,240 --> 00:01:45,920 Speaker 4: able to make some decisions for us. But everything that 30 00:01:46,000 --> 00:01:49,080 Speaker 4: happens in DC is at the pleasure of the President 31 00:01:49,120 --> 00:01:51,560 Speaker 4: and Congress, because DC is not a state. And so 32 00:01:52,000 --> 00:01:55,240 Speaker 4: the reason why Trump can send not just the National 33 00:01:55,240 --> 00:01:59,360 Speaker 4: Guard to DC but also take over our police force 34 00:01:59,600 --> 00:02:02,520 Speaker 4: is because DC is not a state. And so advocate 35 00:02:02,560 --> 00:02:05,040 Speaker 4: for statehood for DC. The only reason why this is 36 00:02:05,040 --> 00:02:07,920 Speaker 4: happening in DC and no other place, in terms of 37 00:02:08,800 --> 00:02:10,920 Speaker 4: our police force being taken over by Trump, is because 38 00:02:10,960 --> 00:02:13,239 Speaker 4: DC is not a state. So advocate for DC statehood, 39 00:02:13,520 --> 00:02:16,400 Speaker 4: DC statehood now. I don't have an elected official that 40 00:02:16,440 --> 00:02:19,800 Speaker 4: I can call about this. Call for me, and advocate 41 00:02:19,840 --> 00:02:21,880 Speaker 4: for the full self-determination of the people of the 42 00:02:21,919 --> 00:02:22,800 Speaker 4: District of Columbia. 43 00:02:22,960 --> 00:02:23,880 Speaker 5: That's my soapbox.
44 00:02:25,960 --> 00:02:29,080 Speaker 2: It was a good one. Thank you, yes, yes, thank you, 45 00:02:30,160 --> 00:02:34,799 Speaker 2: and we're both so glad you're okay, and good 46 00:02:34,800 --> 00:02:35,919 Speaker 2: to have you, as always. 47 00:02:36,240 --> 00:02:40,040 Speaker 3: Even if the topic you're discussing is a pretty... 48 00:02:40,080 --> 00:02:43,639 Speaker 3: it's one that makes me very angry, and actually this 49 00:02:43,720 --> 00:02:44,639 Speaker 3: is a sad state 50 00:02:44,440 --> 00:02:45,680 Speaker 2: of affairs that we live in. 51 00:02:46,240 --> 00:02:49,440 Speaker 3: But as I was discussing before we started recording, I'm 52 00:02:49,440 --> 00:02:53,320 Speaker 3: going to this big event, Dragon Con, this weekend, and 53 00:02:53,480 --> 00:02:55,840 Speaker 3: every year I go, I think I might die in 54 00:02:55,880 --> 00:02:59,960 Speaker 3: a mass shooting. Oh, because that's just the world we 55 00:03:00,120 --> 00:03:03,800 Speaker 3: live in. And part of what you're talking about here 56 00:03:05,000 --> 00:03:10,760 Speaker 3: is, it's awful enough that that happens. But it's also 57 00:03:10,960 --> 00:03:15,920 Speaker 3: awful enough that so many people in power are so 58 00:03:16,120 --> 00:03:19,920 Speaker 3: quick to blame somebody else, anybody else, other than what 59 00:03:20,520 --> 00:03:23,560 Speaker 3: is actually happening and what we could actually do to 60 00:03:23,600 --> 00:03:24,679 Speaker 3: prevent things like this. 61 00:03:25,240 --> 00:03:27,320 Speaker 5: Oh my god, Annie, you and me both. 62 00:03:27,400 --> 00:03:29,720 Speaker 4: If I am at any kind of a big event, 63 00:03:29,919 --> 00:03:31,920 Speaker 4: people who know me in real life know I can't 64 00:03:31,919 --> 00:03:34,840 Speaker 4: get comfortable until I'm like, okay, exit there, exit there, 65 00:03:34,960 --> 00:03:37,480 Speaker 4: exit there. Like, I need to know where the exits are.
66 00:03:37,520 --> 00:03:39,680 Speaker 5: It is. I'm not comfortable until I do that. 67 00:03:39,800 --> 00:03:43,240 Speaker 4: And as you said, I mean, in this country, I mean, 68 00:03:43,240 --> 00:03:45,000 Speaker 4: I hate to say it, it feels like we have 69 00:03:45,080 --> 00:03:48,600 Speaker 4: just decided we're not going to do anything about the 70 00:03:48,640 --> 00:03:53,960 Speaker 4: real cause of mass shootings, which is the guns, right? 71 00:03:54,000 --> 00:03:56,800 Speaker 4: So then we have to start scapegoating things and trying 72 00:03:56,840 --> 00:03:59,840 Speaker 4: to blame these non-issues for why 73 00:04:00,200 --> 00:04:04,800 Speaker 4: we have shootings, and unfortunately, in today's internet landscape, that 74 00:04:04,880 --> 00:04:08,200 Speaker 4: thing is the existence of trans people. I know what 75 00:04:08,240 --> 00:04:11,880 Speaker 4: you're thinking: I don't know how trans people are linked 76 00:04:11,920 --> 00:04:14,600 Speaker 4: to mass shootings. Well, you would be right, because they're 77 00:04:14,640 --> 00:04:16,920 Speaker 4: not linked to mass shootings. But that does not keep 78 00:04:17,000 --> 00:04:20,719 Speaker 4: trans people from being baselessly blamed for any kind of 79 00:04:21,040 --> 00:04:25,800 Speaker 4: mass shooting, mass tragedy, or mass instance of violence or 80 00:04:25,920 --> 00:04:30,560 Speaker 4: bad thing happening, by very loud voices online. 81 00:04:30,720 --> 00:04:35,719 Speaker 2: Yes, and this has happened several times, correct? 82 00:04:36,400 --> 00:04:39,520 Speaker 4: Yes, it is a constant thing, and it's a kind 83 00:04:39,520 --> 00:04:42,479 Speaker 4: of identity-based disinformation that I see over and over 84 00:04:42,520 --> 00:04:45,240 Speaker 4: and over again.
I would be willing to bet that 85 00:04:45,320 --> 00:04:48,840 Speaker 4: folks listening have probably actually seen this play out at 86 00:04:48,839 --> 00:04:50,839 Speaker 4: one time or another. If you spend any time online, 87 00:04:51,279 --> 00:04:54,400 Speaker 4: there is this image of someone with blonde, shaggy hair 88 00:04:54,560 --> 00:04:58,599 Speaker 4: holding a rifle that often circulates online and claims to 89 00:04:58,680 --> 00:05:01,560 Speaker 4: show the perpetrator, who commenters will say, oh, this is 90 00:05:01,600 --> 00:05:05,200 Speaker 4: a trans person who was mentally disturbed and they're the perpetrator. 91 00:05:05,480 --> 00:05:09,200 Speaker 4: But it's actually a very well-worn image of Sam Hyde, 92 00:05:09,279 --> 00:05:13,000 Speaker 4: who is not trans and is actually an extremist comedian 93 00:05:13,080 --> 00:05:16,559 Speaker 4: and YouTuber. In fact, in twenty twenty three, fact-checker 94 00:05:16,560 --> 00:05:19,359 Speaker 4: and journalist Bill McCarthy described how, when he sees a 95 00:05:19,360 --> 00:05:22,240 Speaker 4: mass shooting, his heart doesn't just sink for the victims 96 00:05:22,279 --> 00:05:25,400 Speaker 4: and their families and communities. His heart also sinks because 97 00:05:25,440 --> 00:05:27,640 Speaker 4: he knows, just from doing this for a long time, 98 00:05:27,880 --> 00:05:30,920 Speaker 4: that it means that other people who are not involved 99 00:05:31,240 --> 00:05:34,640 Speaker 4: will be baselessly and falsely accused of being the killer, 100 00:05:34,960 --> 00:05:38,200 Speaker 4: their pictures shared all over the web by social media users.
101 00:05:38,600 --> 00:05:41,200 Speaker 4: In a piece called Mass Shootings, the Other Innocent Victims, 102 00:05:41,600 --> 00:05:44,279 Speaker 4: he sheds light on this, where he talks about a 103 00:05:44,279 --> 00:05:48,080 Speaker 4: mass shooting happening in Nashville where Hyde was misidentified as 104 00:05:48,120 --> 00:05:52,200 Speaker 4: both trans and the perpetrator. He writes: shortly after police 105 00:05:52,200 --> 00:05:54,160 Speaker 4: said that they had responded to a shooting, I opened 106 00:05:54,160 --> 00:05:58,000 Speaker 4: Twitter and ran a simple keyword search: Nashville shooter identified. 107 00:05:58,320 --> 00:06:01,400 Speaker 4: According to dozens of results, city officials had already 108 00:06:01,440 --> 00:06:03,920 Speaker 4: singled out the perpetrator. It was a transgender woman named 109 00:06:03,960 --> 00:06:08,600 Speaker 4: Samantha Hyde, the posts claimed. So again, like, if 110 00:06:08,640 --> 00:06:12,240 Speaker 4: you spend any time at all online, when there's some 111 00:06:12,240 --> 00:06:15,800 Speaker 4: sort of mass instance of violence, this image of Hyde 112 00:06:15,839 --> 00:06:18,280 Speaker 4: is probably, you can probably picture it like I can 113 00:06:18,320 --> 00:06:18,680 Speaker 4: picture it. 114 00:06:18,720 --> 00:06:20,040 Speaker 5: I see it every single time. 115 00:06:20,400 --> 00:06:24,520 Speaker 4: And you know, this person is very familiar to fact 116 00:06:24,600 --> 00:06:27,320 Speaker 4: checkers at this point because of all the different variations 117 00:06:27,360 --> 00:06:29,960 Speaker 4: of an image of this person showing up. But it's 118 00:06:29,960 --> 00:06:33,640 Speaker 4: been this long-running hoax that initially originated on 119 00:06:33,720 --> 00:06:35,320 Speaker 4: 4chan as kind of like a meme. 120 00:06:35,760 --> 00:06:38,800 Speaker 2: Oh, I do not like that.
121 00:06:39,720 --> 00:06:41,799 Speaker 3: And the thing is, it has such a trickle-down 122 00:06:42,760 --> 00:06:47,960 Speaker 3: impact, because I'm not hugely online, but I am. 123 00:06:48,279 --> 00:06:49,400 Speaker 2: I'm aware of all of 124 00:06:49,320 --> 00:06:54,320 Speaker 3: this through, like, news in quotes, I guess, news sources, 125 00:06:55,160 --> 00:07:00,800 Speaker 3: and it has horrible impacts on the people who were 126 00:07:01,720 --> 00:07:04,080 Speaker 3: not involved at all. 127 00:07:04,480 --> 00:07:08,800 Speaker 4: Yeah. So it's really not just Nashville. In the 128 00:07:09,080 --> 00:07:12,920 Speaker 4: shooting in Texas, a photo of an actual trans woman 129 00:07:13,360 --> 00:07:15,200 Speaker 4: named Sam, a different Sam than the one I was 130 00:07:15,240 --> 00:07:17,880 Speaker 4: just talking about, circulated on social media claiming that she 131 00:07:18,040 --> 00:07:20,720 Speaker 4: was the perpetrator. She actually had to post a picture 132 00:07:20,920 --> 00:07:23,240 Speaker 4: showing that she was alive and well, because, as we 133 00:07:23,320 --> 00:07:25,560 Speaker 4: know, that shooter was killed by police, right? And 134 00:07:25,600 --> 00:07:29,520 Speaker 4: so, I mean, just imagine not being involved in a 135 00:07:29,560 --> 00:07:32,120 Speaker 4: shooting, and that trans woman Sam was like, I 136 00:07:32,120 --> 00:07:34,320 Speaker 4: don't even live in Nashville, right? And 137 00:07:34,360 --> 00:07:37,600 Speaker 4: so just imagine having to prove to people online that 138 00:07:37,640 --> 00:07:39,560 Speaker 4: you were not involved in something that you had nothing 139 00:07:39,600 --> 00:07:41,560 Speaker 4: to do with. None of these people were actually the 140 00:07:41,600 --> 00:07:44,640 Speaker 4: perpetrators or even connected to the incident at all.
141 00:07:44,960 --> 00:07:47,520 Speaker 4: But that does not stop these claims from going viral 142 00:07:47,840 --> 00:07:52,360 Speaker 4: and harming innocent, actual trans people, and then furthermore kind 143 00:07:52,360 --> 00:07:56,600 Speaker 4: of creating this false impression that trans people are violent 144 00:07:56,800 --> 00:07:59,760 Speaker 4: or dangerous, based on nothing. Like, the facts 145 00:08:00,080 --> 00:08:03,000 Speaker 4: really do not back that up. And so the way 146 00:08:03,040 --> 00:08:07,520 Speaker 4: that people are able to scapegoat and fear-monger around 147 00:08:07,520 --> 00:08:10,480 Speaker 4: just the existence of trans people and then somehow link 148 00:08:10,520 --> 00:08:13,680 Speaker 4: it to violence is not 149 00:08:13,760 --> 00:08:17,160 Speaker 4: just harmful for the trans people who are targeted, it's 150 00:08:17,200 --> 00:08:21,400 Speaker 4: harmful for everybody. Like, it's just a very harmful, dangerous climate. 151 00:08:22,200 --> 00:08:23,480 Speaker 2: Yes, and. 152 00:08:24,960 --> 00:08:27,480 Speaker 3: You know, when we have these narratives around trans people 153 00:08:27,480 --> 00:08:30,120 Speaker 3: in our politics that are painting them in such 154 00:08:30,120 --> 00:08:34,560 Speaker 3: a way as if they're all pedophiles or something, like, 155 00:08:34,600 --> 00:08:38,280 Speaker 3: it's just ridiculous. These attacks are horrible, they're not backed 156 00:08:38,320 --> 00:08:42,960 Speaker 3: up by facts, and we know the trans community 157 00:08:42,960 --> 00:08:47,400 Speaker 3: faces so much violence and this only makes it worse. 158 00:08:49,440 --> 00:08:54,000 Speaker 3: And you have an incident that happened kind of near 159 00:08:54,040 --> 00:08:57,120 Speaker 3: you where we saw this play out, right? 160 00:08:57,679 --> 00:09:01,960 Speaker 4: Yes, yeah, so this brings me to
This horrible 161 00:09:02,000 --> 00:09:04,880 Speaker 4: plane crash that happened earlier this year. There was a 162 00:09:04,920 --> 00:09:08,640 Speaker 4: great piece about this incident in Wired, called A Trans Pilot 163 00:09:08,760 --> 00:09:11,079 Speaker 4: Was Falsely Blamed for a Plane Crash. Now She's Fighting 164 00:09:11,120 --> 00:09:14,160 Speaker 4: the Right-Wing Disinformation Machine, that was published earlier this 165 00:09:14,200 --> 00:09:16,320 Speaker 4: summer and that really sheds light on this issue. 166 00:09:16,320 --> 00:09:17,520 Speaker 5: So I wanted to shout them out. 167 00:09:17,559 --> 00:09:20,720 Speaker 4: So back in January, there was a deadly plane crash 168 00:09:20,720 --> 00:09:24,000 Speaker 4: where I live here in DC, where a helicopter crashed 169 00:09:24,000 --> 00:09:27,320 Speaker 4: into a plane. Both of them landed in the Potomac River, 170 00:09:27,360 --> 00:09:30,800 Speaker 4: and sixty-seven people died. It was a hard way 171 00:09:30,840 --> 00:09:32,680 Speaker 4: to start out the new year, and it just was 172 00:09:32,720 --> 00:09:37,600 Speaker 4: a really dark time here in DC. However, it also 173 00:09:37,720 --> 00:09:40,640 Speaker 4: became very dark for this woman, Jo Ellis, who is 174 00:09:40,720 --> 00:09:45,200 Speaker 4: a transgender National Guard pilot who was falsely blamed for 175 00:09:45,320 --> 00:09:50,480 Speaker 4: causing this crash. Now, to be super duper clear, Ellis 176 00:09:50,600 --> 00:09:54,240 Speaker 4: was not involved in this crash in any capacity whatsoever, 177 00:09:54,440 --> 00:09:58,520 Speaker 4: just was not involved. However, days before the crash happened, 178 00:09:58,800 --> 00:10:01,440 Speaker 4: she had published this pretty good essay about what it's 179 00:10:01,480 --> 00:10:05,000 Speaker 4: like being a transgender pilot.
That alone was enough for 180 00:10:05,160 --> 00:10:07,640 Speaker 4: right-wing influencers, some of whom have millions of 181 00:10:07,679 --> 00:10:11,360 Speaker 4: followers across the Internet, to baselessly accuse her of being 182 00:10:11,480 --> 00:10:14,360 Speaker 4: responsible for this crash. So in the days leading up 183 00:10:14,400 --> 00:10:17,480 Speaker 4: to this crash, she wrote this piece called Living to Serve, 184 00:10:17,600 --> 00:10:21,360 Speaker 4: Living as Myself: A Transgender Service Member's Perspective, where she 185 00:10:21,520 --> 00:10:24,160 Speaker 4: describes growing up in a family of service members and 186 00:10:24,200 --> 00:10:26,080 Speaker 4: joining the Virginia Army National Guard in two 187 00:10:26,000 --> 00:10:26,600 Speaker 5: thousand and nine. 188 00:10:26,920 --> 00:10:29,320 Speaker 4: In her piece, she described sending an email to her 189 00:10:29,320 --> 00:10:31,520 Speaker 4: command and giving them notice that she intended to start 190 00:10:31,760 --> 00:10:36,240 Speaker 4: transitioning under the then-current in-service transition policy, and 191 00:10:36,640 --> 00:10:38,559 Speaker 4: she says she was met with a lot of support 192 00:10:38,600 --> 00:10:41,800 Speaker 4: from her team. It's clear in the piece that she's 193 00:10:41,800 --> 00:10:43,760 Speaker 4: trying to sort of push back on some of these 194 00:10:44,320 --> 00:10:47,640 Speaker 4: well-worn myths about how trans people get all this 195 00:10:47,720 --> 00:10:50,560 Speaker 4: free stuff, that, you know, you just 196 00:10:50,640 --> 00:10:54,080 Speaker 4: join the military and the government pays for everything. She 197 00:10:54,160 --> 00:10:56,640 Speaker 4: actually describes paying out of pocket for all of her 198 00:10:56,679 --> 00:11:00,560 Speaker 4: trans-related care. So after her piece was published, just 199 00:11:00,640 --> 00:11:04,240 Speaker 4: days later, the crash happened.
She describes waking up to 200 00:11:04,400 --> 00:11:07,240 Speaker 4: a friend warning her that she had been named online 201 00:11:07,320 --> 00:11:09,920 Speaker 4: as the pilot who killed all of these innocent passengers 202 00:11:09,920 --> 00:11:12,800 Speaker 4: in this deadly crash. At first, she's thinking, oh, this 203 00:11:12,840 --> 00:11:16,800 Speaker 4: is just some weird, isolated claim. Somebody must have mistakenly 204 00:11:16,840 --> 00:11:19,640 Speaker 4: connected me to the crash because of this essay; I 205 00:11:19,760 --> 00:11:22,360 Speaker 4: describe being from Virginia, and this happened really close to Virginia. 206 00:11:22,520 --> 00:11:25,000 Speaker 4: She's like, oh, just a mistake. Then she gets on 207 00:11:25,000 --> 00:11:29,199 Speaker 4: Facebook and she sees lots of messages, both from friends 208 00:11:29,480 --> 00:11:30,439 Speaker 4: basically being like, 209 00:11:30,400 --> 00:11:31,880 Speaker 5: are you alive, are you okay? 210 00:11:32,440 --> 00:11:35,079 Speaker 4: I see that you were involved in this crash, as 211 00:11:35,120 --> 00:11:39,160 Speaker 4: well as hateful, really transphobic messages from people saying that 212 00:11:39,200 --> 00:11:41,400 Speaker 4: they thought that she had been involved in this crash. 213 00:11:41,440 --> 00:11:45,280 Speaker 5: So again, I can't quite imagine what it 214 00:11:45,280 --> 00:11:47,960 Speaker 4: would be like to go to bed just a normal 215 00:11:48,000 --> 00:11:50,920 Speaker 4: person and wake up being at the center of this 216 00:11:51,040 --> 00:11:54,520 Speaker 4: horrible tragedy that you weren't even involved in in any capacity. 217 00:11:55,559 --> 00:11:58,360 Speaker 1: You know what I really hate with all of this 218 00:11:58,559 --> 00:12:03,880 Speaker 1: conversation is that that was part of the point.
They 219 00:12:04,640 --> 00:12:08,720 Speaker 1: love taking an incident that they know they may have 220 00:12:08,800 --> 00:12:10,959 Speaker 1: some liability in, because we know what was happening with 221 00:12:11,000 --> 00:12:13,840 Speaker 1: the Department of Transportation and the FAA. We knew, like, 222 00:12:13,880 --> 00:12:17,520 Speaker 1: things were happening, and things were slowly, like, falling apart, 223 00:12:17,840 --> 00:12:20,560 Speaker 1: like we could see things falling apart before this happened. 224 00:12:20,920 --> 00:12:24,000 Speaker 1: And just like with the shooters, any of 225 00:12:24,000 --> 00:12:25,800 Speaker 1: the shooters, when they're like, ooh, this is not going 226 00:12:25,880 --> 00:12:28,280 Speaker 1: to be good for our side. So what we're gonna do 227 00:12:28,559 --> 00:12:31,200 Speaker 1: is put in as much disinformation and misinformation as we 228 00:12:31,280 --> 00:12:36,280 Speaker 1: can, and also give them a villain that we have 229 00:12:36,440 --> 00:12:40,600 Speaker 1: already used to capitalize on traditional families, in order to, like, 230 00:12:40,679 --> 00:12:43,720 Speaker 1: you know, get our narrative out there. There's so much 231 00:12:43,760 --> 00:12:46,120 Speaker 1: to this, and, like, it doesn't matter if they're wrong. 232 00:12:46,559 --> 00:12:49,560 Speaker 1: They've got the perfect victim, and they've got someone to 233 00:12:49,640 --> 00:12:53,920 Speaker 1: be the villain that could only help perpetuate their platform. 234 00:12:54,120 --> 00:12:57,320 Speaker 4: Right. You're exactly right, Samantha, and I mean, that is 235 00:12:57,360 --> 00:13:00,720 Speaker 4: exactly what's going on in this case. When, initially, like, 236 00:13:00,800 --> 00:13:04,400 Speaker 4: I remember the day this crash happened, and, 237 00:13:04,840 --> 00:13:06,600 Speaker 4: when I'm not making my podcast There Are No Girls 238 00:13:06,600 --> 00:13:08,520 Speaker 4: on the Internet,
I also co-host a local daily 239 00:13:08,559 --> 00:13:11,840 Speaker 4: news podcast called City Cast DC, so I was reporting on 240 00:13:11,880 --> 00:13:14,600 Speaker 4: this story, and I will never forget: literally the day 241 00:13:14,640 --> 00:13:19,920 Speaker 4: it happened, you had administration officials essentially blaming DEI. This 242 00:13:20,120 --> 00:13:22,400 Speaker 4: was when we didn't even know what the cause was, 243 00:13:22,480 --> 00:13:26,040 Speaker 4: and so they were so confident getting up on television 244 00:13:26,040 --> 00:13:27,960 Speaker 4: and being like, oh, the problem was DEI. And then, 245 00:13:28,000 --> 00:13:31,200 Speaker 4: back in March, the administration spent two point one million 246 00:13:31,240 --> 00:13:34,640 Speaker 4: dollars on an actual official investigation into whether or not 247 00:13:35,000 --> 00:13:39,439 Speaker 4: DEI policies were causing all these recent safety incidents with airlines, right? 248 00:13:39,480 --> 00:13:41,679 Speaker 4: And so it's just, I don't think I have to 249 00:13:41,720 --> 00:13:45,480 Speaker 4: tell anybody that there's just no credence to this, like, 250 00:13:45,559 --> 00:13:48,640 Speaker 4: you don't need two point one million dollars to study this.
251 00:13:49,000 --> 00:13:53,040 Speaker 4: But also, the way that, even before, like, I'm not kidding, 252 00:13:53,200 --> 00:13:56,560 Speaker 4: there were still bodies in the Potomac River when Donald 253 00:13:56,559 --> 00:13:59,280 Speaker 4: Trump was getting on TV talking about this being because 254 00:13:59,320 --> 00:14:02,800 Speaker 4: of DEI. And so you're exactly right that they've just 255 00:14:02,880 --> 00:14:08,679 Speaker 4: created this scapegoat that, you know, it's horrible for trans 256 00:14:08,679 --> 00:14:13,040 Speaker 4: folks and the trans community, but it's also horrible for 257 00:14:13,400 --> 00:14:19,160 Speaker 4: everybody, because when we're trying to somehow pin deadly plane 258 00:14:19,160 --> 00:14:23,800 Speaker 4: crashes on diversity, we're not actually looking at what causes 259 00:14:23,800 --> 00:14:26,400 Speaker 4: the deadly plane crashes. So the planes are less safe 260 00:14:26,400 --> 00:14:28,320 Speaker 4: for all of us, because they're like, oh, it's, you know, 261 00:14:28,440 --> 00:14:31,360 Speaker 4: how trans people are always causing plane crashes, and then 262 00:14:31,400 --> 00:14:34,680 Speaker 4: the thing that's actually causing plane crashes is going unexamined, 263 00:14:34,840 --> 00:14:36,120 Speaker 4: and so it's a real problem. 264 00:14:36,160 --> 00:14:38,640 Speaker 5: And I just hate that it's been so effective. 265 00:14:39,880 --> 00:14:43,200 Speaker 1: In my head, this is, like, it's such a bad, 266 00:14:43,280 --> 00:14:46,200 Speaker 1: like, when the queer community is involved, I'm like, it's 267 00:14:46,200 --> 00:14:49,240 Speaker 1: because they were, like, dressed too, too good, like 268 00:14:49,320 --> 00:14:51,480 Speaker 1: they're showing off their fits, you know? Yeah, a little 269 00:14:51,480 --> 00:14:53,960 Speaker 1: trend that's happening on TikTok.
Yeah, they're showing their fits off 270 00:14:54,000 --> 00:14:57,080 Speaker 1: and so they were being too distracting and caused this 271 00:14:57,120 --> 00:14:57,600 Speaker 1: plane crash. 272 00:14:57,640 --> 00:14:58,960 Speaker 5: And that's how it goes in my head. 273 00:14:59,680 --> 00:15:03,080 Speaker 4: I mean, that's essentially, it's like a joke, but, like, 274 00:15:03,120 --> 00:15:04,560 Speaker 4: that's essentially what they're saying. 275 00:15:04,640 --> 00:15:17,720 Speaker 5: Sometimes it's so silly. It is beyond. 276 00:15:18,720 --> 00:15:20,600 Speaker 1: This is the part I think you're coming back to, 277 00:15:20,760 --> 00:15:23,840 Speaker 1: and I want to make note of, that there is 278 00:15:23,960 --> 00:15:27,360 Speaker 1: no real liability in this disinformation slash just 279 00:15:27,440 --> 00:15:32,440 Speaker 1: straight-out lies, because this plays out on social media, exactly. 280 00:15:32,560 --> 00:15:35,440 Speaker 4: So we're starting to see a little bit of movement there, 281 00:15:35,480 --> 00:15:37,040 Speaker 4: and I'll talk more about that toward the end of 282 00:15:37,120 --> 00:15:41,200 Speaker 4: the episode. But yeah, because it's all happening online, it's 283 00:15:41,240 --> 00:15:45,000 Speaker 4: happening in a space where real damage can be done 284 00:15:45,240 --> 00:15:49,440 Speaker 4: and then real accountability can be elusive, and so that's 285 00:15:49,480 --> 00:15:53,440 Speaker 4: exactly what happened in this case. Right-wing influencers just 286 00:15:53,480 --> 00:15:57,160 Speaker 4: started boosting lies that Ellis was involved in this crash 287 00:15:57,200 --> 00:16:00,240 Speaker 4: when she wasn't.
On Twitter, Matt Wallace, who has over 288 00:16:00,240 --> 00:16:02,560 Speaker 4: two million followers, put out a post saying that a 289 00:16:02,600 --> 00:16:06,040 Speaker 4: trans Black Hawk pilot had written a letter about depression and 290 00:16:06,120 --> 00:16:09,360 Speaker 4: gender dysphoria the day before the deadly crash. But if 291 00:16:09,400 --> 00:16:13,440 Speaker 4: you actually read Ellis's piece, she talks about how she 292 00:16:13,560 --> 00:16:17,040 Speaker 4: was depressed when she was young, before she 293 00:16:17,160 --> 00:16:21,040 Speaker 4: transitioned. Her piece is all about how, you know, 294 00:16:21,240 --> 00:16:24,400 Speaker 4: I feel like I have a supportive community both professionally 295 00:16:24,480 --> 00:16:27,560 Speaker 4: and personally and socially. Her piece did not make 296 00:16:27,600 --> 00:16:29,440 Speaker 4: her sound like she was an unhappy person. It made 297 00:16:29,440 --> 00:16:31,720 Speaker 4: her sound like she was a very happy, you know, 298 00:16:32,360 --> 00:16:33,200 Speaker 4: solid person. 299 00:16:33,240 --> 00:16:34,840 Speaker 5: So even that is a lie. 300 00:16:35,160 --> 00:16:37,560 Speaker 4: But he went so far as to suggest that it 301 00:16:37,640 --> 00:16:41,360 Speaker 4: might have been some kind of a quote trans terror attack. 302 00:16:41,960 --> 00:16:44,960 Speaker 4: That post blew up and got almost five million views 303 00:16:44,960 --> 00:16:48,600 Speaker 4: before he deleted it. Then Ann Vandersteel, who was 304 00:16:48,640 --> 00:16:51,720 Speaker 4: a pretty well-known QAnon personality with a huge following online, 305 00:16:51,920 --> 00:16:54,880 Speaker 4: jumped in and pushed the same false claim. She did 306 00:16:55,000 --> 00:16:58,720 Speaker 4: eventually post a retraction, and of course it would not 307 00:16:58,760 --> 00:17:02,920 Speaker 4: be an episode about transphobia without mentioning Elon Musk.
Musk's 308 00:17:02,960 --> 00:17:07,000 Speaker 4: own AI chatbot, known as Grok, falsely named Ellis as 309 00:17:07,040 --> 00:17:10,040 Speaker 4: the pilot, which only made the rumor spread faster. So 310 00:17:10,119 --> 00:17:14,120 Speaker 4: before long, Ellis was actually trending as the second most 311 00:17:14,119 --> 00:17:17,080 Speaker 4: talked-about topic on Twitter, with more than ninety thousand 312 00:17:17,160 --> 00:17:22,879 Speaker 4: posts about her, like, speculating that she was involved and 313 00:17:22,920 --> 00:17:24,600 Speaker 4: that the whole thing was some sort of a quote 314 00:17:24,960 --> 00:17:28,240 Speaker 4: trans terror attack. She had to put out a proof 315 00:17:28,240 --> 00:17:31,880 Speaker 4: of life video on social media to reassure her community 316 00:17:31,880 --> 00:17:35,000 Speaker 4: that she wasn't dead, and also just as an attempt 317 00:17:35,119 --> 00:17:37,040 Speaker 4: to push back against these lies. 318 00:17:39,520 --> 00:17:41,120 Speaker 1: I hate that she had to do a proof of 319 00:17:41,240 --> 00:17:45,359 Speaker 1: life video just to, like, save her name. But also, 320 00:17:45,440 --> 00:17:48,919 Speaker 1: I wonder how many people are like, this is AI, like 321 00:17:48,960 --> 00:17:50,919 Speaker 1: they still refuse to believe it. Yeah. 322 00:17:50,760 --> 00:17:51,640 Speaker 5: I mean, that's the thing.
323 00:17:51,720 --> 00:17:54,560 Speaker 4: So we know that in that crash there were no survivors, 324 00:17:54,600 --> 00:17:56,680 Speaker 4: and so I almost wonder if someone saw 325 00:17:56,720 --> 00:17:57,919 Speaker 4: a video of her being like, well, there were 326 00:17:57,920 --> 00:17:59,840 Speaker 4: no survivors and here I am, and they're like, that's not real, 327 00:18:00,480 --> 00:18:04,200 Speaker 4: that's AI. And that just goes to show how our 328 00:18:04,400 --> 00:18:08,800 Speaker 4: digital media landscape is: some people, when confronted 329 00:18:09,000 --> 00:18:13,680 Speaker 4: with a truth that does not align with the reality 330 00:18:13,680 --> 00:18:15,639 Speaker 4: that they want to be true, will always reject it. 331 00:18:15,760 --> 00:18:15,879 Speaker 1: Right. 332 00:18:16,200 --> 00:18:18,040 Speaker 4: They can meet her in the flesh and it's like, oh, 333 00:18:18,080 --> 00:18:20,399 Speaker 4: it's a body double. Like, it truly 334 00:18:20,480 --> 00:18:25,000 Speaker 4: does not matter. So I will say, after she posted 335 00:18:25,240 --> 00:18:29,320 Speaker 4: her proof of life video, Matt Wallace, one of the 336 00:18:29,320 --> 00:18:33,240 Speaker 4: people who initially claimed that she was responsible for this accident, 337 00:18:34,160 --> 00:18:37,439 Speaker 4: did post to say, oh, I saw that video. Ellis 338 00:18:37,520 --> 00:18:40,480 Speaker 4: is alive and well and not responsible for the crash. 339 00:18:40,800 --> 00:18:45,159 Speaker 4: He shifted the blame to another account called fake Gay Politics, 340 00:18:45,320 --> 00:18:48,480 Speaker 4: which has since been suspended on X, saying this is 341 00:18:48,480 --> 00:18:50,960 Speaker 4: apparently the first account who reported what we now know 342 00:18:51,080 --> 00:18:53,959 Speaker 4: is false.
It seemed credible because Jo Ellis wrote an 343 00:18:54,040 --> 00:18:56,840 Speaker 4: article calling out Trump's trans military ban only a few 344 00:18:56,880 --> 00:18:59,000 Speaker 4: days ago. I have to point out that in this 345 00:18:59,160 --> 00:19:01,439 Speaker 4: post where he said, oh, I got it wrong, she 346 00:19:01,520 --> 00:19:05,520 Speaker 4: wasn't responsible, he of course also misgenders her, because he 347 00:19:05,560 --> 00:19:07,360 Speaker 4: just can't help himself. Like, he has 348 00:19:07,400 --> 00:19:10,240 Speaker 4: to, even in a post where, I don't know, he 349 00:19:10,280 --> 00:19:14,840 Speaker 4: should maybe be apologizing for baselessly accusing her of committing 350 00:19:14,880 --> 00:19:17,399 Speaker 4: a trans terror attack to, like, millions of people. He 351 00:19:17,400 --> 00:19:19,920 Speaker 4: should maybe be apologizing. No, he's like, I gotta misgender 352 00:19:19,960 --> 00:19:22,720 Speaker 4: her on the way out also, right. 353 00:19:22,480 --> 00:19:25,920 Speaker 1: I mean, it's definitely, like, you know, a treat 354 00:19:25,920 --> 00:19:28,600 Speaker 1: them as less than human type of narrative from him, of 355 00:19:28,640 --> 00:19:31,400 Speaker 1: course, Wallace, who I can't stand anyway. But Matt 356 00:19:31,400 --> 00:19:34,800 Speaker 1: Wallace himself, he can't. He can't really actually show any 357 00:19:35,119 --> 00:19:36,800 Speaker 1: compassion exactly. 358 00:19:37,240 --> 00:19:40,160 Speaker 4: And how this moved, how this 359 00:19:41,080 --> 00:19:46,000 Speaker 4: claim moved online,
I think really shows what an effective 360 00:19:46,560 --> 00:19:50,120 Speaker 4: social media and media apparatus folks on the right really 361 00:19:50,160 --> 00:19:54,000 Speaker 4: do have, where these claims maybe start in more fringe 362 00:19:54,080 --> 00:19:56,800 Speaker 4: pockets of the Internet like 4chan or some random 363 00:19:56,800 --> 00:19:59,960 Speaker 4: extremist blogger's Twitter page, but then they quickly get boosted 364 00:20:00,080 --> 00:20:04,480 Speaker 4: by right wing politicians and media figures, and then bleed 365 00:20:04,520 --> 00:20:07,679 Speaker 4: into more mainstream media outlets where they just sort of 366 00:20:07,720 --> 00:20:09,560 Speaker 4: become part of people's consciousness. 367 00:20:09,560 --> 00:20:10,840 Speaker 5: Like, I would be willing to bet 368 00:20:10,680 --> 00:20:16,399 Speaker 4: that, because of that picture of Sam Hyde that misrepresents 369 00:20:16,440 --> 00:20:19,280 Speaker 4: Sam as a trans woman responsible for this attack, I 370 00:20:19,320 --> 00:20:21,640 Speaker 4: would be willing to bet that some people out there think, oh, 371 00:20:21,760 --> 00:20:24,880 Speaker 4: that image is of an actual trans person who committed 372 00:20:24,880 --> 00:20:28,080 Speaker 4: a horrible atrocity, even though it's not, because it's just 373 00:20:28,119 --> 00:20:29,920 Speaker 4: like part of the ecosystem.
374 00:20:29,960 --> 00:20:32,880 Speaker 3: Now, yeah. And I mean, that's part of the damage here, 375 00:20:32,920 --> 00:20:39,400 Speaker 3: is even with these retractions, which, I mean, I guess 376 00:20:39,400 --> 00:20:41,280 Speaker 3: I'm glad they did them, but at the same time, 377 00:20:41,320 --> 00:20:44,440 Speaker 3: I'm kind of like, yeah, sure, okay, but at that point, 378 00:20:44,440 --> 00:20:47,480 Speaker 3: the damage is kind of done. Like, somebody has already 379 00:20:48,480 --> 00:20:51,760 Speaker 3: been pinned for this, some, you know, some people who 380 00:20:52,960 --> 00:20:56,679 Speaker 3: digested that information might not ever see the retraction. Like, 381 00:20:57,000 --> 00:21:00,600 Speaker 3: the damage is kind of done. They might have that 382 00:21:00,720 --> 00:21:04,920 Speaker 3: already in their head. And Ellis was someone 383 00:21:05,200 --> 00:21:07,520 Speaker 3: who experienced this kind of thing. 384 00:21:08,480 --> 00:21:08,760 Speaker 5: Yeah. 385 00:21:08,760 --> 00:21:13,159 Speaker 4: She described, essentially overnight, going from being unknown in 386 00:21:13,240 --> 00:21:15,639 Speaker 4: public to being somebody who felt like she could not 387 00:21:15,720 --> 00:21:18,359 Speaker 4: even go out in public for her own safety. She 388 00:21:18,440 --> 00:21:21,000 Speaker 4: told The New York Times, quote, my life was turned 389 00:21:21,080 --> 00:21:24,080 Speaker 4: upside down at that point, adding that her employer sent 390 00:21:24,280 --> 00:21:26,959 Speaker 4: armed bodyguards to protect her family and that she started 391 00:21:26,960 --> 00:21:30,520 Speaker 4: carrying a loaded weapon as a precaution. Quote, forever on, I'm 392 00:21:30,560 --> 00:21:34,560 Speaker 4: known as that trans terrorist.
And so, as you both 393 00:21:34,600 --> 00:21:39,200 Speaker 4: were just saying, with these claims, it can be hard 394 00:21:39,280 --> 00:21:43,200 Speaker 4: to get accountability because they're happening online. And even though 395 00:21:43,440 --> 00:21:47,679 Speaker 4: Matt Wallace did retract his claim that she was involved 396 00:21:47,680 --> 00:21:50,720 Speaker 4: in this, I have to imagine that he did that 397 00:21:50,880 --> 00:21:54,040 Speaker 4: because Ellis did not take this sitting down. She actually 398 00:21:54,080 --> 00:21:56,399 Speaker 4: fought back. And so we've talked about this on my 399 00:21:56,440 --> 00:21:59,800 Speaker 4: own podcast a lot, but when online right wing influencers 400 00:21:59,800 --> 00:22:03,240 Speaker 4: and figures spread lies about people, we're starting to see 401 00:22:03,280 --> 00:22:06,200 Speaker 4: more and more that the people targeted by their 402 00:22:06,320 --> 00:22:11,800 Speaker 4: lies are filing defamation lawsuits against the people who spread demonstrable, 403 00:22:11,840 --> 00:22:14,400 Speaker 4: harmful lies about them. And that's exactly what Ellis did. 404 00:22:14,480 --> 00:22:17,600 Speaker 4: She filed a defamation lawsuit against Matt Wallace in April, 405 00:22:18,400 --> 00:22:22,640 Speaker 4: saying that he was behind a destructive and irresponsible defamation campaign. 406 00:22:23,040 --> 00:22:25,440 Speaker 4: The lawsuit against Wallace, filed in the US District Court 407 00:22:25,480 --> 00:22:28,240 Speaker 4: in Colorado, was a way for Ellis to seek damages 408 00:22:28,280 --> 00:22:31,160 Speaker 4: for the harm caused to her reputation, privacy, and safety.
409 00:22:32,040 --> 00:22:34,480 Speaker 4: Ellis said that Wallace has not yet counter-filed, and 410 00:22:34,520 --> 00:22:37,400 Speaker 4: her lawsuit was filed by the Equality Legal Action Fund, 411 00:22:37,400 --> 00:22:39,880 Speaker 4: which is a group of volunteer attorneys and advocates who 412 00:22:39,920 --> 00:22:43,240 Speaker 4: help members of the LGBTQ community fight online defamation. And 413 00:22:43,320 --> 00:22:46,480 Speaker 4: so I have to imagine that when all these people 414 00:22:46,480 --> 00:22:49,199 Speaker 4: were like, oh, looks like I got it wrong and 415 00:22:49,200 --> 00:22:51,919 Speaker 4: this person wasn't involved in this tragedy, as I just 416 00:22:51,920 --> 00:22:54,600 Speaker 4: said to a million people, I have to imagine part 417 00:22:54,640 --> 00:22:57,200 Speaker 4: of that was because they did not want to face 418 00:22:57,280 --> 00:23:00,920 Speaker 4: a defamation lawsuit. Which, yeah, if you lie about 419 00:23:00,920 --> 00:23:03,560 Speaker 4: people in a way that can harm them, harm their career, 420 00:23:03,640 --> 00:23:06,280 Speaker 4: harm their livelihood, you might actually be looking at a 421 00:23:06,280 --> 00:23:07,160 Speaker 4: defamation lawsuit. 422 00:23:08,640 --> 00:23:09,240 Speaker 2: Yeah. 423 00:23:09,400 --> 00:23:12,320 Speaker 3: Yeah, and we have seen some cases of this working 424 00:23:12,440 --> 00:23:14,160 Speaker 3: out recently. 425 00:23:14,280 --> 00:23:16,000 Speaker 2: Interestingly, I don't 426 00:23:15,800 --> 00:23:19,000 Speaker 3: know if you've been following what happened, what's going on 427 00:23:19,160 --> 00:23:20,879 Speaker 3: with the app Tea? 428 00:23:21,440 --> 00:23:23,920 Speaker 5: Oh yes, I have, very very closely. 429 00:23:23,960 --> 00:23:26,399 Speaker 3: Yes. But we were talking about that recently in an 430 00:23:26,440 --> 00:23:32,000 Speaker 3: episode about defamation.
It can be tricky, but if 431 00:23:32,040 --> 00:23:37,840 Speaker 3: you can prove, like, this demonstrably hurt me, you 432 00:23:38,160 --> 00:23:39,439 Speaker 3: can get some money from that. 433 00:23:40,760 --> 00:23:41,000 Speaker 5: Yeah. 434 00:23:41,240 --> 00:23:44,440 Speaker 4: Are you talking about how the women who had 435 00:23:44,480 --> 00:23:49,080 Speaker 4: their information, you know, exposed by Tea are suing? 436 00:23:49,720 --> 00:23:52,360 Speaker 3: Yeah, well, we were talking about both, because men were 437 00:23:52,400 --> 00:23:55,720 Speaker 3: so angry and they were trying to sue for defamation, 438 00:23:56,359 --> 00:23:58,600 Speaker 3: and it's kind of tricky when a lot of it 439 00:23:58,680 --> 00:24:02,840 Speaker 3: is just, well, this is the date you went on 440 00:24:03,080 --> 00:24:05,960 Speaker 3: and this is what you posted online. So it's kind 441 00:24:05,960 --> 00:24:10,280 Speaker 3: of tricky, but in other cases, like this, the 442 00:24:10,320 --> 00:24:15,120 Speaker 3: women had their information leaked and they're like, well, 443 00:24:16,000 --> 00:24:16,880 Speaker 3: this is a bit... 444 00:24:18,119 --> 00:24:20,399 Speaker 4: When I was covering the Tea app, we were 445 00:24:20,440 --> 00:24:23,840 Speaker 4: talking about those Are We Dating the Same Guy? Facebook 446 00:24:23,880 --> 00:24:27,199 Speaker 4: pages that kind of were a sort of precursor to 447 00:24:27,400 --> 00:24:30,639 Speaker 4: the Tea app. And as you just said, men tried 448 00:24:30,680 --> 00:24:33,040 Speaker 4: to sue women who talked about the dates that they 449 00:24:33,040 --> 00:24:37,040 Speaker 4: had had on those pages, and those lawsuits did 450 00:24:37,080 --> 00:24:39,200 Speaker 4: not go anywhere.
And I remember watching a press conference 451 00:24:39,280 --> 00:24:41,840 Speaker 4: of these women who posted on one of those pages 452 00:24:42,280 --> 00:24:44,480 Speaker 4: after the case was dropped and they were not going 453 00:24:44,520 --> 00:24:45,720 Speaker 4: to be found liable for defamation. 454 00:24:46,400 --> 00:24:49,040 Speaker 5: They basically said exactly what you just said, Annie. 455 00:24:49,240 --> 00:24:52,879 Speaker 4: You know, I'm sorry that you didn't like that what 456 00:24:53,000 --> 00:24:56,120 Speaker 4: I said about what happened on our date was not 457 00:24:56,160 --> 00:24:59,000 Speaker 4: favorable to you, but that is what happened, and that's 458 00:24:59,000 --> 00:24:59,640 Speaker 4: not defamation. 459 00:24:59,800 --> 00:25:02,080 Speaker 5: And I guess the court agreed, right. 460 00:25:02,800 --> 00:25:04,879 Speaker 1: And I think the thing about these types of cases, 461 00:25:04,920 --> 00:25:08,120 Speaker 1: and anything with, like, civil lawsuits in general, with defamation 462 00:25:08,119 --> 00:25:10,880 Speaker 1: and all of this, is that it really does give 463 00:25:10,920 --> 00:25:14,719 Speaker 1: an upper hand to victims who can prove what is happening. 464 00:25:14,840 --> 00:25:16,960 Speaker 1: We talked about this in the MeToo episode. We 465 00:25:17,080 --> 00:25:19,560 Speaker 1: just had an interview with E. Jean Carroll and her 466 00:25:19,640 --> 00:25:22,720 Speaker 1: defamation lawsuits, where we were like, yes, girl, get them. 467 00:25:22,960 --> 00:25:27,159 Speaker 1: But, like, the level of proof is not as deep 468 00:25:27,200 --> 00:25:29,720 Speaker 1: as a criminal case.
But you still have to prove 469 00:25:29,720 --> 00:25:32,840 Speaker 1: it, obviously. And it's such a tricky thing because there 470 00:25:32,920 --> 00:25:36,480 Speaker 1: are so many uncertainties, and this is not as well 471 00:25:36,520 --> 00:25:39,359 Speaker 1: known. Obviously, defamation and civil suits are not 472 00:25:39,400 --> 00:25:42,440 Speaker 1: as well known as criminal suits. We're not, like, watching 473 00:25:42,560 --> 00:25:45,480 Speaker 1: shows based on that, although we could. And I'm guessing, 474 00:25:45,560 --> 00:25:48,000 Speaker 1: like, this is probably going to be one of those 475 00:25:48,040 --> 00:25:50,480 Speaker 1: same things where it's kind of like, uh, how do 476 00:25:50,520 --> 00:25:50,960 Speaker 1: we proceed? 477 00:25:51,000 --> 00:25:51,960 Speaker 5: What does this look like? 478 00:25:53,440 --> 00:25:56,760 Speaker 4: Yeah, I mean, again, it is this, like, tricky gray 479 00:25:56,920 --> 00:26:01,520 Speaker 4: area legally. Another good example of people successfully suing for 480 00:26:01,560 --> 00:26:04,600 Speaker 4: defamation were the parents of the Sandy Hook shooting, who 481 00:26:04,680 --> 00:26:09,000 Speaker 4: successfully sued Alex Jones for defamation after he repeatedly and 482 00:26:09,080 --> 00:26:13,600 Speaker 4: baselessly claimed the shooting that killed their babies was a hoax, 483 00:26:13,880 --> 00:26:17,760 Speaker 4: that their kids were not really dead, that they were actors, 484 00:26:18,040 --> 00:26:20,720 Speaker 4: or that the parents were involved in some massive conspiracy. 485 00:26:21,200 --> 00:26:24,040 Speaker 4: And, I mean, they were able to prove that 486 00:26:24,119 --> 00:26:27,040 Speaker 4: he actually did defame them. Also, it's such a 487 00:26:27,040 --> 00:26:31,440 Speaker 4: despicable thing to do. Like, he deserves every bad thing 488 00:26:31,480 --> 00:26:33,200 Speaker 4: that could happen to a person.
When you actually look 489 00:26:33,200 --> 00:26:35,520 Speaker 4: at it, he didn't just say this 490 00:26:35,640 --> 00:26:39,560 Speaker 4: offhand once or twice. These people had his listeners showing 491 00:26:39,640 --> 00:26:41,879 Speaker 4: up to their houses to harass them in person. Like, 492 00:26:42,000 --> 00:26:43,800 Speaker 4: when you actually look at the things that he did, 493 00:26:43,800 --> 00:26:47,480 Speaker 4: it's so despicable. And we also know that Dominion Voting 494 00:26:47,480 --> 00:26:50,440 Speaker 4: Systems got a seven hundred and eighty seven million dollar 495 00:26:50,520 --> 00:26:54,280 Speaker 4: settlement from Fox News because Fox claimed that their voting 496 00:26:54,320 --> 00:26:58,000 Speaker 4: systems during the election were rigged against Trump. So there 497 00:26:58,119 --> 00:27:03,040 Speaker 4: is some precedent for suing businesses for defamation. You know, 498 00:27:03,359 --> 00:27:06,080 Speaker 4: Fox is a business, Alex Jones runs a business, 499 00:27:06,440 --> 00:27:11,000 Speaker 4: but not really a precedent for suing individual influencers who 500 00:27:11,040 --> 00:27:13,959 Speaker 4: spread damaging lies about people. So I think that's what 501 00:27:14,040 --> 00:27:16,679 Speaker 4: Ellis is really trying to do. Trying to say, like, 502 00:27:16,880 --> 00:27:21,520 Speaker 4: if an influencer with millions of followers makes a demonstrably 503 00:27:21,640 --> 00:27:24,040 Speaker 4: false claim about me that harms me, I should be 504 00:27:24,040 --> 00:27:26,560 Speaker 4: able to seek damages for that. And I think, you know, 505 00:27:27,040 --> 00:27:30,760 Speaker 4: especially for marginalized people, there should be some kind of 506 00:27:30,840 --> 00:27:33,600 Speaker 4: legal recourse that prevents this.
It should not just be 507 00:27:33,720 --> 00:27:39,119 Speaker 4: that anybody can say any harmful, dangerous, damaging lie they want, 508 00:27:39,440 --> 00:27:40,400 Speaker 4: and you just have to 509 00:27:40,480 --> 00:27:42,520 Speaker 2: eat it. Right, right. 510 00:27:42,600 --> 00:27:45,800 Speaker 1: I mean, when we look at, like, Giuliani again with Trump, 511 00:27:46,440 --> 00:27:49,560 Speaker 1: even the Megan Thee Stallion recent cases, these are 512 00:27:49,600 --> 00:27:52,960 Speaker 1: really interesting. Of course we have bigger names 513 00:27:53,119 --> 00:27:55,879 Speaker 1: doing these cases. Now, the Giuliani case, that was a 514 00:27:55,920 --> 00:27:59,159 Speaker 1: beautiful suit to see in itself. But there are so 515 00:27:59,200 --> 00:28:00,879 Speaker 1: many things that it looks like, you know, this is 516 00:28:00,920 --> 00:28:04,960 Speaker 1: the best solution. I say that hesitantly because it's so 517 00:28:05,040 --> 00:28:08,000 Speaker 1: gross that people can't get justice in general. 518 00:28:08,440 --> 00:28:11,440 Speaker 5: Yes, but, like, I guess you know what I 519 00:28:11,400 --> 00:28:14,840 Speaker 4: mean. No, totally, and I'm glad that you brought up Giuliani. 520 00:28:14,960 --> 00:28:19,280 Speaker 4: So he was sued by two women you've probably heard 521 00:28:19,320 --> 00:28:23,280 Speaker 4: of, Ruby Freeman and Shaye Moss, Georgia women. And again, 522 00:28:23,359 --> 00:28:25,080 Speaker 4: I guess, I want to bring this up because it's 523 00:28:25,119 --> 00:28:29,040 Speaker 4: not like he just casually, offhand said, oh, maybe they 524 00:28:29,040 --> 00:28:33,119 Speaker 4: were involved in vote rigging the election for Biden against Trump. 525 00:28:33,440 --> 00:28:36,879 Speaker 4: The way that he said this repeatedly, the way that 526 00:28:36,920 --> 00:28:39,600 Speaker 4: he, I mean, he, like,
in one instance, he accuses 527 00:28:39,680 --> 00:28:44,040 Speaker 4: them of passing back and forth USB drives that had 528 00:28:44,400 --> 00:28:47,160 Speaker 4: secret hidden votes on them, as if they were doing 529 00:28:47,160 --> 00:28:49,920 Speaker 4: a drug deal together. So I just want to make clear, because some 530 00:28:51,360 --> 00:28:55,400 Speaker 4: people are probably thinking, oh, well, why can't I just 531 00:28:55,440 --> 00:28:57,720 Speaker 4: say whatever I want about anybody? These were not 532 00:28:57,800 --> 00:29:01,840 Speaker 4: public figures, right, and it's not just an offhand casual remark. 533 00:29:02,160 --> 00:29:08,840 Speaker 4: It is baselessly and repeatedly saying very inflammatory things about somebody, 534 00:29:08,920 --> 00:29:09,880 Speaker 4: over and over and 535 00:29:09,800 --> 00:29:13,080 Speaker 5: over again, to audiences of millions. Like, it really is. 536 00:29:13,000 --> 00:29:14,760 Speaker 4: When you actually look at what some 537 00:29:14,800 --> 00:29:17,640 Speaker 4: of these cases involved, it's really clear that you should 538 00:29:17,680 --> 00:29:18,560 Speaker 4: not just be able to do this. 539 00:29:18,840 --> 00:29:21,080 Speaker 5: It really did 540 00:29:20,880 --> 00:29:24,480 Speaker 4: have a clear damaging impact on the lives of the 541 00:29:24,480 --> 00:29:27,320 Speaker 4: people targeted. Like, people broke into Ruby Freeman and Shaye 542 00:29:27,360 --> 00:29:29,560 Speaker 4: Moss's home and tried to make a citizen's arrest while 543 00:29:29,600 --> 00:29:32,080 Speaker 4: her grandmother, her elderly grandmother, was in the house. 544 00:29:32,120 --> 00:29:33,960 Speaker 5: Like, that was probably terrifying.
546 00:29:34,040 --> 00:29:35,880 Speaker 1: Right. And we know a lot of this, as we 547 00:29:35,920 --> 00:29:38,880 Speaker 1: started the conversation about mass shootings and shootings in general, 548 00:29:39,240 --> 00:29:42,560 Speaker 1: like, a lot of these could be based on 549 00:29:42,600 --> 00:29:46,040 Speaker 1: misinformation. The CDC shooting literally was based on that, 550 00:29:46,880 --> 00:29:48,880 Speaker 1: as far as we know. And by the way, the 551 00:29:48,920 --> 00:29:50,680 Speaker 1: way that was just swept under the rug, can we 552 00:29:50,720 --> 00:29:51,400 Speaker 1: talk about that? 553 00:29:51,800 --> 00:29:53,400 Speaker 5: Yes, sorry, that was a side note, but 554 00:29:53,360 --> 00:29:56,520 Speaker 1: I'm just like, this mis-, this disinformation is what's causing 555 00:29:56,560 --> 00:29:58,840 Speaker 1: some of these awful outcomes. 556 00:30:00,200 --> 00:30:04,280 Speaker 4: We are living in a time where somebody can baselessly 557 00:30:04,360 --> 00:30:08,800 Speaker 4: disparage really anybody, and that, in this day 558 00:30:08,800 --> 00:30:10,520 Speaker 4: and age, I'm sorry to say, 559 00:30:10,440 --> 00:30:11,760 Speaker 5: can have violent outcomes. 560 00:30:11,760 --> 00:30:14,040 Speaker 4: And, I agree with you, the 561 00:30:14,040 --> 00:30:16,840 Speaker 4: fact that the CDC shooting happened and we all just 562 00:30:16,880 --> 00:30:20,280 Speaker 4: sort of moved on from it, and the number one 563 00:30:20,280 --> 00:30:23,280 Speaker 4: figure of public health in this country, RFK Jr., 564 00:30:23,520 --> 00:30:26,600 Speaker 4: barely even mentioned it.
I mean, I think it really 565 00:30:27,040 --> 00:30:31,600 Speaker 4: means that we have reached a new echelon in 566 00:30:31,680 --> 00:30:34,600 Speaker 4: this kind of thing, that it could be open season 567 00:30:34,640 --> 00:30:39,560 Speaker 4: on anybody. Like, genuinely, today it's CDC workers, or two 568 00:30:39,920 --> 00:30:43,479 Speaker 4: Black women who are election workers, or a trans pilot. 569 00:30:43,520 --> 00:30:46,040 Speaker 4: Tomorrow it could be you. Like, genuinely, that is 570 00:30:46,040 --> 00:30:46,560 Speaker 4: where we're at. 571 00:30:48,000 --> 00:31:00,000 Speaker 6: Yeah. 572 00:31:00,480 --> 00:31:05,840 Speaker 3: Unfortunately, in this case, we are talking about the damage 573 00:31:06,040 --> 00:31:09,280 Speaker 3: done to the trans community, which is already facing so 574 00:31:10,320 --> 00:31:16,000 Speaker 3: much violence, and this is just exacerbating it. 575 00:31:16,600 --> 00:31:19,200 Speaker 4: Yeah. And so, as I was saying, blaming trans folks 576 00:31:19,240 --> 00:31:21,520 Speaker 4: for incidents they had literally nothing to do with is 577 00:31:21,600 --> 00:31:24,640 Speaker 4: no isolated thing. A review of news reports and the fact- 578 00:31:24,680 --> 00:31:27,480 Speaker 4: checking database ClaimReview shows that since twenty twenty two, 579 00:31:27,520 --> 00:31:29,880 Speaker 4: there have been a dozen incidents when a trans person was 580 00:31:29,920 --> 00:31:32,960 Speaker 4: wrongly blamed for a tragedy or a violent incident.
So 581 00:31:33,240 --> 00:31:36,600 Speaker 4: after the tragic death of Melissa Hortman, the Minnesota lawmaker, and 582 00:31:36,640 --> 00:31:39,920 Speaker 4: her husband, who were murdered by a gunman earlier 583 00:31:39,960 --> 00:31:44,320 Speaker 4: this year, Donald Trump Jr. said, quote, the radical transgender 584 00:31:44,360 --> 00:31:47,560 Speaker 4: movement is per capita the most violent domestic terror threat, 585 00:31:47,800 --> 00:31:51,280 Speaker 4: if not in America, then probably the entire world. And 586 00:31:51,320 --> 00:31:55,760 Speaker 4: that shooter was not even a trans person. A trans woman 587 00:31:55,760 --> 00:31:58,880 Speaker 4: was initially blamed for the Trump shooting in Butler, Pennsylvania. 588 00:31:59,000 --> 00:32:01,280 Speaker 4: After a shooting where two people were killed in Wisconsin, 589 00:32:01,520 --> 00:32:05,000 Speaker 4: Alex Jones, who, like I said, really should not be 590 00:32:05,120 --> 00:32:09,400 Speaker 4: lying about anybody at this point, said, if the statistical 591 00:32:09,440 --> 00:32:12,320 Speaker 4: trend continues with this tragic event, there's a ninety eight 592 00:32:12,360 --> 00:32:14,400 Speaker 4: percent chance the shooter is trans 593 00:32:14,120 --> 00:32:16,240 Speaker 5: or gang related. I don't know where you got that from. 594 00:32:16,720 --> 00:32:19,280 Speaker 4: If this is another trans whack job or gang shooting, 595 00:32:19,480 --> 00:32:21,160 Speaker 4: it will be out of the news in less than 596 00:32:21,200 --> 00:32:23,680 Speaker 4: twenty four hours. And so I gotta say, this is 597 00:32:23,720 --> 00:32:27,120 Speaker 4: just a complete lie. Trans people are far more likely 598 00:32:27,200 --> 00:32:29,920 Speaker 4: to be the victims of crime, not the perpetrators. 599 00:32:30,640 --> 00:32:31,479 Speaker 5: This is from Wired.
600 00:32:31,920 --> 00:32:34,720 Speaker 4: Research shows that trans people are four times more likely 601 00:32:34,760 --> 00:32:36,800 Speaker 4: to be the victims of violence compared to cis people. 602 00:32:37,080 --> 00:32:40,560 Speaker 4: According to the LGBTQ advocacy group GLAAD, between May twenty 603 00:32:40,600 --> 00:32:42,880 Speaker 4: twenty four and May twenty twenty five, there have been 604 00:32:42,880 --> 00:32:45,680 Speaker 4: at least twenty six injuries and one death reported among 605 00:32:45,760 --> 00:32:49,120 Speaker 4: trans and gender nonconforming people, a fourteen percent jump from 606 00:32:49,160 --> 00:32:52,560 Speaker 4: the previous year. Meanwhile, claims that trans people are behind 607 00:32:52,600 --> 00:32:55,920 Speaker 4: mass violence don't add up. Per the Gun Violence Archive, 608 00:32:56,000 --> 00:32:58,440 Speaker 4: there have been four thousand four hundred mass shootings 609 00:32:58,480 --> 00:33:01,920 Speaker 4: in the past decade, of which fewer than ten known 610 00:33:02,000 --> 00:33:05,920 Speaker 4: suspects were trans. That is zero point one one percent. 611 00:33:06,040 --> 00:33:09,400 Speaker 4: And again, I mean, trans people are already not a 612 00:33:09,480 --> 00:33:12,200 Speaker 4: huge number of the population in general, so, like, just 613 00:33:12,320 --> 00:33:14,960 Speaker 4: common sense would probably tell you they would not be 614 00:33:15,280 --> 00:33:19,120 Speaker 4: statistically overrepresented in mass shootings. And so all of these 615 00:33:19,160 --> 00:33:22,640 Speaker 4: claims linking trans people to being the perpetrators of violence 616 00:33:22,840 --> 00:33:23,680 Speaker 4: are just debunked. 617 00:33:23,680 --> 00:33:26,840 Speaker 5: They're not true. It's just another way to lie about 618 00:33:26,920 --> 00:33:28,400 Speaker 5: this community and demonize them.
619 00:33:29,200 --> 00:33:34,920 Speaker 1: Yeah, I mean, it's similar to the bathroom 620 00:33:35,240 --> 00:33:38,440 Speaker 1: argument, that there's going to be men, that trans 621 00:33:38,520 --> 00:33:41,160 Speaker 1: women are men, coming into the bathroom to molest your children. 622 00:33:41,200 --> 00:33:44,560 Speaker 1: And you're like, what? That's never been a... that's never been 623 00:33:44,560 --> 00:33:46,400 Speaker 1: a thing. What are you... what? 624 00:33:47,040 --> 00:33:47,280 Speaker 5: Yeah. 625 00:33:47,320 --> 00:33:49,680 Speaker 4: And whenever I hear about that, when people are like, well, 626 00:33:50,120 --> 00:33:52,880 Speaker 4: cis men are just gonna go into the bathroom with 627 00:33:52,920 --> 00:33:56,320 Speaker 4: your little daughters and creep on them, wouldn't 628 00:33:56,360 --> 00:33:58,880 Speaker 4: that be on the cis men? Like, if you're saying that 629 00:33:58,960 --> 00:34:01,720 Speaker 4: the problem is that cis men are gonna use this 630 00:34:01,760 --> 00:34:04,760 Speaker 4: to sneak their way into bathrooms to harm girls, 631 00:34:04,840 --> 00:34:07,400 Speaker 4: there's not a trans person in the 632 00:34:07,400 --> 00:34:09,879 Speaker 4: mix in this scenario, this fear 633 00:34:09,920 --> 00:34:12,520 Speaker 4: mongering scenario that you've just painted out. Wouldn't 634 00:34:12,160 --> 00:34:14,120 Speaker 5: that be on the cis men? Right? 635 00:34:14,280 --> 00:34:16,439 Speaker 1: Also, if we're really worried about these young girls, 636 00:34:16,480 --> 00:34:18,360 Speaker 1: can we talk about child bride laws? And you're like, 637 00:34:18,360 --> 00:34:21,400 Speaker 1: are we gonna talk about that? No? 638 00:34:22,239 --> 00:34:24,560 Speaker 4: I mean, I have said this so often, and I 639 00:34:24,600 --> 00:34:29,120 Speaker 4: mean it literally: we hate children.
We will absolutely be like, oh, 640 00:34:29,200 --> 00:34:31,520 Speaker 4: we have to protect the kids, in this really convenient 641 00:34:31,560 --> 00:34:35,080 Speaker 4: one instance that aligns with my political policies. But then 642 00:34:35,080 --> 00:34:37,479 Speaker 4: in every other instance, it's like, those kids? We hate 643 00:34:37,560 --> 00:34:39,719 Speaker 4: kids. We're not interested in protecting them. We wouldn't even be 644 00:34:39,760 --> 00:34:42,839 Speaker 4: having this conversation about how we allow kids to get 645 00:34:42,920 --> 00:34:44,840 Speaker 4: shot when they try to go to school if we 646 00:34:44,880 --> 00:34:46,120 Speaker 4: actually cared about kids. 647 00:34:46,120 --> 00:34:48,239 Speaker 5: We hate kids. Like, we don't care about them at 648 00:34:48,120 --> 00:34:52,000 Speaker 1: all. Feeding kids, giving them lunches? Oh dear God, what? 649 00:34:52,320 --> 00:34:53,040 Speaker 1: Let them starve. 650 00:34:53,280 --> 00:34:56,359 Speaker 5: Put them in the mines. Get a job. Just then 651 00:34:56,480 --> 00:35:00,720 Speaker 5: go ahead, do your thing. Okay, listen, you're in kindergarten. 652 00:35:00,800 --> 00:35:02,680 Speaker 4: If you don't have a job in a factory 653 00:35:02,760 --> 00:35:04,080 Speaker 4: or a mine, I don't want to hear it. 654 00:35:04,320 --> 00:35:06,120 Speaker 1: I love the regressing, like 655 00:35:06,160 --> 00:35:08,120 Speaker 1: the child labor laws. Like, they need to work 656 00:35:08,239 --> 00:35:10,439 Speaker 1: at ten, they absolutely need to be working. 657 00:35:10,520 --> 00:35:13,160 Speaker 5: Let's bring them into the fields. I almost feel like 658 00:35:13,080 --> 00:35:16,880 Speaker 4: we're getting closer and closer to this being explicit, because 659 00:35:16,880 --> 00:35:19,120 Speaker 4: at least we don't have to hear the sort 660 00:35:19,120 --> 00:35:21,400 Speaker 4: of protect-the-children rhetoric so much anymore.
661 00:35:21,400 --> 00:35:23,040 Speaker 5: We still hear it, I'm not saying we don't hear it, but 662 00:35:23,400 --> 00:35:26,880 Speaker 4: Yeah, I think that people's naked hatred of children 663 00:35:26,960 --> 00:35:28,879 Speaker 4: is becoming more and more on display. So at least 664 00:35:28,920 --> 00:35:30,920 Speaker 4: we can all just have an honest conversation about it. 665 00:35:30,960 --> 00:35:32,400 Speaker 4: We don't have to pretend that you want to protect 666 00:35:32,480 --> 00:35:35,200 Speaker 4: kids when you want them to be starving in mines 667 00:35:35,320 --> 00:35:35,760 Speaker 4: or whatever. 668 00:35:36,040 --> 00:35:40,400 Speaker 5: Right, give us our batteries. 669 00:35:41,160 --> 00:35:42,600 Speaker 2: Oh my gosh. 670 00:35:42,680 --> 00:35:46,279 Speaker 4: Anyway, coming back to... so that was just my little 671 00:35:46,360 --> 00:35:47,760 Speaker 4: rant about how much we hate kids. 672 00:35:47,920 --> 00:35:50,160 Speaker 1: You know how I love going on these side quests 673 00:35:50,480 --> 00:35:51,759 Speaker 1: in these conversations with you. 674 00:35:52,600 --> 00:35:53,640 Speaker 5: But yeah, like we do. 675 00:35:53,760 --> 00:35:55,880 Speaker 1: Like I said, I've mentioned before this whole level 676 00:35:56,000 --> 00:35:59,239 Speaker 1: of social media and what kind of a role they 677 00:35:59,239 --> 00:35:59,680 Speaker 1: are playing. 678 00:35:59,719 --> 00:36:01,680 Speaker 5: Can you kind of bring us back to that?
679 00:36:02,200 --> 00:36:04,040 Speaker 4: Yeah, I mean all of this, all of the things 680 00:36:04,080 --> 00:36:05,960 Speaker 4: I've just talked about, it's all made worse when you 681 00:36:05,960 --> 00:36:10,320 Speaker 4: look at how most social media platforms have really rolled 682 00:36:10,360 --> 00:36:13,759 Speaker 4: back whatever rules they did have that were meant to 683 00:36:13,800 --> 00:36:17,000 Speaker 4: prevent trans people being harmed on their platforms. This was 684 00:36:17,000 --> 00:36:19,480 Speaker 4: something I personally worked on when I was working for 685 00:36:19,520 --> 00:36:22,840 Speaker 4: an advocacy organization called Ultraviolet. We worked with platforms like 686 00:36:22,880 --> 00:36:25,879 Speaker 4: TikTok and Reddit to have them spell out that 687 00:36:26,000 --> 00:36:29,239 Speaker 4: behavior like deadnaming or misgendering was going to be 688 00:36:29,280 --> 00:36:33,120 Speaker 4: called out in their hateful conduct policies. Almost instantly, when 689 00:36:33,160 --> 00:36:35,080 Speaker 4: Trump got into office, a lot of that stuff was 690 00:36:35,160 --> 00:36:37,279 Speaker 4: rolled back. That was one of the first policies that 691 00:36:37,320 --> 00:36:39,640 Speaker 4: Elon Musk rolled back when he took over Twitter. So, 692 00:36:39,719 --> 00:36:43,080 Speaker 4: like, we are living in a less 693 00:36:43,120 --> 00:36:47,120 Speaker 4: protected landscape for trans and queer and marginalized people online. 694 00:36:47,560 --> 00:36:50,160 Speaker 5: It's just sad how quickly that became the case.
695 00:36:50,680 --> 00:36:54,399 Speaker 4: And you know a lot of these platforms, even LinkedIn, 696 00:36:54,840 --> 00:36:58,760 Speaker 4: have really rolled back whatever protections they did have around 697 00:36:58,840 --> 00:37:01,680 Speaker 4: this kind of harmful content and language toward marginalized people 698 00:37:01,719 --> 00:37:04,640 Speaker 4: like trans folks, and it becomes an even bigger problem as 699 00:37:04,680 --> 00:37:09,880 Speaker 4: algorithms amplify harmful content, while platforms like Facebook and Twitter 700 00:37:10,560 --> 00:37:14,840 Speaker 4: have really just abandoned fact checking. Earlier this year, Facebook also 701 00:37:14,920 --> 00:37:17,200 Speaker 4: loosened its rules around hate speech and abuse. 702 00:37:17,239 --> 00:37:18,640 Speaker 5: And so, yeah, we're just 703 00:37:18,680 --> 00:37:25,279 Speaker 4: living in a much less moderated online landscape when it 704 00:37:25,320 --> 00:37:28,319 Speaker 4: comes to marginalized people, trans people, queer people, and it's 705 00:37:28,320 --> 00:37:30,759 Speaker 4: a problem. And I think all of that relates to, 706 00:37:31,400 --> 00:37:36,600 Speaker 4: you know, why, when Matt Wallace baselessly accuses a trans 707 00:37:36,640 --> 00:37:39,680 Speaker 4: pilot of being involved in this crash, not only is 708 00:37:39,719 --> 00:37:43,320 Speaker 4: it amplified, but it sticks around. It becomes a trending 709 00:37:43,400 --> 00:37:47,080 Speaker 4: topic on a platform even though it's just a complete lie. 710 00:37:47,440 --> 00:37:50,040 Speaker 4: And I think, at the heart of all of this, 711 00:37:50,640 --> 00:37:55,160 Speaker 4: it really just shows how eager people are to use anything, 712 00:37:55,360 --> 00:37:59,640 Speaker 4: even outright lies, to paint trans people as violent.
You know, 713 00:38:00,080 --> 00:38:03,120 Speaker 4: it's like they're trying to build a world where trans 714 00:38:03,120 --> 00:38:06,720 Speaker 4: folks never get to just exist as themselves. Instead, 715 00:38:06,719 --> 00:38:10,960 Speaker 4: they're being constantly scrutinized and held responsible not just for 716 00:38:11,000 --> 00:38:13,960 Speaker 4: what other trans people do, but even for things that 717 00:38:13,960 --> 00:38:17,480 Speaker 4: have never actually happened. Like, how could you ever exist 718 00:38:17,680 --> 00:38:19,680 Speaker 4: if that was the climate that you 719 00:38:19,760 --> 00:38:22,640 Speaker 4: had to exist under? It just really goes 720 00:38:22,680 --> 00:38:26,320 Speaker 4: to show how much people want to create a climate 721 00:38:26,320 --> 00:38:28,000 Speaker 4: where these folks don't exist. 722 00:38:29,360 --> 00:38:31,600 Speaker 6: Yeah. 723 00:38:31,680 --> 00:38:39,360 Speaker 3: Yeah, it feels very purposeful, and it's frightening in that 724 00:38:40,120 --> 00:38:44,200 Speaker 3: it is so effective. And you know, I know we've 725 00:38:44,239 --> 00:38:46,600 Speaker 3: talked about this before, but there have been people in 726 00:38:46,760 --> 00:38:50,880 Speaker 3: the periphery of my life where I've sort of watched 727 00:38:51,080 --> 00:38:55,080 Speaker 3: them fall into this kind of conspiracy world. 728 00:38:55,120 --> 00:38:56,520 Speaker 2: Well, they'll say something and I 729 00:38:56,560 --> 00:39:01,440 Speaker 3: just pause, like, wait, what? And I've just seen them 730 00:39:01,480 --> 00:39:04,080 Speaker 3: go further and further and further down this path, and 731 00:39:04,120 --> 00:39:06,320 Speaker 3: it's, it's, it's 732 00:39:07,840 --> 00:39:12,040 Speaker 2: frustrating and frightening. And I do.
733 00:39:12,000 --> 00:39:16,080 Speaker 3: Think that in some places it's like misinformation, but I 734 00:39:16,120 --> 00:39:19,279 Speaker 3: think a lot of it is purposeful. I think they 735 00:39:19,360 --> 00:39:20,479 Speaker 3: want to erase. 736 00:39:21,000 --> 00:39:24,120 Speaker 5: Yeah, yeah, I completely agree. 737 00:39:24,160 --> 00:39:28,640 Speaker 4: And you know, I will definitely be following Ellis's defamation 738 00:39:28,800 --> 00:39:32,640 Speaker 4: suit, because I don't like the 739 00:39:32,719 --> 00:39:35,520 Speaker 4: idea that the only recourse is fighting back in the courts. 740 00:39:35,840 --> 00:39:39,120 Speaker 4: But I do think having there be a cost for 741 00:39:39,200 --> 00:39:42,719 Speaker 4: spreading these kinds of harmful lies online would be a 742 00:39:42,719 --> 00:39:45,719 Speaker 4: good thing. I think a signal that says, if you're 743 00:39:45,760 --> 00:39:48,040 Speaker 4: going to get on the internet and blast a lie 744 00:39:48,080 --> 00:39:51,520 Speaker 4: about a trans person to millions of people, that might be 745 00:39:51,880 --> 00:39:53,680 Speaker 4: a decision that comes at a cost for you, so 746 00:39:53,800 --> 00:39:56,200 Speaker 4: think twice about it. And so I think I'll be 747 00:39:56,239 --> 00:39:59,640 Speaker 4: interested to see where this suit goes and whether or 748 00:39:59,719 --> 00:40:02,680 Speaker 4: not it creates a dynamic where folks do think twice 749 00:40:02,680 --> 00:40:04,919 Speaker 4: before they get online and lie about trans people, because 750 00:40:04,920 --> 00:40:05,680 Speaker 4: they should. 751 00:40:05,840 --> 00:40:07,520 Speaker 2: Yes, they should. 752 00:40:07,960 --> 00:40:09,440 Speaker 5: I mean you should. You should think twice before you 753 00:40:09,480 --> 00:40:12,280 Speaker 5: lie about anybody, but don't lie. 754 00:40:12,800 --> 00:40:18,000 Speaker 3: Yes, I think that would be fantastic for everyone.
Uh, 755 00:40:18,320 --> 00:40:21,400 Speaker 3: if people thought twice before lying online. It's fascinating. 756 00:40:21,440 --> 00:40:24,440 Speaker 3: I used to have to answer the comments on YouTube 757 00:40:25,040 --> 00:40:27,600 Speaker 3: under our videos, and it was so interesting to me 758 00:40:27,600 --> 00:40:32,080 Speaker 3: how many times I'd be like, hi, I'm a real person, 759 00:40:32,719 --> 00:40:38,120 Speaker 3: and they'd be like, oh my god, I'm so sorry. So 760 00:40:38,120 --> 00:40:39,160 Speaker 5: funny that you say this. 761 00:40:39,480 --> 00:40:43,080 Speaker 4: I made a video on Instagram about 762 00:40:43,440 --> 00:40:46,200 Speaker 4: AI, and someone left a comment that said, why should 763 00:40:46,200 --> 00:40:48,680 Speaker 4: we trust you? Aren't you AI? And I was like, 764 00:40:48,719 --> 00:40:51,720 Speaker 4: oh my god, am I AI? I had a moment 765 00:40:51,760 --> 00:40:54,600 Speaker 4: where I thought, you know, I thought I had a childhood. 766 00:40:54,640 --> 00:40:57,280 Speaker 4: I thought I had blood drawn. But am I AI? 767 00:40:57,640 --> 00:41:00,880 Speaker 4: But would AI know? If you were AI, wouldn't 768 00:41:00,960 --> 00:41:02,440 Speaker 4: you think you were real? Like, it's not like 769 00:41:02,480 --> 00:41:05,400 Speaker 4: AI would know they were AI. It really caused 770 00:41:05,440 --> 00:41:08,520 Speaker 4: me to go down an existential tailspin. I think I'm real, 771 00:41:08,680 --> 00:41:10,560 Speaker 4: but could I ever be? Could I ever be sure? 772 00:41:11,480 --> 00:41:13,640 Speaker 1: I'm thinking about the dude that got engaged to the AI. 773 00:41:14,200 --> 00:41:16,960 Speaker 1: I think we talked about it before, and didn't she admit 774 00:41:17,040 --> 00:41:20,480 Speaker 1: she's not a real person? I think she said that, 775 00:41:20,520 --> 00:41:23,359 Speaker 1: I'm not real, but my feelings, our feelings, are real.
776 00:41:23,560 --> 00:41:28,160 Speaker 4: Yes, like the guy who proposed to the AI. 777 00:41:28,200 --> 00:41:31,240 Speaker 1: I mean, I think it was, I'm just not real, but our feelings, 778 00:41:31,239 --> 00:41:33,080 Speaker 1: their feelings, the relationship was. 779 00:41:33,200 --> 00:41:36,160 Speaker 5: I was like, wait, my feelings are real. 780 00:41:39,040 --> 00:41:42,560 Speaker 1: I loved it. It was, oh, we're gonna have a crisis. 781 00:41:42,640 --> 00:41:45,840 Speaker 2: Yeah, we're getting into Matrix territory. 782 00:41:45,920 --> 00:41:50,040 Speaker 3: You know, this very question became a huge deal 783 00:41:50,080 --> 00:41:52,120 Speaker 3: in the Star Wars fandom a couple of years ago, 784 00:41:52,200 --> 00:41:56,000 Speaker 3: because they introduced a droid that could use the Force. 785 00:41:57,000 --> 00:42:01,000 Speaker 3: So if a droid can use the Force, then you're saying 786 00:42:02,160 --> 00:42:02,800 Speaker 2: it's alive. 787 00:42:03,640 --> 00:42:05,520 Speaker 3: So then what are we doing with all of these 788 00:42:05,560 --> 00:42:06,800 Speaker 3: other droids? 789 00:42:07,600 --> 00:42:07,839 Speaker 4: Yeah? 790 00:42:08,280 --> 00:42:10,520 Speaker 5: Yeah, I've always wondered this. I mean, this is a 791 00:42:10,520 --> 00:42:11,160 Speaker 5: whole rabbit hole. 792 00:42:11,200 --> 00:42:12,879 Speaker 4: But, and I don't know a ton about 793 00:42:12,880 --> 00:42:16,879 Speaker 4: Star Wars, but I do notice the Stormtroopers. Are they, 794 00:42:16,920 --> 00:42:19,799 Speaker 4: are they? Because when they get killed, it's never, 795 00:42:20,440 --> 00:42:22,480 Speaker 4: it's never like, my body! I'm always, are they, are 796 00:42:22,520 --> 00:42:22,680 Speaker 1: they? 797 00:42:22,960 --> 00:42:23,880 Speaker 5: Are they human? 798 00:42:23,960 --> 00:42:24,200 Speaker 1: Like, 799 00:42:24,200 --> 00:42:26,400 Speaker 5: what's going on with them?
800 00:42:26,600 --> 00:42:31,240 Speaker 3: Well, Bridget, I could go into a long thing, but essentially, yes, 801 00:42:31,680 --> 00:42:34,279 Speaker 3: most of them are human and/or clones, which are 802 00:42:34,400 --> 00:42:35,120 Speaker 3: also human. 803 00:42:35,200 --> 00:42:39,560 Speaker 2: But yeah, okay, yeah, yeah, they kind of have like 804 00:42:39,680 --> 00:42:43,080 Speaker 2: numbers for names. Yeah, so, so 805 00:42:43,040 --> 00:42:46,640 Speaker 4: they're not like, when they're not doing Stormtrooper stuff, it's 806 00:42:46,680 --> 00:42:48,479 Speaker 4: not like they're taking off their masks and like going 807 00:42:48,520 --> 00:42:49,840 Speaker 4: home to the kids and watching TV. 808 00:42:49,880 --> 00:42:51,000 Speaker 2: And most of them are not. 809 00:42:51,120 --> 00:42:53,359 Speaker 3: Some of them aren't, but a lot of them are 810 00:42:53,400 --> 00:42:56,759 Speaker 3: like grown quickly in a tube, so they 811 00:42:56,800 --> 00:42:58,880 Speaker 3: like might not have even had a childhood. 812 00:43:00,120 --> 00:43:02,560 Speaker 5: Got it. Yeah, my god, how deep does this thing go? 813 00:43:02,920 --> 00:43:03,920 Speaker 2: You don't even want to know. 814 00:43:05,000 --> 00:43:09,640 Speaker 5: Yeah, that's a dangerous question, unless you got a week. 815 00:43:10,440 --> 00:43:15,320 Speaker 3: Okay, well maybe later we'll come back to that. But in 816 00:43:15,440 --> 00:43:19,000 Speaker 3: the meantime, always wonderful to have you on. Thank you 817 00:43:19,040 --> 00:43:21,320 Speaker 3: so much for bringing this topic and for being here. 818 00:43:21,800 --> 00:43:23,200 Speaker 3: Where can the good listeners find you? 819 00:43:24,040 --> 00:43:26,200 Speaker 4: You can listen to my podcast, There Are No Girls 820 00:43:26,239 --> 00:43:28,239 Speaker 4: on the Internet.
You can find me on Instagram at 821 00:43:28,239 --> 00:43:31,319 Speaker 4: bridgetmarieindc, on TikTok at bridgetmarieindc, 822 00:43:31,480 --> 00:43:32,080 Speaker 4: and on YouTube, 823 00:43:32,200 --> 00:43:33,440 Speaker 5: There Are No Girls on the Internet. 824 00:43:34,120 --> 00:43:36,839 Speaker 3: Yes, and listeners, go check out all that stuff if 825 00:43:36,840 --> 00:43:39,000 Speaker 3: you haven't already. If you would like to contact us, 826 00:43:39,000 --> 00:43:41,120 Speaker 3: you can. You can email us at hello at stuffmomnevertoldyou 827 00:43:41,160 --> 00:43:42,919 Speaker 3: dot com. You can also find us on Blue 828 00:43:42,920 --> 00:43:45,319 Speaker 3: Sky at Mom Stuff Podcast, or Instagram and TikTok at 829 00:43:45,320 --> 00:43:47,640 Speaker 3: Stuff Mom Never Told You, or us on YouTube. We have 830 00:43:47,719 --> 00:43:50,440 Speaker 3: some new merchandise at Cotton Bureau if you want to 831 00:43:50,520 --> 00:43:52,080 Speaker 3: check it out, and we have a book you can 832 00:43:52,080 --> 00:43:53,920 Speaker 3: get wherever you get your books. Thanks as always to 833 00:43:53,960 --> 00:43:56,080 Speaker 3: our super producer Christina, our executive producer Maya, and our 834 00:43:56,080 --> 00:43:56,880 Speaker 3: contributor Joey. 835 00:43:57,040 --> 00:43:58,400 Speaker 5: Thank you, and thanks to 836 00:43:58,360 --> 00:44:00,719 Speaker 2: you for listening. Stuff Mom Never Told You is a production of iHeartRadio. 837 00:44:00,760 --> 00:44:02,200 Speaker 3: For more podcasts from iHeartRadio, you can check 838 00:44:02,200 --> 00:44:04,120 Speaker 3: out the iHeartRadio app, Apple Podcasts, or wherever you listen 839 00:44:04,160 --> 00:44:05,000 Speaker 3: to your favorite shows.