Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. Today we are thrilled to be joined by the amazing, the wonderful Joey, who helps us with TikTok and researching episodes. Obviously, so glad to have you here.

Thank you. Yeah, I'm excited to be here.

Yes, yes, yes, we're very excited to have you. You know, heads up, you can probably tell by the title that this is a very controversial topic, for sure. Not a sponsor, also, I hope obviously, but you know, not a sponsor. We would be on fire if they were. Yes. But this is something we really wanted to talk about, and I had been intending, I had an episode kind of planned about it, but it was going to be much more of a personal take. I know we're gonna talk about this later, but kind of a personal take on grieving and fandom and what that looks like. But all of that said, very excited to talk about this. No, it's controversial, but come with us on this journey, listeners.
Joey, do you want to introduce yourself? Anything that I missed that I should have said?

Yeah, hi, I'm Joey. I don't think you're really missing anything. I'm a producer here at iHeart, so I do research for Sminty from time to time, and I also run the TikTok, so you should all go follow that if you haven't already.

Yes, yes, yes, yes. You have been such a good sport, because we just essentially send you very random long clips and are like, maybe this, or really that, and you have to go on and make it into something amazing. Like, any crying, it's fine, everything.

Yeah, I didn't get to do that. I don't know, the only one so far was a little Valentine's Day one I did. So it's fun. It's fun. It's definitely giving me an excuse to spend a lot more time on TikTok too, which, you know, there are positives and negatives to that. But it depends.

Well, there is a lot of ground to cover with this. So let's get into it. Can you tell us what we're talking about today?

Yeah. So quick, just a trigger warning off the top.
I'm going to be getting into a lot of transphobic violence, both interpersonal and on a state level, and some brief discussion of sexual assault, nothing too explicit, but transphobia is the big thing. Yeah, I'm going to talk about the whole ongoing, I guess, controversy regarding the Harry Potter franchise as a whole, but particularly this video game that came out recently. And yeah, as Annie kind of mentioned, also getting into talking about fandom and what you do when this sort of thing happens, and how to kind of ethically, or, you know, somewhat ethically, interact with this kind of media.

Yeah, yeah. And I think we're gonna have some personal stuff throughout, but I'll just say at the top, we've talked about this several times, not this specifically, but, you know, everything has its problematic issues, and I always say people can do a lot of mental gymnastics. I guess what I'm saying ultimately is there's no one set, one-size-fits-all, prescriptive thing.
And I know that a lot of hurt has been done by this, so just to acknowledge that. But, I don't know, it's just kind of a complicated issue.

I mean, and we're gonna get into it. I don't think there's really a right answer, which is the frustrating part. And again, you'll see as I go through a lot of the background, it was really frustrating writing this whole thing, because it really is a lot of back and forth between a lot of these really violent, terrible things that are happening to trans people, and then talking about nostalgic culture and pop culture and how that affects people, which is also important. But, you know, it gets really complicated, and I don't want to deny the experiences of anybody. There's a lot of layers to this, as we're going to get into.

Yes, yes. Well, let's get into it. Can you give us kind of an overview of what has been happening, what has brought us to where we are today?

So yeah, this whole, I'm going to keep using the word controversy.
I don't know how to feel about the word, but anyways, this whole controversy started around twenty seventeen, twenty eighteen, when fans of the Harry Potter series noticed that JK Rowling had liked a number of tweets attacking the trans rights movement, particularly tweets attacking trans women. Apparently at the time she said this was a mistake, but, I mean, we'll see where this is going. And then in December twenty nineteen it really started. JK Rowling tweeted her support for a woman who had been fired from the Center for Global Development in the UK for her transphobic comments, and despite responses from LGBTQ plus activists and allies, Rowling refused to back down from her stance. And then in twenty twenty, things really started to spiral. Fun for everybody. In June, Rowling tweeted something that, honestly, I'm just gonna say I thought was a pretty childish response to an article that came out that was titled "Opinion: Creating a more equal post-COVID-19 world for people who menstruate." Rowling was upset over the use of the phrase "people who menstruate" rather than "women."
The article in particular was pointing to how, you know, women, non-binary people, trans people, people who menstruate have been particularly affected by the COVID-19 pandemic, and to a lot of the dangerous consequences of this taboo around menstrual health, which I think is a thing we should be talking about. I haven't read the article, but I'm sure it was, you know, pretty well put together. But yeah, Rowling, instead of, you know, promoting that kind of thing, just decided to take issue with the phrase "people who menstruate." Which I think is important to remember, because when we talk about this, when we talk about TERFs and transphobic arguments in particular, I think it's important to remember they usually end up hurting cis women too, and end up promoting patriarchy in kind of weird ways. Like, again, this was an article that was talking about a real issue that was affecting a lot of cis women and trans people alike. But, yeah, I don't know.
And then also, just a really quick note: I think the word TERF has made its way into the mainstream, kind of along with, you know, the fact that the trans rights movement has gotten a lot more attention. TERF stands for trans-exclusionary radical feminist. I think it's kind of become a catchall for "transphobe," but it particularly describes a brand of transphobia that disguises itself as feminism, usually a pretty, you know, generally exclusionary feminism. And when we talk about the harm that TERFism in particular has caused, it's important to remember that it disguises itself as feminism, and that's part of where a lot of the harm comes in. Yeah. So back to Rowling. Following those tweets, in twenty twenty, JK Rowling posted an article on her blog detailing her transphobic views, making a lot of really terrible comments that I'm not going to get into, because I don't think there's much benefit in sort of just parroting those ideas.
But yeah, at the time, again kind of going back to this whole TERF thing, she takes a very "liberal," quote unquote, stand. It's very much "it's because I support women," and she does at one point say something like, "I support trans people who are victims of assault and violence." It's a very weird argument, where it's sort of like, "I support trans people, except for the fact that I don't." But yeah, I think at the time she was using a lot of fake niceties to be like, "I'm still progressive, I just don't think these people should exist in public," and saying that trans women were basically just men, and all this other terrible stuff. Yeah, this article was kind of a turning point. A bunch of actors in the series ended up speaking out against her, Daniel Radcliffe and Emma Watson notably, who were both, you know, the leads of those movies.
So I think it was interesting that a lot of the younger generation particularly were speaking out against her. Other celebrities were speaking in support, which was not fun. And also at this point, the Fantastic Beasts movies were still coming out, which were spinoffs from the original Harry Potter franchise; I think she had actually written the screenplays for those. Eddie Redmayne was the lead of those, and he made a really vague statement, basically being like, "I support trans people." He literally became super famous because he played a trans woman in The Danish Girl. So, I don't know, that was definitely weird too, where, as a lot of people pointed out, this is especially relevant when we talk about cis actors playing trans characters. This is a reason why people are so critical of it, because a lot of the time it's like, great, this guy gained so much fame from playing a trans woman, and then refuses to actually make a statement in support of trans people, refuses to remove himself from the series that is benefiting this person.
Yeah. Anyways, frustrating.

Yeah, there's a lot of exploitation when it comes to bettering their own careers without having any sentiment or actual courage to be allies for things like this. And that's the big problem when it comes to the overall casting of non-representative actors for characters that should be representative.

Right. And I bring up this franchise in particular, like, we're gonna talk about the video game because that's the most recent thing that's come out of the Harry Potter franchise, but the Fantastic Beasts movies for a while were the main thing coming out from the series that was continuing JK Rowling's sort of relevance at the time. So then, since twenty twenty, JK Rowling has just doubled down on her hateful comments. She's totally dropped that sort of fake nicety of "I support some trans people," and she started outright targeting specific activists on Twitter, and sometimes just trans people, sometimes not even activists in particular, literally just trans people trying to live their lives.
She's also used her platform to speak out against specific legislation, notably Scotland's Gender Recognition Act in twenty twenty two, which we're going to get to, because that's a whole thing. And then also, this past year, in twenty twenty two, the third Fantastic Beasts movie came out, and that movie ended up flopping, and they ended up canceling any further movies that would have come out. I don't know if this was because of all of this. I think at the time I was sort of like, this is great, people are actually responding to the fact that JK Rowling is saying all this terrible stuff and then not seeing the movie, and then this video game made me think, maybe not. There were a lot of other issues with those movies. I also just don't think they were very good. I don't know, I didn't end up seeing them, so who's to say. But yeah, that's kind of where we were at the beginning of twenty twenty three. So, on to the topic of this episode. Development of the game began in twenty eighteen, so kind of before this became, at least, a widely known issue.
I guess it officially came out last month. JK Rowling wasn't involved in the development of the game, but she gets royalties from it, which I want to point out because I've seen a lot of really ridiculous Twitter arguments of people being like, well, she wasn't directly involved in the game, so it doesn't matter. And it's like, no, she still gets royalties, she still makes money from it.

Yeah, yeah. You have to admit, no matter what, she's a smart businesswoman, and she makes sure to copyright the hell out of this series and makes millions of dollars. And she even tweeted, in response to people who were criticizing her in her comments, that it doesn't matter, "I'm still gonna make money," literally taunting people. So, I have very strong opinions, and that is not nice, so I'm gonna leave it at that. But because of things like that, it's kind of like, no, she knows. She knows she's gonna make money, and she makes sure to let you know that no matter what you do, she's gonna stay rich.

Oh, for sure. Yeah.
So basically, there were calls to boycott the game in the months leading up to it coming out, and that was the thing people kept pointing out. Yeah, there was that tweet from her where she was like, "I still make money, so I don't care, I can say whatever I want," which is part of the problem. Because, yeah, I don't know. I think it's important for people to remember that, because, again, I'm a very online person, so I spend a lot of time reading through Twitter arguments and stuff like this, which is probably not good for my brain or whatever. But I've seen a lot of people that are just so caught up in this whole "well, she wasn't directly involved in creating it." It's like, but the point is the money. The point is she's making money, you're keeping her relevant. She sees this money as, "I can keep saying all this stuff and there's no consequences." I don't know.
Yeah. So there was a trans YouTuber named Jessie Earl who had tweeted that any support of the Harry Potter franchise's current projects, while JK Rowling is in charge of it and using her ongoing platform to target trans people, is harmful to trans people. I think this was around the end of twenty twenty two, and then JK Rowling responded to it on Twitter with a really nasty response, and that sort of again sparked a lot of the "discourse," quote unquote, about the game leading up to its release. Also, just really quick about this video game, because everybody forgot this really quickly: back in like May last year, I remember there was some article that came out about the plot of the video game. The video game came under fire because it was using a lot of really antisemitic tropes. A lot of them were from the Harry Potter series, but they were really focusing in on that sort of plot, particularly with the goblins.
So I mean, that's kind of a big conversation, that people were willing to ignore so much in the original, and I'm one of them, because I also really enjoyed it. I wasn't there at the beginning, because I was college age when this was happening, when it first started, but I got in towards the middle. And then as you start looking at it, you're like, oh, you see that she's actually being racist and antisemitic, and oh, she talks about enslaved peoples, and what is happening, and nothing is really done about it. Even the quip about it is just thrown out there, and you're like, why are we okay with this? We ignored it for so long, until she outright said something, and then you're like, yeah, yeah, yeah, we really were blind to her darkness for a while.

Yeah. And again, with the whole thing, a lot of this was coming from the series, and I was a kid when these were coming out. I remember, like, I didn't know how to pick up on this, and I think a lot of people didn't either.
I will say, I do understand the whole, like, it wasn't that long ago, but it was a little bit of a different time, so people make mistakes sometimes. There's a whole conversation to be had about how, if we're focusing on the goblin stuff, a lot of that is just from European mythology being very antisemitic, rather than, I think, JK Rowling in particular, although I do also think that she may well be antisemitic. But anyways, the books came out like twenty years ago, the movies also came out about twenty years ago, whatever. She had time to be like, "oh, I made a mistake, whoops, my bad." But no, it's gotten to the point where, again, she wasn't directly making this video game, but the company that was making it, rather than being like, "hey, people have pointed out that this is kind of problematic," they just kept using it. And I don't know, I feel like I'm willing to say, if you made a mistake twenty years ago and are like, "I'm sorry, we'll try and make the series more inclusive," that's one thing.
If you're still going to continue to perpetuate it, I don't know, it's weird.

I mean, there's one thing about accountability and taking responsibility. When you don't see that at all, then it doesn't really help the situation. It definitely garners some more criticism, and it should, for sure.

So yeah, leading up to the video game's release, there was a variety of different responses from people, I guess around January. And, okay, disclaimer real quick: I don't know anything about video games. We went over this before we started recording, but I'm not a big gamer, so Annie and Samantha, feel free to correct me on any of this. But I guess some users were posting reviews on Steam, which is a game distributor. Yeah. They were using tags in the reviews that would make the game less popular, so like "psychological horror," "NFT." Apparently "capitalism" was one of the tags. I don't know, I just thought that was funny. And then they also created a "transphobia" tag for the game. Unfortunately, this kind of campaign wasn't super successful.
But I don't know, I think it's 327 00:18:40,760 --> 00:18:43,200 Speaker 1: admirable that people were trying to, like, protest the game. 328 00:18:43,520 --> 00:18:46,600 Speaker 1: And you guys have talked about review bombing and how 329 00:18:46,680 --> 00:18:51,240 Speaker 1: review bombing can be bad on the podcast. But also, I don't know, 330 00:18:51,480 --> 00:18:56,760 Speaker 1: I feel like sometimes, I don't know, it can be complicated, 331 00:18:57,320 --> 00:18:59,399 Speaker 1: so I'm not totally mad that people were trying to 332 00:18:59,400 --> 00:19:01,680 Speaker 1: do that with this game. A couple of streamers also, 333 00:19:02,080 --> 00:19:04,399 Speaker 1: like a couple of really prominent streamers, were talking about 334 00:19:05,680 --> 00:19:08,480 Speaker 1: streaming the game to raise money for a trans charity 335 00:19:08,640 --> 00:19:11,120 Speaker 1: organization and then ended up receiving a lot of backlash. 336 00:19:11,119 --> 00:19:13,159 Speaker 1: I think a couple of streamers ended up actually doing this, 337 00:19:13,680 --> 00:19:17,080 Speaker 1: but Hasan Piker was one that was, like, really prominent. 338 00:19:17,080 --> 00:19:18,560 Speaker 1: And again, like, I don't play video games, but I know 339 00:19:18,600 --> 00:19:21,399 Speaker 1: who he is, like, he's a pretty prominent, 340 00:19:21,480 --> 00:19:24,480 Speaker 1: like, social media person.
He apparently had said that he 341 00:19:24,640 --> 00:19:26,399 Speaker 1: was, like, thinking about streaming it and then was like, 342 00:19:26,880 --> 00:19:28,480 Speaker 1: never mind, I'm not going to do it because people 343 00:19:28,480 --> 00:19:31,080 Speaker 1: are bullying me online. And I was like, all right. 344 00:19:31,240 --> 00:19:33,639 Speaker 1: Like, I think people were just like, maybe that's not 345 00:19:33,760 --> 00:19:37,080 Speaker 1: the best way to, like, I don't know, like, if 346 00:19:37,240 --> 00:19:39,920 Speaker 1: trans creators are asking people to boycott it, maybe you 347 00:19:39,960 --> 00:19:43,359 Speaker 1: should just boycott it instead of streaming it, I don't know. 348 00:19:44,160 --> 00:19:47,840 Speaker 1: And then, in a last effort to appease critics, the game 349 00:19:48,119 --> 00:19:56,000 Speaker 1: added a trans character, which, so it's like a side character. 350 00:19:56,119 --> 00:19:58,399 Speaker 1: I guess they show up in like one scene. The 351 00:19:58,480 --> 00:20:02,600 Speaker 1: character is named Sirona Ryan, and I guess Sirona 352 00:20:02,880 --> 00:20:05,359 Speaker 1: is the name of a Celtic goddess of healing and rebirth, 353 00:20:05,440 --> 00:20:08,560 Speaker 1: and that's where the name is from. But yeah, Samantha, 354 00:20:08,640 --> 00:20:11,960 Speaker 1: as you mentioned, um, J.K. Rowling is a bit infamous 355 00:20:12,080 --> 00:20:16,320 Speaker 1: for her, like, wildly on-the-nose names for minority characters, 356 00:20:16,520 --> 00:20:18,399 Speaker 1: and this was one where people were like, hm, you're 357 00:20:18,480 --> 00:20:22,520 Speaker 1: naming your trans character Sirona, like, that starts with "sir." 358 00:20:22,720 --> 00:20:26,040 Speaker 1: And then Ryan, which is like a male first name. Again, 359 00:20:26,040 --> 00:20:27,920 Speaker 1: I don't think J.K. Rowling was the one naming it.
I 360 00:20:27,960 --> 00:20:29,920 Speaker 1: don't think she would have wanted this character to even 361 00:20:29,960 --> 00:20:32,960 Speaker 1: be involved in the game. But um, I don't know. 362 00:20:33,000 --> 00:20:34,480 Speaker 1: There are a lot of, like, really funny tweets of, 363 00:20:34,560 --> 00:20:39,080 Speaker 1: like, rejected names. They were all, like, very, like, they're puns. Yeah, 364 00:20:39,440 --> 00:20:46,600 Speaker 1: they're puns. But yeah, I mean, there's so much, there's 365 00:20:46,600 --> 00:20:48,920 Speaker 1: so much to unpack with that, like how that could 366 00:20:48,960 --> 00:20:50,920 Speaker 1: just go so wrong. I kind of would have been like, 367 00:20:51,760 --> 00:20:53,800 Speaker 1: all right, we get it, you made an effort, but 368 00:20:54,400 --> 00:20:56,480 Speaker 1: maybe it's just better not to, just let it, 369 00:20:56,720 --> 00:20:59,439 Speaker 1: just let it be. Because the people who you are 370 00:20:59,480 --> 00:21:01,760 Speaker 1: trying to cater to are gonna be the most, 371 00:21:01,800 --> 00:21:03,960 Speaker 1: like, this is more insulting, because this is like a 372 00:21:04,160 --> 00:21:07,800 Speaker 1: level of, like, such a patronizing attempt to be like, 373 00:21:07,960 --> 00:21:10,639 Speaker 1: but here, we did something, yay. She's still going 374 00:21:10,720 --> 00:21:13,280 Speaker 1: to get millions of dollars. Yeah, but there's a character 375 00:21:13,359 --> 00:21:15,120 Speaker 1: for you that you're really not going to play, here 376 00:21:15,160 --> 00:21:17,640 Speaker 1: you go. For sure. Yeah, yeah. Like, I don't even 377 00:21:17,640 --> 00:21:18,959 Speaker 1: think it's a character you can play, like, I think 378 00:21:19,160 --> 00:21:23,080 Speaker 1: it literally is, like, like a bartender you come across. So 379 00:21:23,240 --> 00:21:26,600 Speaker 1: it's, yeah. And I don't know, it was so weird.
380 00:21:26,880 --> 00:21:30,560 Speaker 1: It was a really funny day on Twitter. But yeah, 381 00:21:30,560 --> 00:21:34,840 Speaker 1: it was. It was weird. It was a weird decision, definitely. Yeah. 382 00:21:34,920 --> 00:21:38,520 Speaker 1: So why does this all matter? This is a video game. 383 00:21:39,200 --> 00:21:42,080 Speaker 1: Going back to Jessie Earl, who's that YouTuber that Rowling 384 00:21:42,640 --> 00:21:45,920 Speaker 1: had publicly harassed on her Twitter, she had told The 385 00:21:46,160 --> 00:21:49,680 Speaker 1: Washington Post that buying this game, quote, keeps Rowling relevant, 386 00:21:49,840 --> 00:21:51,879 Speaker 1: and then she went on to say she's equating the 387 00:21:51,920 --> 00:21:55,199 Speaker 1: relevancy with people supporting her views, and her views are 388 00:21:55,240 --> 00:21:58,840 Speaker 1: directly harmful and attacking and doing damage to the trans community. 389 00:22:00,119 --> 00:22:02,000 Speaker 1: Which again, like, as I said, part of the reason 390 00:22:02,040 --> 00:22:04,760 Speaker 1: that J.K. Rowling keeps doing this is because so far, 391 00:22:05,760 --> 00:22:07,520 Speaker 1: we, as the audience, have shown her that there's not 392 00:22:07,600 --> 00:22:11,560 Speaker 1: going to be any, like, financial harm to her 393 00:22:11,960 --> 00:22:14,960 Speaker 1: continuing to say these kinds of things and, you know, 394 00:22:15,400 --> 00:22:20,040 Speaker 1: harassing trans people and really becoming this, like, spokesperson for 395 00:22:20,160 --> 00:22:24,600 Speaker 1: a hate movement. Again, I pointed to that tweet from Rowling 396 00:22:24,640 --> 00:22:26,280 Speaker 1: where she sort of was like, I don't care, I 397 00:22:26,400 --> 00:22:29,800 Speaker 1: get my royalty checks and I'm fine. So I don't know.
398 00:22:30,119 --> 00:22:32,679 Speaker 1: So yeah, the one side of it is it's 399 00:22:32,760 --> 00:22:35,439 Speaker 1: keeping her relevant, it's showing her that she can use 400 00:22:35,480 --> 00:22:40,480 Speaker 1: her platform to continue espousing hate and normalizing that hate. 401 00:22:40,920 --> 00:22:43,040 Speaker 1: But then also, it goes a lot deeper than that. 402 00:22:43,640 --> 00:22:47,200 Speaker 1: The same Washington Post article stated that in twenty twenty, 403 00:22:47,560 --> 00:22:51,760 Speaker 1: Senator James Lankford in the US voted against Senate 404 00:22:51,800 --> 00:22:56,240 Speaker 1: consideration of the Equality Act, an LGBTQ+ civil 405 00:22:56,320 --> 00:22:59,480 Speaker 1: rights bill, citing a Rowling blog post. That bill, by 406 00:22:59,520 --> 00:23:02,520 Speaker 1: the way, still hasn't passed. I think it passed the House, 407 00:23:02,600 --> 00:23:04,200 Speaker 1: but it's kind of in limbo with, like, the Senate 408 00:23:04,320 --> 00:23:09,320 Speaker 1: right now. And then the article also pointed to, last month, 409 00:23:10,119 --> 00:23:12,200 Speaker 1: the British government blocked a Scottish bill that would have 410 00:23:12,240 --> 00:23:14,240 Speaker 1: made it easier for people to change their legal gender. 411 00:23:14,359 --> 00:23:18,240 Speaker 1: That was that Scottish gender recognition bill that Rowling was 412 00:23:18,280 --> 00:23:22,000 Speaker 1: really outspoken against. And in her twenty twenty blog article, 413 00:23:22,160 --> 00:23:26,399 Speaker 1: she had said that the bill in effect means that 414 00:23:26,920 --> 00:23:29,360 Speaker 1: all a man needs to, quote, become a woman is 415 00:23:29,359 --> 00:23:32,879 Speaker 1: to say that he is one, which isn't accurate to 416 00:23:33,119 --> 00:23:38,320 Speaker 1: the actual bill. Yeah. And again, that bill last 417 00:23:38,400 --> 00:23:42,520 Speaker 1: month was blocked by UK Parliament.
I did read one 418 00:23:42,560 --> 00:23:45,119 Speaker 1: source that said that the particular legislation they used to 419 00:23:45,200 --> 00:23:47,560 Speaker 1: block it, this is the first time they've used that legislation, so, 420 00:23:47,680 --> 00:23:53,480 Speaker 1: you know, cool, super fun. And again, yeah, the Equality 421 00:23:53,480 --> 00:23:56,000 Speaker 1: Act in the US still hasn't passed, so it's going 422 00:23:56,040 --> 00:23:57,639 Speaker 1: back and forth between the US and the UK here. 423 00:23:57,680 --> 00:24:01,400 Speaker 1: But it's sort of like, on both sides, her rhetoric 424 00:24:01,640 --> 00:24:05,879 Speaker 1: is being used to argue against specific legislation that literally 425 00:24:05,880 --> 00:24:08,119 Speaker 1: would just make it easier to exist as a trans person. 426 00:24:08,560 --> 00:24:12,040 Speaker 1: Neither of these bills gets into, like, healthcare or anything like that. 427 00:24:12,240 --> 00:24:14,880 Speaker 1: The Scottish bill in particular 428 00:24:15,359 --> 00:24:16,959 Speaker 1: would have just made it easier to change your name 429 00:24:17,000 --> 00:24:19,880 Speaker 1: on official documents and change your gender on official documents. 430 00:24:20,400 --> 00:24:23,920 Speaker 1: There's one side here where it's just like, obviously the 431 00:24:24,119 --> 00:24:27,240 Speaker 1: rhetoric she's putting out is bad, and it is hate 432 00:24:27,240 --> 00:24:31,520 Speaker 1: speech that leads to violence, both on a personal and 433 00:24:31,640 --> 00:24:34,520 Speaker 1: on a legislative level, where, like, again, her speech 434 00:24:34,600 --> 00:24:37,000 Speaker 1: is being used to justify bills being shot down 435 00:24:37,080 --> 00:24:40,040 Speaker 1: that would make it easier to exist as a trans person. None 436 00:24:40,040 --> 00:24:42,000 Speaker 1: of this stuff exists in a vacuum.
All of it 437 00:24:42,080 --> 00:24:46,720 Speaker 1: affects each other. Again, yeah, hate speech fuels violent crime, 438 00:24:47,040 --> 00:24:49,320 Speaker 1: whether that's on a personal or on a state level. Well, 439 00:24:49,359 --> 00:24:50,920 Speaker 1: going back to what you said earlier, I think 440 00:24:50,960 --> 00:24:53,920 Speaker 1: part of the real damage, too, is that she for 441 00:24:54,119 --> 00:24:57,000 Speaker 1: a long time was able to present herself as a feminist, 442 00:24:57,359 --> 00:25:00,480 Speaker 1: and so she was, like, on this other side of, 443 00:25:00,560 --> 00:25:03,639 Speaker 1: you know, people would write articles, like she described Donald 444 00:25:03,680 --> 00:25:05,919 Speaker 1: Trump as worse than Voldemort, or, like, you know, had all 445 00:25:05,960 --> 00:25:09,520 Speaker 1: this stuff. And so now you're seeing, like, these really 446 00:25:10,119 --> 00:25:16,119 Speaker 1: extreme conservatives use what she's saying while being like, but 447 00:25:16,400 --> 00:25:19,159 Speaker 1: remember, you liked her so much. Like, that's part of 448 00:25:19,320 --> 00:25:24,200 Speaker 1: the damage of, like, TERFism. Yeah, yeah, 449 00:25:24,240 --> 00:25:26,920 Speaker 1: I think that's, that's what scares me, because, like, when 450 00:25:26,960 --> 00:25:30,199 Speaker 1: we talk about transphobia coming from, like, the really far right, 451 00:25:30,359 --> 00:25:32,200 Speaker 1: it's easy to just be like, okay, well, they're, like, 452 00:25:32,280 --> 00:25:34,119 Speaker 1: really conservative, like, I don't know, like, I'm not going 453 00:25:34,160 --> 00:25:36,879 Speaker 1: to take it seriously. Obviously it is also serious and has, like, 454 00:25:36,960 --> 00:25:39,480 Speaker 1: serious impacts on people's, like, perception of trans people.
But 455 00:25:39,520 --> 00:25:41,840 Speaker 1: it's like, yeah, I'm expecting it to come from, like, 456 00:25:41,880 --> 00:25:45,439 Speaker 1: a Tucker Carlson type person, or, like, even that senator 457 00:25:45,520 --> 00:25:47,879 Speaker 1: that I mentioned in the article, who is Republican. 458 00:25:48,000 --> 00:25:49,600 Speaker 1: Like, those are going to be the people who 459 00:25:49,600 --> 00:25:50,920 Speaker 1: you are going to expect to say this kind of stuff. 460 00:25:50,920 --> 00:25:53,719 Speaker 1: When it's somebody like J.K. Rowling, it's like, whoa. By 461 00:25:53,760 --> 00:25:54,920 Speaker 1: the way, I just want to say it, like, she 462 00:25:55,000 --> 00:25:58,119 Speaker 1: calls herself a feminist. I don't know about that. Like, I'm 463 00:25:58,200 --> 00:25:59,800 Speaker 1: a big believer that you need, you need to 464 00:26:00,000 --> 00:26:02,400 Speaker 1: actually practice feminism. You can't just say you're a feminist. 465 00:26:02,480 --> 00:26:04,639 Speaker 1: And I don't know, like, I remember when, 466 00:26:05,280 --> 00:26:08,400 Speaker 1: back last spring, when the stuff about, like, Roe 467 00:26:08,520 --> 00:26:10,680 Speaker 1: versus Wade was coming out, and they were, like, with 468 00:26:10,760 --> 00:26:13,800 Speaker 1: the Supreme Court. One thing that I saw that 469 00:26:13,880 --> 00:26:16,040 Speaker 1: was interesting was somebody who was like, hey, J.K. Rowling 470 00:26:16,080 --> 00:26:17,440 Speaker 1: is one of those people that keeps saying she's 471 00:26:17,440 --> 00:26:21,119 Speaker 1: a feminist and she supports, like, women's autonomy and 472 00:26:21,640 --> 00:26:24,199 Speaker 1: not harming women's bodies and all this stuff, but then 473 00:26:24,280 --> 00:26:27,240 Speaker 1: she's completely silent on this issue that is, like, affecting 474 00:26:27,280 --> 00:26:30,960 Speaker 1: a lot of women, and non-women too.
But, like, 475 00:26:32,200 --> 00:26:34,800 Speaker 1: maybe this isn't about feminism. Maybe this is just about hate. 476 00:26:34,920 --> 00:26:37,159 Speaker 1: And there's a million other examples you can find in, 477 00:26:37,359 --> 00:26:39,760 Speaker 1: like, pretty much any TERF argument of how it's just, 478 00:26:39,920 --> 00:26:45,720 Speaker 1: like, it is patriarchy under the facade of feminism and 479 00:26:45,960 --> 00:26:49,840 Speaker 1: women's rights. And I really would, like, encourage people to, 480 00:26:50,200 --> 00:26:52,200 Speaker 1: like, look deeper into a lot of those arguments and 481 00:26:52,320 --> 00:26:55,760 Speaker 1: how they specifically end up harming cis women as well 482 00:26:56,560 --> 00:27:00,359 Speaker 1: as trans folks. Yeah, it's a whole mess. And 483 00:27:00,400 --> 00:27:03,639 Speaker 1: I think another thing that has to be said about 484 00:27:03,760 --> 00:27:07,840 Speaker 1: where she was, pre-quotes-and-tweets I guess, is 485 00:27:07,880 --> 00:27:11,200 Speaker 1: that she really acted as if she was an ally 486 00:27:11,440 --> 00:27:14,520 Speaker 1: to the queer community, talking about, like, you know, oh, 487 00:27:14,640 --> 00:27:18,400 Speaker 1: yeah, Dumbledore is definitely gay, and all these things, and yeah, 488 00:27:18,440 --> 00:27:22,119 Speaker 1: absolutely Hermione could have been a Black woman, what are 489 00:27:22,119 --> 00:27:24,680 Speaker 1: you talking about? And she gained the trust of so 490 00:27:24,880 --> 00:27:28,399 Speaker 1: many young people who for so long felt like they 491 00:27:28,440 --> 00:27:30,720 Speaker 1: were alone and isolated.
And the reason, and I know 492 00:27:30,720 --> 00:27:32,159 Speaker 1: we're going to talk a little more about this, but, like, the 493 00:27:32,240 --> 00:27:34,760 Speaker 1: reason they loved the Harry Potter series: it was a 494 00:27:34,880 --> 00:27:39,040 Speaker 1: light in this darkness, because they saw that as an outcast, 495 00:27:39,280 --> 00:27:42,120 Speaker 1: you can be special, which is, like, the message all kids 496 00:27:42,200 --> 00:27:44,000 Speaker 1: want to hear. But then to hear her go on 497 00:27:44,160 --> 00:27:46,560 Speaker 1: to say, yeah, and I'm an ally for the queer 498 00:27:46,640 --> 00:27:49,040 Speaker 1: people, and they definitely were part of the Wizarding World, 499 00:27:49,119 --> 00:27:52,600 Speaker 1: which we all wished was a real thing. Maybe it is, 500 00:27:52,640 --> 00:27:55,680 Speaker 1: I don't know, but to them it was a real thing, and they 501 00:27:55,920 --> 00:27:59,160 Speaker 1: felt so, so much love for the series and fell 502 00:27:59,200 --> 00:28:01,280 Speaker 1: in love with a series, which we've talked about previously, 503 00:28:01,320 --> 00:28:04,600 Speaker 1: about what happens when you really love the characters that 504 00:28:04,720 --> 00:28:06,760 Speaker 1: are out there, that you have created, even for yourself, a 505 00:28:06,920 --> 00:28:09,639 Speaker 1: different world, and you put yourself in this world, and 506 00:28:09,800 --> 00:28:13,639 Speaker 1: then have her come and slash it, literally murder the 507 00:28:13,760 --> 00:28:16,480 Speaker 1: hopes and love and dreams of these kids who grew 508 00:28:16,600 --> 00:28:19,040 Speaker 1: up and maybe even found themselves a little bit through 509 00:28:19,119 --> 00:28:23,320 Speaker 1: her series.
The betrayal of that in itself is so dangerous, 510 00:28:23,400 --> 00:28:26,280 Speaker 1: and to have her go from being, like, an ally 511 00:28:26,480 --> 00:28:30,000 Speaker 1: and giving hope and giving someone an out and a light 512 00:28:30,359 --> 00:28:33,240 Speaker 1: in the darkness, to, like, yeah, oh my god, she hates 513 00:28:33,280 --> 00:28:35,440 Speaker 1: me, or she hates people like me. Everything that she 514 00:28:35,520 --> 00:28:38,360 Speaker 1: has said is falling apart, and she's celebrating it. She's 515 00:28:38,360 --> 00:28:44,520 Speaker 1: celebrating it by using her influence in society, in politics 516 00:28:44,680 --> 00:28:47,960 Speaker 1: and policy. She's using it as a way of gaining 517 00:28:48,080 --> 00:28:51,760 Speaker 1: more attention through the news media and all the media, 518 00:28:52,000 --> 00:28:54,800 Speaker 1: and then also teaching those parents who have already been 519 00:28:54,840 --> 00:28:57,959 Speaker 1: against their kids to once again double down. And then 520 00:28:58,000 --> 00:29:00,320 Speaker 1: when we come to see a game that celebrates all that, 521 00:29:00,720 --> 00:29:03,800 Speaker 1: there's no way we can separate it. I mean, no, you 522 00:29:03,960 --> 00:29:05,800 Speaker 1: really can't. Even when people tell me they bought it, 523 00:29:05,960 --> 00:29:07,800 Speaker 1: or that they're thinking of buying it, the first 524 00:29:07,840 --> 00:29:11,240 Speaker 1: thing they say is, I know, I know she's bad. 525 00:29:11,560 --> 00:29:13,360 Speaker 1: So when that's the first statement that comes out of 526 00:29:13,360 --> 00:29:15,920 Speaker 1: your mouth, you know there's no way to separate this, 527 00:29:16,560 --> 00:29:20,360 Speaker 1: for sure. Yeah. And that's what makes it complicated, I think.
528 00:29:20,400 --> 00:29:22,920 Speaker 1: And again, like, I grew up with the series, and, 529 00:29:23,320 --> 00:29:24,880 Speaker 1: like, it was my favorite books when I was a kid. 530 00:29:24,920 --> 00:29:28,440 Speaker 1: I definitely, like, have a lot of, like, really good 531 00:29:28,480 --> 00:29:31,720 Speaker 1: memories attached to it. I totally understand that aspect, and 532 00:29:31,840 --> 00:29:36,280 Speaker 1: also, like, I'm somewhat involved in, like, the fandom today, 533 00:29:36,520 --> 00:29:38,800 Speaker 1: so, like, I get that too. But um, yeah. Again, 534 00:29:38,840 --> 00:29:41,160 Speaker 1: a couple of stats, too, about 535 00:29:41,400 --> 00:29:45,120 Speaker 1: the state of trans rights as of right now. This 536 00:29:45,280 --> 00:29:47,200 Speaker 1: is gonna be focusing on the US, just because this 537 00:29:47,280 --> 00:29:49,560 Speaker 1: show is based in the US. I don't know as much 538 00:29:49,560 --> 00:29:52,720 Speaker 1: about UK politics, although I've heard it's kind of a 539 00:29:52,760 --> 00:29:58,800 Speaker 1: mess from friends that are there. But yeah, 540 00:29:58,800 --> 00:30:01,680 Speaker 1: actually, one thing really quick with the UK, 541 00:30:02,440 --> 00:30:04,840 Speaker 1: and again, going back to the whole, like, people buying 542 00:30:04,880 --> 00:30:06,320 Speaker 1: the game and being like, I know, like, I know, 543 00:30:06,360 --> 00:30:09,480 Speaker 1: which is terrible, I know: the weekend this video game came out, 544 00:30:10,280 --> 00:30:12,280 Speaker 1: there was a sixteen-year-old trans girl that was 545 00:30:12,400 --> 00:30:15,760 Speaker 1: murdered in the UK. Does that murder have anything directly 546 00:30:15,800 --> 00:30:17,880 Speaker 1: to do with the video game?
No, I don't think so. 547 00:30:18,360 --> 00:30:21,160 Speaker 1: But at the same time, it's like, a 548 00:30:21,280 --> 00:30:23,400 Speaker 1: lot of people on Twitter and on social media were 549 00:30:23,440 --> 00:30:25,480 Speaker 1: pointing out kind of this weird feeling 550 00:30:25,520 --> 00:30:28,320 Speaker 1: of, like, you have all these discussions 551 00:30:28,320 --> 00:30:31,640 Speaker 1: about this video game, and, like, can you support playing 552 00:30:31,680 --> 00:30:33,520 Speaker 1: this video game, or buying this video game, or, like, 553 00:30:34,120 --> 00:30:35,920 Speaker 1: all of that, or still, like, being a fan of 554 00:30:36,000 --> 00:30:38,560 Speaker 1: the series. But then the reality of the situation is, 555 00:30:38,600 --> 00:30:40,840 Speaker 1: like, this sort of hate speech that J.K. Rowling is 556 00:30:41,160 --> 00:30:44,360 Speaker 1: championing is what ends up getting people killed, is what 557 00:30:44,480 --> 00:30:46,680 Speaker 1: ends up, like, literally, like, a sixteen-year-old girl 558 00:30:46,840 --> 00:30:49,120 Speaker 1: was killed because of the kind of hate speech that 559 00:30:49,240 --> 00:30:52,880 Speaker 1: J.K. Rowling has turned into her entire brand at this point. 560 00:30:53,240 --> 00:30:55,440 Speaker 1: According to a twenty twenty report from Everytown for 561 00:30:55,520 --> 00:30:59,640 Speaker 1: Gun Safety in the US, the number of trans people 562 00:31:00,040 --> 00:31:03,400 Speaker 1: who've been murdered doubled between twenty seventeen and twenty twenty-one. 563 00:31:04,000 --> 00:31:06,720 Speaker 1: So yeah. And again, it's not just J.K. Rowling. 564 00:31:06,800 --> 00:31:10,680 Speaker 1: There's a lot of, like, people across the political spectrum 565 00:31:10,760 --> 00:31:14,840 Speaker 1: that I think, like, have responsibility for this rise in 566 00:31:16,120 --> 00:31:19,240 Speaker 1: hate of trans people.
But she is a very prominent 567 00:31:19,360 --> 00:31:21,400 Speaker 1: spokesperson here. A lot of people have used words 568 00:31:21,480 --> 00:31:23,800 Speaker 1: like "moral panic" when talking about this. I would agree, 569 00:31:23,840 --> 00:31:24,800 Speaker 1: like, I think we're in the middle of, like, a 570 00:31:24,880 --> 00:31:28,120 Speaker 1: full-blown moral panic around queer people in particular, 571 00:31:28,240 --> 00:31:31,800 Speaker 1: and LGBTQ+ people in general, but, like, particularly trans people, 572 00:31:32,480 --> 00:31:35,920 Speaker 1: and particularly trans women. I think, yeah, again, like, it's 573 00:31:35,960 --> 00:31:38,320 Speaker 1: affecting everybody, it's affecting all trans people, but I think 574 00:31:38,520 --> 00:31:40,080 Speaker 1: trans women are sort of just, like, the main 575 00:31:40,160 --> 00:31:42,680 Speaker 1: group that J.K. Rowling's targeting, and that has been the main 576 00:31:42,720 --> 00:31:45,840 Speaker 1: group targeted in a lot of, like, the "groomer" 577 00:31:46,240 --> 00:31:51,080 Speaker 1: stuff that's going around, and all that. But yeah, as 578 00:31:51,120 --> 00:31:53,760 Speaker 1: of right now, the ACLU is tracking three hundred and 579 00:31:53,800 --> 00:31:57,600 Speaker 1: seventy-one anti-LGBTQ+ bills throughout the US. A 580 00:31:57,720 --> 00:32:01,920 Speaker 1: number of them particularly target trans people and trans kids. Literally, 581 00:32:02,000 --> 00:32:05,640 Speaker 1: like, in the time since I began putting together, like, research 582 00:32:05,760 --> 00:32:10,560 Speaker 1: for this episode, Tennessee passed a bill that banned drag performances 583 00:32:11,440 --> 00:32:14,880 Speaker 1: from public spaces or locations where they could be viewed 584 00:32:14,920 --> 00:32:17,160 Speaker 1: by a minor. And if you look at the bill, 585 00:32:17,240 --> 00:32:19,800 Speaker 1: the language that they use is super vague.
I would 586 00:32:19,880 --> 00:32:22,080 Speaker 1: argue that's on purpose. And a lot of activists are 587 00:32:22,120 --> 00:32:24,800 Speaker 1: worried that, like, this, combined with the sort 588 00:32:24,840 --> 00:32:28,440 Speaker 1: of, like, transphobic view that, like, trans women are just 589 00:32:28,560 --> 00:32:31,640 Speaker 1: men in drag, or vice versa, this bill could be 590 00:32:31,760 --> 00:32:34,960 Speaker 1: used to criminalize, like, trans people simply existing in public 591 00:32:35,400 --> 00:32:37,600 Speaker 1: or in spaces around children that are private. Again, 592 00:32:37,640 --> 00:32:39,360 Speaker 1: it's not just public at this point now, it's 593 00:32:39,360 --> 00:32:42,160 Speaker 1: also private, because it could be a 594 00:32:42,200 --> 00:32:44,720 Speaker 1: private entertainment venue. It could literally be, like, your home, 595 00:32:44,880 --> 00:32:48,160 Speaker 1: if you have kids, or, like, you know, are around 596 00:32:48,840 --> 00:32:52,080 Speaker 1: children in your family unit. Yeah, and this isn't, like, 597 00:32:52,120 --> 00:32:55,120 Speaker 1: the only bill like that that's going around. Tennessee also 598 00:32:55,320 --> 00:32:59,720 Speaker 1: passed a bill limiting, not limiting, banning, gender-affirming care for 599 00:33:00,560 --> 00:33:03,840 Speaker 1: trans kids in Tennessee. Which, again, really quick, I think 600 00:33:03,960 --> 00:33:06,240 Speaker 1: people get confused when we talk about, like, gender-affirming 601 00:33:06,280 --> 00:33:09,080 Speaker 1: care for kids in particular, because they just assume 602 00:33:09,160 --> 00:33:13,920 Speaker 1: that it means, like, physical transition. That doesn't necessarily mean that; 603 00:33:14,040 --> 00:33:17,040 Speaker 1: sometimes it can mean, like, puberty blockers, which aren't permanent.
604 00:33:18,120 --> 00:33:21,120 Speaker 1: And I'm also not a doctor, so nobody, like, take 605 00:33:21,240 --> 00:33:23,600 Speaker 1: my particular word for this. But sometimes that literally just 606 00:33:23,720 --> 00:33:27,760 Speaker 1: means, like, being able to, like, acknowledge that your kid 607 00:33:28,280 --> 00:33:30,800 Speaker 1: doesn't identify as the gender they've been assigned, which, like, 608 00:33:32,760 --> 00:33:37,720 Speaker 1: usually just means, like, letting them change how, like, their 609 00:33:37,800 --> 00:33:40,080 Speaker 1: hair looks, or how they dress, or, like, how they 610 00:33:40,120 --> 00:33:42,680 Speaker 1: want to be addressed in public. None of that is permanent. Like, 611 00:33:43,600 --> 00:33:45,280 Speaker 1: people get so caught up and they're like, oh, 612 00:33:45,320 --> 00:33:48,200 Speaker 1: it's not reversible. Like, I don't know. Like, I'm 613 00:33:48,240 --> 00:33:50,120 Speaker 1: somebody who's changed the names that I've gone by, and, 614 00:33:50,240 --> 00:33:54,320 Speaker 1: like, that's reversible. Like, that is really easy to change. 615 00:33:54,880 --> 00:33:58,320 Speaker 1: I have short hair; I can grow it out really easily 616 00:33:58,480 --> 00:34:01,400 Speaker 1: or cut it again. And I change my outfits every day. 617 00:34:01,480 --> 00:34:02,880 Speaker 1: I don't know about you guys, I thought that was 618 00:34:02,880 --> 00:34:05,520 Speaker 1: a pretty normal thing, but I don't know. Yeah. So 619 00:34:06,360 --> 00:34:09,920 Speaker 1: it's frustrating. We are living at, like, a really scary 620 00:34:10,040 --> 00:34:12,640 Speaker 1: time to be a trans person, to be a queer person, 621 00:34:13,320 --> 00:34:18,040 Speaker 1: but particularly to be a trans person.
Like, yeah, both 622 00:34:18,120 --> 00:34:20,839 Speaker 1: legislatively on a state level, and also just, like, yeah, 623 00:34:20,920 --> 00:34:23,440 Speaker 1: there are more and more trans people that are being murdered, 624 00:34:23,800 --> 00:34:41,320 Speaker 1: particularly trans women. Yeah, it's, it's a mess. Yeah, but 625 00:34:41,480 --> 00:34:46,160 Speaker 1: let's talk about a video game. So yeah, the game 626 00:34:46,560 --> 00:34:52,080 Speaker 1: is doing well, which is unfortunate. And again, like, the 627 00:34:52,160 --> 00:34:55,080 Speaker 1: frustrating thing, like Samantha said: as soon as anybody 628 00:34:55,080 --> 00:34:57,439 Speaker 1: brings it up, it's like, I know. Like, there really is. 629 00:34:58,120 --> 00:35:00,360 Speaker 1: Most people know there's a pretty strong argument against 630 00:35:00,400 --> 00:35:04,279 Speaker 1: getting it, or against supporting any sort of, like, Harry 631 00:35:04,320 --> 00:35:07,600 Speaker 1: Potter franchise content, and yet people still do. And as 632 00:35:07,680 --> 00:35:10,560 Speaker 1: much as I want to be like, everybody, look at 633 00:35:10,560 --> 00:35:12,680 Speaker 1: all of the facts, all this terrible stuff that's happening, 634 00:35:13,040 --> 00:35:15,120 Speaker 1: how this connects to hate speech, how it connects to 635 00:35:15,480 --> 00:35:20,960 Speaker 1: this particular woman's platform, let's stop, like, let's just, like, 636 00:35:21,080 --> 00:35:23,880 Speaker 1: pretend that this doesn't exist: that's not going to happen. 637 00:35:24,080 --> 00:35:25,680 Speaker 1: Like, if it was going to happen, it would have happened 638 00:35:25,680 --> 00:35:27,560 Speaker 1: at this point. I'd rather just focus on, like, the 639 00:35:27,640 --> 00:35:29,600 Speaker 1: reality of the situation than, like, what I'd want in 640 00:35:29,640 --> 00:35:33,719 Speaker 1: an ideal world.
So yeah, I think this does kind 641 00:35:33,719 --> 00:35:36,160 Speaker 1: of get at the question. I'm also, like, I don't want 642 00:35:36,200 --> 00:35:38,440 Speaker 1: to, like, vilify anybody that has gotten the game or 643 00:35:38,440 --> 00:35:40,720 Speaker 1: anybody that does continue to, like, enjoy this kind of content. 644 00:35:41,440 --> 00:35:43,200 Speaker 1: I will say, the people, like, I keep seeing on 645 00:35:43,239 --> 00:35:45,400 Speaker 1: Twitter that are, like, making these really elaborate excuses of, 646 00:35:45,480 --> 00:35:48,000 Speaker 1: like, here's why I particularly should be allowed to enjoy 647 00:35:48,080 --> 00:35:50,920 Speaker 1: the game, or how I should, like... that's dumb. Like, 648 00:35:51,120 --> 00:35:54,320 Speaker 1: I don't know, that's another story. I will argue with 649 00:35:54,360 --> 00:35:56,680 Speaker 1: you about that, but whatever. I mean, at the same time, 650 00:35:56,719 --> 00:35:58,640 Speaker 1: like, I get it. Again, like, I have no interest 651 00:35:58,680 --> 00:36:00,560 Speaker 1: in buying the video game. I don't think people should 652 00:36:00,560 --> 00:36:02,440 Speaker 1: buy the video game. But I do get being attached 653 00:36:02,440 --> 00:36:05,200 Speaker 1: to this universe. Like, yeah, I grew up with it. 654 00:36:05,280 --> 00:36:06,520 Speaker 1: I was a huge fan when I was a kid. 655 00:36:06,960 --> 00:36:09,440 Speaker 1: I have been somewhat involved with, like, the fandom stuff 656 00:36:09,480 --> 00:36:11,759 Speaker 1: on TikTok, and actually I have, like, a lot of 657 00:36:11,840 --> 00:36:14,760 Speaker 1: really close friends that are really involved with, like, TikTok, 658 00:36:14,880 --> 00:36:18,160 Speaker 1: like, cosplay content and fan fiction and all that, and, 659 00:36:18,320 --> 00:36:22,080 Speaker 1: like, I get being attached to this series.
And also, 660 00:36:22,120 --> 00:36:24,480 Speaker 1: again, because it's so popular like worldwide, there's a lot 661 00:36:24,520 --> 00:36:26,000 Speaker 1: of people that are just like casual fans and like 662 00:36:26,040 --> 00:36:28,279 Speaker 1: grew up with reading it, or like, you know, are 663 00:36:28,360 --> 00:36:29,880 Speaker 1: going to get it for their kids because it's like 664 00:36:29,920 --> 00:36:34,240 Speaker 1: a staple now, sort of, of childhood, like, reading these books. 665 00:36:35,320 --> 00:36:38,480 Speaker 1: So it's hard. There's not, like, there's not an easy answer, 666 00:36:39,360 --> 00:36:43,080 Speaker 1: which is the frustrating part of all of this. Yeah, 667 00:36:43,120 --> 00:36:45,600 Speaker 1: I mean, just the things, like, I obviously have very 668 00:36:45,640 --> 00:36:50,400 Speaker 1: strong opinions when it comes to morality issues. The biggest 669 00:36:50,480 --> 00:36:52,840 Speaker 1: thing to me is like, you know, just admit it. 670 00:36:52,960 --> 00:36:55,759 Speaker 1: It's fine. Like, we get it. It's hard. Like, the 671 00:36:55,920 --> 00:36:59,880 Speaker 1: game is set up beautifully. I have seen YouTube video 672 00:37:00,400 --> 00:37:02,480 Speaker 1: previews of it and been like, people are so excited. 673 00:37:02,520 --> 00:37:05,040 Speaker 1: Because this is your turf, Annie, you can tell me 674 00:37:05,239 --> 00:37:08,120 Speaker 1: if I'm wrong, but the design of it is unique, 675 00:37:08,200 --> 00:37:10,960 Speaker 1: and the way they really created this world apparently is 676 00:37:11,000 --> 00:37:14,279 Speaker 1: really pretty and beautiful. The way you can play it 677 00:37:14,560 --> 00:37:17,080 Speaker 1: is newer, kind of like people's love for Elden Ring. 678 00:37:17,239 --> 00:37:19,200 Speaker 1: I'm sorry for those who think I'm being awful.
For 679 00:37:19,520 --> 00:37:23,200 Speaker 1: comparing the two, but the graphics of it are different 680 00:37:23,360 --> 00:37:26,680 Speaker 1: and have been upgraded over the years. So a lot 681 00:37:26,800 --> 00:37:30,480 Speaker 1: of the video gamer people, gamers as we would say, 682 00:37:30,880 --> 00:37:34,359 Speaker 1: try to say that's the reason. Like, I've literally heard them 683 00:37:34,440 --> 00:37:36,480 Speaker 1: say like, oh, they can do this in this world 684 00:37:36,600 --> 00:37:38,520 Speaker 1: and that's never been a thing, or you can do 685 00:37:39,239 --> 00:37:42,080 Speaker 1: these things on this level. And, you know, people love 686 00:37:42,880 --> 00:37:48,120 Speaker 1: anything spectacular and new and changing. So whatever they have 687 00:37:48,280 --> 00:37:50,759 Speaker 1: created for this game, I get it. I get that 688 00:37:50,880 --> 00:37:55,239 Speaker 1: that's an appeal. But yeah, it is really hard not 689 00:37:55,400 --> 00:38:00,360 Speaker 1: to see, when you see a community hurting, suffering and 690 00:38:00,520 --> 00:38:06,320 Speaker 1: being persecuted on such a deep level, constantly hearing stories 691 00:38:06,360 --> 00:38:10,839 Speaker 1: of people being murdered or hurt or kicked out, being 692 00:38:10,960 --> 00:38:14,080 Speaker 1: threatened to be removed from a family, like, being threatened 693 00:38:14,080 --> 00:38:17,359 Speaker 1: to be arrested because you do support your family. It's 694 00:38:17,400 --> 00:38:20,600 Speaker 1: kind of hard to see. But it's also some of 695 00:38:20,640 --> 00:38:24,440 Speaker 1: these same people who would probably say, "but this 696 00:38:24,719 --> 00:38:28,440 Speaker 1: world when I was younger really was there for me and 697 00:38:28,719 --> 00:38:32,719 Speaker 1: really helped me see some things."
So the complicated place 698 00:38:32,840 --> 00:38:36,239 Speaker 1: that they are at, it is painful, because 699 00:38:36,280 --> 00:38:38,600 Speaker 1: you have this decision of, like, how do I grieve 700 00:38:39,239 --> 00:38:42,239 Speaker 1: but I don't want to let it go. Yeah, and 701 00:38:42,360 --> 00:38:44,800 Speaker 1: I just, just so you know, I don't think you realized 702 00:38:44,840 --> 00:38:47,560 Speaker 1: you did this, but you did look at me via 703 00:38:47,640 --> 00:38:51,160 Speaker 1: this video call and say, this is your turf, Annie, 704 00:38:51,840 --> 00:39:02,400 Speaker 1: which was very, very funny. That's the thing we should do, though, 705 00:39:02,440 --> 00:39:04,719 Speaker 1: is we should try to reclaim, like, "turf." 706 00:39:05,800 --> 00:39:08,880 Speaker 1: I'll just make it whatever people share about, like, I don't know, 707 00:39:09,800 --> 00:39:13,480 Speaker 1: this is your realm, this is you, this is your 708 00:39:13,560 --> 00:39:18,600 Speaker 1: specialty. Well, you have it on video, we can 709 00:39:18,640 --> 00:39:20,400 Speaker 1: play it back. But I was very like, wait a minute, 710 00:39:21,239 --> 00:39:22,960 Speaker 1: I know. Actually, you did make a face and I 711 00:39:23,000 --> 00:39:28,680 Speaker 1: didn't get it at that point. But yeah. I 712 00:39:28,760 --> 00:39:31,800 Speaker 1: will say too, when it comes to, like, also, like, 713 00:39:31,880 --> 00:39:35,880 Speaker 1: the fandom spaces and like the being-your-turf 714 00:39:36,000 --> 00:39:39,440 Speaker 1: kind of thing, 715 00:39:39,800 --> 00:39:42,080 Speaker 1: I've spent a lot of time in online fandom spaces, 716 00:39:42,320 --> 00:39:45,919 Speaker 1: and I think in general they tend to be pretty queer. 717 00:39:46,000 --> 00:39:47,600 Speaker 1: This may also just be like my bubble, but like, 718 00:39:48,400 --> 00:39:51,560 Speaker 1: they tend to be pretty queer bubbles.
I've seen the 719 00:39:51,680 --> 00:39:53,960 Speaker 1: same with the Harry Potter fandom. Like, I think the, 720 00:39:54,160 --> 00:40:00,759 Speaker 1: like, online social media, very, like, fandom space usually tends 721 00:40:00,760 --> 00:40:02,120 Speaker 1: to be very queer, and I think that also makes 722 00:40:02,160 --> 00:40:04,520 Speaker 1: it complicated, because, like, it can be frustrating as a 723 00:40:04,600 --> 00:40:10,759 Speaker 1: queer person, like, feeling stuck and feeling, yeah, I'm like, 724 00:40:12,360 --> 00:40:14,719 Speaker 1: I don't know, like, I just, I recognize that 725 00:40:14,920 --> 00:40:18,960 Speaker 1: that can be particularly frustrating, again, as somebody 726 00:40:19,040 --> 00:40:22,480 Speaker 1: who's queer and, like, trans. It 727 00:40:22,640 --> 00:40:26,080 Speaker 1: can be frustrating. It can be really, really, 728 00:40:26,120 --> 00:40:29,960 Speaker 1: really frustrating. Yeah. Because also, along with that, I've seen 729 00:40:30,000 --> 00:40:33,720 Speaker 1: a lot of, like, trans masculine people in particular speak 730 00:40:33,719 --> 00:40:36,000 Speaker 1: out and be like, well, I'm really attached to this thing, 731 00:40:36,120 --> 00:40:37,560 Speaker 1: so I'm just going to continue to sort of, like, 732 00:40:37,640 --> 00:40:40,520 Speaker 1: ignore it and, like, enjoy this media because it's really 733 00:40:40,520 --> 00:40:43,480 Speaker 1: important to me personally. I think it's important to recognize that, 734 00:40:43,560 --> 00:40:45,080 Speaker 1: like, the majority of the people that are being really, 735 00:40:45,120 --> 00:40:48,120 Speaker 1: really harmed are trans women.
It's going to be trans mascs in general, 736 00:40:48,120 --> 00:40:49,920 Speaker 1: it's going to be queer people in general, but like, 737 00:40:50,200 --> 00:40:52,000 Speaker 1: it's trans women. Like, trans women are going to be 738 00:40:52,040 --> 00:40:56,360 Speaker 1: particularly targeted by this, and I think, I don't know, 739 00:40:56,560 --> 00:40:58,640 Speaker 1: I guess, like, I just would ask people to, like, 740 00:40:58,880 --> 00:41:00,880 Speaker 1: use this as a time to really reflect on, like, 741 00:41:01,239 --> 00:41:04,040 Speaker 1: how do we stand up for one another within the community, 742 00:41:04,400 --> 00:41:07,239 Speaker 1: because I have also seen the argument of people, again, 743 00:41:07,320 --> 00:41:09,439 Speaker 1: like, I've seen a frustrating number of videos of people 744 00:41:09,520 --> 00:41:12,640 Speaker 1: being like, I'm trans and I'm saying that you can 745 00:41:12,680 --> 00:41:14,600 Speaker 1: buy the video game, it doesn't matter, and it's like, 746 00:41:14,920 --> 00:41:18,400 Speaker 1: usually they're trans masculine people. I don't know. I just, 747 00:41:18,440 --> 00:41:20,600 Speaker 1: I would encourage people to, like, really think about who's 748 00:41:20,600 --> 00:41:24,000 Speaker 1: being affected, who is being harmed, and just, like, stand 749 00:41:24,120 --> 00:41:26,000 Speaker 1: up for other people in your community. Like, this is 750 00:41:26,000 --> 00:41:28,799 Speaker 1: the time when we need to, like, have each other's backs. Again, 751 00:41:28,840 --> 00:41:31,160 Speaker 1: we're in the middle of a full blown moral panic 752 00:41:31,200 --> 00:41:34,640 Speaker 1: around trans people. We need to have each other's backs. 753 00:41:35,760 --> 00:41:37,759 Speaker 1: Cis queer people too, we need to have each 754 00:41:37,760 --> 00:41:39,520 Speaker 1: other's backs.
You guys are not safe from this either, 755 00:41:39,800 --> 00:41:42,759 Speaker 1: like, I don't know, it's, yeah. I mean, and that's 756 00:41:42,840 --> 00:41:45,480 Speaker 1: absolutely, when it comes down to it, when it's attacking 757 00:41:46,120 --> 00:41:50,160 Speaker 1: any type of marginalized community in general, it comes down 758 00:41:50,280 --> 00:41:53,799 Speaker 1: onto all of us. And obviously women are a part, 759 00:41:53,960 --> 00:41:57,319 Speaker 1: not necessarily of the overall marginalized, but they are marginalized. 760 00:41:57,360 --> 00:41:59,520 Speaker 1: You are not a cis man, so therefore you do 761 00:41:59,640 --> 00:42:02,279 Speaker 1: not have the power. So yes, this will come down 762 00:42:02,360 --> 00:42:05,200 Speaker 1: onto you, which we've seen it does. It goes hand 763 00:42:05,239 --> 00:42:07,160 Speaker 1: in hand. When they go for one, they go for all, 764 00:42:07,640 --> 00:42:10,160 Speaker 1: and what we're seeing is, when we're not banding together, 765 00:42:11,400 --> 00:42:13,759 Speaker 1: it allows for the breakdown. And then if we think 766 00:42:13,840 --> 00:42:17,279 Speaker 1: one aspect or one injustice is more important than the other, 767 00:42:17,920 --> 00:42:20,840 Speaker 1: then you've got a problem, because you are not truly 768 00:42:20,920 --> 00:42:23,200 Speaker 1: seeing the fact that you're a part of the problem. 769 00:42:23,480 --> 00:42:27,960 Speaker 1: And it's really naive to think that your one issue, 770 00:42:28,320 --> 00:42:30,239 Speaker 1: and I'm talking to women in general, it's just like, 771 00:42:30,560 --> 00:42:34,160 Speaker 1: I'm a woman and trans women are an affront to me 772 00:42:34,400 --> 00:42:38,600 Speaker 1: because that takes away from my woes and my inability 773 00:42:38,600 --> 00:42:41,520 Speaker 1: to get all the rights, which is not true.
Then 774 00:42:41,640 --> 00:42:44,719 Speaker 1: you are woefully ignorant of the fact that you are 775 00:42:44,800 --> 00:42:50,040 Speaker 1: part of the problem. Absolutely. But yeah, I don't know. Again, 776 00:42:50,719 --> 00:42:53,319 Speaker 1: yeah, with the whole kind of conversation about, then, 777 00:42:53,480 --> 00:42:56,239 Speaker 1: like, what do we do with this piece of media now, 778 00:42:56,360 --> 00:42:58,759 Speaker 1: like, and I'm talking about Harry Potter in general, 779 00:42:58,840 --> 00:43:00,840 Speaker 1: any sort of Harry Potter thing, because again, it's not, 780 00:43:01,000 --> 00:43:02,719 Speaker 1: it's not just the video game. The video game is like 781 00:43:02,840 --> 00:43:05,400 Speaker 1: the most obvious version of, here's this thing that 782 00:43:05,480 --> 00:43:07,799 Speaker 1: people can buy, that people can give their money to. 783 00:43:07,840 --> 00:43:10,320 Speaker 1: But it was also, like, in all the, like, arguments 784 00:43:10,360 --> 00:43:12,560 Speaker 1: about it, people bring up, like, the attention economy, which, 785 00:43:12,560 --> 00:43:13,960 Speaker 1: again, going back to, like, the whole JK 786 00:43:14,080 --> 00:43:18,040 Speaker 1: Rowling thing, like, she's benefiting from the series being really 787 00:43:18,080 --> 00:43:22,239 Speaker 1: popular and benefiting from the attention economy. But also, 788 00:43:22,320 --> 00:43:24,000 Speaker 1: I don't want to say, like, people just can't enjoy 789 00:43:24,120 --> 00:43:28,560 Speaker 1: this, because, I don't know. Again, like, I think, I 790 00:43:28,960 --> 00:43:31,960 Speaker 1: do think people can enjoy things as long as it's 791 00:43:32,000 --> 00:43:35,680 Speaker 1: not, like, hurting somebody.
I don't know, I think 792 00:43:35,680 --> 00:43:37,640 Speaker 1: it gets complicated, but like, I do think, like, I 793 00:43:37,880 --> 00:43:39,840 Speaker 1: get the whole, like, people just wanting to enjoy the 794 00:43:39,920 --> 00:43:42,600 Speaker 1: series too. And I mean, I think the sort of 795 00:43:42,680 --> 00:43:45,880 Speaker 1: obvious answer is, like, figuring out a way to, like, 796 00:43:46,520 --> 00:43:51,880 Speaker 1: be involved in fan culture and fandom and not 797 00:43:53,080 --> 00:43:55,759 Speaker 1: giving money that's going to go to JK Rowling, like, 798 00:43:55,920 --> 00:44:00,640 Speaker 1: not buying merch that's directly going to benefit JK Rowling. 799 00:44:01,080 --> 00:44:04,399 Speaker 1: There's lots of independent artists out there. Personally, I'm 800 00:44:04,440 --> 00:44:07,560 Speaker 1: a big advocate of fan fiction. I think, like, fan 801 00:44:07,680 --> 00:44:10,960 Speaker 1: fiction is always a good thing. I've definitely read some, 802 00:44:11,080 --> 00:44:14,160 Speaker 1: like, trans versions of characters too, which is fine, um, and 803 00:44:15,200 --> 00:44:17,880 Speaker 1: all of that. But yeah, I don't know, I don't know. 804 00:44:18,200 --> 00:44:21,839 Speaker 1: I would love to hear your guys' suggestions. 805 00:44:21,880 --> 00:44:25,000 Speaker 1: We had listeners who sent their own, 806 00:44:25,120 --> 00:44:33,040 Speaker 1: and some really good, uh, rewrites of Harry Potter. 807 00:44:33,239 --> 00:44:38,160 Speaker 1: Yeah, oh my gosh. Um. Uh, so originally 808 00:44:38,520 --> 00:44:41,040 Speaker 1: this was, you did a much better job than what 809 00:44:41,080 --> 00:44:42,520 Speaker 1: I was gonna do.
I was literally just going to 810 00:44:42,600 --> 00:44:47,800 Speaker 1: have a conversation about grieving and what happens, because for me, 811 00:44:48,040 --> 00:44:51,800 Speaker 1: like, I, you know, I graduated high school when the 812 00:44:52,160 --> 00:44:54,320 Speaker 1: seventh book came out. I graduated college when the seventh 813 00:44:54,640 --> 00:44:58,360 Speaker 1: movie came out. It was there in, like, my darkest times. 814 00:44:58,440 --> 00:45:00,399 Speaker 1: I loved it. I, like, traveled to London, the idea, 815 00:45:00,440 --> 00:45:03,320 Speaker 1: the whole, I was, like I said, fan doesn't 816 00:45:03,440 --> 00:45:06,880 Speaker 1: describe it. And it was such a big part of 817 00:45:07,080 --> 00:45:10,920 Speaker 1: my identity. And that's the danger, because, as I've talked before 818 00:45:11,239 --> 00:45:16,879 Speaker 1: on here, I use fiction to cope with trauma. Um, 819 00:45:17,280 --> 00:45:19,040 Speaker 1: and it's very important to me. When you let something 820 00:45:19,080 --> 00:45:23,160 Speaker 1: get that close and then it hurts you so bad, 821 00:45:24,040 --> 00:45:26,560 Speaker 1: and that's normal. Like, I mean, I do this, fair, 822 00:45:26,640 --> 00:45:29,440 Speaker 1: but I think plenty do, like, it's a pretty common thing. 823 00:45:29,480 --> 00:45:30,880 Speaker 1: I'm like, that's the thing I keep trying to bring up. 824 00:45:30,920 --> 00:45:32,920 Speaker 1: Like, I don't blame people for being fans of the series. 825 00:45:33,080 --> 00:45:35,240 Speaker 1: There's a lot of, there's a lot of good stuff 826 00:45:35,400 --> 00:45:39,200 Speaker 1: in it, and like, yeah, and people do use this 827 00:45:39,320 --> 00:45:41,560 Speaker 1: sort of stuff to cope with trauma.
Again, I've seen it. 828 00:45:41,640 --> 00:45:43,960 Speaker 1: Like, I've read a lot of fan fiction, so I 829 00:45:44,040 --> 00:45:45,680 Speaker 1: know a lot of people use, like, fan fiction and 830 00:45:45,760 --> 00:45:47,640 Speaker 1: that sort of stuff to also deal with, like, their 831 00:45:47,719 --> 00:45:51,520 Speaker 1: identities as, like, queer people and as trans people and, um, 832 00:45:52,280 --> 00:45:54,680 Speaker 1: and that's impacted me a lot too. Like, that's definitely, 833 00:45:54,880 --> 00:45:57,160 Speaker 1: like, I don't know, I get that side of it. 834 00:45:57,320 --> 00:45:59,440 Speaker 1: And that's why I'm like, I never would fault anybody 835 00:45:59,560 --> 00:46:02,239 Speaker 1: for being attached to a series, because again, then I'd be 836 00:46:02,280 --> 00:46:05,239 Speaker 1: a hypocrite. But also, like, media affects us, and we 837 00:46:05,360 --> 00:46:07,879 Speaker 1: can't always, you know, none of us can predict where 838 00:46:08,400 --> 00:46:10,640 Speaker 1: these authors are going to go with their lives, unfortunately. 839 00:46:10,840 --> 00:46:13,800 Speaker 1: But yeah, well, that's kind of the interesting thing, because 840 00:46:13,800 --> 00:46:16,120 Speaker 1: I was like, you know, there's a lot of whataboutism 841 00:46:16,200 --> 00:46:18,279 Speaker 1: that happens in this argument, where they're like, 842 00:46:18,320 --> 00:46:20,560 Speaker 1: well, you'd like that though, right? And that's problematic too. 843 00:46:20,719 --> 00:46:24,080 Speaker 1: And I'm like, yeah, but JK.
Rowling is like the 844 00:46:24,360 --> 00:46:27,279 Speaker 1: heart of this. Like, I can't, I can't even read 845 00:46:27,320 --> 00:46:30,160 Speaker 1: fan fiction anymore. Like, I can't not think of this 846 00:46:31,080 --> 00:46:33,839 Speaker 1: when I consume any of the media. And like, I'm 847 00:46:33,840 --> 00:46:35,720 Speaker 1: not saying I'm, like, holier than thou at all, because 848 00:46:36,280 --> 00:46:39,160 Speaker 1: it was, like, twenty twenty. I knew she was probably 849 00:46:39,160 --> 00:46:41,040 Speaker 1: like this before then, but that was when I was like, okay, 850 00:46:41,080 --> 00:46:44,919 Speaker 1: I can't, I can't do this anymore. And like we said, 851 00:46:45,320 --> 00:46:48,680 Speaker 1: I was a huge fan, it's true. There's so many, like, personal 852 00:46:48,760 --> 00:46:51,840 Speaker 1: experiences and mental gymnastics and all of these things that 853 00:46:51,920 --> 00:46:56,840 Speaker 1: go into consuming a thing. And so I also, you know, 854 00:46:56,920 --> 00:46:58,560 Speaker 1: I have friends that have got the game, and I 855 00:46:58,600 --> 00:47:00,680 Speaker 1: don't fault them, but for me, it's become like, I 856 00:47:00,800 --> 00:47:02,719 Speaker 1: literally shut down when people bring it up, because it 857 00:47:02,840 --> 00:47:05,160 Speaker 1: was so important to me and it hurt me so bad. 858 00:47:05,680 --> 00:47:07,960 Speaker 1: And so many listeners have written in, like, all 859 00:47:08,040 --> 00:47:10,560 Speaker 1: kinds of, the whole spectrum of this, over the years, 860 00:47:10,680 --> 00:47:14,360 Speaker 1: and now, like, I think of the stories they've sent, 861 00:47:14,480 --> 00:47:17,560 Speaker 1: and it's just, it's upsetting. Like, I guess what I'm saying
862 00:47:17,600 --> 00:47:19,920 Speaker 1: is, like, I get, I get all sides of this, 863 00:47:20,040 --> 00:47:22,160 Speaker 1: and I also get it if, like, you're really hurting and grieving, 864 00:47:22,560 --> 00:47:27,120 Speaker 1: because that's kind of where I am, right? Oh, for sure. Yeah, yeah, 865 00:47:27,360 --> 00:47:29,560 Speaker 1: I think that's, that's the thing. It's like, you're not 866 00:47:29,640 --> 00:47:33,080 Speaker 1: the only one. I've known people in my age group, 867 00:47:33,840 --> 00:47:35,719 Speaker 1: I was gonna say generational, like, I'm not doing this to 868 00:47:35,800 --> 00:47:42,320 Speaker 1: myself, who used it too. Like, literally, we're so traumatized 869 00:47:42,760 --> 00:47:49,120 Speaker 1: by family, by love, by abuse, that we held onto that world. 870 00:47:49,239 --> 00:47:52,080 Speaker 1: And I keep saying that, as a world, because it 871 00:47:52,400 --> 00:47:56,320 Speaker 1: is, it was something that was created to keep them safe, 872 00:47:57,200 --> 00:47:59,200 Speaker 1: and then they created it as a bigger part of 873 00:47:59,239 --> 00:48:02,879 Speaker 1: themselves so they could be who they were and who they are. 874 00:48:03,440 --> 00:48:07,759 Speaker 1: And then we look at the fact that she destroyed it. 875 00:48:08,360 --> 00:48:10,799 Speaker 1: She destroyed it, I think, for those people, for all 876 00:48:10,840 --> 00:48:12,520 Speaker 1: of us, because I liked it. I liked it. I'm 877 00:48:12,560 --> 00:48:15,080 Speaker 1: a casual person, don't get me wrong. I 878 00:48:15,560 --> 00:48:17,960 Speaker 1: would watch Harry Potter, the first one especially, as a 879 00:48:18,080 --> 00:48:22,520 Speaker 1: Christmas movie. Like, I liked it. I was not a diehard. 880 00:48:22,600 --> 00:48:26,640 Speaker 1: That is not my escapism.
But I saw, even, 881 00:48:26,680 --> 00:48:29,479 Speaker 1: almost from, like, the social work side, watching people being 882 00:48:29,560 --> 00:48:33,040 Speaker 1: able to really have a coping mechanism in a healthy way, 883 00:48:33,280 --> 00:48:35,520 Speaker 1: even though I'm like, oh, you're really, really in this 884 00:48:35,640 --> 00:48:39,239 Speaker 1: imaginary world. But it's also good for you, because it 885 00:48:39,400 --> 00:48:41,600 Speaker 1: is doing so much for you. And then to have 886 00:48:41,880 --> 00:48:46,040 Speaker 1: that flip, it's such a disruption. It 887 00:48:46,239 --> 00:48:49,359 Speaker 1: takes the wind out of everything. It takes your breath, 888 00:48:49,480 --> 00:48:54,919 Speaker 1: almost, the jarring turnaround of what it was. And yeah, 889 00:48:55,160 --> 00:48:59,120 Speaker 1: again, I have a lot of opinions, and I 890 00:48:59,200 --> 00:49:02,680 Speaker 1: don't want to, it sounds very critical. I sound very critical, 891 00:49:02,719 --> 00:49:05,080 Speaker 1: and I hate to be that way when I'm like, 892 00:49:05,239 --> 00:49:07,839 Speaker 1: this game, you don't have an excuse. But the fact 893 00:49:07,880 --> 00:49:10,400 Speaker 1: of the matter is that this goes hand in hand, 894 00:49:11,400 --> 00:49:15,280 Speaker 1: and it has been tainted, this, this reality, 895 00:49:15,480 --> 00:49:19,680 Speaker 1: this fiction, this conversation. Something that was so pure at 896 00:49:19,760 --> 00:49:22,000 Speaker 1: one point in time, or what we thought was, has been really 897 00:49:22,120 --> 00:49:28,600 Speaker 1: tainted, and we can't not see why. It's affecting everyone, 898 00:49:28,800 --> 00:49:32,839 Speaker 1: her whole conversation. She has garnered so many right wing 899 00:49:33,000 --> 00:49:37,920 Speaker 1: people into her support group that it's overwhelming.
The same 900 00:49:37,960 --> 00:49:41,080 Speaker 1: people she was tweeting against and yelling at in twenty 901 00:49:41,160 --> 00:49:44,399 Speaker 1: seventeen suddenly flipped and have become a part of her team. 902 00:49:45,200 --> 00:49:49,160 Speaker 1: And then we see women who have forever tried to 903 00:49:49,520 --> 00:49:54,319 Speaker 1: villainize trans women feeling justified, because here's an 904 00:49:54,520 --> 00:49:57,399 Speaker 1: icon saying exactly what they've been saying for so many years, 905 00:49:57,440 --> 00:50:00,319 Speaker 1: and now they have someone to justify that. And then, 906 00:50:00,440 --> 00:50:03,560 Speaker 1: also knowing that she's successful, there was no cancel culture 907 00:50:03,600 --> 00:50:06,360 Speaker 1: for her. She tried to pretend like there was; there wasn't. 908 00:50:07,520 --> 00:50:10,040 Speaker 1: And then we also see that people like you, Joey, 909 00:50:10,120 --> 00:50:14,520 Speaker 1: and people like you, Annie, who have so much emotion 910 00:50:14,719 --> 00:50:18,480 Speaker 1: and so much pain, from people saying, like, I'm sorry, 911 00:50:18,600 --> 00:50:19,960 Speaker 1: but I still want to be a part of it, 912 00:50:20,400 --> 00:50:22,880 Speaker 1: not because they're doing something against you, but they're definitely 913 00:50:22,920 --> 00:50:26,040 Speaker 1: not supporting you. Like, if you're not 914 00:50:26,160 --> 00:50:28,520 Speaker 1: doing something that is to be an ally, that means 915 00:50:28,560 --> 00:50:30,560 Speaker 1: you're not supporting them. I mean, that's kind of how 916 00:50:30,600 --> 00:50:33,040 Speaker 1: it turns out to be. And it's harsh and it's cruel. 917 00:50:33,160 --> 00:50:35,880 Speaker 1: And I say that to myself as well.
I know 918 00:50:36,480 --> 00:50:39,400 Speaker 1: that, hearing you, if the people in my life that 919 00:50:39,480 --> 00:50:43,200 Speaker 1: I care about do something that is going against 920 00:50:43,440 --> 00:50:46,360 Speaker 1: a community that is hurting because of this, that 921 00:50:46,520 --> 00:50:49,640 Speaker 1: means I'm going against the support, I'm not supporting you 922 00:50:49,719 --> 00:50:52,200 Speaker 1: like I said I would. And those are the conversations and 923 00:50:52,320 --> 00:50:55,040 Speaker 1: things that you have to think about. And in this, 924 00:50:55,239 --> 00:50:57,000 Speaker 1: it's just a game, I get it, it's just a game, 925 00:50:57,680 --> 00:51:00,800 Speaker 1: but it's also people's lives. It's also people's hearts in 926 00:51:00,920 --> 00:51:03,359 Speaker 1: all of this. Right, and that's one of the things, 927 00:51:03,360 --> 00:51:05,279 Speaker 1: I think, this episode I wanted to do, because one 928 00:51:05,320 --> 00:51:07,360 Speaker 1: of the things I also keep saying is, like, a 929 00:51:07,440 --> 00:51:09,839 Speaker 1: lot of people, I think, are allies in their 930 00:51:09,920 --> 00:51:13,480 Speaker 1: hearts and are supporters, but just haven't, like, connected the 931 00:51:13,680 --> 00:51:16,200 Speaker 1: dots exactly. Like, they know, like, oh, JK Rowling's 932 00:51:16,239 --> 00:51:19,000 Speaker 1: really transphobic, but they haven't, like, totally realized, like, no, 933 00:51:19,160 --> 00:51:21,880 Speaker 1: here's how the specific things that she's saying are affecting 934 00:51:21,960 --> 00:51:26,240 Speaker 1: people's lives, are affecting legislation, are affecting, like, the general 935 00:51:26,680 --> 00:51:30,479 Speaker 1: kind of conversation that we are having about trans people 936 00:51:30,600 --> 00:51:38,719 Speaker 1: and about trans women in particular.
And I think I understand, 937 00:51:38,800 --> 00:51:41,600 Speaker 1: like, that's, it is a hard sort of thing to 938 00:51:41,680 --> 00:51:43,200 Speaker 1: really connect, because it is a lot of, like, yeah, 939 00:51:43,440 --> 00:51:45,040 Speaker 1: it is a lot of also just, like, how does hate, 940 00:51:45,080 --> 00:51:49,080 Speaker 1: how do these, like, cultures of hate happen? But it's there, 941 00:51:49,239 --> 00:51:51,880 Speaker 1: and this is, like, what I want to do is 942 00:51:51,960 --> 00:51:55,239 Speaker 1: sort of, like, connect those dots for people, and again, 943 00:51:55,320 --> 00:51:58,279 Speaker 1: also, like, remain sympathetic. I do understand. And yeah, again, 944 00:51:58,360 --> 00:52:01,080 Speaker 1: like, totally understanding the attachment to the series. I'm not saying 945 00:52:01,120 --> 00:52:04,200 Speaker 1: people should just throw their books away. 946 00:52:04,360 --> 00:52:06,120 Speaker 1: I had, like, a TikTok that talked about this, and 947 00:52:06,200 --> 00:52:08,400 Speaker 1: somebody, like, commented and was like, I'm throwing away my 948 00:52:08,480 --> 00:52:10,960 Speaker 1: books right now, and I was like, no, you don't 949 00:52:11,040 --> 00:52:15,920 Speaker 1: have to do that. Like, that doesn't do what 950 00:52:16,040 --> 00:52:33,080 Speaker 1: you think it does. This is an important conversation to 951 00:52:33,160 --> 00:52:35,040 Speaker 1: keep having, and it is a conversation that we do 952 00:52:35,200 --> 00:52:37,640 Speaker 1: keep having, because there just keeps being Harry 953 00:52:37,640 --> 00:52:40,239 Speaker 1: Potter content that's been coming out, like, more or less 954 00:52:40,239 --> 00:52:42,800 Speaker 1: consistently for the past couple of years.
There was a 955 00:52:42,960 --> 00:52:45,920 Speaker 1: Teen Vogue article I saw that kind of got into 956 00:52:45,960 --> 00:52:47,399 Speaker 1: the same thing, that came out a year ago 957 00:52:47,480 --> 00:52:51,400 Speaker 1: because of the twentieth anniversary special that was on HBO 958 00:52:51,480 --> 00:52:53,960 Speaker 1: Max, that I had totally forgotten about. But there 959 00:52:54,000 --> 00:52:55,760 Speaker 1: was, again, I remember, like, there was a whole conversation 960 00:52:55,800 --> 00:52:59,800 Speaker 1: around that too, um, and people being like, yeah, like, 961 00:52:59,880 --> 00:53:02,200 Speaker 1: the same sort of conversation about, like, nostalgia and being 962 00:53:02,200 --> 00:53:04,920 Speaker 1: attached to the series, but then also, like, recognizing, like, 963 00:53:05,000 --> 00:53:07,760 Speaker 1: people's lives are being put in danger, or, like, literally 964 00:53:07,880 --> 00:53:11,439 Speaker 1: physically harmed, and, like, that is also something 965 00:53:11,480 --> 00:53:14,120 Speaker 1: that is important to remember. Yeah. I don't think 966 00:53:14,239 --> 00:53:17,240 Speaker 1: there's an easy answer, and that's what's frustrating 967 00:53:17,239 --> 00:53:19,160 Speaker 1: about all this, but it is, like, I think the 968 00:53:19,200 --> 00:53:21,360 Speaker 1: best thing people can do is, like, just continue to 969 00:53:21,440 --> 00:53:24,000 Speaker 1: keep having this conversation, continue to keep acknowledging that what JK 970 00:53:24,160 --> 00:53:27,840 Speaker 1: Rowling is doing is harmful. And beyond that, too, like, 971 00:53:27,960 --> 00:53:30,879 Speaker 1: continue to stay informed. Again, like, there's all of these 972 00:53:30,960 --> 00:53:33,919 Speaker 1: bills in the US right now that are targeting trans 973 00:53:34,000 --> 00:53:37,759 Speaker 1: people and queer people. Pay attention to that.
Like, I'm 974 00:53:37,800 --> 00:53:40,320 Speaker 1: sure, wherever you are listening to this, there's something in 975 00:53:40,440 --> 00:53:42,520 Speaker 1: your state right now that has either passed or is, 976 00:53:42,600 --> 00:53:46,960 Speaker 1: like, in consideration right now. Like, look into that. Pay 977 00:53:47,000 --> 00:53:49,359 Speaker 1: attention to what's happening. If you want to be an ally, 978 00:53:49,560 --> 00:53:52,239 Speaker 1: be an ally. That's more important than whether you're going 979 00:53:52,320 --> 00:53:54,960 Speaker 1: to, like, burn your Potter merch that you bought in 980 00:53:55,080 --> 00:53:58,040 Speaker 1: twenty eleven. Like, I don't know. Like, I still 981 00:53:58,080 --> 00:54:00,920 Speaker 1: have, like, a Gryffindor blanket that I bought when I 982 00:54:01,000 --> 00:54:02,640 Speaker 1: was eleven that I'm like, I'm going to keep this 983 00:54:02,800 --> 00:54:06,359 Speaker 1: because I like it. And yeah, like, me throwing it away 984 00:54:06,400 --> 00:54:10,800 Speaker 1: is not going to affect JK Rowling's, like, annual income. Right, yeah, exactly, 985 00:54:11,120 --> 00:54:14,680 Speaker 1: and just on the annual income, I was looking this 986 00:54:14,840 --> 00:54:21,040 Speaker 1: up: apparently the standard for a writer's royalties 987 00:54:21,680 --> 00:54:26,239 Speaker 1: is fifteen percent. So when the book sales happened, I 988 00:54:26,320 --> 00:54:28,640 Speaker 1: think, as of twenty twenty two, she had made one point 989 00:54:28,719 --> 00:54:33,560 Speaker 1: one five billion dollars. So we know she was a 990 00:54:33,640 --> 00:54:36,560 Speaker 1: savvy businesswoman, especially when it came down to the movies. 991 00:54:37,040 --> 00:54:40,160 Speaker 1: The likelihood of her making that much money off of this 992 00:54:40,320 --> 00:54:44,640 Speaker 1: game alone is pretty high.
And because she is a 993 00:54:44,719 --> 00:54:48,000 Speaker 1: supporter of anti-trans bills, the likelihood that she's giving 994 00:54:48,080 --> 00:54:52,120 Speaker 1: money to that is very high. So—oh, and she has. Yeah, 995 00:54:52,719 --> 00:54:56,480 Speaker 1: she's definitely given. So as a reminder, that money does 996 00:54:56,600 --> 00:55:02,360 Speaker 1: go specifically against trans people. Yeah, happy days. Yeah. And 997 00:55:02,480 --> 00:55:06,080 Speaker 1: this is also where, with this, when we're talking about money, 998 00:55:06,280 --> 00:55:11,440 Speaker 1: like, independent bookshops exist, and like, we love them— 999 00:55:11,680 --> 00:55:14,640 Speaker 1: buy books secondhand. Yes, I know, like, streaming is 1000 00:55:14,640 --> 00:55:16,080 Speaker 1: the norm now, so it's sort of hard to do 1001 00:55:16,160 --> 00:55:18,040 Speaker 1: the whole like buy your DVD secondhand, but like 1002 00:55:18,040 --> 00:55:21,040 Speaker 1: I don't know, like there's other options. Does eBay still 1003 00:55:21,080 --> 00:55:25,839 Speaker 1: do that? The secondhand buying? What did that used 1004 00:55:25,840 --> 00:55:29,359 Speaker 1: to be? I know y'all, y'all are not old enough. 1005 00:55:29,400 --> 00:55:33,160 Speaker 1: I sold books, and I think it was pre-Amazon, 1006 00:55:34,080 --> 00:55:36,880 Speaker 1: and it was eBay, and then it became Amazon. Oh 1007 00:55:37,000 --> 00:55:44,520 Speaker 1: my god, I'm dating myself. I'm still talking. Yeah. Oh 1008 00:55:44,640 --> 00:55:49,239 Speaker 1: my god. Oh wow. I'm gonna eat my Jell-O and put myself to 1009 00:55:49,360 --> 00:55:54,080 Speaker 1: bed early. I mean, wow, I mean that sounds nice. Yeah, 1010 00:55:54,360 --> 00:55:56,160 Speaker 1: it is. It's like, I feel like we could come 1011 00:55:56,200 --> 00:55:58,400 Speaker 1: back and do a revisit because it is so complicated.
1012 00:55:58,480 --> 00:56:00,040 Speaker 1: Because there's also a part of me that, like, I 1013 00:56:00,160 --> 00:56:03,120 Speaker 1: hate that she's done this, because the movies weren't 1014 00:56:03,120 --> 00:56:05,640 Speaker 1: made in a vacuum. Like, sure, some people agree 1015 00:56:05,680 --> 00:56:07,680 Speaker 1: with her, but there were a lot of people who 1016 00:56:07,800 --> 00:56:11,239 Speaker 1: didn't and who have come out against it. So I 1017 00:56:11,360 --> 00:56:16,520 Speaker 1: hate that that's happening. And I would love 1018 00:56:16,560 --> 00:56:18,920 Speaker 1: to bring on someone else who knows more about this than me. 1019 00:56:19,080 --> 00:56:20,839 Speaker 1: But there's been a lot of articles about what's going 1020 00:56:20,880 --> 00:56:24,320 Speaker 1: on with trans women, specifically in the UK. It's 1021 00:56:24,440 --> 00:56:26,359 Speaker 1: still, it's bad, but it's not quite the same 1022 00:56:26,400 --> 00:56:29,680 Speaker 1: as what's going on in the US, so that is 1023 00:56:29,719 --> 00:56:33,399 Speaker 1: a larger conversation to be had as well. Yeah. Like, again, 1024 00:56:33,440 --> 00:56:37,040 Speaker 1: I don't know as much about UK, like, politics and culture. 1025 00:56:37,239 --> 00:56:40,080 Speaker 1: I spent my entire life in the US.
But at 1026 00:56:40,160 --> 00:56:42,799 Speaker 1: least from what I've heard—I've 1027 00:56:42,800 --> 00:56:44,960 Speaker 1: definitely, like, seen articles that point to, like, J.K. 1028 00:56:45,120 --> 00:56:47,360 Speaker 1: Rowling in particular and her attitude towards this, like, that 1029 00:56:47,440 --> 00:56:49,400 Speaker 1: also doesn't come out of a vacuum, 1030 00:56:49,480 --> 00:56:52,759 Speaker 1: like, that is very much reflective of how, like, the 1031 00:56:53,120 --> 00:56:55,279 Speaker 1: culture around trans people and around queer people is in 1032 00:56:55,360 --> 00:56:58,480 Speaker 1: the UK. But again, I'm not an expert on all of this. 1033 00:56:58,560 --> 00:57:00,560 Speaker 1: Like, I don't know, but yeah, I'd love to hear 1034 00:57:00,719 --> 00:57:03,759 Speaker 1: somebody's thoughts who knows more on this, because yeah, 1035 00:57:04,239 --> 00:57:08,160 Speaker 1: it's scary. It's a scary world to live in. 1036 00:57:08,680 --> 00:57:12,920 Speaker 1: I do try to be an optimist against my better judgment. Um, 1037 00:57:13,160 --> 00:57:15,520 Speaker 1: so, like, I am hopeful. I don't know, I 1038 00:57:15,800 --> 00:57:18,000 Speaker 1: do think the fact that we're having this conversation, at least, 1039 00:57:18,040 --> 00:57:20,160 Speaker 1: like, to me means people are talking about it.
1040 00:57:20,240 --> 00:57:21,840 Speaker 1: It's not like this is just something that's happening and 1041 00:57:21,960 --> 00:57:25,720 Speaker 1: like nobody really cares. Like, there 1042 00:57:25,840 --> 00:57:29,240 Speaker 1: was an effort to boycott, and it wasn't particularly successful, 1043 00:57:29,280 --> 00:57:30,760 Speaker 1: but like, it was there, and I have seen a 1044 00:57:30,840 --> 00:57:34,040 Speaker 1: lot of people like sharing information about it, and 1045 00:57:34,520 --> 00:57:36,880 Speaker 1: sharing information about, like, a lot of these anti- 1046 00:57:36,920 --> 00:57:39,240 Speaker 1: trans and anti-queer bills that are going around, and 1047 00:57:39,400 --> 00:57:45,160 Speaker 1: that gives me hope. I don't know. Again, everything's 1048 00:57:45,160 --> 00:57:51,479 Speaker 1: up in the air right now. Like, yeah, I don't know. I feel 1049 00:57:51,520 --> 00:57:59,320 Speaker 1: like that's the overall takeaway. Yeah. Good job. Yeah, I mean 1050 00:57:59,400 --> 00:58:03,400 Speaker 1: it is. I think, like, that's a good 1051 00:58:03,480 --> 00:58:06,080 Speaker 1: takeaway, to stay informed, because I think a lot more 1052 00:58:06,120 --> 00:58:08,800 Speaker 1: people know now, at the very least, where they 1053 00:58:08,920 --> 00:58:11,919 Speaker 1: didn't before, that this is something we should be talking 1054 00:58:11,960 --> 00:58:14,200 Speaker 1: about or paying attention to.
Oh yeah, and I want 1055 00:58:14,240 --> 00:58:15,760 Speaker 1: to clarify, like I said before, like, I want to 1056 00:58:15,760 --> 00:58:18,120 Speaker 1: stay realistic and just be like, I do recognize that 1057 00:58:18,200 --> 00:58:19,640 Speaker 1: I think at this point, like, we can't just be like, 1058 00:58:20,720 --> 00:58:22,640 Speaker 1: here's all these bad things and this is why everybody 1059 00:58:22,640 --> 00:58:25,760 Speaker 1: should just forget the series exists and, like, cancel J.K. Rowling. 1060 00:58:25,880 --> 00:58:27,840 Speaker 1: Like, I don't think that's gonna work. But also, the 1061 00:58:27,920 --> 00:58:29,440 Speaker 1: fact that I don't think that's gonna work, that 1062 00:58:29,480 --> 00:58:31,080 Speaker 1: it hasn't worked at this point, that doesn't 1063 00:58:31,120 --> 00:58:32,360 Speaker 1: mean that there aren't things that we can do 1064 00:58:32,480 --> 00:58:33,880 Speaker 1: that are going to, like, at least be, like, 1065 00:58:34,400 --> 00:58:36,760 Speaker 1: harm reduction, you know. Like, yeah, we should keep talking 1066 00:58:36,760 --> 00:58:38,360 Speaker 1: about this. We should keep talking about all this stuff 1067 00:58:38,360 --> 00:58:40,760 Speaker 1: that's happening, because that's always going to be better than 1068 00:58:40,800 --> 00:58:44,480 Speaker 1: just ignoring it.
And yeah, or like even if 1069 00:58:44,600 --> 00:58:47,000 Speaker 1: maybe a mass boycott isn't going to work, that 1070 00:58:47,120 --> 00:58:49,400 Speaker 1: does not mean that you should just, like, give your 1071 00:58:49,440 --> 00:58:52,840 Speaker 1: money to the corporations that are going to go directly back 1072 00:58:52,880 --> 00:58:55,920 Speaker 1: to J.K. Rowling, or like whatever franchise it is. 1073 00:58:56,080 --> 00:58:58,200 Speaker 1: What I am on the side of, like, I 1074 00:58:58,280 --> 00:59:00,320 Speaker 1: think if people enjoy the series, I understand that, 1075 00:59:00,440 --> 00:59:04,160 Speaker 1: and I respect people's ability to, like, consume the 1076 00:59:04,240 --> 00:59:08,760 Speaker 1: media that works for you, but also have some responsibility 1077 00:59:08,920 --> 00:59:11,400 Speaker 1: and think about how your actions and where you're putting 1078 00:59:11,440 --> 00:59:13,440 Speaker 1: your money are affecting other people and are affecting the 1079 00:59:13,480 --> 00:59:16,440 Speaker 1: queer community and the trans community. Um, and like, what 1080 00:59:16,520 --> 00:59:19,480 Speaker 1: are some alternatives to that? Yeah, I think everybody should just 1081 00:59:19,480 --> 00:59:28,840 Speaker 1: read fan fiction and everything will be fine. Yeah. 1082 00:59:29,000 --> 00:59:36,600 Speaker 1: Joey, don't tell people. Yeah, I know your fan fictions, 1083 00:59:36,640 --> 00:59:40,080 Speaker 1: like Luke Skywalker's a particular favorite. So 1084 00:59:40,120 --> 00:59:45,200 Speaker 1: you're gonna send me these, right? I'm outing her. You 1085 00:59:45,360 --> 00:59:49,920 Speaker 1: really did experiments to see, like, what gets more clicks. 1086 00:59:49,920 --> 00:59:53,640 Speaker 1: It's been fun, what I was putting together.
1087 00:59:54,000 --> 00:59:55,960 Speaker 1: This is because I think the one, like, internet fandom 1088 00:59:56,040 --> 00:59:58,560 Speaker 1: thing that I had some—I don't have a huge 1089 00:59:58,600 --> 01:00:00,320 Speaker 1: TikTok account, but I have friends that are big 1090 01:00:00,360 --> 01:00:03,240 Speaker 1: on, like, Harry Potter TikTok and, like, Marauders TikTok in particular. 1091 01:00:03,280 --> 01:00:04,560 Speaker 1: And this is like the point where I was like, oh 1092 01:00:04,600 --> 01:00:06,960 Speaker 1: my god, somebody's gonna find my TikTok through this. Like, 1093 01:00:07,040 --> 01:00:10,439 Speaker 1: somebody's gonna—like, I've set myself up, oh my god, 1094 01:00:10,720 --> 01:00:13,440 Speaker 1: like, because there's gonna be some video that I was 1095 01:00:13,480 --> 01:00:15,520 Speaker 1: in that my friend tagged that's gonna lead to my account, 1096 01:00:15,520 --> 01:00:19,280 Speaker 1: and somebody's like, you're criticizing J.K. Rowling, and yeah, you 1097 01:00:19,360 --> 01:00:26,440 Speaker 1: make—I don't know. That's the worry. That's 1098 01:00:26,480 --> 01:00:30,920 Speaker 1: the concern. People find out. Oh my god, especially 1099 01:00:30,960 --> 01:00:35,840 Speaker 1: when you get outed. Which I guess that brings 1100 01:00:35,920 --> 01:00:41,440 Speaker 1: us to a wrap up. Joey, thanks as always for coming on. 1101 01:00:41,680 --> 01:00:43,640 Speaker 1: I know there was even more stuff we could talk about, 1102 01:00:43,680 --> 01:00:47,840 Speaker 1: so come back anytime, um, to revisit this or 1103 01:00:47,920 --> 01:00:53,560 Speaker 1: anything else. Where can the good listeners find you? For sure. Yeah, 1104 01:00:53,960 --> 01:00:56,800 Speaker 1: you guys can find me on Twitter or Instagram at 1105 01:00:57,080 --> 01:00:59,960 Speaker 1: Pat not Pratt.
That is P-A-T-T, which is my 1106 01:01:00,160 --> 01:01:02,960 Speaker 1: last name, and then N-O-T, and then P-R-A-T-T, 1107 01:01:03,400 --> 01:01:07,080 Speaker 1: which is not my last name. I have mentioned my TikTok. 1108 01:01:07,120 --> 01:01:08,480 Speaker 1: I will not be telling you guys my TikTok, but 1109 01:01:08,520 --> 01:01:11,440 Speaker 1: if you can find it, good for you. But you 1110 01:01:11,480 --> 01:01:14,160 Speaker 1: should follow the Stuff Mom Never Told You TikTok. Yes, 1111 01:01:15,280 --> 01:01:20,480 Speaker 1: I do that too. Yes. We always love 1112 01:01:20,520 --> 01:01:23,400 Speaker 1: having you on here, so thanks for coming. Thanks for 1113 01:01:25,680 --> 01:01:28,360 Speaker 1: all this work on such a sensitive topic. I know 1114 01:01:28,480 --> 01:01:32,840 Speaker 1: this is emotionally taxing. Yeah, and it's a lot, there's 1115 01:01:32,840 --> 01:01:35,440 Speaker 1: a lot going on. Thank you, thank you, thank you. 1116 01:01:36,600 --> 01:01:38,360 Speaker 1: If you would like to find us, listeners, you can: 1117 01:01:38,480 --> 01:01:41,040 Speaker 1: our email is Stephania Mom Stuff at iHeartMedia dot com. You 1118 01:01:41,040 --> 01:01:43,520 Speaker 1: can find us on Twitter at Mom Stuff Podcast, or Instagram 1119 01:01:43,600 --> 01:01:46,680 Speaker 1: and TikTok at Stuff Mom Never Told You. We're also on 1120 01:01:46,800 --> 01:01:50,880 Speaker 1: YouTube now. Thanks as always to our super producer Christina. 1121 01:01:51,080 --> 01:01:54,080 Speaker 1: Thank you, Christina. Thanks again to Joey, who helps with 1122 01:01:54,160 --> 01:01:57,200 Speaker 1: TikTok and research and this episode, and thanks to you 1123 01:01:57,320 --> 01:02:00,320 Speaker 1: for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts
1124 01:02:00,360 --> 01:02:01,600 Speaker 1: In my Heart Radio, you can check out the Hurt 1125 01:02:01,640 --> 01:02:03,280 Speaker 1: Radio app pop a podcast wherever you listen to your 1126 01:02:03,280 --> 01:02:07,000 Speaker 1: favorite shows. H