1 00:00:05,240 --> 00:00:07,440 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff 2 00:00:07,480 --> 00:00:09,040 Speaker 1: Mom Never Told You, a production of iHeartRadio. 3 00:00:18,720 --> 00:00:22,120 Speaker 2: And in this episode, I'm just going to preface it's 4 00:00:22,160 --> 00:00:25,079 Speaker 2: going to be what I would call interesting. 5 00:00:25,440 --> 00:00:27,400 Speaker 1: It was originally going to. 6 00:00:27,360 --> 00:00:31,160 Speaker 2: Be an unhappy hour and then it ballooned into something 7 00:00:31,240 --> 00:00:33,879 Speaker 2: way too long for that. I am drinking wine though, 8 00:00:33,920 --> 00:00:36,120 Speaker 2: because this makes me really sad, and I've been putting 9 00:00:36,120 --> 00:00:37,159 Speaker 2: off talking about it. 10 00:00:37,360 --> 00:00:39,760 Speaker 3: Yeah, we did for a long time, I. 11 00:00:41,240 --> 00:00:47,599 Speaker 2: Originally. So today we're talking about when a fandom hurts you. 12 00:00:47,800 --> 00:00:50,199 Speaker 2: And we're not talking about toxic fandom, which can hurt 13 00:00:50,280 --> 00:00:54,680 Speaker 2: you absolutely. We're not talking about when a property, looking 14 00:00:54,760 --> 00:00:57,280 Speaker 2: at you, Star Wars, continues to make choices that hurt 15 00:00:57,320 --> 00:01:02,400 Speaker 2: you storyline-wise. We're talking about like when you have 16 00:01:02,520 --> 00:01:06,119 Speaker 2: to leave because it's just hurting so badly. And this 17 00:01:06,160 --> 00:01:09,120 Speaker 2: is about Harry Potter and JK Rowling. So with that, 18 00:01:09,280 --> 00:01:15,080 Speaker 2: there is a content warning for suicidality and transphobia. But 19 00:01:15,160 --> 00:01:17,240 Speaker 2: this is because it was originally going to be an 20 00:01:17,280 --> 00:01:21,640 Speaker 2: unhappy hour. I just want to stress, like, this is 21 00:01:21,880 --> 00:01:28,319 Speaker 2: very much my personal how I've been dealing with this.
22 00:01:28,319 --> 00:01:32,720 Speaker 2: This is not prescriptive. Everyone is different, and I wanted 23 00:01:32,720 --> 00:01:35,120 Speaker 2: to talk about it. When we did, Joey came on 24 00:01:35,160 --> 00:01:38,640 Speaker 2: and did a great episode about kind of the timeline 25 00:01:39,120 --> 00:01:44,120 Speaker 2: of JK Rowling's transphobia and the damage that it's done. 26 00:01:44,400 --> 00:01:46,920 Speaker 2: But because that episode was long, we didn't get to 27 00:01:48,080 --> 00:01:52,000 Speaker 2: talk about the emotional part of it, which I think 28 00:01:52,040 --> 00:01:53,840 Speaker 2: a lot of people leave out of this conversation. I 29 00:01:53,880 --> 00:01:55,560 Speaker 2: think there are a couple of reasons for why they 30 00:01:55,640 --> 00:01:58,880 Speaker 2: do that. But again, yes, this is very much like 31 00:01:59,440 --> 00:02:06,440 Speaker 2: me talking me coming from a very personal place. So 32 00:02:06,480 --> 00:02:09,120 Speaker 2: I just want to make that one thousand percent clear 33 00:02:09,360 --> 00:02:12,359 Speaker 2: before we get into this. This is a very personal thing. 34 00:02:12,919 --> 00:02:15,840 Speaker 2: It is not our typical like researched I have a 35 00:02:15,840 --> 00:02:18,360 Speaker 2: lot of research about this just in my brain, but 36 00:02:18,480 --> 00:02:21,120 Speaker 2: it wasn't like a researched episode. 37 00:02:21,720 --> 00:02:24,399 Speaker 3: I think I will say with this, for anyone who 38 00:02:24,520 --> 00:02:26,880 Speaker 3: was a fan and a true fan, like I really 39 00:02:27,000 --> 00:02:30,320 Speaker 3: liked it, I wasn't emotionally attached to it. So I 40 00:02:30,360 --> 00:02:32,760 Speaker 3: don't I get it, and I have a very very 41 00:02:32,880 --> 00:02:35,600 Speaker 3: very set opinion because I wasn't emotionally attached to it. 
42 00:02:36,200 --> 00:02:38,440 Speaker 3: But people who are, and we know there's tons and 43 00:02:38,480 --> 00:02:42,400 Speaker 3: tons of people who are and were, it's gonna fall 44 00:02:42,440 --> 00:02:46,639 Speaker 3: into being controversial no matter what, And it's going to 45 00:02:46,680 --> 00:02:49,639 Speaker 3: be a heavy topic, unfortunately, because that's what happens when 46 00:02:49,680 --> 00:02:52,720 Speaker 3: you fall in love with something so hard and identify 47 00:02:52,760 --> 00:02:55,760 Speaker 3: with it. That's the part: you identify with it so much, 48 00:02:56,160 --> 00:02:58,839 Speaker 3: and it feels like you've broken a part of your identity. Yeah, 49 00:02:58,840 --> 00:03:00,600 Speaker 3: that can kind of cause a crisis. And this is what 50 00:03:00,600 --> 00:03:01,320 Speaker 3: this is about. 51 00:03:02,120 --> 00:03:04,560 Speaker 2: It is, it is, And I'm mostly talking about it 52 00:03:05,480 --> 00:03:10,480 Speaker 2: because I have a big group of friends that still 53 00:03:10,600 --> 00:03:15,080 Speaker 2: enjoys Harry Potter fan fiction. Like they're very clear they 54 00:03:15,120 --> 00:03:18,520 Speaker 2: don't support her monetarily. 55 00:03:18,760 --> 00:03:19,239 Speaker 3: And.
56 00:03:21,800 --> 00:03:25,400 Speaker 2: I was just thinking about, like, but I feel like 57 00:03:25,480 --> 00:03:27,880 Speaker 2: there's this whole emotional part we're just not talking about, 58 00:03:27,880 --> 00:03:31,440 Speaker 2: like what you're saying, Samantha, of like when you someone 59 00:03:31,480 --> 00:03:33,079 Speaker 2: described it to me the other day when I was 60 00:03:33,160 --> 00:03:34,880 Speaker 2: kind of complaining about it, and she was like, it 61 00:03:34,920 --> 00:03:38,440 Speaker 2: sounds like you broke up with an abusive ex and 62 00:03:38,520 --> 00:03:42,000 Speaker 2: everyone's still hanging out with that person and you're kind 63 00:03:42,040 --> 00:03:44,720 Speaker 2: of taking it personally even if that's not what's happening 64 00:03:44,760 --> 00:03:48,160 Speaker 2: at all. But yeah, it does feel it has a 65 00:03:48,200 --> 00:03:53,240 Speaker 2: lot of emotions tied into it. A lot of people 66 00:03:53,320 --> 00:03:57,120 Speaker 2: do identify with it really hard, and I think 67 00:03:57,840 --> 00:04:00,240 Speaker 2: that means that we get defensive when we have these 68 00:04:00,240 --> 00:04:03,640 Speaker 2: conversations that don't have perfect answers, Like there's not a 69 00:04:03,760 --> 00:04:05,840 Speaker 2: I don't have some solution here I'm going to present. 70 00:04:06,720 --> 00:04:09,440 Speaker 1: I would say, don't support her financially, but I don't 71 00:04:09,480 --> 00:04:10,920 Speaker 1: have a solution. 72 00:04:11,400 --> 00:04:15,440 Speaker 2: And honestly, like, I'm going to Universal in October, and 73 00:04:15,440 --> 00:04:16,960 Speaker 2: that was one of the first things I thought was 74 00:04:17,000 --> 00:04:19,160 Speaker 2: what am I gonna do? And I was like researching 75 00:04:19,200 --> 00:04:20,880 Speaker 2: does she get this money? Like what am I going 76 00:04:20,920 --> 00:04:25,600 Speaker 2: to do about this Harry Potter part of the Universal Park?
77 00:04:25,839 --> 00:04:29,560 Speaker 2: And just like if my friends want to go, am 78 00:04:29,600 --> 00:04:30,839 Speaker 2: I gonna like not go? 79 00:04:31,080 --> 00:04:32,920 Speaker 1: Like what am I? What am I gonna do? 80 00:04:33,600 --> 00:04:35,839 Speaker 3: It? Wouldn't it be that you just shouldn't buy things 81 00:04:35,839 --> 00:04:38,440 Speaker 3: there because it's a part of the Universal park already 82 00:04:38,440 --> 00:04:40,479 Speaker 3: and she probably already gets a chunk of that anyway 83 00:04:40,560 --> 00:04:44,120 Speaker 3: just having it there, but like to go in there 84 00:04:44,120 --> 00:04:46,960 Speaker 3: and then buy something that would be the profit. So 85 00:04:47,440 --> 00:04:49,120 Speaker 3: no Butterbeer for you. 86 00:04:50,240 --> 00:04:52,039 Speaker 2: I'll make my own damn Butterbeer, but I probably 87 00:04:52,040 --> 00:04:55,880 Speaker 2: won't because it makes me so sad now, But that's 88 00:04:55,960 --> 00:05:01,760 Speaker 2: kind of the point, is I don't this episode. I'm 89 00:05:01,800 --> 00:05:04,320 Speaker 2: embarrassed by it, but I might cry. I'm probably gonna 90 00:05:04,320 --> 00:05:06,800 Speaker 2: get emotional. I know it sounds silly to some of you. 91 00:05:06,839 --> 00:05:11,560 Speaker 2: I'm going to explain why, but like, I don't want 92 00:05:11,600 --> 00:05:17,240 Speaker 2: to if you can enjoy it without supporting her, I 93 00:05:17,240 --> 00:05:18,839 Speaker 2: don't want to be the person that comes in and 94 00:05:18,920 --> 00:05:21,400 Speaker 2: ruins that for you. I'm actually kind of jealous of you. 95 00:05:22,560 --> 00:05:27,479 Speaker 2: One is not better than the other because I 96 00:05:27,560 --> 00:05:30,800 Speaker 2: really don't have any judgment. We all have different mental gymnastics 97 00:05:30,839 --> 00:05:34,080 Speaker 2: in how we deal with these things.
But one thing 98 00:05:34,160 --> 00:05:36,040 Speaker 2: that I think has really hung me up on this 99 00:05:36,080 --> 00:05:40,680 Speaker 2: conversation is my dad was kind of controlling about what 100 00:05:40,720 --> 00:05:45,960 Speaker 2: you could enjoy based on what he enjoyed, And so 101 00:05:46,000 --> 00:05:47,480 Speaker 2: I think I've gotten really in my head about I 102 00:05:47,520 --> 00:05:49,640 Speaker 2: don't want to ruin it for other people. But it 103 00:05:49,720 --> 00:05:52,000 Speaker 2: also means I'm not explaining to other people how much 104 00:05:52,000 --> 00:05:55,520 Speaker 2: it kind of hurts me, right, So it's just kind 105 00:05:55,520 --> 00:05:58,360 Speaker 2: of it's built up. So I'm worried if I go 106 00:05:59,400 --> 00:06:02,040 Speaker 2: and don't buy something there, I'll still just be a 107 00:06:02,040 --> 00:06:05,239 Speaker 2: miserable person to be around. But also if I leave, 108 00:06:06,400 --> 00:06:09,000 Speaker 2: then that's gonna make them feel bad. 109 00:06:09,120 --> 00:06:09,760 Speaker 1: Like I don't know. 110 00:06:09,680 --> 00:06:15,160 Speaker 2: What the good solution is right right, And speaking of 111 00:06:15,279 --> 00:06:21,159 Speaker 2: like to be clear, I don't have great coping mechanisms. 112 00:06:22,040 --> 00:06:25,440 Speaker 2: I do really use media as an escape. I use 113 00:06:25,480 --> 00:06:28,640 Speaker 2: it to cope with trauma. So when I do lose that, 114 00:06:28,800 --> 00:06:32,839 Speaker 2: it is hurtful, which I don't think is necessarily unhealthy, 115 00:06:32,960 --> 00:06:35,719 Speaker 2: but I feel like I lean into things so hard, 116 00:06:35,920 --> 00:06:38,679 Speaker 2: and again maybe that's not unhealthy either. 117 00:06:38,520 --> 00:06:39,800 Speaker 1: But that means it hurts a lot. 
118 00:06:40,120 --> 00:06:43,560 Speaker 2: It hurts like a lot, and it's just it's sort 119 00:06:43,600 --> 00:06:48,159 Speaker 2: of embarrassing because it feels trivial. I know there's so 120 00:06:48,160 --> 00:06:54,000 Speaker 2: many bigger problems in the world, but it's it is 121 00:06:54,040 --> 00:06:58,039 Speaker 2: something you use because of those bigger problems, like you 122 00:06:58,160 --> 00:07:01,920 Speaker 2: use it, and I think it impacts a lot more 123 00:07:02,000 --> 00:07:06,240 Speaker 2: than people realize, especially people who don't do this, who don't, 124 00:07:07,000 --> 00:07:09,640 Speaker 2: like you said, maybe aren't attached to it, Like maybe 125 00:07:09,640 --> 00:07:11,800 Speaker 2: you like it, but you're kind of like eh, So 126 00:07:11,920 --> 00:07:14,000 Speaker 2: when you lose it, it's sort of like, oh, that's sad, 127 00:07:14,080 --> 00:07:20,880 Speaker 2: But there are people who do, and I know people 128 00:07:21,800 --> 00:07:27,280 Speaker 2: still continue to use it to cope even knowing about 129 00:07:27,320 --> 00:07:31,800 Speaker 2: her but not supporting her. And I totally get that, 130 00:07:32,480 --> 00:07:39,040 Speaker 2: I really do. But that's that's kind of the what 131 00:07:39,080 --> 00:07:41,280 Speaker 2: I wanted to talk about, because I did love it 132 00:07:42,400 --> 00:07:45,240 Speaker 2: so much like if you had told me, like a 133 00:07:45,240 --> 00:07:47,800 Speaker 2: decade ago, I would have laughed in your face and 134 00:07:47,840 --> 00:07:50,880 Speaker 2: been offended if you'd said, like, it's not gonna be 135 00:07:50,920 --> 00:07:53,600 Speaker 2: part of your life anymore. 
And it might be surprising 135 00:07:53,600 --> 00:07:56,120 Speaker 2: to hear that, because like, if you've been listening to 136 00:07:56,120 --> 00:07:59,560 Speaker 2: me on the show, you might think, like, oh, she'd 137 00:07:59,640 --> 00:08:01,920 Speaker 2: be more likely to give up Harry Potter than Star Wars. 138 00:08:01,920 --> 00:08:03,160 Speaker 1: That is not true. 139 00:08:05,400 --> 00:08:07,680 Speaker 2: In fact, one of the reasons I'm so into Star 140 00:08:07,720 --> 00:08:11,280 Speaker 2: Wars right now is because I have always loved it, 141 00:08:12,040 --> 00:08:15,679 Speaker 2: but then the pandemic happened and I lost Harry Potter, 142 00:08:15,760 --> 00:08:18,000 Speaker 2: and I was like, oh God, I gotta find something 143 00:08:18,200 --> 00:08:19,720 Speaker 2: to fill that space. 144 00:08:20,600 --> 00:08:22,400 Speaker 1: I hate it. I hate that it hurts so much. 145 00:08:22,440 --> 00:08:22,960 Speaker 1: It shouldn't. 146 00:08:23,160 --> 00:08:25,600 Speaker 3: So all right, so let's go ahead and put a 147 00:08:25,600 --> 00:08:28,880 Speaker 3: cap on this. Because you feel things this deeply, you 148 00:08:28,880 --> 00:08:31,640 Speaker 3: should realize and validate that because it was a big 149 00:08:31,680 --> 00:08:34,120 Speaker 3: part of your life, and to call it dumb or 150 00:08:34,200 --> 00:08:36,760 Speaker 3: to call it something and try to squash it, it 151 00:08:36,800 --> 00:08:39,760 Speaker 3: makes it worse.
You're trying to invalidate yourself when you 152 00:08:39,800 --> 00:08:42,520 Speaker 3: know it's a lie, so you're lying to yourself and 153 00:08:42,559 --> 00:08:45,240 Speaker 3: it's just gonna trigger so many things, so go ahead 154 00:08:45,240 --> 00:08:48,400 Speaker 3: and stop, because this is loss, and no matter what, 155 00:08:48,640 --> 00:08:53,480 Speaker 3: for you, loss is significant and for you who don't mourn well, 156 00:08:56,160 --> 00:08:58,840 Speaker 3: this is going to be a greater impact because you 157 00:08:58,920 --> 00:09:02,240 Speaker 3: try to ignore it. Which, by the way, sharing with 158 00:09:02,480 --> 00:09:06,240 Speaker 3: our listeners is a beautiful and vulnerable thing and I 159 00:09:06,280 --> 00:09:09,000 Speaker 3: hope they understand and you understand how big of a 160 00:09:09,040 --> 00:09:13,520 Speaker 3: deal this is. So no matter what, this episode with 161 00:09:13,640 --> 00:09:19,280 Speaker 3: you being heartbroken makes sense and it's okay. And you 162 00:09:19,400 --> 00:09:21,400 Speaker 3: got to figure out how to deal with that somehow 163 00:09:21,440 --> 00:09:25,480 Speaker 3: because it's not going away. So with that, and as 164 00:09:25,559 --> 00:09:27,040 Speaker 3: much as you want to try to replace it, that 165 00:09:27,080 --> 00:09:32,400 Speaker 3: doesn't work no matter what, because the initial emotion is 166 00:09:32,440 --> 00:09:35,160 Speaker 3: something that you're going to long for always because you 167 00:09:35,160 --> 00:09:40,400 Speaker 3: can never replace that. And again, the first love 168 00:09:41,080 --> 00:09:44,000 Speaker 3: is something new and it's something that is even if 169 00:09:44,000 --> 00:09:49,720 Speaker 3: you find any love, it's still something different. So it's okay, 170 00:09:50,000 --> 00:09:53,560 Speaker 3: it's not dumb.
Don't say that, because this is very 171 00:09:53,640 --> 00:09:56,720 Speaker 3: valid and just because not everyone understands it doesn't change 172 00:09:56,720 --> 00:10:01,400 Speaker 3: anything because there's a lot more people who do. So 173 00:10:01,520 --> 00:10:05,199 Speaker 3: deep breath, let's put that to the side. Don't say that, 174 00:10:05,559 --> 00:10:08,520 Speaker 3: and listeners, you feeling that is okay too? 175 00:10:10,600 --> 00:10:11,360 Speaker 1: Yeah, thank you. 176 00:10:11,559 --> 00:10:14,520 Speaker 2: I mean, it's it's just it does feel like grieving, 177 00:10:14,600 --> 00:10:17,360 Speaker 2: and I'm going to talk about that more about why 178 00:10:17,360 --> 00:10:20,000 Speaker 2: that is. But I think also a part of it 179 00:10:20,040 --> 00:10:26,760 Speaker 2: is I really did grow. 180 00:10:26,640 --> 00:10:27,120 Speaker 1: Up with it. 181 00:10:27,480 --> 00:10:29,440 Speaker 2: I know it sounds kind of trite, but I really did, 182 00:10:29,520 --> 00:10:31,439 Speaker 2: Like it's a long time to love something. 183 00:10:31,480 --> 00:10:32,280 Speaker 1: It's a long time. 184 00:10:33,679 --> 00:10:35,720 Speaker 2: I think I read the first book when I was nine, 185 00:10:36,640 --> 00:10:37,880 Speaker 2: and it was just a big part of my life. 186 00:10:37,880 --> 00:10:39,440 Speaker 2: But it was how I met friends, Like some of 187 00:10:39,440 --> 00:10:43,560 Speaker 2: my closest friendships were made over this, and I 188 00:10:43,720 --> 00:10:48,040 Speaker 2: was kind of mocked for it because I feel like 189 00:10:48,080 --> 00:10:53,120 Speaker 2: Harry Potter eventually achieved a mainstream acceptance, but for a 190 00:10:53,160 --> 00:10:56,560 Speaker 2: while it was sort of seen as like, that's sort of silly, 191 00:10:56,640 --> 00:10:59,280 Speaker 2: like don't talk. That's embarrassing, especially as you got older.
193 00:10:59,480 --> 00:11:01,959 Speaker 2: And so I really defended it, like I stood. 194 00:11:01,760 --> 00:11:04,640 Speaker 1: Up for it. It was a part of my identity. 195 00:11:05,880 --> 00:11:08,760 Speaker 2: It was the version of Star Wars now where like 196 00:11:08,800 --> 00:11:10,320 Speaker 2: if you didn't know what to get me, you would 197 00:11:10,320 --> 00:11:12,480 Speaker 2: just give me something in that realm and I would 198 00:11:12,480 --> 00:11:15,160 Speaker 2: be very happy. And I would say that she was 199 00:11:15,160 --> 00:11:18,040 Speaker 2: a feminist. I would say, like, no, she's out there 200 00:11:20,000 --> 00:11:22,840 Speaker 2: like doing this good thing. And so it's sort of 201 00:11:22,880 --> 00:11:26,880 Speaker 2: embarrassing in that sense too, because some people, and I 202 00:11:26,920 --> 00:11:31,360 Speaker 2: know I know what they mean, but some people would 203 00:11:31,360 --> 00:11:33,640 Speaker 2: be like, I guess that was right and it was. 204 00:11:33,720 --> 00:11:36,000 Speaker 1: bad all along. I'm like, that does not help me. 205 00:11:36,440 --> 00:11:37,680 Speaker 1: I get what you're doing. 206 00:11:39,600 --> 00:11:41,400 Speaker 3: But the only people I know who said that was 207 00:11:41,440 --> 00:11:45,840 Speaker 3: bad all along were the ones who would say, it's witchcraft, yeah, yes, 208 00:11:46,080 --> 00:11:49,320 Speaker 3: which was my parents, So they would hate it more 209 00:11:49,440 --> 00:11:53,320 Speaker 3: because they were feminists. So because she is, she's a TERF. 210 00:11:53,640 --> 00:11:56,800 Speaker 3: We know that. Yes, they think themselves feminists, but they 211 00:11:56,840 --> 00:11:57,600 Speaker 3: exclude so.
212 00:11:57,800 --> 00:12:00,000 Speaker 2: Yes, and we're we're gonna talk about that too, because 213 00:12:00,120 --> 00:12:02,840 Speaker 2: that's That's one of the things that hurts the most 214 00:12:02,880 --> 00:12:06,439 Speaker 2: I think is that I did think this about her, 215 00:12:07,200 --> 00:12:10,240 Speaker 2: and then she just is using that to hurt people 216 00:12:10,720 --> 00:12:13,880 Speaker 2: and convince people and normalize a thing that is just hate. 217 00:12:14,400 --> 00:12:17,560 Speaker 2: Some people have also argued I'm actually being sexist and 218 00:12:17,640 --> 00:12:18,640 Speaker 2: judging her harder. 219 00:12:18,760 --> 00:12:19,640 Speaker 1: I do not think so. 220 00:12:21,000 --> 00:12:25,640 Speaker 2: I do get that that happens sometimes, but I think 221 00:12:25,720 --> 00:12:28,480 Speaker 2: if you're calling yourself a feminist and then you're doing this, 222 00:12:28,559 --> 00:12:35,559 Speaker 2: then I don't. I don't think that's the same. 223 00:12:32,600 --> 00:12:35,240 Speaker 3: Right. For me, I will say, before 224 00:12:35,440 --> 00:12:38,080 Speaker 3: all the big things happened, I would not watch the ones with 225 00:12:38,200 --> 00:12:41,280 Speaker 3: Johnny Depp, so all the, yeah, that stuff, I didn't watch 226 00:12:41,320 --> 00:12:45,000 Speaker 3: it and I still haven't. Yeah, because I'm sorry, Johnny 227 00:12:45,040 --> 00:12:46,880 Speaker 3: Depp is on my no list, and I know that 228 00:12:46,880 --> 00:12:48,680 Speaker 3: that's the whole thing, but whatever, I have a whole 229 00:12:48,679 --> 00:12:51,920 Speaker 3: no list and a lot of men on it. 230 00:12:52,040 --> 00:12:57,000 Speaker 2: Yes, I'm with you, he's on mine as well now. But 231 00:12:57,280 --> 00:13:00,120 Speaker 2: I also think I kind of want to come back to this. I 232 00:13:00,160 --> 00:13:01,760 Speaker 2: can do a whole episode on this.
But I also 232 00:13:01,760 --> 00:13:04,120 Speaker 2: think there's this thing that's happening, or there's a nuance 233 00:13:04,160 --> 00:13:07,560 Speaker 2: between being a bad feminist but then just using that 234 00:13:07,600 --> 00:13:11,520 Speaker 2: as an excuse, like you can acknowledge and say like, 235 00:13:11,559 --> 00:13:15,480 Speaker 2: I like these things that are problematic, But there's a 236 00:13:15,480 --> 00:13:18,680 Speaker 2: difference between that and being like, oh, well, no one's 237 00:13:18,720 --> 00:13:22,760 Speaker 2: perfect, when the not perfect is outright harmful, 238 00:13:22,760 --> 00:13:27,200 Speaker 2: Like it's a it's a nuance, it's a fine line. 239 00:13:27,280 --> 00:13:29,960 Speaker 2: But I feel like some people are just like, oh, well, 240 00:13:30,000 --> 00:13:32,400 Speaker 2: I'm not perfect, so they don't talk about it. They're 241 00:13:32,440 --> 00:13:35,400 Speaker 2: just like, well, yep, too bad, but we kind of 242 00:13:35,440 --> 00:13:42,000 Speaker 2: have to talk about it. Some people have also said 243 00:13:42,000 --> 00:13:46,320 Speaker 2: that they understand why she's cruel and callous because she's 244 00:13:46,360 --> 00:13:50,360 Speaker 2: dealt with so much from actual terrible people, Okay, but 245 00:13:51,600 --> 00:13:53,800 Speaker 2: this is happening with like people who love her and 246 00:13:53,840 --> 00:13:55,880 Speaker 2: are not trolling. There are terrible fans, don't get 247 00:13:55,920 --> 00:13:59,280 Speaker 2: me wrong, horrible fans, but this is coming from like 248 00:13:59,320 --> 00:14:02,160 Speaker 2: people who are like I want to love your work, Hey, 249 00:14:02,240 --> 00:14:04,440 Speaker 2: why are you doing this? Or what do you even 250 00:14:04,440 --> 00:14:07,280 Speaker 2: think about this? Sometimes like not even what are you doing? 251 00:14:08,640 --> 00:14:13,680 Speaker 2: And she's just hateful and dismissive.
And that's I hate 252 00:14:13,679 --> 00:14:18,600 Speaker 2: that too, because I would count myself in that of 253 00:14:18,679 --> 00:14:25,280 Speaker 2: you know, I'd never like attacked JK Rowling online, But 254 00:14:26,520 --> 00:14:29,400 Speaker 2: I just feel like she's so I'm getting wrapped up 255 00:14:29,440 --> 00:14:32,440 Speaker 2: in how she's talking about these people, these people being 256 00:14:32,440 --> 00:14:35,520 Speaker 2: the people she perceives as having targeted her, when I'm 257 00:14:35,520 --> 00:14:38,160 Speaker 2: really just like, no, you're hurting people, all. 258 00:14:38,120 --> 00:14:41,200 Speaker 3: right when you see outright posts, Please don't do this. 259 00:14:41,560 --> 00:14:43,760 Speaker 3: My heart is breaking. I've leaned on you so hard. 260 00:14:44,120 --> 00:14:46,080 Speaker 3: This is one of the most like heartbreaking things that 261 00:14:46,120 --> 00:14:49,120 Speaker 3: you would attack who I am. And her response is 262 00:14:49,160 --> 00:14:51,280 Speaker 3: callous and cold and being like, well you're not a 263 00:14:51,280 --> 00:14:54,000 Speaker 3: woman or this, which is her responses to things, and 264 00:14:54,040 --> 00:14:57,480 Speaker 3: then being invalidating. It's one thing again to be like 265 00:14:57,520 --> 00:15:01,120 Speaker 3: criticized and to go back towards them, but to invalidate 266 00:15:01,120 --> 00:15:04,120 Speaker 3: a person's love for you, Wow, that's cold.
268 00:15:04,880 --> 00:15:07,840 Speaker 2: Yeah, And it's like, this is one of the things 269 00:15:07,880 --> 00:15:13,440 Speaker 2: that bothers me because I have had disagreements with friends 270 00:15:13,480 --> 00:15:18,360 Speaker 2: about this, and it almost always boils down to like 271 00:15:18,440 --> 00:15:22,120 Speaker 2: the monetary part, which I do, I get, But the 272 00:15:22,200 --> 00:15:25,600 Speaker 2: thing is I supported her monetarily for a long time, 273 00:15:25,840 --> 00:15:29,560 Speaker 2: like the reason she's able to she keeps tweeting about like, oh, 274 00:15:30,280 --> 00:15:33,960 Speaker 2: I'll just watch my royalty checks. That's me. Like, that 275 00:15:34,040 --> 00:15:37,960 Speaker 2: monetary part, I am a part of it now, even 276 00:15:38,000 --> 00:15:41,000 Speaker 2: if I don't want to be. And so it's just 277 00:15:41,080 --> 00:15:44,080 Speaker 2: so infuriating when she does that, where it's like she's 278 00:15:45,040 --> 00:15:49,040 Speaker 2: being hurtful and cruel to the people who did 279 00:15:49,520 --> 00:15:53,080 Speaker 2: provide those residuals that are allowing her to make these 280 00:15:53,200 --> 00:15:54,400 Speaker 2: very cruel comments. 281 00:15:54,520 --> 00:15:55,120 Speaker 1: It's messy. 282 00:15:55,200 --> 00:15:57,320 Speaker 2: Like I understand that it's messy, but that's one of 283 00:15:57,400 --> 00:16:00,400 Speaker 2: the things that I keep getting hung up on, is that. 284 00:16:00,400 --> 00:16:02,680 Speaker 1: I did support her quite a bit. 285 00:16:03,480 --> 00:16:06,320 Speaker 2: And speaking of this is not a gatekeepy fan competition, 286 00:16:06,400 --> 00:16:09,280 Speaker 2: because I think I feel like a lot of times 287 00:16:09,280 --> 00:16:11,600 Speaker 2: this devolves into who's the bigger fan, and I don't 288 00:16:11,600 --> 00:16:14,000 Speaker 2: think that's useful at all.
But when I say I 289 00:16:14,720 --> 00:16:16,480 Speaker 2: was a fan, I was a fan, like I was 290 00:16:16,520 --> 00:16:23,800 Speaker 2: a huge fan, and now because of how she's behaved, 291 00:16:23,880 --> 00:16:26,400 Speaker 2: I can't even read fan fiction anymore because it's just 292 00:16:27,080 --> 00:16:30,200 Speaker 2: it just hurts, like it's the first thing I think. 293 00:16:30,440 --> 00:16:32,000 Speaker 1: It's the only thing I can think of, is like, 294 00:16:32,840 --> 00:16:33,760 Speaker 1: but she she. 295 00:16:33,960 --> 00:16:36,840 Speaker 2: said this, or she did this, or she really kind 296 00:16:36,880 --> 00:16:39,560 Speaker 2: of hates a lot of people, and I might be 297 00:16:39,600 --> 00:16:42,560 Speaker 2: in that circle, like it's a very I can't do it. 298 00:16:43,080 --> 00:16:47,720 Speaker 1: And again that's a very personal thing. I know people 299 00:16:47,760 --> 00:16:53,560 Speaker 1: that can, but I can't. And I think one of 300 00:16:53,600 --> 00:16:59,960 Speaker 1: the arguments I've heard a lot is about fan fiction, 301 00:17:00,000 --> 00:17:04,360 Speaker 1: and it's in particular I know she's terrible, but these 302 00:17:04,440 --> 00:17:09,960 Speaker 1: characters mean too much to me, like oh, which I understand, 303 00:17:10,800 --> 00:17:12,760 Speaker 1: But I hate it when people say that because it 304 00:17:12,760 --> 00:17:16,840 Speaker 1: implies that I didn't love them as much 305 00:17:16,880 --> 00:17:20,760 Speaker 1: as you did and that it didn't hurt, or that 306 00:17:20,800 --> 00:17:23,160 Speaker 1: it was easy, or that it was like a choice 307 00:17:23,200 --> 00:17:24,560 Speaker 1: I made, because I really didn't. 308 00:17:25,200 --> 00:17:28,640 Speaker 2: It just hurt too bad.
I chose to stop supporting 309 00:17:28,640 --> 00:17:30,960 Speaker 2: her financially, but I didn't choose to let go or 310 00:17:31,000 --> 00:17:35,119 Speaker 2: be forced out of it, So it really riles me. 311 00:17:35,160 --> 00:17:36,280 Speaker 1: Also, people say. 312 00:17:36,119 --> 00:17:39,720 Speaker 2: that I do get it. I really honestly do. And 313 00:17:39,800 --> 00:17:45,399 Speaker 2: as we talked about in the episode with Joey, this 314 00:17:45,520 --> 00:17:48,720 Speaker 2: situation with JK Rowling and Harry Potter is really unique. She 315 00:17:49,200 --> 00:17:54,639 Speaker 2: is at the heart of it, essentially. Unlike a 316 00:17:54,680 --> 00:17:58,600 Speaker 2: lot of other decentralized entertainment like Star Wars, that's just 317 00:17:59,040 --> 00:18:02,160 Speaker 2: her thing, and she's the very public face of it. 318 00:18:02,800 --> 00:18:05,600 Speaker 2: And she's very publicly and very coolly and very mockingly 319 00:18:06,680 --> 00:18:09,399 Speaker 2: doing these things. And I feel like a lot of 320 00:18:09,400 --> 00:18:13,080 Speaker 2: people do like to whenever I bring this up, they 321 00:18:13,160 --> 00:18:15,119 Speaker 2: like to do whataboutism, like, well, 322 00:18:15,119 --> 00:18:18,040 Speaker 2: do you think entertainment you like is not good either? 323 00:18:18,160 --> 00:18:21,080 Speaker 2: And all this stuff. Some people have even brought up, 324 00:18:21,119 --> 00:18:23,320 Speaker 2: like old timey books that are on her like school 325 00:18:23,400 --> 00:18:26,520 Speaker 2: booklist, or like, that is not the same. Look, I 326 00:18:26,520 --> 00:18:31,200 Speaker 2: don't have, Like I didn't personally seek that out.
There's 327 00:18:31,200 --> 00:18:33,720 Speaker 2: a difference between something happening in the eighteen hundreds, which 328 00:18:33,800 --> 00:18:36,919 Speaker 2: by all means, yes, let us call out how problematic 329 00:18:36,960 --> 00:18:43,080 Speaker 2: it was, versus someone in our current world like today, 330 00:18:43,160 --> 00:18:49,920 Speaker 2: maybe doing these things that are just impacting current politics. 331 00:18:49,960 --> 00:18:54,000 Speaker 2: As we've talked about in the episode with Joey. And 332 00:18:55,760 --> 00:18:59,880 Speaker 2: I don't know, Uh, I literally did have a dream 333 00:19:00,040 --> 00:19:03,840 Speaker 2: that she had people stone me to death and laughed 334 00:19:03,880 --> 00:19:05,280 Speaker 2: while I died, like a week ago. 335 00:19:06,280 --> 00:19:09,960 Speaker 1: And this is someone I used to idolize. 336 00:19:10,000 --> 00:19:15,520 Speaker 2: I have her autograph, like, and I really did hold 337 00:19:15,560 --> 00:19:18,320 Speaker 2: onto it for as long as I could, And I 338 00:19:18,320 --> 00:19:20,720 Speaker 2: admit it because I knew in twenty eighteen is when 339 00:19:20,760 --> 00:19:26,479 Speaker 2: it was first like something's something's amiss here. And it 340 00:19:26,520 --> 00:19:29,440 Speaker 2: wasn't until twenty twenty that I was like, oh, I 341 00:19:29,480 --> 00:19:30,920 Speaker 2: can't do this anymore, which. 342 00:19:30,720 --> 00:19:33,600 Speaker 1: is when she really really doubled down. 343 00:19:33,760 --> 00:19:35,680 Speaker 3: Yeah. Yeah, she was starting to like some comments and 344 00:19:35,720 --> 00:19:38,159 Speaker 3: I was like, wait what, Yeah, because she was she 345 00:19:38,320 --> 00:19:40,439 Speaker 3: was She seemed like part of the problem was her 346 00:19:40,480 --> 00:19:45,399 Speaker 3: persona was championing, yes, marginalized folks.
At one point, the 347 00:19:45,400 --> 00:19:49,240 Speaker 3: whole Dumbledore conversation about him, yeah, he's definitely gay, to 348 00:19:49,600 --> 00:19:53,159 Speaker 3: Hermione being played by a black woman, being like, why 349 00:19:53,160 --> 00:19:54,879 Speaker 3: wouldn't she be, I never said she was white, like 350 00:19:54,920 --> 00:19:58,280 Speaker 3: all these things that made you think that she was, 351 00:20:00,119 --> 00:20:03,440 Speaker 3: you know, like, for the marginalized. Which, again, 352 00:20:03,560 --> 00:20:05,919 Speaker 3: especially when you have a character that's an orphan, immediately, 353 00:20:05,960 --> 00:20:08,639 Speaker 3: like, as a person who was literally an orphan living 354 00:20:08,640 --> 00:20:10,760 Speaker 3: in an orphanage, like, that is a call out to 355 00:20:10,960 --> 00:20:13,720 Speaker 3: love this character, because you feel pity for this character, 356 00:20:13,760 --> 00:20:17,400 Speaker 3: and then you feel some sense of commonality, because they're 357 00:20:17,440 --> 00:20:21,200 Speaker 3: the underdogs, like, that's what, you know, with kindness and caring, 358 00:20:21,200 --> 00:20:23,520 Speaker 3: which again, my social worker friends and I laugh, because 359 00:20:23,520 --> 00:20:24,880 Speaker 3: we're like, if that was a real person, he would 360 00:20:24,880 --> 00:20:27,600 Speaker 3: have been really stunted, and like, he would, he would 361 00:20:27,640 --> 00:20:29,879 Speaker 3: have committed a lot of crimes, and or be so 362 00:20:30,040 --> 00:20:33,760 Speaker 3: introverted and awkward that we would have societal issues, like me. 363 00:20:34,280 --> 00:20:40,679 Speaker 3: But all that's to say, these fictionalized characters are built up 364 00:20:40,720 --> 00:20:44,840 Speaker 3: so that you can feel a commonality, and as we 365 00:20:44,920 --> 00:20:48,439 Speaker 3: know today, as she did as a child herself.
She 366 00:20:48,520 --> 00:20:51,840 Speaker 3: talked about it, being, feeling like the underdog the entire time. 367 00:20:52,359 --> 00:20:55,280 Speaker 3: So that's a call out to those who felt that 368 00:20:55,280 --> 00:20:57,600 Speaker 3: way in real life and finding a character to love 369 00:20:57,840 --> 00:21:01,000 Speaker 3: and then finding different characters that could be you. So 370 00:21:01,040 --> 00:21:03,680 Speaker 3: like, having a Hermione character and a Ron character and 371 00:21:03,720 --> 00:21:07,520 Speaker 3: all the different characters, outsider characters, made you fall in 372 00:21:07,520 --> 00:21:09,720 Speaker 3: love with that situation and wish you were a part 373 00:21:09,760 --> 00:21:13,600 Speaker 3: of that world, because she created this world. No matter what, yes, 374 00:21:13,680 --> 00:21:16,600 Speaker 3: she is talented. Like, she's an awful person, but she 375 00:21:16,680 --> 00:21:19,320 Speaker 3: is talented, and we know that, and unfortunately she built 376 00:21:19,400 --> 00:21:22,679 Speaker 3: up this love for years. Because I also watched in 377 00:21:22,720 --> 00:21:25,040 Speaker 3: real time a good friend of mine, I've talked about 378 00:21:25,040 --> 00:21:29,840 Speaker 3: her before, whose life was in shambles. Like, I watched 379 00:21:29,880 --> 00:21:33,959 Speaker 3: her going through so much in college, and like, watching 380 00:21:33,960 --> 00:21:37,879 Speaker 3: and having to be careful to care for her because 381 00:21:37,920 --> 00:21:41,240 Speaker 3: her stability was so unsure. I've seen so many things 382 00:21:41,359 --> 00:21:42,560 Speaker 3: in her life. I was like, oh my god, is 383 00:21:42,600 --> 00:21:46,600 Speaker 3: she okay?
She leaned so hard into Harry Potter that when 384 00:21:46,600 --> 00:21:49,000 Speaker 3: people would question it, she would break down as well, 385 00:21:49,040 --> 00:21:51,560 Speaker 3: like, one person was like, you're in college, why do you 386 00:21:51,720 --> 00:21:53,760 Speaker 3: like this? And that was, like, she was there from 387 00:21:53,760 --> 00:21:56,719 Speaker 3: the beginning too, but we were, again, age different. She 388 00:21:56,800 --> 00:21:59,240 Speaker 3: was late high school, then college, when she was 389 00:21:59,320 --> 00:22:02,320 Speaker 3: loving it, and people would make fun of her, because, yes, 390 00:22:02,400 --> 00:22:03,879 Speaker 3: us as college students, we would just be like, what 391 00:22:03,920 --> 00:22:06,520 Speaker 3: are you doing, well, this is a kid's book, whatever, whatnot. 392 00:22:07,280 --> 00:22:10,080 Speaker 3: But then seeing her lean so hard into it and 393 00:22:10,119 --> 00:22:14,840 Speaker 3: getting defensive, and making sure you understood how 394 00:22:14,880 --> 00:22:17,280 Speaker 3: important it is to her. Again, I don't know where 395 00:22:17,280 --> 00:22:19,200 Speaker 3: she stands today. We haven't talked in a long time, 396 00:22:19,560 --> 00:22:25,439 Speaker 3: but I realized, wow, okay, this is, this is her reality, 397 00:22:25,960 --> 00:22:30,040 Speaker 3: that fictional land is her reality, and she needs it 398 00:22:30,080 --> 00:22:33,479 Speaker 3: to cope today or everything falls apart. Like, when we 399 00:22:33,520 --> 00:22:37,360 Speaker 3: talk about suicidality, I was watching her.
Let's just say 400 00:22:37,440 --> 00:22:39,920 Speaker 3: it like that. But when she would lean into it and 401 00:22:40,000 --> 00:22:44,440 Speaker 3: just watch these movies repeatedly, read these books repeatedly, look 402 00:22:44,480 --> 00:22:49,560 Speaker 3: into the world repeatedly, because she needed that to survive, 403 00:22:50,240 --> 00:22:55,160 Speaker 3: literally, to the next day. So knowing that and then 404 00:22:55,280 --> 00:22:58,040 Speaker 3: having that world fall apart, because the person who created it, 405 00:22:58,080 --> 00:23:00,640 Speaker 3: who you loved, because they created this world for you 406 00:23:00,960 --> 00:23:02,920 Speaker 3: and then they said, yes, this is my world that 407 00:23:02,960 --> 00:23:04,520 Speaker 3: I want you to be a part of, you're welcome, 408 00:23:04,720 --> 00:23:06,800 Speaker 3: where you can feel safe, and then to rip that 409 00:23:06,880 --> 00:23:11,800 Speaker 3: from you, is heartbreaking. And it's not only heartbreaking, once again, 410 00:23:11,840 --> 00:23:13,960 Speaker 3: it's mourning. You've lost something. 411 00:23:15,200 --> 00:23:20,960 Speaker 2: Yeah. And that was me. Like, I think I did 412 00:23:20,960 --> 00:23:22,960 Speaker 2: what I had to do to survive, as you said, 413 00:23:23,000 --> 00:23:25,200 Speaker 2: but I did. Like, I would sit in my room 414 00:23:25,200 --> 00:23:27,760 Speaker 2: and I would just hold the book and I would 415 00:23:27,800 --> 00:23:31,680 Speaker 2: be like, I can, I can get through those because 416 00:23:31,680 --> 00:23:35,600 Speaker 2: I have this. And it might not be the healthiest thing, 417 00:23:35,640 --> 00:23:38,120 Speaker 2: but it was, like, what I had, and. 418 00:23:39,440 --> 00:23:39,879 Speaker 1: I did.
419 00:23:40,119 --> 00:23:42,280 Speaker 2: When I would have suicidality, I would think, well, you 420 00:23:42,320 --> 00:23:45,960 Speaker 2: don't know how Harry Potter ends yet, so that's something 421 00:23:45,960 --> 00:23:49,720 Speaker 2: to keep fighting for. So it was like, it was 422 00:23:49,840 --> 00:23:52,920 Speaker 2: such a big part of my life at my darkest, 423 00:23:53,760 --> 00:23:57,720 Speaker 2: like, darkest times. And now, yeah, it does, it's 424 00:23:57,720 --> 00:24:01,399 Speaker 2: like, alienating me and, like, making me feel unsafe. I 425 00:24:01,760 --> 00:24:04,920 Speaker 2: had a friend say, like, every time I see someone 426 00:24:05,240 --> 00:24:07,520 Speaker 2: in Harry Potter cosplay, I have to wonder, like, are 427 00:24:07,520 --> 00:24:11,120 Speaker 2: they a safe person for me to be around? Like, do 428 00:24:11,080 --> 00:24:11,600 Speaker 1: I know 429 00:24:13,200 --> 00:24:16,280 Speaker 2: how they feel, or are they, like, they know, but they're 430 00:24:16,320 --> 00:24:17,399 Speaker 2: like, able to deal with it? 431 00:24:17,480 --> 00:24:17,919 Speaker 1: Do they know 432 00:24:18,000 --> 00:24:21,880 Speaker 2: and they agree? Like, all of those things. And it's 433 00:24:21,880 --> 00:24:26,879 Speaker 2: also kind of like, it's not fun having to explain it. 434 00:24:26,920 --> 00:24:29,119 Speaker 2: I had to explain to my mom why she should 435 00:24:29,280 --> 00:24:32,840 Speaker 2: stop giving me Harry Potter stuff. Oh no, it's just, 436 00:24:32,880 --> 00:24:35,240 Speaker 2: like, I know that's really silly, but it's not fun 437 00:24:35,280 --> 00:24:37,840 Speaker 2: to have to be like, well, actually, turns out this 438 00:24:37,920 --> 00:24:41,760 Speaker 2: thing that I really convinced you was great, it's not 439 00:24:41,840 --> 00:24:45,920 Speaker 2: so great. And now it hurts because it is.
It's 440 00:24:45,960 --> 00:24:48,679 Speaker 2: like, you enter this safe space, you go to this 441 00:24:48,800 --> 00:24:52,520 Speaker 2: sort of escapist fantasy world, and you trust it to 442 00:24:52,600 --> 00:24:56,680 Speaker 2: make you feel safe, and then, yeah, it 443 00:24:56,720 --> 00:25:10,439 Speaker 2: turns around and hurts you instead. Another part of this 444 00:25:10,520 --> 00:25:15,760 Speaker 2: has been hearing from listeners, which, admittedly, 445 00:25:15,840 --> 00:25:18,439 Speaker 2: like, a lot of years have passed since these messages that 446 00:25:18,520 --> 00:25:21,920 Speaker 2: I've been hearing, so some of your opinions, listeners, might have 447 00:25:21,960 --> 00:25:25,760 Speaker 2: even changed. But some of you have explained how you 448 00:25:25,800 --> 00:25:29,320 Speaker 2: are able to still enjoy it. Others have said, 449 00:25:29,320 --> 00:25:31,640 Speaker 2: like, how much it just wrecked you. And that started 450 00:25:31,680 --> 00:25:37,080 Speaker 2: becoming, I couldn't stop thinking about that either. So, which 451 00:25:37,119 --> 00:25:40,760 Speaker 2: is not like, please don't feel like you, you 452 00:25:40,600 --> 00:25:41,720 Speaker 1: didn't ruin anything for me. 453 00:25:41,840 --> 00:25:45,080 Speaker 2: She did. Like, don't feel bad about that, but it 454 00:25:45,480 --> 00:25:48,320 Speaker 2: just became, like, I couldn't stop thinking about that. 455 00:25:49,040 --> 00:25:52,320 Speaker 1: And also now 456 00:25:53,640 --> 00:25:56,920 Speaker 2: I wonder about conservative parents who, yeah, once were 457 00:25:56,960 --> 00:25:59,800 Speaker 2: like, it's witchcraft, and banned it, and now are very happy 458 00:25:59,800 --> 00:26:02,520 Speaker 2: to share it with their kids, and I feel weird about 459 00:26:02,600 --> 00:26:05,879 Speaker 2: kids reading it. I would never ever ever, like, I 460 00:26:05,920 --> 00:26:10,040 Speaker 2: don't know, come at a kid, but it does.
I 461 00:26:10,119 --> 00:26:13,720 Speaker 2: just, I wish, if it had happened, I would 462 00:26:13,720 --> 00:26:16,320 Speaker 2: have loved to have shared that. And now it's become 463 00:26:16,560 --> 00:26:18,000 Speaker 2: kind of a no. 464 00:26:19,160 --> 00:26:21,960 Speaker 1: No, no, I would not love sharing it anymore. 465 00:26:23,359 --> 00:26:26,600 Speaker 3: I think about that also, because my nieces have gotten 466 00:26:26,600 --> 00:26:29,320 Speaker 3: into it recently. I think, when I say recently, the 467 00:26:29,359 --> 00:26:32,240 Speaker 3: last five years, and I found that shocking, because again, 468 00:26:32,440 --> 00:26:35,600 Speaker 3: my mom was like, it's witchcraft, never will I have it, 469 00:26:35,600 --> 00:26:38,560 Speaker 3: and my sister even said that too. And then all 470 00:26:38,560 --> 00:26:40,880 Speaker 3: of a sudden, they are older high schoolers and they're 471 00:26:40,920 --> 00:26:42,480 Speaker 3: all reading Harry Potter, and I was like, what's, what's 472 00:26:42,480 --> 00:26:45,240 Speaker 3: going on? What's going on? Oh, they let you read 473 00:26:45,280 --> 00:26:49,040 Speaker 3: Harry Potter, they're okay with it, they watched Harry Potter with you, 474 00:26:49,119 --> 00:26:53,679 Speaker 3: like, all these things, literally never until that point. And 475 00:26:53,720 --> 00:26:56,240 Speaker 3: then I think about it, I'm like, well, and I'm 476 00:26:56,240 --> 00:26:59,920 Speaker 3: gonna say this very frankly and not to hurt any feelings, 477 00:27:00,119 --> 00:27:03,520 Speaker 3: but I'm like, it kind of shows me that 478 00:27:05,000 --> 00:27:08,320 Speaker 3: either, A, you are not friends with the people who are 479 00:27:09,320 --> 00:27:12,919 Speaker 3: in this community who have been truly hurt by this, 480 00:27:13,760 --> 00:27:16,800 Speaker 3: or, B, the people we know.
There's a lot of 481 00:27:16,800 --> 00:27:18,919 Speaker 3: trans people who are like, we're fine with it, this 482 00:27:18,960 --> 00:27:21,280 Speaker 3: is fine, this has nothing to do, like, that separation, 483 00:27:21,400 --> 00:27:24,680 Speaker 3: and that's for them themselves. And I get that, and 484 00:27:24,720 --> 00:27:27,439 Speaker 3: it's this whole other level of, like, them saying that 485 00:27:27,480 --> 00:27:33,080 Speaker 3: they can separate the art from the artist conversation, and 486 00:27:33,160 --> 00:27:34,960 Speaker 3: it's kind of like, huh, okay, I can't give you 487 00:27:35,000 --> 00:27:36,960 Speaker 3: that opinion, because you, as a trans person, you have 488 00:27:37,000 --> 00:27:39,760 Speaker 3: to decide for yourself. I'm not, I have no stake 489 00:27:39,800 --> 00:27:43,000 Speaker 3: in that matter, in the sense that, like, everybody gets hurt, 490 00:27:43,200 --> 00:27:46,600 Speaker 3: but I'm not personally attacked here. So it's kind of 491 00:27:46,600 --> 00:27:48,800 Speaker 3: one of those things like, okay, I'm gonna leave it be, 492 00:27:48,960 --> 00:27:50,960 Speaker 3: but do you see it for the people that you 493 00:27:51,000 --> 00:27:56,440 Speaker 3: don't even know? And it is a question of empathy. 494 00:27:56,480 --> 00:28:00,439 Speaker 3: And I know that word is really popular right now 495 00:28:00,440 --> 00:28:02,680 Speaker 3: and everybody's like, I'm empathetic, I do these things. 496 00:28:02,680 --> 00:28:06,560 Speaker 3: I'm like, I have to question, do you not understand, 497 00:28:06,600 --> 00:28:09,960 Speaker 3: when you see these really hurtful words, that you may 498 00:28:10,000 --> 00:28:13,240 Speaker 3: not have that identity, but for a person to say 499 00:28:13,280 --> 00:28:15,879 Speaker 3: that you're invalid, you're not a woman.
And then this 500 00:28:16,000 --> 00:28:18,119 Speaker 3: new rhetoric of, like, they're stealing from us. I'm like, 501 00:28:18,160 --> 00:28:21,040 Speaker 3: what are they stealing? I don't understand. Trying to replace us? 502 00:28:21,040 --> 00:28:24,920 Speaker 3: That was the thing. I was like, I'm sorry, I'm 503 00:28:24,920 --> 00:28:29,640 Speaker 3: just physically laughing. I'm like, that is the oddest way 504 00:28:29,640 --> 00:28:33,639 Speaker 3: of seeing this. Like, then every baby girl born is 505 00:28:33,680 --> 00:28:36,040 Speaker 3: trying to replace you. That's your mentality? Of, like, 506 00:28:36,440 --> 00:28:39,320 Speaker 3: I don't know. Anyway, that's a whole other sidetrack. There's 507 00:28:39,360 --> 00:28:41,360 Speaker 3: so many things, like, where do you get this, 508 00:28:43,400 --> 00:28:46,760 Speaker 3: do you get it, there's that question. It's like, I don't 509 00:28:46,760 --> 00:28:49,200 Speaker 3: think you do, because either you're not looking deep 510 00:28:49,320 --> 00:28:57,160 Speaker 3: enough or you're being very, very cavalier about this topic. 511 00:28:57,640 --> 00:28:59,600 Speaker 3: Maybe you're just ignoring it, because that's what you have 512 00:28:59,640 --> 00:29:02,040 Speaker 3: to do, that's what you want to do, to keep 513 00:29:02,080 --> 00:29:04,760 Speaker 3: this safety. And I understand that to a certain extent. 514 00:29:05,040 --> 00:29:07,320 Speaker 3: But yeah, there's this level of, like, do you truly get it? I 515 00:29:07,360 --> 00:29:10,239 Speaker 3: don't think you do.
I think there's a level of 516 00:29:10,320 --> 00:29:14,280 Speaker 3: disconnect if you can do that, and I think they 517 00:29:14,320 --> 00:29:16,280 Speaker 3: would, like, I say that in a way that sounds harsh, 518 00:29:16,280 --> 00:29:19,160 Speaker 3: but it's true, like, they're, like, disconnected or separated, as 519 00:29:19,160 --> 00:29:22,280 Speaker 3: they would say, or people would say. And that's, that's 520 00:29:22,320 --> 00:29:24,240 Speaker 3: an interesting way of doing it. I can't do that, 521 00:29:24,320 --> 00:29:27,560 Speaker 3: like, I can't do that for anything. Yeah, which is 522 00:29:27,560 --> 00:29:30,200 Speaker 3: why I've missed, I don't know, a lot of things. 523 00:29:30,280 --> 00:29:32,600 Speaker 3: But, but that's the whole thing, is, like, which is 524 00:29:32,600 --> 00:29:34,680 Speaker 3: why I'm like, I can't watch this, I can't see this, 525 00:29:34,800 --> 00:29:37,040 Speaker 3: I cringe at this. Like, those are at this level, and 526 00:29:37,080 --> 00:29:38,680 Speaker 3: not that it makes me better, it doesn't. I feel 527 00:29:38,720 --> 00:29:41,040 Speaker 3: like I missed out on so many things because of 528 00:29:41,080 --> 00:29:46,120 Speaker 3: that level. But you have to question, like, okay, what's 529 00:29:46,160 --> 00:29:50,400 Speaker 3: the motive, mm hmm. And again, if it's for safety, 530 00:29:50,560 --> 00:29:53,320 Speaker 3: I get that to that extent. 531 00:29:53,320 --> 00:29:55,240 Speaker 2: I do too, And we're gonna come back to 532 00:29:55,240 --> 00:30:02,400 Speaker 2: your point about, like, it's not, it's not being better necessarily.
533 00:30:02,400 --> 00:30:05,520 Speaker 2: That's not what I want. I'm 534 00:30:05,560 --> 00:30:09,240 Speaker 2: not trying to, like, shame you for liking something, 535 00:30:09,320 --> 00:30:12,080 Speaker 2: because, like I said, as much as it annoys 536 00:30:12,160 --> 00:30:15,120 Speaker 2: me when it comes up in arguments, everything is problematic, 537 00:30:16,040 --> 00:30:19,160 Speaker 2: and I get that. And everyone sort of has their, 538 00:30:20,720 --> 00:30:23,239 Speaker 2: their own personal lines, what they can deal with, what 539 00:30:23,280 --> 00:30:26,320 Speaker 2: they can disconnect, and I get that too. I guess, 540 00:30:26,360 --> 00:30:28,760 Speaker 2: what I'm trying, I really just want, for anyone else 541 00:30:28,800 --> 00:30:31,560 Speaker 2: who feels this, I just want to let you know, 542 00:30:31,640 --> 00:30:34,640 Speaker 2: like, I feel it too, because it does get lost, 543 00:30:34,720 --> 00:30:37,920 Speaker 2: and I think it does get lost in a really, 544 00:30:38,520 --> 00:30:43,600 Speaker 2: like, black or white, monetary situation, when we're not talking 545 00:30:43,640 --> 00:30:47,560 Speaker 2: about, like, kind of that cultural impact, that cultural hurt, 546 00:30:47,560 --> 00:30:51,480 Speaker 2: that cultural loss of losing something that made you feel 547 00:30:51,480 --> 00:30:54,920 Speaker 2: really safe, or that just was a great coping mechanism 548 00:30:54,960 --> 00:30:57,440 Speaker 2: for you, or, like, something you have so many associated 549 00:30:57,480 --> 00:31:00,000 Speaker 2: memories with. And I did want to put this in here, 550 00:31:00,080 --> 00:31:02,680 Speaker 2: because this comes up a lot. This is really silly.
551 00:31:04,320 --> 00:31:06,320 Speaker 2: People tease me all the time because I won't eat 552 00:31:06,360 --> 00:31:11,120 Speaker 2: Chick fil A because of homophobia, and they're always kind 553 00:31:11,120 --> 00:31:12,760 Speaker 2: of joking like, oh, she has a code, but I 554 00:31:12,800 --> 00:31:14,760 Speaker 2: like their chicken nuggets too much. And I always say 555 00:31:14,800 --> 00:31:16,680 Speaker 2: it was like really easy for me to give up 556 00:31:16,720 --> 00:31:18,760 Speaker 2: Chick fil A because I actually wasn't a big fan anyway. 557 00:31:18,840 --> 00:31:20,880 Speaker 2: So it's sort of like, Okay, well I can give 558 00:31:20,920 --> 00:31:23,640 Speaker 2: that up. And I consume products that are terrible all 559 00:31:23,680 --> 00:31:26,560 Speaker 2: the time. I use things, honestly, I use things I 560 00:31:26,600 --> 00:31:28,120 Speaker 2: wish I didn't have to use, but they're kind of 561 00:31:28,200 --> 00:31:31,959 Speaker 2: like the only game in town. But there's a difference 562 00:31:32,000 --> 00:31:34,960 Speaker 2: between like a company that I really don't care about, 563 00:31:35,640 --> 00:31:38,760 Speaker 2: I really don't care, and like media that you love 564 00:31:38,840 --> 00:31:43,160 Speaker 2: that got you through like this really difficult time. And 565 00:31:43,200 --> 00:31:45,840 Speaker 2: so I just I guess I just want to say 566 00:31:45,880 --> 00:31:52,200 Speaker 2: again like it wasn't an easy thing, and yeah. 567 00:31:52,120 --> 00:31:53,200 Speaker 1: It has been like grieving. 568 00:31:56,080 --> 00:31:59,360 Speaker 2: As I was saying earlier, I have gotten really in 569 00:31:59,360 --> 00:32:01,320 Speaker 2: my head about I don't want to bring down people's 570 00:32:01,400 --> 00:32:04,520 Speaker 2: joy because they brought me so much joy. 
But then 571 00:32:04,560 --> 00:32:06,600 Speaker 2: at the same time, I'm not acknowledging that it's, like, 572 00:32:06,680 --> 00:32:10,880 Speaker 2: bringing me down when they talk about it, and so 573 00:32:10,920 --> 00:32:13,760 Speaker 2: I kind of shut down, I close off when it 574 00:32:13,800 --> 00:32:15,840 Speaker 2: comes up, which is not, like, the healthiest thing, 575 00:32:16,320 --> 00:32:17,120 Speaker 1: I think. 576 00:32:17,560 --> 00:32:20,880 Speaker 2: I really haven't allowed myself to say any of this 577 00:32:20,920 --> 00:32:23,479 Speaker 2: out loud, so I'm hoping maybe this is actually going 578 00:32:23,520 --> 00:32:29,400 Speaker 2: to be a healing, kind of a healing moment. But 579 00:32:29,520 --> 00:32:33,720 Speaker 2: also, I do want to acknowledge this: a lot of 580 00:32:33,800 --> 00:32:37,080 Speaker 2: this has played out on Twitter, and most 581 00:32:37,080 --> 00:32:40,520 Speaker 2: of my friends don't use Twitter, haven't ever, like, and 582 00:32:40,880 --> 00:32:42,960 Speaker 2: so, I mean, there's an argument to be made about, 583 00:32:42,960 --> 00:32:45,800 Speaker 2: like, willful ignorance, sure, but I also think, like, they 584 00:32:45,880 --> 00:32:50,880 Speaker 2: might not know the extent, and I always feel weird. 585 00:32:50,720 --> 00:32:52,800 Speaker 1: And I'm like, well, did you know about this? 586 00:32:53,960 --> 00:32:56,640 Speaker 2: Like, do I want to give all the examples? Because 587 00:32:56,640 --> 00:32:58,040 Speaker 2: there's a weird spot you're 588 00:32:57,840 --> 00:33:01,040 Speaker 1: in, then, if they're like yes, then you feel, 589 00:33:01,560 --> 00:33:04,120 Speaker 1: that's not a great feeling. But if they're like no, 590 00:33:04,880 --> 00:33:07,080 Speaker 1: it's just a risk. It's a roll of the dice.
591 00:33:08,000 --> 00:33:11,480 Speaker 2: But I will acknowledge, especially because I've, like, really, I 592 00:33:11,480 --> 00:33:14,040 Speaker 2: have gotten off most of social media, like, I barely 593 00:33:14,120 --> 00:33:16,800 Speaker 2: use it now. And sometimes you'll say something, I'm like, no, 594 00:33:16,840 --> 00:33:19,840 Speaker 2: I had not heard anything about that. So I understand, 595 00:33:19,960 --> 00:33:22,440 Speaker 2: I do get that that is a thing. I also 596 00:33:22,680 --> 00:33:26,600 Speaker 2: get that it's not fun to talk about. And yes, 597 00:33:26,680 --> 00:33:29,160 Speaker 2: I do not want people to, like, compliment me or 598 00:33:29,280 --> 00:33:31,200 Speaker 2: say, like, oh, yes, I get it, you're stronger and 599 00:33:31,240 --> 00:33:32,040 Speaker 2: better than I am. 600 00:33:33,000 --> 00:33:36,080 Speaker 1: I don't want that. That's not true, that is not 601 00:33:36,120 --> 00:33:36,600 Speaker 1: what it is. 602 00:33:38,120 --> 00:33:40,760 Speaker 2: I just, yeah, I just want to share it, because, 603 00:33:41,240 --> 00:33:43,680 Speaker 2: if anyone else feels it. But also, to people who 604 00:33:43,720 --> 00:33:47,080 Speaker 2: are still able to enjoy it, especially in ways that 605 00:33:47,080 --> 00:33:49,280 Speaker 2: don't make her money, I don't blame you, and I 606 00:33:49,360 --> 00:33:51,160 Speaker 2: really am happy for you. It's hard for me to 607 00:33:51,200 --> 00:33:53,920 Speaker 2: show it, I can admit, but I am. Like, I 608 00:33:54,000 --> 00:33:57,040 Speaker 2: miss it, and sometimes I think, like, maybe one day 609 00:33:57,080 --> 00:33:59,520 Speaker 2: I could too, but I don't think so, unless she 610 00:33:59,520 --> 00:34:05,880 Speaker 2: would, like, really really change her tune and make amends, 611 00:34:05,960 --> 00:34:07,480 Speaker 2: and it would still take a long time. 612 00:34:08,080 --> 00:34:12,160 Speaker 1: But, I, you know, who knows.
I'm not trying to 613 00:34:12,160 --> 00:34:14,239 Speaker 1: make anyone feel guilty. I don't. 614 00:34:14,480 --> 00:34:18,400 Speaker 2: Honestly, it's a no-win situation, like, it just sucks for everybody. 615 00:34:19,239 --> 00:34:22,360 Speaker 2: It's not something I'm looking to guilt people about. 616 00:34:22,719 --> 00:34:26,279 Speaker 1: But I guess, like, I don't like that 617 00:34:26,480 --> 00:34:28,719 Speaker 2: a lot of the arguments I'm seeing, which, there are 618 00:34:28,760 --> 00:34:32,319 Speaker 2: a lot of arguments about it, do revolve around, like, 619 00:34:32,400 --> 00:34:39,239 Speaker 2: who is, who's a bigger fan, and it being like, well, 620 00:34:39,400 --> 00:34:42,719 Speaker 2: you gave up on it completely and I didn't, or, well, 621 00:34:42,800 --> 00:34:46,680 Speaker 2: I gave it up and you didn't. I don't think 622 00:34:46,719 --> 00:34:51,160 Speaker 2: that's the conversation we should be having, honestly, because it 623 00:34:51,360 --> 00:34:57,279 Speaker 2: just, it just hurts. It just hurts. And I feel 624 00:34:57,280 --> 00:34:59,719 Speaker 2: like I keep saying giving up, but really it's forced out, 625 00:35:00,320 --> 00:35:05,000 Speaker 2: like, it does feel like, it's like you're 626 00:35:05,040 --> 00:35:09,319 Speaker 2: grieving a loss. And because I loved it so much, 627 00:35:09,360 --> 00:35:13,360 Speaker 2: like, I opened myself up to it hurting me this badly. 628 00:35:13,400 --> 00:35:15,040 Speaker 2: I get that for some of you, you're listening to 629 00:35:15,040 --> 00:35:18,080 Speaker 2: this and you're like, what are you, why is this 630 00:35:18,160 --> 00:35:23,080 Speaker 2: such a big thing? But it is that, like, I 631 00:35:23,160 --> 00:35:26,680 Speaker 2: understand that for some people, it doesn't have these 632 00:35:26,719 --> 00:35:31,640 Speaker 2: emotional stakes.
But I think, like, this argument that I'm seeing, 633 00:35:32,360 --> 00:35:34,839 Speaker 2: I think we should all remember, like, we're different when 634 00:35:34,880 --> 00:35:37,560 Speaker 2: it comes to how we deal with something like this, 635 00:35:37,680 --> 00:35:42,360 Speaker 2: which again is pretty unique, and maybe neither is stronger 636 00:35:42,480 --> 00:35:48,440 Speaker 2: or better, and that's the point. Maybe, maybe it shouldn't 637 00:35:48,440 --> 00:35:52,880 Speaker 2: be a fan competition about who loves something more, because 638 00:35:53,040 --> 00:35:55,279 Speaker 2: that's really hard to judge, and I also don't think 639 00:35:55,280 --> 00:35:57,560 Speaker 2: that's healthy. But I do think the fact that I 640 00:35:57,560 --> 00:36:02,720 Speaker 2: loved it that much and left should say something about 641 00:36:02,719 --> 00:36:05,239 Speaker 2: how bad she is, right? 642 00:36:05,600 --> 00:36:08,520 Speaker 3: I think our listeners are probably aware, but that's the 643 00:36:08,560 --> 00:36:11,920 Speaker 3: biggest thing, is to make sure that everyone understands 644 00:36:12,200 --> 00:36:16,680 Speaker 3: what's at stake, necessarily, and when you see rhetoric and 645 00:36:17,280 --> 00:36:21,000 Speaker 3: laws being changed based on things that people write, which, 646 00:36:22,080 --> 00:36:25,400 Speaker 3: literal laws are being changed and or introduced in the 647 00:36:25,520 --> 00:36:29,000 Speaker 3: UK based on what JK Rowling has written, and has 648 00:36:29,040 --> 00:36:32,160 Speaker 3: been presented in the United States as well, as to 649 00:36:32,520 --> 00:36:36,400 Speaker 3: why trans people should not exist.
Essentially, I think that's what 650 00:36:36,440 --> 00:36:42,040 Speaker 3: they're saying, and we are seeing a widespread wave of anti-trans, 651 00:36:42,200 --> 00:36:47,960 Speaker 3: anti-LGBTQIA+ laws happening all over the world. Like, 652 00:36:48,040 --> 00:36:51,239 Speaker 3: in the US and the UK we see it in 653 00:36:51,280 --> 00:36:53,640 Speaker 3: a whole different way, and it feels like 654 00:36:54,120 --> 00:36:58,160 Speaker 3: they've been waiting, and then having an influencer like her, 655 00:36:59,600 --> 00:37:01,640 Speaker 3: who had money to back it up as well, because 656 00:37:01,680 --> 00:37:04,720 Speaker 3: we also know she's donated to several of these causes. 657 00:37:04,719 --> 00:37:07,080 Speaker 3: So instead of actually, like, not donating to anything, or 658 00:37:07,120 --> 00:37:10,920 Speaker 3: donating to something that's actually good, she donates to anti things. 659 00:37:11,160 --> 00:37:13,759 Speaker 3: So that's the conversation that we're having, as well as 660 00:37:13,760 --> 00:37:19,040 Speaker 3: the fact that both the publishers, the movie executives, and 661 00:37:19,040 --> 00:37:23,760 Speaker 3: herself put her face as the, uh, spokesperson for Harry Potter, 662 00:37:23,800 --> 00:37:25,920 Speaker 3: and that's, that's part of that conversation too. It's not 663 00:37:26,000 --> 00:37:28,879 Speaker 3: necessarily Daniel Radcliffe, it's literally her. 664 00:37:30,400 --> 00:37:34,360 Speaker 2: Yeah. And that's one of the things I think about 665 00:37:34,360 --> 00:37:39,920 Speaker 2: a lot, is once the Max show, the reboot, got greenlit, 666 00:37:40,200 --> 00:37:43,640 Speaker 2: she said something like, you can cry your tears and 667 00:37:43,640 --> 00:37:46,120 Speaker 2: have your boycott and I'll just be drinking champagne and 668 00:37:46,120 --> 00:37:49,840 Speaker 2: counting my money.
And so I think, like, 669 00:37:50,520 --> 00:37:53,600 Speaker 2: as long as companies don't feel that they have to 670 00:37:53,640 --> 00:37:57,000 Speaker 2: cut her out, she's going to keep getting these deals 671 00:37:57,040 --> 00:37:58,919 Speaker 2: and she's going to keep making that money, whether you, 672 00:37:59,360 --> 00:38:01,160 Speaker 2: like, support her financially or not. 673 00:38:01,600 --> 00:38:04,439 Speaker 3: Literally, she said, for the game, the game company says 674 00:38:04,480 --> 00:38:05,799 Speaker 3: she's not a part of it, she's not a part 675 00:38:05,800 --> 00:38:07,960 Speaker 3: of it, but, you know, she's making 676 00:38:08,000 --> 00:38:11,879 Speaker 3: millions of dollars off of it, because she's a smart businesswoman. 677 00:38:12,360 --> 00:38:15,200 Speaker 3: Once again, we're not taking that away from her. And 678 00:38:15,239 --> 00:38:18,719 Speaker 3: she made sure to get that copyright deal for everything. 679 00:38:19,200 --> 00:38:21,200 Speaker 3: So there you go. 680 00:38:22,320 --> 00:38:25,920 Speaker 2: Yeah, yeah. And I feel like, a lot 681 00:38:25,960 --> 00:38:29,279 Speaker 2: of times she kind of implies, she's probably outright said it, 682 00:38:29,280 --> 00:38:29,800 Speaker 2: but I can't, 683 00:38:29,960 --> 00:38:31,000 Speaker 1: I don't know for sure, but 684 00:38:31,000 --> 00:38:35,360 Speaker 2: she implies, like, as long as she's still culturally relevant, 685 00:38:35,440 --> 00:38:38,520 Speaker 2: then what she understands is people either agree with her 686 00:38:38,640 --> 00:38:40,839 Speaker 2: or don't care enough to do anything about it, right? 687 00:38:41,520 --> 00:38:44,120 Speaker 2: And that, I think I skipped over this bullet point.
688 00:38:44,200 --> 00:38:47,680 Speaker 2: But one of the biggest things about her that comes 689 00:38:47,760 --> 00:38:51,360 Speaker 2: up again when I'm having these disagreements with friends is like, 690 00:38:51,719 --> 00:38:54,719 Speaker 2: not only is she like a pseudo-feminist, not a 691 00:38:54,719 --> 00:38:56,799 Speaker 2: feminist at all, but like presents herself in that way 692 00:38:56,840 --> 00:39:01,120 Speaker 2: and is therefore legitimized in that way. A lot of 693 00:39:01,120 --> 00:39:03,719 Speaker 2: people have said, like, oh, she was cool with like, 694 00:39:04,480 --> 00:39:07,200 Speaker 2: you know, a lot of queer stuff, just not that trans part, 695 00:39:07,200 --> 00:39:09,359 Speaker 2: which they think is bad. But I'm like, well, then 696 00:39:09,400 --> 00:39:11,839 Speaker 2: I think that that's not she wasn't cool with any 697 00:39:11,880 --> 00:39:13,919 Speaker 2: of it. Like, I'm kind of like the same 698 00:39:13,920 --> 00:39:16,839 Speaker 2: way we say an intersectional feminist, not a feminist, you're 699 00:39:16,840 --> 00:39:18,960 Speaker 2: not a feminist if you're a TERF. I think it's 700 00:39:19,040 --> 00:39:23,360 Speaker 2: kind of the same with like queer stuff, Like, well 701 00:39:23,600 --> 00:39:25,840 Speaker 2: then she's not cool with it at all. 702 00:39:25,880 --> 00:39:30,480 Speaker 3: Right, Again, like people have nitpicked her for everything. 703 00:39:30,600 --> 00:39:34,719 Speaker 3: I'm like, yeah, I feel like there's some validity. So 704 00:39:34,840 --> 00:39:37,800 Speaker 3: when you start looking at like it's kind of a whole, 705 00:39:37,880 --> 00:39:41,200 Speaker 3: like and this is what I was always told by 706 00:39:41,239 --> 00:39:43,920 Speaker 3: a person from England, so don't yell at me, that 707 00:39:44,160 --> 00:39:48,160 Speaker 3: the whole like Black-white thing is not an issue, but ethnicity.
708 00:39:48,600 --> 00:39:52,799 Speaker 3: So when it comes to like Pakistani people and such like, 709 00:39:52,960 --> 00:39:55,840 Speaker 3: that's a big deal. And it's kind of been implied 710 00:39:55,880 --> 00:39:57,640 Speaker 3: to be like yeah, no, yeah, even though they may 711 00:39:57,800 --> 00:40:01,360 Speaker 3: be cool with like having interracial couples with black and 712 00:40:01,800 --> 00:40:04,960 Speaker 3: white people, outside of that, there's other ethnicities that that 713 00:40:05,160 --> 00:40:08,320 Speaker 3: becomes a whole conversation. And there's been a whole rhetoric 714 00:40:08,400 --> 00:40:12,840 Speaker 3: on TikTok in which slurs for Asian people are okay, 715 00:40:12,920 --> 00:40:16,080 Speaker 3: still there, and you're like, wait what And so there's 716 00:40:16,080 --> 00:40:17,759 Speaker 3: a back and forth of like people like, no, it's 717 00:40:17,760 --> 00:40:21,960 Speaker 3: not true, but they literally call Chinese food the derogatory 718 00:40:22,040 --> 00:40:24,680 Speaker 3: word for Asian people. And I was like wait what 719 00:40:25,400 --> 00:40:27,879 Speaker 3: wait what like and people like no, no, but it's 720 00:40:27,880 --> 00:40:29,680 Speaker 3: it's just it's just that way. It's not a big deal, 721 00:40:29,880 --> 00:40:32,759 Speaker 3: like but it is, but no, but it is, like 722 00:40:33,200 --> 00:40:35,759 Speaker 3: I don't know what's happening, but it is, but that level, 723 00:40:35,880 --> 00:40:40,000 Speaker 3: like just because it's not the same as in other 724 00:40:40,080 --> 00:40:44,480 Speaker 3: places doesn't mean it doesn't exist there. Again, with her like, oh, yeah, 725 00:40:44,520 --> 00:40:47,799 Speaker 3: she's cool with this specific type of marginalized people. But 726 00:40:47,840 --> 00:40:51,080 Speaker 3: outside of that, not cool right, doesn't mean she's. 727 00:40:50,960 --> 00:40:57,520 Speaker 1: Cool, right, Yes, I agree, yes so much.
Yes, and 728 00:40:57,600 --> 00:40:58,200 Speaker 1: I know like. 729 00:41:01,960 --> 00:41:05,120 Speaker 3: That. Sorry, UK people, let us know. 730 00:41:07,560 --> 00:41:21,439 Speaker 2: Kindly please, And I I just want to say here 731 00:41:21,520 --> 00:41:23,720 Speaker 2: because I I hope. 732 00:41:23,520 --> 00:41:24,480 Speaker 1: It's clear. 733 00:41:26,280 --> 00:41:30,920 Speaker 2: That Samantha knows I when I love something, I love something. 734 00:41:31,280 --> 00:41:36,480 Speaker 1: Sure, this is not this is not a conversation. I 735 00:41:36,600 --> 00:41:37,319 Speaker 1: was dreading it. 736 00:41:37,440 --> 00:41:39,880 Speaker 2: I've been putting it off like I didn't want to 737 00:41:39,960 --> 00:41:44,600 Speaker 2: have it. I really am not finding any joy in 738 00:41:45,480 --> 00:41:50,400 Speaker 2: doing this, Like I'm not like gleeful that oh like 739 00:41:51,719 --> 00:41:55,120 Speaker 2: you you all who are still you know in this world. 740 00:41:55,840 --> 00:41:58,560 Speaker 2: I am not like finding glee that I am better than 741 00:41:58,600 --> 00:42:00,040 Speaker 2: you because I moved on. I'm not like this. This 742 00:42:00,160 --> 00:42:04,000 Speaker 2: is like a real hurt, I hope it's clear in my voice. 743 00:42:03,680 --> 00:42:04,160 Speaker 1: Like it was. 744 00:42:05,400 --> 00:42:13,480 Speaker 2: It's not something I'm trying to make you feel bad about. 745 00:42:14,840 --> 00:42:17,120 Speaker 3: But it is on our show. And as you know, 746 00:42:17,200 --> 00:42:19,080 Speaker 3: we have to talk about the hard stuff, the things 747 00:42:19,080 --> 00:42:21,440 Speaker 3: that make you uncomfortable and the things that you have 748 00:42:21,520 --> 00:42:24,120 Speaker 3: to have this conversation about. Like we could talk about the 749 00:42:24,160 --> 00:42:26,920 Speaker 3: Asian model minority myth, which is popping back up right 750 00:42:26,920 --> 00:42:29,319 Speaker 3: now in the US, which we will.
I've also been 751 00:42:29,440 --> 00:42:32,320 Speaker 3: putting that off because I get very angry and upset 752 00:42:32,320 --> 00:42:37,200 Speaker 3: about it because this level of supremacy, supremacy ideal, which 753 00:42:37,239 --> 00:42:38,640 Speaker 3: is exactly what's happening with JK. 754 00:42:38,760 --> 00:42:39,000 Speaker 1: Rowling. 755 00:42:39,120 --> 00:42:42,400 Speaker 3: She feels this level of supremacy over a group of 756 00:42:42,440 --> 00:42:45,400 Speaker 3: marginalized people and wants to maintain that power because for 757 00:42:45,440 --> 00:42:49,120 Speaker 3: some reason it threatens her, which is not true. Again, 758 00:42:49,280 --> 00:42:51,279 Speaker 3: that's what they are saying. They say that they feel 759 00:42:51,320 --> 00:42:57,000 Speaker 3: threatened at this imaginary idea of someone else being who 760 00:42:57,000 --> 00:43:00,200 Speaker 3: they are. Like again, I'm still very confused by this 761 00:43:00,239 --> 00:43:01,280 Speaker 3: whole we're being replaced. 762 00:43:01,320 --> 00:43:02,839 Speaker 1: It's like, what are you? 763 00:43:03,160 --> 00:43:04,600 Speaker 3: Are you no longer a woman? 764 00:43:04,719 --> 00:43:04,839 Speaker 1: Like? 765 00:43:04,880 --> 00:43:06,320 Speaker 3: Did you what's happening? 766 00:43:06,360 --> 00:43:07,160 Speaker 1: Why are you? 767 00:43:07,320 --> 00:43:09,400 Speaker 3: Congratulations? If you're not, that's how you feel because you 768 00:43:09,440 --> 00:43:13,360 Speaker 3: feel different, go ahead, But I don't think anybody's forcing 769 00:43:13,360 --> 00:43:15,080 Speaker 3: you to be anything else. So I'm not really sure 770 00:43:15,080 --> 00:43:19,080 Speaker 3: how you're being replaced. But okay, once again, but yeah, 771 00:43:19,120 --> 00:43:21,359 Speaker 3: this is the whole conversation that we have to look 772 00:43:21,400 --> 00:43:25,600 Speaker 3: at.
It's not a pretty pristine world and these 773 00:43:25,800 --> 00:43:29,680 Speaker 3: minute what we would say, minute details or what you 774 00:43:29,719 --> 00:43:32,439 Speaker 3: would think is like a smaller conversation is a big 775 00:43:32,480 --> 00:43:36,440 Speaker 3: conversation because as we talked about in intersectionality, is that 776 00:43:36,920 --> 00:43:41,360 Speaker 3: if one group of people are oppressed or marginalized, it 777 00:43:41,480 --> 00:43:44,719 Speaker 3: means everyone is going to be affected by this conversation. 778 00:43:45,320 --> 00:43:49,040 Speaker 3: And when we look at the fact that if one 779 00:43:49,080 --> 00:43:51,560 Speaker 3: percent of people do not have the freedom and the rights, 780 00:43:52,040 --> 00:43:55,000 Speaker 3: then no one is safe and no one like it's 781 00:43:55,000 --> 00:43:57,319 Speaker 3: not true freedom. Well, again, we've talked about that before. 782 00:43:57,360 --> 00:43:59,919 Speaker 3: It's not true equality in any of that. And that's 783 00:44:00,000 --> 00:44:03,160 Speaker 3: the intersectionality of it is looking at the basis of 784 00:44:03,320 --> 00:44:06,879 Speaker 3: every group is affected differently, and some people are affected more, 785 00:44:06,920 --> 00:44:10,040 Speaker 3: and we have to talk about why that breakdown is happening, 786 00:44:10,320 --> 00:44:12,759 Speaker 3: even if it makes it uncomfortable because we may not 787 00:44:12,840 --> 00:44:15,359 Speaker 3: be a part of that group, but we see it 788 00:44:15,440 --> 00:44:18,120 Speaker 3: and we know that that is damaging for everyone. Like 789 00:44:18,160 --> 00:44:20,719 Speaker 3: I hate to be that narcissistic because it's not about us, 790 00:44:21,160 --> 00:44:23,560 Speaker 3: but it can't like that's the truth of it all. 791 00:44:24,440 --> 00:44:28,839 Speaker 3: Mm hmm, Like yeah, until that is broken down, it's 792 00:44:28,880 --> 00:44:31,800 Speaker 3: not true. 
This idea of equality is not true. 793 00:44:32,560 --> 00:44:40,359 Speaker 1: Mm hmm. Yeah. And I think that these these conversations 794 00:44:40,400 --> 00:44:44,000 Speaker 1: are also tough because you have to acknowledge what you 795 00:44:44,600 --> 00:44:47,080 Speaker 1: were previously allowing to be okay. 796 00:44:48,040 --> 00:44:50,319 Speaker 2: And I think people including me, like everybody, I think 797 00:44:50,320 --> 00:44:54,640 Speaker 2: it's a natural response to feel defensive when you get 798 00:44:54,640 --> 00:45:02,520 Speaker 2: those questions and to sort of try to point out, like, well, 799 00:45:02,840 --> 00:45:07,560 Speaker 2: what way are you not a hypocrite? We're all hypocrites. 800 00:45:07,040 --> 00:45:10,560 Speaker 3: The big Well, it comes down to what are you upholding? 801 00:45:10,920 --> 00:45:17,239 Speaker 3: Are you upholding oppression, supremacy and overall power, or are 802 00:45:17,280 --> 00:45:21,239 Speaker 3: you looking at true equality? And that's and it does 803 00:45:21,280 --> 00:45:24,759 Speaker 3: affect everything, no matter how trivial you think it is. 804 00:45:26,120 --> 00:45:30,640 Speaker 3: To pretend like it doesn't is just as naive. I'm 805 00:45:30,680 --> 00:45:33,520 Speaker 3: being a lot more made. I'm a little again, I'm more. 806 00:45:34,400 --> 00:45:37,120 Speaker 3: I absolutely understand. I am definitely disconnected. I have 807 00:45:37,120 --> 00:45:39,000 Speaker 3: attachment issues in general, so I don't get attached to 808 00:45:39,080 --> 00:45:41,480 Speaker 3: things too harshly because. 809 00:45:41,239 --> 00:45:42,960 Speaker 1: I'm gonna lose at any point. 810 00:45:43,160 --> 00:45:48,760 Speaker 3: That's my drama. But it's fine. I'm fine, Everything's fine.
811 00:45:49,400 --> 00:45:51,360 Speaker 3: But yeah, but that's I mean, that's the reality is 812 00:45:51,360 --> 00:45:56,880 Speaker 3: when we have these conversations and then when you again, yes, 813 00:45:57,239 --> 00:45:59,840 Speaker 3: you do your thing, you have to do what's safe, 814 00:46:00,239 --> 00:46:01,560 Speaker 3: and then you have to do it in your own 815 00:46:01,600 --> 00:46:05,879 Speaker 3: timing, whatever that might be. But we're not we can't 816 00:46:05,960 --> 00:46:11,680 Speaker 3: pretend that it's not something that can affect something like 817 00:46:11,760 --> 00:46:16,160 Speaker 3: it affects things. There is a cause and effect to everything, 818 00:46:17,080 --> 00:46:18,359 Speaker 3: and you have to weigh that. 819 00:46:18,280 --> 00:46:20,040 Speaker 1: You do you. 820 00:46:20,600 --> 00:46:25,160 Speaker 2: And I also just quick final note, I want to say, 821 00:46:26,239 --> 00:46:29,200 Speaker 2: sometimes when we do this podcast, I feel really terrible 822 00:46:29,640 --> 00:46:35,360 Speaker 2: because perhaps I'm essentially having an argument that I haven't 823 00:46:35,560 --> 00:46:39,560 Speaker 2: had with a real person, but it's like come up, 824 00:46:39,600 --> 00:46:41,520 Speaker 2: you know what I mean, and they're not here to 825 00:46:41,560 --> 00:46:45,440 Speaker 2: present their side. And to that end, I want to 826 00:46:45,480 --> 00:46:48,080 Speaker 2: say that I'm also I'm really bad at letting things go. 827 00:46:48,280 --> 00:46:52,440 Speaker 2: I'm really I want to be better at having a disagreement. 828 00:46:52,800 --> 00:46:54,680 Speaker 2: I want to be better about like not shutting down 829 00:46:55,440 --> 00:46:58,680 Speaker 2: and making people feel like they can't share something they 830 00:46:58,760 --> 00:47:02,040 Speaker 2: love with me. This is a complicated thing because it 831 00:47:02,040 --> 00:47:04,680 Speaker 2: does hurt.
But I also think because I haven't allowed 832 00:47:04,680 --> 00:47:09,000 Speaker 2: myself to grieve yet, that maybe I will get better. 833 00:47:09,360 --> 00:47:12,240 Speaker 2: That's also like a toxic trait, I know that about myself. 834 00:47:12,400 --> 00:47:14,960 Speaker 2: And I also, I mean, you all know, I get 835 00:47:15,000 --> 00:47:17,200 Speaker 2: how fun it is to talk about your fan fiction, 836 00:47:18,080 --> 00:47:21,759 Speaker 2: and so sometimes it feels like I am being very 837 00:47:21,760 --> 00:47:23,160 Speaker 2: selfish because I'm like, I don't want to hear about 838 00:47:23,160 --> 00:47:26,839 Speaker 2: your Harry Potter fan fiction because it's just upsetting, and 839 00:47:26,880 --> 00:47:28,920 Speaker 2: that means I've stopped talking about my fan fiction pretty 840 00:47:29,000 --> 00:47:33,000 Speaker 2: largely too. But it's like, it's just like this protective 841 00:47:33,040 --> 00:47:36,440 Speaker 2: mechanism kicks in and is like, no, you can't, you 842 00:47:36,520 --> 00:47:42,560 Speaker 2: cannot deal with this, And I just I want to 843 00:47:42,560 --> 00:47:44,240 Speaker 2: work on that. I want to find a better way 844 00:47:44,600 --> 00:47:48,960 Speaker 2: to recognize my negative emotions and deal with them. But 845 00:47:49,239 --> 00:47:51,719 Speaker 2: I also just want to say that it's something I 846 00:47:51,719 --> 00:47:57,960 Speaker 2: should do. But really, I also think I'm being too 847 00:47:58,200 --> 00:48:00,839 Speaker 2: forgiving of JK Rowling because she's kind of the one 848 00:48:00,880 --> 00:48:08,680 Speaker 2: that's bringing you down, not me. Yeah, So thank you 849 00:48:08,719 --> 00:48:11,600 Speaker 2: so much, Samantha. You've been very supportive in this thing 850 00:48:11,680 --> 00:48:16,520 Speaker 2: I was putting off for so long, And to any 851 00:48:16,600 --> 00:48:19,719 Speaker 2: listeners who feel the same, I am here.
852 00:48:19,800 --> 00:48:22,160 Speaker 1: I am with you. 853 00:48:22,160 --> 00:48:25,680 Speaker 2: You can email us at Stuff Media mom Stuff at iHeartMedia 854 00:48:25,719 --> 00:48:27,399 Speaker 2: dot com. You can find us on Twitter at mom 855 00:48:27,400 --> 00:48:31,319 Speaker 2: Stuff podcast, or on Instagram and TikTok at Stuff Mom Never 856 00:48:31,360 --> 00:48:34,239 Speaker 2: Told You. We also have a TeePublic store with merchandise, 857 00:48:34,680 --> 00:48:36,319 Speaker 2: or we have a book that you can pre-order 858 00:48:36,360 --> 00:48:38,520 Speaker 2: at Stuff You Should Read Books dot com, audio form 859 00:48:38,600 --> 00:48:41,440 Speaker 2: or physical form. Thanks as always to our super producer, 860 00:48:41,520 --> 00:48:44,680 Speaker 2: Christina, our executive producer Maya, and our contributor Joey. 861 00:48:44,680 --> 00:48:47,359 Speaker 1: Thank you and thanks to you for listening. Stuff Mom Never 862 00:48:47,360 --> 00:48:48,600 Speaker 1: Told You is a production of iHeartRadio. 863 00:48:48,640 --> 00:48:50,239 Speaker 2: For more podcasts from iHeartRadio, you can check 864 00:48:50,239 --> 00:48:52,160 Speaker 2: out the iHeartRadio app, Apple Podcasts, or wherever you 865 00:48:52,239 --> 00:49:00,680 Speaker 2: listen to your favorite shows. Oh no,