1 00:00:05,160 --> 00:00:07,680 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom 2 00:00:07,760 --> 00:00:19,520 Speaker 1: Never Told You, a production of iHeartRadio, and today we 3 00:00:19,560 --> 00:00:24,600 Speaker 1: are thrilled, rejoicing, so glad to be joined by 4 00:00:24,640 --> 00:00:30,040 Speaker 1: the much-missed and very busy Bridget Todd. Welcome. Oh, 5 00:00:30,080 --> 00:00:32,520 Speaker 1: I'm so excited to be back. It's been too long. 6 00:00:33,120 --> 00:00:35,360 Speaker 1: It has been too long. You have been very busy. 7 00:00:35,400 --> 00:00:39,080 Speaker 1: Can you give us a snapshot? What's been going on? Goodness, 8 00:00:39,280 --> 00:00:42,120 Speaker 1: let us live vicariously through you. Yeah, it's been a, 9 00:00:42,200 --> 00:00:45,240 Speaker 1: it's been a weird summer. I spent a month in Portugal, 10 00:00:45,280 --> 00:00:48,600 Speaker 1: which was super fun. I was there during a historic 11 00:00:48,800 --> 00:00:50,600 Speaker 1: heat wave, and I was staying in a place that 12 00:00:50,640 --> 00:00:55,520 Speaker 1: did not have air conditioning, so that was interesting. Mm. Yeah, 13 00:00:55,600 --> 00:00:58,160 Speaker 1: it's funny. That will probably, and this sounds so depressing, 14 00:00:58,160 --> 00:01:02,440 Speaker 1: that'll probably be, like, the coolest summer that any 15 00:01:02,480 --> 00:01:06,720 Speaker 1: of us will have, considering how... oh, it's a 16 00:01:06,800 --> 00:01:12,360 Speaker 1: hundred and twenty degrees. Cool, cool, cool, cool, cool, cool, cool. Yeah, yeah, 17 00:01:12,520 --> 00:01:15,720 Speaker 1: I did some family travel. I got to go to 18 00:01:16,040 --> 00:01:19,880 Speaker 1: Barbados with my brother and my sister-in-law and 19 00:01:20,040 --> 00:01:22,880 Speaker 1: my sweet, the sweetest little baby, my niece, who was 20 00:01:22,920 --> 00:01:27,160 Speaker 1: adorable, where I learned that she knows how to swim underwater, 21 00:01:27,280 --> 00:01:32,920 Speaker 1: put her face underwater, and she loves it. I love it. 22 00:01:33,160 --> 00:01:35,720 Speaker 1: I love it. That's impressive. Yeah, she's an island baby. I 23 00:01:35,800 --> 00:01:40,040 Speaker 1: also learned, I won't say mistakenly, but, like, unexpectedly, that 24 00:01:40,120 --> 00:01:43,520 Speaker 1: in Barbados they drive on a different side of the road, 25 00:01:43,640 --> 00:01:45,600 Speaker 1: and I didn't learn that until I got into a 26 00:01:45,680 --> 00:01:47,960 Speaker 1: rental car and I was like, well, the steering wheel's 27 00:01:47,960 --> 00:01:51,120 Speaker 1: in the wrong place. She was like, what do you mean? Well, 28 00:01:51,560 --> 00:01:53,480 Speaker 1: at least you found that out before you got on the road. It sounds 29 00:01:53,560 --> 00:01:59,120 Speaker 1: like that was good, if unexpected. That could have been 30 00:01:59,160 --> 00:02:01,240 Speaker 1: a whole other level of... yeah. So, for 31 00:02:01,280 --> 00:02:05,120 Speaker 1: anyone listening that's doing any, like, travel, definitely research which 32 00:02:05,120 --> 00:02:06,760 Speaker 1: side of the road they drive on in the country that 33 00:02:06,840 --> 00:02:10,280 Speaker 1: you intend to drive in for a week. That's brave. 34 00:02:10,520 --> 00:02:12,160 Speaker 1: I don't, I don't know if I would drive in 35 00:02:12,520 --> 00:02:15,160 Speaker 1: a different country. It makes me nervous. Oh, I completely 36 00:02:15,200 --> 00:02:17,320 Speaker 1: get it. My partner did it. I would have... he said...
37 00:02:17,639 --> 00:02:19,120 Speaker 1: I was like, oh, do you think I could handle 38 00:02:19,440 --> 00:02:21,919 Speaker 1: driving here, and he was like, you would immediately start crying. 39 00:02:22,080 --> 00:02:25,880 Speaker 1: Immediately start crying, which is accurate. So I didn't drive, 40 00:02:25,919 --> 00:02:28,800 Speaker 1: but props to him for being able to navigate that situation. 41 00:02:30,760 --> 00:02:34,400 Speaker 1: It's funny because now it's like I literally have to 42 00:02:34,520 --> 00:02:37,959 Speaker 1: go out, as if my car is a pet, and 43 00:02:38,160 --> 00:02:41,120 Speaker 1: start it once a week because I drive so rarely. But 44 00:02:41,240 --> 00:02:44,519 Speaker 1: I learned to drive in Australia and I've driven in 45 00:02:44,600 --> 00:02:47,080 Speaker 1: several different countries. I've come so far. I went from 46 00:02:47,160 --> 00:02:49,840 Speaker 1: like, oh, yeah, I'll drive a stick shift. Yeah, so 47 00:02:49,960 --> 00:02:52,240 Speaker 1: now, like, you can't get me in that car. Oh 48 00:02:52,320 --> 00:02:55,000 Speaker 1: my God. Well, in Australia, not only did they 49 00:02:55,080 --> 00:02:56,640 Speaker 1: drive on the other side of the road, but they 50 00:02:56,680 --> 00:03:00,440 Speaker 1: also are notorious for, like, being quite intense drivers. Like, 51 00:03:00,520 --> 00:03:03,919 Speaker 1: what was that like? It was scary. It was a 52 00:03:04,040 --> 00:03:07,639 Speaker 1: stick shift and it was on the other side and 53 00:03:07,760 --> 00:03:10,120 Speaker 1: I was, like, driving at three a.m. one night and 54 00:03:10,200 --> 00:03:11,880 Speaker 1: I was trying to turn and I saw... I didn't 55 00:03:11,880 --> 00:03:14,040 Speaker 1: have a license, everybody, so please don't come at me, 56 00:03:14,400 --> 00:03:16,840 Speaker 1: but I could have been deported. So I was, like, 57 00:03:17,080 --> 00:03:20,240 Speaker 1: trying to make a smooth turn away from the police 58 00:03:20,919 --> 00:03:23,959 Speaker 1: and it stalled in the middle of the intersection in 59 00:03:24,080 --> 00:03:27,160 Speaker 1: front of the police, of course, of course. So I 60 00:03:27,360 --> 00:03:30,200 Speaker 1: drove around and I went through a drive-through and 61 00:03:30,280 --> 00:03:32,440 Speaker 1: their lights turned on, but they couldn't find 62 00:03:32,520 --> 00:03:38,119 Speaker 1: me because of my clever drive-through ruse. They're not gonna... 63 00:03:38,320 --> 00:03:41,320 Speaker 1: this is like, um, Fast and the Furious, like 2 Fast, 64 00:03:42,240 --> 00:03:53,640 Speaker 1: any edition. Yeah, I'm starting to understand why you may 65 00:03:53,680 --> 00:03:57,040 Speaker 1: be banned from a couple of countries now. Yeah, and 66 00:03:57,160 --> 00:03:59,920 Speaker 1: it's my poor driving skills. I never was in any accident, 67 00:04:00,160 --> 00:04:07,440 Speaker 1: but should I have been on the road? Probably not. Probably. Yeah. Also, 68 00:04:07,520 --> 00:04:09,360 Speaker 1: I love... there's a cat in the mix. Listeners, you 69 00:04:09,440 --> 00:04:11,720 Speaker 1: might hear a cat. You might hear a cat in 70 00:04:11,800 --> 00:04:15,240 Speaker 1: the mix. Her name is Monica. She's a million years old. 71 00:04:15,520 --> 00:04:18,560 Speaker 1: There's a mouse in our apartment, I'm sad to say, 72 00:04:18,720 --> 00:04:21,160 Speaker 1: so we're, we're all on high alert. You might 73 00:04:21,200 --> 00:04:24,440 Speaker 1: hear some, some meows.
If the mouse comes out, 74 00:04:26,400 --> 00:04:28,240 Speaker 1: that would be, you know... like, what is it? What 75 00:04:28,400 --> 00:04:31,280 Speaker 1: is that called? Is there a word for podcasting? Must- 76 00:04:31,520 --> 00:04:36,000 Speaker 1: hear podcasting. Yeah, that's what we're trying to capture. Here 77 00:04:36,080 --> 00:04:38,640 Speaker 1: we go with this. I wish you all could see, 78 00:04:38,680 --> 00:04:42,440 Speaker 1: because, like, all day she has been perched 79 00:04:42,560 --> 00:04:44,400 Speaker 1: where she last saw this mouse. Like, you would think 80 00:04:44,400 --> 00:04:48,320 Speaker 1: that she is, like, a soldier, like, like, on patrol, 81 00:04:49,000 --> 00:04:52,000 Speaker 1: trying to scope out this mouse. She's been called to duty. 82 00:04:52,080 --> 00:04:56,200 Speaker 1: She knows. Yes, I love it. I love it. Get it, Monica, 83 00:04:56,360 --> 00:05:01,920 Speaker 1: we all believe in you. And then, before this, I 84 00:05:02,120 --> 00:05:03,680 Speaker 1: wanted to bring this up because I think it kind 85 00:05:03,720 --> 00:05:06,000 Speaker 1: of vaguely relates to what we're talking about. We were 86 00:05:06,080 --> 00:05:08,920 Speaker 1: talking about, I wonder if younger listeners know about this, 87 00:05:09,680 --> 00:05:11,840 Speaker 1: but back in the early days of the Internet, we 88 00:05:11,880 --> 00:05:16,440 Speaker 1: would get these, like, letters, these chain email letters, that 89 00:05:16,520 --> 00:05:20,200 Speaker 1: would be like, if you don't send this to ten 90 00:05:20,320 --> 00:05:28,760 Speaker 1: other people, Bloody Mary's coming for you, and I would 91 00:05:28,760 --> 00:05:34,520 Speaker 1: be like, oh no, not Bloody Mary. That is so 92 00:05:34,760 --> 00:05:37,880 Speaker 1: specific, because, like I said, I never really got those types. 93 00:05:37,960 --> 00:05:40,720 Speaker 1: But the fact that it's Bloody Mary, why people are 94 00:05:40,920 --> 00:05:45,480 Speaker 1: using old folk tales and doing it in, like, more, 95 00:05:46,240 --> 00:05:49,480 Speaker 1: I guess, relevant technologies. Is that what this is? 96 00:05:49,520 --> 00:05:51,520 Speaker 1: Instead of saying the name, which, you've already said 97 00:05:51,560 --> 00:05:54,800 Speaker 1: it three times... oh no, you've done it. She's 98 00:05:54,839 --> 00:05:57,560 Speaker 1: not looking at a mirror, though. That's why we shouldn't 99 00:05:57,600 --> 00:06:04,240 Speaker 1: see ourselves, so we're level. And then, did she bring us 100 00:06:04,320 --> 00:06:09,040 Speaker 1: with her? Did she... with her? Did she stay with 101 00:06:09,200 --> 00:06:12,360 Speaker 1: us? Present? I'm starting to become a little worried, but I'm 102 00:06:12,400 --> 00:06:16,560 Speaker 1: just saying, yeah, just, like, the traveling of, uh, new technologies, 103 00:06:16,800 --> 00:06:20,599 Speaker 1: I guess, updated ideas, instead of being from the mirror 104 00:06:21,000 --> 00:06:24,479 Speaker 1: to the Internet to Zoom calls, or, for us, 105 00:06:24,640 --> 00:06:27,320 Speaker 1: Skype calls. Is this a thing? I think it's a thing.
106 00:06:27,520 --> 00:06:31,680 Speaker 1: It's, it's so interesting, like, how these fables and folk 107 00:06:31,720 --> 00:06:34,320 Speaker 1: tales have been around since I was a kid, how 108 00:06:34,400 --> 00:06:37,160 Speaker 1: they've been updated for the Internet age, where it's like, oh, 109 00:06:37,240 --> 00:06:40,000 Speaker 1: maybe you'll... like, I can, I can confirm that, the 110 00:06:40,120 --> 00:06:42,920 Speaker 1: same way that we used to get those chain emails, 111 00:06:43,080 --> 00:06:45,400 Speaker 1: send this to ten friends or Bloody Mary will kill you. 112 00:06:46,120 --> 00:06:50,200 Speaker 1: That's one for me. I'm not going to say I can't. 113 00:06:50,240 --> 00:06:54,160 Speaker 1: Similar things are still floating around, especially on WhatsApp. I 114 00:06:54,200 --> 00:06:56,440 Speaker 1: have seen and gotten messages that are like, oh, send 115 00:06:56,480 --> 00:06:59,080 Speaker 1: this to ten friends and you'll have good luck, or, 116 00:06:59,560 --> 00:07:02,000 Speaker 1: you know, it'll be some, some corny joke that 117 00:07:02,080 --> 00:07:05,040 Speaker 1: will end with, send this to ten people. So 118 00:07:05,120 --> 00:07:08,120 Speaker 1: I can confirm that those things that we grew up 119 00:07:08,200 --> 00:07:10,400 Speaker 1: with are still around, yeah, and WhatsApp is a, is a 120 00:07:11,560 --> 00:07:17,640 Speaker 1: big vector of that kind of, like, chain email style messaging. Yeah, 121 00:07:17,760 --> 00:07:19,640 Speaker 1: I think I'm not cool enough to receive any of 122 00:07:19,760 --> 00:07:25,600 Speaker 1: these threats. I'm good. I did, however, see it actually 123 00:07:25,720 --> 00:07:29,120 Speaker 1: mailed to someone, like, when I was at a rental 124 00:07:29,200 --> 00:07:33,520 Speaker 1: place where they got mail. It was a church mailing this 125 00:07:33,640 --> 00:07:37,400 Speaker 1: woman that she needed to send back, I guess, witnessing 126 00:07:37,760 --> 00:07:40,280 Speaker 1: letters to people or she would not be blessed, and 127 00:07:40,320 --> 00:07:42,920 Speaker 1: I was like, wow, wow, this church is really going 128 00:07:42,960 --> 00:07:45,160 Speaker 1: after people. But they did it through snail mail, and 129 00:07:45,240 --> 00:07:46,960 Speaker 1: I was like... and they're just throwing it all the 130 00:07:47,000 --> 00:07:49,520 Speaker 1: way back. I have to say, it almost kind of 131 00:07:49,600 --> 00:07:54,440 Speaker 1: sounds like low-key biblical extortion, like, you want this blessing, 132 00:07:55,440 --> 00:07:57,040 Speaker 1: send this back if you know what's good for you. 133 00:07:59,040 --> 00:08:01,400 Speaker 1: I was like, I don't know how this is working for people. 134 00:08:01,520 --> 00:08:04,800 Speaker 1: Why are they sending this out, like, physically sending this out? 135 00:08:04,920 --> 00:08:07,360 Speaker 1: But that's a whole different story. Yeah, I think it 136 00:08:07,440 --> 00:08:09,480 Speaker 1: goes back to, we've talked about this before, like, maybe 137 00:08:09,480 --> 00:08:12,120 Speaker 1: you don't believe something, but you're, like, just nervous enough 138 00:08:12,200 --> 00:08:14,600 Speaker 1: to be like, alright, I don't want to mess with it.
139 00:08:16,080 --> 00:08:18,520 Speaker 1: That's actually a very common, I mean, not to get 140 00:08:18,560 --> 00:08:21,440 Speaker 1: all, like, dorky, but that's actually a very common thing 141 00:08:21,560 --> 00:08:25,680 Speaker 1: that we see in disinformation circles, or misinformation circles more specifically, 142 00:08:26,080 --> 00:08:29,280 Speaker 1: where people come across some information, they don't know if 143 00:08:29,320 --> 00:08:31,040 Speaker 1: it's true or not, but they're like, let me share 144 00:08:31,080 --> 00:08:33,079 Speaker 1: this just to be safe. Like, I don't really know, 145 00:08:33,240 --> 00:08:35,680 Speaker 1: but I want, I want to share this. My mom, 146 00:08:35,920 --> 00:08:38,719 Speaker 1: God love her, she's, she's big into this. Like, she's the 147 00:08:38,840 --> 00:08:41,400 Speaker 1: mom who, when you get those messages where it's like, oh, 148 00:08:42,120 --> 00:08:45,319 Speaker 1: if somebody puts a piece of paper, a flyer, on 149 00:08:45,400 --> 00:08:48,040 Speaker 1: your windshield, don't get out of the car, it's gang members... 150 00:08:48,160 --> 00:08:50,920 Speaker 1: like, she, I think that she's... I definitely see 151 00:08:50,960 --> 00:08:53,760 Speaker 1: it as, like, an act or a gesture of love 152 00:08:53,880 --> 00:08:56,199 Speaker 1: and protection. But I'm like, Mom, if some, if a 153 00:08:56,280 --> 00:08:58,199 Speaker 1: gang member wants to kidnap me, they're not going to 154 00:08:58,240 --> 00:09:00,559 Speaker 1: put a piece of paper on my windshield. They'll just 155 00:09:00,840 --> 00:09:19,480 Speaker 1: do it. Like, think about it. Yes. So true. This 156 00:09:19,679 --> 00:09:22,319 Speaker 1: is kind of a convoluted segue, but I wanted to 157 00:09:22,360 --> 00:09:24,199 Speaker 1: bring it up because it does bring me joy to 158 00:09:24,240 --> 00:09:28,400 Speaker 1: remember me and my old neo's angel email, um, receiving 159 00:09:28,440 --> 00:09:32,920 Speaker 1: these things. But also I do think it kind of 160 00:09:33,200 --> 00:09:36,800 Speaker 1: tangentially relates to what we're talking about today, in that 161 00:09:36,960 --> 00:09:40,560 Speaker 1: there seem to be, like, these messages coming from, we're 162 00:09:40,640 --> 00:09:44,920 Speaker 1: not sure where. Um. So I'm very eager to learn 163 00:09:45,000 --> 00:09:49,959 Speaker 1: more about this, podcaster. A podcaster, that was a masterful transition. 164 00:09:51,080 --> 00:09:54,880 Speaker 1: Thank you. Thank you. But yes, I'm very eager to 165 00:09:55,000 --> 00:09:57,720 Speaker 1: learn more about this. Can you tell us what we're 166 00:09:57,880 --> 00:10:01,440 Speaker 1: going to be discussing today? Yes. So, as you just said, Annie, 167 00:10:01,600 --> 00:10:04,760 Speaker 1: it is sort of related to this idea of 168 00:10:05,520 --> 00:10:10,480 Speaker 1: who is able to really control conversations on the Internet 169 00:10:10,520 --> 00:10:13,679 Speaker 1: about one specific subject, and so today that topic is 170 00:10:13,760 --> 00:10:17,080 Speaker 1: going to be Meghan Markle. Um, as you all, I'm 171 00:10:17,120 --> 00:10:20,640 Speaker 1: sure, by now probably know, the Queen died last week. 172 00:10:20,800 --> 00:10:22,640 Speaker 1: So, I'm curious, do you two have any, 173 00:10:22,760 --> 00:10:26,000 Speaker 1: like, strong connection to the royal family, the monarchy?
Like, 174 00:10:26,160 --> 00:10:28,440 Speaker 1: is this something that you followed, or could you, like, 175 00:10:28,520 --> 00:10:30,559 Speaker 1: not care less? Have not followed it. I don't know 176 00:10:30,640 --> 00:10:34,880 Speaker 1: anything about it. I personally could not care less. 177 00:10:35,160 --> 00:10:38,920 Speaker 1: It's, it's been... it's impossible to escape, I will say, 178 00:10:39,080 --> 00:10:42,559 Speaker 1: right now. Um, but I don't have any real feeling 179 00:10:42,600 --> 00:10:46,679 Speaker 1: about it. I have some friends who do. I did 180 00:10:46,800 --> 00:10:49,360 Speaker 1: go through a period where I was really into, like, 181 00:10:50,920 --> 00:10:54,160 Speaker 1: entertainment from the UK, and specifically, like, I was in 182 00:10:54,240 --> 00:10:56,960 Speaker 1: love with the idea of moving to London. But I've 183 00:10:57,080 --> 00:11:01,240 Speaker 1: never been into the royal family. That's never made sense 184 00:11:01,320 --> 00:11:04,839 Speaker 1: to me personally, but I have friends who are, so 185 00:11:05,240 --> 00:11:07,319 Speaker 1: I guess I get it through them. So, I'm old 186 00:11:07,480 --> 00:11:12,079 Speaker 1: enough to remember, uh, like, vividly remember, instead of just 187 00:11:12,120 --> 00:11:13,520 Speaker 1: being, like, kind of, like, yeah, I was a kid, 188 00:11:13,559 --> 00:11:17,360 Speaker 1: I remember this: the death of Diana, Princess Diana, and also, uh, 189 00:11:17,520 --> 00:11:19,960 Speaker 1: the love for William, Prince William, who is, I think, 190 00:11:19,960 --> 00:11:21,840 Speaker 1: a couple of years younger than me. So, of course, 191 00:11:22,000 --> 00:11:25,960 Speaker 1: like, that era of trying to figure out, watching these 192 00:11:26,200 --> 00:11:30,959 Speaker 1: young princes grow up and following their stories. So, not 193 00:11:31,080 --> 00:11:34,920 Speaker 1: necessarily that I was caught up in the royal lifestyle 194 00:11:35,080 --> 00:11:40,040 Speaker 1: or the family, but I do remember that onset of seeing, uh, 195 00:11:40,400 --> 00:11:45,960 Speaker 1: the huge controversies amongst the family at that point in time. 196 00:11:46,400 --> 00:11:49,559 Speaker 1: Of course, with everything that has happened in the 197 00:11:49,640 --> 00:11:53,120 Speaker 1: past ten years, with Meghan Markle, even Prince William getting 198 00:11:53,160 --> 00:11:57,840 Speaker 1: married and all the different things, I have an understanding 199 00:11:57,920 --> 00:12:00,560 Speaker 1: of what's going on, kind of like you, as well 200 00:12:00,600 --> 00:12:03,400 Speaker 1: as the fact that, I have understood, the group of 201 00:12:03,440 --> 00:12:07,680 Speaker 1: people that I follow have very strong opinions on colonialism, 202 00:12:07,720 --> 00:12:10,160 Speaker 1: and I'm like, yeah, okay, that tracks. So that's kind of 203 00:12:10,200 --> 00:12:14,040 Speaker 1: where I stand in this mix. Totally. So, I'm somewhere 204 00:12:14,040 --> 00:12:16,120 Speaker 1: in the middle. I have to say, right up front, 205 00:12:16,240 --> 00:12:19,800 Speaker 1: I am not a careful follower of the royal family. 206 00:12:20,040 --> 00:12:24,280 Speaker 1: I kind of bounce between, like, ambivalent to, like, oh, 207 00:12:24,360 --> 00:12:26,440 Speaker 1: I know about this.
So I know that there are 208 00:12:26,480 --> 00:12:30,160 Speaker 1: people who are obsessively following royal family and monarchy culture, 209 00:12:30,160 --> 00:12:32,520 Speaker 1: and if I say anything that's incorrect, it's because 210 00:12:32,520 --> 00:12:36,400 Speaker 1: I am not a careful follower of the royal family. 211 00:12:36,960 --> 00:12:38,360 Speaker 1: But kind of like you, like, I have a lot 212 00:12:38,440 --> 00:12:41,640 Speaker 1: of friends who are really into it. Um, older... I 213 00:12:41,960 --> 00:12:45,839 Speaker 1: do have a theory that older moms, specifically, like, older 214 00:12:45,920 --> 00:12:49,360 Speaker 1: black moms, have a real affinity for Diana. So, growing 215 00:12:49,440 --> 00:12:53,800 Speaker 1: up, my mom loved Diana, cried when Diana died, still 216 00:12:53,840 --> 00:12:56,760 Speaker 1: has People magazines from, from that time with, like, you know, 217 00:12:57,000 --> 00:12:59,559 Speaker 1: what was it, the People's Princess and all that. Like, 218 00:12:59,720 --> 00:13:03,280 Speaker 1: so, definitely got a little bit of secondhand royal family 219 00:13:03,679 --> 00:13:06,600 Speaker 1: engagement from that, but I've never really been somebody who 220 00:13:07,320 --> 00:13:10,559 Speaker 1: followed it carefully. Obviously, when Meghan Markle came on the scene, 221 00:13:10,800 --> 00:13:13,640 Speaker 1: I loved that. I was definitely like, oh, black princess. 222 00:13:13,920 --> 00:13:15,640 Speaker 1: I know she's not a princess, but that was what 223 00:13:15,760 --> 00:13:18,280 Speaker 1: we said when, when she came on the scene. Um, 224 00:13:18,679 --> 00:13:20,440 Speaker 1: but yeah, and I think one of the reasons I'm 225 00:13:20,480 --> 00:13:23,959 Speaker 1: so fascinated, even as someone who doesn't really know 226 00:13:24,000 --> 00:13:26,760 Speaker 1: about the monarchy, why I'm so fascinated about this issue, 227 00:13:26,840 --> 00:13:29,719 Speaker 1: is that Meghan Markle, and the way that people talk 228 00:13:29,760 --> 00:13:32,680 Speaker 1: about her, I think, gives us a very interesting lens 229 00:13:32,920 --> 00:13:37,160 Speaker 1: into how women, particularly black women and women of color, 230 00:13:38,280 --> 00:13:42,800 Speaker 1: how our online experience is shaped by other people. And 231 00:13:43,040 --> 00:13:45,800 Speaker 1: Meghan Markle is not even on social media anymore, neither 232 00:13:45,880 --> 00:13:48,920 Speaker 1: is Prince Harry, and yet she is a constant focal 233 00:13:49,000 --> 00:13:52,120 Speaker 1: point on social media. And I think that we really 234 00:13:52,200 --> 00:13:54,280 Speaker 1: saw that with the death of the Queen. And even 235 00:13:54,320 --> 00:13:56,480 Speaker 1: though the Queen was born back in nineteen twenty-six, 236 00:13:56,559 --> 00:13:58,640 Speaker 1: and there was no such thing as social media back then, 237 00:13:59,080 --> 00:14:01,920 Speaker 1: by the time she died last week, social 238 00:14:01,960 --> 00:14:05,520 Speaker 1: media played a huge, overwhelming role in the way that 239 00:14:05,600 --> 00:14:08,280 Speaker 1: the royal family handled her death. You know, the news 240 00:14:08,360 --> 00:14:10,760 Speaker 1: itself was first announced to the public on the royal 241 00:14:10,840 --> 00:14:13,719 Speaker 1: family's Twitter before anyplace else.
And, you know, a 242 00:14:13,760 --> 00:14:16,000 Speaker 1: lot of the immediate next steps and actions that the 243 00:14:16,080 --> 00:14:20,000 Speaker 1: royal family took were circulated around social media. So, like, 244 00:14:20,360 --> 00:14:23,680 Speaker 1: halting government social media accounts and saying, like, oh, official 245 00:14:23,760 --> 00:14:27,080 Speaker 1: government accounts are only going to tweet, you know, essential information, 246 00:14:27,240 --> 00:14:30,200 Speaker 1: no non-essential posts, and, like, changing the royal family 247 00:14:30,280 --> 00:14:33,080 Speaker 1: website to be, like, a, like, a mourning, you know, 248 00:14:33,200 --> 00:14:36,520 Speaker 1: in-memoriam kind of thing. And in addition to the 249 00:14:36,600 --> 00:14:39,160 Speaker 1: way that social media has shaped the way the royals, 250 00:14:39,440 --> 00:14:42,160 Speaker 1: you know, present to the public, we have seen that 251 00:14:43,000 --> 00:14:45,360 Speaker 1: it also presents this place for Meghan Markle to be 252 00:14:45,440 --> 00:14:49,800 Speaker 1: faced with pretty intense criticism, but also things like outright lies, 253 00:14:49,960 --> 00:14:54,560 Speaker 1: conspiracy theories, racist and sexist smears and stereotypes about who she is. 254 00:14:55,040 --> 00:14:58,040 Speaker 1: And it really is not the first time, like, we 255 00:14:58,240 --> 00:15:00,400 Speaker 1: have seen this kind of stuff going on with Meghan 256 00:15:00,440 --> 00:15:03,600 Speaker 1: Markle before the Queen's death, obviously, and it's so interesting 257 00:15:03,640 --> 00:15:07,640 Speaker 1: to me that this, this event, the Queen's death, was 258 00:15:07,880 --> 00:15:11,320 Speaker 1: used not just as a way to, like, memorialize her 259 00:15:11,560 --> 00:15:14,200 Speaker 1: or, like, critique her reign or anything like that. It 260 00:15:14,360 --> 00:15:17,920 Speaker 1: was also used as a way to further slam Meghan 261 00:15:18,040 --> 00:15:22,920 Speaker 1: Markle on social media. Yes, yes. And as someone who, 262 00:15:23,600 --> 00:15:26,400 Speaker 1: as I said, like, I'm not... I don't follow this 263 00:15:26,520 --> 00:15:29,600 Speaker 1: a lot, I don't even get on Twitter a lot; even 264 00:15:29,720 --> 00:15:32,040 Speaker 1: I was aware that this was happening. Like, I saw 265 00:15:32,160 --> 00:15:35,760 Speaker 1: it trending on my brief, like, what's-going-on-on-Twitter check, 266 00:15:36,520 --> 00:15:40,560 Speaker 1: and I had enough knowledge to be like, I bet 267 00:15:40,760 --> 00:15:44,760 Speaker 1: this is coming from a really terrible, racist place when 268 00:15:44,760 --> 00:15:48,640 Speaker 1: I saw, like, Meghan Markle Go Home trending. So, yeah, 269 00:15:49,080 --> 00:15:52,440 Speaker 1: can you explain what all of that was? What was 270 00:15:52,560 --> 00:15:56,000 Speaker 1: the social media response? Yeah, I mean, it's almost exactly 271 00:15:56,080 --> 00:15:58,600 Speaker 1: like you said. I once interviewed somebody who 272 00:15:58,640 --> 00:16:02,480 Speaker 1: writes about fandoms and, and culture, and she described it interestingly. 273 00:16:02,560 --> 00:16:04,920 Speaker 1: She described it as an anti-fandom, which I had 274 00:16:04,960 --> 00:16:07,160 Speaker 1: never heard before, and so it's just like what you 275 00:16:07,280 --> 00:16:10,240 Speaker 1: think of as a fandom, but in the opposite direction.
So, 276 00:16:10,400 --> 00:16:13,720 Speaker 1: like, people whose whole thing is, I love to 277 00:16:13,880 --> 00:16:16,840 Speaker 1: hate this one person. I love to create content about 278 00:16:16,840 --> 00:16:18,920 Speaker 1: how much I don't like them. That's my thing. And 279 00:16:19,040 --> 00:16:22,680 Speaker 1: so that kind of messaging was in full effect after 280 00:16:22,760 --> 00:16:25,040 Speaker 1: the Queen died. As you said, the hashtag Meghan Markle 281 00:16:25,120 --> 00:16:28,360 Speaker 1: Go Home was sort of trending on Twitter, and basically, what 282 00:16:28,480 --> 00:16:31,600 Speaker 1: I found so interesting about that is that first, people 283 00:16:31,800 --> 00:16:34,720 Speaker 1: were picking up, picking on her because she did not 284 00:16:34,960 --> 00:16:40,000 Speaker 1: go to England with Harry, her husband, after the Queen died. 285 00:16:40,240 --> 00:16:42,600 Speaker 1: And again, I'm not the expert here, but it sounds 286 00:16:42,640 --> 00:16:45,160 Speaker 1: like she just was not invited, and so not going 287 00:16:46,000 --> 00:16:49,440 Speaker 1: was seen as a bad thing. And then, when she actually did go, 288 00:16:50,000 --> 00:16:52,200 Speaker 1: everybody was like, oh, go home, go home. So it's 289 00:16:52,240 --> 00:16:55,360 Speaker 1: interesting how it kind of doesn't matter what she does. 290 00:16:55,600 --> 00:16:58,280 Speaker 1: If she doesn't go, that's bad; if she does go, 291 00:16:58,720 --> 00:17:02,280 Speaker 1: that's also bad. And I think it really illustrates how, yeah, 292 00:17:02,320 --> 00:17:05,080 Speaker 1: it's like this impossible tightrope where it doesn't... it's, it's 293 00:17:05,119 --> 00:17:08,760 Speaker 1: clearly not really about her actions and what she actually 294 00:17:08,880 --> 00:17:12,280 Speaker 1: does or says. It's just about her, it's about her existence, 295 00:17:12,359 --> 00:17:15,399 Speaker 1: her presence, that is threatening, that is not okay with 296 00:17:15,440 --> 00:17:18,520 Speaker 1: the people that hate her, and ultimately it's, like, an 297 00:17:18,560 --> 00:17:22,320 Speaker 1: impossible situation to walk. Um, there was just, like, a very 298 00:17:22,680 --> 00:17:27,000 Speaker 1: fascinating instance where she was videoed, and somebody in a 299 00:17:27,080 --> 00:17:30,520 Speaker 1: crowd handed her some flowers, and an aide seems to 300 00:17:30,560 --> 00:17:31,800 Speaker 1: walk up to her and is like, oh, I'll take 301 00:17:31,840 --> 00:17:33,720 Speaker 1: those for you, and she says to... she, she mouths 302 00:17:33,760 --> 00:17:36,280 Speaker 1: something to him that, like, seems to be, oh my God, 303 00:17:36,320 --> 00:17:38,960 Speaker 1: no, don't worry about it, thank you. And you would... 304 00:17:39,000 --> 00:17:41,040 Speaker 1: the way that people talked about this on the Internet, 305 00:17:41,400 --> 00:17:43,880 Speaker 1: you would think that she punched this aide, and it's 306 00:17:43,920 --> 00:17:48,000 Speaker 1: like, actually, she's just, like, calmly standing there, saying a sentence, 307 00:17:48,400 --> 00:17:50,679 Speaker 1: smiling, and then, like, clearly saying thank you, and it's 308 00:17:50,680 --> 00:17:54,760 Speaker 1: like a thirty-second interaction. People were dissecting it like 309 00:17:54,880 --> 00:17:58,359 Speaker 1: the JFK Zapruder film. Like, she, like... people 310 00:17:58,440 --> 00:18:01,879 Speaker 1: became, like, overnight body language experts and lip readers, and 311 00:18:02,000 --> 00:18:10,320 Speaker 1: it was wild.
Yeah, yeah. I want to return to 312 00:18:10,400 --> 00:18:13,000 Speaker 1: that idea of anti-fandom, for sure, because I think, 313 00:18:13,800 --> 00:18:15,439 Speaker 1: as someone who is, like, a fan of a lot 314 00:18:15,480 --> 00:18:18,920 Speaker 1: of stuff, I've seen this before, and it just becomes, 315 00:18:19,040 --> 00:18:24,000 Speaker 1: like, this thing where people feel justified in hating someone. And 316 00:18:24,080 --> 00:18:27,040 Speaker 1: here's the proof, and I am a body language expert, 317 00:18:27,080 --> 00:18:30,040 Speaker 1: and I can tell you why. Um, but all it 318 00:18:30,200 --> 00:18:33,840 Speaker 1: is is, like, they just want to hate that person, 319 00:18:33,960 --> 00:18:41,159 Speaker 1: who usually they feel is encroaching on their territory. And, like, 320 00:18:41,320 --> 00:18:43,359 Speaker 1: as someone who is interested in a lot of, like, 321 00:18:43,520 --> 00:18:47,720 Speaker 1: Star Wars stuff, nerdy stuff, fandom stuff, I'm sure you've 322 00:18:47,760 --> 00:18:52,600 Speaker 1: seen this a lot. Yes. Oh, I cannot wait to 323 00:18:52,680 --> 00:18:55,040 Speaker 1: talk about it once we get more in depth about 324 00:18:55,119 --> 00:18:58,440 Speaker 1: what might be going on here, because, yeah, it just, 325 00:18:58,760 --> 00:19:02,680 Speaker 1: like, it becomes... um, I think I've said to Samantha before, 326 00:19:02,680 --> 00:19:05,320 Speaker 1: it's like, this anti-fandom thing is ringing so 327 00:19:05,400 --> 00:19:06,720 Speaker 1: true with me, because I'm like, well, you're not a 328 00:19:06,800 --> 00:19:09,440 Speaker 1: fan anymore. You're just all about the hate of this. 329 00:19:10,480 --> 00:19:15,520 Speaker 1: You just are unhappy. So I don't know why you're here. 330 00:19:16,800 --> 00:19:19,399 Speaker 1: It's to make everyone else feel unhappy and to, like, 331 00:19:19,480 --> 00:19:22,320 Speaker 1: assert your, what you think is your, power and dominance 332 00:19:22,359 --> 00:19:26,280 Speaker 1: in the space. But you're not a fan. If you 333 00:19:26,400 --> 00:19:30,919 Speaker 1: hate it this much, just go, just go along, 334 00:19:32,240 --> 00:19:39,000 Speaker 1: it's gonna be okay. Oh yeah. And, I mean, like, 335 00:19:39,080 --> 00:19:44,360 Speaker 1: the lengths people will go to, the things they'll do to scare, 336 00:19:46,560 --> 00:19:53,800 Speaker 1: usually, women and marginalized people off the Internet is wild to me, um, 337 00:19:53,960 --> 00:19:57,520 Speaker 1: and unfortunately very often successful. But, like, they will come 338 00:19:57,600 --> 00:20:00,680 Speaker 1: up with all of these things. And you, you brought 339 00:20:00,720 --> 00:20:03,199 Speaker 1: up this example of, like, conspiracy theory, right, where they were 340 00:20:03,240 --> 00:20:09,159 Speaker 1: like, this... something is wrong here. Yeah. So, like, I 341 00:20:09,200 --> 00:20:13,560 Speaker 1: mean, the amount of wild conspiracy theories, like... people... it's 342 00:20:13,600 --> 00:20:16,240 Speaker 1: clearly a lot of, like, projection, where people already have 343 00:20:17,040 --> 00:20:21,640 Speaker 1: a negative association with someone, and then any little thing 344 00:20:22,119 --> 00:20:25,760 Speaker 1: will be used as proof to stack there, to stack that, 345 00:20:26,000 --> 00:20:28,520 Speaker 1: that thinking, right.
And so one of the big conspiracies 346 00:20:28,560 --> 00:20:31,080 Speaker 1: that came out of the Queen's funeral was that Meghan 347 00:20:31,359 --> 00:20:35,960 Speaker 1: attended the funeral wearing a hidden recording device under her dress. 348 00:20:36,440 --> 00:20:37,960 Speaker 1: And the reason why people thought this is because she 349 00:20:38,040 --> 00:20:41,359 Speaker 1: was photographed, and in one of the pictures there's, like, 350 00:20:41,480 --> 00:20:44,520 Speaker 1: a wrinkle or a bump on her thigh, and people 351 00:20:44,600 --> 00:20:48,119 Speaker 1: were like, oh, the gall. She's gonna... what, 352 00:20:48,440 --> 00:20:50,399 Speaker 1: is she recording this for, her podcast or for some 353 00:20:50,600 --> 00:20:54,040 Speaker 1: Netflix special? So shameless. And it's like, yo, I have 354 00:20:54,280 --> 00:20:57,959 Speaker 1: worn those kinds of recording devices in my clothing. They 355 00:20:58,000 --> 00:21:00,680 Speaker 1: are so bulky. Also, if that was her plan, 356 00:21:00,760 --> 00:21:03,080 Speaker 1: don't you think she would have worn an outfit that 357 00:21:03,359 --> 00:21:06,320 Speaker 1: was more conducive? Like, she wouldn't have worn a dress 358 00:21:06,400 --> 00:21:08,000 Speaker 1: that, like, clings to that part of her body. 359 00:21:08,040 --> 00:21:10,320 Speaker 1: She might have worn something with pockets, like, you know, 360 00:21:10,480 --> 00:21:12,600 Speaker 1: she might have... don't, don't you think that, if that's 361 00:21:12,600 --> 00:21:14,960 Speaker 1: what she was doing, she would have done it better? 362 00:21:15,240 --> 00:21:18,000 Speaker 1: And isn't it more possible that, like, humans who wear 363 00:21:18,080 --> 00:21:21,080 Speaker 1: clothing and stand up wearing that clothing sometimes experience wrinkles? 364 00:21:21,200 --> 00:21:24,479 Speaker 1: Like, isn't that a much better explanation for 365 00:21:24,520 --> 00:21:30,120 Speaker 1: what's going on? I think you're being too logical, Bridget. Sensible. 366 00:21:31,600 --> 00:21:35,160 Speaker 1: This whole conversation is so odd to me, because Meghan 367 00:21:35,240 --> 00:21:39,520 Speaker 1: Markle just being, being present, marrying a man, has caused 368 00:21:39,640 --> 00:21:42,879 Speaker 1: this huge controversy, and of course that led to a lot of 369 00:21:42,920 --> 00:21:47,280 Speaker 1: conversations and comparisons to Princess Diana and Meghan Markle. Of course, 370 00:21:47,560 --> 00:21:52,440 Speaker 1: those same people who probably were degrading and/or criticizing 371 00:21:52,920 --> 00:21:56,120 Speaker 1: Princess Diana would now call her an angel and then 372 00:21:56,240 --> 00:21:58,680 Speaker 1: say that Meghan Markle is the one that's, uh, you know, 373 00:21:59,320 --> 00:22:02,040 Speaker 1: disrespecting the name of Princess Diana, and everyone was like, wait, what, 374 00:22:02,520 --> 00:22:05,560 Speaker 1: you're the ones who were number one in criticisms. But also, 375 00:22:05,680 --> 00:22:09,520 Speaker 1: just, like, the fact that Meghan Markle has nothing, honestly, 376 00:22:09,600 --> 00:22:13,840 Speaker 1: to gain in the entirety of this conversation, of these controversies, 377 00:22:13,880 --> 00:22:17,080 Speaker 1: to record any of these things, uh, to try... as 378 00:22:17,119 --> 00:22:19,000 Speaker 1: they say, she's trying to make money off these things. 379 00:22:19,320 --> 00:22:22,560 Speaker 1: She's not.
She's literally doing a podcast about her own life, 380 00:22:22,560 --> 00:22:25,480 Speaker 1: bringing in guests to talk about marginalized issues that have 381 00:22:25,560 --> 00:22:28,159 Speaker 1: nothing to do with the throne in general, as well 382 00:22:28,200 --> 00:22:31,520 Speaker 1: as the fact that Prince Harry's not that close, his 383 00:22:31,600 --> 00:22:34,040 Speaker 1: family is not close, in the lineage to become 384 00:22:34,200 --> 00:22:37,280 Speaker 1: king or queen or whatever, whatnot. We have already seen 385 00:22:37,440 --> 00:22:40,359 Speaker 1: the charts of how that happened. So, like, my mind is just 386 00:22:40,480 --> 00:22:42,800 Speaker 1: reeling of, like, why, why take this effort to bring 387 00:22:42,880 --> 00:22:49,440 Speaker 1: up this conspiracy, other than, hey, she's, uh, she is not white, 388 00:22:49,480 --> 00:22:52,760 Speaker 1: so therefore we hate her. Yeah, and you, you've touched 389 00:22:52,840 --> 00:22:54,439 Speaker 1: on something, this is a little bit of a tangent, 390 00:22:54,560 --> 00:22:56,760 Speaker 1: but you touched on something that I think about all 391 00:22:56,800 --> 00:23:01,160 Speaker 1: the time, which is how, you know, I was young 392 00:23:01,480 --> 00:23:04,760 Speaker 1: before Princess Diana died, but even I know, as, like, 393 00:23:04,840 --> 00:23:07,240 Speaker 1: a child living in the United States, even I knew 394 00:23:07,280 --> 00:23:09,200 Speaker 1: the way the press talked about her, even as a kid, 395 00:23:09,280 --> 00:23:12,560 Speaker 1: and so the press really tore her down, and then, 396 00:23:12,600 --> 00:23:16,080 Speaker 1: when she died, she's an angel we've always loved, it 397 00:23:16,240 --> 00:23:18,920 Speaker 1: was like that. And I see that so often, and I 398 00:23:19,040 --> 00:23:21,359 Speaker 1: do think it's, like, as bad as the media in 399 00:23:21,359 --> 00:23:23,320 Speaker 1: the United States is, I do feel like in the 400 00:23:23,480 --> 00:23:26,879 Speaker 1: UK and in England it's, like, it can be worse, 401 00:23:27,320 --> 00:23:31,119 Speaker 1: like, a little bit, like, um, more vicious. And I 402 00:23:31,200 --> 00:23:33,800 Speaker 1: think about, like, Amy Winehouse as the figure, because I 403 00:23:34,280 --> 00:23:36,440 Speaker 1: was a huge Amy Winehouse fan. Amy Winehouse is the 404 00:23:36,480 --> 00:23:39,359 Speaker 1: figure that I think about a lot, where the press 405 00:23:39,400 --> 00:23:42,880 Speaker 1: and the media loved to villainize her and really make 406 00:23:43,000 --> 00:23:46,120 Speaker 1: fun of her in these, like, cruel ways, and ways 407 00:23:46,200 --> 00:23:49,920 Speaker 1: that, I hope, when people are struggling with substance 408 00:23:50,080 --> 00:23:52,440 Speaker 1: issues now, I hope that we've gotten to a place where, 409 00:23:52,480 --> 00:23:55,520 Speaker 1: like, we don't cruelly mock them for it. But then, 410 00:23:55,600 --> 00:23:58,639 Speaker 1: when she died, the way that, that, that same press 411 00:23:59,119 --> 00:24:01,479 Speaker 1: rushed to lionize her, rushed to act as 412 00:24:01,480 --> 00:24:03,760 Speaker 1: if they had always loved her, and it's just, it's 413 00:24:03,800 --> 00:24:06,440 Speaker 1: hard to see, and, you know, it kind of breaks 414 00:24:06,520 --> 00:24:10,320 Speaker 1: my heart.
But there's this video clip of Harry talking 415 00:24:10,400 --> 00:24:14,159 Speaker 1: about this, and he, like, basically feels very emphatically that 416 00:24:14,320 --> 00:24:16,440 Speaker 1: the press is responsible for the death of his mother. 417 00:24:16,880 --> 00:24:19,480 Speaker 1: And then, when, when there was a time when Meghan 418 00:24:19,760 --> 00:24:24,520 Speaker 1: was open about experiencing, uh, you know, feelings of self-harm, 419 00:24:24,680 --> 00:24:27,720 Speaker 1: he talked about how he felt like he was 420 00:24:27,840 --> 00:24:31,879 Speaker 1: watching the press do the same thing to his new wife 421 00:24:32,119 --> 00:24:33,760 Speaker 1: that, that he had to watch them do to 422 00:24:33,920 --> 00:24:37,040 Speaker 1: his mom, and, I... something about that clip really sticks 423 00:24:37,080 --> 00:24:39,600 Speaker 1: with me. Of, like, I don't know, I don't think 424 00:24:39,600 --> 00:24:41,640 Speaker 1: I'll ever be able to understand what that must feel 425 00:24:41,680 --> 00:24:44,040 Speaker 1: like, to have watched that as a child and be 426 00:24:44,200 --> 00:24:47,639 Speaker 1: so powerless, and then watch it again and be like, no, 427 00:24:47,920 --> 00:24:51,320 Speaker 1: I'm not letting this happen again. And compounding that with 428 00:24:51,480 --> 00:24:55,280 Speaker 1: the fact that this is a racist attack on a 429 00:24:55,359 --> 00:24:58,440 Speaker 1: woman for just being in love with a man, point 430 00:24:58,520 --> 00:25:01,320 Speaker 1: blank. Um. But, you know, I found it interesting, because, 431 00:25:01,720 --> 00:25:06,280 Speaker 1: for me, I actually didn't know much about this hashtag. 432 00:25:06,359 --> 00:25:08,960 Speaker 1: I, I... as much as I'm on Twitter and seeing things... again, 433 00:25:09,040 --> 00:25:11,920 Speaker 1: like I said, my friends have very strong feelings about 434 00:25:12,520 --> 00:25:16,760 Speaker 1: the Queen and her responsibility and colonialism in itself. So 435 00:25:16,960 --> 00:25:19,760 Speaker 1: I had a whole different take on this conversation. So 436 00:25:19,800 --> 00:25:22,200 Speaker 1: I'm kind of surprised to know that this is happening, 437 00:25:22,400 --> 00:25:25,640 Speaker 1: and I'm really wondering, where did they all come from? 438 00:25:25,920 --> 00:25:27,359 Speaker 1: So this is a great question. So I have to 439 00:25:27,400 --> 00:25:32,119 Speaker 1: say I am probably in a similar digital pocket to the one 440 00:25:32,240 --> 00:25:35,440 Speaker 1: you are, because my timeline when the Queen died was, 441 00:25:35,560 --> 00:25:39,639 Speaker 1: like, very much black Twitter and, like, Irish Twitter and, I 442 00:25:39,720 --> 00:25:43,080 Speaker 1: would, like, say, Meghan Twitter, all coming together. I 443 00:25:43,200 --> 00:25:46,720 Speaker 1: know that people have... how can I put this? I 444 00:25:46,960 --> 00:25:49,320 Speaker 1: came... so, when the Queen died, I don't think I 445 00:25:49,480 --> 00:25:54,920 Speaker 1: realized that I knew people that had... a lot had... 446 00:25:55,600 --> 00:25:58,159 Speaker 1: I was surprised by how much reverence my friends in 447 00:25:58,160 --> 00:26:00,359 Speaker 1: the United States had for her. So I was like, okay, 448 00:26:00,440 --> 00:26:02,600 Speaker 1: she's definitely, like, a... I have friends who think of 449 00:26:02,640 --> 00:26:04,320 Speaker 1: her as, like, a, like, a feminist figure. I was 450 00:26:04,359 --> 00:26:07,040 Speaker 1: like, okay. Like, I was a little... again,
this is 451 00:26:07,080 --> 00:26:09,520 Speaker 1: not to, like... I don't want to, I don't want 452 00:26:09,520 --> 00:26:11,639 Speaker 1: to say... like, I really, I realized, I was like, oh, 453 00:26:11,800 --> 00:26:15,159 Speaker 1: people have complex feelings about her, and it kind of 454 00:26:15,240 --> 00:26:18,520 Speaker 1: almost became like a Rorschach test of where people 455 00:26:18,640 --> 00:26:20,720 Speaker 1: were at, and I was just very surprised to see... 456 00:26:20,720 --> 00:26:22,399 Speaker 1: it had not occurred to me that that was 457 00:26:22,480 --> 00:26:24,320 Speaker 1: going to be the way that it was, I'll put 458 00:26:24,359 --> 00:26:28,520 Speaker 1: it that way. But, yeah, like I said, so that's the 459 00:26:28,760 --> 00:26:31,000 Speaker 1: pocket I'm in. So I'm out of the loop on this one. 460 00:26:31,400 --> 00:26:34,639 Speaker 1: I did see the Real Housewives person go out on 461 00:26:34,720 --> 00:26:37,200 Speaker 1: a tangent on TikTok, and that was very obvious. Like, 462 00:26:37,400 --> 00:26:46,560 Speaker 1: she's jealous of the podcast numbers. Uh, fellow iHeart podcaster. Yeah, 463 00:26:46,760 --> 00:26:49,600 Speaker 1: so she's not gonna want to come onto any of them, 464 00:26:50,080 --> 00:26:53,479 Speaker 1: I don't feel like, anytime soon. I mean, I love Real 465 00:26:53,560 --> 00:26:58,720 Speaker 1: Housewives of New York. She definitely has an issue, like, 466 00:26:58,800 --> 00:27:01,440 Speaker 1: a clear... like, I saw that video. In that video 467 00:27:01,520 --> 00:27:04,280 Speaker 1: she's like, oh, Meghan Markle is a bad businesswoman. I 468 00:27:04,359 --> 00:27:05,520 Speaker 1: was like, what do you, what do you even know 469 00:27:05,600 --> 00:27:09,920 Speaker 1: about it? Like, I just remember one... like, yeah, first 470 00:27:09,960 --> 00:27:14,040 Speaker 1: of all, her podcast, Archetypes, toppled Joe Rogan's podcast 471 00:27:14,119 --> 00:27:16,359 Speaker 1: for the number one spot, which is not even... that, 472 00:27:16,480 --> 00:27:19,960 Speaker 1: that is a thing that has not happened in years and years. 473 00:27:20,160 --> 00:27:24,240 Speaker 1: So, I don't know, it's interesting to me. I think 474 00:27:24,280 --> 00:27:27,480 Speaker 1: that Meghan Markle, and we should talk about it, because 475 00:27:27,480 --> 00:27:31,200 Speaker 1: I do think Meghan Markle evokes a reaction in a 476 00:27:31,280 --> 00:27:35,000 Speaker 1: certain type of person. Just her presence, just her being around, 477 00:27:35,800 --> 00:27:42,000 Speaker 1: is threatening and, like, deeply... I think it can, like, 478 00:27:42,160 --> 00:27:47,000 Speaker 1: deeply challenge a lot of people's preconceived notions, right? So 479 00:27:47,080 --> 00:27:48,920 Speaker 1: I think that, like... I also think that Meghan Markle 480 00:27:48,960 --> 00:27:51,439 Speaker 1: at this point carries herself in a kind of way that, frankly, 481 00:27:51,480 --> 00:27:54,480 Speaker 1: I think a lot of, like, white people might have 482 00:27:54,560 --> 00:27:57,800 Speaker 1: a problem with. Not surprising. Yeah, but the thing is, like, 483 00:27:58,080 --> 00:28:00,320 Speaker 1: so that's the only thing I have seen. So I'm 484 00:28:00,320 --> 00:28:02,600 Speaker 1: trying to figure out where all these conversations are coming 485 00:28:02,720 --> 00:28:07,600 Speaker 1: from, and is it actually reflective of the rest of society, 486 00:28:07,720 --> 00:28:10,280 Speaker 1: the rest of the people on Twitter?
Well, so that 487 00:28:10,480 --> 00:28:13,760 Speaker 1: is a very, very interesting question. Uh, Christopher Bouzy, who 488 00:28:13,960 --> 00:28:16,040 Speaker 1: is the CEO of a company called Bot Sentinel, which 489 00:28:16,080 --> 00:28:18,920 Speaker 1: is an organization that analyzes Twitter data to determine, you know, 490 00:28:19,800 --> 00:28:22,560 Speaker 1: where conversations are coming from, who is generating them, whether 491 00:28:22,600 --> 00:28:25,439 Speaker 1: it's real people or whether it's bots. Um, he did an 492 00:28:25,600 --> 00:28:30,400 Speaker 1: entire analysis of Meghan Markle content in October of 2021, and basically, 493 00:28:30,840 --> 00:28:33,000 Speaker 1: you would be surprised that a lot of the hate 494 00:28:33,080 --> 00:28:35,840 Speaker 1: thrown at Meghan Markle is being driven by a lot 495 00:28:35,960 --> 00:28:52,520 Speaker 1: fewer people on social media than you might suspect. So 496 00:28:52,680 --> 00:28:57,000 Speaker 1: this I'm excited to talk about, because I remember, forever ago, Bridget, 497 00:28:57,080 --> 00:28:59,280 Speaker 1: when you and I did that episode on Star Wars and feminism, 498 00:28:59,760 --> 00:29:03,400 Speaker 1: there was a similar thing about Star Wars, about the 499 00:29:03,960 --> 00:29:08,080 Speaker 1: hate the actors received. Oh my God, that's right. Yes. 500 00:29:08,360 --> 00:29:10,920 Speaker 1: And I was shocked, and I, I've... I always try 501 00:29:10,960 --> 00:29:15,760 Speaker 1: to make the point, believe me, I know there are terrible, racist, 502 00:29:15,800 --> 00:29:19,760 Speaker 1: sexist fans, believe me, I know that. But the data 503 00:29:19,880 --> 00:29:22,720 Speaker 1: that came out was, like, it's way less posting about it. 504 00:29:22,880 --> 00:29:25,280 Speaker 1: I'm not gonna say it's way, way, well, less... it's 505 00:29:25,320 --> 00:29:30,840 Speaker 1: way less actually posting about it. And still, the perception 506 00:29:31,080 --> 00:29:37,120 Speaker 1: I got was there was this huge sexist sect 507 00:29:37,440 --> 00:29:41,440 Speaker 1: on Twitter that was, like, tearing down these actors and 508 00:29:41,640 --> 00:29:44,720 Speaker 1: was causing Rotten Tomatoes to, like, be review bombed. And the 509 00:29:44,800 --> 00:29:47,560 Speaker 1: report we talked about was like, it's actually a lot 510 00:29:48,240 --> 00:29:51,920 Speaker 1: of bots. And still, it has this huge impact that 511 00:29:52,120 --> 00:29:57,240 Speaker 1: changed Rotten Tomatoes and their policy, and it changed how 512 00:29:57,360 --> 00:30:04,400 Speaker 1: people perceive... like, Disney made creative decisions, because they're cowards, 513 00:30:05,160 --> 00:30:09,920 Speaker 1: based on this. Like, I'm just so glad you brought 514 00:30:09,960 --> 00:30:11,560 Speaker 1: that up. That's such a good point, and I think 515 00:30:11,800 --> 00:30:15,800 Speaker 1: it really goes to show that, like, a, a pretty 516 00:30:15,920 --> 00:30:22,160 Speaker 1: small minority of very motivated, very vocal, coordinated, I guess, 517 00:30:23,520 --> 00:30:27,360 Speaker 1: in a nice way to put it, passionate people can 518 00:30:27,640 --> 00:30:30,800 Speaker 1: really change discourse. They can really change, like, they can 519 00:30:31,240 --> 00:30:33,320 Speaker 1: really make an impact. And so I would also, I 520 00:30:33,320 --> 00:30:36,880 Speaker 1: would argue, then, that, like, there's a problem with our 521 00:30:37,360 --> 00:30:41,240 Speaker 1: social media platforms and our digital platforms.
If a small handful, 522 00:30:41,440 --> 00:30:46,160 Speaker 1: a relatively small handful, of dedicated trolls and, 523 00:30:46,320 --> 00:30:50,880 Speaker 1: like, haters can really impact and change discourse... like, 524 00:30:50,920 --> 00:30:53,600 Speaker 1: I would argue that that means that something is broken, something, 525 00:30:53,680 --> 00:30:57,200 Speaker 1: it's flawed. And I also think that, you know, when 526 00:30:57,360 --> 00:31:00,480 Speaker 1: you have a small handful of people dictating the 527 00:31:00,560 --> 00:31:05,360 Speaker 1: conversation and creating the impression that everybody hates 528 00:31:05,440 --> 00:31:08,160 Speaker 1: this new person on Star Wars or everybody hates Meghan Markle, 529 00:31:08,520 --> 00:31:11,320 Speaker 1: it's so much easier for somebody who's just a 530 00:31:11,480 --> 00:31:14,200 Speaker 1: casual viewer of this to get the idea that, like, oh, well, 531 00:31:14,600 --> 00:31:17,120 Speaker 1: everybody hates so-and-so, so, like, I'm not going 532 00:31:17,160 --> 00:31:18,880 Speaker 1: to chime in with, like, I like so-and-so, 533 00:31:19,120 --> 00:31:21,280 Speaker 1: or that I'm ambivalent about so-and-so. If I'm 534 00:31:21,280 --> 00:31:23,880 Speaker 1: going to chime in... it kind of poisons the well, 535 00:31:24,040 --> 00:31:25,840 Speaker 1: and it makes it so that we can't actually have 536 00:31:26,120 --> 00:31:30,400 Speaker 1: honest conversations rooted in what people actually feel, because it 537 00:31:30,400 --> 00:31:34,120 Speaker 1: just, like, creates an ecosystem where the hate is, it's 538 00:31:34,160 --> 00:31:36,280 Speaker 1: what's dictating the conversation. And so, yeah, I would argue 539 00:31:36,280 --> 00:31:38,960 Speaker 1: that that means that our ecosystems are not healthy and not, 540 00:31:39,520 --> 00:31:43,240 Speaker 1: you know, functioning properly if hate is able to dictate 541 00:31:43,320 --> 00:31:46,160 Speaker 1: the conversation, even if it's a small minority of people 542 00:31:46,160 --> 00:31:49,280 Speaker 1: who feel that way. Absolutely, absolutely, because I was under 543 00:31:49,280 --> 00:31:51,320 Speaker 1: the impression, I was totally under the impression, like, oh, 544 00:31:51,400 --> 00:31:54,320 Speaker 1: this must be a much bigger group of people than 545 00:31:54,360 --> 00:31:56,120 Speaker 1: I thought. Even if I was like, they're wrong and 546 00:31:56,160 --> 00:31:58,360 Speaker 1: they're full of hate, I still thought it was way bigger. 547 00:31:59,040 --> 00:32:04,080 Speaker 1: And the numbers you brought up about Meghan Markle kind of 548 00:32:04,200 --> 00:32:07,320 Speaker 1: shocked me with, like, how much smaller they were, based 549 00:32:07,360 --> 00:32:13,280 Speaker 1: on what I have seen as a casual viewer of this. Absolutely. 550 00:32:13,320 --> 00:32:15,440 Speaker 1: So let's dig into some of those numbers. So, from 551 00:32:15,560 --> 00:32:18,800 Speaker 1: this Bot Sentinel report, they found that only eighty-three 552 00:32:19,040 --> 00:32:24,600 Speaker 1: accounts on Twitter generate seventy percent of the Meghan Markle hate content 553 00:32:24,720 --> 00:32:28,560 Speaker 1: on Twitter. They estimate that these eighty-three accounts have 554 00:32:28,720 --> 00:32:33,000 Speaker 1: a potential combined reach of seventeen million users. So they 555 00:32:33,080 --> 00:32:35,720 Speaker 1: broke it down.
They found fifty-five of what we 556 00:32:35,800 --> 00:32:38,920 Speaker 1: call single-purpose anti-Meghan Markle hate accounts, and so 557 00:32:39,360 --> 00:32:41,480 Speaker 1: a single-purpose hate account is an account that, like, 558 00:32:41,880 --> 00:32:44,800 Speaker 1: it only exists to hate on one person, like that's, 559 00:32:45,000 --> 00:32:47,400 Speaker 1: like, primarily, like, that's what they're doing. 560 00:32:47,880 --> 00:32:51,720 Speaker 1: So, fifty-five of those accounts were these, like, primary 561 00:32:51,760 --> 00:32:55,200 Speaker 1: hate accounts, and then another twenty-eight were secondary accounts that, 562 00:32:55,320 --> 00:32:58,640 Speaker 1: like, mainly amplified them. And so those twenty-eight accounts, 563 00:32:58,800 --> 00:33:02,480 Speaker 1: they might post about, like, the royals more generally, but 564 00:33:02,640 --> 00:33:06,400 Speaker 1: their real function is to boost and amplify when 565 00:33:06,520 --> 00:33:10,600 Speaker 1: those, when those other primary accounts put out anti- 566 00:33:10,680 --> 00:33:13,440 Speaker 1: Meghan Markle hate, and so that, that's really it. 567 00:33:13,600 --> 00:33:16,600 Speaker 1: Like, they are generating a lot of the hate online. 568 00:33:16,680 --> 00:33:20,920 Speaker 1: It is not organic conversation, and it's certainly not a 569 00:33:21,000 --> 00:33:25,440 Speaker 1: reflection of how just everybody feels on social media, because 570 00:33:25,560 --> 00:33:28,000 Speaker 1: these, these accounts have such a big reach and such 571 00:33:28,040 --> 00:33:32,600 Speaker 1: a big ability to control the conversation about Meghan Markle. Yeah, yeah. 572 00:33:32,720 --> 00:33:34,600 Speaker 1: And that's something that you've, you've come on and you've 573 00:33:34,640 --> 00:33:38,640 Speaker 1: talked about a lot on here: the responsibility 574 00:33:38,680 --> 00:33:41,480 Speaker 1: of social media platforms, but also that kind of manipulation 575 00:33:42,080 --> 00:33:48,480 Speaker 1: toward these hateful accounts when it comes to the algorithm 576 00:33:48,680 --> 00:33:53,560 Speaker 1: and, like, what people see and what gets, like, more traction. Right. Yeah, exactly. 577 00:33:53,600 --> 00:33:56,720 Speaker 1: And so I, I have a little bit of a 578 00:33:56,760 --> 00:33:58,960 Speaker 1: different opinion than a lot of my colleagues in the 579 00:33:59,480 --> 00:34:02,280 Speaker 1: platform accountability and disinformation space. A lot of people would 580 00:34:02,280 --> 00:34:05,800 Speaker 1: say single-purpose hate accounts should be banned from the Internet. 581 00:34:05,840 --> 00:34:07,560 Speaker 1: Like, if you are someone who is running an account, 582 00:34:07,800 --> 00:34:11,440 Speaker 1: and that account only exists to hate on 583 00:34:11,640 --> 00:34:15,360 Speaker 1: one specific person, it should be banned. I can understand 584 00:34:15,400 --> 00:34:17,680 Speaker 1: that view, but I think that the most important thing 585 00:34:17,760 --> 00:34:21,440 Speaker 1: is that platforms should not be amplifying it. They shouldn't 586 00:34:21,440 --> 00:34:23,960 Speaker 1: be recommending those accounts to people. When you, when you 587 00:34:24,120 --> 00:34:27,120 Speaker 1: search Meghan Markle information, those accounts should not be the 588 00:34:27,200 --> 00:34:29,960 Speaker 1: ones that are prioritized and that, that users see first.
589 00:34:30,239 --> 00:34:33,480 Speaker 1: What users see first should be thoughtful, honest information 590 00:34:33,880 --> 00:34:36,840 Speaker 1: from sites that do not exist only to 591 00:34:36,880 --> 00:34:39,600 Speaker 1: spread hatred at this one person. And so it is 592 00:34:39,680 --> 00:34:43,839 Speaker 1: interesting that, on a platform like Twitter, these 593 00:34:44,000 --> 00:34:46,279 Speaker 1: eighty-three accounts that are driving most of the 594 00:34:46,520 --> 00:34:50,360 Speaker 1: negative conversation around Meghan Markle kind of blatantly 595 00:34:50,640 --> 00:34:53,719 Speaker 1: violate Twitter's rules. One of Twitter's rules is that 596 00:34:53,880 --> 00:34:58,080 Speaker 1: accounts cannot coordinate to dogpile on people, to harass them 597 00:34:58,200 --> 00:35:01,239 Speaker 1: or to spread negativity around them. These accounts do 598 00:35:01,640 --> 00:35:04,360 Speaker 1: just that, right, and they do so in 599 00:35:05,200 --> 00:35:10,200 Speaker 1: pretty sophisticated coordination with each other. And yeah, I think 600 00:35:10,239 --> 00:35:12,719 Speaker 1: it's one of those things where platforms really need to 601 00:35:14,000 --> 00:35:17,719 Speaker 1: understand what's at stake when this is allowed, when 602 00:35:18,920 --> 00:35:21,759 Speaker 1: a small handful of accounts are able to bypass the 603 00:35:21,880 --> 00:35:24,439 Speaker 1: rules that you have set for your platform in order 604 00:35:24,560 --> 00:35:31,600 Speaker 1: to artificially control the discourse about one subject. That's not great. No, no, 605 00:35:31,760 --> 00:35:34,320 Speaker 1: it's not, and it's kind of frightening, to be honest. 606 00:35:35,200 --> 00:35:38,160 Speaker 1: And you had a quote from Bouzy, who talked about this, 607 00:35:38,280 --> 00:35:43,200 Speaker 1: making the clarification that we can't just blame bots. 608 00:35:44,600 --> 00:35:47,840 Speaker 1: I find this super interesting because, you know, we often, 609 00:35:48,080 --> 00:35:50,879 Speaker 1: when it comes to online discourse, talk about, well, 610 00:35:51,920 --> 00:35:53,759 Speaker 1: is it bots, is it people? And I feel like 611 00:35:53,880 --> 00:35:58,279 Speaker 1: we over-focus there; talking about bots is serious and 612 00:35:58,360 --> 00:36:00,600 Speaker 1: we should be doing that. But when we focus on 613 00:36:00,880 --> 00:36:04,080 Speaker 1: that and don't also bring into the conversation that sometimes 614 00:36:04,120 --> 00:36:06,759 Speaker 1: it is real people, I feel like the conversation can 615 00:36:06,840 --> 00:36:10,680 Speaker 1: be not as useful, because it obscures the fact that, well, 616 00:36:10,920 --> 00:36:12,919 Speaker 1: it isn't all bots. Some of it is real people. 617 00:36:13,000 --> 00:36:16,279 Speaker 1: And so in an interview with BuzzFeed, Bouzy said: 618 00:36:16,800 --> 00:36:19,480 Speaker 1: this campaign comes from people who know how to manipulate 619 00:36:19,520 --> 00:36:22,800 Speaker 1: the algorithms, manipulate Twitter, stay under the wire to avoid 620 00:36:22,840 --> 00:36:26,359 Speaker 1: detection and suspension. This level of complexity comes from people 621 00:36:26,440 --> 00:36:28,120 Speaker 1: who know how to do this stuff and who are 622 00:36:28,280 --> 00:36:31,160 Speaker 1: paid to do this stuff.
And so, yeah, especially, 623 00:36:31,200 --> 00:36:33,719 Speaker 1: what do you think about the fact that people can 624 00:36:34,920 --> 00:36:37,600 Speaker 1: profit from this? I think it really should behoove 625 00:36:37,680 --> 00:36:40,360 Speaker 1: these platforms to make a change. I mean, that's the 626 00:36:40,400 --> 00:36:43,080 Speaker 1: big conversation, is that people are making money and can 627 00:36:43,120 --> 00:36:45,000 Speaker 1: make a living off of this. And why is this 628 00:36:45,520 --> 00:36:48,479 Speaker 1: something that is profitable? So it's not just that they're making money, 629 00:36:48,520 --> 00:36:53,200 Speaker 1: they're profiting, literally, off of hate. Because over 630 00:36:53,360 --> 00:36:56,480 Speaker 1: here I have at least four times that amount of 631 00:36:56,560 --> 00:36:59,040 Speaker 1: bots following me, and I can only get three 632 00:36:59,080 --> 00:37:01,120 Speaker 1: people to look at my timeline. Tell me how. 633 00:37:01,160 --> 00:37:10,719 Speaker 1: Outrageous. But I find it interesting that we have 634 00:37:11,040 --> 00:37:14,879 Speaker 1: had this continuous talk, and we have these prime examples 635 00:37:15,320 --> 00:37:18,479 Speaker 1: of what is happening on Twitter specifically. Like, we talked 636 00:37:18,560 --> 00:37:22,600 Speaker 1: recently about one of the Canadian women who is 637 00:37:22,640 --> 00:37:26,120 Speaker 1: currently working with Bumble on making sure that there's safety 638 00:37:26,239 --> 00:37:28,160 Speaker 1: for the women and those who are on 639 00:37:28,239 --> 00:37:30,520 Speaker 1: their app, and one of her big things was 640 00:37:30,600 --> 00:37:33,280 Speaker 1: that she was bringing out the reports showing how Twitter 641 00:37:33,520 --> 00:37:38,279 Speaker 1: does not help or defend women or make Twitter safe for 642 00:37:38,480 --> 00:37:41,600 Speaker 1: women and marginalized people. And just recently 643 00:37:41,680 --> 00:37:45,680 Speaker 1: big research data came out showing that women 644 00:37:45,719 --> 00:37:48,719 Speaker 1: are still being heavily targeted and are not being helped 645 00:37:48,760 --> 00:37:51,919 Speaker 1: at all by platforms in general, and that the fact 646 00:37:51,960 --> 00:37:55,080 Speaker 1: of the matter is people are once again profiting. People have 647 00:37:55,280 --> 00:37:58,920 Speaker 1: become aware of, oh, so we can make 648 00:37:59,040 --> 00:38:03,400 Speaker 1: money essentially off of women and women of color, and 649 00:38:03,640 --> 00:38:06,000 Speaker 1: doing so in a way that not only can we 650 00:38:06,080 --> 00:38:08,120 Speaker 1: get away with it, not only can we make money, 651 00:38:08,360 --> 00:38:12,160 Speaker 1: but it's making a difference in people's reaction to this. Yeah, 652 00:38:12,280 --> 00:38:15,279 Speaker 1: I mean, you're exactly right. 653 00:38:15,440 --> 00:38:19,560 Speaker 1: The research completely jibes with what you're saying. 654 00:38:19,640 --> 00:38:22,680 Speaker 1: You are exactly right. And as bad as that 655 00:38:22,840 --> 00:38:26,879 Speaker 1: is, right, I would argue 656 00:38:26,920 --> 00:38:30,560 Speaker 1: that this kind of negative engagement and harassment and 657 00:38:30,600 --> 00:38:33,359 Speaker 1: abuse of women of color on these platforms,
I would 658 00:38:33,440 --> 00:38:37,200 Speaker 1: argue that is built into these business models like twitter, 659 00:38:38,080 --> 00:38:41,320 Speaker 1: when when somebody tweets something that is like crapping on 660 00:38:41,440 --> 00:38:44,360 Speaker 1: Megan Markele and it's getting lots of engagement, it is 661 00:38:44,440 --> 00:38:48,440 Speaker 1: in twitter's best financial interest to boost that and amplify 662 00:38:48,520 --> 00:38:50,200 Speaker 1: that because it's like, Oh, this is clicking, people are 663 00:38:50,280 --> 00:38:53,000 Speaker 1: people are paying attention to this. That's a problem and 664 00:38:53,080 --> 00:38:56,080 Speaker 1: I also think that, you know, we're talking about Megan Markle, 665 00:38:56,320 --> 00:38:58,120 Speaker 1: who is a public figure, but she's not a political 666 00:38:58,160 --> 00:39:01,840 Speaker 1: figure necessarily. We're talking about star wars and like fandom. 667 00:39:02,320 --> 00:39:04,880 Speaker 1: Think about how that, like, as bad as that is, 668 00:39:05,160 --> 00:39:09,479 Speaker 1: apply that to democracy, apply that to women and women 669 00:39:09,480 --> 00:39:12,080 Speaker 1: of color who are trying to run for office, apply 670 00:39:12,280 --> 00:39:15,080 Speaker 1: that to women and women of color and other marginalized 671 00:39:15,080 --> 00:39:18,880 Speaker 1: people who are trying to engage in our democracy and 672 00:39:19,040 --> 00:39:21,239 Speaker 1: be part of civic and public life right and so 673 00:39:21,600 --> 00:39:25,279 Speaker 1: the implications really become clear. If it's this bad for 674 00:39:25,560 --> 00:39:30,040 Speaker 1: conversations that are, you know, about pop culture and like celebrities, 675 00:39:30,360 --> 00:39:34,200 Speaker 1: imagine how how big the stakes are when that same 676 00:39:34,320 --> 00:39:36,440 Speaker 1: dynamic is applied to people who are just trying to 677 00:39:36,520 --> 00:39:38,360 Speaker 1: run for office or just trying to make their voices 678 00:39:38,440 --> 00:39:41,239 Speaker 1: heard and participate fully in our democracy. Right. And the 679 00:39:41,400 --> 00:39:44,160 Speaker 1: one report we were talking about, they were specifically focusing 680 00:39:44,200 --> 00:39:46,200 Speaker 1: on journalists and talking about this is a part of 681 00:39:46,239 --> 00:39:48,120 Speaker 1: their job description, and I think about that all the time. 682 00:39:48,120 --> 00:39:50,160 Speaker 1: I'm like, Oh my God, because technically is a part 683 00:39:50,160 --> 00:39:52,080 Speaker 1: of our job description and I hate it. I hate 684 00:39:52,120 --> 00:39:53,680 Speaker 1: it so much because I don't want to become a 685 00:39:53,719 --> 00:39:56,040 Speaker 1: focal point at any point. Nothing that we are, thank God, 686 00:39:56,040 --> 00:39:57,719 Speaker 1: because we go under the radar as well, like I 687 00:39:57,840 --> 00:40:00,759 Speaker 1: said to people like my tweets, it's fine, um, but 688 00:40:00,880 --> 00:40:03,480 Speaker 1: we go under the radar enough that, you know, we 689 00:40:03,560 --> 00:40:05,600 Speaker 1: don't have to deal with that before someone who has 690 00:40:05,880 --> 00:40:07,960 Speaker 1: is being told if you don't get your views up, 691 00:40:08,440 --> 00:40:09,920 Speaker 1: you're not going to get paid or you're not going 692 00:40:10,000 --> 00:40:14,520 Speaker 1: to stay in this job. I couldn't imagine. 
So I 693 00:40:14,640 --> 00:40:19,040 Speaker 1: actually literally just left a summit all about this as 694 00:40:19,040 --> 00:40:21,920 Speaker 1: it pertains to journalism. And when I was a kid, 695 00:40:22,160 --> 00:40:24,400 Speaker 1: I wanted to be a journalist, right? Like, I was 696 00:40:24,480 --> 00:40:28,360 Speaker 1: obsessed with April O'Neil from Teenage Mutant Ninja Turtles. 697 00:40:28,600 --> 00:40:30,640 Speaker 1: I wanted to be a journalist; that was 698 00:40:30,719 --> 00:40:33,680 Speaker 1: the career that I wanted to go into. And today, 699 00:40:34,840 --> 00:40:39,759 Speaker 1: the fact that young women, women who are younger, go 700 00:40:39,960 --> 00:40:43,880 Speaker 1: into journalism willingly surprises me, because I 701 00:40:43,960 --> 00:40:46,960 Speaker 1: think a lot of women are smart enough 702 00:40:46,960 --> 00:40:50,200 Speaker 1: to realize: if the cost of doing 703 00:40:50,280 --> 00:40:55,880 Speaker 1: this job is dealing with online abuse, online harassment, online violence, 704 00:40:56,280 --> 00:40:59,680 Speaker 1: and my male counterparts are not dealing with it the 705 00:40:59,680 --> 00:41:01,319 Speaker 1: same way that I am, then it's 706 00:41:01,360 --> 00:41:05,280 Speaker 1: just an extra woman tax or, you know, marginalization 707 00:41:05,360 --> 00:41:09,960 Speaker 1: tax in the workplace. And institutions largely do not know 708 00:41:10,239 --> 00:41:12,759 Speaker 1: how to support women who are going through this and 709 00:41:12,840 --> 00:41:15,400 Speaker 1: dealing with this, and so it's just their problem. 710 00:41:15,920 --> 00:41:18,320 Speaker 1: There's an assumption that the person who is the focal 711 00:41:18,360 --> 00:41:21,360 Speaker 1: point of harassment and abuse has done something to 712 00:41:21,440 --> 00:41:24,120 Speaker 1: warrant it, and that is completely not how it works. 713 00:41:24,200 --> 00:41:27,400 Speaker 1: Oftentimes the thing that they're doing is existing 714 00:41:27,480 --> 00:41:31,000 Speaker 1: as a woman. And so think about what we 715 00:41:31,080 --> 00:41:35,960 Speaker 1: are asking women to deal with. It's just completely unfair, 716 00:41:36,440 --> 00:41:38,000 Speaker 1: and we're asking them to deal with it and not 717 00:41:38,239 --> 00:41:41,040 Speaker 1: even talk about the issue, right? So it 718 00:41:41,160 --> 00:41:43,880 Speaker 1: sets up a completely unfair dynamic. When I 719 00:41:43,920 --> 00:41:47,640 Speaker 1: think about how many people have either just given up 720 00:41:47,680 --> 00:41:49,480 Speaker 1: on this as a career, or they're just like, I'm not 721 00:41:49,560 --> 00:41:51,320 Speaker 1: going to get on social media, it's not worth it. 722 00:41:51,320 --> 00:41:55,320 Speaker 1: It has the ability to suppress women from being involved 723 00:41:55,360 --> 00:41:57,680 Speaker 1: in public and civic life. And the worst part 724 00:41:57,800 --> 00:41:59,880 Speaker 1: is, like we were just now talking about, we're 725 00:42:00,120 --> 00:42:02,959 Speaker 1: so late in bringing it up, right? And in fact, 726 00:42:03,760 --> 00:42:06,080 Speaker 1: these reports are specific to Twitter, but it's been happening 727 00:42:06,120 --> 00:42:09,640 Speaker 1: on bigger platforms and other social media too, right? Oh, 728 00:42:09,719 --> 00:42:11,600 Speaker 1: absolutely it is.
I don't want to give the impression 729 00:42:11,640 --> 00:42:13,759 Speaker 1: that it is just Twitter, because that is not the case. 730 00:42:14,239 --> 00:42:18,480 Speaker 1: Bot Sentinel also analyzed YouTube accounts, and they found that, 731 00:42:18,600 --> 00:42:23,160 Speaker 1: because of YouTube monetization, trashing Meghan Markle is actually big, 732 00:42:23,480 --> 00:42:26,919 Speaker 1: lucrative business. They found that twenty-five YouTube channels earned 733 00:42:26,960 --> 00:42:31,120 Speaker 1: around three point five million dollars from ad revenue, and that 734 00:42:31,360 --> 00:42:34,320 Speaker 1: three of the most successful anti-Meghan Markle accounts generated 735 00:42:34,600 --> 00:42:38,239 Speaker 1: almost five hundred thousand dollars during their existence. And these 736 00:42:38,360 --> 00:42:41,400 Speaker 1: videos, it's not like they're filming, you know... 737 00:42:44,760 --> 00:42:49,000 Speaker 1: I know, but yes, exactly. They're low-quality 738 00:42:49,280 --> 00:42:53,840 Speaker 1: videos that basically just traffic in conspiracy theories and outright 739 00:42:53,920 --> 00:42:57,840 Speaker 1: lies and racist stereotypes and tropes, right? And so 740 00:42:58,000 --> 00:43:01,279 Speaker 1: some of the claims that they'll make are like, oh, 741 00:43:01,480 --> 00:43:04,480 Speaker 1: Meghan Markle, she faked her pregnancy, she used a surrogate. 742 00:43:04,600 --> 00:43:08,719 Speaker 1: It's things that are completely baseless lies. And 743 00:43:08,920 --> 00:43:11,239 Speaker 1: I think that if you are able to make almost half 744 00:43:11,280 --> 00:43:16,520 Speaker 1: a million dollars trafficking in baseless lies and conjecture 745 00:43:16,600 --> 00:43:19,359 Speaker 1: and racism and sexism, you shouldn't be able to. If 746 00:43:19,400 --> 00:43:21,120 Speaker 1: you're able to make that kind of money from that, 747 00:43:21,400 --> 00:43:25,040 Speaker 1: something is wrong, like something is really broken. Yeah, yeah, 748 00:43:25,280 --> 00:43:30,600 Speaker 1: I mean, agreed. Because, not that this always works, 749 00:43:30,800 --> 00:43:37,279 Speaker 1: but there have been attempts to crack down on, like, medical misinformation, 750 00:43:37,560 --> 00:43:40,279 Speaker 1: like, you can't make money off of that. It's been 751 00:43:40,680 --> 00:43:43,239 Speaker 1: hit or miss, but there have been attempts, and it 752 00:43:43,360 --> 00:43:46,200 Speaker 1: seems like if something is just flatly untrue, you should 753 00:43:46,280 --> 00:43:50,040 Speaker 1: not be making millions of dollars from it by saying 754 00:43:50,120 --> 00:43:56,279 Speaker 1: that it is true. And then there's just the 755 00:43:56,400 --> 00:44:00,120 Speaker 1: spreading of that, and how harmful that is 756 00:44:00,640 --> 00:44:04,399 Speaker 1: on these platforms. And we've seen, like we said, 757 00:44:04,480 --> 00:44:08,279 Speaker 1: the trickle effect of people believing that this is 758 00:44:08,360 --> 00:44:12,000 Speaker 1: the discourse. And so women have it hard enough already, 759 00:44:12,120 --> 00:44:15,480 Speaker 1: like in public office, with, I just don't like her,
760 00:44:15,480 --> 00:44:17,440 Speaker 1: there's something about her. And to have just this 761 00:44:17,760 --> 00:44:23,160 Speaker 1: overwhelming discourse of, like, no, no one likes her, 762 00:44:23,280 --> 00:44:26,239 Speaker 1: it just seems so toxic. And that 763 00:44:26,280 --> 00:44:29,840 Speaker 1: people are making money from it is very frustrating, and 764 00:44:30,040 --> 00:44:34,399 Speaker 1: I definitely agree that means something is wrong here. Yeah, 765 00:44:34,520 --> 00:44:36,879 Speaker 1: and I want to be clear, I didn't really 766 00:44:36,920 --> 00:44:39,120 Speaker 1: have strong feelings about Meghan Markle. I like that she's 767 00:44:39,120 --> 00:44:42,480 Speaker 1: a Black princess, or was a Black princess. And yeah, 768 00:44:42,520 --> 00:44:45,279 Speaker 1: I know she wasn't really a princess, but whatever. But like, 769 00:44:45,840 --> 00:44:49,520 Speaker 1: if you don't like Meghan Markle, that's fine. It's 770 00:44:49,560 --> 00:44:51,279 Speaker 1: totally within your right to be like, I don't 771 00:44:51,320 --> 00:44:54,360 Speaker 1: like her, she rubs me the wrong way, whatever. We 772 00:44:54,440 --> 00:44:56,080 Speaker 1: all have people that we don't like. But that is 773 00:44:56,440 --> 00:45:01,440 Speaker 1: very different than, you know, I am going to spearhead 774 00:45:01,600 --> 00:45:06,720 Speaker 1: a coordinated campaign to make it seem like everybody doesn't 775 00:45:06,760 --> 00:45:09,240 Speaker 1: like her, and that campaign is going to be built 776 00:45:09,320 --> 00:45:13,120 Speaker 1: on racism, sexism and lies about who she is. And 777 00:45:13,440 --> 00:45:17,520 Speaker 1: I say that because when you do that, you're making 778 00:45:18,080 --> 00:45:21,279 Speaker 1: us all less safe. It truly does threaten our democracy. 779 00:45:21,640 --> 00:45:25,479 Speaker 1: When you're able to tell those kinds of inflammatory lies 780 00:45:25,520 --> 00:45:27,799 Speaker 1: and have them take up so much space in the room, 781 00:45:28,160 --> 00:45:31,120 Speaker 1: it makes all of us that much less safe, particularly 782 00:45:31,200 --> 00:45:32,839 Speaker 1: women and women of color. It makes it so much 783 00:45:32,920 --> 00:45:35,480 Speaker 1: harder for us to thrive in environments where we can 784 00:45:35,560 --> 00:45:38,600 Speaker 1: be judged on our character, on things that we actually do 785 00:45:38,760 --> 00:45:42,319 Speaker 1: and actually say, not on made-up conspiracy theories about us 786 00:45:42,320 --> 00:45:45,800 Speaker 1: putting microphones in our panties. Right? Like, we need to 787 00:45:46,160 --> 00:45:52,359 Speaker 1: have a discourse where honest, thoughtful, accurate conversation about who 788 00:45:52,480 --> 00:45:56,279 Speaker 1: we are as marginalized people dominates, and crap like that 789 00:45:56,719 --> 00:46:02,040 Speaker 1: does not. Yeah, yeah. I remember very vividly this essay 790 00:46:02,080 --> 00:46:04,080 Speaker 1: I read a while back.
That was like, you know, 791 00:46:04,080 --> 00:46:06,040 Speaker 1: if a woman stands up in the town square and 792 00:46:06,080 --> 00:46:09,040 Speaker 1: is saying something is wrong here, and a thousand 793 00:46:09,120 --> 00:46:11,839 Speaker 1: men threaten her with violence and are screaming over her, 794 00:46:11,960 --> 00:46:16,320 Speaker 1: whose freedom of speech is being threatened here? Because 795 00:46:16,360 --> 00:46:18,520 Speaker 1: it feels like that. Because, like you said, so many 796 00:46:18,560 --> 00:46:22,960 Speaker 1: people leave, so many marginalized people and women leave, because 797 00:46:23,760 --> 00:46:27,560 Speaker 1: they're facing this, what feels like 798 00:46:27,719 --> 00:46:33,040 Speaker 1: shouted, hateful vitriol that makes you feel unsafe in this 799 00:46:33,120 --> 00:46:34,800 Speaker 1: space, and it makes it impossible for us to have 800 00:46:34,920 --> 00:46:39,759 Speaker 1: these healthy conversations that are necessary and needed for a democracy. Absolutely. 801 00:46:39,960 --> 00:46:42,040 Speaker 1: I would also argue that it's by design. 802 00:46:42,360 --> 00:46:44,400 Speaker 1: It's meant to gum up the works. It's meant to 803 00:46:44,480 --> 00:46:47,160 Speaker 1: make people who are interested in actual discussion and dialogue 804 00:46:47,160 --> 00:46:49,160 Speaker 1: check out and be like, no thanks, I don't want 805 00:46:49,200 --> 00:46:52,000 Speaker 1: to risk it by putting my thoughts out there. 806 00:46:52,320 --> 00:46:55,240 Speaker 1: And it's meant to make sure that we can't find unity, 807 00:46:55,360 --> 00:46:58,080 Speaker 1: that we can't come together, that we can't make progress 808 00:46:58,160 --> 00:46:59,800 Speaker 1: on all the issues that are impacting us and have 809 00:47:00,000 --> 00:47:02,719 Speaker 1: the conversations that we need to have that might 810 00:47:02,800 --> 00:47:07,360 Speaker 1: move us forward on those issues. Right. Well, Bridget, 811 00:47:07,400 --> 00:47:10,160 Speaker 1: have there been any changes? Is there any hope? 812 00:47:11,080 --> 00:47:13,160 Speaker 1: There is a little bit of hope. So earlier 813 00:47:13,320 --> 00:47:16,360 Speaker 1: I mentioned how on YouTube people are able to 814 00:47:16,400 --> 00:47:20,880 Speaker 1: make big money basically just running single-purpose hate accounts 815 00:47:20,880 --> 00:47:24,400 Speaker 1: against Meghan Markle. Not anymore. Earlier this year YouTube made 816 00:47:24,440 --> 00:47:27,400 Speaker 1: a big change: they deranked anti-Meghan Markle results 817 00:47:27,440 --> 00:47:30,200 Speaker 1: from their search. In a really great BuzzFeed piece by 818 00:47:30,239 --> 00:47:32,400 Speaker 1: Ellie Hall, and shout out to Ellie Hall, she has 819 00:47:32,400 --> 00:47:35,640 Speaker 1: done some fantastic reporting on Meghan Markle and race and 820 00:47:35,719 --> 00:47:39,600 Speaker 1: culture and what it all means, Ellie Hall writes that 821 00:47:40,080 --> 00:47:43,000 Speaker 1: now you'll only find videos from verified accounts and 822 00:47:43,120 --> 00:47:46,040 Speaker 1: news outlets in the YouTube search results for Meghan Markle 823 00:47:46,320 --> 00:47:49,719 Speaker 1: and the first recommendations in the sidebar.
Even if you explicitly 824 00:47:49,760 --> 00:47:53,719 Speaker 1: searched for and started watching videos that accused Meghan Markle 825 00:47:53,760 --> 00:47:56,080 Speaker 1: of being a narcissist, or videos claiming that she wore 826 00:47:56,160 --> 00:47:59,680 Speaker 1: a fake belly to make herself look pregnant, YouTube's recommendation sidebar 827 00:48:00,040 --> 00:48:03,880 Speaker 1: won't initially serve you similar videos. And so I 828 00:48:04,000 --> 00:48:07,040 Speaker 1: do think that that is a step in the right direction. Like, 829 00:48:07,080 --> 00:48:09,120 Speaker 1: I gotta give it to YouTube, that was a 830 00:48:09,239 --> 00:48:13,840 Speaker 1: good call. Because, yeah, when you search any topic, 831 00:48:14,080 --> 00:48:17,239 Speaker 1: but let's use Meghan Markle as an example, it should not 832 00:48:17,360 --> 00:48:20,160 Speaker 1: be in YouTube's best interest for the 833 00:48:20,360 --> 00:48:22,680 Speaker 1: first search results that they're promoting, or for 834 00:48:22,760 --> 00:48:25,640 Speaker 1: the recommended videos that they're surfacing to you, to be 835 00:48:25,840 --> 00:48:29,759 Speaker 1: videos from people whose whole thing is hating on that 836 00:48:29,960 --> 00:48:34,080 Speaker 1: one thing. Right, it should be honest, accurate content. I 837 00:48:34,160 --> 00:48:36,840 Speaker 1: would love to throw thoughtful in there too, but I 838 00:48:36,920 --> 00:48:39,640 Speaker 1: think if you can just get accurate, that would be great. 839 00:48:39,680 --> 00:48:43,680 Speaker 1: That would be a great start, to just 840 00:48:43,840 --> 00:48:46,440 Speaker 1: do this, because it's gotten so off the rails, 841 00:48:46,520 --> 00:48:48,759 Speaker 1: and it's just like, just be true. At least if 842 00:48:48,760 --> 00:48:50,960 Speaker 1: you're going to say Meghan Markle is married to 843 00:48:51,360 --> 00:48:54,359 Speaker 1: Prince Harry, just end with that. I'll watch that over... 844 00:49:10,239 --> 00:49:13,120 Speaker 1: This is the biggest conversation. Like, why is it significant 845 00:49:13,320 --> 00:49:15,759 Speaker 1: when people talk about how Meghan Markle knew what she was 846 00:49:15,800 --> 00:49:18,000 Speaker 1: coming into? Again, it's just one of those places of, like, 847 00:49:18,160 --> 00:49:21,000 Speaker 1: but she just got married, and that's the end. She 848 00:49:21,120 --> 00:49:24,200 Speaker 1: didn't want to be a princess. So many accusations 849 00:49:24,480 --> 00:49:26,239 Speaker 1: came at her that they were like, you know what, we're 850 00:49:26,320 --> 00:49:28,720 Speaker 1: leaving this royal family stuff. We're done. We're done, because 851 00:49:28,719 --> 00:49:31,719 Speaker 1: we're getting attacked from every circle. Whether it's she's 852 00:49:32,400 --> 00:49:34,960 Speaker 1: a gold digger, slash, I guess, a royalty digger, is that 853 00:49:35,080 --> 00:49:37,399 Speaker 1: a name for something? 854 00:49:37,800 --> 00:49:41,000 Speaker 1: Or versus she wanted to disrupt the family, all 855 00:49:41,040 --> 00:49:43,879 Speaker 1: of these things, when what just happened was: they 856 00:49:43,960 --> 00:49:47,000 Speaker 1: got married. He is famous; she wasn't that famous. She 857 00:49:47,160 --> 00:49:49,319 Speaker 1: was an actress. She was good at her job. 858 00:49:49,440 --> 00:49:52,640 Speaker 1: She could have kept going.
She met him, got married, 859 00:49:52,920 --> 00:49:55,040 Speaker 1: and that's it. She's not trying to be political, she's 860 00:49:55,080 --> 00:49:57,320 Speaker 1: not trying to be royalty. She just wants to be 861 00:49:57,760 --> 00:49:59,640 Speaker 1: a part of his family. She has a really good 862 00:49:59,680 --> 00:50:02,960 Speaker 1: relationship with her mother, a bad relationship with her father, 863 00:50:03,120 --> 00:50:05,640 Speaker 1: just being that. And because of that she's getting all this 864 00:50:05,800 --> 00:50:09,440 Speaker 1: vitriol, being compared to the other royals. We 865 00:50:09,600 --> 00:50:11,600 Speaker 1: know what this is, we know what this looks like, 866 00:50:12,000 --> 00:50:14,520 Speaker 1: and for her, who was not even on social media, 867 00:50:14,800 --> 00:50:17,879 Speaker 1: who continues to be attacked, it's just one of those 868 00:50:17,920 --> 00:50:20,600 Speaker 1: parts of, like, she doesn't deserve this. There's nothing that 869 00:50:20,680 --> 00:50:23,759 Speaker 1: she did that deserves this. There's no conversation where we 870 00:50:24,000 --> 00:50:28,400 Speaker 1: think that this is earned, or that she put herself 871 00:50:28,440 --> 00:50:31,440 Speaker 1: out there. She's not a politician taking a stance on anything. 872 00:50:31,719 --> 00:50:38,160 Speaker 1: She's just existing. That's it. Yeah, something that 873 00:50:38,239 --> 00:50:40,239 Speaker 1: you said, I hadn't even really thought about this, but 874 00:50:40,760 --> 00:50:44,480 Speaker 1: I think that people's reactions to Meghan Markle really demonstrate 875 00:50:44,560 --> 00:50:48,680 Speaker 1: how women of color especially are really not allowed to 876 00:50:48,880 --> 00:50:54,840 Speaker 1: publicly be multifaceted, complex humans. It's like, oh, 877 00:50:54,960 --> 00:50:58,759 Speaker 1: she has a, you know, complicated relationship with her father. 878 00:50:59,000 --> 00:51:03,040 Speaker 1: Who doesn't? They take these incredibly human things, where 879 00:51:03,080 --> 00:51:06,160 Speaker 1: it's like, yeah, welcome to being a human in relationships 880 00:51:06,239 --> 00:51:09,640 Speaker 1: with other humans, sometimes these things happen. They 881 00:51:09,840 --> 00:51:13,960 Speaker 1: find these incredibly human things that we all experience, and 882 00:51:14,080 --> 00:51:16,719 Speaker 1: they use it, or they frame it, as 883 00:51:16,840 --> 00:51:18,960 Speaker 1: a negative against her, in ways that I just 884 00:51:19,040 --> 00:51:22,360 Speaker 1: think are completely transparent. It's so obvious what's happening. 885 00:51:22,680 --> 00:51:24,600 Speaker 1: And the way that they talk about her, I 886 00:51:24,760 --> 00:51:28,760 Speaker 1: wanna say this: they're quite good at talking 887 00:51:28,840 --> 00:51:32,080 Speaker 1: about her with this plausible deniability of, like, oh, I 888 00:51:32,160 --> 00:51:34,359 Speaker 1: didn't mean that racially when I called 889 00:51:34,360 --> 00:51:37,560 Speaker 1: her Compton Kate, even though she's not from Compton. I 890 00:51:37,640 --> 00:51:40,839 Speaker 1: didn't mean anything racial, truly, I didn't 891 00:51:40,840 --> 00:51:43,759 Speaker 1: say anything racially motivated there. I was just saying she's 892 00:51:43,760 --> 00:51:46,960 Speaker 1: from California, and Compton is also a city in California.
893 00:51:47,840 --> 00:51:56,759 Speaker 1: You know. Yeah, yeah. I mean, this again, I 894 00:51:56,800 --> 00:51:58,800 Speaker 1: feel like we need a new segment, Bridget, where you 895 00:51:58,880 --> 00:52:01,800 Speaker 1: come on and we do this, because I feel like 896 00:52:02,520 --> 00:52:06,560 Speaker 1: there's so much of this. It is like questioning women's 897 00:52:07,200 --> 00:52:10,279 Speaker 1: ambition, and it still seems to be in this very 898 00:52:10,440 --> 00:52:14,200 Speaker 1: old system of royalty, like, oh, she's just trying 899 00:52:14,280 --> 00:52:17,200 Speaker 1: to marry into the royal family, we can't trust anything 900 00:52:17,440 --> 00:52:21,160 Speaker 1: she does, and she's trying to ruin this tradition by 901 00:52:21,239 --> 00:52:27,200 Speaker 1: being her. And just constantly questioning and calling out 902 00:52:28,440 --> 00:52:35,560 Speaker 1: what women's intentions are. And it's funny, because I find 903 00:52:35,600 --> 00:52:40,320 Speaker 1: myself, I mean, these systems are so ingrained that 904 00:52:40,520 --> 00:52:43,160 Speaker 1: I'm not immune to it. Like, when I read 905 00:52:43,280 --> 00:52:48,319 Speaker 1: articles about ambitious women, I have to really check 906 00:52:48,400 --> 00:52:50,520 Speaker 1: myself internally and be like, well, are you really just 907 00:52:50,680 --> 00:52:53,479 Speaker 1: adding this sexist trope onto this woman, 908 00:52:53,960 --> 00:52:58,920 Speaker 1: one that assumes that she must be, you know, misleading this 909 00:52:59,120 --> 00:53:02,400 Speaker 1: simple, childlike man who doesn't know any better? And 910 00:53:02,520 --> 00:53:04,239 Speaker 1: oftentimes the answer is yes, and I have to 911 00:53:04,239 --> 00:53:06,440 Speaker 1: step back and be like, well, let's unpack why your 912 00:53:06,480 --> 00:53:09,520 Speaker 1: assumption was that this woman, who is ambitious and powerful, 913 00:53:10,120 --> 00:53:14,040 Speaker 1: must be, you know, calling the shots and misleading this 914 00:53:14,239 --> 00:53:16,120 Speaker 1: dullard of a man. Isn't that kind of 915 00:53:16,160 --> 00:53:19,080 Speaker 1: insulting to both of them? Yeah. It just goes to 916 00:53:19,120 --> 00:53:22,880 Speaker 1: show how ingrained these systems are and how 917 00:53:23,360 --> 00:53:26,200 Speaker 1: insidious they are, and the work that each individual needs 918 00:53:26,239 --> 00:53:30,800 Speaker 1: to do to unpack them, myself very much included. Right, yeah, me, 919 00:53:31,120 --> 00:53:33,360 Speaker 1: all of us. I think we all have these things 920 00:53:34,400 --> 00:53:37,480 Speaker 1: that we just didn't realize we had internalized so much. 921 00:53:37,760 --> 00:53:42,239 Speaker 1: And something like this, you know, the royal family, Meghan 922 00:53:42,280 --> 00:53:47,400 Speaker 1: Markle, can seem kind of frivolous. It's not, but it 923 00:53:47,440 --> 00:53:50,200 Speaker 1: can kind of feel that way, because celebrity culture kind 924 00:53:50,200 --> 00:53:53,320 Speaker 1: of incites that a lot. But, as we've been alluding 925 00:53:53,360 --> 00:53:57,400 Speaker 1: to a lot throughout this, it does matter, right? Oh, 926 00:53:57,520 --> 00:53:59,960 Speaker 1: it matters hugely.
Like, I would say that the reactions 927 00:54:00,000 --> 00:54:02,400 Speaker 1: to Meghan Markle really show how easy it is for 928 00:54:02,520 --> 00:54:05,719 Speaker 1: a relatively small number of people to create an effective 929 00:54:05,800 --> 00:54:08,799 Speaker 1: and negative echo chamber that can be fueled by things 930 00:54:08,880 --> 00:54:12,040 Speaker 1: like racism and sexism. You know, again, if eighty-three 931 00:54:12,360 --> 00:54:15,520 Speaker 1: accounts are able to generate the majority of negative chatter 932 00:54:15,600 --> 00:54:19,960 Speaker 1: about one person and create the impression, inorganically, that that 933 00:54:20,200 --> 00:54:23,080 Speaker 1: is the overall sentiment about that person, it kind of 934 00:54:23,200 --> 00:54:25,640 Speaker 1: means that our digital landscape and our platforms might not 935 00:54:25,840 --> 00:54:29,239 Speaker 1: be so healthy. Yes, yes. And I don't think 936 00:54:29,280 --> 00:54:32,160 Speaker 1: we went over this, but the people 937 00:54:32,960 --> 00:54:35,080 Speaker 1: who are doing this, who are running these accounts, know 938 00:54:35,239 --> 00:54:37,080 Speaker 1: what they're doing. They know how to manipulate the system, 939 00:54:37,120 --> 00:54:39,200 Speaker 1: we talked about that, and they have very specific ways of doing it. 940 00:54:39,760 --> 00:54:42,160 Speaker 1: So we are working in these 941 00:54:42,200 --> 00:54:46,120 Speaker 1: systems where these eighty-three people can 942 00:54:46,200 --> 00:54:49,560 Speaker 1: just manipulate the rules and kind of skate being banned. 943 00:54:49,680 --> 00:54:54,040 Speaker 1: And some people, who are maybe casual Twitter 944 00:54:54,200 --> 00:54:57,560 Speaker 1: users, assume, like, oh, Twitter has these rules, 945 00:54:57,600 --> 00:55:02,320 Speaker 1: they'll kick people off. But that's not working either. No, 946 00:55:02,520 --> 00:55:04,480 Speaker 1: it's not working, right. And so I would say the 947 00:55:04,560 --> 00:55:09,120 Speaker 1: fact that platforms still largely allow this is a big problem, right, 948 00:55:09,120 --> 00:55:11,279 Speaker 1: and the fact that it's profitable, that you can make 949 00:55:11,320 --> 00:55:14,200 Speaker 1: money from it, means it's incentivized. And so again, 950 00:55:14,320 --> 00:55:18,480 Speaker 1: these eighty-three people are bad actors, 951 00:55:18,719 --> 00:55:22,560 Speaker 1: but also, more institutionally, platforms need to do something to 952 00:55:22,680 --> 00:55:26,120 Speaker 1: make sure that a relatively small handful of people can't hijack 953 00:55:26,160 --> 00:55:28,840 Speaker 1: an entire conversation about a subject. And again, I know 954 00:55:28,960 --> 00:55:31,040 Speaker 1: that a lot of people that I respect would say 955 00:55:31,080 --> 00:55:34,600 Speaker 1: that the answer is to ban single-issue hate accounts 956 00:55:34,680 --> 00:55:37,160 Speaker 1: or single-person hate accounts, that we should be banning those. 957 00:55:37,239 --> 00:55:40,200 Speaker 1: But again, I think that there's always going 958 00:55:40,280 --> 00:55:43,200 Speaker 1: to be people whose thing is hating on other 959 00:55:43,320 --> 00:55:46,040 Speaker 1: public figures, and so I don't necessarily think that 960 00:55:46,160 --> 00:55:49,360 Speaker 1: those people should not be able to do that.
Like, 961 00:55:49,440 --> 00:55:51,160 Speaker 1: I don't love it and like I wouldn't recommend it 962 00:55:51,200 --> 00:55:52,839 Speaker 1: and I would never be friends with somebody who would 963 00:55:52,840 --> 00:55:54,839 Speaker 1: do that. And I would never do that, but there's 964 00:55:54,920 --> 00:55:57,120 Speaker 1: I feel that there's always going to be people who 965 00:55:57,200 --> 00:56:00,120 Speaker 1: are hell bent on hating on others and I think 966 00:56:00,120 --> 00:56:02,440 Speaker 1: that's just a reality of the world that we live in. 967 00:56:02,880 --> 00:56:06,920 Speaker 1: But even so, platforms don't have to make money off 968 00:56:06,960 --> 00:56:09,440 Speaker 1: of it, profit off of it, amplify it, normalize it, 969 00:56:09,840 --> 00:56:12,440 Speaker 1: you know, like they can take some accountability. And so 970 00:56:13,239 --> 00:56:15,239 Speaker 1: I don't know, I think that the real, the real, 971 00:56:15,400 --> 00:56:19,000 Speaker 1: for me at least, the real problem here, is what 972 00:56:19,120 --> 00:56:22,319 Speaker 1: happens when platforms just allow this and it becomes the norm, 973 00:56:22,440 --> 00:56:24,799 Speaker 1: because I really have seen the way that this same 974 00:56:24,920 --> 00:56:29,000 Speaker 1: dynamic can be applied to political leaders, women running for office, 975 00:56:29,040 --> 00:56:32,160 Speaker 1: women journalists, and I think the implications when you apply 976 00:56:32,239 --> 00:56:35,160 Speaker 1: it that way, really become clear. It's it's, it's they're 977 00:56:35,200 --> 00:56:38,040 Speaker 1: quite dire. And Yeah, people can make money off of it. 978 00:56:38,200 --> 00:56:41,800 Speaker 1: Like the fact that we're looking at Forbes magazine with 979 00:56:42,000 --> 00:56:45,799 Speaker 1: tiktok stars talking about how the number one stars making 980 00:56:45,800 --> 00:56:49,440 Speaker 1: almost twenty million dollars per you. It's kind of like what, wait, what? 981 00:56:50,000 --> 00:56:54,719 Speaker 1: And the realization is literally, Tiktok, the platform, helped elevate 982 00:56:55,160 --> 00:56:58,239 Speaker 1: these individuals and have made them a household market and 983 00:56:58,360 --> 00:57:03,560 Speaker 1: have made them profitable, have created a new genre of celebrities. 984 00:57:03,640 --> 00:57:08,000 Speaker 1: Is that a thing? Right? I mean, the fact is 985 00:57:08,360 --> 00:57:12,320 Speaker 1: they have a lot of power, whether we want to 986 00:57:12,360 --> 00:57:15,320 Speaker 1: admit it or not, and it it plays into, obviously, 987 00:57:15,360 --> 00:57:18,080 Speaker 1: as you were talking about star wars, it plays into movies, 988 00:57:18,240 --> 00:57:21,120 Speaker 1: is played into what's being created, it's plays into what's 989 00:57:21,120 --> 00:57:24,840 Speaker 1: being pulled. Is played into who ends up being on UH, 990 00:57:25,240 --> 00:57:29,240 Speaker 1: who ends up becoming leaders. It absolutely affects everything, and 991 00:57:29,360 --> 00:57:34,440 Speaker 1: when it's specifically targeted to silence marginalized individuals, like what 992 00:57:34,560 --> 00:57:37,320 Speaker 1: has happened with Megan markel that she got off of 993 00:57:37,360 --> 00:57:40,840 Speaker 1: social media. 
We've seen many teen celebrities, young girls, get 994 00:57:40,880 --> 00:57:43,440 Speaker 1: off social media because of the constant berating, the constant 995 00:57:43,480 --> 00:57:48,360 Speaker 1: trolling; it becomes a thing. We've seen individuals being 996 00:57:48,480 --> 00:57:51,880 Speaker 1: terrorized for making a comment about someone's favorite musicians. It's 997 00:57:51,960 --> 00:57:55,120 Speaker 1: a whole level of, what is happening? And the 998 00:57:55,200 --> 00:57:57,600 Speaker 1: fact that this is allowed to happen and there's no 999 00:57:57,720 --> 00:58:00,760 Speaker 1: one who seemingly controls it, even though it's on a 1000 00:58:00,840 --> 00:58:03,640 Speaker 1: platform that is a private company. Exactly. I mean, I 1001 00:58:03,640 --> 00:58:06,120 Speaker 1: couldn't have put it better myself. And when I think 1002 00:58:06,120 --> 00:58:09,440 Speaker 1: about the ways that, you know, the younger generation, the 1003 00:58:09,480 --> 00:58:13,120 Speaker 1: generation coming up behind me, they're largely, like, you know, 1004 00:58:13,440 --> 00:58:17,240 Speaker 1: the online generation, and so they're learning about politics 1005 00:58:17,320 --> 00:58:20,080 Speaker 1: and the world around them and civic engagement from the Internet. 1006 00:58:20,280 --> 00:58:23,520 Speaker 1: The ways that they exert that power and those voices 1007 00:58:23,720 --> 00:58:26,720 Speaker 1: are largely online. If our online systems are 1008 00:58:26,760 --> 00:58:29,040 Speaker 1: so toxic that people don't even want to be part 1009 00:58:29,080 --> 00:58:32,840 Speaker 1: of them, that is an entire generation of marginalized young 1010 00:58:32,880 --> 00:58:36,000 Speaker 1: people who, early on, have just checked out, and that 1011 00:58:36,240 --> 00:58:39,600 Speaker 1: is a huge problem. Yeah, yeah. And on top of that, 1012 00:58:39,880 --> 00:58:43,520 Speaker 1: I think of the kind of toxic messaging we get 1013 00:58:44,360 --> 00:58:48,640 Speaker 1: from a very young age as women, and so 1014 00:58:48,760 --> 00:58:51,000 Speaker 1: you internalize some of this stuff, and then if you 1015 00:58:51,120 --> 00:58:54,200 Speaker 1: are exposed to these social platforms that are just reinforcing 1016 00:58:54,240 --> 00:58:58,760 Speaker 1: all the toxic messaging, that's not good. That's very, 1017 00:58:58,960 --> 00:59:05,520 Speaker 1: very bad. Yes. Yeah, I guess I just strongly 1018 00:59:05,600 --> 00:59:08,840 Speaker 1: feel that we all deserve better. Like, we 1019 00:59:08,960 --> 00:59:12,480 Speaker 1: can have better. The reason why things are not better, 1020 00:59:12,560 --> 00:59:14,920 Speaker 1: I would argue, is because it's lining the pockets of 1021 00:59:15,520 --> 00:59:18,840 Speaker 1: mostly white, straight, cis men who build these systems and 1022 00:59:18,920 --> 00:59:22,640 Speaker 1: profit from them. We deserve better. I don't 1023 00:59:22,680 --> 00:59:24,240 Speaker 1: care if they all go bankrupt if it means that 1024 00:59:24,280 --> 00:59:26,760 Speaker 1: we can have something better for the rest of us. Yes, 1025 00:59:26,920 --> 00:59:32,920 Speaker 1: eat the rich. I like it, I love it. Well, 1026 00:59:32,960 --> 00:59:34,960 Speaker 1: thank you so much, as always, Bridget. It was a 1027 00:59:35,040 --> 00:59:38,640 Speaker 1: delight to have you.
Every time we have these, I'm like, well, 1028 00:59:38,640 --> 00:59:40,120 Speaker 1: we gotta talk about this, we gotta talk about this, 1029 00:59:40,120 --> 00:59:44,440 Speaker 1: we gotta talk about this. So, are there any resources 1030 00:59:44,480 --> 00:59:47,280 Speaker 1: you want to shout out, or where can the listeners 1031 00:59:47,320 --> 00:59:50,040 Speaker 1: find you? Well, I definitely recommend checking out Bot 1032 00:59:50,120 --> 00:59:53,640 Speaker 1: Sentinel's work. They put out research briefings that are so fascinating, 1033 00:59:53,800 --> 00:59:56,480 Speaker 1: and Christopher Bouzy has done such a good job of 1034 00:59:56,720 --> 00:59:59,400 Speaker 1: really helping me understand what's happening on the Internet. So 1035 00:59:59,520 --> 01:00:01,960 Speaker 1: definitely check them out. You can follow me on 1036 01:00:02,080 --> 01:00:04,880 Speaker 1: Twitter at Bridget Marie, or on Instagram at Bridget Marie 1037 01:00:04,960 --> 01:00:07,080 Speaker 1: in DC, and check out my podcast, There Are No 1038 01:00:07,160 --> 01:00:10,600 Speaker 1: Girls on the Internet. Yes, yes, yes, absolutely do that, listeners, 1039 01:00:10,600 --> 01:00:13,200 Speaker 1: if you have not already. Thank you again, Bridget, 1040 01:00:13,280 --> 01:00:16,680 Speaker 1: for being here. Cannot wait until the next time. Me too. 1041 01:00:16,840 --> 01:00:19,800 Speaker 1: Thanks for having me. Yes, absolutely, always a wonderful, 1042 01:00:20,000 --> 01:00:23,040 Speaker 1: wonderful pleasure. And listeners, if you would like to contact us, 1043 01:00:23,360 --> 01:00:25,280 Speaker 1: you can: our email is stuffmedia momstuff at 1044 01:00:25,280 --> 01:00:27,120 Speaker 1: iheartmedia dot com. You can find us on Twitter 1045 01:00:27,240 --> 01:00:29,760 Speaker 1: at mom stuff podcast, or on Instagram at stuff mom never told you. Thanks, 1046 01:00:29,800 --> 01:00:32,800 Speaker 1: as always, to our super producer, Christina. Thank you, Christina, 1047 01:00:33,000 --> 01:00:35,000 Speaker 1: and thanks to you for listening. Stuff Mom Never Told You 1048 01:00:35,040 --> 01:00:36,720 Speaker 1: is a production of iHeartRadio. For more podcasts from 1049 01:00:36,800 --> 01:00:38,400 Speaker 1: iHeartRadio, you can check out the iHeartRadio app, 1050 01:00:38,400 --> 01:00:40,560 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.