Annie: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And today we are once again thrilled to be joined by the magnificent, wonderful Bridget Todd. Thank you so much for being here, Bridget.

Bridget: Oh, I'm so excited and honored. Every time I come, I feel like you all give me the best intros. I just want to run that introduction in my earbuds all the time when I'm having a bad day.

Annie: Exactly. Well deserved. Yeah, because you, as always, have a lot going on, and I'm very, very excited about what we're talking about today and about the work you're doing around it. Because, as we'll shout out at the end, you are working on a new podcast that is coming out soon, correct?

Bridget: That is right. By the time folks hear this, it will be out. Yeah, it's exciting. It's not the cheeriest topic, but very important. But yes, new podcast coming soon.

Annie: Yes, yes. And I love the topic you brought, because it exemplifies so much why this new podcast, which is called Internet Hate Machine...

Bridget: The name, thank you. You can see how out of practice I am at doing promo; I didn't even say the name of the podcast. Yes, Internet Hate Machine. I wish I could take credit for that title, but it is a production of Sophie and Robert Evans over at Cool Zone Media, who are my partners in this, and they're phenomenal. When I was going off the rails about how we should be talking about the Internet and blah blah blah, Robert Evans was like, "it's a real Internet hate machine," and I was like, oh my god, perfect. It was giving us Nine Inch Nails vibes, all of that. Yes, I wish I could take credit for that title, but it's all them.

Annie: Yeah, it's excellent and very much needed, and I think good points will be made throughout this episode about why it is so needed. But before we just dive into it, we like to check in with you. How are you doing?
We got some cat updates before we started recording, which were great. But really, how are you, Bridget?

Bridget: Oh, I appreciate being asked. I'm good. This is a weird time, because we're just a few days out from the midterm elections, and so I'm working in my capacity with the gender justice group UltraViolet on midterm election stuff, which has been a lot, I guess. But yeah, things are good, busy, happy. How are you two?

Annie: I'm always like, oh, I think I'm okay. I'm good. I'm nervous about the midterm elections, which I know we're going to talk about a lot. I love Halloween, so I'm trying to get into Halloween, but I feel like I've been so busy in October that I kind of missed all of the stuff I normally do. So now I'm trying to cram it in really quickly, and I don't like that feeling. I think it'll work out fine, it'll be great, but I don't like this all-of-a-sudden, oh my gosh, I haven't done all of my traditional Halloween things. So I guess stressed, which is normally the answer I would give. Stressed and tired is about where I am, but trying to make the best of it.

Samantha: Yeah, see, these are the type of questions where you're like, do I... am I honest? Or do I just try to get past it?

Annie: If you feel comfortable. Whenever I ask "how are you," I don't want you to be like "fine" and move on. I want the real answer, if you're giving it.

Samantha: Oh yeah, absolutely. You know, when it comes to holidays, and I think we talked about this recently, Annie, I'm like, I do like Halloween. I really do like the idea of it, the spirit of it. But when it comes to the holidays, and the holidays are approaching, I go into anxiety overload, slash a little bit of holiday seasonal depression. All of that just adds in.
So a part of me right now is like, my heart is racing, trying to think of how to answer this question and not bring the entire show down, as well as the fact that I know everything is okay, everything is okay, but I'm just prepping. And, as we're going to talk about, there's so much going on in society and there's so much on the line. And I feel like, you know, when we talk about the seventies and the nineties, and how the revolutions and all of the changes, and feminism itself, had taken turns, big things had happened to make you feel like, oh God, is this the end of the world? And if this doesn't go this way, we're all... you know what I mean. So it's kind of this moment of, okay, we need to fix it. So what do I do to fix it? Oh, I can't do anything. I quit. I'm gonna go to bed. Let's go to that level. So that's where I am. That is the worst answer to give, and I'm sorry.

Bridget: I completely feel you. First of all, I don't know if I've said this on the show: I hate the holidays. I like Halloween. I enjoy Halloween, I look forward to Halloween. All of the other holidays that come after, I'm like, oh god, we'll be doing this. It's just so much obligation and so much work, and everyone is stressed. And I feel like, pretty much universally, everybody would rather just not do it every year. What if we all just agreed to skip it this year? I definitely, definitely feel you. Oh, I do have to ask: are you dressing up for Halloween? I was going to ask the both of you. I am dressing up. The costume that I have is a little bit niche. Have either of you seen the nineties remake of Romeo and Juliet, the Baz Luhrmann remake?

Annie: Yes!
Bridget: Do you remember Romeo's... I think it's his cousin, Mercutio? Do you remember the scene where they go to a costume party and they're all dressed up, and Mercutio is like a seventies disco queen, and there's a little montage where he sings and dances to the song "Young Hearts Run Free"? I love a costume within a costume, so I am going as Mercutio from that specific scene.

Samantha: Oh, we're talking about the Leonardo DiCaprio one, of course. Okay, okay, okay, yes, I have seen it. Silly man. I feel like I don't know what's going on, but I've seen something like this.

Annie: Samantha hasn't...

Samantha: Look, this has happened to me. So, future reference, Cool Zone Media: I was on Behind the Bastards and they kept naming off these cool movies that I'm supposed to know, and I'm like, no, I've never seen that, no, I've never seen that. I'm like, this is bad. Very out of touch. I'm not surprised, but I have seen this one.

Annie: Yeah, and that's an amazing costume, and I'm so excited to see the pictures.

Bridget: Yeah, what are you two going as? Are you dressing up?

Annie: Yeah, I love dressing up. I love a costume. During the pandemic I probably wore more costumes than, like, quote, real clothes. I love it. If you haven't seen it on Instagram, I have a disco Luke Skywalker outfit that is phenomenal, and it is like a costume within a costume. It's very, very niche, but very beautiful, I think. So I might do that, but that's kind of a whole thing. If I show up in that, I'm elevating things, and I'm just going to say it, I'm elevating things. So I also have a bunch of Luke Skywalker costumes I could do, and I want to do my own. But I also got a Taylor Swift outfit, so I might do that one. Who knows.

Samantha: Taylor Swift, like Luke Skywalker Taylor Swift, or just Taylor Swift?

Annie: I could combine them. I actually did think about it.
I did think about it, because that is a popular fan fiction trope, so I could do it. But we'll see. I'm gonna make a game-day decision, I think, based on what I'm feeling. And I just went to a Stranger Things thing, and I went, and I killed it. This guy came up to me, and, I mean, I'm sure he's paid to say it, but he was like, "nice eighties outfit."

Samantha: You hate to say it. He worked there, he works there, so I'm sure he was probably told, like, compliment people who came in costumes: "they're good, they're pretty good."

Annie: Thank you.

Samantha: Yeah, because you had a whole crew, your original Dahlonega friends, and then Ramsey, the executive producer friend Ramsey. Y'all all dressed up.

Annie: Ramsey did the classic eighties band T-shirt, jean jacket, and he had really good eighties shoes, because that's what I struggled the most with, the shoes. But yeah, I'm glad, because I was going to wear an AC/DC shirt, and I thought, Ramsey is going to do the eighties band shirt, so I won't do it. And he did anyway.

Samantha: Yeah, y'all all looked good. I was like, they're already ahead of the game, looking like they're from the eighties, and I loved it. Again, this is one of those things where I'm like, oh my god, because I don't do well with costumes, so I just give in and buy the Target onesies that are some kind of animal, so I can be both warm and lazy. So, we have a friend who was doing a ghost-ship-themed party, so I am going to dress up as a shark, and I'm hoping that my partner will dress up as a dead, like, cruise ship person that I've been eating.

Bridget: I love that there's a story.

Samantha: Yeah, right? That's the best way. I'm like, I'll tell you about my dark costume, even though it's not.

Bridget: I love it. I think that's great. Yeah, a costume with a backstory of, like, "well, I'm feasting on this person."

Samantha: You see, love, in my cute little Target onesie,
it's fine. It's warm, I hope.

Annie: Yes, I'm hoping to see pictures of all of these things.

Bridget: I love Halloween, but one thing that does always come with Halloween, like every couple of years, is this tension around voting, this tension that the elections are coming up. So it's always like, I'm trying to not think about it... is there anything spookier?

Annie: Yes, yes. And as we've hinted at, that's what we're going to be talking about a lot in here. And in Georgia, it's been an intense midterm election for us. So yeah, I guess let's just break down some of the things that are going on in these midterms.

Bridget: Yes. So, as we're recording this (I don't know how long it will have been out when folks hear it), we are ten days out from the election. And I mean, it sounds like you kind of feel like I do, where particularly around midterms it just makes me queasy, because I know that GOTV, get-out-the-vote work, is so different when it's not a presidential election. It's harder to get people excited, especially when I know there's so much on the line. You know, you mentioned Georgia; that's an important race. A lot is happening, a lot is hinging on people actually coming out, and so it's something that I just always feel kind of queasy about. And for me, having worked in politics, or politics-adjacent, for most of my adult life, I always kind of feel meh about elections, because I'm not super invested in electoral power; I think that there are other ways to build power. And then, around ten days out, I'm like, oh my god, I need to start GOTV-ing, I need to start calling everyone. It's like something happens, and so I'm sort of waiting for that transformation to happen. And it sounds like you two are feeling sort of similarly about the elections this time around.

Annie: Yeah.
So for me, I recently was coming home from a trip with my mother, and she drove me past where I vote, my polling place. It was the first day of early voting in Georgia, and it was packed out. I was like, oh my god, what is happening? And then I looked it up, and early voting was happening. It's very heartening to see that many people turn out for a midterm. And I waited a couple of days, and when I went back, it was still crowded. People were excited, there were people filming, and everyone was really warm and nice, like, "oh, I'm so glad you came out to vote," which was a pleasant experience for me. But what kind of overshadowed the whole thing was, you know, there have been these crackdowns on voting rights, so why are so many people coming out so early? Which I think is great, but it was just kind of hanging over the whole thing. And also because it is a very divisive race in Georgia, the senatorial race specifically. I don't know, it was one of those things where I was like, I've been to midterms before and nobody was there. I walk past that area all the time, and I would never see a line, and now, every day when I walk past it, there's a line. So that's nice, but you just can't shake all of the kind of negative things that might be behind why that is, even though I'm happy to see it, if that makes sense.

Samantha: Right, absolutely. Right now we have the highest early voter turnout in our state that we've ever had, beating the last presidential election. So it's amazing to see that things are happening, and we know that that's a great indication of things to come. Then again, yes, we have had a huge upsurge in voter suppression laws in the last five years.
It actually has always been this way, let's just be honest, but it's seemingly getting worse and worse. And we know the way they have redistricted everything, and it's very, very shady. Although, you know, our current governor was the secretary of state who controlled his own election, which says a lot in itself, and that's a big point of contention here. As well as the fact that apparently Raffensperger, our current secretary of state, who actually stood up to Trump at one point in time and got too much credit, in my opinion, for what he did, has hired a former staffer of Trump's for these elections. So there are a lot of questionable things happening. We have a lot of concerns and a lot of hope. It's such a mixed bag, as Annie was saying, of what is happening here. We're seeing what's happening because the governor's race is a pretty big deal, as the overturning of Roe v. Wade has made it a huge deal, since we do have a six-week ban in Georgia currently. And then there's talk about the fact that they're going to punish those who do have abortions, possibly charging them with murder, with this whole unborn fetus law, protecting them, crediting them as a citizen type of conversation, which we know is trickery and is leading the way to a lot of imprisonment and unfair treatment. And we know this is completely targeted at marginalized individuals in our state. So there are a lot of concerns there. So much feels really weighty here right now, and it does feel like it is life or death for so many of us. And I'm being very local in this moment, because this is kind of where we are, and Annie and I obviously have to concentrate so hard on these conversations. So there is this level of doom, which I'm wonderful at bringing up in conversation all the time.
Apparently, there's this fear and level of doom in this election that feels too heavy. At the same time, we are the butt of jokes with our current Senate race, and it's really like... you have to laugh but want to cry, as well as, why? Why? Just... why?

Bridget: Oh my gosh. If you all get me started on Herschel Walker, I will never stop talking. So I'm not going to comment on that. Boy, I want to, but truly, if you got me going, I would never stop. I will just say that all of that is really valid. You know, this feeling of, it's great to see people voting, but that unease of, what does that mean for everyone? I think that's really valid. And just the feeling of knowing how much is on the line, what is at stake this time around. I hate to say it, but I think as our elections become more and more contentious, and as things become more and more polarized, that's going to be a common sentiment. And, you know, something that kind of speaks to that is the fact that there are more women running for statewide offices than ever before, which, you know, I think should be good news. It marks a record high for women running in gubernatorial and Senate races; there are about sixty-five women running in governors' races across the country this cycle. I should say, it is not all lefties: there are more Republicans than Democrats making those bids. This is all according to data from the Center for American Women and Politics. And it's true, when there are long lines at polling places, I'm like, oh, that's great, democracy in action. And that's definitely preferable to people having elections not go their way and then trying to force them to go their way via violent behavior, like we saw on January 6th. So things feel charged, I guess I'll put it that way.
And women are poised to make gains in statewide contests, but they remain underrepresented in state legislatures around the country. It's also true for black women: we're seeing more black women and more women of color running for local office. Black women have set record numbers as candidates in gubernatorial, Senate, and House races this cycle. There are a lot of Latina and Hispanic gubernatorial and US House and Senate candidates, and a record number of Asian and Pacific Islander women running for governor. And as fraught as that can be, I think it's overall good, you know, not just because representation matters, which it does, but also because anything that gets us closer to a representative democracy, one where the people governing actually have the lived experience of the people they are governing, is a good thing. So that's one kind of silver lining, I guess, particularly when I think about issues like abortion, like you were talking about, Sam. I don't have any hard numbers for this, but I would suspect that women are motivated to run for office because those issues have taken such a national stage recently. I wish I could say that it's all women who, you know, want to protect abortion access. I wish I could say that if you were a woman, that was part of your platform, but that would not be correct.

Annie: Yeah, and that's one thing that is very overwhelming right now in Georgia, and I'm sure in a lot of places: our political ads are intense and hard to escape. But yeah, it's kind of like what I was saying. We have these positives: we have more women running, we have more people of color running, we have more black women running. But that doesn't come without these realities that you have spoken about a lot on this show, realities that often get kind of swept aside and not talked about, even though they're so important and they are impacting our political landscape.

Bridget: Exactly.
And so I think it is so great when we support women, especially black women and women of color, to run for office, and to just generally be involved in civic life through things like working as election workers or becoming vocal advocates or activists on an issue. But it is imperative that that support be grounded in the reality of what we know these women will face when they find themselves in those positions that we, you know, championed them to hold. Right? So, we on the left are very fond of saying things like "trust black women," "support black women." That is very true. But we can't just advocate for black women to get into these positions; we have to be honest about what they will face when they get there, and then really do the intentional work of creating the conditions for them to have an equal playing field, to make sure that they will thrive in those positions. Because it's one thing to be like, oh yes, put black women in leadership positions, but then are you going to support them when they face the attacks that we know they are going to face? For instance, I was super excited when President Biden said that he was going to pick a black woman to be the vice president, and also nominated a black woman to be a Supreme Court justice. But I also knew that that meant those black women would certainly face heightened attacks that their white male counterparts just would not have to face. And so I was then a little bit disappointed to see that the White House didn't really meaningfully deal with this reality. I wondered, are they setting black women up to deal with the harassment and attacks that we know are going to be racialized and gendered? Not attacks based on their records or their merits or their actions, but attacks on their identity, on who they are. So when these attacks happened, which we saw with both Harris and Justice Jackson, the White House didn't really acknowledge them.
And of course, the women themselves can't really talk openly about them. So we all just kind of saw it happen, and nobody was, at the very least, talking honestly about it, let alone working to create the conditions to combat it.

Annie: Right. And that's just such a huge thing, you know. Either it's a drain on you, because you're having to deal with these attacks, or you just kind of distance yourself from social media, and then that in itself people can interpret in a variety of ways that are probably not good. And then people, especially younger people, see that. There are just so many unhealthy things about what that says, that we're accepting it as something that's just par for the course, that's how it is. And the very unfortunate thing is that this is not just one time or a few times; it's all over the place.

Bridget: Exactly. It's not an isolated thing, I'm sorry to say. According to a report from the Institute for Strategic Dialogue called Public Figures, Public Rage, about candidate abuse on social media, statistically speaking, black women and women of color are more likely to face racialized and gendered attacks than their white male counterparts. The study, which you can find online, is super interesting. They found that abusive messages accounted for fifteen percent of the social media messages directed at every female lawmaker that they analyzed, compared to around five to ten percent when they looked at male candidates. A few of their key findings: women of color are particularly likely to be targeted online; male politicians of color are not more vulnerable than their white counterparts, they are attacked at the same rate; and abuse towards women was more likely to be about gender than the abuse targeting men. So abuse targeting men was usually focused on their political stances, while the messages directed at women were more likely to be about their appearance or their general competence.
Interestingly enough, female Democrats received ten times more abusive comments than their male counterparts on Facebook, and Republican women received twice as many abusive messages as Republican men. So it's a bipartisan issue, an issue impacting all women and women of color. It's not just women on the left who face it; it's all of us.

Annie: Right. And one of the points you make when you come on here all the time, which I think is very important and I love, is that a lot of times people can distance it, like, oh, that's online, that's not real life. But that is not at all the case.

Bridget: Yes. I'm sure people are sick of me saying this, because I, you know, say it from the rooftops every time I am given a platform. But I think we have this misconception that when we talk about online harassment and abuse, we're talking about online issues. And the research could not be clearer that even though these attacks may start online, they do not always stay online. A great example that we're seeing right now is Representative Jayapal: a man started by sending her violent emails and threatening messages online, then showed up outside of her house with a gun. And I just had to take a minute from recording this podcast to quickly sign off on a statement about the husband of Nancy Pelosi. Just today, while we're recording this, someone with a history of violent, you know, rhetoric on social media broke into Nancy Pelosi's house and attacked her husband with a hammer when he found that she was not there. And so it is deeply imperative, if we're going to take this kind of violence seriously and prevent real-world violence from happening, that when someone reports violent threats or violent attacks online, we take it seriously, because the two are linked. The research could not be clearer: time and time again,
when somebody is the perpetrator of a mass shooting or an incident of mass violence, you follow the paper trail and, oh, they had an online history of threatening women or being aggressive or hostile towards the women in their life online. So we cannot meaningfully get a handle on the abuse of women if we do not take it seriously when it happens online; it's just not going to happen. Real-world violence is connected to online violence, and we need to deal with it as such. But unfortunately, that has not been the case for so long. People have been so quick to belittle it when a woman speaks up about what she's facing online, particularly if that woman is black or a woman of color.

Samantha: We've talked about so much harassment, and one of the biggest platforms, and I know we're gonna talk about this later and I'm already addressing it, Twitter, has been one of the biggest perpetrators in allowing this type of abuse and not actually following through on the policies they've already put in place themselves to supposedly stop these types of harassment. And then knowing that there are even ways to find specific people to target, like when you were talking about the bots, and having, pretty much, hate farms created on these types of platforms. I can't imagine what it does to... yeah, it's happening to those who are at higher-profile levels, I guess is the best way to say it. But to be fair, I am petrified as someone who's kind of... I don't know the word. Not an influencer. A public person? We'll say public person.
My name is out there. I'm petrified, with my five followers, that I'm going to say something wrong, or someone's gonna hear something that we do and take it and say, "that means we can go after her," and I wonder what that would be like. Because I've seen normal people, regular people, getting attacked; all they said was this one thing, or they just agreed with something, and people are going after them.

Bridget: Oh my god, I mean, absolutely. And I would be remiss not to mention that we're having this conversation on day one of Elon Musk at the helm of Twitter. People keep asking, like, oh, what do you think about that? What do you think about that? And I can tell you that the people who are extremists, bad actors, people who are interested in, you know, perpetrating the kinds of harassment campaigns that you were just talking about, Sam, those people are rejoicing. Just today, I already saw a bunch of tweets where someone was like, I'm gonna use the N-slur as many times as I can, because it's okay now on Twitter. These are the kind of people that Elon Musk is welcoming back to the platform. These are the kind of people who are rejoicing because Elon Musk is at the helm of one of our largest and most important communications platforms. And so you're exactly right. We started this conversation talking about elected officials, people running for office; more and more, we are seeing this trickle down to just regular people. And I want to be clear: we have seen this kind of online behavior keep marginalized people from doing things like running for office, but also from things like serving their community as election officials or poll workers, or speaking up about their opinions at school board meetings. Because people who just generally engage in public civic life know that, in this climate, they are setting themselves up to be attacked.
And it's bad enough that that's happening to women running for office, but it's also happening to people who are just, you know, everyday people. It's not like they have a security detail. It's not like they have money to protect themselves in the way they would need to. Just a few examples that we're seeing recently: educators are being attacked by extremists for being suspected of being LGBTQ, or for teaching something that these people do not like, and these campaigns have been incredibly effective. One of them happened right in Georgia. Cecelia Lewis is a Georgia educator who was basically run out of town by a group of parents organizing on Facebook, because they suspected her, and she had not even gotten the job yet, of planning to teach critical race theory. Never mind the fact that she wasn't; that's just complete conjecture. It just happened to be that she is a black woman, and so they were able to say, oh, she's gonna teach our kids critical race theory, even though she wasn't even an in-classroom teacher; she was just an administrator. And so she had to leave town. And when she left town, these people followed her to the next town that she went to and ran her out of that town, too. So these campaigns are incredibly effective. Another good example is looking at election workers, eighty percent of whom are women. Election workers are being attacked and accused of things like vote tampering if elections don't go the way that these extremists want them to. And again, another example from Georgia: if you watched the January 6th Committee hearings, you might have seen the story of Ruby Freeman and Shaye Moss. That story broke my heart. A black mother and her daughter with a long, storied history of serving their community as election and poll workers, like the black women in your town who, if you have any questions about voting, they got you.
If you need help voting, they got you. People who are called to serve their community and have been for a very long time. Well, the thanks they got for that was being horribly attacked. When Trump lost Georgia in the election, he and Rudy Giuliani and his other cronies baselessly, repeatedly, and publicly accused these women of vote tampering. They had videos of them where they claimed they were moving votes, which never happened. And this led to really terrible, frightening attacks on them, where people showed up at their home and the home of their elderly grandmother, and they had to flee for their own safety. And again, these are regular people, not people who have a security detail, or people who have the money to scrub their personal information from the Internet, the way that you would need to if you were facing these kinds of attacks.

Samantha: Yeah. Actually, I had volunteered one time to just oversee as people were lining up, to make sure everything was okay. I didn't talk to anybody; I just made sure everybody had the right to vote without being harassed, all those things. That's all I did. But the entire group of people who were actually volunteering, handing out the pencils, giving instructions, checking IDs and such, were all black women, every single one of them. And just this last vote, when I did mine, my whole precinct was all black women, and they were so sweet, too. They were so great. We had the forty-five-minute line, and they were so on top of it, trying to make sure to get everybody through seamlessly, and they did a beautiful job. And they were there, and this was, I think, the fifth day of the week that they were doing this, and they were not complaining; they were smiling, making conversation. It's people like them who are making this happen, and it's going so well and so perfectly, as if there's nothing happening.
Literally, the line was going smoothly, and 594 00:33:03,680 --> 00:33:06,560 Speaker 1: I loved everything about that. And I cannot imagine that just 595 00:33:06,600 --> 00:33:09,320 Speaker 1: because they're there, they exist, and they decided to help 596 00:33:09,360 --> 00:33:12,080 Speaker 1: their community, they're going to be targeted and yelled at. 597 00:33:12,120 --> 00:33:16,160 Speaker 1: And honestly, I know there are conversations of people really 598 00:33:16,360 --> 00:33:19,080 Speaker 1: being scared about being physically harmed, because people are pushing 599 00:33:19,120 --> 00:33:23,400 Speaker 1: and trying so many intimidation tactics to accuse 600 00:33:23,520 --> 00:33:28,680 Speaker 1: and blame somebody for the loss of a bully. Yeah, 601 00:33:28,760 --> 00:33:32,200 Speaker 1: this is just my opinion: black women are the backbone 602 00:33:32,320 --> 00:33:36,880 Speaker 1: of our democracy. They are, like, older black women from 603 00:33:36,920 --> 00:33:39,640 Speaker 1: the South. They are the grease of the wheels of 604 00:33:39,680 --> 00:33:42,680 Speaker 1: democracy. Like, I feel like when I go into a 605 00:33:42,680 --> 00:33:44,200 Speaker 1: place like that, I'm like, oh, of course there's, 606 00:33:44,240 --> 00:33:46,680 Speaker 1: like, hella black women with clipboards in here, 607 00:33:46,800 --> 00:33:49,800 Speaker 1: like, that's the vibe. And you know, we know 608 00:33:49,840 --> 00:33:52,280 Speaker 1: that eighty percent of election workers and poll workers are women. 609 00:33:52,320 --> 00:33:54,160 Speaker 1: I would be willing to bet that a bunch 610 00:33:54,200 --> 00:33:56,560 Speaker 1: of those are black women, especially in the South. And 611 00:33:56,600 --> 00:33:59,200 Speaker 1: I think that what we're really doing is allowing 612 00:33:59,240 --> 00:34:03,240 Speaker 1: the people who are the most marginalized in our society, we 613 00:34:03,320 --> 00:34:06,960 Speaker 1: are allowing them, to put themselves out there to 614 00:34:07,080 --> 00:34:10,840 Speaker 1: support our communities and our democracy in these ways for 615 00:34:10,960 --> 00:34:12,520 Speaker 1: not a lot of money. Mind you, a 616 00:34:12,520 --> 00:34:14,640 Speaker 1: lot of these women are volunteers or are very low paid, 617 00:34:15,080 --> 00:34:18,160 Speaker 1: and we're saying you also need to absorb these attacks 618 00:34:18,239 --> 00:34:21,000 Speaker 1: to do so. We're basically setting these women up 619 00:34:21,360 --> 00:34:24,080 Speaker 1: to be attacked, giving them no support, and not even 620 00:34:24,080 --> 00:34:26,120 Speaker 1: really talking about it when they are attacked, right? And so 621 00:34:26,840 --> 00:34:29,759 Speaker 1: I think my experience with voting has been the 622 00:34:29,800 --> 00:34:33,719 Speaker 1: exact same: these are the women who are, at 623 00:34:33,719 --> 00:34:36,719 Speaker 1: this point, I would say, risking a lot to make 624 00:34:36,760 --> 00:34:39,400 Speaker 1: sure that our democratic process is able to take place. 625 00:34:39,480 --> 00:34:43,000 Speaker 1: And honestly, this is probably not surprising to anybody, because 626 00:34:43,080 --> 00:34:46,520 Speaker 1: these kinds of attacks are meant exactly to keep those 627 00:34:46,600 --> 00:34:49,360 Speaker 1: kinds of people from doing what they do.
It is 628 00:34:49,360 --> 00:34:52,000 Speaker 1: meant to keep them from doing their work of keeping 629 00:34:52,040 --> 00:34:54,480 Speaker 1: the wheels of democracy going. It is meant to keep 630 00:34:54,520 --> 00:34:57,000 Speaker 1: them from being engaged in civic life, because who would 631 00:34:57,000 --> 00:34:58,680 Speaker 1: want to serve their community by working as a 632 00:34:58,680 --> 00:35:01,040 Speaker 1: poll worker or running for office if it means 633 00:35:01,080 --> 00:35:04,440 Speaker 1: that they and their families are going to be facing 634 00:35:04,480 --> 00:35:07,600 Speaker 1: these kinds of attacks? So, probably not surprising to anybody 635 00:35:07,640 --> 00:35:10,240 Speaker 1: that here we are ten days out from an election 636 00:35:10,480 --> 00:35:13,160 Speaker 1: and the United States is facing a national shortage of 637 00:35:13,280 --> 00:35:16,400 Speaker 1: poll workers. Kim Wyman, who is the senior election security 638 00:35:16,480 --> 00:35:19,680 Speaker 1: lead at the Cybersecurity and Infrastructure Security Agency, or 639 00:35:19,760 --> 00:35:22,520 Speaker 1: CISA, said that because of a rise 640 00:35:22,640 --> 00:35:26,000 Speaker 1: in threats against election workers, one in three election workers 641 00:35:26,040 --> 00:35:29,080 Speaker 1: and poll workers have quit their positions over fears for 642 00:35:29,120 --> 00:35:31,919 Speaker 1: their safety, and state officials are having a hard time 643 00:35:32,360 --> 00:35:35,960 Speaker 1: hiring folks for these positions because, again, who would want 644 00:35:35,960 --> 00:35:38,640 Speaker 1: to do these low-paid duties if it means, oh, 645 00:35:38,640 --> 00:35:41,280 Speaker 1: you and your family might have to flee for your safety 646 00:35:41,400 --> 00:35:43,720 Speaker 1: because you wanted to take this position helping 647 00:35:43,719 --> 00:35:46,880 Speaker 1: your community in this way? I can understand why people 648 00:35:46,920 --> 00:35:49,480 Speaker 1: are not lining up to do this work. But you 649 00:35:49,520 --> 00:35:53,399 Speaker 1: can see how big those consequences are for our democracy 650 00:35:53,400 --> 00:35:55,239 Speaker 1: and for all of us when we can't even get 651 00:35:55,280 --> 00:35:59,320 Speaker 1: people to enact the labor that needs to happen 652 00:35:59,440 --> 00:36:04,680 Speaker 1: for our democratic process to exist. Yeah. And as 653 00:36:04,680 --> 00:36:08,360 Speaker 1: you've mentioned, we've arrived at not a great space 654 00:36:08,719 --> 00:36:12,799 Speaker 1: in our democracy, but we've had a long history. Like, 655 00:36:12,840 --> 00:36:16,279 Speaker 1: we had plenty of signs where we could 656 00:36:16,280 --> 00:36:19,120 Speaker 1: have listened, especially to black women, and we didn't. And 657 00:36:19,160 --> 00:36:21,480 Speaker 1: this is where we are. So can you expand on 658 00:36:21,520 --> 00:36:25,000 Speaker 1: that a little bit? Yes, I believe that this entire thing, 659 00:36:25,080 --> 00:36:28,279 Speaker 1: all of these threats to our democracy, is really connected 660 00:36:28,520 --> 00:36:31,279 Speaker 1: and goes back to not listening to black women, because, 661 00:36:31,280 --> 00:36:34,400 Speaker 1: as I talked about, you know, these threats 662 00:36:35,080 --> 00:36:38,680 Speaker 1: that start online become real-world threats.
Black women have 663 00:36:38,719 --> 00:36:41,000 Speaker 1: been saying this for a very long time, and I 664 00:36:41,040 --> 00:36:44,080 Speaker 1: feel that nobody has really listened. You know, some of 665 00:36:44,120 --> 00:36:46,640 Speaker 1: the first people to really raise the alarm about the 666 00:36:46,800 --> 00:36:50,600 Speaker 1: role that online harassment plays in our landscape were black women. 667 00:36:50,800 --> 00:36:53,560 Speaker 1: These women have been saying this for years, and people, 668 00:36:53,760 --> 00:36:55,560 Speaker 1: and by that I mean people with power, the people 669 00:36:55,560 --> 00:36:58,120 Speaker 1: with the power to do something, pretty much ignored them. And 670 00:36:58,160 --> 00:37:00,640 Speaker 1: so I would then argue that not listening to black 671 00:37:00,680 --> 00:37:04,840 Speaker 1: women has consequences for all of us. A couple of examples. 672 00:37:04,880 --> 00:37:08,040 Speaker 1: Most people listening are probably familiar with Gamergate. If 673 00:37:08,080 --> 00:37:10,960 Speaker 1: you are not familiar with Gamergate, I am jealous. 674 00:37:10,960 --> 00:37:13,920 Speaker 1: If that's something where you're like, 675 00:37:13,960 --> 00:37:16,120 Speaker 1: I've never heard of that before, I want to live 676 00:37:16,160 --> 00:37:20,200 Speaker 1: the life that you are living, right? But essentially, Gamergate 677 00:37:20,200 --> 00:37:23,799 Speaker 1: was when a bunch of men pretty much 678 00:37:23,840 --> 00:37:27,759 Speaker 1: attacked and harassed mostly women under the guise of being 679 00:37:28,040 --> 00:37:33,560 Speaker 1: big mad about, scare quotes, ethics in gaming journalism, and 680 00:37:33,680 --> 00:37:36,520 Speaker 1: it got lots of attention, rightly, and most people 681 00:37:36,600 --> 00:37:40,799 Speaker 1: probably remember it. But those same people probably don't know 682 00:37:40,960 --> 00:37:43,959 Speaker 1: that the same folks responsible for Gamergate were using 683 00:37:44,000 --> 00:37:47,200 Speaker 1: those exact same tactics against black women before Gamergate: 684 00:37:47,520 --> 00:37:50,440 Speaker 1: women like Adria Richards, who was targeted for racist harassment 685 00:37:50,480 --> 00:37:53,759 Speaker 1: online after she tweeted about a crass joke about dongles 686 00:37:53,840 --> 00:37:56,000 Speaker 1: that she overheard at a work event. Or women like 687 00:37:56,040 --> 00:38:00,880 Speaker 1: Shafiqah Hudson, who back in 2014 reported and called out bad actors 688 00:38:01,040 --> 00:38:03,840 Speaker 1: who were using fake accounts to impersonate black women to 689 00:38:03,920 --> 00:38:07,000 Speaker 1: destabilize Internet communities. You know, these are women who spoke 690 00:38:07,080 --> 00:38:09,279 Speaker 1: up about what they were facing on the Internet, and 691 00:38:09,280 --> 00:38:12,160 Speaker 1: they were basically ignored. And I wonder what might have 692 00:38:12,160 --> 00:38:15,960 Speaker 1: happened if somebody with power had actually listened to them 693 00:38:15,960 --> 00:38:19,040 Speaker 1: and taken some action. For one, I believe Gamergate 694 00:38:19,120 --> 00:38:21,400 Speaker 1: might have gone down very differently if, when people used 695 00:38:21,480 --> 00:38:25,160 Speaker 1: these tactics against black women, somebody was like, hey, wait, 696 00:38:25,239 --> 00:38:27,040 Speaker 1: this is bad.
We should not allow our platform to 697 00:38:27,080 --> 00:38:30,320 Speaker 1: be used this way, and made some changes. Or consider 698 00:38:30,360 --> 00:38:33,480 Speaker 1: the fact that six years after Shafiqah Hudson reported people 699 00:38:33,520 --> 00:38:37,440 Speaker 1: impersonating black folks on Twitter to cause chaos, white supremacist 700 00:38:37,480 --> 00:38:40,920 Speaker 1: groups used that very same tactic during the 2020 racial uprisings 701 00:38:41,920 --> 00:38:44,839 Speaker 1: to make it seem like Black Lives Matter activists were 702 00:38:45,120 --> 00:38:47,880 Speaker 1: using the Internet to call for people to loot 703 00:38:47,960 --> 00:38:51,480 Speaker 1: homes and cause violence. And Twitter actually confirmed this. Twitter 704 00:38:51,600 --> 00:38:54,080 Speaker 1: was like, oh, yeah, we can confirm now that some 705 00:38:54,160 --> 00:38:57,319 Speaker 1: of the accounts that were using our platform to call 706 00:38:57,360 --> 00:39:00,960 Speaker 1: for people to loot and cause violence during these protests, 707 00:39:01,000 --> 00:39:04,120 Speaker 1: those were actually white supremacists pretending to be Black Lives 708 00:39:04,120 --> 00:39:07,280 Speaker 1: Matter activists. And so, you know, they didn't actually explain 709 00:39:07,560 --> 00:39:11,680 Speaker 1: why they didn't do anything to prevent this from being a 710 00:39:11,719 --> 00:39:14,799 Speaker 1: tactic on a larger scale down the line, if they 711 00:39:14,880 --> 00:39:17,960 Speaker 1: knew it was a vulnerability of their platform. And when 712 00:39:17,960 --> 00:39:20,960 Speaker 1: we zoom out even further, these are the same tactics 713 00:39:20,960 --> 00:39:25,160 Speaker 1: that a Senate inquiry would later confirm were used to 714 00:39:25,160 --> 00:39:28,520 Speaker 1: try to destabilize the 2016 election, right? And so if, when these 715 00:39:28,560 --> 00:39:32,040 Speaker 1: black women reported what they were seeing online, somebody 716 00:39:32,080 --> 00:39:36,920 Speaker 1: had done something, I wonder, would bad actors who were 717 00:39:36,960 --> 00:39:40,640 Speaker 1: trying to use this tactic to destabilize our elections and 718 00:39:40,640 --> 00:39:44,240 Speaker 1: destabilize our online communities and truly cause violence and chaos, 719 00:39:44,680 --> 00:39:48,439 Speaker 1: would that have been a viable tactic? I would 720 00:39:48,520 --> 00:39:51,680 Speaker 1: argue maybe not. But again, these are the consequences when 721 00:39:51,680 --> 00:39:53,960 Speaker 1: folks don't listen to black women and don't take them 722 00:39:54,040 --> 00:39:57,240 Speaker 1: seriously when they speak up about what they're experiencing online. Yeah, 723 00:39:57,280 --> 00:39:59,040 Speaker 1: and one of the things that annoys me the most 724 00:39:59,160 --> 00:40:02,839 Speaker 1: is this whole idea of, like, that's just sort 725 00:40:02,840 --> 00:40:06,319 Speaker 1: of the price of the game, accept it, you know, 726 00:40:06,600 --> 00:40:09,440 Speaker 1: if you're tough. And sometimes people will try to paint 727 00:40:09,440 --> 00:40:11,680 Speaker 1: it with a very nice brush, like, you're strong enough 728 00:40:11,719 --> 00:40:15,160 Speaker 1: to make it against all this harassment, all this stuff online, 729 00:40:15,200 --> 00:40:18,759 Speaker 1: you're so tough. But really, we shouldn't be accepting 730 00:40:18,800 --> 00:40:21,719 Speaker 1: this or expecting this in the first place.
And that 731 00:40:21,840 --> 00:40:24,480 Speaker 1: is one of the points you're making here, 732 00:40:24,560 --> 00:40:27,239 Speaker 1: that we've normalized it to the point where I've 733 00:40:27,280 --> 00:40:30,920 Speaker 1: even heard conversations with people where they will paint it 734 00:40:30,920 --> 00:40:33,360 Speaker 1: as, like, oh, it's a good thing, she's tough, she 735 00:40:33,400 --> 00:40:38,359 Speaker 1: can survive in this environment. It shouldn't be that way, right? Yeah. 736 00:40:38,480 --> 00:40:41,880 Speaker 1: I don't want to have a political landscape where you 737 00:40:41,920 --> 00:40:47,320 Speaker 1: have to be able to publicly endure and withstand attacks 738 00:40:47,360 --> 00:40:50,800 Speaker 1: on you and your family, because when it's women, the 739 00:40:50,840 --> 00:40:53,319 Speaker 1: research is super clear that it's never just the woman 740 00:40:53,360 --> 00:40:56,920 Speaker 1: who is attacked. It's her, her mom, her dad, her kids, 741 00:40:57,000 --> 00:41:01,919 Speaker 1: her partner, her community, her neighbors, her friends. I don't 742 00:41:02,000 --> 00:41:04,759 Speaker 1: want a political climate that says that women have to 743 00:41:04,840 --> 00:41:08,200 Speaker 1: be able to endure these kinds of attacks on their 744 00:41:08,239 --> 00:41:12,239 Speaker 1: safety, and watch their families deal with it as well, 745 00:41:12,280 --> 00:41:14,719 Speaker 1: if they want to be elected officials, if they want 746 00:41:14,719 --> 00:41:17,160 Speaker 1: to serve their community, if they want to just take 747 00:41:17,200 --> 00:41:20,480 Speaker 1: part in civic life. That's not the kind of climate 748 00:41:20,520 --> 00:41:23,520 Speaker 1: I want. We should not accept it. That's not 749 00:41:23,880 --> 00:41:39,719 Speaker 1: a norm that we should be okay with, you know. 750 00:41:39,800 --> 00:41:42,520 Speaker 1: And I keep thinking about how the algorithm has changed. 751 00:41:42,520 --> 00:41:45,279 Speaker 1: And I was talking to Annie about our own algorithm, 752 00:41:45,280 --> 00:41:47,360 Speaker 1: and because we are so afraid of social media, we 753 00:41:47,400 --> 00:41:50,239 Speaker 1: stay away from it, and therefore anytime we do put 754 00:41:50,280 --> 00:41:52,440 Speaker 1: anything up, it's ignored, because we don't have a lot 755 00:41:52,440 --> 00:41:55,920 Speaker 1: of content out there. And I still think about how Twitter, 756 00:41:56,320 --> 00:41:59,440 Speaker 1: and I know, for all our references, it's just that 757 00:41:59,800 --> 00:42:02,200 Speaker 1: we talk about Black Twitter, it is the community coming 758 00:42:02,239 --> 00:42:04,960 Speaker 1: together and having, like, their own space and talking. And 759 00:42:05,000 --> 00:42:09,480 Speaker 1: because they've kind of become grouped, it has come to 760 00:42:09,520 --> 00:42:12,799 Speaker 1: a point where if you don't already follow some 761 00:42:12,800 --> 00:42:14,840 Speaker 1: of these people, or if you're not actually paying attention, 762 00:42:14,880 --> 00:42:17,440 Speaker 1: it just goes away from your feed, kind of how 763 00:42:17,440 --> 00:42:19,680 Speaker 1: TikTok is doing that. We're just making it harder and 764 00:42:19,719 --> 00:42:22,719 Speaker 1: harder to actually see what is happening and what's 765 00:42:22,719 --> 00:42:25,920 Speaker 1: really going down.
I have to question, like, 766 00:42:26,040 --> 00:42:29,319 Speaker 1: is the algorithm really a good thing when 767 00:42:29,360 --> 00:42:31,400 Speaker 1: it does this sort of thing where you're in your own little 768 00:42:31,400 --> 00:42:35,560 Speaker 1: hole, and where it's pocketing black women who are like, hey, 769 00:42:35,600 --> 00:42:38,000 Speaker 1: this bad thing is happening, we're trying to tell y'all, 770 00:42:38,040 --> 00:42:39,800 Speaker 1: but the only people who are seeing those are the 771 00:42:39,840 --> 00:42:42,200 Speaker 1: people who already know? And it's like, what the hell, 772 00:42:42,280 --> 00:42:45,480 Speaker 1: how do we change that? Yeah. What an insightful question. 773 00:42:45,920 --> 00:42:48,120 Speaker 1: So this is just my opinion; this is, like, just 774 00:42:48,200 --> 00:42:53,359 Speaker 1: representing myself. I don't think that it is good to 775 00:42:53,360 --> 00:42:58,120 Speaker 1: have a platform like Twitter, especially, be so tied to 776 00:42:58,400 --> 00:43:01,279 Speaker 1: an algorithm. I don't think it's good for all the 777 00:43:01,320 --> 00:43:04,120 Speaker 1: reasons that you just said. People's voices can 778 00:43:04,160 --> 00:43:06,120 Speaker 1: be siloed, you can feel like you're talking to the 779 00:43:06,560 --> 00:43:09,640 Speaker 1: choir of people who already agree with you, and, you know, 780 00:43:10,400 --> 00:43:13,160 Speaker 1: I know what you're saying. And I also think that 781 00:43:13,600 --> 00:43:16,600 Speaker 1: platforms have not shown that they're able 782 00:43:16,680 --> 00:43:21,759 Speaker 1: to be responsible with algorithmically generated content right now. Algorithms, 783 00:43:21,800 --> 00:43:24,480 Speaker 1: and it's just a fact, are biased towards 784 00:43:24,520 --> 00:43:29,480 Speaker 1: content that is untrue. False information travels on Twitter specifically 785 00:43:29,960 --> 00:43:33,360 Speaker 1: much faster and much further than correct information. They amplify 786 00:43:33,480 --> 00:43:36,399 Speaker 1: content that is extremist, and I don't mean, like, 787 00:43:36,400 --> 00:43:39,600 Speaker 1: capital-E extremists, although that as well, but 788 00:43:40,400 --> 00:43:46,480 Speaker 1: content that is, you know, more extreme than not extreme. 789 00:43:46,520 --> 00:43:49,279 Speaker 1: So if I'm saying, like, I ate a piece of 790 00:43:49,280 --> 00:43:51,880 Speaker 1: toast today and it was burnt, I hate all toast, 791 00:43:52,800 --> 00:43:54,920 Speaker 1: rather than, oh, this piece of toast was burnt, 792 00:43:55,080 --> 00:43:57,640 Speaker 1: when someone sees them, they're going to amplify the one 793 00:43:57,680 --> 00:44:01,000 Speaker 1: that sounds more extreme, because that's what gets more eyeballs. 794 00:44:01,080 --> 00:44:04,400 Speaker 1: They amplify content that is polarizing. They amplify content that 795 00:44:04,560 --> 00:44:08,319 Speaker 1: is loud and aggressive. And I don't think that platforms 796 00:44:08,440 --> 00:44:11,359 Speaker 1: can be trusted to be run algorithmically, because they are 797 00:44:11,400 --> 00:44:14,840 Speaker 1: going to make us all more extreme, more polarized, less informed, 798 00:44:14,840 --> 00:44:17,840 Speaker 1: and less thoughtful.
I would love to see an algorithmic 799 00:44:17,920 --> 00:44:23,280 Speaker 1: model that amplifies content that is thoughtful, honest, accurate, timely, whatever, 800 00:44:23,680 --> 00:44:25,040 Speaker 1: but I have not seen it yet. And so I 801 00:44:25,040 --> 00:44:27,360 Speaker 1: think platforms have shown that they can't be trusted 802 00:44:27,360 --> 00:44:30,360 Speaker 1: to work with algorithmic models, because they're just going to 803 00:44:30,760 --> 00:44:34,759 Speaker 1: amplify stuff that makes us all worse off. Right. Yeah. 804 00:44:34,840 --> 00:44:38,319 Speaker 1: And honestly, just being new to TikTok, and we've 805 00:44:38,320 --> 00:44:41,080 Speaker 1: talked about TikTok often on the show, but it 806 00:44:41,120 --> 00:44:43,840 Speaker 1: also does the same thing with the FYP, 807 00:44:44,080 --> 00:44:47,400 Speaker 1: or For You page, where it only amplifies what seems 808 00:44:47,400 --> 00:44:50,640 Speaker 1: to be most disturbing. I say that from my 809 00:44:50,680 --> 00:44:53,640 Speaker 1: own experience, because there's so much out there. 810 00:44:54,200 --> 00:44:57,160 Speaker 1: The worst is shadow banning, and the way that many of 811 00:44:57,320 --> 00:45:01,920 Speaker 1: these algorithms do this, where there's enough room that people 812 00:45:01,960 --> 00:45:05,040 Speaker 1: can complain or make false allegations, and their standard 813 00:45:05,040 --> 00:45:08,120 Speaker 1: of what they think is bullying and/or racist is 814 00:45:08,160 --> 00:45:10,520 Speaker 1: off. Apparently they have a pretty big threshold for 815 00:45:10,560 --> 00:45:13,200 Speaker 1: saying, no, it's not racist, we're good, even 816 00:45:13,200 --> 00:45:16,279 Speaker 1: though it's obviously super racist, and they 817 00:45:16,840 --> 00:45:20,440 Speaker 1: seem to keep working on that level, allowing 818 00:45:20,560 --> 00:45:24,600 Speaker 1: things that we know are, I don't know, being threatened, 819 00:45:24,760 --> 00:45:28,920 Speaker 1: being called a stereotype that I would consider racist, 820 00:45:29,239 --> 00:45:30,879 Speaker 1: and yet so often it's treated as not that big 821 00:45:30,920 --> 00:45:33,800 Speaker 1: of a deal, they're not really threatening you. Like, wait, 822 00:45:33,960 --> 00:45:35,640 Speaker 1: so we have to have a police report to actually 823 00:45:35,640 --> 00:45:37,759 Speaker 1: say that we're being threatened? And that seems to be 824 00:45:37,800 --> 00:45:41,040 Speaker 1: happening often on these platforms. Exactly. And this is, when 825 00:45:41,080 --> 00:45:44,120 Speaker 1: I said I wasn't really a big fan of algorithmic platforms, 826 00:45:44,160 --> 00:45:45,799 Speaker 1: one of the reasons is because I think that, 827 00:45:47,160 --> 00:45:51,000 Speaker 1: as humans, we are giving the responsibility of moderating platforms 828 00:45:51,040 --> 00:45:55,120 Speaker 1: more and more to algorithms or AI. And AI is 829 00:45:55,160 --> 00:45:57,319 Speaker 1: smart and good at lots of things, but there are 830 00:45:57,360 --> 00:45:59,759 Speaker 1: some things that you need a human's take on. And 831 00:45:59,800 --> 00:46:01,480 Speaker 1: so, Sam, you know, you talked about how, like, oh, 832 00:46:01,520 --> 00:46:03,920 Speaker 1: I'm being called a racial slur, or, I'm being 833 00:46:03,960 --> 00:46:07,399 Speaker 1: attacked in this way.
Bad actors are so good at 834 00:46:07,520 --> 00:46:10,920 Speaker 1: using dog whistles or coded language or things to get 835 00:46:11,000 --> 00:46:14,840 Speaker 1: around AI or machine learning that is, you know, 836 00:46:14,880 --> 00:46:17,320 Speaker 1: working as a content moderator, and they know how to 837 00:46:17,360 --> 00:46:20,640 Speaker 1: exploit those loopholes. And so I completely agree. Especially on TikTok, 838 00:46:20,680 --> 00:46:23,840 Speaker 1: I feel like they really need to make some changes 839 00:46:23,920 --> 00:46:27,080 Speaker 1: with how their platform is run if that platform is 840 00:46:27,120 --> 00:46:29,719 Speaker 1: going to be safer and more inclusive. I think 841 00:46:29,760 --> 00:46:32,920 Speaker 1: it was the Washington Post recently that did a 842 00:46:32,920 --> 00:46:36,040 Speaker 1: little experiment; if you don't follow the Washington Post 843 00:46:36,040 --> 00:46:38,319 Speaker 1: on TikTok, they have a really interesting TikTok channel. They 844 00:46:38,320 --> 00:46:41,480 Speaker 1: were doing a series on TikTok where they used all 845 00:46:41,560 --> 00:46:44,080 Speaker 1: these words that they were certain were going to get 846 00:46:44,120 --> 00:46:47,840 Speaker 1: their TikTok suppressed. So they did a TikTok 847 00:46:47,880 --> 00:46:53,719 Speaker 1: where they said racism, Black Lives Matter, disability, you know, 848 00:46:54,320 --> 00:46:56,319 Speaker 1: all these other words, because the idea was, if 849 00:46:56,400 --> 00:46:59,480 Speaker 1: you use these words in your TikTok, it is 850 00:46:59,520 --> 00:47:01,759 Speaker 1: more likely to be suppressed. So they were doing an experiment, 851 00:47:02,080 --> 00:47:05,319 Speaker 1: and by golly, it worked, you know. And so it's 852 00:47:05,320 --> 00:47:09,040 Speaker 1: clear to me that platforms are not being run, one, 853 00:47:09,320 --> 00:47:12,719 Speaker 1: with a thoughtful, sensitive human who is able to 854 00:47:12,800 --> 00:47:15,399 Speaker 1: understand nuance at the helm, and two, in a way that 855 00:47:15,480 --> 00:47:19,279 Speaker 1: amplifies good conversation, thoughtful conversation, 856 00:47:19,360 --> 00:47:25,120 Speaker 1: substantive conversation, and does not amplify garbage, lies, extremism, hate, 857 00:47:25,239 --> 00:47:27,880 Speaker 1: racial slurs, all of that. Yeah. And on top of that, 858 00:47:27,920 --> 00:47:31,239 Speaker 1: you have, again, as you've talked about, bad players, and 859 00:47:31,239 --> 00:47:34,600 Speaker 1: I've seen this a lot more on TikTok, 860 00:47:34,719 --> 00:47:38,200 Speaker 1: people who, in these conversations, think 861 00:47:38,239 --> 00:47:41,080 Speaker 1: that they're doing something good by doing that extreme call-out, 862 00:47:41,120 --> 00:47:44,400 Speaker 1: in which they get people fired and canceled and 863 00:47:44,440 --> 00:47:46,600 Speaker 1: all of these things, and go after people over just 864 00:47:46,800 --> 00:47:49,319 Speaker 1: one video. And don't get me wrong, a lot of 865 00:47:49,360 --> 00:47:52,319 Speaker 1: these I agree with, that these people who are saying 866 00:47:52,360 --> 00:47:55,239 Speaker 1: really nasty crap and being caught on camera, or caught 867 00:47:55,280 --> 00:47:57,520 Speaker 1: on their phone saying these things, should be called out, 868 00:47:57,600 --> 00:47:59,279 Speaker 1: that there should be repercussions.
But it has come to a 869 00:47:59,280 --> 00:48:02,320 Speaker 1: point that it's entertainment almost, and I'm wondering 870 00:48:02,360 --> 00:48:04,399 Speaker 1: where this is gonna lead, because it's a fairly new 871 00:48:04,440 --> 00:48:06,680 Speaker 1: thing out of the last five years. And I 872 00:48:06,719 --> 00:48:08,319 Speaker 1: say that for both the left and the right, 873 00:48:08,320 --> 00:48:11,240 Speaker 1: that it's like, this is getting dangerously toxic. 874 00:48:11,480 --> 00:48:15,080 Speaker 1: Dangerously toxic. That's a whole different conversation, I know. No, 875 00:48:15,280 --> 00:48:17,040 Speaker 1: it's so funny that you bring this up, because 876 00:48:17,440 --> 00:48:19,960 Speaker 1: I recorded the first episode of Internet Hate Machine 877 00:48:19,960 --> 00:48:24,400 Speaker 1: with Sophie earlier this week, and I have a perhaps unpopular 878 00:48:24,400 --> 00:48:27,480 Speaker 1: opinion on this. When I first got on TikTok, specifically, 879 00:48:27,880 --> 00:48:30,360 Speaker 1: I followed a lot of creators who made content that 880 00:48:30,440 --> 00:48:33,000 Speaker 1: was like, this guy is a racist, and this is 881 00:48:33,000 --> 00:48:35,920 Speaker 1: where he works, and I'm gonna call his employer and 882 00:48:35,920 --> 00:48:37,439 Speaker 1: get him fired. And I used to be like, yeah, 883 00:48:37,640 --> 00:48:41,400 Speaker 1: fire that racist. Like, really loved it. As I have, 884 00:48:42,120 --> 00:48:45,640 Speaker 1: I guess, matured, and matured along with the platform, as 885 00:48:45,719 --> 00:48:49,399 Speaker 1: cathartic as that is, I don't think that 886 00:48:49,400 --> 00:48:53,880 Speaker 1: that is an appropriate tactic. And I think that even 887 00:48:53,960 --> 00:48:57,800 Speaker 1: if I am in agreement, like, yeah, this racist shouldn't 888 00:48:57,840 --> 00:49:00,399 Speaker 1: be doing this, and there are definitely exceptions to this, I'm sure, 889 00:49:00,719 --> 00:49:04,960 Speaker 1: but in general, I just know how easy it is 890 00:49:05,040 --> 00:49:09,960 Speaker 1: for bad actors and extremists to weaponize that very same tactic. Right? 891 00:49:10,080 --> 00:49:12,799 Speaker 1: We're gonna coordinate and have all these people call this 892 00:49:12,840 --> 00:49:15,239 Speaker 1: person to get them fired because we didn't like what 893 00:49:15,280 --> 00:49:18,360 Speaker 1: they tweeted. I don't think that that is a tactic 894 00:49:18,600 --> 00:49:21,680 Speaker 1: that makes us smarter, that makes us better, that makes us, 895 00:49:22,040 --> 00:49:24,920 Speaker 1: you know, healthier as a society. So even when I 896 00:49:24,960 --> 00:49:27,680 Speaker 1: see it happening with someone where, ostensibly, I agree, 897 00:49:27,680 --> 00:49:31,120 Speaker 1: you know, they deserve it, I feel like I 898 00:49:31,160 --> 00:49:34,799 Speaker 1: know how easily that tactic can be exploited 899 00:49:34,800 --> 00:49:38,160 Speaker 1: and weaponized by bad actors.
And our first episode actually 900 00:49:38,160 --> 00:49:40,160 Speaker 1: deals with this woman, Adria Richards, who I was talking 901 00:49:40,160 --> 00:49:43,120 Speaker 1: about earlier, who was one of the early targets of that, 902 00:49:43,239 --> 00:49:46,319 Speaker 1: where she tweeted a picture of these two guys at 903 00:49:46,320 --> 00:49:49,040 Speaker 1: a tech conference who were making a crass joke that 904 00:49:49,080 --> 00:49:54,759 Speaker 1: she didn't like, and she got horribly mobbed, where they 905 00:49:54,800 --> 00:49:58,920 Speaker 1: coordinated on sites like 4chan to call her employer. 906 00:49:59,320 --> 00:50:02,359 Speaker 1: They threatened her employer, and rather than supporting her, her 907 00:50:02,400 --> 00:50:04,960 Speaker 1: employer fired her and was like, okay, yeah, you guys 908 00:50:05,000 --> 00:50:07,759 Speaker 1: want her to be fired, she's fired. And I've just 909 00:50:07,840 --> 00:50:10,560 Speaker 1: seen time and time again that that tactic is so 910 00:50:10,640 --> 00:50:14,360 Speaker 1: easily gamified and weaponized and exploited by bad actors that 911 00:50:14,440 --> 00:50:17,080 Speaker 1: I don't think anybody should be engaged in it. Right, 912 00:50:17,160 --> 00:50:20,359 Speaker 1: and that's the other conversation that keeps going 913 00:50:20,360 --> 00:50:23,000 Speaker 1: on in my head, the whole level of doxxing, 914 00:50:23,000 --> 00:50:24,880 Speaker 1: which is, again, kind of 915 00:50:24,880 --> 00:50:27,280 Speaker 1: the entertainment level. And this is where it's getting really 916 00:50:27,320 --> 00:50:29,040 Speaker 1: disgusting to me, where I'm like, okay, we have 917 00:50:29,239 --> 00:50:32,680 Speaker 1: used this as a form of entertainment, to see if 918 00:50:32,880 --> 00:50:36,200 Speaker 1: people can be ruined. Yes, consequences should happen, but there's 919 00:50:36,200 --> 00:50:39,200 Speaker 1: this whole new level where that's entertainment now, and that 920 00:50:39,239 --> 00:50:41,640 Speaker 1: needs to be a conversation. Like, the whole gist of 921 00:50:41,680 --> 00:50:44,480 Speaker 1: the entire conversation is, it starts as a small thing that 922 00:50:44,520 --> 00:50:46,960 Speaker 1: becomes bigger and bigger and then normalized, and then it's 923 00:50:47,000 --> 00:50:49,880 Speaker 1: like we don't realize what we're doing, in that we 924 00:50:50,000 --> 00:50:52,120 Speaker 1: have not only ruined people's lives, but probably a lot 925 00:50:52,120 --> 00:50:55,440 Speaker 1: of innocent people's lives attached to that. And then 926 00:50:55,480 --> 00:50:58,279 Speaker 1: you see that, okay, this has been happening, but these 927 00:50:58,280 --> 00:51:01,480 Speaker 1: platforms like TikTok, which is fairly new, are still allowing 928 00:51:01,520 --> 00:51:04,680 Speaker 1: it to happen. How is this not changing? Exactly. And 929 00:51:04,960 --> 00:51:06,960 Speaker 1: I think we have gotten to a place where 930 00:51:06,960 --> 00:51:10,440 Speaker 1: it's normalized as entertainment, and it's not even, I mean, 931 00:51:10,440 --> 00:51:13,840 Speaker 1: we're talking about pretty serious things, like women running 932 00:51:13,840 --> 00:51:16,920 Speaker 1: for office and being engaged in democratic and civic life.
933 00:51:17,200 --> 00:51:21,560 Speaker 1: Do you remember couch guy on TikTok, the guy who everybody 934 00:51:21,640 --> 00:51:23,840 Speaker 1: was so convinced was cheating on his girlfriend, and it 935 00:51:23,920 --> 00:51:27,400 Speaker 1: was caught on TikTok because his long-distance girlfriend filmed 936 00:51:27,400 --> 00:51:30,399 Speaker 1: the TikTok of her surprising him, and he's not 937 00:51:30,480 --> 00:51:33,960 Speaker 1: reacting the way that maybe you might expect somebody to react, 938 00:51:34,200 --> 00:51:38,600 Speaker 1: and everybody, like, overnight became a body language expert, 939 00:51:38,640 --> 00:51:41,879 Speaker 1: and, like, I saw people on television news. I saw 940 00:51:41,920 --> 00:51:44,960 Speaker 1: a segment where they had a, quote, body language expert 941 00:51:45,680 --> 00:51:49,120 Speaker 1: breaking down his body language. These are strangers, right? And 942 00:51:49,120 --> 00:51:51,120 Speaker 1: so I do think we've gotten to this place where 943 00:51:51,160 --> 00:51:58,560 Speaker 1: we've gotten really comfortable making complete assumptions 944 00:51:58,640 --> 00:52:02,680 Speaker 1: about people we do not know, confidently going on 945 00:52:03,000 --> 00:52:07,880 Speaker 1: big platforms and airing those assumptions, and because of the 946 00:52:07,920 --> 00:52:11,520 Speaker 1: way that algorithms work, those assumptions will get millions of 947 00:52:11,560 --> 00:52:15,600 Speaker 1: people to watch them and then reply with their own assumptions. 948 00:52:15,680 --> 00:52:18,320 Speaker 1: Because of the sort of riff-and-remix 949 00:52:18,360 --> 00:52:20,520 Speaker 1: culture of TikTok, then they'll say, well, actually, I think 950 00:52:20,520 --> 00:52:22,080 Speaker 1: he was cheating like this and not like that. Like, 951 00:52:22,800 --> 00:52:25,520 Speaker 1: it's not healthy, and it's not good for 952 00:52:25,520 --> 00:52:29,400 Speaker 1: our society. The couch guy was such a meme, essentially. 953 00:52:29,640 --> 00:52:31,120 Speaker 1: But you know what, I've seen something similar to that 954 00:52:31,200 --> 00:52:33,960 Speaker 1: just recently on Twitter, and I have no opinion. I 955 00:52:34,080 --> 00:52:38,560 Speaker 1: just was like, wow, okay. Garden lady? Yes, yes, garden lady. 956 00:52:38,640 --> 00:52:43,600 Speaker 1: I was like, wow, what just happened? Yeah, 957 00:52:43,760 --> 00:52:45,640 Speaker 1: we did an episode of There Are No Girls on 958 00:52:45,680 --> 00:52:48,160 Speaker 1: the Internet about this. Go ahead. So if you don't 959 00:52:48,160 --> 00:52:51,440 Speaker 1: know who garden lady is, she is this woman who 960 00:52:51,560 --> 00:52:54,359 Speaker 1: has a garden, and she tweeted something along the lines 961 00:52:54,400 --> 00:52:57,960 Speaker 1: of, like, I start every morning with taking my coffee 962 00:52:57,960 --> 00:53:00,760 Speaker 1: out to my garden with my husband and we talk for hours. 963 00:53:00,880 --> 00:53:04,880 Speaker 1: Never gets old. Love him. So in the morning, the entire 964 00:53:04,920 --> 00:53:09,960 Speaker 1: internet was like, boo, we hate it, we hate it, boo. Tomato, 965 00:53:10,040 --> 00:53:13,160 Speaker 1: tomato, tomato.
Right. Like, it went from, you're being 966 00:53:13,239 --> 00:53:16,520 Speaker 1: ableist, to, you're being classist, and there are 967 00:53:16,520 --> 00:53:18,359 Speaker 1: things where I'm like, I don't know anything about this woman. 968 00:53:18,400 --> 00:53:21,120 Speaker 1: And then someone went on a deep dive into her 969 00:53:21,120 --> 00:53:24,080 Speaker 1: account and said, oh, she's an awful person, keep attacking her. 970 00:53:24,239 --> 00:53:27,600 Speaker 1: And I was just like, what just happened? Yeah. 971 00:53:27,680 --> 00:53:30,400 Speaker 1: And so that was so interesting to me, 972 00:53:30,440 --> 00:53:34,160 Speaker 1: and I feel like it's a great example of, I 973 00:53:34,200 --> 00:53:37,320 Speaker 1: love social media and the Internet, but I don't think millions 974 00:53:37,360 --> 00:53:40,280 Speaker 1: of people were meant to weigh in on the morning 975 00:53:40,480 --> 00:53:43,399 Speaker 1: routine of a stranger in this way. Like, I saw 976 00:53:43,440 --> 00:53:45,680 Speaker 1: the same thing. Or people were like, actually, she's anti-vax, 977 00:53:45,760 --> 00:53:47,959 Speaker 1: and it's like, well, I don't think the people 978 00:53:47,960 --> 00:53:50,719 Speaker 1: who are attacking her are attacking her because they 979 00:53:50,800 --> 00:53:53,520 Speaker 1: knew that; like, maybe she posted that. I mean, I 980 00:53:53,520 --> 00:53:55,560 Speaker 1: can't, you know, confirm or deny that. But 981 00:53:55,600 --> 00:53:57,320 Speaker 1: I was like, I don't know that that's why people 982 00:53:57,320 --> 00:54:01,240 Speaker 1: are attacking her. And, you know, probably 983 00:54:01,280 --> 00:54:04,680 Speaker 1: the wildest response to that tweet I saw was someone 984 00:54:04,800 --> 00:54:07,400 Speaker 1: being like, oh, don't you work? And she was like, 985 00:54:07,440 --> 00:54:09,960 Speaker 1: oh, I own my own business, so I'm able to 986 00:54:10,000 --> 00:54:14,000 Speaker 1: have flexibility in the mornings. And then somebody else replied, oh, 987 00:54:14,080 --> 00:54:17,080 Speaker 1: so you're exploiting the labor of your employees and that's 988 00:54:17,120 --> 00:54:19,600 Speaker 1: why you're able to have these nice mornings, terrible. And 989 00:54:19,640 --> 00:54:22,600 Speaker 1: she was like, I'm the sole employee of my small business. 990 00:54:22,760 --> 00:54:25,200 Speaker 1: And so, just, like, layers and layers and layers of 991 00:54:25,239 --> 00:54:28,359 Speaker 1: assumed bull about somebody they don't even know. But that's 992 00:54:28,400 --> 00:54:31,839 Speaker 1: the thing is, it becomes this really, really tempestuous, 993 00:54:32,040 --> 00:54:35,400 Speaker 1: like, ground of, okay, her making an innocuous statement 994 00:54:35,400 --> 00:54:38,680 Speaker 1: about how she loved this moment with her husband has 995 00:54:38,719 --> 00:54:42,160 Speaker 1: made it so some people have weaponized it against her, telling 996 00:54:42,160 --> 00:54:45,560 Speaker 1: her she's all these things, and you're like, what? How 997 00:54:45,600 --> 00:54:49,360 Speaker 1: did we get here? Why are we here? And what 998 00:54:49,600 --> 00:54:53,719 Speaker 1: is this platform? Because, like, what is this platform? 999 00:54:53,840 --> 00:54:55,960 Speaker 1: Like, why am I being told that I need to 1000 00:54:55,960 --> 00:54:59,160 Speaker 1: care about this stranger? Why is it being surfaced 1001 00:54:59,160 --> 00:55:01,799 Speaker 1: to me?
What is going on that this is even 1002 00:55:01,840 --> 00:55:04,000 Speaker 1: something that I am aware of, like the morning routine 1003 00:55:04,000 --> 00:55:07,120 Speaker 1: of this person that I will probably never even meet? Yeah, 1004 00:55:07,280 --> 00:55:09,480 Speaker 1: I feel like so many things. One of my big 1005 00:55:09,480 --> 00:55:12,680 Speaker 1: concerns about social media and the internet now is, you know, 1006 00:55:12,760 --> 00:55:16,360 Speaker 1: internet literacy and context. I feel like we've thrown context 1007 00:55:16,360 --> 00:55:19,520 Speaker 1: out the window. But also just that kind of idea, 1008 00:55:20,560 --> 00:55:25,880 Speaker 1: and this is a lighter example of this, of, oh, 1009 00:55:25,920 --> 00:55:27,480 Speaker 1: I've got to have an opinion, I've gotta have a 1010 00:55:27,480 --> 00:55:29,360 Speaker 1: stance and I'm going to say something, and then you 1011 00:55:29,440 --> 00:55:32,359 Speaker 1: don't do any other research into it. And then 1012 00:55:32,360 --> 00:55:36,680 Speaker 1: when you add in misinformation and disinformation, it just 1013 00:55:36,719 --> 00:55:38,400 Speaker 1: frightens me, to be honest. Like, I keep seeing all 1014 00:55:38,440 --> 00:55:43,120 Speaker 1: these stories about, oh, this image was fake, and I 1015 00:55:43,120 --> 00:55:45,279 Speaker 1: haven't even seen the image in question, but I know 1016 00:55:45,320 --> 00:55:47,399 Speaker 1: that a lot of people saw it and they believed it, 1017 00:55:47,719 --> 00:55:53,320 Speaker 1: and that gets my heart racing. Same. 1018 00:55:53,840 --> 00:55:56,439 Speaker 1: Oh my gosh. And I really resonate with your point 1019 00:55:56,440 --> 00:55:59,440 Speaker 1: about having to have a take. I feel like, 1020 00:55:59,640 --> 00:56:03,640 Speaker 1: the way social media functions, both the algorithmic nature of 1021 00:56:03,680 --> 00:56:06,360 Speaker 1: social media platforms and the speed at which social media 1022 00:56:06,360 --> 00:56:10,319 Speaker 1: platforms tend to move, I think it incentivizes people to 1023 00:56:10,440 --> 00:56:12,399 Speaker 1: feel like they have to have a take and they 1024 00:56:12,400 --> 00:56:15,960 Speaker 1: have to have it right away. In reality, it's okay 1025 00:56:16,000 --> 00:56:18,640 Speaker 1: to not have a take. It's okay to be like, 1026 00:56:18,760 --> 00:56:21,400 Speaker 1: I don't really know anything about this; me putting my 1027 00:56:21,440 --> 00:56:23,360 Speaker 1: opinion or my voice out there on an issue that 1028 00:56:23,400 --> 00:56:25,400 Speaker 1: I don't know about is not going to be helpful, 1029 00:56:25,440 --> 00:56:28,080 Speaker 1: it's not going to add to the conversation, I'll just listen, right? 1030 00:56:28,120 --> 00:56:31,880 Speaker 1: I don't think that social media platforms incentivize just taking 1031 00:56:31,880 --> 00:56:33,880 Speaker 1: a step back, or even just taking a minute 1032 00:56:33,880 --> 00:56:35,839 Speaker 1: to figure out what you want to say. It's 1033 00:56:35,840 --> 00:56:37,399 Speaker 1: like, it has to be quick, it has to be now, 1034 00:56:37,480 --> 00:56:40,840 Speaker 1: you have to reply, get that engagement, and that doesn't 1035 00:56:40,840 --> 00:56:45,720 Speaker 1: serve anybody. I don't think that social media platforms incentivize 1036 00:56:46,440 --> 00:56:50,439 Speaker 1: us to be thoughtful or, you know, our best selves there.
1037 00:56:50,440 --> 00:56:54,040 Speaker 1: I think, for instance, when the Queen died, 1038 00:56:54,920 --> 00:56:56,960 Speaker 1: I don't really know anything about the monarchy. I'm not 1039 00:56:56,960 --> 00:56:59,319 Speaker 1: really invested in it. 1040 00:56:59,440 --> 00:57:01,160 Speaker 1: And so I'm not someone that you 1041 00:57:01,200 --> 00:57:03,239 Speaker 1: need to hear from on it, right? Like, I was like, 1042 00:57:03,719 --> 00:57:06,560 Speaker 1: there are people who are very invested in this. Let 1043 00:57:06,600 --> 00:57:08,759 Speaker 1: them have the spotlight of saying what they want 1044 00:57:08,760 --> 00:57:12,040 Speaker 1: to say. I think that it's okay to not weigh in, 1045 00:57:12,160 --> 00:57:13,759 Speaker 1: it's okay to not have a hot take. I don't 1046 00:57:13,800 --> 00:57:17,160 Speaker 1: think social media platforms incentivize or encourage us to just 1047 00:57:17,680 --> 00:57:20,680 Speaker 1: listen sometimes. Yeah, and I think we've seen that seep 1048 00:57:20,760 --> 00:57:23,480 Speaker 1: into our daily lives and our politics, because I just 1049 00:57:23,520 --> 00:57:25,920 Speaker 1: see that with politicians all the time, and then that 1050 00:57:26,040 --> 00:57:28,320 Speaker 1: muddies the whole conversation, because, and I 1051 00:57:28,320 --> 00:57:30,360 Speaker 1: don't know anything about this garden lady, but if you 1052 00:57:30,440 --> 00:57:33,200 Speaker 1: have this whole conversation happening, it's distracting from the very 1053 00:57:33,280 --> 00:57:39,560 Speaker 1: real context and information we have about politicians who are doing 1054 00:57:39,600 --> 00:57:45,800 Speaker 1: these things. So I think, I know that this 1055 00:57:45,880 --> 00:57:49,800 Speaker 1: podcast you have coming out is so needed, and 1056 00:57:49,840 --> 00:57:53,320 Speaker 1: I'm so, so excited to hear it. If you 1057 00:57:53,360 --> 00:57:56,840 Speaker 1: can tell us more about it? Yeah, I mean, I 1058 00:57:57,360 --> 00:58:03,920 Speaker 1: feel like I've got y'all, and y'all have got me, obviously. 1059 00:58:04,240 --> 00:58:06,480 Speaker 1: So I think it does all relate. And 1060 00:58:06,520 --> 00:58:08,960 Speaker 1: I think that what we're seeing now is these tried 1061 00:58:09,000 --> 00:58:12,680 Speaker 1: and true tactics of online harassment becoming an animating and 1062 00:58:12,800 --> 00:58:17,440 Speaker 1: normalized feature of our political landscape and discourse. And, you know, 1063 00:58:17,560 --> 00:58:20,560 Speaker 1: there's this phrase that black women are, quote, canaries in 1064 00:58:20,600 --> 00:58:23,320 Speaker 1: the coal mine for online harms, because first it happens 1065 00:58:23,360 --> 00:58:25,919 Speaker 1: to us, and then it happens to everyone, and bad 1066 00:58:25,920 --> 00:58:29,400 Speaker 1: actors are just continuing to use these same tactics they 1067 00:58:29,440 --> 00:58:32,280 Speaker 1: were able to perfect on black women on others. And 1068 00:58:32,320 --> 00:58:35,720 Speaker 1: so I think that we need to have leadership, people 1069 00:58:35,760 --> 00:58:38,800 Speaker 1: with power, you know, elected officials, policymakers, and tech 1070 00:58:38,800 --> 00:58:42,600 Speaker 1: companies, listening to black women so that we can actually 1071 00:58:42,760 --> 00:58:46,000 Speaker 1: get a handle on this.
And you know, as we 1072 00:58:46,000 --> 00:58:47,840 Speaker 1: were sort of talking about, I think that our Internet 1073 00:58:48,160 --> 00:58:52,160 Speaker 1: and our social media platforms have become so weaponized. You know, 1074 00:58:52,200 --> 00:58:55,440 Speaker 1: it's nearly impossible to have a meaningful conversation or meaningful 1075 00:58:55,480 --> 00:59:00,400 Speaker 1: discourse on our communication platforms, because the leaders who 1076 00:59:00,440 --> 00:59:05,120 Speaker 1: run those platforms have basically incentivized and amplified lies and 1077 00:59:05,240 --> 00:59:08,840 Speaker 1: grifts and scams and extremism. We're more polarized than ever, 1078 00:59:08,960 --> 00:59:12,480 Speaker 1: and grifters know this and are basically running the show. 1079 00:59:12,960 --> 00:59:15,040 Speaker 1: And I think if we're ever planning on getting a 1080 00:59:15,080 --> 00:59:17,320 Speaker 1: handle on this, if we're ever planning on doing something 1081 00:59:17,320 --> 00:59:21,080 Speaker 1: about this, the first step has to be talking honestly 1082 00:59:21,120 --> 00:59:23,960 Speaker 1: about it. And so that's why I'm doing the podcast 1083 00:59:23,960 --> 00:59:26,960 Speaker 1: with Cool Zone Media called Internet Hate Machine. And 1084 00:59:27,440 --> 00:59:30,280 Speaker 1: I really want to chart the history of online harassment 1085 00:59:30,360 --> 00:59:33,800 Speaker 1: against women of color specifically, and how it has led 1086 00:59:33,880 --> 00:59:37,000 Speaker 1: us to this current political and social moment. And so 1087 00:59:37,600 --> 00:59:40,160 Speaker 1: I hope that folks will listen. It's something that I 1088 00:59:40,160 --> 00:59:42,160 Speaker 1: think is really timely right now as we go into 1089 00:59:42,160 --> 00:59:44,200 Speaker 1: the midterms. I think it will be timely for a 1090 00:59:44,240 --> 00:59:47,760 Speaker 1: long time as we determine what we want our political 1091 00:59:47,760 --> 00:59:50,919 Speaker 1: and social landscape to look like. And it's important. I'm 1092 00:59:50,920 --> 00:59:53,400 Speaker 1: so excited, because it definitely needs to be a bigger 1093 00:59:53,400 --> 00:59:55,720 Speaker 1: part of the conversation as we look at what our 1094 00:59:55,760 --> 00:59:59,320 Speaker 1: democracy looks like in the future, in the near future. 1095 00:59:59,440 --> 01:00:02,840 Speaker 1: And not just for the United 1096 01:00:02,880 --> 01:00:05,160 Speaker 1: States but all around the world, because we're seeing similar 1097 01:00:05,240 --> 01:00:08,440 Speaker 1: things happen on different timelines. And I feel like you're 1098 01:00:08,480 --> 01:00:10,560 Speaker 1: going to be able to bring in such good perspective, 1099 01:00:10,880 --> 01:00:13,280 Speaker 1: showing the present, the past, and possibly the 1100 01:00:13,280 --> 01:00:15,120 Speaker 1: future with all of this. So thank you, and I'm 1101 01:00:15,200 --> 01:00:18,120 Speaker 1: excited to listen, especially to that first one. Yeah, please check 1102 01:00:18,160 --> 01:00:21,280 Speaker 1: it out. Our trailer and episode zero are already live, 1103 01:00:21,320 --> 01:00:23,720 Speaker 1: so you can go to Internet Hate Machine wherever 1104 01:00:23,760 --> 01:00:26,920 Speaker 1: you find your podcasts, Spotify, Apple Podcasts. You know how 1105 01:00:26,960 --> 01:00:28,440 Speaker 1: to listen.
This is a podcast, so you 1106 01:00:28,480 --> 01:00:30,240 Speaker 1: know how to find podcasts if you're listening to this. 1107 01:00:30,640 --> 01:00:33,520 Speaker 1: So episode zero is already live, but our first 1108 01:00:33,560 --> 01:00:36,800 Speaker 1: official episode will drop on November second, so please check 1109 01:00:36,840 --> 01:00:40,720 Speaker 1: it out. Yes, go subscribe. Please support it. It's 1110 01:00:40,720 --> 01:00:43,440 Speaker 1: amazing work. Can't wait to hear it. Where else can 1111 01:00:43,480 --> 01:00:45,840 Speaker 1: the listeners find you, Bridget? Well, you can find me 1112 01:00:46,000 --> 01:00:49,520 Speaker 1: on my weekly podcast about women and the Internet called 1113 01:00:49,560 --> 01:00:52,439 Speaker 1: There Are No Girls on the Internet. You can definitely find 1114 01:00:52,480 --> 01:00:54,640 Speaker 1: that if you're listening to this podcast. That's not 1115 01:00:54,680 --> 01:00:56,800 Speaker 1: going anywhere, so you can listen to them both. You 1116 01:00:56,840 --> 01:00:59,280 Speaker 1: can find me on Instagram at BridgetMarieInDC, 1117 01:00:59,640 --> 01:01:02,600 Speaker 1: or, yes, I'm still gonna be on Twitter, Elon Musk 1118 01:01:02,640 --> 01:01:04,919 Speaker 1: be damned. You can find 1119 01:01:04,920 --> 01:01:08,200 Speaker 1: me on Twitter at BridgetMarie. Yes, and it is 1120 01:01:08,280 --> 01:01:12,560 Speaker 1: always such a delight to have you, Bridget. Wishing 1121 01:01:12,960 --> 01:01:15,600 Speaker 1: you and everybody all the best as we enter into 1122 01:01:15,640 --> 01:01:19,880 Speaker 1: this final stretch of the midterms. And also, listeners, you 1123 01:01:19,960 --> 01:01:22,560 Speaker 1: might have heard a cameo from Bridget's cat, which was, 1124 01:01:28,840 --> 01:01:32,600 Speaker 1: we needed it. She's annoyed with me even though she doesn't 1125 01:01:32,600 --> 01:01:34,600 Speaker 1: have a job and, like, has everything and is treated 1126 01:01:34,640 --> 01:01:37,360 Speaker 1: like my queen. I'm sure she would be like, oh, 1127 01:01:37,400 --> 01:01:44,040 Speaker 1: I have a job. Yes. I cannot wait till 1128 01:01:44,120 --> 01:01:46,400 Speaker 1: next time. Thank you so much again, Bridget. If you 1129 01:01:46,400 --> 01:01:48,760 Speaker 1: would like to find us, you can email 1130 01:01:48,800 --> 01:01:50,600 Speaker 1: us at Stuff Media Mom Stuff at iHeartMedia. You 1131 01:01:50,640 --> 01:01:52,320 Speaker 1: can find us on Twitter at Mom Stuff Podcast, or on 1132 01:01:52,440 --> 01:01:54,360 Speaker 1: Instagram at Stuff We Never Told You. Thanks as always 1133 01:01:54,360 --> 01:01:57,240 Speaker 1: to our super producer Christina. Thank you, Christina. Thanks to 1134 01:01:57,280 --> 01:01:59,400 Speaker 1: you for listening. Stuff Mom Never Told You is a production of iHeart 1135 01:01:59,440 --> 01:02:01,200 Speaker 1: Radio. For more podcasts from iHeartRadio, you can check 1136 01:02:01,200 --> 01:02:03,280 Speaker 1: out the iHeartRadio app, Apple Podcasts, or wherever you listen 1137 01:02:03,320 --> 01:02:07,440 Speaker 1: to your favorite shows.