1 00:00:05,200 --> 00:00:07,400 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff 2 00:00:07,440 --> 00:00:18,960 Speaker 1: Mom Never Told You, a production of iHeartRadio, and in 3 00:00:19,000 --> 00:00:23,200 Speaker 1: our continuing end of the year favorite episodes wrap up 4 00:00:23,320 --> 00:00:27,360 Speaker 1: bring back, we're also choosing two of our favorite Bridget episodes. 5 00:00:27,400 --> 00:00:31,760 Speaker 1: Those are so wonderful they're hard to choose from. One 6 00:00:31,800 --> 00:00:34,720 Speaker 1: that really stuck out to me was the one we 7 00:00:34,800 --> 00:00:40,680 Speaker 1: did about Facebook's abortion blackout, where we were talking about 8 00:00:42,280 --> 00:00:45,760 Speaker 1: what Facebook was doing. They're saying they're doing free expression, 9 00:00:46,520 --> 00:00:50,520 Speaker 1: but in fact they're not doing that at all and 10 00:00:50,640 --> 00:00:56,040 Speaker 1: not doing anything with transparency, and it's just such a relevant, 11 00:00:56,520 --> 00:01:03,600 Speaker 1: unfortunately relevant conversation, and it's just really, I think it's 12 00:01:03,600 --> 00:01:06,880 Speaker 1: a good reminder that we need to be aware of 13 00:01:06,920 --> 00:01:11,760 Speaker 1: what these companies are doing, and just your existence and 14 00:01:11,840 --> 00:01:18,560 Speaker 1: data and how they operate in that space. So yeah, 15 00:01:19,000 --> 00:01:21,600 Speaker 1: please enjoy this classic episode. 16 00:01:27,240 --> 00:01:28,840 Speaker 2: Hey, this is Annie and Samantha. 17 00:01:28,920 --> 00:01:31,080 Speaker 1: And welcome to Stuff Mom Never Told You, a production of iHeartRadio. 18 00:01:40,600 --> 00:01:42,520 Speaker 1: And today we are once again thrilled to be joined 19 00:01:42,560 --> 00:01:45,160 Speaker 1: by the fabulous, the fantastic Bridget Todd. 20 00:01:45,400 --> 00:01:48,440 Speaker 3: Welcome Bridget, thank you for having me back. I'm so 21 00:01:48,520 --> 00:01:49,320 Speaker 3: excited to be here. 22 00:01:49,520 --> 00:01:52,320 Speaker 1: We're so excited to have you always, but yes, we 23 00:01:52,440 --> 00:01:57,560 Speaker 1: were discussing beforehand, there's a lot for us that we 24 00:01:57,600 --> 00:02:02,000 Speaker 1: would like your expertise on. So as always, we appreciate you 25 00:02:02,080 --> 00:02:04,720 Speaker 1: coming on. There's so much going on right now. That 26 00:02:04,880 --> 00:02:06,800 Speaker 1: being said, how are you, Bridget? 27 00:02:06,960 --> 00:02:09,240 Speaker 3: You know, this week was, I don't know if you 28 00:02:09,280 --> 00:02:11,440 Speaker 3: all feel it, I feel like the last few weeks 29 00:02:11,440 --> 00:02:15,880 Speaker 3: have been pretty rough. It feels like things are different. 30 00:02:15,919 --> 00:02:18,680 Speaker 3: It just feels like a shift in the air, which 31 00:02:18,720 --> 00:02:21,800 Speaker 3: can be hard to exist in, let alone make content 32 00:02:21,960 --> 00:02:26,760 Speaker 3: in, that doesn't make people fearful and want to check out, 33 00:02:26,800 --> 00:02:28,840 Speaker 3: which I think maybe we're all kind of navigating. 34 00:02:29,320 --> 00:02:31,400 Speaker 4: Yeah, is that something that resonates with you two? 35 00:02:32,160 --> 00:02:36,560 Speaker 1: Yes, ah, yes. It's, we make a lot of content 36 00:02:37,120 --> 00:02:39,480 Speaker 1: and we do try to mix things up, but we 37 00:02:39,520 --> 00:02:42,400 Speaker 1: also don't want to make more things.
But I mean, 38 00:02:42,520 --> 00:02:45,680 Speaker 1: for instance, yesterday I just had a lot of trouble 39 00:02:45,720 --> 00:02:48,360 Speaker 1: concentrating on my work because I was thinking about all 40 00:02:48,360 --> 00:02:51,359 Speaker 1: this other stuff. I'm like, what's going on in the world? 41 00:02:51,440 --> 00:02:54,639 Speaker 1: What can I do? Which, I've always maintained that if 42 00:02:54,680 --> 00:02:57,160 Speaker 1: you want to get more productivity out of people, then 43 00:02:57,200 --> 00:03:01,040 Speaker 1: they're going about it the completely wrong way. But that's just 44 00:03:01,560 --> 00:03:05,839 Speaker 1: a very small personal gripe of mine. Uh, but yeah, yeah, 45 00:03:05,919 --> 00:03:08,359 Speaker 1: it's, it's been, it's been difficult. 46 00:03:09,120 --> 00:03:13,079 Speaker 2: You know what I've discovered is these tiny little coloring 47 00:03:13,120 --> 00:03:16,240 Speaker 2: books that are really, like, easy coloring, so it's not 48 00:03:16,280 --> 00:03:19,400 Speaker 2: detailed, because I've discovered, as much as I like to 49 00:03:19,480 --> 00:03:22,480 Speaker 2: do those things, I cannot stay within the lines. And 50 00:03:22,480 --> 00:03:25,800 Speaker 2: when they get really, like, fancy, the adult coloring book versions, 51 00:03:25,800 --> 00:03:27,600 Speaker 2: like, they get fancy and you have to do the shading, 52 00:03:27,600 --> 00:03:30,240 Speaker 2: and like, what is this? Why do I have so 53 00:03:30,320 --> 00:03:33,200 Speaker 2: many things to color? So I've discovered these tiny ones 54 00:03:33,240 --> 00:03:37,120 Speaker 2: that are just, like, cutesy large pictures of, like, a milkshake. 55 00:03:37,960 --> 00:03:40,160 Speaker 2: It has been really nice that I can just check out, 56 00:03:40,520 --> 00:03:43,320 Speaker 2: stay inside the lines, and color with a marker, like 57 00:03:43,360 --> 00:03:46,080 Speaker 2: those little, like, you know, paint-like markers. 58 00:03:46,040 --> 00:03:49,280 Speaker 5: That's really satisfying. Nice to, like, zone 59 00:03:49,040 --> 00:03:53,280 Speaker 3: out. Sam, you are speaking my language, because, a good, 60 00:03:53,320 --> 00:03:55,680 Speaker 3: I have a, I've spent so much money on this, 61 00:03:55,840 --> 00:04:01,160 Speaker 3: but alcohol markers? Like, there's nothing quite like a good marker, 62 00:04:02,280 --> 00:04:04,320 Speaker 3: a new set of markers. You're like, this is gonna 63 00:04:04,360 --> 00:04:08,200 Speaker 3: change everything. My future starts today. I've got this new 64 00:04:08,240 --> 00:04:10,720 Speaker 3: set of markers. Yes, and the ones 65 00:04:10,760 --> 00:04:15,360 Speaker 3: that write really well are so satisfying. I 66 00:04:15,400 --> 00:04:18,520 Speaker 3: know I sound like a crazy person, but genuinely, the 67 00:04:19,200 --> 00:04:23,480 Speaker 3: appeal of finding the right set of markers can change everything. 68 00:04:23,600 --> 00:04:27,480 Speaker 2: Oh no, I am obsessed with pens, well-written 69 00:04:27,640 --> 00:04:30,159 Speaker 2: fine point pens, like, it has to be fine point, 70 00:04:30,200 --> 00:04:32,680 Speaker 2: like, the Chinese and Japanese have a market because they 71 00:04:32,680 --> 00:04:35,839 Speaker 2: have some of the best, like, nicely flowing pens. 72 00:04:36,160 --> 00:04:37,800 Speaker 5: Even though my, my handwriting
73 00:04:37,480 --> 00:04:40,240 Speaker 2: is really, really bad, but I love the feel of 74 00:04:40,320 --> 00:04:43,039 Speaker 2: like a nice smooth write. But with these, like, the 75 00:04:43,160 --> 00:04:46,440 Speaker 2: alcohol markers, as you're talking about, the only problem I 76 00:04:46,480 --> 00:04:49,320 Speaker 2: have is that the color that they say they 77 00:04:49,360 --> 00:04:53,640 Speaker 2: represent on the cap doesn't actually, like, translate in the marking. 78 00:04:53,720 --> 00:04:55,440 Speaker 5: So I'm like, oh, this is a yellow, and it 79 00:04:55,480 --> 00:04:56,080 Speaker 5: turns orange. 80 00:04:56,120 --> 00:04:59,440 Speaker 2: I'm like, wait, this is, this is not, but it 81 00:04:59,520 --> 00:05:02,080 Speaker 2: is very smooth and it is very satisfying because it 82 00:05:02,080 --> 00:05:05,440 Speaker 2: fills all the lines, and you're like, yes, I'm a professional 83 00:05:04,920 --> 00:05:05,840 Speaker 4: colorer, of course. 84 00:05:06,120 --> 00:05:08,400 Speaker 3: This is how, this is how seriously 85 00:05:08,440 --> 00:05:10,400 Speaker 3: I take this. When I get a new set of markers, 86 00:05:10,680 --> 00:05:12,599 Speaker 3: the first thing I do is a little color swatch 87 00:05:12,680 --> 00:05:15,360 Speaker 3: to see, oh, they say orange, let's see what their 88 00:05:15,400 --> 00:05:16,159 Speaker 3: orange looks like. 89 00:05:16,320 --> 00:05:18,080 Speaker 4: Just so I know, no surprises. 90 00:05:18,680 --> 00:05:19,120 Speaker 5: You know what? 91 00:05:19,160 --> 00:05:21,960 Speaker 2: That's good to know. That is great advice as a newly 92 00:05:22,440 --> 00:05:24,560 Speaker 2: marker-purchasing person. 93 00:05:24,880 --> 00:05:25,360 Speaker 4: So thank you. 94 00:05:25,360 --> 00:05:26,760 Speaker 2: I'm gonna have to, I'ma have to get some, like, 95 00:05:27,160 --> 00:05:29,040 Speaker 2: scrap paper just so I can do that. 96 00:05:29,800 --> 00:05:31,440 Speaker 3: This is, we can do a whole episode on this. 97 00:05:31,520 --> 00:05:35,560 Speaker 3: Let's just talk pens, markers, my faves. It gives me 98 00:05:35,600 --> 00:05:37,840 Speaker 3: head tingles even just talking about it. I love 99 00:05:37,839 --> 00:05:38,400 Speaker 3: it so much. 100 00:05:39,160 --> 00:05:41,760 Speaker 2: So since I did the, like, pre-show "everything's the 101 00:05:41,800 --> 00:05:43,800 Speaker 2: worst," I have to bring in, this is a solution, 102 00:05:43,920 --> 00:05:48,240 Speaker 2: and it's coloring, small coloring books with wonderful markers. 103 00:05:48,360 --> 00:05:53,599 Speaker 3: Yes, yeah, that can be the antidote to our troubling times. 104 00:05:53,960 --> 00:05:57,400 Speaker 3: Have you considered just diving headfirst into the world of 105 00:05:57,440 --> 00:06:01,200 Speaker 3: markers and penmanship and calligraphy and coloring? 106 00:06:01,880 --> 00:06:02,760 Speaker 1: It is quite nice. 107 00:06:02,800 --> 00:06:03,640 Speaker 5: It is quite nice. 108 00:06:03,800 --> 00:06:04,280 Speaker 4: Annie. 109 00:06:05,000 --> 00:06:07,000 Speaker 2: I will leave some for you. I know you're about 110 00:06:07,040 --> 00:06:08,719 Speaker 2: to house sit for me, so I will leave some 111 00:06:08,839 --> 00:06:10,320 Speaker 2: for you to try out yourself. 112 00:06:10,760 --> 00:06:14,080 Speaker 1: You know, I have, I have two main thoughts from this.
113 00:06:14,279 --> 00:06:18,479 Speaker 1: One is that I hate when this happens, where your 114 00:06:18,520 --> 00:06:20,600 Speaker 1: birthday is coming up, Samantha, and I wish I had 115 00:06:20,640 --> 00:06:23,159 Speaker 1: known this earlier. Oh no, that would have been a 116 00:06:23,160 --> 00:06:26,800 Speaker 1: great gift. But then I think, Bridget, you should come 117 00:06:26,839 --> 00:06:30,560 Speaker 1: on one time and let us talk about something that's 118 00:06:30,640 --> 00:06:34,120 Speaker 1: not so stressful. Let's, let's give, let's give ourselves a 119 00:06:34,120 --> 00:06:36,000 Speaker 1: little break. We can talk about markers. I know you 120 00:06:36,279 --> 00:06:39,360 Speaker 1: mentioned, like, reality TV. We could have a whole thing 121 00:06:40,320 --> 00:06:41,440 Speaker 1: where it's not something. 122 00:06:41,600 --> 00:06:44,960 Speaker 2: Since you do all the, like, dark stuff, we 123 00:06:44,960 --> 00:06:47,360 Speaker 2: should let you come in with the joys that 124 00:06:47,400 --> 00:06:47,640 Speaker 2: you have. 125 00:06:47,680 --> 00:06:50,719 Speaker 5: Because we just talked about the Adirondacks and everything. 126 00:06:51,560 --> 00:06:54,400 Speaker 3: Yeah, I feel like people who listen to my content 127 00:06:54,600 --> 00:06:58,640 Speaker 3: might not know that I experience joy. I have. I have. 128 00:06:59,800 --> 00:07:00,719 Speaker 4: The only things 129 00:07:00,560 --> 00:07:04,400 Speaker 3: I talk about are not the crushing weight of fascism. 130 00:07:05,480 --> 00:07:09,240 Speaker 4: I also enjoy reality television. I think this would be fun. 131 00:07:09,279 --> 00:07:13,800 Speaker 1: I think we should look at it. I like it. Oh, 132 00:07:13,840 --> 00:07:18,040 Speaker 1: but unfortunately, we're not doing that today. This 133 00:07:18,120 --> 00:07:23,400 Speaker 1: is also, the timing is interesting, because Samantha and 134 00:07:23,440 --> 00:07:27,040 Speaker 1: I are working on an update on CPCs, crisis pregnancy centers. 135 00:07:27,480 --> 00:07:28,520 Speaker 4: I know them very well. 136 00:07:29,120 --> 00:07:32,440 Speaker 1: Yes, and we, in our research, ran into a lot 137 00:07:32,480 --> 00:07:36,720 Speaker 1: of stuff about how tech companies were basically paying for 138 00:07:36,760 --> 00:07:40,400 Speaker 1: them to advertise, or accepting their money and being misleading 139 00:07:40,440 --> 00:07:44,920 Speaker 1: about things. So this is very much related. What are 140 00:07:44,960 --> 00:07:47,040 Speaker 1: we talking about today, Bridget? 141 00:07:46,720 --> 00:07:49,200 Speaker 4: Well, that is such a good transition, because it's all related. 142 00:07:49,720 --> 00:07:53,280 Speaker 3: But tech companies really have put their thumbs on the 143 00:07:53,320 --> 00:07:56,160 Speaker 3: scale when it comes to being able to get accurate 144 00:07:56,280 --> 00:08:00,000 Speaker 3: content about abortion on social media. Your point about crisis 145 00:08:00,040 --> 00:08:03,440 Speaker 3: pregnancy centers and the way that Google essentially is 146 00:08:03,480 --> 00:08:07,240 Speaker 3: like paying an advertising network for them to exist is 147 00:08:07,280 --> 00:08:09,880 Speaker 3: a great example. But today I really wanted to talk 148 00:08:09,920 --> 00:08:15,040 Speaker 3: about how social media is heavily moderating, and even in 149 00:08:15,080 --> 00:08:19,440 Speaker 3: some cases, like, suppressing and censoring content about abortion.
I 150 00:08:19,440 --> 00:08:21,760 Speaker 3: think that we all talked about this back in January, 151 00:08:21,760 --> 00:08:25,239 Speaker 3: but do y'all remember when Mark Zuckerberg had that moment 152 00:08:25,280 --> 00:08:27,320 Speaker 3: that people sort of talked about as his mask off 153 00:08:27,400 --> 00:08:30,200 Speaker 3: moment back in January, when Trump came back into office? 154 00:08:30,240 --> 00:08:32,760 Speaker 3: I think that we were talking about how he really 155 00:08:32,840 --> 00:08:37,040 Speaker 3: started dressing like a divorced nightclub promoter and was saying 156 00:08:37,080 --> 00:08:39,520 Speaker 3: things like, oh, we're taking the tampons out of the 157 00:08:39,679 --> 00:08:43,680 Speaker 3: washrooms here at Facebook HQ, just really 158 00:08:43,880 --> 00:08:46,400 Speaker 3: was sort of having a moment where he was saying 159 00:08:46,440 --> 00:08:47,079 Speaker 3: a lot of things. 160 00:08:47,120 --> 00:08:48,120 Speaker 4: Do you remember this? 161 00:08:49,040 --> 00:08:52,559 Speaker 1: Oh yes, oh yes. He was, like, he leaned in hard, 162 00:08:52,800 --> 00:08:53,880 Speaker 1: he leaned in hard. 163 00:08:54,720 --> 00:08:57,360 Speaker 5: He's been waiting for this moment. And yes, oh my gosh. 164 00:08:57,080 --> 00:08:57,679 Speaker 4: You could tell. 165 00:08:58,000 --> 00:09:00,439 Speaker 3: I mean, I also, I almost quibbled when people were like, oh, 166 00:09:00,440 --> 00:09:03,480 Speaker 3: it's his mask off moment, because I don't think that 167 00:09:03,600 --> 00:09:06,719 Speaker 3: Mark Zuckerberg has any kind of, like, I don't, I 168 00:09:06,760 --> 00:09:09,360 Speaker 3: wouldn't call it a mask off moment, because I think 169 00:09:09,400 --> 00:09:12,360 Speaker 3: that he is the definition of a hollow, empty person, 170 00:09:12,720 --> 00:09:15,360 Speaker 3: and so I think he is the mask. 171 00:09:15,640 --> 00:09:16,959 Speaker 4: He will say anything. 172 00:09:17,040 --> 00:09:19,840 Speaker 3: I think that he has no, he's, I'm honestly fascinated 173 00:09:19,840 --> 00:09:21,400 Speaker 3: by him as a tech leader, because I think that 174 00:09:21,440 --> 00:09:25,800 Speaker 3: he has no values, scruples, morals, there's just nothing. He 175 00:09:25,880 --> 00:09:29,800 Speaker 3: will say anything, he will do anything. However the wind blows, 176 00:09:29,920 --> 00:09:32,200 Speaker 3: that's how he will blow. And I don't think it's 177 00:09:32,240 --> 00:09:34,760 Speaker 3: fair to call that a mask off moment when truly, like, 178 00:09:34,880 --> 00:09:37,120 Speaker 3: the mask is not hiding anything. This 179 00:09:37,240 --> 00:09:39,400 Speaker 3: is just genuinely, like, who you are, who you always 180 00:09:39,440 --> 00:09:42,800 Speaker 3: have been, just the soulless person who was waiting to 181 00:09:42,880 --> 00:09:45,000 Speaker 3: see who they should kiss up to and will 182 00:09:45,000 --> 00:09:47,200 Speaker 3: do that if it means holding onto power. 183 00:09:47,679 --> 00:09:49,840 Speaker 2: Right, he was just waiting in the background, like his 184 00:09:50,360 --> 00:09:52,360 Speaker 2: true personality was just waiting in the shadows. 185 00:09:52,360 --> 00:09:54,440 Speaker 4: And then it's like, oh, oh ho, this is my moment. 186 00:09:55,640 --> 00:09:57,360 Speaker 4: And so when that all was going on.
187 00:09:57,720 --> 00:10:00,720 Speaker 3: He also announced that Meta was going to be rolling 188 00:10:00,760 --> 00:10:04,600 Speaker 3: out a community notes feature and scrapping all third party fact 189 00:10:04,640 --> 00:10:07,880 Speaker 3: checking on the platform, because, as he said, it was 190 00:10:07,920 --> 00:10:10,120 Speaker 3: time for the company to get back to their roots 191 00:10:10,120 --> 00:10:12,560 Speaker 3: when it comes to free expression. I will play a 192 00:10:12,559 --> 00:10:15,920 Speaker 3: little bit of a video that he put out talking 193 00:10:15,960 --> 00:10:16,400 Speaker 3: about this. 194 00:10:16,920 --> 00:10:19,920 Speaker 6: Hey, everyone, I want to talk about something important today, 195 00:10:20,120 --> 00:10:22,240 Speaker 6: because it's time to get back to our roots around 196 00:10:22,280 --> 00:10:26,520 Speaker 6: free expression on Facebook and Instagram. I started building social 197 00:10:26,559 --> 00:10:29,000 Speaker 6: media to give people a voice. I gave a speech 198 00:10:29,000 --> 00:10:32,040 Speaker 6: at Georgetown five years ago about the importance of protecting 199 00:10:32,080 --> 00:10:35,440 Speaker 6: free expression, and I still believe this today. But a 200 00:10:35,440 --> 00:10:38,080 Speaker 6: lot has happened over the last several years. There's been 201 00:10:38,120 --> 00:10:42,600 Speaker 6: widespread debate about potential harms from online content. Governments and 202 00:10:42,679 --> 00:10:46,160 Speaker 6: legacy media have pushed to censor more and more. A 203 00:10:46,160 --> 00:10:48,680 Speaker 6: lot of this is clearly political, but there's also a 204 00:10:48,679 --> 00:10:53,280 Speaker 6: lot of legitimately bad stuff out there: drugs, terrorism, child exploitation. 205 00:10:53,800 --> 00:10:56,120 Speaker 6: These are things that we take very seriously, and I 206 00:10:56,120 --> 00:10:57,760 Speaker 6: want to make sure that we handle responsibly. 207 00:10:58,920 --> 00:11:02,800 Speaker 3: So, a lot to say about that. First of all, very convenient 208 00:11:03,040 --> 00:11:06,600 Speaker 3: rewriting of history that frankly wasn't that long ago, 209 00:11:06,679 --> 00:11:09,480 Speaker 3: and that, if you're listening and you're my age, 210 00:11:09,760 --> 00:11:12,800 Speaker 3: you probably remember, because we all know, it is not 211 00:11:12,840 --> 00:11:16,960 Speaker 3: a secret, that Facebook, Mark Zuckerberg created Facebook as a 212 00:11:16,960 --> 00:11:19,520 Speaker 3: college student to rank the looks of the women on 213 00:11:19,600 --> 00:11:22,680 Speaker 3: his college campus. Somehow we sort of let 214 00:11:22,800 --> 00:11:25,480 Speaker 3: him get away with being like, I created Facebook to 215 00:11:25,520 --> 00:11:29,880 Speaker 3: protect free expression. Okay, sure, I don't know. I always 216 00:11:29,920 --> 00:11:31,840 Speaker 3: have to, like, quibble at that, because I guess I 217 00:11:31,840 --> 00:11:34,760 Speaker 3: saw a video where he said, I created Facebook because 218 00:11:34,760 --> 00:11:37,360 Speaker 3: I wanted people to be able to have debates about 219 00:11:37,400 --> 00:11:39,120 Speaker 3: the Iraq war, and it's like, no, you didn't. 220 00:11:39,240 --> 00:11:42,840 Speaker 3: First of all, I was an organizer in the anti 221 00:11:42,880 --> 00:11:46,360 Speaker 3: war movement.
Nobody was communicating on Facebook back then. Like, 222 00:11:46,400 --> 00:11:50,640 Speaker 3: Facebook wasn't for that. So that's just not true. I 223 00:11:50,679 --> 00:11:53,520 Speaker 3: really have a thing where people lie to your face 224 00:11:53,640 --> 00:11:56,480 Speaker 3: about recent history that you remember, that 225 00:11:56,480 --> 00:11:59,320 Speaker 4: you were there for, that you were part of. So that's bullshit. 226 00:11:59,640 --> 00:12:03,520 Speaker 3: But even more than that, he's talking about how the 227 00:12:03,679 --> 00:12:07,120 Speaker 3: content that he really wants to focus on, in terms 228 00:12:07,160 --> 00:12:13,559 Speaker 3: of moderating the platform, is illegal content, right? Child safety harms, 229 00:12:13,760 --> 00:12:18,360 Speaker 3: drug trade, organized criminal activity, all of that. So this 230 00:12:18,400 --> 00:12:20,560 Speaker 3: is when he was really talking about how important it 231 00:12:20,640 --> 00:12:23,120 Speaker 3: was to protect free expression on social media platforms. 232 00:12:23,200 --> 00:12:23,960 Speaker 4: You all might recall that 233 00:12:24,000 --> 00:12:27,520 Speaker 3: around this time he was in the headlines for saying 234 00:12:27,520 --> 00:12:29,520 Speaker 3: that he felt the Biden administration had been trying to 235 00:12:29,559 --> 00:12:33,320 Speaker 3: pressure Facebook into removing COVID misinformation. The White House had 236 00:12:33,320 --> 00:12:36,920 Speaker 3: a different take, saying, quote, when confronted with the deadly pandemic, 237 00:12:37,200 --> 00:12:41,160 Speaker 3: this administration encouraged responsible actions to protect public health and safety. 238 00:12:41,280 --> 00:12:43,800 Speaker 3: Our position has been very clear and consistent. We believe 239 00:12:43,840 --> 00:12:46,760 Speaker 3: tech companies and other private actors should take into account 240 00:12:46,760 --> 00:12:49,200 Speaker 3: the effects their actions have on the American people while 241 00:12:49,200 --> 00:12:53,240 Speaker 3: making independent choices about the information they present. So, you know, 242 00:12:53,679 --> 00:12:56,600 Speaker 3: Zuckerberg in this moment was like, we are not going 243 00:12:56,640 --> 00:13:00,000 Speaker 3: to be moderating political content the way that we have been. 244 00:13:00,520 --> 00:13:03,240 Speaker 3: We are going to lift restrictions on topics that are 245 00:13:03,240 --> 00:13:06,280 Speaker 3: part of mainstream discourse and really just focus on the 246 00:13:06,400 --> 00:13:09,440 Speaker 3: enforcement of illegal and, like, high severity violations. 247 00:13:09,640 --> 00:13:11,920 Speaker 4: So yay for free speech, right? 248 00:13:11,960 --> 00:13:15,600 Speaker 3: That all sounds great. Well, all of that is only 249 00:13:15,640 --> 00:13:18,679 Speaker 3: the case if that part of the mainstream discourse is 250 00:13:18,720 --> 00:13:23,360 Speaker 3: not abortion, which Facebook continues to suppress and moderate quite 251 00:13:23,360 --> 00:13:26,920 Speaker 3: heavily, with zero transparency and zero consistency. 252 00:13:26,960 --> 00:13:28,200 Speaker 4: So it seems like, if 253 00:13:28,120 --> 00:13:32,120 Speaker 3: you're spreading COVID misinformation, well, that is protected speech that 254 00:13:32,240 --> 00:13:34,640 Speaker 3: needs to be left up for freedom.
If you are 255 00:13:34,679 --> 00:13:38,959 Speaker 3: sharing accurate information about abortion that isn't even against Meta's policies, 256 00:13:39,160 --> 00:13:40,200 Speaker 3: they will take it down. 257 00:13:41,000 --> 00:13:44,000 Speaker 1: Yeah, and like you said, without warning or transparency or 258 00:13:44,200 --> 00:13:47,680 Speaker 1: anything, it just is gone. And you might not know why, 259 00:13:47,960 --> 00:13:50,440 Speaker 1: or, well, you could probably figure it out. But one 260 00:13:50,440 --> 00:13:52,280 Speaker 1: of the things that's really frustrating about all of this 261 00:13:53,960 --> 00:13:55,800 Speaker 1: is that, you know, like you said, they're kind of 262 00:13:55,840 --> 00:13:58,480 Speaker 1: lying to our faces, right? Like, they're saying one thing 263 00:13:58,920 --> 00:14:01,440 Speaker 1: and doing something completely different. 264 00:14:02,440 --> 00:14:06,360 Speaker 3: Yeah, that's what really makes me angry about this. You know, 265 00:14:06,400 --> 00:14:08,840 Speaker 3: I cover a lot of tech companies. The thing that 266 00:14:08,880 --> 00:14:11,760 Speaker 3: gets me is when they lie, when they say one 267 00:14:11,840 --> 00:14:15,920 Speaker 3: thing publicly, when they publish something in their rules and 268 00:14:15,960 --> 00:14:18,079 Speaker 3: regulations and policies. You know, no one's putting a gun 269 00:14:18,120 --> 00:14:19,640 Speaker 3: to their head and making them put these things in 270 00:14:19,640 --> 00:14:21,280 Speaker 3: their rules. They put them in their rules, and then 271 00:14:21,320 --> 00:14:24,400 Speaker 3: they do a completely different thing, and then when advocates 272 00:14:24,480 --> 00:14:27,760 Speaker 3: or organizers call them out on it, there's just no, 273 00:14:27,800 --> 00:14:29,800 Speaker 3: they're just like, oops, what are you gonna do? That, 274 00:14:30,000 --> 00:14:33,560 Speaker 3: for some reason, that just really gets me, because they 275 00:14:33,600 --> 00:14:37,520 Speaker 3: are allowed to enjoy all of this positive press of 276 00:14:37,600 --> 00:14:41,160 Speaker 3: putting this thing in their policy and then continue doing 277 00:14:41,240 --> 00:14:45,200 Speaker 3: the shady work of going against that policy. It never, 278 00:14:45,240 --> 00:14:46,880 Speaker 3: it never comes back at them. Like, they're able to 279 00:14:46,880 --> 00:14:48,960 Speaker 3: just do whatever they want while saying one thing and 280 00:14:49,040 --> 00:14:50,640 Speaker 3: doing another. And I just don't feel like they really 281 00:14:50,680 --> 00:14:54,360 Speaker 3: get held accountable. And so a Meta spokesperson said that 282 00:14:54,560 --> 00:14:59,320 Speaker 3: taking down abortion content goes against Meta's own intentions. A 283 00:14:59,360 --> 00:15:02,520 Speaker 3: spokesperson told the New York Times, we want our platforms to 284 00:15:02,560 --> 00:15:05,480 Speaker 3: be a place where people can access reliable information about 285 00:15:05,480 --> 00:15:09,280 Speaker 3: health services, advertisers can promote health services, and everyone can 286 00:15:09,320 --> 00:15:13,000 Speaker 3: discuss and debate public policies in this space. That is 287 00:15:13,040 --> 00:15:16,720 Speaker 3: why we allow posts and ads discussing and debating abortion. 288 00:15:17,200 --> 00:15:19,360 Speaker 3: But they're not doing that at all.
Because the big 289 00:15:19,400 --> 00:15:22,520 Speaker 3: thing to know here is that Meta says one thing 290 00:15:22,520 --> 00:15:25,240 Speaker 3: in their policies and then does a completely different thing 291 00:15:25,600 --> 00:15:28,720 Speaker 3: when it comes to how they are actually moderating abortion content. 292 00:15:29,440 --> 00:15:34,320 Speaker 1: Yeah, and it's, it's so difficult right now to get 293 00:15:34,360 --> 00:15:37,600 Speaker 1: that good information, and there's so much misinformation and disinformation 294 00:15:37,680 --> 00:15:44,440 Speaker 1: out there, and to remove it is just really piling 295 00:15:44,480 --> 00:15:47,120 Speaker 1: onto a problem that really doesn't need any more piling 296 00:15:47,200 --> 00:15:51,560 Speaker 1: onto. It is already really bad, and people are already 297 00:15:51,760 --> 00:15:54,840 Speaker 1: very confused. This is not helping. 298 00:15:56,280 --> 00:16:00,440 Speaker 3: No, that's really good context to set, that, you know, 299 00:16:00,520 --> 00:16:04,760 Speaker 3: we're in a time where the Supreme Court struck down Roe. 300 00:16:05,320 --> 00:16:10,120 Speaker 3: It is so much harder to access accurate information about 301 00:16:10,120 --> 00:16:12,800 Speaker 3: health so that people can make health decisions for themselves. 302 00:16:13,120 --> 00:16:17,160 Speaker 3: And when social media platforms like Facebook put their thumb 303 00:16:17,160 --> 00:16:21,080 Speaker 3: on the scales and make these kinds of moderation decisions 304 00:16:21,120 --> 00:16:24,320 Speaker 3: with no transparency that go against their own stated policy, 305 00:16:24,800 --> 00:16:27,440 Speaker 3: it just makes that climate so much harder. It makes 306 00:16:27,440 --> 00:16:29,120 Speaker 3: it harder for the people who are trying to do 307 00:16:29,160 --> 00:16:32,440 Speaker 3: this work, abortion providers and abortion advocates. It makes it 308 00:16:32,480 --> 00:16:34,720 Speaker 3: harder for people who need to make decisions about their 309 00:16:34,720 --> 00:16:37,320 Speaker 3: health and the people that support them. It makes it 310 00:16:37,400 --> 00:16:42,560 Speaker 3: so that people cannot access information to figure out what 311 00:16:42,600 --> 00:16:44,560 Speaker 3: they want to do with their own bodies and lives. 312 00:16:44,960 --> 00:16:49,400 Speaker 3: And these companies do that while saying, oh, we promote 313 00:16:49,600 --> 00:16:52,440 Speaker 3: the ability to use our platforms to get this kind 314 00:16:52,440 --> 00:16:55,640 Speaker 3: of information. I would prefer that they say, we don't 315 00:16:55,720 --> 00:16:58,640 Speaker 3: like abortion, we don't want people using our platform to 316 00:16:58,640 --> 00:17:01,160 Speaker 3: talk about abortion, so we take that content off. 317 00:17:01,280 --> 00:17:02,600 Speaker 4: At least that would be honest. 318 00:17:02,600 --> 00:17:05,200 Speaker 3: But what they are doing is lying to people about 319 00:17:05,200 --> 00:17:08,680 Speaker 3: what they're actually doing, while doing it. It just, it's, 320 00:17:08,680 --> 00:17:11,520 Speaker 3: it's really adding insult to injury.
321 00:17:11,840 --> 00:17:15,439 Speaker 2: Right. I mean, the true, honest answer probably is that 322 00:17:15,520 --> 00:17:17,800 Speaker 2: they are taking money, or they know that they are 323 00:17:17,800 --> 00:17:21,919 Speaker 2: just buying time until the entirety of our rights and 324 00:17:21,960 --> 00:17:27,040 Speaker 2: reproductive rights may be completely dismantled in every way. And 325 00:17:27,119 --> 00:17:30,760 Speaker 2: that way they can already say, hey, leaders of this 326 00:17:30,920 --> 00:17:33,760 Speaker 2: fascist regime, we have done everything for you, so can 327 00:17:33,800 --> 00:17:37,479 Speaker 2: you keep supporting our platform and give us more money? Ugh. 328 00:17:37,600 --> 00:17:41,359 Speaker 3: I mean, the way that you've got the fox watching 329 00:17:41,359 --> 00:17:44,200 Speaker 3: the henhouse here, the way that platforms are able 330 00:17:44,240 --> 00:17:47,440 Speaker 3: to cozy up to, really, I mean, it's not even 331 00:17:47,440 --> 00:17:47,720 Speaker 3: really the 332 00:17:47,720 --> 00:17:49,960 Speaker 4: Trump administration, just whoever is in power. 333 00:17:50,440 --> 00:17:50,720 Speaker 2: Uh. 334 00:17:50,760 --> 00:17:53,840 Speaker 3: And then that administration is also the administration that 335 00:17:53,920 --> 00:17:56,960 Speaker 3: is meant to be overseeing and regulating them. It's horrible, 336 00:17:57,040 --> 00:17:59,960 Speaker 3: and so, we really, I'm glad that you brought 337 00:18:00,080 --> 00:18:02,520 Speaker 3: that up, because I think that helps us peel back 338 00:18:02,560 --> 00:18:04,480 Speaker 3: the layers of what exactly is going on here and 339 00:18:04,480 --> 00:18:06,359 Speaker 3: why it's so unacceptable. 340 00:18:06,359 --> 00:18:09,879 Speaker 2: And in understanding that, the whole confusion part is 341 00:18:09,960 --> 00:18:10,919 Speaker 2: probably the point. 342 00:18:12,240 --> 00:18:13,080 Speaker 4: I think that's true. 343 00:18:13,119 --> 00:18:14,720 Speaker 3: I mean, in looking at some of the ways that 344 00:18:15,040 --> 00:18:17,120 Speaker 3: Facebook says one thing and does another when it comes 345 00:18:17,160 --> 00:18:18,399 Speaker 3: to moderating abortion content, 346 00:18:18,600 --> 00:18:19,840 Speaker 4: I think that's exactly the point. 347 00:18:19,840 --> 00:18:22,120 Speaker 3: It's like, you know, if we, and we'll get into 348 00:18:22,119 --> 00:18:24,040 Speaker 3: this a little bit in a moment, but if we 349 00:18:24,119 --> 00:18:28,879 Speaker 3: create a confusing, inconsistent, not transparent climate, people will just 350 00:18:28,920 --> 00:18:31,520 Speaker 3: stop posting this information on our platforms. 351 00:18:31,520 --> 00:18:33,359 Speaker 4: And so we don't have to crack down on all 352 00:18:33,400 --> 00:18:33,560 Speaker 4: of it. 353 00:18:33,600 --> 00:18:35,119 Speaker 3: We don't have to have a policy that does not 354 00:18:35,200 --> 00:18:37,680 Speaker 3: allow for abortion content to be on our platform. 355 00:18:37,720 --> 00:18:39,520 Speaker 3: There'll be a chilling effect and people will do it 356 00:18:39,560 --> 00:18:41,360 Speaker 3: on their own. They'll just stop posting on their own. 357 00:18:41,400 --> 00:18:44,760 Speaker 3: And I think, in my opinion, that's the why of 358 00:18:44,840 --> 00:18:58,000 Speaker 3: why this is happening.
So Meta says that they really 359 00:18:58,040 --> 00:19:01,040 Speaker 3: want to focus on moderating posts that deal with illegal content. 360 00:19:01,400 --> 00:19:03,640 Speaker 3: Side note, they don't always do such a great job 361 00:19:03,640 --> 00:19:06,959 Speaker 3: of that either, but that's for another episode. So Meta's 362 00:19:07,040 --> 00:19:11,720 Speaker 3: Dangerous Organizations and Individuals, or DOI, policy was supposed to really 363 00:19:11,760 --> 00:19:16,159 Speaker 3: be, like, a narrow policy focusing on preventing the platform 364 00:19:16,160 --> 00:19:19,840 Speaker 3: from being used by terrorist groups or organized crime, like 365 00:19:19,960 --> 00:19:24,320 Speaker 3: violent or criminal activity. But according to the Electronic Frontier Foundation, 366 00:19:24,880 --> 00:19:27,520 Speaker 3: over the years, we've really seen those rules be applied 367 00:19:27,640 --> 00:19:31,520 Speaker 3: in far broader and troubling ways, with little transparency and 368 00:19:31,800 --> 00:19:35,679 Speaker 3: significant impact on marginalized voices. And this has essentially allowed 369 00:19:35,680 --> 00:19:39,520 Speaker 3: Meta to suppress factual content about abortion that does not 370 00:19:39,720 --> 00:19:42,720 Speaker 3: actually break any of the platform's rules. So the reason 371 00:19:42,720 --> 00:19:45,000 Speaker 3: we know this is because the Electronic Frontier Foundation, 372 00:19:45,119 --> 00:19:48,320 Speaker 3: or EFF, have really given us a snapshot into what's 373 00:19:48,359 --> 00:19:51,520 Speaker 3: happening and provided some very clear receipts with their Stop 374 00:19:51,680 --> 00:19:55,760 Speaker 3: Censoring Abortion campaign. EFF collected stories from individuals, 375 00:19:55,800 --> 00:19:59,720 Speaker 3: healthcare clinics, advocacy groups, and more, and together they've revealed 376 00:20:00,200 --> 00:20:03,480 Speaker 3: one hundred examples of posts and resources being taken down, 377 00:20:03,960 --> 00:20:07,719 Speaker 3: ranging from guidance on medication abortion to links of resources 378 00:20:07,720 --> 00:20:11,600 Speaker 3: supporting individuals in states with abortion bans. What is important 379 00:20:11,640 --> 00:20:14,639 Speaker 3: to note is that the posts that they found that 380 00:20:14,720 --> 00:20:17,560 Speaker 3: were taken down, or that sometimes resulted in a ban, 381 00:20:18,240 --> 00:20:22,159 Speaker 3: did not break any of Meta's rules. EFF said, we 382 00:20:22,200 --> 00:20:25,840 Speaker 3: analyzed these takedowns, deletions, and bans, comparing the content to 383 00:20:25,880 --> 00:20:29,720 Speaker 3: what platform policies allow, particularly those of Meta, and found 384 00:20:29,800 --> 00:20:33,120 Speaker 3: that almost none of the submissions we received violated any 385 00:20:33,160 --> 00:20:36,560 Speaker 3: of the platforms' stated policies. Most of the censored posts 386 00:20:36,760 --> 00:20:42,840 Speaker 3: simply provided factual, educational information. So it really is a 387 00:20:42,880 --> 00:20:45,920 Speaker 3: system where you don't know, I mean, I guess you 388 00:20:45,960 --> 00:20:48,480 Speaker 3: could guess, why this content is being taken down. There's 389 00:20:48,480 --> 00:20:51,639 Speaker 3: no consistency, there's no transparency, and Facebook just gets to 390 00:20:51,680 --> 00:20:54,879 Speaker 3: be like, oopsie, when it happens.
Here's a great example 391 00:20:54,880 --> 00:20:57,200 Speaker 3: of a post that was removed from a healthcare policy 392 00:20:57,240 --> 00:21:01,440 Speaker 3: strategist named Lauren Carer discussing abortion pills' availability by mail. 393 00:21:02,119 --> 00:21:05,679 Speaker 3: Her post reads, FYI, abortion pills are great to have around, 394 00:21:05,840 --> 00:21:08,720 Speaker 3: whether you anticipate needing them or not. Plan C Pills 395 00:21:08,800 --> 00:21:11,520 Speaker 3: is an amazing resource to help you find reliable sources 396 00:21:11,560 --> 00:21:13,640 Speaker 3: for abortion pills by mail, no matter where you live. 397 00:21:14,080 --> 00:21:16,120 Speaker 3: Once received, the pills should be kept in a cool, 398 00:21:16,200 --> 00:21:19,280 Speaker 3: dry place. The shelf life of mifepristone is about 399 00:21:19,320 --> 00:21:22,680 Speaker 3: five years. The shelf life of misoprostol is about two years. 400 00:21:23,080 --> 00:21:26,680 Speaker 3: There is a misoprostol-only regimen that is extremely safe, effective, 401 00:21:26,720 --> 00:21:30,280 Speaker 3: and very common globally. So that post is just, here 402 00:21:30,320 --> 00:21:35,920 Speaker 3: is some factual information about these pills. However, Facebook removed 403 00:21:35,960 --> 00:21:39,679 Speaker 3: that post, and the explanation they gave Lauren was that 404 00:21:39,720 --> 00:21:42,920 Speaker 3: they don't allow people to buy, sell, or exchange drugs 405 00:21:42,960 --> 00:21:45,600 Speaker 3: that require a prescription from a doctor or a pharmacist. 406 00:21:45,840 --> 00:21:48,960 Speaker 3: But as you can tell, that post isn't about selling 407 00:21:49,080 --> 00:21:52,440 Speaker 3: or buying or trading medication. It is just fact-based 408 00:21:52,720 --> 00:21:55,440 Speaker 3: information about that medication. 409 00:21:56,040 --> 00:21:58,520 Speaker 1: Yeah, it's one of those things where you read it 410 00:21:58,560 --> 00:22:02,160 Speaker 1: and you're like, I don't see the, I do see 411 00:22:02,200 --> 00:22:04,840 Speaker 1: the thing, the thing that you're saying is there. It's 412 00:22:05,000 --> 00:22:11,520 Speaker 1: just, it's just information. Ugh, it makes me mad. 413 00:22:12,200 --> 00:22:15,000 Speaker 3: Yeah, and EFF points out that this post does not 414 00:22:15,000 --> 00:22:17,400 Speaker 3: break any of Meta's rules and should not be removed. 415 00:22:17,680 --> 00:22:19,399 Speaker 3: But you don't have to take their word for it, 416 00:22:19,520 --> 00:22:21,919 Speaker 3: or my word for it, because Meta said the exact 417 00:22:21,960 --> 00:22:25,560 Speaker 3: same thing. EFF points out that Meta publicly insists that 418 00:22:25,640 --> 00:22:28,920 Speaker 3: posts like these should not be censored, and in a February 419 00:22:29,000 --> 00:22:32,400 Speaker 3: twenty twenty four letter to Amnesty International, Meta's human rights 420 00:22:32,400 --> 00:22:36,880 Speaker 3: policy director wrote, organic content, i.e., non-paid content, 421 00:22:37,160 --> 00:22:40,720 Speaker 3: educating users about medication abortion is allowed and does not 422 00:22:40,920 --> 00:22:45,159 Speaker 3: violate our community standards. Additionally, providing guidance on legal access 423 00:22:45,160 --> 00:22:48,800 Speaker 3: to pharmaceuticals is allowed. So what the hell, Zuck? Like, 424 00:22:48,840 --> 00:22:52,080 Speaker 3: why, if it's allowed, why are you taking it down?
425 00:22:53,200 --> 00:22:57,760 Speaker 1: I'm so curious, because if the moderators are essentially 426 00:22:57,840 --> 00:23:01,719 Speaker 1: kind of removed, then is this just, do they have 427 00:23:01,800 --> 00:23:04,280 Speaker 1: like a keyword? Like, how is this happening? Is there 428 00:23:04,320 --> 00:23:05,440 Speaker 1: a person, or. 429 00:23:06,359 --> 00:23:07,399 Speaker 4: That is a great question. 430 00:23:07,600 --> 00:23:10,640 Speaker 3: If I had to guess, I would say, just knowing 431 00:23:10,640 --> 00:23:12,760 Speaker 3: what I know about content moderation, I would say this 432 00:23:12,840 --> 00:23:17,760 Speaker 3: is probably overuse of AI moderation and then not 433 00:23:17,960 --> 00:23:21,359 Speaker 3: caring enough to correct that. That's, if I had to say, 434 00:23:21,440 --> 00:23:25,240 Speaker 3: I would say, because honestly, content moderation is a job 435 00:23:25,320 --> 00:23:28,680 Speaker 3: for, not just a human, but a culturally competent human. 436 00:23:28,720 --> 00:23:30,440 Speaker 4: When you don't have culturally 437 00:23:29,960 --> 00:23:33,960 Speaker 3: competent humans making the moderation decisions, it's a problem, and 438 00:23:34,040 --> 00:23:36,840 Speaker 3: it's a problem that leads to the content of marginalized 439 00:23:36,840 --> 00:23:39,920 Speaker 3: people being suppressed much more on these platforms. Right? So, 440 00:23:39,920 --> 00:23:42,119 Speaker 3: if I had to guess, I would say this is 441 00:23:42,160 --> 00:23:45,040 Speaker 3: somebody using AI content moderation and then not 442 00:23:45,119 --> 00:23:46,960 Speaker 4: caring enough to correct that 443 00:23:47,000 --> 00:23:50,240 Speaker 3: it is consistently taking down content that does not break 444 00:23:50,280 --> 00:23:51,440 Speaker 3: any of the platform's rules. 445 00:23:51,480 --> 00:23:52,480 Speaker 4: That's my guess. 446 00:23:53,640 --> 00:23:55,800 Speaker 1: Well, and that kind of relates to another thing I 447 00:23:55,840 --> 00:23:57,680 Speaker 1: know you're going to talk about, which is something Sam 448 00:23:57,720 --> 00:24:00,359 Speaker 1: and I have also talked about on some of our episodes, 449 00:24:01,200 --> 00:24:02,119 Speaker 1: which is shadow banning. 450 00:24:02,880 --> 00:24:04,840 Speaker 3: That's right. I mean, shadow banning is one of those 451 00:24:04,880 --> 00:24:07,919 Speaker 3: issues that I find very interesting, because who among us 452 00:24:07,920 --> 00:24:10,840 Speaker 3: has not posted something on social media, had that post 453 00:24:10,920 --> 00:24:13,359 Speaker 3: not perform as well as you were expecting, and wondered, 454 00:24:13,720 --> 00:24:17,119 Speaker 3: am I shadow banned? I have definitely thought this myself. If 455 00:24:17,119 --> 00:24:19,479 Speaker 3: you've ever thought that, you are not alone. But it 456 00:24:19,520 --> 00:24:22,200 Speaker 3: does really happen. So shadow banning is when a social 457 00:24:22,200 --> 00:24:26,440 Speaker 3: media platform limits the visibility of someone's content without telling them. 458 00:24:26,760 --> 00:24:29,520 Speaker 3: And this is happening to people and organizations that make 459 00:24:29,640 --> 00:24:33,200 Speaker 3: content about sexual and reproductive health.
And it's a real 460 00:24:33,240 --> 00:24:36,160 Speaker 3: problem because, as you were talking about before, the Internet 461 00:24:36,400 --> 00:24:39,000 Speaker 3: in twenty twenty five, like, that is really where people 462 00:24:39,040 --> 00:24:42,119 Speaker 3: are going to find information about their health, especially in 463 00:24:42,160 --> 00:24:45,320 Speaker 3: a landscape where that information is more difficult to come by, 464 00:24:45,359 --> 00:24:48,960 Speaker 3: where it's criminalized and cracked down on. So people need 465 00:24:49,000 --> 00:24:51,719 Speaker 3: the Internet as a resource, and so if the people 466 00:24:51,960 --> 00:24:55,960 Speaker 3: and advocates and organizations who provide that information online are 467 00:24:56,000 --> 00:24:59,760 Speaker 3: shadow banned, it becomes that much harder to access what 468 00:24:59,880 --> 00:25:03,160 Speaker 3: is so often life-saving information to help people make 469 00:25:03,400 --> 00:25:07,240 Speaker 3: health decisions. Earlier this year, the Center for Intimacy Justice 470 00:25:07,440 --> 00:25:10,359 Speaker 3: shared a report called The Digital Gag: Suppression of Sexual 471 00:25:10,400 --> 00:25:13,639 Speaker 3: and Reproductive Health on Meta, TikTok, Amazon, and Google, and 472 00:25:13,680 --> 00:25:16,440 Speaker 3: they found that of the one hundred and fifty nine nonprofits, 473 00:25:16,480 --> 00:25:20,199 Speaker 3: content creators, sex educators, and businesses that they surveyed, sixty 474 00:25:20,280 --> 00:25:24,000 Speaker 3: three percent had content removed on Meta, fifty five percent 475 00:25:24,040 --> 00:25:27,320 Speaker 3: had content removed on TikTok. And this suppression is happening 476 00:25:27,320 --> 00:25:29,840 Speaker 3: at the same time as platforms continue to allow and 477 00:25:29,880 --> 00:25:34,280 Speaker 3: elevate videos of violence and gore and extremist and hateful content. 478 00:25:34,440 --> 00:25:37,160 Speaker 3: And this pattern is troubling because it only becomes more 479 00:25:37,200 --> 00:25:39,879 Speaker 3: prevalent as folks turn more and more to social media 480 00:25:39,960 --> 00:25:42,400 Speaker 3: to find the information that they need to make decisions 481 00:25:42,400 --> 00:25:45,480 Speaker 3: about their health. And so I like that context, because 482 00:25:46,119 --> 00:25:49,600 Speaker 3: we really do have a social media landscape that allows 483 00:25:49,640 --> 00:25:54,560 Speaker 3: for violent content, gory content, extremist or hateful content to 484 00:25:54,680 --> 00:25:59,800 Speaker 3: stay up, while taking down accurate content about reproductive health 485 00:26:00,080 --> 00:26:03,960 Speaker 3: that they agree does not violate any of their policies. 486 00:26:05,400 --> 00:26:09,080 Speaker 1: It's pretty telling too. You have some examples here, 487 00:26:09,119 --> 00:26:12,359 Speaker 1: and one of them is from a place near us 488 00:26:12,520 --> 00:26:20,080 Speaker 1: that I was like, oh dear, oh dear, Emory. Yeah, yep. 489 00:26:21,080 --> 00:26:25,680 Speaker 1: But I mean, it's also, as we're doing this research 490 00:26:25,720 --> 00:26:31,159 Speaker 1: on the CPC episode, I consider myself pretty, you know, 491 00:26:31,600 --> 00:26:34,800 Speaker 1: pretty informed about abortion and all of it, but I 492 00:26:34,840 --> 00:26:37,639 Speaker 1: had to look up some stuff about, like, I'm not 493 00:26:37,760 --> 00:26:40,520 Speaker 1: sure, is that legal there?
I don't know. Like, I 494 00:26:41,320 --> 00:26:43,040 Speaker 1: was feeling like, I don't, I don't know if I 495 00:26:43,040 --> 00:26:45,560 Speaker 1: can trust this information. And then you try to go 496 00:26:45,600 --> 00:26:47,600 Speaker 1: to a place where you're like, okay, I know this place, 497 00:26:47,640 --> 00:26:50,399 Speaker 1: and then you find out it's taken down, it doesn't 498 00:26:50,440 --> 00:26:53,360 Speaker 1: have anything about it. Yeah, it's not a good climate. 499 00:26:54,720 --> 00:26:54,960 Speaker 4: Yeah. 500 00:26:55,000 --> 00:27:00,560 Speaker 3: And then you have Google allowing CPCs to stay, you know, 501 00:27:01,000 --> 00:27:02,840 Speaker 3: high ranked in their search. And then when you go 502 00:27:02,880 --> 00:27:05,680 Speaker 3: to CPCs, they tell you all kinds of misinformation about 503 00:27:05,680 --> 00:27:09,560 Speaker 3: pregnancy and abortion. They are allowed to just essentially lie 504 00:27:09,640 --> 00:27:12,959 Speaker 3: to people, people who are in vulnerable situations. And so 505 00:27:13,359 --> 00:27:17,080 Speaker 3: it's already a climate where it's hard to find trustworthy, 506 00:27:17,119 --> 00:27:24,240 Speaker 3: accurate information. And then the clearly not trustworthy, clearly not 507 00:27:24,680 --> 00:27:28,840 Speaker 3: accurate information is not just allowed to exist, 508 00:27:29,000 --> 00:27:31,359 Speaker 3: but they put their thumb on the scales in terms 509 00:27:31,400 --> 00:27:34,920 Speaker 3: of making it more accessible than information that is factual. 510 00:27:36,640 --> 00:27:36,920 Speaker 4: Yep. 511 00:27:37,960 --> 00:27:40,359 Speaker 1: So let us get into some of these examples, including 512 00:27:40,440 --> 00:27:41,080 Speaker 1: the one near us. 513 00:27:41,280 --> 00:27:43,400 Speaker 4: So let's talk about what happened at Emory University. 514 00:27:43,480 --> 00:27:46,840 Speaker 3: So RISE at Emory University, the Center for Reproductive Health 515 00:27:46,880 --> 00:27:49,960 Speaker 3: Research in the Southeast. They published a post saying, 516 00:27:50,359 --> 00:27:52,960 Speaker 3: let's talk about mifepristone and its uses and the 517 00:27:53,000 --> 00:27:57,080 Speaker 3: importance of access. So they post this online. Two months later, 518 00:27:57,200 --> 00:28:01,200 Speaker 3: their account was suddenly suspended, flagged under the policy against 519 00:28:01,240 --> 00:28:05,000 Speaker 3: selling illegal drugs, which they were not, selling or offering 520 00:28:05,000 --> 00:28:08,359 Speaker 3: illegal drugs. They were just giving fact-based health information. 521 00:28:08,880 --> 00:28:12,480 Speaker 3: They tried to appeal, but that appeal was denied, leading 522 00:28:12,520 --> 00:28:16,080 Speaker 3: to their account being permanently deleted. Sarah Read, the director 523 00:28:16,119 --> 00:28:19,520 Speaker 3: of research and translation at RISE, told EFF, as a team, 524 00:28:19,720 --> 00:28:22,360 Speaker 3: this was a hit to our morale. We poured countless 525 00:28:22,400 --> 00:28:25,439 Speaker 3: hours of person power, creativity, and passion into creating the 526 00:28:25,440 --> 00:28:27,920 Speaker 3: content we have on our page, and having it vanish 527 00:28:28,000 --> 00:28:32,399 Speaker 3: virtually overnight took a toll on our team.
And you know, 528 00:28:32,520 --> 00:28:36,840 Speaker 3: I really think, like, think about how critical that information 529 00:28:37,400 --> 00:28:40,920 Speaker 3: is these days, and how critical social media is these days. 530 00:28:41,000 --> 00:28:44,440 Speaker 3: They are already doing sensitive work in an area where 531 00:28:44,440 --> 00:28:47,520 Speaker 3: that work is threatened, and so losing your social media 532 00:28:47,600 --> 00:28:50,840 Speaker 3: that you've put so much time into is like losing 533 00:28:50,840 --> 00:28:54,120 Speaker 3: a lifeline, both for the staff and for the community 534 00:28:54,120 --> 00:28:56,480 Speaker 3: that you're trying to do that work in. As EFF 535 00:28:56,520 --> 00:29:00,160 Speaker 3: puts it, for many organizational users like RISE, their social 536 00:29:00,200 --> 00:29:03,640 Speaker 3: media accounts are a repository for resources and metrics that may 537 00:29:03,680 --> 00:29:06,600 Speaker 3: not be stored elsewhere. We spent a significant amount of 538 00:29:06,600 --> 00:29:09,720 Speaker 3: already constrained team capacity attempting to recover all of the 539 00:29:09,760 --> 00:29:12,920 Speaker 3: content we created for Instagram that was potentially going to 540 00:29:12,920 --> 00:29:15,800 Speaker 3: be permanently lost. We also spent a significant amount of 541 00:29:15,840 --> 00:29:18,720 Speaker 3: time and energy trying to understand what options we might 542 00:29:18,760 --> 00:29:21,240 Speaker 3: have available with Meta to appeal our case and recover 543 00:29:21,280 --> 00:29:24,840 Speaker 3: our account. Their support options are not easily accessible, and 544 00:29:24,880 --> 00:29:27,960 Speaker 3: the time it took to navigate this issue distracted from 545 00:29:27,960 --> 00:29:32,400 Speaker 3: our existing work. So I totally feel what they are 546 00:29:32,440 --> 00:29:35,800 Speaker 3: saying, that when you are doing work that is that critical, 547 00:29:36,480 --> 00:29:39,920 Speaker 3: you know, time sensitive, having to stop that work to 548 00:29:39,920 --> 00:29:41,600 Speaker 3: figure out, well, how are we going to appeal this 549 00:29:41,680 --> 00:29:43,880 Speaker 3: decision to Meta? Are all of our 550 00:29:43,960 --> 00:29:46,600 Speaker 3: years and years of work on Instagram just lost forever? 551 00:29:47,120 --> 00:29:48,560 Speaker 4: That is a real problem. 552 00:29:48,600 --> 00:29:51,840 Speaker 3: And again, they weren't doing anything wrong. Nothing 553 00:29:51,840 --> 00:29:54,160 Speaker 3: that they posted on their account was against Meta's policies. 554 00:29:54,280 --> 00:29:57,440 Speaker 3: It's just arbitrary. And so, luckily, they were able to 555 00:29:57,480 --> 00:30:00,840 Speaker 3: eventually get their account back, but only because they knew 556 00:30:00,920 --> 00:30:04,160 Speaker 3: someone who knew somebody who worked at Facebook personally, which 557 00:30:04,200 --> 00:30:06,840 Speaker 3: is really the only way to appeal when this kind 558 00:30:06,840 --> 00:30:10,360 Speaker 3: of thing happens. If your account is taken down for 559 00:30:10,440 --> 00:30:13,000 Speaker 3: no real reason by Facebook, I am sorry to say, 560 00:30:13,120 --> 00:30:15,200 Speaker 3: unless you have a friend who knows somebody who works 561 00:30:15,200 --> 00:30:16,720 Speaker 3: at Facebook, you're probably not going 562 00:30:16,600 --> 00:30:18,320 Speaker 4: to be able to appeal. Again,
because a lot of 563 00:30:18,360 --> 00:30:19,720 Speaker 4: these decisions are AI, right, 564 00:30:19,760 --> 00:30:22,720 Speaker 3: it can be very, very hard to escalate to a 565 00:30:22,800 --> 00:30:24,520 Speaker 3: human, and the only real way to do it is 566 00:30:24,560 --> 00:30:27,520 Speaker 3: to just know somebody there. And again, I just feel 567 00:30:27,520 --> 00:30:31,480 Speaker 3: that in these situations, where Meta agrees these posts are 568 00:30:31,520 --> 00:30:33,480 Speaker 3: not in violation of their rules and that they admit 569 00:30:33,480 --> 00:30:36,720 Speaker 3: they made a mistake, it should not come down to 570 00:30:37,120 --> 00:30:41,120 Speaker 3: knowing somebody at Facebook to have these decisions be reversed, 571 00:30:41,120 --> 00:30:44,000 Speaker 3: when Meta agrees the mistakes are on their part. 572 00:30:45,320 --> 00:30:46,680 Speaker 1: Now I'm trying to think if I know someone at 573 00:30:46,720 --> 00:30:59,440 Speaker 1: Facebook. I used to, I don't know if they're still there. Well, 574 00:31:00,120 --> 00:31:06,200 Speaker 1: another issue with this is, as you said, if people 575 00:31:06,240 --> 00:31:10,080 Speaker 1: are worried that their content might be deleted or shadow banned, 576 00:31:10,480 --> 00:31:16,240 Speaker 1: or just, they've seen this happen to other organizations or 577 00:31:16,280 --> 00:31:20,240 Speaker 1: something like that, then they might not post it anymore. 578 00:31:21,440 --> 00:31:24,360 Speaker 3: Yeah, and I have to assume that is the point. 579 00:31:25,640 --> 00:31:28,080 Speaker 3: EFF writes, at the end of the day, clinics are, 580 00:31:28,360 --> 00:31:31,520 Speaker 3: clinics are left afraid to post basic information, patients are 581 00:31:31,600 --> 00:31:35,840 Speaker 3: left confused or misinformed, and researchers lose access to these audiences. 582 00:31:36,160 --> 00:31:38,560 Speaker 3: But unless your issue catches the attention of a journalist 583 00:31:38,640 --> 00:31:41,000 Speaker 3: or, you know, someone at Meta, you might never regain 584 00:31:41,120 --> 00:31:44,840 Speaker 3: access to your account. And so I really think that 585 00:31:44,840 --> 00:31:47,880 Speaker 3: that is the sort of "so what" here, that Meta 586 00:31:47,960 --> 00:31:55,240 Speaker 3: is doing this to, sort of not explicitly, discourage organizations 587 00:31:55,240 --> 00:31:57,760 Speaker 3: and advocates and people from posting this kind of information 588 00:31:57,840 --> 00:32:01,280 Speaker 3: on their platform while saying the opposite, because it is 589 00:32:01,320 --> 00:32:03,800 Speaker 3: going to have a silencing effect. You know, nobody wants 590 00:32:03,840 --> 00:32:07,320 Speaker 3: to risk losing their entire platform, years and years and 591 00:32:07,400 --> 00:32:09,720 Speaker 3: years of content and research and resources they've collected. 592 00:32:09,880 --> 00:32:13,200 Speaker 4: Yeah, no one's going to want to take that risk, right?
593 00:32:13,720 --> 00:32:16,680 Speaker 2: It's interesting that their policy with, like, the things that 594 00:32:16,720 --> 00:32:23,480 Speaker 2: we're going to actually moderate is about terrorism and gangs 595 00:32:23,840 --> 00:32:28,600 Speaker 2: and child endangerment, which is kind of a dog whistle 596 00:32:28,800 --> 00:32:32,080 Speaker 2: for what the Republican platform has been, to jump 597 00:32:32,160 --> 00:32:35,840 Speaker 2: to all of these morality-level issues, and the fact that 598 00:32:35,880 --> 00:32:38,920 Speaker 2: Zuckerberg is like, you know what, yeah, 599 00:32:39,040 --> 00:32:41,360 Speaker 2: we're going to adopt this too, but it's purely to 600 00:32:41,400 --> 00:32:44,840 Speaker 2: protect the people. We're just protecting the people. And again, 601 00:32:44,920 --> 00:32:47,960 Speaker 2: it does seem like, see, see, we're doing like you, 602 00:32:48,320 --> 00:32:52,120 Speaker 2: we've got your back. We also agree with this; this 603 00:32:52,240 --> 00:32:54,200 Speaker 2: is the only way, or this is the best way, 604 00:32:54,400 --> 00:32:57,360 Speaker 2: to control what information is out there. 605 00:32:57,800 --> 00:33:00,520 Speaker 4: Yes, and if you actually... I mean, this is a 606 00:33:00,520 --> 00:33:01,160 Speaker 4: whole other topic. 607 00:33:01,200 --> 00:33:02,920 Speaker 3: But when you look at the way... so they say, okay, 608 00:33:02,960 --> 00:33:06,640 Speaker 3: we're only gonna be cracking down on content that creates 609 00:33:07,000 --> 00:33:10,240 Speaker 3: harm for kids, this dog whistle that they love to 610 00:33:10,280 --> 00:33:12,040 Speaker 3: pull up, and then when you look at the kind 611 00:33:12,040 --> 00:33:15,280 Speaker 3: of harm for kids that they either allow or advocate, 612 00:33:16,280 --> 00:33:19,360 Speaker 3: part of me is like, what content are you actually 613 00:33:19,400 --> 00:33:21,360 Speaker 3: taking down? I don't know if you all saw the 614 00:33:21,400 --> 00:33:24,040 Speaker 3: recent reporting. There was a very interesting report, I think 615 00:33:24,080 --> 00:33:27,680 Speaker 3: from the Wall Street Journal, where they had gotten their 616 00:33:27,720 --> 00:33:31,880 Speaker 3: hands on an internal policy document. So this is something 617 00:33:31,880 --> 00:33:35,360 Speaker 3: that somebody at Facebook said, this is our policy, totally 618 00:33:35,400 --> 00:33:38,760 Speaker 3: fine to have in writing, no problem. That said that 619 00:33:39,000 --> 00:33:46,000 Speaker 3: Meta's chatbots were allowed to engage in sensual play with minors, 620 00:33:46,280 --> 00:33:51,640 Speaker 3: so kids. It was okay with Meta if their chatbots 621 00:33:52,880 --> 00:33:55,880 Speaker 3: engaged in, like, sensual, I won't say sexual, but I 622 00:33:55,880 --> 00:33:57,600 Speaker 3: would say, I've seen some of the content and it 623 00:33:57,640 --> 00:34:01,400 Speaker 3: is sort of spicy, but it was okay if 624 00:34:01,440 --> 00:34:03,920 Speaker 3: their bots did that with children. And part of 625 00:34:03,920 --> 00:34:05,680 Speaker 3: me is like, I cannot believe you would put this 626 00:34:05,720 --> 00:34:08,279 Speaker 3: in writing. I cannot believe that someone at Facebook said, yeah, 627 00:34:08,280 --> 00:34:10,160 Speaker 3: this is a document, I'll attach my 628 00:34:10,280 --> 00:34:12,399 Speaker 3: name to this. Lo and behold,
when the Wall Street 629 00:34:12,480 --> 00:34:14,560 Speaker 3: Journal asked about it, they were like, oh, no, we 630 00:34:14,680 --> 00:34:18,359 Speaker 3: have since walked that policy back. That's no longer our 631 00:34:18,400 --> 00:34:20,880 Speaker 3: official, on-the-record... our official, on-the-record policy 632 00:34:20,960 --> 00:34:24,279 Speaker 3: is no longer that it's okay for our bots to 633 00:34:24,719 --> 00:34:26,600 Speaker 3: engage in sexy role play with kids. 634 00:34:26,600 --> 00:34:29,239 Speaker 4: We walked that back. Like, I bet you did walk that 635 00:34:29,239 --> 00:34:32,120 Speaker 5: back, today, as you asked this question. 636 00:34:32,480 --> 00:34:35,279 Speaker 3: I'm sure it happened right after the Wall Street Journal 637 00:34:35,320 --> 00:34:36,239 Speaker 3: called them and asked them about it. 638 00:34:36,320 --> 00:34:38,280 Speaker 4: I'm so sure that it was, like, an hour later: 639 00:34:38,320 --> 00:34:39,040 Speaker 4: we walked that 640 00:34:38,880 --> 00:34:43,480 Speaker 5: back, we got this. Now it's, no, we would never, right. 641 00:34:45,760 --> 00:34:49,080 Speaker 1: Well, yeah, and I mean, you were here, Bridget, I 642 00:34:49,080 --> 00:34:51,400 Speaker 1: guess it was years ago, and you were talking about 643 00:34:52,000 --> 00:34:56,000 Speaker 1: another kind of whistleblower account of Facebook knowing it 644 00:34:56,040 --> 00:34:58,320 Speaker 1: was harming young girls. 645 00:34:58,600 --> 00:35:01,680 Speaker 4: Yeah, Frances Haugen is the whistleblower; that's why we know that. 646 00:35:02,440 --> 00:35:06,120 Speaker 1: Yeah. So it is very galling for them to 647 00:35:06,200 --> 00:35:08,920 Speaker 1: be like, we want to protect the children, and then 648 00:35:08,960 --> 00:35:13,880 Speaker 1: you have these things that again directly show that clearly 649 00:35:13,960 --> 00:35:14,719 Speaker 1: you do not. 650 00:35:14,800 --> 00:35:20,640 Speaker 3: Really, and to be clear, that is knowingly harming kids. 651 00:35:20,920 --> 00:35:23,600 Speaker 3: So I just think it's very interesting that Facebook gets 652 00:35:23,600 --> 00:35:26,319 Speaker 3: to say, well, we're too busy focusing on taking down content that 653 00:35:26,640 --> 00:35:29,879 Speaker 3: harms kids to really care that much 654 00:35:29,920 --> 00:35:31,879 Speaker 3: about what's going on with our abortion content, 655 00:35:32,040 --> 00:35:33,359 Speaker 3: that's the content we're really working on. 656 00:35:33,880 --> 00:35:35,480 Speaker 4: But really, we're not doing that either. 657 00:35:35,400 --> 00:35:37,200 Speaker 3: You know what I mean? Like, they really... 658 00:35:37,880 --> 00:35:41,720 Speaker 3: It just infuriates me, it really does. And I think 659 00:35:42,200 --> 00:35:45,960 Speaker 3: the issue really to understand is, in twenty twenty five, 660 00:35:46,400 --> 00:35:48,239 Speaker 3: when you have a question... well, when I have a question, 661 00:35:48,280 --> 00:35:49,200 Speaker 3: the first place I go is 662 00:35:49,160 --> 00:35:51,520 Speaker 4: the internet. Right? We all... I think that is 663 00:35:51,520 --> 00:35:52,520 Speaker 4: the reality
664 00:35:52,120 --> 00:35:54,120 Speaker 3: for most of us, and the internet and social media 665 00:35:54,520 --> 00:35:57,719 Speaker 3: really have become this lifeline for folks trying to get 666 00:35:57,719 --> 00:36:02,040 Speaker 3: information about the world around us, including our sexual and 667 00:36:02,080 --> 00:36:05,400 Speaker 3: reproductive health. And if folks are not able to find 668 00:36:05,560 --> 00:36:08,360 Speaker 3: what they need in their own communities, which, I'm sorry 669 00:36:08,400 --> 00:36:10,800 Speaker 3: to say, is becoming more and more of the reality 670 00:36:10,880 --> 00:36:14,160 Speaker 3: these days, they are going to go online and turn 671 00:36:14,200 --> 00:36:17,520 Speaker 3: to social media to fill those gaps. That access really 672 00:36:17,600 --> 00:36:20,120 Speaker 3: matters most for folks whose care is being cut off, 673 00:36:20,160 --> 00:36:23,759 Speaker 3: like abortion seekers, or trans or queer youth living in 674 00:36:23,800 --> 00:36:27,080 Speaker 3: states where healthcare is under attack. And so if you 675 00:36:27,200 --> 00:36:30,960 Speaker 3: have these social media platforms kind of adding to a 676 00:36:31,040 --> 00:36:34,000 Speaker 3: landscape where that information is difficult to access, even if 677 00:36:34,040 --> 00:36:37,000 Speaker 3: that information is not against their rules, it's just making 678 00:36:37,000 --> 00:36:38,440 Speaker 3: it that much more difficult. 679 00:36:38,440 --> 00:36:41,360 Speaker 4: And these decisions really do matter. 680 00:36:41,440 --> 00:36:43,480 Speaker 3: I mean, some of them are life or death, 681 00:36:43,680 --> 00:36:46,760 Speaker 3: and they really have real-world impact on people's lives. 682 00:36:47,480 --> 00:36:53,440 Speaker 1: Absolutely. And unfortunately, this is part of 683 00:36:53,520 --> 00:36:56,280 Speaker 1: kind of a larger issue, kind of a larger attack, 684 00:36:57,320 --> 00:36:58,600 Speaker 1: a gendered attack. Correct? 685 00:36:58,960 --> 00:37:01,440 Speaker 3: Yes. So this is a thing I find so interesting, 686 00:37:01,480 --> 00:37:03,280 Speaker 3: and I actually should come back and do another episode 687 00:37:03,320 --> 00:37:03,480 Speaker 3: on it. 688 00:37:03,640 --> 00:37:05,279 Speaker 4: I'm in the middle of some research on it right now. 689 00:37:05,320 --> 00:37:09,040 Speaker 3: But the Center for Intimacy Justice, whose report I mentioned earlier, 690 00:37:09,320 --> 00:37:12,799 Speaker 3: they have another report that really shows how platforms routinely 691 00:37:12,920 --> 00:37:16,959 Speaker 3: suppress sexual health content for women and trans folks, while 692 00:37:17,040 --> 00:37:20,320 Speaker 3: leaving content aimed at supporting the sexual health of cis 693 00:37:20,360 --> 00:37:23,920 Speaker 3: men largely untouched. Right? So, I know lots of people 694 00:37:24,000 --> 00:37:29,120 Speaker 3: who run businesses that are focused on, like, the sexual 695 00:37:29,120 --> 00:37:30,879 Speaker 3: health of people who are not cis men. Right? So, 696 00:37:31,480 --> 00:37:34,680 Speaker 3: if you have pelvic pain, if you need sex toys, 697 00:37:34,719 --> 00:37:38,319 Speaker 3: like, all these different things that are aimed at people 698 00:37:38,320 --> 00:37:40,680 Speaker 3: who are not cisgender men. I have lots of friends 699 00:37:40,680 --> 00:37:44,320 Speaker 3: who run businesses like that.
They are essentially not able 700 00:37:44,400 --> 00:37:48,560 Speaker 3: to do any kind of advertising on Facebook, because Facebook 701 00:37:48,560 --> 00:37:52,800 Speaker 3: does not allow it. However, Facebook certainly allows information about 702 00:37:54,239 --> 00:37:58,360 Speaker 3: the sexual wellness of cisgender men. So we really have 703 00:37:58,440 --> 00:38:02,839 Speaker 3: a climate where, let's be honest, the mostly men who run these platforms 704 00:38:02,920 --> 00:38:06,880 Speaker 3: are able to determine whose sexual health is important and 705 00:38:06,920 --> 00:38:10,680 Speaker 3: whose is not, whose healthcare is healthcare, and whose is, 706 00:38:10,760 --> 00:38:13,120 Speaker 3: like, something perverted that needs to be suppressed and isn't 707 00:38:13,120 --> 00:38:14,600 Speaker 3: allowed on their platform. 708 00:38:15,320 --> 00:38:15,560 Speaker 4: Yeah. 709 00:38:15,880 --> 00:38:18,400 Speaker 2: Now, thinking about it, as I've been looking at Instagram, 710 00:38:18,560 --> 00:38:21,840 Speaker 2: the amount of GLP-1 ads that I've been getting, 711 00:38:21,960 --> 00:38:24,319 Speaker 2: which is interesting because I thought that was medication that 712 00:38:24,360 --> 00:38:28,760 Speaker 2: you had to get through a doctor, is overwhelming. But also, 713 00:38:29,719 --> 00:38:31,920 Speaker 2: the vice versa of that is men's, you know, 714 00:38:32,480 --> 00:38:37,200 Speaker 2: health, sex health BS. Those are the two ads that 715 00:38:37,239 --> 00:38:40,080 Speaker 2: I get. Definitely nothing about women and birth control; rarely 716 00:38:40,320 --> 00:38:43,800 Speaker 2: there's a few of those, but as of late, I think, zero. 717 00:38:44,200 --> 00:38:47,880 Speaker 2: But the amount of GLP-1 ads, I'm like, whoa, 718 00:38:48,080 --> 00:38:49,520 Speaker 2: what is happening, Instagram? 719 00:38:49,520 --> 00:38:50,759 Speaker 5: I thought we weren't allowing this. 720 00:38:51,400 --> 00:38:55,120 Speaker 3: Yes. I mean, the amount of ads I get, specifically 721 00:38:55,200 --> 00:38:59,760 Speaker 3: for the erectile dysfunction medication BlueChew, and the ads 722 00:38:59,800 --> 00:39:01,640 Speaker 3: are... have you ever seen these ads online? 723 00:39:01,680 --> 00:39:04,280 Speaker 4: The ads are clearly targeted 724 00:39:03,880 --> 00:39:08,400 Speaker 3: at women. So it's a cute woman being like, ladies, 725 00:39:09,040 --> 00:39:11,799 Speaker 3: get your man to get BlueChew. BlueChew is 726 00:39:11,840 --> 00:39:14,479 Speaker 3: gonna rock your world. Get your man on BlueChew. 727 00:39:14,719 --> 00:39:17,760 Speaker 3: And that's a medication, that is an erectile 728 00:39:17,800 --> 00:39:22,520 Speaker 3: dysfunction prescription medication. But these platforms have just decided, oh no, 729 00:39:22,640 --> 00:39:23,200 Speaker 3: that's okay. 730 00:39:23,280 --> 00:39:23,719 Speaker 4: That's that. 731 00:39:23,760 --> 00:39:25,960 Speaker 3: You can show that all day long, no problem, 732 00:39:26,160 --> 00:39:28,120 Speaker 3: you can boost it, you can put money behind it. 733 00:39:28,280 --> 00:39:32,319 Speaker 2: Totally fine, again. And then the other part being 734 00:39:32,360 --> 00:39:35,880 Speaker 2: the weight loss medication, where slowly a lot more information 735 00:39:35,920 --> 00:39:37,880 Speaker 2: comes back, like, oh, there are side effects, this might not 736 00:39:37,880 --> 00:39:38,839 Speaker 2: be as good as you think.
737 00:39:38,920 --> 00:39:42,000 Speaker 4: By the way... oh yes, I'll just say yes. 738 00:39:42,239 --> 00:39:43,040 Speaker 5: I'll just say yes. 739 00:39:43,640 --> 00:39:45,960 Speaker 2: Well, that's, like, as of late, and we know 740 00:39:46,080 --> 00:39:48,239 Speaker 2: this was coming. We knew this was coming, because they've 741 00:39:48,239 --> 00:39:50,799 Speaker 2: also got variations that are not FDA approved, which I 742 00:39:50,800 --> 00:39:53,239 Speaker 2: guess means very little at this point in time. But 743 00:39:53,560 --> 00:39:55,920 Speaker 2: again, there's this rampant amount of ads, like, 744 00:39:56,000 --> 00:40:01,520 Speaker 2: every two scrolls on Instagram one pops up, and 745 00:40:01,640 --> 00:40:03,480 Speaker 2: on Facebook too, which is... I'm like, I don't even 746 00:40:03,719 --> 00:40:05,759 Speaker 2: go to Facebook. I just need to know people's birthdays. 747 00:40:05,840 --> 00:40:09,760 Speaker 2: That's all I need, that's all I really want. But again, 748 00:40:09,960 --> 00:40:12,160 Speaker 2: this seems to be, like, I thought... if that 749 00:40:12,360 --> 00:40:16,560 Speaker 2: was your policy from jump, then how are these ads... 750 00:40:16,920 --> 00:40:18,840 Speaker 2: they're paying you, I know, paying you millions. 751 00:40:19,560 --> 00:40:20,360 Speaker 5: How are these okay? 752 00:40:20,840 --> 00:40:25,640 Speaker 3: Yeah, their policy is totally inconsistent, seemingly arbitrary, and seemingly 753 00:40:25,800 --> 00:40:28,640 Speaker 3: biased against any kind of marginalized identity. Like, that's just 754 00:40:28,680 --> 00:40:31,640 Speaker 3: what's going on. They don't have any transparency. They say 755 00:40:31,680 --> 00:40:34,400 Speaker 3: one thing and do another. Uh, and that just is 756 00:40:34,400 --> 00:40:37,800 Speaker 3: the norm for them. And I really 757 00:40:37,840 --> 00:40:43,319 Speaker 3: do think that we should be talking about how, you know, again, 758 00:40:43,400 --> 00:40:45,920 Speaker 3: let's be honest, we're talking about mostly men, and 759 00:40:45,960 --> 00:40:49,400 Speaker 3: not just men, like, a specific kind of man: white, moneyed, 760 00:40:49,640 --> 00:40:52,440 Speaker 3: coastal, all of that. How we have given them so 761 00:40:52,719 --> 00:40:57,320 Speaker 3: much power to define what knowledge is acceptable, whose voices 762 00:40:57,360 --> 00:41:01,040 Speaker 3: are amplified, whose bodies are left at risk. And 763 00:41:01,239 --> 00:41:04,920 Speaker 3: when platforms decide what can and can't be shared, they 764 00:41:04,960 --> 00:41:08,600 Speaker 3: are making public health decisions with global consequences, in ways 765 00:41:08,600 --> 00:41:11,760 Speaker 3: that are often contrary to public health, and then also 766 00:41:11,880 --> 00:41:15,440 Speaker 3: reinforce systemic inequalities. And so I just think, you know, 767 00:41:15,480 --> 00:41:18,279 Speaker 3: we're not just talking about, like, 768 00:41:18,400 --> 00:41:20,480 Speaker 3: vague policy language. I know that I've spent a lot 769 00:41:20,520 --> 00:41:22,600 Speaker 3: of time talking about that because it annoys me. But 770 00:41:23,480 --> 00:41:26,799 Speaker 3: it's really about them deciding who gets to speak, who 771 00:41:26,920 --> 00:41:29,560 Speaker 3: gets seen, who gets access to the information that they 772 00:41:29,600 --> 00:41:32,280 Speaker 3: need to make decisions about their own health and bodies.
773 00:41:32,560 --> 00:41:36,719 Speaker 3: When Meta and these platforms silence accurate, essential sexual and 774 00:41:36,760 --> 00:41:41,160 Speaker 3: reproductive health information, they're not just enforcing inconsistent, arbitrary rules. 775 00:41:41,200 --> 00:41:43,200 Speaker 4: I mean, they are, but they're not just doing that. 776 00:41:43,400 --> 00:41:47,040 Speaker 3: They are also shaping people's lives and really deciding whose 777 00:41:47,120 --> 00:41:50,040 Speaker 3: health matters and whose doesn't. And in a world where 778 00:41:50,040 --> 00:41:52,680 Speaker 3: we know the internet has really become this lifeline, that 779 00:41:52,800 --> 00:41:55,400 Speaker 3: is not just annoying, although I am annoyed; it is 780 00:41:55,560 --> 00:41:59,160 Speaker 3: dangerous, because free speech shouldn't come with a disclaimer that 781 00:41:59,200 --> 00:42:02,480 Speaker 3: your body is just optional and up to the whims 782 00:42:02,520 --> 00:42:03,440 Speaker 3: of Mark Zuckerberg. 783 00:42:04,400 --> 00:42:05,800 Speaker 1: No, I don't want to live in that world. 784 00:42:05,920 --> 00:42:10,720 Speaker 3: No. No, I mean, I think about this all the time, 785 00:42:10,840 --> 00:42:15,120 Speaker 3: the ways that these individuals, a handful 786 00:42:15,160 --> 00:42:19,919 Speaker 3: of individuals, mostly white guys, get to define what our 787 00:42:19,960 --> 00:42:22,880 Speaker 3: worlds look like in these very concrete ways. 788 00:42:22,960 --> 00:42:25,400 Speaker 4: And I've never met Mark Zuckerberg. 789 00:42:25,400 --> 00:42:27,839 Speaker 3: Although I have met Sheryl Sandberg. But I've never met 790 00:42:27,840 --> 00:42:31,319 Speaker 3: Mark Zuckerberg. I don't want Mark Zuckerberg in charge of 791 00:42:31,320 --> 00:42:34,120 Speaker 3: deciding anything for my life. I don't think Mark Zuckerberg 792 00:42:34,200 --> 00:42:37,839 Speaker 3: and I have any common idea about what it means 793 00:42:37,880 --> 00:42:39,359 Speaker 3: to have a good, fulfilled life. 794 00:42:39,440 --> 00:42:41,600 Speaker 4: I don't want him designing what my future looks like. 795 00:42:43,760 --> 00:42:44,840 Speaker 1: I think that's very wise. 796 00:42:45,719 --> 00:42:48,240 Speaker 2: I mean, that one movie made him look pretty 797 00:42:48,320 --> 00:42:48,799 Speaker 2: much like a dick. 798 00:42:48,920 --> 00:42:51,640 Speaker 3: So... oh my god, you mean The Social Network, one 799 00:42:51,640 --> 00:42:52,800 Speaker 3: of my favorite movies. 800 00:42:53,600 --> 00:42:54,080 Speaker 4: Oh my god. 801 00:42:54,160 --> 00:42:56,560 Speaker 3: I don't want to spoil it, 802 00:42:57,000 --> 00:43:00,320 Speaker 3: but the ending of that movie is my version of 803 00:43:00,320 --> 00:43:00,960 Speaker 3: Citizen Kane. 804 00:43:00,960 --> 00:43:03,880 Speaker 4: Have you both seen it? Yeah? Okay, you've not 805 00:43:03,920 --> 00:43:04,239 Speaker 4: seen it. 806 00:43:04,280 --> 00:43:06,120 Speaker 2: I've only seen clips, because I'm like, I don't 807 00:43:06,120 --> 00:43:07,240 Speaker 2: even want to know, but he seems 808 00:43:07,040 --> 00:43:07,439 Speaker 4: like a dick. 809 00:43:07,760 --> 00:43:10,480 Speaker 3: Go home and watch it tonight. I know your birthday 810 00:43:10,520 --> 00:43:14,239 Speaker 3: is coming up. Maybe it's a fun birthday watch. 811 00:43:14,320 --> 00:43:16,319 Speaker 3: I mean, I'm such a nerd.
I say it's a 812 00:43:16,360 --> 00:43:18,200 Speaker 3: fun watch, but if you're looking for a movie to 813 00:43:18,239 --> 00:43:20,239 Speaker 3: watch on your birthday, that might be 814 00:43:20,239 --> 00:43:20,799 Speaker 3: the one. 815 00:43:22,080 --> 00:43:25,160 Speaker 1: It's a good... it's a solid, like, oh yeah, you're 816 00:43:25,239 --> 00:43:26,880 Speaker 1: just a sad man ending. 817 00:43:27,480 --> 00:43:29,200 Speaker 4: Yes, you know what I'm talking about. 818 00:43:29,960 --> 00:43:35,080 Speaker 3: You guys, Citizen Kane. The sled moment at the end 819 00:43:35,080 --> 00:43:38,160 Speaker 3: of that movie haunts me. And I think, if you've 820 00:43:38,239 --> 00:43:39,880 Speaker 3: seen it, it gives context for 821 00:43:39,920 --> 00:43:41,279 Speaker 3: some of the stuff we've talked about when it comes 822 00:43:41,320 --> 00:43:42,360 Speaker 3: to Facebook today. 823 00:43:43,040 --> 00:43:43,480 Speaker 2: All right. 824 00:43:43,760 --> 00:43:46,320 Speaker 5: Without even seeing it, I was like, all right. 825 00:43:48,560 --> 00:43:50,920 Speaker 1: Also, I love it. It's such a dark soundtrack. 826 00:43:52,440 --> 00:43:54,520 Speaker 4: Who was it again? 827 00:43:54,800 --> 00:43:59,919 Speaker 3: Nobody does a haunting soundtrack like Trent Reznor. The Gone Girl 828 00:44:00,120 --> 00:44:03,800 Speaker 3: soundtrack, the soundtrack to Challengers, also Trent Reznor. And that's the 829 00:44:03,840 --> 00:44:05,680 Speaker 3: soundtrack I put on when I'm writing. If you need 830 00:44:05,719 --> 00:44:08,600 Speaker 3: to focus and just, like, put on some headphones and 831 00:44:08,640 --> 00:44:12,840 Speaker 3: be like, we are writing, that is your soundtrack. Reznor 832 00:44:12,920 --> 00:44:17,320 Speaker 3: can write a dark movie soundtrack like nobody's business. 833 00:44:17,960 --> 00:44:18,560 Speaker 4: And I love that. 834 00:44:18,640 --> 00:44:22,000 Speaker 1: It was in this movie about this college kid trying 835 00:44:22,000 --> 00:44:28,280 Speaker 1: to get a girl to like him. Oh lord. Yes, okay, 836 00:44:28,320 --> 00:44:33,239 Speaker 1: well, we'll revisit that later. We'll do... uh, maybe, you know, 837 00:44:33,280 --> 00:44:36,760 Speaker 1: we had a fun time trash talking some Hugh Hefner 838 00:44:36,840 --> 00:44:40,839 Speaker 1: that one time. We'll come back with a fun thing 839 00:44:40,920 --> 00:44:43,680 Speaker 1: for you, Bridget, and you can sit back, because you always 840 00:44:43,719 --> 00:44:49,560 Speaker 1: bring us these heavy topics of your choosing. But before then, 841 00:44:50,239 --> 00:44:52,239 Speaker 1: thank you so much for being here. Where can the 842 00:44:52,280 --> 00:44:53,480 Speaker 1: good listeners find you? 843 00:44:53,480 --> 00:44:55,600 Speaker 3: You can find me on my podcast, There Are No 844 00:44:55,640 --> 00:44:58,040 Speaker 3: Girls on the Internet. You can find me on Instagram. 845 00:44:58,400 --> 00:45:00,160 Speaker 3: I know it's owned by Mark Zuckerberg; I don't like 846 00:45:00,200 --> 00:45:04,080 Speaker 3: it either. At Bridget Marie in DC, TikTok at Bridget 847 00:45:04,160 --> 00:45:05,480 Speaker 3: Marie in DC, and on YouTube. 848 00:45:05,480 --> 00:45:06,960 Speaker 4: At There Are No Girls on the Internet. 849 00:45:08,120 --> 00:45:10,440 Speaker 1: Yes, go check all of that out if you haven't 850 00:45:10,520 --> 00:45:14,000 Speaker 1: already, listeners. If you would like to contact us, 851 00:45:14,040 --> 00:45:16,200 Speaker 1: you can.
You can email us at Hello at Stuff 852 00:45:16,280 --> 00:45:18,080 Speaker 1: Mom Never Told You. You can find us on Bluesky 853 00:45:18,120 --> 00:45:20,640 Speaker 1: at momstuffpodcast, or on Instagram and TikTok at Stuff Mom 854 00:45:20,680 --> 00:45:23,600 Speaker 1: Never Told You. We're also on YouTube. We have new 855 00:45:23,640 --> 00:45:25,520 Speaker 1: merchandise at Cotton Bureau, and we have a book you 856 00:45:25,560 --> 00:45:27,400 Speaker 1: can get wherever you get your books. Thanks as always to 857 00:45:27,400 --> 00:45:30,120 Speaker 1: our super producer Christina, our executive producer Maya, and our contributor Joey. 858 00:45:30,280 --> 00:45:32,920 Speaker 1: Thank you, and thanks to you for listening. Stuff Mom Never 859 00:45:32,920 --> 00:45:34,600 Speaker 1: Told You is a production of iHeartRadio. For more podcasts 860 00:45:34,600 --> 00:45:35,960 Speaker 1: from iHeartRadio, you can check out the iHeart 861 00:45:36,000 --> 00:45:37,719 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 862 00:45:37,719 --> 00:45:38,560 Speaker 1: favorite shows.