Speaker 1: Hey, this is Annie and Samantha.

Speaker 2: And welcome to...

Speaker 3: Stuff Mom Never Told You, a production of iHeartRadio. And today we are once again thrilled to be joined by the fabulous, the fantastic Bridget Todd.

Speaker 2: Welcome, Bridget. Thank you for having me back. I'm so excited to be here.

Speaker 3: We're so excited to have you, always. But yes, we were discussing beforehand, there's a lot that we would like your expertise on, so as always, we appreciate you coming on. There's so much going on right now. That being said, how are you, Bridget?

Speaker 2: You know, this week was... I don't know if you all feel it, but I feel like the last few weeks have been pretty rough. It feels like things are different. It just feels like a shift in the air, which can be hard to exist in, let alone make content in that doesn't make people fearful and want to check out, which I think maybe we're all kind of navigating.

Speaker 4: Yeah. Is that something that resonates with you two?

Speaker 3: Yes, uh, yes. We make a lot of content and we do try to mix things up, but we also don't want to make things worse. But I mean, for instance, yesterday I just had a lot of trouble concentrating on my work because I was thinking about all this other stuff, and like, what's going on in the world? What can I do? Which, I've always maintained that if you want to get more productivity out of people, then they're going about it the completely wrong way. But that's just a very small personal gripe of mine. But yeah, it's been difficult.

Speaker 2: You know...

Speaker 1: What I've discovered is these tiny little coloring books that are really, like, easy coloring, so it's not detailed, because I've discovered, as much as I'd like to do those things, I cannot stay within the lines.
And when they get really, like, fancy, the adult coloring book versions, they get fancy and you have to do the shading, and I'm like, what is this? Why do I have so many things to color? So I discovered these tiny ones that are just, like, cutesy large pictures of, like, a milkshake. It has been really nice that I can just check out, stay inside the lines, and color with a marker, like those little, you know, paint-like markers. It's really satisfying, nice to, like, zone out.

Speaker 2: Sam, you are speaking my language, because... I've spent so much money on this. But alcohol markers, like, there's nothing quite like a good marker, a new set of markers. You're like, this is gonna change everything. My future starts today. I've got this new set of markers. Yes, and the ones that write really well are so satisfying. I know I sound like a crazy person, but genuinely, the appeal of finding the right set of markers can change everything.

Speaker 1: Oh no, I am obsessed with pens, and well-writing fine-point pens, like, it has to be fine point. Like, the Chinese and Japanese have a market because they have some of the best, like, nicely flowing pens. Even though my handwriting is really, really bad, I love the feel of, like, a nice smooth write. But with these, like the alcohol markers you're talking about, the only problem I have is that the color they say they represent on the cap doesn't actually, like, translate in the marking. So I'm like, oh, this is a yellow, and it turns orange. I'm like, wait, this is not... But it is very smooth, and it is very satisfying, because it fills all the lines and you're like, yes, I'm a professional colorer.

Speaker 4: Of course.

Speaker 2: This is how seriously I take this.
When I get a new set of markers, the first thing I do is a little color swatch, to see: they say orange, what does their orange look like? Yeah?

Speaker 1: I know, no surprises. You know what, that's good. That is great advice for a newly marker-purchasing person. So thank you. I'm gonna have to get some, like, scrap paper just so I can do that.

Speaker 2: We can do a whole episode on this. Let's just talk pens, markers, my family. It gives me head tingles even just talking about it. I love it so much.

Speaker 1: So since I did the, like, pre-show "everything's the worst," I have to bring in... this is a solution, and it's coloring: small coloring books with wonderful markers.

Speaker 2: Yes, yeah, that can be the antidote to our troubling times. Have you considered just diving headfirst into the world of markers and penmanship and calligraphy and journaling and coloring?

Speaker 1: It is quite nice. It is quite nice.

Speaker 4: Annie.

Speaker 1: I will leave some for you. I know you're about to house-sit for me, so I will leave some for you to try out yourself.

Speaker 3: You know, I have two main thoughts from this. One is that I hate when this happens, where your birthday is coming up, Samantha, and I wish I had known this earlier; that would have been a great gift. But then I think, Bridget, you should come on one time and let us talk about something that's not so stressful. Let's give ourselves a little break. We can talk about markers. I know you mentioned, like, reality TV. We could have a whole thing where it's not something so dour.

Speaker 1: All the, like, dark stuff. We should let you come in with the joys that you have, because we just talked about the Adirondacks and everything.

Speaker 2: Yeah, I feel like people who listen to my content might not know that I experience joy. The only things I talk about are not the crushing weight of fascism.
Speaker 4: I enjoy reality television.

Speaker 3: I think this would be fun. I think we should look at it. All right.

Speaker 4: I like it.

Speaker 3: Oh, but unfortunately, before that day... we're not doing that today. Also, the timing is interesting, because Samantha and I are working on an update on CPCs, crisis pregnancy centers. I know them very well, yes. And we, in our research, ran into a lot of stuff about how tech companies were basically paying for them to advertise, or accepting their money and being misleading about things. So this is very much related. What are we talking about today, Bridget?

Speaker 4: Well, that is such a good transition, because it's all related.

Speaker 2: Tech companies really have put their thumbs on the scale when it comes to being able to get accurate content about abortion on social media. Your point about crisis pregnancy centers, and the way that Google essentially is, like, paying an advertising network for them to exist, is a great example. But today I really wanted to talk about how social media is heavily moderating, and even in some cases, like, suppressing and censoring, content about abortion. I think we all talked about this back in January, but do you remember when Mark Zuckerberg had that moment that people sort of talked about as his mask off moment, back in January when Trump came back into office? I think we were talking about how he really started dressing like a divorced nightclub promoter and was saying things like, oh, we're taking the tampons out of the restrooms here at Facebook HQ. He really was sort of having a moment where he was saying a lot of things.

Speaker 4: Do you remember this?

Speaker 3: Oh yes, oh yes. He was like... he leaned in hard. He went hard.

Speaker 1: He's been waiting for those moments.

Speaker 4: Yes. Oh my gosh, you could tell.
Speaker 2: I mean, I almost quibbled when people were like, oh, it's his mask off moment, because I don't think that Mark Zuckerberg has any kind of... I wouldn't call it a mask off moment, because I think that he is the definition of a hollow, empty person, and so I think he is the mask.

Speaker 4: He will say anything.

Speaker 2: I'm honestly fascinated by him as a tech leader, because I think that he has no values, scruples, morals; there's just nothing. He will say anything, he will do anything. However the wind blows, that's how he will blow. And I don't think it's fair to call that a mask off moment when, truly, the mask is not hiding anything. This is just genuinely who you are, who you always have been: just a soulless person who was waiting to see who they should kiss up to, and will do that if it means holding onto power.

Speaker 5: Right.

Speaker 1: He was just waiting in the background, like his personality was just waiting in the shadows. And then it's like, ooh, this is my moment.

Speaker 2: And so, when that all was going on, he also announced that Meta was going to be scrapping all third-party fact checking on the platform in favor of a community notes feature, because, as he said, it was time for the company to get back to their roots when it comes to free expression. I will play a little bit of a video that he put out talking about this.

Speaker 5: Hey everyone. I want to talk about something important today, because it's time to get back to our roots around free expression on Facebook and Instagram. I started building social media to give people a voice. I gave a speech at Georgetown five years ago about the importance of protecting free expression, and I still believe this today. But a lot has happened over the last several years. There's been widespread debate about potential harms from online content.
Governments and legacy media have pushed to censor more and more. A lot of this is clearly political, but there's also a lot of legitimately bad stuff out there: drugs, terrorism, child exploitation. These are things that we take very seriously, and I want to make sure that we handle responsibly.

Speaker 4: So, I have a lot to say about that.

Speaker 2: First of all, a very convenient rewriting of history that, frankly, wasn't that long ago, and that, if you're listening and you're my age, you probably remember. Because we all know, it is not a secret, that Mark Fuckerberg created Facebook as a college student to rank the looks of the women on his college campus. Somehow we sort of let him get away with being like, I created Facebook to protect free expression. Okay, sure. I always have to, like, quibble at that, because I guess I saw a video where he said, I created Facebook because I wanted people to be able to have debates about the Iraq War, and it's like, no, you didn't. First of all, I was an organizer in the anti-war movement. Nobody was communicating on Facebook like that back then. Facebook wasn't for that.

Speaker 4: So that's just not true.

Speaker 2: I really have a thing where people lie to your face about recent history that you remember, that you were there for, that you were part of. So that's bullshit. But even more than that, he's talking about how the content that he really wants to focus on, in terms of moderating the platform, is illegal content, right? Child safety harms, the drug trade, organized criminal activity, all of that. So this is when he was really talking about how important it was to protect free expression on social media platforms. Y'all might recall that around this time he was in the headlines for saying that he felt the Biden administration had been trying to pressure Facebook into removing COVID misinformation.
The White House had a different take, saying, quote: "When confronted with the deadly pandemic, this administration encouraged responsible actions to protect public health and safety. Our position has been very clear and consistent. We believe tech companies and other private actors should take into account the effects their actions have on the American people while making independent choices about the information they present." So, you know, Zuckerberg in this moment was like, we are not going to be moderating political content the way that we have been; we are going to lift restrictions on topics that are part of mainstream discourse and really just focus on the enforcement of illegal and, like, high-severity violations.

Speaker 4: So, yay for free speech, right?

Speaker 2: That all sounds great. Well, all of that is only the case if that part of the mainstream discourse is not abortion, which Facebook continues to suppress and moderate quite heavily, with zero transparency and zero consistency.

Speaker 4: So it seems like...

Speaker 2: If you're spreading COVID misinformation, well, that is protected speech that needs to be left up for freedom. If you are sharing accurate information about abortion that isn't even against Meta's policies, they will take it down.

Speaker 3: Yeah, and like you said, without warning or transparency or anything. It's just gone, and you might not know why. Or, well, you could probably figure it out. But one of the things that's really frustrating about all of this is that, you know, like you said, they're kind of lying to our faces, right? Like, they're saying one thing and doing something completely different.

Speaker 2: Yeah, that's what really makes me angry about this. You know, I cover a lot of tech companies. The thing that gets me is when they lie, when they say one thing publicly, when they publish something in their rules and regulations and policies.
You know, no one's putting a gun to their head and making them put these things in their rules. They put them in their rules, and then they do a completely different thing. And then, when advocates or organizers call them out on it, there's just no... they're just like, oops, what are you gonna do? For some reason, that just really gets me, because they are allowed to enjoy all of this positive press for putting this thing in their policy, and then continue doing the shady work of going against that policy.

Speaker 4: It never comes back at them.

Speaker 2: They're able to just do whatever they want while saying one thing and doing another.

Speaker 4: And I just don't feel like they really get held accountable.

Speaker 2: And so, a Meta spokesperson said that taking down abortion content goes against Meta's own intentions. A spokesperson told The New York Times: "We want our platforms to be a place where people can access reliable information about health services, advertisers can promote health services, and everyone can discuss and debate public policies in this space. That is why we allow posts and ads discussing and debating abortion." But they're not doing that at all, because the big thing to know here is that Meta says one thing in their policies and then does a completely different thing when it comes to how they are actually moderating abortion content.

Speaker 3: Yeah, and it's so difficult right now to get that good information. There's so much misinformation and disinformation out there, and to remove the good information is just piling onto a problem that really doesn't need any more piling onto. It is already really bad, and people are already very confused. This is not helping.

Speaker 2: No, that's a really good context to set. You know, we're in a time where the Supreme Court struck down Roe. It is so much harder to access accurate information about health so that people can make health decisions for themselves.
And when social media platforms like Facebook put their thumb on the scales and make these kinds of moderation decisions, with no transparency, that go against their own stated policy, it just makes that climate so much harder. It makes it harder for the people who are trying to do this work, abortion providers and abortion advocates. It makes it harder for people who need to make decisions about their health, and the people that support them. It makes it so that people cannot access information...

Speaker 4: To figure out what they want to do with their own...

Speaker 2: ...bodies and lives. And these companies do that while saying, oh, we promote the ability to use our platforms to get this kind of information. I would prefer that they say: we don't like abortion, we don't want people using our platform to talk about abortion, so we take that content off. At least that would be honest. But what they are doing is lying to people about what they're actually doing, while doing it. It's really adding insult to injury.

Speaker 1: Right. I mean, the true, honest answer probably is that they are taking money, or they know that they are just buying time until the entirety of our rights and reproductive rights may be completely dismantled in every way. And that way they can already say, hey, leaders of this fascist regime, we have done everything for you, so can you keep supporting our platform and give us more money?

Speaker 2: Ugh. I mean, you've got the fox watching the henhouse here, the way that platforms are able to cozy up to... really, I mean, it's not even really the Trump administration, just whoever is in power.

Speaker 4: Uh...

Speaker 2: And then that administration is also the administration that is meant to be overseeing and regulating them.
It's horrible. And so... I'm glad that you brought that up, because I think that helps us peel back the layers of what exactly is going on here and why it's so unacceptable.

Speaker 1: And in understanding that, the whole confusion part is probably the point.

Speaker 4: I think that's true.

Speaker 2: I mean, in looking at some of the ways that Facebook says one thing and does another when it comes to moderating abortion content, I think that's exactly the point. It's like, you know... and we'll get into this a little bit in a moment, but: if we create a confusing, inconsistent, not transparent climate, people will just stop posting this information on our platforms.

Speaker 4: And so we don't have to crack down on all of it.

Speaker 2: We don't have to have a policy that does not allow for abortion content to be on our platform. There'll be a chilling effect, and people will just stop posting on their own. And I think, in my opinion, that's the why of why this is happening. So, Meta says that they really want to focus on moderating posts that deal with illegal content. Side note: they don't always do such a great job of that either, but that's for another episode. So, Meta's Dangerous Organizations and Individuals, or DOI, policy was supposed to be, like, a narrow policy focused on preventing the platform from being used by terrorist groups or organized crime, like violent or criminal activity. But according to the Electronic Frontier Foundation, over the years we've really seen those rules be applied in far broader and troubling ways, with little transparency and significant impact on marginalized voices. And this has essentially allowed Meta to suppress factual content about abortion that does not actually break any of the platform's rules.

Speaker 4: So the reason we know this is...
Speaker 2: Because the Electronic Frontier Foundation, or EFF, has really given us a snapshot into what's happening and provided some very clear receipts with their Stop Censoring Abortion campaign. EFF says they collected stories from individuals, healthcare clinics, advocacy groups, and more, and together they've revealed nearly one hundred examples of posts and resources being taken down, ranging from guidance on medication abortion to links to resources supporting individuals in states with abortion bans. What is important to note is that the posts they found that were taken down, or that sometimes even resulted in a ban, did not break any of Meta's rules. EFF said: "We analyzed these takedowns, deletions, and bans, comparing the content to what platform policies allow, particularly those of Meta, and found that almost none of the submissions we received violated any of the platforms' stated policies. Most of the censored posts simply provided factual, educational information." So it really is a system where you don't know... I mean, I guess you could guess why this content is being taken down, but there's no consistency, there's no transparency, and Facebook just gets to be like "oopsie" when it happens. Here's a great example of a post that was removed, from a healthcare policy strategist named Lauren Carer, discussing abortion pills' availability by mail. Her post reads: "FYI, abortion pills are great to have around, whether you anticipate needing them or not. Plan C Pills is an amazing resource to help you find reliable sources for abortion pills by mail, no matter where you live. Once received, the pills should be kept in a cool, dry place. The shelf life of mifepristone is about five years. The shelf life of misoprostol is about two years. There's a misoprostol-only regimen that is extremely safe, effective, and very common globally." So that post is just: here is some factual information about these pills.
However, Facebook removed that post, and the explanation they gave Lauren was that they don't allow people to buy, sell, or exchange drugs that require a prescription from a doctor or a pharmacist. But as you can tell, that post isn't about selling or buying or trading medication. It is just fact-based information about that medication.

Speaker 3: Yeah, it's one of those things where you read it and you're like, I don't see the thing... the thing that you're saying is there. It's just information. Ugh, it makes you mad.

Speaker 2: Yeah. And EFF points out that this post does not break any of Meta's rules and should not have been removed. But you don't have to take their word for it, or my word for it, because Meta said the exact same thing. EFF points out that Meta publicly insists that posts like these should not be censored. In a February twenty twenty-four letter to Amnesty International, Meta's human rights policy director wrote: "Organic content, i.e. non-paid content, educating users about medication abortion is allowed and does not violate our community standards. Additionally, providing guidance on legal access to pharmaceuticals is allowed."

Speaker 4: So what the hell, Zuck? Like, if it's allowed, why are you taking it down?

Speaker 3: I'm so curious, because if the moderators are essentially kind of removed, then is this just... do they have, like, a keyword? Like, how is this happening? Is there a person, or...

Speaker 4: That is a great question.

Speaker 2: If I had to guess, I would say, just knowing what I know about content moderation, this is probably overuse of AI moderation, and then not caring enough to correct it. That's what I would say, because, honestly, content moderation is a job for not just a human, but a culturally competent human.
When you don't have culturally competent humans making those moderation decisions, it's a problem, and it's a problem that leads to the content of marginalized people being suppressed much more on these platforms. Right? So, if I had to guess, I would say this is somebody using AI content moderation and then not caring enough to correct it when it consistently takes down content that does not break any of the platform's rules. That's my guess.

Speaker 3: Well, and that kind of relates to another thing I know you're going to talk about, which is something Samantha and I have also talked about on some of our episodes: shadow banning.

Speaker 4: That's right.

Speaker 2: I mean, shadow banning is one of those issues that I find very interesting, because who among us has not posted something on social media, had that post not perform as well as you were expecting, and wondered: am I shadow banned? I have definitely thought this myself, and if you've ever thought that, you are not alone. It does really happen. So, shadow banning is when a social media platform limits the visibility of someone's content without telling them. And this is happening to people and organizations that make content about sexual and reproductive health. It's a real problem because, as we were talking about before, the internet in twenty twenty-five, that is really where people are going to find information about their health, especially in a landscape where that information is more difficult to come by, where it's criminalized and cracked down on.

Speaker 4: So people need the internet as a resource.

Speaker 2: And so, if the people and advocates and organizations who provide that information online are shadow banned, it becomes that much harder to access what is often life-saving information to help people make health decisions.
Earlier this year, the Center for Intimacy Justice shared a report called "The Digital Gag: Suppression of Sexual and Reproductive Health on Meta, TikTok, Amazon, and Google," and they found that of the one hundred and fifty-nine nonprofits, content creators, sex educators, and businesses they surveyed, sixty-three percent had content removed on Meta and fifty-five percent had content removed on TikTok. And this suppression is happening at the same time as platforms continue to allow and elevate videos of violence and gore and extremist and hateful content. This pattern is troubling, because it only becomes more prevalent as folks turn more and more to social media to find the information they need to make decisions about their health. And so I like that context, because we really do have a social media landscape that allows violent content, gory content, extremist or hateful content to stay up, while taking down accurate content about reproductive health that the platforms themselves agreed does not violate any of their policies.

Speaker 3: It's pretty telling. You have some examples here, and one of them is from a place near us that I was like, oh dear, oh dear. Emory. Yeah, yep. But I mean, also, as we're doing this research on the CPC episode, I consider myself pretty, you know, pretty informed about abortion and all of it, but I had to look up some stuff, like, I'm not sure, is that legal there? I don't know. Like, I was feeling like, I don't know if I can trust this information. And then you try to go to a place where you're like, okay, I know this place, and then you find out it's taken down, it doesn't have anything about it. Yeah, it's not a good climate.

Speaker 4: Yeah.

Speaker 2: And then you have Google allowing CPCs to stay, you know, high-ranked in their search. And then, when you go to CPCs, they tell you all kinds of misinformation about pregnancy and abortion.
They are allowed to just essentially lie to people, people who are in vulnerable situations. And so it's already a climate where it's hard to find trustworthy, accurate information, and then the clearly not trustworthy, clearly not accurate information is not just allowed to exist; they put their thumb on the scales in terms of making it more accessible than information that is factual.

Speaker 4: Yep.

Speaker 3: So let us get into some of these examples, including the one near us.

Speaker 4: So let's talk about what happened at Emory University.

Speaker 2: So, RISE at Emory University, the Center for Reproductive Health Research in the Southeast, published a post saying, let's talk about mifepristone and its uses and the importance of access. So they post this online. Two months later, their account was suddenly suspended, flagged under the policy against selling illegal drugs, which they were not selling or offering; they were just giving fact-based health information. They tried to appeal, but that appeal was denied, leading to their account being permanently deleted. Sarah Read, the director of Research and Translation at RISE, told EFF: "As a team, this was a hit to our morale. We poured countless hours of person power, creativity, and passion into creating the content we have on our page, and having it vanish virtually overnight took a toll on our team." And, you know, think about how critical that information is these days, and how critical social media is these days. They are already doing sensitive work in an area where that work is threatened, and so losing the social media presence that you've put so much time into is like losing a lifeline, both for the staff and for the community you're trying to do that work in. As EFF puts it, for many organizational users like RISE, their social media accounts are a repository for resources and metrics that may not be stored elsewhere.
"We spent a significant amount of already constrained team capacity attempting to recover all of the content we created for Instagram that was potentially going to be permanently lost. We also spent a significant amount of time and energy trying to understand what options we might have available to appeal our case with Meta and recover our account. Their support options are not easily accessible, and the time it took to navigate this issue distracted from our existing work." So I totally feel what they are saying: when you are doing work that is that critical, that time sensitive, having to stop that work to figure out, well, how are we going to appeal this decision to Meta? Are all of our years and years of work on Instagram just lost forever? That is a real problem. And again, they weren't doing anything wrong. Nothing that they posted on their account was against Meta's policies; it's just arbitrary. And so, luckily, they were able to eventually get their account back, but only because they knew someone who knew somebody who worked at Facebook personally, which is really the only way to appeal when this kind of thing happens. If your account is taken down for no real reason by Facebook, I am sorry to say, unless you have a friend who knows somebody who works at Facebook, you're probably not going to be able to appeal, because a lot of these decisions are AI, right? It can be very, very hard to escalate to a human, and the only real way to do it is to just know somebody there. And again, I just feel that in these situations, where Meta agrees these posts are not in violation of their rules and admits they made a mistake, it should not come down to knowing somebody at Facebook to have these decisions reversed.

Speaker 3: Now I'm trying to think if I know someone at Facebook. I used to. I don't know if they're still there.
Well, another issue with this is, as you said, if people are worried that their content might be deleted or shadow banned, or they've just seen this happen to other organizations or something like that, then they might not post it anymore.

Speaker 2: Yeah, I have to assume that is the point. EFF writes: "At the end of the day, clinics are left afraid to post basic information, patients are left confused or misinformed, and researchers lose access to these audiences. But unless your issue catches the attention of a journalist or, you know, someone at Meta, you might never regain access to your account." And so I really think that is the "so what" here: Meta is doing this to, sort of not explicitly, discourage organizations and advocates and people from posting this kind of information on their platform, while saying the opposite, because it is going to have a silencing effect.

Speaker 4: You know, nobody wants to risk...

Speaker 2: ...losing their entire platform, years and years and years of content and research and resources they've collected.

Speaker 4: Yeah, no one is going to want to take that risk, right?

Speaker 1: It's interesting that their policy on the things that they will actually moderate is about terror and gangs and child endangerment, which is kind of a dog whistle for what the Republican platform has been, jumping to all of these morality-level issues. And the fact that Zuckerberg is like, you know what, yeah, we're gonna adopt this too, but it's purely to protect the people. We're just protecting the people. And again, it does seem like: see, see, we're doing like you, we got your back, we also agree with this. This is the only way, or the best way, to control what information is out there.

Speaker 4: Yes. And if you actually... I mean, this is a whole other topic.
592 00:31:39,120 --> 00:31:40,880 Speaker 2: But when you look at the way, so they say, okay, 593 00:31:40,880 --> 00:31:44,600 Speaker 2: we're only gonna be cracking down on content that creates 594 00:31:44,920 --> 00:31:48,240 Speaker 2: harm for kids, this dog whistle that they love 595 00:31:48,120 --> 00:31:48,640 Speaker 4: to pull out. 596 00:31:48,800 --> 00:31:50,440 Speaker 2: And then when you look at the kind of harm 597 00:31:50,520 --> 00:31:54,520 Speaker 2: for kids that they either allow or advocate, part of 598 00:31:54,560 --> 00:31:57,920 Speaker 2: me is like, what content are you actually taking down? 599 00:31:58,240 --> 00:32:00,080 Speaker 2: I don't know if you all saw the recent reporting. 600 00:32:00,120 --> 00:32:02,240 Speaker 2: There was a very interesting report, I think from the 601 00:32:02,240 --> 00:32:06,200 Speaker 2: Wall Street Journal, where they had gotten their hands on 602 00:32:07,160 --> 00:32:10,400 Speaker 2: an internal policy document. So this is something that somebody 603 00:32:10,400 --> 00:32:13,640 Speaker 2: at Facebook said, this is our policy, totally fine to 604 00:32:13,680 --> 00:32:18,120 Speaker 2: have in writing, no problem. That said that Meta's chatbots 605 00:32:18,560 --> 00:32:25,479 Speaker 2: were allowed to engage in sensual play with minors, so kids. 606 00:32:26,240 --> 00:32:31,480 Speaker 2: It was okay with Meta if their chatbots engaged in, 607 00:32:31,520 --> 00:32:34,200 Speaker 2: like, sensual, I won't say sexual, but I would say, 608 00:32:34,320 --> 00:32:35,880 Speaker 2: I've seen some of the content and it is sort 609 00:32:35,880 --> 00:32:39,920 Speaker 2: of spicy, it was okay for their bots 610 00:32:39,960 --> 00:32:42,280 Speaker 2: to do that with children. And part of me is like, 611 00:32:42,480 --> 00:32:44,200 Speaker 2: I cannot believe you would put this in writing. I 612 00:32:44,200 --> 00:32:46,360 Speaker 2: cannot believe that someone at Facebook said, yeah, this is 613 00:32:46,360 --> 00:32:48,680 Speaker 2: a document, I'll attach my name to this. 614 00:32:49,200 --> 00:32:49,760 Speaker 4: Lo and behold. 615 00:32:49,760 --> 00:32:51,760 Speaker 2: When the Wall Street Journal asked about it, they were like, 616 00:32:51,800 --> 00:32:55,720 Speaker 2: oh no, we have since walked that policy back. That's 617 00:32:55,720 --> 00:32:58,040 Speaker 2: no longer our official, on-the-record policy. Our official 618 00:32:58,040 --> 00:33:00,240 Speaker 2: on-the-record policy is no longer that it's okay 619 00:33:00,280 --> 00:33:04,520 Speaker 2: for our bots to engage in sexy role play with kids. 620 00:33:04,560 --> 00:33:07,160 Speaker 4: We've walked that back. I bet you did walk that 621 00:33:07,200 --> 00:33:10,080 Speaker 1: back today, as you asked this question. 622 00:33:10,400 --> 00:33:13,280 Speaker 2: I'm sure it happened right after the Wall Street Journal 623 00:33:13,280 --> 00:33:14,240 Speaker 2: called them and asked them about it. 624 00:33:14,240 --> 00:33:16,200 Speaker 4: I'm so sure that it was like an hour later, 625 00:33:16,280 --> 00:33:16,880 Speaker 4: we walked that back. 626 00:33:18,280 --> 00:33:21,640 Speaker 1: We got this now. No, we would never, right.
627 00:33:23,680 --> 00:33:27,000 Speaker 3: Well, yeah, and I mean, you were here, Bridget, I 628 00:33:27,000 --> 00:33:29,720 Speaker 3: guess it was years ago, and you were talking about 629 00:33:29,920 --> 00:33:34,080 Speaker 3: another kind of whistleblower account of Facebook knowing it was 630 00:33:34,080 --> 00:33:36,280 Speaker 3: harming young girls. 631 00:33:36,520 --> 00:33:39,600 Speaker 4: Yeah, Frances Haugen is the whistleblower. That's how we know that. 632 00:33:40,400 --> 00:33:44,080 Speaker 3: Yeah. So it is very galling for them to 633 00:33:44,120 --> 00:33:46,840 Speaker 3: be like, we want to protect the children, and then 634 00:33:46,880 --> 00:33:51,840 Speaker 3: you have these things that again directly show that clearly 635 00:33:51,920 --> 00:33:52,800 Speaker 3: you don't. 636 00:33:52,720 --> 00:33:58,560 Speaker 2: Not really. And to be clear, that is knowingly harming kids. 637 00:33:58,880 --> 00:34:01,520 Speaker 2: So I just think it's very interesting that Facebook gets 638 00:34:01,560 --> 00:34:04,280 Speaker 2: to say, well, we're too busy focusing on taking down content that 639 00:34:04,600 --> 00:34:07,800 Speaker 2: harms kids to really care that much 640 00:34:07,840 --> 00:34:10,080 Speaker 2: about what's going on with abortion content. That's 641 00:34:10,120 --> 00:34:11,279 Speaker 2: the content we're really working on. 642 00:34:11,800 --> 00:34:13,440 Speaker 4: But really, we're not doing that either. 643 00:34:13,320 --> 00:34:15,920 Speaker 2: You know what I mean? It 644 00:34:16,000 --> 00:34:20,440 Speaker 2: just infuriates me, it really does. And I think the 645 00:34:20,480 --> 00:34:23,880 Speaker 2: issue really to understand is, in twenty twenty five, 646 00:34:24,320 --> 00:34:26,160 Speaker 2: when I have a question, 647 00:34:26,239 --> 00:34:28,279 Speaker 2: the first place I go is the internet, right? I think 648 00:34:28,360 --> 00:34:30,399 Speaker 2: that is the reality for most 649 00:34:30,440 --> 00:34:33,080 Speaker 2: of us, and the internet and social media really have 650 00:34:33,200 --> 00:34:36,960 Speaker 2: become this lifeline for folks trying to get information about 651 00:34:37,480 --> 00:34:40,879 Speaker 2: the world around us, including our sexual and reproductive health. 652 00:34:40,920 --> 00:34:43,799 Speaker 2: And if folks are not able to find what they 653 00:34:43,840 --> 00:34:46,720 Speaker 2: need in their own communities, which I'm sorry to say 654 00:34:46,880 --> 00:34:49,320 Speaker 2: is becoming more and more of the reality these days, 655 00:34:50,000 --> 00:34:52,560 Speaker 2: they are going to go online, to turn to social 656 00:34:52,600 --> 00:34:56,280 Speaker 2: media, to fill those gaps. That access really matters most 657 00:34:56,320 --> 00:34:58,600 Speaker 2: for folks whose care is being cut off, like abortion 658 00:34:58,719 --> 00:35:02,840 Speaker 2: seekers or queer youth living in states where healthcare 659 00:35:02,960 --> 00:35:05,920 Speaker 2: is under attack. And so if you have these social 660 00:35:05,960 --> 00:35:09,960 Speaker 2: media platforms kind of adding to a landscape where that 661 00:35:10,000 --> 00:35:12,840 Speaker 2: information is difficult to access, even if that information is 662 00:35:12,840 --> 00:35:15,640 Speaker 2: not against their rules, it's just making it that much 663 00:35:15,680 --> 00:35:16,360 Speaker 2: more difficult.
664 00:35:16,360 --> 00:35:19,280 Speaker 4: And these decisions really do matter. 665 00:35:19,360 --> 00:35:21,400 Speaker 2: I mean, some of them are life or death, 666 00:35:21,640 --> 00:35:24,680 Speaker 2: and they really have real-world impact on people's lives. 667 00:35:25,400 --> 00:35:31,360 Speaker 3: Absolutely. And unfortunately, this is part of 668 00:35:31,440 --> 00:35:34,240 Speaker 3: kind of a larger issue, kind of a larger attack, 669 00:35:35,239 --> 00:35:36,520 Speaker 3: a gendered attack. Correct? 670 00:35:36,880 --> 00:35:39,480 Speaker 2: Yes. So this is something I find so interesting, and 671 00:35:39,520 --> 00:35:41,440 Speaker 2: I actually should come back and do another episode on it. 672 00:35:41,560 --> 00:35:43,200 Speaker 2: I'm in the middle of some research on it right now. 673 00:35:43,239 --> 00:35:46,960 Speaker 2: But the Center for Intimacy Justice, whose report I mentioned earlier, 674 00:35:47,239 --> 00:35:50,759 Speaker 2: they have another report that really shows how platforms routinely 675 00:35:50,840 --> 00:35:54,839 Speaker 2: suppress sexual health content for women and trans folks, while 676 00:35:54,960 --> 00:35:58,280 Speaker 2: leaving content aimed at supporting the sexual health of cis 677 00:35:58,280 --> 00:36:01,880 Speaker 2: men largely untouched. Right? So, I know lots of people 678 00:36:01,960 --> 00:36:07,239 Speaker 2: who run businesses that are focused on the sexual health 679 00:36:07,239 --> 00:36:09,560 Speaker 2: of people who are not cis men. Right? So, if 680 00:36:09,560 --> 00:36:12,759 Speaker 2: you have pelvic pain, if you need sex toys, like, 681 00:36:12,800 --> 00:36:16,400 Speaker 2: all these different things that are aimed at people who 682 00:36:16,480 --> 00:36:18,839 Speaker 2: are not cisgender men, I have lots of friends who 683 00:36:19,239 --> 00:36:22,400 Speaker 2: run businesses like that. They are essentially not able to 684 00:36:22,480 --> 00:36:26,640 Speaker 2: do any kind of advertising on Facebook, because Facebook does 685 00:36:26,680 --> 00:36:32,359 Speaker 2: not allow it. However, Facebook certainly allows information about the 686 00:36:32,600 --> 00:36:36,440 Speaker 2: sexual wellness of cisgender men. So we really have a 687 00:36:36,480 --> 00:36:40,120 Speaker 2: climate where, let's face it, the mostly men who run these 688 00:36:40,160 --> 00:36:44,680 Speaker 2: platforms are able to determine whose sexual health is important 689 00:36:44,719 --> 00:36:48,320 Speaker 2: and whose is not, whose healthcare is healthcare, and whose 690 00:36:48,440 --> 00:36:50,840 Speaker 2: is, like, something perverted that needs to be suppressed and 691 00:36:50,840 --> 00:36:52,080 Speaker 2: isn't allowed on their platform. 692 00:36:53,239 --> 00:36:55,480 Speaker 1: Yeah. Now that I'm thinking about it, as I've been looking 693 00:36:55,520 --> 00:36:59,239 Speaker 1: at Instagram, the amount of GLP-1 ads that I've 694 00:36:59,239 --> 00:37:01,560 Speaker 1: been getting, which is interesting because I thought that was 695 00:37:01,640 --> 00:37:05,759 Speaker 1: medication that you had to get through a doctor, is overwhelming. 696 00:37:05,920 --> 00:37:09,440 Speaker 1: But also, on the vice versa of that, is men's, 697 00:37:09,600 --> 00:37:14,719 Speaker 1: you know, sexual health stuff. Those are the two 698 00:37:14,760 --> 00:37:16,960 Speaker 1: ads that I get.
Definitely nothing about women's birth 699 00:37:16,960 --> 00:37:20,560 Speaker 1: control. Rarely there's a few, but as 700 00:37:20,600 --> 00:37:23,160 Speaker 1: of late, I think zero. But the amount of GLP-1 701 00:37:23,239 --> 00:37:27,400 Speaker 1: ads, I'm like, whoa, what is happening, Instagram? I 702 00:37:27,560 --> 00:37:28,640 Speaker 1: thought we weren't allowing this. 703 00:37:29,320 --> 00:37:33,120 Speaker 2: Yes, I mean, the amount of ads I get specifically 704 00:37:33,120 --> 00:37:36,359 Speaker 2: for the erectile dysfunction medication Blue 705 00:37:36,080 --> 00:37:38,799 Speaker 4: Chew, and the ads, have you ever seen 706 00:37:38,800 --> 00:37:39,560 Speaker 4: these ads online? 707 00:37:39,600 --> 00:37:43,760 Speaker 2: The ads are clearly targeted at women. So it's a 708 00:37:43,840 --> 00:37:48,160 Speaker 2: cute woman being like, ladies, get your man to get 709 00:37:48,200 --> 00:37:51,160 Speaker 2: Blue Chew, Blue Chew is gonna rock your world. 710 00:37:51,280 --> 00:37:52,480 Speaker 4: Get your man on Blue Chew. 711 00:37:52,680 --> 00:37:55,680 Speaker 2: And that's a medication that is an erectile 712 00:37:55,719 --> 00:37:57,719 Speaker 2: dysfunction prescription medication. 713 00:37:58,080 --> 00:38:01,120 Speaker 4: But these platforms have just decided, oh no, that's okay. 714 00:38:01,200 --> 00:38:01,640 Speaker 4: That's that. 715 00:38:01,719 --> 00:38:03,880 Speaker 2: You can show that all day long, no problem, 716 00:38:04,080 --> 00:38:06,040 Speaker 2: you can boost it, you can put money behind it. 717 00:38:06,239 --> 00:38:10,279 Speaker 1: Totally fine. And again, the other part being 718 00:38:10,320 --> 00:38:13,760 Speaker 1: the weight loss medication, where slowly a lot more information 719 00:38:13,840 --> 00:38:15,799 Speaker 1: comes back, like, oh, there's side effects. This might not 720 00:38:15,840 --> 00:38:16,759 Speaker 1: be as good as you think. 721 00:38:16,840 --> 00:38:20,359 Speaker 4: By the way. Oh yes, I'll just say yes. I'll 722 00:38:20,400 --> 00:38:21,040 Speaker 4: just say yes. 723 00:38:21,560 --> 00:38:24,080 Speaker 1: Well, that's like as of late, and we knew this 724 00:38:24,239 --> 00:38:26,399 Speaker 1: was coming. We knew this was coming because they've also 725 00:38:26,440 --> 00:38:29,000 Speaker 1: got variations that are not FDA approved, which I guess 726 00:38:29,000 --> 00:38:31,800 Speaker 1: means very little at this point in time. But again, 727 00:38:31,880 --> 00:38:34,240 Speaker 1: this is the rampant amount of ads, like every 728 00:38:34,320 --> 00:38:39,680 Speaker 1: two scrolls on Instagram, that pops up, and on 729 00:38:39,719 --> 00:38:41,759 Speaker 1: Facebook too, and I'm like, I don't even go 730 00:38:41,840 --> 00:38:43,960 Speaker 1: to Facebook. I just need to know people's birthdays. That's 731 00:38:43,960 --> 00:38:47,720 Speaker 1: all I need, that's all I really want. But again, 732 00:38:47,880 --> 00:38:50,080 Speaker 1: this seems to be, like, I thought, if that 733 00:38:50,280 --> 00:38:54,480 Speaker 1: was your policy from jump, then how are these ads, 734 00:38:54,840 --> 00:38:58,280 Speaker 1: I know, paying you millions, how are these okay? 735 00:38:58,760 --> 00:39:03,560 Speaker 2: Yeah, their policy is totally inconsistent, seemingly arbitrary, and seemingly 736 00:39:03,719 --> 00:39:06,560 Speaker 2: biased against any kind of marginalized identity.
Like, that's just 737 00:39:06,600 --> 00:39:09,600 Speaker 2: what's going on. They don't have any transparency. They say 738 00:39:09,600 --> 00:39:12,319 Speaker 2: one thing and do another, and that just is 739 00:39:12,360 --> 00:39:15,720 Speaker 2: the norm for them. And I really 740 00:39:15,760 --> 00:39:21,239 Speaker 2: do think that we should be talking about how, you know, again, 741 00:39:21,320 --> 00:39:23,840 Speaker 2: let's be honest, we're talking about mostly men, and 742 00:39:23,880 --> 00:39:27,320 Speaker 2: not just men, like a specific kind of man, white, moneyed, 743 00:39:27,600 --> 00:39:30,400 Speaker 2: coastal, all of that. How we have given them so 744 00:39:30,640 --> 00:39:35,240 Speaker 2: much power to define what knowledge is acceptable, whose voices 745 00:39:35,280 --> 00:39:38,960 Speaker 2: are amplified, whose bodies are left at risk. And 746 00:39:39,200 --> 00:39:42,440 Speaker 2: when platforms decide what can and can't be shared, 747 00:39:42,719 --> 00:39:43,640 Speaker 4: they are making 748 00:39:43,480 --> 00:39:46,799 Speaker 2: public health decisions with global consequences, in ways that are 749 00:39:46,800 --> 00:39:51,000 Speaker 2: often contrary to public health and that also reinforce systemic 750 00:39:51,040 --> 00:39:53,480 Speaker 2: inequalities. And so I just think, you know, we're 751 00:39:53,600 --> 00:39:56,680 Speaker 2: not just talking about, like, vague 752 00:39:56,719 --> 00:39:58,440 Speaker 2: policy language. I know that I have spent a lot 753 00:39:58,440 --> 00:40:00,520 Speaker 2: of time talking about that, because it's annoying. But 754 00:40:01,400 --> 00:40:04,799 Speaker 2: it's really about them deciding who gets to speak, who 755 00:40:04,840 --> 00:40:07,440 Speaker 2: gets seen, who gets access to the information that they 756 00:40:07,560 --> 00:40:10,200 Speaker 2: need to make decisions about their own health and bodies. 757 00:40:10,480 --> 00:40:14,680 Speaker 2: When Meta and these platforms silence accurate, essential sexual and 758 00:40:14,680 --> 00:40:19,120 Speaker 2: reproductive health information, they're not just enforcing inconsistent rules about bodies, 759 00:40:19,160 --> 00:40:21,120 Speaker 2: I mean they are, but they're not just doing that. 760 00:40:21,320 --> 00:40:25,000 Speaker 2: They are also shaping people's lives and really deciding whose 761 00:40:25,040 --> 00:40:27,959 Speaker 2: health matters and whose doesn't. And in a world where 762 00:40:28,000 --> 00:40:30,600 Speaker 2: we know the internet has really become this lifeline, that 763 00:40:30,760 --> 00:40:33,360 Speaker 2: is not just annoying, although I am annoyed, it is 764 00:40:33,480 --> 00:40:37,080 Speaker 2: dangerous, because free speech shouldn't come with a disclaimer that 765 00:40:37,120 --> 00:40:40,040 Speaker 2: your body's safety is just optional and up to the 766 00:40:40,080 --> 00:40:41,360 Speaker 2: whims of Mark Zuckerberg. 767 00:40:42,320 --> 00:40:43,720 Speaker 3: No, I don't want to live in that world. 768 00:40:43,840 --> 00:40:48,640 Speaker 2: No. No, I mean, I think about this all the time, 769 00:40:48,800 --> 00:40:53,920 Speaker 2: the ways that these individuals, like a handful of individual, 770 00:40:54,200 --> 00:40:58,640 Speaker 2: mostly white guys, get to define what our worlds look 771 00:40:58,760 --> 00:41:00,840 Speaker 2: like in these very concrete ways. 772 00:41:00,920 --> 00:41:03,320 Speaker 4: And I've never met Mark Zuckerberg.
773 00:41:03,360 --> 00:41:05,719 Speaker 2: Although I have met Sheryl Sandberg, but I've never met 774 00:41:05,760 --> 00:41:09,239 Speaker 2: Mark Zuckerberg. I don't want Mark Zuckerberg in charge of 775 00:41:09,280 --> 00:41:12,080 Speaker 2: deciding anything for my life. I don't think Mark Zuckerberg 776 00:41:12,120 --> 00:41:15,800 Speaker 2: and I have any common idea about what it means 777 00:41:15,800 --> 00:41:18,120 Speaker 2: to have a good, fulfilled life. I don't want him 778 00:41:18,160 --> 00:41:19,560 Speaker 2: designing what my future looks like. 779 00:41:21,680 --> 00:41:22,760 Speaker 3: I think that's very wise. 780 00:41:23,680 --> 00:41:26,200 Speaker 1: I mean, that one movie made him look pretty 781 00:41:26,239 --> 00:41:26,759 Speaker 1: much like a dick. 782 00:41:26,840 --> 00:41:29,560 Speaker 2: Oh my god, you mean The Social Network, one 783 00:41:29,560 --> 00:41:30,720 Speaker 2: of my favorite movies. 784 00:41:31,520 --> 00:41:34,040 Speaker 4: Oh my god. I don't want, I don't want 785 00:41:33,880 --> 00:41:37,040 Speaker 2: to spoil it, but the ending of that movie is 786 00:41:37,640 --> 00:41:38,879 Speaker 2: my version of Citizen Kane. 787 00:41:38,920 --> 00:41:40,160 Speaker 4: Have you both seen it? 788 00:41:40,360 --> 00:41:43,680 Speaker 1: Yeah, I've not seen it. I've seen clips, because I'm like, 789 00:41:43,680 --> 00:41:45,239 Speaker 1: I don't even want to know, but he seems 790 00:41:45,000 --> 00:41:45,359 Speaker 3: like a dick. 791 00:41:45,719 --> 00:41:48,400 Speaker 2: Go home and watch it tonight. I know your birthday 792 00:41:48,440 --> 00:41:52,200 Speaker 2: is coming up. Is it a fun birthday watch? 793 00:41:52,239 --> 00:41:54,279 Speaker 2: I mean, I'm such a nerd, I say it's a 794 00:41:54,280 --> 00:41:56,160 Speaker 2: fun watch. But if you're looking for a movie to 795 00:41:56,160 --> 00:41:58,200 Speaker 2: watch on your birthday, that might, that might 796 00:41:58,080 --> 00:41:58,920 Speaker 1: be the one. 797 00:42:00,080 --> 00:42:03,120 Speaker 3: It's a good, it's a solid, like, oh yeah, you're 798 00:42:03,160 --> 00:42:04,840 Speaker 3: just a sad man ending. 799 00:42:05,440 --> 00:42:10,240 Speaker 2: Yes, you know what I'm talking about. You kids, Citizen Kane, 800 00:42:11,040 --> 00:42:13,680 Speaker 2: the sled moment at the end of that movie. It 801 00:42:14,320 --> 00:42:16,399 Speaker 2: haunts me. And I think 802 00:42:16,400 --> 00:42:18,040 Speaker 2: if you've seen it, it gives context for some of the 803 00:42:18,080 --> 00:42:20,160 Speaker 2: stuff we've talked about when it comes to Zuckerberg today. 804 00:42:21,000 --> 00:42:24,320 Speaker 1: All right, without even seeing it, I was like, all right. 805 00:42:26,480 --> 00:42:28,879 Speaker 3: Also, I love it has such a dark soundtrack. 806 00:42:30,400 --> 00:42:32,440 Speaker 4: Who is it again? 807 00:42:33,560 --> 00:42:37,959 Speaker 2: Nobody does a haunting soundtrack like Trent Reznor. The Gone Girl 808 00:42:38,040 --> 00:42:41,680 Speaker 2: soundtrack, the soundtrack to Challengers, is also Trent Reznor. And that's 809 00:42:41,680 --> 00:42:43,440 Speaker 2: the soundtrack I put on when I'm writing, if you 810 00:42:43,480 --> 00:42:46,400 Speaker 2: need to focus and just, like, put on some headphones 811 00:42:46,440 --> 00:42:47,840 Speaker 2: and be like, we are writing. 812 00:42:48,080 --> 00:42:49,960 Speaker 4: Let's do it. That is your soundtrack.
813 00:42:50,440 --> 00:42:55,240 Speaker 2: Reznor can write a dark movie soundtrack like nobody's business. 814 00:42:55,880 --> 00:42:56,520 Speaker 4: And I love that 815 00:42:56,560 --> 00:42:59,880 Speaker 3: it was in this movie about this college kid trying 816 00:42:59,880 --> 00:43:06,200 Speaker 3: to get a girl to like him. Oh Lord. Yes, okay, 817 00:43:06,239 --> 00:43:10,800 Speaker 3: well, we'll revisit that later. We'll do, uh, maybe, you 818 00:43:11,080 --> 00:43:14,359 Speaker 3: know, we had a fun time trash talking some Hugh 819 00:43:14,400 --> 00:43:17,600 Speaker 3: Hefner that one time. We'll come back with a fun 820 00:43:18,560 --> 00:43:21,360 Speaker 3: thing for you, Bridget, and you can decide, because you 821 00:43:21,360 --> 00:43:25,920 Speaker 3: always bring us these heavy topics of your choosing. But 822 00:43:26,880 --> 00:43:29,919 Speaker 3: before then, thank you so much for being here. Where 823 00:43:29,920 --> 00:43:31,440 Speaker 3: can the good listeners find you? 824 00:43:31,440 --> 00:43:33,440 Speaker 2: You can find me on my podcast, There Are No 825 00:43:33,560 --> 00:43:36,000 Speaker 2: Girls on the Internet. You can find me on Instagram, 826 00:43:36,320 --> 00:43:38,120 Speaker 2: I know it's owned by Mark Zuckerberg, I don't like 827 00:43:38,120 --> 00:43:42,520 Speaker 2: it either, at Bridget Marie in DC, TikTok at Bridget Marie in DC, 828 00:43:42,760 --> 00:43:43,400 Speaker 2: and on YouTube, 829 00:43:43,560 --> 00:43:44,920 Speaker 4: There Are No Girls on the Internet. 830 00:43:46,080 --> 00:43:48,360 Speaker 3: Yes, go check all of that out if you haven't 831 00:43:48,440 --> 00:43:51,960 Speaker 3: already. Listeners, if you would like to contact us, 832 00:43:51,960 --> 00:43:54,200 Speaker 3: you can email us at Hello at Stuff 833 00:43:54,239 --> 00:43:56,040 Speaker 3: Mom Never Told You. You can find us on Bluesky 834 00:43:56,080 --> 00:43:58,520 Speaker 3: at mom stuff podcast, or on Instagram and TikTok at Stuff 835 00:43:58,640 --> 00:44:01,520 Speaker 3: Mom Never Told You. We're also on YouTube. We have new 836 00:44:01,560 --> 00:44:03,480 Speaker 3: merchandise at Cotton Bureau, and we have a book you 837 00:44:03,480 --> 00:44:05,440 Speaker 3: can get wherever you get your books. Thanks as always 838 00:44:05,440 --> 00:44:08,040 Speaker 3: to our super producer Christina, our executive producer Maya, and our contributor Joey. 839 00:44:08,239 --> 00:44:10,160 Speaker 1: Thank you, and thanks to you for listening. 840 00:44:10,360 --> 00:44:12,040 Speaker 3: Stuff Mom Never Told You is a production of iHeartRadio. For 841 00:44:12,040 --> 00:44:13,640 Speaker 3: more podcasts from iHeartRadio, you can check out 842 00:44:13,640 --> 00:44:15,479 Speaker 3: the iHeartRadio app, Apple Podcasts, or wherever you listen 843 00:44:15,480 --> 00:44:16,520 Speaker 3: to your favorite shows.