1 00:00:05,280 --> 00:00:07,600 Speaker 1: Hey, this is Annie and Samantha and welcome to Stuff 2 00:00:07,600 --> 00:00:19,320 Speaker 2: Mom Never Told You, a production of iHeartRadio, and today 3 00:00:19,520 --> 00:00:23,160 Speaker 2: we are so so happy to once again be joined 4 00:00:23,480 --> 00:00:28,160 Speaker 2: by the brilliant, the fabulous, the talented, so wonderful Bridget 5 00:00:28,240 --> 00:00:31,320 Speaker 2: Todd here with us for the last time in twenty 6 00:00:31,360 --> 00:00:33,199 Speaker 2: twenty three. 7 00:00:34,240 --> 00:00:36,880 Speaker 3: Thank you for having me. Happy holidays, y'all. 8 00:00:37,120 --> 00:00:38,560 Speaker 4: Happy holidays to you. 9 00:00:39,120 --> 00:00:43,440 Speaker 1: Yes, I'm so glad you had the time to make it. 10 00:00:43,479 --> 00:00:46,680 Speaker 2: I know it's a really busy, busy time of year 11 00:00:47,080 --> 00:00:51,000 Speaker 2: for everybody, so thank you. 12 00:00:51,120 --> 00:00:54,040 Speaker 5: Yeah, my pleasure. I am such a grinch like this 13 00:00:54,120 --> 00:00:56,800 Speaker 5: time of year. I find it stressful, I find it overwhelming. 14 00:00:57,440 --> 00:01:00,680 Speaker 5: I feel like it takes more work than usual to 15 00:01:00,760 --> 00:01:03,920 Speaker 5: show up as my best self because I'm just like, 16 00:01:03,960 --> 00:01:07,200 Speaker 5: I'm like baseline stressed and overwhelmed and grouchy. So 17 00:01:07,319 --> 00:01:10,920 Speaker 5: this will be a fun, you know, departure from all 18 00:01:10,959 --> 00:01:14,600 Speaker 5: of those holiday plans, blah blah blah, stuff that can 19 00:01:14,640 --> 00:01:15,559 Speaker 5: sometimes get me down. 20 00:01:15,959 --> 00:01:18,199 Speaker 1: Yeah yeah, well I hope. 21 00:01:18,240 --> 00:01:22,360 Speaker 2: So this is one of our last recordings of the year, 22 00:01:22,440 --> 00:01:25,360 Speaker 2: so it's got that again, that kind of like last 23 00:01:25,440 --> 00:01:26,399 Speaker 2: day of school vibe. 24 00:01:27,640 --> 00:01:30,000 Speaker 1: Are you doing anything fun? Are you doing like Weirdo 25 00:01:30,160 --> 00:01:31,680 Speaker 1: Christmas or anything. 26 00:01:31,480 --> 00:01:32,960 Speaker 3: Like that? Good memory. 27 00:01:33,080 --> 00:01:35,520 Speaker 5: So for folks who don't know, my partner and I 28 00:01:35,560 --> 00:01:39,119 Speaker 5: have started a holiday tradition called Weirdo Christmas. If folks 29 00:01:39,160 --> 00:01:41,959 Speaker 5: are interested in partaking, you can have your own Weirdo Christmas. 30 00:01:41,959 --> 00:01:46,440 Speaker 5: And Weirdo Christmas is before you do Christmas with your family, 31 00:01:46,480 --> 00:01:51,200 Speaker 5: whatever your, like, Christmas obligation looks like, whatever that is to you, 32 00:01:51,200 --> 00:01:54,120 Speaker 3: you just go have your own little Christmas trip. 33 00:01:54,160 --> 00:01:56,080 Speaker 5: We usually get a cabin with a hot tub, but 34 00:01:56,160 --> 00:01:58,760 Speaker 5: it can be anything where the whole point is just 35 00:01:58,800 --> 00:02:04,960 Speaker 5: to have fun, experience holiday merriment without the obligations that 36 00:02:05,000 --> 00:02:09,600 Speaker 5: sometimes come with like holiday family plans. Our particular way 37 00:02:09,639 --> 00:02:13,920 Speaker 5: of celebrating it is weird costumes, weird lights, and we 38 00:02:14,000 --> 00:02:16,560 Speaker 5: have to watch at least one, but usually several more 39 00:02:16,840 --> 00:02:18,440 Speaker 5: bad holiday movies.
40 00:02:18,600 --> 00:02:19,000 Speaker 1: Ooo. 41 00:02:20,680 --> 00:02:22,560 Speaker 3: And we like, talk over them and make fun of them. 42 00:02:23,240 --> 00:02:26,799 Speaker 6: Sure, did we talk about what you consider bad previously? 43 00:02:27,880 --> 00:02:30,000 Speaker 3: Ooh, I don't think so. 44 00:02:30,000 --> 00:02:33,440 Speaker 5: So for us it's a lot of over the top 45 00:02:33,600 --> 00:02:39,239 Speaker 5: Hallmark holiday movies. Usually, you know, a girl who has 46 00:02:39,280 --> 00:02:42,040 Speaker 5: a good job in the big city, comes home. Her 47 00:02:42,040 --> 00:02:45,200 Speaker 5: family runs like a bakery, but it's in trouble and 48 00:02:45,240 --> 00:02:48,639 Speaker 5: then she has to meet a hometown holiday hunk who 49 00:02:48,720 --> 00:02:50,880 Speaker 5: shows her the true meaning of Christmas. 50 00:02:51,000 --> 00:02:54,120 Speaker 6: Yes, who's been like a high school friend or something 51 00:02:54,160 --> 00:02:57,040 Speaker 6: that they reconnect after years and oh my goodness, he 52 00:02:57,120 --> 00:02:59,360 Speaker 6: runs a coffee shop and he's richer than he thought 53 00:02:59,440 --> 00:02:59,840 Speaker 6: or something. 54 00:03:00,240 --> 00:03:03,280 Speaker 3: Yeah, or it's like flipped. 55 00:03:03,400 --> 00:03:06,240 Speaker 5: So like if you're listening and you are a woman 56 00:03:06,240 --> 00:03:09,440 Speaker 5: who has dark hair, brown hair, and 57 00:03:09,480 --> 00:03:11,760 Speaker 5: you care about your career, you have a like full, 58 00:03:11,840 --> 00:03:14,560 Speaker 5: well rounded life, like a good career, good friends, and 59 00:03:14,639 --> 00:03:17,640 Speaker 5: you are going with your partner to a small town 60 00:03:17,680 --> 00:03:18,600 Speaker 5: to visit his family. 61 00:03:18,840 --> 00:03:20,279 Speaker 3: You're about to get left. 62 00:03:20,000 --> 00:03:24,399 Speaker 5: For like a wholesome girl next door type who can 63 00:03:24,919 --> 00:03:26,720 Speaker 5: really understand it's the true meaning of Christmas. 64 00:03:26,720 --> 00:03:29,160 Speaker 3: You just know that that's always the plot of these movies. 65 00:03:30,000 --> 00:03:33,600 Speaker 6: Well, that's kind of a what is that holiday movie 66 00:03:33,840 --> 00:03:35,120 Speaker 6: with Sarah Jessica 67 00:03:34,639 --> 00:03:36,880 Speaker 3: Parker? The Family Stone. 68 00:03:37,440 --> 00:03:40,480 Speaker 6: Yes, like kind of is that whole trope together in one? 69 00:03:40,680 --> 00:03:44,160 Speaker 6: Like you're like, this is the worst holiday movie ever? 70 00:03:44,280 --> 00:03:46,280 Speaker 6: Who thought this was a good idea because they took 71 00:03:46,280 --> 00:03:49,080 Speaker 6: all the tropes of like Hallmark but made it tragic. 72 00:03:49,400 --> 00:03:53,000 Speaker 3: Yes, also, The Family Stone, that's spoiler alert. Oh it's 73 00:03:53,000 --> 00:03:54,080 Speaker 3: been out for like twenty years. 74 00:03:54,640 --> 00:03:58,120 Speaker 5: It has one of my... One of the plotlines I 75 00:03:58,080 --> 00:04:03,120 Speaker 5: find so interesting is when partners swap. Like, folks, that's 76 00:04:03,240 --> 00:04:03,800 Speaker 5: a part of it. 77 00:04:03,960 --> 00:04:05,960 Speaker 3: I always find that to be so weird.
78 00:04:07,160 --> 00:04:10,840 Speaker 6: It was the Luke Wilson character and the Claire Danes 79 00:04:11,200 --> 00:04:14,160 Speaker 6: coming in and you're like, this person is abhorrent, but 80 00:04:14,200 --> 00:04:16,919 Speaker 6: this one is easy and nice and easy, like good, 81 00:04:17,120 --> 00:04:19,400 Speaker 6: good to go, like just they balance each other out 82 00:04:19,400 --> 00:04:22,280 Speaker 6: and you're like, wow, this is what what. 83 00:04:22,240 --> 00:04:24,640 Speaker 4: Is this lesson we're supposed to learn here? 84 00:04:25,680 --> 00:04:25,960 Speaker 3: Annie? 85 00:04:26,000 --> 00:04:30,320 Speaker 4: I'm sure you have never seen it. 86 00:04:30,320 --> 00:04:32,360 Speaker 3: It's actually a pretty good movie. I'm not gonna lie, 87 00:04:32,520 --> 00:04:32,760 Speaker 3: I don't. 88 00:04:34,279 --> 00:04:36,520 Speaker 6: I know, people really love it. In fact, the 89 00:04:36,560 --> 00:04:38,120 Speaker 6: reason I thought about this is because it was on 90 00:04:38,200 --> 00:04:41,120 Speaker 6: TikTok and they were trying to redeem it and saying like, 91 00:04:41,320 --> 00:04:43,920 Speaker 6: it's supposed to be more realistic. Everybody's supposed to be 92 00:04:44,040 --> 00:04:47,040 Speaker 6: unlikable to a certain extent because they all have issues, 93 00:04:47,080 --> 00:04:49,200 Speaker 6: and that's supposed to be the point. And it comes 94 00:04:49,240 --> 00:04:52,640 Speaker 6: together like it's just family because you also have Rachel the... 95 00:04:52,680 --> 00:04:54,840 Speaker 3: Rachel McAdams. I think it's her best role. 96 00:04:55,480 --> 00:04:58,280 Speaker 6: Yeah, so you have her in there, you have uh 97 00:04:58,480 --> 00:05:01,760 Speaker 6: like Diane Keaton. You just have a whole list of big, 98 00:05:01,880 --> 00:05:06,320 Speaker 6: big name actors in there, and they're just all somewhat dislikable. 99 00:05:07,160 --> 00:05:09,120 Speaker 6: You have the token gay couple in there, and of 100 00:05:09,160 --> 00:05:11,159 Speaker 6: course one of them is black, so you know, yay, 101 00:05:11,279 --> 00:05:12,520 Speaker 6: everybody's diversity. 102 00:05:12,560 --> 00:05:14,720 Speaker 3: And I think, is one of them deaf as well? 103 00:05:15,120 --> 00:05:16,880 Speaker 4: Yes, I think so. I think they just put that 104 00:05:16,960 --> 00:05:18,640 Speaker 4: in one couple. 105 00:05:19,320 --> 00:05:21,480 Speaker 3: A win for diversity, this movie was. 106 00:05:23,360 --> 00:05:25,640 Speaker 6: The checklist was real, like, yes, you checked all of 107 00:05:25,680 --> 00:05:27,800 Speaker 6: those, great, come on in. But yeah, like it's such 108 00:05:27,800 --> 00:05:30,200 Speaker 6: an interesting movie. But I always thought of it as 109 00:05:30,279 --> 00:05:33,159 Speaker 6: like this is what happens if a Hallmark movie were 110 00:05:33,200 --> 00:05:35,559 Speaker 6: to come together in a motion like a large motion picture, 111 00:05:35,800 --> 00:05:38,719 Speaker 6: and you just really never had too many redemptive arcs 112 00:05:38,760 --> 00:05:41,720 Speaker 6: except for enough to be like we're a family, we're 113 00:05:41,760 --> 00:05:43,920 Speaker 6: gonna pull it together type of conversation. 114 00:05:44,320 --> 00:05:45,640 Speaker 4: That's how I thought of that movie. 115 00:05:47,640 --> 00:05:51,000 Speaker 5: And what's more important to the true meaning of Christmas 116 00:05:51,240 --> 00:05:53,560 Speaker 5: than family persevering.
117 00:05:53,000 --> 00:05:56,159 Speaker 4: In the end, right with someone always about to die. 118 00:05:56,440 --> 00:06:07,159 Speaker 7: Yes, Weirdo Christmas the same, obviously Hallmark? 119 00:06:07,160 --> 00:06:10,080 Speaker 6: Because it's been a minute. If I like, I've not 120 00:06:10,120 --> 00:06:14,159 Speaker 6: seen many Hallmark movies, I'm trying to think because like 121 00:06:14,800 --> 00:06:17,640 Speaker 6: they are along the same lines of Lifetime, right, Lifetime 122 00:06:17,680 --> 00:06:20,760 Speaker 6: has been yeah but over yeah. 123 00:06:20,800 --> 00:06:23,640 Speaker 5: So I'm using Hallmark as like a catch all for 124 00:06:23,800 --> 00:06:27,640 Speaker 5: like a, like a type of movie. 125 00:06:27,880 --> 00:06:30,040 Speaker 3: But if you really want to watch a movie, like. 126 00:06:30,000 --> 00:06:33,400 Speaker 5: A Christmas movie that is like I almost feel like. 127 00:06:33,400 --> 00:06:34,800 Speaker 3: Hallmark's a little more highbrow. 128 00:06:35,080 --> 00:06:37,400 Speaker 5: If you really want to watch a really bad Christmas movie 129 00:06:37,400 --> 00:06:41,120 Speaker 5: that's like, it knows what it is, check 130 00:06:41,120 --> 00:06:45,359 Speaker 5: out a MarVista Entertainment movie because they basically only 131 00:06:45,440 --> 00:06:50,039 Speaker 5: make bad Christmas movies like Hallmark in theme, but 132 00:06:50,160 --> 00:06:51,360 Speaker 5: like way. 133 00:06:51,160 --> 00:06:52,560 Speaker 3: Trashier and way worse. 134 00:06:53,600 --> 00:06:58,920 Speaker 5: We watched one recently where the, like, logo of the movie, like 135 00:06:59,000 --> 00:07:03,040 Speaker 5: the image for the movie had Chevy Chase prominently featured, 136 00:07:03,120 --> 00:07:04,320 Speaker 5: so we're like, oh, I guess Chevy 137 00:07:04,200 --> 00:07:06,800 Speaker 3: Chase is in this movie. When I tell you that Chevy Chase 138 00:07:06,720 --> 00:07:09,080 Speaker 5: is in the movie for like thirty seconds, like he's also 139 00:07:09,400 --> 00:07:11,760 Speaker 5: clearly not filmed with everybody else, so it's like, oh, y'all 140 00:07:11,760 --> 00:07:14,840 Speaker 5: have, y'all have like thirty minutes with Chevy Chase and 141 00:07:14,840 --> 00:07:16,240 Speaker 5: you were like, throw him on the box he's in. 142 00:07:18,680 --> 00:07:20,800 Speaker 6: That's amazing because I think it's like he got pretty 143 00:07:20,840 --> 00:07:24,160 Speaker 6: much blacklisted because he was so difficult, notoriously difficult, so 144 00:07:24,200 --> 00:07:26,040 Speaker 6: that would make sense. He'd be like, yeah, pay me 145 00:07:26,040 --> 00:07:27,560 Speaker 6: a lot of money. I'll be on it for the 146 00:07:27,600 --> 00:07:31,240 Speaker 6: thirty seconds. This is what I'll do. Now, that's amazing. 147 00:07:33,080 --> 00:07:35,360 Speaker 4: Wow. So you've watched all those movies? 148 00:07:35,920 --> 00:07:37,080 Speaker 3: I've watched a ton of them. 149 00:07:37,160 --> 00:07:41,440 Speaker 5: I have watched, yes. And another, another genre that I 150 00:07:41,480 --> 00:07:46,120 Speaker 5: really enjoy is, Annie, you'll appreciate this, horror Christmas movies. 151 00:07:46,280 --> 00:07:47,320 Speaker 3: I feel like there's a lot. 152 00:07:47,560 --> 00:07:52,360 Speaker 5: Of really good holiday, cheesy horror movies. It's a really 153 00:07:52,360 --> 00:07:53,800 Speaker 5: pretty good intersection that I.
154 00:07:53,880 --> 00:07:56,679 Speaker 2: Like, Oh yeah, oh yeah, I think that's a big 155 00:07:56,760 --> 00:07:57,560 Speaker 2: trend right now. 156 00:07:57,760 --> 00:07:58,920 Speaker 4: It is, and I'm. 157 00:07:58,720 --> 00:08:02,160 Speaker 1: Super into it. I love it. 158 00:08:02,280 --> 00:08:02,360 Speaker 3: This. 159 00:08:02,680 --> 00:08:05,480 Speaker 2: Yeah, this has been interesting because I had like 160 00:08:05,480 --> 00:08:08,200 Speaker 2: my go to movies, but this year has definitely been 161 00:08:08,200 --> 00:08:10,520 Speaker 2: more of a horror Christmas year than. 162 00:08:13,040 --> 00:08:17,080 Speaker 6: Have you watched Violent Night? Or is it the 163 00:08:17,600 --> 00:08:19,800 Speaker 6: Is it like just a violent action movie? 164 00:08:19,840 --> 00:08:21,600 Speaker 1: It's kind of a yeah. 165 00:08:21,840 --> 00:08:25,440 Speaker 2: I would say it's more of an action, like a 166 00:08:25,560 --> 00:08:27,520 Speaker 2: violent action action. 167 00:08:28,720 --> 00:08:31,880 Speaker 6: We watched it with my partner's parents and his mom 168 00:08:31,960 --> 00:08:34,640 Speaker 6: is like the sweetest, says a little note card prayer 169 00:08:34,679 --> 00:08:37,199 Speaker 6: for dinner, and she was just like, what is happening. 170 00:08:37,240 --> 00:08:41,120 Speaker 6: I'm like, he could not have picked a worse movie 171 00:08:40,520 --> 00:08:41,320 Speaker 1: if he tried. 172 00:08:41,760 --> 00:08:44,120 Speaker 3: How did this become the movie that you all were watching? 173 00:08:44,280 --> 00:08:47,920 Speaker 6: I don't know, he was, he was just bound. He's 174 00:08:47,960 --> 00:08:49,640 Speaker 6: like set on it, like this is it. I think he thought 175 00:08:49,679 --> 00:08:50,640 Speaker 6: it would just be funny. 176 00:08:51,400 --> 00:08:51,800 Speaker 4: It was not. 177 00:08:52,320 --> 00:08:54,880 Speaker 1: I mean that it was old movie. 178 00:08:54,920 --> 00:08:55,559 Speaker 3: I know. 179 00:08:56,000 --> 00:08:58,920 Speaker 2: I try very hard not to watch a movie like that. 180 00:08:59,080 --> 00:08:59,960 Speaker 2: The first time. 181 00:09:00,040 --> 00:09:06,160 Speaker 4: I think he's been banned from picking movies. Yeah, I 182 00:09:06,280 --> 00:09:07,000 Speaker 4: regret it. 183 00:09:09,120 --> 00:09:09,800 Speaker 1: I got it. 184 00:09:09,760 --> 00:09:11,200 Speaker 4: It's a old Joys. 185 00:09:12,120 --> 00:09:14,600 Speaker 1: It definitely good. 186 00:09:14,679 --> 00:09:14,880 Speaker 7: That. 187 00:09:14,960 --> 00:09:18,079 Speaker 4: Do you have like New Year's movies that you watch? 188 00:09:18,400 --> 00:09:22,000 Speaker 5: Ooh, well, I have one that I have a soft 189 00:09:22,000 --> 00:09:24,240 Speaker 5: spot for. It is not a movie that 190 00:09:24,360 --> 00:09:26,880 Speaker 5: was well received at the time. But have y'all 191 00:09:26,880 --> 00:09:28,480 Speaker 5: ever seen 200 Cigarettes? 192 00:09:28,760 --> 00:09:29,480 Speaker 1: Oh no. 193 00:09:30,080 --> 00:09:32,120 Speaker 5: So it's a movie that, like, it's one of those 194 00:09:32,320 --> 00:09:36,200 Speaker 5: ensemble cast movies. Basically, it's like, I think it takes 195 00:09:36,200 --> 00:09:38,400 Speaker 5: place in nineteen eighty nine. 196 00:09:38,440 --> 00:09:40,200 Speaker 3: It's New Year's Eve of nineteen eighty nine.
197 00:09:40,240 --> 00:09:43,240 Speaker 5: It came out in nineteen ninety nine, and it's all 198 00:09:43,240 --> 00:09:45,840 Speaker 5: these different characters all trying to make plans for New 199 00:09:45,920 --> 00:09:46,400 Speaker 5: Year's Eve. 200 00:09:46,760 --> 00:09:48,480 Speaker 3: And it's one of those movies that came. 201 00:09:48,320 --> 00:09:50,960 Speaker 5: Out just at the perfect right time where the cast 202 00:09:51,160 --> 00:09:55,360 Speaker 5: has got the most interesting ensemble cast. It's like Kate Hudson, 203 00:09:55,600 --> 00:10:00,360 Speaker 5: Paul Rudd, Christina Ricci, Courtney Love, Ben Affleck. Very interesting cast. 204 00:10:00,800 --> 00:10:02,960 Speaker 5: It's not a movie that like, I'm not gonna say 205 00:10:03,000 --> 00:10:05,040 Speaker 5: it's a good movie because it's not. But it's a 206 00:10:05,080 --> 00:10:06,680 Speaker 5: movie that you if you watch it, you will not 207 00:10:06,800 --> 00:10:08,160 Speaker 5: not enjoy it. I'll put it that way. 208 00:10:08,280 --> 00:10:12,080 Speaker 3: Janeane Garofalo, I think. 209 00:10:12,000 --> 00:10:12,839 Speaker 4: I have seen that. 210 00:10:13,120 --> 00:10:15,600 Speaker 6: I think I have and it's been so long ago 211 00:10:16,000 --> 00:10:20,800 Speaker 6: and I've probably watched it because of Janeane Garofalo, 212 00:10:20,960 --> 00:10:22,040 Speaker 6: her during that age. 213 00:10:22,720 --> 00:10:25,400 Speaker 5: I feel like the nineties, the late nineties, there was 214 00:10:25,440 --> 00:10:29,600 Speaker 5: such a Janeane Garofalo renaissance happening, like she was in everything, 215 00:10:29,840 --> 00:10:31,040 Speaker 5: She's great in everything. 216 00:10:31,200 --> 00:10:31,880 Speaker 3: We love her. 217 00:10:32,679 --> 00:10:34,800 Speaker 6: Yes, I was, because she was the normal girl and 218 00:10:34,840 --> 00:10:37,120 Speaker 6: it made me feel seen. I was like and also 219 00:10:37,240 --> 00:10:39,280 Speaker 6: I had, like, is she Asian, is she part Asian? 220 00:10:39,360 --> 00:10:41,000 Speaker 6: She looks, she looks. 221 00:10:40,840 --> 00:10:43,760 Speaker 4: She looks not white, she's got her hair. 222 00:10:44,920 --> 00:10:48,120 Speaker 6: There's something that's just like maybe I had the small hopes. 223 00:10:48,760 --> 00:10:51,600 Speaker 5: Can I tell you a this has nothing to do 224 00:10:51,640 --> 00:10:54,240 Speaker 5: with anything, but can I tell you a random fun 225 00:10:54,320 --> 00:10:56,680 Speaker 5: fact about Janeane Garofalo? Please, yes. 226 00:10:57,040 --> 00:10:57,800 Speaker 3: So I don't know. 227 00:10:57,720 --> 00:10:59,200 Speaker 5: Why, but for some reason I was like, I wonder 228 00:10:59,200 --> 00:11:01,360 Speaker 5: if Garofalo was married. We never hear about her 229 00:11:01,559 --> 00:11:03,800 Speaker 5: having a partner. What's the deal? So I looked it 230 00:11:03,880 --> 00:11:08,120 Speaker 5: up and what I found was that Janeane Garofalo is married. 231 00:11:08,480 --> 00:11:11,720 Speaker 5: And when she got engaged, she went to get married, 232 00:11:12,160 --> 00:11:14,840 Speaker 5: and surprise, surprise, she found out that she was in 233 00:11:14,880 --> 00:11:19,000 Speaker 5: fact already married because years earlier she had participated in 234 00:11:19,360 --> 00:11:21,520 Speaker 5: some kind of a skit or a prank where she 235 00:11:21,600 --> 00:11:24,000 Speaker 5: got married and did not realize that that was like 236 00:11:24,040 --> 00:11:24,960 Speaker 5: a real marriage.
237 00:11:25,040 --> 00:11:28,120 Speaker 3: So when she went to get married... look this up. It sounds 238 00:11:28,120 --> 00:11:29,720 Speaker 3: like I'm making this up. I am not. 239 00:11:30,280 --> 00:11:33,000 Speaker 5: When she went to get married, she was like, oh, snap, 240 00:11:33,400 --> 00:11:35,600 Speaker 5: turns out I'm already married. So she had to get 241 00:11:35,600 --> 00:11:38,160 Speaker 5: that marriage dealt with before she could get married for real. 242 00:11:39,440 --> 00:11:43,320 Speaker 4: Oh that is amazing. I did not know that. And 243 00:11:43,360 --> 00:11:44,559 Speaker 4: thank you for this information. 244 00:11:45,640 --> 00:11:49,720 Speaker 5: Just a random piece of information about Janeane Garofalo. 245 00:11:50,280 --> 00:11:56,160 Speaker 2: Well, it sounds like she, as with many of us, 246 00:11:56,200 --> 00:11:57,920 Speaker 2: getting to the end of the year, wanted to 247 00:11:58,040 --> 00:12:07,040 Speaker 2: leave something behind. Masterful as always. Yes, thank you. I 248 00:12:07,120 --> 00:12:09,440 Speaker 2: was worried it wasn't gonna happen. But I made it 249 00:12:09,480 --> 00:12:10,360 Speaker 2: happen anyway. 250 00:12:10,440 --> 00:12:10,800 Speaker 4: You did. 251 00:12:21,400 --> 00:12:23,920 Speaker 2: For the topic today that you brought us, you're talking 252 00:12:23,920 --> 00:12:27,600 Speaker 2: about some of the things in the tech world 253 00:12:27,640 --> 00:12:32,000 Speaker 2: that we should leave behind, or are wanting to, exactly? 254 00:12:32,360 --> 00:12:34,720 Speaker 5: Yeah, if you all know that cartoon that you might 255 00:12:34,760 --> 00:12:37,199 Speaker 5: see on Twitter a lot, where it's like a woman 256 00:12:37,400 --> 00:12:40,120 Speaker 5: stepping from one year to the next, and behind her 257 00:12:40,240 --> 00:12:43,720 Speaker 5: she's leaving behind all of this baggage from the previous year, 258 00:12:43,840 --> 00:12:46,679 Speaker 5: like fake friends or heartbreak, and then across over her 259 00:12:46,679 --> 00:12:49,800 Speaker 5: shoulder she's carrying what she's carrying into the next year, 260 00:12:49,840 --> 00:12:52,280 Speaker 5: so like love or focus. Do you know the image 261 00:12:52,280 --> 00:12:53,080 Speaker 5: that I'm talking about? 262 00:12:53,400 --> 00:12:57,400 Speaker 1: I don't. I had never seen it, but I'm looking 263 00:12:57,480 --> 00:12:58,320 Speaker 1: at it now. 264 00:12:59,600 --> 00:13:03,440 Speaker 5: Yes. So fun fact about that image. It was created by 265 00:13:03,520 --> 00:13:07,440 Speaker 5: a British Ghanaian graphic artist named Peniel Enchill, and 266 00:13:07,480 --> 00:13:09,959 Speaker 5: it first appeared on the internet in twenty fourteen and 267 00:13:10,040 --> 00:13:12,880 Speaker 5: BuzzFeed has a great interview with her. But basically, this 268 00:13:13,000 --> 00:13:16,160 Speaker 5: image has been totally memified and it kind of pops 269 00:13:16,240 --> 00:13:19,720 Speaker 5: up around this time of year every year to sort 270 00:13:19,720 --> 00:13:21,760 Speaker 5: of fit the theme of like New Year, New me. 271 00:13:22,160 --> 00:13:24,280 Speaker 5: Sometimes it's a joke, which this artist says that she 272 00:13:24,280 --> 00:13:26,440 Speaker 5: doesn't always agree with or like the way that people 273 00:13:26,520 --> 00:13:29,480 Speaker 5: like do their spins on it.
But it's really become 274 00:13:29,480 --> 00:13:31,440 Speaker 5: a part of like the shorthand of this time of 275 00:13:31,520 --> 00:13:33,960 Speaker 5: year of like what we're leaving behind and what we're 276 00:13:33,960 --> 00:13:36,040 Speaker 5: taking with us into the new year. So when that 277 00:13:36,080 --> 00:13:39,880 Speaker 5: picture was making its rounds recently around this time of year, 278 00:13:40,640 --> 00:13:43,439 Speaker 5: you know, I had just wrapped up podcasting for my 279 00:13:43,520 --> 00:13:45,960 Speaker 5: own podcast, There Are No Girls on the Internet, talking 280 00:13:45,960 --> 00:13:49,240 Speaker 5: about technology and identity and really asking questions about like 281 00:13:49,280 --> 00:13:51,800 Speaker 5: what we get right and what we get wrong, you know, 282 00:13:51,880 --> 00:13:54,640 Speaker 5: in those in those arenas. So this image got me 283 00:13:54,720 --> 00:13:58,199 Speaker 5: thinking what tech things should just stay in twenty twenty 284 00:13:58,200 --> 00:14:00,800 Speaker 5: three and that we should not be bringing with us 285 00:14:00,920 --> 00:14:03,440 Speaker 5: into twenty twenty four. So I've got my list of 286 00:14:03,480 --> 00:14:06,280 Speaker 5: things that should stay. We're like, we're done with them, 287 00:14:06,360 --> 00:14:08,240 Speaker 5: we're not, we should not be bringing them with us 288 00:14:08,320 --> 00:14:09,040 Speaker 5: into the new year. 289 00:14:09,600 --> 00:14:12,160 Speaker 2: Yeah, and I was thinking about this too. This has 290 00:14:12,240 --> 00:14:14,280 Speaker 2: been a big year for tech. This has felt like 291 00:14:14,320 --> 00:14:17,080 Speaker 2: a very monumental, like a lot of things are changing, 292 00:14:17,600 --> 00:14:22,400 Speaker 2: and still looking at this list, I was like, oh, yeah, yeah, yeah, 293 00:14:22,440 --> 00:14:25,040 Speaker 2: it can feel strange because time feels so strange now. 294 00:14:25,080 --> 00:14:28,960 Speaker 2: But that did all happen this year. Yeah, so what 295 00:14:29,120 --> 00:14:30,040 Speaker 2: is on your list? 296 00:14:30,120 --> 00:14:30,960 Speaker 1: Let's get into it. 297 00:14:31,240 --> 00:14:33,760 Speaker 5: So, as you said, like it's been a big year 298 00:14:33,800 --> 00:14:36,920 Speaker 5: for conversations around technology, that they're big right now, and 299 00:14:36,920 --> 00:14:39,920 Speaker 5: I would say there probably is not a bigger conversation 300 00:14:40,440 --> 00:14:43,200 Speaker 5: than the one happening around AI. Right So we're all 301 00:14:43,240 --> 00:14:46,160 Speaker 5: like rightly talking about how AI could change the way 302 00:14:46,160 --> 00:14:48,040 Speaker 5: that we do our jobs or the way that artistic 303 00:14:48,080 --> 00:14:51,480 Speaker 5: projects get created, and those are all important conversations to 304 00:14:51,520 --> 00:14:54,800 Speaker 5: be having. But there is one very serious conversation about 305 00:14:54,840 --> 00:14:57,560 Speaker 5: the reality of AI, and that is how AI is 306 00:14:57,600 --> 00:15:00,680 Speaker 5: being used to do things like violate consent and further 307 00:15:00,840 --> 00:15:03,280 Speaker 5: use technology to make it seem like our bodies are 308 00:15:03,320 --> 00:15:06,479 Speaker 5: just like up for grabs once we show up online. 309 00:15:06,640 --> 00:15:11,160 Speaker 3: So case in point, nudify apps. So nudify apps.
310 00:15:10,880 --> 00:15:14,400 Speaker 5: Are kind of the catch all term for apps or 311 00:15:14,440 --> 00:15:18,479 Speaker 5: platforms that promise to use AI to generate non consensual 312 00:15:18,920 --> 00:15:22,520 Speaker 5: nude or semi nude images of anyone. These kinds of 313 00:15:22,600 --> 00:15:26,480 Speaker 5: apps have been exploding in popularity. In September alone, twenty 314 00:15:26,480 --> 00:15:30,560 Speaker 5: four million people visited nudify apps or undressing websites, according 315 00:15:30,560 --> 00:15:34,880 Speaker 5: to the social network analysis company Graphika. So in their analysis, 316 00:15:34,960 --> 00:15:38,040 Speaker 5: they really talk about how we've how this year specifically, 317 00:15:38,440 --> 00:15:42,520 Speaker 5: we've seen this shift where nudify apps kind of went 318 00:15:42,600 --> 00:15:46,400 Speaker 5: from this niche, underground custom thing where like, if you 319 00:15:46,480 --> 00:15:50,480 Speaker 5: were like a singular creep who had a singular fixation 320 00:15:50,600 --> 00:15:54,400 Speaker 5: on one person, you could find a marketplace for non 321 00:15:54,480 --> 00:15:57,360 Speaker 5: consensual images to be made of somebody, but it was 322 00:15:57,360 --> 00:16:01,280 Speaker 5: like a niche custom thing. Now in twenty twenty three, 323 00:16:01,800 --> 00:16:04,360 Speaker 5: we have really seen that shift to where these are 324 00:16:05,000 --> 00:16:11,720 Speaker 5: fleshed out, monetized online businesses, complete with advertising apparatuses. Graphika 325 00:16:11,760 --> 00:16:14,760 Speaker 5: found that the volume of referral link spam for these 326 00:16:14,840 --> 00:16:17,480 Speaker 5: kinds of services has increased by more than two thousand 327 00:16:17,560 --> 00:16:21,120 Speaker 5: percent on social media platforms like Reddit and Twitter since 328 00:16:21,160 --> 00:16:23,240 Speaker 5: the beginning of twenty twenty three, and this set of 329 00:16:23,280 --> 00:16:26,920 Speaker 5: fifty two Telegram groups used to access non consensual image 330 00:16:27,000 --> 00:16:30,400 Speaker 5: sites like these contain at least one million users as 331 00:16:30,440 --> 00:16:32,320 Speaker 5: of September of twenty twenty three. 332 00:16:32,120 --> 00:16:35,000 Speaker 3: So they have really exploded in popularity. 333 00:16:35,400 --> 00:16:38,920 Speaker 5: Yet I think that we're only now, like only recently 334 00:16:39,200 --> 00:16:41,720 Speaker 5: come to have any kind of conversation about what that means, 335 00:16:42,080 --> 00:16:46,520 Speaker 5: not just for you know, the women who are overwhelmingly 336 00:16:46,880 --> 00:16:49,240 Speaker 5: targeted by these kinds of apps, but also what it 337 00:16:49,280 --> 00:16:51,320 Speaker 5: means for our digital landscape more broadly. 338 00:16:51,800 --> 00:16:55,120 Speaker 6: Okay, I'm not gonna lie. None of this is familiar 339 00:16:55,120 --> 00:16:58,760 Speaker 6: to me, as many social media things as I'm involved in. 340 00:16:59,240 --> 00:17:01,480 Speaker 4: Didn't know this was a thing. Didn't know. 341 00:17:02,080 --> 00:17:05,840 Speaker 3: It's pretty despicable. It first got on my radar. 342 00:17:06,680 --> 00:17:10,240 Speaker 5: It first seemed to be rolled out to like more 343 00:17:10,480 --> 00:17:13,520 Speaker 5: mainstream platforms with celebrities, so it'll be like, oh, you 344 00:17:13,520 --> 00:17:16,480 Speaker 5: can get a nude of any celebrity.
And now it's like, 345 00:17:17,040 --> 00:17:19,080 Speaker 5: not just celebrities, it's like anybody who can get it. 346 00:17:19,160 --> 00:17:22,719 Speaker 5: You can generate a non consensual nude of anybody. And 347 00:17:22,760 --> 00:17:24,960 Speaker 5: so one of the ways that this is becoming more 348 00:17:25,000 --> 00:17:29,240 Speaker 5: and more ubiquitous is just seeing the advertisements for it 349 00:17:29,359 --> 00:17:33,280 Speaker 5: on social media. After some journalists at Bloomberg were looking 350 00:17:33,320 --> 00:17:36,560 Speaker 5: into the popularity of these kinds of apps, they contacted 351 00:17:36,680 --> 00:17:40,399 Speaker 5: both TikTok and Facebook to ask them about you know, Okay, 352 00:17:40,400 --> 00:17:43,879 Speaker 5: it seems like these really gross apps are being allowed 353 00:17:43,880 --> 00:17:47,680 Speaker 5: to advertise on your platforms, and both Facebook and TikTok 354 00:17:47,720 --> 00:17:50,800 Speaker 5: I will say, you know, to their credit when Bloomberg 355 00:17:50,840 --> 00:17:54,240 Speaker 5: reached out to them, they both did some work to 356 00:17:54,359 --> 00:17:58,199 Speaker 5: block search terms related to nudify apps, but one social 357 00:17:58,200 --> 00:18:00,439 Speaker 5: media platform notably did not do that. 358 00:18:00,520 --> 00:18:06,320 Speaker 3: Can you guess what platform that was? It was Twitter? 359 00:18:06,920 --> 00:18:07,280 Speaker 4: Twitter. 360 00:18:07,359 --> 00:18:13,639 Speaker 6: Obviously. I'm not calling it X. 361 00:18:11,240 --> 00:18:16,439 Speaker 5: I call it Twitter still. So it was Twitter. They were like, 362 00:18:17,240 --> 00:18:19,680 Speaker 5: we're actually fine with these nudify apps, like we don't 363 00:18:19,680 --> 00:18:23,080 Speaker 5: see a problem. Uh, so this is from Vice. Searching 364 00:18:23,160 --> 00:18:26,399 Speaker 5: the word undress on TikTok brings up no results in 365 00:18:26,440 --> 00:18:30,200 Speaker 5: either the top or video tabs. Instead, the platform warns 366 00:18:30,280 --> 00:18:32,720 Speaker 5: users that the phrase may be associated with activity that 367 00:18:32,880 --> 00:18:36,360 Speaker 5: violates the platform's guidelines. Searching the same term on Instagram 368 00:18:36,440 --> 00:18:40,240 Speaker 5: similarly brings up no results. Searching undress on Twitter, however, 369 00:18:40,720 --> 00:18:44,520 Speaker 5: readily surfaces a verified account with nearly twenty thousand followers, 370 00:18:44,720 --> 00:18:48,080 Speaker 5: promoting nudify app services. So let's say that you 371 00:18:48,119 --> 00:18:50,720 Speaker 5: were to search Twitter for the word "indress" instead 372 00:18:50,720 --> 00:18:54,000 Speaker 5: of undress. Twitter is actually like, wait, were you actually 373 00:18:54,320 --> 00:18:57,280 Speaker 5: meaning to search for undress? And it prompts you to 374 00:18:57,280 --> 00:19:00,439 Speaker 5: search for undress instead.
So where other platforms are like, no, 375 00:19:00,520 --> 00:19:03,200 Speaker 5: we can't have that on our on our platforms, Twitter 376 00:19:03,280 --> 00:19:05,479 Speaker 5: is like not just allowing it on platforms, but like 377 00:19:05,760 --> 00:19:07,960 Speaker 5: helping people to search it when they when they get 378 00:19:07,960 --> 00:19:10,280 Speaker 5: it wrong. And so you might see people if you 379 00:19:10,359 --> 00:19:15,080 Speaker 5: ever see nudify apps advertise on these platforms, sometimes, 380 00:19:16,000 --> 00:19:18,879 Speaker 5: like, a word is misspelled, or like there's a 381 00:19:18,880 --> 00:19:21,320 Speaker 5: space between a couple of the letters to try to 382 00:19:21,400 --> 00:19:24,720 Speaker 5: evade being picked up and like knocked off the platform. 383 00:19:24,760 --> 00:19:27,400 Speaker 5: But generally it does not seem like Twitter has 384 00:19:27,400 --> 00:19:29,639 Speaker 5: a problem with these kinds of services being advertised on 385 00:19:29,680 --> 00:19:30,280 Speaker 5: their platform. 386 00:19:30,440 --> 00:19:32,080 Speaker 6: And I feel like this is a big question with 387 00:19:32,320 --> 00:19:37,320 Speaker 6: like legalities, especially on public platforms like that in general. 388 00:19:37,400 --> 00:19:39,959 Speaker 6: But like I'm talking about doing nudes and such, like, 389 00:19:41,160 --> 00:19:43,399 Speaker 6: it seems like it should be illegal. It seems like 390 00:19:43,440 --> 00:19:45,199 Speaker 6: this could be one of those things, especially like if 391 00:19:45,240 --> 00:19:47,400 Speaker 6: you think about revenge porn and all of that being 392 00:19:47,480 --> 00:19:51,160 Speaker 6: illegal in so many places, Like is that not a thing? 393 00:19:51,240 --> 00:19:51,479 Speaker 4: Is that? 394 00:19:51,800 --> 00:19:54,040 Speaker 6: Can you not take it to court or at least 395 00:19:54,080 --> 00:19:56,200 Speaker 6: try to stop these images from happening? 396 00:19:56,520 --> 00:19:58,640 Speaker 3: So that is a great question. When I first saw these, 397 00:19:58,640 --> 00:20:00,480 Speaker 3: I was like, this has got to be illegal, And 398 00:20:00,520 --> 00:20:01,400 Speaker 3: it turns out. 399 00:20:01,240 --> 00:20:05,199 Speaker 5: That right now, depending on where you live, it is 400 00:20:05,320 --> 00:20:08,040 Speaker 5: probably not illegal to do this. 401 00:20:08,680 --> 00:20:10,800 Speaker 3: It depends on your jurisdiction. But there is. 402 00:20:10,760 --> 00:20:15,640 Speaker 5: Currently no federal law criminalizing using AI to generate non 403 00:20:15,640 --> 00:20:19,560 Speaker 5: consensual deep fake images. Representative Yvette Clarke out of New 404 00:20:19,640 --> 00:20:24,240 Speaker 5: York has actually proposed legislation that would criminalize making, you know, 405 00:20:24,480 --> 00:20:27,800 Speaker 5: non consensual deep fakes, But as of right now, kind 406 00:20:27,800 --> 00:20:32,120 Speaker 5: of unbelievably, there is not any legislation federally that prevents 407 00:20:32,119 --> 00:20:35,159 Speaker 5: somebody from doing that.
And yeah, I just think that, like, 408 00:20:36,359 --> 00:20:40,680 Speaker 5: as this kind of technology becomes more ubiquitous in our culture, 409 00:20:41,359 --> 00:20:45,000 Speaker 5: I think it adds to this idea that just by 410 00:20:45,080 --> 00:20:48,080 Speaker 5: showing up on the Internet, women are fair game for 411 00:20:48,160 --> 00:20:50,720 Speaker 5: anybody who wants to sexualize us, and that I think 412 00:20:50,760 --> 00:20:54,520 Speaker 5: it's getting to be, getting to a point where the 413 00:20:54,640 --> 00:20:57,080 Speaker 5: idea is like, well, if you didn't want someone to 414 00:20:57,240 --> 00:20:59,160 Speaker 5: use your images in that way, why did you post 415 00:20:59,200 --> 00:21:00,000 Speaker 5: them to Instagram? 416 00:21:00,119 --> 00:21:00,520 Speaker 3: Right? Like? 417 00:21:01,359 --> 00:21:03,679 Speaker 5: And I think that we really got to have a 418 00:21:03,800 --> 00:21:07,879 Speaker 5: real serious think and a serious conversation about if that 419 00:21:08,040 --> 00:21:10,879 Speaker 5: is going to be a social media climate that we want. 420 00:21:11,440 --> 00:21:13,440 Speaker 5: And I think that that kind of thinking is the 421 00:21:13,520 --> 00:21:14,960 Speaker 5: kind of thinking that we need to leave behind in 422 00:21:14,960 --> 00:21:17,600 Speaker 5: twenty twenty three, right like, women should be able to 423 00:21:17,640 --> 00:21:22,399 Speaker 5: show up online without non consensual sexualization just being the 424 00:21:22,440 --> 00:21:25,520 Speaker 5: cost of showing up, And so let's leave that behind 425 00:21:25,760 --> 00:21:26,680 Speaker 5: in twenty twenty four. 426 00:21:26,760 --> 00:21:27,760 Speaker 3: We are not bringing it with. 427 00:21:27,760 --> 00:21:32,000 Speaker 6: Us please please. 428 00:21:32,200 --> 00:21:35,680 Speaker 2: Yeah, And that's one of the very frustrating things, since 429 00:21:35,680 --> 00:21:38,520 Speaker 2: I did see a lot about this, and we even 430 00:21:38,840 --> 00:21:42,880 Speaker 2: had an episode, Bridget, about it, about journalists getting targeted 431 00:21:42,920 --> 00:21:46,439 Speaker 2: by deepfakes and revenge porn and things like that, and 432 00:21:46,480 --> 00:21:48,440 Speaker 2: this is just making it so much easier and it's 433 00:21:48,480 --> 00:21:53,000 Speaker 2: harder to for some people, for all of us to 434 00:21:53,200 --> 00:21:55,320 Speaker 2: ascertain like what is real and what is not real. 435 00:21:55,920 --> 00:22:00,679 Speaker 2: But it does kind of disproportionately impact women and marginalized folks. 436 00:22:01,119 --> 00:22:04,840 Speaker 2: And that's just one example of technology doing that that 437 00:22:04,880 --> 00:22:07,000 Speaker 2: you have on your list, right. 438 00:22:07,640 --> 00:22:10,080 Speaker 5: Yes, And I think, like, just like what you said, 439 00:22:10,760 --> 00:22:13,399 Speaker 5: I think that there are specific groups that we see 440 00:22:13,440 --> 00:22:16,439 Speaker 5: being targeted for this kind of harm first, and then it's. 441 00:22:16,320 --> 00:22:19,080 Speaker 3: like, oh nah. And then like when those groups. 442 00:22:18,720 --> 00:22:22,040 Speaker 5: When it's allowed to happen to those groups, it's like, oh, well, 443 00:22:22,520 --> 00:22:24,639 Speaker 5: you know, nobody really does anything. And then it's like, 444 00:22:24,680 --> 00:22:27,479 Speaker 5: oh surprise, Now it's everybody's problem.
Now we just live 445 00:22:27,520 --> 00:22:30,399 Speaker 5: in a society where like this is commonplace. And like 446 00:22:30,480 --> 00:22:33,480 Speaker 5: maybe when this was happening to specific groups of people, 447 00:22:33,520 --> 00:22:35,520 Speaker 5: if we had done something and taken it seriously, then 448 00:22:35,760 --> 00:22:38,359 Speaker 5: we wouldn't have allowed it to just become ubiquitous, right, 449 00:22:38,440 --> 00:22:39,760 Speaker 5: and I don't think anybody wants to live in a 450 00:22:39,800 --> 00:22:44,000 Speaker 5: culture where anybody is fair game just because they, 451 00:22:44,000 --> 00:22:45,880 Speaker 5: like, put a picture of themselves on the Internet, 452 00:22:46,040 --> 00:22:49,600 Speaker 5: to have that picture be distorted and sexualized. So absolutely right. 453 00:22:50,440 --> 00:22:53,320 Speaker 5: So that brings me to another thing that we should 454 00:22:53,359 --> 00:22:56,199 Speaker 5: not be taking with us into twenty twenty four, and 455 00:22:56,200 --> 00:22:59,240 Speaker 5: that is content moderation policies that really hurt women and 456 00:22:59,359 --> 00:23:00,680 Speaker 5: other marginalized people. 457 00:23:01,040 --> 00:23:02,200 Speaker 3: So, as we were talking. 458 00:23:02,000 --> 00:23:05,840 Speaker 5: About with AI right now, AI is used in content 459 00:23:05,880 --> 00:23:09,000 Speaker 5: moderation that really does a lot of the work of 460 00:23:09,080 --> 00:23:12,760 Speaker 5: deciding what gets amplified and what gets suppressed on social media. 461 00:23:13,320 --> 00:23:18,040 Speaker 5: This technology, however, also objectifies women's bodies, and it's much 462 00:23:18,080 --> 00:23:21,720 Speaker 5: more likely to flag images that involve women or include 463 00:23:21,760 --> 00:23:25,280 Speaker 5: women as racy or inappropriate, and thus those images are 464 00:23:25,320 --> 00:23:28,040 Speaker 5: more likely to be suppressed on social media sites. So 465 00:23:28,119 --> 00:23:31,320 Speaker 5: The Guardian actually put together like a really interesting investigation 466 00:23:31,359 --> 00:23:34,320 Speaker 5: into this where they had journalists use AI tools to 467 00:23:34,359 --> 00:23:36,960 Speaker 5: analyze hundreds of photos of men and women in their underwear, 468 00:23:37,040 --> 00:23:40,600 Speaker 5: working out, using medical tests and with partial nudity, and 469 00:23:40,640 --> 00:23:43,680 Speaker 5: found that the AI used will tag photos of women 470 00:23:43,800 --> 00:23:46,679 Speaker 5: in everyday situations as sexually suggestive. 471 00:23:46,960 --> 00:23:49,960 Speaker 3: So I'm talking about images of like women. 472 00:23:49,960 --> 00:23:53,720 Speaker 5: In their underwear fully covered, or women working out at 473 00:23:53,720 --> 00:23:56,280 Speaker 5: the gym fully covered, right, like images that we would 474 00:23:56,359 --> 00:24:00,800 Speaker 5: recognize as not racy, but because they include women, the 475 00:24:00,880 --> 00:24:03,160 Speaker 5: tools that are being used to make decisions about how 476 00:24:03,200 --> 00:24:06,240 Speaker 5: content is moderated will be like, no, that's racy, can't 477 00:24:06,240 --> 00:24:09,600 Speaker 5: have that on social media. And so as a result 478 00:24:09,640 --> 00:24:13,600 Speaker 5: of this, social media companies that leverage these algorithms have 479 00:24:13,680 --> 00:24:18,840 Speaker 5: suppressed honestly countless images featuring women's bodies.
We know 480 00:24:18,880 --> 00:24:21,200 Speaker 5: that this hurts women led businesses, and there's reporting 481 00:24:21,200 --> 00:24:26,080 Speaker 5: about how a shapewear company essentially can't advertise on social 482 00:24:26,119 --> 00:24:31,679 Speaker 5: media because images of women's bodies fully covered in shapewear 483 00:24:32,080 --> 00:24:34,840 Speaker 5: will just be suppressed by these algorithms, and they essentially 484 00:24:35,320 --> 00:24:37,680 Speaker 5: cannot advertise their product on social media, which is like 485 00:24:37,720 --> 00:24:41,359 Speaker 5: where you advertise products in twenty twenty three. This also like, 486 00:24:41,400 --> 00:24:44,440 Speaker 5: not only does it hurt female led businesses, it also 487 00:24:44,640 --> 00:24:48,600 Speaker 5: has medical impacts. They found that this disparity is also 488 00:24:48,640 --> 00:24:51,720 Speaker 5: true for medical images. AI was shown images from the 489 00:24:51,840 --> 00:24:55,120 Speaker 5: US National Cancer Institute demonstrating how to do a clinical 490 00:24:55,160 --> 00:24:59,199 Speaker 5: breast exam. Google's AI gave this photo the highest score 491 00:24:59,359 --> 00:25:03,760 Speaker 5: for raciness. Microsoft's AI was eighty two percent confident that 492 00:25:03,800 --> 00:25:08,240 Speaker 5: these images of women doing breast exams were explicitly sexual 493 00:25:08,400 --> 00:25:13,160 Speaker 5: in nature, and Amazon classified it as representing explicit nudity. 494 00:25:13,520 --> 00:25:16,240 Speaker 5: This is also true for pregnant bellies, like if you're 495 00:25:16,480 --> 00:25:19,840 Speaker 5: if you are heavily pregnant and showing a pregnant belly 496 00:25:20,040 --> 00:25:23,840 Speaker 5: on social media platforms, AI is much more likely to 497 00:25:24,040 --> 00:25:27,000 Speaker 5: deem that image to be racy and then suppress that 498 00:25:27,080 --> 00:25:29,639 Speaker 5: in their content moderation tools. And so you really get 499 00:25:29,640 --> 00:25:32,840 Speaker 5: a sense of the way that these platforms are creating 500 00:25:33,840 --> 00:25:38,080 Speaker 5: a disproportionate cost for being a woman who shows up online, 501 00:25:38,119 --> 00:25:40,600 Speaker 5: Like it prevents women from being able to express themselves, 502 00:25:40,760 --> 00:25:43,400 Speaker 5: it prevents women from being able to get medical information 503 00:25:43,440 --> 00:25:46,320 Speaker 5: about our bodies, and ultimately it's just not fair, like 504 00:25:46,320 --> 00:25:47,960 Speaker 5: we shouldn't have to deal with this just because we 505 00:25:48,000 --> 00:25:50,639 Speaker 5: showed up on our social media platform with our bodies, Like, 506 00:25:50,840 --> 00:25:54,280 Speaker 5: there's nothing wrong with women's bodies, they're they're not racy 507 00:25:54,359 --> 00:25:56,080 Speaker 5: or explicit just by us having. 508 00:25:55,880 --> 00:26:00,439 Speaker 2: them. Right, right. And I believe Samantha and I, we 509 00:26:00,520 --> 00:26:02,959 Speaker 2: talked about this about YouTube, because YouTube had a similar 510 00:26:03,040 --> 00:26:06,760 Speaker 2: strange thing that was happening where it was flagging videos 511 00:26:06,800 --> 00:26:10,119 Speaker 2: of children.
Of young girls, as just like, this is 512 00:26:10,119 --> 00:26:13,080 Speaker 2: so sexual, and it's just like literally young girls, and 513 00:26:13,119 --> 00:26:18,960 Speaker 2: it's telling as to what is going on in our society, 514 00:26:19,040 --> 00:26:22,720 Speaker 2: the problems of objectifying and sexualizing women. But also this 515 00:26:22,760 --> 00:26:24,159 Speaker 2: is a big thing that we talked about with you, 516 00:26:24,200 --> 00:26:29,080 Speaker 2: Bridget, that it matters who's making these things, who's doing 517 00:26:29,119 --> 00:26:32,639 Speaker 2: this stuff, and AI is a pretty new space, but 518 00:26:32,680 --> 00:26:35,320 Speaker 2: I've already seen a lot of conversations about that, about 519 00:26:35,920 --> 00:26:38,840 Speaker 2: the importance of who is working on it. 520 00:26:39,640 --> 00:26:43,480 Speaker 1: So that's part of what is going on here, right, totally. 521 00:26:43,600 --> 00:26:46,240 Speaker 5: So The Guardian spoke to Margaret Mitchell of the AI 522 00:26:46,280 --> 00:26:49,199 Speaker 5: company Hugging Face, who said that she believes that the 523 00:26:49,200 --> 00:26:52,360 Speaker 5: photos used to train these algorithms were probably just being 524 00:26:52,440 --> 00:26:56,399 Speaker 5: labeled by straight men who probably associate men working out 525 00:26:56,480 --> 00:26:59,960 Speaker 5: with fitness, but maybe consider an image of a woman 526 00:27:00,080 --> 00:27:04,080 Speaker 5: working out as being like racy or sexual or explicit, 527 00:27:04,359 --> 00:27:07,200 Speaker 5: even though it's like the same theme, the same content. 528 00:27:07,440 --> 00:27:10,159 Speaker 5: And so it's possible that these ratings seem gender biased 529 00:27:10,200 --> 00:27:13,120 Speaker 5: in the US and in Europe because the labelers might 530 00:27:13,160 --> 00:27:16,040 Speaker 5: be from a place that might have more conservative cultures, right, 531 00:27:16,080 --> 00:27:20,560 Speaker 5: And so yeah, it really matters who is building technology, 532 00:27:20,600 --> 00:27:22,760 Speaker 5: who's in the room when the technology gets built, who 533 00:27:22,840 --> 00:27:24,639 Speaker 5: is training it, who is rolling it out, who is 534 00:27:24,680 --> 00:27:26,840 Speaker 5: thinking about it, who's writing about it, who's talking about it. 535 00:27:27,200 --> 00:27:32,200 Speaker 5: If these people are mostly men, then like, of course 536 00:27:32,240 --> 00:27:35,800 Speaker 5: women and other marginalized people are not showing up in 537 00:27:35,840 --> 00:27:38,359 Speaker 5: an equitable way. With all the conversations that we have 538 00:27:38,440 --> 00:27:41,160 Speaker 5: around things like inclusion and diversity and. 539 00:27:41,080 --> 00:27:44,600 Speaker 3: equity in tech, I'm not like harping on those. 540 00:27:44,400 --> 00:27:46,320 Speaker 5: Just because it's like nice to do or it's like 541 00:27:46,359 --> 00:27:48,639 Speaker 5: the right thing to do, which it is. It is 542 00:27:48,680 --> 00:27:53,280 Speaker 5: because eventually the technology that gets built is worse, is 543 00:27:53,359 --> 00:27:56,840 Speaker 5: less inclusive, is more dangerous, like it includes less people, 544 00:27:56,840 --> 00:27:58,480 Speaker 5: and that ends up hurting all of us. 545 00:27:58,840 --> 00:28:00,080 Speaker 1: Yes, it does.
546 00:28:00,160 --> 00:28:02,640 Speaker 2: And speaking of that, you have another point on here, 547 00:28:02,720 --> 00:28:05,280 Speaker 2: going back to something we were talking about earlier about 548 00:28:05,720 --> 00:28:07,760 Speaker 2: women in journalism. 549 00:28:07,920 --> 00:28:12,080 Speaker 5: Yes, yes, so I'm so glad that was a great transition, 550 00:28:12,320 --> 00:28:15,879 Speaker 5: which is that you know, it's like one of the 551 00:28:15,920 --> 00:28:18,879 Speaker 5: reasons why I started my own podcast about this is 552 00:28:18,920 --> 00:28:24,200 Speaker 5: that we unfortunately have a tech culture that can treat 553 00:28:24,280 --> 00:28:29,520 Speaker 5: women like perpetual outsiders, right, whether by accident or with intention, 554 00:28:30,040 --> 00:28:31,600 Speaker 5: and that is something that we got to leave in 555 00:28:31,640 --> 00:28:33,600 Speaker 5: twenty twenty three. Really, we could have left it like 556 00:28:34,840 --> 00:28:37,119 Speaker 5: in like many many many years past. 557 00:28:37,400 --> 00:28:38,920 Speaker 3: But this is the time that we should be. 558 00:28:38,960 --> 00:28:41,960 Speaker 5: Leaving it in the past because exactly what you said, 559 00:28:41,960 --> 00:28:44,360 Speaker 5: it is so critical. These tools are going to be 560 00:28:44,400 --> 00:28:47,600 Speaker 5: shaping our world and how we understand our world. 561 00:28:47,640 --> 00:28:48,400 Speaker 3: So we've got to make. 562 00:28:48,320 --> 00:28:51,920 Speaker 5: Sure that people who are publicly talking about it and 563 00:28:51,960 --> 00:28:54,080 Speaker 5: are included in that conversation are done so in a 564 00:28:54,120 --> 00:28:56,560 Speaker 5: way that does not treat them like perpetual outsiders. And 565 00:28:56,600 --> 00:29:00,080 Speaker 5: so women who are working for that AI company that 566 00:29:00,120 --> 00:29:02,920 Speaker 5: I mentioned before, Hugging Face, what you can sort of 567 00:29:02,920 --> 00:29:05,520 Speaker 5: think of as like a competitor to Open AI, the 568 00:29:05,560 --> 00:29:07,480 Speaker 5: company that makes chat GPT, which is run by. 569 00:29:07,400 --> 00:29:08,320 Speaker 3: That guy Sam Altman. 570 00:29:08,880 --> 00:29:11,560 Speaker 5: Hugging Face has a lot of women who work there, 571 00:29:11,840 --> 00:29:13,240 Speaker 5: and these women do a ton of. 572 00:29:13,160 --> 00:29:16,080 Speaker 3: Like public speaking about tech and AI. 573 00:29:15,800 --> 00:29:19,000 Speaker 5: And the media, which is great, and I also think 574 00:29:19,040 --> 00:29:21,440 Speaker 5: again it's important because it can help to change the 575 00:29:21,480 --> 00:29:24,120 Speaker 5: face of like who we think of as somebody who 576 00:29:24,160 --> 00:29:26,120 Speaker 5: gets to speak or gets to have an opinion when 577 00:29:26,120 --> 00:29:29,600 Speaker 5: it comes to technology, and that's great. However, the women 578 00:29:29,640 --> 00:29:32,040 Speaker 5: at hugging Face also noticed that when they were doing 579 00:29:32,080 --> 00:29:35,040 Speaker 5: public speaking about AI with the press, they sometimes would 580 00:29:35,080 --> 00:29:39,440 Speaker 5: get like sexist or otherwise kind of messed up questions 581 00:29:39,560 --> 00:29:42,880 Speaker 5: during interviews that just like really highlighted that they were 582 00:29:42,960 --> 00:29:48,600 Speaker 5: not necessarily being treated like people who belonged in the space. 
583 00:29:49,280 --> 00:29:52,680 Speaker 5: Margaret Mitchell, she is Hugging Face's Chief Ethics Scientist, 584 00:29:53,040 --> 00:29:55,480 Speaker 5: framed it as a research question. She asked, what are 585 00:29:55,600 --> 00:29:58,640 Speaker 5: patterns in how journalists talk to and about women in AI. 586 00:29:59,480 --> 00:30:02,280 Speaker 5: She found that compared to male peers, there is a 587 00:30:02,320 --> 00:30:07,440 Speaker 5: disproportionate focus from the press on their ages, their motherhood, 588 00:30:07,760 --> 00:30:12,080 Speaker 5: their physical appearance or behaviors, their failures, and what AI 589 00:30:12,680 --> 00:30:16,240 Speaker 5: gossip they could provide, rather than their like technical work. 590 00:30:16,640 --> 00:30:20,080 Speaker 5: And these are people who are incredibly accomplished. They're like 591 00:30:20,200 --> 00:30:22,480 Speaker 5: doing very important work that they've gone to school for, 592 00:30:22,680 --> 00:30:25,200 Speaker 5: been trained for. And in an article, if you get to 593 00:30:25,280 --> 00:30:28,040 Speaker 5: interview them and you ask about their age, or you 594 00:30:28,080 --> 00:30:30,280 Speaker 5: ask if they have kids, or you ask, you know, 595 00:30:30,360 --> 00:30:33,800 Speaker 5: about the way they look, it's it's so limiting 596 00:30:33,800 --> 00:30:35,880 Speaker 5: because it's like you have somebody who has dedicated their 597 00:30:35,920 --> 00:30:38,560 Speaker 5: life to this very important technology and this is what 598 00:30:38,640 --> 00:30:39,200 Speaker 5: you ask them. 599 00:30:39,240 --> 00:30:41,400 Speaker 3: Like it's such a, it's such a miss 600 00:30:41,480 --> 00:30:42,680 Speaker 3: on the part of the journalists. 601 00:30:43,080 --> 00:30:47,000 Speaker 6: It's like, I thought by now, at this point 602 00:30:47,000 --> 00:30:49,280 Speaker 6: in AI, there would be more things to talk about. 603 00:30:49,280 --> 00:30:52,000 Speaker 6: There's so many questions that we should be asking again, 604 00:30:52,120 --> 00:30:55,280 Speaker 6: things like how are how are sexist issues being handled 605 00:30:55,440 --> 00:30:58,760 Speaker 6: in AI? How are you protecting women in AI? And 606 00:30:58,800 --> 00:31:02,400 Speaker 6: not about the person be like, so you got a kid, 607 00:31:03,760 --> 00:31:05,520 Speaker 6: you gotta be on the computer while you have a kid. 608 00:31:05,920 --> 00:31:08,240 Speaker 6: How are you gonna do that? Like this seems so 609 00:31:08,440 --> 00:31:10,600 Speaker 6: like off base. Like why is this? Is this a... 610 00:31:10,640 --> 00:31:13,280 Speaker 6: are we still doing this? 611 00:31:13,880 --> 00:31:16,880 Speaker 5: What's frustrating is like they would never ask a man. 612 00:31:17,560 --> 00:31:19,840 Speaker 5: If you were talking to like a powerful man, you 613 00:31:19,840 --> 00:31:21,640 Speaker 5: would never be like, oh, well, who watches your kids 614 00:31:21,640 --> 00:31:23,959 Speaker 5: while you're working? Or like how do how do how 615 00:31:23,960 --> 00:31:27,480 Speaker 5: do you juggle being a dad and like being a scientist? 616 00:31:27,520 --> 00:31:29,160 Speaker 5: Like these are things that would not that like would 617 00:31:29,160 --> 00:31:31,400 Speaker 5: not come up. They're only coming up because they're women.
618 00:31:41,960 --> 00:31:44,600 Speaker 5: Doctor Sasha Luccioni, who is an AI researcher and a 619 00:31:44,640 --> 00:31:48,240 Speaker 5: climate lead at Hugging Face, was in this, like, pretty glowing 620 00:31:48,000 --> 00:31:49,000 Speaker 3: piece for Adweek. 621 00:31:49,400 --> 00:31:52,280 Speaker 5: The piece was great, but the headline of the article 622 00:31:52,360 --> 00:31:56,280 Speaker 5: read, this AI ethics expert juggles motherhood and a tech 623 00:31:56,360 --> 00:31:59,280 Speaker 5: career, and people had to, like, raise hell to get 624 00:31:59,280 --> 00:32:02,320 Speaker 5: them to change it. And yeah, it's just like, I 625 00:32:02,400 --> 00:32:05,360 Speaker 5: do think that there is a place for conversations about 626 00:32:05,440 --> 00:32:10,440 Speaker 5: what it looks like to juggle career and family and 627 00:32:10,480 --> 00:32:13,160 Speaker 5: all of that. But those are not conversations that should 628 00:32:13,200 --> 00:32:15,040 Speaker 5: only be happening to women, and there's a time and 629 00:32:15,080 --> 00:32:17,000 Speaker 5: a place for them, right? Like, if you're meant to 630 00:32:17,040 --> 00:32:20,960 Speaker 5: be interviewing somebody about their technical prowess, I don't 631 00:32:20,960 --> 00:32:24,440 Speaker 5: see how that is relevant to their technical expertise. 632 00:32:24,280 --> 00:32:26,400 Speaker 6: Right, and that piece in itself, the title is 633 00:32:26,440 --> 00:32:28,920 Speaker 6: so condescending. It's definitely like, ah, 634 00:32:28,800 --> 00:32:30,840 Speaker 4: look at you, how cute, look at you doing that. 635 00:32:31,240 --> 00:32:33,960 Speaker 6: You go, you go, girl, I'm so proud of you, 636 00:32:34,000 --> 00:32:37,640 Speaker 6: instead of seeing this professional scientist who has more degrees 637 00:32:37,800 --> 00:32:40,520 Speaker 6: and more experience than the person who actually probably wrote 638 00:32:40,960 --> 00:32:43,239 Speaker 6: the article. I don't know for sure, but just that 639 00:32:43,360 --> 00:32:46,440 Speaker 6: level of like, wow, really, what are you doing? Like, 640 00:32:46,800 --> 00:32:48,560 Speaker 6: is it because you feel insecure that you need to 641 00:32:48,560 --> 00:32:51,080 Speaker 6: be condescending, but maybe, like, passive aggressively, like, but no, no, no, 642 00:32:51,160 --> 00:32:51,960 Speaker 6: I'm really impressed. 643 00:32:52,240 --> 00:32:56,560 Speaker 5: Really, it is so condescending. And so, in an effort 644 00:32:56,720 --> 00:32:59,400 Speaker 5: to help the space be better, the women who work 645 00:32:59,400 --> 00:33:02,960 Speaker 5: at Hugging Face actually developed a guide for journalists 646 00:33:02,240 --> 00:33:03,600 Speaker 3: to help get it right, 647 00:33:03,680 --> 00:33:06,200 Speaker 5: to help create a better dynamic so that it's not 648 00:33:06,280 --> 00:33:11,200 Speaker 5: just condescending question and sexist question after sexist question. The 649 00:33:11,240 --> 00:33:14,080 Speaker 5: guide reads: the real achievements of women on our team 650 00:33:14,160 --> 00:33:17,040 Speaker 5: often get overshadowed by a focus on personal and sometimes 651 00:33:17,160 --> 00:33:20,080 Speaker 5: very intrusive details that are not relevant to their work. 652 00:33:20,400 --> 00:33:22,880 Speaker 5: With all the amazing press attention we get at Hugging Face, 653 00:33:23,120 --> 00:33:26,240 Speaker 5: we're bound to see some journalists rely on outdated tropes.
Lately, 654 00:33:26,240 --> 00:33:29,360 Speaker 5: we've seen more reporters ignoring the amazing achievements of our 655 00:33:29,400 --> 00:33:33,680 Speaker 5: shes and theys and instead focusing on stereotypes in tech. 656 00:33:33,960 --> 00:33:35,600 Speaker 5: One of the things that I love about the guide 657 00:33:35,640 --> 00:33:39,920 Speaker 5: that they put out is that it really highlights the 658 00:33:39,960 --> 00:33:43,320 Speaker 5: importance of not treating tech like a place where women 659 00:33:43,480 --> 00:33:46,680 Speaker 5: don't belong. And so, for instance, the guide reads: don't 660 00:33:46,680 --> 00:33:50,200 Speaker 5: rely on antiquated stereotypes about women in tech. This includes 661 00:33:50,240 --> 00:33:53,479 Speaker 5: describing women as outsiders in the field, which only serves 662 00:33:53,520 --> 00:33:56,320 Speaker 5: to reinforce the idea that women don't belong in tech. 663 00:33:56,880 --> 00:34:00,200 Speaker 5: An example of a problematic sentence they give: despite being 664 00:34:00,240 --> 00:34:02,960 Speaker 5: a woman in a male-dominated field, Brooke Brookie has 665 00:34:02,960 --> 00:34:05,800 Speaker 5: made a name for herself in the tech industry. Better 666 00:34:06,120 --> 00:34:09,600 Speaker 5: is: through her brilliant results on magic and large language models, 667 00:34:09,640 --> 00:34:12,520 Speaker 5: Brooke Brookie made a name for herself in the tech industry. 668 00:34:12,880 --> 00:34:15,360 Speaker 5: And so this, I think, is really key, and I 669 00:34:15,360 --> 00:34:18,759 Speaker 5: think people do this without even really meaning to, 670 00:34:19,000 --> 00:34:23,400 Speaker 5: is that in the tech industry, women and queer folks and 671 00:34:23,480 --> 00:34:27,040 Speaker 5: trans folks and Black folks and folks with disabilities and 672 00:34:27,600 --> 00:34:30,320 Speaker 5: all different kinds of people who are traditionally marginalized in 673 00:34:30,400 --> 00:34:33,880 Speaker 5: our society have been there from the beginning of technology and 674 00:34:33,960 --> 00:34:35,360 Speaker 5: have been since the very start. 675 00:34:35,640 --> 00:34:36,160 Speaker 3: We have this 676 00:34:36,239 --> 00:34:38,760 Speaker 5: idea that, like, oh, tech is this, like, white male 677 00:34:38,880 --> 00:34:41,160 Speaker 5: cis boys' club and anybody else is trying to break 678 00:34:41,200 --> 00:34:43,920 Speaker 5: their way in, and I can understand why people 679 00:34:43,920 --> 00:34:46,400 Speaker 5: feel that way, but we have been there from the 680 00:34:46,520 --> 00:34:49,600 Speaker 5: very beginning. And if you don't always hear our stories 681 00:34:49,719 --> 00:34:50,840 Speaker 5: or our voices, it's 682 00:34:50,680 --> 00:34:51,640 Speaker 3: not because we're not there. 683 00:34:51,640 --> 00:34:54,600 Speaker 5: It's because specific choices have been made to, 684 00:34:54,680 --> 00:34:57,600 Speaker 5: like, sideline us and to turn down our voices. And 685 00:34:57,680 --> 00:35:00,680 Speaker 5: so really starting from a place that, like, we do 686 00:35:00,800 --> 00:35:03,319 Speaker 5: belong in these conversations, we do belong in tech, we 687 00:35:03,360 --> 00:35:04,400 Speaker 5: do belong in the sector, 688 00:35:04,600 --> 00:35:06,319 Speaker 3: we're not outsiders, I
689 00:35:06,280 --> 00:35:10,480 Speaker 5: think is really, really important. And it also just really 690 00:35:10,719 --> 00:35:14,160 Speaker 5: matters, you know; if the people who are building technology 691 00:35:14,440 --> 00:35:18,480 Speaker 5: and talking about technology and thinking about technology are only 692 00:35:18,600 --> 00:35:21,279 Speaker 5: one type of person, the tech that gets built is 693 00:35:21,280 --> 00:35:21,680 Speaker 5: gonna be a 694 00:35:21,680 --> 00:35:22,520 Speaker 3: whole lot worse. 695 00:35:22,560 --> 00:35:25,399 Speaker 5: And so all of us benefit when more voices are 696 00:35:25,400 --> 00:35:29,200 Speaker 5: included and feel included and feel empowered to join the conversation. 697 00:35:29,560 --> 00:35:31,720 Speaker 6: You would think that would be obvious, and also 698 00:35:31,760 --> 00:35:33,919 Speaker 6: that it would save money. It would 699 00:35:33,920 --> 00:35:37,760 Speaker 6: save money on all the revamping and redoing of everything 700 00:35:37,840 --> 00:35:41,440 Speaker 6: that you knew, well, I would expect you knew, 701 00:35:41,800 --> 00:35:44,000 Speaker 6: could go wrong when you don't have the right 702 00:35:44,040 --> 00:35:47,719 Speaker 6: people, or all the people at least, represented in this conversation, 703 00:35:47,800 --> 00:35:50,600 Speaker 6: especially if you're wanting these people to use your technology. 704 00:35:55,160 --> 00:35:57,879 Speaker 5: You're so right. And, like, what's interesting is, I once 705 00:35:57,920 --> 00:36:00,800 Speaker 5: read this book called Mothers of Invention, all about how 706 00:36:01,320 --> 00:36:05,960 Speaker 5: things like misogyny and bias around women stifle innovation. And 707 00:36:06,600 --> 00:36:09,640 Speaker 5: something that really struck me from that book is how 708 00:36:09,719 --> 00:36:12,600 Speaker 5: much money gets left on the table because of things 709 00:36:12,640 --> 00:36:15,279 Speaker 5: like gender bias and misogyny. It's like, you would think 710 00:36:15,320 --> 00:36:18,560 Speaker 5: in a capitalistic society that for-profit companies would want 711 00:36:18,719 --> 00:36:21,880 Speaker 5: money above anything. You might be surprised, because they are 712 00:36:22,000 --> 00:36:27,239 Speaker 5: perfectly willing to lose money if it means further entrenching 713 00:36:27,480 --> 00:36:29,359 Speaker 5: gender bias and misogyny. 714 00:36:29,600 --> 00:36:32,040 Speaker 3: Yeah, which is so nonsensical. 715 00:36:32,000 --> 00:36:34,600 Speaker 6: Right, I guess it's power over money, even though money 716 00:36:34,600 --> 00:36:35,920 Speaker 6: could be considered power. 717 00:36:36,360 --> 00:36:38,040 Speaker 7: Yeah. 718 00:36:38,400 --> 00:36:40,320 Speaker 2: Yeah, I was trying to do a 719 00:36:40,320 --> 00:36:43,680 Speaker 2: wrap-up episode like this about video games and board games, 720 00:36:44,160 --> 00:36:48,800 Speaker 2: and I ran across kind of a stunning, upsetting amount 721 00:36:49,040 --> 00:36:54,160 Speaker 2: of how often people in power who are dudes 722 00:36:54,160 --> 00:36:59,279 Speaker 2: were like, it's just, women don't make good stuff, and 723 00:36:59,320 --> 00:37:02,439 Speaker 2: they would literally be like, uh, people don't buy those things, 724 00:37:02,480 --> 00:37:05,799 Speaker 2: those party games. Why, yes they do.
725 00:37:07,320 --> 00:37:10,040 Speaker 5: Yeah, look at the demographics of who is actually playing 726 00:37:10,120 --> 00:37:12,600 Speaker 5: video games. It is a lot of women, and so, 727 00:37:12,719 --> 00:37:18,160 Speaker 5: like, I think not making video games and stuff 728 00:37:18,160 --> 00:37:21,480 Speaker 5: that, like, women want when a big chunk of your 729 00:37:21,480 --> 00:37:24,120 Speaker 5: customer base just is women, that is 730 00:37:24,160 --> 00:37:25,799 Speaker 5: on you; that is, like, you don't know how to 731 00:37:25,800 --> 00:37:29,000 Speaker 5: do it or are unwilling to do it. 732 00:37:29,160 --> 00:37:30,880 Speaker 5: Anything that you could say, it's just an excuse to 733 00:37:30,960 --> 00:37:31,600 Speaker 5: not do it. 734 00:37:31,600 --> 00:37:34,600 Speaker 6: Right. Like, you look at Animal Crossing and the fact 735 00:37:34,800 --> 00:37:39,479 Speaker 6: they made bank, especially during COVID and quarantine, and you're 736 00:37:39,520 --> 00:37:41,720 Speaker 6: like, you sure? Are you sure? 737 00:37:42,760 --> 00:37:43,440 Speaker 1: Yeah. 738 00:37:43,520 --> 00:37:45,719 Speaker 2: This guy in question, he was saying, like, all 739 00:37:45,760 --> 00:37:48,279 Speaker 2: the games women like to play, which are, in 740 00:37:48,400 --> 00:37:52,680 Speaker 2: his mind, much more, too political or too, like, party games, 741 00:37:53,040 --> 00:37:58,640 Speaker 2: like Animal Crossing. Mobile games, like, aren't good and don't make money, 742 00:37:58,640 --> 00:38:01,760 Speaker 2: and it's like, uh. 743 00:38:00,000 --> 00:38:01,680 Speaker 3: But are you sure? 744 00:38:03,200 --> 00:38:03,840 Speaker 4: I can't. 745 00:38:04,520 --> 00:38:05,200 Speaker 3: Why? 746 00:38:05,560 --> 00:38:05,920 Speaker 6: Oh. 747 00:38:06,480 --> 00:38:09,480 Speaker 2: This also reminds me of, it's kind of upsetting, but 748 00:38:09,560 --> 00:38:13,400 Speaker 2: it reminds me of when Sally Ride was, you know, 749 00:38:13,440 --> 00:38:16,319 Speaker 2: getting ready to go to space, and they would have 750 00:38:16,360 --> 00:38:19,120 Speaker 2: these press conferences with her and all these men, and 751 00:38:19,160 --> 00:38:22,240 Speaker 2: the men, they're asking, like, all these, like, questions related 752 00:38:22,280 --> 00:38:23,160 Speaker 2: to the job, and her, 753 00:38:23,200 --> 00:38:25,040 Speaker 1: they'd be like, how are you going to put your 754 00:38:25,080 --> 00:38:26,200 Speaker 1: makeup on in space? 755 00:38:29,600 --> 00:38:33,399 Speaker 3: What? 756 00:38:33,440 --> 00:38:33,719 Speaker 6: Like? 757 00:38:33,800 --> 00:38:38,240 Speaker 2: It's still there, that kind of thing. We still gender these things, 758 00:38:38,280 --> 00:38:40,680 Speaker 2: and we still other people, and it does really matter 759 00:38:40,719 --> 00:38:44,880 Speaker 2: who's writing about it and whose stories are getting 760 00:38:44,960 --> 00:38:49,920 Speaker 2: published or getting that traction, because, you know, if it's 761 00:38:50,200 --> 00:38:53,520 Speaker 2: mostly white men's stories that are getting, like, all of the attention, 762 00:38:55,080 --> 00:38:57,919 Speaker 2: then that does help reinforce this idea that, yeah, it's 763 00:38:57,960 --> 00:38:59,920 Speaker 2: this white male space, when it isn't. 764 00:39:00,080 --> 00:39:01,320 Speaker 1: Hasn't ever been, as you said. 765 00:39:01,800 --> 00:39:03,880 Speaker 5: Yeah, did you ever hear?
I mean, I'm sure you 766 00:39:04,040 --> 00:39:06,000 Speaker 5: all have heard this story, but it really sticks with 767 00:39:06,040 --> 00:39:08,520 Speaker 5: me that when Sally Ride was going to space, she was 768 00:39:08,560 --> 00:39:12,200 Speaker 5: gonna be there for one week, and the, like, scientists 769 00:39:12,280 --> 00:39:15,319 Speaker 5: or engineers at NASA, whoever, were like, they gave her, 770 00:39:15,320 --> 00:39:18,680 Speaker 5: was it two hundred tampons or one hundred tampons? 771 00:39:18,800 --> 00:39:20,000 Speaker 3: She was gonna be there for one week! 772 00:39:20,320 --> 00:39:22,719 Speaker 5: And when she was like, oh, one hundred tampons, they 773 00:39:22,760 --> 00:39:26,120 Speaker 5: were like, will that be enough? Like, these are the 774 00:39:26,160 --> 00:39:29,759 Speaker 5: greatest scientific minds, right? Nobody can tell me that, 775 00:39:29,840 --> 00:39:34,719 Speaker 5: like, male scientists are, like, better than women when that's 776 00:39:34,760 --> 00:39:36,920 Speaker 5: what's going on with our male scientists. 777 00:39:37,239 --> 00:39:40,279 Speaker 2: Right, can you imagine if you honestly believed that 778 00:39:40,440 --> 00:39:44,880 Speaker 2: was true, and then to expect the women in 779 00:39:44,920 --> 00:39:47,360 Speaker 2: your life or the people who menstruate in your life 780 00:39:47,640 --> 00:39:48,040 Speaker 2: to just 781 00:39:48,000 --> 00:39:51,000 Speaker 1: be going through two hundred tampons. 782 00:39:50,520 --> 00:39:53,040 Speaker 6: And we have to pay that much, when 783 00:39:53,080 --> 00:39:55,080 Speaker 6: you're, like, looking at tampons and they're, like, thirty- 784 00:39:55,320 --> 00:39:59,400 Speaker 6: four for, like, eight dollars or thirty... 785 00:40:00,280 --> 00:40:02,880 Speaker 3: So it's like, how much? I have so many questions. 786 00:40:02,920 --> 00:40:04,560 Speaker 4: I have so many, so many questions. 787 00:40:04,600 --> 00:40:07,600 Speaker 6: There are so many questions, and to be fair, and, like, 788 00:40:07,640 --> 00:40:09,799 Speaker 6: in everything, like, how did you think this was gonna work? 789 00:40:09,840 --> 00:40:12,000 Speaker 4: Like, who invented this? This is not, this 790 00:40:12,040 --> 00:40:13,919 Speaker 4: doesn't work like that, what are you doing? 791 00:40:14,800 --> 00:40:15,040 Speaker 3: God. 792 00:40:15,120 --> 00:40:19,440 Speaker 5: Sometimes, like, whew, this is such a non sequitur, but sometimes 793 00:40:19,480 --> 00:40:24,280 Speaker 5: when I think about gender dynamics, like, I remember reading 794 00:40:24,280 --> 00:40:27,759 Speaker 5: this tweet from a guy who was like, people think 795 00:40:27,800 --> 00:40:31,319 Speaker 5: Taylor Swift is so pretty, but here she is without makeup, and 796 00:40:31,280 --> 00:40:32,960 Speaker 3: it's like, did you think that she was born with 797 00:40:33,000 --> 00:40:36,399 Speaker 3: cat eyes and, like, bright red lips? And you thought 798 00:40:36,480 --> 00:40:39,319 Speaker 3: all of us thought that? You clearly thought that, but 799 00:40:39,400 --> 00:40:39,920 Speaker 3: you thought all of 800 00:40:40,000 --> 00:40:42,680 Speaker 5: us thought that, like, and then you're like, you're supposed 801 00:40:42,680 --> 00:40:44,960 Speaker 5: to be getting one over on us women. Like, the 802 00:40:45,000 --> 00:40:46,360 Speaker 5: whole thing is just laughable.
803 00:40:47,320 --> 00:40:51,480 Speaker 2: Yeah, it's like such a great example of who we 804 00:40:51,600 --> 00:40:55,520 Speaker 2: consider the norm in society, whose experience we consider the 805 00:40:55,560 --> 00:40:59,200 Speaker 2: norm we base it on. Because I was thinking 806 00:40:59,239 --> 00:41:03,840 Speaker 2: about that recently too, about this whole idea that I 807 00:41:03,840 --> 00:41:07,279 Speaker 2: feel like has largely faded away, but not always, of 808 00:41:07,400 --> 00:41:09,319 Speaker 2: like women who would wake up early and put on 809 00:41:09,360 --> 00:41:14,040 Speaker 2: makeup so their partner would not see them without makeup. 810 00:41:14,480 --> 00:41:18,560 Speaker 2: So it's like a lot of us feel 811 00:41:18,600 --> 00:41:22,439 Speaker 2: that we have to do this thing because otherwise we'll 812 00:41:22,440 --> 00:41:27,040 Speaker 2: get this complaint from some random jackass on social media. 813 00:41:27,760 --> 00:41:30,080 Speaker 2: But then it's also like, dude, what do you think? 814 00:41:30,480 --> 00:41:35,560 Speaker 2: It's strange. It's like a strange dynamic that is happening, 815 00:41:35,760 --> 00:41:38,440 Speaker 2: where you feel the pressure to do it, but then 816 00:41:38,600 --> 00:41:42,560 Speaker 2: also, shouldn't people realize how much work 817 00:41:42,400 --> 00:41:45,359 Speaker 6: it is? Right, yeah. And it kind of goes into 818 00:41:45,360 --> 00:41:47,919 Speaker 6: another question that I'm kind of concerned about. 819 00:41:48,200 --> 00:41:50,359 Speaker 6: Like, I've been noticing more and more on TikTok they 820 00:41:50,400 --> 00:41:52,960 Speaker 6: just have automatic filters, and a lot of people talk 821 00:41:53,000 --> 00:41:55,120 Speaker 6: about the fact that they use different filters, especially because 822 00:41:55,120 --> 00:41:56,960 Speaker 6: they don't want to wear makeup and all that, and 823 00:41:56,960 --> 00:41:59,520 Speaker 6: this makes them feel better, which, all in all, 824 00:41:59,520 --> 00:42:02,520 Speaker 6: whatever, you do you. But then there's this level of expectation 825 00:42:03,640 --> 00:42:07,919 Speaker 6: that especially men, especially, I must say, cis heterosexual men, 826 00:42:08,840 --> 00:42:11,040 Speaker 6: think that this is what you should look like, and 827 00:42:11,080 --> 00:42:13,400 Speaker 6: if you don't look like that in real life, oh, 828 00:42:13,840 --> 00:42:15,840 Speaker 6: you've catfished me and you've lied to me. And this 829 00:42:16,560 --> 00:42:20,480 Speaker 6: new level of standard, a beauty standard, might be like, okay, 830 00:42:20,480 --> 00:42:22,560 Speaker 6: this is good for us because it makes us kind 831 00:42:22,560 --> 00:42:24,319 Speaker 6: of feel better, but at the same time, what is 832 00:42:24,320 --> 00:42:24,920 Speaker 6: it leading to? 833 00:42:25,760 --> 00:42:28,560 Speaker 3: That's such an interesting question. And I guess I just 834 00:42:28,600 --> 00:42:29,200 Speaker 3: think that, like,
835 00:42:31,160 --> 00:42:35,919 Speaker 5: a lot of men, like cis heterosexual men, I think, 836 00:42:35,960 --> 00:42:39,600 Speaker 5: are really willing to live in a fantasy world. Like, 837 00:42:39,640 --> 00:42:42,319 Speaker 5: it's like, if you think that, 838 00:42:42,560 --> 00:42:45,160 Speaker 5: you know, some of those TikTok filters, they give you, 839 00:42:45,239 --> 00:42:47,520 Speaker 5: like, glitter on your eyes, if you thought that, like, 840 00:42:47,600 --> 00:42:49,759 Speaker 5: somebody came out of the womb with, like, glitter on 841 00:42:49,800 --> 00:42:53,120 Speaker 5: their eyes... Honestly, it's one 842 00:42:53,120 --> 00:42:55,319 Speaker 5: of the reasons why I don't really remove my body hair, 843 00:42:55,360 --> 00:42:57,440 Speaker 5: because when you add up how much time it takes 844 00:42:57,480 --> 00:42:59,440 Speaker 5: to do it, it's actually a 845 00:42:59,440 --> 00:43:02,239 Speaker 5: lot of work. And it's like, do I want to 846 00:43:02,280 --> 00:43:05,239 Speaker 5: do extra work and pay whatever extra money it takes 847 00:43:05,280 --> 00:43:07,560 Speaker 5: to buy the razor and this and that to feed 848 00:43:07,600 --> 00:43:09,040 Speaker 5: into a fantasy that 849 00:43:09,200 --> 00:43:11,640 Speaker 3: adult women don't have body hair? No, I don't. 850 00:43:11,719 --> 00:43:14,640 Speaker 5: Like, you should by now 851 00:43:14,680 --> 00:43:17,200 Speaker 5: know that adult women do generally have body hair, 852 00:43:17,200 --> 00:43:23,080 Speaker 5: and I'm not interested in perpetuating this, like, very weird 853 00:43:23,280 --> 00:43:26,680 Speaker 5: fantasy world where this becomes how women are in the world, 854 00:43:26,960 --> 00:43:28,640 Speaker 5: and then if you see a woman who is not 855 00:43:28,719 --> 00:43:32,080 Speaker 5: doing that, it's, like, out of sync 856 00:43:32,080 --> 00:43:33,880 Speaker 5: with how you think women should exist, I guess I'll say. 857 00:43:33,880 --> 00:43:35,040 Speaker 5: I don't know if that makes sense. 858 00:43:35,640 --> 00:43:38,400 Speaker 6: No, yeah, it absolutely does. It kind of, again, 859 00:43:38,440 --> 00:43:41,560 Speaker 6: feeds into this narrative that women are supposed to meet 860 00:43:41,640 --> 00:43:46,719 Speaker 6: this higher level of beauty standard in order to fit 861 00:43:46,800 --> 00:43:49,279 Speaker 6: with a norm, which is so much less work for 862 00:43:49,400 --> 00:43:51,400 Speaker 6: men in general. 863 00:43:51,640 --> 00:43:55,960 Speaker 2: Yeah, and when we're looking at, like, tech, I mean, 864 00:43:56,000 --> 00:43:58,680 Speaker 2: so many of the things that we've talked about on here 865 00:43:58,840 --> 00:44:01,640 Speaker 2: do play into that, whether it's filters or social media, 866 00:44:01,719 --> 00:44:06,080 Speaker 2: these kinds of beauty standards that are getting perpetuated. 867 00:44:06,800 --> 00:44:12,920 Speaker 2: Just the vitriol women can receive just by, here's my 868 00:44:13,040 --> 00:44:20,800 Speaker 2: face on social media. So it is really unfortunate, because 869 00:44:22,000 --> 00:44:25,400 Speaker 2: we can't ignore that that does happen.
And 870 00:44:25,480 --> 00:44:27,759 Speaker 2: I know I've told the story before, but I have 871 00:44:27,800 --> 00:44:31,160 Speaker 2: a lovely collage of all of the horrible things people 872 00:44:31,200 --> 00:44:32,240 Speaker 2: have said to me online. 873 00:44:32,960 --> 00:44:38,120 Speaker 1: But it's also like, why can't we just... These are 874 00:44:38,200 --> 00:44:42,040 Speaker 1: women who are doing amazing things in AI. We don't 875 00:44:42,120 --> 00:44:44,640 Speaker 1: have to talk about how they look. We really don't. 876 00:44:45,360 --> 00:44:46,160 Speaker 1: We really don't. 877 00:44:46,560 --> 00:44:48,279 Speaker 6: We don't need to know how many children she has. 878 00:44:48,360 --> 00:44:50,200 Speaker 6: We don't need to know if she has a husband or 879 00:44:50,200 --> 00:44:52,719 Speaker 6: a partner. Like, that's not what we want to know. Are 880 00:44:52,760 --> 00:44:56,160 Speaker 6: you doing better AI that protects people? Yeah, wonderful, let's 881 00:44:56,160 --> 00:44:56,799 Speaker 6: support that. 882 00:44:57,040 --> 00:44:58,640 Speaker 3: Yes, and it's such a, yeah. 883 00:44:58,680 --> 00:45:01,880 Speaker 5: It's just such a missed opportunity, because, like, the 884 00:45:01,960 --> 00:45:06,360 Speaker 5: majority of people who are making sure that AI is 885 00:45:06,480 --> 00:45:09,840 Speaker 5: safe or more inclusive, or doesn't harm people, or is 886 00:45:09,880 --> 00:45:13,719 Speaker 5: ethical, or doing really interesting work, those people tend to 887 00:45:13,800 --> 00:45:16,880 Speaker 5: also be women, people of color, people who are not 888 00:45:17,040 --> 00:45:21,239 Speaker 5: necessarily treated as the norm in tech. And so when 889 00:45:21,280 --> 00:45:24,360 Speaker 5: you have an opportunity to talk to these people, actually 890 00:45:24,520 --> 00:45:28,520 Speaker 5: use it, because what they are working on probably really matters, 891 00:45:28,560 --> 00:45:30,160 Speaker 5: and, like, it matters to all of us. Even if 892 00:45:30,200 --> 00:45:32,800 Speaker 5: you're not somebody who is a techie, you're not somebody 893 00:45:32,840 --> 00:45:36,080 Speaker 5: who thinks of yourself as, like, somebody who thinks about 894 00:45:36,080 --> 00:45:38,960 Speaker 5: AI a lot, this stuff is going to matter to everybody. 895 00:45:39,000 --> 00:45:41,000 Speaker 5: And I guess that brings me to one of the 896 00:45:41,040 --> 00:45:43,759 Speaker 5: things that I'm looking forward to in twenty twenty four, 897 00:45:43,920 --> 00:45:47,399 Speaker 5: that, if I'm the girl in the cartoon, I've 898 00:45:47,440 --> 00:45:50,160 Speaker 5: got this in my bag slung over my shoulder, that 899 00:45:50,200 --> 00:45:52,239 Speaker 5: I'm taking it with me to the next year, and 900 00:45:52,280 --> 00:45:54,879 Speaker 5: that is all of us, each and every one of us, 901 00:45:55,280 --> 00:45:59,000 Speaker 5: being more involved in conversations around tech. Like, I do 902 00:45:59,040 --> 00:46:02,640 Speaker 5: think I have seen a shift this year around how 903 00:46:02,719 --> 00:46:06,040 Speaker 5: regular people like you and me and people listening, you know, 904 00:46:06,360 --> 00:46:09,719 Speaker 5: are thinking about technology and talking about technology.
I think 905 00:46:09,760 --> 00:46:13,040 Speaker 5: that we are done with this idea that tech is 906 00:46:13,120 --> 00:46:16,280 Speaker 5: only decided by a bunch of, like, super smart, genius 907 00:46:16,320 --> 00:46:18,440 Speaker 5: white guys who don't have to be accountable to us 908 00:46:18,440 --> 00:46:21,360 Speaker 5: at all because we're, like, not smart enough to understand 909 00:46:21,440 --> 00:46:22,560 Speaker 5: the brilliance of what they do. 910 00:46:23,000 --> 00:46:23,640 Speaker 3: That is out. 911 00:46:23,960 --> 00:46:25,839 Speaker 5: I think that in twenty twenty three, we are coming 912 00:46:25,880 --> 00:46:28,600 Speaker 5: to the realization that these people have been using that 913 00:46:28,719 --> 00:46:31,480 Speaker 5: dynamic as a way to essentially, like, get rich off 914 00:46:31,480 --> 00:46:33,919 Speaker 5: of us. And I think that we're going to start 915 00:46:33,960 --> 00:46:36,560 Speaker 5: asking some questions about that dynamic and pushing back, like, 916 00:46:36,920 --> 00:46:39,360 Speaker 5: should these companies and tech leaders be able to just 917 00:46:39,400 --> 00:46:42,640 Speaker 5: get rich off of us without us asking any questions 918 00:46:42,719 --> 00:46:45,560 Speaker 5: or having any say? I think I'm starting 919 00:46:45,560 --> 00:46:48,719 Speaker 5: to see people be like, no, actually, I am smart 920 00:46:48,840 --> 00:46:51,520 Speaker 5: enough to understand that I'm being taken advantage of and exploited, 921 00:46:51,800 --> 00:46:53,160 Speaker 5: and I have questions about that. 922 00:46:53,320 --> 00:46:55,560 Speaker 3: So let's keep asking those questions. 923 00:46:55,760 --> 00:46:58,840 Speaker 5: In twenty twenty four, let's bring that dynamic into the 924 00:46:58,840 --> 00:46:59,840 Speaker 5: new year with us. 925 00:47:00,160 --> 00:47:06,160 Speaker 2: Yes, yes, and of course, as always, your show is such 926 00:47:06,200 --> 00:47:08,920 Speaker 2: a good part of that and part of that conversation, 927 00:47:09,239 --> 00:47:14,680 Speaker 2: so we're very eagerly awaiting the next season. Do 928 00:47:14,719 --> 00:47:16,640 Speaker 2: you call it seasons, Bridget? We call 929 00:47:16,520 --> 00:47:19,360 Speaker 5: it seasons, but they're really just, like, when I get 930 00:47:19,760 --> 00:47:21,440 Speaker 5: tired of making the show and I have to take 931 00:47:21,480 --> 00:47:21,839 Speaker 5: a break. 932 00:47:21,880 --> 00:47:24,439 Speaker 3: So we are taking a break, but we'll be back 933 00:47:24,480 --> 00:47:24,960 Speaker 3: real soon. 934 00:47:26,280 --> 00:47:29,760 Speaker 6: Well, you're here with us, answering those questions and asking 935 00:47:29,840 --> 00:47:33,040 Speaker 6: those questions and allowing us to ask you those questions. 936 00:47:33,080 --> 00:47:35,759 Speaker 4: So we are very grateful for that and excited for that 937 00:47:35,800 --> 00:47:36,480 Speaker 4: for the new year. 938 00:47:37,160 --> 00:47:40,759 Speaker 2: Yes, yes, and we are hoping maybe we'll get to 939 00:47:41,000 --> 00:47:45,200 Speaker 2: hang out IRL and do some things in twenty twenty four one day. 940 00:47:45,640 --> 00:47:49,879 Speaker 3: One day. It's going to happen soon. Stay tuned, folks. 941 00:47:50,239 --> 00:47:54,600 Speaker 2: Yes, well, Bridget, thank you so much, as always, for 942 00:47:54,640 --> 00:47:56,439 Speaker 2: being here at the end of this year. 943 00:47:57,800 --> 00:48:00,160 Speaker 1: Where can the good listeners find you?
944 00:48:00,120 --> 00:48:02,160 Speaker 5: On my podcast, There Are No Girls on the Internet, 945 00:48:02,160 --> 00:48:04,800 Speaker 5: on iHeartRadio. You can find me on Instagram at Bridget 946 00:48:04,800 --> 00:48:08,200 Speaker 5: Marie in DC or on Twitter at Bridget Marie. 947 00:48:08,640 --> 00:48:12,680 Speaker 2: Yes, and definitely go do that, listeners, if you haven't already. Bridget, 948 00:48:12,800 --> 00:48:17,160 Speaker 2: I hope you have a good, relaxing holiday and weirdo Christmas. 949 00:48:17,440 --> 00:48:21,320 Speaker 5: Yes, thank you all. Happy Merry Whatever to all of y'all. 950 00:48:21,680 --> 00:48:26,640 Speaker 2: Yes, thank you. And listeners, if you would like to 951 00:48:26,680 --> 00:48:29,240 Speaker 2: contact us, you can. Our email is Stephanie and Momstuff 952 00:48:29,239 --> 00:48:31,319 Speaker 2: at iHeartMedia dot com. You can find us on Twitter 953 00:48:31,360 --> 00:48:33,319 Speaker 2: at Mom Stuff Podcast, or on Instagram and TikTok at 954 00:48:33,320 --> 00:48:35,359 Speaker 2: Stuff Mom Never Told You. We have a TeePublic store, 955 00:48:35,400 --> 00:48:37,799 Speaker 2: and we do have a book. Thanks, as always, to 956 00:48:37,960 --> 00:48:40,880 Speaker 2: our super producer Christina, our executive producer Maya, and our 957 00:48:40,920 --> 00:48:41,800 Speaker 2: contributor Joey. 958 00:48:42,160 --> 00:48:43,680 Speaker 4: Thank you, and happy holidays. 959 00:48:43,880 --> 00:48:46,880 Speaker 2: Yes, yes, and thanks to you for listening to Stuff 960 00:48:46,960 --> 00:48:49,120 Speaker 2: Mom Never Told You, a production of iHeartRadio. For more podcasts 961 00:48:49,120 --> 00:48:50,960 Speaker 2: from iHeartRadio, you can check out the iHeartRadio app, 962 00:48:50,960 --> 00:49:02,080 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows. 963 00:49:02,480 --> 00:49:02,640 Speaker 6: Ya