1 00:00:05,160 --> 00:00:06,720 Speaker 1: Hey, this is Annie and Samantha. 2 00:00:06,760 --> 00:00:07,440 Speaker 2: And welcome to Stuff. 3 00:00:07,440 --> 00:00:08,920 Speaker 3: Mom Never Told You, a production of iHeartRadio. 4 00:00:18,560 --> 00:00:20,919 Speaker 4: And today we are bringing back another episode that we 5 00:00:20,960 --> 00:00:23,440 Speaker 4: did with Bridget Todd, who is one of our favorites. 6 00:00:24,160 --> 00:00:27,840 Speaker 4: We always love talking to her and hearing what topics 7 00:00:27,880 --> 00:00:30,520 Speaker 4: she's brought to us. This is one I thought was 8 00:00:30,560 --> 00:00:34,120 Speaker 4: really fascinating about the first lady of the Internet and 9 00:00:34,400 --> 00:00:38,320 Speaker 4: the Lenna image and the history of that and how 10 00:00:39,080 --> 00:00:42,559 Speaker 4: the model that provided this image got no credit, got 11 00:00:42,600 --> 00:00:47,600 Speaker 4: no compensation, but it was so foundational to a lot 12 00:00:47,840 --> 00:00:50,720 Speaker 4: of the early Internet, and it's just something that really 13 00:00:51,320 --> 00:00:53,559 Speaker 4: stuck with me and I still think about, especially as 14 00:00:53,560 --> 00:00:55,720 Speaker 4: we're seeing all these advances right now. 15 00:00:56,200 --> 00:00:58,120 Speaker 1: So I thought, oh, why not. 16 00:00:58,000 --> 00:01:01,640 Speaker 3: Bring that one back. Enjoy this classic episode. 17 00:01:07,000 --> 00:01:09,199 Speaker 1: Hey, this is Annie and Samantha. And welcome to Stuff. 18 00:01:09,240 --> 00:01:10,840 Speaker 3: Mom Never Told You, a production of iHeartRadio. 19 00:01:20,400 --> 00:01:22,440 Speaker 4: And today we are once again thrilled to be joined 20 00:01:22,440 --> 00:01:26,960 Speaker 4: by the fabulous, the fantastic Bridget Todd. Welcome, Bridget. 21 00:01:27,600 --> 00:01:29,960 Speaker 5: Thank you so much for having me.
It's always such 22 00:01:29,959 --> 00:01:32,520 Speaker 5: a joy when I get to start my week talking 23 00:01:32,560 --> 00:01:33,000 Speaker 5: to you all. 24 00:01:33,760 --> 00:01:36,120 Speaker 1: Yes, I feel like you have an extra glow. 25 00:01:36,280 --> 00:01:38,480 Speaker 6: Maybe it's because you've been like soaking in all of 26 00:01:38,520 --> 00:01:42,720 Speaker 6: the sun on the beautiful beaches abroad. Actually, I've been 27 00:01:42,720 --> 00:01:44,880 Speaker 6: stalking you on Instagram and I'm like, how is this 28 00:01:44,959 --> 00:01:46,120 Speaker 6: woman always traveling? 29 00:01:46,160 --> 00:01:48,360 Speaker 1: And I miss it, and I'm sad that I'm not. 30 00:01:48,640 --> 00:01:51,680 Speaker 2: I'm just kidding. Oh, so we actually. 31 00:01:51,840 --> 00:01:56,080 Speaker 5: I was in Mazatlán, Mexico, for the eclipse, to be 32 00:01:56,120 --> 00:01:58,920 Speaker 5: in the path of totality. It was actually one of those 33 00:01:58,960 --> 00:02:01,840 Speaker 5: trips where we'd invited all of our friends, we're like, 34 00:02:01,840 --> 00:02:02,400 Speaker 5: we're gonna. 35 00:02:02,200 --> 00:02:04,240 Speaker 2: Get a big house on the beach. It's gonna be amazing. 36 00:02:04,560 --> 00:02:06,480 Speaker 5: And then all of our friends were in, and then, 37 00:02:06,600 --> 00:02:10,480 Speaker 5: one by one, by one, by one, they dropped out, and I'm just there alone, essentially. 38 00:02:10,960 --> 00:02:14,200 Speaker 1: But you enjoyed it? 39 00:02:14,600 --> 00:02:17,440 Speaker 5: I enjoyed it. I love Mexico. It is one of 40 00:02:17,480 --> 00:02:19,919 Speaker 5: my favorite places. It was my first time in Mazatlán. 41 00:02:20,800 --> 00:02:22,600 Speaker 5: Ten out of ten, completely recommend. 42 00:02:22,800 --> 00:02:26,360 Speaker 3: Okay, did you see the eclipse in totality? 43 00:02:26,800 --> 00:02:28,400 Speaker 2: I saw the eclipse in totality.
44 00:02:28,480 --> 00:02:31,520 Speaker 5: It was my first time ever being in the path 45 00:02:31,520 --> 00:02:33,600 Speaker 5: of totality of an eclipse. Have either of 46 00:02:33,639 --> 00:02:34,640 Speaker 5: you ever experienced this? 47 00:02:35,000 --> 00:02:36,720 Speaker 3: No? Yes? Once? 48 00:02:36,919 --> 00:02:39,360 Speaker 2: What were your thoughts, Annie? I am dying to know. 49 00:02:43,360 --> 00:02:45,720 Speaker 4: Oh my gosh, if this was a different podcast, we 50 00:02:45,720 --> 00:02:48,720 Speaker 4: would go into a whole separate thing, because I had 51 00:02:48,760 --> 00:02:53,600 Speaker 4: like a relationship issue that was happening on this day 52 00:02:52,840 --> 00:02:57,679 Speaker 4: and kind of a drama situation. So a lot of 53 00:02:57,720 --> 00:02:59,359 Speaker 4: times when I look back at it, a lot of 54 00:02:59,360 --> 00:03:01,720 Speaker 4: the pictures, I'd be like, oh wow, we were fighting, 55 00:03:02,480 --> 00:03:09,120 Speaker 4: but it was also a work event that I was at, 56 00:03:09,360 --> 00:03:11,000 Speaker 4: so there was that layer. But it was beautiful. It 57 00:03:11,080 --> 00:03:15,840 Speaker 4: was so cool. It sounds silly, but I love space, 58 00:03:15,919 --> 00:03:18,799 Speaker 4: like, the stars are like my favorite thing. 59 00:03:20,080 --> 00:03:24,040 Speaker 4: So it was really really cool to see. It was 60 00:03:24,120 --> 00:03:27,840 Speaker 4: not quite what I expected because the glasses. Samantha and 61 00:03:27,840 --> 00:03:30,720 Speaker 4: I were joking about this recently, but the glasses feel 62 00:03:30,760 --> 00:03:33,200 Speaker 4: so funny because you're like looking around like they're not working, 63 00:03:34,040 --> 00:03:37,840 Speaker 4: and then you like find the space you're supposed to 64 00:03:37,920 --> 00:03:41,960 Speaker 4: look at.
I was just like, very very happy to 65 00:03:42,040 --> 00:03:46,360 Speaker 4: see it. Honestly, like, all that drama aside, 66 00:03:46,480 --> 00:03:48,600 Speaker 4: I remember thinking, this is really cool that I get 67 00:03:48,640 --> 00:03:50,240 Speaker 4: to see this, and I'm really happy that I get 68 00:03:50,240 --> 00:03:50,640 Speaker 4: to see this. 69 00:03:50,960 --> 00:03:53,480 Speaker 2: Yes, that was what I remember too. 70 00:03:54,640 --> 00:03:59,480 Speaker 5: I burst into tears, and the next day I woke 71 00:03:59,560 --> 00:04:01,080 Speaker 5: up in the middle of the night in a 72 00:04:01,440 --> 00:04:05,120 Speaker 5: panic because I was worried that I would forget what 73 00:04:05,160 --> 00:04:08,720 Speaker 5: it looked like being in totality like that. Like, that's 74 00:04:08,760 --> 00:04:10,760 Speaker 5: how, like, I had never seen it, I had 75 00:04:10,760 --> 00:04:13,200 Speaker 5: never seen anything like it. Anybody who listens to There Are 76 00:04:13,240 --> 00:04:15,120 Speaker 5: No Girls on the Internet is probably so sick of 77 00:04:15,160 --> 00:04:17,440 Speaker 5: me talking about this eclipse, and I am fully 78 00:04:17,480 --> 00:04:21,039 Speaker 5: making seeing this total eclipse like my personality. 79 00:04:21,120 --> 00:04:21,560 Speaker 7: But it's. 80 00:04:23,000 --> 00:04:25,359 Speaker 5: But yeah, and I'm already planning where I will go 81 00:04:25,400 --> 00:04:27,640 Speaker 5: for the next one. So I guess I will see 82 00:04:27,680 --> 00:04:30,640 Speaker 5: y'all in, I think, what is it, Spain? 83 00:04:31,839 --> 00:04:34,120 Speaker 2: So that's the next one for you? 84 00:04:34,480 --> 00:04:36,120 Speaker 5: The next one that you can see, if 85 00:04:36,160 --> 00:04:38,000 Speaker 5: you can go to Spain and see, is I think in 86 00:04:38,080 --> 00:04:41,800 Speaker 5: twenty twenty six.
Okay. So, but then the one that 87 00:04:41,839 --> 00:04:44,599 Speaker 5: you're referring to, Sam, is like, don't quote me on 88 00:04:44,640 --> 00:04:46,479 Speaker 5: any of this, but that's supposed to be like the 89 00:04:46,520 --> 00:04:48,560 Speaker 5: big one, the big one that we will probably be 90 00:04:48,560 --> 00:04:50,920 Speaker 5: able to see in our lifetimes, and I think it's 91 00:04:51,000 --> 00:04:54,200 Speaker 5: in parts of the African continent. 92 00:04:54,960 --> 00:04:57,400 Speaker 2: I want to say Morocco. Don't quote me on that either. 93 00:04:57,880 --> 00:04:59,200 Speaker 6: Yeah, I did see at one point like 94 00:04:59,200 --> 00:05:01,240 Speaker 6: you would be able to view it in the US. That's 95 00:05:01,279 --> 00:05:03,160 Speaker 6: the next time you'll be able to see it. I don't know. 96 00:05:03,240 --> 00:05:06,799 Speaker 6: It's like the actual, like the totality, as you say, 97 00:05:07,120 --> 00:05:09,800 Speaker 6: but like I don't know because I know nothing about this. 98 00:05:11,200 --> 00:05:14,320 Speaker 1: That's the only one I know the date of, no kidding. 99 00:05:14,320 --> 00:05:15,640 Speaker 2: For this past eclipse. 100 00:05:15,720 --> 00:05:19,160 Speaker 5: Earlier this month, we had done so much planning, including 101 00:05:19,240 --> 00:05:22,080 Speaker 5: like looking at farmer's almanacs to see what the weather 102 00:05:22,160 --> 00:05:24,600 Speaker 5: and cloud coverage is like this time of year.
And 103 00:05:24,640 --> 00:05:27,440 Speaker 5: that's how we settled on Mazatlán, Mexico, because it was 104 00:05:27,480 --> 00:05:30,680 Speaker 5: the place that is closest to us on the East 105 00:05:30,720 --> 00:05:33,800 Speaker 5: Coast in the United States that was most likely to 106 00:05:33,880 --> 00:05:37,040 Speaker 5: not have cloud coverage in April, because you could see 107 00:05:37,080 --> 00:05:39,279 Speaker 5: it from Vermont and upstate New York and Texas, but 108 00:05:39,320 --> 00:05:41,520 Speaker 5: a lot of those places in April might be cloudy. 109 00:05:41,560 --> 00:05:43,920 Speaker 5: And so I have friends who are in Vermont and 110 00:05:44,000 --> 00:05:45,600 Speaker 5: upstate New York who were like, oh, we're just. 111 00:05:45,560 --> 00:05:47,480 Speaker 2: Gonna see it from our house, and I'm like. 112 00:05:47,360 --> 00:05:48,159 Speaker 3: Oh, will you? 113 00:05:48,960 --> 00:05:52,359 Speaker 5: Then on the morning of the eclipse in Mazatlán, Mexico, 114 00:05:52,720 --> 00:05:54,920 Speaker 5: we'd been there for a week. Every single day it's 115 00:05:54,920 --> 00:05:58,120 Speaker 5: like a beautiful, cloudless, blue sky day. So I wake 116 00:05:58,200 --> 00:06:00,200 Speaker 5: up on April eighth, the day of the eclipse, and 117 00:06:00,320 --> 00:06:03,479 Speaker 5: it's cloudy, the first cloudy day we've had in Mexico 118 00:06:03,560 --> 00:06:06,839 Speaker 5: for the entire week we have been there. Luckily, during 119 00:06:06,839 --> 00:06:09,680 Speaker 5: the eclipse time, the clouds did part, so we did 120 00:06:09,720 --> 00:06:12,080 Speaker 5: get to see it. But there would have been a 121 00:06:12,120 --> 00:06:14,960 Speaker 5: lot of feelings had we not been able to see it. 122 00:06:14,960 --> 00:06:17,119 Speaker 4: It's a lot of pressure to put on a trip 123 00:06:17,200 --> 00:06:19,640 Speaker 4: like that. Honestly, it's pressure.
124 00:06:19,360 --> 00:06:22,599 Speaker 1: To put on the eclipse. I mean, it just exists. It's 125 00:06:22,640 --> 00:06:23,400 Speaker 1: not their fault. 126 00:06:25,920 --> 00:06:29,600 Speaker 5: So you can contact like the manager of the. 127 00:06:29,600 --> 00:06:32,479 Speaker 2: Sky to be like, actually, we didn't get a good view. 128 00:06:33,160 --> 00:06:35,840 Speaker 6: We were like, okay, for y'all who are Christians here, 129 00:06:36,040 --> 00:06:36,720 Speaker 6: tell this to God. 130 00:06:39,360 --> 00:06:42,479 Speaker 5: Truly, we're gonna get canceled. We were like getting a 131 00:06:42,480 --> 00:06:44,719 Speaker 5: little superstitious, like, the things that we were doing to 132 00:06:44,800 --> 00:06:47,960 Speaker 5: try to like ensure good sky. 133 00:06:48,800 --> 00:06:51,640 Speaker 2: It was getting a little, a little out there. We'll 134 00:06:51,640 --> 00:06:52,280 Speaker 2: just leave it at that. 135 00:06:52,440 --> 00:06:54,400 Speaker 1: She brought a shaman in. I guess we're going in. 136 00:06:55,720 --> 00:06:58,760 Speaker 4: Oh my gosh, Bridget, I want to ask so many 137 00:06:58,839 --> 00:07:05,440 Speaker 4: questions about this later. Oh well, I'm very 138 00:07:05,480 --> 00:07:06,680 Speaker 4: glad that you got to see it. 139 00:07:06,680 --> 00:07:07,040 Speaker 3: It is.
140 00:07:07,279 --> 00:07:11,560 Speaker 4: It is amazing, like truly. And over on the other 141 00:07:11,560 --> 00:07:14,680 Speaker 4: podcast I do, Savor, we did an episode on like 142 00:07:14,880 --> 00:07:19,400 Speaker 4: weird companies making money off of the eclipse with their products, 143 00:07:20,040 --> 00:07:22,400 Speaker 4: and I have heard from so many people about the 144 00:07:22,400 --> 00:07:24,200 Speaker 4: foods they made for the eclipse, and it's brought me 145 00:07:24,240 --> 00:07:30,040 Speaker 4: so much joy. Oh yeah, like totali-tea, like tea, 146 00:07:30,200 --> 00:07:34,160 Speaker 4: oh my gosh, so many things like this. So 147 00:07:34,240 --> 00:07:36,240 Speaker 4: I feel like we have a couple of, we have 148 00:07:36,280 --> 00:07:38,720 Speaker 4: some years to brainstorm things like this. 149 00:07:39,880 --> 00:07:43,280 Speaker 1: Next giant celebration, but keep that 150 00:07:43,200 --> 00:07:44,600 Speaker 3: in the back of your head, you 151 00:07:44,520 --> 00:07:48,600 Speaker 5: know. Oh, next time, definitely doing an eclipse-themed food party 152 00:07:48,680 --> 00:07:49,840 Speaker 5: or dinner party or something. 153 00:07:49,880 --> 00:07:50,400 Speaker 2: I love that. 154 00:07:51,040 --> 00:07:54,120 Speaker 3: Yes, there's so many puns. 155 00:07:54,120 --> 00:07:58,760 Speaker 4: I will, I will hold myself back for now, but 156 00:07:58,880 --> 00:08:01,440 Speaker 4: I have to say I am very, very excited to 157 00:08:01,520 --> 00:08:03,760 Speaker 4: talk about the topic you brought today, Bridget, because it 158 00:08:03,800 --> 00:08:06,559 Speaker 4: is a thing that I love, like the history 159 00:08:06,600 --> 00:08:08,400 Speaker 4: of something I think a lot of people don't question 160 00:08:08,440 --> 00:08:11,280 Speaker 4: the history of, and it's fascinating, and I didn't know 161 00:08:11,320 --> 00:08:11,720 Speaker 4: about it.
162 00:08:12,080 --> 00:08:14,840 Speaker 3: So can you tell us what we're discussing today? 163 00:08:15,280 --> 00:08:17,440 Speaker 5: I feel the exact same way. And today we are 164 00:08:17,480 --> 00:08:19,480 Speaker 5: talking about the Lenna image. 165 00:08:19,560 --> 00:08:21,400 Speaker 2: Is this something that either of you had ever heard of? 166 00:08:21,760 --> 00:08:22,520 Speaker 3: I did not, no. 167 00:08:23,000 --> 00:08:25,320 Speaker 1: I had not known. So. 168 00:08:25,400 --> 00:08:27,440 Speaker 5: Even if you're listening and you're like, what is the 169 00:08:27,480 --> 00:08:29,520 Speaker 5: Lenna image? I've never heard of this image. I've never 170 00:08:29,560 --> 00:08:32,199 Speaker 5: seen this image. Even if you don't know this story 171 00:08:32,240 --> 00:08:34,240 Speaker 5: and you don't feel like you've ever seen this image before, 172 00:08:34,520 --> 00:08:37,280 Speaker 5: you kind of do know this image, because, as Linda 173 00:08:37,440 --> 00:08:40,120 Speaker 5: Kinstler puts it in a really meaty piece for Wired 174 00:08:40,200 --> 00:08:42,520 Speaker 5: that I'll be referencing a few times in this conversation, 175 00:08:42,840 --> 00:08:45,520 Speaker 5: she writes, whether or not you know her face, you've 176 00:08:45,600 --> 00:08:49,079 Speaker 5: used the technology it helped create. Practically every photo you've 177 00:08:49,080 --> 00:08:51,640 Speaker 5: ever taken, every website you've ever visited. 178 00:08:51,559 --> 00:08:53,040 Speaker 2: Every meme you've ever shared.
179 00:08:53,320 --> 00:08:57,640 Speaker 5: Owes some small debt to Lenna. And it really is 180 00:08:57,760 --> 00:09:01,240 Speaker 5: exactly as you were saying, Annie, one of those stories 181 00:09:01,280 --> 00:09:04,600 Speaker 5: that is foundational to the Internet and technology that you 182 00:09:04,640 --> 00:09:07,200 Speaker 5: don't necessarily think of, don't necessarily think of how it 183 00:09:07,200 --> 00:09:10,560 Speaker 5: came to be. And especially, I think it's one of 184 00:09:10,559 --> 00:09:15,320 Speaker 5: those stories that says a lot about technology. You know, 185 00:09:15,320 --> 00:09:17,960 Speaker 5: here on Sminty, we've had plenty of conversations about this. 186 00:09:18,040 --> 00:09:20,600 Speaker 5: I've had many conversations about this on There Are No Girls 187 00:09:20,600 --> 00:09:24,000 Speaker 5: on the Internet, about how things like misogyny can 188 00:09:24,040 --> 00:09:27,760 Speaker 5: kind of be baked into the foundation of technology. 189 00:09:27,880 --> 00:09:30,040 Speaker 5: And I think that is one of the reasons why 190 00:09:30,120 --> 00:09:33,800 Speaker 5: tech is so often perpetuating misogyny, not because it's some 191 00:09:33,840 --> 00:09:37,719 Speaker 5: sort of an unfortunate bug, but because this misogyny can 192 00:09:37,760 --> 00:09:40,280 Speaker 5: be sort of foundational in some ways. And I think 193 00:09:40,280 --> 00:09:42,720 Speaker 5: this image really is a good example 194 00:09:42,400 --> 00:09:42,920 Speaker 2: of what I mean.
195 00:09:42,960 --> 00:09:46,040 Speaker 5: And I think, especially as we're having conversations about the 196 00:09:46,120 --> 00:09:49,600 Speaker 5: rise of things like nudify apps and AI-generated 197 00:09:49,640 --> 00:09:53,880 Speaker 5: adult content creators, we're seeing what is kind of becoming 198 00:09:54,320 --> 00:09:58,600 Speaker 5: a marketplace that is men making money off of the 199 00:09:58,679 --> 00:10:02,160 Speaker 5: bodies and/or the labor of women without their consent, 200 00:10:02,320 --> 00:10:06,080 Speaker 5: certainly without their compensation. And I think this situation with 201 00:10:06,160 --> 00:10:09,600 Speaker 5: the Lenna image, where the image of a woman went 202 00:10:09,600 --> 00:10:13,080 Speaker 5: on to create this entire field of technology without her consent, 203 00:10:13,320 --> 00:10:15,720 Speaker 5: can perhaps really tell us something about where we're headed 204 00:10:15,840 --> 00:10:16,840 Speaker 5: in twenty twenty four. 205 00:10:17,200 --> 00:10:20,240 Speaker 4: Yes, absolutely, especially when you consider where it comes from, 206 00:10:20,280 --> 00:10:22,120 Speaker 4: which I know we'll talk about. But also, yeah, these 207 00:10:22,200 --> 00:10:27,080 Speaker 4: conversations we're having now about like actors perhaps not giving 208 00:10:27,120 --> 00:10:30,960 Speaker 4: their consent to being used in certain ways. And 209 00:10:31,440 --> 00:10:32,959 Speaker 4: honestly, it extends to all of us if you've 210 00:10:32,960 --> 00:10:36,480 Speaker 4: posted an image online, right, not consenting to an 211 00:10:36,480 --> 00:10:38,760 Speaker 4: image getting used in a certain way. 212 00:10:39,200 --> 00:10:42,400 Speaker 4: But so much about this history is fascinating because it 213 00:10:42,440 --> 00:10:46,920 Speaker 4: feels so standardized, which is odd. Can you tell us 214 00:10:46,920 --> 00:10:47,360 Speaker 4: about that?
215 00:10:48,160 --> 00:10:48,640 Speaker 2: Totally. 216 00:10:48,720 --> 00:10:51,720 Speaker 5: So for folks who don't know, the Lenna image is 217 00:10:52,280 --> 00:10:56,160 Speaker 5: literally an image of this woman, Lenna, Lenna Forsen. She 218 00:10:56,280 --> 00:10:58,640 Speaker 5: is a woman from Sweden who in the seventies was 219 00:10:58,720 --> 00:11:02,439 Speaker 5: a model. So this kind of sensual image of her 220 00:11:02,520 --> 00:11:05,760 Speaker 5: wearing a tan hat with a purple feather flowing down 221 00:11:05,800 --> 00:11:09,040 Speaker 5: her bare back, staring kind of seductively over one shoulder, 222 00:11:09,440 --> 00:11:12,120 Speaker 5: that image of her was published in Playboy in nineteen 223 00:11:12,160 --> 00:11:15,440 Speaker 5: seventy two. She was essentially a Playmate. That image would 224 00:11:15,480 --> 00:11:18,480 Speaker 5: go on to become what's called a standard test image. 225 00:11:18,480 --> 00:11:22,200 Speaker 2: So big caveat here. I am not an engineer. 226 00:11:22,679 --> 00:11:24,720 Speaker 5: If I say something that's not right, if you're 227 00:11:24,720 --> 00:11:27,120 Speaker 5: an engineer listening and you're like, that's not totally correct: 228 00:11:27,440 --> 00:11:28,400 Speaker 2: I am not an engineer. 229 00:11:28,559 --> 00:11:31,120 Speaker 5: But here is a definition of what a standard test 230 00:11:31,160 --> 00:11:33,760 Speaker 5: image is that I found from Kaggle dot com, which 231 00:11:33,800 --> 00:11:36,960 Speaker 5: is like a developer community site.
They say a standard 232 00:11:37,000 --> 00:11:39,960 Speaker 5: test image is a digital image file used across different 233 00:11:40,000 --> 00:11:44,120 Speaker 5: institutions to test image processing and image compression algorithms. By 234 00:11:44,200 --> 00:11:47,160 Speaker 5: using the same standard test images, different labs were able 235 00:11:47,200 --> 00:11:50,760 Speaker 5: to compare results both visually and quantitatively. The images are 236 00:11:50,760 --> 00:11:54,120 Speaker 5: in many cases chosen to represent natural or typical images 237 00:11:54,400 --> 00:11:57,199 Speaker 5: that a class of processing techniques would need to deal with. 238 00:11:57,600 --> 00:12:00,240 Speaker 5: Other test images are chosen because they present a range 239 00:12:00,280 --> 00:12:04,160 Speaker 5: of challenges to image reconstruction algorithms, such as the reproduction 240 00:12:04,240 --> 00:12:07,440 Speaker 5: of fine detail and textures, sharp transitions and edges, and 241 00:12:07,559 --> 00:12:11,360 Speaker 5: uniform regions. So basically, to put that in layman's terms, 242 00:12:11,880 --> 00:12:14,640 Speaker 5: a standard test image is like a test image that 243 00:12:14,760 --> 00:12:18,400 Speaker 5: tests to make sure that the technology is working as 244 00:12:18,440 --> 00:12:20,080 Speaker 5: it should be, or like rendering the way that it 245 00:12:20,080 --> 00:12:24,160 Speaker 5: should be. Lenna's image is not the only common standard 246 00:12:24,200 --> 00:12:27,120 Speaker 5: test image. There's also one that is like a bunch 247 00:12:27,160 --> 00:12:30,079 Speaker 5: of different colored jelly beans on a table. There's another 248 00:12:30,080 --> 00:12:32,400 Speaker 5: one that's called Peppers that's just a bunch of different 249 00:12:32,400 --> 00:12:35,480 Speaker 5: colored, like red and green peppers, like jalapeño peppers.
So 250 00:12:35,760 --> 00:12:38,440 Speaker 5: this is just a thing that becomes a way for 251 00:12:38,800 --> 00:12:42,880 Speaker 5: technologists to test that the image generating technology is working correctly. 252 00:12:43,440 --> 00:12:45,679 Speaker 4: I do think this is very interesting for a lot 253 00:12:45,679 --> 00:12:48,080 Speaker 4: of reasons. But if you have like jelly beans and peppers, 254 00:12:48,080 --> 00:12:53,240 Speaker 4: those are things to be consumed, and then when you're 255 00:12:53,280 --> 00:12:57,120 Speaker 4: thinking about where they got this image of Lenna from, like, 256 00:12:57,160 --> 00:13:01,200 Speaker 4: how did this happen? How did this image become this 257 00:13:01,360 --> 00:13:03,360 Speaker 4: standard testing thing? 258 00:13:03,840 --> 00:13:05,600 Speaker 2: So this is actually a pretty interesting story. 259 00:13:05,880 --> 00:13:09,000 Speaker 5: The story of how Lenna's Playboy picture became this standard 260 00:13:09,160 --> 00:13:12,640 Speaker 5: test image that is everywhere and very ubiquitous starts with 261 00:13:12,800 --> 00:13:17,520 Speaker 5: computer and electrical engineer Alexander Sawchuk. According to the newsletter 262 00:13:17,600 --> 00:13:20,720 Speaker 5: for the Institute of Electrical and Electronics Engineers, or the 263 00:13:20,760 --> 00:13:23,640 Speaker 5: I triple E, as I have found out it's sometimes called. 264 00:13:23,800 --> 00:13:25,920 Speaker 5: I was talking to somebody about this and I was like, oh, 265 00:13:26,000 --> 00:13:26,839 Speaker 5: the I. 266 00:13:27,000 --> 00:13:29,720 Speaker 2: E-E-E, and they were like, it's just the 267 00:13:29,720 --> 00:13:30,880 Speaker 2: I triple E. 268 00:13:32,480 --> 00:13:38,600 Speaker 1: The I triple E. Exactly. 269 00:13:39,840 --> 00:13:42,000 Speaker 2: So it's the summer of nineteen seventy three.
270 00:13:42,080 --> 00:13:45,160 Speaker 5: Alexander Sawchuk was an assistant professor of electrical engineering at 271 00:13:45,200 --> 00:13:48,160 Speaker 5: the University of Southern California and also a grad student 272 00:13:48,200 --> 00:13:50,600 Speaker 5: in the SIPI lab as a manager. 273 00:13:50,920 --> 00:13:52,360 Speaker 2: As the story goes, he's like 274 00:13:52,360 --> 00:13:55,320 Speaker 5: frantically searching around the lab for a good image to 275 00:13:55,400 --> 00:13:58,560 Speaker 5: scan for a colleague's conference paper. He had just sort 276 00:13:58,559 --> 00:14:01,480 Speaker 5: of gotten bored with their usual stock test images because 277 00:14:01,480 --> 00:14:05,440 Speaker 5: they mostly had come from like nineteen sixties TV standards 278 00:14:04,960 --> 00:14:06,000 Speaker 2: and were just a little bit dull. 279 00:14:06,440 --> 00:14:09,520 Speaker 5: He wanted something glossy and sort of like fresh and dynamic, 280 00:14:09,600 --> 00:14:12,600 Speaker 5: but he also wanted to use a human face specifically. 281 00:14:13,240 --> 00:14:16,000 Speaker 5: Just then, as the story goes, somebody happens to walk 282 00:14:16,040 --> 00:14:19,680 Speaker 5: in holding the most recent issue of Playboy magazine. 283 00:14:19,840 --> 00:14:20,760 Speaker 2: Why this person was 284 00:14:20,680 --> 00:14:24,560 Speaker 5: bringing Playboy magazine into his workplace, I cannot 285 00:14:24,320 --> 00:14:27,120 Speaker 2: tell you. 286 00:14:27,120 --> 00:14:28,560 Speaker 1: You just come into your workplace with that? 287 00:14:29,120 --> 00:14:29,360 Speaker 6: Cool?
288 00:14:29,480 --> 00:14:31,920 Speaker 5: Yeah. I mean, I do think that that sort 289 00:14:31,920 --> 00:14:35,800 Speaker 5: of like gives you a sense of like the dynamics 290 00:14:35,800 --> 00:14:37,800 Speaker 5: that we're dealing with, right, that somebody just happens to 291 00:14:37,840 --> 00:14:40,280 Speaker 5: walk in with the most recent Playboy 292 00:14:39,960 --> 00:14:40,560 Speaker 2: under their arm. 293 00:14:41,080 --> 00:14:44,640 Speaker 5: Right. The engineers tore away the top third of the 294 00:14:44,640 --> 00:14:47,120 Speaker 5: centerfold so they could wrap it around the drum of 295 00:14:47,160 --> 00:14:50,840 Speaker 5: their Muirhead wirephoto scanner, which they had outfitted with 296 00:14:50,920 --> 00:14:54,280 Speaker 5: analog to digital converters, one each for the red, green, and 297 00:14:54,280 --> 00:14:57,840 Speaker 5: blue channels, and an HP twenty one hundred minicomputer. 298 00:14:58,280 --> 00:15:01,080 Speaker 5: So all of that to say is that they effectively 299 00:15:01,480 --> 00:15:04,760 Speaker 5: cropped this image so that you can't see the model's 300 00:15:04,960 --> 00:15:07,080 Speaker 5: bare body in the image, so it's just a picture of 301 00:15:07,120 --> 00:15:09,240 Speaker 5: her from the shoulders up, looking over her shoulder. 302 00:15:09,280 --> 00:15:11,440 Speaker 2: It's still like quite a seductive 303 00:15:10,960 --> 00:15:14,520 Speaker 5: photo, but the full photo has her like bare booty 304 00:15:14,520 --> 00:15:17,120 Speaker 5: in it. She's wearing what looks like a feather boa 305 00:15:17,320 --> 00:15:18,560 Speaker 5: and like thigh-high 306 00:15:18,480 --> 00:15:19,920 Speaker 2: stockings, looking over her shoulder. 307 00:15:20,200 --> 00:15:23,560 Speaker 5: So back in the seventies and eighties, this image was 308 00:15:23,640 --> 00:15:26,560 Speaker 5: really sort of like used in very limited cases.
You 309 00:15:26,560 --> 00:15:29,480 Speaker 5: could only really see it on dot org domains. It 310 00:15:29,560 --> 00:15:32,600 Speaker 5: was pretty limited to like engineers. Then in July of 311 00:15:32,720 --> 00:15:35,400 Speaker 5: nineteen ninety one, the image was featured on the cover 312 00:15:35,480 --> 00:15:39,320 Speaker 5: of Optical Engineering, alongside that other test image of the 313 00:15:39,320 --> 00:15:40,360 Speaker 5: different colored peppers. 314 00:15:40,640 --> 00:15:43,520 Speaker 2: Funny enough, I took a look at that cover. 315 00:15:43,880 --> 00:15:45,560 Speaker 5: It's all black and white, so I'm like, ah, I 316 00:15:45,560 --> 00:15:48,960 Speaker 5: think they're trying to demonstrate that like these images had 317 00:15:48,960 --> 00:15:51,120 Speaker 5: all these different dynamic colors, but both of them are 318 00:15:51,120 --> 00:15:54,200 Speaker 5: rendered in black and white, kind of rendering that meaningless. 319 00:15:55,120 --> 00:15:57,840 Speaker 5: So this is when Playboy gets wind of this, and 320 00:15:57,880 --> 00:16:01,840 Speaker 5: they are not happy, because it's basically copyright infringement. Which, 321 00:16:02,520 --> 00:16:04,280 Speaker 5: this is not related to the story, but I always 322 00:16:04,320 --> 00:16:06,920 Speaker 5: have to add this whenever it comes up: how litigious 323 00:16:07,000 --> 00:16:10,000 Speaker 5: Hugh Hefner and Playboy were. I always think this is 324 00:16:10,120 --> 00:16:14,200 Speaker 5: very rich because, as y'all probably know, Hugh Hefner made 325 00:16:14,520 --> 00:16:19,040 Speaker 5: an entire lucrative industry off of images of Marilyn Monroe 326 00:16:19,080 --> 00:16:21,400 Speaker 5: that she did for a calendar company, for which 327 00:16:21,240 --> 00:16:22,840 Speaker 2: she was only paid fifty dollars.
328 00:16:23,160 --> 00:16:26,920 Speaker 5: Many years after that photo shoot, Hugh Hefner bought those 329 00:16:26,920 --> 00:16:31,160 Speaker 5: photographs from the calendar company and republished them without Marilyn Monroe's 330 00:16:31,200 --> 00:16:33,960 Speaker 5: consent or permission in nineteen fifty three. That was the 331 00:16:34,000 --> 00:16:37,920 Speaker 5: first ever issue of Playboy. Hugh Hefner paid five hundred dollars. 332 00:16:38,000 --> 00:16:40,520 Speaker 5: She got fifty dollars. Right. So whenever I read about 333 00:16:40,520 --> 00:16:44,000 Speaker 5: how litigious Playboy is, which they're very litigious, I always 334 00:16:44,000 --> 00:16:46,840 Speaker 5: have to chuckle at that. Oh, like, you don't 335 00:16:46,840 --> 00:16:50,840 Speaker 5: want somebody profiting off of your intellectual property, but you had 336 00:16:50,880 --> 00:16:55,800 Speaker 5: no problem profiting off of a woman's body without compensating 337 00:16:55,840 --> 00:16:57,320 Speaker 5: her fairly, or even getting her consent. 338 00:16:57,720 --> 00:17:00,560 Speaker 1: Interesting. This is like par for the course for him. 339 00:17:01,280 --> 00:17:03,920 Speaker 2: Oh my god, don't even get me started with Hugh Hefner. 340 00:17:03,920 --> 00:17:07,480 Speaker 2: We will be here all day. 341 00:17:08,000 --> 00:17:10,800 Speaker 6: The things that came out after he died, which I'm like, wow, 342 00:17:10,840 --> 00:17:14,480 Speaker 6: he had a pretty good, like powerful handle on people 343 00:17:14,600 --> 00:17:16,159 Speaker 6: not talking until he died. 344 00:17:16,680 --> 00:17:17,439 Speaker 2: Oh my gosh. 345 00:17:17,560 --> 00:17:21,160 Speaker 5: I was listening to an episode of Celebrity Memoir Book Club, 346 00:17:21,200 --> 00:17:24,160 Speaker 5: where they read a lot of ex-Playmate and ex- 347 00:17:24,200 --> 00:17:27,640 Speaker 5: Playboy Bunny memoirs.
Some of the things that they write about, 348 00:17:27,680 --> 00:17:30,840 Speaker 5: I'm like, oh my god. Like even Lenna in 349 00:17:30,920 --> 00:17:33,720 Speaker 5: an interview, she talks about how in the seventies, after 350 00:17:34,200 --> 00:17:36,480 Speaker 5: this photo shoot, she was invited to go to the 351 00:17:36,480 --> 00:17:40,360 Speaker 5: Playboy mansion and the quote is something like, they made 352 00:17:40,359 --> 00:17:42,120 Speaker 5: it clear in the invites that I would have to 353 00:17:42,160 --> 00:17:44,439 Speaker 5: spend time with Hugh Hefner while he was in his 354 00:17:44,600 --> 00:17:49,960 Speaker 5: dressing robe, and I said, no, thanks, I mean. 355 00:17:50,680 --> 00:17:51,480 Speaker 3: She already knew. 356 00:17:52,320 --> 00:18:06,400 Speaker 7: Was like yeah, yeah. 357 00:18:06,520 --> 00:18:09,399 Speaker 5: So Playboy threatens to sue these engineers, and at this point, 358 00:18:09,400 --> 00:18:12,280 Speaker 5: the engineers, it sounds like, had like grown very fond 359 00:18:12,320 --> 00:18:14,000 Speaker 5: of using this image, so they fought back. 360 00:18:14,280 --> 00:18:16,120 Speaker 2: Eventually Playboy backed down 361 00:18:15,920 --> 00:18:19,199 Speaker 5: because, as a Playboy vice president put it, quote, we 362 00:18:19,320 --> 00:18:22,320 Speaker 5: decided we should exploit this because it is a phenomenon. 363 00:18:22,520 --> 00:18:26,280 Speaker 5: So yeah, by his own words like, oh, let's exploit this. 364 00:18:26,680 --> 00:18:26,840 Speaker 7: Yeah. 365 00:18:26,920 --> 00:18:30,360 Speaker 2: Not to mention the fact that this is two groups 366 00:18:30,040 --> 00:18:33,560 Speaker 5: of men fighting over who owns this image of a woman, 367 00:18:34,520 --> 00:18:37,199 Speaker 5: in one case being used in a manner that is 368 00:18:37,240 --> 00:18:39,160 Speaker 5: completely without her consent or control.
369 00:18:39,359 --> 00:18:40,959 Speaker 2: It just it already from the beginning. 370 00:18:40,960 --> 00:18:44,200 Speaker 5: It just feels to me like men fighting over how 371 00:18:44,240 --> 00:18:47,199 Speaker 5: they can use a woman's representation that I think is 372 00:18:47,720 --> 00:18:50,400 Speaker 5: so foundational to some of the conversations we're having about 373 00:18:50,440 --> 00:18:53,080 Speaker 5: technology like AI right here in twenty twenty 374 00:18:52,840 --> 00:18:59,200 Speaker 4: four, absolutely, and she did become pretty foundational, right. 375 00:18:59,760 --> 00:19:02,600 Speaker 5: Oh, absolutely, So this is when the image of Lenna 376 00:19:03,080 --> 00:19:06,320 Speaker 5: really becomes super popular. The whole drama about the cover 377 00:19:06,880 --> 00:19:09,840 Speaker 5: catapults this image into what you might think of as 378 00:19:09,880 --> 00:19:13,160 Speaker 5: like early Internet virality or popularity. This was in nineteen 379 00:19:13,240 --> 00:19:16,040 Speaker 5: ninety five. The use of the photo in electronic imaging 380 00:19:16,200 --> 00:19:19,200 Speaker 5: has been described as clearly one of the most important 381 00:19:19,200 --> 00:19:20,480 Speaker 5: events in its history. 382 00:19:20,720 --> 00:19:21,840 Speaker 2: It is truly hard 383 00:19:21,600 --> 00:19:24,680 Speaker 5: to overstate how ubiquitous this one image is in technology. 384 00:19:24,920 --> 00:19:28,600 Speaker 5: There is this fascinating interactive piece by Jennifer Ding at 385 00:19:28,600 --> 00:19:29,040 Speaker 5: the Pudding. 386 00:19:29,359 --> 00:19:30,280 Speaker 2: The piece is so cool. 387 00:19:30,280 --> 00:19:32,760 Speaker 5: It's like one of those interactive pieces that has a timeline. 388 00:19:32,880 --> 00:19:33,840 Speaker 2: Definitely check it out.
389 00:19:33,960 --> 00:19:38,800 Speaker 5: But in that piece, Ding actually includes a freeze frame 390 00:19:38,920 --> 00:19:42,239 Speaker 5: of the show Silicon Valley on HBO, where in the 391 00:19:42,280 --> 00:19:43,440 Speaker 5: background there is a 392 00:19:43,359 --> 00:19:45,200 Speaker 2: poster with the Lenna image on the wall. 393 00:19:45,280 --> 00:19:48,320 Speaker 5: Right, So this image is also included in scientific journals 394 00:19:48,359 --> 00:19:51,800 Speaker 5: just all over the place. Ding found that within the 395 00:19:51,840 --> 00:19:55,639 Speaker 5: dot edu world, so like websites related to education, the 396 00:19:55,720 --> 00:19:59,640 Speaker 5: Lenna image continues to appear in homework questions, class slides, 397 00:19:59,680 --> 00:20:03,120 Speaker 5: and to be hosted on educational and research sites, ensuring 398 00:20:03,200 --> 00:20:05,880 Speaker 5: that it will be passed down to new generations of engineers. 399 00:20:06,000 --> 00:20:10,600 Speaker 5: So this became so popular that Lenna herself is often 400 00:20:10,680 --> 00:20:12,640 Speaker 5: called the first Lady of the Internet. 401 00:20:13,040 --> 00:20:13,320 Speaker 3: Wow. 402 00:20:13,600 --> 00:20:16,760 Speaker 4: Yeah, I kind of like her taking that picture having 403 00:20:16,800 --> 00:20:20,200 Speaker 4: no idea that this is what would happen, which, yeah, 404 00:20:20,240 --> 00:20:22,000 Speaker 4: I mean, I guess that speaks to the next question, 405 00:20:22,080 --> 00:20:25,480 Speaker 4: why did this image take off the way that it did? 406 00:20:25,840 --> 00:20:27,119 Speaker 2: Well, if you asked David C.
407 00:20:27,240 --> 00:20:29,679 Speaker 5: Munson, who is the editor in chief of the I 408 00:20:30,000 --> 00:20:35,480 Speaker 5: Triple E Transactions on Image Processing, he 409 00:20:35,560 --> 00:20:37,520 Speaker 5: said that the image happened to meet all of these 410 00:20:37,560 --> 00:20:40,640 Speaker 5: requirements for a good test image because of its detail, 411 00:20:40,720 --> 00:20:45,760 Speaker 5: its flat regions, shading and texture. But even he will 412 00:20:45,800 --> 00:20:47,800 Speaker 5: not leave out the obvious fact that it's also a 413 00:20:47,840 --> 00:20:50,920 Speaker 5: picture of like a seductive, sexy young woman. 414 00:20:51,040 --> 00:20:54,040 Speaker 2: Duh right, Like that's definitely part of it. He says. 415 00:20:54,359 --> 00:20:57,119 Speaker 5: The Lenna image is a picture of an attractive woman. It 416 00:20:57,160 --> 00:20:59,440 Speaker 5: is not surprising to me that the mostly male image 417 00:20:59,480 --> 00:21:03,879 Speaker 5: processing research community gravitated toward an image that they found attractive, 418 00:21:03,920 --> 00:21:09,000 Speaker 5: and so I do think there's something about these highly 419 00:21:09,119 --> 00:21:12,200 Speaker 5: male dominated spaces where it's not just that there's a 420 00:21:12,200 --> 00:21:16,120 Speaker 5: lot of men, it's that their worldviews, their interests, 421 00:21:16,400 --> 00:21:20,159 Speaker 5: their perspectives, their biases are really taking up a 422 00:21:20,160 --> 00:21:23,080 Speaker 5: lot of space in these spaces.
I just 423 00:21:23,080 --> 00:21:26,439 Speaker 5: think that men feel like these spaces are theirs, and 424 00:21:26,520 --> 00:21:29,520 Speaker 5: that they are free to decorate those spaces with the 425 00:21:29,560 --> 00:21:31,840 Speaker 5: pretty women that they feel like they should 426 00:21:31,840 --> 00:21:35,560 Speaker 5: be able to use without their consent or compensation. I 427 00:21:35,640 --> 00:21:38,000 Speaker 5: just think that, like Annie, you mentioned earlier that the 428 00:21:38,040 --> 00:21:40,800 Speaker 5: other test images are these things that you consume, right, 429 00:21:40,920 --> 00:21:44,760 Speaker 5: like peppers or jellybeans. There's another famous one of a 430 00:21:44,800 --> 00:21:48,280 Speaker 5: baboon that like has different colors on its face. It's 431 00:21:48,320 --> 00:21:50,840 Speaker 5: interesting to me that it's these things that are not human, 432 00:21:51,040 --> 00:21:54,479 Speaker 5: things that are like animal or that you consume, so that 433 00:21:54,600 --> 00:21:57,520 Speaker 5: throwing a sexy young woman into that mix maybe 434 00:21:57,560 --> 00:21:59,960 Speaker 5: didn't seem like a huge departure for these guys, I don't think. 435 00:22:00,320 --> 00:22:04,359 Speaker 4: Yeah, and again, when we think about things in the 436 00:22:04,359 --> 00:22:08,000 Speaker 4: realm of AI or even, I know I've complained about 437 00:22:08,000 --> 00:22:11,560 Speaker 4: this many times, but in the worlds of fandom or gaming. 438 00:22:11,800 --> 00:22:12,360 Speaker 3: It's like that. 439 00:22:12,400 --> 00:22:14,159 Speaker 4: It's like, you can come into our world on our 440 00:22:14,240 --> 00:22:15,720 Speaker 4: terms and you wear what we want you to wear. 441 00:22:16,000 --> 00:22:18,439 Speaker 4: You are here because we let you be here in 442 00:22:18,480 --> 00:22:22,280 Speaker 4: this male dominated space. But you're gonna do what we want.
443 00:22:22,560 --> 00:22:25,000 Speaker 4: It's not up to you, and that's the only way 444 00:22:25,040 --> 00:22:30,320 Speaker 4: that you can be in this world. But that being said, 445 00:22:30,920 --> 00:22:37,400 Speaker 4: there has been some pushback recently ish right, Bridget. 446 00:22:37,400 --> 00:22:39,760 Speaker 2: Yeah, So one thing about what you just said. 447 00:22:40,359 --> 00:22:42,720 Speaker 5: When I was researching for this episode, some of the 448 00:22:42,760 --> 00:22:47,960 Speaker 5: different engineers who had contributed to this image's popularity, they 449 00:22:48,000 --> 00:22:51,480 Speaker 5: were quoted, when they actually met the actual real Lenna 450 00:22:51,560 --> 00:22:53,920 Speaker 5: at a conference that she was invited to, as saying, 451 00:22:54,400 --> 00:22:56,600 Speaker 5: I can't believe she's a real person. A part of 452 00:22:56,600 --> 00:22:57,919 Speaker 5: me was like, you didn't even see her as a 453 00:22:57,920 --> 00:23:00,359 Speaker 5: real human. They just saw her as an 454 00:23:00,359 --> 00:23:03,320 Speaker 5: image in a picture that they had been 455 00:23:03,320 --> 00:23:06,880 Speaker 5: consuming for decades, and they had so removed her from 456 00:23:06,920 --> 00:23:10,480 Speaker 5: being a real, breathing human that meeting her in real 457 00:23:10,520 --> 00:23:13,280 Speaker 5: life was like they were surprised that she was real. 458 00:23:13,359 --> 00:23:15,199 Speaker 5: And I think that really speaks to the sort of 459 00:23:15,200 --> 00:23:17,320 Speaker 5: fandom element that you were talking about, This idea that 460 00:23:17,359 --> 00:23:22,280 Speaker 5: like you can come in if you are a fantasy and 461 00:23:22,400 --> 00:23:24,680 Speaker 5: in some ways not even a real human. 462 00:23:25,000 --> 00:23:28,640 Speaker 2: You know what I'm saying. You're like, do I.
463 00:23:28,640 --> 00:23:33,359 Speaker 4: Ever, Yeah, like, don't say anything that I don't like, Like, 464 00:23:33,480 --> 00:23:35,080 Speaker 4: keep quiet and look the way I like. 465 00:23:35,359 --> 00:23:38,640 Speaker 3: Then you can be here. But oh you're a real person. 466 00:23:39,800 --> 00:23:43,679 Speaker 3: Oh no, I don't want you here at all. Yeah. 467 00:23:43,760 --> 00:23:46,600 Speaker 5: So you're exactly right, Annie. All of this happened, but 468 00:23:46,640 --> 00:23:50,280 Speaker 5: it was not without pushback. Around like the twenty tens, 469 00:23:51,080 --> 00:23:54,919 Speaker 5: people started publicly asking whether or not this image of 470 00:23:54,960 --> 00:23:59,600 Speaker 5: a woman from Playboy should be so foundational to technology, 471 00:24:00,080 --> 00:24:03,400 Speaker 5: especially in education settings, you know, given conversations about 472 00:24:03,400 --> 00:24:05,880 Speaker 5: the need for more women in these spaces and how 473 00:24:05,880 --> 00:24:07,879 Speaker 5: to make these spaces more inclusive and more diverse. 474 00:24:08,240 --> 00:24:09,640 Speaker 2: That's really around when you start 475 00:24:09,440 --> 00:24:12,200 Speaker 5: hearing like people in public being like, wait a minute, 476 00:24:12,240 --> 00:24:15,480 Speaker 5: maybe this isn't so cool. In twenty fifteen, Maddie Zug, 477 00:24:15,520 --> 00:24:17,680 Speaker 5: who was then a student at the Thomas Jefferson High 478 00:24:17,680 --> 00:24:20,000 Speaker 5: School for Science and Technology right here in DC where 479 00:24:20,119 --> 00:24:22,359 Speaker 5: I live, who I should say now is a product safety 480 00:24:22,359 --> 00:24:25,359 Speaker 5: engineer at Apple who focuses on preventing tech-enabled 481 00:24:25,400 --> 00:24:27,840 Speaker 5: abuse and stalking and harassment on Apple platforms. 482 00:24:27,840 --> 00:24:29,000 Speaker 2: So like go Maddie.
483 00:24:29,000 --> 00:24:31,320 Speaker 5: Maddie sounds like she was cool in high school and 484 00:24:31,440 --> 00:24:35,160 Speaker 5: is cool now. So Maddie wrote this op ed basically 485 00:24:35,200 --> 00:24:38,080 Speaker 5: asking the question of like, should I, as a high 486 00:24:38,080 --> 00:24:42,920 Speaker 5: school student at a STEM high school, be given an 487 00:24:42,960 --> 00:24:46,480 Speaker 5: image from Playboy as part of my education in technology 488 00:24:46,520 --> 00:24:49,240 Speaker 5: and STEM? She writes, I first saw a picture of 489 00:24:49,280 --> 00:24:52,120 Speaker 5: Playboy magazine's Miss November nineteen seventy two 490 00:24:52,480 --> 00:24:53,080 Speaker 2: a year ago. 491 00:24:53,119 --> 00:24:56,640 Speaker 5: As a junior at TJ, my artificial intelligence teacher told 492 00:24:56,680 --> 00:24:59,679 Speaker 5: our class to search Google for Lena Soderberg, not the 493 00:24:59,720 --> 00:25:02,520 Speaker 5: full image, though, and use her picture to test our 494 00:25:02,600 --> 00:25:06,000 Speaker 5: latest coding assignment. At the time, I was sixteen and 495 00:25:06,040 --> 00:25:08,680 Speaker 5: struggling to believe that I belonged in a male dominated 496 00:25:08,680 --> 00:25:09,880 Speaker 5: computer science class. 497 00:25:10,119 --> 00:25:12,400 Speaker 2: I tried to tune out the boys' sexual comments. 498 00:25:12,760 --> 00:25:15,959 Speaker 5: Why is an advanced science, technology, engineering, and mathematics school 499 00:25:16,240 --> 00:25:19,760 Speaker 5: using a Playboy centerfold in its classrooms? Her piece ends 500 00:25:19,840 --> 00:25:22,520 Speaker 5: with saying it's time for TJ to say hello to 501 00:25:22,600 --> 00:25:26,000 Speaker 5: an inclusive computer science education and say goodbye to Lenna.
502 00:25:26,400 --> 00:25:29,280 Speaker 5: So Maddie was not the only person who was like, 503 00:25:29,600 --> 00:25:31,800 Speaker 5: maybe this image shouldn't be the thing that all of 504 00:25:31,840 --> 00:25:34,760 Speaker 5: our education is centered around. In that piece for Wired 505 00:25:34,800 --> 00:25:37,199 Speaker 5: I mentioned, they talked to several women in technology who 506 00:25:37,200 --> 00:25:38,360 Speaker 5: had very similar stories. 507 00:25:38,600 --> 00:25:39,879 Speaker 2: This one is actually pretty funny. 508 00:25:40,119 --> 00:25:43,760 Speaker 5: Deanna Needell, a math professor at UCLA, had similar memories 509 00:25:43,800 --> 00:25:46,480 Speaker 5: from college. So in twenty thirteen, she and a colleague 510 00:25:46,480 --> 00:25:49,800 Speaker 5: staged a quiet protest. They acquired the rights to a 511 00:25:49,840 --> 00:25:53,960 Speaker 5: headshot of the male model Fabio Lanzoni and used that 512 00:25:54,160 --> 00:25:57,320 Speaker 5: for their imaging research. So they kind of like turned 513 00:25:57,320 --> 00:25:59,600 Speaker 5: it around, like, oh, you're going to use a sexy woman, 514 00:25:59,680 --> 00:26:01,040 Speaker 5: well, we'll use a sexy man. 515 00:26:01,040 --> 00:26:01,840 Speaker 2: What do you think about that? 516 00:26:03,840 --> 00:26:04,440 Speaker 3: I love it. 517 00:26:05,320 --> 00:26:08,840 Speaker 5: So in that piece they actually track down and speak 518 00:26:08,920 --> 00:26:12,439 Speaker 5: to the real Lenna, who also called for her image 519 00:26:12,480 --> 00:26:15,119 Speaker 5: to be retired. She says, I retired from modeling a 520 00:26:15,160 --> 00:26:18,400 Speaker 5: long time ago. It is time I retired from tech too. 521 00:26:18,480 --> 00:26:20,520 Speaker 5: We can make a simple change today that creates a 522 00:26:20,600 --> 00:26:25,040 Speaker 5: lasting change for tomorrow. Let's commit to losing me.
And 523 00:26:25,160 --> 00:26:27,800 Speaker 5: there's actually some news on that front, because as of 524 00:26:27,920 --> 00:26:31,520 Speaker 5: April first of this year, the I Triple E officially 525 00:26:31,560 --> 00:26:34,280 Speaker 5: retired the use of the Lenna image and announced they 526 00:26:34,280 --> 00:26:36,840 Speaker 5: will no longer be using that image in their publications. 527 00:26:37,320 --> 00:26:39,159 Speaker 5: Ars Technica points out that this is kind of a 528 00:26:39,200 --> 00:26:41,760 Speaker 5: really big deal that will likely have a ripple effect 529 00:26:41,840 --> 00:26:42,320 Speaker 5: in the space. 530 00:26:42,520 --> 00:26:43,720 Speaker 2: Because the journal 531 00:26:43,400 --> 00:26:47,720 Speaker 5: has been so historically important for computer imaging development, it'll 532 00:26:47,760 --> 00:26:51,400 Speaker 5: likely set a precedent, removing this image from common use. 533 00:26:51,680 --> 00:26:53,680 Speaker 5: In an email, a spokesperson for the I Triple E 534 00:26:54,080 --> 00:26:57,840 Speaker 5: recommended wider sensitivity about the issue, writing, In order to 535 00:26:57,920 --> 00:27:00,840 Speaker 5: raise awareness of and increase author compliance with this 536 00:27:00,960 --> 00:27:04,280 Speaker 5: new policy, program committee members and reviewers should look for 537 00:27:04,320 --> 00:27:07,360 Speaker 5: inclusion of this image, and if present, should ask authors 538 00:27:07,359 --> 00:27:09,760 Speaker 5: to replace the Lenna image with an alternative. 539 00:27:10,440 --> 00:27:10,760 Speaker 3: Yeah. 540 00:27:11,160 --> 00:27:14,280 Speaker 4: I love that from Lenna herself, like, let's commit to 541 00:27:14,320 --> 00:27:18,400 Speaker 4: losing me. That's such a great line.
But it does 542 00:27:18,440 --> 00:27:22,760 Speaker 4: speak volumes, as you've been saying, Bridget, 543 00:27:23,119 --> 00:27:27,119 Speaker 4: to our attitude towards women on the internet and towards 544 00:27:27,119 --> 00:27:32,880 Speaker 4: consent on the internet. And so when we're thinking about this, 545 00:27:33,720 --> 00:27:39,520 Speaker 4: which was foundational, what do you think about the legacy 546 00:27:39,040 --> 00:27:39,959 Speaker 3: of this image? 547 00:27:40,600 --> 00:27:43,280 Speaker 5: Yeah, I love that question. You know, when I was 548 00:27:43,400 --> 00:27:46,359 Speaker 5: reading about how this image came to be, I was imagining 549 00:27:46,640 --> 00:27:50,200 Speaker 5: a very different time, right, It's the seventies. People aren't 550 00:27:50,280 --> 00:27:54,360 Speaker 5: necessarily having a lot of public, loud conversations about the 551 00:27:54,400 --> 00:27:56,879 Speaker 5: power dynamics of who's in the room and who's not 552 00:27:56,960 --> 00:27:59,159 Speaker 5: in the room where a lot of this technology is 553 00:27:59,160 --> 00:28:02,440 Speaker 5: getting built. And it really made me think of, like, Wow, 554 00:28:02,480 --> 00:28:05,000 Speaker 5: the seventies, that probably was such a different time. But 555 00:28:05,160 --> 00:28:08,520 Speaker 5: here in twenty twenty four, we are having those conversations, 556 00:28:08,760 --> 00:28:12,919 Speaker 5: loud voices are publicly having those conversations. There are women 557 00:28:13,080 --> 00:28:15,679 Speaker 5: and people of color, and trans folks and queer folks 558 00:28:15,680 --> 00:28:18,520 Speaker 5: and all kinds of folks who are building and making 559 00:28:18,520 --> 00:28:21,240 Speaker 5: the technology that shapes our world today.
And so in 560 00:28:21,240 --> 00:28:25,280 Speaker 5: twenty twenty four, it almost feels like we are pretending 561 00:28:25,400 --> 00:28:29,160 Speaker 5: that we're still in this nineteen seventies, "we didn't really 562 00:28:29,280 --> 00:28:32,600 Speaker 5: know, who could have foreseen it" world, when in fact 563 00:28:33,200 --> 00:28:35,880 Speaker 5: we're not really in that world. People are asking the questions, 564 00:28:35,960 --> 00:28:38,200 Speaker 5: people are raising the alarm. And I guess I don't 565 00:28:38,200 --> 00:28:42,760 Speaker 5: think it should be several decades after AI technology becomes 566 00:28:42,840 --> 00:28:46,000 Speaker 5: ubiquitous for people to start asking the question about how 567 00:28:46,720 --> 00:28:50,600 Speaker 5: traditionally marginalized people like women are being used and represented 568 00:28:50,640 --> 00:28:54,320 Speaker 5: and perhaps exploited without their consent in these spaces. I 569 00:28:54,320 --> 00:28:57,080 Speaker 5: think it provides a really interesting precedent for what's going 570 00:28:57,120 --> 00:28:58,320 Speaker 5: on here in twenty twenty four. 571 00:28:58,680 --> 00:29:01,080 Speaker 2: And Jennifer put it really well. 572 00:29:01,080 --> 00:29:03,680 Speaker 5: She writes, to me, the crux of the Lenna story 573 00:29:03,760 --> 00:29:06,560 Speaker 5: is how little power we have over our data and 574 00:29:06,640 --> 00:29:10,400 Speaker 5: how it is used and abused. That threat seems disproportionately 575 00:29:10,440 --> 00:29:13,960 Speaker 5: higher for women, who are overrepresented in Internet content but 576 00:29:14,200 --> 00:29:19,040 Speaker 5: underrepresented in Internet company leadership and decision making.
Given this reality, 577 00:29:19,200 --> 00:29:23,640 Speaker 5: engineering and product decisions will continue to consciously and unconsciously 578 00:29:24,000 --> 00:29:27,320 Speaker 5: exclude our needs and concerns. Right, And so I really 579 00:29:27,360 --> 00:29:29,920 Speaker 5: agree with that, that this Lenna story really is a 580 00:29:29,960 --> 00:29:34,680 Speaker 5: story about power dynamics and who is represented in technology 581 00:29:34,720 --> 00:29:36,840 Speaker 5: and who just sort of has their needs 582 00:29:36,840 --> 00:29:38,120 Speaker 5: exploited or erased. 583 00:29:38,200 --> 00:29:38,680 Speaker 2: Right, Like. 584 00:29:40,440 --> 00:29:44,080 Speaker 5: Men wanting to consume the bodies of women is like 585 00:29:44,720 --> 00:29:47,640 Speaker 5: foundational to the Internet. It's like why we have the 586 00:29:47,680 --> 00:29:49,680 Speaker 5: Internet the way that we have it. And I think 587 00:29:49,880 --> 00:29:53,960 Speaker 5: we know that now. It's like an objective fact about 588 00:29:54,000 --> 00:29:56,920 Speaker 5: the Internet and technology. I don't think we can keep 589 00:29:57,440 --> 00:30:01,360 Speaker 5: making technology that is not honest about that, because if we're 590 00:30:01,400 --> 00:30:03,720 Speaker 5: not being honest about that, we can never fix that, 591 00:30:03,760 --> 00:30:05,720 Speaker 5: we can never question that, we can never have that 592 00:30:05,840 --> 00:30:09,560 Speaker 5: be a dynamic that we stop perpetuating with technology. 593 00:30:09,960 --> 00:30:12,160 Speaker 4: Yeah, and I think it's like going back to the 594 00:30:12,160 --> 00:30:16,160 Speaker 4: point about being in a classroom setting and being shown 595 00:30:16,240 --> 00:30:19,560 Speaker 4: explicitly like, this is how women are viewed in the space.
596 00:30:19,640 --> 00:30:21,920 Speaker 4: The fact that this is what built a lot of what we use 597 00:30:21,960 --> 00:30:25,600 Speaker 4: today and we're still talking about it is telling in itself, 598 00:30:25,640 --> 00:30:29,560 Speaker 4: and especially when we're seeing that perpetuate in all of 599 00:30:29,600 --> 00:30:32,959 Speaker 4: these tech spaces, where it still feels in a lot 600 00:30:32,960 --> 00:30:35,240 Speaker 4: of ways, even though women and marginalized people have built 601 00:30:35,240 --> 00:30:39,760 Speaker 4: those spaces, that like, you're the guest here, and you're 602 00:30:39,880 --> 00:30:44,000 Speaker 4: only here because we're opening our gates a little bit 603 00:30:44,800 --> 00:30:48,760 Speaker 4: to let you in, but otherwise, yes, get out. 604 00:30:49,400 --> 00:30:51,320 Speaker 5: And I just think that's a dynamic we need to 605 00:30:51,320 --> 00:30:53,880 Speaker 5: be questioning in twenty twenty four. And I think so 606 00:30:53,960 --> 00:30:57,720 Speaker 5: like something about the use of this image, its ubiquity 607 00:30:57,760 --> 00:31:02,000 Speaker 5: in education spaces, I find so telling. But also even 608 00:31:02,040 --> 00:31:04,480 Speaker 5: if you're not studying to be an engineer or something, 609 00:31:04,840 --> 00:31:06,920 Speaker 5: I think there is a dynamic that says that if 610 00:31:06,960 --> 00:31:10,600 Speaker 5: you are a person who is traditionally marginalized, you're not 611 00:31:10,640 --> 00:31:13,160 Speaker 5: a decision maker, you're not a power holder, you're not 612 00:31:13,360 --> 00:31:16,200 Speaker 5: doing or making anything that anybody needs to care about, 613 00:31:16,200 --> 00:31:20,120 Speaker 5: and the entire dynamic is that we use you. In fact, 614 00:31:20,520 --> 00:31:22,920 Speaker 5: Ding actually points this out in her piece.
She says, 615 00:31:23,160 --> 00:31:26,440 Speaker 5: while social norms are changing toward non consensual data collection 616 00:31:26,480 --> 00:31:29,560 Speaker 5: and data exploitation, digital norms seem to be moving in 617 00:31:29,600 --> 00:31:33,120 Speaker 5: the opposite direction. Advancements in machine learning algorithms and data 618 00:31:33,120 --> 00:31:37,160 Speaker 5: storage capabilities are only making data misuse easier, whether the 619 00:31:37,200 --> 00:31:41,720 Speaker 5: outcome is revenge porn or targeted ads, surveillance, or discriminatory AI. 620 00:31:42,080 --> 00:31:44,480 Speaker 5: If we want a world where our data can retire 621 00:31:44,560 --> 00:31:47,400 Speaker 5: when it's outlived its time or when it's directly harming 622 00:31:47,400 --> 00:31:50,160 Speaker 5: our lives, we must create the tools and policies that 623 00:31:50,200 --> 00:31:53,240 Speaker 5: empower data subjects to have a say in what happens 624 00:31:53,320 --> 00:31:56,160 Speaker 5: to their data, including allowing their data to die. And 625 00:31:56,200 --> 00:31:58,400 Speaker 5: so I think, even if you're not somebody who is 626 00:31:58,440 --> 00:32:01,600 Speaker 5: a techie, this does concern you, this dynamic that just 627 00:32:01,600 --> 00:32:05,920 Speaker 5: says we consume, we exploit, we make money from you, 628 00:32:06,160 --> 00:32:07,600 Speaker 5: and you don't get to have a say about it. 629 00:32:07,720 --> 00:32:08,880 Speaker 2: That's the dynamic that I think this 630 00:32:08,960 --> 00:32:13,680 Speaker 5: Lenna image really did usher in without really even necessarily 631 00:32:13,800 --> 00:32:14,200 Speaker 5: meaning to. 632 00:32:24,480 --> 00:32:27,280 Speaker 6: I think there's a big conversation here on like the 633 00:32:27,320 --> 00:32:31,719 Speaker 6: power of capitalism within the tech industry and what makes money.
634 00:32:32,320 --> 00:32:35,280 Speaker 6: I can't help but think, like with the Lenna image, 635 00:32:35,440 --> 00:32:38,480 Speaker 6: the fact that this toxicity was used to make more 636 00:32:38,560 --> 00:32:42,560 Speaker 6: profit and more power within this industry. It took forty 637 00:32:42,560 --> 00:32:44,760 Speaker 6: to fifty years for there to even be a conversation 638 00:32:44,840 --> 00:32:47,800 Speaker 6: about like, let's change it, let's retire it. But the 639 00:32:47,840 --> 00:32:50,320 Speaker 6: fact that it had that much pushback because they didn't 640 00:32:50,360 --> 00:32:52,880 Speaker 6: care enough and they wanted to build on this toxicity 641 00:32:52,920 --> 00:32:55,160 Speaker 6: because they knew it could make money is the most 642 00:32:55,200 --> 00:32:57,920 Speaker 6: concerning thing to me. And then the powers 643 00:32:57,960 --> 00:33:00,000 Speaker 6: that be are saying that, yeah, yeah, we're definitely gonna 644 00:33:00,080 --> 00:33:02,080 Speaker 6: control this, and then just go after an app 645 00:33:02,400 --> 00:33:04,640 Speaker 6: instead of the root of the problem. That seems like 646 00:33:04,680 --> 00:33:07,560 Speaker 6: the biggest part of the conversation, because even in the 647 00:33:07,600 --> 00:33:11,280 Speaker 6: AI world, with new apps coming through, new programs coming through, 648 00:33:11,360 --> 00:33:14,360 Speaker 6: and they're all competing with each other, they don't want 649 00:33:14,400 --> 00:33:17,000 Speaker 6: to let go of the toxicity. But that's what's making 650 00:33:17,040 --> 00:33:18,960 Speaker 6: the money, which is really really concerning. 651 00:33:19,840 --> 00:33:22,200 Speaker 5: Yeah, and I mean, like, if there was one sort 652 00:33:22,400 --> 00:33:25,200 Speaker 5: of reason why I wanted to have this conversation, Sam, 653 00:33:25,240 --> 00:33:28,200 Speaker 5: that is exactly it, that it is about money.
It 654 00:33:28,280 --> 00:33:31,520 Speaker 5: is about capitalism. It is about making money off of 655 00:33:31,560 --> 00:33:35,280 Speaker 5: people's own exploitation and selling that exploitation back to them 656 00:33:35,480 --> 00:33:40,080 Speaker 5: to make more money. And it's just a really toxic dynamic. 657 00:33:40,360 --> 00:33:44,160 Speaker 5: That I believe is harming us and making the people 658 00:33:44,840 --> 00:33:48,560 Speaker 5: who have created that dynamic rich all the while they 659 00:33:48,640 --> 00:33:50,000 Speaker 5: get to be like, Oh, it's not a big deal 660 00:33:50,000 --> 00:33:50,239 Speaker 5: for you. 661 00:33:50,280 --> 00:33:51,680 Speaker 2: Actually, this is going to be really good for you. 662 00:33:51,720 --> 00:33:52,920 Speaker 2: This is going to be convenient for you. 663 00:33:53,320 --> 00:33:56,400 Speaker 5: And I don't know, Like I woke up this morning 664 00:33:56,560 --> 00:33:58,520 Speaker 5: when I was trying to decide, like what I wanted 665 00:33:58,560 --> 00:34:01,040 Speaker 5: to talk to you all today about, and one of 666 00:34:01,040 --> 00:34:04,320 Speaker 5: the ideas that I had that I scrapped was just 667 00:34:04,360 --> 00:34:07,560 Speaker 5: this feeling that being on the internet just doesn't feel 668 00:34:07,600 --> 00:34:10,480 Speaker 5: fun anymore. Anytime I go on a website, anytime I 669 00:34:10,560 --> 00:34:13,880 Speaker 5: google something just to find out information, it feels like 670 00:34:13,920 --> 00:34:14,600 Speaker 5: a scam. 671 00:34:14,680 --> 00:34:16,040 Speaker 2: It feels like exploitation. 672 00:34:16,239 --> 00:34:18,520 Speaker 5: I feel like I am one click away from somebody 673 00:34:18,520 --> 00:34:23,000 Speaker 5: getting my social Security number. It feels like AI generated garbage. 674 00:34:23,400 --> 00:34:26,960 Speaker 5: And I just think we have hit the wall of 675 00:34:27,000 --> 00:34:29,680 Speaker 5: that feeling. 
I can't imagine that I'm alone in this. 676 00:34:30,239 --> 00:34:32,800 Speaker 5: I think the feeling of showing up online today 677 00:34:32,800 --> 00:34:37,160 Speaker 5: in twenty twenty four feels exhausting, and I think part 678 00:34:37,200 --> 00:34:39,359 Speaker 5: of it is because it feels like we are being 679 00:34:39,440 --> 00:34:42,640 Speaker 5: bled dry by people that we have already made rich 680 00:34:43,000 --> 00:34:44,520 Speaker 5: from our own exploitation. 681 00:34:44,680 --> 00:34:45,759 Speaker 2: Do y'all ever feel that way? 682 00:34:46,000 --> 00:34:49,719 Speaker 6: Oh absolutely. I think with, because getting on TikTok, the 683 00:34:49,719 --> 00:34:52,239 Speaker 6: first opening video, I'm sure you've seen it, is that 684 00:34:52,360 --> 00:34:54,840 Speaker 6: content manager who is like, I'm here for the safety 685 00:34:54,920 --> 00:34:55,440 Speaker 6: of TikTok. 686 00:34:55,480 --> 00:34:56,000 Speaker 1: Have you seen this? 687 00:34:56,440 --> 00:34:56,880 Speaker 2: I have not. 688 00:34:57,400 --> 00:34:58,200 Speaker 1: She's been there. 689 00:34:58,360 --> 00:35:00,560 Speaker 6: She is for safety in some, like, she has a 690 00:35:00,640 --> 00:35:05,560 Speaker 6: very specific title. Yes, Susie someone. She is very white 691 00:35:05,640 --> 00:35:07,759 Speaker 6: and she's very redheaded. I was like, okay, so 692 00:35:07,840 --> 00:35:09,600 Speaker 6: we've played into the xenophobia. She's like, look, I'm 693 00:35:09,600 --> 00:35:11,600 Speaker 6: a white person. I'm gonna help you out here. 694 00:35:11,600 --> 00:35:12,640 Speaker 2: Don't worry. Don't worry. 695 00:35:12,880 --> 00:35:13,440 Speaker 1: I got this.
696 00:35:14,440 --> 00:35:16,359 Speaker 6: But that's the first thing that I'm seeing, so like, 697 00:35:17,080 --> 00:35:20,200 Speaker 6: you know, urging TikTok users to talk to the government 698 00:35:20,200 --> 00:35:22,040 Speaker 6: because they voted this in and this is real bad 699 00:35:22,040 --> 00:35:23,840 Speaker 6: and all this and whatnot. And I'm just like, 700 00:35:23,840 --> 00:35:28,000 Speaker 6: all right, it's gonna go away next. This is now 701 00:35:28,040 --> 00:35:31,000 Speaker 6: my attitude, because also I'm very tired. But also I 702 00:35:31,120 --> 00:35:33,719 Speaker 6: just got an email saying that AT&T, yes, 703 00:35:33,800 --> 00:35:36,360 Speaker 6: had a breach. They have 704 00:35:36,400 --> 00:35:38,480 Speaker 6: your stuff. But good news, since you don't have a 705 00:35:38,520 --> 00:35:40,480 Speaker 6: bill with us, they didn't 706 00:35:40,520 --> 00:35:44,720 Speaker 6: get any pertinent information. But I literally think every month 707 00:35:44,920 --> 00:35:47,800 Speaker 6: I have been seeing an email that says some 708 00:35:47,840 --> 00:35:51,080 Speaker 6: of my information has been breached, and it's nothing 709 00:35:51,080 --> 00:35:54,280 Speaker 6: that I have done. It is literally everything from my insurance, 710 00:35:54,520 --> 00:35:59,880 Speaker 6: my dental insurance, my healthcare provider, my internet, which I'm like, 711 00:36:00,040 --> 00:36:04,279 Speaker 6: what the hell, my phone subscription, my cell phone, which 712 00:36:04,280 --> 00:36:05,680 Speaker 6: I'm like, I'm starting to get back to that. 713 00:36:05,719 --> 00:36:06,960 Speaker 1: I think I want a landline. 714 00:36:07,160 --> 00:36:09,680 Speaker 6: At this moment, y'all, with each of those things 715 00:36:09,719 --> 00:36:12,239 Speaker 6: popping up, I'm like, I hadn't.
I have 716 00:36:12,280 --> 00:36:14,960 Speaker 6: to use that information in order for me to have healthcare. 717 00:36:15,320 --> 00:36:18,080 Speaker 6: So y'all, let my healthcare information go out and they 718 00:36:18,120 --> 00:36:19,400 Speaker 6: have my social Security number. 719 00:36:19,719 --> 00:36:21,000 Speaker 1: There's nothing I can do about that. 720 00:36:21,040 --> 00:36:24,319 Speaker 6: As many times as I can change my password, the 721 00:36:24,360 --> 00:36:27,480 Speaker 6: next email I'm getting is telling me that I've got 722 00:36:27,480 --> 00:36:29,160 Speaker 6: a data breach of my information. 723 00:36:29,640 --> 00:36:30,920 Speaker 1: So what is the point. 724 00:36:31,120 --> 00:36:34,719 Speaker 6: Like, at this point, the only way is to rewrite 725 00:36:34,760 --> 00:36:38,439 Speaker 6: my identity and to never get online again, which would 726 00:36:38,440 --> 00:36:39,560 Speaker 6: be really hard for my job. 727 00:36:40,000 --> 00:36:44,640 Speaker 5: Yes, Like, if you have a phone in your life, 728 00:36:44,760 --> 00:36:48,640 Speaker 5: if you vote, if you drive, these things that we 729 00:36:48,680 --> 00:36:52,800 Speaker 5: are required to do to participate in public life should 730 00:36:52,840 --> 00:36:55,560 Speaker 5: not just be avenues for somebody to make money and 731 00:36:55,600 --> 00:36:56,160 Speaker 5: scam us. 732 00:36:56,320 --> 00:36:58,040 Speaker 2: But yet it feels that way. 733 00:36:58,080 --> 00:37:00,640 Speaker 5: And you know what, Sam, I have actually not seen 734 00:37:00,840 --> 00:37:03,000 Speaker 5: the TikTok that you're referring to, because I have not 735 00:37:03,120 --> 00:37:06,160 Speaker 5: opened my TikTok app in days because it's starting to 736 00:37:06,200 --> 00:37:09,440 Speaker 5: feel like QVC and I cannot take it anymore. 737 00:37:09,480 --> 00:37:11,160 Speaker 2: Like whatever happened. 
738 00:37:10,840 --> 00:37:12,800 Speaker 5: To spaces on the Internet that were supposed to feel 739 00:37:12,880 --> 00:37:16,680 Speaker 5: like safety or exploration or fun or community or connection. 740 00:37:17,280 --> 00:37:19,719 Speaker 5: I hope that somebody out there listening is like, 741 00:37:20,080 --> 00:37:22,640 Speaker 5: Bridget, you're old and unhip. We have those spaces, 742 00:37:22,680 --> 00:37:25,440 Speaker 5: they exist. Tell me about them. I want to 743 00:37:25,480 --> 00:37:27,480 Speaker 5: know about them. But I think that we should, we 744 00:37:27,560 --> 00:37:32,040 Speaker 5: really got to get back to, like, to those principles 745 00:37:32,239 --> 00:37:35,960 Speaker 5: of the Internet feeling like something other than being taken 746 00:37:36,040 --> 00:37:38,200 Speaker 5: for a ride on which you are the chump. 747 00:37:38,680 --> 00:37:40,840 Speaker 6: Right. And I will say a lot of people have 748 00:37:40,880 --> 00:37:43,600 Speaker 6: felt like Discord and Reddit have been like brought 749 00:37:43,640 --> 00:37:45,239 Speaker 6: in, but we already know Reddit has kind of 750 00:37:45,280 --> 00:37:48,040 Speaker 6: its problems. And then I think there's a new lawsuit 751 00:37:48,080 --> 00:37:51,040 Speaker 6: with Discord, with its problems and its terms of service 752 00:37:51,120 --> 00:37:51,799 Speaker 6: changing as well. 753 00:37:51,840 --> 00:37:53,640 Speaker 2: I'm like, what totally happened?
754 00:37:53,640 --> 00:37:58,560 Speaker 6: So there's literally no one protecting the individual. Like, 755 00:37:58,600 --> 00:38:01,319 Speaker 6: there's no protection for us at all, but they want 756 00:38:01,400 --> 00:38:04,439 Speaker 6: to, they want to take away things 757 00:38:04,440 --> 00:38:06,760 Speaker 6: from us, which is like the least of our worries, 758 00:38:07,200 --> 00:38:09,919 Speaker 6: or they're just like, sorry, you're SOL, like you can't 759 00:38:09,920 --> 00:38:10,680 Speaker 6: sue us. 760 00:38:11,040 --> 00:38:13,440 Speaker 5: Yeah, I think everybody is feeling that, but I think 761 00:38:13,520 --> 00:38:18,239 Speaker 5: it is particularly dangerous for people who are traditionally marginalized, 762 00:38:18,280 --> 00:38:22,040 Speaker 5: because, yeah, it's just the expectation 763 00:38:21,640 --> 00:38:24,719 Speaker 2: that, oh, it's totally fine. 764 00:38:24,920 --> 00:38:29,320 Speaker 5: People who make apps that nonconsensually undress women using AI, 765 00:38:30,080 --> 00:38:32,440 Speaker 5: why wouldn't they be able to advertise on Facebook or 766 00:38:32,440 --> 00:38:33,600 Speaker 5: Instagram or Twitter? 767 00:38:33,719 --> 00:38:35,360 Speaker 2: They got to make money. That's a business. 768 00:38:35,360 --> 00:38:39,480 Speaker 5: Like how easy it is to erase the human beings 769 00:38:39,680 --> 00:38:43,240 Speaker 5: at the heart of this dynamic, erase their concerns, erase 770 00:38:43,280 --> 00:38:46,319 Speaker 5: their needs, erase their harm, because men got to make 771 00:38:46,400 --> 00:38:47,000 Speaker 5: money off of it. 772 00:38:47,400 --> 00:38:50,759 Speaker 6: I'm sick of it. Right, or it's tradition, literally like, yeah, 773 00:38:50,760 --> 00:38:52,600 Speaker 6: this image has always been here, we need to teach 774 00:38:52,600 --> 00:38:54,400 Speaker 6: it as, it's historical 775 00:38:54,480 --> 00:38:54,640 Speaker 1: now.
776 00:38:54,680 --> 00:38:59,120 Speaker 6: It was definitely not exploiting somebody or taking advantage of somebody, 777 00:38:59,239 --> 00:39:02,440 Speaker 6: or using humiliating content, because 778 00:39:02,239 --> 00:39:03,560 Speaker 1: she wasn't humiliated, I don't think. 779 00:39:03,560 --> 00:39:05,800 Speaker 6: But like in the idea of it being forever 780 00:39:05,880 --> 00:39:10,240 Speaker 6: and ever and ever, of like your seductive picture being 781 00:39:10,320 --> 00:39:13,320 Speaker 6: used by people, which is a whole different conversation 782 00:39:13,400 --> 00:39:13,960 Speaker 6: in itself. 783 00:39:14,280 --> 00:39:16,759 Speaker 2: Yeah, I mean, so Lena, the real life Lena. 784 00:39:16,960 --> 00:39:19,600 Speaker 5: And again, there's a really interesting Wired article that has 785 00:39:19,640 --> 00:39:22,280 Speaker 5: an interview with her. She doesn't feel like she was exploited. 786 00:39:22,360 --> 00:39:24,400 Speaker 5: She's actually really proud of that image, even as she 787 00:39:24,440 --> 00:39:26,640 Speaker 5: recognizes that it's like time for it to be retired. 788 00:39:26,920 --> 00:39:29,480 Speaker 2: However, she does wish 789 00:39:29,280 --> 00:39:32,440 Speaker 5: that she had been fairly compensated for what would go 790 00:39:32,520 --> 00:39:35,640 Speaker 5: on to be her like nonconsensual contributions to tech. 791 00:39:35,680 --> 00:39:36,000 Speaker 5: When she 792 00:39:35,960 --> 00:39:36,840 Speaker 2: took that image,
793 00:39:36,920 --> 00:39:39,240 Speaker 5: there's no way that as a, you know, young Playboy 794 00:39:39,239 --> 00:39:41,680 Speaker 5: Playmate in nineteen seventy one or whatever, 795 00:39:42,080 --> 00:39:43,680 Speaker 5: you would have a sense of like, well, if this 796 00:39:43,760 --> 00:39:46,280 Speaker 5: goes on to make me the first lady 797 00:39:46,280 --> 00:39:50,120 Speaker 5: of the Internet, I better have compensation and protections. No way, right? 798 00:39:50,160 --> 00:39:53,279 Speaker 5: So in that Wired piece they say it makes sense 799 00:39:53,280 --> 00:39:55,720 Speaker 5: that she would feel this way. Unlike so many women 800 00:39:55,760 --> 00:39:59,120 Speaker 5: in tech, Lena has at least been acknowledged, even feted, 801 00:39:59,160 --> 00:40:01,440 Speaker 5: for her contribution. She did that work, and 802 00:40:01,480 --> 00:40:04,000 Speaker 5: people started using that photo in this neat new way, 803 00:40:04,239 --> 00:40:07,160 Speaker 5: and now she has this kind of immortality woven into 804 00:40:07,200 --> 00:40:09,879 Speaker 5: the design of the machine. This is from Marie Hicks, 805 00:40:09,920 --> 00:40:12,920 Speaker 5: a historian of technology and the author of Programmed Inequality. 806 00:40:13,480 --> 00:40:15,680 Speaker 5: All of this happened for a reason, Hicks writes. If 807 00:40:15,680 --> 00:40:18,520 Speaker 5: they hadn't used a Playboy centerfold, they almost certainly would 808 00:40:18,560 --> 00:40:20,799 Speaker 5: have used another picture of a pretty white woman. The 809 00:40:20,840 --> 00:40:23,560 Speaker 5: Playboy thing gets our attention, but really what it's about 810 00:40:23,680 --> 00:40:26,359 Speaker 5: is this world building that's gone on in computing from 811 00:40:26,400 --> 00:40:29,560 Speaker 5: the beginning. It's about building worlds for certain people and 812 00:40:29,600 --> 00:40:30,920 Speaker 5: not for others.
813 00:40:31,360 --> 00:40:33,360 Speaker 6: I find it interesting, dude, that they invited her to 814 00:40:33,440 --> 00:40:37,520 Speaker 6: the conference. Like, I'm wondering what the purpose was other 815 00:40:37,640 --> 00:40:40,879 Speaker 6: than to, like, because it obviously wasn't to ask 816 00:40:40,920 --> 00:40:44,720 Speaker 6: her questions about tech and how she did this thing, 817 00:40:44,880 --> 00:40:47,960 Speaker 6: because they did not even consider her human, as we know. 818 00:40:48,280 --> 00:40:51,440 Speaker 6: It was just literally to ogle her in real life. 819 00:40:51,719 --> 00:40:54,319 Speaker 2: Yeah, I was thinking about why they did that. 820 00:40:54,960 --> 00:40:57,080 Speaker 5: I don't know. Part of me wonders if it 821 00:40:57,160 --> 00:40:59,439 Speaker 5: was like an attempt to be like, oh, we need 822 00:40:59,480 --> 00:41:03,520 Speaker 5: to acknowledge the way that this woman's image was so 823 00:41:03,640 --> 00:41:07,960 Speaker 5: foundational to our technology, but then like not really doing it, 824 00:41:08,040 --> 00:41:10,040 Speaker 5: like still sort of treating her as like a booth 825 00:41:10,120 --> 00:41:11,040 Speaker 5: babe or something. 826 00:41:11,040 --> 00:41:14,520 Speaker 6: I don't know, right, I just find all of that interesting, 827 00:41:14,600 --> 00:41:18,719 Speaker 6: in this level of, again, of not what 828 00:41:18,800 --> 00:41:21,960 Speaker 6: she was doing this for. She came in with, like, 829 00:41:22,719 --> 00:41:25,640 Speaker 6: whatever her ambitions were in being this model and whatnot, 830 00:41:25,680 --> 00:41:27,160 Speaker 6: and then all of a sudden being told 831 00:41:27,000 --> 00:41:29,120 Speaker 1: you're being used
832 00:41:30,440 --> 00:41:34,360 Speaker 6: as an example for computers, like for test images for computers, 833 00:41:34,520 --> 00:41:37,799 Speaker 6: and not only will you see this, but your grandkids 834 00:41:37,840 --> 00:41:39,759 Speaker 6: will also, like if she has children, any of 835 00:41:39,760 --> 00:41:41,680 Speaker 6: those things, and your family members forever. 836 00:41:42,560 --> 00:41:45,120 Speaker 5: I'm like, whoo, who would have ever thought that that's 837 00:41:45,120 --> 00:41:46,920 Speaker 5: how that image would go on to be used in history? 838 00:41:47,000 --> 00:41:49,759 Speaker 5: And I really think like this is where we are today, 839 00:41:49,800 --> 00:41:51,680 Speaker 5: and this is like why I wanted to talk about 840 00:41:51,680 --> 00:41:55,839 Speaker 5: this, is that I think, like, the idea, the concept 841 00:41:56,000 --> 00:41:59,040 Speaker 5: of images being shared online, the way we understand that 842 00:41:59,080 --> 00:42:01,239 Speaker 5: in twenty twenty four, the fact that this image of 843 00:42:01,320 --> 00:42:06,080 Speaker 5: Lena became so foundational to that concept without her consent, 844 00:42:06,840 --> 00:42:09,520 Speaker 5: you know, perhaps without like proper compensation for the way 845 00:42:09,520 --> 00:42:12,160 Speaker 5: that she actually was foundational to that, and building out 846 00:42:12,160 --> 00:42:16,160 Speaker 5: this entire universe around it that is mostly like controlled 847 00:42:16,360 --> 00:42:20,960 Speaker 5: and protected and profited off of by men, and nobody stopping 848 00:42:21,000 --> 00:42:23,919 Speaker 5: to ask about the ramifications of that until decades later. 849 00:42:24,400 --> 00:42:28,240 Speaker 5: I just think it really establishes like a concerning precedent 850 00:42:28,320 --> 00:42:30,160 Speaker 5: for where we're going right now with AI in twenty 851 00:42:30,160 --> 00:42:32,640 Speaker 5: twenty four.
And it doesn't have to. We can learn 852 00:42:32,760 --> 00:42:34,880 Speaker 5: from what we did with that Lena image if we 853 00:42:34,960 --> 00:42:37,960 Speaker 5: ask the right questions, if we center the right perspectives 854 00:42:38,000 --> 00:42:40,120 Speaker 5: and the right voices. And so, yeah, I don't want 855 00:42:40,120 --> 00:42:43,560 Speaker 5: to wait until twenty forty to be like, oh, should 856 00:42:43,560 --> 00:42:45,680 Speaker 5: we have been talking about the ways that women and 857 00:42:45,719 --> 00:42:49,360 Speaker 5: girls and other marginalized people are being exploited and used 858 00:42:49,360 --> 00:42:52,080 Speaker 5: to make technology companies money? 859 00:42:52,600 --> 00:42:54,319 Speaker 2: I don't want to ask that question when it's too 860 00:42:54,280 --> 00:42:59,240 Speaker 6: late, right. And here's like the big conversation: shouldn't 861 00:42:59,280 --> 00:43:02,920 Speaker 6: we also say that big companies and big tech companies 862 00:43:03,000 --> 00:43:06,480 Speaker 6: that are developing this are purposely leaving out marginalized 863 00:43:06,520 --> 00:43:08,960 Speaker 6: people because they like the old ways and that it's 864 00:43:09,000 --> 00:43:11,080 Speaker 6: only making a certain amount of people money? 865 00:43:11,560 --> 00:43:12,719 Speaker 2: Yes, that's exactly it. 866 00:43:12,760 --> 00:43:14,759 Speaker 5: I think I would argue that's exactly what's going on. 867 00:43:14,840 --> 00:43:18,360 Speaker 5: I mean, in twenty twenty four, there are so many loud, 868 00:43:18,680 --> 00:43:22,399 Speaker 5: thoughtful voices from women and people of color who are 869 00:43:22,960 --> 00:43:26,319 Speaker 5: really talking about AI in some interesting and thoughtful ways. 870 00:43:26,320 --> 00:43:28,799 Speaker 5: So they exist, they are out there. This is the 871 00:43:28,840 --> 00:43:30,680 Speaker 5: tale as old as time when it comes to technology.
872 00:43:30,960 --> 00:43:33,040 Speaker 5: It is not that they are not there. It is 873 00:43:33,080 --> 00:43:39,040 Speaker 5: that they are being, whether intentionally or unintentionally, marginalized, sidelined, silenced, 874 00:43:39,400 --> 00:43:43,839 Speaker 5: pushed aside to make room for voices who are just 875 00:43:43,960 --> 00:43:46,960 Speaker 5: repeating the status quo, who are just saying like, well, 876 00:43:47,440 --> 00:43:49,840 Speaker 5: I'm trying to get rich, so who cares how this harms 877 00:43:49,840 --> 00:43:53,080 Speaker 5: somebody, who cares about whether or not this goes on 878 00:43:53,120 --> 00:43:53,680 Speaker 5: to exploit. 879 00:43:53,920 --> 00:43:55,880 Speaker 2: And I think that's really, it's really 880 00:43:55,680 --> 00:43:58,120 Speaker 5: like a, it's a little bit of a complicated cultural 881 00:43:58,200 --> 00:44:00,800 Speaker 5: dynamic and cultural shift that I think that we really 882 00:44:00,840 --> 00:44:01,560 Speaker 5: gotta break. 883 00:44:02,440 --> 00:44:03,280 Speaker 3: Yeah. 884 00:44:03,480 --> 00:44:05,480 Speaker 4: Yeah. And it's really sad, going back to your point, 885 00:44:05,480 --> 00:44:08,239 Speaker 4: Bridget, of like the Internet not being a place of 886 00:44:08,360 --> 00:44:12,040 Speaker 4: joy anymore, because so many times it was marginalized people 887 00:44:12,080 --> 00:44:16,720 Speaker 4: who made those spaces because they couldn't find them anywhere else, 888 00:44:18,200 --> 00:44:22,320 Speaker 4: and then these companies come in and are like, Okay, 889 00:44:22,360 --> 00:44:26,080 Speaker 4: well we can make money, and then it doesn't become 890 00:44:26,360 --> 00:44:30,520 Speaker 4: a joyous space anymore. It becomes a very toxic, a 891 00:44:30,600 --> 00:44:36,080 Speaker 4: toxic place.
And so like hearing this story and seeing 892 00:44:36,800 --> 00:44:40,320 Speaker 4: how so much of what we use still is based 893 00:44:40,440 --> 00:44:41,520 Speaker 4: on something that was 894 00:44:42,239 --> 00:44:44,800 Speaker 3: a guy walking in with the Playboy 895 00:44:44,440 --> 00:44:51,520 Speaker 4: magazine, like, it's bad that that doesn't feel 896 00:44:51,560 --> 00:44:54,040 Speaker 4: so out of place in what we're talking about in 897 00:44:54,080 --> 00:44:54,799 Speaker 4: our current time. 898 00:44:55,560 --> 00:44:58,600 Speaker 5: Yeah, and again, I mean, I opened up our conversation 899 00:44:58,680 --> 00:45:00,000 Speaker 5: with this, and I guess I'll close 900 00:45:00,080 --> 00:45:03,200 Speaker 5: with it too. When I say this, 901 00:45:03,239 --> 00:45:06,399 Speaker 5: people think I sound alarmist or extreme, but I mean 902 00:45:06,400 --> 00:45:08,080 Speaker 5: it the way that I mean it. I think that 903 00:45:08,120 --> 00:45:10,960 Speaker 5: these things are features, not bugs. I think we got 904 00:45:11,000 --> 00:45:13,520 Speaker 5: to be honest about the ways that things like misogyny 905 00:45:13,560 --> 00:45:17,680 Speaker 5: and exploitation, particularly when it comes to marginalized people, have 906 00:45:17,719 --> 00:45:21,120 Speaker 5: been foundational to technology and the Internet from the very beginning. 907 00:45:22,000 --> 00:45:24,520 Speaker 2: I love the Internet. I love technology. It is why 908 00:45:24,600 --> 00:45:25,719 Speaker 2: I do the work that I do. 909 00:45:26,040 --> 00:45:28,279 Speaker 5: But I think that until we are honest about the fact 910 00:45:28,920 --> 00:45:30,359 Speaker 5: that these things are features and 911 00:45:30,400 --> 00:45:32,560 Speaker 2: not bugs, we will never get anywhere.
912 00:45:32,640 --> 00:45:34,720 Speaker 5: And so I think that it really starts with having 913 00:45:34,800 --> 00:45:37,920 Speaker 5: honest conversations about where we started, so that we can 914 00:45:37,960 --> 00:45:40,000 Speaker 5: get to a place that actually feels a 915 00:45:40,040 --> 00:45:41,200 Speaker 5: little bit better for everybody. 916 00:45:41,920 --> 00:45:47,880 Speaker 4: Yes, yes. Well, thank you so much as always, Bridget. 917 00:45:47,920 --> 00:45:49,560 Speaker 4: Every time you come on, I'm like, we could talk 918 00:45:49,600 --> 00:45:52,719 Speaker 4: for hours about this and this and this. 919 00:45:53,040 --> 00:45:55,800 Speaker 5: Invite me back for an episode just dragging Hugh Hefner. 920 00:45:56,280 --> 00:45:58,319 Speaker 2: Yeah, I'll be here for it. 921 00:45:58,520 --> 00:45:59,520 Speaker 1: I think we need to do this. 922 00:46:00,000 --> 00:46:03,239 Speaker 6: I've thought about this for a minute, getting back into 923 00:46:03,280 --> 00:46:05,880 Speaker 6: the magazine world and jumping into like all of that. 924 00:46:06,120 --> 00:46:09,120 Speaker 5: Don't even get me started. I mean, like, this is like spoiler alert. 925 00:46:09,280 --> 00:46:11,600 Speaker 5: I like totally had this wrong for so long in 926 00:46:11,600 --> 00:46:14,000 Speaker 5: my life. I was like, oh, Hugh Hefner was a 927 00:46:14,080 --> 00:46:16,760 Speaker 5: champion for free speech and civil rights and blah blah blah. 928 00:46:16,840 --> 00:46:19,280 Speaker 5: Then I grew up and learned, and I was like, actually, 929 00:46:19,920 --> 00:46:20,840 Speaker 5: he wasn't such a good 930 00:46:20,719 --> 00:46:24,360 Speaker 6: guy. Right, I mean, we really fed into the, but 931 00:46:24,440 --> 00:46:25,120 Speaker 6: I read the articles. 932 00:46:25,320 --> 00:46:26,080 Speaker 1: They're so good. 933 00:46:26,160 --> 00:46:27,960 Speaker 2: Oh my god, they've corbell.
934 00:46:29,400 --> 00:46:34,360 Speaker 4: Yes, yes, oh yes, please come back for that, Bridget. 935 00:46:36,239 --> 00:46:38,920 Speaker 3: In the meantime, where can the good listeners find you? 936 00:46:39,120 --> 00:46:40,759 Speaker 5: Well, you can listen to my podcast, There Are No 937 00:46:40,800 --> 00:46:42,839 Speaker 5: Girls on the Internet. You can follow me. I'm not 938 00:46:42,880 --> 00:46:44,919 Speaker 5: really on social media that much anymore, but you can 939 00:46:44,920 --> 00:46:47,040 Speaker 5: try to find me there. I'm on Instagram at Bridget 940 00:46:47,040 --> 00:46:50,680 Speaker 5: Marie in DC. I am on Bluesky at Bridget 941 00:46:50,680 --> 00:46:55,800 Speaker 5: Todd, on Threads at Bridget Marie in DC, sometimes on TikTok. 942 00:46:55,840 --> 00:46:57,560 Speaker 2: I'm easy to find. Google me, you'll 943 00:46:57,560 --> 00:46:57,960 Speaker 2: find me. 944 00:46:59,320 --> 00:47:01,240 Speaker 1: Yes, Google, that's a flex. 945 00:47:02,520 --> 00:47:05,720 Speaker 3: It's true though. Our listeners are smart. They can find 946 00:47:05,760 --> 00:47:06,840 Speaker 3: you. And listeners, 947 00:47:06,840 --> 00:47:08,879 Speaker 4: if you would like to contact us, you can. You can 948 00:47:08,920 --> 00:47:11,319 Speaker 4: email us at mom stuff at iHeartMedia dot com. 949 00:47:11,320 --> 00:47:13,480 Speaker 4: You can find us on Twitter at mom stuff podcast, 950 00:47:13,520 --> 00:47:15,880 Speaker 4: or on TikTok and Instagram at stuff I never told you. 951 00:47:15,920 --> 00:47:17,800 Speaker 4: We're also on YouTube. We have a TeePublic store 952 00:47:18,280 --> 00:47:20,160 Speaker 4: and a book you can get wherever you get your books. 953 00:47:20,239 --> 00:47:22,680 Speaker 4: Thanks as always to our super producer Christina, our executive producer 954 00:47:22,719 --> 00:47:25,320 Speaker 4: Maya, and our contributor Joey.
Thank you and thanks to 955 00:47:25,400 --> 00:47:27,239 Speaker 4: you for listening. Stuff Mom Never Told You is a production 956 00:47:27,320 --> 00:47:29,080 Speaker 4: of iHeartRadio. For more podcasts from iHeartRadio, you 957 00:47:29,120 --> 00:47:31,040 Speaker 4: can check out the iHeartRadio app, Apple Podcasts, or wherever 958 00:47:31,040 --> 00:47:32,000 Speaker 4: you listen to your favorite shows.