1 00:00:05,160 --> 00:00:06,720 Speaker 1: Hey, this is Annie and Samantha.
2 00:00:06,720 --> 00:00:08,319 Speaker 2: And welcome to Stuff Mom Never Told You, a production of
3 00:00:08,320 --> 00:00:20,840 Speaker 2: iHeartRadio, and today we are once again so happy
4 00:00:21,000 --> 00:00:24,439 Speaker 2: to be joined by the incredible, the irreplaceable Bridget Todd.
5 00:00:24,680 --> 00:00:29,520 Speaker 2: Welcome, Bridget. Thanks for having me, so lovely to be back. Yes,
6 00:00:29,960 --> 00:00:31,960 Speaker 2: thank you so much for being back. You and I
7 00:00:32,040 --> 00:00:35,560 Speaker 2: were chatting about how stressful these times can be, so
8 00:00:36,520 --> 00:00:40,640 Speaker 2: the holidays, those times being so, thank you for making
9 00:00:40,680 --> 00:00:45,400 Speaker 2: the time. We always love talking to you. How are you, Bridget?
10 00:00:45,479 --> 00:00:46,000 Speaker 2: How have you been?
11 00:00:46,720 --> 00:00:49,400 Speaker 3: I'm not bad. We were talking off mic about how
12 00:00:49,600 --> 00:00:50,920 Speaker 3: it is a known issue with me.
13 00:00:51,080 --> 00:00:55,480 Speaker 4: I'm a grinch. I don't like the holidays. When people ask,
14 00:00:55,640 --> 00:00:57,280 Speaker 4: you know, this time of year, people are like, what
15 00:00:57,280 --> 00:01:00,600 Speaker 3: are you doing for the holiday? I, like, my fur,
16 00:01:00,760 --> 00:01:02,400 Speaker 3: I can feel my fur go up. You know, I
17 00:01:02,520 --> 00:01:03,240 Speaker 3: just don't like it.
18 00:01:03,640 --> 00:01:06,160 Speaker 4: My heart beats a little bit faster, and I just
19 00:01:06,360 --> 00:01:11,400 Speaker 4: I just find the holiday season extremely stressful. And especially
20 00:01:11,480 --> 00:01:14,120 Speaker 4: when you're someone who is self-employed, it's like, hey,
21 00:01:14,240 --> 00:01:16,200 Speaker 4: you know the work that you usually do? What if
22 00:01:16,280 --> 00:01:19,520 Speaker 4: you did that work plus a week's worth of work
23 00:01:19,840 --> 00:01:22,520 Speaker 4: on top of it, and then that'll buy you a
24 00:01:22,600 --> 00:01:24,720 Speaker 4: week off. What will you be doing with that time?
25 00:01:24,959 --> 00:01:28,640 Speaker 4: Going to see your family? Notably not a stressful activity. It's
26 00:01:28,959 --> 00:01:32,000 Speaker 4: like, I just find it like a stress sandwich. Essentially,
27 00:01:32,080 --> 00:01:34,280 Speaker 4: I can't handle it. I cannot hang with the holidays.
28 00:01:35,920 --> 00:01:41,759 Speaker 2: Well, as I said, Samantha, you and I would agree
29 00:01:41,920 --> 00:01:44,360 Speaker 2: on this. I mean, I also think they're very stressful.
30 00:01:44,440 --> 00:01:48,560 Speaker 2: I... yesterday we had a... we went to the office
31 00:01:48,760 --> 00:01:51,720 Speaker 2: for a Sminty team meeting. We go, like Samantha and
32 00:01:51,760 --> 00:01:55,720 Speaker 2: I go once a year, and our producer Christina was
33 00:01:55,720 --> 00:02:00,440 Speaker 3: like, how are you? And I was like, I'm so
34 00:02:01,160 --> 00:02:04,360 Speaker 3: tired, just a deep sigh. Yeah.
35 00:02:04,400 --> 00:02:06,200 Speaker 4: And then it's the end of the year. It just...
36 00:02:06,920 --> 00:02:11,840 Speaker 4: I just always feel like it's like everything catching up
37 00:02:11,880 --> 00:02:15,720 Speaker 4: with you, all of the year. It's been a heavy year.
38 00:02:16,480 --> 00:02:18,520 Speaker 4: It's something about like by the time it hits December,
39 00:02:18,639 --> 00:02:21,840 Speaker 4: I'm just so freaking done, just done, spent.
40 00:02:23,160 --> 00:02:26,200 Speaker 2: Yeah, and then you have the new year barreling in
41 00:02:26,320 --> 00:02:30,040 Speaker 2: on you and you're like, well, it's still gonna be here,
42 00:02:30,880 --> 00:02:34,519 Speaker 2: but has all these expectations of maybe something will change,
43 00:02:34,600 --> 00:02:38,240 Speaker 2: and not really. Yeah, the
44 00:02:38,280 --> 00:02:40,679 Speaker 4: new year I'm actually okay with, because you get to
45 00:02:40,760 --> 00:02:42,840 Speaker 4: have those couple of days where you buy a new
46 00:02:42,919 --> 00:02:45,320 Speaker 4: journal and some new pens, and if, for a
47 00:02:45,400 --> 00:02:48,120 Speaker 4: brief moment, you're like, this is gonna be my year,
48 00:02:48,240 --> 00:02:51,079 Speaker 4: this is the year, then I'll get it
49 00:02:51,120 --> 00:02:54,000 Speaker 4: all together. It doesn't usually last for me. I usually
50 00:02:54,040 --> 00:02:55,720 Speaker 4: get about a week out of that. But you get
51 00:02:55,760 --> 00:02:59,079 Speaker 4: that week, which is not bad. And you and Samantha
52 00:02:59,160 --> 00:03:00,160 Speaker 4: did Bondo.
53 00:02:59,880 --> 00:03:01,120 Speaker 1: For last time.
54 00:03:01,280 --> 00:03:04,080 Speaker 5: Oh yeah, I got a new set. I was gifted
55 00:03:04,160 --> 00:03:05,920 Speaker 5: a new set, Bridget, just to give you an
56 00:03:05,880 --> 00:03:07,680 Speaker 3: update. And how are they writing?
57 00:03:08,120 --> 00:03:09,799 Speaker 5: I had to get used to it because it was
58 00:03:09,840 --> 00:03:12,639 Speaker 5: a lot free-flowing, a lot more free-flowing than
59 00:03:12,680 --> 00:03:14,799 Speaker 5: my previous one. So I had to practice with them
60 00:03:15,000 --> 00:03:17,680 Speaker 5: so I didn't mess up my tiny coloring books.
61 00:03:19,480 --> 00:03:21,600 Speaker 4: A good set of pens, a good set of markers.
62 00:03:21,720 --> 00:03:23,720 Speaker 4: It really will have you believing this is going to
63 00:03:23,760 --> 00:03:24,560 Speaker 4: turn everything around.
64 00:03:24,919 --> 00:03:27,600 Speaker 5: It's... for a split second I was like, I can
65 00:03:27,720 --> 00:03:29,600 Speaker 5: color within the lines, and then I was like, no,
66 00:03:29,800 --> 00:03:30,280 Speaker 5: I cannot.
67 00:03:33,760 --> 00:03:37,720 Speaker 2: Well, the topic you've brought today for us, Bridget, is
68 00:03:38,040 --> 00:03:41,400 Speaker 2: so, so timely. I've been thinking about this because, I mean,
69 00:03:41,440 --> 00:03:44,040 Speaker 2: it's hard to escape. But also you get to the
70 00:03:44,160 --> 00:03:46,240 Speaker 2: end of the year and you have, like, Merriam-Webster
71 00:03:46,880 --> 00:03:49,080 Speaker 2: chooses their word of the year and it's slop, as
72 00:03:49,120 --> 00:03:52,280 Speaker 2: in AI slop, and then you have Time chooses their
73 00:03:52,400 --> 00:03:56,000 Speaker 2: person of the year and it's like the architects of AI
74 00:03:56,200 --> 00:04:00,920 Speaker 2: and it's like all dudes. And yesterday when we were
75 00:04:01,440 --> 00:04:04,160 Speaker 2: at the office, we were having kind of a team
76 00:04:04,280 --> 00:04:07,240 Speaker 2: meeting and it came up, we were talking about AI
77 00:04:07,520 --> 00:04:10,040 Speaker 2: and our jobs and what does that look like?
78 00:04:10,640 --> 00:04:13,680 Speaker 1: So what are we talking about specifically today, Bridget?
79 00:04:13,960 --> 00:04:17,479 Speaker 4: Today we're talking about the reality that women are using
80 00:04:17,600 --> 00:04:21,000 Speaker 4: AI tools a lot less than men in the workplace.
81 00:04:21,440 --> 00:04:25,640 Speaker 4: There's a definite, well-studied, well-documented gender gap when
82 00:04:25,680 --> 00:04:29,559 Speaker 4: it comes to the usage of AI. And I feel
83 00:04:29,600 --> 00:04:33,000 Speaker 4: you, because I think as a podcaster especially, but any
84 00:04:33,080 --> 00:04:35,960 Speaker 4: kind of creative, the first conversation that comes up when
85 00:04:36,000 --> 00:04:38,719 Speaker 4: it's a bunch of podcasters together is, are you using
86 00:04:38,760 --> 00:04:40,680 Speaker 4: AI in your work? How are you seeing AI show up?
87 00:04:40,839 --> 00:04:44,320 Speaker 4: Or people asking, are you afraid of AI? Do you
88 00:04:44,360 --> 00:04:46,200 Speaker 4: see AI taking your job?
89 00:04:46,279 --> 00:04:47,640 Speaker 3: One day? Like you won't have a job.
90 00:04:47,680 --> 00:04:50,080 Speaker 4: People will be listening to AI podcasts, all of that, right?
91 00:04:50,680 --> 00:04:54,920 Speaker 4: Those conversations are everywhere, but I think oftentimes
92 00:04:55,040 --> 00:04:57,960 Speaker 4: those conversations can be grounded in a kind of hype.
93 00:04:58,120 --> 00:05:01,640 Speaker 4: And so I think the fact that AI is
94 00:05:01,800 --> 00:05:05,080 Speaker 4: not being adopted equally by people of all genders at
95 00:05:05,080 --> 00:05:05,719 Speaker 4: the same rates
96 00:05:05,920 --> 00:05:07,040 Speaker 3: is pretty interesting.
97 00:05:07,160 --> 00:05:09,880 Speaker 4: So I wanted to talk through what we know about
98 00:05:09,920 --> 00:05:12,080 Speaker 4: the data when it comes to the gender gap in AI.
99 00:05:12,880 --> 00:05:15,160 Speaker 4: But I think I might be coming at this from
100 00:05:15,200 --> 00:05:17,360 Speaker 4: a little bit of a different place, right. You know,
101 00:05:17,760 --> 00:05:21,200 Speaker 4: in a lot of the research, they automatically assume two
102 00:05:21,360 --> 00:05:24,039 Speaker 4: points that I want to call out early that I'll
103 00:05:24,080 --> 00:05:28,080 Speaker 4: return to. One, women adopting AI less is going to
104 00:05:28,120 --> 00:05:30,240 Speaker 4: be bad for women. I'll talk about this later, but
105 00:05:30,880 --> 00:05:33,760 Speaker 4: I don't know about treating that like a foregone conclusion.
106 00:05:34,600 --> 00:05:36,520 Speaker 4: And then this is kind of an offshoot of something
107 00:05:36,560 --> 00:05:39,559 Speaker 4: that I hear a ton when talking about AI: AI
108 00:05:39,960 --> 00:05:42,640 Speaker 4: is inevitable. If you don't get with the program, you're
109 00:05:42,640 --> 00:05:45,400 Speaker 4: going to be left behind. I have people who tell
110 00:05:45,480 --> 00:05:49,200 Speaker 4: me that not being into AI right now is like
111 00:05:49,440 --> 00:05:52,320 Speaker 4: not using email. You know, it will be the same
112 00:05:52,440 --> 00:05:54,720 Speaker 4: kind of thing in a few years, and, you know,
113 00:05:54,800 --> 00:05:57,600 Speaker 4: if women are adopting it less, the idea that we're
114 00:05:57,640 --> 00:06:01,239 Speaker 4: going to be less employable, less competitive, making less money overall.
115 00:06:01,800 --> 00:06:03,960 Speaker 4: And so I want to be clear that I don't
116 00:06:04,000 --> 00:06:06,680 Speaker 4: really have the answer here, and I want to
117 00:06:06,720 --> 00:06:08,920 Speaker 4: look at what the research says, but I don't like
118 00:06:09,040 --> 00:06:14,240 Speaker 4: this idea that frames widespread AI adoption as an inevitable
119 00:06:14,320 --> 00:06:16,360 Speaker 4: thing that everyone is going to be doing, because it
120 00:06:16,400 --> 00:06:19,080 Speaker 4: really doesn't leave a lot of room for criticism
121 00:06:19,160 --> 00:06:21,480 Speaker 4: around the way that AI is being adopted. It's just
122 00:06:21,560 --> 00:06:24,479 Speaker 4: repeating, it's inevitable, it's inevitable, it's inevitable, get on board.
123 00:06:24,880 --> 00:06:26,880 Speaker 4: I do think that when it comes to adopting any
124 00:06:27,000 --> 00:06:30,520 Speaker 4: new technology in this way, we shouldn't just be
125 00:06:30,640 --> 00:06:33,640 Speaker 4: told to get on board, it's inevitable, just ram it
126 00:06:33,720 --> 00:06:36,960 Speaker 4: down people's throats, especially when people are saying, hey, maybe
127 00:06:37,000 --> 00:06:38,839 Speaker 4: we don't actually like this technology.
128 00:06:38,920 --> 00:06:41,040 Speaker 3: Hey, maybe this technology isn't good.
129 00:06:41,160 --> 00:06:41,280 Speaker 2: Right.
130 00:06:41,720 --> 00:06:44,080 Speaker 4: I love technology. I'm an advocate for knowing the basics
131 00:06:44,120 --> 00:06:46,120 Speaker 4: of all kinds of technology. But if a technology is
132 00:06:46,120 --> 00:06:49,799 Speaker 4: showing itself to be ineffective, biased, problematic, full of lies,
133 00:06:50,160 --> 00:06:52,520 Speaker 4: actually adds work to your plate, I don't know that
134 00:06:52,560 --> 00:06:54,720 Speaker 4: we should just be telling people, get on
135 00:06:54,800 --> 00:06:57,760 Speaker 4: board with that, it's inevitable, if that's actually what they're experiencing,
136 00:06:57,839 --> 00:06:58,039 Speaker 4: you know.
137 00:06:58,960 --> 00:07:03,120 Speaker 2: Yes, yes. And this has come up a lot at
138 00:07:03,200 --> 00:07:06,680 Speaker 2: our office. At least right now, it seems
139 00:07:06,720 --> 00:07:09,880 Speaker 2: to be adding more work. It's not like actually helping,
140 00:07:11,160 --> 00:07:13,480 Speaker 2: but that you're told like, no, it's going to really
141 00:07:13,560 --> 00:07:16,400 Speaker 2: help you out and streamline things. I have to say,
142 00:07:16,440 --> 00:07:21,480 Speaker 2: I had not considered this gender gap in AI usage.
143 00:07:21,520 --> 00:07:24,560 Speaker 2: So I'm really interested about the research
144 00:07:24,440 --> 00:07:25,080 Speaker 1: that you found.
145 00:07:25,800 --> 00:07:27,160 Speaker 3: Yeah, let's get into it.
146 00:07:28,080 --> 00:07:29,880 Speaker 4: So I took a look at some of the studies
147 00:07:30,000 --> 00:07:32,880 Speaker 4: and they all basically confirm that women are using AI
148 00:07:33,040 --> 00:07:35,560 Speaker 4: in the workplace a lot less than our male counterparts.
149 00:07:36,080 --> 00:07:38,720 Speaker 4: Let's look at a meta analysis called Global Evidence on
150 00:07:38,840 --> 00:07:41,880 Speaker 4: Gender Gaps in Generative AI, published in the Harvard Business Review.
151 00:07:42,200 --> 00:07:44,240 Speaker 4: So they took a look at eighteen surveys and studies
152 00:07:44,320 --> 00:07:47,320 Speaker 4: covering one hundred and forty three thousand plus individuals across
153 00:07:47,440 --> 00:07:51,400 Speaker 4: many countries, sectors, and occupations. They also combined this survey
154 00:07:51,520 --> 00:07:54,720 Speaker 4: data with Internet traffic and mobile app download data for
155 00:07:55,200 --> 00:07:58,920 Speaker 4: major generative AI platforms like Claude and ChatGPT, platforms that
156 00:07:59,000 --> 00:08:02,360 Speaker 4: have hundreds of millions of users. They found a large,
157 00:08:03,000 --> 00:08:07,160 Speaker 4: nearly universal gender gap. Here's what they found. They found
158 00:08:07,200 --> 00:08:10,600 Speaker 4: that the gender gap in generative AI usage is global, persistent,
159 00:08:11,000 --> 00:08:14,800 Speaker 4: and not fully explained by access or occupation, pointing to
160 00:08:14,920 --> 00:08:18,080 Speaker 4: some deeper social, cultural, and institutional frictions.
161 00:08:18,480 --> 00:08:20,720 Speaker 3: So overall, it just seems like women
162 00:08:20,720 --> 00:08:23,960 Speaker 4: are not the ones, when people are saying it's inevitable,
163 00:08:24,040 --> 00:08:25,280 Speaker 4: get on board, you got to use it.
164 00:08:25,640 --> 00:08:28,760 Speaker 3: It sounds like, by and large, that is not coming
165 00:08:28,880 --> 00:08:29,520 Speaker 3: from women.
166 00:08:30,480 --> 00:08:34,960 Speaker 4: I'm curious, do these findings surprise you, like as women
167 00:08:35,040 --> 00:08:38,400 Speaker 4: in media, as women in tech and podcasting, like, is
168 00:08:38,480 --> 00:08:41,520 Speaker 4: this surprising information for you?
169 00:08:41,760 --> 00:08:42,240 Speaker 1: It's not.
170 00:08:42,520 --> 00:08:44,679 Speaker 2: I don't think I'd ever clocked it until I was
171 00:08:44,800 --> 00:08:48,840 Speaker 2: looking at your outline. But once I looked at the outline,
172 00:08:48,840 --> 00:08:52,319 Speaker 2: I was like, yeah, yeah, because most people I know
173 00:08:52,360 --> 00:08:56,079 Speaker 2: who are into AI are men, and they usually have
174 00:08:56,200 --> 00:08:59,319 Speaker 2: a very, like, but it's this global technology thing about it,
175 00:09:00,400 --> 00:09:04,640 Speaker 2: and I can't say I really have anyone I can
176 00:09:04,679 --> 00:09:08,319 Speaker 2: think of who's a woman that's like that. And
177 00:09:08,440 --> 00:09:10,240 Speaker 2: I mean, we're going to go over some of the
178 00:09:10,280 --> 00:09:12,280 Speaker 2: reasons why this is, and I thought a lot of
179 00:09:12,320 --> 00:09:16,439 Speaker 2: them were interesting, but they did resonate with me, and
180 00:09:16,520 --> 00:09:19,520 Speaker 2: I thought like, yeah, that makes sense. Like I get,
181 00:09:21,480 --> 00:09:26,160 Speaker 2: I can see why there's a hesitance, but I guess I
182 00:09:26,280 --> 00:09:29,439 Speaker 2: hadn't, until I read this, I hadn't picked up on it,
183 00:09:29,559 --> 00:09:30,199 Speaker 2: but I do.
184 00:09:30,480 --> 00:09:32,920 Speaker 1: It resonates with me now, and I'm like, yeah, that
185 00:09:33,040 --> 00:09:34,400 Speaker 1: seems right, that seems right.
186 00:09:34,720 --> 00:09:39,480 Speaker 5: I think being on like the ex lineal community. That's
187 00:09:39,520 --> 00:09:41,079 Speaker 5: kind of one of those things that I don't think about.
188 00:09:41,120 --> 00:09:44,160 Speaker 5: But my partner, who was heavily, like, even employed in
189 00:09:44,320 --> 00:09:47,480 Speaker 5: the industry of AI, the difference between him and I
190 00:09:47,640 --> 00:09:52,120 Speaker 5: is so vast that it's not surprising the way that
191 00:09:52,240 --> 00:09:54,960 Speaker 5: he has integrated that into his systems, into his life,
192 00:09:55,120 --> 00:09:57,920 Speaker 5: but also understands how he has to be cautious and
193 00:09:57,960 --> 00:10:00,439 Speaker 5: how it is one of those things that has to
194 00:10:00,520 --> 00:10:04,080 Speaker 5: be operated with responsibility. I think that there's a level
195 00:10:04,160 --> 00:10:07,760 Speaker 5: of understanding, but he is also very, very excited by
196 00:10:07,880 --> 00:10:11,800 Speaker 5: the possibilities of what AI could do. So there's this
197 00:10:11,960 --> 00:10:16,120 Speaker 5: level, like in our, in my life specifically now, our
198 00:10:16,200 --> 00:10:18,719 Speaker 5: responses to AI is me being a little more, like,
199 00:10:18,880 --> 00:10:23,720 Speaker 5: dismissive and very much more, like, negative about the whole
200 00:10:23,800 --> 00:10:26,319 Speaker 5: thing, whereas he is trying to play
201 00:10:26,480 --> 00:10:28,319 Speaker 5: his cards, if that makes sense.
202 00:10:28,640 --> 00:10:31,439 Speaker 4: That makes so much sense. And it's funny because you
203 00:10:31,600 --> 00:10:34,800 Speaker 4: are basically living a microcosm of what the data suggests.
204 00:10:34,960 --> 00:10:36,520 Speaker 3: And that's kind of what it feels like when I'm
205 00:10:36,559 --> 00:10:37,160 Speaker 3: like thinking about it.
206 00:10:37,160 --> 00:10:39,319 Speaker 5: I'm like, honestly, I feel like I'm living as the
207 00:10:39,400 --> 00:10:43,440 Speaker 5: stereotype that is just, or like, in this conversation.
208 00:10:43,559 --> 00:10:47,760 Speaker 4: Absolutely. So in this meta analysis, they found that across
209 00:10:48,000 --> 00:10:50,480 Speaker 4: almost all the data sets they looked at, women are
210 00:10:50,480 --> 00:10:53,360 Speaker 4: about twenty to twenty five percent less likely than men
211 00:10:53,480 --> 00:10:56,680 Speaker 4: to be using generative AI. The meta analysis shows women
212 00:10:56,800 --> 00:10:59,520 Speaker 4: have twenty two percent lower odds of using AI than men.
213 00:11:00,320 --> 00:11:04,000 Speaker 4: The gap appears across regions, industries, education levels, and occupations.
214 00:11:04,360 --> 00:11:07,760 Speaker 4: So you might hear that and think, okay, but maybe
215 00:11:07,800 --> 00:11:10,520 Speaker 4: the people who did these surveys got it wrong,
216 00:11:10,800 --> 00:11:14,719 Speaker 4: looked at the wrong information. That gap, however, cannot be
217 00:11:14,880 --> 00:11:17,800 Speaker 4: explained by things like survey bias, because they also, as
218 00:11:17,840 --> 00:11:20,760 Speaker 4: they said, looked at who is downloading tools like Claude
219 00:11:20,800 --> 00:11:24,760 Speaker 4: and ChatGPT, and that data showed that women comprise
220 00:11:24,840 --> 00:11:28,760 Speaker 4: about forty two percent of ChatGPT website users globally, twenty
221 00:11:28,800 --> 00:11:32,080 Speaker 4: seven percent of ChatGPT mobile app users, and even
222 00:11:32,240 --> 00:11:35,880 Speaker 4: lower shares of women are using the tool Claude from Anthropic.
223 00:11:36,280 --> 00:11:38,880 Speaker 4: So this really suggests that the gender gap is at
224 00:11:38,920 --> 00:11:41,320 Speaker 4: a pretty massive global scale.
225 00:11:42,440 --> 00:11:46,080 Speaker 3: A couple of notable findings. If women are more senior
226 00:11:46,240 --> 00:11:47,719 Speaker 3: or have more experience,
227 00:11:47,320 --> 00:11:49,800 Speaker 4: it does narrow that gender gap a little bit, but
228 00:11:50,120 --> 00:11:54,439 Speaker 4: it does not completely eliminate that gap. Senior women in
229 00:11:54,520 --> 00:11:58,600 Speaker 4: technical roles sometimes match or exceed men's usage, but also
230 00:11:59,040 --> 00:12:02,520 Speaker 4: junior women and women in non-technical roles show much
231 00:12:02,760 --> 00:12:06,320 Speaker 4: larger gaps. So there are a couple of interesting, notable
232 00:12:06,480 --> 00:12:10,319 Speaker 4: outliers there. But in general, just like what's happening in
233 00:12:10,360 --> 00:12:15,319 Speaker 4: Samantha's household, men are even cautiously sort of gung ho,
234 00:12:15,600 --> 00:12:19,800 Speaker 4: experimenting with it, thinking pretty optimistically about what AI might be
235 00:12:19,840 --> 00:12:23,280 Speaker 3: able to do. Women are a little more skeptical.
236 00:12:23,600 --> 00:12:36,720 Speaker 1: We'll say. I wonder...
237 00:12:38,000 --> 00:12:40,720 Speaker 2: just because we've been talking a lot on the show
238 00:12:42,080 --> 00:12:45,720 Speaker 2: about, like, the quote male loneliness epidemic, which women are
239 00:12:46,160 --> 00:12:49,040 Speaker 2: also very lonely, but that's what we're talking about. But
240 00:12:49,160 --> 00:12:51,160 Speaker 2: I see a lot of stories of men like forming
241 00:12:51,240 --> 00:12:57,160 Speaker 2: these bonds with AI and these relationships with AI, and
242 00:12:57,280 --> 00:12:59,720 Speaker 2: I have no idea, but I wonder if that's also
243 00:13:00,120 --> 00:13:02,480 Speaker 2: part of it, because women usually have more of a
244 00:13:02,559 --> 00:13:03,559 Speaker 2: support group than this.
245 00:13:03,800 --> 00:13:08,120 Speaker 3: Yeah, yeah, I mean, I absolutely think that.
246 00:13:09,679 --> 00:13:12,480 Speaker 4: Some of the way that AI is talked about by
247 00:13:12,559 --> 00:13:15,480 Speaker 4: people who make it, some of the marketing language and
248 00:13:15,559 --> 00:13:21,199 Speaker 4: decisions around it, I think reinforce some attitudes that I
249 00:13:21,280 --> 00:13:23,240 Speaker 4: think are kind of sexist.
250 00:13:23,480 --> 00:13:23,600 Speaker 3: Right.
251 00:13:23,720 --> 00:13:27,760 Speaker 4: I remember very clearly, it was like a grand opening
252 00:13:27,880 --> 00:13:32,240 Speaker 4: when Sam Altman, the CEO of OpenAI, was introducing
253 00:13:32,360 --> 00:13:37,600 Speaker 4: this new voice for OpenAI's chatbot, named Sky,
254 00:13:38,280 --> 00:13:41,559 Speaker 4: and he basically said that he wanted the experience to
255 00:13:41,640 --> 00:13:44,280 Speaker 4: be like the experience that people had in the movie Her,
256 00:13:44,679 --> 00:13:47,880 Speaker 4: which is... folks, in that movie, Joaquin Phoenix is basically
257 00:13:47,960 --> 00:13:51,320 Speaker 4: having a romantic and sexual relationship with an AI chatbot
258 00:13:51,400 --> 00:13:52,720 Speaker 4: voiced by Scarlett Johansson.
259 00:13:53,120 --> 00:13:55,880 Speaker 3: And I love that movie. It's one of my favorite movies.
260 00:13:55,920 --> 00:13:57,720 Speaker 3: But obviously, within the framing of
261 00:13:57,720 --> 00:14:00,760 Speaker 4: the movie, it is about a man using technology
262 00:14:00,880 --> 00:14:03,959 Speaker 4: to satisfy his emotional and sexual needs on top of
263 00:14:04,080 --> 00:14:07,760 Speaker 4: the, you know, admin needs that you might be
264 00:14:07,880 --> 00:14:10,280 Speaker 4: using AI for already. And so when the head of
265 00:14:10,360 --> 00:14:13,280 Speaker 4: OpenAI gets on a stage and promises people that
266 00:14:13,360 --> 00:14:15,520 Speaker 4: they will be able to use technology the same way
267 00:14:15,520 --> 00:14:17,840 Speaker 4: it is used in this movie, well, I can understand
268 00:14:17,960 --> 00:14:21,240 Speaker 4: why that might hit women in a kind of a
269 00:14:21,280 --> 00:14:23,800 Speaker 4: weird way, right? I can understand why it's like, well,
270 00:14:23,800 --> 00:14:26,120 Speaker 4: what are you really saying about this technology, that that
271 00:14:26,280 --> 00:14:29,040 Speaker 4: is your frame of reference for how you envision people
272 00:14:29,160 --> 00:14:29,480 Speaker 4: using it?
273 00:14:29,600 --> 00:14:30,440 Speaker 3: Do you know what I mean?
274 00:14:31,440 --> 00:14:35,520 Speaker 2: Yes, yes, absolutely. And going back to one of the
275 00:14:35,600 --> 00:14:38,880 Speaker 2: other points you made at the beginning that comes up
276 00:14:38,960 --> 00:14:42,080 Speaker 2: in these arguments of why women aren't using AI as much.
277 00:14:43,160 --> 00:14:46,200 Speaker 2: It did kind of remind me of the argument around STEM,
278 00:14:46,560 --> 00:14:49,720 Speaker 2: like a lot of STEM things, where it's like, well,
279 00:14:50,600 --> 00:14:53,480 Speaker 2: they just, that's not how their brain works, or that
280 00:14:53,520 --> 00:14:56,920 Speaker 2: they don't pursue those fields because of whatever X, Y, Z.
281 00:14:58,160 --> 00:15:00,520 Speaker 2: So that feels very similar to me.
282 00:15:01,160 --> 00:15:03,480 Speaker 4: Oh yes, I'm so glad that you brought that up,
283 00:15:03,520 --> 00:15:06,840 Speaker 4: because when you look at how the other conversations around
284 00:15:06,960 --> 00:15:10,120 Speaker 4: gender gaps that we know exist in workplaces more broadly,
285 00:15:10,160 --> 00:15:13,440 Speaker 4: in leadership roles more broadly, STEM roles more broadly. Often,
286 00:15:13,800 --> 00:15:16,440 Speaker 4: just as you said, the pushback is that women maybe
287 00:15:16,480 --> 00:15:18,200 Speaker 4: are picking less lucrative
288 00:15:17,840 --> 00:15:18,600 Speaker 3: work or fields.
289 00:15:18,960 --> 00:15:21,680 Speaker 4: If you're Larry Summers, maybe it's the idea that women
290 00:15:21,720 --> 00:15:25,120 Speaker 4: are just innately less good at these fields or something. Now,
291 00:15:25,200 --> 00:15:27,880 Speaker 4: never mind that that really discounts the question of why
292 00:15:28,000 --> 00:15:32,920 Speaker 4: fields that are associated with women are historically devalued and underpaid.
293 00:15:33,000 --> 00:15:35,440 Speaker 4: So even if, you know, there was a time where
294 00:15:35,880 --> 00:15:39,960 Speaker 4: computing was seen as women's administrative or secretarial work. When
295 00:15:40,640 --> 00:15:43,080 Speaker 4: women were associated with it, it was lower paid admin
296 00:15:43,160 --> 00:15:46,480 Speaker 4: work.
When men became associated with it, very clearly,
297 00:15:46,560 --> 00:15:48,640 Speaker 4: it became a different kind of work. So that's clearly
298 00:15:48,720 --> 00:15:51,520 Speaker 4: not the women's fault for, you know, for what roles
299 00:15:51,560 --> 00:15:54,560 Speaker 4: they happen to be choosing. But in any event, not
300 00:15:55,000 --> 00:15:57,160 Speaker 4: only is that not what's really going on and not
301 00:15:57,280 --> 00:15:59,720 Speaker 4: telling the full story of what's happening in these workplaces,
302 00:16:00,240 --> 00:16:02,400 Speaker 4: nor is it really what's going on or telling the
303 00:16:02,480 --> 00:16:06,120 Speaker 4: full story of AI adoption. That's not me saying this,
304 00:16:06,320 --> 00:16:10,160 Speaker 4: that's according to this meta analysis. As social scientist Katie
305 00:16:10,240 --> 00:16:13,000 Speaker 4: Jin puts it in a piece called There's a reason
306 00:16:13,080 --> 00:16:15,720 Speaker 4: why women aren't swooning over AI like men are, which,
307 00:16:15,800 --> 00:16:20,040 Speaker 4: I love that title, she writes, the oft-proposed explanation
308 00:16:20,200 --> 00:16:23,560 Speaker 4: is that women understand this new technology less, largely because
309 00:16:23,600 --> 00:16:26,480 Speaker 4: they work in roles with lower exposure to it. Women are,
310 00:16:26,680 --> 00:16:30,280 Speaker 4: after all, still outnumbered in STEM degrees and careers, including
311 00:16:30,400 --> 00:16:33,640 Speaker 4: AI-specific roles. The same is true in AI leadership,
312 00:16:33,960 --> 00:16:37,000 Speaker 4: women hold fewer than fourteen percent of senior executive positions
313 00:16:37,040 --> 00:16:40,480 Speaker 4: in the industry. But Harvard's study also found that the
314 00:16:40,600 --> 00:16:44,080 Speaker 4: usage gap remains even when women are explicitly given opportunities
315 00:16:44,120 --> 00:16:47,360 Speaker 4: to learn and use AI tools. So it's not just
316 00:16:47,560 --> 00:16:50,920 Speaker 4: that women have less training or less access to AI.
317 00:16:51,440 --> 00:16:54,440 Speaker 4: Even when the people who did this meta analysis equalized
318 00:16:54,480 --> 00:16:58,400 Speaker 4: for that, they still found that this AI gender gap persists.
319 00:16:59,040 --> 00:17:01,000 Speaker 4: They looked at a study out of Kenya where women
320 00:17:01,080 --> 00:17:04,800 Speaker 4: were explicitly offered training and access to AI, yet those
321 00:17:05,119 --> 00:17:09,400 Speaker 4: women still used ChatGPT thirteen percent less than men.
322 00:17:09,600 --> 00:17:11,280 Speaker 3: So this seems to suggest
323 00:17:11,040 --> 00:17:16,119 Speaker 4: these barriers aren't really about access or awareness about AI,
324 00:17:16,359 --> 00:17:17,480 Speaker 4: there's something else going on.
325 00:17:21,000 --> 00:17:22,520 Speaker 5: I feel like there's so many things that I'm trying
326 00:17:22,560 --> 00:17:25,359 Speaker 5: to work out when it comes to this conversation, about
327 00:17:25,440 --> 00:17:29,000 Speaker 5: whys that seem like, well, obviously A equals B, or
328 00:17:29,080 --> 00:17:31,520 Speaker 5: A is because of B.
And one of the things
329 00:17:31,560 --> 00:17:34,480 Speaker 5: that I'm thinking about is, like, maybe just what I'm
330 00:17:34,520 --> 00:17:37,600 Speaker 5: seeing when it comes to things like ChatGPT, which
331 00:17:37,680 --> 00:17:40,560 Speaker 5: feels like just an advanced version of Google, where we
332 00:17:41,240 --> 00:17:43,960 Speaker 5: made that the thing in researching. But it gives
333 00:17:44,280 --> 00:17:48,119 Speaker 5: the mediocre man, who has a lot more confidence than
334 00:17:48,200 --> 00:17:52,399 Speaker 5: women in general, even more confidence, I guess, in a
335 00:17:52,480 --> 00:17:55,520 Speaker 5: way, because they think they're getting that answer without having
336 00:17:55,600 --> 00:17:57,720 Speaker 5: to go to the people who actually found those answers,
337 00:17:58,320 --> 00:18:01,080 Speaker 5: in some weird ways. Like, there's so many thoughts, like
338 00:18:01,200 --> 00:18:05,199 Speaker 5: maybe what are some of these reasons that we are
339 00:18:05,280 --> 00:18:07,280 Speaker 5: finding that men are more likely to use it and
340 00:18:07,359 --> 00:18:09,359 Speaker 5: not... for me, like, that just seems like a rationale.
341 00:18:10,720 --> 00:18:11,760 Speaker 5: I think that's true.
342 00:18:12,560 --> 00:18:14,879 Speaker 4: I think twice now, I've gotten into a back and
343 00:18:14,960 --> 00:18:17,400 Speaker 4: forth with somebody on Reddit, and I guess I don't
344 00:18:17,440 --> 00:18:19,720 Speaker 4: know their gender, but I can tell that they were
345 00:18:19,840 --> 00:18:22,159 Speaker 4: responding to me using ChatGPT, and I'm like, oh,
346 00:18:22,200 --> 00:18:22,560 Speaker 4: you didn't.
347 00:18:22,640 --> 00:18:24,200 Speaker 3: You couldn't. It wasn't worth it.
348 00:18:24,400 --> 00:18:26,520 Speaker 4: You just wanted to keep the, like, argument going, but
349 00:18:26,600 --> 00:18:27,879 Speaker 4: it wasn't worth it for you to come up with
350 00:18:27,920 --> 00:18:29,600 Speaker 4: your own answer. We're going to take the ChatGPT route,
351 00:18:29,760 --> 00:18:35,320 Speaker 4: really? Right. So the meta analysis gave some answers as
352 00:18:35,359 --> 00:18:37,240 Speaker 4: to what might be going on here, what might explain
353 00:18:37,320 --> 00:18:39,880 Speaker 4: some of these gaps. I'll tell y'all what they said,
354 00:18:39,920 --> 00:18:41,159 Speaker 4: and then I'll kind of give you my thoughts on it,
355 00:18:41,200 --> 00:18:42,840 Speaker 4: because some of these I'm not so sure about.
356 00:18:43,440 --> 00:18:46,800 Speaker 4: So the first is a lower familiarity and knowledge with AI.
357 00:18:46,960 --> 00:18:49,840 Speaker 4: So they said women are more likely to report not
358 00:18:50,080 --> 00:18:51,879 Speaker 4: knowing how generative AI works or
359 00:18:51,880 --> 00:18:52,560 Speaker 3: how to use it.
360 00:18:53,600 --> 00:18:56,800 Speaker 4: I'm not totally sold on this. You know, it is
361 00:18:56,880 --> 00:18:58,879 Speaker 4: in their meta analysis, so I wanted to include it.
362 00:18:58,960 --> 00:19:02,600 Speaker 4: But this is just my take. AI is not difficult
363 00:19:02,640 --> 00:19:05,680 Speaker 4: to understand. It is not difficult to use, nor is
364 00:19:05,760 --> 00:19:09,040 Speaker 4: it terribly complex.
There is nobody listening to the sound
365 00:19:09,080 --> 00:19:12,280 Speaker 4: of my voice who could not figure it out, for whom
366 00:19:12,840 --> 00:19:14,879 Speaker 4: it is outside of the realm of their ability to
367 00:19:14,920 --> 00:19:18,080 Speaker 4: comprehend. It is not terribly complex. I think that what
368 00:19:18,320 --> 00:19:21,440 Speaker 4: this study is actually highlighting here is, I think, this
369 00:19:21,680 --> 00:19:24,920 Speaker 4: tendency for the people who make technology and talk about
370 00:19:24,960 --> 00:19:28,439 Speaker 4: technology to do so in a way where it seems
371 00:19:28,520 --> 00:19:31,879 Speaker 4: like the technology they're talking about is so complex and
372 00:19:32,520 --> 00:19:36,359 Speaker 4: your puny little lady brains could never figure out what
373 00:19:36,480 --> 00:19:39,040 Speaker 4: we're talking about here. Think about the way that people
374 00:19:39,160 --> 00:19:42,840 Speaker 4: like Elon Musk, Mark Zuckerberg, Sam Altman talk about technology.
375 00:19:42,880 --> 00:19:44,440 Speaker 4: They talk about it in this way where it gives
376 00:19:44,480 --> 00:19:48,600 Speaker 4: it this gravitas that I think is oftentimes unearned,
377 00:19:48,640 --> 00:19:51,800 Speaker 4: because you don't need any kind of special training to
378 00:19:52,400 --> 00:19:54,560 Speaker 4: understand the basics of AI. And so I think the
379 00:19:54,640 --> 00:19:58,920 Speaker 4: fact that tech leaders, who, let's be real, often happen
380 00:19:59,000 --> 00:20:01,600 Speaker 4: to be not just men, but a specific kind of man,
381 00:20:01,720 --> 00:20:04,120 Speaker 4: not just white men, but a specific kind of white guy,
382 00:20:04,840 --> 00:20:07,520 Speaker 4: I think the fact that they are consistently talking about
383 00:20:07,560 --> 00:20:10,560 Speaker 4: this technology in a way that makes it seem like
384 00:20:10,600 --> 00:20:12,800 Speaker 4: your ordinary person could never understand it, I think it's
385 00:20:12,800 --> 00:20:14,520 Speaker 4: sort of at play here, and I think that is
386 00:20:14,600 --> 00:20:17,120 Speaker 4: with intention. I think they do this because they want
387 00:20:17,440 --> 00:20:20,640 Speaker 4: people to feel, I couldn't possibly understand what they're talking about.
388 00:20:20,800 --> 00:20:23,000 Speaker 4: How could I hold these people accountable? How could I
389 00:20:23,119 --> 00:20:24,639 Speaker 4: expect things to be better for me? How could I
390 00:20:24,680 --> 00:20:29,879 Speaker 4: even, you know... Oftentimes they are telling people, oh, no, no,
391 00:20:30,320 --> 00:20:31,160 Speaker 4: you want AI.
392 00:20:31,400 --> 00:20:34,960 Speaker 3: AI is good. People will say, no, I know what
393 00:20:35,119 --> 00:20:35,879 Speaker 3: I like, and I know
394 00:20:35,960 --> 00:20:38,280 Speaker 4: what my experiences are, and I know that I'm not enjoying,
395 00:20:38,760 --> 00:20:40,760 Speaker 4: you know, opening my email and not even being able to
396 00:20:40,920 --> 00:20:43,760 Speaker 4: use it because AI tools are being foisted on me
397 00:20:43,920 --> 00:20:45,960 Speaker 4: non-consensually, and they're like, no, no, no, it's actually
398 00:20:46,200 --> 00:20:50,520 Speaker 4: great for you. Right? But if people truly innately felt
399 00:20:50,560 --> 00:20:53,760 Speaker 4: like they understood this technology, they would... tech leaders wouldn't
400 00:20:53,760 --> 00:20:54,720 Speaker 4: be able to do that anymore.
401 00:20:55,160 --> 00:20:57,720 Speaker 5: Right. It really does feel like they're just inflating the
402 00:20:57,800 --> 00:21:00,320 Speaker 5: importance of it so that we'll believe it, and then
403 00:21:00,400 --> 00:21:02,240 Speaker 5: that when we are forced to use it, because we
404 00:21:02,480 --> 00:21:05,440 Speaker 5: have no way of pushing back and there's actually no
405 00:21:05,560 --> 00:21:08,800 Speaker 5: regulation anymore on how it's being used, that we'll just
406 00:21:08,960 --> 00:21:11,800 Speaker 5: accept how great it is without actually knowing how great
407 00:21:11,840 --> 00:21:14,000 Speaker 5: it is. In the midst of the conversation of knowing
408 00:21:14,040 --> 00:21:17,679 Speaker 5: how inaccurate it has been in the past. Also, how
409 00:21:17,840 --> 00:21:20,080 Speaker 5: hard is it when you literally do not have a
410 00:21:20,200 --> 00:21:23,000 Speaker 5: choice? If you use what used to be Google and
411 00:21:23,240 --> 00:21:25,280 Speaker 5: what you thought would be something that would give you
412 00:21:25,280 --> 00:21:27,960 Speaker 5: trusted sources, and the top of it is literally AI
413 00:21:28,480 --> 00:21:31,240 Speaker 5: results that you did not ask for. So how hard
414 00:21:31,320 --> 00:21:33,440 Speaker 5: is it to use when you're not given a choice
415 00:21:33,840 --> 00:21:35,159 Speaker 5: as to whether or not you have to use it?
416 00:21:35,920 --> 00:21:37,200 Speaker 3: Right? And this is what I'm saying.
417 00:21:37,600 --> 00:21:39,399 Speaker 4: It goes back to what I was saying when I started
418 00:21:39,440 --> 00:21:41,320 Speaker 4: the conversation, of how we talk about it as
419 00:21:41,359 --> 00:21:43,880 Speaker 4: this inevitable technology and we all have to get on board.
420 00:21:44,280 --> 00:21:46,240 Speaker 4: Things that are good, you don't have to talk about
421 00:21:46,280 --> 00:21:48,600 Speaker 4: them that way. Things that people want, you don't... that's
422 00:21:48,640 --> 00:21:50,360 Speaker 4: not how, that's not how you have to frame them.
423 00:21:50,600 --> 00:21:53,320 Speaker 4: So Sam Wiles said on Threads, if AI is a
424 00:21:53,400 --> 00:21:56,280 Speaker 4: good thing, why are we constantly told it's inevitable? You
425 00:21:56,359 --> 00:21:58,160 Speaker 4: don't have to do that with good things. No one's
426 00:21:58,200 --> 00:22:00,520 Speaker 4: ever like, iced coffee is inevitable, so you may as
427 00:22:00,560 --> 00:22:03,399 Speaker 4: well get used to it. This is so correct, right?
428 00:22:03,480 --> 00:22:05,800 Speaker 4: This is... things that are good, you do not have
429 00:22:05,920 --> 00:22:08,080 Speaker 4: to continue over and over and over again to tell
430 00:22:08,119 --> 00:22:11,639 Speaker 4: people that they better get on board with this good thing. Like, no,
431 00:22:11,920 --> 00:22:14,280 Speaker 4: that's not how we talk about things that people like
432 00:22:14,400 --> 00:22:14,920 Speaker 4: and are good.
433 00:22:15,680 --> 00:22:15,880 Speaker 1: Right.
434 00:22:16,400 --> 00:22:19,119 Speaker 5: Also, it's not forced upon us by the government, like
435 00:22:19,240 --> 00:22:22,320 Speaker 5: we are not the iced coffee being told you have
436 00:22:22,480 --> 00:22:24,840 Speaker 5: to buy iced coffee when it gets to this temperature,
437 00:22:25,160 --> 00:22:27,520 Speaker 5: and no state can regulate whether or not you can
438 00:22:27,600 --> 00:22:29,159 Speaker 5: use it, even though it can be harmful for you.
439 00:22:29,440 --> 00:22:31,200 Speaker 5: And we have seen harm done by this.
440 00:22:31,880 --> 00:22:34,600 Speaker 3: Sorry, exactly that.
441 00:22:45,200 --> 00:22:49,200 Speaker 4: Another thing this meta analysis found was lower confidence with
442 00:22:49,320 --> 00:22:52,280 Speaker 4: it and sort of persistence, or lower levels of persistence,
443 00:22:52,320 --> 00:22:55,040 Speaker 4: and so, according to the meta analysis, women report less
444 00:22:55,119 --> 00:22:58,560 Speaker 4: confidence in prompting and are less likely to persist after
445 00:22:58,720 --> 00:23:03,520 Speaker 4: poor AI outputs. Men, however, are more likely to experiment repeatedly.
446 00:23:04,160 --> 00:23:06,920 Speaker 4: So I think this is one of those things where
447 00:23:08,040 --> 00:23:10,640 Speaker 4: essentially what is being said there is women have tried
448 00:23:10,680 --> 00:23:13,880 Speaker 4: this technology and have identified that it can be janky
449 00:23:14,320 --> 00:23:17,000 Speaker 4: and have reasonably concluded that maybe it's not worth the
450 00:23:17,040 --> 00:23:18,960 Speaker 4: pain in the butt to keep trying with it, and
451 00:23:19,320 --> 00:23:20,960 Speaker 4: men are like, I'm going to keep on trying.
452 00:23:21,280 --> 00:23:28,159 Speaker 3: I think that it's like framing women taking, like gleaning,
453 00:23:27,880 --> 00:23:31,359 Speaker 4: from their experiences of AI being full of lies, hallucinations,
454 00:23:31,440 --> 00:23:33,440 Speaker 4: actually being more work for them, which has been my
455 00:23:33,560 --> 00:23:35,520 Speaker 4: experience when I, when I've tried to use AI a
456 00:23:35,560 --> 00:23:38,359 Speaker 4: lot of times, I'm like, okay, well, I could have
457 00:23:38,400 --> 00:23:40,000 Speaker 4: done this in the time, in the time it took
458 00:23:40,040 --> 00:23:42,000 Speaker 4: me to finally get AI to do it, I
459 00:23:42,040 --> 00:23:43,479 Speaker 4: could have done this myself and done a better job
460 00:23:43,480 --> 00:23:43,639 Speaker 4: at it.
461 00:23:43,640 --> 00:23:44,800 Speaker 3: So why spend time on that to begin with?
462 00:23:45,080 --> 00:23:45,680 Speaker 3: Wasted time.
463 00:23:46,000 --> 00:23:48,400 Speaker 4: But I don't like how this is framed as women
464 00:23:48,560 --> 00:23:52,200 Speaker 4: being risk averse, when in reality, I think it's more
465 00:23:52,280 --> 00:23:56,120 Speaker 4: like women making reasonable deductions about the time they want
466 00:23:56,200 --> 00:23:58,879 Speaker 4: to spend on something that has proved itself to be ineffective,
467 00:23:59,520 --> 00:24:01,000 Speaker 4: and like, that's, that's not risk averse.
468 00:24:01,040 --> 00:24:01,760 Speaker 3: That's something else.
469 00:24:02,320 --> 00:24:05,680 Speaker 5: That's being tired of arguing with an inanimate object, which
470 00:24:05,680 --> 00:24:07,840 Speaker 5: is what I've had to do when I'm yelling back
471 00:24:08,200 --> 00:24:10,760 Speaker 5: at Google or at, you know, Siri, on any of those
472 00:24:10,840 --> 00:24:12,720 Speaker 5: days, because they did not understand what I was saying,
473 00:24:12,920 --> 00:24:14,800 Speaker 5: and I just stare at it and like, why am
474 00:24:14,800 --> 00:24:17,240 Speaker 5: I doing this with you? You're... and then being told
475 00:24:17,280 --> 00:24:20,440 Speaker 5: also they remember when you're reading them.
476 00:24:20,840 --> 00:24:23,480 Speaker 4: Listen, I told this story on my own podcast, There
477 00:24:23,480 --> 00:24:25,960 Speaker 4: Are No Girls on the Internet. After I did the
478 00:24:26,080 --> 00:24:29,399 Speaker 4: episode with you all about Larry Summers, who was formerly
479 00:24:29,520 --> 00:24:31,800 Speaker 4: on the board of OpenAI, stepping down because of
480 00:24:31,880 --> 00:24:35,600 Speaker 4: those emails that were revealed between him and convicted child
481 00:24:35,680 --> 00:24:37,000 Speaker 4: sex criminal Jeffrey Epstein.
482 00:24:37,560 --> 00:24:38,840 Speaker 3: I was trying to get...
483 00:24:38,920 --> 00:24:40,760 Speaker 4: I was like, oh, maybe ChatGPT can help me come
484 00:24:40,800 --> 00:24:42,720 Speaker 4: up with, like, metadata and all of that. So I
485 00:24:42,880 --> 00:24:45,920 Speaker 4: put in what I had to ChatGPT, and ChatGPT says,
486 00:24:46,200 --> 00:24:50,240 Speaker 4: this is not correct. There is no link between Larry
487 00:24:50,320 --> 00:24:53,720 Speaker 4: Summers and Jeffrey Epstein. And I was like, oh, really, wow,
488 00:24:53,880 --> 00:24:56,840 Speaker 4: we don't have emails linking the two? And they're like,
489 00:24:56,960 --> 00:25:00,879 Speaker 4: and then ChatGPT is like, okay, well, yeah, maybe there's emails,
490 00:25:01,280 --> 00:25:05,040 Speaker 4: but you're suggesting that something illegal happened, if you don't
491 00:25:05,040 --> 00:25:05,840 Speaker 4: know anything about that.
492 00:25:05,960 --> 00:25:08,439 Speaker 3: And I was like, I said the emails were creepy.
493 00:25:08,800 --> 00:25:11,360 Speaker 4: How would you define emails where a grown married man
494 00:25:11,840 --> 00:25:14,639 Speaker 4: is going to a convicted pedophile for advice on how
495 00:25:14,680 --> 00:25:16,000 Speaker 4: he can text with his mentee?
496 00:25:16,840 --> 00:25:19,480 Speaker 3: You wouldn't define those as creepy? We got into a
497 00:25:19,600 --> 00:25:23,600 Speaker 3: huge argument, to the point where I asked it. I
498 00:25:23,800 --> 00:25:28,399 Speaker 3: was like, cut the crap. Are you being so cagey
499 00:25:28,560 --> 00:25:30,520 Speaker 3: because Larry Summers used to be on the board of
500 00:25:30,600 --> 00:25:31,000 Speaker 3: OpenAI?
501 00:25:31,200 --> 00:25:36,399 Speaker 4: Is that what makes you so cagey, ChatGPT? ChatGPT says, let's be clear, Larry
502 00:25:36,440 --> 00:25:39,040 Speaker 4: Summers was never on the board of OpenAI. Larry
503 00:25:39,080 --> 00:25:40,680 Speaker 4: Summers has been on the board of OpenAI since
504 00:25:40,720 --> 00:25:42,240 Speaker 4: like twenty twenty three, so it's
505 00:25:42,240 --> 00:25:43,040 Speaker 3: been kind of a while.
506 00:25:43,200 --> 00:25:47,200 Speaker 4: So it's like, not only was that infuriating, think of the
507 00:25:47,400 --> 00:25:50,320 Speaker 4: work it took for me to, like, stop
508 00:25:50,400 --> 00:25:53,440 Speaker 4: my work day and have an argument, a real, you know,
509 00:25:54,119 --> 00:25:57,600 Speaker 4: argument with ChatGPT. This is the technology they're saying is
510 00:25:57,640 --> 00:25:59,320 Speaker 4: the linchpin of our entire economy right now.
511 00:25:59,400 --> 00:25:59,879 Speaker 3: I don't think so.
512 00:26:00,040 --> 00:26:03,520 Speaker 4: Oh, you literally spent emotions with a computer who is
513 00:26:03,600 --> 00:26:06,080 Speaker 4: just trying to protect their reputation, which
514 00:26:05,920 --> 00:26:07,600 Speaker 3: is weird because they're not a human.
515 00:26:08,000 --> 00:26:11,280 Speaker 4: This is what I'm saying. Like, like, I'll tell you something,
516 00:26:11,400 --> 00:26:13,959 Speaker 4: I have not asked ChatGPT a single thing since then.
517 00:26:14,240 --> 00:26:16,200 Speaker 3: That was a little experiment to see if it was
518 00:26:16,240 --> 00:26:16,680 Speaker 3: gonna work.
519 00:26:16,760 --> 00:26:19,480 Speaker 4: It went so astronomically left and I'm like, okay, well,
520 00:26:19,520 --> 00:26:21,280 Speaker 4: that's a waste of my time and I got my
521 00:26:21,320 --> 00:26:23,520 Speaker 4: blood pressure up, so it's a double bad.
522 00:26:23,760 --> 00:26:25,320 Speaker 5: I don't, I don't need this from you too.
523 00:26:26,080 --> 00:26:27,760 Speaker 4: If I'm gonna argue with someone, it's gonna be a
524 00:26:27,800 --> 00:26:30,400 Speaker 4: flesh and blood human in my life, not freaking ChatGPT.
525 00:26:31,560 --> 00:26:33,280 Speaker 2: Well, I think that's also a good point of, like,
526 00:26:33,560 --> 00:26:36,520 Speaker 2: we've, we've talked about this before several times, who makes
527 00:26:36,560 --> 00:26:38,840 Speaker 2: this technology and the fear of, like, the biases and
528 00:26:38,880 --> 00:26:41,479 Speaker 2: all that stuff in there. And I know, like, Elon
529 00:26:41,600 --> 00:26:45,080 Speaker 2: Musk and Grokipedia has had a lot of things come
530 00:26:45,200 --> 00:26:48,280 Speaker 2: up with what it is saying that are just flat
531 00:26:48,359 --> 00:26:51,159 Speaker 2: out ridiculous and, like, these flat out lies, and so
532 00:26:51,280 --> 00:26:56,680 Speaker 2: I do think that that is a concern, and I
533 00:26:57,040 --> 00:27:03,840 Speaker 2: would, I would posit that that influences, like, I
534 00:27:03,920 --> 00:27:06,320 Speaker 2: would guess that women are not getting the answers they
535 00:27:06,400 --> 00:27:09,320 Speaker 2: want more often than men are getting the answers that
536 00:27:09,440 --> 00:27:10,240 Speaker 2: they do want.
537 00:27:10,840 --> 00:27:13,600 Speaker 3: Yes. And something to remember about AI.
538 00:27:13,840 --> 00:27:16,359 Speaker 4: If there's one thing, even if you're thinking, I'm not
539 00:27:16,440 --> 00:27:18,080 Speaker 4: a techie, I'm not a computer person.
540 00:27:18,359 --> 00:27:20,920 Speaker 3: This is not my forte. If there's one thing to
541 00:27:21,000 --> 00:27:22,200 Speaker 3: remember, it's that AI...
542 00:27:22,600 --> 00:27:26,160 Speaker 4: It's easy to think of it as robot computer brains
543 00:27:26,200 --> 00:27:28,000 Speaker 4: that know everything, that it's all knowing.
544 00:27:28,119 --> 00:27:28,399 Speaker 3: It's not.
545 00:27:28,600 --> 00:27:31,159 Speaker 4: It's trained and built by all of us humans, and
546 00:27:31,240 --> 00:27:34,440 Speaker 4: so the same kind of biases that we know humans have,
547 00:27:35,040 --> 00:27:38,720 Speaker 4: AI is just trained on all of that to replicate it, right?
548 00:27:38,840 --> 00:27:41,560 Speaker 4: And so keep that in mind when you're thinking about AI.
549 00:27:41,680 --> 00:27:44,320 Speaker 4: And we have so much documentation about the ways that
550 00:27:44,400 --> 00:27:47,760 Speaker 4: AI is biased when it comes to reflecting women in workplaces.
551 00:27:47,840 --> 00:27:50,760 Speaker 4: It's more likely to reflect women as younger than they
552 00:27:50,840 --> 00:27:54,879 Speaker 4: actually are.
And so if the majority of, let's say,
553 00:27:54,920 --> 00:27:57,520 Speaker 4: for instance, if the majority of women who are engineers
554 00:27:57,680 --> 00:28:02,280 Speaker 4: are over forty, it's like a documented fact, AI will
555 00:28:02,440 --> 00:28:04,879 Speaker 4: tell people, oh, they're actually younger than they are,
556 00:28:05,000 --> 00:28:08,880 Speaker 4: because it's reflecting a bias against more mature women.
557 00:28:09,280 --> 00:28:12,400 Speaker 4: You know, there have been studies that when women ask
558 00:28:12,880 --> 00:28:16,400 Speaker 4: ChatGPT for negotiation advice, it regularly tells them to ask
559 00:28:16,480 --> 00:28:19,520 Speaker 4: for less money. Right, so all of the biases that
560 00:28:19,600 --> 00:28:22,520 Speaker 4: exist in society, AI is simply replicating those, because it
561 00:28:22,600 --> 00:28:25,160 Speaker 4: was trained on all of us. And so really keep
562 00:28:25,240 --> 00:28:28,520 Speaker 4: that in mind. Like, I can understand why women are
563 00:28:28,600 --> 00:28:32,320 Speaker 4: not keen about going to this technology to help them
564 00:28:32,520 --> 00:28:35,720 Speaker 4: understand the world around them and their workplace when we
565 00:28:35,960 --> 00:28:38,480 Speaker 4: know it has these biases.
566 00:28:38,680 --> 00:28:40,840 Speaker 5: Right, I mean, it comes to the other point,
567 00:28:40,920 --> 00:28:44,880 Speaker 5: that we already understand and know that the data and research
568 00:28:45,040 --> 00:28:48,160 Speaker 5: are based on men and there's not enough based on
569 00:28:48,280 --> 00:28:51,240 Speaker 5: women or those marginalized communities at all. Like, we just
570 00:28:51,360 --> 00:28:53,400 Speaker 5: got to the point of realizing, hey, we need more,
571 00:28:53,760 --> 00:28:56,280 Speaker 5: so more people have been doing it, but all of
572 00:28:56,360 --> 00:29:00,240 Speaker 5: that information is not there to provide a background for ChatGPT
573 00:29:00,320 --> 00:29:03,000 Speaker 5: to pick up, or whatever AI company. And that's
574 00:29:03,040 --> 00:29:06,320 Speaker 5: really concerning, that we're acting like the information we know
575 00:29:06,400 --> 00:29:07,840 Speaker 5: now is all we need to know when we know
576 00:29:07,920 --> 00:29:09,280 Speaker 5: that's not true, exactly.
577 00:29:09,720 --> 00:29:11,040 Speaker 3: That is such a good way to put it.
578 00:29:12,440 --> 00:29:16,800 Speaker 4: And another concern that women report about using AI, why
579 00:29:16,840 --> 00:29:18,440 Speaker 4: they're kind of skeptical about it, is that women have
580 00:29:18,520 --> 00:29:21,920 Speaker 4: stronger, what the meta analysis called, kind of ethical concerns.
581 00:29:22,040 --> 00:29:25,640 Speaker 4: So women are more likely to view using AI, especially
582 00:29:25,720 --> 00:29:29,479 Speaker 4: in professional settings or education settings, as unethical or cheating.
583 00:29:29,920 --> 00:29:33,480 Speaker 3: So I don't disagree with this. This totally makes sense
584 00:29:33,520 --> 00:29:34,560 Speaker 3: to me anecdotally.
585 00:29:34,960 --> 00:29:38,640 Speaker 4: However, I think it's more complex than that, because I
586 00:29:38,840 --> 00:29:41,440 Speaker 4: don't think it's just that women see AI as cheating.
587 00:29:42,240 --> 00:29:44,719 Speaker 4: It is that, but I also think that in workplaces
588 00:29:44,800 --> 00:29:48,400 Speaker 4: and in educational settings, women are more likely to be scrutinized
589 00:29:47,840 --> 00:29:50,920 Speaker 3: than our male counterparts, right, and that's just the way
590 00:29:51,000 --> 00:29:52,959 Speaker 4: that it is. Like, women aren't wrong for being like, oh,
591 00:29:53,000 --> 00:29:55,720 Speaker 4: I feel like everybody's checking my work extra hard and
592 00:29:55,800 --> 00:29:59,160 Speaker 4: harder than they're checking Joe's work or whatever. The BBC
593 00:29:59,280 --> 00:30:02,239 Speaker 4: spoke to technologist Lee Chambers, who said women are more
594 00:30:02,400 --> 00:30:04,800 Speaker 4: likely to be accused of not being competent, so they
595 00:30:04,880 --> 00:30:07,400 Speaker 4: have to emphasize their credentials more or demonstrate their subject
596 00:30:07,440 --> 00:30:09,800 Speaker 4: matter expertise in a particular field. There could be this
597 00:30:09,920 --> 00:30:12,080 Speaker 4: feeling that if people know that you, as a woman,
598 00:30:12,240 --> 00:30:14,560 Speaker 4: use AI, it is suggesting that you might not be
599 00:30:14,640 --> 00:30:17,680 Speaker 4: as qualified as you actually are. Women are already discredited
600 00:30:17,720 --> 00:30:19,800 Speaker 4: and have their ideas taken by men and passed off
601 00:30:19,880 --> 00:30:22,520 Speaker 4: as their own, so having people knowing that you use
602 00:30:22,560 --> 00:30:24,920 Speaker 4: AI might also play into this narrative that you're not
603 00:30:25,040 --> 00:30:28,280 Speaker 4: qualified enough. It's just another thing debasing your skills, your
604 00:30:28,320 --> 00:30:31,920 Speaker 4: competence and your value. Now, that makes total sense to me.
605 00:30:32,000 --> 00:30:33,280 Speaker 3: And I am in
606 00:30:35,040 --> 00:30:38,440 Speaker 4: professional spaces of all kinds, and in some of these
607 00:30:38,440 --> 00:30:43,160 Speaker 4: professional spaces, I've been around very accomplished men in tech
608 00:30:43,560 --> 00:30:47,080 Speaker 4: who will not blink an eye at submitting, like, an
609 00:30:47,120 --> 00:30:49,160 Speaker 4: op ed or something, or a piece of writing that
610 00:30:49,280 --> 00:30:51,880 Speaker 4: was clearly written by AI. And it's just like, and
611 00:30:52,040 --> 00:30:57,400 Speaker 4: I would never. Like, the way... I would be so
612 00:30:57,640 --> 00:31:01,480 Speaker 4: embarrassed to have anybody be like, oh, did you submit this?
613 00:31:01,640 --> 00:31:04,280 Speaker 4: And it was generated? I can tell by whatever, whatever.
614 00:31:04,680 --> 00:31:06,280 Speaker 4: I would crawl into a hole and die. I would
615 00:31:06,280 --> 00:31:08,640 Speaker 4: be so embarrassed. And in that way, I can confirm
616 00:31:08,720 --> 00:31:11,760 Speaker 4: that there are a lot of men in high up
617 00:31:13,040 --> 00:31:16,160 Speaker 4: positions who do not feel that level of scrutiny. And
618 00:31:16,200 --> 00:31:19,160 Speaker 4: it goes back to, like, marginalized people in these settings.
619 00:31:19,480 --> 00:31:22,840 Speaker 4: We're so used to carrying that, the extra eyes,
620 00:31:23,320 --> 00:31:26,480 Speaker 4: you know, extra scrutiny of why we're here at all,
621 00:31:26,760 --> 00:31:30,240 Speaker 4: our work, our value, our competence. Why would you want
622 00:31:30,280 --> 00:31:33,920 Speaker 4: to inject another reason to give somebody to continue
623 00:31:33,560 --> 00:31:35,920 Speaker 3: to do that when you're already facing that on multiple levels?
624 00:31:37,440 --> 00:31:40,360 Speaker 2: I have to bring this back to fan fiction, you 625 00:31:40,480 --> 00:31:45,840 Speaker 2: know I do. So there's this big, like, concern 626 00:31:46,000 --> 00:31:50,520 Speaker 2: in the fan fiction space of, like, using AI, and 627 00:31:50,880 --> 00:31:55,440 Speaker 2: it's become such a thing, and as discussed, a lot 628 00:31:55,600 --> 00:31:58,120 Speaker 2: of fan fiction is mostly written by women or non 629 00:31:58,200 --> 00:32:01,120 Speaker 2: binary people. People will go out of their way to say, 630 00:32:01,280 --> 00:32:03,959 Speaker 2: like in the tags, I use M dashes, but it's 631 00:32:04,000 --> 00:32:06,200 Speaker 2: not AI. They go way out of their way to 632 00:32:06,240 --> 00:32:09,760 Speaker 2: be like, I swear, it's not AI. And it's just 633 00:32:10,800 --> 00:32:13,040 Speaker 2: it's so clear to me that people don't want you 634 00:32:13,160 --> 00:32:14,960 Speaker 2: to think like, oh, this is not worth it, like I 635 00:32:15,000 --> 00:32:19,040 Speaker 2: didn't even put in the time to do this, whereas, yeah, 636 00:32:19,080 --> 00:32:22,920 Speaker 2: I don't think men would see it 637 00:32:23,200 --> 00:32:26,200 Speaker 2: necessarily in that way. They would see it as like, oh yeah, 638 00:32:26,200 --> 00:32:28,440 Speaker 2: it just saved me some time. But meanwhile, and now 639 00:32:28,480 --> 00:32:30,440 Speaker 2: I know that AI uses a lot of M dashes 640 00:32:30,560 --> 00:32:31,920 Speaker 2: because of fan fiction. 641 00:32:33,000 --> 00:32:37,040 Speaker 4: Yeah, that doesn't surprise me at all, because I think 642 00:32:37,560 --> 00:32:41,360 Speaker 4: the fan fiction community is definitely a community that values 643 00:32:41,880 --> 00:32:44,200 Speaker 4: human creativity. Like, you wouldn't be in the fan fiction 644 00:32:44,280 --> 00:32:46,440 Speaker 4: community if you didn't value human creativity, and it 645 00:32:46,520 --> 00:32:48,239 Speaker 4: doesn't surprise me that that's a community that is 646 00:32:48,280 --> 00:32:49,560 Speaker 4: also full of women. 647 00:32:51,000 --> 00:32:54,280 Speaker 2: Yes, yes. And then people always put this thing too 648 00:32:54,320 --> 00:32:56,160 Speaker 2: at the top where they're like, please don't feed my 649 00:32:56,240 --> 00:32:57,960 Speaker 2: work to AI, and I'm like, that's just not going 650 00:32:58,040 --> 00:32:59,920 Speaker 2: to help you, but I appreciate the effort. 651 00:33:01,240 --> 00:33:03,680 Speaker 5: Well, that's the other part to that, is when you 652 00:33:03,800 --> 00:33:07,320 Speaker 5: start thinking about using ChatGPT or any of these situations, 653 00:33:07,360 --> 00:33:09,520 Speaker 5: you don't know who you're taking from, and there is 654 00:33:09,600 --> 00:33:12,000 Speaker 5: this level of guilt for me on the end of, 655 00:33:12,080 --> 00:33:14,960 Speaker 5: like, not being the creative. But if I'm researching things 656 00:33:15,040 --> 00:33:17,920 Speaker 5: and I'm just basing it on one small bit of 657 00:33:17,960 --> 00:33:21,880 Speaker 5: information that's, you know, from ChatGPT or Gemini or whatever, whatnot, 658 00:33:22,080 --> 00:33:24,680 Speaker 5: which is constant, and I can't stand it.
It feels 659 00:33:24,800 --> 00:33:28,760 Speaker 5: unethical in that manner of, like, we already know artists 660 00:33:29,000 --> 00:33:32,400 Speaker 5: are really, really on edge because their work is being 661 00:33:32,480 --> 00:33:36,240 Speaker 5: constantly stolen. Creators who are writers, who are part of this, 662 00:33:36,480 --> 00:33:38,840 Speaker 5: researchers who are doing lots of work to get their 663 00:33:38,960 --> 00:33:41,480 Speaker 5: educations and to be in this field, like, using that 664 00:33:41,600 --> 00:33:45,480 Speaker 5: information and, like, oftentimes it's not cited. It just becomes 665 00:33:45,600 --> 00:33:48,600 Speaker 5: one big blob of information. It feels so gross to 666 00:33:48,800 --> 00:33:50,120 Speaker 5: see that play out. 667 00:33:50,320 --> 00:33:52,120 Speaker 3: Yeah, that's such a good point. 668 00:33:52,240 --> 00:33:55,760 Speaker 4: There was this, like, micro controversy online a couple of 669 00:33:55,800 --> 00:33:59,440 Speaker 4: weeks ago about this literary festival, the Black Romance Literary Festival, 670 00:33:59,480 --> 00:34:02,760 Speaker 4: where the question was, do the people who run this festival, 671 00:34:02,760 --> 00:34:06,760 Speaker 4: the organizers, will they accept people who use AI in 672 00:34:06,840 --> 00:34:09,399 Speaker 3: their work at this organization? Will those people be allowed 673 00:34:09,440 --> 00:34:10,320 Speaker 3: to be included in panels? 674 00:34:10,360 --> 00:34:12,239 Speaker 4: And it sounded like they were really talking about 675 00:34:12,640 --> 00:34:16,160 Speaker 4: illustrations, like cover art. And so the organization gave what, 676 00:34:16,360 --> 00:34:18,000 Speaker 4: to a lot of people in the community, felt like 677 00:34:18,960 --> 00:34:21,759 Speaker 4: not a great answer. Just sort of, in what they wrote, 678 00:34:21,800 --> 00:34:24,279 Speaker 4: the organizers said, we don't use AI in 679 00:34:24,400 --> 00:34:26,360 Speaker 4: our work, but we're not turning away people who do 680 00:34:26,600 --> 00:34:30,960 Speaker 4: use AI. We want to provide resources and education to 681 00:34:31,040 --> 00:34:33,239 Speaker 4: those folks, but we don't want to exclude them. One 682 00:34:33,280 --> 00:34:35,800 Speaker 4: of my listeners made a very good point that it 683 00:34:35,920 --> 00:34:38,919 Speaker 4: seemed like that was an attempt at a harm 684 00:34:39,000 --> 00:34:41,800 Speaker 4: reduction kind of position, but maybe they didn't say it 685 00:34:41,880 --> 00:34:43,759 Speaker 4: like that, and perhaps they should have clarified that. 686 00:34:43,920 --> 00:34:49,120 Speaker 4: But you know, I think at a time when artists, 687 00:34:49,320 --> 00:34:53,960 Speaker 4: particularly marginalized artists, are really having a hard time, like anybody 688 00:34:54,000 --> 00:34:56,719 Speaker 4: who tries to make money from making a thing 689 00:34:56,880 --> 00:34:58,840 Speaker 4: is having a hard time of it right now. And 690 00:34:59,840 --> 00:35:02,680 Speaker 4: I think, as you said, Sam, we know 691 00:35:03,680 --> 00:35:07,919 Speaker 4: when you ask AI to generate an image, it's not generating 692 00:35:07,960 --> 00:35:10,279 Speaker 4: a new piece, it's just taking 693 00:35:10,080 --> 00:35:11,200 Speaker 3: from what's out there, right.
694 00:35:11,520 --> 00:35:14,400 Speaker 4: I remember there was a time where people were using 695 00:35:15,239 --> 00:35:18,360 Speaker 4: an AI tool to make those futuristic selfies. Half of 696 00:35:18,400 --> 00:35:22,600 Speaker 4: them had watermarks or logos in them somewhere. That's how 697 00:35:22,680 --> 00:35:25,640 Speaker 4: much they were just taking from other artists' copyrighted work. 698 00:35:25,920 --> 00:35:26,759 Speaker 3: So I don't think that 699 00:35:26,960 --> 00:35:30,560 Speaker 4: the creatives who are saying, hey, we need this to 700 00:35:30,640 --> 00:35:32,320 Speaker 4: be clarified, hey, we need to put a line in 701 00:35:32,400 --> 00:35:34,160 Speaker 4: the sand of what we're going to do when 702 00:35:34,200 --> 00:35:37,879 Speaker 4: it comes to taking other artists' work in this way, 703 00:35:38,040 --> 00:35:40,520 Speaker 4: I don't think they're wrong for being anxious about that 704 00:35:40,640 --> 00:35:42,920 Speaker 4: and expecting and deserving some answers around it, 705 00:35:43,000 --> 00:35:56,560 Speaker 3: frankly. So, when it comes to kind of the 706 00:35:56,600 --> 00:35:59,719 Speaker 4: reasons why women might not be adopting AI like their 707 00:35:59,760 --> 00:36:03,600 Speaker 4: male counterparts, I did find this meta analysis pretty useful 708 00:36:03,680 --> 00:36:06,080 Speaker 4: in, like, giving us the scope of the issue, 709 00:36:06,239 --> 00:36:08,640 Speaker 4: giving us the scope of the gender gap, and some 710 00:36:08,800 --> 00:36:11,400 Speaker 4: reasons as to why that might be. However, I just 711 00:36:11,480 --> 00:36:13,840 Speaker 4: don't think that some of the answers really spoke to 712 00:36:13,920 --> 00:36:16,080 Speaker 4: the nuance that I see in that experience as a 713 00:36:16,120 --> 00:36:18,200 Speaker 4: woman that's a little bit skeptical of AI. So I 714 00:36:18,280 --> 00:36:21,080 Speaker 4: found this piece in the Stanford Social Innovation Review by 715 00:36:21,160 --> 00:36:23,719 Speaker 4: Mara Bolis, who is the founder of First Prompt and 716 00:36:23,719 --> 00:36:26,520 Speaker 4: a fellow at the Harvard Kennedy School, called The AI 717 00:36:26,719 --> 00:36:30,279 Speaker 4: Gender Gap Paradox, and I think that this piece does 718 00:36:30,480 --> 00:36:32,400 Speaker 4: a really good job of getting at some of the 719 00:36:32,480 --> 00:36:35,640 Speaker 4: nuance that's not just women aren't smart enough to get 720 00:36:35,719 --> 00:36:38,440 Speaker 4: AI or whatever, right. And so one bit that she 721 00:36:38,520 --> 00:36:41,600 Speaker 4: pointed out is that women are already doing a lot 722 00:36:41,840 --> 00:36:44,839 Speaker 4: of work. She writes, more than half of professionals say 723 00:36:44,880 --> 00:36:47,520 Speaker 4: that learning AI feels like a second job, which for 724 00:36:47,640 --> 00:36:50,720 Speaker 4: most women is actually a third job when you consider 725 00:36:50,800 --> 00:36:54,880 Speaker 4: the continued disparities in time spent on childcare and housework. 726 00:36:55,640 --> 00:36:58,800 Speaker 4: And so that I thought was such a good point, 727 00:36:58,920 --> 00:37:02,120 Speaker 4: right, that doing your current job or your current educational 728 00:37:02,160 --> 00:37:04,839 Speaker 4: workload or whatever is already a lot of work.
If 729 00:37:04,840 --> 00:37:08,440 Speaker 4: you're a woman, you're probably taking on more of the 730 00:37:08,520 --> 00:37:10,719 Speaker 4: labor at home, or of the emotional labor, or of 731 00:37:10,800 --> 00:37:13,520 Speaker 4: the care work, all of that. And on top of it, 732 00:37:13,719 --> 00:37:15,440 Speaker 4: now people are telling you that if you don't take 733 00:37:15,480 --> 00:37:18,120 Speaker 4: the time to learn how to integrate AI into that work, 734 00:37:18,480 --> 00:37:20,439 Speaker 4: you don't really care about your career and you're holding 735 00:37:20,480 --> 00:37:23,320 Speaker 4: yourself back. I just, yeah, it's like, that 736 00:37:23,480 --> 00:37:25,239 Speaker 4: has to be part of the conversation if we're going 737 00:37:25,280 --> 00:37:30,040 Speaker 4: to actually address the gender gap in AI usage. Yes. Yeah. 738 00:37:30,360 --> 00:37:32,439 Speaker 4: And this piece I think also just does a great 739 00:37:32,520 --> 00:37:37,759 Speaker 4: job of reframing fear or knowledge gaps into what I 740 00:37:37,840 --> 00:37:41,480 Speaker 4: suspect they actually are, which is the ability to competently 741 00:37:41,719 --> 00:37:44,960 Speaker 4: analyze risk. Women aren't looking at AI and saying, I'm 742 00:37:45,000 --> 00:37:46,840 Speaker 4: afraid of it, or, I can never learn it. 743 00:37:47,080 --> 00:37:50,399 Speaker 4: I think what they're actually saying is, I have analyzed 744 00:37:50,640 --> 00:37:53,600 Speaker 4: the risk that using it might bring into my work, 745 00:37:53,680 --> 00:37:56,919 Speaker 4: and I've made a decision based on that that I'm 746 00:37:56,960 --> 00:37:59,880 Speaker 4: not gonna be using it like that. That piece in 747 00:38:00,080 --> 00:38:03,719 Speaker 4: the Stanford Social Innovation Review looked at studies from Deloitte and Pew. 748 00:38:04,480 --> 00:38:07,279 Speaker 4: Both of them showed that women predict AI will bring 749 00:38:07,440 --> 00:38:11,840 Speaker 4: less benefit and do more harm across personal, professional and 750 00:38:11,920 --> 00:38:15,680 Speaker 4: public life. Men, however, tend to be more optimistic, confident, 751 00:38:15,960 --> 00:38:18,080 Speaker 4: and self assured in their competence. 752 00:38:18,840 --> 00:38:19,920 Speaker 3: Doesn't that really say it all? 753 00:38:20,960 --> 00:38:22,960 Speaker 5: Well, there's so much. Like, when I'm thinking about the 754 00:38:23,000 --> 00:38:26,239 Speaker 5: amount that we research and the amount that we try 755 00:38:26,360 --> 00:38:29,360 Speaker 5: to make sure that we are noting who we're, you know, 756 00:38:29,480 --> 00:38:32,080 Speaker 5: referencing and giving credit to whomever, so we're not 757 00:38:32,280 --> 00:38:37,120 Speaker 5: doing any violations of copyright law, any of those things. 758 00:38:37,440 --> 00:38:40,480 Speaker 5: And again, we know that ChatGPT or any of 759 00:38:40,560 --> 00:38:41,600 Speaker 5: the AI don't really care. 760 00:38:42,000 --> 00:38:44,120 Speaker 3: They don't care where the sources are coming from. 761 00:38:44,560 --> 00:38:48,360 Speaker 5: And so as people who know, A, that it is 762 00:38:48,480 --> 00:38:51,160 Speaker 5: not one hundred percent foolproof, and we are trying to 763 00:38:51,239 --> 00:38:56,080 Speaker 5: be doing our due diligence and give correct information.
Yeah, 764 00:38:56,239 --> 00:38:58,640 Speaker 5: it is more work to go back to whatever we 765 00:38:58,800 --> 00:39:03,160 Speaker 5: thought we could use with AI, to actually verify 766 00:39:03,480 --> 00:39:05,880 Speaker 5: where this information is coming from. Like, it's kind of 767 00:39:05,880 --> 00:39:08,600 Speaker 5: funny, when we talk about having research help and having 768 00:39:08,719 --> 00:39:11,359 Speaker 5: someone else do the research, it feels like more work, 769 00:39:11,360 --> 00:39:13,040 Speaker 5: because I'm like, I don't know if this is true. 770 00:39:13,040 --> 00:39:14,640 Speaker 3: I gotta go read the whole article again, you know. 771 00:39:14,640 --> 00:39:17,080 Speaker 5: What, I'm like, doing more work. And it feels the 772 00:39:17,160 --> 00:39:21,080 Speaker 5: same way in this type of conversation with AI, because 773 00:39:21,120 --> 00:39:24,600 Speaker 5: I don't completely trust, once again, and, like, analyzing the 774 00:39:24,680 --> 00:39:26,200 Speaker 5: risk, that this is actually true. 775 00:39:26,920 --> 00:39:29,320 Speaker 3: This might sound way off base, but when I was 776 00:39:29,440 --> 00:39:32,680 Speaker 3: reading some of the research about why women don't use 777 00:39:32,800 --> 00:39:36,680 Speaker 3: AI as much, I couldn't help thinking about what a 778 00:39:36,760 --> 00:39:39,600 Speaker 3: lot of heterosexual women who are married to men or 779 00:39:39,640 --> 00:39:42,759 Speaker 3: in relationships with men report, where if you ask a 780 00:39:42,840 --> 00:39:45,600 Speaker 3: man in your life to do something, what you're actually 781 00:39:45,800 --> 00:39:48,440 Speaker 3: creating is like work for you later, when you have 782 00:39:48,560 --> 00:39:52,479 Speaker 3: to answer the follow up or correct what he's done wrong. 783 00:39:53,040 --> 00:39:56,240 Speaker 3: I hate that for us. Women who find themselves 784 00:39:56,360 --> 00:39:59,040 Speaker 3: in close relation with men, I hate that for us, 785 00:40:00,160 --> 00:40:04,279 Speaker 3: but like, that is what we get, we're 786 00:40:04,360 --> 00:40:06,880 Speaker 3: served that all the time. We would be lying, like, I 787 00:40:06,880 --> 00:40:09,000 Speaker 4: think every woman listening has had that experience, and it's 788 00:40:09,360 --> 00:40:13,279 Speaker 4: so frustrating. But I think that it really mirrors to 789 00:40:13,400 --> 00:40:16,000 Speaker 4: me what women are talking about when they talk about 790 00:40:16,120 --> 00:40:18,120 Speaker 4: how AI is used in their work. When you have 791 00:40:18,200 --> 00:40:21,719 Speaker 4: to go and correct the mistake, ask again, give the 792 00:40:21,800 --> 00:40:24,520 Speaker 4: ball, whatever it's gonna be, that's just adding more and 793 00:40:24,640 --> 00:40:27,279 Speaker 4: more to your emotional load. It would have been easier for you 794 00:40:27,360 --> 00:40:29,160 Speaker 4: to do it yourself in the first place. Right. 795 00:40:29,760 --> 00:40:32,000 Speaker 5: Also, it does feel like if we get something wrong, 796 00:40:32,200 --> 00:40:34,600 Speaker 5: as women or marginalized people, we're going to be penalized 797 00:40:34,719 --> 00:40:37,680 Speaker 5: a lot more than men are. Men are given 798 00:40:37,760 --> 00:40:40,640 Speaker 5: excuses and are like, eh, you're fine, it's a one 799 00:40:40,719 --> 00:40:42,759 Speaker 5: time thing, or they feel like they can brush it off.
800 00:40:42,760 --> 00:40:45,000 Speaker 5: Whether it's just maybe they feel like they can be 801 00:40:45,080 --> 00:40:47,040 Speaker 5: that way, whereas women have the anxiety of like, no, 802 00:40:47,120 --> 00:40:50,080 Speaker 5: we're going to get the worst punishment ever as marginalized people. 803 00:40:50,120 --> 00:40:52,840 Speaker 5: So maybe that's also that level of confidence that we 804 00:40:52,960 --> 00:40:54,760 Speaker 5: don't have, which is delusional. 805 00:40:55,960 --> 00:40:59,600 Speaker 4: Yes. And I think this speaks to something else that 806 00:40:59,640 --> 00:41:01,480 Speaker 4: I saw in the Stanford piece and I just thought was 807 00:41:01,560 --> 00:41:04,560 Speaker 4: so interesting, which is that women self report not feeling 808 00:41:04,640 --> 00:41:08,400 Speaker 4: confident with how AI works. We might be offered AI 809 00:41:08,760 --> 00:41:11,680 Speaker 4: skills training, but what we're not really offered is a 810 00:41:11,760 --> 00:41:14,920 Speaker 4: chance at, like, actual knowledge of this technology, all the 811 00:41:15,000 --> 00:41:18,120 Speaker 4: issues that go into it, like an actual fundamental understanding 812 00:41:18,200 --> 00:41:20,440 Speaker 4: and crash course on how it works. The piece cites 813 00:41:20,480 --> 00:41:22,520 Speaker 4: a Federal Reserve Bank of New York survey that shows 814 00:41:22,520 --> 00:41:25,319 Speaker 4: that when women say they want generative AI training, they 815 00:41:25,360 --> 00:41:28,480 Speaker 4: are not just looking for skills. They're signaling awareness of 816 00:41:28,520 --> 00:41:31,480 Speaker 4: this technology's opacity and their unwillingness to trust a system 817 00:41:31,480 --> 00:41:34,399 Speaker 4: that they don't fully understand. A systematic review of gender 818 00:41:34,480 --> 00:41:37,360 Speaker 4: and AI adoption found that women consistently cite things like 819 00:41:37,480 --> 00:41:40,760 Speaker 4: lack of transparency and the opaque nature of AI tools 820 00:41:40,800 --> 00:41:41,960 Speaker 4: as barriers to trust. 821 00:41:42,280 --> 00:41:45,040 Speaker 3: And so it's exactly what you just laid out, Sam. 822 00:41:45,160 --> 00:41:46,960 Speaker 3: And again, that's 823 00:41:46,840 --> 00:41:50,160 Speaker 4: not the same thing as not feeling smart enough or 824 00:41:50,280 --> 00:41:53,520 Speaker 4: competent enough to understand something. It's reasonable to say 825 00:41:53,800 --> 00:41:56,839 Speaker 4: that people that make AI tools have intentionally designed them, 826 00:41:56,920 --> 00:41:59,399 Speaker 4: and they intentionally talk about them, with such a lack 827 00:41:59,440 --> 00:42:02,480 Speaker 4: of transparency that I don't trust it. That's 828 00:42:02,680 --> 00:42:06,640 Speaker 4: that is a really reasonable position. That's not unreasonable. I 829 00:42:06,640 --> 00:42:10,080 Speaker 4: don't like the framing that this, like, totally reasonable way 830 00:42:10,200 --> 00:42:14,200 Speaker 4: to be is, you know, bad in some way, or 831 00:42:14,239 --> 00:42:16,960 Speaker 4: that women are shooting themselves in the foot economically by 832 00:42:17,320 --> 00:42:20,239 Speaker 4: showing this, like, reasonable lack of trust. 833 00:42:20,480 --> 00:42:23,080 Speaker 5: Right. And you know, another thing to that is, again, 834 00:42:23,120 --> 00:42:25,520 Speaker 5: I keep coming back to how we've seen the mistakes, 835 00:42:25,640 --> 00:42:27,960 Speaker 5: and maybe that's what it is.
This has been so rushed, 836 00:42:28,280 --> 00:42:30,560 Speaker 5: but the constant times that, like, we see things that 837 00:42:30,640 --> 00:42:33,160 Speaker 5: are not good are usually really bad for women. Like, 838 00:42:33,200 --> 00:42:36,239 Speaker 5: I keep thinking about the AI videos, the deep 839 00:42:36,320 --> 00:42:38,520 Speaker 5: fakes that happened with that Twitch streamer that we 840 00:42:38,600 --> 00:42:40,439 Speaker 5: talked about. Was it two years ago? I don't 841 00:42:40,440 --> 00:42:42,399 Speaker 5: even remember. We had, like, because we were really talking 842 00:42:42,480 --> 00:42:46,400 Speaker 5: about the fact that things like this are super concerning because 843 00:42:46,440 --> 00:42:49,919 Speaker 5: it does go after women for, like, just being there. 844 00:42:50,680 --> 00:42:52,959 Speaker 4: Yeah, I mean, that speaks to what I was talking 845 00:42:53,000 --> 00:42:54,640 Speaker 4: about earlier, when we were talking about the way that 846 00:42:54,680 --> 00:42:58,160 Speaker 4: Sam Altman was presenting and talking 847 00:42:58,200 --> 00:43:02,000 Speaker 4: about the chatbot they were creating, Sky. There are 848 00:43:02,200 --> 00:43:08,600 Speaker 4: so many examples of the vibes just being sus for 849 00:43:08,760 --> 00:43:11,920 Speaker 4: women in these spaces, right. You know, I make a 850 00:43:12,040 --> 00:43:17,319 Speaker 4: podcast about the intersection of gender and technology, and oftentimes 851 00:43:17,360 --> 00:43:20,120 Speaker 4: when we talk about AI, we're talking about things like 852 00:43:20,520 --> 00:43:23,960 Speaker 4: non consensual sexualized deep fakes. We're talking about things like 853 00:43:24,120 --> 00:43:27,400 Speaker 4: Sam Altman, you know, saying people should be able to 854 00:43:27,440 --> 00:43:30,239 Speaker 4: have the kind of relationship that Joaquin Phoenix has with 855 00:43:30,880 --> 00:43:33,880 Speaker 4: the chatbot in the movie Her. We're talking about things 856 00:43:34,000 --> 00:43:40,040 Speaker 4: like Elon Musk creating an anime teenager kind of sex 857 00:43:40,239 --> 00:43:45,799 Speaker 4: fantasy chatbot. We're talking about men who build technology, who 858 00:43:46,520 --> 00:43:50,640 Speaker 4: fill their teams with other men. They lack gender diversity 859 00:43:50,680 --> 00:43:52,719 Speaker 4: and racial diversity or any kind of real inclusion, and 860 00:43:52,880 --> 00:43:57,160 Speaker 4: then they talk about women in these, like, misogynistic ways. 861 00:43:57,400 --> 00:44:00,279 Speaker 4: It's not surprising to me that women are being slower 862 00:44:00,400 --> 00:44:02,719 Speaker 4: to adopt this technology that is made in such a 863 00:44:03,480 --> 00:44:07,160 Speaker 4: misogynistic soup. And as Mara Bolis put it in 864 00:44:07,239 --> 00:44:11,279 Speaker 4: that Stanford Review piece, as in financial systems, women are 865 00:44:11,280 --> 00:44:14,680 Speaker 4: attuned to the weaknesses in generative AI systems that designers 866 00:44:14,880 --> 00:44:18,640 Speaker 4: didn't notice or prioritize, things like bias, privacy risks, unreliable 867 00:44:18,680 --> 00:44:22,080 Speaker 4: outputs, before putting their products out into the world, which 868 00:44:22,120 --> 00:44:24,200 Speaker 4: speaks to that rushed quality that you were just 869 00:44:24,440 --> 00:44:28,560 Speaker 4: clocking earlier.
Some of the industry's more misogynistic offerings, see 870 00:44:28,800 --> 00:44:33,120 Speaker 4: Grok's Ani fantasy chatbot, or disturbing policies, see Facebook's leaked 871 00:44:33,200 --> 00:44:35,640 Speaker 4: policies on children and illicit content, are enough 872 00:44:35,680 --> 00:44:39,000 Speaker 4: to send most users into a catatonic depressive spiral. But 873 00:44:39,160 --> 00:44:43,239 Speaker 4: for women, beyond being offensive, such outputs are evidence of 874 00:44:43,360 --> 00:44:47,040 Speaker 4: what gets built when development teams lack gender diversity. When 875 00:44:47,080 --> 00:44:49,680 Speaker 4: women engage with systems that they've been largely left out 876 00:44:49,680 --> 00:44:53,840 Speaker 4: of creating, the products can feel foreign, awkward, or even hostile. 877 00:44:54,200 --> 00:44:56,960 Speaker 4: And so I think that is absolutely what's going on here. 878 00:44:57,560 --> 00:45:00,560 Speaker 4: The vibes around how this technology is made, who makes it, 879 00:45:00,840 --> 00:45:02,960 Speaker 4: who is in the room, who has the power 880 00:45:03,040 --> 00:45:05,160 Speaker 4: over it, and how that power is being used 881 00:45:05,160 --> 00:45:08,000 Speaker 4: and wielded to shape the world. All of that sucks 882 00:45:08,080 --> 00:45:11,880 Speaker 4: for marginalized people. So yeah, surprise, surprise, those same marginalized 883 00:45:11,880 --> 00:45:15,439 Speaker 4: people are saying no, thank you, y'all can keep it, yep. 884 00:45:15,960 --> 00:45:20,200 Speaker 2: And it's not like we don't have historical, societal, systemic 885 00:45:20,320 --> 00:45:22,200 Speaker 2: evidence of this going wrong for us. 886 00:45:22,280 --> 00:45:22,560 Speaker 1: I could. 887 00:45:22,840 --> 00:45:26,239 Speaker 2: I love how men are like, confident, optimistic, oh, it's 888 00:45:26,280 --> 00:45:29,040 Speaker 2: gonna be great. I'm like, because historically it has been for 889 00:45:29,120 --> 00:45:34,800 Speaker 3: you. Yeah, guess what. Not so much over here, not 890 00:45:35,080 --> 00:45:38,279 Speaker 3: so much at all. And I wanted to end on this. 891 00:45:39,080 --> 00:45:40,800 Speaker 4: When I was researching this, I found this post on 892 00:45:40,840 --> 00:45:42,800 Speaker 4: Reddit that I thought really did a great job of 893 00:45:42,840 --> 00:45:44,800 Speaker 4: summarizing my thoughts. So there was a post in the 894 00:45:44,920 --> 00:45:48,520 Speaker 4: Ask Feminists subreddit, and the post asks why are women 895 00:45:48,800 --> 00:45:52,040 Speaker 4: using generative AI less than men, and the redditor Trooper 896 00:45:52,120 --> 00:45:53,000 Speaker 4: SJP answers: 897 00:45:53,680 --> 00:45:54,920 Speaker 3: I'm a university professor. 898 00:45:55,320 --> 00:45:58,440 Speaker 4: If female students are using ChatGPT less than male students, 899 00:45:58,640 --> 00:46:01,399 Speaker 4: it is the male students we should be worried about. There 900 00:46:01,440 --> 00:46:03,680 Speaker 4: have been studies showing that there is a negative cognitive 901 00:46:03,719 --> 00:46:07,000 Speaker 4: impact of using generative AI on learners. This includes loss 902 00:46:07,040 --> 00:46:10,040 Speaker 4: in ability to retain information, loss of critical thinking skills, 903 00:46:10,320 --> 00:46:13,680 Speaker 4: loss of empathy, degradation of writing and math skills, degradation 904 00:46:13,800 --> 00:46:17,960 Speaker 4: of problem solving skills, and on and on.
Furthermore, generative 905 00:46:18,000 --> 00:46:20,840 Speaker 4: AI is bad for the environment, contributes to environmental racism, 906 00:46:21,200 --> 00:46:23,840 Speaker 4: is built on stolen data by corporations who will zealously 907 00:46:23,920 --> 00:46:27,360 Speaker 4: guard their own intellectual property while willingly violating yours, and 908 00:46:27,520 --> 00:46:31,520 Speaker 4: capitalist excitement in replacing entry level workers with AI is 909 00:46:31,600 --> 00:46:34,759 Speaker 4: having an immediate negative impact on young job seekers aged 910 00:46:34,760 --> 00:46:37,160 Speaker 4: twenty two to twenty seven, but will have a much 911 00:46:37,320 --> 00:46:40,120 Speaker 4: more profound impact on all of us when we lose 912 00:46:40,160 --> 00:46:42,640 Speaker 4: a generation of entry level workers who won't get the 913 00:46:42,680 --> 00:46:46,960 Speaker 4: experience needed to become experienced workers when the experienced workers retire. 914 00:46:47,480 --> 00:46:51,120 Speaker 4: ChatGPT is also inaccurate, bland, and produces terrible work 915 00:46:51,360 --> 00:46:55,440 Speaker 4: full of false information. Furthermore, in classrooms, including mine, the 916 00:46:55,560 --> 00:46:58,760 Speaker 4: use of generative AI is considered an academic integrity violation 917 00:46:59,120 --> 00:47:01,040 Speaker 4: and will result in a zero for that assignment and 918 00:47:01,200 --> 00:47:04,200 Speaker 4: forwarding the case to the Student Conduct Board. My students 919 00:47:04,239 --> 00:47:06,440 Speaker 4: would not appreciate it if I used ChatGPT to 920 00:47:06,480 --> 00:47:08,919 Speaker 4: create lesson plans, to grade their work, and to write 921 00:47:08,920 --> 00:47:11,800 Speaker 4: their letters of recommendation. I do not appreciate my students 922 00:47:11,880 --> 00:47:14,120 Speaker 4: passing off work they didn't write, that has a direct 923 00:47:14,200 --> 00:47:16,640 Speaker 4: negative impact on themselves and the world, as their own. 924 00:47:16,960 --> 00:47:18,560 Speaker 4: So I don't think it is bad that women are 925 00:47:18,640 --> 00:47:20,960 Speaker 4: less likely to be cheating by using ChatGPT in 926 00:47:20,960 --> 00:47:22,800 Speaker 4: their work than men. I think it is bad that 927 00:47:22,920 --> 00:47:26,880 Speaker 4: so many men are wasting their educations by cheating. And 928 00:47:27,000 --> 00:47:30,239 Speaker 4: I felt like that really summed up what I was 929 00:47:30,320 --> 00:47:33,360 Speaker 4: trying to say about this reframing. I don't like that 930 00:47:33,480 --> 00:47:37,000 Speaker 4: this conversation is simply framed as women aren't using AI, 931 00:47:37,719 --> 00:47:41,640 Speaker 4: and thus sort of almost blaming women, despite all of 932 00:47:41,719 --> 00:47:44,600 Speaker 4: these systemic reasons and reasonable reasons as to why that 933 00:47:44,760 --> 00:47:47,439 Speaker 4: might be, and also sort of saying, well, if women 934 00:47:47,520 --> 00:47:51,080 Speaker 4: weren't so afraid and also stupid about AI, maybe they'd 935 00:47:51,120 --> 00:47:51,919 Speaker 4: be making more money. 936 00:47:52,040 --> 00:47:53,640 Speaker 3: It's not society, it's the stupid women. 937 00:47:54,000 --> 00:47:56,600 Speaker 4: I really think that we need to reframe that and say, well, 938 00:47:56,640 --> 00:47:59,680 Speaker 4: what are women telling us by not using this technology?
Like, 939 00:48:00,040 --> 00:48:02,399 Speaker 4: should we really just be saying these women are making 940 00:48:02,440 --> 00:48:04,600 Speaker 4: a bad choice? Shouldn't we really be zeroing in on 941 00:48:04,760 --> 00:48:07,279 Speaker 4: what women are saying as that pertains to why we 942 00:48:07,360 --> 00:48:11,000 Speaker 4: are not so gung ho to be adopting this technology 943 00:48:11,040 --> 00:48:11,799 Speaker 4: the way that men are? 944 00:48:12,120 --> 00:48:13,719 Speaker 3: That's sort of my, that's sort of my 945 00:48:13,880 --> 00:48:14,840 Speaker 3: so what in all this. 946 00:48:15,840 --> 00:48:16,000 Speaker 4: You know? 947 00:48:16,040 --> 00:48:17,759 Speaker 5: And I think in the future we are going to 948 00:48:17,800 --> 00:48:21,520 Speaker 5: come back and have a conversation about the red pill 949 00:48:21,640 --> 00:48:24,960 Speaker 5: leaning of this type of usage in this conversation and 950 00:48:25,280 --> 00:48:29,440 Speaker 5: the growing population of the red pill community. Yes, because 951 00:48:29,440 --> 00:48:31,399 Speaker 5: we're talking more and more, like it has become even 952 00:48:31,440 --> 00:48:33,200 Speaker 5: more so. We're at the point of 953 00:48:33,560 --> 00:48:37,000 Speaker 5: incels saying being straight and having a relationship with 954 00:48:37,040 --> 00:48:41,399 Speaker 5: a woman is gay. Like literally, that's their line. And so, 955 00:48:41,480 --> 00:48:43,800 Speaker 3: have you seen this? So really all a man can do 956 00:48:44,000 --> 00:48:46,680 Speaker 3: is sleep with a woman, having sex with a woman. 957 00:48:46,719 --> 00:48:48,920 Speaker 5: Right. Submitting and pleasing a woman, essentially, was kind of 958 00:48:48,960 --> 00:48:51,640 Speaker 5: like that kind of take. But I really do wonder 959 00:48:51,880 --> 00:48:56,000 Speaker 5: if this conversation of using ChatGPT to replace that 960 00:48:56,120 --> 00:48:58,440 Speaker 5: interaction, again with the male loneliness epidemic, and how it 961 00:48:58,440 --> 00:49:02,160 Speaker 5: becomes violent. If this conversation and if this push, and 962 00:49:02,239 --> 00:49:04,920 Speaker 5: things like what Sam Altman was trying to do when 963 00:49:04,960 --> 00:49:08,640 Speaker 5: he first created that chatbot, as well as what Elon 964 00:49:08,760 --> 00:49:11,800 Speaker 5: Musk is trying to do creating his chatbot, is just 965 00:49:12,000 --> 00:49:16,560 Speaker 5: opening up a bigger community, a bigger doorway for more 966 00:49:16,800 --> 00:49:19,080 Speaker 5: incel activity and red pill activity, and we're gonna 967 00:49:19,120 --> 00:49:20,800 Speaker 5: have to come back and have to deal with the 968 00:49:20,840 --> 00:49:25,000 Speaker 5: consequence of things like AI being a part of this narrative. 969 00:49:25,840 --> 00:49:30,800 Speaker 4: Absolutely, I completely agree. I think those things are totally linked. 970 00:49:31,280 --> 00:49:32,480 Speaker 4: And I'm in the middle of doing a lot of 971 00:49:32,520 --> 00:49:36,160 Speaker 4: research about the kinds of sexual and romantic attachments that 972 00:49:36,200 --> 00:49:40,800 Speaker 4: people create with AI, and I think it's complicated, because 973 00:49:41,040 --> 00:49:43,600 Speaker 4: I want people to feel like they have nourishing relationships 974 00:49:43,600 --> 00:49:48,120 Speaker 4: in their lives.
But healthy nourishing relationships come with friction, 975 00:49:48,360 --> 00:49:50,480 Speaker 4: like that's what you get from being in relationship with 976 00:49:50,560 --> 00:49:54,320 Speaker 4: other humans. And I think that the ability of folks 977 00:49:54,400 --> 00:49:59,880 Speaker 4: to kind of engage in frictionless relationships where they can 978 00:50:00,080 --> 00:50:02,000 Speaker 4: just take and take and take and extract and extract 979 00:50:02,040 --> 00:50:03,800 Speaker 4: and extract and always get what they want out of 980 00:50:03,840 --> 00:50:06,280 Speaker 4: them, in some ways I can see why that is therapeutic, 981 00:50:06,400 --> 00:50:09,879 Speaker 4: but that is not what builds healthy people. 982 00:50:09,960 --> 00:50:13,600 Speaker 4: Like, healthy people have to experience relationships with friction. And 983 00:50:13,680 --> 00:50:14,600 Speaker 4: that's just the bottom line. 984 00:50:16,239 --> 00:50:20,160 Speaker 1: Yes, yes, well, topic for a future day. 985 00:50:20,520 --> 00:50:21,800 Speaker 3: Yeah, I love it. 986 00:50:23,080 --> 00:50:25,800 Speaker 2: I also have to say, a dude told me the 987 00:50:25,880 --> 00:50:29,520 Speaker 2: other day, he was like, well, I love AI because 988 00:50:29,520 --> 00:50:32,160 Speaker 2: it gives me more leisure time. And I was like, well, 989 00:50:32,160 --> 00:50:34,399 Speaker 2: wait till you don't have a job, and then you're 990 00:50:34,440 --> 00:50:35,760 Speaker 2: not gonna have a way. 991 00:50:35,920 --> 00:50:40,040 Speaker 3: That time? It'll be all the time, and we 992 00:50:40,200 --> 00:50:43,360 Speaker 3: won't be able to do all those things, because you 993 00:50:43,480 --> 00:50:47,319 Speaker 3: have no money. More leisure time! 994 00:50:48,040 --> 00:50:51,680 Speaker 2: Oh, all right, well, thank you so much, Bridget, for 995 00:50:51,880 --> 00:50:55,239 Speaker 2: coming on during this stressful time. We always love talking to you. 996 00:50:55,880 --> 00:50:57,640 Speaker 2: Can't wait to see you in the new year, twenty 997 00:50:57,680 --> 00:50:58,200 Speaker 2: twenty six. 998 00:50:58,680 --> 00:50:59,879 Speaker 3: Yes, we're almost 999 00:50:59,600 --> 00:51:03,200 Speaker 1: there, almost there. In the meantime, where can the good 1000 00:51:03,239 --> 00:51:03,920 Speaker 1: listeners find you? 1001 00:51:04,200 --> 00:51:05,880 Speaker 4: You can find me at my podcast There Are No 1002 00:51:06,040 --> 00:51:07,960 Speaker 4: Girls on the Internet, where we explore all kinds of 1003 00:51:08,000 --> 00:51:11,640 Speaker 4: issues at the intersection of gender and identity and technology 1004 00:51:11,680 --> 00:51:13,520 Speaker 4: and social media. You can check me out on Instagram 1005 00:51:13,600 --> 00:51:16,000 Speaker 4: at Bridget Marie in DC, or on YouTube at There 1006 00:51:16,040 --> 00:51:17,439 Speaker 4: Are No Girls on the Internet. 1007 00:51:17,840 --> 00:51:19,279 Speaker 1: Yes, and go do that 1008 00:51:19,440 --> 00:51:22,440 Speaker 2: if you have not already, listeners. If you would like 1009 00:51:22,520 --> 00:51:24,400 Speaker 2: to contact us, you can email us at 1010 00:51:24,440 --> 00:51:26,080 Speaker 2: hello at Stuff Mom Never Told You dot com. You can 1011 00:51:26,120 --> 00:51:27,880 Speaker 2: find us on Bluesky at Mom Stuff Podcast, or on 1012 00:51:27,880 --> 00:51:29,600 Speaker 2: Instagram and TikTok at Stuff Mom Never Told You. 1013 00:51:29,880 --> 00:51:30,760 Speaker 1: We're also on YouTube.
1014 00:51:30,880 --> 00:51:32,480 Speaker 2: We have some new merchandise at Cotton Bureau, and we 1015 00:51:32,520 --> 00:51:34,040 Speaker 2: have a book you can get wherever you get your books. 1016 00:51:34,239 --> 00:51:36,839 Speaker 2: Thanks as always to our super producer Christina, our executive producer 1017 00:51:36,880 --> 00:51:39,040 Speaker 2: Maya, and our contributor Joey. Thank you, and thanks to you 1018 00:51:39,160 --> 00:51:41,480 Speaker 2: for listening. Stuff Mom Never Told You is a production of iHeartRadio. 1019 00:51:41,520 --> 00:51:43,080 Speaker 2: For more podcasts from iHeartRadio, you can check 1020 00:51:43,080 --> 00:51:45,040 Speaker 2: out the iHeartRadio app, Apple Podcasts, or wherever you listen 1021 00:51:45,040 --> 00:51:45,840 Speaker 2: to your favorite shows.