Speaker 1: Hey, this is Annie and this is Bridget, and you're listening to Stuff Mom Never Told You. And today we're doing a topic that has been on my mind since probably about seventh grade, and it is sexism in language. And language is a really powerful tool that we use to communicate ideas, vocally or in written word. And it's not only for, like, making plans for the evening, but big ideas, ways we can change things, and it can shape our thoughts and how we think. And I'm not just saying this because we're podcasters and it's sort of how we make a living, but it says so much about our cultures and attitudes and how those things have changed. And I read somewhere in an article that it's like looking at a societal and cultural fossil record. Yeah, I love that: by chasing it back, we can get a snapshot of what life was like for our ancestors, and how we've improved, how we've stayed the same, and in some cases how we've gotten worse.
Because it is a reflection of us, past and present, of our relationships, of the discourse we have in public and private, of our wants and needs; a tool that can be used for persuasion and intimidation. It can also be used to help perpetuate and reproduce sexist and racist and ableist and homophobic ideas, consciously or unconsciously. Even our language discriminates against us. And I am guilty of a lot of the things we're going to talk about in this episode, like saying "hey, guys" or "oh man," or, ableist-wise, saying "stupid" or "dumb." I'm really bad about that, and it's something that I've been working on. Yeah, I will say that being a podcaster forces you to be very critical and deliberate and careful about your language. In a former life, I used to run trainings for activists and organizers in the progressive space, and we intentionally wanted these trainings to be inclusive, so that no matter your background, you would feel uplifted and respected and heard.
And you know, when you're in front of a room of people that you specifically have gathered because of their diverse backgrounds, and you want to be inclusive with them, you really get a sense of how difficult it can be to modify your language. And for me, it was using the word "crazy" a lot, which I realize now is kind of messed up; it's very ableist. And my trick, you know, if I said "crazy," or if I addressed a room of men, women, non-binary, and gender-nonconforming people, if I addressed this inclusive space with "hey, guys," that wouldn't work. And so what I ended up doing was putting a rubber band around my wrist, and whenever I heard myself say one of the things I was trying not to say, I would snap it. Not hard, but it was just a way to remind myself: you're not doing that, you're not doing that. A physical self-corrective method. That sounds like a great tip.
That reminds me of when I was trying to work on my posture and I put duct tape on my lower back, so every time I was slouching, it would pull on the hairs and skin on my lower back and make me sit up straight. It sounds like you were doing the language version of my duct-tape back trick. Pretty much. So, listeners probably know I'm a bit of a language nerd, so we're gonna be using some language terms, but I promise it'll be fun. And this is something, this whole idea of sexism in language, like I said, I've been thinking about since middle school, when someone told me during French class that throughout most of the world, the word for woman is associated with the word for evil. And I don't know if that's true; I couldn't find out if that was actually true. But while I was researching to find out if it was true, I found a wealth of sexism in our language, and I wanted to talk about it. And I think another thing, before we dive in here, that we forget are these other forms of communication, like body language or sign language.
I used to be part of this organization in college that was all about fostering relationships and understanding between different cultures, and when we went to conferences, we all spoke different languages. So the delegation from each country would go on stage and do a dance that they had prepared, that they thought communicated what they were all about, and it was a great icebreaker. I just feel like we forget a lot that we do communicate, and we can communicate, in all of these other ways. Dancing is a fun one. Yeah, that's a good reminder. We often don't think about nonverbal communication; I forget about it all the time. But like, when Annie and I do this podcast, right now I am in D.C. and Annie is in Atlanta, but we're on Skype so we can see how the other is visually and, like, facially responding, because it does help to communicate. Yes, it does. And I've said before on this show, I have one ear that is really, really bad, and so I depend a lot on watching people's mouths too. It helps me process simultaneously.
It's almost like I'm hearing you through my eyes. And if I don't, well, that's why I hate speaking on the phone so much. It makes me really anxious when I can't see people's mouths, because I'm having to work twice as hard to make out what they're saying, and it makes me seem really awkward on the phone. In fact, I found out later that somebody almost didn't give me a job because I was so awkward on the phone. But it was just me trying to, okay, I think that's what they said. Like, there are always these pauses that seem a little longer than necessary when you're speaking with me on the phone, because I'm trying to make sure, okay, I think I know what you said. So if you've ever noticed any awkwardness with phone interviews, that's what it was. A shout-out to your former interviewees. Yes. But here I am in an audio medium, so dreams can come true. And we are going to be digging into mostly sexism in language, and mostly English, but I did want to include that in there, that there is a lot of nonverbal communication stuff.
And also, for listeners outside of the English-speaking world, please write in if you have examples from your own language, because I would love to hear them. Like I said, I'm a really big language nerd. And also, we're not here to ruin your party. Or, we are, but it's so you can throw a better party that's more welcoming to everyone. And also, I mean, these are fun facts, even if some of them are sad. That sounds so strange, but it's true. Like, where these things have historically come from, in some cases, is really interesting. I also think it's just an interesting reminder that language changes and shapes and, you know, grows with culture. One of the things I wanted to make sure that we point out is, you know, in conversations that folks are having around the dictionary adding the pronoun "they," and the push to make our language less gendered: language shapes, it changes, it grows, it breathes. You know, words that mean one thing become something else, and that's how it's always been.
I think we can get really hung up on "this word means X, always," and we need to step back and see the ways that language does grow. And some of it is disappointing, but yeah, like you said, it's interesting, and it's interesting too to see how language has shifted over the years. Yeah, seeing it evolve. Right now, kind of like you were saying, I went to a talk, and they were discussing how in a lot of languages there aren't words for transgender, specifically, and how that kind of erases that experience, when there aren't words for it. And then what do you do? Do you make a word? Do you adopt, like, an English word? So these conversations are happening; language is constantly shifting. And I think that that is a good thing, that it's becoming more inclusive, but there are still a lot of words, I think, that we take for granted that do have this underlying sexism with them. And so to talk about this, we're gonna discuss collocation first. So collocation is when you hear a word and your brain automatically fills in the blank for the word that should go with it.
So, if I say, I think the example the Guardian article I was reading about this gave is "pop," like, your mind might automatically think "star" or "corn" or something. Your brain supplies this word for you that it thinks goes with that first word. And I've been doing a lot of research into Disney's portrayals of mothers lately, and another one I found is "evil stepmother." So a lot of times when you hear the word stepmother, your brain automatically thinks evil, which is not great. It's not great. I mean, I remember growing up and thinking that stepmoms were all evil. Like, if someone was like, "oh, this is my stepmom," when I was a kid, I associated that with evil, with, like, she must be horrible in the basement, right? And yeah, the research shows that that association starts pretty much as soon as you can watch Cinderella; it's cemented.
The Guardian published an article criticizing the Oxford English Dictionary and some of the sexist examples of collocation it used when demonstrating a word's common usage, which is, by the way, how the dictionary defended these examples. Kind of an "it's sexist, but it's how people use it; it's not us" type of thing. So, here's the official statement the OED released: "The example sentences we use are taken from a huge variety of different sources and do not represent the views or opinions of Oxford University Press." So, yeah: we're not sexist, you are. I love that response. Right. So, here's an example of what we're talking about: "rabid feminist." This was an example they gave for how the word rabid is commonly used, rabid feminist. Another example is "nagging wife," and "grating" and "shrill" to describe female voices but not male voices. I do love that.
The writer and comedian Lindy West, her autobiography is called Shrill, and now she's developing a TV show spinning off of that experience. And it's really sort of taking back this idea: as a loud feminist, she has probably been called shrill her entire life, and she's kind of reclaiming that word and saying, yeah, I am shrill, it's my badge of honor, I'm proud to be shrill. So, after all of this negative press, the OED released a statement saying they were reviewing the example sentence for rabid "to ensure that it reflects current usage," which is good, but it's only a start, because I would bet that a lot of folks are still going to collocate rabid with feminist. Yeah, I think I collocated it with dog. Yeah, that makes sense to me. That makes sense to me. Another part of this conversation is pejoration. So this is, as you might guess from "pejorative," when the meaning of a word gets worse as time goes on. And linguists posit that this happens more often with words referring to women than with words referring to men.
And there are so many examples, and, surprise surprise, most of them negatively sexualized women over time. And also, all of them reinforce the gender binary, which we were sort of talking about at the top, which is another problem that we need to tackle when it comes to our language. But okay, let's look at some examples, starting with courtesan. Or courtesan, you gotta say it fancy. Yeah, I realized I didn't put the right accent on it. Courtesan: at one time, a courtesan was the female version of the courtier, which referred to somebody invited to attend the court of royalty. Courtesan lost that meaning completely, and if you look up the definition now, you'll get something along the lines of "a prostitute or paramour, especially one associating with noblemen or men of wealth," and nothing else. That's the definition that you get. And to me, this is so telling about the positions of power women were allowed to have or not have, suspicions around women's ambitions and motives, anxieties around female sexuality and sexual agency, and valuing women based on their bodies and sex appeal.
And if those sound like they're still relevant, that's because they are. So, another one is governess. If you think about it logically, governess should be the female version of governor, which is the title of someone who has power and authority over a place. But around the fifteenth century, it came to mean a woman who cares for and supervises someone, usually a child. Now, that's not to say that taking care of a kid is not a valid and important job, but compare that to governor: it doesn't really have the same weight. Yeah, it's not equal. Another one is hussy. Hussy once meant head of the house, and the story is it's derived from the thirteenth-century word for housewife. But when the seventeenth century rolled around, it took on another meaning, which eventually became the only definition: a disreputable woman of improper behavior. And again, this is a put-down of women, to undercut their power and to stigmatize female sexuality. One of my favorites: madam. This used to be what ma'am is today, only more distinguished. Like, when I hear madam, I think of it as a very kind of distinguished title.
It denotes a woman with high rank, the female equivalent of sir. Then in the eighteenth century this changed, when people were heard using madam to mean, quote, "a conceited and precocious girl, a young woman, a hussy, a minx," or a prostitute or mistress; and by the nineteenth century, a woman managing a brothel. Because that is the only high-up rank a woman can have, right, managing a brothel? That sounds about right. And speaking of mistress, this is another great example. Once it was the female equivalent of master, someone with power or authority, and usually an employer. But as we all probably know, that is not the meaning anymore. It lost that original meaning in the seventeenth century, and it now refers to a woman, not a man's wife, that the man is having a long-term sexual relationship with. That is its sole definition. And once again you see the title stripped of power, and you see it sexualized, and sexualized in service to men. One of my favorite words: spinster. It's a good one. It's not one that you hear a lot, but it's got a lot of weight attached to it. At least for me, I guess; I'll put it that way. So, basically, y'all know what a spinster is supposed to be: it's a woman who lives alone. Maybe she has some cats, maybe she's weird, probably wears a lot of sweaters, maybe some clogs in the mix. I don't know, you know, this weird woman for whom romantic partnership is, like, off the table; she lives alone and has cats. Well, spinster actually just meant someone who spun thread or yarn. Now, typically that person was a woman, but not always, and it wasn't until the eighteenth century that that definition became associated with someone who was outside of the typical, like, marrying or childbearing age. Yeah, because for a while, one way an unmarried woman could make money was by spinning yarn, and so the legal definition actually became that spinster referred to an unmarried woman. And from there it's pretty easy to see: yeah, it was old maid, essentially, and the current definition comes with the disclaimer "disparaging and offensive." But the male equivalent is bachelor, and that has a much more positive connotation.
Yeah, it's so funny. I was thinking about this recently. I don't know if you remember this, but back in the day, Warren Beatty, young Warren Beatty, he was sort of this, they used to call him, like, a serial bachelor, and kind of in the seventies and the eighties he was sort of this sexy, worldly guy who was, like, the quintessential bachelor. And there's not really a female equivalent in culture, at least that I can think of. It's like, you know, an unmarried woman who is having lots of fun dating around, has a great career, travels: that's not really there. We definitely associate enjoying intentional singleness with men, and with women it's just seen as sad and weird. Yes. And I was thinking about this too when it comes to the word cougar, and how that has this negative vibe to it, and I was trying to think if there was a male equivalent. Cougar, yeah, there is a male equivalent to cougar: it's silver fox. See, that sounds so much nicer.
I guess it is not a one-to-one comparison, but an older, still handsome man who kind of, like, can date a younger woman, but not in, like, the sugar baby and sugar daddy kind of way: I would consider that to be a silver fox. So, like, Sean Connery might be a silver fox, right, right, right. That was my thought when I saw, what is it called, Entrapment, when I was pretty young, and I didn't buy it immediately, because I was like, Sean Connery is too old; why is that hot lady into him? Is that Catherine Zeta-Jones? Yeah. She's married to Michael Douglas. I know, I didn't know that at the time. She was married to a silver fox. She is, she is. She's very important to this whole conversation. This was ten-year-old Annie's thoughts; please don't judge current Annie on it. I haven't seen that movie in a while; I'll have to rethink it. But so, another example is tart. It used to be a shortening of sweetheart, but starting in the eighteen eighties it came to mean, yep, a prostitute or a woman of immoral character. Another good one: wench. It used to just mean a baby, or a young, unmarried woman. Starting in the fourteen hundreds, it began to be used to describe a mistress or a sexually lawless or, quote, "loose" woman. Yeah, so pretty much all of these, if we look at them: they started out meaning something that was equivalent to the male version of the word, and then they all lost their power and by and large started defining women through their sexual desirability or lack of desirability with regards to men, or otherwise subordinating them to men and/or defining them in relationship to men. So the very fact that we have a Mrs. and a Miss, but not a male equivalent, that should tell you something. Definitely. So, we have some more discussion around this whole thing, but first we have a quick break for a word from our sponsor. And we're back. Thank you, sponsor. Another piece of this whole thing is how, in our language, male is generally the default, and this is everywhere.
Think of popular suffixes in English added 340 00:20:47,800 --> 00:20:50,520 Speaker 1: onto words traditionally referring to men, so that they now 341 00:20:50,560 --> 00:20:54,400 Speaker 1: refer to women, like -ess or -trix, or the suffix 342 00:20:54,480 --> 00:20:56,840 Speaker 1: man to refer to jobs that both men and women 343 00:20:56,880 --> 00:21:01,560 Speaker 1: can hold, like in chairman, councilman, policeman, salesman, mailman, and 344 00:21:01,600 --> 00:21:04,399 Speaker 1: the word mankind, or even human, mankind or man made. 345 00:21:04,680 --> 00:21:08,960 Speaker 1: I mean it's everywhere. Um, or if you look at 346 00:21:08,960 --> 00:21:11,480 Speaker 1: the words for doctor or lawyer, think of how often 347 00:21:11,520 --> 00:21:15,960 Speaker 1: you hear in the news female doctor or female lawyer, 348 00:21:16,400 --> 00:21:19,600 Speaker 1: since women are seen as outside of the norm in 349 00:21:19,640 --> 00:21:23,719 Speaker 1: those professions. And I remember hearing female prosecutor over and 350 00:21:23,840 --> 00:21:28,879 Speaker 1: over again in the whole Brett Kavanaugh news coverage thing, 351 00:21:29,480 --> 00:21:33,680 Speaker 1: female prosecutor. Yeah, as if that means that she prosecutes 352 00:21:33,800 --> 00:21:36,239 Speaker 1: with her vagina. Like, oh, she's a woman, you know, 353 00:21:37,200 --> 00:21:40,600 Speaker 1: you know, it didn't make any sense. And it's funny 354 00:21:40,640 --> 00:21:44,960 Speaker 1: because recently they've just done some rejiggering of how they're 355 00:21:44,960 --> 00:21:48,040 Speaker 1: going to make those signs that say things like men working, 356 00:21:48,480 --> 00:21:50,240 Speaker 1: that they're going to try to figure out a way 357 00:21:50,600 --> 00:21:53,399 Speaker 1: to have it be more gender inclusive, where you know, 358 00:21:54,040 --> 00:21:57,280 Speaker 1: it's not men working, it's people working, you know.
Um, 359 00:21:57,320 --> 00:22:00,360 Speaker 1: and so yeah, it's interesting how these things 360 00:22:00,440 --> 00:22:04,919 Speaker 1: kind of show up. Pretty recently, with Alexandria Ocasio-Cortez, somebody 361 00:22:04,920 --> 00:22:07,240 Speaker 1: tweeted at her that she needed to stop saying congress 362 00:22:07,280 --> 00:22:10,280 Speaker 1: people because, you know, she's in Congress and she's not 363 00:22:10,320 --> 00:22:13,639 Speaker 1: a man, that the title is congressman. And that person 364 00:22:13,680 --> 00:22:16,600 Speaker 1: eventually apologized. They were like, yeah, she's right. Like, she's 365 00:22:16,640 --> 00:22:19,840 Speaker 1: right to say congress people. Um, it's really interesting how 366 00:22:20,560 --> 00:22:24,120 Speaker 1: gendered these things are. We just say them without 367 00:22:24,160 --> 00:22:26,439 Speaker 1: even thinking about it, really, but it does warp how 368 00:22:26,480 --> 00:22:29,800 Speaker 1: we see ourselves. It does, um, and it kind of 369 00:22:29,840 --> 00:22:34,119 Speaker 1: reminds me of that riddle where, uh, I mean, ultimately 370 00:22:34,200 --> 00:22:39,840 Speaker 1: the answer is the surgeon is a woman, but it's 371 00:22:39,840 --> 00:22:43,679 Speaker 1: supposed to showcase, I guess, that you automatically assumed it 372 00:22:43,760 --> 00:22:46,320 Speaker 1: was a man, so like the whole, to get to 373 00:22:46,359 --> 00:22:48,399 Speaker 1: the answer of the riddle, you have to be like, ah, 374 00:22:48,440 --> 00:22:53,040 Speaker 1: the surgeon is a woman. Um, it's even in our riddles, 375 00:22:53,119 --> 00:22:56,520 Speaker 1: is what I'm trying to say. We cannot escape sexism 376 00:22:56,720 --> 00:22:59,399 Speaker 1: even in our riddles. That's right. And actually we're going 377 00:22:59,440 --> 00:23:02,920 Speaker 1: to talk about that more a little later. Um.
And 378 00:23:02,960 --> 00:23:06,320 Speaker 1: we can even look at the reverse example of, 379 00:23:06,359 --> 00:23:11,439 Speaker 1: like, male nurse, so yeah, it hurts everybody. Um. And 380 00:23:11,440 --> 00:23:13,880 Speaker 1: since we don't have a gender neutral third person pronoun 381 00:23:13,920 --> 00:23:17,440 Speaker 1: in English, most of our idioms and proverbs and perhaps 382 00:23:17,480 --> 00:23:22,320 Speaker 1: riddles use the masculine he. And another thing about proverbs 383 00:23:22,520 --> 00:23:25,400 Speaker 1: is the ones that do use she or refer specifically 384 00:23:25,400 --> 00:23:28,920 Speaker 1: to women usually don't paint women in the best light, 385 00:23:29,119 --> 00:23:32,280 Speaker 1: like he who follows his wife's advice will never see 386 00:23:32,320 --> 00:23:35,720 Speaker 1: the face of God, or a neck without a head, 387 00:23:35,880 --> 00:23:38,080 Speaker 1: buttocks without a hole, and a girl without shame 388 00:23:38,240 --> 00:23:42,320 Speaker 1: are not worth admiring or marrying. What an expression! I 389 00:23:42,400 --> 00:23:46,480 Speaker 1: know. When were people saying that? I missed it. A 390 00:23:46,520 --> 00:23:48,800 Speaker 1: woman is like a lemon. You squeeze her and throw 391 00:23:48,840 --> 00:23:53,160 Speaker 1: her away. Gross. Yeah. Seven women in their right senses 392 00:23:53,200 --> 00:23:56,959 Speaker 1: are surpassed by a madman, which is offensive on more 393 00:23:57,000 --> 00:24:00,200 Speaker 1: than one level. Nice little layering there. It's 394 00:24:00,320 --> 00:24:08,240 Speaker 1: ableist, sexist. Yes, that's good. Yeah. Um. And then if 395 00:24:08,280 --> 00:24:11,720 Speaker 1: we look at gendered languages, that is, Latin-derived languages where 396 00:24:11,760 --> 00:24:16,240 Speaker 1: nouns themselves are gendered, like French or Spanish. Um, so 397 00:24:16,560 --> 00:24:20,280 Speaker 1: le for masculine and la for feminine in French.
Every noun 398 00:24:20,920 --> 00:24:23,560 Speaker 1: will have either le or la in front of it. Most 399 00:24:23,560 --> 00:24:25,199 Speaker 1: of you probably know this, but, you know, just to make 400 00:24:25,200 --> 00:24:27,560 Speaker 1: sure we're on the same page. On top of that, 401 00:24:27,600 --> 00:24:30,399 Speaker 1: these languages don't have a gender neutral plural word like 402 00:24:30,560 --> 00:24:34,480 Speaker 1: them or they. Instead, you have the masculine plural and 403 00:24:34,520 --> 00:24:38,520 Speaker 1: the feminine plural. For example, in French, ils, which is masculine, 404 00:24:38,600 --> 00:24:41,320 Speaker 1: or elles, which is feminine. And when you're talking about 405 00:24:41,320 --> 00:24:44,080 Speaker 1: a group of all men or all women, the word 406 00:24:44,119 --> 00:24:47,040 Speaker 1: you would use to describe the group is clear. Even 407 00:24:47,119 --> 00:24:50,280 Speaker 1: if one man enters a group of five, ten, one hundred, 408 00:24:50,320 --> 00:24:53,760 Speaker 1: one thousand women, though, you switch over to the masculine plural. 409 00:24:53,880 --> 00:24:56,119 Speaker 1: You don't call it a group of women anymore. It 410 00:24:56,200 --> 00:24:58,760 Speaker 1: becomes a group of men, even though there's one man 411 00:24:59,760 --> 00:25:02,440 Speaker 1: in a whole group. And the suggestion is that one 412 00:25:02,480 --> 00:25:05,840 Speaker 1: man takes precedence over any number of women. He is 413 00:25:05,880 --> 00:25:09,320 Speaker 1: the most important person in that group. He is how 414 00:25:09,359 --> 00:25:13,000 Speaker 1: you will refer to that group. And if someone walks by, uh, 415 00:25:13,040 --> 00:25:15,560 Speaker 1: this group of mostly women and one dude, the person 416 00:25:15,680 --> 00:25:17,280 Speaker 1: is going to address them as if they were all 417 00:25:17,320 --> 00:25:21,320 Speaker 1: men or, alternately, as if there aren't any women present. Yeah.
418 00:25:21,359 --> 00:25:23,399 Speaker 1: That reminds me of the episode that we did on 419 00:25:24,040 --> 00:25:27,840 Speaker 1: women and traveling alone, where people still accuse a 420 00:25:27,880 --> 00:25:32,199 Speaker 1: group of women of traveling alone, as if the only 421 00:25:32,240 --> 00:25:34,840 Speaker 1: thing that validates their existence is the presence of at 422 00:25:34,920 --> 00:25:37,440 Speaker 1: least one man. You could be a hundred women deep 423 00:25:37,440 --> 00:25:40,560 Speaker 1: and it's still alone. Right, you're traveling alone? What were 424 00:25:40,600 --> 00:25:45,080 Speaker 1: you doing? And this is where, as I said, the English 425 00:25:45,119 --> 00:25:48,080 Speaker 1: hey guys would fall in, too. And I am very 426 00:25:48,119 --> 00:25:51,480 Speaker 1: guilty of saying that. In my brain it's de-gendered. 427 00:25:51,560 --> 00:25:55,000 Speaker 1: But as the article I was reading pointed out, if 428 00:25:55,000 --> 00:25:57,840 Speaker 1: you went to a group of men and said hey, ladies, 429 00:25:58,359 --> 00:26:01,200 Speaker 1: it would in general not be taken favorably and might 430 00:26:01,240 --> 00:26:04,439 Speaker 1: even start a fight. Yes, well, that's where one of my 431 00:26:04,560 --> 00:26:10,879 Speaker 1: favorite colloquialisms really comes in handy: y'all. People in 432 00:26:10,920 --> 00:26:13,680 Speaker 1: the South, like, we got it down. Y'all 433 00:26:13,840 --> 00:26:17,520 Speaker 1: means you all. It's a gender neutral way to 434 00:26:17,560 --> 00:26:20,919 Speaker 1: address a group. So instead of saying hey guys or 435 00:26:20,920 --> 00:26:24,480 Speaker 1: hey ladies, I'm gonna say hey, y'all.
And what's funny is, 436 00:26:24,480 --> 00:26:27,800 Speaker 1: when I was teaching, you have to find all 437 00:26:27,800 --> 00:26:31,320 Speaker 1: different kinds of ways to address a classroom, um, that 438 00:26:31,400 --> 00:26:34,080 Speaker 1: are not hey guys or hey ladies or whatever. So it 439 00:26:34,160 --> 00:26:37,320 Speaker 1: was always good morning, friends, good morning, comrades, good morning 440 00:26:37,320 --> 00:26:39,439 Speaker 1: this, good morning that, you know. Um, so there are 441 00:26:39,440 --> 00:26:43,800 Speaker 1: all kinds of better words other than guys to use. Yeah, 442 00:26:43,880 --> 00:26:47,160 Speaker 1: and it's one of those things where in my head 443 00:26:47,200 --> 00:26:48,960 Speaker 1: it's de-gendered, but I don't know what it is 444 00:26:49,000 --> 00:26:52,280 Speaker 1: in other people's heads. And I don't want to leave 445 00:26:52,320 --> 00:26:55,320 Speaker 1: anyone out with my choice of language. And I 446 00:26:55,359 --> 00:26:57,680 Speaker 1: know that some people are probably thinking this seems like such 447 00:26:57,720 --> 00:27:00,879 Speaker 1: a small thing. But as I tried to get across 448 00:27:00,880 --> 00:27:04,000 Speaker 1: from the beginning, language really is very important and it 449 00:27:04,240 --> 00:27:06,440 Speaker 1: impacts so much of the way we think, even if 450 00:27:06,440 --> 00:27:10,320 Speaker 1: we don't realize it. Definitely. Oh god, this is 451 00:27:10,359 --> 00:27:13,800 Speaker 1: really random, but there's this classic final episode of the 452 00:27:13,840 --> 00:27:17,440 Speaker 1: Mary Tyler Moore Show. That show ends with Mary getting fired, 453 00:27:17,680 --> 00:27:20,199 Speaker 1: and when she's fired, she's the only woman in 454 00:27:20,200 --> 00:27:22,359 Speaker 1: a group of male journalists, and the person who is 455 00:27:22,480 --> 00:27:26,040 Speaker 1: firing her says you guys are fired.
And then she's like, 456 00:27:26,440 --> 00:27:29,680 Speaker 1: wait a minute, he said guys, maybe that doesn't include me, 457 00:27:30,119 --> 00:27:33,560 Speaker 1: and she has to, like, ask if she was included in that, 458 00:27:33,680 --> 00:27:37,120 Speaker 1: because technically she is not a guy. And I'm going 459 00:27:37,160 --> 00:27:43,440 Speaker 1: to assume she was. She was. Um. Here's another one 460 00:27:43,440 --> 00:27:45,359 Speaker 1: that a couple of you listeners have written in about, 461 00:27:45,920 --> 00:27:48,680 Speaker 1: and it's the whole question of calling women girls and whether 462 00:27:48,760 --> 00:27:52,679 Speaker 1: or not it's degrading. Um, and generally, outside of friend groups, 463 00:27:53,359 --> 00:27:54,680 Speaker 1: if you think about it, if you called a group 464 00:27:54,680 --> 00:27:57,879 Speaker 1: of men boys, it is viewed as an insult or 465 00:27:57,880 --> 00:28:02,560 Speaker 1: it implies that they are impotent. Oh my goodness, 466 00:28:02,600 --> 00:28:04,439 Speaker 1: this is a big one. I have a lot of feelings on 467 00:28:04,480 --> 00:28:10,120 Speaker 1: this particular one. Um. First of all, I call women girls. 468 00:28:10,320 --> 00:28:13,360 Speaker 1: I definitely should not, um, but for me, it's 469 00:28:13,400 --> 00:28:16,400 Speaker 1: a friendly thing, like hey girl, hey girl. And 470 00:28:16,440 --> 00:28:19,520 Speaker 1: someone pointed out that that's not necessarily, you know, 471 00:28:19,760 --> 00:28:23,840 Speaker 1: if someone is non binary, saying hey girl to them 472 00:28:23,920 --> 00:28:26,400 Speaker 1: might not be something they want to hear. If someone 473 00:28:26,480 --> 00:28:28,360 Speaker 1: is gender nonconforming, that might not be something they want 474 00:28:28,359 --> 00:28:31,720 Speaker 1: to hear.
I call men girl. Like, if we're friends, 475 00:28:31,720 --> 00:28:34,480 Speaker 1: you're my girl. That's how it is in my head. 476 00:28:34,640 --> 00:28:38,400 Speaker 1: And it sucks to realize that, like the imprint, like 477 00:28:38,520 --> 00:28:41,120 Speaker 1: what you were saying with guys, like the way that 478 00:28:41,200 --> 00:28:44,320 Speaker 1: it reads in my head might not be the way 479 00:28:44,320 --> 00:28:46,120 Speaker 1: it reads to other people in their head. And I 480 00:28:46,160 --> 00:28:48,800 Speaker 1: just want to be, you know, aware of that. But 481 00:28:48,800 --> 00:28:51,240 Speaker 1: then it's also tough because it shows how limiting 482 00:28:51,240 --> 00:28:53,760 Speaker 1: our language is. Like, if I am 483 00:28:53,760 --> 00:28:58,080 Speaker 1: a woman in a heterosexual romantic relationship, that person is 484 00:28:58,120 --> 00:29:02,000 Speaker 1: supposed to be my boyfriend, but he's not a boy. 485 00:29:02,240 --> 00:29:04,600 Speaker 1: I'm not a girl. You know, we don't really have 486 00:29:04,640 --> 00:29:06,880 Speaker 1: a lot of words for that. Like, I don't use the 487 00:29:06,880 --> 00:29:08,800 Speaker 1: word boyfriend. I don't use the word girlfriend, because 488 00:29:08,920 --> 00:29:12,880 Speaker 1: if I'm in a relationship with someone, I'm not 489 00:29:12,920 --> 00:29:15,480 Speaker 1: a girl, he's not a boy. We're not in seventh grade, 490 00:29:16,280 --> 00:29:20,520 Speaker 1: and girlfriend and boyfriend do not accurately describe what a, 491 00:29:20,600 --> 00:29:25,880 Speaker 1: like, partnership between two adults is. Totally agree.
I remember 492 00:29:25,960 --> 00:29:28,160 Speaker 1: we did an episode, a video episode, on this a 493 00:29:28,200 --> 00:29:31,600 Speaker 1: while ago, about how there aren't really good substitutes for that, 494 00:29:31,680 --> 00:29:34,600 Speaker 1: but how it does feel so high school, 495 00:29:34,680 --> 00:29:40,240 Speaker 1: like the terminology just doesn't fit. Yeah, I, for 496 00:29:40,400 --> 00:29:43,280 Speaker 1: my romantic partners, I use the word boo. I don't 497 00:29:43,280 --> 00:29:45,520 Speaker 1: want to be anyone's girlfriend, like I'll be your boo. 498 00:29:47,440 --> 00:29:51,320 Speaker 1: That's a good one. I think that's a step up. Um. 499 00:29:51,360 --> 00:29:54,680 Speaker 1: And if you're thinking that this is an unwieldy grammatical rule, 500 00:29:55,240 --> 00:29:58,000 Speaker 1: you are correct. But studies show that grammar rules like 501 00:29:58,040 --> 00:30:01,600 Speaker 1: this do influence global sexism, so that's kind of 502 00:30:01,640 --> 00:30:04,680 Speaker 1: a big thing. A two thousand nine study found that grammar 503 00:30:04,800 --> 00:30:08,200 Speaker 1: like this does correlate with sexism. When asked to read 504 00:30:08,240 --> 00:30:11,600 Speaker 1: a passage in either English, which is a natural gender language, 505 00:30:11,840 --> 00:30:13,920 Speaker 1: meaning nouns aren't assigned a gender but there are gender 506 00:30:13,960 --> 00:30:19,800 Speaker 1: specific pronouns, or French and Spanish, gendered languages, those that 507 00:30:19,840 --> 00:30:23,560 Speaker 1: read the passage in gendered languages showed higher levels of 508 00:30:23,600 --> 00:30:27,480 Speaker 1: sexism in their responses on a questionnaire they filled out afterwards.
509 00:30:27,960 --> 00:30:30,600 Speaker 1: That doesn't mean that English speakers are less sexist, but it 510 00:30:30,680 --> 00:30:33,640 Speaker 1: does show that grammar things like this might influence how 511 00:30:33,720 --> 00:30:38,280 Speaker 1: we think, even if it's unconsciously. The research didn't stop there. 512 00:30:39,240 --> 00:30:42,640 Speaker 1: They took the World Economic Forum's Global Gender Gap Index, 513 00:30:42,680 --> 00:30:45,880 Speaker 1: which measures the gender inequality in one thirty four countries 514 00:30:45,920 --> 00:30:48,880 Speaker 1: in various sectors like economics, politics, health, and education, 515 00:30:49,320 --> 00:30:51,640 Speaker 1: and they divided them up by the language type most common 516 00:30:51,640 --> 00:30:56,320 Speaker 1: in that country. Gendered was fifty four point 517 00:30:56,360 --> 00:31:00,000 Speaker 1: five percent and genderless nineteen point four. If you're 518 00:31:00,080 --> 00:31:03,080 Speaker 1: doing that math and you're saying that doesn't add up, you are 519 00:31:03,160 --> 00:31:06,560 Speaker 1: totally correct. The remaining countries spoke a mixture of these 520 00:31:06,600 --> 00:31:10,080 Speaker 1: language types. Now, when controlling for things that might influence 521 00:31:10,160 --> 00:31:14,640 Speaker 1: gender inequality like religion, political system, relative development, and geographical location, 522 00:31:14,920 --> 00:31:18,040 Speaker 1: the researchers found that countries where gendered languages are primarily 523 00:31:18,040 --> 00:31:22,160 Speaker 1: spoken ranked highest in terms of gender inequality.
Interestingly, though, 524 00:31:22,480 --> 00:31:25,160 Speaker 1: genderless languages displayed the second highest rate of gender 525 00:31:25,200 --> 00:31:30,200 Speaker 1: inequality and natural gender the least. And one hypothesis 526 00:31:30,240 --> 00:31:34,040 Speaker 1: for why genderless languages didn't fare as well suggests that 527 00:31:34,560 --> 00:31:38,240 Speaker 1: in a language without gender, in the brain, 528 00:31:38,480 --> 00:31:42,200 Speaker 1: the default is male. Other studies kind of demonstrate this, 529 00:31:42,360 --> 00:31:45,960 Speaker 1: finding that hearing a phrase like heroines and heroes makes 530 00:31:46,000 --> 00:31:49,840 Speaker 1: people believe that there are more heroines, um, as opposed 531 00:31:49,840 --> 00:31:52,800 Speaker 1: to just saying heroes, um, where you're just gonna think they're all 532 00:31:52,880 --> 00:31:55,480 Speaker 1: men. And that's what the study found, and that 533 00:31:55,560 --> 00:31:58,800 Speaker 1: the order matters too. Like, whatever you say first, your 534 00:31:58,840 --> 00:32:02,640 Speaker 1: brain assigns that more importance, so putting heroines first 535 00:32:02,680 --> 00:32:07,120 Speaker 1: instead of heroes. Uh, there's all these things that we 536 00:32:07,160 --> 00:32:11,320 Speaker 1: don't realize our brain is doing when we say things. Um, 537 00:32:11,360 --> 00:32:13,480 Speaker 1: if we go back to the gendering of nouns, a 538 00:32:13,520 --> 00:32:17,320 Speaker 1: two thousand two study found that this leaked out into 539 00:32:17,360 --> 00:32:20,640 Speaker 1: other thought processes. The researchers put together a list of 540 00:32:20,680 --> 00:32:24,640 Speaker 1: twenty four objects that have opposite genders in Spanish and German. 541 00:32:25,480 --> 00:32:28,840 Speaker 1: For both languages, half of the items were gendered masculine 542 00:32:28,840 --> 00:32:32,080 Speaker 1: and the other half feminine.
A group of native Spanish 543 00:32:32,120 --> 00:32:35,360 Speaker 1: speakers and a group of native German speakers who were 544 00:32:35,400 --> 00:32:38,400 Speaker 1: also proficient in English were then asked in English to 545 00:32:38,440 --> 00:32:43,400 Speaker 1: come up with three adjectives for each object. For each participant, 546 00:32:43,520 --> 00:32:46,640 Speaker 1: the gender of the words in their native language influenced the adjectives 547 00:32:46,680 --> 00:32:49,560 Speaker 1: that they chose. For example, the German word 548 00:32:49,600 --> 00:32:52,680 Speaker 1: for key is masculine, and in Spanish it is feminine. 549 00:32:53,160 --> 00:32:57,640 Speaker 1: German speakers described keys with words like useful, hard, heavy, jagged, 550 00:32:57,680 --> 00:33:03,440 Speaker 1: and metal. Spanish speakers described it using words like intricate, little, tiny, lovely, 551 00:33:03,600 --> 00:33:07,280 Speaker 1: and golden. Or if we look at the example of bridge, 552 00:33:07,320 --> 00:33:11,560 Speaker 1: the German speakers described it as delicate, fragile, beautiful, elegant, 553 00:33:11,760 --> 00:33:16,760 Speaker 1: and slender. Spanish speakers described a bridge using words like strong, big, dangerous, sturdy, 554 00:33:16,840 --> 00:33:20,360 Speaker 1: and towering. And I bet you can guess in which 555 00:33:20,480 --> 00:33:26,680 Speaker 1: language bridge is gendered feminine. Yeah, it's the German language. 556 00:33:27,080 --> 00:33:35,040 Speaker 1: And that's pretty stark, those examples. It kind of shocked me. Yeah, 557 00:33:35,360 --> 00:33:37,400 Speaker 1: they're describing a bridge. It sounds like you're describing like 558 00:33:37,440 --> 00:33:41,960 Speaker 1: a woman's arm, like slender, elegant. It's really stark, it is.
559 00:33:42,880 --> 00:33:45,960 Speaker 1: These participants were also shown a pair of pictures, one 560 00:33:46,000 --> 00:33:49,200 Speaker 1: containing a person and the other an object, and they 561 00:33:49,200 --> 00:33:52,680 Speaker 1: were asked to rate how similar the pictures were, the 562 00:33:52,760 --> 00:33:55,160 Speaker 1: person and the object. If the biological sex of the 563 00:33:55,320 --> 00:33:59,600 Speaker 1: person matched the gender of the object in their native language, 564 00:33:59,680 --> 00:34:03,240 Speaker 1: the participants rated them as similar. If it didn't, then 565 00:34:03,280 --> 00:34:08,839 Speaker 1: they said they had no similarities, which is interesting. Yeah, 566 00:34:09,239 --> 00:34:14,000 Speaker 1: it's wild how language, I mean, it really gets 567 00:34:14,000 --> 00:34:17,440 Speaker 1: in your head. I guess these examples just really drive 568 00:34:17,520 --> 00:34:20,000 Speaker 1: that home, that it really gets in your head and 569 00:34:20,040 --> 00:34:23,160 Speaker 1: in ways that you might not even realize. Yeah, and 570 00:34:23,239 --> 00:34:27,440 Speaker 1: as someone who didn't grow up speaking a 571 00:34:27,480 --> 00:34:32,200 Speaker 1: gendered language, I remember learning, um, French and Spanish in 572 00:34:32,480 --> 00:34:37,279 Speaker 1: like elementary school and being so confused by the 573 00:34:37,440 --> 00:34:42,200 Speaker 1: article before the word, and so I don't 574 00:34:42,200 --> 00:34:45,640 Speaker 1: really have a starting point for knowing how much 575 00:34:45,680 --> 00:34:49,200 Speaker 1: that impacts how you think of things.
I do remember 576 00:34:49,239 --> 00:34:54,080 Speaker 1: thinking, like, I feel like chocolate is masculine in French, um, 577 00:34:54,160 --> 00:34:56,160 Speaker 1: and I always thought it should be feminine, which says 578 00:34:56,239 --> 00:34:58,560 Speaker 1: more about me than it does about the French language. 579 00:34:59,120 --> 00:35:01,839 Speaker 1: But I had more of that kind of association with 580 00:35:01,880 --> 00:35:04,520 Speaker 1: it, um. But yeah, again, I would love to hear 581 00:35:04,560 --> 00:35:09,160 Speaker 1: from people who speak other languages about their thoughts on this. 582 00:35:11,120 --> 00:35:15,440 Speaker 1: And obviously studies like the ones we're describing are tricky. 583 00:35:15,480 --> 00:35:19,160 Speaker 1: There are a lot of factors that influence inequality, and, um, 584 00:35:19,200 --> 00:35:22,360 Speaker 1: the larger sample size of countries that speak gendered languages compared 585 00:35:22,400 --> 00:35:25,400 Speaker 1: to those that speak natural gender ones makes it harder 586 00:35:25,440 --> 00:35:28,520 Speaker 1: to draw conclusions. But at the same time, I would 587 00:35:28,560 --> 00:35:31,120 Speaker 1: love to see more research, because I think that there 588 00:35:31,239 --> 00:35:37,560 Speaker 1: is something worth looking into there. Definitely, and we have 589 00:35:37,719 --> 00:35:40,200 Speaker 1: a little bit more for you. But first we have 590 00:35:40,280 --> 00:35:47,200 Speaker 1: one more quick break for a word from our sponsor. 591 00:35:52,440 --> 00:35:56,359 Speaker 1: And we're back. Thank you, sponsor. So another thing that 592 00:35:56,440 --> 00:36:01,960 Speaker 1: we can discuss is masculinized words versus feminized words.
593 00:36:02,960 --> 00:36:06,360 Speaker 1: Over time, some words even in non gendered languages have 594 00:36:06,480 --> 00:36:09,680 Speaker 1: been masculinized or feminized, sort of like that collocation 595 00:36:09,680 --> 00:36:12,000 Speaker 1: thing we were discussing at the top, but a bit broader. 596 00:36:12,800 --> 00:36:15,040 Speaker 1: So men are more likely to be described using certain 597 00:36:15,040 --> 00:36:18,080 Speaker 1: words and women are more likely to be described using 598 00:36:18,160 --> 00:36:21,880 Speaker 1: other words. And masculinized words generally have a more positive 599 00:36:21,920 --> 00:36:26,160 Speaker 1: connotation and feminized words generally have a more negative one. 600 00:36:27,400 --> 00:36:30,240 Speaker 1: And we could even extend this conversation to include phrases 601 00:36:30,320 --> 00:36:33,319 Speaker 1: like like a girl, which has traditionally been considered a 602 00:36:33,400 --> 00:36:35,799 Speaker 1: bad thing, but, sort of like shrill, is kind of 603 00:36:35,840 --> 00:36:39,400 Speaker 1: being reclaimed. But still, this idea that things that are 604 00:36:39,440 --> 00:36:43,640 Speaker 1: associated with femininity are bad, that's so a thing. I mean, 605 00:36:43,680 --> 00:36:45,839 Speaker 1: and you've talked a bit on the show about how 606 00:36:46,000 --> 00:36:49,440 Speaker 1: that's something that you grew up with. Yeah. By the 607 00:36:49,520 --> 00:36:51,759 Speaker 1: time, I think I mentioned this in an episode, by 608 00:36:51,760 --> 00:36:53,879 Speaker 1: the time I was in kindergarten, I hated the color 609 00:36:53,960 --> 00:36:57,719 Speaker 1: pink because I associated it with being girly and I 610 00:36:57,719 --> 00:37:01,440 Speaker 1: didn't want to be associated with that.
Um. And we 611 00:37:01,680 --> 00:37:05,080 Speaker 1: talked about how this could even be applied, and there's 612 00:37:05,080 --> 00:37:07,960 Speaker 1: a lot of discussion around this, um, recently, that it 613 00:37:07,960 --> 00:37:10,279 Speaker 1: could be applied to the party system in the US, with 614 00:37:10,320 --> 00:37:14,200 Speaker 1: the Democratic Party being the feminized party and the Conservative 615 00:37:14,200 --> 00:37:20,480 Speaker 1: Party being the masculinized party. Oh, that's interesting. Oh yeah, 616 00:37:20,520 --> 00:37:24,480 Speaker 1: there's so much interesting writing around that happening as we speak. 617 00:37:24,960 --> 00:37:27,920 Speaker 1: And I think, if you spend any time 618 00:37:28,120 --> 00:37:31,560 Speaker 1: in the mucky muck that is, like, right wing Twitter, 619 00:37:32,040 --> 00:37:36,360 Speaker 1: their obsession with portraying liberal or progressive men on 620 00:37:36,400 --> 00:37:40,960 Speaker 1: the left as feminine is really fascinating, like terms like 621 00:37:41,040 --> 00:37:45,520 Speaker 1: calling a progressive male, like, beta, soy boy, like yeah, 622 00:37:45,520 --> 00:37:48,759 Speaker 1: like soy boy, beta, cuck, things like that, where it's 623 00:37:48,800 --> 00:37:54,960 Speaker 1: so weird. It's just very weird. It is, it is. Um.
624 00:37:55,000 --> 00:37:57,200 Speaker 1: And there's actually a study kind of looking into this, 625 00:37:57,840 --> 00:38:01,360 Speaker 1: where, um, it was looking into the relationship or possible 626 00:38:01,400 --> 00:38:05,399 Speaker 1: relationships between sexism and self image, and it found that 627 00:38:05,440 --> 00:38:09,200 Speaker 1: people who display hostile sexism, dislike of women, that's how 628 00:38:09,239 --> 00:38:12,000 Speaker 1: they defined it in the study, are more likely 629 00:38:12,080 --> 00:38:16,320 Speaker 1: to describe themselves using words we typically associate with masculinity, 630 00:38:16,800 --> 00:38:22,520 Speaker 1: like brave, physically strong, determined, admirable, confident. And that was 631 00:38:22,560 --> 00:38:25,840 Speaker 1: the case for men, but for women displaying hostile sexism, 632 00:38:26,040 --> 00:38:29,360 Speaker 1: they described themselves in terms that went against traditional femininity. 633 00:38:29,480 --> 00:38:35,440 Speaker 1: So not tolerant, not cooperative, not compassionate, not sensitive, things 634 00:38:35,480 --> 00:38:37,680 Speaker 1: like that, almost going out of their way to say 635 00:38:37,719 --> 00:38:40,319 Speaker 1: I am not like those other girls, which is a 636 00:38:40,360 --> 00:38:47,480 Speaker 1: trope we've discussed before. Yes, it's, I mean, it's 637 00:38:47,480 --> 00:38:52,680 Speaker 1: funny when we apply gender to traits that are good, you know. 638 00:38:53,280 --> 00:38:55,400 Speaker 1: It's just really, like, who wants to 639 00:38:55,400 --> 00:38:59,560 Speaker 1: describe themselves as intolerant and not cooperative? Those aren't 640 00:38:59,600 --> 00:39:03,760 Speaker 1: good qualities. No, I can't imagine being very proud, 641 00:39:04,280 --> 00:39:08,480 Speaker 1: like, I am not sensitive, not tolerant, not... and it's 642 00:39:08,520 --> 00:39:11,560 Speaker 1: all of these nots, like, it's nothing that you are.
643 00:39:12,200 --> 00:39:17,480 Speaker 1: You're just basically saying I am not that. Mm hmm. Um. 644 00:39:17,560 --> 00:39:20,359 Speaker 1: And the study also found that sexism and racism are 645 00:39:20,440 --> 00:39:23,560 Speaker 1: likely to occur in the same people, because basically they're 646 00:39:23,560 --> 00:39:26,080 Speaker 1: buying into the idea that we are not all created 647 00:39:26,120 --> 00:39:29,680 Speaker 1: equal, and the unequal social hierarchies that stem from that, 648 00:39:30,200 --> 00:39:34,800 Speaker 1: um, like, they buy into those. Oh no, like that doesn't surprise 649 00:39:34,880 --> 00:39:37,960 Speaker 1: me at all. It's one of those things, reading it, 650 00:39:38,000 --> 00:39:41,400 Speaker 1: like, thank you, science. I mean, I always suspected. 651 00:39:42,040 --> 00:39:47,160 Speaker 1: Science confirms what we've already known forever. Exactly. Yeah, other 652 00:39:47,160 --> 00:39:50,879 Speaker 1: studies have shown that words that are masculinized and words that are 653 00:39:50,880 --> 00:39:54,320 Speaker 1: feminized influence what we believe to be the proper behavior 654 00:39:54,440 --> 00:39:57,319 Speaker 1: for men and for women, and that this impacts what 655 00:39:57,360 --> 00:40:00,960 Speaker 1: types of jobs men versus women can get. For instance, 656 00:40:00,960 --> 00:40:04,840 Speaker 1: women are more likely to get jobs with descriptors like kind, caring, 657 00:40:05,239 --> 00:40:06,920 Speaker 1: other things like that, and men are more likely to 658 00:40:06,960 --> 00:40:10,239 Speaker 1: get jobs with words like ambitious and independent. And this 659 00:40:10,320 --> 00:40:13,600 Speaker 1: impacts the advertising and targeting of jobs.
So which jobs 660 00:40:13,600 --> 00:40:16,400 Speaker 1: men and women are more likely to pursue based on 661 00:40:16,440 --> 00:40:20,680 Speaker 1: the descriptors in the job description, and also which candidate 662 00:40:20,920 --> 00:40:22,680 Speaker 1: is more likely to get hired for it. So it's kind 663 00:40:22,680 --> 00:40:27,279 Speaker 1: of like not cyclical, I guess, sort of cyclical, but 664 00:40:27,320 --> 00:40:33,520 Speaker 1: it happens from the advertising process to the job interview. 665 00:40:33,680 --> 00:40:37,640 Speaker 1: It impacts all of that. And I think once you 666 00:40:37,719 --> 00:40:41,239 Speaker 1: have the job, I think it probably, at least in 667 00:40:41,360 --> 00:40:44,359 Speaker 1: my, you know, my personal experience, I think it informs 668 00:40:44,800 --> 00:40:46,840 Speaker 1: who ends up doing what kind of labor in a workplace. 669 00:40:46,840 --> 00:40:49,600 Speaker 1: So we've talked about emotional labor. You know, if you 670 00:40:49,640 --> 00:40:57,040 Speaker 1: associate being caring or organized or, you know, sensitive with 671 00:40:57,360 --> 00:41:01,319 Speaker 1: being a woman, you might let your male employees, 672 00:41:01,400 --> 00:41:03,640 Speaker 1: like, not do that. Like, you might be thinking like, oh, 673 00:41:03,680 --> 00:41:06,440 Speaker 1: that's just a gendered trait that I associate with women. Therefore, 674 00:41:07,080 --> 00:41:09,920 Speaker 1: you know, Joe, the man, doesn't have to be responsible 675 00:41:09,960 --> 00:41:11,200 Speaker 1: for that bit of labor. And so I think, not 676 00:41:11,280 --> 00:41:15,120 Speaker 1: only can it inform who applies for and gets what jobs, 677 00:41:15,160 --> 00:41:19,040 Speaker 1: but how those jobs are done once you get them.
Yeah, yeah, completely, 678 00:41:19,719 --> 00:41:22,080 Speaker 1: And all of the stuff we're talking about does have 679 00:41:22,480 --> 00:41:26,239 Speaker 1: a real world impact, like things like that um contributing 680 00:41:26,280 --> 00:41:29,760 Speaker 1: to a culture that disrespects women, that makes it difficult 681 00:41:29,760 --> 00:41:32,719 Speaker 1: for them to be as openly ambitious, to get equal pay, 682 00:41:32,800 --> 00:41:36,320 Speaker 1: to hold positions of authority, because we don't see women 683 00:41:36,440 --> 00:41:39,160 Speaker 1: in those positions reflected in our language, and that in 684 00:41:39,200 --> 00:41:43,040 Speaker 1: turn influences how we think of women. It suggests that 685 00:41:43,040 --> 00:41:46,040 Speaker 1: women are less than, are outside of the norm. 686 00:41:46,120 --> 00:41:48,360 Speaker 1: It suggests that women are sexual objects at the service 687 00:41:48,400 --> 00:41:51,080 Speaker 1: of men, and that they are not to be believed, 688 00:41:51,360 --> 00:41:55,480 Speaker 1: that the masculine is ideal. And this stuff is internalized 689 00:41:55,520 --> 00:41:57,680 Speaker 1: at a young age. It contributes to a culture that 690 00:41:57,760 --> 00:42:01,880 Speaker 1: routinely erases the experiences of trans people and non binary folks. 691 00:42:01,960 --> 00:42:05,360 Speaker 1: These are important things that we're talking about on a 692 00:42:05,400 --> 00:42:08,879 Speaker 1: societal level right now, and of course language is only 693 00:42:08,920 --> 00:42:11,200 Speaker 1: one part of the solution, but it is something that 694 00:42:11,239 --> 00:42:15,360 Speaker 1: we can work on improving. Definitely, there are some changes 695 00:42:15,400 --> 00:42:18,120 Speaker 1: that we can be happy about.
Some institutions around the 696 00:42:18,160 --> 00:42:20,839 Speaker 1: world are taking steps to change language, whether it's at 697 00:42:20,840 --> 00:42:23,200 Speaker 1: the countrywide level in the case of Sweden updating their 698 00:42:23,239 --> 00:42:26,960 Speaker 1: dictionary with a gender neutral pronoun, or US universities 699 00:42:27,000 --> 00:42:29,320 Speaker 1: like Yale swapping out words like freshman or upperclassmen 700 00:42:29,320 --> 00:42:33,600 Speaker 1: for first year and upper year. And for those 701 00:42:33,680 --> 00:42:38,640 Speaker 1: that are thinking, well, this sounds like a lot of work, uh, 702 00:42:38,760 --> 00:42:41,359 Speaker 1: and it can be. But there are some steps that 703 00:42:41,400 --> 00:42:44,600 Speaker 1: we can take on a personal level. Gently calling people 704 00:42:44,600 --> 00:42:47,880 Speaker 1: out is one, including yourself. Mayim Bialik has a 705 00:42:47,920 --> 00:42:49,600 Speaker 1: whole video on how to do this when it comes 706 00:42:49,640 --> 00:42:53,560 Speaker 1: to the whole girl versus woman thing, and it's wonderful 707 00:42:53,600 --> 00:42:56,440 Speaker 1: and super helpful. Um. Doing this all the time might 708 00:42:56,440 --> 00:42:58,040 Speaker 1: seem like a full time job, but doing it even 709 00:42:58,040 --> 00:43:00,880 Speaker 1: some of the time can make a difference. Yeah, and 710 00:43:01,040 --> 00:43:03,839 Speaker 1: updating idioms and award titles, like instead of best man 711 00:43:03,880 --> 00:43:08,160 Speaker 1: for the job, best person for the job. In letters 712 00:43:08,200 --> 00:43:11,279 Speaker 1: or emails, using something gender neutral if you don't know 713 00:43:11,400 --> 00:43:14,480 Speaker 1: how someone identifies, like to whom it may concern.
For example, 714 00:43:15,120 --> 00:43:17,160 Speaker 1: or a name. A name works too if you're comfortable on 715 00:43:17,160 --> 00:43:19,600 Speaker 1: a first name basis, because then again, if you're not, 716 00:43:19,680 --> 00:43:23,240 Speaker 1: then you get into the Mr., Mrs., Ms. 717 00:43:23,320 --> 00:43:28,560 Speaker 1: thing. Totally. Using they, or avoiding pronouns altogether, which can 718 00:43:28,600 --> 00:43:30,440 Speaker 1: be helpful to be more inclusive of folks who are 719 00:43:30,480 --> 00:43:33,720 Speaker 1: gender nonconforming. Um. Here in DC you can actually 720 00:43:33,760 --> 00:43:38,640 Speaker 1: get the honorific Mx on your driver's license 721 00:43:38,680 --> 00:43:41,719 Speaker 1: as opposed to, like, Miss or Mrs. Um. It's a 722 00:43:41,800 --> 00:43:48,800 Speaker 1: gender neutral way of addressing someone formally. That's, that's pretty cool. Um. 723 00:43:48,840 --> 00:43:51,480 Speaker 1: In general, being more aware of the words we choose 724 00:43:51,760 --> 00:43:54,480 Speaker 1: and what those words communicate and what they say about 725 00:43:54,520 --> 00:43:57,520 Speaker 1: the values and beliefs of our society. On a deeper level, 726 00:43:57,960 --> 00:44:01,080 Speaker 1: if you will, recognizing where we need to do some 727 00:44:01,120 --> 00:44:03,799 Speaker 1: work and then doing that work. I think that that's 728 00:44:03,800 --> 00:44:06,799 Speaker 1: where we can leave this. Yeah. I think it's just 729 00:44:07,000 --> 00:44:11,440 Speaker 1: really about being open to thinking about how we use 730 00:44:11,520 --> 00:44:14,400 Speaker 1: language and, sort of, you know, being willing to 731 00:44:14,440 --> 00:44:17,879 Speaker 1: get it wrong and being willing to change and sort 732 00:44:17,920 --> 00:44:23,600 Speaker 1: of update these outdated ways of thinking.
Yeah, and knowing 733 00:44:23,640 --> 00:44:28,320 Speaker 1: this history, I think, can be an important step 734 00:44:28,320 --> 00:44:31,360 Speaker 1: of that, an important step of moving forward and making 735 00:44:31,400 --> 00:44:36,160 Speaker 1: something better, and I'm excited to see it. That about 736 00:44:36,200 --> 00:44:38,800 Speaker 1: brings us to the end of our deep dive into 737 00:44:38,800 --> 00:44:42,880 Speaker 1: the sexism of language. Like we said, we'd love to 738 00:44:42,920 --> 00:44:46,920 Speaker 1: hear from people who speak different languages, um, what your 739 00:44:46,960 --> 00:44:50,720 Speaker 1: experience has been. You can email us at mom Stuff 740 00:44:50,760 --> 00:44:52,560 Speaker 1: at how stuff works dot com, or you can find 741 00:44:52,640 --> 00:44:56,200 Speaker 1: us on Twitter at mom Stuff podcast or on Instagram 742 00:44:56,239 --> 00:44:58,960 Speaker 1: at Stuff I've Never Told You. And where can people 743 00:44:58,960 --> 00:45:03,360 Speaker 1: find you, Bridget? Well, as I mentioned in an earlier episode, 744 00:45:03,400 --> 00:45:05,200 Speaker 1: my time with Sminty is winding down, so if you 745 00:45:05,239 --> 00:45:06,799 Speaker 1: want to keep up with all the fun things I'm 746 00:45:06,800 --> 00:45:08,839 Speaker 1: gonna be doing, you can find me on Twitter at 747 00:45:08,840 --> 00:45:12,759 Speaker 1: Bridget Marie and on Instagram at Bridget Marie in DC, 748 00:45:13,400 --> 00:45:16,959 Speaker 1: DC like the city. Thanks as always to our 749 00:45:17,000 --> 00:45:20,440 Speaker 1: producer Trevor Young, and thanks to you for listening.