Speaker 1: Welcome to Stuff Mom Never Told You, from HowStuffWorks.com.

Hello, and welcome to the podcast. I'm Kristen. And I'm Caroline, and with us today from Stuff to Blow Your Mind we have Julie Douglas. Howdy! So thanks so much for being here, Julie, to help us kick off our four-part Women in STEM (Science, Technology, Engineering and Math) series. That's right, you're the all-important S in STEM. That's right, I represent S. I feel like I'm on Sesame Street. That's right, I'm S. Well, it's appropriate, because you're actually holding a giant S right now. I told you you didn't have to do that. I know, but I just thought it would really ramp us up, you know. And no pressure, Julie, but you do represent all of science and, more importantly, all of women and girls in science. So I hope you are ready for that. I am so ready to talk really eloquently about this subject and just to change everybody's minds and just to revisit your childhood memories.
Perhaps everybody gets fetal at some point, and then you emerge in the field of lilies with an absolute new concept of life in the universe. I've got goosebumps. How about you two? Is that on your resume? That's pretty impressive, that pyrotechnics.

Well, Julie, you host the science podcast Stuff to Blow Your Mind. So could you, just to kick things off, tell us a little bit about what piques your interest with science? Have you always been kind of a science nerd, or... or what?

Um, I think that I've just always been curious about everything. I mean, I remember being thirteen years old and buying a book on hypnosis and trying to hypnotize myself. Like, that was what I was doing on a Saturday night. Um, so that, I think, just naturally led to me, um, really getting more into some aspects of what Richard Dawkins calls the magic of reality, this idea that our reality can be far more interesting and weird than, you know, these fictionalized accounts that we turn to.
And so when I think about science, I think about that a lot, because I think, wow, I mean, there are aspects of our physical world that just completely change our perspective. Like, for instance, um, there's something called green leaf volatiles in a tobacco plant. So if you actually cut that leaf, it releases these GLVs, which is basically a scream for help. It will tell the predators of whatever just munched on it, like a caterpillar, and say, hey, come and get this caterpillar off me. So it's just weird and wacky the way that our physical world works.

Um, and when I think about science and women, I think naturally about my daughter. She's four years old. You know, about a year ago, the onslaught of total pink pony magic, you know, castle, just that whole onslaught of genderization came rushing at us. So here's this daughter that I have that is so interested in every aspect of life, but now I see things getting genderized.
Instead of being interested in bugs and, um, you know, all the cool aspects of the insect world, she's starting to veer more towards what I would say is magical reality, instead of the magic of reality.

Well, it's interesting that you bring up that difference in what might have been her just natural interests and then the onslaught of all of the pink princessy products and programming and all that stuff, because I feel like that divide between the nature and the nurture really is the bulk of this conversation surrounding women and girls and science and, unfortunately, the lack thereof, um, where there's not as many girls pursuing science. And so the whole question a lot of times is, well, are we just not as good at science, or are we just not being informed about science and encouraged towards science enough?

Well, um, working on the podcast Stuff to Blow Your Mind with Robert Lamb, he's the co-host, we just keep sort of getting more at this sort of invisible reality, all these sort of underpinnings of how we define our world and we move around in it.
And one of the things that I was thinking about is something called enclothed cognition. Now, this is a subset of something called embodied cognition. So if I'm holding, say, a warm cup of coffee and I'm talking to you guys, there are studies that say that I feel more warmly towards you. It sounds kind of ridiculous and reductionist, but sometimes we simple humans operate on that level. And so enclothed cognition is this idea that what you wear can actually change your behavior. And they did a bunch of studies, which I won't go into, about clipboards, which obviously make everybody feel more important, and doctors' coats, and they just found amazing results: that people could actually change not just their behavior but the way that their brain works. Um, they become more attentive if they're wearing a doctor's coat. Um.
So I think about those things, and then I think about, um, as you said, the ways that nature and nurture work. And there is a lot of evidence out there, mounting evidence, for the idea that, um, it's really our society and our culture that are informing the way that we behave and the interests that sort of bloom as a result.

Well, it's interesting that you mentioned lab coats making you feel more attentive. It seems to me like that's almost taking our stereotypes of what a doctor is or should be and applying them internally, in our brains, making us feel a certain way. And all of that to say, some people have some interesting stereotypes and ideas about who should be involved where in the sciences, who should be wearing that lab coat.
Back in January two thousand five, then Harvard President Larry Summers set off on a public apology tour after the Boston Globe reported on his remarks, given at a small seminar that January, in which he said that the women-in-science gender gap may be partially explained by issues of intrinsic aptitude. And I'm thinking, does he actually believe that, or is he just harping on some old stereotypes about women in science? What's going on?

Yeah. And during this public apology tour, he explained himself more fully, saying that, well, actually, there are all these socialization issues, and maybe women are not being afforded enough time, because usually things like the child care tend to fall on our shoulders. But he couldn't erase the fact that he did say that maybe our brains aren't as cut out for science.

Yeah, I mean, he said intrinsic aptitude. I mean, he's saying, from the depths within women, they just... I mean, he's not saying this, but I'm taking it that he's saying, like, you know, that they just don't have it in them. They can't wear that lab coat, even if it makes them perform, you know, better on these Stroop attention tests.
Um, so it was unfortunate that he said it. But I think what's interesting about it is it just set off this ripple effect, this realization that in academia there is some pervasive, um, you know, bias toward women, um, and sexism.

Yeah. The thing is, he was saying what people were unfortunately thinking, that maybe this was the case. And in a way it's good, because even, you know, years later, we're still talking about it. It definitely sparked a huge discussion. And soon thereafter, in April two thousand five, there was a highly publicized debate between Steven Pinker and Elizabeth Spelke about whether or not, you know, Summers was right, essentially, whether it's more an effect of nature or nurture. And Pinker, I mean, presented this entire slide show just hammering away at it being nature, that, no, girls are somehow not as inclined to want to explore scientifically, it's not there in our brains. But I mean, as you were saying, Julie, I mean, you watched your daughter transform before your eyes from being all into bugs and insects and the outdoors to, you know, maybe liking My Little Pony a little bit more.
Yeah. Well, and you know, she does, she does veer towards that now, um, inherently. Like, she still has the interest. And I watched just this morning, she was using these things called Magna-Tiles, which I really want for myself, um, and building this incredible structure, and it ended up being a castle for a... which I don't know if that's an improvement or not over a princess. But, um, you know, I still see these skills and this interest. But again, there's this whole other pink avalanche that is coming at us and will come at her for the rest of her life. It is very damaging, whether it's, you know, Summers, who carries a lot of weight, or Pinker, saying that this is something that is inherent in nature. And what I really love about this debate between Spelke and Pinker is that, first of all, they're great friends, but they differ on this one thing, obviously. Um, it is such a rousing debate, and Spelke cleans up. And, you know, she looks at twenty years of her research. She's a cognitive psychologist; she deals mainly with infants and children.
She comes through. Can you imagine going through twenty years of research? And she looks at all the problem-solving skills and number skills that children possess, and she says there is no difference. She says it is a null hypothesis. Now, that is not, you know, just... that's the debate, and it was over. It's actually, if you want to look at it online, a very long discussion between the two. But it is fascinating. And it's just a bit sad to me that we actually even have to have this debate in the first place, with slide shows, to say that this is not a nurture thing... excuse me, to say that this is not a nature thing.

Well, and I think it's almost like we want to believe that it has to be nature. I mean, the year after that, in two thousand six, there was the best seller The Female Brain by Louann Brizendine, who got so much attention, but not so favorably, from people who actually read the book.
And people who were scientifically inclined said, wait, wait, wait, wait, wait. Uh, you're saying that the female brain is cut out like this, and you're ignoring a lot of actual scientific truths to make this argument that... I mean, the only thing that comes to mind is just that we have lady brains.

Yeah, it's true. I've read the book, actually, and, um, she makes a very persuasive argument with what passes as fact or facts, and, um, just sort of these overwhelming generalities about the brain, to sort of say, oh, this is why we get the sads sometimes, there's hormones, and every month, you never know, you're just gonna be nuts, you know. So it's sort of reinforcing all these stereotypes through the lens of science, which is unfortunate.

Well, there was a review in Nature that basically highlighted three main areas where her book is sort of off base. And they refer, number one, to her discussion of human sex differences in brain structure and behavior, which are characterized by small average differences with a lot of overlap on the individual level.
They point out that she characterized it as more of a massive gulf in those differences.

Yeah, and on top of that, there was just a lot of misinterpretation and sort of fudging of other studies' results to conveniently align with this nature-heavy argument. Um, and I feel like this is still the nature-versus-nurture debate. It's still going on; it's not like it was settled with Spelke and with, you know, this damning Nature review of The Female Brain. We're still talking about it. It's still posed as a debate. I mean, do you think, Julie, that there's ever going to be a winner declared in terms of nature versus nurture with women in science?

Um, no. I mean, not for a long time, because the problem is, the more we learn about the brain, the more data we add to it. And that's good, right? Because over time we begin to get a more accurate idea of how the brain does work, you know. But to try to create a female brain and a male brain is like trying to create two different species. And so that's where the error in logic is.
And I think that we do a huge disservice in looking at this as black and white, you know, like nature or nurture, and not considering the individual. And that's what it really boils down to, because we know that individually we all differ, and there are certainly different aspects of our makeup that really tend to dominate the way that we act, or our interests. And so I think that it's, again, a huge disservice. I think about men in this instance, too. I feel badly for men, because what happens is, the flip side is that men are not able to say that they feel vulnerable, or to dwell in what we would call the world of female emotions. Well, men are just as likely to be sad and, you know, feel vulnerable or experience shame as women are. And I think about my dad and my brother and the way that they can't express themselves.
So I think that, you know, to try to divvy up the world in these ways and tell people that, you know, you should be, uh, you know, getting into these various fields because they fit your gender is a ridiculous statement. Like, can you imagine your mom or dad saying, you know, because you're this gender, you should really think about, you know, podcasting?

Well, no. I mean, you know, you mentioned Spelke going back over twenty years of research, and I have gone back over zero years of research other than my own existence. And I was thinking about my own interests and how, I mean, I was interested in science, and I always have been. It's just that's not the field that I necessarily pursued. And I'm wondering, you know, okay, (a) I was better at, you know, language arts and stuff like that, so I went into a writing career. But I'm just wondering, if more women had been present in my life telling me about science, explaining science... I mean, I had a couple of female math and science teachers growing up, but... I don't know.
I just wonder if we need more people like that to help our daughters realize that, yes, you can still be interested in the pink avalanche that's coming at you, but you can also build castles, build blocks, be interested in plants and bugs.

Yeah, and when we move the conversation from, you know, debating about the brain and ability into looking at the data that we have, in terms of the number of girls in classrooms who are interested in science and who pursue science, um, then, yeah, that visibility factor absolutely comes up a lot. And we're going to talk about where we are today with women in science, by the numbers, when we come right back from a quick break.

And now back to the podcast. So when we left off, we were going to talk about women in science: how many women are studying and pursuing STEM fields today. And there are a lot of us in psychology. There's that. Well, I mean, but yeah, doesn't that seem like it makes sense?
It aligns with our expectations, because it may be scientific, but it's also in the feelings pool.

Well, and we see a lot of women as therapists, as psychologists, thinking of Dr. Melfi on The Sopranos, for instance. Um, I took a psychology class in college. I even considered minoring in it. Um, so yeah, I think it very much aligns with our ideas. But when we move into things like math, the physical sciences, engineering, the numbers start eroding very quickly.

It's true. Um, but I keep thinking about this article called "The End of Men," and I think it was Hanna Rosin who wrote it, and she was saying that among people who are pursuing their master's and their PhD, women are dominating. So I wonder, at some point, in fifty years, because of the amount of women pursuing higher education, uh, if maybe that'll, you know, mess with the stats a little bit. But of course, you know, currently this is sort of the picture that we have. It's the picture of ourselves: that we're fine with psychology, but we're not necessarily being moved into science and engineering.
I mean, it seems like the ranks are obviously swelling in terms of women pursuing higher education, but a lot of times the pipeline is directed more toward the, and I hate to use this term, but the pink-collar fields, things like nursing, where, especially because of, like, the aging baby boomers, it's a high-growth kind of industry to get into. Um, but still, if you look at physics, for instance, only one fifth of the PhDs in the US, which, first of all, just the thought of pursuing a physics PhD makes my mind explode a little bit, but only a fifth of them are being awarded to women. And even more tellingly, only half of the physics PhDs going to women in the US are going to American women. So it's a lot of international students also who will come into the United States.
Yeah, and if you're looking at physics in particular, The New York Times pointed out that gender differences in enrollment rates in high school physics classes tend to be correlated with the number of women in the larger community who do or do not work in STEM fields, which is kind of what we were talking about earlier, about having representation and role models.

Yeah, right. If your mom is a scientist, then you probably are going to be more amenable to that idea of having that career for yourself. I mean, Marie Curie is a perfect example of this, and her daughter went on to win a Nobel Prize as well. Um, so, you know, it's what you're exposed to, I think. I know I keep saying that society keeps dictating what we should and shouldn't do, and it seems kind of ridiculous to say that, because you want to feel like you're in control of your life and say, no, um, I know what society is throwing at me. But so much of this is unconscious. And that's where the unconscious bias comes in, in the sciences, and in the way we regard ourselves and the way that others regard us as women.
324 00:18:35,280 --> 00:18:39,080 Speaker 1: And if you doubt the power of culture over the individual, 325 00:18:39,400 --> 00:18:42,159 Speaker 1: I just wanted to point out that, um, there was 326 00:18:42,200 --> 00:18:45,680 Speaker 1: this really interesting study by Heejung Kim at the University 327 00:18:45,720 --> 00:18:49,480 Speaker 1: of California, wanting to know how South Korean 328 00:18:49,480 --> 00:18:53,960 Speaker 1: men dealt with feelings of isolation and sort of reaching 329 00:18:53,960 --> 00:18:56,560 Speaker 1: out to one another, particularly men who have this G 330 00:18:57,040 --> 00:19:01,520 Speaker 1: variant when it comes to, uh, having receptors for oxytocin, 331 00:19:01,600 --> 00:19:05,280 Speaker 1: oxytocin being the social bonding drug that our body produces. 332 00:19:05,400 --> 00:19:08,119 Speaker 1: And so in theory, the South Korean men should have 333 00:19:08,240 --> 00:19:11,159 Speaker 1: been the ones who would reach out the most. But 334 00:19:11,280 --> 00:19:15,520 Speaker 1: because in South Korean culture it is looked down upon 335 00:19:15,600 --> 00:19:19,000 Speaker 1: to actually try to go to another friend for solace, 336 00:19:19,240 --> 00:19:22,160 Speaker 1: no matter what your gender is, those men were actually 337 00:19:22,440 --> 00:19:25,760 Speaker 1: more likely to never reach out. So the point of 338 00:19:25,760 --> 00:19:28,760 Speaker 1: that is that here you have, again, society dictating 339 00:19:29,080 --> 00:19:33,199 Speaker 1: behavior and thought as opposed to biology, which should have, 340 00:19:33,320 --> 00:19:38,320 Speaker 1: for these guys, made them like just huge, like, hug junkies. Yeah.
341 00:19:38,359 --> 00:19:41,480 Speaker 1: As I was reading up for this episode, I was 342 00:19:41,520 --> 00:19:46,080 Speaker 1: reflecting on my own life, and from second through eighth grade, 343 00:19:46,119 --> 00:19:50,000 Speaker 1: I was homeschooled, and I mean, obviously I had interaction 344 00:19:50,080 --> 00:19:52,239 Speaker 1: with other kids. I watched TV and all of that, 345 00:19:52,320 --> 00:19:56,639 Speaker 1: and so I knew about nerd stereotypes. But nevertheless, in 346 00:19:56,840 --> 00:20:00,680 Speaker 1: more relative social isolation, I was kind of a math 347 00:20:00,760 --> 00:20:03,920 Speaker 1: and science nerd. I still always wanted to be a writer, 348 00:20:04,359 --> 00:20:06,560 Speaker 1: but I always was really good at math and science 349 00:20:06,800 --> 00:20:10,600 Speaker 1: as well. But once I got to high school and 350 00:20:11,240 --> 00:20:14,600 Speaker 1: was just slapped in the face by all of the 351 00:20:14,640 --> 00:20:19,840 Speaker 1: importance of your image and how it comes across, my interest, 352 00:20:19,920 --> 00:20:23,720 Speaker 1: or at least the interest that I openly expressed toward 353 00:20:23,840 --> 00:20:29,320 Speaker 1: math and science, diminished. And I remember in senior year, 354 00:20:29,400 --> 00:20:32,600 Speaker 1: I had the option to take AP biology and I 355 00:20:32,680 --> 00:20:34,919 Speaker 1: just didn't want to, because I didn't want to be 356 00:20:34,960 --> 00:20:38,760 Speaker 1: one of those nerds. And it's heartbreaking to think now, 357 00:20:39,320 --> 00:20:42,160 Speaker 1: like, myself now, I would totally take, you know, AP 358 00:20:42,280 --> 00:20:44,960 Speaker 1: biology and tell myself to do that. But I really 359 00:20:45,240 --> 00:20:48,800 Speaker 1: feel like it was totally an issue of that 360 00:20:48,920 --> 00:20:52,879 Speaker 1: socialization factor. And I'm just one example.
Well, I mean, 361 00:20:52,920 --> 00:20:55,720 Speaker 1: I'll go back to that magical reality versus the magic 362 00:20:55,760 --> 00:20:58,320 Speaker 1: of reality. So you know, at some point, again, 363 00:20:58,680 --> 00:21:01,600 Speaker 1: you know, girls have these books that they're exposed to 364 00:21:01,760 --> 00:21:05,919 Speaker 1: about, um, you know, magical castles and getting married, and 365 00:21:06,000 --> 00:21:09,879 Speaker 1: that's sort of the terrain of girls. Spells, again, 366 00:21:09,920 --> 00:21:15,360 Speaker 1: these belief systems into magic and the occult, all sort 367 00:21:15,400 --> 00:21:19,240 Speaker 1: of line up with women, you know, for the most part, 368 00:21:19,400 --> 00:21:24,640 Speaker 1: being stripped of this ability to really access that magic 369 00:21:24,720 --> 00:21:27,679 Speaker 1: of reality. And I wonder, though, too, how much of 370 00:21:27,680 --> 00:21:32,040 Speaker 1: this is more of a Western issue, because there was 371 00:21:32,359 --> 00:21:34,719 Speaker 1: one stat that was tossed out, um, in a New 372 00:21:34,760 --> 00:21:39,000 Speaker 1: York Times magazine piece about how, um, from nineteen seventy 373 00:21:39,040 --> 00:21:41,760 Speaker 1: four to two thousand and eight, the US sent just 374 00:21:41,920 --> 00:21:48,240 Speaker 1: three girls to the International Mathematical Olympiad, but by comparison, Bulgaria, 375 00:21:48,520 --> 00:21:52,440 Speaker 1: for instance, over that same period had sent twenty-one. 376 00:21:52,880 --> 00:21:56,120 Speaker 1: And a lot of times, like with that physics PhD 377 00:21:56,200 --> 00:21:59,199 Speaker 1: statistic, where half of the women in the United States 378 00:21:59,320 --> 00:22:03,160 Speaker 1: who receive these PhDs are from other countries.
Well, yeah, 379 00:22:03,160 --> 00:22:05,720 Speaker 1: I mean, I definitely think you could argue that partially 380 00:22:05,720 --> 00:22:07,880 Speaker 1: it is cultural, because if you look at, 381 00:22:07,920 --> 00:22:11,199 Speaker 1: like, Eastern European countries, for instance, they have a 382 00:22:11,359 --> 00:22:15,560 Speaker 1: history of wanting to train their women in more technical 383 00:22:16,160 --> 00:22:20,199 Speaker 1: and mathematical fields, and so it is more common to 384 00:22:20,400 --> 00:22:25,480 Speaker 1: see, you know, women from that region in programming, 385 00:22:25,680 --> 00:22:29,399 Speaker 1: in mathematics and science and all of this stuff, whereas 386 00:22:29,560 --> 00:22:32,360 Speaker 1: in America, I don't know, are we 387 00:22:32,359 --> 00:22:35,080 Speaker 1: not as actively trying to push girls and women 388 00:22:35,080 --> 00:22:37,160 Speaker 1: into those fields? No, not at all. Um, I mean, 389 00:22:37,200 --> 00:22:39,959 Speaker 1: the STEM initiative is really huge in the United States right 390 00:22:39,960 --> 00:22:42,760 Speaker 1: now, because, you know, a lot of educators realized 391 00:22:42,760 --> 00:22:45,879 Speaker 1: that there's a huge brain drain in that field or 392 00:22:45,920 --> 00:22:48,520 Speaker 1: in those fields, just for the country as a whole. 393 00:22:48,640 --> 00:22:51,000 Speaker 1: So you see a lot of people paying attention to 394 00:22:51,080 --> 00:22:53,680 Speaker 1: it now, but that's because, you know, we're not 395 00:22:53,920 --> 00:22:56,680 Speaker 1: ranking as high as we used to at all. It's 396 00:22:56,720 --> 00:23:00,600 Speaker 1: actually pretty dire, and we're more exporters, really, 397 00:23:00,640 --> 00:23:03,240 Speaker 1: of entertainment.
If you look at what the United States 398 00:23:03,320 --> 00:23:06,040 Speaker 1: is known for now, we're not, you know, in the 399 00:23:06,119 --> 00:23:08,680 Speaker 1: space race like we were in the sixties and seventies 400 00:23:08,720 --> 00:23:12,960 Speaker 1: and really driving, um, innovation when it comes to science, math, 401 00:23:13,000 --> 00:23:15,919 Speaker 1: and engineering. Yeah, and girls are going to be 402 00:23:16,080 --> 00:23:20,280 Speaker 1: such a big part of filling that pipeline. Because I 403 00:23:20,320 --> 00:23:24,159 Speaker 1: think it might have been Meg Urry, the astrophysicist, who was 404 00:23:24,200 --> 00:23:27,280 Speaker 1: making this point, but someone was saying how 405 00:23:27,920 --> 00:23:33,400 Speaker 1: the problem with STEM being so stereotypically focused on boys 406 00:23:34,000 --> 00:23:38,119 Speaker 1: is that, and this is nothing against men's or boys' capabilities, 407 00:23:38,560 --> 00:23:40,560 Speaker 1: but it's like, if that's the only pool we have 408 00:23:40,680 --> 00:23:42,840 Speaker 1: to choose from, you're gonna have to start going to 409 00:23:42,840 --> 00:23:46,400 Speaker 1: the bottom of the barrel rather than tapping more into 410 00:23:46,760 --> 00:23:48,879 Speaker 1: the female talent that you have as well. So you're 411 00:23:48,920 --> 00:23:52,879 Speaker 1: going to have more top-tier applicants to choose from. 412 00:23:53,040 --> 00:23:57,640 Speaker 1: But when it comes to encouragement, it's almost as though 413 00:23:57,680 --> 00:24:02,600 Speaker 1: women need just so much of it, you know what I mean? Well, 414 00:24:02,640 --> 00:24:05,199 Speaker 1: there's, again, all that unconscious bias going on, all 415 00:24:05,240 --> 00:24:07,239 Speaker 1: these things that we've absorbed over the years.
I'm sure 416 00:24:07,240 --> 00:24:10,920 Speaker 1: you guys are familiar with the, um, "I hate math" Barbie, right? 417 00:24:11,119 --> 00:24:13,520 Speaker 1: So you know, you pull a little cord on the 418 00:24:13,560 --> 00:24:16,439 Speaker 1: back and she says, "I hate math." Now, you know, 419 00:24:16,520 --> 00:24:19,359 Speaker 1: since then, Mattel has put out other types of 420 00:24:19,359 --> 00:24:27,360 Speaker 1: Barbies that have better representations of Barbie's most intimate thoughts. Yes, glasses, um, 421 00:24:27,680 --> 00:24:31,080 Speaker 1: but you know, that still sticks with us. It's very 422 00:24:31,119 --> 00:24:33,880 Speaker 1: hard to shed those ideas. And I was thinking about 423 00:24:34,000 --> 00:24:37,080 Speaker 1: the New York Times article that you guys had sent me, 424 00:24:37,160 --> 00:24:39,760 Speaker 1: and it's by Eileen Pollack, who was really sort of 425 00:24:39,800 --> 00:24:42,280 Speaker 1: looking at this idea for herself, because she was one 426 00:24:42,480 --> 00:24:45,360 Speaker 1: of the first two women physics majors who graduated from 427 00:24:45,440 --> 00:24:48,680 Speaker 1: Yale University in nineteen seventy-eight. So this article is 428 00:24:48,720 --> 00:24:52,639 Speaker 1: an exploration of why she didn't pursue that herself, um, 429 00:24:52,800 --> 00:24:56,240 Speaker 1: in which she talked about how priming is really important. 430 00:24:56,240 --> 00:24:59,960 Speaker 1: And there's the study of the University of Michigan students, 431 00:25:00,080 --> 00:25:03,840 Speaker 1: uh, students with the same abilities and the 432 00:25:03,880 --> 00:25:06,720 Speaker 1: same level of performance, men and women, and they were 433 00:25:06,800 --> 00:25:09,440 Speaker 1: split into two groups. The first was told that men 434 00:25:09,640 --> 00:25:13,320 Speaker 1: perform better than women on math tests.
The second was 435 00:25:13,359 --> 00:25:16,120 Speaker 1: told that, no matter what Barbie might have said, there 436 00:25:16,160 --> 00:25:19,520 Speaker 1: was no difference in the abilities of the two genders. Now, 437 00:25:19,560 --> 00:25:21,879 Speaker 1: both were then given a math test, and in the 438 00:25:21,920 --> 00:25:25,119 Speaker 1: first group, men outscored women by twenty points; in the 439 00:25:25,160 --> 00:25:28,800 Speaker 1: second, by only two points. I mean, 440 00:25:28,880 --> 00:25:32,280 Speaker 1: to me, that's amazing, because that's one instance of priming: 441 00:25:32,520 --> 00:25:36,960 Speaker 1: one math test, a twenty-point difference right there. And 442 00:25:37,240 --> 00:25:39,120 Speaker 1: that's not even saying, like, well, what if you had 443 00:25:39,160 --> 00:25:43,000 Speaker 1: done that, you know, five years before those students reached 444 00:25:43,000 --> 00:25:45,280 Speaker 1: that age, and you had started doing that priming? Then 445 00:25:45,840 --> 00:25:50,160 Speaker 1: what would their test results look like five years later? Yeah, 446 00:25:50,200 --> 00:25:52,000 Speaker 1: it would be incredible to see what would happen if 447 00:25:52,000 --> 00:25:55,399 Speaker 1: we eliminated that. I think it's usually called stereotype threat. 448 00:25:55,960 --> 00:25:59,439 Speaker 1: Take that out of the classroom and see what would happen. 449 00:25:59,720 --> 00:26:02,040 Speaker 1: It would be incredible, because you would also probably have some 450 00:26:02,119 --> 00:26:04,720 Speaker 1: of the cross-pollination of more girls going into science 451 00:26:04,720 --> 00:26:07,840 Speaker 1: and maybe more boys heading over into things like language 452 00:26:07,920 --> 00:26:12,120 Speaker 1: arts, and everyone could then flourish in this wonderful garden 453 00:26:12,119 --> 00:26:18,280 Speaker 1: of learning. That's amazing.
But one thing, though, that 454 00:26:18,400 --> 00:26:22,760 Speaker 1: astrophysicist Meg Urry did say in that New York Times 455 00:26:22,760 --> 00:26:26,840 Speaker 1: magazine piece was that women need more positive reinforcement and 456 00:26:26,960 --> 00:26:30,760 Speaker 1: men need more negative reinforcement. Which is funny, because, and 457 00:26:30,760 --> 00:26:32,800 Speaker 1: this was something that came up in one of our 458 00:26:33,119 --> 00:26:36,879 Speaker 1: episodes on Sheryl Sandberg's book Lean In, how in the 459 00:26:36,960 --> 00:26:40,360 Speaker 1: business setting, a lot of times, if something goes wrong, 460 00:26:40,680 --> 00:26:44,000 Speaker 1: a woman will assume that it's her fault, whereas men 461 00:26:44,080 --> 00:26:47,720 Speaker 1: tend to look externally for the blame, which 462 00:26:47,720 --> 00:26:50,600 Speaker 1: is kind of interesting. And that's what it reminded me 463 00:26:50,640 --> 00:26:53,880 Speaker 1: of, these women in the science classroom just assuming 464 00:26:53,880 --> 00:26:57,160 Speaker 1: that they're not going to be good enough. Yeah, um, 465 00:26:57,160 --> 00:27:01,639 Speaker 1: my middle school actually had single-gender math classes to 466 00:27:01,720 --> 00:27:05,480 Speaker 1: try to combat the, uh, I guess, you know, just 467 00:27:05,480 --> 00:27:08,040 Speaker 1: to try to get more women interested in math, 468 00:27:08,040 --> 00:27:10,520 Speaker 1: to help us flourish and not feel like we had 469 00:27:10,560 --> 00:27:13,520 Speaker 1: to keep our hands down. You know, talk about 470 00:27:13,560 --> 00:27:16,320 Speaker 1: priming as an outside force. Like, I think I 471 00:27:16,520 --> 00:27:19,679 Speaker 1: primed myself to not do well, though, because 472 00:27:19,800 --> 00:27:24,359 Speaker 1: I just never got math, and it never clicked with me 473 00:27:24,480 --> 00:27:27,400 Speaker 1: the way that writing and even science did.
And so, 474 00:27:27,520 --> 00:27:29,600 Speaker 1: like, I think I kind of set myself back, 475 00:27:29,600 --> 00:27:32,480 Speaker 1: because from the moment I realized that I wasn't good 476 00:27:32,520 --> 00:27:34,720 Speaker 1: at it and I didn't have the same confidence with 477 00:27:34,760 --> 00:27:36,720 Speaker 1: it that I did in other subjects, I was kind 478 00:27:36,720 --> 00:27:39,199 Speaker 1: of like, well, I'm just gonna be a bad math student. 479 00:27:39,840 --> 00:27:42,120 Speaker 1: So yeah, I mean, I just kind of accepted 480 00:27:42,119 --> 00:27:46,520 Speaker 1: and internalized that, and the single-gender math classes did 481 00:27:46,560 --> 00:27:50,000 Speaker 1: not really help me. But even for the 482 00:27:50,040 --> 00:27:53,920 Speaker 1: girls who are able to bypass stereotype threat, who 483 00:27:53,960 --> 00:27:59,120 Speaker 1: have incredible math talent and science brains, um, because there 484 00:27:59,200 --> 00:28:02,240 Speaker 1: is a, you know, particular skill set and talent there, 485 00:28:02,840 --> 00:28:06,639 Speaker 1: even for the girls who go on to college, there's 486 00:28:06,720 --> 00:28:12,439 Speaker 1: still a big problem in terms of the biases against women. 487 00:28:12,480 --> 00:28:17,080 Speaker 1: Even at the highest levels of science academia, 488 00:28:17,800 --> 00:28:20,280 Speaker 1: you know, it's a tougher road for women. Yeah, 489 00:28:20,359 --> 00:28:23,360 Speaker 1: and there's, you know, starting to be some emerging 490 00:28:23,720 --> 00:28:26,480 Speaker 1: data about this discrimination, so you can really get 491 00:28:26,640 --> 00:28:30,359 Speaker 1: your eyes around some data on that. But there's also 492 00:28:30,440 --> 00:28:34,359 Speaker 1: a ton of anecdotal information coming out.
I was just 493 00:28:34,440 --> 00:28:38,640 Speaker 1: reading something about, um, Dr. Danielle Lee, I think, is 494 00:28:38,680 --> 00:28:42,360 Speaker 1: her name, and she has something called The Urban Scientist. 495 00:28:42,360 --> 00:28:46,040 Speaker 1: It's a blog that's on Scientific American, and she was 496 00:28:46,320 --> 00:28:49,760 Speaker 1: offered a guest blogging gig at, I think it was, 497 00:28:49,840 --> 00:28:52,800 Speaker 1: Biology Online. Oh, I heard about this. Okay. 498 00:28:52,840 --> 00:28:55,000 Speaker 1: So the editor said, hey, you know, would you 499 00:28:55,000 --> 00:28:57,200 Speaker 1: like to join us? Well, it's an unpaid, you know, 500 00:28:57,360 --> 00:29:01,320 Speaker 1: blogging gig for her. So she replied back very nicely, thanks, 501 00:29:01,360 --> 00:29:05,720 Speaker 1: but not interested. And so he wrote back, is it 502 00:29:05,760 --> 00:29:07,960 Speaker 1: because of the pay? Are you the urban scientist or the 503 00:29:08,080 --> 00:29:13,040 Speaker 1: urban whore? Alright? So you know, guys, again, this is 504 00:29:13,040 --> 00:29:15,000 Speaker 1: anecdotal, but this is what kind of 505 00:29:15,040 --> 00:29:18,720 Speaker 1: gives you an idea of how someone decided to take 506 00:29:18,800 --> 00:29:23,440 Speaker 1: not only her livelihood but her gender and just undercut 507 00:29:23,520 --> 00:29:26,560 Speaker 1: all of that and accuse her of prostitution. You know, 508 00:29:26,800 --> 00:29:28,720 Speaker 1: of course, he probably thought it was a great joke, 509 00:29:28,800 --> 00:29:30,400 Speaker 1: and I bet he was in love with that, because 510 00:29:30,400 --> 00:29:36,520 Speaker 1: he was like, urban scientist, women, prostitution. Yeah, I'm going 511 00:29:36,560 --> 00:29:39,720 Speaker 1: to convince her yet, right. Well, and a similar 512 00:29:39,760 --> 00:29:42,760 Speaker 1: thing happened not too long ago.
Uh, the 513 00:29:42,800 --> 00:29:45,840 Speaker 1: woman whose name is escaping me right now, who runs, 514 00:29:46,120 --> 00:29:50,320 Speaker 1: um, I Effing Love Science. Uh, it started out as 515 00:29:50,360 --> 00:29:54,040 Speaker 1: just a Facebook page, and when she, you know, came 516 00:29:54,040 --> 00:29:56,200 Speaker 1: out and said, oh yeah, I'm the person who does it, 517 00:29:56,280 --> 00:29:59,760 Speaker 1: there was this whole fallout, a trail of 518 00:30:00,000 --> 00:30:04,720 Speaker 1: comments from outraged men who felt 519 00:30:04,720 --> 00:30:07,880 Speaker 1: like they had been duped into enjoying all of this 520 00:30:08,040 --> 00:30:11,760 Speaker 1: scientific content that a woman was sharing, which, I mean, 521 00:30:11,840 --> 00:30:17,000 Speaker 1: blows my effing mind, speaking of effing, because, I mean, okay, 522 00:30:17,080 --> 00:30:20,200 Speaker 1: just talking about this one instance specifically, what has 523 00:30:20,280 --> 00:30:23,040 Speaker 1: that taken away from you? Like, you were still reading 524 00:30:23,080 --> 00:30:26,840 Speaker 1: scientific content that you enjoyed and talking about it 525 00:30:26,880 --> 00:30:30,960 Speaker 1: with your peers. Um, what difference does it make that 526 00:30:31,000 --> 00:30:32,959 Speaker 1: it's presented by a woman? Well, you know, 527 00:30:33,080 --> 00:30:36,200 Speaker 1: that's the Internet. It is the portal into the underbelly 528 00:30:36,360 --> 00:30:40,480 Speaker 1: of humanity.
And then there was 529 00:30:40,520 --> 00:30:44,080 Speaker 1: actually a brilliant UN Women campaign, um, 530 00:30:44,760 --> 00:30:48,880 Speaker 1: where they took the Google search results of 531 00:30:48,920 --> 00:30:52,880 Speaker 1: these phrases, like "women shouldn't" or "women need to," and 532 00:30:52,880 --> 00:30:54,840 Speaker 1: then they took the top results, and then they showed 533 00:30:54,880 --> 00:30:57,600 Speaker 1: that this is what maybe is on the minds of some 534 00:30:57,680 --> 00:31:00,280 Speaker 1: men, and perhaps even women too. And the results 535 00:31:00,320 --> 00:31:02,600 Speaker 1: were just astounding. It was like, women shouldn't be able 536 00:31:02,640 --> 00:31:07,760 Speaker 1: to vote, women shouldn't work, women should know their place. What? Yeah, 537 00:31:07,800 --> 00:31:10,080 Speaker 1: these are the thoughts that are actually going on 538 00:31:10,240 --> 00:31:13,280 Speaker 1: under the surface. Well, and that's the thing: when 539 00:31:13,280 --> 00:31:17,920 Speaker 1: we look into academia, with all these brilliant people, 540 00:31:18,720 --> 00:31:23,800 Speaker 1: um, and looking at these unconscious, or sometimes maybe conscious, biases, 541 00:31:24,240 --> 00:31:26,880 Speaker 1: it isn't just men. It is also women who are 542 00:31:27,080 --> 00:31:30,560 Speaker 1: discriminating against other women. This was highlighted in a two 543 00:31:30,560 --> 00:31:36,080 Speaker 1: thousand twelve Yale study, which found that physicists, chemists, and 544 00:31:36,200 --> 00:31:42,000 Speaker 1: biologists were more likely to view a male applicant more 545 00:31:42,080 --> 00:31:46,920 Speaker 1: favorably than a female applicant if they had the same qualifications. 546 00:31:47,160 --> 00:31:49,680 Speaker 1: They wanted to hire the guy, and they were prepared 547 00:31:49,720 --> 00:31:53,360 Speaker 1: to pay the woman four thousand dollars less.
So that's, 548 00:31:53,520 --> 00:31:56,480 Speaker 1: I mean, that is a real-life, real-world example 549 00:31:56,560 --> 00:31:59,360 Speaker 1: of how a lot of women scientists are getting the shaft. 550 00:32:00,240 --> 00:32:03,200 Speaker 1: Because, I believe they called it the Jennifer versus 551 00:32:03,280 --> 00:32:06,200 Speaker 1: John study, because they had identical resumes 552 00:32:06,360 --> 00:32:09,479 Speaker 1: sent to seven participants, and I don't even think they 553 00:32:09,520 --> 00:32:13,040 Speaker 1: knew that they were participants, by the way, um, 554 00:32:13,520 --> 00:32:17,720 Speaker 1: identical resumes, one said John, one said Jennifer. And they 555 00:32:17,800 --> 00:32:21,560 Speaker 1: also were sort of okay candidates, they weren't great, but 556 00:32:21,640 --> 00:32:24,239 Speaker 1: they could fulfill this job of, I believe the 557 00:32:24,240 --> 00:32:30,120 Speaker 1: position was lab manager. And so Jennifer scored, interestingly, 558 00:32:30,200 --> 00:32:34,760 Speaker 1: higher on likability than John did. So she did, you know, 559 00:32:34,800 --> 00:32:37,280 Speaker 1: trump him in that category. But I think that, you know, 560 00:32:37,360 --> 00:32:42,320 Speaker 1: that is like a huge... what? Yeah, it's astounding, and 561 00:32:42,360 --> 00:32:46,000 Speaker 1: also just astounding to think that, you know, women also 562 00:32:46,440 --> 00:32:52,479 Speaker 1: are exercising a similar unconscious bias. So what 563 00:32:52,560 --> 00:32:55,240 Speaker 1: are we to do?
This is kicking off the Women 564 00:32:55,280 --> 00:32:57,280 Speaker 1: in STEM series, so we're gonna talk about this more 565 00:32:57,320 --> 00:33:01,120 Speaker 1: specifically in the areas of tech, engineering, and math. But 566 00:33:01,160 --> 00:33:04,400 Speaker 1: looking at this more broadly, is there a way to 567 00:33:05,440 --> 00:33:09,520 Speaker 1: encourage more girls to be brave about their scientific interests, 568 00:33:09,600 --> 00:33:14,080 Speaker 1: or, you know, stoke those kinds of, you know, curiosities 569 00:33:14,520 --> 00:33:18,000 Speaker 1: that can help fill this pipeline? Well, I know, personally, 570 00:33:18,040 --> 00:33:22,840 Speaker 1: being, like, a non-scientific person, um, I had great, 571 00:33:23,160 --> 00:33:27,800 Speaker 1: great teachers all through school, elementary through high 572 00:33:27,840 --> 00:33:30,120 Speaker 1: school, who, it didn't matter if you were a girl 573 00:33:30,240 --> 00:33:33,719 Speaker 1: or a boy, or what your inherent abilities were coming 574 00:33:33,760 --> 00:33:37,400 Speaker 1: into their classroom. You know, I just found it 575 00:33:37,440 --> 00:33:41,360 Speaker 1: amazing to have great teachers who were, um, encouraging me, 576 00:33:41,400 --> 00:33:45,840 Speaker 1: and some of them were women, which, I mean, can't hurt, right? Yeah, 577 00:33:45,880 --> 00:33:47,880 Speaker 1: I think about this and I think, you know, on 578 00:33:48,160 --> 00:33:51,680 Speaker 1: some level it requires everybody to be like a Zen Buddhist master 579 00:33:51,800 --> 00:33:54,880 Speaker 1: of their thoughts, you know, because the educator, if they're 580 00:33:54,880 --> 00:33:57,240 Speaker 1: talking to a male or female, you know, they have 581 00:33:57,280 --> 00:34:00,040 Speaker 1: to ask themselves, are they communicating with this person in 582 00:34:00,040 --> 00:34:02,920 Speaker 1: a different way?
Are they perceiving this person in a 583 00:34:02,920 --> 00:34:04,760 Speaker 1: different way? Because this is what we saw in this 584 00:34:04,920 --> 00:34:07,440 Speaker 1: article by Pollack over and over again. There are a 585 00:34:07,480 --> 00:34:12,200 Speaker 1: lot of misperceptions. So, you know, there was a European 586 00:34:12,239 --> 00:34:14,560 Speaker 1: study that said that women had to be almost two and 587 00:34:14,600 --> 00:34:17,319 Speaker 1: a half times more productive than their male 588 00:34:17,400 --> 00:34:20,520 Speaker 1: colleagues to secure financial support. Okay, because it was sort 589 00:34:20,560 --> 00:34:23,160 Speaker 1: of that, hey, look at me, look at me. It's 590 00:34:23,160 --> 00:34:26,720 Speaker 1: the misperception of the person in the classroom, that, because 591 00:34:26,760 --> 00:34:29,759 Speaker 1: of this gender bias and this unconscious bias, you 592 00:34:29,880 --> 00:34:33,480 Speaker 1: might be regarding that individual differently than how they're 593 00:34:33,520 --> 00:34:36,480 Speaker 1: actually expressing themselves. Which is chilling, because we all go 594 00:34:36,560 --> 00:34:39,319 Speaker 1: around thinking, I think that person gets me, I think 595 00:34:39,360 --> 00:34:42,200 Speaker 1: that person gets what I just said. But in fact, 596 00:34:42,239 --> 00:34:45,839 Speaker 1: it requires a female to be more proactive about what 597 00:34:45,920 --> 00:34:49,520 Speaker 1: she's saying and more aware of how she's being perceived. Right, 598 00:34:49,600 --> 00:34:51,520 Speaker 1: I think that perception is key. I mean, we've talked 599 00:34:51,520 --> 00:34:53,839 Speaker 1: about that a lot, that a woman and a man 600 00:34:53,920 --> 00:34:55,759 Speaker 1: can say the same thing, but they will not be 601 00:34:55,840 --> 00:34:58,440 Speaker 1: perceived the same way for having said it.
Yeah, and 602 00:34:58,480 --> 00:35:01,560 Speaker 1: there's also body language, which also ties into this. You sent 603 00:35:01,680 --> 00:35:05,520 Speaker 1: us, Julie, a TED talk by psychologist Amy Cuddy on 604 00:35:06,000 --> 00:35:10,279 Speaker 1: power poses, as she calls them, stretching your 605 00:35:10,400 --> 00:35:12,680 Speaker 1: arms out and just standing and taking up a 606 00:35:12,719 --> 00:35:15,960 Speaker 1: lot of space. Because she was talking about how, in, 607 00:35:16,200 --> 00:35:19,480 Speaker 1: I think it was MBA classes specifically, men are 608 00:35:20,320 --> 00:35:23,040 Speaker 1: much more apt to throw their hands up and be 609 00:35:23,200 --> 00:35:26,960 Speaker 1: much more active and participate a lot more, whereas women 610 00:35:27,239 --> 00:35:30,120 Speaker 1: do tend to draw into themselves. I personally have a 611 00:35:30,200 --> 00:35:33,000 Speaker 1: terrible habit of crossing my arms too much, which makes 612 00:35:33,000 --> 00:35:36,480 Speaker 1: me look drawn in. But she talked about how her 613 00:35:36,640 --> 00:35:40,600 Speaker 1: research has found that changing your body language, making it 614 00:35:40,680 --> 00:35:44,480 Speaker 1: more powerful, can actually have those behavioral changes that can 615 00:35:44,520 --> 00:35:48,959 Speaker 1: help women out in the classroom to succeed more too. Yeah, 616 00:35:48,960 --> 00:35:51,760 Speaker 1: she was really curious about why there was a gender 617 00:35:51,760 --> 00:35:53,520 Speaker 1: gap in that performance, and that's what she found with 618 00:35:53,520 --> 00:35:57,279 Speaker 1: those MBA, um, graduates, and she wanted to sort of 619 00:35:57,360 --> 00:36:00,680 Speaker 1: game that a little bit. So she found out that 620 00:36:00,960 --> 00:36:04,560 Speaker 1: effective leaders have a classic hormone profile.
It's high 621 00:36:04,640 --> 00:36:08,520 Speaker 1: levels of testosterone, you know, that sort of confidence, right, 622 00:36:08,920 --> 00:36:13,160 Speaker 1: and low levels of cortisol, which is the stress-associated hormone. 623 00:36:13,239 --> 00:36:15,560 Speaker 1: So what she found is that when people took these 624 00:36:15,600 --> 00:36:18,400 Speaker 1: expansive poses like you see men doing, right? Think of 625 00:36:18,400 --> 00:36:20,880 Speaker 1: a classroom right now: is there a guy, like, sitting back, 626 00:36:21,200 --> 00:36:23,680 Speaker 1: you know, with his shoulders kind of back? Just 627 00:36:23,719 --> 00:36:26,480 Speaker 1: being in those poses for two minutes will give you 628 00:36:26,600 --> 00:36:30,840 Speaker 1: that optimal cocktail for your body to have you feel 629 00:36:30,960 --> 00:36:33,920 Speaker 1: more confident. And the really cool thing about this is 630 00:36:33,960 --> 00:36:37,320 Speaker 1: that she took a bunch of participants. She had them 631 00:36:37,360 --> 00:36:41,680 Speaker 1: assume these low-power poses and these high-power poses, 632 00:36:41,920 --> 00:36:43,719 Speaker 1: and then they had to give a talk, right? Which 633 00:36:43,760 --> 00:36:45,120 Speaker 1: is awful, when you have to give a talk. We 634 00:36:45,160 --> 00:36:46,960 Speaker 1: all know this, right? Even if this is what you 635 00:36:47,040 --> 00:36:50,759 Speaker 1: do for, yeah, you know, your livelihood, it still takes 636 00:36:50,800 --> 00:36:52,479 Speaker 1: a lot of energy to get up there and sort 637 00:36:52,480 --> 00:36:54,440 Speaker 1: of bare your soul in front of a bunch of strangers. 638 00:36:54,440 --> 00:36:56,880 Speaker 1: So she had these people do this, and lo and 639 00:36:56,920 --> 00:37:00,879 Speaker 1: behold, the people who had been in those high, expansive 640 00:37:00,960 --> 00:37:05,799 Speaker 1: power poses, they performed a lot better.
As, um, rated 641 00:37:05,920 --> 00:37:08,719 Speaker 1: by third parties who actually watched all of 642 00:37:08,760 --> 00:37:12,600 Speaker 1: these videotapes of the participants performing, so they didn't have 643 00:37:12,640 --> 00:37:15,000 Speaker 1: any bias about what was going on with this study. 644 00:37:15,360 --> 00:37:19,040 Speaker 1: And on top of educating girls and women 645 00:37:19,200 --> 00:37:24,120 Speaker 1: on how to conduct themselves, empower themselves, and sort 646 00:37:24,160 --> 00:37:28,040 Speaker 1: of plow forward through these kinds of unconscious biases that 647 00:37:28,120 --> 00:37:31,279 Speaker 1: might be at work, I do think it's so crucial 648 00:37:31,760 --> 00:37:35,400 Speaker 1: for there to be more visibility of women in science. 649 00:37:35,520 --> 00:37:38,080 Speaker 1: And I mean, Julie, I think it's fantastic 650 00:37:38,440 --> 00:37:41,040 Speaker 1: that you are a female voice talking 651 00:37:41,040 --> 00:37:43,640 Speaker 1: about science, and that's great. I think that, you know, 652 00:37:43,719 --> 00:37:46,799 Speaker 1: in a way, you are maybe unconsciously 653 00:37:46,880 --> 00:37:50,400 Speaker 1: mentoring young female scientists out there, because it's not common 654 00:37:50,680 --> 00:37:54,040 Speaker 1: to hear women talking about science.
Well, I was thinking 655 00:37:54,040 --> 00:37:56,640 Speaker 1: about this, and I was thinking about, um, maps of 656 00:37:56,680 --> 00:37:59,120 Speaker 1: the world, which I know sounds weird. But if you look at 657 00:37:59,160 --> 00:38:01,320 Speaker 1: maps of the world that are produced by, say, the 658 00:38:01,400 --> 00:38:04,720 Speaker 1: United States, you're gonna see this map with the United 659 00:38:04,760 --> 00:38:07,799 Speaker 1: States, like, central, and everything else just off to the 660 00:38:07,840 --> 00:38:10,360 Speaker 1: sides of it, right? So North America is central. But 661 00:38:10,400 --> 00:38:13,400 Speaker 1: if you look at a map that's produced by Australia, 662 00:38:13,880 --> 00:38:16,640 Speaker 1: you'll see, you know, that it's Asia and 663 00:38:16,760 --> 00:38:20,040 Speaker 1: Australia that are central on that map. And that's always 664 00:38:20,040 --> 00:38:22,680 Speaker 1: a surprise for people who live in North America, 665 00:38:22,719 --> 00:38:24,720 Speaker 1: when you're like, what do you mean there's other maps 666 00:38:24,719 --> 00:38:27,920 Speaker 1: that don't have us at the center of this planet? 667 00:38:27,960 --> 00:38:30,839 Speaker 1: And so I was thinking about science in the same way. 668 00:38:30,880 --> 00:38:34,759 Speaker 1: I mean, when you think about STEM, um, and you 669 00:38:34,800 --> 00:38:36,560 Speaker 1: look at that, if there's a map of that, then 670 00:38:36,600 --> 00:38:39,719 Speaker 1: you see men as, like, the huge land mass and 671 00:38:39,760 --> 00:38:42,879 Speaker 1: women at the margins.
And I think it's just a 672 00:38:42,880 --> 00:38:45,959 Speaker 1: matter of awareness, because I cannot tell you how many times, 673 00:38:45,960 --> 00:38:49,400 Speaker 1: and I'm really conscious of this, I come upon female 674 00:38:49,480 --> 00:38:52,920 Speaker 1: name after female name after female name when I'm doing research, 675 00:38:53,000 --> 00:38:56,399 Speaker 1: and I look at the studies, and they are there, 676 00:38:56,400 --> 00:39:00,520 Speaker 1: they're there in those fields. They're just not as 677 00:39:00,640 --> 00:39:04,320 Speaker 1: high profile, you know, when we think about science. Actually, 678 00:39:04,320 --> 00:39:06,600 Speaker 1: if you do, this is a nice little challenge for 679 00:39:06,640 --> 00:39:09,480 Speaker 1: everybody listening. If you go into Google right now and 680 00:39:09,920 --> 00:39:14,719 Speaker 1: you type in "most influential women in science today," I 681 00:39:14,760 --> 00:39:16,880 Speaker 1: can tell you your page one results will be like 682 00:39:17,120 --> 00:39:25,200 Speaker 1: "50 most historical women in science," "most influential women ever," historically. 683 00:39:25,520 --> 00:39:28,000 Speaker 1: It's all about what has happened in the past, and 684 00:39:28,000 --> 00:39:30,759 Speaker 1: the sad thing about that is that we're 685 00:39:30,760 --> 00:39:34,800 Speaker 1: not really that familiar with those historical figures in STEM, 686 00:39:35,400 --> 00:39:38,680 Speaker 1: let alone the figures that are prominent now. It's 687 00:39:38,719 --> 00:39:42,160 Speaker 1: like we're still catching up to the 688 00:39:42,280 --> 00:39:46,160 Speaker 1: history and highlighting those names. Um, I mean, even if 689 00:39:46,160 --> 00:39:48,480 Speaker 1: you think about the story of, you know, Rosalind Franklin, 690 00:39:48,520 --> 00:39:51,400 Speaker 1: who's often left out. With DNA, you hear about
691 00:39:51,880 --> 00:39:55,719 Speaker 1: Watson and Crick, but no, there's also Rosalind Franklin in there. 692 00:39:56,040 --> 00:39:57,319 Speaker 1: I mean, I think a lot of that has to 693 00:39:57,320 --> 00:40:02,400 Speaker 1: do with just people's inherent, oh, mistrust of women, of women's 694 00:40:02,480 --> 00:40:05,799 Speaker 1: voices in those fields, of just thinking, like, well, 695 00:40:05,920 --> 00:40:08,879 Speaker 1: let's hear it from a man, you know. I mean, 696 00:40:09,000 --> 00:40:13,279 Speaker 1: I think, until it becomes more normal, and 697 00:40:13,320 --> 00:40:15,480 Speaker 1: how do we get it to be more "normal," quote 698 00:40:15,560 --> 00:40:18,279 Speaker 1: unquote, in people's minds? A lot of that 699 00:40:18,400 --> 00:40:23,080 Speaker 1: is visibility. I hope that one day soon, you know, 700 00:40:23,120 --> 00:40:26,799 Speaker 1: people won't be surprised that a woman is presenting scientific 701 00:40:26,840 --> 00:40:29,840 Speaker 1: information on Facebook or something silly like that. Hopefully 702 00:40:29,880 --> 00:40:32,480 Speaker 1: it will just be like, oh, yeah, whatever, there's science stuff. 703 00:40:33,000 --> 00:40:36,080 Speaker 1: And the exciting thing, though, is that you have the 704 00:40:36,120 --> 00:40:39,360 Speaker 1: White House, for instance, sponsoring women in STEM and girls in 705 00:40:39,440 --> 00:40:42,640 Speaker 1: STEM initiatives. You know, there are top-down things 706 00:40:42,680 --> 00:40:47,440 Speaker 1: that are happening to really encourage more girls into science, 707 00:40:47,480 --> 00:40:50,600 Speaker 1: to empower more women who are studying science. Um, so 708 00:40:50,760 --> 00:40:53,719 Speaker 1: I think it's a better time than ever before to 709 00:40:53,840 --> 00:40:56,319 Speaker 1: be a woman who's interested in science. I think so.
710 00:40:56,360 --> 00:40:57,920 Speaker 1: I was thinking about this when I went to the 711 00:40:57,960 --> 00:41:00,359 Speaker 1: World Science Festival, not this year but last year, 712 00:41:00,480 --> 00:41:03,680 Speaker 1: and there was a great panel on exoplanets, this idea 713 00:41:03,800 --> 00:41:06,640 Speaker 1: that within the next fifty years we might find an 714 00:41:06,640 --> 00:41:10,840 Speaker 1: Earth-like planet. So think about that. I mean, yeah, 715 00:41:10,880 --> 00:41:13,960 Speaker 1: I mean, that would change our entire concept of 716 00:41:14,719 --> 00:41:16,799 Speaker 1: life and how it works, and we might get more 717 00:41:16,840 --> 00:41:19,520 Speaker 1: information about our place in the universe. I mean, it's 718 00:41:19,560 --> 00:41:23,280 Speaker 1: just really groundbreaking stuff. And two of the three participants 719 00:41:23,280 --> 00:41:26,160 Speaker 1: on the panel were women: Natalie 720 00:41:26,200 --> 00:41:30,759 Speaker 1: Batalha, who's an astrophysicist, and Sara Seager, who is 721 00:41:30,840 --> 00:41:34,759 Speaker 1: a physicist. And, um, they were just amazing, and I thought, 722 00:41:34,800 --> 00:41:37,959 Speaker 1: this is nice to see. This is, uh, the sort 723 00:41:37,960 --> 00:41:42,040 Speaker 1: of passion, the sort of rigor that I think many 724 00:41:42,080 --> 00:41:44,160 Speaker 1: women are bringing to these fields, up here on 725 00:41:44,200 --> 00:41:46,839 Speaker 1: this panel being represented.
And we need more of that 726 00:41:46,920 --> 00:41:49,000 Speaker 1: and more awareness of that. Yeah, because I think a 727 00:41:49,000 --> 00:41:52,560 Speaker 1: lot of the focus is on the lack of women instead, 728 00:41:53,040 --> 00:41:56,040 Speaker 1: but there are absolutely, as you say, plenty of women 729 00:41:56,160 --> 00:41:59,120 Speaker 1: and girls who are doing incredible work who need to 730 00:41:59,160 --> 00:42:03,200 Speaker 1: be highlighted as well. So thanks to you, Julie, 731 00:42:03,239 --> 00:42:06,520 Speaker 1: for coming on and enlightening us. Thanks for having 732 00:42:06,520 --> 00:42:08,279 Speaker 1: me; it's an honor to be hanging out with you two. It 733 00:42:09,200 --> 00:42:12,279 Speaker 1: has been fantastic, very fun. And Julie, 734 00:42:12,320 --> 00:42:14,680 Speaker 1: can you give a shout-out to where folks 735 00:42:14,719 --> 00:42:17,759 Speaker 1: can find Stuff to Blow Your Mind? They can go 736 00:42:17,840 --> 00:42:20,880 Speaker 1: to Stuff to Blow Your Mind dot com. You can 737 00:42:20,920 --> 00:42:26,360 Speaker 1: find videos, books, uh, old podcast episodes, all sorts of 738 00:42:26,360 --> 00:42:28,959 Speaker 1: stuff there. Um, so now we want to hear from 739 00:42:29,120 --> 00:42:32,439 Speaker 1: our scientific listeners out there. What do you think about 740 00:42:32,440 --> 00:42:34,480 Speaker 1: the stuff we just talked about? Are you in science? 741 00:42:34,480 --> 00:42:37,600 Speaker 1: Are you doing research right now as you listen to 742 00:42:37,719 --> 00:42:40,640 Speaker 1: this podcast? We want to hear from you. Email us 743 00:42:40,680 --> 00:42:43,799 Speaker 1: at mom stuff at Discovery dot com. And we've got 744 00:42:43,800 --> 00:42:45,719 Speaker 1: a couple of letters to share with you when we 745 00:42:45,800 --> 00:42:48,960 Speaker 1: come right back from a quick break. And now back 746 00:42:48,960 --> 00:42:54,600 Speaker 1: to our letters.
So I've got one here from Helen, 747 00:42:55,239 --> 00:42:59,000 Speaker 1: and its subject line is "language choice." So she writes: I'm 748 00:42:59,000 --> 00:43:00,560 Speaker 1: a big fan of your podcast and look forward 749 00:43:00,560 --> 00:43:02,879 Speaker 1: to seeing what topics you will be covering each week. 750 00:43:02,920 --> 00:43:06,480 Speaker 1: I find your subject choices fascinating, and I'm always impressed 751 00:43:06,520 --> 00:43:10,680 Speaker 1: by your respectful and measured approach to controversial or sensitive subjects. 752 00:43:11,120 --> 00:43:13,280 Speaker 1: I'm writing because I was shocked to hear the word 753 00:43:13,320 --> 00:43:17,160 Speaker 1: "spastic" used to describe an inattentive waiter in your Women 754 00:43:17,160 --> 00:43:19,680 Speaker 1: and Wine episode. I'm from the UK, where this is 755 00:43:19,719 --> 00:43:22,640 Speaker 1: a very offensive and outdated term to describe a person 756 00:43:22,680 --> 00:43:26,120 Speaker 1: with cerebral palsy. I realize that you broadcast in the 757 00:43:26,239 --> 00:43:29,000 Speaker 1: US and that there are cultural differences surrounding slang, but 758 00:43:29,000 --> 00:43:31,239 Speaker 1: I felt I had to point out that, to a 759 00:43:31,280 --> 00:43:33,439 Speaker 1: wider audience, hearing this term in your show is quite 760 00:43:33,520 --> 00:43:37,400 Speaker 1: shocking and doesn't fit well with the image you portray otherwise. 761 00:43:37,880 --> 00:43:41,399 Speaker 1: So our apologies if we have offended any sensitive ears. 762 00:43:41,400 --> 00:43:44,279 Speaker 1: We certainly did not know that, and yeah, we 763 00:43:44,360 --> 00:43:47,520 Speaker 1: are hypersensitive about our language. So thank you, Helen, 764 00:43:47,600 --> 00:43:52,960 Speaker 1: for pointing that out to us. Yeah, no offense intended.
Um, absolutely, 765 00:43:53,239 --> 00:43:55,000 Speaker 1: but thank you for writing in and bringing it to 766 00:43:55,040 --> 00:43:57,840 Speaker 1: our attention. Well, I have a letter here from Laura 767 00:43:58,280 --> 00:44:01,839 Speaker 1: talking about our Women and Wine episode. She says: 768 00:44:01,840 --> 00:44:04,480 Speaker 1: I just graduated from the University of California, Davis, with 769 00:44:04,520 --> 00:44:07,439 Speaker 1: a degree in viticulture and enology, and I absolutely love 770 00:44:07,520 --> 00:44:10,640 Speaker 1: working in wine production. I'm currently listening to this podcast 771 00:44:10,640 --> 00:44:14,680 Speaker 1: while working harvest in Burgundy, France. I wanted to share 772 00:44:14,680 --> 00:44:17,240 Speaker 1: a few words as a woman in wine production. Wine 773 00:44:17,239 --> 00:44:20,760 Speaker 1: making is an incredible field if you love science, agriculture, 774 00:44:20,920 --> 00:44:25,200 Speaker 1: working with your hands, and of course fermented beverages. She says: while 775 00:44:25,239 --> 00:44:27,400 Speaker 1: there are many women involved in all aspects of the 776 00:44:27,440 --> 00:44:30,880 Speaker 1: wine industry, we are typically underrepresented on the production side 777 00:44:30,880 --> 00:44:34,080 Speaker 1: because of all the reasons you mentioned. Winemaking requires 778 00:44:34,120 --> 00:44:37,280 Speaker 1: a lot of physical activity and strength, and yes, there are, 779 00:44:37,280 --> 00:44:41,400 Speaker 1: in rare cases, superstitious people who believe our unique vaginal 780 00:44:41,480 --> 00:44:45,000 Speaker 1: flora will affect the wine.
In my experience, the gap 781 00:44:45,040 --> 00:44:47,799 Speaker 1: between men and women in production is definitely shrinking, and 782 00:44:47,840 --> 00:44:49,839 Speaker 1: I'm lucky to say that all of the wineries where 783 00:44:49,880 --> 00:44:52,920 Speaker 1: I've worked have been fairly evenly split between men and 784 00:44:52,960 --> 00:44:56,840 Speaker 1: women in production. There are still somewhat gendered tasks, however, 785 00:44:56,880 --> 00:44:59,960 Speaker 1: which can be frustrating. Generally speaking, women are often seen 786 00:45:00,040 --> 00:45:04,800 Speaker 1: doing more laboratory analyses, fruit sorting during harvest, and cleaning 787 00:45:04,800 --> 00:45:07,799 Speaker 1: the smaller, more petite tanks, while men are often the 788 00:45:07,800 --> 00:45:11,040 Speaker 1: ones handling the forklift driving, heavy lifting, and vineyard management. 789 00:45:11,480 --> 00:45:14,439 Speaker 1: Of course, these are generalizations. All is fair in love 790 00:45:14,480 --> 00:45:17,359 Speaker 1: and winemaking, and when you're working sixteen-hour days 791 00:45:17,440 --> 00:45:20,920 Speaker 1: during harvest, you'll do whatever task needs to be done. 792 00:45:21,719 --> 00:45:24,360 Speaker 1: So thank you, Laura. I am 793 00:45:24,400 --> 00:45:26,120 Speaker 1: so impressed that you had time to write this 794 00:45:26,280 --> 00:45:30,120 Speaker 1: lovely email during the winemaking harvest. Hey, and 795 00:45:30,239 --> 00:45:33,799 Speaker 1: she counts as a woman in STEM. Indeed she does: 796 00:45:33,840 --> 00:45:37,920 Speaker 1: viticulture, enology. Well, thanks to everyone who's written in. 797 00:45:37,960 --> 00:45:39,799 Speaker 1: Mom Stuff at Discovery dot com is where you can 798 00:45:39,840 --> 00:45:42,120 Speaker 1: send your letters.
You can also follow us on Twitter 799 00:45:42,160 --> 00:45:45,120 Speaker 1: at Mom Stuff Podcast and find us on Facebook, and 800 00:45:45,200 --> 00:45:48,160 Speaker 1: don't forget you can follow us on Instagram at Stuff 801 00:45:48,239 --> 00:45:51,200 Speaker 1: Mom Never Told You. And if you haven't done so already, 802 00:45:51,360 --> 00:45:54,160 Speaker 1: you need to head over and check out our YouTube 803 00:45:54,239 --> 00:45:59,439 Speaker 1: channel, with more than one hundred fantastic videos to choose from. 804 00:45:59,480 --> 00:46:02,160 Speaker 1: It's YouTube dot com slash Stuff Mom Never Told You, 805 00:46:02,440 --> 00:46:08,120 Speaker 1: and don't forget to subscribe. For more on this and 806 00:46:08,200 --> 00:46:10,759 Speaker 1: thousands of other topics, visit how stuff works dot 807 00:46:10,760 --> 00:46:19,720 Speaker 1: com