Speaker 1: Right now, though, a new report into women working in the tech sector has set off a few alarm bells. The advocacy group Tech Beyond Gender surveyed two hundred female tech workers. It found more than half have considered leaving their current positions, and over a third have thought about quitting the sector altogether. Mina Satish Kumar from Tech Beyond Gender is with me to explain this finding. Mina, good evening to you. Thanks for being with me.

Speaker 2: Good evening. Thanks for the time.

Speaker 1: So you've done these interviews, and there are some quite radical claims being made here. Tell me how you've come to the conclusion that, you know, half of the people you've interviewed have considered quitting. How do you know that?

Speaker 2: Yeah, cool. So let me make a small correction here. In this report we surveyed women and gender diverse professionals working in the New Zealand tech sector.
Speaker 2: I think it is general frustration and a feeling of hopelessness for these people working in the tech sector, where more than forty percent have experienced gender-based microaggressions in their workplaces, and more than seventy percent have reported that no training to create awareness regarding microaggressions has been provided in their workplaces. And besides that, more than seventy percent of them are dissatisfied with pay transparency, and more than half of them feel that they're not being paid fairly, with disparities particularly affecting technical and migrant professionals.

Speaker 1: What do you consider a microaggression? Forty percent say that they've had gender-based microaggressions committed against them. What is that?

Speaker 2: So, microaggressions are subtle, often unintentional comments, behaviors, or actions that communicate bias or a stereotype towards a person or a group. They are particularly targeted towards people from marginalized or minority backgrounds.

Speaker 1: So, I'm sorry, sorry, just... you say that it's unintentional aggression, and yet it's targeted. How can it be both?
Speaker 2: Well, it's because of the stereotypes that are so ingrained in our society. There are certain systemic issues that we have really not addressed. So it's easy to just assume in a meeting that a woman would probably be there to take notes or be a people manager, and not to see a woman as a front-end developer or a solution architect or a technical lead. That's what I meant: it is very unintentional, because those stereotypes are so ingrained in us.

Speaker 1: Right, but can you give us an example of what a microaggression is?

Speaker 2: So, being interrupted in meetings, or having ideas appropriated, or facing dismissive comments, or remarks such as "Oh, you're surprisingly technical," or even being called too aggressive for being assertive, when the same behavior from a man is seen as right.

Speaker 1: Yeah, but these things will happen to men too. I mean, you know, if a manager interrupts a man, is that a microaggression? No?

Speaker 2: Possibly. I'm not saying that men do not have any issues as such; probably they do face issues.
Speaker 1: But isn't that the point, though? There are good managers and there are bad managers. It doesn't necessarily mean, just because somebody interrupts you, that they're being sexist or that they're doing it because you're a woman.

Speaker 2: But certain systemic barriers make all of these behaviors very pronounced for women and gender diverse professionals. Because of the stereotypes that are so ingrained in us, we just assume certain things and talk about them without really thinking. So, for instance, if I'm in a meeting and I'm actually providing technical input along with a few other men in my team, the message that I give will only hold value if it is backed by another man.

Speaker 1: Okay, Mina, thank you very much for explaining this to us. I really appreciate it. Mina Satish Kumar from Tech Beyond Gender, which is an advocacy group for women in tech.

Speaker 1: I mean, what do you say? I just... I've read the release from this group.
Speaker 1: I just think it sounds like absolute b... Yes, half have considered quitting. Go and ask any worker anywhere, and half of them will tell you they have considered quitting their job. It doesn't matter what gender they are or who their bosses are. Half don't believe they're being paid very well. Go and ask any worker, doesn't matter what gender they are, and they'll tell you that they're not being paid very well. Go and ask how many have had microaggressions, and then define it as loosely as that, and half will have said that they've had microaggressions.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.