Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: Pleased to say.

Speaker 3: Joining us right now is Bloomberg Originals host and executive producer Emily Chang.

Speaker 2: She joins us right now with Sheryl Sandberg. Emily, Romaine.

Speaker 3: Thank you so much. And Sheryl, thank you so much for joining us. I know this ambition gap is troubling to you. Essentially, women are leaning out. They're deciding they don't want to be promoted because it doesn't seem worth it. After all the work that's been done, all the talk, what do you think?

Speaker 2: Well, let's start at the top.

Speaker 1: So not all women are leaning out, and not all companies. But what our report shows this year, and this is our eleventh year, is that about half of companies no longer prioritize advancement for women, and twenty-one percent of those companies say women's career advancement is a low priority or no priority at all. And those are the companies that participated in the Women in the Workplace report we do with McKinsey.
Speaker 1: And so these companies in many ways are the best of the best. And then we do see that ambition gap, but only when women don't get the opportunities and support they need.

Speaker 3: How are companies at risk? Or is the conventional wisdom that this is better for business not ringing true?

Speaker 1: I think the conventional wisdom should be, and is, what's true, which is that when you get the best out of your whole workforce, you're going to do better. So what's happening is that women face more barriers at every level of their career. At the entry level, we call it the broken rung, and we see it every year. For every one hundred men that get promoted, ninety-three women, sixty Black women, eighty-two Latinas. That's because we hire and promote men based on potential and women for what they've already proven.

Speaker 2: So of course women can't prove they're a manager.

Speaker 1: Then at the senior levels, our report shows this year that, at the same levels, a man is seventy percent more likely to get tapped for leadership training. Think about what that says.
Speaker 1: If you're a future leader, come to leadership training. And this is only happening in the companies that aren't doing the right thing. When women get the full support and the same stretch opportunities, they're not leaning out at all. And so it's a question of economic productivity. Do we want to get the best growth in our economy? Do we want to get the best out of our workforce? We're at a fork in the road, and companies have a decision to make.

Speaker 3: The Trump administration is pushing policies that explicitly try to incentivize women to have more babies while simultaneously weakening workplace protections. Do you see these natalist policies, as they are called, as pressure on women to return to traditional roles, or is it support for families?

Speaker 1: I mean, look, women can have as many kids as they want and still have to go to work. I think what we forget in a lot of this is that the great majority of women do not have the choice to be a full-time mother and a full-time spouse. Now, I feel we sometimes come up with new language for old ideas, and I want to be clear.
Speaker 1: If you can afford to be a full-time spouse and a full-time parent, as a man or a woman, and you want to do that, I think that can be deeply fulfilling work. But we've got to remember that most women don't have that option. They have an economic reality: they have to wake up in the morning and leave their home to earn money to support their families. And so, again, new language for old ideas: "trad wife." That's just telling these women that have to leave their home that it's going to harm their marriages and their kids.

Speaker 2: That's not what the data supports.

Speaker 1: We should be able to make any choice we make without putting old pressures on women in a modern workforce, where that's not the economic reality they live in.

Speaker 3: President Trump has called on companies to root out what he calls illegal DEI, you know, threatening federal contracts, threatening regulatory action. How is this going to be looked back on? How is history going to look back on the DEI rollback?

Speaker 1: You know, I think people didn't understand and thought that women were getting unfair treatment.
Speaker 1: But let's put some numbers to this. Women got fifty-nine percent of the college degrees, and women are ten percent of Fortune 500 CEO jobs. I'm not saying there aren't times when people are given preferential treatment. Of course there are. But on average, in our economy, do you really think that fifty-nine percent of the college degrees getting ten percent of those jobs means there's systematic special treatment for women? I mean, my experience in the workforce, and I think yours and a lot of people's, is that it was hard. It was hard to be one of the only women in the room. And so the question is, what can companies do? And I'll tell you, there's a lot they can do, and it's in our report, and it's completely legal. So, for example, feedback: one percent of men get style-based feedback in performance reviews, and sixty-six percent of women.

Speaker 2: What can companies do?

Speaker 1: You establish criteria in advance that everyone agrees to, that are universally applied.
Speaker 1: Everyone gets this kind of feedback. That is not just legally permissible, but allowed and encouraged, and it creates a level playing field. This isn't about special treatment. This is about getting everyone the opportunity to do their best work and contribute.

Speaker 3: Meta is among many companies that have rolled back DEI policies, and Mark Zuckerberg reportedly blamed you for the policies being there in the first place. Some employees have felt that your legacy is being dismantled. What have your conversations been like with Mark about this? I know you're still friends, and you have to have feelings about this.

Speaker 1: I don't think that's exactly what happened in that meeting, and Mark went out and publicly posted and clarified. But here's what I would say: every company, including Meta, has the opportunity to make sure that they're fair to women. Here's what the data shows us, over and over and over again, so many examples. When a man and a woman ask for raises or promotions, the woman's thirty percent more likely to be told she's too aggressive. What do you do?
Speaker 1: Standardize your processes. Every company should be doing it. So, for example, interviews: if you don't have agreed-upon questions, you ask naturally, and maybe not on purpose, but naturally, people sometimes ask the easier questions to the men and the harder questions to the women. Just standardize your questions. Put systems in place that protect people, but also that just give people the opportunity to contribute.

Speaker 3: You have to acknowledge that there's a big rhetoric shift happening, and obviously we're seeing it in Silicon Valley. We are seeing tech leaders who said one thing in the last election now whispering in the president's ear, a lot of these people that you know personally. What do you think is happening here? Is this transactional? Is this just business, or is this a real change in values happening?

Speaker 1: I think a lot of the rhetoric is terrible, and I think we see some of the impacts in this report. But you know, I'm fifty-six, so I've been in the workforce into my fourth decade, and what I see is that we make progress, we backslide, we make progress.
Speaker 2: There's a backlash.

Speaker 1: I think the reason these ideas take hold so easily is they were never really gone, even though the rhetoric is bad now. Do I really think we ever fully encouraged leadership in little girls as much as little boys, and in women as much as men? No. So when it happens, when this rhetoric happens, it's so easy for it to take hold because it's like fertile ground. I'll give you one that really scares me. Eighth and tenth grade boys, middle and high school boys. They surveyed them in twenty eighteen, and they asked: do you believe women should have the same opportunities as men in the workforce? In twenty eighteen, sixty-three percent said yes. I could spend all day talking about why that's so upsetting, like, where are the other thirty-seven percent? But sixty-three percent said yes. Today it's forty-five. We are seeing that same double-digit slide in middle and high school boys believing that women should get equal pay. That's not okay.
Speaker 1: And what it's going to take to change that is, I think, people realizing that this is about economic productivity. This is about: do we want our companies to succeed?

Speaker 3: You helped build two companies that are incredibly economically productive. We are in the middle of this massive AI moment where companies are investing a lot, but it's also total chaos, it seems, within and among some of the startups that are trying to build sustainable business models in an AI moment. What's your advice to companies right now about how they build a business model that can work and survive in the age of AI?

Speaker 1: You know, it's such a good question. I was at Google in some of the early years, and then I went to Facebook, now Meta, and I do think there are times when it really makes sense to invest ahead of revenue in business models.
Speaker 1: Right. If Google, when I was there in the early days, had tried to cover our costs for search, we never would have gotten enough search out there to get enough user feedback to improve the search results, to get to what was a great business for Google. So it makes sense. At some point, at some level, you have to have revenue that covers costs.

Speaker 2: Is that from ads?

Speaker 1: Well, people make it so complicated. It's not complicated, really. Someone has to pay. Who can pay? Businesses can pay. They can pay via advertising, they can pay in some way, shape, or form for database services, or people have to pay. And it will be a combination of all of these things. But over time, the revenue is going to have to cover the costs.

Speaker 3: All right. Sheryl Sandberg, author of Lean In, you know, obviously helped build Facebook and Google into what they are today.