Next Question with Katie Couric is a production of iHeartRadio and Katie Couric Media.

Hi everyone, I'm Katie Couric, and welcome to Next Question, where we try to understand the complicated world we're living in and the crazy things that are happening by asking questions and by listening to people who really know what they're talking about. At times it may lead to some pretty uncomfortable conversations, but stick with me, everyone, let's all learn together.

More than two point one billion people use Facebook or one of its services, like Instagram or WhatsApp, every single day. That's nearly one third of the entire world's population. But recently the company has gone from the brilliant brainchild of a Harvard dropout named Mark Zuckerberg to one of the most controversial companies on the planet. He was recently grilled on Capitol Hill by members of Congress concerned about the platform's increasing footprint in almost every aspect of our lives. Sure, Facebook can bring communities together, help you share photos with your family, and even start movements, but it can also unfairly impact elections, spread misinformation, create a safe space for child pornographers and white supremacists, invade our privacy, exploit our personal information, and increase the deep divisions of our already polarized nation. That's quite a laundry list, isn't it? And with the election fast approaching, you may be wondering if it might be deja vu all over again, and worried that, to borrow a phrase from the nineteen sixty six movie, the Russians are coming, the Russians are coming, not to mention China and other foreign powers. The company's recent decision not to fact-check political ads led to a heated debate on social media between Zuckerberg and Democratic presidential candidate Elizabeth Warren, who said the platform had become, quote, a disinformation-for-profit machine. She even placed an ad on Facebook saying Zuckerberg was supporting Trump for president, to test whether it would be removed. It wasn't.
Meanwhile, more than two hundred and fifty of its own employees signed an open letter warning that the ad policy is, quote, a threat to what Facebook stands for. So I was impressed that the company's COO, Sheryl Sandberg, was willing to sit down with me recently at the Vanity Fair New Establishment conference in Los Angeles. She's been with the company since two thousand eight and has played a pivotal role in shaping both its culture and its business strategy, leading it to more than twenty two billion dollars in profits last year. She's also an advocate for women in the workplace with her two thousand thirteen book and organization Lean In. And I got to know Sheryl after her husband, Dave, died unexpectedly in two thousand fifteen. She reached out because I too had lost my husband at an early age. Sheryl wrote a book about her experience, called Option B, and I interviewed her for that back in two thousand seventeen. If you're interested, you can find that interview in my feed. Our recent conversation at the Vanity Fair summit got a lot of attention, and I thought it made sense to share it with all of you on my podcast. So my next question for Sheryl Sandberg: is Facebook doing enough to protect its more than two billion users and our democracy? Or is it time to unfriend Facebook?

Sheryl, thank you for being here. We have a lot to talk about, as you know, so let's get right to it. We're just over a year from the election. Three hundred and seventy-eight days, to be exact, but who's counting? I think the way Facebook addresses and fixes the platform that was used in two thousand sixteen is seen as a major, critically important test. I know certain measures have in fact been implemented, for example, thirty-five thousand moderators looking for fake accounts and suspicious patterns, and Mark Zuckerberg announced new safeguards like labeling media outlets that are state controlled. But do you believe that's enough?
I mean, do you really, seriously believe that we won't witness the kind of widespread interference we saw in two thousand sixteen?

Well, we're going to do everything we can to prevent it. I do think we're in a very different place. If you think back to two thousand sixteen, we had protections against state actors, but when you thought about state actors going against a technology platform, what you thought of was hacking: the Sony emails, the DNC emails, stealing information. That's what our defenses were really set up to prevent, and so were everyone else's. What we totally missed, and it is on us for missing it, and everyone missed it, was not stealing information but going in and writing fake stuff. That was a totally different threat, and our systems weren't set up to deal with it. So the question is, as you're asking, what are we doing going forward, how are we going into the election, and how have we done since? And we're in a totally different place. The FBI has a task force on this; back then they didn't have anyone working on it. Homeland Security is working on it. All the tech companies are working together, because when you try to interfere on one platform, you try to interfere on another. In two thousand sixteen, we didn't know what this threat was; we did one takedown. In the last year, we did fifty.

And I read a shocking number. You took down more than two point two billion fake accounts in a three-month period.

That's right. We take down millions every day.

So thirty-five thousand moderators, is that even enough? Given, I mean, two point two billion is almost the number of people who are on the platform.

So the moderators are looking for content. The fake accounts are being found with engineering; that's the only way to find those fake accounts. Most of those are found before anyone ever sees them. And fake accounts are a really important point here, because everything that was done by Russia in two thousand sixteen, everything, was done under a fake account.
So if you can find the fake accounts, you often find the root of the problem. And so we are now taking down millions every day, almost all of which no one has seen.

You talked about disrupting fifty individual campaigns from multiple nation states so far. But what about domestic threats? Facebook's own former security chief Alex Stamos has said, quote, what I expect is that the Russian playbook is going to be executed inside of the US by domestic groups, in which case some of it, other than hacking, is not illegal. My real fear, he says, is that it's going to be the battle of the billionaires, of secret groups working for people aligned on both sides who are trying to manipulate us at scale online. So what is Facebook doing to defend the platform against this kind of domestic threat?

It's a really good question, because things are against our policies if they're fraudulent or fake accounts, but people can also kind of deceive. Again, if you look at where we were and where we are, the transparency is dramatically different. You look at every page on Facebook, and you can now see the origin of where the person is. So if someone has a page that's called, I don't know, US-whatever, but they're from Ukraine, it's clearly marked. If you look at our ad library, which we didn't have last time, you can see any political ad running anywhere in the country, or in most places of the world, even if it's not targeted to you. Before, if they were trying to reach you, you could see it, but you couldn't see anything else. Now you can see everything. And we rolled out a presidential ad tracker so that you can see the presidential campaigns much more holistically. So with the transparency measures we have, we're trying to get rid of the fraudulent accounts, and for the ones that are legitimate, whether they're run domestically or globally,
make sure people understand who the people are behind what they're seeing.

But then why did Facebook announce last month that it would not fact-check political ads? I know the RAND Corporation actually has a term for this, which is truth decay. And Mark himself has defended this decision even as he expressed concern about the erosion of truth online. So what is the rationale for that? And I know you're going to say, we're not a news organization, we're a platform.

I'm not going to say that, but it's a really important question, and I'm really glad to have a chance to take a beat and really think about it and talk about it. One of the most controversial things out there right now is what ads do we take, what ads do others take, and do we fact-check political ads. It is a hard conversation and emotions are running very high on this. I also sit here realizing it's however many days you said before the election, so the ads that are controversial now, we have not even seen the beginning of what we're going to see. There are going to be a lot of controversial ads and controversial speech. So why are we doing this? It's not for the money; let's start there. This is a very small part of our revenue, five percent or something. We don't release the numbers, but it's very small, and it is very controversial. We're not doing this for the money. We take political ads because we really believe they are part of political discourse, and taking political ads means that people can speak. If you look at this over time, the people who have most benefited from being able to run ads are people who are not covered by the media, so they can't get their message out otherwise; people who are challenging an incumbent, so they are a challenger; and people who have different points of view. That's been true historically.
And so we also have this issue that if, let's say, we took political ads off the service, we would still have all the issue ads. So I'm running an ad on gender equality, I'm running an ad on another political issue. Those ads are much, much bigger in terms of scope than the political ads, so you would have every voice in the debate except the politicians themselves. So instead, what we're doing is as much transparency as possible. Every ad has to be marked by who paid for it. We're doing verification to make sure people are who they say they are. And that ads library I started talking about is really important, because you can't hide. You can't run one ad in one state and another ad in another, one ad to one group and another ad to another. Anyone can go into that library and see any ad that any politician is running anywhere.

Well, this is what Vanita Gupta, the former head of the DOJ Civil Rights Division, wrote in Politico. Simply put, she wrote, while major news organizations are strengthening fact-checking and accountability, Facebook is saying, if you are a politician who wishes to peddle lies, distortion, and not-so-subtle racial appeals, welcome to our platform. You will not be fact-checked. You are automatically newsworthy. You're automatically exempt from scrutiny.

So I know Vanita, and I've had a chance to speak to her since she posted that, and I think the debate is really important. I've had a chance to work with her on our civil rights work. We've taken a lot of feedback from her already and continue to. What she was writing there was not only about ads; it was really about content on the platform. So taking a step back, here's what we do when you write something. We have a very strong free expression bent. We think it's very important that we judge as little as possible and let people express themselves. But we don't allow everything on the platform.
If something is hate, terrorism, violence, bullying, you know, hate against protected classes, it comes down, we take it off, and the same goes for voter suppression. If something is false, misinformation, fake news, we don't take it off. We send it to third-party fact checkers. If they mark it as false, we mark it as false. If you go to share it and it's marked as false, we warn you with a pop-up and we say, do you want to share this? It's been marked as false. We dramatically decrease its distribution, and we show related articles.

How can you possibly do that with two point seven billion users? How can you possibly keep up with all the content that's being produced on Facebook and distributed and shared, et cetera?

We can't fact-check everything. We're not trying to fact-check everything or send everything to third-party fact checkers at all. We prioritize in terms of what's moving most quickly. So when something is growing really quickly, it gets referred; it goes to the top of the heap for sending to fact checkers. And these are really news links. You know, you're a bad example because you're a journalist, but if my sister writes a post about her kids and her dogs, which she does all the time, that's not getting fact-checked. That said, the challenges of scale here are really important, and in a lot of the areas where we are reluctant to weigh in, it's because we know we can't do this well at scale, so we have to rely on other sources. I think one of the most important things we're rolling out in the next year is our content advisory board. We understand that there are real concerns with the amount of power and control we have, that right now we are the ultimate arbiters of what stays on our service, and so we're setting up a content review board. The final charter has just been released.
We've consulted with over a thousand experts around the world, there are going to be forty people appointed, and by next year they're going to start hearing cases. They don't report to me; they don't report to Mark. It means that if you disagree because something was pulled down and you think it should be up, or if you disagree because we are letting something run from someone else that you don't think should be up, you have a place to go, and we're going to abide by their decisions.

Since two thirds of people now get their news and information from social media, do you have any responsibility, in your view, to at least attempt to make sure that the news on your platform is factual? Because oftentimes I've heard, well, we're a platform, we're not a publisher, and so we're basically the pipes. So where do you see your responsibility in terms of that?

So we do think we have a responsibility for fake news and misinformation.

Would you say you're still not a publisher?

Well, what would you call it? So that is a complicated thing, and it means different things to different people. Here's what we are: we are a technology company. A lot of things are published on us. But I think when people ask that question, they're wondering if we take responsibility for what's on our service, and my answer to you is yes. We're not a publisher in the traditional sense, because we don't have editors who are fact-checking, but we take responsibility, and what we've done on misinformation has decreased people's interactions with it. Stanford just published a study; they're down by more than half since then. It's not perfect, and we're not able to fact-check everything, but we had no policies against this in the last election, and you fast-forward to today, I think we are in an imperfect but much stronger position.

Let's talk about the free speech rationale. At Georgetown, Mark used Martin Luther King Jr.'s name in his defense of free speech on Facebook, but King's daughter, Bernice, tweeted, I'd like to help Facebook better understand the challenges that MLK faced from disinformation campaigns launched by politicians. These campaigns created an atmosphere for his assassination. And then Sherrilyn Ifill, as you know, president of the NAACP Legal Defense Fund, called his speech, quote, a profound misreading of the civil rights movement in America and a dangerous misunderstanding of the political and digital landscape we now inhabit.

It was a controversial speech, and I think the civil rights concerns are very real. In terms of Bernice King, you know her, her father's legacy; I know her. I actually spoke to her after that tweet, totally scheduled separately. She's coming to Facebook tomorrow, and I'm going to be in your chair interviewing her, and then I'm hosting her for dinner tomorrow night. And what I told her is what I'll say to you, which is that I was grateful she published. We would have liked her to post on Facebook, not just tweet, but we were grateful she spoke out, because this is the dialogue we want to have. And she actually tweeted again this morning that she heard from Mark and is looking forward to sitting down and talking with him about civil rights.

She's smooth, isn't she?

I mean, these are just facts. She tweeted it; you can check. Again, to my friend Bernice, we'd like you to post on our platform too. But this is the dialogue, right? There's a lot of disagreement. Civil rights and protecting civil rights are hugely important to Mark, hugely important to me. I'm personally leading the civil rights work at Facebook, and we'll continue to do that. And while we don't agree with everything, and there was certainly disagreement over some of Mark's speech, there were other things that we've done because we've listened to them and learned over the last year that I think they feel really good about.
We've taken much stronger steps on hate; we looked at white nationalism and white separatism because they informed us of it. We've really come down with a very strong policy on voter suppression. We are taking down voter suppression just as we do hate. If you publish, you know, the polls are open on Wednesday, not Tuesday, we're taking that down, because it's as important to us as hate. And that's all based on that work.

And so why is voter suppression more important than voter misinformation?

It's not. It's not more important. It's just a question of how we handle it. When we have misinformation, what we believe is that unless it's hate or it's going to lead to real-world violence, we need to let the debate continue. We dial the distribution down massively; as I said, we don't want things to go viral. We mark them as false, but then we publish related articles: here's the other side of the story. We think that's how people get informed, that it's the discourse about the discourse. It is Mark giving a speech and Bernice King disagreeing with it publicly, and it's that dialogue that matters. Whereas if it's hate, or if someone's really going to show up to the polls on the wrong day, we just want to take it off our service. And this is really hard, because one person will really disagree with something, and we do too, but they think it's someone else's free expression, and so these lines are going to continue to be really hard to draw.

Do you really think that people use Facebook as an opportunity to look at both sides and to see something when it's corrected? Or don't you think that people are getting stuff in their feed that really just affirms the information they already believe?

I'm so glad you asked this, because there's actually really strong data here and no one understands this. So when you think about your contacts in the world, psychologists say you have what's called your tight circle of contacts and then your broader circle of contacts.
So you basically can keep in touch with five to seven people. That's your mom, your daughters, your husband John, the people who you know where they are. What Facebook enables you to do is keep in touch with many more people. Without Facebook, without social media, without Instagram, Twitter, you won't hear from your college friends or the people you grew up with that often. So if you compare people who do not use social media to people who do, the people who use social media see much broader points of view, because if you don't use social media, you go to maybe one or two news outlets, they have one particular point of view, you read one or two newspapers, and that's it. On Facebook, on average, some of the news you see will be from another point of view, which means it's not half and half, but it is a broadening of your views. And that's something that I don't think we've been able to explain or other people really understand. And the reason for that is, if you go to your news feed, you don't see, like, half blue and half red. You just see somewhat more from the other side than you otherwise would. So it is unequivocally true that Facebook usage and usage of social media shows you broader points of view, not narrower points of view, than you would see otherwise. And that's something no one understands.

When we come back, we take a deep dive into the rise of deepfakes, Facebook's role in the increasing polarization of our country, and what the consequences should be if the company doesn't put the proper safeguards in place for the presidential election.

Let's talk about the free speech argument, which came under attack earlier this year when Facebook decided not to take down that doctored video of House Speaker Nancy Pelosi. Her speech was slowed down, it made her appear to be slurring her words, and people thought she was drunk.
You defended the decision by saying, we think the only way to fight bad information is with good information, and you said it had been marked as false. But at that point, Sheryl, it had been viewed two point five million times. So isn't the damage already done at that point, like when you run a correction in the newspaper two days later, in tiny print on page two? And studies have shown that if you see the false story enough and the correction fewer times, then the false story actually stays in your head. Not to mention another study, by MIT, that found fake news spreads about seventy percent faster than real news on Twitter. So I guess, isn't the current standard operating procedure on videos like this a case of too little, too late?

I think with the Pelosi video, and we have said this publicly, things moved way too slowly. Not the fact checkers; the process for getting it to them and getting it back moved way too slowly. And we've made a change in how we do that, to prioritize things that are moving quickly and massively cut down the review time. In that case, we should have caught it way earlier. We think you're right, and we want our systems to work better.

Because now the technology allows people to appear to be doing or saying something other than what they're actually doing or saying. I mean, how do you keep up with all of those things?

Well, deepfakes is what you're talking about.

That's what I meant, deepfakes.

Yeah. And it's a new and emerging area, and it is definitely one where we don't believe we know everything, because we don't even know what they're going to look like. Here's what we know. We know we're going to need to move way, way, way faster. We know we're going to need very sophisticated engineering to detect them in the first place.
We also know that the policies themselves are hard to set right. This is an area where we know we moved too slowly with the Pelosi video, and we are trying to move faster. But we're also setting up working groups, AI working groups, to try to develop the technology that will help us identify these in the first place.

I wanted to ask you about Joe Biden, because I know he's cut down substantially on his Facebook ad spending because he wasn't seeing a very good return. Some strategists have speculated that his message is too centrist and lacking the inflammatory red-meat content that does so well on platforms like Facebook. Are you concerned that you are creating an environment where the most aggressive, inflammatory, tribal content is what sells? I know you addressed that briefly in saying that people get different points of view, but certainly people seem to gravitate towards that kind of content.

I mean, I think that's true across political discourse. I think it's a problem we face. I think you see it in rallies, I think you see it in the debates. I think the problem of people making more inflammatory statements and people rallying to those, particularly as things get more polarized, is a real problem. I don't think it's unique to us.

But do you think you've contributed to the polarization in the country?

Um, I think everything has contributed. I do think Facebook is, I think, held accountable for that. Well, I think we have a responsibility to help make sure people get real information, to help them participate in the debate, and to make sure that people can see other points of view.

But are they getting real information if they are getting the most aggressive, inflammatory content? In other words, more moderate points of view are not as provocative; they don't stoke outrage as much as some of this other content.

Look, I think that's true. I think you see it in rallies too.
I think you see it on social media, I think you see it in rallies, when does the crowd cheer, and I think you see it in the debates. But here's what matters. What matters is that we want people to be able to communicate and express themselves. We want people to register to vote and stay in the political process. What I will most worry about is if people start opting out. So one of the things I'm proud of that Facebook has done is we registered over two million people to vote, and on Facebook, when you turn eighteen, we basically say happy birthday, and you should register to vote. We have a really easy tool that lets you find your local representative; most people don't know who their local representative is. So yes, I worry about all that, but we also work on core engagement: making sure that people don't just opt out but stay engaged, that they vote, that they know who their representatives are and who they're voting for, and that they participate in the debate.

Mark said recently, in leaked audio from an internal Facebook meeting, that if Elizabeth Warren becomes president and tries to break up the company, it would be an existential threat and Facebook would go to the mat. What does that mean exactly, go to the mat?

We'll have to see. But what this is about is whether or not Facebook should be broken up. And that's a really important question. I think we're facing it, I think all the tech companies are facing it, and it's interesting.

What do you think about the fear about that?

Well, I don't know if it's the biggest fear. I just think it's...

Would you be okay if it was broken up?

Well, we don't want Facebook to be broken up, because we think we're able to provide great services across the board. We think we're able to invest in security across the board.
So do we invest enough in security? Across the board, we invest a lot, and we're investing much, much, much more. We have hired an extra thirty-five thousand people, we've put in tremendous engineering resources, and we're doing things like red teams, asking, what do we think the bad guys would do, and how would we do it? We're never going to fully be ahead of everything, but if you want to understand what companies care about, you look at where they invest their resources. And if you look back three to five years and you look at today, we've totally changed where we invest our resources. My job has changed too. I've been at Facebook eleven and a half years. For the first eight or so, I spent most of my time growing the company and some time protecting the community. We always did some protection, but now that's definitely flipped: my job is majority building the systems that protect and minority growing. And so we're definitely changing as a company. We're in a different place across the board on all of these things.

Do you think you're changing enough, fast enough?

I hope so. We're trying. We're definitely trying. I mean, I think it's about not just the current threats but the next threat. The question we ask ourselves every day is, okay, we know what happened in two thousand sixteen, and now we're going to work to prevent it; what is the next thing someone is going to do? And that's going to take a lot of thought and a lot of cooperation across the board.

Do you see breaking up Facebook as the existential threat Mark Zuckerberg described? And how are you feeling about Elizabeth Warren these days?

So I know Elizabeth Warren, and...

Would you support her if she's the Democratic nominee?

I mean, I'm a Democrat. I have supported Democratic nominees in the past. I imagine I will support the Democratic nominee if it's Elizabeth Warren. I mean, I'm not in the primary right now.
I think that's a good place for us to be, and so I'm not going to let you drag me into the primary. But I am a very well understood Democrat. I was a supporter of Hillary Clinton. I have spoken for many years about my desire for my daughter and yours to see a woman as president. And so I'd like...

That sounds like a yes.

I'd like that. Not just here; I'd like that all over the world. I have this really funny story from a friend of mine in Germany whose son, I love this, said to his mother when he was five, I can't be chancellor. And she said, why not? He said, well, I'm not a girl. Because of Angela Merkel; the only chancellor he has ever known was Angela Merkel.

That's pretty good. You've said yourself that you have to get two thousand twenty right. What should be the consequences if Facebook doesn't?

I mean, I think we have to earn back trust. Trust is very easily broken; it is hard to earn back. I think we have to earn back trust. I think we need deeper cooperation across the board. We are arguing for regulation in some of these areas, including things that would impact foreign interference, and I think the consequences to us will be grave if we don't.

What is it? What does that mean, the consequences will be grave?

I think it would further erode trust. I think people will have less faith in the platform and our services. People are continuing to use our services; that's trust we need to earn back, not just with what we say but with what we do. And it is about finding those syndicates and taking them down. It is showing that we can cooperate across the board, on both sides of the aisle, in Congress and around the world, to find the things that threaten our democracy.

What can other people do to help Facebook solve some of these problems?

Well, thank you for the question. I mean, I think there are a lot of things.
So one of the things that makes us very different from where we were years ago is, I think, pretty radical transparency. So, for example, our community standards are completely public. We go public every week or so, I think every two weeks, with here are some of the decisions we're making, and we take feedback. We're publishing a transparency report, and by next year we're going to do it every quarter, just like earnings, because it's just as important to us as earnings. It has: here's all the stuff we took down, here's how many billions of accounts, that's where that number comes from, here's how much terrorism content, here's how much hate speech, and then how much of it did we find before it was reported to us. So what that report shows is that for ISIS and Al Qaeda content, of what we take down, we find it before it's reported. For hate speech, we're in the mid-sixty percentages now; that's more than double where we were a year and a half ago, but it still means that thirty percent of the hate speech we take down has to be reported to us, which means someone has seen it.

So it is whack-a-mole in a way, though, Sheryl, in that for everything you take down, something pops up in its place. How can you ever really get control over this?

Well, it is like whack-a-mole, right? We take something down; I mean, right now, as you and I have spoken on this stage, many people have posted things. Our job is to build technology that takes that down as quickly as possible, and to have enough human staff that they can take down the rest really quickly. It is whack-a-mole, but it is the price of free speech. We have a service that two point seven billion people are using. That means there's going to be, you know, all the beauty and all the ugliness of humanity. And our job, and it is whack-a-mole, is to get as much of the bad off as quickly as possible and let the good continue.
And the only way to get 579 00:30:55,920 --> 00:30:57,280 Speaker 1: rid of all of it is to shut down all 580 00:30:57,320 --> 00:30:59,480 Speaker 1: of these services. And I don't think anyone's really for that. 581 00:31:00,360 --> 00:31:03,240 Speaker 1: What about temporarily shutting them down so you can fix 582 00:31:03,280 --> 00:31:06,600 Speaker 1: the problems? Would you ever do anything like that? I 583 00:31:06,680 --> 00:31:09,800 Speaker 1: don't think a temporary shutdown would fix the problems, because 584 00:31:09,800 --> 00:31:11,640 Speaker 1: we have to be in the game to see what 585 00:31:11,680 --> 00:31:14,080 Speaker 1: people are doing to build the systems to shut it down. 586 00:31:14,400 --> 00:31:16,520 Speaker 1: But the point is people have speech now. Like, if 587 00:31:16,560 --> 00:31:19,240 Speaker 1: you think about my childhood, right, I grew up in Miami. 588 00:31:19,280 --> 00:31:21,080 Speaker 1: I went to public school. If I wanted to say 589 00:31:21,120 --> 00:31:22,960 Speaker 1: something to the world, I had no opportunity to 590 00:31:23,000 --> 00:31:26,719 Speaker 1: do it. Couldn't get on your show. No one was... No, seriously, 591 00:31:26,720 --> 00:31:28,560 Speaker 1: you weren't going to take me as a guest. No, 592 00:31:29,040 --> 00:31:31,600 Speaker 1: I wasn't young enough for that. But hypothetically, I 593 00:31:31,600 --> 00:31:33,880 Speaker 1: couldn't get on. If I wrote an 594 00:31:33,920 --> 00:31:35,520 Speaker 1: op-ed for the local paper, they weren't going to 595 00:31:35,560 --> 00:31:38,240 Speaker 1: take it. People did not have voice, full stop. Now, 596 00:31:39,200 --> 00:31:41,760 Speaker 1: that was a world people actually felt pretty comfortable in, 597 00:31:41,760 --> 00:31:45,160 Speaker 1: and you could fact check everything. Fast forward 598 00:31:45,200 --> 00:31:48,840 Speaker 1: to today, whatever services get shut down, you can post somewhere, 599 00:31:49,360 --> 00:31:51,400 Speaker 1: which means that everyone has voice, which means that things 600 00:31:51,400 --> 00:31:54,120 Speaker 1: are not fact checked. Now that doesn't mean we don't 601 00:31:54,120 --> 00:31:57,920 Speaker 1: have responsibility. We do, but we are in a fundamentally 602 00:31:57,960 --> 00:32:01,080 Speaker 1: different place where people around the world have voice. And 603 00:32:01,600 --> 00:32:04,200 Speaker 1: as hard as this is and as challenging as it is, 604 00:32:05,040 --> 00:32:09,160 Speaker 1: I so deeply believe in that world, so deeply. 605 00:32:09,960 --> 00:32:11,479 Speaker 1: A friend of mine backstage went 606 00:32:11,520 --> 00:32:14,640 Speaker 1: to my high school, and our high school teacher found a 607 00:32:14,720 --> 00:32:19,160 Speaker 1: kidney donor on Facebook because she could publish, and she 608 00:32:19,200 --> 00:32:21,600 Speaker 1: could reach people in a way she never could. We 609 00:32:21,760 --> 00:32:24,480 Speaker 1: just announced that two billion dollars have been raised by 610 00:32:24,480 --> 00:32:28,160 Speaker 1: people on Facebook for their birthdays 611 00:32:28,160 --> 00:32:31,160 Speaker 1: and their personal fundraisers. Does that mean everything on Facebook 612 00:32:31,240 --> 00:32:33,400 Speaker 1: is good? Of course not. But you can't shut this 613 00:32:33,480 --> 00:32:36,360 Speaker 1: down without shutting down a lot of good. 
And I 614 00:32:36,400 --> 00:32:38,680 Speaker 1: don't think that's an acceptable answer. And so we're going 615 00:32:38,720 --> 00:32:41,840 Speaker 1: to fight to get the bad off and let the 616 00:32:41,880 --> 00:32:43,880 Speaker 1: good keep happening. And I think there is a lot 617 00:32:43,920 --> 00:32:47,560 Speaker 1: of good out there. When we come back, a look 618 00:32:47,560 --> 00:32:51,520 Speaker 1: at the alarming psychological effects of social media on our kids, 619 00:32:52,040 --> 00:32:54,400 Speaker 1: whether it's time to take a second look at Lean 620 00:32:54,520 --> 00:32:57,200 Speaker 1: In in light of the Me Too movement, and I'll 621 00:32:57,280 --> 00:33:06,920 Speaker 1: ask Cheryl about her legacy. Let's talk about kids and 622 00:33:07,000 --> 00:33:12,440 Speaker 1: social media. This isn't so good. The addictive nature, 623 00:33:12,440 --> 00:33:15,880 Speaker 1: the addictive nature of social media is just one concern. 624 00:33:16,280 --> 00:33:18,560 Speaker 1: But as you know, I know you have two kids, 625 00:33:19,080 --> 00:33:24,120 Speaker 1: twelve and fourteen. Now, depression is up dramatically among young people, 626 00:33:24,240 --> 00:33:27,320 Speaker 1: and the suicide rate of adolescent girls is up one 627 00:33:27,440 --> 00:33:31,760 Speaker 1: hundred and seventy percent after two decades of decline. And as 628 00:33:31,800 --> 00:33:35,680 Speaker 1: you know, the leading explanation is the arrival of smartphones 629 00:33:35,760 --> 00:33:38,840 Speaker 1: and social media. So, as a parent and someone who 630 00:33:38,920 --> 00:33:42,520 Speaker 1: has been a powerful voice for women, how do you 631 00:33:42,600 --> 00:33:47,120 Speaker 1: respond to that terrifying statistic and the bigger question, what 632 00:33:47,280 --> 00:33:50,600 Speaker 1: can be done about it? We take this really seriously. 633 00:33:50,640 --> 00:33:52,680 Speaker 1: I take it seriously as a Facebook executive. I take 634 00:33:52,720 --> 00:33:55,479 Speaker 1: it seriously as a mom. So it turns out that 635 00:33:55,640 --> 00:33:58,880 Speaker 1: all uses of phones, all uses of social media, are 636 00:33:58,920 --> 00:34:01,600 Speaker 1: not equal. There are some that are actually quite good 637 00:34:01,640 --> 00:34:03,320 Speaker 1: for well being, and there are some that are not 638 00:34:03,400 --> 00:34:06,680 Speaker 1: as good. So when you are actively consuming, when you 639 00:34:06,720 --> 00:34:09,680 Speaker 1: are sharing, when you are messaging, when you are posting, liking, 640 00:34:10,000 --> 00:34:13,520 Speaker 1: you're interacting with people, that's fairly positive. When you are 641 00:34:13,560 --> 00:34:17,120 Speaker 1: more passively consuming, that is more negative. And so we 642 00:34:17,160 --> 00:34:21,520 Speaker 1: made a very big change to the Facebook algorithms in January. 643 00:34:21,840 --> 00:34:24,759 Speaker 1: And what about Instagram as well? Yeah, and Instagram we're 644 00:34:24,760 --> 00:34:27,000 Speaker 1: working on as well. But we dramatically dialed up the 645 00:34:27,040 --> 00:34:32,359 Speaker 1: friends and family sharing and dramatically dialed down the passive consumption. On self harm, 646 00:34:32,440 --> 00:34:35,600 Speaker 1: our policies are very strict. We do not allow any 647 00:34:35,600 --> 00:34:38,839 Speaker 1: glorification of it. We don't allow 648 00:34:38,840 --> 00:34:41,400 Speaker 1: any encouragement. 
649 00:34:41,760 --> 00:34:44,480 Speaker 1: We do allow people to post about their experiences, and 650 00:34:44,520 --> 00:34:47,839 Speaker 1: that has been very important. We've worked really hard to 651 00:34:47,840 --> 00:34:51,400 Speaker 1: develop automated tools, so if you post something that looks 652 00:34:51,440 --> 00:34:54,080 Speaker 1: like you might be about to self harm, we will 653 00:34:54,120 --> 00:34:58,759 Speaker 1: automatically flag it and show phone numbers and helplines. We've had a 654 00:34:58,800 --> 00:35:02,280 Speaker 1: tremendous response from this, and if we think there's imminent danger, 655 00:35:02,320 --> 00:35:04,920 Speaker 1: we refer it to local law enforcement, and many people 656 00:35:04,960 --> 00:35:07,879 Speaker 1: have actually been saved by this. The other thing we're... Well, 657 00:35:07,920 --> 00:35:11,840 Speaker 1: that's sort of not addressing the problem of addiction, of, 658 00:35:12,120 --> 00:35:14,840 Speaker 1: you know, comparison being the thief of joy. Let me 659 00:35:14,880 --> 00:35:16,520 Speaker 1: finish some of the other things we're doing, because these 660 00:35:16,520 --> 00:35:19,280 Speaker 1: are all really important, and I'm conscious that this clock 661 00:35:19,360 --> 00:35:22,400 Speaker 1: is beeping at us. So they're gonna give me 662 00:35:22,440 --> 00:35:24,200 Speaker 1: a little extra? They are? Okay, then I can 663 00:35:24,239 --> 00:35:30,520 Speaker 1: slow down. So one of the other things that 664 00:35:30,600 --> 00:35:33,359 Speaker 1: happens is, you know, social media can be considered by some 665 00:35:33,880 --> 00:35:35,960 Speaker 1: to be a place where, you know, you're supposed to 666 00:35:35,960 --> 00:35:38,600 Speaker 1: have the perfect life, the perfect body, a 667 00:35:38,640 --> 00:35:41,160 Speaker 1: real issue for teenage girls, which you and I have talked about. 668 00:35:41,600 --> 00:35:43,480 Speaker 1: We're really trying to go against that. We ran a 669 00:35:43,480 --> 00:35:47,560 Speaker 1: campaign that's very popular on Instagram with real men 670 00:35:47,600 --> 00:35:50,759 Speaker 1: and women with real body types talking about that. We've 671 00:35:50,800 --> 00:35:53,799 Speaker 1: worked with the National Suicide Awareness Lines on this. We're 672 00:35:53,800 --> 00:35:57,000 Speaker 1: working with the WHO on mental health. We're 673 00:35:57,000 --> 00:36:00,120 Speaker 1: also... I think the answer is almost always technology. So 674 00:36:00,160 --> 00:36:01,680 Speaker 1: one of the things I think is great: we have 675 00:36:01,719 --> 00:36:05,400 Speaker 1: a comment warning now that we've been rolling out, where 676 00:36:05,400 --> 00:36:08,200 Speaker 1: our automatic filters detect that you might be posting something 677 00:36:08,239 --> 00:36:10,759 Speaker 1: that's not nice. We will do a pop up and 678 00:36:10,800 --> 00:36:13,360 Speaker 1: say, do you really want to post that? And again 679 00:36:13,400 --> 00:36:16,799 Speaker 1: we're seeing a tremendous response. We also have abilities to 680 00:36:16,840 --> 00:36:19,920 Speaker 1: restrict people to prevent bullying, so that, you know, if 681 00:36:19,920 --> 00:36:23,040 Speaker 1: someone were bullying you, you can restrict them. They won't 682 00:36:23,080 --> 00:36:25,720 Speaker 1: know you're restricting them, and if they comment on your post, 683 00:36:25,760 --> 00:36:27,839 Speaker 1: no one can see them. 
And so these issues are 684 00:36:27,920 --> 00:36:30,640 Speaker 1: real, and we have to work hard on building the 685 00:36:30,719 --> 00:36:33,919 Speaker 1: technology, and that technology is the answer. There's so many 686 00:36:34,040 --> 00:36:38,400 Speaker 1: huge challenges, and how difficult is it, Cheryl, truly, to 687 00:36:38,480 --> 00:36:42,200 Speaker 1: address any of these when solving them in some ways 688 00:36:42,280 --> 00:36:45,800 Speaker 1: works against your business model? You know, one critic said 689 00:36:45,840 --> 00:36:49,920 Speaker 1: Facebook has priced itself out of morality, and I'm just 690 00:36:50,080 --> 00:36:55,080 Speaker 1: curious if implementing some of these changes is bad for business. 691 00:36:55,160 --> 00:36:57,640 Speaker 1: So on this, I'm really pretty proud of our track 692 00:36:57,680 --> 00:37:00,839 Speaker 1: record. If you look back a number of years and 693 00:37:00,880 --> 00:37:03,279 Speaker 1: you listen to our earnings calls, so earnings calls are 694 00:37:03,320 --> 00:37:05,960 Speaker 1: exactly what people are worried about, they're directed at investors, 695 00:37:05,960 --> 00:37:07,960 Speaker 1: it's our quarterly report. If you actually watch us on 696 00:37:08,040 --> 00:37:11,640 Speaker 1: earnings calls, we are spending as much time talking about 697 00:37:11,640 --> 00:37:14,239 Speaker 1: the measures we take on safety and security as we 698 00:37:14,280 --> 00:37:18,120 Speaker 1: are about our business growth. Easily. We actually said many 699 00:37:18,200 --> 00:37:20,719 Speaker 1: quarters ago, this is so important to us that we 700 00:37:20,760 --> 00:37:23,759 Speaker 1: are going to make massive investments and change the profitability 701 00:37:23,760 --> 00:37:26,880 Speaker 1: of our company by making real resource investments. And we 702 00:37:27,000 --> 00:37:29,560 Speaker 1: have, to the tune of billions and billions of dollars, 703 00:37:29,560 --> 00:37:32,320 Speaker 1: and we will keep doing it. We've taken action after 704 00:37:32,360 --> 00:37:36,400 Speaker 1: action after action that is better for protecting the community 705 00:37:36,640 --> 00:37:38,279 Speaker 1: than it is for our growth, and we're going to 706 00:37:38,360 --> 00:37:40,200 Speaker 1: continue to do that. Mark has said it over and 707 00:37:40,239 --> 00:37:41,959 Speaker 1: over again. I have said it over and over again. 708 00:37:42,040 --> 00:37:45,000 Speaker 1: Let me ask you about Mark testifying before the House 709 00:37:45,080 --> 00:37:48,400 Speaker 1: Financial Services Committee in a hearing focused on Facebook's plans 710 00:37:48,440 --> 00:37:51,719 Speaker 1: to launch a new digital currency called Libra. Given the 711 00:37:51,760 --> 00:37:55,200 Speaker 1: massive reach, and the trust issues the public has experienced with Facebook 712 00:37:55,280 --> 00:37:59,480 Speaker 1: selling personal information through third parties, is it realistic to 713 00:38:00,000 --> 00:38:04,800 Speaker 1: expect the world to embrace a cryptocurrency initiative like Libra, 714 00:38:05,000 --> 00:38:09,680 Speaker 1: given that protecting personal financial data really is next level 715 00:38:09,719 --> 00:38:13,759 Speaker 1: in terms of the need for security? 
And I understand 716 00:38:14,080 --> 00:38:16,160 Speaker 1: you were supposed to testify, but you had kind of 717 00:38:16,200 --> 00:38:19,400 Speaker 1: a testy exchange with Maxine Waters when you were up 718 00:38:19,400 --> 00:38:22,360 Speaker 1: on Capitol Hill or somewhere. Can you tell us what happened? 719 00:38:22,400 --> 00:38:24,200 Speaker 1: We have a lot of respect for Maxine Waters and for 720 00:38:24,200 --> 00:38:26,080 Speaker 1: the work she's done, and we worked really closely with 721 00:38:26,080 --> 00:38:29,040 Speaker 1: her committee. It was her choice to have Mark testify, 722 00:38:29,080 --> 00:38:33,040 Speaker 1: and that's obviously something we respect. But what happened between 723 00:38:33,040 --> 00:38:37,160 Speaker 1: you? I'll just answer the question, if you don't mind. On Libra, 724 00:38:37,280 --> 00:38:40,000 Speaker 1: what we have said is that we are working on 725 00:38:40,040 --> 00:38:42,719 Speaker 1: a digital currency. I think it's really important to think 726 00:38:42,719 --> 00:38:45,640 Speaker 1: about how many people in the world are not financially 727 00:38:45,680 --> 00:38:48,400 Speaker 1: included in the banking system. By the way, not a shock, 728 00:38:48,760 --> 00:38:52,600 Speaker 1: most of those are women. Women pay huge remittance fees. 729 00:38:52,640 --> 00:38:54,759 Speaker 1: If you go to work as a domestic worker in 730 00:38:54,800 --> 00:38:58,200 Speaker 1: another home in another country, you're sending back money and 731 00:38:58,200 --> 00:39:00,200 Speaker 1: you're paying larger fees if you're a woman. And there 732 00:39:00,239 --> 00:39:03,400 Speaker 1: are people who are unbanked. They work in the fields 733 00:39:03,440 --> 00:39:06,240 Speaker 1: and their money can be stolen by anyone, and women 734 00:39:06,239 --> 00:39:08,640 Speaker 1: are the most vulnerable. So I think there are really 735 00:39:08,680 --> 00:39:11,640 Speaker 1: good reasons for a digital currency to exist, and I 736 00:39:11,640 --> 00:39:13,760 Speaker 1: think they will be good for a lot of people. 737 00:39:13,920 --> 00:39:17,680 Speaker 1: That said, we've been very clear that we're not launching 738 00:39:17,680 --> 00:39:21,560 Speaker 1: this until we have regulatory approval. It's not a Facebook project. 739 00:39:21,880 --> 00:39:25,799 Speaker 1: The currency itself is set up through an international nonprofit that 740 00:39:25,840 --> 00:39:28,120 Speaker 1: we are part of. I know that we wanted to 741 00:39:28,120 --> 00:39:31,319 Speaker 1: have a moment to talk about Lean In and some 742 00:39:31,400 --> 00:39:34,560 Speaker 1: of the research that you have found about the discomfort 743 00:39:34,640 --> 00:39:38,680 Speaker 1: men feel mentoring and spending time alone with women. This 744 00:39:38,760 --> 00:39:42,160 Speaker 1: is something that greatly concerns you. And what can we 745 00:39:42,200 --> 00:39:46,439 Speaker 1: do about the increasing unwillingness of men to mentor their 746 00:39:46,480 --> 00:39:50,120 Speaker 1: female colleagues? And tell us a little more about that research. 
Well, 747 00:39:50,160 --> 00:39:53,480 Speaker 1: it's really important because, look, the Me Too movement, which you and 748 00:39:53,520 --> 00:39:54,960 Speaker 1: I have had a chance to talk about, is so 749 00:39:55,000 --> 00:39:58,279 Speaker 1: important, because women have faced too much harassment for too 750 00:39:58,320 --> 00:40:00,400 Speaker 1: long, and I think we're in a better place, but 751 00:40:00,440 --> 00:40:04,160 Speaker 1: we're certainly not protecting everyone we should. That said, we 752 00:40:04,200 --> 00:40:06,839 Speaker 1: have to worry about the unintended consequences. So what our 753 00:40:06,880 --> 00:40:09,920 Speaker 1: research shows, this is Lean In and SurveyMonkey, is 754 00:40:09,960 --> 00:40:14,759 Speaker 1: that sixty percent of male managers in the United States 755 00:40:14,800 --> 00:40:18,040 Speaker 1: right now are nervous about having a one 756 00:40:18,120 --> 00:40:21,360 Speaker 1: on one interaction with a woman, including a meeting. Let's 757 00:40:21,440 --> 00:40:23,840 Speaker 1: do a show of hands in the audience: who's promoted 758 00:40:23,880 --> 00:40:28,640 Speaker 1: someone you've never met with? Just in case you can't 759 00:40:28,640 --> 00:40:31,560 Speaker 1: see, there are no hands. If you cannot get a meeting, 760 00:40:31,600 --> 00:40:34,560 Speaker 1: you cannot get a promotion. A senior man in the 761 00:40:34,560 --> 00:40:37,239 Speaker 1: world today is nine times more likely to hesitate to 762 00:40:37,280 --> 00:40:40,480 Speaker 1: travel with a junior woman and six times more likely 763 00:40:40,520 --> 00:40:43,560 Speaker 1: to hesitate to have dinner with a junior 764 00:40:43,600 --> 00:40:45,960 Speaker 1: woman than with a junior man. So who's getting the travel? The men. 765 00:40:46,000 --> 00:40:47,839 Speaker 1: Who's getting the dinners? The men. And who's gonna get 766 00:40:47,880 --> 00:40:50,640 Speaker 1: promoted? The men. Which is what was happening before, and 767 00:40:51,400 --> 00:40:54,640 Speaker 1: the research talks a lot about that. It's absolutely the case that you 768 00:40:54,680 --> 00:40:57,800 Speaker 1: promote the people you know better. Now, I think everyone 769 00:40:57,800 --> 00:40:59,800 Speaker 1: should be able to do all of these things with everyone. 770 00:41:00,200 --> 00:41:01,600 Speaker 1: You should be able to have a meeting, keep the 771 00:41:01,640 --> 00:41:03,600 Speaker 1: door open if you want. Travel does not 772 00:41:03,719 --> 00:41:06,960 Speaker 1: mean a hotel room; travel means a public airport. Dinner 773 00:41:07,000 --> 00:41:10,000 Speaker 1: does not mean your flat; dinner means a restaurant. We 774 00:41:10,120 --> 00:41:11,400 Speaker 1: have to be able to do all of this. 
But 775 00:41:11,440 --> 00:41:13,960 Speaker 1: what we really want men to understand is that if 776 00:41:14,000 --> 00:41:15,520 Speaker 1: you're not going to have dinner with women, don't have 777 00:41:15,520 --> 00:41:20,360 Speaker 1: dinner with men. Group lunches for everyone. Make access equal, 778 00:41:20,400 --> 00:41:23,239 Speaker 1: because if we don't make access equal, we're never going 779 00:41:23,280 --> 00:41:25,320 Speaker 1: to move these numbers at the top, and women today 780 00:41:25,680 --> 00:41:30,520 Speaker 1: have seven percent, seven percent, of the CEO jobs. Before we go, 781 00:41:30,520 --> 00:41:32,200 Speaker 1: I want to talk to you, because we talked 782 00:41:32,239 --> 00:41:35,520 Speaker 1: about Lean In prior to Me Too, and given the 783 00:41:35,560 --> 00:41:39,279 Speaker 1: systemic failures of so many organizations that we've seen that 784 00:41:39,320 --> 00:41:43,840 Speaker 1: have tolerated sexual misconduct and harassment, silenced women through 785 00:41:43,880 --> 00:41:46,960 Speaker 1: NDAs, do you think, in retrospect, given the very 786 00:41:47,040 --> 00:41:50,319 Speaker 1: real revelations that have surfaced as a result of the 787 00:41:50,360 --> 00:41:53,279 Speaker 1: Me Too movement, Lean In might have put too much 788 00:41:53,280 --> 00:41:56,240 Speaker 1: of the onus on women to change instead of getting 789 00:41:56,280 --> 00:41:59,200 Speaker 1: a lot of these screwed up companies to change? Well, 790 00:41:59,200 --> 00:42:02,080 Speaker 1: we've always said... One of the problems with the words 791 00:42:02,160 --> 00:42:05,120 Speaker 1: lean in is you can really oversimplify without actually reading 792 00:42:05,120 --> 00:42:07,040 Speaker 1: the book itself. But if you actually read what 793 00:42:07,080 --> 00:42:09,480 Speaker 1: we've written and the work my foundation has done, what 794 00:42:09,600 --> 00:42:12,120 Speaker 1: we've always said is that we wanted it to be okay 795 00:42:12,160 --> 00:42:15,480 Speaker 1: for women to be ambitious, and we want companies to 796 00:42:15,600 --> 00:42:18,160 Speaker 1: change and fix it, and it has to be both. It's 797 00:42:18,160 --> 00:42:21,600 Speaker 1: actually pretty interesting: if you say the sentence he's ambitious, 798 00:42:22,280 --> 00:42:24,279 Speaker 1: it's pretty neutral or positive. He's going to get the 799 00:42:24,320 --> 00:42:27,759 Speaker 1: job done. She's ambitious? That's a negative. And that is 800 00:42:27,840 --> 00:42:30,640 Speaker 1: still true today if you look at the use of 801 00:42:30,640 --> 00:42:34,800 Speaker 1: the word bossy. You know, go to the playground anywhere, 802 00:42:34,880 --> 00:42:37,120 Speaker 1: I promise, L.A. or anywhere, this weekend, and you'll 803 00:42:37,160 --> 00:42:38,960 Speaker 1: see a little girl get called bossy. You won't see a little boy 804 00:42:39,000 --> 00:42:41,680 Speaker 1: get called bossy. And you walk up to her parents, 805 00:42:41,760 --> 00:42:44,560 Speaker 1: her parents probably did it, and you say, 806 00:42:44,560 --> 00:42:47,160 Speaker 1: big smile on your face, that little girl's not bossy, that little 807 00:42:47,160 --> 00:42:52,640 Speaker 1: girl has executive leadership skills. No one says that. No 808 00:42:52,680 --> 00:42:57,040 Speaker 1: one says that, because we don't expect leadership from girls, 809 00:42:57,120 --> 00:42:59,759 Speaker 1: and so we have to fix that problem. 
And that 810 00:42:59,840 --> 00:43:02,960 Speaker 1: means companies have to change, culture has to change, and 811 00:43:03,000 --> 00:43:05,960 Speaker 1: women have to feel free. Now they're really... Well, I 812 00:43:06,000 --> 00:43:11,000 Speaker 1: have one more question, but it's time to wrap. 813 00:43:11,080 --> 00:43:15,040 Speaker 1: Thank you, Graham. Was that my final question? Getting 814 00:43:15,080 --> 00:43:18,960 Speaker 1: back, getting back to all the controversies, I mean, Facebook... 815 00:43:20,280 --> 00:43:23,520 Speaker 1: My last question is, I'm gonna... but no, 816 00:43:23,600 --> 00:43:26,960 Speaker 1: I'm curious, because I just wanted to end this conversation, Cheryl, 817 00:43:27,080 --> 00:43:32,040 Speaker 1: given all the controversy Facebook is facing, clearly in the crosshairs, 818 00:43:32,280 --> 00:43:36,680 Speaker 1: I mean, the company people love to hate. Since you 819 00:43:36,800 --> 00:43:40,920 Speaker 1: are so associated with Facebook, how worried are you about 820 00:43:41,000 --> 00:43:45,040 Speaker 1: your personal legacy as a result of your association with 821 00:43:45,080 --> 00:43:49,000 Speaker 1: this company? I think I have a really big responsibility 822 00:43:49,040 --> 00:43:51,920 Speaker 1: here for a company I love and believe in. 823 00:43:52,080 --> 00:43:55,240 Speaker 1: I really believe in what I said about people having voice. 824 00:43:55,360 --> 00:43:57,840 Speaker 1: I really know that when I was growing up, I 825 00:43:57,880 --> 00:44:01,239 Speaker 1: had no ability to reach anyone, and most people in 826 00:44:01,280 --> 00:44:04,439 Speaker 1: the world didn't, and social media has changed that. There 827 00:44:04,520 --> 00:44:06,640 Speaker 1: are a lot of problems to fix, and we did 828 00:44:06,640 --> 00:44:09,080 Speaker 1: a great job in front of this audience talking about a lot 829 00:44:09,120 --> 00:44:12,080 Speaker 1: of them in this interview. They're real, and I have 830 00:44:12,120 --> 00:44:15,160 Speaker 1: a real responsibility to do it. But I feel more 831 00:44:15,200 --> 00:44:18,560 Speaker 1: committed and energized than ever, because I want to fight 832 00:44:18,600 --> 00:44:21,600 Speaker 1: to preserve the good. Because I met a woman not 833 00:44:21,680 --> 00:44:25,200 Speaker 1: so long ago who for her birthday raised four thousand 834 00:44:25,200 --> 00:44:28,239 Speaker 1: dollars for a domestic violence shelter that she volunteers at, 835 00:44:28,719 --> 00:44:31,399 Speaker 1: and, crying, she told me, I saved two women from 836 00:44:31,440 --> 00:44:34,920 Speaker 1: domestic abuse. I never could have done that before Facebook. 837 00:44:35,320 --> 00:44:38,319 Speaker 1: And so there are really big issues to fix, but 838 00:44:38,400 --> 00:44:42,040 Speaker 1: I am so committed to giving people voice and giving 839 00:44:42,080 --> 00:44:44,480 Speaker 1: people a way to react that I just want to keep 840 00:44:44,520 --> 00:44:46,839 Speaker 1: doing the work. I feel honored to do 841 00:44:46,880 --> 00:44:49,759 Speaker 1: it and committed to fixing the problems. I want to fix 842 00:44:49,840 --> 00:44:52,600 Speaker 1: them. All right, well, they're definitely gonna kill me if 843 00:44:52,640 --> 00:44:55,680 Speaker 1: I don't stop now. Cheryl Sandberg, thank you, thank you. 844 00:44:59,239 --> 00:45:02,560 Speaker 1: After we were done, Cheryl and I later exchanged emails. 
845 00:45:02,640 --> 00:45:04,920 Speaker 1: She told me this was the toughest interview she had 846 00:45:04,960 --> 00:45:09,200 Speaker 1: ever done, but complimented me on being so well prepared. 847 00:45:09,480 --> 00:45:13,560 Speaker 1: She was incredibly gracious about the whole thing. Meanwhile, about 848 00:45:13,560 --> 00:45:17,920 Speaker 1: a week after our conversation, Twitter CEO Jack Dorsey announced 849 00:45:17,960 --> 00:45:22,359 Speaker 1: that Twitter was banning all paid political ads globally. Facebook, though, 850 00:45:22,480 --> 00:45:25,799 Speaker 1: is still sticking with its policy, at least for now. 851 00:45:26,719 --> 00:45:29,839 Speaker 1: Thanks so much for listening, everyone. If a weekly podcast 852 00:45:30,160 --> 00:45:33,200 Speaker 1: isn't enough of me, you can follow me on 853 00:45:33,320 --> 00:45:37,719 Speaker 1: social media: Facebook, Instagram, and Twitter. And if you feel 854 00:45:37,719 --> 00:45:41,160 Speaker 1: like you're drowning in a sea of news and information, 855 00:45:41,560 --> 00:45:44,839 Speaker 1: sign up for my morning newsletter, Wake Up Call, at 856 00:45:44,920 --> 00:45:49,720 Speaker 1: Katie Curic dot com, because, as they say, the best 857 00:45:49,920 --> 00:45:55,960 Speaker 1: part of waking up is Katie in your inbox. Sorry, Folgers, 858 00:45:56,280 --> 00:46:00,400 Speaker 1: that was pretty bad, wasn't it? Thanks again for listening, everyone, 859 00:46:00,440 --> 00:46:02,640 Speaker 1: and I can't wait to be in your ear again 860 00:46:02,960 --> 00:46:12,560 Speaker 1: next week. Next Question with Katie Curic is a production 861 00:46:12,600 --> 00:46:15,560 Speaker 1: of I Heart Radio and Katie Curic Media. The executive 862 00:46:15,600 --> 00:46:18,920 Speaker 1: producers are Katie Curic, Lauren Bright Pacheco, Julie Douglas, and 863 00:46:18,960 --> 00:46:23,040 Speaker 1: Tyler Klang. Our show producers are Bethan Macalooso and Courtney Litz. 864 00:46:23,560 --> 00:46:27,160 Speaker 1: The supervising producer is Dylan Fagan. Associate producers are Emily 865 00:46:27,200 --> 00:46:31,560 Speaker 1: Pinto and Derek Clemens. Editing is by Dylan Fagan, Derek Clemens, 866 00:46:31,640 --> 00:46:35,640 Speaker 1: and Lowell Brolante. Our researcher is Barbara Keene. For more 867 00:46:35,640 --> 00:46:38,479 Speaker 1: information on today's episode, go to Katie Curic dot com 868 00:46:38,520 --> 00:46:41,239 Speaker 1: and follow us on Twitter and Instagram at Katie Curic. 869 00:46:47,520 --> 00:46:49,840 Speaker 1: For more podcasts from I Heart Radio, visit the I 870 00:46:49,920 --> 00:46:53,000 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 871 00:46:53,040 --> 00:46:53,880 Speaker 1: your favorite shows.