Speaker 1: Hi everyone, I'm Katie Couric, and this is Next Question. In September of twenty twenty one, The Wall Street Journal began to roll out a series of eleven stories that would have a major impact on the way people think about technology companies. The series was called The Facebook Files, and the source for the series, maybe one of the biggest journalistic treasure troves of the century, was a former Facebook product manager named Frances Haugen. Frances had only been at Facebook for a couple of years, but during that time she became increasingly alarmed by a disturbing pattern.

Speaker 2: The only data that we get out of these companies is how many users do they have, how much time do they spend, how many ads do they look at, what's the revenue? You don't get the societal costs that come as a consequence.

Speaker 1: It seemed that Facebook was prioritizing their own profits over public safety and putting people's lives at risk. So she blew the whistle. She made tens of thousands of pages of internal documents available to The Wall Street Journal. And what happened next, testifying before the US Congress, the UK and EU parliaments, and filing a complaint with the SEC, exposed the lengths the tech company would go to to mislead the public and grow its bottom line. Now, Frances has written a book. It's called The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. And I'm very glad that it's brought her to Next Question. First of all, Frances, it's great to finally meet you. I am a huge admirer of yours.

Speaker 2: So excited to be here. Thank you so much for inviting me.

Speaker 1: Of course. It's been quite a ride for you since you became the Facebook whistleblower in October of twenty twenty one. Now, a year and a half or so later, how are you feeling? Are you happy that you came forward? Do you have any regrets? Would you do it all over again?
Speaker 2: You know, when I originally came forward, I had very, very basic goals, like I wanted to not have to carry a secret that I thought had the potential to really impact the lives of others. I came forward because I was concerned about how Facebook was operating in African countries and in Southeast Asia, and I then, and still, genuinely believe that if we continue to operate the way we do, there are millions of lives on the line from things like ethnic violence. But the world has changed a lot since I came out. I was really shocked last week when the Surgeon General issued his advisory on social media and mental health for kids. I've been amazed at how just knowing that these companies knew these harms were real, across a wide variety of harms, has really galvanized the activist community. It's caused legislative conversations around the world that were pretty stalled for a long time. And so if I could do it again, I would totally do it again. I've been incredibly fortunate in how smoothly it's gone, and it's exceeded my wildest expectations.

Speaker 1: You've written a book about your experiences. Why did you want to take pen to paper, fingers to laptop, and share your story with the world?

Speaker 2: One of the things that kind of baffles tech journalists when I talk to them, because tech journalists live in a little bit of an echo chamber. You know, like our classic criticism of tech is that tech lives in an echo chamber. Tech journalists also live in an echo chamber a little bit. When I say to them, you know, when I take flights... you know, I'm a friendly person. I'm one of those annoying people that talks to their seatmates.

Speaker 1: I do that too.

Speaker 2: Yeah, and it's amazing. At least half the people I sit next to have never heard of the Facebook whistleblower. And so it's one of these things where culture change, and that's the thing that we really need, we need to reset our relationship with these companies.
It takes a long time, and this book, I'm hoping, helps a much, much larger set of people, a much more diverse set of people, get a seat at the table, by kind of laying out, like, what are the conversations we need to be having, what are the choices we get to make in the next few years. Because we are in a moment of inflection, and we need to have as many informed people at the table as possible.

Speaker 1: You know, you talk about Vivek Murthy, the Surgeon General, issuing a warning about the dangers of social media for kids, other people becoming more aware. And I'm just curious if you feel this way, but for me, I'm kind of like, what took so long? This doesn't take a brain surgeon, a rocket scientist, or a tech expert to know that people have become incredibly addicted to social media. Tristan Harris was sounding the alarm after he left Google. All kinds of experts were saying, this is dangerous. Why did it take so long for this to really become headline news?

Speaker 2: One of the things I talk about in my book is, you know, what's the difference between, say, the automotive industry and social media when it comes to our ability to hold it accountable, or our ability to understand it. Back in nineteen sixty five, it's going to sound shocking, there were no seatbelts in cars, no airbags.

Speaker 1: I remember that.

Speaker 2: Yeah, like I listened to my parents tell stories about the kids all, like, tumbling all over each other in the back of the station wagon, and it's like, really, wow, a different world. We now put eight year olds in car seats, right? But the world changed very suddenly when a guy named Ralph Nader came out with a book called Unsafe at Any Speed. And what really changed was that people didn't realize that there was the ability to live in a different world.
You know, that our fatality rate today is way less per mile driven for cars, because of a long series of actions. But the thing that people need to understand is, when Ralph Nader published that book, you know, there were one hundred thousand automotive engineers in the world. When I came forward, I think there were on the order of probably three hundred or four hundred people in the world who really understood how systems like Facebook's work. And of those people, you know, we are educated in such narrow ways, I think a lot of those people didn't understand the larger societal consequences of those choices and decisions. And so Ralph Nader could have a chorus of automotive engineers all say this is happening. When it comes to social media, each of us sees a different world. You know, for many, many, many people who would be the ones asking those questions, when they open social media, they see their friends and family, who are likely relatively similar to themselves. You know, the idea that Facebook could be radically different, radically more dangerous, in a place like an African country or in Southeast Asia, it sounds foreign to us. We're like, social media is about looking at pictures of cats. And so I think that's a big part of it. Like, we need to be able to have the right to study social media. We need to have the right to be able to get independent data off these systems, because then we can have definitive conversations.

Speaker 1: Is that because we're not having a universal experience? It's a highly, deeply personalized experience for everyone. So it's not as if we're all driving cars; we're all in different vehicles, if you will. So it's not unifying people to realize that they have to demand change.

Speaker 2: It's really important for people to understand just how different those worlds are in terms of transparency. And that's part of why I wrote the book.
You know, I've gotten to live a lot of that arc of how we write software, or like, what does it mean to have experiences online? And I wanted to walk people through, you know, this is what changed from step to step to step, so that more people have that context.

Speaker 1: I'm just curious, Frances. I know you weren't super psyched to go to Facebook. When did you realize, we're not in Kansas anymore, something is awry?

Speaker 2: It was interesting. Like, when they reached out to me, I was like, the only thing I work on is misinformation. You know, it's interesting. I got there, and I think one of the first moments where I was like, wow, this is chaotic, is tied to the role I had, something called a product manager. So product managers are responsible for helping articulate what is the problem we're trying to solve, how might we solve that, and then, once we come to consensus on a solution, what's the series of engineering tasks that will allow us to execute that solution. I had the role of being a product manager. And Facebook understood that they were a different enough company, they had seen that if people came in from the outside, they didn't succeed at a very high rate, like there was a lot of churn, and so they established a boot camp for two full weeks to just give kind of a basic level of, like, here's how Facebook works. And my manager pulled me out of it after, like, three days. He was like, you know, things are on fire, like we have to come up with a plan for the next six months, even though you know nothing about the problem or what's going on, like we need you to articulate a plan now. That was kind of my first warning, where I was like, oh wow, the house is on fire. The house is on fire, and people are running around, even having the self awareness to be like, oh, we know that if people don't get at least a certain amount of bootstrapping,
Facebook is very hard to figure out, even internally, how it works. And it was interesting. I showed up for that first meeting, you know, the one that my manager was urgently pushing me to prepare for, and we spent twenty minutes basically discussing, should I have a job? So imagine you show up, you've just gotten hired for this thing, and all the leadership is saying, like, why, why do you have a team? Like, why does this team exist? Like, think about that for a moment. You know, because activists have told you, because the UN told you, hey, you and Mark, your negligence around misinformation killed twenty four thousand people. You know of a problem, and yet I could sit in a room full of the leadership of safety, having them be like, should this team exist? Right? You can imagine that first six months was a little stressful.

Speaker 1: What was Facebook doing in the lead up to the events on January sixth? We'll talk about that right after this.

Speaker 1: We're back with Frances Haugen, Facebook whistleblower and author of The Power of One. Frances, did Facebook really care about misinformation, or did the company just feel like dealing with it was an exercise in futility?

Speaker 2: I think part of the problem was that Facebook had taken the most obvious path to deal with misinformation, which was, let's hire experts, let's hire journalists to help us assess what's true and false. But that kind of approach, you know, fixing safety after the fact, like, oh, we've already hyper-amplified, you know, extreme content, now we're going to pluck out the dangerous parts, those strategies don't scale. You know, Facebook has three billion users. There were maybe a few thousand fact checks being done a month, maybe a few thousand. You can see why you actually need to take a different kind of approach, which is coming in and saying, why are the algorithms rewarding extreme content? How do we change our operations to deal with that fact?
Speaker 1: As they say, lies make it around the world before the truth has the chance to tie its shoes. And so what was the alternative? What would that approach have been? I mean, what is the answer?

Speaker 2: Let's take, for example, something as simple as, should you have to click on a link before you reshare it? So you hit the nail on the head. One of the problems with third party fact checking is journalism takes time. At Facebook, on average, it was like two or three days for someone to write a fact check, and they put a huge amount of effort into trying to build prediction systems to guess which are the pieces of content that might go viral, because we have to give the journalists a head start. Alternatives are things like, if you require people to click on a link before they reshare it, that reduces misinformation by like ten or fifteen percent, just because people have to pause and think for a moment. You know, there's one or two billion people on Facebook who live in places where Facebook is the internet. You know, they might have become literate to use Facebook, and as a result, in those places, in some countries, thirty five percent of everything you see in your news feed is a reshare. And so Facebook wasn't willing to take the hit of, you know, zero point one, zero point two percent less profit, of reducing the amount of content that was moving through the ecosystem as a whole. So those are kinds of product design ways of dealing with misinformation.

Speaker 1: Are third party fact checkers still the primary way Facebook is ostensibly trying to combat mis- and disinformation?

Speaker 2: You know, I don't know, because we have no transparency. You know, right now, we don't have any transparency into how Facebook operates. We know that Mark fired lots and lots of safety people during his year of efficiency, so it's possible that things have changed, but given his recent behavior, it's unlikely things have materially changed.
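A minimal sketch in Python of the reshare friction Haugen describes above, gating the share action on whether the user has actually opened the link. Every name and the flow structure here are illustrative assumptions, not Facebook's real implementation.

```python
# Illustrative sketch only: a hypothetical share flow with the
# "click the link before you reshare it" friction described above.
# The point is that the share isn't blocked, it's just interrupted
# so the user pauses for a moment.

from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    link_url: str | None = None

@dataclass
class UserSession:
    user_id: str
    opened_links: set[str] = field(default_factory=set)

    def open_link(self, url: str) -> None:
        # Record that the user actually tapped through to the article.
        self.opened_links.add(url)

def attempt_reshare(session: UserSession, post: Post) -> str:
    """Return the action the UI should take when the user hits share."""
    if post.link_url and post.link_url not in session.opened_links:
        return "prompt_to_open_link_first"  # friction step, not a block
    return "reshare_now"

# The prompt fires until the user has opened the article, then sharing proceeds.
session = UserSession("u1")
post = Post("p1", link_url="https://example.com/story")
assert attempt_reshare(session, post) == "prompt_to_open_link_first"
session.open_link(post.link_url)
assert attempt_reshare(session, post) == "reshare_now"
```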
Speaker 1: Let's talk about sort of the public facing explanation from Facebook. I interviewed Sheryl Sandberg in twenty nineteen, when you were still working at Facebook. I asked her if she felt like Facebook was doing enough to invest in its security. Let's take a listen, and let's hear your reaction to what Sheryl told me.

Speaker 3: We've put tremendous engineering resources in, and we're doing things like red teams, asking what do we think the bad guys would do, and how would we do it? So we're never going to fully be ahead of everything. But if you look at, if you want to understand what companies care about, you look at where they invest their resources. And if you look back three to five years and you look at today, we've totally changed where we invest our resources. And my job has changed too. I've been at Facebook eleven and a half years. For the first eight or so, I spent most of my time growing the company and some time protecting the community. We always did some protection, but now that's definitely flipped. My job is a majority building the systems that protect and a minority grow. And so we're definitely changing as a company. We're in a different place across the board on all of these things.

Speaker 1: Do you think you're changing enough, fast enough?

Speaker 3: I hope so. We're trying.

Speaker 1: What's your reaction to that, Frances?

Speaker 2: It's so interesting. When we listen to media, you know, a film clip, an audio clip from the past, you can often hear the emotional echoes of that moment. And I think back in twenty nineteen, she's quite earnest. Like, the only part of Facebook that was growing was the Civic Integrity team. I think twenty nineteen was the year that the UN report on Myanmar came out and, you know, firmly placed blame on Facebook.
They were still living right in the immediate echoes of Cambridge Analytica. For context, right about when you interviewed her, that would have been when the FTC fined Facebook five billion dollars because of privacy violations from Cambridge Analytica. But over the course of the next two years, or even the next year, I think Facebook began to realize that having a big safety team, having people with PhDs asking questions, was putting Facebook in a quite awkward position, because the more people dug in, the more people having the ability to ask questions, they found things, and they found things that were quite disturbing. I think they did a pretty good job in the run up to the twenty twenty election, but as soon as the twenty twenty election passed, they fired that team, the Civic Integrity team.

Speaker 1: One month after the election, yeah, one month. And of course, five weeks later was when the Trump supporters, many of whom organized on Facebook, stormed the US Capitol.

Speaker 2: And I think part of what happened was, there are papers in the Facebook Files where they say, like, we saw all this building up. But I think, because now no single person was responsible, no one felt like they had the authority to go in there and intervene.

Speaker 1: Where do you draw the line? And nobody has a crystal ball to say, these people are going to do that.

Speaker 2: Facebook talks about the difference between movements and what it calls adversarial movements. So an adversarial movement knows that they're violating Facebook's policies and actively does countermeasures to try to get around Facebook. That's one way we differentiate, like, does this movement think they're doing something wrong? Right?

Speaker 1: Right.

Speaker 2: And you saw that extensively with Stop the Steal.

Speaker 1: If the Facebook Civic Integrity team had not been disbanded less than one month after the election, what would they have seen, and what would they have done with the activity they witnessed going on on the platform?
Speaker 2: In the run up to the twenty twenty election, there were a number of things in place where Facebook said, hey, we know we have vulnerability in our system. For example, live videos. This is where, you know, I can film something on my phone and Facebook will put a little announcement at the top of people's feeds. Facebook knew that live video was a particularly big vulnerability for the company, because video is harder to monitor than audio or text, for sure. So you either can deal with that after the fact, or you can say, hey, what's leading to that video going viral? And in the case of live video, Facebook said, hey, you know, every piece of content on Facebook earns a score based on how relevant it is to you, Katie, or to your listener. You know, is it similar to other things that they've seen before? Does this person generally produce content that people would like to engage with? You know, there's a bunch of factors. You earn a score, and that gives you a priority in the news feed. When it came to live video, they would give a boost. They'd say, that score, we're going to multiply it by eight hundred and fifty times, to make sure that it will show up at the top of your feed. They said, hey, we know this is dangerous, we're going to only boost it sixty five times in the run up to the election. It's a little tiny detail, but when they stormed the Capitol, the rioters actively used live video to coordinate. And so there's these little things where they could have had the safety measures on that were on election day, but because no one felt they had the authority to say, we're in a situation, no one turned those on until the day after they stormed the Capitol. These are little, tiny details, but you have to remember, when people interviewed the rioters after January sixth, they said, it seemed real. It seemed real.
It seemed like everyone was saying, like, we're about to experience a coup, like we need to go and save our democracy. These little product tweaks would have changed the information environment that those people experienced, and who knows what would have happened with January sixth.

Speaker 1: So why didn't they do it? Because the team had been dissolved?

Speaker 2: There was no longer a person in the company who wore the hat of saying, let's make sure we're a positive force in society, right? There was diffuse responsibility for little tiny slivers. And I think after they dissolved the team, which was on December second or December third, I don't think there was anyone who felt like they had the authority to say, hey, some people are going to have to work over the holidays, right? Like, this is a big enough deal that someone's going to have to do something. And I think that's why Facebook was asleep at the wheel.

Speaker 1: When we come back, Frances and I talk about improvements in social media that could have a positive impact on the teen mental health crisis. We're back with Frances Haugen. Do you think Mark Zuckerberg just cares about profit over everything? And is there something about the broader culture of Facebook that makes this almost a Sisyphean task, to try to control, or at least even monitor or remove, really dangerous content?

Speaker 2: So I'm glad that you bring up Mark. So, just so people understand how different the leadership of Facebook is versus other companies, Mark Zuckerberg holds about fifty five, fifty six percent of the voting shares that control Facebook. So that means he's the chairman of the board, he's the CEO. If he wants to invest tens of billions of dollars in the metaverse, no one can stop him, because he is the only voice that matters. I do think responsibility goes to the top, right? Like, part of the challenge here is you have a man who has been CEO since he was nineteen years old.
Facebook is intimately tied to his identity, and it's very hard for people to admit that their life's work might be hurting other people. And so, unfortunately, there is an internal culture to the company where the people who surround Mark know that being too critical isn't going to get you very far. I think that's part of why Sheryl left. From when I came out, she left maybe six months after the Facebook Files happened. I think Sheryl was a voice that was trying to push for responsibility, and there wasn't really an appetite internal to the company to do that.

Speaker 1: To this point, in twenty nineteen I spoke with her about whether Facebook's business model ultimately rendered implementing security measures bad for business. Let's hear what she said.

Speaker 3: So on this, I'm really pretty proud of our track record. If you look a number of years ago and you listen to our earnings calls, so earnings calls are exactly what people are worried about, they're directed at investors, it's our quarterly report, if you actually watch us in earnings calls, we are spending as much time talking about the measures we take on safety and security as we are about our business growth, easily. We actually said, many quarters ago, this is so important to us that we are going to make massive investments and change the profitability of our company by making real resource investments. And we have, to the tune of billions and billions of dollars, and we will keep doing it. We've taken action after action after action that is better for protecting the community than it is for our growth, and we're going to continue to do that. Mark has said it over and over again. I have said it over and over again.

Speaker 1: Do you believe that, Frances?
Speaker 2: Oh, Katie, I'm so glad you played that clip for me, because I am totally going to go get the transcripts now of the investor calls, just to see how things have changed, right? Because I think back in twenty nineteen, they were trying. Like, they got burned by Cambridge Analytica. They lost a huge amount of goodwill with regulators, with users. I don't think that sentiment she expressed is still true. One of the things that Elon Musk showed was that you could fire all your safety teams and no one batted an eye, right? Because we don't have any stats. I want to be super honest with people. Mark Zuckerberg has fired a huge number of safety people in the last six months, and the market has rewarded him. You know, their stock price is going up because Facebook looks more profitable. But he also fired their AI safety team, and then they open sourced their large language model. When people talk about existential risks from AI, allowing for mass proliferation of these technologies doesn't allow us to do thoughtful, slow, intentional development. And so I don't think what she's saying is true anymore. We're living in a very different world.

Speaker 1: In fairness to Sheryl, do you think it was true at the time?

Speaker 2: I think in twenty nineteen, they were trying hard. If Facebook had continued in the vein they were working in in twenty nineteen, I probably would have never been a whistleblower. You know, I probably would have been like many people who came before me, who kept their head down and kept trying, kept trying to make it safer, and eventually burned out, because the only part of Facebook that was growing was the safety teams in twenty nineteen. By twenty twenty, they had given up on that. You know, they'd said, we're not getting acknowledged for the effort we're putting in, and these teams are just liabilities.

Speaker 1: Let me ask you, just, what can be done? We've heard about kids and mental health.
We've heard about misinformation and the election. We've heard about so many things that are causing harms to society because of social media platforms like Facebook. Section two thirty protects these social media platforms from liability for the content they may carry. The Supreme Court just made a ruling on that, and I guess now it's up to Congress. But in the best of all possible worlds, what would you like, Frances, to be done to rein in the social media companies, if you had to wave a magic wand?

Speaker 2: So I think it's important for people to understand kind of what's the tool chest that's available to us. I think the way forward is more something like what Europe did. So Europe came in and said, hey, you need to be honest with us about the risks, the harms you know about. You need to publicly tell us how you're going to reduce those risks, and you need to get us enough data that we can see if you're making progress on those things. Because, for context, I think the fundamental problem is our relationship with these companies is skewed.

Speaker 1: And Congress doesn't seem to really understand the rudiments of the technology that powers Facebook, or to actually want to do something about it.

Speaker 2: I think the thing that's going to push Congress over the line is actually the growing crisis around teenage mental health. Historically, just for people's context, over the last sixty years we've had only a handful of Surgeon General advisories. It's things like seat belts save lives, smoking causes cancer, breastfeeding helps infants' health, things that we take for granted today. But before those advisories happened, there was ambiguity, there was controversy. Historically, after a Surgeon General advisory is issued, usually within two to three years some sort of legislative action takes place. I think it'll be really interesting to see how things play out over the next year or two, at least in the context of kids.

Speaker 1: And what can be done about that?
Tell me how to reverse or stop the negative impact that social media and things like Instagram are having on young people.

Speaker 2: So you mentioned earlier that, you know, the business model is working counter to our own well being or safety. Let's take a look at sleep deprivation in kids. So one of the things called out by the Surgeon General was that thirty percent, thirty percent, of teenagers say they use social media till midnight or later most weekdays. That's crazy. When we look at risk factors for things like multiple kinds of mental illness, that's not just depression and anxiety, it's also things like bipolar. When we look at risk factors for accidental death, both automotive and just general accidents. When we look at risk factors for substance use, uppers because they're tired, downers because they're depressed. All of those things link back to sleep deprivation. We've known for twenty years that we can influence whether or not people use products. Imagine if, for two hours before eleven, Instagram got a little bit slower and a little bit slower and a little bit slower, like you had to push the post a little harder. Maybe there was a lag on TikTok between videos. Who knows. We've known for twenty years that if you make an app a little bit slower, people use it less. Imagine as you approached your bedtime, you just got tired and went to bed. That feature is not live on Instagram today. That's a meaningful thing that would help kids go to bed.

Speaker 1: How about if parents come in and take their kids' phones?

Speaker 2: We should definitely do that, right? We ignore the fact that these technologies are extremely powerful and addictive, and they operate at a level of independence that no other consumer product does today.
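A minimal sketch in Python of the bedtime slowdown idea described above: adding a little artificial latency as a user's bedtime approaches. The two-hour window, the linear ramp, and all the names here are illustrative assumptions, not a real Instagram or TikTok feature.

```python
# Illustrative sketch only: progressively slow an app as bedtime nears,
# so usage naturally tapers off. The window, cap, and ramp are assumptions.

from datetime import datetime, time, timedelta

SLOWDOWN_WINDOW = timedelta(hours=2)   # start ramping two hours before bedtime
MAX_EXTRA_DELAY_SECONDS = 3.0          # cap on added latency per request

def extra_delay_seconds(now: datetime, bedtime: time) -> float:
    """Artificial latency to add to a request at `now`, given the user's bedtime."""
    bedtime_today = now.replace(hour=bedtime.hour, minute=bedtime.minute,
                                second=0, microsecond=0)
    remaining = bedtime_today - now
    if remaining.total_seconds() < 0 or remaining > SLOWDOWN_WINDOW:
        return 0.0  # past bedtime or outside the window: no added delay
    # Linear ramp: no delay two hours out, the full cap right at bedtime.
    progress = 1.0 - remaining / SLOWDOWN_WINDOW
    return MAX_EXTRA_DELAY_SECONDS * progress

# Example: with an 11 p.m. bedtime, a request at 10 p.m. gets half the cap (~1.5s).
print(extra_delay_seconds(datetime(2023, 6, 1, 22, 0), time(23, 0)))
```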
Speaker 1: In closing, Frances, I feel like I have to ask you about AI, which is the new boogeyman of technology, and rightfully so. It was pretty chilling when these AI leaders said that artificial intelligence poses a threat as big as pandemics and nuclear war, and it's sort of like, holy shit. And yet you wonder, since the government has been so impotent when it comes to figuring out how to regulate social media, what they're going to do about this looming threat.

Speaker 2: So I think it's always important to remember that these are percentage risks, right? So this is, you know, they say there's a one percent or two percent risk, which is terrifying, right? You know, one or two percent risks of extinction. We should take those seriously. But I think one of the things that people also need to be honest about is, we kind of let the cat out of the bag, right? I think things like Fortune five hundred companies should get together and say, hey, we will only buy generative AI products that meet this bar of safety. There's a code of practice, a code of conduct, where we're like, we're not going to let our economic might fuel development of AI unless you do it in an intentional, thoughtful, responsible way. I think that's totally a thing that should happen, one hundred percent. I think Sam Altman talks about having licenses, saying, hey, right now there is a market disincentive to be safe, you know, move fast and break things, to quote Mark Zuckerberg. The fact that Facebook fired their AI safety team, no one's punishing them for that. But when people talk about existential risk, to not have that existential risk, we have to say, no one in the world, and that includes governments and militaries, gets to have AIs more powerful than a certain level. You know, how do we have a just, more stable world? Because if we are just escalating, the path of escalation will lead to all those existential risks.
Speaker 1: Is there anything you're excited about when it comes to AI, Frances, so we don't have to end on a terrifying note?

Speaker 2: Yeah, we need to talk about short term and long term. The short term on generative AI, I think, is transformative. Right now, around the world, there are literally billions of people who don't have doctors. We're going to live in a world in the next ten years where every child in the world has a pediatrician, it just might be a robot pediatrician. We're going to live in a world in the next ten years where every child in the world is going to have the highest quality reading instruction that has ever existed for humanity. You know, an endlessly patient tutor that will sit there, over and over again, as long as that kid keeps working, and will help them learn to read. That's going to be transformative. There are high probability short term rewards that I think are almost certainly going to happen. It is going to transform the world. The thing I try to caution people on is, those existential risks are very low probability, and they're much further off, and so it is more important for us to try to build a just world where the motivations and incentives for those existential risks are as low as possible. So one of the things that I am always trying to remind people is, we have invented new communication technologies before. When we invented the printing press, suddenly a bunch of people learned to read, and people started publishing pamphlets on things like, how do you know if your neighbor's a witch? What should you do about that? And chaos ensued. We had wars that killed huge numbers of people when we invented the cheap printing press. We had wars over misinformation, things like, you know, yellow journalism. But we learned and we responded. We developed journalistic ethics. We founded journalism schools to teach those things, journalistic trade associations to help people self regulate.
We passed laws on media concentration to make sure that, you know, you got to hear from different voices. We learned about how to live in our media environment, or information environment. It feels overwhelming right now because we're the ones who are responsible for figuring out where we go from here. You know, it's about how are we going to learn, how are we going to respond, how are we going to act. And part of why I have faith that we're going to figure this out is, while it may seem impossible right now, every single time before, when we've made a new media technology, we've learned and we've responded. So I will keep on pushing, and I just have a longer time horizon, I think, than many other people do.

Speaker 1: From your lips to God's ears. Frances Haugen, thank you so much for talking with me. Your new book is called The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook. Thank you so much. Thanks for listening, everyone. If you have a question for me, or want to share your thoughts about how you navigate this crazy world, reach out. You can leave a short message at six oh nine five one two five two five oh five, or you can send me a DM on Instagram. I would love to hear from you. Next Question is a production of iHeartMedia and Katie Couric Media. The executive producers are me, Katie Couric, and Courtney Litz. Our supervising producer is Marcy Thompson. Our producers are Adriana Fazzio and Catherine Law. Our audio engineer is Matt Russell, who also composed our theme music. For more information about today's episode, or to sign up for my newsletter, Wake Up Call, go to the description in the podcast app, or visit us at katiecouric dot com. You can also find me on Instagram and all my social media channels. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.