1 00:00:05,840 --> 00:00:08,680 Speaker 1: From The Australian. Here's what's on the front. I'm Claire Harvey. 2 00:00:08,760 --> 00:00:15,720 Speaker 1: It's Wednesday, April sixteen, twenty twenty five. The Greens have 3 00:00:15,800 --> 00:00:19,480 Speaker 1: canceled plans for an Anzac Day rave after a backlash. 4 00:00:19,760 --> 00:00:22,640 Speaker 1: The party was planning to charge thirty dollars a ticket 5 00:00:22,840 --> 00:00:25,759 Speaker 1: and offered party people the chance to donate up to 6 00:00:25,800 --> 00:00:29,640 Speaker 1: one thousand dollars to help the Greens get elected. You 7 00:00:29,640 --> 00:00:31,960 Speaker 1: can read all our rolling coverage of the federal 8 00:00:32,000 --> 00:00:35,879 Speaker 1: election campaign right now at The Australian dot com dot AU. 9 00:00:39,920 --> 00:00:43,519 Speaker 1: A whistleblower has given explosive testimony to the United States 10 00:00:43,560 --> 00:00:47,560 Speaker 1: Congress about the rot inside Facebook, and The Australian's 11 00:00:47,640 --> 00:00:51,839 Speaker 1: reporting is front and center of the proceedings. Today, managing 12 00:00:51,960 --> 00:00:56,160 Speaker 1: editor Darren Davidson reflects on breaking the story that exposed 13 00:00:56,280 --> 00:00:59,200 Speaker 1: Facebook's targeting of vulnerable teens. 14 00:01:07,200 --> 00:01:11,160 Speaker 2: It's a reporter for an Australian newspaper who's got his 15 00:01:11,319 --> 00:01:14,839 Speaker 2: hands on one of the internal documents about how Facebook 16 00:01:14,959 --> 00:01:18,040 Speaker 2: actually does this, and he reaches out for a comment 17 00:01:18,120 --> 00:01:19,959 Speaker 2: from Facebook before publishing. 18 00:01:22,120 --> 00:01:25,440 Speaker 1: This is the voice of Facebook's former public policy director, 19 00:01:25,640 --> 00:01:30,000 Speaker 1: Sarah Wynn Williams.
Wynn Williams is reading an excerpt from 20 00:01:30,040 --> 00:01:33,440 Speaker 1: her new book Careless People, a story of where she 21 00:01:33,600 --> 00:01:37,360 Speaker 1: used to work. She's become the latest whistleblower to turn 22 00:01:37,400 --> 00:01:41,840 Speaker 1: on Meta, the parent company of Facebook and Instagram. The 23 00:01:41,959 --> 00:01:45,959 Speaker 1: reporter she's talking about is Darren Davidson. He's now The 24 00:01:46,000 --> 00:01:50,639 Speaker 1: Australian's managing editor and commercial director. And the newspaper, of course, 25 00:01:50,840 --> 00:01:51,880 Speaker 1: is The Australian. 26 00:01:53,680 --> 00:01:56,880 Speaker 2: That's when I hear about it. I didn't know anything 27 00:01:56,920 --> 00:02:00,200 Speaker 2: about this, and neither did the policy team in Australia. 28 00:02:01,600 --> 00:02:04,520 Speaker 1: In twenty seventeen, Darren broke a story about the way 29 00:02:04,560 --> 00:02:09,400 Speaker 1: Facebook raked in millions of advertising dollars by targeting children 30 00:02:09,600 --> 00:02:13,880 Speaker 1: when they were vulnerable, and bragged about it to advertising agencies. 31 00:02:14,440 --> 00:02:17,280 Speaker 1: It was a global bombshell. 32 00:02:17,480 --> 00:02:21,480 Speaker 3: So Facebook is under fire again, this time because of 33 00:02:21,560 --> 00:02:26,320 Speaker 3: a leaked document that could allow advertisers to target these 34 00:02:26,360 --> 00:02:28,760 Speaker 3: teens when they are at their most vulnerable. 35 00:02:29,120 --> 00:02:32,200 Speaker 4: The twenty three page report leaked to The Australian shows 36 00:02:32,200 --> 00:02:33,919 Speaker 4: how Facebook gathers data. 37 00:02:33,760 --> 00:02:37,519 Speaker 5: To determine when users are feeling stressed, anxious, and overwhelmed, 38 00:02:37,560 --> 00:02:44,080 Speaker 5: among other things.
The idea in this piece of research, 39 00:02:44,160 --> 00:02:46,919 Speaker 5: this presentation to the bank, was that they could then 40 00:02:47,000 --> 00:02:50,079 Speaker 5: target their products at those moments when people needed a 41 00:02:50,120 --> 00:02:53,520 Speaker 5: boost and therefore they would be more vulnerable, susceptible to 42 00:02:53,600 --> 00:02:56,600 Speaker 5: the advertising. This was buried at the back of a 43 00:02:56,639 --> 00:02:58,880 Speaker 5: batch of documents that I managed to get hold of and verify. 44 00:02:59,680 --> 00:03:01,359 Speaker 5: At the time, it was really, really concerning to me. 45 00:03:01,400 --> 00:03:04,000 Speaker 5: I remember saying to my contact, wow, have you seen this? 46 00:03:04,520 --> 00:03:06,359 Speaker 5: Because at the time I was more interested in a 47 00:03:06,400 --> 00:03:10,160 Speaker 5: different aspect of the Facebook story, and it struck me 48 00:03:10,200 --> 00:03:12,400 Speaker 5: as interesting that the person that handed me these 49 00:03:12,400 --> 00:03:15,120 Speaker 5: documents hadn't really paid a lot of attention to this 50 00:03:15,200 --> 00:03:18,079 Speaker 5: discovery at the back of the batch of documents, and it initially 51 00:03:18,080 --> 00:03:20,360 Speaker 5: didn't occur to them that this might be concerning or disturbing. 52 00:03:20,400 --> 00:03:21,639 Speaker 5: It was only when I kind of laid it out 53 00:03:21,639 --> 00:03:24,160 Speaker 5: and said, hey, look at what they're saying, that they 54 00:03:24,200 --> 00:03:26,880 Speaker 5: actually went, wow, actually, yeah, that is, that's pretty disturbing, 55 00:03:26,960 --> 00:03:28,440 Speaker 5: pretty concerning.
56 00:03:30,440 --> 00:03:33,280 Speaker 1: The scoop came at a time when society at large 57 00:03:33,360 --> 00:03:36,360 Speaker 1: was just beginning to grapple with a huge 58 00:03:36,520 --> 00:03:40,200 Speaker 1: shift brought about by the big tech platforms and social 59 00:03:40,240 --> 00:03:41,680 Speaker 1: media networks. 60 00:03:42,240 --> 00:03:45,040 Speaker 5: We had written a number of stories at that point. 61 00:03:45,160 --> 00:03:46,760 Speaker 5: It was a big story for us at The Australian 62 00:03:46,760 --> 00:03:49,040 Speaker 5: at the time, and it wasn't necessarily a big story 63 00:03:49,720 --> 00:03:53,200 Speaker 5: in the wider media. It wasn't as fashionable as it 64 00:03:53,240 --> 00:03:55,240 Speaker 5: is now to write about the power of the big 65 00:03:55,280 --> 00:04:00,200 Speaker 5: tech platforms. And we were very interested in this for all sorts 66 00:04:00,240 --> 00:04:05,120 Speaker 5: of reasons. The digital advertising market at that point was booming, 67 00:04:05,440 --> 00:04:08,200 Speaker 5: and it continues to be, to this day, a very strongly 68 00:04:08,280 --> 00:04:11,280 Speaker 5: growing market, but it's very opaque and there's lots of 69 00:04:11,280 --> 00:04:15,040 Speaker 5: intermediaries, middlemen. So it was of great interest to us, 70 00:04:15,040 --> 00:04:17,200 Speaker 5: and I was looking at lots of different story angles. 71 00:04:17,720 --> 00:04:20,880 Speaker 5: I had heard rumors about the way in which the 72 00:04:20,920 --> 00:04:23,760 Speaker 5: advertising business at Facebook worked. 73 00:04:24,640 --> 00:04:28,640 Speaker 1: Facebook didn't like Darren's reporting. They dispatched a team to 74 00:04:28,680 --> 00:04:31,920 Speaker 1: discredit him to other reporters who'd picked up the story. 75 00:04:32,839 --> 00:04:36,360 Speaker 5: The reaction from Facebook was staggering. They asked for more 76 00:04:36,400 --> 00:04:39,080 Speaker 5: time to prepare a statement.
They then delayed sending me 77 00:04:39,160 --> 00:04:42,240 Speaker 5: the statement. They kind of set off a bomb inside 78 00:04:42,240 --> 00:04:44,960 Speaker 5: Facebook at the time that involved all levels of the 79 00:04:45,040 --> 00:04:48,560 Speaker 5: organization at the very top of Facebook about how they 80 00:04:48,560 --> 00:04:51,120 Speaker 5: could actually initially kill the story and then contain it. 81 00:04:56,000 --> 00:04:59,840 Speaker 1: Coming up: why Sarah Wynn Williams has turned on Facebook. 82 00:05:17,480 --> 00:05:20,920 Speaker 1: Sarah Wynn Williams is a former lawyer and diplomat, and 83 00:05:21,000 --> 00:05:23,599 Speaker 1: as you heard in those excerpts from her book, she's 84 00:05:23,640 --> 00:05:26,719 Speaker 1: a Kiwi. She worked at Meta, the parent company of 85 00:05:26,760 --> 00:05:31,680 Speaker 1: social media networks and messaging platforms like Facebook, Instagram, and WhatsApp, 86 00:05:32,000 --> 00:05:36,200 Speaker 1: for seven years, starting in twenty eleven. She hasn't just 87 00:05:36,240 --> 00:05:39,480 Speaker 1: been critical of Meta and its founder Mark Zuckerberg, she's 88 00:05:39,520 --> 00:05:45,000 Speaker 1: gone absolutely nuclear. And Meta, so incensed by her claims, 89 00:05:45,320 --> 00:05:49,240 Speaker 1: has tried to stop her from promoting her book, Careless People. 90 00:05:50,040 --> 00:05:52,680 Speaker 4: While it couldn't stop its release, Meta is trying to 91 00:05:52,720 --> 00:05:55,159 Speaker 4: stop further sales of a new book written by a 92 00:05:55,240 --> 00:05:58,679 Speaker 4: former employee. The arbitrator in the case sided with Meta, 93 00:05:58,839 --> 00:06:02,120 Speaker 4: saying the company would suffer immediate and irreparable loss. He 94 00:06:02,200 --> 00:06:04,240 Speaker 4: also sided with Meta on the grounds of the non 95 00:06:04,279 --> 00:06:05,440 Speaker 4: disparagement agreement.
96 00:06:06,640 --> 00:06:09,760 Speaker 1: It worked for a while, but the book is well 97 00:06:09,760 --> 00:06:12,839 Speaker 1: and truly out there and it's landed Sarah Wynn Williams 98 00:06:12,839 --> 00:06:17,120 Speaker 1: in front of the US Congress, a Senate Judiciary Committee investigation. 99 00:06:18,560 --> 00:06:24,120 Speaker 2: One example is that Facebook was targeting thirteen to seventeen 100 00:06:24,160 --> 00:06:30,040 Speaker 2: year olds. It could identify when they were feeling worthless 101 00:06:30,760 --> 00:06:36,279 Speaker 2: or helpless, or like a failure, and they would take 102 00:06:36,279 --> 00:06:44,159 Speaker 2: that information and share it with advertisers. If a thirteen 103 00:06:44,200 --> 00:06:47,800 Speaker 2: year old girl would delete her selfie, that's a really 104 00:06:47,800 --> 00:06:50,680 Speaker 2: good time to try and sell her a beauty product. 105 00:06:52,960 --> 00:06:54,839 Speaker 5: As we know now from a book that the Facebook 106 00:06:54,839 --> 00:06:58,480 Speaker 5: whistleblower has written, Careless People, they put out misleading 107 00:06:58,520 --> 00:07:01,800 Speaker 5: statements that senior management knew were untrue, but 108 00:07:01,880 --> 00:07:05,160 Speaker 5: they still put them out at the time. And as 109 00:07:05,240 --> 00:07:08,480 Speaker 5: we've since discovered, and the whistleblower in the book Careless 110 00:07:08,520 --> 00:07:11,320 Speaker 5: People notes this, there was a case of a 111 00:07:11,400 --> 00:07:14,200 Speaker 5: teenager in the UK who took her own life and 112 00:07:14,240 --> 00:07:17,160 Speaker 5: what they noticed looking back at her social activity was 113 00:07:17,160 --> 00:07:21,280 Speaker 5: that she was following an Instagram account called feeling Worthless.
114 00:07:21,520 --> 00:07:24,320 Speaker 5: And Worthless was one of the emotional fields in that 115 00:07:24,560 --> 00:07:26,800 Speaker 5: document that was given to the media agency and the 116 00:07:26,840 --> 00:07:29,440 Speaker 5: bank that we discovered back in twenty seventeen. 117 00:07:29,880 --> 00:07:32,960 Speaker 1: There seemed to be a moment in the internal Facebook 118 00:07:32,960 --> 00:07:37,840 Speaker 1: discussions revealed by this whistleblower, Darren, where senior executives at 119 00:07:37,840 --> 00:07:40,960 Speaker 1: Facebook discussed amongst themselves, well, are we doing this? You know, 120 00:07:41,080 --> 00:07:41,800 Speaker 1: is this possible? 121 00:07:43,720 --> 00:07:47,800 Speaker 2: Joel directs that our comms should swat that down clearly, 122 00:07:48,440 --> 00:07:53,000 Speaker 2: but he's told that it's not possible. Joel's response: we 123 00:07:53,120 --> 00:07:55,960 Speaker 2: can't confirm that we don't target on the basis of 124 00:07:56,120 --> 00:08:02,320 Speaker 2: insecurity or how someone is feeling. Despite this, Elliot, Joel, and 125 00:08:02,400 --> 00:08:06,000 Speaker 2: many of Facebook's most senior executives devise a cover up. 126 00:08:06,720 --> 00:08:10,000 Speaker 2: Facebook issues a second statement that's a flat out lie. 127 00:08:11,480 --> 00:08:14,280 Speaker 1: What do you think about the way those senior executives 128 00:08:14,280 --> 00:08:17,280 Speaker 1: at Facebook handled it, in their initial reaction and then 129 00:08:17,320 --> 00:08:18,240 Speaker 1: their spin to you? 130 00:08:19,120 --> 00:08:21,000 Speaker 5: It's amazing to read some of those comments in the 131 00:08:21,000 --> 00:08:23,880 Speaker 5: book Careless People.
Some of the executives felt that they 132 00:08:23,880 --> 00:08:26,960 Speaker 5: should be proud of this technology, and they should be 133 00:08:27,240 --> 00:08:31,640 Speaker 5: out there boasting about their capabilities and the very precision 134 00:08:32,240 --> 00:08:34,760 Speaker 5: targeted way in which they can connect with their users 135 00:08:34,760 --> 00:08:37,640 Speaker 5: and people on Facebook. I think so many of them 136 00:08:37,640 --> 00:08:39,640 Speaker 5: are caught up in a bubble and a world that's 137 00:08:39,720 --> 00:08:43,040 Speaker 5: far removed from ours that they don't necessarily stop to 138 00:08:43,080 --> 00:08:46,439 Speaker 5: think about the ethical consequences of what they're doing. However, 139 00:08:46,520 --> 00:08:49,960 Speaker 5: I did note that Sarah Wynn Williams was dispatched by 140 00:08:50,000 --> 00:08:52,360 Speaker 5: Facebook to kill my story, and she talks about that 141 00:08:52,400 --> 00:08:55,440 Speaker 5: in the book. She notes that many executives in 142 00:08:55,480 --> 00:08:59,400 Speaker 5: Silicon Valley and at Facebook absolutely refuse to allow their 143 00:08:59,440 --> 00:09:02,800 Speaker 5: children to have smartphones or spend any time on screens, and 144 00:09:02,840 --> 00:09:06,000 Speaker 5: in fact, their favorite thing is these very expensive wooden toys. 145 00:09:06,200 --> 00:09:08,719 Speaker 5: There's a brand in the US, and they would rather 146 00:09:08,720 --> 00:09:10,720 Speaker 5: give their kids these wooden toys. And I think that really 147 00:09:10,800 --> 00:09:13,440 Speaker 5: says it all, that anecdote that she shares about the 148 00:09:13,480 --> 00:09:16,120 Speaker 5: behavior of the parents that work at Facebook. 149 00:09:18,960 --> 00:09:23,840 Speaker 2: These executives, they know, they know the harm that this 150 00:09:24,000 --> 00:09:29,400 Speaker 2: product does.
They don't allow their own teenagers to use 151 00:09:30,120 --> 00:09:33,640 Speaker 2: the products that Meta develops. I mean, the hypocrisy is 152 00:09:33,679 --> 00:09:34,400 Speaker 2: at every level. 153 00:09:36,280 --> 00:09:39,599 Speaker 1: Sarah Wynn Williams's testimony about young people was kind of 154 00:09:39,640 --> 00:09:42,720 Speaker 1: an aside at the Senate committee, which was mainly looking 155 00:09:42,760 --> 00:09:45,920 Speaker 1: at her revelations about Meta's relationship with China. 156 00:09:47,040 --> 00:09:50,760 Speaker 2: We are engaged in a high-stakes AI arms 157 00:09:50,840 --> 00:09:55,000 Speaker 2: race against China, and during my time at Meta, company 158 00:09:55,000 --> 00:09:58,160 Speaker 2: executives lied about what they were doing with the Chinese 159 00:09:58,200 --> 00:10:07,520 Speaker 2: Communist Party to employees, shareholders, Congress, and the American public. Mark Zuckerberg 160 00:10:07,840 --> 00:10:13,120 Speaker 2: pitched himself as a free speech champion, yet I witnessed Meta 161 00:10:13,520 --> 00:10:17,560 Speaker 2: work hand in glove with the Chinese Communist Party to 162 00:10:17,640 --> 00:10:24,520 Speaker 2: construct and test custom-built censorship tools that silenced and 163 00:10:24,679 --> 00:10:29,959 Speaker 2: censored their critics. When Beijing demanded that Facebook delete the 164 00:10:30,000 --> 00:10:34,040 Speaker 2: account of a prominent Chinese dissident living on American soil, 165 00:10:34,960 --> 00:10:39,079 Speaker 2: they did it, and then lied to Congress when asked 166 00:10:39,120 --> 00:10:41,439 Speaker 2: about the incident in a Senate hearing. 167 00:10:42,320 --> 00:10:46,839 Speaker 1: Meta rejects Sarah Wynn Williams's testimony. In a statement, Meta said, 168 00:10:46,880 --> 00:10:49,920 Speaker 1: this is all pushed by an employee terminated eight years 169 00:10:49,960 --> 00:10:53,480 Speaker 1: ago for poor performance.
We do not operate our services 170 00:10:53,480 --> 00:10:56,240 Speaker 1: in China today. It is no secret we were once 171 00:10:56,280 --> 00:10:59,280 Speaker 1: interested in doing so as part of Facebook's effort to 172 00:10:59,320 --> 00:11:02,480 Speaker 1: connect the world. We ultimately opted not to go 173 00:11:02,559 --> 00:11:06,640 Speaker 1: through with the ideas we'd explored, which Mark Zuckerberg announced 174 00:11:06,679 --> 00:11:11,680 Speaker 1: in twenty nineteen. Now, of course, the Australian government has 175 00:11:12,360 --> 00:11:15,520 Speaker 1: banned social media for people under sixteen. That's yet to 176 00:11:15,520 --> 00:11:19,320 Speaker 1: come into effect, but will do soon. We talk now 177 00:11:19,360 --> 00:11:22,880 Speaker 1: about the concept of a social license. In the years 178 00:11:22,920 --> 00:11:25,880 Speaker 1: since you broke that story originally, do you think Facebook 179 00:11:25,880 --> 00:11:28,679 Speaker 1: has done anything to earn that social license? 180 00:11:29,000 --> 00:11:30,679 Speaker 5: I think, you know, the thing about Facebook is, 181 00:11:31,080 --> 00:11:34,880 Speaker 5: we all know, it's an aggressive commercial operation. It 182 00:11:35,080 --> 00:11:38,240 Speaker 5: ruthlessly executes on its technology to do that. I was 183 00:11:38,240 --> 00:11:40,600 Speaker 5: looking at Facebook's ad revenue the other day; annualized, it's 184 00:11:40,600 --> 00:11:43,320 Speaker 5: more than one hundred and sixty billion now. Whereas 185 00:11:43,920 --> 00:11:47,200 Speaker 5: the advertising market in Australia alone, the whole country, 186 00:11:47,200 --> 00:11:50,240 Speaker 5: the ad revenue, the pool of money, is about ten billion. 187 00:11:50,800 --> 00:11:54,040 Speaker 5: Facebook alone makes one hundred and sixty billion.
So 188 00:11:54,920 --> 00:11:57,280 Speaker 5: unless there's pressure from the government and the eSafety 189 00:11:57,280 --> 00:12:00,640 Speaker 5: Commissioner and stuff like that to do these tools, they don't really bother, 190 00:12:00,720 --> 00:12:02,240 Speaker 5: and they certainly weren't doing it back then. 191 00:12:09,920 --> 00:12:13,520 Speaker 1: Darren Davidson is The Australian's managing editor and commercial director. 192 00:12:14,000 --> 00:12:17,000 Speaker 1: You can read his reflections on this story right now 193 00:12:17,040 --> 00:12:19,280 Speaker 1: at The Australian dot com dot au.