1 00:00:00,520 --> 00:00:05,600 Speaker 1: Alrighty, and this is the Daily... This is The Daily Aus. Oh, 2 00:00:05,800 --> 00:00:16,160 Speaker 1: now it makes sense. Good morning, and welcome to The 3 00:00:16,239 --> 00:00:19,120 Speaker 1: Daily Aus. It's Thursday, the ninth of January. I'm Zara, 4 00:00:19,320 --> 00:00:23,840 Speaker 1: I'm Sam. Yesterday, Meta founder Mark Zuckerberg announced major changes 5 00:00:23,880 --> 00:00:27,680 Speaker 1: to the company's social media moderation policies. In a video 6 00:00:27,720 --> 00:00:31,040 Speaker 1: posted to socials, Zuckerberg said Meta would no longer use 7 00:00:31,080 --> 00:00:34,320 Speaker 1: independent fact checkers in a move to restore what he 8 00:00:34,400 --> 00:00:38,240 Speaker 1: called free expression for its users, and also to reduce 9 00:00:38,240 --> 00:00:41,360 Speaker 1: the number of mistakes he believed had been made. It's 10 00:00:41,479 --> 00:00:44,839 Speaker 1: a significant departure for the social media behemoth and has 11 00:00:44,920 --> 00:00:48,240 Speaker 1: drawn praise from X owner Elon Musk and incoming US 12 00:00:48,320 --> 00:00:49,600 Speaker 1: President Donald Trump. 13 00:00:53,280 --> 00:00:56,360 Speaker 2: It's crazy to think that there was a time without Meta. 14 00:00:56,520 --> 00:00:59,280 Speaker 2: I mean, yeah, it's pretty much one of the companies 15 00:00:59,280 --> 00:01:00,800 Speaker 2: that shapes the twenty-first century. 16 00:01:00,880 --> 00:01:02,960 Speaker 1: Pervasive in every part of my life. 17 00:01:02,720 --> 00:01:04,520 Speaker 2: And it's continuing to change. 18 00:01:04,560 --> 00:01:07,520 Speaker 1: I met my partner on Meta, and my business exists on Meta. 19 00:01:07,640 --> 00:01:09,119 Speaker 1: One hundred percent hit rate for Meta. 20 00:01:09,200 --> 00:01:11,160 Speaker 2: It's in every part of our lives. And that's why 21 00:01:11,200 --> 00:01:13,640 Speaker 2: these changes are so important to talk about. Before we 22 00:01:13,680 --> 00:01:16,000 Speaker 2: get to the changes themselves, what was the state of 23 00:01:16,040 --> 00:01:17,440 Speaker 2: play before this announcement? 24 00:01:17,800 --> 00:01:21,640 Speaker 1: Yeah. So since twenty sixteen, Meta has used what it 25 00:01:21,680 --> 00:01:27,680 Speaker 1: called third-party moderators to essentially identify false claims, misinformation, disinformation, 26 00:01:28,200 --> 00:01:32,240 Speaker 1: and hoaxes on all of its platforms, so across Facebook, Instagram, 27 00:01:32,280 --> 00:01:36,319 Speaker 1: and WhatsApp. In his video, Zuckerberg said that after Donald 28 00:01:36,360 --> 00:01:39,360 Speaker 1: Trump won the election back in twenty sixteen, so not 29 00:01:39,400 --> 00:01:42,399 Speaker 1: this most recent time, but back then, he said that 30 00:01:42,480 --> 00:01:46,200 Speaker 1: legacy media wrote nonstop about how misinformation was a threat 31 00:01:46,200 --> 00:01:50,279 Speaker 1: to democracy, and Zuckerberg said that Meta, as a result 32 00:01:50,360 --> 00:01:52,160 Speaker 1: of all of that discourse and, as a result of, 33 00:01:52,200 --> 00:01:55,160 Speaker 1: I guess, that concern, tried in 34 00:01:55,280 --> 00:01:58,800 Speaker 1: good faith to address those concerns.
And that's why these 35 00:01:58,800 --> 00:02:01,640 Speaker 1: fact checkers were introduced in the first place, almost as 36 00:02:01,680 --> 00:02:04,520 Speaker 1: a response to the election of Donald Trump, which 37 00:02:04,600 --> 00:02:06,840 Speaker 1: is an interesting point that we'll get to a bit later. 38 00:02:07,400 --> 00:02:09,440 Speaker 1: And so the way that it has worked up until 39 00:02:09,440 --> 00:02:12,880 Speaker 1: now is that these moderation teams, and there are about ninety 40 00:02:13,000 --> 00:02:16,480 Speaker 1: organizations across dozens of countries who work as part of 41 00:02:16,520 --> 00:02:20,079 Speaker 1: these teams. What they do is look very closely at 42 00:02:20,080 --> 00:02:24,079 Speaker 1: content that either they or other users have identified as 43 00:02:24,120 --> 00:02:27,560 Speaker 1: being false or misleading. So you, as a user, can 44 00:02:27,600 --> 00:02:30,480 Speaker 1: identify something, can click and say this is hate speech 45 00:02:30,560 --> 00:02:33,160 Speaker 1: or this is misinformation or whatever it is. That's then 46 00:02:33,200 --> 00:02:38,239 Speaker 1: reviewed by these people. The moderators then rate the accuracy 47 00:02:38,320 --> 00:02:42,400 Speaker 1: of the post based on their factual review, and Meta 48 00:02:42,520 --> 00:02:44,720 Speaker 1: can then add a warning label and the fact checker's 49 00:02:44,800 --> 00:02:47,800 Speaker 1: rating to the post and then reduce its distribution, 50 00:02:47,960 --> 00:02:51,840 Speaker 1: so affecting how that post can be seen across the algorithm. 51 00:02:52,080 --> 00:02:54,480 Speaker 2: And that's where you also see those kind of lighter 52 00:02:54,520 --> 00:02:57,760 Speaker 2: boxes underneath the post that say this might not be true. 53 00:02:57,800 --> 00:03:00,760 Speaker 2: Here's a link to a government website. Remember, particularly in 54 00:03:00,800 --> 00:03:03,120 Speaker 2: the pandemic, there were a lot of those underneath posts 55 00:03:03,160 --> 00:03:05,840 Speaker 2: about vaccines or about the spread of the virus. So 56 00:03:05,919 --> 00:03:08,640 Speaker 2: that's the way it's worked until now. Talk me through 57 00:03:08,720 --> 00:03:13,320 Speaker 2: what Mark Zuckerberg announced via a short video yesterday and 58 00:03:14,000 --> 00:03:16,520 Speaker 2: what his thinking was behind this one eighty. 59 00:03:16,800 --> 00:03:20,040 Speaker 1: I mean, to the thinking point, it is fairly rare 60 00:03:20,120 --> 00:03:22,960 Speaker 1: that Mark Zuckerberg takes us through his thinking on something. 61 00:03:23,000 --> 00:03:25,679 Speaker 1: There are a lot of high-profile people who work 62 00:03:25,720 --> 00:03:28,639 Speaker 1: at Meta, so when it comes to Instagram, Adam Mosseri is 63 00:03:28,680 --> 00:03:30,600 Speaker 1: the head of Instagram and he's announced a lot of 64 00:03:30,600 --> 00:03:33,720 Speaker 1: the changes. But this came clearly from the top, and 65 00:03:34,160 --> 00:03:37,000 Speaker 1: it makes sense. It's a huge change. So essentially, the 66 00:03:37,040 --> 00:03:39,840 Speaker 1: biggest change among them is that those fact checkers that 67 00:03:39,880 --> 00:03:41,640 Speaker 1: I was just talking about, the ones who are reviewing 68 00:03:41,680 --> 00:03:42,520 Speaker 1: the content. 69 00:03:42,240 --> 00:03:44,200 Speaker 2: The ninety organizations? 70 00:03:44,120 --> 00:03:47,720 Speaker 1: Yes, they're no longer. Gone, done, goodbye.
And to add some color as 71 00:03:47,880 --> 00:03:52,160 Speaker 1: to why overnight we have seen the exit of all 72 00:03:52,240 --> 00:03:56,840 Speaker 1: fact checking on these platforms, Zuckerberg described the system as complex, 73 00:03:57,160 --> 00:04:00,360 Speaker 1: and he said that it was creating too many mistakes. He said 74 00:04:00,440 --> 00:04:03,880 Speaker 1: even if they accidentally censor just one percent of posts, 75 00:04:03,960 --> 00:04:06,720 Speaker 1: that is millions of people. Now I'm going to play 76 00:04:06,720 --> 00:04:08,800 Speaker 1: you a short clip that explains a bit more about 77 00:04:08,840 --> 00:04:10,640 Speaker 1: the thinking behind the decision. 78 00:04:11,200 --> 00:04:14,880 Speaker 3: There's been widespread debate about potential harms from online content. 79 00:04:15,480 --> 00:04:18,880 Speaker 3: Governments and legacy media have pushed to censor more and more. 80 00:04:19,160 --> 00:04:22,039 Speaker 3: So we built a lot of complex systems to moderate content. 81 00:04:22,320 --> 00:04:25,000 Speaker 3: But the problem with complex systems is they make mistakes. 82 00:04:25,200 --> 00:04:28,200 Speaker 3: The fact checkers have just been too politically biased and 83 00:04:28,240 --> 00:04:31,320 Speaker 3: have destroyed more trust than they've created, especially in the US. 84 00:04:31,720 --> 00:04:33,599 Speaker 3: So over the next couple of months, we're going to 85 00:04:33,600 --> 00:04:36,560 Speaker 3: phase in a more comprehensive community notes system. 86 00:04:37,200 --> 00:04:39,839 Speaker 2: So let's play this out, right. We're talking about a 87 00:04:39,920 --> 00:04:44,279 Speaker 2: post being reported as mis- or disinformation, this third-party fact 88 00:04:44,320 --> 00:04:47,280 Speaker 2: checking service looking at it and deciding it was, in fact, 89 00:04:47,480 --> 00:04:49,680 Speaker 2: then limiting the reach of that. And in Mark 90 00:04:49,760 --> 00:04:52,960 Speaker 2: Zuckerberg's mind, that is then limiting the reach of that post, 91 00:04:53,040 --> 00:04:55,279 Speaker 2: where, if the fact checkers got that wrong, it actually 92 00:04:55,279 --> 00:04:57,680 Speaker 2: shouldn't have been limited in the first place. And I 93 00:04:57,680 --> 00:05:00,719 Speaker 2: guess that's why that concern is around censorship, right? 94 00:05:00,720 --> 00:05:03,000 Speaker 1: Yeah, well, I mean, yeah, censorship is really at the 95 00:05:03,040 --> 00:05:07,840 Speaker 1: heart of the reasoning behind removing these fact checkers. And 96 00:05:08,360 --> 00:05:11,480 Speaker 1: I guess the way that Mark Zuckerberg sees it is 97 00:05:11,480 --> 00:05:15,479 Speaker 1: that the people, the fact checkers, who are deciding whether 98 00:05:15,560 --> 00:05:18,440 Speaker 1: or not something is mis- or disinformation, he is claiming 99 00:05:18,480 --> 00:05:21,360 Speaker 1: that they are becoming the arbiters of truth, and that 100 00:05:21,480 --> 00:05:23,960 Speaker 1: Meta itself never wanted to be the arbiter of truth. 101 00:05:24,040 --> 00:05:26,279 Speaker 1: It wanted to be a social media platform, or Facebook 102 00:05:26,279 --> 00:05:29,400 Speaker 1: at least wanted to be a social media platform where 103 00:05:29,560 --> 00:05:33,640 Speaker 1: lots of different ideas and social connections could be fostered.
104 00:05:34,000 --> 00:05:36,880 Speaker 1: And what he's saying is that it has now gone 105 00:05:37,000 --> 00:05:40,200 Speaker 1: too far and it's now impinging on freedom of speech 106 00:05:40,200 --> 00:05:43,080 Speaker 1: and the principles that underpin that. And to that end, 107 00:05:43,240 --> 00:05:45,200 Speaker 1: Meta has also said it's getting rid of a number 108 00:05:45,200 --> 00:05:49,279 Speaker 1: of restrictions on topics, so things like immigration, gender identity, 109 00:05:49,279 --> 00:05:54,120 Speaker 1: and gender that have been the subject of frequent political discourse. 110 00:05:54,560 --> 00:05:57,040 Speaker 1: A statement from Meta said it's not right that things 111 00:05:57,080 --> 00:05:58,800 Speaker 1: can be said on TV or on the floor of 112 00:05:58,839 --> 00:06:00,279 Speaker 1: Congress but not on Meta. 113 00:06:00,880 --> 00:06:03,040 Speaker 2: So essentially, Zara, does that mean that people can now 114 00:06:03,080 --> 00:06:06,120 Speaker 2: post anything they want on Meta platforms? 115 00:06:06,440 --> 00:06:09,159 Speaker 1: It's certainly going to be easier to post what you 116 00:06:09,400 --> 00:06:11,719 Speaker 1: like and what you think, but it's not going to 117 00:06:11,720 --> 00:06:14,360 Speaker 1: be a free-for-all. Instead, the onus is going 118 00:06:14,400 --> 00:06:18,679 Speaker 1: to shift from fact checkers to the users themselves. Now, 119 00:06:19,040 --> 00:06:21,839 Speaker 1: this idea is modeled on the way that X, so 120 00:06:22,320 --> 00:06:24,680 Speaker 1: Twitter, that's owned by Elon Musk, the way that that 121 00:06:24,839 --> 00:06:28,479 Speaker 1: platform has treated mis- and disinformation, and that's been through 122 00:06:28,480 --> 00:06:32,679 Speaker 1: the introduction of something called community notes. Essentially, community notes, 123 00:06:33,000 --> 00:06:35,080 Speaker 1: if you've been on X recently, it's that little thing 124 00:06:35,080 --> 00:06:37,480 Speaker 1: that comes up at the end of a tweet and 125 00:06:37,520 --> 00:06:41,240 Speaker 1: it says, community note: so-and-so, XYZ is wrong. 126 00:06:41,480 --> 00:06:44,839 Speaker 1: And the way that this is done is that X 127 00:06:44,960 --> 00:06:50,160 Speaker 1: users basically fact check each other. Essentially, if two users, 128 00:06:50,760 --> 00:06:52,719 Speaker 1: so me and you, okay, let's 129 00:06:52,800 --> 00:06:55,560 Speaker 1: use "you and I", so you and I have 130 00:06:55,800 --> 00:06:59,159 Speaker 1: previously disagreed on whether the sky is blue. 131 00:06:59,320 --> 00:07:00,760 Speaker 2: I say it's green, you say it's blue. 132 00:07:00,960 --> 00:07:05,119 Speaker 1: Yeah. And then there's another fact that's posted about whether 133 00:07:05,400 --> 00:07:09,080 Speaker 1: or not Anthony Albanese is the Prime Minister, and you 134 00:07:09,120 --> 00:07:12,080 Speaker 1: and I agree on that. That is then going to 135 00:07:12,120 --> 00:07:15,480 Speaker 1: be published as a community note because previously we've disagreed, 136 00:07:15,600 --> 00:07:18,360 Speaker 1: and now we agree on the same thing. And that is 137 00:07:18,400 --> 00:07:20,840 Speaker 1: the way, at least, that X... I know it's super confusing, 138 00:07:20,880 --> 00:07:22,960 Speaker 1: but that's the way that X has said it's 139 00:07:23,000 --> 00:07:27,640 Speaker 1: prioritizing diverse voices and diverse perspectives.
The idea is that if 140 00:07:27,680 --> 00:07:30,040 Speaker 1: you and I agree on every single thing and we 141 00:07:30,120 --> 00:07:33,360 Speaker 1: have the same political positions and therefore critique something else 142 00:07:33,520 --> 00:07:36,880 Speaker 1: as mis- or disinformation, you're ending up in the same situation. 143 00:07:37,320 --> 00:07:39,320 Speaker 1: So that's the way X does it. I hope that 144 00:07:39,320 --> 00:07:42,720 Speaker 1: that makes sense. I'm not on the back end of X, 145 00:07:42,800 --> 00:07:45,360 Speaker 1: so look, I can't actually tell you the technology that 146 00:07:45,400 --> 00:07:47,960 Speaker 1: goes behind it, but essentially what they're trying to say 147 00:07:48,040 --> 00:07:50,720 Speaker 1: is that that same idea is going to be brought 148 00:07:50,720 --> 00:07:52,000 Speaker 1: in on Meta platforms. 149 00:07:52,040 --> 00:07:56,320 Speaker 2: Okay, so basically, via a lot of complex algorithms, it's 150 00:07:56,320 --> 00:07:59,440 Speaker 2: on users now, not on these third-party 151 00:07:59,520 --> 00:08:04,280 Speaker 2: fact-checking services, to identify whether something on Meta platforms 152 00:08:04,440 --> 00:08:06,320 Speaker 2: is factually wrong, correct? 153 00:08:06,600 --> 00:08:09,680 Speaker 1: And that's the way it's been done on X. And 154 00:08:09,760 --> 00:08:13,400 Speaker 1: you know, X has changed. It's completely transformed from the 155 00:08:13,400 --> 00:08:15,000 Speaker 1: app it once was, so I think that we can 156 00:08:15,080 --> 00:08:19,680 Speaker 1: expect to see the Meta platforms also change quite significantly 157 00:08:19,880 --> 00:08:20,760 Speaker 1: under this announcement. 158 00:08:20,960 --> 00:08:23,520 Speaker 2: And Zara, we can't ignore the wider context that's happening 159 00:08:23,560 --> 00:08:26,680 Speaker 2: around us, which is the election of Donald Trump for 160 00:08:26,720 --> 00:08:28,880 Speaker 2: a second term. He's being sworn in on the twentieth. 161 00:08:29,520 --> 00:08:32,720 Speaker 2: Meta isn't the only company in the world readying itself 162 00:08:32,800 --> 00:08:35,480 Speaker 2: for a Trump presidency. Can you talk to me about 163 00:08:35,520 --> 00:08:38,000 Speaker 2: how Donald Trump fits into all of this? 164 00:08:38,480 --> 00:08:43,839 Speaker 1: Yeah, look, it's an absolutely fascinating one, because Mark Zuckerberg 165 00:08:44,000 --> 00:08:48,240 Speaker 1: explicitly said in his video that he was going to 166 00:08:48,600 --> 00:08:51,800 Speaker 1: work with President Trump to push back on governments around 167 00:08:51,800 --> 00:08:54,560 Speaker 1: the world pushing to censor more. So there was a 168 00:08:54,800 --> 00:08:58,520 Speaker 1: very clear nod to the incoming president-elect Donald Trump there, 169 00:08:58,880 --> 00:09:01,480 Speaker 1: and the fact that, you know, he is looking to 170 00:09:01,559 --> 00:09:03,280 Speaker 1: work with him and be on the same side as Trump when 171 00:09:03,280 --> 00:09:08,040 Speaker 1: it comes to censorship, I guess.
But what's 172 00:09:08,080 --> 00:09:10,680 Speaker 1: interesting is that the two absolutely have not always seen 173 00:09:10,800 --> 00:09:14,640 Speaker 1: eye to eye. You know, Trump previously threatened to jail Zuckerberg 174 00:09:14,720 --> 00:09:17,920 Speaker 1: if he did anything, quote, illegal during the twenty twenty 175 00:09:17,960 --> 00:09:21,800 Speaker 1: four election, and at a press conference after the announcement, 176 00:09:21,920 --> 00:09:25,800 Speaker 1: Trump said that this threat was probably responsible for Meta's decision. 177 00:09:26,440 --> 00:09:28,800 Speaker 1: We can't test the validity of that. We are not inside, 178 00:09:28,960 --> 00:09:31,840 Speaker 1: you know, the Meta decision-making rooms. But I do 179 00:09:31,880 --> 00:09:34,960 Speaker 1: think that it's important to, I guess, highlight some of 180 00:09:35,040 --> 00:09:38,040 Speaker 1: the interesting changes that have happened at Meta recently that, 181 00:09:38,120 --> 00:09:40,760 Speaker 1: as you say, do demonstrate the fact that it is 182 00:09:40,800 --> 00:09:44,280 Speaker 1: readying itself for a Trump presidency. So two things that 183 00:09:44,480 --> 00:09:47,160 Speaker 1: were of note. The first is that Dana White, who 184 00:09:47,280 --> 00:09:50,760 Speaker 1: is the president of the UFC, he was this week appointed 185 00:09:50,800 --> 00:09:51,880 Speaker 1: to Meta's board. 186 00:09:51,840 --> 00:09:54,880 Speaker 2: And he's very close to the Trump camp, right? 187 00:09:54,760 --> 00:09:57,120 Speaker 1: Yeah, yeah. He is a full Trump loyalist, to the 188 00:09:57,200 --> 00:10:00,520 Speaker 1: extent that when Donald Trump won the presidency back in November, 189 00:10:00,600 --> 00:10:03,200 Speaker 1: he was on stage on the night of the election, 190 00:10:03,440 --> 00:10:07,640 Speaker 1: so very close to the president-elect, and that means something. 191 00:10:07,760 --> 00:10:11,240 Speaker 1: He wrote this kind of explanation of being appointed to 192 00:10:11,280 --> 00:10:13,400 Speaker 1: the board and he said, I don't do boards, but 193 00:10:13,559 --> 00:10:15,960 Speaker 1: I want to do this board. So a really interesting 194 00:10:16,000 --> 00:10:18,960 Speaker 1: thing there. The second is that the head of Global Affairs, 195 00:10:19,800 --> 00:10:23,000 Speaker 1: so global affairs is essentially the way that a company 196 00:10:23,720 --> 00:10:27,360 Speaker 1: connects and engages with government. That person at Meta was 197 00:10:27,360 --> 00:10:29,959 Speaker 1: replaced this week and the new head, his name is 198 00:10:30,000 --> 00:10:33,440 Speaker 1: Joel Kaplan, has close ties to Trump and is also 199 00:10:33,520 --> 00:10:36,720 Speaker 1: understood to have played a significant role in the announcement yesterday. 200 00:10:36,960 --> 00:10:39,839 Speaker 2: It's really interesting how Meta is getting ready for this 201 00:10:40,000 --> 00:10:42,440 Speaker 2: Trump presidency. They're not the only company that's doing so, 202 00:10:42,520 --> 00:10:44,920 Speaker 2: as I said, and these decisions are going to impact 203 00:10:44,960 --> 00:10:47,160 Speaker 2: millions of people, not just in America but all the 204 00:10:47,160 --> 00:10:49,920 Speaker 2: way around the world. What has the response been to 205 00:10:50,000 --> 00:10:51,240 Speaker 2: this announcement? 206 00:10:50,960 --> 00:10:53,920 Speaker 1: Well, I thought I'd focus on the response from the 207 00:10:54,240 --> 00:10:56,960 Speaker 1: fact checking groups.
You know, I said that there were 208 00:10:56,960 --> 00:10:59,920 Speaker 1: ninety... yeah, there were ninety organizations who were involved 209 00:10:59,920 --> 00:11:03,199 Speaker 1: in fact checking, who overnight must have lost huge contracts 210 00:11:03,679 --> 00:11:07,600 Speaker 1: and also, you know, believe truly in their role and 211 00:11:07,640 --> 00:11:10,880 Speaker 1: the value that they can bring. So one of the groups, 212 00:11:10,880 --> 00:11:13,720 Speaker 1: the Poynter Institute, said in a statement that facts are 213 00:11:13,800 --> 00:11:17,400 Speaker 1: not censorship. They said it's time to quit invoking inflammatory 214 00:11:17,400 --> 00:11:20,480 Speaker 1: and false language in describing the role of journalists and 215 00:11:20,520 --> 00:11:24,480 Speaker 1: fact checking, not mincing words there. FactCheck.org, which 216 00:11:24,480 --> 00:11:26,960 Speaker 1: has also been used by Meta, said that the community notes 217 00:11:27,000 --> 00:11:29,040 Speaker 1: model will mean that you'll have to do more work 218 00:11:29,080 --> 00:11:32,160 Speaker 1: on your own when you see questionable posts. I think 219 00:11:32,200 --> 00:11:34,320 Speaker 1: that that's an interesting note to end on, which is, 220 00:11:34,360 --> 00:11:37,280 Speaker 1: you know, how much critical analysis, how much literacy are 221 00:11:37,280 --> 00:11:41,240 Speaker 1: we bringing to the online world and to our experience 222 00:11:41,280 --> 00:11:43,760 Speaker 1: of the online world, because the Meta and the Twitter 223 00:11:43,840 --> 00:11:47,800 Speaker 1: of twenty fourteen are very different today, and you know, 224 00:11:47,840 --> 00:11:50,840 Speaker 1: here in Australia we're heading into an election campaign where 225 00:11:50,840 --> 00:11:53,480 Speaker 1: we know young voters are going to be met online 226 00:11:53,640 --> 00:11:57,240 Speaker 1: and especially on socials and Meta platforms. So understanding how 227 00:11:57,280 --> 00:12:00,199 Speaker 1: this might affect us down the road is definitely something to 228 00:12:00,280 --> 00:12:00,760 Speaker 1: keep an eye on. 229 00:12:00,880 --> 00:12:02,840 Speaker 2: Well, the onus is kind of now more on us 230 00:12:02,920 --> 00:12:05,760 Speaker 2: than it was before. Yep. Zara, thank you. That's a very 231 00:12:05,800 --> 00:12:09,360 Speaker 2: interesting and slightly complex discussion on how fact checking works 232 00:12:09,360 --> 00:12:11,880 Speaker 2: and is evolving into the future. We'll be back again 233 00:12:11,960 --> 00:12:15,000 Speaker 2: with the headlines this afternoon, but until then, have a 234 00:12:15,000 --> 00:12:19,880 Speaker 2: fantastic day. We'll chat to you later. My name is 235 00:12:19,880 --> 00:12:23,360 Speaker 2: Lily Maddon and I'm a proud Arrernte, Bundjalung, Kalkadoon woman 236 00:12:23,440 --> 00:12:27,720 Speaker 2: from Gadigal Country. The Daily Aus acknowledges that this podcast 237 00:12:27,880 --> 00:12:30,160 Speaker 2: is recorded on the lands of the Gadigal people and 238 00:12:30,200 --> 00:12:33,720 Speaker 2: pays respect to all Aboriginal and Torres Strait Islander nations. 239 00:12:34,040 --> 00:12:36,960 Speaker 2: We pay our respects to the First Peoples of these countries, 240 00:12:37,080 --> 00:12:38,280 Speaker 2: both past and present.