Speaker 1: Good morning and welcome to The Daily Aus. It's Monday, the 17th of May. My name is Zara Seidler, and helping me make sense of today's news is Sam Koslowski. Making news today: protests across Australia, repatriation flights in India, some good news about restrictions easing in Sydney, and a chat about a really important phone call over the weekend that talked through the idea of an ethical algorithm.

Speaker 2: Here's today's Daily Digest.

Speaker 1: Thousands of people in cities across Australia have gathered for pro-Palestinian rallies and vigils amid worsening violence in Israel, Gaza and the West Bank. Gatherings were held in Sydney, Melbourne, Adelaide and Canberra on Saturday, with one protester in Sydney saying, "The situation is very bad. We have seen the devastating bombings that have been happening, which made us come here." One woman in Sydney was issued a move-on direction after walking in the streets with a burning Israeli flag.

Speaker 2: The first repatriation flight, one of the flights enabled by the Australian government to help Australians stuck overseas return home, landed in Darwin on Saturday with 80 of its 150 scheduled passengers. Forty-two passengers tested positive during their mandatory three-day hotel stay in Delhi, and another 31 close contacts were barred from flying under the Australian government rules that ban any positive cases and their close contacts from boarding those repatriation flights.

Speaker 1: As India struggles with its second deadly wave of COVID-19, misinformation and medical quackery are on the rise, according to new reports. Practices including cow dung baths and mass anti-COVID steam inhalation events have reportedly been growing as India's healthcare system continues to struggle. Indian scientists have criticized the Indian government's role in the rise of medical quackery after the government last year called for further research into the medicinal properties of milk, dung and urine belonging to indigenous Indian cows.
Speaker 2: And in today's good news, restrictions will be lifted in New South Wales from 12:01am this morning after the state recorded zero locally acquired COVID-19 cases, despite a couple from Sydney's eastern suburbs both returning positive tests on May 5. There's now no longer a 20-person limit on guests allowed to visit private households, and vertical consumption (drinking while standing up) is back on the cards. You can also sing and dance indoors, and masks are no longer required on public transport.

Speaker 1: So, you'll be forgiven for having missed this story, because there's a lot going on in the world at the moment, but over the weekend there was something called the Christchurch Call, which was a call with world leaders and big tech to discuss how to curb extremism on social media. One of the key areas explored during that call was the role of the algorithm in perpetuating or stopping extremism online. So before Sam dives into what actually happened at that meeting, let's just recap what we know about the algorithm. The algorithm, which exists on every social media platform, is the code that determines which posts appear in someone's feed and in what order they show up. As an example, when Instagram first launched, posts appeared in the feed based on the time they were posted, so the feed was chronological. Anyone with Instagram now knows that that is not what happens: in 2016, Instagram introduced a new algorithm whereby users were shown posts ordered by what the app thought was relevant to them. Keep this in mind when Sam talks us through what was actually discussed at this meeting and what role the algorithm can play in stopping violent extremism.

Speaker 2: And when we put a post up on Instagram about defining the algorithm, it was so interesting to see how many questions it posed for readers who hadn't actually thought about the internal mechanics of their social media feeds before.
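To make Zara's recap concrete, here's a minimal sketch of the difference between a chronological feed and a relevance-ranked one. The Post fields, the 0.7/0.3 weights and the like-count normalisation are all invented for illustration; real platforms use far more elaborate machine-learned scoring.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    posted_at: datetime
    likes: int
    viewer_affinity: float  # how much this viewer engages with this author, 0..1

def chronological_feed(posts):
    """Pre-2016 Instagram: newest posts first, nothing else considered."""
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def relevance_feed(posts):
    """Post-2016 style: order by whatever the app scores as most relevant."""
    def score(p):
        # Hypothetical blend of personal affinity and overall popularity.
        return 0.7 * p.viewer_affinity + 0.3 * min(p.likes / 1000, 1.0)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("friend", datetime(2021, 5, 17, 8, 0), likes=12, viewer_affinity=0.9),
    Post("brand", datetime(2021, 5, 17, 9, 0), likes=800, viewer_affinity=0.1),
]
print([p.author for p in chronological_feed(feed)])  # ['brand', 'friend']
print([p.author for p in relevance_feed(feed)])      # ['friend', 'brand']
```

Same input, two different feeds: the only thing that changed is the objective the sort optimises, and that objective is exactly the lever the Christchurch Call is interested in.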
Speaker 2: This was the key topic of conversation for Jacinda Ardern as she led a conference call with world leaders. It was co-chaired by French President Emmanuel Macron and, as Zara said, the virtual call included world leaders, tech company executives, and members of affected communities, including Kiwi Muslim leaders. Some big names were US Secretary of State Antony Blinken and Canadian Prime Minister Justin Trudeau. If we take a step back, the key problem being tackled actually stems from the Royal Commission into the Christchurch mosque attacks. One of the key findings of that Royal Commission was that the Australian man who has since been jailed for life without parole was radicalized on YouTube and in other online spaces. If you want some further listening, the New York Times podcast Rabbit Hole draws out this process of YouTube radicalization, and it shows how something as simple as the recommended video to watch next has a lot of control over shaping somebody's ideology. I totally get that this is a tricky one for social media companies because, of course, they want to provide you with content that you will enjoy, and the best way to tell what content you're going to enjoy is to look at the content you're consuming now. That's why if you watch one clip of AFL football highlights, they'll give you another clip of AFL football highlights. The problem, of course, is that extremist content gets backed up by more extremist content. The tech platforms have ultimately come around to understanding that a change needs to be made to the way these processes are enabled on social media, with YouTube's chief executive saying on Twitter over the weekend that her company was continuing to strengthen policies, improve transparency, and restrict borderline content, which is content on the border of being illegal or not.
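Sam's AFL example is essentially a content-based recommender feeding on its own output. Here's a deliberately naive toy sketch of that loop, with invented videos and literal topic-counting standing in for the learned similarity models real platforms use:

```python
from collections import Counter

def recommend_next(watch_history, catalog, k=3):
    """Rank unseen videos by how often their topic appears in the
    viewer's history: 'more of what you just watched'."""
    topic_counts = Counter(video["topic"] for video in watch_history)
    unseen = [v for v in catalog if v not in watch_history]
    return sorted(unseen, key=lambda v: topic_counts[v["topic"]], reverse=True)[:k]

history = [{"title": "AFL highlights round 9", "topic": "afl"}]
catalog = [
    {"title": "AFL highlights round 10", "topic": "afl"},
    {"title": "Cooking with lentils", "topic": "food"},
]
print(recommend_next(history, catalog)[0]["title"])  # AFL highlights round 10
```

Each recommendation the viewer accepts reinforces the counts that produced it; swap "afl" for an extremist topic and the same loop becomes the radicalization pathway Rabbit Hole describes.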
Speaker 2: A big move for the Christchurch Call was having the US in the room. They'd held out for about two years on joining such initiatives, and experts believe that under a new president they are now more open to holding big tech accountable. At the moment, artificial intelligence and algorithms are used on social media not only to give you content that the platforms think you're going to enjoy, but also to identify risks. It's actually the same system that will prevent a post from going up for breaching community standards and that will serve you up your next piece of content. The aim of the Christchurch Call was really to marry up those two ideas in a more meaningful way: if social media platforms do have the capacity to stop violent content from being shown on your Instagram feed, how can they harness that technical capacity to make sure that radicalization on the platform is much harder to fall into? As an added side note, another thing achieved on the call over the weekend was a protocol which can intervene to stop the live streaming of attacks similar to the one that happened in Christchurch in 2019.

Speaker 1: So, aside from being a bit of a tongue twister, I'm not entirely clear on what ethical algorithms are or how they can actually manifest in reality. Is this just a pipe dream, or is it something that could actually be implemented by big tech?

Speaker 2: It can totally be implemented. It just needs a realignment of the driving aims behind why the AI is there: it needs to no longer be there just to drive engagement, but to make sure that that engagement is safe. Facebook, Google and Twitter have all said that they're open to making their algorithms more ethical, and I don't think anybody doubts that they have the technology that would enable them to do it. It's just about whether they actually want to.
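In pseudocode terms, the realignment Sam is describing might look something like the sketch below. The two models, the threshold and the weight are hypothetical placeholders, not any platform's actual system; the point is simply that safety enters the same scoring function that drives engagement.

```python
def rank_score(post, engagement_model, safety_model, safety_weight=2.0):
    """Score a post for feed ranking, penalised by predicted harm."""
    engagement = engagement_model(post)  # e.g. predicted watch time, 0..1
    harm = safety_model(post)            # e.g. predicted policy risk, 0..1
    if harm > 0.9:
        return None  # clearly violating: block it outright, never rank it
    # Borderline content stays eligible but is pushed down the feed.
    return engagement - safety_weight * harm

# Toy usage with stand-in models:
engagement = lambda post: 0.8                        # both posts equally engaging
safety = lambda post: 0.1 if post == "benign" else 0.6
print(rank_score("benign", engagement, safety))      # ~0.6, ranks normally
print(rank_score("borderline", engagement, safety))  # ~-0.4, buried far down
```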
Speaker 2: Another important point to remember is that these ethical algorithm issues pop up all over the place. So, for example, there are massive issues in the United States with AI programs being used to approve credit cards, with a racial bias identified across the country in the approval process, which is the fault of the AI, not the fault of any human making the decisions. Steps are now being taken to rectify that process, but it's a clear indication that if we aren't in touch with how the AI technology works, it can take approval processes down a rabbit hole we didn't intend to set it on. The other thing I just wanted to mention in defense of big tech is that this call was not the beginning of the process for them. YouTube especially has been under pressure for some time, and it proudly reported to the Call a 70 percent drop in watch time for videos deemed borderline since it implemented changes to the algorithm about a year ago.

Speaker 1: So from here, we know that there are twelve months for the Christchurch Call committee to come up with an action plan and next steps.

Speaker 2: What that looks like remains to be seen.

Speaker 1: But it is hoped that this is the beginning of some meaningful action to combat violent extremism online and, specifically, the proliferation of that content online. That's all we have time for today, but in the meantime, follow the day's news on Instagram.

Speaker 2: At The Daily Aus.

Speaker 1: Over one hundred thousand Australians get their news there every day, and we'd love you to become part of the community.