Speaker 1: Good morning and welcome to The Daily Aus. It's Thursday, the tenth of June. My name is Zara Seidler, and helping me make sense of today's news is Sam Koslowski.
Speaker 1: Making news today, Victoria to exit lockdown.
Speaker 2: An incident in France.
Speaker 1: Some good news from across the Ditch.
Speaker 2: And we're going to have a chat about the Instagram algorithm. Here is today's Daily Digest.
Speaker 1: There was some much-awaited good news for Victorians yesterday when the government announced that they would be easing restrictions from tonight. So if you're in metropolitan Melbourne, as of eleven fifty-nine tonight you'll be able to leave home for any reason. Travel limits will be increased to twenty-five kilometers. All students will be able to return to classrooms from tomorrow, and while there will be no visitors in the home, outdoor gatherings will be increased to a maximum of ten people. In other COVID news, yesterday Queensland recorded one locally acquired case, a positive case that had traveled from Melbourne to the Sunshine Coast. We'll keep you updated on any further news concerning this case.
Speaker 2: Over on Instagram, we've chatted to you a little bit this week about the case of former Special Air Service Regiment Corporal Ben Roberts-Smith, and if you need some context, head over to our Insta. But there's actually a separate lawsuit currently being heard between him and his ex-wife, and the court has heard evidence of a personal relationship between Roberts-Smith and his lawyer. Roberts-Smith is suing ex-wife Emma Roberts for providing parties in his defamation proceedings with his private emails without his permission. In court yesterday, the judge raised concerns over Roberts-Smith's sworn evidence, which was witnessed by the lawyer in question.
Speaker 1: French President Emmanuel Macron has been slapped by a member of the public during a visit to the southeast of France on Wednesday. Macron was greeting onlookers when he was slapped across his left cheek by a man in the crowd. Security quickly intervened, tackling the man while pulling the president away. Two twenty-eight-year-old men, including the man who slapped Macron and another man, were placed in police custody. The president was uninjured.
Speaker 2: And today's good news comes from over the Ditch. New Zealand has reached one hundred days without community transmission of COVID-19. The one-hundred-day period dates from Sunday, the twenty-eighth of February, and the country is well on its way to beating the previous milestone of one hundred and two days that was set in August last year.
Speaker 2: So one of the most common questions we get asked at The Daily Aus is actually about our Instagram page itself, and the questions range from why don't I see every post that The Daily Aus does, to why do I see The Daily Aus every day when I don't see other content. A couple of weeks ago, we chatted to you guys about the idea of an ethical social media algorithm, when New Zealand Prime Minister Jacinda Ardern was holding talks in relation to the Christchurch shooting in twenty nineteen. We know you guys care a lot about how social media works, and so we could not ignore quite a significant development that came through late yesterday.
In a lengthy blog post, the head of Instagram, Adam Mosseri, laid out how the Instagram algorithm works, and it's one of the first times we've heard directly from the platform as to the mechanics of social media. For me, the most interesting takeaway from the blog post, and we'll throw the link in our show notes, was a really clear four-part answer as to how posts are ordered on your feed. To break it down, the four factors that go into what is served up to you when you open Instagram are: A, information about the post, so how well the post is performing on social media already, whether it's getting a high level of engagement in the seconds, minutes, and hours it's been online; B, information about the person who actually posted the post, whether that person is known for having engaging posts and whether you're known for engaging with them; C, your activity on the app itself, and that could be how likely you are to spend a longer time on a post or engage with likes and follows; and D, your history of engaging with that person.
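As a rough illustration only, the four signals described above could be blended into a single ranking score, something like the weighted sum sketched below. The signal names, weights, and scoring function are entirely hypothetical and are not taken from Mosseri's post; they just show how several independent signals can combine into one feed ordering.

```python
# Hypothetical sketch of blending several ranking signals into one feed order.
# None of these names or weights come from Instagram; they are illustrative only.
from dataclasses import dataclass


@dataclass
class Post:
    caption: str
    engagement: float       # (A) how well the post is already performing, 0..1
    author_quality: float   # (B) how engaging this author's posts tend to be, 0..1
    viewer_activity: float  # (C) how actively this viewer uses the app, 0..1
    affinity: float         # (D) this viewer's history with this author, 0..1


def score(post: Post) -> float:
    # Arbitrary weights: each signal contributes to a single ranking score.
    return (0.35 * post.engagement
            + 0.20 * post.author_quality
            + 0.15 * post.viewer_activity
            + 0.30 * post.affinity)


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts appear first in the feed.
    return sorted(posts, key=score, reverse=True)
```

A different surface (feed, explore, Reels) could simply use a different set of weights over the same signals, which is one way to picture "many algorithms" rather than one.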
Speaker 1: So what, out of all of that, is new information to us?
Speaker 2: That there is an algorithm is relatively new information. And in fact, Adam Mosseri said...
Speaker 1: Or was it just previously unconfirmed?
Speaker 2: Previously very unconfirmed. And Adam Mosseri has actually clarified that there is not an algorithm; there are many algorithms, and the algorithms combine to provide you with the best user experience the app can create. He also clarified, around this idea of an algorithm, that the feed page, the explore page, and the Reels page all have different algorithms. So that's why we'll see some content do better on a story than it will on a post.
Speaker 1: That's really good, when we had just got our heads around one of those algorithms.
Speaker 2: There could be hundreds, and I'm not sure we'll ever understand the details of how these algorithms work. But now at least we can stop referring to it as the algorithm and start referring to it as the algorithms.
Speaker 1: So while we spend a lot of time thinking about the algorithm, trying to beat the algorithm, trying to play the algorithm, one of the other things that you and I, Sam especially, have come up against is this idea of shadow banning. So does Instagram actually talk about that openly in the statement? And first, can you define shadow banning?
Speaker 2: So shadow banning is when a post that relates to a particular event, person, word, or idea is restricted in its reach and prevented from being exposed to its full audience. So some users will report that they'll do a post about Black Lives Matter and it won't get the same reach as a post about their favorite cup of coffee for the day. And so the theory online is that posts about particular racial issues, or posts around the conflict in the Middle East last month, are restricted by the platform, and that obviously ruffles feathers when you have heated political debate. So Instagram took the time to clarify some notions around shadow banning.
They notably admitted that they hadn't done enough work in this space to verify what happens when comments are reported or when particular hashtags attract reports, and I think that was quite validating for a lot of the social media community. They very much alluded to more information coming out about shadow bans and how they work in the future, but what they said for now was that it's basically a product of the amount of reporting that happens within the platform. Because of the number of reports they get each day, they often struggle to define what actually needs to be restricted to keep users safe and what perhaps is healthy political discourse.
Speaker 1: I'll be honest, we have been speaking about this for a number of minutes now and it is still as clear as mud, but it's good to see that there are some transparency measures being implemented by these tech giants. We will continue to try to understand the behemoth that is...
Speaker 2: The algorithm. The algorithms.
Speaker 1: The algorithms. And we'll be sure to keep you along for the journey.
That's all we have time for today, but in the meantime, please follow the day's news on Instagram at The Daily Aus. It's where over one hundred and fifteen thousand young people get their news every day, and we would love you to become part of our community.