1 00:00:00,520 --> 00:00:03,240 Speaker 1: Alrighty, and this is the Daily Aus. 2 00:00:03,400 --> 00:00:06,840 Speaker 2: This is the Daily Aus. Oh, now it makes sense. 3 00:00:14,720 --> 00:00:17,800 Speaker 3: Good morning and welcome to the Daily Aus. It's Thursday, 4 00:00:17,840 --> 00:00:20,480 Speaker 3: the thirty first of July. I'm Emma Gillespie. 5 00:00:20,560 --> 00:00:21,720 Speaker 2: I'm Billy FitzSimons. 6 00:00:22,000 --> 00:00:26,119 Speaker 3: The Federal government has confirmed that YouTube will be included 7 00:00:26,239 --> 00:00:29,880 Speaker 3: in its social media ban for children under sixteen. The 8 00:00:29,960 --> 00:00:33,680 Speaker 3: video streaming platform was originally going to be exempt from 9 00:00:33,680 --> 00:00:37,920 Speaker 3: the ban, but following mounting pressure from Australia's eSafety Commissioner, 10 00:00:38,040 --> 00:00:42,040 Speaker 3: the government has changed its mind. The move's drawn criticism 11 00:00:42,159 --> 00:00:46,480 Speaker 3: from the opposition, while YouTube's parent company Google is reportedly 12 00:00:46,560 --> 00:00:50,199 Speaker 3: considering legal action against the government over the decision. 13 00:00:50,600 --> 00:00:52,960 Speaker 2: We'll take you through the latest on the social media 14 00:00:53,000 --> 00:00:57,320 Speaker 2: ban and discuss whether Google's threat of legal action could succeed. 15 00:00:57,600 --> 00:01:04,560 Speaker 2: Right after a quick message from our sponsor. So, Em, 16 00:01:04,640 --> 00:01:08,800 Speaker 2: we're talking today about this social media ban for under sixteens. Now, 17 00:01:08,880 --> 00:01:12,440 Speaker 2: this is legislation that has actually already been tabled in 18 00:01:12,480 --> 00:01:15,400 Speaker 2: Parliament and, as I understand it, has already gone through.
19 00:01:15,880 --> 00:01:18,400 Speaker 2: For anyone who missed this, because I think it happened 20 00:01:18,520 --> 00:01:20,640 Speaker 2: at the very end of last year, do you want 21 00:01:20,680 --> 00:01:22,319 Speaker 2: to just explain the context of this? 22 00:01:22,720 --> 00:01:25,640 Speaker 3: Yes. So you might remember last year it was November 23 00:01:25,720 --> 00:01:28,680 Speaker 3: when the government introduced its bill to ban under sixteen 24 00:01:28,720 --> 00:01:31,920 Speaker 3: year olds from social media. Now, that followed this kind 25 00:01:31,959 --> 00:01:36,039 Speaker 3: of intense public campaign that gained a lot of media attention, 26 00:01:36,200 --> 00:01:38,800 Speaker 3: a lot of celebrity endorsements, a lot of parents and 27 00:01:38,840 --> 00:01:42,280 Speaker 3: schools were on board, essentially saying that the government needed 28 00:01:42,319 --> 00:01:45,440 Speaker 3: to do a better job of protecting young people's mental 29 00:01:45,440 --> 00:01:49,000 Speaker 3: health in the online space. So, off the back of 30 00:01:49,040 --> 00:01:52,400 Speaker 3: that campaign, Australia became the first country in the world 31 00:01:52,720 --> 00:01:56,680 Speaker 3: to legislate an age-specific social media ban. So under 32 00:01:56,720 --> 00:02:03,480 Speaker 3: the legislation, platforms including Snapchat, Instagram, X, TikTok, Reddit, Facebook, 33 00:02:03,840 --> 00:02:07,280 Speaker 3: they all have to block any user under sixteen from 34 00:02:07,320 --> 00:02:11,079 Speaker 3: having an account on their platform, and the responsibility basically 35 00:02:11,280 --> 00:02:13,840 Speaker 3: will fall on the platforms themselves. So it'll be up 36 00:02:13,840 --> 00:02:17,720 Speaker 3: to social media companies to enforce this ban, and failure 37 00:02:17,800 --> 00:02:20,920 Speaker 3: to comply will result in fines of up to fifty million dollars.
38 00:02:21,280 --> 00:02:24,640 Speaker 2: Okay. So, the Australian government introduces this ban for social 39 00:02:24,680 --> 00:02:28,919 Speaker 2: media for under sixteen year olds, but YouTube is exempt 40 00:02:28,960 --> 00:02:32,480 Speaker 2: from the ban. What was the government's reasoning at the 41 00:02:32,560 --> 00:02:35,079 Speaker 2: time as to why YouTube was exempt? 42 00:02:35,240 --> 00:02:38,119 Speaker 3: So initially YouTube was not included in this legislation, as 43 00:02:38,120 --> 00:02:42,480 Speaker 3: you mentioned, Billy, predominantly because the government said it's used 44 00:02:42,600 --> 00:02:46,120 Speaker 3: for learning. YouTube is used as an educational resource. So 45 00:02:46,160 --> 00:02:51,160 Speaker 3: in a speech last year, then Communications Minister Michelle Rowland said YouTube, 46 00:02:51,200 --> 00:02:54,200 Speaker 3: as well as some other platforms like messaging platform WhatsApp, 47 00:02:54,200 --> 00:02:57,440 Speaker 3: which is also exempt, quote, enable young people to get 48 00:02:57,440 --> 00:03:01,160 Speaker 3: the education and health support they needed. So the logic 49 00:03:01,280 --> 00:03:06,320 Speaker 3: was basically that, unlike Instagram, TikTok and Facebook, YouTube 50 00:03:06,360 --> 00:03:10,079 Speaker 3: serves as more of an educational tool, so it's used 51 00:03:10,120 --> 00:03:16,600 Speaker 3: by teachers and students for tutorials, videos, documentaries, historical docos, 52 00:03:16,680 --> 00:03:21,399 Speaker 3: and scientific experiments, all these kinds of resources, rather than 53 00:03:21,560 --> 00:03:26,040 Speaker 3: being used as a social networking platform designed around user 54 00:03:26,080 --> 00:03:29,400 Speaker 3: generated content and social interaction.
So there is a social 55 00:03:29,520 --> 00:03:32,520 Speaker 3: element to YouTube, in the comments for example, but the 56 00:03:32,560 --> 00:03:34,560 Speaker 3: government basically said a lot of people come to the 57 00:03:34,560 --> 00:03:38,360 Speaker 3: platform for a use that is completely different. YouTube also 58 00:03:38,480 --> 00:03:42,080 Speaker 3: already requires children under thirteen to provide a parent or 59 00:03:42,080 --> 00:03:45,400 Speaker 3: guardian's contact details when they create an account, and they 60 00:03:45,440 --> 00:03:47,640 Speaker 3: also have YouTube Kids, which you might have heard of. 61 00:03:47,760 --> 00:03:52,360 Speaker 3: It's a separate platform designed specifically for really young users, 62 00:03:52,440 --> 00:03:55,240 Speaker 3: and there are more content restrictions on YouTube Kids. 63 00:03:55,480 --> 00:03:59,280 Speaker 2: It's funny how different platforms are used by different people 64 00:03:59,280 --> 00:04:01,320 Speaker 2: for different things. Yeah, I have to say I do 65 00:04:01,440 --> 00:04:04,760 Speaker 2: not use YouTube for educational purposes, or if I do, 66 00:04:05,000 --> 00:04:07,920 Speaker 2: the extent of that was makeup tutorials when I was 67 00:04:07,960 --> 00:04:08,640 Speaker 2: growing up. 68 00:04:08,520 --> 00:04:11,840 Speaker 3: But you could argue that's kind of an educational resource. 69 00:04:11,880 --> 00:04:13,000 Speaker 3: It's still a tutorial. 70 00:04:13,160 --> 00:04:16,159 Speaker 2: Yeah, that and comedy sketches. So perhaps I wasn't using it 71 00:04:16,200 --> 00:04:18,800 Speaker 2: for the best thing, but hey, it was the early days, 72 00:04:18,920 --> 00:04:22,640 Speaker 2: it was. And so now the government has changed its course. 73 00:04:22,880 --> 00:04:24,560 Speaker 2: What prompted this shift?
74 00:04:25,080 --> 00:04:27,680 Speaker 3: So there were a few key factors that seemed to 75 00:04:27,720 --> 00:04:30,880 Speaker 3: have influenced this decision to include YouTube in the ban. 76 00:04:31,400 --> 00:04:33,600 Speaker 3: And when it was first confirmed last year that YouTube 77 00:04:33,600 --> 00:04:36,919 Speaker 3: would be exempt, you can imagine that this drew a 78 00:04:36,960 --> 00:04:40,919 Speaker 3: lot of criticism from other social media companies. They argued 79 00:04:40,960 --> 00:04:42,960 Speaker 3: that it wasn't fair to have a kind of separate 80 00:04:43,000 --> 00:04:45,320 Speaker 3: set of rules for one and not the others. And 81 00:04:45,360 --> 00:04:48,440 Speaker 3: this year we've really seen a concerted effort from the 82 00:04:48,440 --> 00:04:52,400 Speaker 3: eSafety Commissioner. So this is Australia's top online safety advisor. 83 00:04:52,520 --> 00:04:56,320 Speaker 3: The Commissioner's name is Julie Inman Grant and in June, 84 00:04:56,360 --> 00:05:00,880 Speaker 3: so last month, she issued official advice to the federal government, 85 00:05:01,160 --> 00:05:05,360 Speaker 3: urging it to reconsider its stance specifically around YouTube. So 86 00:05:05,440 --> 00:05:08,400 Speaker 3: she pointed out that you can still access YouTube without 87 00:05:08,440 --> 00:05:12,200 Speaker 3: an account, but argued that the platform has evolved beyond 88 00:05:12,600 --> 00:05:16,240 Speaker 3: just educational content and that young people are still at 89 00:05:16,320 --> 00:05:18,680 Speaker 3: risk if they hold an account and if they are 90 00:05:18,720 --> 00:05:21,919 Speaker 3: engaging with the platform.
So eSafety recommended there be 91 00:05:22,040 --> 00:05:26,279 Speaker 3: no exemptions for specific platforms under the legislation because quote 92 00:05:26,360 --> 00:05:29,359 Speaker 3: the relative risks and harms can change at any given 93 00:05:29,400 --> 00:05:32,679 Speaker 3: moment across different types of social media. In an address 94 00:05:32,720 --> 00:05:35,800 Speaker 3: to the National Press Club, Julie Inman Grant said that recent 95 00:05:35,880 --> 00:05:40,080 Speaker 3: eSafety research had shown four in ten children reported 96 00:05:40,120 --> 00:05:43,520 Speaker 3: being exposed to harmful content on YouTube, which was the 97 00:05:43,600 --> 00:05:47,520 Speaker 3: highest of any platform. So that research really drove this 98 00:05:47,560 --> 00:05:50,680 Speaker 3: push by eSafety to get YouTube included in the ban. 99 00:05:50,600 --> 00:05:53,920 Speaker 2: Okay, so Australia's eSafety Commissioner said that in 100 00:05:54,040 --> 00:05:58,440 Speaker 2: June. At the time, how did YouTube respond to those claims? 101 00:05:58,800 --> 00:06:03,800 Speaker 3: So YouTube has rejected those claims roundly. Its public policy 102 00:06:03,880 --> 00:06:08,280 Speaker 3: senior manager Rachel Lord criticized eSafety's advice, saying the 103 00:06:08,360 --> 00:06:12,600 Speaker 3: video sharing platform is widely used in classroom teaching, and 104 00:06:12,640 --> 00:06:15,960 Speaker 3: in a statement, Lord said banning YouTube would ignore evidence 105 00:06:16,000 --> 00:06:20,160 Speaker 3: from teachers and parents that YouTube is suitable for younger users. 106 00:06:20,760 --> 00:06:23,160 Speaker 3: So they really leaned on that kind of community messaging 107 00:06:23,240 --> 00:06:26,080 Speaker 3: that this is not the same type of social media 108 00:06:26,200 --> 00:06:28,920 Speaker 3: as your kind of Meta-owned platforms and the others. 109 00:06:29,160 --> 00:06:29,479 Speaker 2: Got it.
110 00:06:29,839 --> 00:06:33,120 Speaker 2: Did the other social media platforms that were included in 111 00:06:33,200 --> 00:06:36,120 Speaker 2: the ban have anything to say about YouTube's exemption? 112 00:06:36,400 --> 00:06:39,479 Speaker 3: Yes. So the criticism has broadly been that if the 113 00:06:39,520 --> 00:06:43,520 Speaker 3: government was serious about protecting children from online harms, then 114 00:06:43,640 --> 00:06:48,440 Speaker 3: the world's largest video platform shouldn't get special treatment. So TikTok, 115 00:06:48,560 --> 00:06:52,360 Speaker 3: for example, its policy director in Australia, Ella Woods-Joyce, 116 00:06:52,640 --> 00:06:56,440 Speaker 3: compared YouTube's exemption to quote banning the sale of soft 117 00:06:56,520 --> 00:06:59,520 Speaker 3: drinks to minors but exempting Coca-Cola. 118 00:07:00,640 --> 00:07:05,440 Speaker 2: Yeah, we love an analogy. Sure do. Okay, so clearly 119 00:07:05,440 --> 00:07:09,000 Speaker 2: there was mounting pressure from the big social media platforms 120 00:07:09,040 --> 00:07:12,160 Speaker 2: and also the eSafety Commissioner to ensure that there 121 00:07:12,160 --> 00:07:16,040 Speaker 2: are no exemptions, and that pressure seemed to have worked, 122 00:07:16,120 --> 00:07:19,560 Speaker 2: because this week the government announced that YouTube would no 123 00:07:19,640 --> 00:07:22,760 Speaker 2: longer be exempt and now under sixteen year olds will 124 00:07:22,800 --> 00:07:26,480 Speaker 2: be banned from using that platform. How did this announcement 125 00:07:26,720 --> 00:07:27,960 Speaker 2: unfold this week? 126 00:07:28,480 --> 00:07:31,760 Speaker 3: So after the eSafety Commissioner wrote to the government 127 00:07:31,920 --> 00:07:35,920 Speaker 3: in June with that advice, we heard from the government.
128 00:07:36,120 --> 00:07:40,160 Speaker 3: It said that basically the Communications Minister Anika Wells was 129 00:07:40,240 --> 00:07:44,040 Speaker 3: carefully considering that advice, and so that's where things stood 130 00:07:44,040 --> 00:07:46,880 Speaker 3: for a few weeks until Tuesday night, when the 131 00:07:46,920 --> 00:07:50,160 Speaker 3: government announced YouTube would be subject to the same age 132 00:07:50,200 --> 00:07:53,880 Speaker 3: restrictions as other social media platforms. We got a statement 133 00:07:53,920 --> 00:07:56,920 Speaker 3: from Prime Minister Anthony Albanese's office which said the decision 134 00:07:56,960 --> 00:08:00,960 Speaker 3: followed extensive consultation and was informed by advice 135 00:08:01,120 --> 00:08:04,840 Speaker 3: from the eSafety Commissioner. The statement said online gaming, 136 00:08:05,000 --> 00:08:09,080 Speaker 3: messaging apps, health and education services, though, would not be 137 00:08:09,160 --> 00:08:12,520 Speaker 3: included in the ban, quote, because they pose fewer social 138 00:08:12,560 --> 00:08:17,080 Speaker 3: media harms to under sixteens or are regulated under different laws. 139 00:08:17,480 --> 00:08:20,800 Speaker 3: So that includes WhatsApp for example. WhatsApp will not be 140 00:08:20,840 --> 00:08:23,840 Speaker 3: included or affected by the ban, but the change does 141 00:08:23,960 --> 00:08:27,600 Speaker 3: mean that from December, YouTube will be subject to the 142 00:08:27,640 --> 00:08:31,920 Speaker 3: same minimum age laws that platforms like TikTok, Instagram and 143 00:08:32,040 --> 00:08:34,760 Speaker 3: Snapchat will have to adhere to. And at a press 144 00:08:34,760 --> 00:08:38,400 Speaker 3: conference on Wednesday, the Communications Minister Anika Wells said the 145 00:08:38,440 --> 00:08:42,360 Speaker 3: evidence of harms to young users on YouTube cannot be ignored.
146 00:08:42,640 --> 00:08:45,920 Speaker 1: We want kids to know who they are before platforms 147 00:08:46,080 --> 00:08:50,440 Speaker 1: assume who they are. These are not set and forget 148 00:08:50,520 --> 00:08:53,600 Speaker 1: rules; these are set and support rules. They are world leading, 149 00:08:53,800 --> 00:08:56,920 Speaker 1: but this is manifestly too important for us not to 150 00:08:56,960 --> 00:08:57,439 Speaker 1: have a crack. 151 00:08:57,920 --> 00:09:01,400 Speaker 2: So you mentioned before that YouTube does have the educational 152 00:09:01,480 --> 00:09:04,000 Speaker 2: content on its platform, the ones that are much more 153 00:09:04,120 --> 00:09:07,080 Speaker 2: educational than the makeup tutorials I watched when I was sixteen. 154 00:09:07,679 --> 00:09:09,640 Speaker 2: But in terms of, I know that there are, you know, 155 00:09:09,720 --> 00:09:13,040 Speaker 2: documentaries there which, you know, maybe teachers would even use 156 00:09:13,200 --> 00:09:16,560 Speaker 2: to help with classes that they're teaching. Yeah, did the 157 00:09:16,600 --> 00:09:21,199 Speaker 2: Communications Minister Anika Wells say anything about whether that educational 158 00:09:21,240 --> 00:09:23,160 Speaker 2: content will be accessible still? 159 00:09:23,360 --> 00:09:26,320 Speaker 3: Yeah, well, I think it's worth noting that the restriction 160 00:09:26,600 --> 00:09:31,240 Speaker 3: only applies to YouTube account holders, so under sixteen year 161 00:09:31,280 --> 00:09:34,120 Speaker 3: olds won't be allowed to log in on YouTube, but 162 00:09:34,160 --> 00:09:36,600 Speaker 3: they can still watch videos on the platform, you know, 163 00:09:36,640 --> 00:09:39,040 Speaker 3: the same way any of us can without having to 164 00:09:39,120 --> 00:09:43,559 Speaker 3: log in. So that doesn't change the classroom aspect.
That 165 00:09:43,679 --> 00:09:48,160 Speaker 3: doesn't stop kids from watching potentially helpful videos, but it 166 00:09:48,200 --> 00:09:52,000 Speaker 3: also doesn't stop them from seeing potentially harmful content either. 167 00:09:52,280 --> 00:09:55,440 Speaker 2: That's interesting. So here they are really just banning under 168 00:09:55,480 --> 00:09:59,720 Speaker 2: sixteens from creating an account. But YouTube is unlike something 169 00:09:59,760 --> 00:10:03,360 Speaker 2: like Twitter, or X I should say, or Instagram, where 170 00:10:03,400 --> 00:10:06,160 Speaker 2: you actually need an account in order to view content 171 00:10:06,240 --> 00:10:08,480 Speaker 2: on those platforms. Exactly. YouTube is not like that. 172 00:10:08,600 --> 00:10:11,640 Speaker 3: And the other really important aspect to having a YouTube 173 00:10:11,720 --> 00:10:15,680 Speaker 3: account is the algorithm that comes with that. So if 174 00:10:15,720 --> 00:10:19,360 Speaker 3: you are logged in on YouTube, then the intuitive kind 175 00:10:19,360 --> 00:10:22,200 Speaker 3: of nature of the algorithm pays attention to what you watch. 176 00:10:22,280 --> 00:10:24,480 Speaker 3: It starts to learn about your habits, it starts to 177 00:10:24,520 --> 00:10:26,920 Speaker 3: serve you videos that it thinks you might want to watch. 178 00:10:27,280 --> 00:10:29,760 Speaker 3: And it's that kind of rabbit hole algorithm that's been 179 00:10:29,800 --> 00:10:35,040 Speaker 3: really criticized in terms of promoting violence or serving videos 180 00:10:35,080 --> 00:10:37,439 Speaker 3: to young boys that could be seen as kind of 181 00:10:37,520 --> 00:10:42,720 Speaker 3: radicalizing them towards becoming violent. So the algorithm aspect is 182 00:10:42,720 --> 00:10:45,200 Speaker 3: a really big one. And if you're a casual or 183 00:10:45,200 --> 00:10:49,120 Speaker 3: more passive YouTube consumer, you're not logged in.
It's not 184 00:10:49,280 --> 00:10:52,440 Speaker 3: learning about you, it's not suggesting video content to you. 185 00:10:53,000 --> 00:10:56,160 Speaker 2: Another story that I have read about this week is 186 00:10:56,200 --> 00:11:00,480 Speaker 2: in relation to Google, which is YouTube's parent company, them 187 00:11:00,480 --> 00:11:05,559 Speaker 2: potentially considering suing the Australian government. What is that story about? 188 00:11:06,120 --> 00:11:08,960 Speaker 3: So meanwhile, in the background of all of this, we've 189 00:11:08,960 --> 00:11:12,040 Speaker 3: got reports from the Daily Telegraph that Google is threatening 190 00:11:12,120 --> 00:11:15,880 Speaker 3: legal action against the government for proceeding with including YouTube 191 00:11:15,920 --> 00:11:18,400 Speaker 3: in the ban. So we haven't seen the full details 192 00:11:18,440 --> 00:11:22,319 Speaker 3: of Google's legal arguments, but they have previously raised concerns 193 00:11:22,400 --> 00:11:26,679 Speaker 3: about the timing and the feasibility of age verification systems. 194 00:11:26,880 --> 00:11:31,160 Speaker 3: So this all relies on successful age verification technology, and 195 00:11:31,240 --> 00:11:35,080 Speaker 3: in submissions to Parliament last year, Google warned against introducing 196 00:11:35,120 --> 00:11:39,920 Speaker 3: the law before tested age verification systems are in place. Now, 197 00:11:39,960 --> 00:11:43,079 Speaker 3: despite the government's claims that it is running age verification 198 00:11:43,200 --> 00:11:46,600 Speaker 3: trials that have been mostly successful, Google has pointed out 199 00:11:46,679 --> 00:11:49,800 Speaker 3: that the full age verification system trial won't be finished 200 00:11:49,840 --> 00:11:54,480 Speaker 3: until mid next year, making the bill's timing concerning, in 201 00:11:54,600 --> 00:11:58,080 Speaker 3: its words.
The Communications Minister Anika Wells has addressed this, 202 00:11:58,200 --> 00:12:02,040 Speaker 3: saying the government's waiting on the Age Assurance Trial's final recommendations, 203 00:12:02,440 --> 00:12:05,360 Speaker 3: but she said that they will publish those recommendations as 204 00:12:05,400 --> 00:12:06,960 Speaker 3: soon as possible. 205 00:12:06,920 --> 00:12:09,920 Speaker 2: So, just so I understand, they are suing over 206 00:12:10,120 --> 00:12:16,080 Speaker 2: the timeline, but not necessarily the suggestion of YouTube being included? 207 00:12:15,840 --> 00:12:19,720 Speaker 3: Well, this is all reported by the Daily Telegraph, but 208 00:12:19,800 --> 00:12:23,720 Speaker 3: the suggestion is yes, that they're considering legal action because 209 00:12:23,760 --> 00:12:27,000 Speaker 3: of the timeline, and because the technology might not have 210 00:12:27,280 --> 00:12:32,040 Speaker 3: met the timeline. And there's also a potential argument about 211 00:12:32,120 --> 00:12:36,400 Speaker 3: challenging the legislation on constitutional grounds. So the Daily Telegraph 212 00:12:36,480 --> 00:12:39,680 Speaker 3: reports cited a letter sent by Google to the Communications 213 00:12:39,720 --> 00:12:42,800 Speaker 3: Minister which said including YouTube in the ban would diminish 214 00:12:42,920 --> 00:12:47,960 Speaker 3: the quote implied constitutional freedom of political communication. So a 215 00:12:48,000 --> 00:12:50,520 Speaker 3: free speech argument there. 216 00:12:50,400 --> 00:12:54,280 Speaker 2: And how has the government responded to this threat of legal action from Google?
217 00:12:54,640 --> 00:12:58,760 Speaker 3: So Anika Wells addressed this directly yesterday when she said 218 00:12:58,840 --> 00:13:02,120 Speaker 3: she will quote not be intimidated by legal threats when 219 00:13:02,120 --> 00:13:04,960 Speaker 3: this is a genuine fight for the well being of 220 00:13:05,120 --> 00:13:10,199 Speaker 3: Australian kids. Speaking alongside her, also yesterday, Prime Minister Anthony 221 00:13:10,200 --> 00:13:13,920 Speaker 3: Albanese acknowledged that the ban isn't going to be a 222 00:13:14,000 --> 00:13:17,120 Speaker 3: simple or easy process, but he did say the government 223 00:13:17,160 --> 00:13:20,760 Speaker 3: wants this to be a cooperative one. So he essentially 224 00:13:20,840 --> 00:13:24,760 Speaker 3: dismissed claims that these platforms don't have the resources or 225 00:13:24,800 --> 00:13:27,560 Speaker 3: technology or that they're not going to be ready to 226 00:13:27,679 --> 00:13:30,600 Speaker 3: roll out age verification. Here's a little bit of what 227 00:13:30,720 --> 00:13:31,199 Speaker 3: he said. 228 00:13:31,720 --> 00:13:35,559 Speaker 4: They know where you go, who you talk to, what 229 00:13:35,640 --> 00:13:39,240 Speaker 4: you're interested in. You know, they do keep that information, 230 00:13:39,320 --> 00:13:42,920 Speaker 4: and during the election campaign, if they could identify for 231 00:13:43,080 --> 00:13:46,840 Speaker 4: political parties, in order to encourage us to invest on 232 00:13:46,920 --> 00:13:53,320 Speaker 4: their platforms on an issue like childcare, identifying women between 233 00:13:53,360 --> 00:13:56,760 Speaker 4: a particular age, in a particular seat, in a particular 234 00:13:56,840 --> 00:14:01,880 Speaker 4: demographic, with particular postcodes, then they can help out here too. 235 00:14:02,400 --> 00:14:05,120 Speaker 4: They can use the capacity which we know that they have.
236 00:14:05,840 --> 00:14:10,240 Speaker 2: Has the opposition responded to the YouTube announcement? 237 00:14:10,360 --> 00:14:14,240 Speaker 3: So the Coalition, interestingly, has supported this social media ban for 238 00:14:14,320 --> 00:14:16,959 Speaker 3: under sixteen year olds. It's really very much in favor 239 00:14:17,000 --> 00:14:20,120 Speaker 3: of it, exactly. They really were on board, but since 240 00:14:20,200 --> 00:14:22,960 Speaker 3: news that YouTube was going to be included, they have 241 00:14:23,280 --> 00:14:26,720 Speaker 3: been pretty critical of the government. So the Coalition are 242 00:14:26,720 --> 00:14:30,080 Speaker 3: not criticizing the ban itself, but they've said that Labor 243 00:14:30,120 --> 00:14:33,640 Speaker 3: has essentially broken a promise with this backflip. We've heard 244 00:14:33,640 --> 00:14:37,280 Speaker 3: from Shadow Communications Minister Melissa McIntosh, who has accused the 245 00:14:37,320 --> 00:14:40,560 Speaker 3: government of a lack of transparency. She said Labor quote 246 00:14:40,560 --> 00:14:43,280 Speaker 3: cannot hide the fact that they deliberately misled the public 247 00:14:43,320 --> 00:14:46,480 Speaker 3: at the last election by promising to keep YouTube out 248 00:14:46,520 --> 00:14:49,600 Speaker 3: of the social media age minimum. She said the Coalition 249 00:14:49,720 --> 00:14:52,600 Speaker 3: is concerned that the eSafety Commissioner as well is 250 00:14:52,640 --> 00:14:56,840 Speaker 3: testing boundaries quote which are moving beyond what Australians are 251 00:14:56,840 --> 00:14:57,480 Speaker 3: comfortable with. 252 00:14:58,720 --> 00:15:01,240 Speaker 2: So in terms of what happens now, you said before 253 00:15:01,360 --> 00:15:04,080 Speaker 2: that this comes into effect on the tenth of December, 254 00:15:04,120 --> 00:15:07,120 Speaker 2: I believe you said. And so what needs to happen 255 00:15:07,160 --> 00:15:08,360 Speaker 2: between now and then?
256 00:15:08,520 --> 00:15:10,960 Speaker 3: Yes. So, as you said, this legislation is due to 257 00:15:10,960 --> 00:15:14,960 Speaker 3: come into effect in December. The eSafety Commissioner will 258 00:15:15,000 --> 00:15:18,880 Speaker 3: be responsible for enforcing the new rules against the social 259 00:15:18,920 --> 00:15:21,640 Speaker 3: media companies, but it will be up to each platform 260 00:15:21,880 --> 00:15:24,920 Speaker 3: to roll out their own strategies to ensure they're complying 261 00:15:24,960 --> 00:15:27,440 Speaker 3: with the law. So in terms of how they do that, 262 00:15:27,680 --> 00:15:32,600 Speaker 3: we need, I suppose, more concrete evidence or advice on 263 00:15:32,720 --> 00:15:36,480 Speaker 3: age verification technology. But it is worth noting that the 264 00:15:36,600 --> 00:15:40,520 Speaker 3: legislation places the responsibility on the platforms, not the users, 265 00:15:40,600 --> 00:15:43,120 Speaker 3: so not on young people or their parents. Users are 266 00:15:43,160 --> 00:15:47,480 Speaker 3: not going to face penalties for attempting to access these platforms. Now, 267 00:15:47,520 --> 00:15:50,720 Speaker 3: in terms of the legal threat, Google may follow through 268 00:15:50,760 --> 00:15:54,240 Speaker 3: on that, and that could involve challenging this legislation in 269 00:15:54,320 --> 00:15:57,040 Speaker 3: the Federal Court. In order to avoid that, you know, 270 00:15:57,120 --> 00:16:00,520 Speaker 3: purely speculating here, but the government might end up negotiating 271 00:16:00,640 --> 00:16:03,600 Speaker 3: or working with them to kind of prevent legal action 272 00:16:03,640 --> 00:16:05,600 Speaker 3: from going all the way to court, and that could 273 00:16:05,640 --> 00:16:09,200 Speaker 3: look like, you know, some flexibility on the technical implementation 274 00:16:09,680 --> 00:16:13,160 Speaker 3: of age verification or some delays.
So the legal threat 275 00:16:13,160 --> 00:16:17,880 Speaker 3: could potentially delay implementation. But given the bipartisan support that 276 00:16:17,920 --> 00:16:20,840 Speaker 3: we've seen for this legislation and the substantial work that's 277 00:16:20,840 --> 00:16:25,000 Speaker 3: already been done on the age verification trials, the government seems, 278 00:16:25,120 --> 00:16:29,200 Speaker 3: you know, pretty firmly committed to proceeding with the December deadline. 279 00:16:29,440 --> 00:16:31,400 Speaker 2: Something tells me we are going to be hearing a 280 00:16:31,480 --> 00:16:34,320 Speaker 2: lot more about it in the months to come. Exactly. 281 00:16:34,520 --> 00:16:37,200 Speaker 2: And thank you so much for explaining that. Thank you, 282 00:16:37,520 --> 00:16:39,640 Speaker 2: and thank you so much for listening to this episode 283 00:16:39,680 --> 00:16:42,160 Speaker 2: of The Daily Aus. Or you might be listening or 284 00:16:42,200 --> 00:16:45,080 Speaker 2: watching on YouTube. Hopefully you're over sixteen. 285 00:16:45,200 --> 00:16:45,480 Speaker 1: Hello. 286 00:16:47,200 --> 00:16:49,840 Speaker 2: We'll be back this afternoon with your evening headlines, but 287 00:16:49,960 --> 00:16:54,800 Speaker 2: until then, have a good day. 288 00:16:55,640 --> 00:16:57,920 Speaker 1: My name is Lily Madden and I'm a proud Arrernte, 289 00:16:58,160 --> 00:17:03,320 Speaker 1: Bundjalung, Kalkadoon woman. The Daily Aus acknowledges that this 290 00:17:03,440 --> 00:17:05,960 Speaker 1: podcast is recorded on the lands of the Gadigal people 291 00:17:06,280 --> 00:17:09,399 Speaker 1: and pays respect to all Aboriginal and Torres Strait 292 00:17:09,400 --> 00:17:12,400 Speaker 1: Islander nations. We pay our respects to the first peoples 293 00:17:12,440 --> 00:17:14,520 Speaker 1: of these countries, both past and present.