Speaker 1: Alrighty, and this is the Daily... This is The Daily Aus. Oh, now it makes sense.

Speaker 2: Good morning and welcome to The Daily Aus. It's Tuesday, the second of September.

Speaker 2: I'm Harry Sekulic.

Speaker 1: And I'm Emma Gillespie.

Speaker 2: In fewer than one hundred days, under sixteens in Australia will be banned from social media. Kids and teens won't be able to make a YouTube, TikTok or Facebook account. But a big question remains unanswered: how will it work? Age verification trials have been taking place over the past year, and the government has now released its findings. In today's Deep Dive, we'll take you through some of the findings and look at what will happen next.

Speaker 1: But before we get stuck into it, here's a quick word from our sponsor.

Speaker 1: Harry, this social media ban story has been following us. We've been following it. It's felt like a really long time in the making, from the first time we ever heard about a potential social media ban to the legislation passing. The ban itself is not yet in place, but you've got a really interesting update for us today on the age verification stuff that's been kind of at the center of the questions around whether or not this is going to work. But before we talk about that, let's just rewind. Can you give us a quick timeline on the big developments of late?

Speaker 2: Yeah, this social media ban is one of those stories where, in the world of journalism, you say the lead always updates. A new headline comes through every so often; you get a new top line with every development in this story. And going back to its very origins, there was this really big campaign to ban social media for under sixteens, mostly led by concerned parents who said that their kids were being exposed to violent and sexually explicit content, that social media was being used for bullying, that there were just some harms online that they wanted to see more protections against.
And in November last year, the government responded to some of these calls and banned social media for under sixteens. Initially there was an exemption for YouTube; that exemption has now been overturned, and that's subject to a legal challenge, which we could go into separately in its own podcast. The ban itself is scheduled to take effect on the tenth of December this year.

Speaker 1: Okay, so that YouTube exemption that was reversed: we actually did chat about that on the podcast when it happened. I think it was the end of June or the start of July, off the top of my head, but we'll pop a link to that in the show notes if you want to get up to speed on all of that debacle and what unfolded there. But for now, Harry, we know this ban is coming into effect regardless of how the social media platforms feel about it. But there has been a lot of confusion about the practicalities of this ban. It feels like there's been a collective understanding or agreement that social media is bad for kids' mental health and we need to do more to protect kids online. But how's it going to work? Who will be responsible for enforcing this?

Speaker 2: When I first heard that there was going to be a ban for under sixteens, I was thinking that kids were potentially going to be slugged with a fine for creating an account when they're not meant to. But obviously that's not how this is going to work, and the government was pretty clear from the outset that it would be on the social media companies: they would bear the responsibility to enforce the ban. But on a practical level, there are probably a few aspects to how the ban will work. There'll be the deterrent effect, so parents will be able to say to their kids, it's actually illegal for you to have a social media account, and that might have the effect of kids just not downloading the app, or whatever it might be. They just might be dissuaded from doing so.
But obviously it goes beyond that, and as I said, the onus will fall on the social media companies, the platforms themselves, to enforce the ban, or they would risk getting a fine, which is to the tune of about fifty million dollars. So, for example, if it was found that Meta was not taking enough reasonable steps (that's the language in the legislation, "reasonable steps") to prevent under sixteens from making an account on their platform, they could be slugged with a fifty-million-dollar fine. So the responsibility lies with them.

Speaker 1: Okay, so the responsibility is on the platforms, not children or parents. If we've got social media companies then regulating and enforcing this, how will they, and I know this is a million-dollar question, how will they verify the age of their users?

Speaker 2: Well, Australia is the first country in the world to have an age-specific ban for social media, so this is kind of a novel space. This is very much new territory. And I think it's helpful to think of a nightclub: when you go to a club, you need to show proof that you are at least eighteen years old, and I think that there are some steps being taken to bring in a digital version of that for sixteen-year-olds on social media.

Speaker 1: Well, Harry, it's interesting you say that, because it's one thing to be at a licensed venue in person and be able to physically present your ID, to have a bouncer or security person check your ID, look you up and down, decide, yep, that's you, you are free to pass. But in the digital world, in the online environment, it's just a whole different story.

Speaker 2: And that's where we get into this space of talking about age assurance technology, which is basically verifying that you are at least sixteen years old and able to use social media. And so since the government announced its ban, it said that it was going to trial some of this technology, that it was going to get an independent review into some of the tech that's available out there.
And what we have this week is the findings from that review into the existing technologies that are available. It was conducted by a British firm, independent of the government, and it looked at forty-eight different companies, the way in which they handle age assurance and how they use this technology.

Speaker 1: Okay, so a pretty big pool, I suppose, of companies doing the age verification stuff or building this tech. But I am really curious to know what this report found. What were the types of tech that they looked into? How did they work? Did they work?

Speaker 2: So there were broadly three different categories of age assurance tech. The first is called verification, and this is directly proving how old you are through official identity documents. So think of birth certificates, passports, those things that we already use day to day in our lives.

Speaker 1: Thinking of if you're applying for a new rental and you need one hundred thousand points of ID. That's right, you've got to find every little document that proves who you really are to get past the first hurdle on your application.

Speaker 2: And there are some platforms that already require you to pose next to your passport photo, and that way they can determine whether your facial features match up to the passport itself. But also, in Australia the minimum age to get a learner's license to drive is sixteen, so that's also one way of verifying whether you are of the social media ban age, if you've cleared that threshold.

Speaker 1: Because a lot of under-sixteen-year-olds wouldn't have a lot of formal identification on hand. I don't know if school library cards are going to pass the test.

Speaker 2: Well, that actually taps into some of the other areas where you can go through age assurance.
If you've only got a school card, then some of this tech might actually think that you are of school age, maybe thirteen or fourteen years old, so that's one way of rooting it out. The second main age assurance tech is called estimation, and that's where we're getting into using things like face scanning to prove that you're at least sixteen years old. So this is scanning technology that determines, based on biometric analysis, whether you are at least sixteen. And the final area is called inference, and this is a little bit more tricky. This is technology that uses your metadata, or looks at your online digital footprint, to get a sense of how old you are. This is basically taking a big overall look at how you've been conducting yourself online and whether you fall within a likely age range. So it might just be your email domain: if you have, like, a work email address, that would suggest that you are a little bit older. If you're on the electoral roll, you have to be at least eighteen to vote, so they will cross-reference your details against that. And there are also questions of whether they can get data from school enrolments to determine whether someone's in year ten or eleven, you know, over sixteen.

Speaker 1: So official channels. Because in my head, I'm thinking: if you've got someone whose browser history contains a lot of homewares and couch options, versus someone whose browser history is streaming a lot of Skibidi Toilet, you're probably able to make a detection on their age from that.

Speaker 2: It would be a really humbling experience if I, being in my late twenties, was confused for being thirteen years old based on my browser history.
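[To make those three categories a little more concrete, here's a minimal sketch of how a platform might weigh the three signal types against the minimum age. It's purely illustrative: the function names, thresholds and fallback order are assumptions for this sketch, not anything prescribed by the report or used by a real platform.]

```python
# Hypothetical sketch of combining the three age assurance approaches:
# verification (ID documents), estimation (face scanning) and inference
# (metadata / digital footprint). Names and thresholds are illustrative only.
from dataclasses import dataclass
from typing import Optional

MINIMUM_AGE = 16

@dataclass
class AgeSignals:
    verified_age: Optional[int] = None     # from an official ID document
    estimated_age: Optional[float] = None  # from a face-scan model
    inferred_age: Optional[float] = None   # from metadata / online footprint

def allow_account(signals: AgeSignals) -> bool:
    """Return True if the user clears the minimum-age bar."""
    # Verification is the strongest signal: a checked document settles it.
    if signals.verified_age is not None:
        return signals.verified_age >= MINIMUM_AGE
    # Estimation is probabilistic, so demand a margin above the cutoff
    # (the margin here is an assumption, not a figure from the report).
    if signals.estimated_age is not None:
        return signals.estimated_age >= MINIMUM_AGE + 2
    # Inference is the weakest signal, so demand an even larger margin.
    if signals.inferred_age is not None:
        return signals.inferred_age >= MINIMUM_AGE + 3
    # No signal at all: fail closed.
    return False
```

[In this sketch, `allow_account(AgeSignals(estimated_age=17.0))` would fail and push the user toward a stronger check, which is one way the categories could work together rather than in isolation.]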
Speaker 1: That inference side is really interesting. But you mentioned estimation, and I know that this is something that had come up during the trial: you know, what is the difference between a fifteen-year-old's face and a sixteen-year-old's face? Are these kinds of different categories all about working together with each other to form a more thorough picture of age verification, or will platforms kind of pick and choose one each?

Speaker 2: Well, it's all part of what is being thrashed out at the moment, and this goes to what the legislation requires, which is the "reasonable steps" test. So they might go to these third-party platforms, which verify or estimate someone's age, as a way of blocking anyone who's under sixteen from making an account. But also, some of these platforms already use this type of verification. Meta already uses it, so if they're harnessing it to stop people under sixteen from getting onto their platforms, that might be considered a reasonable step. It depends what the eSafety Commissioner determines before the December deadline.

Speaker 1: So, Harry, there's a big recurring question here around privacy. Whenever we think about the social media platforms, privacy is obviously something that matters to users, but it's especially delicate here because we're talking about under-eighteens and children. Did the report say anything about that?

Speaker 2: Yeah, it looked into some of the pretty genuine concerns about privacy breaches that could occur, especially when we're thinking about the inference technology, so that's using the online footprint to sort of build an estimation of someone's age. And the report actually warned that some continuous behavioral monitoring online could overstep an ethical mark. So it said that there needed to be some ground rules and regulation to make sure that if the inference technology was to be used, it would be safe and wouldn't lead to that kind of ongoing monitoring.

Speaker 1: Okay.

Speaker 2: The other main issue that was brought up in the report was accuracy. So what if the technology accidentally starts locking people out who are over sixteen and actually able to use social media under the law?
Speaker 1: Or similarly approves people who are under sixteen because it might think they look eighteen.

Speaker 2: So the trials found that some of the tech could mismatch someone's age within what it called a buffer zone, and that's about two to three years. So that means that someone who's seventeen might be confused for a fourteen-year-old, or vice versa.

Speaker 1: Okay, so this report is acknowledging some of the shortcomings of this tech, and it's saying that there is a bit of a buffer zone. That feels kind of difficult, because if it's a buffer zone of two to three years, you know, the experts are telling us that that's a critical age, those years between, you know, fourteen and eighteen. So I'm sure there'll be people watching that one closely.

Speaker 2: And the language in the report says that, based on its trials, the estimations have fallen within what they call the acceptable range. So that means that there aren't enough inaccurate estimations of someone's age to cause too much alarm, based on the report's findings. But once you roll this out en masse to every teenager and child in Australia, that does become a bit more of a pressing issue. Lastly, just one note on accuracy: there was this really interesting note in the report as well about some issues with non-Caucasian users, including that First Nations peoples are underrepresented in the training data, so there could actually be an inbuilt bias in the system, which is something that we've heard about with face-scanning technology before as well. And the report said that there was a bit of awareness among the age assurance companies that this is a shortcoming and something that they need to look further into.

Speaker 1: Fascinating. So having more diversity across the trials to understand the data better on a more diverse range of users.
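[To put that buffer zone into concrete terms, here's a small sketch of how a platform might act on an estimate that could be wrong by two to three years. The decision labels and the idea of escalating borderline cases to document verification are assumptions for this sketch; only the two-to-three-year error band comes from the report as described above.]

```python
# Hypothetical illustration of the report's "buffer zone": if a face-scan
# estimate can be off by roughly 2-3 years, borderline results shouldn't
# be trusted on their own. The escalation step is an assumption.
MINIMUM_AGE = 16
BUFFER_YEARS = 3  # upper end of the 2-3 year error band described above

def decide(estimated_age: float) -> str:
    if estimated_age - BUFFER_YEARS >= MINIMUM_AGE:
        return "allow"     # even a worst-case overestimate still clears 16
    if estimated_age + BUFFER_YEARS < MINIMUM_AGE:
        return "block"     # even a worst-case underestimate stays under 16
    return "escalate"      # ambiguous band: fall back to ID verification

# decide(21.0) -> "allow"; decide(11.5) -> "block"; decide(16.0) -> "escalate"
```

[Notice that under this logic everyone roughly between thirteen and nineteen lands in the "escalate" band, which is exactly the critical fourteen-to-eighteen window flagged above.]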
Speaker 1: We are talking, though, Harry, about a generation that's grown up online, right? We're also talking about teenagers. When you consider those two things together, you can't help but wonder: is this a generation that will find its way around age verification technology? Did the research consider that?

Speaker 2: Yeah. And the standout thing to me was the use of virtual private networks, also known as VPNs, to dodge the rules. And the reason it sort of piqued my interest is because recently in the UK there was a ban on pornography websites for under eighteens, and so they rolled out this age assurance technology, and all of a sudden there was just an uptick in the number of people downloading VPNs.

Speaker 1: And so there's a learning there for Australia.

Speaker 2: Exactly. So the report basically recommended a way to get around the issue of VPNs, which basically make you appear as though you're in another country where the rules don't apply; that's where you change what's known as your IP address. But there's geolocation technology that can detect an inconsistency. So if your IP address and your regional patterns are not lined up, then this type of technology could pick up on that. And it's already been used in some contexts as well.

Speaker 1: So it is possible to identify if someone in Australia uses a VPN to change their IP address but still shows Australian usage patterns. I think that that's really interesting, and it will probably be some relief for policymakers and parents alike to know that that exists, because if a workaround exists, we can expect teenagers to find it.
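[Here's a rough sketch of the kind of consistency check that geolocation technology could run. The field names and the specific signals (timezone, locale) are illustrative assumptions; the underlying idea, flagging a mismatch between the IP address and regional patterns, is the one described above.]

```python
# Hypothetical sketch of a VPN-evasion check: flag sessions whose
# IP-derived country disagrees with Australian regional usage patterns.
# Field names and signals are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SessionContext:
    ip_country: str       # country resolved from the IP address, e.g. "US"
    device_timezone: str  # e.g. "Australia/Sydney"
    account_locale: str   # e.g. "en-AU"

def looks_like_vpn_evasion(ctx: SessionContext) -> bool:
    australian_patterns = (
        ctx.device_timezone.startswith("Australia/")
        and ctx.account_locale.endswith("-AU")
    )
    # A foreign IP paired with strongly Australian patterns is the
    # inconsistency that geolocation checks can pick up on.
    return ctx.ip_country != "AU" and australian_patterns
```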
Speaker 2: That's right. Overall, the government says that it's quite reassured by the findings: that this is workable, it's scalable, and there is going to be a bit of trial and error when rolling out the technology. But it's acknowledged there's not really a one-size-fits-all approach when it comes to getting kids off social media. The report does give a bit of a rundown, though, of what we can expect from the tenth of December. You might need to show your passport; you might need to scan your face to see if the technology picks up that you're over sixteen; or there might be an app that scans your search history, your online footprint, to see how old it thinks you are. And the fact that they're still not fully sure how the technology is going to work has been picked up this week by the opposition. The Shadow Communications Minister, Melissa McIntosh, said that the report has come "ten seconds to midnight", adding that the report is not the final step by any means. The eSafety Commissioner now needs to have a look at the findings and then advise the social media companies on what will form best practice for them, and all the while the ban is just under one hundred days away. So, tick tock, pun intended.

Speaker 1: Sure was. Thank you so much, Harry. It's so interesting, because I think, you know, we hear about this ban and it might feel far away for some of us, you know, for those of us who are shockingly over sixteen. That's my secret: I'll never tell. But anyone who has a login across these platforms is hypothetically going to be affected, right? We will, I imagine, on the tenth of December all have to verify our age in some way on these apps.

Speaker 2: Unless we've been on there for as long as I have. Then I hope they just know that I'm well and truly...

Speaker 1: The maths is not mathsing: this guy's over sixteen. Harry, thank you as always for that breakdown. We really appreciate your guidance.

Speaker 2: Thanks, Em. I appreciate it.

Speaker 1: And thank you for listening to today's episode. We'll be back a little later on with your evening news headlines, but until then, have a great day.

Speaker 2: My name is Lily Madden, and I'm a proud Arrernte, Bundjalung, Kalkadoon woman from Gadigal Country.
The Daily Aus acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander nations. We pay our respects to the first peoples of these countries, both past and present.