Speaker 1: Now over to Australia. It's finally law: under-sixteen-year-olds are going to be banned from social media. This is in Australia only, obviously. The Federal Parliament passed the bill into legislation last night, and companies failing to comply could be fined up to fifty million dollars. They have a year, that is, the social media companies, to figure out how to implement this. Tama Leaver is a professor of Internet studies at Curtin University in Perth. Tama, hello.

Speaker 2: Hello.

Speaker 1: How are we actually going to verify ages?

Speaker 2: That's a really good question. One of the challenges with this particular bill is that it says that platforms are responsible for doing that; it doesn't tell platforms how they are supposed to do it. We've got an age assurance trial happening sometime in the middle of next year. But the thing you would imagine is probably the easiest way to do it, asking for some sort of government-issued identification, an amendment was added so that platforms aren't allowed to ask for that, because there were privacy concerns. So we know that platforms are going to bear the brunt of trying to do this, and that may be a good thing, but the exact technicality of how they're going to do age verification or age assurance is very much up in the air.

Speaker 1: What is the relevance of this age assurance trial?

Speaker 2: So the age assurance trial, which was originally going to happen first, was to test out all of those tools that claim to be able to figure out your age without you entering it yourself: maybe facial recognition or other biometrics, or looking at the history of how you use a platform, or something else that makes a sort of educated guess, if you like, about your age. We know that those tools have been fairly immature, that they've had fairly high error rates, and that they've worked much better on white faces than any other sort of face, so there are some real issues there that we've known about. It would have been lovely to do that trial first and figure out whether those tools have improved or not, but as it is, we've got a year to try and get these things right.
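To make the error-rate point concrete, here is a minimal sketch, assuming a hypothetical estimator that returns an age estimate together with an uncertainty margin, of how a platform might turn an uncertain estimate into a decision. None of this reflects any actual platform's implementation; the estimator, its margin, and the thresholds are all illustrative.

```python
from dataclasses import dataclass

MINIMUM_AGE = 16  # the threshold set by the Australian legislation

@dataclass
class AgeEstimate:
    """A hypothetical estimator's output: a point estimate plus an
    uncertainty margin (e.g. from facial analysis or usage history)."""
    years: float
    margin: float  # plus-or-minus band around the estimate

def decide(estimate: AgeEstimate) -> str:
    """Turn an uncertain age estimate into allow / block / challenge.

    Because estimators have high error rates near the boundary, any
    estimate whose uncertainty band straddles the minimum age cannot
    safely be allowed or blocked outright; it has to fall back to some
    secondary check. The wider the margin, the more users land there.
    """
    if estimate.years - estimate.margin >= MINIMUM_AGE:
        return "allow"        # confidently over the line
    if estimate.years + estimate.margin < MINIMUM_AGE:
        return "block"        # confidently under the line
    return "challenge"        # uncertain: needs a secondary check

# A +/-3-year margin means every estimate from 13 up to (but not
# including) 19 falls into the "challenge" zone.
for years in (12, 15, 17, 20):
    print(years, decide(AgeEstimate(years=years, margin=3.0)))
```

The sketch shows why accuracy matters so much at exactly age sixteen: even a modest error margin forces a large share of real users into whatever fallback check the platform chooses.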
Speaker 1: Do you think this is going to work? Will the social media companies be able to figure this out in a year?

Speaker 2: Honestly, I'm not quite sure. I think there will be something put in place; I think there will be a real effort. I mean, there are significant penalties for getting this wrong. But I also think there might be a legal challenge from the platforms saying, you've asked us to do something nobody's ever been able to do well, and it's not really our fault that we can't. So there are lots of questions still in the air, and a year is a long time to figure this out. There's also a federal election in the middle of that year, so it'll depend on who's driving the ship as well when it gets to the point of actually having to implement it.

Speaker 1: Tama, why doesn't the government provide the age assurance itself? Because, I mean, there must be a way to issue a token or something. It already has the details of these kids, or can have the details of these kids, and can be trusted to keep them safe, because everybody's government keeps our details safe, right, with our driver's license or our passport. And then just assure the social media companies: yes, Tama is over eighteen, or sixteen, or whatever.

Speaker 2: Yes. So I mean, I think Australia has a long history of trying to bring in some sort of systematic national identity scheme, and every time people have absolutely freaked out and not wanted to do it. The current government, and indeed the last few governments, have had a pretty bad track record with looking after personal information; they've been hacked through all sorts of different departments. So I don't think that trust relationship, if you like, exists with most citizens. So it's a tricky one. It's technically quite hard to do any other way, but people don't want to share government-issued ID, and of course most kids under sixteen don't have government-issued ID in many respects, so there are a lot of additional hurdles there.
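The token idea the host floats, where a trusted issuer attests "this person is over sixteen" without handing the platform any identity details, can be sketched in a few lines. This is a hypothetical illustration using an Ed25519 signature from the `cryptography` package, not any scheme proposed in the legislation; the payload fields and the issuer are invented for the example.

```python
# A toy age-attestation token: an issuer (standing in for some trusted
# authority) signs a claim that says only "over 16", with an expiry,
# and the platform verifies the signature against the issuer's public
# key. No name, birth date, or ID number ever reaches the platform.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)

# --- Issuer side (hypothetical authority) ---
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()  # published to platforms

def issue_token(over_16: bool, ttl_seconds: int = 3600) -> tuple[bytes, bytes]:
    """Sign a minimal claim; the payload deliberately carries no identity."""
    payload = json.dumps(
        {"claim": "age_over_16", "value": over_16,
         "expires": int(time.time()) + ttl_seconds},
        sort_keys=True,
    ).encode()
    return payload, issuer_key.sign(payload)

# --- Platform side ---
def verify_token(payload: bytes, signature: bytes) -> bool:
    """Accept only an unexpired, correctly signed over-16 claim."""
    try:
        issuer_public_key.verify(signature, payload)
    except InvalidSignature:
        return False
    claim = json.loads(payload)
    return (claim.get("claim") == "age_over_16"
            and claim.get("value") is True
            and claim.get("expires", 0) > time.time())

payload, signature = issue_token(over_16=True)
print(verify_token(payload, signature))   # True
print(verify_token(payload, b"x" * 64))   # False: bad signature
```

Even in this toy form, the design point is visible: the platform learns one bit plus an expiry. The hard part Tama describes is the trust and adoption side, not the cryptography.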
Speaker 1: Yeah, I mean, wouldn't you though? If it was put to you, wouldn't you trust the government with your details rather than Facebook?

Speaker 2: Oh look, that's a really good question. I'm pretty sure Facebook can see I've had an account for fourteen years, so if they think I'm under sixteen, that's going to be a bit of a worry.

Speaker 1: Yeah. But I mean, I was wondering why you were so in two minds about it. Would you trust Zuckerberg to have your driver's license on file?

Speaker 2: I don't trust Zuckerberg to have anything of mine on file. But I do think that other ways of doing it are harder, and Australians are now very used to being cynical about the relationship they have with their government, so I think trusting them to do it is never going to be straightforward, because of course we're not just verifying kids' identities, we are verifying everyone's identities. Adults have to prove they're not kids. It's not just convincing seventeen-year-olds to do this, it's convincing fifty-five-year-olds.

Speaker 1: Listen, Tama, thanks for running us through it, really appreciate it. Tama Leaver, Professor of Internet Studies at Curtin University. And personally, I'm keen to see how this plays out. If they do it, and they do it well, then we'll have to do it too.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.