Speaker 1: Over in Australia, kids will soon be banned from social media apps like TikTok and Snapchat and Instagram and Facebook. The government's going to decide an age limit, then they're going to introduce legislation for the ban before the end of the year. And Chris Luxon, our Prime Minister, says he's thinking about doing a similar thing here. Now, Lisa Given is a professor at the Royal Melbourne Institute of Technology and with us now. Hey, Lisa.

Speaker 2: Hello.

Speaker 1: They haven't specified what age they're going to go for. What do you reckon?

Speaker 2: Ah, they haven't. They've sort of set a range, fourteen to sixteen. Most other jurisdictions exploring this have landed on fourteen. But we'll see what they say.

Speaker 1: And how do you think it's going to... Where will the enforcement be? Will it be at the point of the social media site, so Facebook, for letting the kids sign up at all? Or would it be at the app store, where the parents would have to give permission for the download of the app?

Speaker 2: It could be either one. We actually don't know that quite yet.
Speaker 2: I suspect it will be at the point of applying to, say, get an account. There may be something pop up where parents might have to, say, upload, or at least attest to a child's age of that minimum requirement.

Speaker 1: Is that the smartest way of doing it?

Speaker 2: Well, this is challenging. I mean, that's certainly a way to do it, and you can then exclude people if they don't provide appropriate evidence of age. The question is, are there workarounds on that? There's certainly ways that children could get around that type of mechanism, using a VPN, for example, or even just accessing content through other people's accounts.

Speaker 1: And I'll tell you what, Lisa, if there's one thing you know about kids, it's that they know more about technology than their parents, right? So they will get around this.

Speaker 2: They know what they're doing. And I think that's really the critical thing: is a ban even possible? The other question is, is this what we need? We don't actually have definitive evidence around the harms to children with social media, and so the experts are quite divided on this one.
Speaker 1: So let's say you find out, whoever the Facebook police are, find out that Johnny, who's aged twelve, is on Facebook. Who gets in trouble: Johnny, or Facebook, or Johnny's parents?

Speaker 2: Effectively, it's going to be the social media companies. So what the government is talking about is legislation that would enable... it could be fines, different kinds of sanctions, could even be court proceedings.

Speaker 1: Is it going to work? What do you reckon, Lisa, if you had to give us your professional opinion?

Speaker 2: Yeah, my professional view is that this isn't going to work. I think it's too simple a potential solution for the complexity of the problem. This is really a social problem. It's really about how do we keep our children safe from harm. There are many other technical solutions, for example, battling the algorithms on social media, and this one leaves too many holes. There's almost no way to ensure that every single child would be off of social media under this legislation.

Speaker 1: Does the responsibility really lie with parents to equip kids for this world that we now live in?
Speaker 2: Absolutely. I think it is about parents. It's also about teachers, other people in the community that can help kids and their families to navigate social media. I'd love to see more resources going into those kinds of supports around digital literacy, as well as regulation for social media companies to stop the harmful content at the source.

Speaker 1: What would you teach... If there was a parent listening right now who had a, I don't know, let's say, thirteen-year-old who wants to go on Facebook, what would you say to the parents is the most important thing to teach that child?

Speaker 2: The most important thing is actually teaching them openness and willingness to engage in dialogue about what they're seeing. So I'd be wanting to sit with that child, help them to set up an account, put some reasonable boundaries, perhaps around what time of day you're going to do this, and then have conversations about the content they're seeing online.

Speaker 1: So, for example... and this is a problem particularly for girls, but it does affect boys as well.
But if you've got a girl on Instagram and she's looking at the photographs of perfect supermodels and their perfect lives and their perfect bodies, and starts to feel a bit bummed out about it, the best thing is to go and talk to your parents about it, as your parents can tell you that's not really what life is like. Yeah?

Speaker 2: Absolutely. The problem itself does not go away with this kind of ban. This is a social problem we have around body image and the things that young women are seeing online. But they're also going to see that in the real world. They see it on television and lots of other places.

Speaker 1: Lisa, it's good to talk to you. I really appreciate your expertise on that. Lisa Given, professor at the Royal Melbourne Institute of Technology. I mean, Lisa's bang on: you can ban the kids if you want to, but you're not actually... you've actually got to just equip them. Because whether they're fourteen or sixteen or eighteen, they're still going to have to go and come across this stuff on the internet.
It's a bit like trying to ban toddlers from walking on the footpath. The smartest thing to do is to teach them road skills, isn't it, so they can actually manage the roads. You can't ban them from roads forever. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.