Speaker 1: Turns out the Australian government is now considering not only banning the under-sixteens from social media, but from YouTube as well. They're still going to be able to watch it, they just won't be able to set up an account or interact on it. Cecilia Robinson is the co-chair of the child online safety advocacy group Before Sixteen and is with us now. Cecilia, hello.

Speaker 2: Hi.

Speaker 1: So despite the ban, they're basically going to be able to use it, aren't they?

Speaker 2: Well, look, I don't think that that's the case at all. And you know, to be completely frank, this limitation they are putting on YouTube should have been included from the start, and should have been what they looked at originally in Australia, because the problem with YouTube is that it's just like every other social media platform, just like Snapchat, just like TikTok, just like Instagram, in the way that people are creating user-generated content and the way that the algorithms work. So no, it definitely needs to be properly managed as well.
Speaker 1: Okay, but I mean, if they can still view it, then can't they still succumb to the dangers of it?

Speaker 2: So children are going to be able to use YouTube Kids, and as far as I understand, the way that this legislation is going to work is that Facebook, Snapchat, YouTube and Instagram are all going to be treated in the same way. So YouTube Shorts, yeah, but that's not going to be viewable on YouTube Kids.

Speaker 1: Only if they log in, right? So you don't have to log into YouTube, you can just watch it without logging in, in which case you can watch the adult version, can't you?

Speaker 2: And look, that's where it's going to be really interesting to see how this legislation evolves, right, and see how they actually get to that granular detail, because clearly what we're trying to do is prevent young people from seeing harmful content online. And when you bring it back to what the problem here is, it's that you're seeing kids as young as six being exposed to things like violent pornography without even searching for it.
And so I think that this is a starting point. I think it's fantastic that Australia is, you know, now accepting that YouTube is also a social media platform, which it has been all along, and actually accepting that there need to be appropriate boundaries in place for that.

Speaker 1: From Mike: "This lady is crazy. I've got four kids. YouTube is not social media."

Speaker 2: Yeah, look, it's a really clear-cut case here. When you look at how social media is defined, by things like algorithmic feeds, content virality, live streaming and stories, and just the addictive nature of social media, which has been provided through TikTok and through YouTube, you can see that this is social media. It's pretty plain and simple. And when you look at the platforms you compare it to, for example TikTok, which has also been defined previously in this legislation in Australia, YouTube and specifically YouTube Shorts operate in the same way as TikTok does. So why they had an exemption in the first place is mind-boggling, because they operate the same way.

Speaker 1: Okay, Cecilia, thank you, appreciate your time.
Cecilia Robinson, Before Sixteen co-chair. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.