Speaker 1: How about this: Instagram appears to be buckling under pressure and is going to control what kids can see if they're on the social media app. It's launched what they're calling teen accounts, and these will have a bunch of restrictions, including being private by default, so people can't just randomly message the teens, and also the teens can only see certain things. Netsafe Chief Online Safety Officer Sean Lyons is with us. Sean, hi, how are you doing?

Speaker 2: I'm very well, thank you.

Speaker 1: Is this good enough? Is this what we want?

Speaker 2: I think it's certainly a step in the right direction. I mean, it's not everything, it doesn't fix all the problems, but I think what we're starting to see, like you alluded to, is movement towards giving more protection by default to young people, and hopefully helping parents to understand and have more of an active role in keeping young people safe online.

Speaker 1: How do you restrict what they see? What are we going to restrict it to?

Speaker 2: Well, the biggest one is the fact that the accounts will go automatically from public to private if they're not already private, so that's restricting other people's ability to see. But also with that go what they call sensitive content filters, so where the AI detects that some of those images might be violent or highly sexual, or content along those lines, it will be blurred, it will be obfuscated, and won't be able to be seen by those individuals. So it's some content control, and like I say, it's the right start, but there's a lot further to go.

Speaker 1: Are you reading this as them being worried about the possibility of countries like Australia banning kids under the age of, let's say, sixteen?

Speaker 2: I'm sure... I mean, I don't know, but I'm sure they're aware of all of the legislative calls that are going around, and people's genuine feeling that perhaps there's more to be done. Not sure what the motivation is. I mean, they're a US-based company, and there are probably legislation changes afoot there too. But I think it is probably in recognition of the fact that more of the community of us are saying we would like to understand how these platforms could be safer, and we'd like you, the platform, to try and make them safer.

Speaker 1: Sean, it's good to talk to you, mate. Thank you for that. That's Sean Lyons, Netsafe Chief Online Safety Officer. Hey, so the Commonwealth Games. So yeah, on its last legs. But guess who loves the idea of the Commonwealth Games? That's right, the New Zealand Olympic Committee. Yes they do, and they're going to be with us in about a quarter of an hour to explain why. Five twenty-two.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.