Speaker 1: Now, on another subject altogether, the National Party has revealed that it wants to ban social media for under-sixteen-year-olds. One of its MPs, Catherine Wedd, is submitting a member's bill that would put the responsibility for age verification on the social media companies, and this is supported by the Prime Minister. Stephen Scheeler is the former CEO of Facebook for Australia and New Zealand. Stephen, hi. Is it a good idea to ban the under-sixteens?

Speaker 2: I think it's a good idea in principle. I think the challenge is going to be doing it in practice. But I do think the evidence shows that for younger people there's harm that they need to be protected from. And as we know, kids younger than thirteen are not allowed on social media, though I know many find a way around that. But I think there's good, strong evidence to show that sixteen is a good age for that sort of limit.

Speaker 1: What's the practical problem? Is it difficult for the social media guys to actually confirm a kid's age?
Speaker 2: Well, there's a lot that these platforms can do in terms of using AI and other verification methods to try to discern what your real age is. However, I think when you come to these levels, where there are penalties that apply and there's going to be a bigger regime, you then start to ask, well, okay, how is this going to be verified? Many kids don't have ID. They're not like adults with driver's licences and passports and things. So how can they confirm their age or not confirm their age? What happens when there's a deception? What do the platforms do? What do the platforms do in terms of storing this data? Do we really want them to have this data? There is complexity in making it work. And look, we've all been teenagers before.
You know, you're not supposed to buy alcohol or buy tobacco, and many kids get around this; maybe some of us listening today have. But with social media it's the same, in that there are lots of ways of trying to get around these prohibitions. So I think it's good in theory, but the devil's always in the detail of how the government proposes to enforce this and what penalties it's proposing to put on the platforms if it's not enforced.

Speaker 1: Stephen, you have kids?

Speaker 2: I do, yeah.

Speaker 1: And so what are your rules around social media use?

Speaker 2: For good or for bad, my kids are probably too young to even quite know what social media is. But, and this is common to a lot of folks who've worked at the big social media platforms.
I think I'm not unusual in this. There's a realization that for all the good that the internet and social media and AI have done for the economy and society, for all the kinds of benefits that come from these platforms, the challenge is that for certain people, particularly vulnerable folks in our society such as kids or young people, social media may simply be an overwhelming technology that their brains aren't evolved enough to be able to deal with. And we've done this with tobacco and alcohol and other things, even driving a car, right? You restrict it to a certain age and then you ease people into it, and I think the time has come to do that with social media. So I'm a big believer that thirteen is too young. I think the limit should be higher. I've seen evidence that sixteen is about the right number. Some have advocated even higher, sort of eighteen or twenty. I think there you get into practical challenges, in that we treat people as adults when they turn eighteen.
For most of society, it's probably going to be hard to restrict social media past a certain age. But I think sixteen sounds like a reasonable number. With my kids, you know, in a way they're more aware of social media's challenges than I am, because they come home and tell me, you know, these are the reasons why we need to control our social media usage. Some of the things that are being taught in school, I think, are good. There's a lot more awareness at their age now about the challenges of some of these platforms, the good things and the bad things.

Speaker 1: Stephen, it's really good to talk to you. I really appreciate your time. Thanks so much. That's Stephen Scheeler, former Facebook Australia and New Zealand CEO.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.