Speaker 1: Act and National are squabbling in public again. This time they're fighting over the planned ban on social media for under-sixteens. What's happened is Act has rejected the Nats' member's bill, calling it unworkable. But this morning Chris Luxon said David Seymour was coming around to the idea: "They've sort of started to shift their position a little bit on the weekend, which is good. So let's just see where we get to with them. There might still be a pathway through all of that." And then David Seymour said no, they're not shifting their position, and he's with me now. Hello, David.

Speaker 2: Hey, Heather.

Speaker 1: Okay, run me through it. What's your argument?

Speaker 2: Well, first of all, I think it's a little unfair to say we're squabbling. We have two parties that have different positions, and I actually think that's healthy. I think it shows a political system that is maturing. As far as the substance of the policy goes,
I'd just make the point that you can believe that there's a very real problem, and that the internet is a place that is having very bad effects on young people in New Zealand, and yet be sceptical that something as simple as the ban that was proposed just last week is actually going to work. And I think, in a way, the fact that the bill came out last Wednesday and by Sunday Chris Luxon was saying, well, actually, we need to do more work, kind of proves the point I'd been making. And for my money, I think one of the most helpful things that could happen is that the Education and Workforce Select Committee could open up a full, open, transparent inquiry. Get all the technologists, the principals, the parents, the young people themselves, the child psychiatrists, actually the people in other countries that are trying to do this, and by the end of it you might actually find that you look at the whole thing quite differently and have a much better solution with a lot more political consensus. That would be the way to take the problem seriously.

Speaker 1: If it is
possible to technologically ban kids from social media, do you support it?

Speaker 2: If it's technologically possible to withdraw kids from the harms, I'd certainly support that. But I think we've got to be clear about this: there are basically three types of harm. There's inappropriate content; there's inappropriate contact, bullying, predatory behaviour and so on; and then there's the nature of some platforms that are purposely made to be addictive, because, just like any other media, a radio show for example, they're competing for eyeballs and ears, and as a result they make it very addictive. Well, what you're really asking, and I don't want to sound like I'm avoiding the question, but the question isn't can you ban them from certain platforms; the question is can you create a solution that prevents the various harms coming to young people.

Speaker 1: That's not what I'm asking. I'm quite deliberately asking you this question: if it was possible to ban under-sixteens, like, if it was technologically possible to ban them, would you support that?

Speaker 2: But I'm sorry, I'm not trying to avoid the question.
I just believe that you've got to ask yourself, what are you banning them from? From the entire internet? Certain specific platforms? Okay, so from specific platforms. Then this is why the question I want answered is: what happens next? Is this going to drive young people onto the dark web, where they're going to do stuff that's actually even worse, can you imagine, and no longer be prepared to talk to their parents about it because they've been told that what they're doing is illegal? So I just ask the question: what practically works, and what are the outcomes? I think that's the kind of quality discussion we need if you, if indeed you believe it's an important issue, which I do.

Speaker 1: Okay, David, listen, thank you very much, appreciate your time. That's David Seymour, the Act Party leader and also Associate Education Minister.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.