Speaker 1: Details about the Tom Phillips case will remain suppressed for at least another month. Media, police and Oranga Tamariki have been prevented from publishing certain details related to the case since Tom Phillips and his kids were found last Monday. Now the High Court in Wellington will hear the matter again in mid-October. Meanwhile, the details are spreading across social media with no known repercussions so far. Nikki Chamberlain is a senior lecturer at Auckland University's Law School and is with us. Hey, Nikki.

Speaker 2: Hi, Heather.

Speaker 1: When the court considers the injunction, like they did yesterday and like they will do in about a month's time, do they consider the fact that the information is spreading on social media?

Speaker 2: Yes, they will. For an injunction based on private information, there are normally three things the court looks at: is there a serious question to be tried, that the information is private; where does the balance of convenience lie; and what is in the interest of justice?
Speaker 2: Now, that third criterion, the interest of justice: the court will look at how widely known the facts are that they are trying to suppress, and if it is seen to be fairly common knowledge among the public, then it will be seen as unfairly prejudicial to keep the injunction in place. So that is something the courts will look at. And in fact, there is precedent, interestingly, from a case back in the eighties where there was an injunction in place, but because information was published by media outlets in New Zealand and overseas who were not subject to the injunction, the court ended up withdrawing the injunction because they said it was unfair on those media outlets that were bound by it. So I think what we'll see here is a consideration of just how widely known these facts are and whether it is oppressive to continue the injunction.

Speaker 1: So in your opinion, then, what would you say? If this continues to spread at the rate that it is, the court will eventually be forced to lift it?

Speaker 2: I think that they will. I think the problem is going to be, well, there's a twofold problem, Heather.
First, 38 00:01:59,120 --> 00:02:03,400 Speaker 2: you've got over news media, right, So the injunction applies 39 00:02:03,520 --> 00:02:06,360 Speaker 2: to those parties who are subject to it, which includes 40 00:02:06,400 --> 00:02:09,400 Speaker 2: the New Zealand media. If you have overseas media who 41 00:02:09,440 --> 00:02:12,280 Speaker 2: are discussing matters that are subject to the injunction. You're 42 00:02:12,280 --> 00:02:15,160 Speaker 2: going to have issues and enforcing the injunction overseas because 43 00:02:15,200 --> 00:02:17,080 Speaker 2: you're going to have to get the injunction the court 44 00:02:17,160 --> 00:02:20,920 Speaker 2: order recognized in that foreign jurisdiction to have any effect 45 00:02:20,960 --> 00:02:24,840 Speaker 2: and be enforced. That's issue. One issue too, is that 46 00:02:24,919 --> 00:02:28,840 Speaker 2: the injunction applies, as you said, to media organizations, the police, 47 00:02:29,639 --> 00:02:33,680 Speaker 2: and araga tamariki. The problem is is that there might 48 00:02:33,720 --> 00:02:37,080 Speaker 2: be individuals right within New Zealand who are not subject 49 00:02:37,160 --> 00:02:39,200 Speaker 2: to the terms of the injunction, who might be posting 50 00:02:39,200 --> 00:02:43,760 Speaker 2: this on social media. Now, how do you stop that occurring. Well, 51 00:02:44,280 --> 00:02:48,680 Speaker 2: unless you can get those individuals subject to the injunction itself, 52 00:02:49,440 --> 00:02:51,920 Speaker 2: you need to be able to somehow get the social 53 00:02:51,960 --> 00:02:55,560 Speaker 2: media companies as a party to the injunction. And the 54 00:02:55,600 --> 00:02:58,320 Speaker 2: issue around that is you'll have your social media companies, 55 00:02:58,320 --> 00:03:00,640 Speaker 2: which a are overseas, so it's hard to enforce an 56 00:03:00,720 --> 00:03:04,360 Speaker 2: order against them. 
Speaker 2: But (b), what they will say is, well, look, we need reasonable notice, because we can't possibly surveil millions of people on social media. So as long as you give us reasonable notice that there's information up on, you know, the Facebooks of the world and Twitter, et cetera, then we can pull down the information. So there are a number of problems within the scope of the injunction and making sure that it stays in effect. You know, we have a saying in the United States, Heather, you've probably heard of it: the horse has already bolted; it's too late to shut the barn door now. And I kind of feel like in this situation it may get to the stage where that's the position, in layman's terms, because, you know, injunctions are great, but you've got to get them issued immediately to stop the leak of the information in the first place.

Speaker 1: Right. So, yeah, and I can imagine also just the logistics of, like, a Facebook or a Twitter or whatever having to go and then track down the comments and get them deleted.
Speaker 1: That must be a reasonably... Is that a difficult task, or is that an easy thing for them to do?

Speaker 2: Well, the law at the moment actually says, so for example, in defamation law, if you make a complaint that somebody is defaming you, they have to have reasonable notice and a reasonable period of time to remove it. And I would assume that something like that would apply in this situation, unless there were laws created to circumvent that. Now, if you did that, you could have arguments, and again we're getting into some policy issues here, that it's a cost of doing business, right? So given that Facebook wants to operate in New Zealand, we're going to enact laws which will be more prohibitive on them and put the onus on them to monitor. Now, how cumbersome is that going to be? Well, I imagine it could be quite cumbersome. But again, you know, with technology and various ways to search what people are posting about, I'm sure you could have buzzwords, right, which could filter through posts.
Speaker 2: So that's something that a social media technologist would be able to comment on.

Speaker 1: Brilliant. Nikki, it's been fascinating to talk to you. Thank you very much. Nikki Chamberlain, senior lecturer at Auckland University's Law Faculty. So she thinks the injunction will be lifted at some stage. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.