1 00:00:00,200 --> 00:00:03,040 Speaker 1: There is a realistic possibility of a terrorist attack in 2 00:00:03,080 --> 00:00:05,320 Speaker 1: New Zealand, and the most likely way that it could 3 00:00:05,320 --> 00:00:08,039 Speaker 1: happen is a lone actor. That's according to the latest 4 00:00:08,080 --> 00:00:11,440 Speaker 1: security assessment by our intelligence services. Andrew Hampton is the 5 00:00:11,480 --> 00:00:15,200 Speaker 1: Director General of the SIS. Hey, Andrew. Speaker 2: Hi, Heather. Speaker 1: Is 6 00:00:15,240 --> 00:00:18,040 Speaker 1: the possibility of a terrorist attack like this more or 7 00:00:18,160 --> 00:00:19,680 Speaker 1: less likely than it was last year? 8 00:00:22,160 --> 00:00:26,800 Speaker 2: It is consistent with last year's assessment, which is that there 9 00:00:26,840 --> 00:00:30,840 Speaker 2: is a realistic possibility of a terrorist event. But the 10 00:00:30,920 --> 00:00:35,200 Speaker 2: threat environment continues to evolve, and one of the things 11 00:00:35,240 --> 00:00:39,000 Speaker 2: that has changed over the last year is the increased 12 00:00:39,120 --> 00:00:42,640 Speaker 2: prominence of what we call people with mixed, unstable, 13 00:00:42,640 --> 00:00:48,080 Speaker 2: or unclear ideologies or motivations. So these are people who 14 00:00:48,560 --> 00:00:53,360 Speaker 2: are often young, radicalized online, fixated with violence. They may not 15 00:00:54,320 --> 00:00:59,200 Speaker 2: have strong affiliations to a particular ideology, but they're looking 16 00:00:59,280 --> 00:01:01,360 Speaker 2: for something to justify undertaking a violent act. 17 00:01:01,400 --> 00:01:03,760 Speaker 1: Right. And these kinds of people are very 18 00:01:03,800 --> 00:01:07,160 Speaker 1: hard to stop, aren't they, because they act by themselves.
19 00:01:07,840 --> 00:01:12,760 Speaker 2: Well, fortunately, you know, we're talking about only a very 20 00:01:12,800 --> 00:01:16,120 Speaker 2: small number of individuals who we know about, who we're 21 00:01:16,160 --> 00:01:19,440 Speaker 2: concerned about, and we're keeping an eye on them. But yes, 22 00:01:19,520 --> 00:01:23,559 Speaker 2: you're right. The risk with a self-radicalized individual is that 23 00:01:23,840 --> 00:01:28,640 Speaker 2: they're likely to be acting alone. There won't necessarily be 24 00:01:28,720 --> 00:01:32,039 Speaker 2: much intelligence warning, and they'll use capabilities that are readily 25 00:01:32,040 --> 00:01:35,920 Speaker 2: at hand, such as knives or vehicles. That's one of 26 00:01:35,920 --> 00:01:39,120 Speaker 2: the reasons why we've put this threat assessment out, so 27 00:01:39,160 --> 00:01:42,720 Speaker 2: that the public know what the signs of someone who 28 00:01:42,720 --> 00:01:47,400 Speaker 2: may be on that mobilization pathway look like, and so 29 00:01:47,480 --> 00:01:50,160 Speaker 2: they can spot those signs and raise them with the police. 30 00:01:51,160 --> 00:01:55,880 Speaker 2: There's a range of things to look at, 31 00:01:55,960 --> 00:02:00,919 Speaker 2: and we released a couple of years ago a report 32 00:02:01,040 --> 00:02:06,440 Speaker 2: specifically on that. But part of it is people's online behavior. 33 00:02:06,680 --> 00:02:12,520 Speaker 2: Part of it is individuals maybe looking at their own security, 34 00:02:12,600 --> 00:02:15,519 Speaker 2: you know, how they obscure what they're doing. Sometimes it 35 00:02:15,639 --> 00:02:19,800 Speaker 2: may be radical changes in behavior from people. It may 36 00:02:19,840 --> 00:02:24,600 Speaker 2: be about how they espouse particular ideologies. So there are 37 00:02:24,840 --> 00:02:29,640 Speaker 2: a range of things to look for.
The key message 38 00:02:29,760 --> 00:02:34,840 Speaker 2: we want to get across is that if, within 39 00:02:34,919 --> 00:02:39,160 Speaker 2: your community, even within your family, you're concerned about someone 40 00:02:39,200 --> 00:02:42,120 Speaker 2: because they may be engaging with extremist material, they may 41 00:02:42,200 --> 00:02:45,119 Speaker 2: be talking about wanting to undertake some sort of act, 42 00:02:45,280 --> 00:02:49,840 Speaker 2: or they may be seeking to associate with people online of 43 00:02:50,360 --> 00:02:53,079 Speaker 2: concern, that you do raise that 44 00:02:53,240 --> 00:02:57,120 Speaker 2: with us or with the police. In 45 00:02:57,240 --> 00:03:00,960 Speaker 2: most cases, people who exhibit these 46 00:03:01,040 --> 00:03:04,959 Speaker 2: online behaviors don't end up undertaking a real-world act, 47 00:03:05,040 --> 00:03:05,799 Speaker 2: but sometimes they do. 48 00:03:06,880 --> 00:03:09,600 Speaker 1: On the spying, the report says that you guys busted 49 00:03:09,639 --> 00:03:12,320 Speaker 1: a unit gathering info for a foreign intelligence service, and 50 00:03:12,360 --> 00:03:14,280 Speaker 1: then you sent a message to them to stop. Did 51 00:03:14,320 --> 00:03:14,720 Speaker 1: they stop? 52 00:03:16,960 --> 00:03:23,160 Speaker 2: They definitely took notice.
What we try and do 53 00:03:23,760 --> 00:03:27,920 Speaker 2: is make very clear to those countries that are undertaking 54 00:03:27,960 --> 00:03:31,040 Speaker 2: that type of activity that we know what they're up to, 55 00:03:32,520 --> 00:03:36,440 Speaker 2: that we are monitoring them, and that while 56 00:03:36,440 --> 00:03:38,920 Speaker 2: it may not be illegal activity, it is 57 00:03:38,960 --> 00:03:43,160 Speaker 2: activity that's inconsistent with our values and our national interest, 58 00:03:43,640 --> 00:03:47,560 Speaker 2: and for them to cease doing it. And as you 59 00:03:47,560 --> 00:03:51,680 Speaker 2: would expect, they don't like that very much. Sometimes, though, 60 00:03:52,360 --> 00:03:55,720 Speaker 2: it just leads to them trying to, you know, obscure 61 00:03:55,800 --> 00:04:00,440 Speaker 2: their activities more. But in that particular case, 62 00:04:00,520 --> 00:04:01,440 Speaker 2: yes, they took notice. 63 00:04:02,080 --> 00:04:03,760 Speaker 1: Andrew, thank you. I really appreciate you having a chat 64 00:04:03,800 --> 00:04:07,280 Speaker 1: with us. Andrew Hampton, Director General of the SIS. For 65 00:04:07,440 --> 00:04:10,680 Speaker 1: more from Heather du Plessis-Allan Drive, listen live to Newstalk 66 00:04:10,760 --> 00:04:13,600 Speaker 1: ZB from four pm weekdays, or follow the 67 00:04:13,680 --> 00:04:15,280 Speaker 1: podcast on iHeartRadio.