Speaker 1: Now, if you're in the shops you may notice this: a notable drop in violent crime being reported by retailers this year. Retail crime intelligence company AURA has found the use of weapons is down twelve percent, violent events down six percent, and threatening events down five percent. Meanwhile, in Australia, the same kinds of incidents are trending upwards. AURA Vice President Nick McDonnell is with me this afternoon. Nick, good afternoon.

Speaker 2: Afternoon, Ryan. How are you?

Speaker 1: Very well, thank you. So how do we know that? Is this just in the areas where your cameras are recording?

Speaker 2: Yeah, so we're not a camera company. A retailer will put the information into our system, like a database of information that they've already captured themselves, and they'll put in the characteristics of an offense or an incident in their store. And what they'll do is they'll tag it, whether it had a weapon involved, or it's been violent or threatening, those sorts of things, and that's where the data comes from.
Speaker 2: And it is obviously for people who are customers of ours, but that's quite a large spread across the country.

Speaker 1: And what do people tell you? Like, why is this happening, do you think?

Speaker 2: Well, I think New Zealand has been very much a leader in terms of its adoption of technology in this space. So one of the big things is ten percent of people are causing sixty percent of the crime in retail, and we know that that ten percent are four to six times more likely to be violent. But you have to surface that information in order to find the patterns and in order to find the people causing the most harm. And retailers have really led the way in leaning into it, and then the police on the other side of the system have also leant into this earlier than other markets and are really collaborating through the platform. But it's that recognition that retail crime is a really high-volume crime type. You can't overwhelm the system with every incident. So if you use the technology to really make sense of the information, that allows you to focus your resources more effectively.
Speaker 1: And does that mean we are identifying people? You said the ten percent. Are we identifying the repeat offenders and not letting them into shops, and that's why it's coming down?

Speaker 2: It can be a combination, right. So you are identifying them, you are attributing to those people the events that previously would have been seen as isolated incidents. So you're actually saying this one person has caused ten events, rather than there's ten different people causing ten events. And that then allows police to be able to go pick up or investigate those high-harm offenders that are having the most impact. And if you're picking up those people, you're taking them out of circulation, or you're doing whatever else you need to do to prevent further crime.

Speaker 1: Nick, really appreciate that update. Thank you. That's Nick McDonnell, who is AURA's Vice President. For more from Heather du Plessis-Allan Drive:

Speaker 2: Listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.