Speaker 1: Now, Foodstuffs. Speaking of crime, Foodstuffs reckon that their facial recognition trial has been a success. They've been running this thing for six months, been keeping an eye on it, running it in twenty-five North Island stores. They reckon harm was reduced by sixteen percent and it prevented more than one hundred serious in-store incidents, including assault. Now, the General Counsel for Foodstuffs North Island is Julian Benefield. Hi, Julian. Is this about what you were expecting?

Speaker 2: Hi, Heather. Well, it was ultimately a trial to work out whether facial recognition technology could help keep our people safer by preventing repeat offenders from doing more harm. We've seen huge increases in retail crime and we felt we needed to do more to keep our people safe. It was a genuine trial where we wanted to learn whether we could use the technology safely to do so. We now have the final report from the trial's independent evaluator, who has found that facial recognition has reduced serious harm in the trial stores by an estimated sixteen percent and has strong public support justifying the use of the technology.
Speaker 1: So with the nine who were misidentified, were all of them trespassed?

Speaker 2: No. So, firstly, you know, we didn't want any misidentifications through this process, but unfortunately there were nine situations where there was a misidentification. Out of the nine, there were only two customers who were asked to leave the stores. In all of those situations, we owned our mistake and acknowledged our mistake and apologised as soon as we could. And in each of those situations, it was a human error. It was a manual failure and not a failure of the systems. We've apologised to the customers, and we've looked very hard at the learnings from those situations and implemented training and improved procedures as a result.

Speaker 1: See, one of them, the one that's based in Rotorua, is going to fight you guys even further. Did you see that?

Speaker 2: Yeah, it's probably not something I can comment on at this point in time, but ultimately we know it was an unfortunate situation, as has been widely reported previously. We've acknowledged our mistake and apologised and thought very deeply about the situation and implemented...

Speaker 1: Let's be honest about it. Let's be honest about it. If you're going to ask a couple of people to leave who shouldn't leave, it's probably worth it to stop one hundred assaults or one hundred incidents, isn't it? Like, you're going to have some... this is going to happen.

Speaker 2: Yeah. And look, this was a genuine trial that we knew we would learn from. We didn't want any misidentifications, but to give it context, that was, you know, nine situations out of seventeen hundred and forty-two alerts, right? So we got it right over ninety-eight percent of the time. But we have learned from it and we will improve.

Speaker 1: Yeah. What happens next? So do you guys get to make the final call on whether you can actually roll this out across the country, or is that with the Privacy Commissioner?
Speaker 2: Yeah. So a couple of months ago, at the end of the trial, we decided to continue using facial recognition technology in the twenty-five trial stores under the same privacy and security protocols, and that was based on the preliminary findings of the independent evaluator at that point in time. At this stage, we're now waiting for the OPC to publish their public inquiry into our trial. We want to hear what they have to say, so we will review what they have to say about their own inquiry before deciding on any further use of the technology. But today we just wanted to be transparent about where we landed with our final evaluation.

Speaker 1: So who gets to make the call as to whether you roll it out? You or the Privacy Commissioner?

Speaker 2: Well, as a business, we have felt that, you know, the harm is so bad that we need to do more about it. So we as a business made a decision to continue using it.

Speaker 1: So you guys get the final say. You can decide if you want to do it or not.
Speaker 2: Well, you know, we've decided to use it in the twenty-five trial stores, but we do respect the role of the OPC. We've worked constructively both before the trial and then through the trial as they've done their inquiry, and we've done store visits and had, you know, constructive conversations, and we would like to hear what they have to say as part of the public inquiry before we make any further decisions.

Speaker 1: Okay. What are you going to do about this brown onion business?

Speaker 2: Yes, so we are aware of that situation. I mean, ultimately it is theft, but look, we trust our customers to do the right thing.

Speaker 1: Although the evidence would suggest that you shouldn't.

Speaker 2: Look, we haven't noticed any unusual increase in brown onions going through self-checkouts. We trust our customers to be honest, and most people do understand that theft is theft and will do the right thing.

Speaker 1: Yeah, all right, Julian, listen, thanks very much, really appreciate your time. That's Julian Benefield, the General Counsel for Foodstuffs North Island.

For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.