Speaker 1: Meta says it wants to bring anti-scam measures to New Zealand, but it might be a while until they're actually in place. In Australia, social media companies have had to take reasonable steps to detect and prevent scam attempts, and our government is now considering introducing similar rules. To talk about this, I'm joined by the chief online safety officer at Netsafe, Sean Lyons. Hello, Sean.

Speaker 2: Hello, how are you doing?

Speaker 1: All right. What are they saying they're doing?

Speaker 2: So they're talking about mirroring the same sort of controls that they have over in Australia here. They originally said that wasn't possible, and I think they're still saying it might be some time. But they're talking about instituting some defences around ads that claim to be, you know, get-rich-quick, finance-based ads, making sure that anyone who advertises in that way is registered with our FMA over here, and basically clamping down on the adverts that some of us are seeing on social media that are leading us down a very unfortunate path.

Speaker 1: Yeah.
I saw one the other day where it was Mike Hosking telling me to invest money, and I thought, no, he's never going to give away those sorts of secrets. Tech giants and banks are at loggerheads over who should be responsible for online scams. Does this show that Meta actually thinks they do have a responsibility to stop them?

Speaker 2: Well, I mean, I think what it is, is a pretty stark recognition that they think they can have an impact, and that means they must be looking at these things. So it is encouraging that they are looking in that direction. It is encouraging that they're talking about trying to clamp down on those, you know, celebrity-based scam adverts that people are seeing. Whether or not they think they're the entire solution, and they're probably not, if it's on their platforms, we should definitely be pushing them to do more, so that we are not put in harm's way whilst we're using their platforms for whatever it is that we might choose to be doing.

Speaker 1: Australia got these protections because they actually applied pressure with a code of conduct.
Do we need to apply some regulatory pressure as well?

Speaker 2: Well, I mean, I think a code of conduct certainly seems to be working as the basis that ties all of this together over there, and I think we are talking about one in this country; Minister Bayly has talked recently about a code of conduct here. I think it can work. I think the important thing for us is that if there is a code, it needs to be focused on the platform providers and the banks and the telcos and all the people that play a role in it, but it has to have the people who are the victims of these scams at the heart of it. So as long as it helps clear up that confusion, as long as it means that if we are scammed, we know how to take action, and we have some surety that action can be taken, then I think it could well have some benefit in this country too.

Speaker 1: Finally, why are we behind Australia again in this respect?

Speaker 2: That's a question that has to go to Meta.
That's about the way that they roll things out to different markets. Maybe that's because they have different legislation over there. There are things where we get to be first, and sometimes, unfortunately as in this situation, we're not. But at least we're on that radar, at least we're on that path, and hopefully before too very long we'll have the same protections here in New Zealand.

Speaker 1: Sean, thank you so much for your time. Sean Lyons is the chief online safety officer at Netsafe.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.