Speaker 1 [00:00:00]: Now, Spark has become the first New Zealand telco to block websites that are hosting AI-generated child sex abuse material, stuff that's non-photographic. Now, what that means is it doesn't look realistic; it's cartoon-style images or artistic impressions of child abuse material. Leela Ashford is Spark's Sustainability Director and with me. Hi, Leela.

Speaker 2 [00:00:19]: Hey, Heather, how are you going?

Speaker 1 [00:00:20]: Good, thank you. Now, the Internet Watch Foundation, because they've seen about a four hundred percent increase of this kind of stuff, AI-generated stuff. Are you seeing that?

Speaker 2 [00:00:29]: So we basically see it through the IWF. So we take a list that they compile rather than going and, I guess, finding it ourselves, because it is illegal content. But yes, they've told us that it's been rising by about four hundred percent, and that includes both, you know, photorealistic as well as non-photographic images and videos, which is really concerning.

Speaker 1 [00:00:50]: Yeah, and so are they the ones who tell you what websites to block?
Speaker 2 [00:00:54]: That's right, because, you know, Spark can't have its people obviously going and trawling for that kind of material; it'd be illegal for us to access it. So the way that it works is the IWF has its own organisation in place. It has appropriate people who are trained to identify this content. They create a list that, it's automated, comes into our network, and then we block it, and that list gets updated around twice a day, so it's very current.

Speaker 1 [00:01:21]: How many websites have you blocked?

Speaker 2 [00:01:24]: We don't have specific tracking of it at this stage because we've just brought it in. But the IWF list itself can be anywhere from thousands to tens of thousands, but because it's updated twice a day, that number does change over time.

Speaker 1 [00:01:42]: What about a site like Twitter? Would you go as far as to block that?

Speaker 2 [00:01:47]: No. So this is the challenge with network blocking, it's a bit of a blunt instrument. So the only thing that Spark can do is block at a total network level, so that means we block a whole domain or a whole website. So when the IWF says to us, this whole website is full of this awful material, then we block it. But if it was something hosted within a Facebook page or a post, for example, the only tool we have available is to block Facebook for everyone on the Spark network, which obviously we can't do. So you do need a number of different interventions to tackle this issue.

Speaker 1 [00:02:22]: But that is, though, Leela, I guess, I mean, that shows just the extent of what we can do, which is that we can't get rid of it all, right? Because even if we've blocked every single bad website out there, there will still be people who use Twitter to share this kind of thing.

Speaker 2 [00:02:34]: Absolutely, and I think if there are people who want to access this material, they will. They will get around our network blocking with certain tools. But what we're trying to do, I guess, is to protect particularly kids, but any of our customers, from inadvertently stumbling across this, because this content isn't just on the dark web, it's across the internet.

Speaker 1 [00:02:56]: Hey, are you worried?
Speaker 1 [00:02:58]: Unrelated, but are you worried about the 3G shutdown?

Speaker 2 [00:03:01]: We're doing a lot of work on the 3G shutdown. Obviously, this is, you know, the old version, the old G, and we're now into 4G and we've got 5G coming.

Speaker 1 [00:03:12]: When are you going to shut it down?

Speaker 2 [00:03:15]: Well, we've been communicating to our customers who are using devices that are impacted by 3G. So you'll start to notice you're either getting direct messages, or when you start to make a call on your phone, it'll start to say, hey, you're dialling from a device that is impacted. There are devices, like what you're talking about, that are not necessarily connected to us, like IoT sensors or medical alarms, and generally those businesses are managing the communications. But we're working very closely with them to ensure that we're giving everyone a really long heads-up before it shuts down.

Speaker 1 [00:03:51]: Next year. Yeah, hey, thank you very much, Leela, look after yourself. Leela Ashford, Sustainability Director at Spark. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.