Speaker 1: Andrew Carpenter. I've spoken to him a number of times on the program. He is certainly concerned with child sexual abuse and has worked with people to have the government take the superannuation of convicted offenders to be used as compensation for those affected by this heinous crime. Every week he puts up figures on child sex abuse cases on his Instagram page, and tragically it's high enough as it is. It usually sits at around twenty to thirty per cent, but he had a look at the list last Friday and posted today: forty-eight point two seven per cent of people charged with crimes in the District Court in South Australia, forty-eight point two seven per cent, relate to the sexual abuse of a child. And that is just untenable in this society, I would say. Let's have a chat with child protection expert, UniSA Professor Elspeth McInnes. Professor, good morning, thank you for your time.

Speaker 2: Oh, it's a pleasure. Thank you.

Speaker 1: That's a horrendous number. One in two cases, effectively.

Speaker 2: Absolutely. But don't forget, it tells us something about how the criminal justice system responds to these cases, and it also tells us something about the prevalence of this kind of crime in our communities, and there's work to be done on both sides of that fence. It's really shocking and terrible.

Speaker 1: What happens at this point? Is the government alerted when police, I presume, charge someone with a crime like this? Is the Department of Child Protection notified?

Speaker 2: Look, there are protocols of communication around evidence and investigation between Child Protection and police as a matter of course, so that would normally have happened in the lead-up to criminal charges being laid, because Child Protection are often bound up in the processes of investigation alongside police.

Speaker 1: It points to the fact that something is seriously wrong in society, doesn't it? Children are vulnerable and are being preyed on.
Speaker 2: Absolutely. And as we've been hearing in politics and the media, social media is strongly implicated in this. It provides a gateway to children for identification and grooming by offenders online, and these days, with AI, crimes can be committed without any physical contact of any sort between the victim and the offender. We've seen it happen with people having their images superimposed on pornographic images and so on, through to offenders actually meeting up with children, or gaining access to them, and physically offending against them. One of the problems, of course, is the ubiquity of social media and the ubiquity of screens, which has led to a whole expansion of opportunities to produce child sexual exploitation material, both in getting access to children and in being able to disseminate it through global networks on the dark web. And of course there are elements of capitalisation, of making money from producing child sexual exploitation material. So there is this explosion that is part of the technology of our current society. On the other hand, that very willingness to record horrendous images of offending on screen has in fact increased the detection of offenders who have child exploitation material digitally recorded. It's no longer relying on some poor little kid being expected to have their voice prevail against an important and admired offender. Look, it's located on his phone, he's got it on his computer. So in a sense, the flip side of the digital revolution is that the evidence is much more accessible.

Speaker 1: Too right, they're implicating themselves with that. You know, this comes to the heart of what you said at the beginning about the ease of the technology, and to the heart of what the Social Media Summit held last week hopes to achieve, and that is to raise the age of children on social media.

Speaker 2: Yes, well, that of course has a whole range of pitfalls, and the continuum extends through to platforms putting in age verification procedures.
And of course the problem is that if you remove social media altogether, then there's no opportunity to learn how to use it, how to be critical of it, how to manage it safely. In a sense you're stopping that capacity, and it's an important set of learnings to help people get through the perils and the affordances of social media. So there's that end of the spectrum of consideration, rather than a complete ban. And parallels have also been drawn with the fact that in some jurisdictions we will put children who are ten years old in jail, when they are experimenting with the world and playing out what's happening in their lives, yet we're saying you can't go on social media till you're in your late teens. There's a huge discrepancy there in how we see the vulnerability and development of children, and in why there is that huge difference between the age at which we're prepared to throw them in jail and the age at which we're prepared to educate them in social media. So we have a bit of work to do, I think, as a society, around how we see children and how we protect them. And the evidence, as you say, is pretty clear that too many children are in fact not being protected and are being sexually exploited, and we've got to address that end of the equation as well. There has been research by Professor Michael Salter at the University of New South Wales suggesting that, I think, up to one in ten men in our society have an engagement with child sexual exploitation material. That's huge. Why is that happening, and how do we work with those people who may become offenders to change that behaviour and, in the process, protect children? So I think we've got a lot of work to do as a society here.

Speaker 1: Forty-eight point two per cent. That should make the government sit up and take note. I don't know how many people that is, but I imagine it's some every day; there's never a day where it's zero.
So if it's twenty to fifty per week, I mean, there shouldn't be any, of course, but that many per week is incredible, and the government needs to pay attention. I don't know what they can do beyond the policies already in place, but that number today is ridiculous.

Speaker 2: Well, they have in fact just introduced, I think, indefinite imprisonment for child sex offenders. So they have, as I understand it, been toughening the legislation in ways that have in fact created some controversy in legal circles, because it is an intrusion of government into, you know, the rule of law and the courts' ability to determine things. So they have been increasing penalties, and there's a bit of a debate as to whether that in fact makes a difference. I'm not sure it necessarily follows that people are always thinking of penalties when they commit crimes. But it is very concerning, and I think we need to look at our human services for people who are experiencing addictions to screen material, which may include child sexual exploitation material. Of course, people can get addicted to gambling, to adult pornography, to all sorts of things. But do we have the workforce of therapists who can work with people with these addictions? Are they affordable and accessible? Do people know that they're there? It seems to me that we've got to work on all sides of this sort of social problem, not just child protection and police, which are two key agencies, certainly. And I suspect a lot of the detection is technology based, because we're still really bad at listening to the voices of actual children, but we will believe a computer screen and a device with things loaded onto it, and I suspect that's where we are seeing a lot more evidence that is accepted by courts.
But it's evidence that we have a mass of social problems in our society around the desire for child exploitation material, full stop, and we need to have services for people who are expressing those kinds of desires and inclinations, so that they can manage their lives differently.

Speaker 1: Indeed. Appreciate your time this morning, Professor. Thank you.

Speaker 2: Thank you.

Speaker 1: Professor Elspeth McInnes, child protection expert at UniSA.