Speaker 1: The spotlight is again on cyberbullying. The latest figures out today show a thirty-seven percent increase in reports over the last financial year. Despite all the knowledge and talk about cyberbullying and how dangerous it can be, reports are up: just short of an unbelievable twenty-seven hundred actionable reports received by eSafety in the year to the end of June, many involving social media, some involving music streaming services. Let's find out some more from the eSafety Commissioner, Julie Inman Grant, on the line. Julie, good morning.

Speaker 2: Good morning, thanks for having me.

Speaker 1: Online bullying comes in many different forms. Children and parents need to know about this, don't they, clearly?

Speaker 2: Well, of course, bullying has existed from time immemorial, but what makes cyberbullying so insidious is that it no longer stops at the school gate. In the vast majority of cases, cyberbullying is an extension of conflict that's happening within the school gates. But it is invasive and pervasive, and it now follows kids on their phones into their homes.
Speaker 2: And it's very visible to a child and their peers, but often hidden from parents and teachers. So this is why the spotlight on cyberbullying is so important, because we offer the only service in the world that can help children get harmful cyberbullying content down when the platforms fail to act.

Speaker 1: A lot of this depends on kids coming forward, does it not? And the figures out today might be the tip of the iceberg, in that a lot of kids will not come forward to say they're being cyberbullied.

Speaker 2: I'm sure that this is just the tip of the iceberg, but we've actually seen a lot of positive change. When I started this role in twenty seventeen, I did a roundtable with young people and I asked, well, you know, aren't the companies going to do anything, and can we trust a government agency? But really they were more concerned about being a dobber and a snitch. So we spent a lot of time focusing on letting children know that they should be talking to their parents, a trusted adult, a sibling, just to disclose and talk to someone so that it doesn't escalate.
Speaker 2: And now we've seen a real change in behaviors by young people. Ninety-eight percent of them will take some form of action when they're receiving bullying online, and it may be something like blocking or ignoring or muting, but they're also doing more reporting, and about seventy percent are talking to their parents when something goes wrong online. We just want to extend that further. When it's really targeted and really damaging, and the platform it's happening on doesn't respond, we serve as that safety net, we can help advocate on that child's behalf, and we have a ninety percent success rate in doing so.

Speaker 1: Okay, that's excellent. Something you said right at the start: the companies aren't doing anything, social media companies, online companies. Should the government be more proactive here in targeting them, in forcing them to take some action in this regard, to block accounts, to close down content, et cetera?

Speaker 2: Well, that's exactly what this youth-based cyberbullying scheme does. And as I said, it's the only one in the world, you know.
Speaker 2: Of course, they're getting millions and millions of reports of online abuse, and they often have content moderators offshore who have thirty seconds to a minute to determine whether or not something contravenes their terms of service. And a number of the companies will kind of lean on the side of freedom of expression, and freedom of expression is fine, but when it veers into the lane of serious harm targeting a child, they should be taking a different approach. This is why we try and work with them cooperatively through the informal takedowns. But if they refuse to take something down that we think violates our laws and is, you know, threatening, harassing, intimidating or humiliating to the child, then we can issue formal removal notices.
Speaker 2: We can also issue what are called end-user notices to children who are really egregiously and violently targeting other kids. It's almost like a cease-and-desist order that we tend to do in conjunction with the parent community and schools, so that we can tell them to apologize, or write letters to the target, to remove the content, and to let them know that if they continue the bullying behavior, then we can take other forms of action, including fines.

Speaker 1: Okay, the National Week of Action gets underway August the twelfth, so we're in that now. Obviously it's got a few days to go. But what's the important message of this week? What would you like to be the long-term outcome of the messaging you're putting out today?

Speaker 2: Well, I'd just note that over the past four years, with the COVID lockdowns, we've actually seen a three hundred and eleven percent increase in cyberbullying and in reports to us. You can read that positively, in that people are learning more about it, about us and our reporting.
Speaker 2: But on the negative side, we did see parents being a lot more permissive with technology use over the lockdowns, understandably. So we now have kids who are much younger reporting cyberbullying to us. You know, they're on TikTok and Snap and Instagram too young. So we need to make sure that parents are really engaged in their kids' online lives, and we've got tons of parent guides. But the spotlight on cyberbullying is specifically for children, parents and teachers as well, and you can find all this information at esafety.gov.au, and you can also report youth-based cyberbullying there.

Speaker 1: Okay, is the Commission on TikTok as well, trying to get this message out to kids? Snapchat, et cetera?

Speaker 2: Yep, we have a channel called Scroll. We have a Youth Advisory Council, and we have them putting out youth-based messages for youth.

Speaker 1: Absolutely. Is it exciting for kids to go to? Because I've got to say that that title would just make my eyes glaze over if I was about twelve or thirteen. Scroll? Yeah. If you're scrolling through, would you stop a little bit?
Speaker 2: Should it be "doom scroll" rather than "scroll"? Do you think that would be more appealing?

Speaker 1: Well, I don't know, but, you know, as long as kids stop and look at it, that would be the thing. Are they stopping as they're scrolling through?

Speaker 2: Well, again, these are messages by kids, for kids, who've experienced the same, and I think that kind of messaging is much more authentic. And of course we consulted with our Youth Advisory Council on the name of the channel, but we can take a re-look at it.

Speaker 1: Okay, is enough being done in schools, just finally? And I'm sure the Commission would work with educators, and does. And I know at the school that my boys go to, there have been in the past, and I'm sure will be again, sessions with eSafety-type people, you know, cyber people who want to get the message across about safety. And, you know, they're usually six, seven pm; you turn up and there's an hour-long talk about things parents should be aware of. And that's all very useful and really good.
Speaker 1: But more in that space, I suppose; you can never do enough.

Speaker 2: Well, listen, those are often trusted eSafety providers that go into the schools. We deliver free webinars all of the time. I think the real challenge is that parents are so busy, and while they find online safety is one of the perennial parenting challenges, and ninety-five percent of parents tell us that, only ten percent seek out information from eSafety or go to these types of things. So only ten percent, even though ninety-five percent see it as a problem, are really seeking out this information until something goes wrong. So, you know, the schools are really important vectors. And we're working with the Department of Education in South Australia, and we have something called a National Online Safety Education Council.
Speaker 2: The more we can get boots on the ground and empower teachers (and we're providing them with professional training to know how to deal with these kinds of issues when they happen in the classroom), the more we can get to parents and the more we can get to young people themselves, I think, the better off we will all be.

Speaker 1: I think I agree with that. Thank you for your time this morning, Commissioner.

Speaker 2: Thank you so much.

Speaker 1: Julie Inman Grant there, who is the Australian eSafety Commissioner.