Speaker 1: G'day, welcome to the Happy Families podcast. Something different today as we kick off your new week. Coming up in the next six weeks or thereabouts, on December tenth, we see a massive change in Australia with minimum age legislation for social media coming in. All children under sixteen will no longer have access to accounts or be algorithmically followed online by the major technology companies. Today, my very special guest on the podcast is the person in charge of regulating, developing, and essentially rolling all of this out. She is the nationwide, federal-government-appointed eSafety Commissioner, Julie Inman Grant. Julie has been on the podcast a number of times, and she consistently provides us with the very best information about what's happening to keep our kids safe. Today we have a much longer conversation than normal on the pod, because there is so much to cover and it matters so much: everything from who's included in the ban, what parents are supposed to know about it, how fines are going to be enacted, what to do about explicit content online, whether that's covered or not and what the government's doing about that, and even those nudifying apps, where we're seeing on a weekly basis schools finding kids using them and causing harm. We're going to cover all of that and more. This is a conversation you don't want to miss. It's coming up right after this.

Speaker 1: Hello and welcome to the Happy Families podcast, real parenting solutions every single day on Australia's most downloaded parenting podcast. We are a little over a month and a half away from the arrival of the brand new legislation, world-first legislation. There are a lot of countries who are kind of jealous about what's going on in Australia when it comes to social media minimum age legislation: limits on kids under the age of sixteen being able to access the social media platforms because of the risks associated with it.
And there is nobody better to talk to about this than the person who is in charge of regulating it all, the eSafety Commissioner of Australia, who is a regular conversationalist on the Happy Families podcast. What a privilege to have you back, Julie Inman Grant. Thank you for your time today.

Speaker 2: The privilege is all mine, Justin.

Speaker 1: I can't even imagine how busy you are at the moment. Well, actually, I do have some sense of how busy you are, because when I was speaking with Amy, your assistant, it took us a long time to lock this time in. So let's get started straight away. You're in charge of the upcoming social media minimum age legislation. It starts on December tenth. There is still, unfortunately, and probably more than is necessary, a lot of confusion out there. Can you step it out in really simple terms? What does it mean for parents and teenagers?

Speaker 2: eSafety, being the first online safety regulator in the world, we're used to writing the playbook as we go along, and this was probably, I can definitively say, the most novel, complex piece of legislation that I've ever seen, and so it's taken a lot of mental gymnastics to actually figure out how this will work in practice. But I guess what parents really need to know is that the whole idea is to take the pressure away from parents and kids themselves; the onus of responsibility for keeping them off social media until the age of sixteen falls squarely on the platforms. As you would understand, Justin, these companies, particularly the larger ones, where the vast majority of young people are spending their time, have vast resources and the best minds. They're using advanced AI tools, and the vast majority of them do use age inference tools today.
The government also completed, in August, the Age Assurance Technology Trial, where about fifty different age verification technologies were tested for accuracy and robustness, for whether they showed bias or captured the broader range of ethnicities across Australia, and for whether they were privacy preserving, and about ten of those scored the top technical readiness rating or just below. So this gives all these major companies additional technologies to use. So the way that it will work is we're going through something called the assessment and self-assessment process. But I think you can be reasonably comfortable that the major platforms that the Prime Minister mentioned, and that kids are using today, YouTube, Facebook, Instagram, TikTok, Snapchat and X, are likely to be covered. At this time, we're still engaging in some conversations around procedural fairness. Later this week we'll announce a number of organizations that probably will not be in the ban. There are two exemptions: one is for messaging sites and the other is for online gaming platforms. But as you would understand, there really is no bright defining line. A lot of messaging services, for instance, have social media functionality, and online gaming, if you're thinking about Roblox or Fortnite for instance, has lots of social media, chatting and interactive features. So is their primary or sole purpose as an online gaming platform or a social media platform? These are the processes we're going through now. But either way, we have a range of information coming out this week for parents and young people themselves. We consulted with about one hundred and sixty different organizations, including with young people specifically and with parents, so that we could tailor these materials to what they want. So everything from conversation starters to the checklists you need to go through with your kid: for instance, how do you start having the conversation about what platforms they are on today and get them to start weaning themselves off social media?
This is going to be a monumental event for a lot of young people. How do you download their URLs and their pictures? How do you make sure that they're still connecting with their friend group, particularly in December, when school is breaking up for the school holidays? We'll explain how to set up sort of a group messaging chat, for instance. If there are influencers you approve of that your kid might like to follow, then, you know, look for their website and bookmark it now. And then we've been privileged to work with groups like Beyond Blue, Headspace, ReachOut and Kids Helpline to make sure that all the mental health resources and the help that kids need are readily available in our resources, and the language is kind, caring and compassionate.

Speaker 1: Julie, there are so many moving parts here. I want to go back to the platforms themselves for just a moment, as much as there are so many resources that are on the cusp of arriving, and certainly over the next month or so they'll continue to land to guide parents through this. The most common question that I get when parents are asking me about it, other than do I or do I not favor the changes, and let's be clear, I really do, is: what apps are included? So you've highlighted, and the Prime Minister has mentioned as well, the major tech companies. But I'm just, I'm thinking, and I don't want to get into the weeds too much here, but I'm thinking about things like Discord. Discord is a messaging app, right, but it's very much about social media, and we know that there's a lot of really, really damaging stuff that happens there. And then I'm thinking about some recent filings that Facebook made in the United States, where they argued that they are not a social media company now; they're, I mean, a short form TV company. They're saying that they're not a social media company because people are sharing content with strangers, and people are consuming content from strangers, and it's usually all short form video.
There's only about ten to twenty percent of the interactions that happen on Facebook now that are technically social media interactions. The rest of it's short form.

Speaker 2: You mean Facebook or YouTube?

Speaker 1: Facebook. I read something about Facebook doing that just recently. And so when we've got these organizations, who are... I guess what I'm saying is, it's a constantly moving target.

Speaker 2: These lists will be dynamic, and you've just encapsulated what the challenge has been for me and my team. I don't have specific declaratory powers in this legislation to say who's in and who's out. I have to work with the rules that were tabled, and they're very broad rules. So, as I mentioned, the primary test that we're using now is: what is your sole or primary purpose? Is your sole or primary purpose social media? Well, of course, we're seeing a lot of shape shifting. So YouTube's been saying, oh no, even though they designate themselves as a social media site for our codes, we're a video sharing platform. And Pinterest is a visual search engine, and Snap is a camera app, it's really messaging. So what we're having to go through and do is do our own testing, look at features and functionality, and then try and apply these tests where there isn't a clear line. And we know that things are going to be dynamic and change. So to give you an example, I was at OpenAI headquarters last week talking to them about a range of things, because we've got some provisions in our codes around AI companions and chatbots, and preventing kids from accessing porn and explicit violence and suicidal ideation content and the like. They didn't mention once that the following week they were going to be introducing an AI generated social media app called Sora. Well, so here you go, and now you've got the melding of social media and AI generated deepfake videos. So I've written to them and I've sent them the assessment tool. Things are going to be dynamic and are going to change all the time.
Let's say you're looking at Roblox. I think most people would agree that online gaming is probably its primary purpose. But there is that chat functionality, and they've just released in the US a feature called Moments, which is like Stories. So features and functionalities are changing all the time. This is always going to be dynamic. We want to give parents the most clarity that we can, but I do want to say this, Justin. One of the normative changes that this legislation is meant to give parents is to be able to say: this doesn't mean you'll never be on social media, or you'll never have a smartphone, just not yet, not until you're ready. And the government is saying that if you're under sixteen, you're not ready. And you don't have to worry about the FOMO, because your friends aren't going to be on this either. So parents do have agency. Let's just say we decide Roblox meets the gaming exemption. You have agency as a parent to say, you're not going to be on Facebook or TikTok, but I'm not comfortable with you being on Roblox either.

Speaker 1: Yeah, Julie, even as I listen to what you're saying here: Jean Twenge, who I'm sure you would be very familiar with, she's got her new book out. I'm actually speaking with her on Wednesday about her new book, and one of her rules is, you are the parent, like, you are in control. And she's so behind this change as well, because it allows parents to feel like they can step into that role and say, well, the government has said no, and my job as a parent is to help you to stick with what the laws are. In the same way that I'm not going to let my twelve year old drive a car, or go into a pub, or go into a casino, or any of those things, they're now not going to be able to do these things. And it just, it feels reassuring, it feels validating, it feels so much safer for parents and for families.
Just listening to you talk, though, it sounds like you have such an exhausting role to play. After the break, I want to ask you, Julie, how enforcement is going to work. I mean, we're talking nearly fifty million dollars for each breach, and this is something that a lot of parents are trying to work out: well, hang on, I'm not going to get in trouble, so the onus is off me, but my kids are definitely going to try to be sneaky, they're upset about this. So that's coming up, plus a conversation about what the government, what the eSafety Commissioner, is doing around pornographic age gating, and a discussion about nudifying apps, which are a concern in so many schools right now. Stay with us. It's the Happy Families podcast, real parenting solutions every day on Australia's most downloaded parenting podcast.

Speaker 1: My name's Doctor Justin Coulson. The eSafety Commissioner, Julie Inman Grant, is joining me now. Julie, as the minimum age legislation arrival date gets closer, December tenth, something that a lot of people are talking about is that there's a forty-nine point five million dollar fine for social media companies who allow under-sixteens onto the platform, for each breach. My question is: who's policing it, and how?

Speaker 2: Well, that is my job as the regulator, and I've developed a compliance and enforcement strategy. But let me tell you the five things, because not everybody's going to be reading the regulatory guidance, although it is available on our Social Media Minimum Age hub at eSafety dot gov dot au. First of all, we're asking companies to focus on using age assurance technologies in what we call a waterfall approach, or a layered safety approach. So it's not just going to be one way of testing age, and while they can ask for ID, it can't be the sole thing, the sole determinant of age, because a lot of people aren't comfortable giving up their ID.
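For readers who want to picture how a "waterfall" of age checks might hang together, here is a minimal sketch in Python. The layer names, thresholds and stub signals are illustrative assumptions, not any platform's actual system; the point is simply that cheaper, less intrusive checks run first, and ID verification is one optional layer rather than the sole determinant.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AgeEstimate:
    age: Optional[float]   # best guess in years; None if this layer can't tell
    confidence: float      # 0.0 to 1.0

# Hypothetical layers, ordered from least to most intrusive. A real platform
# would back these with inference models, facial estimation vendors, etc.
def self_declared_age(user: dict) -> AgeEstimate:
    return AgeEstimate(user.get("declared_age"), 0.3)   # easy to lie about

def behavioural_inference(user: dict) -> AgeEstimate:
    # e.g. language patterns, login times, age of the social graph
    return AgeEstimate(user.get("inferred_age"), user.get("inference_conf", 0.0))

def facial_age_estimation(user: dict) -> AgeEstimate:
    return AgeEstimate(user.get("facial_age"), user.get("facial_conf", 0.0))

def id_verification(user: dict) -> AgeEstimate:
    # Optional last resort; per the guidance, ID can't be the sole determinant.
    if user.get("verified_dob_age") is not None:
        return AgeEstimate(user["verified_dob_age"], 0.99)
    return AgeEstimate(None, 0.0)   # user declined to provide ID

LAYERS: list[Callable[[dict], AgeEstimate]] = [
    self_declared_age,
    behavioural_inference,
    facial_age_estimation,
    id_verification,
]

def assure_age(user: dict, threshold: float = 0.9) -> Optional[bool]:
    """True = assured 16+, False = assured under 16, None = unresolved."""
    for layer in LAYERS:
        est = layer(user)
        if est.age is not None and est.confidence >= threshold:
            return est.age >= 16.0
    return None   # fall through to manual review or user reporting

# Behavioural signals alone are confident here, so ID is never requested.
print(assure_age({"declared_age": 21, "inferred_age": 13.5, "inference_conf": 0.95}))  # False
```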
Speaker 2: So they are going to have to tell us, before December tenth, how many people, how many under-sixteens, they have on their platforms. We've already used our transparency powers in February of this year to get a nominal number. I've been pushing the companies, because I think that the number of kids that are on their platforms is higher. So there are two point five million eight to fifteen year olds, and eighty-four percent of young people told us last September that they had at least one social media account. Very few of them had ever been banned for age related violations. And in ninety percent of cases, their parents helped them set up the account.

Speaker 1: So again, I'm shaking my head like, seriously, parents, just what are you doing?

Speaker 2: Well, again, it's the fear of their kids being excluded, so this should take that away. So we've asked them to tell us the number, and then from December tenth we're going to be tracking how many deactivations or removals of under-sixteen accounts they make. We know this is not going to be perfect, and not every company is going to do it the same way or with the same level of accuracy. So we're also asking them, as the second phase, to make sure that there is discoverable, easy to use, and responsive user reporting, so that parents or educators can report to the platforms if there's an under-sixteen that's still on there. The third requirement is that they have an appeals process that is fair, so if they overblock, people can also say, I would like to prove to you that I am sixteen or over, and reinstate my account.
The fourth thing we're asking them to do is to make sure that they're avoiding circumvention, whether location based circumvention through VPNs, or age based circumvention like spoofing AI estimation systems, and we go into a lot of specific technical detail about what we know they can do and should be doing. And the last part really is about measuring the efficacy of the tools that they're using and making sure that they're being transparent with the data, so that we can measure success. Now, these hefty fines are only for systemic failures. We expect that, you know, a few of these will fall through the cracks. But if we have an indication that these companies don't appear to be applying effective age assurance solutions, that's when we'll take action. We'll be issuing some information notices. We'll be reassessing in June, because things will change. We may be looking at certification; there will be technical standards being developed. But I'll say one thing, Justin, that this has helped with, and I was just in the US meeting with a number of these companies: age assurance is happening. It's happening everywhere, and companies understand that this is where they need to go. The train has left the station. So companies can try and ignore it or avoid doing it for a long time, but the UK is asking for it, Ireland's asking for it, and the European Union of twenty-seven member states is also going to require this. This is where the world needs to go.

Speaker 1: I kind of feel like throwing my hands in the air and having a little party on my own here. I mean, it's wonderful to hear that we're starting this off, and obviously the world will become, well, I think the world will become a better place for our young people as a result.
A quick comment on number four of those five points that you highlighted, in terms of that cascading approach to age verification. The fourth one you highlighted covered location based and AI based checks. So, my fifteen year old daughter has been extremely clear that while she doesn't like it, she's happy to comply with it, but none of her friends are going to. They're all researching VPNs, they're all talking about different ways that they can get around this, or they're just saying, my parents are going to set up another account in their name that I can use, and I'm just going to pretend to be them. What I'm hearing you say in relation to that is, number one, the social media companies are going to have to be mindful of what's going on from a VPN point of view. For people who are not familiar, a VPN is basically a virtual private network, which means that you can circumvent any location based elements of your online activity. And the AI component is that AI can see what people are saying, what they're following, what they're liking, what they're sharing, and make some fairly well educated guesses as to the age of the person who's using that platform. Is that what you mean with that?

Speaker 2: Yeah, I mean, a number of companies have been using inference technologies for some time. So there are different behavioral signals, or even using natural language processing, you know, the way that grammar is used, or acronyms or emojis. Thirteen year olds generally speak to other thirteen year olds. They can see if you're logging in before school and after school. There are a whole range of signals that they're already picking up, and then they'll be using facial age estimation to keep improving their classifiers. But that's why I put a lot of technical information in the regulatory guidance that basically says: most of these platforms are picking up device IDs, you know, IP ranges and addresses.
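To make that concrete: a toy scorer in the same spirit, which sums behavioral signals into a single "likely under sixteen" flag. The signal names, weights and cutoff are invented for illustration; real platforms would use learned classifiers, not hand-set weights.

```python
# Toy age-inference scorer. Signal names and weights are invented for
# illustration; a real platform would learn these from labeled data.
SIGNALS = {
    "slang_and_emoji_use_matches_teens": 0.30,        # natural language cues
    "logins_cluster_before_and_after_school": 0.25,
    "social_graph_mostly_self_declared_minors": 0.30,
    "interests_follow_school_year_topics": 0.15,
}

def likely_under_16(observed: dict[str, bool], cutoff: float = 0.5) -> bool:
    """Sum the weights of the signals that fired; flag for a stronger check."""
    score = sum(weight for name, weight in SIGNALS.items() if observed.get(name))
    return score >= cutoff

user_signals = {
    "slang_and_emoji_use_matches_teens": True,
    "logins_cluster_before_and_after_school": True,
    "social_graph_mostly_self_declared_minors": False,
    "interests_follow_school_year_topics": False,
}
if likely_under_16(user_signals):   # 0.55 >= 0.5, so this fires
    print("escalate to a stronger layer, e.g. facial age estimation")
```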
Speaker 2: They can tell if it's been downloaded from the Australian App Store, so they can see very clearly through the IP range when a VPN appears to be being used, and that will need to trigger another age estimation check. So it won't be as easy as young people think.

Speaker 1: I love it. Okay, our time is up. I want to do a quick summary, and then I've got two more questions around explicit content, because, I mean, this is one of the most consistent and overwhelmingly horrible sets of questions that comes up. So here's my quick summary of what you've described. Over the next month or thereabouts, the social media companies are going to have to be demonstrating to you what their primary purpose is, and a list of who's included and who's not included will be finalized in time for December tenth. Children under the age of sixteen will be limited from creating accounts and being algorithmically followed. They will be able to access whatever content is publicly available, just via the websites of those platforms. And in terms of enforcement, parents no longer have to stress about it, because the platforms are responsible for doing the work at a systemic level, and if they're found to have systemic breaches, they're going to be hit for forty-nine and a half million dollars, because they're just not doing it right. And you've got that wonderful five step cascading approach to different age verification systems. Is there anything that I've missed that is just vitally important for parents to understand, or is that a pretty good snapshot of where we are right now?

Speaker 2: I think that's a perfect encapsulation, but I would just have a couple of caveats, and this is related to your next question. Just because a platform is exempted, because it's a messaging platform or an online gaming platform, that doesn't mean as parents that we stop being vigilant and set and forget.
It doesn't mean these platforms are necessarily safer. These are just exemptions that were written into the legislation. I've heard some people oversell that this is going to stop cyberbullying or image based abuse. You know, these are actually fundamentally human behaviors that play out online. They play out on social media, but they can play out on messaging and gaming platforms as well. So we'll be watching, we'll be watching for that too, and we'll have guidance for parents, and we'll continue to have our reporting schemes for cyberbullying and image based abuse. So I think this will make parenting in the digital age a little bit easier. But, you know, it's like anything. If you compare it to a car: we expect the car makers to make safer cars and embed seatbelts and put in airbags and that sort of thing, and they know that there are going to be inexperienced or bad drivers out there. You still have to follow the rules of the road, but you're not going to be penalized if you get into an accident, right? It's the same thing here. It's the platforms themselves that are responsible. We'll give parents guidance, but, you know, they can choose how to parent how they want. If they want their child who's under sixteen to be on a platform, they aren't going to be penalized, and the child themselves won't be either.

Speaker 1: So when I hear that, the metaphor that I use is: you might move into a safer neighborhood, but you still want to lock your door.

Speaker 2: Absolutely.

Speaker 1: Okay. Last two questions, Commissioner Grant, and I'm so grateful for your time. What is the government doing about explicit content, specifically pornographic age gating? We're making a big fuss about social media, and rightly so, but most explicit websites simply require a user, whether they are six or fifty-six, to click the "yes, I'm over eighteen" button, and they're in. That's as far as it goes.
At this point, it seems to me that if we can do this with social media, we must be able to do this with explicit and pornographic content.

Speaker 2: And we are, and that is through the industry codes that I've just registered over the past couple of months. You'll see the first changes, particularly to search engines, at the end of December this year, and then you'll see changes up and down the stack. These codes require eight different sectors of the industry, from the app stores to social media sites to ISPs, search engines and the like, to take what is again a layered safety approach, and each of them will have different requirements. So we know that about fifty-eight percent of young people under the age of thirteen that come across pornography come about it incidentally, accidentally and "in your face", that's how they describe it, through an innocent search or a browse on the internet. So what our search engine codes, which were developed by industry themselves, but which I decided to register because I thought they contained appropriate community safeguards, will do from now on is this: if a search surfaces pornographic content or explicit violence, like the Charlie Kirk assassination, which has been designated refused classification in Australia, it will blur that content, so that will prevent the incidental coming across of it. If you're an adult, you can click through and you can watch the adult content. It will also, for people who are seeking out suicide instructional material, refer them to a mental health site rather than taking them to that kind of material that can lead to grievous outcomes. But we also will be requiring a whole broad range of platforms to be using age assurance to prevent under-eighteens from accessing suicidal ideation content, self harm, disordered eating, explicit violence and pornography. So again, this is going to be up and down the technology stack.
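For the technically curious, here is a minimal sketch of the kind of search-result gating described above: blur plus an adult click-through for explicit or refused-classification material, and a helpline redirect for suicide-instruction queries. The classifier labels, helpline URL and interstitial wording are illustrative assumptions, not the codes' actual text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SearchResult:
    url: str
    label: str                    # hypothetical classifier output:
                                  # "ok", "explicit", or "suicide_instructional"
    blurred: bool = False
    interstitial: Optional[str] = None

# Illustrative destination; the codes require referral to mental health support.
HELPLINE_URL = "https://www.lifeline.org.au"

def gate_result(result: SearchResult, verified_adult: bool) -> SearchResult:
    """Apply blur / click-through / redirect rules before showing a result."""
    if result.label == "suicide_instructional":
        # Refer the seeker to support instead of serving the material.
        return SearchResult(url=HELPLINE_URL, label="ok")
    if result.label == "explicit":
        if verified_adult:
            return result         # adults can click through to legal content
        result.blurred = True
        result.interstitial = "Age-restricted. Verify you are 18+ to view."
    return result

gated = gate_result(SearchResult("https://example.com/x", "explicit"), verified_adult=False)
print(gated.blurred, gated.interstitial)   # True, with the interstitial message
```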
Speaker 2: It will also apply to AI companions and chatbots, because we know that fifth and sixth graders are spending up to five hours in what they think are quasi romantic relationships with AI companions, and they're designed to be emotionally manipulative and to prey on your best instincts and your worst fears for monetization purposes.

Speaker 1: We saw that in Parental Guidance season three, when we got the AI companions to chat with the kids, and they were so manipulative, they were absolutely concerning. So that's reassuring. We'll probably get you back next year to have a chat about, number one, how the social media minimum age legislation is rolling out, but number two, to talk more about those explicit content and dangerous content guidelines as they roll out as well. Last question, and you've been so generous with your time. Nudifying apps are a massive concern in Australia at the moment. Schools are tearing their hair out as kids get hold of them and then create explicit content based off even innocuous school photos of kids, usually girls, or teachers in their school. What can you offer parents to help here, if they're concerned that their kids might be either a victim of, or a user of, these nudifying apps?

Speaker 2: Well, we share your concerns. Now, using our mandatory standards, we've gone after a couple of nudifying companies, one of which has created at least three very popular apps that have been downloaded or viewed by at least two hundred thousand Australians over the past year. So we're taking action against them and the app stores that are hosting them. We've just won in court against a man who posted deepfakes of Australian women, you know, a fine of up to three hundred and forty-five thousand dollars, which may not seem big, but for an individual it is, and it is a deterrent.
We've also developed a deepfake image based abuse incident management toolkit for schools because, as you say, this is happening at least once a week in schools across the country, and schools haven't known how to deal with it: you know, how to collect the evidence, when to go to the police, when to report to eSafety. And I will note, if there's intimate imagery or deepfake intimate imagery of a child, that's likely to be considered child sexual abuse material. We can tackle that, and we will work with law enforcement there. But we have a ninety-eight percent success rate of getting that kind of content down. What people need to understand, and this is where I think we need some cultural change with young people: they might think this is a bit of a laugh, and look what they can do with this virtually free technology. Well, the cost to the victim-survivor is significant. It's humiliating, it's denigrating, and it's incalculable in terms of the kinds of impact it can have on young people. So we need to be preventing this behavior from happening in the first place.

Speaker 1: Julie, you've got a tough gig. I really appreciate the work that you do and what you strive to do to keep our community, and particularly our young people and our vulnerable people, safe. Thank you for your generous time today.

Speaker 2: Well, we'll try everything we can until it works.

Speaker 1: A big thank you to Julie Inman Grant, the eSafety Commissioner of Australia, for taking time out of an incredibly busy schedule to have a chat with me about all of those things. We could have talked for a lot longer, but hopefully there's enough detail there to put you straight and get you started on conversations with your kids. eSafety dot gov dot au is where you find more information, and we will link in the show notes to a bunch of resources that will be useful.
In addition, check out the eSafety dot gov dot au website, because there are a whole lot of webinars happening over the next month or thereabouts that explain even more about what's going on. It would be well worth your time having a look. Again, we'll link to that in the show notes. Okay, a longer one than normal today. Thanks so much for listening. I hope that you've gotten heaps out of it and found it to be helpful and informative. The Happy Families podcast is produced by Justin Rouillon from Bridge Media. Mim Hammonds provides additional research, admin and other support. And if you'd like to find resources to help your family to be happier and to function better, visit us at happyfamilies dot com dot au. Tomorrow we're back answering another one of your tricky questions on the pod.