1 00:00:06,840 --> 00:00:18,400 Speaker 1: Before we start today: this episode discusses violent pornography. From 2 00:00:18,400 --> 00:00:21,320 Speaker 1: The Australian, here's what's on The Front. I'm Claire Harvey. 3 00:00:21,400 --> 00:00:30,400 Speaker 1: It's Tuesday, July two. Labour's rogue senator Fatima Payman says 4 00:00:30,480 --> 00:00:34,200 Speaker 1: she feels like an exile after being dumped from caucus. 5 00:00:34,560 --> 00:00:37,600 Speaker 1: The Prime Minister says he punished Payman for ruining the 6 00:00:37,640 --> 00:00:41,680 Speaker 1: government's big promotion of its July one tax cuts by 7 00:00:41,720 --> 00:00:44,239 Speaker 1: going on TV at the weekend to explain why she 8 00:00:44,479 --> 00:00:51,120 Speaker 1: voted with the Greens in support of Palestine. House prices 9 00:00:51,159 --> 00:00:54,680 Speaker 1: will rise seven percent this year, according to Australia's biggest bank. 10 00:00:54,960 --> 00:00:59,160 Speaker 1: It's all because demand is outstripping supply, and prices could 11 00:00:59,200 --> 00:01:02,720 Speaker 1: go up another five percent in twenty twenty five. That's 12 00:01:02,760 --> 00:01:05,920 Speaker 1: live at the Australian dot com dot au right now. 13 00:01:09,000 --> 00:01:12,880 Speaker 1: Australian kids are being exposed to pornography on popular social 14 00:01:12,920 --> 00:01:17,520 Speaker 1: media sites like Facebook, Instagram and TikTok before they turn thirteen, 15 00:01:17,959 --> 00:01:20,840 Speaker 1: and Australia's eSafety Commissioner is putting the heat on 16 00:01:20,959 --> 00:01:24,440 Speaker 1: tech giants to come up with measures to protect children. 17 00:01:24,959 --> 00:01:28,640 Speaker 1: If they fail, Julie Inman Grant says she'll impose mandatory 18 00:01:28,760 --> 00:01:34,120 Speaker 1: codes of conduct. Today: can Australia really force tech giants 19 00:01:34,200 --> 00:01:46,080 Speaker 1: to do anything? Ah, scrolling. 20 00:01:48,440 --> 00:01:50,960 Speaker 2: I always pack way too damn much, but you never 21 00:01:51,040 --> 00:01:52,000 Speaker 2: know what you're gonna need. 22 00:01:54,040 --> 00:01:55,640 Speaker 1: Travel. Pets. 23 00:01:56,200 --> 00:01:57,280 Speaker 2: Are you supposed to be in 24 00:01:57,360 --> 00:01:59,560 Speaker 1: the pool, dude? Your mommy say 25 00:01:59,600 --> 00:02:00,040 Speaker 3: you can? 26 00:02:02,560 --> 00:02:07,320 Speaker 1: Strangely mesmerizing makeup tutorials. Get ready with me for dinner 27 00:02:07,320 --> 00:02:13,400 Speaker 1: with the girls. If you keep swiping for long enough, 28 00:02:13,560 --> 00:02:17,120 Speaker 1: you might wind up somewhere depressing, like the so-called 29 00:02:17,280 --> 00:02:21,840 Speaker 1: manosphere, where creepy dudes like Andrew Tate spew hate 30 00:02:22,080 --> 00:02:22,760 Speaker 1: about women. 31 00:02:23,720 --> 00:02:27,120 Speaker 3: So they call me a misogynist. I'm anti-woman, I'm a problem. 32 00:02:27,400 --> 00:02:29,320 Speaker 3: The best way you can improve the lives of women 33 00:02:29,360 --> 00:02:32,240 Speaker 3: is make men better, men better at being men, at what 34 00:02:32,280 --> 00:02:32,720 Speaker 3: a man is. 35 00:02:32,960 --> 00:02:35,639 Speaker 1: And then, especially if you are young, it can get 36 00:02:35,720 --> 00:02:42,919 Speaker 1: even darker. Nudity, revenge porn, violence, abuse.
More than half 37 00:02:42,919 --> 00:02:46,640 Speaker 1: of children and adolescents have been exposed to porn on 38 00:02:46,760 --> 00:02:51,200 Speaker 1: social media, according to new research released today by Australia's 39 00:02:51,200 --> 00:02:55,440 Speaker 1: eSafety Commissioner Julie Inman Grant. It says most kids 40 00:02:55,560 --> 00:02:59,920 Speaker 1: first encounter adult content around age thirteen, and for many 41 00:03:00,440 --> 00:03:04,359 Speaker 1: much younger, often when they haven't gone looking for it. 42 00:03:08,639 --> 00:03:12,359 Speaker 1: Jeff Chambers is The Australian's chief political correspondent. Jeff, you 43 00:03:12,480 --> 00:03:15,880 Speaker 1: spoke to Julie Inman Grant, and she's realistic about 44 00:03:15,880 --> 00:03:19,480 Speaker 1: the fact that young people may and often will seek 45 00:03:19,480 --> 00:03:22,840 Speaker 1: out pornography as they get older. But this is different. 46 00:03:23,160 --> 00:03:23,919 Speaker 1: Why is it different? 47 00:03:25,480 --> 00:03:29,320 Speaker 2: Ten years ago when Facebook emerged, that was really the 48 00:03:29,360 --> 00:03:33,239 Speaker 2: primary sort of social media platform. There were porn websites 49 00:03:33,280 --> 00:03:36,440 Speaker 2: and things like that. But now we've just had this proliferation, 50 00:03:36,640 --> 00:04:41,280 Speaker 2: this explosion in growth in a range of social media platforms, 51 00:03:41,920 --> 00:03:46,760 Speaker 2: encrypted messaging apps, even in and around gaming, where a 52 00:03:46,880 --> 00:03:51,960 Speaker 2: lot of young Australian children are being exposed to pornographic 53 00:03:52,040 --> 00:03:57,320 Speaker 2: material quite often inadvertently. They're not necessarily seeking it out, 54 00:03:57,600 --> 00:04:00,720 Speaker 2: but there are a lot of pop-ups, people are sending 55 00:04:01,160 --> 00:04:05,560 Speaker 2: unsolicited dick pics and other explicit material, and it's 56 00:04:05,600 --> 00:04:09,200 Speaker 2: really seeped its way into the schoolyards and into homes. 57 00:04:09,680 --> 00:04:13,880 Speaker 2: So it is just becoming so much more difficult for 58 00:04:14,000 --> 00:04:21,560 Speaker 2: families and parents to know what their kids are seeing online. 59 00:04:22,279 --> 00:04:26,320 Speaker 1: We all remember the mortification of that talk with our parents, 60 00:04:26,760 --> 00:04:31,440 Speaker 1: awkward conversations about sex as adolescence slapped us in the face, 61 00:04:32,080 --> 00:04:34,560 Speaker 1: or maybe we worked it out for ourselves with the 62 00:04:34,600 --> 00:04:38,320 Speaker 1: help of some self-appointed experts in the playground. But 63 00:04:38,440 --> 00:04:41,880 Speaker 1: now those opportunities for parents to have conversations about sex, 64 00:04:42,279 --> 00:04:47,520 Speaker 1: about healthy relationships, respect, intimacy are being subverted by social 65 00:04:47,600 --> 00:04:50,760 Speaker 1: media, and according to the new research, kids are aware 66 00:04:50,920 --> 00:04:54,320 Speaker 1: of how it can affect their ideas of sex, relationships 67 00:04:54,400 --> 00:04:57,800 Speaker 1: and gender, but they feel like they're losing control of 68 00:04:57,839 --> 00:04:59,719 Speaker 1: their own online experience. 69 00:05:03,240 --> 00:05:07,719 Speaker 2: It's that repeat exposure, and it's often peer pressure, particularly 70 00:05:08,040 --> 00:05:10,440 Speaker 2: when young Australians move into high school
71 00:05:10,200 --> 00:05:11,680 Speaker 2: and get to their teenage years. 72 00:05:12,279 --> 00:05:15,880 Speaker 2: And one of the issues raised by Julie Inman Grant is 73 00:05:16,440 --> 00:05:20,239 Speaker 2: the fact that they're seeing this violent, extreme pornography without 74 00:05:20,279 --> 00:05:25,840 Speaker 2: any guidance, context, or appropriate maturity levels, and they 75 00:05:25,960 --> 00:05:29,040 Speaker 2: might think that a video showing a man aggressively choking 76 00:05:29,080 --> 00:05:31,880 Speaker 2: a woman during sex on a porn site is what 77 00:05:32,240 --> 00:05:35,960 Speaker 2: consensual sex and a healthy relationship should look like. And I 78 00:05:36,000 --> 00:05:39,760 Speaker 2: think that's why Peter Dutton and Anthony Albanese are both 79 00:05:39,800 --> 00:05:42,120 Speaker 2: on a bit of a unity ticket here to try 80 00:05:42,160 --> 00:05:46,400 Speaker 2: and put every single layer of protection in place, because 81 00:05:46,400 --> 00:05:49,720 Speaker 2: we know that this unfettered access to pornography at such 82 00:05:49,760 --> 00:05:53,159 Speaker 2: a young age, when kids are still developing, can have 83 00:05:53,320 --> 00:05:58,720 Speaker 2: really perverse outcomes, whether that is just awful bullying at 84 00:05:58,760 --> 00:06:03,680 Speaker 2: the very least, to self-harm, suicide, and eating disorders. 85 00:06:07,560 --> 00:06:11,120 Speaker 1: Now the eSafety Commissioner has the tech platforms in her sights. 86 00:06:11,880 --> 00:06:15,520 Speaker 1: Julie Inman Grant has given tech giants like Google, Meta, Snapchat, 87 00:06:15,560 --> 00:06:18,920 Speaker 1: and TikTok six months to come up with voluntary codes 88 00:06:18,960 --> 00:06:22,680 Speaker 1: to protect kids from porn, along with other potentially harmful 89 00:06:22,720 --> 00:06:27,200 Speaker 1: content like suicide, serious illness, self-harm, and disordered eating. 90 00:06:27,839 --> 00:06:31,839 Speaker 1: She wants barriers like age verification and tools that allow 91 00:06:32,000 --> 00:06:35,960 Speaker 1: users or their parents to filter or blur out unwanted 92 00:06:36,080 --> 00:06:36,960 Speaker 1: sexual content. 93 00:06:41,400 --> 00:06:44,640 Speaker 2: And I guess this goes to one point that I 94 00:06:44,680 --> 00:06:48,040 Speaker 2: hadn't really thought about, but when a parent buys their 95 00:06:48,120 --> 00:06:52,360 Speaker 2: child an iPad or an iPhone, quite often there is 96 00:06:52,400 --> 00:06:56,680 Speaker 2: an opt-in element around parental controls and safety measures. 97 00:06:57,040 --> 00:06:59,240 Speaker 2: And one of the simple things that's being raised here 98 00:06:59,360 --> 00:07:03,960 Speaker 2: is simply that these big tech providers can make that an 99 00:07:04,000 --> 00:07:07,919 Speaker 2: opt-out default setting. And if you do that, I 100 00:07:07,960 --> 00:07:10,640 Speaker 2: think most adults will know that they've got to get 101 00:07:10,680 --> 00:07:12,520 Speaker 2: in and change that, if they wish to change that, 102 00:07:13,000 --> 00:07:16,440 Speaker 2: if there are no children using it, because the trends 103 00:07:16,440 --> 00:07:19,320 Speaker 2: show that when it is an opt-in arrangement, 104 00:07:19,440 --> 00:07:22,160 Speaker 2: many people forget about it, don't do it, or don't 105 00:07:22,160 --> 00:07:22,760 Speaker 2: do it properly. 106 00:07:23,800 --> 00:07:26,679 Speaker 1: If they don't front up,
Inman Grant says she'll set 107 00:07:26,800 --> 00:07:31,960 Speaker 1: rules for the tech platforms with mandatory standards. Julie Inman 108 00:07:32,040 --> 00:07:36,280 Speaker 1: Grant has issued notices to these organizations giving them six 109 00:07:36,360 --> 00:07:39,800 Speaker 1: months to come up with their own enforceable codes, and 110 00:07:39,840 --> 00:07:42,560 Speaker 1: she's saying that if they don't do that, she'll take action. 111 00:07:43,360 --> 00:07:46,480 Speaker 1: But Jeff, this feels like the second issue on which 112 00:07:46,520 --> 00:07:49,880 Speaker 1: this government is saying to tech giants, if you don't 113 00:07:50,520 --> 00:07:52,480 Speaker 1: clean up your act, we're going to wave a big 114 00:07:52,480 --> 00:07:55,400 Speaker 1: stick at you. The other being the social media giants' 115 00:07:55,640 --> 00:07:58,120 Speaker 1: refusal in some instances to pay for the news content 116 00:07:58,160 --> 00:08:01,240 Speaker 1: that they use. So why not just go straight to 117 00:08:01,480 --> 00:08:02,080 Speaker 1: the big stick? 118 00:08:03,200 --> 00:08:04,840 Speaker 2: Look, they may well end up there. 119 00:08:05,160 --> 00:08:09,680 Speaker 2: So Julie Inman Grant has the Online Safety Act, and 120 00:08:10,240 --> 00:08:14,160 Speaker 2: under phase one that was looking at child sex abuse material, 121 00:08:14,440 --> 00:08:18,920 Speaker 2: terrorist and extremist material, and this is part of phase two. 122 00:08:19,160 --> 00:08:23,000 Speaker 2: We saw in phase one there had been some participation 123 00:08:23,160 --> 00:08:26,360 Speaker 2: by some of these tech firms, most of them based overseas, 124 00:08:26,360 --> 00:08:28,920 Speaker 2: most of them in the US, very cashed-up, very 125 00:08:29,000 --> 00:08:30,160 Speaker 2: lawyered-up outfits. 126 00:08:30,720 --> 00:08:34,599 Speaker 2: And what we saw just in recent weeks 127 00:08:34,840 --> 00:08:38,240 Speaker 2: was the lack of involvement in terms of a voluntary code. 128 00:08:38,640 --> 00:08:43,080 Speaker 2: Julie Inman Grant had to regulate mandatory codes under law, 129 00:08:43,120 --> 00:08:47,959 Speaker 2: Australian law, and they've given three months for these platforms 130 00:08:47,960 --> 00:08:52,439 Speaker 2: and tech companies to come forward with their plans around 131 00:08:52,520 --> 00:08:58,800 Speaker 2: protecting kids from pornography, and a hard December deadline. Now 132 00:08:59,400 --> 00:09:02,200 Speaker 2: the argument, and a lot of people will say that, well, 133 00:09:02,760 --> 00:09:05,959 Speaker 2: clearly Australia can't do much in this space. But we're 134 00:09:06,000 --> 00:09:10,120 Speaker 2: seeing other Western countries going down this same path as 135 00:09:10,160 --> 00:09:13,520 Speaker 2: we are, and I think it's always important to get 136 00:09:13,600 --> 00:09:17,320 Speaker 2: action to ensure that all of the other countries, such 137 00:09:17,360 --> 00:09:20,559 Speaker 2: as the US and the UK and the European Union, 138 00:09:20,679 --> 00:09:23,680 Speaker 2: they all go in a pack to try and force action. 139 00:09:23,880 --> 00:09:27,640 Speaker 2: And I think it's demanded when we talk about protecting kids, 140 00:09:28,080 --> 00:09:31,400 Speaker 2: whether it's from child sex abuse material or, in this case, 141 00:09:31,440 --> 00:09:35,120 Speaker 2: pornography and some of those perverse outcomes of that. I 142 00:09:35,160 --> 00:09:37,720 Speaker 2: think that they have to try and do something.
143 00:09:41,160 --> 00:09:44,800 Speaker 1: Coming up: the tech companies' explanations for why they haven't 144 00:09:44,840 --> 00:09:48,520 Speaker 1: cleaned up before now, and can the government actually scare 145 00:09:48,559 --> 00:09:51,760 Speaker 1: them into action? Don't forget to check us out at 146 00:09:51,760 --> 00:09:55,559 Speaker 1: the Australian dot com dot au and consider subscribing. It's 147 00:09:55,600 --> 00:09:58,600 Speaker 1: where all the action's happening, twenty-four seven. We'll 148 00:09:58,640 --> 00:10:16,160 Speaker 1: be back after this break. So why is Julie Inman 149 00:10:16,240 --> 00:10:19,520 Speaker 1: Grant asking the social media sites to act instead of 150 00:10:19,640 --> 00:10:23,680 Speaker 1: forcing them to? Well, she says the tech companies themselves 151 00:10:24,080 --> 00:10:28,440 Speaker 1: have to create the solutions, like reliable age verification tools 152 00:10:28,760 --> 00:10:35,400 Speaker 1: that kids just can't trick, for example. Do you think, though, 153 00:10:35,559 --> 00:10:38,320 Speaker 1: Jeff, as a seasoned observer of the way big companies 154 00:10:38,320 --> 00:10:41,160 Speaker 1: behave in their relationships with government, are they actually going 155 00:10:41,160 --> 00:10:42,000 Speaker 1: to comply with this? 156 00:10:43,080 --> 00:10:45,840 Speaker 2: And this is the difficulty for this one, because Julie 157 00:10:45,880 --> 00:10:48,200 Speaker 2: Inman Grant describes it as a stack, but there are so 158 00:10:48,320 --> 00:10:51,800 Speaker 2: many different layers of that stack where you will get breaches. 159 00:10:52,280 --> 00:10:55,560 Speaker 2: Maybe one company does the right thing, but then using 160 00:10:55,600 --> 00:10:59,200 Speaker 2: that platform or that technology or device, there will be 161 00:10:59,240 --> 00:11:02,840 Speaker 2: other areas where that stack is fractured. And I guess 162 00:11:03,160 --> 00:11:04,880 Speaker 2: I don't think that they're saying that they'll get a 163 00:11:04,920 --> 00:11:08,880 Speaker 2: foolproof, one hundred percent ironclad protection for kids, 164 00:11:09,200 --> 00:11:11,720 Speaker 2: but they're trying to put as many different layers in as possible. 165 00:11:12,080 --> 00:11:15,120 Speaker 2: To go to your question about these big, cashed-up, 166 00:11:15,280 --> 00:11:18,720 Speaker 2: well-resourced companies, the history shows that they do everything 167 00:11:18,760 --> 00:11:22,599 Speaker 2: they can to get out of adding extra layers of complexity 168 00:11:22,679 --> 00:11:23,920 Speaker 2: or costs for themselves. 169 00:11:25,559 --> 00:11:28,320 Speaker 1: The codes won't just apply to websites and social media. 170 00:11:29,200 --> 00:11:34,719 Speaker 1: Search engines, hosting services, Internet service providers, instant messaging programs, 171 00:11:34,920 --> 00:11:40,840 Speaker 1: SMS, chat, multiplayer gaming, online dating services, and hardware manufacturers 172 00:11:41,120 --> 00:11:45,840 Speaker 1: will also be on the hook. So we're starting to 173 00:11:45,840 --> 00:11:50,000 Speaker 1: see changes. We just need to see consistency across the industry. 174 00:11:50,040 --> 00:11:50,880 Speaker 1: I wouldn't say that 175 00:11:50,840 --> 00:11:55,200 Speaker 2: there's any company in particular that I think is doing nothing. 176 00:11:55,360 --> 00:11:56,880 Speaker 1: I think collectively they 177 00:11:56,720 --> 00:12:01,680 Speaker 2: all need to do more.
She is very concerned that 178 00:12:02,080 --> 00:12:05,440 Speaker 2: if unfettered access to online pornography by children is not 179 00:12:05,720 --> 00:12:10,240 Speaker 2: urgently reversed or controlled in some way, Australia is 180 00:12:10,280 --> 00:12:14,839 Speaker 2: facing the harmful sexual socialization of an entire generation. 181 00:12:15,559 --> 00:12:17,760 Speaker 2: And she reflected on the 182 00:12:17,720 --> 00:12:21,040 Speaker 2: fact that this current age is not like the days 183 00:12:21,040 --> 00:12:24,120 Speaker 2: of the Penthouse that your dad hid in his sock drawer. 184 00:12:24,400 --> 00:12:28,520 Speaker 2: It's now available without any real impediments, and it's just 185 00:12:28,559 --> 00:12:32,120 Speaker 2: becoming easier for kids to get around blocks, and it's 186 00:12:32,120 --> 00:12:40,559 Speaker 2: becoming a really, really difficult issue for parents to police. 187 00:12:43,840 --> 00:12:47,920 Speaker 1: Jeff Chambers is The Australian's chief political correspondent. Thanks for 188 00:12:47,960 --> 00:12:49,920 Speaker 1: joining us on The Front. And don't forget, you can 189 00:12:49,960 --> 00:12:52,960 Speaker 1: read all the coverage of Labour's unfolding crisis in the 190 00:12:53,040 --> 00:12:56,120 Speaker 1: Senate right now at the Australian dot com dot au.