1 00:00:06,840 --> 00:00:09,240 Speaker 1: You can listen to The Front on your smart speaker 2 00:00:09,360 --> 00:00:13,480 Speaker 1: every morning to hear the latest episode. Just say play 3 00:00:13,520 --> 00:00:22,160 Speaker 1: the news from The Australian. From The Australian, here's what's 4 00:00:22,200 --> 00:00:25,400 Speaker 1: on The Front. I'm Claire Harvey. It's Monday, September sixteen. 5 00:00:30,200 --> 00:00:33,440 Speaker 1: The government's trying to clean dodgy disability providers out of 6 00:00:33,479 --> 00:00:38,360 Speaker 1: the NDIS by forcing fifteen thousand operators to register so 7 00:00:38,400 --> 00:00:42,239 Speaker 1: they can be properly regulated. Some disability advocates say that 8 00:00:42,360 --> 00:00:50,640 Speaker 1: will force some sole traders out of business. A pedophile 9 00:00:50,640 --> 00:00:54,279 Speaker 1: who groomed and abused six children has been sentenced to 10 00:00:54,400 --> 00:00:58,840 Speaker 1: ten years without parole, but he'll get a lifetime taxpayer 11 00:00:58,880 --> 00:01:02,240 Speaker 1: funded pension when he leaves jail because he was a 12 00:01:02,360 --> 00:01:06,760 Speaker 1: senior public servant. One of his victims' mothers is crusading 13 00:01:06,840 --> 00:01:10,920 Speaker 1: to have Stephen Mitchell stripped of his super entitlements. That's 14 00:01:10,959 --> 00:01:14,720 Speaker 1: an exclusive live now at theaustralian.com.au. 15 00:01:17,800 --> 00:01:20,880 Speaker 1: Meta says it's up to the Apple and Android app 16 00:01:20,920 --> 00:01:24,680 Speaker 1: stores to enforce a social media ban on children, and 17 00:01:24,880 --> 00:01:29,560 Speaker 1: claims parents don't use existing options to keep their kids safe online.
18 00:01:29,680 --> 00:01:33,920 Speaker 1: Media companies are fighting back, urging the government to 19 00:01:33,920 --> 00:01:39,039 Speaker 1: toughen up on tech giants. Today: how online safety will 20 00:01:39,160 --> 00:01:41,440 Speaker 1: shape the looming federal election. 21 00:01:50,360 --> 00:01:52,840 Speaker 2: I remember once I came across a video of a 22 00:01:52,880 --> 00:01:55,920 Speaker 2: man being shot that was being recorded as he died. 23 00:01:57,000 --> 00:01:59,680 Speaker 1: This is an eighteen year old woman talking about her 24 00:01:59,680 --> 00:02:03,960 Speaker 1: early experiences on social media several years ago when she 25 00:02:04,080 --> 00:02:05,720 Speaker 1: was in her mid teens. 26 00:02:06,880 --> 00:02:09,119 Speaker 2: I haven't seen it since, and I hope other young 27 00:02:09,200 --> 00:02:12,639 Speaker 2: people never see it. It took away most of my innocence. 28 00:02:14,720 --> 00:02:17,640 Speaker 1: This young woman, who isn't named, is quoted in a 29 00:02:17,639 --> 00:02:21,560 Speaker 1: submission to a federal parliamentary inquiry on social media and 30 00:02:21,639 --> 00:02:26,320 Speaker 1: Australian society. The submission is by News Corp Australia, which 31 00:02:26,360 --> 00:02:29,480 Speaker 1: is the parent company of The Australian and us here 32 00:02:29,520 --> 00:02:32,960 Speaker 1: at The Front. The eighteen year old, whose words were 33 00:02:33,000 --> 00:02:35,840 Speaker 1: read for us by a voice actor, spoke to a 34 00:02:35,919 --> 00:02:40,600 Speaker 1: research company called Dynata, which News Corp commissioned to ask 35 00:02:40,680 --> 00:02:46,960 Speaker 1: Australians about their experience in that intangible, magical, terrifying world 36 00:02:47,320 --> 00:02:51,600 Speaker 1: that surrounds us at every moment: social media.
37 00:02:51,840 --> 00:02:54,960 Speaker 2: I also remember one time a person who I thought 38 00:02:55,000 --> 00:02:57,800 Speaker 2: was a friend sent me an unsolicited picture of his 39 00:02:57,960 --> 00:03:01,000 Speaker 2: penis and I blocked him without saying anything. 40 00:03:04,560 --> 00:03:07,280 Speaker 1: Social media can bring us so much. 41 00:03:07,200 --> 00:03:11,280 Speaker 2: Joy, healthy chocolate dip which you can also put in 42 00:03:11,280 --> 00:03:12,440 Speaker 2: the fridge to make a mousse. 43 00:03:13,160 --> 00:03:16,200 Speaker 3: And what do we do when somebody hates us for 44 00:03:16,240 --> 00:03:21,520 Speaker 3: no reason? Give them a reason. With therapy or 45 00:03:21,480 --> 00:03:25,000 Speaker 1: without therapy? Without therapy, please, because my husband actually 46 00:03:25,040 --> 00:03:29,120 Speaker 1: just stole all of my cat. But we've all also 47 00:03:29,360 --> 00:03:34,840 Speaker 1: probably had one of those experiences seeing something you can't unsee. 48 00:03:35,920 --> 00:03:39,600 Speaker 1: The idea of children on social media consuming death and 49 00:03:39,760 --> 00:04:44,160 Speaker 1: misery while they sit quietly in their bedrooms horrifies parents 50 00:03:44,320 --> 00:03:48,320 Speaker 1: and kids themselves. But we're all addicted and none of 51 00:03:48,360 --> 00:03:52,000 Speaker 1: us has any power, which is why this seemed like 52 00:03:52,240 --> 00:03:56,400 Speaker 1: big news. The Prime Minister is vowing to ban children 53 00:03:56,440 --> 00:03:59,320 Speaker 1: from social media by the end of the year. We 54 00:03:59,440 --> 00:04:04,000 Speaker 1: know that social media is causing social harm. It's an 55 00:04:04,000 --> 00:04:08,200 Speaker 1: idea first floated by Peter Dutton earlier this year.
And 56 00:04:08,280 --> 00:04:11,080 Speaker 1: at the same time, the government's introducing legislation that would 57 00:04:11,120 --> 00:04:15,080 Speaker 1: put rules around doxxing, that's revealing the private information of 58 00:04:15,120 --> 00:04:19,560 Speaker 1: someone without their consent, and misinformation or disinformation. 59 00:04:21,040 --> 00:04:25,279 Speaker 4: From both sides of politics, we're seeing red hot rhetoric 60 00:04:25,640 --> 00:04:30,159 Speaker 4: calling out the Elon Musks and Mark Zuckerbergs of this world. 61 00:04:30,880 --> 00:04:35,160 Speaker 1: Geoff Chambers is The Australian's chief political correspondent. 62 00:04:35,560 --> 00:04:39,679 Speaker 4: For both Anthony Albanese and Peter Dutton, up alongside cost 63 00:04:39,720 --> 00:04:44,440 Speaker 4: of living, social media harms is becoming a top tier issue. 64 00:04:44,960 --> 00:04:48,480 Speaker 3: So we see this flurry of activity around Parliament House. 65 00:04:48,920 --> 00:04:51,400 Speaker 4: I kept hearing it described to me as this is 66 00:04:51,440 --> 00:04:55,280 Speaker 4: such a big water cooler issue. And the Prime Minister 67 00:04:56,360 --> 00:05:00,760 Speaker 4: reveals that in this term of government he will 68 00:05:00,839 --> 00:05:03,960 Speaker 4: legislate a minimum age of access to social media and 69 00:05:04,040 --> 00:05:07,920 Speaker 4: relevant tech platforms. Now, what he didn't have in that 70 00:05:07,960 --> 00:05:12,440 Speaker 4: announcement was what the optimum age is. And my understanding 71 00:05:12,560 --> 00:05:16,560 Speaker 4: is that while the Prime Minister supports Mr Dutton's idea 72 00:05:16,720 --> 00:05:20,320 Speaker 4: of restricting teens up to sixteen, some around his 73 00:05:20,440 --> 00:05:25,000 Speaker 4: cabinet table and more broadly in caucus have differing opinions 74 00:05:25,040 --> 00:05:28,800 Speaker 4: on that.
So inside Labor and probably more broadly, there's 75 00:05:28,839 --> 00:05:32,160 Speaker 4: a bit of a debate over what that right age is, 76 00:05:32,200 --> 00:05:36,839 Speaker 4: whether it's fourteen, fifteen or sixteen. We've seen Meta Global Affairs 77 00:05:36,920 --> 00:05:41,920 Speaker 4: chief Nick Clegg last Friday pushing back, suggesting that without 78 00:05:41,960 --> 00:05:44,640 Speaker 4: the help of Apple and Google app stores, it would 79 00:05:44,640 --> 00:05:47,599 Speaker 4: be very difficult to implement. 80 00:05:47,480 --> 00:05:50,839 Speaker 5: There are only two choke points in the modern Internet. It's the 81 00:05:50,920 --> 00:05:54,120 Speaker 5: operating systems upon which everything else is built, which is 82 00:05:54,160 --> 00:05:57,080 Speaker 5: iOS owned by Apple and Android by Google. If you 83 00:05:57,160 --> 00:06:00,000 Speaker 5: ask each app company to verify ages, to whack-a-mole all of these things, 84 00:05:59,839 --> 00:06:02,359 Speaker 5: it's going to be a nightmare for parents. So I 85 00:06:02,360 --> 00:06:05,359 Speaker 5: think you know, in as much as the debate unfolds 86 00:06:05,360 --> 00:06:07,919 Speaker 5: in Australia and elsewhere, if you're going to make a 87 00:06:07,960 --> 00:06:10,200 Speaker 5: big move like that, you've got to make it workable. 88 00:06:10,240 --> 00:06:12,560 Speaker 5: It's got to cover all the apps that young people use, 89 00:06:12,600 --> 00:06:14,320 Speaker 5: not just some of them, and the only way is 90 00:06:14,320 --> 00:06:15,719 Speaker 5: through the app stores. There's no other way. 91 00:06:16,920 --> 00:06:20,080 Speaker 1: So Facebook says it's not on them, it's on Apple 92 00:06:20,320 --> 00:06:20,960 Speaker 1: and Google.
93 00:06:21,600 --> 00:06:26,600 Speaker 5: We've introduced around fifty tools over the last 94 00:06:26,960 --> 00:06:29,320 Speaker 5: few years to give parents more control over the time 95 00:06:29,320 --> 00:06:31,120 Speaker 5: that their kids spend, who they're friending with. We 96 00:06:31,240 --> 00:06:35,080 Speaker 5: default under sixteen year olds into a much more restrictive setting, 97 00:06:35,160 --> 00:06:37,960 Speaker 5: so they can't be connected or communicated with by people 98 00:06:37,960 --> 00:06:40,680 Speaker 5: they're not connected with, et cetera, et cetera. And guess what, 99 00:06:40,800 --> 00:06:43,520 Speaker 5: even when we build these controls, parents don't use them. 100 00:06:43,600 --> 00:06:46,400 Speaker 5: I think we'll be making some very significant announcements fairly 101 00:06:46,440 --> 00:06:50,560 Speaker 5: soon to try and really make these controls simpler, easier, 102 00:06:50,920 --> 00:06:52,839 Speaker 5: particularly reassuring for parents. 103 00:06:53,839 --> 00:06:57,719 Speaker 4: Social media giants, they've got huge war chests. They're multi 104 00:06:57,720 --> 00:07:02,120 Speaker 4: trillion dollar companies with armies of lobbyists and lawyers, and the general 105 00:07:02,200 --> 00:07:04,960 Speaker 4: view is when they see ad hoc legislation like this, 106 00:07:05,320 --> 00:07:06,560 Speaker 4: they view it with contempt. 107 00:07:08,680 --> 00:07:11,200 Speaker 1: So it seems like there's a lot going on. But 108 00:07:11,320 --> 00:07:14,880 Speaker 1: media companies like News Corp Australia say in their submissions 109 00:07:14,920 --> 00:07:18,160 Speaker 1: to the committee that, in fact, the government isn't taking big, 110 00:07:18,240 --> 00:07:22,800 Speaker 1: bold steps to really get social platforms under control. News 111 00:07:22,840 --> 00:07:25,840 Speaker 1: Corp says it wants two things.
First, for the government 112 00:07:25,840 --> 00:07:30,200 Speaker 1: to force Meta, which owns Instagram and Facebook, to negotiate 113 00:07:30,240 --> 00:07:33,840 Speaker 1: payments to news companies for the material it serves up 114 00:07:33,880 --> 00:07:37,880 Speaker 1: to its users. This was happening. You might have heard 115 00:07:37,920 --> 00:07:40,800 Speaker 1: about it: in twenty twenty one, Meta and Google paid 116 00:07:40,800 --> 00:07:45,160 Speaker 1: Australian media outlets millions for their content. That's after Meta 117 00:07:45,200 --> 00:07:48,560 Speaker 1: had a brief tantrum where it removed all news content, 118 00:07:48,800 --> 00:07:52,040 Speaker 1: then restored it. But this year Meta said it doesn't 119 00:07:52,040 --> 00:07:55,440 Speaker 1: want to pay anymore and claimed its users don't really 120 00:07:55,520 --> 00:07:57,000 Speaker 1: consume much news anyway. 121 00:07:58,400 --> 00:08:02,640 Speaker 4: The News Media Bargaining Code was established by Josh Frydenberg. 122 00:08:03,000 --> 00:08:06,400 Speaker 4: It wasn't easy to land for him, but he personally 123 00:08:06,440 --> 00:08:10,600 Speaker 4: took on the responsibility of negotiating this outcome with the big 124 00:08:10,640 --> 00:08:15,600 Speaker 4: social media guys, in particular Meta and Google. And written 125 00:08:15,600 --> 00:08:19,800 Speaker 4: into that code: if one of the major social media 126 00:08:19,840 --> 00:08:24,360 Speaker 4: platforms or tech giants does withdraw, not in good faith, 127 00:08:24,520 --> 00:08:28,320 Speaker 4: the government can designate that company. What we've seen is in 128 00:08:28,440 --> 00:08:33,520 Speaker 4: late February early March, Meta shredded their deals, walked away. 129 00:08:33,559 --> 00:08:36,800 Speaker 4: We've seen this kind of behaviour from Meta across the world.
130 00:08:37,559 --> 00:08:41,480 Speaker 4: So just over six months on, the Albanese government has 131 00:08:41,559 --> 00:08:45,080 Speaker 4: talked a big game in terms of the effect of 132 00:08:45,120 --> 00:08:48,680 Speaker 4: what Meta has done. Anthony Albanese could have taken this 133 00:08:48,760 --> 00:08:53,680 Speaker 4: on himself. Technically, the Media Bargaining Code falls under Jim Chalmers. 134 00:08:54,360 --> 00:08:58,040 Speaker 4: Jim Chalmers recused himself over a conflict of interest. 135 00:08:58,920 --> 00:09:02,000 Speaker 1: The Treasurer's wife is a senior journalist with News 136 00:09:02,000 --> 00:09:03,360 Speaker 1: Corp Australia. 137 00:09:03,640 --> 00:09:07,760 Speaker 4: And instead of appointing a senior cabinet minister to lead 138 00:09:07,800 --> 00:09:12,920 Speaker 4: negotiations with Meta and swiftly determine whether the company should 139 00:09:12,920 --> 00:09:17,360 Speaker 4: be designated under the code, Anthony Albanese picked Assistant Treasurer 140 00:09:17,440 --> 00:09:18,400 Speaker 4: Stephen Jones. 141 00:09:19,000 --> 00:09:21,320 Speaker 3: He sits outside of Cabinet and he 142 00:09:21,400 --> 00:09:25,160 Speaker 4: is the one who's been given the responsibility to determine 143 00:09:25,160 --> 00:09:29,400 Speaker 4: the government's position. And again a lot of rhetoric from 144 00:09:29,400 --> 00:09:33,880 Speaker 4: Mr Jones. We're hearing that he is seriously considering an 145 00:09:33,920 --> 00:09:38,480 Speaker 4: alternative option of imposing a general levy, a government ordered levy, 146 00:09:38,920 --> 00:09:43,440 Speaker 4: instead of invoking the designation powers that the government has. 147 00:09:44,080 --> 00:09:44,520 Speaker 3: I think the 148 00:09:44,480 --> 00:09:47,640 Speaker 4: problem here is time is just ticking.
There's only three 149 00:09:48,120 --> 00:09:51,400 Speaker 4: joint sitting weeks left of the year, which means both 150 00:09:51,400 --> 00:09:54,280 Speaker 4: the Senate and the House of Representatives sitting, because I 151 00:09:54,320 --> 00:09:56,640 Speaker 4: think the government thought they might be having an early election. 152 00:09:57,520 --> 00:10:01,520 Speaker 4: It's almost like they're dumping out all this legislation knowing 153 00:10:01,600 --> 00:10:03,800 Speaker 4: full well that the reality of getting all of it 154 00:10:03,920 --> 00:10:06,640 Speaker 4: through by the time of an election is looking very unlikely. 155 00:10:10,640 --> 00:10:28,800 Speaker 1: Coming up: what the news organisations want. News Corp wants 156 00:10:28,840 --> 00:10:32,200 Speaker 1: the government to create a new system called a social licence, 157 00:10:32,440 --> 00:10:34,959 Speaker 1: where giant tech companies would have to pay an annual 158 00:10:35,040 --> 00:10:38,840 Speaker 1: licence fee to operate in Australia and be held liable 159 00:10:38,960 --> 00:10:44,079 Speaker 1: for algorithmically amplified content. That's like the video of someone's death, 160 00:10:44,360 --> 00:10:46,400 Speaker 1: which the eighteen year old we heard from at the 161 00:10:46,400 --> 00:10:49,720 Speaker 1: top of the episode said she saw during COVID, so 162 00:10:49,920 --> 00:10:53,520 Speaker 1: when she was fourteen or fifteen. News Corp also wants the 163 00:10:53,559 --> 00:10:57,560 Speaker 1: social media companies to have to implement customer complaint systems so 164 00:10:57,600 --> 00:11:00,120 Speaker 1: you can ring a call centre here in Australia to 165 00:11:00,200 --> 00:11:03,280 Speaker 1: raise a problem.
They also want something called an ex 166 00:11:03,400 --> 00:11:08,240 Speaker 1: ante competition framework. That means regulations that look forward, 167 00:11:08,640 --> 00:11:13,000 Speaker 1: anticipating harmful situations as they arise and taking action to 168 00:11:13,080 --> 00:11:18,840 Speaker 1: regulate the platforms' behaviour. Geoff, these are our bosses. This 169 00:11:18,960 --> 00:11:22,559 Speaker 1: is what they want. But as an independent analyst yourself, 170 00:11:22,880 --> 00:11:25,320 Speaker 1: do you think the Albanese government is actually going to 171 00:11:25,360 --> 00:11:28,520 Speaker 1: designate Meta, and would the opposition do it? 172 00:11:28,520 --> 00:11:32,200 Speaker 4: It's interesting, because the Coalition on a whole range of 173 00:11:32,360 --> 00:11:36,280 Speaker 4: different policy areas is still sticking in that opposition space 174 00:11:36,320 --> 00:11:38,640 Speaker 4: where they say it's the government of the day, 175 00:11:38,800 --> 00:11:42,680 Speaker 4: it is their decision to make. But as that election 176 00:11:42,800 --> 00:11:44,800 Speaker 4: comes around, we're going to have to see some clearer 177 00:11:45,160 --> 00:11:48,640 Speaker 4: policy indicators from the Coalition. As for the government, there's 178 00:11:48,679 --> 00:11:51,760 Speaker 4: been lots of mixed messaging and a lot of concern 179 00:11:52,000 --> 00:11:56,360 Speaker 4: that a more senior cabinet minister, potentially Michelle Rowland, or 180 00:11:56,400 --> 00:12:00,600 Speaker 4: even the Prime Minister himself, hasn't personally taken on this issue.
181 00:12:00,880 --> 00:12:03,760 Speaker 4: What it means is we've got a junior minister who sits outside 182 00:12:03,760 --> 00:12:07,360 Speaker 4: of cabinet leading this, when you recall Josh 183 00:12:07,440 --> 00:12:11,880 Speaker 4: Frydenberg, with the support of Scott Morrison, had personally 184 00:12:12,400 --> 00:12:15,960 Speaker 4: taken control of that and they got a big outcome 185 00:12:16,040 --> 00:12:20,720 Speaker 4: from it. And if Australia doesn't designate, it really does 186 00:12:20,840 --> 00:12:26,120 Speaker 4: undermine the future of what was a world-first and 187 00:12:26,160 --> 00:12:28,240 Speaker 4: world-leading media bargaining code. 188 00:12:28,440 --> 00:12:31,320 Speaker 1: And what about that social licence, Geoff? Do you feel 189 00:12:31,360 --> 00:12:33,600 Speaker 1: like that's an idea that either the government or the 190 00:12:33,600 --> 00:12:35,320 Speaker 1: opposition are actually going to embrace? 191 00:12:36,160 --> 00:12:39,240 Speaker 4: Look, it's a new idea that's being floated and I'm 192 00:12:39,280 --> 00:12:42,480 Speaker 4: sure that both parties would have a close look at it. 193 00:12:42,920 --> 00:12:47,319 Speaker 4: I did ask Michelle Rowland about a social licence and 194 00:12:47,800 --> 00:12:51,720 Speaker 4: her response was there is nothing special about social media 195 00:12:51,800 --> 00:12:54,959 Speaker 4: companies when it comes to adherence to Australian law. 196 00:12:55,559 --> 00:12:57,920 Speaker 3: These companies aren't licensed 197 00:12:57,360 --> 00:13:01,199 Speaker 4: in the way that traditional broadcasters and telecommunications services are, 198 00:13:01,520 --> 00:13:04,800 Speaker 4: but they are regulated and they do have obligations to society. 199 00:13:04,920 --> 00:13:07,880 Speaker 3: So not an overwhelming kind of positive response.
200 00:13:08,600 --> 00:13:12,280 Speaker 1: Yeah, it's an interesting notion, isn't it, that these companies 201 00:13:12,320 --> 00:13:14,640 Speaker 1: are regulated? I mean, are they? 202 00:13:15,040 --> 00:13:17,720 Speaker 3: Well, if you look at the tax arrangements, 203 00:13:17,800 --> 00:13:21,000 Speaker 4: if you look at their behaviours, I mean X now 204 00:13:21,080 --> 00:13:24,480 Speaker 4: have no staff based in Australia. If you wanted to 205 00:13:24,600 --> 00:13:28,359 Speaker 4: complain to any of these big companies, it's very, very difficult. 206 00:13:29,200 --> 00:13:32,080 Speaker 4: None of that really shows that there's any sort of 207 00:13:32,400 --> 00:13:35,040 Speaker 4: fear from them around Australian regulation. 208 00:13:42,360 --> 00:13:47,079 Speaker 1: Geoff Chambers is The Australian's chief political correspondent. You can 209 00:13:47,160 --> 00:13:49,880 Speaker 1: check out News Corp's submission and read what the opposition 210 00:13:50,160 --> 00:13:53,560 Speaker 1: and the social platforms say right now at 211 00:13:53,720 --> 00:14:03,479 Speaker 1: theaustralian.com.au