1 00:00:03,760 --> 00:00:05,640 Speaker 1: From the Australian. Here's what's on the front. 2 00:00:05,720 --> 00:00:09,080 Speaker 2: I'm Claire Harvey. It's Thursday, July thirty one, twenty twenty five. 3 00:00:12,400 --> 00:00:16,439 Speaker 2: The Albanese government says recognizing Palestinian statehood is essential to 4 00:00:16,480 --> 00:00:19,439 Speaker 2: bringing peace to the Middle East. The declaration comes on 5 00:00:19,480 --> 00:00:22,680 Speaker 2: the heels of an ultimatum by Britain. It says it'll 6 00:00:22,680 --> 00:00:28,000 Speaker 2: formally recognize Palestine in September if Israel doesn't cease hostilities. 7 00:00:30,920 --> 00:00:34,879 Speaker 2: The Australian's revealing today the Iranian exile community here is 8 00:00:35,000 --> 00:00:39,160 Speaker 2: living in fear, with loyalists of the Iranian regime stalking, 9 00:00:39,320 --> 00:00:44,200 Speaker 2: surveilling and threatening anyone who speaks out against Tehran. That's 10 00:00:44,200 --> 00:00:48,040 Speaker 2: an exclusive by our reporter Muhammad Alpharez, live now at 11 00:00:48,080 --> 00:00:49,960 Speaker 2: the Australian dot Com dot a U. 12 00:00:53,680 --> 00:00:55,400 Speaker 1: The Wiggles have failed. 13 00:00:56,080 --> 00:00:59,840 Speaker 2: Not even everyone's favorite kids entertainers could talk the Albanese 14 00:00:59,840 --> 00:01:03,880 Speaker 2: government out of banning YouTube for children under sixteen. 15 00:01:04,400 --> 00:01:07,680 Speaker 2: That will bring YouTube in line with other social networks, 16 00:01:07,880 --> 00:01:13,240 Speaker 2: which had been outraged when YouTube was originally excluded from the ban. Today, 17 00:01:13,440 --> 00:01:16,560 Speaker 2: inside the back room lobbying behind a decision that'll have 18 00:01:16,800 --> 00:01:19,160 Speaker 2: every parent wondering what's 19 00:01:18,920 --> 00:01:28,600 Speaker 1: okay for kids to consume on YouTube. 
20 00:01:28,640 --> 00:01:33,039 Speaker 2: The Wiggles have eight point five million subscribers. This is 21 00:01:33,080 --> 00:01:36,000 Speaker 2: where they grow the next generation of kids whose mums 22 00:01:36,000 --> 00:01:39,760 Speaker 2: and dads pay to stream their albums, buy their merchandise, 23 00:01:39,400 --> 00:01:42,679 Speaker 1: show up at their gigs. But now that audience is 24 00:01:42,840 --> 00:01:45,240 Speaker 1: under threat. But do not let them in. 25 00:01:46,959 --> 00:01:50,240 Speaker 2: The Wiggles and every other creator of content for kids 26 00:01:50,680 --> 00:01:52,960 Speaker 2: are now subject to a new ban by the federal 27 00:01:53,000 --> 00:01:58,280 Speaker 2: government determined to bring social media platforms under control. YouTube, 28 00:01:58,400 --> 00:02:02,280 Speaker 2: owned by Google, has been furiously lobbying the government to 29 00:02:02,440 --> 00:02:05,840 Speaker 2: ensure it was excluded from the social media ban for 30 00:02:05,920 --> 00:02:06,960 Speaker 2: kids under sixteen. 31 00:02:08,080 --> 00:02:11,239 Speaker 3: YouTube did send the Wiggles to try and persuade 32 00:02:11,280 --> 00:02:12,240 Speaker 3: me to their position. 33 00:02:13,200 --> 00:02:16,440 Speaker 2: That's Anika Wells, the Federal Communications Minister. 34 00:02:16,880 --> 00:02:18,800 Speaker 3: For clarity, because we've got a lot of incomings on this. 35 00:02:18,919 --> 00:02:21,280 Speaker 3: It was the black skivvies, it was Wiggles Inc. 36 00:02:21,760 --> 00:02:25,959 Speaker 3: It was Wiggles management, not individual members of our cherished 37 00:02:26,120 --> 00:02:29,280 Speaker 3: national icon, the Wiggles. 38 00:02:28,240 --> 00:02:34,799 Speaker 4: We love the Wiggles. My government is pro-Wiggle. 
39 00:02:37,080 --> 00:02:39,440 Speaker 3: Their argument, and they had a few, was 40 00:02:39,600 --> 00:02:41,880 Speaker 3: in essence that they are a video streaming platform, not 41 00:02:41,919 --> 00:02:46,960 Speaker 3: a social media platform. Catch that: a video platform, not 42 00:02:47,080 --> 00:02:51,679 Speaker 3: a social platform. Last year, when the government passed its 43 00:02:51,720 --> 00:02:56,040 Speaker 3: new social media ban, YouTube was excluded. The government had 44 00:02:56,080 --> 00:03:00,040 Speaker 3: accepted that argument: the issues of trolling or bullying on 45 00:03:00,080 --> 00:03:04,880 Speaker 3: platforms like Snapchat or TikTok, or predatory adults contacting 46 00:03:04,960 --> 00:03:07,680 Speaker 3: kids, weren't so relevant to YouTube. 47 00:03:08,840 --> 00:03:12,520 Speaker 5: YouTube was initially excluded from the ban, but that immediately 48 00:03:12,600 --> 00:03:14,000 Speaker 5: drew rage from the 49 00:03:14,000 --> 00:03:15,400 Speaker 6: other social media platforms. 50 00:03:15,560 --> 00:03:18,839 Speaker 5: They argued that YouTube was very, very similar to them. 51 00:03:19,200 --> 00:03:23,320 Speaker 1: This is Jack Quail, a federal politics reporter with The Australian. 52 00:03:23,560 --> 00:03:27,440 Speaker 5: And what subsequently ensued was this very strong lobbying campaign 53 00:03:27,480 --> 00:03:30,200 Speaker 5: from YouTube to ensure its continued exclusion from the ban. 
54 00:03:30,480 --> 00:03:33,880 Speaker 5: But after the eSafety Commissioner advised that, no, YouTube 55 00:03:33,880 --> 00:03:36,000 Speaker 5: should be roped in, there was really no going back 56 00:03:36,000 --> 00:03:39,440 Speaker 5: from there, and in a last-ditch attempt last week, 57 00:03:39,560 --> 00:03:43,000 Speaker 5: YouTube threatened to launch a High Court challenge against the legislation, 58 00:03:43,600 --> 00:03:46,160 Speaker 5: basically arguing that it amounted to a breach of the 59 00:03:46,200 --> 00:03:48,040 Speaker 5: implied freedom of political communication. 60 00:03:49,680 --> 00:03:53,600 Speaker 2: Google's lobbying was intense, but they had a formidable opponent 61 00:03:53,760 --> 00:03:57,560 Speaker 2: in eSafety Commissioner Julie Inman Grant. She released a 62 00:03:57,600 --> 00:04:01,120 Speaker 2: report this month that put YouTube squarely in the frame. 63 00:04:01,680 --> 00:04:04,600 Speaker 2: Here's a line from that report; we've used a voice actor. 64 00:04:06,080 --> 00:04:08,960 Speaker 7: The most common platforms on which children reported that their 65 00:04:09,000 --> 00:04:12,640 Speaker 7: most recent or impactful exposure to content associated with harm 66 00:04:12,680 --> 00:04:17,400 Speaker 7: had occurred were YouTube, thirty seven percent, and TikTok, twenty 67 00:04:17,440 --> 00:04:18,000 Speaker 7: three percent. 68 00:04:19,360 --> 00:04:24,159 Speaker 2: Content associated with harm, the report says, includes sexist, misogynistic, 69 00:04:24,200 --> 00:04:28,960 Speaker 2: and hateful content, content depicting dangerous online challenges or fight videos, 70 00:04:29,279 --> 00:04:33,560 Speaker 2: and content that encourages unhealthy eating or exercise habits. So, 71 00:04:33,800 --> 00:04:38,000 Speaker 2: at least in part, it's the adolescence effect, the rabbit hole. 
72 00:04:39,160 --> 00:04:42,640 Speaker 5: Certainly, I mean, I'm twenty six years old, and YouTube's 73 00:04:42,680 --> 00:04:45,920 Speaker 5: constantly serving up personalized videos after you've watched the first one. 74 00:04:46,000 --> 00:04:48,640 Speaker 5: So it's really about, I guess, eliminating this rabbit hole 75 00:04:48,720 --> 00:04:52,640 Speaker 5: effect that the regulator has talked about, where children are 76 00:04:52,800 --> 00:04:55,240 Speaker 5: sucked in and spending hours and hours a day on 77 00:04:55,279 --> 00:04:56,920 Speaker 5: YouTube rather than other activities. 78 00:04:57,760 --> 00:05:01,400 Speaker 2: The report also said YouTube, along with TikTok, was exposing 79 00:05:01,480 --> 00:05:05,599 Speaker 2: kids to online hate. That is, someone said something offensive 80 00:05:05,680 --> 00:05:09,039 Speaker 2: or hateful to a child, presumably in the comments on 81 00:05:09,080 --> 00:05:14,839 Speaker 2: a video. Okay, Jack, so let's just step inside the 82 00:05:14,960 --> 00:05:19,040 Speaker 2: ordinary family home. Kids are watching YouTube, either on an 83 00:05:19,080 --> 00:05:21,839 Speaker 2: iPad or a tablet, or on a phone, or on 84 00:05:21,880 --> 00:05:23,000 Speaker 2: the family television. 85 00:05:23,520 --> 00:05:25,200 Speaker 1: They may or may not be logged in. 86 00:05:25,560 --> 00:05:28,600 Speaker 2: Some kids have a YouTube Kids account, some are watching 87 00:05:28,839 --> 00:05:31,200 Speaker 2: YouTube while not logged in at all, and some are 88 00:05:31,240 --> 00:05:34,479 Speaker 2: watching on a parent's account. What will this change mean 89 00:05:34,600 --> 00:05:36,359 Speaker 2: to the way YouTube's actually consumed? 90 00:05:37,080 --> 00:05:40,320 Speaker 5: So, if you're watching YouTube in a logged out state, 91 00:05:40,720 --> 00:05:43,320 Speaker 5: it means that there's really limited access to age 92 00:05:43,360 --> 00:05:46,320 Speaker 5: inappropriate content. 
You can't comment on a video, you can't 93 00:05:46,400 --> 00:05:48,719 Speaker 5: upload a video, and you don't receive the personalized video 94 00:05:48,800 --> 00:05:52,200 Speaker 5: recommendations that come with a logged in account. So if 95 00:05:52,240 --> 00:05:54,880 Speaker 5: I'm uploading a video, let's say I'm a content creator 96 00:05:54,920 --> 00:05:57,719 Speaker 5: making a video about Lego, I sign off on that 97 00:05:57,800 --> 00:06:00,760 Speaker 5: this is appropriate for children, and YouTube checks through their 98 00:06:00,800 --> 00:06:05,680 Speaker 5: algorithms when any video is uploaded. So children without access 99 00:06:05,720 --> 00:06:07,640 Speaker 5: to an account will still be able to watch those videos, 100 00:06:07,720 --> 00:06:10,919 Speaker 5: but anything that isn't age appropriate can no longer 101 00:06:11,080 --> 00:06:14,120 Speaker 5: be watched. So if a child is not 102 00:06:14,240 --> 00:06:17,039 Speaker 5: signed in, they'll be able to watch videos about history 103 00:06:17,200 --> 00:06:19,320 Speaker 5: or building Lego, but they won't be able to watch 104 00:06:19,400 --> 00:06:21,520 Speaker 5: videos with swearing or violence. 105 00:06:21,160 --> 00:06:22,640 Speaker 6: That you can still watch on YouTube now. 106 00:06:23,040 --> 00:06:25,000 Speaker 5: But if a child is watching on their parents' account, 107 00:06:25,000 --> 00:06:27,280 Speaker 5: they can watch anything on YouTube. But it's up to 108 00:06:27,320 --> 00:06:29,600 Speaker 5: the parents then to ensure that their children are watching 109 00:06:29,640 --> 00:06:30,680 Speaker 5: age appropriate content. 110 00:06:31,839 --> 00:06:35,719 Speaker 2: So Jack, it sounds as though this is designed to force 111 00:06:35,800 --> 00:06:38,480 Speaker 2: parents to pay attention, perhaps in a way that some 112 00:06:38,640 --> 00:06:41,760 Speaker 2: parents at the moment aren't. 
Parents might hand their kids 113 00:06:41,760 --> 00:06:44,520 Speaker 2: an iPad or an iPhone and think that because the 114 00:06:44,600 --> 00:06:46,839 Speaker 2: kid is logged in and they're watching YouTube Kids, that 115 00:06:46,920 --> 00:06:47,799 Speaker 2: it's safe. 116 00:06:48,240 --> 00:06:48,480 Speaker 6: Yeah. 117 00:06:48,520 --> 00:06:50,960 Speaker 5: Absolutely. And I mean it also gives agency to the 118 00:06:51,040 --> 00:06:54,920 Speaker 5: parents as well, so they can say, sorry, kids, YouTube's banned, 119 00:06:55,120 --> 00:06:57,080 Speaker 5: you can't watch it without an account. 120 00:06:57,240 --> 00:06:59,120 Speaker 6: Go outside and play in the playground instead. 121 00:07:01,560 --> 00:07:04,320 Speaker 2: The government isn't saying don't let your kids watch YouTube. 122 00:07:04,440 --> 00:07:06,719 Speaker 2: They're saying be in the room with them, know what 123 00:07:06,760 --> 00:07:11,160 Speaker 2: they're watching. They've acknowledged that teachers, coaches and other educators 124 00:07:11,320 --> 00:07:15,600 Speaker 2: use YouTube as a tool, like Eddie Woo's viral maths videos, 125 00:07:15,960 --> 00:07:18,000 Speaker 2: news and current affairs and history. 126 00:07:18,800 --> 00:07:22,720 Speaker 4: We want Australian parents and families to know that we've 127 00:07:22,760 --> 00:07:25,640 Speaker 4: got your back. We know this is not the only 128 00:07:25,720 --> 00:07:29,400 Speaker 4: solution and there's more to do, but it will make 129 00:07:29,440 --> 00:07:33,480 Speaker 4: a difference. Parents want to get their kids off their 130 00:07:33,480 --> 00:07:38,720 Speaker 4: devices and onto the playing fields. They want children to 131 00:07:38,760 --> 00:07:42,960 Speaker 4: be able to enjoy their childhood and that is what 132 00:07:43,040 --> 00:07:44,600 Speaker 4: this is about. 
133 00:07:45,400 --> 00:07:48,800 Speaker 2: Is that part of the design of this scheme overall, Jack? 134 00:07:48,880 --> 00:07:52,560 Speaker 2: Do you think that the government doesn't necessarily think that 135 00:07:53,200 --> 00:07:58,640 Speaker 2: there'll be complete compliance by either the platforms or by users, 136 00:07:59,240 --> 00:08:03,280 Speaker 2: but that they're giving parents some moral authority, really, backed 137 00:08:03,280 --> 00:08:05,920 Speaker 2: by legislation, a bit like alcohol laws. You know, no, 138 00:08:06,040 --> 00:08:08,720 Speaker 2: you can't have a drink, it's actually illegal until you 139 00:08:08,760 --> 00:08:09,360 Speaker 2: turn eighteen. 140 00:08:09,600 --> 00:08:10,560 Speaker 1: Is it that idea? 141 00:08:10,640 --> 00:08:12,760 Speaker 6: Absolutely. And the Prime Minister talked about this today. 142 00:08:12,760 --> 00:08:14,600 Speaker 5: I mean, no ban is going to be one hundred 143 00:08:14,760 --> 00:08:18,440 Speaker 5: percent effective, but it is providing parents with that agency 144 00:08:18,480 --> 00:08:20,760 Speaker 5: to tell their children, no, it's time to get off 145 00:08:20,800 --> 00:08:23,880 Speaker 5: the phone. And also, I guess, reduce the addictive nature 146 00:08:23,880 --> 00:08:26,400 Speaker 5: of technology. I mean Julie Inman Grant, the eSafety Commissioner, 147 00:08:26,400 --> 00:08:27,880 Speaker 5: has really talked about this frequently. 148 00:08:28,080 --> 00:08:31,560 Speaker 6: You know, that sort of addictive pull of social media. 149 00:08:32,200 --> 00:08:36,520 Speaker 8: The relationship between social media and children's mental health is 150 00:08:36,600 --> 00:08:40,320 Speaker 8: one of the most important conversations of our time. So 151 00:08:40,400 --> 00:08:44,960 Speaker 8: while the tech industry regresses backwards, we must move forwards. 
152 00:08:45,720 --> 00:08:48,640 Speaker 5: The concern for social media platforms is under this legislation 153 00:08:48,760 --> 00:08:50,760 Speaker 5: they have to ensure that no one under sixteen years 154 00:08:50,760 --> 00:08:54,280 Speaker 5: of age is on their platform, and I guess ambiguity 155 00:08:54,320 --> 00:08:58,720 Speaker 5: around taking reasonable steps is causing a lot of concern 156 00:08:58,840 --> 00:09:02,160 Speaker 5: for the industry. This legislation levies very hefty 157 00:09:02,200 --> 00:09:05,680 Speaker 5: fines if they haven't taken those reasonable steps, up to 158 00:09:05,720 --> 00:09:10,000 Speaker 5: fifty million dollars. So for social media companies, it's really 159 00:09:10,040 --> 00:09:12,640 Speaker 5: important that they do comply with this legislation. 160 00:09:15,840 --> 00:09:16,360 Speaker 1: Coming up: 161 00:09:16,480 --> 00:09:35,600 Speaker 2: why gaming and dating platforms are next. If you've got kids, 162 00:09:35,760 --> 00:09:37,960 Speaker 2: you know they're spending a lot of time online that 163 00:09:38,120 --> 00:09:44,400 Speaker 2: isn't necessarily social media: gaming, using generative AI, drawing, designing, 164 00:09:44,679 --> 00:09:45,040 Speaker 2: and for 165 00:09:44,960 --> 00:09:46,280 Speaker 1: adults, there's dating. 166 00:09:46,800 --> 00:09:50,600 Speaker 2: Those platforms are excluded from the ban for now, but 167 00:09:50,640 --> 00:09:55,160 Speaker 2: they'll be next. Why do the social media companies so 168 00:09:55,320 --> 00:09:57,880 Speaker 2: desperately want children to be using their platforms? 169 00:09:58,400 --> 00:10:01,199 Speaker 5: Kids are the consumers of the future, obviously, and getting 170 00:10:01,240 --> 00:10:03,400 Speaker 5: them young is great. 
It means that the dollars can 171 00:10:03,440 --> 00:10:07,760 Speaker 5: continue to roll into the social media platforms' pockets for 172 00:10:08,040 --> 00:10:11,120 Speaker 5: years and years to come. But kids are also very, 173 00:10:11,200 --> 00:10:13,560 Speaker 5: very powerful consumers as well. I mean, if they see 174 00:10:13,640 --> 00:10:17,080 Speaker 5: an ad online, very impressionable young minds, you know, they want 175 00:10:17,160 --> 00:10:20,640 Speaker 5: to buy it. And social media platforms are presently a popular 176 00:10:20,679 --> 00:10:23,760 Speaker 5: medium to reach children. They're not watching free-to-air TV, 177 00:10:24,280 --> 00:10:28,040 Speaker 5: and so for advertisers it's a great way to access 178 00:10:28,040 --> 00:10:28,559 Speaker 5: the market. 179 00:10:29,120 --> 00:10:31,439 Speaker 2: You've also revealed in The Australian, Jack, that the 180 00:10:31,559 --> 00:10:36,240 Speaker 2: eSafety Commission's moving to establish industry specific codes that kind 181 00:10:36,280 --> 00:10:41,760 Speaker 2: of cover search engines, gaming platforms, dating websites. What does 182 00:10:41,800 --> 00:10:43,760 Speaker 2: that mean and how does that differ from what they've 183 00:10:43,800 --> 00:10:44,600 Speaker 2: already done? 184 00:10:45,000 --> 00:10:46,760 Speaker 5: I think what this is a recognition of is the 185 00:10:46,760 --> 00:10:50,360 Speaker 5: fact that it's not just children at risk online. 186 00:10:49,840 --> 00:10:52,520 Speaker 6: You know, online gaming, dating apps. 187 00:10:52,720 --> 00:10:56,480 Speaker 5: There are risks here across the board, and so what 188 00:10:56,760 --> 00:10:58,800 Speaker 5: the eSafety Commission is trying to do here is 189 00:10:58,840 --> 00:11:02,800 Speaker 5: create a system of online regulation that ensures that every 190 00:11:02,840 --> 00:11:04,920 Speaker 5: consumer on the Internet is as safe as they possibly 191 00:11:04,920 --> 00:11:05,200 Speaker 5: can be. 
192 00:11:05,720 --> 00:11:06,640 Speaker 6: This is just the beginning. 193 00:11:06,960 --> 00:11:09,440 Speaker 5: Julie Inman Grant, the eSafety Commissioner, she's got the ear 194 00:11:09,520 --> 00:11:12,600 Speaker 5: of the government. She is very determined, and at the 195 00:11:12,600 --> 00:11:15,559 Speaker 5: same time, social media platforms don't have a lot of 196 00:11:15,640 --> 00:11:16,640 Speaker 5: popular support in the 197 00:11:16,559 --> 00:11:18,760 Speaker 6: community, even though they're very widely used. 198 00:11:18,840 --> 00:11:21,320 Speaker 5: So it'll be interesting to see just how far the 199 00:11:21,320 --> 00:11:23,400 Speaker 5: government goes here in clamping 200 00:11:23,000 --> 00:11:25,240 Speaker 6: down on social media platforms more broadly. 201 00:11:34,679 --> 00:11:37,720 Speaker 2: Jack Quail is a federal political reporter with The Australian. You 202 00:11:37,760 --> 00:11:40,720 Speaker 2: can read all our coverage of politics, state, federal and 203 00:11:40,800 --> 00:11:44,080 Speaker 2: local right now at the Australian dot com dot au