Speaker 1: Hello, this is the Happy Families Podcast. Minimum age legislation is creeping up. It's only a couple of months away now. December tenth is when the Federal Government's social media minimum age legislation kicks off. In a lot of ways, this is a first of its kind across the globe, and there are a lot of people who are watching with interest. Today on the Happy Families Podcast, we unpack what it means, what people misunderstand, and a whole lot more about how this thing is actually going to work. The clock's ticking, and we're going to tell you all about it right after this. Stay with us.

Speaker 1: Hello, welcome to the Happy Families Podcast, real parenting solutions every single day on Australia's most downloaded parenting podcast. We love that you choose to listen to us. Hopefully you've had a great school holidays. A couple of states are still going through their holidays, but here in Queensland, where we live, it's back to school, and we're kicking off the term with a conversation that is on a lot of people's minds: the social media minimum age legislation. Term four is the last term of school where kids under sixteen get access to social media. It's all about to change.

Speaker 2: Tilly died at just fifteen. Today her mum took her story from regional New South Wales to the world, speaking at the United Nations. Exhausted and broken, she just couldn't fight anymore, and social media played a direct role in her death.

Speaker 3: Well, I've got a whole heap of questions for you today. Okay, I'd really love to unpack this, because I think that there's plenty of parents who either don't understand the policy and what's going to change for them and their children, or kind of misunderstand what the purpose of it is. So my first question for you is: what does this policy actually do?

Speaker 1: So the policy is designed literally to limit access to social media platforms for children under the age of sixteen. That's it in a nutshell. It's pretty widely misunderstood.
It's been very much mischaracterized around the world. But if I was to read this directly from the eSafety Commissioner's FAQ about it, it says this: age-restricted platforms won't be allowed to let under-sixteens create or keep an account, and that's because being logged into an account increases the likelihood that they're going to be exposed to pressures and risks that can be hard to deal with. These come from social media platform design features that encourage them to spend more time on screens while also serving up content that can harm their health and wellbeing. Under-sixteens, and this is critical, will still be able to see publicly available social media content that doesn't require being logged into an account to view. Okay, so that's what it does. It keeps kids from having an account, and it keeps kids from getting access to anything that's locked down behind an account. But if there is content on social media platforms that is available for public viewing, children and teachers and whoever else will be able to access that content, and it will be able to be viewed.

Speaker 3: So you're suggesting that you don't need to have a profile to be able to access publicly available content.

Speaker 1: Yeah, exactly, even now as an adult. So I believe that you don't have a TikTok account, right? Yep. But you can still jump online and look for short videos, and TikTok content will come up in your online search. You don't have to have an account to be able to see the content. So the social media minimum age legislation, once more, does not stop kids from viewing content on social media. They can see publicly available social media content so long as it doesn't require being logged into an account. If people have privacy settings on, you have to have an account to be able to get in there and be friends with them, whatever.
But they will not be able to keep an account or create an account, and therefore have the algorithm doing what algorithms do, which is serve up increasingly harmful material and provide ever more compelling content to keep them there longer.

Speaker 3: So it seems like a really lofty goal on the government's part to, I guess, rein in some control over what has been just openly available to these kids.

Speaker 1: Yeah, I mean, social media has been the wild, wild west. It still is.

Speaker 3: How are they going to monitor it? Like, how do they know that this child is sixteen? We've seen so many times that kids are pretty clever, and they know how to get their way around all these different controls and platforms. So how do they monitor it?

Speaker 1: So I'm going to be talking with Julie Inman Grant, the eSafety Commissioner, in a couple of weeks and ask a little bit more about this. The details on this are a little bit skinny, but basically, again from the FAQ on the eSafety Commissioner's website, there is a range of technologies available to check age at the point of account sign-up and later on as well. It's going to be up to each platform to decide which method they choose to use, and eSafety is going to publish regulatory guidance to help platforms decide which methods are going to be most effective and comply with the Online Safety Act. The guidelines are also going to draw on the Australian Government's Age Assurance Technology Trial, as well as a whole lot of stakeholder consultations. And essentially, no Australian is going to be forced to use government ID to prove their age online; the age-restricted social media platforms have to offer reasonable alternatives to users. But basically, this is going to be on the platforms to make sure that all users are over the age of sixteen.
Speaker 3: So I find this part curious, then, because we've done so many podcasts where we've actually acknowledged that these platforms do not have our children's best interests at heart, correct? And yet all of a sudden, we're actually putting the onus back on them to take care of our kids.

Speaker 1: Yeah, and that's because too many parents either, A, won't do it, or, B, want to be their kids' best friends, so they actively assist them to be online, or, C, it's just too hard. Like, parents are trying, they're trying, they're trying, and the kids find ways around it. So now the focus is very much on what the platforms are going to do to keep kids off.

Speaker 3: So who gets in trouble if someone falls through the cracks? Is it the platform, or is it parents and kids?

Speaker 1: Okay, so again, if we have a look at the FAQ on the eSafety website, it says this: there are no penalties for under-sixteens who access an age-restricted social media platform. There are no penalties for their parents or carers. This is about protecting young people, not punishing or isolating them. The goal is to help parents and carers support the health and wellbeing of under-sixteens. The penalties, therefore, go to the platforms. A court can order civil penalties for platforms that don't take reasonable steps to prevent underage users from having accounts on their platforms. This includes court-imposed fines of up to a total of forty-nine point five million Australian dollars. That is for each case brought against Meta, or against TikTok, or against Snap, or against YouTube. For each case where it can be proved that they have not taken reasonable steps to prevent an under-sixteen accessing their platform, they can be fined up to forty-nine point five million dollars.

Speaker 3: I know you and a handful of your colleagues worked really hard to get this legislation in place, and I know that it means a lot to you. But I am curious.
Do you really think that this is going to move the needle at all for our kids?

Speaker 1: Yeah, I mean, it will. It's not going to change everything. They've still got access to gaming. There's still work to be done in terms of access to explicit content. They will still be able to be on the platforms in one form or another, in that the platforms can continue to serve up content in the public domain, and kids and screens are inextricably linked. But I think it is going to soften the impact of the platforms. I think the platforms, for under-sixteens, are going to become less engaging, and that's because the algorithm no longer knows who they are. They can't log in unless they're using their parents' accounts, and I don't know too many parents that are going to be like, "Of course you can have my social media account." Here's the thing: I reckon there's going to be some parents who are going to sign up for, like, a second account and just let the kids use that one. "You can be me." But the algorithm, apparently (this is hearsay), will be able to tell whether it's a teenager using the account or not, and hopefully that will flag it and be like, okay, a parent has signed up for this and pretended it's them, but it's actually the child. I have a sense that the technology companies are going to be able to figure out who's using it, and forty-nine point five million dollar fines are not attractive to companies like this. This should work. Let me also highlight that there is enormous support among parents, and even, surprisingly, some growing support among young people, for this ban, for this minimum age legislation. Ahead of the implementation, YouGov did a survey and found that seventy-seven percent of Australians back the under-sixteen social media minimum age legislation. That's up from sixty-one percent just a few months earlier. Only twenty-three percent opposed the measure.
In addition, eighty-seven percent of Australians support the introduction of stronger penalties for social media companies that fail to comply. The polls are showing really strong support across all aspects of the government's proposed social media regulations.

Speaker 3: So I feel like there are going to be so many families that are going to be breathing a sigh of relief that finally they have a village that's actually starting to recognize the damage that this is doing to individuals and family units. But I'm also concerned that, in doing so, parents become less vigilant. And I think that this actually would suggest that, more than ever, as parents we actually need to band together, and we need to be more vigilant than we've ever been, because we've now been given this safeguard. It's in place, and if we can honor that safeguard and work together as communities to continue to safeguard our children in more profoundly impactful ways, that's when the needle shifts, not because the government's put in this policy.

Speaker 1: Yeah. Nature will fill a vacuum, and if the kids aren't able to do X, Y, Z, they'll find something else to do. And so it's going to be up to us to make sure that we're delaying access to screens, having conversations with kids about what they're doing on screens, and giving them enriching environments where they can go and be creative, come up with other ideas, do other things that are going to be far more in their best interests. If we just think, "Oh well, the government's done it, now we don't have to do anything," then we're selling our children short, and we have far too much belief in the power of legislation. It can move the needle, but it's not going to change everything.

Speaker 3: I was actually speaking with a young man recently about his real fears that this is just another ploy from the government to take control of us.
They want, they want, they want our ID. They want to, you know, they want to know everything about us, and we're heading towards a dictatorship because we no longer have the choice to decide whether or not we will have an account. How do you allay those fears? Because for him it was real. Like, it was intensely real for him.

Speaker 1: So, a couple of things. First of all, young men and young women who are getting all of their news from influencers who are spreading fear and panic are just going to end up with anxiety and be miserable about it. Number two, you don't have to have an account if you don't want to, so nobody's getting your details if you're not putting your details up there. If you do want to have an account, guess what: Alphabet is the parent company of Google, which also owns YouTube, and Meta. They know more about you than you know about you. If you're worried about the government knowing about you, I'd be just as concerned, if not more concerned, about what these corporations know. And they know everything. And by the way, the government knows everything about you as well. They know where you live. They've got all of your ID already because of things like driver's licenses and passports, and because of spending habits; the ATO does actually talk to the banks and gets all the information about where money is coming from and where money is going to. The idea that this is going to turn us into a surveillance state, that Big Brother is watching... that ship sailed years ago, right? Like, we're already there. We're already there. Yeah, if you don't want to be on the grid, use cash, get rid of your social media, don't have a credit card, be very, very careful about even having an email account, and, I don't know, get a dial-up phone for your wall in your kitchen, and don't talk to me on a mobile phone. It's conspiracy theory thinking at its most basic, and I just don't buy it.
The government already has all the information they need, is kind of where I'm going with that.

Speaker 3: So, you alluded to it earlier on in your comments: the idea that there is quite a lot of misunderstanding about what this policy is and what it is going to, I guess, give us as a community. Can you just unpack some of that for us?

Speaker 1: Yeah, okay, so let's go through it. I'm going to go through each platform, and I'll do this fairly quickly. All this information is available online, but if you listen to the pod and you're sort of going, "Oh yeah, I need to figure out what we're doing about this," here it is. YouTube: what stays for under-sixteens is full access to view content, but you're going to be accountless; you're going to be logged out. That includes via web browsers, and it includes the YouTube app. As an under-sixteen, you're logged out. Unless you're using your parents' account, because sometimes parents stay logged in, like on the TV or something like that, you're going to be logged out. And therefore what goes is the cross-device, cross-session personalized algorithmic recommendations. You know how, when you're on one device and you go to the next device, you can kind of pick up watching what you were watching from elsewhere? It knows who you are, it knows what you've been watching, and it continues to recommend things to you because it knows you on all of your devices. So that goes. The ability to upload and post content goes, and the ability to interact with strangers in the comments goes for YouTube as well. When it comes to Instagram: very, very limited access. As part of the way that they acquire users on Instagram, Meta pretty much restricts access to people who have an account.
You can get public access to landing pages, and maybe a little bit more, but they pretty strongly restrict browsing of any public content unless you have an account, and that's a business decision made by Meta; that's not an Australian government decision. So what's going to go if you're under sixteen? There go the cross-device, cross-session personalized algorithmic recommendations. Again, it doesn't know who you are. You also won't be able to have unlimited scrolling. That's just gone; that's against the way that Instagram works. You'll lose the ability to upload and post content, the ability to react to or comment on other people's posts, the ability to interact with strangers via the comments, and the ability to interact with strangers via DMs. No more direct messaging. Also, there won't be any notifications that lure you back to the platform, and you're going to lose most access unless Meta changes its policy. And you also won't have Meta AI access, which I think is fantastic. And that's going to apply pretty much across Facebook as well.

Speaker 3: So that is really going to impact how our teenagers communicate.

Speaker 1: Yeah, one hundred percent. And again, that's why building that community, finding ways to fill that vacuum, fill that hole, and making sure that kids have got a way to be in touch with each other matters. The good news is there are ways. I mean, Wayne Warburton was on the podcast about a month ago, and when I raised that with him, he just said it's ridiculous: there are so many ways for kids to get in touch with each other, even without using the dominant social media platforms, that if they're crying foul over that, they're just not using their imaginations. It's still completely doable. What about TikTok? Okay, once again, here's what stays: all of the public content via a web browser. That's going to remain available as it currently is.
Any public content and landing pages that you can access without an account, you'll still be able to see, and you will be able to see the comments from other users. But remember this: the TikTok mobile app will require an account to use, and if you're under sixteen, you won't be allowed to have an account. So here's what goes: access to private content, and, once again, the cross-device, cross-session personalized algorithmic recommendations. And I'm going to just throw in here: TikTok is owned by ByteDance, which is a Chinese-owned company, and the CCP, the Chinese Communist Party, has its tentacles in every single Chinese organization, so fundamentally those recommendations are controlled by the Chinese government. I think it's going to be a really good thing that our young kids aren't being exposed to what the Chinese government wants them to see. If you're on TikTok and you're under sixteen, you won't be able to upload and post content, or react to or comment on other people's posts, or interact with strangers, or send direct messages, and you won't get those notifications bringing you back to the platform. It's basically the same with Snapchat as well; that's the only major platform that we haven't covered. You still get all the public content via web browsers, but Snapchat is designed to be used with an account. So if you are a teenager under sixteen, you're not going to have disappearing messages, you're not going to get access to private content, and you're not going to get the gamified streaks that increase that pressure to keep you there. You're going to lose the notifications, all the stuff that we're talking about. This is going to severely hamper, restrict, and reduce access to and time on social media for our young people if they're under sixteen.
There will be ways around it, because people always find them, but generally speaking, when social media platforms are going to get stung forty-nine point five million dollars for not taking every reasonable precaution, this is a win for our young people.

Speaker 3: So, I guess my last question for you is this: understanding and recognizing how prevalent the use of social media has been for kids about thirteen plus, and sometimes much younger children, in essence we're going cold turkey. One day they're going to have access, and the next day they're not, right? Is there going to be anything available to parents to help them navigate that transition period?

Speaker 1: I keep on getting... like, I've had so many interviews in the press and in the media about this, and my response is: they'll be okay. They'll be a little bit sad, they'll be a little bit frustrated, and then they'll figure it out. Kids are creative; kids have ingenuity. Kids will start to hassle you about different ways that they can contact their friends. You'll have to do that thing that parents used to do before social media. Like, go back fifteen, twenty years, where parents used to talk to each other about getting the kids together, and then the kids are going to have to do things like ring each other, or send each other text messages, or get on the bike and ride to each other's houses again. There are going to be some kids who struggle. There are going to be some disadvantages; I'm not denying that. But overwhelmingly, I think this is a net positive.

Speaker 3: I think back on our parenting journey, and I remember the day our eldest received, against our desire, a mobile phone. And I look at the lives that we lived before those small devices were placed in our kids' hands, and maybe I'm wearing rose-colored glasses, but family life was so much simpler. There were some challenges, obviously; family life is full of it.
It's messy. But on the whole, the challenges that we were experiencing were everyday challenges, as opposed to a constant navigation of negatively impactful information that our children are being fed on a daily basis. And as much as there's been some good that's come out of them, honestly, I'd go back to the day when my phone was stuck on a wall any day.

Speaker 1: I know we're sounding like a couple of old fogies right now, but I just don't think that the changes that social media has brought to the world have been fundamentally positive. I think social media has been an overall net negative in far too many lives to justify its existence.

Speaker 3: If for no other reason than it zaps our time and literally takes us away from the people that matter most in our lives.

Speaker 1: Anyway, I'm going to talk to Julie Inman Grant, the eSafety Commissioner, in the next couple of weeks about more around this. If this has been helpful, please share it with somebody you know who's maybe struggling with, or has some questions around, these topics. And play it for your kids as well, because the kids will want to know what's going on, and this could be helpful. Play it on one-and-a-half speed; I mean, they don't need to listen to us talk in slow motion. The Happy Families Podcast has been produced today by Justin Ruland from Bridge Media. Mim Hammonds provides additional research, admin, and other support. And if you'd like more information about making your family happy, visit us at happyfamilies.com.au.