1 00:00:00,240 --> 00:00:03,120 Speaker 1: This is the Fitzy and Wippa with Kate Ritchie podcast. 2 00:00:03,320 --> 00:00:05,600 Speaker 1: Guys, wanted to bring you up to speed on thirty six 3 00:00:05,640 --> 00:00:09,520 Speaker 1: months and the social media age ban, changing 4 00:00:09,520 --> 00:00:12,600 Speaker 1: the age from thirteen to sixteen, because it's sitting in 5 00:00:12,640 --> 00:00:16,480 Speaker 1: Parliament this week. It's been fascinating to see so much unfold. 6 00:00:16,560 --> 00:00:18,760 Speaker 1: I mean Peter Dutton was in the studio back in June 7 00:00:19,079 --> 00:00:20,920 Speaker 1: and threw his support behind this campaign. 8 00:00:21,079 --> 00:00:22,320 Speaker 2: And then you know the Prime. 9 00:00:22,079 --> 00:00:26,239 Speaker 1: Minister a couple of weeks ago made that all important commitment, 10 00:00:26,600 --> 00:00:29,280 Speaker 1: an announcement that he would make this crucial change. 11 00:00:29,320 --> 00:00:31,680 Speaker 2: Well we saw you in Canberra too. 12 00:00:31,800 --> 00:00:33,360 Speaker 1: I mean this is off the back of one hundred 13 00:00:33,360 --> 00:00:36,279 Speaker 1: and thirty thousand signatures at thirty six months dot com 14 00:00:36,280 --> 00:00:39,800 Speaker 1: dot au. That's one hundred and thirty thousand motivated parents 15 00:00:39,840 --> 00:00:40,960 Speaker 1: saying enough is enough. 16 00:00:41,400 --> 00:00:42,640 Speaker 2: And what was great in support? 17 00:00:42,680 --> 00:00:46,000 Speaker 1: Also, a YouGov survey came out recently, this was 18 00:00:46,040 --> 00:00:49,720 Speaker 1: just yesterday, and they found that seventy seven percent 19 00:00:49,760 --> 00:00:52,960 Speaker 1: of the community support the change. So we understand that 20 00:00:53,040 --> 00:00:57,320 Speaker 1: everybody is behind this. But here's the fascinating part. I 21 00:00:57,440 --> 00:00:59,440 Speaker 1: just want to clear up a couple of things.
Because 22 00:00:59,760 --> 00:01:02,639 Speaker 1: you've got the big tech companies that come out and 23 00:01:02,720 --> 00:01:06,839 Speaker 1: simply try to muddy the water, because complications to them 24 00:01:07,319 --> 00:01:09,600 Speaker 1: mean delays, which means more profit. 25 00:01:09,720 --> 00:01:13,160 Speaker 2: They don't care about kids. They care about money, of course, 26 00:01:13,200 --> 00:01:15,600 Speaker 2: and we all know that. It's just that the angles 27 00:01:15,600 --> 00:01:16,880 Speaker 2: have been quite intriguing. 28 00:01:16,520 --> 00:01:18,759 Speaker 3: And I feel as though the more questions they ask, 29 00:01:18,840 --> 00:01:21,920 Speaker 3: the more they muddy the waters. It's look over here, look over 30 00:01:21,760 --> 00:01:25,480 Speaker 2: there. Look, social media companies have never been in 31 00:01:25,520 --> 00:01:26,000 Speaker 2: it for us. 32 00:01:26,240 --> 00:01:29,440 Speaker 3: They're trying to extract as much money from our eyes; 33 00:01:29,480 --> 00:01:30,440 Speaker 3: that's as much as they care. 34 00:01:30,520 --> 00:01:31,640 Speaker 2: So here's a couple of clean-ups. 35 00:01:32,720 --> 00:01:35,160 Speaker 1: So people have been suggesting that, you know, if there 36 00:01:35,360 --> 00:01:39,640 Speaker 1: is a change, then the social media platforms, who are 37 00:01:39,640 --> 00:01:41,840 Speaker 1: going to need to make changes themselves, are going to 38 00:01:41,920 --> 00:01:46,280 Speaker 1: require digital ID. Now this is simply not true. So 39 00:01:46,360 --> 00:01:48,840 Speaker 1: amendments have even been made to the bill to make 40 00:01:48,880 --> 00:01:52,440 Speaker 1: sure that digital IDs would not be required. No driver's licence, 41 00:01:52,480 --> 00:01:55,520 Speaker 1: no passport, no further proof of digital ID. 42 00:01:55,720 --> 00:01:58,600 Speaker 2: It does not exist.
It's not in the bill, right, 43 00:01:58,640 --> 00:01:59,760 Speaker 2: so you can't have that. 44 00:02:00,520 --> 00:02:02,760 Speaker 1: The other point too, which has been a major one, 45 00:02:02,760 --> 00:02:06,000 Speaker 1: is that there's also been a concern about marginalized kids. You know, 46 00:02:06,080 --> 00:02:10,880 Speaker 1: for example, the LGBTQI plus communities finding like-minded people. 47 00:02:11,320 --> 00:02:14,639 Speaker 1: They're not banning the internet. They're not banning websites, they're 48 00:02:14,680 --> 00:02:18,240 Speaker 1: not banning chat rooms, they're not banning messenger services. These 49 00:02:18,280 --> 00:02:22,120 Speaker 1: still exist. So a great example is Will next door, 50 00:02:22,200 --> 00:02:24,480 Speaker 1: he's fifteen. The other day he said, oh, mate, 51 00:02:24,560 --> 00:02:26,959 Speaker 1: I heard about the ban. All the guys have now gone 52 00:02:27,000 --> 00:02:29,240 Speaker 1: from Snapchat and we've formed a group on WhatsApp. 53 00:02:30,000 --> 00:02:32,680 Speaker 2: Perfect. That's great. Exactly what you want to do. 54 00:02:32,840 --> 00:02:35,720 Speaker 3: That's much more connected, isn't it? And I don't want 55 00:02:35,760 --> 00:02:37,720 Speaker 3: to go on Snapchat. If you want to, if you 56 00:02:37,800 --> 00:02:40,520 Speaker 3: want to find a beautiful community, you go on 57 00:02:40,480 --> 00:02:42,840 Speaker 2: Snapchat, go on Twitter, don't you. 58 00:02:43,880 --> 00:02:46,160 Speaker 1: I mean, that's what WhatsApp is. 59 00:02:46,200 --> 00:02:48,520 Speaker 1: It's a closed group without algorithms, which is free from 60 00:02:48,560 --> 00:02:50,959 Speaker 1: bullies and creeps. Meta came out on the topic and 61 00:02:51,000 --> 00:02:54,760 Speaker 1: said age verification technology just isn't there yet, and the 62 00:02:54,800 --> 00:02:56,519 Speaker 1: apps would need to collect personal data.
63 00:02:56,560 --> 00:02:57,880 Speaker 2: We've already ruled that one out. 64 00:02:58,080 --> 00:03:02,079 Speaker 1: So TikTok admitted that last year they removed seventy six 65 00:03:02,240 --> 00:03:07,160 Speaker 1: million underage accounts. One million of those were in Australia, 66 00:03:07,240 --> 00:03:10,320 Speaker 1: so they have the tech. It already exists. Here's a 67 00:03:10,320 --> 00:03:13,720 Speaker 1: great hypothetical for Meta. Imagine if they got paid one 68 00:03:13,720 --> 00:03:16,840 Speaker 1: thousand dollars every time they were able to identify that a 69 00:03:16,960 --> 00:03:18,600 Speaker 1: child was under the age of sixteen. 70 00:03:18,960 --> 00:03:19,840 Speaker 2: How do you reckon that'd go? 71 00:03:20,200 --> 00:03:23,080 Speaker 1: Yeah, we need to wind back to the start and 72 00:03:23,200 --> 00:03:26,320 Speaker 1: talk about why we're here. It's about the direct correlation 73 00:03:26,440 --> 00:03:30,200 Speaker 1: between the impact of social media and the psychological damage 74 00:03:30,560 --> 00:03:34,840 Speaker 1: on kids. Depression, anxiety, eating disorders, suicide. There are so 75 00:03:35,000 --> 00:03:39,840 Speaker 1: many families living actual tragedies. And how's this from behind 76 00:03:39,840 --> 00:03:42,560 Speaker 1: the scenes: in the past six months, we've had someone 77 00:03:42,800 --> 00:03:45,520 Speaker 1: from each of these social media giants who has given 78 00:03:45,560 --> 00:03:47,760 Speaker 1: a team member at thirty six months a hug and 79 00:03:47,840 --> 00:03:51,520 Speaker 1: privately said keep doing what you're doing. The reason why? 80 00:03:51,640 --> 00:03:53,520 Speaker 1: Because they were all parents, but they don't want to 81 00:03:53,560 --> 00:03:54,360 Speaker 1: be identified. 82 00:03:55,440 --> 00:03:56,400 Speaker 2: I'll leave you with this one.
83 00:03:56,480 --> 00:03:59,840 Speaker 1: Yeah, okay, Scott Galloway, he's the clinical professor of marketing 84 00:04:00,160 --> 00:04:04,400 Speaker 1: at NYU. What's more challenging: figuring out if someone 85 00:04:04,480 --> 00:04:07,440 Speaker 1: is younger than sixteen, or building a global real time 86 00:04:07,480 --> 00:04:11,480 Speaker 1: communication network that stores a near infinite amount of text, video, 87 00:04:11,840 --> 00:04:16,640 Speaker 1: and audio, retrievable by billions of simultaneous users in 88 00:04:16,720 --> 00:04:20,320 Speaker 1: milliseconds with twenty four seven uptime? The social 89 00:04:20,360 --> 00:04:23,039 Speaker 1: media giants know where you are. They know what you're doing. 90 00:04:23,240 --> 00:04:25,880 Speaker 1: They know how you're feeling, but they can't figure out 91 00:04:25,920 --> 00:04:28,279 Speaker 1: your age. You couldn't make this shit up. 92 00:04:28,520 --> 00:04:30,919 Speaker 3: Yeah, and can I just say, we even had somebody 93 00:04:30,920 --> 00:04:32,960 Speaker 3: on air this morning, a dad who called about something 94 00:04:33,000 --> 00:04:38,000 Speaker 3: completely different and he stopped to say, just, you know, Jess, 95 00:04:38,120 --> 00:04:40,400 Speaker 3: thank you. He has teens who are going to have 96 00:04:40,440 --> 00:04:42,840 Speaker 3: to give up social media come the end of the year. 97 00:04:43,320 --> 00:04:45,760 Speaker 3: And I think that's, that's who we need 98 00:04:45,640 --> 00:04:47,840 Speaker 2: to be expressing relief. Yeah, forget the 99 00:04:47,760 --> 00:04:50,720 Speaker 3: politics, just support, even just as a fellow parent. 100 00:04:50,680 --> 00:04:53,480 Speaker 1: Listen to the parents. That's all we can do. Prime Minister, 101 00:04:53,880 --> 00:04:56,560 Speaker 1: get this done. Fitzy and Wippa with Kate Ritchie is 102 00:04:56,560 --> 00:05:00,400 Speaker 1: a Nova podcast. For great shows like this,
download Nova 103 00:05:00,440 --> 00:05:03,960 Speaker 1: Player via the App Store or Google Play. The Nova Player.