Speaker 1: Listen, I reckon we might want to consider doing... never mind. No, no, no, no, not consider. I reckon we should, I reckon we should do what the Australians are planning to do and ban the kids from using social media until they reach the age of sixteen. At the moment, we don't have a huge amount of detail on what actually is going to happen in Australia, but we've got a little bit of a sketch kind of emerging. And what we know is the Australians are planning to put in the highest age limit in the world, at sixteen. And then the other very important thing is they're going to put the onus on the social media companies to make this work: not on the parents, not on the kids, not on the government, not on anybody else. It's going to be on the social media companies to keep the kids off. So there will be an age verification tool, and it will be an official one, either biometrics or government ID, and the social media companies have to use it and they have to keep kids off if they haven't hit sixteen yet, and if they don't, they get penalized.
Speaker 1: And this is exactly how it should work, because if these social media companies are not forced to take responsibility, they will not take responsibility. They will simply blame someone else. They'll go, "It's not actually our job, it's up to the parents. Oh whoops, the kids got on." "It's up to the kids. Oh whoops, the kids got on." "It's up to the government. Oh whoops, the kids got on." That's basically how they'll roll, because that is how they roll at the moment, because I can tell you with absolute certainty they are not taking responsibility. Now, they know full well that their product is really bad for the developing brain. And they also know that they're supposed to keep the kids off until they're thirteen, and yet, whoops, a whole bunch of kids under the age of thirteen just happen to have social media accounts. Whoops. Now, I have absolutely zero tolerance for any argument from these companies that they can't do this. Yes they can. They are literally tech geniuses. So if they've got an age verification tool, make the tech work.
Speaker 1: They are extremely wealthy, so if there's some sort of a penalty that involves money, they can pay that fine. And some of the founders and a whole bunch of the employees are also parents, so they should care about what is happening to other people's kids. And you know what? Worst case scenario is they do the thing that they always do and they threaten to leave. So what? Be my guest, bugger off, leave the country. I don't care if Facebook or Insta or TikTok or X pulled out of Australia, or if we were to do the same thing out of New Zealand tomorrow. I could not give less of a toss. This literally does not make your life any better. The only people who I can see would actually really, really be hit by these guys pulling out are businesses who are impacted because they advertise on, and have business accounts on, the social media platforms. We managed to make it work on the internet beforehand; we will make it work again. We will simply go back to using Google and finding ways to advertise and get hold of businesses through Google.
Speaker 1: So the more I see what Australia is doing, the more I want us to do it. And I bloody well hope we do. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.