Speaker 1: Sam Emory, Aussie correspondent, with us now. Hey, Sam.

Speaker 2: Hello there, good afternoon.

Speaker 1: Sam, listen, you've got to fill us in on what's just been breaking in your part of the world.

Speaker 2: Yeah, look, some sad news just breaking through this afternoon, Heather. We're still waiting for more details, but police are investigating after two children were found dead in a home in the Blue Mountains in New South Wales. Emergency services were called to the home in Faulconbridge just before 12:40 today due to concerns for the welfare of a woman and two children. They arrived to find the bodies of two boys, aged nine and eleven. The 42-year-old woman's been taken to Westmead Hospital under police guard. But police, and this is the interesting part, police are sort of saying that they're not looking for anyone else in connection with the situation. So a bit dubious there, and certainly sad news coming out of the Blue Mountains, just west of Sydney. Jeez, it's tough to hear that.

Speaker 1: What do you make of this? Some age limit for the social media companies?
Speaker 1: What are you hearing? Round about fourteen?

Speaker 2: Australia is definitely pushing for fourteen years old as the cut-off, to ban children registering on social media platforms. The PM's come out saying he thinks maybe sixteen and younger should be off the social media platforms, and they're going to try and get this legislation through this afternoon. The South Australian Premier this afternoon agreed that national regulations on child access to social media would certainly assist parents. Some people have sort of come out saying, why should we ban them from it? We should be giving them the tools to actually use it properly. But it's whether or not these kids actually really want to be on it, just because all their friends are on it, or because they actually want to connect with a lot of people. I know I was speaking with your producers before I came on air, Heather. I've got two kids, thirteen and eleven. My son, you know, he's on Snapchat; he uses it quite a bit. My daughter's got no interest in it whatsoever.
Speaker 2: We were actually talking about it the other day, and she says, why would I need something like that when I can quite easily... you know, they're on Messages, they're on iMessage, they're on YouTube, they're constantly talking with each other, but they're not actually on these apps, you know, looking at photos and all these other things, or being exposed to these sorts of advertisements. So I thought it was very interesting to hear that from my eleven-year-old, saying that she didn't need any more connection. It may change by the time she gets to high school; you know, teenagers will certainly want to be more connected. But I think the general consensus here in Australia is, what's the point of being on these things? You know, at fourteen, fifteen, sixteen, what benefit is it actually bringing to the youth of Australia? So I don't think they'll have too much of a problem getting this across the line with legislation. It's certainly something people want more powers around, and more protection for kids, but adults as well.
Speaker 2: You know, it's certainly not discriminating against who finds it hard, who is being attacked online.

Speaker 1: It's a very good point. Listen, tell me, what's your take on this: is that video of David Spears snorting the white powder real, or is it not?

Speaker 2: Well, he's calling it a deepfake. He's been out on News Limited, and he's calling it on a few other news sites as well, an artificially generated video. It's pretty hard to say. Why would anyone go to the trouble of making a deepfake video of a person who's no longer in parliament, with the former opposition leader? And yeah, look, I think he's trying to sort of pull a bit of a furphy over this one, and I think he will quickly find that technology and experts will be able to tell him whether or not it's a deepfake. So I don't think we'll be sitting around waiting for too long on this one, Heather.

Speaker 1: Sam, do deepfake video makers go to the effort of leaving details like an open laptop in the background, and a red bottle of wine, and a lighter? It feels very... it feels very real, doesn't it?

Speaker 2: It does.
Speaker 2: I mean, I'm sure these deepfake editors can get creative in their own world, Heather; who knows what they can create. But it's certainly, you know... I just think about the person, the profile, the reason behind it. I just don't see why anyone would go to any of that sort of effort to try and make something like that. What's the point? What's the benefit for them? And who's it actually helping, or not helping, right?

Speaker 1: Hey, Sam, thank you very much, really appreciated that. Sam Emory, Australia correspondent. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.