Speaker 1: Now, Meta's global head of safety reckons we should consider other options instead of doing the same as Australia and banning under-sixteen-year-olds from social media. Antigone Davis is in the country, in New Zealand, for a safety conference, and she was on the Mike Hosking Breakfast this morning.

Speaker 2: If you put parental approval in, so any time a teen goes to download an app, a parent has to say yes or no. In other words, you're starting with a restriction around entry, and you're basically saying, okay, yes you can do this, or no you can't. That's a better way to do that across the entire internet.

Speaker 1: Stephen Scheeler is the former CEO of Facebook Australia and New Zealand, and he's with us. Hi, Stephen. Do you agree with her that parental controls are preferable?

Speaker 3: I think parental controls at the app store level would be another good safety feature that should be considered, and the platforms have been saying that for a while. Their belief is it's too hard to do this in the app itself.
You should push it back to the app stores, which are controlled by Apple and Google, and have them make sure the parents are certifying whether any app winds up on a kid's phone. But as you can imagine, that's part of a solution for making social media and the internet safer. It doesn't solve the problem of, well, okay, what happens when that teen gets onto social media? You still want to have that side of the equation safe as well. So I don't think it's a total solution, but I think it is another tool in the toolkit.

Speaker 1: I mean, it seems to me...

Speaker 3: Yep.

Speaker 1: ...parental controls are a great idea, but they do already exist. I'm not imagining that, am I?

Speaker 3: Yeah, absolutely. Anybody who's listening who's got kids and is trying to see what their kids are doing on social media, many of us have been in and fiddled with the controls. The problem is, at that level it's confusing, right? Your kid's on around six or seven different social media apps.
The controls are usually set to default on, not default off, so you have to go and turn things off. It's hard to figure out, and then your kids can go and get around you and change it back. So it leaves a lot on parents to have to try to figure out, and many of us aren't tech experts in the first place. So the controls do exist, but they probably are still too difficult for parents to make sense of and really use in a practical way.

Speaker 1: Okay, so tell me if I'm wrong here. It seems to me that these things are useful, but, as you say, they already exist and we don't use them. The default is actually that people don't use them. The benefit of a social media ban, as opposed to this, is that it creates the default setting of no, you're not allowed, unless your parent decides to break the rules, and that actually gives parents a lot of power, which is really quite helpful. Is that fair?

Speaker 3: I think that's a good way of thinking about it. In fact, it is this idea of what's the default?
And I think what Meta is arguing is that the default should be that parents have the control at the point of downloading an app from an app store, and that's where the parents should say yes or no, rather than having the government say no. But reasonable people can disagree as to where that point should be. I think it is all about giving parents more control and making it simpler for parents to be able to handle, you know, the fire hose of stuff that's coming at them from social media.

Speaker 1: Okay, so it's coming up a couple of months since the Australians banned kids from getting on social media. Would you describe it as a success?

Speaker 3: Look, we have to be careful, right, it's early days. It takes time to change human behavior, and this is, you know, literally millions of kids suddenly having something taken away from them. Those kids are trying to get around it now.
Right, that's a lot of what's happening, and it will take time for this to become the new normal. So to really judge success or failure, you've got to look, you know, a year or two into the cycle to see where it lands. So far it's really hard to say. I think both sides, or all sides here, because there's more than just two sides, there's parents and kids as well, they all have opinions, and there's bits of data, but I think it's too early to say.

Speaker 1: I haven't seen any compliance data, which I think is probably key to it. Have you seen any of it yet?

Speaker 3: No, I haven't seen any, so we'll have to wait for that. I think we still have to wait for better data to be able to assess this.

Speaker 1: And Stephen, one final question. The fact that we've got Antigone Davis going around the world making the case for anything other than a ban suggests to me that Meta is reasonably worried about this idea catching on.
What do 102 00:04:36,800 --> 00:04:39,560 Speaker 1: you think, Well. 103 00:04:40,320 --> 00:04:42,560 Speaker 4: Yeah, I think they are worried, and look, it's good. 104 00:04:43,080 --> 00:04:47,760 Speaker 3: I don't know antike any Personally, I think it's good 105 00:04:47,760 --> 00:04:50,880 Speaker 3: to see Meta have people out in the market. I 106 00:04:50,920 --> 00:04:52,440 Speaker 3: haven't seen that a lot, to be honest with you, 107 00:04:52,640 --> 00:04:54,440 Speaker 3: that they're out, They've got people out there talking to 108 00:04:54,480 --> 00:04:58,039 Speaker 3: the media more actively. I think personally, I think it 109 00:04:58,120 --> 00:05:01,560 Speaker 3: was a mistake that they didn't do that, but they 110 00:05:01,600 --> 00:05:04,040 Speaker 3: tried to lobby government. They made pronouncements, but they really 111 00:05:04,080 --> 00:05:06,200 Speaker 3: didn't have the face of the company out there talking. 112 00:05:06,400 --> 00:05:07,080 Speaker 4: And I thought she. 113 00:05:07,080 --> 00:05:09,360 Speaker 3: Made you know, she made some good points and made 114 00:05:09,400 --> 00:05:11,960 Speaker 3: some good case but I think it is showing that yes, 115 00:05:13,279 --> 00:05:15,320 Speaker 3: naturally they're going to be worried. They're worried that a 116 00:05:15,360 --> 00:05:17,919 Speaker 3: big chunk of their business is going to be restricted 117 00:05:18,480 --> 00:05:21,200 Speaker 3: and indeed it already has been in Australia, and you 118 00:05:21,200 --> 00:05:23,000 Speaker 3: know this just puts more sand in the wheels of 119 00:05:23,640 --> 00:05:24,760 Speaker 3: them trying to run their business. 120 00:05:25,200 --> 00:05:27,279 Speaker 1: Hey, thank you very much. Steven has always appreciate talking 121 00:05:27,320 --> 00:05:30,200 Speaker 1: to you. Stephen Sheila, the former Facebook Australia New Zealand CEO. 
Speaker 3: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.