Speaker 1: Let's talk about social media, because the state government here and the New South Wales government are coming together to jointly host a social media summit, partnering up for a first-of-its-kind, two-day, two-state Social Media Summit to be held in October. Sarah Davies, CEO of the Alannah and Madeline Foundation. Sarah, good morning, thank you for your time.
Speaker 2: Good morning, Matthew.
Speaker 1: What do you hope is achieved here?
Speaker 2: Look, a couple of things. The first one is that the more people who understand what's happening, and the more people who are aware of it and engaged, the better. So I think this is just fabulous, and I hope that the summit and the participants in the summit get a lot out of their time, and that people are open to exploring all the angles here. So I think that's the first thing: the more we're aware of and understand about what's going on, the more informed and intelligent and then effective we're going to be about dealing with the harm.
Speaker 1: Okay, there's a lot of things that will be talked about: a public health response to harm minimization, the opportunity to address social media harms, understanding the role it plays in identity, unpacking the links, they say, between extremism, misogyny and social media, and, and this is good, reclaiming the digital town square, creating positive digital community. So the digital town square has been taken over by people, some people, who are a bit off-kilter, shall we say, to put it mildly.
Speaker 2: Absolutely. I think there are two things that I'd really encourage the summit organizers to focus on, and one is absolutely including and amplifying the voice and experience and participation of children and young people themselves. So they have to be front and center in terms of what they want for their digital world, what's worrying them, how they'd like to see it addressed.
Speaker 2: And the second thing is, and we did talk about this a little bit before, Matthew, a couple of weeks ago, I am a bit worried about the narrow focus on social media. Social media is just one aspect of tech. So let's think about games, right. We don't think about games as social media, and yet they have in-game chat groups, in-game recommender systems, in-game contact and content. It is equally important that all types of tech, apps, games, ed tech, are included in this, because social media is just one element of it. So it needs to be much broader than social media.
Speaker 1: Okay, that's really interesting, because you're right when you think of particularly one game that has, over the last ten years I suppose, dominated primarily the eight-to-twelve age group, and that's Fortnite, and there is a lot of interaction in that with people you don't know, potentially.
Speaker 2: That is exactly right. So, you know, all of those key games, you know, even Club Penguin for children that are younger, Roblox, Fortnite, Minecraft, they all have in-game features such as recommender systems that will push content and contact to children and young people and users, private chat groups, non-private chat groups, sharing of imagery. So we've really got to be inclusive when we think about the digital world for children and young people.
Speaker 1: Yeah, absolutely, that's a very good point. So, okay, you'd like to widen the focus. Are you able to? You're going to make a submission, your foundation?
Speaker 2: Oh, you betcha, yeah, yeah.
Speaker 1: So this will be part of it, obviously, what you've just been saying.
Speaker 2: Absolutely. And look, if people are interested, we're very open with all our policy submissions. They can all be found on our website under our policy banner. We appeared before the Joint Select Committee into Social Media in Australian Life a couple of weeks ago. Our testimony is up there.
Speaker 2: We are very transparent about this, and if people want to use any of that with their own advocacy, contacting their own members of Parliament, for example, around things that they'd like to see, then go for your life.
Speaker 1: Yeah. I don't imagine a lot of young people, probably aged under thirty, zero to thirty, would have Twitter on their devices, would have Facebook on their devices, but they would have TikTok, they would have Snapchat, primarily those two, and there may be others as well, but they seem to be quite dominant in teenagers' use at the moment. So what are your concerns with that? I mean, there's a lot said about TikTok particularly and its links to China. Is that an issue, or is it just the harmful content that kids can be exposed to by more malicious users?
Speaker 2: That is a great question, Matthew, because it comes down to actually what's the cause of all the problems that we're experiencing. And I think of it like an iceberg. So the top of the iceberg that you can see above the water, that's all the kind of visible and experiential impacts of using tech, both positive and negative, right? And that's what we're responding to as a community, because we're responding to the very clear harms that we can see. But what sits under the surface that you can't see, right at the very bottom, in the dirty, dark depths at the bottom of that iceberg, is data. And that has got dollar signs all over it for tech companies, and it is the data, it's us, we're the currency; across all tech, it's our data that is monetized. And what happens is that data moves up from the bottom. They collect it, they scrape it, they steal it. It comes up through the iceberg into the kind of tech engines, and what the engines do is, that data is then the fuel for the algorithms, the recommender systems, all of the predictive analytics. They profile us demographically, psychographically, behaviorally, biometrically, and now neurologically.
Speaker 2: So you've got that engine that's fed by all that data, that is running these algorithms and these recommender systems, and that's then floating up closer to the surface in terms of pushing content to people, contacts to people, encouraging particular types of behavior, encouraging sort of particular types of habits. So, you know, we look at YouTube and TikTok and think about the scrolling and the constant nudging, pushing viewers into more and more, and then what happens is you break the surface of the iceberg, and that's when you're experiencing, that's all we see, these visible behaviors and harms. Yeah, absolutely. If we don't go to the bottom of the iceberg and sort out the data, and stop children's data being taken and exploited for commercial and harmful purposes, it doesn't matter how much spotlight we put on the top of that iceberg. We've got to get down and dirty and fix the data issue and have genuine children's online privacy, not allowing commercial use of that data, behavioral profiling and recommender systems that are feeding the harms to children and young people.
Speaker 1: If we somehow manage to regulate no kids on social media until they're sixteen, kids being kids, they'll find a way, won't they? They will.
Speaker 2: And Matthew, I just, I think that's such the wrong question, because actually, if social media is designed well and healthily, with age-appropriate design, you can have children at any age on it. That's what I mean. It's these engines and the profiling and the recommender systems that are causing the harm. It's like going into a toy shop and choosing a toy for a recommended age range. You know that that toy has been designed to be age-appropriate and safe for that age range. You could do the same with tech so easily. So it's not actually the age of the person, it's the fundamental design of the product.
Speaker 2: And what worries me is that the more we respond to what we see on top of the water, the more it lets the tech companies off the hook for actually changing it, because then it's like, well, children and young people, it's your fault, you shouldn't be using it. And parents, it's your fault, you're bad parents, you should, you know, you should police this. Or teachers, it's your fault, you should teach better. And it's not on. And the truth is, we have to hold tech to account for safety by design and age-appropriate design.
Speaker 1: It's a Pandora's box, isn't it? We've opened the lid and this genie has got out, and trying to now regulate how social media companies, giants, use their algorithms, it's almost too late. I mean, that's something governments should have, I don't know if it was possible to foresee, but they should have, I mean, that's what they get paid the big bucks for, got on top of early on.
Speaker 2: You're absolutely right. So we've had Facebook for what, twenty years? So we've had tech, twenty years of building what we're now living. But I think there is a way, and I actually think we need to think globally, right, because this is a global issue. These massive organizations work across the world, and in Australia we have lower safety standards than they do in other parts of the world, because they can get away with it. But if we had, if you imagine, if we had kind of like a NATO for tech, right, a global agreement about, here are minimum safety standards, here are minimum age-appropriate standards, and if you don't meet them, you can't play in our market. That's, I think, where we've got to go with this. And our eSafety Commissioner in Australia is phenomenal and is building that alliance across the world, so that, I think, is our massive opportunity.
Speaker 2: Yes, we need to do it state by state and nation by nation, but actually they are global players, and we need a global minimum set of safety standards for children and young people.
Speaker 1: All right, well, this is in October, and Sarah, I look forward to seeing what your submission is when that comes around, this joint summit between the two states. Yeah, absolutely, thank you for your time, Sarah Davies, CEO of the Alannah and Madeline Foundation, on the state governments of South Australia and New South Wales coming together to hold a summit on social media. And where do we take it from there? Things like cyber bullying to be addressed, negative impacts on kids' mental health and development to be tackled. But I do like Sarah's suggestion that social media is just a part of it. What about the games, where they're interacting, and interacting with who knows who right across the world, and what the intent of the person is? Now, it could be just another kid just like them, fine, I suppose, but how would you know? Because so much of it in online activity has been proven over and over and over again to be to the detriment of people, lives destroyed, and some, tragically, even taking their own lives. And for anyone in that sort of situation, Lifeline is there to help: thirteen eleven fourteen. But you don't know, that's the key. Scammers pretend to be people to get money out of you, and there are people with more evil intent toward children who are there masquerading as children. It's not good, not good at all.