Speaker 1: Ryan Bridge.

Speaker 2: TikTok's been at Parliament today. They obviously don't want to be regulated. No media, well, no company generally wants to be regulated, but particularly not TikTok. So what they did is commission research from Talbot Mills, and they found that only one in three parents have rules for their children around social media. Which is interesting, because you hear parents say all the time, "Well, my kids turn into a zombie because they're on their phone all the time." Well, actually, do we have the rules in place? Makes Sense is an advocacy group for children's online safety; co-founder Holly Brooker is with me now. Holly, good evening, good evening. Good to have you here. Are you surprised to hear that number?

Speaker 1: Oh yeah, I am surprised to hear that number. It doesn't really reflect what is happening for our parents and for our young people.

Speaker 2: So you think the number's wrong?

Speaker 1: I'm really sorry, but can you repeat what you said? I actually got a bit muddled up.

Speaker 2: One in three parents have rules around their children's social media in place. That's according to Talbot Mills. So, in other words, two thirds of parents potentially don't have rules.

Speaker 1: Oh yes, yes. I mean, I think this is an area that parents are really struggling with. In the education work that I do, parents are really struggling to keep up and to put boundaries in place, because there are so many platforms. We really encourage parents to use parental controls and put boundaries in place, but it is hard to implement that across many different platforms and devices. And often parents actually aren't that aware of what their kids are doing online. I have many stories of principals talking to me and saying parents don't actually think their kids are on social media, and then they find out that they are when something unfolds with students at school. So you've got to take these types of results with a grain of salt.

Speaker 2: But can you, as a parent, plead naivety? You know, isn't it your job to know, especially if they're eleven, twelve, thirteen? I mean, they're the biggest liars under the sun. Isn't it your job to know?

Speaker 1: Yeah, I think it's really important that as parents we step up, that we engage with what's happening online through conversations with our kids, and that we actually understand what's happening in this space. But to date, there hasn't been a lot of conversation around the harms that are happening online for our young people, and it's great that we're really starting to have some good, robust conversations about that. But in saying that, we're many years behind many countries around the world in terms of actually putting any kind of systemic measures in place to manage this. It is a parental responsibility, but it's also bigger than that. It's really difficult for parents to manage tech, and it's exploded. We've normalized giving our kids phones very young and giving them access to everything on the internet. We do need to scale that back, but I do think we need more support from a regulatory approach.

Speaker 2: Holly, thank you for that. Holly Brooker, Makes Sense co-founder, on TikTok heading to Parliament today.

Speaker 1: For more from Heather du Plessis-Allan Drive.

Speaker 2: Listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.