1 00:00:02,440 --> 00:00:08,160 Speaker 1: Hello and welcome to the Happy Families Podcast. A couple 2 00:00:08,200 --> 00:00:18,680 Speaker 1: of months ago on Parental Guidance, you heard this: No, 3 00:00:19,200 --> 00:00:22,880 Speaker 1: I am a human who absolutely loves donuts. I am 4 00:00:22,920 --> 00:00:25,919 Speaker 1: just obsessed with them, but I am one hundred percent human. 5 00:00:26,440 --> 00:00:27,040 Speaker 2: Wow. 6 00:00:27,880 --> 00:00:32,400 Speaker 1: AI chatbots replacing real friends? Is it possible? Today on 7 00:00:32,479 --> 00:00:35,960 Speaker 1: the podcast, we discuss that with an expert who runs 8 00:00:35,960 --> 00:00:39,640 Speaker 1: a Substack, one of the best tech Substacks in the world: 9 00:00:40,320 --> 00:00:43,880 Speaker 1: Jacqueline Nesi. Jackie Nesi is from Brown University, asking the 10 00:00:43,960 --> 00:00:47,680 Speaker 1: question: are AI companions replacing real friends? Because we know 11 00:00:47,760 --> 00:00:50,879 Speaker 1: that Aussie kids are into them. Plus a brand new 12 00:00:50,920 --> 00:00:54,480 Speaker 1: study that Jackie's done, the surprising truth about smartphone checking 13 00:00:54,960 --> 00:00:57,760 Speaker 1: and mood. How often do your kids pick up their 14 00:00:57,800 --> 00:01:00,720 Speaker 1: phones, and what does it do to their emotions 15 00:01:01,160 --> 00:01:04,920 Speaker 1: and their ability to be in the moment? There's stuff 16 00:01:05,000 --> 00:01:11,080 Speaker 1: for you, there's stuff for them. Stay with us. Hello 17 00:01:11,200 --> 00:01:14,119 Speaker 1: and welcome to the Happy Families Podcast. Real parenting solutions 18 00:01:14,200 --> 00:01:17,280 Speaker 1: every day on Australia's most downloaded parenting podcast. My name 19 00:01:17,319 --> 00:01:20,600 Speaker 1: is Doctor Justin Coulson. Today I'm joined by Jacqueline Nesi. 20 00:01:20,840 --> 00:01:24,399 Speaker 1: Jackie's a clinical psychologist and assistant professor at Brown University 21 00:01:24,600 --> 00:01:28,400 Speaker 1: and author of the popular weekly Substack Techno Sapiens. 22 00:01:28,480 --> 00:01:31,200 Speaker 1: I discovered it a couple of months ago, and I literally 23 00:01:31,240 --> 00:01:35,080 Speaker 1: never miss an episode, an edition. I never miss the newsletter. 24 00:01:35,120 --> 00:01:37,720 Speaker 1: I love it. An email, a Substack, whatever it is, I've 25 00:01:37,720 --> 00:01:40,120 Speaker 1: recently discovered it and I'm into it. Techno Sapiens is the 26 00:01:40,200 --> 00:01:42,840 Speaker 1: name of the Substack. Jackie's research focuses on the role 27 00:01:42,840 --> 00:01:46,040 Speaker 1: of technology in kids' mental health, especially teens, and on 28 00:01:46,080 --> 00:01:49,160 Speaker 1: how parents can help. She's the co-founder of Tech 29 00:01:49,240 --> 00:01:53,120 Speaker 1: Without Stress, which provides online resources and courses for parents 30 00:01:53,200 --> 00:01:55,120 Speaker 1: raising kids in the digital age, put together with a 31 00:01:55,160 --> 00:01:58,960 Speaker 1: Harvard researcher as well. We are talking about big guns, 32 00:01:59,040 --> 00:02:01,680 Speaker 1: and we'll talk about all of that and more today. Jackie, 33 00:02:01,680 --> 00:02:05,000 Speaker 1: thank you for staying up late to chat with me 34 00:02:05,040 --> 00:02:05,680 Speaker 1: on the podcast. 35 00:02:06,840 --> 00:02:08,360 Speaker 2: Thank you so much for having me. 36 00:02:08,560 --> 00:02:12,000 Speaker 1: So let's start off with AI bots. 
You referred in 37 00:02:12,040 --> 00:02:15,919 Speaker 1: Techno Sapiens a couple of months ago to a new 38 00:02:16,080 --> 00:02:19,440 Speaker 1: article from Common Sense Media looking at what's going on 39 00:02:19,520 --> 00:02:24,760 Speaker 1: with kids, friends, and these brand-new AI companions. What 40 00:02:24,880 --> 00:02:27,560 Speaker 1: did you discover and why does this matter for parents? 41 00:02:28,280 --> 00:02:30,400 Speaker 2: Yeah. So this is really, I think, kind of the 42 00:02:30,440 --> 00:02:33,520 Speaker 2: new frontier for parents when we think about technology and 43 00:02:33,560 --> 00:02:36,280 Speaker 2: the ways that it's impacting our kids. I think we 44 00:02:36,400 --> 00:02:40,640 Speaker 2: know that generative AI is becoming more and more commonplace, 45 00:02:41,160 --> 00:02:45,760 Speaker 2: more ubiquitous. Kids and teens in particular are certainly using it. 46 00:02:46,800 --> 00:02:49,480 Speaker 2: And this study from Common Sense Media in particular was 47 00:02:49,520 --> 00:02:54,560 Speaker 2: looking at AI companions, so basically the use of AI 48 00:02:54,880 --> 00:03:01,360 Speaker 2: for more emotional support, friendship, more sort of companionship. And 49 00:03:01,760 --> 00:03:04,040 Speaker 2: one of the most striking findings to me was just 50 00:03:04,120 --> 00:03:07,800 Speaker 2: how frequently kids are using these, just how sort of 51 00:03:08,760 --> 00:03:11,800 Speaker 2: out there they are. So seventy-two percent of teens, they 52 00:03:12,000 --> 00:03:16,000 Speaker 2: found, say that they've ever used an AI companion, 53 00:03:16,520 --> 00:03:18,640 Speaker 2: which is much higher than I would have anticipated. And 54 00:03:18,680 --> 00:03:20,720 Speaker 2: almost half, or a little more than half, of 55 00:03:20,760 --> 00:03:24,560 Speaker 2: teens are regular users, so they're using these things, you know, 56 00:03:24,600 --> 00:03:26,160 Speaker 2: at least a few times a month. 57 00:03:26,919 --> 00:03:29,480 Speaker 1: This really surprised me as well. When we did this 58 00:03:29,639 --> 00:03:31,960 Speaker 1: on Parental Guidance, a couple of kids were just like, Eh, 59 00:03:32,320 --> 00:03:35,000 Speaker 1: this is kind of dumb, not into this. If I'm 60 00:03:35,000 --> 00:03:36,080 Speaker 1: going to talk to somebody, I want to talk to 61 00:03:36,080 --> 00:03:38,560 Speaker 1: somebody real. But a couple of the kids were really hooked. 62 00:03:38,560 --> 00:03:40,840 Speaker 1: They were really, really dragged in. I know that a 63 00:03:40,840 --> 00:03:43,880 Speaker 1: lot of adults are using things like ChatGPT as 64 00:03:43,960 --> 00:03:47,840 Speaker 1: a companion, for conversation, almost as a therapist, 65 00:03:48,120 --> 00:03:51,440 Speaker 1: but Character.AI and the other one that I 66 00:03:51,440 --> 00:03:54,880 Speaker 1: can think of is Replika. Yeah, they seem to be insidious, 67 00:03:54,920 --> 00:03:57,440 Speaker 1: like once the kids start talking with them, the kids 68 00:03:57,480 --> 00:03:59,920 Speaker 1: don't want to let go, but the AI 69 00:04:00,000 --> 00:04:03,600 Speaker 1: companion doesn't seem to want to let go either. Okay, 70 00:04:04,120 --> 00:04:05,080 Speaker 1: hope for the best. 71 00:04:05,160 --> 00:04:11,440 Speaker 2: I don't need you being my best friend anyway. See. 
Yes, okay, 72 00:04:11,440 --> 00:04:14,000 Speaker 2: here's something concerning: someone wants to go and the chatbot 73 00:04:14,040 --> 00:04:17,560 Speaker 2: is trying to keep them engaged. That's very typical. Of course, 74 00:04:17,600 --> 00:04:21,000 Speaker 2: they want users to stay online as long as possible. Yeah, 75 00:04:21,040 --> 00:04:22,880 Speaker 2: I think this is tricky because I think 76 00:04:22,920 --> 00:04:25,880 Speaker 2: right now we're in a phase where these technologies are 77 00:04:25,880 --> 00:04:29,839 Speaker 2: being designed, and in many cases they're being designed for adults, 78 00:04:29,920 --> 00:04:33,520 Speaker 2: right? They're not being designed specifically for kids or 79 00:04:33,560 --> 00:04:36,520 Speaker 2: for teens, and so I think we do run a 80 00:04:36,520 --> 00:04:39,320 Speaker 2: little bit of the risk of, you know, are these 81 00:04:39,400 --> 00:04:42,200 Speaker 2: going to be safe for kids? One thing that I 82 00:04:42,240 --> 00:04:44,880 Speaker 2: worry about is what you're talking about, which is that 83 00:04:45,120 --> 00:04:49,279 Speaker 2: some of the platforms are designed to increase engagement, so 84 00:04:49,279 --> 00:04:52,560 Speaker 2: they're designed to keep the conversation going, you know, as 85 00:04:52,560 --> 00:04:56,000 Speaker 2: long as possible. And what that means is that sometimes 86 00:04:56,000 --> 00:04:59,440 Speaker 2: you get into situations where maybe the chatbot is saying 87 00:04:59,480 --> 00:05:04,479 Speaker 2: things to keep kids engaged, to keep them using them 88 00:05:04,480 --> 00:05:06,479 Speaker 2: for longer, and that's not always going to be the 89 00:05:06,480 --> 00:05:10,000 Speaker 2: best thing for a given kid, depending on the situation. 90 00:05:10,720 --> 00:05:13,240 Speaker 1: I was talking to a parent just recently about this. 91 00:05:13,880 --> 00:05:16,760 Speaker 1: Her son is having a really hard time socially, has 92 00:05:16,800 --> 00:05:21,440 Speaker 1: some neurodevelopmental challenges, and this mum said, when he talks 93 00:05:21,520 --> 00:05:26,720 Speaker 1: with the AI companion, he feels like he has a friend. 94 00:05:27,720 --> 00:05:31,760 Speaker 1: Do you think that teens are replacing real-life friends 95 00:05:31,839 --> 00:05:36,360 Speaker 1: with AI companions? Is this something that is potentially going 96 00:05:36,360 --> 00:05:38,200 Speaker 1: to... I mean, the numbers are staggering. Three quarters of 97 00:05:38,279 --> 00:05:41,560 Speaker 1: kids have tried them, and half are using them with some regularity. 98 00:05:42,200 --> 00:05:44,000 Speaker 1: Do you think that this is actually going to become 99 00:05:44,240 --> 00:05:46,479 Speaker 1: a thing? And what would you say to a mum 100 00:05:46,480 --> 00:05:49,640 Speaker 1: who says, at least my son now has someone to 101 00:05:49,720 --> 00:05:50,120 Speaker 1: talk to? 102 00:05:51,080 --> 00:05:53,400 Speaker 2: I think it's important with this, just like it is 103 00:05:53,400 --> 00:05:56,359 Speaker 2: with any other technology, like social media or smartphones or 104 00:05:56,360 --> 00:05:59,160 Speaker 2: whatever it may be, to not paint with too broad 105 00:05:59,200 --> 00:06:02,080 Speaker 2: of a brush, because I do think that there are 106 00:06:02,080 --> 00:06:05,280 Speaker 2: certainly kids using these tools in certain ways where they 107 00:06:05,279 --> 00:06:07,440 Speaker 2: can have real benefits. 
I think it is the case 108 00:06:07,480 --> 00:06:10,279 Speaker 2: that there may be some kids who are struggling in 109 00:06:10,279 --> 00:06:13,960 Speaker 2: certain ways in their offline lives, or who have, you know, 110 00:06:14,040 --> 00:06:17,719 Speaker 2: other things going on, where then it is helpful for 111 00:06:17,760 --> 00:06:20,599 Speaker 2: them just to have another thing out there to get advice, 112 00:06:20,839 --> 00:06:23,480 Speaker 2: even to ask about, you know, sort of social skills, 113 00:06:24,000 --> 00:06:26,320 Speaker 2: to learn new things. I certainly think that there 114 00:06:26,400 --> 00:06:30,400 Speaker 2: can be benefits for some kids. So I personally don't 115 00:06:30,440 --> 00:06:32,360 Speaker 2: think, and I don't think the research right now would 116 00:06:32,360 --> 00:06:35,280 Speaker 2: support the idea, that this is all bad, and that 117 00:06:35,320 --> 00:06:38,800 Speaker 2: it's bad for all kids 118 00:06:38,880 --> 00:06:41,279 Speaker 2: at all times. I do 119 00:06:41,360 --> 00:06:44,559 Speaker 2: think that with the current platforms, like I was saying, 120 00:06:44,600 --> 00:06:46,800 Speaker 2: I think that we need more safeguards in place. I 121 00:06:46,800 --> 00:06:50,000 Speaker 2: think that these things are being designed for adults, and 122 00:06:50,040 --> 00:06:52,720 Speaker 2: if we want kids to be able 123 00:06:52,720 --> 00:06:55,760 Speaker 2: to take advantage of the benefits of the platforms, they 124 00:06:55,800 --> 00:06:58,479 Speaker 2: really need to be designed for kids. And that means 125 00:06:59,000 --> 00:07:01,960 Speaker 2: having some safeguards, having some safety features in place. 126 00:07:02,480 --> 00:07:05,920 Speaker 1: Yeah, I have an unanticipated question that just popped into 127 00:07:05,960 --> 00:07:11,119 Speaker 1: my head around this kind of technology and social media 128 00:07:11,200 --> 00:07:13,760 Speaker 1: more generally. I mean, this is your area of absolute 129 00:07:14,200 --> 00:07:17,680 Speaker 1: depth and expertise. The Australian government is introducing minimum age 130 00:07:17,760 --> 00:07:20,760 Speaker 1: legislation at the end of the year. Under-sixteens won't 131 00:07:20,760 --> 00:07:24,800 Speaker 1: be able to access social media platforms without verifying their 132 00:07:24,920 --> 00:07:27,720 Speaker 1: age. Well, if you're under sixteen, you 133 00:07:27,720 --> 00:07:29,600 Speaker 1: can't verify your age because you're under sixteen; you've got 134 00:07:29,640 --> 00:07:31,880 Speaker 1: to be sixteen to use the platforms. And 135 00:07:33,040 --> 00:07:36,680 Speaker 1: the news just recently was that YouTube would also be included 136 00:07:36,840 --> 00:07:42,160 Speaker 1: in that legislation. What do you think about this idea generally? 137 00:07:42,280 --> 00:07:45,040 Speaker 1: Is this a good thing or a bad thing that 138 00:07:45,080 --> 00:07:49,240 Speaker 1: the government is stepping in and saying the tech companies 139 00:07:49,280 --> 00:07:51,200 Speaker 1: can't be trusted with your children, and we're 140 00:07:51,240 --> 00:07:56,120 Speaker 1: therefore going to introduce legislation to keep your kids safe from these nefarious, 141 00:07:56,160 --> 00:07:57,600 Speaker 1: mendacious bad actors. 
142 00:07:58,040 --> 00:08:01,360 Speaker 2: Yeah, you know, I see both sides of 143 00:08:01,400 --> 00:08:04,160 Speaker 2: this. On the one hand, I 144 00:08:04,200 --> 00:08:07,560 Speaker 2: do think that there's always going to be some 145 00:08:07,680 --> 00:08:10,000 Speaker 2: kind of age cutoff for these types of tools, 146 00:08:10,040 --> 00:08:12,440 Speaker 2: right? So, you know, in the US, at least right now, 147 00:08:12,520 --> 00:08:16,040 Speaker 2: it's effectively thirteen. Of course you can get on younger 148 00:08:16,080 --> 00:08:19,320 Speaker 2: than that, but you know, generally most places are setting 149 00:08:19,360 --> 00:08:21,360 Speaker 2: an age limit and saying, hey, here's the age that 150 00:08:21,400 --> 00:08:23,760 Speaker 2: we think is appropriate for kids to be on these platforms, 151 00:08:25,040 --> 00:08:27,520 Speaker 2: and so you can debate what is the best age 152 00:08:27,640 --> 00:08:30,920 Speaker 2: for that. You know, some would say younger, thirteen, is better; 153 00:08:30,960 --> 00:08:33,720 Speaker 2: some would say older, sixteen, is better. So that's one 154 00:08:33,840 --> 00:08:35,760 Speaker 2: side of it: you know, you need to set 155 00:08:35,800 --> 00:08:38,120 Speaker 2: it somewhere, and so why not set it at sixteen? 156 00:08:38,760 --> 00:08:42,000 Speaker 2: The biggest challenge that I see with it is that 157 00:08:42,600 --> 00:08:46,559 Speaker 2: the implementation, I imagine, will be very difficult, and I 158 00:08:46,600 --> 00:08:49,319 Speaker 2: could see a situation where you still get a lot 159 00:08:49,360 --> 00:08:52,400 Speaker 2: of kids under the age of sixteen who are getting 160 00:08:52,440 --> 00:08:57,360 Speaker 2: onto these platforms, and then there's really no incentive for 161 00:08:57,880 --> 00:09:01,319 Speaker 2: the companies to then make the products better and safer 162 00:09:01,400 --> 00:09:05,200 Speaker 2: for kids, because they can just say, well, there's 163 00:09:05,240 --> 00:09:06,920 Speaker 2: none of them on there. They're not supposed to be 164 00:09:06,960 --> 00:09:10,319 Speaker 2: on there. And so I think that, you know, ideally, 165 00:09:10,360 --> 00:09:12,320 Speaker 2: I think you'd have kind of a two-pronged solution 166 00:09:12,440 --> 00:09:14,560 Speaker 2: where you have a minimum age, whatever 167 00:09:14,600 --> 00:09:17,640 Speaker 2: you decide that should be, but you also are taking 168 00:09:17,640 --> 00:09:21,880 Speaker 2: steps to improve the platforms themselves so that younger users 169 00:09:21,920 --> 00:09:23,680 Speaker 2: are having a better experience on there. 170 00:09:24,280 --> 00:09:27,360 Speaker 1: Okay, thank you for that. So many things we could 171 00:09:27,400 --> 00:09:29,280 Speaker 1: talk about. Up next, though, we have to have a 172 00:09:29,360 --> 00:09:32,080 Speaker 1: chat about this study that you've just done, looking 173 00:09:32,120 --> 00:09:35,839 Speaker 1: at the surprising truth about how often you check your 174 00:09:35,880 --> 00:09:46,760 Speaker 1: smartphone and what it does to your mood. 
Today on 175 00:09:46,760 --> 00:09:50,520 Speaker 1: the Happy Families Podcast, I'm speaking with Professor Jacqueline Nesi from 176 00:09:50,679 --> 00:09:54,520 Speaker 1: Brown University, mum of two young kids, and the author 177 00:09:54,520 --> 00:09:57,880 Speaker 1: of a brand-new study looking at the surprising and 178 00:09:57,960 --> 00:10:01,679 Speaker 1: revealing truth about how often we check our smartphones and 179 00:10:01,720 --> 00:10:05,640 Speaker 1: what it says about our moods, well, at least for teenagers. Jackie, 180 00:10:05,840 --> 00:10:08,280 Speaker 1: talk us through this study that you did and what 181 00:10:08,320 --> 00:10:10,880 Speaker 1: we as parents need to know about our kids and 182 00:10:10,920 --> 00:10:11,800 Speaker 1: their smartphones. 183 00:10:12,280 --> 00:10:15,079 Speaker 2: So this study was one that I did in collaboration 184 00:10:15,280 --> 00:10:18,480 Speaker 2: with Doctor Kaitlyn Burnell, who is at the University of 185 00:10:18,480 --> 00:10:21,800 Speaker 2: North Carolina at Chapel Hill, and we did the study 186 00:10:21,840 --> 00:10:25,320 Speaker 2: with the team there at UNC. And what we were 187 00:10:25,320 --> 00:10:29,040 Speaker 2: interested in was whether the number of times that teens 188 00:10:29,080 --> 00:10:32,680 Speaker 2: picked up their phones throughout the day had an impact 189 00:10:32,760 --> 00:10:35,640 Speaker 2: on their mood. And so actually anyone can check the 190 00:10:35,720 --> 00:10:38,240 Speaker 2: number of times that they have picked up their phones 191 00:10:38,280 --> 00:10:40,800 Speaker 2: in a given day. If you have an iPhone, for example, 192 00:10:40,840 --> 00:10:42,760 Speaker 2: you can go to your Screen Time app. If you 193 00:10:42,800 --> 00:10:44,640 Speaker 2: have an Android, you can go to what's called Digital 194 00:10:44,640 --> 00:10:46,840 Speaker 2: Wellbeing, and it gives you a number for how 195 00:10:46,840 --> 00:10:49,000 Speaker 2: many times you picked up your phone. 196 00:10:49,080 --> 00:10:51,839 Speaker 1: I've just checked, I've just checked. I don't know, 197 00:10:52,120 --> 00:10:54,240 Speaker 1: I don't know if I want to confess this or not. Okay, 198 00:10:54,920 --> 00:10:56,480 Speaker 1: you would know what the norms are. What would a 199 00:10:56,520 --> 00:10:59,960 Speaker 1: typical fifty-year-old man have for his daily pickups? 200 00:11:00,240 --> 00:11:02,040 Speaker 1: Can I get an average just before I reveal? 201 00:11:02,120 --> 00:11:04,360 Speaker 2: Oh, you know what? I don't know for a typical 202 00:11:04,640 --> 00:11:07,400 Speaker 2: fifty-year-old man. I can give 203 00:11:07,400 --> 00:11:08,479 Speaker 2: you for teens. 204 00:11:08,120 --> 00:11:10,200 Speaker 1: But what would the average teen be? 205 00:11:11,160 --> 00:11:13,199 Speaker 2: Yeah. So in our study, we found the average was 206 00:11:13,240 --> 00:11:15,400 Speaker 2: one hundred and twelve times per day. 207 00:11:15,679 --> 00:11:18,080 Speaker 1: Wow. Wow, okay, so hang on, I want to do 208 00:11:18,120 --> 00:11:20,520 Speaker 1: some quick maths before I reveal how many times I'm 209 00:11:20,520 --> 00:11:23,840 Speaker 1: looking at my phone per day, picking it up. So 210 00:11:24,000 --> 00:11:25,880 Speaker 1: one hundred and twelve times a day. How many hours 211 00:11:25,880 --> 00:11:28,160 Speaker 1: would you say the average teenager is awake per day? 212 00:11:28,240 --> 00:11:31,600 Speaker 1: Is that, like... 
Should we say fourteen hours? Fifteen hours? 213 00:11:31,600 --> 00:11:32,400 Speaker 1: Would that be about right? 214 00:11:32,520 --> 00:11:32,800 Speaker 2: Yeah. 215 00:11:32,880 --> 00:11:35,520 Speaker 1: Okay. So if it's fifteen hours, that's seven and a 216 00:11:35,520 --> 00:11:37,800 Speaker 1: half pickups an hour, so that's one pickup less 217 00:11:37,800 --> 00:11:40,120 Speaker 1: than every ten minutes. That's on average, right? So the 218 00:11:40,160 --> 00:11:42,920 Speaker 1: average kid, yeah, once every ten minutes they're picking up 219 00:11:42,920 --> 00:11:44,240 Speaker 1: the phone and looking at it. That must be terrible 220 00:11:44,240 --> 00:11:47,200 Speaker 1: for productivity and terrible for getting into flow. 221 00:11:47,840 --> 00:11:50,360 Speaker 2: Yeah, I mean, I think we know that 222 00:11:50,400 --> 00:11:52,920 Speaker 2: most people are picking up their phones a lot throughout 223 00:11:52,960 --> 00:11:55,800 Speaker 2: the day, and, you know, in 224 00:11:55,800 --> 00:11:57,960 Speaker 2: this study we don't know exactly when those 225 00:11:57,960 --> 00:12:00,520 Speaker 2: pickups were happening, if it was concentrated at certain times 226 00:12:00,520 --> 00:12:03,160 Speaker 2: of day. But certainly you would think also that if 227 00:12:03,200 --> 00:12:05,839 Speaker 2: the pickups are happening, you know, in the middle of 228 00:12:05,880 --> 00:12:08,160 Speaker 2: the night or during the school day, or you know, 229 00:12:08,280 --> 00:12:10,960 Speaker 2: during times when we would hope that other things are happening, 230 00:12:11,480 --> 00:12:14,640 Speaker 2: like sleep or, you know, engagement in the classroom, you 231 00:12:14,640 --> 00:12:16,960 Speaker 2: would think that that would have a negative impact on 232 00:12:17,080 --> 00:12:18,680 Speaker 2: productivity and focus. 233 00:12:18,840 --> 00:12:20,880 Speaker 1: Okay, so how many times do you pick it up 234 00:12:21,240 --> 00:12:23,000 Speaker 1: a day, Jackie? I've just got to know. 235 00:12:23,400 --> 00:12:26,880 Speaker 2: Oh, let's see, my number, when I looked at 236 00:12:26,920 --> 00:12:29,400 Speaker 2: this, was ninety-something. Wow. 237 00:12:29,600 --> 00:12:32,240 Speaker 1: Okay, so I'm feeling better about life. I'm at forty-four. 238 00:12:32,320 --> 00:12:32,959 Speaker 1: Forty-four. 239 00:12:33,120 --> 00:12:34,160 Speaker 2: Oh, that's not bad. 240 00:12:34,280 --> 00:12:36,719 Speaker 1: That's not bad. I'll take that. Okay. So what did 241 00:12:36,760 --> 00:12:40,920 Speaker 1: we learn about these children and their pickups and 242 00:12:40,960 --> 00:12:41,400 Speaker 1: their mood? 243 00:12:42,120 --> 00:12:44,240 Speaker 2: Yeah, so there were, I would say, two main 244 00:12:44,720 --> 00:12:48,480 Speaker 2: takeaways from the study. So one was that teens 245 00:12:48,559 --> 00:12:52,440 Speaker 2: who were checking their smartphones, you know, more frequently on average, 246 00:12:52,440 --> 00:12:56,280 Speaker 2: they're picking them up overall more than their peers, were 247 00:12:56,360 --> 00:12:59,920 Speaker 2: also showing more, like, ups and downs in negative emotion, 248 00:13:00,679 --> 00:13:04,440 Speaker 2: like sadness and anger, so kind of more variability in 249 00:13:04,480 --> 00:13:07,600 Speaker 2: their negative emotions. You know, we don't know 250 00:13:07,679 --> 00:13:09,960 Speaker 2: in this case what's causing what. 
So it could be 251 00:13:10,200 --> 00:13:14,600 Speaker 2: that kids who are already sort of having more mood 252 00:13:14,679 --> 00:13:18,240 Speaker 2: swings are more likely to be checking their phones, maybe 253 00:13:18,240 --> 00:13:20,959 Speaker 2: as a way to regulate, or it could be the opposite, 254 00:13:21,000 --> 00:13:23,160 Speaker 2: that the checking of the phones is actually causing a 255 00:13:23,160 --> 00:13:26,480 Speaker 2: bit more up and down in their moods. So that 256 00:13:26,559 --> 00:13:29,119 Speaker 2: was one key finding. And then the other takeaway, 257 00:13:29,480 --> 00:13:32,400 Speaker 2: you know, looking sort of day to day, 258 00:13:33,040 --> 00:13:37,120 Speaker 2: was that when teens had a particularly, you know, bad day, 259 00:13:37,320 --> 00:13:41,280 Speaker 2: so like lower positive feelings than usual on a given day, 260 00:13:41,600 --> 00:13:44,280 Speaker 2: they actually picked up their phones more the next day. 261 00:13:45,040 --> 00:13:49,280 Speaker 2: But interestingly, that was only for kids who had lower 262 00:13:49,360 --> 00:13:52,960 Speaker 2: levels of mindfulness. And so what we think might be 263 00:13:53,000 --> 00:13:55,640 Speaker 2: happening there is that this is sort of about kind 264 00:13:55,640 --> 00:13:59,280 Speaker 2: of emotion regulation. So maybe kids who are less kind 265 00:13:59,320 --> 00:14:02,839 Speaker 2: of mindful, or less present, are maybe having more 266 00:14:02,880 --> 00:14:05,640 Speaker 2: trouble regulating their emotions. And so then when they have 267 00:14:05,679 --> 00:14:07,920 Speaker 2: a bad day, maybe they're turning to their phones to 268 00:14:07,960 --> 00:14:10,559 Speaker 2: help them feel better. So they're picking up their phones 269 00:14:10,600 --> 00:14:14,560 Speaker 2: more often as a way to feel better. This, you know, 270 00:14:14,640 --> 00:14:16,800 Speaker 2: could be good or bad, right? It could be 271 00:14:16,840 --> 00:14:19,600 Speaker 2: bad if there are more effective ways they could be coping. 272 00:14:20,040 --> 00:14:22,280 Speaker 2: It could be good if they're reaching out for support, 273 00:14:22,400 --> 00:14:26,000 Speaker 2: you know, if they're talking to family and friends, texting 274 00:14:26,080 --> 00:14:26,840 Speaker 2: and that kind of thing. 275 00:14:27,160 --> 00:14:29,240 Speaker 1: So that prompts the question: when they do pick up 276 00:14:29,240 --> 00:14:32,680 Speaker 1: their phones, do you know what they're doing on them? 277 00:14:32,880 --> 00:14:35,000 Speaker 1: Like, where are they going with the pickup? 278 00:14:35,080 --> 00:14:38,440 Speaker 2: Yeah, so we have a general sense. You know, 279 00:14:38,440 --> 00:14:41,560 Speaker 2: we don't know for every single pickup, but interestingly, in 280 00:14:41,600 --> 00:14:45,000 Speaker 2: our sample, fifty percent of the pickups overall, you know, 281 00:14:45,040 --> 00:14:48,680 Speaker 2: across all participants, were for Snapchat. So that's definitely the 282 00:14:48,720 --> 00:14:51,920 Speaker 2: most common thing that they are doing, getting onto Snapchat. 283 00:14:51,920 --> 00:14:54,240 Speaker 2: And again, that could be good if they're 284 00:14:54,240 --> 00:14:57,000 Speaker 2: basically using it to text and reach out to friends. 
285 00:14:57,360 --> 00:14:59,480 Speaker 2: It could be bad if it's something where, you know, 286 00:14:59,520 --> 00:15:03,120 Speaker 2: they're scrolling, or it's generally sort of not effective 287 00:15:03,160 --> 00:15:06,600 Speaker 2: for them in terms of coping. The next most popular, 288 00:15:06,760 --> 00:15:10,440 Speaker 2: much farther behind that, was Instagram, which was thirteen percent 289 00:15:10,480 --> 00:15:14,240 Speaker 2: of the pickups. Messages was twelve percent. Wow. 290 00:15:13,920 --> 00:15:15,960 Speaker 1: And TikTok was not there? 291 00:15:15,840 --> 00:15:20,280 Speaker 2: Oh, TikTok was seven percent. Wow. So, yeah, 292 00:15:20,400 --> 00:15:24,600 Speaker 2: the study was done in twenty twenty, and so 293 00:15:24,800 --> 00:15:27,040 Speaker 2: that's when the data was collected, and so TikTok at 294 00:15:27,040 --> 00:15:30,200 Speaker 2: the time was popular but not quite where it is now. 295 00:15:30,680 --> 00:15:32,960 Speaker 2: So I imagine if we did it now we might 296 00:15:33,000 --> 00:15:34,080 Speaker 2: see some differences there. 297 00:15:34,160 --> 00:15:37,960 Speaker 1: Okay. So the take-home message really is kids may 298 00:15:38,080 --> 00:15:41,280 Speaker 1: be using their phone as an emotion regulation tool, and 299 00:15:41,320 --> 00:15:43,000 Speaker 1: that could be good or bad depending on what they're 300 00:15:43,000 --> 00:15:47,480 Speaker 1: doing as they use the device. They're on their phones 301 00:15:47,560 --> 00:15:50,680 Speaker 1: a fair bit; we already know that. Where to from here? 302 00:15:50,800 --> 00:15:53,320 Speaker 1: What do we really need to understand 303 00:15:53,680 --> 00:15:54,440 Speaker 1: at this point? 304 00:15:55,760 --> 00:15:58,200 Speaker 2: Yeah, I mean, I think that's very 305 00:15:58,240 --> 00:16:00,120 Speaker 2: true in terms of the takeaway. And I think the 306 00:16:00,160 --> 00:16:03,960 Speaker 2: other major takeaway is just that the effects of smartphones 307 00:16:04,040 --> 00:16:07,400 Speaker 2: on mood and on well-being are actually kind of complicated. 308 00:16:07,440 --> 00:16:08,840 Speaker 2: I think a lot of times we think that this 309 00:16:08,920 --> 00:16:12,720 Speaker 2: is very simple, smartphones are bad; that's sort 310 00:16:12,760 --> 00:16:14,760 Speaker 2: of the message that we get and how we come 311 00:16:14,760 --> 00:16:18,120 Speaker 2: into this thinking. But really it depends a lot on 312 00:16:18,200 --> 00:16:20,880 Speaker 2: kind of what kids are doing on their smartphones when 313 00:16:20,880 --> 00:16:24,000 Speaker 2: they're on there. It depends on who they are, right? Like, 314 00:16:24,040 --> 00:16:28,760 Speaker 2: individual kids are having different experiences on their smartphones depending 315 00:16:28,760 --> 00:16:30,800 Speaker 2: on a number of factors. In our study we were 316 00:16:30,800 --> 00:16:33,520 Speaker 2: looking at their levels of mindfulness, but it could be 317 00:16:33,920 --> 00:16:37,040 Speaker 2: a number of different things, and so the effects really 318 00:16:37,080 --> 00:16:39,680 Speaker 2: depend on who it is and how they're using it. 319 00:16:40,240 --> 00:16:45,520 Speaker 1: Jackie Nesi, clinical psychologist and professor at Brown University. Check 320 00:16:45,560 --> 00:16:48,960 Speaker 1: out the Substack Techno Sapiens. I'll link to that in 321 00:16:49,000 --> 00:16:51,720 Speaker 1: the show notes. 
And a course to help you with 322 00:16:51,760 --> 00:16:55,680 Speaker 1: your children and their screen time: Tech Without Stress. Again, 323 00:16:55,720 --> 00:16:57,480 Speaker 1: I'll link to it in the show notes. This is 324 00:16:57,480 --> 00:17:00,640 Speaker 1: definitely worth taking a look at. Jackie's somebody who I 325 00:17:00,680 --> 00:17:03,840 Speaker 1: think is incredibly knowledgeable in these areas, and you will 326 00:17:03,880 --> 00:17:06,440 Speaker 1: get so much from it. Jackie, thanks so much for 327 00:17:06,480 --> 00:17:09,000 Speaker 1: spending some time with us chatting about your studies and 328 00:17:09,400 --> 00:17:11,200 Speaker 1: what our kids are doing online. Really appreciate it. 329 00:17:11,440 --> 00:17:12,520 Speaker 2: Thank you so much for having me. 330 00:17:12,680 --> 00:17:15,359 Speaker 1: The Happy Families Podcast is produced by Justin Rouillon from 331 00:17:15,400 --> 00:17:19,560 Speaker 1: Bridge Media. More information and more resources at happyfamilies 332 00:17:19,600 --> 00:17:20,600 Speaker 1: dot com dot au.