WEBVTT - S1 E5: The case for giving your kids more freedom online

0:00:00.640 --> 0:00:07.170
<v Speaker 1>Hi, my name is Scarlett Ashbury. I'm 14. I like

0:00:07.170 --> 0:00:09.910
<v Speaker 1>to talk to people. I like to role play. I

0:00:09.910 --> 0:00:13.660
<v Speaker 1>like to look at fan art of things that I like.

0:00:14.550 --> 0:00:15.760
<v Speaker 1>I like to watch YouTube.

0:00:16.239 --> 0:00:16.459
<v Speaker 2>My

0:00:16.460 --> 0:00:19.280
<v Speaker 2>name is Helena Ashbury. When she asks for an app,

0:00:19.280 --> 0:00:21.860
<v Speaker 2>I think I drive her nuts, particularly when she was younger.

0:00:22.200 --> 0:00:24.450
<v Speaker 2>You know, I'll go off and research, and it

0:00:24.450 --> 0:00:25.979
<v Speaker 2>got to the point where I'd say, well, I'll

0:00:25.980 --> 0:00:27.900
<v Speaker 2>have a look at it and she'll go, you'll just

0:00:27.900 --> 0:00:30.639
<v Speaker 2>look until you find something that says that it's dangerous

0:00:30.640 --> 0:00:32.530
<v Speaker 2>and then you'll tell me you can't have it.

0:00:32.540 --> 0:00:34.790
<v Speaker 1>She always looks at parent reviews.

0:00:35.140 --> 0:00:37.269
<v Speaker 1>Don't do that. If you're looking at the parent reviews

0:00:37.270 --> 0:00:38.780
<v Speaker 1>that are gonna be talking about, oh my child is

0:00:38.780 --> 0:00:40.620
<v Speaker 1>addicted to this app. Oh, my child's been talking to

0:00:40.620 --> 0:00:41.529
<v Speaker 1>this old guy. It's like,

0:00:42.140 --> 0:00:44.589
<v Speaker 1>you're looking for bad reviews, you're gonna find them.

0:00:45.240 --> 0:00:46.740
<v Speaker 1>Hi Taylor.

0:00:47.840 --> 0:00:50.750
<v Speaker 1>Today I'm going to tell you about a parent named Helena.

0:00:51.440 --> 0:00:52.760
<v Speaker 1>She's an engineer

0:00:53.440 --> 0:00:57.200
<v Speaker 1>and she describes herself as having a decent grasp on tech.

0:00:57.210 --> 0:00:59.600
<v Speaker 1>But she still feels like when it comes to newer

0:00:59.600 --> 0:01:01.900
<v Speaker 1>media there's stuff she just doesn't get.

0:01:01.960 --> 0:01:04.690
<v Speaker 2>I grew up without any of this. You know, I

0:01:04.690 --> 0:01:07.360
<v Speaker 2>mean Atari came out when I was a kid

0:01:07.740 --> 0:01:11.170
<v Speaker 2>and I entered the online world when I was probably

0:01:11.170 --> 0:01:11.760
<v Speaker 2>about

0:01:12.240 --> 0:01:16.050
<v Speaker 2>21, and it was the dial up modem and all

0:01:16.050 --> 0:01:18.190
<v Speaker 2>that kind of thing and the Internet was cool.

0:01:18.470 --> 0:01:21.200
<v Speaker 1>Taylor, do you remember when the internet was cool? It

0:01:21.200 --> 0:01:21.910
<v Speaker 1>still is

0:01:22.340 --> 0:01:25.250
<v Speaker 1>Good answer. Good answer.

0:01:25.640 --> 0:01:29.330
<v Speaker 1>Helena has a 14-year-old daughter, Scarlett, and Scarlett

0:01:29.330 --> 0:01:31.380
<v Speaker 1>has spent a lot of time trying to educate her

0:01:31.380 --> 0:01:34.160
<v Speaker 1>parents about what she's doing online.

0:01:34.540 --> 0:01:37.340
<v Speaker 1>They do not understand Instagram at all. Like, they know

0:01:37.340 --> 0:01:39.110
<v Speaker 1>what it's for, they just don't know how to use

0:01:39.110 --> 0:01:39.360
<v Speaker 1>it

0:01:39.740 --> 0:01:44.900
<v Speaker 1>at all. You look annoyed. Because it's so easy,

0:01:45.940 --> 0:01:49.380
<v Speaker 1>it's not even that hard. Have you tried to teach them?

0:01:49.990 --> 0:01:52.180
<v Speaker 1>My mom was like, oh well I keep seeing these

0:01:52.180 --> 0:01:54.380
<v Speaker 1>people dancing on my screen. This, this is not what

0:01:54.380 --> 0:01:56.070
<v Speaker 1>I'm not, I don't follow them and like

0:01:57.440 --> 0:02:00.290
<v Speaker 1>you're in Reels. That's... nobody watches Reels. What are

0:02:00.290 --> 0:02:01.520
<v Speaker 1>you doing? She goes, well, I thought that was the

0:02:01.520 --> 0:02:05.330
<v Speaker 1>home button. There is literally a house button. It's a

0:02:05.340 --> 0:02:06.120
<v Speaker 1>house

0:02:06.540 --> 0:02:09.869
<v Speaker 1>Press the house to get to your home screen. It

0:02:09.870 --> 0:02:10.860
<v Speaker 1>just makes sense.

0:02:13.340 --> 0:02:15.560
<v Speaker 1>Have you had moments like this with your son?

0:02:16.440 --> 0:02:19.210
<v Speaker 1>Oh, absolutely. Some of the tools he uses, what we

0:02:19.210 --> 0:02:21.560
<v Speaker 1>do, as you know. Yes.

0:02:22.139 --> 0:02:26.959
<v Speaker 1>Is he a kind teacher? No, no, he's very impatient.

0:02:27.340 --> 0:02:28.520
<v Speaker 1>So it's a trend, then.

0:02:30.240 --> 0:02:33.320
<v Speaker 1>Today's story hinges on the fact that many parents just

0:02:33.330 --> 0:02:36.320
<v Speaker 1>do not understand the world their kids live in and

0:02:36.320 --> 0:02:39.889
<v Speaker 1>some of the fears and tensions that come out of that.

0:02:39.900 --> 0:02:43.900
<v Speaker 1>I'm Taylor Owen, an academic who studies technology and who

0:02:43.900 --> 0:02:46.890
<v Speaker 1>is desperately trying to keep up with a very online

0:02:46.889 --> 0:02:50.669
<v Speaker 1>eight-year-old. Nicole Edwards, a journalist whose parents are

0:02:50.669 --> 0:02:56.660
<v Speaker 1>remarkably tech-savvy. This is Screen Time.

0:02:57.139 --> 0:03:01.860
<v Speaker 1>There are huge questions playing out right now over the

0:03:01.860 --> 0:03:06.630
<v Speaker 1>place of technology in our lives. Facebook was scheming to

0:03:06.630 --> 0:03:12.200
<v Speaker 1>bring even younger users into their fold. You're basically giving

0:03:12.200 --> 0:03:15.310
<v Speaker 1>out your personal ID to games so they can

0:03:15.310 --> 0:03:16.460
<v Speaker 1>make money from it.

0:03:16.730 --> 0:03:19.000
<v Speaker 1>There are some people that I would like to block

0:03:19.000 --> 0:03:19.760
<v Speaker 1>in real life,

0:03:20.120 --> 0:03:23.929
<v Speaker 1>We could work together, but it will add to the

0:03:23.930 --> 0:03:28.680
<v Speaker 1>tensions, because in the app market their job is to sell, sell, sell.

0:03:30.500 --> 0:03:33.210
<v Speaker 1>They're like I just don't understand this is their platform,

0:03:33.210 --> 0:03:34.450
<v Speaker 1>this is their life

0:03:34.940 --> 0:03:36.050
<v Speaker 1>where's the limit?

0:03:36.840 --> 0:03:37.050
<v Speaker 1>Mhm.

0:03:37.440 --> 0:03:40.580
<v Speaker 1>Every parent is struggling with these questions. Governments around the

0:03:40.580 --> 0:03:43.020
<v Speaker 1>world are trying to keep up and the scale and

0:03:43.020 --> 0:03:46.460
<v Speaker 1>pace of change is only increasing. In this show, we'll

0:03:46.460 --> 0:03:48.880
<v Speaker 1>talk to parents and kids about how they navigate the

0:03:48.880 --> 0:03:52.570
<v Speaker 1>digital world and to the researchers and policy makers who

0:03:52.570 --> 0:03:54.650
<v Speaker 1>can help us understand the consequences.

0:04:00.440 --> 0:04:03.510
<v Speaker 1>Helena had a story that I think illustrates how a

0:04:03.510 --> 0:04:06.050
<v Speaker 1>lot of parents feel about their children navigating the internet.

0:04:06.540 --> 0:04:12.140
<v Speaker 2>My daughter asked for a Fitbit probably about three years

0:04:12.140 --> 0:04:15.420
<v Speaker 2>ago now, and she was a tween, and already as

0:04:15.420 --> 0:04:17.760
<v Speaker 2>a parent, worrying about her being on her devices

0:04:17.760 --> 0:04:20.710
<v Speaker 2>too much and not getting enough exercise. So it seemed

0:04:20.710 --> 0:04:23.160
<v Speaker 2>like a great idea to me, so

0:04:23.410 --> 0:04:26.580
<v Speaker 2>totally embraced the idea and she got a Fitbit and

0:04:26.580 --> 0:04:27.130
<v Speaker 2>I guess it

0:04:27.940 --> 0:04:32.520
<v Speaker 2>was probably two or three months later, and for

0:04:32.520 --> 0:04:35.830
<v Speaker 2>some reason I came across on her phone that she

0:04:35.830 --> 0:04:41.710
<v Speaker 2>was getting messages through the social media side of Fitbit,

0:04:41.710 --> 0:04:43.960
<v Speaker 2>which I had not realized

0:04:44.140 --> 0:04:46.469
<v Speaker 2>really. It hadn't sort of clued in with me that

0:04:46.470 --> 0:04:48.460
<v Speaker 2>there was a social media side to it.

0:04:48.839 --> 0:04:52.500
<v Speaker 2>And my daughter was getting messages from men, I guess

0:04:52.500 --> 0:04:55.670
<v Speaker 2>men or boys, on the app, that could be

0:04:55.680 --> 0:04:58.950
<v Speaker 2>you know, as outright as will you be my girlfriend,

0:04:58.990 --> 0:05:03.160
<v Speaker 2>which just totally stunned me, you know. And we sort

0:05:03.160 --> 0:05:05.610
<v Speaker 2>of laughed a little bit about it. Like what are

0:05:05.610 --> 0:05:10.260
<v Speaker 2>they really expecting? Are they really expecting you to say, yeah, okay?

0:05:12.839 --> 0:05:16.000
<v Speaker 1>I had no idea you could message someone on a Fitbit.

0:05:16.010 --> 0:05:19.630
<v Speaker 1>Did you know that? I didn't, I didn't. There's a

0:05:19.630 --> 0:05:22.300
<v Speaker 1>legal academic named Evelyn Douek who likes to say

0:05:22.300 --> 0:05:26.469
<v Speaker 1>that everything is a content moderation problem with technology right now.

0:05:26.480 --> 0:05:29.710
<v Speaker 1>And I think with all these different tools we use,

0:05:29.710 --> 0:05:32.460
<v Speaker 1>whether it be like a Peloton or a Fitbit,

0:05:32.740 --> 0:05:36.029
<v Speaker 1>all of these platforms are trying to become social

0:05:36.140 --> 0:05:39.120
<v Speaker 1>and the second you try and become social, you have

0:05:39.120 --> 0:05:41.619
<v Speaker 1>to deal with bad content and how you're going to

0:05:41.620 --> 0:05:44.860
<v Speaker 1>decide what can and can't be said. And it's no

0:05:44.860 --> 0:05:50.150
<v Speaker 1>longer just Instagram and Facebook; it's our exercise bikes and our

0:05:50.150 --> 0:05:56.059
<v Speaker 1>cars and anything that creates a connected environment. Absolutely.

0:05:56.140 --> 0:05:58.849
<v Speaker 1>And to be clear, I'm not sharing this story to

0:05:58.850 --> 0:06:01.779
<v Speaker 1>alert parents to the danger of Fitbits or anything like that,

0:06:01.790 --> 0:06:04.210
<v Speaker 1>you know, thankfully nothing happened. Scarlett is fine,

0:06:04.640 --> 0:06:05.960
<v Speaker 1>but Helena's story,

0:06:06.440 --> 0:06:07.880
<v Speaker 1>it gets at a key thing that came up a

0:06:07.880 --> 0:06:10.660
<v Speaker 1>lot when I spoke to parents about online safety, which

0:06:10.660 --> 0:06:15.100
<v Speaker 1>is this concern that there's a hidden threat lurking somewhere

0:06:15.100 --> 0:06:15.950
<v Speaker 1>on the internet.

0:06:16.339 --> 0:06:18.890
<v Speaker 1>Here's Helena thinking that she's letting her kid have something

0:06:18.890 --> 0:06:21.080
<v Speaker 1>that's pretty benign, only to find out that there are

0:06:21.080 --> 0:06:23.260
<v Speaker 1>strangers on it who are propositioning her child.

0:06:23.640 --> 0:06:25.750
<v Speaker 1>Yeah, and I think it sort of gets at two

0:06:25.760 --> 0:06:28.320
<v Speaker 1>kind of really critical things here. It's like, one, we

0:06:28.320 --> 0:06:32.640
<v Speaker 1>as parents, don't always know sufficiently how the technologies our kids

0:06:32.640 --> 0:06:35.040
<v Speaker 1>are using are used and what the potential is, right?

0:06:35.040 --> 0:06:37.839
<v Speaker 1>So not knowing even the Fitbit had a social

0:06:37.850 --> 0:06:39.560
<v Speaker 1>interaction aspect to it.

0:06:39.640 --> 0:06:41.930
<v Speaker 1>But the second is just how to measure harm. Part

0:06:41.930 --> 0:06:43.919
<v Speaker 1>of the challenge we face in this debate about online

0:06:43.920 --> 0:06:45.960
<v Speaker 1>security and kids is that

0:06:46.640 --> 0:06:49.760
<v Speaker 1>we conflate all sorts of different kinds of harm. So,

0:06:50.240 --> 0:06:54.210
<v Speaker 1>child exploitative content and sex trafficking on one end,

0:06:54.210 --> 0:06:58.280
<v Speaker 1>these really acute, horrible things that happen online and that

0:06:58.279 --> 0:07:01.779
<v Speaker 1>we feel like this visceral need to protect obviously our

0:07:01.779 --> 0:07:04.820
<v Speaker 1>kids in our society from, and we'll go to great

0:07:04.820 --> 0:07:07.580
<v Speaker 1>lengths to do that. But then there's just all these

0:07:07.589 --> 0:07:10.850
<v Speaker 1>other harms that teenagers face in this world

0:07:10.940 --> 0:07:16.940
<v Speaker 1>online: the propositioning, the cyberbullying, the trolling, the kind

0:07:16.940 --> 0:07:20.790
<v Speaker 1>of gross behavior that exists disproportionately to how it does in

0:07:20.790 --> 0:07:23.600
<v Speaker 1>the physical world and that might require a different kind

0:07:23.600 --> 0:07:27.710
<v Speaker 1>of response. So, assessing harm and risk is really difficult

0:07:27.720 --> 0:07:29.690
<v Speaker 1>at any time, but I think it's really hard for

0:07:29.690 --> 0:07:30.350
<v Speaker 1>parents

0:07:30.640 --> 0:07:34.040
<v Speaker 1>in this environment, where they don't always know what's going on.

0:07:34.040 --> 0:07:36.060
<v Speaker 1>They don't know how these tools are being used

0:07:36.440 --> 0:07:39.260
<v Speaker 1>and they don't know if their kids are subject to

0:07:39.260 --> 0:07:43.270
<v Speaker 1>these acute or more sort of systemic harms and how

0:07:43.270 --> 0:07:47.340
<v Speaker 1>they should respond accordingly, right? And Helena's strategy here, and

0:07:47.340 --> 0:07:49.960
<v Speaker 1>the reason that she even found out about these messages

0:07:50.340 --> 0:07:52.160
<v Speaker 1>is that she checks Scarlett's phone

0:07:52.840 --> 0:07:55.450
<v Speaker 1>which is something she does periodically; both of them

0:07:55.450 --> 0:07:58.270
<v Speaker 1>are aware. And when Helena brought up what she found

0:07:58.270 --> 0:08:00.250
<v Speaker 1>to Scarlett, it didn't go well,

0:08:02.040 --> 0:08:05.260
<v Speaker 2>I think she felt she'd done something wrong. And I

0:08:05.260 --> 0:08:09.960
<v Speaker 2>think because I did freak out because it was totally unexpected.

0:08:09.960 --> 0:08:12.370
<v Speaker 2>So she was very upset and that got me upset.

0:08:12.370 --> 0:08:14.550
<v Speaker 2>And these things just sort of escalate.

0:08:14.940 --> 0:08:16.440
<v Speaker 2>I think it was a bit of a shock to

0:08:16.440 --> 0:08:17.040
<v Speaker 2>both of us.

0:08:17.510 --> 0:08:20.060
<v Speaker 1>Scarlett didn't want to talk to me about the Fitbit thing.

0:08:20.440 --> 0:08:23.240
<v Speaker 1>But she did talk about her parents monitoring her internet

0:08:23.240 --> 0:08:26.820
<v Speaker 1>use. Her parents do have these rules, like not to

0:08:26.820 --> 0:08:29.920
<v Speaker 1>give her information out to strangers, for example. And one

0:08:29.920 --> 0:08:31.800
<v Speaker 1>of the things her mom said is that they keep

0:08:31.800 --> 0:08:34.850
<v Speaker 1>her computer in a common area so they can casually

0:08:34.850 --> 0:08:36.360
<v Speaker 1>glance at it when they want to.

0:08:39.740 --> 0:08:44.329
<v Speaker 1>She mentioned that sometimes she looks over your shoulder when

0:08:44.330 --> 0:08:45.460
<v Speaker 1>you're on your laptop,

0:08:46.140 --> 0:08:48.260
<v Speaker 1>how do you feel about that?

0:08:50.440 --> 0:08:50.820
<v Speaker 1>It's

0:08:52.740 --> 0:08:54.920
<v Speaker 1>it makes me feel like I'm doing something wrong, even

0:08:54.920 --> 0:08:55.650
<v Speaker 1>if I'm not

0:08:56.140 --> 0:08:57.969
<v Speaker 1>like I could just be chatting with my friend just

0:08:57.970 --> 0:09:00.740
<v Speaker 1>talking about normal things that I could talk about with

0:09:00.740 --> 0:09:01.450
<v Speaker 1>my mom

0:09:01.940 --> 0:09:03.760
<v Speaker 1>and she hovers over my shoulder, I just go,

0:09:04.440 --> 0:09:05.309
<v Speaker 1>what did I do?

0:09:06.140 --> 0:09:08.520
<v Speaker 1>It's worse when my dad does it, you know, because he breathes

0:09:08.520 --> 0:09:11.610
<v Speaker 1>into my ear and just stands there and stares. My

0:09:11.610 --> 0:09:14.900
<v Speaker 1>mom just walks past. My dad is just standing there

0:09:14.900 --> 0:09:15.260
<v Speaker 1>like

0:09:15.640 --> 0:09:20.840
<v Speaker 1>with, with bug eyes. Yeah, I look into his

0:09:20.840 --> 0:09:23.810
<v Speaker 1>face and he's just judging me and I'm like what?

0:09:23.820 --> 0:09:26.260
<v Speaker 1>I don't understand why,

0:09:26.640 --> 0:09:29.290
<v Speaker 1>No. And then he'll ask a dumb question like, oh

0:09:29.290 --> 0:09:31.160
<v Speaker 1>why is she wearing that shirt? I don't know

0:09:31.540 --> 0:09:33.720
<v Speaker 1>why shouldn't she wear that shirt? Do you have a

0:09:33.720 --> 0:09:36.600
<v Speaker 1>problem with the fact that her shirt is like

0:09:37.390 --> 0:09:38.760
<v Speaker 1>slightly

0:09:40.340 --> 0:09:45.050
<v Speaker 1>revealing? Is that a problem? Can you just go away?

0:09:45.050 --> 0:09:46.660
<v Speaker 1>I'm trying to watch YouTube. Leave.

0:09:48.540 --> 0:09:51.100
<v Speaker 1>And this all brings us to this big worry that

0:09:51.100 --> 0:09:53.210
<v Speaker 1>I hear over and over from parents

0:09:53.460 --> 0:09:57.920
<v Speaker 2>Probably she's the first generation that's grown up with it

0:09:57.929 --> 0:10:00.750
<v Speaker 2>always being in existence. Like there's never been a world

0:10:00.750 --> 0:10:05.550
<v Speaker 2>without the internet for her, without iPhones, without all these devices.

0:10:05.840 --> 0:10:07.930
<v Speaker 2>I've got no frame of reference for it and we're

0:10:07.929 --> 0:10:10.390
<v Speaker 2>all making it up as we go along and it's

0:10:10.390 --> 0:10:13.250
<v Speaker 2>so difficult, and we're probably getting it wrong.

0:10:14.640 --> 0:10:17.670
<v Speaker 2>What can you do? I wonder, am I being too strict,

0:10:17.679 --> 0:10:17.959
<v Speaker 2>but

0:10:19.330 --> 0:10:21.360
<v Speaker 2>you know, is it really gonna work? Am I really

0:10:21.360 --> 0:10:22.290
<v Speaker 2>gonna protect her?

0:10:23.640 --> 0:10:25.960
<v Speaker 1>So there are two parts to this question. On the

0:10:25.960 --> 0:10:28.490
<v Speaker 1>one hand, are my kids safe in these spaces that

0:10:28.490 --> 0:10:29.660
<v Speaker 1>I have no reference for.

0:10:30.140 --> 0:10:33.500
<v Speaker 1>And on the other hand, am I worrying too much?

0:10:33.760 --> 0:10:35.860
<v Speaker 1>Like at what point do parents just let go?

0:10:36.240 --> 0:10:40.140
<v Speaker 1>Those are questions parents are clearly struggling with and need

0:10:40.140 --> 0:10:44.260
<v Speaker 1>to ask themselves, but they're not the only players here.

0:10:44.280 --> 0:10:48.099
<v Speaker 1>Tech companies are also evolving their own tools for what

0:10:48.100 --> 0:10:51.170
<v Speaker 1>they allow and don't allow. Then governments as well are

0:10:51.170 --> 0:10:52.349
<v Speaker 1>trying to decide

0:10:52.440 --> 0:10:54.979
<v Speaker 1>what rules they should put in place for the companies

0:10:54.980 --> 0:10:56.550
<v Speaker 1>and the tools they design

0:10:56.940 --> 0:10:59.689
<v Speaker 1>and for kids of varying ages and how they engage

0:10:59.690 --> 0:11:00.620
<v Speaker 1>in those spaces.

0:11:06.440 --> 0:11:08.880
<v Speaker 1>The first person I wanted to speak to about this

0:11:08.880 --> 0:11:10.350
<v Speaker 1>was Sonia Livingstone

0:11:10.840 --> 0:11:12.670
<v Speaker 1>who is one of the world's leading experts when it

0:11:12.670 --> 0:11:14.359
<v Speaker 1>comes to kids and technology.

0:11:14.370 --> 0:11:17.929
<v Speaker 2>Okay, good. Alright. I'm pressing record on the phone.

0:11:18.030 --> 0:11:21.370
<v Speaker 1>She teaches social Psychology at the London School of Economics

0:11:21.520 --> 0:11:24.250
<v Speaker 1>and directs the Digital Futures Commission.

0:11:24.740 --> 0:11:26.740
<v Speaker 1>She is in many ways at the center of the

0:11:26.740 --> 0:11:30.410
<v Speaker 1>kids and tech policy debate, both in the UK and

0:11:30.420 --> 0:11:31.160
<v Speaker 1>around the world

0:11:31.640 --> 0:11:34.780
<v Speaker 1>and as someone who lives in the UK, she had

0:11:34.780 --> 0:11:39.760
<v Speaker 1>a really interesting perspective on how different societies view parenting.

0:11:40.140 --> 0:11:45.160
<v Speaker 2>So my understanding is that in Europe we have more

0:11:45.160 --> 0:11:48.590
<v Speaker 2>of an expectation that children can have the independence to

0:11:48.590 --> 0:11:52.390
<v Speaker 2>figure things out for themselves, and in North America, you

0:11:52.390 --> 0:11:54.150
<v Speaker 2>can tell me if I'm wrong about Canada,

0:11:54.340 --> 0:11:56.550
<v Speaker 2>there is more a sense that the parent kind of

0:11:56.550 --> 0:11:59.689
<v Speaker 2>has the right to know everything that's happening to the child,

0:11:59.690 --> 0:12:01.060
<v Speaker 2>including on the phone

0:12:01.440 --> 0:12:05.470
<v Speaker 2>until the child kind of passes a threshold or proves himself.

0:12:05.480 --> 0:12:07.300
<v Speaker 2>I mean, I guess I do have that kind of

0:12:07.300 --> 0:12:12.010
<v Speaker 2>child rights focus that says children can only really develop

0:12:12.020 --> 0:12:16.070
<v Speaker 2>and gain their agency and autonomy by having the leeway

0:12:16.070 --> 0:12:19.470
<v Speaker 2>to make some mistakes, not terrible mistakes. You know, we

0:12:19.470 --> 0:12:20.060
<v Speaker 2>want some

0:12:20.240 --> 0:12:22.510
<v Speaker 2>kind of safeguards around it, but they've got to have

0:12:22.510 --> 0:12:23.600
<v Speaker 2>some freedom to

0:12:23.840 --> 0:12:26.440
<v Speaker 2>stretch their legs as it were and that might mean

0:12:26.440 --> 0:12:28.060
<v Speaker 2>that they make some mistakes and then you want them

0:12:28.059 --> 0:12:29.360
<v Speaker 2>to come to you and say,

0:12:29.840 --> 0:12:32.400
<v Speaker 2>hey, this happened and that was a problem. We don't

0:12:32.400 --> 0:12:34.630
<v Speaker 2>even know what it's like for a child to grow up,

0:12:34.640 --> 0:12:38.240
<v Speaker 2>always being watched and checked and kind of controlled. That's...

0:12:38.460 --> 0:12:39.790
<v Speaker 1>And what effect is that having?

0:12:40.370 --> 0:12:43.940
<v Speaker 2>That has its own effect, yes. And parents watch anxiously

0:12:43.940 --> 0:12:45.760
<v Speaker 2>and guiltily. And

0:12:46.040 --> 0:12:48.939
<v Speaker 2>So I see that these are really hard challenges for

0:12:48.940 --> 0:12:50.450
<v Speaker 2>parents to manage. But

0:12:50.840 --> 0:12:53.040
<v Speaker 2>I think there has to be an element of standing

0:12:53.040 --> 0:12:56.210
<v Speaker 2>back and making sure you're the person that the child

0:12:56.210 --> 0:12:58.109
<v Speaker 2>will tell if anything goes wrong.

0:12:58.640 --> 0:13:02.360
<v Speaker 1>How do you think parents should think through acceptable risk online?

0:13:02.940 --> 0:13:05.490
<v Speaker 1>And is that the right construct to be thinking about

0:13:05.490 --> 0:13:09.439
<v Speaker 1>this from? Just making small mistakes for kids, which are

0:13:09.450 --> 0:13:12.150
<v Speaker 1>part of growing up, versus being exposed to

0:13:12.640 --> 0:13:15.990
<v Speaker 1>real potential harm here, which is sometimes the case.

0:13:16.540 --> 0:13:19.760
<v Speaker 2>I used to say that the analogy was crossing roads

0:13:19.760 --> 0:13:22.100
<v Speaker 2>and walking down the street and so on, but actually

0:13:22.100 --> 0:13:25.490
<v Speaker 2>parents have been making a different decision there too. So

0:13:25.660 --> 0:13:28.220
<v Speaker 2>I think we all know that as children, we were

0:13:28.220 --> 0:13:30.950
<v Speaker 2>allowed to be able to kind of walk to the

0:13:30.950 --> 0:13:34.000
<v Speaker 2>streets or to the shops or to the park unaccompanied

0:13:34.000 --> 0:13:36.310
<v Speaker 2>and now children are much more accompanied. So

0:13:36.640 --> 0:13:39.420
<v Speaker 2>in a way the risk and the fear of getting

0:13:39.420 --> 0:13:43.060
<v Speaker 2>it wrong has spread its online and it's offline.

0:13:43.340 --> 0:13:45.840
<v Speaker 2>So, you know, I'm a social scientist, I believe in

0:13:45.840 --> 0:13:49.120
<v Speaker 2>the summary of statistics. There are no more road accidents to

0:13:49.120 --> 0:13:52.620
<v Speaker 2>kids now than 30 years ago. There's no more abductions

0:13:52.620 --> 0:13:55.980
<v Speaker 2>by pedophiles now than there were 30 years ago. There

0:13:55.990 --> 0:14:00.950
<v Speaker 2>is more exposure to online kind of risky content

0:14:01.340 --> 0:14:05.000
<v Speaker 2>But there are not really more things going wrong in

0:14:05.000 --> 0:14:08.100
<v Speaker 2>children's lives now than there were 30 years ago. So

0:14:08.100 --> 0:14:11.420
<v Speaker 2>I think parents, you know, if you can stand back

0:14:11.429 --> 0:14:14.330
<v Speaker 2>and give the child a bit more leeway, the paradox

0:14:14.330 --> 0:14:17.860
<v Speaker 2>is that you'll have a stronger and more resilient child.

0:14:17.870 --> 0:14:20.970
<v Speaker 2>A lot of the worst risks, they're incredibly rare, just

0:14:20.970 --> 0:14:22.800
<v Speaker 2>as you know, the chance of your child being knocked

0:14:22.800 --> 0:14:24.050
<v Speaker 2>down by a car on the way to

0:14:24.140 --> 0:14:26.340
<v Speaker 2>buy an ice cream is very rare and it is

0:14:26.340 --> 0:14:30.650
<v Speaker 2>so beneficial that the child can make that journey by themselves.

0:14:33.440 --> 0:14:33.650
<v Speaker 1>Mhm,

0:14:35.830 --> 0:14:39.250
<v Speaker 1>Wow, interesting. Yeah, it's really hard to get a worldwide

0:14:39.250 --> 0:14:42.180
<v Speaker 1>picture of this, but it's certainly true that if we're

0:14:42.180 --> 0:14:45.380
<v Speaker 1>looking at Canada and the US, both child abductions and

0:14:45.380 --> 0:14:48.380
<v Speaker 1>road deaths are down. But what strikes me here is

0:14:48.380 --> 0:14:51.650
<v Speaker 1>that so much of the public discourse around safety online

0:14:51.940 --> 0:14:55.120
<v Speaker 1>revolves around these sorts of really acute harms that Sonia

0:14:55.120 --> 0:14:56.160
<v Speaker 1>is talking about.

0:14:56.740 --> 0:14:59.340
<v Speaker 1>For example, you remember that Apple got into some controversy

0:14:59.340 --> 0:15:02.350
<v Speaker 1>last summer when they introduced new measures to fight child

0:15:02.350 --> 0:15:06.160
<v Speaker 1>sexual abuse material, which involved collecting some information directly from

0:15:06.160 --> 0:15:07.600
<v Speaker 1>the phones of their users.

0:15:08.040 --> 0:15:10.630
<v Speaker 1>This set off a huge debate about the balance between

0:15:10.630 --> 0:15:12.150
<v Speaker 1>security and privacy,

0:15:12.540 --> 0:15:14.310
<v Speaker 1>but I talked to someone who thinks this is the

0:15:14.310 --> 0:15:15.750
<v Speaker 1>wrong debate to be having.

0:15:16.340 --> 0:15:19.650
<v Speaker 1>This is Valerie Steeves, professor of criminology at the University

0:15:19.650 --> 0:15:21.050
<v Speaker 1>of Ottawa,

0:15:22.040 --> 0:15:25.590
<v Speaker 1>Sorry, I was watching the clock, watching the clock, and

0:15:25.590 --> 0:15:27.790
<v Speaker 1>then naturally I got tied up in something on my

0:15:27.790 --> 0:15:30.920
<v Speaker 1>computer just when it was time to log in. No problem.

0:15:30.930 --> 0:15:32.770
<v Speaker 1>It's so nice to meet you finally. That's lovely to

0:15:32.770 --> 0:15:35.250
<v Speaker 1>meet you too. Valerie does a lot of research around

0:15:35.250 --> 0:15:38.820
<v Speaker 1>human rights and technology. She also co-leads the eQuality

0:15:38.820 --> 0:15:43.030
<v Speaker 1>Project, where she talks to kids, policymakers, and experts about

0:15:43.030 --> 0:15:46.160
<v Speaker 1>ways to promote privacy and equality online.

0:15:46.540 --> 0:15:49.480
<v Speaker 1>She makes the argument that this entire discussion is a

0:15:49.480 --> 0:15:53.760
<v Speaker 1>distraction from the broader problems kids actually face online

0:15:54.240 --> 0:15:56.340
<v Speaker 1>and that in fact it's in the interests of tech

0:15:56.340 --> 0:15:59.250
<v Speaker 1>companies for us to be focusing on these kinds of

0:15:59.250 --> 0:16:00.360
<v Speaker 1>security issues.

0:16:01.440 --> 0:16:04.420
<v Speaker 1>One of the things that fascinates me that I don't

0:16:04.420 --> 0:16:07.250
<v Speaker 1>think anybody really talks about very often outside of academic

0:16:07.250 --> 0:16:10.840
<v Speaker 1>circles is the fact that companies made a very conscious

0:16:10.840 --> 0:16:15.000
<v Speaker 1>choice to stop talking about privacy and start talking about

0:16:15.000 --> 0:16:15.960
<v Speaker 1>security

0:16:16.440 --> 0:16:20.160
<v Speaker 1>Companies started saying your child is at risk of being

0:16:20.170 --> 0:16:24.220
<v Speaker 1>attacked by a predator online. Companies started saying you should

0:16:24.220 --> 0:16:26.990
<v Speaker 1>be concerned your child could be talking to a stranger online.

0:16:27.440 --> 0:16:30.540
<v Speaker 1>I've collected data for years and all of the evidence

0:16:30.540 --> 0:16:33.450
<v Speaker 1>indicates that when kids say they're talking to a stranger online,

0:16:33.660 --> 0:16:36.890
<v Speaker 1>they almost always mean someone in the context of their

0:16:36.890 --> 0:16:39.860
<v Speaker 1>social network: their grandmother in Lebanon that they've never met

0:16:39.860 --> 0:16:43.320
<v Speaker 1>in person, their cousin's friend who also plays volleyball in

0:16:43.320 --> 0:16:44.570
<v Speaker 1>the town next door.

0:16:44.740 --> 0:16:46.970
<v Speaker 1>Now, if you think about it from a company's point

0:16:46.970 --> 0:16:49.060
<v Speaker 1>of view, this makes perfect sense because if you get

0:16:49.060 --> 0:16:50.210
<v Speaker 1>parents afraid,

0:16:50.500 --> 0:16:54.730
<v Speaker 1>then they will consent to their children being placed under surveillance.

0:16:54.730 --> 0:16:58.290
<v Speaker 1>And companies very conveniently come and say, don't worry, parents

0:16:58.300 --> 0:17:01.380
<v Speaker 1>we'll watch your child. We've got this, we'll take care

0:17:01.380 --> 0:17:02.260
<v Speaker 1>of your kiddies.

0:17:02.640 --> 0:17:06.960
<v Speaker 1>And it legitimizes this constant collection of the minutia of

0:17:06.960 --> 0:17:10.230
<v Speaker 1>young people's lives. So I think when we talk about

0:17:10.300 --> 0:17:13.460
<v Speaker 1>the harms that children face, it's important to realize that

0:17:13.460 --> 0:17:17.070
<v Speaker 1>this is a constructed dialogue. You're being told your kids

0:17:17.070 --> 0:17:18.770
<v Speaker 1>are at risk of these harms because

0:17:19.140 --> 0:17:20.410
<v Speaker 1>it makes somebody money.

0:17:21.240 --> 0:17:24.340
<v Speaker 1>Now in a previous lifetime, I was a criminal lawyer

0:17:24.670 --> 0:17:27.780
<v Speaker 1>and I can tell you, from sex traffickers that

0:17:27.780 --> 0:17:30.889
<v Speaker 1>I have talked to, that the children who are vulnerable

0:17:30.890 --> 0:17:34.649
<v Speaker 1>to that type of attack share a certain set of characteristics.

0:17:34.700 --> 0:17:36.680
<v Speaker 1>One of them is not the mere fact that they're

0:17:36.680 --> 0:17:41.160
<v Speaker 1>online. Valerie says those characteristics include childhood trauma,

0:17:41.380 --> 0:17:44.129
<v Speaker 1>a history of sexual abuse, low self-esteem,

0:17:44.340 --> 0:17:47.280
<v Speaker 1>running away from home and minimal social support.

0:17:48.040 --> 0:17:51.000
<v Speaker 1>In other words, if your child is vulnerable to that

0:17:51.200 --> 0:17:54.400
<v Speaker 1>and they go online, you should be really worried. If

0:17:54.400 --> 0:17:57.690
<v Speaker 1>your child is not vulnerable to that and they go online,

0:17:58.140 --> 0:18:01.210
<v Speaker 1>we haven't seen a case where that's worked out badly

0:18:01.210 --> 0:18:04.109
<v Speaker 1>for you or your child. One of the markers of

0:18:04.109 --> 0:18:07.409
<v Speaker 1>vulnerability is not the fact that your kid speaks to

0:18:07.410 --> 0:18:09.920
<v Speaker 1>their friends on TikTok. The real problem you should be

0:18:09.920 --> 0:18:12.110
<v Speaker 1>worried about is the fact that everything they say on

0:18:12.109 --> 0:18:15.370
<v Speaker 1>TikTok is grabbed by the platform and used for its

0:18:15.380 --> 0:18:16.420
<v Speaker 1>own purposes,

0:18:16.540 --> 0:18:18.620
<v Speaker 1>usually not for the child's well being.

0:18:18.740 --> 0:18:23.290
<v Speaker 1>So the recent news about Facebook, you know, intentionally gathering

0:18:23.290 --> 0:18:26.020
<v Speaker 1>data and using it against young people to give them

0:18:26.020 --> 0:18:28.879
<v Speaker 1>a bad sense of self or a bad body image. Well,

0:18:28.880 --> 0:18:32.490
<v Speaker 1>that's good marketing. That's been around for 20 years. I mean,

0:18:32.490 --> 0:18:34.080
<v Speaker 1>there's nothing new about this.

0:18:34.240 --> 0:18:38.949
<v Speaker 1>These corporations are purposely injecting a child's social environment with

0:18:38.950 --> 0:18:44.480
<v Speaker 1>these highly stereotypical, mediatized images to manipulate them for

0:18:44.480 --> 0:18:48.370
<v Speaker 1>commercial purposes and to create sticky places where they get

0:18:48.369 --> 0:18:51.830
<v Speaker 1>addicted and feel drawn to out of their own sense

0:18:51.830 --> 0:18:55.310
<v Speaker 1>of insecurity because they drop more data about themselves when

0:18:55.310 --> 0:18:57.270
<v Speaker 1>they're there. So it's a two-pronged attack.

0:19:01.240 --> 0:19:05.510
<v Speaker 1>You mentioned Valerie Steeves speaks to kids too. Did she

0:19:05.510 --> 0:19:08.290
<v Speaker 1>share any of their thoughts on this? Absolutely, she did.

0:19:08.290 --> 0:19:11.379
<v Speaker 1>And essentially she says they've caught on to what's happening

0:19:11.390 --> 0:19:12.960
<v Speaker 1>and they're starting to adapt.

0:19:13.840 --> 0:19:17.240
<v Speaker 1>Well, I've seen a real shift over time. Actually, in 2001,

0:19:17.240 --> 0:19:18.930
<v Speaker 1>when we first started talking to kids,

0:19:19.440 --> 0:19:24.150
<v Speaker 1>they talked about companies as trustworthy online friends. Oh, it's

0:19:24.150 --> 0:19:26.090
<v Speaker 1>a corporate site, it's got a dot com, I can

0:19:26.090 --> 0:19:30.070
<v Speaker 1>trust that company. By 2015, when you sit down with

0:19:30.070 --> 0:19:32.610
<v Speaker 1>kids and talk to them about corporations on the internet,

0:19:32.609 --> 0:19:34.200
<v Speaker 1>they call them creepy old men

0:19:34.440 --> 0:19:37.030
<v Speaker 1>who are just trying to manipulate us. Have you

0:19:37.030 --> 0:19:39.690
<v Speaker 1>ever read a privacy policy? It clearly wasn't written so

0:19:39.690 --> 0:19:40.760
<v Speaker 1>I could understand it,

0:19:41.040 --> 0:19:43.479
<v Speaker 1>you know, they're just trying to hoodwink me. Kids now

0:19:43.480 --> 0:19:46.760
<v Speaker 1>talk about technology as kind of a necessary evil in

0:19:46.760 --> 0:19:48.740
<v Speaker 1>my life. Yeah, I have a phone, I love talking

0:19:48.740 --> 0:19:52.340
<v Speaker 1>to my friends, but I'm really careful what I do.

0:19:52.410 --> 0:19:54.960
<v Speaker 1>In fact, in the last four years or so we

0:19:54.960 --> 0:19:59.350
<v Speaker 1>found kids are withdrawing from online interactions. They have a

0:19:59.359 --> 0:20:01.670
<v Speaker 1>fake online public persona

0:20:01.800 --> 0:20:04.700
<v Speaker 1>that is really bland because they're trying not to attract

0:20:04.700 --> 0:20:05.459
<v Speaker 1>any bad

0:20:05.840 --> 0:20:10.040
<v Speaker 1>consequences, right? But at the same time, I remember about

0:20:10.040 --> 0:20:12.879
<v Speaker 1>five years ago, somebody saying, you know, it's really sad now,

0:20:12.880 --> 0:20:14.290
<v Speaker 1>all I can do is talk to my friends in

0:20:14.290 --> 0:20:22.580
<v Speaker 1>person. Comes full circle. It does, and I think that

0:20:22.580 --> 0:20:24.669
<v Speaker 1>that's actually a really good sign because it shows you

0:20:24.670 --> 0:20:26.170
<v Speaker 1>that kids are thoughtful

0:20:26.540 --> 0:20:29.880
<v Speaker 1>and that they are not being hoodwinked, that they've figured

0:20:29.880 --> 0:20:30.950
<v Speaker 1>a lot of this out.

0:20:31.340 --> 0:20:34.560
<v Speaker 1>But the reason they have is because they've lived the consequences.

0:20:34.570 --> 0:20:37.130
<v Speaker 1>Like I've talked to lots of kids who over the

0:20:37.130 --> 0:20:41.700
<v Speaker 1>years suffered horribly because they posted a bikini shot and

0:20:41.700 --> 0:20:45.420
<v Speaker 1>then they were just ripped into shreds by peers and

0:20:45.420 --> 0:20:51.170
<v Speaker 1>strangers online. And so what they've learned is don't be online.

0:20:51.340 --> 0:20:53.820
<v Speaker 1>They're looking at the internet much more as a flat

0:20:53.830 --> 0:20:55.320
<v Speaker 1>entertainment medium.

0:20:55.740 --> 0:20:57.649
<v Speaker 1>You have to really be sure of who you're talking

0:20:57.650 --> 0:20:59.530
<v Speaker 1>to and the environment you're in to say something. And I

0:20:59.530 --> 0:21:03.660
<v Speaker 1>actually think that's what kids complain about. It's a burdensome,

0:21:04.240 --> 0:21:08.560
<v Speaker 1>unnecessary, difficult world that we've created for them and we've

0:21:08.560 --> 0:21:12.850
<v Speaker 1>created it largely because it works really well for tech companies.

0:21:13.940 --> 0:21:17.359
<v Speaker 1>That's super interesting. It reminds me of something Scarlett said

0:21:17.359 --> 0:21:19.340
<v Speaker 1>when I asked her if there's anything she worries about

0:21:19.340 --> 0:21:20.450
<v Speaker 1>encountering online.

0:21:20.940 --> 0:21:23.170
<v Speaker 1>Not really. I mean, I know there's bad stuff, but

0:21:23.170 --> 0:21:26.250
<v Speaker 1>I don't really worry about encountering it because I just

0:21:26.250 --> 0:21:30.119
<v Speaker 1>block them. Yeah, it's nice to have that power. There

0:21:30.119 --> 0:21:32.260
<v Speaker 1>are some people that I would like to block in

0:21:32.260 --> 0:21:36.270
<v Speaker 1>real life. Wouldn't that be nice if you could just

0:21:36.280 --> 0:21:38.790
<v Speaker 1>straight up block people? You don't see them, don't hear them,

0:21:38.790 --> 0:21:40.950
<v Speaker 1>don't know they exist. That would be great.

0:21:41.540 --> 0:21:44.450
<v Speaker 1>Yeah, she's right. I also wish I could block people

0:21:44.450 --> 0:21:45.460
<v Speaker 1>in the real world,

0:21:45.840 --> 0:21:49.810
<v Speaker 1>Don't we all. Scarlett also runs some servers on Discord,

0:21:49.810 --> 0:21:52.859
<v Speaker 1>which is a chat platform with these really long codes

0:21:52.859 --> 0:21:53.670
<v Speaker 1>of conduct.

0:21:54.040 --> 0:21:57.740
<v Speaker 1>Number one, be respectful, civil and welcoming. Number two, no

0:21:57.740 --> 0:22:01.210
<v Speaker 1>inappropriate or unsafe content. Number three, do not misuse or

0:22:01.210 --> 0:22:04.100
<v Speaker 1>spam in any of the channels. Number four, do not

0:22:04.100 --> 0:22:06.270
<v Speaker 1>join the server to promote your content.

0:22:07.640 --> 0:22:07.850
<v Speaker 1>Mhm.

0:22:11.640 --> 0:22:13.679
<v Speaker 1>If people break the rules, she has a series of

0:22:13.680 --> 0:22:16.560
<v Speaker 1>warnings which escalate to them being banned. She seems like

0:22:16.560 --> 0:22:19.359
<v Speaker 1>she's figured out how to keep things manageable and civil.

0:22:19.770 --> 0:22:21.160
<v Speaker 1>Something that really struck me in a lot

0:22:21.160 --> 0:22:24.619
<v Speaker 1>of these conversations is just the sophistication of the self

0:22:24.630 --> 0:22:27.450
<v Speaker 1>governance that's emerging in these spaces. People have

0:22:27.640 --> 0:22:30.619
<v Speaker 1>real rules and guidelines for how they want people in

0:22:30.619 --> 0:22:32.169
<v Speaker 1>their communities to behave.

0:22:32.340 --> 0:22:36.000
<v Speaker 1>Kids included. Absolutely, absolutely. The final thing I want to

0:22:36.000 --> 0:22:38.860
<v Speaker 1>talk to Valerie about is whether there's anything parents can

0:22:38.859 --> 0:22:40.909
<v Speaker 1>do to make things a little bit better and a

0:22:40.910 --> 0:22:43.950
<v Speaker 1>little safer. She had a piece of advice that's pretty simple.

0:22:44.340 --> 0:22:45.450
<v Speaker 1>Don't be a hypocrite.

0:22:48.840 --> 0:22:50.650
<v Speaker 1>The first thing you should do as a parent is,

0:22:50.650 --> 0:22:52.399
<v Speaker 1>take a real look in the mirror and take an

0:22:52.400 --> 0:22:55.090
<v Speaker 1>inventory of your own tech use because one of the

0:22:55.100 --> 0:22:57.700
<v Speaker 1>consistent things we hear from kids is how much they

0:22:57.700 --> 0:23:00.450
<v Speaker 1>hate the fact their parents use devices.

0:23:00.940 --> 0:23:03.510
<v Speaker 1>I remember when my kids were in high school and

0:23:03.510 --> 0:23:06.290
<v Speaker 1>they came in one day and I was answering an

0:23:06.290 --> 0:23:08.939
<v Speaker 1>email for work on my phone and, boy, did I

0:23:08.940 --> 0:23:11.750
<v Speaker 1>catch it hot, you know, like I can't believe it,

0:23:11.760 --> 0:23:14.400
<v Speaker 1>you know, you're not paying attention to me, I want

0:23:14.400 --> 0:23:15.359
<v Speaker 1>to talk to you about this.

0:23:15.440 --> 0:23:17.400
<v Speaker 1>And I realized that as soon as my kids came

0:23:17.400 --> 0:23:19.909
<v Speaker 1>home from high school I put my phone in a drawer,

0:23:19.920 --> 0:23:21.560
<v Speaker 1>I didn't want it anywhere near me.

0:23:21.740 --> 0:23:24.350
<v Speaker 1>Not because they sat down and talked to me every day,

0:23:24.350 --> 0:23:27.220
<v Speaker 1>but because, like, it's not about that. There's no such thing

0:23:27.220 --> 0:23:29.869
<v Speaker 1>as quality time, there's only quantity time. You need to

0:23:29.869 --> 0:23:34.510
<v Speaker 1>be there because between 11:05 and 11:07 one night, they might

0:23:34.510 --> 0:23:36.800
<v Speaker 1>want to tell you this terrible thing happened at school.

0:23:37.040 --> 0:23:38.760
<v Speaker 1>If you're not there, you don't get it. The interesting

0:23:38.760 --> 0:23:40.670
<v Speaker 1>thing is, from kids' point of view, if I'm on

0:23:40.670 --> 0:23:42.949
<v Speaker 1>my phone or on my iPad or my computer,

0:23:43.040 --> 0:23:47.360
<v Speaker 1>I'm not there because I'm somewhere else. I'm on the screen, right?

0:23:47.740 --> 0:23:48.930
<v Speaker 1>And the last thing I want to ask you is

0:23:48.930 --> 0:23:50.350
<v Speaker 1>really stems off that, which is

0:23:50.740 --> 0:23:52.909
<v Speaker 1>what parents can learn from their kids here. And I

0:23:52.910 --> 0:23:56.169
<v Speaker 1>think that modeling behavior is probably one of the big ones, right?

0:23:56.540 --> 0:24:00.820
<v Speaker 1>Yeah, absolutely. I think it's actually in all forms of parenting,

0:24:00.820 --> 0:24:04.080
<v Speaker 1>it is our strongest tool. Kids don't really hear the

0:24:04.080 --> 0:24:06.680
<v Speaker 1>words you say. They watch your actions and the way

0:24:06.680 --> 0:24:11.570
<v Speaker 1>you live. So if we model a healthy relationship with technology,

0:24:11.570 --> 0:24:13.670
<v Speaker 1>it's easier for them to navigate this world.

0:24:14.140 --> 0:24:17.129
<v Speaker 1>Keep in mind that it's becoming increasingly common for kids

0:24:17.130 --> 0:24:18.550
<v Speaker 1>to talk about disconnection.

0:24:18.940 --> 0:24:21.179
<v Speaker 1>About three years ago, I got a call from my

0:24:21.180 --> 0:24:23.740
<v Speaker 1>friend Val Michaelson, who was at Queen's. She was

0:24:23.740 --> 0:24:25.510
<v Speaker 1>working with a group of kids that wanted to do

0:24:25.510 --> 0:24:28.540
<v Speaker 1>some youth participatory action research. And they were interested in

0:24:28.540 --> 0:24:31.850
<v Speaker 1>social media and how social media affected them. I've heard

0:24:31.850 --> 0:24:33.899
<v Speaker 1>kids for years say, you know, I worry that all

0:24:33.900 --> 0:24:36.960
<v Speaker 1>these devices make me lazy. I worry that they cut

0:24:36.960 --> 0:24:38.460
<v Speaker 1>me off from my friends.

0:24:38.540 --> 0:24:40.290
<v Speaker 1>I go over to my friend's house and everybody's on

0:24:40.290 --> 0:24:41.660
<v Speaker 1>their phone that's boring.

0:24:41.840 --> 0:24:44.590
<v Speaker 1>So we worked with this group and they decided that

0:24:44.590 --> 0:24:47.219
<v Speaker 1>they wanted to try a disconnection challenge. They wanted to

0:24:47.220 --> 0:24:51.150
<v Speaker 1>completely unplug from technology for a week. And they were thrilled.

0:24:51.160 --> 0:24:52.459
<v Speaker 1>This is amazing.

0:24:52.640 --> 0:24:56.020
<v Speaker 1>It's gonna be so cool. We will see how technology

0:24:56.020 --> 0:24:58.950
<v Speaker 1>affects us. And one kid said, oh, except for school.

0:24:58.960 --> 0:25:00.550
<v Speaker 1>You know, my teacher is not going to let me

0:25:00.550 --> 0:25:03.480
<v Speaker 1>not use the internet for school. Okay, so no technology

0:25:03.480 --> 0:25:06.800
<v Speaker 1>except for school. Yeah, it'll be amazing. Oh, except my

0:25:06.800 --> 0:25:08.959
<v Speaker 1>parents will never put up with that. Because when I'm

0:25:08.960 --> 0:25:11.040
<v Speaker 1>in ballet, my mom always wants me to text her

0:25:11.040 --> 0:25:12.350
<v Speaker 1>to let her know when to come.

0:25:12.740 --> 0:25:16.290
<v Speaker 1>And they went, oh, I can't because my coach sends

0:25:16.290 --> 0:25:19.830
<v Speaker 1>me my schedule on my phone. I can't because my

0:25:19.830 --> 0:25:21.690
<v Speaker 1>boss tells me when I need to come in on

0:25:21.690 --> 0:25:24.380
<v Speaker 1>Saturday on my phone. And they turned to the two

0:25:24.380 --> 0:25:26.689
<v Speaker 1>adults in the room, Val and I, and they said,

0:25:26.690 --> 0:25:29.119
<v Speaker 1>what is wrong with you adults? Why are you always

0:25:29.119 --> 0:25:31.270
<v Speaker 1>forcing us to use technology?

0:25:36.140 --> 0:25:40.160
<v Speaker 1>So Nicole, I'm curious to hear your thoughts about those conversations.

0:25:40.540 --> 0:25:42.470
<v Speaker 1>Yeah. You know, one thing that stood out to me

0:25:42.470 --> 0:25:45.300
<v Speaker 1>is the creepy old man archetype and how it's

0:25:45.300 --> 0:25:47.260
<v Speaker 1>undergone a pretty interesting shift.

0:25:47.740 --> 0:25:50.990
<v Speaker 1>You know, where once upon a time kids imagined the

0:25:50.990 --> 0:25:54.480
<v Speaker 1>stereotypical internet predator as a man waiting for them in

0:25:54.480 --> 0:25:55.360
<v Speaker 1>a chat room.

0:25:55.840 --> 0:25:58.420
<v Speaker 1>Today, instead, the creepy old man is actually the tech

0:25:58.420 --> 0:26:01.859
<v Speaker 1>company gathering their information and watching what they do.

0:26:02.330 --> 0:26:05.820
<v Speaker 1>And I think that shows how sophisticated kids' understanding of

0:26:05.820 --> 0:26:09.530
<v Speaker 1>online harms has become. I completely agree. And it really

0:26:09.530 --> 0:26:10.550
<v Speaker 1>struck me

0:26:10.940 --> 0:26:14.740
<v Speaker 1>that people who have spoken to kids over a decade

0:26:14.780 --> 0:26:18.250
<v Speaker 1>see a real change in how kids see this space

0:26:18.250 --> 0:26:21.609
<v Speaker 1>and see harm in this space where originally there was

0:26:21.609 --> 0:26:25.510
<v Speaker 1>this digital divide where the kids understood technology and parents

0:26:25.510 --> 0:26:28.960
<v Speaker 1>didn't and parents were scared of it and kids weren't.

0:26:29.140 --> 0:26:31.770
<v Speaker 1>And it seems way more sophisticated now where

0:26:31.940 --> 0:26:34.920
<v Speaker 1>kids are actually concerned about some real things. They just

0:26:34.920 --> 0:26:38.000
<v Speaker 1>don't frame it the same way as parents. It's not

0:26:38.010 --> 0:26:38.560
<v Speaker 1>this kind of

0:26:38.940 --> 0:26:41.590
<v Speaker 1>predator in a trench coat. It's, like you say,

0:26:41.590 --> 0:26:45.790
<v Speaker 1>more nuanced. It's the type of trolling they receive, the

0:26:45.790 --> 0:26:49.010
<v Speaker 1>type of bad behavior, the data collection, the invasions of

0:26:49.010 --> 0:26:52.350
<v Speaker 1>their privacy. And I think if we switch the terms of

0:26:52.350 --> 0:26:55.490
<v Speaker 1>our conversation to theirs, it can be a much more

0:26:55.490 --> 0:26:56.810
<v Speaker 1>constructive place

0:26:57.040 --> 0:26:59.050
<v Speaker 1>for talking about online safety.

0:26:59.240 --> 0:27:03.139
<v Speaker 1>Absolutely. The other thing that I found interesting is something

0:27:03.140 --> 0:27:05.060
<v Speaker 1>Sonia Livingstone said about

0:27:05.440 --> 0:27:09.700
<v Speaker 1>giving kids privacy. You know, I think before hearing from her,

0:27:09.710 --> 0:27:12.520
<v Speaker 1>I was pretty guilty of thinking of it as a

0:27:12.520 --> 0:27:17.760
<v Speaker 1>one way street where parents demonstrate their trust in kids

0:27:17.760 --> 0:27:19.169
<v Speaker 1>by giving them autonomy.

0:27:19.540 --> 0:27:21.950
<v Speaker 1>And she kind of brought to mind for me that

0:27:21.960 --> 0:27:25.360
<v Speaker 1>the more we create spaces where kids are able to

0:27:25.740 --> 0:27:29.679
<v Speaker 1>play and make mistakes and not feel too surveilled,

0:27:30.140 --> 0:27:32.810
<v Speaker 1>the more trust kids will have in their parents and

0:27:32.810 --> 0:27:34.850
<v Speaker 1>in the adults in their life because

0:27:35.240 --> 0:27:37.640
<v Speaker 1>they'll know that they don't need to hide the fact

0:27:37.640 --> 0:27:40.900
<v Speaker 1>that they've made a mistake or that they've encountered something

0:27:40.900 --> 0:27:43.360
<v Speaker 1>in an online environment that makes them uncomfortable

0:27:43.740 --> 0:27:45.710
<v Speaker 1>because their parents have given them the freedom to go

0:27:45.710 --> 0:27:48.490
<v Speaker 1>out and have those experiences. And so there's that kind

0:27:48.490 --> 0:27:51.070
<v Speaker 1>of built in confidence that comes from the fact that

0:27:51.070 --> 0:27:54.080
<v Speaker 1>they know their parents trust them to roam. So I

0:27:54.080 --> 0:27:57.120
<v Speaker 1>get that and I understand the point that they were

0:27:57.119 --> 0:28:00.780
<v Speaker 1>all making that we need spaces online where kids can

0:28:00.780 --> 0:28:03.359
<v Speaker 1>take risks and have privacy

0:28:03.740 --> 0:28:07.070
<v Speaker 1>and have their own identities. The challenge I face, I

0:28:07.070 --> 0:28:09.700
<v Speaker 1>think, as a parent, is that you want those risks to

0:28:09.700 --> 0:28:12.530
<v Speaker 1>be acceptable. You don't want to let your kids loose

0:28:12.530 --> 0:28:14.900
<v Speaker 1>in these places knowing that there's all these benefits they're

0:28:14.900 --> 0:28:18.870
<v Speaker 1>going to get, but also that they'll be subjected to real

0:28:18.880 --> 0:28:19.960
<v Speaker 1>potential harm.

0:28:20.440 --> 0:28:23.700
<v Speaker 1>And so that for me takes us into a conversation

0:28:23.700 --> 0:28:27.399
<v Speaker 1>about governance. How do we collectively want to set the

0:28:27.400 --> 0:28:30.210
<v Speaker 1>rules for some of these spaces? So yes, kids have

0:28:30.210 --> 0:28:33.570
<v Speaker 1>freedom and privacy and are able to have independence and

0:28:33.570 --> 0:28:36.820
<v Speaker 1>take risks but also not be subjected to the worst

0:28:36.820 --> 0:28:39.760
<v Speaker 1>harms we know potentially exist in this space.

0:28:39.840 --> 0:28:41.570
<v Speaker 1>And that's what we're gonna be talking about next episode when

0:28:41.570 --> 0:28:44.790
<v Speaker 1>we really dive into some of the governance opportunities in

0:28:44.790 --> 0:28:47.360
<v Speaker 1>this space that are being deployed around the world.

0:28:47.540 --> 0:28:48.460
<v Speaker 1>So stay tuned.

0:28:50.040 --> 0:28:50.360
<v Speaker 1>Mm hmm.

0:28:57.940 --> 0:29:00.670
<v Speaker 1>Thanks for listening to this episode of Screen Time from

0:29:00.670 --> 0:29:04.380
<v Speaker 1>TVO, Antica Productions and the Centre for Media, Technology and

0:29:04.380 --> 0:29:06.270
<v Speaker 1>Democracy at McGill University.

0:29:06.940 --> 0:29:09.650
<v Speaker 1>I produce this show along with our senior producer, Kevin

0:29:09.650 --> 0:29:14.040
<v Speaker 1>Sexton. Production assistance by Emily Marantz. Research support by Sonia

0:29:14.040 --> 0:29:18.190
<v Speaker 1>Solomon, Kody Hakka and Helen Hayes. Mixing and sound design

0:29:18.190 --> 0:29:19.170
<v Speaker 1>by Phil Wilson.

0:29:19.540 --> 0:29:22.930
<v Speaker 1>Our executive producer is Laura Regehr. Stuart Coxe is the

0:29:22.930 --> 0:29:26.940
<v Speaker 1>president of Antica. Katie O'Connor is the senior producer of podcasts

0:29:26.950 --> 0:29:31.450
<v Speaker 1>at TVO. Laurie Few is the executive producer for digital at TVO.

0:29:31.840 --> 0:29:33.760
<v Speaker 1>if you like what you heard tell a friend.

0:29:45.840 --> 0:29:48.270
<v Speaker 1>Nobody reads the terms of service. Nobody,

0:29:48.840 --> 0:29:53.280
<v Speaker 1>literally. It says terms of service and you just click accept

0:29:53.280 --> 0:29:56.220
<v Speaker 1>immediately because nobody cares. I barely have the patience to

0:29:56.220 --> 0:29:57.959
<v Speaker 1>make a sandwich. I'm not going to read that.
