Speaker 1: Welcome to the Tudor Dixon Podcast. You guys are probably all seeing this kind of landmark decision with Meta and Google, where they have to actually pay damages because they are, I guess, essentially being found guilty of affecting a child's mental health, and therefore things that happened in this child's life. And honestly, I will tell you, you all have heard me talk about social media in the past. I really feel, as a parent, that this is something that is going to be a part of their lives, and there's a lot of responsibility on mom and dad too. I think this case is very interesting because obviously they've been found liable for something that they've done here, and they're having to pay damages. Now, we don't know if this will hold up on appeal, but I just wanted to kind of dive deep into what this means, how this could go for companies like this, and at what point we can definitively say that someone's mental health was affected by one particular thing instead of a variety of different things that happened throughout your life.
Speaker 1: How do you know one thing is responsible? So we brought an expert in. She is an expert in mental health policy, and she's also at the Manhattan Institute. Her name is Carolyn Gorman. Thank you so much.

Speaker 2: Thank you so much for having me here, Tudor. And it is a big case.

Speaker 3: And I'll give you the highlight, the top line, of your last question: we're never going to know, at least we haven't yet so far, when one thing causes a mental health challenge. So it's definitely a very nuanced decision that has some upsides and some downsides.

Speaker 2: So I think you're looking at it in exactly the way we should.

Speaker 1: Well, so that is, and I don't mean to be, you know, I'm not like a huge fan of the social media platforms by any means. I certainly think that there is this addictive factor that is a part of it, and I've seen it happen with adults. I've seen it happen with kids. But I mean, I see the same thing with potato chips. You know, it's like there's an addictive factor built into a lot of products that are sold.
Speaker 1: And at what point is it our responsibility to say we have to kind of force ourselves off of this?

Speaker 2: I think there's a couple of things going on here.

Speaker 3: First of all, we know that there are a lot of kind of complex problems around mental health, and that oftentimes you have one problem associated with another problem. So the causality here, the direction of the causality, is not necessarily clear.

Speaker 2: We don't know.

Speaker 3: If the underlying mental health condition is sort of drawing someone to these platforms, or if the platforms are exacerbating the situation.

Speaker 2: But I think, just to be clear.

Speaker 3: What the case found, so it was a jury in a personal injury trial, this is in Los Angeles, they found Meta and YouTube liable for negligence and for failing to warn minors about the dangers associated with their platforms. And so yes, essentially, the jury decided that the plaintiff, who alleged that she became addicted to apps like YouTube and Instagram, that the kind of design of the platforms was the type of thing that was addicting her.
Speaker 3: The jury did ultimately agree with this, and they decided that the negligence of these companies played a really substantial factor in causing her mental health harm.

Speaker 1: I mean, it's interesting, because I think there are a few different trials going on right now. This was one of them, and this was, as you said, based on really how the product is built and how the product kind of sucks you in. And the other one, I feel like, is about the content that is actually on the sites, and there's potentially been content that has driven some people to be convinced to commit suicide and things like that, and they're saying that content should be policed. We've had some situations. I know there was a situation a few years ago in Michigan where a kid was being extorted, had to pay money, because they had convinced him to send a picture, and so there were some dangerous things going on on their platform. And I do feel like if your platform has things like that, there should be some way to police it.
Speaker 1: Again, I struggle with the idea. I guess I understand giving them the warning, like potentially saying, just so you know, this is how this is built, this is how you can get sucked in. But I do think we live in this world where we have to force ourselves off. And we do have a situation where we're almost constantly forced back in, because even schools have devices now. Like, my kids get a Chromebook. Other kids at their old school got an iPad, and they were told, find videos on YouTube, use this as a resource. So the information is kind of conflicting.

Speaker 3: I think that's exactly right, Tudor. Of course, personal responsibility is not something that we should give up on, and it is on parents to make sure that their kids are not using this technology.

Speaker 2: You know, too much.

Speaker 3: But even the most engaged parents have found this to be a real problem, for exactly the types of reasons that you've said.
Speaker 3: And so I think the kind of most positive outcome of these cases is that they reflect a growing acknowledgment and consensus among Americans: we recognize that these online platforms are problematic, we're using them more than we want them to be used, and, again, far more concerning, they are putting kids at serious risk of harm. And so I think we can balance personal responsibility with the fact that we need a little bit more support from something else out there. And so I think we should take this consensus as a sign that government can be involved and.

Speaker 2: Should be involved.

Speaker 1: So what are the next steps? Because if one group can sue and they get this money, then don't you think there could be endless lawsuits behind this, where people are going, oh, I mean, I haven't gotten my laundry done like I should because I'm on my cell phone, so I'm like, oh my gosh, I could have just done the laundry for the last twenty minutes? You know, we have.
Speaker 1: Hasn't it happened to all of us? And at what point is this where you're just like, you know what, I got fined because my mom is in a mood and that's your fault? You know, like, at what point do we, how do we stop the lawsuits?

Speaker 2: It's definitely going to be a difficult problem.

Speaker 3: So, first, we don't know that these decisions are going to stick, in both cases, in New Mexico and in California.

Speaker 2: Meta has said that they're going to appeal.

Speaker 3: In the California case, it is not unreasonable to think that the verdict could be weakened or even reversed. And that's because there's not a super strong consensus around, again, these sort of causes of mental illness, mental health, and addiction, and so we've got to wait and see how that plays out. But liability and the incentive to bring a lawsuit are definitely going to be something that we should be wary of, and it's not necessarily going to be something that incentivizes these platforms to proactively make themselves safer.
Speaker 1: Is there something, though, that came out of the lawsuit, where, I mean, you talked about how they didn't have these warnings. Is that the next law that's going to be passed? Like, does that bill go in front of Congress? Like, oh, if you're a social media company, you have to have somebody click something? Because I heard someone saying the other day that in certain states now, before you open any of these apps, you have to put in your birth date. And people are getting really annoyed that they have to put in their birth date every time to open the app. And then I think, well, what if you get to the point where you have to click okay that you've read their policy every time? I mean, if that's what it takes to keep people safe. But I do think that kind of just also becomes a numbing thing, where you're like, I'm putting in my birth date, and then you're not actually paying attention to the birth date.
Speaker 3: It's a box check. You're exactly right. I mean, it's actually literally a box to check.

Speaker 2: Yeah, right. Yeah.

Speaker 3: I think enforcement is a difficult problem to solve, and so there has to be some creative thinking about this. But putting some friction in front of use seems reasonable. I think there are also kind of unique ways we might think about restriction that might seem counterintuitive. For one example, we might not want to require accounts to be set up, because then, even if someone is underage or young, their data isn't necessarily being collected. So we're not stopping them from viewing content, but at least we can protect them in that way. We can protect their data in that way.

Speaker 1: I see. Yeah, that's interesting.
Speaker 1: I wouldn't have thought about that, but I guess that is something that, at this point, maybe these lawsuits kick these things off, and then that's when lawmakers start to go, okay, now that we've seen how the public feels about this, and we've seen how the jury feels about this, how do we take what has happened in the court and turn that into law? Although I think sometimes that also becomes a bit of an issue, because then you never know what government's.

Speaker 2: Actually going to do.

Speaker 3: Again, there's upsides and downsides to everything, and I think we need to weigh those trade-offs in a very measured and balanced way. We do know that these platforms are making kids less safe. In New Mexico, you know, there were a lot of scary examples of children being exploited, and the jury in the California case really felt like Mark Zuckerberg's testimony was not necessarily consistent. You know, reporting kind of described them as getting bad vibes. And so these are things that we need to consider.
Speaker 1: One thing I think I heard, and I didn't hear exactly what it was, but I heard reporting that there were some things said in the courtroom that led people to believe that they knew they were putting kids in harm's way, and there had been, like, chats, or maybe it was email conversations, where the people that worked there were kind of like, yeah, this is really going to do this.

Speaker 3: That's right. It sounds like, from reporting, that Meta's own research was definitely influential in the jury's decision. So again, all these types of trade-offs are ones that we're going to have to face when we're addressing this complicated problem. But to your original point of making sure that we're thinking about parental responsibility, I think that we should not undervalue how much social pressure and social norms can play a positive role here. We should think about stigmatizing social media use more.
Speaker 3: I mean, we don't say it necessarily, but we all have that friend who posts a hundred selfies a day and pictures of their salads, and it's, you know, a little silly, or, I don't want to say pathetic, but, you know, we're not looking at.

Speaker 2: It in a very positive light.

Speaker 3: We could maybe be a little bit more honest and upfront about that for the greater good, and start, you know, shaming each other a little bit for being on these platforms too much, because.

Speaker 1: You know, it's so funny, because I've heard people from my generation, and this is obviously a different thing, but, you know, kids now are coming to school with these furry outfits on, and one of our friends the other day was like, and you think about this: when we were young, there were bullies. You would never have been allowed to come to school as a furry. And I think that's, I mean, to a certain extent, you know, that's true. I'm certainly not a fan of bullying, but I do think so.
Speaker 1: I will tell you, at my kids' school, somebody, and we don't know who this is, but there's some account, run by the seniors, and if anybody in the high school posts something ridiculous on their social media, they post it on this school site. Now, it's not sanctioned by the school. They're not happy about it. It gets passed down from one senior year to the next senior year. But it's funny the way my kids react to it. They're like, oh my gosh, I can't post anything like that on social media because it could end up on this site. Mom, don't post anything on social media. So they're, like, yelling at me: don't you dare ever post anything about us on social media. And I just kind of wonder if this is also new. And, you know, I look at my mom's generation, and my mom, sometimes it's like, your mom's the first one to comment on your Facebook story. Mom, don't publicly comment on my Facebook story. And you see your parents' generation, they're putting all kinds of stuff out there.
And 249 00:13:58,240 --> 00:14:00,920 Speaker 1: then I feel like my generation is like a lot 250 00:14:00,920 --> 00:14:03,480 Speaker 1: of stuff about my kids, but now it's starting to 251 00:14:03,559 --> 00:14:07,280 Speaker 1: get political, like my friends are like my kids played soccer, 252 00:14:07,520 --> 00:14:10,360 Speaker 1: and then the next post is like abolish icy, you know, 253 00:14:10,720 --> 00:14:14,559 Speaker 1: so that's a mix. And then I but I look 254 00:14:14,600 --> 00:14:17,840 Speaker 1: at my kids generation and they're all of their posts 255 00:14:17,880 --> 00:14:20,480 Speaker 1: are very artistic, like you know, things that I saw 256 00:14:20,520 --> 00:14:23,280 Speaker 1: that were cool, but not really personal and not really 257 00:14:23,280 --> 00:14:25,760 Speaker 1: about them. And I wonder if they're sort of learning 258 00:14:25,800 --> 00:14:27,440 Speaker 1: this a little bit on their own. I think that's 259 00:14:27,520 --> 00:14:28,280 Speaker 1: probably right. 260 00:14:28,600 --> 00:14:31,720 Speaker 3: And I also I learned a term from your show, 261 00:14:31,760 --> 00:14:33,119 Speaker 3: actually sharenting. 262 00:14:33,400 --> 00:14:34,160 Speaker 2: Is it charenting? 263 00:14:34,680 --> 00:14:37,000 Speaker 1: Yes, I just learned that word too, and I don't 264 00:14:37,000 --> 00:14:39,080 Speaker 1: think we've talked about it on here, but I was like, 265 00:14:39,240 --> 00:14:40,560 Speaker 1: that's such a good point. 266 00:14:40,600 --> 00:14:42,760 Speaker 3: It is a good point, and I think we should 267 00:14:42,760 --> 00:14:44,960 Speaker 3: all just kind of be more aware of what we're 268 00:14:45,000 --> 00:14:47,520 Speaker 3: posting and do we really need to do that? 269 00:14:48,640 --> 00:14:49,960 Speaker 2: But yeah, maybe kids. 
270 00:14:49,800 --> 00:14:52,280 Speaker 1: Are abroad so listen, so explain what that is for 271 00:14:52,360 --> 00:14:55,600 Speaker 1: people who haven't heard what sharenting is. And this it's 272 00:14:55,640 --> 00:14:59,240 Speaker 1: also kind of we're somewhat shaming people, like, don't put 273 00:14:59,280 --> 00:15:00,760 Speaker 1: your kids out there all the time. 274 00:15:00,960 --> 00:15:05,240 Speaker 3: So, from what I understand, sharenting is posting pictures of 275 00:15:05,280 --> 00:15:08,480 Speaker 3: your kids on you the parents' platform. So if you 276 00:15:08,520 --> 00:15:10,880 Speaker 3: have an Instagram account and you're on vacation and your 277 00:15:10,920 --> 00:15:15,400 Speaker 3: kids look super cute in their athletic beach clothes, you're 278 00:15:15,440 --> 00:15:20,800 Speaker 3: posting photos of them. But we don't necessarily know if 279 00:15:21,000 --> 00:15:23,840 Speaker 3: our children want that. And also those images are still 280 00:15:23,880 --> 00:15:27,160 Speaker 3: being put online for them, they're still kind of being 281 00:15:27,280 --> 00:15:31,360 Speaker 3: put at risk. Some of these photos end up on 282 00:15:31,600 --> 00:15:36,080 Speaker 3: pornography sites, and so we should be wary of the 283 00:15:36,160 --> 00:15:38,360 Speaker 3: type of content that we're putting out there, even if 284 00:15:38,360 --> 00:15:40,520 Speaker 3: it's something that seems sort of harmless to us. 285 00:15:40,760 --> 00:15:43,520 Speaker 1: Let's take a quick commercial break. We'll continue next on 286 00:15:43,560 --> 00:15:50,320 Speaker 1: the Tutor Dixon podcast. I saw this so creepy commercial, 287 00:15:50,320 --> 00:15:52,480 Speaker 1: which I think is interesting. I think it's something that 288 00:15:52,520 --> 00:15:55,600 Speaker 1: we should do here. It was like Scottish or it 289 00:15:55,680 --> 00:15:58,320 Speaker 1: was something from Scotland or something. 
Speaker 1: They had a thick accent; I seem to remember it being a Scottish accent. And I showed my girls, because these kids get out of the car with their parents, and the guy that's, like, pumping gas is like, so, are you going to join the club? And the kid's like... And then the next one's like, your soccer game was amazing. Like, they walk into the store and this little old woman's like, your soccer game was amazing, Amber, good job. And then they get to the checkout and they're like, so, did you make the play? Are you going to be in the school musical? And it was like, stop sharing everything about your kids. And it was so scary, because I thought, it's true. You know, I have all these people across the state that are political friends, and they post everything about their life, but they don't really know anything about me, and I see them and I think, I know everything you did last week.
Speaker 3: It's really something. I recently went smartphone-free for a week to test this out, because I think another challenge here is, if we're asking kids to be off their phones but we are buried in our own, sort of, like, how seriously do we expect them to take us? And the types of things that I experienced and felt, not having my smartphone on me, were not exactly what I expected, but all good.

Speaker 2: I felt much more local.

Speaker 3: I wasn't driving far, because I wasn't using my GPS, so I was going to the types of places I knew I could get to without having to plug in an address.

Speaker 2: And you just kind of have an opportunity to be bored.

Speaker 3: And then, when you're bored, sometimes you actually do the things that you should be doing, like cleaning the kitchen or cooking the dinner that you've been putting off because you're doomscrolling. And we as adults are guilty of this as well. So if we can be off our phones more ourselves, that's going to help us encourage our kids to be off their phones more.
Speaker 2: And we're going to get more of a positive feedback loop. Again, I don't want to.

Speaker 3: Imply that this is easy, or even possible, because so much of our lives really is online. But you do see little communities of people springing up and agreeing, you know, we're not going to be on the phones. And Jonathan Haidt really deserves a lot of credit for this; you know, he's raised so much awareness. People thought that, you know, we were just going to have to throw our hands up with the phones, and I think these cases show us that's not true. We're getting somewhere, so we shouldn't abandon our own efforts right now either.

Speaker 2: Well.

Speaker 1: I also think that we have to somewhat understand how technology, to your point, is a huge part of our lives, how it affects our kids, and how our kids will learn to use technology to, quite frankly, just get out of things too. Like, I walked into my daughter's room last night. She's gonna kill me for saying this.
It was 348 00:19:01,480 --> 00:19:04,000 Speaker 1: like late, right, and I'm like, who are you talking to 349 00:19:04,320 --> 00:19:06,600 Speaker 1: on the phone? And she was like, Mom, I'm not 350 00:19:06,640 --> 00:19:08,720 Speaker 1: on the phone with someone. I'm writing a paper. And 351 00:19:08,760 --> 00:19:11,000 Speaker 1: I was like, oh my gosh, you're not writing a paper. 352 00:19:11,280 --> 00:19:14,320 Speaker 1: You're just talking and the computer is writing the paper. 353 00:19:14,400 --> 00:19:18,720 Speaker 1: Like, she's talking to text essentially on her chromebook, just 354 00:19:19,320 --> 00:19:23,920 Speaker 1: puts the microphone on, it writes the paper. Part 355 00:19:23,960 --> 00:19:25,959 Speaker 1: of me is jealous, but the other part of me 356 00:19:26,119 --> 00:19:28,280 Speaker 1: is like, really mad. I'm like, what are you doing? 357 00:19:28,400 --> 00:19:31,320 Speaker 1: Let me define writing for you. None 358 00:19:31,359 --> 00:19:33,280 Speaker 1: of what you're doing right now is writing, you know. 359 00:19:33,400 --> 00:19:37,480 Speaker 1: And then the other night I was 360 00:19:37,520 --> 00:19:41,080 Speaker 1: with a group of parents, and one parent was like, oh, 361 00:19:41,119 --> 00:19:43,800 Speaker 1: my son has learned to master AI. And I'm like, oh, 362 00:19:43,960 --> 00:19:46,159 Speaker 1: how cool. What does that mean? And she's like, he 363 00:19:46,160 --> 00:19:48,520 Speaker 1: hasn't written a single paper in college. And I was like, oh, 364 00:19:48,560 --> 00:19:52,159 Speaker 1: that's not what I thought mastering AI meant, 365 00:19:52,359 --> 00:19:55,240 Speaker 1: like, what do you mean? And she's like, he takes 366 00:19:55,320 --> 00:19:59,240 Speaker 1: three different AIs and puts all the information into all 367 00:19:59,320 --> 00:20:02,480 Speaker 1: three and then mixes the paper up.
And actually, 368 00:20:02,720 --> 00:20:06,760 Speaker 1: the teacher came to him 369 00:20:06,800 --> 00:20:09,199 Speaker 1: specially and said, well, I feel like I should give 370 00:20:09,200 --> 00:20:11,360 Speaker 1: you an award. Your writing is so good. We're so 371 00:20:11,400 --> 00:20:13,800 Speaker 1: impressed with your writing. And she's like, and he's 372 00:20:13,840 --> 00:20:16,320 Speaker 1: never written a thing. But I was like, I don't 373 00:20:16,640 --> 00:20:18,960 Speaker 1: know that I'd say that so proudly. 374 00:20:18,600 --> 00:20:21,679 Speaker 2: I think this is actually a very serious problem. 375 00:20:21,760 --> 00:20:23,840 Speaker 3: We're laughing about it, but we're going to have to 376 00:20:23,880 --> 00:20:27,159 Speaker 3: think about this and think about what education means. And 377 00:20:27,480 --> 00:20:32,560 Speaker 3: education may look very different in the next couple of 378 00:20:32,640 --> 00:20:38,480 Speaker 3: years, because it is totally possible for us to outsource 379 00:20:38,640 --> 00:20:43,399 Speaker 3: our thinking, and we have to remember that what's easy 380 00:20:43,520 --> 00:20:46,760 Speaker 3: isn't always good for us. And it's going to take 381 00:20:46,840 --> 00:20:51,640 Speaker 3: real discipline. And again, it may take other solutions: community solutions, 382 00:20:52,600 --> 00:20:57,199 Speaker 3: private sector solutions, government solutions, whatever they are. We have 383 00:20:57,280 --> 00:21:01,120 Speaker 3: to think seriously about how much of it we want in our lives. 384 00:21:01,280 --> 00:21:06,520 Speaker 3: It does feel like the ball is rolling. But again, 385 00:21:06,600 --> 00:21:08,879 Speaker 3: that's how we felt a couple of years ago about social media, 386 00:21:09,359 --> 00:21:12,320 Speaker 3: that it's game over, we can't do anything. And we shouldn't 387 00:21:12,320 --> 00:21:13,560 Speaker 3: give up on AI either.
388 00:21:13,840 --> 00:21:15,879 Speaker 1: You know, when I was a kid, you 389 00:21:15,880 --> 00:21:18,040 Speaker 1: typed out your paper, and your 390 00:21:18,040 --> 00:21:20,399 Speaker 1: computer didn't know you were typing out a paper. It 391 00:21:20,480 --> 00:21:21,919 Speaker 1: had no idea what you were doing, right, so you 392 00:21:21,920 --> 00:21:24,480 Speaker 1: had to actually do it yourself and put in the grammar. 393 00:21:24,520 --> 00:21:28,200 Speaker 1: There was no grammar check. But you know, now they all 394 00:21:28,240 --> 00:21:31,560 Speaker 1: have these chromebooks or these iPads that they come home 395 00:21:31,600 --> 00:21:35,040 Speaker 1: from school with, and everything is interconnected and the teacher 396 00:21:35,080 --> 00:21:38,240 Speaker 1: can watch you while you're doing things. But you're still 397 00:21:38,359 --> 00:21:41,000 Speaker 1: able to do a lot, get out of a lot. 398 00:21:41,040 --> 00:21:44,119 Speaker 1: I will never forget. There was this girl who just, 399 00:21:44,480 --> 00:21:47,280 Speaker 1: you know, did a quick video on social media. But 400 00:21:47,440 --> 00:21:49,600 Speaker 1: it struck me so hard because I feel like this 401 00:21:49,640 --> 00:21:51,840 Speaker 1: is my kids' generation. She's just like whispering at work, 402 00:21:51,880 --> 00:21:54,320 Speaker 1: and she was like, I am a twenty two year 403 00:21:54,359 --> 00:21:57,320 Speaker 1: old woman with a job, and I can't spell. I 404 00:21:57,440 --> 00:21:59,719 Speaker 1: do not know how to spell. And she's like, I'm 405 00:21:59,720 --> 00:22:02,320 Speaker 1: pretty sure that my little sisters at home, I don't 406 00:22:02,359 --> 00:22:05,119 Speaker 1: think they can spell either.
And I thought, but that 407 00:22:05,359 --> 00:22:09,560 Speaker 1: was like my one child. Her spelling years were during the pandemic, 408 00:22:09,840 --> 00:22:12,680 Speaker 1: like the critical years for learning to spell were when she was 409 00:22:12,720 --> 00:22:14,760 Speaker 1: out of school. And she's like, Mom, I don't need 410 00:22:14,800 --> 00:22:17,200 Speaker 1: to spell because the computer spells for me. I'm like, 411 00:22:17,240 --> 00:22:19,400 Speaker 1: oh my gosh, how do you not know how to spell? 412 00:22:19,760 --> 00:22:20,280 Speaker 2: How do you read 413 00:22:20,320 --> 00:22:21,080 Speaker 1: if you can't spell? 414 00:22:21,480 --> 00:22:22,560 Speaker 2: But people can't spell. 415 00:22:22,520 --> 00:22:24,800 Speaker 3: I mean, it doesn't surprise me at all, because I 416 00:22:24,800 --> 00:22:26,520 Speaker 3: feel like I can barely spell. 417 00:22:27,440 --> 00:22:31,000 Speaker 1: But yeah, see, this is like, you don't need 418 00:22:31,040 --> 00:22:33,320 Speaker 1: to spell, but you do need to spell. Isn't it 419 00:22:33,359 --> 00:22:34,520 Speaker 1: weird to not know how to spell? 420 00:22:34,640 --> 00:22:36,000 Speaker 2: I mean, it's a new question with a lot of 421 00:22:36,040 --> 00:22:36,560 Speaker 2: these things. 422 00:22:36,800 --> 00:22:40,000 Speaker 3: But I think that there is some benefit to thinking 423 00:22:40,040 --> 00:22:45,000 Speaker 3: about ways that we can reduce technology in schools, even 424 00:22:45,080 --> 00:22:48,200 Speaker 3: when it comes to just doing things with pen and paper, 425 00:22:48,320 --> 00:22:50,400 Speaker 3: not even necessarily phone bans. 426 00:22:51,600 --> 00:22:53,959 Speaker 1: Right, I think you have to think more, like you 427 00:22:54,000 --> 00:22:57,439 Speaker 1: have to really stretch your brain when you're doing things 428 00:22:57,600 --> 00:23:00,480 Speaker 1: like that.
And somehow I feel like my oldest, she's 429 00:23:00,520 --> 00:23:02,960 Speaker 1: got, like, you know, calluses on her fingers. I'm like, 430 00:23:02,960 --> 00:23:04,920 Speaker 1: what are those from? She's like, I'm writing all day 431 00:23:04,960 --> 00:23:05,639 Speaker 1: long at school, 432 00:23:05,680 --> 00:23:05,880 Speaker 2: Mom. 433 00:23:06,040 --> 00:23:08,919 Speaker 1: I'm like, oh, how? You go to 434 00:23:08,920 --> 00:23:11,560 Speaker 1: a different school. Like how is this possible that you write 435 00:23:11,600 --> 00:23:15,160 Speaker 1: and your sister is talking to text? Like what is happening? 436 00:23:15,200 --> 00:23:17,200 Speaker 1: But I mean, I think part of it is the kid, 437 00:23:17,359 --> 00:23:21,080 Speaker 1: you know, because she is like the oldest, 438 00:23:21,200 --> 00:23:23,280 Speaker 1: you know, and you can tell there's a difference in 439 00:23:23,320 --> 00:23:25,320 Speaker 1: your kids. My oldest is like, I'm gonna do everything 440 00:23:25,359 --> 00:23:28,000 Speaker 1: by the book. I'm gonna do everything right. My middle 441 00:23:28,080 --> 00:23:30,480 Speaker 1: child is like, if there is a way to figure 442 00:23:30,480 --> 00:23:31,880 Speaker 1: it out, I'm going to figure it out, and that's 443 00:23:31,920 --> 00:23:34,280 Speaker 1: my creative talent, to get out of doing work. 444 00:23:34,320 --> 00:23:36,600 Speaker 1: Like, you know, I just think that's what she thinks. 445 00:23:36,840 --> 00:23:40,040 Speaker 1: And sometimes I look at her and I'm like, God, that's smart, 446 00:23:40,320 --> 00:23:42,960 Speaker 1: you know, but I don't like it. It's like with the kid, 447 00:23:43,000 --> 00:23:46,640 Speaker 1: the AI kid. I'm like, that's intelligent. But often, as 448 00:23:46,680 --> 00:23:48,280 Speaker 1: with everything, trade offs, right.
449 00:23:48,400 --> 00:23:51,320 Speaker 3: I think being creative and learning how to problem solve 450 00:23:51,600 --> 00:23:55,240 Speaker 3: is a benefit, but we also should be wary of 451 00:23:55,359 --> 00:23:59,080 Speaker 3: things that seem to come too easy. One of the 452 00:23:59,119 --> 00:24:01,720 Speaker 3: reasons that I think a lot of young people feel 453 00:24:01,880 --> 00:24:06,520 Speaker 3: disconnected right now, or feel like they don't have purpose, 454 00:24:06,840 --> 00:24:11,520 Speaker 3: or feel depressed, is because we aren't actually giving them any 455 00:24:11,720 --> 00:24:16,080 Speaker 3: reason to feel like they've accomplished something. And 456 00:24:16,400 --> 00:24:20,919 Speaker 3: we're very quick to say what's hard and what's uncomfortable 457 00:24:22,080 --> 00:24:24,600 Speaker 3: and makes us upset is a negative thing. It's a 458 00:24:24,640 --> 00:24:28,639 Speaker 3: mental health problem now, and not necessarily a challenge that 459 00:24:28,680 --> 00:24:31,359 Speaker 3: we need to overcome. And so I think we need 460 00:24:31,400 --> 00:24:36,360 Speaker 3: to be more aware of how we can inject resilience 461 00:24:36,480 --> 00:24:43,199 Speaker 3: culture and reject therapy culture. 462 00:24:43,440 --> 00:24:46,000 Speaker 1: I feel like I am constantly telling my kids, like, 463 00:24:46,040 --> 00:24:48,639 Speaker 1: you are meant to feel miserable sometimes and it's fine. 464 00:24:48,720 --> 00:24:50,760 Speaker 1: And I'm like, what is wrong with me that I 465 00:24:50,840 --> 00:24:54,119 Speaker 1: have to convince them that feeling bad about something occasionally 466 00:24:54,240 --> 00:24:57,160 Speaker 1: is okay? But I think we live in a society 467 00:24:57,200 --> 00:24:59,560 Speaker 1: where we're constantly like, how can I make this better? 468 00:24:59,760 --> 00:25:02,440 Speaker 1: How can I make you feel better? How can I make you happier?
469 00:25:02,680 --> 00:25:02,879 Speaker 3: You know? 470 00:25:02,920 --> 00:25:06,120 Speaker 1: And if you're at all unhappy... there's a difference between 471 00:25:06,160 --> 00:25:09,240 Speaker 1: being depressed and being unhappy about something. There are times 472 00:25:09,320 --> 00:25:11,800 Speaker 1: I'm unhappy. I'm unhappy when I walk in in the 473 00:25:11,800 --> 00:25:13,679 Speaker 1: middle of the night and find her not writing a 474 00:25:13,720 --> 00:25:17,000 Speaker 1: paper but talking a paper, you know. It's like, I'm unhappy, 475 00:25:17,080 --> 00:25:20,040 Speaker 1: I'm not depressed. And then if I do get 476 00:25:20,080 --> 00:25:21,960 Speaker 1: upset about something, they're like, do you hate me? And 477 00:25:22,000 --> 00:25:26,040 Speaker 1: I'm like, good grief, no. There is discipline. 478 00:25:26,080 --> 00:25:29,159 Speaker 1: There are times when discipline is necessary, and discipline is 479 00:25:29,200 --> 00:25:32,040 Speaker 1: because I love you so much, you know. But we 480 00:25:32,119 --> 00:25:35,240 Speaker 1: live in a society where it is becoming harder for 481 00:25:35,359 --> 00:25:40,800 Speaker 1: kids to understand that, because there are so few consequences. I mean, 482 00:25:40,880 --> 00:25:44,239 Speaker 1: even if you... gosh, there are no consequences. I think about, like, 483 00:25:44,840 --> 00:25:46,359 Speaker 1: you know, if you didn't get home in time to 484 00:25:46,400 --> 00:25:48,320 Speaker 1: watch your favorite show, you were just never gonna see that 485 00:25:48,359 --> 00:25:49,840 Speaker 1: show again when I was a kid. Now it's like 486 00:25:49,880 --> 00:25:52,719 Speaker 1: you can go back and rewind TV. Like 487 00:25:52,760 --> 00:25:55,760 Speaker 1: there's literally nothing you can't do.
When I was 488 00:25:55,800 --> 00:25:57,399 Speaker 1: a kid, if you had to do a research paper, 489 00:25:57,560 --> 00:25:59,280 Speaker 1: you had to go to the library. I don't even know 490 00:25:59,320 --> 00:26:03,159 Speaker 1: why we have libraries anymore, because my kids are like, 491 00:26:03,240 --> 00:26:06,960 Speaker 1: what happens at the library? Drag queens read at the library. 492 00:26:07,000 --> 00:26:09,199 Speaker 1: I'm like, that's all they know about libraries now, you know. 493 00:26:09,200 --> 00:26:11,879 Speaker 1: I'm like, no, we used to actually go there to 494 00:26:12,000 --> 00:26:14,360 Speaker 1: do research. They're like, why didn't you go on the internet? 495 00:26:14,520 --> 00:26:18,439 Speaker 1: That was not a thing, like, this is what happened, you know. 496 00:26:18,680 --> 00:26:21,320 Speaker 1: And I feel like because of that, their lives are 497 00:26:21,359 --> 00:26:25,640 Speaker 1: so easy. And I do think that in life you 498 00:26:25,640 --> 00:26:28,280 Speaker 1: have to experience some hard things when you're young, 499 00:26:28,359 --> 00:26:32,040 Speaker 1: because the reality is adulthood is very hard, regardless of 500 00:26:32,119 --> 00:26:36,120 Speaker 1: the Internet. So if your young life is very easy, 501 00:26:36,480 --> 00:26:39,439 Speaker 1: anything hard feels like a disaster. And then we have 502 00:26:39,560 --> 00:26:42,399 Speaker 1: everybody talking about mental health. And before we got on, 503 00:26:42,480 --> 00:26:44,480 Speaker 1: we were talking about how we have these mental health 504 00:26:44,480 --> 00:26:47,480 Speaker 1: clinics in schools now, and I think it's a disaster. 505 00:26:47,680 --> 00:26:49,960 Speaker 1: I'm like, the last thing I need is somebody going 506 00:26:50,000 --> 00:26:52,600 Speaker 1: to my kids and being like, let me 507 00:26:52,680 --> 00:26:55,400 Speaker 1: label you here and now at the school.
I don't 508 00:26:55,400 --> 00:26:58,640 Speaker 1: want them to be labeled. There are 509 00:26:58,680 --> 00:27:02,400 Speaker 1: people in Michigan right now, conservatives in Michigan right now, 510 00:27:02,560 --> 00:27:05,000 Speaker 1: that are like, we've got to have more mental health 511 00:27:05,080 --> 00:27:08,919 Speaker 1: clinics in our schools. Why do you want someone 512 00:27:09,000 --> 00:27:13,080 Speaker 1: you don't know giving your kid advice on what label 513 00:27:13,119 --> 00:27:15,640 Speaker 1: they should have as a mental health issue? 514 00:27:15,800 --> 00:27:20,160 Speaker 3: It is really hard to push back on a mental health narrative, 515 00:27:20,400 --> 00:27:23,240 Speaker 3: because no one wants to come off as if they 516 00:27:23,320 --> 00:27:26,760 Speaker 3: don't support kids. But what I think we really need to 517 00:27:26,880 --> 00:27:31,639 Speaker 3: recognize is that without giving kids the perspective, Tutor, that 518 00:27:32,040 --> 00:27:35,800 Speaker 3: you were talking about, telling them that, you know, it's 519 00:27:35,800 --> 00:27:38,919 Speaker 3: okay to feel upset, you don't have a condition if 520 00:27:39,000 --> 00:27:42,520 Speaker 3: you are sad sometimes or stressed. If we don't do that, 521 00:27:42,560 --> 00:27:45,200 Speaker 3: then they do believe they have something wrong with them. 522 00:27:45,520 --> 00:27:49,399 Speaker 3: And unfortunately a lot of our mental health policy and 523 00:27:49,480 --> 00:27:53,840 Speaker 3: nearly everything that happens in schools really just drives home 524 00:27:53,920 --> 00:27:58,200 Speaker 3: that message: if you feel upset, seek help. Something might 525 00:27:58,240 --> 00:28:00,960 Speaker 3: be wrong. It might be the sign of a bigger problem. 526 00:28:01,560 --> 00:28:04,360 Speaker 3: Everyone should get therapy. There's nothing wrong with that.
All 527 00:28:04,440 --> 00:28:06,800 Speaker 3: of these are sort of the messages that 528 00:28:06,840 --> 00:28:09,320 Speaker 3: we're sending kids all the time. They don't need to 529 00:28:09,320 --> 00:28:12,760 Speaker 3: hear that. All that's doing is having them diagnose 530 00:28:12,840 --> 00:28:17,280 Speaker 3: themselves or pushing them towards some formal diagnosis. 531 00:28:17,640 --> 00:28:20,359 Speaker 1: Let's take a quick commercial break. We'll continue next on 532 00:28:20,440 --> 00:28:26,040 Speaker 1: the Tutor Dixon Podcast. I don't know if you've heard this, 533 00:28:26,119 --> 00:28:29,480 Speaker 1: but all these kids have accommodations now for all these things, 534 00:28:29,520 --> 00:28:32,240 Speaker 1: so they don't have to take their tests in a 535 00:28:32,280 --> 00:28:35,840 Speaker 1: normal class period. They can take extra time for tests, 536 00:28:35,880 --> 00:28:39,959 Speaker 1: they can take extra time to type papers, and it's like, 537 00:28:40,160 --> 00:28:42,400 Speaker 1: these kids are getting this pushed on them. And that's 538 00:28:42,440 --> 00:28:45,120 Speaker 1: what bothers me about having someone else talk to my 539 00:28:45,200 --> 00:28:47,840 Speaker 1: kid about mental health, because they have these professionals in 540 00:28:47,840 --> 00:28:51,400 Speaker 1: there, like, you have this, and you should have an accommodation. 541 00:28:51,480 --> 00:28:53,920 Speaker 1: So this happened in my family. Actually, my niece was 542 00:28:53,920 --> 00:28:58,200 Speaker 1: born without her left hand, and she's incredibly gifted. She's 543 00:28:58,240 --> 00:29:01,840 Speaker 1: in all the AP classes, and her teacher came to 544 00:29:01,880 --> 00:29:05,360 Speaker 1: her and was like, look, it's going to take you 545 00:29:05,640 --> 00:29:08,560 Speaker 1: longer to type, so you need an accommodation. And she 546 00:29:08,680 --> 00:29:12,080 Speaker 1: was mad.
She's like, I'm fine, I don't need an accommodation. 547 00:29:12,440 --> 00:29:15,640 Speaker 1: And he went to the school and was like, look, 548 00:29:15,840 --> 00:29:18,080 Speaker 1: I need to schedule her an appointment 549 00:29:18,080 --> 00:29:20,760 Speaker 1: with, like, whatever the person is that gives you 550 00:29:20,800 --> 00:29:23,400 Speaker 1: these accommodations. They had someone on staff that would do it. 551 00:29:23,560 --> 00:29:25,560 Speaker 1: My sister was really mad. She's like, are you kidding me? 552 00:29:25,680 --> 00:29:28,360 Speaker 1: I get to make this decision, and she doesn't want this. 553 00:29:28,920 --> 00:29:32,320 Speaker 1: And I give my niece kudos for saying, I got this. 554 00:29:32,440 --> 00:29:36,680 Speaker 1: I don't need an accommodation. But think of the pushback 555 00:29:36,720 --> 00:29:39,040 Speaker 1: they had to give. So that's what my concern is. 556 00:29:39,560 --> 00:29:41,880 Speaker 1: He was telling her you can't, and she had to 557 00:29:41,920 --> 00:29:45,160 Speaker 1: look at the adult and go, no, I can. I'm fine. 558 00:29:45,320 --> 00:29:50,000 Speaker 3: It's definitely becoming really problematic. Of course, we want to 559 00:29:50,040 --> 00:29:53,680 Speaker 3: make sure that there are resources available for students with 560 00:29:54,320 --> 00:29:58,680 Speaker 3: disabilities that do truly impact the way that they learn. 561 00:29:59,200 --> 00:30:02,480 Speaker 3: But we should be thoughtful about how we're not necessarily setting 562 00:30:02,720 --> 00:30:08,600 Speaker 3: students up for the real world by accommodating very minor things, 563 00:30:09,120 --> 00:30:13,200 Speaker 3: and we're no longer only trying to sort of screen for 564 00:30:13,360 --> 00:30:18,800 Speaker 3: problems among those students that we believe may actually have 565 00:30:18,960 --> 00:30:24,400 Speaker 3: a learning disability.
We're now doing things like universal screening 566 00:30:24,520 --> 00:30:28,960 Speaker 3: for mental health problems. We're having all kids fill out 567 00:30:29,000 --> 00:30:33,080 Speaker 3: surveys multiple times per year that ask them questions about 568 00:30:33,080 --> 00:30:37,959 Speaker 3: how they feel, if they've considered committing suicide, how they 569 00:30:38,000 --> 00:30:42,040 Speaker 3: would commit suicide if they decided to do that. There 570 00:30:42,120 --> 00:30:46,160 Speaker 3: are real serious questions we should ask ourselves about the 571 00:30:46,280 --> 00:30:50,480 Speaker 3: need to ask all students this type of information, particularly 572 00:30:50,720 --> 00:30:54,800 Speaker 3: younger students. We're asking students this type of thing at 573 00:30:54,840 --> 00:30:56,600 Speaker 3: age five. What's the point? 574 00:30:57,640 --> 00:31:00,480 Speaker 1: And those students are trying to give you the right 575 00:31:00,520 --> 00:31:03,440 Speaker 1: answer because they're used to tests that have a right 576 00:31:03,520 --> 00:31:05,600 Speaker 1: and a wrong answer. And I will say, I've heard 577 00:31:05,600 --> 00:31:07,880 Speaker 1: my younger kids talk about this 578 00:31:08,520 --> 00:31:10,640 Speaker 1: with their friends, like, what do you think the right 579 00:31:10,680 --> 00:31:13,840 Speaker 1: answer is on those questions? It's not that they're giving 580 00:31:13,920 --> 00:31:16,920 Speaker 1: their answer; they're trying to please the person who's giving 581 00:31:16,960 --> 00:31:18,920 Speaker 1: the test. Are we supposed to think this way? Are 582 00:31:18,960 --> 00:31:22,480 Speaker 1: we supposed to think that way? How can people who 583 00:31:22,520 --> 00:31:26,280 Speaker 1: deal with children that age not understand that they believe 584 00:31:26,720 --> 00:31:29,760 Speaker 1: there's a correct answer and they're trying to find it?
585 00:31:29,800 --> 00:31:31,920 Speaker 1: They're not thinking about how they feel; they're trying to 586 00:31:31,920 --> 00:31:32,760 Speaker 1: please you. 587 00:31:33,400 --> 00:31:37,520 Speaker 3: Totally. I think, as parents, we should 588 00:31:37,560 --> 00:31:40,360 Speaker 3: be thoughtful about opting our kids out of this type 589 00:31:40,400 --> 00:31:45,480 Speaker 3: of screening. What parents should recognize is that mental health screenings, 590 00:31:45,600 --> 00:31:50,040 Speaker 3: even in clinical settings, are not recommended for everyone so 591 00:31:50,240 --> 00:31:57,520 Speaker 3: universally, because they present so many false positives. Screenings intentionally 592 00:31:57,720 --> 00:32:01,640 Speaker 3: look at any sign of distress or discomfort through a 593 00:32:01,680 --> 00:32:05,200 Speaker 3: clinical lens, through a medical lens. But think about how 594 00:32:05,200 --> 00:32:07,440 Speaker 3: many times a day we feel, like, sort of tired, 595 00:32:07,560 --> 00:32:10,880 Speaker 3: or sort of uncomfortable, or not necessarily happy. That means 596 00:32:10,960 --> 00:32:15,160 Speaker 3: we're going to capture a ton of kids that are 597 00:32:15,240 --> 00:32:18,960 Speaker 3: kind of feeling those normal emotions of human life and 598 00:32:19,480 --> 00:32:23,520 Speaker 3: consider them at risk of a mental health condition. If 599 00:32:23,560 --> 00:32:27,520 Speaker 3: we are not doing this and not recommending this in 600 00:32:27,720 --> 00:32:31,320 Speaker 3: clinical settings, why would we want to do this in 601 00:32:31,360 --> 00:32:36,720 Speaker 3: schools, which have less protection, and which are, you know, 602 00:32:36,840 --> 00:32:44,160 Speaker 3: administering these screenings with professionals who have fewer medical credentials? 603 00:32:43,920 --> 00:32:46,080 Speaker 2: It really doesn't make a lot of sense.
604 00:32:46,320 --> 00:32:49,880 Speaker 1: Well, and then what happens is, once they've seen somebody 605 00:32:49,960 --> 00:32:52,760 Speaker 1: like this... and in my state, we've talked about this 606 00:32:52,840 --> 00:32:56,200 Speaker 1: many times on this program, once you're twelve, your parents 607 00:32:56,200 --> 00:32:58,520 Speaker 1: don't have access to your medical records unless you sign 608 00:32:58,560 --> 00:33:01,040 Speaker 1: them over as your proxy. Once you have this in 609 00:33:01,080 --> 00:33:04,200 Speaker 1: your school, if you have somebody telling you, you know, 610 00:33:04,240 --> 00:33:06,640 Speaker 1: you're depressed and you need a medication... I mean, we 611 00:33:07,160 --> 00:33:09,920 Speaker 1: had this, where we had somebody say this about one 612 00:33:09,960 --> 00:33:12,360 Speaker 1: of our kids, and then it was a battle to 613 00:33:12,440 --> 00:33:15,080 Speaker 1: get this psychiatrist to stop calling our house. I'm like, 614 00:33:15,160 --> 00:33:17,719 Speaker 1: she's not going to go on medication. But once they 615 00:33:17,760 --> 00:33:21,200 Speaker 1: get you into this system, you're almost concerned that the 616 00:33:21,240 --> 00:33:22,840 Speaker 1: government is going to come, like they're going to have 617 00:33:22,920 --> 00:33:25,360 Speaker 1: CPS come to your house and be like, look, you've 618 00:33:25,400 --> 00:33:28,280 Speaker 1: been recommended for medication. She now has to take it. 619 00:33:28,760 --> 00:33:31,480 Speaker 1: And to your point about the tests, 620 00:33:31,560 --> 00:33:33,800 Speaker 1: I was talking to the kids about it, and my 621 00:33:33,920 --> 00:33:36,960 Speaker 1: daughter said to me, but when it asks the question 622 00:33:37,120 --> 00:33:39,960 Speaker 1: of, like, have you felt overwhelmed in the last week, 623 00:33:40,040 --> 00:33:42,320 Speaker 1: she said, I did, because we had a really big test.
624 00:33:42,400 --> 00:33:45,479 Speaker 1: And I'm like, but that's not what it actually means. 625 00:33:45,520 --> 00:33:49,000 Speaker 1: It's like, I can't describe it to her, because she's 626 00:33:49,040 --> 00:33:50,800 Speaker 1: too young to understand what they mean. 627 00:33:50,920 --> 00:33:55,720 Speaker 3: Most mental health conditions that are serious and debilitating do 628 00:33:55,840 --> 00:34:00,680 Speaker 3: not manifest until the late teens and early twenties. So most 629 00:34:00,680 --> 00:34:03,640 Speaker 3: of the time, what we're seeing with kids are just 630 00:34:03,880 --> 00:34:09,000 Speaker 3: kind of normal problems of living, challenges of adolescence. I mean, 631 00:34:09,120 --> 00:34:12,520 Speaker 3: the phrase teen angst, we never hear it anymore, because now 632 00:34:12,560 --> 00:34:17,200 Speaker 3: it's a mental health condition, and so we are pushing 633 00:34:17,239 --> 00:34:21,280 Speaker 3: people towards these clinical diagnoses. And to your point, Tutor, 634 00:34:21,360 --> 00:34:26,120 Speaker 3: it is a little concerning, but in many states, the 635 00:34:26,320 --> 00:34:31,320 Speaker 3: age of consent is lower for mental 636 00:34:31,360 --> 00:34:35,560 Speaker 3: health services than for other services, and so in some states you might 637 00:34:35,560 --> 00:34:38,759 Speaker 3: be as young as twelve. In other states, a counselor can 638 00:34:38,800 --> 00:34:42,000 Speaker 3: provide services if they think a child is mature enough, 639 00:34:42,040 --> 00:34:46,239 Speaker 3: at any age. And so these are real downsides of 640 00:34:46,280 --> 00:34:49,759 Speaker 3: trying to bring the mental health system into schools, which, 641 00:34:50,120 --> 00:34:51,960 Speaker 3: by the way, aren't even teaching kids how to read 642 00:34:52,080 --> 00:34:53,799 Speaker 3: very well. So I'm not sure why we think that 643 00:34:53,880 --> 00:34:55,239 Speaker 3: they can do this. 644 00:34:55,239 --> 00:34:58,080 Speaker 1: And I'm like, could we focus on 645 00:34:58,160 --> 00:35:00,960 Speaker 1: what's supposed to happen at school? Certainly that's how we 646 00:35:01,000 --> 00:35:03,360 Speaker 1: feel in Michigan. We're forty fourth in the nation. The 647 00:35:03,440 --> 00:35:06,560 Speaker 1: kids are struggling to read. We are still 648 00:35:06,560 --> 00:35:10,080 Speaker 1: coming back from the pandemic in many ways. So, you know, 649 00:35:10,719 --> 00:35:13,640 Speaker 1: I agree with that, and I appreciate everything you had 650 00:35:13,680 --> 00:35:15,960 Speaker 1: to say today. I think this is, to me, it's 651 00:35:16,040 --> 00:35:18,839 Speaker 1: very fascinating. But sometimes these are things that we don't think 652 00:35:18,880 --> 00:35:22,040 Speaker 1: deeply about: how to have those conversations with our kids, 653 00:35:22,320 --> 00:35:24,960 Speaker 1: but also how to push back when it seems like 654 00:35:25,000 --> 00:35:27,239 Speaker 1: big brother is kind of coming in and trying to 655 00:35:27,280 --> 00:35:29,960 Speaker 1: take over all of the health care for our children. 656 00:35:30,560 --> 00:35:32,399 Speaker 1: And I think there's a good balance, but I think 657 00:35:32,640 --> 00:35:34,759 Speaker 1: it's a discussion that needs to be had. Because, like 658 00:35:34,800 --> 00:35:38,000 Speaker 1: I said, even when I look at the differences between 659 00:35:38,120 --> 00:35:40,960 Speaker 1: how my mom views social media, how I view social media, 660 00:35:41,080 --> 00:35:43,919 Speaker 1: how my kids view social media, and how they view 661 00:35:44,400 --> 00:35:48,320 Speaker 1: all of that technology when it comes to actually learning, 662 00:35:48,640 --> 00:35:50,840 Speaker 1: these are all things that we as parents don't have 663 00:35:50,880 --> 00:35:53,000 Speaker 1: a guidebook on, and we're just kind of figuring it out.
664 00:35:53,040 --> 00:35:56,280 Speaker 1: So I appreciate everything you had to say today, 665 00:35:56,320 --> 00:35:58,080 Speaker 1: and I appreciate what you do. Thank you so much 666 00:35:58,120 --> 00:36:02,719 Speaker 1: for coming on the podcast. Absolutely. Carolyn Gorman, thank you so much, 667 00:36:02,760 --> 00:36:05,800 Speaker 1: and thank you all for joining the Tutor Dixon Podcast. 668 00:36:06,000 --> 00:36:08,520 Speaker 1: For this episode and others, go to Tutor Dixon podcast 669 00:36:08,600 --> 00:36:11,640 Speaker 1: dot com, the iHeartRadio app, Apple Podcasts, or you can 670 00:36:11,680 --> 00:36:14,440 Speaker 1: watch it on Rumble or YouTube. Just make sure you 671 00:36:14,520 --> 00:36:16,600 Speaker 1: tune in, and go out there now and have a 672 00:36:16,600 --> 00:36:17,120 Speaker 1: blessed day.