1 00:00:00,760 --> 00:00:04,240 Speaker 1: Welcome to the Tudor Dixon Podcast. So often we hear 2 00:00:04,440 --> 00:00:07,560 Speaker 1: politicians on both the left and the right say they 3 00:00:07,600 --> 00:00:11,160 Speaker 1: are best equipped to protect our kids, and they introduce 4 00:00:11,280 --> 00:00:14,520 Speaker 1: legislation they claim will do just that. While they're well 5 00:00:14,560 --> 00:00:17,560 Speaker 1: meaning and genuine in their efforts, many of these government 6 00:00:17,600 --> 00:00:20,880 Speaker 1: reforms are not actually getting to the root of the problem, 7 00:00:21,040 --> 00:00:24,680 Speaker 1: or worse, they end up unknowingly making the problem worse. 8 00:00:25,239 --> 00:00:27,600 Speaker 1: But in the last few weeks there's been a rare coming 9 00:00:27,640 --> 00:00:30,240 Speaker 1: together in Washington on an issue we should all want 10 00:00:30,280 --> 00:00:33,200 Speaker 1: to solve. I've talked about it several times on this podcast, 11 00:00:33,520 --> 00:00:36,960 Speaker 1: and that is protecting our children online. Today, I'm bringing 12 00:00:36,960 --> 00:00:39,239 Speaker 1: in someone who's been at the center of much of 13 00:00:39,280 --> 00:00:43,040 Speaker 1: this work for decades. Maureen Flatley is an expert in 14 00:00:43,159 --> 00:00:47,720 Speaker 1: government reform and oversight involving children. Her advocacy on Capitol 15 00:00:47,760 --> 00:00:50,960 Speaker 1: Hill has resulted in the passage of a wide range 16 00:00:51,000 --> 00:00:55,279 Speaker 1: of reforms of child welfare, adoption, and child abuse and 17 00:00:55,320 --> 00:00:58,920 Speaker 1: exploitation laws, like Masha's Law, a bill that tripled the 18 00:00:58,960 --> 00:01:04,280 Speaker 1: civil penalty for downloading child sexual abuse material. Maureen, thanks 19 00:01:04,319 --> 00:01:05,640 Speaker 1: for joining me today. 20 00:01:05,880 --> 00:01:08,280 Speaker 2: Well, thanks for having me. It's an important topic and 21 00:01:08,319 --> 00:01:10,200 Speaker 2: I'm so glad you're interested in exploring it. 22 00:01:11,120 --> 00:01:14,720 Speaker 1: I definitely am. Most of the people listening know 23 00:01:14,800 --> 00:01:18,400 Speaker 1: that I have four children myself, four girls, but we've 24 00:01:18,440 --> 00:01:21,559 Speaker 1: had parents on here who've had issues where their kids 25 00:01:21,600 --> 00:01:26,520 Speaker 1: have been exploited online and ultimately ended up committing suicide, all 26 00:01:26,600 --> 00:01:29,080 Speaker 1: kinds of things. I mean, there's just so much that goes 27 00:01:29,120 --> 00:01:31,680 Speaker 1: on with this that people don't realize. And I think 28 00:01:31,680 --> 00:01:35,000 Speaker 1: that most of us are uneducated on how to protect 29 00:01:35,000 --> 00:01:37,400 Speaker 1: our kids. And even if we think we're educated, there's 30 00:01:37,440 --> 00:01:40,679 Speaker 1: always new ways people can get to our kids. So 31 00:01:40,760 --> 00:01:43,360 Speaker 1: if you can explain a little bit about what you've 32 00:01:43,400 --> 00:01:46,880 Speaker 1: been working on and how you think we can kind 33 00:01:46,880 --> 00:01:48,320 Speaker 1: of make this safer for our kids. 34 00:01:49,520 --> 00:01:51,800 Speaker 2: Well, I'm a mom and a grandmother of a whole 35 00:01:51,840 --> 00:01:55,800 Speaker 2: flock of kids, mostly girls myself, and so this is 36 00:01:56,040 --> 00:02:00,040 Speaker 2: both personal and professional for me.
I often start these 37 00:02:00,160 --> 00:02:02,600 Speaker 2: conversations by pointing out that I'm the daughter of an 38 00:02:02,680 --> 00:02:05,240 Speaker 2: FBI agent, and my dad spent most of his career 39 00:02:05,280 --> 00:02:07,360 Speaker 2: detailed to Capitol Hill, where he worked for the Senate 40 00:02:07,400 --> 00:02:14,280 Speaker 2: Racketeering Committee developing testimony against organized crime figures, and that 41 00:02:14,400 --> 00:02:17,720 Speaker 2: experience growing up literally as a child watching it every 42 00:02:17,800 --> 00:02:21,760 Speaker 2: day has really informed my approach to this problem. I 43 00:02:21,840 --> 00:02:25,400 Speaker 2: often say that while people try to frame this sometimes 44 00:02:25,440 --> 00:02:29,240 Speaker 2: as a tech problem, underneath all of it it's a 45 00:02:29,280 --> 00:02:33,919 Speaker 2: crime problem. And when you look at the statistics about 46 00:02:34,320 --> 00:02:36,880 Speaker 2: abuse, for example, the cyber tips, there are thirty two 47 00:02:36,919 --> 00:02:42,920 Speaker 2: million cyber tips, and virtually all of those reports actually come 48 00:02:42,960 --> 00:02:46,440 Speaker 2: from the tech companies directly, but they can't arrest and 49 00:02:46,560 --> 00:02:49,919 Speaker 2: prosecute any of the perpetrators who are involved in those 50 00:02:50,560 --> 00:02:55,000 Speaker 2: various allegations and reports, and the conviction rates are very, 51 00:02:55,120 --> 00:03:00,040 Speaker 2: very low. So while on one level I understand 52 00:03:00,280 --> 00:03:02,679 Speaker 2: the concern that the tech companies should do more, and I think 53 00:03:02,760 --> 00:03:05,920 Speaker 2: most are actually trying to keep up, at the end 54 00:03:05,960 --> 00:03:08,919 Speaker 2: of the day, we really have to invest in law 55 00:03:09,000 --> 00:03:13,560 Speaker 2: enforcement here, because even, you know, something like sextortion, which 56 00:03:13,600 --> 00:03:16,079 Speaker 2: I know has been a tremendous concern for a lot 57 00:03:16,080 --> 00:03:20,200 Speaker 2: of parents like us, it seems innocuous. It seems like, 58 00:03:20,320 --> 00:03:22,320 Speaker 2: well, you know, it just might be a kid bullying 59 00:03:22,360 --> 00:03:28,079 Speaker 2: my kid, but with a terrible result. But there are international 60 00:03:28,160 --> 00:03:34,880 Speaker 2: rings of sextortionists who have over many years covered twenty 61 00:03:34,920 --> 00:03:41,480 Speaker 2: or thirty countries with very deliberate efforts to exploit kids. 62 00:03:41,520 --> 00:03:43,920 Speaker 2: So we're never going to get ahead of this problem 63 00:03:44,000 --> 00:03:47,960 Speaker 2: if we don't invest in the law enforcement side of this. 64 00:03:48,040 --> 00:03:51,200 Speaker 2: And unfortunately, the Protect Act, which was passed in two 65 00:03:51,240 --> 00:03:54,640 Speaker 2: thousand and eight, was a virtually perfect response to the problem, but 66 00:03:55,720 --> 00:04:00,000 Speaker 2: DOJ has almost completely failed, in my view, to implement 67 00:04:00,080 --> 00:04:02,920 Speaker 2: the law adequately. So we've lost fifteen years of 68 00:04:02,960 --> 00:04:07,400 Speaker 2: building an infrastructure to interdict these criminals.
So I hope 69 00:04:07,400 --> 00:04:10,840 Speaker 2: that everybody can understand that while there are a lot 70 00:04:10,920 --> 00:04:13,560 Speaker 2: of things that we talk about doing, this is really 71 00:04:13,600 --> 00:04:16,000 Speaker 2: the most important thing that we have to attack first. 72 00:04:16,560 --> 00:04:19,159 Speaker 1: Well, let's dig into that a little bit, because in Michigan, 73 00:04:19,279 --> 00:04:22,919 Speaker 1: I know, we really have one main crime lab in 74 00:04:23,000 --> 00:04:27,040 Speaker 1: the entire state. So any type of sexual assault or 75 00:04:27,080 --> 00:04:30,839 Speaker 1: any rape kit, anything like that goes through one crime 76 00:04:30,920 --> 00:04:35,000 Speaker 1: lab in the entire state. And this was because of this community. 77 00:04:35,120 --> 00:04:38,279 Speaker 1: It wasn't a statewide decision. It was the community that 78 00:04:38,320 --> 00:04:41,200 Speaker 1: decided to put all of their money into this crime lab. 79 00:04:41,600 --> 00:04:44,359 Speaker 1: So why, when we look at these massive budgets, I 80 00:04:44,360 --> 00:04:46,479 Speaker 1: mean the state of Michigan. I'll pick on our state 81 00:04:46,520 --> 00:04:49,119 Speaker 1: because I know that we are in the top five 82 00:04:49,279 --> 00:04:52,400 Speaker 1: for rapes per capita in the entire country. We obviously 83 00:04:52,480 --> 00:04:57,040 Speaker 1: have a sex crime problem outside of it being an online problem. 84 00:04:57,240 --> 00:04:59,840 Speaker 1: But when I talked to the sheriff in this 85 00:05:00,000 --> 00:05:03,680 Speaker 1: particular community, he's like, look, I can find 86 00:05:04,200 --> 00:05:10,200 Speaker 1: a person who's committing some 87 00:05:10,240 --> 00:05:13,279 Speaker 1: sort of a sexual crime every single day in this state. 88 00:05:13,560 --> 00:05:15,640 Speaker 1: I can locate them, and I can arrest them, and 89 00:05:15,680 --> 00:05:18,039 Speaker 1: I can put them behind bars. But I don't have 90 00:05:18,160 --> 00:05:21,440 Speaker 1: the resources to run everything through the labs. So 91 00:05:21,640 --> 00:05:23,440 Speaker 1: when you talk about that, is that what 92 00:05:23,480 --> 00:05:25,480 Speaker 1: you're talking about? 93 00:05:25,680 --> 00:05:28,960 Speaker 2: One hundred percent. So when you think about the thirty 94 00:05:28,960 --> 00:05:31,800 Speaker 2: two million cyber tips, that doesn't necessarily mean thirty two 95 00:05:31,839 --> 00:05:35,279 Speaker 2: million crimes. It just means thirty two million reports. I 96 00:05:35,320 --> 00:05:39,599 Speaker 2: analogize that, Tudor, to mandated reporting in child welfare. So 97 00:05:40,440 --> 00:05:42,640 Speaker 2: you know, in a state like Michigan, which has one 98 00:05:42,640 --> 00:05:45,640 Speaker 2: of the largest populations of kids in foster care, you 99 00:05:45,720 --> 00:05:50,160 Speaker 2: could have, I'm ballparking, a million child abuse complaints 100 00:05:50,160 --> 00:05:52,760 Speaker 2: a year. But you know, some of them could just 101 00:05:52,920 --> 00:05:56,000 Speaker 2: be, you know, neighbors are having a dispute and they 102 00:05:56,040 --> 00:05:59,760 Speaker 2: make a gratuitous complaint about their neighbor and their kids. 103 00:06:00,320 --> 00:06:05,400 Speaker 2: It's nothing really serious.
Meanwhile, while social workers wade through 104 00:06:05,480 --> 00:06:08,520 Speaker 2: those what I would consider to be kind of bogus complaints, 105 00:06:08,640 --> 00:06:11,640 Speaker 2: they're not getting to the serious and life threatening stories 106 00:06:11,640 --> 00:06:14,839 Speaker 2: that need to be addressed quickly. So one of the most... 107 00:06:14,839 --> 00:06:17,000 Speaker 1: So, how does that work when you talk about the 108 00:06:17,080 --> 00:06:20,239 Speaker 1: Protect Act? Explain that a little bit. What would 109 00:06:20,279 --> 00:06:23,200 Speaker 1: that have done had it been implemented the right way? What 110 00:06:23,240 --> 00:06:23,880 Speaker 1: would that mean? 111 00:06:25,000 --> 00:06:29,640 Speaker 2: Well, you know, the Protect Act actually has had money 112 00:06:29,720 --> 00:06:32,760 Speaker 2: appropriated for it that hasn't even been spent, and so, 113 00:06:33,400 --> 00:06:36,120 Speaker 2: you know, I look at the Internet Crimes Against Children 114 00:06:36,120 --> 00:06:39,919 Speaker 2: task forces. Michigan has a great one, and the 115 00:06:40,040 --> 00:06:44,440 Speaker 2: linkage between municipal, state, and federal law enforcement in this 116 00:06:45,279 --> 00:06:49,400 Speaker 2: is pretty critical because, one, internet crimes tend to be 117 00:06:49,520 --> 00:06:53,840 Speaker 2: interstate activity, but also because we want to aggregate and 118 00:06:53,880 --> 00:06:57,120 Speaker 2: look at patterns and look at perpetrators that are operating 119 00:06:57,160 --> 00:07:00,719 Speaker 2: across state lines. So the simple answer to your question 120 00:07:00,720 --> 00:07:03,840 Speaker 2: about the Protect Act is that had it been fully implemented, 121 00:07:04,480 --> 00:07:08,839 Speaker 2: we would have been infusing those state Internet Crimes Against 122 00:07:08,920 --> 00:07:12,760 Speaker 2: Children task forces with more money so that they would 123 00:07:12,840 --> 00:07:16,440 Speaker 2: have, just to begin with, more analysts to look 124 00:07:16,440 --> 00:07:18,960 Speaker 2: at the reports to see if there actually are crimes, 125 00:07:19,840 --> 00:07:24,600 Speaker 2: more investigators at the local and state level to investigate 126 00:07:24,640 --> 00:07:28,120 Speaker 2: those crimes. Because the tech companies, even the best of the 127 00:07:28,160 --> 00:07:31,880 Speaker 2: tech companies, can't arrest and prosecute anybody. It has to 128 00:07:31,880 --> 00:07:35,480 Speaker 2: be a law enforcement function. So by failing 129 00:07:35,560 --> 00:07:38,960 Speaker 2: to implement that over a long period of time, and 130 00:07:39,000 --> 00:07:41,960 Speaker 2: I blame Congress for this because it's their responsibility to 131 00:07:42,040 --> 00:07:44,680 Speaker 2: engage in oversight, we lost the chance to have a lot more 132 00:07:44,720 --> 00:07:48,120 Speaker 2: people on the ground doing the work. The good news 133 00:07:48,200 --> 00:07:52,160 Speaker 2: is that BEST passed almost sixteen years ago. I'm sorry, 134 00:07:52,200 --> 00:07:54,920 Speaker 2: Protect passed almost sixteen years ago. There's a new bill 135 00:07:55,000 --> 00:07:57,880 Speaker 2: in Congress called the Invest in Child Safety Act that would, 136 00:07:57,880 --> 00:08:02,040 Speaker 2: what I call, right-size the spending on law enforcement 137 00:08:02,480 --> 00:08:05,480 Speaker 2: in this sector. It's not just handing out blank checks 138 00:08:05,480 --> 00:08:08,760 Speaker 2: to police departments
and the FBI. It's very focused on 139 00:08:08,880 --> 00:08:15,920 Speaker 2: child exploitation, and that would bring dramatic increases in the 140 00:08:16,000 --> 00:08:19,480 Speaker 2: number of analysts, the number of investigators, the number of prosecutors, 141 00:08:20,160 --> 00:08:23,760 Speaker 2: and the number of judges that would look at these cases. 142 00:08:23,840 --> 00:08:26,840 Speaker 2: And from where I sit as a mom and as 143 00:08:26,880 --> 00:08:30,480 Speaker 2: an expert in child protection, we can't even begin to 144 00:08:30,640 --> 00:08:34,560 Speaker 2: address this issue until we attack the criminal side of it. 145 00:08:34,640 --> 00:08:38,160 Speaker 2: Because, I use the analogy, blaming the tech companies for 146 00:08:38,240 --> 00:08:41,400 Speaker 2: all of this is like blaming a bank for being robbed. 147 00:08:41,880 --> 00:08:43,920 Speaker 2: You know you have these cases. I don't know if 148 00:08:43,920 --> 00:08:46,120 Speaker 2: you guys read about this, but the Chanel store was 149 00:08:46,160 --> 00:08:48,920 Speaker 2: targeted and, you know, twenty five shoplifters went in and 150 00:08:48,960 --> 00:08:51,760 Speaker 2: stole everything in the entire store. Well, nobody blamed the store. 151 00:08:51,800 --> 00:08:54,360 Speaker 2: They called the police. And so when you start to 152 00:08:54,400 --> 00:08:57,120 Speaker 2: talk about the level of organized crime in this issue, 153 00:08:57,600 --> 00:09:00,720 Speaker 2: we have to get that out of the mix before 154 00:09:00,720 --> 00:09:03,520 Speaker 2: we're going to see any significant improvements. 155 00:09:03,800 --> 00:09:06,679 Speaker 1: So I think for people who don't understand exactly what 156 00:09:06,679 --> 00:09:09,800 Speaker 1: we're talking about, let's just go through it. In twenty twenty three, 157 00:09:10,200 --> 00:09:15,400 Speaker 1: there were more than one hundred and five million online images, videos, 158 00:09:15,520 --> 00:09:20,199 Speaker 1: and materials related to child sexual abuse, and those were flagged 159 00:09:20,200 --> 00:09:23,040 Speaker 1: by the National Center for Missing and Exploited Children. So 160 00:09:23,360 --> 00:09:27,200 Speaker 1: when we talk about this, I think that sometimes people hear, oh, 161 00:09:27,320 --> 00:09:31,000 Speaker 1: sometimes there's pictures on Twitter, or X now, I guess, 162 00:09:31,040 --> 00:09:34,280 Speaker 1: but formerly Twitter, or Facebook, or, you know, any of 163 00:09:34,320 --> 00:09:36,800 Speaker 1: these social media sites, and people say, well, this is out there. 164 00:09:36,840 --> 00:09:39,679 Speaker 1: Why aren't these sites doing something? Like you said, 165 00:09:39,800 --> 00:09:42,640 Speaker 1: they can report them, but they can't go and arrest these people. 166 00:09:42,800 --> 00:09:45,199 Speaker 1: They can take them off. But then if you take 167 00:09:45,240 --> 00:09:48,679 Speaker 1: them off their site, do these people reappear as someone else? 168 00:09:48,760 --> 00:09:49,000 Speaker 1: Where 169 00:09:49,080 --> 00:09:52,079 Speaker 1: is this imagery coming from? Are they 170 00:09:52,120 --> 00:09:55,319 Speaker 1: getting this from the children themselves? 171 00:09:55,840 --> 00:09:59,439 Speaker 1: How exactly is this working? And what do we need 172 00:09:59,480 --> 00:10:01,679 Speaker 1: as parents to be on the lookout for with our 173 00:10:01,720 --> 00:10:02,400 Speaker 1: own kids?
174 00:10:02,679 --> 00:10:05,800 Speaker 2: Yeah, well, I mean, let's talk about what happens with 175 00:10:05,840 --> 00:10:10,200 Speaker 2: cyber tips. So, just like doctors and school nurses and 176 00:10:10,240 --> 00:10:14,440 Speaker 2: teachers and pastors, the tech companies are federally mandated reporters. 177 00:10:14,480 --> 00:10:17,480 Speaker 2: That means that if they believe that there is 178 00:10:18,160 --> 00:10:21,480 Speaker 2: an instance of child exploitation or child sexual abuse, they 179 00:10:21,480 --> 00:10:24,000 Speaker 2: have to report it, and they do. They're the only 180 00:10:24,320 --> 00:10:28,400 Speaker 2: real source of reports across the board. So once those 181 00:10:28,520 --> 00:10:31,160 Speaker 2: reports are made and they go in the pool of 182 00:10:31,200 --> 00:10:36,840 Speaker 2: cyber tips, then NCMEC does what we call geolocating the pictures. So, 183 00:10:36,960 --> 00:10:40,800 Speaker 2: in other words, they determine to the extent possible where 184 00:10:40,800 --> 00:10:43,800 Speaker 2: the pictures come from. So one of the important facts 185 00:10:43,920 --> 00:10:47,920 Speaker 2: is that ninety four percent of cyber tips involve foreign countries, 186 00:10:48,080 --> 00:10:51,920 Speaker 2: and so when they geolocate the pictures, they refer them 187 00:10:52,240 --> 00:10:57,480 Speaker 2: out to these other governments, who have no real mandate 188 00:10:57,679 --> 00:11:00,839 Speaker 2: or responsibility to do anything with them. Of course, many 189 00:11:00,880 --> 00:11:03,760 Speaker 2: of them involve extremely poor countries that really don't have 190 00:11:03,800 --> 00:11:08,959 Speaker 2: any law enforcement. The remaining six percent are US based, 191 00:11:09,640 --> 00:11:12,560 Speaker 2: and so that's where we start to think about, okay, 192 00:11:12,559 --> 00:11:15,240 Speaker 2: what can we do in this country. We can start 193 00:11:15,320 --> 00:11:19,600 Speaker 2: by evaluating what those pictures are, but right now, because 194 00:11:19,640 --> 00:11:23,280 Speaker 2: of limited resources, I think the number is something like 195 00:11:23,400 --> 00:11:27,360 Speaker 2: three thousand out of one hundred thousand reports are actually examined. 196 00:11:27,520 --> 00:11:31,240 Speaker 2: So problem number one, if my father was still alive, 197 00:11:31,320 --> 00:11:34,040 Speaker 2: he would be the perfect person to organize this. Problem 198 00:11:34,120 --> 00:11:37,280 Speaker 2: number one is we actually don't know enough about where 199 00:11:37,280 --> 00:11:40,560 Speaker 2: those images are coming from. So how much of it 200 00:11:40,679 --> 00:11:43,719 Speaker 2: is organized criminal activity? And I can tell you from 201 00:11:43,760 --> 00:11:47,560 Speaker 2: working on Masha's Law that that child, who was adopted 202 00:11:47,559 --> 00:11:50,080 Speaker 2: from Russia by a pedophile to be content in his 203 00:11:50,200 --> 00:11:55,079 Speaker 2: child pornography empire, was actually the subject of some of the most prolific 204 00:11:55,160 --> 00:11:59,520 Speaker 2: images on the Internet at the time, literally probably now 205 00:11:59,600 --> 00:12:02,880 Speaker 2: millions of pictures. She was rescued in the early two 206 00:12:02,920 --> 00:12:06,400 Speaker 2: thousands and her pictures are still all over the internet. 207 00:12:07,000 --> 00:12:09,920 Speaker 2: So we need to understand. We can't fix the problem 208 00:12:09,960 --> 00:12:13,839 Speaker 2: we don't understand.
So we need to begin by differentiating 209 00:12:14,320 --> 00:12:17,199 Speaker 2: where those pictures come from. So is it seventy 210 00:12:17,240 --> 00:12:20,840 Speaker 2: five percent organized criminal activity? Because that's a hard target 211 00:12:20,920 --> 00:12:24,319 Speaker 2: that we can start to go after and disrupt. How 212 00:12:24,400 --> 00:12:28,599 Speaker 2: much of it is kids producing their own content, you know, foolishly, 213 00:12:28,640 --> 00:12:32,600 Speaker 2: innocently, or whatever? That's a problem that we can definitely 214 00:12:32,640 --> 00:12:35,920 Speaker 2: begin to address with education. I think a lot of 215 00:12:35,920 --> 00:12:41,840 Speaker 2: what's going on is addressed by enforcing existing law, by 216 00:12:41,960 --> 00:12:46,800 Speaker 2: bolstering our existing investment in law enforcement. And then, Tudor, 217 00:12:47,480 --> 00:12:49,840 Speaker 2: to me as a mother, one of the number one 218 00:12:49,960 --> 00:12:54,440 Speaker 2: things is education, education, education. We don't let our 219 00:12:54,480 --> 00:12:57,199 Speaker 2: kids get behind the wheel of a car without driver's 220 00:12:57,240 --> 00:13:02,640 Speaker 2: ed, and we really need to have a digital education approach 221 00:13:02,720 --> 00:13:05,559 Speaker 2: to this, both for parents and for kids. Florida passed 222 00:13:05,720 --> 00:13:09,560 Speaker 2: a marvelous bill that we're going to push hard as 223 00:13:09,600 --> 00:13:13,760 Speaker 2: a model for other states to use. Very simple, because, 224 00:13:14,040 --> 00:13:16,960 Speaker 2: you know, there have been a lot of ideas promoted 225 00:13:17,000 --> 00:13:21,880 Speaker 2: here that involve having the government decide when your child 226 00:13:21,920 --> 00:13:24,040 Speaker 2: can go on the internet, what your child can do 227 00:13:24,120 --> 00:13:26,040 Speaker 2: on the internet. And I'm here to tell you, as 228 00:13:26,160 --> 00:13:29,520 Speaker 2: somebody who has spent many, many years watching the in 229 00:13:29,559 --> 00:13:34,280 Speaker 2: real life child welfare system destroy families, we really don't 230 00:13:34,320 --> 00:13:38,080 Speaker 2: want to create a digital CPS here, because we know. 231 00:13:38,520 --> 00:13:40,760 Speaker 2: I mean, I got into this issue as an oversight 232 00:13:40,840 --> 00:13:44,840 Speaker 2: matter working on parental rights cases. And so this 233 00:13:45,040 --> 00:13:50,880 Speaker 2: is something that parents can do more to control and manage, 234 00:13:51,240 --> 00:13:54,720 Speaker 2: but, you know, oftentimes parents themselves need to have a 235 00:13:54,720 --> 00:13:58,800 Speaker 2: better understanding of exactly what's involved here. So I think 236 00:13:59,080 --> 00:14:01,679 Speaker 2: a little bit less government involvement in a way is 237 00:14:01,920 --> 00:14:05,440 Speaker 2: probably better.
But we have to have the law enforcement 238 00:14:06,040 --> 00:14:09,200 Speaker 2: piece be at the front end, because, you know, 239 00:14:09,280 --> 00:14:11,560 Speaker 2: it's sort of like some of the proposals that have 240 00:14:11,679 --> 00:14:15,800 Speaker 2: been made involve, you know, way down the road, after 241 00:14:15,840 --> 00:14:20,360 Speaker 2: the fact, after the damage has been done, tinkering with 242 00:14:20,480 --> 00:14:25,240 Speaker 2: various tech rules, with, you know, victims' access to civil 243 00:14:25,360 --> 00:14:30,440 Speaker 2: justice, and victims can sue their perpetrators now. But if 244 00:14:30,480 --> 00:14:33,000 Speaker 2: we don't get to the heart of this, it's just 245 00:14:33,040 --> 00:14:36,120 Speaker 2: going to be perpetuated. We'll have countless more victims. 246 00:14:36,320 --> 00:14:39,040 Speaker 1: Let's take a quick commercial break. We'll continue next on 247 00:14:39,080 --> 00:14:44,680 Speaker 1: the Tudor Dixon Podcast. So let me ask you, when 248 00:14:44,720 --> 00:14:47,880 Speaker 1: you talk about ninety four percent of this coming 249 00:14:47,920 --> 00:14:50,680 Speaker 1: from other countries, does that mean that the actual 250 00:14:51,040 --> 00:14:53,520 Speaker 1: images are coming from other countries, or that 251 00:14:54,800 --> 00:14:58,200 Speaker 1: the offenders are in other countries and they are somehow 252 00:14:58,360 --> 00:15:02,040 Speaker 1: coming into our country and getting our kids and getting 253 00:15:02,120 --> 00:15:06,400 Speaker 1: pictures of them or assaulting them? Explain what that means. 254 00:15:06,920 --> 00:15:10,800 Speaker 2: Well, it largely means that it isn't US activity. So 255 00:15:11,080 --> 00:15:14,480 Speaker 2: whether the perpetrator is in another country or the 256 00:15:14,560 --> 00:15:18,680 Speaker 2: child is in another country, it's not a US legal matter. 257 00:15:18,880 --> 00:15:22,240 Speaker 2: It's something that falls into the jurisdiction of, you know, 258 00:15:22,600 --> 00:15:28,720 Speaker 2: literally Bangladesh, Algeria, Germany. And this is something 259 00:15:28,880 --> 00:15:32,320 Speaker 2: that people often don't understand as 260 00:15:32,360 --> 00:15:36,520 Speaker 2: they kind of go after the tech companies. We don't 261 00:15:36,560 --> 00:15:41,680 Speaker 2: have jurisdictionally the ability to prosecute a lot of these cases. 262 00:15:42,120 --> 00:15:44,640 Speaker 2: But at the end of the day, we're not prosecuting 263 00:15:44,760 --> 00:15:48,000 Speaker 2: enough of the cases that we do have the ability 264 00:15:48,440 --> 00:15:51,320 Speaker 2: to attack. And I sometimes use the example of 265 00:15:51,400 --> 00:15:53,960 Speaker 2: Mothers Against Drunk Driving, which is really one of my 266 00:15:54,280 --> 00:15:59,680 Speaker 2: very favorite examples of grassroots advocacy. So, you know, tragically, Candy 267 00:15:59,800 --> 00:16:02,680 Speaker 2: Lightner's daughter was killed by a drunk driver.
She 268 00:16:02,920 --> 00:16:05,600 Speaker 2: very quickly realized that this was a guy that had 269 00:16:05,640 --> 00:16:08,800 Speaker 2: been arrested for DUI many times, had never gone to jail, 270 00:16:09,600 --> 00:16:12,720 Speaker 2: and so she started paying attention to the sentences the 271 00:16:12,800 --> 00:16:16,800 Speaker 2: judges were handing out, and very quickly, within a period 272 00:16:16,880 --> 00:16:19,000 Speaker 2: of four or five years, we went from it was 273 00:16:19,040 --> 00:16:22,120 Speaker 2: basically legal to drive drunk and kill somebody with your 274 00:16:22,160 --> 00:16:26,680 Speaker 2: car to having really stiff sentences, and we've seen the 275 00:16:26,800 --> 00:16:32,200 Speaker 2: number of fatalities due to drunk driving go way, way down, 276 00:16:32,400 --> 00:16:35,640 Speaker 2: and I think that we need to structure sort of 277 00:16:35,680 --> 00:16:38,240 Speaker 2: the same kind of deterrence, if you will. As long 278 00:16:38,280 --> 00:16:41,680 Speaker 2: as the focus is on the tech companies, who aren't 279 00:16:41,680 --> 00:16:45,200 Speaker 2: the ones perpetrating against the kids, you know, they may 280 00:16:45,200 --> 00:16:50,000 Speaker 2: be a platform that the crime happens to appear on. 281 00:16:50,800 --> 00:16:54,680 Speaker 2: But unless we start to really attack the people who 282 00:16:54,760 --> 00:16:59,280 Speaker 2: are directly harming the kids, we're not going to stop 283 00:16:59,320 --> 00:17:02,440 Speaker 2: this victimization. And in fact, I've seen, I mean, 284 00:17:02,480 --> 00:17:04,520 Speaker 2: I said this, and I was at the hearing in 285 00:17:04,560 --> 00:17:07,520 Speaker 2: Congress a couple weeks ago, and I said to someone, look, 286 00:17:08,640 --> 00:17:13,160 Speaker 2: we've lost sixteen years because Congress didn't bother to enforce 287 00:17:13,760 --> 00:17:16,760 Speaker 2: a bill that is now a law that was literally, 288 00:17:16,920 --> 00:17:19,040 Speaker 2: at the time, one of the perfect responses to this 289 00:17:19,160 --> 00:17:24,600 Speaker 2: problem from a law enforcement perspective. And so we 290 00:17:24,720 --> 00:17:27,000 Speaker 2: have to start now, because if you look at the 291 00:17:27,040 --> 00:17:30,320 Speaker 2: explosion of the activity in those fifteen years, I just 292 00:17:30,359 --> 00:17:32,040 Speaker 2: think, I get up every morning and one of the 293 00:17:32,119 --> 00:17:34,240 Speaker 2: first things I think of is how many of those 294 00:17:34,320 --> 00:17:38,159 Speaker 2: kids could have been protected if we had done our 295 00:17:38,240 --> 00:17:41,520 Speaker 2: job at the front end. And so we can come 296 00:17:41,600 --> 00:17:44,199 Speaker 2: up with a lot of, and you said it at 297 00:17:44,240 --> 00:17:47,159 Speaker 2: the beginning, I think virtually everyone who's weighed in on this 298 00:17:47,359 --> 00:17:52,199 Speaker 2: is well intended. But the question is how much do 299 00:17:52,280 --> 00:17:58,080 Speaker 2: they understand how this process, this problem, plays out, and 300 00:17:58,119 --> 00:18:01,199 Speaker 2: what really constitutes, well... 301 00:18:01,240 --> 00:18:03,399 Speaker 1: And I think we've become a society that wants to have 302 00:18:03,480 --> 00:18:07,480 Speaker 1: a person or an entity to hold accountable. And that's 303 00:18:07,560 --> 00:18:10,719 Speaker 1: where we see these hearings.
They get a lot of 304 00:18:10,800 --> 00:18:13,840 Speaker 1: play because there's always these statements that come out and 305 00:18:13,880 --> 00:18:17,320 Speaker 1: then they travel on the social media platforms and we're like, yes, 306 00:18:17,440 --> 00:18:20,000 Speaker 1: that's the person we want to have pay for this. 307 00:18:20,119 --> 00:18:22,919 Speaker 1: But we had an interesting story here in Michigan in 308 00:18:22,960 --> 00:18:25,800 Speaker 1: the Upper Peninsula. A young man, I think he was sixteen, 309 00:18:26,119 --> 00:18:30,439 Speaker 1: sixteen or seventeen years old, and this 310 00:18:30,600 --> 00:18:33,399 Speaker 1: was a young man who was on the football team, 311 00:18:33,640 --> 00:18:37,520 Speaker 1: had everything going for him, ended up having this 312 00:18:37,680 --> 00:18:41,200 Speaker 1: sextortion happen to him at night, where these guys, I 313 00:18:41,240 --> 00:18:43,600 Speaker 1: think it was from Nigeria, they went after him. They 314 00:18:43,800 --> 00:18:46,840 Speaker 1: posed as a girl. The parents were actually able to 315 00:18:47,080 --> 00:18:50,919 Speaker 1: get them extradited, and they have never stopped fighting. And I 316 00:18:50,920 --> 00:18:53,280 Speaker 1: think that's kind of what you're talking about with Mothers 317 00:18:53,280 --> 00:18:56,040 Speaker 1: Against Drunk Driving, is like you really have to say, 318 00:18:56,480 --> 00:18:59,760 Speaker 1: we're going to take this to the next level and say, no, 319 00:18:59,800 --> 00:19:02,440 Speaker 1: I'm not okay with the fact that they're going to 320 00:19:02,520 --> 00:19:04,800 Speaker 1: get five months and get off. I'm going to keep 321 00:19:04,840 --> 00:19:07,800 Speaker 1: fighting until the next person doesn't ever have a chance 322 00:19:07,840 --> 00:19:09,480 Speaker 1: to do this again. And I think we've kind of 323 00:19:09,480 --> 00:19:14,000 Speaker 1: gotten away from that tenacity with crime, but we have 324 00:19:14,040 --> 00:19:15,439 Speaker 1: to get back there. 325 00:19:16,040 --> 00:19:18,560 Speaker 2: One hundred percent. Now, I know about that case, and 326 00:19:18,600 --> 00:19:22,359 Speaker 2: in fact, I'm fairly certain that the gang that was 327 00:19:22,400 --> 00:19:24,760 Speaker 2: involved in that case is one of the gangs that's 328 00:19:24,800 --> 00:19:27,840 Speaker 2: on my radar. It's a Nigerian sextortion gang called the 329 00:19:27,920 --> 00:19:31,240 Speaker 2: Yahoo Boys, and it's sort of loosely organized, but it's 330 00:19:31,400 --> 00:19:36,280 Speaker 2: well organized enough to operate multinationally and to perpetrate crimes 331 00:19:36,359 --> 00:19:40,960 Speaker 2: all over the world, literally. And so I work with 332 00:19:41,000 --> 00:19:44,360 Speaker 2: a group called Stop Child Predators, started by the wonderful 333 00:19:44,359 --> 00:19:46,560 Speaker 2: Stacie Rumenap, who is kind of the Candy Lightner 334 00:19:46,600 --> 00:19:50,920 Speaker 2: of this issue. She wasn't a personal victim, but she recognized 335 00:19:50,920 --> 00:19:53,840 Speaker 2: a long time ago that this law enforcement piece was 336 00:19:53,880 --> 00:19:58,639 Speaker 2: really crucial.
And so you just gave me the perfect 337 00:19:58,800 --> 00:20:02,720 Speaker 2: example of one of the targets that we're going to take 338 00:20:02,760 --> 00:20:07,320 Speaker 2: on, and it's the Yahoo Boys, who have been responsible 339 00:20:07,359 --> 00:20:11,880 Speaker 2: for some gruesome sextortion cases, I mean including having kids 340 00:20:11,960 --> 00:20:14,919 Speaker 2: commit suicide, they were so distraught by what happened. But, 341 00:20:15,840 --> 00:20:18,960 Speaker 2: you know, this is all about personal responsibility too, and 342 00:20:19,040 --> 00:20:22,280 Speaker 2: I think we can shift the blame to the easy 343 00:20:22,320 --> 00:20:25,280 Speaker 2: targets, to the deep pockets. I think some of what 344 00:20:25,440 --> 00:20:29,360 Speaker 2: has driven the criticism of the tech companies has been, 345 00:20:29,600 --> 00:20:33,359 Speaker 2: quite frankly, opportunism, because who has deeper pockets than they do? 346 00:20:33,960 --> 00:20:36,840 Speaker 2: But it doesn't take the criminals off the street. And 347 00:20:36,960 --> 00:20:40,280 Speaker 2: so when you start to talk about a gang like 348 00:20:40,320 --> 00:20:43,680 Speaker 2: the Yahoo Boys, and I reference my own father's work 349 00:20:43,760 --> 00:20:46,920 Speaker 2: because, you know, he developed Joe Valachi's testimony against the mob, 350 00:20:46,960 --> 00:20:50,440 Speaker 2: and that ended up being a global undertaking. But that's 351 00:20:50,920 --> 00:20:54,640 Speaker 2: how we start to develop a solution. And I love 352 00:20:54,720 --> 00:20:56,919 Speaker 2: that story, and actually I might want to get in 353 00:20:56,960 --> 00:21:01,840 Speaker 2: touch with that family, because that's how you do it. You 354 00:21:02,040 --> 00:21:09,080 Speaker 2: hold the actual perpetrators accountable, and pretty soon two things happen. One, 355 00:21:09,720 --> 00:21:12,439 Speaker 2: you take the criminals off the street and you start 356 00:21:12,480 --> 00:21:18,200 Speaker 2: to disrupt and discourage the activity. But also, the process 357 00:21:18,280 --> 00:21:22,679 Speaker 2: of doing it is educational for the public, and so 358 00:21:23,119 --> 00:21:27,040 Speaker 2: people can see that this is a lot more complicated 359 00:21:27,280 --> 00:21:29,520 Speaker 2: than they would like to think. So I know members 360 00:21:29,560 --> 00:21:31,800 Speaker 2: of Congress who just want to wave a magic wand, 361 00:21:31,840 --> 00:21:34,760 Speaker 2: and, you know, they think suing the tech companies... I mean, 362 00:21:34,760 --> 00:21:37,119 Speaker 2: I'll give you an example. Somebody asked me about what 363 00:21:37,160 --> 00:21:39,240 Speaker 2: I thought about the hearings a couple of weeks ago, 364 00:21:39,240 --> 00:21:42,120 Speaker 2: and I said, you know, if I had been Mark Zuckerberg, 365 00:21:42,600 --> 00:21:45,040 Speaker 2: and, you know, I'm not here to defend the tech companies, 366 00:21:45,080 --> 00:21:46,600 Speaker 2: but if I had been Mark Zuckerberg, this is what 367 00:21:46,600 --> 00:21:48,480 Speaker 2: I would have said. I would have said, you know, 368 00:21:48,520 --> 00:21:51,440 Speaker 2: Congress has had sixteen years to build a world class 369 00:21:51,880 --> 00:21:55,919 Speaker 2: child exploitation interdiction system, and they haven't really lifted 370 00:21:55,920 --> 00:21:58,320 Speaker 2: a finger. The GAO just came out with a scathing 371 00:21:58,440 --> 00:22:04,439 Speaker 2: report about DOJ's failure to work this issue.
And the 372 00:22:04,480 --> 00:22:08,680 Speaker 2: tech companies are the only reporters of this activity, the only 373 00:22:08,720 --> 00:22:12,439 Speaker 2: meaningful reporters in the mandated reporting system. But they're not 374 00:22:12,600 --> 00:22:17,080 Speaker 2: the police, and so they hand the cyber tips over, 375 00:22:18,000 --> 00:22:23,280 Speaker 2: but only a tiny percentage of those cases are prosecuted, and 376 00:22:23,359 --> 00:22:26,160 Speaker 2: so this is a public safety problem. 377 00:22:26,680 --> 00:22:28,800 Speaker 2: So at the end of the day, I think one 378 00:22:28,840 --> 00:22:31,760 Speaker 2: of the things I love about these stories where families, 379 00:22:32,200 --> 00:22:36,040 Speaker 2: you know, do pursue justice, especially in criminal court, 380 00:22:36,760 --> 00:22:40,440 Speaker 2: is that it shows the public exactly what's going on, 381 00:22:40,720 --> 00:22:43,879 Speaker 2: that this isn't some wave-the-magic-wand scenario. This 382 00:22:44,119 --> 00:22:48,160 Speaker 2: is, you know, boots on the ground, giving law enforcement 383 00:22:48,280 --> 00:22:52,600 Speaker 2: the tools they need and connecting those dots, because, honestly, 384 00:22:52,640 --> 00:22:55,720 Speaker 2: it's these gangs. This isn't some, 385 00:22:56,800 --> 00:23:01,160 Speaker 2: you know, kind of innocuous little activity of somebody 386 00:23:01,160 --> 00:23:05,320 Speaker 2: sitting in his bedroom in Milwaukee. You know, these 387 00:23:05,400 --> 00:23:06,760 Speaker 2: people know what they're doing. 388 00:23:07,240 --> 00:23:10,399 Speaker 1: Yeah, this is organized crime, and it's just a different 389 00:23:10,520 --> 00:23:13,720 Speaker 1: form than we're used to hearing about, exactly. Let's take 390 00:23:13,760 --> 00:23:16,440 Speaker 1: a quick commercial break. We'll continue next on the Tudor 391 00:23:16,480 --> 00:23:22,080 Speaker 1: Dixon Podcast. Before I let you go, I just want 392 00:23:22,119 --> 00:23:25,560 Speaker 1: to get into, when you talk about education and you say, well, 393 00:23:26,040 --> 00:23:28,840 Speaker 1: you know, children and parents need to be educated. And 394 00:23:28,920 --> 00:23:32,320 Speaker 1: I believe this because I'm in the situation where this 395 00:23:32,520 --> 00:23:34,159 Speaker 1: was not a thing when I was a kid. I 396 00:23:34,200 --> 00:23:37,560 Speaker 1: have no idea what they're facing online, but I don't 397 00:23:37,560 --> 00:23:39,840 Speaker 1: feel like there are great resources out there for me 398 00:23:39,920 --> 00:23:43,159 Speaker 1: to find out how I protect them. Is there something 399 00:23:43,200 --> 00:23:45,080 Speaker 1: that I'm missing? Is there a way for us to 400 00:23:46,040 --> 00:23:48,359 Speaker 1: point parents in a direction where it's like, okay, here's 401 00:23:48,400 --> 00:23:50,440 Speaker 1: the facts, and this is what you need to teach 402 00:23:50,480 --> 00:23:53,240 Speaker 1: your kids? Or is this something our schools should also 403 00:23:53,359 --> 00:23:56,679 Speaker 1: have a course on? I mean, I'm really looking for 404 00:23:56,760 --> 00:23:57,920 Speaker 1: those answers. 405 00:23:58,280 --> 00:24:03,840 Speaker 2: Well, certainly working with Stop Child Predators and our partner organizations, 406 00:24:03,960 --> 00:24:06,800 Speaker 2: the number one thing that we feel we can do 407 00:24:07,920 --> 00:24:11,399 Speaker 2: is to develop those educational materials.
And so do I 408 00:24:11,440 --> 00:24:15,040 Speaker 2: think it should be a course in school? You betcha. 409 00:24:15,119 --> 00:24:16,879 Speaker 2: I think, as I said, there was a bill in 410 00:24:16,920 --> 00:24:19,679 Speaker 2: Florida that was passed on a bipartisan basis that we 411 00:24:19,760 --> 00:24:25,480 Speaker 2: are absolutely distributing to other states, to talk to them 412 00:24:25,560 --> 00:24:30,320 Speaker 2: about it. This is a very simple, basic, useful tool for 413 00:24:30,400 --> 00:24:34,200 Speaker 2: kids in school. But then also we're looking to partner 414 00:24:34,320 --> 00:24:39,680 Speaker 2: with national parents organizations, foster parents for instance, I'm a 415 00:24:39,760 --> 00:24:44,920 Speaker 2: Junior League member, to make sure that we can distribute 416 00:24:45,400 --> 00:24:49,360 Speaker 2: just some basic educational material to parents. I mean, I'm 417 00:24:49,560 --> 00:24:51,960 Speaker 2: probably old enough to at least be your mother, if 418 00:24:51,960 --> 00:24:55,280 Speaker 2: not your nana. And I think when I look at 419 00:24:55,359 --> 00:25:00,479 Speaker 2: how the Internet has played out as a thing in 420 00:25:00,520 --> 00:25:03,879 Speaker 2: my lifetime but also during the span of my work 421 00:25:03,880 --> 00:25:06,639 Speaker 2: in this area, the thing that I also try to 422 00:25:06,640 --> 00:25:08,680 Speaker 2: remind people of is that technology has done a lot 423 00:25:08,680 --> 00:25:11,720 Speaker 2: of really good things for kids. So it's helping us 424 00:25:11,760 --> 00:25:14,800 Speaker 2: locate kids abducted from Ukraine by the Russians, it's helping 425 00:25:14,880 --> 00:25:17,439 Speaker 2: us find adoptive families for kids in foster care. I 426 00:25:17,440 --> 00:25:21,439 Speaker 2: can go on all day long about how great technology is, 427 00:25:21,480 --> 00:25:23,920 Speaker 2: but, you know, it's a little bit of common sense. 428 00:25:23,960 --> 00:25:25,959 Speaker 2: You wouldn't let your twelve year old drive to the 429 00:25:25,960 --> 00:25:28,959 Speaker 2: mall by themselves and just wander around. And, you know, 430 00:25:30,119 --> 00:25:32,880 Speaker 2: my twelve year old granddaughter doesn't go on her iPad 431 00:25:33,040 --> 00:25:37,879 Speaker 2: without, you know, some real boundaries from my daughter. So 432 00:25:38,560 --> 00:25:41,920 Speaker 2: I think we have to help everybody get a better 433 00:25:42,040 --> 00:25:45,800 Speaker 2: grasp of what this is really about. And, you know, it's 434 00:25:45,920 --> 00:25:49,520 Speaker 2: sort of one thing to have your granddaughter, this is, 435 00:25:49,560 --> 00:25:52,040 Speaker 2: I'm speaking from personal experience, sit on your boat watching 436 00:25:52,080 --> 00:25:54,960 Speaker 2: CoComelon while she takes a nap, and 437 00:25:55,359 --> 00:25:59,960 Speaker 2: it's another thing to just let kids have unfettered access. 438 00:26:00,240 --> 00:26:04,280 Speaker 2: So I feel like we are moving in the right direction. 439 00:26:05,080 --> 00:26:08,359 Speaker 2: But I think we need to focus on two really 440 00:26:08,400 --> 00:26:12,640 Speaker 2: important things: the criminals that are doing this, and then 441 00:26:12,800 --> 00:26:16,760 Speaker 2: helping families understand how much control they have over this, 442 00:26:16,920 --> 00:26:20,679 Speaker 2: how they can protect their own kids.
As I said earlier, 443 00:26:20,760 --> 00:26:23,439 Speaker 2: I just think that the government makes a terrible parent. 444 00:26:23,480 --> 00:26:26,480 Speaker 2: There are thousands of foster children who can speak to 445 00:26:26,520 --> 00:26:29,919 Speaker 2: that on any given day, and so I really feel 446 00:26:29,920 --> 00:26:32,080 Speaker 2: like the solutions need to come from families. 447 00:26:33,000 --> 00:26:36,040 Speaker 1: I think there's a different level too that we didn't 448 00:26:36,080 --> 00:26:38,120 Speaker 1: talk about, and it's not a safety thing. But when 449 00:26:38,760 --> 00:26:41,800 Speaker 1: you talk about this, it just reminded me. I have 450 00:26:41,840 --> 00:26:45,280 Speaker 1: a daughter in high school, and obviously the kids 451 00:26:45,359 --> 00:26:48,040 Speaker 1: text back and forth, but then if they get mad 452 00:26:48,080 --> 00:26:50,480 Speaker 1: at one another for any reason, I mean the most 453 00:26:50,520 --> 00:26:53,720 Speaker 1: sensitive reason whatsoever, or break up with a girlfriend or 454 00:26:53,720 --> 00:26:57,800 Speaker 1: a boyfriend, they immediately block that person. And I took 455 00:26:57,840 --> 00:26:59,960 Speaker 1: my girls aside the other day and I said, look, 456 00:27:01,119 --> 00:27:04,080 Speaker 1: don't do that. I don't care how angry you are 457 00:27:04,080 --> 00:27:06,320 Speaker 1: with someone. You don't know what you do to them 458 00:27:06,520 --> 00:27:09,760 Speaker 1: in that moment. If you cut them off completely, they 459 00:27:09,760 --> 00:27:12,320 Speaker 1: have no access to you. They can't have a conversation. 460 00:27:12,920 --> 00:27:15,840 Speaker 1: I know that you guys have this different technology and 461 00:27:15,840 --> 00:27:18,720 Speaker 1: that wasn't my story as a kid, but these are 462 00:27:18,840 --> 00:27:22,600 Speaker 1: just, like, etiquette tips that we didn't have 463 00:27:22,640 --> 00:27:25,000 Speaker 1: to think about when we were young, and now life 464 00:27:25,040 --> 00:27:25,720 Speaker 1: has changed. 465 00:27:26,920 --> 00:27:29,320 Speaker 2: You know, I'm so glad you said that, and maybe 466 00:27:29,320 --> 00:27:31,439 Speaker 2: we should do a whole other show about this, because 467 00:27:32,040 --> 00:27:36,040 Speaker 2: I think that part of what has fueled a lot 468 00:27:36,080 --> 00:27:38,040 Speaker 2: of... First of all, the whole issue of mental health 469 00:27:38,080 --> 00:27:42,000 Speaker 2: and kids: technology is not the problem. The rest of 470 00:27:42,080 --> 00:27:44,760 Speaker 2: life in a lot of kids' lives is the problem. 471 00:27:44,760 --> 00:27:48,280 Speaker 2: But, you know, the way that social media, I'm sure 472 00:27:48,320 --> 00:27:50,960 Speaker 2: we've both done this, but social media has a way 473 00:27:50,960 --> 00:27:55,760 Speaker 2: of amplifying and sharpening communication, oftentimes not in a good way. 474 00:27:55,840 --> 00:27:58,280 Speaker 2: I know I've been guilty of doing this myself. So 475 00:27:58,960 --> 00:28:02,640 Speaker 2: I mean, part of this is how we communicate 476 00:28:02,720 --> 00:28:05,679 Speaker 2: with each other and how we treat each other. And 477 00:28:05,720 --> 00:28:09,600 Speaker 2: I think it's actually a really big teachable moment for kids.
478 00:28:10,119 --> 00:28:12,240 Speaker 2: You know, people ask me about bullying all the time, 479 00:28:12,240 --> 00:28:16,040 Speaker 2: and I'm like, look, you know, technology is not going 480 00:28:16,119 --> 00:28:21,320 Speaker 2: to stop bullying, and bullying will move wherever kids are. I'm 481 00:28:21,359 --> 00:28:23,879 Speaker 2: seventy five years old, and I'm sure there was bullying. All 482 00:28:23,920 --> 00:28:26,080 Speaker 2: the nuns in my school probably wouldn't stand for it, 483 00:28:26,119 --> 00:28:28,159 Speaker 2: but, you know, there was bullying going on in the 484 00:28:28,240 --> 00:28:30,320 Speaker 2: fifties and sixties, and when my kids were growing up 485 00:28:30,359 --> 00:28:33,520 Speaker 2: in the seventies and eighties, and with the grandchildren now. It's 486 00:28:33,600 --> 00:28:36,639 Speaker 2: a human thing. It's not a technology thing, right? And 487 00:28:36,680 --> 00:28:40,560 Speaker 2: I think that technology, though, gives us an opportunity to 488 00:28:40,680 --> 00:28:44,840 Speaker 2: maybe use technology more constructively and more compassionately. And to 489 00:28:45,000 --> 00:28:48,200 Speaker 2: your point, you know, when I hear these heart wrenching 490 00:28:48,280 --> 00:28:52,400 Speaker 2: stories about kids committing suicide because they felt rejected, wow. 491 00:28:52,480 --> 00:28:56,520 Speaker 2: I mean, sometimes the answers are really simple, 492 00:28:57,360 --> 00:29:00,360 Speaker 2: and you just raised, I think, an extremely important point. 493 00:29:01,320 --> 00:29:03,560 Speaker 1: But honestly, it's one of those things where, if you're 494 00:29:03,600 --> 00:29:07,000 Speaker 1: not having that conversation with your kids... I mean, my 495 00:29:07,120 --> 00:29:09,760 Speaker 1: daughter was like, I can't believe this happened. You know, 496 00:29:09,960 --> 00:29:12,440 Speaker 1: we had this conversation. Then she blocked me and I 497 00:29:12,440 --> 00:29:15,080 Speaker 1: haven't spoken to her since. And I was like, how 498 00:29:15,120 --> 00:29:18,080 Speaker 1: often does that happen? Oh, this is how the kids 499 00:29:18,120 --> 00:29:20,560 Speaker 1: do it all the time now. Then my seventh grader 500 00:29:20,640 --> 00:29:23,480 Speaker 1: chimes in, and I said, you know what that can 501 00:29:23,560 --> 00:29:26,120 Speaker 1: feel like? It's a forever thing in that moment, and 502 00:29:26,160 --> 00:29:28,960 Speaker 1: you guys need to understand how that makes someone else feel. 503 00:29:29,160 --> 00:29:32,000 Speaker 1: Don't ever do that. But I would never have known 504 00:29:32,120 --> 00:29:34,320 Speaker 1: had we not had that conversation. So it's 505 00:29:34,400 --> 00:29:36,400 Speaker 1: kind of one of those things. We could talk about this, 506 00:29:36,560 --> 00:29:39,880 Speaker 1: and we should, because we should talk about all of 507 00:29:39,920 --> 00:29:43,120 Speaker 1: those factors, and that should go into that material of 508 00:29:43,360 --> 00:29:45,640 Speaker 1: how to talk to your kids about technology. 509 00:29:46,120 --> 00:29:49,120 Speaker 2: Yeah, yeah, yeah, and being a good person. I have 510 00:29:49,240 --> 00:29:51,840 Speaker 2: a dear friend who has a big foster care program 511 00:29:51,880 --> 00:29:55,120 Speaker 2: and his tagline is be a good person. And, you know, 512 00:29:55,240 --> 00:29:58,000 Speaker 2: I think sometimes we lose sight of that.
And technology 513 00:29:58,360 --> 00:30:01,280 Speaker 2: has a way of sort of amplifying things sometimes, 514 00:30:01,360 --> 00:30:04,800 Speaker 2: and, you know, I think we probably both have, because 515 00:30:04,840 --> 00:30:07,320 Speaker 2: I think we're both into a lot of stuff policy-wise. 516 00:30:08,200 --> 00:30:11,800 Speaker 2: It just keeps throwing stuff at you and it can 517 00:30:11,840 --> 00:30:15,680 Speaker 2: be overwhelming, even for an adult, and sometimes you do 518 00:30:15,720 --> 00:30:18,480 Speaker 2: want to just turn that thing off, whatever it is, 519 00:30:18,560 --> 00:30:21,200 Speaker 2: but that isn't necessarily the best way to work through 520 00:30:21,200 --> 00:30:22,240 Speaker 2: conflict in a moment. 521 00:30:22,360 --> 00:30:25,360 Speaker 1: So, I mean, I keep finding that no matter what 522 00:30:25,520 --> 00:30:28,800 Speaker 1: the subject is, it seems like it's better to handle it 523 00:30:28,880 --> 00:30:32,240 Speaker 1: in society than try to make government create some... 524 00:30:32,720 --> 00:30:38,640 Speaker 2: Rule or law around it? Amen, sister. Because the biggest concern 525 00:30:38,720 --> 00:30:42,200 Speaker 2: that I have right now is that we have the 526 00:30:42,280 --> 00:30:46,440 Speaker 2: government weighing in with a lot of knee jerk ideas. 527 00:30:46,560 --> 00:30:49,640 Speaker 2: And I'm sure that a lot of them are well intended, 528 00:30:50,240 --> 00:30:53,680 Speaker 2: but, you know, as I think we all know, the 529 00:30:53,720 --> 00:30:57,120 Speaker 2: government is not the answer to everything, and sometimes the 530 00:30:57,160 --> 00:31:00,600 Speaker 2: government creates a lot of unintended consequences. I'll just leave 531 00:31:00,600 --> 00:31:03,160 Speaker 2: you with one thought, because I think Michigan's involved with it. 532 00:31:03,720 --> 00:31:05,880 Speaker 2: You know, a lot of the state attorneys general are 533 00:31:05,920 --> 00:31:09,320 Speaker 2: suing Meta right now for all kinds of perceived, you know, 534 00:31:09,800 --> 00:31:12,760 Speaker 2: harm to children. But what I found, I mean, I 535 00:31:12,800 --> 00:31:15,000 Speaker 2: literally laughed out loud when I heard about the lawsuit, 536 00:31:15,040 --> 00:31:18,560 Speaker 2: because almost every single one of those states has been 537 00:31:18,640 --> 00:31:23,800 Speaker 2: sued for their own horrid child welfare outcomes, and so, 538 00:31:24,040 --> 00:31:26,960 Speaker 2: kids dying in foster care, kids being lost in foster care, 539 00:31:27,960 --> 00:31:31,680 Speaker 2: you know, infant mortality rates off the charts. So let's 540 00:31:31,720 --> 00:31:34,960 Speaker 2: not kid ourselves. This really isn't a technology problem. This 541 00:31:35,000 --> 00:31:37,120 Speaker 2: is kind of a human problem. Yeah. And I think 542 00:31:37,160 --> 00:31:41,440 Speaker 2: it's up to us to attack it, and preferably together, 543 00:31:41,800 --> 00:31:44,080 Speaker 2: because I think that we're all learning a lot from 544 00:31:44,080 --> 00:31:46,200 Speaker 2: each other, and I think that that's the way you 545 00:31:46,280 --> 00:31:47,080 Speaker 2: get things done. 546 00:31:47,720 --> 00:31:49,480 Speaker 1: Well, this has been great. I mean, I've learned a 547 00:31:49,480 --> 00:31:51,800 Speaker 1: lot today.
I appreciate you taking the time to talk 548 00:31:51,800 --> 00:31:54,040 Speaker 1: to me about it, and if there's anything that you 549 00:31:54,080 --> 00:31:56,640 Speaker 1: can point us to, let me know. I mean, we'd 550 00:31:56,640 --> 00:31:59,440 Speaker 1: love to share that with our audience. And if you 551 00:31:59,480 --> 00:32:01,440 Speaker 1: have those materials, let's get them out there. 552 00:32:02,360 --> 00:32:04,880 Speaker 2: I will be delighted to stay in touch. I really 553 00:32:04,960 --> 00:32:05,600 Speaker 2: enjoyed this. 554 00:32:05,800 --> 00:32:08,080 Speaker 1: Me too. Well, Maureen, it was wonderful having you. 555 00:32:08,640 --> 00:32:10,280 Speaker 2: Thank you. Anytime. 556 00:32:10,280 --> 00:32:12,960 Speaker 1: And thank you all for joining us on the Tudor Dixon Podcast. 557 00:32:13,040 --> 00:32:16,200 Speaker 1: For this episode and others, go to Tudor Dixon Podcast dot com. 558 00:32:16,240 --> 00:32:18,400 Speaker 1: You can subscribe right there, or head over to the 559 00:32:18,440 --> 00:32:22,320 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, 560 00:32:22,360 --> 00:32:24,920 Speaker 1: and join us next time on the Tudor Dixon Podcast. 561 00:32:25,120 --> 00:32:28,520 Speaker 1: Have a blessed day.