1 00:00:00,320 --> 00:00:03,920 Speaker 1: Topics featured in this episode may be disturbing to some listeners. 2 00:00:04,400 --> 00:00:05,920 Speaker 1: Please take care while listening. 3 00:00:08,720 --> 00:00:13,080 Speaker 2: This is a crime that thrives in the shadows, and 4 00:00:13,520 --> 00:00:17,000 Speaker 2: people needed to hear what was actually going on. 5 00:00:17,000 --> 00:00:19,239 Speaker 3: One of the biggest problems reporting on this is nobody 6 00:00:19,239 --> 00:00:21,600 Speaker 3: wants to hear about the problem because of how awful 7 00:00:21,640 --> 00:00:21,919 Speaker 3: it is. 8 00:00:30,360 --> 00:00:37,880 Speaker 1: I'm Andrea Gunning, and this is a Betrayal bonus episode. 9 00:00:37,960 --> 00:00:40,840 Speaker 1: In episode four, you heard from New York Times reporters 10 00:00:40,960 --> 00:00:44,000 Speaker 1: Michael Keller and Gabriel Dance as they spoke about their 11 00:00:44,040 --> 00:00:48,320 Speaker 1: twenty nineteen investigative piece on child sexual abuse material. It's 12 00:00:48,360 --> 00:00:52,280 Speaker 1: called The Internet Is Overrun with Images of Child Sexual Abuse. 13 00:00:53,000 --> 00:00:56,080 Speaker 1: What Went Wrong? If you have a chance, 14 00:00:56,120 --> 00:00:59,600 Speaker 1: look it up, because it's superb investigative reporting. There's a 15 00:00:59,640 --> 00:01:03,120 Speaker 1: link in our show notes to the article. We wanted 16 00:01:03,120 --> 00:01:06,640 Speaker 1: to dive a little deeper. How are crimes being reported, 17 00:01:06,959 --> 00:01:10,120 Speaker 1: what role are technology companies playing, and how is the 18 00:01:10,160 --> 00:01:12,839 Speaker 1: government responding? Here's Michael Keller. 19 00:01:13,640 --> 00:01:18,000 Speaker 2: This is a crime that thrives in the shadows, and 20 00:01:18,480 --> 00:01:21,320 Speaker 2: people needed to hear what was actually going on. 21 00:01:22,240 --> 00:01:23,800 Speaker 1: Reporter Gabriel Dance. 22 00:01:24,280 --> 00:01:26,520 Speaker 3: One of the biggest problems reporting on this is nobody 23 00:01:26,520 --> 00:01:29,160 Speaker 3: wants to hear about the problem because of how awful it is. 24 00:01:29,120 --> 00:01:33,679 Speaker 1: And to be honest, we were nervous. We know 25 00:01:33,760 --> 00:01:37,000 Speaker 1: from season one of Betrayal that our audience is genuinely 26 00:01:37,040 --> 00:01:40,800 Speaker 1: interested in letting the light in on dark stories. One 27 00:01:40,840 --> 00:01:44,080 Speaker 1: of Michael and Gabriel's most important revelations was that our 28 00:01:44,160 --> 00:01:49,160 Speaker 1: legislators don't really want to hear about it. State lawmakers, judges, 29 00:01:49,200 --> 00:01:52,840 Speaker 1: and members of Congress have avoided attending meetings and hearings 30 00:01:52,840 --> 00:01:56,320 Speaker 1: when it was on the agenda. They just aren't showing up. 31 00:01:56,880 --> 00:01:59,440 Speaker 2: One of the big things was the failures of the 32 00:01:59,480 --> 00:02:03,440 Speaker 2: federal government to live up to its own promises that 33 00:02:03,480 --> 00:02:07,480 Speaker 2: it made around two thousand and eight to develop a 34 00:02:07,720 --> 00:02:15,240 Speaker 2: strong national response. The government had not really followed through 35 00:02:16,000 --> 00:02:20,520 Speaker 2: on its grand plans. The high-level position at the DOJ 36 00:02:21,440 --> 00:02:27,200 Speaker 2: was never fully created.
The strategy reports that were supposed 37 00:02:27,200 --> 00:02:30,560 Speaker 2: to come out on a regular basis, there have only been 38 00:02:30,720 --> 00:02:34,000 Speaker 2: two of them over the last decade. You know, reports 39 00:02:34,040 --> 00:02:39,799 Speaker 2: of this abuse have risen, but federal funding to these special task 40 00:02:39,840 --> 00:02:42,440 Speaker 2: forces has largely remained flat. 41 00:02:42,840 --> 00:02:45,480 Speaker 3: I mean, there's so much of these offenses going on, 42 00:02:45,520 --> 00:02:47,600 Speaker 3: there's so many reports, there's seemingly not enough police in the 43 00:02:47,680 --> 00:02:50,320 Speaker 3: United States to solve this problem. 44 00:02:50,520 --> 00:02:53,480 Speaker 1: ICAC, or the Internet Crimes Against Children task force, 45 00:02:53,720 --> 00:02:56,680 Speaker 1: is working on the front lines every day. There's at 46 00:02:56,760 --> 00:02:59,680 Speaker 1: least one ICAC in every state. Hearing what they 47 00:02:59,720 --> 00:03:02,519 Speaker 1: go through daily is truly harrowing. 48 00:03:04,240 --> 00:03:07,320 Speaker 3: What I will say is, speaking with members of these 49 00:03:07,600 --> 00:03:12,200 Speaker 3: ICAC task forces, I was always in such admiration and 50 00:03:12,280 --> 00:03:15,760 Speaker 3: awe of their work dealing with this kind of content 51 00:03:15,880 --> 00:03:19,480 Speaker 3: and this kind of horrible crime, and really the survivors 52 00:03:19,760 --> 00:03:24,560 Speaker 3: and how hurt some of them are, sometimes for the 53 00:03:24,600 --> 00:03:28,160 Speaker 3: rest of their lives. We spoke with an ICAC guy 54 00:03:28,200 --> 00:03:32,600 Speaker 3: in Kansas who had served in the Iraq War, and 55 00:03:32,680 --> 00:03:36,119 Speaker 3: he said he would almost rather go back and serve 56 00:03:36,160 --> 00:03:41,080 Speaker 3: another tour than continue in his position dealing with these 57 00:03:41,120 --> 00:03:42,000 Speaker 3: types of crimes. 58 00:03:42,480 --> 00:03:45,240 Speaker 2: He had said that he worked in ICAC and then, 59 00:03:45,280 --> 00:03:47,800 Speaker 2: to take a break, he did a tour in Iraq 60 00:03:48,880 --> 00:03:52,320 Speaker 2: and then came back and felt like, all right, now 61 00:03:52,360 --> 00:03:54,280 Speaker 2: I can go back and keep doing this work. 62 00:03:55,000 --> 00:03:58,640 Speaker 1: I'm in awe of the law enforcement officers. They choose 63 00:03:58,720 --> 00:04:01,760 Speaker 1: this work because they want to save children, but it 64 00:04:01,800 --> 00:04:04,800 Speaker 1: really is akin to war in an emotional sense. 65 00:04:05,480 --> 00:04:09,440 Speaker 3: Some people like viscerally cannot deal with this issue because 66 00:04:09,440 --> 00:04:13,920 Speaker 3: it is truly one of the most awful crimes that 67 00:04:13,960 --> 00:04:19,200 Speaker 3: we commit against one another. And the descriptions. Michael and 68 00:04:19,240 --> 00:04:24,560 Speaker 3: I probably read hundreds and hundreds of search warrants and 69 00:04:24,880 --> 00:04:30,000 Speaker 3: legal documents that would describe videos and photos and the 70 00:04:30,000 --> 00:04:31,360 Speaker 3: acts in them.
71 00:04:31,880 --> 00:04:34,920 Speaker 1: One strategy ICAC uses to write reports is to turn 72 00:04:34,960 --> 00:04:38,359 Speaker 1: the video off when documenting the audio and turn the 73 00:04:38,440 --> 00:04:41,800 Speaker 1: sound off when documenting the video, because it's too much 74 00:04:41,839 --> 00:04:45,400 Speaker 1: to handle at the same time. These ICAC task force 75 00:04:45,440 --> 00:04:48,719 Speaker 1: members can only do so much with what they are given. 76 00:04:49,400 --> 00:04:54,359 Speaker 1: They triage the cases, often prioritizing the youngest victims, but 77 00:04:54,400 --> 00:04:57,320 Speaker 1: they can only investigate about a third of all the 78 00:04:57,360 --> 00:05:03,360 Speaker 1: tips because the caseload is so overwhelming. Of course, predators are 79 00:05:03,360 --> 00:05:06,839 Speaker 1: the biggest problem and bear the most responsibility, but we 80 00:05:06,920 --> 00:05:10,680 Speaker 1: need to acknowledge there's another culpable participant when it comes 81 00:05:10,760 --> 00:05:16,400 Speaker 1: to the explosion of CSAM material: technology companies. Before the Internet, 82 00:05:16,400 --> 00:05:19,359 Speaker 1: the US Postal Service was the leading reporter of CSAM 83 00:05:19,680 --> 00:05:23,479 Speaker 1: and was stopping the dissemination of material via the mail. However, 84 00:05:23,640 --> 00:05:26,920 Speaker 1: with millions of images plaguing the Internet, is it time 85 00:05:27,000 --> 00:05:31,000 Speaker 1: we started holding technology companies responsible for their lack of action? 86 00:05:31,920 --> 00:05:35,080 Speaker 3: What I do think they're certainly responsible for is allowing 87 00:05:35,120 --> 00:05:40,039 Speaker 3: this problem to get very serious before they started to 88 00:05:40,080 --> 00:05:45,160 Speaker 3: take responsibility for their role in it. As early as 89 00:05:45,640 --> 00:05:49,120 Speaker 3: two thousand, tech companies knew this was a very serious problem 90 00:05:49,480 --> 00:05:54,200 Speaker 3: and were doing nothing to solve it. So I would 91 00:05:54,279 --> 00:05:58,320 Speaker 3: say that tech companies are certainly responsible for allowing the 92 00:05:58,360 --> 00:06:02,200 Speaker 3: problem to spiral out of control in the early part 93 00:06:02,200 --> 00:06:06,880 Speaker 3: of this century, and I'm encouraged that, from what we've seen, 94 00:06:07,640 --> 00:06:10,360 Speaker 3: several of them have begun to take the problem much 95 00:06:10,360 --> 00:06:11,000 Speaker 3: more seriously. 96 00:06:11,640 --> 00:06:15,679 Speaker 1: The technology exists to root out criminal behavior, so why aren't 97 00:06:15,720 --> 00:06:17,040 Speaker 1: tech companies deploying it? 98 00:06:18,040 --> 00:06:22,000 Speaker 2: Microsoft, along with Professor Hany Farid, came up with the 99 00:06:22,000 --> 00:06:27,440 Speaker 2: technology called PhotoDNA. This takes a database of image 100 00:06:27,440 --> 00:06:33,800 Speaker 2: fingerprints, and whenever a photograph gets uploaded to an Internet platform, 101 00:06:34,000 --> 00:06:37,080 Speaker 2: that company can scan it to see if it's in 102 00:06:37,160 --> 00:06:41,640 Speaker 2: the database of verified illegal imagery. And so that's the 103 00:06:41,680 --> 00:06:44,960 Speaker 2: main tool that tech companies use, which is great because 104 00:06:45,000 --> 00:06:48,479 Speaker 2: it's largely automated and easy to use.
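PhotoDNA itself is proprietary, but the matching flow Michael describes can be sketched with open-source stand-ins. The minimal Python sketch below uses the third-party imagehash and Pillow packages in place of Microsoft's actual algorithm; the database entry, distance threshold, and file name are hypothetical placeholders.

```python
# A minimal sketch of fingerprint-database matching, assuming the
# open-source `imagehash` and `Pillow` packages as stand-ins for the
# proprietary PhotoDNA algorithm. Entries and threshold are placeholders.
from PIL import Image
import imagehash

# Hypothetical database of fingerprints for known, verified material;
# in practice these come from a clearinghouse, not from raw images.
KNOWN_HASHES = {
    imagehash.hex_to_hash("ffd8e0c0b0a09080"),  # placeholder entry
}

MAX_HAMMING_DISTANCE = 5  # tolerance for re-encoded or resized copies


def matches_known_image(path: str) -> bool:
    """Fingerprint an uploaded file and compare it to the database."""
    upload_hash = imagehash.phash(Image.open(path))
    # Perceptual hashes change only slightly when an image is resized
    # or recompressed, so we test closeness, not exact equality.
    return any(upload_hash - known <= MAX_HAMMING_DISTANCE
               for known in KNOWN_HASHES)


if matches_known_image("upload.jpg"):
    print("Match: queue for human review and reporting.")
```

The design point holds either way: because the fingerprints are perceptual rather than exact checksums, copies that have been resized or re-encoded still match, which is what makes largely automated scanning at platform scale possible.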
It's been around 105 00:06:48,520 --> 00:06:52,760 Speaker 2: for a long time, so a company like Facebook or 106 00:06:52,839 --> 00:06:57,520 Speaker 2: others that are doing automated scanning, they can generate a 107 00:06:57,680 --> 00:07:02,880 Speaker 2: large number of reports just through this software. They also 108 00:07:03,000 --> 00:07:07,640 Speaker 2: generally have a team of human moderators that review it, 109 00:07:08,200 --> 00:07:09,560 Speaker 2: and that serves an 110 00:07:09,440 --> 00:07:11,600 Speaker 2: important role of 111 00:07:11,160 --> 00:07:15,040 Speaker 2: verifying what was found and also escalating it if there's 112 00:07:15,080 --> 00:07:18,560 Speaker 2: evidence of actual hands-on abuse 113 00:07:18,200 --> 00:07:18,760 Speaker 2: of a child. 114 00:07:20,720 --> 00:07:26,760 Speaker 2: If you talk with most technology policy people, one perspective 115 00:07:26,800 --> 00:07:30,400 Speaker 2: that you hear a lot is that technology companies don't have that 116 00:07:30,520 --> 00:07:35,880 Speaker 2: much pressure to get rid of harmful content on their 117 00:07:35,920 --> 00:07:42,440 Speaker 2: platforms because they don't face any legal liability for it. 118 00:07:43,320 --> 00:07:45,240 Speaker 2: You know, technology companies, of course, would say, we have 119 00:07:45,320 --> 00:07:48,080 Speaker 2: every reason to get rid of this harmful content. We 120 00:07:48,080 --> 00:07:51,480 Speaker 2: don't want to be a place for exploitation. The 121 00:07:51,600 --> 00:07:56,440 Speaker 2: legislative solutions that have been proposed so far try to 122 00:07:56,480 --> 00:08:00,320 Speaker 2: go after Section two thirty of the Communications Decency Act, 123 00:08:00,560 --> 00:08:05,160 Speaker 2: which shields technology companies from any liability for content that 124 00:08:05,400 --> 00:08:10,080 Speaker 2: users post. There have been a few proposals to try 125 00:08:10,120 --> 00:08:13,280 Speaker 2: and change that, both from Democrats and Republicans. It's been 126 00:08:13,320 --> 00:08:17,920 Speaker 2: one of the few areas of bipartisan support. Those proposals 127 00:08:17,960 --> 00:08:21,560 Speaker 2: have not gone through, but over the last few years 128 00:08:21,640 --> 00:08:26,680 Speaker 2: you do see people trying to find ways to increase 129 00:08:26,720 --> 00:08:30,960 Speaker 2: the incentives for tech companies to clamp down on this more. 130 00:08:31,600 --> 00:08:35,120 Speaker 1: Let's take Facebook's parent company, Meta, as an example. Meta 131 00:08:35,160 --> 00:08:37,640 Speaker 1: is the leading reporter of child sexual abuse material to 132 00:08:37,679 --> 00:08:41,600 Speaker 1: the National Center for Missing and Exploited Children. Almost all 133 00:08:41,640 --> 00:08:44,760 Speaker 1: of the illegal content gets transmitted through their Messenger app. 134 00:08:45,320 --> 00:08:48,840 Speaker 1: That isn't necessarily because it has the most CSAM; it's because 135 00:08:48,840 --> 00:08:53,880 Speaker 1: they're using PhotoDNA and finding offenders. Currently, Messenger does 136 00:08:53,920 --> 00:08:58,040 Speaker 1: not encrypt its messages. However, Meta has announced that this 137 00:08:58,120 --> 00:09:01,240 Speaker 1: year it will make end-to-end encryption the default.
138 00:09:02,080 --> 00:09:06,120 Speaker 1: Meta executives have admitted that encryption will decrease its ability 139 00:09:06,160 --> 00:09:10,280 Speaker 1: to report CSAM, saying, if it's content we cannot see, 140 00:09:10,720 --> 00:09:16,280 Speaker 1: then it's content we cannot report. The Virtual Global Task Force, 141 00:09:16,440 --> 00:09:20,640 Speaker 1: a consortium of fifteen law enforcement agencies, is practically begging 142 00:09:20,679 --> 00:09:25,440 Speaker 1: Meta not to do it. Meta CEO Mark Zuckerberg stated, 143 00:09:26,000 --> 00:09:29,640 Speaker 1: encryption is a powerful tool for privacy, but that includes 144 00:09:29,679 --> 00:09:33,200 Speaker 1: the privacy of people doing bad things. When billions of 145 00:09:33,240 --> 00:09:36,280 Speaker 1: people use a service to connect, some of them are 146 00:09:36,280 --> 00:09:42,280 Speaker 1: going to misuse it for truly terrible things like child exploitation, terrorism, 147 00:09:42,800 --> 00:09:43,480 Speaker 1: and extortion. 148 00:09:44,280 --> 00:09:50,640 Speaker 3: The more communications are encrypted, the less capable tech companies 149 00:09:50,679 --> 00:09:54,320 Speaker 3: are of using these automated scanning tools to find and 150 00:09:54,360 --> 00:09:59,640 Speaker 3: report CSAM. That's a much broader conversation that should 151 00:09:59,640 --> 00:10:04,480 Speaker 3: be had, and oftentimes it gets shorthanded to everything should 152 00:10:04,480 --> 00:10:21,680 Speaker 3: be encrypted or nothing should be encrypted. The encryption conversation 153 00:10:22,280 --> 00:10:28,440 Speaker 3: is often complicated by this particular issue, which gets held out 154 00:10:28,480 --> 00:10:31,760 Speaker 3: as a wedge issue by both sides, both by law enforcement and by tech 155 00:10:31,800 --> 00:10:34,600 Speaker 3: companies and people who believe that all communications should be 156 00:10:34,679 --> 00:10:39,920 Speaker 3: encrypted. I think there can be 157 00:10:39,960 --> 00:10:43,760 Speaker 3: more nuance to that conversation, particularly when you come to 158 00:10:44,440 --> 00:10:49,240 Speaker 3: platforms and social media networks where adults can engage with children. 159 00:10:50,200 --> 00:10:54,959 Speaker 3: Just by definition, children are at such a disadvantage. Something 160 00:10:55,000 --> 00:10:57,400 Speaker 3: that's important to note as well is that many of 161 00:10:57,400 --> 00:11:01,880 Speaker 3: these social networks also give predators an opportunity to 162 00:11:02,040 --> 00:11:05,920 Speaker 3: engage with children in a way that was never before possible. 163 00:11:06,600 --> 00:11:11,880 Speaker 3: You have documented cases of grown men going on Facebook, 164 00:11:12,440 --> 00:11:17,600 Speaker 3: pretending to be children, and then sexually extorting other children 165 00:11:17,679 --> 00:11:22,200 Speaker 3: into sending images of themselves, after which they continue to 166 00:11:22,320 --> 00:11:24,680 Speaker 3: force them to produce more and more imagery. 167 00:11:26,320 --> 00:11:29,040 Speaker 1: Gabe is referring to what is commonly known as sextortion: 168 00:11:30,120 --> 00:11:32,880 Speaker 1: tricking a young person into sending an image and then 169 00:11:33,000 --> 00:11:36,160 Speaker 1: essentially blackmailing the child into sending more with threats of 170 00:11:36,200 --> 00:11:41,120 Speaker 1: exposure or harm.
The encryption debate won't be solved anytime soon, 171 00:11:41,440 --> 00:11:44,160 Speaker 1: but it's clear that protecting children from abuse is not 172 00:11:44,440 --> 00:11:47,360 Speaker 1: enough of a reason to compel for-profit tech companies 173 00:11:47,559 --> 00:11:51,559 Speaker 1: to consider changing their approach. Social media websites and messaging 174 00:11:51,600 --> 00:11:55,200 Speaker 1: platforms are ground zero for the production and sharing of 175 00:11:55,240 --> 00:11:59,319 Speaker 1: CSAM material. Through the dark web and encrypted groups, 176 00:11:59,400 --> 00:12:03,600 Speaker 1: appalling communities have developed. Take the site Welcome to Video. 177 00:12:04,320 --> 00:12:07,600 Speaker 1: This darknet site, hosted in South Korea, amassed more 178 00:12:07,640 --> 00:12:11,000 Speaker 1: than two hundred and fifty thousand child exploitation videos in 179 00:12:11,160 --> 00:12:15,439 Speaker 1: only two years. Welcome to Video created a community of 180 00:12:15,559 --> 00:12:20,720 Speaker 1: users who bought and traded appalling content. Videos were sold 181 00:12:20,760 --> 00:12:28,880 Speaker 1: for bitcoin. According to an April twenty twenty two article 182 00:12:28,880 --> 00:12:33,720 Speaker 1: in Wired magazine, the site's upload page instructed, do not 183 00:12:33,880 --> 00:12:38,240 Speaker 1: upload adult porn, the last two words highlighted in red 184 00:12:38,280 --> 00:12:44,480 Speaker 1: for emphasis. The page also warned that uploaded videos would 185 00:12:44,480 --> 00:12:49,959 Speaker 1: be checked for uniqueness, meaning only new material would be accepted. 186 00:12:50,520 --> 00:12:52,840 Speaker 2: In a lot of online groups, these images are like 187 00:12:52,840 --> 00:12:57,439 Speaker 2: a currency. In order to gain access to people's collections, 188 00:12:58,160 --> 00:13:02,840 Speaker 2: it's required that you produce new, never-before-seen images. 189 00:13:03,400 --> 00:13:07,160 Speaker 2: So you also have that dynamic where people that want 190 00:13:07,400 --> 00:13:13,400 Speaker 2: to get images are pushed into abusing children and documenting 191 00:13:13,440 --> 00:13:15,520 Speaker 2: that abuse and sharing it online. 192 00:13:17,480 --> 00:13:20,320 Speaker 1: Welcome to Video was brought down by a joint effort 193 00:13:20,360 --> 00:13:23,880 Speaker 1: between the FBI and the South Korean government. It was 194 00:13:23,920 --> 00:13:27,679 Speaker 1: the result of dogged detective work and internet sleuthing, and 195 00:13:27,720 --> 00:13:30,760 Speaker 1: while it was hosted in South Korea, many of its 196 00:13:30,840 --> 00:13:35,640 Speaker 1: users were United States citizens. There are so many people 197 00:13:35,720 --> 00:13:39,679 Speaker 1: who don't realize just how big this problem is and 198 00:13:39,720 --> 00:13:44,920 Speaker 1: how close to home it actually hits. So with all 199 00:13:44,960 --> 00:13:47,880 Speaker 1: of this information we have, what can we do to 200 00:13:47,960 --> 00:13:50,760 Speaker 1: make the public more aware of this problem? 201 00:13:51,240 --> 00:13:54,320 Speaker 3: What I came away with as the clearest call to 202 00:13:54,400 --> 00:14:01,200 Speaker 3: action from our reporting is spreading awareness, educating parents, 203 00:14:01,720 --> 00:14:05,400 Speaker 3: and encouraging them to educate their children.
This is not 204 00:14:05,960 --> 00:14:11,240 Speaker 3: necessarily a problem that tech companies can solve, and they certainly 205 00:14:11,559 --> 00:14:13,240 Speaker 3: don't seem determined to solve it. 206 00:14:14,120 --> 00:14:18,000 Speaker 2: We spoke with a few online child safety experts who 207 00:14:18,800 --> 00:14:21,520 Speaker 2: had a few pieces of advice. One brought up the 208 00:14:21,560 --> 00:14:25,720 Speaker 2: idea that the industry is not in the business of 209 00:14:25,760 --> 00:14:28,840 Speaker 2: promoting safety, and she said that she would love to 210 00:14:28,840 --> 00:14:31,560 Speaker 2: see, whenever she buys a cell phone, a pamphlet that 211 00:14:31,960 --> 00:14:33,480 Speaker 2: comes along with it that says how to keep your 212 00:14:33,560 --> 00:14:37,560 Speaker 2: children safe with this device. The key thing is to 213 00:14:37,560 --> 00:14:42,200 Speaker 2: not keep abuse secret. The less we talk about this, 214 00:14:42,920 --> 00:14:48,120 Speaker 2: the more the offenders have an advantage. They thrive on 215 00:14:48,800 --> 00:14:52,840 Speaker 2: the feelings of guilt and blame that a child may 216 00:14:52,880 --> 00:14:56,400 Speaker 2: have if they were tricked into sending a nude photograph. 217 00:14:57,080 --> 00:15:00,000 Speaker 2: That shame is really what gives them more power. 218 00:15:04,520 --> 00:15:06,560 Speaker 1: If you or someone you know has been a victim 219 00:15:06,600 --> 00:15:10,040 Speaker 1: of sextortion, you can get help. Email the National Center 220 00:15:10,040 --> 00:15:13,200 Speaker 1: for Missing and Exploited Children or call one eight hundred 221 00:15:13,320 --> 00:15:17,320 Speaker 1: THE LOST. Many thanks to Michael Keller and Gabriel Dance 222 00:15:17,360 --> 00:15:20,400 Speaker 1: from The New York Times. See our show notes for 223 00:15:20,440 --> 00:15:23,560 Speaker 1: a link to their article, The Internet Is Overrun with 224 00:15:23,640 --> 00:15:31,480 Speaker 1: Images of Child Sexual Abuse. What Went Wrong? Since we 225 00:15:31,560 --> 00:15:34,160 Speaker 1: spoke with Michael and Gabriel, Meta has been caught up 226 00:15:34,160 --> 00:15:37,800 Speaker 1: in controversy again. A recent investigation by The Wall Street 227 00:15:37,840 --> 00:15:41,320 Speaker 1: Journal and researchers at Stanford University and the University of 228 00:15:41,360 --> 00:15:45,800 Speaker 1: Massachusetts Amherst found that Instagram was helping to link predators 229 00:15:46,080 --> 00:15:50,040 Speaker 1: and people selling child sexual abuse material. Its algorithm connected 230 00:15:50,040 --> 00:15:54,120 Speaker 1: accounts offering to sell illicit sex material with people seeking it. 231 00:15:55,120 --> 00:15:58,120 Speaker 1: According to The Wall Street Journal, Instagram allowed users to 232 00:15:58,160 --> 00:16:01,680 Speaker 1: search for terms that its own algorithms know may be 233 00:16:01,800 --> 00:16:05,520 Speaker 1: associated with illegal material. And it's not like they were 234 00:16:05,560 --> 00:16:11,520 Speaker 1: hiding it. Instagram enabled people to search hashtags like hashtag 235 00:16:11,760 --> 00:16:15,520 Speaker 1: pedowhore and hashtag preteen sex, then connected them to 236 00:16:15,560 --> 00:16:20,920 Speaker 1: accounts advertising CSAM for sale.
If that wasn't troubling enough, 237 00:16:21,240 --> 00:16:24,800 Speaker 1: a pop-up screen warned users, these results 238 00:16:25,240 --> 00:16:31,000 Speaker 1: may contain images of child sexual abuse, and then offered users options. 239 00:16:31,680 --> 00:16:36,920 Speaker 1: One of them was see results anyway. Meta has set 240 00:16:37,000 --> 00:16:41,080 Speaker 1: up an internal task force to address the problem. If 241 00:16:41,120 --> 00:16:42,840 Speaker 1: you would like to reach out to the Betrayal team, 242 00:16:42,960 --> 00:16:45,880 Speaker 1: email us at Betrayal Pod at gmail dot com. That's 243 00:16:45,960 --> 00:16:50,080 Speaker 1: Betrayal Pod at gmail dot com. To report a case 244 00:16:50,120 --> 00:16:53,440 Speaker 1: of child sexual exploitation, call the National Center for Missing 245 00:16:53,440 --> 00:16:57,800 Speaker 1: and Exploited Children's CyberTipline at one eight hundred THE LOST. 246 00:16:58,400 --> 00:17:00,680 Speaker 1: If you or someone you know is worried about their 247 00:17:00,720 --> 00:17:04,119 Speaker 1: sexual thoughts and feelings towards children, reach out to Stop 248 00:17:04,119 --> 00:17:06,720 Speaker 1: It Now dot org. In the United Kingdom, go to 249 00:17:06,720 --> 00:17:10,840 Speaker 1: Stop It Now dot org dot UK. These organizations can help. 250 00:17:11,640 --> 00:17:14,080 Speaker 1: We're grateful for your support, and one way to show 251 00:17:14,119 --> 00:17:16,920 Speaker 1: support is by subscribing to our show on Apple Podcasts. 252 00:17:17,200 --> 00:17:20,560 Speaker 1: And don't forget to rate and review Betrayal. Five-star reviews 253 00:17:20,560 --> 00:17:23,240 Speaker 1: go a long way. A big thank you to all 254 00:17:23,280 --> 00:17:26,680 Speaker 1: of our listeners. Betrayal is a production of Glass Podcasts, 255 00:17:26,880 --> 00:17:30,240 Speaker 1: a division of Glass Entertainment Group, in partnership with iHeart Podcasts. 256 00:17:30,800 --> 00:17:33,639 Speaker 1: The show was executive produced by Nancy Glass and Jennifer Faison, 257 00:17:34,240 --> 00:17:37,560 Speaker 1: hosted and produced by me, Andrea Gunning, written and produced 258 00:17:37,560 --> 00:17:42,000 Speaker 1: by Kerry Hartman, also produced by Ben Fetterman, associate producer 259 00:17:42,280 --> 00:17:46,720 Speaker 1: Kristin Melcurrie. Our iHeart team is Ali Perry and Jessica Krincheck. 260 00:17:47,200 --> 00:17:50,000 Speaker 1: Special thanks to our talent Ashley Litton and production assistant 261 00:17:50,040 --> 00:17:54,159 Speaker 1: Tessa Shields. Audio editing and mixing by Matt Delvecchio. 262 00:17:54,240 --> 00:17:58,040 Speaker 1: Betrayal's theme composed by Oliver Baines. Music library provided by 263 00:17:58,040 --> 00:18:01,199 Speaker 1: MyMusic. For more podcasts from iHeart, visit the 264 00:18:01,240 --> 00:18:04,720 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.