Speaker 1: Welcome to the Tudor Dixon Podcast. Well, you may have heard that there is a landmark trial going on right now in LA that is hoping to hold the world's largest social media companies accountable for the harms that they've allegedly caused to teenagers and children. Now, historically these companies have claimed First Amendment protection. They've also hidden behind Section 230. But this lawsuit takes a little bit of a different approach that I find very interesting, because we've always kind of felt like they were untouchable. But I have my friend Dan Schneider here to break it all down with us. Dan is the vice president for Free Speech at the Media Research Center. Dan, welcome to the podcast.

Speaker 2: Thanks, Tudor, it's great being with you.

Speaker 1: I'm so excited to have you here. I saw you talking about this on Fox and I was like, I have to have him on, because I think we really have felt like we can't touch these social media companies. They're so powerful, and they've hidden behind Section 230, which is, I mean, I would explain it, but you can explain it in better terms. I would explain it that they say, well, look, it's not our content, so we can't be held responsible for what's on these platforms.

Speaker 2: Yeah, Tudor. First of all, I just want to say, if I sound like I'm ever defending or apologizing for Big Tech, please push back, because you know Google hates me, Facebook does not like me. These Big Tech platforms, I really see them as very pernicious and harming people. But at the same time, I can recognize that there are some really good products they create that I use. And as we analyze these cases, the plaintiffs do have some real challenges. So as I address those challenges, please don't think that I'm an advocate for Big Tech. I'm not, but we have to be realistic about what we face.
Speaker 1: No, it's interesting that you say that, because I have also been a person who, I guess people would say, well, you're kind of defending them as well, because I've said, look, I don't actually want the government telling me what my children can and can't do. I want to be in control, because they do create some good products. And I say that I'm backing up your point because I've had so many parents say to me, well, just have your kids go back to a landline, or make your kids use a flip phone. But that's not the technology today. They're going to eventually have to use this technology because this is where the world is going. I want them to learn it under my roof, with me involved. So I look at this lawsuit as kind of twofold. I think that there is an interesting message here that they've created a product that is addictive in a certain way to draw kids in, similar to how Big Tobacco created addictive products to draw people back to them. But I also think that there is a message that you have to be aware of what your children are doing when they're in your home, and you should also be monitoring that. So give us kind of a breakdown of what is the in-between of those two things.

Speaker 2: Yeah, two really important principles right off the bat. And here's one of the challenges that the plaintiffs face. There are actually two cases. In California, it's a private cause of action, essentially like a class action suit of individuals who claim to have been harmed. In New Mexico, it's the state attorney general, the state prosecutor, who's bringing a case. They literally both started on Monday, and they're both being prosecuted alongside each other. They both have similar schedules and similar witnesses, but different theories. But they're two relevant and connected cases, California and New Mexico.
In both cases, one of the big challenges that the plaintiffs or the prosecutor face is that, you know, they're claiming that the product was created to be addictive. Well, what manufacturer isn't trying to create a product that consumers want? True. What's the difference between creating a great product that people are drawn to and want to use and own and possess, versus one that is illegally addictive? That's a really hard line to prove. Like, you know, my wife loves Diet Coke. Well, Diet Coke wants to be loved. Diet Coke makes a product specifically so that people will want to buy it. Does that make it unlawful? No, it doesn't. So yeah, the plaintiffs and the prosecutor in New Mexico have to prove that this somehow crosses the line, that the product is so good that consumers can't resist it and it's designed specifically to harm them. That's a really tough case to make.

Speaker 1: Hm, it's interesting, because maybe part of it is also just our own guilt of allowing ourselves to get sucked in. And I am not making excuses, but I have certainly been that person who, what do we call it, like doom scrolling or something, where you just get sucked into one video after another, or you just start rolling through the posts that you find on Facebook, and then all of a sudden you look over and you're like, oh, I just literally lost half an hour that I could have been doing something else. And I think that there is also potentially some self-reflection that we have to do and say, okay, like I said, as a parent, you have to set limits. You have to say, I want you to stay off your phone, I want you to clean up, I want you to do something active, I want you to do this. You have to kind of be aware of what your kids are doing as well, right?

Speaker 2: Yeah. Look, it's a bag of potato chips.
Are you going to sit there and eat the whole bag, one chip at a time, and you can't resist, you can't put it down? Well, that's what Lay's and Fritos want you to do. It's not unlawful. The consumer ultimately has the choice of doing that. Now, the claim is that the products, you know, these Big Tech assets, Facebook and YouTube in particular, were designed specifically to harm. And it's true, these companies have hired psychiatrists, they've hired people who know how the human brain works. They've used technology that casinos use to try to, you know, hook people. But guess what, casinos are also legal. So this leads to the second real challenge that the plaintiffs have. They can't merely prove or show that Facebook and YouTube contributed to their mental illness or their depression or their suicidal thoughts. They've got to prove that it's because of YouTube and because of Facebook that they became addicted and they were specifically harmed by it. So there are a lot of these plaintiffs that may come from difficult home lives, or have sort of genetic predispositions, and that sort of thing. And how can they show that it's directly because of Facebook or YouTube that they have fallen into this depression or these psychological problems? You know, it's going to be very difficult for them to show that it's exclusively and only because of these outlets that they ended up being harmed.

Speaker 1: So it's interesting, I mean, that is essentially what Meta came out and said. They said, recently a number of lawsuits have attempted to place the blame for teen mental health struggles squarely on social media companies, but this oversimplifies a serious issue. Clinicians and researchers find that mental health is deeply complex and multifaceted, and trends regarding teens' wellbeing aren't clear cut or universal.
Narrowing the challenges faced by teens to a single factor ignores the scientific research and the many stressors that are impacting young people today. And it goes on. So essentially, what you're saying is, how do you pinpoint that this happened there? I do think that there are certain cases where they can say there were maybe instructional videos on there. I think in one of the cases there was a family that said, there was actually an instructional video on there telling my daughter how to commit suicide, and we ended up losing her. And I think in those cases there has to be some way that they're monitoring some of the content on their sites, right?

Speaker 2: Right. Okay, so Tudor, you've just hit the crux of this thing. These cases are designed as product liability suits, because when people have tried to sue these tech companies based on the content, this Section 230 law has been used by judges to block those suits. Now, I think those judges have misinterpreted the law. And this is where anybody who's thinking I'm shilling for Big Tech will understand I am not. I do not believe Section 230 provides this absolute immunity shield that some of these judges have created. By the way, the Supreme Court has never ruled on Section 230, the breadth and depth of it. Section 230 says that if you're one of these platforms, you know, and there's this specific legal term that was used. By the way, Tudor, this Sunday marks the thirtieth anniversary of the establishment of Section 230, the Communications Decency Act. It's thirty years old. Mark Zuckerberg was eleven years old when this thing was created. There was no Facebook, there was no Google. There was AOL, and there was CompuServe. So the law is a bit outdated already, and it uses arcane terms. But Section 230 basically said that if you're what we now know as a social media company, you're deemed not to be a publisher but a platform.
Except there are a whole bunch of exceptions. These platforms cannot allow people to post things that are illegal or criminal or abhorrent, and they list all of these things. Okay, well, when these tech platforms censor conservatives, not based on some terrible, you know, content that is illegal, I think they stop being a mere platform and in fact do become a publisher, and no longer get Section 230 protection. Or if they are allowing criminal conduct, child exploitation. Child exploitation is criminal, it's not allowed. But if they are permitting that, then they ought to be held accountable. And that's where the New Mexico Attorney General has said he's got some earth-shattering evidence showing that Facebook in particular was knowingly allowing criminal content to remain on the site. Look, if that's true, Facebook's going to be in serious, serious trouble. But as an attorney, and I've been involved in massive litigations in the past, so I have a little bit of familiarity with the challenges here, on one hand I see that the plaintiffs have a very difficult row to hoe, unless, of course, they've got the kind of evidence that the New Mexico AG claims to have.

Speaker 1: Let's take a quick commercial break. We'll continue next on the Tudor Dixon Podcast. You make an interesting point, because we do remember when we were all being ratcheted back or censored on social media. And actually, I know moms that lost Facebook pages and were kicked off Facebook completely. And that to me was a very critical moment, because then there was kind of this conversation of, well, is Facebook just a product, or has Facebook become kind of a utility? Because people were communicating on there just like they communicate through the phone. Parents are part of the school group, they're part of the soccer group. This is truly an information center, and you were getting kicked off.
So there was a part of us in the conservative movement who said, they can obviously monitor every post and hear everything and know every word. They have trigger words that they are choosing to kick someone off for. But then you would see that there is still child porn on there, and you would go, how can you possibly not be monitoring this? And even these instructional videos that we've heard about, instructions to commit suicide. I mean, that's against the First Amendment. You can't be going out there and giving guides to hurt people, even if it is hurting yourself. That shouldn't be allowed. So these are things where you have to kind of step back and say, well, what is the problem, that these social media outlets can't seem to get this stuff off? Now, I want to say, I realize these are huge information sites and they have a massive amount of information, but I'd like to hear from them: do you genuinely have a problem figuring out what's on your sites?

Speaker 2: They know exactly what's on their sites. And Tudor, let me address the next point in this whole debate, because some libertarians and some conservatives will say, hey, they're private companies, they can do whatever they want. Well, that's not true. Even before our nation was founded, we had common carriage laws. This is part of the common law in the United States of America. And there's only been one period in our nation's history where we walked away from common carrier laws, and that was in the infamous Plessy versus Ferguson case, where the Supreme Court for a time said, hey, railroad company, if you don't want to sell a train ticket to a Black man, you don't have to. You can discriminate based on race. Now, it is true, Tudor, that if some guy who sells bicycles doesn't want to sell a bicycle, you know, he's permitted to be a racist.
But if you're a common carrier, and there's sort of a broad definition, a common carrier is some corporate entity that is situated in the market such that if we allow that entity to discriminate for any reason, it disrupts the whole market. That's sort of what common carriers are. So AT&T, Sprint, T-Mobile, these are common carriers. If they were allowed to discriminate, the whole US economy would be disrupted. So AT&T, if you want to sign up for an AT&T account, they cannot deny you service under the law, because the Supreme Court reversed the Plessy versus Ferguson standard that permitted discrimination for any reason.

Speaker 1: But then, do you think that you couldn't be debanked?

Speaker 2: Yeah, that's right. Banks are probably common carriers. These Big Tech platforms certainly are; under Texas law they are common carriers, and I believe under just US common law they should be deemed common carriers. They should not be permitted to discriminate. But in a Supreme Court case that I attended oral arguments in, the so-called NetChoice cases brought by the attorneys general of Texas and Florida, Big Tech is trying to argue that they should be permitted to discriminate against Black people, based on people's faith, and based on their political viewpoint, that for any reason they should be able to discriminate. Okay, well, that's just returning to the Plessy versus Ferguson standard the Supreme Court obviously rejected decades ago. So no, Big Tech should not be able to discriminate against us based on our religion, our race, our ethnicity, or our political viewpoint. We should be protected from that kind of discrimination.

Speaker 1: Okay. So that takes me to someone else who I would say is also considered a common carrier in that case, and that would be Apple. And you're exposing Apple News right now.
And I say that because, if you think about your phone, you are either Android or you're Apple, and Apple is allowing you to get the apps, allowing you to get to Netflix, allowing you to get to all of these different others. They are carrying all of these other products, but they have their own news product on there. So you are a user of Apple. You have an Apple iPhone, you have an Apple iPad, you have an Apple computer. You are apparently only getting a certain type of news, aren't you?

Speaker 2: Yeah, Apple. We here at the Media Research Center refer to the big four news apps: Apple News, Google News, MSN, and Yahoo News. And these apps completely control information dissemination. People don't even realize it. You have an Apple phone, and about forty-five to fifty percent of American voters have an Apple phone, and Apple News is being fed to you. You probably don't even realize it. But you know, we've been tracking all of these news apps every single day for months now. Today is the ninety-eighth day in a row that Apple News has not published a single story from a right-leaning media outlet. In that same time, they have published 1,299 stories from left-leaning outlets. 1,299 to zero is intentional. It's on purpose, obviously. And you know what news has transpired in those several months. Well, you know, the Epstein files, the Trump administration, the president bombing Iran. We can go through all of these major issues, and people with an iPhone are only getting stories from the left, nothing from the right, literally zero. Google News is not much better. Maybe two percent of the stories that it promotes come from right-leaning outlets. But you know, this is obviously intentional, it's political. This is how Donald Trump's approval ratings are being driven down, when these outlets are probably propagandizing America.
And between the iPhone and every Samsung phone and, you know, any other phone operating with the Google operating system, they account for ninety-nine point eight percent of every phone in America. Ninety-nine point eight percent is either an Apple phone or a Google-powered phone like a Samsung. They are controlling information flow.

Speaker 1: It's a danger. But okay, so it's one thing for us to report on this. And I've seen this. I've actually seen this. I talked about it on Newsmax just last night. This is a conversation that conservative outlets are talking about right now. But of course, if you are in that world, then you've kind of had to seek out that information, because, as you can see, you aren't seeing conservative media if you are outside of the bubble. And I call us the conservative media bubble. So what can an organization like yours or someone else do, or one of these conservative outlets? Is there the ability to cite the case that you already cited and say, look, you are essentially breaking the law by discriminating and refusing to have conservative outlets in your realm of media stories?

Speaker 2: Yeah. So, the Federal Trade Commission, the FTC, and by the way, Donald Trump has picked an excellent chairman of the FTC. The FTC sort of has two broad areas. One is the antitrust, market share side: are consumers being harmed from a sort of market price perspective? The other is the deceptive trade practices side. Does Apple claim that it is not biased? Does Google claim that it's not biased? Are they deceiving consumers? When I bought this phone, I bought it believing that Apple was trying to be fair. I didn't know that they were not being truthful to me.
That's a misrepresentation, that's a fraud, that's a consumer harm the FTC can look into. On the market domination side, ninety-nine point eight percent of all these phones are either an Apple-powered phone or a Google-powered phone. And Apple has a subscription service. Well, is Apple engaged in an illegal tying arrangement, where if, you know, the New York Times doesn't pay Apple to be part of the subscription service, Apple will then not push Apple News stories from the New York Times? That could be an illegal tying arrangement. In addition, are these companies actually engaged in political activity? Is this politically motivated? And if so, do they actually have to file these as donations with the Federal Election Commission? And if they've not been filing, that's a violation of law. So there are a number of things that this administration and Congress can look at.

Speaker 1: Well, okay, so that takes me to Wikipedia. You wrote an op-ed about Wikipedia, which I think is a very similar situation, but I think it's even more nefarious. But you can argue that point with me if you want. You've written that when Wikipedia came out years ago, the message to consumers was kind of like, this is a neutral site where you can add your history and we're all kind of sharing. It's like an encyclopedia online where we all share information, and then there are some fact checks, but it's pretty neutral. That was how it was presented. Then there was like the big switch, the bait and switch, and it became a propaganda site. And it's pretty well known by conservatives, but I don't think by the general public, that this is really progressive propaganda. And you can't change it. So, like my own page on Wikipedia says horrible things that are legitimately not true, but they've had these leftist media sites write these things up, and you can't change it. I'll give you an example. When I was in the media years ago,
you know, you get these stories that your producers give you, and our producers gave us a story on a woman who had decided to cancel herself. The story was on, should people self-cancel? It was during cancel culture. Should people self-cancel? The woman had done something when she was sixteen. She had one of the largest YouTube pages in the country, and she had just, like, erased her YouTube page. And she was twenty-one, and she came out and she said, something came up that happened when I was sixteen, and I've decided to self-cancel. At the time, no one knew what she had done at sixteen to cause her to cancel her YouTube page. So we talked about it on the news and said, you know, there should be grace for people who make a mistake when they are sixteen, and this is their business, and they shouldn't have to self-cancel. Well, when I ran for office, and at that point it was years later, it had come out that people had accused her of putting makeup on and using blackface. She claimed that wasn't her intent, but she ended up canceling her page anyway. A news article ran saying Tudor Dixon defended this action. And we went to Wikipedia and said, this is what it says on my Wikipedia page, and that's not even true. It's not even close to true. They say you can't change it, because it's fact now that it's written in a media article, and you didn't get the article changed at the time, so you can't change this page saying that you're this horrible person.

Speaker 2: Yeah, well, let me give you another example, and then I'll sort of delve into how Wikipedia is just wrong. Pete Hegseth, of course, is now our Secretary of War.
Well, there was a Wikipedia page for him before he was nominated, before he was announced, and it listed all the awards that he had earned as an elite soldier in our US military, and his record. And you know, he has a very impressive record in our services. The day after he was announced, not only did Wikipedia literally delete many of the awards that had been awarded to him, they also took down the pictorial icons of those awards and put them at the end of his page. They were trying to diminish his actual service. What's the difference between Pete Hegseth on day one versus day two? Well, on day two, he's a Trump-announced nominee, and Wikipedia then did everything they could to try to take him out. This included lying about his record, by removing awards that he had earned on the battlefield. It's a lie; they're trying to suggest he wasn't given those awards. Now, Wikipedia tries to get Section 230 protection by claiming that it's not a publisher, that it's a platform. And you might scratch your head and think, well, no, they're editing their stuff all the time, it's a publisher. Well, what they'll do is say, well, they're volunteer editors. They're not.

Speaker 1: They only choose who they are.

Speaker 2: Well, not only do they choose who they are, they also select them and train them. So we just covered and exposed how they've got this massive campaign through what they claim is an independent not-for-profit called Wiki Education, but Wikipedia spun it off. It had been an internal project at Wikipedia; the same people who were in Wikipedia are in Wiki Education. Wiki Education goes onto thousands of college campuses, working with radical left-wing professors to have those professors, as part of the curriculum, train people on how to edit and write things for Wikipedia.
And so then, which professors and which classrooms are doing this? Well, the radical classrooms: the African American studies, the LGBTQ studies, the feminist studies programs. They're the ones that are working with Wiki Education to train students on how to become editors of Wikipedia. And then they're the ones who go through tens of thousands of Wikipedia pages to edit them to conform to the radical agenda. I mean, this is a well-crafted, well-tailored program. Again, it's purposeful. Wikipedia is trying to radicalize the whole world.

Speaker 1: Well, they also have blacklists, though. I mean, even if I were to go to them and say, hey, this is a new article that says that this is not who I am and that this was a lie, if it comes from the Heritage Foundation, if it comes from Breitbart News, if it comes from even the Washington Examiner, they're blacklisted altogether. You can't use an article from there, right?

Speaker 2: Right. They've got a very sophisticated blacklisting, blackballing program. There are actually three tiers of this. The first tier is if you're an outlet like Breitbart: if somebody tries to offer an edit to Wikipedia citing it, the edit isn't even looked at or reviewed. It is automatically rejected algorithmically. The second tier of the blacklist is, well, you can try to offer an edit, but once an editor looks at it, the editors are instructed to reject it. And then the third tier of the blacklist is, if you offer this edit, and there's any other publication anywhere in the world that has addressed the general topic, then Wikipedia will use that, and will only use the recommended outlet if there's no other publication out there.
So, Tudor, even if there had been a media outlet that explained the truth behind what you had done, and it was on that third tier, if there's any other outlet at all that had written on you, like the Detroit Free Press or something, Wikipedia would only be able to rely on that and not the alternative source. And this is all designed to harm people like you, Tudor.

Speaker 1: Let's take a quick commercial break. We'll continue next on the Tudor Dixon Podcast. I'm one person that ran for office, and I did run for office at a time when I thought that there was a left side and a right side, and I thought people had their ideals on one side and their ideals on the other side. And this was America, and anybody who decides they want to serve the country can run for office. And when I got into it, I was like, wow, this is so much different than I thought, and so much different for people on the right. Because, first of all, you have to have a massive amount of money to try to fight these things. But we are at a totally different level than the left. I mean, if you are watching the midterms at all right now, you will not see negative news on leftist candidates. Even that guy Jay Jones, who was talking about wanting to watch the children of the Speaker of the House in Virginia die, got almost no coverage whatsoever. It's shocking to me what we are up against. And then when you start to... I mean, thank goodness for organizations like yours.
And that's why I think people should follow you and follow what you're doing at the Media Research Center, because without us following what you're doing, and without this story that you've broken on Apple News, we don't even get it. You get so caught up in the day-to-day of taking kids to soccer and getting to work and all of it, making lunches and dinners and everything that you focus on for your own life, that you don't realize what you're being fed is a bunch of lies.

Speaker 2: Yeah, when we did this Apple News story, one of these left-wing trolls came at me and said, well, you know, what would a conservative ever do if you won this, you know, if you didn't have anything else to whine about? To which I responded, well, we would provide for our families, we would worship our Creator, you know, and we would try to be productive citizens. What would a liberal do? Because liberals don't do this. Liberals simply try to destroy anything that's good and holy. They try to upend everything. They think that chaos is the norm, and that anything that is stable is somehow harmful. I mean, this is the world where it's no longer Democrat versus Republican, it's pro-freedom versus anti-freedom.

Speaker 1: You're so right, because it's not just that this stuff stirs people up and gets them to vote a certain way. But now you see these people that are, there's not a nicer term for them, the useful idiots that go out and protest. And the protest creates this visual that then allows jerks like Eric Swalwell to ask the ICE director, or the agent who was testifying at the hearing, are you going to resign? And then the other woman who said, do you think you're going to hell? I mean, these political stunts are so humiliating to people who are actually serving and trying to keep Americans safe.
But I just want to say, I appreciate that you bring this to light, because then people can kind of sit back and go, oh, these stories aren't true. These stories about conservatives aren't true. These stories about the administration aren't true. But it's a daily fight. So I appreciate that it's not just that you are worshiping your Creator and providing for your family, but you are also fighting for us every day. Dan Schneider, thank you so much for coming on today.

Speaker 2: Thanks, Tudor.

Speaker 1: Thank you. And thank you all for listening to the Tudor Dixon Podcast. For this episode and others, go to tudordixonpodcast.com, the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And you can always watch us on Rumble or YouTube at Tudor Dixon. But make sure you join us, and have a blessed day.