Speaker 1: Big Tech has figured out their next move, and that is to make sure, after the inauguration, that you and what you believe in disappear, that you can be silenced no matter what, that you can be shut up no matter what, that you won't be able to talk or have a platform. The former Facebook security chief actually this week called on the cable companies to start silencing you, saying that you are the problem, that these networks and what they say are the problem, and that if we can just start talking to these networks and get these networks to silence you, then everything will be okay. But the world post-Trump is not about Trump anymore. It's about silencing you. Take a listen to the guy who is the former Facebook security chief calling for Verizon, AT&T and others to deplatform actual news networks: OAN, Newsmax and, you guessed it, Fox News.

Speaker 1: ...and on the capability of these conservative influencers to reach these huge audiences. There are people on YouTube, for example, that have a larger audience than daytime CNN, and they are extremely radical and pushing extremely radical views.
Speaker 1: And so it's up to the Facebooks and YouTubes in particular to think about whether or not they want to be effectively cable networks for disinformation. And then we're going to have to figure out the OAN and Newsmax problem. You know, these companies have freedom of speech, but I'm not sure we need Verizon, AT&T, Comcast and such to be bringing them into tens of millions of homes. Allowing people to seek out information if they really want to, but not pushing it into their faces, I think, is where we're going to have to go here.

Speaker 1: But by the way, these networks are not pushing it into anyone's face. This guy's an idiot. I have the choice to turn the channel away from OAN, away from Fox News, away from Newsmax, and away from the threat that he's talking about. And you notice he didn't give any facts. He said they're extremists; they have the right to say it, but they shouldn't have a platform to say it on. These people have bigger platforms on YouTube than daytime audiences on CNN.
Speaker 1: Well, then we've got to shut them down and silence them real quick, right? So now we're going to take this next step in the silencing of people that you disagree with. Remember Parler and the CEO of Parler? He had his story that he wanted to tell. The only person that's been willing to really let him tell his story about what happened to Parler and how it disappeared was Mark Levin on his TV show Life, Liberty & Levin on Fox News. He had on the CEO of Parler, and he talked about how there was literally no indication a shutdown was looming until everybody ganged up on him on the same day, after they'd built this company and put everything into it and it was the number one trending app on Apple. And then they said, oh, now we've got to nail them. Now we've got to shut them down, because these are conservatives, these are conservatives that need to be silenced. We've got to shut them down. We've got to make sure that there's nothing you can say anymore that we can't control. Here's part of that interview.
Speaker 1: It was amazing, explaining just how Big Tech came after Parler to silence them, as they were a threat to Facebook and Twitter and the liberal ideals of Silicon Valley, and that kind of insanity.

Speaker 1: It's very important that the public get to know you, and so I'm prepared to have a long-form interview with any or all of the CEOs of these major companies next Sunday. Just contact us at Fox and we will set it up, and I'd be happy to do it. It'll be very civil, but it won't be the kind of interviews you're used to in the Washington Post and the New York Times, that I can assure you.

Speaker 1: And by the way, if you want to know why Levin said what he said there, his point is: we're not hiding from you, we're not afraid to talk to you, we're not afraid to debate ideas. The reason why I played the beginning of what Levin said there is because it's important. What Levin is saying is: we can have a debate. We're not afraid of you, we're not trying to silence you. Yet you're trying to silence us. And the question is why? Why are you trying to silence us? We're willing to have a debate with you.
Speaker 1: We're not going to just let you walk all over us. Of course not, right? We're not going to let you walk all over us. We're not going to let you just, you know, get away with murdering free speech in this country, which is what they are doing right now. They are murdering free speech. But the point that he was making there is: we are not afraid to have a dialogue, to have an open discussion. We're not afraid of having a debate. We're not afraid to talk it out with you. We're not afraid to have a conversation about what we believe in and our ideals. We're not afraid of having that conversation with you. Now, listen to this CEO of this company, who was trying to live the American dream, taking on the big boys of Big Tech and finally succeeding after years of not winning, and how fast they took it all away from him, I assure you.

Speaker 1: Now, John Matze, Parler. Parler started about two, two and a half years ago, and as usual, it's fledgling and so forth, and yet you're picking up a lot of followers. Tell us how your business model was taking off.
Speaker 1: You know, in twenty nineteen we had, you know, a few hundred thousand accounts, I think, by the end of twenty nineteen, and by the end of twenty twenty we had, you know, fifteen-plus million accounts. We were growing. You know, in January, on some of the last days before we got the axe from Amazon Web Services, we had almost a million new accounts created. On that last day, we were number one on the App Store. We were above Facebook, we were above TikTok, we were above YouTube, above Instagram, above every app on the App Store in the United States. We were number one before we got the axe. And the business model was proving, you know, it was working. You know, our ads were not intrusive. We were not using data to, you know, kind of predict people or mine people's data. We were presenting ads in a very, what I like to describe as, humane way, so that we were, you know, doing what I think is best for ads, which is respecting people's privacy.
Speaker 1: We were making a tremendous amount of revenue from organic small businesses and helping them out. And so we proved our model, we proved our growth in the marketplace. And so the marketplace competition, entrepreneurship, was just too much for Silicon Valley to handle.

Speaker 1: So, as I read it, they colluded. Every avenue of support to your business, in order to run it, in order to reach out to potential customers and so forth, was cut off in order to choke you off and destroy you. These are companies that had no problem doing business with communist China. They have no problem doing business with all kinds of genocidal dictators and so forth.

Speaker 1: And by the way, Mark Levin is right, based in fact here, with what he just said there. He's absolutely right. He's spot on with what he just said. You know, you listen to what he just described. These companies have no problem doing business with China, doing business with dictators. But all of a sudden they decided to collude.
Speaker 1: They decided to collude together because they found somebody's viewpoint offensive, yours and the voters of Donald Trump's, and they saw this platform that was all of a sudden starting to have influence. All right, I want to take a moment real quick, though, and just let you know about something else, and that is the canceling of conservatives from social media and our accounts being silenced, our accounts being shut down. If you want to keep in touch with us, there are two ways now you can do it, even if we're kicked offline on social media, which we fully expect is going to happen. First off, you can send a text message on your cell phone, like a normal text message, to the number five five four three three, okay, and text the word Ben to five five four three three. That's our last resort for us to be able to get in touch with you. So again, text the word Ben to five five four three three and we can get in touch with you that way. Second thing, you can get our emails.
Speaker 1: And I have a feeling that you won't be able to find us on social media pretty soon, based on what we're seeing. So you can get our emails by signing up right now for AMAC, the number one conservative organization in the country, by going to ben free online dot com. That's ben free online dot com. All you've got to do is go to ben free online dot com right now and you can join AMAC for free. You'll get the AMAC magazine for free, and you'll get our emails so that we can keep in touch with you that way. So make sure you join AMAC for free right now and get our emails: ben free online dot com. That's ben free online dot com. Or, as a last resort, on your cell phone text the word Ben to the number five five four three three. Parler's CEO explained his story of how they were shut down with no warning, with all these companies doing it at the same time to make sure they couldn't survive, explaining just how fast the legs were cut out from under them, how everything just disappeared on them.
Speaker 1: He said: we were in a situation where we had no idea what was coming, right? We had no clue that this was about to happen. We don't even have phone numbers to get in touch with people. They were silencing us. Take a listen.

Speaker 1: ...above YouTube, above Instagram, above every app on the App Store in the United States. We were number one before we got the axe. And the business model was proving, you know, it was working. You know, our ads were not intrusive. We were not using data to, you know, kind of predict people or mine people's data. We were presenting ads in a very, what I like to describe as, humane way, so that we were, you know, doing what I think is best for ads, which is respecting people's privacy. We were making a tremendous amount of revenue from organic small businesses and helping them out, and so we proved our model, we proved our growth in the marketplace, and so the marketplace competition, entrepreneurship, was just too much for Silicon Valley to handle.
Speaker 1: So, as I read it, they colluded. Every avenue of support to your business, in order to run it, in order to reach out to potential customers and so forth, was cut off in order to choke you off and destroy you. These are companies that had no problem doing business with communist China. They have no problem doing business with all kinds of genocidal dictators and so forth. But little Parler was getting a little too big for its britches. Why do you think, why do you think they focused on you and targeted you?

Speaker 1: Well, a few reasons, but, you know, in my opinion, it seems our growth was very high, and we didn't agree with their ideas on, you know, censorship. You know, Google, Apple and Amazon Web Services all agreed that our terms of service were acceptable. They never said publicly, or anywhere as far as I'm aware, that there were any problems with our terms of service and free speech. So instead they seemed to make it about violence, which we don't condone and don't allow. You know, we don't allow violence or insurrection.
Speaker 1: These are illegal acts, right? So it's very, very interesting that they all, on the exact same day, without previously indicating anything (they never indicated to us that there was any serious or material problem with our app), on the same day, you know, all on the same day, sent us these very threatening notices. So we said, okay, let's see what Google said. Oh, they actually never emailed us, and, you know, we have no way to contact them. Okay, so Google's out. Apple, let's call a rep. And we called a rep, and they basically shrugged it off and made no indication that this was deadly serious, despite their letter, their email, being very serious. And Amazon, you know, as usual, basically saying, you know, oh, I never saw any material problems, there are no issues.
Speaker 1: You know, they played it off very nonchalantly, and so still, even, you know, on the eighth and the ninth, you know, we had no real indication that this was deadly serious.

Speaker 1: Had no indication that this was deadly serious, no indication at all, and then, just like that, bam, everybody shuts them down. Which brings me back to this former Facebook security chief calling for Verizon, AT&T and others to deplatform legitimate news networks like OAN, Newsmax and Fox, on CNN. This is CNN. I mean, this is CNN saying, well, our competitors don't really deserve to have a voice. They just don't. They shouldn't have a voice. Here's the former Facebook security chief calling for Verizon, AT&T and others to deplatform them, because they're just such a problem here, you know.
Speaker 1: Brian Stelter at CNN says, you know, hey, is it possible, Alex Stamos, there will ever be a solution to this disinformation crisis that has been perpetuated, in my view, by platforms like Facebook as well as Twitter and others? Because you criticize, and then of course they're going to respond, right? They're going to respond and say, oh, we're so sorry. We're so sorry. Is there ever a chance that we could ever get out of this? Because it's so bad now.

Speaker 1: ...crisis that has been perpetuated, in my view, by platforms like the one where you used to work, Facebook, as well as Twitter and others. It's really hard, because what's happening is people are able to seek out the information that makes them feel good. That is what's happening. You know, people have so much choice now. They can choose what their news sources are, they can choose what influencers they want to follow, and they can try to seal out anything that helps them question that.
Speaker 1: And I think that gets to a really core issue with how our freedoms as Americans, and the way we've treated press freedom in the past, are being abused by these actors, in that we have given a lot of leeway, both in the traditional media and on social media, to people to have a very broad range of political views, and it is now in the great economic interest of those individuals to become more and more radical. And I think one of the places you can see this is in the fact that you now have competitors to Fox News on their right, OAN and Newsmax, which are carried by all the major cable networks, who are trying to now outflank Fox on the right. Because the moment Fox introduced any kind of realism into their reporting, immediately a bunch of people chose to put themselves into a sealed ecosystem. And they can do that both on cable and online, and that becomes a huge challenge of figuring out how do you bring those people back into the mainstream of fact-based reporting and try to get us all back into the same consensual reality. And can you? Is that possible?
Speaker 1: It seems that that's an open question. It's hard. I mean, I think we've got to do a couple of things. One, there needs to be intentional work by the social media companies, collaborating together, to work on violent extremism in the same way they worked on ISIS. When I started at Facebook in twenty fifteen, the number one challenge from a content perspective was the abuse of social media by the Islamic State, and there was a collaboration between the tech companies, and between the tech companies and law enforcement, to make it impossible for them to use the internet to recruit and radicalize, mostly young Muslim men at the time, around the world. Now we're talking about a domestic audience in the United States.

Speaker 1: You do realize what just happened there? You just had a guy who actually said, you just had a guy who actually said that we are no different than al Qaeda. We are no different than al Qaeda. I want to make sure everybody just heard what I just heard. Please tell me that everybody just heard what I just heard.
Speaker 1: That we are domestic terrorists, that we are just like al Qaeda, that we are no different, that we are the same thing, and so therefore we should be silenced.

Speaker 1: And the challenge is going to be, partially, that, you know, ISIS did not have a domestic constituency in the United States Congress, but over half of the Republicans in Congress voted to overturn the election, and there will be continual political pressure on the companies to not take it seriously. So I think...

Speaker 1: So now you have terrorists in Congress, just like ISIS, right? No, it's worse, because you have Republicans. You have Republicans that are like ISIS that are in Congress. Folks, this is the former head of Facebook security saying this on CNN about you. I'll say it again: they're coming after all of you, if you are a conservative, every one of you. If you want to keep up with us, make sure, if you have a phone, that you send us a text right now so that we can keep up with you no matter what happens. Text the word Ben to five five four three three. That's five five four three three.
Speaker 1: Text the word Ben to five five four three three, and you can get our text messages so we can always keep up with you. And I'll leave it at that.