Buck Sexton: Censorship on social media is out of control, with companies like Twitter, Facebook, and Instagram eliminating prominent conservative voices from their platforms. It's an issue that many on the right have been discussing for years. Now, what can be done to fight back? In this special edition of Hold the Line, we'll talk to several tech entrepreneurs and lawmakers who are challenging the big tech social media monopoly by building alternatives for those who are being silenced.

Welcome to this special edition of Hold the Line. I'm Buck Sexton. If we lose the big tech battle, we will lose all the political battles to come after it. They control the spread of information. They control the pipes of the dissemination of data and facts and figures in such a way that they can shape your perception, everyone's perception, about reality. And they've done this for partisan ends. We know this now beyond any reasonable doubt. They used to claim that it was just an accident, that they would occasionally ban a conservative or suspend one for sharing a not-PC thought, or for sharing something that was considered too dangerous to the ruling Democrat apparatus and, of course, the big tech oligarchs sitting atop it. But now we know; now we've seen what the reality actually is. Here's just a little flashback to October of twenty twenty, when Jack Dorsey, former CEO of Twitter, was saying that there was no evidence that the New York Post story on Hunter Biden's emails was disinformation. Watch this one.

Senator: Okay, for both Mr. Zuckerberg and Mr. Dorsey, who censored New York Post stories or throttled them back: do either one of you have any evidence that the New York Post story is part of Russian disinformation, or that those emails aren't authentic? Any indication that they're not authentic, or that they are Russian disinformation? Mr. Dorsey?

Jack Dorsey: We don't.

Senator: You don't know. So why would you censor it?
Why did you prevent that from being disseminated on your platform, which is supposed to be for the free expression of ideas, and particularly true ideas?

Jack Dorsey: We believed it fell afoul of our hacked materials policy. We judged...

Senator: They weren't hacked.

Jack Dorsey: We judged in the moment that it looked like it was hacked materials.

Senator: You were wrong.

Jack Dorsey: ...and we updated our policy and our enforcement within twenty-four hours.

Buck Sexton: They were wrong. It wasn't hacked material. It was true material, and it was right before an election, and it might have actually changed the course of that election. Big tech now is clearly, openly, a Democrat super PAC in effect, and that's the way they actually operate. They have an understanding now of the amount of power that they wield, and this is not going to be something we can ignore any longer. You can care a lot about the border, about defund the police and crime, and you can care about spending and COVID policy and inflation, all these things. But if the lib-left Marxist social media platforms are able to continue to control the spread of information, to shut down voices that are speaking the truth or that are critics of the regime, we're never going to be able to win at the national level again. Those are the stakes, which is why the Democrats are now all in on censorship. They realize they want to use this weapon, this tool of information suppression, as long as they possibly can, because it is enormously helpful to their power and their aspirations for more of it. This was just in July: the President of the United States, Joe Biden, was actually saying that information on Facebook that the regime doesn't approve of is literally killing people. Watch.

Reporter: What's your message to platforms like Facebook?

Joe Biden: They're killing people. I mean, they're really... Look, the only pandemic we have is among the unvaccinated, and they're killing people.

Buck Sexton: That's not true, by the way, as you know. One of the dumbest lines ever uttered by any president: a pandemic of the unvaccinated.
No, it's not. That is a lie. They were wrong, anyway. Jen Psaki, the White House Press Secretary, really the propagandist of the Biden regime, says that you should be banned from all social media platforms. Let's take a step back for a second here. The people that are supposed to be in charge of disseminating information to the public, so citizens can make informed decisions, so that this republic can stand with citizens who are informed enough to hold their government to account and to make wise decisions about who they elect: meanwhile, the people in charge are actually saying, no, no, we want to shut down anybody that gets in our way. We want to censor them, we want to silence them. This was just in July, Jen Psaki saying you should be banned from all social media platforms, not just one.

Jen Psaki: A couple of the steps that we have, you know, that could be constructive for the public health of the country are providing for Facebook or other platforms to measure and publicly share the impact of misinformation on their platform and the audience it's reaching; also, with the public, with all of you, to create robust enforcement strategies that bridge their properties and provide transparency about rules. You shouldn't be banned from one platform and not others for providing misinformation out there; you should be banned from all of them.

Buck Sexton: You see, they are open about their desire for censorship. The left controls the social media platforms. They run them, and they expect them to be their allies, and they have been. We need to do something about it, or we can just continue to hear stuff like this from a member of Congress, Pramila Jayapal of Washington, who is saying even banning conservatives isn't enough. Watch.

Pramila Jayapal: It's no secret that our social media companies have been part of this, their algorithms promoting disinformation, and I think that these steps are important, but frankly, a little too little and a little too late. The reality is it's not just Marjorie Taylor Greene.
All over Twitter, social media, Facebook, all of these companies have been using algorithms that are just about clickbait, not about truth. And so if we are going to take on the disinformation that's out there, the big lie, and everything else that goes with it, then yes, this is a part of it. But it's got to be much, much more.

Buck Sexton: Amazing, isn't it? Only the approved narrative is allowed. Agree with the people in power, or they will shut you down and silence you and punish you. Or we can fight back. We can actually build our own unsinkable aircraft carriers of free speech. And that's what we want to talk to you about here on this special edition of Hold the Line. We've got a great lineup of guests coming up, and after the break we'll talk to Jason Miller, whose social media platform GETTR is poised to give Twitter a run for its money. Stay with us.

The twenty sixteen presidential campaign was arguably the first national race in which Twitter played a central role. By the time the general election rolled around, then-candidate Trump had amassed a following of over ten million people. The social media megaphone allowed him to bypass hostile news outlets and bring his message directly to potential voters. As you already know, President Trump was banned from Twitter in January of twenty twenty-one, a move that stunned the political world and left him without the most important communication tool in his arsenal. Joining me now is the former senior advisor to the Trump twenty twenty reelection campaign and the CEO of GETTR, Jason Miller. Jason, great to see you.

Jason Miller: Buck, good to be back with you.

Buck Sexton: So let's just talk about what it was like. You were on the front lines of President Trump's twenty sixteen campaign and in twenty twenty. But let's talk about twenty sixteen for a moment, and the enormous win that he had that so many people said was impossible. How critical was Twitter? I mean, what role did Twitter play, and the President's usage of that tool, in his election and his defeat of Hillary Clinton?
Jason Miller: Well, it was very important, both Twitter and Facebook and, let's say, all of social media. And what I've always said is that President Trump's superpower, really, has been his ability to bypass traditional media and go directly to voters. And so in twenty sixteen, that was part of the way that President Trump was able to sneak up on the elites or the media or a lot of folks who didn't see Trump coming, because he was able to talk directly to people, and they couldn't shut him down by saying, here's negative coverage in the failing New York Times or fake news CNN or any of the traditional news sources. What's different, or what changed in a lot of ways over twenty twenty, is that they were ready for President Trump and for our team, and so they started to change the rules of the game, whether it be the warning labels that they would put up, whether it be the shadow banning, or changing the algorithms. All these different things made it much more difficult to connect directly with people. And one of the things that I think is important to know is that Dorsey and Zuckerberg both got a lot of heat from the left, particularly the progressive left, who blamed them for essentially helping President Trump win in twenty sixteen. They say, if you darn kids hadn't gone and created these platforms, then we wouldn't have ended up with Trump. And so I think that's part of the reason why you saw such a backlash from Dorsey and Zuckerberg heading into twenty twenty: they were trying to make amends for what their platforms had done in twenty sixteen.

Buck Sexton: How has the issue of Twitter's crackdown on conservatives progressed over the years? I mean, Jason, I'm old enough to remember when Twitter claimed there was no political bias or politicized decision-making about who was being either shadow banned or outright suspended. What's changed?

Jason Miller: Well, I think a couple of things have changed.
Number one, they've decided that they want to use their platform to help frame the world as far as what the issues are and how people look at things. This is no longer kind of the altruistic free speech platform. This is very much about the fact that they have a particular ideology and they want people to abide by it. And so you take a look at, say, for example, Google and YouTube, the way that they now put warning labels on if you talk about climate change in a way that they don't agree with. You take a look at the way that Twitter will go and actually ban people, or sentence you to digital jail for a short amount of time, simply for saying something about vaccines or COVID that they don't agree with. And that's part of the reason why we ended up getting Rogan. You had Dr. Robert Malone, you had Marjorie Taylor Greene get kicked off, and that rolls up into helping us get Rogan, and then ultimately Tucker Carlson yesterday. But Buck, I think if we kind of step back here and take a look at it, President Trump, I think, was right when he said last year: if they're willing to do it to me, if they're willing to kick me off, they'll do it to you in a second. Now that we've seen how they've kicked off President Trump, the guardrails are really off; the big tech social media companies are unencumbered in what they want to do next.

Buck Sexton: Is President Trump depending on an alternative being available if he should choose to run again in twenty twenty-four, in your estimation?

Jason Miller: That's a great question. I don't know if he's necessarily depending on it, but an alternative will be absolutely necessary. A critical requirement, if he wants to have a chance to win in twenty twenty-four, is to have this strongly developed alternative platform. Because the fact of the matter is, the big tech social media companies,
if they even allow his new platform, or some new platform that he's on, to be in some of the app stores, they're going to look for every excuse and opportunity to try to shut him down. They will say the future of human civilization is at stake. They'll say asteroids are going to crash into the earth if we allow Trump back on Twitter or on Facebook. I think it's already clear that the bans from those platforms are going to be permanent. But I think where the rubber is really going to meet the road is when he launches his own platform, and to what extent that's allowed to thrive and operate, or whether they just look to shut it down right away.

Buck Sexton: What happens, by the way, for others? You know, when you can ban a president, you can obviously ban anybody, right? So for a lot of other people, including prominent conservatives, on social media platforms, Twitter and Facebook and others, it would be even harder, you would think, to try to get some answers here. What happens if you get banned right now from one of these social media giants and you're a conservative and you say, hey, you know, I want my day in court, so to speak? Without necessarily getting into the legal aspect, though I know that's a part of this too for some people, my friend Alex Berenson among them. I mean, is there like a Twitter get-out-of-jail court you go to? Who do you talk to? It feels like it's all nameless and faceless, but yet immensely powerful.

Jason Miller: Well, they do have an appeals process. I couldn't tell you the inner workings of how it truly functions, for the simple fact that they keep that very tight and very closely guarded, because, again, there is a lot of unevenness, I think, in the way their standards are applied. I mean, just take a look at the fact that you have the Taliban and the Ayatollah all on Twitter, but no President Trump. And I'm pretty sure President Trump has never killed anybody.
You can't say the same thing of Hamas, the Ayatollah, and the Taliban. They've killed lots of people.

Buck Sexton: Senator Rand Paul, by the way, speaking of people kicked off: politician, big public figure, well known. He explained why he was quitting. We've talked about Twitter a lot; YouTube, obviously, is owned by Google, which is a subsidiary of Alphabet. Here he is saying he left YouTube over censorship.

Rand Paul: You know, I'm tired of censorship. You know, they say Mark Twain used to say everybody's complaining about the weather, but nobody's doing anything about it. Well, everybody on the right complains about social media and their censorship. Well, do something about it. Let's quit. So I'm no longer going to let some punk, some snot-nosed kid over at YouTube, decide that a speech that I gave on the Senate floor is not appropriate, or that when I say cloth masks don't work, because I'm trying to save lives, because if you go in there and your grandparents are wearing a cloth mask, you're going to get infected. If an eighty-year-old is taking care of their wife and they're wearing a cloth mask and their wife has COVID, they're going to get infected, because the cloth masks don't work.

Buck Sexton: You don't say. Jason?

Jason Miller: Well, that's right. I hope that we can get Senator Paul over on GETTR. He certainly would be a welcome addition and a strong voice. I think about a dozen or maybe fifteen senators have joined the platform. But he's right. I mean, at this point, we've got to look and say, where are additional places to go and make our voices heard? You know, one of the things, Buck, is I never tell people, though, that they should go and give up another platform. My philosophy as the CEO of GETTR is that I have to make GETTR a desirable place for people to be. I need to make it so people want to be on GETTR. Yes, at a certain point people just start to give up the big tech platforms on their own.
But I want to lead with our strengths and our attributes and then pull people in that way.

Buck Sexton: Certainly a lot of people are saying, hey, we need to quit big tech altogether. We want to talk about going on offense here when it comes to free speech, and we'll have more with the CEO of GETTR, Jason Miller, in a moment.

The past few years have demonstrated the importance of social media to the national political discussion and races all across the country. So with Twitter's crackdown on conservatives in full swing, what can we do to take back power from the big tech giants? In July of last year, Jason Miller and his team launched GETTR, a social media app designed to directly compete with the Twitter monolith. Joining me now, once again, the CEO of GETTR, Jason Miller. Jason, thanks so much, man. Now let's talk about offense.

Jason Miller: Yeah, absolutely. So we're taking the fight right to big tech, and I think, Buck, at the beginning of this there were a couple of philosophies that I laid out for our team. Number one, I think for too long we've been told, if you want to have some alternative platform, you go develop it, it's just not going to be as good as big tech. I tell my team every single day: I don't care how long Twitter has been around, how long Facebook has been around. We're not judged on the fact that we're only six months old. We're judged against what the current user experience is on these other big tech platforms. They've had a big head start on us, but we have to catch up. That's why, right now, I'd say we're a marketplace competitor to Twitter and Facebook. When we launch Vision, which will be our short video format that competes directly with TikTok and with Instagram Reels, and we're launching that in February, that's going to make us competitors to them and help with a whole new demographic of younger folks coming onto the platform.
And then there's where we go next, when we launch GETTR Pay, our two-coin ecosystem with essentially a stablecoin and a fluctuating coin. We launch that this summer. We're going to go places where none of these social media platforms have gone before, and as we get into decentralized finance, as we get into peer-to-peer lending, there's all sorts of cool stuff coming that's going to completely set us apart. But the other thing that I always say, and make our team remember every single day, is that we're guided by a simple principle: if you believe in free speech and you oppose cancel culture, then you have a home here. It doesn't matter what someone's background or ideology is, whether they're on the left, they're on the right, or they don't care about politics at all. We're guided by that principle that free speech has to have a safe space, and that's here on GETTR.

Buck Sexton: Okay, so GETTR is somewhat similar, as it stands right now, to Twitter. Full disclosure, I probably just said it in the first segment: I'm on GETTR, I use GETTR. It's similar in some ways to Twitter. What's different about it, for the folks out there?

Jason Miller: Absolutely. So, number one, we have longer posts, up to seven hundred and seventy-seven characters. Longer videos: we're now up to three-minute videos. We also have live streaming now; we've started introducing that in the beta stage, so for folks who are verified users, we've started to allow a couple dozen folks to use it, and some of the live streams have gotten some pretty big audiences so far. Where we're also going very shortly is an edit feature, and that'll be one of the big things, but you won't have to, like Twitter Blue, pay an extra three ninety-nine or four ninety-nine a month. We're going to make that available to everybody, as well as some longer videos, a little bit closer to ten minutes. So we have some really good features right now. The other thing, too:
I think our translate feature is much easier to use. And here's a cool thing, Buck. When you set up your Twitter account, excuse me, when you set up your GETTR account, you can actually import all of your tweets, because that's your intellectual property. The platforms don't actually own that. And so when you set up GETTR, you can port in all your tweets to help populate your timeline. So whether you have memes or videos or recordings or other things, you have a backup of all of that, and it's going to come with you. It's a feature a lot of folks really like.

Buck Sexton: That's really cool. I've got to use that one. What kind of growth have you seen since you actually launched GETTR, Jason?

Jason Miller: Absolutely. So we were the fastest social media platform ever to get to a million users; we did that in three days. Then, over the course of the last few months, we got up to three million users, and then, literally just over this past week, we added about one point one or one point two million users following the additions of folks like Dr. Malone and Marjorie Taylor Greene and Joe Rogan and now Tucker Carlson. So we're now up over four million users on the platform. Here's another cool thing, and this is another principle that I put down at the beginning. I said, this is going to be a global platform. If we're going to grow and really take on Twitter and Facebook and Instagram and TikTok, we have to go global. We've got to be big. Only about forty percent of our user base is here in the US; about sixty percent is global, with about fifteen percent in Brazil and about ten percent in Japan. So this truly is a global free speech effort right now.

Buck Sexton: Now, what kind of resistance have you faced from the rival big tech oligarchs out there, and what are you expecting they're going to do as you guys continue to gain steam and momentum?
Jason Miller: Well, we've seen already, in some of the early stages, that Twitter was actually buying ads off of us. So if you went to the app store and typed in GETTR, an ad for Twitter would pop up, and they did that when we were only at about one or two million users, so we could tell that we shook them quite a bit. Also, as we started talking about some of the advancements in GETTR and some of the new features, we've noticed that Twitter has gotten a lot more aggressive in saying that they're going to offer different premiums and different features. So we've definitely lit a fire under them, because Twitter hasn't had any real advancements in quite a while. But where they have gone, as far as their Spaces front, where they essentially did kind of a rip-off of Clubhouse, is different from where we're going on the video front. I think the video front is going to be more successful; otherwise, I think Clubhouse would have taken off and been a ginormous success on its own. But the video feature is something that, I mean, look at your platform. You're on video, and that's where so much of the action and the interest is these days. So we actually kind of sit back and have a little bit of a smile when we see Twitter saying that they're going to do something new, because, really, you can't teach an old dog new tricks, so to speak. But for GETTR, we're still young, we're still growing. We have a lot of cool things ahead.

Buck Sexton: Yeah. What else do you guys at GETTR have in the pipeline, and what are some of the milestones you're hoping to reach? If you have some sense of an estimated timeline, by all means.

Jason Miller: Yeah, absolutely. So in addition to Vision, the short video platform, and also GETTR Pay, that's coming up,
we're also going to make it so you can import Instagram posts. That will be something we allow, very similar to how you can now import Twitter posts, so we think that's going to be a cool feature. Also, as I said, expanding live streaming out to a lot more people will be another exciting dynamic. And also, too, we'll have cross-posting, where you can elect, if you post on GETTR, to then have that appear on Twitter. You know, one of the things that a lot of folks come back and say is, I've got to bounce around on different platforms, and everyone wants to get away from big tech, but a lot of content creators also very much don't want to, say, give up followings or groups that they have and other audiences. So this will make it so that if you post on GETTR, it'll show up on Twitter, which means you really don't have to go back to Twitter at all.

Buck Sexton: One other question, Jason, for people that switch over: are the followers that someone has on Twitter the same exact followers they start out with on GETTR, or how does that work?

Jason Miller: Great question. So right now, what we list is the total number of followers, so we have both the GETTR followers and the Twitter followers, where people can see your combined reach, essentially. The front end of the engineering got a little bit ahead of the back end, so in about two weeks or so, when we roll that out, you'll see that when you post something on GETTR and you've elected to also have that appear on Twitter, the follower count that you have there is reflective of the total follower base. Or we also might put it in there where you can see, very clearly, here is the GETTR-specific number and here are the combined numbers, so that's a little bit more clear for folks. So it gives you the opportunity for much bigger amplification. The other thing, too, is just the fact that we are a global platform and that we are now up over four million.
I think a lot of people may have previously just had a domestic reach, whether that's just here in the US or, say, just in Brazil or the UK or wherever they might be. Now they're realizing that they can use their megaphone to talk globally, like Enes Kanter Freedom from the Boston Celtics, who now, with the help of GETTR, in addition to some of his other efforts, has taken his anti-CCP fight global.

Buck Sexton: Jason, exciting stuff. Good work. Thanks for being with us.

Jason Miller: Thank you, sir.

Buck Sexton: We'll be right back with more of this special edition of Hold the Line.

The left's efforts to dominate social media began largely in twenty sixteen, when they accused Russia of using Facebook to influence voters. Back in the day, we were allowed to question the legitimacy of a presidential election, of course. But what it ultimately amounted to was Russia purchasing approximately one hundred and fifty thousand dollars' worth of advertising, or at least people in Russia acting, we believe, at the behest of the Russian government. That's not exactly a great threat to our democracy, though, considering Hillary Clinton and Donald Trump spent over a billion dollars over the course of the campaign. Still, the moral panic over a few Russian bots and Facebook ads eventually became a broader campaign by the left to control what can and can't be said on social media, and they've seen some success. Due to the constant pressure, Democrats have managed to squash unflattering news stories, including the Hunter Biden laptop story, and they've deplatformed people who've been critical of the government's response to COVID-19. I've dealt with this myself, getting suspended and getting all kinds of strikes and shadow banning and demonetization just for saying things that are obviously true about the COVID regime of Fauci. Remember, it's not about spreading disinformation; it's about who decides what disinformation is. I'll take a closer look at their efforts at censorship in tonight's Buck Brief.
Buck Sexton: Let's understand this: right now, the Democrats have embraced wholesale the usage of social media platforms as a tool of their preferred narratives and policies. That's right. The Democrats are now at a place where they quite obviously and quite clearly want Facebook and the other major social media companies to do everything in their power to shut down whatever ideas they don't like. In fact, they have even threatened, not just asked politely, they have threatened some of these social media companies to make sure that they fall in line. Now, these companies are overwhelmingly run by not just Democrats but really far-left zealots. That's who actually runs Facebook and Google and Twitter and these other platforms. But they also recognize that they're businesses, and antagonizing half the country might not be a great idea. They don't care anymore, though. You know why? Because the Democrats demand that they fall in line and that they actively censor and shut down. They used to pretend they weren't doing it, right, because they didn't want to lose half the country in all of this. But here, for example, is Biden in November of twenty nineteen saying that social media companies should not be exempted from being sued for the promotion of fake news. Watch.

Joe Biden: I just think that social media has to be more socially conscious of what is important in terms of our democracy, and part of that is a little truth in lending here, and making sure that everything is not about whether they can make a buck. It requires the journalistic responsibility you have. You can't do what they can do on Facebook. You can't do what they can do and just say anything at all, and then not acknowledge it when you know something is fundamentally not true. And I just think it's a little out of hand, and I, for one, think we should be considering taking away the exemption that they cannot be sued for knowingly engaging in promoting something that's not true.
Buck Sexton: Understand, this is the government effectively threatening the social media companies, saying: not only do we want you to be the truth police, and think about how that goes, but if you don't do it, maybe we'll make sure that you can get sued, maybe we'll bankrupt you, maybe we'll go after you. This is what you would expect in an authoritarian regime. Remember, it's not that you will have ninety-five percent of the media, of journalists, of newspapers against an authoritarian regime; that's not how it goes. That's what you had under Donald Trump. It's in fact that you will have collusion, if you will, between the media and the regime in power, which is exactly what you have with the Biden administration. That's when you should be really concerned about not just free speech but freedom in general. And they don't even really hide it. Back in May of twenty twenty-one, the chief propagandist of this White House, Jen Psaki, said that there's an obligation, a responsibility, for the social media platforms to shut down untrustworthy content and disinformation. Watch.

Jen Psaki: The President's view is that the major platforms have a responsibility, related to the health and safety of all Americans, to stop amplifying untrustworthy content, disinformation, and misinformation, especially related to COVID-19 vaccinations and elections.

Buck Sexton: So, for example, was it disinformation to suggest that the Wuhan coronavirus came from a lab leak scenario in Wuhan, China? They said it was. Were they right? No, they were not. Is it disinformation to say that the vaccines don't last more than a few months at their maximum protection and aren't very effective at all at stopping the spread of the Omicron variant of coronavirus? No. But if you had said that three months ago, two months ago, they would have shut you down for misinformation. Seeing the pattern here? But they don't care. This is about power. This is about the ability to control the narrative. It's not about what's true. Democrats don't care about what's true. They don't care about the First Amendment. They want what they want. They're like spoiled children. They don't want people saying things that upset them. They don't want people being able to challenge their sacred cows. Here's Hillary Clinton back in November of twenty twenty-one, just making it clear to everybody that she thinks there should be new laws to regulate social media companies.

Hillary Clinton: First, we have to take necessary legislative and regulatory action to begin to regulate the way that our social media and tech companies operate. You know, we had to have new rules for the industrial age at the beginning of the last century. Well, we certainly need new rules for the information age, because our current laws, our framework, is just not adequate for what we're facing. And there are a number of very good ideas about how to both apply existing laws and fill the gaps that exist, so that we can begin to try to rein in some of the abuses of the technology companies, particularly the social media companies.

Buck Sexton: Yeah, they want to make sure that they restrict us. Conservatives are getting kicked off, conservatives are getting deplatformed. We're the ones that are getting censored. And the government, and people in government currently and previously at the top levels, are saying, we want more of this. In fact, the sitting president just said recently that social media outlets need to deal with the misinformation. Watch.

Joe Biden: I make a special appeal to social media companies and media outlets: please deal with the misinformation and disinformation that's on your shows. It has to stop. COVID-19 is one of the most formidable enemies America has ever faced. We've got to work together, not against each other. We're America. We can do this.

Buck Sexton: Sure, shut down free speech. Why not, says the president. All right, at least one member of Congress has been vocal in her support of free speech when it comes to social media.
When we come back, Tennessee Senator Marsha 581 00:32:09,080 --> 00:32:11,520 Speaker 1: Blackburn's going to join us to discuss her efforts to 582 00:32:11,640 --> 00:32:29,320 Speaker 1: take on the big tech juggernaut. Stay with us. The 583 00:32:29,400 --> 00:32:31,600 Speaker 1: fight against big tech censorship has made its way to 584 00:32:31,680 --> 00:32:34,440 Speaker 1: Capitol Hill, where a few brave senators have continued to 585 00:32:34,440 --> 00:32:38,160 Speaker 1: push back against the power-hungry tech giants like Facebook, Twitter, 586 00:32:38,200 --> 00:32:41,560 Speaker 1: and Google and stand up for conservative voices that have 587 00:32:41,640 --> 00:32:45,760 Speaker 1: been silenced on these platforms. Republican Tennessee Senator Marsha Blackburn 588 00:32:45,840 --> 00:32:48,960 Speaker 1: has been a force in DC against big tech. From 589 00:32:49,000 --> 00:32:52,520 Speaker 1: pressing major tech executives over their biased practices and shadow 590 00:32:52,520 --> 00:32:55,920 Speaker 1: banning rules to being the lead Republican on the App Store Bill, 591 00:32:56,080 --> 00:32:59,520 Speaker 1: Senator Blackburn is on a mission to hold tech giants accountable. 592 00:33:00,040 --> 00:33:02,640 Speaker 1: She joins us now to explain. Senator, great to have you. 593 00:33:03,520 --> 00:33:06,600 Speaker 1: It is good to be with you, and you're exactly right. 594 00:33:07,160 --> 00:33:10,880 Speaker 1: I am determined that after a decade of trying, we 595 00:33:10,960 --> 00:33:14,920 Speaker 1: are going to put some guardrails around what big tech 596 00:33:15,000 --> 00:33:18,880 Speaker 1: can do with your data, your information. You know, Buck, 597 00:33:18,960 --> 00:33:21,959 Speaker 1: when you are online, when you're on one of the 598 00:33:22,120 --> 00:33:27,480 Speaker 1: social media platforms, you are the product. And it leads 599 00:33:27,840 --> 00:33:32,640 Speaker 1: us to ask who actually owns the virtual you, which 600 00:33:32,720 --> 00:33:36,880 Speaker 1: is you and your information when you're online. You've said 601 00:33:36,920 --> 00:33:39,840 Speaker 1: in the past, Senator, that the American people no 602 00:33:39,880 --> 00:33:41,800 Speaker 1: longer trust big tech. At least a lot of us don't. 603 00:33:41,880 --> 00:33:44,840 Speaker 1: So what steps are you and your Republican colleagues going 604 00:33:44,880 --> 00:33:49,480 Speaker 1: to take when you're in the majority to address that? Yes, 605 00:33:49,840 --> 00:33:56,480 Speaker 1: the very first thing is establishing a federal online privacy standard. 606 00:33:57,040 --> 00:34:00,840 Speaker 1: This means passing a bill like the BROWSER Act, which 607 00:34:01,280 --> 00:34:04,440 Speaker 1: I first filed, a privacy bill, believe it or not, 608 00:34:04,920 --> 00:34:08,840 Speaker 1: in twenty twelve. So this has taken a while. But 609 00:34:09,120 --> 00:34:13,080 Speaker 1: what it would do is give you, the consumer, control 610 00:34:13,400 --> 00:34:17,800 Speaker 1: over your data. If you wanted to share your information 611 00:34:18,640 --> 00:34:22,680 Speaker 1: with the tech platform, then you would have to give 612 00:34:22,760 --> 00:34:26,239 Speaker 1: your explicit consent. In other words, you would have to 613 00:34:26,280 --> 00:34:30,759 Speaker 1: opt in, and then for nonsensitive information like your 614 00:34:30,760 --> 00:34:34,319 Speaker 1: browsing history, then you would be able to opt out.
615 00:34:34,840 --> 00:34:39,439 Speaker 1: This means the tech platform could not track you, they 616 00:34:39,520 --> 00:34:44,640 Speaker 1: could not mine your data, they could not share or 617 00:34:44,800 --> 00:34:51,280 Speaker 1: sell your data. You would be the individual owning your data. Also, 618 00:34:51,400 --> 00:34:54,360 Speaker 1: what it would do is establish one set of rules 619 00:34:54,400 --> 00:34:59,960 Speaker 1: for the entire Internet ecosystem with one regulator, the Federal 620 00:35:00,120 --> 00:35:04,120 Speaker 1: Trade Commission. That is what we think is the best 621 00:35:04,360 --> 00:35:09,840 Speaker 1: way to ensure that people are safe online. Likewise, data security, 622 00:35:09,920 --> 00:35:12,960 Speaker 1: so that if there's a breach, they have to notify 623 00:35:13,080 --> 00:35:16,680 Speaker 1: you within a given period of time like forty eight 624 00:35:16,760 --> 00:35:20,880 Speaker 1: or seventy two hours, and also that there are penalties 625 00:35:20,920 --> 00:35:27,439 Speaker 1: and enforcement for people that are committing these breaches. And 626 00:35:27,480 --> 00:35:30,879 Speaker 1: then Section two thirty is where you get into censorship, 627 00:35:31,280 --> 00:35:35,799 Speaker 1: where you get into these platforms that allow human trafficking, 628 00:35:35,920 --> 00:35:42,239 Speaker 1: sex trafficking, drug trafficking on their platform. Reforming Section two 629 00:35:42,480 --> 00:35:46,960 Speaker 1: thirty so that these platforms have to be accountable for 630 00:35:47,280 --> 00:35:53,200 Speaker 1: what they allow to be accessed, viewed, heard, and shared 631 00:35:53,680 --> 00:35:57,160 Speaker 1: on their platforms would be the way for us to 632 00:35:57,320 --> 00:36:01,520 Speaker 1: end censorship and the selling of a lot of fraudulent 633 00:36:01,600 --> 00:36:07,800 Speaker 1: and counterfeit products, and stop these cartels and trafficking organizations 634 00:36:08,280 --> 00:36:13,440 Speaker 1: from using these tech platforms to handle sex trafficking or 635 00:36:13,520 --> 00:36:17,440 Speaker 1: drug trafficking. Senator Blackburn, the big tech companies, of course, 636 00:36:17,520 --> 00:36:20,520 Speaker 1: have all kinds of answers and responses for a lot 637 00:36:20,520 --> 00:36:23,160 Speaker 1: of this stuff on the Section two thirty point. Their 638 00:36:23,320 --> 00:36:26,640 Speaker 1: spokespersons will go to the media, of course, and they 639 00:36:26,680 --> 00:36:29,319 Speaker 1: control a lot of the media themselves, to suggest that 640 00:36:29,360 --> 00:36:32,080 Speaker 1: if they don't have Section two thirty protection, they will 641 00:36:32,120 --> 00:36:34,879 Speaker 1: have to be even more strict in their censorship because 642 00:36:34,920 --> 00:36:37,799 Speaker 1: they would then be liable. What's the truth of that? 643 00:36:37,880 --> 00:36:41,160 Speaker 1: I mean, is it enough to reform Section two thirty? 644 00:36:41,320 --> 00:36:44,400 Speaker 1: Is that really the heart of this? The heart of 645 00:36:44,440 --> 00:36:47,799 Speaker 1: it is the reforming of Section two thirty to be 646 00:36:47,960 --> 00:36:52,600 Speaker 1: more explicit in what is allowed and to be more 647 00:36:52,800 --> 00:36:58,080 Speaker 1: explicit about the companies,
the size of those companies, that 648 00:36:58,200 --> 00:37:03,880 Speaker 1: can seek Section two thirty protections. Now, Buck, new sites like, 649 00:37:04,480 --> 00:37:09,120 Speaker 1: say, Gettr or Parler or some of these, they still need 650 00:37:09,680 --> 00:37:14,520 Speaker 1: that Section two thirty protection. They're small, they haven't stood 651 00:37:14,600 --> 00:37:17,360 Speaker 1: up yet. And Section two thirty was put there in 652 00:37:17,400 --> 00:37:21,840 Speaker 1: the mid nineties to allow these platforms to be able 653 00:37:21,840 --> 00:37:28,240 Speaker 1: to grow. Now you have Facebook, Instagram, Meta as they're called, 654 00:37:28,280 --> 00:37:33,040 Speaker 1: you have Twitter, you have YouTube, you have Amazon. You 655 00:37:33,120 --> 00:37:37,480 Speaker 1: have companies that are some of the biggest, wealthiest companies 656 00:37:37,560 --> 00:37:41,640 Speaker 1: in the country. These people know what is taking place 657 00:37:41,680 --> 00:37:47,040 Speaker 1: on their platform because they're constantly data mining and they 658 00:37:47,040 --> 00:37:52,200 Speaker 1: are constantly surveilling, so now people realize you can no 659 00:37:52,400 --> 00:37:56,920 Speaker 1: longer let them hide behind Section two thirty. They have 660 00:37:57,080 --> 00:37:59,920 Speaker 1: to be held to account, and that is an area 661 00:38:00,120 --> 00:38:05,120 Speaker 1: where we have bipartisan agreement. You mentioned Gettr, Parler. There 662 00:38:05,120 --> 00:38:08,400 Speaker 1: are some of these conservative or really just free-speech-dedicated 663 00:38:08,440 --> 00:38:12,759 Speaker 1: social media platforms that are finally getting some real 664 00:38:12,800 --> 00:38:15,880 Speaker 1: attention and usership, and we've been talking about them here on 665 00:38:15,920 --> 00:38:18,759 Speaker 1: the program. Are there any measures you think have to 666 00:38:18,800 --> 00:38:21,600 Speaker 1: be taken from a legislative perspective to make sure that 667 00:38:21,640 --> 00:38:25,880 Speaker 1: there aren't anti-competitive practices, say collusion between some of 668 00:38:25,880 --> 00:38:28,800 Speaker 1: the big tech companies? We know that Parler, for example, 669 00:38:29,239 --> 00:38:31,960 Speaker 1: was kicked off of Amazon Web Services out of nowhere, 670 00:38:32,400 --> 00:38:35,960 Speaker 1: and that certainly was a huge benefit to Twitter, which 671 00:38:36,000 --> 00:38:39,840 Speaker 1: was Parler's primary competitor. So is there any legislation about 672 00:38:40,040 --> 00:38:44,399 Speaker 1: anti-competitive practices that you foresee? Is that a concern? Yes, 673 00:38:44,760 --> 00:38:49,480 Speaker 1: those anti-competitive practices are a tremendous concern to us, 674 00:38:49,960 --> 00:38:54,040 Speaker 1: and yes, looking at this is something that we are doing. 675 00:38:54,800 --> 00:38:58,440 Speaker 1: The Open App Markets Bill is a good example of 676 00:38:58,719 --> 00:39:03,200 Speaker 1: something that will help to make the app marketplace 677 00:39:03,680 --> 00:39:09,560 Speaker 1: more competitive, because what you would do is allow these 678 00:39:09,680 --> 00:39:12,839 Speaker 1: apps to work with somebody other than Apple. Right now, 679 00:39:12,920 --> 00:39:18,200 Speaker 1: Apple kind of has a stranglehold on that.
What innovators want, 680 00:39:18,320 --> 00:39:20,520 Speaker 1: they want choice and options, and they don't want to 681 00:39:20,520 --> 00:39:23,360 Speaker 1: have to give thirty percent of their profits over to 682 00:39:23,440 --> 00:39:27,920 Speaker 1: Apple just to be in the app store. So there 683 00:39:27,960 --> 00:39:36,399 Speaker 1: are different components that will make up, basically, antitrust legislation that 684 00:39:36,480 --> 00:39:40,120 Speaker 1: we will be bringing forward. Many of our bills, like 685 00:39:40,200 --> 00:39:43,919 Speaker 1: my App Markets Bill, are written, they're filed, and we're 686 00:39:43,920 --> 00:39:47,200 Speaker 1: going to be pushing to pass those, get them signed 687 00:39:47,200 --> 00:39:50,000 Speaker 1: into law. Not sure we'll get them done this year, 688 00:39:50,040 --> 00:39:52,880 Speaker 1: but I have no doubt that in twenty three we 689 00:39:53,000 --> 00:39:56,759 Speaker 1: will get them done. Senator Blackburn, always appreciate you making 690 00:39:56,760 --> 00:40:01,319 Speaker 1: the time. Thanks for being with us. Thank you. That's 691 00:40:01,320 --> 00:40:03,040 Speaker 1: all the time we have for this special edition of 692 00:40:03,120 --> 00:40:04,880 Speaker 1: Hold the Line. I'd like to thank my guests Jason 693 00:40:04,920 --> 00:40:08,400 Speaker 1: Miller and Senator Marsha Blackburn for sharing their expertise. Have a 694 00:40:08,440 --> 00:40:18,680 Speaker 1: great night. As always, shields high.