Speaker 1: There was an interview that was done where Biden officials are actually saying that the pandemic lockdowns were worth it because pollution was falling during the lockdowns, and now they're claiming that actually, whether you understood it or not, it was helping the quality of your life. They also said that, quote, many people enjoyed being in the environment instead of being at work, again saying that the lockdowns were a good thing. Now, during this interview on ABC News on Monday, the outgoing Assistant Secretary for Oceans and International Environmental and Scientific Affairs, yeah, you're paying for that position at the Department of State, stated that during the lockdowns that took place during the coronavirus pandemic, many people, quote, really appreciated how much they enjoyed getting to be in the environment more days than they would if they had been at work, and that we saw pollution levels go down and people went, huh, my quality of life is a little bit better now, because they didn't have to worry as much about air pollution as they did before.
Speaker 1: Medina said, and I quote, a lot of people during the pandemic discovered a greater appreciation for the environment because they were at home. When they went outside, they really appreciated how much they enjoyed getting to be in the environment more days than they would if they would have been in the office. And in fact, we saw pollution levels go down, and people went, huh, my quality of life is a little bit better now that I don't have to worry about whether it's a code red day or smog or air pollution day. So people did become more aware, and that is a good thing. All right. So you're telling me now that the saving grace of COVID and lockdowns for two years, and destroying people's lives, and putting people into deep depression, suicide rates, especially among young people, skyrocketing during COVID, is somehow actually a good thing. That we were shut off from our family and our friends, where people suffered from serious clinical depression, suicide rates skyrocketed as well. At least the people that didn't kill themselves saw a beautiful world outside.
Speaker 2: And they didn't have to go to work.
Speaker 1: And so therefore this is a good thing that maybe we should do again in the future, because that's what they're really now telling you. The truth of this... there was a point I want to make. I want to make this clear. During these lockdowns, right after the "just give us two weeks" lockdown, right, when you know that was the thing, just give us two weeks.
Speaker 2: After those two.
Speaker 1: Weeks were up, and then it turned into two months, six months, eight months, there was a certain point where they realized this is really about control. Like, this is not about anything else but control. Like, control here is pretty much everything we like. Control. We want to have a lot of that control. Control is something that we think is a good thing. So now we know what it is. We know that the control is clearly there, and they can decide when to shut you down and when to lock you down, when to silence you. That's what they're going to do. I'll have more on this story coming up. Back to the other story I wanted to talk to you about.
Elon Musk sat down for a very interesting interview with Tucker Carlson about cutting jobs at Twitter and how easy it is to run a big tech company with a much smaller staff when you're not trying to be a woke organization that is trying to silence and censor people and pick winners and losers when you tweet. I want you to hear part of that conversation; it's an important one. And see just how different Twitter is now and what all he actually found, in his own words.
Speaker 3: Elon Musk bought Twitter because he used Twitter. It's as simple as that. And he was infuriated by Twitter's effort to silence people on the Internet. That's how strongly he believed in free speech. He paid forty-four billion dollars and lost tens of billions of dollars doing it. And when he took over and looked behind the curtain, he discovered Twitter was really a tool of the global intel agencies to spy on people and emit propaganda. Here it is. You bought Twitter, famously. You've got a lot of other businesses and a lot going on. Yes, you said you bought it because you believe in speech, free speech.
You've had a lot of hassle since you bought it. In retrospect, was it worth buying it?
Speaker 4: I mean, it remains to be seen as to whether this was financially smart. Currently it is not. You know, we just revalued the company at less than half of the acquisition price.
Speaker 5: Yes.
Speaker 4: No, my timing was terrible for when the offer was made, because it was right before advertising plummeted.
Speaker 3: And you bought at the high-water mark, I noticed.
Speaker 4: Yeah, yeah, so I must be a real genius here. My timing is amazing, since I bought it for at least twice as much as it should have been bought for. But some things are priceless, and so whether I lose money or not, that is a secondary issue compared to ensuring the strength of democracy. And free speech is the bedrock of a functioning democracy.
Speaker 5: Yes.
Speaker 4: And the speech needs to be as transparent and truthful as possible. So we've got a huge push on Twitter to be as truthful as possible. We've got this community notes feature, which is great. It is great, it is awesome, and it's like...
Speaker 3: I thought this morning, yeah, it was far more honest than the New York Times.
Speaker 5: It's great. Yeah.
Speaker 4: We put a lot of effort into ensuring that community notes does not get gamed or have biases. It simply cares about what is the most accurate thing. And you know, sometimes truth can be a little bit elusive, but you can still aspire to get closer to it.
Speaker 5: Yes, you know, and so.
Speaker 4: And I think the effect of community notes is more powerful than people may realize, because once people know that they could get noted, you know, community noted on Twitter, then they'll think more carefully about what they say. It's basically an encouragement to be more truthful and less deceptive.
Speaker 3: When you jumped into this, though, when you bought it, did you understand... well, clearly you understood its importance.
Speaker 5: You wouldn't have bought Twitter, yes, right.
Speaker 3: But it's not the biggest, but it's the most important of the social media companies.
But did you understand the kind of ferocity you'd be facing, the attacks you'd be facing from power centers in the country?
Speaker 4: I thought there'd probably be some negative reactions. I'm sure not everyone would...
Speaker 5: Not be pleased with it.
Speaker 4: But at the end of the day, you know, if the public is happy with it, that's what matters, and the public will speak with their actions. I mean, if they find Twitter to be useful, they will use it more, and if they find it to be not useful, they will use it less. If they find it to be the best source of truth, I think they will use it more.
Speaker 5: You know.
Speaker 4: Now there's obviously a lot of organizations that are used to having sort of unfettered influence on Twitter that no longer have that.
Speaker 3: You removed the New York Times' badge this morning, and then you called them diarrhea.
Speaker 5: You did, you did. I'm just quoting it.
Speaker 3: You described their Twitter feed as diarrhea.
Speaker 4: I said it's the Twitter equivalent of diarrhea.
Okay, it's not literally diarrhea, but, you know, it's a metaphor, but an accurate one. So, I mean, if you look at it, the New York Times' Twitter feed is unreadable. It's like... because what they do is they tweet every single article, even the ones that are boring, even ones that don't make it into the paper. So it's just nonstop, a zillion tweets a day with no... you know, they really should just be saying, like, what are the top tweets? What are the big stories of the day? I don't know, put out like ten or something, you know, so it's a number that's manageable. As opposed to right now, if you were to follow the New York Times on Twitter, you're going to get barraged with like hundreds of tweets a day, and your whole feed will be filled with New York Times. So this is something I would recommend actually for all publications, which is, for your primary feed, only put out your best stuff.
I think I know a thing or two about how to use Twitter, because, you know, mine was the most interacted-with account on the whole system before the acquisition, before the acquisition closed. I didn't have the most followers, but I had the most interactions, and so I clearly know something about how to use Twitter. People's attention is limited, so just make sure you put the stuff that's most important there.
Speaker 3: Because, you know, you and people like you do interact on Twitter. It's obviously enormously powerful in shaping public opinion. It's where a lot of ideas and trends are incubated, norms...
Speaker 5: Absolutely.
Speaker 3: It's also a magnet for intel agencies from around the world. And one of the things we learned after you started opening the books is that they were exerting influence from within Twitter.
Speaker 5: I mean, it was absurd. Did you know that going in?
Speaker 4: No. I've been a heavy Twitter user since two thousand and nine. It's sort of like I'm in the matrix. I mean, I can see, like, do things feel right? Do they not feel right?
What tweets am I being shown as recommended? Like, I get a feel for what accounts are making comments, where the comments are eerily similar, yeah, and then you look at the account and it's just obviously a fake photo, and, you know, it's just obviously a bot cluster, over and over again. So I started to get, like, just more and more uneasy about the Twitter situation. I was starting to feel like something's rotten in the state of Denmark here; there's something that feels wrong about the platform.
Speaker 5: It seemed to be just drifting in a... I couldn't...
Speaker 4: Place it exactly, just that it felt like it was drifting in a bad direction. So then I was like... and my conversations with the board and management seemed to confirm my intuition about that.
Speaker 5: But basically I was convinced these guys do not care about fixing.
Speaker 4: Twitter, and I had a bad feeling about where it was headed based...
Speaker 5: On the conversations I had with them.
Speaker 4: So then I was like, you know what, I'll try acquiring it and see if that's... see if acquiring it is possible. Now, I didn't have enough cash to acquire it, so I would need, you know, support from others, from some of the existing investors. I would also need, like, a lot of debt. And so it wasn't clear to me whether an acquisition would succeed, but I would try, and ultimately it did succeed.
Speaker 5: Anyway, here we are.
Speaker 3: But when you got there and all of a sudden you own it, and all the data on the service belongs to you...
Speaker 4: It belongs to the people, in my view, but yes.
Speaker 3: But you can see what it is, and you can see what they've been doing, and you can see who's been working there. You were shocked to find out that various intel agencies were affecting its operations.
Speaker 4: The degree to which various government agencies had effectively had full access to everything that was going on on Twitter blew my mind.
Speaker 5: I was not aware of that.
Speaker 3: Would that include people's DMs?
Speaker 5: Yes, yes, because the DMs are not encrypted.
Speaker 4: So one of the things that we're about to release is the ability to encrypt your DMs.
Speaker 3: That's pretty heavy duty, though, because a lot of well-known people, reporters talking to their sources, government officials, the richest people in the world, they're DMing each other, and the assumption, obviously, was incorrect, but it was that that's private. But that was being read by various governments.
Speaker 5: Yeah, that seems to... yes. Scary. Yes, it is.
Speaker 4: So, like I said, we're moving to have the DMs be optionally encrypted. I mean, you know, there's, like, a lot of DM conversations which are, you know, just chatting with friends.
Speaker 5: It's not that important.
Speaker 4: That's hopefully coming out later this month, but no later than next month: the ability to toggle encryption on or off. So if you are in a conversation you think is sensitive, you can just toggle encryption on, and then no one on Twitter can see what you're talking about.
I could put a gun to my head and I still couldn't tell you.
Speaker 5: That's sort of the gun-to-the-head test.
Speaker 4: If somebody actually puts a gun to my head, can I still not see your DMs?
Speaker 5: That should be... that's the acid test.
Speaker 4: Yes, and that's how it should be, if you want your...
Speaker 1: But by the way, let's just pause there and stop for a second and just think about what he just said. The federal government could then spy on anything, including your sources, through direct messages. There's a lot of people that have contacted me for background on government agencies and stories, in local government, state government, national. There are people that have been like, hey, don't use my name, but I want you to know this is what's going on.
Speaker 2: Here or there, or here or there.
Speaker 1: And when you hear that right now, to know that they could just read that. They could read the DMs of the President of the United States of America, Donald Trump. They could read senators or congressmen.
I've had senators and congressmen who have sent me things, right, who have sent me different things that need to be, you know, talked about. Those could be read by Twitter or other people in the government. Hey, what's Ben Ferguson sending messages about? What is Senator Marsha Blackburn sending things about? What is Bill Hagerty sending things about? What is Ted Cruz sending DMs about? I've had people in government that have sent me stories. They're like, hey, you might want to look into this for this reason. FYI, I can't come out and say anything about this, but you can. And apparently now that was being read, or could be read, by the federal government. Your direct messages. That's what a DM is. It's no longer a private conversation, which the word direct implies, right? Not indirect. Not through the government first or after or in between, not through Twitter employees before, after, in between. What if you send something that's inappropriate, and I'm talking about, like, humor that someone might judge you for later, that pushes a line? They could read that and use it against you later in life.
Speaker 1: What if you send something funny from a stand-up comedian that makes an off-color joke? I've done that before, something I would never tweet out because it would start a firestorm, right? That is pure humor that you send to a family member or friend. Yes, the federal government and, yes, Twitter employees could read every one of your direct...
Speaker 2: Messages. That put people's lives at risk.
Speaker 1: Sources, whistleblowers, those inside and outside the government who may be trying to get the word out to journalists and others about what's really happening. Almost every media tip that I receive, or I'd say every inside-government tip that I receive, comes in through a DM.
Speaker 2: People will reach out to...
Speaker 1: You on social media, because, I mean, how else are they going to reach out to you and know it's you, right? And they'll send it, hey, can... you know. And some people are smart, like, hey, I need to chat with you. But even that right there is enough to be like, oh, this person works here, they're whistleblowing on this, we need to let them know what's happening.
Speaker 1: Like, even that right there, just saying, can we talk offline? I got one of these a couple weeks ago, and it's a big story that I'm still working on, and it was, can we talk offline? That right there is enough for somebody in the government to figure it out. Now, before I get into more of this story, I want to say thank you and tell you about our good friends at Augusta Precious Metals. You can get free gold just by learning about gold IRAs from Augusta Precious Metals. It's important that you know what's going on in this crazy economy, and your hard-earned savings need protecting from the devalued dollar, especially if you're close to retirement. Augusta Precious Metals will give you information on how to protect your savings and open a gold IRA. So if you've saved at least one hundred thousand for retirement, call and ask about their ultimate guide to gold IRAs. I trust Augusta Precious Metals and you can too.
They will make sure that you understand what is best for you and your portfolio. There's a reason Money magazine says they are the best gold IRA company. Get free gold, free information, and retirement protection now by calling eight seven seven, four, gold IRA. That's eight seven seven, the number four, GOLDIRA, or AugustaPreciousMetals.com. So imagine that you were doing this and you're trying to get a story out, and you find out that Twitter, anybody at Twitter, any employee or government official, basically could read your direct messages, which are supposed to be direct, meaning to you. And the question is, you know, is there the ability to sue the old people at Twitter for knowing that this abuse of power was taking place while they tried to claim that it would be a direct message? Not you and the government, not you and Twitter employees; a direct message.
Speaker 2: And how often were they reading these DMs?
Speaker 1: I'm sure, depending on who you are, they could be reading them quite often, depending on what you do, depending on what stories you're breaking.
Imagine you're a reporter, you come 345 00:17:56,359 --> 00:17:58,720 Speaker 1: out with a big story, they could probably go backwards 346 00:17:58,760 --> 00:18:01,760 Speaker 1: and read your DMs. Here's the last part of that 347 00:18:01,800 --> 00:18:03,400 Speaker 1: interview with Tucker Carlson. 348 00:18:03,400 --> 00:18:07,080 Speaker 4: Various governments about doing this. I haven't had direct complaints 349 00:18:07,119 --> 00:18:09,960 Speaker 4: to me. I've had sort of, like, some indirect complaints. 350 00:18:10,080 --> 00:18:12,720 Speaker 4: I think people are a little concerned about complaining to me 351 00:18:12,760 --> 00:18:21,080 Speaker 4: directly in case I tweet about it. You know, they're like, oh, 352 00:18:21,160 --> 00:18:22,800 Speaker 4: so they sort of try to be more roundabout 353 00:18:22,840 --> 00:18:26,119 Speaker 4: than that, you know. I mean, if I got something 354 00:18:26,160 --> 00:18:29,480 Speaker 4: that was unconstitutional from the US government, my reply would 355 00:18:29,520 --> 00:18:31,880 Speaker 4: be to send a copy of the First Amendment 356 00:18:32,560 --> 00:18:34,159 Speaker 4: and just say, like, what part of this are we 357 00:18:34,160 --> 00:18:34,680 Speaker 4: getting wrong? 358 00:18:36,680 --> 00:18:38,919 Speaker 5: You have a lot of, you have a lot of governments. 359 00:18:38,960 --> 00:18:40,560 Speaker 5: What part of this are we getting wrong? Please tell me. 360 00:18:40,600 --> 00:18:41,919 Speaker 3: I mean it's a pretty, not, I'm just saying, but 361 00:18:41,920 --> 00:18:43,680 Speaker 3: you're kind of exposed in your other businesses. 362 00:18:43,720 --> 00:18:47,080 Speaker 5: So this is just in case viewers aren't following this. 363 00:18:47,080 --> 00:18:49,320 Speaker 3: This is not, you're not just like a journalist taking 364 00:18:49,320 --> 00:18:50,879 Speaker 3: a stand on behalf of the First Amendment.
You're a 365 00:18:50,880 --> 00:18:54,440 Speaker 3: guy with big government contracts giving the finger to the government. 366 00:18:54,560 --> 00:18:55,399 Speaker 5: Do you think 367 00:18:56,840 --> 00:19:00,159 Speaker 3: Twitter will be as central to this presidential campaign 368 00:19:00,280 --> 00:19:02,800 Speaker 3: as it was in the last several? 369 00:19:03,119 --> 00:19:06,480 Speaker 4: I think it will play a significant role in elections, 370 00:19:06,560 --> 00:19:08,360 Speaker 4: not just domestically but internationally. 371 00:19:08,640 --> 00:19:11,080 Speaker 5: The goal of new Twitter is to be 372 00:19:12,480 --> 00:19:15,600 Speaker 4: as fair and even-handed as possible, so not 373 00:19:15,720 --> 00:19:21,040 Speaker 4: favoring any political ideology, but just 374 00:19:23,040 --> 00:19:26,240 Speaker 5: yeah, being fair to all. Why doesn't Facebook do this? 375 00:19:26,359 --> 00:19:28,679 Speaker 3: I know that Zuckerberg has said, and I take him 376 00:19:28,680 --> 00:19:32,320 Speaker 3: at face value, well, I do actually in 377 00:19:32,359 --> 00:19:35,040 Speaker 3: this way, that he is a kind of old-fashioned 378 00:19:35,040 --> 00:19:35,880 Speaker 3: liberal who 379 00:19:35,880 --> 00:19:37,000 Speaker 5: doesn't like to censor. 380 00:19:37,200 --> 00:19:39,960 Speaker 3: He has, but he, you know, like, why wouldn't a 381 00:19:40,000 --> 00:19:42,600 Speaker 3: company like that take the stand that you have taken, 382 00:19:43,080 --> 00:19:49,280 Speaker 3: which is pretty traditional American political custom, you know, for free speech? 383 00:19:50,480 --> 00:19:55,639 Speaker 4: My understanding is that Zuckerberg spent four hundred million dollars 384 00:19:55,680 --> 00:19:58,359 Speaker 4: in the last election, nominally in a get out the 385 00:19:58,440 --> 00:20:01,359 Speaker 4: vote campaign, but really funding in support of Democrats.
386 00:20:01,720 --> 00:20:05,040 Speaker 5: Is that accurate or not accurate? That is accurate. Does 387 00:20:05,040 --> 00:20:06,679 Speaker 5: that sound unbiased to you? 388 00:20:06,680 --> 00:20:06,760 Speaker 6: No, 389 00:20:06,880 --> 00:20:07,359 Speaker 5: it doesn't. 390 00:20:07,520 --> 00:20:11,040 Speaker 3: Yes, so you don't see hope that Facebook will approach 391 00:20:11,119 --> 00:20:16,040 Speaker 3: this as a non-aligned arbiter. You've allowed Donald Trump 392 00:20:16,119 --> 00:20:17,879 Speaker 3: back on Twitter. He hasn't taken you up on your 393 00:20:17,880 --> 00:20:20,000 Speaker 3: offer because he's got his own thing. Do you think 394 00:20:20,000 --> 00:20:21,240 Speaker 3: he will go back on Twitter? 395 00:20:23,119 --> 00:20:26,520 Speaker 4: Well, that's obviously up to him. You know, my job 396 00:20:26,640 --> 00:20:32,399 Speaker 4: is to, you know, I take the freedom of speech 397 00:20:34,640 --> 00:20:38,880 Speaker 4: very seriously. So, you know, I didn't 398 00:20:38,920 --> 00:20:40,879 Speaker 4: vote for Donald Trump. I actually voted for Biden. Not 399 00:20:41,119 --> 00:20:43,560 Speaker 4: saying I'm a huge fan of Biden, because I 400 00:20:43,720 --> 00:20:47,320 Speaker 4: think that would probably be inaccurate. But you know, 401 00:20:47,359 --> 00:20:50,920 Speaker 4: we have difficult choices to make in these presidential elections. 402 00:20:50,920 --> 00:20:56,639 Speaker 4: It's not, I would prefer, frankly, that we put someone, 403 00:20:57,040 --> 00:21:01,639 Speaker 4: just a normal person, as president, a normal person with 404 00:21:01,720 --> 00:21:05,800 Speaker 4: common sense and whose values are smack in the middle 405 00:21:05,840 --> 00:21:08,680 Speaker 4: of the country, you know, center of 406 00:21:08,680 --> 00:21:09,440 Speaker 4: the normal distribution. 407 00:21:10,080 --> 00:21:12,920 Speaker 5: And I think they'll do that.
They would be great. Yeah, 408 00:21:12,960 --> 00:21:13,920 Speaker 5: I think we have made 409 00:21:13,760 --> 00:21:19,280 Speaker 4: maybe being president not that much fun, you know, to 410 00:21:19,280 --> 00:21:20,240 Speaker 4: be totally frank. 411 00:21:22,000 --> 00:21:24,679 Speaker 3: Subscribe to the Fox News YouTube channel. 412 00:21:25,440 --> 00:21:28,560 Speaker 1: That is part of that interview, and I want you 413 00:21:28,600 --> 00:21:33,119 Speaker 1: to know that I think what Elon Musk did was 414 00:21:33,200 --> 00:21:41,159 Speaker 1: not just incredibly brave. Some would say it was financially insane. 415 00:21:41,400 --> 00:21:46,280 Speaker 1: And I think that's how much he cared about exposing, 416 00:21:46,840 --> 00:21:50,240 Speaker 1: not just what big tech was doing, but also actual 417 00:21:50,320 --> 00:21:52,920 Speaker 1: free speech. He's got more money than he knows what 418 00:21:53,000 --> 00:21:54,760 Speaker 1: to do with, and now he's lost a lot of it 419 00:21:55,520 --> 00:21:57,439 Speaker 1: trying to fix some of these problems that we have 420 00:21:57,480 --> 00:21:58,399 Speaker 1: in the world, which I think 421 00:21:58,280 --> 00:21:59,040 Speaker 2: is just amazing. 422 00:21:59,840 --> 00:22:03,679 Speaker 1: I give a lot of credit to Elon Musk for 423 00:22:03,760 --> 00:22:06,240 Speaker 1: doing this, for exposing it, and also sitting down having 424 00:22:06,280 --> 00:22:10,159 Speaker 1: these conversations. The amount of abuse of power that the 425 00:22:10,200 --> 00:22:14,960 Speaker 1: federal government had, the amount of abuse of power to 426 00:22:15,359 --> 00:22:18,320 Speaker 1: just dive into your private personal life and your DMs 427 00:22:18,320 --> 00:22:22,719 Speaker 1: that they had access to, unlimited access to, tells you 428 00:22:22,800 --> 00:22:27,240 Speaker 1: just how corrupt they actually are.
Joe Biden, remember Joe Biden, 429 00:22:27,240 --> 00:22:30,320 Speaker 1: the guy who touted the disinformation letter in the presidential 430 00:22:30,320 --> 00:22:34,600 Speaker 1: debate, saying there are fifty former national intelligence officials 431 00:22:34,600 --> 00:22:37,320 Speaker 1: that are saying that this is Russian disinformation, the Hunter 432 00:22:37,320 --> 00:22:38,240 Speaker 1: Biden laptop story. 433 00:22:38,240 --> 00:22:40,639 Speaker 2: We now know they lied, and we now know how 434 00:22:40,640 --> 00:22:41,480 Speaker 2: they covered it up. 435 00:22:41,960 --> 00:22:44,840 Speaker 1: This is back in Nashville on October twenty-second, right 436 00:22:44,880 --> 00:22:47,359 Speaker 1: before the presidential election in twenty twenty. Listen to the 437 00:22:47,359 --> 00:22:48,280 Speaker 1: president in his own 438 00:22:48,080 --> 00:22:53,360 Speaker 7: words. Look, there are fifty former national intelligence folks who 439 00:22:53,440 --> 00:22:56,040 Speaker 7: said that what this, he's accusing me of, is a 440 00:22:56,160 --> 00:23:00,040 Speaker 7: Russian plant. They have said that this has all the, 441 00:23:00,440 --> 00:23:04,520 Speaker 7: four, five former heads of the CIA. Both parties say 442 00:23:04,560 --> 00:23:08,199 Speaker 7: what he's saying is a bunch of garbage. Nobody believes 443 00:23:08,240 --> 00:23:11,160 Speaker 7: it except him and his good friend Rudy Giuliani. 444 00:23:11,560 --> 00:23:15,320 Speaker 3: You mean the laptop is now another Russia Russia Russia hoax. 445 00:23:15,359 --> 00:23:18,639 Speaker 6: That's exactly what this is. Exactly what this 446 00:23:18,760 --> 00:23:19,640 Speaker 6: is, where he's going. 447 00:23:20,520 --> 00:23:25,080 Speaker 1: Yeah, they were lying to you. Everybody now knows they 448 00:23:25,080 --> 00:23:30,399 Speaker 1: were lying to you. Those intelligence officials knew they were 449 00:23:30,560 --> 00:23:34,919 Speaker 1: lying to you.
Okay, everybody involved was lying to you. 450 00:23:35,000 --> 00:23:37,560 Speaker 1: These top intelligence officials were covering it up because they 451 00:23:37,560 --> 00:23:40,720 Speaker 1: wanted to cook the books. Okay, they wanted to cook 452 00:23:40,960 --> 00:23:44,360 Speaker 1: the books. They wanted to make sure that they got 453 00:23:44,400 --> 00:23:46,320 Speaker 1: their guy in the White House that they could control. 454 00:23:46,359 --> 00:23:49,040 Speaker 1: And that's what they've done. They can now control him. 455 00:23:49,480 --> 00:23:52,480 Speaker 1: They can control him, and it's pretty much perfect. Now 456 00:23:53,400 --> 00:23:56,600 Speaker 1: Biden touting that group, we now know, is a lie. 457 00:23:58,000 --> 00:23:59,760 Speaker 1: We've known it for some time, but now it's been 458 00:23:59,800 --> 00:24:01,320 Speaker 1: confirmed on a new letter. 459 00:24:01,320 --> 00:24:02,640 Speaker 2: A new level, I should say. 460 00:24:03,720 --> 00:24:06,879 Speaker 1: We have congressmen now coming out talking about what this means. 461 00:24:07,880 --> 00:24:13,199 Speaker 1: I also have to go back and just remind you 462 00:24:13,359 --> 00:24:17,600 Speaker 1: of something else that happened, and this is 463 00:24:17,680 --> 00:24:22,960 Speaker 1: just another example of the pure arrogance of this administration. 464 00:24:24,200 --> 00:24:29,200 Speaker 2: This administration has known that the media would 465 00:24:28,960 --> 00:24:32,960 Speaker 1: cover them no matter what they did, right? No matter 466 00:24:33,000 --> 00:24:35,800 Speaker 1: what they did, they were going to do this. 467 00:24:36,680 --> 00:24:39,080 Speaker 2: They were going to cover. And now, and 468 00:24:39,119 --> 00:24:39,760 Speaker 2: now you have 469 00:24:39,800 --> 00:24:42,320 Speaker 1: people like CBS News that are being forced to report 470 00:24:42,359 --> 00:24:46,000 Speaker 1: what really happened with this cover-up.
CBS Evening News 471 00:24:46,040 --> 00:24:48,800 Speaker 1: Norah O'Donnell had this on last night. 472 00:24:49,680 --> 00:24:52,600 Speaker 8: My client wants to come forward to Congress. He's ready 473 00:24:52,600 --> 00:24:54,720 Speaker 8: to be questioned about what he knows and what he 474 00:24:54,800 --> 00:24:57,560 Speaker 8: experienced under the proper legal protections. 475 00:24:58,000 --> 00:25:02,480 Speaker 6: Attorney Mark Lytle's client is a supervisory special agent at the IRS, 476 00:25:02,920 --> 00:25:06,520 Speaker 6: who's prepared to tell Congress the investigation he's been working 477 00:25:06,560 --> 00:25:10,120 Speaker 6: on has been hampered by what he thinks is special treatment. 478 00:25:10,720 --> 00:25:15,000 Speaker 8: Typical steps that a law enforcement investigator would take were 479 00:25:15,160 --> 00:25:18,280 Speaker 8: compromised because of political considerations. 480 00:25:18,680 --> 00:25:22,359 Speaker 6: Lytle wouldn't talk in specifics, declining to identify either his 481 00:25:22,480 --> 00:25:26,400 Speaker 6: client or the target of the investigation his client helped conduct. 482 00:25:26,720 --> 00:25:29,959 Speaker 5: Can you identify him? I can't at this stage, Jim. 483 00:25:30,280 --> 00:25:33,920 Speaker 6: But CBS News has learned the investigation the whistleblower worked 484 00:25:33,920 --> 00:25:36,080 Speaker 6: on is about Hunter Biden. 485 00:25:36,240 --> 00:25:39,879 Speaker 5: What we're doing is being completely cooperative. That was Biden 486 00:25:39,920 --> 00:25:43,840 Speaker 6: two years ago, after the DOJ opened an investigation into 487 00:25:43,840 --> 00:25:48,159 Speaker 6: his finances. The FBI collected what it believed was sufficient 488 00:25:48,240 --> 00:25:52,000 Speaker 6: evidence to charge Biden with tax crimes, and last year 489 00:25:52,119 --> 00:25:57,440 Speaker 6: sent its findings to the US attorney in Delaware.
Since then, silence. 490 00:25:57,880 --> 00:26:00,160 Speaker 6: Why can't your client talk to us directly? 491 00:26:00,200 --> 00:26:04,199 Speaker 8: At this point, there are laws that provide protection to whistleblowers, 492 00:26:04,600 --> 00:26:06,000 Speaker 8: and he has to navigate that. 493 00:26:06,440 --> 00:26:10,119 Speaker 6: Today, Lytle sent this letter to Congress claiming his client 494 00:26:10,200 --> 00:26:14,679 Speaker 6: could provide information that would contradict sworn testimony by a 495 00:26:14,720 --> 00:26:16,280 Speaker 6: senior political appointee. 496 00:26:16,720 --> 00:26:19,520 Speaker 8: I promise to ensure that he's able to carry out 497 00:26:20,040 --> 00:26:20,840 Speaker 8: his investigation. 498 00:26:21,160 --> 00:26:24,840 Speaker 6: CBS News has learned that was Attorney General Merrick Garland. 499 00:26:25,320 --> 00:26:29,320 Speaker 1: That was Attorney General Merrick Garland. Oh okay, well that's 500 00:26:29,400 --> 00:26:34,920 Speaker 1: kind of important information, right? I mean, that's kind of important. Yeah, 501 00:26:34,960 --> 00:26:37,600 Speaker 1: that's just how we roll, that's just what we do. 502 00:26:38,720 --> 00:26:43,040 Speaker 1: That's how all this is gonna happen. The media will 503 00:26:43,080 --> 00:26:47,400 Speaker 1: cover for us, because the media always covers for us. 504 00:26:48,160 --> 00:26:51,320 Speaker 1: There's also something else they said in this report from 505 00:26:51,400 --> 00:26:55,440 Speaker 1: NBC News last night about this IRS agent who's seeking 506 00:26:55,480 --> 00:26:58,840 Speaker 1: whistleblower status in the Hunter Biden case. Here's what NBC 507 00:26:58,920 --> 00:26:59,560 Speaker 1: News had to say. 508 00:26:59,600 --> 00:27:04,320 Speaker 9: Listen carefully. Two senior law enforcement officials describing to NBC 509 00:27:04,040 --> 00:27:06,120 Speaker 6: News growing frustration
510 00:27:05,720 --> 00:27:09,000 Speaker 9: inside the FBI, because federal investigators finished the bulk of 511 00:27:09,080 --> 00:27:12,200 Speaker 9: their work about a year ago and suspect political interference 512 00:27:12,280 --> 00:27:13,400 Speaker 9: is delaying the process. 513 00:27:14,280 --> 00:27:18,000 Speaker 1: So now a growing frustration is happening inside the FBI 514 00:27:18,040 --> 00:27:20,480 Speaker 1: because federal investigators finished the bulk of their work about 515 00:27:20,480 --> 00:27:23,360 Speaker 1: a year ago in the Hunter Biden investigation, and suspect, 516 00:27:23,440 --> 00:27:25,840 Speaker 1: quote, political interference is delaying the process. 517 00:27:26,560 --> 00:27:26,840 Speaker 9: Oho. 518 00:27:28,000 --> 00:27:31,760 Speaker 1: Okay, so let's go back to ABC News. What are 519 00:27:31,800 --> 00:27:34,080 Speaker 1: they saying over at ABC News? Now, that's the first 520 00:27:34,080 --> 00:27:36,160 Speaker 1: time they've all covered this in the same night. 521 00:27:36,560 --> 00:27:38,840 Speaker 10: The attorney for Hunter Biden says the agent is 522 00:27:38,880 --> 00:27:42,560 Speaker 10: committing a crime by discussing an ongoing tax investigation in 523 00:27:42,600 --> 00:27:45,560 Speaker 10: an attempt to harm the president's son. But the whistleblower's 524 00:27:45,600 --> 00:27:48,120 Speaker 10: attorney just told me such claims 525 00:27:47,760 --> 00:27:52,760 Speaker 1: are baseless. Baseless, I tell you, they're baseless, right? Baseless. 526 00:27:53,840 --> 00:27:58,320 Speaker 1: These are baseless claims, baseless. I think we all know 527 00:27:58,359 --> 00:28:01,280 Speaker 1: that to be a lie. Now we all know there's nothing even 528 00:28:01,359 --> 00:28:05,080 Speaker 1: close to that. We know it's real, we know what's 529 00:28:05,119 --> 00:28:07,679 Speaker 1: going on with it. We understand what's happening with it. 530 00:28:08,200 --> 00:28:08,400 Speaker 5: Now.
531 00:28:08,400 --> 00:28:10,720 Speaker 2: This is the media, for the first time I've ever 532 00:28:10,760 --> 00:28:13,119 Speaker 2: seen them cover it this way. 533 00:28:13,160 --> 00:28:16,520 Speaker 1: That's something that people really need to understand. I also 534 00:28:16,600 --> 00:28:19,920 Speaker 1: want you to hear from Jim Jordan. Jim Jordan had 535 00:28:19,960 --> 00:28:23,159 Speaker 1: this to say on Fox about what's happening with the 536 00:28:23,200 --> 00:28:24,520 Speaker 1: Attorney General Garland. 537 00:28:24,720 --> 00:28:26,479 Speaker 2: Here's what he said on Fox. Listen. 538 00:28:26,680 --> 00:28:29,360 Speaker 9: This has been political from the get go, clear back 539 00:28:29,400 --> 00:28:32,959 Speaker 9: to the Morell situation. When the story came out on 540 00:28:33,040 --> 00:28:36,800 Speaker 9: October fourteenth, twenty twenty, about the Biden business operation, and 541 00:28:37,000 --> 00:28:39,480 Speaker 9: was then-Vice President Joe Biden involved, there was some 542 00:28:39,560 --> 00:28:42,520 Speaker 9: concern that he was. And then quickly it turns into 543 00:28:42,600 --> 00:28:45,120 Speaker 9: this political operation, that letter that became the basis 544 00:28:45,160 --> 00:28:47,920 Speaker 9: for suppressing the story and keeping it from the American 545 00:28:47,920 --> 00:28:50,560 Speaker 9: people just days before the most important election we have, 546 00:28:50,600 --> 00:28:53,280 Speaker 9: election for president of the United States. So understand what happened, Laura. 547 00:28:53,360 --> 00:28:54,800 Speaker 9: The fourteenth, the Post has a story. 548 00:28:54,840 --> 00:28:55,600 Speaker 5: The seventeenth, 549 00:28:55,680 --> 00:28:58,560 Speaker 9: Tony Blinken, senior advisor to the Biden campaign, current Secretary 550 00:28:58,560 --> 00:29:01,720 Speaker 9: of State, contacts Mike, gets him interested in this.
Mike 551 00:29:01,800 --> 00:29:04,600 Speaker 9: Morell looks at it the next day, organizes on the 552 00:29:04,640 --> 00:29:08,120 Speaker 9: eighteenth all these other people to sign the letter. The nineteenth, 553 00:29:08,200 --> 00:29:10,560 Speaker 9: the letter goes out, and then on the twenty-second. 554 00:29:10,600 --> 00:29:12,200 Speaker 9: And the reason Mike Morell said he did the letter 555 00:29:12,280 --> 00:29:14,760 Speaker 9: was he thought President Trump would bring the issue up 556 00:29:14,800 --> 00:29:18,440 Speaker 9: during that debate on the twenty-second of October, and 557 00:29:18,520 --> 00:29:20,760 Speaker 9: of course he did. And they wanted some statement that 558 00:29:20,840 --> 00:29:23,320 Speaker 9: Joe Biden could use because, as Mr. Morell said, they 559 00:29:23,360 --> 00:29:25,760 Speaker 9: wanted him to win. Now what happens on the twenty-second? 560 00:29:26,120 --> 00:29:29,520 Speaker 9: Joe Biden brings it up, and then after that debate, 561 00:29:29,560 --> 00:29:33,240 Speaker 9: here's the kicker. Steve Ricchetti, chair of the Biden campaign, 562 00:29:33,400 --> 00:29:36,640 Speaker 9: calls up Mike Morell and thanks him for doing it all. 563 00:29:36,880 --> 00:29:40,160 Speaker 9: It was a total political operation. And the most important 564 00:29:40,200 --> 00:29:43,880 Speaker 9: fact is, Laura, it was false. It wasn't Russian disinformation, 565 00:29:44,200 --> 00:29:46,040 Speaker 9: the laptop story looks to be true. 566 00:29:46,760 --> 00:29:51,800 Speaker 1: It wasn't false information. The laptop story is true. Share 567 00:29:52,360 --> 00:29:55,560 Speaker 1: this podcast with your family and friends. Hit that little 568 00:29:56,120 --> 00:29:59,920 Speaker 1: forward arrow and text it or put it on social media.
Also, 569 00:30:00,000 --> 00:30:02,160 Speaker 1: hit that auto-download or subscribe button so you 570 00:30:02,240 --> 00:30:05,400 Speaker 1: get this podcast each and every day for free. 571 00:30:05,440 --> 00:30:06,720 Speaker 2: And I'll see you back here tomorrow.