Speaker 1: All right, guys, very excited to welcome back to the program.

Speaker 2: RFK Jr.

Speaker 1: He, of course, is a presidential candidate, activist, and author.

Speaker 2: Great to see you, sir.

Speaker 3: Good to see you. Thanks for having me back. It's wonderful to be back with you.

Speaker 2: Yeah, truly, it's truly our pleasure.

Speaker 1: I wanted to get, right off the top, your reaction to breaking news. Of course, we have new charges filed against former President Trump with regard to his handling of classified documents and alleged obstruction. Just wanted to get your reaction to those indictments.

Speaker 3: You know, I don't know enough about it. I would say this, and I'm not giving you any kind of special information or insight: I thought the New York case was a very weak case, and if I had been a prosecutor, I would not have brought it, mainly because it's the president of the United States, and so as a prosecutor you have to kind of tiptoe through the minefields, because you want to avoid the optics that this is a political prosecution. In the United States, we've always tried to avoid those optics. But this case looks much stronger, and the reason it's surprising to me is that the judge penetrated the attorney-client privilege, which almost never happens. Normally, the conversations between an attorney and a client are privileged, and no court can look at them. But the exception is that if the attorney and the client are plotting a crime together, or are involved in some kind of criminal conspiracy, then the courts will allow that privilege to be penetrated. And once that determination is made and that threshold is passed, where the court says, okay, you can now look at conversations between the client and his attorney, it'll be mayhem for the client, because the client was having those conversations assuming nobody would ever see them, and that he could say anything he wanted.
Speaker 3: I think that makes this case look very real, and I think very dangerous for President Trump.

Speaker 4: Sir, would you ever consider a pardon for President Trump?

Speaker 3: I wouldn't even talk about that at this point.

Speaker 2: Got it. Gotcha.

Speaker 1: We wanted to ask you, too, a little bit about your views of the current president, Biden. One of the concerns among the general public, and among Democratic primary voters as well, is about his age and his ability to perform the job entailed in being president of the United States. Do you personally think that Joe Biden is fit to serve as president for another four years?

Speaker 3: You know, I don't know. I don't have any idea. I think the people around him should be giving him good advice on that. If they think that there's a lack of mental acuity, or a lack of capacity to deal with crises, or a lack of ability to exercise independent and very good judgment, I think his family and his counselors have an obligation to our country to, you know, tell him to step down. But I don't have any kind of insight. I see some of the films.

Speaker 2: And stuff, right. And you know him, don't you?

Speaker 3: Yeah, but I haven't seen him in years. And when I had interactions with him, he was always very sharp, so I don't know what he's like now behind the scenes. I mean, we always see them in these kinds of bubbles, and so I have no idea how to assess that.

Speaker 1: Yeah, well, that's why it'd be good to have debates, so that the American people could assess for themselves.

Speaker 4: I agree, sir.

Speaker 5: One of the interesting primary differences between yourself and President Biden we've been talking about is the way that you would approach the war in Ukraine.
Speaker 5: So I'm curious. In recent polling that I have in front of me, about seventy-nine percent of Democrats, according to Axios/Ipsos, say that they support continuing military aid to Ukraine. Since you are a Democratic candidate, how do you plan on changing some people's minds within the Democratic base about the US approach to Ukraine?

Speaker 3: I mean, my only way of doing that is to continue talking about it. You know, I don't think you even have to change people's minds about the war. I mean, clearly Putin broke the law. He's a gangster. This was a brutal invasion that could have been avoided. I don't think you need to change people's minds about that. I think we all want peace. And you know the number of Ukrainians who have now died: some estimates are over three hundred and fifty thousand of the kids, Ukrainian kids, on the front lines, and then fourteen or fifteen thousand civilians, and that's not good for anybody. And then, you know, thirty to eighty thousand Russians. It's just not good for anybody. We should be looking for paths to peace. And I think most Americans, if you frame the polling question as "should we be looking for paths to peace," I think most people will agree that we should be, that this is not good for anybody. And it's definitely not good for us from a geopolitical strategy, because, you know, we said, well, we're going to bankrupt Russia, but we're really destroying the economy of Europe and pushing Russia into an embrace with China, which is the worst geopolitical outcome that we could possibly have. So I don't think this is good for us. It's certainly not good for America. And the cost of the war is enormous: one hundred and thirteen billion so far. You know, the entire budget of the EPA is twelve billion dollars a year.
Speaker 3: The budget for the CDC is twelve billion. We have fifty-seven percent of Americans who cannot put their hands on one thousand dollars if they have an emergency. We've just cut food stamps, the SNAP program, for thirty million Americans by ninety percent. You know, I have a friend who is a commercial fisherman, who worked his whole life building his business; he's now on disability, and he's been surviving on two hundred and eighty dollars a month in food stamps. He got a call on March first that that was cut to twenty-five dollars a month. Try to feed yourself on ninety cents a day. And meanwhile, because we're paying for all these wars, and sixteen billion dollars for the lockdowns, we have spiraling inflation, which has driven up the cost of primary foodstuffs seventy-six percent in two years. So he's dealing with this rising cost of food and his food stamps getting cut to twenty-five dollars. That same month, we topped off Ukraine aid at one hundred and thirteen billion, and we printed three hundred billion more to bail out the Silicon Valley banks. So we've got largesse for the bankers and for the war machine, and yet Americans are really in a place of terrible, terrible hardship. If we took the money that we're giving to Ukraine, we could pay for all of the food stamps for every American without any cuts at all. Those are the choices we're making: we're going to let people in our country go hungry in order to do this. I think we also need to look behind the sort of comic-book depiction of this war, that it's a black-and-white war, which is the same kind of comic-book depiction that they give us to justify every war. And we need to look at the US role, the series of US provocations without which this war would probably never have happened.

Speaker 2: I want to ask you a little bit more.
Speaker 3: It doesn't excuse Vladimir Putin, but we, the neocons in the White House, bear a lot of responsibility for this conflict.

Speaker 1: Yeah, I wanted to ask you a little bit more about that. Okay, so if you're president, you're going to cut back the military aid, and you're going to seek a diplomatic solution in Ukraine. What would you do to bolster the economic prospects of the American people? For example, do you support things like a federal jobs guarantee? Do you support a universal basic income? What would your economic platform be to help working-class Americans?

Speaker 3: I mean, my primary platform is to cut the costs of the military, you know, to get what we were told would be a peace dividend after the collapse of the Soviet Union and the end of the Cold War. In nineteen ninety-two, we were going to cut our military budget from about six hundred billion a year to two hundred billion a year, and instead of making, you know, a billion-dollar stealth bomber that can't fly in the rain, we were going to take that money and invest it in the schools, in infrastructure, and in rebuilding America. Instead, we've had the military-industrial complex and the neocons running our foreign policy, and instead of going down to two hundred billion, we're now at one point three trillion, if you look at all the costs associated with the military. We need to cut that back to somewhere around the levels that we were promised. Of course, we need to protect our country, but we can do that by building a fortress America, arming ourselves to the teeth at home, and then spending that money to rebuild the industrial base in this country and rebuild the middle class. I mean, my job is going to be to rebuild the American middle class.
Speaker 3: We had a period in this country, which I was lucky enough to grow up at the height of, beginning at the end of World War Two and running to about nineteen eighty, that economists call the Great Prosperity. It was the largest economic growth in the history of the world. We had the biggest and most robust middle class. The American middle class was an economic engine that generated half the wealth on the face of the earth. And people wanted our products. People loved our country. We were really worshiped around the globe and looked to for moral authority and leadership. And I would like to get back to that kind of America, where we focus on building our economy at home rather than on wars abroad.

Speaker 1: And, just to clarify, do you support either a federal jobs guarantee or a universal basic income as part of that program?

Speaker 3: I don't know about that. I need to look at those things, you know, and I need to talk to a lot of economists about the ups and downs of those ideas. I can see a lot of problems with them, which I think are obvious to anybody, and it's a real departure from American free-market capitalism. I'd like to try to give this system a chance to work.

Speaker 4: What do you think the minimum wage should be, sir?

Speaker 3: Let me say something about that. You know, the reason I wouldn't just say an outright no to a universal basic income and a guaranteed jobs program is because I don't really understand what AI is going to do to our country, and what self-driving, I mean, you know, even self-driving cars. I talked to Elon Musk about this a couple of weeks ago. I read an article, and I don't know if this is true or not, that over forty percent of American jobs involve driving.
Speaker 3: So if you cut away all the driving jobs, which is what the intention and aspiration of this new technology is, I do not know what that's going to do to the American economy. I mean, we've had big dislocations before, you know, the end of slavery, for example, in our country, when we had to make these big economic readjustments. And then the invention of the automobile around the beginning of the nineteen hundreds, when, you know, if you were a buggy whip manufacturer, that job disappeared, or if you were a buggy manufacturer, or ran a stable, those jobs disappeared in a couple of years. But they were replaced by new jobs, and in most cases they were better jobs. They were manufacturing jobs in factories, and those were very high paying, and that's what supported the growth of the American middle class. Now, I do not know. And you know, I'm talking to people right now about this. I had a long talk with Jack Dorsey this morning, and I'm talking with a lot of people in the tech sector about how we're going to prepare ourselves for the AI economy, when a lot of human jobs may disappear, even the writers in Hollywood. You know, one of the reasons for the strike right now is that they have ChatGPT, which can write a script for you in seconds. And a lot of people, even in high-level white-collar jobs, may be on the chopping block, and we may face massive, massive unemployment over the next few years. I think we really need the best minds in the world to come together and figure out how to protect ourselves against this. And so that's why I wouldn't instantaneously rule out, you know, some kind of universal income or guaranteed jobs, which normally I would have a lot of antipathy for, because I'm like a free-market capitalism guy.
Speaker 3: So that's why I'm hedging on that.

Speaker 5: Sure. Just to return to the minimum wage question, assuming, you know, nothing does change: do you have a specific number that you think the minimum wage should be?

Speaker 3: I don't have a specific number, but I think people should have a living wage in this country. You know, people should be able to feed themselves. Right now, thirty-five percent of Americans are not making enough money to pay for basic human needs, and that means food, transportation, and housing. That means those Americans are sitting on the precipice of a cliff, inches away from becoming homeless, and that is, you know, a catastrophe for our country. So we have to figure out ways to mitigate those outcomes, and if raising the minimum wage is one of the ways that we insulate them, we ought to be doing that.

Speaker 1: To your point about when we had a growing middle class, when we had shared prosperity: part of that picture was much higher union density. And I know you've been a vocal supporter of the writers who are on strike right now, and a vocal supporter of labor unions in general. I wanted to get more specifics from you on what you would do to help rebuild union density and union membership. As you know, right now the system is so rigged against workers that it's difficult for them to even exercise their rights to collectively bargain and to form a union. Do you support things like the PRO Act? Are you at all interested in the sectoral bargaining that Gavin Newsom has been experimenting with in California, and that other countries do as well?

Speaker 2: What are your views there?
Speaker 3: I mean, we need to rebuild unions in this country, because it's one of the key ways we can counterbalance, you know, the domination of our government by corporate power. When I was a kid, forty-five percent of the labor force was unionized. Today it's a fraction of that. The unions were a counterbalance to corporate power at the national level, and now that is kind of gone. Even the unions that are active are not participating in politics and the political system the way that they used to, and they need to get involved. You know, we need to protect collective bargaining. The right-to-work laws are damaging to that, and I oppose those. And I support, you know, any way that makes sense of growing the strength of worker solidarity.

Speaker 4: Sir.

Speaker 5: I've noticed, though, that you've gotten quite a bit of support recently from Chamath Palihapitiya and David Sacks, and you had the Twitter Spaces with Elon Musk, and at least some of these people are supporting your campaign. Given your support amongst them, how are you going to approach technology companies and whether they should be broken up or not?

Speaker 3: Well, oddly enough, you know, the conversation I had with Jack Dorsey this morning was exactly along those lines. And you know, I'm impressed, and I know a lot of people don't feel this way about Elon Musk and Jack Dorsey, but I really find them incredibly patriotic and incredibly committed to democracy.
Speaker 3: And you know, a lot of those activities and their behavior were kind of obscure to me, and I've been on the other end of the censorship. But I did take note, very early on in the pandemic, when the White House and Adam Schiff were asking the tech companies to censor people like me who were challenging, you know, a lot of the orthodoxies, that the one company that stood up was Twitter. And also, you know, although Elon Musk is now vilified by the left, he shouldn't be. To me, he's a hero. He's a guy who came in and restored free expression on Twitter. And you know, I think the left sees that and they say, well, you're letting Donald Trump talk now. But that was not his intention. His intention was to let everybody talk. Because he made that choice, he lost billions of dollars. And when I asked him about that in our conversation, he said, it's worth every penny that I lose, because we need free speech in America. Because if we don't have free speech in America, we don't have democracy in America, and if we don't have democracy in America, it's the end of the world. And I feel like he feels that way, and that he's going to be a good ally for me when I'm in office, and that Dorsey is too, because they have this very, very innate commitment to free speech. And I think they saw, I mean, you guys remember, we were all promised at the outset, when social media was selling itself to us, that it was going to democratize communications around the world, and instead these platforms have become an instrument for totalitarian control.
Speaker 3: And it's very ironic. But I do think that, you know, some of these guys, at least the ones that I'm talking to, including David Sacks, are absolutely committed to figuring out how to make censorship-proof social media sites, and they're doing that with Nostr. You know, Nostr, I think, is the first one, and I think that's really promising. And some of this blockchain technology that's coming out of Bitcoin will be very, very useful. And by the way, you know, I understand that some of my fellow liberals dislike Elon Musk because they think he's giving a voice to Donald Trump and to some of Donald Trump's supporters.

Speaker 2: Well, let me speak to that a little bit.

Speaker 3: I just want to remind you of this: in nineteen seventy-seven, there was a march by the American Nazi Party in Skokie, Illinois, through a Jewish neighborhood, and the liberal infrastructure, including the ACLU and everybody else, turned out to support their right to march. Nobody was agreeing with what they were saying; what they were saying was appalling and repulsive and repellent. But, you know, in our country we need to be willing to die for their right to say those things, because if somebody can censor them, they can censor the rest of us too, and that's just going to end up benefiting the oligarchy and the military-industrial complex.

Speaker 1: Well, we actually interviewed Jack Dorsey yesterday, so this is really relevant. And you know, I'm on the left, and I support free speech and oppose censorship as well. I supported bringing Donald Trump back onto Twitter. What has concerned me about Elon Musk's leadership is the places where he hasn't met his free speech commitments.
Speaker 1: And I'll give you a few specific examples: when he censored journalists who were critical of him; when he bent to demands from the Turkish government, ahead of a critical election there, to censor journalists who were digging into what Erdogan was up to; and also bending to requests from the Modi government in India as well. So what do you say to those critiques of Elon Musk, that he actually hasn't lived up to his commitment and his stated principle that he is in favor of free speech absolutism?

Speaker 3: Well, you know, my job is not defending Elon. I would bet you this, Krystal: if you asked him those questions, he would have a pretty good answer for them, and I can guess at part of it. You know, I don't know why he censored the people who were criticizing him. I know that he did respond to that, but I don't recall what his response was. But one of the things that Dorsey talked to me about this morning was the difficulty of operating in foreign countries, and now all across Europe, where they're demanding censorship. So you have to make a choice in some of these countries. And you know, Turkey is not a freedom-of-speech country. You have to make a choice: am I going to continue to operate this institution in this country, and bring some of the benefits that it does, and the revenue, to my shareholders and my company? Or am I going to make a stand here and die on this hill and, you know, report the truth, and then be shut down the next day? So, you know, I don't know what he would say, but I can imagine that those are some of the really difficult choices he was forced to make.

Speaker 1: Yeah, well, that is actually what he did say in response: that his commitment to free speech goes only as far as the laws of a given country provide, which raises a lot of questions.
Speaker 1: What Jack Dorsey actually brought up with us yesterday is that when he was head of Twitter, he tried to take a more global approach; this is in his words. So he received requests from the Indian government. They either threatened to raid or actually did raid his offices, according to him. They pulled staff from the country and were concerned about operating there, but they didn't accede to the demands. So what is your view of the way those things should be handled? Because, you know, if you look at any sort of totalitarian government, they're going to have egregious anti-free-speech laws on the books. Is it the responsibility of someone who claims to be a free speech advocate to stand up to those governments, or just to abide by whatever the law is and whatever that authoritarian government demands?

Speaker 3: Krystal, I think that's a really good question, and it's a really troubling issue. You know, right now it's becoming more and more difficult, because there are competitors within those countries that have the ability to completely replace institutions like Twitter. Twitter and, you know, some of these other social media companies are no longer all-powerful. If you go to China, each one of them has competitors there. And if you drive the American companies out, is free speech in that country more likely to be protected, or not? I don't know. I don't think at this point there's any hard-and-fast answer. And I suspect, although I'm not an expert on this, that for Jack Dorsey it was an easier decision when he made it than it probably is for Elon today, because of the changed landscape of social media and the growth of indigenous companies within those, you know, I'd say, tyrannical systems.
Speaker 5: Sir, just two last questions for you. I'm curious what your view is of corporations like Bud Light getting involved in Pride Month and in trans issues.

Speaker 3: I would say those are strategy choices for them. And my understanding of Bud Light is that they probably regret having made that choice, because there was such a strong consumer reaction. And that's probably the best way for these things to work out: for them to make their own decisions about how to operate and, you know, how to keep their consumers. I mean, listen, I would love companies to do the right thing on the environment, even if it's against their economic interests, and I'd love it if we had laws. But it's not a reliable way to change policy. You need laws that are actually enforceable, that do what laws in a democracy are supposed to do, which is to encourage, support, and reward good behavior and then to punish bad behavior. If we want to do it in a reliable way, we need to do it through legislation rather than relying on, you know, corporate goodwill. If corporations act in ways that are socially responsible, you know, I commend them, but I can't rely on that.

Speaker 2: Certainly. Saagar?

Speaker 5: Last question, on my pet issue, that I know a lot of people want to know: will you declassify all UFO-related documents as president?

Speaker 3: Yeah. Let me ask you something.

Speaker 5: Oh, please.

Speaker 3: Did you see this article last week about the guy, about...

Speaker 5: Yes.

Speaker 4: We've covered it extensively here.

Speaker 3: Yeah. And do you think that that was a psyop, or do you think that that was real?

Speaker 5: I believe it was real, based upon a lot of credible people who I know, who have spoken with him and have vetted him.

Speaker 3: Why don't you tell people what we're talking about?

Speaker 4: Sure, sure, yeah. For those who don't know:
Speaker 5: Dave Grusch, who's an intel whistleblower, came forward through the intelligence community process. He says that multiple alien spacecraft are in the possession of the United States government. The Inspector General of the Intelligence Community says that he is highly credible and is bringing forward urgent information. And actually, several representatives in Congress were asked about it just thirty minutes or so before you and I spoke, and many of them, including Senator Gillibrand and Senator Hawley (bipartisanship, if you will), said that the claims were credible, in that they had asked the intelligence community about them and the answers, quote unquote, rang true, at least in some cases.

Speaker 2: Yeah.

Speaker 3: Well, I can't wait to be president of the United States and dig into that one. That's really, that is fascinating.

Speaker 5: And what is your view of the UFO phenomenon, just your personal view?

Speaker 3: I don't really, I mean, I don't know anything about it, other than that I'm very, very good friends with Dan Aykroyd, who's kind of devoted his life to studying that phenomenon and is, you know, very, very convinced. But I've never seen a UFO. And, you know, the stuff that I've read, particularly about the Navy pilots who have recorded these encounters, seems, you know, I'm certainly not going to dismiss it. It seems like it's real, but I don't have a good grasp on it. And that article you were talking about was fascinating, because it was just exactly like Men in Black, that that is actually happening, you know, with the velcro and all of this.
Speaker 1: Yeah, well, I always say, I mean, I try to keep the skeptic hat on and say it's hard for me to imagine that they'd be able to maintain this cover-up across multiple governments, across multiple decades, administrations, all these years. But I have a feeling you might dispute the possibility of such a cover-up occurring.

Speaker 4: Exactly, especially given what happened to your own family.

Speaker 5: Sir, I know your staff says you've got to get out of here, so I just want to say thank you very much for joining us. It was really a pleasure to speak with you again, and you're welcome back on the show anytime.

Speaker 2: Love talking to you. Thank you.

Speaker 3: I love coming on the show, so thank you very much for having me.

Speaker 4: Thank you. Our pleasure.