1 00:00:01,680 --> 00:00:02,040 Speaker 1: Welcome. 2 00:00:02,080 --> 00:00:05,000 Speaker 2: It is Verdict with Senator Ted Cruz. Ben Ferguson with 3 00:00:05,040 --> 00:00:05,440 Speaker 2: you, Senator. 4 00:00:05,519 --> 00:00:07,440 Speaker 1: This is really fun. We're getting to do back 5 00:00:07,320 --> 00:00:10,559 Speaker 2: to back evenings with a live audience, which is really exciting. 6 00:00:10,560 --> 00:00:12,640 Speaker 2: And we've got a dear friend of yours and a 7 00:00:12,720 --> 00:00:15,440 Speaker 2: guest with us tonight getting to talk about some really 8 00:00:15,440 --> 00:00:19,360 Speaker 2: cool things, especially when it comes to education. Imagine starting 9 00:00:19,360 --> 00:00:21,919 Speaker 2: a university. That's a pretty cool idea, and we have 10 00:00:22,000 --> 00:00:22,800 Speaker 2: someone that's done that. 11 00:00:22,960 --> 00:00:23,880 Speaker 1: I'll let you do the intro. 12 00:00:24,360 --> 00:00:27,160 Speaker 3: Well, we're very proud to be in Austin, Texas tonight 13 00:00:27,280 --> 00:00:30,160 Speaker 3: with a very good friend of mine and someone who 14 00:00:30,280 --> 00:00:33,160 Speaker 3: I say, without hyperbole, is one of the smartest people 15 00:00:33,200 --> 00:00:37,160 Speaker 3: on planet Earth. We are with Joe Lonsdale. Joe Lonsdale 16 00:00:37,880 --> 00:00:41,160 Speaker 3: is a big tech entrepreneur. He is a venture capitalist. 17 00:00:42,040 --> 00:00:45,479 Speaker 3: He runs a major venture capital fund that invests in 18 00:00:45,520 --> 00:00:49,520 Speaker 3: tech companies across the country. He was the CEO of Palantir. 19 00:00:50,560 --> 00:00:56,880 Speaker 3: He led the big tech exodus from California to Austin, Texas.
20 00:00:56,880 --> 00:00:59,720 Speaker 3: And one of the amazing things we're seeing is Austin 21 00:00:59,720 --> 00:01:04,280 Speaker 3: tech is becoming a mecca for people in tech who 22 00:01:04,280 --> 00:01:09,440 Speaker 3: were not insane socialists. And Joe came 23 00:01:09,520 --> 00:01:14,160 Speaker 3: as sort of the search party. He was sent first, 24 00:01:14,840 --> 00:01:18,440 Speaker 3: and I think probably Silicon Valley was curious the reception 25 00:01:18,560 --> 00:01:21,120 Speaker 3: he would get, and they found the cannibals did not 26 00:01:21,200 --> 00:01:25,320 Speaker 3: eat him, and so he wired back: Come to Texas. 27 00:01:25,640 --> 00:01:28,319 Speaker 3: There's freedom here, there's sanity here. Joe, it is great 28 00:01:28,319 --> 00:01:29,520 Speaker 3: to be with you. Welcome to Verdict. 29 00:01:29,640 --> 00:01:31,480 Speaker 4: Thanks for being here, Ted, and thank you very much 30 00:01:31,520 --> 00:01:33,120 Speaker 4: for including me. It's very kind of you, says one 31 00:01:33,120 --> 00:01:35,120 Speaker 4: of the smartest senators. I appreciate the line there. 32 00:01:35,560 --> 00:01:38,360 Speaker 2: Let me tell you about Freedom Gold USA. If you're 33 00:01:38,480 --> 00:01:41,119 Speaker 2: like me and you like to make sure that you 34 00:01:41,200 --> 00:01:44,480 Speaker 2: protect your hard-earned dollars, if you're close to retirement, 35 00:01:44,560 --> 00:01:48,600 Speaker 2: if you're in retirement and you're watching what's happening with inflation, 36 00:01:49,120 --> 00:01:52,440 Speaker 2: you're watching your savings erode away, your purchasing power 37 00:01:52,560 --> 00:01:55,280 Speaker 2: erode away, well then it may be time for 38 00:01:55,320 --> 00:01:57,840 Speaker 2: you to take a look at gold and silver as 39 00:01:57,920 --> 00:02:01,840 Speaker 2: part of diversifying your financial portfolio.
It is something that 40 00:02:01,880 --> 00:02:04,880 Speaker 2: I've done, and I use Freedom Gold. In fact, right 41 00:02:04,920 --> 00:02:08,320 Speaker 2: now in my hand, I have an ounce of silver 42 00:02:08,520 --> 00:02:11,320 Speaker 2: that is part of my portfolio. And I have this 43 00:02:11,480 --> 00:02:14,160 Speaker 2: because, no matter what happens, we've got a debt that's 44 00:02:14,160 --> 00:02:18,880 Speaker 2: above thirty four trillion dollars, with a T. And we've 45 00:02:18,919 --> 00:02:22,560 Speaker 2: seen the push for central bank digital currencies. Our financial 46 00:02:22,600 --> 00:02:25,800 Speaker 2: freedom is at risk. That is why I want you 47 00:02:25,840 --> 00:02:28,600 Speaker 2: to know about Freedom Gold USA. They are ready to 48 00:02:28,639 --> 00:02:31,720 Speaker 2: help you preserve your wealth and provide you stability in 49 00:02:31,919 --> 00:02:35,040 Speaker 2: uncertain times. Now here's the big perk. If you call 50 00:02:35,080 --> 00:02:37,960 Speaker 2: them right now, you can see if you qualify for 51 00:02:38,080 --> 00:02:42,400 Speaker 2: up to ten thousand dollars in free silver. 52 00:02:42,840 --> 00:02:43,320 Speaker 1: That's right. 53 00:02:43,400 --> 00:02:45,480 Speaker 2: Learn how to add gold or silver to your IRA 54 00:02:46,040 --> 00:02:49,200 Speaker 2: or have it shipped directly to your home, and 55 00:02:49,320 --> 00:02:52,520 Speaker 2: safeguard your wealth with physical gold and silver. One eight 56 00:02:52,600 --> 00:02:57,320 Speaker 2: hundred sixty five five eight eight four three. That's one 57 00:02:57,360 --> 00:03:01,320 Speaker 2: eight hundred sixty five five eight eight four three. Use 58 00:03:01,400 --> 00:03:04,560 Speaker 2: the company that I use. Go to Freedom Gold USA 59 00:03:04,880 --> 00:03:10,720 Speaker 2: dot com slash verdict. That's Freedom Gold USA dot com 60 00:03:10,760 --> 00:03:13,079 Speaker 2: slash verdict, and talk to them.
See if you qualify 61 00:03:13,160 --> 00:03:16,480 Speaker 2: for up to ten thousand dollars in free silver. One 62 00:03:16,520 --> 00:03:19,079 Speaker 2: eight hundred sixty five five eight eight four three. 63 00:03:19,600 --> 00:03:22,160 Speaker 2: You've got something that I think is amazing, and it 64 00:03:22,240 --> 00:03:24,240 Speaker 2: deals with education. I want to start with that first, 65 00:03:24,280 --> 00:03:27,280 Speaker 2: because not only did you leave California, but you also 66 00:03:27,360 --> 00:03:30,160 Speaker 2: had this insane idea of, hey, I want to start 67 00:03:30,200 --> 00:03:31,360 Speaker 2: a university. 68 00:03:32,160 --> 00:03:34,920 Speaker 1: Where did that come from? And what is the goal and mission? 69 00:03:35,320 --> 00:03:37,360 Speaker 4: Oh, you know, we've been talking about the universities for 70 00:03:37,400 --> 00:03:38,760 Speaker 4: a while. I mean, all of us have seen over 71 00:03:38,760 --> 00:03:40,760 Speaker 4: the years how they've kind of gone the wrong direction. 72 00:03:40,840 --> 00:03:42,920 Speaker 4: But you know, when I was in university, 73 00:03:42,960 --> 00:03:44,440 Speaker 4: you know, I was there in two thousand to 74 00:03:44,440 --> 00:03:46,680 Speaker 4: two thousand and four, there were problems. There 75 00:03:46,720 --> 00:03:48,280 Speaker 4: was, you know, you'd get in trouble for being 76 00:03:48,320 --> 00:03:50,720 Speaker 4: politically incorrect. You'd be told not to talk about things. 77 00:03:50,760 --> 00:03:52,600 Speaker 4: I got my first B for trying to defend John 78 00:03:52,640 --> 00:03:54,600 Speaker 4: Locke in a humanities course, because you were, you know, 79 00:03:54,760 --> 00:03:55,520 Speaker 4: not supposed to do that. 80 00:03:55,840 --> 00:03:57,280 Speaker 1: But it was framed, that one, by the way. 81 00:03:57,360 --> 00:03:59,360 Speaker 4: Yeah, yeah, I know, you know, but it wasn't like...
82 00:03:59,400 --> 00:04:03,240 Speaker 3: Joe, that's called a humble brag. First B. That's true, 83 00:04:03,400 --> 00:04:04,560 Speaker 3: but it's freshman year. 84 00:04:04,560 --> 00:04:08,160 Speaker 4: It was. You know, but, you know, it wasn't 85 00:04:08,200 --> 00:04:09,920 Speaker 4: totally crazy. And I think a lot of people who 86 00:04:09,960 --> 00:04:12,920 Speaker 4: have not been at universities for a while don't realize 87 00:04:12,920 --> 00:04:15,360 Speaker 4: just how crazy these places have become over the 88 00:04:15,480 --> 00:04:17,520 Speaker 4: last, you know, five, six, seven years. There was, I 89 00:04:17,520 --> 00:04:19,520 Speaker 4: think, like a shift in our society. Maybe 90 00:04:19,560 --> 00:04:22,400 Speaker 4: it was around twenty fourteen, twenty fifteen. But I mean, 91 00:04:22,480 --> 00:04:24,640 Speaker 4: the Stanford Review, which Peter Thiel started and I was 92 00:04:24,680 --> 00:04:27,560 Speaker 4: involved with, which is a libertarian and conservative paper, 93 00:04:27,600 --> 00:04:28,839 Speaker 4: a lot of the kids there, it's like you 94 00:04:28,880 --> 00:04:30,960 Speaker 4: can't even admit you write for it anymore. It has 95 00:04:31,000 --> 00:04:33,040 Speaker 4: to be pseudonymous, because it would ruin your social life 96 00:04:33,080 --> 00:04:35,440 Speaker 4: and you'd be attacked for it. And these departments 97 00:04:35,480 --> 00:04:38,720 Speaker 4: have just gotten so radicalized and so broken. And, 98 00:04:38,839 --> 00:04:40,720 Speaker 4: you know, the administrators: over the 99 00:04:40,800 --> 00:04:43,080 Speaker 4: last twenty years, they've tripled the size of the administrations. There's 100 00:04:43,080 --> 00:04:45,560 Speaker 4: more administrators now at Yale than there are students. That's as many 101 00:04:45,520 --> 00:04:48,600 Speaker 4: administrators at Harvard. This DEI thing has come in.
It's 102 00:04:48,640 --> 00:04:51,160 Speaker 4: anti merit. It's just it's just so broken. So I 103 00:04:51,200 --> 00:04:53,440 Speaker 4: think people don't realize that, you know, I actually believe 104 00:04:53,520 --> 00:04:55,520 Speaker 4: university has played a really important role in our society 105 00:04:55,560 --> 00:04:57,240 Speaker 4: the last hundred years. And you know, I was really 106 00:04:57,279 --> 00:04:59,880 Speaker 4: I'm really lucky to have friends such as Neil Ferguson, 107 00:04:59,920 --> 00:05:02,600 Speaker 4: the great historian, Barry Weiss, you know who I think 108 00:05:02,680 --> 00:05:04,440 Speaker 4: runs one of the most important media companies in the 109 00:05:04,520 --> 00:05:07,080 Speaker 4: US today, as my two co founders and all of 110 00:05:07,160 --> 00:05:10,000 Speaker 4: us realize, you know, American institutions are breaking, they're failing. 111 00:05:10,040 --> 00:05:12,000 Speaker 4: It's bad for our country. You want you to do 112 00:05:12,000 --> 00:05:14,159 Speaker 4: in America when these things are broken, you build new ones. 113 00:05:14,440 --> 00:05:16,919 Speaker 3: And Joe, what you're doing here is incredibly important. So 114 00:05:17,000 --> 00:05:20,760 Speaker 3: you founded the University of Austin. Tell listeners of the 115 00:05:20,760 --> 00:05:23,880 Speaker 3: podcast what the University of Austin is and what's the vision. 116 00:05:23,880 --> 00:05:25,680 Speaker 3: What is it trying to accomplish. 117 00:05:25,920 --> 00:05:29,159 Speaker 4: You know, we're trying to build a new great university 118 00:05:29,520 --> 00:05:32,680 Speaker 4: in America that competes with Harvard, Yale, Princeton, Stanford, MIT. 
119 00:05:32,800 --> 00:05:34,960 Speaker 4: It competes for the best and brightest, and we want 120 00:05:35,000 --> 00:05:37,960 Speaker 4: to have one of these places where it actually pursues truth, 121 00:05:38,200 --> 00:05:41,520 Speaker 4: where it doesn't defer to who's offended, doesn't defer to 122 00:05:41,560 --> 00:05:43,919 Speaker 4: a crazy ideology. You don't have a bunch of, you know, 123 00:05:43,920 --> 00:05:46,880 Speaker 4: administrators hounding you down. You actually are focused on 124 00:05:47,040 --> 00:05:51,200 Speaker 4: teaching young people to be courageous, to basically learn how 125 00:05:51,200 --> 00:05:53,360 Speaker 4: to speak up, learn how to confront problems in our society. 126 00:05:53,360 --> 00:05:55,400 Speaker 4: If you can have even, like, a small number of 127 00:05:55,480 --> 00:05:57,719 Speaker 4: the people who go on to run our elites 128 00:05:57,839 --> 00:05:59,400 Speaker 4: learn how to speak out, learn how to call things out... 129 00:06:00,080 --> 00:06:02,800 Speaker 3: Where is the university in the journey of its founding 130 00:06:02,800 --> 00:06:04,719 Speaker 3: and becoming established and growing? 131 00:06:04,839 --> 00:06:06,000 Speaker 4: So, you know, it turns out, like a lot of 132 00:06:06,040 --> 00:06:08,680 Speaker 4: other industries, there's a big cartel for starting these things. 133 00:06:08,760 --> 00:06:10,640 Speaker 4: We had to do two thousand pages of regulation. We 134 00:06:10,760 --> 00:06:12,520 Speaker 4: had to, you know, go through all sorts of things. 135 00:06:12,560 --> 00:06:15,720 Speaker 4: But we are officially an operating university, the first new private 136 00:06:15,800 --> 00:06:18,520 Speaker 4: university in Texas in over sixty years.
And we've done, 137 00:06:18,600 --> 00:06:20,560 Speaker 4: we've done all sorts of different events and seminars, and 138 00:06:20,560 --> 00:06:23,880 Speaker 4: we have our first, our founding undergraduate class joining right now. 139 00:06:23,920 --> 00:06:26,040 Speaker 4: There's gonna be a hundred students joining in the fall. You know, 140 00:06:26,040 --> 00:06:27,039 Speaker 4: we've had over five thousand professors... 141 00:06:27,040 --> 00:06:29,360 Speaker 3: Are they all freshmen that are starting, or 142 00:06:29,360 --> 00:06:31,160 Speaker 3: how does it work in terms of the 143 00:06:31,160 --> 00:06:31,920 Speaker 3: class coming 144 00:06:31,720 --> 00:06:34,080 Speaker 4: in? We're defining them as all freshmen, although we 145 00:06:34,080 --> 00:06:35,880 Speaker 4: may admit a few people who've gone to a 146 00:06:35,920 --> 00:06:37,919 Speaker 4: couple of the top schools and are fleeing them, and 147 00:06:38,000 --> 00:06:40,640 Speaker 4: so, you know, we include them with us instead, of course. 148 00:06:40,520 --> 00:06:42,839 Speaker 3: Now, are there particular majors that are being offered to 149 00:06:42,839 --> 00:06:43,240 Speaker 3: start with? 150 00:06:43,279 --> 00:06:45,039 Speaker 5: What are the students going to be studying? 151 00:06:45,080 --> 00:06:47,000 Speaker 4: Well, you know, we want all the students to have 152 00:06:47,320 --> 00:06:50,480 Speaker 4: sort of an intellectual foundation of, like, the Great Debates 153 00:06:50,480 --> 00:06:53,360 Speaker 4: of Western civilization, kind of like the core, what's called 154 00:06:53,360 --> 00:06:55,760 Speaker 4: a classical liberal core. But then you have different centers.
155 00:06:55,800 --> 00:06:58,520 Speaker 4: We have a Center for Economics, History and Politics, and 156 00:06:58,520 --> 00:07:00,160 Speaker 4: we have people who've given up tenure at places 157 00:07:00,200 --> 00:07:03,240 Speaker 4: like the University of Chicago and Columbia and other places like that 158 00:07:03,279 --> 00:07:05,560 Speaker 4: to teach there. We have a center for STEM. 159 00:07:05,720 --> 00:07:07,560 Speaker 4: Our friend Elon Musk built a lot of things here, 160 00:07:07,600 --> 00:07:10,000 Speaker 4: and people who help run SpaceX and the Boring Company are 161 00:07:10,120 --> 00:07:11,720 Speaker 4: helping us shape some of the STEM to make sure 162 00:07:11,720 --> 00:07:13,320 Speaker 4: these students are people they want to partner with. So 163 00:07:13,840 --> 00:07:15,760 Speaker 4: we have multiple different electives as well. 164 00:07:16,040 --> 00:07:18,600 Speaker 2: Just like any other college, raising money is something that 165 00:07:18,640 --> 00:07:21,880 Speaker 2: obviously is vitally important. And then you have moments where 166 00:07:21,920 --> 00:07:23,880 Speaker 2: you can see, where probably, instead of you going to 167 00:07:23,920 --> 00:07:26,960 Speaker 2: people, telling them the story, they start coming to you. 168 00:07:27,760 --> 00:07:30,480 Speaker 2: Did that just happen when we saw so much anti- 169 00:07:30,640 --> 00:07:33,760 Speaker 2: Israel rhetoric on college campuses, where then people that you 170 00:07:33,920 --> 00:07:36,200 Speaker 2: know came to you and said, hey, maybe I do 171 00:07:36,240 --> 00:07:37,640 Speaker 2: want to get involved in this idea? 172 00:07:37,720 --> 00:07:39,360 Speaker 1: Maybe you're onto something here. 173 00:07:39,640 --> 00:07:41,880 Speaker 4: Yeah, you know, a lot of people, like, hadn't 174 00:07:41,920 --> 00:07:44,120 Speaker 4: realized just how broken the universities are.
They always thought 175 00:07:44,120 --> 00:07:46,840 Speaker 4: it was something, guys like Ted and I like to 176 00:07:46,920 --> 00:07:48,440 Speaker 4: argue with the crazy people on the far left, and 177 00:07:48,440 --> 00:07:51,000 Speaker 4: we've probably always called these places out. But they've gotten 178 00:07:51,040 --> 00:07:52,880 Speaker 4: a lot worse since we were arguing with them twenty 179 00:07:52,960 --> 00:07:55,480 Speaker 4: years ago. And a lot of these people 180 00:07:55,480 --> 00:07:58,360 Speaker 4: finally woke up after October seventh, and after they saw, 181 00:07:58,480 --> 00:08:01,880 Speaker 4: obviously, the presidents of Harvard and MIT and 182 00:08:01,960 --> 00:08:04,320 Speaker 4: Penn going and making total fools of themselves, and 183 00:08:04,320 --> 00:08:06,240 Speaker 4: they started to look a little more closely. And then, 184 00:08:06,280 --> 00:08:08,480 Speaker 4: of course, the plagiarism scandal comes out, and then, 185 00:08:08,480 --> 00:08:10,640 Speaker 4: if you've been paying attention, it turns out that it's 186 00:08:10,640 --> 00:08:13,320 Speaker 4: not just Claudine Gay, it's all the leadership of the 187 00:08:13,360 --> 00:08:15,720 Speaker 4: DEI and the Title IX office, all plagiarized, all their stuff. 188 00:08:15,720 --> 00:08:17,760 Speaker 4: Because guess what, if you have a philosophy that's anti- 189 00:08:17,840 --> 00:08:20,360 Speaker 4: merit, maybe you yourself aren't doing things that are meritocratic, 190 00:08:20,400 --> 00:08:20,560 Speaker 4: you know. 191 00:08:20,600 --> 00:08:22,800 Speaker 3: Look, I don't know about you, Joe, but I, for one, 192 00:08:23,040 --> 00:08:26,840 Speaker 3: was really inspired when former Harvard president Claudine Gay 193 00:08:26,960 --> 00:08:29,120 Speaker 3: wrote the immortal words, we have nothing to fear but 194 00:08:29,160 --> 00:08:29,880 Speaker 3: fear itself.
195 00:08:33,120 --> 00:08:35,560 Speaker 1: First time she's ever written that. It was amazing, you know. 196 00:08:35,600 --> 00:08:39,520 Speaker 3: Then she said, I have a dream, yes, and then, 197 00:08:39,559 --> 00:08:40,640 Speaker 3: e pluribus unum. 198 00:08:43,240 --> 00:08:44,160 Speaker 1: It's really brilliant. 199 00:08:44,240 --> 00:08:46,480 Speaker 4: Let's just be honest, and it shows you how rotten 200 00:08:46,520 --> 00:08:48,880 Speaker 4: these places are. They've now done an investigation. 201 00:08:49,000 --> 00:08:51,760 Speaker 4: The board of Harvard didn't even look into her scholarship 202 00:08:51,760 --> 00:08:53,520 Speaker 4: before making her president. They didn't even look into it 203 00:08:53,559 --> 00:08:55,199 Speaker 4: at all. And it's clear, because she was 204 00:08:55,240 --> 00:08:57,040 Speaker 4: not hired for being a great scholar. She was hired, 205 00:08:57,280 --> 00:08:59,600 Speaker 4: obviously, for other reasons, for her gender and 206 00:08:59,600 --> 00:08:59,920 Speaker 4: her sex. 207 00:09:00,320 --> 00:09:03,080 Speaker 3: So how are you finding your faculty to assemble a 208 00:09:03,160 --> 00:09:06,040 Speaker 3: new university? You've got people like Niall Ferguson. You've got 209 00:09:06,040 --> 00:09:09,240 Speaker 3: Bari Weiss, who, I would note, if you have not 210 00:09:09,440 --> 00:09:13,240 Speaker 3: read Bari Weiss's resignation letter from the editorial board of 211 00:09:13,240 --> 00:09:16,160 Speaker 3: The New York Times, it is one of the most 212 00:09:16,200 --> 00:09:21,240 Speaker 3: important things written in the past decade, and it is 213 00:09:21,520 --> 00:09:25,520 Speaker 3: the most concise and effective indictment of the corruption of 214 00:09:25,640 --> 00:09:28,719 Speaker 3: corporate journalism that I've read anywhere. So, the two of them, 215 00:09:28,760 --> 00:09:30,079 Speaker 3: how did you team up with them?
And how did 216 00:09:30,120 --> 00:09:32,400 Speaker 3: you find the other members of your faculty? 217 00:09:32,640 --> 00:09:33,760 Speaker 4: You know, I have to give Marc 218 00:09:33,840 --> 00:09:35,839 Speaker 4: Andreessen credit for introducing me to Bari. We both were 219 00:09:35,840 --> 00:09:37,720 Speaker 4: talking to him about the need to rebuild our broken 220 00:09:37,760 --> 00:09:40,480 Speaker 4: institutions in the US, whether it's media, whether it's universities, 221 00:09:40,760 --> 00:09:42,440 Speaker 4: whether it's so much else we need to fix. 222 00:09:42,520 --> 00:09:45,480 Speaker 4: And I'm so inspired by her. Niall's been her friend 223 00:09:45,520 --> 00:09:47,000 Speaker 4: for a very long time. I think he's the greatest 224 00:09:47,000 --> 00:09:50,240 Speaker 4: living historian, and a lot of other people are attracted 225 00:09:50,240 --> 00:09:53,120 Speaker 4: to working with people like that. So, you know, we 226 00:09:53,960 --> 00:09:56,240 Speaker 4: announced this, and we had, in the first few months, 227 00:09:56,240 --> 00:09:58,600 Speaker 4: five thousand professors send us notes to inquire 228 00:09:58,600 --> 00:10:00,480 Speaker 4: about working with us. So there's no shortage of them. 229 00:10:00,679 --> 00:10:02,000 Speaker 4: This is the place people want to be part of. 230 00:10:02,240 --> 00:10:07,320 Speaker 3: Five thousand professors. That's worth underscoring. Look, there are, and 231 00:10:07,400 --> 00:10:09,080 Speaker 3: I think this is true in every one of our 232 00:10:09,120 --> 00:10:12,200 Speaker 3: institutions that is corrupted and captured by the left, there 233 00:10:12,200 --> 00:10:16,600 Speaker 3: are people trapped within them who have not lost their minds, 234 00:10:17,640 --> 00:10:19,520 Speaker 3: but they're scared.
They still want to earn a living, 235 00:10:19,559 --> 00:10:21,400 Speaker 3: they want a job. But they know if they open 236 00:10:21,480 --> 00:10:25,520 Speaker 3: their mouth they risk being canceled, being fired, being thrown out. 237 00:10:25,840 --> 00:10:27,960 Speaker 3: But I think that's true at universities. I think that 238 00:10:28,040 --> 00:10:30,679 Speaker 3: is true in entertainment. I think that is true in journalism. 239 00:10:30,880 --> 00:10:33,520 Speaker 3: I think that's true in big tech. Let me shift, 240 00:10:33,920 --> 00:10:36,920 Speaker 3: you know, to the world of big tech. Well, you know, 241 00:10:37,000 --> 00:10:40,520 Speaker 3: I think back to big tech fifteen years ago, and 242 00:10:40,600 --> 00:10:42,760 Speaker 3: I think at the time, big tech was really at 243 00:10:42,760 --> 00:10:45,319 Speaker 3: a fork in the road, and it could have gone 244 00:10:45,360 --> 00:10:49,720 Speaker 3: down the road of embracing a libertarian utopia, of saying, 245 00:10:50,080 --> 00:10:52,720 Speaker 3: leave us the hell alone, we're gonna be entrepreneurs, we're 246 00:10:52,720 --> 00:10:56,920 Speaker 3: gonna invent a new world. Or it could have gone 247 00:10:56,920 --> 00:11:00,280 Speaker 3: down the road they chose instead, which is nanny-state 248 00:11:00,320 --> 00:11:03,520 Speaker 3: totalitarianism: we have the power, and we will use 249 00:11:03,559 --> 00:11:05,600 Speaker 3: the power to silence anyone who dares 250 00:11:05,520 --> 00:11:06,960 Speaker 5: speak, speak out. 251 00:11:07,280 --> 00:11:09,960 Speaker 3: Do you agree with that assessment? And if so, why 252 00:11:10,000 --> 00:11:12,720 Speaker 3: did they choose road number two? 253 00:11:13,240 --> 00:11:15,560 Speaker 4: I do agree with that. And to tie it back 254 00:11:15,600 --> 00:11:18,960 Speaker 4: to what we were just talking about, Ted, these cultures come 255 00:11:19,000 --> 00:11:24,400 Speaker 4: from our universities.
Google is hiring thousands of PhDs out 256 00:11:24,400 --> 00:11:26,280 Speaker 4: of these universities who've just grown up in that culture 257 00:11:26,320 --> 00:11:29,640 Speaker 4: their entire life. Amazon, Microsoft, Facebook. These tech 258 00:11:29,679 --> 00:11:32,200 Speaker 4: cultures and university cultures, they're one and the same. And, 259 00:11:32,920 --> 00:11:35,080 Speaker 4: you know, the university culture is that rotten. That's what 260 00:11:35,120 --> 00:11:36,959 Speaker 4: these kids grow up in their whole life. And it's 261 00:11:37,160 --> 00:11:39,640 Speaker 4: interesting, because what you kind of learn at these places is, rather 262 00:11:39,679 --> 00:11:41,960 Speaker 4: than the university teaching you to be courageous and speak up, 263 00:11:41,960 --> 00:11:44,920 Speaker 4: you learn: shut up, virtue signal, go along, or you're 264 00:11:44,920 --> 00:11:46,960 Speaker 4: going to get in trouble. And you learn there's gonna 265 00:11:46,960 --> 00:11:49,120 Speaker 4: be five percent of crazy people on the far left, 266 00:11:49,320 --> 00:11:51,920 Speaker 4: and when they shout, you obey, because that's how you 267 00:11:51,920 --> 00:11:53,560 Speaker 4: stay out of trouble. And that's the way these companies 268 00:11:53,600 --> 00:11:53,960 Speaker 4: are run. 269 00:11:54,160 --> 00:11:56,600 Speaker 5: Now, is there a tipping point there? 270 00:11:56,640 --> 00:11:59,200 Speaker 3: There have been a handful of people who have shown 271 00:12:00,080 --> 00:12:03,520 Speaker 3: real courage in the tech space. There's you, there's Elon Musk, 272 00:12:03,679 --> 00:12:07,040 Speaker 3: there's Peter Thiel, there's Palmer Luckey, there's Larry Ellison. 273 00:12:07,080 --> 00:12:11,400 Speaker 5: There are a few. How many others are there?
274 00:12:11,440 --> 00:12:14,360 Speaker 3: And do you see a tipping point where others will 275 00:12:14,360 --> 00:12:17,400 Speaker 3: feel like, wait, I can speak out in support of 276 00:12:17,440 --> 00:12:20,280 Speaker 3: free enterprise, I can speak out in support of free speech, 277 00:12:20,400 --> 00:12:25,359 Speaker 3: I can stand up to the Borg, the collective mentality 278 00:12:25,440 --> 00:12:26,480 Speaker 3: of Silicon Valley? 279 00:12:27,240 --> 00:12:29,160 Speaker 4: You know, I think a lot of people are scared, 280 00:12:29,200 --> 00:12:31,480 Speaker 4: and they're scared for good reason. This is where I'd 281 00:12:31,480 --> 00:12:33,440 Speaker 4: actually have a little bit of empathy toward a lot of these 282 00:12:33,440 --> 00:12:36,440 Speaker 4: friends of mine. I get texts every time that we 283 00:12:36,520 --> 00:12:38,800 Speaker 4: put something online that you and I like, and we're 284 00:12:38,800 --> 00:12:40,640 Speaker 4: speaking out or are being strong. I get texts from 285 00:12:40,640 --> 00:12:43,199 Speaker 4: people who run multibillion-dollar companies. I get texts from 286 00:12:43,200 --> 00:12:46,200 Speaker 4: people whose companies support hundreds of thousands of other companies, 287 00:12:46,520 --> 00:12:48,600 Speaker 4: and they're terrified, if they're supporting hundreds of thousands of 288 00:12:48,640 --> 00:12:52,080 Speaker 4: companies, that if they become... You know, we were drinking vodka 289 00:12:52,120 --> 00:12:53,480 Speaker 4: with the name of a friend here in Austin on it. 290 00:12:53,600 --> 00:12:55,679 Speaker 4: He does not do politics, because he knows that if 291 00:12:55,679 --> 00:12:58,600 Speaker 4: he becomes controversial, it could hurt him. So there's a 292 00:12:58,640 --> 00:12:59,959 Speaker 4: lot of fear right now in the community, and the 293 00:13:00,080 --> 00:13:02,280 Speaker 4: far left is very good at demonizing people who speak out.
294 00:13:02,360 --> 00:13:05,960 Speaker 2: You moved a company from California to Texas. There's a 295 00:13:05,960 --> 00:13:08,440 Speaker 2: lot of people that love that, but they're also saying, 296 00:13:08,480 --> 00:13:13,000 Speaker 2: don't California my Texas. When people move here, I'm assuming 297 00:13:13,000 --> 00:13:15,840 Speaker 2: you had people that didn't agree with your conservative values 298 00:13:15,880 --> 00:13:16,240 Speaker 2: that came. 299 00:13:16,720 --> 00:13:18,559 Speaker 1: Do they see life differently? I mean, it's been a 300 00:13:18,600 --> 00:13:19,120 Speaker 1: couple of years now. 301 00:13:19,120 --> 00:13:21,319 Speaker 2: Do they come and go, hey, it's actually a better 302 00:13:21,840 --> 00:13:24,120 Speaker 2: way of life, and they like freedom, and they're starting 303 00:13:24,120 --> 00:13:25,640 Speaker 2: to come around to it? Or do they just move 304 00:13:25,679 --> 00:13:28,280 Speaker 2: here because they say, okay, well, it's more freedom during 305 00:13:28,320 --> 00:13:30,240 Speaker 2: COVID and I pay less taxes, but I'm still the 306 00:13:30,280 --> 00:13:31,120 Speaker 2: same person voting? 307 00:13:31,360 --> 00:13:33,560 Speaker 4: You know, I want to give you a statistic you probably know, 308 00:13:33,640 --> 00:13:35,920 Speaker 4: and obviously I'm a huge fan of yours, Senator. 309 00:13:35,960 --> 00:13:37,600 Speaker 4: The last race you ran was closer than it should 310 00:13:37,600 --> 00:13:39,559 Speaker 4: have been. If it wasn't for people who had moved 311 00:13:39,600 --> 00:13:41,600 Speaker 4: to Texas, the race would have gone the other way 312 00:13:41,720 --> 00:13:44,319 Speaker 4: by the numbers.
And so it turns out that, on 313 00:13:44,440 --> 00:13:46,079 Speaker 4: the whole, the people who choose to move 314 00:13:46,120 --> 00:13:48,880 Speaker 4: here tend to be even more on the side of liberty, 315 00:13:48,960 --> 00:13:51,040 Speaker 4: more on the side of freedom, even, than the people 316 00:13:51,040 --> 00:13:53,640 Speaker 4: who were born here. So I think it's fair to 317 00:13:53,679 --> 00:13:55,760 Speaker 4: be really worried about these crazy people coming in. But 318 00:13:55,800 --> 00:13:58,280 Speaker 4: you should know, the people who choose to come here, like, 319 00:13:58,640 --> 00:14:01,800 Speaker 4: we're fleeing something that's broken, and we're coming here because, 320 00:14:02,120 --> 00:14:05,120 Speaker 4: if America falls, we're screwed, right? And a lot of 321 00:14:05,160 --> 00:14:06,680 Speaker 4: my friends, by the way, have given up. I have 322 00:14:06,720 --> 00:14:09,120 Speaker 4: billionaire friends who are living in Switzerland, or living in Singapore, 323 00:14:09,160 --> 00:14:10,880 Speaker 4: who say, Joe, the woke guys are in charge. 324 00:14:11,000 --> 00:14:13,040 Speaker 4: You're done. My wife and I are here in Texas 325 00:14:13,040 --> 00:14:14,760 Speaker 4: because we are making a stand here for America, and 326 00:14:14,800 --> 00:14:15,920 Speaker 4: those are our values, and those are a lot of 327 00:14:15,920 --> 00:14:16,680 Speaker 4: our friends' values. 328 00:14:18,240 --> 00:14:20,680 Speaker 2: You've got kids. How much of that was your decision 329 00:14:20,760 --> 00:14:22,640 Speaker 2: as well? I mean, when you've got young kids... You 330 00:14:22,640 --> 00:14:24,680 Speaker 2: and I are actually the same age, graduated the same year. 331 00:14:26,080 --> 00:14:27,400 Speaker 2: And I know for me that... 332 00:14:27,640 --> 00:14:30,040 Speaker 3: By the way, Ben, he's made gazillions of dollars, he's 333 00:14:30,040 --> 00:14:31,840 Speaker 3: been a major CEO.
What the hell have you done 334 00:14:31,840 --> 00:14:32,280 Speaker 3: with your life? 335 00:14:32,320 --> 00:14:34,360 Speaker 1: Man, I'm a co-host with Senator Ted Cruz. I've 336 00:14:34,400 --> 00:14:40,120 Speaker 1: got that going, right, you know. But how much was 337 00:14:40,160 --> 00:14:40,440 Speaker 1: it your... 338 00:14:41,200 --> 00:14:44,360 Speaker 3: And I will tell you, back in a moment, and 339 00:14:44,440 --> 00:14:46,840 Speaker 3: I will tell you, Joe does have a basketball court, 340 00:14:46,920 --> 00:14:49,840 Speaker 3: and we are gonna shortly play hoops. I'm told Joe 341 00:14:49,920 --> 00:14:53,600 Speaker 3: has a pretty serious hoops game. Ben in high school 342 00:14:53,760 --> 00:14:56,040 Speaker 3: was a center, and, you know, has got some mass. 343 00:14:56,480 --> 00:14:59,240 Speaker 3: So we're gonna see how things play out. 344 00:15:00,320 --> 00:15:02,080 Speaker 4: He's a little bigger guy than you. I wouldn't talk too 345 00:15:02,160 --> 00:15:03,440 Speaker 4: much trash before a game. 346 00:15:03,520 --> 00:15:04,880 Speaker 1: Somebody's going to be out of here. 347 00:15:06,080 --> 00:15:09,000 Speaker 3: This is the only part of my game, trash talking. 348 00:15:09,120 --> 00:15:12,360 Speaker 5: If I give up on trash, I'd have zero game left. 349 00:15:14,120 --> 00:15:16,520 Speaker 4: Ben, we're loving raising our four daughters here. I think 350 00:15:16,560 --> 00:15:18,440 Speaker 4: this is a great place to raise kids. I think 351 00:15:18,480 --> 00:15:19,920 Speaker 4: it's a lot more tolerant a place of a lot of 352 00:15:19,920 --> 00:15:21,880 Speaker 4: different views. It's funny to say, it used to be San Francisco 353 00:15:21,920 --> 00:15:24,400 Speaker 4: that was quote-unquote tolerant, but it actually is not. Like, 354 00:15:24,400 --> 00:15:26,080 Speaker 4: if you speak out and you're on the other side, 355 00:15:26,080 --> 00:15:27,840 Speaker 4: you're in trouble there.
I think it's very accepting 356 00:15:28,200 --> 00:15:30,440 Speaker 4: here where we live in Texas, and yeah, I mean, listen, 357 00:15:30,560 --> 00:15:32,960 Speaker 4: you guys have all heard the stories. I had friends whose 358 00:15:33,040 --> 00:15:34,960 Speaker 4: kids were in first grade at the local private 359 00:15:34,960 --> 00:15:36,840 Speaker 4: school that we had heard was the best one, and 360 00:15:36,880 --> 00:15:38,680 Speaker 4: they came to us and they were distraught, and they said, 361 00:15:38,680 --> 00:15:40,600 Speaker 4: you know, the teacher lined the kids up today and 362 00:15:40,640 --> 00:15:42,120 Speaker 4: told them to line up by gender, and then 363 00:15:42,160 --> 00:15:43,480 Speaker 4: she yelled at them for half an hour about how 364 00:15:43,520 --> 00:15:45,640 Speaker 4: there are not only two genders, and they're confused. And then 365 00:15:45,640 --> 00:15:48,360 Speaker 4: he's like, I mean, I'm not obsessed with this stuff myself, 366 00:15:48,400 --> 00:15:49,880 Speaker 4: but if the teachers are obsessed with it, that's a kind 367 00:15:49,920 --> 00:15:51,000 Speaker 4: of weird place to raise kids. 368 00:15:52,160 --> 00:15:53,760 Speaker 5: What's the path to take them back? 369 00:15:53,800 --> 00:15:56,400 Speaker 3: So with University of Austin, you are fighting to try 370 00:15:56,400 --> 00:15:59,720 Speaker 3: to take back universities. I think that's incredibly important. You know, 371 00:15:59,760 --> 00:16:02,480 Speaker 3: I guess to say, as a parent, you know, you 372 00:16:02,520 --> 00:16:05,160 Speaker 3: sit here and think, do you spend hundreds 373 00:16:05,160 --> 00:16:08,360 Speaker 3: of thousands of dollars to send your kid to a 374 00:16:08,400 --> 00:16:11,840 Speaker 3: school that will try to brainwash them to hate America 375 00:16:11,880 --> 00:16:15,120 Speaker 3: and hate you, exactly. And it's hard 376 00:16:14,880 --> 00:16:15,480 Speaker 5: to know what to do.
377 00:16:15,520 --> 00:16:16,800 Speaker 3: But at the same time, you want your kids to 378 00:16:16,800 --> 00:16:20,120 Speaker 3: do well. And education has been the key to success 379 00:16:20,280 --> 00:16:22,720 Speaker 3: so often in America that I know a lot of 380 00:16:22,760 --> 00:16:25,160 Speaker 3: parents that are just almost paralyzed. 381 00:16:25,160 --> 00:16:25,720 Speaker 5: What do I do? 382 00:16:25,800 --> 00:16:28,880 Speaker 3: You're fighting to take that institution back. Do you have 383 00:16:29,040 --> 00:16:32,240 Speaker 3: hope we can take universities back? And then I'm gonna 384 00:16:32,240 --> 00:16:33,680 Speaker 3: ask you the same question on big tech. 385 00:16:34,160 --> 00:16:36,120 Speaker 4: You know, on universities, we've got to build some new ones. 386 00:16:36,120 --> 00:16:38,360 Speaker 4: They're gonna influence the broken ones to be better. 387 00:16:39,000 --> 00:16:41,560 Speaker 4: I think it's gonna take multiple ones that we build, 388 00:16:41,560 --> 00:16:43,480 Speaker 4: and yes, I think we can shift things back. 389 00:16:43,480 --> 00:16:46,120 Speaker 4: We're not gonna reconquer Harvard and Yale. I mean, 390 00:16:46,160 --> 00:16:49,120 Speaker 4: you have to basically realize, at these places, it's the administrators, 391 00:16:49,120 --> 00:16:51,360 Speaker 4: it's the departments through their own hiring, it's the 392 00:16:52,000 --> 00:16:53,760 Speaker 4: lawyers who are in charge, it's the board that's in 393 00:16:53,800 --> 00:16:56,280 Speaker 4: charge. You're not gonna reconquer those schools, but you 394 00:16:56,320 --> 00:16:57,840 Speaker 4: can influence them to be better and you can build 395 00:16:57,840 --> 00:16:59,640 Speaker 4: better ones. I actually think we have a better chance 396 00:16:59,720 --> 00:17:01,960 Speaker 4: in K through twelve than we do in universities, and 397 00:17:02,000 --> 00:17:04,439 Speaker 4: that's thanks to school choice.
If we can get that done, 398 00:17:04,480 --> 00:17:06,200 Speaker 4: and this is something I hope we do, though. I think a 399 00:17:06,240 --> 00:17:08,280 Speaker 4: lot of rural Republicans are on our side now in Texas. 400 00:17:08,320 --> 00:17:10,840 Speaker 4: They get it, they get how bad it is. I 401 00:17:10,840 --> 00:17:12,760 Speaker 4: think a lot of people in the rural areas, you know, 402 00:17:12,760 --> 00:17:15,040 Speaker 4: they know the teachers, they love them, and they're confused. 403 00:17:15,040 --> 00:17:17,159 Speaker 4: They don't realize there's tons of schools, especially in our 404 00:17:17,200 --> 00:17:19,760 Speaker 4: cities in Texas, even here, that are brainwashing our kids, 405 00:17:19,800 --> 00:17:21,600 Speaker 4: and we desperately need to give parents the right to 406 00:17:21,600 --> 00:17:22,639 Speaker 4: get out of those schools. 407 00:17:22,960 --> 00:17:24,520 Speaker 2: I want to tell you about our friends over at 408 00:17:24,520 --> 00:17:27,040 Speaker 2: Patriot Mobile. If you are sick and tired of giving 409 00:17:27,160 --> 00:17:29,880 Speaker 2: your money to woke companies that literally hate 410 00:17:29,680 --> 00:17:31,720 Speaker 1: your values, hate your family values, hate 411 00:17:31,520 --> 00:17:33,840 Speaker 2: your faith, it is time for you to vote with 412 00:17:33,880 --> 00:17:37,160 Speaker 2: your dollars and switch to a company that stands by 413 00:17:37,200 --> 00:17:37,960 Speaker 2: what you believe in. 414 00:17:38,080 --> 00:17:39,240 Speaker 1: That is Patriot Mobile. 415 00:17:39,480 --> 00:17:41,280 Speaker 2: When I look down at my phone, I see the 416 00:17:41,280 --> 00:17:44,359 Speaker 2: word Patriot in the top left. Why? Because I switched 417 00:17:44,359 --> 00:17:46,640 Speaker 2: to Patriot Mobile. Now I get the same great service 418 00:17:46,640 --> 00:17:49,320 Speaker 2: that I had with big mobile.
But the biggest difference 419 00:17:49,400 --> 00:17:51,159 Speaker 2: is every time I make a call, every time I 420 00:17:51,200 --> 00:17:52,960 Speaker 2: send a text, and every time I pay my bill, 421 00:17:53,400 --> 00:17:55,720 Speaker 2: I know I'm standing with a company that's actually fighting 422 00:17:55,760 --> 00:18:00,399 Speaker 2: for my values. Patriot Mobile offers you dependable nationwide coverage, 423 00:18:00,880 --> 00:18:04,240 Speaker 2: giving you the ability to access all the major network towers, 424 00:18:04,240 --> 00:18:06,520 Speaker 2: which means you get the same coverage you've been accustomed 425 00:18:06,560 --> 00:18:09,760 Speaker 2: to without funding the left. And when you switch to 426 00:18:09,840 --> 00:18:12,399 Speaker 2: Patriot, you're sending a message, because five percent of your 427 00:18:12,400 --> 00:18:15,600 Speaker 2: bill every month is given back, at no charge to 428 00:18:15,720 --> 00:18:19,560 Speaker 2: you, to causes that you help choose to support. We're 429 00:18:19,560 --> 00:18:23,879 Speaker 2: talking about supporting free speech, religious freedom, the sanctity of life, 430 00:18:24,119 --> 00:18:28,040 Speaker 2: our Second Amendment, as well as supporting our military, our veterans, 431 00:18:28,320 --> 00:18:31,520 Speaker 2: our first responder heroes, and our wounded warriors. 432 00:18:32,080 --> 00:18:33,520 Speaker 4: How easy is it to switch? 433 00:18:33,720 --> 00:18:37,520 Speaker 2: Just go to Patriot Mobile dot com slash verdict, that's Patriot 434 00:18:37,520 --> 00:18:41,040 Speaker 2: Mobile dot com slash verdict, or call them nine seven 435 00:18:41,119 --> 00:18:44,320 Speaker 2: two Patriot. Make the switch and make a difference 436 00:18:44,400 --> 00:18:47,000 Speaker 2: with that bill every month. Free activation when you use 437 00:18:47,040 --> 00:18:50,520 Speaker 2: the offer code Verdict.
That's nine seven two Patriot, 438 00:18:50,840 --> 00:18:53,960 Speaker 2: nine seven two Patriot, or Patriot Mobile dot com slash verdict. 439 00:18:54,640 --> 00:18:57,760 Speaker 2: You seem to be doing something that is really cool 440 00:18:57,840 --> 00:19:00,639 Speaker 2: and fun. A lot of people do legacy late in life. 441 00:19:00,880 --> 00:19:04,359 Speaker 2: They think about their country, they think about their grandkids, 442 00:19:04,440 --> 00:19:06,960 Speaker 2: and then they kind of shift what they're doing. You 443 00:19:07,080 --> 00:19:09,680 Speaker 2: seem to be in the fight right now, and that includes 444 00:19:09,800 --> 00:19:13,400 Speaker 2: you working on legislation through a think tank. 445 00:19:13,440 --> 00:19:15,160 Speaker 1: You even had an op-ed that just came out 446 00:19:15,200 --> 00:19:15,920 Speaker 1: this last week. 447 00:19:16,000 --> 00:19:18,240 Speaker 2: Tell people about that aspect of what you're doing, because 448 00:19:18,240 --> 00:19:20,160 Speaker 2: you seem to be all in on the fight, again 449 00:19:20,160 --> 00:19:22,000 Speaker 2: at a young age, in your early forties. 450 00:19:22,320 --> 00:19:24,520 Speaker 4: Yeah, you know, I'm still building companies and running my fund, 451 00:19:24,560 --> 00:19:26,119 Speaker 4: so it'd be really nice just to be able to 452 00:19:26,200 --> 00:19:28,800 Speaker 4: wait until I was sixty or seventy. I think maybe 453 00:19:28,800 --> 00:19:31,000 Speaker 4: if this was twenty or thirty years ago, I might have 454 00:19:31,040 --> 00:19:32,879 Speaker 4: done that. It feels like this is a really critical 455 00:19:32,920 --> 00:19:34,520 Speaker 4: time for our country, and we can't just wait twenty 456 00:19:34,600 --> 00:19:35,440 Speaker 4: or thirty years. 457 00:19:36,040 --> 00:19:36,600 Speaker 5: Absolutely.
458 00:19:36,880 --> 00:19:39,360 Speaker 4: Yeah. So, given that we might 459 00:19:39,359 --> 00:19:40,720 Speaker 4: not have a country left if we don't all fight 460 00:19:40,760 --> 00:19:42,600 Speaker 4: for it right now, it's my job to do it. 461 00:19:42,680 --> 00:19:45,480 Speaker 4: And you know, at the Cicero Institute, we have teams in 462 00:19:45,480 --> 00:19:48,040 Speaker 4: twenty states. We're fighting for liberty and accountability and going 463 00:19:48,080 --> 00:19:50,680 Speaker 4: after all sorts of nonsense there. Yeah, the op-ed 464 00:19:50,840 --> 00:19:53,960 Speaker 4: was fun this week. I felt pretty strongly. I'm friends 465 00:19:53,960 --> 00:19:56,520 Speaker 4: with Elon Musk, and I'm watching what's happening to him and 466 00:19:56,560 --> 00:19:59,000 Speaker 4: the president with the weaponized courts. And you know, 467 00:19:59,040 --> 00:20:01,159 Speaker 4: one of the reasons our country is exceptional is we have, you know, 468 00:20:01,200 --> 00:20:03,600 Speaker 4: equality of justice under the law. And I'd be very 469 00:20:03,640 --> 00:20:06,040 Speaker 4: against Republicans weaponizing courts to attack the left, and I 470 00:20:06,080 --> 00:20:09,120 Speaker 4: was also very against the left weaponizing courts to attack them. 471 00:20:09,119 --> 00:20:10,560 Speaker 4: I think it was great Jeb Bush spoke up about 472 00:20:10,560 --> 00:20:10,920 Speaker 4: that as well. 473 00:20:11,000 --> 00:20:14,560 Speaker 3: Yeah, look, it is absolutely grotesque, the weaponization of our 474 00:20:14,720 --> 00:20:17,680 Speaker 3: justice system. We're seeing it against Donald Trump with four 475 00:20:17,800 --> 00:20:21,480 Speaker 3: indictments all over the country, with a ridiculous civil verdict 476 00:20:21,560 --> 00:20:24,399 Speaker 3: in New York, and we're seeing it against Elon Musk.
477 00:20:24,440 --> 00:20:28,520 Speaker 3: I will say, watching the Biden administration 478 00:20:28,680 --> 00:20:34,040 Speaker 3: weaponize every single federal agency against Elon Musk. And as 479 00:20:34,040 --> 00:20:39,280 Speaker 3: you know, Elon, until like twelve minutes ago, wasn't a Republican. 480 00:20:39,640 --> 00:20:43,440 Speaker 3: Elon had never voted Republican until just over two years ago. 481 00:20:43,800 --> 00:20:47,320 Speaker 3: Elon voted for Hillary Clinton and for Joe Biden, and 482 00:20:47,640 --> 00:20:50,000 Speaker 3: he said publicly the first Republican he ever voted for 483 00:20:50,160 --> 00:20:52,919 Speaker 3: was Mayra Flores here in Texas, just a couple of 484 00:20:53,000 --> 00:20:57,119 Speaker 3: years ago. And the fact that he dared speak out, 485 00:20:57,160 --> 00:21:01,000 Speaker 3: and especially the fact that he bought Twitter and has 486 00:21:01,040 --> 00:21:05,720 Speaker 3: allowed free speech, the left has decided he must 487 00:21:05,760 --> 00:21:10,879 Speaker 3: be destroyed. And even for someone with vast resources, having 488 00:21:10,960 --> 00:21:15,119 Speaker 3: the federal government come after you is a daunting proposition. 489 00:21:15,800 --> 00:21:19,439 Speaker 4: Yeah, it's really disgusting to watch, like, how open they are. 490 00:21:19,480 --> 00:21:21,679 Speaker 4: It feels like a third-world country, Ted. And you know, 491 00:21:21,680 --> 00:21:23,679 Speaker 4: I think if you step back a little bit, there's this 492 00:21:23,720 --> 00:21:26,840 Speaker 4: battle in our civilization for truth and justice, 493 00:21:26,840 --> 00:21:29,320 Speaker 4: and it's very clear truth and justice have been losing 494 00:21:29,320 --> 00:21:31,520 Speaker 4: a lot the last twenty, thirty years.
I think with buying 495 00:21:31,560 --> 00:21:34,240 Speaker 4: Twitter, now X, and hopefully with what we're trying to 496 00:21:34,280 --> 00:21:35,879 Speaker 4: do with some of these institutions, we can start to 497 00:21:36,080 --> 00:21:37,959 Speaker 4: turn it around. I still think we're losing a little bit, 498 00:21:37,960 --> 00:21:39,600 Speaker 4: but I think we're going to start turning things around. 499 00:21:39,600 --> 00:21:41,160 Speaker 4: And if more of us can get into the fight, 500 00:21:41,240 --> 00:21:42,720 Speaker 4: more people like Elon, I think we have a chance 501 00:21:42,760 --> 00:21:43,000 Speaker 4: to win. 502 00:21:43,119 --> 00:21:46,919 Speaker 3: So how about big tech? Is there hope for turning 503 00:21:46,960 --> 00:21:50,560 Speaker 3: big tech around? You mentioned five thousand 504 00:21:50,560 --> 00:21:53,160 Speaker 3: professors wanting to get out. That's a really encouraging stat 505 00:21:53,200 --> 00:21:56,439 Speaker 3: to me. Do you think there are likewise people in 506 00:21:56,520 --> 00:22:02,040 Speaker 3: big tech that are quietly wanting some semblance of sanity 507 00:22:02,160 --> 00:22:05,119 Speaker 3: but are afraid? And is there a way that they 508 00:22:05,160 --> 00:22:06,080 Speaker 3: can come out of hiding? 509 00:22:06,480 --> 00:22:08,560 Speaker 4: Yeah, there's definitely a lot of them. And I'll tell 510 00:22:08,560 --> 00:22:11,119 Speaker 4: you what.
The way that this works, thank goodness, is 511 00:22:11,119 --> 00:22:14,240 Speaker 4: there are market forces. And you know, Google would have 512 00:22:14,280 --> 00:22:16,560 Speaker 4: been way ahead of everyone else if they didn't have 513 00:22:16,600 --> 00:22:19,040 Speaker 4: a completely corrupt, I mean, it's a joke online, but 514 00:22:19,080 --> 00:22:20,560 Speaker 4: it's like, they put out these things you saw this 515 00:22:20,640 --> 00:22:22,159 Speaker 4: week where they couldn't even do a picture of a 516 00:22:22,160 --> 00:22:24,680 Speaker 4: white person. You'd ask it, show me a couple 517 00:22:24,680 --> 00:22:27,240 Speaker 4: from eighteen twenties America, and it's like 518 00:22:27,280 --> 00:22:29,880 Speaker 4: a black guy and a Japanese woman, and like, that's nice, 519 00:22:29,880 --> 00:22:30,480 Speaker 4: but it's probably not accurate. 520 00:22:30,560 --> 00:22:31,360 Speaker 5: I don't know if you've seen. 521 00:22:31,400 --> 00:22:34,560 Speaker 3: It's actually very complex game theory, but if you ask 522 00:22:34,640 --> 00:22:38,520 Speaker 3: Google to create a chessboard, it has only black pieces, 523 00:22:38,960 --> 00:22:40,520 Speaker 3: and it's very hard to know how to win or 524 00:22:40,600 --> 00:22:41,480 Speaker 3: lose at that point. 525 00:22:41,840 --> 00:22:43,560 Speaker 4: The funny part is they only got in trouble. 526 00:22:43,560 --> 00:22:44,760 Speaker 4: I want to make the point. But the funny part 527 00:22:44,800 --> 00:22:46,480 Speaker 4: is they only got in trouble for this because somebody 528 00:22:46,520 --> 00:22:48,959 Speaker 4: thought to say, show me a German Nazi soldier from 529 00:22:48,960 --> 00:22:51,320 Speaker 4: the nineteen thirties, and it showed this Han Chinese woman, 530 00:22:51,440 --> 00:22:52,879 Speaker 4: this Native American guy.
531 00:22:52,960 --> 00:22:56,119 Speaker 2: And that was too far. Amazing, 532 00:22:56,240 --> 00:22:57,440 Speaker 2: that was too far for the New York Times. 533 00:22:57,520 --> 00:22:59,840 Speaker 1: It's like, Okay, now we actually know what we need to fix. 534 00:22:59,640 --> 00:23:02,360 Speaker 3: Like, the Chinese Nazis, that was a real problem 535 00:23:03,720 --> 00:23:04,399 Speaker 3: in history class. 536 00:23:04,800 --> 00:23:05,679 Speaker 4: But we're getting sidetracked. 537 00:23:05,680 --> 00:23:07,359 Speaker 1: Did they actually cover that in the paper? I don't know 538 00:23:07,359 --> 00:23:08,320 Speaker 1: if you've read that one yet. 539 00:23:08,440 --> 00:23:11,000 Speaker 4: Here's the great thing about markets and about innovation: 540 00:23:11,240 --> 00:23:13,280 Speaker 4: when you start to focus so much on 541 00:23:13,359 --> 00:23:15,320 Speaker 4: nonsense that you start to lose, and you start to 542 00:23:15,320 --> 00:23:17,520 Speaker 4: not attract the best people, other people defeat you in 543 00:23:17,560 --> 00:23:20,159 Speaker 4: the market. And those new things, very often, you know, 544 00:23:20,240 --> 00:23:23,640 Speaker 4: if you look at fast-growing startups versus these tech monopolies, 545 00:23:23,800 --> 00:23:25,959 Speaker 4: the fast-growing startups are far less woke, because they 546 00:23:25,960 --> 00:23:27,680 Speaker 4: have to be focused on competence, and a lot of 547 00:23:27,680 --> 00:23:30,119 Speaker 4: people who are joining them are fleeing these crazy broken places. 548 00:23:30,119 --> 00:23:31,280 Speaker 4: So I do think it's going the right direction. 549 00:23:31,520 --> 00:23:34,760 Speaker 3: Let me ask a business question. You know tech better 550 00:23:34,840 --> 00:23:39,000 Speaker 3: than most people alive. Where are things going in terms 551 00:23:39,080 --> 00:23:42,480 Speaker 3: of innovation ten years from now?
What should we know 552 00:23:42,680 --> 00:23:46,080 Speaker 3: now that we don't know? And how will the world 553 00:23:46,200 --> 00:23:47,360 Speaker 3: be different in a decade? 554 00:23:47,560 --> 00:23:50,600 Speaker 4: Well, the really positive thing that's happening right now. And 555 00:23:50,880 --> 00:23:52,479 Speaker 4: I was never a huge crypto guy. I don't love 556 00:23:52,520 --> 00:23:54,639 Speaker 4: fiat currencies. I think there's a good use against, like, 557 00:23:54,720 --> 00:23:56,639 Speaker 4: you know, corrupt governments, but I was never that into 558 00:23:56,680 --> 00:24:01,280 Speaker 4: crypto. AI, to me, is actually very real. 559 00:24:01,080 --> 00:24:03,040 Speaker 4: The way to think about it, we can talk 560 00:24:03,040 --> 00:24:04,840 Speaker 4: about all sorts of complicated things, but the simple thing to 561 00:24:04,840 --> 00:24:07,760 Speaker 4: think about is, productivity is just really key in our economy. 562 00:24:07,800 --> 00:24:09,360 Speaker 4: The reason we have more wealth is we do more 563 00:24:09,400 --> 00:24:11,920 Speaker 4: with less. And there's all these industries in our economy 564 00:24:12,240 --> 00:24:15,680 Speaker 4: where this AI combined with operations can do things 565 00:24:15,760 --> 00:24:17,959 Speaker 4: much more affordably, much cheaper. And so if you look 566 00:24:18,000 --> 00:24:20,520 Speaker 4: at healthcare billing, for example, we spend probably 567 00:24:20,560 --> 00:24:22,879 Speaker 4: over a quarter trillion dollars a year on healthcare billing, and 568 00:24:22,920 --> 00:24:25,040 Speaker 4: you can probably cut that by a third over the 569 00:24:25,040 --> 00:24:26,920 Speaker 4: next five or six years. There's tons of areas like that. 570 00:24:27,000 --> 00:24:30,840 Speaker 3: So there are lots of Cassandras painting stories of impending 571 00:24:30,920 --> 00:24:34,960 Speaker 3: doom from AI. Is AI gonna destroy us all?
And 572 00:24:35,000 --> 00:24:37,639 Speaker 3: do you know what year does Skynet go online? 573 00:24:38,520 --> 00:24:40,240 Speaker 4: I do work a lot in defense, so I'm working 574 00:24:40,240 --> 00:24:43,800 Speaker 4: on it, as I said. But there's this: we can control all 575 00:24:43,800 --> 00:24:48,120 Speaker 4: of you now. No, listen, there's two 576 00:24:48,119 --> 00:24:51,800 Speaker 4: different conversations with AI. My master, thank you. Don't want to get 577 00:24:51,800 --> 00:24:52,360 Speaker 4: in trouble there. 578 00:24:52,600 --> 00:24:52,719 Speaker 2: Uh. 579 00:24:52,840 --> 00:24:56,960 Speaker 4: Once it's in charge. The, uh, there's two different 580 00:24:56,960 --> 00:24:59,800 Speaker 4: conversations with AI. One of them is productivity and wealth 581 00:24:59,800 --> 00:25:02,400 Speaker 4: creation, and it's actually extremely positive, and that's really good. 582 00:25:02,520 --> 00:25:04,439 Speaker 4: The other conversation with AI, it's very funny. A lot 583 00:25:04,440 --> 00:25:06,400 Speaker 4: of people in the tech world are not religious. They've 584 00:25:06,400 --> 00:25:09,159 Speaker 4: given up their religion, and so this is kind of 585 00:25:09,160 --> 00:25:11,600 Speaker 4: like a form of their religion, the Singularity, the AI taking 586 00:25:11,600 --> 00:25:13,600 Speaker 4: over the world. And it's very funny. It's 587 00:25:13,600 --> 00:25:16,080 Speaker 4: a very messianic vision. It's very much like the revelations in 588 00:25:16,720 --> 00:25:19,879 Speaker 4: Judaism and Christianity, where this thing comes and it changes 589 00:25:19,920 --> 00:25:21,800 Speaker 4: everything, and it's effectively a new god, because once it 590 00:25:21,800 --> 00:25:24,960 Speaker 4: improves itself it keeps getting better. And so it's 591 00:25:24,960 --> 00:25:27,480 Speaker 4: like a secular religion.
In Silicon Valley, people are obsessed 592 00:25:27,480 --> 00:25:29,679 Speaker 4: with it. They talk about the end of times with 593 00:25:29,720 --> 00:25:31,639 Speaker 4: it all the time. And it's funny, because America has 594 00:25:31,640 --> 00:25:33,720 Speaker 4: had a lot of other religious-like revival movements over 595 00:25:33,720 --> 00:25:35,800 Speaker 4: the last two hundred years where people were convinced that 596 00:25:35,800 --> 00:25:37,879 Speaker 4: the end times were coming very soon. This is quite a 597 00:25:37,880 --> 00:25:39,320 Speaker 4: weird one, based in Silicon Valley. 598 00:25:39,320 --> 00:25:41,119 Speaker 3: All right, so we're going to wrap up momentarily, but 599 00:25:41,160 --> 00:25:44,280 Speaker 3: I want to ask, since you are very engaged in policy, 600 00:25:45,040 --> 00:25:48,320 Speaker 3: a policy question Washington is wrestling with right now. So 601 00:25:48,359 --> 00:25:50,160 Speaker 3: as you know, I'm the ranking member on the Senate 602 00:25:50,240 --> 00:25:54,600 Speaker 3: Commerce Committee, and AI is squarely within our jurisdiction. In fact, 603 00:25:55,080 --> 00:25:58,040 Speaker 3: back in twenty fifteen, I chaired the first-ever congressional 604 00:25:58,040 --> 00:26:00,440 Speaker 3: hearing on AI and have been focused on it for 605 00:26:00,520 --> 00:26:01,199 Speaker 3: a long time now. 606 00:26:01,240 --> 00:26:01,960 Speaker 5: There are a lot of 607 00:26:01,960 --> 00:26:06,240 Speaker 3: voices in Washington, most notably Chuck Schumer, but also including 608 00:26:06,280 --> 00:26:10,119 Speaker 3: some Republicans, that are eager for a very heavy hand 609 00:26:10,119 --> 00:26:12,399 Speaker 3: of government when it comes to AI, and Schumer and 610 00:26:12,440 --> 00:26:19,760 Speaker 3: Democrats are proposing literally prior government approval before innovations in AI.
611 00:26:20,280 --> 00:26:24,080 Speaker 3: I've been very vocal in saying that it's catastrophically stupid, 612 00:26:24,240 --> 00:26:27,080 Speaker 3: and if we put government in the position of prior approval, 613 00:26:27,400 --> 00:26:30,879 Speaker 3: we will cede leadership of AI to our enemies, to 614 00:26:31,000 --> 00:26:34,520 Speaker 3: China and other countries, and we will kill American leadership. 615 00:26:34,680 --> 00:26:37,800 Speaker 3: I'm interested in your views, because this policy discussion, I've 616 00:26:37,800 --> 00:26:40,199 Speaker 3: got to tell you, a lot of big tech, the 617 00:26:40,240 --> 00:26:42,920 Speaker 3: Googles and Facebooks of the world, are saying yes, yes, 618 00:26:43,040 --> 00:26:46,320 Speaker 3: regulate us, because they believe they can capture the government 619 00:26:46,520 --> 00:26:48,800 Speaker 3: and use it to shut everyone down. What's your take 620 00:26:49,440 --> 00:26:52,360 Speaker 3: on how government should approach AI? Because this is as 621 00:26:52,400 --> 00:26:55,560 Speaker 3: hot as any question in Washington right now. 622 00:26:55,960 --> 00:26:58,720 Speaker 4: Well, you know, Senator, I one hundred percent agree with you. 623 00:26:58,840 --> 00:27:01,720 Speaker 4: I'm really glad you're taking that tack. As you 624 00:27:01,760 --> 00:27:04,000 Speaker 4: know, the big companies, a lot of them know they're losing 625 00:27:04,000 --> 00:27:05,280 Speaker 4: some of their best talent. They know it's going to 626 00:27:05,359 --> 00:27:07,119 Speaker 4: be hard to compete. But you know what they have? 627 00:27:07,200 --> 00:27:09,920 Speaker 4: Like, if I want to start a competitor, for example, 628 00:27:09,920 --> 00:27:11,640 Speaker 4: to BlackRock right now in New York, I'd spend 629 00:27:11,640 --> 00:27:13,399 Speaker 4: a hundred million dollars a year on lawyers even just to 630 00:27:13,440 --> 00:27:15,280 Speaker 4: do what they do.
They love the fact that there's 631 00:27:15,359 --> 00:27:18,199 Speaker 4: tons of rules and regulations. These big companies would love 632 00:27:18,240 --> 00:27:20,120 Speaker 4: to make it impossible to compete against them in AI. 633 00:27:20,280 --> 00:27:22,920 Speaker 4: So number one, one hundred percent, keep regulations as small 634 00:27:22,960 --> 00:27:26,000 Speaker 4: as possible. Now, the thing I will give them, and 635 00:27:26,040 --> 00:27:27,200 Speaker 4: we have to be very careful, because this is not 636 00:27:27,240 --> 00:27:28,960 Speaker 4: why they're doing it, the thing I will give them 637 00:27:29,000 --> 00:27:31,360 Speaker 4: is there probably are ways that some people could figure 638 00:27:31,359 --> 00:27:34,000 Speaker 4: out how to use AI in bioterror and other areas, 639 00:27:34,000 --> 00:27:35,240 Speaker 4: and so we have to watch it. We have to 640 00:27:35,280 --> 00:27:37,120 Speaker 4: be very careful, we have to see as it goes along. 641 00:27:37,119 --> 00:27:39,040 Speaker 4: But let's not give them the ability to make the 642 00:27:39,040 --> 00:27:40,159 Speaker 4: whole thing crony and break it. 643 00:27:40,359 --> 00:27:43,160 Speaker 3: Well, and look, there is no doubt there will need 644 00:27:43,200 --> 00:27:45,920 Speaker 3: to be regulations applied to AI, like to any other industry, 645 00:27:45,960 --> 00:27:48,320 Speaker 3: and many of our existing laws can apply. Are 646 00:27:48,359 --> 00:27:50,480 Speaker 3: there risks of fraud? Are there risks of deception? 647 00:27:51,000 --> 00:27:51,200 Speaker 5: Yes. 648 00:27:51,320 --> 00:27:54,399 Speaker 3: You see things like Taylor Swift had the 649 00:27:54,440 --> 00:27:57,280 Speaker 3: AI fake porn put out, and because she was 650 00:27:57,480 --> 00:28:00,240 Speaker 3: Taylor Swift and had such prominence, she was able 651 00:28:00,240 --> 00:28:03,040 Speaker 3: to get it pulled down. Well, what happens if that's
652 00:28:02,880 --> 00:28:05,639 Speaker 4: your kids? Yeah. 653 00:28:05,800 --> 00:28:06,800 Speaker 5: And nobody would watch that. 654 00:28:06,800 --> 00:28:09,840 Speaker 3: That's all right, the market forces would take care 655 00:28:09,880 --> 00:28:10,840 Speaker 3: of that all on its own. 656 00:28:12,119 --> 00:28:15,160 Speaker 1: I was so ready to get in there, so ready. 657 00:28:15,200 --> 00:28:17,159 Speaker 1: That was my moment, and you knew it, and you 658 00:28:17,320 --> 00:28:25,359 Speaker 1: jumped in beforehand. I'm okay, keep going, folks, go ahead. 659 00:28:25,440 --> 00:28:28,159 Speaker 3: But there's no doubt there is going to be a need 660 00:28:28,440 --> 00:28:33,600 Speaker 3: to apply laws and rules, whether fraud, whether deception. The 661 00:28:33,680 --> 00:28:36,399 Speaker 3: legal system will have to be applied. But I 662 00:28:36,400 --> 00:28:40,120 Speaker 3: think we should move slowly and understand what we're doing, 663 00:28:40,160 --> 00:28:45,440 Speaker 3: because the productivity benefits potentially are massive. And 664 00:28:45,880 --> 00:28:48,000 Speaker 3: I will say, you know, you talked a minute 665 00:28:48,000 --> 00:28:52,480 Speaker 3: ago about how the big tech companies want barriers 666 00:28:52,480 --> 00:28:55,200 Speaker 3: to entry, and that is one of the most common of 667 00:28:55,240 --> 00:28:59,800 Speaker 3: the great lies of politics. It is the idea that 668 00:28:59,800 --> 00:29:03,960 Speaker 3: conservatives are pro big business. The reality is 669 00:29:04,040 --> 00:29:09,120 Speaker 3: big business loves big government. Big business usually gets in 670 00:29:09,200 --> 00:29:13,560 Speaker 3: bed with big government, and big business loves when government 671 00:29:13,600 --> 00:29:17,920 Speaker 3: puts up barriers to entry to stop the next generation of entrepreneurs.
672 00:29:17,920 --> 00:29:21,000 Speaker 3: And I'll say this, look, I have nothing 673 00:29:21,040 --> 00:29:24,160 Speaker 3: against big business, but I am interested in the little guys, 674 00:29:24,200 --> 00:29:28,480 Speaker 3: the next group of entrepreneurs, what the economist Joseph Schumpeter 675 00:29:28,560 --> 00:29:32,240 Speaker 3: called creative destruction. And one of my favorite images on 676 00:29:32,280 --> 00:29:35,800 Speaker 3: the Internet is a picture of the founders of Microsoft 677 00:29:36,080 --> 00:29:39,000 Speaker 3: in nineteen seventy-eight. And you have Paul Allen with 678 00:29:39,080 --> 00:29:40,640 Speaker 3: long hair and a beard, and he looks like one 679 00:29:40,640 --> 00:29:43,760 Speaker 3: of the Bee Gees. You've got Bill Gates with glasses the 680 00:29:43,800 --> 00:29:48,680 Speaker 3: size of hubcaps. It's just that picture 681 00:29:49,200 --> 00:29:51,840 Speaker 3: of a bunch of college dropouts, and it just asks, 682 00:29:52,400 --> 00:29:55,680 Speaker 3: would you invest money with these guys? And 683 00:29:55,720 --> 00:29:58,880 Speaker 3: they were taking on IBM, Big Blue, the giant behemoth, 684 00:29:59,160 --> 00:30:00,760 Speaker 3: and they were the creative destruction. 685 00:30:00,960 --> 00:30:02,040 Speaker 5: Now they're the giant. 686 00:30:02,240 --> 00:30:05,040 Speaker 3: And I will say, let's do this to wrap up: 687 00:30:05,200 --> 00:30:08,880 Speaker 3: talk about the importance of disruptors, of the innovation of 688 00:30:08,920 --> 00:30:13,680 Speaker 3: the next generation, driving tech, driving productivity, driving our country. 689 00:30:13,760 --> 00:30:15,840 Speaker 4: I mean, this is one hundred percent how America works, 690 00:30:15,840 --> 00:30:17,800 Speaker 4: as you say.
And by the way, it's our biggest 691 00:30:17,840 --> 00:30:20,680 Speaker 4: advantage against our adversary China right now, 692 00:30:20,880 --> 00:30:23,320 Speaker 4: the CCP, aside from just having killed a bunch of 693 00:30:23,320 --> 00:30:25,840 Speaker 4: our billionaire Chinese tech friends, so everyone's terrified to build 694 00:30:25,880 --> 00:30:27,880 Speaker 4: more tech if you're already successful in China, the other 695 00:30:27,960 --> 00:30:29,480 Speaker 4: thing they have going against 696 00:30:29,200 --> 00:30:30,840 Speaker 5: them. Hold on, say that again. 697 00:30:31,080 --> 00:30:32,960 Speaker 4: A lot of our tech friends died 698 00:30:33,040 --> 00:30:35,080 Speaker 4: or fled in the last five years 699 00:30:35,080 --> 00:30:36,640 Speaker 4: out of China, and a lot of them were 700 00:30:36,640 --> 00:30:38,280 Speaker 4: taken away and disappeared and then came back and they 701 00:30:38,320 --> 00:30:39,360 Speaker 4: won't talk about it anymore. 702 00:30:39,400 --> 00:30:42,480 Speaker 3: So do we know names of people who were killed? 703 00:30:42,280 --> 00:30:43,480 Speaker 5: I don't. 704 00:30:43,520 --> 00:30:45,520 Speaker 4: I'll give you a friend. Andy Tan ran an 705 00:30:45,560 --> 00:30:48,240 Speaker 4: Asian innovations group, forty seven years old, about to go 706 00:30:48,280 --> 00:30:50,880 Speaker 4: public last year after working hard for eleven years, and 707 00:30:50,920 --> 00:30:53,000 Speaker 4: they told him they wanted to do things differently with 708 00:30:53,040 --> 00:30:55,040 Speaker 4: the data in China. He said, I'm going 709 00:30:55,120 --> 00:30:57,480 Speaker 4: to go talk to them in Beijing. Next I heard he 710 00:30:57,520 --> 00:30:59,320 Speaker 4: died in his sleep that night at forty seven years old. 711 00:30:59,360 --> 00:30:59,640 Speaker 5: Wow.
712 00:31:00,040 --> 00:31:01,680 Speaker 4: And there's a lot of stories like this. There's a 713 00:31:01,720 --> 00:31:03,360 Speaker 4: lot of guys who built a lot of it, who 714 00:31:03,400 --> 00:31:05,720 Speaker 4: fled and who are very skittish discussing things. But I'll 715 00:31:05,720 --> 00:31:07,640 Speaker 4: tell you the other big advantage we have though against them, 716 00:31:07,680 --> 00:31:11,479 Speaker 4: other than them screwing that up, is basically all this 717 00:31:11,600 --> 00:31:14,320 Speaker 4: productivity coming from AI. It's going to disrupt healthcare. It's 718 00:31:14,320 --> 00:31:16,000 Speaker 4: gonna change how healthcare works. It's going to change 719 00:31:16,040 --> 00:31:18,360 Speaker 4: how logistics work. It's gonna change how all these industries work. In China, 720 00:31:18,520 --> 00:31:21,000 Speaker 4: the government people and their cronies own those industries. 721 00:31:21,120 --> 00:31:22,680 Speaker 4: They are not going to allow those to be disrupted. 722 00:31:22,720 --> 00:31:25,440 Speaker 4: The question is, in America, are we still able 723 00:31:25,480 --> 00:31:27,160 Speaker 4: to disrupt things? Are we still going to be allowed 724 00:31:27,160 --> 00:31:29,120 Speaker 4: by our government to go in and change how those 725 00:31:29,160 --> 00:31:30,600 Speaker 4: things work? And it's going to be contested, because we 726 00:31:30,640 --> 00:31:33,000 Speaker 4: have regulatory agencies that also want to slow it down, 727 00:31:33,000 --> 00:31:35,240 Speaker 4: along with the big companies. But I still believe in America, 728 00:31:35,280 --> 00:31:37,600 Speaker 4: with the right leadership, we actually can disrupt these things 729 00:31:37,600 --> 00:31:38,360 Speaker 4: and we can grow. 730 00:31:38,360 --> 00:31:41,000 Speaker 3: Well, look, when AI replaces this podcast, I hope that the 731 00:31:41,040 --> 00:31:43,480 Speaker 3: computer that takes my place does a really fine job.
732 00:31:44,040 --> 00:31:45,720 Speaker 2: I want to talk to you about how you start 733 00:31:45,760 --> 00:31:48,040 Speaker 2: your morning off. If you're like me and you're a 734 00:31:48,080 --> 00:31:50,520 Speaker 2: coffee drinker, I get up early, I get on the 735 00:31:50,600 --> 00:31:53,040 Speaker 2: radio at seven am, and I have got to have 736 00:31:53,600 --> 00:31:55,920 Speaker 2: not just a cup of coffee, a really good cup 737 00:31:55,960 --> 00:31:58,680 Speaker 2: of coffee. And I have a twenty twenty four New 738 00:31:58,760 --> 00:32:01,600 Speaker 2: Year's resolution. I am not giving my money to woke 739 00:32:01,720 --> 00:32:05,320 Speaker 2: coffee companies. That is something I have gotten rid of. 740 00:32:05,920 --> 00:32:08,240 Speaker 2: Blackout Coffee is the coffee. 741 00:32:07,880 --> 00:32:09,680 Speaker 1: That I drink. It is amazing. 742 00:32:10,360 --> 00:32:15,800 Speaker 2: This is one hundred percent America and zero percent woke coffee. 743 00:32:16,160 --> 00:32:19,680 Speaker 2: Blackout Coffee is one hundred percent committed to conservative values 744 00:32:19,720 --> 00:32:23,760 Speaker 2: as a company. From sourcing their beans to the roasting process, 745 00:32:24,240 --> 00:32:29,480 Speaker 2: customer support and shipping, they embody true American values. And 746 00:32:29,520 --> 00:32:34,280 Speaker 2: they accept no compromise on premium taste and premium quality. 747 00:32:34,680 --> 00:32:38,400 Speaker 2: If you want a great cup of coffee, not good, 748 00:32:38,840 --> 00:32:42,080 Speaker 2: not kind of good, not pretty good, but amazing, you 749 00:32:42,160 --> 00:32:46,160 Speaker 2: need to go to Blackoutcoffee dot com slash verdict. 750 00:32:46,160 --> 00:32:47,200 Speaker 1: Now here's the cool part. 751 00:32:47,760 --> 00:32:51,320 Speaker 2: Use the promo code verdict for twenty percent off your 752 00:32:51,440 --> 00:32:52,440 Speaker 2: first order. 
753 00:32:52,920 --> 00:32:53,840 Speaker 5: So try it. 754 00:32:53,920 --> 00:32:56,000 Speaker 2: You're going to be hooked like I am, and you'll 755 00:32:56,080 --> 00:32:59,800 Speaker 2: never go back to those other woke brands. Blackoutcoffee dot com 756 00:33:00,520 --> 00:33:05,480 Speaker 2: slash verdict. Be awake, not woke. That's Blackoutcoffee dot com 757 00:33:05,480 --> 00:33:09,640 Speaker 2: slash verdict, promo code verdict for twenty percent off your 758 00:33:09,760 --> 00:33:13,920 Speaker 2: first order. Finally, yeah, final question for you, and 759 00:33:13,920 --> 00:33:15,720 Speaker 2: I want to go back to the university because there's 760 00:33:15,720 --> 00:33:17,080 Speaker 2: gonna be a lot of kids that listen to this, 761 00:33:17,240 --> 00:33:20,520 Speaker 2: a lot of parents, grandparents, and maybe even 762 00:33:20,520 --> 00:33:23,600 Speaker 2: professors that may want to reach out. 763 00:33:23,800 --> 00:33:25,400 Speaker 1: What does next year's class look like? 764 00:33:25,480 --> 00:33:27,160 Speaker 2: Is there a cap on that? If someone says, I 765 00:33:27,240 --> 00:33:29,800 Speaker 2: want more information, if there's a professor that's listening to 766 00:33:29,800 --> 00:33:32,520 Speaker 2: this and says, hey, I want to leave this great 767 00:33:32,520 --> 00:33:35,600 Speaker 2: institution that I'm at because I'm being stifled or silenced, 768 00:33:35,840 --> 00:33:37,400 Speaker 4: I want to talk to you. How can they do that? 769 00:33:38,400 --> 00:33:40,600 Speaker 4: So we're building our first class right now. This is 770 00:33:40,640 --> 00:33:42,360 Speaker 4: just as competitive to get into as the other top 771 00:33:42,360 --> 00:33:44,360 Speaker 4: ten schools.
But if you have a really bright young 772 00:33:44,640 --> 00:33:48,480 Speaker 4: student who's a founding personality, an entrepreneur personality, it's pretty 773 00:33:48,520 --> 00:33:49,800 Speaker 4: much one of the coolest places you can go. We 774 00:33:49,840 --> 00:33:51,880 Speaker 4: have one hundred of my top tech friends who put 775 00:33:51,880 --> 00:33:53,440 Speaker 4: their names on it and are advising it. We have all these 776 00:33:53,440 --> 00:33:55,800 Speaker 4: top academics. It's going to be very competitive to get in, 777 00:33:55,920 --> 00:33:57,960 Speaker 4: but yes, please, please, please apply. You can go to 778 00:33:58,040 --> 00:34:00,600 Speaker 4: UAustin dot org, the University of Austin, 779 00:34:00,960 --> 00:34:03,400 Speaker 4: and, uh, professors, they're welcome to email, obviously. 780 00:34:03,440 --> 00:34:05,160 Speaker 4: If they're amazing, we'd love to talk, and we 781 00:34:05,200 --> 00:34:07,000 Speaker 4: have a pretty big line of professors trying to get 782 00:34:07,040 --> 00:34:09,080 Speaker 4: in right now. But obviously we're very, very 783 00:34:09,080 --> 00:34:10,160 Speaker 4: interested in meeting great people. 784 00:34:10,320 --> 00:34:12,160 Speaker 2: Thank you for coming on the podcast. Thanks for having 785 00:34:12,239 --> 00:34:13,640 Speaker 2: us here. Thank you to everybody that's here in the 786 00:34:13,640 --> 00:34:16,480 Speaker 2: audience as well. And don't forget, we do the show Monday, 787 00:34:16,520 --> 00:34:20,319 Speaker 2: Wednesday, Friday. Hit that subscribe auto download button, and don't 788 00:34:20,360 --> 00:34:23,000 Speaker 2: forget the Saturday week in review, anything you may have 789 00:34:22,960 --> 00:34:24,160 Speaker 2: missed during the week, and the Senator and I will 790 00:34:24,200 --> 00:34:25,680 Speaker 2: see you back here in a couple of days.