Speaker 1: You're listening to The Buck Sexton Show podcast. Make sure you subscribe to the podcast on the iHeartRadio app or wherever you get your podcasts.

Speaker 2: Hey, everybody, welcome to The Buck Sexton Show. We have Matt Kim with us. Internet guy, or I should say famous internet personality with a huge following, from Instagram to YouTube to all the other places. So, my friend, how did you become Matt Kim? The audience won't be familiar with you because I've never had you on before. Tell me about all things Matt Kim.

Speaker 3: So, I'm just a guy from the corner of Georgia. I decided one day that I wanted to start talking to a camera, I think just out of frustration with how the world is going. I had recently had a daughter at the time, a little bit over a year and a half ago, and it was just one of those things where you just start talking, and I don't know what happened. People just started listening. It happened really fast. I've only been doing this for maybe a little bit over a year.

Speaker 2: Wow. Yeah, you've done very well, sir. So I know a little bit of your background from watching some of your content, which is how I had you here. Your parents are South Korean, right, came to this country, you know, kind of the immigrant American dream story, and now you're trying to save the country.

Speaker 1: What's the first thing, I mean?

Speaker 2: Right now, as you and I are talking, it's kind of funny, right, because I was looking forward to chatting with you, because we don't have to just dive into the Trump trial. As you and I talk, we're still waiting for a verdict in the Trump trial. The jury has been excused. It is Thursday of the first week of deliberations.
I think that everything that can be said about the Trump trial by the people who care about it has been said, and so I didn't really want to get into that so much as what ails us as a country right now. Like, what does Matt Kim want to fix? How are you going to fix America?

Speaker 3: I think the biggest issue that people have right now with the current state of the country is trust: trust in the establishment, trust in government, trust in institutions. I think this is what people really feel the most, because we all have different issues that we care about most, but the reality is that we feel duped and we feel lied to by the institutions that we've trusted over time, not only universities, but government and healthcare, our neighbors, our communities, our city governments. People are just really fed up. People feel like they've been lied to constantly. And I think that's the biggest thing for most people.

Speaker 2: How much of it is institutional capture by a certain worldview? I think it's even more than just an ideology. It's the way they view everything; it's more than a political ideology, right, it's the way they view existence. But also, how do you factor in this: we're frustrated, right, we're frustrated about what's going on in the country, but we keep putting power in the hands of people who we then say are dishonest, they're morons. Or, you know, is it a reflection of the people in charge making bad decisions? That's one thing. But is it a reflection of bad decisions that we're making as a society by putting them in place, you know what I mean?

Speaker 3: I definitely think so.
I've had a lot of conversations recently with people that I would consider patriots, people that genuinely care about the state of the country, people that I feel would be extremely competent if they were to get involved in government, not the incompetent, power-hungry people that are in government now, but people that I feel would genuinely do a good job. And everyone says the same thing: I would love to, but I've kind of got a lot of things going on right now. We care just enough to talk about it, we care just enough to maybe get angry about it. But when it takes time to be selfless and to go out there and do something for others, and sacrifice maybe a little bit of your own livelihood, maybe sacrifice a little bit of your own privacy, sacrifice a little bit of your own way of life, people don't really want to do that. And I think that's really what it is, and that's on us.

Speaker 1: And what do you think?

Speaker 2: What do you think about those who would say it's because we live in a society of so much abundance, wealth, just prosperity, relative safety? I know people are freaked out about crime, and it's a real problem. But you know, if you were to compare what our lives are like to what lives have been like in even recent generations, but certainly stretching back for centuries, you know, are we just getting soft and therefore creating problems? It's like, you know, hedonic adaptation: whatever good happens to you, you just get used to it, and then you need something else good. Are we in a situation where we have to get upset about little things because the big things are all covered? Like, how do you view that?

Speaker 3: I think it's social media. I think it's social media. I think it's movies. I think it's entertainment. I'm gonna say twenty, thirty, forty years ago, we didn't know that we didn't have certain things.
We didn't get it shoved in our faces that you should buy a really nice car all the time, that you need to buy these clothes, that you need to consume this technology. Ignorance is bliss, and you didn't realize how people outside of your small neighborhood were living. And we treated movie stars and people on the big screen almost as gods, and you had a separation from them, so you didn't feel that you could be them. Therefore, you didn't really ever chase after it or want to be them. You just kind of admired them from afar. Now it's so accessible and so in your face that everyone feels like, if I'm not doing what they're doing, I must be doing something wrong, and people lose sight of the simpler things that really matter in life, like family.

Speaker 2: Well, I think you definitely see this. I mean, you mentioned social media, and the way that, first of all, when people take a step back, they see that social media, and I'll admit this for me too, right, social media is not a reflection of what your life is really like. No one is putting up photos where, for the most part, they look ugly, or at least they think they look ugly, or where they're depressed, or where they're having a bad day. In general, people are trying to present essentially an online brand, and I think we're entering this whole new world, you know. I think The Matrix is one of the best sci-fi and one of the best action movies of all time, and it came out at this really interesting moment where the Internet was still this very powerful but new phenomenon. I do feel like, we're not connected into the Matrix, like there's weird machines that are running our lives, but everybody is an online brand now, right? I mean, if you're looking to work in any field, that's the first thing that a lot of people who are going to employ you do: they're going to check you out online.
If you're dating, if you're out there... you're a married guy who said you've got a baby, right? But you know, if you're dating out there... I used to be single. I'm now married, Matt, and uh, so you know, yay, I joined...

Speaker 1: Joined the club.

Speaker 2: I'm very, very happy to have joined the married club. But girls would tell me that they would, you know, scope me out online.

Speaker 1: Right?

Speaker 2: And what you realize is there's a lot of stuff about you. Do you see what I'm saying? Like, on the one hand, we want to avoid this stuff. On the other hand, we're drawn into it. There's the superficial "oh, I just want clicks and likes," and then there's also the question: if you're not on the internet, are you able to compete in any field?

Speaker 3: I mean, first off, The Matrix wasn't a movie, it was a documentary. True. I say that all the time. And yeah, we live in a clicks-and-views economy. That is your social credibility; that is what people look at. I remember I was around some politicians, and every once in a while I'll end up somehow around different types of politicians and people within Congress, and I'll be talking to them, or I'll introduce myself, or maybe their chief of staff will bring me over, or bring them over, and be like, oh yeah, we want you to meet Matt Kim. And they're like, oh yeah, nice to meet you, and they kind of walk off, no big deal, not that I'm upset about it. And then the chief of staff leans back in and starts whispering in their ear, that's, you know, he's got so much... or something, and they come back: Oh, Matt Kim, we love Matt Kim. It's so great to see you. Like, oh, they're so disingenuous.
But the reality is that they look at the numbers, they look at the metrics, and that's how they decide your worth now, because if they can talk to one person and you have the ability to get to one hundred people or a thousand people or one hundred thousand people, they don't have to waste their time trying to talk to one hundred thousand people. What do you think the idea of the matrix is? Trying to put people into data sets, categories. And a population is just so much easier to control if you can categorize them.

Speaker 2: What do you think is, if you're doing it the way you want to do it, the central premise of the content that you're putting online? What do you want to accomplish for the people that are the consumers of it? I'm always curious about this when I talk to people who do what we do.

Speaker 3: I think, to what you mentioned, it almost feels like we're all plugged directly into the matrix. Maybe not physically, but maybe metaphysically, maybe psychologically, whatever it is, but we all feel like we're plugged in, and we're fed a certain narrative where they're removing our ability to think for ourselves, to question narratives, and to just comply. And I think that's really what I'm trying to do: pick different topics and give an alternative, natural, logical solution or take on that topic. And sometimes all it takes is one moment to realize, huh, they've been lying to me about this, or this is not what they made it seem to be. And once you see that crack in the matrix one time, then the second time is easier, and the third time is easier, and the fourth time is easier, and then you realize it's everything around you. And I think if people had the ability to think for themselves and the ability to look past all the propaganda, I think we'd be a better society, because we'd be less controllable and we'd have more freedom.

Speaker 1: Yeah.
Speaker 2: I think, you mentioned earlier on, we just got started talking about institutions, and people on the right will refer a lot to institutional capture. And I have to say, when people now say that they are skeptical of a whole range of institutions, I can't argue with that. Like, I can weigh in and say, well, I think that's maybe going a little too far, or, no, not everyone there is bad. But even as you say that, I think it feels a little mealy-mouthed. It's like, oh, well, you know, the CDC. I mean, I don't know if you know this about me. I don't think anyone hates what Fauci stood for during COVID more than me. And the whole time, I just saw exactly who that guy was, what he was all about, and I think that he's one of the most destructive, and it comes out of cowardice, and it comes out of lies. Right, so much that we see that's evil, that's bad, that's tyrannical, is really the outgrowth of lies. It's just people embracing untruth or non-truth in some way. Fauci was central to that. And so I'll say, oh, well, the CDC is not all bad. You know, they do some nice, I don't know, health education stuff out there or something, but what they're supposed to really do, where it matters, they are really bad. And that's true of a lot of institutions now.

I think people feel that way about the elite universities. I mean, Matt, did you see this stat? And I'll just describe it in case you didn't see it: the prevalence of a pro-Palestinian or pro-Hamas, however you want to slice that, encampment and the cost of the school was directly correlated. Basically, if you go to a community college, a state school, you know, a not "elite," in quotes, college or university, one that's not very expensive, you didn't do that stuff. But at the Harvards and the Columbias, et cetera?
Encampments at all of them, you know what I'm saying? Like those schools, the schools that are supposed to be at the very forefront, that are supposed to be educating the best, are brainwashing kids and telling them crazy things.

Speaker 3: I think a lot of it has to do with pride. I think we are so proud as a population, so proud of how we want people to perceive us, that it's hard to look past our own selves. Because if you've dedicated your life to being in a certain field and you've done your research, or if you were a doctor or researcher and you took a stance on COVID at that time, and you find out that maybe it wasn't true, it's really hard to go back, because then it would ruin your credibility, it would possibly ruin your life's work. No one wants to be wrong. No one wants to be perceived as a person that was wrong. No one wants to apologize if they messed it up. I think that's the hardest thing, especially about COVID. We know for a fact that people in charge, people in higher levels of government, got it wrong. There's no denying that. Everyone knows that to be true at this point. When are they ever going to come out and be like, look, kind of messed that one up, I'm sorry? Yeah, I think a lot of people just want to hear that.

Speaker 2: Well, you know, this brings me to one of the biggest challenges that I see in America today, and this affects the media, it affects the institutions, it affects a lot of the things we've already started to speak about here: they have decided, and this is more open than I can ever remember it being when I was younger, that being wrong and telling lies is always okay and even admirable, as long as it is in the service of the collective good or the collective team.
Journalism for sure did this during Trump, and they were pretty explicit about it, right? Because they knew they could no longer sell themselves, like the Washington Post can't sell "we're just nonpartisan, we're just objective." So what they tried to do, and this was an open discussion, if you remember, was say, well, we're anti-Trump because the truth is anti-Trump, and therefore by telling the truth, we're inherently anti-Trump. So we have no choice. Do you see what I mean? Like, there's this twisting of reality and objectives and mission statement that has occurred, I think, across the spectrum in America.

Speaker 3: Yeah, the normalization of these logical gymnastics that we go through. It's mind-boggling, because we take things that are obviously not true and we're able to justify them for a greater good that maybe only one side believes in. And that's why we're losing objectivity. We're losing truth, and people can't agree, because we're not even talking from the same premise any longer.

Speaker 2: You know, one thing I will say also is a lot of people will hear this later in the week. There may, I don't think there's gonna be a verdict tomorrow, but I don't know, but there's the possibility, right, on the Trump thing. And like I said, I've already said a million things about Trump and the Trump trial. And when we were setting up to talk today, I was like, I don't want to. Yes, we know that it's a sham, thirty-four counts is outrageous, that it's not really a crime.

Speaker 1: They delayed it. We could go through all this. Put that aside.
Speaker 2: I feel like one part of it that is still hard to imagine is that, whether he's found guilty, or it's a hung jury, or whatever it is, people are just so saturated now with the craziness of politics that I don't know that there is some other level it goes to, do you know what I mean? We're so attuned to being told, oh my gosh, it's the end of the country or it's the end of democracy, that even if you don't believe it, I feel like it recalibrates. So now it's like, well, I'm already told the country's gonna end if Trump wins.

Speaker 1: So what am I supposed to freak out about, you know what I mean?

Speaker 2: There's no additional level for the outrage to go to, which, kind of, we've reached, like, maximum, you know, terminal-velocity outrage.

Speaker 3: And sometimes I feel like that may even be the plan: get people so saturated and so jaded with the political process. Because typically we'd be starting presidential election season around now, but we started way early because primaries really didn't happen, and people are fed up with the Trump trials. Most people aren't even following anymore. You kind of just flip through the news hoping to get an update. People aren't following as closely as they could be. People aren't following the other trials that just seem to never go anywhere. I think the point is to get people so jaded and so fed up with the political process that come election time in November, people are like, uh, I don't really care that much anymore. I'm just gonna sit this one out. And I think that may even be the plan, just to keep people so done with the process. And it's so ridiculous that maybe they just don't want people to turn out, because if a lot of people turn out, I think the more people that turn out, the higher chance Trump would have of winning. Yeah, that's how it is.
Speaker 2: We'll come back to this in a second with my friend Matt Kim here, but just a word from our sponsor, the Oxford Gold Group, for a second.

Speaker 1: I've got gold here at home. I'm a believer in gold.

Speaker 2: If you look at what gold has been doing, not just over the last few months, but over the last few decades, gold is a store of value. And at a time when inflation is just eating away at your savings, and there's the possibility of real central bank disruptions ahead, you want to have a hedge against inflation. You want to have tangible currency on hand. Nobody can predict the future, but you can prepare for it. The Oxford Gold Group are pros at this. They're who I go to for all my gold and silver, and they make it simple and easy to understand. Just give them a call, the Oxford Gold Group. You may qualify for up to ten thousand dollars in free precious metals. The number is eight three three, nine nine five, gold. That's eight three three, nine nine five, G-O-L-D.

Matt, what do you think of Elon?

Speaker 3: Hmm. I'm on the fence with him.

Speaker 1: Really? Tell me more.

Speaker 2: I'm surprised, but I want to hear more before I say any more. What do you think? Why?

Speaker 3: So, I think, I don't know what his intention is. I assume that his intentions are pure. I don't like the idea that there is one person that is in charge and leading in vehicles, in transportation, in space and satellites and, thinking about it, underground tunnels and AI and robots and your communication and your data. He's in charge, and on top of all of it, if he's a good guy, maybe it's a good thing.

Speaker 1: However, are you just suspicious then? Just too much power in the hands of one man? Is that the baseline?
Speaker 3: I don't like too much power in the hands of one person, because it creates a god complex. And all it takes, because again, we talked about it earlier, pride, all it takes is one incident, one person to put someone in the wrong direction, one girl to really mess him up. That's how supervillains are made, and all of a sudden the good guy becomes a bad guy and he's in charge of everything. I don't like that. I like distribution of power.

Speaker 2: No, I see. My problem is that, and particularly again with the pandemic, which was in many ways a period of political radicalization... I was already right-wing, but the pandemic made me really fearful about the future of the country if the tyrannical leftists were able to get what they want. And I just remember the social media companies. What we've been through as a society has never happened before, that you have so much power concentrated in, really, what is a fringe ideology. When you think of Silicon Valley far-left-wing Democrats, it's like the most far-left-wing part of the entire country, by party registration, by belief. I mean, just go spend some time there. And they created, or they were able to build, such dominance online: search, social media, you know, platforms, I mean everything, right? I mean, how you buy things, how you share information, the servers that we have. And it felt like, we're talking a lot about The Matrix, so back to our Matrix conversation, it felt to me like, you know, Agent Smith just decided they were going to shut down the right for a while there.
So I have to be... I'm very, very hopeful about what Elon's trying to do, because to me, if you had the continuation of what we had, where you had true zealots, I mean emotionally unstable zealots, in charge of trust and safety at Twitter and in these places, as we've seen, I don't know if we really live in a free society, right? I mean, if someone can control ninety-five percent of the information flow in one direction online, do we live in a free society? And you might be able to write whatever you want on a little piece of paper and put it in front of your house, right? But that's not gonna do very much.

Speaker 3: And I get that. And I think what he's done for Twitter, or X, so far has been great. I do think that, I do know that for sure. I'm trying to be careful, because, you know, everyone loves Elon.

Speaker 1: I thought you were gonna like Elon, so I was really interested.

Speaker 3: I kind of like Elon. I was more bullish on him earlier. I don't like the fact that X has a social credit system built in, and I'm very adamantly against social credit systems, but there's one inherently built in. So for example, if I put up a post or video on TikTok or Instagram and a thousand people share that video, it counts as a thousand people sharing that video, regardless of whether you have one follower or a million followers. The algorithm will say, okay, a thousand people shared it. On X, if a thousand people share that video, but each of those thousand people has fewer than, I think the number is one thousand followers, it counts as zero. Certain people are worth less to the algorithm within X than other people, and I think that's weird. I don't like that at all. I think people should be considered equal, and if they have more visibility, that's great. But you shouldn't be penalizing people for not being popular enough on a platform.
How are you going to tell me that a thousand people shared a post, but they don't count because they're not popular enough? I don't like that idea.

Speaker 2: What's the most important thing we could do right now to fix the country?

Speaker 3: I think continue to have conversations. I think people should continue to speak out. I think free speech is the most important thing that we have in this country. It's what separates this country from every other country in the world, and to see it at risk is a huge concern. I think if more and more people had spoken out about the injustices of COVID, maybe we wouldn't have gotten into the situation that we did. And people say, well, I did speak up, but not enough did. They can't cancel everybody. What would happen if everyone spoke out at the same exact time? No one's gonna consume their technology, no one's gonna read their ads, no one's gonna watch their ads, no one's gonna buy their product. Their product would cease to exist. They can't cancel everyone. I think they make examples of certain people and they scare everyone else from talking. And I think it's our job as a populace to speak out and say, no, enough is enough.

Speaker 2: Matt Kim, it has been a pleasure, man. Let's do this again. I would love to come on the Matt Kim Show sometime, so you let me know, let's do it when that works for you. I want to meet all of Matt Kim's people, so to speak, in the ecosystem. You know, you'll be Morpheus, I'll be Neo, or you can be Neo and I'll be Morpheus. We'll keep the Matrix stuff going. But anyway, man, it was great to talk to you. And uh, let's keep in touch, and let's try to save the internet, save the country. Sounds good to me.

Speaker 3: Thank you, brother. I'll see you soon. Thank you, everybody, so much.