1 00:00:01,240 --> 00:00:05,360 Speaker 1: So often when you are scrolling through X, or you're 2 00:00:05,400 --> 00:00:09,560 Speaker 1: scrolling through Instagram, or you're scrolling through YouTube, there are 3 00:00:09,920 --> 00:00:13,760 Speaker 1: often times where you get people that are delivering information 4 00:00:13,920 --> 00:00:16,720 Speaker 1: but always have this sense that they need to add 5 00:00:16,760 --> 00:00:20,400 Speaker 1: whatever they have to add to it. Today's guest is 6 00:00:20,520 --> 00:00:23,360 Speaker 1: a person that literally stopped me in my tracks when 7 00:00:23,400 --> 00:00:26,880 Speaker 1: I found him about a year ago and started following 8 00:00:26,960 --> 00:00:29,480 Speaker 1: him when I got back on social media, in particular 9 00:00:29,560 --> 00:00:33,720 Speaker 1: back on X, or Twitter, now X. That is Kagan Dunlop, 10 00:00:33,840 --> 00:00:36,279 Speaker 1: and so what I love so much about what he 11 00:00:36,360 --> 00:00:40,239 Speaker 1: does is that he delivers information in real time. He 12 00:00:40,360 --> 00:00:43,000 Speaker 1: delivers it with a sense of these are the facts, 13 00:00:43,080 --> 00:00:45,639 Speaker 1: and allows you to decipher what you want to do. 14 00:00:46,240 --> 00:00:49,120 Speaker 1: So what I thought we'd do today is, first off, Kagan, 15 00:00:49,120 --> 00:00:51,520 Speaker 1: thank you so much for coming on. I really appreciate 16 00:00:51,560 --> 00:00:52,240 Speaker 1: you joining me. 17 00:00:52,680 --> 00:00:57,240 Speaker 2: Yeah, of course, I appreciate you inviting me on. You know, 18 00:00:57,320 --> 00:00:59,560 Speaker 2: I'm happy to be here. You know, I apologize it 19 00:00:59,560 --> 00:01:01,600 Speaker 2: took so long for you to try to plan this out.
20 00:01:01,640 --> 00:01:03,680 Speaker 2: I know it's kind of a pain in the butt 21 00:01:03,720 --> 00:01:07,200 Speaker 2: dealing with scheduling and everything I've been I've been slammed, 22 00:01:07,200 --> 00:01:10,560 Speaker 2: and then I've also recently moved, so I'm trying to 23 00:01:10,640 --> 00:01:12,240 Speaker 2: I was trying to like sort my life out. So 24 00:01:12,280 --> 00:01:14,319 Speaker 2: I appreciate you being patient with me while I was 25 00:01:14,480 --> 00:01:15,680 Speaker 2: working through all that stuff. 26 00:01:16,000 --> 00:01:18,280 Speaker 1: Oh man, I would have waited for ten years to 27 00:01:18,319 --> 00:01:19,160 Speaker 1: get you on Man. 28 00:01:19,240 --> 00:01:22,920 Speaker 3: I just there's certain people that I've just come across 29 00:01:22,959 --> 00:01:24,679 Speaker 3: that it just it hits me. 30 00:01:25,240 --> 00:01:27,880 Speaker 1: I was like, Man, the way that person is doing 31 00:01:27,920 --> 00:01:32,360 Speaker 1: it is the way that it just it connects. And 32 00:01:33,280 --> 00:01:35,679 Speaker 1: I just have so much appreciation for how you do 33 00:01:35,720 --> 00:01:36,680 Speaker 1: what you do, man. 34 00:01:36,840 --> 00:01:39,120 Speaker 4: I appreciate that. That means a lot. It means a lot. 35 00:01:39,520 --> 00:01:42,840 Speaker 1: Okay, So the most recent things that you've been putting out, 36 00:01:43,080 --> 00:01:47,080 Speaker 1: and I think actually you've been doing this for a 37 00:01:47,120 --> 00:01:47,840 Speaker 1: long time. 38 00:01:47,720 --> 00:01:48,520 Speaker 3: Is is. 39 00:01:50,080 --> 00:01:54,800 Speaker 1: The information that's coming out about Russia and Ukraine in particular, 40 00:01:55,040 --> 00:01:58,480 Speaker 1: the various new weapons systems in terms of drone warfare, 41 00:01:58,680 --> 00:02:03,400 Speaker 1: anti drone warfare. 
You know, as you've been watching closely 42 00:02:03,480 --> 00:02:06,800 Speaker 1: on this, the tactics that are emerging on both sides, 43 00:02:06,800 --> 00:02:11,200 Speaker 1: whether you're talking, you know, it's utilizing drones, uh, 44 00:02:11,240 --> 00:02:15,919 Speaker 1: different autonomous vehicles, or my favorite was you posted 45 00:02:16,360 --> 00:02:18,240 Speaker 1: a video the other day of some guy getting in 46 00:02:18,280 --> 00:02:21,080 Speaker 1: and one of his tires was off, and you know, 47 00:02:21,160 --> 00:02:25,920 Speaker 1: and it was just, I think, there's just, if you 48 00:02:25,960 --> 00:02:29,720 Speaker 1: could give your assessment of what that conflict really looks 49 00:02:29,840 --> 00:02:33,919 Speaker 1: like and where do you think it's evolving to next, 50 00:02:34,440 --> 00:02:38,200 Speaker 1: and then, you know, over all that, you know, maybe 51 00:02:38,200 --> 00:02:41,160 Speaker 1: we can, once we've established that, we'll talk about what 52 00:02:41,560 --> 00:02:45,000 Speaker 1: you think about the negotiations and how the ratcheting up 53 00:02:45,160 --> 00:02:49,080 Speaker 1: is going in terms of the drone warfare inside Russia. 54 00:02:49,200 --> 00:02:52,400 Speaker 1: Russia's response the other day, what is your impression. 55 00:02:51,960 --> 00:02:53,360 Speaker 3: Of where we're at in all of those. 56 00:02:54,000 --> 00:02:58,560 Speaker 2: Yeah, well, first, I would say first and foremost, I 57 00:02:58,800 --> 00:03:03,680 Speaker 2: wouldn't consider myself an expert on probably anything really.
I 58 00:03:03,760 --> 00:03:06,120 Speaker 2: just kind of like observe stuff, and like I like 59 00:03:06,240 --> 00:03:08,640 Speaker 2: to know what's going on, and I like seeing like 60 00:03:08,720 --> 00:03:12,920 Speaker 2: the evolution of technology, especially when it pertains to conflict, 61 00:03:12,960 --> 00:03:16,600 Speaker 2: because I mean, obviously the military is heavily involved in 62 00:03:16,720 --> 00:03:19,640 Speaker 2: stuff around the world, and I like to be informed 63 00:03:19,720 --> 00:03:22,320 Speaker 2: on what kind of technology is being used out there. 64 00:03:22,440 --> 00:03:24,040 Speaker 2: I'd like to be, I like to be able 65 00:03:24,040 --> 00:03:26,480 Speaker 2: to share that information with other people who may find 66 00:03:26,480 --> 00:03:29,320 Speaker 2: it interesting, who may encounter it based on whatever their 67 00:03:29,440 --> 00:03:32,840 Speaker 2: job is, whether that be contractors that are independent of 68 00:03:32,880 --> 00:03:35,160 Speaker 2: the military, or active duty service members that may come 69 00:03:35,200 --> 00:03:39,680 Speaker 2: in contact with it, and I think that this whole 70 00:03:39,720 --> 00:03:43,080 Speaker 2: thing has been, I would say if I had to 71 00:03:43,120 --> 00:03:47,200 Speaker 2: summarize this whole event, and this is just like completely 72 00:03:47,600 --> 00:03:51,240 Speaker 2: divorced of like your opinions on the war and like 73 00:03:51,720 --> 00:03:54,960 Speaker 2: how it started, and like who's at fault. Obviously, Russia 74 00:03:55,160 --> 00:03:57,839 Speaker 2: invaded Ukraine in twenty twenty two. It's pretty cut and dry, 75 00:03:57,880 --> 00:04:00,560 Speaker 2: pretty clear that Russia's the aggressor in this one, and 76 00:04:00,640 --> 00:04:04,760 Speaker 2: Ukraine was invaded by a foreign nation. That's pretty just 77 00:04:05,000 --> 00:04:07,720 Speaker 2: plain and simple. That's the way it is.
But as 78 00:04:07,720 --> 00:04:10,600 Speaker 2: far as like your opinions on it, like separate from that, 79 00:04:11,200 --> 00:04:18,320 Speaker 2: this has been one of the biggest intelligence gathering events 80 00:04:18,560 --> 00:04:22,599 Speaker 2: of probably my lifetime for sure. As far as just 81 00:04:22,680 --> 00:04:27,600 Speaker 2: like this is the probably the most observed, most recorded, 82 00:04:27,720 --> 00:04:32,920 Speaker 2: and most documented war in the history of the world, probably, 83 00:04:32,960 --> 00:04:36,720 Speaker 2: I would say just from like the the sheer volume 84 00:04:36,839 --> 00:04:40,440 Speaker 2: of ISR feeds that people have been able to watch, 85 00:04:40,560 --> 00:04:43,000 Speaker 2: drone feeds that people have been able to watch actual 86 00:04:43,040 --> 00:04:46,320 Speaker 2: combat footage from go pros that have been recording footage 87 00:04:46,520 --> 00:04:50,360 Speaker 2: either from the Russian perspective or the Ukrainian perspective, or 88 00:04:50,600 --> 00:04:52,800 Speaker 2: you know, the foreign volunteers that were part of the 89 00:04:52,880 --> 00:04:58,599 Speaker 2: Ukrainian Foreign legions perspective, and there has been more videos 90 00:04:58,720 --> 00:05:03,520 Speaker 2: and content and photos and documentation of just combat operations 91 00:05:03,560 --> 00:05:06,159 Speaker 2: throughout the entire duration it's war since twenty twenty two's 92 00:05:06,160 --> 00:05:08,720 Speaker 2: invasion than anything before. 93 00:05:09,240 --> 00:05:14,480 Speaker 4: And so I think that while the war itself may. 94 00:05:14,360 --> 00:05:18,680 Speaker 2: Be an absolute tragedy because there's been countless life lost 95 00:05:18,800 --> 00:05:21,520 Speaker 2: on both sides, and that should be something that everybody 96 00:05:21,520 --> 00:05:24,000 Speaker 2: should be like, Hey, maybe this isn't good. 
97 00:05:24,040 --> 00:05:26,000 Speaker 4: We probably should find a way to, you know, come 98 00:05:26,040 --> 00:05:26,440 Speaker 4: to peace. 99 00:05:27,320 --> 00:05:30,640 Speaker 2: And there's no prosperity being gained for the people of 100 00:05:30,720 --> 00:05:33,719 Speaker 2: Russia or the people of Ukraine from constant death like this. 101 00:05:34,600 --> 00:05:36,719 Speaker 2: But at the same time, like it has been a 102 00:05:36,880 --> 00:05:41,599 Speaker 2: massive intelligence gathering event because people from all over the 103 00:05:41,640 --> 00:05:46,320 Speaker 2: planet have been gathering intelligence from this, like massive, massive 104 00:05:46,320 --> 00:05:51,560 Speaker 2: amounts of intelligence not only on Russian operations and Russian 105 00:05:51,760 --> 00:05:55,599 Speaker 2: tactics and Russian techniques and the way that they conduct 106 00:05:55,680 --> 00:05:59,039 Speaker 2: themselves in battle, and like what their kind of battle 107 00:05:59,120 --> 00:06:01,480 Speaker 2: rhythm is, and like what they like to do as 108 00:06:01,480 --> 00:06:04,320 Speaker 2: far as like how things go from the beginning of 109 00:06:04,360 --> 00:06:07,000 Speaker 2: an invasion to like kind of like where we're at 110 00:06:07,120 --> 00:06:10,880 Speaker 2: right now. The type of tactics that have changed with 111 00:06:11,000 --> 00:06:14,479 Speaker 2: drone warfare, the type of tactics that have changed with 112 00:06:14,680 --> 00:06:18,839 Speaker 2: unmanned surface vessels like the boats on the Black Sea 113 00:06:18,920 --> 00:06:22,080 Speaker 2: and stuff like that taking things out. 
There's been a 114 00:06:22,120 --> 00:06:26,279 Speaker 2: lot of, a lot of information that's been gathered from this, 115 00:06:26,400 --> 00:06:29,640 Speaker 2: and then obviously like the information war with social media 116 00:06:29,760 --> 00:06:33,560 Speaker 2: and everything else that's happening, like Telegram and you know, 117 00:06:34,000 --> 00:06:37,640 Speaker 2: TikTok and Twitter or X and everything else. Like there's 118 00:06:37,640 --> 00:06:39,760 Speaker 2: been videos and footage of this posted all over 119 00:06:39,760 --> 00:06:42,080 Speaker 2: the place, and there's all kinds of narratives being pushed. 120 00:06:42,320 --> 00:06:45,039 Speaker 2: You can have the same video and like fifteen different 121 00:06:45,080 --> 00:06:47,680 Speaker 2: narratives being pushed with that one video because it was 122 00:06:47,800 --> 00:06:50,279 Speaker 2: getting shared by different accounts that are trying to share 123 00:06:50,320 --> 00:06:52,719 Speaker 2: it from a different angle and all this other stuff. 124 00:06:52,760 --> 00:06:55,520 Speaker 2: So I think, if nothing else, this has definitely been 125 00:06:55,520 --> 00:07:01,880 Speaker 2: a big information gathering thing for people on all sides, 126 00:07:01,920 --> 00:07:05,040 Speaker 2: whether that be the United States or Russia or Ukraine 127 00:07:05,160 --> 00:07:07,800 Speaker 2: or anyone else that's just sitting back on the sidelines 128 00:07:07,839 --> 00:07:11,240 Speaker 2: watching this thing.
And I think right now we're kind 129 00:07:11,240 --> 00:07:14,320 Speaker 2: of obviously in a tenuous situation because I know that 130 00:07:15,000 --> 00:07:19,640 Speaker 2: President Donald Trump has been trying to negotiate peace, and 131 00:07:19,680 --> 00:07:21,960 Speaker 2: so it's like, his cabinet, they've been trying to get 132 00:07:22,560 --> 00:07:27,080 Speaker 2: the, you know, the Russian president Vladimir Putin to agree 133 00:07:27,080 --> 00:07:29,680 Speaker 2: to a ceasefire and a peace agreement with Ukraine, and 134 00:07:29,800 --> 00:07:31,800 Speaker 2: Ukraine's been trying to do a peace agreement. 135 00:07:32,240 --> 00:07:33,360 Speaker 4: It's been all over the place. 136 00:07:33,400 --> 00:07:35,560 Speaker 2: And then I even just today, like I haven't even 137 00:07:35,560 --> 00:07:39,600 Speaker 2: made a video about it, but apparently Vladimir Putin's helicopter 138 00:07:39,840 --> 00:07:42,400 Speaker 2: was somewhere near the front at one point and it 139 00:07:42,480 --> 00:07:46,559 Speaker 2: came under massive drone attack, and now they're using 140 00:07:46,600 --> 00:07:50,680 Speaker 2: that event as a, they're framing it as a potential 141 00:07:50,800 --> 00:07:52,080 Speaker 2: assassination attempt. 142 00:07:52,480 --> 00:07:54,119 Speaker 4: And this just came out today. 143 00:07:54,160 --> 00:07:58,720 Speaker 2: I've seen this trending already, and like that's obviously going 144 00:07:58,800 --> 00:08:01,240 Speaker 2: to have an impact on negotiations now, you know, like 145 00:08:01,840 --> 00:08:04,920 Speaker 2: Russia already seems like 146 00:08:05,680 --> 00:08:09,360 Speaker 2: they don't really care, they don't really seem, 147 00:08:09,360 --> 00:08:13,280 Speaker 2: they seem really, like, apathetic towards having peace. Like it almost seems 148 00:08:13,320 --> 00:08:16,160 Speaker 2: like an apathy towards peace, which is weird.
And I 149 00:08:16,200 --> 00:08:19,080 Speaker 2: don't know if that's the Russian people. I think that's 150 00:08:19,200 --> 00:08:22,960 Speaker 2: just the Russian leadership seems to be apathetic towards peace. 151 00:08:23,520 --> 00:08:25,680 Speaker 2: And I don't know if that's because like their economy 152 00:08:25,800 --> 00:08:28,400 Speaker 2: is dependent on the war economy, and if they stop that, 153 00:08:28,480 --> 00:08:30,720 Speaker 2: they're gonna, like, their economy is going to collapse. Or 154 00:08:31,400 --> 00:08:34,480 Speaker 2: if maybe they don't trust the United States and 155 00:08:34,600 --> 00:08:37,640 Speaker 2: the negotiation process, or if they don't trust Ukraine enough 156 00:08:37,679 --> 00:08:40,439 Speaker 2: to follow through on their word. Like, I don't know 157 00:08:40,520 --> 00:08:44,959 Speaker 2: what's happening that's causing this to take so long to resolve. 158 00:08:45,000 --> 00:08:47,200 Speaker 2: I do know that it's going to be obviously very 159 00:08:47,240 --> 00:08:50,880 Speaker 2: complicated brokering peace between two people, two nations who've been 160 00:08:50,920 --> 00:08:54,520 Speaker 2: fighting for, you know, we're going on three years now, 161 00:08:54,679 --> 00:08:58,439 Speaker 2: you know, a little over three years. Obviously a hard 162 00:08:58,480 --> 00:09:01,840 Speaker 2: thing to do. I don't even know where you would start. Like, 163 00:09:01,880 --> 00:09:04,560 Speaker 2: I'm not a geopolitical expert. I'm not sure how they 164 00:09:04,600 --> 00:09:07,560 Speaker 2: would even broker this stuff, but obviously there's a lot of, 165 00:09:08,160 --> 00:09:12,720 Speaker 2: you know, animosity between the two nations, and it's only, 166 00:09:13,400 --> 00:09:15,679 Speaker 2: it just keeps getting kicked down the road.
I don't 167 00:09:15,720 --> 00:09:17,120 Speaker 2: know how they're going to do it, but I mean, 168 00:09:17,120 --> 00:09:20,000 Speaker 2: I hope they can manage it. I would like to 169 00:09:20,040 --> 00:09:23,120 Speaker 2: see peace for all the people of both those countries. 170 00:09:23,160 --> 00:09:25,360 Speaker 2: I think, I think we should get to a place 171 00:09:25,400 --> 00:09:29,240 Speaker 2: where war is no longer, well, I mean the unfortunate 172 00:09:29,240 --> 00:09:31,360 Speaker 2: thing is war is very lucrative for a lot of 173 00:09:31,480 --> 00:09:34,800 Speaker 2: nations, right, until it gets to a point where it's no 174 00:09:34,840 --> 00:09:38,280 Speaker 2: longer as lucrative for those nations, like it's going to 175 00:09:38,400 --> 00:09:41,720 Speaker 2: keep happening, you know. So I don't know, I think 176 00:09:41,760 --> 00:09:43,280 Speaker 2: maybe we need to figure out a way to, like, 177 00:09:45,480 --> 00:09:49,400 Speaker 2: maybe we just, like, make robots fight and use that 178 00:09:49,520 --> 00:09:52,800 Speaker 2: as a means to generate wealth, like the coliseum, but 179 00:09:52,920 --> 00:09:54,800 Speaker 2: this way you're not losing, like, it's not at 180 00:09:54,800 --> 00:09:57,400 Speaker 2: the expense of human life. Maybe that's a way that 181 00:09:57,600 --> 00:10:01,439 Speaker 2: the defense industrial complex could still make their money, and 182 00:10:01,480 --> 00:10:03,920 Speaker 2: then, like, it'd be like an entertainment thing, but nobody 183 00:10:03,960 --> 00:10:07,160 Speaker 2: would actually be perishing, like, and it wouldn't, you know, 184 00:10:07,400 --> 00:10:07,959 Speaker 2: hurt anybody. 185 00:10:07,960 --> 00:10:11,040 Speaker 4: I don't know, I mean, this is genius. I 186 00:10:11,080 --> 00:10:11,400 Speaker 4: don't know.
187 00:10:11,480 --> 00:10:13,440 Speaker 2: I mean, somebody, somebody's gonna hear this, be like, 188 00:10:14,280 --> 00:10:16,520 Speaker 2: I could make a lot of money doing it. You know, 189 00:10:16,520 --> 00:10:19,040 Speaker 2: it'd be like, uh, you know, BattleBots or something 190 00:10:19,080 --> 00:10:19,360 Speaker 2: like that. 191 00:10:19,400 --> 00:10:23,160 Speaker 1: But for adults, for the military, that's a no. 192 00:10:23,400 --> 00:10:28,319 Speaker 1: I'm, I, you know, as a person that was, you know, 193 00:10:28,679 --> 00:10:32,360 Speaker 1: involved from the GWOT, from, you know, 194 00:10:32,720 --> 00:10:35,880 Speaker 1: my first combat deployment was O two, summer O two. 195 00:10:36,040 --> 00:10:38,640 Speaker 3: My last was with the Agency in twenty eleven. 196 00:10:39,360 --> 00:10:44,640 Speaker 1: You know, there was a massive evolution of technology, 197 00:10:44,800 --> 00:10:48,000 Speaker 1: for sure, but it seemed to be very one sided. 198 00:10:48,600 --> 00:10:51,320 Speaker 3: Like it was, it was us making the advancements. 199 00:10:51,360 --> 00:10:54,480 Speaker 1: It was us that were really, I mean, obviously I 200 00:10:54,840 --> 00:10:56,959 Speaker 1: was never in Iraq, and all my buddies that were 201 00:10:56,960 --> 00:11:01,079 Speaker 1: over there, they started facing, you know, the 202 00:11:01,240 --> 00:11:04,800 Speaker 1: VBIEDs that were a lot more technically proficient with how 203 00:11:04,800 --> 00:11:07,920 Speaker 1: they crafted them in terms of their ability to penetrate 204 00:11:07,960 --> 00:11:11,400 Speaker 1: through their MRAPs and just how the sophistication advanced. 205 00:11:11,480 --> 00:11:13,720 Speaker 3: But now, you know, you've.
206 00:11:13,520 --> 00:11:17,400 Speaker 1: Got, you know, there, I saw a report, I think 207 00:11:17,440 --> 00:11:20,360 Speaker 1: you posted it, that they're going to build something 208 00:11:20,480 --> 00:11:26,040 Speaker 1: like four million drones over the next, you know, year 209 00:11:26,160 --> 00:11:28,760 Speaker 1: or so. They're going to ramp up their production, right, 210 00:11:29,320 --> 00:11:31,959 Speaker 1: and then, and then on top of that, you also 211 00:11:32,160 --> 00:11:37,640 Speaker 1: have, uh, you know, China feeding their drone manufacturing, Iran's 212 00:11:37,760 --> 00:11:38,880 Speaker 1: drone manufacturing. 213 00:11:38,920 --> 00:11:40,480 Speaker 3: They're feeding the Houthis. They're free. 214 00:11:40,760 --> 00:11:43,960 Speaker 1: So it almost feels like, you know, because of this 215 00:11:45,960 --> 00:11:50,800 Speaker 1: lower end technology that delivers even more capability. 216 00:11:50,840 --> 00:11:55,040 Speaker 1: I also heard the number that, 217 00:11:55,040 --> 00:12:00,480 Speaker 1: I think you said, that seventy percent of all casualties were 218 00:12:00,480 --> 00:12:03,120 Speaker 1: a result of drone warfare in some capacity. 219 00:12:03,679 --> 00:12:03,959 Speaker 4: Yeah.
220 00:12:04,440 --> 00:12:08,200 Speaker 2: Now, and that's just from like using open source. Like, 221 00:12:08,280 --> 00:12:10,640 Speaker 2: I don't personally know if that's the case, but I've 222 00:12:10,679 --> 00:12:12,839 Speaker 2: been seeing a lot of open sources and a lot 223 00:12:12,840 --> 00:12:15,600 Speaker 2: of like reporting that's saying that it could be upwards 224 00:12:15,600 --> 00:12:19,600 Speaker 2: of seventy percent of casualties or destroyed vehicles were 225 00:12:19,600 --> 00:12:23,760 Speaker 2: a result of drones, which is nuts, because, you know, 226 00:12:24,160 --> 00:12:26,719 Speaker 2: that was, I mean, it's relatively, it's not like new 227 00:12:26,800 --> 00:12:30,440 Speaker 2: new, because, I mean, you know, we had, you know, 228 00:12:30,520 --> 00:12:34,280 Speaker 2: Jihadis and stuff like that using drones back in, you know, 229 00:12:34,520 --> 00:12:38,400 Speaker 2: when we were still in Afghanistan and Syria and Iraq 230 00:12:38,440 --> 00:12:41,800 Speaker 2: and stuff like that. I think around twenty sixteen, seventeen, 231 00:12:41,840 --> 00:12:43,960 Speaker 2: they were starting to do that stuff. But this, it's 232 00:12:44,000 --> 00:12:48,160 Speaker 2: like, blown way, way up since this conflict started, you know. 233 00:12:48,880 --> 00:12:52,880 Speaker 1: And for me, because obviously, I mean, even 234 00:12:52,920 --> 00:12:56,240 Speaker 1: after Benghazi is when I really, you know, kind of 235 00:12:56,320 --> 00:13:00,760 Speaker 1: understood the massive scale, in terms of American arms sales, 236 00:13:00,800 --> 00:13:02,520 Speaker 1: how we do those arms sales. 237 00:13:02,240 --> 00:13:04,480 Speaker 3: Whether they're overt or covert. 238 00:13:04,679 --> 00:13:07,480 Speaker 1: You know, you start to realize, all right, there's this 239 00:13:07,600 --> 00:13:13,360 Speaker 1: whole undercurrent of technology and then distribution across channels.
240 00:13:13,400 --> 00:13:15,880 Speaker 3: Now that, you know, all the. 241 00:13:17,480 --> 00:13:21,760 Speaker 1: players that our military has to face around the world, 242 00:13:21,840 --> 00:13:26,960 Speaker 1: it seems like they've exponentially gotten bigger. So in 243 00:13:27,040 --> 00:13:30,480 Speaker 1: what you're seeing in this reporting, this open source stuff 244 00:13:30,480 --> 00:13:34,480 Speaker 1: that you're always on top of, how many different nations 245 00:13:34,480 --> 00:13:39,000 Speaker 1: do you think are producing quality munitions and weapons systems 246 00:13:39,040 --> 00:13:44,000 Speaker 1: that are contributing not just to Ukraine and Russia, but 247 00:13:44,160 --> 00:13:51,520 Speaker 1: to Syria, to Lebanon, to Israel, the Houthis out of Yemen? 248 00:13:51,679 --> 00:13:54,920 Speaker 1: I mean, is it just exploding in a way that 249 00:13:54,920 --> 00:13:56,440 Speaker 1: it becomes uncontrollable? 250 00:14:02,000 --> 00:14:04,080 Speaker 3: What's up, everybody? Sorry again for the interruption. 251 00:14:04,240 --> 00:14:07,679 Speaker 1: I just want to tell you about this incredible five 252 00:14:07,840 --> 00:14:11,160 Speaker 1: day challenge that we're offering on David Rutherford dot com 253 00:14:11,280 --> 00:14:13,960 Speaker 1: or in the link in the description. And what this 254 00:14:14,160 --> 00:14:18,040 Speaker 1: is, is your Embrace Fear five day Challenge. Now, 255 00:14:18,120 --> 00:14:22,040 Speaker 1: over the past thirty years, I've spent a tremendous amount 256 00:14:22,040 --> 00:14:25,200 Speaker 1: of time developing what I call the Frog Logic concepts. 257 00:14:25,240 --> 00:14:28,240 Speaker 1: I teach people to embrace their fears, to forge their 258 00:14:28,240 --> 00:14:30,720 Speaker 1: self confidence, how to live a team life, and to 259 00:14:30,920 --> 00:14:34,160 Speaker 1: ultimately live with purpose.
So what we want to introduce 260 00:14:34,200 --> 00:14:37,560 Speaker 1: you to is how to get going in these ideas, 261 00:14:37,600 --> 00:14:42,120 Speaker 1: and so we've got this incredible five day Embrace Fear Challenge. Again, 262 00:14:42,200 --> 00:14:45,160 Speaker 1: the link is in the description. In this there's five 263 00:14:45,240 --> 00:14:47,720 Speaker 1: missions, five to one, one per day, that's going to ask 264 00:14:47,760 --> 00:14:50,080 Speaker 1: you to do all different types of things, which is: 265 00:14:50,760 --> 00:14:53,600 Speaker 1: identify the fears which are running your life, accept the 266 00:14:53,680 --> 00:14:57,120 Speaker 1: hard truth of where you are, retrain your mindset, to 267 00:14:57,240 --> 00:15:00,080 Speaker 1: test your limits, and then to forge your courage to 268 00:15:00,160 --> 00:15:04,840 Speaker 1: live with purpose. These involve both physical and emotional and 269 00:15:04,880 --> 00:15:07,640 Speaker 1: mental aspects in terms of journaling, but I'm telling you, 270 00:15:07,760 --> 00:15:10,600 Speaker 1: it's going to kick start your mind as you begin 271 00:15:10,680 --> 00:15:12,880 Speaker 1: to think about the number one emotion out of our 272 00:15:12,920 --> 00:15:17,040 Speaker 1: eight core emotions that impedes us from our successes, and 273 00:15:17,080 --> 00:15:20,560 Speaker 1: that's fear. And trust me when I tell you, I've 274 00:15:20,600 --> 00:15:23,160 Speaker 1: spent a lot of my life being afraid of things. 275 00:15:23,680 --> 00:15:26,080 Speaker 1: When you're jumping out of a perfectly good airplane in 276 00:15:26,120 --> 00:15:28,240 Speaker 1: the middle of the night, chasing two ducks into the 277 00:15:28,240 --> 00:15:31,280 Speaker 1: middle of the ocean, you better believe you're scared.
And 278 00:15:31,360 --> 00:15:34,640 Speaker 1: these are the ideas that emerged out of my understanding 279 00:15:34,680 --> 00:15:37,600 Speaker 1: of fear that I hope you can understand too, that 280 00:15:37,640 --> 00:15:41,120 Speaker 1: will end up helping you be able to find your purpose. 281 00:15:41,280 --> 00:15:45,400 Speaker 3: In life. So here's the kicker. This is a free challenge. 282 00:15:45,400 --> 00:15:47,280 Speaker 1: You don't have to pay for it, you don't have 283 00:15:47,360 --> 00:15:49,880 Speaker 1: to buy it, you don't have to do anything. All 284 00:15:49,880 --> 00:15:52,440 Speaker 1: you got to do is sign up, take the challenge, 285 00:15:52,560 --> 00:15:55,280 Speaker 1: and get your life back on track, being able to 286 00:15:55,360 --> 00:15:58,600 Speaker 1: embrace that fear in order for you to find purpose 287 00:15:58,640 --> 00:15:59,240 Speaker 1: in life. 288 00:15:59,440 --> 00:16:01,200 Speaker 3: All right? Enjoy. Hooyah! 289 00:16:02,440 --> 00:16:06,320 Speaker 1: Is it just exploding in a way that it becomes uncontrollable? 290 00:16:08,400 --> 00:16:08,640 Speaker 1: You know? 291 00:16:08,760 --> 00:16:09,320 Speaker 4: I think that. 292 00:16:12,480 --> 00:16:16,960 Speaker 2: It's tough because there's so many different nations that are 293 00:16:17,000 --> 00:16:22,480 Speaker 2: producing armaments, and then of course we're selling armaments through, 294 00:16:22,680 --> 00:16:25,560 Speaker 2: like, uh, you know, foreign military sales and stuff like 295 00:16:25,600 --> 00:16:29,200 Speaker 2: that to different nations.
Like for example, I think on 296 00:16:29,440 --> 00:16:34,239 Speaker 2: May twenty second, the State Department released an approval 297 00:16:34,320 --> 00:16:39,080 Speaker 2: of a possible foreign military sale of like two hundred 298 00:16:39,080 --> 00:16:42,800 Speaker 2: and ninety six million dollars in Javelin missile systems and 299 00:16:42,840 --> 00:16:48,000 Speaker 2: related equipment to Estonia, which is obviously a former Soviet 300 00:16:48,040 --> 00:16:50,320 Speaker 2: bloc nation. And Estonia is our ally, let's. 301 00:16:50,200 --> 00:16:50,960 Speaker 4: Be clear about that. 302 00:16:51,120 --> 00:16:53,520 Speaker 2: Like I've worked, I've worked like. 303 00:16:53,520 --> 00:16:56,320 Speaker 4: With the Estonians before, Oh wow, not. 304 00:16:56,440 --> 00:17:00,520 Speaker 2: Directly, but like they were, I, so I was, I 305 00:17:00,600 --> 00:17:04,240 Speaker 2: was in CENTCOM in twenty twenty three and they 306 00:17:04,280 --> 00:17:06,399 Speaker 2: were part of the defense force for the base that 307 00:17:06,440 --> 00:17:07,879 Speaker 2: I was on at the time, and I met a 308 00:17:07,880 --> 00:17:10,399 Speaker 2: bunch of those guys. They were all super nice, and 309 00:17:10,440 --> 00:17:12,720 Speaker 2: they're just, they were all standing posts, like they're doing 310 00:17:12,760 --> 00:17:15,560 Speaker 2: their job like everybody else. But they're like one of 311 00:17:15,600 --> 00:17:19,440 Speaker 2: our allies for sure. They're a big partner for us. 312 00:17:19,440 --> 00:17:23,000 Speaker 1: And we had them all through Afghanistan back in the day. 313 00:17:23,040 --> 00:17:26,280 Speaker 1: They would always be like on these little bases that 314 00:17:26,440 --> 00:17:29,400 Speaker 1: the Estonians would do, like gate guard detail and all that. 315 00:17:30,320 --> 00:17:33,320 Speaker 1: They just got after it, they were doing it all.
And I'll 316 00:17:33,320 --> 00:17:35,880 Speaker 1: never forget they showed up one time and they were 317 00:17:35,920 --> 00:17:39,520 Speaker 1: wearing like rubber, like hunting boots, like rubber duck boots, 318 00:17:39,560 --> 00:17:42,439 Speaker 1: you know. Man, oh dude, it was, I felt, we 319 00:17:42,480 --> 00:17:45,360 Speaker 1: felt so bad. And it was the Brits, the British, 320 00:17:45,520 --> 00:17:48,280 Speaker 1: they said, hey, man, hey, you know, here's some boots, 321 00:17:48,320 --> 00:17:51,080 Speaker 1: and they gave them like badass boots and stuff, and 322 00:17:51,800 --> 00:17:54,879 Speaker 1: they were like, they couldn't have been cooler, like, 323 00:17:54,920 --> 00:17:58,040 Speaker 1: the level of commitment from them just went through 324 00:17:58,040 --> 00:17:59,960 Speaker 1: the roof because they're like, oh, that's cool. 325 00:18:00,119 --> 00:18:02,560 Speaker 2: So yeah, yeah, they were always, I mean, 326 00:18:02,720 --> 00:18:06,800 Speaker 2: I had nothing but good interactions with all the Estonians, 327 00:18:06,840 --> 00:18:10,520 Speaker 2: and, you know, but again, it's like you've got a 328 00:18:10,520 --> 00:18:13,199 Speaker 2: lot of, a lot of different nations that some of 329 00:18:13,200 --> 00:18:16,959 Speaker 2: them manufacture weapons, some of them don't, and they're like, 330 00:18:17,000 --> 00:18:19,440 Speaker 2: everybody's trading with each other and trying to sell each 331 00:18:19,440 --> 00:18:21,639 Speaker 2: other stuff, and like people are wanting to buy stuff, 332 00:18:21,680 --> 00:18:24,159 Speaker 2: like, you know, you see nations that have, you know, 333 00:18:24,359 --> 00:18:28,040 Speaker 2: F sixteens and like F fifteens and F eighteens and 334 00:18:28,080 --> 00:18:30,520 Speaker 2: F thirty fives that they purchased from the United States 335 00:18:30,520 --> 00:18:32,960 Speaker 2: that may have had certain key technology removed from it 336
00:18:33,000 --> 00:18:35,440 Speaker 2: before they purchased it, and then they added their own 337 00:18:35,600 --> 00:18:36,720 Speaker 2: proprietary stuff. 338 00:18:36,720 --> 00:18:38,320 Speaker 4: They just have like the basic airframe. 339 00:18:38,600 --> 00:18:41,560 Speaker 2: It's like, hey, I'd like to buy your souped up 340 00:18:41,600 --> 00:18:44,280 Speaker 2: Honda Civic, and then they buy it. We take 341 00:18:44,320 --> 00:18:46,480 Speaker 2: all the souped up stuff out, and then they take 342 00:18:46,520 --> 00:18:48,960 Speaker 2: it and then they soup it up themselves, because they're like, 343 00:18:49,040 --> 00:18:50,880 Speaker 2: I really like that frame and the things it does. 344 00:18:52,080 --> 00:18:55,920 Speaker 2: But you know, it 345 00:18:56,000 --> 00:19:00,280 Speaker 2: is a weird thing, because it's not something I really 346 00:19:00,359 --> 00:19:03,280 Speaker 2: knew much about. For the longest time, I didn't realize 347 00:19:03,280 --> 00:19:07,119 Speaker 2: that we sold like our older equipment, or sold like 348 00:19:07,160 --> 00:19:09,879 Speaker 2: our frames to people, or sold our tanks to people, 349 00:19:10,000 --> 00:19:14,439 Speaker 2: or sold our, like, you know, Humvees or whatever 350 00:19:14,480 --> 00:19:18,840 Speaker 2: other vehicles. I mean, I'd never seen combat footage 351 00:19:18,840 --> 00:19:22,359 Speaker 2: of Bradleys before this came out, and then 352 00:19:22,440 --> 00:19:26,480 Speaker 2: in the war in Ukraine, you've seen Bradleys taking on 353 00:19:26,840 --> 00:19:29,800 Speaker 2: T ninety tanks and stuff like that. Like, I remember 354 00:19:29,840 --> 00:19:32,800 Speaker 2: that was one of the most viral videos ever, I 355 00:19:32,800 --> 00:19:34,920 Speaker 2: think back in twenty twenty 356 00:19:34,920 --> 00:19:38,480 Speaker 2: two or twenty twenty three.
Yeah, and there was like two 357 00:19:38,560 --> 00:19:42,040 Speaker 2: Bradleys facing off against one T ninety, and it was 358 00:19:42,080 --> 00:19:45,320 Speaker 2: getting hit so hard by the Bushmaster chain guns 359 00:19:45,840 --> 00:19:48,639 Speaker 2: that it just couldn't do anything. I mean, it was 360 00:19:48,720 --> 00:19:51,560 Speaker 2: just like the older systems got fried from this thing. 361 00:19:51,640 --> 00:19:53,840 Speaker 2: I can't even imagine, like, how terrifying that would be 362 00:19:53,880 --> 00:19:58,960 Speaker 2: to be getting attacked by two Bradleys inside of that thing. 363 00:20:00,400 --> 00:20:03,359 Speaker 2: Wild to see. But yeah, I think, you know, 364 00:20:04,080 --> 00:20:06,439 Speaker 2: really at this point, when it comes to like arms 365 00:20:06,480 --> 00:20:09,439 Speaker 2: sales and, you know, things like that, the world is 366 00:20:09,520 --> 00:20:14,560 Speaker 2: so intertwined with just how connected we all are, like, 367 00:20:14,720 --> 00:20:17,760 Speaker 2: militarily and GDP and all this other stuff, because, 368 00:20:17,800 --> 00:20:19,919 Speaker 2: like, you know, I don't know. 369 00:20:19,920 --> 00:20:21,200 Speaker 4: I'm not an economics guy. 370 00:20:21,240 --> 00:20:24,760 Speaker 2: I don't really know a lot about like global economics 371 00:20:24,840 --> 00:20:28,040 Speaker 2: or global currencies or things like that, or sales across 372 00:20:28,080 --> 00:20:32,200 Speaker 2: the ocean. But there's definitely a lot of, you know, 373 00:20:32,440 --> 00:20:35,840 Speaker 2: give and take going on between nations, 374 00:20:35,920 --> 00:20:39,040 Speaker 2: whether they need it for themselves or if they're just 375 00:20:39,080 --> 00:20:41,080 Speaker 2: trying to buy it and sell it off to somebody 376 00:20:41,080 --> 00:20:43,080 Speaker 2: else at a higher rate.
I don't know, you know, 377 00:20:43,160 --> 00:20:45,080 Speaker 2: I mean, there's probably a lot of that going on 378 00:20:45,119 --> 00:20:47,760 Speaker 2: too, just like how a lot of scalpers 379 00:20:47,800 --> 00:20:49,320 Speaker 2: like to buy things and then sell them for a 380 00:20:49,359 --> 00:20:51,880 Speaker 2: higher rate. I would imagine the same thing happens with weapons, 381 00:20:52,400 --> 00:20:54,479 Speaker 2: you know. I mean, it's just humans trying to make 382 00:20:54,480 --> 00:20:55,480 Speaker 2: a buck, you know what I mean. 383 00:20:56,440 --> 00:20:58,520 Speaker 3: And it all just kind of goes back to that, 384 00:20:58,760 --> 00:20:59,199 Speaker 3: doesn't it? 385 00:20:59,280 --> 00:20:59,480 Speaker 1: Right? 386 00:20:59,560 --> 00:21:02,520 Speaker 3: I mean, there are people that prey on conflicts no 387 00:21:02,560 --> 00:21:05,879 Speaker 3: matter what to make those dollars. For sure. 388 00:21:07,200 --> 00:21:09,800 Speaker 1: One of the funniest things that you posted, 389 00:21:10,000 --> 00:21:14,040 Speaker 1: and I shouldn't say funny, but I think one of the remarkable aspects of 390 00:21:14,560 --> 00:21:17,879 Speaker 1: the basic human condition, was the video you posted about, 391 00:21:18,400 --> 00:21:23,600 Speaker 1: uh, the Ladies of the Night video in Ukraine 392 00:21:23,640 --> 00:21:26,920 Speaker 1: and how much they're making on the front lines 393 00:21:27,000 --> 00:21:29,520 Speaker 1: and how that's come about. And I think that's 394 00:21:29,560 --> 00:21:33,840 Speaker 1: more indicative as to these, you know, these black economies 395 00:21:33,920 --> 00:21:37,720 Speaker 1: that evolve, right, these black market economies that seem to 396 00:21:37,800 --> 00:21:42,840 Speaker 1: emerge through combat engagements, and I found that fascinating.
397 00:21:42,960 --> 00:21:49,840 Speaker 1: What, as you're assessing your open sources, the places that 398 00:21:49,880 --> 00:21:53,879 Speaker 1: you're pulling information from, is there a process of 399 00:21:53,880 --> 00:21:57,119 Speaker 1: how you filter all this, or are you 400 00:21:57,160 --> 00:22:00,199 Speaker 1: simply just looking for something that really is kind 401 00:22:00,200 --> 00:22:05,199 Speaker 1: of unique and fascinating about the grander context of conflict, 402 00:22:05,280 --> 00:22:08,560 Speaker 1: and that's what kind of will inspire you to make 403 00:22:08,720 --> 00:22:09,640 Speaker 1: a video about it? 404 00:22:10,400 --> 00:22:13,880 Speaker 2: Typically, it's just like, you know, if I find something 405 00:22:14,520 --> 00:22:17,880 Speaker 2: that I think is interesting and I was like, oh wow, 406 00:22:17,920 --> 00:22:22,919 Speaker 2: I didn't realize that was something that was happening, and 407 00:22:22,960 --> 00:22:25,840 Speaker 2: I find it interesting and it doesn't seem, like, extreme. 408 00:22:26,680 --> 00:22:29,119 Speaker 2: I try to stay away from stuff that seems really 409 00:22:29,240 --> 00:22:34,840 Speaker 2: clickbaity or extreme, like extreme fringe statements. Like, if it's 410 00:22:34,880 --> 00:22:38,359 Speaker 2: something that's super outlandish and wild and crazy, unless 411 00:22:38,440 --> 00:22:40,439 Speaker 2: I see a video of it or something like that, 412 00:22:40,520 --> 00:22:44,240 Speaker 2: I try not to just post it right 413 00:22:44,280 --> 00:22:47,440 Speaker 2: off the bat without knowing anything about it, 414 00:22:47,680 --> 00:22:50,560 Speaker 2: unless it's just like, hey, you know, I had a 415 00:22:50,560 --> 00:22:52,480 Speaker 2: few other friends talking to me about it or something 416 00:22:52,520 --> 00:22:54,080 Speaker 2: like that.
Because there's a lot of guys out there 417 00:22:54,080 --> 00:22:56,119 Speaker 2: doing the same thing I'm doing. I'm not unique 418 00:22:56,160 --> 00:22:57,800 Speaker 2: in that sense. There's a lot of folks out there 419 00:22:57,880 --> 00:23:02,160 Speaker 2: that do very similar, you know, videos and content. 420 00:23:02,200 --> 00:23:04,439 Speaker 2: They might just not have their face in it or 421 00:23:04,480 --> 00:23:07,919 Speaker 2: be talking about it or giving commentary on it or discussing it. 422 00:23:07,960 --> 00:23:11,280 Speaker 2: They might just post the video, and then 423 00:23:11,440 --> 00:23:15,080 Speaker 2: I guess sometimes they'll credit the person 424 00:23:15,119 --> 00:23:19,479 Speaker 2: it came from, or sometimes they won't, right. Yeah, you know, 425 00:23:19,560 --> 00:23:23,040 Speaker 2: I just, like, if there's something I think is interesting, 426 00:23:23,760 --> 00:23:25,880 Speaker 2: you know, I try to find out as much 427 00:23:25,920 --> 00:23:26,920 Speaker 2: as I can about it 428 00:23:26,840 --> 00:23:29,960 Speaker 4: and, you know, see if I can 429 00:23:30,280 --> 00:23:33,960 Speaker 2: vet if it's legit, and also if it's, 430 00:23:34,640 --> 00:23:38,639 Speaker 2: you know, something that seems feasible, that's not just 431 00:23:39,040 --> 00:23:42,800 Speaker 2: wild and outlandish. I try to be careful when it 432 00:23:42,840 --> 00:23:48,639 Speaker 2: comes to certain topics and subjects, just because, you know, obviously, 433 00:23:48,720 --> 00:23:53,320 Speaker 2: especially like the conflict that's 434 00:23:53,359 --> 00:23:58,639 Speaker 2: kind of ongoing with Gaza and Israel and stuff like that.
435 00:23:58,680 --> 00:24:02,040 Speaker 2: There's so much information that's not necessarily credible coming 436 00:24:02,080 --> 00:24:05,960 Speaker 2: out of that whole thing. You know, unless 437 00:24:05,960 --> 00:24:08,359 Speaker 2: it's like, hey, this is something that happened, here's a 438 00:24:08,400 --> 00:24:11,560 Speaker 2: statement about that thing that happened, here's the video, that's 439 00:24:11,600 --> 00:24:14,120 Speaker 2: one thing. But there's so much information coming out from 440 00:24:14,160 --> 00:24:16,760 Speaker 2: both sides, sometimes it gets really muddied and I can't 441 00:24:16,800 --> 00:24:19,680 Speaker 2: tell what actually happened, and so I just won't 442 00:24:19,720 --> 00:24:21,760 Speaker 2: even talk about it, because it's like, well, I don't 443 00:24:21,760 --> 00:24:24,879 Speaker 2: want to propagate really bad information if I don't know 444 00:24:25,119 --> 00:24:28,639 Speaker 2: for sure, definitively, that anything like that happened, you know. 445 00:24:28,680 --> 00:24:31,240 Speaker 2: And I try to keep my opinions out of 446 00:24:31,400 --> 00:24:32,879 Speaker 4: a lot of that stuff if I can. 447 00:24:33,520 --> 00:24:35,959 Speaker 2: We're all gonna have inherent bias, you know. Like, I'm 448 00:24:35,960 --> 00:24:38,359 Speaker 2: gonna have inherent bias, everyone's gonna have inherent bias. Like, 449 00:24:38,400 --> 00:24:40,760 Speaker 2: I have friends in the Israeli Defense Force, you know, 450 00:24:40,880 --> 00:24:44,520 Speaker 2: that watch my videos and watch my content. I've interviewed 451 00:24:45,280 --> 00:24:48,160 Speaker 2: an Israeli Defense Force officer that was back here during 452 00:24:48,200 --> 00:24:52,600 Speaker 2: SHOT Show before.
A friend of mine, Orin Julie is her 453 00:24:52,680 --> 00:24:56,760 Speaker 2: name, a big gun content creator, big 454 00:24:56,800 --> 00:24:59,359 Speaker 2: into firearms and mental health and stuff. So I was 455 00:24:59,359 --> 00:25:01,680 Speaker 2: able to interview her for my first SHOT Show back 456 00:25:01,720 --> 00:25:06,560 Speaker 2: in twenty twenty four, I think it was. Very cool. 457 00:25:06,800 --> 00:25:10,679 Speaker 2: So there's always gonna be inherent bias, and 458 00:25:11,000 --> 00:25:12,960 Speaker 2: I try to be as careful as I can, and 459 00:25:13,160 --> 00:25:16,000 Speaker 2: if something's jacked up and it comes out later that, hey, 460 00:25:16,080 --> 00:25:21,080 Speaker 2: that was manufactured or that was nonsense, 461 00:25:21,200 --> 00:25:24,119 Speaker 2: and I see it, 462 00:25:24,280 --> 00:25:26,639 Speaker 2: oh okay, cool, well, I'll take that one down or 463 00:25:26,680 --> 00:25:28,600 Speaker 2: I'll delete it or something like that. I don't want 464 00:25:28,640 --> 00:25:31,200 Speaker 2: to push bad information out there, but I try 465 00:25:31,240 --> 00:25:33,399 Speaker 2: to put stuff up where, if there's at least 466 00:25:34,040 --> 00:25:38,320 Speaker 2: a fair amount of credible sources talking about it, you know, 467 00:25:38,400 --> 00:25:40,520 Speaker 2: then I'll usually try to talk about it. 468 00:25:40,560 --> 00:25:42,680 Speaker 2: The only thing that's hard is if it's breaking. 469 00:25:43,440 --> 00:25:46,920 Speaker 2: Usually, more than likely, when it's breaking, the only people 470 00:25:46,960 --> 00:25:49,360 Speaker 2: that are talking about that stuff at that particular time 471 00:25:49,359 --> 00:25:53,159 Speaker 2: are like local news sources.
Like, that's all the 472 00:25:53,240 --> 00:25:55,720 Speaker 2: small ones, all the way down at the bottom, because 473 00:25:55,720 --> 00:25:58,480 Speaker 2: you're not gonna have all the big mainstream media 474 00:25:58,520 --> 00:26:01,960 Speaker 2: ones talking about something if it just happened, because they're 475 00:26:02,000 --> 00:26:04,960 Speaker 2: pulling that information from all the way down at the 476 00:26:04,960 --> 00:26:08,160 Speaker 2: bottom too, or local sources that are feeding it to them, 477 00:26:08,320 --> 00:26:10,959 Speaker 2: you know. But yeah, really, what it comes down 478 00:26:11,000 --> 00:26:13,280 Speaker 2: to is if I find it interesting or fascinating. 479 00:26:13,320 --> 00:26:16,440 Speaker 2: Like, hey, here's some new technology that's coming out. Or hey, 480 00:26:16,520 --> 00:26:19,640 Speaker 2: some pro Russian channels posted a video of a guy 481 00:26:20,119 --> 00:26:23,359 Speaker 2: using an ammunition cart that has a quick release 482 00:26:24,040 --> 00:26:26,120 Speaker 2: latch so that he can hop off and hop into 483 00:26:26,119 --> 00:26:28,479 Speaker 2: a security position or something, you know, that kind 484 00:26:28,520 --> 00:26:28,840 Speaker 2: of stuff. 485 00:26:28,880 --> 00:26:30,000 Speaker 4: I think it's interesting. 486 00:26:31,320 --> 00:26:34,679 Speaker 2: I try not to make it controversial by saying, you know, 487 00:26:35,280 --> 00:26:37,520 Speaker 2: any outlandish statements about it one way or another. 488 00:26:37,680 --> 00:26:40,119 Speaker 2: Just like, hey, here's what's happening, here's what this is, 489 00:26:40,200 --> 00:26:42,199 Speaker 2: here's what they were posting and saying it was, and, 490 00:26:42,880 --> 00:26:44,080 Speaker 2: you know, there it is. 491 00:26:44,119 --> 00:26:46,000 Speaker 4: That's it. You don't need me.
I don't need to 492 00:26:46,040 --> 00:26:48,280 Speaker 4: tell you how to feel about this, you know. Yeah. 493 00:26:48,520 --> 00:26:50,800 Speaker 1: And I think that's one of the other things 494 00:26:50,800 --> 00:26:53,800 Speaker 1: about the way you deliver that is so refreshing, 495 00:26:53,880 --> 00:26:57,879 Speaker 1: because, you know, you're letting people make the decision on 496 00:26:58,359 --> 00:27:01,400 Speaker 1: where their biases are going to lead them regardless, right? 497 00:27:01,480 --> 00:27:04,920 Speaker 1: And so I think the more people that 498 00:27:05,000 --> 00:27:08,919 Speaker 1: we can have out delivering that style of information, 499 00:27:09,160 --> 00:27:12,359 Speaker 1: the better, right? And I think, you know, 500 00:27:12,480 --> 00:27:16,560 Speaker 1: the greatest challenge, in particular as people are making this 501 00:27:16,760 --> 00:27:21,000 Speaker 1: transition away from, you know, the old guard media, whatever 502 00:27:21,040 --> 00:27:25,520 Speaker 1: that is now, you know, into this new space, is 503 00:27:25,600 --> 00:27:30,080 Speaker 1: like, how do they determine, you know, which channel to 504 00:27:30,160 --> 00:27:34,080 Speaker 1: follow and why? You know? And I think 505 00:27:34,080 --> 00:27:38,640 Speaker 1: that lends itself, when people can innately kind of 506 00:27:39,280 --> 00:27:46,040 Speaker 1: feel the clarity of your deliverable, 507 00:27:46,480 --> 00:27:47,800 Speaker 1: then they're more, 508 00:27:47,760 --> 00:27:50,520 Speaker 3: that trust builds up in a quicker capacity. 509 00:27:51,119 --> 00:27:53,240 Speaker 2: Yeah, well, that's another thing. It takes time for people 510 00:27:53,280 --> 00:27:55,639 Speaker 2: to trust you. You know, you got to earn the trust.
511 00:27:55,680 --> 00:27:58,280 Speaker 2: And if you're like, hey, 512 00:27:58,320 --> 00:28:00,240 Speaker 2: you know, we made a mistake, or if 513 00:28:00,400 --> 00:28:04,200 Speaker 2: the video was, you know, maybe you jacked 514 00:28:04,280 --> 00:28:07,080 Speaker 2: up some information on it and you remove it, then 515 00:28:07,119 --> 00:28:09,800 Speaker 2: it's not there anymore, and then that way people see, 516 00:28:09,960 --> 00:28:10,600 Speaker 2: you know. 517 00:28:10,800 --> 00:28:11,640 Speaker 4: But here's the thing. Like, 518 00:28:11,600 --> 00:28:13,800 Speaker 2: I'm not gonna necessarily come out and say, oops, we 519 00:28:13,840 --> 00:28:14,520 Speaker 2: made a mistake. 520 00:28:14,600 --> 00:28:15,600 Speaker 4: I'll just take it down. 521 00:28:16,040 --> 00:28:19,720 Speaker 2: I'll just immediately take that accountability, remove it, and then 522 00:28:19,840 --> 00:28:22,760 Speaker 2: we'll keep pushing forward. You know, I don't necessarily 523 00:28:22,800 --> 00:28:24,720 Speaker 2: spend a lot of time on oops, my bad. 524 00:28:24,840 --> 00:28:26,960 Speaker 2: I'll just delete it, because it's like, no, we're gonna 525 00:28:26,960 --> 00:28:28,480 Speaker 2: take care of this right now and then we're gonna 526 00:28:28,480 --> 00:28:29,360 Speaker 2: move on to the next thing. 527 00:28:29,720 --> 00:28:30,000 Speaker 1: Cool. 528 00:28:30,160 --> 00:28:33,439 Speaker 2: And maybe that's like an aggressive approach to it, or 529 00:28:33,520 --> 00:28:36,919 Speaker 2: not as nuanced of an approach, because 530 00:28:37,000 --> 00:28:40,760 Speaker 2: maybe to some people it might feel more personable 531 00:28:40,800 --> 00:28:44,200 Speaker 2: if I publicly talk about, hey, this was 532 00:28:44,280 --> 00:28:45,880 Speaker 2: wrong or this wasn't right, 533 00:28:45,800 --> 00:28:46,719 Speaker 4: so we deleted it.
534 00:28:48,080 --> 00:28:50,520 Speaker 2: But I mean, as people stick around and they get 535 00:28:50,520 --> 00:28:53,080 Speaker 2: to know me, or they watch me talking 536 00:28:53,080 --> 00:28:55,400 Speaker 2: about stuff on people's podcasts, they can get to 537 00:28:55,440 --> 00:28:58,680 Speaker 2: know that stuff from there, rather than me posting a 538 00:28:58,680 --> 00:29:01,640 Speaker 2: short video that's like an apology video, which doesn't do 539 00:29:01,680 --> 00:29:06,000 Speaker 2: anything anyway, you know. Honestly, if there's something that 540 00:29:06,000 --> 00:29:08,960 Speaker 2: I put up that's like incorrect or wrong, or, 541 00:29:09,120 --> 00:29:11,240 Speaker 2: another thing that I noticed, if it's 542 00:29:11,320 --> 00:29:15,760 Speaker 2: just like pure inflammatory stuff that's just making 543 00:29:15,840 --> 00:29:20,000 Speaker 2: people get divided, something that happened in the nation, like, I 544 00:29:20,000 --> 00:29:21,960 Speaker 2: don't know, you could find plenty of topics to talk 545 00:29:22,000 --> 00:29:25,400 Speaker 2: about, then I typically just, you know, I posted something 546 00:29:25,440 --> 00:29:27,240 Speaker 2: a while back, and it just caused a 547 00:29:27,240 --> 00:29:30,640 Speaker 2: bunch of people to start fighting, because it was 548 00:29:30,720 --> 00:29:33,520 Speaker 2: like there was some racial stuff involved with it, and it 549 00:29:33,600 --> 00:29:36,320 Speaker 2: was like somebody got murdered and that kind of thing, 550 00:29:36,640 --> 00:29:38,680 Speaker 2: and I was like, you know what, this isn't really 551 00:29:39,360 --> 00:29:42,440 Speaker 2: doing anything beneficial for the community of people that are 552 00:29:42,640 --> 00:29:45,840 Speaker 2: engaging with my content.
I'm just going to try to 553 00:29:45,920 --> 00:29:49,320 Speaker 2: keep it informative. I like to be informative and informed and teach 554 00:29:49,360 --> 00:29:53,240 Speaker 2: people stuff, but I'm not trying to rage bait people. 555 00:29:53,320 --> 00:29:56,120 Speaker 2: That's not what I'm interested in. I don't want to. 556 00:29:56,240 --> 00:29:58,160 Speaker 2: There's enough people out there doing that crap. I don't 557 00:29:58,200 --> 00:30:00,800 Speaker 2: want to contribute to it. I'm going to report 558 00:30:00,840 --> 00:30:04,240 Speaker 2: and talk about certain things that may be controversial and 559 00:30:04,480 --> 00:30:07,320 Speaker 2: may have some people that get upset about it, but 560 00:30:07,400 --> 00:30:10,280 Speaker 2: it's not explicitly there to rage bait people. 561 00:30:10,600 --> 00:30:12,280 Speaker 4: And so that's something I have had to 562 00:30:12,320 --> 00:30:14,680 Speaker 2: figure out and kind of work through, 563 00:30:14,720 --> 00:30:17,600 Speaker 2: like, what's the right way to approach this 564 00:30:18,040 --> 00:30:22,280 Speaker 2: without feeling like I'm censoring myself or censoring my ability 565 00:30:22,320 --> 00:30:23,520 Speaker 2: to communicate information. 566 00:30:24,000 --> 00:30:25,720 Speaker 4: You know, it's a tough 567 00:30:25,840 --> 00:30:26,440 Speaker 4: kind of dance.
568 00:30:26,440 --> 00:30:29,280 Speaker 1: I guess, you know, it's a very 569 00:30:29,320 --> 00:30:34,880 Speaker 1: difficult dance, because there's a certain appetite 570 00:30:34,960 --> 00:30:37,760 Speaker 1: that I think people on the Internet, in 571 00:30:37,880 --> 00:30:41,000 Speaker 1: particular people who want to be more informed, who are spending 572 00:30:41,720 --> 00:30:45,440 Speaker 1: an exorbitant amount of time going through their feed and getting 573 00:30:45,480 --> 00:30:48,840 Speaker 1: captured by that rage bait stuff, and, you know, 574 00:30:48,880 --> 00:30:52,120 Speaker 1: they're going to gravitate towards that, you know, 575 00:30:52,560 --> 00:30:56,400 Speaker 1: just by the natural order of their conditioning through 576 00:30:56,440 --> 00:31:01,080 Speaker 1: that propaganda feed that fortifies in their consciousness, 577 00:31:01,200 --> 00:31:05,640 Speaker 1: right? So, you know, but I also do believe that, 578 00:31:06,280 --> 00:31:09,320 Speaker 1: you know, that's why your show has become so successful, 579 00:31:09,440 --> 00:31:12,760 Speaker 1: is because you've been able to kind of rise through 580 00:31:12,800 --> 00:31:15,600 Speaker 1: all that, as you could very easily go down 581 00:31:15,640 --> 00:31:20,640 Speaker 1: that road and make hardcore commentary about these 582 00:31:20,680 --> 00:31:23,560 Speaker 1: different types of engagements based on your background and 583 00:31:23,680 --> 00:31:26,320 Speaker 1: your familiarity with them, but you don't. You keep that 584 00:31:26,400 --> 00:31:30,120 Speaker 1: restriction upon yourself, which is really kind of interesting. 585 00:31:30,160 --> 00:31:32,200 Speaker 1: I wanted to talk to you a little bit about this.
586 00:31:33,080 --> 00:31:37,640 Speaker 1: You know, I think there's a 587 00:31:37,720 --> 00:31:44,280 Speaker 1: growing, I don't know whether it's fabricated or whatever, controversy, in 588 00:31:44,320 --> 00:31:47,320 Speaker 1: particular on the right, and I think the left has its 589 00:31:47,320 --> 00:31:50,040 Speaker 1: own issues or whatever. But it's like, what 590 00:31:50,160 --> 00:31:54,520 Speaker 1: is your responsibility as a creator? You know, and 591 00:31:54,640 --> 00:31:57,479 Speaker 1: what is your responsibility especially when you get to a 592 00:31:57,480 --> 00:32:00,760 Speaker 1: certain level, you know? And you had talked about, 593 00:32:00,840 --> 00:32:04,760 Speaker 1: when you first got started, I'm not an expert in these things, 594 00:32:05,240 --> 00:32:09,280 Speaker 1: I'm trying to deliver this type of factual evaluation of 595 00:32:09,320 --> 00:32:10,960 Speaker 1: it for you to make your 596 00:32:10,800 --> 00:32:12,200 Speaker 3: own conclusive decisions. 597 00:32:12,760 --> 00:32:14,720 Speaker 1: And you talked a little bit about it through 598 00:32:14,760 --> 00:32:18,320 Speaker 1: the interview, but can you explain deeper that 599 00:32:18,600 --> 00:32:21,920 Speaker 1: level of responsibility and how it's grown in you as 600 00:32:21,920 --> 00:32:30,560 Speaker 1: your audience has just gotten bigger and bigger and bigger? Okay, 601 00:32:30,600 --> 00:32:33,840 Speaker 1: thank you so much for listening today. Pardon the interruption, 602 00:32:33,960 --> 00:32:35,840 Speaker 1: but I just got to give a shout out to 603 00:32:36,800 --> 00:32:39,600 Speaker 1: one of our big sponsors here, and this comes from 604 00:32:39,640 --> 00:32:43,440 Speaker 1: my good friend Alex, you know, who has a family 605 00:32:43,480 --> 00:32:47,680 Speaker 1: owned business called Firecracker Farm.
When I talk about family business, 606 00:32:47,760 --> 00:32:52,280 Speaker 1: and I've worked with thousands of family businesses across the 607 00:32:52,360 --> 00:32:56,000 Speaker 1: country for the last twenty years of public speaking, you know, 608 00:32:56,160 --> 00:32:58,160 Speaker 1: it's when I meet them and I know that this 609 00:32:58,320 --> 00:33:01,080 Speaker 1: business is going to succeed, and it's successful because it's 610 00:33:01,080 --> 00:33:03,760 Speaker 1: a business that's a part of love and their 611 00:33:03,760 --> 00:33:06,959 Speaker 1: family and how they support each other. This is that place. 612 00:33:07,920 --> 00:33:09,760 Speaker 1: You know, I've been to their farm. 613 00:33:09,880 --> 00:33:12,320 Speaker 1: I watch how they raise their peppers with love. I 614 00:33:12,360 --> 00:33:15,760 Speaker 1: watch how they process them in this thing they call 615 00:33:15,840 --> 00:33:18,200 Speaker 1: the Three Kings, and how they infuse them into 616 00:33:18,640 --> 00:33:22,320 Speaker 1: this beautiful salt, the spicy salt that enhances 617 00:33:22,360 --> 00:33:25,200 Speaker 1: your food. I put it on my eggs every single morning. 618 00:33:25,240 --> 00:33:26,840 Speaker 1: I put it on my steak, I put it on 619 00:33:26,880 --> 00:33:30,320 Speaker 1: my protein. It's in these cool salt shakers. But more 620 00:33:30,320 --> 00:33:33,480 Speaker 1: so, you know, the product is impeccable. I've 621 00:33:33,640 --> 00:33:37,280 Speaker 1: even been able to phase out ultra processed 622 00:33:37,360 --> 00:33:41,040 Speaker 1: hot sauces, and now I'm using this spicy salt. 623 00:33:41,320 --> 00:33:43,720 Speaker 1: But the thing I know is just how much Alex 624 00:33:43,840 --> 00:33:47,400 Speaker 1: loves doing this, how much his family loves to support him, 625 00:33:47,680 --> 00:33:51,640 Speaker 1: and really the quality of the products.
So if you 626 00:33:51,760 --> 00:33:54,840 Speaker 1: believe me and you trust what I'm telling you, please 627 00:33:55,240 --> 00:33:59,600 Speaker 1: visit Firecracker dot Farm, and if you want a discount, 628 00:33:59,640 --> 00:34:03,000 Speaker 1: you can type in the discount code RUT, R U 629 00:34:03,080 --> 00:34:06,960 Speaker 1: T, Romeo Uniform Tango, one five, to get your discount. Again, 630 00:34:07,120 --> 00:34:09,279 Speaker 1: a family owned business that's incredible. 631 00:34:09,719 --> 00:34:12,359 Speaker 3: You will love their product, I promise you. That's 632 00:34:12,440 --> 00:34:13,680 Speaker 3: Firecracker dot Farm. 633 00:34:15,800 --> 00:34:18,440 Speaker 2: Well, I've definitely become more self aware of biases I have, 634 00:34:19,560 --> 00:34:21,120 Speaker 2: you know. That's one thing. 635 00:34:21,120 --> 00:34:22,640 Speaker 4: I've become very self aware of it. 636 00:34:22,719 --> 00:34:25,440 Speaker 2: And I've gotten called out for being biased 637 00:34:25,520 --> 00:34:29,200 Speaker 2: on certain things in certain posts, even if it wasn't 638 00:34:29,920 --> 00:34:35,520 Speaker 2: overtly biased, maybe it was just slightly biased, or 639 00:34:35,560 --> 00:34:38,080 Speaker 2: maybe I put in some wording that seemed like I 640 00:34:38,200 --> 00:34:44,040 Speaker 2: was for or against something.
And, you know, I'm gonna 641 00:34:44,080 --> 00:34:47,600 Speaker 2: have biases, and sometimes I'm going to post about things 642 00:34:47,800 --> 00:34:52,200 Speaker 2: and say I feel a certain way about it, 643 00:34:52,600 --> 00:34:57,040 Speaker 2: or that I feel like this is the reason why 644 00:34:57,080 --> 00:34:59,640 Speaker 2: this is happening, or this is the reason why they 645 00:34:59,719 --> 00:35:05,360 Speaker 2: are doing this thing or whatever. 646 00:35:05,400 --> 00:35:09,759 Speaker 2: But at the same time, I also try to put 647 00:35:09,760 --> 00:35:11,839 Speaker 2: it out there because I don't want 648 00:35:11,880 --> 00:35:14,560 Speaker 2: to tell people how to think or how to 649 00:35:14,719 --> 00:35:18,600 Speaker 2: feel about stuff. I have my own opinions about things, 650 00:35:18,600 --> 00:35:22,160 Speaker 2: and, like, you know, I'm sure there's probably people out 651 00:35:22,200 --> 00:35:23,319 Speaker 2: there that think I'm alt-right. 652 00:35:23,360 --> 00:35:25,280 Speaker 4: There's also probably people out there that think I'm 653 00:35:25,160 --> 00:35:27,400 Speaker 2: a liberal, you know. But they don't know, because I 654 00:35:27,440 --> 00:35:29,480 Speaker 2: don't talk about that stuff, you know what I mean? 655 00:35:29,560 --> 00:35:32,440 Speaker 2: Like, I registered independent for a reason, you 656 00:35:32,400 --> 00:35:32,799 Speaker 4: know what I mean.
657 00:35:33,480 --> 00:35:36,960 Speaker 2: I've been a registered independent for years now, 658 00:35:37,000 --> 00:35:39,799 Speaker 2: because, like, I believe in liberty, but I 659 00:35:39,800 --> 00:35:41,480 Speaker 2: also want people to make up their own mind, and 660 00:35:41,520 --> 00:35:44,919 Speaker 2: I believe in freedom, you know. And I 661 00:35:44,920 --> 00:35:49,320 Speaker 2: don't want to, like, I've never done ads for any political 662 00:35:49,360 --> 00:35:51,759 Speaker 2: action committee, and I won't ever, because that's something 663 00:35:51,800 --> 00:35:54,280 Speaker 2: I want to stay very far away from and removed from, 664 00:35:55,080 --> 00:35:59,520 Speaker 2: and, you know, 665 00:35:59,480 --> 00:36:02,480 Speaker 4: I want people to feel like, at the end of 666 00:36:02,480 --> 00:36:02,719 Speaker 4: the day. 667 00:36:02,760 --> 00:36:07,440 Speaker 2: One of the biggest things I've noticed, especially with social media, 668 00:36:07,880 --> 00:36:12,720 Speaker 2: is that a lot of these echo chambers have formed, 669 00:36:12,920 --> 00:36:16,680 Speaker 2: like, along party lines in all kinds of ways, and 670 00:36:17,800 --> 00:36:21,160 Speaker 2: what ends up happening is these pages 671 00:36:21,200 --> 00:36:24,000 Speaker 2: that are very big pages across social media will post 672 00:36:24,040 --> 00:36:27,440 Speaker 2: stuff that is obviously appealing to a specific type of 673 00:36:27,480 --> 00:36:32,920 Speaker 2: individual, and it's designed to get people mad and 674 00:36:33,080 --> 00:36:36,320 Speaker 2: comment and say something because they're angry, or because 675 00:36:36,960 --> 00:36:40,479 Speaker 2: it makes them feel a certain way, 676 00:36:40,640 --> 00:36:43,920 Speaker 2: or it evokes certain emotions that cause them 677 00:36:43,960 --> 00:36:47,279 Speaker 2: to want to engage with it, a
trigger. Yeah, it's 678 00:36:47,320 --> 00:36:50,479 Speaker 2: like a trick, right? And obviously, I would say 679 00:36:50,480 --> 00:36:53,160 Speaker 2: older folks are probably more likely to fall for 680 00:36:53,239 --> 00:36:55,680 Speaker 2: that one, because it's still relatively new and 681 00:36:55,880 --> 00:36:58,520 Speaker 2: social media hasn't been around their whole life, unlike 682 00:36:58,600 --> 00:37:01,239 Speaker 2: a lot of gen Zers. By the time gen 683 00:37:01,400 --> 00:37:04,000 Speaker 2: Zers are in their sixties and retirement age, 684 00:37:04,360 --> 00:37:07,440 Speaker 2: they'll have had social media their whole life and they'll 685 00:37:07,600 --> 00:37:09,520 Speaker 2: know how to deal with this stuff. Although there's 686 00:37:09,520 --> 00:37:11,920 Speaker 2: probably gonna be other challenges by then with AI. 687 00:37:11,719 --> 00:37:12,399 Speaker 4: And everything, but. 688 00:37:13,920 --> 00:37:17,479 Speaker 2: You know, I think one of the biggest problems is that 689 00:37:17,640 --> 00:37:20,640 Speaker 2: there's a lot of people that are just posting 690 00:37:20,680 --> 00:37:23,399 Speaker 2: stuff that doesn't do anything to bring people together, because 691 00:37:23,440 --> 00:37:25,760 Speaker 2: at the end of the day, we're all Americans first, 692 00:37:26,480 --> 00:37:30,439 Speaker 2: and I feel like we need more stuff that's just like, Hey, 693 00:37:30,440 --> 00:37:34,000 Speaker 2: this is what's happening. This is something that's taking place. 694 00:37:35,080 --> 00:37:38,520 Speaker 2: Here's what's in this bill. Here's what's in this statement 695 00:37:38,600 --> 00:37:42,320 Speaker 2: from this department. Here's what they said. Here's what they're doing. 696 00:37:42,719 --> 00:37:45,719 Speaker 2: Here's what your Congress is voting on right now. 697 00:37:45,920 --> 00:37:48,799 Speaker 2: Here's what's in the package.
Here's what's in the one 698 00:37:48,800 --> 00:37:52,440 Speaker 2: big beautiful bill. Oh, it's a thousand pages? Well, you know, 699 00:37:52,760 --> 00:37:55,319 Speaker 2: here's something that the left is going to really like 700 00:37:55,360 --> 00:37:56,000 Speaker 2: that's in the bill. 701 00:37:56,120 --> 00:37:57,960 Speaker 4: Okay, here's something that the right's. 702 00:37:57,719 --> 00:38:00,279 Speaker 2: Going to really like in this bill or whatever, you know. Like, 703 00:38:00,320 --> 00:38:02,920 Speaker 2: here's just what it is, here's what the wording is. 704 00:38:03,000 --> 00:38:05,560 Speaker 4: Like you can decide for yourself. 705 00:38:05,600 --> 00:38:07,880 Speaker 2: I don't want people to think that my page is 706 00:38:07,960 --> 00:38:11,000 Speaker 2: a right wing media page or a left wing media page. 707 00:38:11,000 --> 00:38:13,319 Speaker 2: I want them to think like, hey, this is a 708 00:38:13,320 --> 00:38:17,840 Speaker 2: guy that just talks about current events, current affairs, military stuff, technology, science, 709 00:38:18,560 --> 00:38:19,560 Speaker 2: things that are happening. 710 00:38:20,480 --> 00:38:22,600 Speaker 4: It's interesting and we can all kind of be here. 711 00:38:22,760 --> 00:38:25,080 Speaker 2: The problem is, I notice, no matter what it is, 712 00:38:25,120 --> 00:38:28,640 Speaker 2: if it's been politicized in some way, people 713 00:38:28,840 --> 00:38:32,200 Speaker 2: fight in the comments. And it's not like I'm asking 714 00:38:32,200 --> 00:38:34,520 Speaker 2: for that. I don't want that to happen. 715 00:38:34,719 --> 00:38:37,000 Speaker 2: I'm not posting it because I want people to argue 716 00:38:37,040 --> 00:38:39,359 Speaker 2: in the comments with each other.
I'm posting it because 717 00:38:39,400 --> 00:38:42,239 Speaker 2: it's just like, this is something I found interesting that's 718 00:38:42,480 --> 00:38:45,440 Speaker 2: new or has come out recently. But once 719 00:38:45,480 --> 00:38:49,800 Speaker 2: your media grows to a certain level, 720 00:38:50,480 --> 00:38:53,960 Speaker 2: it becomes totally unmanageable for one person to go in 721 00:38:54,000 --> 00:38:56,800 Speaker 2: there and like clean your comments section up. You can't 722 00:38:56,800 --> 00:39:01,120 Speaker 2: do it. It's impossible. And honestly, for my own 723 00:39:01,160 --> 00:39:03,839 Speaker 2: mental sanity, I spend less time in the comments than 724 00:39:03,880 --> 00:39:07,960 Speaker 2: ever before. I'm a habitual Joe Rogan listener, like 725 00:39:08,000 --> 00:39:09,880 Speaker 2: I really like the things that he does and the 726 00:39:09,880 --> 00:39:12,359 Speaker 2: people he has on that he talks to. I think 727 00:39:12,400 --> 00:39:16,680 Speaker 2: he's a very reasonable guy, like straight center of 728 00:39:16,719 --> 00:39:20,120 Speaker 2: the road. I think most Americans are 729 00:39:20,120 --> 00:39:24,520 Speaker 2: probably in the center. I genuinely believe that, and that 730 00:39:24,600 --> 00:39:28,080 Speaker 2: the only reason why it seems like 731 00:39:28,120 --> 00:39:30,960 Speaker 2: we're super divided is because the people that are the 732 00:39:31,120 --> 00:39:33,640 Speaker 2: loudest on the right and the loudest on the left 733 00:39:34,040 --> 00:39:37,800 Speaker 2: are super super loud and get like magnified because. 734 00:39:37,600 --> 00:39:39,680 Speaker 4: It's so outlandish.
735 00:39:39,760 --> 00:39:42,680 Speaker 2: I think most people are in the center, and I 736 00:39:42,719 --> 00:39:45,640 Speaker 2: think most people want everybody to be 737 00:39:45,680 --> 00:39:48,960 Speaker 2: able to have enough money to feed your family, 738 00:39:49,040 --> 00:39:51,200 Speaker 2: have a nice house with a roof over 739 00:39:51,239 --> 00:39:54,440 Speaker 2: your head, have a good, steady paying job, have 740 00:39:54,560 --> 00:39:57,239 Speaker 2: enough time to take vacations, you know, 741 00:39:57,640 --> 00:40:01,000 Speaker 2: not have food insecurity, you know. I want to see 742 00:40:01,000 --> 00:40:03,399 Speaker 2: everybody in the country be able to have some level 743 00:40:03,400 --> 00:40:06,600 Speaker 2: of success. And I feel like this is still 744 00:40:06,680 --> 00:40:11,040 Speaker 2: the land where you can actually pursue happiness, 745 00:40:11,080 --> 00:40:13,680 Speaker 2: where you have the freedom to pursue happiness, not the 746 00:40:13,719 --> 00:40:16,800 Speaker 2: freedom to be given happiness. 747 00:40:16,800 --> 00:40:18,319 Speaker 4: It's not something you're entitled to. 748 00:40:18,440 --> 00:40:21,160 Speaker 2: But you're free to pursue whatever you want, 749 00:40:21,360 --> 00:40:24,880 Speaker 2: worship whatever god of your understanding you want, and, you know, 750 00:40:25,120 --> 00:40:27,120 Speaker 2: live your life as you see fit, as long as 751 00:40:27,160 --> 00:40:30,440 Speaker 2: you're not having a negative impact on somebody else physically, right? 752 00:40:31,000 --> 00:40:34,640 Speaker 2: Mentally, maybe that's going to happen.
But that's like okay, 753 00:40:34,840 --> 00:40:37,520 Speaker 2: as long as you're not guilty of some 754 00:40:37,600 --> 00:40:40,200 Speaker 2: sort of crime, you know. But that's a 755 00:40:40,239 --> 00:40:42,680 Speaker 2: whole other thing too. It's like you see a lot 756 00:40:42,719 --> 00:40:46,239 Speaker 2: of politicization of legal stuff right now. It's like, man, 757 00:40:46,280 --> 00:40:49,040 Speaker 2: that makes the water super muddy too. And it's like, 758 00:40:49,280 --> 00:40:51,520 Speaker 2: I'm not a lawyer, I don't 759 00:40:51,600 --> 00:40:52,800 Speaker 2: understand how that stuff works. 760 00:40:52,840 --> 00:40:52,960 Speaker 1: Man. 761 00:40:53,040 --> 00:40:54,920 Speaker 2: That's why I just try to keep it 762 00:40:54,960 --> 00:40:57,480 Speaker 2: clean and straight simple, like, hey, this is what's happening, 763 00:40:57,480 --> 00:40:59,840 Speaker 2: this is what I'm seeing, you can decide. This is 764 00:40:59,840 --> 00:41:02,440 Speaker 2: what the wording is in this statement 765 00:41:02,520 --> 00:41:05,839 Speaker 2: this person said or whatever, and you can decide how 766 00:41:05,880 --> 00:41:08,120 Speaker 2: you feel about it, and here it is. 767 00:41:08,360 --> 00:41:13,000 Speaker 1: You know, well, Kagan, I just can't thank you 768 00:41:13,120 --> 00:41:16,120 Speaker 1: enough for the focus that you have in doing that, 769 00:41:16,840 --> 00:41:20,040 Speaker 1: and in your delivery, and having that type of personal 770 00:41:20,080 --> 00:41:24,680 Speaker 1: responsibility. That's such 771 00:41:24,719 --> 00:41:28,640 Speaker 1: a benefit to society right now, because, as you said, 772 00:41:28,680 --> 00:41:30,920 Speaker 1: those people that are screaming at the top of 773 00:41:30,960 --> 00:41:33,280 Speaker 1: their lungs, man, they're getting a lot of traction.
774 00:41:33,440 --> 00:41:36,080 Speaker 1: And you know, those bot farms that we talked about 775 00:41:36,120 --> 00:41:40,040 Speaker 1: before coming on are increasing their capabilities. They're 776 00:41:40,680 --> 00:41:44,800 Speaker 1: driving more diversity, or not diversity, but divisiveness 777 00:41:44,840 --> 00:41:48,719 Speaker 1: within the American public. And I too agree that, man, 778 00:41:48,760 --> 00:41:50,439 Speaker 1: the best thing we can do right now is find 779 00:41:50,440 --> 00:41:52,000 Speaker 1: out that common ground on the. 780 00:41:53,080 --> 00:41:56,560 Speaker 2: Information, that common ground, man, that's the real problem. 781 00:41:56,719 --> 00:41:58,439 Speaker 4: It's like, dude, I guarantee you could. 782 00:41:58,560 --> 00:42:01,520 Speaker 2: You could take fifty Democrats and fifty Republicans, put them 783 00:42:01,560 --> 00:42:02,719 Speaker 2: in a room, and they would all be able to 784 00:42:02,760 --> 00:42:05,400 Speaker 2: find something in common with each other, you know. And 785 00:42:05,520 --> 00:42:08,520 Speaker 2: like we should all 786 00:42:08,719 --> 00:42:10,960 Speaker 2: root for the success of the nation, no matter who's 787 00:42:10,960 --> 00:42:13,880 Speaker 2: in power, no matter who's running what, no matter who's 788 00:42:13,920 --> 00:42:16,879 Speaker 2: in charge of what. You know, we should all be like, well, 789 00:42:16,920 --> 00:42:19,839 Speaker 2: I want this to win. I want to see 790 00:42:19,880 --> 00:42:22,560 Speaker 2: the American people succeed, and I want people to be 791 00:42:22,560 --> 00:42:25,080 Speaker 2: able to have more money to spend on their kids 792 00:42:25,160 --> 00:42:28,520 Speaker 2: and have some semblance of happiness and, you know, 793 00:42:29,120 --> 00:42:29,800 Speaker 2: that kind of stuff.
794 00:42:29,840 --> 00:42:33,920 Speaker 1: Amen, man. All right, brother, thank you so much 795 00:42:34,080 --> 00:42:37,960 Speaker 1: for your time and for what you're doing. 796 00:42:38,000 --> 00:42:41,440 Speaker 1: Where can people find you, follow you 797 00:42:41,520 --> 00:42:44,560 Speaker 1: if they haven't already? And then, obviously, you know, 798 00:42:44,640 --> 00:42:46,920 Speaker 1: do you have anything kind of cool or unique coming up? 799 00:42:47,000 --> 00:42:49,560 Speaker 1: Are there any veterans charities you're working with or groups 800 00:42:49,560 --> 00:42:50,600 Speaker 1: that you'd like to highlight? 801 00:42:51,000 --> 00:42:51,879 Speaker 4: Yeah, absolutely. Well. 802 00:42:51,880 --> 00:42:53,799 Speaker 2: So, I mean, if anybody wants to find me, you 803 00:42:53,840 --> 00:42:56,000 Speaker 2: just type Kagan Dunlap on pretty much 804 00:42:56,120 --> 00:42:57,200 Speaker 2: any social platform. 805 00:42:57,239 --> 00:42:58,960 Speaker 4: And I mean, it's my first name and last name, 806 00:42:58,960 --> 00:42:59,840 Speaker 4: or you can google me. 807 00:43:00,160 --> 00:43:02,440 Speaker 2: If you type my name in Google, Kagan, K-A-G-A-N, 808 00:43:02,520 --> 00:43:04,680 Speaker 2: last name Dunlap, D-U-N-L-A-P, 809 00:43:04,680 --> 00:43:08,319 Speaker 2: then all my social media will pop up 810 00:43:08,400 --> 00:43:09,920 Speaker 2: there and you can just find it on there. 811 00:43:09,960 --> 00:43:13,960 Speaker 2: It's easy to find.
As far as veteran charities, so 812 00:43:14,000 --> 00:43:16,400 Speaker 2: there's a couple of great organizations. Like, this shirt that 813 00:43:16,400 --> 00:43:19,560 Speaker 2: I'm wearing right now is from an organization called Operation 814 00:43:19,640 --> 00:43:23,319 Speaker 2: Allies Refuge Foundation, and we actually just did a 815 00:43:23,360 --> 00:43:27,480 Speaker 2: big roundtable discussion video that we posted on YouTube together. 816 00:43:27,600 --> 00:43:29,480 Speaker 2: It was a pretty big event we did, and we 817 00:43:29,600 --> 00:43:35,680 Speaker 2: recorded it in Quantico, I would say about a month ago, 818 00:43:35,719 --> 00:43:38,640 Speaker 2: maybe at the beginning. I think it 819 00:43:38,680 --> 00:43:40,839 Speaker 2: was the very beginning of April. I went up there 820 00:43:40,880 --> 00:43:44,319 Speaker 2: on a weekend, on a Friday, and 821 00:43:44,400 --> 00:43:48,760 Speaker 2: they had three Vietnam vets that helped evacuate Saigon, 822 00:43:49,040 --> 00:43:51,960 Speaker 2: the Saigon Embassy and one of the consulates, and then 823 00:43:52,440 --> 00:43:59,279 Speaker 2: three Hamid Karzai International Airport veterans who were in the 824 00:43:59,320 --> 00:44:02,680 Speaker 2: Marine Corps that helped evacuate the airport there in twenty 825 00:44:02,719 --> 00:44:06,200 Speaker 2: twenty one. And we brought the six 826 00:44:06,280 --> 00:44:09,400 Speaker 2: of them together, two different generations that had both evacuated 827 00:44:09,400 --> 00:44:14,400 Speaker 2: a country, and they had a roundtable discussion. 828 00:44:14,920 --> 00:44:16,800 Speaker 2: I just kind of sat there as like a mediator 829 00:44:16,840 --> 00:44:18,160 Speaker 2: and asked questions if. 830 00:44:18,320 --> 00:44:19,200 Speaker 4: Like things came up.
831 00:44:19,200 --> 00:44:23,520 Speaker 2: And with the organization, we also did 832 00:44:23,560 --> 00:44:25,600 Speaker 2: one on one interviews for every single one of these 833 00:44:25,640 --> 00:44:28,279 Speaker 2: people, and that way they could kind of 834 00:44:28,320 --> 00:44:30,719 Speaker 2: tell their story about what their experiences were. And then 835 00:44:30,760 --> 00:44:33,600 Speaker 2: we put together all of this footage into a big 836 00:44:33,640 --> 00:44:37,800 Speaker 2: YouTube video. It's on my YouTube channel right now, which, 837 00:44:38,440 --> 00:44:39,640 Speaker 2: you know, if we can get. 838 00:44:39,480 --> 00:44:41,120 Speaker 4: This thing moving. 839 00:44:41,160 --> 00:44:44,640 Speaker 2: It was posted on the anniversary of the evacuation of 840 00:44:45,080 --> 00:44:50,600 Speaker 2: Saigon, on the thirtieth of April, and basically any of 841 00:44:50,600 --> 00:44:53,400 Speaker 2: the money that's generated from this thing in revenue is 842 00:44:53,440 --> 00:44:57,000 Speaker 2: going to go back to that foundation. And then also 843 00:44:57,239 --> 00:45:01,160 Speaker 2: there's fundraising links in the pinned comment 844 00:45:01,239 --> 00:45:04,000 Speaker 2: for it, and also in the description for the video. 845 00:45:04,080 --> 00:45:05,480 Speaker 4: But they're a great organization. 846 00:45:05,560 --> 00:45:08,680 Speaker 2: They do a lot to help folks that served in 847 00:45:08,719 --> 00:45:11,960 Speaker 2: Afghanistan during the evacuation, to heal from moral injury, 848 00:45:12,840 --> 00:45:15,840 Speaker 2: and to bring people together, and to help people 849 00:45:15,880 --> 00:45:18,600 Speaker 2: in the GWOT community that have a 850 00:45:18,640 --> 00:45:22,160 Speaker 2: lot of moral injury from that whole thing, 851 00:45:22,440 --> 00:45:24,560 Speaker 2: you know. And so that's a good organization.
852 00:45:24,640 --> 00:45:27,080 Speaker 2: I'm sure you know who HunterSeven is, the 853 00:45:27,120 --> 00:45:29,719 Speaker 2: HunterSeven Foundation. Chelsea's a good friend of mine, 854 00:45:29,760 --> 00:45:32,200 Speaker 2: and her husband as well. They're all great people. They've 855 00:45:32,239 --> 00:45:36,800 Speaker 2: done an immense amount to help out veterans and folks 856 00:45:36,880 --> 00:45:40,320 Speaker 2: in the first responder community and 857 00:45:40,400 --> 00:45:43,360 Speaker 2: police and all that stuff. Like, they've done free cancer 858 00:45:43,400 --> 00:45:48,239 Speaker 2: screenings for thousands of people. Each cancer screening costs, 859 00:45:48,320 --> 00:45:51,840 Speaker 2: I think, twelve to fifteen hundred dollars, 860 00:45:51,880 --> 00:45:54,319 Speaker 2: and they do free cancer screenings for people all over 861 00:45:54,360 --> 00:45:55,400 Speaker 2: the place all the time. 862 00:45:55,560 --> 00:45:55,920 Speaker 4: Wow. 863 00:45:56,480 --> 00:45:59,839 Speaker 2: At SHOT Show they screened, I think, over 864 00:45:59,880 --> 00:46:02,720 Speaker 2: a hundred people or something crazy like that for free. 865 00:46:03,080 --> 00:46:07,160 Speaker 2: And, you know, they really do help a 866 00:46:07,239 --> 00:46:10,480 Speaker 2: lot of people in the veteran community get screened, 867 00:46:10,520 --> 00:46:13,360 Speaker 2: and they do a lot for cancer 868 00:46:13,400 --> 00:46:16,360 Speaker 2: research too, for exposure related illnesses. 869 00:46:16,520 --> 00:46:19,760 Speaker 4: Check them out, HunterSeven Foundation. They're great. They're great people, 870 00:46:21,280 --> 00:46:22,839 Speaker 4: I mean, and there's tons of others too.
871 00:46:22,880 --> 00:46:24,879 Speaker 2: Another good one a friend of mine 872 00:46:24,920 --> 00:46:27,720 Speaker 2: I went to college with runs is called Jets for Vets, 873 00:46:27,760 --> 00:46:32,000 Speaker 2: and they basically fly junior service members home for free 874 00:46:32,120 --> 00:46:36,600 Speaker 2: from their duty station overseas to see family, 875 00:46:36,640 --> 00:46:39,040 Speaker 2: because if you're a junior enlisted service member, it's 876 00:46:39,080 --> 00:46:42,480 Speaker 2: expensive to pay for tickets coming home. I only went 877 00:46:42,520 --> 00:46:44,880 Speaker 2: home three times in the five years I was stationed in 878 00:46:44,960 --> 00:46:48,239 Speaker 2: Hawaii, because it's like one thousand dollars round trip to 879 00:46:48,320 --> 00:46:50,239 Speaker 2: fly back to the East Coast. And so I was like, 880 00:46:50,600 --> 00:46:50,960 Speaker 2: I'm just. 881 00:46:50,920 --> 00:46:52,640 Speaker 4: Not going to go home. You know, most 882 00:46:52,640 --> 00:46:53,520 Speaker 4: people can't afford it. 883 00:46:53,520 --> 00:46:56,400 Speaker 2: You know, if you're a PFC and you're making like six hundred 884 00:46:56,440 --> 00:46:59,160 Speaker 2: to seven hundred dollars every two weeks, you're not gonna 885 00:46:59,160 --> 00:47:02,040 Speaker 2: be able to afford a thousand dollar, you know, round 886 00:47:02,080 --> 00:47:05,360 Speaker 2: trip flight. You can barely afford anything, you know. 887 00:47:05,440 --> 00:47:08,120 Speaker 2: So they do a really good job too. But there's tons 888 00:47:08,320 --> 00:47:10,360 Speaker 2: of good organizations, I could go on and on.
889 00:47:10,440 --> 00:47:11,960 Speaker 2: I mean, I'm friends with a lot of folks that 890 00:47:12,320 --> 00:47:14,640 Speaker 2: run nonprofits and I love everything that they're doing, and 891 00:47:14,680 --> 00:47:16,200 Speaker 2: that's one of the things I'd like to get more 892 00:47:16,200 --> 00:47:19,120 Speaker 2: heavily involved in, because I'm still 893 00:47:19,160 --> 00:47:21,920 Speaker 2: trying to build my business and my brand, 894 00:47:21,920 --> 00:47:23,400 Speaker 2: because if I can get to 895 00:47:23,440 --> 00:47:25,480 Speaker 2: a point where I can just give away thousands of 896 00:47:25,520 --> 00:47:27,600 Speaker 2: dollars a month of my own money, I want to 897 00:47:27,640 --> 00:47:30,080 Speaker 2: do that. That's where I'm 898 00:47:30,120 --> 00:47:33,200 Speaker 2: working towards. And you have to build something that's profitable 899 00:47:33,480 --> 00:47:34,880 Speaker 2: to get to a point where you can do that. 900 00:47:34,920 --> 00:47:36,200 Speaker 4: And a lot of people don't understand that. 901 00:47:36,239 --> 00:47:38,680 Speaker 2: They think, oh, well, this person's just making all this 902 00:47:38,760 --> 00:47:40,640 Speaker 2: money and they're keeping it all for themselves. Like, bro, 903 00:47:40,840 --> 00:47:43,440 Speaker 2: how do you think Black Rifle Coffee got so successful? 904 00:47:44,040 --> 00:47:46,280 Speaker 2: You think they just got that successful giving away everything 905 00:47:46,320 --> 00:47:47,920 Speaker 2: they made? No, they had to roll it back 906 00:47:47,920 --> 00:47:50,080 Speaker 2: into the company until they grew big enough, where now 907 00:47:50,480 --> 00:47:53,760 Speaker 2: they give hundreds of thousands of dollars away to tons 908 00:47:53,840 --> 00:47:56,440 Speaker 2: of veteran charity organizations.
And that's what I want to 909 00:47:56,440 --> 00:47:58,120 Speaker 2: be able to do eventually, because I want 910 00:47:58,120 --> 00:47:59,560 Speaker 2: to have an impact on people and I want to 911 00:47:59,560 --> 00:48:00,000 Speaker 2: help people. 912 00:48:00,360 --> 00:48:01,520 Speaker 4: That's really what it comes down to. 913 00:48:01,600 --> 00:48:04,360 Speaker 2: Because we 914 00:48:04,400 --> 00:48:06,120 Speaker 2: do a really good job of tearing each other down. 915 00:48:06,320 --> 00:48:08,040 Speaker 2: We also do a really good job of helping each 916 00:48:08,040 --> 00:48:09,719 Speaker 2: other up when we're having a hard time, you know. 917 00:48:09,840 --> 00:48:11,840 Speaker 2: And I feel like we have an obligation 918 00:48:12,040 --> 00:48:16,320 Speaker 2: as service members, you know, military members, veterans, first responders, 919 00:48:16,400 --> 00:48:18,359 Speaker 2: of doing the right thing, taking care of each other, 920 00:48:18,440 --> 00:48:22,520 Speaker 2: being good ambassadors for our community, because, you know, the 921 00:48:22,760 --> 00:48:24,680 Speaker 2: people to our left and our right, whether they're still in 922 00:48:24,760 --> 00:48:27,279 Speaker 2: or out, you know, deserve that. And so 923 00:48:27,360 --> 00:48:30,319 Speaker 2: that's what I hope. I hope that happens. I 924 00:48:30,320 --> 00:48:33,239 Speaker 2: hope more people do that. Maybe they see what 925 00:48:33,280 --> 00:48:35,920 Speaker 2: I'm doing and maybe they're like, okay, maybe I should 926 00:48:35,920 --> 00:48:37,719 Speaker 2: try to do something more myself. You know, I don't 927 00:48:37,719 --> 00:48:39,640 Speaker 2: know if I'm the best role model. I definitely don't 928 00:48:39,640 --> 00:48:41,799 Speaker 2: think so.
Like I'm just trying to do the best 929 00:48:41,880 --> 00:48:43,799 Speaker 2: I can, like everybody else out there, like you, like 930 00:48:43,880 --> 00:48:46,440 Speaker 2: everybody that's out here trying to do something to 931 00:48:46,440 --> 00:48:48,719 Speaker 2: put out a good message, or talk through things, or 932 00:48:49,239 --> 00:48:52,640 Speaker 2: have good, honest, open dialogue about stuff, and 933 00:48:52,760 --> 00:48:55,760 Speaker 2: raising awareness about good organizations that are doing good things 934 00:48:55,560 --> 00:48:57,000 Speaker 2: for other people in the community. 935 00:48:57,000 --> 00:48:57,160 Speaker 1: You know. 936 00:48:57,320 --> 00:49:01,160 Speaker 2: So I start rambling, I think, probably. I lose track 937 00:49:01,200 --> 00:49:03,560 Speaker 2: of what I'm saying sometimes. 938 00:49:03,560 --> 00:49:06,200 Speaker 1: But I'll tell you what, those are the best rambles 939 00:49:06,200 --> 00:49:07,560 Speaker 1: you've ever had, man. 940 00:49:07,920 --> 00:49:10,840 Speaker 3: I wish you all the best. God bless you, Kagan, 941 00:49:10,880 --> 00:49:14,560 Speaker 3: and keep up the outstanding work, man. Stay safe too, please. 942 00:49:14,680 --> 00:49:16,600 Speaker 4: Yeah, I appreciate it. Thanks for having me on again, man. 943 00:49:16,600 --> 00:49:18,840 Speaker 4: I appreciate your time. Thank you, absolutely.