1 00:00:01,560 --> 00:00:05,000 Speaker 1: Hi, and welcome back to the Karol Markowicz Show on iHeartRadio. 2 00:00:05,440 --> 00:00:08,800 Speaker 1: My guest today is Jason Chaffetz, an American politician and 3 00:00:08,880 --> 00:00:11,400 Speaker 1: Fox News contributor and the author of the new book 4 00:00:11,440 --> 00:00:15,040 Speaker 1: They're Coming for You: How Deep State Spies, NGOs, and 5 00:00:15,160 --> 00:00:19,239 Speaker 1: Woke Corporations Plan to Push You Out of the Economy. Jason, 6 00:00:19,600 --> 00:00:20,560 Speaker 1: so nice to have you on. 7 00:00:20,880 --> 00:00:22,480 Speaker 2: Hey, thanks for having me. Appreciate it. 8 00:00:22,920 --> 00:00:26,520 Speaker 3: That is a terrifying title. And why do you want 9 00:00:26,600 --> 00:00:27,800 Speaker 3: us not to sleep at night? 10 00:00:29,040 --> 00:00:32,720 Speaker 2: Who are they? And should people be concerned? 11 00:00:33,120 --> 00:00:37,840 Speaker 4: If you saw recently, you saw that DOGE really uncovered 12 00:00:37,880 --> 00:00:40,839 Speaker 4: all the money, the waste, the fraud, the abuse, the 13 00:00:41,000 --> 00:00:43,840 Speaker 4: hundreds of billions of dollars of taxpayer money that was going 14 00:00:43,880 --> 00:00:47,600 Speaker 4: out the door. But what this book lays out and 15 00:00:47,720 --> 00:00:51,240 Speaker 4: examines and reveals is how much data. 16 00:00:51,400 --> 00:00:53,080 Speaker 2: It's the data side of the equation. 17 00:00:54,080 --> 00:00:57,200 Speaker 4: Most people, I think, understand that they're going to trade 18 00:00:57,320 --> 00:01:00,000 Speaker 4: a little bit of their privacy in ways that increase 19 00:01:00,160 --> 00:01:04,520 Speaker 4: their convenience so they can find their local coffee shop easier, 20 00:01:04,560 --> 00:01:07,440 Speaker 4: that sort of thing. What they don't understand is how 21 00:01:07,480 --> 00:01:12,600 Speaker 4: pervasive it is in understanding everything about you.
Facial recognition, 22 00:01:13,920 --> 00:01:17,640 Speaker 4: your hair color, your propensities, your sexual preferences. All of 23 00:01:17,680 --> 00:01:22,479 Speaker 4: these things are now being collected by the government, sold 24 00:01:22,840 --> 00:01:27,120 Speaker 4: by the government, and used by the government in ways they 25 00:01:27,160 --> 00:01:31,039 Speaker 4: would never be able to otherwise. And that's what's scary, 26 00:01:31,120 --> 00:01:33,560 Speaker 4: and you have to be worried about it. And you 27 00:01:33,640 --> 00:01:36,880 Speaker 4: combine that with artificial intelligence and the deep fakes and 28 00:01:36,920 --> 00:01:40,880 Speaker 4: where we're going. It is a scary world. But there 29 00:01:41,400 --> 00:01:44,120 Speaker 4: are blueprints out there to see where this is going. 30 00:01:44,240 --> 00:01:46,800 Speaker 4: You think of China, I think of other places. A 31 00:01:46,840 --> 00:01:49,720 Speaker 4: lot of people, they wanted them to be debanked, they wanted 32 00:01:49,760 --> 00:01:53,120 Speaker 4: them to be pushed out of the economy, and that's what's scary. 33 00:01:54,200 --> 00:01:59,080 Speaker 1: Does the average person understand what's coming, or do you 34 00:01:59,160 --> 00:02:02,720 Speaker 1: think that it's still so far away from, you know, 35 00:02:02,880 --> 00:02:06,240 Speaker 1: normal life that it's hard to convey to people that 36 00:02:06,280 --> 00:02:08,080 Speaker 1: they should be worried about this threat? 37 00:02:09,120 --> 00:02:12,080 Speaker 4: Well, what I worry about is that the data, that's 38 00:02:12,120 --> 00:02:14,000 Speaker 4: not a genie that's easy to put 39 00:02:13,800 --> 00:02:14,560 Speaker 2: back in the bottle. 40 00:02:14,600 --> 00:02:17,240 Speaker 4: And what I really worry about are these young kids 41 00:02:17,280 --> 00:02:19,760 Speaker 4: that are growing up.
You know, I'm a little bit 42 00:02:19,800 --> 00:02:22,679 Speaker 4: older now, but they're growing up in an atmosphere where 43 00:02:22,680 --> 00:02:26,280 Speaker 4: everything is captured. But what really bothers me is how 44 00:02:26,320 --> 00:02:30,959 Speaker 4: our government is dealing with this because, for instance, law 45 00:02:31,040 --> 00:02:35,640 Speaker 4: enforcement would have to have probable cause or articulable suspicion, 46 00:02:35,720 --> 00:02:37,679 Speaker 4: or have to get a warrant, in order to do 47 00:02:37,760 --> 00:02:41,400 Speaker 4: things like profile you and track you and 48 00:02:42,000 --> 00:02:44,760 Speaker 4: understand who you are and what you're doing. But in 49 00:02:44,840 --> 00:02:47,239 Speaker 4: order to get around that, the data just gets sold and 50 00:02:47,240 --> 00:02:49,560 Speaker 4: then they buy it from a data broker, and then 51 00:02:49,560 --> 00:02:51,640 Speaker 4: once they buy it from a data broker, they feel 52 00:02:51,639 --> 00:02:54,640 Speaker 4: like they don't need a warrant in order to track you. 53 00:02:54,960 --> 00:02:57,320 Speaker 4: And you think, well, wait, I haven't done anything wrong. 54 00:02:57,720 --> 00:03:01,760 Speaker 4: I lead a clean life, you know. But here's what's happening: 55 00:03:01,840 --> 00:03:05,280 Speaker 4: even literally in the last forty-eight hours, the Federal 56 00:03:05,360 --> 00:03:11,080 Speaker 4: Reserve has said that they do bank exams and 57 00:03:11,120 --> 00:03:16,240 Speaker 4: they used to require banks to minimize reputational risk.
And 58 00:03:16,320 --> 00:03:19,359 Speaker 4: what that meant for banking is if 59 00:03:19,360 --> 00:03:22,839 Speaker 4: you do business with people who shop at Cabela's, who 60 00:03:22,880 --> 00:03:26,959 Speaker 4: buy maybe a Trump hat, maybe they went out and 61 00:03:28,440 --> 00:03:33,320 Speaker 4: they went to a rally, then you should be tracked 62 00:03:33,400 --> 00:03:35,640 Speaker 4: and you should not be allowed to have a 63 00:03:35,680 --> 00:03:39,160 Speaker 4: banking relationship, or you should pay, right? Or 64 00:03:39,360 --> 00:03:42,240 Speaker 4: if you went to Cabela's and bought a gun, which is 65 00:03:42,440 --> 00:03:46,280 Speaker 4: legally and lawfully allowed, you can only have a certain 66 00:03:46,320 --> 00:03:50,320 Speaker 4: percentage of those people in your banking portfolio. So 67 00:03:50,480 --> 00:03:54,400 Speaker 4: like, they didn't like meat and poultry, and so the 68 00:03:55,240 --> 00:03:59,160 Speaker 4: meat and poultry producers of this country, they were being debanked. 69 00:03:59,200 --> 00:04:00,880 Speaker 2: They couldn't use the banking system. 70 00:04:01,240 --> 00:04:04,960 Speaker 4: Melania and Barron Trump were pushed out of the banking system. 71 00:04:05,560 --> 00:04:07,400 Speaker 2: So it's pretty scary what they're doing. 72 00:04:08,440 --> 00:04:12,080 Speaker 1: You were in the US House of Representatives for almost a decade. 73 00:04:12,480 --> 00:04:16,600 Speaker 3: Is there anything that Congress can do to stop this? 74 00:04:17,520 --> 00:04:17,919 Speaker 2: Yeah? 75 00:04:18,000 --> 00:04:22,080 Speaker 4: Yeah, absolutely, because first of all, the best disinfectant is 76 00:04:22,120 --> 00:04:25,120 Speaker 4: to expose it. And Donald Trump has done a masterful 77 00:04:25,160 --> 00:04:27,240 Speaker 4: job of starting to tear this down.
I think there's 78 00:04:27,240 --> 00:04:30,400 Speaker 4: a reason why, literally, the Federal Reserve just got rid 79 00:04:30,400 --> 00:04:31,360 Speaker 4: of this requirement. 80 00:04:31,920 --> 00:04:33,080 Speaker 2: But we have to know about this. 81 00:04:33,120 --> 00:04:37,159 Speaker 4: For instance, in the state of Florida, when your sixteen- 82 00:04:37,240 --> 00:04:38,960 Speaker 4: year-old — maybe you have a sixteen-year-old daughter who 83 00:04:39,000 --> 00:04:41,040 Speaker 4: went and got a driver's license for the first time. 84 00:04:41,560 --> 00:04:44,680 Speaker 4: Did people know that when she got that driver's license, 85 00:04:44,760 --> 00:04:47,920 Speaker 4: she put in her height, her weight, her hair color, 86 00:04:48,120 --> 00:04:50,599 Speaker 4: took a picture of her, that the state of Florida 87 00:04:50,760 --> 00:04:54,320 Speaker 4: sold that information? They actually profited from it. You paid to 88 00:04:54,320 --> 00:04:57,760 Speaker 4: get the driver's license. Then they sold it. Well, when 89 00:04:57,839 --> 00:05:01,279 Speaker 4: things like this get exposed, people say, wait, what? Who 90 00:05:01,279 --> 00:05:04,160 Speaker 4: did you sell it to? And did you know that 91 00:05:04,640 --> 00:05:07,400 Speaker 4: there's a company out there, for instance, called Clearview AI. 92 00:05:07,680 --> 00:05:11,240 Speaker 4: I'm not saying they did anything illegal, but they have 93 00:05:11,760 --> 00:05:15,320 Speaker 4: billions of photos. So when you walk down the street, 94 00:05:16,120 --> 00:05:19,559 Speaker 4: or you walk into a shop, or you post something 95 00:05:19,600 --> 00:05:23,279 Speaker 4: on Instagram or Snapchat, or maybe you just, you know, 96 00:05:23,400 --> 00:05:26,160 Speaker 4: post a profile picture on Facebook, did you know that's 97 00:05:26,200 --> 00:05:29,680 Speaker 4: all being what they call scraped and built into a profile.
98 00:05:30,520 --> 00:05:33,080 Speaker 4: And then maybe when you get in your car and 99 00:05:33,120 --> 00:05:36,760 Speaker 4: you start driving, they're going to gauge your speed, 100 00:05:37,200 --> 00:05:40,520 Speaker 4: how often you look into the rear view mirror. 101 00:05:41,000 --> 00:05:44,920 Speaker 4: These are rolling computers. We detail how the car 102 00:05:45,000 --> 00:05:47,359 Speaker 4: companies sell this information. You know, who wants to 103 00:05:47,360 --> 00:05:51,200 Speaker 4: buy it? Insurance companies, right? That's how they... But you didn't 104 00:05:51,040 --> 00:05:52,520 Speaker 2: give permission to do that, did you? 105 00:05:52,920 --> 00:05:56,799 Speaker 4: But once you know all this stuff, then you start 106 00:05:56,839 --> 00:05:59,320 Speaker 4: to open your eyes and say, wait a second, this 107 00:05:59,400 --> 00:06:01,320 Speaker 4: is not who we are. So I think Congress has 108 00:06:01,360 --> 00:06:04,200 Speaker 4: to understand it, and then they're going to have to 109 00:06:04,200 --> 00:06:05,280 Speaker 4: put some guardrails on. 110 00:06:05,960 --> 00:06:08,160 Speaker 1: Well, as somebody who has a fifteen-year-old daughter 111 00:06:08,160 --> 00:06:10,560 Speaker 1: who did just get her learner's permit in the state 112 00:06:10,600 --> 00:06:13,360 Speaker 1: of Florida, it makes me feel kind of helpless that 113 00:06:13,839 --> 00:06:17,080 Speaker 1: this goes on and I can't do anything about it. 114 00:06:17,120 --> 00:06:19,360 Speaker 1: Is there anything we can do on a state level? 115 00:06:20,320 --> 00:06:22,200 Speaker 2: Yeah, you have to expose it. 116 00:06:22,240 --> 00:06:25,320 Speaker 4: And I think Florida ultimately has done 117 00:06:26,040 --> 00:06:29,160 Speaker 4: the right thing. Most people don't understand that about half 118 00:06:29,200 --> 00:06:30,080 Speaker 4: of our states.
119 00:06:30,440 --> 00:06:33,720 Speaker 2: They also trade all this information to the FBI. 120 00:06:33,880 --> 00:06:37,560 Speaker 4: So you're suddenly in a database and you just think, 121 00:06:37,720 --> 00:06:40,600 Speaker 4: well, why? And then you think, well, wait, they haven't 122 00:06:40,600 --> 00:06:45,360 Speaker 4: been able to keep this data clean, right? Yeah, they 123 00:06:45,360 --> 00:06:50,120 Speaker 4: haven't been able to protect it. So the amount of 124 00:06:50,200 --> 00:06:53,680 Speaker 4: data that's out there is so massive and so big, 125 00:06:53,760 --> 00:06:56,720 Speaker 4: and there's really no way to pull it back. And 126 00:06:59,160 --> 00:07:01,480 Speaker 4: think of it this way: the government knows almost everything 127 00:07:01,480 --> 00:07:04,040 Speaker 4: about us, and we know very little about the government. 128 00:07:04,360 --> 00:07:04,600 Speaker 1: Right. 129 00:07:05,080 --> 00:07:07,120 Speaker 2: That's so upside down. Right, we're supposed to be the 130 00:07:07,279 --> 00:07:11,800 Speaker 2: private citizens, they're supposed to be the public. But it's 131 00:07:11,840 --> 00:07:13,560 Speaker 2: the opposite, and that's what's wrong. 132 00:07:13,560 --> 00:07:15,400 Speaker 4: That's why I wrote the book They're Coming for You, 133 00:07:15,520 --> 00:07:18,920 Speaker 4: because they are going to use it to manipulate you. 134 00:07:19,120 --> 00:07:21,760 Speaker 4: And we talk about how they use it to manipulate elections, 135 00:07:22,280 --> 00:07:26,520 Speaker 4: voter turnout, all of these things. Who gets benefits, who doesn't? 136 00:07:26,560 --> 00:07:29,040 Speaker 4: Who gets to do banking, who doesn't? Who gets to 137 00:07:29,080 --> 00:07:30,720 Speaker 4: see what material, who doesn't? 138 00:07:30,800 --> 00:07:32,720 Speaker 1: We're going to take a quick break and be right 139 00:07:32,760 --> 00:07:35,480 Speaker 1: back on the Karol Markowicz Show.
Were there any surprises 140 00:07:35,720 --> 00:07:37,080 Speaker 1: in writing They're Coming for You? 141 00:07:37,160 --> 00:07:39,119 Speaker 3: Did you find anything you didn't expect to find? 142 00:07:39,280 --> 00:07:43,680 Speaker 4: How big the data brokerage business is, and how pervasive and 143 00:07:44,080 --> 00:07:46,320 Speaker 4: incestuous it is with our federal government. 144 00:07:46,640 --> 00:07:48,120 Speaker 2: Days after Joe Biden 145 00:07:47,840 --> 00:07:50,840 Speaker 4: took office, he signed this, like, five hundred thousand dollar 146 00:07:50,840 --> 00:07:54,880 Speaker 4: contract with Clearview AI. Well, why was that of such 147 00:07:54,920 --> 00:07:58,200 Speaker 4: importance to the White House, to have this contract, 148 00:07:58,200 --> 00:08:01,360 Speaker 4: to have all this facial recognition? Because they wanted to 149 00:08:01,360 --> 00:08:03,320 Speaker 4: get a picture of everybody that was going to a 150 00:08:03,360 --> 00:08:06,480 Speaker 4: Trump rally, everybody that showed up, you know, at 151 00:08:06,480 --> 00:08:11,560 Speaker 4: certain places. Everybody. Unbelievable. And that's where it started, really, 152 00:08:11,560 --> 00:08:16,160 Speaker 4: in the Obama administration, with Operation Choke Point — that's 153 00:08:16,200 --> 00:08:18,120 Speaker 4: what they called it. They thought they could choke off 154 00:08:18,200 --> 00:08:20,960 Speaker 4: guns in this country by blocking people who owned 155 00:08:21,000 --> 00:08:24,200 Speaker 4: guns or wanted to buy ammunition from being able to 156 00:08:24,200 --> 00:08:28,040 Speaker 4: have a banking relationship. So that really shocked me. 157 00:08:28,080 --> 00:08:29,600 Speaker 2: And best I can tell, 158 00:08:29,600 --> 00:08:32,679 Speaker 4: it's about a three hundred and fifty billion dollar industry 159 00:08:32,760 --> 00:08:37,559 Speaker 4: in data brokerages, and that's growing exponentially.
160 00:08:38,320 --> 00:08:39,920 Speaker 3: Did you enjoy your time in the House? 161 00:08:40,400 --> 00:08:40,559 Speaker 1: Oh, 162 00:08:40,600 --> 00:08:41,040 Speaker 2: I loved it. 163 00:08:42,600 --> 00:08:45,680 Speaker 1: Your handle on Twitter/X is Jason in the House. 164 00:08:45,840 --> 00:08:47,880 Speaker 1: I always still think of you as in the House. 165 00:08:49,040 --> 00:08:54,400 Speaker 4: But that's because nobody can pronounce Chaffetz, right. So you know, 166 00:08:56,000 --> 00:09:00,000 Speaker 4: I love policy. I love digging my fingernails 167 00:09:00,280 --> 00:09:04,400 Speaker 4: deep into public policy and exposing it, because what I find 168 00:09:04,480 --> 00:09:09,640 Speaker 4: is most people are like, wait, what? Yeah, we would 169 00:09:09,679 --> 00:09:10,360 Speaker 4: never have done it. 170 00:09:10,960 --> 00:09:12,200 Speaker 2: And I love that part of it. 171 00:09:12,480 --> 00:09:15,360 Speaker 1: Right, but maybe actually you're being even more effective on 172 00:09:15,400 --> 00:09:18,200 Speaker 1: the outside by writing books like this, because, you know, 173 00:09:18,320 --> 00:09:21,360 Speaker 1: Congress is a little slow to action. 174 00:09:21,520 --> 00:09:22,959 Speaker 2: You know? You think? Yeah. 175 00:09:22,800 --> 00:09:25,840 Speaker 3: They do — they don't really move that quickly. But, you 176 00:09:25,840 --> 00:09:26,400 Speaker 3: know, your 177 00:09:26,280 --> 00:09:30,440 Speaker 1: book, They're Coming for You — there's really no way to 178 00:09:30,480 --> 00:09:34,920 Speaker 1: misunderstand it or to not want to take action once 179 00:09:34,960 --> 00:09:36,880 Speaker 1: you read it. Do you feel like that's true? 180 00:09:37,240 --> 00:09:40,160 Speaker 4: Yeah, look, we have four hundred and fifty references. It 181 00:09:40,200 --> 00:09:42,839 Speaker 4: took me a year and a half plus to put all 182 00:09:42,880 --> 00:09:43,480 Speaker 4: this together.
183 00:09:43,800 --> 00:09:44,160 Speaker 3: Wow. 184 00:09:44,440 --> 00:09:47,760 Speaker 4: And then the other thing is people with kids. If 185 00:09:47,760 --> 00:09:50,319 Speaker 4: you want to understand, like, how the big movie studios 186 00:09:50,440 --> 00:09:56,360 Speaker 4: manipulate your kids — how, in California in particular, they 187 00:09:56,440 --> 00:09:59,000 Speaker 4: hand out free computers. Oh look, how good we are. 188 00:09:59,120 --> 00:10:01,960 Speaker 4: We're so nice. We hand out all these computers. That's 189 00:10:01,960 --> 00:10:04,560 Speaker 4: how they can monitor you twenty-four seven. And 190 00:10:04,640 --> 00:10:07,679 Speaker 4: they're taking that data and then they're selling it. 191 00:10:07,880 --> 00:10:11,199 Speaker 4: That's what's crazy about it. They're monetizing it and selling it. 192 00:10:12,080 --> 00:10:14,920 Speaker 4: And it just makes me mad. And this is the 193 00:10:14,960 --> 00:10:16,840 Speaker 4: best way I can find to kind of put it 194 00:10:16,880 --> 00:10:19,760 Speaker 4: all together. So somebody can either listen to the audiobook 195 00:10:19,840 --> 00:10:22,640 Speaker 4: or read it for themselves and say, I've got to 196 00:10:22,720 --> 00:10:24,440 Speaker 4: do something about that. I got to make sure my 197 00:10:25,240 --> 00:10:28,839 Speaker 4: local city doesn't do this or my state doesn't do that. 198 00:10:29,640 --> 00:10:32,080 Speaker 3: So a question that I ask all of my guests 199 00:10:32,360 --> 00:10:34,280 Speaker 3: is, what do you worry about? 200 00:10:34,960 --> 00:10:39,800 Speaker 2: It is this. It really is, I think, this generation. 201 00:10:39,840 --> 00:10:42,800 Speaker 4: I have six grandkids at this point. My wife and I are blessed — yes, 202 00:10:42,800 --> 00:10:43,480 Speaker 4: we're blessed. 203 00:10:43,520 --> 00:10:44,400 Speaker 2: We've got three kids.
204 00:10:46,040 --> 00:10:48,480 Speaker 4: Yeah, we have six grandkids, and I look at those 205 00:10:48,480 --> 00:10:51,640 Speaker 4: little kids and we're taking videos and pictures, and you know, 206 00:10:52,360 --> 00:10:54,239 Speaker 4: because the phone is so great. 207 00:10:54,000 --> 00:10:56,160 Speaker 2: I can always capture it. But they're going to grow 208 00:10:56,200 --> 00:10:58,040 Speaker 2: up in a world that's so weird, right. 209 00:10:58,040 --> 00:11:02,520 Speaker 4: Artificial intelligence, deep fakes. And I just worry about 210 00:11:02,559 --> 00:11:05,560 Speaker 4: how bizarre the world's going to get in really 211 00:11:05,600 --> 00:11:08,199 Speaker 4: a short while, the next five to ten years. It's 212 00:11:08,600 --> 00:11:11,400 Speaker 4: really going to be a very different place, and not 213 00:11:11,480 --> 00:11:12,880 Speaker 4: one we're going to want to live in if we're 214 00:11:12,880 --> 00:11:13,400 Speaker 4: not careful. 215 00:11:13,760 --> 00:11:15,960 Speaker 3: But I wonder if they're going to see it as weird. 216 00:11:16,320 --> 00:11:19,640 Speaker 1: My kids already, you know, just their technological use is 217 00:11:19,640 --> 00:11:22,040 Speaker 1: so different from mine. 218 00:11:22,480 --> 00:11:24,360 Speaker 3: Will they see all of this as odd? 219 00:11:24,440 --> 00:11:26,520 Speaker 1: Or will they be able to spot, for example, deep 220 00:11:26,559 --> 00:11:27,559 Speaker 1: fakes very easily? 221 00:11:28,440 --> 00:11:31,120 Speaker 4: It's just there. It's just the way they're going to 222 00:11:31,160 --> 00:11:33,000 Speaker 4: grow up. But what are they going to do for jobs? 223 00:11:33,800 --> 00:11:36,160 Speaker 4: That's a whole nother thing. Yeah, we have a three- 224 00:11:36,160 --> 00:11:38,200 Speaker 4: and-a-half-year-old granddaughter.
She knows how to 225 00:11:38,200 --> 00:11:41,719 Speaker 4: grab her mom's phone, hit the four buttons, and then 226 00:11:41,800 --> 00:11:46,199 Speaker 4: hit FaceTime video. Wow. And once a day I'll get 227 00:11:46,240 --> 00:11:49,080 Speaker 4: a call and I just know that that little girl 228 00:11:49,200 --> 00:11:50,720 Speaker 4: is going to be calling me, and of course I'm 229 00:11:50,720 --> 00:11:53,120 Speaker 4: going to answer it. But yeah, just think — 230 00:11:53,120 --> 00:11:55,480 Speaker 4: that's just how she's grown up. 231 00:11:55,800 --> 00:11:59,280 Speaker 1: What advice would you give your sixteen-year-old self, 232 00:11:59,559 --> 00:12:02,160 Speaker 1: having to, you know, kind of think about doing it 233 00:12:02,200 --> 00:12:02,880 Speaker 1: all over again? 234 00:12:04,320 --> 00:12:07,040 Speaker 4: Well, if I was living right now, I'd say put 235 00:12:07,080 --> 00:12:10,280 Speaker 4: down the phone and have those experiences. And my sixteen- 236 00:12:10,360 --> 00:12:12,920 Speaker 4: year-old self, who didn't have mobile phones when he 237 00:12:13,000 --> 00:12:16,360 Speaker 4: was growing up — I just cherished 238 00:12:16,480 --> 00:12:21,960 Speaker 4: those outdoor experiences. Get out there, go do whatever, be safe, 239 00:12:22,000 --> 00:12:26,240 Speaker 4: be smart. But I felt like I tried a lot 240 00:12:26,280 --> 00:12:29,440 Speaker 4: of things. I would have tried even more things, like — 241 00:12:30,120 --> 00:12:31,480 Speaker 4: I wish I had done more surfing. 242 00:12:31,520 --> 00:12:33,320 Speaker 2: I wish I had learned how to golf. I wish 243 00:12:33,400 --> 00:12:34,599 Speaker 2: I had done... 244 00:12:34,480 --> 00:12:38,160 Speaker 4: Like, there's so many. But the things that I did 245 00:12:38,200 --> 00:12:41,040 Speaker 4: try, they changed my life.
And I would just 246 00:12:41,040 --> 00:12:43,240 Speaker 4: turn up the volume on that, if you will, 247 00:12:43,720 --> 00:12:44,640 Speaker 4: to do even more. 248 00:12:45,600 --> 00:12:46,120 Speaker 3: I love that. 249 00:12:46,280 --> 00:12:49,400 Speaker 1: I really feel like people are living too much online 250 00:12:49,480 --> 00:12:50,760 Speaker 1: and on the screens. 251 00:12:50,559 --> 00:12:52,199 Speaker 3: It's, you know — this 252 00:12:52,280 --> 00:12:55,640 Speaker 1: is a show largely about living better and improving 253 00:12:55,679 --> 00:12:57,559 Speaker 1: your life, and it's one of the things people say 254 00:12:57,559 --> 00:12:59,240 Speaker 1: all the time: I wish I could put down my phone, 255 00:12:59,240 --> 00:13:01,240 Speaker 1: I wish I could get off the screens. And then 256 00:13:01,280 --> 00:13:02,080 Speaker 1: they don't do it. 257 00:13:03,160 --> 00:13:05,000 Speaker 3: I think that's really a 258 00:13:04,920 --> 00:13:09,400 Speaker 1: good lesson for people. Well, I love this conversation. The 259 00:13:09,440 --> 00:13:11,640 Speaker 1: book is called They're Coming for You. You can buy 260 00:13:11,679 --> 00:13:15,080 Speaker 1: it anywhere books are sold. Jason, end us here with 261 00:13:15,440 --> 00:13:18,120 Speaker 1: your best tip for my listeners on how they can 262 00:13:18,120 --> 00:13:19,080 Speaker 1: improve their lives. 263 00:13:19,960 --> 00:13:24,040 Speaker 4: Love your neighbor, love yourself, and love your family. 264 00:13:24,200 --> 00:13:26,240 Speaker 4: It's okay to just pick up the phone, or go 265 00:13:26,360 --> 00:13:28,720 Speaker 4: drop by and do the old-fashioned thing and knock 266 00:13:28,760 --> 00:13:31,160 Speaker 4: on their door. And when you walk by your neighbor, 267 00:13:31,360 --> 00:13:34,800 Speaker 4: you know, say hello. It just makes a world of difference.
268 00:13:35,120 --> 00:13:38,800 Speaker 4: And just being sincere in your heart — I 269 00:13:39,120 --> 00:13:41,440 Speaker 4: saw a neighbor the other day, hadn't 270 00:13:41,559 --> 00:13:43,640 Speaker 2: seen him in a while, and it was just great. 271 00:13:43,960 --> 00:13:47,640 Speaker 4: And it's because I stopped and paused. And I don't 272 00:13:47,679 --> 00:13:51,640 Speaker 4: regret those ten minutes at all. In fact, I cherished them. 273 00:13:51,920 --> 00:13:55,319 Speaker 4: And I think we, as a community, as 274 00:13:55,360 --> 00:13:57,800 Speaker 4: a people, as a nation, we just 275 00:13:57,720 --> 00:13:58,600 Speaker 2: need to do more of that. 276 00:13:58,720 --> 00:13:59,559 Speaker 3: I love that. He is 277 00:13:59,679 --> 00:14:03,439 Speaker 1: Jason Chaffetz. Buy They're Coming for You everywhere books are sold. 278 00:14:03,559 --> 00:14:04,760 Speaker 3: Thank you so much for coming on. 279 00:14:05,120 --> 00:14:05,439 Speaker 2: Thank you. 280 00:14:07,160 --> 00:14:09,120 Speaker 4: Mm-hmm.