Speaker 1: Welcome to the Tudor Dixon Podcast. So I just want to ask if any of you have ever felt like there was somebody watching you, watching what you're doing in your house. I know we all had those baby monitors when our kids were little. I had baby monitors all over the house because I had four kids, and there was a part of me that was like, is it just me that is watching on these baby monitors? Well, we have Attorney General Mike Hilgers with us today to tell us, no, it's not. It's not just you and your family. Welcome to the podcast.

Speaker 2: Great to be here. Thanks for having me, Tudor.

Speaker 1: So tell us what is going on. Honestly, I heard the story and I'm like, okay, the Nebraska AG is on top of what's happening. China is now in our kids' bedrooms watching us. And I got to tell you, I heard this and I said, what are they getting out of this? I mean, these are some of my worst moments, in the middle of the night walking into my kids' room and being like, why are you awake again?

Speaker 2: Well, it's a good question.
Speaker 3: Although I would tell you China is probably number one in the world in terms of using surveillance of their citizens for evil purposes. So I can't necessarily put myself in their shoes, but I do know that it's not going to be good.

Speaker 1: I'm glad you can't put yourself in their shoes. That actually, that was comforting.

Speaker 3: Yeah, well, I'm in the four-kid club like you, and I know those days. And so sometimes you think, what can they get? But the truth is those monitors can actually capture some of the most intimate and private moments of our lives. If you actually have them across, you know, millions of people in the United States, some of whom have very prominent jobs, have access to really critical infrastructure, it could actually be more information than maybe we would think. But this is actually part of a broader theme, I think, Tudor, where you have Chinese-based companies that are selling consumer products or services. So in this case it's this baby monitor we'll talk about.
In other cases it's services like Temu or even TikTok, where they are telling American consumers one thing, that these are safe, that they're secure and private, when in fact they are anything but. And so in this case, we have a company by the name of Lorex. They sell these baby monitors and other surveillance equipment, by the way, so also other home surveillance. If you have something that's monitoring the external part of your house, maybe your backyard, they make those kinds of products. And they tell you online that they are safe and they're secure, and they take every step that they can to keep your information private. And that, by the way, that's great, and a lot of home security companies do the same thing. The problem is that this company uses, for its hardware and software back end, Chinese hardware manufactured by a Chinese company that we now know has significant ties to the Chinese military and is on the FCC list as having enormous back-end security issues. And so it really creates sort of a gaping privacy loophole for both hackers and also the Chinese government.
Speaker 1: Well, I've always been nervous about having any type of camera in my home for this exact reason, and the Ring doorbells and all of that. But then you see what happened just a few weeks ago. It's almost as if the Ring doorbells or the security cameras in the neighborhood around that Utah university where Charlie Kirk was assassinated caught the assassin on camera. So what is the balance here?

Speaker 3: Well, that's a really good question. I think when you talk to, at least, Nebraska consumers, or if I talk to my wife, who's always a good barometer on these things, I think people are decently comfortable. I mean, they don't maybe necessarily love that you can get information that sort of shows something indirectly about you or your life. So think about some of the information on cell phones, or maybe even a Ring camera. You know, a Ring camera gets part of a public street, so whatever's on the public street, maybe that's fair game. Or maybe you can track a little bit of my movements. I don't love that. I don't think that's right.
Speaker 3: But you know, I think we're willing to accept it if our phone is going to be tapped into a wireless system and we're getting some benefit. But I think the difference here, and I think this is where it's unbalanced, Tudor, is these are some of the most private things that you can have. This is in your home: private conversations you're having with your spouse, your loved one, your kids, maybe your good friends who are coming over, maybe work colleagues. And even though I can't see what the Chinese can get out of me when I'm sort of half awake at two in the morning trying to put my daughter back to sleep, it's still sort of a breach of privacy that is pretty alarming. And when you have that kind of information flowing, in the aggregate, maybe millions of people, flowing to our greatest geopolitical adversary, I don't know where the balance is, but wherever the line is drawn, I think this is so far over it as to be very objectionable.
Speaker 1: Well, you've watched China come into the United States, and especially here in Michigan. We've seen they've sent Chinese nationals to our universities who have brought some potentially biohazardous materials into the university. We've seen they've tried to buy property in the state of Michigan right outside of one of our military bases. We've actually caught students who were at the university outside spying on one of our military units that was training soldiers to protect Taiwan. So when we think about China, it is very concerning that they have any type of insight into American life. But you did say they see things differently; they are monitoring for different reasons. I think that's something that the American people need to understand: having this in your home, you have to be aware of who is watching, because it is not just the conversations that you're having or your interactions with your baby. It is what American life is like. Because they're propagandists. They're trying to get into the minds of the next generation and change America fundamentally.

Speaker 2: I think that's exactly right.
Speaker 3: I think we've been lulled to sleep a little bit, say, since the end of the Cold War. I was born Gen X, late seventies. You know, in the eighties we had a significant adversary, the Soviet Union. Everyone sort of understood that we were in this geopolitical race that we had to win or maybe really bad things were going to happen. I think in the last twenty or thirty years we've been lulled asleep a little bit. We were the world's only superpower. We haven't had this sort of significant geopolitical adversary that both had the will and potentially the ability to really go after the United States in this kind of way. And I think we've also been lulled asleep by a lot of our consumer companies selling us all these products and having a flood of cheap Chinese goods over the last twenty years. I think people have generally sort of looked at these as, yeah, maybe there's something bad happening, maybe it's having some impact on our manufacturing base, or maybe...
You know, maybe there's a little bit of information there. But I think we're looking through the wrong lens. It's a lens that's outdated. It's a lens that basically assumes that the people who are selling us products, who are competing with us, have more or less decent intentions. And I think what is very clear over the last several years, but very recently in particular, is that China does not have the best of intentions when it comes to the United States. And I think viewing it through that new lens, Tudor, hopefully it's a wake-up call to Americans: we are in a significant competition, and they're using every resource that they can, from missiles and guns and cyberattacks, or the ability to conduct cyberattacks, to espionage, to maybe these types of things done at scale, that we just need to be on guard for and need to push back against.

Speaker 1: That's something that we haven't really considered, the knowledge of the American home and the conversations that go along with it.
Speaker 1: And then you add in companies like TikTok into that, and we know that the President just signed an executive order on TikTok. The administration has come out and said TikTok will be safe. Now, I think we are still concerned about whether or not TikTok can actually be safe. I think that your personal data in TikTok could be safe from now on. But something you're saying is making me think: if I had somebody in my home listening to our conversations, you can kind of listen to the political battle and get into the culture war. I mean, just the other day we had someone at our house who was saying, how could you possibly be supportive of the Trump administration? Kash Patel was just a podcaster and now he's running the FBI. What are you talking about? That's not even true, not even close to being true. And yet this is what is seeping in through apps like TikTok, like Reddit, those types of places where misinformation, for lack of a better term, or just lies, propaganda I would say, is going into the minds of young Americans.
But all ages, all ages. I've had my mom come over to my house and say, you know, I saw this video, this is terrible. I'm like, that is not real. On both sides there's just an infiltration of propaganda into the United States. There are no better propagandists in the world than communists. That's how they get into power. That's what they do. That's the whole platform that they run on: to get into the minds of people with their propaganda. If you have a Chinese company, and we know Chinese companies are connected to the CCP, the CCP wants to know how to cause chaos in the United States. They want to be the world power. They want to take over. If they can get to the younger generation, or if they can even just split Americans. And you've seen a great split, a great divide in America right now.
I mean, honestly, I was reading the notes for this podcast, and I said, if this guy is a Republican, the minute he says this, you know, there are going to be all these liberal women that are going to put cameras all over their house and be like, here's what I'm doing. You know, that's just how it is. That's how divided we are. I watch these women downing Tylenol on camera; like, the next thing you know, they're going to be like, here's my bank account, look in the camera. You know, this is how divided we are. But that's how China works. And I don't think people... I think you're so spot on when you said that. I'm the same age as you. It was exactly the same story. It was like, the Soviet Union is bad. We all knew it was bad. It was in our schools; we were told about it. Now the propaganda is so strong, it's even in our elementary schools, in our high schools. Nobody is talking about the fact that China is a threat. And when they do say China is the biggest threat to the United States, the media does not talk about it.
Speaker 3: Boy, Tudor, I could not agree more with everything you just said. I mean, part of this conversation is about what kind of information China can get from us. But the other part of it, to your point, is what kind of information are they pushing to us? And let me give you a very, very alarming example, and that is TikTok. You sort of touched on TikTok for the misinformation, which is absolutely true. But let me give you another part of what TikTok has done, which is really poison our youth. And this is actually part of another lawsuit that we have filed. If you sign up as a ten-year-old boy or girl on TikTok, and we had investigators do this, they signed up these accounts, the algorithm was so powerful. Within minutes. We had an example of an account for a twelve-year-old girl. Within minutes, Tudor, that account, without doing any searching or anything else, was getting videos. They were pretty poisonous.
They had to do with pornography, glorifying drug use, suicidal ideation, how to hide suicide from your parents, all these things that we know are driving mental health issues for our young people. And if you go to China, and we've looked at TikTok in China, actually none of that's available. You can't use it during the school day. And they push videos, really, videos about patriotism and productivity and working hard and positivity, all these things. And so it's really a national security issue for the reasons you just said, which is flooding this information that's really either poisoning our discourse, and I saw the same Tylenol videos that you saw, just sort of mind-boggling, or really influencing our youth in a way that we have to wake up to and we have to fight back against.

Speaker 2: I couldn't agree more.

Speaker 1: Let's take a quick commercial break. We'll continue next on the Tudor Dixon Podcast.
We had a state senator who is running for US Senate post a video on Twitter, or she posted a post on Twitter, or X, whatever we call it now, saying if you have to take Tylenol during your pregnancy, you should take it as recommended, it's safe, don't listen to the Trump administration. And I think the boldness... imagine if the Biden administration had come out. I mean, we saw the Biden administration come out and say kids should take vaccines even though there was no... I mean, now you see the experts saying these kids don't have to have the COVID vaccine. The minute the Biden administration said newborns should be getting the COVID vaccine, they were all about it. But the minute you have Harvard studies, studies from Johns Hopkins, saying, you know what, this needs to be looked at, don't take too much Tylenol when you're pregnant, you've got a state senator with no medical degree coming out and just parroting anything on the other side. And I do believe that this is where China thrives. They have gotten into the minds of these people.
I mean, I was just on Newsmax, and I was watching, they had this woman who was Colin Firth's ex-wife, whoever that is, you know, we don't know who she is, but she's making news because she received an award from the government in the UK and she's ripping it up because they had Trump over there. I mean, this is definitely because they have had this stuff fed to them. And that's because I do believe that as you are trying to take over another country, as you are trying to cause chaos, the best way of understanding that country is to be in the homes of the people, to learn how they interact, to hear the families talking to one another, to be there for their early mornings, for bedtime. Those conversations are had right in front of those monitors, right in front of the Alexa, right in front of the Ring cam. You know, all of those conversations. I mean, you've got kids in bed, you go out on the front porch to have a business conversation, you're right in front of your camera.
I just think we don't even think about the fact that those conversations give China such an insight into how to divide, how to get into the minds of Americans. Because they're in our deepest conversations, they're in our minds. They know how to manipulate our minds.

Speaker 2: Boy, I couldn't agree more.

Speaker 3: Let me give you two points, one on your point about saying listen to the FDA. There's a through line on the left where they try to destroy institutions that are issuing opinions that they don't agree with. And you've seen it maybe most prominently with the Supreme Court. Once there was a conservative majority on the court, what did they do? They've tried, you know, Senator Whitehouse and others have tried to slowly delegitimize the Court.
And I think when you take out these institutions, like they're trying to do with the FDA, that has a very long-term, significant harm that I think we're going to have to wrestle with for a long time to come, and it really weakens our ability, from a national security perspective or internally, to really be nimble and respond to big problems. The other thing, to layer on to your point about looking at the cameras: I think people don't fully appreciate the technological tools that are now available to nation states, especially in the AI realm. So there's probably not someone, I hope not, sitting in China looking at my baby monitor and watching what Mike is seeing.

Speaker 1: And she's breastfeeding.
Speaker 3: But it is certainly true that they have the capability to sort of aggregate millions of data points across their entire ecosystem, whatever they're pulling from: whether it's Temu technology, where they can get access to your text messages and your microphone; or Lorex, where they can actually get access to some of your surveillance videos and the audio on your home security cameras; or TikTok. And being able to have the technological tools to pull all that information together, aggregate it, both understand it and then manipulate it, to your point. I think sometimes people fail to appreciate how far the technological advances have gone and what they can do. So it kind of changes the conversation a little bit when you're like, wow, what's the big deal of them seeing me at two in the morning?
Well, actually, it's really a significant deal 337 00:16:48,000 --> 00:16:51,480 Speaker 3: that they're able to see the patterns and trends 338 00:16:51,520 --> 00:16:53,760 Speaker 3: of Americans throughout their day-to-day lives throughout the 339 00:16:53,760 --> 00:16:55,400 Speaker 3: country, and to use it for the kind of thing that 340 00:16:55,440 --> 00:16:56,640 Speaker 3: you're describing. 341 00:16:57,480 --> 00:17:00,600 Speaker 1: So how does this even work? I mean, I 342 00:17:00,640 --> 00:17:03,840 Speaker 1: also feel like when I'm on my cell phone, anybody 343 00:17:03,880 --> 00:17:07,000 Speaker 1: can be listening, and I think that was the concern 344 00:17:07,440 --> 00:17:12,119 Speaker 1: when everybody was saying TikTok is such a dangerous app. 345 00:17:12,480 --> 00:17:16,240 Speaker 1: That's my understanding. And as you said, I don't know 346 00:17:16,320 --> 00:17:20,440 Speaker 1: how this technology works. It's so far beyond what our 347 00:17:20,520 --> 00:17:24,320 Speaker 1: generation had growing up. I mean, I remember my mom 348 00:17:24,440 --> 00:17:26,600 Speaker 1: picking up the phone in the house and saying, get off. 349 00:17:26,640 --> 00:17:29,040 Speaker 1: That was the technology we had, you know. It's like, 350 00:17:29,640 --> 00:17:32,320 Speaker 1: this is so far beyond. So I don't know what 351 00:17:32,440 --> 00:17:35,000 Speaker 1: to think. And when I hear, if you have TikTok, 352 00:17:35,040 --> 00:17:36,640 Speaker 1: it can go through your phone and it can hear 353 00:17:36,640 --> 00:17:41,960 Speaker 1: your conversations, I get concerned about that. But how many 354 00:17:42,040 --> 00:17:45,080 Speaker 1: other apps do I download on a daily basis that 355 00:17:45,440 --> 00:17:47,720 Speaker 1: are not made in the United States? 356 00:17:47,880 --> 00:17:49,800 Speaker 1: I don't even really think about it.
You know, we 357 00:17:49,880 --> 00:17:52,560 Speaker 1: assume this is in my purse, it's in my house, 358 00:17:52,560 --> 00:17:56,080 Speaker 1: it's my phone, so it's safe. But how 359 00:17:56,119 --> 00:17:58,080 Speaker 1: do you protect yourself? How do you 360 00:17:58,240 --> 00:18:01,640 Speaker 1: know? I mean, you talk about these baby monitors. 361 00:18:01,640 --> 00:18:04,399 Speaker 1: As a parent, I mean, I'm not going to 362 00:18:04,480 --> 00:18:07,600 Speaker 1: have a baby monitor again because I don't have little kids. 363 00:18:07,640 --> 00:18:09,760 Speaker 1: But if you're a new parent, or you're a grandparent and 364 00:18:09,880 --> 00:18:13,240 Speaker 1: your kids are buying baby monitors, how do you find 365 00:18:13,320 --> 00:18:16,240 Speaker 1: something that's safe and still does the job? 366 00:18:17,119 --> 00:18:19,560 Speaker 3: It's a great question, and I think 367 00:18:19,840 --> 00:18:22,920 Speaker 3: education is maybe the most important way to fight back 368 00:18:22,960 --> 00:18:24,840 Speaker 3: against this, and that's partly what we're trying to do. 369 00:18:24,880 --> 00:18:28,080 Speaker 3: First, my kids are 370 00:18:28,119 --> 00:18:30,679 Speaker 3: now out of the crib years, but it 371 00:18:30,720 --> 00:18:33,280 Speaker 3: wasn't that long ago. You're very busy, you're 372 00:18:33,320 --> 00:18:36,360 Speaker 3: running around, you don't have hours every day to sort 373 00:18:36,359 --> 00:18:38,760 Speaker 3: of think through all the nuances of 374 00:18:38,760 --> 00:18:40,240 Speaker 3: every particular consumer 375 00:18:39,920 --> 00:18:41,080 Speaker 2: choice that you're going to make.
376 00:18:41,080 --> 00:18:42,800 Speaker 3: But I also think there's been an assumption in our 377 00:18:42,840 --> 00:18:45,000 Speaker 3: country of like, yeah, maybe there's some data, but the 378 00:18:45,040 --> 00:18:48,840 Speaker 3: people getting the data are probably large corporations. Maybe I like them, 379 00:18:48,840 --> 00:18:50,760 Speaker 3: maybe I don't. But what they're not doing with the data, 380 00:18:50,840 --> 00:18:53,040 Speaker 3: because of all the trial lawyers out there willing to sue them, 381 00:18:53,440 --> 00:18:55,920 Speaker 3: is taking it for some really nefarious purpose. So we 382 00:18:56,000 --> 00:18:59,800 Speaker 3: might joke about, you know, talking about a product, our iPhone 383 00:18:59,840 --> 00:19:02,240 Speaker 3: picking it up, and then getting ads for that product, 384 00:19:02,560 --> 00:19:05,200 Speaker 3: but for a lot of people that feels maybe intrusive 385 00:19:05,240 --> 00:19:07,040 Speaker 3: but somewhat benign. And I guess what we're trying to 386 00:19:07,040 --> 00:19:09,520 Speaker 3: tell people is just, maybe change your assumptions, you know. 387 00:19:09,840 --> 00:19:12,520 Speaker 3: Don't just assume everyone who's loading things on your phone 388 00:19:13,119 --> 00:19:15,120 Speaker 3: is doing so with really good intentions. 389 00:19:15,400 --> 00:19:17,119 Speaker 1: How do you look at it? I mean, you have 390 00:19:17,200 --> 00:19:21,000 Speaker 1: the App Store, and I guess there's information there that 391 00:19:21,200 --> 00:19:24,439 Speaker 1: we don't necessarily read. We just click yes, download, that 392 00:19:24,440 --> 00:19:26,280 Speaker 1: little cloud with the arrow. Yeah, I'm good.
393 00:19:27,000 --> 00:19:29,840 Speaker 3: Well, first you should subscribe to the alerts from the 394 00:19:29,840 --> 00:19:32,560 Speaker 3: Nebraska Attorney General's Office, a little plug, because we have been trying to 395 00:19:32,600 --> 00:19:34,439 Speaker 3: go after these companies, and 396 00:19:34,520 --> 00:19:38,320 Speaker 3: Temu is a 397 00:19:38,320 --> 00:19:40,919 Speaker 3: great example. When you sign up on the App Store, 398 00:19:41,040 --> 00:19:42,800 Speaker 3: it will say what kind of data it's getting, and 399 00:19:42,840 --> 00:19:45,119 Speaker 3: it's very limited what it tells you. But what we 400 00:19:45,200 --> 00:19:48,480 Speaker 3: found is once it downloads on your phone, the software 401 00:19:48,560 --> 00:19:51,439 Speaker 3: recompiles and it can get access to pretty much anything 402 00:19:51,440 --> 00:19:53,920 Speaker 3: on your phone, everything from your keystrokes to your text 403 00:19:53,920 --> 00:19:55,119 Speaker 3: messages to your audio. 404 00:19:55,240 --> 00:19:57,280 Speaker 2: So part of it is being a 405 00:19:57,200 --> 00:19:59,960 Speaker 1: Keystrokes, so they literally have your passwords. 406 00:20:00,160 --> 00:20:03,159 Speaker 3: That's what our investigation has 407 00:20:03,200 --> 00:20:05,439 Speaker 3: shown so far. Yeah, so it 408 00:20:05,480 --> 00:20:08,439 Speaker 3: can get a lot of information. So part of it 409 00:20:08,520 --> 00:20:10,480 Speaker 3: is being aware of some of these, 410 00:20:10,960 --> 00:20:13,159 Speaker 3: and maybe start with an assumption of, 411 00:20:13,240 --> 00:20:16,240 Speaker 3: well, not just trust but verify; maybe sort 412 00:20:16,280 --> 00:20:19,000 Speaker 3: of do a little diligence beforehand.
If they're domestic, if 413 00:20:19,040 --> 00:20:22,639 Speaker 3: they're local companies and they're brand names, I think generally speaking, 414 00:20:22,640 --> 00:20:25,679 Speaker 3: you're probably okay. I don't think Amazon is putting out 415 00:20:25,720 --> 00:20:27,080 Speaker 3: apps that are going to send off your information. 416 00:20:27,320 --> 00:20:29,520 Speaker 1: I think people think of Temu as a brand name. 417 00:20:29,880 --> 00:20:31,720 Speaker 2: Yeah, I think that's right, that's part of it. I 418 00:20:31,760 --> 00:20:32,320 Speaker 2: think you're right. 419 00:20:32,359 --> 00:20:34,400 Speaker 3: I think in this case we've been trying to push 420 00:20:34,400 --> 00:20:36,800 Speaker 3: back against that because of the Chinese ties. Certainly anything that 421 00:20:36,800 --> 00:20:39,560 Speaker 3: has a Chinese manufacturing base or a Chinese tie 422 00:20:39,600 --> 00:20:42,880 Speaker 3: of any kind, I would strongly recommend people not download 423 00:20:42,920 --> 00:20:44,880 Speaker 3: the app unless they've done some additional diligence. 424 00:20:44,960 --> 00:20:47,679 Speaker 1: Let's take a quick commercial break. We'll continue next on 425 00:20:47,720 --> 00:20:53,720 Speaker 1: the Tudor Dixon Podcast. You have to pay so much 426 00:20:53,760 --> 00:20:57,720 Speaker 1: attention, because even if you aren't downloading the app, some 427 00:20:57,760 --> 00:21:02,320 Speaker 1: of these places, it's the quality of their stuff, everything, 428 00:21:02,440 --> 00:21:04,879 Speaker 1: and they are getting your address, they're sending things to 429 00:21:04,960 --> 00:21:07,680 Speaker 1: your house, they're monitoring what they're sending to your house.
430 00:21:07,880 --> 00:21:10,320 Speaker 1: I say you have to be careful, because a few 431 00:21:10,359 --> 00:21:13,640 Speaker 1: years ago, my girls came home, and they 432 00:21:13,720 --> 00:21:17,080 Speaker 1: have Cash App on their phones in case 433 00:21:17,119 --> 00:21:19,440 Speaker 1: they need to get something when they're out with friends, 434 00:21:19,480 --> 00:21:22,880 Speaker 1: and they come home and they're like, oh, we ordered 435 00:21:22,880 --> 00:21:25,240 Speaker 1: all this stuff from Shein. And I was like, I 436 00:21:25,280 --> 00:21:29,200 Speaker 1: don't know this Shein. What is this? Somehow 437 00:21:29,280 --> 00:21:31,479 Speaker 1: they had gotten into this. They didn't download it, they 438 00:21:31,480 --> 00:21:34,720 Speaker 1: got into the website, they ordered all this stuff. This 439 00:21:34,960 --> 00:21:38,800 Speaker 1: was the worst quality stuff I have ever seen. But 440 00:21:38,960 --> 00:21:42,600 Speaker 1: also, it's all coming from China. And then you 441 00:21:42,640 --> 00:21:45,280 Speaker 1: start seeing these articles that there's chemicals in these clothes. 442 00:21:45,320 --> 00:21:48,240 Speaker 1: I mean, we really don't know when it's coming over 443 00:21:48,440 --> 00:21:51,239 Speaker 1: directly from the manufacturer. I think, like, you know, if 444 00:21:51,240 --> 00:21:53,320 Speaker 1: you order from J.Crew and they're getting some of 445 00:21:53,359 --> 00:21:56,280 Speaker 1: their stuff in, they have a quality control system. You 446 00:21:56,400 --> 00:21:59,560 Speaker 1: have to explain to your kids: Temu and 447 00:21:59,600 --> 00:22:02,800 Speaker 1: Shein and all these places, all these Alibaba places, 448 00:22:03,240 --> 00:22:06,919 Speaker 1: they are vendors that are not monitored by the United States.
449 00:22:06,920 --> 00:22:12,240 Speaker 1: They do not go through the same care and scrutiny 450 00:22:12,359 --> 00:22:13,640 Speaker 1: that our products go through. 451 00:22:14,920 --> 00:22:17,160 Speaker 3: That's exactly right, and in fact, some of them 452 00:22:17,160 --> 00:22:21,080 Speaker 3: can be tied to slave labor in China, in addition 453 00:22:21,119 --> 00:22:24,280 Speaker 3: to some of these national security issues that 454 00:22:24,440 --> 00:22:26,240 Speaker 3: we've been talking a little bit about. I mean, maybe 455 00:22:26,320 --> 00:22:28,800 Speaker 3: one test, Tudor, as a starting point, is that if 456 00:22:28,800 --> 00:22:31,040 Speaker 3: they existed when you and I were kids, if they're 457 00:22:31,040 --> 00:22:33,560 Speaker 3: a brand that existed in the eighties, they're fine. You know, 458 00:22:33,560 --> 00:22:35,560 Speaker 3: if you get a JCPenney app, or, you know, I 459 00:22:35,560 --> 00:22:37,600 Speaker 3: guess Sears or Kmart aren't around in the same way 460 00:22:37,600 --> 00:22:38,919 Speaker 3: as when I was younger. 461 00:22:39,080 --> 00:22:41,199 Speaker 1: Whoa, we do have a JCPenney, which I 462 00:22:41,200 --> 00:22:45,639 Speaker 1: do shop at, but our Sears died, and 463 00:22:45,720 --> 00:22:49,760 Speaker 1: our Kmart too. Nobody remembers Kmart. That's so exciting. 464 00:22:50,600 --> 00:22:51,520 Speaker 2: Hey, we're bringing it back. 465 00:22:51,560 --> 00:22:53,399 Speaker 3: But I will say this. I mean, part of 466 00:22:53,440 --> 00:22:55,600 Speaker 3: the challenge is, I mean, look, we have all the 467 00:22:55,600 --> 00:22:58,840 Speaker 3: inflation from the Biden years.
People are 468 00:22:58,960 --> 00:23:01,439 Speaker 3: being stretched and still feeling the impact of the increase 469 00:23:01,480 --> 00:23:04,520 Speaker 3: of all the costs from twenty twenty, post COVID, and 470 00:23:04,560 --> 00:23:08,560 Speaker 3: so I understand the human impulse to say, hey, this 471 00:23:08,720 --> 00:23:11,200 Speaker 3: is a little cheaper, I've got to save some money 472 00:23:11,240 --> 00:23:13,880 Speaker 3: for my family, whether it's Shein or Temu or anything else. 473 00:23:14,720 --> 00:23:16,320 Speaker 3: It can't be that bad, and I'm just going to 474 00:23:16,359 --> 00:23:18,680 Speaker 3: make this in-the-moment decision. And that's a very, 475 00:23:18,800 --> 00:23:21,680 Speaker 3: very rational and understandable thing. And I think what we 476 00:23:21,720 --> 00:23:23,159 Speaker 3: have to try to do, as much as we can, 477 00:23:23,200 --> 00:23:26,040 Speaker 3: whether it's doing podcasts like yours, using your voice and platform, 478 00:23:26,119 --> 00:23:28,679 Speaker 3: using my voice and platform, is just to try to 479 00:23:28,720 --> 00:23:33,280 Speaker 3: spread the known risks. Temu is one, Lorex is another, 480 00:23:33,640 --> 00:23:35,800 Speaker 3: and just start to educate people as much as we 481 00:23:35,880 --> 00:23:38,880 Speaker 3: can that this is actually a bigger issue, both 482 00:23:38,920 --> 00:23:40,600 Speaker 3: for them and their family, because a lot of people 483 00:23:40,640 --> 00:23:43,320 Speaker 3: don't want to be spied on. They don't want hackers 484 00:23:43,320 --> 00:23:46,120 Speaker 3: to be able to access their homes, their baby monitors, 485 00:23:46,160 --> 00:23:48,880 Speaker 3: their home security networks. But it also is a bigger 486 00:23:48,920 --> 00:23:50,760 Speaker 3: issue for the country.
And I don't know any 487 00:23:51,000 --> 00:23:53,000 Speaker 3: better way than to sue the bad guys when they're 488 00:23:53,000 --> 00:23:55,840 Speaker 3: doing wrong things and try to elevate these issues with 489 00:23:55,880 --> 00:23:57,199 Speaker 3: the voice and platform that we have. 490 00:23:57,800 --> 00:24:01,439 Speaker 1: But it's hard when you have pirates over in China, 491 00:24:01,480 --> 00:24:03,880 Speaker 1: and that's something that I think people don't understand either. 492 00:24:04,040 --> 00:24:07,119 Speaker 1: As someone who had a steel foundry, we know 493 00:24:07,240 --> 00:24:11,320 Speaker 1: that China took the majority of 494 00:24:11,400 --> 00:24:14,640 Speaker 1: that steel casting business from the United States, and then 495 00:24:14,800 --> 00:24:17,879 Speaker 1: our customers went there: oh, this is so much less expensive, 496 00:24:17,920 --> 00:24:21,800 Speaker 1: the same thing on a bigger corporate scale, so much less expensive. 497 00:24:22,080 --> 00:24:26,439 Speaker 1: And then their products were being 498 00:24:27,119 --> 00:24:31,400 Speaker 1: reverse engineered and then stolen and made over there. Now 499 00:24:31,400 --> 00:24:34,280 Speaker 1: we're seeing that same thing at some of these companies 500 00:24:34,600 --> 00:24:38,119 Speaker 1: like Temu and Shein. My daughter saw a 501 00:24:38,200 --> 00:24:41,199 Speaker 1: faith brand sweatshirt that says You Are a 502 00:24:41,280 --> 00:24:43,439 Speaker 1: Child of God on the back, and it was a 503 00:24:43,520 --> 00:24:46,919 Speaker 1: very popular sweatshirt. But in the United States, it's a 504 00:24:46,920 --> 00:24:50,160 Speaker 1: small company that makes the sweatshirts, and they're about sixty 505 00:24:50,200 --> 00:24:53,480 Speaker 1: five to seventy dollars. Her friend came over one day, 506 00:24:54,000 --> 00:24:56,800 Speaker 1: exact same sweatshirt.
She's like, I got this on Shein 507 00:24:57,240 --> 00:25:00,960 Speaker 1: for six dollars and ninety nine cents. That is all. 508 00:25:01,080 --> 00:25:04,480 Speaker 1: How do you manage that, when another country is pirating 509 00:25:04,600 --> 00:25:08,359 Speaker 1: your products and selling them like they're the exact same thing? 510 00:25:09,160 --> 00:25:10,480 Speaker 2: Well, and we saw that with Temu. 511 00:25:10,640 --> 00:25:13,360 Speaker 3: We actually found that they were selling local brands in Nebraska, 512 00:25:13,440 --> 00:25:16,160 Speaker 3: very prominent brands, ones like Runza, which you've probably never 513 00:25:16,160 --> 00:25:18,640 Speaker 3: heard of; it's a very big deal in Nebraska. Nebraska 514 00:25:18,720 --> 00:25:21,159 Speaker 3: football, Creighton Bluejays. We saw that, and it is 515 00:25:21,200 --> 00:25:22,480 Speaker 3: a little bit of a whack-a-mole, and it's 516 00:25:22,520 --> 00:25:24,600 Speaker 3: very difficult, especially for a small business. I mean, I 517 00:25:24,600 --> 00:25:26,240 Speaker 3: think if you're going to zoom out, Tudor, and 518 00:25:26,240 --> 00:25:28,080 Speaker 3: think about really how to try to solve these types 519 00:25:28,119 --> 00:25:30,560 Speaker 3: of problems: number one, I think we actually have to 520 00:25:30,600 --> 00:25:32,960 Speaker 3: do better about making sure that we're not fighting over 521 00:25:33,040 --> 00:25:36,080 Speaker 3: Tylenol during pregnancy, and the 522 00:25:36,160 --> 00:25:38,920 Speaker 3: people who are taking Tylenol 523 00:25:39,000 --> 00:25:42,320 Speaker 3: and doing TikTok videos while they're pregnant, like, actually, 524 00:25:42,320 --> 00:25:44,240 Speaker 3: maybe put that aside and start thinking about the bigger 525 00:25:44,320 --> 00:25:46,919 Speaker 3: problems that are facing our country.
Secondly, I do think 526 00:25:46,960 --> 00:25:49,760 Speaker 3: there's probably a pretty significant federal component here. I mean, 527 00:25:49,800 --> 00:25:51,919 Speaker 3: a lot of what we're talking about are import controls, 528 00:25:52,800 --> 00:25:55,600 Speaker 3: you know, import-export issues, what's coming into our country. 529 00:25:55,920 --> 00:25:58,399 Speaker 3: Also maybe some reforms of Section two thirty or 530 00:25:58,400 --> 00:25:59,720 Speaker 3: some of these other things that have given some 531 00:25:59,720 --> 00:26:02,719 Speaker 3: of these online platforms a little bit of immunity 532 00:26:02,840 --> 00:26:05,400 Speaker 3: in some of these contexts, not necessarily the selling of the products. 533 00:26:05,440 --> 00:26:06,760 Speaker 2: But that's part of this, I think. 534 00:26:06,960 --> 00:26:10,240 Speaker 3: I do think there's probably a significant 535 00:26:10,240 --> 00:26:14,400 Speaker 3: federal component to give more teeth on the import 536 00:26:14,520 --> 00:26:16,880 Speaker 3: side, to help really shut down some of these, because 537 00:26:16,880 --> 00:26:19,000 Speaker 3: once they get into the United 538 00:26:19,000 --> 00:26:21,159 Speaker 3: States and they're in the stream of commerce, it's very 539 00:26:21,200 --> 00:26:25,400 Speaker 3: difficult for anyone, including the Nebraska Attorney General, to completely 540 00:26:25,440 --> 00:26:27,000 Speaker 3: stop them. I mean, you could play whack-a-mole and 541 00:26:27,040 --> 00:26:29,160 Speaker 3: you could do some things, but to really stop the problem, 542 00:26:29,200 --> 00:26:30,760 Speaker 3: I think there's a federal component here. 543 00:26:31,240 --> 00:26:33,480 Speaker 1: All right. So you said we should follow you to 544 00:26:33,600 --> 00:26:36,240 Speaker 1: figure out what's happening in this world. So tell people 545 00:26:36,240 --> 00:26:37,199 Speaker 1: where they can follow you.
546 00:26:37,520 --> 00:26:42,040 Speaker 3: Well, probably the best place is on X. Mike Hilgers 547 00:26:42,240 --> 00:26:46,480 Speaker 3: is my X handle, X account, I don't 548 00:26:46,520 --> 00:26:48,280 Speaker 3: know what the word is now. I'm like you, 549 00:26:48,320 --> 00:26:50,520 Speaker 3: I'm still trying to... I know it's X, 550 00:26:50,880 --> 00:26:52,640 Speaker 3: and I think I still say tweet, but I don't 551 00:26:52,640 --> 00:26:54,040 Speaker 3: know. Anyway, so, Mike 552 00:26:54,000 --> 00:26:56,800 Speaker 1: Hilgers. I do say tweet. I mean, it's very hard 553 00:26:56,800 --> 00:26:57,760 Speaker 1: for us to change. 554 00:26:58,119 --> 00:27:00,520 Speaker 3: That's right, that's right. And then 555 00:27:00,560 --> 00:27:02,920 Speaker 3: the Nebraska AG's office; that's probably the best and most efficient way. 556 00:27:03,080 --> 00:27:05,399 Speaker 3: Just follow me on X, and whenever we're suing these 557 00:27:05,440 --> 00:27:08,000 Speaker 3: bad guys, we'll let people know, or we do consumer alerts, 558 00:27:08,000 --> 00:27:10,280 Speaker 3: and any consumer alert that happens in Nebraska really is 559 00:27:10,280 --> 00:27:12,359 Speaker 3: relevant to the rest of the country, because these 560 00:27:12,440 --> 00:27:14,439 Speaker 3: are national brands. 561 00:27:15,440 --> 00:27:19,399 Speaker 1: All right. Well, Nebraska Attorney General Mike Hilgers, thank you 562 00:27:19,680 --> 00:27:21,120 Speaker 1: so much for being on the show. 563 00:27:21,359 --> 00:27:23,320 Speaker 2: It's great to be on here. Thanks, Tudor, so much. 564 00:27:23,280 --> 00:27:26,760 Speaker 1: Absolutely, and thank you all for listening.
As always, 565 00:27:26,840 --> 00:27:29,840 Speaker 1: you can find this on the iHeartRadio app, Apple Podcasts, 566 00:27:29,960 --> 00:27:32,320 Speaker 1: or wherever you get your podcasts. Or you can watch 567 00:27:32,359 --> 00:27:35,560 Speaker 1: the full video; go over to Rumble or YouTube at 568 00:27:35,600 --> 00:27:38,040 Speaker 1: Tudor Dixon. And make sure you join us next time. 569 00:27:38,240 --> 00:27:41,240 Speaker 1: Have a blessed day.