Speaker 1: The second hour of Clay and Buck kicks off now. As we know, Donald Trump was sentenced this morning for the non-crime of a business records ledger issue. The whole thing, we talked about it, insane. He was sentenced to the stiff punishment of nothing, no punishment other than they can still say he is technically a convicted felon, although no serious person thinks that Donald Trump committed a felony, or any crime for that matter. I don't know what it is going to take for Democrats to finally give it up on this, but in the meantime we'll continue to follow that, and I think we'll talk to Andy McCarthy in the third hour about the whole lawfare mess. We've also got our friend Ryan Girdusky, whose podcast is doing great, a lot of you really enjoying it. He's putting on his full data nerd hat and going into everything in the world of politics, and sort of general-interest news too. He does a great job with It's a Numbers Game. But Clay, I remember the first time I came across or heard about TikTok.
Speaker 1: It was during the pandemic, actually, so I was at home and I looked something like a woolly mammoth. Did you have the full, like, no-haircut thing for a long time? I think I went four or five. I actually cut my own hair for a while, which was...
Speaker 2: At what ages?
Speaker 1: Well, during the pandemic, Clay, not as, like, a child.
Speaker 2: Oh, I thought you were, like, talking about, like, in college, like, how you were living. And, you know, like I'm saying, what, everything was shut down? You could... nothing shut down here. It shut down for, like, a month. I went and just went to get my haircut like a normal person.
Speaker 1: So unfair. I should probably share, I have a photo where I look like, I don't know, I looked like I grew up in a cave or something, or was living...
Speaker 2: You couldn't get a haircut in New York?
Speaker 1: Absolutely not, for months. Like, I mean, from whatever it was, February, you know, end of February, early March, shutdown day, to June, maybe July. So, I mean, think: what would you look like, Clay Travis, always very well coiffed, Mister Clay? What would you look like with no haircut for six months?
Speaker 2: Dude, think about it. I probably looked like I did in high school. That's why I thought I was going to get ridiculed for my... You know, my kids have my high school yearbook photo as their backdrop to make fun of me on their phones. Like, and Laura does, my wife, like, their screensaver is my senior class yearbook photo.
Speaker 1: You and my brother are the same year, my older brother, the same year in school.
Speaker 2: And I remember this. There was a time everybody was listening...
Speaker 1: To Dave Matthews, playing their Hacky Sack, wearing their Birkenstocks, and, uh, and they would, they put their hair behind their ears.
Speaker 2: That's right, super-stylish guy. That was, that was a super style. That was the, like, "I'm preppy, I play soccer and listen to Dave Matthews," and, yeah, you know, like, that was the look for a while that we had, to tuck your hair behind the ear. And so I think it would look something like that.
Speaker 2: But I still, for people who lived in red states, what you guys dealt with in New York and California is like finding out that someone was in a foreign country. Because I, we shut, my gym, Buck, my gym was back open, like, by the first week of May. Like, they shut down the gyms for, like, six weeks, which was stupid. But by May one-ish in Tennessee, if you lived in a normal place, your life was basically back to normal May 1, 2020. Schools opened back up in August, full go. Now, people still had to wear masks, it was stupid, but there was very little... we... COVID existed for, like, six weeks, and then people were like, yeah, this is stupid, like, let's kind of go back to normal.
Speaker 1: And New York City, my friend, in New York City they had metrics for reopening gyms and other things. And when we hit the metrics, after all the waiting, de Blasio, because he is a vile communist and a little dictator at heart, just said, yeah, I don't care, we're not opening them anyway.
Speaker 2: So he said you guys had to work out in masks, which is still crazy to me to think about. Like, in the gym, that you had to have a mask on. People would come up to you and yell at you on the treadmill...
Speaker 1: If you were on a treadmill without a mask. Think about that: you had to run with your mask on. It was the dumbest thing in the world. And along with this, and the danger, so stupid. Bad for you, by the way, all these fibers and stuff you're inhaling.
Speaker 2: It's horrible. Anyway.
Speaker 1: I bring it up because that was when I learned about TikTok, and I remember the first TikTok video. I'm getting a little nostalgic about TikTok because it might be RIP TikTok here pretty soon, thanks to the Supreme Court. Uh, well, actually not the Supreme Court. It's a law that's been passed, and now the Supreme...
Speaker 2: Court's weighing in on it.
Speaker 1: But, uh, it was a shuffle dance, which I tried a little bit during COVID. I did not do very well with that. So, uh, you know, do you know what I'm talking about?
Speaker 1: That was one of the big things TikTok went viral for in the early days, teaching people during COVID how to do these little dance steps at home.
Speaker 2: And I did not go through this universe. We were just not at home that much. I'm sorry. I'm sorry, mister German cinephile over here. Hold on. You, in your apartment, by yourself, were learning how to do dances? This is what you were doing in your free time?
Speaker 1: I was, you know, like, I was trying to pick up a new hobby. Don't worry about whether it was dancing or not. The point is, that was when I first saw TikTok.
Speaker 2: I can't imagine you... First of all, I told you this when we first started doing the show in June of '21. Sometime that summer I was up in New York, and you were like, oh, you can come see my apartment. And I walked in, and I was like, I would have gone absolutely insane if I lived in a city on lockdown and had to stay in an apartment. Like, you're in a high-rise apartment building.
Speaker 2: The only reason to live in New York City, in my opinion, is to experience, outside of your apartment, the life of New York City.
Speaker 1: Well, this is why, this is why people decided, some people decided, to move, because they were so upset. I mean, New York City, absent the use of the city, feels like a prison, you know, because you have no land, you have no outdoor space, you have no freedom. So yeah, no, it was particularly brutal. I think the lockdown was worse in New York City than it was, honestly, anywhere else in the entire... I think that's probably true, because at least in LA, by and large, people had land. Like, there were a lot of crazy towns.
Speaker 2: But to your point, you're basically in a high-rise prison if you can't leave your apartment. And a lot of people, you didn't even have outdoor air. Like, you didn't have, like, a balcony where you could even go outside and, like, get fresh air. Like, it's crazy.
Speaker 1: Yes. And, uh, and I couldn't even get a haircut for months, so I managed to buy myself a pair of, like, Crayola scissors off of Amazon and then just sit there and try to cut my hair so it wouldn't be so long it was in my eyes. Anyway. Ah, memories, memories. TikTok, that's how I learned about it. And then all of a sudden, there's some, there's some cool content on there. I'll tell you, there's this guy, a lot of you have probably seen this; if you haven't, I think his name is Donnie Dust, and he takes, like, tools, and someone will say, can you make a tomahawk with, you know, a stick and a rock? And he just does it right in front of you. It's amazing. There's some cool stuff on there. Problem is, it's owned by China and it's a Chinese spying app or whatever.
Speaker 2: Right, that's the problem. That's why everyone's also freaked out about this.
Speaker 1: So the Supreme Court looks like they're going to uphold the law after oral arguments this morning, after both Republicans and Democrats had decided that there was a national security risk from TikTok, and so they passed this law that said it has to be sold or it will be basically shut down in the United States. So really, it seems like they've got to find a buyer for it here, Clay. So, do you know anybody? We've got to keep this thing alive. There is so much great cooking content, and actually a lot of good fitness content, on TikTok as well.
Speaker 2: I'll tell you, I know everyone just thinks that they're trying to steal all your data for China, but at least you can get abs in the process. So here's... Ali just texted us, producer Ali, that Elon should buy it. So here is the challenge as I see it. And we had an actual discussion about this yesterday, because I don't think this is an easy answer. You believe that it should basically exist as is, right? Is that a rough way of describing your position?
Speaker 2: My position is, I actually do not believe that China should be able to own this company, which I believe is more powerful as a media entity than the New York Times is, than the Washington Post is, than Fox News is, than the Wall Street Journal is, whatever media outlet you enjoy consuming. And we would never allow the four things that I just ticked through to be owned by a foreigner, because of the impact that it could have. So I think, and you were just kind of hitting at this, I think that TikTok should be forced to divest itself of Chinese ownership of its US-based assets. And then the question beyond that becomes, okay, who would buy it? And I want an Elon Musk-like figure to buy TikTok, because all I want them to do is the same thing that now Facebook says it's doing and that Twitter does, which is just have a content-neutral policy in place, where the algorithm doesn't favor anything in particular of a political bent, and everybody gets a fair playing field. That's my ideal world for where we could go with TikTok. Now here's the challenge.
Speaker 2: The value of TikTok, I would think, Buck, as it inches closer to this banning date, which is what, like, January nineteenth? What's the day when it would... And for people who are like, what does a ban look like? My understanding, and correct me if I'm wrong on this, is basically the app would no longer be able to update, and therefore it eventually becomes unusable on iPhones or Androids or whatever else, because without being able to update, bugs take over, it becomes less efficient, all those things.
Speaker 1: Yes, and I mean, TikTok... it's January nineteenth, is the date. And it's a big platform. I mean, it claims to have a billion monthly active users globally, and claims to have one hundred and fifty million monthly active users in the United States. That seems... I mean, I don't see how that's possible, considering that would mean that half the country is basically using TikTok at least once a month. That number strikes me as not really possible. But this is the number that they, this is the number that they put out there officially. I mean, go Google it, you'll find, you'll find that that's what they say.
Speaker 1: I think it's interesting because, for one thing, it's because China owns this. If this was owned by Sweden or the UK or, you know, France or something, I don't think there would be this issue. So we clearly put China in a different category of national security concern. And I've always thought that the spying components of this, or the ability to spy with this, is a little bit overblown, considering that the TikTok that we use is based in the US and the servers are based in the US.
Speaker 2: And people say, well, they could route it all back to China.
Speaker 1: Yeah, the Chinese can hack into a lot of things, and they have hacked into a lot of things anyway. But here we have a major social media platform that seems ripe for the taking by somebody with very deep pockets, and I would just love for it to be... this is kind of my...
Speaker 2: Takeaway on it, Clay?
Speaker 1: I would love for this to be an opportunity to once again push the Internet more toward freedom and sanity, which is what we've seen with X, and now somewhat with Facebook. I would say this too, on the Facebook part of this, which is huge, as you all know: Facebook is moving away from, Clay said, no more DEI programs.
Speaker 3: Yeah.
Speaker 1: Now they're going to be far more friendly to conservatives and to politics and the free speech stuff.
Speaker 2: But is it...
Speaker 1: Opportunistic by Zuckerberg? Yeah, of course. But is it still the direction I want Zuckerberg going in? Yes.
Speaker 2: So do you think Zuckerberg went Trump, if you had to bet? That's so interesting. Yes?
Speaker 1: I think so. But I don't think he thinks of himself as a Republican. I just think, I think he recognizes Kamala is a moron, and that the whole Democrat campaign was a lie, that Biden, the whole thing, was a lie, and that Kamala's competence was a lie. And I think a lot of the tech bros are in that category.
Speaker 1: They're not ideologically... we'd even get into some of this on some of the immigration stuff; there are some disputes on the right. Maybe we'll talk more about that another time. A lot of the tech bros are not ideologically right-wing, but they are ideologically results-and-common-sense aligned, and that meant a rejection of the Democrats in this election.
Speaker 2: I agree with you. I think Bezos and Zuckerberg, in their own private voting, actually both voted Trump. I really do. And by the way, on TikTok, what percentage of our audience that is listening to us right now, you talked about the sheer number that they claim, what percentage of this audience has a TikTok account? I would actually love to hear from people: if you are active on TikTok and listening to us, what do you think should happen? And I'm not sure we will get a single call, by the way. As many of you as are listening to us, I'm not sure we'll get a single call from someone who uses TikTok frequently.
Speaker 2: I could be completely wrong, but I would love to hear from people if TikTok is your preferred social media site. Call us at eight hundred, two eight two, two eight eight two: what do you think, as a user, should happen here? And also, do you use it? Because one of the things we think about a lot on this show is how do we reach audiences in so many different places. We're in such a unique world now. Old school, when Rush started this show, he built a radio universe. There was no podcasting. There was no video on demand that you could watch on YouTube of sit-down interviews. It was a very different media environment. So we're trying to be everywhere, and I'm just curious how many people that are listening to us right now, TikTok is your preferred social media platform, and what do you think should happen? Because the politics on this one are actually pretty fascinating: Trump is against a ban, and Biden is trying to ban it.
Speaker 2: So I do think this doesn't line itself up naturally with "Republicans believe X and Democrats believe Y" as it pertains to what should happen for TikTok, which is why I think it's one of the most interesting stories out there right now.
Speaker 1: Rapid Radios are walkie-talkies that make staying in touch so much easier. One-touch, total connectivity. Just press a button, instantly connect with friends, family, or coworkers nationwide. Rapid Radios work on a nationwide LTE network, and you can connect anytime, or pre-program your radio, your Rapid Radio, to talk to two hundred people all at once. Rapid Radios come ready to go right out of the box. The charge lasts up to five days. If you have an emergency go bag, consider getting some Rapid Radios. Go online to rapidradios.com. You'll get up to sixty percent off, free UPS shipping from Michigan, plus a free protection bag. Add code RADIO and get an extra five percent off. That's rapidradios.com. Use code RADIO for an extra five percent off. They're fantastic. Rapidradios.com.
Speaker 4: Stories of freedom, stories of America, inspirational stories that unite us all.
Speaker 5: Each day, spend time with Clay and Buck. Find them on the free iHeartRadio app or wherever you get your podcasts.
Speaker 2: Welcome back in, Clay Travis and Buck Sexton Show. We have got a lot of reaction, actually, from you guys out there on TikTok. And Tyrell in New Mexico, you are a big-time TikTok user. What do you think should happen?
Speaker 6: Well, here's the thing. I get the whole, like, being afraid that China is, you know, spying on us and stuff. But like you said, they do that all the time anyways. I mean, they've even messed with our wastewater plants, and we were able to stop that. That was actually in Albuquerque that that happened. So, and by the way, don't, don't set your wastewater plant stuff up on the internet. But, uh, so they're able to do that, so it doesn't matter, it's TikTok. I look at, like you said, dance videos and stuff.
Speaker 6: But me, so, I'm a machinist by trade, and I've actually learned a lot of tricks, machinist tricks, on TikTok, because I deal with a lot of older CNC machines, and they're no longer in the States, most of them. But I can find overseas videos on TikTok about these machines.
Speaker 1: I'm with you, Tyrell. I mean, I have learned my reverse-sear technique, and my ability to perfectly season a steak, thanks to TikTok. I'm, like, Emeril Lagasse level at this point. I'm not trying to be immodest, but I mean, I make a really mean steak thanks to TikTok.
Speaker 2: Not to mention you're an incredible dancer now, thanks to TikTok. Wow. You know, you know, just... I can't stop thinking about you in that apartment, by yourself, dancing TikTok dance moves. Like, I would give anything... the feet... I get my heart rate... Are there videos of you doing dance moves to TikTok? Does this exist anywhere? Is there any evidence? I can't believe you admitted to this. Like, I can't stop laughing about you just doing, there's, like, step aerobics. You might as well have a ThighMaster, getting loose on your couch, you and Suzanne Somers.
I just 352 00:17:48,880 --> 00:17:50,960 Speaker 2: I can't. I can't stop laughing about this. 353 00:17:53,400 --> 00:17:56,120 Speaker 1: All right. No one wants to go through the hassle 354 00:17:56,160 --> 00:17:58,240 Speaker 1: of changing their cell phone service provider. 355 00:17:58,520 --> 00:18:02,359 Speaker 2: But where were you going with that? What's the 356 00:18:02,400 --> 00:18:06,080 Speaker 2: point of that, Clay? I'm doing the read. Stop. So 357 00:18:06,320 --> 00:18:08,359 Speaker 2: what if, what if you could? You could do this 358 00:18:08,440 --> 00:18:11,760 Speaker 2: without sacrificing quality. That's what Pure Talk can do for you, 359 00:18:11,840 --> 00:18:12,440 Speaker 2: my friends. 360 00:18:12,560 --> 00:18:14,840 Speaker 1: That's because Pure Talk is on the same towers and 361 00:18:14,880 --> 00:18:16,960 Speaker 1: network as one of the big wireless companies you already know, 362 00:18:17,080 --> 00:18:19,919 Speaker 1: but you pay so much less money for it. You 363 00:18:19,960 --> 00:18:22,639 Speaker 1: can spend thirty five dollars, not one hundred dollars, not 364 00:18:22,680 --> 00:18:25,040 Speaker 1: eighty dollars, thirty five dollars a month with Pure Talk, 365 00:18:25,080 --> 00:18:28,159 Speaker 1: get unlimited talk, text, and fifteen gigs of data with 366 00:18:28,200 --> 00:18:31,000 Speaker 1: a mobile hotspot. The family of four saves about one 367 00:18:31,000 --> 00:18:33,760 Speaker 1: thousand dollars a year with Pure Talk, all in, while 368 00:18:33,840 --> 00:18:36,760 Speaker 1: enjoying America's most dependable five G network. The phone I 369 00:18:36,800 --> 00:18:40,040 Speaker 1: have in my hand right now is a Pure Talk phone. 370 00:18:40,440 --> 00:18:43,400 Speaker 1: Switch to Pure Talk today. Dial pound two fifty, say 371 00:18:43,440 --> 00:18:46,720 Speaker 1: the keywords Clay and Buck. That's pound two five zero.
372 00:18:46,920 --> 00:18:50,200 Speaker 1: Say the keywords Clay and Buck. Save an additional fifty 373 00:18:50,240 --> 00:18:52,760 Speaker 1: percent off your first month with Pure Talk. Got a 374 00:18:52,760 --> 00:18:55,040 Speaker 1: bunch of callers who want to weigh in on the TikTok situation. 375 00:18:55,640 --> 00:18:58,360 Speaker 1: It is, as Clay and I were just talking about this, it's 376 00:18:58,359 --> 00:19:01,240 Speaker 1: a big deal for a massive social media platform to 377 00:19:01,320 --> 00:19:03,600 Speaker 1: be told you've got to be sold or you're shutting 378 00:19:03,640 --> 00:19:05,600 Speaker 1: down because we don't like the 379 00:19:07,119 --> 00:19:11,399 Speaker 2: parent company that has, you know, ownership of this entity. 380 00:19:11,640 --> 00:19:14,240 Speaker 2: It's because it's in China. 381 00:19:14,600 --> 00:19:18,280 Speaker 1: It's interesting. This is kind of new territory in a 382 00:19:18,320 --> 00:19:20,159 Speaker 1: lot of ways. Let's take some of these calls. We 383 00:19:20,200 --> 00:19:25,520 Speaker 1: get to Jacob in Nebraska, a forty four year old Republican. 384 00:19:25,560 --> 00:19:26,280 Speaker 1: What have you got for us, 385 00:19:26,359 --> 00:19:31,520 Speaker 3: Jacob? Hey, happy to be here, gentlemen. So I did just 386 00:19:31,560 --> 00:19:34,240 Speaker 3: happen to call in on this. I got rid of all 387 00:19:34,320 --> 00:19:38,760 Speaker 3: social media during the pandemic. It was just, it was 388 00:19:38,840 --> 00:19:42,240 Speaker 3: just too much, so I completely took myself off all of it. 389 00:19:42,920 --> 00:19:45,120 Speaker 3: Jump ahead a few years: I have an eleven 390 00:19:45,200 --> 00:19:50,359 Speaker 3: year old daughter now. She wanted to get onto TikTok, Facebook, Snapchat, 391 00:19:50,400 --> 00:19:53,760 Speaker 3: all of those. My only stipulation was, I have to 392 00:19:53,800 --> 00:19:56,400 Speaker 3: be honest, right, I have to be able to monitor.
393 00:19:57,600 --> 00:20:00,800 Speaker 3: So I was able to. Well, as I'm getting more 394 00:20:00,800 --> 00:20:04,000 Speaker 3: and more into TikTok, you could actually learn quite a 395 00:20:04,000 --> 00:20:07,280 Speaker 3: bit based off kind of where you are, what you're following, 396 00:20:07,280 --> 00:20:11,919 Speaker 3: what you're doing, what you're liking. So far, I am really, 397 00:20:11,960 --> 00:20:16,000 Speaker 3: really, really into homesteading, and I'm learning how to 398 00:20:16,400 --> 00:20:19,680 Speaker 3: build a new greenhouse. And I've also learned quite a 399 00:20:19,760 --> 00:20:24,520 Speaker 3: bit about ducks versus chickens. Ducks are hardier, their eggs 400 00:20:24,520 --> 00:20:30,240 Speaker 3: are healthier, they're cleaner. As far as the whole China thing, 401 00:20:32,040 --> 00:20:34,400 Speaker 3: you know, if they want to find out information, they're 402 00:20:34,440 --> 00:20:37,320 Speaker 3: going to find it out, whether or not they own 403 00:20:37,400 --> 00:20:38,000 Speaker 3: the app or not. 404 00:20:39,520 --> 00:20:40,480 Speaker 2: Yeah, I guess, you know. 405 00:20:40,560 --> 00:20:42,760 Speaker 1: First of all, Jacob, I appreciate that, you know, 406 00:20:42,760 --> 00:20:45,080 Speaker 1: you've had these interesting experiences with TikTok. I have the 407 00:20:45,119 --> 00:20:47,119 Speaker 1: same thing. I've actually learned a lot from it. I 408 00:20:47,119 --> 00:20:48,920 Speaker 1: don't really post on it. It's funny, it's the only 409 00:20:48,960 --> 00:20:55,399 Speaker 1: social media platform, Clay, that I consume not for work. Yeah. Like, 410 00:20:55,440 --> 00:20:56,919 Speaker 1: I don't use it for work. I just kind of 411 00:20:57,000 --> 00:20:58,760 Speaker 1: use it to, if you want, kill time.
If 412 00:20:58,760 --> 00:21:01,240 Speaker 1: you're stuck in, like, an airport, you know, waiting area 413 00:21:01,280 --> 00:21:03,560 Speaker 1: for an hour, you sit there, you scroll through TikTok. 414 00:21:03,600 --> 00:21:05,800 Speaker 2: Your brain just goes on autopilot. You won't even realize 415 00:21:05,800 --> 00:21:06,320 Speaker 2: what's going on. 416 00:21:07,280 --> 00:21:10,359 Speaker 1: But I also think that the risks of the 417 00:21:10,480 --> 00:21:13,920 Speaker 1: Chinese exploitation of TikTok have never been explained to me 418 00:21:13,960 --> 00:21:16,120 Speaker 1: in a way where I find it compelling. That's 419 00:21:16,160 --> 00:21:19,040 Speaker 1: my... uh, they're gonna know what you like and don't like? 420 00:21:19,520 --> 00:21:22,119 Speaker 1: I think where people get into it, Clay, a little bit 421 00:21:22,160 --> 00:21:25,080 Speaker 1: more here, and this goes to the media 422 00:21:25,119 --> 00:21:27,000 Speaker 1: angle that you've brought up, is, okay, but a 423 00:21:27,040 --> 00:21:32,040 Speaker 1: foreign country can shift perception in a sense, or can 424 00:21:32,160 --> 00:21:35,880 Speaker 1: begin to have influence at a mass level over 425 00:21:35,960 --> 00:21:38,840 Speaker 1: how people view issues and things. But that's more 426 00:21:38,840 --> 00:21:43,159 Speaker 1: of a propaganda component than a spying component. So let 427 00:21:43,200 --> 00:21:43,880 Speaker 1: me ask you this. 428 00:21:43,960 --> 00:21:50,240 Speaker 2: Then, my take is I'm less concerned about the 429 00:21:50,359 --> 00:21:56,160 Speaker 2: Chinese ownership than the fact that China restricts American companies 430 00:21:56,160 --> 00:21:59,240 Speaker 2: from being able to compete in its markets, but we 431 00:21:59,280 --> 00:22:02,320 Speaker 2: aren't restricting China from being able to compete in our markets.
432 00:22:02,320 --> 00:22:05,600 Speaker 2: And the example I would use is YouTube doesn't have 433 00:22:05,680 --> 00:22:10,439 Speaker 2: access to China. Google doesn't have access to China. Twitter 434 00:22:10,520 --> 00:22:14,720 Speaker 2: doesn't have access to China. Facebook doesn't have access to China. 435 00:22:14,760 --> 00:22:19,359 Speaker 2: All of those companies effectively have clones that exist in China, 436 00:22:19,520 --> 00:22:23,440 Speaker 2: and those are Chinese owned companies that then create businesses, 437 00:22:23,520 --> 00:22:26,880 Speaker 2: create opportunities there. My big thing is, if we don't 438 00:22:26,880 --> 00:22:30,520 Speaker 2: have mutuality of competition, in the same way that we 439 00:22:30,640 --> 00:22:34,680 Speaker 2: have similar, in theory, tariffs going back and forth to 440 00:22:34,720 --> 00:22:38,320 Speaker 2: create free trade, we don't have free big tech trade, 441 00:22:38,680 --> 00:22:41,639 Speaker 2: and so I don't want China building huge businesses in 442 00:22:41,680 --> 00:22:44,719 Speaker 2: the United States and not allowing the United States to 443 00:22:44,720 --> 00:22:46,119 Speaker 2: do the same in the media space. 444 00:22:46,440 --> 00:22:46,680 Speaker 6: See. 445 00:22:46,720 --> 00:22:49,639 Speaker 2: I find that argument compelling and I agree 446 00:22:49,400 --> 00:22:51,720 Speaker 1: with that. That, as far as I know, though, 447 00:22:51,760 --> 00:22:54,000 Speaker 1: has nothing to do with why they're forcing the sale, right? 448 00:22:54,000 --> 00:22:56,760 Speaker 1: This isn't about unfair Chinese trade practices.
449 00:22:56,800 --> 00:23:00,280 Speaker 2: Well, the policy has been that they would be able 450 00:23:00,320 --> 00:23:04,439 Speaker 2: to stay open if they had United States ownership. So 451 00:23:04,920 --> 00:23:08,639 Speaker 2: the company would still exist as it is if they sold, 452 00:23:08,640 --> 00:23:10,360 Speaker 2: and they don't have to sell the whole company. They 453 00:23:10,400 --> 00:23:12,040 Speaker 2: just have to sell the US assets. 454 00:23:12,280 --> 00:23:15,159 Speaker 1: I mean, but they have said TikTok is based in, 455 00:23:15,280 --> 00:23:20,240 Speaker 1: I think, California, and they have said that the TikTok 456 00:23:20,240 --> 00:23:23,280 Speaker 1: that you operate, you deal with in America is run 457 00:23:23,280 --> 00:23:25,919 Speaker 1: by people who live in America, by US citizens, on 458 00:23:26,119 --> 00:23:30,120 Speaker 1: US servers. So, you know, it's about just foreign ownership, right? 459 00:23:30,160 --> 00:23:33,320 Speaker 1: It's, you know, it's like ByteDance, right, it's 460 00:23:33,320 --> 00:23:36,399 Speaker 1: ByteDance's... 461 00:23:35,280 --> 00:23:38,200 Speaker 2: The majority stake of TikTok. 462 00:23:38,760 --> 00:23:41,760 Speaker 1: And you sit there, you say, okay. So they could 463 00:23:41,760 --> 00:23:44,879 Speaker 1: certainly try to influence the US based company, but also, 464 00:23:45,920 --> 00:23:47,919 Speaker 1: how much are they really going to do that? What 465 00:23:48,040 --> 00:23:49,800 Speaker 1: kind of problems would that cause for them if that 466 00:23:49,880 --> 00:23:53,359 Speaker 1: came out? Like, I just, I don't see... What is it, 467 00:23:53,400 --> 00:23:54,960 Speaker 1: me and Jamaal Bowman, right, were the only 468 00:23:54,840 --> 00:23:57,679 Speaker 2: ones? I actually think your argument is strong in this. 469 00:23:58,920 --> 00:24:01,800 Speaker 2: You know, maybe it's gonna change because Trump won.
Certainly 470 00:24:01,840 --> 00:24:03,720 Speaker 2: we're seeing a change at Facebook, and we've seen a 471 00:24:03,800 --> 00:24:08,280 Speaker 2: change at Twitter. YouTube is horribly biased in favor 472 00:24:08,400 --> 00:24:11,600 Speaker 2: of the left, right, and that's why Rumble came to 473 00:24:11,680 --> 00:24:15,520 Speaker 2: exist. Because, you and I remember, they took down our 474 00:24:15,520 --> 00:24:18,600 Speaker 2: interview with Rand Paul. They took down our interview with 475 00:24:18,680 --> 00:24:22,600 Speaker 2: Donald Trump. They wouldn't allow us to share the interviews 476 00:24:22,640 --> 00:24:25,520 Speaker 2: that we did with both elected officials on YouTube in 477 00:24:25,560 --> 00:24:27,320 Speaker 2: the last couple of years. Thank you for reminding me of what 478 00:24:27,359 --> 00:24:29,600 Speaker 2: has been my most compelling argument. I haven't brought this up yet, 479 00:24:29,600 --> 00:24:32,440 Speaker 2: but it's absolutely true, which is that Google is a 480 00:24:32,520 --> 00:24:34,399 Speaker 2: much bigger threat to all of you listening to this 481 00:24:34,480 --> 00:24:37,440 Speaker 2: right now. Google is a much bigger threat to your freedoms, 482 00:24:37,520 --> 00:24:41,159 Speaker 2: your constitutional rights, than anything TikTok has been up to. 483 00:24:41,280 --> 00:24:42,360 Speaker 2: I can assure you of that. 484 00:24:42,480 --> 00:24:45,919 Speaker 1: Okay, the enemy within is far more concerning to me 485 00:24:46,480 --> 00:24:52,280 Speaker 1: than what Beijing can do working through a proxy that 486 00:24:52,320 --> 00:24:55,600 Speaker 1: has a lot of people, you know, posting content that's 487 00:24:55,600 --> 00:24:58,600 Speaker 1: pretty... No, it's Americans posting the content too.
This, like, 488 00:24:58,720 --> 00:25:01,679 Speaker 1: I know, I've actually met a couple of TikTok influencers. 489 00:25:01,680 --> 00:25:04,080 Speaker 1: Like, I know some of these people, and you know 490 00:25:04,280 --> 00:25:07,240 Speaker 1: they're not getting their marching instructions from China. 491 00:25:07,280 --> 00:25:08,200 Speaker 2: You know, this is the thing. 492 00:25:08,480 --> 00:25:12,359 Speaker 1: It's not like a city full of Chinese nationals 493 00:25:12,720 --> 00:25:15,480 Speaker 1: is trying to explain American politics to us. It's, like, 494 00:25:15,560 --> 00:25:17,879 Speaker 1: kids who grew up in Cincinnati who are doing dances 495 00:25:17,960 --> 00:25:19,520 Speaker 1: and, you know, teaching 496 00:25:19,280 --> 00:25:21,439 Speaker 2: Buck how to dance in his apartment by himself. 497 00:25:21,840 --> 00:25:26,480 Speaker 2: I will say that my solution here, which seems like 498 00:25:26,480 --> 00:25:28,919 Speaker 2: a very reasonable one, in addition to Buck having to 499 00:25:28,960 --> 00:25:31,399 Speaker 2: go on TikTok and post the dance himself that he 500 00:25:31,520 --> 00:25:34,600 Speaker 2: was learning, which is hysterical to me... I think the 501 00:25:34,640 --> 00:25:38,480 Speaker 2: easy solution is that there is American ownership of a TikTok 502 00:25:38,480 --> 00:25:41,119 Speaker 2: US-based company. And if I just think about this rationally: 503 00:25:41,760 --> 00:25:47,320 Speaker 2: if I owned substantial assets in TikTok America, and those 504 00:25:47,359 --> 00:25:50,560 Speaker 2: assets are worth, let's say, and I don't think it's crazy, 505 00:25:50,680 --> 00:25:52,960 Speaker 2: one hundred billion dollars, right?
I don't think that's a 506 00:25:53,000 --> 00:25:55,600 Speaker 2: crazy dollar figure, to think that TikTok, based on its 507 00:25:55,640 --> 00:25:59,000 Speaker 2: penetration, market size, the things that we've been talking about, 508 00:25:59,040 --> 00:26:00,600 Speaker 2: and even in this audience, so we're going to take 509 00:26:00,640 --> 00:26:03,280 Speaker 2: some more of your calls, a lot of you are on TikTok, 510 00:26:03,640 --> 00:26:06,160 Speaker 2: I don't think it's crazy to think that's a one hundred 511 00:26:06,200 --> 00:26:11,199 Speaker 2: billion dollar company. Typically, if your company faces a choice 512 00:26:11,200 --> 00:26:15,040 Speaker 2: between, hey, that one hundred billion dollar valuation can vanish, 513 00:26:16,040 --> 00:26:19,480 Speaker 2: or you can get your money and sell it to 514 00:26:19,560 --> 00:26:23,119 Speaker 2: someone else, the rational outcome here is that they would 515 00:26:23,480 --> 00:26:25,720 Speaker 2: sell this asset to someone else in America, and it 516 00:26:25,720 --> 00:26:27,639 Speaker 2: would continue to run. Just a few things: 517 00:26:28,080 --> 00:26:30,840 Speaker 1: I mean, now, to be fair, this was put out 518 00:26:30,960 --> 00:26:33,640 Speaker 1: by TikTok, but I don't think, you know, putting 519 00:26:33,680 --> 00:26:35,880 Speaker 1: out lies that would be disprovable would serve their 520 00:26:35,880 --> 00:26:40,159 Speaker 1: interest in this matter. But TikTok's parent company, 521 00:26:40,200 --> 00:26:44,320 Speaker 1: ByteDance, was founded by Chinese entrepreneurs, but roughly sixty 522 00:26:44,359 --> 00:26:48,280 Speaker 1: percent of the company is owned by global institutional investors 523 00:26:48,320 --> 00:26:53,360 Speaker 1: like the Carlyle Group, General Atlantic, Susquehanna International.
An additional 524 00:26:53,440 --> 00:26:56,480 Speaker 1: twenty percent of the company is owned by ByteDance employees 525 00:26:56,480 --> 00:27:00,919 Speaker 1: around the world, including nearly seven thousand Americans. The remaining twenty 526 00:27:00,960 --> 00:27:03,520 Speaker 1: percent is owned by the company's founder, who is a 527 00:27:03,520 --> 00:27:06,040 Speaker 1: private individual, not part of any state or government entity. 528 00:27:06,080 --> 00:27:07,640 Speaker 1: Now, I think you can push back. I mean, if 529 00:27:08,280 --> 00:27:10,120 Speaker 1: the twenty percent is owned by a guy in China, 530 00:27:10,160 --> 00:27:11,879 Speaker 1: the Chinese government gets to call the shots with that 531 00:27:11,960 --> 00:27:13,919 Speaker 1: twenty percent. To be clear, there's no such thing as 532 00:27:13,960 --> 00:27:16,440 Speaker 1: a separation of public and private in China. So that's 533 00:27:16,480 --> 00:27:19,040 Speaker 1: where the TikTok fact sheet here, you know, leaves a 534 00:27:19,080 --> 00:27:23,320 Speaker 1: little bit out. But, you know, just to show some 535 00:27:23,359 --> 00:27:26,160 Speaker 1: of what's on here, it seems 536 00:27:26,160 --> 00:27:28,679 Speaker 1: to me at least to indicate that it has more 537 00:27:28,560 --> 00:27:31,040 Speaker 2: diverse ownership than people really realize. 538 00:27:31,880 --> 00:27:34,520 Speaker 1: Look, I just, I always think, Clay, of 539 00:27:34,600 --> 00:27:39,320 Speaker 1: the precedent that these kinds of things set. And to say 540 00:27:39,720 --> 00:27:42,000 Speaker 1: I don't like the ownership of this company, so, 541 00:27:42,040 --> 00:27:43,960 Speaker 1: you know, because it's partially foreign, 542 00:27:44,040 --> 00:27:47,040 Speaker 1: it has to be sold... you know, this is 543 00:27:47,080 --> 00:27:49,560 Speaker 1: a government intrusion on the marketplace.
This 544 00:27:49,680 --> 00:27:54,919 Speaker 1: is not a marketplace-first mechanism. And to me, just 545 00:27:55,000 --> 00:27:56,720 Speaker 1: remember what the other side likes to do when they 546 00:27:56,760 --> 00:27:58,160 Speaker 1: can just call the shots on things. 547 00:27:58,359 --> 00:28:00,200 Speaker 2: Well, let me also point this out, because I think, 548 00:28:00,200 --> 00:28:02,080 Speaker 2: again, this is why I don't think this 549 00:28:02,280 --> 00:28:07,280 Speaker 2: lines up in a typical left-right dynamic. In fact, 550 00:28:07,320 --> 00:28:10,960 Speaker 2: I think you're likely to see some form of left 551 00:28:11,000 --> 00:28:14,560 Speaker 2: and right agreement as to whether or not the government 552 00:28:14,640 --> 00:28:19,840 Speaker 2: can ban TikTok. I think there'll be different resolutions, 553 00:28:19,920 --> 00:28:23,640 Speaker 2: different opinions that are drafted from a variety of perspectives. 554 00:28:23,720 --> 00:28:25,440 Speaker 2: Let me just ask you this, because this 555 00:28:25,520 --> 00:28:27,639 Speaker 2: is ultimately what I find to be the crux of 556 00:28:27,640 --> 00:28:31,120 Speaker 2: the case. How do you define media in today's day 557 00:28:31,119 --> 00:28:35,760 Speaker 2: and age? And that's a big, big picture question. And 558 00:28:36,640 --> 00:28:40,280 Speaker 2: let's analogize here. If the New York Times, which you 559 00:28:40,320 --> 00:28:43,160 Speaker 2: and I agree is a super left-wing biased company, 560 00:28:43,720 --> 00:28:46,880 Speaker 2: put itself on the market, and instead of the Sulzberger family, 561 00:28:46,920 --> 00:28:49,720 Speaker 2: which I think owns it, in the New York City area, 562 00:28:49,760 --> 00:28:54,080 Speaker 2: instead of that, a Chinese-based company bought The New 563 00:28:54,160 --> 00:28:59,360 Speaker 2: York Times, I think everybody would say that's unacceptable.
We're 564 00:28:59,360 --> 00:29:01,880 Speaker 2: not going to allow that, right? So that would be... 565 00:29:01,960 --> 00:29:03,640 Speaker 2: And by the way, it could be the Wall Street Journal, 566 00:29:03,680 --> 00:29:07,440 Speaker 2: it could be Fox News, it could be, choose your preferred 567 00:29:07,480 --> 00:29:13,880 Speaker 2: media outlet. Where does TikTok fall in the media universe? 568 00:29:14,280 --> 00:29:16,400 Speaker 2: Because I think most of you out there would say, yeah, 569 00:29:16,440 --> 00:29:18,560 Speaker 2: the Wall Street Journal can't be owned by a foreigner. 570 00:29:19,320 --> 00:29:21,640 Speaker 2: This is why, to a large extent, Rupert Murdoch became 571 00:29:21,640 --> 00:29:26,360 Speaker 2: an American citizen: because American media ownership rules require 572 00:29:26,520 --> 00:29:31,280 Speaker 2: that American citizens own our media companies, because otherwise the 573 00:29:31,320 --> 00:29:33,720 Speaker 2: fear is that you get a George Soros, you have 574 00:29:33,880 --> 00:29:37,520 Speaker 2: some guy come in, and you start to diabolically pull strings. 575 00:29:38,000 --> 00:29:40,800 Speaker 2: So I think that's the question: where does TikTok rank 576 00:29:40,920 --> 00:29:43,640 Speaker 2: on the media flowchart? Because that also factors in here. 577 00:29:44,000 --> 00:29:46,400 Speaker 1: Well, what you're also seeing, though, is in the digital era, 578 00:29:46,520 --> 00:29:51,160 Speaker 1: perhaps more so than ever, information is a national security issue. 579 00:29:51,880 --> 00:29:54,680 Speaker 1: And I don't mean classified information. I mean the distribution 580 00:29:54,800 --> 00:29:59,800 Speaker 1: of information. I mean mass media.
Shaping perception at scale 581 00:30:00,520 --> 00:30:04,360 Speaker 1: is now a national security concern, because if you can 582 00:30:04,440 --> 00:30:07,880 Speaker 1: influence people... especially, let's say, if you're China, you want 583 00:30:07,920 --> 00:30:09,200 Speaker 1: to influence the US population. 584 00:30:09,280 --> 00:30:12,120 Speaker 2: Do we really care about Taiwan? Is Taiwan really our, 585 00:30:12,280 --> 00:30:12,520 Speaker 2: you know... 586 00:30:12,720 --> 00:30:15,240 Speaker 1: Is that really something we need to be so concerned about? 587 00:30:15,880 --> 00:30:19,040 Speaker 1: Now, you could say that that will take time. And yeah, well, 588 00:30:19,120 --> 00:30:21,840 Speaker 1: China thinks long term, and there's a lot of 589 00:30:21,840 --> 00:30:24,280 Speaker 1: ways that you can manipulate public opinion. You can kind 590 00:30:24,280 --> 00:30:29,360 Speaker 1: of manufacture a consensus of sorts. So that's, that's, you 591 00:30:29,240 --> 00:30:30,080 Speaker 2: know... you see what I mean? 592 00:30:30,240 --> 00:30:33,040 Speaker 1: Yes. No, I think we can all understand, okay, 593 00:30:33,240 --> 00:30:36,160 Speaker 1: you can't have a company selling missiles, like advanced missile 594 00:30:36,200 --> 00:30:37,040 Speaker 1: technology, to Iran. 595 00:30:37,120 --> 00:30:39,640 Speaker 2: We understand that, right? Clear national security implication. 596 00:30:40,360 --> 00:30:43,760 Speaker 1: Now it's, we can't have a company that does a 597 00:30:43,800 --> 00:30:47,480 Speaker 1: lot of US-generated content that's too popular, because the 598 00:30:47,520 --> 00:30:52,160 Speaker 1: algorithm that they use may shift perception broadly on political 599 00:30:52,200 --> 00:30:55,479 Speaker 1: issues in a way that is not advantageous 600 00:30:54,800 --> 00:30:57,520 Speaker 2: to the US. Do you see this? I mean, I 601 00:30:57,760 --> 00:31:00,440 Speaker 2: think it kind of gets crystallized.
I think you've talked 602 00:31:00,440 --> 00:31:03,800 Speaker 2: about this before on the show. You can be watching CNN, 603 00:31:03,880 --> 00:31:07,120 Speaker 2: which is allowed to broadcast in China, and when they 604 00:31:07,160 --> 00:31:11,000 Speaker 2: discuss anything China related that they don't like, the television 605 00:31:11,040 --> 00:31:14,720 Speaker 2: just goes black. Like, you know, CNN International. Watch. 606 00:31:14,800 --> 00:31:17,160 Speaker 1: Yeah, yeah. Also, like, all you have to know 607 00:31:17,200 --> 00:31:19,400 Speaker 1: about CNN International is that they play it in 608 00:31:19,440 --> 00:31:21,920 Speaker 1: countries where they absolutely hate America. 609 00:31:21,640 --> 00:31:22,880 Speaker 2: Tells you a lot about CNN. 610 00:31:23,640 --> 00:31:25,360 Speaker 1: You turn it on, you go, oh, America is the 611 00:31:25,400 --> 00:31:27,760 Speaker 1: bad guy, according to CNN International. 612 00:31:27,800 --> 00:31:29,480 Speaker 2: A lot of that. But it is interesting, right, to 613 00:31:29,520 --> 00:31:33,800 Speaker 2: think about: they basically have Chinese censors who 614 00:31:33,840 --> 00:31:36,280 Speaker 2: will just block individual segments out. You can't get, you know, 615 00:31:36,320 --> 00:31:39,000 Speaker 2: the site that I sold to Fox, you can't get 616 00:31:39,040 --> 00:31:42,680 Speaker 2: OutKick in China. We're banned. You cannot pull it up. Now, 617 00:31:42,680 --> 00:31:44,680 Speaker 2: I know people have VPNs and they get smart and 618 00:31:44,720 --> 00:31:46,000 Speaker 2: they figure out a way to pull it up. But 619 00:31:46,040 --> 00:31:49,080 Speaker 2: I mean, it's a sports focused website, but we've been 620 00:31:49,120 --> 00:31:52,440 Speaker 2: critical of China to such an extent, associated with sports, 621 00:31:52,480 --> 00:31:53,320 Speaker 2: that you can't even pull it up.
622 00:31:53,680 --> 00:31:57,560 Speaker 1: I also just think, you know, people have become 623 00:31:57,680 --> 00:32:00,840 Speaker 1: very distrustful of Big Pharma, for example, in the 624 00:32:00,880 --> 00:32:03,719 Speaker 1: post-COVID-shot era. I think there's more reason than 625 00:32:03,720 --> 00:32:06,040 Speaker 1: ever for that, although Big Pharma actually does do a 626 00:32:06,080 --> 00:32:08,280 Speaker 1: lot of important things, which I like to remind everybody of. 627 00:32:08,400 --> 00:32:10,840 Speaker 1: Like, people like their statin drugs. I mean, there's 628 00:32:10,840 --> 00:32:13,240 Speaker 1: really important stuff that's being done. People like their cancer 629 00:32:13,320 --> 00:32:16,600 Speaker 1: drugs when they really need them. You know, it's 630 00:32:16,600 --> 00:32:21,560 Speaker 1: not just a downside with Big Pharma. Big media, 631 00:32:21,840 --> 00:32:24,440 Speaker 1: I mean, or big social media, you can't trust. You 632 00:32:24,520 --> 00:32:26,719 Speaker 1: look at, like, Google and YouTube and all this. 633 00:32:26,880 --> 00:32:29,640 Speaker 2: I think they want TikTok gone. I don't know. 634 00:32:29,720 --> 00:32:32,440 Speaker 1: I think that's certainly true, because they, like, bought off 635 00:32:32,480 --> 00:32:35,400 Speaker 1: members of Congress. Yeah, I think they've bought off members 636 00:32:35,400 --> 00:32:38,080 Speaker 1: of Congress. Google, look at the Google money, and I 637 00:32:38,120 --> 00:32:40,960 Speaker 1: think on both sides, man. Google is the most powerful 638 00:32:41,000 --> 00:32:43,200 Speaker 1: company in the world right now, pretty much. 639 00:32:44,680 --> 00:32:46,160 Speaker 2: I think that's true. We'll come back, we'll take some 640 00:32:46,200 --> 00:32:48,200 Speaker 2: more of your calls.
I'm actually impressed at how many 641 00:32:48,280 --> 00:32:50,640 Speaker 2: of you are committed to using TikTok, even from a 642 00:32:50,720 --> 00:32:52,719 Speaker 2: variety of perspectives. Eight hundred two eight two, two 643 00:32:52,760 --> 00:32:54,920 Speaker 2: eight eight two. Maybe one of you has seen Buck 644 00:32:55,040 --> 00:32:58,280 Speaker 2: dancing. Imagine living in a country under constant assault from 645 00:32:58,280 --> 00:33:01,640 Speaker 2: all sides, having to endure thousands of missile attacks, evacuating 646 00:33:01,640 --> 00:33:04,280 Speaker 2: to bomb shelters on a regular basis, the overall threat of 647 00:33:04,400 --> 00:33:07,760 Speaker 2: terror always hovering everywhere you go. That's what it's like 648 00:33:07,800 --> 00:33:10,120 Speaker 2: in Israel right now. I was just there last month. 649 00:33:10,160 --> 00:33:12,640 Speaker 2: You're driving by bomb shelters. At any moment you might 650 00:33:12,680 --> 00:33:14,280 Speaker 2: have to pull off; you get a notice that there 651 00:33:14,280 --> 00:33:17,000 Speaker 2: are missiles coming in and you have to go stand 652 00:33:17,040 --> 00:33:20,520 Speaker 2: in a bomb shelter. It's absolutely crazy. Our sponsor, the 653 00:33:20,600 --> 00:33:24,360 Speaker 2: International Fellowship of Christians and Jews, provides life saving resources 654 00:33:24,360 --> 00:33:28,160 Speaker 2: for our friends in Israel. Your contributions have made a 655 00:33:28,240 --> 00:33:30,959 Speaker 2: huge difference. I met a guy in a kibbutz who 656 00:33:31,080 --> 00:33:33,440 Speaker 2: pulled up in a vehicle. He pointed to the bulletproof 657 00:33:33,480 --> 00:33:35,520 Speaker 2: parts of his car.
He said, I would have been 658 00:33:35,520 --> 00:33:38,720 Speaker 2: dead that day if I hadn't had the bulletproofing provided 659 00:33:38,760 --> 00:33:41,360 Speaker 2: by the Fellowship, so that I could come and fight 660 00:33:41,440 --> 00:33:45,200 Speaker 2: back against the terrorists. They make a tremendous difference, providing food, 661 00:33:45,320 --> 00:33:48,520 Speaker 2: shelter, safety to so many people out there. You can 662 00:33:48,640 --> 00:33:52,120 Speaker 2: join the movement, show your support for Israel and for 663 00:33:52,200 --> 00:33:56,120 Speaker 2: the Fellowship of Christians and Jews by going to SupportIFCJ 664 00:33:56,320 --> 00:34:00,479 Speaker 2: dot org. That's one word, SupportIFCJ dot org. 665 00:34:00,640 --> 00:34:04,000 Speaker 2: Your support of the IFCJ has saved lives and answered 666 00:34:04,000 --> 00:34:08,239 Speaker 2: prayers and will continue to do so. SupportIFCJ dot org. 667 00:34:09,160 --> 00:34:14,920 Speaker 4: Stories of freedom, stories of America, inspirational stories that unite 668 00:34:14,680 --> 00:34:18,640 Speaker 5: us all each day. Spend time with Clay and Buck. Find 669 00:34:18,640 --> 00:34:22,000 Speaker 5: them on the free iHeartRadio app or wherever you get 670 00:34:22,040 --> 00:34:22,919 Speaker 5: your podcasts. 671 00:34:23,560 --> 00:34:27,040 Speaker 2: Welcome back in, Clay Travis Buck Sexton Show, closing up 672 00:34:27,200 --> 00:34:29,239 Speaker 2: the shop here. We've got a couple of guests coming 673 00:34:29,280 --> 00:34:32,839 Speaker 2: your way in the third hour: Andy McCarthy on the 674 00:34:33,200 --> 00:34:39,080 Speaker 2: end of Trump lawfare, the unconditional discharge that occurred in 675 00:34:39,120 --> 00:34:43,120 Speaker 2: the New York City courtroom, Judge Merchan. What does 676 00:34:43,160 --> 00:34:45,080 Speaker 2: he make of all of this?
He will join us 677 00:34:45,120 --> 00:34:47,839 Speaker 2: top of the next hour, and then our buddy Ryan Girdusky, 678 00:34:48,239 --> 00:34:50,839 Speaker 2: who is now a part of the Clay and Buck 679 00:34:50,920 --> 00:34:54,000 Speaker 2: podcast network, which is just doing gangbuster numbers, by the way. 680 00:34:54,400 --> 00:34:57,719 Speaker 2: You guys are really diving into so many of the 681 00:34:57,760 --> 00:35:01,600 Speaker 2: different shows out there, and that is fabulous to see. 682 00:35:01,640 --> 00:35:04,080 Speaker 2: We'll break down and discuss all of that. We 683 00:35:04,080 --> 00:35:09,200 Speaker 2: continue to be deluged with emails from you guys who are on TikTok, 684 00:35:09,440 --> 00:35:13,560 Speaker 2: and the responses are super interesting. And for the most part, 685 00:35:14,200 --> 00:35:16,440 Speaker 2: what's interesting to me, Buck, is most people aren't on 686 00:35:16,480 --> 00:35:18,799 Speaker 2: there for current events and politics, at least those reaching out to us. 687 00:35:18,840 --> 00:35:23,719 Speaker 2: It's whatever particular niche you are interested in, 688 00:35:24,480 --> 00:35:28,799 Speaker 2: you are getting served videos from that niche. So for Buck, 689 00:35:28,840 --> 00:35:31,440 Speaker 2: it might be new dance moves. For some of you 690 00:35:31,520 --> 00:35:34,600 Speaker 2: out there, it might be, you know, how do you 691 00:35:34,840 --> 00:35:36,280 Speaker 2: build or frame a new home? 692 00:35:37,000 --> 00:35:41,000 Speaker 1: Or it might be firearms tips and room clearing, Clay, 693 00:35:41,320 --> 00:35:44,439 Speaker 1: or edged weapons tactics in close quarters. Okay? Don't 694 00:35:44,480 --> 00:35:47,560 Speaker 1: pigeonhole me into the shuffle dance only, I think. 695 00:35:47,480 --> 00:35:49,880 Speaker 2: It's just... does Carrie know about your dance moves? Is this... 696 00:35:50,200 --> 00:35:52,239 Speaker 2: I know she was just talking to us off air here.
697 00:35:52,280 --> 00:35:55,160 Speaker 2: Is she aware? Before Carrie was in the picture? Okay, 698 00:35:55,200 --> 00:35:57,239 Speaker 2: you know, has she seen the moves? Has she seen 699 00:35:57,239 --> 00:36:00,200 Speaker 2: the moves? That's the question. You're gonna have to break 700 00:36:00,239 --> 00:36:02,560 Speaker 2: these out at the inaugural parties. Are there gonna be, 701 00:36:02,640 --> 00:36:05,960 Speaker 2: like... are you gonna be leading a slide show and busting out 702 00:36:05,840 --> 00:36:08,719 Speaker 1: some shuffle dance at the Trump Ball? Maybe. Now, maybe, 703 00:36:08,760 --> 00:36:10,600 Speaker 1: if the people demand it, it could happen. 704 00:36:10,800 --> 00:36:11,239 Speaker 2: You never know. 705 00:36:12,239 --> 00:36:12,439 Speaker 6: Well. 706 00:36:12,520 --> 00:36:15,319 Speaker 2: By the way, go subscribe to Crockett Coffee. Buck will send 707 00:36:15,360 --> 00:36:18,480 Speaker 2: you a private dance tutorial. You can go to Crockettcoffee 708 00:36:18,520 --> 00:36:21,800 Speaker 2: dot com right now. Sign up. It's coffee that loves America, 709 00:36:22,160 --> 00:36:25,000 Speaker 2: loves America as much as Buck loves line dancing. Uh, 710 00:36:25,120 --> 00:36:27,840 Speaker 2: you go to Crockett Coffee dot com, I'll sign books. 711 00:36:27,920 --> 00:36:30,040 Speaker 2: A lot of snow here. We're gonna end up with, 712 00:36:30,080 --> 00:36:31,960 Speaker 2: like, six or eight inches of snow. I'm gonna be 713 00:36:31,960 --> 00:36:34,680 Speaker 2: sitting around watching football, signing books. They'll be going out 714 00:36:34,719 --> 00:36:37,520 Speaker 2: to you on Monday for everybody who uses code BOOK, 715 00:36:37,880 --> 00:36:41,719 Speaker 2: Crockettcoffee dot com. When we come back, Trump lawfare. We'll also 716 00:36:41,760 --> 00:36:43,959 Speaker 2: ask Andy what he thinks about the TikTok Supreme Court 717 00:36:43,960 --> 00:36:44,720 Speaker 2: case. That's next.