The FCC Commissioner is saying, quote, I wouldn't support anything but an outright ban on TikTok at this time, after we heard from the TikTok CEO in Congress. I want you to hear what FCC Commissioner Brendan Carr said on whether the app could be banned in the US. Listen, from CBS This Morning.

...Lawmakers seem to have a pretty clear line of thought on this. You know, TikTok is owned by ByteDance, ByteDance is Chinese, and under Chinese law it is accountable to the Communist Party there in that country. With that chain of events, is there anything short of an outright ban that you would support?

No, I don't see it at this point. Even the White House last week, reports were that they went to TikTok and said, look, you either have to divest Chinese government ownership, Chinese ownership, or face a potential ban. I think that's the right thing. The most important thing that the TikTok CEO had to do yesterday was build some level of trust and credibility with Congress, and in that effort, I think he completely failed. And so we'll see. Look, there's still a lot of time left. There's still a lot of issues, potential hurdles, to get legislation passed, or to get the Treasury Department, which is looking at this, to take action. But really the trust is just bottoming out for TikTok, and it doesn't look good. Look, there's one hundred fifty million Americans that are on this for sure, and so we've got to make the case to them very clearly that there's a risk here.

There's a risk. The risk is they're spying and getting all of your information. That's the risk. Kristi Noem was asked about it this morning on Fox and Friends, about the idea of could there be a bipartisan bill to ban TikTok. Here's what she said, after she banned it on all government devices in South Dakota.

Governor, the Chinese government came out yesterday and spoke and said, we are going to stop you, ByteDance, from selling off to an American company.
So anybody who thought this wasn't linked directly to the government: the government doesn't come out and prevent a sale unless they're vehemently involved. Plus, he could not deny that they are obligated to give up our information to them.

But here's the problem. Donald Trump tried to ban it, but when it comes to banning it, they're going to need a bipartisan bill, send the legislation up to the President, and have him sign it to ban it. So do you think he'll sign it, knowing that he's concerned about losing the eighteen-to-twenty-two-year-old vote, and he just did a TikTok on St. Patrick's Day?

I don't think he'll sign it, because over and over again this president has proven that he won't protect America, and he won't make the right decision, especially when there's pressure. And the reason that I think he thinks he can get away with that is because it was just weeks ago we had a Chinese spy balloon flying over this country, and we quit talking about it already. If you go to every other network, if you go on social media, even Americans in their coffee shops aren't talking about the fact that this president allowed China to send a spy balloon over our military installations and gather information, and our president didn't shoot it down until it was over the ocean, and the difficulty of recovering the information that was in it. So I think they believe that our society here in America has ADD, that we don't pay attention to anything for very long, and we forget about these important issues and move on, and that the president has learned that about us as well, and thinks if he can just change the narrative, maybe come out with some other political scandal against the other party or a former president, that the American people will just move on and forget about the fact that he's giving away our country, and he's giving it away to the enemy.
The point that she just made there about giving it away to the enemy, and the fact that he is actually using TikTok, I also think tells you a lot about how compromised this president actually is. Why on earth would you do this unless you're compromised? Why would you use TikTok when TikTok's... the government, China, came out and said, we will not allow for TikTok to be sold to an American company. Why are they saying that? Because they're basically saying to America, don't even think about it, and if you do think about it, there will be hell to pay. You better not even get close to this.

Now you look at, you look at TikTok, and there's some people that say, oh, it's just a joke. Jimmy Kimmel, by the way, mocked the TikTok probe, right, acting like, oh, it's just a cute little app and no one should really care, and it's just an app where, you know, they teach you how to cook spaghetti in a few seconds. He's laughing at the fact that Republicans and Democrats, by the way, are focusing on TikTok. Now, I think this also tells you how stupid Jimmy Kimmel is. Jimmy Kimmel is a guy that doesn't understand that this is exactly how you get spied on. If you're the Chinese Communist Party, it was a brilliant move from day one. Let's do a social media app that we can get to go viral. Half of Americans have it on their phone right now. And then once we get it on half of Americans' phones, then we own them, and we have all of their intel and everything they do and how they think, and we can influence them with the videos that we show them, and we can get them to go down certain roads, and we can have all of this intel on the American people. And then many people in the government will have this, and we'll have all of their intel, including keystrokes. One of the things that came out yesterday, admitted by the TikTok CEO, is that it does record your keystrokes. That's spying.
Imagine if you had access to every keystroke on somebody's phone, and what you could do with all that intel and all of that information. That is exactly what TikTok admitted yesterday they are doing. But then you get Jimmy Kimmel, who goes out there and talks to his liberal audience and makes fun of this, thinks it's hysterical, because he's too stupid to realize that while he's got TikTok for his show and on his phone, he's being spied on by the Chinese. And the Chinese are the ones laughing, right, because they have stooges like this in America. They're too dumb to understand that TikTok was never designed for your enjoyment. TikTok was designed to make an app that people would put on their phones quickly, that celebrities would use quickly, just like Facebook and Instagram, and then it was actually a spying apparatus the entire time. It's very clear that's what it is, and this isn't a conspiracy theory. We actually heard from the CEO of TikTok yesterday. We now know everything that I just said is actually true. So this was their plan, from the Chinese Communist government, from the very beginning. This was not about just becoming a social media company. This was about spying on the users who put it on their phones.

Did you know that in China, people don't use TikTok? Did you know that? Do you know kids in China don't use TikTok? Did you know that? A lot of people don't realize that. I wonder why. I wonder why in China they don't use it, but in America they're like, oh, please do it. And then again you have these idiot stooges like Jimmy Kimmel. Here's what he said about it on his show last night, acting like there's nothing to it.

The focus in Congress today was on TikTok, of all things.
Lawmakers from both sides seem to be headed towards banning TikTok in the United States. And TikTok, for those of you who don't know, is an app that lets you watch videos, similar to what you're doing right now. Instead of a long horizontal video, they go longways, like this, and then half of the video is footage of somebody carving soap, for reasons I don't know. And there are also buttons and filters with hearts and stuff, and they can turn you into a dog, or they could even turn you into a lion if you want, and there's text that describes what you're watching while you're watching it.

Now, everybody's laughing because they're turning Jimmy Kimmel sideways. They're making him look like a lion with some of the filters on TikTok. And then they have someone carving soap on TikTok next to him, and he's talking about how funny this is, right, and the audience is laughing. What they don't understand is this is exactly what China wants you to think TikTok is, and he's giving you the Chinese propaganda in real time, right, in real time.

And then you throw in some random people talking into tiny microphones, yelling about who knows what. And finally, there's a comment section asking what your feet look like. And the president of China watches every second of this. He's just watching, and one day he's going to use it against us. That's TikTok for you.

Now, you can hear the audience laughing, and they think it's hysterical. Right? If this wasn't such an important spying apparatus for China, then why would the Chinese government have come out yesterday saying they will not allow for a sale of TikTok to an American company? That's the Chinese government that said this: we will not allow it. And you've got these morons like Jimmy Kimmel, like, oh, isn't it funny? Oh, it's just an app. It's not just an app. Okay? It's not. It's not just an app. And he should understand that. And if he did any research at all, he'd understand it. But hey, I guess he likes it.
The Americans, half of us have it on our phones, are downloading and watching Jimmy Kimmel videos. So he's like, works for me. Right? I'm fine with it. Right? I got no problem with this.

It looks like there's bipartisan support growing in Congress for TikTok to be banned in this country. China has come out, and that's so important, it is, from a spying apparatus perspective for them. They've said they will not allow it to be sold off to an American company, because it's not about being a Chinese-based company that can be bought or sold. It's about exactly what I've said over and over and over again. This is about them spying. It's always been about them spying.

Now, Democrats, they're laughing about this, Jimmy Kimmel mocking it, saying it's not a big deal, it shouldn't actually happen. Senator Hawley also talked about TikTok and its connections back to the Biden administration. Take a listen.

The CCP has access right now to any data they want that belongs to us. And that's why I say again, this is a backdoor for the Beijing government into our private lives, into our personal security, into the heads of our children, and we've got to close that door. The only thing to do, Laura, is to ban it. You mentioned the Biden administration. You know, TikTok's not stupid. They went out and hired a PR firm that was founded by Biden administration people. They are trying to purchase the Democrat Party right now. Why are all of these corporate media outlets now rehearsing TikTok's lines for them? It's because TikTok is out there pushing this. They're out there buying influence in the Democrat Party. Let's put it to a vote. Let's just see what the tally is. Let's see who's willing to ban this thing. Let's see who's willing to ban this thing. And there's gonna be a bunch of Democrats who have been bought and paid for that are going to say, oh no, no, we need TikTok.

It's not an app, it's a spying device. It's not a social media app.
It's a spying device for the Chinese Communist Party. Now, if you're stupid enough to keep this on your phone, that's your decision. But you're pretty dumb. You're pretty stupid. If you download it, it's like, here's what I'm downloading. Okay?

More and more people are coming out now talking about TikTok and why it should be banned. I do want you to hear part of what was said in Congress about this yesterday, in the back and forth that took place when they were actually arguing with the CEO, and I want you to hear how they responded to TikTok, or how, maybe I should say, the CEO responded. The TikTok CEO had this to say about TikTok on government devices. Take a listen.

...Given what we've already established about the ability of the Chinese Communist Party to access personal user data, would you agree that no US government electronic devices should have access to the TikTok platform, as your lackluster security currently stands?

I disagree with that characterization. Like I said, I don't think any individual should be utilizing that on any government platform. I think government devices should have no social media apps, to be honest.

Mister Chew joined this hearing, as you have mentioned. So China's CEO of TikTok just said he thinks that there should be no government devices that have social media apps on them, not just TikTok. Now, that's a little bit telling, right? No, no government devices should have social media apps, to be honest. None of them.

The TikTok CEO also had this to say about spying on Americans, from Representative Dunn of Florida's Second District. Listen.

...that they were going to follow individual American citizens. I ask you again: has ByteDance spied on American citizens?

I don't think that spying is the right way to describe it.

Any TikTok or ByteDance data that is viewed, stored, or passes through China is subject to the laws of China: one-party, authoritarian state, hostile.

I love it.
I love how the TikTok CEO's answer is: I wouldn't describe it as spying. Right? That's not something that I would describe it as.

The TikTok CEO also can't say how many children have died because of dangerous challenges on TikTok. There's a lot of TikTok challenges, and a lot of people have died because of them. Listen.

...How do you determine what age they are?

We rely on age gating, which is when you ask the user what age they are. We have also developed some tools where we look at their public profile, to go through the videos that they post, to see whether...

That's creepy. Tell me more about that.

It's public. So if you post a video, you choose that video to go public. That's how you get people to see your video. We look at those to see if it matches up with the age that you talked about. Now, this is a real challenge for our industry, because privacy versus age assurance is a really big problem.

Look, look, you keep talking about the industry. We're talking about TikTok here. We're talking about children dying. Do you know how many children have died because of this? Do you have any idea? Can you tell me?

Congressman, again, it is heartbreaking...

Can you tell me how many children in America have died because of challenges like this?

The majority of people who use our platform use it for positive experiences...

That's not what I asked you. I asked you to tell me the number of US children who have died because of these challenges.

Again, the majority, the majority of people who come on our platform...

I'm not talking about the majority of children. I want to know a number.

Dangerous challenges are not allowed on our platform. If we find it, we will remove them. We think this is...

Obviously you found one today, and you removed it. We had to bring it to your attention. And I know I'm out of time. Thank you for being here.
Welcome again to the most bipartisan committee in Congress.

They don't care about your kids. They care about collecting data. He can't say how many children have died because of dangerous challenges on TikTok. And this is a CEO who is sitting there basically laughing in our faces at how stupid we are, that half of Americans have this on their phone.

One other thing I want to play for you. The TikTok CEO also did confirm yesterday that the Chinese engineers of the Chinese Communist government do have access to, quote, global data, meaning they can spy on you today.

Do ByteDance employees in Beijing have access to American data?

Congressman, we have been very open about this. We have relied on global interoperability...

You have access to American data?

Congressman, I'm answering, if I could just have a bit of time. We rely on global interoperability, and we have employees in China. So yes, the Chinese engineers do have access to global data.

Have access to global data. There it is. He's admitting it. And after this hearing, China was so upset they came out and made it very clear that in China, they will not allow it to be sold to an American company. Why is that? Because it's not a social media app. It is a spying app. That's what it is.

Jamaal Bowman has come out saying there's no evidence of security concerns from Chinese spying on TikTok. Yeah, saying that on NBC News. Democrats are now doing the bidding of the Chinese Communist Party. He said that yesterday afternoon, a Democrat claiming that there isn't any evidence of TikTok posing a security concern through Chinese espionage on the app. That's how they're describing it. Democrats doing this, and of course Democrats doing the bidding of the Chinese Communist government, saying, wow, is there really a problem here? Listen.

Welcome back. While the consensus on Capitol Hill this morning was for a crackdown on TikTok over its ties to the Chinese government, the feeling is not universal.
New York Congressman Jamaal Bowman told NBC News yesterday: there are many apps on our phones right now that are Chinese apps, and so the idea that, oh, TikTok is the boogeyman, it's just part of a political fearmongering that is happening. Joining me now is New York Democratic Congressman Jamaal Bowman. Congressman, I appreciate you coming on. So let me start with the first question there. You heard from the TikTok CEO. Do you think he did a good job defending the app, or do you think he could have done a better job defending the app?

Listen, it's tough to sit there and answer hundreds of questions about any topic, even if you are an expert. But it was good for him to come. It was good for him to be in the hot seat. It was good for him to answer the questions that he had to answer. But the part of his testimony that I want to highlight, which I've been talking about all week, is: we're targeting TikTok for safety and security reasons, particularly national security reasons, but we're not talking about all of the social media apps out there that have access to our data, often without our consent or understanding of where the data is going or how it works. And our data is now being sold, has always been sold, to the highest bidders, both in foreign nations and here in our country. So for me, it's about having a larger conversation about the harms and benefits of social media, and to finally work in a bipartisan way for national legislation around safety and security on all social media platforms.

All right, let me ask you this. Why shouldn't the government be concerned about foreign ownership of companies that target Americans on the consumer level, if that country is an antagonist to our way of life?
So first, I haven't seen any evidence that shows me, as a member of Congress, that China or any other country has been a bad actor in this space in this way, except Russia's interference in our twenty sixteen election. And that happened on Facebook, and there was no conversation at that point about banning Facebook, selling Facebook, breaking up Facebook.

People deleted Facebook.

Correct. Right, but still not enough. But now we have this consistent fearmongering, starting with the Republican Party, and now many Democrats are sort of catching on, around China and around their access to our data, but just this overall sort of global competition with China in all its forms. And you know, that's how Republicans govern. That's how this place works way too often: around fear. Let's take a step back and have a comprehensive conversation about social media, about addiction, about monopolies, about time on the apps, and about safety and security of our data.

So let's talk about, excuse me, what is in a data privacy law. What kind of teeth would you like to see in a data privacy law that you think would assuage many people's concerns about TikTok specifically? What can we do in a national law that would answer our TikTok concerns?

First of all, what in the world is happening with my data as we speak? What are you doing with it? Where is it? Where does it go? Who is it being sold to? We need that education on these platforms. We have no idea. Our data is just out there somewhere, and it's being used for everything from facial recognition software to the Catholic Church buying data from the app Grindr to search for gay priests. This was reported by the Washington Post. So all of this is happening. All of this data is out there, and we have no knowledge of it. First, we need knowledge. Second, we need consent. Do I consent to you using my data in the ways you are, or not? And if I don't, then don't use it.
The problem is, the entire advertising infrastructure for Facebook and other American companies is built on this data. So if that data goes away, ad dollars go away. If those ad dollars go away, these companies are broke. And they've been allowed to build oligarchies over this time. And now we're focused on TikTok and not talking about the others.

I love how this Democrat is literally doing the bidding of the Chinese Communist government. Sitting there, he's like, oh, it's not that bad. Oh, well, we need a bigger conversation on this, right? Well, and it's not just TikTok. He's like, Facebook did it too. Well, Facebook was an American company, not a Chinese Communist company. Let's not forget that. And he's like, Facebook did this too. No, they didn't. No, they didn't. They did not. That's not what happened. Okay? That's a lie.

So remember that company FTX? It was a cryptocurrency Ponzi scheme. And the guy who was running it just happened to be the second largest donor to the Democratic Party in the midterm elections, and then all of a sudden everything went belly up, just three days after the midterm elections. You remember that story, you know, just from a couple of months ago. Have any of the Democrats that looked the other way on cryptocurrency, have any of them, have any of them been arrested? Yeah? Have any of them had to give back their money? No. Did any of the candidates? I mean, he was the second biggest donor, behind George Soros, to the Democratic Party. No.

But what I can tell you now is Lindsay Lohan, remember that child actress from The Parent Trap, and Jake Paul, the kid YouTube star who's now a boxer, have both now been charged for alleged illegal cryptocurrency schemes. So we're going after childhood stars, but not the Democrats who cashed the checks from a corrupt FTX that they refused to actually look at or regulate. Seems about right in America today.
Actress Lindsay Lohan and, quote, influencer Jake Paul have been charged for allegedly partaking in an illegal crypto scheme, along with six other celebrities. The charges come from regulators at the US Securities and Exchange Commission. And this is what the SEC is working on today. Not what's happening out there with Silicon Valley Bank or any of the other banks. No, no, no. Not looking at all the Democrats taking the money. No, no, no. The SEC is now going after childhood stars.

The charges from the regulators center on crypto asset entrepreneur Justin Sun and three of his companies: Tron Foundation Limited, BitTorrent Foundation, and Rainberry Inc., formerly known as BitTorrent. A statement from the SEC said that Sun and his companies allegedly participated in the unregistered offer and sale of crypto asset securities, and also fraudulently manipulated the secondary market for XRX, excuse me, TRX, through an extensive wash trading scheme. Wash trading reportedly involves the simultaneous or near-simultaneous purchase and sale of a security to make it appear actively traded without an actual change in beneficial ownership. And they were also charged for orchestrating a scheme to pay celebrities to tout these companies, to tout TRX and BTT, without disclosing their compensation.

So this is who our justice system is going after now. Both Lindsay Lohan and Jake Paul were charged with illegally touting TRX and BTT without disclosing that they were compensated for doing so, and the amount of their compensation. The six other celebrities alongside the two were DeAndre Cortez Way, Austin Mahone, Michele Mason, excuse me, Kendra Lust, or Michele Mason, Miles Parks McCollum, and Ne-Yo. If you don't know these names, I'm not surprised. That means you probably have a job and you actually go to work. Lindsay Lohan and Jake Paul both reportedly agreed to a settlement payment without admitting guilt.
With the exception of Cortez Way and Mahone, the celebrities charged today agreed to pay a total of more than four hundred thousand dollars in interest and penalties to settle the charges without admitting or denying the SEC's findings.

SEC chair Gary Gensler said in a statement that the case represents the kind of high risk investors face when crypto asset securities are offered and sold without proper disclosure. As alleged, Sun and his companies not only targeted US investors in their unregistered offers and sales, generating millions in illegal proceeds at the expense of investors, but they also coordinated wash trading on an unregistered trading platform to create the misleading appearance of active trading. Sun further induced investors to purchase these stocks, or I should say, these cryptos, by orchestrating a promotional campaign in which he and his celebrity promoters hid the fact that the celebrities were paid for their tweets that they were sending out.

So, to be clear: you run a massive Ponzi scheme, and Democrats are taking massive checks from your Ponzi scheme, and you're the second biggest donor to the Democratic Party in the midterm elections, and if you're any one of those Democrats, you're fine. But we're gonna go after child stars now and get them to pay little dinky fines. And that's how we're cleaning up the cryptocurrency mess. This is not a joke. It's embarrassing, but this is what they're doing.

I want to remind you, you can download the Ben Ferguson Podcast. We do our podcast six days a week, so make sure you download the Ben Ferguson Podcast now.