Speaker 1 [00:00:04]: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So I had been working on a totally different episode for today, but then yesterday the US Senate did what I originally said I thought they wouldn't do, which is that they passed a bill mandating that ByteDance, the Chinese company, must sell its subsidiary TikTok to some other entity that is not identified as a foreign adversary of the United States, or else face a nationwide ban. That bill is now going to President Biden's desk for his signature, which seems to be pretty much forthcoming. I mean, he said he was going to sign it into law. In fact, it may have already happened by the time I get this episode published, and that means this bill will in fact become law. This is a huge moment in technology, in business, and in politics, both domestic and international, and I figured it would make sense to go through and talk through all of it.

Speaker 1 [00:01:17]: So first up, let's talk about ByteDance and TikTok, because we've learned some things about ByteDance in particular that have really shaped perspectives in very recent weeks. Let's start off with a guy named Zhang Yiming. He got his start working for a travel website in China called Kuxun, and he spent a brief period working for Microsoft, but then ventured out into the entrepreneurial landscape himself. The traditional story is that he created various apps in an attempt to find one that really stuck, and one of those was a real estate app, not too different from an app like Zillow, right? But this one was called 99Fang. That app was not a success; it went out of business within a couple of years. But it would end up being really important to the TikTok story for multiple reasons.
Speaker 1 [00:02:20]: For one thing, engineers at 99Fang developed tools meant to help with search, as well as to recommend real estate listings based on user information. A lot of that sounds fairly standard for a real estate app, but the story goes that Zhang Yiming decided a recommendation algorithm that hadn't worked out for real estate might work for something like serving someone up news. You would learn about a user's behavior, their interests, the things that are important to them, and you could aggregate news for those users and match people to content. Essentially, he was thinking about how best to apply recommendation algorithms. How do you judge what someone likes? And then how do you keep them at your app by serving them the stuff that makes them stay there? How do you make that happen?

Speaker 1 [00:03:16]: So to that end, again as the traditional story goes, Zhang Yiming launched a news aggregator app called Toutiao, and he and his former college roommate Liang Rubo decided to found a company that would serve as the umbrella corporation for the various apps they were developing. This would become ByteDance. They sought seed money from investors, and that's where a company founded by an American politically conservative megadonor enters the picture. We're at around twenty twelve at this point; 99Fang really started around two thousand and nine, and ByteDance starts around twenty twelve. Now, the conservative megadonor is a fellow named Jeff Yass. And you know, I get it. I know tons of y'all out there hate politics in your tech stories, but unfortunately this time we have to include politics, because it plays a huge part in what we're talking about. I mean, we're talking about a ban on TikTok, a legal ban, so we have to talk about politics. And Yass in particular has had a really huge impact on the TikTok story.
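[Editor's aside: before we get to Yass's company, here is a minimal sketch of the interest-matching idea described above. It's a toy illustration under assumed data shapes, with hypothetical field names, and it is not ByteDance's actual algorithm, which has never been published. It just shows the basic loop: profile what a user engaged with, score candidates by overlap, serve the top matches.]

```python
# Toy content-based recommender: rank items by how much their tags
# overlap with a user's demonstrated interests. Illustrative only;
# all names and data shapes here are hypothetical.
from collections import Counter

def build_profile(engaged_items):
    """Tally the tags of everything the user has engaged with."""
    profile = Counter()
    for item in engaged_items:
        profile.update(item["tags"])
    return profile

def score(item, profile):
    """Higher score means more overlap with the user's interest profile."""
    return sum(profile[tag] for tag in item["tags"])

def recommend(candidates, engaged_items, k=3):
    """Return the k candidates that best match the user's profile."""
    profile = build_profile(engaged_items)
    return sorted(candidates, key=lambda c: score(c, profile), reverse=True)[:k]

engaged = [{"tags": ["housing", "finance"]}, {"tags": ["finance", "news"]}]
candidates = [
    {"id": 1, "tags": ["sports"]},
    {"id": 2, "tags": ["finance", "news"]},  # strongest overlap
    {"id": 3, "tags": ["housing"]},
]
print(recommend(candidates, engaged, k=2))  # item 2 ranks first, then item 3
```

Real recommenders layer engagement signals (watch time, rewatches, shares) and learned models on top of a match-and-rank loop like this, which is what makes them so much stickier. Now, back to Yass.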
Speaker 1 [00:04:25]: So Yass co-founded an investment company called Susquehanna International Group with some of his fellow students at the State University of New York at Binghamton back in the nineteen seventies. This company would become wildly profitable. It would become one of the massive investors in the US stock market, and it would turn Yass into a billionaire in the process. So here we are in twenty twelve, and Susquehanna International Group invests in this fledgling Chinese company called ByteDance, ultimately spending enough to purchase a fifteen percent stake in the company. That's a healthy chunk of ownership, and today it represents a truly massive investment, considering the stratospheric altitude that ByteDance's valuation would reach. So Yass is heavily invested in ByteDance, and some argue that the majority of Yass's estimated wealth is tied up with this investment. He clearly has a huge incentive to protect ByteDance; that's where his billions of dollars are.

Speaker 1 [00:05:31]: Moreover, The New York Times published a piece not too long ago about some court records that were accidentally made public. These records relate to a lawsuit in which some former 99Fang engineers allege that their work was repurposed to power ByteDance's various apps, most notably TikTok, and that they were not fairly compensated for the use of their work. The court records show that Susquehanna had invested in 99Fang, so Susquehanna's involvement, and thus Jeff Yass's involvement, with Zhang Yiming, TikTok, and ByteDance goes much further back than was first known. It goes back to two thousand and nine. So Susquehanna apparently invests in 99Fang, but that app, like I said, went belly up. And then, according to these court documents, Susquehanna essentially created ByteDance and put Zhang Yiming in charge of it.
Speaker 1 [00:06:31]: So, in other words, the court records say the origin story for ByteDance is not what has been generally believed, and that Susquehanna wasn't just a passive investor in this Chinese company but actually took a very prominent role in the creation of ByteDance itself. Anyway, those court records were never meant to be made public. They should not have been made public. So the court subsequently removed them from public access, but not before The New York Times was able to review them and publish an article about them.

Speaker 1 [00:07:03]: So now let's move on. ByteDance begins to form in twenty twelve. It's based in China, and we're still quite a ways out from TikTok at this point. TikTok's own history is very convoluted. For one thing, it's really the story of a few apps merging together, Voltron-style, to create what we know of as TikTok today. One of those apps was called Musical.ly, and some of y'all might even remember it. I remember it was really big with some of my actor friends back in, you know, twenty fifteen or so. This app was founded by Luyu Yang and Alex Zhu in China, and originally they wanted to create a totally different app, an educational tool, something that could deliver short, informative lectures and such to users. But they found that folks didn't want, you know, learning, so they decided to change direction and made an app designed for very short form video content. You could create and share short form videos, or you could browse them. These would be less than a minute long, somewhere between fifteen and sixty seconds, and the app included the ability for users to post videos with specific music clips. They had a library of music clips, so it was kind of like adding a soundtrack to your video content.
Speaker 1 [00:08:30]: And this led to people using Musical.ly to post lip syncing videos. That's really where the app took off. It also coincided with a popular lip syncing program that was airing in the United States at the time. So again, when I say it got really popular, the app really got popular in America. In China, it actually didn't move the needle very much at all. So with the American market going bananas for lip syncing videos, the founders decided to establish an office in the good old US of A, and they did so in Santa Monica, California. This is kind of where the heart of TikTok being an American company comes in. So they create this office in Santa Monica, the app continues to grow incredibly popular, and at one point an estimated half of all US teens were using Musical.ly. It became the number one app on the iOS App Store, so it was a really big deal.

Speaker 1 [00:09:21]: Meanwhile, back in China, ByteDance was working on kind of a similar app, not the exact same thing as Musical.ly but along the same lines, and they called it Douyin, which, from what I understand, means "shaking sound." A cursory description of Douyin makes it sound an awful lot like TikTok. It's an app that lets users create and browse short form videos, often ones that include popular music in the background. Douyin, though, also allows people to live stream, which you can do on TikTok now too, or, you know, watch a live stream, not just do one yourself. And Douyin also has some incredibly advanced features that are not in TikTok, like being able to search videos based on the face of someone appearing in the video you're watching, or being able to purchase goods featured in one of those videos with just a couple of presses on the screen, or even to book a stay at a resort or hotel featured in a video.
Speaker 1 [00:10:22]: And as you would imagine, being a Chinese app, it also has incredibly tight restrictions on the type of content you can post there. Essentially, anything that would call into question the authority or legitimacy of China's government is strictly forbidden. As it would later turn out, that kind of stuff would sort of be forbidden on TikTok as well. We're going to have to come back to that in a bit.

Speaker 1 [00:10:44]: So in September twenty seventeen, ByteDance launches TikTok, which is a sister app to Douyin. It's the international version of Douyin, and as such it doesn't have all the features, and it doesn't have all the restrictions, that Douyin has. But the company had also become aware of Musical.ly. There was this clear market in America that was just waiting for exploitation, and Musical.ly had already tapped into it, right? It makes way more sense if we can get Musical.ly; then we get access to those users, and we don't have to build up our audience from scratch. So ByteDance, through its subsidiary Toutiao, makes an offer to acquire Musical.ly in November twenty seventeen. Now, the price tag was not made public, but estimates put it somewhere in the neighborhood of eight hundred million dollars to one billion smackeroos, which is a princely sum indeed. Now, at the time, Musical.ly released a statement that said the company would continue to operate independently in the United States.
Speaker 1 [00:11:46]: While, in the words of a TechCrunch article, quote, "taking advantage of the distribution and tech that its new parent offers," end quote, in reality things would get a bit murkier. ByteDance intended to leverage the recommendation algorithm it had developed for Toutiao, possibly by taking it from 99Fang, although the company would dispute this, and incorporate that recommendation algorithm into Musical.ly in order to supercharge it and tap into that massive audience. And they decided it made more sense to fold Musical.ly into the company's own efforts to launch TikTok. So ByteDance made the call to merge these apps together, to drop the Musical.ly name in the process, and to use the name TikTok instead. That was a pretty gutsy move in America, right? Because Musical.ly was young, but it was already firmly established as a brand, at least for its target audience, so making a change like that was risky. And really, you could say TikTok was the product of three different apps: you had Musical.ly, you had Douyin, and you had the original version of TikTok, the international variant of Douyin. ByteDance would make great use of its recommendation algorithm on the new app, and this is the algorithm that the engineers who worked for 99Fang claim came from their work, a claim which, again, ByteDance continues to dispute, as does Susquehanna International Group; that, again, was found in those court records. So TikTok essentially becomes a thing officially in early twenty eighteen.

Speaker 1 [00:13:29]: When we come back, we'll shift over to politics again for a bit, because unfortunately that's what this whole mess revolves around. But first, let's take a quick break to thank our sponsors.

Speaker 1 [00:13:54]: All right, we're back. Let's plunge on into the political bit of this story, and that all gets started in China itself.
Speaker 1 [00:14:03]: So back in the mid twenty tens, China started to pass laws meant to tighten up national security in various ways, and that included stuff like, you know, cybersecurity laws and cracking down on digital espionage, that kind of stuff. And in twenty seventeen, China passed a law called the National Intelligence Law of the People's Republic of China, at least that's the Western translation. The law covers a lot of ground, actually, but the bits that really stick out, at least with regard to the TikTok mess, are Articles Seven and Ten. Article Seven states that all Chinese citizens, companies, and organizations are to cooperate in intelligence gathering efforts, and that seems to mean that if the Chinese government wants information on something and a Chinese company has access to that information, the company is obligated to hand it over to the government. Article Ten says that this goes for companies that are owned by Chinese interests but operate overseas, too. So essentially, the law says it doesn't matter whether the business is conducted outside of China or not. If the company conducting it is ultimately a Chinese company, then it is subject to this law, whether the business is happening inside China or someplace like the United States. Now, how, and how frequently, the Chinese government enforces this law isn't exactly clear, but the implication worries a lot of folks internationally, for good reason. You know, Chinese companies make a lot of stuff, and a lot of that stuff ends up in complex, critical systems, like telecommunications infrastructure, for example.
Speaker 1 [00:15:46]: So for that reason, countries like the United States began to reconsider the wisdom of allowing telecommunications companies to use Chinese components in their various systems, out of fear that those companies could end up complicit in spying on America on behalf of China, with or without the telecommunications company's knowledge or active cooperation; just by using these components, they could be enabling snooping. So we in the States started to see calls for telecommunications companies to remove and replace these Chinese components with stuff from other suppliers, and that got underway.

Speaker 1 [00:16:23]: All right, so China passes a law that seems to say any Chinese company, et cetera, et cetera, has to comply with government requests to gather and share intelligence. ByteDance is a Chinese company, and TikTok is a ByteDance subsidiary. So our seed has been planted. Now let's pop on over to the old US of A. In twenty eighteen, the US government, led by President Donald Trump, got involved in a protracted trade war with China, and the issue at hand was really a trade deficit. To go into the history of that would require a whole lot more time and expertise than I have; I'd just do a bad job of describing it. But we can summarize it, sloppily I might add, by saying that for a few decades now, the United States' imports from China have drastically exceeded its exports to China. We're bringing in more stuff from overseas than we are selling to overseas customers. So the worry is that ultimately this could have a huge negative impact on the American economy and jobs, as well as national security. Whether that actually is the case is a lot more complicated, and I frankly would go dizzy trying to suss it all out. So Trump's administration sought to reverse the trade deficit trend, and in order to do that, it levied tariffs on Chinese imports.
Speaker 1 [00:17:48]: And again, going into whether or not this was actually a good idea would require a lot more time and knowledge than I possess, and I don't know ultimately if it was a good or bad idea. But I do know there were a lot of folks who are at least called experts in the field who debated that topic amongst themselves, so I don't know that there's a clear answer. Anyway, as you can imagine, the Chinese government was not thrilled about this move. It responded by saying that Trump's actual goal wasn't to correct some sort of unbalanced trade playing field but rather to directly attack China's economic growth. They're saying: oh, you're not trying to fix things, you just don't want us to succeed, so you're attacking us. Which I'm pretty sure is not a really sincere argument, but that was the one put forth. Things got really complicated from an economic standpoint, but this provides a pretty good backdrop for what would come next.

Speaker 1 [00:18:43]: So we get up to the summer of twenty twenty. This trade dispute has been going on for two years, and President Trump says he wants to ban TikTok. He claimed he would do this by executive order, and in fact, he did issue an executive order to force ByteDance to either sell TikTok off or face a ban. In fact, for a while he was just saying he was going to ban it. That was it. What, no sale? It was just going to be a ban. ByteDance approached Microsoft about potentially buying the US portion of the TikTok business, to kind of spin that off so Microsoft could take control of it. Trump initially rejected that move, but Microsoft would remain part of the conversation about a potential buyout for about another month. This was happening in August of twenty twenty, and things would move really fast; by September twenty twenty, things got real messy. So again, Trump says he's going to ban the existence of TikTok in the United States.
Speaker 1 [00:19:43]: The first time he said that was really toward the end of July twenty twenty, but then literally just a few days later, Trump reverses his stance and says, you know what, I would allow for the sale of TikTok to an American company. Maybe we don't ban the app; maybe we do allow it to get sold, as long as it's to an American company. And he establishes a deadline of September fifteenth, twenty twenty: if a suitable buyer did not emerge by that point, he would shut TikTok down with this ban by executive order.

Speaker 1 [00:20:11]: Now, why he wanted to ban TikTok, that's a lot harder to say, because it's hard to parse what he was trying to convey. He's difficult to understand at points. Generally speaking, it seemed that Trump believed that although TikTok belonged to a Chinese company, it was operating in the United States, and that was the issue, not anything from a national security point of view. In fact, at one point he said, quote, "It's a great asset, but it's not a great asset in the United States unless they have approval in the United States," end quote. I don't know what that means. I do not know what he was trying to communicate at this point, but he did seem to feel that the United States was owed some money for the success of TikTok. Based on what I've read and watched from right around that time period, his argument largely boiled down to: hey, Americans are what pushed TikTok to incredible success, so America deserves some of that cheddar. Now, I don't really think that argument holds much water. You know, that's like saying to someone that because they were so successful at the company, the company deserves all the credit. I don't think that really makes sense, but that is what I gathered from the material; I mean, that's based upon what Trump himself was saying. So again, this was not being conveyed in the context of national security. It was being conveyed as: they got big, so I want the money.
Speaker 1 [00:21:43]: Microsoft and Oracle were both in talks to acquire the US based operations of TikTok from ByteDance, and for a while it really did look like something along those lines was going to happen. But then TikTok filed some legal challenges to this executive order. The arguments included an accusation that Trump's administration had failed to produce any evidence that TikTok was actually a threat to national security. They're saying: yeah, you claim the conditions exist for this to be a threat to national security, but you've failed to point at any actual evidence that it's happening. Just because something could happen doesn't mean you have to ban it, right? Like, any one of you out there could get in your car and purposefully drive into crowds of people, but that doesn't mean the government has the right to come in and say, you know what, we revoked your driver's license because you could do this, even though you haven't done it. That's the same sort of thing TikTok was saying: hey, you can't ban us on the basis of a threat to national security if you can't prove that we're a threat to national security. TikTok would also bring up other arguments, saying that this was an unconstitutional move in the first place, that it violated First and Fifth Amendment rights. The deadline for the ban would get moved back a couple of times throughout this, and TikTok successfully petitioned a court for an injunction against the order in late September twenty twenty, so everything was on hold.

Speaker 1 [00:23:14]: Then we get to early November twenty twenty, when the US held its presidential election, and President Trump lost that election, a matter about which there was some debate. Anyway, this obviously redirected political momentum. Right? The president who had issued this executive order was no longer going to be president after January twentieth, twenty twenty one, so a lot of the pressure that had been on TikTok was alleviated.
Speaker 1 [00:23:47]: Because now the guy who had made the order wasn't going to be in charge anymore. Biden would ultimately revoke that executive order with an executive order of his own; that happened in the summer of twenty twenty one. But he did call for an investigation into TikTok to actually determine whether, in fact, there were any threats to national security present. And things kind of quieted down for at least a couple of months. Like, there was this investigation that was going to go on, but otherwise things calmed down.

Speaker 1 [00:24:18]: But then we get to the end of twenty twenty one. That's when we got the Facebook whistleblower event, where documents from within Facebook, now Meta, indicated that a lot of shady stuff was going on at a corporate level, and it generally raised concerns about social platforms across the spectrum, which included TikTok. So leaders were starting to raise some pretty tough questions about how these companies collect, use, and protect, or fail to protect, user information, and the specter of potential Chinese surveillance resurfaced.

Speaker 1 [00:24:54]: Now, over at TikTok, efforts were underway to distance the company from ByteDance, at least in surface appearances. Those efforts got the nickname Project Texas. The whole idea was that TikTok's US operations would flow through servers owned and operated by Oracle, and Oracle's an American company based in Texas, thus the nickname Project Texas. This project had its roots in the days when Oracle was in consideration to acquire TikTok's US operations from ByteDance. That never happened, but the move to silo TikTok's US data from the rest of its international operations did continue, and the effort really ramped up in twenty twenty two. However, subsequent reporting by Fortune revealed that several former TikTok employees claimed the efforts were, quote unquote, largely cosmetic.
Speaker 1 [00:25:49]: One former data scientist at TikTok, named Evan Turner, said that TikTok regularly received direction from ByteDance while outwardly posing as largely independent of its parent company. So to outward appearances they were saying, no, we're completely distinct, we are US based and US operated, but behind the scenes, ByteDance was very much pulling the strings. Turner claimed that he regularly had to send reports to ByteDance, and that these reports included spreadsheets with tons of American user data on them, primarily, at least apparently, in an effort to refine the recommendation algorithm to drive better engagement. So, you know, that's not great to hear, but it's not quite a national security threat, right? If that's in fact all that was going on, this was an effort for engineers back at ByteDance to make even more addictive recommendation algorithms. That's not the same thing as collecting information for the purposes of surveillance. But we don't know if that's all that was going on. Actually, we don't even know if that was going on at all; that's been under dispute as well.

Speaker 1 [00:26:54]: TikTok's CEO is a man named Shou Zi Chew, who originates from Singapore, which is not, despite what some American politicians appear to believe, part of China. He has appeared before Congress and claimed that TikTok is as American as apple pie, which I think is more Canadian than American, but whatever. And he has maintained that TikTok, the company, does not work closely with ByteDance, so there's definitely conflicting information going on here. He previously served as chief financial officer for ByteDance, so that also raises concerns. Right? Like, yeah, he's from Singapore, he's not a Chinese citizen, but he was also the CFO of ByteDance, a Chinese company that does reserve a seat on its board for a member of the Chinese Communist Party. That in itself is not unusual for big companies in China.
Speaker 1 [00:27:46]: So the fear is that perhaps he's hiding the real relationship between the parent company and the subsidiary. In fact, I mentioned earlier that law about intelligence in China, and I mentioned that it has sections saying Chinese organizations have to gather intelligence, whether they operate domestically or abroad. But they are also supposed to do so in secret. They're not supposed to admit or reveal that it's happening at all. So you could make the argument that, well, of course he says TikTok doesn't gather information on behalf of ByteDance, because he can't say otherwise; there's this law that says he cannot reveal it, and even though he himself is not a Chinese citizen, he's the CEO of a company that is owned by a Chinese company. So it gets very complicated. But that's not exactly hard evidence of wrongdoing, right? You can't say for sure that because something is potentially possible, it's really plausible, or that it really is happening. And TikTok representatives have said that the various allegations made by former employees and reported on in Fortune all related to times before Project Texas was fully operational, and that those complaints are no longer relevant, that none of that actually applies to the TikTok of today. So you've got that too.

Speaker 1 [00:29:12]: All right, we're going to have to take another quick break. When we come back, I've got a lot more to say about the things that escalated and eventually led to this legislation that could potentially ban TikTok. But first, let's take another quick break.

Speaker 1 [00:29:35]: Okay, we're back. So in late twenty twenty two, we had another scandal involving TikTok and ByteDance.
Speaker 1 [00:29:43]: Forbes published a piece that said ByteDance had used TikTok to spy on Forbes journalists. ByteDance essentially had this list of journalists who were covering ByteDance, and they decided to use TikTok to figure out where those journalists had been going. They wanted to identify the journalists' IP addresses and then track them, supposedly in order to figure out who the journalists were meeting with inside ByteDance, to find the source of internal leaks from within the company. And they were using TikTok as the data gathering tool to do this, which is kind of the nightmare scenario, right? Here we have the smoking gun of ByteDance using access to TikTok to track people, including American citizens. So Forbes publishes these claims, and ByteDance responds by firing, or forcing to resign, the people who were connected to the tracking efforts within the company; the guy who supposedly oversaw the efforts and his direct boss both left the company. Now, whether this was an indication that the decision to track journalists was actually localized to a lower spot on the corporate hierarchy, and thus executive level, you know, C suite level leaders had no knowledge of the project, or whether this was a cover up where people were fired in order to preserve plausible deniability, I don't know. But ByteDance's leadership was essentially saying: this was unacceptable, the executive leadership did not approve it, it was done by lower level management, and we're dealing with it internally to make sure it never happens again. Whether that was sincere or not, it certainly did not win ByteDance or TikTok any confidence points in the United States.

Speaker 1 [00:31:41]: Now, we started to see movement in the US as various government agencies, college campuses, and some other organizations began to ban TikTok to some extent.
Speaker 1 [00:31:51]: With government agencies, the usual move was to forbid staff from installing TikTok on government owned devices, though employees could still have TikTok on their personal devices. Personally, I think this is entirely reasonable. I think it's the same as if you get a company issued laptop and they say, hey, don't go installing Stardew Valley on this thing, because that's not what it's for. And I'm like, that's totally reasonable. It's not your computer; it belongs to the corporation, or, in the case I'm talking about here, to the state or federal government, and therefore they do have some authority over what should and should not be installed on it.

Speaker 1 [00:32:30]: Over at college campuses, the general approach to banning TikTok was to block TikTok traffic across college networks. So you could still access TikTok through cellular data or through some other Wi-Fi network; you just couldn't do it through college operated Wi-Fi. So there were, you know, limits to these bans.

Speaker 1 [00:32:51]: In May twenty twenty three, the state of Montana became the first in the United States to ban TikTok outright. The state government argued that TikTok represented a threat to the private data and security of Montana's citizens, and that the Chinese government in general was a huge threat to national security. The law stated that app stores continuing to offer TikTok to Montana citizens could face a fine of up to ten thousand dollars per day of violations, and it was due to take effect on January first, twenty twenty four. However, in November twenty twenty three, a federal judge blocked the law, citing concerns that Montana could be violating the First Amendment of the US Constitution. The judge argued that it was not the state of Montana's responsibility to determine matters of national security, that being a federal concern, not a state level concern, and further, that the move in Montana appeared to have a greater connection to xenophobia than to any legitimate threat.
Speaker 1 [00:33:48]: There was, again, no evidence shown that TikTok actually did represent a threat, just this kind of overall vague allegation. The First Amendment argument is one that is likely to come up again, you know, assuming that Biden actually does sign this into law, which I believe he will. I mean, as soon as it was announced that the bill had passed the Senate, he said he would sign it into law today. So again, by the time you hear this, that might already be the case. The American Civil Liberties Union, or ACLU, maintains that a ban on TikTok amounts to a violation of the First Amendment. And I just can't imagine a world in which TikTok kind of rolls over and says, oh well, we had a good run. That's not going to happen. Obviously there's going to be a fight on this, to force further decisions from the judicial branch about whether or not this law is constitutional.

Speaker 1 [00:34:40]: Anyway, the political move to ban TikTok grew in momentum in the back half of twenty twenty three and into twenty twenty four. The House of Representatives passed a bill back in March that would force ByteDance to divest itself of the US operations of TikTok within six months or face a national ban, which would essentially amount to the United States government forbidding app stores and communications networks from allowing TikTok traffic in the US. It's kind of a grander scale version of what we were seeing on college campuses, just across the entire nation. That would not only stop new downloads of the app but also all TikTok transmissions in general in the US. Though I suppose you could use a VPN to connect to a server in another country to get around some of that, but it's a bit of a hassle, and I don't know what that would do to your recommendations. Honestly, I don't know if the recommendation algorithm varies from country to country.
586 00:35:31,800 --> 00:35:35,800 Speaker 1: But anyway, at the time when the House first passed 587 00:35:35,800 --> 00:35:38,520 Speaker 1: this bill, I thought that the US Senate would be 588 00:35:38,600 --> 00:35:42,800 Speaker 1: unlikely to follow suit. The deadline meant that a rapidly 589 00:35:42,840 --> 00:35:46,040 Speaker 1: approaching ban would be really big news at the same 590 00:35:46,160 --> 00:35:49,480 Speaker 1: time as the election cycle, and a TikTok ban is 591 00:35:49,920 --> 00:35:54,080 Speaker 1: really an unpopular move among a significant population of US citizens, 592 00:35:54,320 --> 00:35:57,439 Speaker 1: you know, folks who could potentially vote, assuming that they're 593 00:35:57,680 --> 00:36:01,360 Speaker 1: eighteen or older, or not vote, like just choose not 594 00:36:01,400 --> 00:36:03,520 Speaker 1: to vote because they're ticked off. And I think a 595 00:36:03,520 --> 00:36:07,120 Speaker 1: lot of senators worried that supporting a bill that would 596 00:36:07,160 --> 00:36:09,520 Speaker 1: be, you know, coming into effect right around that time 597 00:36:09,520 --> 00:36:13,040 Speaker 1: would be political suicide with elections coming up this November. 598 00:36:13,280 --> 00:36:17,520 Speaker 1: But then US Senator Maria Cantwell proposed that the House 599 00:36:17,600 --> 00:36:21,359 Speaker 1: amend its approach and extend the deadline to one year 600 00:36:21,640 --> 00:36:24,200 Speaker 1: instead of six months. Now, the reason given was that 601 00:36:24,239 --> 00:36:27,319 Speaker 1: one year is a more reasonable timeline for ByteDance 602 00:36:27,400 --> 00:36:29,560 Speaker 1: to secure a buyer for the company, which is true. 603 00:36:29,600 --> 00:36:32,000 Speaker 1: I mean, six months is not much time, but some 604 00:36:32,200 --> 00:36:35,080 Speaker 1: cynics were suggesting that another reason for this change would 605 00:36:35,120 --> 00:36:38,240 Speaker 1: be that it pushes the effects of the bill out beyond 606 00:36:38,360 --> 00:36:42,560 Speaker 1: the election cycle. Whatever the justification, the US House voted 607 00:36:42,600 --> 00:36:46,279 Speaker 1: to extend that deadline, and then the House lumped the 608 00:36:46,360 --> 00:36:48,880 Speaker 1: TikTok bill in with a bunch of other measures that 609 00:36:48,960 --> 00:36:52,720 Speaker 1: were meant to provide aid to Ukraine and Israel. The 610 00:36:52,760 --> 00:36:55,920 Speaker 1: bundle of bills then passed to the Senate. The Senate 611 00:36:56,000 --> 00:36:59,280 Speaker 1: voted in favor of them, seventy nine in favor, eighteen opposed, 612 00:36:59,600 --> 00:37:01,719 Speaker 1: and Biden said he would sign the bill into law 613 00:37:01,760 --> 00:37:05,640 Speaker 1: today, and the clock started ticking. So some other things 614 00:37:05,680 --> 00:37:07,960 Speaker 1: we do need to consider before we wrap up here. 615 00:37:08,160 --> 00:37:10,840 Speaker 1: While much of the rhetoric around TikTok has centered on 616 00:37:11,000 --> 00:37:14,719 Speaker 1: this threat to national security from an intelligence gathering perspective, 617 00:37:14,960 --> 00:37:17,040 Speaker 1: you know, the idea that TikTok could be used to 618 00:37:17,160 --> 00:37:22,000 Speaker 1: spy on America, there's another component that also deserves consideration, 619 00:37:22,440 --> 00:37:26,920 Speaker 1: and that's foreign control over domestic broadcast operations.
So 620 00:37:26,920 --> 00:37:29,080 Speaker 1: going all the way back to nineteen thirty four, 621 00:37:29,160 --> 00:37:31,640 Speaker 1: the United States has maintained that it's just not a 622 00:37:31,680 --> 00:37:35,160 Speaker 1: good idea to allow a foreign government or a foreign 623 00:37:35,200 --> 00:37:41,120 Speaker 1: government's representative, like a foreign owned company, to have ownership 624 00:37:41,160 --> 00:37:44,440 Speaker 1: of broadcast businesses within the United States. Now, this all 625 00:37:44,480 --> 00:37:48,080 Speaker 1: started with radio licenses. The Communications Act of nineteen thirty four, 626 00:37:48,200 --> 00:37:52,280 Speaker 1: Section three ten A, prohibits granting radio licenses to foreign 627 00:37:52,280 --> 00:37:56,600 Speaker 1: governments or their representatives. Now, the concern 628 00:37:56,680 --> 00:38:00,640 Speaker 1: is that a foreign controlled communications entity could spread propaganda 629 00:38:00,680 --> 00:38:04,400 Speaker 1: to the US public, potentially threatening national security in the process. 630 00:38:04,600 --> 00:38:07,920 Speaker 1: And so another aspect of this ban on TikTok resides 631 00:38:08,000 --> 00:38:10,960 Speaker 1: in the company's potential role as a means to spread 632 00:38:11,000 --> 00:38:15,600 Speaker 1: messages that benefit China's goals, whether that's to downplay China's 633 00:38:15,640 --> 00:38:19,799 Speaker 1: own involvement in international affairs or to promote certain narratives 634 00:38:19,800 --> 00:38:23,360 Speaker 1: that align with China's own strategies. It can include disrupting 635 00:38:23,560 --> 00:38:26,800 Speaker 1: US elections by attempting to either push people to support 636 00:38:26,840 --> 00:38:29,720 Speaker 1: specific people whom China has deemed to be of greater 637 00:38:29,800 --> 00:38:33,560 Speaker 1: benefit to China itself, or just discouraging people from participating 638 00:38:33,560 --> 00:38:37,040 Speaker 1: in the democratic process at all. It could include shaping 639 00:38:37,040 --> 00:38:41,279 Speaker 1: public opinion on matters ranging from trade to international conflicts, 640 00:38:41,480 --> 00:38:45,240 Speaker 1: including stuff that's going on right now, like between Israel 641 00:38:45,640 --> 00:38:49,400 Speaker 1: and Palestine. And you know, I'm sure a lot of 642 00:38:49,440 --> 00:38:52,520 Speaker 1: y'all out there are aware that China is very much 643 00:38:52,600 --> 00:38:55,080 Speaker 1: using social platforms to do this kind of thing, like 644 00:38:55,120 --> 00:38:58,080 Speaker 1: that's not under dispute. It is happening. But here's the deal. 645 00:38:58,480 --> 00:39:01,840 Speaker 1: China's doing this on platforms, including ones that China 646 00:39:01,960 --> 00:39:05,160 Speaker 1: does not have any control over. Agents working on behalf 647 00:39:05,200 --> 00:39:08,480 Speaker 1: of China, either with or without the nation's direct support, 648 00:39:08,800 --> 00:39:12,359 Speaker 1: have used platforms like X, formerly known as Twitter, as 649 00:39:12,400 --> 00:39:16,000 Speaker 1: well as Facebook and YouTube and others to spread this 650 00:39:16,160 --> 00:39:19,200 Speaker 1: kind of messaging. And China does not have ownership in 651 00:39:19,280 --> 00:39:23,880 Speaker 1: these companies, but they're still leveraging them to achieve that goal.
652 00:39:24,080 --> 00:39:27,120 Speaker 1: So one could argue that whether China has a stake 653 00:39:27,160 --> 00:39:31,240 Speaker 1: in TikTok or not ultimately is immaterial because the problem 654 00:39:31,320 --> 00:39:35,480 Speaker 1: exists either way, and that problem is much larger than 655 00:39:35,520 --> 00:39:38,879 Speaker 1: who ultimately owns TikTok, which brings us to the whole 656 00:39:38,880 --> 00:39:42,280 Speaker 1: issue of the US's overall terrible lack of data protection 657 00:39:42,440 --> 00:39:45,720 Speaker 1: and privacy laws. But that's a matter for another episode. However, 658 00:39:46,120 --> 00:39:49,000 Speaker 1: we do have to acknowledge that there has been evidence 659 00:39:49,040 --> 00:39:52,960 Speaker 1: of China interfering with how TikTok promotes and shares videos 660 00:39:53,000 --> 00:39:56,920 Speaker 1: relating to sensitive topics in China. Late last year, the 661 00:39:56,960 --> 00:40:01,359 Speaker 1: Network Contagion Research Institute published a report claiming that 662 00:40:01,440 --> 00:40:06,640 Speaker 1: they analyzed hashtag ratios on TikTok quote across topics sensitive 663 00:40:06,680 --> 00:40:09,560 Speaker 1: to the Chinese government end quote and compared them with 664 00:40:10,160 --> 00:40:13,560 Speaker 1: ones from other social platforms, and they said that they 665 00:40:13,600 --> 00:40:19,359 Speaker 1: found anomalies that could mean quote TikTok systematically promotes or 666 00:40:19,440 --> 00:40:22,840 Speaker 1: demotes content on the basis of whether it is aligned 667 00:40:22,880 --> 00:40:26,440 Speaker 1: with or opposed to the interests of the Chinese government 668 00:40:26,719 --> 00:40:30,560 Speaker 1: end quote. So according to this research, it is possible 669 00:40:30,719 --> 00:40:33,719 Speaker 1: that TikTok is a more active tool for China to 670 00:40:33,800 --> 00:40:38,840 Speaker 1: spread messaging internationally or to suppress messages that it deems sensitive. 671 00:40:39,080 --> 00:40:42,120 Speaker 1: And while China can leverage other platforms to try and 672 00:40:42,200 --> 00:40:45,759 Speaker 1: spread messages, it doesn't have that level of control on 673 00:40:46,160 --> 00:40:49,960 Speaker 1: other platforms beyond TikTok. So you could say that in 674 00:40:49,960 --> 00:40:55,840 Speaker 1: that regard specifically, this is a very China centered problem. 675 00:40:56,080 --> 00:40:58,359 Speaker 1: Now we have to go back to Jeff Yass. So 676 00:40:58,600 --> 00:41:01,040 Speaker 1: back in twenty twenty, Donald Trump calls for a ban 677 00:41:01,160 --> 00:41:04,000 Speaker 1: on TikTok, and his reasons for doing so are at 678 00:41:04,120 --> 00:41:06,880 Speaker 1: best muddled. But he's really firm on the fact that 679 00:41:06,920 --> 00:41:10,319 Speaker 1: he is gosh darn it gonna ban TikTok. But the 680 00:41:10,360 --> 00:41:13,200 Speaker 1: ban gets postponed due to legal challenges, and Trump gets 681 00:41:13,280 --> 00:41:16,719 Speaker 1: voted out and the matter kind of fizzles. Now, as 682 00:41:16,760 --> 00:41:20,120 Speaker 1: we have this new ban that's, one, backed by Congress 683 00:41:20,120 --> 00:41:23,439 Speaker 1: and, two, supported by Joe Biden, Trump has spoken out 684 00:41:23,840 --> 00:41:27,640 Speaker 1: against the ban. So why does he oppose a ban 685 00:41:27,760 --> 00:41:30,800 Speaker 1: now when he was calling for one four years ago?
Well, 686 00:41:30,840 --> 00:41:33,479 Speaker 1: his reasons for opposing the ban are, I would argue, 687 00:41:33,719 --> 00:41:36,640 Speaker 1: equally as muddled as his reasons for calling for one in 688 00:41:36,680 --> 00:41:39,880 Speaker 1: the first place. Generally, he seems to say we shouldn't 689 00:41:39,920 --> 00:41:42,520 Speaker 1: ban TikTok because a lot of people happen to like it. 690 00:41:42,840 --> 00:41:44,840 Speaker 1: And I mean, that's true, there are a lot of 691 00:41:44,840 --> 00:41:47,399 Speaker 1: people who like TikTok. But I don't know that that's 692 00:41:47,640 --> 00:41:51,160 Speaker 1: the argument you make to oppose a ban, though it's 693 00:41:51,200 --> 00:41:54,160 Speaker 1: the one that he offered. Now he has associated this 694 00:41:54,280 --> 00:41:56,920 Speaker 1: ban with Joe Biden. Like, he's not so much saying 695 00:41:57,400 --> 00:41:59,960 Speaker 1: why he thinks the ban's a bad idea; he's more 696 00:42:00,000 --> 00:42:04,280 Speaker 1: concentrating on saying Biden is the one banning TikTok. 697 00:42:04,680 --> 00:42:06,920 Speaker 1: So really it seems to be a political maneuver. You know, 698 00:42:06,960 --> 00:42:10,080 Speaker 1: TikTok is popular with a lot of people. Biden supports 699 00:42:10,080 --> 00:42:12,640 Speaker 1: the legislation calling for the ban. Trump wants to 700 00:42:12,680 --> 00:42:15,359 Speaker 1: win the election in November, so you make sure your 701 00:42:15,440 --> 00:42:19,560 Speaker 1: opponent is associated with a really unpopular bill. Never mind 702 00:42:20,200 --> 00:42:23,200 Speaker 1: that you yourself initially called for the very same thing 703 00:42:23,280 --> 00:42:26,759 Speaker 1: four years ago and did so by executive order. But 704 00:42:26,880 --> 00:42:30,040 Speaker 1: another potential reason for Trump's change of heart goes back 705 00:42:30,080 --> 00:42:33,280 Speaker 1: to Jeff Yass. So Jeff Yass is a mega donor 706 00:42:33,400 --> 00:42:36,840 Speaker 1: to conservative campaigns. In fact, he's like the biggest donor 707 00:42:36,960 --> 00:42:41,120 Speaker 1: to GOP campaigns in the United States, and he met 708 00:42:41,160 --> 00:42:44,960 Speaker 1: with Trump back in March of this year. So the 709 00:42:45,000 --> 00:42:48,960 Speaker 1: two people meet, the biggest donor to conservative campaigns in 710 00:42:49,000 --> 00:42:52,360 Speaker 1: the United States and the presidential nominee for the GOP. 711 00:42:52,920 --> 00:42:57,359 Speaker 1: At this point, Trump's tune regarding TikTok changes. Now, I'm 712 00:42:57,400 --> 00:43:00,799 Speaker 1: not saying Trump changed his approach simply because a guy 713 00:43:00,920 --> 00:43:05,240 Speaker 1: with exceedingly deep pockets, a guy whose wealth is heavily 714 00:43:05,280 --> 00:43:09,440 Speaker 1: dependent upon ByteDance and, by extension, TikTok, told Trump 715 00:43:09,600 --> 00:43:12,400 Speaker 1: that Trump needs to oppose the ban if he wants 716 00:43:12,520 --> 00:43:15,319 Speaker 1: access to some of that sweet, sweet money. I'm not 717 00:43:15,480 --> 00:43:18,680 Speaker 1: saying that happened. I'm just saying I wouldn't be at 718 00:43:18,680 --> 00:43:22,240 Speaker 1: all surprised if this is a big part of it.
Trump, 719 00:43:22,280 --> 00:43:25,640 Speaker 1: as you might know, definitely needs the cash because it 720 00:43:25,719 --> 00:43:28,719 Speaker 1: is way easier to spend other people's money when you 721 00:43:28,800 --> 00:43:31,560 Speaker 1: owe courts hundreds of millions of dollars than it is 722 00:43:31,600 --> 00:43:34,680 Speaker 1: to spend your own money. But yeah, it seems that 723 00:43:34,760 --> 00:43:37,960 Speaker 1: a conservative billionaire who has most of his wealth tied 724 00:43:38,040 --> 00:43:41,080 Speaker 1: up with a Chinese company is spending a lot of time, effort, 725 00:43:41,160 --> 00:43:44,680 Speaker 1: and yes, money to fight off a threat to his fortune, 726 00:43:45,000 --> 00:43:48,200 Speaker 1: and that the beneficiaries of his donations are eager to 727 00:43:48,239 --> 00:43:51,040 Speaker 1: play a part in that. So it all comes down 728 00:43:51,080 --> 00:43:54,120 Speaker 1: to money, and that's kind of where we're at now. 729 00:43:54,200 --> 00:43:56,279 Speaker 1: You might wonder what my opinion is on the whole thing, 730 00:43:56,360 --> 00:43:59,080 Speaker 1: in case it's not obvious. Well, I don't like TikTok, 731 00:43:59,400 --> 00:44:03,080 Speaker 1: but I also don't necessarily think TikTok represents the 732 00:44:03,160 --> 00:44:07,719 Speaker 1: really huge problem that other people do. Or rather, to clarify, 733 00:44:08,160 --> 00:44:11,879 Speaker 1: I think TikTok is a problem, but it's a subset. 734 00:44:12,000 --> 00:44:15,680 Speaker 1: It's a smaller problem in a much, much larger bubble 735 00:44:16,000 --> 00:44:19,600 Speaker 1: that so far has gone largely unaddressed in the United States. 736 00:44:19,640 --> 00:44:22,480 Speaker 1: Unless leaders take a look at the bigger picture 737 00:44:22,560 --> 00:44:26,200 Speaker 1: of data privacy and security and make some real changes, 738 00:44:26,480 --> 00:44:30,960 Speaker 1: TikTok's ownership, or even TikTok's existence, is kind of not important, 739 00:44:31,000 --> 00:44:33,840 Speaker 1: at least not from a national security level, because I 740 00:44:33,880 --> 00:44:37,319 Speaker 1: don't really think there's much that TikTok could do that 741 00:44:37,440 --> 00:44:42,319 Speaker 1: other platforms aren't already doing as well, like American platforms, 742 00:44:42,480 --> 00:44:45,839 Speaker 1: and I mean even like American platforms doing things that 743 00:44:46,000 --> 00:44:50,600 Speaker 1: benefit China. So whether that's gathering beaucoups of information on 744 00:44:50,719 --> 00:44:53,960 Speaker 1: every single user, as well as people those users interact with, 745 00:44:54,040 --> 00:44:57,040 Speaker 1: whether they're on the platform or not, to the spread 746 00:44:57,080 --> 00:45:00,680 Speaker 1: of propaganda and misinformation, to designing ways to capture attention 747 00:45:00,840 --> 00:45:03,680 Speaker 1: and to hold it, potentially to the detriment of users 748 00:45:03,719 --> 00:45:06,480 Speaker 1: and society in general, all that is happening whether you 749 00:45:06,680 --> 00:45:10,040 Speaker 1: snuff out TikTok or you allow it to continue. So 750 00:45:10,520 --> 00:45:13,279 Speaker 1: killing TikTok doesn't fix the problem. It's like putting a 751 00:45:13,320 --> 00:45:17,080 Speaker 1: little band-aid on a big, old, gaping wound. Now, 752 00:45:17,320 --> 00:45:19,640 Speaker 1: I'm also not a big fan of the politics around this.
753 00:45:19,760 --> 00:45:22,040 Speaker 1: That might surprise you considering how much time I spent 754 00:45:22,120 --> 00:45:24,719 Speaker 1: on it, but I'm not. I find the whole politics 755 00:45:24,719 --> 00:45:28,760 Speaker 1: thing incredibly distracting because, to me, it feels like theater. 756 00:45:29,080 --> 00:45:31,360 Speaker 1: It feels like it's a show that's being put on, 757 00:45:31,560 --> 00:45:34,839 Speaker 1: and it's focusing on an easy target. Also, a lot 758 00:45:34,880 --> 00:45:39,680 Speaker 1: of the rhetoric around this starts to feel really xenophobic. Like, 759 00:45:40,040 --> 00:45:43,640 Speaker 1: you know, there are legitimate reasons to worry about China's influence. 760 00:45:44,080 --> 00:45:47,120 Speaker 1: There's no denying that there are good reasons to worry 761 00:45:47,160 --> 00:45:50,080 Speaker 1: about that. However, a lot of the messaging, a lot 762 00:45:50,120 --> 00:45:53,640 Speaker 1: of the speeches, a lot of the conversations that we've seen, 763 00:45:53,719 --> 00:45:57,560 Speaker 1: like the interrogations in Congress when you've got TikTok's CEO 764 00:45:57,680 --> 00:46:02,280 Speaker 1: up there, they come across as less about legitimate concerns about 765 00:46:02,320 --> 00:46:06,759 Speaker 1: China and more, you know, kind of racist. And you 766 00:46:06,800 --> 00:46:10,319 Speaker 1: know, there's obviously fuzzy ground here. Like, again, there 767 00:46:10,320 --> 00:46:14,760 Speaker 1: are legitimate concerns, but the way they're being communicated doesn't 768 00:46:14,760 --> 00:46:17,200 Speaker 1: come across as sincere. And then you start to worry 769 00:46:17,200 --> 00:46:20,800 Speaker 1: that this is like watching a pro wrestling show where 770 00:46:21,080 --> 00:46:24,640 Speaker 1: the face is getting the crowd to chant USA, USA, 771 00:46:24,920 --> 00:46:30,000 Speaker 1: USA against a foreign heel character. It's not a good look. 772 00:46:30,280 --> 00:46:32,399 Speaker 1: So again, the whole thing is a mess, and it's 773 00:46:32,400 --> 00:46:34,719 Speaker 1: not ending here. We've got the legal challenges to look 774 00:46:34,760 --> 00:46:37,680 Speaker 1: forward to, so to speak, and the outcome from that 775 00:46:37,960 --> 00:46:40,920 Speaker 1: is far from certain, so I'm sure we will revisit 776 00:46:41,000 --> 00:46:45,080 Speaker 1: this to find out. You know, will the US ban TikTok, 777 00:46:45,120 --> 00:46:47,760 Speaker 1: will TikTok be forced to be spun off and sold 778 00:46:47,800 --> 00:46:51,000 Speaker 1: to an American company, or will the whole thing get dismissed 779 00:46:51,040 --> 00:46:54,840 Speaker 1: on constitutional grounds, or will something else entirely happen? I 780 00:46:54,880 --> 00:46:56,920 Speaker 1: don't know. I'm not going to make any predictions at 781 00:46:56,960 --> 00:46:59,080 Speaker 1: this point because the last one I made was that 782 00:46:59,080 --> 00:47:01,239 Speaker 1: the Senate was not going to support this, and I was 783 00:47:01,320 --> 00:47:04,759 Speaker 1: totally wrong about that. So we're leaving it off here. 784 00:47:05,040 --> 00:47:07,040 Speaker 1: But I wanted to give the update and give the 785 00:47:07,080 --> 00:47:09,359 Speaker 1: context and the understanding of all the things that are 786 00:47:09,360 --> 00:47:12,440 Speaker 1: going on, because it is a complicated subject. I hope 787 00:47:12,719 --> 00:47:15,200 Speaker 1: all of you out there are well.
I hope you 788 00:47:15,239 --> 00:47:18,920 Speaker 1: are staying sane in this crazy world of ours, and 789 00:47:18,960 --> 00:47:28,440 Speaker 1: I'll talk to you again really soon. Tech Stuff is 790 00:47:28,480 --> 00:47:33,000 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 791 00:47:33,080 --> 00:47:36,680 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 792 00:47:36,719 --> 00:47:37,440 Speaker 1: favorite shows.