Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And get off my lawn. You know, as I get older, I have to work harder to keep up with trends in tech. I mean, I find that the stereotypes about old people getting out of touch with new tech are, at least for me, partly true. I love tech and I do try to stay up to date with news, but some trends appear and I'm a little slow on the uptake. And I would say that holds true with TikTok, the insanely popular video app and meme generator extraordinaire. And that's kind of funny, because I was at least semi-aware of it on some level before it was even TikTok. So in this episode, we're gonna look into the app, the company that owns the app, the people who developed it, and the controversies surrounding it, as well as the culture that has grown up around it. And it's all pretty fascinating stuff.
Speaker 1: There are actually a couple of origin stories for TikTok because, as it turns out, TikTok is the product of some acquisitions and mergers of other apps. Let's start by taking a look at the company ByteDance, which as of this recording still owns TikTok. More about that later in this episode. And that story really begins with its founder, a Chinese entrepreneur named Zhang Yiming. Now, I'd say that Zhang has kept a pretty low profile in the West. In fact, to dig up some info on him, I really had to pull up some articles that were written in Chinese, and then I had to rely heavily on Google Translate. So with that in mind, we have to take a lot of what I've got to say with a few grains of salt, for two big reasons. One, journalism in China is not exactly objective. The state, that is, the Chinese Communist Party, often plays a very big role in that, and so the truthiness of a piece can come under question. Second, the translation algorithms that we depend upon today sometimes fail to capture more subtle linguistic elements like idiom or, you know, metaphor.
Speaker 1: So some stuff does get lost in translation. So with that in mind, let's see what we can find out. Now, according to the Chinese website Sohu, which is part of a big internet company that I'll have to cover in a future episode, here's what I was able to discover. Zhang initially studied microelectronics, but then switched to software engineering in college at Nankai University, and he was there from 2001 to 2005. Upon graduation, he interviewed with a company called Kuxun, and he became employee number five over there. Kuxun got its start as a metasearch engine designed to pull up train ticket options for travelers in China. It's kind of like how search services like hotels.com or booking.com will aggregate deals for various hotels in a region so that you can find the one that best suits your needs, whether it's for your budget or where you plan on being or whatever.
Speaker 1: Over time, the Kuxun site expanded its scope, becoming more of a general search engine for everything from shopping to job listings, and it would change its focus numerous times, sometimes scaling back to focus more on the travel industry and sometimes going broader. In the meantime, Zhang rose rapidly from being a software engineer to leading teams of engineers who were developing the various products that would get rolled out onto the site. Zhang would leave Kuxun in 2008, and at that point the site had not yet become profitable, possibly due to this wavering focus on what the site actually was about. It wouldn't be the end of his involvement with Kuxun the company, however; we'll get back to that. In the meantime, he joined Microsoft, but he found that working for an established global company with a fairly rigid corporate structure was a very different experience from working in a startup environment, and he didn't stick around with Microsoft for very long. He would leave that company to join another Chinese company called Fanfou, which was essentially a clone of Twitter.
Speaker 1: It was a short-messaging-style social platform, and things did not necessarily go that smoothly there either, though it wasn't through any fault of Zhang's. In the summer of 2009, a series of riots relating to China's treatment of ethnic minorities erupted in northwest China. In an effort to clamp down on the situation, the Chinese government began to restrict communication channels as much as it possibly could, and that included the effective shutdown of Fanfou and several other Chinese websites, which were to remain dark for more than a year. Zhang Yiming didn't wait around for the Chinese government to lift restrictions. Instead, he left the company and he pounced on what he saw as an opportunity to have his own startup. Expedia, the online travel company, was in talks to acquire Kuxun, Zhang's old employer. But as I mentioned earlier, Kuxun had sort of wandered all over the place when trying to establish what it actually was, and one business it had dipped its corporate toe into was real estate, and that's what Zhang was interested in.
Speaker 1: He arranged to acquire that part of Kuxun's business because it wasn't really aligned with Expedia's strategy and it wouldn't be needed for Expedia's acquisition. It was a part of the company that wasn't really that interesting to Expedia, so Zhang Yiming was able to swoop in and purchase that part of the company, and he founded his first independent company, a real estate search site called 99fang.com. A couple of years later, Zhang saw a new opportunity, this time revolving around how people access news. China was lagging a little bit behind the rest of the world in this endeavor, but even in China, it was clear that mobile devices were completely upending the way people get access to information. So with that in mind, Zhang resigned as the CEO of 99fang.com and he founded a new company, one that would be known in the West as ByteDance, B-Y-T-E Dance. The original product from ByteDance wasn't a short video service, nor did it have anything to do with lip syncing to songs or generating memes or anything like that.
Speaker 1: It was a news aggregator service, one that would pull headlines from various Chinese news organizations. The app was called Toutiao, and it was a big hit in China, despite the fact that Zhang initially encountered very little enthusiasm from investors for his idea. ByteDance would then launch a video sharing app for mobile devices called Douyin in China, but it would be known elsewhere as TikTok. The app allowed users to create short, like really short, video clips, fifteen seconds long, and then share them online. Users would create lip sync videos. They had access to a large database of sound bites they could use. Or they might create short comedy videos, or they might show off odd or impressive talents. The service launched in China in 2016. It received a warm reception in China, but when they tried to expand the app outside of China in 2017, it did not immediately catch on. That would change a bit later in 2017, when ByteDance would merge its TikTok app with another app called musical.ly, or Musically if you prefer, and that's when things would really take off.
Speaker 1: But that means we need to chat about musical.ly for a second, and where that came from. That started as a different Chinese startup, this time co-founded by friends Alex Zhu and Louis Yang. The two originally intended to create a short form video service meant for educational videos. So these were supposed to be things like how-to videos or explainer videos, you know, like how-this-thing-works videos. I can identify with that. They called their app Cicada, and they pitched it to venture capitalists, and investors liked the idea, and they poured about a quarter of a million dollars' worth of cash into the project. There was only one tiny little problem: nobody was using it. They were able to develop the tool, they launched the tool, they saw very little adoption, and as it turned out, they realized the reason was that it's actually really difficult to communicate complicated ideas effectively in just a few minutes. I mean, I've done that. That's been my gig for a while with making short videos. I used to do that back with HowStuffWorks all the time.
Speaker 1: So it's hard to do, and that's why you notice that the average running time for a TechStuff episode is closer to, like, forty-five minutes. It's just, it's not easy to boil all that down into just a few minutes and make it accurate and interesting. So the co-founders discovered that people didn't want to use the app because it took so long to write, produce, and edit videos that the return on investment, in terms of time and effort, just wasn't there. It wasn't fun to use. Creation really needed to be more effortless and have a better payoff, and young folks, who were leading the way when it came to mobile phone adoption, just weren't into Cicada, so it was not really succeeding. So the co-founders were faced with a problem. They had built tools that worked, but they weren't really suitable for the purposes that they had in mind. People just weren't using it. So they were running out of cash, and they had a couple of different choices that they could really follow.
Speaker 1: They could go back to their investors and return what little money was left with an apology, saying, we built this thing, it works, but nobody's using it. Or they could attempt a pivot and see if they could leverage the tools they had already built for some other purpose. And that's when they decided to give entertainment a shot. Now, according to Zhu, he made this decision after visiting the United States. He was in Mountain View, California, according to his story, and saw a group of teenagers on a train who were listening to music. They were taking selfies and videos with the music in the background, and they were using social media apps to add filters and virtual stickers on top of the images in the video. And he thought, hey, you know, these kids are sharing stuff in a really creative way, and they're kludging it together right there. They're doing this by using different things to try and make something new.
Speaker 1: If I made a tool, if I used the stuff we developed for Cicada and I adapted it so that all of those elements were together in one app, people would use it, because they're doing it already, but they're doing it by, you know, kind of piecing it together piecemeal. If I put it all together, it's gonna be a big hit. Now, I find it interesting that, at least according to the sources I looked at, Zhu wasn't inspired by previous short form video services like Vine. Now, Vine was a short form video sharing service that Twitter had acquired shortly after Vine had been founded. Users could create six-second-long looping videos. Those videos lived on Vine's servers, but they could easily be shared to platforms like Facebook and Twitter. When Zhu and Yang were working on their tweaked service, Vine was still a thing, though Twitter would shut that service down in 2017. After about a month of making changes to their original platform, the Cicada team created a new service that they called Musically, or musical.ly. Users could create fifteen-second-long videos with this app.
Speaker 1: They saw some early adoption and user retention, but growth was a little slow in the beginning. They were running low on cash when they made some more adjustments, a big one being that they repositioned the logo so that it wouldn't get cropped out when the videos would get shared on services like Instagram. So that meant that once people saw these videos, they could actually see the logo for musical.ly, and they knew where to go to get the app and make their own videos. And adoption really took off at that point, particularly in the United States. It did not receive widespread adoption back in China, but the musical.ly app would hit number one on the iTunes App Store, which guaranteed that the app would have a prominent position, and that helped drive adoption, particularly in the United States. The musical.ly team created some features that really helped it get attention. One was creating a type of tier called Best Fans Forever, or BFF. So if you followed a musical.ly creator, you could become a BFF of that creator, who would follow you back.
Speaker 1: Essentially, you're BFFs with each other. So let's say two users become BFFs. They could then record videos of themselves lip syncing to the same song, for example, and musical.ly would take these two videos and edit them together, creating what was called a duet. So it would switch back and forth between the two little videos to create a single experience, as long as the two users were BFFs of each other. A related feature allowed musical.ly users to create a video response to another video, building a chain of videos together. So you could do a Q&A-style format that musical.ly would stitch together so that users could see the full thing later. They could see the questions and the answers, or they could see a video and a response. And these features, along with the ability to post videos to other popular platforms like Instagram, meant musical.ly had an appeal that tapped right into the zeitgeist. Before long, it became a launching pad for internet celebrities as well as musicians who otherwise were finding discovery to be an insurmountable challenge.
Speaker 1: So you had musical.ly, which was doing well in the United States and in Europe, as well as several other markets. And you had Douyin, or TikTok, which was doing well in China but hadn't gained much traction anywhere else. Zhang saw the chance to bring together two similar but unrelated services under one company, and so ByteDance would make its move. When we come back, I'll talk about that acquisition and what came next, but first let's take a quick break.

Speaker 1: Musical.ly had started back in 2014. TikTok, or Douyin, had launched in 2016. ByteDance was doing incredibly well, largely driven by the enormous success of the news aggregator service it had built years before. Based on funding rounds in China, ByteDance had a valuation of nearly twenty billion dollars around this time, which is in fact an extremely princely sum. In November 2017, news broke that ByteDance was making a move to acquire the musical.ly service, which itself had been valued at nearly one billion dollars.
Speaker 1: So while I was watching friends do silly lip syncing videos to songs from the musical Hamilton, these companies were skyrocketing in value. In December 2017, the deal was finalized for an undisclosed amount. Analysts estimated it was somewhere between eight hundred million dollars and one billion dollars. Not a bad take for the developers of musical.ly, who had started out with a failed effort to create an educational video tool. I really wish my failures paid off like that. Now, I'm being a bit snarky, but I have to point out that the services were both doing really, really well both before and after ByteDance acquired musical.ly. By mid-2017, musical.ly had one hundred million monthly active users. TikTok, or Douyin, had five hundred million users in China, and those users were generating tons of data, and they were watching lots of ads. And as we all know, this is really valuable stuff, particularly when you're looking at it in bulk. So ByteDance was sitting on a treasure trove of information as well as getting lots of money through ad deals, and it really gave the company an enormous boost in value.
Speaker 1: In August 2018, ByteDance announced it was shuttering musical.ly as a brand. It was going to go away as its own thing, and it revamped the service into TikTok, so it unified the name across its two services. However, the company would still continue to operate the Douyin version of TikTok separately on Chinese servers. This was so the company could adhere to Chinese government requirements. I'll talk more about that a little later in this episode. So even though there was a unified name, there are still two distinct, separate incarnations of TikTok to this day. One month after ByteDance consolidated TikTok, the service reached a new milestone. In September 2018, it became the top app in monthly installs in Apple's App Store. It surpassed Snapchat, YouTube, Instagram, and Facebook. It was a social media app on the rise. This was a huge deal, and driving this adoption were young users who otherwise weren't quite as keen to join some of the more established networks out there.
Speaker 1: Now, that being said, the user base still wasn't as big as something like Facebook, which is a true monolith of a social platform. When it comes to total users, Facebook remains king, but TikTok was seeing much faster adoption than other platforms at this time. While all this was going on from an adoption perspective, there's another aspect we need to address, and that's the cultural impact of TikTok. The short form video restrictions meant people had to come up with creative ways to make use of that time and of the platform, and they had to do it in order to stand out among all the other stuff being uploaded every day. TikTok culture began to evolve, and soon there was a TikTok-optimized form of storytelling and of humor. TikTok emerged at the same time as a generation of kids who had grown up in the smartphone age, and they were all becoming teenagers around this time. So smartphones have immersed us in media like nothing before.
Speaker 1: Now, my generation would spend hours in front of a television every week, and there were often times that we could easily escape from mass media if we wanted to, just go outside or whatever, because cell phones were barely a thing when I was a kid. There was no YouTube or Facebook, there were no smartphones. Even digital cameras were incredibly rare and expensive, and so my generation developed a very different sensibility and relationship with communication and humor. Now, I don't say that to claim a superior sensibility. I don't think my generation had a better version of that than any other generation. I'm just pointing out that it's different. So you contrast my generation with the teens of the twenty-first century. The internet and mobile devices have changed how we experience, consume, and interact with media. Everything seems pulled into the media realm. People who would otherwise have led fairly average lives were able to leverage online tools to become celebrities. They've been able to amass fortunes by vlogging or live streaming games or developing themselves into a brand.
It's a totally different landscape, 333 00:20:55,480 --> 00:20:58,639 Speaker 1: and as such, it provides a much different launch point 334 00:20:58,720 --> 00:21:03,159 Speaker 1: for humor and communication, and that's what we're seeing on TikTok. 335 00:21:04,119 --> 00:21:07,520 Speaker 1: TikTok lets users share short videos that often end up 336 00:21:07,520 --> 00:21:12,119 Speaker 1: being a reflection, commentary, or criticism of some form of 337 00:21:12,240 --> 00:21:17,160 Speaker 1: media or cultural idea, and it's easily shared and it's 338 00:21:17,200 --> 00:21:20,480 Speaker 1: easily digested. So in many ways, it reminds me of 339 00:21:20,480 --> 00:21:24,240 Speaker 1: how people tend to encounter the news. The twenty-four- 340 00:21:24,280 --> 00:21:27,680 Speaker 1: hour news cycle created a demand for news. You had 341 00:21:27,720 --> 00:21:30,840 Speaker 1: to fill up all that time with something if you're 342 00:21:30,840 --> 00:21:34,040 Speaker 1: going to be broadcasting twenty-four hours a day. And 343 00:21:34,080 --> 00:21:37,879 Speaker 1: then the Internet came along and took that news cycle 344 00:21:37,960 --> 00:21:42,159 Speaker 1: and made it a bazillion times worse. One consequence of 345 00:21:42,200 --> 00:21:44,840 Speaker 1: this is that we don't tend to have very long 346 00:21:44,920 --> 00:21:48,280 Speaker 1: memories when it comes to big events because we're flooded 347 00:21:48,320 --> 00:21:52,639 Speaker 1: with notifications whenever the next big event happens, and because 348 00:21:52,680 --> 00:21:57,000 Speaker 1: the Internet is global, there's always a big event happening somewhere, 349 00:21:57,320 --> 00:22:01,480 Speaker 1: so it just becomes this sequence of big events. Well, 350 00:22:01,480 --> 00:22:05,680 Speaker 1: TikTok videos kind of tap into that in a little way. 351 00:22:05,960 --> 00:22:10,920 Speaker 1: Video creators can become famous.
Some may only achieve temporary notoriety, 352 00:22:11,000 --> 00:22:15,000 Speaker 1: but other people have launched entire careers from TikTok and 353 00:22:15,040 --> 00:22:19,400 Speaker 1: they can make ironic observations about the world and media 354 00:22:19,720 --> 00:22:22,639 Speaker 1: and memes that are going on at that time. Most 355 00:22:22,760 --> 00:22:26,320 Speaker 1: videos use a soundtrack taken from established media, such as 356 00:22:26,359 --> 00:22:30,240 Speaker 1: a musical artist or a film soundtrack or audio from 357 00:22:30,240 --> 00:22:33,560 Speaker 1: a television show, and so to someone like me, the 358 00:22:33,680 --> 00:22:37,720 Speaker 1: videos might appear silly or unimportant because I'm a grumpy 359 00:22:37,760 --> 00:22:41,320 Speaker 1: old man. But to a different generation, it's an actual response, 360 00:22:41,560 --> 00:22:45,240 Speaker 1: whether conscious or otherwise, to the environment that they're growing 361 00:22:45,320 --> 00:22:50,680 Speaker 1: up in. Also, because these TikTok videos rely so heavily 362 00:22:50,880 --> 00:22:56,240 Speaker 1: on identifiable pop culture media, they tend to transcend barriers 363 00:22:56,320 --> 00:23:00,800 Speaker 1: like language, so they become really accessible. Oh, the 364 00:23:00,920 --> 00:23:04,679 Speaker 1: tone is really interesting too. There's a sort of self- 365 00:23:04,880 --> 00:23:09,240 Speaker 1: deprecating humor that runs through a lot of TikTok videos, 366 00:23:09,280 --> 00:23:12,600 Speaker 1: which is a stark contrast to how people try to 367 00:23:12,680 --> 00:23:17,000 Speaker 1: present themselves on other platforms like Instagram.
On Instagram, it's 368 00:23:17,280 --> 00:23:21,080 Speaker 1: far more common to see someone present themselves in full- 369 00:23:21,160 --> 00:23:24,800 Speaker 1: blown self-promotion mode, so kind of like, look how 370 00:23:24,840 --> 00:23:27,920 Speaker 1: awesome I am. Look how awesome my life is. Look 371 00:23:27,920 --> 00:23:31,160 Speaker 1: how awesome these products I enjoy are. Don't you want 372 00:23:31,160 --> 00:23:35,680 Speaker 1: these products too? Like, that's kind of the perception of 373 00:23:35,760 --> 00:23:39,840 Speaker 1: your typical Instagram personality, right? They're 374 00:23:39,920 --> 00:23:43,840 Speaker 1: presenting themselves as a brand that interacts with 375 00:23:43,960 --> 00:23:49,280 Speaker 1: other established brands. Well, a lot of TikTok users seem 376 00:23:49,480 --> 00:23:53,120 Speaker 1: to use TikTok in a totally different way. They're using 377 00:23:53,119 --> 00:23:56,560 Speaker 1: TikTok in a way to kind of voice their own 378 00:23:56,560 --> 00:24:00,160 Speaker 1: insecurities and to deal with that and to sort 379 00:24:00,200 --> 00:24:02,920 Speaker 1: of poke fun at themselves in a self-aware way, 380 00:24:03,440 --> 00:24:08,160 Speaker 1: as opposed to trying to present this kind of idealized 381 00:24:08,720 --> 00:24:13,119 Speaker 1: vision of themselves. And on the app, that frequently goes 382 00:24:13,359 --> 00:24:17,400 Speaker 1: fairly well. If you're viewing TikTok through the app, that's 383 00:24:17,560 --> 00:24:19,840 Speaker 1: kind of the vibe you get with a lot of 384 00:24:19,840 --> 00:24:22,639 Speaker 1: TikTok videos, not all of them by any means, but 385 00:24:22,840 --> 00:24:26,800 Speaker 1: it's a pretty common thread. But TikTok also has created 386 00:24:26,840 --> 00:24:29,520 Speaker 1: a strange duality.
So on the one hand, you had 387 00:24:29,560 --> 00:24:32,679 Speaker 1: the TikTok culture of the app itself, and then on 388 00:24:32,720 --> 00:24:36,720 Speaker 1: the other you have the culture around TikTok that has 389 00:24:37,440 --> 00:24:40,840 Speaker 1: grown out from the way TikTok is presented on other 390 00:24:41,000 --> 00:24:46,080 Speaker 1: platforms, platforms like YouTube and Facebook, where people will upload 391 00:24:46,160 --> 00:24:50,560 Speaker 1: collections of TikTok videos, they'll aggregate them much in the 392 00:24:50,600 --> 00:24:54,480 Speaker 1: same way as Byte Dance's news aggregator would gather headlines. 393 00:24:55,240 --> 00:24:59,240 Speaker 1: Many of those compilations popping up on other platforms aren't 394 00:24:59,440 --> 00:25:02,280 Speaker 1: like a best of, you know. It's not the best 395 00:25:02,359 --> 00:25:07,200 Speaker 1: of TikTok featuring clever uses of the app or particularly 396 00:25:07,240 --> 00:25:12,560 Speaker 1: funny jokes. Instead, there are a lot of cringe compilations. 397 00:25:12,840 --> 00:25:17,360 Speaker 1: These are videos where something isn't necessarily going well, or 398 00:25:17,560 --> 00:25:21,639 Speaker 1: maybe they're meant to serve as material for mockery and derision, 399 00:25:22,240 --> 00:25:25,879 Speaker 1: so someone's making fun of another creator. So 400 00:25:25,960 --> 00:25:29,320 Speaker 1: these are two very different experiences. Right? On the one hand, 401 00:25:29,359 --> 00:25:32,239 Speaker 1: you have the app experience, which tends to be a 402 00:25:32,280 --> 00:25:36,159 Speaker 1: bit more lighthearted, and then you have the wider experience 403 00:25:36,240 --> 00:25:39,640 Speaker 1: of videos appearing on other platforms where people are being 404 00:25:40,400 --> 00:25:45,560 Speaker 1: you know, jerks on the internet.
As per usual, the 405 00:25:45,640 --> 00:25:49,040 Speaker 1: duet feature on TikTok can be used for mockery in 406 00:25:49,080 --> 00:25:51,919 Speaker 1: this way. So with a duet, a user can respond 407 00:25:52,400 --> 00:25:56,760 Speaker 1: to a previously posted TikTok video, creating a split screen 408 00:25:57,000 --> 00:25:59,840 Speaker 1: in which the original video plays on one side and 409 00:26:00,040 --> 00:26:03,240 Speaker 1: on the other side you have the response video. Now, 410 00:26:03,280 --> 00:26:06,879 Speaker 1: ideally you would use this to add something of value 411 00:26:06,880 --> 00:26:10,320 Speaker 1: to the original video, to create a duet. Maybe someone's 412 00:26:10,440 --> 00:26:13,760 Speaker 1: singing in one video and you harmonize with them in 413 00:26:13,800 --> 00:26:16,920 Speaker 1: the duet video, and then it's presented that way together. 414 00:26:17,840 --> 00:26:22,399 Speaker 1: That's the intent of this feature. However, a lot of 415 00:26:22,400 --> 00:26:25,040 Speaker 1: people use it in order to make fun of the 416 00:26:25,040 --> 00:26:27,720 Speaker 1: original video in some way, either to react to something 417 00:26:27,760 --> 00:26:30,399 Speaker 1: that happens in the video in an over the top way, 418 00:26:30,600 --> 00:26:33,919 Speaker 1: or to just out and out burn the person who 419 00:26:34,000 --> 00:26:37,240 Speaker 1: created the first video. So you might post an earnest 420 00:26:37,280 --> 00:26:39,960 Speaker 1: attempt to sing a song, and then you might see 421 00:26:39,960 --> 00:26:44,480 Speaker 1: that your video shows up on a YouTube list that's 422 00:26:44,480 --> 00:26:47,280 Speaker 1: being held up for mockery by some TikTok troll who 423 00:26:47,359 --> 00:26:49,960 Speaker 1: gets most of their views by tearing down other people. 424 00:26:50,520 --> 00:26:55,320 Speaker 1: In TikTok culture, it's called ironic TikTok. 
I think it's 425 00:26:55,320 --> 00:26:59,240 Speaker 1: playing pretty fast and loose with the definition of irony, 426 00:26:59,359 --> 00:27:02,840 Speaker 1: because often the word ironic is being used in 427 00:27:02,920 --> 00:27:08,639 Speaker 1: place of something like mean-spirited. But being mean is 428 00:27:08,640 --> 00:27:12,880 Speaker 1: not necessarily being ironic. The two are not synonymous. However, 429 00:27:13,320 --> 00:27:16,560 Speaker 1: this isn't exactly new. I'm not pointing out something that 430 00:27:16,640 --> 00:27:19,520 Speaker 1: has just popped up. This is not to say that 431 00:27:19,560 --> 00:27:23,320 Speaker 1: this generation is particularly awful. I don't think that's true 432 00:27:23,320 --> 00:27:26,520 Speaker 1: at all. The Internet has had plenty of forums and 433 00:27:26,560 --> 00:27:29,480 Speaker 1: platforms where this stuff has happened in the past. 434 00:27:30,359 --> 00:27:32,879 Speaker 1: It's pretty widespread on TikTok, or at least on the 435 00:27:32,880 --> 00:27:37,040 Speaker 1: secondary platforms like YouTube and Instagram. But it's not 436 00:27:37,119 --> 00:27:41,080 Speaker 1: like this is a huge surprise, right? The target 437 00:27:41,080 --> 00:27:44,320 Speaker 1: demographic for TikTok tends to be young people, and young 438 00:27:44,359 --> 00:27:48,360 Speaker 1: people throughout most of history have often sought approval by 439 00:27:48,400 --> 00:27:51,560 Speaker 1: making fun of vulnerable targets. I mean that was true 440 00:27:51,560 --> 00:27:54,359 Speaker 1: when I was a kid and the Internet didn't even 441 00:27:54,400 --> 00:27:57,560 Speaker 1: really exist, at least not in the minds of kids 442 00:27:57,640 --> 00:27:59,760 Speaker 1: my age. The Internet was a thing, but none of 443 00:27:59,800 --> 00:28:02,680 Speaker 1: us had access to it.
The World Wide Web didn't exist yet 444 00:28:02,720 --> 00:28:05,639 Speaker 1: when I was a kid, but kids were still, you know, 445 00:28:05,760 --> 00:28:09,200 Speaker 1: mean and picked on people. So this is definitely something 446 00:28:09,240 --> 00:28:12,720 Speaker 1: that has been a thing for ages and ages and ages. 447 00:28:13,119 --> 00:28:16,800 Speaker 1: It's just that TikTok enables it on a scale that's 448 00:28:16,880 --> 00:28:18,840 Speaker 1: much larger than what you could do back when I 449 00:28:18,880 --> 00:28:20,840 Speaker 1: was a kid. Like, I might get made fun of in 450 00:28:20,840 --> 00:28:24,320 Speaker 1: front of the whole school, but that was a school full of people, 451 00:28:24,680 --> 00:28:27,960 Speaker 1: it wasn't the whole Internet. So this creates a pretty 452 00:28:28,040 --> 00:28:33,720 Speaker 1: unusual landscape. Right. The app itself isn't really plagued with trolling. Meanwhile, 453 00:28:33,760 --> 00:28:36,920 Speaker 1: the secondary platforms like Instagram and YouTube are much more 454 00:28:37,000 --> 00:28:41,600 Speaker 1: likely to feature trollish videos. And making it worse, a 455 00:28:41,680 --> 00:28:45,760 Speaker 1: large percentage of TikTok's audience are not watching the videos 456 00:28:45,840 --> 00:28:49,560 Speaker 1: on TikTok. A lot of them have never downloaded TikTok. 457 00:28:49,960 --> 00:28:53,360 Speaker 1: They're watching the videos on these other platforms like Instagram 458 00:28:53,360 --> 00:28:58,280 Speaker 1: and YouTube, so they're seeing more of the cringe compilations 459 00:28:58,280 --> 00:29:01,280 Speaker 1: and things of that nature. So your experiences with TikTok 460 00:29:01,400 --> 00:29:06,240 Speaker 1: really depend heavily on how you actually access the content. Now, 461 00:29:06,920 --> 00:29:10,560 Speaker 1: not all of the ironic TikTok videos are mean-spirited. 462 00:29:10,760 --> 00:29:12,600 Speaker 1: A lot of them are, but not all of them.
463 00:29:12,640 --> 00:29:15,840 Speaker 1: Some are more absurdist humor that doesn't really seem to 464 00:29:15,880 --> 00:29:19,200 Speaker 1: have any kind of malicious intent. It's more like a 465 00:29:19,320 --> 00:29:24,239 Speaker 1: very goofy reaction to a similarly goofy video and not 466 00:29:24,360 --> 00:29:27,080 Speaker 1: meant to be like, wasn't that first video terrible, 467 00:29:27,160 --> 00:29:29,400 Speaker 1: or whatever. But there are a lot of users who 468 00:29:29,400 --> 00:29:32,040 Speaker 1: really do set out to make stuff that is either 469 00:29:32,080 --> 00:29:36,480 Speaker 1: intended to belittle or insult the original creator. Some are 470 00:29:36,560 --> 00:29:40,160 Speaker 1: just making stuff that is an overt attempt to be offensive, 471 00:29:40,920 --> 00:29:44,120 Speaker 1: probably for no other reason than they find it amusing 472 00:29:44,200 --> 00:29:46,680 Speaker 1: to get an emotional reaction out of people. It's 473 00:29:46,720 --> 00:29:50,960 Speaker 1: classic troll behavior. Others might be using TikTok to express 474 00:29:51,080 --> 00:29:55,920 Speaker 1: some truly terrible beliefs that they hold, like racist beliefs 475 00:29:55,960 --> 00:29:59,000 Speaker 1: or misogynistic beliefs, because they see it as a way 476 00:29:59,080 --> 00:30:01,600 Speaker 1: where they can express these things without there being 477 00:30:01,640 --> 00:30:05,400 Speaker 1: any kind of consequence to that. But why are these 478 00:30:05,440 --> 00:30:08,920 Speaker 1: secondary platforms presenting a more toxic version of TikTok in 479 00:30:08,960 --> 00:30:12,160 Speaker 1: the first place? Why is it so popular there? It 480 00:30:12,280 --> 00:30:15,400 Speaker 1: mostly has to do with how algorithms suss out which 481 00:30:15,520 --> 00:30:18,960 Speaker 1: videos they should suggest to users.
Videos that get a 482 00:30:19,000 --> 00:30:22,080 Speaker 1: lot of engagement tend to rise to the top because 483 00:30:22,240 --> 00:30:27,160 Speaker 1: engagement translates to spending more time on that platform, and 484 00:30:27,240 --> 00:30:30,440 Speaker 1: spending more time on the platform means spending more time 485 00:30:30,560 --> 00:30:34,400 Speaker 1: around ads. Spending more time around ads means that the 486 00:30:34,440 --> 00:30:38,280 Speaker 1: platform makes more money. So you see how this drives 487 00:30:38,320 --> 00:30:42,360 Speaker 1: decision making from a platform perspective. Platforms like YouTube 488 00:30:42,360 --> 00:30:46,240 Speaker 1: and Instagram are businesses, and a driving motivator for business 489 00:30:46,320 --> 00:30:49,600 Speaker 1: owners is to maximize profits. So to do that, if 490 00:30:49,640 --> 00:30:52,720 Speaker 1: you're YouTube or Instagram, you have to find ways to 491 00:30:52,800 --> 00:30:56,680 Speaker 1: keep people on your platform, to keep them engaged, and 492 00:30:56,800 --> 00:30:59,960 Speaker 1: that often means serving up some stuff that's pretty nasty 493 00:31:00,040 --> 00:31:04,360 Speaker 1: and mean-spirited, not because the content is better, but 494 00:31:04,480 --> 00:31:07,600 Speaker 1: because it keeps people glued to the platform, which is 495 00:31:07,600 --> 00:31:10,240 Speaker 1: pretty gross. But we see this all the time, and 496 00:31:10,280 --> 00:31:12,560 Speaker 1: not just with TikTok. I'm not trying to call them 497 00:31:12,600 --> 00:31:15,800 Speaker 1: out here; it's on all these platforms. It's also one 498 00:31:15,840 --> 00:31:18,800 Speaker 1: of the underlying principles that fuels the discussion around fake 499 00:31:18,920 --> 00:31:22,480 Speaker 1: news and the promotion of extremist ideologies on platforms like 500 00:31:22,520 --> 00:31:26,840 Speaker 1: Twitter and Facebook.
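The engagement-driven ranking just described can be sketched in a few lines. To be clear, this is a toy illustration, not any platform's actual algorithm; the signal names and weights here are invented for the example.

```python
# Toy sketch of engagement-driven feed ranking -- invented signals
# and weights, NOT any real platform's algorithm.
def engagement_score(video):
    # Comments and shares are weighted above simple likes because they
    # tend to predict longer sessions (and thus more time around ads).
    return (video["likes"] * 1.0
            + video["comments"] * 3.0
            + video["shares"] * 5.0
            + video["watch_seconds"] * 0.1)

def rank_feed(videos):
    # Highest-engagement videos rise to the top of the suggestions.
    return sorted(videos, key=engagement_score, reverse=True)

videos = [
    {"id": "calm-tutorial", "likes": 900, "comments": 40, "shares": 10,
     "watch_seconds": 3000},
    {"id": "outrage-bait", "likes": 500, "comments": 400, "shares": 200,
     "watch_seconds": 9000},
]
print([v["id"] for v in rank_feed(videos)])
# -> ['outrage-bait', 'calm-tutorial']
```

Note how the divisive video outranks the better-liked one: it drives more comments, shares, and watch time, which is exactly the dynamic described above.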
Over at the TikTok app, meanwhile, it's 501 00:31:26,920 --> 00:31:30,479 Speaker 1: largely using a different type of curating. A common feature 502 00:31:30,720 --> 00:31:33,640 Speaker 1: on social platforms is the ability to indicate you like 503 00:31:34,080 --> 00:31:37,120 Speaker 1: a post or video or whatever. So you click on 504 00:31:37,160 --> 00:31:39,680 Speaker 1: the little like button or the thumbs up icon or 505 00:31:39,680 --> 00:31:42,120 Speaker 1: a little heart icon or whatever it is, and you 506 00:31:42,200 --> 00:31:46,440 Speaker 1: express your admiration for the content. On the back end, 507 00:31:46,800 --> 00:31:50,440 Speaker 1: the service logs this response, then begins to develop a 508 00:31:50,520 --> 00:31:54,120 Speaker 1: profile for you. Are you liking a lot of pictures 509 00:31:54,120 --> 00:31:57,160 Speaker 1: of dogs on Instagram? Well, what do you know, when 510 00:31:57,200 --> 00:31:59,760 Speaker 1: you start doing a search on Instagram, a lot of 511 00:31:59,760 --> 00:32:02,520 Speaker 1: the images that are just gonna populate before you even 512 00:32:02,560 --> 00:32:04,800 Speaker 1: type in a search term are going to be dogs. 513 00:32:05,360 --> 00:32:09,080 Speaker 1: On TikTok, if you indicate that you like the lip-syncing videos 514 00:32:09,080 --> 00:32:12,120 Speaker 1: that have really clever edits, maybe they have really interesting 515 00:32:12,160 --> 00:32:15,000 Speaker 1: makeup effects or something, and you're clicking like on a lot 516 00:32:15,040 --> 00:32:17,000 Speaker 1: of those, you're gonna see more of those pop up 517 00:32:17,000 --> 00:32:20,600 Speaker 1: in your feed. If you don't like videos that are 518 00:32:20,720 --> 00:32:23,120 Speaker 1: more mean-spirited, like you never hit like on a 519 00:32:23,160 --> 00:32:26,000 Speaker 1: mean-spirited video, then over time they're going to show 520 00:32:26,080 --> 00:32:30,240 Speaker 1: up less frequently.
Your actions guide TikTok to curating a 521 00:32:30,280 --> 00:32:33,680 Speaker 1: feed that's most likely to keep your attention, because again, 522 00:32:34,160 --> 00:32:37,240 Speaker 1: keeping your attention, keeping you on the platform for as 523 00:32:37,320 --> 00:32:40,800 Speaker 1: long as possible is the goal, because that's what generates 524 00:32:40,840 --> 00:32:44,520 Speaker 1: revenue for TikTok. There's a feedback loop going on here 525 00:32:44,880 --> 00:32:48,280 Speaker 1: in which TikTok gathers information about its users, then it 526 00:32:48,360 --> 00:32:51,720 Speaker 1: makes use of that information to tweak the presentation of 527 00:32:51,760 --> 00:32:54,920 Speaker 1: the platform to those users in an effort to improve 528 00:32:54,960 --> 00:32:58,160 Speaker 1: the experience, and by improve the experience, I mean 529 00:32:58,400 --> 00:33:03,120 Speaker 1: encourage more engagement, then monitor the results. The cycle repeats 530 00:33:03,520 --> 00:33:06,840 Speaker 1: endlessly with the goal of constantly morphing to suit the 531 00:33:06,840 --> 00:33:10,520 Speaker 1: preferences of the individual user, while also promoting content that 532 00:33:10,600 --> 00:33:15,240 Speaker 1: has near-universal acceptance. There's another aspect of TikTok that 533 00:33:15,280 --> 00:33:17,320 Speaker 1: we're going to explore when we come back, and that's 534 00:33:17,320 --> 00:33:20,520 Speaker 1: how it has come under scrutiny from the standpoint of 535 00:33:21,360 --> 00:33:24,600 Speaker 1: national security. I'll explain more in a minute, but first 536 00:33:24,640 --> 00:33:34,840 Speaker 1: let's take another quick break.
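That like-driven feedback loop can be sketched very simply. Again, this is a minimal illustration under invented assumptions (a single category tag per video, a plain like counter); a real recommender uses far richer signals.

```python
from collections import Counter

# Minimal sketch of a like-driven curation feedback loop.
# Category tags and the single-counter profile are invented
# for illustration; this is not TikTok's actual system.
class FeedCurator:
    def __init__(self):
        self.profile = Counter()  # tally of categories this user has liked

    def record_like(self, category):
        # The back end logs each like and updates the user's profile.
        self.profile[category] += 1

    def curate(self, candidates, k=3):
        # Serve the candidates whose categories the user likes most;
        # categories the user never likes surface less and less.
        return sorted(candidates,
                      key=lambda v: self.profile[v["category"]],
                      reverse=True)[:k]

curator = FeedCurator()
for _ in range(5):
    curator.record_like("lip-sync")   # user keeps liking lip-sync videos
curator.record_like("pets")           # and, once, a pet video

candidates = [{"category": c}
              for c in ["mean-spirited", "lip-sync", "pets", "lip-sync"]]
print(curator.curate(candidates, k=2))
# -> [{'category': 'lip-sync'}, {'category': 'lip-sync'}]
```

Each serve-and-like round feeds back into the profile, which is the "gather information, tweak the presentation, monitor the results" cycle described above.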
Before I get to the 537 00:34:35,000 --> 00:34:39,160 Speaker 1: national security stuff with TikTok, it's important to mention that 538 00:34:39,240 --> 00:34:41,920 Speaker 1: the app has been the object of scrutiny for 539 00:34:42,200 --> 00:34:46,760 Speaker 1: lots of different reasons, not just national security. One of 540 00:34:46,760 --> 00:34:49,840 Speaker 1: those reasons is that the app hasn't been terribly good 541 00:34:49,840 --> 00:34:53,880 Speaker 1: at enforcing any sort of age restrictions, and so there 542 00:34:53,880 --> 00:34:57,400 Speaker 1: have been some awful, high-profile cases in which the 543 00:34:57,400 --> 00:35:00,400 Speaker 1: app has played a part in putting kids in danger. 544 00:35:00,760 --> 00:35:04,560 Speaker 1: For example, according to the South China Morning Post, kids 545 00:35:04,560 --> 00:35:07,880 Speaker 1: in Hong Kong potentially put themselves in danger on TikTok 546 00:35:08,120 --> 00:35:11,960 Speaker 1: by sharing personal information like their full names or their 547 00:35:12,000 --> 00:35:16,520 Speaker 1: phone numbers. So that's not great, and it's not that 548 00:35:16,560 --> 00:35:20,200 Speaker 1: TikTok should prevent them from doing that, but TikTok allowing 549 00:35:20,280 --> 00:35:22,959 Speaker 1: such young people access to the app in the first place, 550 00:35:23,000 --> 00:35:26,200 Speaker 1: and I'm talking like nine- or ten-year-olds, that's 551 00:35:26,200 --> 00:35:30,080 Speaker 1: a problem. TikTok, like pretty much any online space, has 552 00:35:30,120 --> 00:35:33,080 Speaker 1: its share of predators who might try and exploit that 553 00:35:33,200 --> 00:35:36,960 Speaker 1: kind of information, and that alone is truly disturbing and 554 00:35:37,000 --> 00:35:40,440 Speaker 1: needs to be addressed.
And because TikTok doesn't have good 555 00:35:40,480 --> 00:35:43,400 Speaker 1: age restrictions in place, it has run afoul of 556 00:35:43,480 --> 00:35:46,200 Speaker 1: the law in some countries. So in the United States, 557 00:35:46,400 --> 00:35:51,040 Speaker 1: the Federal Trade Commission fined TikTok in February twenty nineteen for 558 00:35:51,120 --> 00:35:54,839 Speaker 1: collecting data on children under the age of thirteen. That's 559 00:35:54,840 --> 00:35:58,560 Speaker 1: a violation of the Children's Online Privacy Protection Act in 560 00:35:58,600 --> 00:36:02,560 Speaker 1: the United States, also known as COPPA. Now, incidentally, I'll 561 00:36:02,560 --> 00:36:05,960 Speaker 1: have to do a full episode on COPPA because it's 562 00:36:05,960 --> 00:36:09,840 Speaker 1: a law that's affecting lots of people, including creators on YouTube, 563 00:36:10,239 --> 00:36:14,400 Speaker 1: and it's actually a pretty complicated issue. Anyway, the FTC 564 00:36:14,600 --> 00:36:19,040 Speaker 1: found TikTok guilty of violating COPPA by tracking this data 565 00:36:19,120 --> 00:36:22,640 Speaker 1: of underage users, and so Byte Dance was hit with 566 00:36:22,680 --> 00:36:28,840 Speaker 1: a fine of five point seven million dollars. And sure, 567 00:36:29,080 --> 00:36:32,640 Speaker 1: to folks like me, five point seven million bucks is 568 00:36:32,719 --> 00:36:36,280 Speaker 1: an enormous amount of money. It's truly enough to merit 569 00:36:36,320 --> 00:36:40,240 Speaker 1: the designation of a princely sum. But around that same time, 570 00:36:40,280 --> 00:36:45,560 Speaker 1: Byte Dance had a valuation of nearly eighty billion dollars. 571 00:36:46,440 --> 00:36:49,120 Speaker 1: Do you know how many times five point seven million 572 00:36:49,239 --> 00:36:52,600 Speaker 1: goes into eighty billion? I do. I did the math. 573 00:36:52,680 --> 00:36:57,759 Speaker 1: It's more than fourteen thousand times.
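A quick check of that arithmetic, using the figures as stated (a five point seven million dollar fine against an eighty billion dollar valuation):

```python
# Sanity-checking the fine-versus-valuation math from the episode.
fine = 5_700_000            # FTC fine, in dollars
valuation = 80_000_000_000  # approximate Byte Dance valuation, in dollars

multiple = valuation / fine
print(round(multiple))  # -> 14035, i.e. "more than fourteen thousand times"

share_pct = fine / valuation * 100
print(share_pct)  # roughly 0.007 percent of the company's value
```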
It is point zero 574 00:35:58,040 --> 00:36:03,160 Speaker 1: zero seven percent of the company's value. So the fine 575 00:36:03,280 --> 00:36:06,360 Speaker 1: wasn't even a drop in the bucket for Byte Dance. 576 00:36:06,960 --> 00:36:09,480 Speaker 1: The app has also been in the spotlight for how 577 00:36:09,520 --> 00:36:13,480 Speaker 1: it monetizes itself. So there's the advertising that I mentioned. 578 00:36:13,480 --> 00:36:16,600 Speaker 1: Of course, you know you see ads and that generates revenue, 579 00:36:16,960 --> 00:36:21,919 Speaker 1: but that's everywhere. TikTok also allows for sponsored videos. So 580 00:36:22,200 --> 00:36:25,560 Speaker 1: you could have a brand sponsor a content creator, and 581 00:36:25,640 --> 00:36:30,239 Speaker 1: that creator then goes and makes videos featuring that brand's products, 582 00:36:30,320 --> 00:36:32,080 Speaker 1: and then you would have a tab in the app 583 00:36:32,320 --> 00:36:34,879 Speaker 1: that would make it easy for users to purchase those 584 00:36:34,920 --> 00:36:37,319 Speaker 1: products featured in the videos. They click on the little 585 00:36:37,320 --> 00:36:40,880 Speaker 1: tab, they can order the stuff, and it really facilitates 586 00:36:40,920 --> 00:36:45,960 Speaker 1: that transaction. Then there are the virtual items, which are 587 00:36:46,440 --> 00:36:50,840 Speaker 1: coins and then virtual gifts. This gets a little wishy- 588 00:36:50,840 --> 00:36:55,600 Speaker 1: washy complicated, timey-wimey, wibbly-wobbly. So TikTok allows quote 589 00:36:55,640 --> 00:37:00,879 Speaker 1: select users aged sixteen or above end quote to participate 590 00:37:01,000 --> 00:37:05,720 Speaker 1: in the program. In this case, the program is live streaming. Quote.
591 00:37:05,880 --> 00:37:09,520 Speaker 1: Such users shall be selected exclusively at the discretion of 592 00:37:09,560 --> 00:37:13,480 Speaker 1: the platform on the basis of various criteria, including their 593 00:37:13,520 --> 00:37:18,520 Speaker 1: track record in creating quality content, their number of followers, etcetera. 594 00:37:18,680 --> 00:37:22,920 Speaker 1: End quote. So live streaming is now a thing on TikTok, 595 00:37:23,080 --> 00:37:25,160 Speaker 1: and you have to be at least sixteen years old 596 00:37:25,200 --> 00:37:29,399 Speaker 1: in order to be able to do it. Beyond that, 597 00:37:29,719 --> 00:37:33,879 Speaker 1: for those users who are eighteen years or older, there's 598 00:37:33,880 --> 00:37:40,359 Speaker 1: an additional benefit. These users can purchase virtual coins, or 599 00:37:40,400 --> 00:37:43,640 Speaker 1: if they are people in this live stream program, they 600 00:37:43,680 --> 00:37:48,279 Speaker 1: can accept virtual items. So to purchase a virtual item, 601 00:37:48,400 --> 00:37:51,560 Speaker 1: users have to be at least eighteen years old, or 602 00:37:51,600 --> 00:37:53,440 Speaker 1: they have to be the age of majority for their 603 00:37:53,480 --> 00:37:58,719 Speaker 1: respective country. And they can purchase virtual gifts with virtual coins. 604 00:37:58,760 --> 00:38:00,920 Speaker 1: The gifts are meant to show appreciation toward 605 00:38:01,040 --> 00:38:04,920 Speaker 1: content creators, and the creators receive the gifts in the 606 00:38:05,000 --> 00:38:09,480 Speaker 1: form of another virtual currency called diamonds, which is getting 607 00:38:09,480 --> 00:38:13,120 Speaker 1: a bit confusing, right? Virtual coins are used to purchase 608 00:38:13,239 --> 00:38:19,000 Speaker 1: virtual gifts, which convert into another virtual item called a diamond.
609 00:38:19,040 --> 00:38:22,040 Speaker 1: Gift giving is public, by the way, so anyone on TikTok 610 00:38:22,120 --> 00:38:24,560 Speaker 1: will be able to see when one user sends a 611 00:38:24,560 --> 00:38:29,640 Speaker 1: gift to another, including what that gift was. Diamonds, meanwhile, 612 00:38:29,680 --> 00:38:33,239 Speaker 1: are, as TikTok puts it, quote, a measurement of the 613 00:38:33,280 --> 00:38:38,800 Speaker 1: popularity of the relevant user content, end quote, so it's 614 00:38:39,239 --> 00:38:45,520 Speaker 1: essentially saying how good or how popular is this video. 615 00:38:45,960 --> 00:38:48,960 Speaker 1: TikTok reserves the right to determine the rate of conversion 616 00:38:49,000 --> 00:38:53,120 Speaker 1: of diamonds to, you know, actual real-world money that 617 00:38:53,160 --> 00:38:56,560 Speaker 1: can be spent on real-world stuff. So creators who 618 00:38:56,640 --> 00:39:01,360 Speaker 1: earn diamonds can withdraw diamonds from their account, whereupon TikTok 619 00:39:01,400 --> 00:39:05,319 Speaker 1: will convert the virtual diamonds into real money based in 620 00:39:05,440 --> 00:39:12,000 Speaker 1: US currency using an arbitrary conversion rule that the company dictates. 621 00:39:12,080 --> 00:39:15,360 Speaker 1: So you can make money as a creator on TikTok. 622 00:39:15,640 --> 00:39:18,720 Speaker 1: But the amount you make is purely at the whim 623 00:39:18,719 --> 00:39:21,440 Speaker 1: of Byte Dance. They might say like, oh, a hundred 624 00:39:21,440 --> 00:39:24,279 Speaker 1: diamonds equals one dollar one day, and then the next 625 00:39:24,360 --> 00:39:27,280 Speaker 1: day they might say a thousand diamonds equals one dollar. 626 00:39:27,960 --> 00:39:32,640 Speaker 1: It's completely up to the company and there's no solid 627 00:39:32,680 --> 00:39:37,239 Speaker 1: conversion there. All right. But now about national security, what's 628 00:39:37,280 --> 00:39:40,600 Speaker 1: going on there?
Well, part of the issue doesn't necessarily 629 00:39:40,640 --> 00:39:43,799 Speaker 1: point back to Byte Dance, the company itself, but rather 630 00:39:43,880 --> 00:39:48,080 Speaker 1: how extremists can make use of social media to spread 631 00:39:48,160 --> 00:39:53,600 Speaker 1: their message or to spread misinformation or disinformation. The concern 632 00:39:54,080 --> 00:39:57,600 Speaker 1: is that extremist groups can spread messages promoting their philosophies 633 00:39:57,640 --> 00:40:01,400 Speaker 1: in short, bite-sized packages to a wide audience of 634 00:40:01,440 --> 00:40:07,200 Speaker 1: mostly young, and by extension impressionable, viewers. Whether it's an 635 00:40:07,280 --> 00:40:12,280 Speaker 1: organization like ISIS or looser groups such as white supremacists, 636 00:40:12,600 --> 00:40:15,680 Speaker 1: there's a real concern that platforms like TikTok will serve 637 00:40:15,760 --> 00:40:18,800 Speaker 1: as an entry point for more young people to join 638 00:40:19,200 --> 00:40:23,040 Speaker 1: dangerous groups. Then there are fears that the app could 639 00:40:23,120 --> 00:40:26,200 Speaker 1: be used to interfere with major events, such as a 640 00:40:26,320 --> 00:40:29,879 Speaker 1: country's elections. Here in the United States, there's a real 641 00:40:29,960 --> 00:40:33,839 Speaker 1: concern that social media services are being leveraged by all 642 00:40:33,960 --> 00:40:37,680 Speaker 1: sorts of parties, both domestic and foreign, in an effort 643 00:40:37,719 --> 00:40:41,440 Speaker 1: to spread misinformation, to undermine faith in the democratic process, 644 00:40:41,719 --> 00:40:45,359 Speaker 1: or otherwise affect the outcome of the election process in 645 00:40:45,400 --> 00:40:50,480 Speaker 1: some way.
While you could argue, convincingly I might add, 646 00:40:50,920 --> 00:40:54,919 Speaker 1: that the way these social media platforms curate information can 647 00:40:54,960 --> 00:40:58,480 Speaker 1: contribute to their misuse. You could also argue that the 648 00:40:58,520 --> 00:41:04,080 Speaker 1: services themselves aren't necessarily engaged in these activities. They're enablers, 649 00:41:04,120 --> 00:41:08,399 Speaker 1: but they're not necessarily instigators. So while there are real 650 00:41:08,520 --> 00:41:14,040 Speaker 1: criticisms about platforms like Facebook promoting harmful misinformation or extremist 651 00:41:14,120 --> 00:41:17,680 Speaker 1: views due to the way the Facebook algorithm works, most 652 00:41:17,719 --> 00:41:20,640 Speaker 1: people would not go so far as to accuse Facebook 653 00:41:20,680 --> 00:41:24,840 Speaker 1: itself of creating that content. But there's a big difference 654 00:41:24,840 --> 00:41:28,319 Speaker 1: between TikTok and Facebook besides the way the two platforms 655 00:41:28,360 --> 00:41:32,840 Speaker 1: present content, and that's the fact that TikTok is owned 656 00:41:33,040 --> 00:41:36,320 Speaker 1: by a Chinese company. Now, I talked a lot about 657 00:41:36,360 --> 00:41:40,799 Speaker 1: tech politics and China in my episodes about Huawei and 658 00:41:40,920 --> 00:41:43,600 Speaker 1: also another one called Why Is Everything Made in China? 659 00:41:43,840 --> 00:41:47,080 Speaker 1: But here's a quick overview. The government in China is 660 00:41:47,160 --> 00:41:50,640 Speaker 1: ultimately under the governance of the Communist Party of China. 661 00:41:51,120 --> 00:41:55,200 Speaker 1: In China, the government has tight controls on what the 662 00:41:55,280 --> 00:41:58,560 Speaker 1: media is allowed to present to the Chinese public.
The 663 00:41:58,640 --> 00:42:02,920 Speaker 1: Chinese government can censor information, restrict access to information 664 00:42:02,960 --> 00:42:06,360 Speaker 1: from outside China, and even dictate what can actually be 665 00:42:06,440 --> 00:42:11,000 Speaker 1: communicated to citizens. In addition, in twenty seventeen, the government 666 00:42:11,040 --> 00:42:14,920 Speaker 1: passed a law that says, quote, any organization and citizen 667 00:42:15,440 --> 00:42:20,000 Speaker 1: in China should support and cooperate 668 00:42:20,239 --> 00:42:23,960 Speaker 1: in the national intelligence work, end quote. So that is, 669 00:42:24,520 --> 00:42:27,719 Speaker 1: if the company or person originates in China, then that 670 00:42:27,880 --> 00:42:30,320 Speaker 1: person or company has a duty to support China's 671 00:42:30,440 --> 00:42:35,920 Speaker 1: national intelligence efforts, which includes spying. So you've got this 672 00:42:36,160 --> 00:42:41,320 Speaker 1: incredibly popular app used by hundreds of millions of people 673 00:42:41,480 --> 00:42:44,000 Speaker 1: around the world, and you have a country with a 674 00:42:44,040 --> 00:42:48,040 Speaker 1: government that demands companies and citizens within that country amplify 675 00:42:48,200 --> 00:42:53,200 Speaker 1: the country's own intelligence efforts. It is understandable why leaders 676 00:42:53,239 --> 00:42:56,239 Speaker 1: in other countries would become concerned about the rise in 677 00:42:56,320 --> 00:43:00,239 Speaker 1: popularity of a Chinese-based app. If the company were 678 00:43:00,280 --> 00:43:03,840 Speaker 1: gathering all that data, it might be used in harmful ways. 679 00:43:04,239 --> 00:43:06,680 Speaker 1: If people in the US were to use the app 680 00:43:06,719 --> 00:43:10,799 Speaker 1: in sensitive locations such as on government property or on 681 00:43:10,880 --> 00:43:14,480 Speaker 1: military sites.
It could give away information to a not 682 00:43:14,840 --> 00:43:18,360 Speaker 1: quite friendly country. In fact, in the United States, the 683 00:43:18,480 --> 00:43:22,160 Speaker 1: Army and the Navy have banned the use of TikTok. 684 00:43:22,239 --> 00:43:25,280 Speaker 1: You cannot install it on any government-issued phone. 685 00:43:26,040 --> 00:43:29,239 Speaker 1: As a country that has been known to employ hackers in 686 00:43:29,320 --> 00:43:33,920 Speaker 1: cyber warfare projects in the past, China is already pretty 687 00:43:33,960 --> 00:43:37,439 Speaker 1: high on that suspect list, so for some the app 688 00:43:37,560 --> 00:43:41,600 Speaker 1: is akin to handing over ammunition to an enemy during wartime. 689 00:43:42,200 --> 00:43:46,440 Speaker 1: In September twenty nineteen, the newspaper The Guardian published an 690 00:43:46,520 --> 00:43:50,840 Speaker 1: article that included excerpts from leaked documents from inside TikTok. 691 00:43:51,320 --> 00:43:54,480 Speaker 1: Those documents showed that the company had been sending directives 692 00:43:54,480 --> 00:43:57,440 Speaker 1: to moderators, whose job it is to look for content 693 00:43:57,600 --> 00:44:02,600 Speaker 1: that violates TikTok's terms of service.
The directives expanded 694 00:44:02,920 --> 00:44:06,279 Speaker 1: that definition beyond the stuff you would expect, you know, 695 00:44:06,640 --> 00:44:12,240 Speaker 1: stuff that depicts violence or sexual content. The directives 696 00:44:12,280 --> 00:44:15,760 Speaker 1: included other stuff: if a video were to include 697 00:44:15,800 --> 00:44:19,600 Speaker 1: material that criticized the Chinese government, or one that addressed 698 00:44:19,680 --> 00:44:23,360 Speaker 1: the political situation in Tibet, or one that talked about 699 00:44:23,480 --> 00:44:26,799 Speaker 1: other topics that the Chinese government wanted to restrict, those 700 00:44:26,840 --> 00:44:30,360 Speaker 1: were to be removed or shifted over so that nobody 701 00:44:30,360 --> 00:44:33,279 Speaker 1: would ever see them. So technically the videos would be 702 00:44:33,320 --> 00:44:36,360 Speaker 1: on the platform, but they would be in a bucket 703 00:44:36,800 --> 00:44:40,000 Speaker 1: that no one would ever access. When citizens in Hong 704 00:44:40,080 --> 00:44:44,560 Speaker 1: Kong began organizing protests in twenty nineteen against China, their 705 00:44:44,600 --> 00:44:49,480 Speaker 1: stories were shared on numerous social platforms. Mysteriously, though, it 706 00:44:49,520 --> 00:44:53,320 Speaker 1: was really hard to find any examples on TikTok. Since 707 00:44:53,360 --> 00:44:58,279 Speaker 1: those protests were directed at mainland China, the implication was 708 00:44:58,320 --> 00:45:02,120 Speaker 1: that TikTok might be purposefully suppressing any videos that 709 00:45:02,160 --> 00:45:05,480 Speaker 1: were coming out of Hong Kong that were 710 00:45:05,520 --> 00:45:09,359 Speaker 1: related to the protests. Now in the United States, lawmakers 711 00:45:09,440 --> 00:45:13,560 Speaker 1: called for investigations into TikTok.
ByteDance responded by saying 712 00:45:13,600 --> 00:45:17,200 Speaker 1: that all US user data exists on servers that are 713 00:45:17,200 --> 00:45:20,799 Speaker 1: actually in the United States, with some backups that are 714 00:45:20,840 --> 00:45:25,600 Speaker 1: also in Singapore, but that no US data, no US 715 00:45:25,800 --> 00:45:30,680 Speaker 1: user information lives on any servers in China itself, nor 716 00:45:31,600 --> 00:45:34,919 Speaker 1: are the data on those servers subject to Chinese law, 717 00:45:35,480 --> 00:45:40,360 Speaker 1: so the Chinese government cannot do anything about that, according 718 00:45:40,400 --> 00:45:43,840 Speaker 1: to ByteDance, nor has it asked for any stuff 719 00:45:43,880 --> 00:45:47,080 Speaker 1: to be taken down, according to ByteDance. Towards the 720 00:45:47,160 --> 00:45:49,879 Speaker 1: end of twenty nineteen, rumors were popping up that Byte 721 00:45:49,920 --> 00:45:54,080 Speaker 1: Dance was actually considering selling off some or all of TikTok. 722 00:45:54,680 --> 00:45:57,799 Speaker 1: ByteDance executives have denied these rumors. They say that 723 00:45:57,840 --> 00:46:00,160 Speaker 1: the company has no plans to sell any of the 724 00:46:00,200 --> 00:46:03,640 Speaker 1: service off, despite the pressure the company 725 00:46:03,920 --> 00:46:07,120 Speaker 1: is facing on the international stage. At the same time, 726 00:46:07,200 --> 00:46:11,080 Speaker 1: the US Committee on Foreign Investment launched an inquiry into 727 00:46:11,160 --> 00:46:14,879 Speaker 1: whether or not ByteDance should be forced to spin 728 00:46:14,920 --> 00:46:18,800 Speaker 1: off Musical.ly. That's the basis of TikTok's presence outside 729 00:46:18,800 --> 00:46:23,080 Speaker 1: of China itself.
As of this recording, ByteDance still 730 00:46:23,120 --> 00:46:26,120 Speaker 1: has control of TikTok, and according to the company, the 731 00:46:26,200 --> 00:46:29,360 Speaker 1: Chinese government has no say in how data outside of 732 00:46:29,440 --> 00:46:32,799 Speaker 1: China can be stored or displayed. There have been a 733 00:46:32,800 --> 00:46:37,239 Speaker 1: few cases in which investigators pointed out examples of apparent censorship, 734 00:46:37,680 --> 00:46:41,880 Speaker 1: where people's videos appeared to have been taken down on purpose, 735 00:46:42,400 --> 00:46:46,120 Speaker 1: but so far the TikTok representatives have explained those away 736 00:46:46,160 --> 00:46:48,960 Speaker 1: as just being examples of human error. You know, they're 737 00:46:48,960 --> 00:46:53,000 Speaker 1: not examples of a conscious effort to suppress information. By 738 00:46:53,000 --> 00:46:56,160 Speaker 1: the time you hear this, things may have changed. We'll 739 00:46:56,200 --> 00:47:00,719 Speaker 1: have to see. There's also an ongoing concern that TikTok 740 00:47:01,480 --> 00:47:05,279 Speaker 1: is going to be the home of deepfakes in 741 00:47:05,320 --> 00:47:08,600 Speaker 1: the near future. There's talk that ByteDance is investing 742 00:47:08,640 --> 00:47:12,960 Speaker 1: heavily in technology that could lead to deepfake videos. 743 00:47:13,000 --> 00:47:17,279 Speaker 1: So that's another thing that people are worried about. I'm 744 00:47:17,280 --> 00:47:19,760 Speaker 1: sure I'll have to do an update on this topic 745 00:47:20,040 --> 00:47:22,640 Speaker 1: in the future, but in the meantime, I'm gonna sign 746 00:47:22,640 --> 00:47:25,520 Speaker 1: off and go be old and grouchy for a while. 747 00:47:26,280 --> 00:47:28,520 Speaker 1: If you guys want to reach out to me, please do.
748 00:47:29,080 --> 00:47:32,520 Speaker 1: You can find me on those other old social media platforms, 749 00:47:32,840 --> 00:47:34,839 Speaker 1: you know, the ones where an old guy like me 750 00:47:34,960 --> 00:47:38,640 Speaker 1: can still feel comfortable. I'm talking about Twitter and Facebook. 751 00:47:38,800 --> 00:47:41,120 Speaker 1: The handle for both of those is TechStuff 752 00:47:41,400 --> 00:47:44,799 Speaker 1: HSW, and I'll talk to you again really soon. 753 00:47:49,080 --> 00:47:52,120 Speaker 1: TechStuff is an iHeartRadio production. For more 754 00:47:52,200 --> 00:47:55,600 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 755 00:47:55,719 --> 00:47:58,880 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.