1 00:00:00,080 --> 00:00:02,680 Speaker 1: Okay, we would like to welcome Thane Rosenbaum to the show.
2 00:00:02,720 --> 00:00:05,640 Speaker 1: Now, CBS News legal analyst. So what is going on
3 00:00:05,920 --> 00:00:08,360 Speaker 1: with TikTok? Is it going to exist in the United States?
4 00:00:08,400 --> 00:00:11,479 Speaker 1: And is it going to be the same algorithm, Jack?
5 00:00:11,480 --> 00:00:13,440 Speaker 2: We won't know for another ninety days.
6 00:00:13,560 --> 00:00:18,400 Speaker 3: This is now the fourth attempt to stave off forcing
7 00:00:18,520 --> 00:00:22,239 Speaker 3: TikTok to end operations in the United States. Remember this
8 00:00:22,360 --> 00:00:27,280 Speaker 3: started with legislation during the Biden administration that essentially said,
9 00:00:27,400 --> 00:00:31,120 Speaker 3: you know, we're not interested in social media companies
10 00:00:31,520 --> 00:00:36,600 Speaker 3: that mine the data of Americans and that come from putative enemies
11 00:00:36,600 --> 00:00:41,320 Speaker 3: of ours, and that there was no sufficient insulation or
12 00:00:41,479 --> 00:00:45,559 Speaker 3: Chinese wall, so to speak, between the Chinese government and TikTok.
13 00:00:45,680 --> 00:00:47,280 Speaker 4: That's a reasonable stance in my opinion.
14 00:00:48,080 --> 00:00:51,280 Speaker 3: Yeah, yeah, and TikTok said, hey, we're not really owned
15 00:00:51,360 --> 00:00:55,560 Speaker 3: by the Chinese, and so there became a fight about that,
16 00:00:55,640 --> 00:00:59,000 Speaker 3: when in fact ByteDance did have relationships with the
17 00:00:59,200 --> 00:01:02,360 Speaker 3: Chinese government, and it's not as if they could refuse
18 00:01:02,400 --> 00:01:03,120 Speaker 3: a request.
19 00:01:03,720 --> 00:01:05,440 Speaker 2: You know, the Chinese government
20 00:01:05,160 --> 00:01:07,800 Speaker 3: is like the Godfather, you don't refuse the request. So if
21 00:01:07,800 --> 00:01:09,640 Speaker 3: they had asked for something, they would have said yes.
22 00:01:10,040 --> 00:01:12,760 Speaker 3: So the Biden administration said, look, you know, you either
23 00:01:12,840 --> 00:01:18,400 Speaker 3: sell yourself to an independent corporation or you shut
24 00:01:18,440 --> 00:01:20,600 Speaker 3: down your operations. And they said, you want to
25 00:01:20,680 --> 00:01:22,880 Speaker 3: do whatever you do in Europe, that's fine, but not here.
26 00:01:23,680 --> 00:01:27,039 Speaker 3: Donald Trump, you know, said, well, wait a minute. I
27 00:01:27,080 --> 00:01:30,120 Speaker 3: think I won this election in part because of TikTok.
28 00:01:30,480 --> 00:01:34,360 Speaker 2: Probably, you know, it rallied an audience that I didn't
29 00:01:34,360 --> 00:01:36,960 Speaker 2: really even have the last time, and
30 00:01:37,319 --> 00:01:40,280 Speaker 3: everyone seems to have it, and there's an enormous
31 00:01:40,319 --> 00:01:43,600 Speaker 3: amount of it. Apparently, I wouldn't know this, Jack,
32 00:01:43,880 --> 00:01:46,640 Speaker 3: but apparently it's not just images of people dancing with
33 00:01:46,680 --> 00:01:49,840 Speaker 3: their cats. There are, in fact, small businesses all over
34 00:01:49,880 --> 00:01:53,000 Speaker 3: the United States that depend on TikTok as their main
35 00:01:53,080 --> 00:01:57,920 Speaker 3: marketing technique. So there was a movement among American businesses
36 00:01:57,960 --> 00:01:59,920 Speaker 3: that said, well, wait a minute, we're a small business.
37 00:02:00,280 --> 00:02:01,920 Speaker 2: This is how we compete. We need TikTok.
38 00:02:02,360 --> 00:02:06,880 Speaker 3: So Donald Trump signed a series of executive orders, just
39 00:02:06,960 --> 00:02:10,240 Speaker 3: signed one the other day, that extended the time period for
40 00:02:10,320 --> 00:02:15,840 Speaker 3: either shutting down or selling. And he claims that he
41 00:02:16,000 --> 00:02:20,760 Speaker 3: knows of a consortium of venture capital companies, private equity,
42 00:02:21,240 --> 00:02:25,600 Speaker 3: and tech companies, including Oracle, and as you know, Larry
43 00:02:25,960 --> 00:02:30,360 Speaker 3: Ellison's wealth has just increased enormously, and I
44 00:02:30,400 --> 00:02:33,080 Speaker 3: think he's interested in this too. So he apparently is
45 00:02:33,120 --> 00:02:35,200 Speaker 3: part of this consortium.
46 00:02:35,560 --> 00:02:37,239 Speaker 2: And the argument is
47 00:02:38,120 --> 00:02:43,760 Speaker 3: that, well, we have to add someone to the board
48 00:02:43,760 --> 00:02:46,560 Speaker 3: of directors who has
49 00:02:46,600 --> 00:02:48,200 Speaker 3: some American affiliation.
50 00:02:48,360 --> 00:02:50,960 Speaker 2: That's a requirement, and
51 00:02:52,560 --> 00:02:56,680 Speaker 3: we'll extend the deadline for ninety days, hoping that the
52 00:02:56,760 --> 00:03:00,160 Speaker 3: consortium can pull together the funds and get the
53 00:03:00,200 --> 00:03:04,400 Speaker 3: regulatory approval that this no longer presents a national security
54 00:03:04,480 --> 00:03:05,720 Speaker 3: risk in the United States.
55 00:03:06,800 --> 00:03:10,760 Speaker 4: So, uh, do you have TikTok on your phone? No?
56 00:03:10,800 --> 00:03:15,440 Speaker 2: Me neither, and I only vaguely know what it is.
57 00:03:17,440 --> 00:03:21,520 Speaker 3: But Jack, Jack, if you'll remember, you had trouble
58 00:03:21,560 --> 00:03:22,040 Speaker 3: calling me.
59 00:03:22,600 --> 00:03:24,480 Speaker 4: Yeah, well I didn't personally.
60 00:03:24,080 --> 00:03:29,480 Speaker 3: No, I'm blaming myself, obviously, since
61 00:03:29,520 --> 00:03:30,440 Speaker 3: I don't have TikTok.
62 00:03:30,520 --> 00:03:32,360 Speaker 2: I don't even know how to use a phone, apparently,
63 00:03:32,520 --> 00:03:35,560 Speaker 2: so I'm not, I'm not the right guy.
64 00:03:35,760 --> 00:03:38,040 Speaker 1: Do you not have TikTok because you're worried it's a
65 00:03:38,120 --> 00:03:40,040 Speaker 1: Chinese data harvesting tool?
66 00:03:41,240 --> 00:03:44,920 Speaker 4: No, not really, you're just not interested in social media.
67 00:03:44,920 --> 00:03:45,640 Speaker 2: I'm a
68 00:03:45,560 --> 00:03:48,760 Speaker 3: writer by trade, and I just don't live in
69 00:03:48,800 --> 00:03:50,800 Speaker 3: a world where everything has to be visual.
70 00:03:50,960 --> 00:03:52,680 Speaker 2: And I worked for CBS Radio.
71 00:03:54,680 --> 00:03:56,600 Speaker 3: I don't, you know, I
72 00:03:56,600 --> 00:03:58,880 Speaker 3: don't need videos all day to keep me entertained.
73 00:03:58,960 --> 00:04:02,680 Speaker 1: Yeah, and I understand completely. The only reason I ever
74 00:04:02,720 --> 00:04:04,880 Speaker 1: got Instagram, which was fairly recently, is that it's kind of
75 00:04:04,920 --> 00:04:08,480 Speaker 1: a marketing tool for the show. But yeah, but we've
76 00:04:08,480 --> 00:04:10,440 Speaker 1: had the stats before, and you've probably seen them, of
77 00:04:10,680 --> 00:04:14,040 Speaker 1: how many hours a day young people spend on TikTok.
78 00:04:14,080 --> 00:04:15,200 Speaker 4: I mean, it's crazy.
79 00:04:15,400 --> 00:04:18,280 Speaker 1: They almost seem made up, but I assume they're true
80 00:04:18,279 --> 00:04:20,840 Speaker 1: because I've seen study after study say that, you know,
81 00:04:20,880 --> 00:04:22,800 Speaker 1: it's five hours for this group or seven hours for
82 00:04:22,839 --> 00:04:23,200 Speaker 1: that group.
83 00:04:23,200 --> 00:04:24,600 Speaker 4: I don't even know how that's possible.
84 00:04:25,920 --> 00:04:29,120 Speaker 3: But it's even worse than that, Jack, because if they
85 00:04:29,160 --> 00:04:32,120 Speaker 3: were just watching videos of people dancing with their cats,
86 00:04:32,440 --> 00:04:34,200 Speaker 3: that would be one thing, it would
87 00:04:34,240 --> 00:04:37,360 Speaker 3: be stupid as all get out. But if that's how
88 00:04:37,400 --> 00:04:39,679 Speaker 3: you want your kid to spend
89 00:04:39,720 --> 00:04:43,600 Speaker 3: their day, with another cat that's dancing, fine. But what
90 00:04:43,640 --> 00:04:46,760 Speaker 3: we're really learning, which is really distressing, and it raises
91 00:04:46,800 --> 00:04:50,360 Speaker 3: First Amendment issues and other regulatory issues, is that most
92 00:04:50,400 --> 00:04:54,480 Speaker 3: young people get their news from TikTok. Yeah, so that's
93 00:04:54,560 --> 00:04:58,360 Speaker 3: where it's worse than being entertained by silly videos. It's
94 00:04:58,400 --> 00:05:04,640 Speaker 3: that algorithm that can easily be manipulated or used to tailor-make
95 00:05:05,040 --> 00:05:08,560 Speaker 3: news feeds. They tailor it for certain age groups,
96 00:05:08,600 --> 00:05:13,159 Speaker 3: for certain sensibilities, perspectives, where they live, you know, whatever information.
97 00:05:13,279 --> 00:05:16,680 Speaker 3: So that's really what national security means. It doesn't necessarily
98 00:05:16,760 --> 00:05:18,000 Speaker 3: mean a bomb is going to go off.
99 00:05:18,360 --> 00:05:21,000 Speaker 2: It means that you're going to mine data that tells
100 00:05:20,800 --> 00:05:24,000 Speaker 3: you everything you need to know about America, where we're vulnerable,
101 00:05:24,240 --> 00:05:26,800 Speaker 3: what kind of stuff we believe in, how easy it
102 00:05:26,839 --> 00:05:27,920 Speaker 3: is to manipulate us.
103 00:05:28,440 --> 00:05:31,520 Speaker 1: Just give us a video, or shade news stories in a
104 00:05:31,520 --> 00:05:33,120 Speaker 1: certain direction that's in your favor.
105 00:05:34,400 --> 00:05:37,559 Speaker 2: Exactly right, as I'm saying about, you know, tailoring news
106 00:05:37,560 --> 00:05:39,599 Speaker 2: feeds or blocking
107 00:05:39,200 --> 00:05:42,960 Speaker 3: news feeds, for instance, make sure that Jack Armstrong never
108 00:05:43,320 --> 00:05:44,680 Speaker 3: sees a story
109 00:05:44,320 --> 00:05:46,200 Speaker 4: about this, right, he
110 00:05:46,160 --> 00:05:48,440 Speaker 3: cannot see a story about this, because he's the kind
111 00:05:48,440 --> 00:05:52,159 Speaker 3: of guy that will talk about it
112 00:05:52,240 --> 00:05:55,719 Speaker 3: on the show. So you see, that's the kind of manipulation,
113 00:05:56,080 --> 00:06:00,000 Speaker 3: and you know, remember the social media companies are
114 00:06:00,040 --> 00:06:04,520 Speaker 3: protected, the government can't interfere with their First Amendment rights.
115 00:06:04,800 --> 00:06:07,080 Speaker 3: So if TikTok is an American company, if it
116 00:06:07,120 --> 00:06:11,760 Speaker 3: becomes one, the government really can't regulate it, because they're saying, look,
117 00:06:11,800 --> 00:06:14,760 Speaker 3: we have First Amendment rights. We're essentially like a newspaper.
118 00:06:14,800 --> 00:06:16,400 Speaker 3: You can't tell us what to publish.
119 00:06:16,360 --> 00:06:17,320 Speaker 4: Or what not to publish.
120 00:06:17,440 --> 00:06:20,640 Speaker 1: Right, my final question, you're the CBS News legal analyst,
121 00:06:20,680 --> 00:06:23,360 Speaker 1: so I'll put you on the spot. So CNBC reported the other day
122 00:06:23,480 --> 00:06:27,760 Speaker 1: that, for the
123 00:06:27,800 --> 00:06:29,400 Speaker 1: deal to happen, it was going to have
124 00:06:29,400 --> 00:06:31,600 Speaker 1: to be a different algorithm. Then I think it was
125 00:06:31,600 --> 00:06:33,760 Speaker 1: the Wall Street Journal that reported it's going to
126 00:06:33,760 --> 00:06:36,120 Speaker 1: be the same algorithm. Well, that's a pretty big difference,
127 00:06:36,160 --> 00:06:39,160 Speaker 1: because if it's a different algorithm, you can call
128 00:06:39,160 --> 00:06:41,240 Speaker 1: it TikTok, but it's not the same thing. And from
129 00:06:41,240 --> 00:06:43,800 Speaker 1: what I understand from people who love TikTok, it's all
130 00:06:43,839 --> 00:06:47,159 Speaker 1: about that amazing algorithm that can predict what you
131 00:06:47,200 --> 00:06:48,120 Speaker 1: want to be entertained by.
132 00:06:49,880 --> 00:06:52,960 Speaker 3: Yeah, I mean, that's the problem, the algorithm. Although
133 00:06:53,000 --> 00:06:56,280 Speaker 3: I read somewhere else that they're saying, well, at
134 00:06:56,279 --> 00:07:00,000 Speaker 3: the time that this legislation was created, you know, algorithms
135 00:07:00,240 --> 00:07:04,640 Speaker 3: were impenetrable and could never be duplicated. And apparently that's
136 00:07:04,640 --> 00:07:08,680 Speaker 3: no longer true. You know, we've cracked the code and
137 00:07:08,760 --> 00:07:11,560 Speaker 3: people can actually duplicate it themselves.
138 00:07:11,720 --> 00:07:12,880 Speaker 2: That doesn't change the
139 00:07:12,880 --> 00:07:16,600 Speaker 3: fact that we don't obligate social media companies or internet
140 00:07:16,600 --> 00:07:20,240 Speaker 3: companies to tell balanced stories. It doesn't, you know, it
141 00:07:20,240 --> 00:07:24,560 Speaker 3: doesn't change the question of, you know, should we be regulating,
142 00:07:24,640 --> 00:07:28,280 Speaker 3: should they be subject to the FCC and
143 00:07:28,400 --> 00:07:30,560 Speaker 3: other regulations. You know, there used to be something called
144 00:07:31,160 --> 00:07:34,760 Speaker 3: you're probably way too young for this, Jack, the Fairness Doctrine,
145 00:07:34,920 --> 00:07:37,480 Speaker 3: which ended, I think, in the late eighties, where if
146 00:07:37,520 --> 00:07:42,160 Speaker 3: you had a license for broadcasting, NBC, ABC, CBS, you
147 00:07:42,240 --> 00:07:46,120 Speaker 3: had to present controversial views and you had to present
148 00:07:46,280 --> 00:07:48,800 Speaker 3: other views so that it was more balanced.
149 00:07:48,920 --> 00:07:49,680 Speaker 2: They got rid of that.
150 00:07:50,080 --> 00:07:52,280 Speaker 3: So that's why when I was a kid and Walter
151 00:07:52,360 --> 00:07:55,920 Speaker 3: Cronkite was the anchor for CBS News and a third
152 00:07:55,960 --> 00:07:57,840 Speaker 3: of the country or half of the country was
153 00:07:57,880 --> 00:08:00,400 Speaker 3: watching him, no one knew if he was a Democrat or
154 00:08:00,480 --> 00:08:02,800 Speaker 3: a Republican. You had no way of knowing. That was
155 00:08:02,840 --> 00:08:05,880 Speaker 3: a different time. Now it's clear, the politics is on
156 00:08:05,920 --> 00:08:08,920 Speaker 3: your sleeve, no matter who you are. We tell you upfront,
157 00:08:08,960 --> 00:08:11,360 Speaker 3: and we protect you from differing opinions.
158 00:08:11,520 --> 00:08:13,680 Speaker 1: So I got one question, this is a Thane
159 00:08:13,760 --> 00:08:18,920 Speaker 1: Rosenbaum question. So TikTok is beneath you? Are chat
160 00:08:18,960 --> 00:08:21,280 Speaker 1: bots beneath you, to use ChatGPT and stuff like that?
161 00:08:21,720 --> 00:08:23,160 Speaker 4: Not yet, not yet. Good for you.
162 00:08:23,240 --> 00:08:26,000 Speaker 3: Yeah, I'm not saying that I won't.
163 00:08:26,760 --> 00:08:29,040 Speaker 3: But you know, I'm a writer and a novelist, and
164 00:08:29,080 --> 00:08:30,080 Speaker 3: that stuff scares me.
165 00:08:30,320 --> 00:08:30,920 Speaker 4: I'll get my car.
166 00:08:32,480 --> 00:08:35,400 Speaker 1: I am currently reading Ulysses and I'm forty percent of
167 00:08:35,400 --> 00:08:37,400 Speaker 1: the way through. I'm fighting my way through that book.
168 00:08:37,640 --> 00:08:40,439 Speaker 1: So to give myself some credibility in your world, the
169 00:08:40,840 --> 00:08:41,760 Speaker 1: higher-thinking world.
170 00:08:42,240 --> 00:08:43,960 Speaker 2: You really impressed the hell out of me, Jack.
171 00:08:46,520 --> 00:08:49,560 Speaker 3: You're the only radio guy that I've talked to in
172 00:08:49,840 --> 00:08:52,360 Speaker 3: years who says anything like that.
173 00:08:53,120 --> 00:08:54,520 Speaker 2: Joyce is something you're reading.
174 00:08:54,600 --> 00:08:55,400 Speaker 4: Yeah, there you go.
175 00:08:56,559 --> 00:08:59,320 Speaker 1: Thane Rosenbaum, CBS News legal analyst, thanks for your time today.
176 00:08:59,360 --> 00:09:00,040 Speaker 4: Appreciate it.
177 00:09:00,200 --> 00:09:00,840 Speaker 2: Thank you, Jack.
178 00:09:01,120 --> 00:09:06,200 Speaker 4: I liked his personality. Entertaining guy. Okay, we've got more
179 00:09:06,240 --> 00:09:09,800 Speaker 4: on the way. Stay here, Armstrong and Getty.