Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio.

Speaker 2: Is time ticking away for TikTok? Or is it just another step in a years-long tech drama? The wildly popular short-video app used by one hundred seventy million Americans is facing its most serious challenge to date: a bill that would require its parent company, Chinese tech giant ByteDance, to divest itself of the popular video service in the US or remove the app from the country altogether. That bill passed by a vote of three hundred fifty-two to sixty-five on Wednesday in the House and now goes to the Senate. Lawmakers cite privacy risks and national security fears over the Chinese Communist Party's perceived ability to use personal information for political gain or manipulation. Here's North Carolina Republican Senator Thom Tillis.

Speaker 3: We have to recognize that information on any of these platforms is at risk of going to the Chinese Communist Party. So this is one step in what I think is a long journey to hold China accountable.
Speaker 2: But others, like Republican Congressman Tom McClintock of California, are concerned that the US is going down a dangerous path.

Speaker 1: The answer to CCP-style propaganda is not CCP-style oppression. Let us slow down before we blunder down this very steep and slippery slope.

Speaker 2: And TikTok CEO Shou Chew promised a fight that includes the courts.

Speaker 4: We will continue to do all we can, including exercising all legal rights.

Speaker 2: That's something the company did last time, in twenty twenty, when then-President Donald Trump ordered ByteDance to sell TikTok. My guest is Eric Goldman, a professor at Santa Clara University Law School and co-director of its High Tech Law Institute. Eric explained why so many lawmakers are concerned about TikTok and its Chinese parent.

Speaker 4: A lot of the legislators object to TikTok principally because of the fact that it's related to China, and there's such Sinophobia among our country that it becomes a very politically popular move to tweak China. So it's not so much that the representatives care about the privacy issues.
They care about China, and they're signaling that to their voters.

Speaker 2: They point to a Chinese national security law that forces companies to turn over information when compelled to do so. TikTok CEO Shou Chew said that TikTok has never shared, or received a request to share, US user data with the Chinese government. And he said that information on TikTok's American users had been moved to US servers run by Texas-based company Oracle, and that under the new structure there's no way for the Chinese government to access it or compel access to it. So is that just a false premise, or is there something to be concerned about in general?

Speaker 4: I think we should be concerned about government access to our private information online, but I don't restrict my concern to China. I feel that way about all governments across the globe, including here in the United States, where government agencies have a variety of ways of obtaining private online communications that, in fact, we would prefer that they don't have access to.
Indeed, Europe cut off data flows with the United States because of the concern in Europe that the US government had access to European private information online. So if that's the concern, then trying to target China, and doing it in this way, is really an indirect solution. We really ought to have a broad, society-wide conversation about when governments get to have access to private citizen data and what we're going to do to protect the interests of private citizens.

Speaker 2: So tell us about this bill, what it would do.

Speaker 4: The bill is styled as a divestiture law. The idea is to get the ownership of TikTok out of the hands of people related to China. In practice, it's almost certainly going to turn into a TikTok ban. It would ban TikTok in the United States and require app stores not to allow the installation of TikTok. So it's trying to just kick TikTok out of the US market.

Speaker 2: So what the lawmakers are saying is that ByteDance, the Chinese parent, would have six months to divest TikTok.

Speaker 4: Yeah. Just to be clear, nobody expects that to really be the outcome.
There's a limited number of people who might be interested in buying TikTok, and the forced sale would almost certainly depress the price. But I want to analogize this to what happened after the Ukraine war, where Russia put the squeeze on US-owned enterprises and forced them to sell to Russian owners, and in so doing, they transferred billions of dollars of wealth from the US owners to new Russian owners who bought it for pennies on the dollar. And so essentially Congress is doing what Russia did to our companies, and that really seems like the wrong inspiration for us to be drawing from.

Speaker 2: China, Beijing, has warned that a ban would ultimately backfire on the United States itself. So what could they do there?

Speaker 4: You know, we're locked in a long-term strategic battle of words and actions against China and some other global powers, so there's an infinite number of ways in which China could work to degrade the interests of US companies. Now, of course, China has a long history of doing that. They have literally kicked US companies out of the China market.
They could continue to tamp down on US interests in China to the economic detriment of US interests. And these kinds of trade wars, or attempts to kick foreign companies out of markets, ultimately erect barriers that make everyone poorer. So let's hope that we don't go there. But no doubt the US would be sending a strong signal to China: we're erecting another trade barrier, and, you know, we'd expect retaliation.

Speaker 2: China's Foreign Ministry also said that there was a failure to provide evidence that TikTok is a threat to US national security. I've heard a lot of blanket statements that it's a threat, but is there actual proof?

Speaker 4: So one of the big questions is exactly how is TikTok a threat to national security? And there's been some speculation about this, but the best evidence of this has not been revealed to the public. It's only been disclosed privately, with the justification that because it's in national security interests, we couldn't tell the public; we would jeopardize national security.
But that kind of rationale makes it impossible for us to actually scrutinize the evidence credibly, and as a result, it's possible that that evidence is not credible. And in fact, the efforts to ban TikTok were initiated in the Trump administration, and some of the judges got to see that evidence, and they did not find it persuasive. Now, maybe there's new evidence that we haven't seen yet, but the most likely scenario is that national security is being invoked as a concept, not actually as pieces of specific, credible evidence, to try and justify the ban.

Speaker 2: The company TikTok has warned about the potential economic impact on small business owners who depend on the app. It's urged its one hundred and seventy million US users to speak out against it, and they have. Influencers took to Capitol Hill on Wednesday. Members of Congress were flooded with calls from angry constituents, and small businesses are saying a ban would crush their business. Do you think it would have a severe economic impact on certain people?
Speaker 4: It's not just the economic impact, although I think that's a helpful way of establishing a measuring stick. But so many communities rely upon TikTok to communicate with each other, and that can create new stars, celebrities, influencers who wouldn't have had a voice any other way, who are actually quite helpful to the communities that they're speaking to. It also has created a variety of different small business opportunities, where new entrants to a market can use social media, including TikTok, in order to gain traction against incumbents who otherwise would rather not give them any opportunity to grow. So when Congress says that we want to ban TikTok, what they're really saying is: we don't want people talking to each other in that way, and we don't care that sometimes that's the very best way for those people to help self-actualize, or the very best way for those entrepreneurs to help build their business. So Congress is just saying, we don't care about any of that; all of that we're willing to forgo in this nebulous chasing of the quote national security interest.

Speaker 2: Eric, I am not on TikTok.
I confess. Is it really this unique platform? Because hearing people talk about it, they say, this is the only way I can accomplish this goal, whether it's business or reaching out to people. Is it really that unique?

Speaker 4: It doesn't really matter if TikTok is unique compared to other social media. What matters is that there are certain types of communities that have developed on TikTok that don't exist in the other social media, and so by eliminating those communities, even if you could publish the exact same content on other social media, you can't reach the same audience in the same way. Now, I don't personally use TikTok; it doesn't speak to me. But for those who it speaks to, it is perhaps the best way to reach them, the best way to engage them. And so it isn't something where we could simply tell the content publisher and the audience, just move your conversation to another social media service. That literally wouldn't happen; that relationship would bust up instead.

Speaker 2: Does it seem as if the Senate is not as enthralled with this bill as the House was?
Speaker 4: We don't know what's going to happen to the bill in the Senate. The Senate leaders have created equivocation about the bill, but the political calculus that got the bill through the House is the same for the Senate. In all cases, there's political gold for beating up on China, legitimately or not. So I don't think that the bill will die in the Senate easily. It will be given a fair amount of push forward, I think. But there are a number of senators who I think understand both the ridiculousness of the justifications of the bill, as well as the fact that, in the end, the bill is a censorship bill, and I think there are some senators who just can't support that.

Speaker 2: And some lawmakers did say that the United States shouldn't be following China in selectively blocking social media platforms, citing free speech concerns.
Speaker 4: I mean, it sounds so weird to say: we should not be ripping the playbook from authoritarian countries and then justifying our actions because they're doing it. That, to me, is just so obvious, such a baseline, foundational assumption in how I proceed, and yet it is not obvious in the House.

Speaker 2: Coming up next on the Bloomberg Law Show, I'll continue this conversation with Professor Eric Goldman of Santa Clara University Law School. TikTok says it will fight the bill in the courts. What are some of the legal grounds it could use? And later in the show, the accident investigation into that Alaska Airlines midair emergency in January is being slowed down by Boeing's failure to cooperate. Remember, you can always get the latest legal news by listening to our Bloomberg Law podcasts. You can get them wherever you get your favorite podcasts. And you're listening to Bloomberg.
Speaker 2: TikTok is used by one hundred and seventy million Americans, so it's no wonder that, after TikTok urged them on, angry users flooded members of Congress with calls over a bill that would lead to a nationwide ban of the popular video app if its China-based owner doesn't sell its stake. Here's the bill's author, Wisconsin Republican Mike Gallagher.

Speaker 1: It's also based on a lie. They're saying we're pushing an outright ban of TikTok, which is not true. It's not actually what the bill does. It forces a divestiture.

Speaker 2: But users, influencers, social media content creators, and small business owners weren't buying it. Brandon Hurst, who sells plants on TikTok, lobbied lawmakers not to pass the bill.

Speaker 4: My whole business would be devastated. Yeah, I would lose the opportunity to connect with millions of people on a regular basis, and the community that I've worked really hard to build would be gone.

Speaker 2: But the bill passed the House with overwhelming bipartisan support and now goes on to the Senate. I've been talking to Professor Eric Goldman of Santa Clara University School of Law.
Eric, let's say that some version of the bill is enacted. TikTok says it plans to exhaust all legal challenges before it considers divestiture. What kind of legal challenges could TikTok bring?

Speaker 4: Yeah, no doubt that if the bill is enacted, it will be challenged in court, and it will be challenged by TikTok. It's also probable that it will be challenged by users of TikTok, expressing their own free speech concerns independent of the concerns that TikTok has. And the bill will also impact the app stores, and it's possible that they will point out that the bill impacts their free speech as well. So there are at least three different communities that have concerns about how the bill would restrict their speech. And we actually have some indication of how those legal challenges might look. This is not the first time that efforts to ban social media apps have been attempted and have been challenged in court.
Back in the Trump administration, there was a ban on TikTok and WeChat, and then states have been enacting bans, including one in Montana against which a legal challenge was filed. And all of those efforts to ban in the past have failed, including on First Amendment grounds; those bans were unconstitutional. And so it seems to me that TikTok, and its users, and maybe the app stores, have pretty good reasons to think that they might win a First Amendment challenge.

Speaker 2: Tell us more about the possible First Amendment challenges from those three communities you mentioned.

Speaker 4: So let's talk about TikTok's interest under the First Amendment. TikTok operates a service that other people use to talk to each other, and sometimes there's been discussion that TikTok doesn't have its own First Amendment interest, that it's just a vessel for other communications.
But TikTok publishes content in the form of the software that it offers to its users, and it curates users' speech in a way that communicates its editorial priorities, and courts have recognized that TikTok has a speech interest in the publication of the software and in the way in which it curates its content for its users. So there are some serious First Amendment issues that I think are raised when you tell a service that the government's going to come in and override its curatorial discretion. I want to talk about the TikTok users' interest under the First Amendment. The TikTok users are talking to each other, and they are creating conversations that don't exist in the same way, to the same audience, anywhere else in the information ecosystem, and so taking away TikTok takes away their ability to engage with each other. If you're a content publisher on TikTok, it takes away your ability to create your own audience and speak with them in the way you choose, and so that impacts their First Amendment interests.
And looking back to the app stores: the app stores have not previously taken the leadership on challenging the constitutionality of TikTok bans, or bans of social media, but they're actually harmed by the TikTok ban bill as well. And let me give an analogy for how we might think about their interests. The ban is kind of like telling a bookstore that they can't carry a particular book. They can still run a bookstore, but they just can't carry a book. And so for the app store, they're being told: you have to pull an app off of your shelves. And by forcing the bookstore, or in this case the app store, to change what it wants to offer to its users, that encroaches on the app store's speech interests as well. So there are three different speech interests at play here: TikTok's, TikTok users', and the app stores'. And I think each of them has very good reason to point to the Constitution and say that the ban hurts their ability to speak the way they choose.

Speaker 2: TikTok is banned on federal devices, and Biden administration officials helped with the bill's technical language.
But Biden's re-election campaign joined TikTok last month. I mean, talk about irony.

Speaker 4: Biden's presence on TikTok is a reminder that even if Biden could reach audiences through other channels, there's value to reaching them through TikTok. And it is that exact political speech benefit that shows how corrupt it is to try and ban TikTok. It literally would suppress valuable political speech coming from the president who would potentially sign the bill to ban it.

Speaker 2: Former President Donald Trump, surprise surprise, has flip-flopped on TikTok. Recently, he's expressed his opposition to the ban. Do you think that this could be an issue in the presidential election?

Speaker 4: Trump's flip-flopping on TikTok is really quite shocking. He made a concerted effort to try to force the exact same moves that the House enacted while he was in office. But he didn't sway a lot of his loyal fans: only fifteen Republicans voted against the TikTok ban despite Trump asking them to do so, so the vast majority of Republicans didn't heed his requests. Though I don't really know how seriously anyone's taking him on this topic.
Speaker 2: You know, I thought that in all the discussion about TikTok, there might be some discussion about how it's harmful to young adults. You know, we had those lawsuits by parents.

Speaker 4: Yeah. So the effort to ban TikTok is a very small corner of the broader techlash, the effort among regulators to try to suppress and control social media generally. And there's a wide list of gripes against social media that regulators have lobbed, of which privacy is a non-trivial one. But there are other concerns as well, including the concerns about the impact of social media on teenage and younger users. But in this sense, the TikTok ban is not really about protecting the users. Although nominally it's about protecting their privacy, it always comes back to the idea that it's China that's doing these nefarious things. And if it had been a British company owning TikTok that did the exact same things that China was doing, we wouldn't even be hearing a peep about this. And it's just a reminder that all the efforts that are taking place to ban TikTok are not about protecting users.
They really 328 00:19:21,040 --> 00:19:22,640 Speaker 4: are just political gamesmanship. 329 00:19:23,280 --> 00:19:26,800 Speaker 2: Have any of those lawsuits by parents reached the motion 330 00:19:26,920 --> 00:19:31,720 Speaker 2: to dismiss stage, or been settled, or are there any important 331 00:19:31,760 --> 00:19:32,920 Speaker 2: rulings yet? 332 00:19:33,160 --> 00:19:36,560 Speaker 4: Yeah, a number of the lawsuits against TikTok for harming 333 00:19:36,640 --> 00:19:41,520 Speaker 4: their users, including underage users, have been dismissed outright. Some 334 00:19:41,560 --> 00:19:44,919 Speaker 4: of those are on appeal, but there's a very large 335 00:19:45,080 --> 00:19:48,399 Speaker 4: pair of cases in California courts that have not reached 336 00:19:48,400 --> 00:19:51,119 Speaker 4: a final judgment on the question, and so those are 337 00:19:51,160 --> 00:19:53,240 Speaker 4: really going to dictate whether or not there's going to 338 00:19:53,240 --> 00:19:56,120 Speaker 4: be legal traction to the claims against TikTok. 339 00:19:56,880 --> 00:19:59,679 Speaker 2: While we're on the topic, across the pond, as they say, 340 00:20:00,400 --> 00:20:04,400 Speaker 2: there's a battle over one of Britain's most famous newspapers, 341 00:20:04,800 --> 00:20:08,760 Speaker 2: and the government has outlined plans to stop foreign states 342 00:20:08,800 --> 00:20:11,040 Speaker 2: from owning newspapers.
343 00:20:10,840 --> 00:20:14,359 Speaker 4: And the target there was some US buyers who planned 344 00:20:14,400 --> 00:20:16,520 Speaker 4: to buy up one of the leading newspapers in the 345 00:20:16,640 --> 00:20:21,280 Speaker 4: UK that caters to the conservative audience, and the members 346 00:20:21,400 --> 00:20:25,280 Speaker 4: of the legislature there didn't want to see the newspaper 347 00:20:25,359 --> 00:20:27,399 Speaker 4: fall into the hands of US interests, and it was 348 00:20:27,880 --> 00:20:30,320 Speaker 4: a reminder that here in the United States, we would 349 00:20:30,320 --> 00:20:35,000 Speaker 4: not tolerate an effort to ban or control the identity 350 00:20:35,000 --> 00:20:38,639 Speaker 4: of owners of a newspaper. There would clearly be a 351 00:20:38,760 --> 00:20:42,280 Speaker 4: First Amendment objection because it would be dictating who gets 352 00:20:42,280 --> 00:20:44,760 Speaker 4: the right to speak in our country. And believe it 353 00:20:44,880 --> 00:20:50,159 Speaker 4: or not, foreign speakers have protected constitutional rights to speak 354 00:20:50,440 --> 00:20:53,240 Speaker 4: here in the United States. So there would not be 355 00:20:53,359 --> 00:20:58,280 Speaker 4: the possibility of establishing controls over who owned a newspaper 356 00:20:58,359 --> 00:21:01,960 Speaker 4: here in the US, unlike other countries like Britain, where 357 00:21:01,960 --> 00:21:03,919 Speaker 4: they've literally proposed that. Because they don't have the 358 00:21:03,960 --> 00:21:06,600 Speaker 4: First Amendment, they can pass laws like that with the 359 00:21:06,840 --> 00:21:10,119 Speaker 4: intent and objective of controlling who gets to speak in 360 00:21:10,160 --> 00:21:13,359 Speaker 4: their country. We wouldn't tolerate that here. 361 00:21:14,000 --> 00:21:18,000 Speaker 2: What about the regulation of broadcast ownership in the US?
362 00:21:18,400 --> 00:21:20,879 Speaker 4: Back to this issue about foreign ownership of media in 363 00:21:20,880 --> 00:21:23,960 Speaker 4: the United States. Now, there is one area where there 364 00:21:24,119 --> 00:21:28,120 Speaker 4: is the right of the government to control who owns 365 00:21:28,600 --> 00:21:32,000 Speaker 4: the media that reaches the public, and that's in the 366 00:21:32,040 --> 00:21:36,720 Speaker 4: area of broadcasting. We limit the ability of foreign owners 367 00:21:36,760 --> 00:21:41,359 Speaker 4: to own broadcasting stations in the United States, and I 368 00:21:41,440 --> 00:21:46,120 Speaker 4: believe that that's constitutional. Now, even though we allow the 369 00:21:46,160 --> 00:21:51,480 Speaker 4: regulation of broadcast ownership, that doesn't extend to the ownership 370 00:21:51,520 --> 00:21:55,600 Speaker 4: of online services, even if they're broadcasting video, just like 371 00:21:55,680 --> 00:21:59,359 Speaker 4: the broadcasters do who are regulated by the FCC. And 372 00:21:59,400 --> 00:22:03,600 Speaker 4: that's because of a nineteen ninety seven court opinion that said 373 00:22:03,840 --> 00:22:06,520 Speaker 4: that the rules that apply to broadcasting are not the 374 00:22:06,560 --> 00:22:08,480 Speaker 4: same as the rules that apply to the Internet, that 375 00:22:08,600 --> 00:22:14,520 Speaker 4: the reasons why we control the access to broadcasting don't 376 00:22:14,560 --> 00:22:17,679 Speaker 4: extend to why we might permit people to speak on 377 00:22:17,720 --> 00:22:20,919 Speaker 4: the Internet. So there are rules about foreign ownership of 378 00:22:20,960 --> 00:22:23,680 Speaker 4: media here in the US that might very well be constitutional, 379 00:22:24,080 --> 00:22:27,160 Speaker 4: and yet they don't extend to a ban on TikTok 380 00:22:27,200 --> 00:22:29,520 Speaker 4: because the Internet is not like broadcasting.
381 00:22:29,920 --> 00:22:32,040 Speaker 2: Thank you so much, Eric. This has been a tour 382 00:22:32,119 --> 00:22:36,000 Speaker 2: de force. We covered so many different topics. That's Professor 383 00:22:36,119 --> 00:22:39,520 Speaker 2: Eric Goldman of Santa Clara University Law School and co 384 00:22:39,600 --> 00:22:43,400 Speaker 2: director of the High Tech Law Institute. In other legal news today, 385 00:22:43,960 --> 00:22:47,960 Speaker 2: Donald Trump's first criminal trial may not begin on March 386 00:22:48,000 --> 00:22:51,600 Speaker 2: twenty fifth, as planned, after lawyers for the Manhattan District 387 00:22:51,640 --> 00:22:55,200 Speaker 2: Attorney and the former president told a judge they'd received 388 00:22:55,320 --> 00:23:00,560 Speaker 2: thousands of new pages of evidence from federal prosecutors. Bragg's office 389 00:23:00,600 --> 00:23:04,000 Speaker 2: said today that some thirty one thousand pages of records 390 00:23:04,280 --> 00:23:08,240 Speaker 2: tied to former Trump lawyer Michael Cohen's federal campaign finance 391 00:23:08,320 --> 00:23:12,280 Speaker 2: convictions were turned over to their office on Wednesday, adding 392 00:23:12,280 --> 00:23:14,679 Speaker 2: that they would not object to a thirty day delay 393 00:23:14,760 --> 00:23:18,200 Speaker 2: in the start of Trump's hush money trial. Bragg blamed 394 00:23:18,280 --> 00:23:21,280 Speaker 2: Trump for the delay of the production of documents, saying 395 00:23:21,320 --> 00:23:25,480 Speaker 2: the defense waited until January eighteenth to issue a subpoena 396 00:23:25,520 --> 00:23:29,400 Speaker 2: for more materials. The potential delay is the latest good 397 00:23:29,520 --> 00:23:32,920 Speaker 2: news in court for Trump, who had faced the prospect 398 00:23:32,960 --> 00:23:36,640 Speaker 2: of up to four criminal trials before the November presidential election.
399 00:23:36,960 --> 00:23:40,200 Speaker 2: Three of those cases now might slip into next year. 400 00:23:40,720 --> 00:23:43,640 Speaker 2: Coming up next on the Bloomberg Law Show, we'll look 401 00:23:43,640 --> 00:23:49,040 Speaker 2: at the accident investigation into the Alaska Airlines mid air emergency 402 00:23:49,200 --> 00:23:53,800 Speaker 2: in January. Officials say that investigation is being slowed down 403 00:23:53,880 --> 00:23:58,120 Speaker 2: and complicated by Boeing's failure to cooperate. I'm June Grasso, 404 00:23:58,200 --> 00:24:02,280 Speaker 2: and you're listening to Bloomberg. It's been two months since 405 00:24:02,320 --> 00:24:06,240 Speaker 2: that Alaska Airlines jet had a mid air emergency when 406 00:24:06,240 --> 00:24:09,440 Speaker 2: a door plug blew off, and officials with the National 407 00:24:09,480 --> 00:24:13,399 Speaker 2: Transportation Safety Board say they're still in the dark about 408 00:24:13,440 --> 00:24:16,000 Speaker 2: who performed the work on the panel of the Boeing 409 00:24:16,119 --> 00:24:20,200 Speaker 2: seven thirty seven MAX that failed. At a Senate Commerce 410 00:24:20,200 --> 00:24:25,560 Speaker 2: Committee hearing last week, NTSB chair Jennifer Homendy said they've 411 00:24:25,600 --> 00:24:29,360 Speaker 2: repeatedly asked Boeing for the information, and they can't even 412 00:24:29,440 --> 00:24:31,480 Speaker 2: get a list of the people who are on the 413 00:24:31,560 --> 00:24:33,920 Speaker 2: team that handles doors and door plugs. 414 00:24:34,600 --> 00:24:37,320 Speaker 5: We don't have the records. We don't have the names 415 00:24:37,680 --> 00:24:41,879 Speaker 5: of the twenty five people that are in charge of 416 00:24:41,920 --> 00:24:45,880 Speaker 5: doing that work in that facility. It's absurd that two 417 00:24:45,920 --> 00:24:47,639 Speaker 5: months later we don't have that.
418 00:24:48,000 --> 00:24:52,439 Speaker 2: Joining me is Alan Levin, Bloomberg Aviation Safety reporter. So 419 00:24:52,760 --> 00:24:55,679 Speaker 2: I want you to start with a description of what 420 00:24:55,960 --> 00:25:01,000 Speaker 2: happened to Alaska Airlines Flight twelve eighty two, and you 421 00:25:01,040 --> 00:25:02,440 Speaker 2: know, what we know about it. 422 00:25:03,240 --> 00:25:06,919 Speaker 6: This plane took off from Portland on the evening of 423 00:25:07,000 --> 00:25:13,600 Speaker 6: January fifth, and when it reached about fourteen or fifteen thousand feet, 424 00:25:14,119 --> 00:25:20,240 Speaker 6: the cabin pressure suddenly dropped. And what happened was there's 425 00:25:20,359 --> 00:25:23,520 Speaker 6: what is known as a door plug, which is basically 426 00:25:23,600 --> 00:25:29,240 Speaker 6: just a beefed up panel that covers an unused exit door, 427 00:25:29,640 --> 00:25:32,760 Speaker 6: and it blew off the side of the plane. And 428 00:25:32,800 --> 00:25:36,000 Speaker 6: if you can imagine, the air pressure in an airliner 429 00:25:36,200 --> 00:25:40,480 Speaker 6: is about the equivalent of usually somewhere around five thousand 430 00:25:40,560 --> 00:25:45,560 Speaker 6: feet altitude, but they were well above that. So, nature 431 00:25:45,560 --> 00:25:48,960 Speaker 6: abhors a vacuum, and there's a tremendous amount of force 432 00:25:49,480 --> 00:25:53,240 Speaker 6: pushing from the inside, where the pressure is higher, out, 433 00:25:53,920 --> 00:25:57,920 Speaker 6: and this plug blew off, and then, you know, all 434 00:25:58,080 --> 00:26:02,400 Speaker 6: the air inside the cabin wants to get outside, and 435 00:26:02,440 --> 00:26:05,840 Speaker 6: so there's like a river, if you will, of pressurized 436 00:26:05,880 --> 00:26:10,480 Speaker 6: air flowing out through this. It can be quite powerful.
437 00:26:11,160 --> 00:26:14,080 Speaker 6: Luckily they were not at, you know, thirty five thousand feet, 438 00:26:14,080 --> 00:26:16,600 Speaker 6: where it would have been even more intense. And also 439 00:26:16,720 --> 00:26:20,600 Speaker 6: luckily nobody was seated next to where the door plug 440 00:26:20,760 --> 00:26:23,560 Speaker 6: was or in the middle seat, so the two closest 441 00:26:23,560 --> 00:26:26,480 Speaker 6: seats were empty on a very full plane. It's kind 442 00:26:26,480 --> 00:26:31,000 Speaker 6: of lucky. As dramatic as this was, nobody was injured, 443 00:26:31,400 --> 00:26:37,400 Speaker 6: amazingly no damage to the aircraft structure, and the pilots 444 00:26:37,400 --> 00:26:40,560 Speaker 6: were able to return and make an emergency landing. 445 00:26:41,480 --> 00:26:47,840 Speaker 2: The NTSB has determined that those missing bolts caused the incident? 446 00:26:48,080 --> 00:26:51,760 Speaker 6: So it's a little early to say the cause, but 447 00:26:51,800 --> 00:26:55,520 Speaker 6: the NTSB did put out a preliminary report and they 448 00:26:55,560 --> 00:27:00,080 Speaker 6: said it appears, and probably even a little stronger than that, 449 00:27:00,320 --> 00:27:04,240 Speaker 6: there's very strong indications that this plug or this panel 450 00:27:04,400 --> 00:27:09,959 Speaker 6: over the unused door was not attached properly in the factory 451 00:27:10,560 --> 00:27:14,400 Speaker 6: in Renton, Washington, that Boeing uses to complete 452 00:27:14,480 --> 00:27:18,960 Speaker 6: the construction of these seven thirty seven MAX jets.
In particular, 453 00:27:19,440 --> 00:27:24,399 Speaker 6: there's an extensive mechanism to hold the pressure on the door, 454 00:27:24,520 --> 00:27:28,800 Speaker 6: about a dozen points of these pins that interact with 455 00:27:28,880 --> 00:27:32,280 Speaker 6: the plane that hold the pressure, but they don't ensure 456 00:27:32,320 --> 00:27:36,960 Speaker 6: the door will never come off. So Boeing adds four 457 00:27:37,200 --> 00:27:42,160 Speaker 6: bolts that prevent the door from sliding, and before they 458 00:27:42,240 --> 00:27:44,920 Speaker 6: ship a plane out, they put four bolts in there 459 00:27:44,960 --> 00:27:48,880 Speaker 6: so it can never move, and if installed correctly, they 460 00:27:48,880 --> 00:27:52,520 Speaker 6: work pretty well, because this has never happened before. But 461 00:27:52,640 --> 00:27:56,200 Speaker 6: in this case, all the evidence suggests that the bolts 462 00:27:56,280 --> 00:28:00,359 Speaker 6: were not installed, and so the plane was delivered in October. 463 00:28:00,560 --> 00:28:06,439 Speaker 6: It went into service in November, and it bumped around 464 00:28:06,600 --> 00:28:10,359 Speaker 6: enough that it came loose on this flight in early January. 465 00:28:11,080 --> 00:28:16,119 Speaker 2: So the NTSB accident investigators still don't know who performed 466 00:28:16,119 --> 00:28:18,800 Speaker 2: the work on that panel that failed. 467 00:28:19,440 --> 00:28:20,040 Speaker 6: That's correct. 468 00:28:20,119 --> 00:28:26,720 Speaker 6: It's quite extraordinary actually how this investigation has evolved, if 469 00:28:26,720 --> 00:28:30,919 Speaker 6: you will. So in any factory, you know, you theoretically have 470 00:28:31,080 --> 00:28:34,919 Speaker 6: records of the construction of whatever widgets they're making, but 471 00:28:35,520 --> 00:28:39,600 Speaker 6: in the aviation industry, the record keeping is on steroids.
472 00:28:39,720 --> 00:28:42,920 Speaker 6: Anytime you do a safety critical thing, you're supposed to 473 00:28:43,240 --> 00:28:46,360 Speaker 6: memorialize it. You know, in this case it's a computer system. 474 00:28:46,720 --> 00:28:49,800 Speaker 6: When that happens, for a safety critical item like this, 475 00:28:50,200 --> 00:28:53,560 Speaker 6: there's supposed to be a second set of eyes. So 476 00:28:53,600 --> 00:28:56,959 Speaker 6: not only does the worker's supervisor have to oversee it, 477 00:28:57,080 --> 00:29:00,320 Speaker 6: but, you know, in some cases there's a secondary set 478 00:29:00,320 --> 00:29:04,160 Speaker 6: of eyes who comes in afterward and certifies that it 479 00:29:04,240 --> 00:29:09,480 Speaker 6: was done properly. Now, in this case, Boeing has said 480 00:29:09,920 --> 00:29:13,680 Speaker 6: that they do not have a record of this work 481 00:29:13,720 --> 00:29:18,880 Speaker 6: being performed, and in fact, they made an extraordinary admission last 482 00:29:18,880 --> 00:29:21,880 Speaker 6: week in a letter to the Senate. They said that 483 00:29:21,960 --> 00:29:25,840 Speaker 6: it was a violation of their processes, and so, as 484 00:29:25,880 --> 00:29:28,800 Speaker 6: you say, as a part of that, they do not 485 00:29:29,120 --> 00:29:34,600 Speaker 6: know who performed the work, and the NTSB has been 486 00:29:34,640 --> 00:29:39,840 Speaker 6: asking Boeing for information. The NTSB is frustrated because they 487 00:29:39,880 --> 00:29:42,680 Speaker 6: can't get the names of the key people to interview, 488 00:29:42,840 --> 00:29:46,560 Speaker 6: so instead they had to ask for the names of 489 00:29:47,160 --> 00:29:51,240 Speaker 6: every person who might have done work on a door 490 00:29:51,280 --> 00:29:54,640 Speaker 6: assembly in the plant at the time this plane was 491 00:29:54,640 --> 00:29:58,760 Speaker 6: moving through. We're told it's twenty five people.
They were in 492 00:29:59,280 --> 00:30:03,120 Speaker 6: Renton, Washington, at this plant doing interviews last week. 493 00:30:03,320 --> 00:30:06,880 Speaker 6: But the head of the NTSB, Jennifer Homendy, this week 494 00:30:07,080 --> 00:30:10,480 Speaker 6: told senators that they still have been unable to determine 495 00:30:10,520 --> 00:30:11,120 Speaker 6: who did it. 496 00:30:12,320 --> 00:30:17,760 Speaker 2: And Boeing overwrote security camera footage of repair work on 497 00:30:18,080 --> 00:30:22,600 Speaker 2: the door plug, and they said that's consistent with standard practice. 498 00:30:22,720 --> 00:30:26,000 Speaker 2: Video recordings are maintained on a rolling thirty day basis. 499 00:30:26,480 --> 00:30:28,440 Speaker 2: Is that a problem as well? 500 00:30:28,800 --> 00:30:32,520 Speaker 6: I don't see that as a huge problem. So, you know, 501 00:30:32,560 --> 00:30:35,920 Speaker 6: this work was done in late September. Boeing has a 502 00:30:35,960 --> 00:30:40,080 Speaker 6: policy to overwrite tape every thirty days, so they have, 503 00:30:40,280 --> 00:30:43,960 Speaker 6: you know, videos showing the shop floor. Presumably, we don't 504 00:30:43,960 --> 00:30:47,800 Speaker 6: know precisely, but they're not a key safety tool such 505 00:30:47,840 --> 00:30:52,320 Speaker 6: as performing the secondary inspection on work would be. They're there, 506 00:30:53,080 --> 00:30:56,400 Speaker 6: you know, just to ensure people aren't stealing tools, 507 00:30:56,840 --> 00:31:01,400 Speaker 6: et cetera. You know, the NTSB always asks for video records 508 00:31:01,440 --> 00:31:04,920 Speaker 6: because there are so many surveillance cameras and so 509 00:31:05,000 --> 00:31:09,440 Speaker 6: many people just randomly taking video with their phones. They 510 00:31:09,480 --> 00:31:12,160 Speaker 6: always sort of ask for that.
In this case, it 511 00:31:12,280 --> 00:31:15,400 Speaker 6: wasn't available, but there was no indication there was 512 00:31:16,280 --> 00:31:18,400 Speaker 6: anything untoward about that. 513 00:31:18,920 --> 00:31:23,320 Speaker 2: So now you also had a Federal Aviation Administration audit 514 00:31:24,000 --> 00:31:28,840 Speaker 2: of the Boeing seven thirty seven MAX that found a plethora 515 00:31:28,880 --> 00:31:31,240 Speaker 2: of issues with the production process. 516 00:31:32,040 --> 00:31:37,400 Speaker 6: That's correct. We understand there were dozens of violations, but 517 00:31:37,640 --> 00:31:41,400 Speaker 6: we still don't know the details. I would add that 518 00:31:41,480 --> 00:31:45,840 Speaker 6: the FAA also audited a second company called Spirit AeroSystems, 519 00:31:46,800 --> 00:31:51,560 Speaker 6: which builds the raw fuselage parts for the seven thirty 520 00:31:51,600 --> 00:31:56,240 Speaker 6: seven and then ships them to Washington State, where Boeing 521 00:31:56,280 --> 00:31:59,800 Speaker 6: does the final assembly, and they found violations at Spirit 522 00:32:00,000 --> 00:32:05,080 Speaker 6: as well. It's relevant because the reason this door plug 523 00:32:05,160 --> 00:32:07,880 Speaker 6: got removed in the first place at the factory is 524 00:32:07,880 --> 00:32:13,479 Speaker 6: that there were improperly done rivets by the Spirit folks, 525 00:32:13,520 --> 00:32:15,480 Speaker 6: and so they had to go in and fix those. 526 00:32:15,640 --> 00:32:18,680 Speaker 6: But the bottom line is, yeah, the FAA has done 527 00:32:18,720 --> 00:32:22,400 Speaker 6: this audit. We don't know the precise details of how 528 00:32:22,440 --> 00:32:25,680 Speaker 6: bad the violations were, but they did find dozens of 529 00:32:25,720 --> 00:32:30,720 Speaker 6: cases where there were shortfalls in the safety processes and 530 00:32:30,800 --> 00:32:33,880 Speaker 6: record keeping, et cetera, that the FAA expects.
531 00:32:34,360 --> 00:32:40,360 Speaker 2: So we've seen several mishaps with Boeing aircraft recently: flaming engines, 532 00:32:40,600 --> 00:32:44,760 Speaker 2: tires falling from the sky, engine failures over the Pacific, 533 00:32:44,920 --> 00:32:50,640 Speaker 2: malfunctioning rudders, issues with wiring, gears, and hydraulic systems. Am 534 00:32:50,640 --> 00:32:53,160 Speaker 2: I overstating it? It sounds like a lot. 535 00:32:53,480 --> 00:32:57,520 Speaker 6: Well, I have two almost contradictory points here. One is 536 00:32:57,560 --> 00:33:00,480 Speaker 6: that in the case of the tire falling off, it 537 00:33:00,560 --> 00:33:05,200 Speaker 6: was quite dramatic. Somebody videotaped it, and the tire came 538 00:33:05,240 --> 00:33:07,120 Speaker 6: down and hit a car in a parking lot. It 539 00:33:07,160 --> 00:33:10,800 Speaker 6: was in San Francisco. You know, that's not a good look, 540 00:33:11,040 --> 00:33:14,080 Speaker 6: but it was a twenty plus year old plane. That 541 00:33:14,600 --> 00:33:17,640 Speaker 6: failure had absolutely nothing to do with Boeing. We don't 542 00:33:17,640 --> 00:33:20,080 Speaker 6: know what happened, but you'd think it would be related 543 00:33:20,160 --> 00:33:25,800 Speaker 6: to maintenance somehow. Quite frankly, if you keep an eye 544 00:33:26,120 --> 00:33:31,280 Speaker 6: on incidents on a daily basis, as I've done for many years, 545 00:33:32,000 --> 00:33:35,800 Speaker 6: these things happen all the time. I think it's fair 546 00:33:35,840 --> 00:33:40,200 Speaker 6: to say the accident rates are remarkable. They've never been better, 547 00:33:40,800 --> 00:33:45,360 Speaker 6: and these incidents, I think, demonstrate that the safety net 548 00:33:45,400 --> 00:33:49,520 Speaker 6: is pretty robust. But I would also say that one 549 00:33:49,600 --> 00:33:53,760 Speaker 6: of those incidents that you mentioned does speak to broader 550 00:33:53,800 --> 00:33:57,400 Speaker 6: issues with Boeing.
There was a notice by the federal 551 00:33:57,480 --> 00:34:02,680 Speaker 6: government last week that seven thirty seven MAX wiring had been 552 00:34:02,720 --> 00:34:07,600 Speaker 6: improperly installed. When that happens, sometimes it chafes, and what 553 00:34:07,760 --> 00:34:11,680 Speaker 6: happened was a flight control device known as a spoiler 554 00:34:12,520 --> 00:34:17,239 Speaker 6: was activating without a pilot intending for it to activate. 555 00:34:17,640 --> 00:34:21,080 Speaker 6: That speaks to some of these very same issues with 556 00:34:21,239 --> 00:34:25,440 Speaker 6: quality that we've seen with Boeing in this door plug case. 557 00:34:25,680 --> 00:34:30,279 Speaker 2: The Justice Department has opened a criminal investigation into that 558 00:34:30,520 --> 00:34:36,239 Speaker 2: Alaska Airlines incident, examining whether it falls under the government's 559 00:34:36,280 --> 00:34:42,120 Speaker 2: twenty twenty one deferred prosecution agreement with Boeing. The accident 560 00:34:42,120 --> 00:34:44,719 Speaker 2: took place just two days before the expiration of the 561 00:34:44,760 --> 00:34:46,800 Speaker 2: deferred prosecution agreement. 562 00:34:46,760 --> 00:34:52,080 Speaker 6: Several years ago, after these two fatal accidents in twenty 563 00:34:52,080 --> 00:34:56,920 Speaker 6: eighteen and twenty nineteen involving seven thirty seven MAX aircraft.
564 00:34:57,200 --> 00:35:01,080 Speaker 6: If you recall, the accidents were caused at least in 565 00:35:01,160 --> 00:35:07,160 Speaker 6: part by a bad design that Boeing put into the plane, 566 00:35:07,520 --> 00:35:13,040 Speaker 6: and afterward the Justice Department reached an agreement not to 567 00:35:13,080 --> 00:35:19,239 Speaker 6: prosecute Boeing, and in exchange, Boeing admitted to a violation 568 00:35:19,560 --> 00:35:23,640 Speaker 6: and agreed to pay a large civil penalty and then 569 00:35:23,760 --> 00:35:28,160 Speaker 6: also agreed to essentially clean up its act over a 570 00:35:28,239 --> 00:35:31,520 Speaker 6: three year period. That period, as you say, was about 571 00:35:31,560 --> 00:35:36,880 Speaker 6: to expire. The Justice Department has six months to decide 572 00:35:36,920 --> 00:35:42,640 Speaker 6: whether Boeing has violated this earlier agreement. We understand 573 00:35:42,680 --> 00:35:47,160 Speaker 6: that they've convened a grand jury to look at whether 574 00:35:47,200 --> 00:35:51,760 Speaker 6: there are criminal charges to be brought. There are two possibilities, 575 00:35:51,800 --> 00:35:56,560 Speaker 6: we understand. It's possible they could say that Boeing violated 576 00:35:56,800 --> 00:36:00,000 Speaker 6: the earlier agreements and then they would bring the charge 577 00:36:00,560 --> 00:36:05,359 Speaker 6: on that score, or they are also potentially looking at 578 00:36:05,440 --> 00:36:11,080 Speaker 6: bringing charges related to the actual January fifth failure. Now 579 00:36:11,120 --> 00:36:15,840 Speaker 6: I would note some countries, France, for example, almost always 580 00:36:16,280 --> 00:36:20,399 Speaker 6: file a criminal charge after an airline accident, but it's 581 00:36:20,719 --> 00:36:25,000 Speaker 6: very unusual in the US and quite controversial. We're very 582 00:36:25,000 --> 00:36:27,359 Speaker 6: early on. It's unclear how it's going to play out.
583 00:36:27,360 --> 00:36:30,239 Speaker 6: But it's a very sticky, complicated situation. 584 00:36:30,760 --> 00:36:34,480 Speaker 2: Seems like Boeing's been here before. Thanks so much, Alan. 585 00:36:34,840 --> 00:36:39,120 Speaker 2: That's Alan Levin, Bloomberg Aviation Safety reporter. And that's it 586 00:36:39,200 --> 00:36:42,160 Speaker 2: for this edition of the Bloomberg Law Podcast. Remember, you 587 00:36:42,200 --> 00:36:44,920 Speaker 2: can always get the latest legal news by subscribing and 588 00:36:44,960 --> 00:36:48,480 Speaker 2: listening to the show on Apple Podcasts, Spotify, and at 589 00:36:48,520 --> 00:36:52,840 Speaker 2: Bloomberg dot com slash podcast slash Law. I'm June Grasso, 590 00:36:53,040 --> 00:36:54,520 Speaker 2: and this is Bloomberg.