Speaker 1: On this episode of Newt's World. In a whistleblower complaint obtained and published by The Washington Post this week, former Twitter security chief Peiter "Mudge" Zatko alleges the company misled federal regulators and the company's own board of directors about, quote, "extreme, egregious deficiencies" in its defenses against hackers, as well as its lax efforts to fight spam. The complaint has potential implications for Twitter's legal battle with Elon Musk, who's trying to get out of a forty four billion dollar contract to buy the social media platform. In fact, Senator Chuck Grassley, the Republican leader on the Senate Judiciary Committee, commented this week, quote, "The claims I've received from a Twitter whistleblower raise serious national security concerns as well as privacy issues, and they must be investigated further," close quote. Here to help us understand Twitter's troubles, I'm really pleased to welcome my guest, Jessica Melugin. She is the director of the Center for Technology and Innovation at the Competitive Enterprise Institute. Welcome, and thank you for joining me on Newt's World.

Speaker 2: It's really my pleasure. Thank you so much for having me.

Speaker 1: Let me start with the history and the evolution of Twitter. How did it become what it is today?

Speaker 2: So I think the thing that people need to remember about Twitter, in all the conversation these days about big tech, is that Twitter really, by most metrics, doesn't qualify. It never grew up to be as profitable as other big tech companies. When you think about Meta or Google, Amazon or Apple: it's still not profitable, and it certainly doesn't have the number of users that those big platforms do. But it's sort of overweighted in terms of influence. Some very engaged, involved, public facing people are mostly the users of Twitter, so it has a disparate impact on American society in the social media space. That puts it in some ways in the category with those big guys, but in some ways it's very different.
And it started out as that startup, can-do story, with a couple of guys who figured it out. But it's never really been monetized the way the other big tech companies have. And it's interesting, because certainly in terms of its political impact and its ability to signal the news media, it has been sort of an important national institution. I think in some ways it is. Watching a lot of the journalists I follow for work purposes on the tech issues sort of have a public Twitter meltdown about the idea of Elon Musk owning Twitter kind of opened my eyes to how personally people take their presence and their identity on Twitter. And I think, like you said, it's influential people, and it has an impact on society, especially in the media vein, that I think might be more than any other social media platform, or at least as much.

Speaker 1: When Twitter was founded in two thousand and six, there were four people: Jack Dorsey, Noah Glass, Evan Williams, and Biz Stone. Dorsey went on to become the CEO, and he stepped down in twenty twenty one. Why do you think he left?

Speaker 2: You know, his explanation is that he wanted to focus on other things. He was getting into the fintech space. But what's interesting about that is that before he left, he was the CEO that brought this whistleblower on board to sort of kick the tires and see where the security holes were. And since then, of course, this whistleblower has been fired from Twitter; that was under the second CEO that came on. You kind of feel like the original guard at Twitter has left, and now we're dealing with a new cast of characters, and perhaps the whistleblower was sort of a victim of that shift.

Speaker 1: Well, when he left, he was replaced by the company's chief technology officer, Parag Agrawal. Was that a step towards trying to get a grip on the company's technology problems, or was this just the politics of the front office?

Speaker 2: I would guess probably a mix.
I mean, Jack Dorsey had become sort of the face of Twitter, and in becoming the face of Twitter, bore the brunt of a lot of the criticism. You know, it became much more controversial and probably a lot less fun as the company became involved in political situations that Jack would have to go to Congress and answer for. I'm not sure how many times he testified, but he became a familiar face in that vein. And you know, I don't know that many CEOs who enjoy that part of the job. Getting grilled by Congress is probably not a pleasure. And especially if you're a founder type and you're an innovator and a bit of an entrepreneur, I don't think that's what you want to spend your time doing. I have no doubt of the sincerity that he was ready to move on.

Speaker 1: But I've used Twitter for years, and in fact, I was just tweeting yesterday. I mean, I really began paying attention when they banned Donald Trump, who, after all, was President of the United States at the time, and then they suspended the New York Post, which is the fourth largest and the oldest newspaper in America, founded by Alexander Hamilton. I mean, it seems to me that was just blatantly political.

Speaker 2: I think if you're right of center and looking at this space, you really feel the pressure, and a suppression almost, from a lot of these social media platforms, that your comments and your opinions aren't as welcome here as those from the left. But I think that sometimes what people on the right miss, and I only see both because it's my job to look at both, is how people on the left on these platforms are equally outraged at the content moderation, but for dead opposite reasons. They can't believe how much content is left up on these platforms. So these content moderation decisions get a little caught in the middle. And then when you turn up the heat about as high as you can go, in terms of a presidential election, the pressure from both sides I think was enormous.
And then I think when things crossed over into violence, not with the president himself, but the whole situation on January sixth, I think that the impulse is to just distance yourself from that. And Twitter has in place sort of a special policy for political leaders that gives them a bit wider of a berth to say things and have those comments remain up on the platform, maybe things that I wouldn't be granted that grace to have left up. But they do seem to acknowledge in their terms of service that people who are in control of political bodies might need a little more space, that this is a bit more of a public service, and the leash is a bit longer for them.

But I think that with the violence, that proved to be just sort of too hot to handle for them, and I think that there was an impulse to just stay out of that. I mean, of course, it was too late to stay out of that, right? Because them kicking President Trump off the platform became the story. And content moderation is tricky even in the mundane day to day when you start looking into it. But obviously in this situation, I don't think the focus could have been more acute. It was a confusing time. I think the public statement that they issued said they were concerned that leaving President Trump up to tweet might encourage other people to more violence. There was some concern about the actual day that President Biden would be sworn in, that there would be some movement towards that. One of the tweets that got President Trump kicked off was that he announced he would not be attending the inauguration, and Twitter took that to mean that was part of green lighting violence there, because Donald Trump wouldn't be in harm's way being there. That's speculation. I don't patrol these platforms, and I wouldn't want that job. But you know, you can see how, in a closely divided America, that left half of the nation cold and the other half pretty pleased with the decision.
And that's a bad place to be in as a corporation, you know, alienating half of your potential customer base. Either way, it's not enviable. They've become surprisingly more political than Google or Facebook or what have you, and I think part of that has been these kinds of decisions.

Speaker 1: Now, that led to Elon Musk announcing he's going to buy Twitter, which is sort of a soap opera of its own. On January thirty first, he started increasing his stake in Twitter, got up to five percent of the total stock. Then on March twenty fourth, he started criticizing the site, tweeting, quote, "Free speech is essential to a functioning democracy. Do you believe Twitter rigorously adheres to this principle?" Then on April fourth, he buys a nine point two percent stake in Twitter, becoming the largest shareholder. He makes a passive investment of seventy three point five million shares of common stock in his personal capacity. And then on April fifth, the CEO announced that Musk will be joining the Twitter board. But this becomes really like a soap opera. The CEO informs the company's shareholders and employees that Musk has decided not to join the Twitter board on the eleventh. On April thirteenth, a group of Twitter investors file a complaint in New York federal court accusing Musk of delaying revealing his stake in Twitter so he could buy more shares in the company at a cheaper price. Then on the fourteenth, Musk offers to buy the site, worth forty one point three nine billion, for fifty four dollars and twenty cents per share, at a thirty eight percent premium to the closing price of Twitter stock on April first. On April fourteenth, the same day, Vanguard Holdings acquires a ten point three percent stake in the company, thus displacing Musk as the largest shareholder by the twenty first.
A week later, Musk secures his financing, worth forty six point five billion, to buy the social media platform. And three days later, on April twenty fourth, the Twitter board holds discussions with Musk to consider his buyout proposal. And then on the twenty fifth, the next day, the Twitter board seals the deal with Musk to sell the microblogging site, worth forty four billion. Now Musk then puts the deal on hold, he says, because of the number of spam bot accounts on the site. So let me stop at that point, because this whole thing is unfolding, and it looks at one point like the Twitter stockholder is going to get a pretty good windfall. And then Musk suddenly says, oh, there are too many spam bot accounts. Jessica, why do you think Elon Musk got involved in this adventure and decided he wanted to buy Twitter?

Speaker 2: Well, I think there's a good case to be made that he likes a business challenge. We've seen that at a number of companies, and making Twitter profitable certainly qualifies. But I would like to think, too, that he, like so many Americans, has really sincere concerns about speech online and wanted to see if he can improve that situation. I think that's a noble pursuit. That's what I think innovators do, and he's certainly an entrepreneur and an innovator. I think that probably once you start digging into it, it's much more challenging than maybe even he expected. But I think it was great, because it showed everyone what could happen when someone comes along and does it a little differently and has some different priorities. How far can you push it? I think that's healthy. I think there was a lot of excitement from the right about that, and I certainly understand that. People really have genuine anger and concern that there are shrinking spaces for their voices.
I think the answer to that is people like Elon Musk buying platforms, and new platforms being built, and different kinds of platforms being available. And I hope that that happens sooner rather than later, and sooner than regulation, because I don't think regulation would work as well. But I think that he is good at identifying a challenge or a problem and trying to put his money where his mouth is. I don't know if he'll end up having to buy it or not. I don't know how it will turn out. But I like that we remembered that the market provides solutions, and that's something we have to keep in mind even when half the country is frustrated with the status quo. I think that the nation is sort of having a crisis of faith in creative destruction, and I like the reminder that there's still a lot that can be shaken up and a lot of change that can come without Washington being involved at all.

Speaker 1: Do you have any sense of how many spam accounts there are on Twitter?

Speaker 2: Yeah, I think that Twitter's official line on that is five percent or less. I think the whistleblowing part of this adds some question marks to that. Because if you're focused on growing the platform, right, because these are all advertising driven revenues, the more eyeballs you have, the more ads you can sell, and for more, and that's the point of the game. I think that it's probably difficult for them to give an exact number. And then on top of that inherent uncertainty that we've been dealing with all the way through the soap opera portions that you took us through, now with the whistleblower, it puts extra question marks on the end of that, because we don't even know if they're interested in finding that number. The allegation is sort of that they're not really concerned about a lot of things they should be concerned about, one of which are the bots.
Speaker 1: But if the price of the advertising is a function of the number of subscribers, and somewhere between five and twenty percent of the subscribers are phony, you're in a sense charging for non-performance.

Speaker 2: Yes, you've been overcharged. Now, whether that came to light in time for Elon to legally back out of the deal is a matter for the Delaware Court of Chancery, not for me to say. But it passes the common sense test, I think, when we hear about it, when we say, well, he was sold a bad bill of goods. He was told it was one thing and it wasn't that. And that goes directly to his aim to make Twitter profitable.

Speaker 1: On May thirteenth, I mean, he puts the Twitter deal temporarily on hold, arguing that far more than five percent of Twitter's active users are spam or fake accounts. And on the seventeenth of May, he says the deal would not move forward unless Twitter discloses information regarding bots. He goes further on June sixth and says he's going to pull out of the deal, accusing Twitter of not providing the required information on spam bot accounts. Then on July ninth, there's an SEC filing revealing Musk's plan of backing out of the Twitter deal. Why do you think Twitter didn't just go to a straight audit and find out how many bots they have?

Speaker 2: Well, it does suggest that maybe they're not as confident in the number they've been releasing as they should be. Certainly I see your point there. It seems to me that if they had nothing to hide, they would roll it all out. But perhaps what the whistleblower is alleging goes to that point: a lack of concern for those numbers, maybe not as much attention to cleaning that part of the platform up as there should have been. And whether or not Elon can have his pleadings reopened and add this new information as it comes in from Mudge, that'll be up to that judge to decide.
But it certainly goes to his larger point that what he agreed to buy is not what he's getting, so he would like to not do it, or at the very least, which sort of was the speculation, he just wanted a better price.

Speaker 1: It does seem that way. Is it possible that judge could actually order him to pay the original price?

Speaker 2: I think it's certainly possible. What I've heard from people who do this day in and day out is that it could be that he discovered this too far into the deal, because the accusation, a little bit, that pushes that way is that he really didn't do his due diligence before getting himself into this contractually.

Speaker 1: Part of his allegation, and the whistleblower document alleges this, is that the company wanted growth over accuracy, and that executives stood to win individual bonuses of up to ten million dollars tied to increases in daily users, but nothing for cutting spam. So all the bias was in favor of just bringing on the numbers.

Speaker 2: Right. I think that every CEO will tell you there are things that you have to spend money on for the long term health of your company that might not be as attractive for your shareholder value, might not be as glamorous, but those are the things you still have to invest in. And I think the suggestion here is that the privacy of users and the security of the platform itself was not made a priority over the profit motive of growing the number of users on the platform.

Speaker 1: Well, and supposedly the documents alleged that in July twenty twenty, a group of teenagers hacked into the accounts of former President Obama, presidential candidate Joe Biden, Jeff Bezos, Bill Gates, and Elon Musk. I mean, here you have three of the richest men in the world, and one president and one about to be president, all getting hacked into. I mean, isn't that automatic evidence that your security wasn't very good?
Speaker 2: Yeah, I think that's right. And I think that's why you see that, in twenty eleven, the FTC had a consent decree with Twitter saying, listen, your security breaches are off the rails here, and we need some assurances going forward that you're going to do a better job with this. And that's what Twitter agreed to do. Part of these allegations are saying they agreed to do that, but they haven't really been doing it. They haven't been walking the walk. So I think that's why there are calls from Grassley to investigate this. It's sitting at the SEC and the FTC and the DOJ, and they're going to need to parse out whether Twitter did what it promised to do in this twenty eleven consent decree, because it has been long plagued by security problems. It has insufficient encryption, outdated software running on half of its servers, and accusations that too many of its employees have access to user data and to control over the platform, which is just sort of sloppy security practices. But I will point out that when you look at them scored against similar tech companies, they're really sort of in the mainstream, which might just mean that a lot of these companies have some catching up to do. This is still relatively new technology for us. It feels like we don't remember a time before Facebook and Instagram and Twitter. But in terms of the life of technologies, I think there's a learning curve for users, and we might be finding out that there's a security learning curve for these companies too.

Speaker 1: And Twitter counterattacked and went after the whistleblower, said that the report was riddled with inaccuracies, and, this is their language, he "appears to be opportunistically seeking to inflict harm on Twitter, its customers and its shareholders." So I assume they may be in the process of filing a suit against him.

Speaker 2: It wouldn't surprise me. The problem for them with that line is that he has a really excellent reputation among cybersecurity experts.
He's what's called an ethical hacker, and he's testified before Congress for twenty years about internet security. And a lot of people on Twitter, of course, who worked with him, came to his defense and said, listen, this guy's a straight shooter, and if he says there's a problem, there's a problem. But of course we won't know who's blowing more smoke and who has a real point until some of these investigations at the agencies go through and say, okay, so prove it, where's the proof, and all that. And I think we'll see that certainly at the FTC, and I think maybe at the SEC too.

Speaker 1: At the same time, not as an allegation but as a fact, a Twitter employee was convicted of using his position to spy on Saudi dissidents and pass the information on to an aide to the Crown Prince in Saudi Arabia in exchange for cash and gifts. He was convicted of turning over personal information of platform users who used anonymous handles to criticize the royal family and the Saudi government. I mean, given the toughness of the Saudi dictatorship, that's really dangerous.

Speaker 2: It is. And this is where this whistleblower file is also sitting at the DOJ, because there are some international ramifications, like the one you just mentioned, to all of this. We use Twitter as Americans, and that's not the way that Twitter is able to be used by everyone living all over the world. You know, the anonymity of Twitter is a protection against dangerous governments in lots of places, and when you see that that protection has been breached by them placing employees within Twitter, I think that's a cause for concern. Because one of the things we all said when all this came online was that this might be a way for people who don't have their political rights recognized to speak freely, to do so anyway. And as an American company, I think Twitter tries to stay close to that brand, but these are obviously examples where they've failed, and that's a problem.
Speaker 1: Well, are these the kind of concerns that you think the government should be focusing on?

Speaker 2: I think these kinds of security and privacy vulnerabilities that could cause material damage to users are a much more appropriate use of limited government resources than some of the things you might see more in the news these days about big tech. I think there's a lot of sincere anger around the speech questions on these platforms, and that's a conversation we need to have as a country, because again, this is new technology and we need to feel our way through that in a way that keeps to our first principles and allows for innovations. There might be market solutions to a lot of that that we don't want to curtail before they come online.

But I think the emphasis on the antitrust stuff with a lot of these tech companies is sort of a lot of political haymaking. When you look at what consumers think about these companies and all the services they provide, often for free, it's very difficult to find consumer harm in a traditional antitrust sense. The antitrust laws are ones we've been using very successfully for at least the last forty years. There was a big reform with Judge Bork, and under Ronald Reagan's administration at the FTC we really cleaned up that body of law, and now there's a push to expand it again into sort of these vague goals of equity and social justice and things that would be very difficult to quantify for tech companies, I think. But I think there's a lot more political interest in those. And I think we've done a poor job of mixing up the speech issue with the antitrust stuff, and it's not clear why those things should be connected in a policy way. And I'm sure for people who are not doing this full time for a living and are busy paying their bills and raising their kids and living their lives, it's especially confusing.
So I think at least with these questions with the whistleblower, we're looking at real material harms and real security issues for users in the US and internationally. Like you said, there are some national security issues that are raised as well. I think that's a much more appropriate inquiry than a lot of the things we see splashed on the newspaper pages these days.

Speaker 1: So, in a parallel development, what are your thoughts on the development of Truth Social?

Speaker 2: I am of the opinion that to get to a place of perfect political or any other neutrality on these platforms is sort of naive. It's a very well intentioned goal, but that becomes very difficult when you understand that, especially these days, everything has been politicized. Global warming questions used to be those of science, and now that's very politicized. So one person's scientific opinion is another person's political propaganda. It's very difficult to show both sides, or all sides, or be neutral. I think that's tough. So for the same reason that the solution to CNN leaning to the left when it first came on wasn't government regulation, the solution was Fox News, I'm really for this idea of competing platforms with competing biases. And I think that you're seeing that with Truth Social. As much as it likes to say it's this bastion for free speech, you know, it doesn't have to be that. It can be a place where conservatives feel like things are a little tilted in that direction. And that's okay, because I think that they certainly feel that on the platforms we use most today, that's the case for the left. And that's okay too, because these aren't government networks. These are private platforms, and I would rather see them competing for eyeballs. And the great thing about that is that it doesn't cost anything more to have a couple different networks on your phone.
You can check out lots of different sources, just like we do across the more traditional media landscape, and I think that's probably a more practical way to achieve some balance for you as the user than getting Washington involved in patrolling speech. There have been suggestions that the FTC would have to certify some sort of neutrality so that these companies keep their liability protections. And I think when you get into that, what's different and special and wonderful about America is that we have the First Amendment, and that's unique to us, and I think you really have to tread lightly when you start pushing on that, even if you don't like the results right now, and even if the market has taken a little time to provide some right of center alternatives. Because looking back, we see that in radio with the rise of conservative talk radio, we see that with Fox News and now a couple of different networks on the right there, and I think that's probably a more realistic approach to balancing all this. And Truth Social is sort of the latest entrant into that, and I think that's great. I think they should benefit from the same protections that these big guys have had. The Section 230 stuff, which we can talk about, but that basically just says that whoever is speaking on your platform is responsible for it: if I tweet something, I'm responsible for the content of that tweet, Twitter's not. So that's what it says. And even if Twitter messes with it and takes it down or doesn't take it down, that doesn't change for them.
They have that shield, and I think rather than spending our time taking that away, which would really alter the Internet ecosystem that has grown up with that protection, it would be better to just say, let's get some people with different opinions who get to benefit from that shield too, and let's really get the conversation going.

Speaker 1: You don't think establishing a common carrier principle would help? AT&T doesn't actually listen to my phone calls, or Verizon doesn't listen to my phone calls, so it's very unlikely that I will be kicked out of AT&T or Verizon. Do you think it's too hard to try to establish that kind of standard, that these are really common carriers? Which actually would not say anything about whether they're biased; it would simply say they couldn't kick you off.

Speaker 2: Yeah. So I think there's two things there. I think there's some long term effects that we wouldn't like with that approach, and I think there's some practical problems with that approach. There are two states that are attempting to do that, Florida and Texas, and if you look at those laws, they've been passed, but they're both in the process of being challenged in court on First Amendment principles. Mostly what that means is there's this whole body of content that hopefully people aren't familiar with, because it gets removed before they even have to see it, and it's called "lawful but awful." So this is animal abuse videos, violence. And I'm not talking about sort of the snowflake offending stuff, where we can all agree maybe people need to grow a thicker skin and that stuff should stay up. I'm talking about things that I don't think anyone in their right mind would have any interest in seeing. So how do you put up with that being on your platform? I don't think users would like that. I don't think advertisers want their ad popping up next to someone being beheaded in political violence somewhere around the world. They don't want that.
And I don't think we want to send our children onto an Internet where that stuff can't be removed. And these big platforms remove a ton of that a day. They get it wrong; I think there's plenty of things that they remove that I couldn't justify to you or explain in a million years. But I think that the bulk of what they take down is stuff we'd probably want them to take down. And I think the other danger with the common carrier approach, you know, even if you could somehow write legislation where you exempted all of that, which I think is very difficult because it's a very subjective thing to say, well, what's political or what's violence, that's very tough. And even there's some violence that you want to see, right? It's part of history. Those are very difficult decisions to bake into a law. And then the other part of that is, if you just leave everything up, then these platforms really don't continue to evolve, right? Common carrier is sort of a bad fit for this, if you look at what has traditionally been common carriers. Because what you're really looking for, and I think what people who are going to Truth Social are looking for, is a certain kind of curated experience. So they're looking for conservative content, they're looking for tolerance for conservative content, and I think that's very reasonable. I think maybe people who are on Twitter, most of the users, are looking for something different than that, and that's okay too. I just think that if you say it's a free-for-all and everything's everywhere, then what's the difference between Twitter and TikTok, right? If anything goes everywhere, that kind of makes for a lot of garbage on your feed, which doesn't do much for your advertising revenues, and doesn't do much for the next platform trying to get funded, because you can't go in and say to a venture capitalist,
I've got this great idea to tweak it and I'm going to curate differently, and have them say, well, you're not allowed to do that, right? That curation has been outlawed. So it kind of kills innovation, which I think is the long term problem, and I think it also makes for a very unpleasant Internet in the short term.

And if you go back and talk to the authors of Section 230, the liability shield we talked about, that sort of protects companies from being held liable for third party speech and allows them to take things down, that's sort of exactly why Christopher Cox and Ron Wyden, who were the authors of the original bill, did it. Because they saw the Internet emerging, and those companies, which then weren't social media platforms, they were sort of like message boards, if everyone's old enough to remember, those companies themselves didn't know what their liability would be for that third party speech. This was their concern. Because of how liability law has grown up in the US, it had a lot to do, or it did before Section 230, with, well, did the carrier of that speech, it's not their speech, but the person distributing that speech, did they know that there was something violent or filthy in there? Because if they knew and they didn't do anything about it, then they're on the hook. But if they didn't know, and we think it's unreasonable that they could know, they're probably not on the hook. That's sort of like the bookstore method we'd grown up with. So that's obviously very confusing when we jump to these internet companies. And I think that Ron Wyden and Chris Cox wanted to give some clarity, and they wanted to encourage third party speech online, because they wanted to give sort of that microphone to everyone. So what they said was, feel free to moderate the content on your site, and don't worry: even if you found something filthy and took it down, and the next week there's something filthy that you didn't find.
566 00:31:58,520 --> 00:32:00,520 Speaker 1: no one can sue you over that second one because 567 00:32:00,560 --> 00:32:02,800 Speaker 1: you knew about the first one. We could have done 568 00:32:02,800 --> 00:32:04,880 Speaker 1: it differently, right, we could have moved forward differently, and 569 00:32:04,920 --> 00:32:07,080 Speaker 1: I can't say if that would have turned out better 570 00:32:07,160 --> 00:32:09,520 Speaker 1: or worse. We'll never know. But that's the way we 571 00:32:09,520 --> 00:32:13,200 Speaker 1: did it. And there was nothing 572 00:32:13,240 --> 00:32:16,120 Speaker 1: in that about being neutral or anything else. Congress 573 00:32:16,160 --> 00:32:19,000 Speaker 1: passed this saying you can do what you want 574 00:32:19,160 --> 00:32:22,440 Speaker 1: with your space on the internet, because Facebook isn't the 575 00:32:22,440 --> 00:32:24,880 Speaker 1: whole internet, right. And that's the great thing about Truth 576 00:32:25,000 --> 00:32:28,360 Speaker 1: Social: there's an option now, and whether that's for you 577 00:32:28,440 --> 00:32:30,240 Speaker 1: or not, I love that you get to decide, and 578 00:32:30,280 --> 00:32:32,280 Speaker 1: you can pick as many as you want. They're all 579 00:32:32,320 --> 00:32:35,080 Speaker 1: free to download, and you can tune in and tune 580 00:32:35,120 --> 00:32:36,840 Speaker 1: out and be as active as you want, or just 581 00:32:36,880 --> 00:32:39,280 Speaker 1: watch from the sidelines and see what other people are saying. 582 00:32:39,320 --> 00:32:41,840 Speaker 1: And for me, I think that's great. I'm all for it, 583 00:32:41,880 --> 00:32:44,360 Speaker 1: and I'm all for them all having those same protections, 584 00:32:44,560 --> 00:32:48,600 Speaker 1: because at some point we're responsible for ourselves. If we 585 00:32:48,640 --> 00:32:50,400 Speaker 1: don't like what we see on the internet, 586 00:32:50,440 --> 00:32:53,040 Speaker 1: we need to be a part of the solution there. 587 00:32:53,080 --> 00:32:56,360 Speaker 1: And you know, don't click on stuff that's not anything 588 00:32:56,400 --> 00:32:58,200 Speaker 1: you would want your mother to know you clicked on. 589 00:32:58,640 --> 00:33:00,640 Speaker 1: I think that's always a better plan. And you know, 590 00:33:00,640 --> 00:33:03,320 Speaker 1: it's just like restricting the size of sodas in New 591 00:33:03,400 --> 00:33:05,640 Speaker 1: York City, right? I'm sure it's not good for you, but 592 00:33:05,840 --> 00:33:08,000 Speaker 1: if I trust you enough to elect your own leaders, 593 00:33:08,000 --> 00:33:10,720 Speaker 1: and I think that that's a God-given, inalienable right, 594 00:33:11,040 --> 00:33:13,440 Speaker 1: then I trust you enough to pick the correct size 595 00:33:13,440 --> 00:33:15,880 Speaker 1: soda for yourself, and I feel that way about the Internet. 596 00:33:16,160 --> 00:33:18,840 Speaker 1: I like that better than someone in Washington deciding what 597 00:33:18,920 --> 00:33:20,720 Speaker 1: I can see, or that I have to see everything, 598 00:33:20,720 --> 00:33:22,720 Speaker 1: because maybe I don't want to see everything. You've mentioned 599 00:33:22,720 --> 00:33:25,400 Speaker 1: Truth Social. Do you think that it can grow 600 00:33:25,480 --> 00:33:29,080 Speaker 1: up and compete with Twitter and the other social media companies? 601 00:33:29,480 --> 00:33:31,880 Speaker 1: I don't see why not.
I mean, as long as 602 00:33:31,920 --> 00:33:35,360 Speaker 1: we don't start fiddling with those liability protections. Because it's 603 00:33:35,360 --> 00:33:37,800 Speaker 1: a little bit like this: the companies that have gotten so 604 00:33:37,880 --> 00:33:40,440 Speaker 1: big with Section two thirty have grown up. They can 605 00:33:40,480 --> 00:33:45,760 Speaker 1: probably afford to litigate their way forward without Section two thirty. 606 00:33:45,800 --> 00:33:48,120 Speaker 1: But the next little guy, and maybe Truth Social is 607 00:33:48,200 --> 00:33:50,680 Speaker 1: still in the growing-up phase, so we can put 608 00:33:50,760 --> 00:33:53,960 Speaker 1: it in there, probably can't afford that army of lawyers 609 00:33:54,120 --> 00:33:56,520 Speaker 1: and all those cases, so I would like to give 610 00:33:56,560 --> 00:33:59,400 Speaker 1: them the same runway that everyone else had. But I 611 00:33:59,520 --> 00:34:02,840 Speaker 1: do think that they are figuring out, and it's speculation 612 00:34:02,920 --> 00:34:07,080 Speaker 1: how much they anticipated this before they started, but the reality is 613 00:34:07,360 --> 00:34:09,600 Speaker 1: that it's one thing to market yourself as the free-speech, 614 00:34:09,600 --> 00:34:14,160 Speaker 1: anything-goes platform, but to operate under those 615 00:34:14,200 --> 00:34:16,560 Speaker 1: principles is very difficult, because you still have to sell 616 00:34:16,600 --> 00:34:19,680 Speaker 1: ads and also your users want a certain thing. And 617 00:34:20,080 --> 00:34:21,759 Speaker 1: I would love for them to just come out and 618 00:34:21,800 --> 00:34:24,400 Speaker 1: say this is a safe space for speech that 619 00:34:24,520 --> 00:34:27,600 Speaker 1: isn't welcome elsewhere. Just own it. I don't have the 620 00:34:27,719 --> 00:34:29,960 Speaker 1: right to do that even if I buy a ticket to Disneyland. 621 00:34:30,120 --> 00:34:33,520 Speaker 1: I assume action would be taken by Disney security if 622 00:34:33,560 --> 00:34:37,240 Speaker 1: I went in there and started screaming profanities. And that's fine. 623 00:34:37,280 --> 00:34:39,680 Speaker 1: That's their space, and they're going to patrol it to 624 00:34:39,800 --> 00:34:43,480 Speaker 1: cater to what their customers want. And I think probably 625 00:34:43,520 --> 00:34:46,080 Speaker 1: it's a better idea to treat platforms like that too: 626 00:34:46,200 --> 00:34:48,560 Speaker 1: that there are going to be places safe for kids and 627 00:34:48,640 --> 00:34:50,759 Speaker 1: places not safe for kids, and there are going to be 628 00:34:50,800 --> 00:34:54,920 Speaker 1: places that are pretty rough, with hard-hitting political debates, and 629 00:34:55,000 --> 00:34:58,640 Speaker 1: places to share your pet videos, and that's fine. I'd 630 00:34:58,760 --> 00:35:01,400 Speaker 1: rather sort of see a thousand flowers bloom on a 631 00:35:01,440 --> 00:35:04,439 Speaker 1: thousand platforms than try to make them all the same 632 00:35:04,960 --> 00:35:07,680 Speaker 1: in a common carrier way. Well, I want to thank 633 00:35:07,719 --> 00:35:11,120 Speaker 1: you for joining me and for helping all of our listeners, 634 00:35:11,120 --> 00:35:15,279 Speaker 1: including me, better understand Twitter's troubles. And I really think 635 00:35:15,320 --> 00:35:17,759 Speaker 1: it's going to be a fascinating period. I think we're 636 00:35:17,760 --> 00:35:21,080 Speaker 1: going to continue to see evolution. I do think it's important:
637 00:35:21,120 --> 00:35:24,080 Speaker 1: if they did in fact lie about what they had, 638 00:35:24,120 --> 00:35:28,640 Speaker 1: that's a straightforward corporate business problem. But I think, basically, 639 00:35:28,680 --> 00:35:30,880 Speaker 1: I agree with you that having a little bit of 640 00:35:30,880 --> 00:35:33,200 Speaker 1: the Wild West is not a bad thing, and it 641 00:35:33,280 --> 00:35:35,799 Speaker 1: is probably a part of freedom. And I think in 642 00:35:35,800 --> 00:35:40,360 Speaker 1: a sense that the people who enthusiastically censored Donald Trump 643 00:35:40,400 --> 00:35:42,440 Speaker 1: may now find that they led him to create a 644 00:35:42,440 --> 00:35:46,359 Speaker 1: monster that's competitive in his own way, in his own world. So, Jessica, 645 00:35:46,560 --> 00:35:48,799 Speaker 1: I just want to say thank you. It's fascinating. I'm 646 00:35:48,800 --> 00:35:51,000 Speaker 1: sure over the years we're going to want to ask 647 00:35:51,040 --> 00:35:53,960 Speaker 1: you to join us again, because you're sitting right there 648 00:35:54,000 --> 00:35:57,600 Speaker 1: in the middle of an extraordinary range of innovation and change, 649 00:35:57,600 --> 00:36:00,799 Speaker 1: and the work you're doing is really important for the country. Oh, 650 00:36:00,880 --> 00:36:02,440 Speaker 1: thank you so much. It was such a pleasure to 651 00:36:02,480 --> 00:36:07,880 Speaker 1: talk with you. I really enjoyed it. Thank you to my guest, 652 00:36:08,040 --> 00:36:11,160 Speaker 1: Jessica Melugin. You can learn more about Twitter's troubles 653 00:36:11,160 --> 00:36:14,120 Speaker 1: on our show page at newtsworld dot com. Newt's World is 654 00:36:14,120 --> 00:36:18,400 Speaker 1: produced by Gingrich 360 and iHeartMedia. Our executive producer 655 00:36:18,480 --> 00:36:22,080 Speaker 1: is Garnsey Sloan, our producer is Rebecca Howell, and our researcher 656 00:36:22,120 --> 00:36:25,160 Speaker 1: is Rachel Peterson. The artwork for the show was 657 00:36:25,200 --> 00:36:28,160 Speaker 1: created by Steve Penley. Special thanks to the team at 658 00:36:28,200 --> 00:36:31,600 Speaker 1: Gingrich 360. If you've been enjoying Newt's World, I hope 659 00:36:31,600 --> 00:36:34,080 Speaker 1: you'll go to Apple Podcasts and both rate us with 660 00:36:34,160 --> 00:36:37,239 Speaker 1: five stars and give us a review so others can 661 00:36:37,320 --> 00:36:40,200 Speaker 1: learn what it's all about. Right now, listeners of Newt's 662 00:36:40,239 --> 00:36:43,840 Speaker 1: World can sign up for my three free weekly columns 663 00:36:44,120 --> 00:36:48,360 Speaker 1: at gingrich three sixty dot com slash newsletter. I'm Newt Gingrich. 664 00:36:48,719 --> 00:36:49,680 Speaker 1: This is Newt's World.