Speaker 1: Welcome, welcome, welcome back to the Bob Lefsetz Podcast. My guest today is Lawrence Lessig, Harvard law professor, noted legal scholar, and candidate for president in 2016. Glad to have you.

Speaker 2: Glad to be here, Bob. Thanks for having me.

Speaker 1: Okay. Can the legal system save democracy?

Speaker 2: Yeah, no. Only the people can save democracy. The legal system is a very poor backup for that.

Speaker 1: So, knowing that, what would you advise the people to do, or how do we move down that path?

Speaker 2: Well, my own view is that we have an existential crisis facing us right now with this next election, and it's critical not just to get responsible leadership in the White House, but also to get a Senate that is responsive to the needs of Americans. And if we could get both of those things, then I think the first thing that next Congress needs to do is take up the challenge of removing the corruption that's at the core of our government, and that's by passing the kind of reform that makes it so that representatives care about what the people want instead of what the tiny fraction of funders of their campaigns want.

Speaker 1: Let's start right today, right as we speak. There's a crisis with the USPS, the United States Postal Service. They're going to have hearings. Can hearings make any difference?

Speaker 2: What we've seen is that there's basically nothing that stops this president. And so, as extraordinary as the post office crisis is, I think it's even more astonishing to think what it signals about what's going to come, because what the president is doing with the post office is literally incomprehensible. I mean, it turns out Andrew Jackson did something similar way back in the day, when he was trying to get Martin Van Buren elected after him. But since then there's been a pretty clear understanding that the infrastructure of the post office is not a political pawn for the president.
But here you have a president who has appointed his number one contributor to dismantle the infrastructure of the post office for the purpose of making it harder for somebody to beat him in the next election. Now, of course, the post office is not just a system for delivering ballots. There are people who need medicine who will have that medicine delayed because the post office isn't functioning well. I sold two things on eBay, and it took six weeks for them to be delivered using, you know, standard parcel post. The standard way, it should have taken four days. So we are dismantling infrastructure to make it easier for him to be re-elected. Now, the idea that he would do that shows there's nothing he won't do. And what terrifies me is that the system we have for electing the president is so fragile. It depends on people of good faith on both sides participating in the election. If you have somebody acting in bad faith, especially with courts as pathetic as our courts have been, I'm not sure what happens when he does everything he possibly can to make sure that he gets another four years in office.

Speaker 1: Let's talk for a second. What is Larry Lessig selling on eBay?

Speaker 2: I was selling some used technology. I had an original Apple TV, and then, from a G3 PowerBook, I had one of those CD drives for the G3 PowerBook, which, you know, got ten dollars on eBay, but whatever.

Speaker 1: Okay. Because the remuneration is so low, and the effort involved, I wouldn't say it's significant, but it certainly takes time: do you appreciate the experience? Why do you take time out, as opposed to just leaving this stuff on the shelf or recycling it?

Speaker 2: Yeah, you know, my wife... I love my wife. My wife is obsessed with recycling and waste.
So, you know, the expense of the time, the hassle in selling, is tiny compared to the guilt that I would feel if I tried to get rid of it in the more rational way. So yeah, I try to sell it. Now, some of these things, like the CD drive for a G3 PowerBook, there are not many out there. And if you've got a G3 PowerBook, I don't know what you're using it for, but I'm sure you're happy to have that CD drive. So part of me is okay with it.

Speaker 1: Okay. Staying on that same topic, there have been a lot of articles, even in the New York Times Magazine, saying that recycling doesn't work. I know this is a little off topic, going to your wife, but do you have any take on that?

Speaker 2: Yeah, we certainly have not built up a profitable infrastructure for recycling. And I think there's a real debate: are we just trying to train people so that when we get to that profitable infrastructure, we can deploy it effectively? Obviously other countries are much more creative about ways to reduce or to channel waste. In Switzerland you pay per, I can't remember if it's cubic meter or pound, but whatever it is, there's an extraordinarily high price you pay for every bag of trash that you might want to get rid of, so it really changes the consumption pattern. And we experiment with what we can here. I've got three kids, so there's a limit to how idealistic we can be about these issues.

Speaker 1: Okay, let's go back to the postmaster. Assuming you were advising the Democrats, or those in power, what strategy would you employ to try to fix this situation?

Speaker 2: Well, I don't know that there is a strategy now. I mean, what frustrated me in March and April is that they didn't use the leverage they had over the response to the COVID crisis to guarantee that we would have the infrastructure necessary to make this election work.
Everyone was so obsessed with what to do in the immediate term, and of course in the immediate term they bent over backwards to make sure businesses and relatively successful people were taken care of, and there was a little bit of a, you know, six-hundred-dollar benefit that everybody could take advantage of if they needed it. But they didn't build in a guarantee for vote by mail in the states, and they certainly didn't build in a guarantee that the Post Office would have the capacity to do what it needs to do. Having not done that, I think the most they can do now is just drive people's focus to this fact. Now, that's a hard thing to do in the media environment that we live in right now, because a certain chunk of America is focused on a channel, or a couple of channels, that will never cover any of these issues in a way that reveals or suggests that there's anything problematic with what the president is doing. And so there's a limit to the extent to which the Democrats can even get people to recognize what's happening. But I think we need to begin a drumbeat of getting people to realize just how extraordinary, how unprecedented, this behavior is, number one. And number two, get people to begin to reflect on what America will look like if this man is re-elected in a way that everybody believes is just theft. You know, how does the nation hold itself together? Because I think the other side, my side, has been pretty patient. After 2000 and 2016: those were two elections that were ordinary. They happened to flip the result away from the Democratic majority, but through ordinary process. We could argue about Bush v. Gore and Florida, but okay, we'll accept it.
But the idea that this man would be re-elected through means that people perceive as basically fraudulent or criminal, and then expect to govern peacefully, I think is just crazy talk. And I'm terrified about what happens to our nation if we get to a place where that has happened and there's no effective remedy. Because obviously Congress can't impeach the man; the partisan premise of Congress means there's no impeachment, at least depending on who controls the Senate. And that means we're not going to have any opportunity to remedy what will be an astonishing moment in American history: the basic theft of an election.

Speaker 1: Well, if we look at what we'll call the Black Lives Matter protests, with George Floyd: he was not the first African American whose life was taken at the hands of the police, yet there was an international conflagration of protests after that. Do we anticipate any protests, either before or after the election, spontaneously? Because, as you said earlier, it depends on the public serving the public. So does the public have to rise first and then pull the politicians along with them?

Speaker 2: Well, I mean, let's be clear, there are scenarios where, yes, there would certainly be extraordinary protests in advance. So our constitution, which is, as I said, extremely fragile here, according to the Supreme Court plainly allows a state like Florida to declare that it's not going to have an election and that it's just going to pick its electors however it wants. So Florida could declare before the election (at least before the election it's clear; after the election it's not as clear) that there won't be a vote: the legislature could vote, well, we're not going to have a vote, we're just going to give the electors to the Republican Party. If states do something like that, I think you'll see tons of protests before the election.
But if we go through and we try to have a real election, and it seems like there have been games played, or, I don't know why we call it games, if there's criminal corruption of the process of voting, and afterwards the consequence is that President Trump is re-elected, I certainly think there will be extraordinary protests. I'll be out there with them, because it is just not acceptable. There's a certain moment at which you've got to stand up and fight for a democracy. And we've seen so much contrary to the norms of democratic order over the past three years that at some point we've got to say we can't accept it anymore, and do whatever it takes to make sure that it doesn't have its effect.

Speaker 1: For those who are not constitutional scholars, could you please walk slowly through the process of Florida, or another state, picking electors and essentially punting the election?

Speaker 2: Well, the Constitution, of course, doesn't say that the president will be directly elected. What it says is that states get to appoint, in whatever manner they choose, electors, and those electors then vote for the president. So the states appoint the electors, and they set up the process to appoint the electors through state law. And if, prior to an election, the state changes its law and says, you know, we originally thought there'd be an election, but now there won't be an election, we'll just pick a slate of Republican electors: the Supreme Court has said, most recently in Bush v. Gore, but before that a hundred years ago, in a case called McPherson v. Blacker, that the states retain the right, quote, "at any time" to recall the selection of electors and exercise that power themselves. Now, at the founding, there were many states that did exactly that. Many states didn't have elections; they just picked electors. Some states had elections.
Some of those elections were by district, others were at the state level, like we have today. So there were lots of different systems originally. We've all converged now on the idea of the public having a say through a vote, but the Constitution doesn't say that the public must have a say through a vote. And so if the state of Florida, governed by a Republican legislature and a Republican governor, comes to believe that the only way Donald Trump gets re-elected is if Florida goes with Donald Trump, and the polls seem to suggest that it won't, I would not be at all surprised if they exercised this power to say that Florida doesn't get a vote. And I think what ought to be happening right now is that people ought to be clear about this possibility and demand that people like the governor of Florida, or the people running the legislature in Florida, answer the question: will you promise to have an election? Will you promise not to cancel the election? Because I think there are so many scenarios where, you know, the pandemic recurs and there's all sorts of anxiety about safety, and they say, well, we can't safely imagine people going out and voting, so rather than risk their lives, we're going to cancel the election, and we know this is a Republican state anyway, so we'll just give it to the Republican electors. There are so many ways they could get away with it. And so I think we need to start saying right now: okay, just be clear, are you going to commit to an election, and are you going to commit not to cancel the election, no matter what? Because, you know, we've run elections during wars; we ran elections during the Civil War. Lincoln was re-elected in 1864, and it was another year before the war was over. So we've done it before, and I think we need to be committed to doing it now.
But the danger is that there could be a strong incentive for some states to flip if they're going to try to make sure Donald Trump gets re-elected.

Speaker 1: Okay. There was a recent Supreme Court case relative to the obligation of electors to vote according to the public will. Could you explain that case and the effect thereof?

Speaker 2: Yeah, it's actually my case. So I was working with electors from the state of Washington, and my organization, Equal Citizens, had sort of teed up this case, both in Washington and Colorado, because in 2016 there was a question raised whether electors are free to cast their ballot however they wish. And we thought it was really important that the Supreme Court answer the question of whether they have that freedom before it creates a constitutional crisis. So we teed up the case. We had electors in Washington State who had been fined because they had voted contrary to their pledge in 2016, and then there was an elector in Colorado who had been kicked off: he cast his vote contrary to his pledge, and he was kicked off and replaced with another elector. And so the question we wanted the Court to answer was whether, by being given the power of a, quote, "elector" and told to cast, quote, "their votes," they're allowed to vote however they want, or whether, as Justice Kagan put it, like a Soviet system, they have to vote as they are directed. Well, the answer of the Supreme Court was: it's a Soviet system; they have to vote as they are directed. And that made a lot of people happy, because they don't like the idea, and I understand at first glance why one would not like the idea, that electors have this kind of discretion. I argued the case in the Supreme Court, or rather I argued it by telephone, from my office, to the Supreme Court.
But by the end of my preparation for that case, I became pretty convinced that it was going to be a huge mistake if the Supreme Court said that electors didn't have discretion, because you can imagine that discretion being the final safety valve in this whole disaster of an election that we're seeing. But the way the Court decided it, electors are basically pawns. They can't do anything except what they're directed by their state to do. And so if they're directed to vote one way, they're going to have to vote that way regardless.

Speaker 1: And so what do you feel about the status today?

Speaker 2: Well, I think there are many ways in which this goes south. Here's just one. Imagine, God forbid, that Joe Biden wins the popular vote but then passes away before the Electoral College votes. The Twentieth Amendment, passed in the 1930s, addresses the case where a candidate dies after the Electoral College votes but before he's sworn in as president, and its framers expressly thought to themselves: well, there's no reason for us to deal with the problem of a candidate dying before the Electoral College votes, because the electors have discretion; they can decide to vote however they want. And so if, you know, Joe Biden died, you would imagine all the electors would vote for Kamala Harris. But the Supreme Court's decision means they have to vote as they're directed to vote. And though the Court said, we're not actually addressing that case here, we're not addressing that question: if that scenario happens, there's going to be a fight. Because if they're required to vote for Joe Biden, and they then vote for Joe Biden, Joe Biden's votes can't be counted, because Joe Biden is not alive.
So that means it would throw the election into the House of Representatives, and the House of Representatives, under the Constitution, votes one vote per state. So you could have a clear majority of people voting for the Democrat, but, depending on how the elections turn out, you could have a majority of state delegations that are Republican. So you could have the Democrat win the popular vote, and he would have won the Electoral College vote but for the weird way the votes get reallocated; but the way the system ultimately resolves it is that the House of Representatives votes, and the House of Representatives votes for the Republican. And again, I think most people would be astonished at that result, but that's plainly one track along which this election might unfold.

Speaker 1: What is the experience, whether it be live or over the phone, of arguing before the Supreme Court versus, let's say, some other lower court?

Speaker 2: So it's pretty inferior compared to a lower court. I had two arguments in the last couple of months, one in May and one in July: the one in the Supreme Court was in May, and the one in a district court was in July. And the problem with the argument in the Supreme Court is that the Chief Justice set it up so that every justice gets basically two minutes to ask questions. You know, that's less than a congressperson has. And so they come into the argument with a list of questions. They're not really listening to what other people have asked, and they're not really interested in creating an understanding around the argument about the case; they're just interested in getting their particular questions asked. So it was a very unsatisfying experience, because I felt like so much was being missed, and every single answer you gave could be no more than a hundred words, no more than a minute, because the Chief would cut you off so that somebody else could ask a question.
When I argued in the district court in California, before Judge Chen, it was just me and another guy, Ted Olson, a pretty significant lawyer, and the judge. We probably went on for an hour, an hour and a half. The judge asked questions, and he let us answer them for as long as it took to answer them. He probed the questions. By the end, he really understood our argument; whether he agreed with it or not is a separate question. But the point is, it was an actually effective context for engaging with the question. And what's frustrating is that the case I was arguing in the district court involved a regulation about cell phones. We think it's important enough, but it's not the Constitution. So the idea that you would end the argument feeling that the judge understands the question and can now decide this cell phone issue, while the Supreme Court barely pierces the issue around the Constitution's Electoral College, was ultimately incredibly unsatisfying. I've argued live in the Supreme Court, and the difference is that when you argue live, they don't go around one by one and say, what's your question, Justice Thomas, what's your question, Justice Ginsburg. The person just starts arguing, and then people interrupt him or her and ask questions, and that dynamic actually builds a kind of arc to the argument. The questions build off each other, and by the end the case has kind of been resolved: you understand the strong points and the weak points on the basis of how people have engaged with the argument.
Now, even then, I think the Supreme Court has been wildly too conservative in the amount of time it gives people to engage with the justices. And maybe, you know, there was a time when the Court was deciding a hundred and fifty cases a year; it's down now to fewer than eighty. So I don't understand why they can't give it more time. But in either context, it's not as great as it can be in other contexts, like lower courts.

Speaker 1: Do you get stage fright, or are you intimidated at all, arguing in front of the Supreme Court?

Speaker 2: I don't think I felt frightened. So no, I don't feel that. Part of that is just that I had the extraordinary experience of clerking at the Court, and that was an honor and incredible fun, but it also is an experience that taught you just how human the justices are. And you realize they're not legal gods. They're just trying to understand the case, and you're kind of helping them to understand the case. So it's not about being afraid of them; they're not going to evict you from the country or get you fired. It's just trying to figure out how you bring them to the right place. And that's always hard. It depends on the case. It can be extremely hard, especially when the result that you're arguing for would be a very difficult result for the Court to give, given the political reaction to it. So, you know, our side was arguing that electors were constitutionally free. We thought that was an important position, and I would defend what the electors I was representing did. There was a CNN documentary about them on Saturday night.
And, you know, the basic story with these electors was that after they saw that the candidate they were going to vote for, Hillary Clinton, was not going to win, they tried to work with Republican electors to get enough of them to vote for someone other than Donald Trump, so that the House of Representatives could then decide: should Donald Trump be the president, or should another Republican be the president? Nobody imagined Hillary Clinton would be the president. So they were trying to exercise their discretion to bring about a result that was closer to what the people who had voted for them wanted than the election of Donald Trump would have been. But still, I think when people talk about the so-called faithless electors, that's how they're referred to, it kind of scares people to imagine electors having that kind of discretion. So if the Court had ruled with us, you know, I would be suffering an extraordinary amount of hate from people all across the country saying: how could you do such a thing? Now Donald Trump can bribe these electors and they can vote however he wants them to vote. And all of that, I think, is just crazy talk. But it still would have been anxiety-producing, and maybe anxiety is the last thing the world needs right now, in the middle of this election.

Speaker 1: Staying with the Supreme Court: the conservatives tend to go with an originalist interpretation of the Constitution. Give me your take on that.

Speaker 2: So, they want you to believe, but I don't think it's actually true, that that principle constrains them. It's more a tool that they can deploy when it fits with what they think they have to say, or what they want to say. You know, I've argued two cases in the Supreme Court where the originalist answer was pretty clear. The first case I argued was about whether Congress has the power to extend the term of an existing copyright.
So this was the Sonny Bono Copyright Term Extension Act. Congress extended the term of copyrights in 1998 by twenty years. That was, I think, the eighteenth time they had extended the term of copyright in the prior twenty-five years. And so our claim was that the Constitution says Congress has the power to set, quote, "limited times" for copyright, and it makes no sense to say that every time that limit expires they can extend it again and again and again. And we had tons of originalist argument to say the framers were very skeptical about the monopoly of a copyright and would have wanted it restricted. And I made that argument, and I fully expected the conservatives would pipe up and say: yes, that's what the originalist position is. But they sat silent, and the five of them joined an opinion by Justice Ginsburg, who's not an originalist, so there's no complaint against her, which allowed Congress to do whatever the hell it wanted. And the same thing with this case. You know, on the original understanding: Justice Jackson said in an opinion that no one faithful to our Constitution can deny that the framers contemplated what's implicit in the text, that electors would be free agents, free to cast their vote however they wanted. So an originalist would look at this and say there's no doubt that's what they expected. Whether that's what we expect, or whether we should continue to respect that, is a separate question, but there's no doubt that's what they expected. And an originalist, I would have thought, would have taken that seriously. But once again the originalists were silent. The opinion written by Justice Kagan is not originalist; Justice Kagan is not an originalist. Justice Thomas wrote a concurrence: he agreed with one part of our argument, and he added something else from left field, but there too he was not constrained by the original understanding.
So I still have faith that we'll get them to do the originalist thing when it's an important thing to do, even when it's not an important thing to do for conservatives, just an important thing to do for the Constitution. But I don't think it's fair to say that they're consistent originalists out there.

Speaker 1: To what degree is the Federalist Society important and impactful in the court system? Let's say the Republicans have played a long game over decades: they were a minority in the educational system, they invested money, they formed this organization, and decades later it seems to be paying off. So to what degree has that had an effect? And to what degree is there a concomitant effort on the other side?

Speaker 2: The Federalist Society has been the most successful influence organization ever in setting the direction of the federal judiciary. You know, it began in the early 1980s at Yale Law School. Steven Calabresi and someone else, I'm blanking on the person's name, it might be Lee Liberman, started the organization. And originally, you know, it started at Yale Law School, which is a hotbed of liberalism. It started really as an intellectual balance, as an organization that tries to create an opportunity for conservatives to express their views in a context where mainly liberals were holding the court, the local law school court. And I think, from that perspective, what they were doing was perfectly fine. It was great, because of course adding diversity to the intellectual environment of law school is what we should all be aspiring to do. But over time they developed very tight connections with the infrastructure for appointing judges, and they almost became a kind of clearinghouse. You know, the ABA is supposed to be the clearinghouse: the American Bar Association is supposed to vet judges and give its views, thumbs up or thumbs down, about qualifications.
But the Federalist Society became almost, you know, a farm league, a system for feeding a supply of judges to presidents, Republican presidents keen to appoint more judges. So in the lower courts, the district courts and courts of appeals, they've been enormously important. And, you know, this president has appointed an extraordinary number of judges; I think the number is something like three hundred. But the point is, Mitch McConnell views this as the one thing he knows he can get done, and he has bent over backwards to appoint, as quickly as he can, practically everybody who's come through. There's no filibuster check anymore that allows a minority to stop that type of appointment. And we've seen some extraordinary appointments: people who the ABA says are plainly not qualified have been appointed to the lower courts nonetheless, because they are perceived to be extremely loyal in a partisan sense. And we'll see what the long-term effect of this will be. You know, I'm not somebody who believes that a person who's a conservative is not qualified to be on the Court, or any court. I clerked for Judge Posner, the Seventh Circuit judge; Judge Easterbrook was on the court at the same time. They were both Reagan appointees, extremely conservative, but two of the best legal minds in the history of America. So I think there are extremely talented people on the right, just as there are talented people on the left. But I think this effort to appoint reliable votes, as opposed to talented conservative lawyers, is a real mistake, because it really dilutes the integrity of the judiciary, and we've already seen the beginnings of that. And I fear these appointments of people who are in their forties, in their thirties in some cases, are acts that are going to have an effect for the next thirty years, the next forty years.
Speaker 1: Well, other than railing against this and being aware of it, certainly under Trump's reign, is the other side doing anything? Are they organized to combat this effort by the right?

Speaker 2: I don't think they're organized in the same way. There's, you know, a parallel organization called the ACS, the American Constitution Society, which runs liberal-directed conversations on law school campuses. But they have not tried to build the same kind of recruiting infrastructure. I mean, you know, you could call the head of the ACS, and I'm sure they'll have people to recommend, but it's not like that's the business model. So, you know, if there's a Democratic president elected, and there's Democratic control of the Senate, I expect there will be a lot of Democratic judges, or judges appointed by the Democratic Party, who... I will say, and maybe this is just stupid blindness or ignorance or bias on my part, I will say they are not going to be people who the ABA says are not qualified. You know, there might be young people, I hope they are young people, but they're not going to be the kind of people that Mitch McConnell has pushed through. And that's a good thing.

Speaker 1: Okay, let's switch to an earlier point, which is the silos in which people receive their information. And let's just say: in a three-network world, we had the fairness doctrine. If we go all the way to Clear Channel broadcasting, there used to be a limit on the number of television stations you could own. However, on the other side, there's social media: there's Facebook, there's Twitter, Snapchat, and the other outlets. There was a story just yesterday in the newspaper that Facebook increases the views of Holocaust deniers. So what do we do with, A, the raw issue of people not being informed, and B, to what degree do these outlets contribute to it, and what can we do about that?
This 575 00:33:33,280 --> 00:33:38,440 Speaker 1: is the worst and hardest problem. So first we 576 00:33:38,480 --> 00:33:41,480 Speaker 1: have to identify the source of the problem. And the 577 00:33:41,480 --> 00:33:44,120 Speaker 1: source of the problem is not technology in the abstract. 578 00:33:44,160 --> 00:33:46,840 Speaker 1: The source of the problem is the business model of 579 00:33:47,480 --> 00:33:52,880 Speaker 1: the technology platforms as well as television stations. You know, 580 00:33:53,040 --> 00:33:55,479 Speaker 1: when there were three networks, the business model of news, 581 00:33:56,160 --> 00:33:59,600 Speaker 1: which, remember, was coordinated: news 582 00:33:59,640 --> 00:34:02,760 Speaker 1: came on all three networks at exactly the same time, so if you're watching TV, 583 00:34:02,800 --> 00:34:05,920 Speaker 1: you're gonna watch the news. And the business model of 584 00:34:05,920 --> 00:34:08,760 Speaker 1: those news stations was to shoot right down the middle. 585 00:34:09,280 --> 00:34:12,600 Speaker 1: It was a very plain-vanilla approach: tell 586 00:34:12,680 --> 00:34:15,520 Speaker 1: the story in the way that was the most inclusive possible. 587 00:34:16,040 --> 00:34:18,399 Speaker 1: I'm not saying it was unbiased in some absolute sense. 588 00:34:18,400 --> 00:34:20,680 Speaker 1: I'm not saying it was comprehensive. There's certain issues that 589 00:34:20,719 --> 00:34:22,879 Speaker 1: never got covered. There's all sorts of ways you could 590 00:34:22,920 --> 00:34:25,799 Speaker 1: criticize it. But what you couldn't say is that it 591 00:34:25,840 --> 00:34:30,200 Speaker 1: was trying to create tribes in America. You know, 592 00:34:30,239 --> 00:34:33,040 Speaker 1: Walter Cronkite thought that what he was doing was speaking 593 00:34:33,080 --> 00:34:36,320 Speaker 1: to America, and he wanted to tell the truth to America, 594 00:34:36,440 --> 00:34:39,440 Speaker 1: and the consequence of that was a kind of understanding, 595 00:34:39,480 --> 00:34:43,279 Speaker 1: a common understanding, that was shared by almost everybody, because 596 00:34:43,360 --> 00:34:46,640 Speaker 1: that's all there was, those three networks, and that's 597 00:34:46,680 --> 00:34:51,120 Speaker 1: how news was covered every day. But once networks 598 00:34:51,400 --> 00:34:54,759 Speaker 1: were no longer the center of television, once cable television 599 00:34:54,880 --> 00:34:57,359 Speaker 1: came along and you weren't forced to watch one 600 00:34:57,400 --> 00:34:59,720 Speaker 1: of three channels, you could watch one of a hundred 601 00:34:59,760 --> 00:35:03,480 Speaker 1: and more channels, then the news needed to compete. You know, 602 00:35:03,640 --> 00:35:07,920 Speaker 1: news had to compete with home shopping or with ESPN, 603 00:35:08,480 --> 00:35:10,560 Speaker 1: and so it needed to find a way to bring 604 00:35:10,640 --> 00:35:15,640 Speaker 1: people to it. And the critical business-model innovation was 605 00:35:15,920 --> 00:35:20,520 Speaker 1: Roger Ailes at Fox News. You know, when Rupert 606 00:35:20,600 --> 00:35:23,560 Speaker 1: Murdoch brought Roger Ailes in, and he pitched 607 00:35:23,600 --> 00:35:28,120 Speaker 1: how he imagined running Fox News, he startled 608 00:35:28,160 --> 00:35:30,759 Speaker 1: everybody by saying that he's not going to try to 609 00:35:30,800 --> 00:35:34,040 Speaker 1: pitch the story to all of America.
He's going to 610 00:35:34,120 --> 00:35:36,640 Speaker 1: focus on what he called the base. He's going to 611 00:35:36,719 --> 00:35:41,799 Speaker 1: build a network focused on the base of conservatives. And 612 00:35:41,880 --> 00:35:45,800 Speaker 1: by doing that he would build a loyal and coherent public, 613 00:35:46,680 --> 00:35:51,239 Speaker 1: to which it would be easier to sell advertising. And 614 00:35:51,280 --> 00:35:54,239 Speaker 1: so the idea of the business model focused on a 615 00:35:54,320 --> 00:35:58,280 Speaker 1: fraction of America is the critical change in cable television. 616 00:35:58,320 --> 00:36:01,719 Speaker 1: And initially only he was doing that. But now, you know, 617 00:36:01,760 --> 00:36:04,120 Speaker 1: if you look at the ideological content of the three 618 00:36:04,160 --> 00:36:08,160 Speaker 1: major cable networks, they're perfectly separated. You have Fox far 619 00:36:08,200 --> 00:36:10,680 Speaker 1: to the right, well, not in an absolute sense, but relatively 620 00:36:10,719 --> 00:36:14,040 Speaker 1: speaking, Fox far to the right, MSNBC 621 00:36:14,200 --> 00:36:16,760 Speaker 1: to the left, and CNN kind of bouncing in between. 622 00:36:17,160 --> 00:36:19,840 Speaker 1: And that's a business-model decision. And the consequence of 623 00:36:19,840 --> 00:36:22,640 Speaker 1: that business-model decision is that the people who get their 624 00:36:22,680 --> 00:36:28,200 Speaker 1: information from one channel on television increasingly come to believe the world 625 00:36:28,400 --> 00:36:31,360 Speaker 1: is different from people who get their information from a 626 00:36:31,360 --> 00:36:34,200 Speaker 1: different channel on television. Barack Obama says, if 627 00:36:34,239 --> 00:36:36,920 Speaker 1: you watch Fox News, you live on a different planet 628 00:36:37,360 --> 00:36:39,520 Speaker 1: than if you read the New York Times. And 629 00:36:39,560 --> 00:36:45,600 Speaker 1: so this separated, tribal world is a 630 00:36:45,680 --> 00:36:48,879 Speaker 1: consequence of the business model of cable television. And it's 631 00:36:48,920 --> 00:36:52,839 Speaker 1: the same thing with the digital platforms. You know, Facebook 632 00:36:54,000 --> 00:36:57,040 Speaker 1: is a platform for selling ads. That's where they make 633 00:36:57,080 --> 00:36:59,279 Speaker 1: their money. I mean, you know, they gussy it up 634 00:36:59,360 --> 00:37:02,120 Speaker 1: by pretending it's all about community and all about enabling 635 00:37:02,200 --> 00:37:05,040 Speaker 1: people to share and connect and blah blah blah. But 636 00:37:05,200 --> 00:37:09,319 Speaker 1: the thing it sells Wall Street, the reason why its 637 00:37:09,360 --> 00:37:12,920 Speaker 1: stock is valuable, is that it's an extraordinary machine for 638 00:37:12,960 --> 00:37:16,680 Speaker 1: selling ads. But the way that it sells ads is 639 00:37:16,719 --> 00:37:20,759 Speaker 1: to figure out everything it can about you, so that 640 00:37:20,800 --> 00:37:23,799 Speaker 1: it knows exactly what will work with you, and so 641 00:37:23,840 --> 00:37:27,799 Speaker 1: that it can promise advertisers a higher return from their 642 00:37:27,840 --> 00:37:30,279 Speaker 1: ads than anybody else. And the way it does that 643 00:37:30,400 --> 00:37:33,440 Speaker 1: is it spies on you.
It spies on you, watches everything you're 644 00:37:33,440 --> 00:37:35,600 Speaker 1: interested in doing, where you're going, who 645 00:37:35,640 --> 00:37:38,160 Speaker 1: you're talking to, what kind of things you're interested in. 646 00:37:38,239 --> 00:37:41,600 Speaker 1: And then, after spying, it pokes you. It tries to 647 00:37:41,680 --> 00:37:45,760 Speaker 1: make you angry or happy, and it sees how you respond. 648 00:37:46,120 --> 00:37:50,800 Speaker 1: It's a constant experiment to develop a really sophisticated model 649 00:37:50,880 --> 00:37:53,600 Speaker 1: of you so that it can sell ads. And the 650 00:37:53,680 --> 00:37:58,520 Speaker 1: unfortunate thing for democracy is that it turns out fueling 651 00:37:58,640 --> 00:38:02,360 Speaker 1: this politics of hate is a really effective way to 652 00:38:02,560 --> 00:38:06,160 Speaker 1: learn things about you. If they can hold up 653 00:38:06,239 --> 00:38:10,120 Speaker 1: the kind of red meat, you know, 654 00:38:10,200 --> 00:38:13,719 Speaker 1: that inspires you to get angry, you'll engage more, 655 00:38:13,960 --> 00:38:17,239 Speaker 1: you'll share more, you'll be more responsive to what is 656 00:38:17,280 --> 00:38:19,799 Speaker 1: coming across your news feed. And the more responsive you are, 657 00:38:20,280 --> 00:38:22,480 Speaker 1: the more they know, and the more they know, the 658 00:38:22,560 --> 00:38:25,840 Speaker 1: better their ads are. And so once again it is 659 00:38:25,840 --> 00:38:29,160 Speaker 1: the business model of the platform that is driving them 660 00:38:29,160 --> 00:38:32,319 Speaker 1: to engage with our political environment in a way that 661 00:38:32,480 --> 00:38:36,480 Speaker 1: weakens our capacity as a democracy to do what we 662 00:38:36,520 --> 00:38:38,120 Speaker 1: need to do in the context of an election, which 663 00:38:38,160 --> 00:38:40,600 Speaker 1: is to come to a judgment about which candidate or 664 00:38:40,640 --> 00:38:43,440 Speaker 1: which party is actually in the best interests of America. 665 00:38:43,960 --> 00:38:47,799 Speaker 1: And in both cases you can imagine a different business model; 666 00:38:47,920 --> 00:38:51,200 Speaker 1: you can imagine a different way of deploying the technologies. 667 00:38:51,680 --> 00:38:55,840 Speaker 1: But unfortunately we've backed ourselves into the worst possible pair 668 00:38:55,960 --> 00:38:59,120 Speaker 1: of business models. And the consequence of that is these 669 00:38:59,120 --> 00:39:03,520 Speaker 1: platforms are not contributing to a better democracy. They're contributing 670 00:39:03,520 --> 00:39:07,960 Speaker 1: to a more polarized, more tribalistic, less capable democracy than 671 00:39:08,000 --> 00:39:11,359 Speaker 1: any we've had since the Civil War. 672 00:39:12,719 --> 00:39:16,440 Speaker 1: Now, the ship really has sailed on broadcast television; the ratings, 673 00:39:16,760 --> 00:39:19,480 Speaker 1: the actual number of people watching these news channels, are 674 00:39:19,520 --> 00:39:24,600 Speaker 1: astoundingly low despite the press. The younger generation is finding 675 00:39:24,640 --> 00:39:28,520 Speaker 1: their information online. So focusing on that: you know, Twitter 676 00:39:28,560 --> 00:39:30,440 Speaker 1: says they're not going to have any political ads 677 00:39:30,440 --> 00:39:33,560 Speaker 1: at all in this cycle.
Is there any way to 678 00:39:33,640 --> 00:39:38,319 Speaker 1: address the problem you just delineated? Well, you know, there's 679 00:39:38,360 --> 00:39:42,480 Speaker 1: this great health movement called the slow 680 00:39:42,600 --> 00:39:45,480 Speaker 1: food movement. And what the slow food movement says is, 681 00:39:46,560 --> 00:39:50,120 Speaker 1: the only way to respond to the terrible way people 682 00:39:50,160 --> 00:39:53,240 Speaker 1: eat is to look at how the human body works 683 00:39:54,040 --> 00:39:57,200 Speaker 1: and fit your consumption to the human body. And so 684 00:39:57,239 --> 00:39:59,120 Speaker 1: it turns out that if you cook your own food 685 00:40:00,120 --> 00:40:02,439 Speaker 1: and you eat it over a long period of time, 686 00:40:02,480 --> 00:40:04,719 Speaker 1: like over dinner with friends, you know, a two- or three- 687 00:40:04,719 --> 00:40:09,200 Speaker 1: hour dinner, that way of consuming food will be 688 00:40:09,239 --> 00:40:11,880 Speaker 1: the healthiest for you. Like, you can't poison yourself the 689 00:40:11,920 --> 00:40:15,239 Speaker 1: way you do with processed food, and you'll eat 690 00:40:15,400 --> 00:40:17,320 Speaker 1: the right amounts, and you'll eat it 691 00:40:17,400 --> 00:40:20,279 Speaker 1: at a pace that will actually complement the 692 00:40:20,280 --> 00:40:23,000 Speaker 1: processes of the body. So the slow food movement 693 00:40:23,040 --> 00:40:28,279 Speaker 1: is about finding the way to eat that fits our physiology. 694 00:40:28,600 --> 00:40:31,920 Speaker 1: There's an equivalent. We could call it the slow democracy movement. 695 00:40:32,440 --> 00:40:34,640 Speaker 1: And the slow 696 00:40:34,640 --> 00:40:37,319 Speaker 1: democracy movement, like the slow food movement, says, you know, we do democracy well in 697 00:40:37,360 --> 00:40:40,680 Speaker 1: some contexts, and we do it poorly in other contexts. 698 00:40:41,200 --> 00:40:43,320 Speaker 1: So we do it poorly in a kind of Facebook 699 00:40:43,400 --> 00:40:46,520 Speaker 1: or Twitter environment: we get riled up, we, 700 00:40:46,680 --> 00:40:48,600 Speaker 1: you know, act on the basis of a tiny bit 701 00:40:48,640 --> 00:40:53,040 Speaker 1: of information, we're easily misled, we're easily guided into 702 00:40:53,080 --> 00:40:55,959 Speaker 1: the wrong understanding of the facts. We're just not good 703 00:40:56,200 --> 00:40:58,759 Speaker 1: in that environment. But there 704 00:40:58,800 --> 00:41:01,200 Speaker 1: are environments where we are pretty good.
So, you know, 705 00:41:01,280 --> 00:41:05,240 Speaker 1: my favorite is an environment like this, a podcast, 706 00:41:05,239 --> 00:41:08,120 Speaker 1: where people have a chance, over a long period of 707 00:41:08,120 --> 00:41:11,279 Speaker 1: time, to hear the arc or the development of a 708 00:41:11,320 --> 00:41:14,120 Speaker 1: story or an argument they understand, and they have a 709 00:41:14,160 --> 00:41:16,360 Speaker 1: chance to reflect on it the way humans 710 00:41:16,400 --> 00:41:19,760 Speaker 1: do in their ordinary interactions with ordinary people, 711 00:41:20,360 --> 00:41:23,000 Speaker 1: and, in the context of that type of engagement, 712 00:41:23,680 --> 00:41:27,399 Speaker 1: they learn about issues in a better, more comprehensive way 713 00:41:27,440 --> 00:41:30,279 Speaker 1: than they ever would over Twitter or over their news 714 00:41:30,360 --> 00:41:36,600 Speaker 1: feed on Facebook. Or, you know, narrative 715 00:41:36,760 --> 00:41:41,319 Speaker 1: content, like television shows that are, you know, 716 00:41:41,360 --> 00:41:44,520 Speaker 1: not documentaries necessarily, but that try to tell 717 00:41:44,560 --> 00:41:47,920 Speaker 1: a moral story, or tell a story about some 718 00:41:48,040 --> 00:41:51,080 Speaker 1: historical moment, that gives you a chance to imagine and 719 00:41:51,239 --> 00:41:55,799 Speaker 1: envision and see played out the tensions of the, 720 00:41:56,280 --> 00:41:59,400 Speaker 1: you know, moral issues or the struggles. You know, 721 00:41:59,440 --> 00:42:03,440 Speaker 1: my favorite, even though there's a million reasons 722 00:42:03,560 --> 00:42:07,200 Speaker 1: to criticize the series, is Homeland, which over 723 00:42:07,239 --> 00:42:10,239 Speaker 1: the arc of its six or seven seasons, you know, 724 00:42:10,480 --> 00:42:13,960 Speaker 1: brought an incredible number of Americans to a different, 725 00:42:14,560 --> 00:42:17,359 Speaker 1: deeper understanding of the story of the Middle East than 726 00:42:17,400 --> 00:42:19,640 Speaker 1: they had when that series began, which was, you know, 727 00:42:20,040 --> 00:42:23,640 Speaker 1: this kind of simple, black-versus-white, good-versus-bad picture 728 00:42:23,760 --> 00:42:28,399 Speaker 1: of the Middle East. I think the solution 729 00:42:28,120 --> 00:42:30,600 Speaker 1: is for us to find ways to channel more 730 00:42:31,160 --> 00:42:35,080 Speaker 1: of politics into contexts that we have good reason 731 00:42:35,120 --> 00:42:38,640 Speaker 1: to believe humans can process well. You know, some 732 00:42:38,680 --> 00:42:41,640 Speaker 1: people's response is to say, well, get humans out of the mix; 733 00:42:41,719 --> 00:42:45,080 Speaker 1: like, let's just have experts. Forget democracy, let's just 734 00:42:45,120 --> 00:42:48,640 Speaker 1: have a technocracy where, you know, the smartest people, you know, 735 00:42:48,680 --> 00:42:50,960 Speaker 1: Harvard law professors, get to decide everything. And, you know, 736 00:42:51,040 --> 00:42:53,239 Speaker 1: having known a bunch of Harvard law professors, I can 737 00:42:53,280 --> 00:42:55,360 Speaker 1: tell you that would be a disaster. It would be the 738 00:42:55,440 --> 00:42:58,800 Speaker 1: worst thing in the world to imagine turning these decisions 739 00:42:58,840 --> 00:43:01,680 Speaker 1: over to the elite of academia.
But instead of 740 00:43:01,719 --> 00:43:04,200 Speaker 1: giving up on democracy, I think we have to 741 00:43:04,200 --> 00:43:06,799 Speaker 1: find a way to make democracy work by 742 00:43:06,840 --> 00:43:11,560 Speaker 1: giving people a chance to understand and engage on issues 743 00:43:12,280 --> 00:43:16,000 Speaker 1: in a way that we know they can actually process. 744 00:43:16,160 --> 00:43:20,160 Speaker 1: And so that means channeling it away from the five-second 745 00:43:20,239 --> 00:43:22,840 Speaker 1: ad or the thirty-second ad on television, or the 746 00:43:22,880 --> 00:43:26,880 Speaker 1: Facebook feed or Twitter, into places where they actually have 747 00:43:26,920 --> 00:43:29,080 Speaker 1: a sense of what the issues are and they can 748 00:43:29,080 --> 00:43:31,920 Speaker 1: get a real sense of the people involved. But how 749 00:43:31,920 --> 00:43:36,799 Speaker 1: does that address the silo issue? Well, we don't have 750 00:43:36,880 --> 00:43:40,000 Speaker 1: the ability, given the First Amendment, to address it through legislation. 751 00:43:41,600 --> 00:43:44,320 Speaker 1: So we're not going to blow up the silos. 752 00:43:44,400 --> 00:43:49,200 Speaker 1: They're going to exist until they're no longer profitable. So 753 00:43:49,280 --> 00:43:50,960 Speaker 1: the way to deal with the silo issue is 754 00:43:51,080 --> 00:43:53,239 Speaker 1: just to give people a different sense of what they 755 00:43:53,280 --> 00:43:56,080 Speaker 1: want to do, what they want to consume, and then 756 00:43:56,120 --> 00:43:59,319 Speaker 1: maybe, hopefully, they consume differently. You know, that's the same 757 00:43:59,320 --> 00:44:01,960 Speaker 1: fight as in the healthy food movement: like, you know, 758 00:44:02,000 --> 00:44:03,480 Speaker 1: with all this unhealthy food, how are you going to 759 00:44:03,560 --> 00:44:05,520 Speaker 1: deal with it? You can't ban it. I mean, you know, 760 00:44:06,080 --> 00:44:08,800 Speaker 1: Michael Bloomberg thought you could basically do that with sodas 761 00:44:08,800 --> 00:44:11,520 Speaker 1: in New York, but you're not going to 762 00:44:11,520 --> 00:44:13,239 Speaker 1: ban it across the country. The most you can do is get people 763 00:44:13,239 --> 00:44:16,279 Speaker 1: to reflect on, like, what's the actually healthy way to eat? 764 00:44:16,440 --> 00:44:18,040 Speaker 1: What sort of things should you be eating? And 765 00:44:18,120 --> 00:44:22,320 Speaker 1: let's develop the right sensibilities about how we should 766 00:44:22,320 --> 00:44:25,320 Speaker 1: be eating and what balance we should have, and practice 767 00:44:25,360 --> 00:44:29,279 Speaker 1: that. Take responsibility, you know, both for your diet of 768 00:44:29,400 --> 00:44:32,960 Speaker 1: food and your diet of information. Be a responsible consumer 769 00:44:32,960 --> 00:44:36,200 Speaker 1: in both dimensions. Now, I say that, and I'll 770 00:44:36,200 --> 00:44:40,440 Speaker 1: be the first to say I'm not optimistic. You know, 771 00:44:40,520 --> 00:44:43,560 Speaker 1: I think that we're at a moment that we literally 772 00:44:43,560 --> 00:44:46,719 Speaker 1: have never been in before, a moment like this. I mean, 773 00:44:46,719 --> 00:44:50,919 Speaker 1: historians will say the media a hundred years ago 774 00:44:51,360 --> 00:44:55,360 Speaker 1: was just as fragmented, just as polarized as it is today, 775 00:44:55,360 --> 00:44:58,320 Speaker 1: and they're right, it was.
But the difference is, a 776 00:44:58,400 --> 00:45:00,759 Speaker 1: hundred years ago there was no way to know what 777 00:45:00,840 --> 00:45:04,160 Speaker 1: the public thought. There was no polling. So you could 778 00:45:04,160 --> 00:45:06,759 Speaker 1: have a public that was partisan and divided and lived 779 00:45:06,760 --> 00:45:09,560 Speaker 1: in their own little echo chambers. But actually the people 780 00:45:09,560 --> 00:45:14,120 Speaker 1: who made policy were politicians in Washington, and they understood 781 00:45:14,160 --> 00:45:16,560 Speaker 1: both sides of the issue, and they were engaged in 782 00:45:16,600 --> 00:45:18,879 Speaker 1: real debate and deliberation about what they should be doing, 783 00:45:18,920 --> 00:45:21,279 Speaker 1: and they did what they did, and kind of 784 00:45:21,320 --> 00:45:23,960 Speaker 1: what the people thought was irrelevant. We live in a 785 00:45:24,040 --> 00:45:28,080 Speaker 1: time where the public is polarized and divided, the press is 786 00:45:28,120 --> 00:45:30,960 Speaker 1: completely partisan and drives people into their own little silos, 787 00:45:31,800 --> 00:45:35,080 Speaker 1: but we can see what the people believe, and so 788 00:45:35,160 --> 00:45:37,960 Speaker 1: you can always point back to what the people believe 789 00:45:38,080 --> 00:45:41,040 Speaker 1: as a justification for whatever crazy thing you want to do. 790 00:45:41,200 --> 00:45:44,040 Speaker 1: So seventy-four percent of Americans think we need to 791 00:45:44,080 --> 00:45:48,719 Speaker 1: invade Iraq, you know, based on misrepresentations and 792 00:45:48,840 --> 00:45:51,920 Speaker 1: falsehoods and ignorance about the actual facts. But that's what 793 00:45:52,239 --> 00:45:55,200 Speaker 1: America says, so that's what we have to do. And 794 00:45:55,239 --> 00:45:58,040 Speaker 1: so I think the problem is that we 795 00:45:58,280 --> 00:46:01,560 Speaker 1: have a system for understanding what the people believe, 796 00:46:01,719 --> 00:46:03,960 Speaker 1: but don't have a system for giving people the chance 797 00:46:04,440 --> 00:46:08,640 Speaker 1: to believe what they actually would believe if they understood 798 00:46:08,680 --> 00:46:10,520 Speaker 1: the facts and had a chance to reflect on them 799 00:46:10,520 --> 00:46:13,960 Speaker 1: and deliberate about them. Okay, so you were essentially saying 800 00:46:14,000 --> 00:46:17,160 Speaker 1: there's no place for the legal system to address these 801 00:46:17,200 --> 00:46:24,920 Speaker 1: problems in social networks? Not in America, not right now, 802 00:46:24,960 --> 00:46:31,879 Speaker 1: because any effort to directly affect the content of these 803 00:46:31,920 --> 00:46:35,400 Speaker 1: networks is going to be resisted by claims under the First Amendment. 804 00:46:35,400 --> 00:46:38,880 Speaker 1: I mean, there are things to do on the margin. 805 00:46:38,920 --> 00:46:41,719 Speaker 1: You know, I think that the legal system... you know, 806 00:46:43,360 --> 00:46:45,560 Speaker 1: Donald Trump has said something close to this, and I 807 00:46:45,560 --> 00:46:48,279 Speaker 1: don't want to sound like I'm mimicking what he said, 808 00:46:48,320 --> 00:46:51,319 Speaker 1: because I think it's importantly different.
But I do think 809 00:46:51,360 --> 00:46:55,480 Speaker 1: that we can think 810 00:46:55,520 --> 00:46:59,280 Speaker 1: in a smarter way about how to make platforms responsible, 811 00:47:00,600 --> 00:47:07,600 Speaker 1: so that they don't allow misrepresentations to propagate in 812 00:47:07,600 --> 00:47:10,200 Speaker 1: a way that can be destructive. Right now, the legal 813 00:47:10,200 --> 00:47:15,440 Speaker 1: system basically gives them total immunity for defamation or 814 00:47:15,480 --> 00:47:19,279 Speaker 1: false information that they are allowed to spread. And 815 00:47:19,360 --> 00:47:22,839 Speaker 1: of course, when the business model is as it is, 816 00:47:22,880 --> 00:47:26,040 Speaker 1: where you have a strong interest in spreading false information 817 00:47:26,160 --> 00:47:30,080 Speaker 1: for political reasons, or even just for advertising reasons, 818 00:47:30,120 --> 00:47:33,800 Speaker 1: that creates a really dangerous incentive. So I think the 819 00:47:33,880 --> 00:47:36,279 Speaker 1: legal system could, on the margin, tweak that and do 820 00:47:36,360 --> 00:47:39,600 Speaker 1: that better. But I don't think that the core problem 821 00:47:39,600 --> 00:47:41,640 Speaker 1: with the business model is going to go away because 822 00:47:41,680 --> 00:47:43,920 Speaker 1: of fixing that. So then you want to say, well, 823 00:47:43,920 --> 00:47:45,839 Speaker 1: can the legal system come in and, you know, ban 824 00:47:45,960 --> 00:47:51,280 Speaker 1: advertising? And my view is, I'd love to try. 825 00:47:51,719 --> 00:47:54,600 Speaker 1: I'd love to say, let's see what happens if we 826 00:47:54,760 --> 00:47:59,399 Speaker 1: ban political advertising for six weeks before an election. I mean, 827 00:47:59,560 --> 00:48:01,560 Speaker 1: you know, it would be stupid to ban it all 828 00:48:01,560 --> 00:48:04,200 Speaker 1: the time, because advertising is good for selling products, 829 00:48:04,200 --> 00:48:07,399 Speaker 1: and products drive an economy. And it might not make 830 00:48:07,400 --> 00:48:09,560 Speaker 1: sense to ban advertising for, like, you know, the 831 00:48:09,560 --> 00:48:12,040 Speaker 1: local state representatives, because that turns out to be a 832 00:48:12,080 --> 00:48:14,600 Speaker 1: really efficient way for them to get their message out there. 833 00:48:15,760 --> 00:48:21,040 Speaker 1: But where you've got these really polarized and potentially 834 00:48:21,239 --> 00:48:25,720 Speaker 1: foreign-influence-driven campaigns for president or for the control 835 00:48:25,760 --> 00:48:28,399 Speaker 1: of Congress, I would love to see what happens if 836 00:48:28,400 --> 00:48:32,680 Speaker 1: we just turn off the mechanism for tweaking or 837 00:48:32,760 --> 00:48:38,600 Speaker 1: driving the public according to whatever interest the advertiser might have, 838 00:48:39,320 --> 00:48:42,400 Speaker 1: and see what happens. I expect that 839 00:48:42,440 --> 00:48:44,359 Speaker 1: you're going to see countries that try to do that, 840 00:48:44,600 --> 00:48:47,719 Speaker 1: and we might learn something from those countries.
And it 841 00:48:47,719 --> 00:48:49,440 Speaker 1: would be great if what we learned is that's all 842 00:48:49,480 --> 00:48:53,400 Speaker 1: it takes: just turn off the advertising and, you know, 843 00:48:53,440 --> 00:48:55,600 Speaker 1: it's not that social media becomes great, but it doesn't 844 00:48:55,640 --> 00:48:59,360 Speaker 1: become the poison that it was and 845 00:48:59,440 --> 00:49:03,120 Speaker 1: that I think it's going to be now. Broad concept: 846 00:49:03,640 --> 00:49:08,320 Speaker 1: do you not believe social media, search networks, etcetera, 847 00:49:08,560 --> 00:49:12,920 Speaker 1: should be liable for the content on their sites? I 848 00:49:12,960 --> 00:49:16,480 Speaker 1: don't think you can answer that simply. I think 849 00:49:16,520 --> 00:49:21,120 Speaker 1: there are contexts in which they should be liable, yes, 850 00:49:21,160 --> 00:49:24,520 Speaker 1: but they certainly should not be liable for everything, regardless 851 00:49:24,560 --> 00:49:27,480 Speaker 1: of the context or the steps that they've taken 852 00:49:28,040 --> 00:49:30,840 Speaker 1: to avoid that content being there. So I don't believe 853 00:49:30,840 --> 00:49:33,920 Speaker 1: in the absolute immunity that exists right now, and I 854 00:49:33,920 --> 00:49:35,600 Speaker 1: don't believe we should go to a world where they're 855 00:49:35,640 --> 00:49:39,200 Speaker 1: liable for everything that I upload onto a network. You know, 856 00:49:39,239 --> 00:49:42,840 Speaker 1: it's got to be a more sophisticated kind of notice- 857 00:49:42,840 --> 00:49:46,600 Speaker 1: and-takedown infrastructure than the copyright notice-and-takedown 858 00:49:46,760 --> 00:49:49,040 Speaker 1: infrastructure is. I mean, you know, 859 00:49:49,080 --> 00:49:51,160 Speaker 1: let's be real about this. It's not like 860 00:49:51,239 --> 00:49:55,080 Speaker 1: networks are immune from every bit of content that's up 861 00:49:55,120 --> 00:49:57,719 Speaker 1: there that might violate the law. I mean, they can 862 00:49:57,719 --> 00:50:01,080 Speaker 1: allow someone to defame someone; 863 00:50:01,200 --> 00:50:03,120 Speaker 1: they can allow defamation on their network and they're 864 00:50:03,120 --> 00:50:05,680 Speaker 1: scot-free. But, you know, if they allow a Disney 865 00:50:05,719 --> 00:50:09,280 Speaker 1: movie to be distributed on their network in violation 866 00:50:09,320 --> 00:50:12,239 Speaker 1: of copyright, they're not scot-free. They're gonna be 867 00:50:12,280 --> 00:50:14,359 Speaker 1: responsible if they don't take steps to take that down. 868 00:50:14,400 --> 00:50:18,560 Speaker 1: So we've already decided that some content is important 869 00:50:18,680 --> 00:50:21,720 Speaker 1: for us to regulate, to make sure that the legal 870 00:50:21,760 --> 00:50:24,399 Speaker 1: principles are upheld, and I think we could do more 871 00:50:24,520 --> 00:50:29,200 Speaker 1: to worry about at least defamation. And, you know, 872 00:50:29,239 --> 00:50:33,440 Speaker 1: I have a broader concept of direct manipulation, you know, 873 00:50:33,480 --> 00:50:38,480 Speaker 1: in the way that these fake videos invent events 874 00:50:38,600 --> 00:50:42,319 Speaker 1: for the purpose of misrepresenting, or for, you know, 875 00:50:42,440 --> 00:50:45,960 Speaker 1: creating dissent where the actual facts don't support it.
How 876 00:50:46,040 --> 00:50:48,120 Speaker 1: you do it is going to be so hard. I 877 00:50:48,120 --> 00:50:51,160 Speaker 1: don't imagine there's a simple statement here or a simple rule, 878 00:50:51,760 --> 00:50:53,239 Speaker 1: but I do think we shouldn't give up on it, 879 00:50:53,280 --> 00:50:56,040 Speaker 1: because I think the consequence of giving up has already 880 00:50:56,080 --> 00:51:01,560 Speaker 1: proven itself to be really devastating. Let's address antitrust. Okay, 881 00:51:01,600 --> 00:51:05,319 Speaker 1: generally speaking, the Republicans have been looser than the Democrats 882 00:51:05,360 --> 00:51:08,840 Speaker 1: on this. We've had a number of very interesting things. 883 00:51:09,360 --> 00:51:12,640 Speaker 1: We have the books case; the end result was that 884 00:51:12,719 --> 00:51:17,560 Speaker 1: books became more expensive. We had the T-Mobile/Sprint thing, 885 00:51:17,680 --> 00:51:20,520 Speaker 1: where people on the left said, oh, we must have 886 00:51:20,760 --> 00:51:24,120 Speaker 1: these additional competitors, whereas anyone who's a student of 887 00:51:24,120 --> 00:51:27,239 Speaker 1: the business said, hey, Sprint is going to go out 888 00:51:27,239 --> 00:51:31,080 Speaker 1: of business, they're gonna go bankrupt. Then we have 889 00:51:31,239 --> 00:51:35,200 Speaker 1: people like Elizabeth Warren saying let's break up the social networks. 890 00:51:35,200 --> 00:51:38,360 Speaker 1: What are your viewpoints about antitrust? And what should 891 00:51:38,360 --> 00:51:43,799 Speaker 1: the decisions be made upon? Well, I think 892 00:51:43,800 --> 00:51:47,719 Speaker 1: we should recognize that we've basically had no antitrust enforcement 893 00:51:47,880 --> 00:51:52,440 Speaker 1: in the technology sector in any significant way since 894 00:51:52,480 --> 00:51:57,439 Speaker 1: the Microsoft decision twenty years ago. And I think that's 895 00:51:57,440 --> 00:52:01,440 Speaker 1: been a mistake. I think we've allowed these companies 896 00:52:01,480 --> 00:52:05,040 Speaker 1: to get much bigger than they would have gotten had 897 00:52:05,280 --> 00:52:09,880 Speaker 1: ordinary principles of antitrust been deployed consistently throughout the period. 898 00:52:09,920 --> 00:52:12,960 Speaker 1: And I wouldn't blame Republicans for that. I mean, certainly 899 00:52:12,960 --> 00:52:15,759 Speaker 1: I blame Republicans for what happened under George Bush, but 900 00:52:15,840 --> 00:52:19,600 Speaker 1: I blame Democrats for what happened under Barack Obama. And 901 00:52:19,680 --> 00:52:24,320 Speaker 1: the clearest moments when antitrust should have stepped 902 00:52:24,320 --> 00:52:27,719 Speaker 1: in were under Barack Obama. And now we know 903 00:52:27,920 --> 00:52:32,720 Speaker 1: from, you know, Mark Zuckerberg's emails that their intent, 904 00:52:32,880 --> 00:52:37,520 Speaker 1: that their game, was exactly this kind of anti-competitive 905 00:52:38,880 --> 00:52:42,200 Speaker 1: acquisition strategy. And I think there's a real reason 906 00:52:42,239 --> 00:52:44,239 Speaker 1: to be worried about that. And I think that the 907 00:52:44,320 --> 00:52:46,919 Speaker 1: reason to be worried about it is much more than 908 00:52:46,960 --> 00:52:51,400 Speaker 1: just a pure economic reason.
Right: the standard Chicago School 909 00:52:51,440 --> 00:52:54,200 Speaker 1: theory of antitrust is, only worry about it if 910 00:52:54,200 --> 00:52:56,520 Speaker 1: there's going to be a harm to consumers. But I 911 00:52:56,520 --> 00:52:59,360 Speaker 1: think you also need to be worried about it because 912 00:52:59,400 --> 00:53:02,680 Speaker 1: it creates these entities that are so powerful that they 913 00:53:02,760 --> 00:53:06,360 Speaker 1: cannot be regulated. And so there's what 914 00:53:06,400 --> 00:53:10,520 Speaker 1: Luigi Zingales calls political antitrust, which 915 00:53:10,560 --> 00:53:13,920 Speaker 1: is a concern for the size and power of organizations, 916 00:53:14,360 --> 00:53:17,160 Speaker 1: not just because of the economy, but also because of 917 00:53:17,160 --> 00:53:20,359 Speaker 1: political democracy. And I think from that perspective, we've 918 00:53:20,360 --> 00:53:24,560 Speaker 1: plainly lost control of Amazon and Google and Facebook. 919 00:53:24,640 --> 00:53:26,840 Speaker 1: These are entities which, you know, as much as 920 00:53:26,880 --> 00:53:30,840 Speaker 1: Elizabeth Warren was, you know, inspirational in her aspiration, I 921 00:53:30,920 --> 00:53:33,440 Speaker 1: think it was kind of crazy talk to imagine the 922 00:53:33,520 --> 00:53:35,520 Speaker 1: government was going to be able to do any of 923 00:53:35,520 --> 00:53:38,040 Speaker 1: the things she was talking about, given the enormous power 924 00:53:38,560 --> 00:53:42,320 Speaker 1: that they have because they're plugged into everybody's lives directly. Okay, 925 00:53:42,400 --> 00:53:43,920 Speaker 1: let's talk about you for a second. Where did you 926 00:53:43,920 --> 00:53:45,960 Speaker 1: grow up? So I grew up in the kind of 927 00:53:46,040 --> 00:53:50,720 Speaker 1: Kentucky part of Pennsylvania, a town called Williamsport, Pennsylvania. Which 928 00:53:50,760 --> 00:53:54,680 Speaker 1: is, the Little League World Series? Exactly right. So, 929 00:53:54,760 --> 00:53:57,600 Speaker 1: about eighty miles north of Harrisburg. And, you know, 930 00:53:57,640 --> 00:53:59,319 Speaker 1: I was there from about the age of six until 931 00:53:59,360 --> 00:54:02,240 Speaker 1: I went to college. Okay. And what were the political 932 00:54:02,320 --> 00:54:05,319 Speaker 1: leanings of your family? So I grew 933 00:54:05,400 --> 00:54:08,920 Speaker 1: up a Republican. I was quite an active Republican when 934 00:54:08,960 --> 00:54:11,880 Speaker 1: I was a kid. I was chairman of the Pennsylvania 935 00:54:11,920 --> 00:54:14,600 Speaker 1: Teenage Republicans. I was the youngest member of a delegation 936 00:54:15,320 --> 00:54:20,279 Speaker 1: to the 1980 Republican Convention. So, you know, 937 00:54:20,360 --> 00:54:24,120 Speaker 1: we were strong conservatives back then. Okay, what did your 938 00:54:24,120 --> 00:54:28,560 Speaker 1: parents do for a living? My father ran 939 00:54:28,640 --> 00:54:32,359 Speaker 1: a steel-fabricating company, his own company, a steel fabricator called 940 00:54:32,400 --> 00:54:35,920 Speaker 1: Williamsport Fabricators. It was acquired by a company called 941 00:54:36,000 --> 00:54:40,400 Speaker 1: High Steel Structures. And my mom, for at 942 00:54:40,440 --> 00:54:42,839 Speaker 1: least the time I was in high school, sold 943 00:54:42,880 --> 00:54:45,280 Speaker 1: real estate, but she basically did that as a hobby.
944 00:54:46,280 --> 00:54:48,600 Speaker 1: And how many kids in the family? So I had 945 00:54:48,640 --> 00:54:51,880 Speaker 1: two older siblings who are about nine and 946 00:54:51,920 --> 00:54:55,359 Speaker 1: ten years older, and then one younger sister who's two 947 00:54:55,400 --> 00:54:58,440 Speaker 1: years younger than I am. And the older siblings, what 948 00:54:58,560 --> 00:55:02,040 Speaker 1: sex were they? A boy and a girl. 949 00:55:02,400 --> 00:55:05,200 Speaker 1: So on some level you were in the middle, but 950 00:55:05,280 --> 00:55:08,880 Speaker 1: because of so many years in between... To what 951 00:55:09,080 --> 00:55:12,600 Speaker 1: degree were your parents very supportive? Were you lost 952 00:55:12,640 --> 00:55:14,840 Speaker 1: in the shuffle? Where were you in the hierarchy of 953 00:55:14,840 --> 00:55:18,719 Speaker 1: the family? Yeah, you're right. So my older siblings were 954 00:55:18,719 --> 00:55:22,400 Speaker 1: basically gone by the time I can actually 955 00:55:22,440 --> 00:55:26,840 Speaker 1: remember anything about my life. And so I think that 956 00:55:26,880 --> 00:55:30,280 Speaker 1: they were extremely supportive. I actually went away to school 957 00:55:30,320 --> 00:55:34,040 Speaker 1: for a couple of years, four years, and that 958 00:55:34,120 --> 00:55:37,440 Speaker 1: was significant. Where did you go to school? 959 00:55:37,480 --> 00:55:40,640 Speaker 1: I went to... it was called the Columbus Boy Choir. 960 00:55:40,880 --> 00:55:44,040 Speaker 1: So it was a professional boy choir in Princeton 961 00:55:45,239 --> 00:55:49,880 Speaker 1: from sixth to ninth grade. And that was just choir? 962 00:55:50,080 --> 00:55:53,160 Speaker 1: Was it musically oriented? It was a professional boy choir. 963 00:55:53,440 --> 00:55:57,080 Speaker 1: We traveled around the world and we sang. How do 964 00:55:57,160 --> 00:55:59,520 Speaker 1: you even get into that? You know, I was singing 965 00:55:59,520 --> 00:56:03,120 Speaker 1: in my church choir and my choir director said, you know, 966 00:56:03,200 --> 00:56:04,840 Speaker 1: you do this pretty well, you should try out for 967 00:56:04,880 --> 00:56:07,799 Speaker 1: this boy choir. So I tried out and they told 968 00:56:07,800 --> 00:56:10,960 Speaker 1: me I should come, and that started in 969 00:56:11,040 --> 00:56:13,560 Speaker 1: sixth grade. And that was a big chunk of 970 00:56:13,560 --> 00:56:19,239 Speaker 1: my youth. And good experience or bad experience? So you 971 00:56:19,280 --> 00:56:23,120 Speaker 1: tripped into this question, Bob. Yeah, so there's been 972 00:56:23,120 --> 00:56:25,560 Speaker 1: an article written about this, so I'm not announcing anything new. 973 00:56:26,440 --> 00:56:28,279 Speaker 1: In some ways, it was the best of times, in 974 00:56:28,280 --> 00:56:29,839 Speaker 1: some ways the worst of times. It was the best 975 00:56:29,880 --> 00:56:31,600 Speaker 1: of times because, you know, I was a kid from 976 00:56:31,640 --> 00:56:35,279 Speaker 1: a small town traveling around the world, understanding things that 977 00:56:35,320 --> 00:56:37,400 Speaker 1: no one at my age would have understood. It was 978 00:56:37,400 --> 00:56:41,080 Speaker 1: the worst of times because, for two and 979 00:56:41,120 --> 00:56:44,719 Speaker 1: a half of those four years, I was sexually abused 980 00:56:44,719 --> 00:56:48,200 Speaker 1: by the director of the choir.
And it had 981 00:56:48,239 --> 00:56:52,840 Speaker 1: a profound impact on everything that happened afterwards, 982 00:56:52,880 --> 00:56:56,359 Speaker 1: and, you know, kind of in my mind 983 00:56:56,400 --> 00:56:59,680 Speaker 1: defines what the experience was for me. So I 984 00:56:59,680 --> 00:57:03,880 Speaker 1: don't know; it depends which day I'm thinking about it. Okay. 985 00:57:03,920 --> 00:57:07,800 Speaker 1: Does one ever get over sexual abuse of that nature? 986 00:57:07,840 --> 00:57:12,160 Speaker 1: Can you metabolize it? Does it affect future choices and experiences? 987 00:57:15,760 --> 00:57:20,560 Speaker 1: You never get over it. It's never gone. And for some 988 00:57:20,640 --> 00:57:24,840 Speaker 1: people it's profoundly significant in everything they do. You know, 989 00:57:24,960 --> 00:57:28,720 Speaker 1: some people, it probably drives them into their own kind 990 00:57:28,760 --> 00:57:32,880 Speaker 1: of perversions. Some people, it creates an extraordinary insecurity or 991 00:57:33,840 --> 00:57:40,000 Speaker 1: pathology around the ability to connect to people. 992 00:57:40,040 --> 00:57:42,360 Speaker 1: You know, I think very few people have a 993 00:57:42,440 --> 00:57:47,320 Speaker 1: really clear sense of just how destructive mucking about with 994 00:57:47,360 --> 00:57:51,760 Speaker 1: those emotions of a young child can be. And 995 00:57:52,080 --> 00:57:55,520 Speaker 1: so, no, you know, I would say, 996 00:57:56,360 --> 00:58:00,600 Speaker 1: you know, at the shrink level, 997 00:58:00,640 --> 00:58:02,760 Speaker 1: there's never been a moment that I haven't been affected 998 00:58:02,760 --> 00:58:05,919 Speaker 1: by it. At the professional level, I kind of think 999 00:58:05,920 --> 00:58:09,320 Speaker 1: of most of my professional career as in some sense 1000 00:58:09,920 --> 00:58:14,240 Speaker 1: responsive to the problem that I was obsessed with 1001 00:58:14,280 --> 00:58:17,000 Speaker 1: when I was there. That problem was, you know... 1002 00:58:17,040 --> 00:58:20,240 Speaker 1: what struck me about that experience was that, of 1003 00:58:20,280 --> 00:58:23,880 Speaker 1: course, there was this criminal at the center, you know, 1004 00:58:23,920 --> 00:58:26,320 Speaker 1: who was pathological and just, you know, in some sense 1005 00:58:26,320 --> 00:58:29,640 Speaker 1: couldn't stop himself. And I wasn't the only person 1006 00:58:29,680 --> 00:58:31,920 Speaker 1: he abused. Depending on the year, he 1007 00:58:31,920 --> 00:58:34,520 Speaker 1: could have been abusing boys throughout the school. I mean, 1008 00:58:34,640 --> 00:58:38,600 Speaker 1: he was just, you know, insatiable and unstoppable. 1009 00:58:40,120 --> 00:58:42,920 Speaker 1: But the more striking thing to me was to reflect 1010 00:58:42,960 --> 00:58:46,320 Speaker 1: on the people who were around him and in some 1011 00:58:46,400 --> 00:58:50,440 Speaker 1: sense knew, but didn't want to step up and stop 1012 00:58:50,560 --> 00:58:54,360 Speaker 1: what was happening, either because they turned a blind eye 1013 00:58:54,560 --> 00:58:57,760 Speaker 1: or because they thought it would be destructive of the institution, 1014 00:58:57,840 --> 00:59:02,000 Speaker 1: or whatever the reason is. They were the enablers.
And 1015 00:59:02,040 --> 00:59:04,320 Speaker 1: I kind of feel like, if 1016 00:59:04,400 --> 00:59:07,320 Speaker 1: you really understood the thread of the work that I've 1017 00:59:07,360 --> 00:59:09,320 Speaker 1: done from the very beginning of my career, it has 1018 00:59:09,360 --> 00:59:14,520 Speaker 1: always been to try to draw attention to the institution, 1019 00:59:14,800 --> 00:59:20,000 Speaker 1: or the context of the enablers, and to ask about that. You know, 1020 00:59:20,760 --> 00:59:22,440 Speaker 1: like, the last work I did when I 1021 00:59:22,440 --> 00:59:24,720 Speaker 1: was running the Edmond J. Safra Center was focused on what 1022 00:59:24,760 --> 00:59:29,520 Speaker 1: we called institutional corruption. And that's the corruption not 1023 00:59:29,680 --> 00:59:32,400 Speaker 1: of criminals. It's not the quid pro quo corruption. It's 1024 00:59:32,440 --> 00:59:36,400 Speaker 1: the corruption of the institution, where good people in 1025 00:59:36,440 --> 00:59:41,080 Speaker 1: the institution allow the institution to become sensitive to, or responsive 1026 00:59:41,080 --> 00:59:46,000 Speaker 1: to, the wrong kind of desires or interests. So when 1027 00:59:46,040 --> 00:59:50,800 Speaker 1: Congress raises money to fund its campaigns, there's very 1028 00:59:50,840 --> 00:59:54,000 Speaker 1: little bribery involved in that, so it's not criminal in 1029 00:59:54,000 --> 00:59:57,720 Speaker 1: the traditional sense, but it's producing an institution that is 1030 00:59:58,280 --> 01:00:01,440 Speaker 1: not responsive to the ordinary people. And it's allowed 1031 01:00:01,480 --> 01:00:03,920 Speaker 1: to be that way because people enable it and 1032 01:00:03,960 --> 01:00:06,000 Speaker 1: support it, and they're not willing to step up and 1033 01:00:06,040 --> 01:00:08,400 Speaker 1: just say this is wrong and we have to change it. 1034 01:00:09,240 --> 01:00:11,400 Speaker 1: And so I feel like everything I've done has in some 1035 01:00:11,520 --> 01:00:16,040 Speaker 1: sense recurred to that same reflection, in, you know, very 1036 01:00:16,080 --> 01:00:19,880 Speaker 1: many different legal contexts. But all of that, you know, 1037 01:00:20,000 --> 01:00:23,680 Speaker 1: comes from moments reflecting on what happened to me when 1038 01:00:23,720 --> 01:00:26,960 Speaker 1: I was, you know, twelve and thirteen and fourteen. Now, 1039 01:00:27,000 --> 01:00:29,760 Speaker 1: did you age out of that situation, or did you 1040 01:00:30,200 --> 01:00:33,080 Speaker 1: decide to leave? I aged out. You know, your voice 1041 01:00:33,160 --> 01:00:35,520 Speaker 1: changes and you're gone. So my voice changed and I 1042 01:00:35,600 --> 01:00:39,760 Speaker 1: was gone. Okay, did this affect your relationships with women 1043 01:00:39,840 --> 01:00:45,680 Speaker 1: and girls thereafter? Of course. Of course. You know... 1044 01:00:46,640 --> 01:00:52,040 Speaker 1: I mean, I'm happily married to an extraordinary woman, and 1045 01:00:52,080 --> 01:00:54,840 Speaker 1: I know other people who had the same experience who 1046 01:00:54,920 --> 01:00:59,040 Speaker 1: are not. So, you know, I feel: there but for the 1047 01:00:59,040 --> 01:01:03,480 Speaker 1: grace of God go I.
But, you know, 1048 01:01:04,200 --> 01:01:06,600 Speaker 1: in ways we've gone further in this 1049 01:01:06,680 --> 01:01:09,520 Speaker 1: conversation than I think either of us thought 1050 01:01:09,560 --> 01:01:11,600 Speaker 1: we would. I don't want to go into the depth 1051 01:01:11,720 --> 01:01:14,280 Speaker 1: or the detail of it, but yes, absolutely, day to 1052 01:01:14,400 --> 01:01:21,320 Speaker 1: day, that hung right in the middle of everything. Okay, 1053 01:01:21,320 --> 01:01:24,080 Speaker 1: so you age out, your voice changes, you go back 1054 01:01:24,080 --> 01:01:28,360 Speaker 1: to public high school. What is that experience like, after 1055 01:01:28,400 --> 01:01:32,919 Speaker 1: being away for three years? It was difficult. People didn't 1056 01:01:33,000 --> 01:01:36,000 Speaker 1: quite understand, you know, what the experience had been. 1057 01:01:36,120 --> 01:01:38,680 Speaker 1: Obviously I didn't tell people what the experience had been, 1058 01:01:39,320 --> 01:01:42,160 Speaker 1: but it really affected how I understood 1059 01:01:42,160 --> 01:01:45,360 Speaker 1: the world. I kind of thought 1060 01:01:45,560 --> 01:01:48,600 Speaker 1: sex was at the center of everything, because sex was 1061 01:01:48,640 --> 01:01:51,680 Speaker 1: at the center of everything when I was twelve. So 1062 01:01:51,720 --> 01:01:55,240 Speaker 1: I interpreted everything around me as if somehow sex was 1063 01:01:55,280 --> 01:01:58,400 Speaker 1: involved in it. So I remember an extraordinary moment: 1064 01:02:00,000 --> 01:02:02,640 Speaker 1: our neighbor came across the street to tell me 1065 01:02:02,680 --> 01:02:07,320 Speaker 1: that her daughter had just gotten a car, and a 1066 01:02:07,360 --> 01:02:10,560 Speaker 1: bunch of guys had, you know, cleaned the car for 1067 01:02:10,600 --> 01:02:14,439 Speaker 1: her and tuned it up. And the way my mind worked, 1068 01:02:14,520 --> 01:02:16,720 Speaker 1: given the experience I had had, I imagined, you know, 1069 01:02:16,800 --> 01:02:20,280 Speaker 1: that sex had to be in that transaction somewhere; like, 1070 01:02:20,360 --> 01:02:22,760 Speaker 1: that's just the normal thing, it would have been there. 1071 01:02:23,400 --> 01:02:25,560 Speaker 1: And I said something that reflected that, and she kind 1072 01:02:25,560 --> 01:02:28,760 Speaker 1: of was astonished. And I think back on it, and 1073 01:02:28,800 --> 01:02:32,400 Speaker 1: even I'm astonished. It's a completely abnormal way to 1074 01:02:32,440 --> 01:02:34,760 Speaker 1: think about the world. But, you know, that's where I 1075 01:02:34,800 --> 01:02:37,480 Speaker 1: had been delivered. I was delivered thinking that's the normal 1076 01:02:37,520 --> 01:02:41,080 Speaker 1: way people interacted, because that had been the normal way 1077 01:02:41,120 --> 01:02:44,320 Speaker 1: I had seen for, you know, many years of my 1078 01:02:44,440 --> 01:02:47,600 Speaker 1: life at that point. So you're in high school: do you 1079 01:02:47,720 --> 01:02:49,880 Speaker 1: fit in? Are you a nerd? Where do you fit on 1080 01:02:49,920 --> 01:02:53,080 Speaker 1: the social totem pole? Nerd, total nerd. You know, I'm in 1081 01:02:53,080 --> 01:02:56,000 Speaker 1: the bands, I run. You know, I'm like a 1082 01:02:56,080 --> 01:02:59,600 Speaker 1: leader in student government. I'm a smart kid.
Um, I'm 1083 01:02:59,720 --> 01:03:03,520 Speaker 1: not the number one, but the number two, and, 1084 01:03:03,560 --> 01:03:05,200 Speaker 1: you know, I have a lot of good friends. 1085 01:03:05,360 --> 01:03:09,480 Speaker 1: But, you know, in the hierarchy of high school, I'm 1086 01:03:09,520 --> 01:03:12,760 Speaker 1: not the football player, I'm not the coolest 1087 01:03:12,840 --> 01:03:15,760 Speaker 1: kid in the room. And I'm obsessed with obeying 1088 01:03:15,760 --> 01:03:18,720 Speaker 1: the law. So I don't drink, and I 1089 01:03:18,760 --> 01:03:21,440 Speaker 1: don't, you know, go out and drive at times I 1090 01:03:21,440 --> 01:03:23,960 Speaker 1: shouldn't. So, you know, in that sense, yes, 1091 01:03:24,040 --> 01:03:26,720 Speaker 1: absolutely a nerd. Well, where does that come from, the 1092 01:03:27,000 --> 01:03:30,640 Speaker 1: sense you have to toe the line legally? That's 1093 01:03:30,680 --> 01:03:35,960 Speaker 1: totally my dad. My dad was obsessive, you know, 1094 01:03:36,000 --> 01:03:37,840 Speaker 1: in a good sense. He was just, you know, 1095 01:03:38,080 --> 01:03:43,120 Speaker 1: a completely absolute-integrity sort of person. You know, the thing 1096 01:03:43,160 --> 01:03:46,480 Speaker 1: I think about all the time with my dad is: 1097 01:03:46,520 --> 01:03:52,080 Speaker 1: in college, I worked in his firm. In high school, 1098 01:03:52,080 --> 01:03:56,560 Speaker 1: I basically worked as a laborer, you know, 1099 01:03:56,840 --> 01:04:00,360 Speaker 1: drilling holes in steel. But after college, I did 1100 01:04:00,400 --> 01:04:03,240 Speaker 1: a summer where I was helping their finance department, you know, 1101 01:04:03,360 --> 01:04:08,960 Speaker 1: negotiate a merger and transition. And the way they 1102 01:04:08,960 --> 01:04:12,160 Speaker 1: did business is they would bid jobs, you know, basically 1103 01:04:12,280 --> 01:04:14,560 Speaker 1: usually to the state. They would give a price, they 1104 01:04:14,560 --> 01:04:19,040 Speaker 1: would get the job. And there was one job 1105 01:04:19,120 --> 01:04:23,320 Speaker 1: where the bidder forgot to include the bolts. So that 1106 01:04:23,400 --> 01:04:29,600 Speaker 1: was a million dollars. And anybody at that stage 1107 01:04:29,720 --> 01:04:31,920 Speaker 1: would have, you know, done everything they could to get out of 1108 01:04:31,920 --> 01:04:34,440 Speaker 1: the contract, because a million dollars is a million dollars. 1109 01:04:34,440 --> 01:04:36,520 Speaker 1: That was basically the net worth of the company right there. 1110 01:04:37,640 --> 01:04:40,040 Speaker 1: And I suggested to my dad, well, what are we 1111 01:04:40,080 --> 01:04:41,680 Speaker 1: gonna do? We gotta get out of this. We can't 1112 01:04:41,720 --> 01:04:44,200 Speaker 1: go through with it. You can't spend a million dollars 1113 01:04:44,920 --> 01:04:47,760 Speaker 1: because of this mistake. And he said, my word is my word; 1114 01:04:47,840 --> 01:04:49,400 Speaker 1: we're not going to get out of it, 1115 01:04:49,440 --> 01:04:50,919 Speaker 1: and we're going to find a way to make it work.
1116 01:04:52,000 --> 01:04:54,720 Speaker 1: And, you know, it's easy for a parent to talk 1117 01:04:54,720 --> 01:04:59,240 Speaker 1: about the right thing to do, but the greatest opportunity 1118 01:04:59,280 --> 01:05:02,240 Speaker 1: is to be able to teach by showing somebody that you're 1119 01:05:02,280 --> 01:05:06,280 Speaker 1: doing the right thing when it really hurts. And 1120 01:05:06,360 --> 01:05:09,560 Speaker 1: he did that all the time. And so that 1121 01:05:09,720 --> 01:05:13,160 Speaker 1: was what made me think about how 1122 01:05:13,200 --> 01:05:16,600 Speaker 1: I should behave too. And, you know, I just couldn't. 1123 01:05:17,000 --> 01:05:19,640 Speaker 1: I couldn't break the law. And did he get 1124 01:05:19,680 --> 01:05:22,840 Speaker 1: that contract? He got the contract. He sucked it up. 1125 01:05:23,440 --> 01:05:26,640 Speaker 1: I think they were able to cut back about half 1126 01:05:26,680 --> 01:05:29,480 Speaker 1: of that loss, but it was a huge hit on 1127 01:05:29,120 --> 01:05:31,600 Speaker 1: the company. Okay, so how did you decide 1128 01:05:31,640 --> 01:05:35,600 Speaker 1: to go to Penn, and what was your experience there? 1129 01:05:35,640 --> 01:05:39,400 Speaker 1: I was a legacy. My father, my grandfather, my aunt, 1130 01:05:39,480 --> 01:05:42,000 Speaker 1: everybody went to Penn. I did early admit, I got 1131 01:05:42,040 --> 01:05:45,640 Speaker 1: in, and so I decided to go. And I 1132 01:05:45,680 --> 01:05:48,560 Speaker 1: went originally thinking I was going to be a businessman 1133 01:05:48,720 --> 01:05:51,680 Speaker 1: like my dad. So I went to the Wharton business 1134 01:05:51,680 --> 01:05:54,320 Speaker 1: school, and as quickly as I could, I finished my 1135 01:05:54,360 --> 01:05:57,000 Speaker 1: degree there. I was interested in 1136 01:05:57,040 --> 01:05:59,440 Speaker 1: the history of economics. And then I left. After I 1137 01:05:59,480 --> 01:06:03,080 Speaker 1: finished Penn, with a degree from the college as 1138 01:06:03,120 --> 01:06:05,120 Speaker 1: well as the business school, I went to Cambridge to 1139 01:06:05,120 --> 01:06:09,240 Speaker 1: study philosophy. Well, well, well, what was the inspiration? I mean, 1140 01:06:09,840 --> 01:06:14,160 Speaker 1: you've had a long, peripatetic career in education. You 1141 01:06:14,280 --> 01:06:16,400 Speaker 1: just said, okay, I want to explore more. 1142 01:06:16,440 --> 01:06:18,760 Speaker 1: There was no issue of going into the business world 1143 01:06:18,880 --> 01:06:21,880 Speaker 1: or anything? Yeah. I mean, when I 1144 01:06:21,920 --> 01:06:25,560 Speaker 1: decided I didn't want to be in the business, 1145 01:06:25,600 --> 01:06:27,760 Speaker 1: I then thought I'd go to law school. And 1146 01:06:27,960 --> 01:06:29,800 Speaker 1: so I applied to law school and I got into 1147 01:06:29,880 --> 01:06:31,760 Speaker 1: law school, and then at the last minute I thought, 1148 01:06:31,840 --> 01:06:33,680 Speaker 1: I just don't want to go to law school yet. 1149 01:06:33,680 --> 01:06:37,320 Speaker 1: I was really obsessed with this philosophy course that 1150 01:06:37,400 --> 01:06:41,240 Speaker 1: I had taken in my last term at Penn, and I, 1151 01:06:41,360 --> 01:06:44,080 Speaker 1: you know, was searching around for how I could 1152 01:06:44,120 --> 01:06:46,600 Speaker 1: then do something else related to philosophy.
And I thought I'd go to England. I looked at Oxford, and it required a thirty-page essay to apply. And I looked at Cambridge, and it required a six-word essay to apply. So obviously I applied to Cambridge, and I got in. I went, and I spent three years studying philosophy at Cambridge.

Bob Lefsetz: And did you end up with a degree?

Lawrence Lessig: Yeah, I got what would become a master's; they have this weird way in which what was originally an undergraduate degree matures into a master's. So I have a master's in philosophy. And I thought I would become a philosopher; I thought I would get a PhD. But at the end I just felt it was too remote, too removed from the real world. So I came back to law school.

Bob Lefsetz: But what did you learn from that experience in the UK, other than socially?

Lawrence Lessig: You know, I think a really important part of education, part of growing up, is to place yourself in radically different contexts. Now, England is not radically different from America, but in many ways it was, and that makes you much more sensitive to what in the background is making the world the way it seems, and what is the part you're responsible for. I loved my time in England. It was an extraordinary luxury, because I just got to read and write, and I already had a degree, so I felt like I knew something. But it was a long time ago; it was eighty-three to eighty-six. That was a period when Margaret Thatcher was the Prime Minister. They were having these massive coal strikes. There were strikes by the students, because she was trying to cut back the welfare subsidy to students. So there was a lot of social unrest in the country at the time. But I felt it was the period where I could go deepest and think deepest about stuff that was important.

Bob Lefsetz: Okay, so you go to Yale.
What was your experience at Yale Law School?

Lawrence Lessig: Well, first, I actually went to Chicago, and I transferred to Yale to follow a girlfriend whom I had met in England who wanted to go to Yale. So I spent my first year at Chicago.

Bob Lefsetz: That begs the question: what's the difference between going to the University of Chicago and going to Yale for law school?

Lawrence Lessig: They're radically different places. Chicago was, and I'm sure still is, though I don't have a clear sense of it today, a great law school. Really super kids, really super professors. I mean, Richard Posner, Michael McConnell, Cass Sunstein, Geoffrey Stone; there was an incredible mix of people there. Diane Wood, who eventually also became a judge with Richard Posner. And so it was an intense, really intense legal education. Yale was more like a graduate-school education. They're super smart kids, but they almost didn't take the law seriously; they didn't want to get deep into the messiness of the law. So I felt like in some ways it was the perfect education, because I got a really deep legal education in my first year at Chicago, and then I went to Yale and I could be more reflective and philosophical about it. And given I decided I wanted to be a professor, that was a pretty good chance to be in that environment for that.

Bob Lefsetz: Okay, what happened to the girl?

Lawrence Lessig: We were very serious for many years, but in the end it didn't work.

Bob Lefsetz: Okay, how did you decide you wanted to be a professor?

Lawrence Lessig: You know, I realized I didn't have the discipline to work on problems I didn't find interesting. I clerked at a law firm, Cravath, Swaine and Moore, my first summer and my second summer, and I saw all these people working on things that just seemed incredibly boring.
And what struck me was not just that they were incredibly boring; it was, my god, how can you actually do that? Because I realized that if you told me I had to work on something I didn't find interesting, I just physically couldn't do it. So, at the least, I needed a career where I was free to pick whatever I wanted to work on, based on what I wanted to work on. And, you know, I wasn't rich, so being an academic was the best way to do that. That's the freedom the job gives me, and I've depended on it deeply in the years since.

Bob Lefsetz: And how did you end up at Stanford, and how did you end up at Harvard? And what are the differences between those two gigs and environments?

Lawrence Lessig: So after I clerked, for Posner first at the Seventh Circuit and then Justice Scalia at the Supreme Court, I went back to Chicago, and I taught at Chicago for six years. Then I came to Harvard to start the Center for Internet and Society that was here. And then in two thousand, after I got married, my wife was really keen that we see California, so we went to Stanford. I was there for nine years, and then I came back to Harvard in two thousand nine. They're very different environments. Stanford is like Yale in the sense of being small and very intimate, although of course at the time, two thousand to two thousand nine, it was filled with people obsessed with Silicon Valley, at the heart of Silicon Valley. Harvard is huge, which I find wonderful, because what that means is there's always someone who is doing what you want to be interested in. There's always students who are interested in the stuff you want to work on; there's always professors who are working in the area you want to work on. It's like a city. It's like New York City compared to a village.
But as long as you're comfortable in a city, it's ideal.

Bob Lefsetz: Going back to Scalia: Republicans idolized him, like they idolized Reagan, whereas those on the left certainly are not in favor of his decisions, even since he's deceased. Was it hype? Was he really that intelligent? What's your take?

Lawrence Lessig: Well, I think we have to recognize that Justice Scalia was a justice for many years. He goes to the Court in eighty-six or eighty-seven, and obviously he dies in two thousand sixteen, so that's a long career. And I think he was a different justice at different stages of his career. What inspired me the most about him? You know, when you apply, at least when I applied, to be a clerk, you apply to everybody, and if you don't, you don't get a job from anybody. So you're not allowed to pick and choose based on the philosophy or the partisan character of a particular judge. But I applied, and he interviewed me quickly, and he hired me quickly. What I admired about him is that he was someone deeply committed to living out his philosophy of originalism and textualism. And there were a number of times when there was a case where the conservative thing to do was not necessarily the originalist thing to do. I wouldn't say that in every one of those cases he was immediately convinced of the originalist thing, but in every one of those cases he was eventually brought around to doing the originalist thing, even though it was against his otherwise conservative politics. And so I was impressed that, at that stage of his career at least, the principles he was advancing he allowed to constrain himself. Part of that was he was just trying to work it out.
Part of it was his belief that the principles didn't have a political valence to them. Part of it was he hired liberal clerks. I mean, I was a token liberal hired by Justice Scalia. The other clerks were very conservative; there was a deep skepticism about me early on, maybe all the way through. Later in his career he stopped hiring liberals. I asked him once why. He said, well, you know, I've figured everything out; I don't need anybody to argue with anymore. So, you know, I didn't spend my career studying the work of Justice Scalia, so I don't feel like I'm an expert in the late Scalia, the Scalia of circa two thousand ten. But the Scalia I knew was someone I didn't agree with in, you know, the vast majority of cases, but certainly someone I respected for what he was doing.

Bob Lefsetz: And how did you go from being a young Republican to the other end of the spectrum?

Lawrence Lessig: You know, in one sense I don't think I've changed, because I'm still libertarian in the sense that I think society's objective should be to give people the widest scope for liberty that it can. But what changed was recognizing how much society needed to contribute to that project. Creating the conditions within which liberty can exist becomes an extraordinarily important part of what society has to do and what government has to do. That means creating conditions, including conditions of equality. And the more you focus on how you create those conditions, what education, what protection of rights, what enablement of opportunities, you know, it was obviously the Democratic Party that was more concerned about those issues than the Republicans, certainly the Republicans today. And going forward, I think one of the most interesting debates that came out of this recent Democratic primary was Andrew Yang's push for universal basic income.
One way to understand the push for universal basic income is, again, related to this idea of enabling people to have the most meaningful liberty that they can have. There's a really great book called Bullshit Jobs, which is a wonderful account of how a lot of us have jobs that we think of as bullshit jobs, by which he means a job which is doing no good for you or for anybody else in the world. You're just doing it, but you can't imagine how it's helping anybody. And it's not that other people think it's not helping anybody, like I might think that the parking-ticket person is not helping anybody; it's that you yourself don't think that you're helping anybody, that you don't think the world is any better off because of it. I think about the luxury of having a life where every morning I get up and I think I'm going to work on exactly what I want to work on, compared to somebody next to me who is doing a job where they're thinking, this is just contributing nothing to the world. That contrast, I think, is going to be increasingly important as we think about how we build a world where more people have the freedom to have jobs where they feel like they're actually contributing something, where they're doing something that is meaningful, at least to them. And what Andrew Yang was talking about was something that would create the conditions for making that possible. Like, UBI would make it so you would have enough to live, and then you could pick a job. Maybe you wouldn't be able to work forty hours a week on that job, only a couple of hours a week, whether it's being an artist or building things in the forest, whatever. But the point is you could be living in a way that was meaningful to you.
And I think it would be extraordinary if we set that as an objective for society: how will we make it so everybody wakes up every morning with the same feeling that I do, that what they're doing is what they want to be doing, and they have a chance to do it? And in a certain sense, I think that's libertarian. That's about making sure society provides to people that opportunity for liberty, even though there's a hugely important role for the state in creating the conditions for that liberty.

Bob Lefsetz: And you say you have the ability to choose your projects at Harvard. What are you working on right now?

Lawrence Lessig: Well, about a dozen years ago I announced I was going to give up my work on the Internet and copyright to take up the project of addressing what I was calling institutional corruption, basically this corruption of institutions like government. My obsession was trying to find a way to get a democracy that would be responsive to the public, as opposed to responsive to the special interests, the funders of campaigns. So that's the project I've been working on for the last twelve years, and it's the project I'll be working on until we get it done, or at least for the next ten years; that's kind of the limit I've set. You know, seventy is the point where I want to step aside. So what I'm working on right now are projects that are trying to elevate and make salient these democratic reforms. We've got a bunch of things related to this election to try to tee this issue up. We've got a bunch of projects related to the electoral college, which we hope will follow this election and give a real opportunity for reform of the electoral college. And my real hope is that if Joe Biden is elected, and Mitch McConnell, the dark lord of the Senate, is returned to Kentucky,
there's a chance that Congress will pass fundamental democratic reform in the first hundred days. Joe Biden has promised they will do it. Nancy Pelosi already passed that reform in the House last year, and she's committed to passing it if she has a majority in the House again. Chuck Schumer has promised to pass it in the Senate if he has a majority in the Senate. So we have a real chance for this reform to be enacted. And if it's enacted, I think it will be the most important thing this administration could do. So I'm eager to continue to push to try to make sure that that happens.

Bob Lefsetz: So, for those who are not following as closely as you are, what would be the elements of that push?

Lawrence Lessig: Well, Nancy Pelosi passed something called HR one, and what was significant about that title is that it signaled that this needed to be the first thing done, and it recognized that nothing else was possible until you passed this reform. And this reform includes, most importantly, changing the way congressional campaigns are funded. Right now, members of Congress spend anywhere between thirty and seventy percent of their time raising money, spending their time on the phone calling donors. And their donors are not the average American; their donors are the tiniest fraction of the one percent. So they spend, some of them, most of their time sucking up to this tiny, tiny fraction of the one percent to get the money they need to run their campaigns or to get their party back into power. That is a deeply corrupting experience. So the most important thing HR one would do is create a different way of funding campaigns that would liberate them from that project, and by liberating them, give them a chance to think about what their constituents want, or what's in the interest of the country, as opposed to what the funders of campaigns want. That's number one.
Number two, it would end partisan gerrymandering of congressional districts in the states. Number three, it would restore the Voting Rights Act so that states cannot exercise their power to suppress the vote. You know, it's targeted African Americans, but the main reason for this suppression, I think, is just partisan: Republicans want to make it harder for Democrats to vote, and where Democrats have the opportunity, they want to do the same thing too. But the point is, it's a corrupting inequality on the basis of partisan power that we've got to end, and HR one would go a long way toward that. And then it does a bunch of other things related to ethics in government, automatic voter registration, things just to make the democratic process work better. And I think that if we passed that bill, or maybe a little bit better than that bill, more aggressive in changing the way campaigns are funded, then there would be an extraordinary opportunity to get a democracy that cared more about what the people want than about what the lobbyists or the funders of campaigns want.

Bob Lefsetz: And sketching in the painting a little bit more, how would campaigns be funded?

Lawrence Lessig: So HR one has two ideas. Its real emphasis is on a matching-fund system. This is the way New York City funds most of its campaigns. A matching-fund system says that if you give a small amount, that gets matched, up to six to one, by the government, so that small contributions are much more valuable. And the theory is candidates would then be looking for more small contributions, which means they would be more responsive to the people who give small contributions than to the small number of people who give large contributions. Now, the other idea, the part that I think is really exciting and has enormous potential, something the city of Seattle has adopted, is something called vouchers.
So give every voter a voucher. Think of it as a packet of coupons that they can use to give to candidates to help fund their campaigns. If you're a candidate running for office, you know that every single voter in your district has a hundred dollars in their pocket that they can give you to help fund your campaign, and so you'll be raising money from ordinary people, not from the tiny fraction of the one percent who fund campaigns now. And if you raise money from those people, then you're responsive to those people. I think that has enormous potential to change the sensibilities of Congress, to make them more focused on what ordinary people care about as opposed to these big funders. And I think that would be absolutely the most important change that could be done. Now, in this last presidential primary there was an enormous focus on democracy reform. Every single major candidate, including, by the end, Joe Biden on the Democratic side and Bill Weld on the Republican side, promised within the first hundred days to pass fundamental democratic reform. And on the Democratic side, people like Andrew Yang and Kirsten Gillibrand and Bernie Sanders endorsed the idea of vouchers. Kirsten Gillibrand wanted to give people up to six hundred dollars in vouchers, two hundred dollars for every federal race: two hundred dollars for Congress, two hundred dollars for the Senate, two hundred dollars for the president. Andrew Yang said a hundred dollars. Bernie was never clear about how much money he was talking about. But they all recognized that if you could make it so that ordinary people were funding campaigns, you would have a Congress more focused on what ordinary people cared about, not on the billionaires and millionaires who are funding through their PACs or their large contributions.
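To make the arithmetic of these two mechanisms concrete, here is a minimal sketch in Python, assuming only the figures mentioned in the conversation: a six-to-one match on small contributions and Gillibrand-style vouchers of two hundred dollars per federal race. The names and parameters are illustrative, not drawn from any actual bill text.

```python
# Minimal sketch of the two campaign-funding mechanisms described above.
# Assumptions (figures from the conversation, not from any bill text):
#   - small-dollar contributions are matched six-to-one by the government
#   - a Gillibrand-style voucher gives each voter $200 per federal race

MATCH_RATIO = 6         # match ratio discussed for HR 1 / New York City
VOUCHER_PER_RACE = 200  # dollars per federal race (Gillibrand proposal)
FEDERAL_RACES = 3       # House, Senate, President


def matched_value(contribution: float, ratio: int = MATCH_RATIO) -> float:
    """Total a campaign receives from one small contribution after matching."""
    return contribution * (1 + ratio)


def voucher_allotment(per_race: int = VOUCHER_PER_RACE,
                      races: int = FEDERAL_RACES) -> int:
    """Total voucher dollars a single voter could direct across federal races."""
    return per_race * races


if __name__ == "__main__":
    # A $20 small-dollar donation is worth $140 to the campaign at six-to-one.
    print(matched_value(20.0))   # 140.0
    # $200 per race across three federal races is $600 per voter.
    print(voucher_allotment())   # 600
```

The point of the arithmetic is the incentive: matching multiplies the value of small donors, while vouchers give every voter, not just the wealthy, money a candidate has a reason to chase.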
Bob Lefsetz: Now, this all sounds great on the surface, but as someone who has seen multiple election cycles, I'm reminded of the famous George Carlin routine: you can vote if it makes you feel good, but the owners of the country are not going to give up power; they are really in control. And I look at, you know, Bernie Sanders, who certainly spoke to the public and raised all his money from individual contributions, as opposed to Biden, as opposed to Kamala, who ultimately couldn't raise that much money. It's hard to have confidence. The theory sounds good, but taxes have been denigrated to the point where everyone thinks taxes are bad, even people on the left. It's hard to believe, as a result of income inequality, of opportunities in education. You were also talking earlier about long-form discussion of problems and issues. I certainly grew up in the sixties, when public schools had enough money to educate people how to think; they were not teaching to the test. So is the system inherently broken, or will these band-aids fix it? Or is there an elite, and the funny thing, of course, is that the elite to a great degree is on the left, people who worked very hard and educated themselves to gain power, who have contempt for those who work with their hands or in service positions? Are these really solvable issues?

Lawrence Lessig: Under the present system, they are solvable. And the kind of changes I was just describing, for example, if you followed Kirsten Gillibrand's proposal and you gave everybody six hundred dollars for federal races, that's not a band-aid. That would be a fundamental change in how Washington works. Now, I think it's a really important question whether the Democrats will carry through.
You know, many skeptical, cynical people said that Nancy Pelosi was able to pass HR one with every single Democrat voting for it because they all knew Mitch McConnell would kill it in the Senate, so it was never going to get passed. And the same people say that if the Democrats take control of the Senate, and they have control of the House, and they have the presidency, don't expect reform to pass, because then all the money in the world will be against reform passing, and it will stop the Democrats from passing it. I think that's a fair challenge, and it's up to us to make them do it. It's up to us to say, we have finally come to the point where the vast majority of Americans realize that if we don't fix this corrupted, broken political system, nothing happens. Nothing happens. You know, you mentioned at the opening that I tried to run for president, and my campaign was expressly focused, exclusively focused, on passing fundamental reform. It was almost impossible to be able to run that campaign; there was this complicated game they played, changing the rules to keep me off the debate stage, partly because the issue was not salient four years ago. But now every single candidate has endorsed this idea of fundamental reform in the first hundred days of the administration. So the Democrats have said they will do it. I think there's a chance we get enough Democrats in Congress to actually do it, and then we've got to hold their feet to the fire. And if they don't do it, then God help them, there will be a revolution, and I will be at the barricades in the streets doing everything I can to drive it, because we can't get anything done in this country until we fix this problem. There's no climate change legislation that will pass until we fix this corrupted, broken system.
There's no healthcare reform that doesn't benefit the pharmaceutical companies or the insurance companies until we fix this corrupted, broken political system. Finally, people get that. And now the question is whether the Democrats will carry through, if in fact they're able to stop this president from stealing the next election.

Bob Lefsetz: Let's just assume election day happens and the ultimate result is up for grabs, the obvious example being the year two thousand. Many people believe that in the year two thousand, despite having David Boies on the Gore team, they were outmaneuvered by the Republican attorneys. If it's a jump ball, what should the Democrats do legally to ensure that they have the best chance of things coming out their way? There are so many issues. Who knows what the issues might ultimately be, but certainly there are issues of postmarks on absentee ballots, whether the ballots are filled out correctly. Then we certainly have people in charge of elections in states making unilateral decisions. And one can argue that in previous situations against the opposing team, the Democrats have been on their heels.

Lawrence Lessig: Well, it's absolutely true that this is going to be an incredibly complex election. I'm teaching a seminar in the fall called Wargaming, which tries to map out all the things that can go wrong, all the places where the system can be played. So if I were to advise the Democrats on the one thing they need to be doing in twenty twenty that they didn't do in two thousand, it's this: they need to assume the other side is not going to play fair. And I'm not saying that's going to happen, but I think one striking thing about two thousand was that so many times the Democrats were just astonished by what the Republicans did. I mean, even the core argument that actually won in the end, the argument about the Equal Protection Clause controlling or regulating how votes would be recounted.
When that argument was made, people on the left said, this is just crazy talk; there's no way such an argument could work. But of course it did work, and the Republicans were willing to do almost anything, including, remember, the kind of Brooks Brothers riot in Florida, where you had these people who had literally been bused in, pretending to be Florida voters, pounding on the doors, demanding that the hanging-chad ballots not be counted, or whatever, depending on the district. And I think the Democrats were caught off guard because they just couldn't imagine that that kind of behavior would be engaged in. Well, I think now people are being woken up to recognize that this president will not stop at anything. Literally, he's got everything to lose, not just the election. Given what we know about the criminal charges against his organization and his company, given what we know about the financial liability he faces, to, like, Deutsche Bank, an incredible amount of money outstanding, you can just see that he imagines that if he doesn't win, it's not just the election; his whole empire also collapses. And when you've got somebody against the wall and they know that everything disappears if they don't win, they're going to do whatever they can. And I'm genuinely terrified about whether our legal system is capable of adjudicating when one side is operating in such bad faith. I just don't know. We haven't seen good evidence that the court system will stand up to this so far, and what's to say they will stand up against it in the context of the election? And as I said at the very beginning, if that happens, I don't know what happens.
You know, George Floyd brought out, I think it was something like seven percent of America, it might have been seventeen, or some number around there, who turned out to protest because of that event. Take that, multiply it by three or four, and you have a country-destabilizing event. I don't know what the country is like if there is an election which is stolen and people react in the way that I expect they will react, given the fury that this man has triggered across the country.

Bob Lefsetz: Now, you obviously have a singular high profile yourself, but to what degree are you integrated with the DNC or Kamala or Joe? Is that something where you have interaction with them? Are the lines open, or are you on your own separate island?

Lawrence Lessig: Well, I give advice on reform issues to anybody who wants it, and I've given advice to a wide range of people, including people involved in these campaigns, so I will continue to do that. I'm not inside the campaign in the sense of running anything, and I don't expect to be, although my view is that Joe Biden needs to become the president, and I will do whatever it takes, whatever it takes that my father would approve of, to make sure that that happens.

Bob Lefsetz: Larry, this has been unbelievable. We could go on for hours, but we're gonna wear our ears out, and those of the audience. This has been incredibly edifying, especially for those people who don't follow the law on a week-by-week basis or even a year-by-year basis. Thanks so much for doing this.

Lawrence Lessig: Thank you for having me.

Bob Lefsetz: Until next time. This is Bob Lefsetz.