Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Last February, thirty-five-year-old Thomas Vargas received an envelope in the mail. It held a debit card with five hundred dollars loaded onto it.

Thomas Vargas: I was shocked. I kind of, like, thought it was not true, or it was fake.

Noah Feldman: But the money was real. Vargas had been randomly selected to take part in an eighteen-month universal basic income pilot program being run by the city of Stockton, California. Every month until July twenty twenty, he'll get another five hundred dollars and be able to spend that money on anything he wants.

Thomas Vargas: It's impacted my life greatly. I've been able to, I guess, better meet the goals that I wanted to do. I was able to go and sign back up for school. My kids are also able to get more schooling, or more tutoring, to stay ahead in their education. My health and my stress level are better. I find that I'm more energized, happier than I was before.
Noah Feldman: Vargas is hoping that the program will eventually expand to help more people on a national level.

Thomas Vargas: Yeah, I can see it helping a lot of people out in a major way. I mean, a lot of people struggle or whatever, and if they have the opportunity to receive something like this, they could sit there and get themselves out of the situation, or sit there and help to make it better in some way somehow. It's a long shot, but maybe it could happen.

Noah Feldman: In part thanks to Democratic presidential candidate Andrew Yang, who's made it a central part of his platform, universal basic income has actually gotten some buzz around it. What would it actually be like, though, if every American received some sort of set basic income from the government on a monthly basis? To get a sense of the way this could look, I spoke with Stockton's mayor, Michael Tubbs, and with Sukhi Samra, who's been tasked with overseeing Stockton's universal basic income pilot project. Mayor Michael Tubbs, I want to start by asking you how you ended up involved in this project.
I mean, here you are, six years out of college, a whole six years out of college, and the first African American mayor of your city, the youngest mayor in the history of your city. It's not the most obvious thing to take on as a high-profile signature project. So how did it come about?

Michael Tubbs: Well, before being mayor, I was on the city council for four years, and as a city council member representing the south part of the city, poverty was just frequently on my desk, as an issue we never talked about but were solving for with all these other means. So we would have conversations about things like employment, education and healthcare, access to healthy foods, and safety, and the crux of all those issues, in my opinion, was just the persistent rate of poverty and how people were just in areas of lack. So then, when I became mayor, I challenged my team to really think through what a city can do to really address issues of poverty. And I told my team, I said, I don't want anything I've heard before. I want it to be bold and a little bit crazy. And then they came back with this idea of a basic income.
I said, well, you know, you guys, it's one year in office. I would love to be re-elected, so maybe we'll wait until our second term to even begin this conversation, because it seemed like, particularly in a city that was just struggling back from bankruptcy, it was a non-starter. Fast forward to a week after my team came back with the idea of basic income: I was at a conference on the future of work in San Francisco, where I connected with Natalie Foster, one of the co-chairs of the Economic Security Project, and she said that she was looking for a city to pilot a basic income that would be philanthropically funded. And I said, well, I have a task force to investigate this issue.
So then we spent six months kind of going back and forth, talking about what it would look like and some of the values we shared, and I finally agreed to do it because I realized that too often the conversations about poverty, or the working poor, or even the economic structure of our country, position people, particularly people of color, particularly women, as passive victims, or as people who are lazy and make bad decisions and need a bunch of coaching and financial literacy to defeat poverty. And I thought that Stockton, given its diversity, given where we had started from, and just given the people I know as mayor, that it would be beautiful to position the people of Stockton, immigrants, people of color, women, young people, people who are working incredibly hard, at the center of a conversation around what does a guaranteed income look like in this country, but more importantly, what does it mean, and what type of society do we want to live in. So I decided to go ahead and launch the pilot.
Noah Feldman: Sukhi Samra, you're running the program, and I want to learn more about the details of how you're going about doing it. Usually universal basic income goes to everybody, but this is a pilot project, so the money's not going to everybody. So I'd like to know how many people are getting the money and, perhaps most significantly, how you chose them.

Sukhi Samra: So the question of who was going to be selected was determined after a pretty deep community engagement process. We announced the plan back in October of twenty seventeen, and after that we launched into a design phase where we asked the community, including elected officials, residents, and leaders of nonprofits, who they thought should benefit from the program. There were three common design ideals that were echoed over and over again and that ultimately helped us determine what our selection criteria would be. Those three design ideals were that the selection process be fair, so everyone had a roughly equal chance of being selected. That it be diverse and representative of Stockton.
So, recognizing that diversity is one of our greatest assets, making sure that the folks who were ultimately selected to receive the guaranteed income were representative of our diversity as well. And that we maximize our ability to learn. Stocktonians recognized the unique opportunity that we had to really influence the national conversation around not just a guaranteed income, but around an economy that's falling short for far too many Americans. So with those three design ideals, the selection criteria that we landed on were actually pretty simple. One was that folks had to be at least eighteen years of age or older, that they lived in Stockton, and that they lived in a neighborhood where the area median income was at forty-six thousand dollars or below. We settled on that number because that's Stockton's citywide median income as well. That was the selection criteria. From there, we actually did a randomized selection. So we worked with our research team to identify the qualifying neighborhoods where the median income was, again, at or below forty-six thousand dollars.
We randomly selected forty-two hundred households and sent them a letter inviting them to participate. That letter detailed the opportunities they had to participate, either as part of our treatment group, the folks who are receiving the five hundred dollars, or as part of our control group. Of the people that responded to that letter, and responding entailed going online and filling out a consent form, one hundred and twenty-five were randomly selected to be in our treatment group, to receive the five hundred dollars a month.

Noah Feldman: So basically, what you're describing is a pretty rigorous design of an experiment in a randomized model, which the social scientists like to say is the gold standard, the best way you can figure out whether something is working. It's too soon, I guess, to say whether it's working or not, and you don't want to break the structure of the experiment by theorizing too soon. But I guess the first question that makes me ask is: let's say it's great.
Let's say you find, after you look through all the data after July twenty twenty, that there were substantially better life outcomes for people who received this five-hundred-dollar-a-month payment over an eighteen-month period. What happens then? I mean, your city is a city, Mister Mayor, of just over three hundred thousand people, if I'm not mistaken, and that's a pretty far cry from one hundred and twenty-five people. Is there a plausible way to take this forward, assuming that it works well?

Michael Tubbs: No, no, absolutely. I think, for me, I view cities as laboratories of democracy, meaning that for any intervention, being a nerd, I like to test it and kind of see what works and kind of see impacts before making the case for scale, which is why I think pilots and philanthropy could be helpful in that regard. But to answer your question, I think for any program like this to take off at scale, it couldn't be city-led. I don't know of any city that has the resources to provide five hundred dollars a month to every resident.
But what we've seen, from design to implementation, is bold policy proposals and ways to get us closer to there. I think of California, where Governor Newsom has doubled the earned income tax credit, which is a form of giving folks cash and money through a tax credit. We see folks like Senator Harris, with her LIFT Act, who is calling for five hundred dollars a month, every month, to families in this country who make a hundred K or less, through the tax code. But I also want to take a step back, because I think if data actually drove decision-making, the world in general, and the society we live in, would look radically different. So I think the results from the study are important, and the data is important, but just as important are the stories that we're able to tell about people, people who are just like everyone else, that the rest of the body politic can see itself reflected in, who are doing good things with money and who are actually contributing.
Noah Feldman: Can I ask a question about that, though? I mean, you said two different things there, Mister Mayor, both of which are really interesting, but I think there might be a teeny bit of difference between them. So one is that data should drive more policymaking, and I'm very sympathetic to that view, and I think it's very impressive that you guys are doing a proper randomized controlled experiment so that there will be data. But then, if that's the case, you have to be open to the possibility that the data will reveal that the program didn't work, didn't achieve any significant effects on the group of people who've been getting the money. The other angle is the stories angle: that we want to get great stories out of a trial like this. And for that, I'm like one hundred percent sure that with one hundred and twenty-five people you'll be able to tell some great stories. There will be some stories of people who did amazing things and improved their lives with this extra amount of money. But that's not really the same thing as the data, right?
I mean, from the standpoint of an economist who looks at the randomized controlled experiment, the whole idea is to stay away from the stories, which are the kind of things that, with respect, politicians like. They like anecdotes; they like stories to inspire people. So which is it? I mean, is this a case where the stories will help drive policy going forward, or is this a case where you're going to rely on the hard data as a takeaway?

Michael Tubbs: Yeah, well, I think there's a bunch of different audiences and constituencies we're speaking to. So I think for the economists, and for those who actually use data and look at data, the data is important, and for me it is as well. But being a pragmatist and understanding not just politics but policy, and just how things happen in this country, narrative is an important part of that. And I think particularly when we're talking about basic income, we're also talking about issues of race. We're also talking about issues of who's deserving. We're also talking about issues attached to work and dignity and purpose and contribution.
We're talking about issues of misogyny and sexism. We're talking about all these very thorny issues that no amount of data has been able to move our country on as quickly as I would like to see us move. But I do think kind of humanizing, and putting stories and faces behind not just the data but the idea and the policy, will lead to some forward step. And I tell people all the time: the data, the research, that's all independent of me. I have no interaction with the researchers. I don't know who was selected. That's going to be independent. But regardless of what the data shows, I think it's also an opportunity to show just how hard people are working. I think the stories, if nothing else, will help illustrate that poor people aren't lazy, that people who struggle aren't less deserving, and that people have a dignity that's inherent to their humanity and not attached to what they're able to produce for somebody in terms of work.

Noah Feldman: So that's a really interesting point that you raise.
Sukhi, I actually wanted to ask you about this. In your talks with people in the community, and in thinking about the design and implementing it, have you encountered anybody who has said to you: look, I actually don't want this. My own sense of dignity is defined as dignity by work. I want to be able to show that I've worked for what I've received, and I don't want any part of it for that reason. Have you encountered people in Stockton who said, I don't want to be part of this program because I don't like the idea of a handout?

Sukhi Samra: We've definitely heard people say that they don't want the five hundred dollars a month. But I think that comes from the point that the Mayor raised earlier, from more of the racialized and gendered stereotypes that exist around the public welfare system at large.
A lot of folks, definitely when the guaranteed income first started, felt like there was going to be a catch, that there was no way that they could just get the five hundred dollars a month and there wouldn't be strings attached. And it took until at least the six- or seven-month mark before people finally started believing that a government or an elected official really could believe in them enough, and trust them enough, to give them five hundred dollars a month without there being anything attached, without it needing to be attached to work. And what we see is that the five hundred dollars a month is really just an income floor. The five hundred dollars a month is not stopping folks from working; if anything, it's allowing people to find better jobs.
So I think, to answer your question, our country definitely has a long history of attaching values to the money, and to the quote-unquote handouts that we give to our folks, and it's taken our recipients a couple of months to really break that down and realize that an elected official really is here to see them thrive, regardless of what they do with the cash.

Noah Feldman: So you recently graduated from Stanford, and that's in the heart of Silicon Valley, and the Economic Security Project also comes out of Silicon Valley. Recently, Andrew Yang, who's a former tech entrepreneur, has put the topic of universal basic income on the national agenda in his presidential run. What is it about Silicon Valley and universal basic income? Why is there some kind of a match? It seems even Elon Musk has embraced the idea. Why is there this kind of association between the Valley and tech on the one hand, and the idea of a universal basic income on the other?
Sukhi Samra: I think the close association between tech and universal basic income is probably rooted in the fact that the folks who are in Silicon Valley are the closest to automation and artificial intelligence, and so when they're thinking about the economy, they're generally looking at it from a more future-oriented lens, in which they see that, you know, the robots are coming, which will lead to the displacement of a lot of workers. That will in turn disproportionately impact workers originally thought of as quote-unquote low-skilled, who tend to be people of color, and a universal basic income is posited as one solution that could mitigate the effects of automation. I think for folks who are a little bit more removed from Silicon Valley, a guaranteed income is posed as a solution to an economy that's falling short for far too many Americans today. So while we recognize that the robots are indeed coming, it's also a matter of: today, one in four Stocktonians lives in poverty, and today we're eighteenth in the nation for child poverty.
So how can we use a guaranteed income to address that and make the economy better in this moment?

Noah Feldman: Mayor Tubbs, can I ask you: if I were designing a universal basic income program, I wouldn't want it to be universal. I mean, one of the things that I thought was impressive about your program is that you targeted it to some degree at people of lower income, just by choosing neighborhoods that were at or below the city's median income line. I mean, that's not narrowly targeting poverty, but it's at least trying to avoid giving away too much of this money to people who genuinely don't need it. And I'm wondering, how do you feel about the universal part of universal basic income? I mean, my instinct, and again this is just speaking for myself, is that there's something crazy about giving this kind of money to people who don't need it. And I understand that the logic behind it is that it gives greater dignity if it's available universally to everybody. But there you're just trading off dignity against significant redistribution of income, in a situation where programs are extremely costly.
So how do you 320 00:17:13,476 --> 00:17:17,236 Speaker 1: feel about the universal part of universal income? I'm not 321 00:17:17,276 --> 00:17:22,356 Speaker 1: gonna lie, I actually struggle. I struggle with that. Yeah. 322 00:17:22,436 --> 00:17:26,796 Speaker 1: And speaking candidly, I think if it was to go 323 00:17:26,876 --> 00:17:30,716 Speaker 1: to scale, it probably couldn't start as universal anyway. I mean, 324 00:17:30,836 --> 00:17:33,676 Speaker 1: Chris Hughes talks about doing some sort of financial tax 325 00:17:34,036 --> 00:17:36,836 Speaker 1: which would help folks making fifty K and below with an 326 00:17:36,876 --> 00:17:40,076 Speaker 1: income floor, Senator Harris with hers at a hundred K and below 327 00:17:40,116 --> 00:17:42,716 Speaker 1: for families. I think we have to start there and iterate. 328 00:17:43,236 --> 00:17:47,076 Speaker 1: But I would say I've actually begun to understand 329 00:17:47,276 --> 00:17:51,036 Speaker 1: why universality makes sense, and I think it goes to 330 00:17:51,076 --> 00:17:54,556 Speaker 1: your point around kind of dignity, but also re-establishing 331 00:17:54,836 --> 00:17:59,196 Speaker 1: this idea of the commons and extending 332 00:17:59,196 --> 00:18:02,276 Speaker 1: the social contract. So when I think of a lot 333 00:18:02,316 --> 00:18:06,956 Speaker 1: of sort of public goods, whether it's public schools, even 334 00:18:06,996 --> 00:18:10,036 Speaker 1: extremely wealthy people have access to those things, although they 335 00:18:10,076 --> 00:18:12,636 Speaker 1: may not need them. I think that it does 336 00:18:12,836 --> 00:18:15,956 Speaker 1: build sort of the body politic, if you will.
That, 337 00:18:16,236 --> 00:18:19,756 Speaker 1: and this idea of dignity, has kept resurfacing, in terms 338 00:18:19,756 --> 00:18:24,436 Speaker 1: of people don't want to feel like they're needy 339 00:18:24,556 --> 00:18:27,916 Speaker 1: or they're less than. They want to feel like, oh, 340 00:18:27,956 --> 00:18:30,596 Speaker 1: this is something that we're all getting. And I 341 00:18:30,596 --> 00:18:33,556 Speaker 1: do think there's some value to that. 342 00:18:33,596 --> 00:18:35,516 Speaker 1: But even when we were designing the program, I 343 00:18:35,556 --> 00:18:39,196 Speaker 1: really struggled with the at-or-below-the-median- 344 00:18:39,236 --> 00:18:41,996 Speaker 1: income line, or the fact that there's some people 345 00:18:42,036 --> 00:18:44,716 Speaker 1: receiving the money who make more than me, who make 346 00:18:44,756 --> 00:18:48,156 Speaker 1: eighty or a hundred K. But then, well, I've also learned 347 00:18:48,156 --> 00:18:51,396 Speaker 1: that even people who make eighty, ninety, one fifty, two 348 00:18:51,476 --> 00:18:55,316 Speaker 1: hundred K, particularly in areas that are a lot 349 00:18:55,396 --> 00:18:58,396 Speaker 1: less affordable than Stockton, are still struggling, and 350 00:18:58,476 --> 00:19:01,396 Speaker 1: still have high rates of student debt, still may 351 00:19:01,436 --> 00:19:04,836 Speaker 1: not have the best healthcare, and still could use a 352 00:19:04,876 --> 00:19:08,316 Speaker 1: little income cushion as well.
So I've relented on 353 00:19:08,396 --> 00:19:11,196 Speaker 1: that fact. And also, 354 00:19:11,356 --> 00:19:13,236 Speaker 1: last thing, obviously, I also think in terms of what's 355 00:19:13,276 --> 00:19:17,796 Speaker 1: politically possible, given the history of this country and the 356 00:19:17,836 --> 00:19:22,036 Speaker 1: way we other folks, the only way something like this 357 00:19:22,116 --> 00:19:25,076 Speaker 1: would happen is if the majority of 358 00:19:25,076 --> 00:19:27,996 Speaker 1: people, particularly people who vote, which are particularly middle- 359 00:19:27,996 --> 00:19:31,916 Speaker 1: class people, can see themselves reflected, actually benefiting tangibly from 360 00:19:31,916 --> 00:19:34,356 Speaker 1: said program. So for those reasons, I think it has 361 00:19:34,356 --> 00:19:37,756 Speaker 1: to be, if not universal, as close to universal as possible. 362 00:19:38,756 --> 00:19:40,396 Speaker 1: I mean, you ended that answer in a little 363 00:19:40,396 --> 00:19:42,796 Speaker 1: bit different place than the one where you started. I mean, 364 00:19:42,876 --> 00:19:46,156 Speaker 1: when it comes to public schools, I see the 365 00:19:46,236 --> 00:19:48,356 Speaker 1: argument very much that we actually have an incentive as 366 00:19:48,356 --> 00:19:51,036 Speaker 1: a society to try to get the people who have 367 00:19:51,076 --> 00:19:54,396 Speaker 1: the greatest resources to stay in the public schools, precisely 368 00:19:54,436 --> 00:19:56,276 Speaker 1: so we will have a common sense of community and 369 00:19:56,276 --> 00:19:59,036 Speaker 1: so that people with resources will invest to the extent 370 00:19:59,076 --> 00:20:03,596 Speaker 1: possible in those public institutions, rather than diverting their resources 371 00:20:03,636 --> 00:20:07,316 Speaker 1: and their efforts to private institutions.
Here, where you're talking 372 00:20:07,316 --> 00:20:12,076 Speaker 1: about essentially a check, you know, without a social service support 373 00:20:12,156 --> 00:20:15,796 Speaker 1: network associated with it, it does seem to me that 374 00:20:15,916 --> 00:20:18,956 Speaker 1: it's harder to make the case that making it 375 00:20:19,036 --> 00:20:21,716 Speaker 1: universal will give everybody a sense of buy-in. I do 376 00:20:21,836 --> 00:20:25,156 Speaker 1: understand that it might reduce the sense of indignity that 377 00:20:25,236 --> 00:20:27,436 Speaker 1: some people might suffer, and then we're just trading 378 00:20:27,436 --> 00:20:30,876 Speaker 1: off, in a very hard, tragic choice, people's feelings of 379 00:20:30,916 --> 00:20:33,356 Speaker 1: dignity against the question of how much money we'll 380 00:20:33,436 --> 00:20:38,236 Speaker 1: actually have to hand out. Suki, I wonder, 381 00:20:38,476 --> 00:20:41,396 Speaker 1: what are the biggest challenges that you've been facing in implementation? 382 00:20:42,036 --> 00:20:45,436 Speaker 1: What are the problems that you've faced so far? I 383 00:20:45,476 --> 00:20:47,756 Speaker 1: think one of the biggest challenges that we've faced has 384 00:20:47,796 --> 00:20:50,396 Speaker 1: been, and Mayor Tubbs says this all the time, that change 385 00:20:50,396 --> 00:20:52,396 Speaker 1: moves at the speed of trust. The same is definitely 386 00:20:52,436 --> 00:20:57,356 Speaker 1: true for program implementation. So we've seen, since we launched 387 00:20:57,356 --> 00:20:59,436 Speaker 1: at the very beginning of February, the ways in which 388 00:20:59,476 --> 00:21:02,196 Speaker 1: a lack of trust in our institutions has impacted our 389 00:21:02,196 --> 00:21:05,196 Speaker 1: ability to implement. So I'll give a couple of examples.
390 00:21:05,596 --> 00:21:08,436 Speaker 1: The first is that when we sent out our letters, 391 00:21:09,316 --> 00:21:12,716 Speaker 1: as I mentioned earlier, for our recruitment we did address-based recruitment, 392 00:21:13,036 --> 00:21:15,956 Speaker 1: and we sent letters out to randomly selected addresses inviting 393 00:21:15,996 --> 00:21:18,636 Speaker 1: them to participate. So when we sent the letters out, 394 00:21:18,676 --> 00:21:21,716 Speaker 1: we originally only sent a thousand letters, because we didn't 395 00:21:21,716 --> 00:21:24,276 Speaker 1: want to fabricate false hope in a city where there's 396 00:21:24,316 --> 00:21:26,036 Speaker 1: a lot of need and a lot of folks who 397 00:21:26,236 --> 00:21:28,636 Speaker 1: need an extra five hundred dollars a month. But when 398 00:21:28,636 --> 00:21:31,396 Speaker 1: we sent those thousand letters out and we started 399 00:21:31,396 --> 00:21:34,116 Speaker 1: monitoring the response rates, we realized that though we had 400 00:21:34,276 --> 00:21:37,076 Speaker 1: enough responses to fill the treatment group, we didn't necessarily 401 00:21:37,116 --> 00:21:40,076 Speaker 1: have a representative sample of Stockton, and not as many 402 00:21:40,116 --> 00:21:42,396 Speaker 1: folks responded as we hoped. And so then we sent 403 00:21:42,476 --> 00:21:44,516 Speaker 1: out... And you thought that their responses might actually have 404 00:21:44,556 --> 00:21:48,836 Speaker 1: been skewed by education or governmental trust or some other thing.
405 00:21:48,876 --> 00:21:50,156 Speaker 1: I mean, I certainly have to say that if I 406 00:21:50,196 --> 00:21:53,236 Speaker 1: got a letter in the mail saying, you know, hey, 407 00:21:53,236 --> 00:21:56,436 Speaker 1: would you like to join a survey, just a pilot program, 408 00:21:56,436 --> 00:21:57,876 Speaker 1: and we're going to pay you an extra five hundred 409 00:21:57,876 --> 00:21:59,836 Speaker 1: dollars a month for nothing, I think I would be 410 00:21:59,876 --> 00:22:02,476 Speaker 1: highly skeptical of that. I think I would be very 411 00:22:02,516 --> 00:22:05,956 Speaker 1: doubtful. Exactly, and we heard that; it's something that's been 412 00:22:06,036 --> 00:22:08,836 Speaker 1: echoed in what we've heard from our recipients over and 413 00:22:08,916 --> 00:22:12,156 Speaker 1: over again. So when the recipients were notified that they 414 00:22:12,156 --> 00:22:14,716 Speaker 1: had been selected to participate in SEED, they were brought 415 00:22:14,756 --> 00:22:17,436 Speaker 1: in for an onboarding session, which is a one-on- 416 00:22:17,436 --> 00:22:20,836 Speaker 1: one orientation with a member of the SEED staff. And 417 00:22:20,876 --> 00:22:24,076 Speaker 1: even in that orientation session, folks were expressing their disbelief 418 00:22:24,316 --> 00:22:26,996 Speaker 1: and their distrust that, in a time when the federal 419 00:22:26,996 --> 00:22:30,796 Speaker 1: government was shut down, in a time when the ICE 420 00:22:30,956 --> 00:22:34,156 Speaker 1: raids were increasing and more and more folks of Hispanic 421 00:22:34,276 --> 00:22:38,756 Speaker 1: and Latinx origin were actually just hiding themselves in their homes, 422 00:22:38,796 --> 00:22:41,396 Speaker 1: an elected official could actually want to give something 423 00:22:41,396 --> 00:22:46,436 Speaker 1: for nothing.
So that lack of trust, and, you know, 424 00:22:46,836 --> 00:22:48,716 Speaker 1: thinking that we were a scam, that we just wanted to 425 00:22:48,756 --> 00:22:52,196 Speaker 1: collect information for other purposes, that was a common thread 426 00:22:52,236 --> 00:22:54,076 Speaker 1: that really impacted our ability to implement when we were 427 00:22:54,196 --> 00:22:57,116 Speaker 1: just getting started. Mayor Tubbs, you know, 428 00:22:57,596 --> 00:23:01,516 Speaker 1: all politics is local, and the retail politics of a 429 00:23:01,556 --> 00:23:04,356 Speaker 1: city of three hundred thousand plus is very, very local. 430 00:23:04,436 --> 00:23:06,356 Speaker 1: So I'm sure you know all your neighborhoods, and I'm 431 00:23:06,356 --> 00:23:07,996 Speaker 1: sure on a lot of blocks you know a lot 432 00:23:07,996 --> 00:23:09,516 Speaker 1: of your constituents, and if you want to get re- 433 00:23:09,516 --> 00:23:11,916 Speaker 1: elected, you've got to be out there, you know, 434 00:23:11,996 --> 00:23:16,516 Speaker 1: hearing and talking to people. Are you worried about jealousy 435 00:23:16,676 --> 00:23:20,516 Speaker 1: from neighbors who say, so-and-so got lucky? You know, 436 00:23:20,596 --> 00:23:23,156 Speaker 1: Thomas got lucky, and you know, the city gave him 437 00:23:23,196 --> 00:23:26,116 Speaker 1: nine thousand dollars, not so much me and my neighbor 438 00:23:26,156 --> 00:23:27,836 Speaker 1: and my other neighbor. I mean, one hundred and twenty- 439 00:23:27,836 --> 00:23:31,116 Speaker 1: five families or people is enough people to be 440 00:23:31,196 --> 00:23:34,356 Speaker 1: known about, but it's also a small enough number that 441 00:23:34,436 --> 00:23:37,116 Speaker 1: a significant number of people might be kind of jealous. 442 00:23:37,436 --> 00:23:40,436 Speaker 1: Have you encountered that yet? Are you imagining that that 443 00:23:40,476 --> 00:23:42,756 Speaker 1: could happen?
Or do you think everyone in Stockton is 444 00:23:42,796 --> 00:23:45,716 Speaker 1: such a giving, big person that they wouldn't mind that 445 00:23:45,756 --> 00:23:48,236 Speaker 1: their neighbor makes nine thousand dollars and they don't? So 446 00:23:48,316 --> 00:23:50,276 Speaker 1: what's fascinating is that we also had this 447 00:23:50,396 --> 00:23:55,636 Speaker 1: conversation during the design phase, and what we've found since 448 00:23:55,756 --> 00:23:59,156 Speaker 1: disbursements is that most folks are comfortable, in that I 449 00:23:59,236 --> 00:24:03,556 Speaker 1: didn't pick, it was random. People believe 450 00:24:03,596 --> 00:24:05,236 Speaker 1: in the randomness. They think it really was random. A 451 00:24:05,236 --> 00:24:06,956 Speaker 1: lot of people believe in the randomness because it's people who 452 00:24:06,956 --> 00:24:09,636 Speaker 1: hate me who got envelopes, who said, there's no way, 453 00:24:10,116 --> 00:24:11,836 Speaker 1: if he saw how I talked about him on Facebook, 454 00:24:11,876 --> 00:24:13,596 Speaker 1: that they'd have gotten this envelope, right? So I think the 455 00:24:14,796 --> 00:24:19,156 Speaker 1: random lottery really kind of inoculated us from all that. 456 00:24:19,436 --> 00:24:21,716 Speaker 1: I think part of it is I've been very clear 457 00:24:22,316 --> 00:24:26,716 Speaker 1: with the community that there's all types of benefits programs and 458 00:24:26,796 --> 00:24:30,396 Speaker 1: nonprofit services in this city that do good work that 459 00:24:30,436 --> 00:24:33,316 Speaker 1: doesn't serve everybody, and it doesn't make them less good. 460 00:24:34,436 --> 00:24:38,196 Speaker 1: And this is not a departure... Very good argument. Exactly, not a 461 00:24:38,236 --> 00:24:41,036 Speaker 1: departure from the status quo.
It's just another program 462 00:24:41,076 --> 00:24:43,636 Speaker 1: that everyone has a chance to qualify for if you 463 00:24:43,716 --> 00:24:46,476 Speaker 1: meet these requirements, but everybody won't qualify for. And I 464 00:24:46,556 --> 00:24:49,636 Speaker 1: think for the most part people have been accepting 465 00:24:49,676 --> 00:24:52,436 Speaker 1: of that. And then I also think that most of 466 00:24:52,436 --> 00:24:57,076 Speaker 1: the recipients are not telling people that they received it, 467 00:24:57,196 --> 00:25:00,796 Speaker 1: which is their choice. That's probably clever. Yeah, probably smart. 468 00:25:01,316 --> 00:25:04,156 Speaker 1: But when we were first designing it, that was actually 469 00:25:04,196 --> 00:25:06,276 Speaker 1: a big issue, because people were saying, you're just gonna 470 00:25:06,276 --> 00:25:08,756 Speaker 1: pick your friends, you're going to pick this certain neighborhood 471 00:25:09,236 --> 00:25:12,236 Speaker 1: where you're from. A subtext was I was only going to 472 00:25:12,276 --> 00:25:17,116 Speaker 1: pick Black people. So the random... Interesting. Yeah, very interesting. 473 00:25:17,876 --> 00:25:20,116 Speaker 1: But the random lottery actually helped to inoculate us from 474 00:25:20,116 --> 00:25:21,916 Speaker 1: a lot of that moving forward, which bodes well for 475 00:25:21,916 --> 00:25:24,356 Speaker 1: all of us. I want to ask both of you 476 00:25:24,436 --> 00:25:28,116 Speaker 1: about what I see as sort of the big national- 477 00:25:28,196 --> 00:25:31,796 Speaker 1: debate downside of the universal basic income question, and it's this.
478 00:25:32,516 --> 00:25:35,036 Speaker 1: It seems to me that some of the supporters of 479 00:25:35,156 --> 00:25:39,276 Speaker 1: universal basic income sort of see it as a substitute 480 00:25:39,796 --> 00:25:44,956 Speaker 1: for all other forms of social service programs, for unemployment benefits, 481 00:25:45,156 --> 00:25:48,876 Speaker 1: for welfare, because I think in the minds of some people, 482 00:25:48,876 --> 00:25:54,076 Speaker 1: and especially some kind of libertarian-leaning tech types, the idea is, 483 00:25:54,076 --> 00:25:56,476 Speaker 1: you know, government programs don't work very well, so let's 484 00:25:56,476 --> 00:25:59,756 Speaker 1: simplify by eliminating those government programs and then using the 485 00:25:59,796 --> 00:26:02,636 Speaker 1: money we save from those government programs to provide 486 00:26:02,676 --> 00:26:07,116 Speaker 1: universal basic income. If your program ends up working well, 487 00:26:07,476 --> 00:26:10,556 Speaker 1: are you concerned that it might provide data that actually 488 00:26:10,636 --> 00:26:13,716 Speaker 1: is then used to support the position that says this 489 00:26:13,796 --> 00:26:17,996 Speaker 1: is our solution to our social ills, going alongside the 490 00:26:18,036 --> 00:26:21,836 Speaker 1: elimination of some traditional public assistance programs? Suki, maybe 491 00:26:21,876 --> 00:26:24,596 Speaker 1: let's start with you. Sure. So the good thing is 492 00:26:24,676 --> 00:26:27,636 Speaker 1: that for us, guaranteed income is really meant to 493 00:26:27,636 --> 00:26:30,316 Speaker 1: be supplemental to existing benefits, so a lot of our 494 00:26:30,356 --> 00:26:34,036 Speaker 1: folks are actually receiving other public benefits.
So just strictly 495 00:26:34,116 --> 00:26:37,316 Speaker 1: technically speaking, the data won't be able to really isolate 496 00:26:37,356 --> 00:26:40,596 Speaker 1: the effects of a guaranteed income. Good technical answer. 497 00:26:40,676 --> 00:26:42,956 Speaker 1: I like that. And I'll let the Mayor take 498 00:26:42,956 --> 00:26:47,036 Speaker 1: the visionary answer. Well, Noah, I think for me, I've been 499 00:26:47,236 --> 00:26:53,476 Speaker 1: very, very, very clear that a basic income, for me, 500 00:26:53,516 --> 00:26:56,396 Speaker 1: it's not progress to replace what we've built to build 501 00:26:56,436 --> 00:27:00,996 Speaker 1: something else. And I come to basic income not from 502 00:27:00,996 --> 00:27:04,276 Speaker 1: a libertarian tradition, but from a civil rights tradition, from 503 00:27:04,356 --> 00:27:06,956 Speaker 1: the work of Doctor King and the Black Panthers and others, 504 00:27:07,756 --> 00:27:11,916 Speaker 1: in that we need those other things as well. The 505 00:27:11,956 --> 00:27:14,956 Speaker 1: more we extend the social safety net, the better off 506 00:27:15,036 --> 00:27:19,156 Speaker 1: we are. So I definitely think that what's interesting about 507 00:27:19,196 --> 00:27:23,916 Speaker 1: this conversation is that there's so many different, usually competing 508 00:27:23,996 --> 00:27:29,116 Speaker 1: interests around this solution.
But I think for me, my 509 00:27:29,436 --> 00:27:32,796 Speaker 1: stake in the ground and my clarity is that no, 510 00:27:32,916 --> 00:27:35,676 Speaker 1: this is meant to be additive, to be an extension 511 00:27:36,076 --> 00:27:39,116 Speaker 1: of our social safety net, an extension of the social contract; 512 00:27:39,236 --> 00:27:41,476 Speaker 1: it's to build on what we already have, because everything 513 00:27:41,476 --> 00:27:44,716 Speaker 1: we have isn't broken, but there's a realization that everything 514 00:27:44,716 --> 00:27:46,916 Speaker 1: we have is also imperfect, and people are still struggling. 515 00:27:47,476 --> 00:27:51,116 Speaker 1: I think the idea of this experiment is terrific, and 516 00:27:51,596 --> 00:27:55,076 Speaker 1: here's hoping that the data support you and that it 517 00:27:55,116 --> 00:27:58,396 Speaker 1: makes a real contribution to the ongoing question of what 518 00:27:58,436 --> 00:28:00,316 Speaker 1: we can do to make our country a fairer and 519 00:28:00,316 --> 00:28:02,916 Speaker 1: a more just place going forward. And I'm really grateful 520 00:28:02,916 --> 00:28:04,716 Speaker 1: to both of you for joining me. Thank you. Thank 521 00:28:04,756 --> 00:28:13,756 Speaker 1: you so much for having us. Thank you. Universal basic 522 00:28:13,796 --> 00:28:18,996 Speaker 1: income is a genuinely fascinating experiment. And if we think 523 00:28:19,276 --> 00:28:22,516 Speaker 1: that in the foreseeable future some huge number of Americans 524 00:28:22,516 --> 00:28:25,316 Speaker 1: are actually going to lose their jobs as a result 525 00:28:25,396 --> 00:28:30,316 Speaker 1: of increasing automation and machine learning tools, then we kind 526 00:28:30,316 --> 00:28:32,956 Speaker 1: of have to try.
That is to say, we need 527 00:28:32,996 --> 00:28:36,436 Speaker 1: to look for every option that we might have to 528 00:28:36,556 --> 00:28:39,636 Speaker 1: enable people to keep some stream of income in the 529 00:28:39,636 --> 00:28:44,236 Speaker 1: face of what could be devastating unemployment. Stockton, California is 530 00:28:44,236 --> 00:28:47,636 Speaker 1: a pretty small-sized experiment, but it may produce data 531 00:28:47,796 --> 00:28:50,756 Speaker 1: that will enable us to make wiser judgments about whether 532 00:28:50,876 --> 00:28:53,756 Speaker 1: universal basic income should be part of how we go 533 00:28:53,836 --> 00:28:57,476 Speaker 1: about solving the problem of unemployment in the future. In 534 00:28:57,516 --> 00:29:00,996 Speaker 1: any case, it's a fascinating story from an unusual place, 535 00:29:01,276 --> 00:29:02,716 Speaker 1: and we're going to learn a lot about it in 536 00:29:02,716 --> 00:29:06,676 Speaker 1: the future. And now, the sound of the week. I'm 537 00:29:06,716 --> 00:29:13,196 Speaker 1: gathered here with dozens of my congressional colleagues underground in 538 00:29:13,236 --> 00:29:16,876 Speaker 1: the basement of the Capitol, because if behind those doors 539 00:29:16,916 --> 00:29:21,556 Speaker 1: they intend to overturn the results of an American presidential election, 540 00:29:22,036 --> 00:29:25,316 Speaker 1: we want to know what's going on. That's Congressman Matt 541 00:29:25,356 --> 00:29:29,356 Speaker 1: Gaetz from Florida this Wednesday, when a group of Republicans 542 00:29:29,556 --> 00:29:33,636 Speaker 1: tried to barge into closed-door impeachment hearings being led 543 00:29:33,676 --> 00:29:37,356 Speaker 1: by the House Intelligence Committee.
A lot has happened in 544 00:29:37,356 --> 00:29:40,876 Speaker 1: the impeachment inquiry this week, and what's remarkable is that 545 00:29:41,356 --> 00:29:46,836 Speaker 1: until now, almost everything significant has happened behind closed doors. 546 00:29:47,996 --> 00:29:51,036 Speaker 1: Democrats, who are controlling the impeachment inquiry, have a couple 547 00:29:51,076 --> 00:29:54,836 Speaker 1: of justifications for the secrecy. The first thing they say 548 00:29:54,956 --> 00:29:57,876 Speaker 1: is that matters of national security could potentially be involved. 549 00:29:58,156 --> 00:30:00,836 Speaker 1: And in fact, the House Intelligence Committee is accustomed to 550 00:30:00,836 --> 00:30:03,676 Speaker 1: doing all kinds of secret hearings, which it does in 551 00:30:03,676 --> 00:30:06,916 Speaker 1: a special facility called a SCIF, which is exactly the 552 00:30:06,956 --> 00:30:09,876 Speaker 1: facility that the Republicans wanted to get into this week. 553 00:30:10,636 --> 00:30:13,996 Speaker 1: Another explanation the Democrats provide is to say that at 554 00:30:13,996 --> 00:30:18,476 Speaker 1: an early stage in any criminal investigation, you don't talk 555 00:30:18,516 --> 00:30:21,996 Speaker 1: to the witnesses in public; you talk to them in private. 556 00:30:22,836 --> 00:30:25,236 Speaker 1: Yet a third thing Democrats have been saying is that 557 00:30:25,316 --> 00:30:29,236 Speaker 1: these aren't hearings at all; they're depositions, in which sworn 558 00:30:29,316 --> 00:30:34,236 Speaker 1: statements are taken from witnesses.
Now, all of these justifications 559 00:30:34,276 --> 00:30:37,156 Speaker 1: have some truth to them, and it's also true that, 560 00:30:37,236 --> 00:30:41,716 Speaker 1: historically, investigators in earlier impeachment inquiries have done lots 561 00:30:41,716 --> 00:30:45,596 Speaker 1: of their questioning of witnesses in private before reaching the 562 00:30:45,636 --> 00:30:51,196 Speaker 1: point of public hearings. But these arguments also have distinct limitations. 563 00:30:51,876 --> 00:30:55,316 Speaker 1: The main limitation is that an impeachment process is ultimately 564 00:30:55,316 --> 00:30:59,276 Speaker 1: a political process. It's political in the sense that impeachment 565 00:30:59,356 --> 00:31:02,036 Speaker 1: is conducted by the House of Representatives, which is an 566 00:31:02,076 --> 00:31:05,476 Speaker 1: elected body, and any subsequent trial takes place in front of 567 00:31:05,516 --> 00:31:08,756 Speaker 1: the Senate, which is also an elected body. That means 568 00:31:08,796 --> 00:31:11,716 Speaker 1: that when the people's representatives are going to be directly involved 569 00:31:11,756 --> 00:31:15,436 Speaker 1: in doing the investigation, in mounting the charges, and ultimately 570 00:31:15,476 --> 00:31:18,396 Speaker 1: in deciding whether they're true or not, the public is 571 00:31:18,476 --> 00:31:20,836 Speaker 1: always going to want to have a say. And for 572 00:31:20,876 --> 00:31:23,236 Speaker 1: the public to have a say, the public has to 573 00:31:23,276 --> 00:31:26,756 Speaker 1: have information. And so, at least in my view, the 574 00:31:26,796 --> 00:31:29,516 Speaker 1: Democrats are reaching the point where they're running out of 575 00:31:29,596 --> 00:31:33,556 Speaker 1: room to continue the private, secret side of the inquiry.
576 00:31:33,596 --> 00:31:35,836 Speaker 1: They're going to have to pretty soon do what they've 577 00:31:35,836 --> 00:31:39,796 Speaker 1: been promising for a while now: shift to public hearings 578 00:31:39,836 --> 00:31:43,356 Speaker 1: in which the entire world and the TV cameras will 579 00:31:43,396 --> 00:31:46,036 Speaker 1: be able to see what the witnesses have to say. 580 00:31:46,836 --> 00:31:49,276 Speaker 1: Now, that might seem simple or easy, but from the 581 00:31:49,276 --> 00:31:52,636 Speaker 1: standpoint of the Democrats, it's not going to be. The 582 00:31:52,716 --> 00:31:56,436 Speaker 1: big challenges that the Democrats face are really two. First, 583 00:31:56,876 --> 00:32:00,076 Speaker 1: we're already starting to know the story. We more or 584 00:32:00,156 --> 00:32:03,316 Speaker 1: less know what the allegations against Donald Trump are, and 585 00:32:03,476 --> 00:32:06,556 Speaker 1: we more or less know that they're true. The President 586 00:32:06,876 --> 00:32:10,596 Speaker 1: has already more or less acknowledged that he pressured Ukraine 587 00:32:10,796 --> 00:32:15,476 Speaker 1: to investigate Joe and Hunter Biden. As a consequence, when 588 00:32:15,476 --> 00:32:18,796 Speaker 1: the witnesses appear and testify, the news media may be 589 00:32:18,916 --> 00:32:22,396 Speaker 1: a little bit less interested in the details of exactly 590 00:32:22,436 --> 00:32:25,436 Speaker 1: who did what and when than they've been until now, 591 00:32:25,636 --> 00:32:29,516 Speaker 1: when we've gotten a steady diet of leaked secret information. 592 00:32:29,916 --> 00:32:32,556 Speaker 1: After all, it's pretty exciting to have leaked secret information 593 00:32:32,596 --> 00:32:34,996 Speaker 1: on the front page.
It's a lot less exciting to 594 00:32:35,036 --> 00:32:37,716 Speaker 1: have a public hearing where the witness repeats what he 595 00:32:37,836 --> 00:32:40,236 Speaker 1: or she has already said secretly and which has already 596 00:32:40,276 --> 00:32:42,796 Speaker 1: been leaked on the front page of the newspaper. The 597 00:32:42,836 --> 00:32:45,396 Speaker 1: other big challenge the Democrats are going to face is 598 00:32:45,436 --> 00:32:49,956 Speaker 1: that in public, House Republicans are sure to emphasize in 599 00:32:49,996 --> 00:32:53,756 Speaker 1: their questions to witnesses the allegation with which Donald Trump 600 00:32:53,756 --> 00:32:57,316 Speaker 1: began this entire process, namely, the allegation that Joe and 601 00:32:57,396 --> 00:33:02,196 Speaker 1: Hunter Biden were both participants in an illicit scheme to 602 00:33:02,276 --> 00:33:07,316 Speaker 1: influence the Ukrainian prosecutor to drop a prosecution against the company 603 00:33:07,476 --> 00:33:12,196 Speaker 1: for which Hunter Biden was working. Now, Biden has said 604 00:33:12,396 --> 00:33:14,476 Speaker 1: that neither he nor his son did anything wrong, and 605 00:33:14,556 --> 00:33:17,596 Speaker 1: almost all Democrats have continued to insist on exactly that. 606 00:33:18,276 --> 00:33:21,676 Speaker 1: The reality is, however, that when the Bidens come to 607 00:33:21,676 --> 00:33:25,076 Speaker 1: be attacked day by day by day, hour by hour 608 00:33:25,196 --> 00:33:28,676 Speaker 1: by hour, on national television, that is going to have 609 00:33:28,756 --> 00:33:32,556 Speaker 1: some distracting effect on the Democrats' effort to keep things 610 00:33:32,556 --> 00:33:36,516 Speaker 1: focused, laser-like, on Donald Trump's wrongdoing.
In fact, it 611 00:33:36,516 --> 00:33:39,876 Speaker 1: could pose a very significant danger to the Bidens and 612 00:33:40,156 --> 00:33:43,076 Speaker 1: an even more significant challenge to the momentum that the 613 00:33:43,116 --> 00:33:46,356 Speaker 1: Democrats have managed to produce in the context of their 614 00:33:46,436 --> 00:33:51,236 Speaker 1: impeachment inquiry. All of those things may happen, and indeed 615 00:33:51,276 --> 00:33:54,676 Speaker 1: I think they're going to happen, but there's no avoiding 616 00:33:54,676 --> 00:33:58,556 Speaker 1: them in the long run. An impeachment inquiry has to 617 00:33:58,596 --> 00:34:02,396 Speaker 1: take place in front of the American public. The Democrats 618 00:34:02,396 --> 00:34:05,796 Speaker 1: have gone just about as far as they can in private, 619 00:34:06,036 --> 00:34:14,316 Speaker 1: and the Republican protest dramatized that this week. Deep Background 620 00:34:14,356 --> 00:34:16,996 Speaker 1: is brought to you by Pushkin Industries. Our producer is 621 00:34:17,036 --> 00:34:20,956 Speaker 1: Lidia Jean Kott, with engineering by Jason Gambrell and Jason Rostkowski. 622 00:34:21,236 --> 00:34:24,356 Speaker 1: Our showrunner is Sophie McKibben. Our theme music is composed 623 00:34:24,356 --> 00:34:28,236 Speaker 1: by Luis Guerra. Special thanks to the Pushkin Brass, Malcolm Gladwell, 624 00:34:28,356 --> 00:34:31,556 Speaker 1: Jacob Weisberg, and Mia Lobel. I'm Noah Feldman. You can 625 00:34:31,556 --> 00:34:34,716 Speaker 1: follow me on Twitter at Noah R. Feldman. This is 626 00:34:34,756 --> 00:34:35,516 Speaker 1: Deep Background.