1 00:00:10,119 --> 00:00:14,120 Speaker 1: Hello, and welcome to another episode of the Odd Lots Podcast. 2 00:00:14,200 --> 00:00:16,560 Speaker 2: I'm Joe Weisenthal and I'm Tracy Alloway. 3 00:00:16,760 --> 00:00:19,120 Speaker 1: Tracy, you know, what I felt was a really interesting 4 00:00:19,200 --> 00:00:21,720 Speaker 1: thing in a recent debt ceiling episode that we did 5 00:00:21,920 --> 00:00:24,439 Speaker 1: was about all the coding it would take if the 6 00:00:24,480 --> 00:00:27,680 Speaker 1: Treasury were to introduce a new kind of bill. 7 00:00:27,960 --> 00:00:31,280 Speaker 2: Yes, that was interesting to me too, because I think 8 00:00:31,320 --> 00:00:34,639 Speaker 2: there's a perception out there that if Treasury wakes up 9 00:00:34,720 --> 00:00:37,680 Speaker 2: tomorrow and decides to sell a new type of bond, 10 00:00:38,280 --> 00:00:40,320 Speaker 2: they could just do it. But actually there are all 11 00:00:40,360 --> 00:00:43,160 Speaker 2: these back end changes that would need to be made. 12 00:00:43,720 --> 00:00:46,879 Speaker 2: And as you and I know from multiple conversations at 13 00:00:46,880 --> 00:00:50,760 Speaker 2: this point, it feels like government technology can be a 14 00:00:50,760 --> 00:00:53,760 Speaker 2: little bit clunky sometimes. Is that a fair way of 15 00:00:53,760 --> 00:00:54,320 Speaker 2: putting it? 16 00:00:54,320 --> 00:00:54,640 Speaker 3: It is. 17 00:00:54,760 --> 00:00:57,000 Speaker 1: I would say, you know, in its defense, I mean we'll get 18 00:00:57,000 --> 00:01:00,600 Speaker 1: into it, yes, you know, corporate technology can be clunky. But 19 00:01:00,760 --> 00:01:04,640 Speaker 1: that point reminded me that, like we've done, software is 20 00:01:04,640 --> 00:01:07,440 Speaker 1: like a fascinating thing. And we talk a lot on 21 00:01:07,480 --> 00:01:10,200 Speaker 1: the show.
I think about like the sort of built economy, 22 00:01:10,319 --> 00:01:13,400 Speaker 1: like the Chips Act and battery investment and semiconductors and 23 00:01:13,440 --> 00:01:16,440 Speaker 1: all these things are like can the US build things again? 24 00:01:16,520 --> 00:01:20,280 Speaker 1: Like as a government, state capacity, these ideas. But in 25 00:01:20,319 --> 00:01:23,759 Speaker 1: the twenty first century, also like building software and tech 26 00:01:23,959 --> 00:01:24,920 Speaker 1: has to be part of that. 27 00:01:24,959 --> 00:01:27,800 Speaker 2: Question, right. So it's one thing to announce that you're 28 00:01:27,840 --> 00:01:31,240 Speaker 2: going to do this new thing, but almost everything that 29 00:01:31,280 --> 00:01:34,800 Speaker 2: we do nowadays comes with some sort of software requirement. 30 00:01:35,120 --> 00:01:38,520 Speaker 2: Like I'm sure there is a system for processing applications 31 00:01:38,560 --> 00:01:41,479 Speaker 2: for chips funding and things like that, and someone has 32 00:01:41,520 --> 00:01:44,600 Speaker 2: to actually build that. So there's two questions contained in 33 00:01:44,640 --> 00:01:48,040 Speaker 2: the can America build things question. It's like, can it 34 00:01:48,120 --> 00:01:50,800 Speaker 2: build the actual things and then can it build the 35 00:01:50,840 --> 00:01:53,200 Speaker 2: software systems that would allow it to build the things? 36 00:01:53,320 --> 00:01:56,280 Speaker 1: And you know, in the pandemic period, we got sort 37 00:01:56,320 --> 00:01:58,200 Speaker 1: of smacked in the face, I would say, by this 38 00:01:58,320 --> 00:02:02,120 Speaker 1: realization that the software component is not trivial at all.
39 00:02:02,200 --> 00:02:05,680 Speaker 1: And most notably it was like every state had some 40 00:02:05,880 --> 00:02:09,440 Speaker 1: sort of problem with their unemployment insurance system and there 41 00:02:09,440 --> 00:02:12,200 Speaker 1: was just such a surge in claims obviously in those 42 00:02:12,200 --> 00:02:14,160 Speaker 1: initial, you know, really for that first year, just on 43 00:02:14,160 --> 00:02:16,920 Speaker 1: a scale that no one had ever seen. And it reminded 44 00:02:16,960 --> 00:02:20,480 Speaker 1: people like, oh, yeah, fifty different state systems, much with 45 00:02:20,600 --> 00:02:23,480 Speaker 1: like tech that was probably built by some contractor in, 46 00:02:23,720 --> 00:02:25,600 Speaker 1: I don't know, the eighties or nineties, and the people 47 00:02:25,639 --> 00:02:28,280 Speaker 1: who knew how to run it have left, and you know, 48 00:02:28,280 --> 00:02:30,520 Speaker 1: it was a... I guess eventually, I guess it got 49 00:02:30,760 --> 00:02:34,200 Speaker 1: smoothed out, but it sort of exposed like, okay, why 50 00:02:34,200 --> 00:02:36,399 Speaker 1: did that happen? And what else is lurking out there 51 00:02:36,400 --> 00:02:38,400 Speaker 1: that we don't really know how it works until it's broken? 52 00:02:38,560 --> 00:02:39,919 Speaker 2: I'm sure this is going to end up being a 53 00:02:39,960 --> 00:02:42,920 Speaker 2: classic episode where we mention COBOL quite a few times, 54 00:02:43,000 --> 00:02:45,880 Speaker 2: like it's inevitable, right, it's coming, you can feel it. 55 00:02:45,919 --> 00:02:47,640 Speaker 1: One day, we're going to do it like actually, just 56 00:02:47,680 --> 00:02:50,840 Speaker 1: like a what is COBOL episode where we just like focus 57 00:02:50,880 --> 00:02:52,720 Speaker 1: on that directly.
But yes, we're going to talk about 58 00:02:52,760 --> 00:02:54,280 Speaker 1: software and we're going to talk about it from the 59 00:02:54,320 --> 00:02:57,480 Speaker 1: public sector perspective. What does it mean when the government 60 00:02:57,560 --> 00:03:00,480 Speaker 1: tries to fix something, what happens when the government tries 61 00:03:00,480 --> 00:03:02,880 Speaker 1: to build something, what happens when the government tries to 62 00:03:02,880 --> 00:03:05,799 Speaker 1: buy something. Some of these themes we talked about them 63 00:03:05,800 --> 00:03:10,240 Speaker 1: from a private sector perspective with Patrick McKenzie earlier in 64 00:03:10,280 --> 00:03:12,160 Speaker 1: the year, but more to do on this topic. 65 00:03:12,480 --> 00:03:14,800 Speaker 2: Yeah, I'm excited. I have a lot... This is going 66 00:03:14,880 --> 00:03:17,880 Speaker 2: to be a really good chance to actually ask about, like, 67 00:03:18,320 --> 00:03:22,280 Speaker 2: the decisions that are going into some of these things 68 00:03:22,320 --> 00:03:25,600 Speaker 2: being built. I'm still floored every time I log into 69 00:03:25,680 --> 00:03:28,000 Speaker 2: the TreasuryDirect website that, you know, you have to 70 00:03:28,000 --> 00:03:31,959 Speaker 2: click the little buttons, like the actual letters and 71 00:03:32,040 --> 00:03:34,640 Speaker 2: numbers, to type your password. It won't let you just 72 00:03:34,720 --> 00:03:35,640 Speaker 2: type your password. 73 00:03:36,240 --> 00:03:37,680 Speaker 4: How did that come to be? 74 00:03:38,200 --> 00:03:39,279 Speaker 2: I have questions. 75 00:03:39,880 --> 00:03:42,720 Speaker 1: That is a great question. So we have, I think, 76 00:03:42,840 --> 00:03:46,720 Speaker 1: literally the two perfect guests to discuss this, two guests 77 00:03:46,720 --> 00:03:51,280 Speaker 1: with extensive experience answering and working on exactly these problems.
78 00:03:51,280 --> 00:03:53,880 Speaker 1: We're going to be speaking with Jennifer Pahlka. She is 79 00:03:53,920 --> 00:03:57,200 Speaker 1: the founder of Code for America. She helped found the 80 00:03:57,320 --> 00:04:00,880 Speaker 1: US Digital Service. She co-chaired the California task 81 00:04:00,960 --> 00:04:05,480 Speaker 1: force on fixing its unemployment insurance system during the pandemic, 82 00:04:05,600 --> 00:04:08,680 Speaker 1: and she's the author of a new book called Recoding America. 83 00:04:09,080 --> 00:04:11,240 Speaker 1: And we're going to be speaking with Dave Guarino, 84 00:04:11,360 --> 00:04:15,200 Speaker 1: who's a number of things, currently an independent consultant and researcher 85 00:04:15,240 --> 00:04:18,520 Speaker 1: focused on this stuff. Founding engineer at Code for America, 86 00:04:18,839 --> 00:04:22,920 Speaker 1: helped create GetCalFresh, which like allowed Californians 87 00:04:22,920 --> 00:04:28,440 Speaker 1: to easily access SNAP, has recently worked on the unemployment 88 00:04:28,440 --> 00:04:32,279 Speaker 1: insurance modernization at the Department of Labor, 89 00:04:32,360 --> 00:04:35,880 Speaker 1: also worked in California. So two people with extensive experience on 90 00:04:35,960 --> 00:04:38,839 Speaker 1: exactly these things. So, Jennifer and David, thank you so 91 00:04:38,920 --> 00:04:40,000 Speaker 1: much for coming on Odd Lots. 92 00:04:39,800 --> 00:04:41,720 Speaker 4: Great, thanks for having us. 93 00:04:41,880 --> 00:04:43,279 Speaker 3: Absolutely great to be here. 94 00:04:44,440 --> 00:04:46,640 Speaker 1: Now I want to ask Tracy, could I steal your 95 00:04:46,640 --> 00:04:47,160 Speaker 1: first question? 96 00:04:47,200 --> 00:04:47,360 Speaker 3: Yeah. 97 00:04:47,360 --> 00:04:47,720 Speaker 5: Do it. 98 00:04:47,839 --> 00:04:48,920 Speaker 4: How has it just happened that
99 00:04:48,880 --> 00:04:52,279 Speaker 1: a government website doesn't let you type in something and 100 00:04:52,320 --> 00:04:55,479 Speaker 1: you have to like click buttons that like resemble a 101 00:04:55,600 --> 00:04:57,960 Speaker 1: mini keyboard on a website? What happens when you see that? 102 00:04:58,320 --> 00:04:59,880 Speaker 1: Do you have, like, in your mind, you're like, oh, 103 00:04:59,920 --> 00:05:02,520 Speaker 1: I know exactly the meeting that caused this to happen? 104 00:05:02,720 --> 00:05:05,400 Speaker 4: I think Dave has been in those meetings and has 105 00:05:05,440 --> 00:05:07,919 Speaker 4: come out like with his hair on fire, fuming, so 106 00:05:08,040 --> 00:05:09,760 Speaker 4: I will let him answer that. 107 00:05:09,360 --> 00:05:12,120 Speaker 3: I mean, well, I don't have... Yeah, I wasn't in 108 00:05:12,160 --> 00:05:15,880 Speaker 3: that specific meeting. Maybe the best answer I can give, 109 00:05:16,400 --> 00:05:20,719 Speaker 3: and there's deeper issues behind it, is that's the way 110 00:05:20,839 --> 00:05:23,359 Speaker 3: government tends to build technology. The reason for that is 111 00:05:23,400 --> 00:05:27,480 Speaker 3: no one wrote down specifically as a requirement up front that 112 00:05:27,600 --> 00:05:30,800 Speaker 3: people should be able to use the keypad and the 113 00:05:30,880 --> 00:05:35,680 Speaker 3: numbers on their keyboard to enter the digits, and so 114 00:05:35,720 --> 00:05:37,920 Speaker 3: someone's like, well, I've met the requirements. Like the requirements 115 00:05:37,960 --> 00:05:41,120 Speaker 3: are that someone can do this, they can enter it, 116 00:05:41,160 --> 00:05:43,680 Speaker 3: and the way they do that is they click in 117 00:05:43,760 --> 00:05:46,359 Speaker 3: this somewhat, I suppose, insane way. 118 00:05:46,640 --> 00:05:49,160 Speaker 1: This is gonna be a conversation where we like bang our 119 00:05:49,160 --> 00:05:51,120 Speaker 1: heads on a desk a bunch.
I feel like, as 120 00:05:51,160 --> 00:05:52,880 Speaker 1: we listen to these answers, anyway. 121 00:05:52,680 --> 00:05:53,400 Speaker 4: Yeah, get ready. 122 00:05:53,480 --> 00:05:56,480 Speaker 2: Okay. Well, let me ask maybe a big picture question 123 00:05:56,760 --> 00:06:00,560 Speaker 2: to start, but what is the process by which government 124 00:06:00,640 --> 00:06:04,800 Speaker 2: software is developed? Because, you know, that answer just then, 125 00:06:04,960 --> 00:06:07,560 Speaker 2: it makes it sound like someone gets a mandate, we 126 00:06:07,640 --> 00:06:10,640 Speaker 2: need a website, for instance, through which people can buy 127 00:06:10,920 --> 00:06:13,960 Speaker 2: government bonds, and then someone else goes off and, I 128 00:06:13,960 --> 00:06:16,600 Speaker 2: guess, they execute it along the lines of the mandate 129 00:06:16,640 --> 00:06:17,280 Speaker 2: that's been set. 130 00:06:17,360 --> 00:06:19,000 Speaker 4: I mean, I'll jump in on that one. I mean, 131 00:06:19,160 --> 00:06:21,320 Speaker 4: first of all, it is changing. So just before we 132 00:06:21,400 --> 00:06:24,200 Speaker 4: get everyone sort of banging their heads on the table, 133 00:06:24,240 --> 00:06:26,800 Speaker 4: there's a lot of movement in good directions. But what 134 00:06:27,000 --> 00:06:29,880 Speaker 4: has been the process is exactly that. There's sort of 135 00:06:29,960 --> 00:06:33,600 Speaker 4: this mandate that often comes from legislation or regulation that very 136 00:06:33,600 --> 00:06:36,120 Speaker 4: frequently now, like, says there will be a website, they'll 137 00:06:36,120 --> 00:06:39,440 Speaker 4: even put the URL in the legislation, and then it 138 00:06:39,560 --> 00:06:43,440 Speaker 4: kicks off what's a really long process that does exactly 139 00:06:43,600 --> 00:06:48,320 Speaker 4: what Dave was saying.
It is first centered around requirements gathering, 140 00:06:48,480 --> 00:06:51,599 Speaker 4: and that process of just requirements gathering can easily take 141 00:06:51,640 --> 00:06:53,920 Speaker 4: ten years. Now, when it's something like healthcare dot gov, 142 00:06:54,320 --> 00:06:56,400 Speaker 4: you know, and they have a launch date that's three 143 00:06:56,440 --> 00:07:00,320 Speaker 4: years out, they don't take ten years, but they still 144 00:07:00,320 --> 00:07:04,520 Speaker 4: gather sort of every imaginable requirement and they kind of 145 00:07:04,560 --> 00:07:08,720 Speaker 4: throw it in one giant bucket. And that's how the 146 00:07:08,920 --> 00:07:12,240 Speaker 4: RFP is created. The request for proposal, that's what vendors 147 00:07:12,400 --> 00:07:16,320 Speaker 4: bid on, and it's sort of like an undifferentiated, unprioritized 148 00:07:16,400 --> 00:07:20,000 Speaker 4: mess of everything everyone could think of, plus things that 149 00:07:20,040 --> 00:07:22,960 Speaker 4: have sort of pulled over from previous RFPs, plus a 150 00:07:23,000 --> 00:07:28,000 Speaker 4: bunch of compliance stuff around security, compliance stuff around, well, 151 00:07:28,080 --> 00:07:30,720 Speaker 4: various other things. Dave can maybe chime in. But what 152 00:07:30,760 --> 00:07:34,080 Speaker 4: that isn't is product management. Right, there's no one saying 153 00:07:34,200 --> 00:07:37,600 Speaker 4: here's the important things this should do. And as Dave said, 154 00:07:37,640 --> 00:07:40,560 Speaker 4: there's never a requirement for it to just work.
So 155 00:07:41,040 --> 00:07:43,600 Speaker 4: then the vendor, I mean, I'll skip several steps in 156 00:07:43,640 --> 00:07:45,720 Speaker 4: the middle where, you know, it takes a long time 157 00:07:45,760 --> 00:07:47,960 Speaker 4: to bid it out, and then there's like a protest 158 00:07:48,080 --> 00:07:51,000 Speaker 4: because one vendor thinks the other vendor shouldn't have got it, 159 00:07:51,040 --> 00:07:52,600 Speaker 4: and that delays it a couple more years. But then 160 00:07:52,640 --> 00:07:55,960 Speaker 4: they go into development and their job, as Dave said, 161 00:07:56,040 --> 00:07:58,760 Speaker 4: is just like check all these boxes, and it's thousands. 162 00:07:58,800 --> 00:08:02,480 Speaker 4: I mean, when we were at the State of California 163 00:08:02,560 --> 00:08:05,679 Speaker 4: working on unemployment insurance, they were actually about to put 164 00:08:05,720 --> 00:08:10,240 Speaker 4: out a Business System Modernization RFP, actually, I think, about 165 00:08:10,240 --> 00:08:12,320 Speaker 4: to award it to a vendor, and I think 166 00:08:12,320 --> 00:08:15,800 Speaker 4: it had six or seven hundred requirements. We can check that, 167 00:08:15,840 --> 00:08:19,320 Speaker 4: but that's really a normal number of requirements. 168 00:08:19,360 --> 00:08:21,880 Speaker 4: And so they do all these things where you 169 00:08:21,880 --> 00:08:24,240 Speaker 4: can check a box, but what they don't do is 170 00:08:24,280 --> 00:08:27,560 Speaker 4: actually check that it works. So, I mean, maybe I'll 171 00:08:27,600 --> 00:08:29,480 Speaker 4: just tell a quick story from the book. And, you know, 172 00:08:29,520 --> 00:08:34,480 Speaker 4: there was famously this application for veterans' healthcare benefits at 173 00:08:34,520 --> 00:08:39,200 Speaker 4: the VA that didn't work outside the building and they 174 00:08:39,840 --> 00:08:43,240 Speaker 4: couldn't see it.
So basically, somewhere in the specs 175 00:08:43,360 --> 00:08:46,640 Speaker 4: it had that this form needed to work on a 176 00:08:46,800 --> 00:08:51,840 Speaker 4: very specific and outdated combination of Internet Explorer and Adobe 177 00:08:51,880 --> 00:08:54,600 Speaker 4: Reader. And the reason no one inside the building knew that it didn't 178 00:08:54,600 --> 00:08:57,000 Speaker 4: work is that's how all the computers 179 00:08:57,040 --> 00:08:59,439 Speaker 4: in there were set up. But if you were outside 180 00:08:59,440 --> 00:09:02,800 Speaker 4: the building and had any other possible combination of those two 181 00:09:02,800 --> 00:09:05,360 Speaker 4: pieces of software, it literally wouldn't load. And so they 182 00:09:05,360 --> 00:09:09,520 Speaker 4: had very very few people applying for these benefits online, 183 00:09:09,920 --> 00:09:12,920 Speaker 4: and, you know, veterans were really, really, really frustrated. And 184 00:09:12,920 --> 00:09:16,720 Speaker 4: it took a team going out there, recording a veteran 185 00:09:16,880 --> 00:09:19,679 Speaker 4: who had tried to do this dozens and dozens of times, 186 00:09:20,120 --> 00:09:24,360 Speaker 4: bringing that video back and showing it to the Deputy 187 00:09:24,400 --> 00:09:26,560 Speaker 4: Secretary for them to be able to say, oh, okay, 188 00:09:26,600 --> 00:09:30,000 Speaker 4: actually there is something to be fixed here. Okay, you know, 189 00:09:30,040 --> 00:09:31,720 Speaker 4: you can go ahead and make a new form. But 190 00:09:31,800 --> 00:09:34,760 Speaker 4: up to then, they said, sorry, it's fine, we're looking 191 00:09:34,760 --> 00:09:37,719 Speaker 4: at the requirements. The requirements have been met. There's technically 192 00:09:37,720 --> 00:09:39,760 Speaker 4: nothing wrong. That's amazing.
193 00:09:39,840 --> 00:09:42,120 Speaker 2: It's also kind of crazy to think that people would 194 00:09:42,160 --> 00:09:43,839 Speaker 2: have been looking at that and just been like, oh, well, 195 00:09:43,840 --> 00:09:47,120 Speaker 2: I guess demand for veterans benefits is lower than we 196 00:09:47,200 --> 00:09:49,360 Speaker 2: thought it would be. But actually it was a tech issue. 197 00:09:49,400 --> 00:09:51,880 Speaker 4: I think what they said was demand for them doing 198 00:09:51,880 --> 00:09:54,360 Speaker 4: it online is low. These veterans must 199 00:09:54,120 --> 00:09:56,240 Speaker 2: not have access, which is absolutely not true, or they 200 00:09:56,240 --> 00:09:58,520 Speaker 2: can't figure out the computer, when in fact, yeah, I 201 00:09:58,520 --> 00:10:00,640 Speaker 2: mean, they couldn't figure out the software. 202 00:10:00,679 --> 00:10:02,520 Speaker 4: They would tell them that. And they told this guy Dominic, 203 00:10:02,520 --> 00:10:04,480 Speaker 4: the veteran that they interviewed. They kept telling him it's 204 00:10:04,559 --> 00:10:06,040 Speaker 4: user error. There's something wrong with you. 205 00:10:06,840 --> 00:10:09,160 Speaker 1: Can I ask you a question about that bidding process? 206 00:10:09,240 --> 00:10:14,760 Speaker 1: When I hear government goes out to bidders for a website, 207 00:10:15,160 --> 00:10:21,280 Speaker 1: I imagine these like big sort of faceless offices in Rosslyn, Virginia, 208 00:10:21,800 --> 00:10:24,199 Speaker 1: where, you know, people get like chopped salad for lunch 209 00:10:24,240 --> 00:10:25,959 Speaker 1: and stuff like that, and there's like three or four 210 00:10:26,000 --> 00:10:27,800 Speaker 1: of them and they sort of like rotate who wins 211 00:10:27,840 --> 00:10:30,439 Speaker 1: the bid and stuff like that.
How competitive, like, is 212 00:10:30,480 --> 00:10:33,800 Speaker 1: it, would a typical RFP auction or bidding process be, 213 00:10:33,840 --> 00:10:37,640 Speaker 1: and how hard would it be for, say, a more agile, 214 00:10:37,840 --> 00:10:40,840 Speaker 1: innovative firm or someone wanting to do it differently to 215 00:10:41,440 --> 00:10:43,600 Speaker 1: break into the pursuit of that contract? 216 00:10:43,760 --> 00:10:46,920 Speaker 4: That's changing a lot. So now you've got a set 217 00:10:46,960 --> 00:10:49,760 Speaker 4: of vendors, I think there's about thirty five of them, 218 00:10:50,040 --> 00:10:52,400 Speaker 4: in this thing called the Digital Services Coalition, which are 219 00:10:52,559 --> 00:10:55,520 Speaker 4: smaller companies that work in a more sort of agile, 220 00:10:55,600 --> 00:10:58,880 Speaker 4: user-centered way, and, you know, they're just a lot 221 00:10:58,960 --> 00:11:02,280 Speaker 4: smaller than the Beltway firms, so they're still a tiny portion 222 00:11:02,880 --> 00:11:06,319 Speaker 4: of the market, but, you know, they're really viable businesses, 223 00:11:06,400 --> 00:11:09,000 Speaker 4: and people in government can contract out to them. In fact, 224 00:11:09,000 --> 00:11:10,959 Speaker 4: I think there's a mechanism by which you can put 225 00:11:11,000 --> 00:11:13,800 Speaker 4: an RFP out just to those companies. But what has 226 00:11:13,840 --> 00:11:17,560 Speaker 4: happened in the past is, like, Dave can pile onto this, 227 00:11:17,679 --> 00:11:21,080 Speaker 4: like if you were bidding out, say, you know, a benefits 228 00:11:21,120 --> 00:11:23,880 Speaker 4: system in a state, and pick your benefit.
Usually they 229 00:11:23,920 --> 00:11:28,200 Speaker 4: would require in the RFP that the bidder had to 230 00:11:28,240 --> 00:11:32,240 Speaker 4: have done a similar system in a different state in 231 00:11:32,320 --> 00:11:35,280 Speaker 4: the same category, which means that right there it limited it 232 00:11:35,280 --> 00:11:39,000 Speaker 4: to like three vendors, and so you were never going 233 00:11:39,080 --> 00:11:42,880 Speaker 4: to get any disruptors because they were boxed out in 234 00:11:42,920 --> 00:11:44,080 Speaker 4: the very first step. 235 00:11:44,200 --> 00:11:47,040 Speaker 3: Just to maybe jump in. That's exactly right, and if 236 00:11:47,080 --> 00:11:49,440 Speaker 3: you, I mean, to some extent, it's rational. If you 237 00:11:49,480 --> 00:11:52,239 Speaker 3: are thinking about this as someone who's not a technologist, 238 00:11:52,360 --> 00:11:55,640 Speaker 3: you run an agency, and you're thinking, we need to 239 00:11:55,679 --> 00:11:58,000 Speaker 3: do this very large project. It's going to take a 240 00:11:58,120 --> 00:12:01,640 Speaker 3: very long time. We really need it to go right. 241 00:12:02,320 --> 00:12:06,760 Speaker 3: It is hard to think about, well, why wouldn't I 242 00:12:06,840 --> 00:12:10,640 Speaker 3: look for a vendor who's done this five, ten, fifteen times? 243 00:12:10,640 --> 00:12:13,840 Speaker 3: The problem is, as Jen said, well, I think there's 244 00:12:13,840 --> 00:12:17,120 Speaker 3: two problems in that. One is you've now very very 245 00:12:17,200 --> 00:12:20,400 Speaker 3: drastically narrowed your vendor pool, so there's a lot less competition, 246 00:12:20,800 --> 00:12:25,200 Speaker 3: and two, you also may have too big of a project. 247 00:12:25,280 --> 00:12:27,680 Speaker 3: Like really, what level of abstraction are you trying to 248 00:12:27,720 --> 00:12:30,120 Speaker 3: work towards?
And is it the case that really you 249 00:12:30,200 --> 00:12:33,439 Speaker 3: need someone or a vendor who's worked on this specific 250 00:12:33,640 --> 00:12:36,280 Speaker 3: type of system in another entity that works exactly 251 00:12:36,360 --> 00:12:38,959 Speaker 3: the same way and all of those details of the 252 00:12:39,000 --> 00:12:42,880 Speaker 3: exact same program, the exact same structure of agencies? Probably not, 253 00:12:43,040 --> 00:12:47,600 Speaker 3: because probably it's more generically about software. But if you 254 00:12:47,679 --> 00:12:50,520 Speaker 3: really focus only on the track record there, you kind 255 00:12:50,520 --> 00:12:54,680 Speaker 3: of get stuck, as Jen was saying, in this trap 256 00:12:54,760 --> 00:12:57,839 Speaker 3: where there's only a small number of vendors who can 257 00:12:57,880 --> 00:13:01,240 Speaker 3: meet those criteria, even if you think they're... And it's 258 00:13:01,280 --> 00:13:03,120 Speaker 3: hard because the other thing I would say is, as 259 00:13:03,200 --> 00:13:05,640 Speaker 3: much as there are new vendors, and maybe Jen can 260 00:13:05,640 --> 00:13:08,280 Speaker 3: speak more to this, but it's also the case that 261 00:13:08,320 --> 00:13:10,280 Speaker 3: if you have a new vendor that wants to work 262 00:13:10,320 --> 00:13:12,160 Speaker 3: in a more agile way, wants to work in a 263 00:13:12,160 --> 00:13:16,360 Speaker 3: more user-centered way, but they are bidding on 264 00:13:16,440 --> 00:13:19,880 Speaker 3: an RFP that is structured in a very waterfall, top 265 00:13:20,000 --> 00:13:22,920 Speaker 3: down, meet-the-requirements way, and that's what success looks like.
266 00:13:23,160 --> 00:13:26,280 Speaker 3: It's also not going to be successful because you're not 267 00:13:26,320 --> 00:13:30,040 Speaker 3: giving them any space to say, oh, well, I know 268 00:13:30,120 --> 00:13:33,840 Speaker 3: your requirement says a way to enter phone numbers using 269 00:13:34,040 --> 00:13:37,000 Speaker 3: the mouse, but when we tested it with 270 00:13:37,080 --> 00:13:40,840 Speaker 3: people, like, people, it turns out, really don't like using the 271 00:13:40,920 --> 00:13:43,520 Speaker 3: mouse to click numbers to enter digits. They prefer to 272 00:13:43,600 --> 00:13:46,880 Speaker 3: use the keyboard. Could we change that requirement? And honestly, 273 00:13:47,120 --> 00:13:50,040 Speaker 3: I mean, that's an extreme example, but a lot of... 274 00:13:51,160 --> 00:13:52,760 Speaker 3: and there's a lot of what Jen's book is about, 275 00:13:52,880 --> 00:13:56,200 Speaker 3: and what resonated with me in it, is that in 276 00:13:56,679 --> 00:13:59,640 Speaker 3: doing these things and building software, you learn those details 277 00:13:59,640 --> 00:14:02,040 Speaker 3: that you never could have gotten right up front. And 278 00:14:02,120 --> 00:14:07,640 Speaker 3: so if you make success defined by implementing everything as 279 00:14:07,679 --> 00:14:11,600 Speaker 3: we understand it before we ever get started, you almost 280 00:14:11,720 --> 00:14:14,200 Speaker 3: ensure that you're not going to get what you really 281 00:14:14,240 --> 00:14:17,559 Speaker 3: wanted, because you're not allowing any information after that cutoff 282 00:14:17,600 --> 00:14:19,800 Speaker 3: point, right when you might be getting to the 283 00:14:19,800 --> 00:14:22,160 Speaker 3: point where in doing it you're gonna learn a lot.
284 00:14:39,280 --> 00:14:41,320 Speaker 2: I definitely want to ask you more about that top 285 00:14:41,400 --> 00:14:44,800 Speaker 2: down, waterfall nature of how a lot of these software 286 00:14:44,840 --> 00:14:47,760 Speaker 2: projects get commissioned. But before I do, I'm just curious 287 00:14:47,840 --> 00:14:51,600 Speaker 2: about the actual vendors, because I'm sort of imagining these 288 00:14:51,640 --> 00:14:57,200 Speaker 2: like big faceless offices. But what types of vendors are 289 00:14:57,240 --> 00:14:59,720 Speaker 2: we talking about? And I'm trying to think how to 290 00:15:00,120 --> 00:15:03,680 Speaker 2: put this kind of diplomatically. But I can imagine a 291 00:15:03,800 --> 00:15:08,000 Speaker 2: company that is putting itself out bidding for government contracts 292 00:15:08,120 --> 00:15:13,360 Speaker 2: on software development, like maybe they're not as experimental or 293 00:15:13,440 --> 00:15:17,320 Speaker 2: cutting edge as a software company that's, you know, somewhere 294 00:15:17,360 --> 00:15:21,920 Speaker 2: in San Francisco and is building like new exciting products 295 00:15:21,640 --> 00:15:25,240 Speaker 2: for the corporate world. Although we've also done an episode 296 00:15:25,240 --> 00:15:29,400 Speaker 2: on how bad internal corporate tech can be. But it 297 00:15:29,520 --> 00:15:32,680 Speaker 2: seems like it might be a little bit more, I 298 00:15:32,720 --> 00:15:37,080 Speaker 2: hesitate to say boring, but sort of basic, playing it safe. 299 00:15:37,160 --> 00:15:39,520 Speaker 4: Well, you know, one thing that I will say, like, 300 00:15:39,600 --> 00:15:43,960 Speaker 4: I hear both incredible frustration and even anger from folks 301 00:15:44,000 --> 00:15:46,720 Speaker 4: that have to hire these companies. They're just like, there's 302 00:15:46,720 --> 00:15:50,400 Speaker 4: no alternatives. This has gone badly, it will go badly.
303 00:15:50,440 --> 00:15:52,880 Speaker 4: They kind of know that. Like, I had this woman 304 00:15:53,040 --> 00:15:57,120 Speaker 4: in a major state, I won't say which, who, when 305 00:15:57,120 --> 00:15:59,080 Speaker 4: I said, you know, your project's going to fail, 306 00:15:59,160 --> 00:16:01,440 Speaker 4: she said, do you think we don't know that? The 307 00:16:01,560 --> 00:16:04,720 Speaker 4: last seven projects have failed. So there's this huge frustration. 308 00:16:05,240 --> 00:16:11,560 Speaker 4: But there's also this real feeling that the company in 309 00:16:11,600 --> 00:16:15,280 Speaker 4: San Francisco with its forty people who make consumer software, 310 00:16:15,760 --> 00:16:18,560 Speaker 4: like, great, people love using that software, but they don't 311 00:16:18,600 --> 00:16:21,880 Speaker 4: understand us, right? They're not going to actually understand the 312 00:16:21,880 --> 00:16:25,040 Speaker 4: constraints of government. So we have to go with these 313 00:16:25,040 --> 00:16:29,120 Speaker 4: big companies. And they're not wrong in the sense that 314 00:16:29,240 --> 00:16:32,880 Speaker 4: the constraints of working with government are real. There's a 315 00:16:32,960 --> 00:16:35,960 Speaker 4: huge compliance burden. That forty person company in San Francisco 316 00:16:36,640 --> 00:16:40,520 Speaker 4: mostly does not want these jobs. Like, there's so much 317 00:16:40,560 --> 00:16:44,520 Speaker 4: that goes into getting the, you know, getting the bid. 318 00:16:45,360 --> 00:16:48,160 Speaker 4: You know, it's sort of famously said that these 319 00:16:48,200 --> 00:16:51,760 Speaker 4: companies know how to get the work, not 320 00:16:52,320 --> 00:16:54,760 Speaker 4: how to deliver on the work.
That's probably not what 321 00:16:54,800 --> 00:16:58,400 Speaker 4: a forty person company wants to do. But, you know, 322 00:16:59,000 --> 00:17:02,320 Speaker 4: it's true that you have to deliver on stuff that 323 00:17:02,440 --> 00:17:06,840 Speaker 4: is mind-bogglingly complex. Like, when we were working in unemployment 324 00:17:06,840 --> 00:17:12,200 Speaker 4: insurance again during the pandemic, my colleague was talking with 325 00:17:12,400 --> 00:17:15,760 Speaker 4: the claims processors like week over week, and we're trying 326 00:17:15,800 --> 00:17:18,520 Speaker 4: to like dissect it and figure out what's going wrong 327 00:17:18,600 --> 00:17:21,600 Speaker 4: and like clear this backlog. And one of these guys 328 00:17:21,680 --> 00:17:24,600 Speaker 4: keeps saying, well, I'm not quite sure about that answer. 329 00:17:24,680 --> 00:17:26,560 Speaker 4: I'm the new guy. I'm the new guy. And she 330 00:17:26,640 --> 00:17:29,600 Speaker 4: finally says, how long have you been here? And he says, 331 00:17:29,680 --> 00:17:33,119 Speaker 4: I've been here seventeen years. The guys who really know 332 00:17:33,200 --> 00:17:35,879 Speaker 4: how this works have been here twenty five years or more. 333 00:17:36,440 --> 00:17:40,880 Speaker 4: So think about, like, you know, going from doing some simple, cool, 334 00:17:41,080 --> 00:17:44,720 Speaker 4: you know, tech app, you know, easy consumer app, to 335 00:17:45,200 --> 00:17:48,640 Speaker 4: trying to build or fix or improve upon a system 336 00:17:49,240 --> 00:17:52,440 Speaker 4: that is so complex that it takes twenty five years 337 00:17:52,520 --> 00:17:56,320 Speaker 4: to learn how to process a claim. That's, that's sort 338 00:17:56,320 --> 00:17:57,919 Speaker 4: of, I think, what needs to be on the table 339 00:17:57,960 --> 00:18:00,680 Speaker 4: as part of this agenda: it's not just like can 340 00:18:00,720 --> 00:18:03,000 Speaker 4: the tech be better?
But can we go back and 341 00:18:03,000 --> 00:18:07,440 Speaker 4: simplify the accumulated, like, ninety years of policy and process 342 00:18:07,840 --> 00:18:09,560 Speaker 4: that's making that so hard to do? 343 00:18:09,920 --> 00:18:10,120 Speaker 3: Well. 344 00:18:10,240 --> 00:18:11,880 Speaker 1: Why don't we sort of back up? 345 00:18:11,920 --> 00:18:12,080 Speaker 3: Then? 346 00:18:12,119 --> 00:18:14,800 Speaker 1: And I'll again kind of steal Tracy's question, because I 347 00:18:14,840 --> 00:18:17,399 Speaker 1: had the same thought. What is it about government buying 348 00:18:18,040 --> 00:18:23,800 Speaker 1: that creates these massive waterfall RFPs, which, 349 00:18:23,880 --> 00:18:27,719 Speaker 1: you know, a, create these constraints that 350 00:18:27,760 --> 00:18:31,200 Speaker 1: the software developer would not necessarily have with a private buyer, 351 00:18:31,640 --> 00:18:35,879 Speaker 1: and, b, put government agencies in a position where they 352 00:18:35,920 --> 00:18:37,359 Speaker 1: will say to you, of course, we know it's going 353 00:18:37,440 --> 00:18:40,840 Speaker 1: to fail? Which, again, presumably that actually probably does happen 354 00:18:40,880 --> 00:18:43,320 Speaker 1: in the private sector too, 355 00:18:43,359 --> 00:18:45,800 Speaker 1: to some extent. But what are the dynamics 356 00:18:45,920 --> 00:18:48,560 Speaker 1: that make these things like so hard to change and 357 00:18:48,640 --> 00:18:52,040 Speaker 1: so hard to, you know, rip out and come up 358 00:18:52,080 --> 00:18:52,760 Speaker 1: with a new system? 359 00:18:54,280 --> 00:18:56,320 Speaker 4: Well, this is a good Dave question, but I'm gonna start. 360 00:18:57,600 --> 00:18:59,960 Speaker 4: You know, I really spent a lot of time reflecting 361 00:19:00,200 --> 00:19:02,200 Speaker 4: on this, you know.
I've had sort of 362 00:19:02,240 --> 00:19:04,080 Speaker 4: three years since I stepped down from Code for America. 363 00:19:04,119 --> 00:19:05,720 Speaker 4: I just, like, sat in a room, and, well, I 364 00:19:05,760 --> 00:19:08,760 Speaker 4: interviewed people and thought about it, and I thought about 365 00:19:08,760 --> 00:19:12,080 Speaker 4: my own experiences. And I think that there's a deep 366 00:19:12,119 --> 00:19:16,040 Speaker 4: seated culture in government where the policy people are the 367 00:19:16,080 --> 00:19:20,840 Speaker 4: important people. They do the important stuff, and technology, digital, 368 00:19:21,000 --> 00:19:23,760 Speaker 4: is just part of implementation, which is not 369 00:19:23,800 --> 00:19:27,520 Speaker 4: just the bottom of, like, a software development waterfall, it's 370 00:19:27,520 --> 00:19:31,760 Speaker 4: the bottom of a big, rigid hierarchy in which information 371 00:19:31,920 --> 00:19:34,280 Speaker 4: and power and insights only flow from the top to 372 00:19:34,320 --> 00:19:38,119 Speaker 4: the bottom. And so it's problematic in part because the 373 00:19:38,160 --> 00:19:41,800 Speaker 4: people who are doing the tech are really just sort 374 00:19:41,840 --> 00:19:46,560 Speaker 4: of downstream of everything else, and the power and ability 375 00:19:46,640 --> 00:19:49,480 Speaker 4: and willingness to step up and say, hey, like, we 376 00:19:49,680 --> 00:19:53,560 Speaker 4: probably shouldn't do those six or seven hundred requirements, we should 377 00:19:53,560 --> 00:19:57,440 Speaker 4: probably focus on these two hundred.
Get that out the door, 378 00:19:57,480 --> 00:19:59,960 Speaker 4: and then, you know, add edge cases later. Like, 379 00:20:00,000 --> 00:20:03,360 Speaker 4: there's just no permission really to say that. 380 00:20:03,480 --> 00:20:07,439 Speaker 4: And you compare that with, say, we'll call 381 00:20:07,440 --> 00:20:09,879 Speaker 4: it metaphorical Silicon Valley. I don't mean actually, like, the 382 00:20:09,920 --> 00:20:13,680 Speaker 4: Silicon Valley. Like, the people who write code started the companies. 383 00:20:13,680 --> 00:20:16,879 Speaker 4: They're in power, they're at the top. Like, compliance 384 00:20:17,400 --> 00:20:23,160 Speaker 4: is below them. In sort of the DC government hierarchy, 385 00:20:23,240 --> 00:20:26,640 Speaker 4: compliance is, like, way above, right? Like, policy is way 386 00:20:26,680 --> 00:20:29,440 Speaker 4: above, and there's not this sort of build measure learn 387 00:20:29,560 --> 00:20:33,960 Speaker 4: cycle that Dave was referring to, where, you know, people 388 00:20:34,000 --> 00:20:36,720 Speaker 4: are learning from each other as they're doing the work. 389 00:20:36,760 --> 00:20:40,040 Speaker 4: And that technology... you know, certainly the policy has an 390 00:20:40,040 --> 00:20:42,440 Speaker 4: impact on the technology, but the building of the technology 391 00:20:42,480 --> 00:20:45,400 Speaker 4: needs to have an impact on the policy. Like, that 392 00:20:45,680 --> 00:20:50,680 Speaker 4: fundamental culture is something that needs to be, like, looked 393 00:20:50,680 --> 00:20:54,120 Speaker 4: at and called out where it's not happening, and where 394 00:20:54,160 --> 00:21:01,639 Speaker 4: there's actual conversation, you know, dialogue, right?
Where that's happening, 395 00:21:01,680 --> 00:21:04,280 Speaker 4: we should be lifting that up and saying this is 396 00:21:04,320 --> 00:21:07,520 Speaker 4: possible within government, because that's, like, the foundation, I think, 397 00:21:07,520 --> 00:21:12,119 Speaker 4: of getting away from these mega projects that fail. But, 398 00:21:12,200 --> 00:21:14,040 Speaker 4: Dave, you're going to disagree with me 399 00:21:14,040 --> 00:21:14,919 Speaker 4: on this, and I'm gonna love it. 400 00:21:15,000 --> 00:21:17,119 Speaker 3: I actually do think that's right. I 401 00:21:17,160 --> 00:21:21,480 Speaker 3: think a big part of it flows downstream from budgeting, 402 00:21:22,359 --> 00:21:26,119 Speaker 3: meaning a lot of this, the way that... and again, 403 00:21:26,320 --> 00:21:28,920 Speaker 3: everything that I say, like, there's some degree 404 00:21:28,960 --> 00:21:31,520 Speaker 3: of generalization here, and there's always outliers and there's always exceptions, 405 00:21:31,520 --> 00:21:33,080 Speaker 3: and of course there's also a lot of people who try 406 00:21:33,080 --> 00:21:37,280 Speaker 3: to change these things. That said, maybe the dominant mode 407 00:21:37,520 --> 00:21:41,879 Speaker 3: is: we're going to budget for a large project. We 408 00:21:42,000 --> 00:21:44,640 Speaker 3: get one time funding to spend to do that project, 409 00:21:45,280 --> 00:21:47,399 Speaker 3: so we've got to get everything right. We only get 410 00:21:47,440 --> 00:21:50,800 Speaker 3: this chance once every so often. We're going to replace everything. 411 00:21:51,080 --> 00:21:53,080 Speaker 3: We're gonna have this new system. It's going to be great.
412 00:21:53,840 --> 00:21:56,679 Speaker 3: And the way that this paradigm breaks down is 413 00:21:56,840 --> 00:22:00,679 Speaker 3: into what's called a design and development phase, where you build 414 00:22:00,680 --> 00:22:03,200 Speaker 3: the system, you design what it should do, you think 415 00:22:03,240 --> 00:22:05,560 Speaker 3: about that. Again, you're kind of making all these decisions 416 00:22:05,640 --> 00:22:08,359 Speaker 3: up front, which I think is very counter to a 417 00:22:08,400 --> 00:22:10,000 Speaker 3: lot of the, if you want to call it, Silicon 418 00:22:10,080 --> 00:22:12,520 Speaker 3: Valley approach. But then you're building the system, and then 419 00:22:12,560 --> 00:22:15,879 Speaker 3: you enter what's called maintenance and operations, and it's supposed 420 00:22:15,880 --> 00:22:18,320 Speaker 3: to be very, very little. You're not going to change much. 421 00:22:18,320 --> 00:22:22,399 Speaker 3: It's just kind of keeping the system going. And the 422 00:22:22,480 --> 00:22:27,280 Speaker 3: problem is, with modern software, you kind of learn the most 423 00:22:27,480 --> 00:22:30,600 Speaker 3: the second you've actually deployed to real users. That is 424 00:22:30,640 --> 00:22:33,199 Speaker 3: the point when you are getting people saying, oh, this 425 00:22:33,240 --> 00:22:35,560 Speaker 3: doesn't work, or you're getting an error, or you're getting, hey, 426 00:22:35,640 --> 00:22:37,119 Speaker 3: I tried to put in this address, but I have 427 00:22:37,160 --> 00:22:38,919 Speaker 3: a funky address and it won't let me do that.
So, 428 00:22:38,920 --> 00:22:41,840 Speaker 3: all of a sudden, you have all this information. And unfortunately, 429 00:22:42,280 --> 00:22:44,400 Speaker 3: in sort of what gets called a big bang launch 430 00:22:44,400 --> 00:22:46,200 Speaker 3: project, where you just put it up and you're done 431 00:22:46,280 --> 00:22:49,000 Speaker 3: and you're not going to touch it again, you now, 432 00:22:49,200 --> 00:22:52,000 Speaker 3: almost immediately on day one, can see potentially all of 433 00:22:52,040 --> 00:22:54,000 Speaker 3: these issues. And a lot of people prepare for this 434 00:22:54,040 --> 00:22:55,800 Speaker 3: in these kinds of projects. They're like, the first month 435 00:22:55,840 --> 00:22:57,240 Speaker 3: is going to be really hard. You all of a 436 00:22:57,280 --> 00:22:59,400 Speaker 3: sudden know all the things that were assumptions you made 437 00:22:59,400 --> 00:23:02,400 Speaker 3: that you got wrong. But the model that you had, 438 00:23:02,400 --> 00:23:04,560 Speaker 3: that you were given, is, well, we're going to do 439 00:23:04,600 --> 00:23:09,000 Speaker 3: this one time. Now, contrast that with: actually going live 440 00:23:09,280 --> 00:23:11,680 Speaker 3: is the first step, and you're going to have funding 441 00:23:11,800 --> 00:23:15,119 Speaker 3: that's continuous and kind of flat over ten, over twenty, 442 00:23:15,119 --> 00:23:17,320 Speaker 3: over thirty years, and you're going to have those people 443 00:23:17,359 --> 00:23:20,880 Speaker 3: on staff, and you're going to continuously change the system 444 00:23:21,160 --> 00:23:23,560 Speaker 3: as you learn more about how users interact with it, 445 00:23:23,760 --> 00:23:25,680 Speaker 3: as you learn more about, like, what things you didn't 446 00:23:25,720 --> 00:23:29,359 Speaker 3: get right up front.
You made it possible to 447 00:23:30,480 --> 00:23:34,159 Speaker 3: upload a certain type of document. You accept PDFs and 448 00:23:34,280 --> 00:23:38,359 Speaker 3: TIFF images, this is a real example, but it turns 449 00:23:38,359 --> 00:23:42,200 Speaker 3: out you don't support JPEGs or, I think, the iPhone 450 00:23:42,200 --> 00:23:45,879 Speaker 3: proprietary default image format. So now all your mobile users 451 00:23:45,880 --> 00:23:47,840 Speaker 3: are saying, hey, we can't upload, or a bunch of 452 00:23:47,880 --> 00:23:49,520 Speaker 3: them are saying, I can't upload documents, it says I 453 00:23:49,520 --> 00:23:51,959 Speaker 3: can't submit this file type. Now, a lot of, 454 00:23:52,320 --> 00:23:55,040 Speaker 3: you know, well intentioned folks in government get to that 455 00:23:55,119 --> 00:23:58,359 Speaker 3: situation, and then they're stuck, because they don't have the 456 00:23:58,880 --> 00:24:00,480 Speaker 3: mandate or the funding to say, now we're going to 457 00:24:00,560 --> 00:24:03,000 Speaker 3: change that. They're waiting for the next big project to 458 00:24:03,040 --> 00:24:05,919 Speaker 3: replace the system and support those new things. Whereas instead, 459 00:24:05,920 --> 00:24:08,400 Speaker 3: if you have a model where you're saying we expect 460 00:24:08,400 --> 00:24:11,640 Speaker 3: to continuously change this, it's a very, very different thing. 461 00:24:12,240 --> 00:24:14,200 Speaker 3: And that is a little bit more how Silicon 462 00:24:14,280 --> 00:24:17,280 Speaker 3: Valley operates. I mean, this maybe is sort of hyperbole, 463 00:24:17,400 --> 00:24:21,600 Speaker 3: but Google didn't build search and then lay off ninety 464 00:24:21,600 --> 00:24:23,480 Speaker 3: percent of their staff and say, we're done, this is great, 465 00:24:23,960 --> 00:24:26,360 Speaker 3: you know, like, look, we've got this search thing.
And 466 00:24:26,880 --> 00:24:30,439 Speaker 3: I think that difference, flowing downstream from how tech is 467 00:24:30,480 --> 00:24:33,160 Speaker 3: budgeted for, where you're budgeting for a one time project, 468 00:24:33,240 --> 00:24:36,840 Speaker 3: and this comes from legislatures, you know, from Congress, a 469 00:24:36,880 --> 00:24:40,040 Speaker 3: one time project versus we want to have ongoing funding 470 00:24:40,119 --> 00:24:42,760 Speaker 3: because this is a living, breathing system, is a huge 471 00:24:42,760 --> 00:24:46,040 Speaker 3: paradigm shift. And it has an uncomfortable aspect, which is it 472 00:24:46,119 --> 00:24:49,679 Speaker 3: may seem like it's costing more money, because it's potentially 473 00:24:50,080 --> 00:24:53,439 Speaker 3: ongoing dollars, because it's staff, not a one time buy. But 474 00:24:53,640 --> 00:24:55,679 Speaker 3: I think a lot of the dysfunction that we see 475 00:24:56,119 --> 00:24:59,320 Speaker 3: probably flows down from that root cause. 476 00:24:59,440 --> 00:25:02,440 Speaker 2: Okay, so here's a big question. If we can 477 00:25:02,480 --> 00:25:06,080 Speaker 2: identify the cause of a lot of this dysfunction, and 478 00:25:06,160 --> 00:25:09,360 Speaker 2: if we can trace it back to this waterfall structure, 479 00:25:09,960 --> 00:25:12,960 Speaker 2: the sort of top down mandates, or maybe, like, theorists 480 00:25:13,040 --> 00:25:15,720 Speaker 2: versus practitioners. So you have a politician who has a 481 00:25:15,720 --> 00:25:18,720 Speaker 2: big idea and they want that big bang reveal of 482 00:25:18,760 --> 00:25:20,919 Speaker 2: the software. They're going to expand benefits, and it's going 483 00:25:20,960 --> 00:25:23,439 Speaker 2: to come with a really cool website and everyone's going 484 00:25:23,480 --> 00:25:26,560 Speaker 2: to be able to access them easily. And yet it 485 00:25:26,800 --> 00:25:28,960 Speaker 2: leads to these problems that we've been discussing.
486 00:25:29,720 --> 00:25:30,800 Speaker 4: Why do we keep doing it? 487 00:25:31,880 --> 00:25:36,359 Speaker 2: That's a hard question. Well, what is the system that 488 00:25:36,520 --> 00:25:40,600 Speaker 2: is, like, keeping this way of doing things in place? 489 00:25:40,920 --> 00:25:45,240 Speaker 2: Versus, maybe, you know, after a few decades of developing 490 00:25:45,320 --> 00:25:48,440 Speaker 2: these types of systems, why isn't someone going, well, actually, 491 00:25:48,960 --> 00:25:51,520 Speaker 2: the way we've been doing things hasn't really been working well? 492 00:25:51,520 --> 00:25:53,280 Speaker 4: I think a lot of people have been saying that, 493 00:25:53,359 --> 00:25:55,320 Speaker 4: and that's why I say things are changing. And 494 00:25:55,359 --> 00:25:58,440 Speaker 4: I will offer as evidence COVIDtests.gov. Do 495 00:25:58,480 --> 00:26:02,400 Speaker 4: you remember that one? Beginning of twenty twenty two, the 496 00:26:02,440 --> 00:26:05,080 Speaker 4: President announced that there would be this site where you 497 00:26:05,119 --> 00:26:10,200 Speaker 4: could request tests, and I think he gave a date 498 00:26:10,320 --> 00:26:12,720 Speaker 4: that was, I want to say, six weeks out. It 499 00:26:12,800 --> 00:26:16,280 Speaker 4: launched a day early, it launched in multiple languages. It 500 00:26:16,320 --> 00:26:19,120 Speaker 4: took eleven seconds for me to use. Other people say 501 00:26:19,600 --> 00:26:23,399 Speaker 4: eight seconds, fifteen seconds, right? And your tests showed up 502 00:26:23,440 --> 00:26:26,640 Speaker 4: a couple days later. Like, that sounds like it could 503 00:26:26,640 --> 00:26:29,600 Speaker 4: have been really... okay, so it's much simpler than, 504 00:26:29,640 --> 00:26:32,159 Speaker 4: say, signing up for HealthCare.gov or whatever. But, like, 505 00:26:32,240 --> 00:26:35,120 Speaker 4: they could have made it way more complex.
They could 506 00:26:35,160 --> 00:26:40,520 Speaker 4: have said, like, let's gather every possible requirement and try 507 00:26:40,520 --> 00:26:45,240 Speaker 4: to fulfill every possible requirement. But you had internal capacity. 508 00:26:45,359 --> 00:26:47,840 Speaker 4: Like, to Dave's point, there were people in house who 509 00:26:47,920 --> 00:26:51,199 Speaker 4: were saying, hey, we have been doing it a way that 510 00:26:51,280 --> 00:26:55,359 Speaker 4: hasn't been working. Let's try this way. And it's great. 511 00:26:55,440 --> 00:26:58,160 Speaker 4: Like, it serves as a fantastic example, I think, within 512 00:26:58,240 --> 00:27:00,800 Speaker 4: government and outside government. Right? So part of it is 513 00:27:00,800 --> 00:27:04,120 Speaker 4: how people in government think, what they believe, and part 514 00:27:04,119 --> 00:27:06,080 Speaker 4: of it is what the public thinks. Right? The public 515 00:27:06,200 --> 00:27:08,760 Speaker 4: couldn't imagine something that easy, but we have to 516 00:27:08,760 --> 00:27:12,720 Speaker 4: start imagining it and holding our government accountable to doing that. So, 517 00:27:12,800 --> 00:27:17,240 Speaker 4: I mean, I will say it is changing. But the 518 00:27:17,280 --> 00:27:20,520 Speaker 4: belief that we aren't good at it, that we couldn't 519 00:27:20,600 --> 00:27:26,040 Speaker 4: ever do, like, a COVIDtests.gov, drives politicians, pundits, 520 00:27:26,440 --> 00:27:29,880 Speaker 4: vendors to say government's bad at this, and so we 521 00:27:29,920 --> 00:27:33,600 Speaker 4: should outsource everything.
Now, of course we're going to have vendors, 522 00:27:34,000 --> 00:27:36,760 Speaker 4: and we should have vendors, but there has to be 523 00:27:36,920 --> 00:27:41,199 Speaker 4: enough internal capacity and competency to outsource well and to 524 00:27:41,280 --> 00:27:44,760 Speaker 4: make those decisions. Like, hey, let's have this thing only 525 00:27:44,880 --> 00:27:48,480 Speaker 4: ask, like, name, address, and, you know, not ask 526 00:27:48,560 --> 00:27:51,480 Speaker 4: for their health insurance, you know, numbers, and not, like, 527 00:27:52,040 --> 00:27:55,800 Speaker 4: do different numbers of tests per household. Like, that's an 528 00:27:55,840 --> 00:28:02,200 Speaker 4: internal capacity question that is really important. And we don't 529 00:28:02,960 --> 00:28:07,840 Speaker 4: have enough internal capacity for these kinds of decisions in 530 00:28:07,920 --> 00:28:10,240 Speaker 4: government, because we believe we're not good at it, and 531 00:28:10,240 --> 00:28:11,840 Speaker 4: we believe the only people who are good at it 532 00:28:11,840 --> 00:28:15,080 Speaker 4: are these outside companies. There's plenty of evidence that that's totally wrong. 533 00:28:32,119 --> 00:28:34,359 Speaker 1: I want to go back to internal capacity in a minute, 534 00:28:34,400 --> 00:28:38,240 Speaker 1: but since you mentioned the COVID test website, it's sort 535 00:28:38,240 --> 00:28:40,160 Speaker 1: of a good chance to pivot a little bit here.
536 00:28:40,600 --> 00:28:43,800 Speaker 1: You know, COVID specifically seemed to open people's minds a 537 00:28:43,840 --> 00:28:45,680 Speaker 1: little bit about how fast we can work and how 538 00:28:45,720 --> 00:28:49,360 Speaker 1: much money we can spend, and obviously a historically fast 539 00:28:49,400 --> 00:28:53,080 Speaker 1: pace of vaccine development that we'd never seen prior to that. 540 00:28:53,480 --> 00:28:55,680 Speaker 1: And so we had this brief moment where 541 00:28:55,680 --> 00:28:57,360 Speaker 1: it sort of opened people's minds, and I'm not sure 542 00:28:57,360 --> 00:28:59,920 Speaker 1: if it's closing again. But, you know, since both of 543 00:29:00,120 --> 00:29:05,440 Speaker 1: you worked on the California unemployment insurance system, and generally, 544 00:29:05,480 --> 00:29:07,720 Speaker 1: like, I think almost every state's, on some level, was 545 00:29:07,720 --> 00:29:11,320 Speaker 1: seen as something of a debacle. Like, what was, I guess, the 546 00:29:11,360 --> 00:29:15,640 Speaker 1: biggest eye opening thing that going into that environment taught 547 00:29:15,680 --> 00:29:18,160 Speaker 1: you about the system? And then, what was, like, 548 00:29:18,280 --> 00:29:20,720 Speaker 1: you know, what's the key sort of takeaway? Like, okay, 549 00:29:20,720 --> 00:29:23,080 Speaker 1: here's a thing that we can learn from this that, 550 00:29:23,120 --> 00:29:26,960 Speaker 1: whether it's UI in other states or more broadly, okay, 551 00:29:26,960 --> 00:29:29,160 Speaker 1: this is something that you learned that then can, like, 552 00:29:29,360 --> 00:29:30,560 Speaker 1: have extensible lessons. 553 00:29:31,160 --> 00:29:35,040 Speaker 4: So I think you're right. COVID had sort of twin 554 00:29:35,640 --> 00:29:38,640 Speaker 4: impacts on our thinking.
One was, holy cow, we can 555 00:29:38,680 --> 00:29:41,440 Speaker 4: actually do stuff, and the other one was, holy cow, 556 00:29:41,560 --> 00:29:44,880 Speaker 4: we're really screwed. Right? Yeah, that's one way to put it. 557 00:29:45,840 --> 00:29:48,680 Speaker 4: And there's evidence of both, which is really fair. 558 00:29:49,400 --> 00:29:51,680 Speaker 4: I mean, I think, for me, you know, I'd been 559 00:29:51,720 --> 00:29:54,840 Speaker 4: doing this for, you know, ten years when I got 560 00:29:54,840 --> 00:30:00,360 Speaker 4: pulled into the unemployment insurance, I'll call it a rescue. 561 00:30:00,600 --> 00:30:05,200 Speaker 4: We were clearing the backlog, and I felt like, nothing 562 00:30:05,280 --> 00:30:08,360 Speaker 4: here can scare me. I have seen the VA, I've 563 00:30:08,400 --> 00:30:13,200 Speaker 4: seen the operations of, you know, some pretty janky, you know, 564 00:30:14,120 --> 00:30:18,800 Speaker 4: government agencies. But I actually really did learn a lot 565 00:30:18,840 --> 00:30:22,440 Speaker 4: and was kind of shocked at how the unemployment insurance 566 00:30:22,440 --> 00:30:25,880 Speaker 4: system worked. And the metaphor that came to mind for 567 00:30:25,960 --> 00:30:27,959 Speaker 4: me was... like, everyone kept saying, well, it's a system, 568 00:30:28,040 --> 00:30:31,080 Speaker 4: obviously you can just fix it. And it isn't a system. 569 00:30:31,400 --> 00:30:33,400 Speaker 4: And I think that's true of other things we think 570 00:30:33,440 --> 00:30:37,440 Speaker 4: of as systems too. Like, it is archaeological layers of 571 00:30:37,520 --> 00:30:43,240 Speaker 4: technology that sort of map to archaeological layers of policy 572 00:30:44,000 --> 00:30:50,320 Speaker 4: that didn't ever get designed. It just accrued. Like, 573 00:30:50,400 --> 00:30:53,200 Speaker 4: nobody ever goes back and says, okay, how is this 574 00:30:53,280 --> 00:30:56,040 Speaker 4: going to work with this?
There's such little appetite for 575 00:30:56,080 --> 00:30:59,200 Speaker 4: anything that's, like, backward looking, that would actually update and 576 00:30:59,240 --> 00:31:03,760 Speaker 4: make these layers work together well, that it just isn't 577 00:31:03,840 --> 00:31:06,360 Speaker 4: what people think it is. When you go look at it, 578 00:31:06,480 --> 00:31:09,480 Speaker 4: you're just like, whoa, of course you can't do these things. 579 00:31:10,200 --> 00:31:12,760 Speaker 4: Like, this thing isn't connected in any meaningful way to 580 00:31:12,840 --> 00:31:15,160 Speaker 4: this thing, and, like, this was built in a certain 581 00:31:15,200 --> 00:31:18,719 Speaker 4: era for a certain purpose, and we just glommed stuff 582 00:31:18,760 --> 00:31:22,800 Speaker 4: onto it when we needed to, say, add internet access, 583 00:31:22,880 --> 00:31:26,719 Speaker 4: like, let people apply online. I mean, the biggest thing was, 584 00:31:26,760 --> 00:31:29,440 Speaker 4: like, I think we haven't grappled with the fact that 585 00:31:29,880 --> 00:31:32,400 Speaker 4: when these systems were quote unquote designed, I actually want 586 00:31:32,520 --> 00:31:34,320 Speaker 4: to say when they were started, because they really have 587 00:31:34,520 --> 00:31:37,520 Speaker 4: never been designed, a lot of them, and there's a 588 00:31:37,600 --> 00:31:41,680 Speaker 4: huge advantage to actually doing some design. But you went 589 00:31:41,720 --> 00:31:44,040 Speaker 4: into an office and showed who you were. Like, you 590 00:31:44,560 --> 00:31:48,760 Speaker 4: identified you, you validated your identity by going in. And 591 00:31:48,800 --> 00:31:51,280 Speaker 4: then we moved online and we never figured out how 592 00:31:51,280 --> 00:31:55,320 Speaker 4: to validate people's identity. And there were a number of 593 00:31:55,360 --> 00:31:59,280 Speaker 4: problems in California with unemployment insurance.
The fact that, you 594 00:31:59,280 --> 00:32:01,240 Speaker 4: know, it took twenty five years to learn how to 595 00:32:01,280 --> 00:32:05,360 Speaker 4: do it meant that any claim that couldn't be 596 00:32:05,440 --> 00:32:11,400 Speaker 4: processed automatically was a huge, huge bottleneck. And you couldn't 597 00:32:11,400 --> 00:32:13,640 Speaker 4: add claims processors to do that. Like, you would have 598 00:32:13,680 --> 00:32:16,440 Speaker 4: to go back in time twenty five years, start new 599 00:32:16,480 --> 00:32:19,200 Speaker 4: claims processors, and have them ready to come online during 600 00:32:19,400 --> 00:32:23,360 Speaker 4: a downturn, right? Like, that's never going to scale. But 601 00:32:23,440 --> 00:32:27,280 Speaker 4: the other big problem was that you only went 602 00:32:27,400 --> 00:32:32,640 Speaker 4: through the automatic sort of pipeline if they felt like 603 00:32:32,760 --> 00:32:35,440 Speaker 4: they could verify your identity, but they weren't really doing 604 00:32:35,520 --> 00:32:42,400 Speaker 4: any meaningful identity verification. And we, unlike most other industrialized countries, 605 00:32:43,040 --> 00:32:46,000 Speaker 4: don't have a national system for that. Like, we don't 606 00:32:46,000 --> 00:32:50,280 Speaker 4: have a national technology platform for that. But also, like, 607 00:32:50,360 --> 00:32:53,120 Speaker 4: again, back to this, it all derives from, like, the legal 608 00:32:53,120 --> 00:32:57,640 Speaker 4: and policy framework, but we don't actually have a reasonable policy framework 609 00:32:58,080 --> 00:33:00,720 Speaker 4: for identifying people and knowing who they are when we 610 00:33:00,720 --> 00:33:04,680 Speaker 4: start to do a transaction. And that is causing problems everywhere, 611 00:33:04,840 --> 00:33:07,640 Speaker 4: and there are interesting conversations about how to fix it.
612 00:33:07,640 --> 00:33:11,640 Speaker 4: There's some stuff out there, but it's really contested, 613 00:33:12,240 --> 00:33:15,360 Speaker 4: and it's not just a problem for the pandemic. It's 614 00:33:15,360 --> 00:33:16,680 Speaker 4: going to continue to be a problem. 615 00:33:16,800 --> 00:33:19,920 Speaker 3: Yeah, I mean, I'll just add, my vantage point 616 00:33:20,120 --> 00:33:22,360 Speaker 3: was somewhat different, right? I was coming at it 617 00:33:22,360 --> 00:33:28,520 Speaker 3: from a different angle. I think the major thing that 618 00:33:28,600 --> 00:33:32,600 Speaker 3: I took away, especially with so much focus on technology 619 00:33:32,640 --> 00:33:35,000 Speaker 3: and so much focus on COBOL mainframes, was, 620 00:33:35,040 --> 00:33:35,280 Speaker 3: you know. 621 00:33:35,240 --> 00:33:39,360 Speaker 2: We need to do a COBOL, like, drinking game, 622 00:33:39,480 --> 00:33:42,240 Speaker 2: take a shot every time. 623 00:33:42,400 --> 00:33:44,360 Speaker 4: How long it took us to get to that 624 00:33:44,240 --> 00:33:47,360 Speaker 3: word. Oh, I can't believe it, you know. And the best 625 00:33:47,360 --> 00:33:49,000 Speaker 3: part is, I mean, my next point was that's 626 00:33:49,040 --> 00:33:52,120 Speaker 3: misplaced causation, and yet I am the one who brought 627 00:33:52,160 --> 00:33:57,479 Speaker 3: up COBOL. Yeah. I mean, I think, 628 00:33:58,320 --> 00:34:00,800 Speaker 3: you know, there was so much focus on, oh, 629 00:34:00,840 --> 00:34:04,160 Speaker 3: these tech systems are falling over, they're not good.
I 630 00:34:04,200 --> 00:34:08,080 Speaker 3: think the number one lesson I learned was, 631 00:34:08,920 --> 00:34:10,359 Speaker 3: and I think this is true, but we don't tend 632 00:34:10,400 --> 00:34:12,719 Speaker 3: to think about it, that the technology systems are 633 00:34:12,719 --> 00:34:15,960 Speaker 3: part of a larger sociotechnical system. What I mean by 634 00:34:15,960 --> 00:34:19,080 Speaker 3: that is, as Jen mentioned, you have staff who can 635 00:34:19,080 --> 00:34:21,480 Speaker 3: do stuff manually, and you have a system that might 636 00:34:21,480 --> 00:34:24,080 Speaker 3: be able to do stuff in an automated way. One 637 00:34:24,120 --> 00:34:27,480 Speaker 3: of the problems with programs that are so complex is, 638 00:34:27,480 --> 00:34:29,759 Speaker 3: if you have a massive spike of volume and your 639 00:34:29,760 --> 00:34:32,960 Speaker 3: technology system isn't set up to just cover all those 640 00:34:33,000 --> 00:34:36,000 Speaker 3: cases in an automated way, you try to throw bodies 641 00:34:36,000 --> 00:34:37,640 Speaker 3: at it, and you try to hire people. But if 642 00:34:37,680 --> 00:34:39,720 Speaker 3: it takes a year and a half to train someone, 643 00:34:40,080 --> 00:34:42,960 Speaker 3: and, like, train someone to a relatively basic level, with 644 00:34:43,040 --> 00:34:45,560 Speaker 3: some of the complexity of these programs, you're not able 645 00:34:45,600 --> 00:34:51,640 Speaker 3: to hire fast enough. And unfortunately, you know, this is why... 646 00:34:54,360 --> 00:34:57,840 Speaker 4: We were hiring really fast. We hired five thousand people 647 00:34:58,320 --> 00:35:02,080 Speaker 4: in California to help process these claims. But because they 648 00:35:02,080 --> 00:35:05,520 Speaker 4: couldn't do anything, they were taking up the time of 649 00:35:05,600 --> 00:35:10,120 Speaker 4: the experienced claims processors.
And we calculated that every single 650 00:35:10,239 --> 00:35:15,759 Speaker 4: new hire the state made slowed processing down. And one 651 00:35:15,760 --> 00:35:17,839 Speaker 4: of the big things we did was actually just get 652 00:35:18,000 --> 00:35:22,239 Speaker 4: them to reassign staff away. I mean, some of it 653 00:35:22,280 --> 00:35:24,279 Speaker 4: was just reassigning staff to other things. Like, nobody was 654 00:35:24,320 --> 00:35:28,160 Speaker 4: opening the mail, which was pretty critical. And yet opening 655 00:35:28,200 --> 00:35:30,480 Speaker 4: the mail is something you can do if you just 656 00:35:30,840 --> 00:35:33,960 Speaker 4: joined and have no background. But, like, yeah, it was 657 00:35:34,280 --> 00:35:36,799 Speaker 4: pretty funny. So sorry to interrupt, Dave. It's just, like, we 658 00:35:36,840 --> 00:35:39,680 Speaker 4: were hiring. It's just that the hiring was having a 659 00:35:39,719 --> 00:35:40,520 Speaker 4: perverse effect. 660 00:35:40,960 --> 00:35:44,040 Speaker 3: Well, and I guess that's my point. That's 661 00:35:44,080 --> 00:35:48,600 Speaker 3: exactly right. And I guess the number one lesson I 662 00:35:48,760 --> 00:35:52,440 Speaker 3: learned was, if you look at the history of a program 663 00:35:52,840 --> 00:35:56,040 Speaker 3: like unemployment insurance, specifically UI, you have this 664 00:35:56,440 --> 00:35:59,080 Speaker 3: thing where once every ten years or so there's a 665 00:35:59,080 --> 00:36:01,480 Speaker 3: major recession, or, in this case, there's a pandemic, which 666 00:36:01,520 --> 00:36:03,840 Speaker 3: was like a ten x recession, right, because it was 667 00:36:03,880 --> 00:36:07,040 Speaker 3: overnight as opposed to gradual. It was overnight.
So much of 668 00:36:07,040 --> 00:36:09,560 Speaker 3: the economy, so much of the working force, the labor force, 669 00:36:09,760 --> 00:36:12,600 Speaker 3: was just out of work immediately, so all those claims 670 00:36:12,600 --> 00:36:15,120 Speaker 3: came in simultaneously. But we have this cycle of, every 671 00:36:15,200 --> 00:36:17,000 Speaker 3: ten years or so, a wave of recession, or, I mean, 672 00:36:17,040 --> 00:36:18,960 Speaker 3: I don't know, I'm a non-economist, I can't tell you 673 00:36:19,000 --> 00:36:21,960 Speaker 3: exactly why that is, but it happens in a recurring way, 674 00:36:22,120 --> 00:36:25,560 Speaker 3: and then everyone gets frustrated with how difficult it is 675 00:36:25,600 --> 00:36:28,839 Speaker 3: to get unemployment insurance because they're overloaded with volume. And 676 00:36:28,880 --> 00:36:32,600 Speaker 3: then the claims volume goes down, the recession ends, and then 677 00:36:33,480 --> 00:36:35,720 Speaker 3: there honestly isn't a lot of focus 678 00:36:35,760 --> 00:36:38,960 Speaker 3: on, okay, we're gonna sort of put more money into 679 00:36:39,000 --> 00:36:41,040 Speaker 3: the UI system so that we're ready for next time. 680 00:36:41,560 --> 00:36:45,520 Speaker 3: But unfortunately, you can't have resilient systems if you optimize 681 00:36:45,520 --> 00:36:48,680 Speaker 3: for pure, pure efficiency, like in the down years, 682 00:36:48,719 --> 00:36:50,440 Speaker 3: in the years when you have very, very low volume, 683 00:36:51,200 --> 00:36:53,680 Speaker 3: if you are just going to cut staff down to 684 00:36:53,880 --> 00:36:56,759 Speaker 3: a very, very bare-bones skeleton crew.
It comes back to 685 00:36:56,800 --> 00:36:58,880 Speaker 3: this point of, not only are you not able to 686 00:36:58,880 --> 00:37:01,920 Speaker 3: do preparation if you don't have the funding to have staff, 687 00:37:02,280 --> 00:37:05,640 Speaker 3: you can't even make the changes that would be good, 688 00:37:05,719 --> 00:37:07,960 Speaker 3: and you can't, when you get the money in a 689 00:37:08,040 --> 00:37:10,439 Speaker 3: new recession, hire the people, because you can't train them, 690 00:37:10,640 --> 00:37:12,160 Speaker 3: and so we kind of get stuck in the cycle. 691 00:37:12,239 --> 00:37:14,920 Speaker 3: So that was the number one, from a systems modeling 692 00:37:14,960 --> 00:37:16,799 Speaker 3: perspective, lesson I took away from it. 693 00:37:16,840 --> 00:37:19,440 Speaker 4: I've got to add, though, you know, we 694 00:37:19,560 --> 00:37:23,480 Speaker 4: did invest several billion across states during the Great Recession, 695 00:37:23,719 --> 00:37:26,839 Speaker 4: or after the Great Recession, so I think about half 696 00:37:26,880 --> 00:37:30,520 Speaker 4: of the states technically modernized then. They did take advantage 697 00:37:30,520 --> 00:37:34,359 Speaker 4: of pretty significant funds. None of those states did any 698 00:37:34,400 --> 00:37:39,279 Speaker 4: better than average in this downturn. So I agree with Dave, 699 00:37:39,400 --> 00:37:43,080 Speaker 4: and I would argue that taking a different approach to 700 00:37:43,280 --> 00:37:48,719 Speaker 4: quote unquote modernization, to policy and process simplification, to, you know, 701 00:37:48,760 --> 00:37:54,440 Speaker 4: having goal-driven modernization, like, all of those things is 702 00:37:54,560 --> 00:37:59,480 Speaker 4: equally important as investing during the downturn. Investing the way 703 00:37:59,520 --> 00:38:03,000 Speaker 4: we have been doing it is not working.
There are 704 00:38:03,040 --> 00:38:05,719 Speaker 2: so many different threads to pick out of there. But 705 00:38:06,160 --> 00:38:07,960 Speaker 2: just on that last point, you know, when you were 706 00:38:08,000 --> 00:38:10,319 Speaker 2: talking about how a lot of these government systems were 707 00:38:10,360 --> 00:38:13,520 Speaker 2: not designed to be that way, I kept thinking of 708 00:38:13,560 --> 00:38:16,640 Speaker 2: a lot of the software used by the banks, where, 709 00:38:16,800 --> 00:38:20,080 Speaker 2: you know, you think of, like, JP Morgan's 710 00:38:20,120 --> 00:38:24,520 Speaker 2: software system. It wasn't designed to be JP Morgan's software system. 711 00:38:24,520 --> 00:38:28,040 Speaker 2: It's the result of many, many acquisitions of other banks, 712 00:38:28,040 --> 00:38:30,320 Speaker 2: and, you know, every time they take over this business, 713 00:38:30,400 --> 00:38:33,759 Speaker 2: it gets sort of duct-taped onto the rest of 714 00:38:33,800 --> 00:38:37,880 Speaker 2: the system. But I'm curious, is there a moment that 715 00:38:38,000 --> 00:38:42,160 Speaker 2: you have seen in your careers at which someone says, 716 00:38:42,719 --> 00:38:46,000 Speaker 2: this system is just, like, irredeemable, we're going to have 717 00:38:46,080 --> 00:38:48,320 Speaker 2: to start from scratch? And, like, what is the trigger 718 00:38:48,360 --> 00:38:52,840 Speaker 2: point for making that decision, versus reaching for the Band-Aids 719 00:38:52,840 --> 00:38:55,759 Speaker 2: and the duct tape and just trying to make 720 00:38:55,800 --> 00:38:56,239 Speaker 2: it work? 721 00:38:56,960 --> 00:38:59,200 Speaker 3: I mean, and I forget if someone else has mentioned this.
722 00:38:59,600 --> 00:39:02,160 Speaker 3: In, like, the Patrick McKenzie episode, maybe he mentioned this. 723 00:39:02,160 --> 00:39:06,479 Speaker 3: But in general, a good rule of thumb is never 724 00:39:06,560 --> 00:39:09,839 Speaker 3: rewrite a system from scratch that's working. I mean, other 725 00:39:09,880 --> 00:39:13,680 Speaker 3: people might disagree with that. But the reason I don't 726 00:39:13,719 --> 00:39:16,200 Speaker 3: have a good answer for sort of a moment is, lots 727 00:39:16,200 --> 00:39:17,839 Speaker 3: of people have tried to do that thing where they 728 00:39:17,880 --> 00:39:20,440 Speaker 3: just rebuild it from scratch, and again it tends to 729 00:39:20,560 --> 00:39:24,520 Speaker 3: go poorly because, and this is the specific thing, the 730 00:39:24,600 --> 00:39:28,799 Speaker 3: old system encodes so much tacit knowledge and so many 731 00:39:28,880 --> 00:39:31,560 Speaker 3: of the many, many little details, and some of 732 00:39:31,600 --> 00:39:34,600 Speaker 3: those details might no longer be relevant, but many of 733 00:39:34,640 --> 00:39:40,760 Speaker 3: them might encode some really hard-learned truth about the world, 734 00:39:41,000 --> 00:39:43,799 Speaker 3: because at the end of the day, the software is 735 00:39:44,000 --> 00:39:46,520 Speaker 3: just modeling the real world. It's just saying, okay, well, 736 00:39:46,560 --> 00:39:49,840 Speaker 3: we have customers, they have addresses. Okay, so maybe you 737 00:39:49,840 --> 00:39:52,560 Speaker 3: have two address fields in your old system, and you think, oh, well, 738 00:39:52,640 --> 00:39:55,080 Speaker 3: let's just streamline that down to one. But then you 739 00:39:55,160 --> 00:39:58,120 Speaker 3: launch, your new system has one address, and everybody calls 740 00:39:58,120 --> 00:39:59,759 Speaker 3: you up, like, what are you doing?
Like, the 741 00:40:00,080 --> 00:40:02,880 Speaker 3: reason we have two addresses is because our billing 742 00:40:02,880 --> 00:40:05,520 Speaker 3: department is over here and the customer-facing address is different. 743 00:40:05,800 --> 00:40:08,680 Speaker 3: So I think the throw it away and start from 744 00:40:08,719 --> 00:40:13,000 Speaker 3: scratch idea is an impulse everybody has, but it has a 745 00:40:13,000 --> 00:40:17,000 Speaker 3: lot of risk as well, and I think the better 746 00:40:17,080 --> 00:40:19,680 Speaker 3: approach tends to be, can you get to a place 747 00:40:20,080 --> 00:40:25,600 Speaker 3: where you can start to incrementally change your systems as 748 00:40:25,600 --> 00:40:28,359 Speaker 3: you go, in small ways that are safe, where it 749 00:40:28,400 --> 00:40:31,000 Speaker 3: doesn't take, you know, it's not like we have 750 00:40:31,040 --> 00:40:34,759 Speaker 3: to test any change for six weeks before making it live. 751 00:40:35,520 --> 00:40:38,680 Speaker 3: We can ship weekly, we can ship daily, we can 752 00:40:38,680 --> 00:40:42,840 Speaker 3: ship multiple times a day. That tends to yield better things. 753 00:40:43,560 --> 00:40:47,239 Speaker 3: There are plenty of projects out there that were, we're going 754 00:40:47,280 --> 00:40:50,879 Speaker 3: to build this from scratch, from the bottom up. Most 755 00:40:50,880 --> 00:40:54,640 Speaker 3: of the ones I'm aware of don't have great launch stories. 756 00:40:54,920 --> 00:40:56,400 Speaker 3: Maybe Jen disagrees, but that's my view. 757 00:40:56,520 --> 00:40:58,960 Speaker 4: No, they're still in that mode of, like, get it all 758 00:40:58,960 --> 00:41:01,600 Speaker 4: in at once and then we're done, and we have sort 759 00:41:01,600 --> 00:41:04,600 Speaker 4: of maintenance, and so they're pretty bad. But I will 760 00:41:04,600 --> 00:41:07,480 Speaker 4: say two things.
One, on a practical level, just because 761 00:41:07,800 --> 00:41:10,239 Speaker 4: I was talking with a colleague on the way over 762 00:41:10,280 --> 00:41:13,239 Speaker 4: here about this, there are great examples now of what 763 00:41:13,320 --> 00:41:16,319 Speaker 4: Dave's talking about, like not waiting for this big bang 764 00:41:16,360 --> 00:41:19,640 Speaker 4: project, but saying, we are just going to be making 765 00:41:19,640 --> 00:41:22,520 Speaker 4: this a little bit better every day. And one of 766 00:41:22,560 --> 00:41:26,640 Speaker 4: them is New Jersey's Department of Labor. Their unemployment insurance 767 00:41:27,280 --> 00:41:30,960 Speaker 4: response during the pandemic was quite good, but they've also 768 00:41:31,080 --> 00:41:34,200 Speaker 4: learned from that and said, we're not going to wait 769 00:41:34,239 --> 00:41:36,719 Speaker 4: for some, like, big system to come that's going to 770 00:41:36,719 --> 00:41:39,720 Speaker 4: fix all of our problems. We have a team now 771 00:41:40,000 --> 00:41:44,759 Speaker 4: that can just incrementally fix things as we go, and 772 00:41:45,560 --> 00:41:49,000 Speaker 4: eventually that will involve some bigger investments as they learn X, Y, 773 00:41:49,040 --> 00:41:52,359 Speaker 4: and Z. But they're just not waiting, and it's just 774 00:41:52,400 --> 00:41:56,319 Speaker 4: a great, great example. But I will say, Tracy, like, 775 00:41:56,440 --> 00:41:58,920 Speaker 4: I think about what you're asking about all the time. 776 00:42:00,120 --> 00:42:02,480 Speaker 4: I don't think this will ever happen, but if, 777 00:42:03,480 --> 00:42:05,239 Speaker 4: if you do get a chance to sort of zero 778 00:42:05,320 --> 00:42:08,600 Speaker 4: base it and start over, you have to zero-base 779 00:42:08,920 --> 00:42:13,080 Speaker 4: the policy. Like, you can't.
Could you do a new 780 00:42:13,160 --> 00:42:17,799 Speaker 4: unemployment insurance system in California that encodes the twenty five 781 00:42:17,880 --> 00:42:21,080 Speaker 4: years of knowledge that those claims processors have? Well, I 782 00:42:21,080 --> 00:42:22,960 Speaker 4: mean, maybe AI will help you now. I don't want 783 00:42:22,960 --> 00:42:27,120 Speaker 4: to talk about AI, but, like, that's a problem. 784 00:42:27,360 --> 00:42:29,560 Speaker 4: Like, you know, I say in the book, there's this 785 00:42:29,640 --> 00:42:32,560 Speaker 4: great team that's working on this Medicare problem, and it's 786 00:42:32,600 --> 00:42:36,600 Speaker 4: like there's nine different definitions of even, like, what a 787 00:42:36,680 --> 00:42:40,080 Speaker 4: medical group is. Like, even the first question doctors are 788 00:42:40,080 --> 00:42:43,840 Speaker 4: supposed to answer is, like, wildly complex. And the 789 00:42:44,080 --> 00:42:47,160 Speaker 4: delivery team is sort of fighting with the policy team, 790 00:42:47,200 --> 00:42:50,439 Speaker 4: and she's like, I get that it's complex. It has 791 00:42:50,520 --> 00:42:54,080 Speaker 4: to make sense to a person. And they fight, fight, fight, 792 00:42:54,200 --> 00:42:58,000 Speaker 4: not over the tech, but, like, over collapsing 793 00:42:58,040 --> 00:43:00,560 Speaker 4: these nine definitions into two. They wanted to get to one. 794 00:43:00,600 --> 00:43:02,960 Speaker 4: They couldn't. At least they got to two. And, like, 795 00:43:03,160 --> 00:43:05,400 Speaker 4: this idea that it has to make sense to a person, 796 00:43:06,080 --> 00:43:09,080 Speaker 4: we've, like, lost track of that.
I mean, of course 797 00:43:09,080 --> 00:43:12,080 Speaker 4: you're frustrated when you apply for unemployment insurance. Like, it's 798 00:43:12,160 --> 00:43:16,440 Speaker 4: wildly complex in its policy and the process that accrued 799 00:43:16,440 --> 00:43:19,560 Speaker 4: from those policies. Like, if you were going to start over, 800 00:43:20,080 --> 00:43:22,200 Speaker 4: you can't start over with the tech. You would have 801 00:43:22,280 --> 00:43:25,640 Speaker 4: to say, okay, let's figure out, let's, like, look at 802 00:43:25,680 --> 00:43:30,560 Speaker 4: this ninety years of accumulated policy and say, what was 803 00:43:30,600 --> 00:43:34,160 Speaker 4: this actually trying to do? Can we kind of, you know, 804 00:43:34,239 --> 00:43:37,239 Speaker 4: just go back and, like, rationalize all the changes? Right, 805 00:43:37,239 --> 00:43:40,240 Speaker 4: it's not like, like I say, there's not one 806 00:43:40,480 --> 00:43:45,200 Speaker 4: binder of regulations that covers UI. There's ninety years of 807 00:43:45,320 --> 00:43:48,839 Speaker 4: memos that change the previous memo, that changed the previous thing, 808 00:43:49,120 --> 00:43:52,000 Speaker 4: that changed the previous thing. And you could at least 809 00:43:52,080 --> 00:43:55,520 Speaker 4: go back and, like, condense that and, like, figure out 810 00:43:55,560 --> 00:43:58,120 Speaker 4: all of the conflicts in it. It would actually, 811 00:43:58,120 --> 00:44:00,239 Speaker 4: I think, make a lot more sense. And I will be clear, 812 00:44:00,440 --> 00:44:02,759 Speaker 4: this is never going to happen. But if you just said, 813 00:44:02,880 --> 00:44:08,040 Speaker 4: let's design the policy for an unemployment insurance system that 814 00:44:08,200 --> 00:44:10,960 Speaker 4: makes sense in twenty twenty three, then you could 815 00:44:10,960 --> 00:44:12,360 Speaker 4: build tech for that.
816 00:44:12,400 --> 00:44:14,560 Speaker 1: There are so many different threads and places we could go. 817 00:44:14,600 --> 00:44:17,720 Speaker 1: It's such a fascinating conversation. I just have one more question, 818 00:44:18,000 --> 00:44:21,000 Speaker 1: and it sort of relates to something, Jennifer, that you 819 00:44:21,000 --> 00:44:25,120 Speaker 1: said earlier about how, in government, the tech people are 820 00:44:25,160 --> 00:44:29,240 Speaker 1: secondary to, like, the policy wonks and people designing 821 00:44:29,280 --> 00:44:31,640 Speaker 1: these things, and that that's a really big difference between 822 00:44:32,160 --> 00:44:35,279 Speaker 1: the public and private sector. And, you know, this 823 00:44:35,440 --> 00:44:38,600 Speaker 1: internal capacity, how much does that hinder? And I think 824 00:44:38,680 --> 00:44:41,239 Speaker 1: this is another thing that Patrick McKenzie brought up, but, 825 00:44:41,320 --> 00:44:44,880 Speaker 1: like, hiring, and the challenge that, you know, 826 00:44:44,960 --> 00:44:47,720 Speaker 1: public sector pay scales are not private sector pay scales, 827 00:44:47,760 --> 00:44:49,600 Speaker 1: I have to imagine, for a lot of these things. 828 00:44:49,840 --> 00:44:52,360 Speaker 1: He talked about it a little bit, and, you 829 00:44:52,360 --> 00:44:54,319 Speaker 1: know, you've worked on some of these, like, sort 830 00:44:54,320 --> 00:44:57,080 Speaker 1: of volunteer things that sort of bridge the private and 831 00:44:57,320 --> 00:44:59,480 Speaker 1: public sector. But can you talk a little bit about 832 00:44:59,560 --> 00:45:03,040 Speaker 1: that constraint, like, even just the 833 00:45:03,160 --> 00:45:06,680 Speaker 1: challenge, in those roles, of getting, like, experienced, 834 00:45:06,680 --> 00:45:07,360 Speaker 1: talented people?
835 00:45:08,080 --> 00:45:10,600 Speaker 4: I'm glad you brought that up, because it's a little 836 00:45:10,600 --> 00:45:13,200 Speaker 4: bit of a soapbox I like to get on, because 837 00:45:13,360 --> 00:45:16,400 Speaker 4: we have, you know, like, politicians, like, the leaders who 838 00:45:16,440 --> 00:45:20,120 Speaker 4: are so frustrated with delivery and technology, and, like, 839 00:45:20,480 --> 00:45:24,160 Speaker 4: generally they're well intentioned, but the sort of things they 840 00:45:24,200 --> 00:45:27,600 Speaker 4: push on are kind of unhelpful. You know, they kind 841 00:45:27,600 --> 00:45:31,000 Speaker 4: of increase the risk aversion of the bureaucracy by yelling 842 00:45:31,080 --> 00:45:35,080 Speaker 4: at them about things and not recognizing the constraints on the bureaucracy. 843 00:45:35,120 --> 00:45:37,799 Speaker 4: I call this the accountability trap. But there are things 844 00:45:37,880 --> 00:45:41,800 Speaker 4: they actually could do, and fixing the civil service and 845 00:45:41,840 --> 00:45:46,279 Speaker 4: civil service hiring rules is really, really important. I do 846 00:45:46,600 --> 00:45:49,640 Speaker 4: know what Patrick McKenzie said on your show about pay 847 00:45:49,680 --> 00:45:53,560 Speaker 4: scales, and I don't totally agree with that. 848 00:45:53,960 --> 00:45:57,640 Speaker 4: The biggest problem is not the pay, it's the time 849 00:45:57,680 --> 00:46:01,200 Speaker 4: to hire. So if it takes, as is really common 850 00:46:01,280 --> 00:46:05,040 Speaker 4: right now in federal government, nine months to make an offer, 851 00:46:05,400 --> 00:46:08,040 Speaker 4: of course you lose everybody you've got.
Actually, you know, 852 00:46:08,120 --> 00:46:10,880 Speaker 4: when I started this work, there was not, like, 853 00:46:11,080 --> 00:46:14,560 Speaker 4: a line of people out the door with great 854 00:46:14,640 --> 00:46:17,480 Speaker 4: tech and design skills wanting to work for government. Now 855 00:46:17,880 --> 00:46:20,560 Speaker 4: there actually is, and we can't get them in the door. 856 00:46:20,880 --> 00:46:23,479 Speaker 4: I mean, I know people who do wait nine months 857 00:46:23,520 --> 00:46:27,319 Speaker 4: for that offer because they really want to help, and 858 00:46:27,360 --> 00:46:30,919 Speaker 4: the pay is fine. Like, it's not fantastic. I think 859 00:46:30,920 --> 00:46:33,680 Speaker 4: he was comparing to, like, Google or whatever. But, like, you know, 860 00:46:34,400 --> 00:46:37,400 Speaker 4: compared to a startup, where, yes, you have the hope 861 00:46:37,400 --> 00:46:40,000 Speaker 4: of getting an exit, but you're probably not gonna make, 862 00:46:40,160 --> 00:46:42,879 Speaker 4: like, five hundred thousand dollars as a programmer. You're 863 00:46:42,880 --> 00:46:44,919 Speaker 4: gonna make, what, half of that? I mean, we don't 864 00:46:44,960 --> 00:46:46,960 Speaker 4: have to get into, like, exact numbers, but, like, the 865 00:46:47,000 --> 00:46:51,240 Speaker 4: pay is in the right ballpark for, like, mid-level folks. 866 00:46:51,360 --> 00:46:54,080 Speaker 4: It's not going to pay at the executive level. But, 867 00:46:54,160 --> 00:46:56,439 Speaker 4: as he mentioned, like, at the executive level, if you've 868 00:46:56,440 --> 00:46:58,600 Speaker 4: been at Google for twenty years, pay is not your 869 00:46:58,640 --> 00:47:00,879 Speaker 4: biggest concern. I do want to push back a little 870 00:47:00,880 --> 00:47:02,920 Speaker 4: bit on the notion that that is who is in government.
871 00:47:02,960 --> 00:47:05,439 Speaker 4: There are some of those people, but there are far 872 00:47:05,520 --> 00:47:08,560 Speaker 4: more people who just have good tech and design skills, 873 00:47:08,920 --> 00:47:12,280 Speaker 4: who just want to make a difference more than anything else. 874 00:47:12,360 --> 00:47:16,120 Speaker 4: And we should be working on all of the constraints 875 00:47:16,360 --> 00:47:20,839 Speaker 4: that are keeping that hiring process at nine months, more 876 00:47:20,920 --> 00:47:24,200 Speaker 4: than we're, like, calling up the people who 877 00:47:25,440 --> 00:47:28,200 Speaker 4: manage the Department of Labor in a state and yelling 878 00:47:28,239 --> 00:47:30,840 Speaker 4: at them because they had these failures. Like, fix the 879 00:47:31,040 --> 00:47:34,759 Speaker 4: environment in which they're working. Give them that capacity; then 880 00:47:34,840 --> 00:47:37,320 Speaker 4: you can hold them accountable. But they can't hire people 881 00:47:37,360 --> 00:47:37,799 Speaker 4: right now. 882 00:47:38,760 --> 00:47:40,880 Speaker 3: You also need to be able to hire people, 883 00:47:40,920 --> 00:47:44,080 Speaker 3: and you need them to be able to use their 884 00:47:44,280 --> 00:47:50,080 Speaker 3: judgment to make higher-order decisions than just, do we 885 00:47:50,280 --> 00:47:53,200 Speaker 3: enter the phone number with the keyboard or by clicking 886 00:47:53,280 --> 00:47:56,080 Speaker 3: on a virtual pad of numbers? Like, it is also 887 00:47:56,120 --> 00:47:58,759 Speaker 3: the case that some of these systems are the way 888 00:47:58,760 --> 00:48:04,680 Speaker 3: they are because it's just, Jen, this might be your phrase, 889 00:48:04,920 --> 00:48:08,320 Speaker 3: it's like policy vomit. It's just the rules put onto 890 00:48:08,760 --> 00:48:11,439 Speaker 3: a screen, just, like, all the rules in statute put 891 00:48:11,480 --> 00:48:14,439 Speaker 3: onto a screen.
And so I think the thing I'd 892 00:48:14,480 --> 00:48:18,520 Speaker 3: like to add is, the whole point of product management 893 00:48:19,560 --> 00:48:26,360 Speaker 3: is to arbitrate the trade-offs between user experience, compliance goals, 894 00:48:26,760 --> 00:48:30,279 Speaker 3: technical cost of change, and technical trade-offs. And if 895 00:48:30,280 --> 00:48:33,520 Speaker 3: you don't give those people that, if you hire them but 896 00:48:33,920 --> 00:48:36,880 Speaker 3: they don't have the ability to say, well, maybe we 897 00:48:36,960 --> 00:48:40,160 Speaker 3: need to simplify this to make it make sense to 898 00:48:40,400 --> 00:48:44,400 Speaker 3: a person, if instead the only response they get is, 899 00:48:44,640 --> 00:48:47,399 Speaker 3: we need to put on the page exactly what's 900 00:48:47,440 --> 00:48:52,279 Speaker 3: in statute, or whatever other compliance goals there are, like, oh, 901 00:48:52,320 --> 00:48:54,920 Speaker 3: we need to strictly follow this process, it's procedurally what 902 00:48:54,960 --> 00:48:57,360 Speaker 3: it is, even if those are counter to the outcome. You 903 00:48:57,440 --> 00:49:00,200 Speaker 3: need a change in government where outcomes start to be, 904 00:49:00,280 --> 00:49:04,880 Speaker 3: outcomes in this area around technology start to sort 905 00:49:04,880 --> 00:49:08,880 Speaker 3: of have a higher importance than the strict process and 906 00:49:08,920 --> 00:49:11,080 Speaker 3: the strict procedures.
And I do think that that's a 907 00:49:11,120 --> 00:49:16,520 Speaker 3: necessary corollary to bringing smart, capable people in, because if 908 00:49:16,560 --> 00:49:23,000 Speaker 3: you bring in a superb product manager and 909 00:49:23,280 --> 00:49:25,680 Speaker 3: they have no ability to sort of bring these trade- 910 00:49:25,719 --> 00:49:28,719 Speaker 3: offs up and make those calls on the ground, you're 911 00:49:28,760 --> 00:49:30,359 Speaker 3: going to get the same status quo. 912 00:49:30,440 --> 00:49:30,800 Speaker 4: I think. 913 00:49:31,360 --> 00:49:34,799 Speaker 2: So I can't resist asking this question. I take all 914 00:49:34,800 --> 00:49:36,560 Speaker 2: the points about, like, here's what we need to do 915 00:49:36,680 --> 00:49:38,839 Speaker 2: in order for this to change. But can I just ask, 916 00:49:39,200 --> 00:49:43,839 Speaker 2: because both of you have that practical experience of doing this, 917 00:49:44,160 --> 00:49:47,359 Speaker 2: what was the craziest thing that you saw during your 918 00:49:47,520 --> 00:49:51,440 Speaker 2: respective tenures? Like, what was something that just really surprised 919 00:49:51,520 --> 00:49:54,160 Speaker 2: you or shocked you? I'm back to the banging your 920 00:49:54,200 --> 00:49:56,640 Speaker 2: head on the table portion of this conversation. 921 00:49:57,440 --> 00:49:59,640 Speaker 4: I mean, I guess the thing that comes to mind, 922 00:50:00,719 --> 00:50:05,200 Speaker 4: just because it really made an impression on me, and 923 00:50:05,239 --> 00:50:08,040 Speaker 4: it's a negative story that I want to sort 924 00:50:08,080 --> 00:50:11,960 Speaker 4: of balance out with something positive.
But I was working 925 00:50:12,000 --> 00:50:16,520 Speaker 4: in the White House, and we had 926 00:50:16,560 --> 00:50:20,200 Speaker 4: convinced the Veterans Administration to do what we call a 927 00:50:20,280 --> 00:50:25,080 Speaker 4: discovery sprint on the Veterans Benefits Management System. And it 928 00:50:25,200 --> 00:50:29,319 Speaker 4: was going really slowly. There was, like, huge latency. And 929 00:50:29,360 --> 00:50:31,520 Speaker 4: I got these two folks in to help out for 930 00:50:31,680 --> 00:50:33,600 Speaker 4: just a couple of weeks and just, you know, do 931 00:50:33,640 --> 00:50:35,759 Speaker 4: an analysis of what was wrong before they go, like, 932 00:50:35,880 --> 00:50:37,759 Speaker 4: you know, trying to fix it. You can't solve the problem 933 00:50:37,840 --> 00:50:39,919 Speaker 4: till you know what it is. So, like, the first 934 00:50:40,000 --> 00:50:42,719 Speaker 4: thing was, we show up and the guy's like, oh, 935 00:50:42,760 --> 00:50:44,920 Speaker 4: I'm glad they sent the White House to, you know, 936 00:50:45,400 --> 00:50:49,240 Speaker 4: verify that everything's fine now. And it turns out everything 937 00:50:49,280 --> 00:50:53,279 Speaker 4: was fine because the leader had defined acceptable latency as 938 00:50:53,400 --> 00:50:58,000 Speaker 4: under two minutes. Okay, so you've just defined the problem away. 939 00:50:58,040 --> 00:51:01,320 Speaker 4: So people who were trying to process benefit applications, 940 00:51:01,360 --> 00:51:04,000 Speaker 4: like, if they clicked and it took one minute and 941 00:51:04,080 --> 00:51:07,840 Speaker 4: fifty nine seconds for the application to respond, it was fine, 942 00:51:08,440 --> 00:51:12,120 Speaker 4: no problem here. Wow. But then I kept talking to 943 00:51:12,160 --> 00:51:14,799 Speaker 4: the guy and I kept asking him questions about the system, like, well, 944 00:51:14,840 --> 00:51:16,960 Speaker 4: why is it designed this way?
Why was it designed 945 00:51:16,960 --> 00:51:19,760 Speaker 4: that way? And he said, I've spent my career teaching 946 00:51:19,840 --> 00:51:22,799 Speaker 4: people not to have an opinion on business requirements. The 947 00:51:22,880 --> 00:51:26,919 Speaker 4: IT people should have no opinions there. And I said, why? 948 00:51:27,400 --> 00:51:29,520 Speaker 4: He said, well, if they ask us to build a 949 00:51:29,560 --> 00:51:34,319 Speaker 4: concrete boat, we'll build a concrete boat, because then it's 950 00:51:34,400 --> 00:51:37,919 Speaker 4: not our fault. And I was just, like, I felt 951 00:51:37,960 --> 00:51:41,600 Speaker 4: like I'd been gut-punched. Like, it was so hard 952 00:51:41,640 --> 00:51:44,080 Speaker 4: to hear that. Like, there were, I think at the time, 953 00:51:44,200 --> 00:51:49,040 Speaker 4: eighteen veterans a day committing suicide, most of whom couldn't 954 00:51:49,120 --> 00:51:53,160 Speaker 4: get their healthcare benefits. And I just couldn't understand how 955 00:51:53,160 --> 00:51:56,840 Speaker 4: someone could take that approach. But honestly, you know, in 956 00:51:56,880 --> 00:51:58,960 Speaker 4: the ten years since then, I've seen a lot of 957 00:51:59,000 --> 00:52:02,359 Speaker 4: reasons why he felt like he could say that. And, 958 00:52:02,560 --> 00:52:03,920 Speaker 4: you know, I don't want to blame him, I want 959 00:52:03,920 --> 00:52:08,120 Speaker 4: to blame the system. But, yeah, if you tell us 960 00:52:08,120 --> 00:52:10,640 Speaker 4: to build a concrete boat, we'll build a concrete boat. Like, 961 00:52:10,719 --> 00:52:13,480 Speaker 4: what way is that to run a country? 962 00:52:13,840 --> 00:52:17,600 Speaker 3: I have a much more technical problem, and please yell 963 00:52:17,640 --> 00:52:19,760 Speaker 3: at me if this is too technically in the weeds.
964 00:52:19,920 --> 00:52:23,120 Speaker 3: So at one point, in my old work on SNAP, 965 00:52:23,200 --> 00:52:26,719 Speaker 3: we observed a system, one of the 966 00:52:26,760 --> 00:52:29,600 Speaker 3: official systems that was user facing, so external users, like 967 00:52:29,719 --> 00:52:31,839 Speaker 3: folks trying to apply, essentially use it. And 968 00:52:31,880 --> 00:52:35,839 Speaker 3: we noticed one day that it wasn't loading right, and 969 00:52:35,880 --> 00:52:38,440 Speaker 3: it wasn't loading in the following way. You would go 970 00:52:38,480 --> 00:52:41,880 Speaker 3: to the website, it would have buttons and everything loaded, 971 00:52:42,600 --> 00:52:44,600 Speaker 3: but then when you tried to click on anything, it 972 00:52:44,600 --> 00:52:47,960 Speaker 3: wouldn't work, and it wouldn't work, weirdly, for about a 973 00:52:48,120 --> 00:52:51,600 Speaker 3: minute and a half. Like, then all of a sudden 974 00:52:51,600 --> 00:52:54,200 Speaker 3: you could click on things, and we were like, what is 975 00:52:54,239 --> 00:52:59,440 Speaker 3: going on here? This is bizarre. And we looked around, 976 00:52:59,239 --> 00:53:02,080 Speaker 3: and so we did some debugging on the client side, 977 00:53:02,080 --> 00:53:04,279 Speaker 3: meaning, like, we opened up the web browser inspector and 978 00:53:04,320 --> 00:53:06,360 Speaker 3: we said, okay, what's going on? And what we found 979 00:53:06,560 --> 00:53:11,120 Speaker 3: was this website was loading, I forget exactly the name, 980 00:53:11,120 --> 00:53:14,560 Speaker 3: it was like translations dot js or translations dot xml, 981 00:53:15,600 --> 00:53:17,320 Speaker 3: and it was taking a minute and a half.
Everything 982 00:53:17,320 --> 00:53:19,560 Speaker 3: else was loading fine, but this was taking a minute 983 00:53:19,520 --> 00:53:23,400 Speaker 3: and a half, and, counter to best practice, it was, 984 00:53:24,320 --> 00:53:27,120 Speaker 3: you know, blocking any interaction with the page until it 985 00:53:27,160 --> 00:53:29,480 Speaker 3: was fully loaded. And it was loading very, very slowly. 986 00:53:29,840 --> 00:53:31,399 Speaker 3: And we looked at it, and it was this giant file. 987 00:53:31,440 --> 00:53:32,560 Speaker 3: I was like, I don't know, it was like fifty 988 00:53:32,600 --> 00:53:34,839 Speaker 3: megabytes, a giant file. Like, what is going on here? 989 00:53:35,120 --> 00:53:38,520 Speaker 3: Looked into it, and it was a translation of every 990 00:53:39,120 --> 00:53:45,600 Speaker 3: page on the entire system into, like, eight languages, and 991 00:53:46,680 --> 00:53:49,399 Speaker 3: we saw the thing and were like, okay, this 992 00:53:49,480 --> 00:53:51,600 Speaker 3: is not great, and there's ways to fix that. Like 993 00:53:51,640 --> 00:53:53,760 Speaker 3: I said, you can let people interact with the page 994 00:53:53,840 --> 00:53:56,600 Speaker 3: before that's fully loaded, and it's fine. Also, you 995 00:53:57,080 --> 00:53:59,600 Speaker 3: definitely shouldn't put all the translations for every page on 996 00:53:59,600 --> 00:54:02,160 Speaker 3: your website in eight languages into a single file that 997 00:54:02,200 --> 00:54:05,400 Speaker 3: you send to the client. Like, that's just bad development practice. 998 00:54:05,480 --> 00:54:09,680 Speaker 3: But the thing that maddened me is, we, being good 999 00:54:10,200 --> 00:54:13,440 Speaker 3: hearted people who were not out to, you know, just 1000 00:54:13,440 --> 00:54:17,400 Speaker 3: sort of lob bombs and talk trash, contacted the 1001 00:54:17,560 --> 00:54:20,240 Speaker 3: entity who ran it and we said, hey, this is happening.
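[Editor's note: the fix Dave alludes to, serving small per-page, per-language translation bundles instead of one monolithic file, can be sketched in a few lines. This is a hypothetical illustration in Python; the data shapes, function names, and example strings are invented, not the actual system's code.]

```python
# Hypothetical sketch: instead of shipping one monolithic translations file
# covering every page in every language, split it into per-(page, language)
# bundles so a client fetches only the strings it actually needs.

def split_bundles(translations):
    """Split {page: {lang: {key: text}}} into one bundle per (page, lang)."""
    return {
        (page, lang): strings
        for page, by_lang in translations.items()
        for lang, strings in by_lang.items()
    }

def payload_size(bundle):
    """Rough payload size in characters, standing in for bytes on the wire."""
    return sum(len(key) + len(text) for key, text in bundle.items())

# Invented example data: two pages, two languages.
translations = {
    "apply":  {"en": {"title": "Apply for benefits"},
               "es": {"title": "Solicitar beneficios"}},
    "status": {"en": {"title": "Check your claim status"},
               "es": {"title": "Consultar el estado de su solicitud"}},
}

bundles = split_bundles(translations)
whole_blob = sum(payload_size(b) for b in bundles.values())
# A visitor on the English "apply" page now downloads a fraction of the blob.
assert payload_size(bundles[("apply", "en")]) < whole_blob
```

The other half of the fix, not blocking interaction until translations arrive, is a client-side loading-order question; the point of the sketch is only that the payload a user waits for should be proportional to the page they are on, not to the whole site.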
1002 00:54:20,480 --> 00:54:22,960 Speaker 3: We're seeing it consistently, this is the issue that's going on. 1003 00:54:23,680 --> 00:54:25,839 Speaker 3: And they said, no, no, no, no, that's not happening. Loads 1004 00:54:25,920 --> 00:54:31,160 Speaker 3: totally fine for us. And the reason is because, we 1005 00:54:31,880 --> 00:54:33,839 Speaker 3: diagnosed it down to one of two reasons. One was 1006 00:54:33,960 --> 00:54:36,000 Speaker 3: they'd already loaded the file once, so it was cached, 1007 00:54:36,040 --> 00:54:37,400 Speaker 3: and so it was on their computers. Every time they 1008 00:54:37,440 --> 00:54:39,239 Speaker 3: went to it, it was fine, and like, oh yeah, 1009 00:54:39,239 --> 00:54:42,120 Speaker 3: it works for us. The other, and this was 1010 00:54:42,160 --> 00:54:45,640 Speaker 3: our better guess, is the server that was serving this 1011 00:54:45,960 --> 00:54:48,960 Speaker 3: was on their network and actually sending it much more quickly, 1012 00:54:49,480 --> 00:54:51,520 Speaker 3: and they were just like, well, it works on our network. 1013 00:54:51,520 --> 00:54:54,160 Speaker 3: It works totally fine. And I don't know exactly what 1014 00:54:54,200 --> 00:54:56,640 Speaker 3: it was. But I think that is the reason it's 1015 00:54:56,680 --> 00:54:59,440 Speaker 3: so maddening. And it gets back a little bit 1016 00:54:59,440 --> 00:55:02,680 Speaker 3: to Jen's point about accountability: the truth is, we 1017 00:55:02,760 --> 00:55:06,040 Speaker 3: can't fix problems that we don't see or don't recognize. 1018 00:55:06,280 --> 00:55:09,279 Speaker 3: And I do think that a big part of 1019 00:55:09,320 --> 00:55:13,200 Speaker 3: all of this is we have some expectations that websites 1020 00:55:13,200 --> 00:55:15,880 Speaker 3: should work, that technology should work. We also need to 1021 00:55:15,960 --> 00:55:18,919 Speaker 3: give more.
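The caching explanation can be made concrete with a toy model (hypothetical, not the actual system): anyone who has already downloaded the file once never hits the slow path again, which is exactly why the operators could not reproduce what first-time users were seeing.

```typescript
// Simulated fetch: the first request for a URL "downloads" it;
// later requests hit a local cache and never touch the network.
const cache = new Map<string, string>();
let downloads = 0; // counts slow, full downloads (the 50 MB transfer users hit)

function fetchFile(url: string): string {
  const cached = cache.get(url);
  if (cached !== undefined) return cached; // warm cache: instant, looks "fine"
  downloads++; // cold cache: this is the slow path first-time users experience
  const body = `contents of ${url}`;
  cache.set(url, body);
  return body;
}

fetchFile("translations.js"); // a first-time visitor: slow download
fetchFile("translations.js"); // an operator re-testing: cache hit, no download
console.log(downloads); // prints 1
```

The same masking happens with the second diagnosis in the story: testing from the same fast network as the server hides the slow transfer that external users see.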
We have to set an expectation, I think, 1022 00:55:19,040 --> 00:55:22,960 Speaker 3: government wide, about what stuff working looks like. Like, Jen 1023 00:55:23,000 --> 00:55:26,360 Speaker 3: gave an implicit one: a normal human should be 1024 00:55:26,400 --> 00:55:29,439 Speaker 3: able to understand this. That should be tested on every 1025 00:55:29,440 --> 00:55:32,680 Speaker 3: system. In the e-commerce world, like, what's the number 1026 00:55:32,719 --> 00:55:35,200 Speaker 3: one metric you care about? It's, from start to finish, 1027 00:55:35,280 --> 00:55:37,840 Speaker 3: how many people start a transaction and actually end up purchasing, 1028 00:55:37,960 --> 00:55:40,360 Speaker 3: that conversion rate. I mean, it would be a 1029 00:55:40,360 --> 00:55:43,239 Speaker 3: really useful feedback loop to measure the success rate of 1030 00:55:43,360 --> 00:55:46,680 Speaker 3: users trying to do a transaction across every transaction in government, 1031 00:55:46,760 --> 00:55:49,360 Speaker 3: right, like people applying for food stamps, people applying for whatever. 1032 00:55:49,760 --> 00:55:53,640 Speaker 3: If you saw massive drop off or massive disparities across systems, 1033 00:55:53,800 --> 00:55:56,600 Speaker 3: at least it would tell you, oh, there's something we 1034 00:55:56,719 --> 00:55:59,480 Speaker 3: need to fix here.
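The conversion-rate feedback loop described here takes only a few lines to sketch. The transaction names and counts below are made up for illustration:

```typescript
// A government "transaction" funnel: how many users start it vs. finish it.
interface Funnel {
  name: string;
  started: number;
  completed: number;
}

// Completion rate, the e-commerce-style conversion metric described above.
function conversionRate(f: Funnel): number {
  return f.started === 0 ? 0 : f.completed / f.started;
}

// Flag transactions whose completion rate falls below a threshold,
// surfacing the "massive drop off" cases worth investigating.
function flagDropoffs(funnels: Funnel[], threshold: number = 0.5): string[] {
  return funnels.filter((f) => conversionRate(f) < threshold).map((f) => f.name);
}

const funnels: Funnel[] = [
  { name: "benefits-application", started: 1000, completed: 300 },
  { name: "status-check", started: 800, completed: 720 },
];
console.log(flagDropoffs(funnels)); // prints [ 'benefits-application' ]
```

Comparing these rates across systems is what would reveal the disparities the speaker mentions; any single rate in isolation is harder to interpret.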
So I think there's another layer 1035 00:55:59,480 --> 00:56:03,040 Speaker 3: of all this, and that's my depressing story, because eventually 1036 00:56:03,040 --> 00:56:05,560 Speaker 3: it got fixed, and I think probably they figured out 1037 00:56:05,600 --> 00:56:07,799 Speaker 3: we were right like a week later, but it took 1038 00:56:07,800 --> 00:56:10,799 Speaker 3: a week. I think that's 1039 00:56:10,840 --> 00:56:13,640 Speaker 3: why, in my mind, there's also, if we can get 1040 00:56:13,680 --> 00:56:17,479 Speaker 3: to better accountability and give people on the government side 1041 00:56:17,480 --> 00:56:21,640 Speaker 3: better visibility into these issues, with possible routes to solving them. 1042 00:56:21,719 --> 00:56:24,719 Speaker 3: They want to solve them, but right now we don't 1043 00:56:24,719 --> 00:56:27,960 Speaker 3: have a system that necessarily looks for those issues. And 1044 00:56:28,000 --> 00:56:29,520 Speaker 3: I think that would be a better form of 1045 00:56:29,520 --> 00:56:33,840 Speaker 3: accountability relative to, say, did you follow every 1046 00:56:34,040 --> 00:56:37,759 Speaker 3: IT checklist bullet point in policy? Right? Like, if we're 1047 00:56:37,760 --> 00:56:41,120 Speaker 3: getting bad outcomes while following that, then maybe something else 1048 00:56:41,160 --> 00:56:43,320 Speaker 3: needs to change, and maybe there's a different form of 1049 00:56:43,320 --> 00:56:44,759 Speaker 3: accountability necessary there. 1050 00:56:44,960 --> 00:56:47,000 Speaker 4: We should give a quick shout out to President 1051 00:56:47,040 --> 00:56:54,680 Speaker 4: Biden's Customer Experience Executive Order that is trying to get agencies.
1052 00:56:54,719 --> 00:56:58,000 Speaker 1: Maybe in that direction. There is so much to talk 1053 00:56:58,040 --> 00:57:01,120 Speaker 1: about, so many different versions of this conversation we could have, 1054 00:57:01,400 --> 00:57:05,760 Speaker 1: but, uh, that was an amazing conversation. Jennifer Pahlka, Dave Guarino, 1055 00:57:05,920 --> 00:57:08,400 Speaker 1: thank you both so much for coming on Odd Lots, 1056 00:57:08,600 --> 00:57:11,359 Speaker 1: sharing your time and insight and direct experience on all 1057 00:57:11,400 --> 00:57:11,800 Speaker 1: this stuff. 1058 00:57:11,920 --> 00:57:27,480 Speaker 5: Yes, a delight. Thanks so much. Thank you. Thank you, Tracy. 1059 00:57:27,680 --> 00:57:31,120 Speaker 1: That was an extremely illuminating conversation, I found. I mean, 1060 00:57:31,160 --> 00:57:33,040 Speaker 1: it's like easy enough to say, oh, government is 1061 00:57:33,120 --> 00:57:35,480 Speaker 1: bureaucratic and they don't pay enough and that's why they 1062 00:57:35,480 --> 00:57:37,320 Speaker 1: can't build software, but it was like 1063 00:57:37,400 --> 00:57:41,439 Speaker 1: so much richer and more complex than I appreciated. 1064 00:57:41,760 --> 00:57:41,920 Speaker 5: Right. 1065 00:57:42,040 --> 00:57:45,120 Speaker 2: The emphasis on incentives was really interesting and sort of 1066 00:57:45,120 --> 00:57:47,200 Speaker 2: fits into a lot of the discussions we have here. 1067 00:57:47,240 --> 00:57:50,560 Speaker 2: And also Jen's point that a lot of this emanates 1068 00:57:50,560 --> 00:57:54,160 Speaker 2: from the policy side, yes, right, and you know 1069 00:57:54,160 --> 00:57:56,760 Speaker 2: it's people trying to tick various boxes that have been 1070 00:57:56,800 --> 00:57:58,120 Speaker 2: set in stone by policy. 1071 00:57:58,320 --> 00:57:59,360 Speaker 4: I got to say one.
1072 00:57:59,160 --> 00:58:01,720 Speaker 2: Thing coming out of that, though: I feel a newfound 1073 00:58:01,720 --> 00:58:04,760 Speaker 2: appreciation for being a journalist. And actually, you know, when 1074 00:58:04,800 --> 00:58:08,440 Speaker 2: you and I produce an episode or write an article, 1075 00:58:08,520 --> 00:58:10,160 Speaker 2: we sort of send it out into the world and 1076 00:58:10,200 --> 00:58:12,040 Speaker 2: then it's done, and we can move on to the 1077 00:58:12,080 --> 00:58:12,480 Speaker 2: next thing. 1078 00:58:12,640 --> 00:58:15,200 Speaker 4: We don't have to go back in ten years and refine it. 1079 00:58:15,320 --> 00:58:20,200 Speaker 2: Yeah, in response to customer feedback and various testing and 1080 00:58:20,280 --> 00:58:21,000 Speaker 2: things like that. 1081 00:58:21,240 --> 00:58:23,800 Speaker 1: If we were Wikipedia editors, we would 1082 00:58:23,880 --> 00:58:26,800 Speaker 1: have to be continually editing them for years. I thought 1083 00:58:26,840 --> 00:58:29,600 Speaker 1: that was really great. And again, Jen's point that you 1084 00:58:29,680 --> 00:58:32,080 Speaker 1: just brought up, this sort of reflection that 1085 00:58:32,280 --> 00:58:35,440 Speaker 1: the real solution has to emanate from the 1086 00:58:35,440 --> 00:58:37,800 Speaker 1: policy level. I was actually getting drinks recently with a 1087 00:58:37,800 --> 00:58:40,520 Speaker 1: friend of mine who's a software engineer, and he said 1088 00:58:40,560 --> 00:58:43,720 Speaker 1: there's this phenomenon, he said it's called Conway's law.
1089 00:58:43,800 --> 00:58:46,600 Speaker 1: Like, every organization, he said, ships its org structure, 1090 00:58:47,080 --> 00:58:49,080 Speaker 1: and then it makes like so much sense. Like, 1091 00:58:49,240 --> 00:58:52,720 Speaker 1: the software is complicated because US politics is complicated: 1092 00:58:52,880 --> 00:58:56,760 Speaker 1: ninety years of policy changes and Democrats and Republicans switching 1093 00:58:56,800 --> 00:58:59,360 Speaker 1: who's in power, et cetera. It's like, no wonder 1094 00:58:59,400 --> 00:59:03,120 Speaker 1: the ultimate product of that thing is going to 1095 00:59:03,280 --> 00:59:05,560 Speaker 1: not look like, you know, Google dot com. 1096 00:59:05,560 --> 00:59:08,280 Speaker 2: Well, also the story of the guy who was like, 1097 00:59:08,320 --> 00:59:10,800 Speaker 2: it takes twenty five years to really know how to 1098 00:59:10,880 --> 00:59:14,440 Speaker 2: file a claim. You can imagine, if you spend twenty five 1099 00:59:14,520 --> 00:59:18,960 Speaker 2: years developing expertise in this one very specific skill set, 1100 00:59:19,200 --> 00:59:23,800 Speaker 2: you're not really incentivized to start changing that anytime soon. 1101 00:59:23,920 --> 00:59:26,840 Speaker 2: And so the system just kind of perpetuates itself. So 1102 00:59:26,920 --> 00:59:29,840 Speaker 2: many different directions we could go with that conversation. 1103 00:59:29,480 --> 00:59:32,640 Speaker 1: So many different ways. The system is self perpetuating. Yeah, 1104 00:59:32,800 --> 00:59:33,640 Speaker 1: I learned a lot from that. 1105 00:59:33,720 --> 00:59:33,919 Speaker 4: Yep. 1106 00:59:33,960 --> 00:59:34,800 Speaker 2: Shall we leave it there? 1107 00:59:34,880 --> 00:59:35,560 Speaker 1: Let's leave it there. 1108 00:59:35,600 --> 00:59:38,360 Speaker 2: This has been another episode of the Odd Lots podcast. 1109 00:59:38,600 --> 00:59:41,160 Speaker 2: I'm Tracy Alloway.
You can follow me on Twitter at 1110 00:59:41,160 --> 00:59:41,960 Speaker 2: Tracy Alloway. 1111 00:59:42,120 --> 00:59:44,280 Speaker 1: And I'm Joe Weisenthal. You can follow me 1112 00:59:44,480 --> 00:59:47,960 Speaker 1: on Twitter at The Stalwart. Follow our guests on Twitter. 1113 00:59:48,240 --> 00:59:51,320 Speaker 1: Jennifer Pahlka, she's at pahlkadot, and she is the 1114 00:59:51,400 --> 00:59:54,720 Speaker 1: author of the new book Recoding America. Definitely check it out. 1115 00:59:55,080 --> 00:59:59,880 Speaker 1: Follow Dave Guarino, he's at Dave Underscore Guarino. Follow our 1116 01:00:00,520 --> 01:00:04,440 Speaker 1: producer Carmen Rodriguez at Carmen Arman, and Dashiell Bennett at Dashbot. 1117 01:00:04,720 --> 01:00:07,160 Speaker 1: And check out all of the Bloomberg podcasts under the 1118 01:00:07,200 --> 01:00:10,880 Speaker 1: handle at Podcasts. And for more Odd Lots content, go to 1119 01:00:10,880 --> 01:00:14,040 Speaker 1: Bloomberg dot com slash odd lots, where we post transcripts. 1120 01:00:14,240 --> 01:00:16,560 Speaker 1: Tracy and I have a blog and a newsletter that 1121 01:00:16,640 --> 01:00:19,320 Speaker 1: comes out every Friday, and you can talk about all 1122 01:00:19,360 --> 01:00:22,360 Speaker 1: of these things twenty four seven with fellow listeners 1123 01:00:22,440 --> 01:00:25,520 Speaker 1: on the Odd Lots Discord. It's really fun. I hang out 1124 01:00:25,520 --> 01:00:29,400 Speaker 1: there a lot. Go to discord dot gg slash odd lots. 1125 01:00:29,920 --> 01:00:32,200 Speaker 1: It's a real blast and a fun place to 1126 01:00:32,200 --> 01:00:34,400 Speaker 1: hang out on the Internet. Thanks for listening.