1 00:00:02,480 --> 00:00:19,240 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News. Hello and welcome to 2 00:00:19,320 --> 00:00:21,919 Speaker 1: another episode of the Odd Lots podcast. 3 00:00:22,000 --> 00:00:24,400 Speaker 2: I'm Joe Weisenthal and I'm Tracy Alloway. 4 00:00:24,840 --> 00:00:28,240 Speaker 1: Tracy, remember that episode we did several weeks ago about 5 00:00:28,960 --> 00:00:31,800 Speaker 1: waste and fraud and abuse in Medicare? I do. 6 00:00:32,040 --> 00:00:35,199 Speaker 2: It was full of interesting facts and figures, such as 7 00:00:35,240 --> 00:00:38,360 Speaker 2: the US government spends one percent of the federal budget 8 00:00:38,400 --> 00:00:39,360 Speaker 2: on dialysis. 9 00:00:39,560 --> 00:00:42,000 Speaker 1: And the thing about that episode, so we were talking 10 00:00:42,040 --> 00:00:45,879 Speaker 1: to BU Professor Jetson Leder-Luis, you know, the 11 00:00:45,960 --> 00:00:48,840 Speaker 1: thing about that episode that's striking is if you look 12 00:00:48,840 --> 00:00:51,960 Speaker 1: at government spending, and probably a lot of bureaucracies 13 00:00:52,000 --> 00:00:54,440 Speaker 1: even outside the government, but if you look at government 14 00:00:54,640 --> 00:00:58,160 Speaker 1: spending or government waste or government fraud, et cetera, there 15 00:00:58,160 --> 00:01:01,880 Speaker 1: are many things that everyone can point to and say 16 00:01:02,080 --> 00:01:05,120 Speaker 1: this is waste, this is bad, we should change this. 17 00:01:05,680 --> 00:01:09,319 Speaker 1: But like, just being able to identify 18 00:01:09,560 --> 00:01:12,680 Speaker 1: some bad process, whether it's in, you know, 19 00:01:12,760 --> 00:01:15,679 Speaker 1: how purchasing works or how hiring works or anything like that, 20 00:01:16,080 --> 00:01:19,160 Speaker 1: clearly that's not enough to fix it.
Like identification almost 21 00:01:19,160 --> 00:01:20,160 Speaker 1: seems like the easy part. 22 00:01:20,319 --> 00:01:22,360 Speaker 2: No, I think this is actually endemic in a lot 23 00:01:22,360 --> 00:01:26,360 Speaker 2: of large organizations, although I am certain there are specific 24 00:01:26,400 --> 00:01:29,160 Speaker 2: things about the government process that are unique to it. 25 00:01:29,280 --> 00:01:33,560 Speaker 2: But it feels like at large organizations everyone kind of 26 00:01:33,560 --> 00:01:36,720 Speaker 2: feels helpless, right? Like everyone can say we need to 27 00:01:36,760 --> 00:01:39,679 Speaker 2: do this better, or this is ridiculous, or we should 28 00:01:39,760 --> 00:01:43,560 Speaker 2: change this, but no one actually seems able to fix it. 29 00:01:43,920 --> 00:01:47,680 Speaker 1: Yeah, anyway, yes, this is my 30 00:01:47,760 --> 00:01:51,080 Speaker 1: perception as well. But we are in an era, you know, 31 00:01:51,440 --> 00:01:53,880 Speaker 1: the vibes have changed. We are in an era where, 32 00:01:53,920 --> 00:01:59,440 Speaker 1: once again, and probably because inflation remains high, people see 33 00:01:59,480 --> 00:02:04,160 Speaker 1: the obvious failures of what people call state capacity. You know, 34 00:02:04,200 --> 00:02:06,680 Speaker 1: there was a lot of stuff built under the Biden administration. 35 00:02:06,840 --> 00:02:09,680 Speaker 1: There was a lot of stuff allocated. People are frustrated 36 00:02:09,840 --> 00:02:13,239 Speaker 1: by the speed of things, whether you look at charging stations, 37 00:02:14,000 --> 00:02:18,000 Speaker 1: rural broadband, the speed with which any project gets implemented. 38 00:02:18,720 --> 00:02:22,040 Speaker 1: We're in a moment of frustration, and people want to 39 00:02:22,360 --> 00:02:24,799 Speaker 1: see two things.
They want to see waste and spending 40 00:02:25,000 --> 00:02:28,119 Speaker 1: decline in many cases, and they also want to see 41 00:02:28,120 --> 00:02:31,200 Speaker 1: that the spending that we actually do allocate turns into actual, 42 00:02:31,320 --> 00:02:35,080 Speaker 1: good results. Right? We're in a period where, setting aside 43 00:02:35,160 --> 00:02:38,400 Speaker 1: what will happen, et cetera, like, there is this public impatience. 44 00:02:38,480 --> 00:02:40,720 Speaker 1: We want to see results for things that we do. 45 00:02:41,000 --> 00:02:43,320 Speaker 2: Speaking of good results, can I just say the last 46 00:02:43,360 --> 00:02:46,600 Speaker 2: time we spoke to this guest who we are about 47 00:02:46,639 --> 00:02:50,280 Speaker 2: to speak to again, we were complaining about how clunky 48 00:02:50,400 --> 00:02:53,360 Speaker 2: the Treasury Direct website is and how you had to 49 00:02:53,400 --> 00:02:56,239 Speaker 2: click little buttons to get in instead of just typing 50 00:02:56,280 --> 00:03:01,000 Speaker 2: your password. And I think shortly after that someone sent 51 00:03:01,040 --> 00:03:03,880 Speaker 2: me a screenshot and said that they had fixed it. Amazing. 52 00:03:04,320 --> 00:03:07,880 Speaker 2: So you know this podcast can lead to real results. 53 00:03:08,520 --> 00:03:10,960 Speaker 1: Maybe that's the answer. We just need to do podcast 54 00:03:11,000 --> 00:03:13,240 Speaker 1: episodes on various things that aren't working, and then someone 55 00:03:13,280 --> 00:03:15,440 Speaker 1: hears it and fixes it and stuff like that. But yes, 56 00:03:15,600 --> 00:03:18,800 Speaker 1: we are in an era of people demanding results.
Anyway, 57 00:03:18,960 --> 00:03:21,680 Speaker 1: you mentioned that last episode we did with this guest, 58 00:03:21,760 --> 00:03:24,720 Speaker 1: she really is the perfect guest to talk about now 59 00:03:24,760 --> 00:03:27,600 Speaker 1: in this era of DOGE and the sort of impulse 60 00:03:27,680 --> 00:03:31,680 Speaker 1: to see government root out waste and fraud and abuse 61 00:03:31,720 --> 00:03:33,880 Speaker 1: and become more efficient. We're going to be speaking with 62 00:03:34,120 --> 00:03:37,920 Speaker 1: Jennifer Pahlka. She is the author of the book Recoding America, 63 00:03:38,440 --> 00:03:41,320 Speaker 1: currently serving as a Senior Fellow at the Niskanen Center. 64 00:03:41,680 --> 00:03:44,280 Speaker 1: She was the founder and former director of the Code 65 00:03:44,280 --> 00:03:47,800 Speaker 1: for America initiative. She's been on the Defense Innovation Board, 66 00:03:48,160 --> 00:03:51,960 Speaker 1: so has seen how government actually operates and spends and 67 00:03:52,000 --> 00:03:55,800 Speaker 1: works and builds technology at a very granular level. Perfect 68 00:03:55,800 --> 00:03:58,040 Speaker 1: guest to have back on. So, Jennifer, thank you so 69 00:03:58,120 --> 00:03:59,600 Speaker 1: much for coming back on Odd Lots. 70 00:04:00,480 --> 00:04:01,760 Speaker 3: Yeah, no, thanks for having me.
71 00:04:03,000 --> 00:04:06,640 Speaker 1: You know, when it comes to government spending and waste, 72 00:04:06,720 --> 00:04:09,600 Speaker 1: you know, I think there are sort of two 73 00:04:09,640 --> 00:04:12,360 Speaker 1: things here, and I think it's useful 74 00:04:12,360 --> 00:04:15,960 Speaker 1: to divide the conversation into those two things, because 75 00:04:15,960 --> 00:04:18,560 Speaker 1: there are major things that the government spends money on, 76 00:04:18,680 --> 00:04:22,160 Speaker 1: like various entitlements or having a large military, et cetera, 77 00:04:22,640 --> 00:04:27,600 Speaker 1: that some might consider wasteful per se, right? And then 78 00:04:27,600 --> 00:04:30,960 Speaker 1: there is the other aspect, which is just, like, within 79 00:04:31,000 --> 00:04:33,680 Speaker 1: the realm of things that many people agree we need, 80 00:04:34,080 --> 00:04:36,839 Speaker 1: is government getting the best bang for the buck within 81 00:04:36,960 --> 00:04:40,359 Speaker 1: the sort of expectation that this is, you know, legitimate spending. 82 00:04:40,400 --> 00:04:42,719 Speaker 1: And so some of these fights and conversations that 83 00:04:42,760 --> 00:04:45,080 Speaker 1: are going to be had under the new administration are 84 00:04:45,080 --> 00:04:46,719 Speaker 1: going to be in the first category, where it's like 85 00:04:47,160 --> 00:04:51,440 Speaker 1: big political questions about where we should be allocating our money. 86 00:04:51,960 --> 00:04:55,200 Speaker 1: But from your perspective, even setting that aside, when you 87 00:04:55,240 --> 00:04:58,560 Speaker 1: look at sort of the actual expenditure, like, do you 88 00:04:58,720 --> 00:05:02,520 Speaker 1: see big dials that can be turned in terms of, 89 00:05:02,839 --> 00:05:03,880 Speaker 1: this can be done better?
90 00:05:04,080 --> 00:05:06,320 Speaker 3: Yeah, it's funny you say, like, people get concerned about 91 00:05:06,360 --> 00:05:08,559 Speaker 3: the bang for the buck. I think people get really 92 00:05:08,600 --> 00:05:11,520 Speaker 3: concerned when they're like, any amount of bucks is not 93 00:05:11,640 --> 00:05:15,560 Speaker 3: getting a bang. Yeah, right, you know, it's less that. 94 00:05:16,640 --> 00:05:20,560 Speaker 3: I understand the frustration and the switch to a dialogue 95 00:05:20,560 --> 00:05:24,839 Speaker 3: around government efficiency that Elon Musk is pushing, but I 96 00:05:24,880 --> 00:05:27,760 Speaker 3: think we've gotten to that place because people are saying, 97 00:05:27,960 --> 00:05:31,159 Speaker 3: you know, we're getting very little in results. We would be 98 00:05:31,279 --> 00:05:34,600 Speaker 3: tolerant of, you know, reasonably high spending if something were 99 00:05:34,600 --> 00:05:38,279 Speaker 3: coming out of this. And you know, you talk about 100 00:05:38,320 --> 00:05:41,560 Speaker 3: the defense world, for instance. I mean, our spending keeps 101 00:05:41,600 --> 00:05:45,440 Speaker 3: going up, and in theory, we're spending more so that 102 00:05:45,480 --> 00:05:49,599 Speaker 3: we can be safer and have greater deterrence.
And in reality, 103 00:05:49,720 --> 00:05:53,840 Speaker 3: I think that that greater spending is actually keeping us 104 00:05:54,320 --> 00:05:57,200 Speaker 3: from making the changes that we need to make such 105 00:05:57,200 --> 00:06:01,080 Speaker 3: that the system is more effective and efficient. You know, 106 00:06:01,080 --> 00:06:04,440 Speaker 3: people just still do these, like, big bang projects because 107 00:06:04,480 --> 00:06:07,440 Speaker 3: they have the money for them. They have all the 108 00:06:07,560 --> 00:06:12,600 Speaker 3: trappings around them that make them really slow, all the procedure, 109 00:06:13,279 --> 00:06:17,160 Speaker 3: you know, instead of these, you know, fast procurements, 110 00:06:17,160 --> 00:06:19,640 Speaker 3: like, we need a ton of drones, for instance, instead 111 00:06:19,640 --> 00:06:24,360 Speaker 3: of these, you know, big heavy weapons platforms. But we're 112 00:06:24,400 --> 00:06:27,320 Speaker 3: still putting a lot of money into these things that 113 00:06:27,400 --> 00:06:29,800 Speaker 3: don't get us any results for twenty years. By the 114 00:06:29,839 --> 00:06:33,360 Speaker 3: time, you know, we get the results, the thing isn't 115 00:06:33,360 --> 00:06:37,000 Speaker 3: actually needed anymore. What we need is drones instead of ships. 116 00:06:37,640 --> 00:06:41,000 Speaker 3: And you know, I really think that the dialogue is 117 00:06:41,120 --> 00:06:43,320 Speaker 3: changing to, like, hey, where's the bang at all? 118 00:06:43,880 --> 00:06:47,520 Speaker 2: Right?
Yeah, it feels like when organizations are sort of 119 00:06:47,720 --> 00:06:51,520 Speaker 2: swimming in money and have large budgets, they don't actually 120 00:06:51,560 --> 00:06:55,080 Speaker 2: have to think about how to best spend that money, right? 121 00:06:55,600 --> 00:06:58,040 Speaker 2: And so, well, I wanted to ask you more, given 122 00:06:58,240 --> 00:07:02,120 Speaker 2: your position on the Defense Innovation Board. One thing I've 123 00:07:02,120 --> 00:07:06,000 Speaker 2: been wondering is why does the Pentagon, up to this point, 124 00:07:06,279 --> 00:07:08,120 Speaker 2: always seem to fail its audits? 125 00:07:08,920 --> 00:07:14,160 Speaker 3: Hmm, okay, there's so many reasons for that. One of 126 00:07:14,200 --> 00:07:16,840 Speaker 3: them I think speaks to a core dysfunction that you 127 00:07:16,920 --> 00:07:20,560 Speaker 3: see across government, which is we do everything in this 128 00:07:20,800 --> 00:07:24,560 Speaker 3: very, very bespoke way. There used to be over 129 00:07:24,760 --> 00:07:29,000 Speaker 3: five thousand different back-end systems, including accounting systems, just 130 00:07:29,080 --> 00:07:32,000 Speaker 3: within the DoD. I don't think that number has gone 131 00:07:32,080 --> 00:07:35,160 Speaker 3: down that much. How do you get so many different 132 00:07:35,240 --> 00:07:39,680 Speaker 3: systems? You know, it's obviously a huge institution, but, 133 00:07:39,880 --> 00:07:41,720 Speaker 3: like, you would think there would be a standard way 134 00:07:41,760 --> 00:07:45,800 Speaker 3: to do accounting.
But what happens is that Congress, 135 00:07:46,320 --> 00:07:49,960 Speaker 3: you know, puts some new requirement for reporting on this 136 00:07:50,000 --> 00:07:54,720 Speaker 3: particular program or this particular department, and they go, okay, well, 137 00:07:54,760 --> 00:07:58,560 Speaker 3: we need really bespoke software to handle these, you know, 138 00:07:59,080 --> 00:08:03,280 Speaker 3: sort of arcane and in-the-weeds kinds of requests 139 00:08:03,320 --> 00:08:06,360 Speaker 3: from Congress or someone else. Some of them are, like, real, 140 00:08:06,520 --> 00:08:08,840 Speaker 3: Congress isn't backing off on them, and some of them 141 00:08:08,880 --> 00:08:11,720 Speaker 3: are kind of imagined. But that's how we get this 142 00:08:12,760 --> 00:08:19,560 Speaker 3: very heavyweight requirements development process in government, where, like, the 143 00:08:19,680 --> 00:08:21,920 Speaker 3: job of a bunch of people for a long time 144 00:08:22,400 --> 00:08:24,400 Speaker 3: is to go around to everybody and say, what are 145 00:08:24,440 --> 00:08:26,400 Speaker 3: all the things that you're going to need for this 146 00:08:26,520 --> 00:08:29,480 Speaker 3: system to work? Well, if you do that, they come 147 00:08:29,600 --> 00:08:31,720 Speaker 3: up with a bunch of things. That means you can't 148 00:08:31,760 --> 00:08:37,320 Speaker 3: just buy off-the-shelf commercial accounting software the 149 00:08:37,360 --> 00:08:41,839 Speaker 3: way that even a really big enterprise would, because you've 150 00:08:41,840 --> 00:08:43,679 Speaker 3: come up with all these, like, bespoke things that need 151 00:08:43,720 --> 00:08:45,200 Speaker 3: to happen. Some of them are real and some of 152 00:08:45,200 --> 00:08:47,960 Speaker 3: them are invented.
And one of the things we said 153 00:08:48,160 --> 00:08:52,400 Speaker 3: on the Defense Innovation Board was it's actually not about, 154 00:08:52,600 --> 00:08:55,920 Speaker 3: like, doing better software to accommodate all those requirements. It 155 00:08:55,920 --> 00:08:57,880 Speaker 3: is about getting rid of those requirements so that you 156 00:08:57,880 --> 00:09:02,040 Speaker 3: can just buy commodity software. But that's just really hard 157 00:09:02,200 --> 00:09:05,199 Speaker 3: because, it's sort of, you know, the power is 158 00:09:05,240 --> 00:09:07,320 Speaker 3: flowing the wrong direction. You don't have people who do 159 00:09:07,400 --> 00:09:11,920 Speaker 3: software going and telling Congress to remove this requirement. It's just, like, 160 00:09:12,120 --> 00:09:15,040 Speaker 3: you know, the stream needs to be reversed. So 161 00:09:15,400 --> 00:09:18,240 Speaker 3: one of the reasons is just that they literally have 162 00:09:18,600 --> 00:09:22,320 Speaker 3: all these different systems that are supposed to feed into 163 00:09:22,400 --> 00:09:26,120 Speaker 3: one big picture of what the Department of Defense is spending, 164 00:09:26,960 --> 00:09:32,520 Speaker 3: and, you know, what it's receiving, and it's just incredibly complicated. 165 00:09:32,800 --> 00:09:35,920 Speaker 3: Now there are other reasons as well, but I think 166 00:09:35,920 --> 00:09:37,800 Speaker 3: that's one of the core dysfunctions that you see 167 00:09:37,800 --> 00:09:51,240 Speaker 3: at play. 168 00:09:55,280 --> 00:09:57,360 Speaker 1: Let's back up for a second. So you wrote your 169 00:09:57,360 --> 00:10:00,680 Speaker 1: book Recoding America.
You have a great Substack right now 170 00:10:00,720 --> 00:10:03,160 Speaker 1: called Eating Policy, and I want to get into, 171 00:10:03,160 --> 00:10:05,160 Speaker 1: we'll get into some of the posts you wrote 172 00:10:05,200 --> 00:10:09,560 Speaker 1: about fixing hiring practices and so forth. But I'm curious, like, 173 00:10:10,040 --> 00:10:12,360 Speaker 1: did you have a moment in your life, like the 174 00:10:12,520 --> 00:10:16,560 Speaker 1: sort of, like, founding trauma or whatever, where you're like, oh wow, 175 00:10:16,679 --> 00:10:20,760 Speaker 1: you know, something that scandalized you 176 00:10:21,320 --> 00:10:25,360 Speaker 1: such that you've now devoted your work to writing about 177 00:10:25,559 --> 00:10:28,880 Speaker 1: and elucidating the issues that come up with things like 178 00:10:28,920 --> 00:10:30,160 Speaker 1: government hiring and spending? 179 00:10:31,400 --> 00:10:35,160 Speaker 3: Yeah, that's a funny question. I think what happened was 180 00:10:35,240 --> 00:10:38,880 Speaker 3: I started looking at this whole issue of government technology 181 00:10:38,960 --> 00:10:42,880 Speaker 3: back right after Obama was elected. We were working 182 00:10:42,920 --> 00:10:45,920 Speaker 3: on this idea of Web 2.0, if anyone remembers that, 183 00:10:46,120 --> 00:10:50,920 Speaker 3: what would Gov 2.0 look like? And some colleagues and 184 00:10:50,960 --> 00:10:52,880 Speaker 3: I were sort of going to DC and talking to 185 00:10:52,880 --> 00:10:55,560 Speaker 3: people about how software got done in government. And I 186 00:10:55,559 --> 00:10:58,200 Speaker 3: remember this story.
This is not the traumatic part, but 187 00:10:58,480 --> 00:11:00,280 Speaker 3: I remember the story from this guy who was from 188 00:11:00,320 --> 00:11:02,319 Speaker 3: Silicon Valley and was working with the Department of 189 00:11:02,400 --> 00:11:04,360 Speaker 3: Labor, and he was like, let me explain to you 190 00:11:04,400 --> 00:11:08,200 Speaker 3: how we do stuff here. It's like, you have to 191 00:11:08,240 --> 00:11:10,800 Speaker 3: get soldiers from New York to DC, and you 192 00:11:10,920 --> 00:11:15,400 Speaker 3: go out and, like, source the iron and, like, you know, 193 00:11:15,520 --> 00:11:20,440 Speaker 3: have engineers spec out the requirements for the engines, because 194 00:11:20,440 --> 00:11:23,160 Speaker 3: you're going to build an entire railway instead of just 195 00:11:23,240 --> 00:11:27,960 Speaker 3: buying the soldiers some train tickets. And that can't be right, 196 00:11:28,040 --> 00:11:30,760 Speaker 3: that's crazy. But as I got into it, actually that 197 00:11:30,880 --> 00:11:34,040 Speaker 3: really is how this works. Often there's just not this 198 00:11:34,200 --> 00:11:37,040 Speaker 3: sense of, like, just how to use what's out there, 199 00:11:37,080 --> 00:11:41,280 Speaker 3: because of this obsession with, you know, doing everything from 200 00:11:41,320 --> 00:11:45,280 Speaker 3: the ground up. But before I worked in tech, my 201 00:11:45,440 --> 00:11:48,240 Speaker 3: first job out of college, I worked in a child welfare 202 00:11:48,280 --> 00:11:53,160 Speaker 3: agency and I saw how bad the operations were there, 203 00:11:53,600 --> 00:11:59,160 Speaker 3: and you really see up close what's at stake for these 204 00:11:59,320 --> 00:12:02,720 Speaker 3: kids. And I thought, oh my god, if we're 205 00:12:02,720 --> 00:12:07,520 Speaker 3: doing software for the Department of Labor this badly, this 206 00:12:07,679 --> 00:12:11,959 Speaker 3: is also affecting things like child welfare.
And we really 207 00:12:11,960 --> 00:12:14,720 Speaker 3: ought to be putting, like, the best our country has 208 00:12:14,720 --> 00:12:18,280 Speaker 3: to offer on the biggest problems. And it's kind of 209 00:12:18,280 --> 00:12:21,520 Speaker 3: the opposite. And that's really what made me, you know, 210 00:12:21,640 --> 00:12:23,120 Speaker 3: want to start Code for America. 211 00:12:24,040 --> 00:12:27,360 Speaker 2: Say more about the process of building everything from the 212 00:12:27,400 --> 00:12:29,839 Speaker 2: ground up, because I remember the last time we spoke 213 00:12:29,880 --> 00:12:33,640 Speaker 2: to you, one of the crazy numbers in that conversation 214 00:12:33,880 --> 00:12:38,160 Speaker 2: was the number of requirements for a specific software project. 215 00:12:38,200 --> 00:12:40,720 Speaker 2: What is the exact process? Like, walk us through, 216 00:12:40,960 --> 00:12:43,760 Speaker 2: I guess, the life cycle of putting out a government, 217 00:12:44,000 --> 00:12:46,720 Speaker 2: you know, we can talk software in particular because you 218 00:12:46,720 --> 00:12:50,200 Speaker 2: have a lot of experience with that, a government software project. 219 00:12:50,920 --> 00:12:54,240 Speaker 2: Where does it start, what are the various, like, hoops 220 00:12:54,360 --> 00:12:56,480 Speaker 2: that people have to jump through, and then where does 221 00:12:56,520 --> 00:12:57,120 Speaker 2: it end up?
222 00:12:58,880 --> 00:13:03,760 Speaker 3: Yeah, maybe I'll talk about the unemployment insurance system in California, 223 00:13:03,840 --> 00:13:08,440 Speaker 3: which I worked on very briefly as the co-chair of 224 00:13:08,480 --> 00:13:12,520 Speaker 3: the strike team in that first year of the pandemic, 225 00:13:12,600 --> 00:13:14,520 Speaker 3: when everyone was waiting for their checks, and it was 226 00:13:14,880 --> 00:13:18,960 Speaker 3: really a disaster when we came in because there was 227 00:13:19,000 --> 00:13:24,800 Speaker 3: this huge backlog of claims. In fact, just coincidentally, the 228 00:13:24,880 --> 00:13:29,400 Speaker 3: team at the Employment Development Department had just finally gotten 229 00:13:29,400 --> 00:13:30,840 Speaker 3: to the point where they were going to put a 230 00:13:31,120 --> 00:13:34,760 Speaker 3: bid for a sort of modernization, or, they would say, 231 00:13:34,800 --> 00:13:39,319 Speaker 3: an upgrade, they call it a business system modernization, out to vendors. 232 00:13:39,600 --> 00:13:43,200 Speaker 3: And I don't remember the exact number of requirements in it, 233 00:13:43,280 --> 00:13:46,920 Speaker 3: but it was in the thousands, and they had been 234 00:13:47,000 --> 00:13:51,720 Speaker 3: working on that for eleven years. So they started the 235 00:13:51,760 --> 00:13:56,680 Speaker 3: requirements gathering eleven years ago and they were just putting 236 00:13:56,720 --> 00:13:59,640 Speaker 3: it out to bid, and I think we became extremely 237 00:13:59,720 --> 00:14:03,160 Speaker 3: unpopular by telling the governor's office, you really should 238 00:14:03,200 --> 00:14:07,600 Speaker 3: not bid that out.
Even if it were 239 00:14:07,640 --> 00:14:10,600 Speaker 3: one year old, it's now totally outdated, because now you 240 00:14:10,679 --> 00:14:13,920 Speaker 3: understand the problems that need to be solved much 241 00:14:13,960 --> 00:14:16,880 Speaker 3: better than you did a year ago, but certainly better 242 00:14:16,920 --> 00:14:21,640 Speaker 3: than you did eleven years ago. And they did actually 243 00:14:21,680 --> 00:14:24,760 Speaker 3: stop that and sort of revise it. But like, why 244 00:14:24,840 --> 00:14:28,920 Speaker 3: are there those thousands of requirements? I should 245 00:14:28,920 --> 00:14:30,880 Speaker 3: be clear, I don't think the 246 00:14:30,960 --> 00:14:34,960 Speaker 3: requirements gathering took all eleven years. It was a big chunk 247 00:14:35,040 --> 00:14:37,280 Speaker 3: of that eleven years. Then you have sort of putting 248 00:14:37,360 --> 00:14:40,680 Speaker 3: together the RFP and getting it, you know, getting 249 00:14:40,680 --> 00:14:43,480 Speaker 3: it approved by the legislature, et cetera. So, you know, 250 00:14:43,520 --> 00:14:47,480 Speaker 3: why does it get so big? Well, I think 251 00:14:48,000 --> 00:14:52,280 Speaker 3: unemployment insurance is a good example of this. Unemployment 252 00:14:52,280 --> 00:14:55,880 Speaker 3: insurance started with the nineteen thirty-five Social Security Act. 253 00:14:56,080 --> 00:14:58,240 Speaker 3: And now it's twenty twenty-five.
So 254 00:14:58,280 --> 00:15:05,480 Speaker 3: now we have ninety years of accumulation of policy and 255 00:15:05,680 --> 00:15:11,720 Speaker 3: process and regulatory cruft, essentially. So think about, like, stuff, 256 00:15:11,760 --> 00:15:18,160 Speaker 3: you know, requirements and rules and guidance from the executive, 257 00:15:18,360 --> 00:15:22,840 Speaker 3: legislative, and judicial branches just sort of falling like garbage 258 00:15:23,520 --> 00:15:27,320 Speaker 3: onto something like unemployment insurance for ninety years. Like, a 259 00:15:27,760 --> 00:15:31,680 Speaker 3: rule of thumb is we always add, we never subtract. 260 00:15:31,760 --> 00:15:37,040 Speaker 3: And this is a fault of our legislatures primarily, not 261 00:15:37,440 --> 00:15:40,280 Speaker 3: the people who run the unemployment insurance system. They kind 262 00:15:40,280 --> 00:15:43,000 Speaker 3: of feel like they can't do anything about this accumulation. 263 00:15:44,120 --> 00:15:47,680 Speaker 3: What they do then is try to accommodate all of 264 00:15:47,720 --> 00:15:52,120 Speaker 3: these changes over time while never actually fixing the underlying system. 265 00:15:52,280 --> 00:15:55,840 Speaker 3: So, like, one of the metaphors someone else used for 266 00:15:55,880 --> 00:15:58,200 Speaker 3: me is like layers of paint that just get put 267 00:15:58,240 --> 00:16:01,320 Speaker 3: on. Well, you know, especially if you've lived in, like, an 268 00:16:01,360 --> 00:16:04,320 Speaker 3: apartment in New York that's been painted like ninety thousand times, 269 00:16:04,360 --> 00:16:07,960 Speaker 3: eventually it cracks. That's sort of what's happening.
Or my 270 00:16:07,960 --> 00:16:11,880 Speaker 3: other metaphor, which is like archaeological layers. But they only 271 00:16:12,000 --> 00:16:16,280 Speaker 3: ever appropriate money to just, like, make the fixes, not 272 00:16:16,520 --> 00:16:19,640 Speaker 3: to go back and say, how would we actually design 273 00:16:19,720 --> 00:16:22,400 Speaker 3: this system if we were designing it today? Not just 274 00:16:22,440 --> 00:16:25,600 Speaker 3: from a technological perspective, but how would we go back 275 00:16:25,600 --> 00:16:29,280 Speaker 3: and say, oh, that, you know, requirement from the forties, 276 00:16:29,720 --> 00:16:32,200 Speaker 3: you know, that's been overwritten by these other things. Why 277 00:16:32,240 --> 00:16:35,120 Speaker 3: is there still code in the system that's, you know, 278 00:16:35,440 --> 00:16:38,000 Speaker 3: dealing with that requirement from the forties, and why 279 00:16:38,080 --> 00:16:41,520 Speaker 3: are there still, you know, memos pointing to that, confusing 280 00:16:41,600 --> 00:16:44,360 Speaker 3: people about what the current state of the rules is? 281 00:16:44,680 --> 00:16:47,840 Speaker 3: But like, really, what we need to do is do 282 00:16:47,960 --> 00:16:53,040 Speaker 3: a full kind of regulatory simplification, policy and process simplification, 283 00:16:53,680 --> 00:16:56,560 Speaker 3: alongside our modernizations, and we just don't do that. 284 00:16:57,320 --> 00:17:01,920 Speaker 2: The government officials who design these projects, how much 285 00:17:02,400 --> 00:17:05,280 Speaker 2: do they ever work in tandem with vendors? Like, will 286 00:17:05,320 --> 00:17:08,000 Speaker 2: they go out and say, hey, we're thinking about upgrading 287 00:17:08,160 --> 00:17:10,919 Speaker 2: the system, what do you think we should do?
Or 288 00:17:10,960 --> 00:17:14,360 Speaker 2: does it always start with the requirements process, and then 289 00:17:14,480 --> 00:17:15,560 Speaker 2: that just gets sent out? 290 00:17:16,880 --> 00:17:19,959 Speaker 3: No, they do often work with vendors. The problem is 291 00:17:20,000 --> 00:17:23,840 Speaker 3: that the sort of way the vendor game is played, 292 00:17:23,880 --> 00:17:26,480 Speaker 3: and I'd say this is changing to some degree, it's 293 00:17:26,480 --> 00:17:29,480 Speaker 3: like the vendors benefit from all that complexity. So the 294 00:17:29,560 --> 00:17:33,280 Speaker 3: vendors never say, hey, why don't you go to your legislature 295 00:17:33,400 --> 00:17:36,520 Speaker 3: and ask, like, you know, can we, you know, 296 00:17:36,680 --> 00:17:39,560 Speaker 3: collaborate on a project to sort of rationalize all of this 297 00:17:39,680 --> 00:17:42,600 Speaker 3: policy cruft, and then we can make you a system 298 00:17:42,760 --> 00:17:45,200 Speaker 3: that's going to be, you know, a lot more elegant, 299 00:17:45,280 --> 00:17:48,879 Speaker 3: a lot more stable, a lot more scalable. Right, scalability 300 00:17:48,960 --> 00:17:53,720 Speaker 3: is really, really degraded by this complexity, because, like, 301 00:17:53,760 --> 00:17:57,040 Speaker 3: they're huge projects. I mean, I think even after we 302 00:17:57,160 --> 00:18:01,719 Speaker 3: asked California to throw out the old business system modernization 303 00:18:01,920 --> 00:18:04,919 Speaker 3: and start again, they still went out with something that 304 00:18:05,000 --> 00:18:10,160 Speaker 3: I believe was almost two billion dollars.
Might have been 305 00:18:10,200 --> 00:18:12,640 Speaker 3: one billion, so forgive me if I'm getting that wrong, 306 00:18:12,680 --> 00:18:15,080 Speaker 3: but that's still in the same range of, like, that's 307 00:18:15,119 --> 00:18:18,760 Speaker 3: a lot of money to, like, quote unquote, do 308 00:18:18,960 --> 00:18:24,280 Speaker 3: some upgrades on a system, because they really don't have 309 00:18:24,480 --> 00:18:28,760 Speaker 3: that relationship between the executive and legislative branch, and also 310 00:18:28,840 --> 00:18:31,359 Speaker 3: let's throw in the judicial, to say, what this needs 311 00:18:31,400 --> 00:18:33,960 Speaker 3: is a real reboot. I mean, unemployment insurance is not 312 00:18:34,080 --> 00:18:38,119 Speaker 3: that complicated in its basics, right? We're giving people a 313 00:18:38,160 --> 00:18:40,400 Speaker 3: certain amount of money for a certain amount of time 314 00:18:41,200 --> 00:18:47,600 Speaker 3: under certain conditions. But it's gotten wildly complicated. And, 315 00:18:48,040 --> 00:18:52,520 Speaker 3: like, most of the vendors 316 00:18:52,560 --> 00:18:56,439 Speaker 3: aren't really incentivized to be part of that conversation about 317 00:18:56,480 --> 00:18:59,840 Speaker 3: the simplification, because then the projects are going to be smaller. 318 00:19:00,400 --> 00:19:04,040 Speaker 1: Let's talk about government hiring and your experience, and you've 319 00:19:04,040 --> 00:19:07,480 Speaker 1: written some sort of wild stuff on your Substack just 320 00:19:07,520 --> 00:19:11,160 Speaker 1: about how hard it is to get a talented person 321 00:19:11,480 --> 00:19:14,199 Speaker 1: in the door. And this is even setting aside the 322 00:19:14,240 --> 00:19:17,080 Speaker 1: fact that government pay scales aren't the same as private 323 00:19:17,119 --> 00:19:19,960 Speaker 1: sector pay scales, et cetera.
There's no, like, you know, 324 00:19:20,080 --> 00:19:24,840 Speaker 1: equity upside for a talented engineer just working through the 325 00:19:24,880 --> 00:19:26,760 Speaker 1: government HR process. 326 00:19:26,800 --> 00:19:29,960 Speaker 3: What do I see as the biggest problem with hiring? To me, 327 00:19:30,200 --> 00:19:32,959 Speaker 3: and there are many, it isn't the pay, though I 328 00:19:33,000 --> 00:19:35,399 Speaker 3: know that should be fixed and a lot of people 329 00:19:35,400 --> 00:19:38,679 Speaker 3: do talk about that. It's that we don't actually assess 330 00:19:38,800 --> 00:19:42,639 Speaker 3: candidates for their skills. So if you leave apart 331 00:19:42,760 --> 00:19:45,439 Speaker 3: the, like, political appointees and the people who get appointed 332 00:19:45,480 --> 00:19:48,560 Speaker 3: through things like IPA or Schedule C, where you're using 333 00:19:48,640 --> 00:19:53,399 Speaker 3: an exception, and just talk about, like, the regular, uh, 334 00:19:53,840 --> 00:19:58,439 Speaker 3: open to the public competitive process: ninety percent of those 335 00:19:59,280 --> 00:20:03,200 Speaker 3: rely on self assessments of the person's own skill, where 336 00:20:03,520 --> 00:20:07,359 Speaker 3: they tell you what level they are, and a resume screen. 337 00:20:07,960 --> 00:20:11,360 Speaker 3: So how it goes is, as a hiring manager, you're 338 00:20:11,359 --> 00:20:14,399 Speaker 3: like completely cut out of the process. The HR person 339 00:20:14,440 --> 00:20:17,320 Speaker 3: is in control, and you say, okay, I need this 340 00:20:17,400 --> 00:20:19,080 Speaker 3: job. There's all this sort of back and forth about 341 00:20:19,080 --> 00:20:22,040 Speaker 3: what the job is. The job descriptions often aren't up 342 00:20:22,040 --> 00:20:25,080 Speaker 3: to date.
Maybe there isn't a job description or a 343 00:20:25,119 --> 00:20:28,280 Speaker 3: classification for the kind of person you need, especially if 344 00:20:28,280 --> 00:20:32,720 Speaker 3: it's technical. Leaving all that aside, you finally post a 345 00:20:32,760 --> 00:20:36,520 Speaker 3: position, and let's say five hundred resumes come in. Well, 346 00:20:36,920 --> 00:20:40,639 Speaker 3: your HR person has to give you a cert, a 347 00:20:40,800 --> 00:20:43,480 Speaker 3: list of people that they deem qualified. And the way 348 00:20:43,520 --> 00:20:46,800 Speaker 3: they get from five hundred to, say, ten on the 349 00:20:46,880 --> 00:20:50,240 Speaker 3: cert is: first they look for everybody who has cut 350 00:20:50,280 --> 00:20:53,480 Speaker 3: and pasted from the job description into their resume and 351 00:20:53,520 --> 00:20:55,840 Speaker 3: cover letter. Oh, by the way, that person had to 352 00:20:55,880 --> 00:20:58,240 Speaker 3: have known to do a government resume, which is usually 353 00:20:58,240 --> 00:21:01,200 Speaker 3: like seven pages long, as opposed to a private sector resume, which 354 00:21:01,200 --> 00:21:04,440 Speaker 3: would be shorter. That's a whole other story. So anyone who 355 00:21:04,440 --> 00:21:08,880 Speaker 3: has exact matches between the language in the job description 356 00:21:09,040 --> 00:21:13,320 Speaker 3: and their resume can go through to the next step. 357 00:21:14,080 --> 00:21:17,040 Speaker 3: Of those, the people who rate themselves as a master 358 00:21:17,280 --> 00:21:21,000 Speaker 3: on all levels can then go through to the next step, 359 00:21:21,560 --> 00:21:26,000 Speaker 3: and from there they apply veterans preference.
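The down-select funnel just described, from five hundred resumes to a short cert, can be sketched in a few lines of code. This is an illustrative caricature, not the actual Delegated Examining procedure; the field names, the "master" rating, and the cert size of ten are assumptions for the sake of the example.

```python
# Hypothetical sketch of the resume down-select funnel described above.
# Field names, the "master" self-rating, and the cert size of ten are
# illustrative assumptions, not the real Delegated Examining rules.

def build_cert(applicants, jd_keywords, cert_size=10):
    """Reduce an applicant pool to the short 'cert' a hiring manager sees."""
    # Step 1: keep only resumes that echo the job-description language verbatim.
    pool = [a for a in applicants
            if all(kw.lower() in a["resume"].lower() for kw in jd_keywords)]
    # Step 2: keep only candidates who self-rate as "master" on every competency.
    pool = [a for a in pool
            if all(r == "master" for r in a["self_ratings"].values())]
    # Step 3: veterans preference is applied to whoever is left
    # (a stable sort floats veterans to the top without reordering the rest).
    pool.sort(key=lambda a: not a["veteran"])
    return pool[:cert_size]

applicants = [
    {"name": "keyword matcher",
     "resume": "Completion of IT projects",
     "self_ratings": {"it": "master", "security": "master"},
     "veteran": False},
    {"name": "skilled researcher",
     "resume": "Won Hack the Pentagon; fluent in many languages and frameworks",
     "self_ratings": {"it": "expert", "security": "expert"},
     "veteran": True},
]
cert = build_cert(applicants, ["completion of IT projects"])
# Only the keyword matcher survives; the skilled researcher (a veteran)
# is filtered out before veterans preference can help them.
```

Note how the veterans-preference sort in step three only ever sees candidates who survived the keyword and self-rating filters, which is the guest's point about good veterans rarely reaching the cert.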
Now I am 360 00:21:26,000 --> 00:21:28,760 Speaker 3: a fan of giving veterans preference, but when you have 361 00:21:29,359 --> 00:21:32,320 Speaker 3: down-selected out all the people who didn't want to 362 00:21:32,320 --> 00:21:35,359 Speaker 3: cut and paste, or didn't want to basically lie and 363 00:21:35,400 --> 00:21:38,480 Speaker 3: just say that they were a master on every competency, 364 00:21:38,960 --> 00:21:43,320 Speaker 3: the good veterans probably haven't ended up on that cert. 365 00:21:43,760 --> 00:21:46,360 Speaker 3: And so as a hiring manager, you're given a cert 366 00:21:46,480 --> 00:21:49,520 Speaker 3: of people who just knew how to play the game 367 00:21:49,520 --> 00:21:52,480 Speaker 3: but don't necessarily have the skills, and half of the 368 00:21:52,560 --> 00:21:56,639 Speaker 3: time the hiring manager just rejects the cert. So, like, 369 00:21:56,680 --> 00:22:00,359 Speaker 3: an example of how this works, with that, you know, 370 00:22:00,680 --> 00:22:03,680 Speaker 3: looking for exact matches: it was this guy that we found 371 00:22:03,680 --> 00:22:05,800 Speaker 3: when we were working on the Defense Innovation Board 372 00:22:06,080 --> 00:22:11,280 Speaker 3: who had literally won the Hack the Pentagon contest. Like, 373 00:22:11,520 --> 00:22:15,080 Speaker 3: the top security researchers in the country came together, and 374 00:22:15,119 --> 00:22:17,400 Speaker 3: this one guy, he happened to be nineteen years old, 375 00:22:18,160 --> 00:22:22,639 Speaker 3: won that thing. He's, like, incredibly talented, and the people 376 00:22:22,840 --> 00:22:25,239 Speaker 3: there recognized this is the kind of guy that we 377 00:22:25,359 --> 00:22:28,719 Speaker 3: need in the Pentagon making our systems more secure.
So 378 00:22:28,760 --> 00:22:33,120 Speaker 3: they try to hire him, but his resume, like, just details, 379 00:22:33,240 --> 00:22:37,239 Speaker 3: you know, all the programming languages and frameworks that he 380 00:22:37,720 --> 00:22:40,359 Speaker 3: knows how to use, because that's what you would, you know, 381 00:22:40,400 --> 00:22:44,719 Speaker 3: you would talk about your experience with specific frameworks and languages 382 00:22:44,800 --> 00:22:47,800 Speaker 3: if you were applying for a private sector job. And 383 00:22:48,280 --> 00:22:51,040 Speaker 3: the, you know, HR people look at that and they go, 384 00:22:51,119 --> 00:22:53,280 Speaker 3: we have no idea what that means. That doesn't match 385 00:22:53,280 --> 00:22:56,000 Speaker 3: our language at all. And he gets kicked out, literally, 386 00:22:56,040 --> 00:22:59,119 Speaker 3: you know, in the first down select. But 387 00:22:59,200 --> 00:23:01,920 Speaker 3: it doesn't stop there. Like, the people at the Pentagon 388 00:23:02,359 --> 00:23:05,800 Speaker 3: continue to try to get this guy hired, and pretty 389 00:23:05,920 --> 00:23:09,959 Speaker 3: high level people kept intervening as they tried again 390 00:23:10,080 --> 00:23:14,280 Speaker 3: and again to get Jack Cable hired, and it took 391 00:23:14,960 --> 00:23:18,000 Speaker 3: months for him to, like, finally get through the process. 392 00:23:18,720 --> 00:23:21,720 Speaker 3: In between, one of the HR people told him, why 393 00:23:21,720 --> 00:23:24,439 Speaker 3: don't you go work for Best Buy selling TVs for 394 00:23:24,480 --> 00:23:26,280 Speaker 3: a year, because then you'll be qualified. 395 00:23:26,640 --> 00:23:28,879 Speaker 2: That is nuts.
So just to be clear, because he 396 00:23:28,920 --> 00:23:32,760 Speaker 2: said on his CV specific programming languages, like, I can 397 00:23:32,920 --> 00:23:36,639 Speaker 2: use, like, Babel, and I'm an expert in, I don't know, 398 00:23:36,800 --> 00:23:40,439 Speaker 2: npm or whatever. The HR manager didn't recognize it, and 399 00:23:40,440 --> 00:23:42,560 Speaker 2: they're like, this is not what we're looking for. 400 00:23:43,560 --> 00:23:47,680 Speaker 3: That's exactly right, and they explained this to the Defense Innovation Board. 401 00:23:48,200 --> 00:23:51,000 Speaker 3: They pulled up the language in the job description, which 402 00:23:51,080 --> 00:23:54,600 Speaker 3: was, like, you know, X years of experience doing a 403 00:23:54,760 --> 00:23:59,320 Speaker 3: very generic language about, like, the completion of IT, 404 00:23:59,400 --> 00:24:02,280 Speaker 3: you know, projects or something, which, like, doesn't mean anything, 405 00:24:02,960 --> 00:24:05,360 Speaker 3: and they were like, we don't see this, and... 406 00:24:05,640 --> 00:24:07,960 Speaker 2: Just a question, this might be a cultural question, but 407 00:24:08,080 --> 00:24:12,680 Speaker 2: why does HR not listen to the expertise of these 408 00:24:12,680 --> 00:24:14,160 Speaker 2: specific hiring managers? 409 00:24:15,119 --> 00:24:17,760 Speaker 3: Okay, that is a fantastic question, and I've actually done 410 00:24:17,840 --> 00:24:24,520 Speaker 3: some research on, like, legally, is that required? And I'll 411 00:24:24,520 --> 00:24:27,479 Speaker 3: give you the answer in a second. But I think 412 00:24:27,600 --> 00:24:33,719 Speaker 3: the real answer is that they're extremely risk averse, and 413 00:24:33,760 --> 00:24:35,879 Speaker 3: one of the principles that they're supposed to 414 00:24:36,440 --> 00:24:40,800 Speaker 3: uphold is the principle of fairness. And I like fairness.
415 00:24:40,920 --> 00:24:42,800 Speaker 3: I think that's great. But the way that it gets 416 00:24:42,800 --> 00:24:47,680 Speaker 3: applied is essentially saying: we have all these very, 417 00:24:47,880 --> 00:24:53,800 Speaker 3: very specific rules designed to keep bias out of this process, 418 00:24:54,920 --> 00:24:59,720 Speaker 3: and we are experts in those processes and those rules, 419 00:25:00,400 --> 00:25:06,000 Speaker 3: and so only we can ensure that this process occurs 420 00:25:06,040 --> 00:25:09,919 Speaker 3: without the introduction of bias, or, you know, 421 00:25:10,080 --> 00:25:15,560 Speaker 3: some perceived or real, you know, deviation from the rules. 422 00:25:16,160 --> 00:25:19,840 Speaker 3: And so they exclude what they call subject matter experts. Like, 423 00:25:19,840 --> 00:25:22,320 Speaker 3: in other words, if you're hiring a programmer, like, 424 00:25:22,520 --> 00:25:25,200 Speaker 3: you would kind of want a programmer to evaluate that person. 425 00:25:25,560 --> 00:25:28,280 Speaker 3: But programmers are not allowed to evaluate the person, again, 426 00:25:28,320 --> 00:25:32,639 Speaker 3: in ninety percent of these cases, because they might introduce bias. 427 00:25:32,960 --> 00:25:35,560 Speaker 3: And it's a perfect example for me of, like, if 428 00:25:35,600 --> 00:25:39,359 Speaker 3: you read the merit system principles that came out of 429 00:25:39,400 --> 00:25:42,879 Speaker 3: the Civil Service Reform Act of nineteen seventy eight, thank you, 430 00:25:43,000 --> 00:25:46,320 Speaker 3: President Carter, that was, you know, an important thing that 431 00:25:46,400 --> 00:25:49,960 Speaker 3: he did. They sound really great, and they talk about, 432 00:25:50,000 --> 00:25:52,280 Speaker 3: you know, hiring on the basis of merit, and fairness 433 00:25:52,320 --> 00:25:54,520 Speaker 3: is in there as well.
And if you read them, you 434 00:25:54,520 --> 00:25:57,680 Speaker 3: would say these are fantastic; federal hiring is great, no problem. 435 00:25:58,440 --> 00:26:02,399 Speaker 3: But what happens is, as they get operationalized, they go 436 00:26:02,440 --> 00:26:05,880 Speaker 3: through what I call the cascade of rigidity, where each 437 00:26:06,000 --> 00:26:09,240 Speaker 3: step down from sort of a law or policy into, 438 00:26:09,400 --> 00:26:13,000 Speaker 3: like, the way HR managers actually work on a daily basis, 439 00:26:13,320 --> 00:26:17,159 Speaker 3: just gets incredibly rigid. And the rigidity means that you 440 00:26:17,320 --> 00:26:21,520 Speaker 3: get kind of the opposite outcome of what the law intended. 441 00:26:21,560 --> 00:26:25,520 Speaker 3: The law intended merit; we're really not hiring on the 442 00:26:25,560 --> 00:26:28,240 Speaker 3: basis of merit, because it's gotten so rigid as it's 443 00:26:28,280 --> 00:26:32,359 Speaker 3: been operationalized. And I asked one of our researchers to 444 00:26:32,400 --> 00:26:36,119 Speaker 3: go look into this, like, is it in the law? 445 00:26:36,240 --> 00:26:39,679 Speaker 3: Is it in the regulations? Is it in the guidance? 446 00:26:39,840 --> 00:26:43,879 Speaker 3: Like, who wrote that HR managers control this process 447 00:26:44,080 --> 00:26:47,200 Speaker 3: and hiring managers don't have a say? And it turns 448 00:26:47,200 --> 00:26:49,320 Speaker 3: out it just sort of appears in this thing called 449 00:26:49,320 --> 00:26:54,760 Speaker 3: the Delegated Examining Operations Handbook, which means somebody wrote this 450 00:26:54,880 --> 00:26:59,320 Speaker 3: operational handbook and said, you know, this is the HR manager's job, 451 00:26:59,520 --> 00:27:03,439 Speaker 3: and so that's what people do. But, 452 00:27:03,520 --> 00:27:06,440 Speaker 3: like, the law never said that, and neither does the regulation.
453 00:27:07,000 --> 00:27:07,199 Speaker 2: You know? 454 00:27:07,280 --> 00:27:09,800 Speaker 1: I just, you know, okay. So five hundred people apply 455 00:27:09,880 --> 00:27:13,359 Speaker 1: for a job, you get ten. The hiring manager sees 456 00:27:13,760 --> 00:27:16,400 Speaker 1: ten names; they sort of might have a feeling that 457 00:27:16,520 --> 00:27:19,400 Speaker 1: these can't be the best of the five hundred. Can 458 00:27:19,440 --> 00:27:21,520 Speaker 1: you say, look, just give me the five hundred, let 459 00:27:21,600 --> 00:27:23,840 Speaker 1: me scan through, click on some of them? Because a 460 00:27:23,920 --> 00:27:27,359 Speaker 1: good hiring manager probably has some intuitions about what a 461 00:27:27,359 --> 00:27:29,720 Speaker 1: good candidate might look like. Like, will they let you 462 00:27:29,800 --> 00:27:31,480 Speaker 1: see that five hundred person list? 463 00:27:32,520 --> 00:27:35,639 Speaker 3: Well, I maintain that there is nothing legally that keeps 464 00:27:35,640 --> 00:27:38,840 Speaker 3: them from doing that. But on a practical basis, no, 465 00:27:39,040 --> 00:27:39,919 Speaker 3: they're not allowed to. 466 00:27:40,960 --> 00:27:43,840 Speaker 2: So you touched on this earlier. But part of the 467 00:27:43,880 --> 00:27:48,679 Speaker 2: reason all these requirements have been codified is 468 00:27:48,680 --> 00:27:51,560 Speaker 2: fairness, and the US presumably doesn't want to go 469 00:27:51,680 --> 00:27:56,560 Speaker 2: back to a system of patronage or nepotism or people 470 00:27:56,680 --> 00:28:00,199 Speaker 2: hiring their friends and stuff like that. What would be, 471 00:28:00,480 --> 00:28:04,719 Speaker 2: I guess, the suggested guardrails for stopping the government 472 00:28:04,720 --> 00:28:07,400 Speaker 2: from doing that, or maybe balancing, you know, the two 473 00:28:07,440 --> 00:28:10,760 Speaker 2: things, fairness and hiring based on merit?
474 00:28:11,200 --> 00:28:15,360 Speaker 3: It's a complicated question because of what I described earlier, 475 00:28:15,440 --> 00:28:18,200 Speaker 3: which is, if you don't solve the problem of everything 476 00:28:18,240 --> 00:28:22,280 Speaker 3: getting more rigid as it gets operationalized, even, you know, 477 00:28:22,359 --> 00:28:26,560 Speaker 3: acts of Congress to change, you know, civil service rules 478 00:28:26,760 --> 00:28:29,879 Speaker 3: at the highest level kind of, you know, don't have 479 00:28:29,920 --> 00:28:33,479 Speaker 3: the intended effect. But we are still always going to 480 00:28:33,480 --> 00:28:37,160 Speaker 3: be trying to balance those things. I maintain that essentially 481 00:28:37,480 --> 00:28:40,600 Speaker 3: the merit system principles defined in that nineteen seventy 482 00:28:40,600 --> 00:28:44,760 Speaker 3: eight CSRA are great. What we just need to do 483 00:28:44,800 --> 00:28:47,640 Speaker 3: is sort of, like, pull the, you know, pull the 484 00:28:47,840 --> 00:28:50,920 Speaker 3: edifice back to the studs. Like, it's a massive remodel, 485 00:28:51,400 --> 00:28:55,200 Speaker 3: but not an upending of the foundation. The foundation 486 00:28:55,520 --> 00:28:58,840 Speaker 3: is solid. And so we need to go have a 487 00:28:58,880 --> 00:29:01,400 Speaker 3: full look up and down the stack: where did we 488 00:29:01,440 --> 00:29:05,800 Speaker 3: go wrong in that cascade of rigidity? There 489 00:29:05,840 --> 00:29:08,040 Speaker 3: are probably tweaks that need to happen at the statute level, 490 00:29:08,040 --> 00:29:10,080 Speaker 3: so I think, like, OPM and Congress do need to 491 00:29:10,080 --> 00:29:11,840 Speaker 3: get involved. But a lot of it is just how 492 00:29:11,880 --> 00:29:15,880 Speaker 3: has this been operationalized. Now, you probably know that there 493 00:29:16,040 --> 00:29:20,760 Speaker 3: is this sort of nuclear bomb that is being threatened.
494 00:29:20,800 --> 00:29:22,719 Speaker 2: Oh, I was going to ask about that. 495 00:29:23,080 --> 00:29:30,800 Speaker 3: Yeah. Schedule F basically says, if you can demonstrate that 496 00:29:30,840 --> 00:29:35,120 Speaker 3: this person, who has civil service protections today, 497 00:29:35,640 --> 00:29:40,720 Speaker 3: has sort of any adjacency to policy, you can reclassify 498 00:29:40,880 --> 00:29:42,920 Speaker 3: them as a political appointee, and then they lose all 499 00:29:42,960 --> 00:29:47,000 Speaker 3: their civil service protections. And of course, uh, Trump rolled 500 00:29:47,000 --> 00:29:50,280 Speaker 3: that out, like, in the last month or something of 501 00:29:50,360 --> 00:29:52,720 Speaker 3: his first term, and has been very clear that he 502 00:29:52,720 --> 00:29:55,480 Speaker 3: wants to bring it back in the beginning of this 503 00:29:55,600 --> 00:29:59,760 Speaker 3: coming term. And, you know, in my metaphor, that is 504 00:29:59,800 --> 00:30:03,200 Speaker 3: pulling up the foundation, because that is saying we're 505 00:30:03,200 --> 00:30:07,680 Speaker 3: going to hire and fire on the basis of loyalty, essentially, 506 00:30:08,120 --> 00:30:12,040 Speaker 3: to an administration, instead of on the basis of merit. 507 00:30:12,160 --> 00:30:15,840 Speaker 3: And we do really have problems with not being able 508 00:30:15,880 --> 00:30:19,719 Speaker 3: to fire underperformers. There are a very small percentage of people 509 00:30:19,880 --> 00:30:22,920 Speaker 3: who really should have been removed, and it's super hard 510 00:30:22,960 --> 00:30:24,680 Speaker 3: to remove them. I can tell you a couple of 511 00:30:24,680 --> 00:30:27,479 Speaker 3: stories about that. But, like, we should fix that.
512 00:30:28,000 --> 00:30:32,440 Speaker 3: We should fix that, not take away the protections for 513 00:30:32,560 --> 00:30:34,800 Speaker 3: folks who stand up and say, hey, you need to 514 00:30:34,880 --> 00:30:37,160 Speaker 3: know this. Oh wait, I disagree with you, so suddenly 515 00:30:37,200 --> 00:30:42,120 Speaker 3: you're fired. Like, rebuild the thing from its strong foundation. 516 00:30:42,360 --> 00:30:44,560 Speaker 3: Don't pull the foundation out of the soil. 517 00:31:00,080 --> 00:31:04,360 Speaker 1: By the way, I'm looking at the Delegated Examining Operations Handbook: 518 00:31:07,040 --> 00:31:11,800 Speaker 1: three hundred and eighteen pages, many flow charts, many subsections, 519 00:31:11,840 --> 00:31:15,840 Speaker 1: many appendices. You know, I'm sure, look, this is serious stuff. 520 00:31:15,840 --> 00:31:17,960 Speaker 1: I'm sure that this needs to be a real document. 521 00:31:17,680 --> 00:31:18,160 Speaker 3: But it is. 522 00:31:18,680 --> 00:31:21,320 Speaker 1: There's a lot to it. This gets, though, to something 523 00:31:21,520 --> 00:31:25,320 Speaker 1: that I think is maybe part of the core question here, 524 00:31:25,480 --> 00:31:27,880 Speaker 1: and we sort of teased at it in the introduction. 525 00:31:28,360 --> 00:31:31,320 Speaker 1: It's easy enough to identify these issues and say, look, 526 00:31:31,360 --> 00:31:34,080 Speaker 1: this is absurd. And I imagine that a lot 527 00:31:34,160 --> 00:31:37,640 Speaker 1: of people experiencing this would agree, and probably over the 528 00:31:37,720 --> 00:31:40,320 Speaker 1: years in government many people have come to the same 529 00:31:40,400 --> 00:31:44,280 Speaker 1: conclusion that you have about the absurdity of some of 530 00:31:44,320 --> 00:31:48,200 Speaker 1: these workflows and so forth.
In one 531 00:31:48,240 --> 00:31:51,720 Speaker 1: of your pieces recently, something that you implied is that 532 00:31:51,800 --> 00:31:56,640 Speaker 1: maybe the answer is not just more identification of the 533 00:31:56,680 --> 00:32:00,120 Speaker 1: problems, or sort of good intention, but maybe someone who's 534 00:32:00,200 --> 00:32:03,840 Speaker 1: kind of a bully, or someone who doesn't necessarily think 535 00:32:03,920 --> 00:32:08,600 Speaker 1: the rules apply, and perhaps someone like, you know, Elon 536 00:32:08,720 --> 00:32:12,960 Speaker 1: Musk, for example, someone with that character. 537 00:32:13,600 --> 00:32:16,680 Speaker 1: Gravitas might be a word that people use, energy, I 538 00:32:16,720 --> 00:32:19,959 Speaker 1: don't know. Talk to us more, because again, it doesn't 539 00:32:20,000 --> 00:32:22,800 Speaker 1: seem like just identifying these problems is enough. 540 00:32:24,680 --> 00:32:27,160 Speaker 3: Well, I think evidently it hasn't been, because we haven't done 541 00:32:27,240 --> 00:32:31,560 Speaker 3: that much. The Chance to Compete Act is, you know, 542 00:32:31,600 --> 00:32:34,560 Speaker 3: a good move in the right direction. It's passed both 543 00:32:34,760 --> 00:32:38,000 Speaker 3: houses and, I presume, will be signed by President 544 00:32:38,000 --> 00:32:41,040 Speaker 3: Biden before he leaves office.
And it does a lot, 545 00:32:41,080 --> 00:32:44,640 Speaker 3: actually, about the assessment problem that we just discussed, but 546 00:32:44,680 --> 00:32:47,640 Speaker 3: it probably doesn't do enough, in my view, 547 00:32:48,160 --> 00:32:50,760 Speaker 3: and again, you'll have the problem of actually implementing it, 548 00:32:50,800 --> 00:32:53,000 Speaker 3: which the new head of OPM is going to find 549 00:32:53,040 --> 00:32:56,040 Speaker 3: out is pretty hard when you have an 550 00:32:56,160 --> 00:32:59,520 Speaker 3: HR workforce that's that risk averse and that sort of process obsessed. 551 00:33:00,120 --> 00:33:04,400 Speaker 3: So yeah, I'm in that weird position as a Democrat, 552 00:33:05,120 --> 00:33:07,440 Speaker 3: and someone who very much cares about the rule of 553 00:33:07,560 --> 00:33:10,880 Speaker 3: law and values the institution of government and wants to 554 00:33:10,880 --> 00:33:13,400 Speaker 3: see it succeed, in part because we have all these 555 00:33:13,480 --> 00:33:16,760 Speaker 3: challenges, and government is the only institution that can 556 00:33:17,280 --> 00:33:19,239 Speaker 3: really meet them, in my view, at least meet them 557 00:33:19,280 --> 00:33:22,640 Speaker 3: in a way that is consistent with the democratic principles 558 00:33:23,160 --> 00:33:29,160 Speaker 3: of our society. And yet we have neglected that kind 559 00:33:29,200 --> 00:33:32,280 Speaker 3: of change, or we've just not been bold enough in 560 00:33:32,480 --> 00:33:36,280 Speaker 3: trying to change the system as it exists today. I 561 00:33:36,320 --> 00:33:40,080 Speaker 3: think it's part of the culture of both the left and 562 00:33:40,120 --> 00:33:42,960 Speaker 3: the right to fight a lot about what I'll call 563 00:33:43,000 --> 00:33:46,480 Speaker 3: the what of government. Like, what bills are we passing?
564 00:33:47,120 --> 00:33:49,640 Speaker 3: You know, the CHIPS and Science Act and the Inflation 565 00:33:49,680 --> 00:33:53,360 Speaker 3: Reduction Act and the Bipartisan Infrastructure Law. Those 566 00:33:53,360 --> 00:33:56,200 Speaker 3: are all the what, and, like, they were really good, 567 00:33:56,320 --> 00:33:59,920 Speaker 3: I think, in a lot of ways. But we neglected 568 00:34:00,120 --> 00:34:03,520 Speaker 3: the how. And the how is all these things, 569 00:34:03,600 --> 00:34:07,160 Speaker 3: like the functioning of government and hiring. And again, both 570 00:34:07,200 --> 00:34:11,200 Speaker 3: parties do that. And because we've neglected the how, it's 571 00:34:11,280 --> 00:34:16,640 Speaker 3: gotten really, really sclerotic, I guess is the word, and 572 00:34:16,920 --> 00:34:19,520 Speaker 3: ineffective, as you talked about at the start of this. 573 00:34:19,719 --> 00:34:27,040 Speaker 3: And so that democratic approach generally of let's be very thoughtful, 574 00:34:27,360 --> 00:34:31,719 Speaker 3: and let's listen to all the voices and make sure 575 00:34:31,719 --> 00:34:35,480 Speaker 3: we're making very measured decisions here, which is, like, the 576 00:34:35,520 --> 00:34:37,879 Speaker 3: world I'd like to live in. That sounds great, and 577 00:34:38,440 --> 00:34:40,400 Speaker 3: I like to think of myself as a thoughtful and 578 00:34:40,440 --> 00:34:45,480 Speaker 3: measured person. But that kind of approach 579 00:34:45,640 --> 00:34:49,680 Speaker 3: tends basically to add more process and more procedure when 580 00:34:49,719 --> 00:34:51,960 Speaker 3: we already have a lot, and try to make sure 581 00:34:51,960 --> 00:34:55,200 Speaker 3: that everybody is happy. And the result of that has 582 00:34:55,320 --> 00:34:58,960 Speaker 3: been that we're not achieving our policy goals.
And so 583 00:34:59,360 --> 00:35:06,000 Speaker 3: I'm in that uncomfortable space of saying, I don't really 584 00:35:06,080 --> 00:35:09,080 Speaker 3: agree with a lot of the policy goals of the 585 00:35:09,080 --> 00:35:14,439 Speaker 3: incoming administration, but I have to acknowledge that a sort 586 00:35:14,520 --> 00:35:20,560 Speaker 3: of extremely disruptive approach might be needed for us to 587 00:35:21,280 --> 00:35:23,160 Speaker 3: just get out of the rut that we're in of 588 00:35:23,200 --> 00:35:27,360 Speaker 3: always, you know, always adding and never subtracting, and hope 589 00:35:27,480 --> 00:35:31,759 Speaker 3: that the cycles of, you know, change in power mean that, 590 00:35:32,400 --> 00:35:35,120 Speaker 3: you know, if the party that I prefer ever gets 591 00:35:35,200 --> 00:35:39,200 Speaker 3: back into power, some cleanup would have happened, and then 592 00:35:39,239 --> 00:35:43,400 Speaker 3: they can sort of rebuild in that thoughtful, considered way 593 00:35:43,719 --> 00:35:47,040 Speaker 3: that Democrats tend to do, you know, rebuild something 594 00:35:47,040 --> 00:35:52,000 Speaker 3: that's quite healthy, you know, out 595 00:35:52,080 --> 00:35:55,680 Speaker 3: of sort of this cleanup, but recognizing that this period 596 00:35:55,760 --> 00:36:01,360 Speaker 3: of disruption could be quite bad in many ways and 597 00:36:01,680 --> 00:36:02,440 Speaker 3: yet maybe needed. 598 00:36:02,960 --> 00:36:07,880 Speaker 2: I mean, there have been some virtually process free suggestions 599 00:36:07,920 --> 00:36:12,480 Speaker 2: for firing workers made by Musk and Vivek Ramaswamy. I 600 00:36:12,480 --> 00:36:16,040 Speaker 2: think Ramaswamy wanted to, didn't he want to, like, fire 601 00:36:16,719 --> 00:36:21,040 Speaker 2: all employees with, like, odd numbered Social Security numbers or something?
602 00:36:22,520 --> 00:36:26,239 Speaker 3: He did. And I will say that, A, that's a really 603 00:36:26,400 --> 00:36:30,600 Speaker 3: dumb idea, because the people in government who are the 604 00:36:30,600 --> 00:36:33,800 Speaker 3: biggest advocates for change, who could be Vivek and 605 00:36:33,880 --> 00:36:37,759 Speaker 3: Elon's, you know, greatest allies in making the kind of 606 00:36:37,840 --> 00:36:41,160 Speaker 3: changes that they want, or at least reasonable changes, would probably 607 00:36:41,440 --> 00:36:44,920 Speaker 3: get fired, leaving you with the ones that should have 608 00:36:44,960 --> 00:36:48,000 Speaker 3: gotten fired. But I will also say that he said 609 00:36:48,040 --> 00:36:51,440 Speaker 3: that on the campaign trail, and he's since given interviews, 610 00:36:51,680 --> 00:36:53,560 Speaker 3: and I'm not trying to defend him, but he's 611 00:36:53,600 --> 00:36:57,960 Speaker 3: since given interviews that demonstrate quite a bit more understanding 612 00:36:58,600 --> 00:37:04,160 Speaker 3: of the right ways to do this. He's acknowledged, for instance, 613 00:37:04,239 --> 00:37:09,080 Speaker 3: that, you know, it's Congress who's in charge of laws 614 00:37:09,080 --> 00:37:12,680 Speaker 3: and regulations, and they can't just ride roughshod over that, 615 00:37:12,680 --> 00:37:15,520 Speaker 3: that there are probably better ways to deal with the workforce. 616 00:37:15,920 --> 00:37:19,920 Speaker 3: But he's certainly coming from this place of just completely 617 00:37:19,960 --> 00:37:23,360 Speaker 3: crazy ideas that at least maybe move the Overton window, 618 00:37:23,440 --> 00:37:25,560 Speaker 3: but should not actually be operationalized.
619 00:37:26,040 --> 00:37:29,000 Speaker 2: Can you give an example of... I understand we haven't 620 00:37:29,120 --> 00:37:32,400 Speaker 2: done any big reforms to the hiring process, but I imagine 621 00:37:32,280 --> 00:37:36,200 Speaker 2: there have been some attempts previously to improve it at 622 00:37:36,200 --> 00:37:40,640 Speaker 2: the margins. Are there any successful demonstrations of changes here? 623 00:37:42,239 --> 00:37:44,719 Speaker 3: Under the first Trump administration, a team at the United 624 00:37:44,719 --> 00:37:48,120 Speaker 3: States Digital Service started something with the absolutely awful name 625 00:37:48,160 --> 00:37:55,719 Speaker 3: of SME-QA, Subject Matter Expert Qualifying Assessments. I know, 626 00:37:55,760 --> 00:37:56,919 Speaker 3: but it just doesn't sound okay. 627 00:37:56,920 --> 00:37:59,120 Speaker 1: At least it's not DOGE, right? Like, at least it 628 00:37:59,239 --> 00:38:02,520 Speaker 1: sounds serious. Yeah, keep going. 629 00:38:02,640 --> 00:38:04,439 Speaker 3: Well, the funny thing is, it sounds... 630 00:38:04,160 --> 00:38:06,880 Speaker 1: Very government, which I'm okay with. 631 00:38:08,040 --> 00:38:12,640 Speaker 3: But essentially, this team looked at that problem of ninety 632 00:38:12,719 --> 00:38:17,200 Speaker 3: percent of hires happening without a reasonable assessment of the 633 00:38:17,239 --> 00:38:20,319 Speaker 3: candidate's skills and said, you know, we don't need to 634 00:38:20,360 --> 00:38:23,200 Speaker 3: do this. The law and policy absolutely allow for real 635 00:38:23,239 --> 00:38:27,000 Speaker 3: assessments. And they put together a process that hiring managers and 636 00:38:27,440 --> 00:38:32,360 Speaker 3: their HR partners could use to, for instance, have 637 00:38:32,520 --> 00:38:35,759 Speaker 3: programmers assessed by actual other programmers, you know.
But 638 00:38:35,800 --> 00:38:39,880 Speaker 3: because it's working within existing, not only law and policy, 639 00:38:39,920 --> 00:38:43,360 Speaker 3: but sort of the culture of the Office of Personnel Management, 640 00:38:43,360 --> 00:38:47,239 Speaker 3: it's still a pretty heavyweight process. But the people I 641 00:38:47,280 --> 00:38:50,759 Speaker 3: know who used it to hire people love it. They 642 00:38:51,320 --> 00:38:55,440 Speaker 3: get great candidates through it. And it just hasn't scaled, 643 00:38:56,000 --> 00:38:58,520 Speaker 3: in part because I don't think it was a huge 644 00:38:58,520 --> 00:39:01,480 Speaker 3: priority of the Biden administration, and in part because it's 645 00:39:01,480 --> 00:39:05,000 Speaker 3: still just a lot of work to get all the 646 00:39:05,280 --> 00:39:09,080 Speaker 3: people who can review these programmers' resumes, for instance, to 647 00:39:09,080 --> 00:39:12,120 Speaker 3: take time out of their jobs to do this in 648 00:39:12,120 --> 00:39:17,440 Speaker 3: a very prescribed, kind of controlled way. Like, you can't 649 00:39:17,440 --> 00:39:19,200 Speaker 3: just take someone's time and say, hey, look at this 650 00:39:19,200 --> 00:39:20,480 Speaker 3: guy's resume, do you think he's good? 651 00:39:21,880 --> 00:39:24,240 Speaker 1: What happens when you try to fire someone in the government? 652 00:39:24,520 --> 00:39:28,040 Speaker 1: You know, we did that episode several weeks ago about 653 00:39:28,920 --> 00:39:32,160 Speaker 1: Medicare fraud and the importance of having good data scientists 654 00:39:32,160 --> 00:39:34,360 Speaker 1: to look at it. And let's say there's a data scientist 655 00:39:34,360 --> 00:39:36,759 Speaker 1: who isn't good, or isn't finding anything, whatever it is. 656 00:39:36,840 --> 00:39:38,520 Speaker 1: What happens when you try to fire them? 657 00:39:39,840 --> 00:39:42,160 Speaker 3: So most of the time, on a practical basis.
what 658 00:39:42,280 --> 00:39:46,160 Speaker 3: happens is, HR will tell you it's 659 00:39:46,160 --> 00:39:48,879 Speaker 3: going to be impossible to fire this person. Don't give 660 00:39:48,920 --> 00:39:51,040 Speaker 3: them a bad rating, because if you give them a 661 00:39:51,040 --> 00:39:53,960 Speaker 3: bad rating, we won't be able to transfer them. So 662 00:39:54,239 --> 00:39:56,520 Speaker 3: let's work on finding a transfer so they can be 663 00:39:56,600 --> 00:40:01,640 Speaker 3: someone else's problem. Not great. I heard from a friend 664 00:40:01,719 --> 00:40:07,200 Speaker 3: recently at an agency who got a complaint. One of 665 00:40:07,239 --> 00:40:09,280 Speaker 3: the employees on his team complained about 666 00:40:09,280 --> 00:40:12,040 Speaker 3: the team leader who reports to him. She filed 667 00:40:12,120 --> 00:40:18,560 Speaker 3: a formal complaint because that manager hadn't given her an 668 00:40:18,600 --> 00:40:22,600 Speaker 3: outstanding or exceptional rating. This person had been absent 669 00:40:22,680 --> 00:40:25,200 Speaker 3: a ton of the year, her skills were way outdated, 670 00:40:25,640 --> 00:40:29,520 Speaker 3: and the HR team said, well, you know, the best 671 00:40:29,520 --> 00:40:32,799 Speaker 3: thing to do is to, you know, assign her some 672 00:40:33,320 --> 00:40:36,880 Speaker 3: work that she can do, or very specific work that 673 00:40:36,920 --> 00:40:39,799 Speaker 3: you don't think she'll be able to achieve. But you know, 674 00:40:39,880 --> 00:40:43,319 Speaker 3: you can't push back on this. We would lose, and 675 00:40:43,440 --> 00:40:45,760 Speaker 3: you know, so we're just going to sort of manage 676 00:40:45,800 --> 00:40:48,520 Speaker 3: the situation.
And oh, by the way, your manager is 677 00:40:48,920 --> 00:40:51,719 Speaker 3: potentially in trouble here because of this threat of 678 00:40:51,719 --> 00:40:55,359 Speaker 3: a lawsuit. So it's like completely upside down, and what 679 00:40:55,440 --> 00:40:57,440 Speaker 3: they will do is try to get that person to 680 00:40:57,480 --> 00:41:01,439 Speaker 3: be moved on, to be someone else's problem. It's really 681 00:41:01,440 --> 00:41:05,040 Speaker 3: important to note that there is nothing that says you 682 00:41:05,120 --> 00:41:09,440 Speaker 3: can't fire people in government, and there is some number, 683 00:41:09,560 --> 00:41:12,920 Speaker 3: and I'm forgetting it, that do get fired every year. So 684 00:41:13,160 --> 00:41:17,480 Speaker 3: it's not impossible. The law doesn't prevent it by any means. 685 00:41:17,840 --> 00:41:24,200 Speaker 3: It's more that there are many, many pathways for employees 686 00:41:24,440 --> 00:41:27,799 Speaker 3: to protest, you know, not just being fired, but like 687 00:41:27,960 --> 00:41:31,160 Speaker 3: being given anything other than the top rating. And because 688 00:41:31,160 --> 00:41:37,319 Speaker 3: they have all those pathways to protest, they can sort 689 00:41:37,360 --> 00:41:40,360 Speaker 3: of run the clock out on all of those, sucking 690 00:41:40,440 --> 00:41:43,360 Speaker 3: up all of the time of the managers and the 691 00:41:43,520 --> 00:41:46,400 Speaker 3: HR people and the lawyers, in a way that, you know, 692 00:41:46,600 --> 00:41:48,400 Speaker 3: everyone's like, it's not worth it. Like we 693 00:41:48,400 --> 00:41:51,120 Speaker 3: can't get our work done if we're going to also 694 00:41:51,160 --> 00:41:53,359 Speaker 3: try to manage this person out, and so they give up.
695 00:41:54,120 --> 00:41:57,160 Speaker 2: It feels like there's also an incentive problem, where if 696 00:41:57,160 --> 00:42:01,560 Speaker 2: you're a manager dealing with a bad employee, you know, 697 00:42:01,640 --> 00:42:04,720 Speaker 2: even if you manage to give them to another team, 698 00:42:05,239 --> 00:42:07,560 Speaker 2: there's no guarantee that you're going to get the head 699 00:42:07,560 --> 00:42:10,560 Speaker 2: count to replace them. And of course you have to 700 00:42:10,560 --> 00:42:13,040 Speaker 2: start the hiring process all over again, and you don't 701 00:42:13,080 --> 00:42:15,400 Speaker 2: know who you're going to get. And I feel like 702 00:42:15,560 --> 00:42:19,239 Speaker 2: in politics especially, or government bureaucracy, where there's a lot 703 00:42:19,320 --> 00:42:24,640 Speaker 2: of presumably internal infighting and, you know, office politics, that 704 00:42:24,800 --> 00:42:27,520 Speaker 2: there's a lot of empire building as well, right? Like 705 00:42:27,600 --> 00:42:30,440 Speaker 2: people do not want to see their teams reduced and 706 00:42:30,480 --> 00:42:31,480 Speaker 2: their budgets reduced.
707 00:42:31,600 --> 00:42:34,120 Speaker 3: That is a hundred percent true, and I think it's an 708 00:42:34,200 --> 00:42:38,080 Speaker 3: interesting dynamic right now, with this sort of threat of, 709 00:42:38,320 --> 00:42:42,719 Speaker 3: you know, mass firings going on, that that instinct of 710 00:42:43,880 --> 00:42:46,640 Speaker 3: I am more important the more people report into me 711 00:42:47,000 --> 00:42:49,359 Speaker 3: is sort of rearing its head in a way 712 00:42:49,440 --> 00:42:52,720 Speaker 3: that I wish it weren't, because I think public sector 713 00:42:52,719 --> 00:42:55,520 Speaker 3: employees have a lot to be proud of in terms 714 00:42:55,600 --> 00:42:58,640 Speaker 3: of the results they can produce for the American people, 715 00:42:59,040 --> 00:43:01,960 Speaker 3: and that ought to be the metric of pride 716 00:43:02,440 --> 00:43:06,040 Speaker 3: and status, not how many people report to them. And 717 00:43:06,239 --> 00:43:09,680 Speaker 3: again, I'm not a big fan of this, like, let's, 718 00:43:09,719 --> 00:43:13,840 Speaker 3: you know, make every civil servant afraid to come in 719 00:43:13,880 --> 00:43:16,440 Speaker 3: every day. That kind of rhetoric is very sad to me, 720 00:43:17,200 --> 00:43:19,719 Speaker 3: but I think it's also true that the 721 00:43:19,760 --> 00:43:23,319 Speaker 3: civil service should, you know, measure its value a little 722 00:43:23,320 --> 00:43:25,560 Speaker 3: differently than it often does today. And I'm not saying 723 00:43:25,600 --> 00:43:29,239 Speaker 3: it's everyone, but that instinct of empire building isn't 724 00:43:29,280 --> 00:43:30,920 Speaker 3: as healthy as we would like. 725 00:43:31,760 --> 00:43:34,960 Speaker 1: Jennifer Pahlka, thank you so much for coming back on 726 00:43:35,000 --> 00:43:37,640 Speaker 1: Odd Lots. Maybe we'll, uh, we should have you back, 727 00:43:37,680 --> 00:43:39,040 Speaker 1: like in a year. We could do it.
You can 728 00:43:39,120 --> 00:43:42,919 Speaker 1: sort of give your assessment on what you've seen so far. 729 00:43:42,960 --> 00:43:46,960 Speaker 1: We could sort of do regular, uh, yeah, regular 730 00:43:47,040 --> 00:43:49,120 Speaker 1: DOGE report cards and updates. 731 00:43:49,560 --> 00:43:52,040 Speaker 3: I love that. That's great. We'll see if DOGE is 732 00:43:52,040 --> 00:43:52,560 Speaker 3: even around. 733 00:43:52,680 --> 00:43:54,520 Speaker 1: Yeah, right, we'll see if it even exists. Certainly. 734 00:43:54,520 --> 00:43:55,080 Speaker 2: All good for it. 735 00:43:55,200 --> 00:44:11,600 Speaker 1: All right, thank you so much. That was fantastic. That 736 00:44:11,719 --> 00:44:14,600 Speaker 1: was really good, Tracy, just reacting and hearing some of 737 00:44:14,600 --> 00:44:15,840 Speaker 1: those stories. 738 00:44:15,440 --> 00:44:18,360 Speaker 2: Like about hiring, the stories are amazing. You kind of 739 00:44:18,400 --> 00:44:20,960 Speaker 2: just want to bang your head against the wall going, well. 740 00:44:21,000 --> 00:44:23,400 Speaker 1: Also, it makes me wonder if it would be, 741 00:44:23,840 --> 00:44:26,120 Speaker 1: I don't know anything about applying for a government job, 742 00:44:26,120 --> 00:44:28,399 Speaker 1: but if you just sort of know the hacks, right, 743 00:44:28,520 --> 00:44:30,960 Speaker 1: if you just sort of know what a government resume 744 00:44:31,080 --> 00:44:33,400 Speaker 1: is supposed to look like, and the importance of copying 745 00:44:33,440 --> 00:44:37,279 Speaker 1: and pasting keywords and getting past that initial round, that 746 00:44:37,360 --> 00:44:39,799 Speaker 1: seems like good alpha for listeners if you're 747 00:44:39,800 --> 00:44:42,960 Speaker 1: ever applying for a government job. Some good advice there 748 00:44:42,960 --> 00:44:43,560 Speaker 1: from Jennifer. 749 00:44:43,719 --> 00:44:43,959 Speaker 3: Yeah.
750 00:44:44,000 --> 00:44:47,000 Speaker 2: The best government officials, the most effective government officials, are 751 00:44:47,000 --> 00:44:48,640 Speaker 2: the ones who know how to copy and paste. 752 00:44:49,040 --> 00:44:50,399 Speaker 1: Depressing as it is. 753 00:44:50,520 --> 00:44:52,680 Speaker 2: Oh gosh, there was one thing I actually forgot to 754 00:44:52,719 --> 00:44:54,799 Speaker 2: ask Jen. But you know, she brought up that, like, 755 00:44:54,880 --> 00:44:59,000 Speaker 2: three hundred page document, and it struck me that, 756 00:44:59,440 --> 00:45:02,319 Speaker 2: I doubt anyone is reading that from cover to cover, right? 757 00:45:02,640 --> 00:45:06,240 Speaker 2: Probably not. And so if you were looking to streamline 758 00:45:06,440 --> 00:45:09,719 Speaker 2: some of those processes, again, I know identifying the problem 759 00:45:09,920 --> 00:45:12,640 Speaker 2: isn't the biggest issue here, it's actually trying to fix 760 00:45:12,680 --> 00:45:15,719 Speaker 2: it and act on it. But it seems like you 761 00:45:15,760 --> 00:45:17,240 Speaker 2: could use AI to go through 762 00:45:17,080 --> 00:45:19,640 Speaker 1: a lot of those rules, right? Yeah, yeah. 763 00:45:19,320 --> 00:45:21,919 Speaker 2: And say, can you take this flow chart that goes 764 00:45:21,960 --> 00:45:25,279 Speaker 2: on for five pages and reduce it down to, I don't know, 765 00:45:25,400 --> 00:45:26,280 Speaker 2: half a page.
766 00:45:26,440 --> 00:45:28,279 Speaker 1: The other thing I would just say is that a 767 00:45:28,320 --> 00:45:31,319 Speaker 1: lot of the sort of, I would say, pathologies of 768 00:45:31,360 --> 00:45:36,719 Speaker 1: the government are, in my experience, pathologies of large organizations, period, right? 769 00:45:36,719 --> 00:45:39,440 Speaker 1: And I think you find, you know, there are 770 00:45:39,440 --> 00:45:42,440 Speaker 1: frustrations you have as a customer of large companies, there 771 00:45:42,440 --> 00:45:45,480 Speaker 1: are frustrations you have as an employee of large companies. 772 00:45:45,960 --> 00:45:49,319 Speaker 1: These things exist, the way sort of certain institutions within 773 00:45:49,360 --> 00:45:52,960 Speaker 1: an organization perpetuate themselves, holding onto power through sort 774 00:45:53,000 --> 00:45:56,439 Speaker 1: of strict adherence to rules that only a handful of people really 775 00:45:56,480 --> 00:45:57,200 Speaker 1: know. 776 00:45:57,120 --> 00:45:59,719 Speaker 2: People that know the system, people that build the systems 777 00:45:59,719 --> 00:46:00,520 Speaker 2: that no one else can. 778 00:46:00,480 --> 00:46:01,439 Speaker 3: They're just. 779 00:46:01,360 --> 00:46:04,840 Speaker 1: Really hard to root out, and they're hard in 780 00:46:04,920 --> 00:46:10,000 Speaker 1: any organization. And obviously, you know, private sector entities sometimes 781 00:46:10,120 --> 00:46:12,879 Speaker 1: benefit because someone says, look, we really need to cut 782 00:46:12,920 --> 00:46:15,440 Speaker 1: this area or whatever, in a way that often hasn't 783 00:46:15,480 --> 00:46:19,520 Speaker 1: happened in the government. But these are really tough problems 784 00:46:19,600 --> 00:46:22,480 Speaker 1: in organizations.
And I think anyone who goes to business 785 00:46:22,480 --> 00:46:25,879 Speaker 1: school and then into any Fortune five hundred company would 786 00:46:25,960 --> 00:46:30,000 Speaker 1: probably be faced with sort of similar challenges in reforming 787 00:46:30,000 --> 00:46:30,840 Speaker 1: any organization. 788 00:46:31,080 --> 00:46:33,880 Speaker 2: Absolutely. And I will say, this is obviously a highly 789 00:46:33,960 --> 00:46:38,319 Speaker 2: politically charged issue, but you know, I think everyone on 790 00:46:38,360 --> 00:46:40,680 Speaker 2: both sides of the aisle can agree that they want 791 00:46:41,160 --> 00:46:44,920 Speaker 2: an effective government, right? You want a government that is 792 00:46:45,040 --> 00:46:49,440 Speaker 2: doing good and productive things and not wasting people's money, 793 00:46:49,440 --> 00:46:53,759 Speaker 2: because at a minimum, that feeds into a perception problem. 794 00:46:54,520 --> 00:46:57,239 Speaker 2: And so this is a really interesting topic. I think 795 00:46:57,280 --> 00:46:59,719 Speaker 2: we'll do more on it. Absolutely. Okay, shall we leave 796 00:46:59,719 --> 00:47:00,000 Speaker 2: it there? 797 00:47:00,040 --> 00:47:00,719 Speaker 1: Let's leave it there. 798 00:47:00,920 --> 00:47:03,879 Speaker 2: This has been another episode of the Odd Lots podcast. I'm 799 00:47:03,880 --> 00:47:06,560 Speaker 2: Tracy Alloway. You can follow me at Tracy Alloway. 800 00:47:06,800 --> 00:47:09,800 Speaker 1: And I'm Joe Weisenthal. You can follow me at The Stalwart. 801 00:47:10,080 --> 00:47:13,400 Speaker 1: Follow our guest Jennifer Pahlka. She's at Pahlka, and 802 00:47:13,520 --> 00:47:16,720 Speaker 1: definitely go check out her Substack and her book. Follow 803 00:47:16,760 --> 00:47:20,200 Speaker 1: our producers Carmen Rodriguez at Carman Arman, Dashiell Bennett 804 00:47:20,239 --> 00:47:23,840 Speaker 1: at Dashbot, and Kale Brooks at Kale Brooks.
For more Odd Lots content, 805 00:47:23,880 --> 00:47:26,120 Speaker 1: go to Bloomberg dot com slash odd lots, where we 806 00:47:26,120 --> 00:47:28,680 Speaker 1: have transcripts, a blog, and a newsletter, and you can 807 00:47:28,760 --> 00:47:30,840 Speaker 1: chat about all of these topics twenty four seven 808 00:47:30,920 --> 00:47:34,480 Speaker 1: in our Discord, Discord dot gg slash odd lots. 809 00:47:34,680 --> 00:47:37,120 Speaker 2: And if you enjoy Odd Lots, if you like it 810 00:47:37,200 --> 00:47:40,879 Speaker 2: when we dive deep into various bureaucracies, then please leave 811 00:47:40,960 --> 00:47:44,960 Speaker 2: us a positive review on your favorite podcast platform. And remember, 812 00:47:45,000 --> 00:47:47,560 Speaker 2: if you are a Bloomberg subscriber, you can listen to 813 00:47:47,680 --> 00:47:50,640 Speaker 2: all of our episodes absolutely ad free. All you need 814 00:47:50,719 --> 00:47:53,480 Speaker 2: to do is find the Bloomberg channel on Apple Podcasts 815 00:47:53,480 --> 00:48:20,480 Speaker 2: and follow the instructions there. Thanks for listening.