Joe: Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Tracy: And I'm Tracy Alloway.

Joe: Tracy, I forgot to ask you, and it's kind of embarrassing this late in January, but how was your New Year's? How were your holidays? Did you have a good Christmas and stuff?

Tracy: Thanks, Joe. I had an excellent Christmas. I stayed at home for a week with my husband and my dog, and we did hardly anything, and it was absolutely glorious. How about you?

Joe: It was all right. So the thing was, I was down in Texas visiting family, but there was that huge cold blast, and when the weather gets really cold, it's worse in a place like Texas, because none of the buildings are insulated particularly well, none of them have that heat. When it's really cold, it's almost better to be in a cold place where people are used to it. So it was a little uncomfortable. But the good news is, somehow we managed to travel back and forth without having any major airline disruptions.
Tracy: Right. So this is the key thing that happened right before Christmas, which is we had that very big winter storm, the Arctic bomb blast, and it disrupted a ton of flights, first off, because of the weather. But then what happened is you had this sort of cascade effect. Because the weather event was so large, a number of airlines, but one airline in particular, experienced a lot of problems with its software, and Southwest had to cancel, I think in the end it was something like sixteen thousand flights. You had millions of passengers affected, and you had a weather-related disruption that lasted one or two days end up lasting, I think, more than a week, because of the impact of the computer glitches, I guess.

Joe: Yeah, yeah, that's right. You know, airline travel always sort of cascades and ripples out, right, because a canceled flight is going to affect other flights and so forth. But it seemed like Southwest experienced something unique, which is that theirs turned into this major software problem.
Joe: And it's sort of a reminder that, okay, when we use the Internet, when we use modern consumer software, it's all very zippy and quick, and it has nice interfaces. And then when you use back-end corporate software, particularly at large legacy institutions, it's nothing like the consumer Internet. It's clunky. We all know what it's like, right?

Tracy: So this is something that came up quite a lot. I used to cover the big banks, and one of the crazy things that I learned relatively early on while I was doing this was just how much of their IT system was still these old, creaky, big iron mainframes, some of them still running on COBOL, which is the programming language that I think dates back to the nineteen fifties or nineteen sixties. And I remember, you know, you hear this, you hear, oh, I can't believe that these big banks, our entire financial system in some respects, are still running on these legacy computer systems.
Tracy: But on the other hand, if you look at what a big bank is, it's structurally a series of mergers and acquisitions. There used to be this great flow chart that showed the formation of a JPMorgan or a Bank of America, and you can just see it's a series of roll-ups of smaller banks. And you think, every time they acquire a new bank, they have to integrate another system into their own system, and in the end you kind of end up with this just incredibly complex and kind of patchy IT structure that in some ways very much resembles the amalgamation of all these smaller banks into a larger bank.

Joe: There's something to that, right? You think of banks, and when they merge, it's just like, okay, the capital all merges together and the assets, etcetera. But none of them are going to have IT systems that work perfectly together, so they're glued together with duct tape.
Joe: And just over time — I think "technical debt" is the term that software engineers use — you accumulate all this technical debt. And surely nobody likes it inside the bank at any level. But I've always been curious: what are the economics such that it is just so impossible for these legacy institutions — banking, airlines, obviously the public sector — to get with the times, you know?

Tracy: Absolutely. And when it comes to Southwest — so first of all, let me declare a small self-interest in Southwest, which is my dad was a pilot for Southwest for a long time, and I remember when the outages were happening this Christmas, I sent him a text going, well, what do you think about this? And he just said, well, all the crew are out of position and they need to get them back. So a very straightforward ex-pilot answer about what was going on. Not that helpful for this podcast. But I do think in the case of Southwest, this has been a long-running issue, and you have had, you know, people talking about it at various times, the need to upgrade the technological infrastructure, and yet it hasn't happened.
Tracy: And so the question is why. The question is whether or not it is cheaper just to keep running these old mainframes, and assume that you are going to have these outages during major events, than it would cost to actually upgrade it.

Joe: Absolutely. Well, let's talk to somebody who knows about software and knows about economics and can walk us through and help us understand the problem. We're gonna be speaking with Patrick McKenzie. He's an expert on software and infrastructure. He's the writer of the Bits about Money newsletter and knows a lot about finance, and he just left Stripe after six years — he's still an adviser there. Been reading his stuff for a long time. One of these people I sort of trust on almost any topic. Patrick, thank you so much for coming on Odd Lots.

Patrick: Thanks very much for having me.

Joe: What is the deal? Let's just start straight up. I mean, Tracy mentioned all these institutions, they just smush it all together. They sort of tie it together with duct tape and so forth. You get these unwieldy things.
Joe: Just give us the high-level view of why is it so difficult, at an abstract level, to modernize legacy software?

Patrick: So I'll start with a disclaimer, which is sort of mandatory in engineering culture. We have this thing that we've come up with over the course of the last two decades or so called blameless postmortems, where when there is a failure within a company — where, you know, planes cannot be up in the air for a week at a time — rather than trying to point the finger at someone and say it was your decision or your inaction that caused this event, we as engineers want to look at the objective reality of the system and figure out what went wrong, for the benefit of both that organization and the larger community.
Patrick: And so this isn't to grind their nose in it, but just, you know, as an engineering matter, what probably happened. In an ideal world, if this had happened at a Google, for example, the engineering teams would push, because of the culture of these things, to do a very public postmortem of what the decisions were, what the background is, etcetera, etcetera. In more traditional industries, I don't think that culture is fully baked yet, as it were, although there very well might be a postmortem by, you know, the FAA and federal regulators, because, you know, liveness constraints are a real thing for extremely important economic systems like airlines. Anyhow, what probably happened: it wouldn't surprise anyone in the bowels of an airline that if you put off maintenance on airplanes for decades at a time, eventually bad things would happen, and no one would countenance that. However, software systems are quite similar. They aren't a build-once-and-then-run-for-the-rest-of-eternity sort of thing. There were some decisions made early in their lifetimes which are no longer accurate for the world we live in.
Patrick: They do suffer from something engineers euphemistically call bit rot, where software which worked back in the past will tend to succumb to entropy over time and not work exactly perfectly for all time afterwards, and so you need to be doing an ongoing program of maintenance for your software, just like you would for your airplanes. That bluntly was not done, and it seems to be credibly reported that the sort of cultural factors at Southwest that caused that to not be done might have been a sort of overly accounting-slash-penny-pinching-focused management culture, which thought, well, it costs money in the short term to do maintenance; we can cram down our engineering costs by doing less of this and relying more on external vendors, etcetera, etcetera.
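Patrick's "bit rot" point can be made concrete with a toy sketch: code that was correct when written becomes wrong as the world changes around it, even if nobody touches a line of it. Everything here is invented for illustration (the function name, the two-digit date format, the scenario) — it is not anything from Southwest's actual systems.

```python
from datetime import date

def parse_crew_cert_expiry(yymmdd: str) -> date:
    """Parse a YYMMDD expiry date, with a decades-old assumption baked in."""
    yy, mm, dd = int(yymmdd[:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    # The assumption: all years are 19xx. Fine when written, rotting ever since.
    return date(1900 + yy, mm, dd)

# When this code shipped, it was correct:
parse_crew_cert_expiry("851231")  # -> date(1985, 12, 31)

# Decades later, the same untouched code is off by a century:
parse_crew_cert_expiry("231231")  # -> date(1923, 12, 31), not 2023
```

Nothing "broke" in the code itself; the world drifted out from under it, which is why maintenance has to be an ongoing program rather than a one-time build.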
Patrick: And then when stuff hits the oscillating blades, the right people have not done the years of work that are required to get to a posture where you can quickly recover from failures. And so they were left in a spot where what we euphemistically call "heroics" in the industry were required from folks in operations: trying to, you know, contact thousands and tens of thousands of employees by phone, and pass around their information in probably spreadsheets, to figure out where the crew actually were, to be able to tick the boxes that are required due to regulation to allow people to get back up in the air. So that's probably the high-level root cause of what happened. But a reason to conduct these postmortems is that it's rarely one single decision made by one person.
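The reconciliation work Patrick describes — figuring out by phone and spreadsheet which crew are where — can be sketched as a tiny toy model. The names and data here are hypothetical, and the real problem (schedules, crew legality rules, connecting flights) is vastly harder; this only shows the basic "expected versus actual position" check:

```python
def find_out_of_position(assignments: dict[str, str],
                         actual_locations: dict[str, str]) -> dict[str, tuple[str, str]]:
    """Return crew whose actual airport differs from where their next flight departs,
    mapped to an (expected, actual) pair."""
    return {
        crew: (expected, actual_locations.get(crew, "UNKNOWN"))
        for crew, expected in assignments.items()
        if actual_locations.get(crew) != expected
    }

# Two pilots due to fly out of DEN; one got stranded in MDW by cancellations.
assignments = {"pilot_a": "DEN", "pilot_b": "DEN"}
actual = {"pilot_a": "DEN", "pilot_b": "MDW"}
find_out_of_position(assignments, actual)  # -> {"pilot_b": ("DEN", "MDW")}
```

When the software that tracks `actual_locations` falls over, even this trivial check has to be reconstructed by hand, one phone call at a time — which is roughly what "heroics" meant here.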
Patrick: It's the result of cultural factors, business decisions made over the course of probably decades in this case, and we want to tease out the various nuances there and make both Southwest and other organizations aware of that, so that they don't suffer critical systemic failures in their own places.

Tracy: So, first of all, "bit rot" is a fantastic term that I am going to have to try to work into all my conversations going forward. But secondly, just to step back a bit, can you maybe explain, you know, when we talk about mainframe computer systems, what exactly are we talking about? And what is the counterpoint to mainframe? I'm assuming it's more like cloud-based applications and things like that, but could you maybe define those basic terms very quickly? And then secondly, my understanding of the Southwest debacle was that they had this in-house software — I think it was called SkySolver — but it was based on an application that GE had been selling, and then Southwest kind of customized it.
194 00:10:45,880 --> 00:10:48,839 Speaker 1: And I guess my question is how endemic is that 195 00:10:48,880 --> 00:10:51,719 Speaker 1: type of software where you get something off the shelf, 196 00:10:51,760 --> 00:10:53,760 Speaker 1: but then you customize it in such a way that 197 00:10:53,800 --> 00:10:57,680 Speaker 1: it becomes um, I guess special to you and therefore 198 00:10:57,920 --> 00:11:01,680 Speaker 1: your problem when it goes awry. So let's talk about 199 00:11:01,720 --> 00:11:04,080 Speaker 1: the the main frame first, and then we'll talk about 200 00:11:04,080 --> 00:11:07,120 Speaker 1: the problem of like who owns the problem, whether it's 201 00:11:07,120 --> 00:11:09,280 Speaker 1: your problem or some of the vendor's problem, and the 202 00:11:09,840 --> 00:11:12,439 Speaker 1: way that balls tend to end uh in the middle 203 00:11:12,480 --> 00:11:16,320 Speaker 1: of people and things get dropped. So mainframes back in 204 00:11:16,360 --> 00:11:19,880 Speaker 1: the day, many many decades ago, computers were approximately the 205 00:11:19,920 --> 00:11:22,160 Speaker 1: size of a room. This is the before the personal 206 00:11:22,160 --> 00:11:25,360 Speaker 1: computer revolution and banks, which were some of the earliest 207 00:11:25,360 --> 00:11:28,840 Speaker 1: adopters of computers for sort of scaled usage in industry. 
Patrick: And it's funny, the earliest users of computers tended to be either the financial industry or the military, either attempting to move numbers which represented money, or numbers which represented literally artillery shells flying through the air, and both were very important to society, for better or worse. Banks, because they standardized on what was the best available technology at the time, ended up with a lot of mainframes, and they have kept those mainframes running for a good portion of seventy years now, in some cases. As you mentioned earlier, the alternative to mainframes — well, "cloud" is a bit of a buzzword. So there's the personal computer form factor that you're familiar with, where something sits on your desk. There are servers, which would typically sit in a server rack somewhere. And the difference between servers and quote-unquote "the cloud" is, the traditional way to manage servers would be, you would have a data center that would be owned or at least leased by yourself, and you would put hardware that you owned in that data center.
Patrick: In the cloud case, the data center is owned by Amazon or Google or Microsoft. You have rented access to a machine there which sits on their balance sheet — and it's probably not one machine, it's probably an awful lot of machines, potentially with some virtualization layer — and your engineers can cause systems to scale up or scale down based on how many machines you need on a, you know, minute-by-minute or second-by-second basis. That's sort of the high-level version of the sales pitch that cloud vendors will give you. So that is a quick run-through of the four generations of the mainstay of how technology gets done at scale. Then there's the question of where software gets written. So it is quite common for businesses in the traditional economy to not have an internal software engineering competence, and as a result, they'll go to vendors — like, in this case, you probably know this better than me, but for example GE — and the vendor will sell quote-unquote packaged software, which might not be fully responsive to the needs of the business, and then some customization happens.
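The "scale up or scale down based on how many machines you need" pitch can be sketched as a simple sizing rule. This is a hypothetical control-loop fragment, not any actual cloud provider's API; the capacity numbers and bounds are invented:

```python
def desired_machines(queue_depth: int,
                     per_machine_capacity: int = 100,
                     min_machines: int = 2,
                     max_machines: int = 50) -> int:
    """Decide how many rented machines to run, given pending work.

    Keeps a small floor for availability and a ceiling for cost control.
    """
    needed = -(-queue_depth // per_machine_capacity)  # ceiling division
    return max(min_machines, min(max_machines, needed))

# A quiet overnight period: shrink toward the floor.
desired_machines(queue_depth=50)     # -> 2
# A holiday surge: grow, capped at the ceiling.
desired_machines(queue_depth=12000)  # -> 50
```

An autoscaler is essentially this decision re-evaluated every minute or second, with the provider renting out or reclaiming the machines — something you cannot do when the hardware is a mainframe on your own balance sheet.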
Patrick: Um, what could happen is the customization might happen within the business, if they have some level of software engineers internally. What happens more frequently in a lot of places, particularly in Japan, where I live, is that the business will contract with — we call them system integrators here; they may be called consultancies in America — you know, a Deloitte or another large consulting firm like that, an Accenture — and say, okay, we have this packaged system, we have these business requirements, clearly some software engineers have to be involved, we don't have enough on our staff, can you figure out the missing link for us? And so there is an extensive contract negotiation, the consultancy goes off and does the customizations, and they deliver the work to the organization. And then there are various different ways to do maintenance, but often that will involve the original team standing down for a while. And that always sounds like a great idea when it's pitched, because it's like, oh great, I don't have to pay expensive engineers every week to just sit around waiting for something to happen.
Patrick: And then when something actually happens, you sort of realize the other end of the option value there, where, oh, not so great, I don't have a team of experts who understand the system, ready to call up at a moment's notice and bring in to debug the problems that we're seeing right now. And so — I have no specific knowledge about what went on in this individual instance — but a thing that happens during a lot of outages is a, you know, quick look into the history of the system to say, hey, wait, who actually built this for us? Are they still in business? Can we get on their calendar immediately? Is there an engineering team there that is ready to hop in and work at this? Like, goodness, they're going to charge us eye-bleeding rates to do it, but we're going to have to pay that. That's a, you know, tertiary consideration at this point. The bigger consideration is, how quickly can we get our, you know, organization-to-organization legal paperwork, etcetera, spun up? And then how quickly can they get the engineering team spun up to address the system in real time?
Patrick: And this sort of thing is why the major scaled software companies in the economy — the Googles, the Microsofts, etcetera, etcetera, you all know the names — largely treat engineering as an internal competence. You know, the software for the iPhone isn't built by Deloitte, it's built by Apple, and almost everything in the critical path will either be open source or something that there is a team at Apple that owns the sort of totality of the experience for.

Joe: So, something that I sort of hinted at in the beginning — and we talked about this a little bit; we did an episode on petroleum engineering — but I'm really curious about the distribution of engineering talent. In my mind, I would have to imagine that a talented engineer would be more excited to work for Stripe than Google, and more excited to work for Google than Deloitte, and more excited to work for Deloitte than Southwest.
Joe: And I'm curious, A, if that's the case, and whether this is a problem, and whether it would be better if there were just more high-tech experts, software experts, who would want to work for a Southwest directly or work for a bank directly — whether this sort of distribution of engineering talent contributes to software debacles.

Patrick: So I have nuanced thoughts on engineering talent. On one hand, that is certainly a thing that exists in the world, and talent is not necessarily distributed equally among different industries, individuals, etcetera. On the other hand, I think that Silicon Valley — an ecosystem to which I owe a lot — occasionally has an overly enamored view of itself, thinking that, oh yes, we have the best engineers in the world, and all the engineers who made every other system that we rely on in our lives are sort of second-rate. And clearly that's not true.
Like, the phone system works. Airplanes, like, 319 00:17:33,080 --> 00:17:35,240 Speaker 1: what's the old Louis C.K. bit? Like, 320 00:17:35,480 --> 00:17:38,280 Speaker 1: it flies through the sky and then teleports, beams up 321 00:17:38,280 --> 00:17:41,320 Speaker 1: to space. Like, none of that happened by accident. And 322 00:17:41,400 --> 00:17:47,399 Speaker 1: so it's important to not focus on, like, the, you know, 323 00:17:47,520 --> 00:17:50,840 Speaker 1: idea that engineers working in traditional industry are worse engineers than engineers who 324 00:17:50,880 --> 00:17:54,679 Speaker 1: work in software companies. The larger problem they have is, 325 00:17:54,880 --> 00:17:57,879 Speaker 1: one, they don't drive the bus. They have less ability 326 00:17:58,000 --> 00:18:01,840 Speaker 1: to control the situations within their organizations and control like 327 00:18:02,000 --> 00:18:06,000 Speaker 1: larger decisions made, such that they have influence over decisions 328 00:18:06,040 --> 00:18:08,320 Speaker 1: on like what the maintenance schedule would look like, 329 00:18:08,520 --> 00:18:11,240 Speaker 1: or who gets to make decisions with respect to whether 330 00:18:11,280 --> 00:18:13,960 Speaker 1: something ships or not. And there is a little bit 331 00:18:14,000 --> 00:18:16,560 Speaker 1: of a sort of like the life cycle of engineers 332 00:18:16,960 --> 00:18:19,959 Speaker 1: thing that happens where the original architects of the system 333 00:18:20,080 --> 00:18:22,760 Speaker 1: did it forty years ago. As careers are about as 334 00:18:22,800 --> 00:18:24,800 Speaker 1: long as they are,
Many of the original architects of 335 00:18:24,840 --> 00:18:27,360 Speaker 1: the system will sort of be aging into the retirement 336 00:18:27,440 --> 00:18:29,480 Speaker 1: years at this point, and so there is a question 337 00:18:29,520 --> 00:18:32,040 Speaker 1: of: did the organization put in the work over the 338 00:18:32,119 --> 00:18:35,400 Speaker 1: years to recruit newer engineers, to inculcate them into how 339 00:18:35,400 --> 00:18:38,600 Speaker 1: the system is made, etcetera, etcetera? Or did they just 340 00:18:38,720 --> 00:18:40,720 Speaker 1: allow all of the knowledge to walk out the door? 341 00:18:41,119 --> 00:18:43,480 Speaker 1: And, like, I think the thing that comes up a 342 00:18:43,480 --> 00:18:46,480 Speaker 1: lot is: do they put in enough work on a day 343 00:18:46,480 --> 00:18:49,080 Speaker 1: to day basis to maintain an engineering brand so that 344 00:18:49,119 --> 00:18:51,600 Speaker 1: they can get new talented engineers to join them in 345 00:18:52,400 --> 00:18:55,840 Speaker 1: 2023, such that those engineers are (the word used is 346 00:18:55,840 --> 00:18:59,040 Speaker 1: often "graybeard," but you know) the wise and 347 00:18:59,160 --> 00:19:01,280 Speaker 1: veteran engineers thirty years from now, so that when something 348 00:19:01,280 --> 00:19:04,280 Speaker 1: happens, there are people who have been around the 349 00:19:04,280 --> 00:19:08,000 Speaker 1: block that know where the skeletons are buried in the system.
Interestingly, 350 00:19:08,119 --> 00:19:11,320 Speaker 1: traditional industry is getting better over the years at having 351 00:19:11,320 --> 00:19:14,919 Speaker 1: an engineering brand, sort of moving away from this 352 00:19:15,000 --> 00:19:17,639 Speaker 1: world where engineering was largely seen as a cost center, 353 00:19:18,040 --> 00:19:20,200 Speaker 1: where like the goal is just to cram down the 354 00:19:20,200 --> 00:19:22,680 Speaker 1: amount of money you spend on it and improve your margins. 355 00:19:23,080 --> 00:19:25,600 Speaker 1: I think there's a few things that played into that. 356 00:19:25,680 --> 00:19:29,040 Speaker 1: One of them was, particularly in the Internet age, it 357 00:19:29,280 --> 00:19:32,000 Speaker 1: became obvious that, you know, in finance, you talk about 358 00:19:32,080 --> 00:19:34,680 Speaker 1: like the front office and the back office. Engineering used 359 00:19:34,680 --> 00:19:36,160 Speaker 1: to be in the back office, and the front office, 360 00:19:36,160 --> 00:19:38,959 Speaker 1: where the salespeople live, is the one that generates 361 00:19:39,000 --> 00:19:42,240 Speaker 1: all the money for the bank. And increasingly, because like 362 00:19:42,800 --> 00:19:45,199 Speaker 1: experiences that were in the palm of the user's hands 363 00:19:45,240 --> 00:19:47,520 Speaker 1: were the thing that was actually generating the sales, 364 00:19:48,040 --> 00:19:53,320 Speaker 1: those experiences became sort of institutionally important within banks and 365 00:19:53,359 --> 00:19:57,160 Speaker 1: airlines and other firms. And then, when those experiences became 366 00:19:57,200 --> 00:20:00,800 Speaker 1: important, it took a while, but gradually the people and teams 367 00:20:00,840 --> 00:20:03,840 Speaker 1: that build those experiences became more institutionally important than they 368 00:20:03,840 --> 00:20:06,479 Speaker 1: had previously been.
And so like, if you look at 369 00:20:06,480 --> 00:20:08,840 Speaker 1: the large money center banks in the US, they certainly 370 00:20:08,880 --> 00:20:12,440 Speaker 1: have no small number of technical challenges. But the thing 371 00:20:12,480 --> 00:20:16,560 Speaker 1: that is like largely true now, which was not true before, is 372 00:20:16,600 --> 00:20:19,200 Speaker 1: that their mobile apps are actually kind of good these days. 373 00:20:19,359 --> 00:20:22,399 Speaker 1: Like if you download, you know, not to endorse anybody 374 00:20:22,400 --> 00:20:25,119 Speaker 1: in particular, but like Chase or Capital One, when you 375 00:20:25,119 --> 00:20:26,919 Speaker 1: play with their mobile apps, it's like, oh, this kind of 376 00:20:26,920 --> 00:20:29,879 Speaker 1: feels like a mobile app made in Silicon Valley. And 377 00:20:29,920 --> 00:20:31,840 Speaker 1: the reason is, well, yeah, they hired a lot of 378 00:20:31,840 --> 00:20:34,720 Speaker 1: people who made the apps from Silicon Valley, and those 379 00:20:34,760 --> 00:20:37,720 Speaker 1: folks brought their skills and sort of like level of 380 00:20:37,720 --> 00:20:40,720 Speaker 1: competence with this, and their taste, and now exercise them 381 00:20:40,760 --> 00:20:43,760 Speaker 1: on behalf of old world companies. One hopes, knock on wood, 382 00:20:44,000 --> 00:20:46,720 Speaker 1: that that seeps into the back end of these systems 383 00:20:46,760 --> 00:20:50,119 Speaker 1: the way Apple, Google, Microsoft, they have very talented teams 384 00:20:50,119 --> 00:20:51,680 Speaker 1: on the front end of their systems, but they also 385 00:20:51,720 --> 00:20:54,240 Speaker 1: have very talented teams on the back end of their systems. 386 00:20:54,240 --> 00:20:57,280 Speaker 1: And that's bluntly why you don't see like core systems 387 00:20:57,280 --> 00:20:58,920 Speaker 1: at Google going down for a week at a time. 388 00:20:59,760 --> 00:21:02,720 Speaker 1: That is almost unimaginable.
You know how much of capitalism 389 00:21:02,800 --> 00:21:05,280 Speaker 1: would break if Google Docs was just down for a week? 390 00:21:05,800 --> 00:21:07,960 Speaker 1: Given that there are many parts of the world where 391 00:21:09,320 --> 00:21:12,600 Speaker 1: there is a liveness constraint, we would like, all else 392 00:21:12,680 --> 00:21:15,159 Speaker 1: being equal, if it is safe to fly airplanes, we 393 00:21:15,160 --> 00:21:17,600 Speaker 1: would prefer that there be airplanes flying versus all the 394 00:21:17,640 --> 00:21:20,879 Speaker 1: airplanes being on the ground, because airplanes generate value for 395 00:21:20,960 --> 00:21:23,840 Speaker 1: human society. If that is true, then it must be 396 00:21:23,880 --> 00:21:27,399 Speaker 1: the case that the back end systems of airlines that 397 00:21:27,720 --> 00:21:30,680 Speaker 1: control whether airplanes are allowed to fly at a given time, 398 00:21:31,160 --> 00:21:33,440 Speaker 1: that has to be at least as important as, you know, 399 00:21:33,640 --> 00:21:36,280 Speaker 1: Google Docs is, which implies that the airlines need to 400 00:21:36,320 --> 00:21:39,040 Speaker 1: put at least as much work as Google does, roughly, 401 00:21:39,480 --> 00:21:42,239 Speaker 1: into like having true mastery of their own back end 402 00:21:42,280 --> 00:21:45,440 Speaker 1: systems and the various problems that could happen there. So 403 00:21:45,760 --> 00:21:48,960 Speaker 1: just on this note, you know, there's another travel disruption 404 00:21:49,000 --> 00:21:51,159 Speaker 1: that happened recently and we haven't even mentioned it, but 405 00:21:51,200 --> 00:21:53,919 Speaker 1: that was the FAA experiencing some sort of 406 00:21:53,960 --> 00:21:58,760 Speaker 1: computer event and grounding, I think, all the domestic departures 407 00:21:58,840 --> 00:22:02,160 Speaker 1: for one morning.
And it was recently reported that 408 00:22:02,680 --> 00:22:06,240 Speaker 1: the proximate cause of that was that there were some 409 00:22:06,480 --> 00:22:10,520 Speaker 1: software engineers who were trying to upgrade the system and 410 00:22:10,560 --> 00:22:13,879 Speaker 1: they accidentally deleted a bunch of critical files as they 411 00:22:13,880 --> 00:22:15,920 Speaker 1: were trying to do this. Can you talk a little 412 00:22:15,920 --> 00:22:19,919 Speaker 1: bit more about the technical challenges when it comes to 413 00:22:20,640 --> 00:22:23,760 Speaker 1: trying to fix some of these legacy systems? Like, why 414 00:22:23,920 --> 00:22:26,840 Speaker 1: exactly is it so difficult? You sort of talked about 415 00:22:26,880 --> 00:22:31,040 Speaker 1: it from an organizational perspective, but from a technical perspective, 416 00:22:31,359 --> 00:22:34,280 Speaker 1: why does this seem to be such a big challenge? Sure, 417 00:22:34,720 --> 00:22:38,240 Speaker 1: so one thing is that that symptom, the 418 00:22:38,240 --> 00:22:41,119 Speaker 1: underlying cause being that a problem happened during an 419 00:22:41,200 --> 00:22:46,159 Speaker 1: upgrade, is extremely well understood in the software engineering field. 420 00:22:46,520 --> 00:22:49,320 Speaker 1: It's called out in, among other places, Google's book about 421 00:22:49,480 --> 00:22:53,200 Speaker 1: Site Reliability Engineering, which is essentially the subcategory of engineers 422 00:22:53,200 --> 00:22:56,159 Speaker 1: that Google relies on to keep the world running.
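The failure mode described here, an upgrade that destroys the thing it was meant to improve, is classically mitigated by staging the new version next to the old one and swapping only after verification, so the live copy is never deleted first. A minimal Python sketch of that idea (the function and its trivial size check are illustrative assumptions, not anyone's actual deployment tooling):

```python
import os
import shutil
import tempfile

def safe_upgrade(live_dir: str, new_files: dict) -> None:
    """Stage, verify, then swap: never delete the live copy first."""
    staging = tempfile.mkdtemp(prefix="staging-")
    # 1. Write the new version into a staging area, not over the live files.
    for name, contents in new_files.items():
        with open(os.path.join(staging, name), "w") as f:
            f.write(contents)
    # 2. Verify the staged copy before touching anything live.
    for name in new_files:
        if os.path.getsize(os.path.join(staging, name)) == 0:
            raise RuntimeError(f"staged file {name} failed verification")
    # 3. Keep the old version around as a rollback target instead of deleting it.
    backup = live_dir + ".previous"
    if os.path.exists(backup):
        shutil.rmtree(backup)
    shutil.move(live_dir, backup)
    shutil.move(staging, live_dir)
```

If verification fails, the live system is untouched; and because the old version is renamed rather than deleted, there is always a rollback target on disk.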
And it 423 00:22:56,320 --> 00:22:58,120 Speaker 1: is often the case that the people who are attempting 424 00:22:58,160 --> 00:23:01,680 Speaker 1: to make an incremental change do not have full context 425 00:23:01,720 --> 00:23:03,440 Speaker 1: of how the system came to be in the state 426 00:23:03,480 --> 00:23:06,440 Speaker 1: that it is currently in, and a change that they thought 427 00:23:06,480 --> 00:23:09,400 Speaker 1: would have a limited sort of area of impact ends 428 00:23:09,440 --> 00:23:11,800 Speaker 1: up having a larger area of impact. The term of 429 00:23:11,920 --> 00:23:14,399 Speaker 1: art we use in the industry is blast radius. Like, 430 00:23:14,560 --> 00:23:16,919 Speaker 1: you hope to quantify the amount of blast radius 431 00:23:16,960 --> 00:23:19,080 Speaker 1: of something that goes wrong, such that, you know, if 432 00:23:19,080 --> 00:23:21,640 Speaker 1: I make a mistake, am I going to like bring 433 00:23:21,680 --> 00:23:23,680 Speaker 1: down our blog? Or am I going to bring down 434 00:23:23,720 --> 00:23:26,600 Speaker 1: like credit card processing worldwide? And, you know, be much 435 00:23:26,600 --> 00:23:29,320 Speaker 1: more careful if the blast radius includes like worldwide credit 436 00:23:29,320 --> 00:23:32,320 Speaker 1: card processing. Plausibly, like, you should engineer systems such that 437 00:23:32,359 --> 00:23:34,520 Speaker 1: there's no way to take down worldwide credit card processing, 438 00:23:34,640 --> 00:23:37,520 Speaker 1: but that's actually harder to do than it sounds.
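One common way of capping blast radius is the canary rollout: apply a change to a small slice of the fleet, check health, and widen only if everything still looks good. A hedged Python sketch, where `apply_change` and `healthy` are stand-ins for a real deployment hook and real monitoring:

```python
def canary_rollout(hosts, apply_change, healthy, stages=(0.01, 0.1, 1.0)):
    """Roll a change out in widening stages; stop at the first sign of trouble.

    Returns (succeeded, hosts_touched): a bad change hurts roughly 1% of
    the fleet instead of 100% of it.
    """
    done = 0
    for fraction in stages:
        target = max(1, int(len(hosts) * fraction))
        # Apply the change only to the next slice of hosts.
        for host in hosts[done:target]:
            apply_change(host)
        done = target
        # Halt the rollout if any host touched so far looks unhealthy.
        if not all(healthy(h) for h in hosts[:done]):
            return False, done
    return True, done
```

With a 200-host fleet, a change that breaks its hosts is halted after touching two of them rather than all 200.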
439 00:23:37,600 --> 00:23:40,600 Speaker 1: So anyhow, how does one get to the 440 00:23:40,600 --> 00:23:44,199 Speaker 1: point where it is difficult to understand, like, what the 441 00:23:44,240 --> 00:23:47,080 Speaker 1: implications of the changes you make truly are? These 442 00:23:47,119 --> 00:23:49,600 Speaker 1: are like meat and potatoes questions, and then the meat 443 00:23:49,600 --> 00:23:52,080 Speaker 1: and potatoes answers are often things like: was the system 444 00:23:52,119 --> 00:23:55,280 Speaker 1: adequately documented when it was made? Frequently the answer is no, 445 00:23:55,880 --> 00:23:59,520 Speaker 1: and a lot of information about like how systems are 446 00:23:59,520 --> 00:24:03,280 Speaker 1: put together survives as oral lore within the engineering teams at 447 00:24:03,320 --> 00:24:08,160 Speaker 1: various companies, which, yeah, that's an uncomfortable bit of information 448 00:24:08,200 --> 00:24:10,240 Speaker 1: to hold in your head when you start talking about like 449 00:24:10,359 --> 00:24:13,640 Speaker 1: the life cycle of engineers and the fact that the original 450 00:24:13,720 --> 00:24:16,080 Speaker 1: architects of many of these systems are like literally no 451 00:24:16,119 --> 00:24:19,080 Speaker 1: longer with us, either because they have retired or they 452 00:24:19,160 --> 00:24:22,639 Speaker 1: might be, like, beyond our ability to call up 453 00:24:22,680 --> 00:24:26,199 Speaker 1: out of retirement at this point, and so you have 454 00:24:26,280 --> 00:24:29,320 Speaker 1: to write down what you do. And that concept was 455 00:24:29,359 --> 00:24:33,240 Speaker 1: not like new to governments and bureaucracies as a result 456 00:24:33,280 --> 00:24:37,000 Speaker 1: of software engineering happening in the last seventy years; it's sort 457 00:24:37,000 --> 00:24:40,080 Speaker 1: of fundamental to the operation of large organizations.
Software is 458 00:24:40,119 --> 00:24:42,760 Speaker 1: just how people choose to do work with each other, 459 00:24:43,040 --> 00:24:46,760 Speaker 1: but it is a lesson that we keep relearning. There is 460 00:24:46,840 --> 00:24:49,919 Speaker 1: often that issue where, because software is how people in an 461 00:24:50,040 --> 00:24:53,280 Speaker 1: organization choose to work with each other, often software will 462 00:24:53,359 --> 00:24:56,520 Speaker 1: interface various systems together and problems will happen at the 463 00:24:56,520 --> 00:25:00,600 Speaker 1: boundaries between systems, either between literal computer systems or between 464 00:25:00,960 --> 00:25:04,879 Speaker 1: organizations. So the thing that you will 465 00:25:04,920 --> 00:25:09,640 Speaker 1: see frequently is: my software didn't fail, your software didn't fail, 466 00:25:09,800 --> 00:25:14,120 Speaker 1: we mutually failed together at that point where we are, 467 00:25:14,440 --> 00:25:17,200 Speaker 1: you know, supposed to transfer information, and then both sides 468 00:25:17,240 --> 00:25:20,960 Speaker 1: end up pointing their fingers at their counterparty. And so 469 00:25:21,400 --> 00:25:24,560 Speaker 1: part of the discipline of software engineering is, one, like, 470 00:25:24,800 --> 00:25:27,040 Speaker 1: creating a culture where you don't want to point fingers 471 00:25:27,040 --> 00:25:30,359 Speaker 1: at the counterparty, and two, creating structures and incentives such that, 472 00:25:30,920 --> 00:25:33,600 Speaker 1: you know, like, complex systems that involve multiple different parties 473 00:25:33,640 --> 00:25:35,840 Speaker 1: with multiple different engineering teams who might not report to 474 00:25:35,880 --> 00:25:39,760 Speaker 1: the same managerial department will, like, converge on correct outcomes.
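A concrete form of converging on correct outcomes at a seam is to check the interface contract explicitly at the boundary, so a failure gets attributed precisely instead of being argued about. A toy Python sketch (the record fields are invented for illustration):

```python
# Hypothetical contract between two systems: every record handed across
# the boundary must carry these fields.
REQUIRED_FIELDS = {"id", "amount", "currency"}

def receive_batch(raw_records):
    """Accept records that satisfy the contract; reject the rest with a
    precise reason, so neither side has to guess whose software failed."""
    accepted, rejected = [], []
    for i, record in enumerate(raw_records):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            # Name the record and the exact fields it lacks.
            rejected.append((i, "missing fields: " + ", ".join(sorted(missing))))
        else:
            accepted.append(record)
    return accepted, rejected
```

Each rejection names the record and the exact fields it was missing, which turns a finger-pointing exercise into a specific, checkable claim.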
475 00:25:40,200 --> 00:25:42,359 Speaker 1: And there are a variety of ways to do that 476 00:25:42,400 --> 00:25:44,960 Speaker 1: in the industry. Some of them are better than others. 477 00:25:45,160 --> 00:25:50,680 Speaker 1: So without like naming the particular company, there exists a 478 00:25:50,760 --> 00:25:56,040 Speaker 1: credit card system which is extremely reliable. So, credit card systems 479 00:25:56,160 --> 00:25:59,080 Speaker 1: as a baseline are extremely reliable. You probably don't remember 480 00:25:59,119 --> 00:26:00,639 Speaker 1: the last time that you were unable to use a 481 00:26:00,640 --> 00:26:03,280 Speaker 1: credit card for a week, because like literally that does 482 00:26:03,320 --> 00:26:07,120 Speaker 1: not happen. So, like, A plus plus for achieving that outcome. 483 00:26:07,480 --> 00:26:10,040 Speaker 1: What one credit card company does to achieve that is 484 00:26:10,200 --> 00:26:13,040 Speaker 1: they are willing to make changes to their system precisely 485 00:26:13,119 --> 00:26:16,320 Speaker 1: twice a year, after six months of testing every change 486 00:26:16,560 --> 00:26:20,960 Speaker 1: that they make. That's an incredible amount of upfront work 487 00:26:21,040 --> 00:26:24,800 Speaker 1: to do relatively small amounts of engineering, and so 488 00:26:25,119 --> 00:26:29,080 Speaker 1: the pace at which that ecosystem evolves is much, much 489 00:26:29,080 --> 00:26:33,200 Speaker 1: slower than more software-forward companies like Google, Apple, 490 00:26:33,280 --> 00:26:36,399 Speaker 1: Amazon, etcetera, etcetera, where they're shipping thousands of changes 491 00:26:36,440 --> 00:26:39,120 Speaker 1: to their systems every day.
So one of the interesting 492 00:26:39,160 --> 00:26:42,680 Speaker 1: bits about remixing the skills and techniques of Silicon 493 00:26:42,760 --> 00:26:45,919 Speaker 1: Valley is attempting to get people who are comfortable with 494 00:26:45,960 --> 00:26:48,600 Speaker 1: the engineering practices that allow you to ship software thousands 495 00:26:48,600 --> 00:26:52,040 Speaker 1: of times a day into positions of authority at old 496 00:26:52,080 --> 00:26:55,359 Speaker 1: line companies, such that they can gradually transition from, you know, 497 00:26:55,400 --> 00:26:57,560 Speaker 1: the point that they're at, where they might be able 498 00:26:57,600 --> 00:27:00,560 Speaker 1: to ship software once or twice a year, maybe quarterly, 499 00:27:00,600 --> 00:27:02,960 Speaker 1: to the point where they will be shipping software like, 500 00:27:03,600 --> 00:27:23,520 Speaker 1: let's start with biweekly and move up from there. So 501 00:27:23,680 --> 00:27:26,760 Speaker 1: we've been talking a lot about like failures or sort 502 00:27:26,800 --> 00:27:28,879 Speaker 1: of like collapses. But the other thing that I like 503 00:27:28,960 --> 00:27:32,679 Speaker 1: sort of associate with large business software is just like 504 00:27:33,080 --> 00:27:36,640 Speaker 1: the user experience is just not as good. 505 00:27:36,760 --> 00:27:38,679 Speaker 1: And I think that was, you know, the sort of 506 00:27:38,680 --> 00:27:41,679 Speaker 1: like the UX, I believe, was part of the 507 00:27:41,760 --> 00:27:45,320 Speaker 1: story with like the infamous Citi error where they 508 00:27:45,400 --> 00:27:49,919 Speaker 1: transmitted nine hundred million dollars they shouldn't have to some counterparties, 509 00:27:49,920 --> 00:27:52,400 Speaker 1: and I think some of the users were confused by 510 00:27:52,400 --> 00:27:55,520 Speaker 1: the internal software, whether they were actually sending that or not.
511 00:27:56,119 --> 00:27:59,200 Speaker 1: I heard a story from someone who worked at the VC arm, 512 00:27:59,359 --> 00:28:02,720 Speaker 1: once, of a major bank about like the hoops that 513 00:28:02,800 --> 00:28:05,159 Speaker 1: they have to go through just to share documents with 514 00:28:05,200 --> 00:28:08,720 Speaker 1: each other, like PowerPoints, because they get flagged. 515 00:28:08,760 --> 00:28:11,320 Speaker 1: And internally I assume that there's some sort of, like, 516 00:28:11,560 --> 00:28:16,439 Speaker 1: regulatory issues. Or booking travel, like, you know, going to, 517 00:28:16,520 --> 00:28:20,239 Speaker 1: like, booking dot com is really easy. When Tracy and 518 00:28:20,280 --> 00:28:23,120 Speaker 1: I book travel for work here, it's not that bad. 519 00:28:23,200 --> 00:28:27,000 Speaker 1: But like the usability of the software, internally, it's not 520 00:28:27,119 --> 00:28:32,640 Speaker 1: as smooth and snappy as consumer travel sites. Why is that? 521 00:28:32,880 --> 00:28:35,280 Speaker 1: Like, why is this sort of like business 522 00:28:35,320 --> 00:28:38,840 Speaker 1: software internally just not as, you know, 523 00:28:38,920 --> 00:28:41,960 Speaker 1: easy to use and sort of visually appealing as the 524 00:28:42,000 --> 00:28:45,880 Speaker 1: consumer internet? So this is getting better over time, and 525 00:28:45,920 --> 00:28:48,040 Speaker 1: we'll talk about that in a moment, but broadly, like, 526 00:28:48,160 --> 00:28:51,720 Speaker 1: your observation is entirely accurate.
If you were the 527 00:28:51,760 --> 00:28:56,920 Speaker 1: software czar for the entire world, you might like rank 528 00:28:57,000 --> 00:28:59,960 Speaker 1: applications by their importance to the world and say, okay, 529 00:29:00,160 --> 00:29:02,840 Speaker 1: if you were an online application on someone's phone that 530 00:29:02,880 --> 00:29:05,960 Speaker 1: allows someone to share cat photos, it's like important in 531 00:29:06,000 --> 00:29:08,400 Speaker 1: some sense, but probably not as important as sending a 532 00:29:08,400 --> 00:29:10,880 Speaker 1: billion dollars outside of a bank. And so we would 533 00:29:10,880 --> 00:29:13,400 Speaker 1: have a lot more talent and time spent on the 534 00:29:13,480 --> 00:29:16,080 Speaker 1: question of: can you send a billion dollars out of 535 00:29:16,080 --> 00:29:18,680 Speaker 1: a bank? Versus: does a thirteen year old have an 536 00:29:18,680 --> 00:29:22,600 Speaker 1: awesome experience when sending a cat photo? In actual fact, 537 00:29:22,640 --> 00:29:25,800 Speaker 1: though, much, much more time and talent is spent on the 538 00:29:25,840 --> 00:29:28,080 Speaker 1: cat photo question than is spent on the, like, wiring 539 00:29:28,080 --> 00:29:31,200 Speaker 1: billions of dollars out of banks question. That was a choice. 540 00:29:31,680 --> 00:29:34,400 Speaker 1: It sounds silly to say maybe we should stop choosing 541 00:29:34,440 --> 00:29:38,520 Speaker 1: stupid things, but the, like, true answer is business 542 00:29:38,560 --> 00:29:42,880 Speaker 1: software gets better when the, you know, people and organizations 543 00:29:42,920 --> 00:29:45,920 Speaker 1: that cause that software to be built choose that the 544 00:29:46,000 --> 00:29:48,160 Speaker 1: quality of that software is something that is very relevant 545 00:29:48,160 --> 00:29:51,040 Speaker 1: to their interests.
And so one way to like come 546 00:29:51,040 --> 00:29:53,240 Speaker 1: to the realization is to lose a billion dollars, and 547 00:29:53,280 --> 00:29:57,080 Speaker 1: then, you know, hopefully the next time your engineering 548 00:29:57,120 --> 00:29:59,160 Speaker 1: management says we should spend a little more on maintenance, 549 00:29:59,560 --> 00:30:01,800 Speaker 1: you will say, yes, I agree with you, we should 550 00:30:01,800 --> 00:30:04,320 Speaker 1: spend a little more on maintenance versus taking another billion 551 00:30:04,320 --> 00:30:07,360 Speaker 1: dollar charge at a time not of our choosing. Part 552 00:30:07,360 --> 00:30:10,080 Speaker 1: of it is just the culture of quality coming back 553 00:30:10,120 --> 00:30:12,640 Speaker 1: to these things. Part of it is also through sort 554 00:30:12,680 --> 00:30:16,240 Speaker 1: of teaching the user inside of organizations that software doesn't 555 00:30:16,280 --> 00:30:19,800 Speaker 1: have to be terrible, because most software in the world 556 00:30:20,040 --> 00:30:24,680 Speaker 1: exists inside of companies and runs business processes. That's something 557 00:30:24,720 --> 00:30:27,240 Speaker 1: that is not broadly known, but, like, of all the 558 00:30:27,280 --> 00:30:30,280 Speaker 1: lines of software in the world, most exist inside of companies, 559 00:30:30,600 --> 00:30:33,520 Speaker 1: and for a very long time, because most software people 560 00:30:33,560 --> 00:30:36,480 Speaker 1: interacted with was at their employer and it was generally kind 561 00:30:36,480 --> 00:30:39,400 Speaker 1: of terrible, they just had an image that software is 562 00:30:39,440 --> 00:30:42,960 Speaker 1: like generally kind of terrible. And then the iPhone came around. 563 00:30:43,040 --> 00:30:45,920 Speaker 1: Everyone has a powerful computer in their hand.
For X 564 00:30:46,000 --> 00:30:49,120 Speaker 1: number of hours a day, you've used applications: you've tapped 565 00:30:49,160 --> 00:30:52,000 Speaker 1: three times and, you know, interesting things happened in the world 566 00:30:52,000 --> 00:30:53,640 Speaker 1: as a result of you tapping three times, and you 567 00:30:53,720 --> 00:30:56,120 Speaker 1: broadly like the experience, and then you go back to 568 00:30:56,120 --> 00:30:59,520 Speaker 1: work and say, wait, I've used software that doesn't suck. 569 00:31:00,080 --> 00:31:02,080 Speaker 1: All the stuff I use at work sucks. Hey, 570 00:31:02,160 --> 00:31:05,880 Speaker 1: IT department, hey, senior management, can you please make, like, 571 00:31:06,080 --> 00:31:09,720 Speaker 1: our expense tracking software not be terrible? And that is 572 00:31:09,720 --> 00:31:12,200 Speaker 1: starting to happen, both as a result of, like, internal 573 00:31:12,240 --> 00:31:15,200 Speaker 1: software producers advocating for change, the result of that user 574 00:31:15,200 --> 00:31:18,200 Speaker 1: feedback within companies, and also as a result of various 575 00:31:18,200 --> 00:31:22,200 Speaker 1: startups happening. To say, like, not to throw a particular 576 00:31:22,320 --> 00:31:24,800 Speaker 1: expense solution under the bus, but the thing that most, 577 00:31:24,840 --> 00:31:28,000 Speaker 1: like, old-line economy companies probably use is not a thing 578 00:31:28,040 --> 00:31:30,360 Speaker 1: that people love using to book their travel. And if 579 00:31:30,360 --> 00:31:32,680 Speaker 1: you, you know, use TripActions or something that is 580 00:31:32,680 --> 00:31:37,280 Speaker 1: designed by a modern team with modern sort of UX affordances, 581 00:31:37,600 --> 00:31:40,280 Speaker 1: it's a much nicer solution for the end user.
And 582 00:31:40,960 --> 00:31:43,480 Speaker 1: in some companies, end users are starting to have some 583 00:31:43,600 --> 00:31:47,040 Speaker 1: level of ability to advocate for what software gets adopted, 584 00:31:47,080 --> 00:31:50,200 Speaker 1: whereas previously that decision was made by processes that were not 585 00:31:50,320 --> 00:31:54,360 Speaker 1: user centric, where whichever team was better at wining 586 00:31:54,400 --> 00:31:56,720 Speaker 1: and dining the person in charge of the purchasing 587 00:31:56,760 --> 00:31:59,400 Speaker 1: decision won, and not on the basis of product quality. 588 00:31:59,680 --> 00:32:02,880 Speaker 1: And in the last couple of years, even, like, some enterprise software 589 00:32:02,880 --> 00:32:04,920 Speaker 1: is starting to win largely on the basis of product 590 00:32:05,000 --> 00:32:07,880 Speaker 1: quality versus on sort of the more traditional sales motion, 591 00:32:07,880 --> 00:32:10,520 Speaker 1: although goodness knows that the traditional sales motion is still 592 00:32:10,600 --> 00:32:14,840 Speaker 1: very important to enterprise software companies. So I have a 593 00:32:14,920 --> 00:32:17,920 Speaker 1: slightly weird question, but I'm thinking a lot about it 594 00:32:18,000 --> 00:32:20,880 Speaker 1: as we have this conversation. But it feels to me 595 00:32:21,440 --> 00:32:25,920 Speaker 1: like software engineering and computer programming, it always seems to 596 00:32:26,040 --> 00:32:29,480 Speaker 1: be in flux. Like, if your job is a software engineer, 597 00:32:29,640 --> 00:32:32,320 Speaker 1: it feels like there's always something to do. You're always 598 00:32:32,320 --> 00:32:36,600 Speaker 1: trying to fix a problem or adapt a system. And 599 00:32:36,680 --> 00:32:39,160 Speaker 1: I guess my question is: why, you know?
600 00:32:39,200 --> 00:32:42,200 Speaker 1: I fully admit my own programming experience is confined to, 601 00:32:42,360 --> 00:32:46,320 Speaker 1: like, HTML, which I learned from that website HTML Goodies. 602 00:32:46,480 --> 00:32:51,880 Speaker 1: But back then, you know, you program your website, 603 00:32:52,040 --> 00:32:55,160 Speaker 1: you design it in HTML, you release it into the wild, 604 00:32:55,320 --> 00:32:58,320 Speaker 1: and you're kind of done. And yet it seems with 605 00:32:58,360 --> 00:33:02,160 Speaker 1: these large scale systems that there's always change. Something is 606 00:33:02,200 --> 00:33:06,680 Speaker 1: always in motion. Something is always in flux. Why is that? So, 607 00:33:06,880 --> 00:33:08,880 Speaker 1: let me push back a tiny bit on this here. 608 00:33:09,280 --> 00:33:12,080 Speaker 1: Like, how many years have we had lawyers available? And 609 00:33:12,320 --> 00:33:14,320 Speaker 1: does anyone ever go up to the lawyers and say, like, 610 00:33:14,600 --> 00:33:17,320 Speaker 1: come on, guys, it's 2023, haven't you, like, figured out 611 00:33:17,480 --> 00:33:23,320 Speaker 1: all the laws? Yeah? And so why does the law 612 00:33:23,400 --> 00:33:25,200 Speaker 1: change on a week to week basis? Well, it doesn't 613 00:33:25,280 --> 00:33:28,160 Speaker 1: change per se. It's just the world is complicated. The 614 00:33:28,240 --> 00:33:32,000 Speaker 1: number of commercial relationships between organizations is increasing all the time. 615 00:33:32,080 --> 00:33:35,120 Speaker 1: We have increasing demands on what those relationships will do.
616 00:33:35,240 --> 00:33:37,280 Speaker 1: And the job of lawyers is to adapt to that 617 00:33:37,760 --> 00:33:41,560 Speaker 1: increasingly complex world every week and continue delivering, like, the law 618 00:33:41,680 --> 00:33:44,160 Speaker 1: that society needs and the outcomes that come as a 619 00:33:44,200 --> 00:33:48,280 Speaker 1: result of, like, competently executing on the ability of organizations 620 00:33:48,320 --> 00:33:51,800 Speaker 1: to collaborate internally with their employees and with other organizations. 621 00:33:52,000 --> 00:33:55,280 Speaker 1: What's my answer for software engineering? Well, software engineers, they 622 00:33:55,320 --> 00:33:58,720 Speaker 1: are, you know, working this week on an increasingly complex world 623 00:33:58,760 --> 00:34:01,560 Speaker 1: where software has more leverage than it had even last week, 624 00:34:01,560 --> 00:34:04,760 Speaker 1: where there are increasing demands on the world, etcetera, etcetera, etcetera. 625 00:34:05,400 --> 00:34:06,800 Speaker 1: Is there going to be a time where the last 626 00:34:06,800 --> 00:34:09,120 Speaker 1: line of software is written? Probably not. There will never 627 00:34:09,200 --> 00:34:11,279 Speaker 1: be a last bit of software written. There will 628 00:34:11,280 --> 00:34:12,960 Speaker 1: never be a last contract written, there will never be 629 00:34:13,040 --> 00:34:16,359 Speaker 1: a last book written, because humans want more things out 630 00:34:16,360 --> 00:34:18,800 Speaker 1: of the world than we have, kind of like infinite 631 00:34:18,880 --> 00:34:22,520 Speaker 1: capacity for want at the margin. That was a compelling answer. 632 00:34:22,920 --> 00:34:24,520 Speaker 1: Can you talk a little bit about, you know, we've 633 00:34:24,520 --> 00:34:28,520 Speaker 1: been talking about banks, airlines, startups. Can you tell 634 00:34:28,640 --> 00:34:32,160 Speaker 1: us, like, how are the challenges for the public sector?
635 00:34:32,400 --> 00:34:34,719 Speaker 1: And I remember, like, Obama had a thing about, like, 636 00:34:34,760 --> 00:34:37,520 Speaker 1: I want to like bring government websites or government tech 637 00:34:37,560 --> 00:34:40,360 Speaker 1: into the modern age. It just seems like whatever problems 638 00:34:40,400 --> 00:34:43,600 Speaker 1: exist for big companies seem to be even worse or 639 00:34:43,719 --> 00:34:46,439 Speaker 1: more tricky when you're dealing with the public sector. 640 00:34:46,480 --> 00:34:50,000 Speaker 1: I think at one point, at one point, New Jersey 641 00:34:50,160 --> 00:34:54,279 Speaker 1: was explicitly like begging the Internet for COBOL programmers, wasn't it, 642 00:34:54,280 --> 00:34:58,040 Speaker 1: in 2020? I think that was a thing that happened. Yeah, 643 00:34:58,200 --> 00:35:01,839 Speaker 1: so full disclosure here: I ran a nonprofit organization last 644 00:35:01,880 --> 00:35:03,560 Speaker 1: year where a few of us in the tech industry 645 00:35:03,560 --> 00:35:08,160 Speaker 1: banded together to work on the vaccine location information infrastructure 646 00:35:08,200 --> 00:35:11,520 Speaker 1: for the United States, because the public sector was having 647 00:35:11,560 --> 00:35:14,840 Speaker 1: a great deal of difficulty creating websites that would track 648 00:35:14,880 --> 00:35:17,520 Speaker 1: where the vaccine was and route vaccine seekers to it. 649 00:35:17,840 --> 00:35:21,040 Speaker 1: So I have lots of thoughts here, so many issues. Again, 650 00:35:21,320 --> 00:35:24,800 Speaker 1: we did not wake up in 2023 with these issues magically. 651 00:35:24,840 --> 00:35:27,160 Speaker 1: It's a result of, like, decisions that we've collectively made 652 00:35:27,200 --> 00:35:30,520 Speaker 1: as a society going back many years.
One decision that 653 00:35:30,560 --> 00:35:33,720 Speaker 1: we've made in the United States in particular is that government 654 00:35:33,760 --> 00:35:36,080 Speaker 1: pay scales are what they are. If you compare 655 00:35:36,080 --> 00:35:39,200 Speaker 1: those government pay scales to what private industry pays for technologists, 656 00:35:39,520 --> 00:35:43,400 Speaker 1: they are sharply out of whack. And so then, you know, 657 00:35:43,440 --> 00:35:46,360 Speaker 1: if you look at GS whatever, the highest paid public 658 00:35:46,360 --> 00:35:49,480 Speaker 1: sector employees in the United States make less than Google 659 00:35:49,560 --> 00:35:52,799 Speaker 1: interns. So at the equilibrium, if you can 660 00:35:52,840 --> 00:35:55,640 Speaker 1: get hired by Google, you know, working for the government 661 00:35:55,719 --> 00:36:01,200 Speaker 1: would require real sacrifice. And talent really does matter in this realm, right. And one 662 00:36:01,239 --> 00:36:02,840 Speaker 1: of the things that the government has been attempting to 663 00:36:02,880 --> 00:36:05,319 Speaker 1: do over the years is to create things where there are 664 00:36:05,360 --> 00:36:11,359 Speaker 1: groups of people who are, like, officially, government employees.
Unofficially, 665 00:36:12,040 --> 00:36:14,520 Speaker 1: I think they're sort of doing an act of service to 666 00:36:14,560 --> 00:36:18,319 Speaker 1: the nation in places like the Digital Services Agency, etcetera, etcetera, 667 00:36:18,360 --> 00:36:20,759 Speaker 1: where they already made their money in tech, they're now 668 00:36:20,800 --> 00:36:23,560 Speaker 1: on the GS whatever and making a fraction of what 669 00:36:23,600 --> 00:36:27,040 Speaker 1: they previously made, but are contributing software expertise to these 670 00:36:27,120 --> 00:36:30,000 Speaker 1: various problems, where, like, what the government needs is like 671 00:36:30,040 --> 00:36:32,880 Speaker 1: some competent software written, and that requires having competent software 672 00:36:32,880 --> 00:36:36,480 Speaker 1: people available in quantity. Another issue that governments have is, 673 00:36:36,520 --> 00:36:39,560 Speaker 1: like, what is the true goal you are solving for? 674 00:36:40,239 --> 00:36:42,840 Speaker 1: Without getting too political about it, in some parts of 675 00:36:42,880 --> 00:36:47,520 Speaker 1: the government, like, you know, an organization might exist as 676 00:36:47,600 --> 00:36:53,520 Speaker 1: largely a jobs program, and IT modernization might sharply 677 00:36:54,520 --> 00:36:57,719 Speaker 1: decrease the effectiveness of that organization at employing a large 678 00:36:57,800 --> 00:37:01,160 Speaker 1: number of people to, like, repeatedly do a process that 679 00:37:01,400 --> 00:37:04,200 Speaker 1: a machine could do in a faster fashion.
And so 680 00:37:04,320 --> 00:37:07,400 Speaker 1: sometimes, like, the powers that be within an organization 681 00:37:07,480 --> 00:37:10,719 Speaker 1: say, like, well, you know, I don't necessarily consider 682 00:37:10,800 --> 00:37:13,240 Speaker 1: IT modernization one of my top priorities at the moment, 683 00:37:13,560 --> 00:37:16,160 Speaker 1: because that would cause me to need to break faith 684 00:37:16,200 --> 00:37:18,400 Speaker 1: with a number of people that I employ. Slash, 685 00:37:18,719 --> 00:37:22,040 Speaker 1: you know, sometimes my own career trajectory as a bureaucrat 686 00:37:22,120 --> 00:37:24,399 Speaker 1: is a factor, and this is true within private industry as well. 687 00:37:24,719 --> 00:37:27,239 Speaker 1: You know, there's a bit of empire building involved, where 688 00:37:27,320 --> 00:37:30,360 Speaker 1: you want the number of people that you manage and 689 00:37:30,360 --> 00:37:32,040 Speaker 1: your budgets to go up every year, and you don't 690 00:37:32,080 --> 00:37:34,960 Speaker 1: want to say, okay, like, I've solved my problems, so 691 00:37:35,120 --> 00:37:37,200 Speaker 1: I can deal with five percent as much budget next year, 692 00:37:37,239 --> 00:37:40,879 Speaker 1: thank you. That is incentive incompatible. That's not great from 693 00:37:40,880 --> 00:37:43,440 Speaker 1: the perspective of the parts of society which aren't employed 694 00:37:43,440 --> 00:37:46,480 Speaker 1: by government, but which nonetheless depend on government for, you know, 695 00:37:46,560 --> 00:37:49,000 Speaker 1: providing goods and services.
And so this is ultimately a 696 00:37:49,080 --> 00:37:51,360 Speaker 1: thing that we have to resolve through the political system, 697 00:37:51,400 --> 00:37:54,839 Speaker 1: pushing back a little bit and saying, like, hey, 698 00:37:55,200 --> 00:37:57,040 Speaker 1: you kind of have to be good at what you do, 699 00:37:57,320 --> 00:37:59,960 Speaker 1: and these days that involves making software that is also 700 00:38:00,080 --> 00:38:02,040 Speaker 1: good at what it does. I have no magic bullet 701 00:38:02,080 --> 00:38:05,040 Speaker 1: for how to cause that to be a, you know, 702 00:38:05,200 --> 00:38:08,200 Speaker 1: a stunning rallying cry for political parties, but it's probably something 703 00:38:08,239 --> 00:38:10,320 Speaker 1: that needs to get said in a lot of places 704 00:38:10,719 --> 00:38:14,719 Speaker 1: for enough decades until the message sinks in. Patrick, this 705 00:38:14,800 --> 00:38:17,160 Speaker 1: has been an amazing conversation, and I feel like 706 00:38:17,560 --> 00:38:19,359 Speaker 1: I already want to have you back, and be like, 707 00:38:19,440 --> 00:38:21,799 Speaker 1: almost each one of your answers could, like, 708 00:38:22,000 --> 00:38:25,520 Speaker 1: be a full conversation. I have one last question: 709 00:38:25,600 --> 00:38:28,719 Speaker 1: how does bit rot happen? And I mean, like, you know, 710 00:38:28,800 --> 00:38:31,320 Speaker 1: even, like, you know, you like, go away on 711 00:38:31,440 --> 00:38:33,240 Speaker 1: vacation for two weeks, you come back to your office 712 00:38:33,280 --> 00:38:36,320 Speaker 1: computer and, like, things are weird and sort of janky, 713 00:38:36,520 --> 00:38:39,000 Speaker 1: they don't quite work the same. Like, what is that process? 714 00:38:39,000 --> 00:38:41,920 Speaker 1: Because you would think that it's just, like, words in a database, 715 00:38:42,040 --> 00:38:44,399 Speaker 1: like, why did it rot? So what's actually going on there?
716 00:38:45,120 --> 00:38:47,799 Speaker 1: So the sardonic but true answer, and you have to 717 00:38:47,800 --> 00:38:49,720 Speaker 1: think of it at scale, is that bits in a computer 718 00:38:49,760 --> 00:38:53,040 Speaker 1: can literally be flipped by gamma rays coming from outer space 719 00:38:53,080 --> 00:38:56,200 Speaker 1: that interact with, like, the physical manifestation of your 720 00:38:56,239 --> 00:38:58,080 Speaker 1: memory in the computer. And that's one cause of this. 721 00:38:58,480 --> 00:39:01,120 Speaker 1: That's true, that does happen, but that isn't, like, the 722 00:39:01,160 --> 00:39:04,520 Speaker 1: dominant thing that happens. The dominant thing that happens is, 723 00:39:04,800 --> 00:39:08,919 Speaker 1: like, there exists change in the broader system that must 724 00:39:08,920 --> 00:39:11,879 Speaker 1: happen on an ongoing basis. Change is a sort of risk. 725 00:39:12,200 --> 00:39:14,600 Speaker 1: It is not always managed well. This gets back 726 00:39:14,640 --> 00:39:17,960 Speaker 1: to: a commanding majority of systemic downtime at 727 00:39:17,960 --> 00:39:21,240 Speaker 1: well managed software companies is caused by attempts to upgrade 728 00:39:21,239 --> 00:39:23,880 Speaker 1: the system that go less than optimally. That is the thing 729 00:39:23,920 --> 00:39:26,839 Speaker 1: that is amenable to study. Like, bit rot happens in 730 00:39:26,880 --> 00:39:32,560 Speaker 1: some cases because, you know, you had a constellation of software, etcetera, 731 00:39:32,640 --> 00:39:35,880 Speaker 1: installed on your machine and installed on other machines that 732 00:39:35,920 --> 00:39:39,120 Speaker 1: your machine connected to, which was working. You might not say 733 00:39:39,160 --> 00:39:42,600 Speaker 1: exactly perfectly, because nothing is with software, but, like, 734 00:39:42,719 --> 00:39:46,720 Speaker 1: it was working, right?
Something about the constellation changed 735 00:39:46,800 --> 00:39:49,120 Speaker 1: as a result of a decision made about a machine 736 00:39:49,160 --> 00:39:51,800 Speaker 1: that is not directly under your control, and that decision 737 00:39:52,000 --> 00:39:54,680 Speaker 1: must be made at scale in the economy, because software 738 00:39:54,719 --> 00:39:59,000 Speaker 1: can't be allowed to be static if it's to deliver 739 00:39:59,120 --> 00:40:01,640 Speaker 1: the things that we want from software as a society. 740 00:40:02,000 --> 00:40:04,480 Speaker 1: And then that change caused some other part of the 741 00:40:04,520 --> 00:40:07,800 Speaker 1: system to behave in a less great manner, and then eventually, 742 00:40:07,960 --> 00:40:09,600 Speaker 1: you know, you see the ripple effects of it in 743 00:40:09,680 --> 00:40:12,680 Speaker 1: your daily life. That's the dominant way bit rot happens. 744 00:40:12,920 --> 00:40:15,279 Speaker 1: It is not the bits actually getting corrupted over time, 745 00:40:15,440 --> 00:40:17,640 Speaker 1: though again, that's a thing that does happen, and we have 746 00:40:17,680 --> 00:40:20,920 Speaker 1: things in engineering to control against that. Well, Patrick, you 747 00:40:20,920 --> 00:40:24,279 Speaker 1: are the perfect guest for this topic. Really appreciate you 748 00:40:24,360 --> 00:40:26,600 Speaker 1: coming on Odd Lots. I really appreciate you having 749 00:40:26,600 --> 00:40:29,360 Speaker 1: me and would be glad to be back sometime. Definitely, 750 00:40:29,440 --> 00:40:31,560 Speaker 1: thanks so much, Patrick. That was great. I learned so 751 00:40:31,600 --> 00:40:36,319 Speaker 1: many excellent new terms, like bit rot, blast radius, heroics, 752 00:40:36,719 --> 00:40:39,279 Speaker 1: "we mutually failed together." That one will come in handy, 753 00:40:39,280 --> 00:40:44,520 Speaker 1: eh Joe? Everything we do wrong is mutual now.
I 754 00:40:44,560 --> 00:40:47,000 Speaker 1: love that, Patrick. Thank you so much. Thanks very much 755 00:40:47,040 --> 00:41:03,040 Speaker 1: for having me. Tracy, I think really Patrick was, like, the perfect 756 00:41:03,040 --> 00:41:04,840 Speaker 1: guest for that topic. Like, we needed to do this 757 00:41:04,920 --> 00:41:08,200 Speaker 1: for a while. I'm glad we, like, did it with Patrick. Yeah. Well, 758 00:41:08,239 --> 00:41:10,120 Speaker 1: I also feel like this is something that's going to 759 00:41:10,280 --> 00:41:12,840 Speaker 1: keep coming up, and so we might have more opportunities 760 00:41:12,920 --> 00:41:16,840 Speaker 1: for Patrick to do interesting post mortems on various 761 00:41:16,880 --> 00:41:20,120 Speaker 1: tech failures. Yeah, absolutely. I mean, there's so many interesting things. 762 00:41:20,200 --> 00:41:23,200 Speaker 1: I really liked some of the questions that you asked about, 763 00:41:23,239 --> 00:41:27,239 Speaker 1: like, ownership of software. And I think, like, that sort 764 00:41:27,239 --> 00:41:29,960 Speaker 1: of really clicked for me, because, like, if 765 00:41:29,960 --> 00:41:34,200 Speaker 1: you're a software company and software is the main product, 766 00:41:34,880 --> 00:41:37,920 Speaker 1: and, you know, I'm thinking about, like, the manufacturing analogy, 767 00:41:37,960 --> 00:41:40,360 Speaker 1: you know, it's like a Taiwan Semiconductor, this sort of 768 00:41:40,400 --> 00:41:45,320 Speaker 1: like institutional knowledge to build something exists within the firm 769 00:41:45,400 --> 00:41:47,759 Speaker 1: and it just, you know, gets handed down, handed down. 770 00:41:48,160 --> 00:41:51,880 Speaker 1: When software isn't your main product, like, if you're a 771 00:41:51,880 --> 00:41:56,239 Speaker 1: Southwest, like, if you're a Citigroup, etcetera,
then you 772 00:41:56,280 --> 00:41:59,440 Speaker 1: can sort of see why that process of, like, internal 773 00:41:59,480 --> 00:42:02,719 Speaker 1: knowledge that works in manufacturing, you don't get that sort 774 00:42:02,760 --> 00:42:06,560 Speaker 1: of ongoing feedback, you know, sort of distribution of knowledge, 775 00:42:06,880 --> 00:42:11,080 Speaker 1: in some of these large organizations. Yeah, absolutely. And it 776 00:42:11,120 --> 00:42:13,759 Speaker 1: feels like, I mean, Patrick kind of, I think he 777 00:42:13,840 --> 00:42:16,360 Speaker 1: used the expression, like, dropping the ball in the middle 778 00:42:16,440 --> 00:42:18,600 Speaker 1: between both of us. But it does seem like that 779 00:42:18,800 --> 00:42:24,600 Speaker 1: system kind of produces opportunities for, I guess I'm trying 780 00:42:24,640 --> 00:42:26,439 Speaker 1: to think how to phrase this, for people to sort 781 00:42:26,480 --> 00:42:30,280 Speaker 1: of, like... no one takes total responsibility for a system 782 00:42:30,280 --> 00:42:32,839 Speaker 1: failure like that, right? Because on the one hand, someone 783 00:42:32,880 --> 00:42:34,880 Speaker 1: designed the software, but on the other hand, maybe it 784 00:42:34,920 --> 00:42:37,840 Speaker 1: was customized by someone else. Maybe you have two different 785 00:42:37,840 --> 00:42:40,600 Speaker 1: systems talking to each other and both of them mess 786 00:42:40,640 --> 00:42:43,720 Speaker 1: up in some way, or there's some sort of misunderstanding. 787 00:42:43,960 --> 00:42:47,280 Speaker 1: It just seems like there's such a, like, gray area, 788 00:42:48,160 --> 00:42:51,560 Speaker 1: and maybe this is one of the reasons why it's 789 00:42:51,600 --> 00:42:55,440 Speaker 1: so difficult to fix, because you have all these different 790 00:42:55,480 --> 00:42:59,000 Speaker 1: things that are sort of operating together.
Well, even, like, 791 00:42:59,120 --> 00:43:02,759 Speaker 1: you know, like, his last answer about how bit rot happens, right? 792 00:43:02,760 --> 00:43:07,880 Speaker 1: Like, somewhere in all these interconnected computers, somewhere someone 793 00:43:07,960 --> 00:43:10,160 Speaker 1: has to make a change. Because he pointed out, in 794 00:43:10,360 --> 00:43:13,279 Speaker 1: his answer to your question about why software is never, 795 00:43:13,320 --> 00:43:16,120 Speaker 1: like, done, or why it's never a solved problem, it's like, 796 00:43:16,280 --> 00:43:18,560 Speaker 1: we're always demanding more, so there will never be a 797 00:43:18,560 --> 00:43:21,760 Speaker 1: time where someone, like, has the luxury of not making 798 00:43:21,760 --> 00:43:25,400 Speaker 1: a change, and then everyone else can say, okay, computers, we 799 00:43:25,480 --> 00:43:29,359 Speaker 1: figured it out, technology is solved. Well, maybe that's what the AI, like, 800 00:43:29,760 --> 00:43:35,120 Speaker 1: what's it, you know, the singularity, GPT? It could be, 801 00:43:36,040 --> 00:43:39,000 Speaker 1: but, like, you know, that seems like it makes a 802 00:43:39,000 --> 00:43:41,359 Speaker 1: lot of sense. Someone has to make a change because 803 00:43:41,440 --> 00:43:43,719 Speaker 1: that's just how the world works. And then all these 804 00:43:43,719 --> 00:43:47,120 Speaker 1: other interconnected systems, like, maybe they're fine with the change, 805 00:43:47,160 --> 00:43:49,839 Speaker 1: but something happens and then eventually they have to change too, 806 00:43:50,239 --> 00:43:52,360 Speaker 1: and so it's just like this constant state of flux. 807 00:43:52,440 --> 00:43:57,839 Speaker 1: Can I tell you my one internalized programming lesson? So,
So, um, 808 00:43:57,880 --> 00:43:59,840 Speaker 1: you know I mentioned HTML and then when I was 809 00:43:59,880 --> 00:44:02,680 Speaker 1: in high school, part of our computer science class we 810 00:44:02,760 --> 00:44:06,200 Speaker 1: had to learn JavaScript and we had to create a program. 811 00:44:06,400 --> 00:44:09,000 Speaker 1: And so I wrote this program. Again, like keep in 812 00:44:09,040 --> 00:44:11,680 Speaker 1: mind that this was the year two thousand or something 813 00:44:11,719 --> 00:44:13,400 Speaker 1: like that. I wrote a program. It was like a 814 00:44:13,440 --> 00:44:15,840 Speaker 1: digital fortune cookie, and you know, you could click on 815 00:44:15,840 --> 00:44:18,080 Speaker 1: it and it would give you a fortune. And then 816 00:44:18,120 --> 00:44:20,320 Speaker 1: at the end of it, at the end of this module, 817 00:44:20,400 --> 00:44:24,600 Speaker 1: we had to sign a contract signing over our program 818 00:44:24,760 --> 00:44:29,080 Speaker 1: to our computer teacher took ownership. No, it was an 819 00:44:29,080 --> 00:44:33,319 Speaker 1: extremely valuable lesson, which is all the coding that you're 820 00:44:33,320 --> 00:44:36,960 Speaker 1: doing will ultimately belong to someone else and they'll be 821 00:44:37,000 --> 00:44:40,440 Speaker 1: able to monetize it. That's the downside, as they'll monetize 822 00:44:40,440 --> 00:44:42,399 Speaker 1: it for you and maybe you won't get as much. 823 00:44:42,680 --> 00:44:44,360 Speaker 1: But the upside, I guess, is that you don't have 824 00:44:44,440 --> 00:44:47,120 Speaker 1: to take responsibility for it. Once you write it, it 825 00:44:47,239 --> 00:44:49,759 Speaker 1: goes out into the world. The computer professor owns it 826 00:44:49,800 --> 00:44:52,160 Speaker 1: and he can do with it. I love that that 827 00:44:52,280 --> 00:44:55,440 Speaker 1: was actually the lesson. 
Also, I feel like in another universe, 828 00:44:55,480 --> 00:44:57,520 Speaker 1: like, you could have sold that startup for a hundred 829 00:44:57,520 --> 00:45:01,640 Speaker 1: million dollars to Facebook if it went, like, super viral. 830 00:45:02,120 --> 00:45:05,160 Speaker 1: It'd seem like, how did this person make their fortune? 831 00:45:05,320 --> 00:45:08,360 Speaker 1: They made, like, a digital fortune cookie. But also, 832 00:45:08,480 --> 00:45:11,680 Speaker 1: like, I loved his answer about, like, the public sector, 833 00:45:11,760 --> 00:45:13,879 Speaker 1: because it's like, there, you really do have this problem 834 00:45:13,920 --> 00:45:18,000 Speaker 1: with, like, salary disparities. And it is pretty crazy that 835 00:45:18,320 --> 00:45:20,960 Speaker 1: the way we're sort of 836 00:45:20,960 --> 00:45:24,320 Speaker 1: solving this problem in this country is kind of getting 837 00:45:24,320 --> 00:45:27,080 Speaker 1: people to volunteer, like, going to work for the government 838 00:45:27,640 --> 00:45:29,919 Speaker 1: in IT and tech. It's like, what you do 839 00:45:30,080 --> 00:45:32,680 Speaker 1: after you're, like, rich and you want to give something 840 00:45:32,760 --> 00:45:35,239 Speaker 1: back is like, okay, I'm gonna, like, go work for 841 00:45:35,320 --> 00:45:37,400 Speaker 1: the federal government and, like, try to help them, like, 842 00:45:37,520 --> 00:45:40,319 Speaker 1: update their systems. Which is great that people want to 843 00:45:40,320 --> 00:45:42,640 Speaker 1: do that, and, like, I love that, but, like, that 844 00:45:42,760 --> 00:45:46,319 Speaker 1: does not seem like a great sustainable solution to having 845 00:45:46,360 --> 00:45:50,280 Speaker 1: like a modern government that can, like, communicate with people 846 00:45:50,320 --> 00:45:54,120 Speaker 1: and provide services for people in the way that they expect.
No, absolutely, 847 00:45:54,239 --> 00:45:57,120 Speaker 1: and it's something that we see again and again in 848 00:45:57,239 --> 00:45:59,960 Speaker 1: various ways. Shall we leave it there? Let's leave it 849 00:45:59,960 --> 00:46:03,000 Speaker 1: there. Okay, this has been another episode of the Odd 850 00:46:03,040 --> 00:46:05,719 Speaker 1: Lots podcast. I'm Tracy Alloway. You can follow me on 851 00:46:05,760 --> 00:46:08,440 Speaker 1: Twitter at Tracy Alloway. And I'm Joe Weisenthal. 852 00:46:08,480 --> 00:46:11,440 Speaker 1: You can follow me on Twitter at The Stalwart. Follow 853 00:46:11,480 --> 00:46:15,480 Speaker 1: our guest Patrick McKenzie. He's on Twitter at patio11, 854 00:46:15,600 --> 00:46:19,600 Speaker 1: and check out his Bits about Money newsletter. Follow our 855 00:46:19,640 --> 00:46:24,600 Speaker 1: producers Carmen Rodriguez at kerman Ermine and Dash Bennett at Dashbot. 856 00:46:24,960 --> 00:46:27,360 Speaker 1: And check out all of the podcasts here at Bloomberg 857 00:46:27,520 --> 00:46:31,720 Speaker 1: under the handle at podcasts. And for more Odd Lots content, 858 00:46:31,800 --> 00:46:34,360 Speaker 1: go to Bloomberg dot com slash odd lots, where we 859 00:46:34,400 --> 00:46:37,440 Speaker 1: post the transcripts of the episodes. Tracy and I blog, 860 00:46:37,680 --> 00:46:40,080 Speaker 1: and we have a weekly newsletter that comes out every Friday. 861 00:46:40,160 --> 00:46:43,360 Speaker 1: Go there, sign up and get it in your inboxes. Thanks 862 00:46:43,400 --> 00:47:09,520 Speaker 1: for listening.