1 00:00:03,120 --> 00:00:18,520 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News. 2 00:00:20,720 --> 00:00:24,760 Speaker 2: Hello and welcome to another episode of the Odd Lots Podcast. 3 00:00:24,840 --> 00:00:26,560 Speaker 2: I'm Joe Weisenthal. 4 00:00:26,200 --> 00:00:27,320 Speaker 3: And I'm Tracy Alloway. 5 00:00:27,760 --> 00:00:32,040 Speaker 2: Tracy, you know, there's a lot of like concerns about 6 00:00:32,640 --> 00:00:36,640 Speaker 2: AI obviously these days, and anyone who's like reasonably 7 00:00:36,680 --> 00:00:39,760 Speaker 2: intelligent can like list tons: maybe it's gonna 8 00:00:39,800 --> 00:00:42,559 Speaker 2: go rogue and be smarter than us, or maybe whatever, 9 00:00:42,960 --> 00:00:45,120 Speaker 2: or maybe there's just 10 00:00:45,159 --> 00:00:49,320 Speaker 2: gonna be this like flood of disinformation and deep fakes, 11 00:00:49,479 --> 00:00:51,600 Speaker 2: or maybe it's gonna put all journalists out of business, 12 00:00:51,640 --> 00:00:54,520 Speaker 2: which is certainly plausible. But I think, you know, like 13 00:00:54,560 --> 00:00:56,920 Speaker 2: something I think a lot about is just this idea 14 00:00:57,040 --> 00:01:00,680 Speaker 2: that regardless of what happens, like we're going to be 15 00:01:00,760 --> 00:01:04,280 Speaker 2: increasingly sort of trusting like a black box for answers, 16 00:01:04,319 --> 00:01:08,080 Speaker 2: and we really have no idea where those answers, however 17 00:01:08,080 --> 00:01:09,360 Speaker 2: you want to describe that, come from. 18 00:01:09,520 --> 00:01:12,080 Speaker 3: Yes, absolutely. And this is something that's come up on 19 00:01:12,120 --> 00:01:13,720 Speaker 3: the podcast a number of times 20 00:01:13,760 --> 00:01:13,920 Speaker 3: now. 21 00:01:13,959 --> 00:01:17,360 Speaker 3: I'm thinking way back to an episode we did that 22 00:01:17,520 --> 00:01:20,640 Speaker 3: was basically about the black box of algorithms and how 23 00:01:20,680 --> 00:01:23,880 Speaker 3: difficult it was to understand what goes into them and 24 00:01:23,920 --> 00:01:27,080 Speaker 3: then what comes out. And then of course we recently 25 00:01:27,080 --> 00:01:31,520 Speaker 3: did that episode on pricing and the idea of algorithmic pricing, 26 00:01:31,880 --> 00:01:36,440 Speaker 3: the idea of building proxy consumer profiles. And you're right, 27 00:01:36,680 --> 00:01:39,880 Speaker 3: the issue is we know that there's this new technology, 28 00:01:39,959 --> 00:01:42,840 Speaker 3: we know that there's all this data floating around, but 29 00:01:43,000 --> 00:01:46,920 Speaker 3: we don't entirely know how it is coming to the 30 00:01:47,000 --> 00:01:50,280 Speaker 3: conclusions or creating the output that it actually is. 31 00:01:50,680 --> 00:01:53,920 Speaker 2: You know, the pricing thing is interesting because, you know, 32 00:01:54,640 --> 00:01:57,360 Speaker 2: in a market economy, you know, you could argue it's like, oh, 33 00:01:57,480 --> 00:02:01,040 Speaker 2: at any given moment, you know you're being served up 34 00:02:01,080 --> 00:02:03,520 Speaker 2: the like the optimal price, right. And in theory, even 35 00:02:03,560 --> 00:02:06,160 Speaker 2: with the most advanced algorithms and stuff, like, maybe there's 36 00:02:06,160 --> 00:02:08,359 Speaker 2: some sense that this price is happening at the best price 37 00:02:08,440 --> 00:02:11,000 Speaker 2: for both the seller and the buyer, et cetera.
But 38 00:02:11,080 --> 00:02:14,320 Speaker 2: I think like people just have a sort of deep 39 00:02:14,880 --> 00:02:18,480 Speaker 2: intuitive distrust about the fact that like, you know, you 40 00:02:18,520 --> 00:02:20,880 Speaker 2: can't go there and like touch it and verify it 41 00:02:20,919 --> 00:02:23,680 Speaker 2: and see like this is why this exists in the 42 00:02:23,720 --> 00:02:26,040 Speaker 2: state that it is. And I think it's going to 43 00:02:26,120 --> 00:02:29,839 Speaker 2: create a lot of, I don't know, cultural apprehensions as 44 00:02:29,919 --> 00:02:32,840 Speaker 2: more and more decisions and more and more things that 45 00:02:32,919 --> 00:02:37,200 Speaker 2: affect our lives just seem to like emerge 46 00:02:37,240 --> 00:02:38,760 Speaker 2: spontaneously out of the box. 47 00:02:38,880 --> 00:02:41,840 Speaker 3: Absolutely. I'm thinking of all the people working at Chipotle 48 00:02:42,000 --> 00:02:44,480 Speaker 3: who are going to have to answer questions about not 49 00:02:44,560 --> 00:02:48,000 Speaker 3: just portion sizes now, but also questions from customers about 50 00:02:48,040 --> 00:02:49,360 Speaker 3: whether they're getting the best price. 51 00:02:49,600 --> 00:02:52,280 Speaker 2: Have you seen those awful videos that people are taking 52 00:02:52,360 --> 00:02:53,840 Speaker 2: of the Chipotle workers? 53 00:02:54,160 --> 00:02:57,080 Speaker 3: Yeah, I've seen some of those. So vile, I imagine, yes. 54 00:02:56,919 --> 00:03:00,040 Speaker 2: It's so vile. Anyway, that's a 55 00:03:00,080 --> 00:03:02,960 Speaker 2: separate thing. But yes, like all of these things, and 56 00:03:03,000 --> 00:03:06,440 Speaker 2: you know, it's not just with AI obviously, and 57 00:03:06,520 --> 00:03:10,000 Speaker 2: like this sort of world exists in increasingly black boxes. 58 00:03:10,080 --> 00:03:12,280 Speaker 2: You put in the support ticket, you try to like 59 00:03:12,360 --> 00:03:15,160 Speaker 2: talk to someone in an embassy or a consulate or 60 00:03:15,200 --> 00:03:17,120 Speaker 2: anything you do when you sort of like send out 61 00:03:17,160 --> 00:03:21,440 Speaker 2: some requirement or some request to some bureaucracy or some company, 62 00:03:21,480 --> 00:03:24,000 Speaker 2: and then it moves around. I 63 00:03:24,000 --> 00:03:26,840 Speaker 2: had a flight recently that was delayed for nine hours, 64 00:03:26,880 --> 00:03:30,520 Speaker 2: and there's like this palpable frustration that everyone feels that 65 00:03:30,600 --> 00:03:33,360 Speaker 2: the person, you know, standing at the gate like can't 66 00:03:33,400 --> 00:03:35,880 Speaker 2: answer their questions and they can't get anyone to answer 67 00:03:35,920 --> 00:03:37,720 Speaker 2: their questions, and it just sort of like, you know, 68 00:03:37,720 --> 00:03:41,200 Speaker 2: everyone explodes and everyone knows it's not the gate agent's fault, 69 00:03:41,600 --> 00:03:44,320 Speaker 2: but still, like, you know, there's just this frustration, like 70 00:03:44,440 --> 00:03:46,680 Speaker 2: where is the answer to what's going on? 71 00:03:47,080 --> 00:03:47,280 Speaker 1: Right? 72 00:03:47,320 --> 00:03:50,120 Speaker 3: And you can't ask a single person because that single 73 00:03:50,160 --> 00:03:52,840 Speaker 3: person doesn't, like the gate agent doesn't have the answers.
74 00:03:53,160 --> 00:03:56,960 Speaker 3: I think what's happening is like society has organized itself 75 00:03:57,040 --> 00:04:02,040 Speaker 3: in such a way as to devolve responsibility, and the 76 00:04:02,200 --> 00:04:07,280 Speaker 3: creation of all this new technology is basically going to, 77 00:04:07,600 --> 00:04:10,560 Speaker 3: I guess, ramp all of that up, right? Like, so 78 00:04:10,960 --> 00:04:13,160 Speaker 3: it might not even be the gate agent that you're 79 00:04:13,200 --> 00:04:16,240 Speaker 3: asking in the future. It might be you trying to, 80 00:04:16,320 --> 00:04:19,839 Speaker 3: like I don't know, ask the algorithm, like why it 81 00:04:19,920 --> 00:04:23,680 Speaker 3: decided to bump you versus someone else. Or actually that 82 00:04:23,720 --> 00:04:26,400 Speaker 3: already happens, right? There is an algo that dictates like 83 00:04:26,640 --> 00:04:29,440 Speaker 3: who gets bumped from the plane and who doesn't. So yeah. 84 00:04:29,520 --> 00:04:31,480 Speaker 2: The other big one, of course, is health insurance and 85 00:04:31,560 --> 00:04:34,599 Speaker 2: why some claims are suddenly denied and you never get 86 00:04:34,600 --> 00:04:39,680 Speaker 2: this answer. Anyway, the world is already filled with systems 87 00:04:40,040 --> 00:04:42,840 Speaker 2: in which we have some question and no one actually 88 00:04:42,880 --> 00:04:45,279 Speaker 2: can sort of, you know, give you the answer. 89 00:04:45,960 --> 00:04:49,360 Speaker 3: Yes, absolutely, and I think we might have the perfect guest. 90 00:04:49,480 --> 00:04:52,200 Speaker 2: We do have the perfect guest. It's someone we've talked 91 00:04:52,200 --> 00:04:55,920 Speaker 2: to multiple times on the podcast, one of the smartest 92 00:04:55,960 --> 00:04:59,360 Speaker 2: guys around, always interesting, always worth paying attention to. We're 93 00:04:59,400 --> 00:05:01,480 Speaker 2: going to be speaking with Dan Davies. He is the 94 00:05:01,520 --> 00:05:05,720 Speaker 2: author of the new book The Unaccountability Machine: Why Big 95 00:05:05,800 --> 00:05:09,640 Speaker 2: Systems Make Terrible Decisions and How the World Lost Its Mind. 96 00:05:09,960 --> 00:05:12,520 Speaker 2: So this should be really fun. Dan, thank you so 97 00:05:12,600 --> 00:05:14,240 Speaker 2: much for coming back on the podcast. 98 00:05:14,480 --> 00:05:16,000 Speaker 4: Oh thanks very much for inviting me. 99 00:05:16,400 --> 00:05:19,200 Speaker 2: You know, before we get into the meat of your argument, 100 00:05:19,640 --> 00:05:23,040 Speaker 2: The Unaccountability Machine, I'm actually curious about the second half 101 00:05:23,320 --> 00:05:25,920 Speaker 2: of the title, How the World Lost Its Mind, because 102 00:05:25,960 --> 00:05:28,479 Speaker 2: I certainly feel like the world lost its mind. But 103 00:05:28,640 --> 00:05:31,159 Speaker 2: I'm like a middle aged boomer, and I feel like, 104 00:05:31,520 --> 00:05:33,760 Speaker 2: you know, anytime you get to my age, you're like, oh, 105 00:05:33,800 --> 00:05:37,680 Speaker 2: the world's gone crazy, the world's gone mad. Like, you know, 106 00:05:37,800 --> 00:05:40,480 Speaker 2: why is everything so nuts these days? Has the world? 107 00:05:40,480 --> 00:05:42,680 Speaker 2: Do we actually know that the world's gone mad? Or 108 00:05:42,720 --> 00:05:44,640 Speaker 2: is it just because, like, we're all sort of old 109 00:05:44,680 --> 00:05:47,279 Speaker 2: and cranky now that everything seems like the world's gone mad?
110 00:05:47,680 --> 00:05:50,400 Speaker 4: Well, from your own perspective, you can never be sure. Yeah, 111 00:05:50,440 --> 00:05:54,640 Speaker 4: but I think there are actually reasonable, objective ways that you 112 00:05:54,680 --> 00:05:58,520 Speaker 4: can check up on this, just by noticing that the 113 00:05:58,560 --> 00:06:02,680 Speaker 4: world gets more complicated as it gets bigger, and it 114 00:06:02,760 --> 00:06:05,840 Speaker 4: gets exponentially more complicated. And I mean that in the 115 00:06:05,880 --> 00:06:10,240 Speaker 4: literal mathematical sense, because the number of connections grows faster 116 00:06:10,320 --> 00:06:15,000 Speaker 4: than the number of things, whereas our capability to understand 117 00:06:15,000 --> 00:06:19,880 Speaker 4: the world, manage it, and make decisions doesn't necessarily grow exponentially. 118 00:06:20,360 --> 00:06:24,479 Speaker 4: So this is the story, I would argue, of economics. 119 00:06:24,480 --> 00:06:27,720 Speaker 4: It's the story of any management book that's worth reading, 120 00:06:28,040 --> 00:06:30,839 Speaker 4: because the central problem of management is that the world is 121 00:06:30,839 --> 00:06:36,039 Speaker 4: getting more complicated faster than you can process that complexity. 122 00:06:36,400 --> 00:06:38,200 Speaker 4: What are you going to do about it? How are 123 00:06:38,200 --> 00:06:40,880 Speaker 4: you going to reorganize? And you know, we've just been 124 00:06:40,920 --> 00:06:44,800 Speaker 4: through a global financial crisis, we've just been through a 125 00:06:44,880 --> 00:06:48,440 Speaker 4: political, what Adam Tooze has called a polycrisis. I think 126 00:06:48,440 --> 00:06:50,680 Speaker 4: there are decent reasons to believe it's not just because we're 127 00:06:50,680 --> 00:06:55,120 Speaker 4: getting older, and it is actually a crisis of the 128 00:06:55,200 --> 00:06:59,479 Speaker 4: ability to make decisions matched up against the speed and 129 00:06:59,480 --> 00:07:01,719 Speaker 4: complexity of the decisions that we're having to take. 130 00:07:02,560 --> 00:07:05,800 Speaker 3: So when we talk about the lack of accountability and 131 00:07:05,839 --> 00:07:10,600 Speaker 3: making bad decisions, give us some concrete examples of things 132 00:07:10,640 --> 00:07:13,840 Speaker 3: that you have spoken about or written about in your book. 133 00:07:14,000 --> 00:07:15,280 Speaker 3: What are you thinking about here? 134 00:07:15,840 --> 00:07:19,120 Speaker 4: Well, I mean there are kind of little, trivial, 135 00:07:19,120 --> 00:07:24,040 Speaker 4: funny examples, and there are big, huge, serious examples. So for example, 136 00:07:24,320 --> 00:07:28,640 Speaker 4: and I apologize in advance because this is quite disgusting. 137 00:07:29,240 --> 00:07:30,520 Speaker 3: Oh is it the squirrels? 138 00:07:30,960 --> 00:07:32,040 Speaker 4: Would you rather I didn't talk? 139 00:07:32,720 --> 00:07:32,920 Speaker 1: Yeah? 140 00:07:33,080 --> 00:07:37,000 Speaker 3: You could. This is bad if there are children listening. 141 00:07:37,280 --> 00:07:39,120 Speaker 3: Maybe child. 142 00:07:41,080 --> 00:07:44,880 Speaker 4: At the start of this century there was a craze 143 00:07:45,160 --> 00:07:50,560 Speaker 4: for squirrels as pets in Europe, and squirrels were being 144 00:07:50,560 --> 00:07:53,560 Speaker 4: imported from North America and China to be pets, and 145 00:07:53,640 --> 00:07:58,040 Speaker 4: they had to have the right paperwork.
And so one 146 00:07:58,120 --> 00:08:02,600 Speaker 4: day four hundred of the poor little things 147 00:08:02,880 --> 00:08:07,040 Speaker 4: showed up at Schiphol Airport in Amsterdam without any paperwork 148 00:08:07,200 --> 00:08:10,040 Speaker 4: and without a return address to send them back to. 149 00:08:10,880 --> 00:08:14,080 Speaker 4: And it's difficult to know what the airline should have done, 150 00:08:14,560 --> 00:08:17,000 Speaker 4: but you can't help thinking that there must have been 151 00:08:17,040 --> 00:08:20,400 Speaker 4: a better solution than what they actually did do, which 152 00:08:20,560 --> 00:08:24,440 Speaker 4: was that they threw all four hundred of them, except 153 00:08:24,480 --> 00:08:27,840 Speaker 4: for one or two that escaped, into an industrial shredder, 154 00:08:29,240 --> 00:08:34,480 Speaker 4: and this caused an outrage. There were questions asked in 155 00:08:34,520 --> 00:08:38,199 Speaker 4: the Dutch Parliament and people immediately started asking how did 156 00:08:38,240 --> 00:08:43,040 Speaker 4: this happen? Who is responsible? And in fact, the press 157 00:08:43,120 --> 00:08:46,439 Speaker 4: release from the airline apologizing for this is studied as 158 00:08:46,480 --> 00:08:50,280 Speaker 4: a masterpiece of crisis PR in business schools. But when 159 00:08:50,280 --> 00:08:53,960 Speaker 4: they went back to inquire, they ended up realizing that 160 00:08:54,160 --> 00:08:58,000 Speaker 4: no one had ever really made the decision that that 161 00:08:58,160 --> 00:09:02,160 Speaker 4: was what they were going to do. The government's biosecurity 162 00:09:02,200 --> 00:09:07,280 Speaker 4: ministry had set some standards for the importation of small animals, 163 00:09:08,200 --> 00:09:12,440 Speaker 4: the airline had set some standards for compliance with that policy. 164 00:09:13,040 --> 00:09:15,800 Speaker 4: The only people who were expected to make a decision 165 00:09:16,200 --> 00:09:19,920 Speaker 4: about whether this was grotesque and couldn't be done or 166 00:09:20,000 --> 00:09:23,520 Speaker 4: not were some low level employees in a shed at 167 00:09:23,559 --> 00:09:28,480 Speaker 4: Schiphol Airport. And frankly, people who work in sheds aren't 168 00:09:28,600 --> 00:09:31,280 Speaker 4: usually going to be thinking that they're meant to be 169 00:09:31,320 --> 00:09:34,120 Speaker 4: second guessing the government. And so what happened is that 170 00:09:34,240 --> 00:09:37,600 Speaker 4: you had this phenomenon that turns up a lot of 171 00:09:37,640 --> 00:09:41,240 Speaker 4: the time at all levels of organization, which is that 172 00:09:41,320 --> 00:09:47,079 Speaker 4: something happened which nobody wanted, but was the predictable output 173 00:09:47,600 --> 00:09:49,280 Speaker 4: of the system that they had created. 174 00:10:05,080 --> 00:10:07,400 Speaker 2: Yeah, this is, first of all, that's grim, 175 00:10:07,440 --> 00:10:10,000 Speaker 2: but also like when you describe it that way, you 176 00:10:10,040 --> 00:10:12,600 Speaker 2: could see it because ultimately, like, all right, here's this 177 00:10:12,720 --> 00:10:16,600 Speaker 2: awful thing that happened to three hundred and ninety eight squirrels. 178 00:10:17,320 --> 00:10:20,800 Speaker 2: I guess in theory, someone had to, I don't 179 00:10:20,840 --> 00:10:23,680 Speaker 2: know, dump the bag into the shredder. 180 00:10:24,400 --> 00:10:25,679 Speaker 4: I have not researched them.
181 00:10:25,760 --> 00:10:30,679 Speaker 2: Yeah, but that's not a very satisfying answer. I mean, yes, okay, 182 00:10:30,720 --> 00:10:33,440 Speaker 2: maybe there was someone who did the physical thing, and 183 00:10:33,520 --> 00:10:37,000 Speaker 2: I guess the entire, you know, operations of the airline 184 00:10:37,080 --> 00:10:40,320 Speaker 2: and the port and the customs bureau could like just 185 00:10:40,360 --> 00:10:43,880 Speaker 2: blame that one person. But that's not a very satisfying 186 00:10:44,120 --> 00:10:47,480 Speaker 2: conclusion, I guess, in terms of like how this actually happened. 187 00:10:47,520 --> 00:10:49,800 Speaker 2: It's funny, I brought this documentary up, I was thinking 188 00:10:49,800 --> 00:10:52,400 Speaker 2: about this too, like, with the destruction of the 189 00:10:52,440 --> 00:10:54,760 Speaker 2: old Penn Station in New York, which is like this 190 00:10:54,920 --> 00:10:58,720 Speaker 2: like extraordinary, like I guess, like Roman or Greek building, 191 00:10:58,760 --> 00:11:01,400 Speaker 2: and they just tore it down to like build the... 192 00:11:01,960 --> 00:11:02,800 Speaker 2: 193 00:11:02,600 --> 00:11:04,840 Speaker 3: Something hideous, Yeah, something hideous. 194 00:11:04,480 --> 00:11:07,240 Speaker 2: In its place. And now the new Penn Station is terrible, 195 00:11:07,280 --> 00:11:09,160 Speaker 2: but it feels like the same thing. It's like, how 196 00:11:09,200 --> 00:11:11,240 Speaker 2: did no one stop and say, like, wait, does this 197 00:11:11,360 --> 00:11:14,000 Speaker 2: make any sense in the long run, in the big picture? 198 00:11:14,520 --> 00:11:17,360 Speaker 4: Yeah. And the thing is they'd created a system on 199 00:11:17,440 --> 00:11:20,360 Speaker 4: the assumption that squirrels would show up in ones and 200 00:11:20,400 --> 00:11:24,280 Speaker 4: twos and they could be dealt with as individuals, and 201 00:11:24,320 --> 00:11:26,400 Speaker 4: that therefore you would never get into this sort of 202 00:11:26,440 --> 00:11:30,280 Speaker 4: situation, because when you build a system, you're always building 203 00:11:30,320 --> 00:11:33,120 Speaker 4: a model of the world, and if something happens which 204 00:11:33,160 --> 00:11:35,800 Speaker 4: doesn't fit into your model of the world, your system 205 00:11:35,920 --> 00:11:40,000 Speaker 4: might do something awful. And there's a sort of symmetry 206 00:11:40,120 --> 00:11:44,120 Speaker 4: and a kind of resemblance here with much bigger and 207 00:11:44,200 --> 00:11:47,360 Speaker 4: more grim things like the Boeing seven three seven Max, 208 00:11:47,600 --> 00:11:51,320 Speaker 4: like the LIBOR scandal in financial markets. It's not 209 00:11:51,480 --> 00:11:55,720 Speaker 4: so much that anyone sat down and said, let's form 210 00:11:55,840 --> 00:11:59,520 Speaker 4: a conspiracy to manipulate interest rates, or let's build a 211 00:11:59,760 --> 00:12:03,800 Speaker 4: plane that crashes under certain circumstances. It's just that no 212 00:12:03,840 --> 00:12:06,280 Speaker 4: one set things up so that that wouldn't happen.
213 00:12:07,280 --> 00:12:10,840 Speaker 3: Do you think, I'm afraid going forward the squirrels are 214 00:12:10,840 --> 00:12:14,680 Speaker 3: probably going to be our archetypal example of this, but 215 00:12:14,920 --> 00:12:18,000 Speaker 3: do you think with the squirrels, for instance, the hyper 216 00:12:18,200 --> 00:12:22,440 Speaker 3: specificity of the goals or the job roles of everyone 217 00:12:22,760 --> 00:12:26,360 Speaker 3: involved contributed to the outcome? So in the sense that 218 00:12:26,440 --> 00:12:29,960 Speaker 3: you have, you know, I guess, the Dutch wildlife department 219 00:12:30,080 --> 00:12:34,400 Speaker 3: which is trying to protect Dutch wildlife. Then you have 220 00:12:34,480 --> 00:12:36,560 Speaker 3: like the guys at the airport who are charged with 221 00:12:36,760 --> 00:12:40,440 Speaker 3: actually carrying out these orders, and then you have the 222 00:12:40,520 --> 00:12:44,200 Speaker 3: airline which is charged with like looking at the paperwork, 223 00:12:44,679 --> 00:12:47,480 Speaker 3: when taken altogether, do 224 00:12:47,559 --> 00:12:49,599 Speaker 3: those tend to lead to worse outcomes? 225 00:12:50,000 --> 00:12:53,480 Speaker 4: Yeah? Absolutely. It's this kind of fragmentation of the decision 226 00:12:54,040 --> 00:12:57,640 Speaker 4: which is a result of the industrialization of the decision, 227 00:12:57,679 --> 00:13:01,240 Speaker 4: and it's industrialization literally in the Adam Smith sense, that 228 00:13:01,679 --> 00:13:04,480 Speaker 4: you don't have anyone in the pin factory building an 229 00:13:04,640 --> 00:13:09,240 Speaker 4: entire pin. You have thirteen guys all performing one simple operation. 230 00:13:09,760 --> 00:13:12,440 Speaker 4: And that's a much more productive way to do things. 231 00:13:12,720 --> 00:13:15,480 Speaker 4: But then when you apply it to decision making, you 232 00:13:15,559 --> 00:13:19,280 Speaker 4: have the problem that everyone assumes that everyone else is 233 00:13:19,320 --> 00:13:24,400 Speaker 4: going to react when something unplanned happens. So this is, 234 00:13:24,520 --> 00:13:28,760 Speaker 4: like I say, it's the central problem of every good 235 00:13:28,840 --> 00:13:31,800 Speaker 4: management textbook. How do you deal with information? How 236 00:13:31,800 --> 00:13:33,959 Speaker 4: do you get a drink from a fire hose? How 237 00:13:33,960 --> 00:13:37,200 Speaker 4: do you stop yourself from being overwhelmed? The answer is 238 00:13:37,280 --> 00:13:40,079 Speaker 4: always, in some way or another, you build a system 239 00:13:40,240 --> 00:13:43,160 Speaker 4: to make the decisions for you. But once you've built 240 00:13:43,200 --> 00:13:46,120 Speaker 4: that system to make the decisions for you, you no 241 00:13:46,200 --> 00:13:49,640 Speaker 4: longer feel ownership of that decision. Psychologically, you no longer 242 00:13:49,640 --> 00:13:52,920 Speaker 4: feel like you're accountable for the decision, because if you 243 00:13:52,960 --> 00:13:55,960 Speaker 4: were accountable, you might be able to change it. But 244 00:13:56,040 --> 00:13:58,680 Speaker 4: if you're still going to be held accountable for this thing, 245 00:13:58,960 --> 00:14:01,840 Speaker 4: you haven't really moved that thing on, you haven't really 246 00:14:01,880 --> 00:14:03,160 Speaker 4: delegated it to the system.
247 00:14:04,080 --> 00:14:07,320 Speaker 2: You mentioned Boeing. And speaking of Boeing, there was a 248 00:14:07,360 --> 00:14:11,679 Speaker 2: great blog post back in April from Steve Randy Waldman 249 00:14:11,800 --> 00:14:14,840 Speaker 2: of Interfluidity, who I consider another one kind of in 250 00:14:14,880 --> 00:14:18,640 Speaker 2: your category of people who have basically been writing interesting 251 00:14:18,760 --> 00:14:21,280 Speaker 2: things on the internet for a very long time, and 252 00:14:21,320 --> 00:14:24,040 Speaker 2: the title was Seeing Like a CEO, and 253 00:14:24,120 --> 00:14:27,600 Speaker 2: this idea that, you know, when Boeing merged with McDonnell Douglas, 254 00:14:27,720 --> 00:14:30,360 Speaker 2: the McDonnell Douglas CEO became an outside hire and he 255 00:14:30,480 --> 00:14:35,040 Speaker 2: had to essentially gain some legibility into this new organization 256 00:14:35,120 --> 00:14:37,840 Speaker 2: that he inherited, and that was the cause for some 257 00:14:37,880 --> 00:14:42,120 Speaker 2: of this like streamlining and, you know, offshoring, 258 00:14:42,680 --> 00:14:45,000 Speaker 2: things like that. Talk to us about, you know, 259 00:14:45,040 --> 00:14:49,040 Speaker 2: you mentioned Boeing, and the crises 260 00:14:49,080 --> 00:14:51,480 Speaker 2: there that have been going on for several years. It's a more severe, 261 00:14:51,560 --> 00:14:56,280 Speaker 2: serious issue than the squirrels. But how do you, you know, 262 00:14:56,280 --> 00:14:57,800 Speaker 2: how do you think about what happened there? 263 00:14:58,200 --> 00:15:01,360 Speaker 4: I think about it in very similar ways to Steve 264 00:15:01,440 --> 00:15:05,600 Speaker 4: Randy Waldman, because, and you see it in Boeing, 265 00:15:05,720 --> 00:15:08,600 Speaker 4: but it's visible to a greater or lesser extent in 266 00:15:08,960 --> 00:15:12,240 Speaker 4: very very many companies, probably the majority of companies today, 267 00:15:12,720 --> 00:15:16,440 Speaker 4: that the C suite has an information environment which is 268 00:15:16,600 --> 00:15:22,560 Speaker 4: almost completely composed of financial numbers, because the financial numbers 269 00:15:23,080 --> 00:15:27,520 Speaker 4: are taken by them as objective facts. We can talk 270 00:15:28,160 --> 00:15:31,680 Speaker 4: long and hard with accountants about how objective those financial 271 00:15:31,760 --> 00:15:34,200 Speaker 4: numbers are and how many assumptions go into them, but 272 00:15:34,320 --> 00:15:38,280 Speaker 4: they arrive on a spreadsheet looking very much like objective 273 00:15:38,280 --> 00:15:43,200 Speaker 4: facts about the world. Things at the engineering level, in principle, 274 00:15:43,360 --> 00:15:45,560 Speaker 4: are objective facts, but you have to do a lot 275 00:15:45,640 --> 00:15:48,600 Speaker 4: more work to find them out and to know what's relevant.
276 00:15:49,200 --> 00:15:53,680 Speaker 4: And then you have issues like culture and kind of 277 00:15:53,720 --> 00:15:58,360 Speaker 4: the social environment, which aren't even capable of being quantified. 278 00:15:58,760 --> 00:16:01,280 Speaker 4: So there's always this tendency, if you're trying 279 00:16:01,320 --> 00:16:04,160 Speaker 4: to do that thing of managing your own capacity to 280 00:16:04,200 --> 00:16:08,400 Speaker 4: manage your information flow, that you're going to concentrate on 281 00:16:08,480 --> 00:16:12,560 Speaker 4: the things that look finite and look manageable, and that's 282 00:16:12,560 --> 00:16:14,960 Speaker 4: always going to be the financial numbers, which can be 283 00:16:15,120 --> 00:16:20,600 Speaker 4: a big problem because financial numbers can mislead. You know, 284 00:16:20,680 --> 00:16:23,520 Speaker 4: you can create illusions in an accounting system the same 285 00:16:23,520 --> 00:16:24,840 Speaker 4: way that you can with anything else. 286 00:16:25,720 --> 00:16:29,040 Speaker 3: How did we end up here? And I'm thinking specifically 287 00:16:29,360 --> 00:16:33,440 Speaker 3: of one particular development in the world of business, which 288 00:16:33,480 --> 00:16:36,840 Speaker 3: is the creation of the limited liability company. And I 289 00:16:36,840 --> 00:16:39,440 Speaker 3: guess the clue's kind of in the name there. But 290 00:16:39,600 --> 00:16:43,600 Speaker 3: what were the decisions or the trends that sort of 291 00:16:43,640 --> 00:16:46,080 Speaker 3: came about in creating the current system? 292 00:16:46,880 --> 00:16:50,600 Speaker 4: Well, I mean, the limited liability company is certainly a 293 00:16:50,760 --> 00:16:54,280 Speaker 4: big kind of change if you think about these things 294 00:16:54,400 --> 00:16:58,960 Speaker 4: in feedback terms and information terms, and you know, one 295 00:16:59,000 --> 00:17:01,040 Speaker 4: of the big arguments of the book is that we 296 00:17:01,080 --> 00:17:05,400 Speaker 4: should be looking to the mathematics of information theory rather 297 00:17:05,440 --> 00:17:09,359 Speaker 4: than the mathematics of optimization to explain and model some 298 00:17:09,440 --> 00:17:12,840 Speaker 4: of these things. But a limited liability company is an 299 00:17:12,840 --> 00:17:17,040 Speaker 4: information filter. It tells you that outcomes below a certain 300 00:17:17,080 --> 00:17:22,000 Speaker 4: amount aren't going to affect you anymore, and that changes 301 00:17:22,400 --> 00:17:25,600 Speaker 4: your information world, it changes what you care about. But 302 00:17:25,880 --> 00:17:31,199 Speaker 4: then what I think really started doing the damage, so 303 00:17:31,280 --> 00:17:34,720 Speaker 4: to speak, was the development of the leveraged buyout in 304 00:17:34,840 --> 00:17:39,919 Speaker 4: the nineteen seventies and the shareholder value movement, as it 305 00:17:40,040 --> 00:17:44,040 Speaker 4: was kind of really kicked off by Milton Friedman's essay 306 00:17:44,040 --> 00:17:46,959 Speaker 4: on the social responsibility of business to increase its 307 00:17:47,000 --> 00:17:50,720 Speaker 4: profits, in the New York Times in nineteen seventy, but then built on 308 00:17:51,440 --> 00:17:55,439 Speaker 4: by just the entire kind of two decades of business 309 00:17:55,440 --> 00:17:59,680 Speaker 4: school research that followed from that.
Because again, thinking about 310 00:17:59,720 --> 00:18:03,960 Speaker 4: it in information terms, a leveraged buyout is a massive, 311 00:18:04,240 --> 00:18:08,919 Speaker 4: screaming signal. The requirement to make the payments on debts 312 00:18:09,280 --> 00:18:13,320 Speaker 4: becomes a signal that swamps anything else you might be 313 00:18:13,400 --> 00:18:16,640 Speaker 4: thinking of, because if you are a CEO and you've 314 00:18:16,680 --> 00:18:21,200 Speaker 4: got LBO levels of debt, that's your priority. You can't 315 00:18:21,240 --> 00:18:25,680 Speaker 4: think about anything that isn't related to servicing that debt. 316 00:18:26,160 --> 00:18:28,040 Speaker 2: Let's go back and talk about the sort of big 317 00:18:28,080 --> 00:18:30,919 Speaker 2: picture ideas. In your book, you talk a lot about 318 00:18:30,920 --> 00:18:36,720 Speaker 2: this field called cybernetics, and cybernetics is a name that sounds 319 00:18:36,760 --> 00:18:38,480 Speaker 2: like something that they would have come up with in 320 00:18:38,520 --> 00:18:41,640 Speaker 2: the early nineties, like the Wired magazine people like would 321 00:18:41,680 --> 00:18:44,880 Speaker 2: get into like cybernetics in ninety one. But actually this 322 00:18:44,920 --> 00:18:48,720 Speaker 2: field has been around at least since the nineteen forties. 323 00:18:48,960 --> 00:18:51,199 Speaker 2: I'm surprised they even had a word like cybernetics in 324 00:18:51,200 --> 00:18:55,240 Speaker 2: the nineteen forties. But what is cybernetics? And talk to 325 00:18:55,320 --> 00:18:58,080 Speaker 2: us about the sort of general framework you use as 326 00:18:58,080 --> 00:19:00,359 Speaker 2: a sort of way to start talking about these stories in 327 00:19:00,400 --> 00:19:00,800 Speaker 2: your book. 328 00:19:00,880 --> 00:19:03,359 Speaker 4: Sure, I mean, you're right, it was a Second World 329 00:19:03,400 --> 00:19:08,119 Speaker 4: War kind of thing. It's originally from a word meaning 330 00:19:08,200 --> 00:19:11,280 Speaker 4: the man who steers the boat, so it's cybernetics 331 00:19:11,280 --> 00:19:14,640 Speaker 4: in that sense. And the first guy to use it 332 00:19:15,400 --> 00:19:18,960 Speaker 4: was a scientist working on creating an automated gun sight 333 00:19:19,640 --> 00:19:24,720 Speaker 4: for the United States Air Force. And the idea here 334 00:19:25,080 --> 00:19:29,600 Speaker 4: is that there is some quantity that is preserved in 335 00:19:29,640 --> 00:19:34,920 Speaker 4: an automated gun sight between the operator, the radar, the servo 336 00:19:35,040 --> 00:19:38,160 Speaker 4: motors, and all the components of that system, 337 00:19:38,280 --> 00:19:43,080 Speaker 4: and that quantity is information. And so at the time, 338 00:19:43,119 --> 00:19:46,800 Speaker 4: this was Norbert Wiener I'm talking about, the scientist on 339 00:19:46,840 --> 00:19:50,439 Speaker 4: the automated gun sights. Another guy working in the same field 340 00:19:50,600 --> 00:19:53,320 Speaker 4: you might have heard of was Claude Shannon at Bell Labs, 341 00:19:53,760 --> 00:19:58,240 Speaker 4: who was inventing information theory at Bell Labs and has 342 00:19:58,240 --> 00:20:02,240 Speaker 4: some fundamental theorems there. And in many ways, the science of 343 00:20:02,280 --> 00:20:09,280 Speaker 4: cybernetics is information theory applied to control.
So you might 344 00:20:09,400 --> 00:20:11,919 Speaker 4: have an information theory kind of result, the piece of 345 00:20:11,960 --> 00:20:15,479 Speaker 4: maths that tells you how much bandwidth you need to 346 00:20:15,520 --> 00:20:19,959 Speaker 4: transmit a given signal, and the cybernetic interpretation of that 347 00:20:20,040 --> 00:20:24,360 Speaker 4: math would be that it's telling you how much capacity 348 00:20:24,520 --> 00:20:29,680 Speaker 4: you need to manage a system with a similar kind 349 00:20:29,680 --> 00:20:34,800 Speaker 4: of noise. So this was all made huge use of 350 00:20:35,600 --> 00:20:39,080 Speaker 4: in controlling things where you have access to the whole 351 00:20:39,320 --> 00:20:42,720 Speaker 4: information environment. So a lot of that early maths from 352 00:20:42,760 --> 00:20:46,760 Speaker 4: the first cyberneticians has just kind of stayed with us 353 00:20:47,920 --> 00:20:50,760 Speaker 4: through the invention of the electronic computer, and a lot 354 00:20:50,760 --> 00:20:55,960 Speaker 4: of it is actually at work in really modern artificial intelligence. 355 00:20:56,560 --> 00:21:01,879 Speaker 4: There's a guy called Ben Recht who works on recommendation algorithms, and 356 00:21:02,040 --> 00:21:05,480 Speaker 4: he's quite upfront that a lot of his fundamental mathematical 357 00:21:05,520 --> 00:21:08,640 Speaker 4: techniques are from the forties and fifties just being 358 00:21:08,680 --> 00:21:13,280 Speaker 4: applied in the context of massively more computing power. What 359 00:21:13,320 --> 00:21:16,280 Speaker 4: I'm interested in is where you take those kinds of 360 00:21:16,840 --> 00:21:22,320 Speaker 4: theorems and apply them in a slightly more unrigorous, slightly 361 00:21:22,359 --> 00:21:28,280 Speaker 4: more metaphorical sense to situations of management and organization where 362 00:21:28,320 --> 00:21:31,359 Speaker 4: you don't have access to the full information environment and 363 00:21:31,440 --> 00:21:34,879 Speaker 4: you just have to say, we're going to think about 364 00:21:34,920 --> 00:21:40,399 Speaker 4: this not in an optimizing, kind of neoclassical economics sense, 365 00:21:40,680 --> 00:21:42,680 Speaker 4: but we're going to think about this as a system 366 00:21:43,000 --> 00:21:46,159 Speaker 4: that has to be kept under control, and we're going 367 00:21:46,240 --> 00:21:49,320 Speaker 4: to say, well, how much resource do we need in 368 00:21:49,400 --> 00:21:53,040 Speaker 4: order to stabilize this system as a system, which is 369 00:21:53,040 --> 00:21:54,879 Speaker 4: a kind of abstract way of looking at it, but 370 00:21:54,960 --> 00:21:58,199 Speaker 4: it's the same fundamental problem of management. How do you 371 00:21:58,200 --> 00:22:01,119 Speaker 4: get a drink from a fire hose? How do you match 372 00:22:01,200 --> 00:22:04,480 Speaker 4: your own capacity to manage to the complexity of the 373 00:22:04,520 --> 00:22:05,560 Speaker 4: thing that you're in charge of. 374 00:22:06,040 --> 00:22:09,720 Speaker 3: Wait, can you give us a practical example of the 375 00:22:09,760 --> 00:22:13,159 Speaker 3: application of cybernetics? I guess because, to 376 00:22:13,200 --> 00:22:15,639 Speaker 3: your point earlier, it does sound a little bit abstract in 377 00:22:15,760 --> 00:22:16,280 Speaker 3: my mind.
378 00:16:17,359 --> 00:16:20,840 Speaker 4: Well, I mean a practical example I think is the history 379 00:16:20,920 --> 00:16:24,200 Speaker 4: of the development of the corporation. So from the first 380 00:16:24,280 --> 00:16:28,399 Speaker 4: days of the American railroads, which were probably the first 381 00:16:28,640 --> 00:16:32,919 Speaker 4: really big corporate structures the world ever saw. You have 382 00:16:33,160 --> 00:16:37,159 Speaker 4: this problem that as the network builds out, it gets 383 00:16:37,280 --> 00:16:41,600 Speaker 4: more complicated, and the ability of the head office to 384 00:16:41,680 --> 00:16:46,520 Speaker 4: manage it doesn't grow faster. And you can try and 385 00:16:46,560 --> 00:16:50,439 Speaker 4: solve that by adding more people. You can get a 386 00:16:50,480 --> 00:16:55,000 Speaker 4: great improvement by adding wireless telegraphs. But fundamentally, at some 387 00:16:55,200 --> 00:16:59,040 Speaker 4: point this railroad is going to grow big enough that 388 00:16:59,119 --> 00:17:03,400 Speaker 4: you have to devolve. You have to split it into branches, 389 00:17:03,800 --> 00:17:07,160 Speaker 4: and you have to give autonomy to some of the subsidiaries, 390 00:17:07,480 --> 00:17:11,720 Speaker 4: because that's the only way that you can match the 391 00:17:11,760 --> 00:17:16,440 Speaker 4: bandwidth of the management to the bandwidth of the control problem. 392 00:17:16,520 --> 00:17:19,720 Speaker 4: So I'd say that what we always see in any 393 00:17:19,720 --> 00:17:24,560 Speaker 4: big organization is that it grows, it gets more complicated, 394 00:17:25,080 --> 00:17:27,720 Speaker 4: it tries to deal with that by adding more resources 395 00:17:27,720 --> 00:17:31,320 Speaker 4: at head office, it ends up not being able to 396 00:17:31,400 --> 00:17:36,800 Speaker 4: keep up, and then it reorganizes. And the reorganizations almost 397 00:17:36,840 --> 00:17:42,320 Speaker 4: always either involve pushing responsibility down to the shop floor 398 00:17:42,480 --> 00:17:45,800 Speaker 4: or down to the branches, or they involve spinning off 399 00:17:45,840 --> 00:17:49,119 Speaker 4: parts of the business into a separate organization and giving 400 00:17:49,160 --> 00:17:51,520 Speaker 4: up the task of controlling it at all. 401 00:24:07,240 --> 00:24:09,399 Speaker 2: What are accountability sinks? 402 00:24:10,920 --> 00:24:13,800 Speaker 4: The accountability sink is just, it's a name for a 403 00:24:13,840 --> 00:24:18,400 Speaker 4: particular move of cybernetics that I notice a lot 404 00:24:18,440 --> 00:24:25,040 Speaker 4: of these days, which is when you consciously break the 405 00:24:25,080 --> 00:24:30,320 Speaker 4: feedback links from the subjects of a particular decision to 406 00:24:31,160 --> 00:24:34,560 Speaker 4: yourself or to the unit that's kind of meant to 407 00:24:34,600 --> 00:24:37,119 Speaker 4: be making it.
So your gate agent, Joe, is just 408 00:24:37,160 --> 00:24:41,040 Speaker 4: a classic example of the accountability sink, because they talk 409 00:24:41,160 --> 00:24:45,800 Speaker 4: to you with the voice of a corporation and they 410 00:24:45,840 --> 00:24:47,960 Speaker 4: say that this is the policy, there's nothing I can 411 00:24:47,960 --> 00:24:50,920 Speaker 4: do to change it, and then you're only able to 412 00:24:50,960 --> 00:24:53,640 Speaker 4: talk back to them as a human being like yourself, 413 00:24:53,960 --> 00:24:56,040 Speaker 4: so you can't get mad at them because it's not 414 00:24:56,119 --> 00:24:58,720 Speaker 4: their decision they get... 415 00:24:58,680 --> 00:24:59,919 Speaker 2: Just to be clear, I did not get mad. 416 00:25:01,480 --> 00:25:04,000 Speaker 3: There's a video of Joe out there. 417 00:25:04,800 --> 00:25:07,919 Speaker 2: I just sat there and I closed my eyes, and 418 00:25:08,000 --> 00:25:10,480 Speaker 2: I did stand around because I was sort of curious about 419 00:25:10,520 --> 00:25:12,360 Speaker 2: the gate banter. But I did not get mad, I just want to establish. 420 00:25:12,400 --> 00:25:14,880 Speaker 4: Yeah, but then you might 421 00:25:14,960 --> 00:25:17,280 Speaker 4: ask them, and I will confess I've done this. I've 422 00:25:17,320 --> 00:25:21,320 Speaker 4: asked them politely for the phone number of someone 423 00:25:21,359 --> 00:25:24,320 Speaker 4: who I can call up who is responsible for that decision, 424 00:25:24,920 --> 00:25:27,600 Speaker 4: and you know, it's not the policy. You can't get 425 00:25:27,600 --> 00:25:30,440 Speaker 4: that phone number. The whole point of this was to 426 00:25:30,480 --> 00:25:36,959 Speaker 4: create a sink into which unpleasant feedback can be poured 427 00:25:37,400 --> 00:25:41,640 Speaker 4: and is dissipated harmlessly. And when you start thinking about 428 00:25:41,680 --> 00:25:44,959 Speaker 4: these things in terms of accountability sinks, you start seeing 429 00:25:44,960 --> 00:25:49,720 Speaker 4: them everywhere. Because everywhere that there's a policy that can't 430 00:25:49,720 --> 00:25:53,760 Speaker 4: be broken and no feedback to the person who could 431 00:25:53,800 --> 00:25:57,080 Speaker 4: get the policy changed, that's an accountability sink. That's a 432 00:25:57,119 --> 00:26:01,040 Speaker 4: way that someone has protected themselves from the consequences of 433 00:26:01,080 --> 00:26:05,399 Speaker 4: their decisions, possibly at huge cost to the organization that 434 00:26:05,440 --> 00:26:07,199 Speaker 4: they're working for, but possibly not. 435 00:26:09,640 --> 00:26:12,800 Speaker 3: I guess I'll ask the obvious question, which is how 436 00:26:12,800 --> 00:26:16,880 Speaker 3: do we break out of accountability sinks? And I think 437 00:26:16,920 --> 00:26:19,520 Speaker 3: the frustration for everyone is that, you know, you feel 438 00:26:19,520 --> 00:26:21,960 Speaker 3: powerless when you're caught in one, when you can't get 439 00:26:22,280 --> 00:26:24,480 Speaker 3: the answer that you want, or when you can't speak 440 00:26:24,520 --> 00:26:26,720 Speaker 3: to the decision maker and try to reason with them 441 00:26:26,800 --> 00:26:28,960 Speaker 3: or explain why this might be a one off or 442 00:26:29,000 --> 00:26:32,240 Speaker 3: a peculiar situation.
And then it just feels like the 443 00:26:32,320 --> 00:26:35,640 Speaker 3: idea of actually starting to break apart some of these 444 00:26:35,680 --> 00:26:39,760 Speaker 3: sinks and move to more of an era of personal 445 00:26:39,800 --> 00:26:44,200 Speaker 3: accountability, the buck stops here and all of that, Yeah, 446 00:26:44,200 --> 00:26:47,439 Speaker 3: there we go. It just seems further and further away. 447 00:26:48,119 --> 00:26:50,600 Speaker 4: Well, it is. And the horrible answer to your question, 448 00:26:50,720 --> 00:26:55,080 Speaker 4: Tracy, is that maybe we can't, or maybe as individuals 449 00:26:55,119 --> 00:26:58,840 Speaker 4: we can't. And that's actually, in my view, potentially very 450 00:26:58,880 --> 00:27:02,800 Speaker 4: bad news for society, because all of this, you know, 451 00:27:02,880 --> 00:27:05,280 Speaker 4: all of this negative emotion from people about the way 452 00:27:05,280 --> 00:27:08,840 Speaker 4: the world is, goes into the sinks. But like any sink, 453 00:27:08,960 --> 00:27:11,800 Speaker 4: it piles up, and it piles up, and then after 454 00:27:11,880 --> 00:27:15,399 Speaker 4: a while it all spills out, and then suddenly we 455 00:27:15,480 --> 00:27:19,600 Speaker 4: get things like Brexit in the UK, or the kind of 456 00:27:19,800 --> 00:27:22,680 Speaker 4: first go of Donald Trump in the USA. We get 457 00:27:22,720 --> 00:27:27,920 Speaker 4: people who are used to being decided upon and used 458 00:27:27,960 --> 00:27:33,080 Speaker 4: to being ignored, getting steadily more and more dissatisfied with 459 00:27:33,119 --> 00:27:35,879 Speaker 4: the system, and then finally they start to use the 460 00:27:35,920 --> 00:27:40,639 Speaker 4: only power left to them, to just use their 461 00:27:40,720 --> 00:27:43,520 Speaker 4: votes in a way that says, I am no longer 462 00:27:43,560 --> 00:27:46,000 Speaker 4: satisfied with this, this is no longer tolerable to me. 463 00:27:46,320 --> 00:27:48,560 Speaker 4: I'm going to use my vote to tear the system apart. 464 00:27:49,200 --> 00:27:53,359 Speaker 4: And so with all these things, you can divert 465 00:27:53,400 --> 00:27:55,359 Speaker 4: these things for a while, but it's at the cost 466 00:27:55,400 --> 00:27:56,600 Speaker 4: of building up fragility. 467 00:27:57,359 --> 00:28:00,439 Speaker 2: Yeah, I have to say, you know, I didn't 468 00:28:00,720 --> 00:28:04,119 Speaker 2: finish your book, I'm about halfway through, but 469 00:28:04,240 --> 00:28:09,800 Speaker 2: it did leave me fairly nihilistic or pessimistic, that it's like, 470 00:28:10,119 --> 00:28:13,639 Speaker 2: these are these inexorable centrifugal or centripetal forces, I can 471 00:28:13,680 --> 00:28:16,840 Speaker 2: never remember which is which, and they're pulling us into 472 00:28:16,920 --> 00:28:21,160 Speaker 2: all of these sort of high stress decisions and it's 473 00:28:21,200 --> 00:28:24,000 Speaker 2: really bad, and things are gonna 474 00:28:24,040 --> 00:28:26,880 Speaker 2: keep breaking, and we're going to get angrier and angrier. 475 00:28:27,600 --> 00:28:29,960 Speaker 2: We talked about AI in the beginning, and a sort 476 00:28:30,000 --> 00:28:32,560 Speaker 2: of provocative idea that you talk about in your book 477 00:28:32,640 --> 00:28:36,239 Speaker 2: is the idea of the corporation as it exists and 478 00:28:36,320 --> 00:28:39,400 Speaker 2: as we've known it as already being a proto-AI.
479 00:28:39,520 --> 00:28:41,160 Speaker 2: So you go to ChatGPT and you put in 480 00:28:41,200 --> 00:28:43,800 Speaker 2: a request and something spits out and it's impressive, whatever, 481 00:28:44,120 --> 00:28:46,560 Speaker 2: but actually this is just sort of a specific 482 00:28:46,640 --> 00:28:50,240 Speaker 2: example of what the corporation has been for a long time. 483 00:28:50,720 --> 00:28:54,200 Speaker 4: Absolutely, I mean, and this is the beauty of abstract maths. 484 00:28:54,200 --> 00:28:56,600 Speaker 4: It describes things without you needing to know what they 485 00:28:56,640 --> 00:29:00,320 Speaker 4: actually are. These are all just decision making systems. I 486 00:29:00,400 --> 00:29:03,520 Speaker 4: had a conversation with someone at the European Commission the 487 00:29:03,600 --> 00:29:07,320 Speaker 4: year before last, because in Europe they passed an Act 488 00:29:07,800 --> 00:29:11,240 Speaker 4: saying that if a decision is made affecting you, like 489 00:29:11,360 --> 00:29:14,240 Speaker 4: to turn you down for health insurance, then if that's 490 00:29:14,240 --> 00:29:17,360 Speaker 4: made by an algorithm, you have a right of explainability. 491 00:29:17,800 --> 00:29:19,760 Speaker 4: So you have a right that someone can explain to 492 00:29:19,840 --> 00:29:23,320 Speaker 4: you why that algorithm made that decision for you. And 493 00:29:23,440 --> 00:29:25,480 Speaker 4: you know, I thought that's quite good. But it's kind 494 00:29:25,480 --> 00:29:28,320 Speaker 4: of ironic that this decision is coming from the European Commission. 495 00:29:28,680 --> 00:29:30,720 Speaker 4: So I asked the guy who works there, well, you know, 496 00:29:30,800 --> 00:29:33,280 Speaker 4: when you make a decision like that, what right of 497 00:29:33,320 --> 00:29:36,200 Speaker 4: explainability do I have from you? And the answer is, 498 00:29:36,200 --> 00:29:40,320 Speaker 4: ha ha, no, none at all. All these things 499 00:29:40,560 --> 00:29:43,479 Speaker 4: are basically working in the same way. The AI is 500 00:29:43,600 --> 00:29:46,760 Speaker 4: working like the corporation, which is working like the government, 501 00:29:47,400 --> 00:29:52,600 Speaker 4: and the same problems of information management affect them all. 502 00:29:53,000 --> 00:29:55,640 Speaker 4: But it's not as nihilistic as you think, in my view, 503 00:29:55,760 --> 00:30:00,400 Speaker 4: because that means that these things can be subject 504 00:30:00,400 --> 00:30:03,240 Speaker 4: to the same kinds of solutions. You know, if we 505 00:30:03,280 --> 00:30:06,560 Speaker 4: think about the original reason for building the accountability sink, 506 00:30:07,080 --> 00:30:11,400 Speaker 4: it was that someone felt overwhelmed by information and 507 00:30:11,480 --> 00:30:14,320 Speaker 4: so didn't feel that they were responsible for the decision 508 00:30:14,760 --> 00:30:18,640 Speaker 4: and so wanted to cut the link of accountability. If 509 00:30:18,680 --> 00:30:21,240 Speaker 4: you can put the AI in the loop in such 510 00:30:21,280 --> 00:30:25,360 Speaker 4: a way that the decision maker is more able to 511 00:30:25,440 --> 00:30:28,480 Speaker 4: manage their information flow, then they don't have so much 512 00:30:28,600 --> 00:30:32,240 Speaker 4: need to break the feedback links, because they've got more 513 00:30:32,760 --> 00:30:35,440 Speaker 4: functional ways to deal with them.
514 00:30:36,040 --> 00:30:39,640 Speaker 3: I have a theoretical question, which is, 515 00:30:39,760 --> 00:30:43,760 Speaker 3: what does accountability actually look like for a decision taken 516 00:30:43,840 --> 00:30:48,320 Speaker 3: by an algorithm? Is it that, like, we understand the 517 00:30:48,640 --> 00:30:53,280 Speaker 3: factors that went into the model, and like the decisions 518 00:30:53,320 --> 00:30:56,800 Speaker 3: within the model that spat out a particular outcome, or 519 00:30:56,960 --> 00:31:00,200 Speaker 3: is it that the person who is using the algo, 520 00:31:00,800 --> 00:31:04,400 Speaker 3: you know, decides to think more thoughtfully about how they're 521 00:31:04,520 --> 00:31:04,920 Speaker 3: using it? 522 00:31:06,720 --> 00:31:09,560 Speaker 4: It's a, that's a really interesting question, which I'm going 523 00:31:09,640 --> 00:31:13,360 Speaker 4: to think about for two seconds, delay and waffle, before answering, 524 00:31:14,160 --> 00:31:19,000 Speaker 4: which is that I think the basic definition of accountability 525 00:31:19,040 --> 00:31:21,400 Speaker 4: in the decision sense, in my view, is that you're 526 00:31:21,440 --> 00:31:25,800 Speaker 4: accountable for a decision exactly to the extent that you 527 00:31:25,880 --> 00:31:28,600 Speaker 4: are able to change it. So, in terms of a 528 00:31:28,640 --> 00:31:33,680 Speaker 4: decision made by an AI, it is accountable if it 529 00:31:33,720 --> 00:31:37,800 Speaker 4: could be made to make a different decision by new 530 00:31:37,880 --> 00:31:41,800 Speaker 4: information being provided to it. So the crucial thing is 531 00:31:42,560 --> 00:31:45,800 Speaker 4: not so much having someone to point at and attribute 532 00:31:45,800 --> 00:31:49,760 Speaker 4: moral responsibility to. The crucial thing is to have some 533 00:31:50,280 --> 00:31:53,600 Speaker 4: link back from the subject of the decision, who can just say, 534 00:31:53,840 --> 00:31:57,760 Speaker 4: let's review this, let's have a court of appeal. And 535 00:31:58,280 --> 00:32:01,040 Speaker 4: you know, if the algo still thinks that this decision 536 00:32:01,080 --> 00:32:03,400 Speaker 4: needs to be done, if the algo still thinks I'm 537 00:32:03,440 --> 00:32:07,240 Speaker 4: not an insurable risk, then maybe I've been... Maybe I 538 00:32:07,240 --> 00:32:09,280 Speaker 4: still don't agree with that, but at least I know 539 00:32:09,480 --> 00:32:12,800 Speaker 4: I've been heard. It's not coming to me just simply 540 00:32:13,080 --> 00:32:15,120 Speaker 4: as a one way communication channel. 541 00:32:15,800 --> 00:32:20,680 Speaker 2: You know, taking a more optimistic view. So you said 542 00:32:20,680 --> 00:32:25,360 Speaker 2: something interesting or important, which is that for a company, 543 00:32:25,200 --> 00:32:29,040 Speaker 2: the financial numbers are the closest thing, the closest form 544 00:32:29,080 --> 00:32:33,680 Speaker 2: of information that is at least like objective in some sense. 545 00:32:33,720 --> 00:32:36,280 Speaker 2: But then there's all these other things like how 546 00:32:36,320 --> 00:32:39,680 Speaker 2: well is your engineering team working together? How is the 547 00:32:39,720 --> 00:32:43,160 Speaker 2: culture? Things that are just inherently much more difficult.
And there 548 00:32:43,200 --> 00:32:45,880 Speaker 2: are all kinds of like consultants and other companies that 549 00:32:45,920 --> 00:32:48,760 Speaker 2: try to like answer this for executives and, you know, 550 00:32:48,880 --> 00:32:53,200 Speaker 2: rank employees, different things. Aaron Levie, you know, he's the 551 00:32:53,240 --> 00:32:55,520 Speaker 2: CEO of Box and he's one of the few like 552 00:32:55,880 --> 00:32:58,960 Speaker 2: tech CEOs who tweets some interesting stuff from time to time. 553 00:32:59,160 --> 00:33:01,600 Speaker 2: But he had this recent thread about how 554 00:33:01,600 --> 00:33:03,560 Speaker 2: his company is using AI, and he said, you know, 555 00:33:03,600 --> 00:33:06,840 Speaker 2: the exciting thing is, you know, we have some data, 556 00:33:07,480 --> 00:33:09,880 Speaker 2: but then we have all this unstructured data that exists 557 00:33:09,880 --> 00:33:11,640 Speaker 2: in our company, and we've never been able to do 558 00:33:11,720 --> 00:33:15,160 Speaker 2: anything with, probably like chat logs from customer support and 559 00:33:15,200 --> 00:33:18,000 Speaker 2: all this. And he said, the exciting thing for them 560 00:33:18,200 --> 00:33:21,840 Speaker 2: is the prospect of turning all of this sort of unstructured, 561 00:33:22,000 --> 00:33:25,800 Speaker 2: unusable information that the company has into something that can 562 00:33:25,840 --> 00:33:30,720 Speaker 2: be essentially searchable and that insights can be gleaned from. 563 00:33:30,760 --> 00:33:32,880 Speaker 2: Is there a story, or is there a path in 564 00:33:32,920 --> 00:33:36,720 Speaker 2: your view, where artificial intelligence can actually make some of 565 00:33:36,760 --> 00:33:41,400 Speaker 2: these other parts of a system more legible and more 566 00:33:41,440 --> 00:33:45,560 Speaker 2: interactive and more concrete to the executives in a way 567 00:33:45,600 --> 00:33:48,800 Speaker 2: that brings that other data on par with 568 00:33:48,880 --> 00:33:49,800 Speaker 2: the financial data? 569 00:33:50,200 --> 00:33:52,040 Speaker 4: I mean, I really hope so. I mean, like kind 570 00:33:52,080 --> 00:33:53,880 Speaker 4: of one thing that Aaron Levie could do that would 571 00:33:53,880 --> 00:33:57,200 Speaker 4: be really radical would be to open up as much 572 00:33:57,200 --> 00:34:00,920 Speaker 4: of that data as possible to the investors and let 573 00:34:00,960 --> 00:34:04,120 Speaker 4: them parse it and have that as a main channel 574 00:34:04,160 --> 00:34:10,000 Speaker 4: of communication of corporate performance, rather than generally accepted accounting principles. 575 00:34:10,040 --> 00:34:12,880 Speaker 4: Because if you think about that phrase, that's an accountability 576 00:34:12,880 --> 00:34:17,040 Speaker 4: sink right there. What are these accounting principles? They're generally accepted. 577 00:34:17,960 --> 00:34:20,680 Speaker 4: Can I change them? No, that would not be generally accepted. 578 00:34:20,960 --> 00:34:24,160 Speaker 4: What if this is a completely irrelevant set of metrics for 579 00:34:24,280 --> 00:34:27,399 Speaker 4: my business? Well, you still have to do it 580 00:34:27,680 --> 00:34:31,480 Speaker 4: in exactly this way, even if it's not presenting what 581 00:34:31,520 --> 00:34:34,400 Speaker 4: you think is actually generating value.
And in a world 582 00:34:34,480 --> 00:34:39,239 Speaker 4: where we've got better ways of processing bulk information like that, 583 00:34:39,840 --> 00:34:42,080 Speaker 4: then I think there's a real question about whether GAAP 584 00:34:42,520 --> 00:34:46,200 Speaker 4: is something that we should be so fixated on, whether 585 00:34:46,239 --> 00:34:49,280 Speaker 4: we should be thinking that the only way to report 586 00:34:49,680 --> 00:34:54,000 Speaker 4: corporate performance is in a way that's optimized, basically, for 587 00:34:54,080 --> 00:34:56,720 Speaker 4: a guy with a green eyeshade sitting at a desk 588 00:34:57,000 --> 00:35:01,000 Speaker 4: in the beginning of the twentieth century, flipping through printed 589 00:35:01,480 --> 00:35:03,560 Speaker 4: reports and accounts. You know, that's not the way that 590 00:35:03,600 --> 00:35:06,799 Speaker 4: we process information anymore. And so maybe that shouldn't be 591 00:35:07,360 --> 00:35:11,960 Speaker 4: the way that we report information anymore, because, as you say, 592 00:35:12,880 --> 00:35:17,799 Speaker 4: these assumptions that go into GAAP earnings start driving decisions 593 00:35:18,000 --> 00:35:20,120 Speaker 4: and they were never meant to drive decisions. 594 00:35:20,840 --> 00:35:23,960 Speaker 2: It's interesting, Tracy, now thinking about it in this financial sense, 595 00:35:24,000 --> 00:35:28,640 Speaker 2: how many of these accountability sinks there are. Like even like performance benchmarks, right, 596 00:35:28,680 --> 00:35:30,560 Speaker 2: it's like, oh, we beat the S and P or whatever. 597 00:35:30,600 --> 00:35:33,080 Speaker 2: It's like, why the S and P, et cetera? Well, you 598 00:35:33,120 --> 00:35:36,160 Speaker 2: know it's there, right, we could point to it and 599 00:35:36,200 --> 00:35:38,000 Speaker 2: we could say it. But like once you start thinking 600 00:35:38,000 --> 00:35:41,080 Speaker 2: of them, in all of the indices and measures we cite, 601 00:35:41,080 --> 00:35:43,760 Speaker 2: you could see, Tracy, how they like serve that purpose 602 00:35:43,800 --> 00:35:45,880 Speaker 2: of just like, yeah, look, this is what we measure against. 603 00:35:45,920 --> 00:35:48,680 Speaker 3: Oh yeah, of course. I mean incentives matter, right, Like 604 00:35:48,760 --> 00:35:50,880 Speaker 3: that's something that we say over and over again on 605 00:35:50,920 --> 00:35:53,880 Speaker 3: this podcast and when it comes to accounting. Okay, just 606 00:35:53,920 --> 00:35:55,640 Speaker 3: to push back a little bit, but like there is 607 00:35:55,680 --> 00:35:59,480 Speaker 3: an argument to have standardized accounting rules so that we 608 00:35:59,560 --> 00:36:03,280 Speaker 3: don't always end up with companies running off and creating 609 00:36:03,320 --> 00:36:07,759 Speaker 3: community-adjusted EBITDA and things like that. But on 610 00:36:07,800 --> 00:36:11,120 Speaker 3: the subject of incentives, I wanted to ask, you know, 611 00:36:11,160 --> 00:36:14,719 Speaker 3: I read David Graeber's Bullshit Jobs this year and it's 612 00:36:14,760 --> 00:36:17,120 Speaker 3: still sort of looming large in my memory, and I 613 00:36:17,120 --> 00:36:21,160 Speaker 3: guess my question is how much overlap is there between 614 00:36:21,400 --> 00:36:25,480 Speaker 3: the accountability issues that you describe in your book and 615 00:36:25,960 --> 00:36:31,080 Speaker 3: the way specific jobs, especially middle management, are structured.
And 616 00:36:31,120 --> 00:36:32,719 Speaker 3: I don't mean to trigger you because I know that 617 00:36:32,760 --> 00:36:35,080 Speaker 3: you mentioned in the book that you got into it 618 00:36:35,120 --> 00:36:37,520 Speaker 3: a little bit with Graeber over a different subject. But 619 00:36:37,560 --> 00:36:38,839 Speaker 3: if you want to talk about that too. 620 00:36:39,160 --> 00:36:41,560 Speaker 4: Oh, I miss David so much because we used to 621 00:36:41,560 --> 00:36:47,520 Speaker 4: wind each other up so badly, and I got into 622 00:36:47,560 --> 00:36:49,960 Speaker 4: a different argument that's not mentioned in the book over 623 00:36:50,000 --> 00:36:53,120 Speaker 4: bullshit jobs. Because it's just like, if you're saying that 624 00:36:53,400 --> 00:36:56,480 Speaker 4: middle management is a bullshit job, then you're saying that 625 00:36:56,480 --> 00:37:01,760 Speaker 4: the cerebellum is a bullshit organ. Middle management exists 626 00:37:01,960 --> 00:37:06,239 Speaker 4: precisely because all the metrics, all the financial 627 00:37:06,280 --> 00:37:10,799 Speaker 4: and non-financial metrics on the chief executive's dashboard, are 628 00:37:11,560 --> 00:37:15,640 Speaker 4: massive information-reducing filters. The middle managers are the people 629 00:37:15,640 --> 00:37:19,160 Speaker 4: who carry the knowledge of the ways in which those 630 00:37:19,200 --> 00:37:24,760 Speaker 4: metrics can misrepresent reality and how to cure the problems 631 00:37:24,760 --> 00:37:28,560 Speaker 4: which arise when they do, you know, when someone's in 632 00:37:28,640 --> 00:37:31,319 Speaker 4: danger of making a decision. The first thing a bad 633 00:37:31,360 --> 00:37:34,640 Speaker 4: company does before it creates something like the LIBOR scandal 634 00:37:34,719 --> 00:37:36,920 Speaker 4: or the seven three seven Max is thin out the 635 00:37:37,000 --> 00:37:42,000 Speaker 4: ranks of its middle management. And this, particularly without wanting 636 00:37:42,040 --> 00:37:45,520 Speaker 4: to relitigate arguments with David Graeber, particularly since 637 00:37:45,520 --> 00:37:48,640 Speaker 4: he's not around anymore to answer back. In his 638 00:37:48,920 --> 00:37:53,880 Speaker 4: day job as an anthropologist, David was so subtle and 639 00:37:54,000 --> 00:37:59,279 Speaker 4: intelligent about the ceremonial roles of elders and people who 640 00:37:59,440 --> 00:38:04,240 Speaker 4: built consensus among hunter gatherers to decide on what the band 641 00:38:04,239 --> 00:38:06,760 Speaker 4: would do. And then when you have those exact 642 00:38:06,840 --> 00:38:11,360 Speaker 4: same problem-solving and dispute-resolving jobs happening, you know, 643 00:38:11,440 --> 00:38:14,280 Speaker 4: in the offices at Bloomberg or at a law firm, suddenly 644 00:38:14,320 --> 00:38:19,239 Speaker 4: he thinks that they're bullshit jobs. So that's a bit 645 00:38:19,239 --> 00:38:23,359 Speaker 4: of a personal hobby horse. But basically all those or 646 00:38:23,480 --> 00:38:27,279 Speaker 4: very many of those roles are actually the preservation of 647 00:38:27,320 --> 00:38:31,120 Speaker 4: the information systems and the memory of organizations. 648 00:38:32,160 --> 00:38:36,560 Speaker 2: Who is Stafford Beer and why does he loom 649 00:38:36,640 --> 00:38:38,520 Speaker 2: so large in the story you tell about the world? 650 00:38:38,800 --> 00:38:42,279 Speaker 4: Well, Stafford Beer was the father of management cybernetics.
He 651 00:38:42,440 --> 00:38:45,400 Speaker 4: was the guy who first said you can take the 652 00:38:45,640 --> 00:38:49,600 Speaker 4: mathematics of information theory and apply it to industrial organization. 653 00:38:50,400 --> 00:38:57,560 Speaker 4: He was also David Bowie's favorite management consultant. He was very, 654 00:38:57,680 --> 00:39:01,280 Speaker 4: very influential on Brian Eno and the development of ambient music. 655 00:39:02,080 --> 00:39:07,200 Speaker 4: He was a hippie. He tried to... it's not clear 656 00:39:07,239 --> 00:39:09,399 Speaker 4: that this was a joke, but he did try to 657 00:39:09,440 --> 00:39:12,839 Speaker 4: invent a computing pond where the growth of algae would 658 00:39:12,880 --> 00:39:15,280 Speaker 4: correspond to the solutions of differential equations. 659 00:39:15,760 --> 00:39:18,000 Speaker 3: He's going to be my summer project. I got a 660 00:39:18,040 --> 00:39:19,120 Speaker 3: pond with algae. 661 00:39:19,360 --> 00:39:21,400 Speaker 4: Yeah. The problem was that he used to feed them 662 00:39:21,400 --> 00:39:23,839 Speaker 4: iron filings to make them grow, and after a while 663 00:39:23,880 --> 00:39:27,520 Speaker 4: the entire pond became magnetic. But he was just this crazy, 664 00:39:27,840 --> 00:39:34,240 Speaker 4: larger than life figure who did these incredibly successful management 665 00:39:34,280 --> 00:39:39,920 Speaker 4: consulting assignments, but just somehow never quite was able to 666 00:39:39,960 --> 00:39:43,440 Speaker 4: get on with enough people in the corporate world to 667 00:39:43,560 --> 00:39:46,799 Speaker 4: really get his ideas across. And then he ended up 668 00:39:46,880 --> 00:39:52,759 Speaker 4: in Chile in nineteen seventy two with this incredibly romantic 669 00:39:53,200 --> 00:39:59,040 Speaker 4: but ultimately doomed project to reinvent socialism for the twenty 670 00:39:59,080 --> 00:40:04,480 Speaker 4: first century under the Allende government, which I mean realistically 671 00:40:04,520 --> 00:40:06,960 Speaker 4: it was never going to work because the computing resources 672 00:40:07,000 --> 00:40:10,000 Speaker 4: were completely disproportionate to the task. But it never got 673 00:40:10,040 --> 00:40:13,080 Speaker 4: a fair trial, obviously, because the Pinochet coup happened about 674 00:40:13,120 --> 00:40:15,040 Speaker 4: eleven months after he started the project. 675 00:40:15,640 --> 00:40:17,880 Speaker 2: So why don't we talk a little bit more about 676 00:40:17,920 --> 00:40:21,799 Speaker 2: like the current day? You know, I started in the 677 00:40:22,520 --> 00:40:27,120 Speaker 2: conversation like kind of not disagreeing, but questioning the premise 678 00:40:27,239 --> 00:40:30,080 Speaker 2: that the world has lost its... Has the world really 679 00:40:30,120 --> 00:40:32,480 Speaker 2: lost its mind? Or is it just the three of us in 680 00:40:32,520 --> 00:40:36,560 Speaker 2: this conversation getting old and angry? What do you see 681 00:40:36,600 --> 00:40:38,360 Speaker 2: like when you like think about applying it? You know, 682 00:40:38,400 --> 00:40:41,720 Speaker 2: we talked about Boeing and you mentioned the LIBOR scandal. 683 00:40:41,760 --> 00:40:43,840 Speaker 2: But when you look around the 684 00:40:43,840 --> 00:40:46,880 Speaker 2: world today, and you look around at whatever apparent ways the world is losing 685 00:40:46,920 --> 00:40:48,120 Speaker 2: its mind, what are you seeing?
686 00:40:49,000 --> 00:40:51,040 Speaker 4: I think the number one thing I'm seeing, and this 687 00:40:51,200 --> 00:40:53,719 Speaker 4: is a point where David Graeber had it absolutely right, 688 00:40:54,520 --> 00:40:58,560 Speaker 4: is debt. You know, we have so many cases at 689 00:40:58,600 --> 00:41:03,960 Speaker 4: present of companies where you have plenty of people who 690 00:41:04,040 --> 00:41:08,000 Speaker 4: know exactly what they need to do, what investments they 691 00:41:08,120 --> 00:41:12,560 Speaker 4: want to make, but they can't have any plans which 692 00:41:12,600 --> 00:41:16,560 Speaker 4: stretch out any further than the next debt repayment. And 693 00:41:16,760 --> 00:41:21,479 Speaker 4: to a large extent, that's because of leveraged buyouts and 694 00:41:22,000 --> 00:41:26,000 Speaker 4: managements acting in anticipation of the risk of a leveraged buyout. 695 00:41:26,560 --> 00:41:32,960 Speaker 4: But this really is a degradation of the higher functions, 696 00:41:33,000 --> 00:41:38,080 Speaker 4: the brain functions of the corporations of the Anglosphere world. 697 00:41:38,440 --> 00:41:42,440 Speaker 4: The practice of firing middle managers because you don't know 698 00:41:42,480 --> 00:41:46,760 Speaker 4: what they do is also demonstrably, in my view, making 699 00:41:46,840 --> 00:41:51,080 Speaker 4: corporations stupid. It might have been that at the start 700 00:41:51,120 --> 00:41:53,680 Speaker 4: of the LBO boom in the seventies there were too 701 00:41:53,719 --> 00:41:56,759 Speaker 4: many of these guys on soft jobs with country club 702 00:41:56,800 --> 00:41:59,920 Speaker 4: memberships and private jets and whatnot, but we've clearly gone 703 00:41:59,920 --> 00:42:02,600 Speaker 4: too far in the other direction in my view. And 704 00:42:02,719 --> 00:42:09,400 Speaker 4: then we've got the frightening tendency of government organizations to 705 00:42:09,719 --> 00:42:14,920 Speaker 4: outsource absolutely critical functions, which means that all of the 706 00:42:15,000 --> 00:42:18,799 Speaker 4: knowledge of the systems that they're meant to be regulating 707 00:42:19,120 --> 00:42:24,839 Speaker 4: and dealing with is lost. What's an example of that? Right, the best 708 00:42:24,880 --> 00:42:31,439 Speaker 4: example currently, I think, is just the question of 709 00:42:31,640 --> 00:42:35,480 Speaker 4: infrastructure building, for example in the UK, where we have... 710 00:42:36,560 --> 00:42:40,480 Speaker 4: there's a river crossing to the east of London 711 00:42:41,000 --> 00:42:45,239 Speaker 4: which has been responsible for generating the largest pile of 712 00:42:45,400 --> 00:42:48,719 Speaker 4: paper ever brought together in one place in the history 713 00:42:48,719 --> 00:42:54,000 Speaker 4: of humanity. And the reason for that is that the 714 00:42:54,040 --> 00:42:57,400 Speaker 4: people who are meant to be deciding what an appropriate 715 00:42:57,480 --> 00:43:03,720 Speaker 4: level of consultation is don't know anything about environmental impact 716 00:43:03,800 --> 00:43:08,080 Speaker 4: studies and building bridges anymore.
So they commission reports, 717 00:43:08,400 --> 00:43:13,600 Speaker 4: they commission reports from professional services firms, and professional services 718 00:43:13,640 --> 00:43:17,839 Speaker 4: firms want to generate repeat business, and so you've got 719 00:43:17,840 --> 00:43:21,000 Speaker 4: this situation where the people who are meant to be 720 00:43:21,080 --> 00:43:25,640 Speaker 4: seeing the whole system as a system don't really understand 721 00:43:25,680 --> 00:43:29,440 Speaker 4: it anymore because they've outsourced all of their engineering knowledge, 722 00:43:29,480 --> 00:43:33,120 Speaker 4: all of their economic and environmental knowledge. And as a result, 723 00:43:33,800 --> 00:43:38,600 Speaker 4: the UK, or the London Department for Transport, is going 724 00:43:38,640 --> 00:43:41,120 Speaker 4: to be paying as much as ten times as much 725 00:43:41,400 --> 00:43:44,399 Speaker 4: as it should reasonably cost to build a bridge over 726 00:43:44,400 --> 00:43:45,080 Speaker 4: the River Thames. 727 00:43:45,920 --> 00:43:49,279 Speaker 3: It's interesting that you single out debt as a sort 728 00:43:49,280 --> 00:43:52,719 Speaker 3: of deciding factor versus share price, because this is the 729 00:43:52,800 --> 00:43:55,160 Speaker 3: one we hear a lot about in the context of 730 00:43:55,560 --> 00:44:00,680 Speaker 3: corporate short-termism and people usually trying to hit a specific 731 00:44:00,840 --> 00:44:03,600 Speaker 3: share price metric that may or may not be tied 732 00:44:03,640 --> 00:44:04,640 Speaker 3: to their compensation. 733 00:44:05,880 --> 00:44:09,160 Speaker 4: Yeah. I think I'm right on that. I know that 734 00:44:09,160 --> 00:44:11,719 Speaker 4: people disagree with me, but you know, we had share 735 00:44:11,760 --> 00:44:15,040 Speaker 4: prices in the fifties and sixties and we didn't have 736 00:44:15,120 --> 00:44:18,680 Speaker 4: this kind of problem of short-termism. The difference is 737 00:44:19,160 --> 00:44:25,120 Speaker 4: that now we've got takeovers, particularly private equity and LBOs, 738 00:44:25,200 --> 00:44:28,840 Speaker 4: but also in general the use of debt in takeovers, 739 00:44:29,440 --> 00:44:32,799 Speaker 4: and that makes the share price more salient because any 740 00:44:32,840 --> 00:44:36,200 Speaker 4: incumbent management knows that if the share price falls, it 741 00:44:36,320 --> 00:44:41,320 Speaker 4: makes them vulnerable to a takeover. So to my mind, 742 00:44:41,880 --> 00:44:44,240 Speaker 4: I think it's not so much, you know, the share 743 00:44:44,280 --> 00:44:50,360 Speaker 4: price as the exaggerated importance of short term financial metrics, 744 00:44:50,840 --> 00:44:53,640 Speaker 4: which is partly through the share price, but much more 745 00:44:54,120 --> 00:44:55,560 Speaker 4: just simply because of leverage. 746 00:44:56,239 --> 00:44:58,680 Speaker 2: One of the things that, you know... you mentioned the 747 00:44:58,800 --> 00:45:01,680 Speaker 2: UK bridge and it being ten times more expensive than it needs 748 00:45:01,680 --> 00:45:03,279 Speaker 2: to be and the pile of paperwork.
I mean, this 749 00:45:03,480 --> 00:45:06,759 Speaker 2: is probably the number one thing that, like, you know, 750 00:45:06,960 --> 00:45:09,879 Speaker 2: people I follow on Twitter talk about all the time, 751 00:45:09,920 --> 00:45:12,480 Speaker 2: which is just how hard it is to build anything 752 00:45:12,600 --> 00:45:16,200 Speaker 2: in the United States and the interlocking systems of environmental 753 00:45:16,239 --> 00:45:20,000 Speaker 2: regulations and NIMBYs and everything else, and it's the big 754 00:45:20,080 --> 00:45:22,759 Speaker 2: challenge of the IRA, and it feels like, listening to this, 755 00:45:23,239 --> 00:45:27,040 Speaker 2: it's just accountability sink after accountability sink that's to blame. 756 00:45:27,160 --> 00:45:31,080 Speaker 4: It absolutely is, and it's accountability sinks being put up because 757 00:45:31,120 --> 00:45:34,360 Speaker 4: the people who were meant to be taking the decisions, 758 00:45:34,680 --> 00:45:37,480 Speaker 4: and who in the nineteen fifties and sixties did take 759 00:45:37,520 --> 00:45:41,279 Speaker 4: the decisions, are no longer really able to. They've not 760 00:45:41,400 --> 00:45:44,200 Speaker 4: got confidence in their decisions. They're not sure they're 761 00:45:44,239 --> 00:45:46,600 Speaker 4: going to be able to defend them in litigation, and 762 00:45:46,640 --> 00:45:51,880 Speaker 4: it's mainly because they've lost their executive functions with successive 763 00:45:51,920 --> 00:45:55,480 Speaker 4: staff cuts and retirements and outsourcing contracts. 764 00:45:55,880 --> 00:45:58,000 Speaker 2: Dan Davies, it's so great to catch up with you. 765 00:45:58,080 --> 00:46:01,560 Speaker 2: It's a fascinating conversation, a fascinating way of thinking about the world. 766 00:46:01,920 --> 00:46:05,520 Speaker 2: Highly recommend everyone check out your book, The Unaccountability Machine. 767 00:46:06,000 --> 00:46:07,680 Speaker 2: Thank you so much for coming back on Odd Lots. 768 00:46:08,200 --> 00:46:09,319 Speaker 4: Oh, thanks so much. 769 00:46:09,440 --> 00:46:26,200 Speaker 2: Oh, it was a pleasure. Tracy, I really... I love talking 770 00:46:26,200 --> 00:46:27,960 Speaker 2: to Dan. First of all, it's always... I just like 771 00:46:28,000 --> 00:46:30,480 Speaker 2: hearing his voice. You know, I do think like 772 00:46:30,560 --> 00:46:33,120 Speaker 2: accountability sinks is now going to be one of those 773 00:46:33,200 --> 00:46:35,840 Speaker 2: phrases that I'm just going to now start seeing everywhere. 774 00:46:36,040 --> 00:46:38,279 Speaker 2: And you know there's a whole industry, like McKinsey, right? Like 775 00:46:38,360 --> 00:46:40,239 Speaker 2: that's, like, you know, it's like, again, someone else to 776 00:46:40,560 --> 00:46:42,560 Speaker 2: lay off your workers, et cetera. Like you just start 777 00:46:42,600 --> 00:46:44,240 Speaker 2: seeing how big that is everywhere. 778 00:46:44,280 --> 00:46:46,279 Speaker 3: Well, this is what I was thinking. You know, you 779 00:46:46,320 --> 00:46:49,399 Speaker 3: brought in the US example of building infrastructure or other 780 00:46:49,520 --> 00:46:52,520 Speaker 3: energy projects, and I kept thinking back to Jigar Shah and 781 00:46:52,600 --> 00:46:56,920 Speaker 3: his point about the lack of institutional memory of how 782 00:46:56,960 --> 00:47:00,640 Speaker 3: to build nuclear power plants.
Right, it's not necessarily that 783 00:47:01,239 --> 00:47:05,359 Speaker 3: it's so complicated to get environmental permits and things like that, 784 00:47:05,400 --> 00:47:08,080 Speaker 3: although that is certainly part of it. But it's also 785 00:47:08,239 --> 00:47:11,640 Speaker 3: that the people who used to do this haven't been 786 00:47:11,680 --> 00:47:15,000 Speaker 3: doing it for a long time or are no longer around. 787 00:47:15,080 --> 00:47:18,040 Speaker 3: And that kind of goes to Dan's point about middle 788 00:47:18,080 --> 00:47:22,600 Speaker 3: management being the sort of, like, what's the word I'm... 789 00:47:22,520 --> 00:47:23,840 Speaker 2: Thinking of? Connective tissue. 790 00:47:24,080 --> 00:47:27,120 Speaker 3: Connective tissue is a good one, of like institutional memory. 791 00:47:27,280 --> 00:47:30,120 Speaker 2: Yeah, no, it's all true. You know, in defense of 792 00:47:30,239 --> 00:47:33,440 Speaker 2: the NIMBYs... You know, I keep mentioning this New 793 00:47:33,520 --> 00:47:35,600 Speaker 2: York documentary I've been watching, and I got to the... yeah, 794 00:47:35,600 --> 00:47:36,880 Speaker 2: I want to watch that, you gotta watch it. But 795 00:47:36,920 --> 00:47:39,359 Speaker 2: I got to the episode where it like really talks about 796 00:47:39,400 --> 00:47:42,520 Speaker 2: like Robert Moses and just like plowing these big highways 797 00:47:42,520 --> 00:47:45,960 Speaker 2: through neighborhoods and putting up these like terrible, like terrible 798 00:47:46,080 --> 00:47:48,000 Speaker 2: housing projects that are like, you know... 799 00:47:48,120 --> 00:47:52,080 Speaker 3: Sort of, right, continuously prioritizing highways. 800 00:47:52,239 --> 00:47:55,360 Speaker 2: That is someone who did not have the problem of 801 00:47:56,080 --> 00:47:59,960 Speaker 2: NIMBYs or a million different interlocking constraints on him. Like 802 00:48:00,080 --> 00:48:02,799 Speaker 2: there are drawbacks to when someone has like too 803 00:48:02,920 --> 00:48:06,960 Speaker 2: much autonomy, and it's sort of like, yeah. But now 804 00:48:07,000 --> 00:48:09,520 Speaker 2: it does seem arguably we've gone too far in the 805 00:48:09,560 --> 00:48:13,000 Speaker 2: other direction, in which everyone just clings to their accountability 806 00:48:13,000 --> 00:48:14,520 Speaker 2: sink and can't get anything done. 807 00:48:14,560 --> 00:48:20,040 Speaker 3: Everything is a collective decision, therefore no one can be held responsible. Yeah, 808 00:48:20,040 --> 00:48:23,640 Speaker 3: I feel like there must be a reasonable middle ground. 809 00:48:24,200 --> 00:48:26,560 Speaker 3: And yet I don't know. I'm trying to think if 810 00:48:26,560 --> 00:48:30,040 Speaker 3: I know of, like, any organizations that have completely cracked, 811 00:48:30,320 --> 00:48:33,799 Speaker 3: like, the nut. Just for a little while. Yeah, collectivism 812 00:48:33,880 --> 00:48:38,000 Speaker 3: versus individual responsibility, I don't know. Well, on that note, 813 00:48:38,000 --> 00:48:38,759 Speaker 3: shall we leave it there? 814 00:48:38,840 --> 00:48:39,520 Speaker 2: Let's leave it there. 815 00:48:39,640 --> 00:48:42,320 Speaker 3: This has been another episode of the Odd Lots podcast. 816 00:48:42,360 --> 00:48:45,640 Speaker 3: I'm Tracy Alloway. You can follow me at Tracy Alloway and. 817 00:48:45,560 --> 00:48:48,280 Speaker 2: I'm Joe Weisenthal. You can follow me at The Stalwart. 818 00:48:48,600 --> 00:48:51,200 Speaker 2: Follow our guest Dan Davies.
He's the author of the 819 00:48:51,200 --> 00:48:55,000 Speaker 2: book The Unaccountability Machine, Why Big Systems Make Terrible Decisions 820 00:48:55,000 --> 00:48:56,319 Speaker 2: and How the World Lost Its Mind. 821 00:48:56,360 --> 00:48:57,040 Speaker 3: Go check it out. 822 00:48:57,239 --> 00:49:01,200 Speaker 2: His handle is at d squared Digest. Follow our producers 823 00:49:01,239 --> 00:49:04,640 Speaker 2: Carmen Rodriguez at Carmen Arman, Dashiell Bennett at Dashbot 824 00:49:04,680 --> 00:49:08,280 Speaker 2: and Kilbrooks at Kilbrooks. Thank you to our producer Moses Ondem. 825 00:49:08,560 --> 00:49:11,160 Speaker 2: For more Odd Lots content, go to Bloomberg dot com slash 826 00:49:11,200 --> 00:49:13,279 Speaker 2: odd lots, where we have transcripts, a blog, and a 827 00:49:13,320 --> 00:49:15,720 Speaker 2: newsletter, and you can chat about all of these topics 828 00:49:15,760 --> 00:49:18,920 Speaker 2: twenty four seven in our Discord: Discord dot gg 829 00:49:19,080 --> 00:49:20,839 Speaker 2: slash odd lots. 830 00:49:20,480 --> 00:49:24,240 Speaker 3: And speaking of personal accountability, if you like Odd Lots, 831 00:49:24,360 --> 00:49:28,160 Speaker 3: please leave us a positive review on your favorite podcast platform. 832 00:49:28,360 --> 00:49:31,080 Speaker 3: And remember, if you are a Bloomberg subscriber, you can 833 00:49:31,160 --> 00:49:34,399 Speaker 3: listen to all of our episodes absolutely ad free. All 834 00:49:34,440 --> 00:49:36,840 Speaker 3: you need to do is connect your Bloomberg account with 835 00:49:36,960 --> 00:49:39,680 Speaker 3: Apple Podcasts. In order to do that, you can find 836 00:49:39,719 --> 00:49:43,359 Speaker 3: the Bloomberg channel on Apple Podcasts and follow the instructions there. 837 00:49:43,640 --> 00:49:44,480 Speaker 3: Thanks for listening.