Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

And I'm Tracy Alloway.

Tracy, have I ever told you, like, my idea for, like, a two-part podcast, or, like, a series of two-part podcasts?

A series of two-parters? Is it the debate one?

No, it's different. Okay, so I don't know about you, but most of the time, after we do these interviews, I usually have questions that I kick myself for not having asked.

Oh, yes, yes. This often happens on the podcast, because we often touch on kind of widely varying topics, and we're not experts in a lot of the things that we talk about. And often the first episode is sort of where you get to know your subject matter, and then you leave it with even more questions.

Yeah. So I often kick myself, like, oh, I should have obviously asked that, obviously. And then the other thing that happens is, the episode comes out, and then, like, people on Twitter and elsewhere, they'll talk about it.
They talk about it, and they're like, what I'm curious about is actually... and I'm like, oh, that's a really good question too, I should have thought of that. So I thought, like, a thing that we should do maybe one day is have all episodes be two-parters, where we do an interview with a guest, take a week to sort of marinate on it, think about what are some questions we wish we would have asked, and then have the second episode scheduled.

I really like that idea. It's almost the octopus model of podcast episodes, where, like, one episode just springs forth a dozen arms and legs that you can talk about forever.

So today we're kind of gonna be doing that. I'm not sure what the record is, but this might be close to the soonest we've ever had a guest on after they appeared on the show. This is a pilot for podcast two-parters.
The only other time I can think of is we talked to Claudia Sahm twice before the pandemic, and that was because it was sort of, like, her general views on how to forecast recessions, and then, like, two weeks later the pandemic was in view, and it's like, oh shoot, this might be really bad, so we had her on really fast. But this is going to be close to that record, I think.

Yeah, sounds good. All right, let's do it.

So we recently talked to Patrick McKenzie. He is a technology infrastructure and financial infrastructure specialist. He worked for Stripe for six years; he's currently an advisor. He's the author of the Bits about Money newsletter. And we talked to him about, like, corporate IT: why is it the way it is? Why does it seem to be, like, years and years behind what we think of as the cutting edge of software? Why is it so old and clunky? Why does it have these big technical issues that can take a while to fix? That was a great conversation. The public loved it. But there's a lot going on in software these days.
And the other sort of big trend that we haven't really talked about is that, for the first time in, I don't know, maybe, like, fifteen years, we've been seeing all these tech layoffs, right? And I think there was a little bit of tension in that episode, in that we were talking about why corporate software is so bad. So it's almost like, well, obviously there's a need for better software, and yet all these big tech companies that ostensibly provide these services are laying people off. But also, in the broader macro picture: since we recorded that episode, we had a payrolls report that came out much, much stronger than anyone expected. And yet we've seen these big tech companies lay off people. And so the question obviously becomes, is this something specifically about tech, or are these layoffs sort of the first sign of something broader to come in the economy?
And the other thing that everyone points out, when you see these announcements from Meta and Alphabet and various startups and Microsoft and Amazon, they've all done it, is, like, they added so many jobs over the last two years that these layoffs are actually, like, fairly small in, uh, the grand scheme of things, even for these companies, based on the amount of hiring that they've done.

Yeah, but I saw a figure from Goldman Sachs, I think it was Jan Hatzius, and he was talking about how, in the tech sector, most of the companies that have been laying people off grew their head count enormously since the pandemic, basically because they thought all the pandemic trends were going to keep going. Maybe there's a little bit of labor hoarding. But it's a big figure.

Absolutely. You know, in our industry, journalism, when journalists lose their jobs, there are always trolls on Twitter saying, oh, learn to code. You know, that's, like, a thing. I think at one point even, like, Twitter started banning people for saying that.
But I guess the question is, right now, when you see these layoffs, like, should we learn to code, or is that not, you know, is that not the career safety that it used to be? So, all kinds of questions about what is going on in the market for tech talent.

I'm sure someone's going to tell us to learn to code. For the record, I can code, just in, like, very non-useful languages.

Well, I coded in BASIC.

Like, yeah, yeah, exactly. Like C++, and, like, some really basic HTML. But all right, let's talk to someone who knows more about this question than we do. We're bringing back Patrick. Thank you so much for coming back on the podcast.

Thanks very much for having me.

Absolutely. So I guess the question is, before we even start talking about the recent layoff announcements, why don't we start with, like, the hiring boom that we really saw over the last two years, just massive amounts of headcount added at all these companies, we know who they are. What drove that?
So how about we roll back history to 2019. If you're looking at recent history as of 2019, tech had been on sort of an uninterrupted series of a bunch of very good years: broad-based expansion across the entire industry, basically continuing to ride the wave that had happened since the late aughts (is that how you say it in English?), with the consolidation of mobile gains, etcetera, etcetera. Then the pandemic happened, and there was a brief pause of, okay, is this going to be an absolutely catastrophic event for the entire world economy? Many bad things happened during the pandemic; the way it played out for tech was probably not how anyone would have expected.
There was sort of a one-two punch. One, a combined fiscal response from governments, both in the United States and worldwide, to stave off a huge economic disaster, which had the effect of both putting money into consumers' pockets and also juicing the markets for assets, for example tech stocks, which we'll come back to the importance of in a moment. Two, a lot of the customers were, due to various non-pharmaceutical interventions, sitting at home with very little to do other than use the internet. And so for a lot of commerce that had been possible on the internet before, the share of it that was soaked up by the internet, in, uh, both sort of, like, semi-discretionary places like food delivery, but also much less discretionary places like, you know, core supermarkets, suddenly shifted online in a very, very fast way.
And so this combination of there's more money sloshing around and more of it is falling into the online bucket led to absolutely blockbuster years for tech companies. And it was a real, like, trying-to-keep-your-fingers-on-the-rocket feeling internally at the companies: like, the amount of new users that were onboarding, the rate of growth of the business, the raw volumes of stuff that was going through the pipes made it, like, difficult to keep everything up and running. On the good news front, the businesses largely did successfully keep up and running during a time when society very much needed them to. They also started to, like, readjust their projections of what the future would look like. And for a while it was looking like... yeah, the phrase that was going around was, decades of growth were happening every couple of weeks, in terms of, you know, our anticipated long-term shift of the offline economy into the online economy.
And there was a big question of how long does that continue for, and is that pulling forward growth that would have happened in the future, is it a one-time spike, etcetera, etcetera. Due to various structural and competitive dynamics, a lot of firms bet simultaneously: this is a pretty durable change. We find ourselves crushed by the amount of demand we're seeing right now. We're going to need to hire, and hire aggressively, to deal with this and to position ourselves for what we see as the, you know, eventual coming-out-of-the-pandemic future. And as a result of this, mature companies, the Googles, Amazons, and Facebooks of the world, were hiring at, like, substantial year-over-year growth rates across large portions of their business. Somewhat earlier-stage companies, companies that might look like a Stripe (even though Stripe is, uh, somewhat larger these days), or early-stage startups, were onboarding multiples of their pre-pandemic head count over the course of the pandemic. So, huge expansion during the, during that interval. And then, as we came out of the pandemic, companies assessed a number of things.
One, the growth rates tended to go back to historical norms, rather than this shot in the arm that the pandemic was offering. Importantly, and, you know, tech is a wide sector, it touches every part of the economy these days, so it's difficult to speak in huge generalizations, but at a top-line level, things did not decline back to 2019. And again, 2019 was not a bad year for tech. It was, you know, a pretty good year after a number of pretty good years. So we haven't gone back to the pre-pandemic baseline. We haven't even stopped growing, in a number of cases. The growth curve has just bent downwards, and so the sustained, rapid headcount growth over time didn't look like it could be sustained. And then companies started to look at things that they had allowed to happen over the course of the pandemic. To characterize these broadly:
One of the things that happened during the pandemic was, due to the lockdowns and the inadvisability of having large numbers of people congregate in small pockets of air, a bunch of companies went to both remote work and remote hiring, where they might not have had a huge amount of institutional experience with that model of working before. And after two to three years of working with these newer cohorts of people, they've found that there are some practices that they want to continue from this remote-work world into the future, and that there's some amount of internal impetus to return to office and have sort of a cultural reset around the office or headquarters, etcetera, as the sort of beating center of these firms. I've worked remote for most of my career myself; I'm a broad fan of the model.
Let's say that there was some cultural tension in companies about where the locus of activity is going to be, whether it's going to be online, in Zoom meetings and Slack all the time, or in the office, with high-bandwidth communication directly with trusted peers, and a lot of companies wanted to have a bit of a pullback towards the office. And then they're looking more granularly at the classes of people they hired over the last couple of years, and found that, in comparison to prior classes, there was a bit of cultural drift relative to where the companies want their baselines to be, and also, in some cases, a bit of a measured productivity difference versus where they wanted their baselines to be. That's sort of expected, because when you're pulling out all the stops to hire, you, like, necessarily have to be a little less choosy than you normally are. You have, you know, to the extent that you ascribe any value at all to the in-person interview loop, which I ascribe relatively little value to, but hopefully, like, slightly greater than zero...
...you lose that amount of signal, and they were sort of hiring in a slightly more challenged fashion than usual. And so I think what companies will be pretty quiet about saying, but will say to themselves, is: we probably have a few more regrets in the pandemic-era hiring classes than we did in earlier hiring classes, as a percentage.

Patrick, this actually leads to something that I want to ask you. What does bloat actually look like in the tech sector? And, you know, is it something that only emerges as business activity actually slows down, or, even in 2021, would you have characterized tech as bloated?

So it's difficult. Tech sort of subsumes more and more of the economy into its ever-increasing embrace, so it's hard to make, like, huge, broad-brush assertions across all of it. But let's see where to start here. So, one, the things that are done in these large companies are extremely varied. People might have an image that, like, most people who work at a Google are engineers. That's actually not the case.
Depending on the company we're talking about, something like twenty percent or more of the people who work at the company are technologists, broadly defined: they are software engineers, they're system administrators, they're designers, at some companies reporting to the same division. And then the rest are every sort of worker that you would have at any company in the economy: lawyers, regulatory people, customer support agents, etcetera, etcetera, etcetera. Management, layers upon layers of management. So what does company growth look like? In one case, it is staffing up more teams to work on products that already exist. Sometimes that's staffing teams that sort of, like, grow with the rate of usage of your products; so, like, customer service teams typically grow relatively linearly with the usage of your service. Sometimes it's teams that grow relatively linearly with the size of your organization. So as companies were having these sort of, like, unprecedented numbers of employees getting onboarded every year, they needed larger recruiting divisions to staff up their other teams, and that's just based on, like, the productivity math of a recruiter.
And you can, like, finger-to-the-wind it: if you hire a recruiter, that recruiter will be able to hire twenty-five people in the year. And so if you need to hire four thousand people, then, you know, working the math backwards, you require a hundred and sixty recruiters that you didn't have previously. That will tend to cause your recruiting division to get larger as you are doing a rapid expansion, and then it will contract faster than the rest of your company will when you decide to take your foot off the gas pedal. So those are the things that are sort of less inside of your control; you just need to keep doing them to run the business. And then you're making some more speculative investments on, like, what is our new product lineup going to look like? What features are we going to add? And so the basic unit of organization within an engineering organization these days is a single engineering team, which will typically be, like, five to eight people, and that team has the mental bandwidth to deal with three relatively narrowly scoped problems.
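Patrick's finger-to-the-wind recruiter math can be written out as a quick sketch. The function name is made up for illustration, and the twenty-five-hires-per-recruiter-per-year figure is his illustrative number from the conversation, not an industry benchmark:

```python
def recruiters_needed(planned_hires: int, hires_per_recruiter_per_year: int = 25) -> int:
    """Recruiters required to make `planned_hires` hires in a year.

    Rounds up, since a fractional recruiter still means hiring one more person.
    """
    # Ceiling division: -(-a // b) rounds a/b up using only integer math.
    return -(-planned_hires // hires_per_recruiter_per_year)

# Working the math backwards, as in the conversation:
print(recruiters_needed(4000))  # 4,000 planned hires -> 160 recruiters
```

The same shape of math explains why the recruiting division contracts first when hiring plans shrink: its headcount is a direct function of the hiring target, not of the size of the business.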
And so the more that you want your software, services, suite, etcetera, to do, the more, like, narrowly scoped problems come into its domain, and the more, like, five-to-eight-person engineering teams you need. And so you might find yourself in a position where you've hired, like, five to eight people to work on three relatively narrowly scoped problems somewhat opportunistically. And then when you, you know, come to 2023 and are thinking very rigorously around, like, okay, we think we're a little bigger than we were when we were efficient back a couple of years ago, we think the economic environment might not be as strong as we were modeling, which of all the problems in our company are the ones that we definitely need to keep focusing on, and which can we defer until later, or just aren't core to our business right now? Then perhaps, like, some of these narrowly scoped problems are not at the top of our list.
And then if you consider, you know, like, this product that we thought we would bring to market in 2023, maybe it will not be brought to market until later; then there might be, like, ten teams implicated by that that you no longer have prioritized work for.

I have a lot of questions. You know, when Elon bought Twitter, he was, like, you know, much more aggressive with the layoffs than anything else that we've seen. There were all these, like, VCs and stuff commenting that the dirty secret in Silicon Valley is that all these companies could do that, that they have fifty percent of their employees not really working on anything and not really contributing anything, and, like, thank you, Elon, for showing that this could be done. And Twitter still is operating, although I don't know how the business is, or whether he cut too deep to the bone or whatever. But, like, when you hear that, is that the case? That, just, like, over the years, setting aside the unrealistic expectations of 2021 and maybe 2022, was there just wide-scale overhiring relative to the needs of the business?
So tech has been in sort of a land-grab mode for essentially all of my adult life. We certainly haven't hit the asymptote of how many things in the economy can be orchestrated by software. We certainly haven't hit the asymptote of how many human-to-human interactions will be intermediated by a technical system happening over a smartphone, etcetera, etcetera. In that sort of land-grab mode, you aren't simply, like, trying to answer what is the minimal set of things we can do with the minimal number of people, but are sort of opportunistically looking at what are the next ten things that we can try, such that one of them becomes a company-defining product, feature, etcetera, etcetera. I have a little bit of, like, reflexive contrarianism when people say all tech companies are massively overstaffed. Could you cut eighty percent of the people who work at tech companies and still have something functional at the end of the day? Probably true, but that would be extremely painful.
But if you 331 00:18:11,240 --> 00:18:14,800 Speaker 1: went into a very different mode of operation and just 332 00:18:14,920 --> 00:18:17,720 Speaker 1: wanted them to continue the products and services they had 333 00:18:17,760 --> 00:18:21,959 Speaker 1: three years ago, possibly that could be done. Probably wouldn't 334 00:18:21,960 --> 00:18:24,720 Speaker 1: be optimal for any of them. That's one major reason 335 00:18:24,760 --> 00:18:28,840 Speaker 1: why nobody does it. There's also some knock-on, like, cultural, etcetera, 336 00:18:28,840 --> 00:18:31,239 Speaker 1: effects that make it virtually unthinkable. If you were an 337 00:18:31,240 --> 00:18:34,560 Speaker 1: executive at a tech company and you were sufficiently 338 00:18:34,560 --> 00:18:37,080 Speaker 1: in your cups and had a heart to 339 00:18:37,080 --> 00:18:39,399 Speaker 1: heart with someone and said, what's the true number? 340 00:18:39,480 --> 00:18:43,640 Speaker 1: Like, if I could wave a magic wand and no consequences, 341 00:18:44,000 --> 00:18:47,000 Speaker 1: where would our staffing be? It would probably be like eighty- 342 00:18:47,160 --> 00:18:49,880 Speaker 1: five to ninety percent of what it is currently. I think 343 00:18:49,920 --> 00:18:51,679 Speaker 1: most people would say, like, oh, there's a 344 00:18:51,680 --> 00:18:54,760 Speaker 1: bit of, there's a bit of, like, I hate the 345 00:18:54,760 --> 00:18:58,120 Speaker 1: word fat in this context, but, you know, a little 346 00:18:58,119 --> 00:19:00,480 Speaker 1: bit of fluff around the edges, but we're not 347 00:19:00,600 --> 00:19:04,879 Speaker 1: systemically in a terrible place. And I think, you know, you 348 00:19:04,880 --> 00:19:07,560 Speaker 1: would get different numbers from different people in different 349 00:19:07,560 --> 00:19:10,040 Speaker 1: parts of the organization, but that feels like plus or 350 00:19:10,080 --> 00:19:12,439 Speaker 1: minus right to me.
It should be noted that I was 351 00:19:12,480 --> 00:19:15,240 Speaker 1: a mere worker bee rather than the sort of executive 352 00:19:15,280 --> 00:19:34,440 Speaker 1: that would be tasked with making that kind of decision. Patrick, 353 00:19:34,640 --> 00:19:39,600 Speaker 1: you mentioned the sort of impetus towards creating company-defining features, 354 00:19:39,640 --> 00:19:42,160 Speaker 1: and this is also something I've always wondered: is there 355 00:19:42,200 --> 00:19:47,320 Speaker 1: a bias in tech towards creating new products, and are 356 00:19:47,440 --> 00:19:52,240 Speaker 1: employees and engineers, you know, rewarded for doing new things 357 00:19:52,440 --> 00:19:57,560 Speaker 1: rather than maybe maintaining the old ones and perfecting those? Oh, 358 00:19:57,720 --> 00:20:00,199 Speaker 1: this is an extremely important thing to understand in the 359 00:20:00,200 --> 00:20:03,000 Speaker 1: behavior of the large tech companies from outside of them: 360 00:20:03,040 --> 00:20:06,000 Speaker 1: they all have what's called a perf process. In 361 00:20:06,040 --> 00:20:09,119 Speaker 1: the industry it's called "perf"; outside, it's a performance review, 362 00:20:09,680 --> 00:20:14,159 Speaker 1: and the performance reviews are largely how a company takes 363 00:20:14,840 --> 00:20:17,439 Speaker 1: creative work that is done over this time scale of, 364 00:20:17,520 --> 00:20:20,680 Speaker 1: like, quarters and years, and is often sort of 365 00:20:21,200 --> 00:20:24,320 Speaker 1: illegible and very varied, and reduces it to a number 366 00:20:24,440 --> 00:20:27,080 Speaker 1: such that the company can dole out things of value, 367 00:20:27,160 --> 00:20:32,119 Speaker 1: like promotions and bonuses and career paths, etcetera, etcetera.
And 368 00:20:32,320 --> 00:20:36,400 Speaker 1: perf happens on a semi-annual or annual basis, and 369 00:20:37,280 --> 00:20:39,440 Speaker 1: the way perf works at most large tech companies 370 00:20:39,480 --> 00:20:42,480 Speaker 1: is heavily biased in the direction of getting your name 371 00:20:42,600 --> 00:20:46,040 Speaker 1: attached to new things that shipped in the world, versus, 372 00:20:46,440 --> 00:20:48,720 Speaker 1: you know, I was assigned to this legacy product, the 373 00:20:48,720 --> 00:20:51,080 Speaker 1: product did not go down for six months, you should 374 00:20:51,080 --> 00:20:53,800 Speaker 1: definitely give me a bonus on that basis. Oddly enough, 375 00:20:54,400 --> 00:20:57,440 Speaker 1: this is not straightforwardly the thing that is in the 376 00:20:57,480 --> 00:21:01,600 Speaker 1: company's interest, because all of the money is made by existing... well, 377 00:21:01,640 --> 00:21:03,840 Speaker 1: not all the money. The supermajority of money at the 378 00:21:03,840 --> 00:21:07,359 Speaker 1: tech companies is made by satisfying customers you already have 379 00:21:07,520 --> 00:21:11,200 Speaker 1: rather than getting new customers, and the supermajority of money 380 00:21:11,240 --> 00:21:13,679 Speaker 1: is made on your oldest and most mature products rather than 381 00:21:13,720 --> 00:21:18,000 Speaker 1: the new stuff. But institutionally, tech companies bias towards: we 382 00:21:18,080 --> 00:21:20,480 Speaker 1: want our best people to be on the new things 383 00:21:20,560 --> 00:21:24,600 Speaker 1: all of the time.
And if your individual best people 384 00:21:25,000 --> 00:21:28,120 Speaker 1: want to be, you know, doing the hard yards that 385 00:21:28,280 --> 00:21:31,200 Speaker 1: keep the old stuff running, they will quickly be dissuaded 386 00:21:31,240 --> 00:21:33,920 Speaker 1: by their mentors and managers, etcetera, and told: no, no, 387 00:21:34,400 --> 00:21:37,560 Speaker 1: that is not the way to exceed expectations. Like, 388 00:21:37,920 --> 00:21:41,000 Speaker 1: if you only do great maintenance work for 389 00:21:41,040 --> 00:21:44,639 Speaker 1: the next couple of years, you will be, you know, 390 00:21:45,119 --> 00:21:48,600 Speaker 1: severely career-limited here. So figure out something new to 391 00:21:48,640 --> 00:21:50,439 Speaker 1: do and make sure your name is attached to it 392 00:21:50,520 --> 00:21:53,439 Speaker 1: in a way that is legible to your manager and 393 00:21:53,520 --> 00:21:57,240 Speaker 1: your manager's manager in this performance review process. So let's 394 00:21:57,280 --> 00:22:00,400 Speaker 1: talk about the layoffs that we've seen, because you said 395 00:22:00,440 --> 00:22:02,840 Speaker 1: something interesting in your first answer, which is that, sort 396 00:22:02,880 --> 00:22:08,800 Speaker 1: of, like, hiring discipline, hiring quality during those crazy years 397 00:22:07,720 --> 00:22:12,720 Speaker 1: of 2021 and 2022 may have been loose; maybe 398 00:22:12,920 --> 00:22:15,520 Speaker 1: the standards were a little lower, or maybe people 399 00:22:15,600 --> 00:22:19,200 Speaker 1: just didn't fit or something like that. When companies these 400 00:22:19,280 --> 00:22:23,000 Speaker 1: days are now, or recently, making the decisions about who 401 00:22:23,040 --> 00:22:26,320 Speaker 1: they're going to let go, how skewed is it towards 402 00:22:26,560 --> 00:22:28,960 Speaker 1: that sort of recent cohort?
Because the other thing I 403 00:22:29,000 --> 00:22:31,840 Speaker 1: could see is that, look, at many companies you probably 404 00:22:31,880 --> 00:22:35,119 Speaker 1: have people who have been there forever who are getting 405 00:22:35,119 --> 00:22:39,359 Speaker 1: paid extremely high salaries, or very good salaries, just based 406 00:22:39,400 --> 00:22:42,280 Speaker 1: on the fact that they got some bump every single year. 407 00:22:42,960 --> 00:22:45,560 Speaker 1: Maybe they're not pulling their weight, to some perceived degree, 408 00:22:45,600 --> 00:22:47,960 Speaker 1: as much as they used to. So how much 409 00:22:48,000 --> 00:22:49,600 Speaker 1: of the, you know, when the 410 00:22:50,000 --> 00:22:52,560 Speaker 1: executives look and say, okay, we're gonna make cuts, how 411 00:22:52,640 --> 00:22:55,639 Speaker 1: much was it skewed towards the new cohort versus seen 412 00:22:55,640 --> 00:22:58,560 Speaker 1: as, like, this is an opportunity to get rid of 413 00:22:58,600 --> 00:23:01,560 Speaker 1: some highly paid employees who maybe don't add as much value 414 00:23:01,560 --> 00:23:05,040 Speaker 1: as they once did? So, a disclaimer off the top: layoffs 415 00:23:05,040 --> 00:23:08,360 Speaker 1: are, like, understandably traumatic for the people who go through them. 416 00:23:09,200 --> 00:23:11,280 Speaker 1: I don't want to minimize that. At the same time, 417 00:23:11,320 --> 00:23:14,320 Speaker 1: I think we often, particularly as workers in this industry, 418 00:23:14,720 --> 00:23:18,400 Speaker 1: sort of, like, abdicate responsibility for understanding the, like, structures 419 00:23:18,440 --> 00:23:20,520 Speaker 1: that cause these things to happen in ways that are 420 00:23:20,560 --> 00:23:23,080 Speaker 1: not in our interests.
So broadly it's good to 421 00:23:23,119 --> 00:23:26,240 Speaker 1: have, like, open conversations about how these sorts of decisions 422 00:23:26,240 --> 00:23:29,160 Speaker 1: are made. I think it is different on a firm- 423 00:23:29,160 --> 00:23:32,840 Speaker 1: by-firm basis, but broadly speaking, you would not want 424 00:23:33,440 --> 00:23:36,200 Speaker 1: simply to, like, roll back the last six months 425 00:23:36,240 --> 00:23:38,879 Speaker 1: of your hiring. There's a couple of different reasons for that. 426 00:23:39,320 --> 00:23:43,280 Speaker 1: One is that when you're dealing with these complex ecosystems 427 00:23:43,280 --> 00:23:48,119 Speaker 1: (a sufficiently large company is an ecosystem unto itself), there's all sorts 428 00:23:48,119 --> 00:23:50,959 Speaker 1: of levers that you are, like, managing in parallel, and 429 00:23:51,000 --> 00:23:54,280 Speaker 1: one of those levers is that you are attempting to 430 00:23:54,359 --> 00:23:57,800 Speaker 1: balance the seniority ranges in various parts of your organization 431 00:23:58,160 --> 00:24:00,720 Speaker 1: such that you always have a mix, within some error 432 00:24:00,760 --> 00:24:04,520 Speaker 1: bars, of how many people you have that are acclimating 433 00:24:04,600 --> 00:24:07,359 Speaker 1: to the company, versus how many have acclimated and 434 00:24:07,359 --> 00:24:10,040 Speaker 1: could do productive work, versus how many are in that 435 00:24:10,240 --> 00:24:13,439 Speaker 1: senior mode where they can, like, be parachuted in to consult on 436 00:24:13,560 --> 00:24:17,800 Speaker 1: things and do the architecture stuff that your more intermediate 437 00:24:17,800 --> 00:24:21,200 Speaker 1: employees might not be able to do.
Yet, if you 438 00:24:21,840 --> 00:24:25,719 Speaker 1: sort of create a bubble in the pipeline by concentrating 439 00:24:25,720 --> 00:24:27,760 Speaker 1: your cuts in the people that were only hired in 440 00:24:27,760 --> 00:24:31,160 Speaker 1: the last six months to two years, then you are 441 00:24:31,560 --> 00:24:33,600 Speaker 1: setting yourself up for a bubble a couple of years 442 00:24:33,600 --> 00:24:36,879 Speaker 1: from now, where you have far too few people at 443 00:24:36,920 --> 00:24:39,520 Speaker 1: a portion of the experience curve to do work that 444 00:24:39,600 --> 00:24:42,200 Speaker 1: you urgently need done on a week-by-week, 445 00:24:42,240 --> 00:24:45,639 Speaker 1: quarter-by-quarter basis. And so if you come to 446 00:24:45,680 --> 00:24:48,840 Speaker 1: the conclusion that we've hired a few too many people 447 00:24:48,880 --> 00:24:51,359 Speaker 1: over the last couple of months, 448 00:24:51,400 --> 00:24:53,359 Speaker 1: what are we going to do about that? You have 449 00:24:53,480 --> 00:24:57,600 Speaker 1: to distribute your cuts over a larger number of cohorts 450 00:24:57,600 --> 00:24:59,960 Speaker 1: than the most recent cohorts, or you will set yourself 451 00:25:00,000 --> 00:25:03,040 Speaker 1: up for some pain. There's also some compliance and legal 452 00:25:03,080 --> 00:25:05,320 Speaker 1: issues that come up, with age already 453 00:25:05,359 --> 00:25:09,240 Speaker 1: a protected class in particular jurisdictions, which also plays 454 00:25:09,280 --> 00:25:11,760 Speaker 1: into it a little bit. But the biggest reason is 455 00:25:11,800 --> 00:25:15,400 Speaker 1: to avoid causing operational issues for your company. Layoffs 456 00:25:15,480 --> 00:25:19,600 Speaker 1: as performance management, that is a thing that exists in 457 00:25:19,640 --> 00:25:22,520 Speaker 1: the world.
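The pipeline-bubble dynamic described here can be sketched as a toy simulation. All of the numbers, level names, and the two-year promotion lag below are hypothetical, purely for illustration; they are not anything a real company uses.

```python
# Toy model of the "bubble in the pipeline": cohorts of new hires take
# roughly `lag` years to become the intermediate-level backbone, so cuts
# concentrated in the newest cohort surface as a shortage years later.

def intermediates_in(year, cohorts, lag=2, span=2):
    """Headcount at the (hypothetical) intermediate level in `year`:
    everyone hired between `lag` and `lag + span` years earlier."""
    return sum(size for hired, size in cohorts.items()
               if lag <= year - hired < lag + span)

# Steady hiring of 100 engineers a year...
cohorts = {2019: 100, 2020: 100, 2021: 100, 2022: 100}

# ...until layoffs are concentrated entirely in the newest cohort,
cohorts[2022] = 20
# and hiring then resumes at the old pace.
cohorts[2023] = 100

# Right after the cuts, things look fine: the untouched 2020 and 2021
# classes staff the intermediate level.
today = intermediates_in(2023, cohorts)   # 200

# Two years out, the gutted 2022 class is supposed to be that backbone.
later = intermediates_in(2025, cohorts)   # 120
```

Distributing the same number of cuts across all cohorts keeps `later` close to `today`, which is the operational point being made.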
And so, you know, if you were hearing 458 00:25:22,640 --> 00:25:25,080 Speaker 1: skeptical VCs on Twitter, what they would say about large 459 00:25:25,080 --> 00:25:27,640 Speaker 1: software companies is not merely that they were a little 460 00:25:27,640 --> 00:25:31,000 Speaker 1: bit flabby, but that they were a little bit, uh, 461 00:25:31,240 --> 00:25:33,280 Speaker 1: self-assured of their position in the world, and 462 00:25:33,359 --> 00:25:35,680 Speaker 1: had too many good years in a row. And if 463 00:25:35,720 --> 00:25:38,119 Speaker 1: you got attached to them, you could get a 464 00:25:38,200 --> 00:25:40,639 Speaker 1: job in a corner office and not do all that 465 00:25:40,720 --> 00:25:44,280 Speaker 1: much and still be fine. I think that is a 466 00:25:44,359 --> 00:25:47,199 Speaker 1: little exaggerated, but let's say there's certainly cases of it, 467 00:25:47,280 --> 00:25:51,080 Speaker 1: and there's certainly some people who, like, mature into a career 468 00:25:51,080 --> 00:25:54,320 Speaker 1: where they continue being impactful over years and decades, and 469 00:25:54,359 --> 00:25:56,840 Speaker 1: some people end up in sort of a tenured-professor 470 00:25:56,920 --> 00:26:00,280 Speaker 1: mode, where they've become critical to the organization because they 471 00:26:00,440 --> 00:26:02,600 Speaker 1: know a couple of things that the organization needs to know, 472 00:26:03,000 --> 00:26:06,120 Speaker 1: but they don't bring the same intensity that they used 473 00:26:06,119 --> 00:26:08,240 Speaker 1: to earlier in their career. And then there are some people 474 00:26:08,280 --> 00:26:13,720 Speaker 1: who have, like, successfully created a niche for themselves inside 475 00:26:13,720 --> 00:26:16,920 Speaker 1: the company, but the company might not desire it to exist. 476 00:26:17,400 --> 00:26:20,400 Speaker 1: And nobody wakes up in the morning and says, today 477 00:26:20,400 --> 00:26:24,000 Speaker 1: I want to do layoffs.
But given a circumstance where 478 00:26:24,119 --> 00:26:28,400 Speaker 1: everyone in the industry is doing layoffs, some executives might say, okay, 479 00:26:28,760 --> 00:26:31,400 Speaker 1: it is a good time to reevaluate and, like, turn 480 00:26:31,480 --> 00:26:34,120 Speaker 1: up the heat a little bit on our performance management 481 00:26:34,160 --> 00:26:36,639 Speaker 1: and say, okay, is there anyone who has been coasting 482 00:26:36,680 --> 00:26:39,280 Speaker 1: a little too long? Is there anyone who has, uh, 483 00:26:39,400 --> 00:26:43,000 Speaker 1: you know, created a secure little nest for themselves in 484 00:26:43,000 --> 00:26:44,920 Speaker 1: a way that that nest does not add a lot 485 00:26:44,920 --> 00:26:47,240 Speaker 1: of value to the company? Given that we need 486 00:26:47,280 --> 00:26:50,920 Speaker 1: to usher some people on to new positions, let's start 487 00:26:50,960 --> 00:26:53,399 Speaker 1: with that first and then move to the cuts that 488 00:26:53,440 --> 00:26:56,040 Speaker 1: are going to take more mental energy to do. You know, 489 00:26:56,160 --> 00:27:00,479 Speaker 1: we're talking broadly about hiring discipline and the idea of bloat, 490 00:27:00,760 --> 00:27:04,879 Speaker 1: and this is a slightly loaded question, but to what degree, 491 00:27:05,200 --> 00:27:09,800 Speaker 1: if any, do you think the sort of maybe monopolistic 492 00:27:09,960 --> 00:27:13,159 Speaker 1: moat that some big tech companies have built around their 493 00:27:13,200 --> 00:27:18,960 Speaker 1: businesses has contributed to some complacency on the hiring front?
494 00:27:20,600 --> 00:27:23,760 Speaker 1: I'm a little averse to the word monopolistic, but I 495 00:27:23,800 --> 00:27:26,080 Speaker 1: think I get what you're getting at, in that there 496 00:27:26,160 --> 00:27:28,639 Speaker 1: is certainly a lot of rent created in the technology 497 00:27:28,640 --> 00:27:33,400 Speaker 1: industry, where these are some of the most effective businesses 498 00:27:33,480 --> 00:27:37,600 Speaker 1: ever created in any industry. Google AdWords will print a 499 00:27:37,720 --> 00:27:42,200 Speaker 1: ginormous amount of money next year, and almost no amount 500 00:27:42,320 --> 00:27:45,920 Speaker 1: of action taken by any set of actors internal 501 00:27:46,000 --> 00:27:49,080 Speaker 1: or external to Google will cause Google AdWords to 502 00:27:49,119 --> 00:27:52,840 Speaker 1: not be worth many, many, many billions of dollars, and 503 00:27:52,920 --> 00:27:57,520 Speaker 1: so the margins on it are very high as well, 504 00:27:57,560 --> 00:28:01,320 Speaker 1: in comparison to, you know, we were talking last time 505 00:28:01,320 --> 00:28:04,480 Speaker 1: about the airline industry, where the airline industry has struggled 506 00:28:04,560 --> 00:28:08,960 Speaker 1: mightily to maintain, like, single-digit-percentage positive margins over 507 00:28:09,000 --> 00:28:12,439 Speaker 1: a multi-decade timeframe. Tech doesn't have that problem. The 508 00:28:12,520 --> 00:28:16,200 Speaker 1: nature of these very sticky products, the sheer market size of them, 509 00:28:16,440 --> 00:28:19,240 Speaker 1: and the margins do tend to create a little more 510 00:28:19,400 --> 00:28:25,640 Speaker 1: room for that flabbiness than exists in many industries 511 00:28:25,640 --> 00:28:29,160 Speaker 1: that have more of a cutthroat reputation. This is sort 512 00:28:29,160 --> 00:28:32,320 Speaker 1: of the polar opposite question.
But nowadays we hear a 513 00:28:32,320 --> 00:28:37,000 Speaker 1: lot about the possibility of companies hoarding labor when it 514 00:28:37,040 --> 00:28:40,000 Speaker 1: comes to tech. How much of that do you think 515 00:28:40,040 --> 00:28:42,280 Speaker 1: has actually gone on, in the sense that, do you 516 00:28:42,320 --> 00:28:47,840 Speaker 1: see tech companies opportunistically hiring people just so their competitors 517 00:28:47,920 --> 00:28:51,120 Speaker 1: can't get their hands on them? I've heard this theory 518 00:28:51,160 --> 00:28:54,440 Speaker 1: advanced many times, and honestly, I don't think it is 519 00:28:54,560 --> 00:28:59,360 Speaker 1: very explanatory. Sometimes it's phrased: Google would rather hire 520 00:28:59,640 --> 00:29:02,800 Speaker 1: a particularly talented engineer so that they don't create a 521 00:29:02,800 --> 00:29:06,680 Speaker 1: startup and then eventually become competition to one of Google's products. 522 00:29:07,280 --> 00:29:12,040 Speaker 1: If, hypothetically, that were something that actually motivated executives at 523 00:29:12,040 --> 00:29:14,880 Speaker 1: tech companies, there would be a number of things that 524 00:29:14,880 --> 00:29:17,920 Speaker 1: would be easier to do than quote-unquote labor hoarding 525 00:29:18,200 --> 00:29:21,600 Speaker 1: that we don't do institutionally. So in finance, there's this 526 00:29:21,640 --> 00:29:25,600 Speaker 1: institution of gardening leave. Tech doesn't institutionalize gardening leave at 527 00:29:25,640 --> 00:29:28,880 Speaker 1: any level, almost anywhere in the industry. And if you 528 00:29:28,960 --> 00:29:32,240 Speaker 1: were thinking about, let's prevent highly talented people from doing 529 00:29:32,280 --> 00:29:35,880 Speaker 1: interesting things for our competitors or for new startups that 530 00:29:35,920 --> 00:29:38,360 Speaker 1: they could create...
...the people in the industry that you 531 00:29:38,440 --> 00:29:41,440 Speaker 1: have the, like, tightest bead on their productivity level are 532 00:29:41,480 --> 00:29:44,720 Speaker 1: your existing employees, and so you would think, oh, well, 533 00:29:44,840 --> 00:29:47,280 Speaker 1: like, the natural place to start is to start with 534 00:29:47,320 --> 00:29:49,720 Speaker 1: people who already work here and say, if you leave, 535 00:29:50,000 --> 00:29:51,720 Speaker 1: we would like to buy twelve months of your time 536 00:29:51,760 --> 00:29:54,400 Speaker 1: sight unseen. And no one does that. And there's other 537 00:29:54,440 --> 00:29:58,120 Speaker 1: things that you can do. Broadly, in tech, there's always 538 00:29:58,160 --> 00:29:59,560 Speaker 1: a bit of push and pull between the needs of 539 00:29:59,560 --> 00:30:01,560 Speaker 1: a company and the needs of employees, but broadly tech 540 00:30:01,560 --> 00:30:06,000 Speaker 1: is strikingly pro-worker relative to many industries in 541 00:30:06,000 --> 00:30:08,360 Speaker 1: the United States. The things that would 542 00:30:08,400 --> 00:30:11,840 Speaker 1: be consistent with the labor-hoarding hypothesis just are not done. 543 00:30:12,360 --> 00:30:14,440 Speaker 1: You know, you can talk to the people that are 544 00:30:14,440 --> 00:30:16,640 Speaker 1: involved in the decisions that are read on the 545 00:30:16,640 --> 00:30:21,680 Speaker 1: outside as labor hoarding, and they never advance that as 546 00:30:21,760 --> 00:30:24,240 Speaker 1: a reason. You know, buying up a new company 547 00:30:24,280 --> 00:30:26,959 Speaker 1: that has four engineers attached to it is typically phrased 548 00:30:27,280 --> 00:30:29,720 Speaker 1: something more similar to: well, this is a team that 549 00:30:29,760 --> 00:30:33,920 Speaker 1: seems already jelled.
They're clearly highly productive individual contributors, 550 00:30:34,200 --> 00:30:36,840 Speaker 1: and we could have a bunch of engineering recruiters work 551 00:30:36,880 --> 00:30:40,040 Speaker 1: for months to find four similarly talented individuals, or the 552 00:30:40,440 --> 00:30:42,680 Speaker 1: M&A team can, like, tick one box off 553 00:30:42,680 --> 00:30:44,640 Speaker 1: in Q1 and get them all in the door 554 00:30:44,640 --> 00:30:46,560 Speaker 1: for the price of one check. Let's do it. 555 00:30:47,000 --> 00:30:49,400 Speaker 1: The notion of, like, and let's take this team off 556 00:30:49,400 --> 00:30:51,800 Speaker 1: the table so they don't, you know, have a market 557 00:30:51,840 --> 00:30:54,560 Speaker 1: success in three years and create something competitive with us 558 00:30:54,760 --> 00:31:14,400 Speaker 1: never comes up. Okay, we started talking about why the 559 00:31:14,480 --> 00:31:17,680 Speaker 1: hiring boom happened in the first place. We've talked about 560 00:31:17,720 --> 00:31:20,480 Speaker 1: maybe some of the decisions on who is getting cut. 561 00:31:21,200 --> 00:31:23,720 Speaker 1: Let's talk about the sort of prospects for the people 562 00:31:23,760 --> 00:31:26,280 Speaker 1: that have lost their jobs and/or the people that 563 00:31:26,320 --> 00:31:28,200 Speaker 1: are thinking about going into a career in tech. So, 564 00:31:28,320 --> 00:31:31,240 Speaker 1: how quickly do you perceive that the people losing their 565 00:31:31,360 --> 00:31:35,280 Speaker 1: jobs over the last several months are finding new offers? Like, 566 00:31:35,320 --> 00:31:37,880 Speaker 1: let's start really simple. Can I tack something onto that, 567 00:31:37,960 --> 00:31:42,560 Speaker 1: which is, how fungible are these types of jobs 568 00:31:42,600 --> 00:31:45,840 Speaker 1: in reality?
Yeah. A long time ago, in a place 569 00:31:45,880 --> 00:31:48,680 Speaker 1: far, far away, during the dot-com crash, I was 570 00:31:48,960 --> 00:31:52,840 Speaker 1: graduating from university, and the Wall Street Journal (I 571 00:31:53,320 --> 00:31:55,520 Speaker 1: read the Wall Street Journal every day with my father 572 00:31:55,560 --> 00:31:57,280 Speaker 1: growing up; it was how I learned to read the 573 00:31:57,320 --> 00:31:59,840 Speaker 1: Wall Street Journal) could do no wrong in my eyes 574 00:32:00,040 --> 00:32:04,080 Speaker 1: as an undergrad engineer, and the Wall Street 575 00:32:04,160 --> 00:32:07,640 Speaker 1: Journal had pretty much decided that, yep, engineering as a field 576 00:32:07,720 --> 00:32:10,680 Speaker 1: is done in the United States of America. Henceforth, all 577 00:32:10,720 --> 00:32:14,320 Speaker 1: engineering will happen in Asia. And I said, oh, shucks, 578 00:32:14,480 --> 00:32:16,760 Speaker 1: I really wanted to get an engineering job. I guess 579 00:32:16,800 --> 00:32:18,920 Speaker 1: I have to move to Asia. And so I did. 580 00:32:19,200 --> 00:32:23,400 Speaker 1: Oh wow, now we know the origin story. This is 581 00:32:23,440 --> 00:32:25,520 Speaker 1: the backstory of how I ended up spending my entire 582 00:32:25,520 --> 00:32:28,440 Speaker 1: adult life in Japan. Now, that ended up being a 583 00:32:28,440 --> 00:32:31,120 Speaker 1: good life decision for me, for 584 00:32:31,480 --> 00:32:34,560 Speaker 1: entirely unrelated reasons. But it turns out there were, in 585 00:32:34,640 --> 00:32:37,520 Speaker 1: fact, engineers hired between two thousand and four and two 586 00:32:37,560 --> 00:32:40,000 Speaker 1: thousand and twenty-three in the United States, and so 587 00:32:40,400 --> 00:32:44,560 Speaker 1: reports of the field's demise were heavily exaggerated.
If you 588 00:32:44,600 --> 00:32:48,280 Speaker 1: are considering a career in engineering, every reason you had 589 00:32:48,320 --> 00:32:52,440 Speaker 1: to consider a career in engineering is, like, still 590 00:32:52,520 --> 00:32:55,680 Speaker 1: a reason to do it. So this, like, minor wobble 591 00:32:55,800 --> 00:32:59,320 Speaker 1: that will be forgotten in a matter of months: please 592 00:32:59,320 --> 00:33:01,560 Speaker 1: don't allow it to, like, cause you to make major, 593 00:33:01,640 --> 00:33:05,960 Speaker 1: drastic life decisions. Although life is what happens when 594 00:33:06,000 --> 00:33:09,200 Speaker 1: you're busy dealing with these little wobbles. Okay, so, that 595 00:33:09,280 --> 00:33:13,800 Speaker 1: out of the way: how fungible are people? Broadly speaking, 596 00:33:14,240 --> 00:33:17,480 Speaker 1: in the early levels of a career, tech tends to cast 597 00:33:17,560 --> 00:33:20,920 Speaker 1: a very wide net and hire people for what's often 598 00:33:20,920 --> 00:33:24,320 Speaker 1: called horsepower, with the expectation that they will be able 599 00:33:24,360 --> 00:33:28,120 Speaker 1: to specialize over time. There is some degree of worry 600 00:33:28,200 --> 00:33:31,479 Speaker 1: that if you spend ten years or fifteen years in 601 00:33:31,480 --> 00:33:35,760 Speaker 1: a particular industry doing the same thing (the quote often used is having 602 00:33:35,880 --> 00:33:37,960 Speaker 1: the same year ten years in a row), then you 603 00:33:37,960 --> 00:33:40,600 Speaker 1: will end up over-specialized and only be available for 604 00:33:40,720 --> 00:33:42,560 Speaker 1: doing that sort of thing in the future. Depending on 605 00:33:42,640 --> 00:33:45,600 Speaker 1: the thing you are doing, there might be a sharply 606 00:33:45,640 --> 00:33:49,080 Speaker 1: limited set of firms for which that is relevant.
But broadly speaking, 607 00:33:49,320 --> 00:33:52,200 Speaker 1: the engineers that were hired to do anything in the 608 00:33:52,240 --> 00:33:56,720 Speaker 1: first five-to-seven-ish years of their career are broadly 609 00:33:56,760 --> 00:34:00,920 Speaker 1: expected to be able to do not quite anything, but, 610 00:34:01,040 --> 00:34:03,400 Speaker 1: like, a large subset of all the things that a 611 00:34:03,600 --> 00:34:06,440 Speaker 1: tech employer could want an engineer to do. And so 612 00:34:06,920 --> 00:34:10,800 Speaker 1: the liquidity in the tech market, within, like, a broad 613 00:34:10,800 --> 00:34:16,520 Speaker 1: class like recruiters or engineers, etcetera (liquidity between job titles, 614 00:34:16,560 --> 00:34:19,759 Speaker 1: exact roles, exact companies, business model of the company), is 615 00:34:19,840 --> 00:34:23,480 Speaker 1: very high. And I'm forgetting what Joe's original question was. Well, 616 00:34:23,560 --> 00:34:26,080 Speaker 1: so, are they finding jobs in the short term? Like, you 617 00:34:26,160 --> 00:34:29,560 Speaker 1: must hear from people, you must talk to people, like, uh, 618 00:34:30,239 --> 00:34:33,520 Speaker 1: people who just got cut. Are recruiters already reaching 619 00:34:33,560 --> 00:34:38,080 Speaker 1: out to them from different companies? Structurally, tech companies (this 620 00:34:38,080 --> 00:34:40,480 Speaker 1: would be a bad pull quote), structurally, tech companies are 621 00:34:40,520 --> 00:34:43,040 Speaker 1: like sharks. Okay, we're gonna pull that quote. 622 00:34:45,320 --> 00:34:48,080 Speaker 1: Just like sharks, like the way that their gills work: 623 00:34:48,120 --> 00:34:51,240 Speaker 1: they have to keep swimming or they stop getting oxygen. 624 00:34:51,280 --> 00:34:53,759 Speaker 1: And that's an unfortunate thing for most creatures.
But tech 625 00:34:53,760 --> 00:34:57,799 Speaker 1: companies, because of their staffing models, and that thing 626 00:34:57,800 --> 00:35:00,239 Speaker 1: we talked about earlier, where they are constantly mixing the 627 00:35:00,920 --> 00:35:03,480 Speaker 1: number of people at each level of seniority within the company: 628 00:35:03,560 --> 00:35:06,080 Speaker 1: they have to keep hiring. And so even if an 629 00:35:06,120 --> 00:35:09,879 Speaker 1: individual company decides, like, okay, we're going to, like, push 630 00:35:09,920 --> 00:35:13,280 Speaker 1: pause for six weeks and do a quote-unquote hiring freeze, 631 00:35:13,560 --> 00:35:15,799 Speaker 1: the amount of time they can actually do 632 00:35:15,960 --> 00:35:18,680 Speaker 1: that and not severely damage the business is limited. So 633 00:35:19,040 --> 00:35:22,319 Speaker 1: pauses are always temporary unless the company is going down 634 00:35:22,360 --> 00:35:25,359 Speaker 1: the tubes. And, like, the large tech companies certainly are 635 00:35:25,360 --> 00:35:27,920 Speaker 1: not going down the tubes. Some startups might get shaken 636 00:35:27,960 --> 00:35:30,560 Speaker 1: out in the months ahead due to, uh, funding constraints, 637 00:35:30,600 --> 00:35:34,160 Speaker 1: et cetera, but, like, the overall business of the Internet 638 00:35:34,400 --> 00:35:38,799 Speaker 1: continues to grow apace. So pauses are temporary in nature. And 639 00:35:39,280 --> 00:35:43,200 Speaker 1: there exist, you know, like, many different companies inside 640 00:35:43,640 --> 00:35:46,440 Speaker 1: the broader ambit of tech.
Some of them might be 641 00:35:46,440 --> 00:35:49,319 Speaker 1: paused at any given moment, some of them are, you know, 642 00:35:49,680 --> 00:35:52,960 Speaker 1: still attempting to make new investments for 2023, and some 643 00:35:53,080 --> 00:35:55,640 Speaker 1: of them, while they're not in, uh, sort of rapid 644 00:35:55,640 --> 00:35:59,600 Speaker 1: growth mode, are hiring for things like, you know, 645 00:35:59,719 --> 00:36:01,399 Speaker 1: we have to backfill for people who are leaving 646 00:36:01,440 --> 00:36:04,360 Speaker 1: the company, and in a typical year at a typical 647 00:36:04,360 --> 00:36:06,319 Speaker 1: tech company, that may be like ten percent of our 648 00:36:06,360 --> 00:36:09,239 Speaker 1: engineering staff. So if we've got two thousand engineers, we 649 00:36:09,239 --> 00:36:11,319 Speaker 1: have two hundred engineers that we are slated to hire 650 00:36:11,360 --> 00:36:15,279 Speaker 1: in 2023. Interestingly, one of the things that caused a bit 651 00:36:15,280 --> 00:36:17,960 Speaker 1: of the over-hiring was: companies have this model for 652 00:36:18,400 --> 00:36:20,640 Speaker 1: what percentage of people will leave in a year, and 653 00:36:20,680 --> 00:36:22,839 Speaker 1: therefore how many you need to hire just to stay 654 00:36:22,840 --> 00:36:25,439 Speaker 1: at the current level of employment that you have. And 655 00:36:25,480 --> 00:36:29,320 Speaker 1: when the economy started wobbling in 2022, what happened was the 656 00:36:30,200 --> 00:36:33,560 Speaker 1: rates of voluntary attrition at companies, meaning people who 657 00:36:33,680 --> 00:36:36,520 Speaker 1: resigned of their own volition, went lower than the 658 00:36:36,560 --> 00:36:40,880 Speaker 1: model predicted.
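The attrition-forecast overshoot being described is simple arithmetic, and can be sketched as a toy model. The headcount, attrition rates, and growth target below are hypothetical, purely for illustration; they are not any real company's staffing formula.

```python
# Toy model of the hiring overhang: a hiring plan is sized against
# *predicted* attrition, then actual attrition comes in lower.

def planned_hires(headcount, predicted_attrition, net_growth):
    """Hires to schedule for the year: backfill the leavers the
    model predicts, plus the desired net growth."""
    return int(headcount * predicted_attrition) + net_growth

def year_end_headcount(headcount, hires, actual_attrition):
    """Headcount once the hires land and actual attrition is known."""
    return headcount + hires - int(headcount * actual_attrition)

start = 2000        # engineers at the start of the year
predicted = 0.10    # the model expects 10% to resign voluntarily
growth = 100        # desired net growth in headcount

hires = planned_hires(start, predicted, growth)   # 200 backfill + 100
target = start + growth                           # 2100

# If resignations suddenly halve, and nobody has to announce
# "I decided not to quit", the plan locked in months earlier overshoots.
actual = year_end_headcount(start, hires, 0.05)   # 2200
overshoot = actual - target                       # 100 extra engineers
```

The point of the sketch: overshooting does not require any decision to over-hire, only a quiet statistical shift in behavior the company cannot observe until after the hiring pipeline has already been committed.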
And because you need to, like, set in 659 00:36:40,920 --> 00:36:43,680 Speaker 1: place a process that takes months to hire people, but 660 00:36:43,920 --> 00:36:47,880 Speaker 1: the process of deciding not to quit is not visible 661 00:36:47,920 --> 00:36:50,160 Speaker 1: for those months, that resulted in sort of like a 662 00:36:50,239 --> 00:36:53,440 Speaker 1: hiring overhang, and so companies overshot their targets for how 663 00:36:53,440 --> 00:36:57,040 Speaker 1: many people would be in the company, which doesn't sound 664 00:36:57,080 --> 00:36:58,319 Speaker 1: like an easy thing to do, but it is a 665 00:36:58,400 --> 00:37:01,080 Speaker 1: very easy thing to do if there is a sort 666 00:37:01,120 --> 00:37:03,799 Speaker 1: of, like, sharp change in employee behavior with regards to 667 00:37:03,840 --> 00:37:06,160 Speaker 1: things that they have total control over and don't have 668 00:37:06,200 --> 00:37:08,160 Speaker 1: to announce to you, like deciding to leave or not 669 00:37:08,320 --> 00:37:12,879 Speaker 1: leave, in a statistical fashion. Are there signs that tech 670 00:37:12,960 --> 00:37:16,160 Speaker 1: workers should look out for that they're about to be 671 00:37:16,239 --> 00:37:19,680 Speaker 1: laid off? Like, do you stop being assigned new projects? 672 00:37:19,719 --> 00:37:22,440 Speaker 1: Do your access codes get cut off? Does someone come 673 00:37:22,480 --> 00:37:25,520 Speaker 1: take your stapler off your desk? Like, what exactly are 674 00:37:25,520 --> 00:37:30,319 Speaker 1: the warning signs that you might be in the danger zone? Many, 675 00:37:30,400 --> 00:37:34,080 Speaker 1: many tech people have a large degree of stress with 676 00:37:34,120 --> 00:37:38,640 Speaker 1: regards to whether am I doing well, am I on the list, etcetera, etcetera, 677 00:37:38,800 --> 00:37:41,440 Speaker 1: and I don't want to add to that stress. But 678 00:37:41,560 --> 00:37:44,920 Speaker 1: broadly:
You should have an understanding of how performance 679 00:37:44,960 --> 00:37:48,399 Speaker 1: is calculated at your company, and consider that official view 680 00:37:48,400 --> 00:37:51,880 Speaker 1: of your performance to be perhaps more important than you 681 00:37:51,920 --> 00:37:54,919 Speaker 1: would naively believe it to be, because the official view, where 682 00:37:54,960 --> 00:37:57,880 Speaker 1: the entirety of your performance is reduced down to, 683 00:37:57,960 --> 00:38:01,320 Speaker 1: like, one number, I'm a four for this six months, 684 00:38:01,719 --> 00:38:03,799 Speaker 1: that is the only view that is going to be 685 00:38:03,840 --> 00:38:07,800 Speaker 1: available to someone who might be two, three, four steps 686 00:38:07,800 --> 00:38:10,520 Speaker 1: above you on the ladder when they're going to make 687 00:38:10,800 --> 00:38:14,200 Speaker 1: hard decisions, in a hypothetical future where they're making hard decisions. 688 00:38:14,680 --> 00:38:18,480 Speaker 1: So the things that cause formal visibility to a company are 689 00:38:18,680 --> 00:38:22,399 Speaker 1: anomalously important, and the career-oriented people around you, who 690 00:38:22,520 --> 00:38:26,400 Speaker 1: are very good at working those systems to their advantage, 691 00:38:26,400 --> 00:38:29,680 Speaker 1: will find advantages based on that. But I wouldn't, you know, 692 00:38:30,040 --> 00:38:32,880 Speaker 1: over-rotate on making that the only thing you're thinking about. 693 00:38:33,040 --> 00:38:37,000 Speaker 1: It seems simple, but just do great work, and then make 694 00:38:37,040 --> 00:38:38,759 Speaker 1: sure people are aware of the fact that you did 695 00:38:38,760 --> 00:38:40,680 Speaker 1: the great work, and then things will tend to work 696 00:38:40,680 --> 00:38:43,160 Speaker 1: out, in a career fashion, over a long period 697 00:38:43,160 --> 00:38:46,880 Speaker 1: of time.
So I have a question that 698 00:38:47,040 --> 00:38:50,759 Speaker 1: bridges this conversation with the conversation we had last week 699 00:38:50,800 --> 00:38:53,279 Speaker 1: about IT, and I realized I should have asked 700 00:38:53,320 --> 00:38:56,600 Speaker 1: it last time. This whole episode has been because, it's 701 00:38:56,640 --> 00:38:59,920 Speaker 1: actually, I actually only just had one question from last time. 702 00:39:00,000 --> 00:39:02,000 Speaker 1: I had to come up with a whole excuse for 703 00:39:02,080 --> 00:39:03,840 Speaker 1: why we needed to have you back on just so 704 00:39:03,880 --> 00:39:06,160 Speaker 1: I could ask this. But it occurred to me, you know, 705 00:39:06,280 --> 00:39:09,600 Speaker 1: like, in the business press, we're always reporting 706 00:39:09,600 --> 00:39:12,200 Speaker 1: on CEOs getting fired or let go 707 00:39:12,320 --> 00:39:16,960 Speaker 1: and hired. Sometimes CFOs. I don't see much coverage of, 708 00:39:17,000 --> 00:39:18,719 Speaker 1: like, CTOs or CIOs, 709 00:39:18,760 --> 00:39:20,799 Speaker 1: like, the people who run the internal tech systems, being 710 00:39:20,880 --> 00:39:23,759 Speaker 1: let go for poor performance. I actually think the only 711 00:39:23,800 --> 00:39:26,680 Speaker 1: time I can ever remember hearing of any sort of CTO 712 00:39:26,840 --> 00:39:29,760 Speaker 1: or something losing their job for poor performance is probably 713 00:39:29,760 --> 00:39:32,960 Speaker 1: like fifteen years ago, when Twitter was always having the 714 00:39:33,000 --> 00:39:35,880 Speaker 1: fail whales and, like, they weren't scaling very well during 715 00:39:36,040 --> 00:39:38,480 Speaker 1: the boom years.
And other than that, I can't actually 716 00:39:38,680 --> 00:39:42,400 Speaker 1: recall a time in which I, like, read a story 717 00:39:42,440 --> 00:39:44,560 Speaker 1: about, you know, a CTO being laid off for 718 00:39:44,600 --> 00:39:47,560 Speaker 1: bad performance. How often does that happen? And, you know, 719 00:39:47,600 --> 00:39:50,160 Speaker 1: in the context of whether we're talking about tech companies 720 00:39:50,600 --> 00:39:52,560 Speaker 1: or not. You know, I think we were talking about 721 00:39:52,680 --> 00:39:55,920 Speaker 1: Southwest and others last time. Like, how often do the 722 00:39:55,960 --> 00:39:58,600 Speaker 1: heads of those positions lose their job because they say, like, 723 00:39:59,080 --> 00:40:03,240 Speaker 1: our IT is not good? So it's a complicated subject 724 00:40:03,280 --> 00:40:05,400 Speaker 1: for a number of reasons. One is that the degree 725 00:40:05,400 --> 00:40:08,920 Speaker 1: of saliency of the CTO at most companies to the media is 726 00:40:08,960 --> 00:40:12,160 Speaker 1: relatively low. The degree of saliency of many things that 727 00:40:12,200 --> 00:40:14,200 Speaker 1: are very important in the tech industry to the media 728 00:40:14,360 --> 00:40:16,040 Speaker 1: is lower than many people in the tech industry would like, 729 00:40:16,239 --> 00:40:18,880 Speaker 1: and that is one cause of the frequent conflict 730 00:40:18,880 --> 00:40:21,680 Speaker 1: between the media and tech. But, be that as it may, do 731 00:40:21,719 --> 00:40:26,239 Speaker 1: people get laid off for poor performance? Yes. One relatively 732 00:40:26,280 --> 00:40:29,200 Speaker 1: frequent thing that happens, relative to the incidence of 733 00:40:29,880 --> 00:40:34,279 Speaker 1: senior executives departing, is the, uh, sort of, like, 734 00:40:34,320 --> 00:40:37,440 Speaker 1: fall-on-your-sword motion if there is a significant outage.
735 00:40:37,800 --> 00:40:40,799 Speaker 1: That is a thing that frequently happens, or frequently relative to 736 00:40:40,960 --> 00:40:44,960 Speaker 1: all causes for a departure. And I'm hesitant to give you 737 00:40:45,040 --> 00:40:47,919 Speaker 1: the example because Tokyo is a small town, but there 738 00:40:47,960 --> 00:40:50,920 Speaker 1: are a number of banks, both in Japan and outside 739 00:40:50,960 --> 00:40:54,120 Speaker 1: of Japan, that have had disabling computer outages for, like, 740 00:40:54,239 --> 00:40:55,879 Speaker 1: days to weeks at a time, where that is an 741 00:40:55,920 --> 00:40:59,799 Speaker 1: extremely, extremely important thing to avoid for a bank, and it 742 00:41:00,040 --> 00:41:02,800 Speaker 1: rolls up fairly directly to the head of IT 743 00:41:03,200 --> 00:41:06,200 Speaker 1: or the CEO. And there are cases where either the 744 00:41:06,239 --> 00:41:08,080 Speaker 1: head of IT or the CEO have left 745 00:41:08,120 --> 00:41:10,520 Speaker 1: as a result of that. There is one thing 746 00:41:10,560 --> 00:41:13,200 Speaker 1: that I do like about the culture that is Japanese management, 747 00:41:13,280 --> 00:41:16,719 Speaker 1: where in the sort of, like, ritualized speech that an 748 00:41:16,719 --> 00:41:20,239 Speaker 1: executive gives, they will often say, please don't 749 00:41:20,239 --> 00:41:22,120 Speaker 1: blame the people that had their hands on the keyboards 750 00:41:22,160 --> 00:41:24,480 Speaker 1: during this. The fact that this was allowed to happen 751 00:41:24,560 --> 00:41:28,040 Speaker 1: was a result of management's mistaken decisions taken over 752 00:41:28,040 --> 00:41:31,759 Speaker 1: the course of years. I presided over them, and as 753 00:41:31,760 --> 00:41:34,600 Speaker 1: a result, this, uh, this outage.
Even if, you know, 754 00:41:34,640 --> 00:41:37,920 Speaker 1: it was one person individually fat-fingering something that took 755 00:41:37,960 --> 00:41:39,960 Speaker 1: us down for a week, this belongs at my door, 756 00:41:40,000 --> 00:41:42,880 Speaker 1: and I'm resigning to take responsibility for it. There are 757 00:41:42,920 --> 00:41:45,319 Speaker 1: many things I don't love about Japanese management culture, but 758 00:41:45,400 --> 00:41:50,319 Speaker 1: that bit I do like. Another thing is there are 759 00:41:50,360 --> 00:41:55,120 Speaker 1: reasons for companies to be other than maximally 760 00:41:55,120 --> 00:41:57,760 Speaker 1: public about the fact that we are removing a senior 761 00:41:57,800 --> 00:42:00,480 Speaker 1: executive for cause. If you remember, over the course 762 00:42:00,480 --> 00:42:02,800 Speaker 1: of the last couple of years, the IT sector 763 00:42:02,800 --> 00:42:05,840 Speaker 1: has been in sort of, like, massive boom mode. Companies 764 00:42:05,880 --> 00:42:09,239 Speaker 1: are extremely protective of their brand with respect to engineering candidates. 765 00:42:09,640 --> 00:42:13,640 Speaker 1: Nobody wants to join an organization that exists under a 766 00:42:13,680 --> 00:42:17,240 Speaker 1: cloud whose CTO just got fired for being an idiot. 767 00:42:17,680 --> 00:42:21,040 Speaker 1: So the thing that might happen is, like, oh, well, 768 00:42:21,600 --> 00:42:25,919 Speaker 1: the previous VP of engineering wasn't quite up to snuff. 769 00:42:26,239 --> 00:42:29,040 Speaker 1: Maybe they can be shuffled onto a different project, and 770 00:42:29,080 --> 00:42:31,719 Speaker 1: we're going to hire a CTO above them. If you've 771 00:42:31,719 --> 00:42:34,160 Speaker 1: already hired a CTO, that's a bit 772 00:42:34,160 --> 00:42:37,440 Speaker 1: of a more difficult thing.
But, like, shuffles with regards 773 00:42:37,480 --> 00:42:39,880 Speaker 1: to who are the most important people in the engineering 774 00:42:39,960 --> 00:42:42,920 Speaker 1: organization, and is there a separate product organization, do they 775 00:42:42,920 --> 00:42:45,800 Speaker 1: report to the same people, etcetera, etcetera, are sometimes caused 776 00:42:45,840 --> 00:42:49,239 Speaker 1: by, like, X isn't getting it done. We want to, 777 00:42:49,640 --> 00:42:51,880 Speaker 1: like, shuffle in Y, but we don't want that to 778 00:42:51,920 --> 00:42:55,160 Speaker 1: be seen as a repudiation of X. Not because we 779 00:42:55,200 --> 00:42:57,239 Speaker 1: care about X's opinions so much, but we care about 780 00:42:57,280 --> 00:42:59,719 Speaker 1: how this will be read by internal engineers, who we 781 00:42:59,800 --> 00:43:03,680 Speaker 1: want to keep attached to the company, and external candidates. Alright, 782 00:43:03,800 --> 00:43:07,400 Speaker 1: one last small question that'll probably beget a question that 783 00:43:07,440 --> 00:43:08,880 Speaker 1: we could talk about for a long time, but just 784 00:43:08,880 --> 00:43:11,440 Speaker 1: real quickly. So the one area within tech 785 00:43:11,480 --> 00:43:15,000 Speaker 1: that seems, like, almost certainly going to be hiring like crazy 786 00:43:15,080 --> 00:43:18,480 Speaker 1: for years at this point is anything to do with AI. 787 00:43:18,520 --> 00:43:20,799 Speaker 1: And, you know, we all know what's going on there. 788 00:43:21,520 --> 00:43:23,799 Speaker 1: How much are the skills that some of these, like, 789 00:43:23,800 --> 00:43:28,520 Speaker 1: sort of cutting-edge AI companies are in need of.
How 790 00:43:28,600 --> 00:43:31,280 Speaker 1: much are these skills that sort of legacy or existing 791 00:43:31,320 --> 00:43:34,600 Speaker 1: tech workers might have, or how much are the skills 792 00:43:34,600 --> 00:43:36,880 Speaker 1: that they need something that, like, you really need years 793 00:43:36,920 --> 00:43:40,480 Speaker 1: of, like, focused training in the specific area to satisfy 794 00:43:40,520 --> 00:43:43,360 Speaker 1: what these companies need? Can I, can I add another 795 00:43:43,400 --> 00:43:46,440 Speaker 1: thing onto the back of that, please? How many coding 796 00:43:46,520 --> 00:43:50,479 Speaker 1: jobs will something like ChatGPT destroy? Yeah. Should people 797 00:43:50,480 --> 00:43:54,160 Speaker 1: stop learning to code? Yeah, yeah, talk about that. 798 00:43:55,120 --> 00:43:58,520 Speaker 1: So I have a glib but true answer with respect 799 00:43:58,560 --> 00:44:02,000 Speaker 1: to are advanced AI techniques going to destroy programming jobs. 800 00:44:02,360 --> 00:44:05,080 Speaker 1: The first program, or class of programs, that we had 801 00:44:05,120 --> 00:44:08,560 Speaker 1: where an advanced computer was obviating the need for human 802 00:44:08,560 --> 00:44:12,520 Speaker 1: programmers was called the compiler, where instead of doing, you know, 803 00:44:12,960 --> 00:44:16,480 Speaker 1: complex low-level instructions directly in assembly, and speaking 804 00:44:16,520 --> 00:44:19,160 Speaker 1: sort of natively the language of the computer, you used 805 00:44:19,200 --> 00:44:21,800 Speaker 1: what were called high-level programming languages, like C, 806 00:44:22,080 --> 00:44:24,520 Speaker 1: back in the day.
And then the compiler would, you know, 807 00:44:24,920 --> 00:44:27,560 Speaker 1: use its magic AI powers to turn that C into 808 00:44:27,719 --> 00:44:30,040 Speaker 1: assembly language, so that you didn't have to laboriously do 809 00:44:30,080 --> 00:44:34,480 Speaker 1: the assembly language yourself. So every technology that gives programmers 810 00:44:34,560 --> 00:44:38,240 Speaker 1: more power, more capability to do things that are valuable 811 00:44:38,280 --> 00:44:42,239 Speaker 1: for human society, probably increases the aggregate demand for programmers. 812 00:44:42,320 --> 00:44:44,200 Speaker 1: That is sort of, like, my high-level view on the 813 00:44:44,200 --> 00:44:47,840 Speaker 1: world, and I've yet to see a contrary example to that. 814 00:44:48,920 --> 00:44:53,040 Speaker 1: So an interesting question with regards to AI is what 815 00:44:53,239 --> 00:44:56,880 Speaker 1: series of steps is going to 816 00:44:56,880 --> 00:44:59,919 Speaker 1: be necessary to take it to market in a way 817 00:45:00,200 --> 00:45:03,920 Speaker 1: that it actually creates the value for individual people and 818 00:45:03,960 --> 00:45:07,000 Speaker 1: for society that it seems to have latent within it.
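For readers who want to see the compiler precedent concretely, here is a minimal sketch using Python's standard `dis` module (an analogy of our choosing, not an example from the episode): the CPython compiler translates a high-level function into low-level bytecode instructions the programmer never wrote by hand, loosely the way a C compiler emits assembly.

```python
import dis

# A "high-level" function; CPython's compiler translates it into
# bytecode, much as a C compiler translates C into assembly.
def add(a, b):
    return a + b

# The low-level instructions the programmer never had to write by hand.
opnames = [ins.opname for ins in dis.Bytecode(add)]
print(opnames)  # includes a BINARY_* add instruction and RETURN_VALUE
```

The exact opcode names vary between Python versions (`BINARY_ADD` in older releases, `BINARY_OP` in newer ones), which is itself a reminder that the low layer keeps changing under the programmer without destroying the programming job above it.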
819 00:45:07,360 --> 00:45:10,279 Speaker 1: And if you look at, like, ChatGPT, I 820 00:45:10,520 --> 00:45:13,080 Speaker 1: view it as an iceberg. There's the 821 00:45:13,160 --> 00:45:15,960 Speaker 1: above-the-waterline part and the below-the-waterline part, and 822 00:45:16,000 --> 00:45:20,320 Speaker 1: the below-the-waterline part has some, let's say, deep, deep magic 823 00:45:20,440 --> 00:45:24,200 Speaker 1: there. Bracketing out that magic for the moment, it seems 824 00:45:24,239 --> 00:45:27,239 Speaker 1: like the above-the-waterline part was very important in 825 00:45:27,719 --> 00:45:31,239 Speaker 1: why everyone has heard of ChatGPT, and probably used it 826 00:45:31,280 --> 00:45:34,160 Speaker 1: if you're listening to this podcast, but hasn't heard 827 00:45:34,200 --> 00:45:37,279 Speaker 1: of, like, similar efforts at Google, etcetera. The reason is 828 00:45:37,360 --> 00:45:39,759 Speaker 1: that there was a, you know, a product-focused team 829 00:45:39,840 --> 00:45:43,160 Speaker 1: that made a relatively pedestrian piece of software, like a 830 00:45:43,280 --> 00:45:47,719 Speaker 1: chat interface, but made it really, really good, and, like, 831 00:45:47,880 --> 00:45:50,719 Speaker 1: worked on that to the point where people's interactions with 832 00:45:50,760 --> 00:45:55,799 Speaker 1: the underlying large language model would be, like, sufficiently effervescent 833 00:45:55,840 --> 00:45:58,160 Speaker 1: that they would screenshot that interaction and share it over 834 00:45:58,160 --> 00:46:02,319 Speaker 1: to Twitter. And so everything in that above-the-waterline 835 00:46:02,400 --> 00:46:07,280 Speaker 1: part is amenable to the technologies and tactics 836 00:46:07,280 --> 00:46:11,480 Speaker 1: that existing engineers have, with no modification whatsoever. There, you're 837 00:46:11,520 --> 00:46:13,720 Speaker 1: talking to the back end.
The back end is implemented 838 00:46:13,719 --> 00:46:15,279 Speaker 1: in a different kind of magic than your back ends 839 00:46:15,360 --> 00:46:18,000 Speaker 1: usually are, but the back end has always been magic too. 840 00:46:18,520 --> 00:46:22,600 Speaker 1: That is, like, part of the answer. There's an interesting question, 841 00:46:22,840 --> 00:46:25,040 Speaker 1: like, how much of the work is going to be 842 00:46:25,120 --> 00:46:28,520 Speaker 1: that above-the-waterline part, the productization of these, you know, 843 00:46:29,040 --> 00:46:32,520 Speaker 1: creating, like, new forms of user interfaces, new models for 844 00:46:32,560 --> 00:46:35,680 Speaker 1: interactions with users, new metaphors that we have to teach 845 00:46:35,719 --> 00:46:39,759 Speaker 1: to people. Like, you know, there might be 846 00:46:39,800 --> 00:46:42,040 Speaker 1: an entire field in, like, education in how to do, 847 00:46:42,239 --> 00:46:45,080 Speaker 1: I don't know, prompt engineering well, prompt engineering being how 848 00:46:45,120 --> 00:46:47,800 Speaker 1: do you type in the right series of incantations to 849 00:46:47,880 --> 00:46:51,120 Speaker 1: the machine so that the spirit summoned up out of the ether 850 00:46:51,160 --> 00:46:54,360 Speaker 1: does the right thing for you? So, like, 851 00:46:54,880 --> 00:46:57,200 Speaker 1: what percentage of the work will happen there, versus what 852 00:46:57,239 --> 00:47:00,319 Speaker 1: percentage of the work will happen on these, like, core 853 00:47:00,480 --> 00:47:03,319 Speaker 1: under-the-hood model things? A sub-sub-question to 854 00:47:03,360 --> 00:47:06,360 Speaker 1: that, too, is, like, okay, so for the work happening 855 00:47:07,440 --> 00:47:10,400 Speaker 1: at that model layer.
Is that work going to happen 856 00:47:10,520 --> 00:47:13,560 Speaker 1: at every company that consumes models, or is it going 857 00:47:13,600 --> 00:47:17,640 Speaker 1: to happen primarily at OpenAI and Google and Microsoft, 858 00:47:18,080 --> 00:47:20,000 Speaker 1: and we can count the number of firms that, like, 859 00:47:20,080 --> 00:47:23,120 Speaker 1: need these engineers on a single hand? In a 860 00:47:23,200 --> 00:47:26,239 Speaker 1: world where we count the number of firms that need, 861 00:47:26,360 --> 00:47:30,040 Speaker 1: like, dedicated hard AI researchers on a single hand, that 862 00:47:30,080 --> 00:47:33,279 Speaker 1: probably implies, like, lower total employment of them than in 863 00:47:33,320 --> 00:47:36,399 Speaker 1: the world where every firm that touches AI has its 864 00:47:36,400 --> 00:47:40,040 Speaker 1: own AI practice on staff. But it's, at least as 865 00:47:40,040 --> 00:47:43,319 Speaker 1: of, like, the current state of play, deeply uncertain where that 866 00:47:43,360 --> 00:47:45,080 Speaker 1: will shake out. And so these are some of the 867 00:47:45,160 --> 00:47:47,720 Speaker 1: questions that get debated by people at both the AI 868 00:47:47,760 --> 00:47:50,319 Speaker 1: firms and also, like, you know, if you are 869 00:47:50,560 --> 00:47:53,839 Speaker 1: a VC, a venture capitalist that's investing in the space, you 870 00:47:53,880 --> 00:47:56,759 Speaker 1: are probably having, like, a number of interesting dinner conversations 871 00:47:56,760 --> 00:47:59,879 Speaker 1: on, okay, where does the value accrue in this chain? 872 00:48:00,320 --> 00:48:02,560 Speaker 1: Where does most of the work get done? How do 873 00:48:02,640 --> 00:48:05,279 Speaker 1: these products, like, expose themselves in the life of 874 00:48:05,280 --> 00:48:07,840 Speaker 1: the user? Is it something deeply under the hood, or 875 00:48:07,880 --> 00:48:10,600 Speaker 1: is it integrated into their daily operations?
Do they know 876 00:48:10,760 --> 00:48:13,560 Speaker 1: they're using an AI? Do they know they're using software? 877 00:48:14,080 --> 00:48:16,440 Speaker 1: Is it something that they're, like, directly typing in, or 878 00:48:16,480 --> 00:48:19,239 Speaker 1: is it something where they're interfacing with someone who's doing 879 00:48:19,239 --> 00:48:22,480 Speaker 1: the typing on their behalf, etcetera, etcetera? Well, Patrick, this 880 00:48:22,680 --> 00:48:25,040 Speaker 1: was absolutely great talking to you. We could talk for 881 00:48:25,080 --> 00:48:27,319 Speaker 1: a long time, but instead we'll just talk to you 882 00:48:27,320 --> 00:48:29,200 Speaker 1: in a few weeks again when we have a million 883 00:48:29,280 --> 00:48:31,759 Speaker 1: more questions. Now, I'm no big tech expert, but I learned a 884 00:48:31,760 --> 00:48:35,000 Speaker 1: lot and really appreciate you coming back on the show. 885 00:48:35,360 --> 00:48:36,959 Speaker 1: Thanks very much for having me, and I'm always happy 886 00:48:36,960 --> 00:48:52,560 Speaker 1: to come back. Thanks, Patrick. That was fun. So Tracy, 887 00:48:52,600 --> 00:48:55,160 Speaker 1: there were so many interesting elements of that conversation. I'm 888 00:48:55,160 --> 00:48:58,239 Speaker 1: really glad we had Patrick back.
I'm not even sure 889 00:48:58,239 --> 00:49:00,680 Speaker 1: where to begin, but to start, you know, his point 890 00:49:00,680 --> 00:49:03,600 Speaker 1: about the hiring boom during the pandemic, I thought, was interesting. 891 00:49:03,640 --> 00:49:06,080 Speaker 1: Not just that maybe these companies had sort of 892 00:49:06,200 --> 00:49:09,479 Speaker 1: unrealistic expectations about how long this growth boom would last, 893 00:49:09,880 --> 00:49:12,799 Speaker 1: but that when you're hiring that fast and under sort 894 00:49:12,840 --> 00:49:16,719 Speaker 1: of extremely unusual situations, like, you have that drift where 895 00:49:16,800 --> 00:49:18,799 Speaker 1: maybe there's a little bit of a 896 00:49:18,800 --> 00:49:21,560 Speaker 1: we're-not-that-happy-with-the-class. And then also that point 897 00:49:21,640 --> 00:49:24,400 Speaker 1: that you also can't just fire everyone who came in 898 00:49:24,480 --> 00:49:29,240 Speaker 1: recently, for reasons of, like, seasoning and, like, skill-level growth. Well, 899 00:49:29,400 --> 00:49:32,719 Speaker 1: to me, it kind of, I guess, hammers home the 900 00:49:32,760 --> 00:49:36,120 Speaker 1: point that three years on from the start of the 901 00:49:36,160 --> 00:49:41,759 Speaker 1: COVID pandemic, we are still experiencing these knock-on developments. And 902 00:49:41,840 --> 00:49:44,680 Speaker 1: it kind of gets to the macro-versus-micro point 903 00:49:44,800 --> 00:49:48,040 Speaker 1: about some of the recent payrolls figures. You know, all 904 00:49:48,120 --> 00:49:50,880 Speaker 1: the tech layoffs that have been announced, are they saying 905 00:49:51,120 --> 00:49:54,440 Speaker 1: something about the wider economy, or is this really a 906 00:49:54,480 --> 00:49:58,920 Speaker 1: tech-specific problem? And I think, I mean, I can 907 00:49:58,960 --> 00:50:00,640 Speaker 1: kind of argue it either way.
I think I come 908 00:50:00,680 --> 00:50:04,960 Speaker 1: away from that conversation thinking, well, you know, one, those were 909 00:50:05,080 --> 00:50:08,480 Speaker 1: really unusual periods in terms of hiring for the big 910 00:50:08,480 --> 00:50:12,040 Speaker 1: tech companies, and, to some extent, it seems reasonable that 911 00:50:12,040 --> 00:50:14,719 Speaker 1: that starts to get rolled back a little bit. But, uh, 912 00:50:14,920 --> 00:50:17,319 Speaker 1: you know, it's also, I take his point, 913 00:50:17,520 --> 00:50:21,120 Speaker 1: and I suspect it's probably true, which is that if 914 00:50:21,160 --> 00:50:22,960 Speaker 1: a year ago you were thinking you wanted to go 915 00:50:23,000 --> 00:50:27,400 Speaker 1: into engineering or coding or something like that, very little 916 00:50:27,480 --> 00:50:30,640 Speaker 1: about what we've seen so far in twenty twenty-three should make 917 00:50:30,680 --> 00:50:32,720 Speaker 1: you change your mind. I thought that was really interesting 918 00:50:32,719 --> 00:50:35,919 Speaker 1: too, about, like, sort of the questions about AI and 919 00:50:36,200 --> 00:50:38,719 Speaker 1: how, as he pointed out, like, there 920 00:50:38,719 --> 00:50:42,200 Speaker 1: are other, you know, places working on very similar, if 921 00:50:42,239 --> 00:50:45,760 Speaker 1: not equal, technology. What sort of made things break through recently 922 00:50:46,480 --> 00:50:49,440 Speaker 1: was the consumerization of some of the chat interface or 923 00:50:49,480 --> 00:50:52,680 Speaker 1: some of these AI image-generation things. So, like, how 924 00:50:52,800 --> 00:50:54,680 Speaker 1: much of, like, the go-to-market for this stuff 925 00:50:54,760 --> 00:50:59,400 Speaker 1: ultimately is in sort of, like, familiar experiences that people already have?
926 00:50:59,800 --> 00:51:02,840 Speaker 1: It reminds me a lot, and I don't mean this 927 00:51:03,120 --> 00:51:05,719 Speaker 1: necessarily in a negative way, but it reminds me a 928 00:51:05,760 --> 00:51:08,680 Speaker 1: lot of crypto, in the sense that, like, yes, there 929 00:51:08,719 --> 00:51:11,799 Speaker 1: is a lot of hype around AI, but also in 930 00:51:11,840 --> 00:51:14,200 Speaker 1: the sense that this is a new technology that people 931 00:51:14,239 --> 00:51:17,680 Speaker 1: can actually participate in. And so the use of the 932 00:51:18,320 --> 00:51:22,480 Speaker 1: AI image generators, ChatGPT, it kind of brings it 933 00:51:22,520 --> 00:51:24,479 Speaker 1: to people in the same way that they are able 934 00:51:24,520 --> 00:51:27,680 Speaker 1: to experiment with, you know, blockchain and different types of 935 00:51:27,719 --> 00:51:31,359 Speaker 1: money using crypto, and so it suddenly becomes a lot 936 00:51:31,400 --> 00:51:33,920 Speaker 1: more salient for people in that way. Yeah, you know, 937 00:51:33,960 --> 00:51:36,440 Speaker 1: like, that's a good example, because, like, with crypto, like, 938 00:51:36,440 --> 00:51:39,319 Speaker 1: if you're, like, interacting with, like, core protocols or, like, 939 00:51:39,360 --> 00:51:42,120 Speaker 1: developing on Ethereum, like, that's going to be a limited, 940 00:51:42,200 --> 00:51:44,239 Speaker 1: a limited number of people know how to do that. 941 00:51:44,760 --> 00:51:47,680 Speaker 1: But if you're building, like, an exchange, there are a 942 00:51:47,680 --> 00:51:49,640 Speaker 1: lot of, I mean, everyone can have a wallet, right? 943 00:51:49,800 --> 00:51:52,319 Speaker 1: You're marketing, or stuff like that. There are still all 944 00:51:52,360 --> 00:51:55,120 Speaker 1: of these roles within crypto that have, like, sort of, 945 00:51:55,120 --> 00:51:58,880 Speaker 1: like, consumer-facing analogs to any other industry.
Yeah, I 946 00:51:58,920 --> 00:52:02,040 Speaker 1: need to look up the compiler. That sounds interesting. Yeah, 947 00:52:02,120 --> 00:52:05,320 Speaker 1: I'm gonna go off and Google, um, deep learning compiler, 948 00:52:05,520 --> 00:52:07,600 Speaker 1: I guess. So for job security, we still 949 00:52:07,600 --> 00:52:09,879 Speaker 1: need to learn to code, huh? I think we need 950 00:52:09,960 --> 00:52:14,719 Speaker 1: to learn AI. I don't know. Probably. I don't know, 951 00:52:14,960 --> 00:52:18,359 Speaker 1: but, you know, C++, which is what I 952 00:52:18,440 --> 00:52:22,000 Speaker 1: learned a little bit of. No, because it's obsolete. 953 00:52:22,239 --> 00:52:24,720 Speaker 1: Like, no, I don't think anyone uses C++, 954 00:52:24,760 --> 00:52:27,040 Speaker 1: and they certainly don't use it for, for AI and 955 00:52:27,560 --> 00:52:30,440 Speaker 1: machine learning stuff. I should have asked Patrick what language, 956 00:52:30,640 --> 00:52:33,320 Speaker 1: what coding language. Python? When we have him back in 957 00:52:33,400 --> 00:52:36,480 Speaker 1: the... Yeah, okay. Yeah, our next episode with Patrick will 958 00:52:36,520 --> 00:52:39,080 Speaker 1: be about which coding language we should all be learning 959 00:52:39,120 --> 00:52:41,279 Speaker 1: in the future. Shall we leave it there? Let's leave 960 00:52:41,320 --> 00:52:43,920 Speaker 1: it there. This has been another episode of the Odd 961 00:52:43,960 --> 00:52:46,640 Speaker 1: Lots podcast. I'm Tracy Alloway. You can follow me on 962 00:52:46,680 --> 00:52:49,480 Speaker 1: Twitter at Tracy Alloway. And I'm Joe Wisenthal. You 963 00:52:49,520 --> 00:52:52,919 Speaker 1: can follow me on Twitter at The Stalwart. Follow our 964 00:52:52,960 --> 00:52:57,080 Speaker 1: guest Patrick McKenzie. He's at Patio eleven, and check out 965 00:52:57,160 --> 00:53:01,520 Speaker 1: his Bits about Money newsletter.
Follow our producers, Carmen Rodriguez 966 00:53:01,560 --> 00:53:05,160 Speaker 1: at Carmen Armand and Dash Bennett at Dashbot. And check 967 00:53:05,160 --> 00:53:09,000 Speaker 1: out all of our podcasts at Bloomberg under the handle At Podcasts, 968 00:53:09,200 --> 00:53:12,640 Speaker 1: and for more Odd Lots content, go to Bloomberg dot 969 00:53:12,640 --> 00:53:16,560 Speaker 1: com slash Odd Lots, where we post the transcripts, and Tracy 970 00:53:16,600 --> 00:53:18,520 Speaker 1: and I blog. We have a weekly newsletter that 971 00:53:18,600 --> 00:53:21,520 Speaker 1: comes out every Friday; go there and sign up. Thanks 972 00:53:21,600 --> 00:53:22,040 Speaker 1: for listening.