Joe: Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.
Tracy: And I'm Tracy Alloway.
Joe: So, Tracy, you know, we've had a number of episodes, not that many, but certainly a handful, where one of our themes is quantitative finance, quant stuff. We always talk about it from a very big theoretical level, like what's happening with different factors, alpha decay, stuff like that. It always tends to be at the big-picture abstraction level, I would say.
Tracy: Yeah, I think that's right. We talk a lot about the future of quantitative finance, and what's most important when it comes to running a successful quant strategy, whether it's the actual formula or program that you're using, or whether it's something like the data set that you're using. We've discussed all of those.
Joe: Yeah, absolutely. And you know, we've talked about it theoretically, we've talked about it academically, but one thing that we haven't really discussed is, when you think about quantitative finance, you think about lots of really smart people, mathematicians, people using computers to find data, to analyze huge amounts of data. We've never really talked about it from the bottom up: the actual process of using computers, using technology to perform all these calculations and to, in theory, beat the market.
Tracy: Yeah, I think that's right. And I'm going to go ahead and admit that I don't know that much about engineering in the financial industry. We can talk a lot about what gives a competitive edge when it comes to quant strategies, but a competitive edge when it comes to engineering, when it comes to installing particular products or systems? I really have no idea what that looks like nowadays.
Joe: No, I literally have no idea.
Joe: But it's this whole huge component of it, and every financial firm, whether it's a major bank or a fund, wants to advertise its tech stack, its tech advantage, so to speak. But I don't really know much about the actual process of building that tech and how you build a competitive edge from that tech. It's a whole facet of this that I don't think most people know very much about.
Tracy: Yeah, and I'm also interested in knowing whether the culture of being an engineer at a financial company is different to the culture of being, say, a trader or a banker at a financial company. And I'm definitely interested in knowing whether or not the culture of being an engineer at a financial company is different to being an engineer at, say, Google or Apple or something like that.
Joe: Right, I'm extremely curious about this as well, because again, it's one of these things where, in theory, more and more financial institutions would like to be competitive for talent against the likes of Google and Facebook and other startups and so on. But it raises the question of how similar or different they are, how the two compare.
Tracy: Yeah, absolutely. This is going to be a fun episode, I can tell.
Joe: Yeah, I'm super excited about this one. This is a conversation I've wanted to have for a long time. Today, our guest is Camille Fournier. She is a managing director, head of platform engineering at Two Sigma, which is a big financial services company. It has a hedge fund and also does VC stuff. A very well-known, famous firm in the space. Camille was previously the CTO at Rent the Runway, the popular fashion company, and I don't know how you describe it, but I guess people rent dresses, or rent clothes, on a temporary basis.
Joe: And prior to that, Camille was at Goldman Sachs, so she has seen all of this from both sides of the tech world. So Camille, thank you very much for joining us.
Camille: Yeah, thank you for having me, excited to be here.
Joe: Yeah, I'm really looking forward to this. So before we get going, why don't you give us the quick summary of your career? What was your path to being the head of platform engineering at Two Sigma?
Camille: Sure. I guess I have a fairly standard educational background. I have a computer science degree from Carnegie Mellon for my undergraduate degree. I worked at Microsoft for a hot second, went to graduate school at the University of Wisconsin, and decided I didn't want to get a PhD. I did want to live in New York City, and so I ended up getting a job at this company that I had literally never heard of until I interviewed there, called Goldman Sachs, which I'm sure will be very amusing to your listeners. I went to Goldman and was there for about six and a half years.
Camille: And I worked in risk technology for a long time, which was actually a really awesome area to work in, building a lot of really big scaled systems to do really massive risk analysis. And then I also worked in a similar kind of technology group to the one that I run today for a couple of years. And then around 2011, I decided that I wasn't quite ready to settle down and make a career at a giant company like Goldman, and I wanted to try something new. The startup world was really getting to be very hot, very popular, even in New York; there were lots of interesting startups. And I got approached by Rent the Runway. I had also not heard of Rent the Runway until I interviewed there, but of course they were a very small startup at the time. And I was just so blown away by the product idea.
Camille: It was one of those things where every time I described the product to another woman, she immediately understood what I was talking about. When it comes to consumer-facing startups, that's really such an important sign of a good idea: just how fast your potential customer gets it. So I was like, all right, this company seems like it has really smart business people. The founders were just incredibly smart, you know, Harvard MBAs, PhDs with operations research backgrounds on the data side. So there were really smart people on the leadership team, and the technology was a complete mess. And so I was like, great, I'm a great engineer and I want to become a leader, so this is a great opportunity for me to join this company, and fixing the tech will be easy, and this is going to be a slam dunk. And of course life never works out that way.
Camille: It was significantly more challenging to come into a startup that's in the growth stage and do a lot of work to really turn the technology around and make it stable and scalable and easy to add new stuff to, because that's really a lot of the challenge in a startup: how fast can you iterate, how fast can you do new things? And it was also just a massively challenging learning experience to learn how to become a manager and an executive and a leader. But I did that for four years. I became the CTO; I was there for about four years, and it was an amazing, amazing learning experience. After four years, I sort of felt like I had done what I came to do. I had come to learn a lot. I had come to establish myself a little bit as a leader in the industry. Frankly, I did want to have more of a public career than I was able to have at Goldman. At the time, Goldman was still very secretive.
Camille: They didn't really want you out there in public talking about your work, and I like talking about my work; I like sharing my ideas. So I learned a lot about being a leader, I grew the team, I stabilized the technology. And then I decided, well, being at a startup is exhausting, so I was like, all right, I'm going to take a break, I'm going to think about starting my own company. I actually wrote a book about engineering management in the year and a half that I was "doing various things," as I like to describe it. And then I was like, all right, me starting a startup by myself is not working out, because I don't have any great ideas, and maybe this is not the right thing for me to be doing right now. So I was like, okay, I need a job. And I got approached by Two Sigma.
Camille: You know, I had talked to a bunch of companies; I talked to Google, I talked to a bunch of smaller startups. And if I'll be honest, when I was approached by Two Sigma, my first reaction was, I don't know. I actually knew a lot of people who worked at Two Sigma already, and my impression was that the company was really academic, and I wasn't sure how well I would fit in in a super academic culture. But I went and interviewed, and I was actually blown away by just how smart and nice everyone was. And the role that they wanted me to come in for, running their platform engineering organization, was a really interesting role, really kind of getting back to my technical roots as a large systems developer. It's really the technology area that I'm most interested in personally.
Camille: And it was a great team. I'm part of the engineering leadership team at Two Sigma, and so I took the job, and here we are today. It's been really a great experience so far; I've been there for almost four years now.
Tracy: So you mentioned that when you moved from Goldman to Rent the Runway, which was a startup at the time and has really, I've got to say, I have used Rent the Runway, and it's blown up a lot since then, and pretty much all of the women that you might ask about this probably know about it. But when you moved from Goldman to Rent the Runway, you mentioned that it was more challenging than you had expected. Can you go into a little bit more detail on why that was, and what the difference was, whether it was the technology that you were actually working on at Goldman versus the technology at Rent the Runway, or maybe the differences in the ways the companies actually operated and did things?
Camille: There were a few things.
Camille: So first of all, at Goldman, certainly at the time and in the areas that I was in, I was just not used to working in an environment that combined needing to move really, really fast, needing to build new stuff really quickly (there are areas of Goldman that are like this, but that wasn't really where I was working), with having really high uptime requirements. I think one of the interesting things about the difference between a lot of parts of finance and consumer technology is that it is still possible in a lot of parts of finance to work on systems that actually don't have to be up 24/7, meaning that they don't run on the weekend sometimes, right, because the markets are closed, so the system doesn't actually need to be running, for example. That is not the case with consumer technology. The expectation is, if you're building a product, a website or an app or whatever that consumers are going to use, then it's going to be available when they want to use it.
Camille: And that is actually a really big technical difference, because it really does mean that if something goes wrong at two o'clock in the morning on a Sunday, someone has to deal with it. You can't just leave it until sometime when you're more wide awake and say, all right, we've got to get this ready for the business to open on Monday morning, but we've got some time, which is often the case in finance. That's really not the case in consumer. And so I think that is actually a really big, different kind of challenge that I wasn't totally thinking about when I came in. So I think that was a big challenge.
Camille: And then there are just a lot of technology pieces. When you're building for consumers, particularly, frankly, in fashion, things need to look and feel good. In the technology, there's a sort of spit and polish, and branding, to everything that you're building, whereas a lot of times in finance, you're building an interface for a trader or a risk analyst or whatever, and it just needs to work; it doesn't need to be beautiful. So you really have to think much, much more about the customer, who isn't sitting at the desk next to you telling you exactly what they need, because the customer is thousands and tens of thousands of women all over the country who each have kind of different preferences. So how do you figure out what to build for them? How do you build something great without necessarily having the person telling you exactly what it is that they want built?
Camille: Those are some of the challenges just on the technical side. It was really a very different kind of thing: some of the technical challenges were on availability, but the actual way you decide what to build is really different too. And then, look, big company to small company: you don't even notice the things that are provided for you in big companies, like HR and training and IT and all of that stuff that's just kind of there and working and has been established over years and years. You have to establish it yourself at a small company, and that's fun, but it's a lot of work.
Joe: So, you mentioned that between Rent the Runway and your current job, which we'll get to soon, you did write this book, and I meant to plug it in the beginning, and I'll plug it at the end: "The Manager's Path: A Guide for Tech Leaders Navigating Growth and Change."
Joe: And so I have to imagine, coming to Rent the Runway when it was a small startup and then leaving when it was this household name, that your role evolved a lot during that time. My assumption would be that in the beginning you were doing a lot more hands-on technical stuff, and by the end, my guess would be, a lot more management. And per your book, I imagine that's a really tough progression for a lot of people. Engineering and management are very different jobs. It's kind of like, in our industry, being a journalist and, sorry, being a reporter and being an editor: they're very different jobs, even though a lot of people go on that trajectory. Now, I guess this is a book-length topic, but for you, what are the big pain points that people experience, or that you experienced, on that journey to actually managing engineers as opposed to building technology directly?
Camille: Yeah, you're absolutely right that it is a book-length topic.
Camille: But I think some of the highlights are: in the early days, you can still sort of manage and write code, or be kind of somewhat hands-on, for a while. But as your team grows, you have to consciously ease off and pick out what you're going to work on if you're going to be hands-on. And then at some point you just can't be hands-on anymore. You cannot really be writing production-quality software if you're in meetings six to eight hours a day, because you're not going to be focused, and then maybe you're writing that code at night after your day is done. But in engineering teams, you don't just write code and then it's done. You write code, someone has to review it and tell you whether it's good or not, you've got to make sure it actually works, you've got to actually get it running and out to customers, customers have to actually be using it, and if it breaks, it has to be fixed.
Camille: And so when managers who are really managing very large teams write code, they may be able to do the writing part, but they often can't do the rest of those parts. And that's actually really frustrating as an engineer. I've been an engineer with managers who have thrown code at me that way, and it's really frustrating to experience that. So I really strongly advise managers, once they get to a certain point, to stop doing that. Stop writing code; that's not really your job. You can do it as a hobby on the weekends, but don't do it for your actual team, because they'll probably get mad at you. But that transition is so painful, because as engineers, we're taught for years and years and years that our value is in the software that we create. That is what we are rewarded for: writing good software, building good systems, being clever, fixing hard bugs, and being smart about that. And it really feels very scary to stop doing it.
Camille: Also because everyone tells you, you know, tech is a very ageist industry, right? And there's this great sense of, if I stop writing code, I'm immediately on the path to obsolescence, my technical knowledge is going to fade, and I'm not going to be taken seriously by the people that I'm managing anymore. So you've got all these cultural messages that say your value is in software, but you can't really do your job well if you're writing a lot of software as a manager of a large team, and you have to kind of get over that and figure out how to feel like you're still staying technically in the loop and credible without writing code being your main job. So I think that was a huge transition for me, and a huge transition for a lot of engineers. I think that's a huge one.
And then 342 00:18:50,520 --> 00:18:51,720 Speaker 1: I think there are a lot of these. 343 00:18:51,720 --> 00:18:54,240 Speaker 1: Like, as your team gets larger, you also have to 344 00:18:54,960 --> 00:18:59,400 Speaker 1: develop the skill of helping your 345 00:18:59,480 --> 00:19:04,400 Speaker 1: organization make good technical decisions without knowing every single detail 346 00:19:04,440 --> 00:19:06,760 Speaker 1: of what they're doing. And that's 347 00:19:06,800 --> 00:19:09,600 Speaker 1: sort of a similar challenge, but a little bit different. Right? 348 00:19:09,600 --> 00:19:12,119 Speaker 1: So when you're actually writing the code, reading the code, 349 00:19:12,520 --> 00:19:15,199 Speaker 1: very much in the details of the work, 350 00:19:15,280 --> 00:19:17,240 Speaker 1: you're probably an expert; if you're 351 00:19:17,280 --> 00:19:19,680 Speaker 1: getting into management, you've been doing this for a long time. 352 00:19:20,080 --> 00:19:24,040 Speaker 1: Generally speaking, people who get promoted into management 353 00:19:24,480 --> 00:19:27,960 Speaker 1: do tend to be viewed as technically credible by their teams. 354 00:19:28,160 --> 00:19:30,359 Speaker 1: I don't think you should always just promote 355 00:19:30,359 --> 00:19:32,840 Speaker 1: the best engineers as managers because they're the best engineers; 356 00:19:32,880 --> 00:19:35,840 Speaker 1: they might actually like writing code.
But sometimes, 357 00:19:35,960 --> 00:19:38,080 Speaker 1: right? You do want to promote 358 00:19:38,080 --> 00:19:42,159 Speaker 1: people into management who are technically credible and 359 00:19:42,240 --> 00:19:44,280 Speaker 1: have good judgment, because part of your job 360 00:19:44,280 --> 00:19:46,879 Speaker 1: as an engineering manager is helping your team 361 00:19:46,960 --> 00:19:52,320 Speaker 1: make good decisions. But you definitely need to 362 00:19:53,440 --> 00:19:56,399 Speaker 1: still be able to help them make those good decisions 363 00:19:56,440 --> 00:19:58,240 Speaker 1: even as the team gets larger and larger and you 364 00:19:58,280 --> 00:20:01,479 Speaker 1: get farther away from the details of the problems, and 365 00:20:01,520 --> 00:20:04,760 Speaker 1: so I think that's the other really challenging transition that 366 00:20:04,840 --> 00:20:07,320 Speaker 1: a lot of managers go through and that I certainly 367 00:20:07,320 --> 00:20:11,119 Speaker 1: went through, which was: how 368 00:20:11,160 --> 00:20:15,120 Speaker 1: do you get the right information and get a sense 369 00:20:15,680 --> 00:20:19,479 Speaker 1: of what's going on in a team that's maybe, you know, 370 00:20:20,320 --> 00:20:24,280 Speaker 1: you're managing the manager of their manager, 371 00:20:24,359 --> 00:20:27,560 Speaker 1: or whatever; you're several management 372 00:20:27,640 --> 00:20:32,360 Speaker 1: layers away from it, but you kind of need to know, oh, 373 00:20:32,520 --> 00:20:35,200 Speaker 1: something's going wrong here, because if it goes too wrong, 374 00:20:35,480 --> 00:20:38,120 Speaker 1: it could affect the ability for your whole 375 00:20:38,240 --> 00:20:40,800 Speaker 1: organization to get stuff done right, or 376 00:20:40,840 --> 00:20:43,280 Speaker 1: it's
gonna make you look bad because something is late 377 00:20:43,400 --> 00:20:46,080 Speaker 1: or it's broken, or it's constantly falling over. 378 00:20:46,359 --> 00:20:48,240 Speaker 1: And so you have to still be able to influence 379 00:20:48,280 --> 00:20:52,240 Speaker 1: their technical decisions, but without actually looking through their 380 00:20:52,240 --> 00:20:54,800 Speaker 1: code and reading it and knowing all the details 381 00:20:54,800 --> 00:20:57,160 Speaker 1: of their systems' designs. So how do you do that? 382 00:20:57,760 --> 00:21:00,040 Speaker 1: And I think that is a similar challenge that 383 00:21:00,160 --> 00:21:02,199 Speaker 1: a lot of people go through, and certainly one 384 00:21:02,320 --> 00:21:03,919 Speaker 1: that I went through. And so those are 385 00:21:03,960 --> 00:21:06,359 Speaker 1: some of the challenges that I talk about in my book, 386 00:21:06,400 --> 00:21:10,320 Speaker 1: among many others. So I may be projecting here, but 387 00:21:10,440 --> 00:21:14,960 Speaker 1: I see a lot of parallels between the engineering 388 00:21:15,000 --> 00:21:18,159 Speaker 1: profession and how it... sorry, let me rephrase that. 389 00:21:18,600 --> 00:21:20,960 Speaker 1: So I may be projecting a little bit here, but 390 00:21:21,480 --> 00:21:24,560 Speaker 1: I feel like there are quite a few parallels between 391 00:21:25,000 --> 00:21:28,880 Speaker 1: journalists who transition from reporters to editors or some other 392 00:21:28,960 --> 00:21:33,480 Speaker 1: sort of management job in journalism and engineers who do 393 00:21:33,600 --> 00:21:36,960 Speaker 1: something similar.
You know, journalists are judged by their output, 394 00:21:37,040 --> 00:21:39,840 Speaker 1: the quality and maybe sometimes the quantity of the stories 395 00:21:39,880 --> 00:21:42,680 Speaker 1: they produce, and once you move into an editing role 396 00:21:42,800 --> 00:21:46,320 Speaker 1: or a management role, it's hard to come up 397 00:21:46,359 --> 00:21:49,720 Speaker 1: with new ways of judging your performance. But the other 398 00:21:49,760 --> 00:21:53,960 Speaker 1: thing that tends to happen in journalism is that journalists, in particular 399 00:21:54,040 --> 00:22:00,760 Speaker 1: reporters, are competitive people who often have an anti-authority streak, 400 00:22:00,880 --> 00:22:04,840 Speaker 1: like they like calling out authority on the decisions that 401 00:22:04,840 --> 00:22:07,840 Speaker 1: they're making, and therefore sometimes they do not make the 402 00:22:07,920 --> 00:22:11,879 Speaker 1: greatest managers. So one thing I'm wondering is, in tech 403 00:22:12,000 --> 00:22:13,840 Speaker 1: it's kind of similar, right? You have a lot of 404 00:22:13,880 --> 00:22:17,560 Speaker 1: competitive people who want to move fast and break things. 405 00:22:17,720 --> 00:22:20,440 Speaker 1: And I think, at least for a long time, 406 00:22:20,480 --> 00:22:23,680 Speaker 1: there was this culture of management being bad, and that 407 00:22:24,000 --> 00:22:26,240 Speaker 1: you should have fewer managers; you should just hire good 408 00:22:26,240 --> 00:22:29,560 Speaker 1: people and let them do their own thing. How did 409 00:22:29,560 --> 00:22:31,920 Speaker 1: you deal with that aspect of it? And 410 00:22:31,960 --> 00:22:36,320 Speaker 1: do you think that managers are more widely accepted 411 00:22:36,520 --> 00:22:40,600 Speaker 1: in tech as a lot of the players get bigger? Oh, yes, 412 00:22:40,760 --> 00:22:44,159 Speaker 1: this is a very, very interesting area.
413 00:22:44,200 --> 00:22:47,600 Speaker 1: One of my friends and I sometimes refer to 414 00:22:47,640 --> 00:22:51,600 Speaker 1: ourselves as punk managers, because we're both 415 00:22:51,640 --> 00:22:54,959 Speaker 1: a little bit punks in various ways, you know. 416 00:22:55,240 --> 00:22:58,000 Speaker 1: And absolutely, I actually think tech is 417 00:22:58,000 --> 00:23:01,960 Speaker 1: sort of interesting. There are definitely the sort of anti- 418 00:23:02,040 --> 00:23:07,159 Speaker 1: authority types, of which I'm sort of one in some ways, 419 00:23:07,400 --> 00:23:10,560 Speaker 1: right? I mean, part of the 420 00:23:10,600 --> 00:23:14,400 Speaker 1: reason that I ended up getting into management is that 421 00:23:14,680 --> 00:23:16,880 Speaker 1: I looked around and I was like, look, I think 422 00:23:16,880 --> 00:23:21,440 Speaker 1: I can make better decisions than other people. I think 423 00:23:21,480 --> 00:23:23,240 Speaker 1: I make 424 00:23:23,280 --> 00:23:25,760 Speaker 1: good decisions. I think I make better decisions on the whole. 425 00:23:26,160 --> 00:23:28,920 Speaker 1: I think I'm going to do a better 426 00:23:29,040 --> 00:23:33,040 Speaker 1: job of treating people well and creating 427 00:23:33,040 --> 00:23:35,879 Speaker 1: healthy organizations. But I mean also, 428 00:23:36,160 --> 00:23:37,760 Speaker 1: I think we're going to be more successful 429 00:23:37,840 --> 00:23:40,680 Speaker 1: under me, because I've got what it takes, right? 430 00:23:41,080 --> 00:23:43,160 Speaker 1: I do think there is some of that. But there 431 00:23:43,200 --> 00:23:47,479 Speaker 1: are actually plenty of people in tech who are 432 00:23:47,720 --> 00:23:52,560 Speaker 1: very rule-abiding and thoughtful.
I mean, look, part of 433 00:23:52,760 --> 00:23:55,840 Speaker 1: being an engineer, being a really good engineer, 434 00:23:55,880 --> 00:23:59,439 Speaker 1: is actually learning the rules of the systems really well. Right? 435 00:23:59,480 --> 00:24:03,640 Speaker 1: You're sort of writing very, very precise rules 436 00:24:03,880 --> 00:24:06,600 Speaker 1: into software as you write software, right? 437 00:24:06,880 --> 00:24:09,640 Speaker 1: And so there are actually plenty of people in tech 438 00:24:09,640 --> 00:24:12,760 Speaker 1: who I think are rule-abiding 439 00:24:12,800 --> 00:24:16,560 Speaker 1: and quite willing to, I don't know, 440 00:24:16,440 --> 00:24:19,560 Speaker 1: you know, they're definitely not anti-authority necessarily. 441 00:24:19,600 --> 00:24:23,760 Speaker 1: But you're absolutely right that for a long time, 442 00:24:23,800 --> 00:24:25,480 Speaker 1: back when I started at Rent the Runway, this was 443 00:24:25,520 --> 00:24:28,360 Speaker 1: really very popular, the idea that managers are 444 00:24:28,359 --> 00:24:31,400 Speaker 1: a waste. You know, hire smart 445 00:24:31,440 --> 00:24:33,360 Speaker 1: people and get out of their way and let them, 446 00:24:33,400 --> 00:24:36,040 Speaker 1: and they'll just figure it out. 447 00:24:36,160 --> 00:24:39,880 Speaker 1: Managers are deadweight, right? Both engineers and VCs would 448 00:24:39,920 --> 00:24:42,480 Speaker 1: say this, right? Frankly, part of the reason I started 449 00:24:42,600 --> 00:24:45,720 Speaker 1: blogging about management and ended up writing the book was that
450 00:24:45,800 --> 00:24:50,120 Speaker 1: I just thought that was so ridiculous, because it's 451 00:24:50,160 --> 00:24:55,399 Speaker 1: so obvious to me that, look, modern engineering is 452 00:24:55,480 --> 00:24:59,720 Speaker 1: building very complex systems that require large teams of people 453 00:24:59,760 --> 00:25:02,760 Speaker 1: to build and maintain and evolve. It is a 454 00:25:02,760 --> 00:25:04,760 Speaker 1: team sport, is what a lot of people say, right? 455 00:25:04,800 --> 00:25:06,840 Speaker 1: And I think that's true. 456 00:25:06,920 --> 00:25:11,800 Speaker 1: That is a somewhat more modern evolution of 457 00:25:11,840 --> 00:25:15,280 Speaker 1: software engineering, right? Thirty years ago, in the 458 00:25:15,280 --> 00:25:19,160 Speaker 1: early days of software engineering, lone wolves could do more 459 00:25:19,320 --> 00:25:22,800 Speaker 1: by themselves. But as systems have grown and grown 460 00:25:22,800 --> 00:25:25,520 Speaker 1: in complexity and there are more components and more things that 461 00:25:25,560 --> 00:25:28,560 Speaker 1: need to be done, it's just much harder to build 462 00:25:29,080 --> 00:25:32,919 Speaker 1: a really interesting and compelling product as 463 00:25:32,960 --> 00:25:35,000 Speaker 1: a lone wolf or as a team of sort 464 00:25:35,000 --> 00:25:37,680 Speaker 1: of individuals just working on their own stuff. And so 465 00:25:37,800 --> 00:25:40,520 Speaker 1: I do think that when you need that 466 00:25:40,680 --> 00:25:45,439 Speaker 1: team-based coordination to go really well, you do 467 00:25:45,680 --> 00:25:50,119 Speaker 1: need capable management to help with that.
You know, you 468 00:25:50,160 --> 00:25:53,240 Speaker 1: need the coach. You need the person who is 469 00:25:53,320 --> 00:25:56,480 Speaker 1: observing everything that's happening and making sure that you're covering 470 00:25:57,000 --> 00:25:59,240 Speaker 1: all the pieces that need to be covered, making sure 471 00:25:59,320 --> 00:26:02,040 Speaker 1: that the group works well together, because when 472 00:26:02,040 --> 00:26:04,960 Speaker 1: the group isn't working well together, they're not as productive, 473 00:26:05,000 --> 00:26:07,040 Speaker 1: they're not as happy. You get a lot 474 00:26:07,040 --> 00:26:08,640 Speaker 1: of conflict, and you're 475 00:26:08,680 --> 00:26:11,280 Speaker 1: just getting bad results out of that, right? 476 00:26:11,359 --> 00:26:14,040 Speaker 1: You're not going to get the greatest creativity out of 477 00:26:14,080 --> 00:26:16,919 Speaker 1: the overall group when they're not 478 00:26:17,080 --> 00:26:20,240 Speaker 1: collaborating well. And so I do think that 479 00:26:20,600 --> 00:26:24,119 Speaker 1: there was, in my opinion, frankly, 480 00:26:24,200 --> 00:26:28,000 Speaker 1: kind of a legacy attitude about management 481 00:26:28,440 --> 00:26:30,879 Speaker 1: that, for the most part, has gone away, 482 00:26:30,960 --> 00:26:34,600 Speaker 1: although I'm always waiting for the backlash to happen. Inevitably, 483 00:26:34,600 --> 00:26:36,199 Speaker 1: there will be a backlash and people will be like, 484 00:26:36,200 --> 00:26:39,119 Speaker 1: oh my god, we have too many managers and everyone 485 00:26:39,160 --> 00:26:41,600 Speaker 1: wants to be a manager, and we're wasting 486 00:26:41,640 --> 00:26:44,320 Speaker 1: all this overhead on management, you know.
So I'm just 487 00:26:44,480 --> 00:26:46,399 Speaker 1: sort of waiting for that to drop, personally. But 488 00:26:46,480 --> 00:26:50,000 Speaker 1: I do think 489 00:26:50,000 --> 00:26:53,800 Speaker 1: that management became more accepted even at startups in tech 490 00:26:54,359 --> 00:26:57,600 Speaker 1: because a lot of people were suffering from the lack 491 00:26:57,640 --> 00:27:00,480 Speaker 1: of it. A lot of teams are just not efficient, 492 00:27:00,680 --> 00:27:04,199 Speaker 1: even small teams at startups. It's really hard to 493 00:27:04,400 --> 00:27:07,320 Speaker 1: run an efficient team, it's really hard to build a 494 00:27:07,440 --> 00:27:12,080 Speaker 1: multi-person product group effort, if you don't have leadership. 495 00:27:12,160 --> 00:27:15,119 Speaker 1: You need someone who's willing 496 00:27:15,160 --> 00:27:18,480 Speaker 1: to do that, both the good and 497 00:27:18,520 --> 00:27:20,640 Speaker 1: the bad work of it, right? The fun work, 498 00:27:20,680 --> 00:27:23,080 Speaker 1: maybe, is sometimes you get 499 00:27:23,080 --> 00:27:26,000 Speaker 1: to be the final decision maker on some stuff, maybe, right?
500 00:27:26,040 --> 00:27:28,399 Speaker 1: But the grunt work is, actually, you have to 501 00:27:28,480 --> 00:27:30,719 Speaker 1: influence a lot of people, and 502 00:27:30,720 --> 00:27:32,280 Speaker 1: you have to deal with a lot of conflict, and 503 00:27:32,320 --> 00:27:34,520 Speaker 1: you have to have hard conversations. And so 504 00:27:35,280 --> 00:27:37,600 Speaker 1: I do think that management has become more 505 00:27:37,600 --> 00:27:40,440 Speaker 1: popular in tech, not just because the big tech companies 506 00:27:40,560 --> 00:27:45,240 Speaker 1: have grown up, but actually because everyone has just realized that 507 00:27:45,280 --> 00:27:48,960 Speaker 1: it makes their teams happier and more effective. So 508 00:27:49,400 --> 00:27:53,000 Speaker 1: let's skip ahead to your current role. 509 00:27:53,240 --> 00:27:57,760 Speaker 1: We introduced you as Managing Director, Head of Platform Engineering at 510 00:27:57,840 --> 00:28:03,440 Speaker 1: Two Sigma. What does the Head of Platform Engineering at Two Sigma do? Yeah, 511 00:28:03,520 --> 00:28:09,720 Speaker 1: so I run the team that builds the software systems 512 00:28:09,760 --> 00:28:13,800 Speaker 1: that anyone at Two Sigma who writes software themselves will use. 513 00:28:14,160 --> 00:28:17,640 Speaker 1: So that could be other engineers or modelers. We 514 00:28:17,720 --> 00:28:22,080 Speaker 1: build things like our data storage products, 515 00:28:22,200 --> 00:28:25,760 Speaker 1: we do all of our public cloud work, 516 00:28:26,119 --> 00:28:30,880 Speaker 1: we build tools for software developers 517 00:28:30,920 --> 00:28:35,159 Speaker 1: to test their code and execute their code, 518 00:28:35,600 --> 00:28:37,800 Speaker 1: we build frameworks.
So really, it's all 519 00:28:37,880 --> 00:28:41,360 Speaker 1: of the software that other people who 520 00:28:41,400 --> 00:28:46,360 Speaker 1: write software use; the foundational software is built 521 00:28:46,360 --> 00:28:51,240 Speaker 1: by my team. How different is an engineer at a 522 00:28:51,560 --> 00:28:55,840 Speaker 1: financial services firm such as Two Sigma from an engineer 523 00:28:56,280 --> 00:29:00,840 Speaker 1: at a tech company or some sort of startup? 524 00:29:00,920 --> 00:29:04,040 Speaker 1: Like, does it take a different skill set or does 525 00:29:04,080 --> 00:29:08,640 Speaker 1: it take a different personality? I think that depends a 526 00:29:08,680 --> 00:29:12,200 Speaker 1: little bit. So, in my team, we 527 00:29:12,640 --> 00:29:14,680 Speaker 1: actually have a lot of people, I mean, Two Sigma 528 00:29:14,720 --> 00:29:17,239 Speaker 1: in general has much of its staff coming not 529 00:29:17,400 --> 00:29:20,880 Speaker 1: from financial-company backgrounds, right? So 530 00:29:20,960 --> 00:29:22,920 Speaker 1: Two Sigma in general is a 531 00:29:23,120 --> 00:29:28,160 Speaker 1: very diverse company in terms of former employer backgrounds. 532 00:29:28,280 --> 00:29:31,040 Speaker 1: And my team is actually very 533 00:29:31,080 --> 00:29:33,720 Speaker 1: heavily populated by people who have worked in the tech 534 00:29:33,760 --> 00:29:37,920 Speaker 1: industry before coming to Two Sigma, because a lot of 535 00:29:37,920 --> 00:29:40,440 Speaker 1: the products that we build are 536 00:29:40,600 --> 00:29:43,720 Speaker 1: very tech-heavy, the team that I run.
You would 537 00:29:43,760 --> 00:29:47,320 Speaker 1: see similar kinds of teams at all kinds of big 538 00:29:47,360 --> 00:29:49,840 Speaker 1: tech companies; they would be much, much larger than 539 00:29:49,880 --> 00:29:52,000 Speaker 1: the team that I run. But it's 540 00:29:52,040 --> 00:29:56,640 Speaker 1: very transferable work. So I 541 00:29:56,640 --> 00:30:00,640 Speaker 1: think, about some of the differences that I've 542 00:30:00,760 --> 00:30:08,200 Speaker 1: noticed between engineers at startups versus somewhat larger 543 00:30:08,640 --> 00:30:13,760 Speaker 1: tech or finance companies: I do think that 544 00:30:14,400 --> 00:30:18,320 Speaker 1: as you get larger, 545 00:30:18,600 --> 00:30:21,840 Speaker 1: you can hire more specialists and 546 00:30:21,880 --> 00:30:25,080 Speaker 1: more depth experts. And so I do think that what 547 00:30:25,200 --> 00:30:27,400 Speaker 1: you'll see if you look at a startup is you're 548 00:30:27,400 --> 00:30:30,720 Speaker 1: gonna see a lot more generalists, and you'll see engineers 549 00:30:30,760 --> 00:30:34,640 Speaker 1: who are not necessarily the best engineers in terms of 550 00:30:34,680 --> 00:30:38,720 Speaker 1: real deep expertise, where 551 00:30:38,800 --> 00:30:43,160 Speaker 1: they know everything about this kind of deep 552 00:30:43,200 --> 00:30:46,240 Speaker 1: technical system or whatever, right? 553 00:30:46,400 --> 00:30:49,320 Speaker 1: The skills that startups tend to value are 554 00:30:49,440 --> 00:30:51,840 Speaker 1: much more about people who can just say, all right, 555 00:30:51,920 --> 00:30:54,880 Speaker 1: how quickly can I get through this problem and 556 00:30:54,920 --> 00:30:59,160 Speaker 1: onto the next thing?
And that is 557 00:30:59,360 --> 00:31:03,120 Speaker 1: valued at big companies to some extent, but 558 00:31:03,160 --> 00:31:05,680 Speaker 1: I think big companies tend to have more time to 559 00:31:05,840 --> 00:31:09,320 Speaker 1: spend on solving problems, and so they'd rather spend the 560 00:31:09,320 --> 00:31:12,760 Speaker 1: time to solve a problem right, or more right, than 561 00:31:12,960 --> 00:31:15,040 Speaker 1: to just get something out there 562 00:31:15,080 --> 00:31:17,560 Speaker 1: and move on to the next thing. So I think 563 00:31:17,760 --> 00:31:20,000 Speaker 1: that is a difference. I also actually think one 564 00:31:20,040 --> 00:31:23,280 Speaker 1: of the things 565 00:31:23,280 --> 00:31:25,200 Speaker 1: that I learned at Rent the Runway and that I've 566 00:31:25,200 --> 00:31:28,440 Speaker 1: brought to my team at Two Sigma is this: 567 00:31:28,480 --> 00:31:30,160 Speaker 1: as I mentioned a little earlier, 568 00:31:30,200 --> 00:31:31,400 Speaker 1: one of the big things that you have to do 569 00:31:31,440 --> 00:31:33,720 Speaker 1: to be successful at a startup, particularly a consumer- 570 00:31:33,760 --> 00:31:37,000 Speaker 1: facing startup like Rent the Runway, is really understand and 571 00:31:37,040 --> 00:31:39,600 Speaker 1: figure out what your customer is gonna want, and 572 00:31:39,680 --> 00:31:42,280 Speaker 1: figure out how to build things and how to know 573 00:31:42,520 --> 00:31:45,080 Speaker 1: whether the feature or the product that you're 574 00:31:45,120 --> 00:31:47,800 Speaker 1: building is actually something that people want. And that 575 00:31:47,880 --> 00:31:52,480 Speaker 1: sort of product management skill set and knowledge, even 576 00:31:52,520 --> 00:31:54,360 Speaker 1: engineers have to have it.
So there is a 577 00:31:54,400 --> 00:31:57,440 Speaker 1: formal role called product manager that exists at 578 00:31:57,480 --> 00:32:00,080 Speaker 1: most startups and most tech companies, and Two Sigma has these 579 00:32:00,120 --> 00:32:02,400 Speaker 1: folks as well. But when you're at 580 00:32:02,440 --> 00:32:04,880 Speaker 1: a startup, even the engineers are kind of expected to 581 00:32:05,200 --> 00:32:10,520 Speaker 1: understand and have that customer empathy, and 582 00:32:10,520 --> 00:32:13,280 Speaker 1: that knowledge of how to think about 583 00:32:13,320 --> 00:32:16,320 Speaker 1: that has actually been really useful for me to bring 584 00:32:16,800 --> 00:32:20,080 Speaker 1: to Two Sigma, because we have customers for 585 00:32:20,200 --> 00:32:22,560 Speaker 1: my team; they're other people at Two Sigma, 586 00:32:22,600 --> 00:32:24,960 Speaker 1: but we don't have, 587 00:32:25,440 --> 00:32:27,640 Speaker 1: you know, a trading desk telling us to build X, 588 00:32:27,760 --> 00:32:30,080 Speaker 1: Y or Z.
Right, we've got hundreds of 589 00:32:30,160 --> 00:32:32,800 Speaker 1: people that use our software, and they all have slightly 590 00:32:32,800 --> 00:32:36,240 Speaker 1: different needs, and so it's really important that my team 591 00:32:36,880 --> 00:32:40,000 Speaker 1: be able to hear lots of different pieces of 592 00:32:40,000 --> 00:32:44,280 Speaker 1: feedback and sort of generalize that into: okay, where do 593 00:32:44,320 --> 00:32:46,920 Speaker 1: we actually need to go with the products that we're building, 594 00:32:47,280 --> 00:32:50,320 Speaker 1: so that we're building something that's great for other people 595 00:32:50,360 --> 00:32:52,320 Speaker 1: to use, it's easy for them to use, it makes 596 00:32:52,360 --> 00:32:56,040 Speaker 1: them productive, and it kind of anticipates the future needs 597 00:32:56,080 --> 00:32:58,920 Speaker 1: of this business. So yeah, that's 598 00:32:58,960 --> 00:33:01,080 Speaker 1: sort of where I was gonna go next. I mean, 599 00:33:01,320 --> 00:33:05,080 Speaker 1: when you describe figuring out the needs of Rent the Runway, 600 00:33:05,080 --> 00:33:07,760 Speaker 1: you're talking about tens of thousands of women around the 601 00:33:07,840 --> 00:33:11,520 Speaker 1: country and how they might want to search and have 602 00:33:11,760 --> 00:33:15,360 Speaker 1: an article of clothing shipped to them. Describe a little 603 00:33:15,360 --> 00:33:18,960 Speaker 1: bit more that process when it's internal, when it's at 604 00:33:18,960 --> 00:33:24,040 Speaker 1: a financial services company. What are the types of things 605 00:33:24,320 --> 00:33:28,560 Speaker 1: that your clients, your customers, your internal customers are 606 00:33:28,640 --> 00:33:31,760 Speaker 1: looking for.
What do they need, what kind of problems 607 00:33:31,800 --> 00:33:34,520 Speaker 1: are they trying to solve, and what are they looking 608 00:33:34,560 --> 00:33:37,320 Speaker 1: to you specifically to build for them, or to help 609 00:33:37,360 --> 00:33:42,200 Speaker 1: them build, so that they can do their job more effectively? Yeah. So, 610 00:33:42,240 --> 00:33:44,840 Speaker 1: I mean, look, one of the big things that 611 00:33:44,880 --> 00:33:48,320 Speaker 1: they're always going to need is just the demand 612 00:33:48,680 --> 00:33:54,160 Speaker 1: for more data and 613 00:33:54,320 --> 00:33:59,520 Speaker 1: more computational power. And so my team, 614 00:33:59,640 --> 00:34:02,360 Speaker 1: my team doesn't do the actual data onboarding, right? 615 00:34:02,360 --> 00:34:04,760 Speaker 1: There's a data engineering team that's really responsible 616 00:34:04,760 --> 00:34:06,520 Speaker 1: for that part of things, but my team owns the 617 00:34:06,560 --> 00:34:09,880 Speaker 1: storage systems, and recently we've 618 00:34:09,920 --> 00:34:12,440 Speaker 1: been serving like fifteen petabytes a day of data. 619 00:34:12,520 --> 00:34:15,160 Speaker 1: So just to contextualize that for your listeners: if 620 00:34:15,160 --> 00:34:18,600 Speaker 1: you've got an iPhone with two fifty-six gigs of storage, 621 00:34:18,880 --> 00:34:22,080 Speaker 1: that's like sixty thousand iPhones' worth of storage that we're 622 00:34:22,080 --> 00:34:24,680 Speaker 1: serving in a day. So 623 00:34:24,920 --> 00:34:27,439 Speaker 1: we have these really massive amounts of data, and there's 624 00:34:27,560 --> 00:34:31,040 Speaker 1: always demand for: I want more data, I 625 00:34:31,080 --> 00:34:33,200 Speaker 1: want it faster, I need it.
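[Editor's note: her back-of-the-envelope comparison checks out. A minimal sketch of the arithmetic, using decimal units and the round numbers from the conversation (fifteen petabytes a day, a 256 GB iPhone):

```python
# Rough check of "fifteen petabytes a day is about sixty thousand
# 256 GB iPhones" -- decimal units, round numbers from the conversation.
PB = 10**15  # bytes in a petabyte
GB = 10**9   # bytes in a gigabyte

served_per_day = 15 * PB   # data served per day
iphone = 256 * GB          # one iPhone's storage

print(round(served_per_day / iphone))  # 58594 -- roughly sixty thousand
```

So "sixty thousand iPhones" is, if anything, a slightly conservative round-up of about 58,600.]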
You know, I need 626 00:34:33,239 --> 00:34:35,960 Speaker 1: to be able to access this data in particular ways, 627 00:34:36,000 --> 00:34:38,320 Speaker 1: because I'm looking across a time series 628 00:34:38,360 --> 00:34:41,200 Speaker 1: of data and trying to detect changes, or 629 00:34:41,200 --> 00:34:44,160 Speaker 1: what have you. And as 630 00:34:44,440 --> 00:34:48,000 Speaker 1: new kinds of machine learning algorithms are adopted, 631 00:34:48,160 --> 00:34:52,120 Speaker 1: they have different kinds of data requirements and data- 632 00:34:52,160 --> 00:34:56,000 Speaker 1: serving requirements that my team will need to be able 633 00:34:56,040 --> 00:34:58,600 Speaker 1: to keep up with. Right? And so 634 00:34:58,680 --> 00:35:01,719 Speaker 1: I think there's always that demand for bigger and 635 00:35:01,920 --> 00:35:05,400 Speaker 1: faster. And 636 00:35:05,400 --> 00:35:09,040 Speaker 1: then on the computational capacity side, again, 637 00:35:09,320 --> 00:35:12,160 Speaker 1: as we get more data, we want to process 638 00:35:12,200 --> 00:35:14,400 Speaker 1: it faster. We want to run lots and 639 00:35:14,400 --> 00:35:17,279 Speaker 1: lots and lots of experiments. We have 640 00:35:17,360 --> 00:35:20,200 Speaker 1: workloads that peak at like half a million cores of processing. 641 00:35:20,239 --> 00:35:22,560 Speaker 1: So again, your laptop might have 642 00:35:22,800 --> 00:35:25,280 Speaker 1: sixteen cores if you've got a really hefty laptop, 643 00:35:25,360 --> 00:35:30,080 Speaker 1: so these are huge volumes of computational processing, and 644 00:35:30,440 --> 00:35:35,040 Speaker 1: to do that requires pretty complex systems under the covers 645 00:35:35,320 --> 00:35:38,640 Speaker 1: to actually coordinate all of that work.
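[Editor's note: in miniature, the kind of coordination she describes, fanning a batch of independent experiments out across a pool of workers and gathering the results, can be sketched with Python's standard library. The `run_experiment` function here is purely illustrative; the real schedulers coordinating half a million cores are vastly more elaborate:

```python
from concurrent.futures import ProcessPoolExecutor

def run_experiment(params: int) -> int:
    """Stand-in for a real workload; purely illustrative."""
    return params * params

if __name__ == "__main__":
    # Fan independent jobs out across worker processes and
    # gather the results back in submission order.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(run_experiment, range(8)))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The same submit-work-and-collect-results pattern she mentions next, letting people submit work to the system and get results back, is what the platform generalizes across a whole firm.]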
Um, 646 00:35:38,680 --> 00:35:41,440 Speaker 1: you know, enable people to submit work to 647 00:35:41,520 --> 00:35:43,680 Speaker 1: these systems, to run it and get results. The 648 00:35:43,680 --> 00:35:46,880 Speaker 1: way that my team will understand this is we 649 00:35:46,920 --> 00:35:49,719 Speaker 1: really have to work with our partners, both in engineering 650 00:35:50,120 --> 00:35:53,560 Speaker 1: and modeling, to hear about, okay, what's 651 00:35:53,560 --> 00:35:58,160 Speaker 1: coming up next? Right? Okay, 652 00:35:58,200 --> 00:36:00,399 Speaker 1: if we're moving to doing a lot more 653 00:36:00,480 --> 00:36:02,960 Speaker 1: deep learning in this area, what is that going to 654 00:36:03,040 --> 00:36:05,960 Speaker 1: mean for the platform? What 655 00:36:06,080 --> 00:36:08,840 Speaker 1: are the real heavyweight requirements? You need access 656 00:36:08,880 --> 00:36:12,640 Speaker 1: to GPUs, right, those processors that 657 00:36:12,680 --> 00:36:15,680 Speaker 1: are great for certain kinds of matrix math, and that's 658 00:36:15,719 --> 00:36:18,920 Speaker 1: what deep learning really needs, 659 00:36:19,000 --> 00:36:21,399 Speaker 1: so my systems are going to need to support that. 660 00:36:21,680 --> 00:36:23,680 Speaker 1: So I think that's one side of it, and 661 00:36:23,719 --> 00:36:26,640 Speaker 1: then I think another side of it is just, 662 00:36:26,680 --> 00:36:28,719 Speaker 1: in the same way as if you're at 663 00:36:28,760 --> 00:36:33,439 Speaker 1: a business serving external customers where you can't 664 00:36:33,480 --> 00:36:36,799 Speaker 1: just talk to each customer individually.
You want to look 665 00:36:36,840 --> 00:36:39,239 Speaker 1: actually at the data about how people are using your 666 00:36:39,239 --> 00:36:44,080 Speaker 1: product and detect trends, um, or opportunities for improvements. Right. So, 667 00:36:44,360 --> 00:36:46,760 Speaker 1: if you're looking at the data about the way people 668 00:36:46,840 --> 00:36:49,160 Speaker 1: are using your systems and you see that, like, a 669 00:36:49,200 --> 00:36:53,879 Speaker 1: particular very commonly run command is really slow, you can 670 00:36:53,920 --> 00:36:56,440 Speaker 1: sort of accept that that actually is probably slowing people 671 00:36:56,480 --> 00:37:01,440 Speaker 1: down and interrupting their flow and their creativity. 672 00:37:01,719 --> 00:37:03,960 Speaker 1: And that's an opportunity where, if we could make that 673 00:37:04,080 --> 00:37:07,759 Speaker 1: commonly used command faster in some way, they're going to 674 00:37:07,840 --> 00:37:10,040 Speaker 1: be more productive. They're going to be, you know, happier, 675 00:37:10,320 --> 00:37:12,200 Speaker 1: because it's just gonna be a lot easier for them 676 00:37:12,239 --> 00:37:14,400 Speaker 1: to get their work done. Right. So those are some 677 00:37:14,440 --> 00:37:15,880 Speaker 1: of the ways that we kind of approach trying to 678 00:37:15,880 --> 00:37:20,040 Speaker 1: figure out what to build. So presumably making the core 679 00:37:20,120 --> 00:37:25,960 Speaker 1: systems faster and able to process larger amounts of data 680 00:37:26,040 --> 00:37:31,160 Speaker 1: like that is how engineering contributes to a competitive edge 681 00:37:31,360 --> 00:37:35,040 Speaker 1: for Two Sigma, um. First of all, is that correct?
682 00:37:35,120 --> 00:37:40,080 Speaker 1: And then secondly, if that's true, does that mean that, 683 00:37:41,000 --> 00:37:44,800 Speaker 1: as quant shops kind of get 684 00:37:44,840 --> 00:37:50,879 Speaker 1: bigger and faster, scale matters much, much more than 685 00:37:50,920 --> 00:37:53,719 Speaker 1: it used to? And does that mean that the competitive 686 00:37:53,800 --> 00:37:57,680 Speaker 1: advantage is always going to lie with the biggest firms 687 00:37:57,800 --> 00:38:02,880 Speaker 1: with the most resources? Does that make sense? Yeah, you know, 688 00:38:03,000 --> 00:38:06,360 Speaker 1: I will say, I do think that a lot of 689 00:38:06,400 --> 00:38:11,320 Speaker 1: the advantage that engineers bring to companies like Two Sigma 690 00:38:11,520 --> 00:38:15,520 Speaker 1: is just our ability to build platforms, um, to build 691 00:38:15,520 --> 00:38:19,520 Speaker 1: platforms that can enable scale and productivity. I do think 692 00:38:19,520 --> 00:38:22,880 Speaker 1: that's, you know, part of it, and 693 00:38:22,920 --> 00:38:25,280 Speaker 1: I do think that engineers also 694 00:38:25,480 --> 00:38:27,920 Speaker 1: bring a degree of innovation with them, right. 695 00:38:27,960 --> 00:38:31,160 Speaker 1: There's a reason that, you know, companies do put 696 00:38:31,239 --> 00:38:35,080 Speaker 1: a premium on really great engineers. And, you know, it's 697 00:38:35,120 --> 00:38:38,279 Speaker 1: not just because, like, the skill set is rare, because 698 00:38:38,440 --> 00:38:40,240 Speaker 1: more and more people are learning how to write code, 699 00:38:40,280 --> 00:38:42,960 Speaker 1: and, you know, I think it's not 700 00:38:43,000 --> 00:38:45,319 Speaker 1: as hard as it once was to write code.
But 701 00:38:45,640 --> 00:38:48,719 Speaker 1: I think the best engineers are not only capable of 702 00:38:49,080 --> 00:38:51,759 Speaker 1: writing code, but they are innovators, and they have, 703 00:38:52,520 --> 00:38:55,520 Speaker 1: you know, they can see ideas for how to do 704 00:38:55,719 --> 00:39:00,239 Speaker 1: things that, you know, you wouldn't necessarily predict as an 705 00:39:00,280 --> 00:39:02,359 Speaker 1: outsider, even as an outsider who is an 706 00:39:02,360 --> 00:39:06,160 Speaker 1: engineer but not deep in the details of a particular area. 707 00:39:06,920 --> 00:39:10,840 Speaker 1: I think it's an interesting question as to whether scale 708 00:39:11,239 --> 00:39:14,320 Speaker 1: is always a competitive advantage for companies like Two Sigma. 709 00:39:14,640 --> 00:39:19,240 Speaker 1: I am not an expert in, you know, the quantitative 710 00:39:19,360 --> 00:39:23,279 Speaker 1: finance hedge fund industry. Um, I've only worked at Two 711 00:39:23,280 --> 00:39:25,920 Speaker 1: Sigma when it comes to, you know, hedge funds, and 712 00:39:25,920 --> 00:39:29,480 Speaker 1: so I don't have a ton of, you know, experience 713 00:39:29,880 --> 00:39:32,680 Speaker 1: more broadly. I mean, I do think, though, that, you know, 714 00:39:32,960 --> 00:39:36,760 Speaker 1: you do see scale, um, I do think scale matters, 715 00:39:36,800 --> 00:39:39,720 Speaker 1: and scale matters in a lot of ways in these businesses. 716 00:39:39,760 --> 00:39:42,600 Speaker 1: I mean, look, I think scale matters in your ability 717 00:39:42,719 --> 00:39:46,440 Speaker 1: to get good costs in trading, right, which is, you know, 718 00:39:46,760 --> 00:39:48,839 Speaker 1: get your costs down in those areas, get your 719 00:39:48,880 --> 00:39:51,399 Speaker 1: lending costs down.
But I do think that scale from 720 00:39:51,400 --> 00:39:55,000 Speaker 1: a technology point of view, it will matter: 721 00:39:55,239 --> 00:39:59,280 Speaker 1: if the way, you know, to find 722 00:39:59,480 --> 00:40:04,440 Speaker 1: good, valuable ideas comes from processing larger and larger and 723 00:40:04,520 --> 00:40:07,719 Speaker 1: larger sets of data, yes, scale is always going to 724 00:40:07,800 --> 00:40:11,840 Speaker 1: be a competitive advantage. Now, if you can find ways 725 00:40:11,880 --> 00:40:16,560 Speaker 1: of getting good ideas that don't require processing massive amounts 726 00:40:16,640 --> 00:40:20,720 Speaker 1: of data, um, or, you know, don't require running really 727 00:40:21,200 --> 00:40:26,200 Speaker 1: large computational experiments, then, you know, scale is 728 00:40:26,239 --> 00:40:30,200 Speaker 1: not necessarily an advantage, and scale certainly comes with its downsides, right. 729 00:40:30,239 --> 00:40:32,719 Speaker 1: So I think Two Sigma has really invested heavily in 730 00:40:32,760 --> 00:40:35,560 Speaker 1: building out platforms, um, not just in my team, but, 731 00:40:35,680 --> 00:40:38,799 Speaker 1: you know, even in our tools for modelers, right. 732 00:40:38,840 --> 00:40:41,239 Speaker 1: We try to think about that as building a platform 733 00:40:41,320 --> 00:40:44,760 Speaker 1: of tools that our modelers can use, that all work together. 734 00:40:45,440 --> 00:40:47,839 Speaker 1: You know, my impression is that there 735 00:40:47,880 --> 00:40:50,600 Speaker 1: are companies where, like, you know, practically every desk 736 00:40:50,640 --> 00:40:52,719 Speaker 1: will have its own sort of engineering team and its 737 00:40:52,719 --> 00:40:55,520 Speaker 1: own set of tools. And those are two very different approaches, 738 00:40:55,560 --> 00:40:57,960 Speaker 1: and they have pros and cons.
I think that, with the 739 00:40:58,040 --> 00:41:02,600 Speaker 1: platform approach, you can do bigger things, right. 740 00:41:02,719 --> 00:41:05,840 Speaker 1: It's sort of like comparing, like, a cruise ship or, 741 00:41:05,920 --> 00:41:10,200 Speaker 1: you know, a big container ship with, like, a speedboat. Right, 742 00:41:10,280 --> 00:41:12,320 Speaker 1: so, like, you know, you can go a lot farther 743 00:41:12,520 --> 00:41:14,520 Speaker 1: and you can do a lot more and can carry 744 00:41:14,520 --> 00:41:17,040 Speaker 1: a lot more in a big ship, but you cannot 745 00:41:17,080 --> 00:41:19,800 Speaker 1: turn quickly, right. Whereas if you're in a speedboat, you 746 00:41:19,800 --> 00:41:23,080 Speaker 1: can react really, really fast. You just, you know, 747 00:41:23,160 --> 00:41:25,160 Speaker 1: you've got sort of a limited amount that you can 748 00:41:25,160 --> 00:41:28,120 Speaker 1: actually do with that, um. And so I do think 749 00:41:28,160 --> 00:41:30,560 Speaker 1: that, you know, platforms have a lot of advantages, and 750 00:41:30,600 --> 00:41:34,200 Speaker 1: this ability to scale and this, you know, investment in 751 00:41:34,239 --> 00:41:36,920 Speaker 1: being able to scale does give you a lot of 752 00:41:36,960 --> 00:41:40,560 Speaker 1: advantages in terms of, you know, the scope of things 753 00:41:40,560 --> 00:41:43,359 Speaker 1: you can do, but it does have downsides in that 754 00:41:43,480 --> 00:41:45,920 Speaker 1: you have to invest a lot of technology and time 755 00:41:46,080 --> 00:41:48,600 Speaker 1: and effort into being able to do that in the 756 00:41:48,600 --> 00:41:52,520 Speaker 1: first place, and that means that changing your approach becomes 757 00:41:52,520 --> 00:42:14,759 Speaker 1: somewhat more expensive. So I'm curious about the evaluation process 758 00:42:15,120 --> 00:42:20,080 Speaker 1: of new endeavors.
So presumably anyone who is, you know, 759 00:42:20,200 --> 00:42:24,680 Speaker 1: trying to build some sort of machine learning AI 760 00:42:25,440 --> 00:42:29,800 Speaker 1: trading system would always like more data and faster access 761 00:42:29,840 --> 00:42:34,080 Speaker 1: to the data and so forth. But you or the 762 00:42:34,160 --> 00:42:37,479 Speaker 1: engineers have to evaluate, I presume, like, okay, well, what's 763 00:42:37,480 --> 00:42:40,640 Speaker 1: a worthy investment, and does this really make sense to 764 00:42:40,760 --> 00:42:45,200 Speaker 1: build out this capacity, and will it really deliver adequate returns? 765 00:42:45,680 --> 00:42:49,080 Speaker 1: How do you think about those kinds of problems when 766 00:42:49,120 --> 00:42:51,200 Speaker 1: a team comes to you with some sort of need, 767 00:42:51,360 --> 00:42:53,880 Speaker 1: or when they're bumping up against 768 00:42:53,920 --> 00:42:58,160 Speaker 1: the limitations of the existing platform, of the existing technology, 769 00:42:58,600 --> 00:43:03,000 Speaker 1: evaluating what aspect of capacity is actually worth investing in, 770 00:43:03,040 --> 00:43:07,880 Speaker 1: because presumably it's very costly in time and money, also 771 00:43:07,920 --> 00:43:13,359 Speaker 1: with uncertain returns? Yeah, so I have sort 772 00:43:13,360 --> 00:43:15,600 Speaker 1: of two different approaches. So one of them is that, 773 00:43:15,680 --> 00:43:18,200 Speaker 1: of course, if you see a pattern of people coming 774 00:43:18,239 --> 00:43:21,440 Speaker 1: to you, or you can detect a pattern of requests, 775 00:43:22,120 --> 00:43:26,160 Speaker 1: you should probably build it.
Because now you're seeing, all right, 776 00:43:26,320 --> 00:43:30,200 Speaker 1: like, I've seen several different people thinking about problems 777 00:43:30,200 --> 00:43:33,440 Speaker 1: that, if we had a solution that served time series 778 00:43:33,520 --> 00:43:36,880 Speaker 1: data ten times faster or whatever, right, um, we would 779 00:43:36,880 --> 00:43:39,359 Speaker 1: be killing a lot of birds with that stone, right. 780 00:43:39,400 --> 00:43:42,239 Speaker 1: And so definitely part of the job is looking for 781 00:43:42,239 --> 00:43:45,160 Speaker 1: those patterns and knowing what people are working on and 782 00:43:45,360 --> 00:43:49,600 Speaker 1: being able to decompose it into, okay, actually, you 783 00:43:49,640 --> 00:43:51,560 Speaker 1: know, these are very similar things, and we need to 784 00:43:51,560 --> 00:43:55,279 Speaker 1: solve for this pattern. Um, but another approach that I 785 00:43:55,360 --> 00:43:59,600 Speaker 1: like is, sometimes you're not sure, and so 786 00:44:00,480 --> 00:44:03,520 Speaker 1: the best approach is to partner with the team, particularly 787 00:44:03,520 --> 00:44:06,160 Speaker 1: if it's coming from an engineering team that's requesting something. 788 00:44:06,480 --> 00:44:09,359 Speaker 1: As a partner of the team, you build out some 789 00:44:09,440 --> 00:44:12,640 Speaker 1: kind of proof of concept, you know, where you're 790 00:44:12,680 --> 00:44:16,280 Speaker 1: investing enough to get something built, but you're not committing 791 00:44:16,400 --> 00:44:20,960 Speaker 1: to turning it into, like, a big platform system. Um. 792 00:44:21,000 --> 00:44:23,520 Speaker 1: And, you know, when you do that, 793 00:44:23,600 --> 00:44:26,239 Speaker 1: first of all, you learn 794 00:44:26,280 --> 00:44:29,120 Speaker 1: about the problem more and more, and in a really more 795 00:44:29,200 --> 00:44:33,359 Speaker 1: specific way.
And then, if you build this sort 796 00:44:33,360 --> 00:44:37,560 Speaker 1: of proof of concept for a team and nobody else 797 00:44:37,640 --> 00:44:40,279 Speaker 1: sees it and says, oh, I really want that too, 798 00:44:40,440 --> 00:44:42,080 Speaker 1: or, this is super useful, I can see how I 799 00:44:42,080 --> 00:44:44,480 Speaker 1: could use something like this in my area, then you 800 00:44:44,480 --> 00:44:46,160 Speaker 1: can say, okay, you know what, like, this isn't gonna 801 00:44:46,160 --> 00:44:48,799 Speaker 1: be a platform thing. We've, you know, partnered with you, 802 00:44:49,000 --> 00:44:51,400 Speaker 1: but this is your thing now, so 803 00:44:51,520 --> 00:44:54,000 Speaker 1: take it, run with it, you know, use it for 804 00:44:54,040 --> 00:44:57,040 Speaker 1: your small area, but we're not going to extend it 805 00:44:57,160 --> 00:45:00,480 Speaker 1: to be something generalized. But sometimes you build out that 806 00:45:00,480 --> 00:45:04,040 Speaker 1: proof of concept and you do see people asking for, 807 00:45:04,760 --> 00:45:07,239 Speaker 1: well, maybe not quite that thing, but something that looks 808 00:45:07,280 --> 00:45:09,480 Speaker 1: sort of similar, and it gives you a much clearer 809 00:45:09,520 --> 00:45:12,920 Speaker 1: idea of what the actual need is. And again, I 810 00:45:12,920 --> 00:45:15,200 Speaker 1: think this is very much, you know, 811 00:45:15,640 --> 00:45:18,319 Speaker 1: something that I've taken from startup life, right, where it's, 812 00:45:18,400 --> 00:45:21,680 Speaker 1: you know, how can I learn quickly 813 00:45:22,120 --> 00:45:24,440 Speaker 1: where I should be going?
And sometimes the best way 814 00:45:24,480 --> 00:45:26,759 Speaker 1: to learn quickly is just to build something that 815 00:45:26,920 --> 00:45:30,920 Speaker 1: isn't the ideal, um, you know, big end system, but 816 00:45:31,320 --> 00:45:34,879 Speaker 1: where, if we launch it with a partner, we'll learn 817 00:45:34,920 --> 00:45:37,000 Speaker 1: a lot about the area and whether or not it's 818 00:45:37,040 --> 00:45:40,800 Speaker 1: worth investing in. So I have a slightly weird question, 819 00:45:41,160 --> 00:45:45,920 Speaker 1: which is, how do financial services firms feel about open 820 00:45:45,960 --> 00:45:50,800 Speaker 1: source software nowadays? Because I remember, I guess five or 821 00:45:50,840 --> 00:45:53,600 Speaker 1: six years ago, when I was writing about Wall Street, 822 00:45:54,440 --> 00:45:56,279 Speaker 1: I remember there were one or two banks that were 823 00:45:56,320 --> 00:45:59,439 Speaker 1: trying to make an open source push because they thought 824 00:45:59,440 --> 00:46:03,240 Speaker 1: it could make everything more efficient and cut down on costs. 825 00:46:03,239 --> 00:46:05,080 Speaker 1: And there was a lot of cost pressure at the banks 826 00:46:05,120 --> 00:46:08,360 Speaker 1: after the financial crisis. But it was sort of an 827 00:46:08,480 --> 00:46:11,839 Speaker 1: uphill battle for them, because most of the banks were 828 00:46:11,960 --> 00:46:15,440 Speaker 1: pretty competitive, and I guess they thought all their technology 829 00:46:15,520 --> 00:46:17,800 Speaker 1: was very proprietary and they didn't really want to share. 830 00:46:18,360 --> 00:46:21,760 Speaker 1: Has that changed at all? Is there more cooperation 831 00:46:21,920 --> 00:46:26,520 Speaker 1: across the financial industry when it comes to software? There's 832 00:46:26,560 --> 00:46:29,520 Speaker 1: certainly a lot more open source, um.
Actually, you know, 833 00:46:29,600 --> 00:46:32,480 Speaker 1: one of the reasons that I left Goldman was that 834 00:46:32,560 --> 00:46:35,000 Speaker 1: I actually was working in open source there, and I 835 00:46:35,040 --> 00:46:37,960 Speaker 1: felt like it was such an uphill battle to be 836 00:46:38,080 --> 00:46:41,279 Speaker 1: allowed to work on open source at the time. I 837 00:46:41,320 --> 00:46:44,239 Speaker 1: think they've changed that process quite a bit, you know, 838 00:46:44,600 --> 00:46:46,719 Speaker 1: in recent years. But, you know, when I was there, 839 00:46:46,760 --> 00:46:50,640 Speaker 1: it was like, well, we'll let you do this because 840 00:46:51,480 --> 00:46:53,719 Speaker 1: we like you and we trust you. But it wasn't 841 00:46:53,800 --> 00:46:57,480 Speaker 1: sort of a generalized process that lots of people could 842 00:46:57,560 --> 00:47:00,840 Speaker 1: do, um, and I do think that's changed dramatically in 843 00:47:00,840 --> 00:47:03,160 Speaker 1: the last ten years. So Two Sigma does a ton 844 00:47:03,239 --> 00:47:06,720 Speaker 1: of open source, um. We both contribute to big popular 845 00:47:06,760 --> 00:47:10,160 Speaker 1: open source projects like pandas, um, and we also open source 846 00:47:10,239 --> 00:47:13,480 Speaker 1: some of our own software. Um. So I definitely think 847 00:47:13,600 --> 00:47:16,840 Speaker 1: that you're seeing a lot more open source in the 848 00:47:16,880 --> 00:47:20,440 Speaker 1: financial industry now. It's still, you know, it's still different, 849 00:47:20,680 --> 00:47:24,520 Speaker 1: I think, than the tech industry, in that, you know, 850 00:47:24,600 --> 00:47:28,520 Speaker 1: it's still a bit more bureaucratic, probably, to, you know, 851 00:47:28,600 --> 00:47:32,360 Speaker 1: create open source.
There is certainly concern about IP, 852 00:47:32,800 --> 00:47:36,359 Speaker 1: and what is going to give our, you know, competitors 853 00:47:36,360 --> 00:47:40,239 Speaker 1: an advantage if we open source, versus, you know, what 854 00:47:40,440 --> 00:47:44,640 Speaker 1: helps with our brand in the industry, right, you know, 855 00:47:44,640 --> 00:47:47,799 Speaker 1: because open source is good partly because it helps you 856 00:47:47,840 --> 00:47:52,040 Speaker 1: recruit engineers. Engineers tend to like companies that do open source. 857 00:47:52,320 --> 00:47:54,640 Speaker 1: You know, in my opinion, I think it just 858 00:47:54,840 --> 00:47:57,239 Speaker 1: shows that you're a good corporate citizen when you 859 00:47:57,360 --> 00:47:59,880 Speaker 1: contribute to open source, because everyone 860 00:48:00,040 --> 00:48:03,640 Speaker 1: uses open source, and open source is expensive to keep 861 00:48:03,760 --> 00:48:05,759 Speaker 1: up for the open source maintainers, right. You know, a 862 00:48:05,800 --> 00:48:09,200 Speaker 1: lot of them are volunteers. And so when you look 863 00:48:09,239 --> 00:48:11,400 Speaker 1: at a company and you see that they, you know, 864 00:48:11,440 --> 00:48:13,720 Speaker 1: use open source, because every company that does software 865 00:48:13,719 --> 00:48:16,400 Speaker 1: is using open source, but they absolutely do not contribute 866 00:48:16,440 --> 00:48:18,520 Speaker 1: back to it, you think, okay, this company isn't a good 867 00:48:18,640 --> 00:48:21,480 Speaker 1: corporate citizen in that way. What does that mean for 868 00:48:21,480 --> 00:48:24,719 Speaker 1: the rest of the way that they think about, you know, engineering?
869 00:48:25,239 --> 00:48:28,280 Speaker 1: So I do think that, you know, savvy companies realize 870 00:48:28,280 --> 00:48:33,000 Speaker 1: that preventing their engineers from contributing to open source is 871 00:48:33,000 --> 00:48:37,360 Speaker 1: actually bad branding in a lot of ways. But there 872 00:48:37,400 --> 00:48:40,960 Speaker 1: certainly still is that concern of, okay, how do we 873 00:48:41,080 --> 00:48:44,040 Speaker 1: make sure that engineers are not accidentally leaking something that 874 00:48:44,239 --> 00:48:47,799 Speaker 1: is in fact really, like, proprietary and valuable, um. And 875 00:48:47,840 --> 00:48:50,000 Speaker 1: I do think, you know, that's going to be a 876 00:48:50,000 --> 00:48:54,280 Speaker 1: concern probably forever, given that there are, you know, algorithms 877 00:48:54,320 --> 00:48:57,600 Speaker 1: and pieces of code in finance that really are quite 878 00:48:57,680 --> 00:49:00,920 Speaker 1: valuable in and of themselves, right. And so, you know, 879 00:49:00,960 --> 00:49:03,880 Speaker 1: when you can do something significantly faster than 880 00:49:03,920 --> 00:49:07,959 Speaker 1: your competitor because you've thought of this great mathematical algorithmic trick, 881 00:49:08,440 --> 00:49:11,680 Speaker 1: like, you don't really want that getting out there, um, 882 00:49:11,719 --> 00:49:14,600 Speaker 1: to the wider world, accidentally leaked as a piece of 883 00:49:14,600 --> 00:49:17,640 Speaker 1: open source software. You know, I'm curious. I mean, at 884 00:49:17,680 --> 00:49:23,040 Speaker 1: the beginning you mentioned Two Sigma's reputation as being kind 885 00:49:23,040 --> 00:49:26,680 Speaker 1: of academic. But I'm thinking, like, from the perspective of 886 00:49:26,719 --> 00:49:29,560 Speaker 1: an engineer, like, maybe thinking about going to a place 887 00:49:29,600 --> 00:49:31,760 Speaker 1: like Two Sigma versus going to a place like Google.
888 00:49:31,800 --> 00:49:34,319 Speaker 1: I mean, like, I don't know if Google still has it, 889 00:49:34,400 --> 00:49:36,040 Speaker 1: but of course they used to be famous for, like, 890 00:49:36,080 --> 00:49:39,000 Speaker 1: the twenty percent time thing and getting to work on your own projects. 891 00:49:39,160 --> 00:49:40,960 Speaker 1: And they still have all those other, like, 892 00:49:41,040 --> 00:49:43,160 Speaker 1: side bets, which are just a bunch of businesses that 893 00:49:43,200 --> 00:49:45,919 Speaker 1: lose money. And I also have to imagine that, even 894 00:49:45,920 --> 00:49:47,839 Speaker 1: in the core Google, there's probably a lot of people 895 00:49:47,880 --> 00:49:50,520 Speaker 1: working on things that don't make money and are never 896 00:49:50,560 --> 00:49:52,840 Speaker 1: expected to make money. And they're just a bunch of people, 897 00:49:53,200 --> 00:49:55,759 Speaker 1: I don't know, eating the lunches and working on their 898 00:49:55,800 --> 00:49:58,040 Speaker 1: projects and stuff. But at a fund, 899 00:49:58,120 --> 00:49:59,960 Speaker 1: a hedge fund, I can't imagine that there is as 900 00:50:00,040 --> 00:50:03,440 Speaker 1: much space for that kind of thing, for just, 901 00:50:03,480 --> 00:50:07,560 Speaker 1: sort of, like, intellectual explorations or long pursuits of things 902 00:50:07,600 --> 00:50:10,839 Speaker 1: that may never get monetized or productized and so forth. 903 00:50:11,239 --> 00:50:13,839 Speaker 1: And so I'm curious, like, how 904 00:50:13,920 --> 00:50:17,479 Speaker 1: much different that is internally at a place 905 00:50:17,560 --> 00:50:21,960 Speaker 1: like Two Sigma versus a large established tech company like 906 00:50:22,000 --> 00:50:27,080 Speaker 1: Google or Facebook or Microsoft.
So I think, look, 907 00:50:27,120 --> 00:50:31,319 Speaker 1: I think that, um, Two Sigma definitely has, I 908 00:50:31,360 --> 00:50:35,960 Speaker 1: mean, people doing technical research. We have a labs team, um, 909 00:50:36,000 --> 00:50:39,600 Speaker 1: and we do try to provide room for people to 910 00:50:40,360 --> 00:50:44,560 Speaker 1: explore ideas, which, again, as I said, innovation, you know, 911 00:50:44,719 --> 00:50:48,840 Speaker 1: comes from surprising places in engineering. And, you know, while 912 00:50:48,920 --> 00:50:53,160 Speaker 1: Google is not always particularly efficient, probably, in the way 913 00:50:53,200 --> 00:50:57,480 Speaker 1: it allocates its engineers, at least to an outsider, um, 914 00:50:57,520 --> 00:51:00,480 Speaker 1: you know, it does come from a place of 915 00:51:00,560 --> 00:51:03,880 Speaker 1: recognition that, like, engineers, when left to their own devices 916 00:51:04,400 --> 00:51:09,040 Speaker 1: but given clear, you know, ideas, maybe clear problem statements, 917 00:51:09,239 --> 00:51:12,520 Speaker 1: will innovate in surprising ways. Um.
And so I do 918 00:51:12,600 --> 00:51:15,479 Speaker 1: think that, at least at Two Sigma, we try 919 00:51:15,560 --> 00:51:19,279 Speaker 1: to do a reasonable job of balancing, like, being very 920 00:51:19,320 --> 00:51:21,920 Speaker 1: pragmatic and focused on, we've got to ship things for 921 00:51:21,920 --> 00:51:24,439 Speaker 1: this business, right. Obviously, you know, we're not a huge 922 00:51:24,440 --> 00:51:26,960 Speaker 1: company, we're, you know, eight hundred some engineers, so it's 923 00:51:27,000 --> 00:51:30,400 Speaker 1: not like, you know, we have infinite resources to do whatever, 924 00:51:30,920 --> 00:51:34,719 Speaker 1: but being culturally aligned with that desire of engineers to 925 00:51:34,760 --> 00:51:38,600 Speaker 1: have a little bit of freedom to explore and innovate 926 00:51:38,880 --> 00:51:42,319 Speaker 1: and, you know, try new things. You know, I do 927 00:51:42,360 --> 00:51:44,719 Speaker 1: think Two Sigma actually does a pretty good job of that, 928 00:51:45,080 --> 00:51:46,759 Speaker 1: partly by the fact that we just have, like, a 929 00:51:46,800 --> 00:51:50,400 Speaker 1: great learning culture. So, you know, we really support people 930 00:51:51,200 --> 00:51:54,359 Speaker 1: learning new things, whether it's going to conferences or doing 931 00:51:54,400 --> 00:51:57,080 Speaker 1: training or reading academic papers together. You see a lot 932 00:51:57,160 --> 00:52:00,120 Speaker 1: of that kind of thing at Two Sigma, and I 933 00:52:00,160 --> 00:52:04,200 Speaker 1: think we're very eager to take ideas from academia and 934 00:52:04,239 --> 00:52:06,480 Speaker 1: sort of more cutting edge ideas from the wider world 935 00:52:06,760 --> 00:52:08,720 Speaker 1: and figure out how to apply them, which I think 936 00:52:08,880 --> 00:52:11,560 Speaker 1: is a competitive advantage for us.
So, you know, I 937 00:52:12,160 --> 00:52:15,480 Speaker 1: think that, you know, if you're an engineer deciding between 938 00:52:15,520 --> 00:52:18,440 Speaker 1: Google and Two Sigma, you know, there's a lot of 939 00:52:18,440 --> 00:52:21,160 Speaker 1: different things you're going to be deciding on. I actually 940 00:52:21,160 --> 00:52:22,839 Speaker 1: would guess that one of the major things you're really 941 00:52:22,880 --> 00:52:24,440 Speaker 1: deciding on, though, is, do you want to work at 942 00:52:24,480 --> 00:52:26,920 Speaker 1: a giant company or do you want to work at 943 00:52:26,920 --> 00:52:28,799 Speaker 1: a small company, you know, right? Because if you want 944 00:52:28,800 --> 00:52:31,720 Speaker 1: to work at a giant company, you know, giant companies 945 00:52:31,760 --> 00:52:34,440 Speaker 1: have pluses and minuses, right. There's lots of different things 946 00:52:34,440 --> 00:52:36,719 Speaker 1: you could do, but you are going to be a 947 00:52:36,880 --> 00:52:40,400 Speaker 1: small fish in a very, very, very, very, very big pond, 948 00:52:40,920 --> 00:52:44,160 Speaker 1: you know. Whereas at Two Sigma, you can, you know, 949 00:52:44,200 --> 00:52:47,480 Speaker 1: you can make a difference, you know, even as a 950 00:52:47,560 --> 00:52:50,839 Speaker 1: college graduate, you know, coming into the company. We're not 951 00:52:50,960 --> 00:52:55,080 Speaker 1: so big that people don't make a big, noticeable difference 952 00:52:55,440 --> 00:52:59,200 Speaker 1: even early in their career, um, if they find the 953 00:52:59,280 --> 00:53:02,360 Speaker 1: right problem to solve. So I think that's really 954 00:53:02,360 --> 00:53:06,120 Speaker 1: the bigger difference, um, you know, than the 955 00:53:06,760 --> 00:53:09,759 Speaker 1: twenty percent time concept or any of the rest 956 00:53:09,760 --> 00:53:14,319 Speaker 1: of it. Um.
I'm curious how intense the competition is 957 00:53:14,360 --> 00:53:18,160 Speaker 1: for engineers at the moment. So, you know, obviously there 958 00:53:18,239 --> 00:53:21,719 Speaker 1: was a lot written about how banks were competing with 959 00:53:21,800 --> 00:53:25,319 Speaker 1: the big tech firms to get the best talent, particularly 960 00:53:25,360 --> 00:53:28,640 Speaker 1: around the time that Goldman announced that it was actually 961 00:53:28,760 --> 00:53:31,080 Speaker 1: a tech firm. And there were all these stories about 962 00:53:31,080 --> 00:53:35,840 Speaker 1: financial services firms putting in, uh, you know, pool tables 963 00:53:36,000 --> 00:53:38,960 Speaker 1: or ping pong tables and free lunches or whatever to 964 00:53:39,000 --> 00:53:42,359 Speaker 1: try to entice people that would normally go to some 965 00:53:42,480 --> 00:53:46,839 Speaker 1: campus in Silicon Valley. Is that still the case, or 966 00:53:47,360 --> 00:53:50,520 Speaker 1: is that starting to, um, perhaps ease off a bit? 967 00:53:51,719 --> 00:53:55,480 Speaker 1: I mean, I do... my impression is that the 968 00:53:55,480 --> 00:53:58,840 Speaker 1: competition for engineers is still very hot. You know, 969 00:53:58,920 --> 00:54:03,080 Speaker 1: different people compete with different people. So, 970 00:54:03,320 --> 00:54:07,480 Speaker 1: you know, we compete with big tech companies and other funds. 971 00:54:08,200 --> 00:54:10,440 Speaker 1: My impression is not that we compete 972 00:54:10,480 --> 00:54:12,880 Speaker 1: as much with Goldman. I do think that Goldman can 973 00:54:12,920 --> 00:54:16,839 Speaker 1: say they're a tech company, but, uh, culturally they are 974 00:54:16,960 --> 00:54:19,640 Speaker 1: an investment bank, um, and for good and for bad. 975 00:54:19,680 --> 00:54:21,840 Speaker 1: I learned a lot in my time working at Goldman.
976 00:54:21,840 --> 00:54:24,680 Speaker 1: I think Goldman has some really great aspects of their culture. 977 00:54:24,719 --> 00:54:26,719 Speaker 1: But you do not spend a hundred years as an 978 00:54:26,719 --> 00:54:29,279 Speaker 1: investment bank and then say you're a tech company and 979 00:54:29,320 --> 00:54:32,279 Speaker 1: become a tech company. That's just, foundationally, 980 00:54:32,320 --> 00:54:35,480 Speaker 1: not the way it works. Like, I actually, sort of, personally, 981 00:54:35,520 --> 00:54:39,000 Speaker 1: quite strongly believe that the founding culture of a company 982 00:54:39,080 --> 00:54:41,560 Speaker 1: is very hard to shake. And so if you have 983 00:54:41,640 --> 00:54:45,000 Speaker 1: a founding culture of engineers, which Two Sigma does, right, 984 00:54:45,080 --> 00:54:47,279 Speaker 1: so one of the co-founders of Two Sigma is, 985 00:54:47,640 --> 00:54:50,600 Speaker 1: you know, an engineer, um, I do think you've got 986 00:54:50,600 --> 00:54:54,880 Speaker 1: a very strong technical culture built in from day zero. 987 00:54:55,320 --> 00:54:57,920 Speaker 1: If you have a founding culture of bankers, it's going 988 00:54:57,960 --> 00:55:00,520 Speaker 1: to be a different culture. And you can make it 989 00:55:00,600 --> 00:55:02,440 Speaker 1: a good place for engineers to work, and there's, you know, 990 00:55:02,520 --> 00:55:04,360 Speaker 1: lots to do and lots to learn, but it's going 991 00:55:04,400 --> 00:55:07,239 Speaker 1: to feel different. And I think, you know, engineers 992 00:55:07,239 --> 00:55:10,400 Speaker 1: are going to recognize and see that. 993 00:55:10,520 --> 00:55:14,560 Speaker 1: But the competition for engineers is definitely still quite hot, 994 00:55:14,800 --> 00:55:18,560 Speaker 1: you know, between, you know, financial companies and tech companies 995 00:55:18,600 --> 00:55:21,719 Speaker 1: and, you know, frankly, everything in between.
Right, software 996 00:55:21,760 --> 00:55:24,040 Speaker 1: is eating the world, as it were, and, you know, 997 00:55:24,160 --> 00:55:28,319 Speaker 1: everybody needs technical talent to get ahead. Like, one 998 00:55:28,360 --> 00:55:32,080 Speaker 1: more question sort of about the transition from a tech 999 00:55:32,120 --> 00:55:36,319 Speaker 1: company to Two Sigma. I mean, obviously, I assume there 1000 00:55:36,320 --> 00:55:39,880 Speaker 1: are many things that are similar, um, you know, the 1001 00:55:40,000 --> 00:55:43,560 Speaker 1: sort of pure tech aspect of it. But I'm curious, 1002 00:55:43,840 --> 00:55:45,880 Speaker 1: and I guess, you know, you did have the 1003 00:55:45,920 --> 00:55:49,480 Speaker 1: financial experience at Goldman, but how much did going from 1004 00:55:49,960 --> 00:55:54,000 Speaker 1: tech to a financial services firm, or to a fund, 1005 00:55:54,440 --> 00:55:58,719 Speaker 1: require you to sort of learn the language of finance more? 1006 00:55:58,800 --> 00:56:02,200 Speaker 1: And how challenging was that? I mean, I understand 1007 00:56:02,239 --> 00:56:05,480 Speaker 1: most of your work is working with other engineers, but 1008 00:56:05,600 --> 00:56:08,399 Speaker 1: in terms of, you know, the way people talk about 1009 00:56:08,400 --> 00:56:12,280 Speaker 1: analyzing time series and so forth, how did you approach 1010 00:56:12,360 --> 00:56:14,239 Speaker 1: that? And how much of a challenge is that, being 1011 00:56:14,280 --> 00:56:18,280 Speaker 1: able to sort of meld the communication between tech people, 1012 00:56:18,360 --> 00:56:22,600 Speaker 1: engineers, versus the more purely, uh, finance side? It is 1013 00:56:22,640 --> 00:56:25,960 Speaker 1: a challenge.
Um, you know, I think one of the 1014 00:56:25,960 --> 00:56:28,959 Speaker 1: big challenges frankly for me at Two Sigma is I'm 1015 00:56:29,120 --> 00:56:33,440 Speaker 1: not... I don't have, like, a huge heavy math background, 1016 00:56:33,920 --> 00:56:37,759 Speaker 1: and so, you know, even more than at Goldman, 1017 00:56:38,320 --> 00:56:43,520 Speaker 1: the math at Two Sigma is intimidating to me, I 1018 00:56:43,560 --> 00:56:46,600 Speaker 1: will say. And, you know, so I think a lot 1019 00:56:46,719 --> 00:56:52,040 Speaker 1: of this is just learning the language 1020 00:56:52,040 --> 00:56:55,239 Speaker 1: of the company and how to speak 1021 00:56:55,640 --> 00:56:57,920 Speaker 1: the language of the company. We like to talk about 1022 00:56:58,000 --> 00:57:02,640 Speaker 1: epsilons and omegas, and, you know, it's like, okay, 1023 00:57:02,719 --> 00:57:06,720 Speaker 1: sort of small steps and, like, big ideas. Okay, 1024 00:57:06,719 --> 00:57:09,360 Speaker 1: I can sort of translate that, right? I 1025 00:57:09,400 --> 00:57:11,719 Speaker 1: guess I find, especially for the role that I'm in now, 1026 00:57:12,040 --> 00:57:13,719 Speaker 1: a lot of what I really need to do and 1027 00:57:13,719 --> 00:57:15,279 Speaker 1: a lot of what I really need to translate is 1028 00:57:15,320 --> 00:57:20,320 Speaker 1: really getting people out of the details of their 1029 00:57:20,400 --> 00:57:23,960 Speaker 1: area, again, whether it's my area's details or the details 1030 00:57:24,000 --> 00:57:26,840 Speaker 1: of, you know, the needs of some modeler or 1031 00:57:26,880 --> 00:57:31,720 Speaker 1: some of our engineering team, and into kind of 1032 00:57:31,720 --> 00:57:35,560 Speaker 1: the higher-level problem that they're trying to solve.
So 1033 00:57:35,720 --> 00:57:38,919 Speaker 1: helping us all get to kind of the generalized 1034 00:57:39,000 --> 00:57:41,800 Speaker 1: common ground. I don't necessarily... I don't need to understand 1035 00:57:41,840 --> 00:57:45,560 Speaker 1: the math at a detailed level to understand that you 1036 00:57:45,640 --> 00:57:48,840 Speaker 1: need ten times the data that you used to need, right? 1037 00:57:49,000 --> 00:57:53,560 Speaker 1: I don't need to understand, um, the details of, you know, 1038 00:57:53,640 --> 00:57:58,760 Speaker 1: dealing with prime brokers to understand that this team needs 1039 00:57:58,760 --> 00:58:00,720 Speaker 1: to be able to build systems that are going to iterate 1040 00:58:01,120 --> 00:58:04,360 Speaker 1: much faster and, you know, have these 1041 00:58:04,480 --> 00:58:08,439 Speaker 1: characteristics because of the way that they need to pull 1042 00:58:08,520 --> 00:58:11,480 Speaker 1: from the different prime brokers. I don't know, now I'm 1043 00:58:11,480 --> 00:58:13,640 Speaker 1: way out of my depth. But, you know, 1044 00:58:13,640 --> 00:58:15,440 Speaker 1: so I think a lot of my 1045 00:58:15,560 --> 00:58:18,800 Speaker 1: job, anywhere that it's been in leadership, has been how 1046 00:58:18,840 --> 00:58:21,840 Speaker 1: do I get everyone to get to a 1047 00:58:21,880 --> 00:58:25,880 Speaker 1: common ground of talking about their problems at a 1048 00:58:26,160 --> 00:58:28,720 Speaker 1: high enough level that each can kind of appreciate it, 1049 00:58:29,080 --> 00:58:33,360 Speaker 1: um, without forcing everyone to know 1050 00:58:33,440 --> 00:58:35,840 Speaker 1: or understand all the details.
So I do think that 1051 00:58:36,080 --> 00:58:39,160 Speaker 1: it's fun to know the details sometimes, but a lot 1052 00:58:39,200 --> 00:58:42,959 Speaker 1: of times engineers can get lost in the details, and, 1053 00:58:43,360 --> 00:58:46,000 Speaker 1: you know, the details aren't the important thing, right? The 1054 00:58:46,040 --> 00:58:48,280 Speaker 1: important thing is kind of the narrative of how is 1055 00:58:48,320 --> 00:58:51,840 Speaker 1: the work that we're doing making an impact on your 1056 00:58:51,960 --> 00:58:54,400 Speaker 1: day-to-day life, making your job easier, helping this 1057 00:58:54,480 --> 00:58:59,160 Speaker 1: company scale, helping this company grow, you know, get where 1058 00:58:59,160 --> 00:59:02,080 Speaker 1: it needs to go. And I think that's 1059 00:59:02,160 --> 00:59:04,960 Speaker 1: really the most important thing that I do: 1060 00:59:05,480 --> 00:59:10,360 Speaker 1: help create and share narratives that almost 1061 00:59:10,360 --> 00:59:12,760 Speaker 1: anyone can understand. Because, you know, I may need to 1062 00:59:12,800 --> 00:59:15,720 Speaker 1: talk to, you know, modelers, I may need to talk 1063 00:59:15,720 --> 00:59:17,600 Speaker 1: to people in legal, I may need to talk to 1064 00:59:17,640 --> 00:59:20,400 Speaker 1: people, you know, on 1065 00:59:20,400 --> 00:59:23,400 Speaker 1: the corporate side or in HR, right? And 1066 00:59:23,440 --> 00:59:25,760 Speaker 1: all of those folks need to understand the value that 1067 00:59:25,800 --> 00:59:29,120 Speaker 1: my team is bringing to the company. And I 1068 00:59:29,200 --> 00:59:32,600 Speaker 1: need to understand their problems at a high level, 1069 00:59:32,640 --> 00:59:34,320 Speaker 1: so that I can make sure that I'm meeting them.
1070 00:59:34,320 --> 00:59:37,280 Speaker 1: But again, you know, I don't need to necessarily understand 1071 00:59:37,360 --> 00:59:39,600 Speaker 1: all of the details of how they do their jobs 1072 00:59:39,800 --> 00:59:43,240 Speaker 1: to deliver the solutions for the kinds of software that 1073 00:59:43,280 --> 00:59:47,320 Speaker 1: I need to build. I really enjoyed this. I mean, it's 1074 00:59:47,360 --> 00:59:50,720 Speaker 1: such a sort of new area for us, for me 1075 00:59:50,720 --> 00:59:53,360 Speaker 1: and Tracy, and I think for probably most of our listeners. 1076 00:59:53,360 --> 00:59:57,000 Speaker 1: So really appreciate you taking the time and walking us 1077 00:59:57,040 --> 00:59:59,680 Speaker 1: through your job and your career and the whole landscape 1078 00:59:59,800 --> 01:00:03,160 Speaker 1: there. Really absolutely fascinating stuff. Well, yeah, thank you for 1079 01:00:03,160 --> 01:00:27,720 Speaker 1: having me on. It was fun. Thanks, that was great, Tracy. 1080 01:00:27,760 --> 01:00:29,160 Speaker 1: You know, like I said in the beginning, that was 1081 01:00:29,200 --> 01:00:32,120 Speaker 1: a conversation I've kind of wanted to have for a 1082 01:00:32,200 --> 01:00:35,200 Speaker 1: long time, to, like, talk about the sort of nuts 1083 01:00:35,240 --> 01:00:38,360 Speaker 1: and bolts of just how all of this works. And 1084 01:00:38,560 --> 01:00:42,880 Speaker 1: I found that really super interesting. Yeah, absolutely. And, um, 1085 01:00:44,240 --> 01:00:46,080 Speaker 1: I'm trying to think about what to, like, focus on 1086 01:00:46,120 --> 01:00:48,520 Speaker 1: here, because it was such a wide-ranging conversation. But 1087 01:00:48,720 --> 01:00:52,920 Speaker 1: I did think the idea about banks, or, 1088 01:00:53,680 --> 01:00:58,560 Speaker 1: financial services firms, creating an open source 1089 01:00:58,640 --> 01:01:00,960 Speaker 1: culture, and the evolution of that.
I thought that was 1090 01:01:01,000 --> 01:01:04,920 Speaker 1: really interesting. And this idea of being a good corporate 1091 01:01:05,000 --> 01:01:08,840 Speaker 1: citizen when it comes to producing open source software, that's 1092 01:01:08,920 --> 01:01:11,919 Speaker 1: such a sea change to the way 1093 01:01:11,960 --> 01:01:15,400 Speaker 1: banks used to operate. And Camille was sort of giving 1094 01:01:15,400 --> 01:01:17,560 Speaker 1: a flavor of that with her description of her 1095 01:01:17,560 --> 01:01:20,480 Speaker 1: work at Goldman. And we're still only on the 1096 01:01:20,680 --> 01:01:24,320 Speaker 1: edges of it, but I'm interested to see how that goes. 1097 01:01:24,480 --> 01:01:28,800 Speaker 1: Because as you have more financial companies that like to 1098 01:01:28,840 --> 01:01:31,280 Speaker 1: declare that they are in fact tech firms and that 1099 01:01:31,320 --> 01:01:34,800 Speaker 1: their competitive edge also comes from tech, like, does that 1100 01:01:35,720 --> 01:01:39,880 Speaker 1: spark a similar culture to what we see 1101 01:01:40,880 --> 01:01:43,040 Speaker 1: on the West Coast in Silicon Valley when it comes 1102 01:01:43,040 --> 01:01:45,200 Speaker 1: to open source? Or does that cause banks to sort 1103 01:01:45,240 --> 01:01:48,680 Speaker 1: of double down on being competitive and keeping their tech 1104 01:01:48,760 --> 01:01:51,680 Speaker 1: as a proprietary thing? Does that make sense? No, it 1105 01:01:51,680 --> 01:01:54,120 Speaker 1: makes a lot of sense. I like her point that, like, 1106 01:01:54,120 --> 01:01:56,439 Speaker 1: in the end, a company like Goldman, it's a bank, 1107 01:01:56,560 --> 01:02:02,840 Speaker 1: it's an investment bank. And maybe, uh, an engineering 1108 01:02:03,000 --> 01:02:07,640 Speaker 1: founded firm like Two Sigma can genuinely have a tech culture.
1109 01:02:08,200 --> 01:02:11,000 Speaker 1: But I suspect she's right that, for 1110 01:02:11,120 --> 01:02:15,960 Speaker 1: all of the protestations, a firm like Goldman can never 1111 01:02:16,040 --> 01:02:18,000 Speaker 1: really be a tech firm. That would just be my guess. 1112 01:02:18,120 --> 01:02:20,600 Speaker 1: Maybe we'll have someone from Goldman on at some 1113 01:02:20,680 --> 01:02:23,440 Speaker 1: point to make the counter case. I think both of 1114 01:02:23,520 --> 01:02:26,280 Speaker 1: us sort of had the thought at the same time 1115 01:02:26,280 --> 01:02:30,200 Speaker 1: of, like, there are some similarities between going from being 1116 01:02:30,240 --> 01:02:33,320 Speaker 1: an engineer to a manager and going from being a 1117 01:02:33,320 --> 01:02:36,640 Speaker 1: reporter to an editor. And, you know, I think, um, 1118 01:02:36,680 --> 01:02:39,960 Speaker 1: a lot of what she said really resonated with me, 1119 01:02:40,200 --> 01:02:43,400 Speaker 1: including that last bit about, like, the 1120 01:02:43,920 --> 01:02:47,400 Speaker 1: need to create a common language that everyone understands.
1121 01:02:47,400 --> 01:02:49,000 Speaker 1: And I see that all the time, like, in a 1122 01:02:49,080 --> 01:02:53,959 Speaker 1: newsroom, where you have different teams very focused on one thing, 1123 01:02:54,600 --> 01:02:57,680 Speaker 1: but the best managers within the newsroom are people that 1124 01:02:57,720 --> 01:03:00,240 Speaker 1: are very good at sort of essentially describing, in plain 1125 01:03:00,360 --> 01:03:02,480 Speaker 1: English, to everyone else in the newsroom that doesn't 1126 01:03:02,480 --> 01:03:05,640 Speaker 1: really know about this stuff, why what they're working on 1127 01:03:05,920 --> 01:03:09,840 Speaker 1: is interesting, so that there's some sort of level of, um, 1128 01:03:09,880 --> 01:03:12,640 Speaker 1: you know, uniform understanding about what's going 1129 01:03:12,680 --> 01:03:15,280 Speaker 1: on in the newsroom. Yeah, there's also the issue 1130 01:03:15,320 --> 01:03:18,520 Speaker 1: of creating a sort of language around the type of 1131 01:03:18,680 --> 01:03:22,960 Speaker 1: news product that you're offering. Because with writing, you know, 1132 01:03:23,040 --> 01:03:24,800 Speaker 1: if I say to someone, oh, why don't you just 1133 01:03:24,840 --> 01:03:28,360 Speaker 1: write a quick and short story about this, quick and 1134 01:03:28,400 --> 01:03:30,920 Speaker 1: short can mean different things to lots of different people. 1135 01:03:31,120 --> 01:03:34,720 Speaker 1: And so even just creating, like, an artificial term for 1136 01:03:34,760 --> 01:03:37,960 Speaker 1: what that might be, like creating this artificial structure that 1137 01:03:38,080 --> 01:03:41,800 Speaker 1: everyone can eventually recognize and come to understand. Yeah, there 1138 01:03:41,840 --> 01:03:47,240 Speaker 1: are a lot of parallels between engineering and journalism management, apparently. So, um, 1139 01:03:47,640 --> 01:03:49,280 Speaker 1: I didn't expect that.
And I feel like I'm going 1140 01:03:49,320 --> 01:03:52,120 Speaker 1: to have to go out and buy Camille's book now 1141 01:03:52,160 --> 01:03:55,920 Speaker 1: and read it and learn more about it. Yeah. No, 1142 01:03:56,040 --> 01:03:57,680 Speaker 1: I want to read it too, even 1143 01:03:57,720 --> 01:04:01,000 Speaker 1: though it's probably not for me per se, it's for tech. 1144 01:04:01,240 --> 01:04:04,600 Speaker 1: It definitely seems like, um, it will be very useful, 1145 01:04:04,680 --> 01:04:06,640 Speaker 1: or at least to see the same processes 1146 01:04:06,920 --> 01:04:09,840 Speaker 1: replicated elsewhere. I know, I just really enjoyed that a lot, 1147 01:04:09,880 --> 01:04:11,480 Speaker 1: and I feel like I learned a ton in the 1148 01:04:11,560 --> 01:04:15,160 Speaker 1: last hour. Yeah, same. A bit different to the usual 1149 01:04:15,320 --> 01:04:18,160 Speaker 1: Odd Lots fare, in a good way. All right, shall 1150 01:04:18,160 --> 01:04:20,880 Speaker 1: we leave it there? Yeah, let's leave it there. Okay. 1151 01:04:20,920 --> 01:04:24,000 Speaker 1: This has been another episode of the Odd Lots podcast. 1152 01:04:24,040 --> 01:04:26,720 Speaker 1: I'm Tracy Alloway. You can follow me on Twitter at 1153 01:04:26,760 --> 01:04:30,160 Speaker 1: Tracy Alloway. And I'm Joe Weisenthal. You can follow 1154 01:04:30,200 --> 01:04:34,280 Speaker 1: me on Twitter at The Stalwart. Follow our guest on Twitter, 1155 01:04:34,640 --> 01:04:38,400 Speaker 1: Camille Fournier. Her handle is at skamille, S K A 1156 01:04:38,920 --> 01:04:41,840 Speaker 1: M I L L E. And she is the author 1157 01:04:41,920 --> 01:04:45,600 Speaker 1: of The Manager's Path: A Guide for Tech Leaders Navigating 1158 01:04:45,640 --> 01:04:48,640 Speaker 1: Growth and Change, which I'm going to download to my 1159 01:04:48,800 --> 01:04:53,320 Speaker 1: Kindle ASAP. Follow our producer Laura Carlson.
She's at 1160 01:04:53,400 --> 01:04:57,040 Speaker 1: Laura M. Carlson. Follow the Bloomberg head of podcasts, Francesca 1161 01:04:57,120 --> 01:05:00,440 Speaker 1: Levy, at Francesca Today, and check out all of our 1162 01:05:00,480 --> 01:05:04,800 Speaker 1: podcasts at Bloomberg under the handle at podcasts. Thanks for listening.