Barry Ritholtz: This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

This week on the podcast, I have an extra special guest. His name is Matthew Rothman, and he is the director of Quantitative Strategies at Credit Suisse. He is a fairly legendary guy in the world of quant. When he was a relatively new hire at Lehman Brothers, he very specifically warned them about some of the problems they were looking at with their quant strategies, and asked questions that they really kind of dismissed and laughed at: "What do you mean we might go out of business? That's the dumbest thing we've ever heard." He's also been profiled in a number of places; if you read Scott Patterson's The Quants, you can find him referenced throughout. He was pretty much the first guy to figure out what happened during the quant quake of 2007. We're just about a decade past that, and he was one of the first people who really figured out how it happened, why it happened, and what it might mean going forward for the future of short-term trading, markets, and companies like Lehman Brothers.
It's one of those stories of someone who was, unfortunately, much to his chagrin, proven right. He was kind of hoping he wasn't going to be right, but hey, that's what happened, and we have all since lived with the consequences. So, with no further ado, my conversation with Matthew Rothman.

My special guest today is Matthew Rothman. He is currently the head of Global Quantitative Equity Research at Credit Suisse. He is also a senior lecturer in finance at the MIT Sloan School of Management. Prior to joining Credit Suisse, Matthew was the director of Global Quantitative Research at Acadian Asset Management in Boston, which was running approximately seventy billion dollars in assets. Before that, Matthew was the global head of quant research at Lehman Brothers, and he continued on at Barclays Capital after that acquisition post-bankruptcy. He is the author of "Turbulent Times in Quant Land," a research note written during the quant crash in the summer of 2007 that became the most widely distributed research note in Lehman Brothers history. Matthew Rothman, welcome to Bloomberg.
Matthew Rothman: Thanks so much for having me, Barry.

Barry Ritholtz: I've been looking forward to this for quite a while. I knew of you from Scott Patterson's book The Quants, and I was vaguely familiar with the research piece that you had put out, "Turbulent Times in Quant Land." Let's start at a very basic level for the layperson: please explain what quant strategies focus on.

Matthew Rothman: You know, so much gets grouped under the rubric of quant today that you really have to start to decompose it a little bit. There are a variety of different quants, and you should begin to think about them via asset classes. Derivatives-based quants are very different from general fixed-income quants, versus equity quants, versus risk-modeling quants, and each one comes with a different kind of skill set and a different approach to modeling. If you take equity quants just for a second, they also come at a variety of forecasting horizons, and so they'll look at different types of signals and different types of things.
So you have people who are playing literally in the millisecond range, doing very high-frequency trading and market making, literally trading hundreds of times in the blink of an eye, down to people holding intraday strategies, to people holding several-day strategies, to people holding strategies that last months. So you can think about them as having very different types of signals and very different types of performance, but what they all have in common is that they're forecasting returns. And what separates a quant, in my book, from a fundamental manager is that fundamental managers really try to understand the drivers behind the company. They talk to management, they think about products, they forecast earnings at the end of the day, and they think about a company as an organic unit. Quants think about returns and what the drivers of those returns are.
What is going to make two returns, two stocks, tick the same way or go opposite ways over a long period of time, or baskets of returns? We think about what drives returns more than anything, and really abstract away from the companies themselves.

Barry Ritholtz: So have I oversimplified it as: the fundamentalists are telling a story and the quants are crunching numbers? Is that a gross oversimplification, or does it work?

Matthew Rothman: I think everybody crunches numbers, and I wouldn't want to say the fundamentalists just tell a story. They're certainly trying to forecast cash flows and understand what the drivers of earnings and revenues are, and then finally relate that back to a stock price and what they think the appropriate stock price would be. Quants don't try to do any of those things, necessarily. They try to forecast returns directly, and see what the drivers of those returns can be. And overall, for the most part, they think about large baskets of returns, or of stocks, and how those stocks behave based upon their return-based characteristics.
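The basket idea described above can be made concrete with a short sketch. This is purely a hypothetical illustration, not anything from the conversation: the tickers and figures are invented, and the single "return-based characteristic" stands in for whatever signal a quant might actually use.

```python
# Toy illustration of a quant basket: score each stock on one
# return-based characteristic, then go long the top half of the
# ranking and short the bottom half. All data here is invented.

basket = {
    "AAA": 0.12,   # e.g., a trailing-return figure (made up)
    "BBB": -0.05,
    "CCC": 0.31,
    "DDD": 0.02,
    "EEE": -0.14,
    "FFF": 0.08,
}

# Rank tickers by the characteristic, best first.
ranked = sorted(basket, key=basket.get, reverse=True)

# Long the top half, short the bottom half.
half = len(ranked) // 2
longs, shorts = ranked[:half], ranked[half:]

print("long :", longs)    # ['CCC', 'AAA', 'FFF']
print("short:", shorts)   # ['DDD', 'BBB', 'EEE']
```

A real implementation would use many characteristics, normalize them (for example, sector-relative z-scores), and weight positions through an optimizer, but the core move is the same: rank a large basket on return-based characteristics rather than analyze any one company.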
Barry Ritholtz: So you studied under Gene Fama; you got your PhD from Chicago, really the home of the efficient market hypothesis. Can you square EMH with quantitative analysis? Are they similar? Because when I think of quants, I think of using powerful computers to try and beat the market. Or, again, am I oversimplifying?

Matthew Rothman: I think the EMH is probably one of the most misunderstood concepts in finance, and Gene Fama's genius was that he really taught us how to think in a very rigorous way about what it means to be an efficient market and what it means to beat the market. Before Fama came along, there were people publishing studies all the time that said they had a strategy to beat the market. I think that drove Fama a little crazy, because the work wasn't very well done and the phrase "beat the market" was very loosely applied. What Fama really taught us was that you have to think about risk and ask, on a risk-adjusted basis, can I beat the market? And then academics have debated for years what the appropriate measure of risk is.
Is it the capital asset pricing model? Is it the Fama-French three-factor model? Is there something else that we're missing? There's now Carhart's momentum factor that gets put in there. But academics have then debated: are those factors anomalies, or are they proxies for risks? We've spent fifty-plus years in academic circles debating what it means to beat the market with a risk-adjusted return. On Wall Street, there's been a generation of Chicago students and others who have come to the street, Cliff Asness and many of the people at Goldman Sachs Asset Management, and then fanned out across the street in many ways. We've all been trained in these methods, not only at Chicago but at other schools as well. And what we've really brought to bear is a very hardcore, rigorous, quasi-academic background for thinking about whether you can make money as a quant and what that means. And so we'll spend less time arguing about whether something is risk or a mispricing.
Is it an anomaly? Does that mean the market is efficient or less efficient? We bring that same sensibility that Fama taught us, but we'll get less involved in the academic debate about risk versus mispricing.

Barry Ritholtz: So let's talk a little bit about building a quant team. You were hired at Credit Suisse to help put a team together. What goes into that? How do you first begin to assemble a quant team?

Matthew Rothman: I think the first thing you need, if you're going to be a quant, is a combination of data and technology. So you need to go out and figure out what the big databases are that you need, where you're going to get your information, what your diversified information set is going to be, what you think your edge is, and go about procuring that.

Barry Ritholtz: So you're building hardware and software, you're hiring programmers...

Matthew Rothman: You're hiring programmers, you have to hire data scientists, an overused term, but people who are going to really understand how to manage and curate and store your data.
And then you have to find researchers who know what to really do with that data, where to find those hidden gems of signals, and who come up with ideas. And then you actually need people who can communicate it.

Barry Ritholtz: So this isn't anything that gets put together very quickly. This is a long process.

Matthew Rothman: This is a long process.

Barry Ritholtz: When Credit Suisse comes to you and says, "Hey, Matthew, we want to build a quant team," do you say, all right, it's going to take five years? Two years? How do you put them into the proper mindset for this?

Matthew Rothman: I say, you've probably got to give me twelve to eighteen months, and expect that I'm going to be in a dark cave and you're going to see nothing from me, and I'm going to be asking you for big checks and hiring people. You lay out a business plan very carefully and detail the costs and exactly what you need. And you've got to make sure they're in it and get the ask, because it's a heavy ask, but what you can get out of it is pretty cool at the end of the day.
Barry Ritholtz: So the competition for the really skilled, fill in the blank, programmers, researchers, data scientists... I think about the giant collection of PhDs at Renaissance Technologies, long before the rest of Wall Street started thinking in those terms. You said big checks; that's got to be a serious commitment made by the firm to build something like this out.

Matthew Rothman: It is definitely a serious commitment by Credit Suisse, and they understand that much of the world is really moving this way. From the firm's perspective, what I believe they understand is that we need to be able to deliver content to those firms you're mentioning that is interesting to them. The way we deliver fundamental research to the biggest asset managers in the world out there, we need to deliver quantitative research along those same domains. And so yes, it's a big ask if you're going to be additive to those people's process and, you know, play with them in the sandbox.

Barry Ritholtz: Is it that competitive to hire people?
I was joking a little bit, but I'm assuming that these folks are really in demand, and you can't really do this on the cheap.

Matthew Rothman: I don't think you can do this on the cheap. You need a relatively well-sized staff, but we're not going to be RenTech. We don't think about that; we don't need that size staff.

Barry Ritholtz: No, no, no.

Matthew Rothman: I think you need a staff of probably five to seven good researchers to be able to produce something interesting. You need a technology team of three to four people. You need a data team of probably another two to three people, or really four, to really begin to curate what you're doing. So it's not crazy in any sense, but you can be very productive and produce really interesting research on the sell side with a team that size.

Barry Ritholtz: So in one of your notes you mentioned Quant 1.0, referring to the quant quake of the summer of 2007.
What do Quant 2.0 and Quant 3.0 look like? What are the changes that are taking place and will take place?

Matthew Rothman: So Quant 1.0 really ended, I think, in August 2007, when there were rather simplistic strategies that a lot of people were using, and we turned on the light in the room, saw everyone else who was there, and realized that we needed to do things to diversify ourselves from each other. And so we've seen that really over the past eight to nine years, where people started to think in different ways, not even so much about forecasting returns, because I don't think we were all that similar there, but really about how we access liquidity in the market, how we optimize our portfolios, how we think about risk, how we put factors together. Could we time factors? Could we not time factors? How do you incorporate macro information into your forecasts? And so people really started to break the paradigm in a lot of ways.
Still within a relatively traditional framework, but beginning to really push that envelope. Just doing simple screening was no longer enough.

Barry Ritholtz: So some of the criticism, and I'm pulling a line here that actually comes from an academic white paper: are quants all fishing in the same small pond with the same tackle box? Implying, hey, these were all crowded trades; everyone was more or less using the same tools and pursuing the same goals. Was that true back then, and is it still true today?

Matthew Rothman: You know, it's one of the criticisms leveled at quants that infuriates me the most. You never hear people say that to fundamental analysts. Right? You're all listening to the same press conference, you're all reading the same earnings report, you're all talking to the same investor relations person, so therefore you must all be the same. So I think it's one of those great misunderstandings about quants: that just because you look at the same data, or studied under Gene Fama, you must all be the same.
And let me give you an example of how even quants can be different, even though on the outside they may look the same. Quants, not surprisingly, like to buy cheap things in the hope that they'll go up in value. I really don't know any investor who likes to buy expensive things and thinks they're going to go down in value.

Barry Ritholtz: Buy low, sell high.

Matthew Rothman: But I don't know anyone who wants to buy high and sell low.

Barry Ritholtz: No, right. Not a great strategy to make money.

Matthew Rothman: But when you're a quant and you say that you want to buy something that's cheap, well, you're programming that, and so all of a sudden you have to be really, really precise about what you mean by cheap. Do you mean it's cheap on a P/E ratio? Do you mean it's cheap on a book-to-price ratio? Do you mean it's cheap on sales-to-price? What metric are you even using to define cheapness? You've got to program that into the computer. And then you've got to say: do you mean it's cheap relative to its own history?
Do you mean it's cheap on a sales-to-price ratio compared to every other stock in the market? Do you want to do it sector-relative? Do you want to do it country-relative? What do you mean? And God is in the details of a lot of these things. If I'm going to look at a book-to-price ratio, do I adjust book values for differences in GAAP standards across industries or not? How do I correct for book value under IFRS, the international accounting standards, versus GAAP? How do I handle all these things? So even when it looks like you're just doing the same thing, "oh, I'm using book-to-price," there can be tons of details.

Barry Ritholtz: Let's talk a little bit about that period of '08-'09, because it's almost ten years ago to the day that the weekend that shook the entire financial firmament took place. I've read a couple of your older research notes, and I have to ask you the question: what actually caused the financial crisis and market crash?
Matthew Rothman: You know, I think the great place to start is Andrew Ross Sorkin's book Too Big to Fail. If you're really interested in the inner workings of Lehman during that time, he nailed it. It's a great read; I couldn't put it down. My wife kept nudging me, like, put the book down, you lived this, why do you have to read it? And I was like, oh no, he's got details in here that some of us were trying to find out. The research that went into it, the sources he got access to, and the people he got talking, are really quite remarkable. I think that there was definitely some level of mismanagement at the top, as he documents, not just at Lehman Brothers but across the board, and across the board a misunderstanding of risk. And it's very hard to know when the music is going to stop, as it were, when successful businesses have run their course. If you remember, nine months back Lehman Brothers was putting up record earnings. So how do you know it's time to get out of that business?
It's a really hard call to make.

Barry Ritholtz: So let's talk about six or nine months back. I read, and I don't think this was in Sorkin's book, I think it was Patterson's The Quants, that you had submitted some memos to senior management, sort of saying, hey guys, you've got to wake up, there's a ton of risk here. And it seemed like you had a sense there were problems coming long before much of the street figured it out.

Matthew Rothman: I think you're being overly generous to me on that one. I think that there were definitely things that concerned me.

Barry Ritholtz: Didn't you ask someone, and again maybe this is Patterson's book, didn't you say, well, what are the contingencies in case Lehman goes bankrupt? And people laughed at you; they looked at you like you were crazy.

Matthew Rothman: There were times when things were going on that disturbed me. I'll give you a little anecdote.
Matthew Rothman: There's a great paper by a professor, Owen Lamont, at the Harvard Business School, who used to be at the University of Chicago. He did a study that found that firms who get into fights with their short sellers, much of the time those firms end up going bankrupt. They're in trouble, and there's nothing left to do but fight with the shorts. Like, stop, right? He documents some of these, and there are great anecdotes in there. And if you remember, towards the end of Lehman Brothers, management got into a fight with one of our... with David Einhorn, I believe. And yes, it was, it was contested. And I did send that paper along to senior people.

Barry Ritholtz: I can just picture the C-suite response to that: wait, this guy is sending me a Harvard Business School white paper about arguing with shorts? Doesn't he realize our very foundation is under assault? Why is this guy sending me a white paper? What is this?

Matthew Rothman: Except, except it was saying: the people who do this, you're scaring me. I was lucky that my boss was a PhD from the University of Chicago as well.
Matthew Rothman: He appreciated these kinds of things. He got it. And, you know, I don't want to say we stopped fighting with him, but we did stop fighting with him, and I think we did start to concentrate on different things. We had some very talented, very bright people there.

Barry Ritholtz: You're literally on the way to London to a conference when you get a phone call. You're in JFK, on the other side of security, and you get a phone call: hey, tap out, you've got to come back. You go home to New Jersey. You take not your little car but your wife's station wagon, with the presence of mind to think, I've got to go clear out my office. And you're described as lucid and rational throughout, like you weren't, oh, what a shock, what a surprise. This seemed to be something that you apparently had thought out before, where most people more or less seemed to be shocked or panicked or both. Is that, again, am I oversimplifying this?

Matthew Rothman: Well, I was certainly emotional. I don't want to say that that wasn't a very emotional night for me.
Matthew Rothman: You know, one of the things that behavioral economists and other people tell you is that the closer you are to a situation, the harder it is for you to take a step back rationally and see what's going on. A lot of Lehman management had lived through ninety four and had lived through other crises, and they really were very, very close to it. I was relatively new at Lehman, and so I had a little different perspective, more objectivity about the situation than they did. I think that was part of the difference, and just being a little more unsentimental.

Barry Ritholtz: Let's talk a little bit about the quant crash of two thousand and seven. I love the story about you and Ozreal Levine figuring out what actually had happened long before anybody else. Was this over sushi or Chinese food in San Francisco?

Matthew Rothman: Sushi. It was a sushi dinner. I've always felt badly that Ozreal Levine, known as Ozzie to his friends, you know, he really should have been the co-author with me on that paper, and he deserves every bit of credit.
Barry Ritholtz: Where was he working at the time? What was the place called?

Matthew Rothman: Menta Capital. He's still there. He used to run BGI's main hedge fund over there, and he had started out on his own. I had been out that day seeing clients and watching the blow-up happening, and we both were just sitting there over sushi, kind of piecing together what would have caused everyone to unwind. It was literally just over a sushi dinner, arguing it back and forth and putting together what the story had to have been.

Barry Ritholtz: So tell that story, because it's fascinating how you guys deduced it was the leveraged multi-strats.

Matthew Rothman: So the story that we came up with, and it still holds up to this day; we can't prove it, but no one has a better story.
Matthew Rothman: And, you know, it's kind of become accepted wisdom. There were a number of multi-strat quantitative hedge funds that held positions in subprime mortgages and low-credit fixed income that were taking losses. This was in the summer of two thousand and seven, when you had the managers at Bear Stearns who were running those fixed income portfolios, and I'm trying to remember, I don't remember the names, but that was June, and that's when things started wobbling. By mid July, the fixed income credit market was in distress, and it was illiquid, and people were beginning to receive margin calls on those books. They were highly levered, and prime brokers and others were coming to the people who held those assets and saying, we need more collateral to support those books.

Barry Ritholtz: So highly levered and illiquid, not a great combination.

Matthew Rothman: Not a great combination.
Matthew Rothman: And the last thing you want to do if you're holding that portfolio is actually liquidate those assets, because the marks probably aren't really at market; they're at some discount to market. But when you try to move them, the mark is going to get set lower. As you try to sell an illiquid asset, it gets marked lower, the whole portfolio gets marked lower, and you're going to need to raise more collateral for the discount on the underlying asset.

Barry Ritholtz: Sounds like portfolio insurance.

Matthew Rothman: It sounds exactly like it. And so, if you're smart and you realize this, to meet the margin call you're not going to sell that asset. You're going to go sell a very highly liquid asset, because you're taking a much smaller haircut on that, if any at all.

Barry Ritholtz: Right, it's a liquid portfolio.

Matthew Rothman: Now, what are the most liquid assets in the world? Probably US large cap equities. So if you're a multi-strat firm, where are you going to go raise that equity? You're going to go liquidate, and many of these were quants, you're going to go liquidate your quant portfolio. And we saw that: if you go back and look at the data, a lot of the quants were losing money throughout most of July. A well known quant manager has come out and said, we lost money twenty one out of the twenty two trading days in July. But it was just a kind of steady trickle; it wasn't really bad. Then it really started to pick up momentum, as it were, in August, and we really think it's because the liquidations and the margin calls became much more severe. Other people were noticing that their portfolios were misbehaving, and so they started to take down risk. People who didn't necessarily have any exposure to these subprime assets saw their quant portfolios not behaving the way they wanted, and they started to take down risk because their models were misbehaving.

Barry Ritholtz: Which just shows you how interrelated everything is.

Matthew Rothman: That's right. We don't have any subprime exposure? Doesn't matter. The people who do are liquidating things that you have exposure to.
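The mechanism described here, margin calls on an illiquid book forcing sales of liquid holdings, which mark down other funds' books and force them to sell in turn, can be sketched as a toy simulation. Everything in the sketch is invented for illustration (the leverage cap, the linear price impact, the two hypothetical funds); it is not calibrated to 2007 or to any real portfolio.

```python
# Toy model of the deleveraging spiral described above. All numbers
# (leverage cap, price impact, balance sheets) are assumed for
# illustration only -- not calibrated to any real fund or market.

PRICE_IMPACT = 0.05   # assumed price drop per share of forced selling
MAX_LEVERAGE = 4.0    # assumed leverage cap that triggers forced selling

class Fund:
    def __init__(self, name, shares, debt):
        self.name = name
        self.shares = shares        # holdings of the shared liquid asset
        self.debt = debt
        self.forced_sales = 0.0

    def leverage(self, price):
        equity = self.shares * price - self.debt
        return (self.shares * price) / equity

    def deleverage(self, price):
        """Sell just enough liquid shares to get back to MAX_LEVERAGE.

        Selling at the current price and paying down debt leaves equity
        unchanged while shrinking gross assets, so leverage falls."""
        equity = self.shares * price - self.debt
        excess = self.shares * price - MAX_LEVERAGE * equity
        sold = max(0.0, excess / price)
        self.shares -= sold
        self.debt -= sold * price
        self.forced_sales += sold
        return sold

price = 100.0
a = Fund("A", shares=100, debt=7600)   # leverage ~4.17: hit by a margin call
b = Fund("B", shares=100, debt=7490)   # leverage ~3.98: fine at first

for _ in range(20):                    # rounds of selling and price impact
    sold = sum(f.deleverage(price) for f in (a, b)
               if f.leverage(price) > MAX_LEVERAGE)
    if sold == 0:
        break
    price -= PRICE_IMPACT * sold       # forced selling sets the mark lower

print(f"price {price:.2f}, A sold {a.forced_sales:.2f}, "
      f"B sold {b.forced_sales:.2f}")
```

Running it, Fund B, which never held the troubled asset, still ends up a forced seller once Fund A's selling moves the shared mark: the contagion channel described in the conversation.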
Barry Ritholtz: And that is the definition of contagion, right, where something that you're not actually exposed to begins to affect another part of the market. And you and Ozzie are putting this together pretty much in real time in early August.

Matthew Rothman: We're putting this together literally over a three to four hour sushi dinner in a restaurant in California, with some sake. We closed the place; we were there until they kicked us out. And, you know, we didn't exactly have the story, we couldn't prove it, but it all made sense, and we kind of got the story. Then I went back to my hotel room and realized that the rest of the trip I had planned in California was out the window. What I needed to do was write this all up. So the next morning I called and told all my sales people: cancel my trip, cancel all my client meetings. I'm going into the San Francisco office and we're writing this note as our quant world is melting down.
Matthew Rothman: And, you know, I stayed up all night; I mean, I got there at eight o'clock in the morning and published that note. I remember walking back from the San Francisco office after I hit the send button. It was almost like the Jerry Maguire moment, when you put that out there, kind of saying, oh my god, what did I just hit the send button on? And I woke up to the most read note, really, in the firm's history.

Barry Ritholtz: Turbulent Times in Quant Land. Really, the timing was perfect, and you guys figured out, if not the best explanation, certainly no one has come along with a better explanation since. I think it is pretty much received as the explanation. So at this point there's a line, and I'm not sure if this is from that note or another one of your writings: events that the models predicted would only happen once in ten thousand years happened every day for three days. In other words, wildly improbable things were happening way too frequently.

Matthew Rothman: Right. I think that's going to be on my tombstone.
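The arithmetic behind that ten-thousand-year line can be sketched directly. Under a Gaussian daily-return model (the assumption being criticized, not endorsed), a "once in ten thousand years" move works out to roughly a five-sigma day, and three in a row would be effectively impossible. The 252-trading-day year and the independence of daily returns are the model's conventions, assumed here purely for illustration.

```python
import math

def normal_tail(z):
    """P(X > z) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

TRADING_DAYS = 252
p_daily = 1.0 / (10_000 * TRADING_DAYS)   # "once in ten thousand years"

# Bisect to find the z-score a normal model assigns to that tail probability.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if normal_tail(mid) > p_daily else (lo, mid)
z = (lo + hi) / 2   # roughly a five-sigma day

# If daily returns really were independent normals, three such days in a
# row would have probability p_daily cubed: effectively impossible.
p_three = p_daily ** 3

print(f"z = {z:.2f} sigma, P(three in a row) = {p_three:.1e}")
```

Seeing several such days back to back is therefore not cosmic bad luck; it is evidence that the model is misspecified, which is exactly the point being made here.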
Matthew Rothman: And, you know, some people have actually criticized me, for that statement, as not understanding that returns are not normally distributed. Of course, what I meant was that things were misbehaving, that our models were misspecified and wrong, and that we obviously did not have the appropriate distribution of returns if that's what our models were saying. I clearly understand statistics. It was a pithy way of trying to say: our models are absolutely wrong. If that's what we're predicting and we're seeing it three days in a row, we don't understand what's going on; our models are wrong.

Barry Ritholtz: So I love the expression: all models are wrong, but some are useful. And your models had previously proven to be useful. What was wrong with all of the quant models? Some people were blaming Gaussian copulas, and other people were saying, no, this is strictly a subprime derivative CDO contagion. Where were the models off?
Matthew Rothman: I think where the models were off is that liquidity wasn't appropriately factored in, and notions of crowding just were not in the models. To be honest with you, we didn't know how to think about that. We didn't know how to think about crowding risk, and we didn't really know how to think about liquidity the way we do today. We held more concentrated positions at that time. While we might have only held a fraction of average daily volume, ADV, and traded those positions carefully, we let them build up too much as a portion of our book. We didn't spread the bets out across enough different stocks, and we ran with way too much leverage.

Barry Ritholtz: So there's no doubt leverage is always a giant problem whenever there's a headache. But you had done some subsequent research that found, hey, the correlations were much lower than everybody believed. Everybody that was talking about crowded trades assumed people were all, if not in the exact same investments, in such similar holdings that it didn't make a difference.
Barry Ritholtz: But you found the correlation was something around...

Matthew Rothman: What I tried to do was decompose why we were in crowded trades. I don't think anyone is denying that quants were holding the same portfolios. The received wisdom was that it was because our return prediction models, our alpha models, were all the same; that we were all fishing for alpha in the same pond. And what I actually managed to do was convince the biggest quant firms out there that they should give me the outputs of their models for a period of a year. And they did.

Barry Ritholtz: That's a lot of trust. All right, here are the crown jewels; try not to let anybody else get hold of them.

Matthew Rothman: That was the relationship I had with my clients. They actually gave me the outputs of their models, because we all thought this was a really important problem to figure out: what drove us into these same trades? And what I saw was that the actual outputs of the models weren't all that correlated. It wasn't an alpha modeling problem. People, as we talked about before, have different ways of predicting returns. If you and I were to ask, what is the stock that's going to have the highest return over the next six months, or ask your listeners, a lot of people would have very different opinions on a stock like Netflix or Tesla or Amazon, any of the FANG stocks. We might all have very different forecasts. But if we were to ask what the most risky stocks are, we could probably all list a lot of the same names. There's dispersion in the outcome, and we don't know which way it's going to go, but we can agree it's risky.

Barry Ritholtz: We have been speaking with Matthew Rothman. He is currently the head of Quantitative Equity Strategies at Credit Suisse. We love your comments, feedback and suggestions. Write to us at MIB podcast at Bloomberg dot net. You can check out my daily column on Bloomberg View dot com, or follow me on Twitter at Ritholtz. I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. Welcome to the podcast, Matthew.
Barry Ritholtz: Thank you for being so generous with your time. I find this stuff fascinating. I was about to ask you: you mentioned Andrew Ross Sorkin's book Too Big to Fail. Did you read Patterson's book The Quants?

Matthew Rothman: I did.

Barry Ritholtz: I found it fascinating because I love the characters. It's all my favorites: Cliff Asness, Jim Simons of Renaissance, Ed Thorp. You're mentioned in it; a number of people are in it. I found it to be a really fascinating tale. What was your take on it?

Matthew Rothman: Am I going to have to apologize for...

Barry Ritholtz: No, no, no, it's fine.

Matthew Rothman: My frustrations with the book were that I found it a little overwritten and a little over-sensationalized.

Barry Ritholtz: So here's where I have to throw your own words back at you. Are you closer to that narrative than you are to the Sorkin narrative?

Matthew Rothman: Maybe. You know, I know some of the characters in there, and some of them do have tempers. But I read the poker game that starts it, and I was looking at it like, what is this about?
Matthew Rothman: It was, yes. And, you know, for those of us who were close to it... I mean, there's a detail in the book, just as a small example, that drives me crazy, where Scott has me coming off a red-eye flight from New York to San Francisco. No, it's the other way around. There are no red-eye flights from New York to San Francisco. It's San Francisco to New York, and it's barely a red-eye, because it's five and a half hours, right?

Barry Ritholtz: Whereas if you're going to London, it's eight hours.

Matthew Rothman: Exactly. And so it's just little things like that, which you pick up and you're like, he's got the details wrong. And when someone starts getting the details wrong on little things that are so obvious, you start distrusting some of the other stuff, where it's harder to see.

Barry Ritholtz: You're bursting my balloon; I just love that book so much. But I can understand, as someone who was so close to the story and so close to the characters, that you see the embellishments.
Matthew Rothman: Sure. And if you know some of these people... the way I know Cliff Asness today versus that book, those are two very different characters. The Asness character in the book is a little harder, whereas I know him as this mischievous guy with a wicked wit. I mean, he's just outright hilarious. He's a funny guy; he has a great sense of humor, a great personality. He doesn't come across that way in the book; he is a little hard in the book. I'm sure he drives people to produce results; look at the company he's built and all of those things. I would expect nothing else of a multi-billionaire who had the vision to build the incredible company that he and his partners have built. But you're right, the charming side of Cliff does not come through. The witty, hilarious, charismatic part that makes Cliff the legend that he is doesn't shine through in that book.
Matthew Rothman: And so those are the types of things that bother me about it.

Barry Ritholtz: But what's fascinating to me is this thread that runs throughout that whole book, and we'll get to books a little later. How quant was sort of disdained; people almost, I don't want to say laughed at it, but it was kind of, you put the numbers geeks in the basement, we're actually running a real firm here. It almost starts like that, and it ends up at, oh, quant is taking over all of Wall Street, and you people who didn't understand or appreciate it, well, you missed the bus, and here's the next big thing.

Matthew Rothman: That thread is fascinating, and I think it's even more true today than it ever has been. You know, today you shouldn't be putting Excel on your resume; that's a given, just like Word, right? Today, if you want to stand out, you'd better be talking about how you can program in Python and R, and all those sets of skills that you have, if you really want to be successful on Wall Street today and going forward.

Barry Ritholtz: So before I get to the standard questions, there are a couple of things I missed that I want to come back to. HOLT, H O L T, is a pretty substantial product at Credit Suisse. Can you explain exactly what that is? Because when I started researching it, I'm like, wow, how have I never heard of this? It's a great product.

Matthew Rothman: Yeah, no, it's a great product. It's not part of my domain; there's a team there that's been together for close to twenty five years, maybe more than that.

Barry Ritholtz: Pretty successful.

Matthew Rothman: Very successful.
And what they've really done is collect accounting data for companies over all those years and figure out how to normalize it, and really begin to look at companies across different industries and different countries and put them all on an equal footing. Then they take those cash flows and look at what the return on capital really is for these companies, and the implied growth rates for them, and come back and look at what is then being implied for what the appropriate stock valuation should be. So it's a wonderful tool, very interactive, that lets people compare companies all across the globe on an apples-to-apples basis and look at them from a fundamental accounting perspective. It's very, very powerful and has a wide following across the investor base.
Barry Ritholtz: And I assume a lot of people are just unfamiliar with it. I was looking at it saying, how have I not seen anything mentioned of this in the media?
Matthew Rothman: We'll set you up with a trial.
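The valuation arithmetic sketched here, normalized cash flows, return on capital, and market-implied growth, can be illustrated with a toy perpetuity-growth model. This is a generic sketch with hypothetical numbers, not HOLT's actual methodology:

```python
def implied_growth(price, cash_flow, discount_rate):
    """Back out the growth rate implied by a stock price, assuming a
    Gordon-style growing perpetuity: price = cash_flow / (r - g).
    Toy illustration only; not HOLT's actual methodology."""
    return discount_rate - cash_flow / price

# Hypothetical company: $5/share of normalized cash flow, trading at $100,
# discounted at 8%. What long-run growth is the market pricing in?
g = implied_growth(price=100.0, cash_flow=5.0, discount_rate=0.08)
print(f"implied long-run growth: {g:.1%}")  # implied long-run growth: 3.0%
```

The same backed-out growth rate can then be compared across companies, which is the apples-to-apples comparison the interview describes.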
Matthew Rothman: Anytime you like.
Barry Ritholtz: I could get lost in that; I'll have my head of research start something like that. So, since the quant crash in '07, we've seen two really interesting changes in the market. One has been, I don't want to call it the rise of indexing, because that's been going on for forty years, but certainly a broader mom-and-pop embrace of indexing. And then second, at the same time, volatility has really fallen off a cliff. How have those two factors impacted the ability of quants to make money in the market?
Matthew Rothman: Yeah, the way I have really understood the rise of indexing, and I'm probably not unique in my insight here, is that what investors in 2008 and 2009 had really hoped for was managers who were going to be able to give them some level of insurance and protect them in those moments in time, and that just didn't happen.
And so I think people have been driven by lower fees, and I think the fiduciaries, the plan sponsors who are managing many of the retirement accounts and pension funds, have been disappointed by that fact as well. So you've seen this move toward lower-fee types of investing that can deliver what seems to be the same kind of outcome for a lesser price. And so investors have definitely flocked to that. You've seen even this past year that the funds that have attracted the greatest inflows have not only been passive, but the absolute lowest-priced passive funds. So even within low fee, it's been the absolute lowest fee that has attracted the inflows.
Barry Ritholtz: I remember some years ago Morningstar did a study. Their bread and butter is the star rating system, yet they did this study that said, if you could only know one thing about a fund, what should it be? And the answer was fees. Just forget everything else, just buy the lowest fees, and net-net, on average, you're going to end up with the best performance.
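The force of that fee argument is easy to see with a little compounding arithmetic (a hypothetical fund with identical 7% gross returns under two fee levels):

```python
def final_value(principal, gross_return, fee, years):
    """Grow principal at (gross_return - fee), compounded annually."""
    return principal * (1 + gross_return - fee) ** years

# Hypothetical: $100k over 30 years at 7% gross, comparing a 1.00% fee
# against a 0.05% index-fund fee. Same gross performance, different net.
high_fee = final_value(100_000, 0.07, 0.0100, 30)
low_fee = final_value(100_000, 0.07, 0.0005, 30)
print(f"1.00% fee: ${high_fee:,.0f}")
print(f"0.05% fee: ${low_fee:,.0f} (${low_fee - high_fee:,.0f} more)")
```

The gap compounds to well over a hundred thousand dollars on these assumed numbers, which is why fee level alone is such a strong predictor of relative net performance.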
And to Morningstar's credit, not only did they publish it, it's still up on the website; you can go see it. It essentially argues, ignore the stars, just look at fees.
Matthew Rothman: And there's a whole bunch of academic research out there that has been making that point for a number of years as well. Professor Gruber down at NYU has published some of the seminal studies on that, and others have too; he's not alone in that. But that work really made the point that low fees are one of the biggest predictors of future outperformance.
Barry Ritholtz: Wow, that's pretty significant. There's a line you have in some of your notes, and I just love this, I have to share it: you must have the right dictionary. If a trader in an instant message writes "this is a dog with fleas," you need to know what they're really saying. Much less if they're saying "I'm doing market research," which just means they're watching YouTube videos.
So "this is a dog with fleas" means, I don't want to touch this, I want nothing to do with it?
Matthew Rothman: Yeah, it means this is a bad stock, don't own it, right. Or it's trader-speak, like "what are you doing?" "Market research," meaning you're watching YouTube. Where this comes up is that there's a whole field in finance, and it's not that new, but it's really gotten a lot of momentum over the last five to seven years, of trying to understand text: parsing text and trying to understand the meaning within it and what people are saying. Whether it's reading earnings transcripts, or reading annual reports, or reading news in general, or, from a compliance perspective, reading instant messages, the question is how you begin to understand context and what people are really saying. And so if you're reading trader-speak, your dictionary of words for trying to understand what a trader is saying is different than if you're reading a novel.
Barry Ritholtz: So let's talk a little bit about that.
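The right-dictionary idea can be sketched in a few lines: score a message against a domain-specific lexicon rather than a general-purpose one. The lexicon entries and scores below are invented for illustration; real systems use large curated dictionaries and handle negation, context windows, and so on:

```python
# A tiny, invented trader-speak lexicon: phrase -> sentiment score.
# A general-English dictionary would miss or mis-score all of these.
TRADER_LEXICON = {
    "dog with fleas": -1.0,   # a stock to avoid
    "market research": -0.5,  # slang for goofing off on YouTube
    "crushing it": 1.0,
}

def score_message(text: str, lexicon: dict) -> float:
    """Sum the scores of every lexicon phrase found in the message."""
    text = text.lower()
    return sum(score for phrase, score in lexicon.items() if phrase in text)

msg = "Honestly this one is a dog with fleas"
print(score_message(msg, TRADER_LEXICON))  # -1.0
```

Swapping in a different lexicon for earnings calls versus chat messages is exactly the "right dictionary" point: the parsing machinery stays the same, the vocabulary changes.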
Because one of the 758 00:43:23,320 --> 00:43:26,120 Speaker 1: questions I didn't get to how to do with machine 759 00:43:26,200 --> 00:43:30,120 Speaker 1: learning and artificial intelligence, which and then throwing big data. 760 00:43:30,239 --> 00:43:35,080 Speaker 1: These are burgeoning areas for research, not just for quant trading, 761 00:43:35,120 --> 00:43:39,600 Speaker 1: but for everything. Big data is now devouring the whole world. 762 00:43:40,280 --> 00:43:45,480 Speaker 1: How do you interact with artificial intelligence and machine learning 763 00:43:45,520 --> 00:43:49,400 Speaker 1: when it comes to figuring out what is out in 764 00:43:49,440 --> 00:43:54,440 Speaker 1: the world and translating it to an expressible investment theme. So, 765 00:43:55,200 --> 00:43:57,320 Speaker 1: you know, I think this is a really exciting time 766 00:43:57,360 --> 00:44:00,760 Speaker 1: to be a quant because the word world is becoming 767 00:44:00,840 --> 00:44:05,239 Speaker 1: more and more and more and more digitalized every day, 768 00:44:05,320 --> 00:44:07,880 Speaker 1: and we are able to get our hands on data 769 00:44:07,920 --> 00:44:13,240 Speaker 1: sets as quantitative investors that we could only dream of, um, 770 00:44:13,280 --> 00:44:17,000 Speaker 1: you know, five seven years ago. And so the real 771 00:44:17,080 --> 00:44:20,160 Speaker 1: question is that you have these huge data sets, how 772 00:44:20,160 --> 00:44:23,880 Speaker 1: do you begin to process them? Uh and look for 773 00:44:24,120 --> 00:44:29,400 Speaker 1: signal within them? Uh? And so you've really seen To 774 00:44:29,719 --> 00:44:32,239 Speaker 1: make that happen, you've had to have two other revolutions 775 00:44:32,280 --> 00:44:34,440 Speaker 1: that have had to come along at the same time. 
One is that computing power, and hardware and software in general, has had to go through a revolution, and we've seen exponential increases in computing. At the same time, the price of storage just fell off a cliff. So we can go to the web, to AWS (Amazon Web Services) or Azure or Google, and buy tens and tens and hundreds of terabytes of storage extraordinarily cheaply; rent it, and when you're done, that's it, that's all you need. And we've moved from processing on CPUs, where everything had to be done in a hierarchical structure, to rewriting the code so that everything can run in parallel on GPUs, graphical processing units: much faster, in tandem, cheaper. And so you've seen exponential growth in computing power that is really, really hard to overstate, and it's been accelerating even beyond that over the past eighteen months.
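The restructuring from one-at-a-time processing to running the same computation across many records side by side can be shown in miniature with Python's standard multiprocessing module. The per-record workload here is a hypothetical stand-in, and real GPU code would use a framework such as CUDA, but the reshaping of the computation is the same idea:

```python
from multiprocessing import Pool

def analyze(record: int) -> int:
    """Stand-in for an independent per-record computation (hypothetical)."""
    return record * record

if __name__ == "__main__":
    records = range(1_000)
    # Hierarchical: one record at a time on a single core.
    serial = [analyze(r) for r in records]
    # Parallel: the same work spread across all available cores.
    with Pool() as pool:
        parallel = pool.map(analyze, records)
    assert serial == parallel  # same answers, computed in parallel
```

The key property is that each record is independent, so the work can be split across cores (or GPU lanes) without any coordination between items.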
So we're just beginning to understand and unlock the horizon here. And this has allowed you to process amounts of data that are hard to imagine. An example: there is a company out there that is now literally taking pictures of the entire globe, every day, at three-meter resolution, and storing that data, so you can see the tiniest changes anywhere.
Barry Ritholtz: Well, three-meter resolution is what the law allows, right? So you can tell whether something is a car or a bus.
Matthew Rothman: Right, we can't see people. But the storage on that is a yottabyte.
Barry Ritholtz: How big is a yottabyte?
Matthew Rothman: It's a trillion terabytes. So how do you now process that data? That requires a lot of computing power and a lot of storage capability. That's now economical to do, where five years ago it was a pipe dream.
Barry Ritholtz: That's amazing. You know, I'm familiar with Waze, which was bought by Google, which uses Android phones to look at traffic.
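That unit conversion checks out: under the SI definitions, a terabyte is 10^12 bytes and a yottabyte is 10^24 bytes, so a yottabyte is indeed a trillion terabytes:

```python
TERABYTE = 10 ** 12   # bytes, SI definition
YOTTABYTE = 10 ** 24  # bytes, SI definition

# A yottabyte expressed in terabytes: 10^24 / 10^12 = 10^12, i.e. a trillion.
print(YOTTABYTE // TERABYTE)  # 1000000000000
```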
I was just reading about a company that uses cell-phone signals to identify actual foot traffic in malls, and they identified way early that retail was in trouble, before it was front-page news that Amazon is doing this and that this or that company is in trouble. They had figured it out. And it costs hundreds of thousands of dollars a year for these services sometimes, right? But if you know two years in advance, hey, by the way, retail is about to fall off a cliff, it will pay for itself.
Matthew Rothman: Yeah, I mean, the data sets that are available, like I said, are really quite remarkable: literally your foot traffic, your credit-card spending data, reading email receipts, and, as you said, traffic data. If there's a data set that you want and you don't think you can get it, you're not looking hard enough.
At this point, we can track literally every bill of lading for every cargo container coming into the United States, at T plus one, tomorrow.
Barry Ritholtz: The planes are crazy, that you can see every plane in the sky: where they're coming from, where they're going, what type of plane it is.
Matthew Rothman: You can ask Siri on your phone, "tell me the planes that are above my head," and she'll tell you.
Barry Ritholtz: Really? I'm going to have to try that.
Matthew Rothman: You literally just ask Siri what plane is above me, and she'll tell you the planes that are right above you. So all this data is what people are beginning to look at, and it's a bit of an arms race for everybody to try to keep up with it, and to try to understand what's out there and how you process it, because there's probably no one data set that's going to give you the holy grail of everything.
It's really about how you take all these disparate data sets and combine them in a thoughtful manner; that's what's really going to give you your answers.
Barry Ritholtz: Interesting. All right, so I only have you for another ten or fifteen minutes. Let me get to my favorite questions. Well, I was going to ask you, what's the most important thing people don't know about your background? But I suspect you've already answered that.
Matthew Rothman: Actually, I don't think I have.
Barry Ritholtz: So it's not Springsteen?
Matthew Rothman: It's not Springsteen. Springsteen is very important to my background, but people who know me know that. And actually, Springsteen lyrics are pretty much always hidden somewhere in my notes; if you're a Springsteen fan and you read my notes, you'll find a hidden Springsteen reference in there. And sometimes they're not so subtle. But what people probably don't know about me is that I was very lucky to be born to two academics who teach at Columbia.
And my father was, I guess is, a professor of American social history, and really kind of founded the field of American social history, and he taught me at a very young age to be questioning and dubious of your sources. And so when I was ten, he dedicated a book to me called The Sources of American Social History, and it says, "To Matthew, to understand that American history is more than the study of great people." It's a book of unconventional sources that try to study how institutions work, and to study history as a study of institutions: not acts of Congress or acts of war or what great people are doing, but study the church, study the prison, study the hospital, and the experience of people within those settings, and understand the biases of these sources and look for unconventional sources. And so I like to think that that kind of training about data was embedded in me at a very, very early age.
And 886 00:51:14,360 --> 00:51:18,160 Speaker 1: looking for things UM and biases and things and being 887 00:51:18,280 --> 00:51:22,960 Speaker 1: skeptical of the wisdom you're receiving of what you're being told, UM, 888 00:51:23,000 --> 00:51:26,600 Speaker 1: how markets actually work, the players in them. All of 889 00:51:26,640 --> 00:51:29,840 Speaker 1: that was really instilled at me from age ten to eleven. 890 00:51:30,000 --> 00:51:32,800 Speaker 1: So you're an M. I. T. Darren Asamoglu is there 891 00:51:32,960 --> 00:51:36,440 Speaker 1: and he talks about the role of institutions in the 892 00:51:36,480 --> 00:51:40,120 Speaker 1: economy and people shouldn't be looking at these big events 893 00:51:40,280 --> 00:51:44,960 Speaker 1: or these fed chiefs, should be looking at these societal institutions. 894 00:51:45,320 --> 00:51:48,759 Speaker 1: They have a much greater impact on things like economic 895 00:51:48,800 --> 00:51:52,960 Speaker 1: inequality and recessions and cycles than does anyone person or 896 00:51:53,000 --> 00:51:57,080 Speaker 1: anyone sort of event. Very much along your dad's along 897 00:51:57,080 --> 00:52:00,359 Speaker 1: those lines, um, And I think you know, I think 898 00:52:00,560 --> 00:52:02,319 Speaker 1: you know, I know. One of your questions to me, 899 00:52:02,760 --> 00:52:04,080 Speaker 1: I don't mean to jump the gun on any of 900 00:52:04,080 --> 00:52:07,520 Speaker 1: your questions is favorite books. So let's let's jump the gun. 901 00:52:07,600 --> 00:52:11,760 Speaker 1: Let's well, before we do that, let's because I'm anticipating 902 00:52:12,160 --> 00:52:14,040 Speaker 1: the answer to who were your mentors? I have to 903 00:52:14,080 --> 00:52:17,560 Speaker 1: assume your father was a mentor of yours, of course, 904 00:52:17,680 --> 00:52:19,920 Speaker 1: of course, I mean, um, that's a little cliche to 905 00:52:19,960 --> 00:52:22,319 Speaker 1: say that your dad was a mentor. 
Barry Ritholtz: Well, when someone dedicates a book to you...
Matthew Rothman: Obviously, of course. And my father was absolutely influential in my life and my thinking, teaching me how to write, just taking a red pen to my writing and arguing with me about ideas. And I had a professor in college, I went to Brown, who was a huge mentor to me, my advisor. I did a lot of independent studies with him, and he grilled me on ethics and rigor of thought and the law, on what are constitutional rights versus non-constitutional rights, and just took me under his wing and really shaped my thinking in many, many hard ways. I think another of my mentors had to have been a guy by the name of Sid Brown, who was a professor at Columbia and who saw something in me. When I was a master's student there, I wandered into his graduate class on stochastic calculus; it was filled with PhD students, and somehow I got the high grade.
I'm still to this day not sure how I did that. And he took me under his wing and taught me a lot, and now he's a friend and colleague, and he's been a trusted mentor and advisor throughout my career. And then there was a gentleman at Lehman Brothers, where I was in many ways much too young to get the position I did as head of quant. I had been out of my PhD program all of five years.
Barry Ritholtz: Oh really? So you were a little green.
Matthew Rothman: I was a little green. And he threw me in as a managing director, head of all the quantitative equity research at Lehman, five years out of my PhD program. So I was green; I didn't quite know how to behave with other senior managing directors and how that whole world worked. And Robbie was absolutely harsh and brutal to me, but in a loving way. I was one of Robbie's children, as I described it.
He taught me how to grow up and how to behave in a major, world-class institution, and what was expected of me. Not so much the quant; I mean, he helped me on the quant, but he really helped me mature as a manager and as a person, and in how one carries oneself in a role. And I remember telling my team, every week before my weekly meeting with Robbie, "I don't know what I did wrong this week, but I'm about to go find out." And it was painful at times, but I absolutely love him for it. He made me such a better workplace person every day, and it was hard at the time, but I adore him for taking that time and attention. Everybody, if he's listening to this, you know, everyone needs one.
Barry Ritholtz: So, you mentioned books. Let's talk about some of your favorite books, fiction and non-fiction. What do you read for fun? What do you read for informational purposes?
Matthew Rothman: I love documentary photography. I am a huge fan of it, and I collect documentary photography books; I have a pretty extensive collection. And I am always on the hunt for that next great documentary photography book, or collecting the masters.
Barry Ritholtz: Give us the name of a book for someone who has no experience with documentary photography but wants to explore the space.
Matthew Rothman: You have to start postwar documentary photography with Robert Frank's The Americans. That was an absolutely revolutionary book, along with Cartier-Bresson's The Decisive Moment. But Frank changed photography forever. And then there comes a whole series of lesser-known but utter masters, from Eugene Richards's work documenting poverty in America, which one has to be aware of and see, to the work of people like Ron Haviv, which is just legendary.
983 00:56:40,400 --> 00:56:44,240 Speaker 1: His photographs were actually submitted into the War Crimes Tribunal 984 00:56:44,680 --> 00:56:48,800 Speaker 1: UH in the Hague UM documentary the atrocities um uh 985 00:56:48,840 --> 00:56:52,480 Speaker 1: you know that happened in the UM you know, and 986 00:56:52,719 --> 00:56:56,000 Speaker 1: in what was Yugoslavia, UM. You know, just the some 987 00:56:56,080 --> 00:56:58,560 Speaker 1: of the most important work. There's work by people like 988 00:56:58,600 --> 00:57:02,640 Speaker 1: Don McCullen, the legendary British war photographer, documenting the atrocities 989 00:57:02,680 --> 00:57:07,240 Speaker 1: in Biafra and Vietnam and around the world UM, heroic people. UM. 990 00:57:07,280 --> 00:57:11,360 Speaker 1: Tim Heatherington's doing work that unfortunately he died UM, covering 991 00:57:11,880 --> 00:57:15,240 Speaker 1: UH in Africa and Libya. And these are just um, 992 00:57:15,239 --> 00:57:17,560 Speaker 1: you know, moving work. Of course, the work by Robert Kappa, 993 00:57:17,600 --> 00:57:20,080 Speaker 1: the legendary war photographer who was in the first wave 994 00:57:20,120 --> 00:57:23,280 Speaker 1: of the boats UM coming off of d day. Um. 995 00:57:23,400 --> 00:57:26,520 Speaker 1: So just um, very very variable, powerful work. And there 996 00:57:26,520 --> 00:57:29,360 Speaker 1: are people who are still doing this work today, um 997 00:57:29,680 --> 00:57:33,080 Speaker 1: out there that just don't get any love uh and attention. 998 00:57:33,200 --> 00:57:36,919 Speaker 1: People for like Alex Webb at Magnum, Ed Kashi um 999 00:57:37,440 --> 00:57:40,160 Speaker 1: at you know seven. Uh there's I don't I'm just 1000 00:57:40,240 --> 00:57:45,040 Speaker 1: missing people. James Knockway. Uh, you know, truly legendary. Listen. 1001 00:57:45,080 --> 00:57:48,440 Speaker 1: I'll make sure it gets included with just on this. 1002 00:57:48,800 --> 00:57:50,520 Speaker 1: I could go on. 
I have hundreds and hundreds of these. 1003 00:57:50,680 --> 00:57:54,040 Speaker 1: Let me ask about something a little lighter than 1004 00:57:54,680 --> 00:57:57,080 Speaker 1: that. What else do you read? What do you read for enjoyment, 1005 00:57:57,160 --> 00:58:01,160 Speaker 1: for just pure relaxation? You know, you're talking to 1006 00:58:01,200 --> 00:58:04,480 Speaker 1: a guy who just loves quant and loves reading quant finance. 1007 00:58:04,520 --> 00:58:06,600 Speaker 1: I'm really into the work right now by a guy 1008 00:58:06,600 --> 00:58:09,840 Speaker 1: by the name of Jaron Lanier. He's the philosopher in 1009 00:58:09,920 --> 00:58:14,440 Speaker 1: chief at Microsoft. Okay, I knew, yeah. And he's 1010 00:58:14,440 --> 00:58:17,000 Speaker 1: written this great book called You Are Not a Gadget. 1011 00:58:17,440 --> 00:58:21,120 Speaker 1: And he's really thinking very hard about the 1012 00:58:21,240 --> 00:58:24,800 Speaker 1: role of human beings and artificial intelligence in this 1013 00:58:24,840 --> 00:58:28,680 Speaker 1: big data world. There's just great stuff to 1014 00:58:28,760 --> 00:58:33,200 Speaker 1: read out there. There's other work, a great 1015 00:58:33,240 --> 00:58:37,160 Speaker 1: book called Behave by Robert Sapolsky. So it's 1016 00:58:37,200 --> 00:58:40,280 Speaker 1: a big book. It's a big book somebody else recommended, 1017 00:58:40,360 --> 00:58:42,360 Speaker 1: and I actually picked it up not too long ago. You know, 1018 00:58:42,480 --> 00:58:45,440 Speaker 1: it's really, I mean, it's not light stuff. But 1019 00:58:45,520 --> 00:58:49,200 Speaker 1: it really begins to explain, you know, 1020 00:58:49,240 --> 00:58:51,680 Speaker 1: getting to these arguments of what is nature versus what 1021 00:58:51,840 --> 00:58:53,920 Speaker 1: is nurture? And how do we learn? And how do 1022 00:58:54,000 --> 00:58:58,040 Speaker 1: human beings change behavior?
And, you know, how ingrained 1023 00:58:58,160 --> 00:59:01,840 Speaker 1: are things like violence in our societies or not? And 1024 00:59:01,880 --> 00:59:05,840 Speaker 1: he studies examples coming from baboons and how baboons learn, 1025 00:59:06,000 --> 00:59:09,200 Speaker 1: and you think baboons have this ingrained, 1026 00:59:09,240 --> 00:59:11,840 Speaker 1: but they reach these tipping points where societies really change. 1027 00:59:11,880 --> 00:59:16,280 Speaker 1: And so really getting at this work, 1028 00:59:16,320 --> 00:59:21,000 Speaker 1: that's just fascinating. Any fiction, or is that strange? Oh, no, 1029 00:59:21,080 --> 00:59:24,520 Speaker 1: I love fiction. I'm a huge fan of Paul Auster. 1030 00:59:25,040 --> 00:59:28,479 Speaker 1: What's the book? He's written this great series 1031 00:59:28,480 --> 00:59:30,320 Speaker 1: of books out there called The New 1032 00:59:30,360 --> 00:59:32,800 Speaker 1: York Trilogy. But he's written a great book called Invisible, 1033 00:59:33,080 --> 00:59:36,520 Speaker 1: and again, it's kind of postmodernist fiction, 1034 00:59:36,600 --> 00:59:38,360 Speaker 1: where you always kind of have to figure out what's 1035 00:59:38,400 --> 00:59:40,640 Speaker 1: the story. He writes them as detective stories, but they're 1036 00:59:40,680 --> 00:59:43,400 Speaker 1: much deeper than that. I love the stories by Raymond Carver. 1037 00:59:43,520 --> 00:59:45,640 Speaker 1: I wish I could ever write like Raymond Carver's short 1038 00:59:45,720 --> 00:59:50,200 Speaker 1: stories, or Richard Ford. The poetry of Marie Howe, 1039 00:59:50,240 --> 00:59:54,320 Speaker 1: you know, What the Living Do is amazing.
1040 00:59:54,360 --> 00:59:57,400 Speaker 1: You know, so, there's just a lot of these 1041 00:59:57,520 --> 01:00:01,200 Speaker 1: just wonderful books out there. You're on 1042 01:00:01,240 --> 01:00:04,720 Speaker 1: a lot of planes. I am, but you just 1043 01:00:04,760 --> 01:00:07,440 Speaker 1: find time to read. Well, if you 1044 01:00:07,520 --> 01:00:09,760 Speaker 1: want to read a book, you have to carve out 1045 01:00:09,760 --> 01:00:12,640 Speaker 1: a specific time. Otherwise it's just not going to happen. 1046 01:00:12,680 --> 01:00:15,000 Speaker 1: There are too many distractions. Yeah, you know, 1047 01:00:15,080 --> 01:00:17,120 Speaker 1: the TV. You know, just keep the TV off. Don't 1048 01:00:17,120 --> 01:00:19,960 Speaker 1: get addicted to Law and Order: SVU. Actually, this 1049 01:00:20,040 --> 01:00:21,560 Speaker 1: is a good list. I think people are gonna have 1050 01:00:21,560 --> 01:00:24,520 Speaker 1: some good feedback about this. So, we've talked about 1051 01:00:24,560 --> 01:00:28,360 Speaker 1: the arc of quant and how it's changed. So 1052 01:00:28,400 --> 01:00:30,080 Speaker 1: I don't know if I have to ask you what's 1053 01:00:30,160 --> 01:00:33,120 Speaker 1: changed since you joined the industry. But what might be 1054 01:00:33,200 --> 01:00:37,560 Speaker 1: a little more insightful for listeners is, what changes do 1055 01:00:37,600 --> 01:00:40,120 Speaker 1: you see just beginning to happen now? What are the 1056 01:00:40,120 --> 01:00:43,520 Speaker 1: next major shifts that are already underway and we're just 1057 01:00:43,560 --> 01:00:46,840 Speaker 1: not aware of them?
I think it's really this 1058 01:00:46,960 --> 01:00:50,680 Speaker 1: quant three point oh, as I call it, 1059 01:00:50,720 --> 01:00:56,480 Speaker 1: which is really beginning to understand these disparate data sources 1060 01:00:56,560 --> 01:00:59,680 Speaker 1: that are out there and how we begin to use 1061 01:00:59,760 --> 01:01:04,760 Speaker 1: them and incorporate them into an investment process. And 1062 01:01:05,320 --> 01:01:09,560 Speaker 1: you know where, you know where that data is useful 1063 01:01:09,640 --> 01:01:13,200 Speaker 1: and where that data isn't useful. And I think a 1064 01:01:13,240 --> 01:01:16,240 Speaker 1: little bit of what's been happening in the industry has 1065 01:01:16,280 --> 01:01:20,560 Speaker 1: been putting the cart before the horse, where there is 1066 01:01:20,640 --> 01:01:24,080 Speaker 1: this explosion of data, as we were talking about, and 1067 01:01:24,160 --> 01:01:27,480 Speaker 1: at this point it's just so cool. Like you can 1068 01:01:27,560 --> 01:01:30,360 Speaker 1: literally, if you can imagine it, you can get the 1069 01:01:30,440 --> 01:01:33,240 Speaker 1: data for it. But we also need to slow 1070 01:01:33,280 --> 01:01:36,640 Speaker 1: down and start thinking about it from an investor's perspective 1071 01:01:36,760 --> 01:01:39,959 Speaker 1: of, what is the data that I need that's going 1072 01:01:40,000 --> 01:01:44,880 Speaker 1: to help me solve my investment controversy, and 1073 01:01:45,120 --> 01:01:47,840 Speaker 1: kind of turn things back on their head.
And 1074 01:01:47,840 --> 01:01:49,919 Speaker 1: that's what I'm hoping the industry will start to do, 1075 01:01:50,200 --> 01:01:52,560 Speaker 1: and not just be data for data's sake and just 1076 01:01:52,640 --> 01:01:55,800 Speaker 1: consuming these vast amounts of it and warehousing it, 1077 01:01:56,000 --> 01:01:58,560 Speaker 1: but really think, what's the crux of 1078 01:01:58,600 --> 01:02:01,200 Speaker 1: the question, what's the controversy? And then how do 1079 01:02:01,240 --> 01:02:04,600 Speaker 1: I go and get that data? And that 1080 01:02:04,640 --> 01:02:07,200 Speaker 1: can be in so many different forms of, how do 1081 01:02:07,280 --> 01:02:09,680 Speaker 1: I read text when I want to figure out what 1082 01:02:09,720 --> 01:02:13,760 Speaker 1: people are talking about? 1083 01:02:13,800 --> 01:02:16,400 Speaker 1: How do I understand body language from reading the text 1084 01:02:16,440 --> 01:02:20,240 Speaker 1: of an earnings transcript? How do I infer? Even like, 1085 01:02:20,280 --> 01:02:22,840 Speaker 1: the big questions in natural language processing are, how do 1086 01:02:23,000 --> 01:02:26,320 Speaker 1: I look at double negatives or triple negatives? How do 1087 01:02:26,360 --> 01:02:29,520 Speaker 1: I begin to infer what you mean versus what you 1088 01:02:29,680 --> 01:02:33,840 Speaker 1: actually said? Like, we can do that as humans, sometimes. 1089 01:02:33,880 --> 01:02:37,920 Speaker 1: Sometimes. The written word, like on, not that Twitter 1090 01:02:38,000 --> 01:02:41,640 Speaker 1: has anything to do with the real world, but I 1091 01:02:41,720 --> 01:02:46,720 Speaker 1: notice that sarcasm or snark very often gets lost in 1092 01:02:46,760 --> 01:02:51,439 Speaker 1: the written word, apart from the intentions. It does.
1093 01:02:51,480 --> 01:02:53,840 Speaker 1: And that's what makes email dangerous as a form of communication, 1094 01:02:53,880 --> 01:02:55,840 Speaker 1: and why sometimes it's better to pick up your phone instead 1095 01:02:55,840 --> 01:02:58,120 Speaker 1: of writing an email, or if you're getting upset at 1096 01:02:58,120 --> 01:02:59,960 Speaker 1: an email, to actually pick up the phone and ask 1097 01:03:00,040 --> 01:03:02,360 Speaker 1: your colleague, what do you mean? Let's have a 1098 01:03:02,400 --> 01:03:05,680 Speaker 1: quick conversation before you hit send on the snarky reply back 1099 01:03:05,720 --> 01:03:07,560 Speaker 1: to them, because they may have meant something considerably different 1100 01:03:07,560 --> 01:03:10,040 Speaker 1: and you're not getting it. So people have difficulty 1101 01:03:10,080 --> 01:03:14,600 Speaker 1: interpreting actual written words. Can machines do what humans in 1102 01:03:14,600 --> 01:03:18,880 Speaker 1: this case can't do? Not today, not today. 1103 01:03:18,960 --> 01:03:21,680 Speaker 1: But that's the frontier of where we're moving, right, and 1104 01:03:21,720 --> 01:03:24,400 Speaker 1: that's what we have to do. Machines today 1105 01:03:24,400 --> 01:03:28,960 Speaker 1: have trouble with just double negatives or triple negatives. It 1106 01:03:29,040 --> 01:03:33,320 Speaker 1: wasn't a great quarter, but it was okay, and it 1107 01:03:33,480 --> 01:03:38,320 Speaker 1: exceeded our expectations, which were fairly modest. Get a machine 1108 01:03:38,360 --> 01:03:42,840 Speaker 1: to parse that, right? It sounds like all those clauses 1109 01:03:42,920 --> 01:03:45,640 Speaker 1: sort of contradict each one before it, right? And getting to, 1110 01:03:45,800 --> 01:03:48,120 Speaker 1: like, you know what I'm trying to say. Get the 1111 01:03:48,160 --> 01:03:52,280 Speaker 1: machine to parse the full context of that sentence.
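[Editor's note: the negation problem described in this exchange can be sketched in a toy example. Everything below is invented for illustration (the word lists, the scoring rules); it is not any production NLP system. It only shows why a naive word-count scorer reads the "wasn't a great quarter" sentence as strongly positive, while even a crude negation rule tempers that score.]

```python
# Toy word lists, chosen only to cover the example sentence.
POSITIVE = {"great", "okay", "exceeded"}
NEGATIVE = {"modest", "missed", "bad"}
NEGATORS = {"wasn't", "not", "isn't", "never"}

def naive_score(text: str) -> int:
    """Positive minus negative word counts, ignoring negation entirely."""
    words = text.lower().replace(",", "").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def negation_aware_score(text: str) -> int:
    """Flip the polarity of the first sentiment word after a negator."""
    words = text.lower().replace(",", "").split()
    score, negate = 0, False
    for w in words:
        if w in NEGATORS:
            negate = True
        elif w in POSITIVE:
            score += -1 if negate else 1
            negate = False
        elif w in NEGATIVE:
            score += 1 if negate else -1
            negate = False
    return score

sentence = ("It wasn't a great quarter, but it was okay, and it exceeded "
            "our expectations, which were fairly modest")
print(naive_score(sentence))           # naive reading looks quite positive
print(negation_aware_score(sentence))  # "wasn't" tempers the score
```

Even the "aware" version misses the deeper point in the conversation: that "exceeded our expectations, which were fairly modest" is faint praise, something no word-level rule captures without context.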
See, 1112 01:03:53,480 --> 01:03:57,000 Speaker 1: I don't own that company. Maybe. Maybe it depends upon 1113 01:03:57,040 --> 01:03:58,880 Speaker 1: what you want to do or not, right? You know, 1114 01:03:59,120 --> 01:04:02,760 Speaker 1: get a machine to know that when inflation comes in 1115 01:04:02,880 --> 01:04:07,200 Speaker 1: higher than expectations in Japan, right, that's a good thing, 1116 01:04:08,280 --> 01:04:11,600 Speaker 1: at least in the present deflationary environment. Sure. Right. So 1117 01:04:11,720 --> 01:04:14,040 Speaker 1: you have to teach machines context. You have to teach 1118 01:04:14,080 --> 01:04:16,680 Speaker 1: them nuance. You have to teach them the ability to 1119 01:04:16,880 --> 01:04:20,640 Speaker 1: understand when this is bad here but not bad there, 1120 01:04:20,640 --> 01:04:25,800 Speaker 1: even though it's the same. That's right. That's the frontier. 1121 01:04:25,920 --> 01:04:29,600 Speaker 1: And that's all done through models, teaching models 1122 01:04:29,600 --> 01:04:32,640 Speaker 1: for this to work within. That's right. Now, I'm not 1123 01:04:32,680 --> 01:04:35,480 Speaker 1: saying that we're there by any stretch of the imagination, 1124 01:04:35,520 --> 01:04:37,439 Speaker 1: please don't mishear me on that. But I'm saying 1125 01:04:37,480 --> 01:04:40,520 Speaker 1: that's where we're headed, right? And that's the frontier. 1126 01:04:40,920 --> 01:04:43,760 Speaker 1: And the question is, how quickly, with this 1127 01:04:43,880 --> 01:04:48,240 Speaker 1: explosion in, you know, kind of processing power and 1128 01:04:48,600 --> 01:04:51,120 Speaker 1: all the texts and all these things that are being digitized, 1129 01:04:51,400 --> 01:04:55,640 Speaker 1: will we get there? Interesting. I kind of feel like 1130 01:04:55,680 --> 01:04:59,040 Speaker 1: I already asked you this question.
Tell me about a time 1131 01:04:59,120 --> 01:05:02,920 Speaker 1: you failed, because you described so succinctly the 1132 01:05:04,480 --> 01:05:08,080 Speaker 1: quant crash in oh seven. But is that a fair question? 1133 01:05:08,520 --> 01:05:10,960 Speaker 1: Tell us about a time you've failed and what you 1134 01:05:11,040 --> 01:05:14,440 Speaker 1: learned from it. You know, if you go back 1135 01:05:14,560 --> 01:05:20,720 Speaker 1: and actually read my third grade report card, yeah, 1136 01:05:21,360 --> 01:05:24,040 Speaker 1: it amazingly talks about someone who hasn't really learned a 1137 01:05:24,040 --> 01:05:27,560 Speaker 1: lot since third grade or changed their behavior. My 1138 01:05:27,640 --> 01:05:30,280 Speaker 1: third grade teacher said, Matthew always turns in his homework 1139 01:05:30,280 --> 01:05:34,720 Speaker 1: assignments late, but when he does, it far surpasses 1140 01:05:34,720 --> 01:05:38,480 Speaker 1: our expectations. Do you still have a problem with tardiness? 1141 01:05:38,520 --> 01:05:41,640 Speaker 1: Is that an ongoing thing? I've gotten much better. 1142 01:05:42,480 --> 01:05:45,520 Speaker 1: But you know what I've had to learn over time is, 1143 01:05:45,800 --> 01:05:47,960 Speaker 1: as one of my bosses put it, to not 1144 01:05:48,200 --> 01:05:50,960 Speaker 1: let the better be the, you know, the enemy of 1145 01:05:51,000 --> 01:05:53,400 Speaker 1: the perfect. Right, you mean the perfect be the enemy of the better? 1146 01:05:53,400 --> 01:05:56,080 Speaker 1: Excuse me. And so, you know, 1147 01:05:56,360 --> 01:05:59,280 Speaker 1: get out version one, get out version two, get out 1148 01:05:59,360 --> 01:06:02,600 Speaker 1: version three. What was the great technology line? 1149 01:06:02,840 --> 01:06:06,840 Speaker 1: Good programmers ship? Is that it? Yes.
So, 1150 01:06:06,960 --> 01:06:08,840 Speaker 1: I see people are coming into the studio, so before we 1151 01:06:08,880 --> 01:06:11,560 Speaker 1: get thrown out of here, let me ask my two 1152 01:06:11,600 --> 01:06:16,360 Speaker 1: favorite questions. If a millennial or some recent college graduate 1153 01:06:16,680 --> 01:06:18,200 Speaker 1: were to come up to you and say, hey, I'm 1154 01:06:18,200 --> 01:06:22,520 Speaker 1: thinking about going into quantitative research in finance, what sort 1155 01:06:22,560 --> 01:06:27,280 Speaker 1: of advice would you give them? Program, program, program. Program 1156 01:06:27,480 --> 01:06:30,720 Speaker 1: much more than statistics, applied mathematics, calculus? So 1157 01:06:30,880 --> 01:06:33,480 Speaker 1: the thing is, if I'm looking for 1158 01:06:33,560 --> 01:06:36,440 Speaker 1: my ideal candidate, they need to know how to program, 1159 01:06:36,480 --> 01:06:41,640 Speaker 1: they need to know how to do statistics and econometrics, 1160 01:06:41,800 --> 01:06:44,560 Speaker 1: they need to know finance, right, and they need to 1161 01:06:44,560 --> 01:06:47,680 Speaker 1: be curious. I can't teach you how to program, right? 1162 01:06:47,800 --> 01:06:50,200 Speaker 1: I've tried to take people who have those other characteristics 1163 01:06:50,200 --> 01:06:54,120 Speaker 1: and teach them to program. Utter failure. But you 1164 01:06:54,120 --> 01:06:58,000 Speaker 1: know, you also want people who are curious and 1165 01:06:58,200 --> 01:07:01,760 Speaker 1: skeptical, and skeptical of our own work. I think that's 1166 01:07:01,760 --> 01:07:04,640 Speaker 1: the biggest thing. Like, realize that you're going to make 1167 01:07:04,680 --> 01:07:07,200 Speaker 1: mistakes, and find the errors in your work before I 1168 01:07:07,240 --> 01:07:12,320 Speaker 1: find them. That's really interesting.
And our final question: what 1169 01:07:12,480 --> 01:07:15,960 Speaker 1: is it that you know about quantitative investing today that 1170 01:07:16,040 --> 01:07:19,000 Speaker 1: you wish you knew fifteen or twenty years ago? You know, 1171 01:07:19,040 --> 01:07:22,560 Speaker 1: I guess my 1172 01:07:22,600 --> 01:07:27,080 Speaker 1: short answer to that is, people need a much healthier 1173 01:07:27,160 --> 01:07:30,040 Speaker 1: respect for the fact that it's a model and it's 1174 01:07:30,080 --> 01:07:33,640 Speaker 1: going to be wrong. And understand that even the 1175 01:07:33,720 --> 01:07:37,960 Speaker 1: best models, that are actually true or have very 1176 01:07:38,000 --> 01:07:42,200 Speaker 1: good performance, are going to go through periods 1177 01:07:42,200 --> 01:07:45,080 Speaker 1: of big underperformance. And that doesn't mean you get rid 1178 01:07:45,120 --> 01:07:48,240 Speaker 1: of the model. You know, how do you diversify 1179 01:07:48,280 --> 01:07:52,200 Speaker 1: across those models and kind of, you know, work 1180 01:07:52,280 --> 01:07:54,960 Speaker 1: through those periods? I think that's the biggest thing that 1181 01:07:55,040 --> 01:07:57,520 Speaker 1: everyone who's trying to invest in quant really kind 1182 01:07:57,520 --> 01:08:01,640 Speaker 1: of has to have an appreciation for. Quite fascinating. We have been 1183 01:08:01,720 --> 01:08:05,760 Speaker 1: speaking with Matthew Rothman. He is the head of Quantitative 1184 01:08:05,840 --> 01:08:10,800 Speaker 1: Equity Strategies at Credit Suisse. If you enjoyed this conversation, 1185 01:08:10,880 --> 01:08:13,320 Speaker 1: be sure to look up an inch or down an inch on 1186 01:08:13,360 --> 01:08:17,000 Speaker 1: Apple iTunes, and you can see any of the hundred 1187 01:08:17,040 --> 01:08:21,760 Speaker 1: and fifty or so such conversations that we have had previously.
1188 01:08:22,280 --> 01:08:26,720 Speaker 1: We love your comments, feedback and suggestions. Write to us 1189 01:08:26,760 --> 01:08:31,080 Speaker 1: at MIB podcast at Bloomberg dot net. I would 1190 01:08:31,120 --> 01:08:33,640 Speaker 1: be remiss if I did not thank Taylor Riggs for 1191 01:08:33,680 --> 01:08:36,040 Speaker 1: helping to produce the show and set up these interviews. 1192 01:08:36,680 --> 01:08:41,240 Speaker 1: Michael Batnick is our head of research. Medina Parwana 1193 01:08:41,520 --> 01:08:45,320 Speaker 1: is our audio engineer. Our recording engineer today is 1194 01:08:45,520 --> 01:08:49,120 Speaker 1: Caroline O'Brien. Thank you, Caroline, for filling in last minute. 1195 01:08:49,520 --> 01:08:53,000 Speaker 1: I'm Barry Ritholtz. You've been listening to Masters in Business 1196 01:08:53,439 --> 01:09:00,320 Speaker 1: on Bloomberg Radio.