This is Masters in Business with Barry Ritholtz on Bloomberg Radio. This weekend on the podcast, I have a special guest. Her name is Allison Schrager, and among other things, she is the author of a book that I took with me on vacation and absolutely found intriguing. My version is just demolished because I plowed through it on a beach in Turks and Caicos, and really, despite everything going on around me, I kind of ignored it and just worked my way through the book. I love the title, An Economist Walks Into a Brothel, which really has nothing to do with sex and everything to do with finding ways to do risk-reward analyses in really unusual places. So whether it's big wave surfing or horse breeding, or poker playing or paparazzi, there are all these unusual situations where we don't really think about the risk-reward analysis, but really the details of that have a major impact on how these industries and these individuals progress. And once you start looking into that, it changes the way you look at everything from insurance to annuities to hedging to market-based portfolios.
Risk permeates everything we do, and most of us just don't give it enough time and thought to recognize the dangers and advantages it potentially can afford us. So if you're at all interested in, fill in the blank, investing, insurance, understanding risk, understanding what happens in a brothel from an economic perspective, I think you're gonna find this to be an absolutely fascinating conversation. So with no further ado, my interview with Allison Schrager.

This is Masters in Business with Barry Ritholtz on Bloomberg Radio. My special guest this week is Allison Schrager. She is an economist and adjunct professor at NYU, a journalist at Quartz, and co-founder of LifeCycle Finance Partners, which is a risk advisory firm. She is also the author of An Economist Walks Into a Brothel and Other Unexpected Places to Understand Risk. Allison Schrager, welcome to Bloomberg.

Thank you for having me.

My pleasure. So let's talk a little bit about your background. You describe in the book making a series of what you later called risky career decisions, not exactly sure what you wanted to do. Your assumption was, hey, PhD program in economics.
Obviously you're gonna become a professor. How did that work out?

Well, in the end, I think it worked out well for me, but the path was a lot rockier than I would have expected. I think, like a lot of people, especially in this day and age, I fell into this idea that more education was just better and it would open up all these doors. And most of the time it does. I think that is normally a good bet, and certainly an economics PhD is a good thing to have in life. But I had kind of a rocky transition when I finished grad school, because I had invested essentially my whole twenties into learning all these skills and really being cut off from the rest of the world, and then realized what I was investing in was this idea of becoming a professor, and had this realization as I graduated that this was not what I wanted to do at all.

So undergrad at Edinburgh, PhD at Columbia University. What sort of thesis were you working on there?

Weirdly, even when I was like eighteen, I was always really fascinated by retirement.
So my PhD thesis was on risk in retirement. I started around two thousand, so this is as DC plans had really taken a foothold in the market, and I was doing some work in the UK where they were taking over, so really understanding more deeply the risks in defined contribution versus defined benefit pensions.

So eventually you come to the realization, hey, I'm not going to be any sort of professor. And then you happened to have a job interview with a world-famous Nobel Prize winning economist, Robert Merton. What happened with that?

Well, as I said, I had this hard realization when I was finishing grad school. I didn't want to be a professor, I didn't want to be a government bureaucrat, all the things you're supposed to do with an economics PhD. So I kind of had this burn it all down, do something new attitude, which also turned out to be helpful for the book. So I was like, I'm gonna do something fun. I've just spent my whole twenties, while everyone else was partying, solving a math problem.
So I went to The Economist unpaid, because this was early days of web journalism, and they would take people who had never written before to write for the web back then.

And you get what you pay for, right?

So I was just writing web stuff for The Economist for free, with an econ PhD. And then someone passed my dissertation along to Bob Merton, and it turned out retirement was also his main interest. He was working on market solutions to the retirement problem.

So let me interrupt you there and ask some questions, because this is really kind of fascinating. You're wildly overqualified to turn out web-based nonsense where you're not being paid for a website, and you know they treat unpaid volunteers not especially well, and they have the same respect for their content. Who said, oh, I know, Allison's PhD thesis, let me give this to Bob? He's busy, he won a Nobel Prize, but he'll like this. How on earth did that come about?

A friend of mine who I think had him when he was doing his MBA at Harvard. And because of my dissertation, that is exactly what Bob was doing.
Bob was trying to come up with financial models that took the best of defined benefit plans and put them in a defined contribution structure, and that turns out, you know, as much as I said burn it all down, I always had a good dissertation in which I thought about risk in a clever way. Bob didn't know I'd sort of hit bottom career-wise. You know, it just goes to show, something can feel like it totally blew up. I was thinking, I just really messed up here, you know, I just wasted all this time. And then my paper ended up in front of him and he was like, I really want to do this. He called me in. He's like, if you come work with me, I will teach you finance.

Let me tell you, when I was first handed a book whose title was An Economist Walks Into a Brothel, I'm intrigued. Okay, now your thought is, they're not going in there as a customer or anything like that. Your thought is, what sort of wacky economic data are they going to be analyzing?
As an economist looking at the sex trade, it's a pretty fascinating subject. How did you find your way to Nevada and going to the Bunny Ranch?

So I wrote a story for Quartz, because, it's interesting, I've always been interested in risk, even as a journalist. You know, after I started working with Bob... before, I was a macroeconomist, then I learned finance. I'm like, everything I knew about the economy, I was wrong. Risk is a much more rigorous and interesting way to understand the macro economy and every economic problem. So I was applying that in sort of this Freakonomics-y way as a journalist, and I wrote a story about a friend of a friend who was running an online brothel where her value add was screening clients. This was an illegal operation, so when you work illegally, you have to screen your clients. Otherwise, and these were for submissives specifically, so they get tied up, so they're particularly vulnerable, so screening carries an especially important premium. She was being paid this premium to screen them, and I was like, well, this is pretty cool.
So I wrote the story about it, and it did super well, as you can imagine. And then I got a call from the Bunny Ranch saying, if you're gonna be writing about brothels, you should be writing about us. And I thought, well, I don't write about brothels, but this is an interesting call, so I'm gonna continue this. And so I was talking to them, and I'm like, how does this all work? They're like, well, the women come in, and we don't set prices, they negotiate every transaction. It was kind of another throwaway thing he said. And I was like, well, that's actually very interesting. So you're telling me you have women who are about twenty years old negotiating with men in their sixties over tens of thousands of dollars? He's like, yeah, and it's interesting you say that, because most of them come here not knowing how to negotiate, so we have a negotiation training program.

So the Bunny Ranch in Nevada reaches out to you and says, hey, if you want to have a conversation about a brothel and about risk and analyzing numbers, come to us.
What was your thought process when you got this phone call?

Well, at first I was like, this isn't my thing, this isn't what I'm known for, I'm a retirement economist. But then they said the thing about, we have a negotiation training program, which is something I struggle with, too. Like, you would think a car dealership has a negotiation training program, not a brothel, or a sales job on Wall Street. And then he even said the headline: the women here don't know their value, so we teach them to know their value and to ask for enough. And I was like, well, I could use that. So I talked my editor at Quartz into sending me there to go through their negotiation training program.

Was the immediate reaction, this is a great story?

Oh yeah, they sent a video crew.

All right, so they were pretty hip that, hey, this is funky and new. Was this freaky before Freakonomics?

Freakonomics was already out, but okay, it took it to another level, because I don't think Steve Levitt spent a lot of time in brothels.
Now, he focused on young drug dealers in the inner city and other similarly freaky stuff. But this is like right up his alley, for sure. So you get to the Bunny Ranch. How receptive were the women to speaking to you about all these sorts of economics-related issues of salary and compensation and negotiation?

In the beginning, some are very wary, some are very open. But once you get talking to them, they open up, because I think a lot of people go there and don't really take them seriously as businesswomen.

And okay, Dennis, when he was alive, this is the guy who was running the Bunny Ranch?

Yes. He really had a lot of good business training going on. And the women there are great businesswomen and they're proud of what they know. So once you get them talking about that, they do open up. Because this is a special skill a lot of them have learned, and a skill, honestly, I think most people could learn on their jobs, and they don't.
So how to value the worth of your own work product relative to the marketplace and relative to the customer on the other side of the desk from you, and how to ask for it, how to feel comfortable. Especially, it's an interesting negotiation because negotiation can be very fraught, and afterward you're gonna have this very intimate encounter with this person. So making that transition, which is, I mean, a more exaggerated version, Silicon Valley calls that the pivot, it's a more exaggerated version of what we all do, right? Like, we have to negotiate with someone and then we have to work with them. So this is just that on steroids, huh? And so what is the secret? What did they do that's somewhat different, or what did they teach you that you didn't know going into this?

I think I always saw negotiation as very adversarial, and what I learned is how to make it not so, how to just put a bunch of things out there, which apparently is a negotiation technique. Here's a menu, you choose A, B, or C. That's really interesting, and therefore then it's not adversarial.
It's just, hey, you know, everyone feels like they're getting what they want: I'm customizing this experience for you. So when I was there, I learned a lot about pricing, sex pricing, and something that fascinated me was how much more they could charge than the illegal market.

And now, the assumption is, from the john's point of view, they're going into a place where they know the workers have been tested for STDs, they know they're not going to get, um, ripped off, they know they're not going to get arrested. That should be worth some sort of premium, shouldn't it?

I never thought of it that way before, but that, always on the lookout for risk, fascinated me, because, I mean, the premium is large. When I went back to the brothel for book work, I surveyed a lot of the women on their transactions. And then it just turned out that this economist I knew scraped all the data from The Erotic Review, which is Yelp for illegal sex work. So I had one point two million illegal sex transactions. So I had a really good, robust data set.
So I was kind of surprised in reading that chapter, that section of the book, the Bunny Ranch. To go to the Bunny Ranch, it's a couple of thousand dollars per, um, experience, I don't know what it's called.

The median price I found was four hundred dollars an hour.

That's a lot of money.

Yeah. You know, especially if you just hire a woman online, even for escorting, which is more high end. I think the equivalent to what you get at the Bunny Ranch is three or four hundred dollars an hour.

That's all? Yeah, and you don't have to go to Nevada.

Yeah, but for that four hundred dollars, the risk is yours. You might get arrested, you might get robbed, you might, who knows, you might get killed.

So it's worth a premium, one would argue. And you effectively do that in An Economist Walks Into a Brothel, that what you're paying is an insurance premium to eliminate all of the risk associated with illegal sex-for-hire transactions.

And on the other side too, because you think these women, they're getting so much money, but they're really not.

Now, what's the payout to the woman relative to the average?
Let's call it a fifteen hundred dollar average. They're getting, what, you wrote about half goes to them?

Well, not even. Half goes to the brothel. So they have to give fifty percent of their take to the brothel. And then, they're legal sex workers, right, so they're ten ninety-nine employees, so there's the self-employment tax; they have to cover that. And I mean, these are high earners. They're making, you know, probably at least a hundred grand a year. And a lot of them don't live in Nevada, so they might have state income taxes on top of it all.

So we're talking... So do they do an IRA or a Keogh? They have to have some sort of retirement funds.

So Dennis had really good financial planners coming in and giving them very good financial literacy. I was talking to these women who came from households where no one had a bank account. They're like, I didn't even know what a credit score was. And then they're telling me, oh yeah, my IRA is in index funds, you know, why pay the fees for active?

That's just hilarious.
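The split she describes can be sketched as back-of-the-envelope arithmetic. The fifty percent house cut and the fifteen hundred dollar transaction come from the conversation; the tax rates below are illustrative assumptions, not actual figures for any year or filer:

```python
# Rough take-home sketch for a 1099 worker splitting revenue with the house.
# The 50% house cut and $1,500 transaction come from the conversation;
# the tax rates are illustrative assumptions only.

def take_home(gross, house_cut=0.50, se_tax=0.153, income_tax=0.20):
    """Pay left after the house's cut, self-employment tax, and income tax."""
    after_house = gross * (1 - house_cut)        # the brothel keeps half
    after_se_tax = after_house * (1 - se_tax)    # 1099s owe self-employment tax
    return after_se_tax * (1 - income_tax)       # assumed marginal income tax rate

print(round(take_home(1500.0), 2))  # roughly 508 of the original 1500
```

Even before any state income tax, well under half of the headline price reaches the worker, which is the point being made.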
I have a million horrific puns, and I'm not going to touch a single one of them. Wait, so are you saying they were passive investors, not active? Is that what you're saying, they were indexers?

Right, they're all indexers.

That's hilarious. That's really hilarious. So what was the single most surprising thing that you came away from the Bunny Ranch with, having interviewed all these professional legal sex workers?

The most surprising thing? I mean, I know it sounds almost patronizing, and I knew they'd be smart, but I was shocked at how much I learned from them about business, about negotiation, and about risk.

So more than just smart, savvy.

Very savvy.

Let's talk a little bit about risk, because I think different people think of risk differently. How can you define what risk is for the average person?

Well, I think of risk as an estimate of all the different things that could happen and how probable they are. As I say, it's a very technical definition.
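That definition, an estimate of all the different things that could happen and how probable they are, is just a probability distribution over outcomes. A toy sketch, with entirely made-up numbers:

```python
# Risk as "all the different things that could happen and how probable they are":
# a toy distribution of annual returns (the outcomes and probabilities are invented).
outcomes = {-0.20: 0.1, 0.00: 0.3, 0.07: 0.5, 0.25: 0.1}  # return -> probability

mean = sum(r * p for r, p in outcomes.items())                     # expected outcome
variance = sum(p * (r - mean) ** 2 for r, p in outcomes.items())   # spread = risk

print(round(mean, 4), round(variance, 4))
```

Two portfolios can share the same expected return while one has far more spread; the distribution, not the average, is the risk.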
Let me ask you the same question a little bit differently, then. Since you run a firm that's got the words life cycle in its name, how does risk change over the course of a person's lifetime? Well, you have career risk, you have academic risk, you have retirement risk. I mean, there have to be a million different points in one's life where the risks that are presented to you are very different, with different ramifications.

Totally. And how we are able to deal with risk and understand risk, and how risk averse we are, can also change over a life cycle, which makes it even more complicated. You know, there's all this evidence about behavioral biases, but there's also evidence that as people get older, those biases tend to be less prevalent.

Really? So is it that we get a little wiser with age, or it just matters less?

I think we get a little wiser with age. There's experience as well. Experience really changes how you perceive risk.
Like, once you've seen things blow up for you a couple of times, suddenly you become a little more risk averse, or a little more aware of the probabilities you face, or take time to hedge or insure when you take risks.

Let's talk about that, because you have a few chapters in the book on the differences between insurance and hedging. So let's talk a little about hedging, whether we're referring to a market perspective or any other perspective. What is hedging, and what should the average person use it for?

This is a distinction that I think isn't often made very clearly. In fact, I had lunch with the CEO of an insurance firm, and even he kept, we kept mixing up insurance and hedging. So it's a very subtle but important difference.

And you make it clear they're two very distinct things.

They are. And if you draw a picture of what they mean on a graph, it's very clear, but intuitively it's a very hard difference. So I think of hedging as: you just take less risk.
326 00:18:07,520 --> 00:18:11,480 Speaker 1: So in the basic finance world, you have a 327 00:18:11,560 --> 00:18:14,080 Speaker 1: risky asset and you have a risk free asset. So 328 00:18:14,160 --> 00:18:17,119 Speaker 1: hedging is just putting more of your portfolio in the 329 00:18:17,240 --> 00:18:20,879 Speaker 1: risk free asset. The typical sixty forty stock and bond portfolio: 330 00:18:21,040 --> 00:18:24,080 Speaker 1: you're not so much hedging your stocks as you're removing 331 00:18:24,119 --> 00:18:27,200 Speaker 1: some risk and putting it into much lower risk fixed 332 00:18:27,200 --> 00:18:31,159 Speaker 1: income. Exactly. So you're hedging your portfolio balance. So 333 00:18:32,359 --> 00:18:36,880 Speaker 1: I've always thought of hedging as: I'm willing to give 334 00:18:36,960 --> 00:18:42,240 Speaker 1: up some upside and in exchange reduce my downside. That's 335 00:18:42,280 --> 00:18:44,879 Speaker 1: exactly it. Okay. So if instead 336 00:18:45,200 --> 00:18:48,520 Speaker 1: of sixty forty you do fifty fifty, that's less 337 00:18:48,560 --> 00:18:51,560 Speaker 1: upside if stocks do well, but also less downside risk 338 00:18:51,600 --> 00:18:55,560 Speaker 1: if stocks crash. Or conversely, I'll own X Y Z 339 00:18:55,640 --> 00:18:58,320 Speaker 1: stock and I'll marry a put to it, and that 340 00:18:58,359 --> 00:19:01,560 Speaker 1: protects it. It costs me something, costs me a couple 341 00:19:01,560 --> 00:19:04,160 Speaker 1: of percent, but hey, if the stock falls out of bed, 342 00:19:04,320 --> 00:19:08,760 Speaker 1: the put should cover some percentage of that downside. Yeah, 343 00:19:08,800 --> 00:19:11,480 Speaker 1: though I usually think of puts as a little bit more insurance, 344 00:19:11,520 --> 00:19:13,080 Speaker 1: as I was saying. So now let's talk 345 00:19:13,080 --> 00:19:16,760 Speaker 1: about insurance. How do you think of insurance? What
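The "take less risk" framing of hedging above can be sketched in a few lines. This is a toy illustration only — the two-asset setup and all return figures below are assumptions for the sake of the example, not numbers from the conversation:

```python
# Hedging as "take less risk": shift weight from the risky asset to the
# risk-free one, giving up upside in good years to soften the bad ones.
# All return figures here are hypothetical.

def portfolio_return(stock_weight, stock_ret, bond_ret=0.02):
    """Blend a risky (stock) return with a low-risk (bond) return."""
    return stock_weight * stock_ret + (1 - stock_weight) * bond_ret

GOOD_YEAR, BAD_YEAR = 0.20, -0.20  # hypothetical stock returns

for w in (0.60, 0.50):  # sixty-forty vs. fifty-fifty
    up = portfolio_return(w, GOOD_YEAR)
    down = portfolio_return(w, BAD_YEAR)
    print(f"{int(w * 100)}/{int((1 - w) * 100)}: up year {up:+.1%}, down year {down:+.1%}")
```

With these made-up numbers, moving from sixty-forty to fifty-fifty shaves the good year from about +12.8% to +11.0% and softens the bad year from about -11.2% to -9.0%: less upside, less downside, exactly the trade being described.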
What 346 00:19:16,840 --> 00:19:20,879 Speaker 1: purpose does insurance serve? So insurance is a little different. 347 00:19:20,920 --> 00:19:24,360 Speaker 1: Rather than giving up, you know, upside, what you're 348 00:19:24,400 --> 00:19:26,000 Speaker 1: doing is you're paying someone a fee. So you give 349 00:19:26,040 --> 00:19:27,879 Speaker 1: up that amount of upside, but it's a set amount, 350 00:19:28,200 --> 00:19:33,160 Speaker 1: and you eliminate downside risk. So a predetermined cost, and 351 00:19:33,280 --> 00:19:37,119 Speaker 1: what you're purchasing is taking that risk, or eliminating that 352 00:19:37,200 --> 00:19:40,280 Speaker 1: risk from yourself, and putting it onto someone else in a 353 00:19:40,320 --> 00:19:43,600 Speaker 1: specific state of the world. Yeah, so it's insurance, which 354 00:19:43,640 --> 00:19:46,560 Speaker 1: is: if X happens — if this stock 355 00:19:46,600 --> 00:19:51,280 Speaker 1: goes above the strike price — then I get something. So 356 00:19:51,520 --> 00:19:54,520 Speaker 1: I've never owned a car where I did not have, 357 00:19:55,000 --> 00:19:58,199 Speaker 1: at one point in the ownership of that car, 358 00:19:58,359 --> 00:20:00,560 Speaker 1: some piece of gravel or rock thrown up on our 359 00:20:00,600 --> 00:20:03,840 Speaker 1: local crappy highways here in New York either ding 360 00:20:04,520 --> 00:20:08,920 Speaker 1: or crack the windshield. So once I could start affording it, 361 00:20:09,240 --> 00:20:11,919 Speaker 1: I always get glass insurance on the car, and I 362 00:20:11,960 --> 00:20:17,680 Speaker 1: have replaced literally every single windshield on every car I've ever owned. 363 00:20:17,840 --> 00:20:20,360 Speaker 1: Is that a good use of insurance, or am 364 00:20:20,359 --> 00:20:23,720 Speaker 1: I just padding their profits and wasting money? It sounds 365 00:20:23,760 --> 00:20:25,480 Speaker 1: like it.
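The "predetermined cost, downside handed to someone else" shape described above is easiest to see in a protective-put payoff. A minimal sketch, with made-up prices, strike, and premium (nothing here is quoted in the conversation):

```python
# Insurance-shaped payoff: pay a fixed premium up front, and below the
# strike the put reimburses further losses, capping the downside.
# Prices, strike, and premium are hypothetical.

def stock_only(price_later, price_now=100.0):
    """Profit from just holding the stock."""
    return price_later - price_now

def stock_with_put(price_later, price_now=100.0, strike=95.0, premium=3.0):
    """Profit from the stock plus a put; the worst case is capped."""
    put_payoff = max(strike - price_later, 0.0)  # put pays off only below the strike
    return (price_later - price_now) + put_payoff - premium

for p in (120.0, 100.0, 60.0):
    print(f"stock at {p:.0f}: bare {stock_only(p):+.0f}, insured {stock_with_put(p):+.0f}")
```

However far the stock falls, the insured position here never loses more than 8 (the 5 points of price above the strike plus the 3-point premium) — the premium is the fixed, known cost, and the open-ended downside is the insurer's problem.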
I mean, if you're 366 00:20:25,480 --> 00:20:28,240 Speaker 1: claiming it. Well, but you know, a car you have three, 367 00:20:28,280 --> 00:20:31,560 Speaker 1: four or five years at least, um, and you're paying 368 00:20:31,680 --> 00:20:34,240 Speaker 1: whatever it is, a hundred bucks. So I guess what 369 00:20:34,440 --> 00:20:36,520 Speaker 1: I'm really paying for is: hey, I don't have to 370 00:20:36,720 --> 00:20:38,280 Speaker 1: worry about whether or not I break a windshield — 371 00:20:38,359 --> 00:20:41,639 Speaker 1: somebody else is responsible for that. Is that 372 00:20:41,720 --> 00:20:44,639 Speaker 1: a fair definition of insurance? Even though you're paying a 373 00:20:44,680 --> 00:20:48,240 Speaker 1: fixed amount, that stress and worry goes away. Exactly, because 374 00:20:48,280 --> 00:20:51,240 Speaker 1: now the stress of breaking the window is on the 375 00:20:51,240 --> 00:20:53,400 Speaker 1: insurance company. I mean, you still have to go through 376 00:20:53,400 --> 00:20:56,680 Speaker 1: the rigmarole of replacing it, but the financial risk is borne 377 00:20:56,680 --> 00:20:59,600 Speaker 1: by someone else. So let's talk a little bit about 378 00:20:59,600 --> 00:21:03,040 Speaker 1: the book and some of the other things, um, you describe. 379 00:21:03,200 --> 00:21:06,600 Speaker 1: Outside of the Bunny Ranch, you spoke to a number 380 00:21:06,640 --> 00:21:11,000 Speaker 1: of poker players, and there was some kind of surprising statistic. 381 00:21:11,200 --> 00:21:14,800 Speaker 1: The one player, whose name escapes me, was notorious 382 00:21:14,840 --> 00:21:18,560 Speaker 1: for having these sort of hissy fits 383 00:21:18,920 --> 00:21:21,960 Speaker 1: when he either wasn't happy with his own play or 384 00:21:21,960 --> 00:21:25,280 Speaker 1: what have you. He said he should really only be 385 00:21:25,359 --> 00:21:28,800 Speaker 1: playing a very small percentage of hands.
And I would 386 00:21:28,800 --> 00:21:34,760 Speaker 1: never have guessed — what percentage does he play? It's twelve percent. 387 00:21:34,880 --> 00:21:37,680 Speaker 1: Twelve percent. That's about one out of eight hands, 388 00:21:37,680 --> 00:21:40,960 Speaker 1: so seven out of eight hands he's not participating in. Yeah, 389 00:21:41,000 --> 00:21:42,800 Speaker 1: and there's a lot of poker terminology, which is new 390 00:21:42,800 --> 00:21:44,560 Speaker 1: for me because I'm not a player, and they call 391 00:21:44,640 --> 00:21:48,600 Speaker 1: that being patient. I don't know if that's specific to poker, 392 00:21:48,720 --> 00:21:51,560 Speaker 1: but you would think if you're passing on seven out 393 00:21:51,560 --> 00:21:53,760 Speaker 1: of eight hands, there's a lot of patience in that, 394 00:21:53,920 --> 00:21:56,040 Speaker 1: especially for someone who struck me as so volatile and 395 00:21:56,080 --> 00:21:58,800 Speaker 1: out of control. Now, how much of that is 396 00:21:58,840 --> 00:22:00,520 Speaker 1: real and how much of that is just to get 397 00:22:00,560 --> 00:22:02,680 Speaker 1: into the heads of the other players? I think it's 398 00:22:02,680 --> 00:22:05,359 Speaker 1: a combination of both. I think that's his natural temperament, 399 00:22:05,520 --> 00:22:08,959 Speaker 1: and he's noticed it's also an asset. So in the 400 00:22:09,000 --> 00:22:11,119 Speaker 1: beginning it wasn't an asset. It was a bit of 401 00:22:11,200 --> 00:22:14,120 Speaker 1: a problem for him. How did he manage to deal 402 00:22:14,160 --> 00:22:18,000 Speaker 1: with that risk and turn it into a positive? Well, 403 00:22:18,000 --> 00:22:21,760 Speaker 1: he tells stories of being so emotionally overwhelmed by 404 00:22:21,800 --> 00:22:24,399 Speaker 1: trying to keep himself in check that he'd pass out.
But 405 00:22:24,440 --> 00:22:27,159 Speaker 1: when you really delve into it, you realize, you know, 406 00:22:27,240 --> 00:22:30,679 Speaker 1: this is the classic behavioral bias we call loss aversion, 407 00:22:31,080 --> 00:22:34,280 Speaker 1: which is: when you're losing, you know, you 408 00:22:34,320 --> 00:22:37,840 Speaker 1: hate losses so much more than you value gains that 409 00:22:37,920 --> 00:22:40,760 Speaker 1: you'll double down and take extra risk to avoid 410 00:22:40,800 --> 00:22:44,600 Speaker 1: a loss. And this shows up a 411 00:22:44,600 --> 00:22:47,040 Speaker 1: lot in poker, and it's why people play too many hands. 412 00:22:47,080 --> 00:22:49,280 Speaker 1: So what he has to do, even though he's down, 413 00:22:49,359 --> 00:22:51,119 Speaker 1: is still not play the hands. So he has to 414 00:22:51,160 --> 00:22:55,760 Speaker 1: stay rational and, like, overcome this bias. And what I found 415 00:22:55,880 --> 00:22:58,959 Speaker 1: is, behind the scenes he's doing all these things so 416 00:22:59,000 --> 00:23:01,320 Speaker 1: he feels like there's less at stake, so he can 417 00:23:01,359 --> 00:23:04,040 Speaker 1: stay more rational. He does things like getting 418 00:23:04,160 --> 00:23:07,520 Speaker 1: what's called staked, which is: someone else is an investor in 419 00:23:07,640 --> 00:23:10,680 Speaker 1: him and takes a percentage of the winnings. Exactly. So anyway, 420 00:23:10,720 --> 00:23:14,560 Speaker 1: he claims to be incredibly wealthy, um, he only puts 421 00:23:14,560 --> 00:23:16,760 Speaker 1: in ten thousand dollars of his own money in any 422 00:23:16,760 --> 00:23:20,919 Speaker 1: poker tournament. So it's effectively hedging.
423 00:23:21,520 --> 00:23:23,119 Speaker 1: And the other thing he does is he sort of 424 00:23:23,119 --> 00:23:25,919 Speaker 1: gets this weird insurance, which is: when he's 425 00:23:26,080 --> 00:23:29,080 Speaker 1: in a major game, he takes the other player 426 00:23:29,119 --> 00:23:31,879 Speaker 1: aside and they cut these little side deals, which is 427 00:23:31,920 --> 00:23:35,240 Speaker 1: apparently allowed. And so, in other words, some of these 428 00:23:35,440 --> 00:23:38,119 Speaker 1: tournaments are winner takes all, and if you come in 429 00:23:38,200 --> 00:23:41,120 Speaker 1: second you get nothing. So in one of these 430 00:23:41,119 --> 00:23:43,440 Speaker 1: tournaments he pulled someone aside and said, here's the deal. 431 00:23:44,720 --> 00:23:48,679 Speaker 1: Let's agree to split, um, a million dollars of 432 00:23:48,680 --> 00:23:52,240 Speaker 1: the purse — half a million each — and whoever wins gets the balance. So 433 00:23:52,440 --> 00:23:55,159 Speaker 1: if you lose, you take half a million, and I 434 00:23:55,200 --> 00:23:57,040 Speaker 1: think it was like two point two million or something like 435 00:23:57,080 --> 00:23:59,040 Speaker 1: that for the winner. So there's a little bit of 436 00:23:59,040 --> 00:24:01,239 Speaker 1: a hedge there as well. Yeah, so he does all 437 00:24:01,240 --> 00:24:03,480 Speaker 1: these things to take a little risk off the table, 438 00:24:03,920 --> 00:24:06,879 Speaker 1: and that keeps him rational. What about the concept of 439 00:24:06,920 --> 00:24:11,280 Speaker 1: a player who, after they lose a few hands, instead 440 00:24:11,280 --> 00:24:14,560 Speaker 1: of being patient, has a tendency to try and get 441 00:24:14,600 --> 00:24:17,840 Speaker 1: back to break even? There's something about that concept of 442 00:24:17,920 --> 00:24:20,320 Speaker 1: "I gotta get back to break even." Stock traders 443 00:24:20,359 --> 00:24:22,480 Speaker 1: do the same thing.
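The arithmetic of that side deal is worth making explicit. A back-of-the-envelope sketch — the purse size and the fifty-fifty win probability are my assumptions, chosen only so the winner's take matches the "two point two million" mentioned above:

```python
# The side deal: carve half a million out of a winner-take-all purse for
# each player; the winner keeps the balance. Purse size and the 50/50 win
# probability are assumptions for illustration.

PURSE = 2_700_000   # hypothetical winner-take-all purse
P_WIN = 0.5         # assume evenly matched players

no_deal = {"win": PURSE, "lose": 0}
deal = {"win": 500_000 + (PURSE - 1_000_000), "lose": 500_000}

def expected_value(outcome):
    return P_WIN * outcome["win"] + (1 - P_WIN) * outcome["lose"]

for name, outcome in (("no deal", no_deal), ("with deal", deal)):
    spread = outcome["win"] - outcome["lose"]
    print(f"{name}: EV {expected_value(outcome):,.0f}, win-lose spread {spread:,.0f}")
```

With these numbers the expected value is identical either way (1,350,000), but the gap between winning and losing shrinks from 2.7 million to 1.7 million: upside traded away for downside protection, which is why the deal works like a hedge.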
Hey, I know this stock is 444 00:24:22,480 --> 00:24:24,560 Speaker 1: a piece of junk, but dear Lord, if I get 445 00:24:24,560 --> 00:24:27,000 Speaker 1: back to break even, I swear I'll sell — that sort 446 00:24:27,000 --> 00:24:31,160 Speaker 1: of attitude. Poker players do the same thing? They do, um, 447 00:24:31,200 --> 00:24:33,480 Speaker 1: but the successful ones are the ones who overcome it. This is what 448 00:24:33,520 --> 00:24:35,440 Speaker 1: I was saying about how there's evidence that as you get older 449 00:24:35,440 --> 00:24:38,440 Speaker 1: and wiser, you are less prey to these behavioral issues. 450 00:24:39,040 --> 00:24:42,399 Speaker 1: And I think it's John List, the economist who 451 00:24:42,480 --> 00:24:46,040 Speaker 1: studies loss aversion and how it impacts behavior — because the 452 00:24:46,119 --> 00:24:48,720 Speaker 1: break even effect is just sort of a corollary of 453 00:24:48,840 --> 00:24:51,399 Speaker 1: loss aversion, which is: when you're down, you're so 454 00:24:51,480 --> 00:24:53,840 Speaker 1: determined to win it back, so you didn't experience a loss, 455 00:24:53,840 --> 00:24:56,119 Speaker 1: that you double down and take extra risk you would have 456 00:24:56,200 --> 00:24:58,960 Speaker 1: never taken if you were up. It seems almost the opposite 457 00:24:58,960 --> 00:25:02,639 Speaker 1: of risk aversion — you're embracing so much risk, I guess, 458 00:25:02,640 --> 00:25:05,440 Speaker 1: to try and recover from that loss. Only when you're down. 459 00:25:05,520 --> 00:25:08,000 Speaker 1: And this is the interesting, you know, sort of, um, 460 00:25:08,240 --> 00:25:10,880 Speaker 1: deviation from what economists normally think: when you're down, 461 00:25:10,960 --> 00:25:14,000 Speaker 1: you'll take that extra risk to get back. So tell 462 00:25:14,080 --> 00:25:16,240 Speaker 1: us a little bit about how we should be thinking 463 00:25:16,280 --> 00:25:20,560 Speaker 1: about risk in the stock market.
Well, actually, I mean, 464 00:25:21,480 --> 00:25:25,320 Speaker 1: my background in approaching risk is from financial economics, which 465 00:25:25,359 --> 00:25:27,399 Speaker 1: is the study of the stock market. I think in 466 00:25:27,400 --> 00:25:29,600 Speaker 1: a lot of ways, the stock market's the perfect place 467 00:25:29,600 --> 00:25:31,560 Speaker 1: to think about risk because you just have so much data, 468 00:25:32,080 --> 00:25:35,600 Speaker 1: and what financial markets are doing is just finding ways 469 00:25:35,640 --> 00:25:39,280 Speaker 1: to price and move risk around. So I think anyone 470 00:25:39,320 --> 00:25:41,600 Speaker 1: who is in the stock market is someone who's naturally 471 00:25:41,600 --> 00:25:45,120 Speaker 1: thinking about risk all the time. Huh. And we're 472 00:25:45,160 --> 00:25:49,200 Speaker 1: making more data every day, aren't we? Yeah. I think, again, 473 00:25:49,280 --> 00:25:52,040 Speaker 1: you do get people who get away from that — you know, 474 00:25:52,080 --> 00:25:54,720 Speaker 1: as I said, if they're down — like, there's all 475 00:25:54,720 --> 00:25:57,640 Speaker 1: this evidence that people won't sell losers, but they'll sell winners. 476 00:25:58,200 --> 00:26:00,520 Speaker 1: And you know that's usually not a 477 00:26:00,520 --> 00:26:02,600 Speaker 1: good idea: the stock's going down, it'll probably keep 478 00:26:02,600 --> 00:26:04,439 Speaker 1: going down, so, you know, why would you 479 00:26:04,440 --> 00:26:07,119 Speaker 1: sell your winners and keep a loser? But that's supposed 480 00:26:07,160 --> 00:26:09,440 Speaker 1: to be some version of loss aversion, people think.
481 00:26:09,560 --> 00:26:11,760 Speaker 1: So there was a study that had come out towards 482 00:26:11,800 --> 00:26:16,200 Speaker 1: the end of twenty eighteen that had looked at portfolio managers, 483 00:26:16,600 --> 00:26:18,840 Speaker 1: and rather than compare them to a benchmark, what they 484 00:26:18,880 --> 00:26:23,399 Speaker 1: did instead was say: instead of selling what the portfolio 485 00:26:23,440 --> 00:26:26,680 Speaker 1: manager sold, we're gonna randomly sell something else from their 486 00:26:26,680 --> 00:26:29,520 Speaker 1: portfolio, and then compare and see how they did. And 487 00:26:29,560 --> 00:26:34,239 Speaker 1: it turned out that random sales outperformed manager sales by 488 00:26:34,280 --> 00:26:36,920 Speaker 1: a hundred basis points over the next year. And when 489 00:26:36,920 --> 00:26:40,359 Speaker 1: they looked at what was being sold, they found two 490 00:26:40,440 --> 00:26:44,680 Speaker 1: broad categories that accounted for most of the underperformance. One 491 00:26:44,720 --> 00:26:47,200 Speaker 1: is stocks that had gone up a lot and therefore 492 00:26:47,240 --> 00:26:50,960 Speaker 1: were benefiting from momentum — managers had a tendency to sell those — 493 00:26:51,320 --> 00:26:54,719 Speaker 1: but also stocks that had collapsed a lot. Rather than 494 00:26:54,760 --> 00:26:57,119 Speaker 1: selling them when they were small losers, they waited till 495 00:26:57,119 --> 00:27:00,960 Speaker 1: they were giant losers and had effectively become deep value 496 00:27:00,960 --> 00:27:04,920 Speaker 1: plays, and then they were selling them. And those two categories 497 00:27:04,960 --> 00:27:09,200 Speaker 1: were determined to be the behavioral errors that 498 00:27:09,320 --> 00:27:14,400 Speaker 1: were driving portfolio losses. So hold that aside.
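The design of that study can be sketched in toy form. Everything below is invented for illustration — the actual study used institutional trading records and forward returns, not four made-up holdings:

```python
import random

# Toy version of the "random counterfactual sale" benchmark: when the
# manager sells one holding, compare the forward return of what they sold
# against a randomly chosen alternative holding. All numbers are made up.

random.seed(0)

# hypothetical forward one-year returns of the holdings at the moment of a sale
portfolio = {"A": 0.08, "B": -0.03, "C": 0.12, "D": 0.01}

def forgone_vs_random(sold, trials=10_000):
    """Forward return given up by selling `sold`, minus what a random
    alternative sale would have given up. Positive = worse than chance."""
    others = [ret for name, ret in portfolio.items() if name != sold]
    random_forgone = sum(random.choice(others) for _ in range(trials)) / trials
    return portfolio[sold] - random_forgone

# Selling the momentum winner C looks costly; selling the small loser B doesn't.
print(f"sell C: {forgone_vs_random('C'):+.3f}")
print(f"sell B: {forgone_vs_random('B'):+.3f}")
```

In this sketch, selling the stock that went on to return 12% forgoes about 10 percentage points more than a random sale would have — the same shape as the finding that managers' momentum-winner sales underperformed the random counterfactual.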
Let's 499 00:27:14,560 --> 00:27:17,439 Speaker 1: think of the stock market in terms of 500 00:27:17,560 --> 00:27:25,520 Speaker 1: individual investors' embrace of risk. Are most people overly invested 501 00:27:25,640 --> 00:27:28,320 Speaker 1: in the stock market and thus embracing too much risk? 502 00:27:28,840 --> 00:27:33,080 Speaker 1: Or do you perceive risk 503 00:27:33,160 --> 00:27:36,640 Speaker 1: amongst individual investors as just not taking enough of it, 504 00:27:37,040 --> 00:27:39,639 Speaker 1: especially in the early stages of a market, and not 505 00:27:39,760 --> 00:27:42,479 Speaker 1: embracing it until the very latter stages of the market? 506 00:27:43,119 --> 00:27:44,920 Speaker 1: I think it's hard to say. I mean, the right 507 00:27:45,040 --> 00:27:48,360 Speaker 1: stock allocation really depends on an individual and where they 508 00:27:48,359 --> 00:27:51,800 Speaker 1: are in their life cycle. Um, I think a lot of people don't 509 00:27:51,800 --> 00:27:54,560 Speaker 1: appreciate how risky the 510 00:27:54,600 --> 00:27:57,960 Speaker 1: stock market is. Like, my mother is nearing retirement and 511 00:27:58,000 --> 00:28:00,600 Speaker 1: she expects her portfolio to, like, double every year, and 512 00:28:00,640 --> 00:28:03,440 Speaker 1: I'm like, sure, but you're gonna have to take on 513 00:28:03,520 --> 00:28:05,359 Speaker 1: a lot of risk for that to happen. And she 514 00:28:05,400 --> 00:28:08,000 Speaker 1: doesn't seem to internalize that if you want more return, 515 00:28:08,320 --> 00:28:10,600 Speaker 1: that comes with something. And, you know, stocks are 516 00:28:10,640 --> 00:28:13,240 Speaker 1: a great investment. They're a great way, especially in an index fund, 517 00:28:13,600 --> 00:28:18,679 Speaker 1: to get, uh, risk exposure cheaply and efficiently.
But you 518 00:28:18,720 --> 00:28:22,360 Speaker 1: know, there's no guarantees. So we noticed 519 00:28:22,400 --> 00:28:26,560 Speaker 1: that when housing markets are booming, people have the same 520 00:28:26,760 --> 00:28:30,680 Speaker 1: overly optimistic expectations about how fast their home prices 521 00:28:30,680 --> 00:28:33,680 Speaker 1: are appreciating. I'm assuming your mom is not a 522 00:28:33,760 --> 00:28:37,400 Speaker 1: big Bitcoin investor. Um, why does she think her 523 00:28:37,440 --> 00:28:41,960 Speaker 1: portfolio should double every year, given long term 524 00:28:41,960 --> 00:28:45,400 Speaker 1: returns between eight and ten percent? Well, it's also, you know, 525 00:28:46,080 --> 00:28:49,440 Speaker 1: errors in how we perceive risk. I mean, I think 526 00:28:49,480 --> 00:28:53,480 Speaker 1: people often assume there's serial correlation where there is none. 527 00:28:54,520 --> 00:28:57,440 Speaker 1: So, you know, if housing prices have gone up 528 00:28:57,480 --> 00:29:00,120 Speaker 1: the last twenty years, people assume they'll keep doing that. 529 00:29:00,240 --> 00:29:03,080 Speaker 1: I think people also make that assumption around interest rates — 530 00:29:03,160 --> 00:29:05,680 Speaker 1: you know, they've done nothing but go down for 531 00:29:05,720 --> 00:29:09,280 Speaker 1: the last thirty years. So I think it's questionable if 532 00:29:09,280 --> 00:29:11,560 Speaker 1: they can keep doing that, because how negative can yields get? 533 00:29:11,800 --> 00:29:15,360 Speaker 1: So this was kind of an interesting, um, aspect of 534 00:29:15,360 --> 00:29:18,600 Speaker 1: the book that resonated with me personally, because we're always 535 00:29:18,600 --> 00:29:23,800 Speaker 1: trying to teach clients to think about portfolios in 536 00:29:23,960 --> 00:29:27,560 Speaker 1: a way that's easily understandable.
And if you say 537 00:29:27,600 --> 00:29:32,280 Speaker 1: to somebody — there's 538 00:29:32,280 --> 00:29:34,600 Speaker 1: a lot of software that can project you out to retirement — 539 00:29:35,000 --> 00:29:41,320 Speaker 1: there's a ninety five percent probability that you'll hit your retirement goals, assuming 540 00:29:41,480 --> 00:29:45,040 Speaker 1: inflation stays under four percent and you continue making your 541 00:29:45,040 --> 00:29:50,520 Speaker 1: regular contributions — I don't know what a confidence interval does 542 00:29:50,640 --> 00:29:53,560 Speaker 1: for most people. But if you were to say 543 00:29:53,600 --> 00:29:56,920 Speaker 1: to them, hey, in nineteen out of twenty situations, we 544 00:29:57,040 --> 00:29:59,920 Speaker 1: can show you you'll hit your target goals — it's only 545 00:30:00,000 --> 00:30:02,080 Speaker 1: one out of twenty that you don't make it — why 546 00:30:02,200 --> 00:30:05,280 Speaker 1: is that so much easier to understand than a percentage? Yeah. 547 00:30:05,400 --> 00:30:07,920 Speaker 1: I'm not a psychologist, but the research psychologists I talked 548 00:30:07,920 --> 00:30:09,680 Speaker 1: to said it's just something about the way our brains 549 00:30:09,680 --> 00:30:12,200 Speaker 1: are programmed: frequencies just resonate with us more. So 550 00:30:12,240 --> 00:30:18,160 Speaker 1: nineteen out of twenty is a better phrase? Yeah, and it 551 00:30:18,200 --> 00:30:20,800 Speaker 1: makes a big difference, because we are programmed as humans — we're 552 00:30:20,840 --> 00:30:23,600 Speaker 1: not complete disasters with risk. 553 00:30:23,720 --> 00:30:25,800 Speaker 1: Risk is something humans have been facing, you know, 554 00:30:25,840 --> 00:30:28,520 Speaker 1: as long as we've been on the earth. But, you know, 555 00:30:28,600 --> 00:30:32,040 Speaker 1: probabilities are a fairly modern invention.
They only really came 556 00:30:32,040 --> 00:30:33,720 Speaker 1: along in the Renaissance with a lot of sort of 557 00:30:33,760 --> 00:30:37,760 Speaker 1: brainy people — Bernoulli and a whole bunch of others. 558 00:30:37,760 --> 00:30:40,400 Speaker 1: So, I mean, it's not surprising that 559 00:30:40,440 --> 00:30:43,640 Speaker 1: we aren't just born having this natural, intuitive sense of probabilities. 560 00:30:43,680 --> 00:30:45,640 Speaker 1: I mean, we both work in this area all the time, 561 00:30:46,000 --> 00:30:47,600 Speaker 1: and they often don't mean that much to me either. 562 00:30:48,200 --> 00:30:51,600 Speaker 1: So for the average person, how best should we 563 00:30:51,800 --> 00:30:57,080 Speaker 1: quantify risk? Well, you want to, um — I say convert 564 00:30:57,160 --> 00:30:59,480 Speaker 1: it in your thinking. When I defined risk to you, 565 00:30:59,560 --> 00:31:00,800 Speaker 1: I said it's all the things that 566 00:31:00,800 --> 00:31:02,760 Speaker 1: can happen and how probable they are — that is 567 00:31:02,760 --> 00:31:07,000 Speaker 1: a probability distribution, and whole distributions of things aren't 568 00:31:07,000 --> 00:31:10,520 Speaker 1: intuitive to us. Um, so, I mean, part of it 569 00:31:10,520 --> 00:31:13,120 Speaker 1: really is sort of getting more comfortable with those concepts, 570 00:31:13,120 --> 00:31:15,400 Speaker 1: but it's also, when you think of probabilities, converting them 571 00:31:15,440 --> 00:31:20,320 Speaker 1: to frequencies. The book was really interesting. It reads really well. 572 00:31:20,480 --> 00:31:23,600 Speaker 1: It's sort of like — as I started reading it, 573 00:31:23,680 --> 00:31:29,320 Speaker 1: I immediately, um, thought of Against the Gods, because it's 574 00:31:29,360 --> 00:31:34,200 Speaker 1: also so risk focused, but from a historical perspective.
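The "convert probabilities to frequencies" advice is mechanical enough to write down. A small sketch — the cap of 100 on the denominator is my own choice, not anything from the conversation:

```python
from fractions import Fraction

# Restate a probability as a "k out of n" frequency -- the framing the
# research psychologists say our brains handle better than percentages.

def as_frequency(prob, max_denominator=100):
    frac = Fraction(prob).limit_denominator(max_denominator)
    return f"{frac.numerator} out of {frac.denominator}"

print(as_frequency(0.95))  # the "nineteen out of twenty" phrasing
print(as_frequency(0.12))  # the poker player's twelve percent of hands
```

So a 95% confidence level becomes "19 out of 20", and the poker player's 12% of hands becomes "3 out of 25" (or, rounding coarser, roughly one hand in eight).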
This 575 00:31:34,320 --> 00:31:38,800 Speaker 1: is really a twenty first century perspective on risk. I 576 00:31:38,880 --> 00:31:41,280 Speaker 1: love Against the Gods. It was my favorite book, and 577 00:31:42,120 --> 00:31:45,480 Speaker 1: I was actually partially inspired by it, because as much 578 00:31:45,520 --> 00:31:47,680 Speaker 1: as I love it, I wouldn't say everyone wants to 579 00:31:47,680 --> 00:31:49,720 Speaker 1: read it. I mean — oh no, everybody should read 580 00:31:49,760 --> 00:31:51,840 Speaker 1: that book. It's amazing. Everyone should, but not a lot 581 00:31:51,880 --> 00:31:56,440 Speaker 1: of people will. I mean, it is dense. Um, he 582 00:31:56,920 --> 00:32:01,880 Speaker 1: is a very detailed author. Every page is filled 583 00:32:01,960 --> 00:32:05,080 Speaker 1: with lots and lots of stuff, which is why, P.S., 584 00:32:05,160 --> 00:32:07,400 Speaker 1: if you go back and reread it twenty years later, 585 00:32:07,960 --> 00:32:11,520 Speaker 1: it's still fresh. And, I mean, that is a master work. 586 00:32:11,600 --> 00:32:14,200 Speaker 1: It's beautiful. I mean, I love everything about that book, 587 00:32:14,200 --> 00:32:17,520 Speaker 1: and I also love Capital Ideas. I like everything 588 00:32:17,560 --> 00:32:21,600 Speaker 1: he's done. But, you know, 589 00:32:22,320 --> 00:32:23,880 Speaker 1: your mom is probably not going to read it. 590 00:32:24,040 --> 00:32:26,840 Speaker 1: And I felt like everyone needs to know what's in 591 00:32:26,880 --> 00:32:29,440 Speaker 1: Against the Gods. And that's partially what inspired this book: 592 00:32:29,480 --> 00:32:33,600 Speaker 1: I wanted to take those ideas that were so resonant, 593 00:32:33,640 --> 00:32:36,479 Speaker 1: and that I felt everyone needed to understand, and make them 594 00:32:36,520 --> 00:32:40,880 Speaker 1: accessible to an even broader audience.
So obviously the whole Bunny 595 00:32:41,000 --> 00:32:44,600 Speaker 1: Ranch brothel section is hilarious and must have been a 596 00:32:44,640 --> 00:32:48,640 Speaker 1: ton of fun to do. Um, what else did you 597 00:32:48,760 --> 00:32:52,200 Speaker 1: do in your research that was kind of fun and surprising? 598 00:32:52,520 --> 00:32:54,440 Speaker 1: It was all fun and surprising. 599 00:32:54,440 --> 00:32:55,840 Speaker 1: I mean, I was afraid to write a book 600 00:32:55,840 --> 00:32:57,640 Speaker 1: for a long time because my dissertation was such a 601 00:32:57,720 --> 00:33:00,600 Speaker 1: horrible slog. See, I think of my book as 602 00:33:00,640 --> 00:33:03,840 Speaker 1: my PhD dissertation, and it was a horrible slog. I 603 00:33:03,840 --> 00:33:05,520 Speaker 1: think maybe that first big research 604 00:33:05,560 --> 00:33:08,560 Speaker 1: project just has to be horrible. Uh, so I was like, 605 00:33:08,560 --> 00:33:10,120 Speaker 1: if I'm gonna do this, I'm gonna have a lot 606 00:33:10,160 --> 00:33:12,040 Speaker 1: of fun — which was the other way I wanted to 607 00:33:12,040 --> 00:33:15,160 Speaker 1: approach the book. So I just was like, I'm gonna 608 00:33:15,160 --> 00:33:17,520 Speaker 1: have all these adventures. It seemed an excuse to go places. 609 00:33:17,960 --> 00:33:20,680 Speaker 1: I went to a risk conference for big wave surfers 610 00:33:20,680 --> 00:33:23,760 Speaker 1: in Hawaii. Right — that was a whole fascinating segment: 611 00:33:24,080 --> 00:33:27,920 Speaker 1: the guys who invented jet skiing their way onto eighty 612 00:33:27,920 --> 00:33:30,840 Speaker 1: foot waves. Before, you couldn't even get onto a 613 00:33:30,840 --> 00:33:33,960 Speaker 1: fifty foot wave — you just weren't able to get out there 614 00:33:34,000 --> 00:33:36,760 Speaker 1: fast enough. Um, what was that like?
You spoke to 615 00:33:36,840 --> 00:33:40,680 Speaker 1: some really interesting big surf names. Yeah, and you know, 616 00:33:40,760 --> 00:33:43,920 Speaker 1: they have this regular conference where they talk about risk, 617 00:33:43,960 --> 00:33:46,040 Speaker 1: and it's like going to a pension risk conference, although 618 00:33:46,080 --> 00:33:50,800 Speaker 1: everyone's better looking. Yeah — like, I was in the 619 00:33:50,840 --> 00:33:54,200 Speaker 1: worst shape in the room. Um, but, you know, 620 00:33:54,280 --> 00:33:56,080 Speaker 1: it is a conference where they're very 621 00:33:56,080 --> 00:34:00,960 Speaker 1: thoughtful about risk, debating, uh, you know, regulation 622 00:34:01,000 --> 00:34:03,880 Speaker 1: and who bears the responsibility of risk when your behavior 623 00:34:04,280 --> 00:34:07,680 Speaker 1: impacts other people. And, you know, it was as intellectual 624 00:34:07,680 --> 00:34:11,400 Speaker 1: a discussion as I've seen anywhere. And the guy who I profile, 625 00:34:11,920 --> 00:34:14,840 Speaker 1: Brian Keaulana, is the one who brought jet skis 626 00:34:14,840 --> 00:34:16,840 Speaker 1: to big wave surfing. And much like a lot of 627 00:34:16,840 --> 00:34:20,000 Speaker 1: financial derivatives, that was initially supposed to be insurance, but, 628 00:34:20,040 --> 00:34:22,719 Speaker 1: as you mentioned, you can also use them to lever 629 00:34:22,920 --> 00:34:26,440 Speaker 1: up and take on more risk and take even bigger waves — 630 00:34:26,440 --> 00:34:28,960 Speaker 1: because anything that can reduce risk can also be flipped 631 00:34:28,960 --> 00:34:32,160 Speaker 1: around to exacerbate risks.
We see a plateauing and 632 00:34:32,239 --> 00:34:37,239 Speaker 1: even an increase in the annual automobile fatality rates, and 633 00:34:37,360 --> 00:34:41,200 Speaker 1: the discussion in your book, and elsewhere, has been: well, 634 00:34:41,239 --> 00:34:45,560 Speaker 1: how much of the confidence that people feel about air 635 00:34:45,600 --> 00:34:49,200 Speaker 1: bags and crumple zones and anti-lock brakes is 636 00:34:49,280 --> 00:34:52,160 Speaker 1: leading them to drive faster and engage in more dangerous 637 00:34:52,200 --> 00:34:56,040 Speaker 1: behavior than they would without these safety provisions? What are your thoughts 638 00:34:56,040 --> 00:34:58,319 Speaker 1: on that? Yeah, I mean, I felt that was 639 00:34:58,360 --> 00:35:00,759 Speaker 1: an important thing to include in any book on risk 640 00:35:00,880 --> 00:35:03,279 Speaker 1: that tells you how there are all these tools that 641 00:35:03,280 --> 00:35:04,960 Speaker 1: can reduce your risk, because then you have to be 642 00:35:05,040 --> 00:35:08,439 Speaker 1: mindful of not feeling so safe that you can then 643 00:35:08,480 --> 00:35:11,600 Speaker 1: go and take whatever risks you want, because nothing makes 644 00:35:11,640 --> 00:35:15,000 Speaker 1: the world truly risk free. You know, risk 645 00:35:15,120 --> 00:35:18,680 Speaker 1: is always still an estimate of something that's immeasurable. So, 646 00:35:19,080 --> 00:35:21,759 Speaker 1: you know, you're basing your risk strategy on 647 00:35:21,880 --> 00:35:24,800 Speaker 1: an estimate. It's better than doing nothing — just because 648 00:35:24,880 --> 00:35:26,719 Speaker 1: it's not perfect doesn't mean you shouldn't do it. But 649 00:35:26,760 --> 00:35:30,239 Speaker 1: you also have to be aware of the limitations.
So 650 00:35:30,360 --> 00:35:33,520 Speaker 1: even if you have every safety device in the world 651 00:35:34,000 --> 00:35:36,080 Speaker 1: and you're going to surf an eighty foot wave, it's 652 00:35:36,080 --> 00:35:38,000 Speaker 1: still not going to be safe. There's nothing to make 653 00:35:38,000 --> 00:35:40,920 Speaker 1: that risk free. And you specifically refer to a number 654 00:35:41,040 --> 00:35:45,880 Speaker 1: of big name surfers who died even after, or perhaps 655 00:35:45,960 --> 00:35:50,680 Speaker 1: because of, some of these safety innovations being brought in. Yeah, 656 00:35:50,719 --> 00:35:53,400 Speaker 1: and when I went to the Big Wave Surf Risk Conference, 657 00:35:53,760 --> 00:35:58,160 Speaker 1: you know, every innovation resurfaces this issue. Initially it 658 00:35:58,200 --> 00:36:01,160 Speaker 1: was leashes, which were hated when they first came 659 00:36:01,200 --> 00:36:03,600 Speaker 1: out. Totally, because I don't know anything about surfing and 660 00:36:03,600 --> 00:36:06,280 Speaker 1: I've never done it. But apparently before there were leashes, 661 00:36:06,360 --> 00:36:08,080 Speaker 1: if you wiped out, you lost your board and you 662 00:36:08,120 --> 00:36:09,760 Speaker 1: had to swim to shore, and that could be maybe 663 00:36:09,760 --> 00:36:13,120 Speaker 1: twenty miles. So back then they were like amazing swimmers. 664 00:36:13,120 --> 00:36:15,319 Speaker 1: Now you can be a pretty mediocre swimmer and still surf. Right, 665 00:36:15,440 --> 00:36:18,480 Speaker 1: we're sending people out who really, if they get into trouble, 666 00:36:18,840 --> 00:36:21,120 Speaker 1: can't swim a half mile back to shore. Yeah.
And 667 00:36:21,120 --> 00:36:23,000 Speaker 1: then it was worse with jet skis, because now you 668 00:36:23,000 --> 00:36:24,960 Speaker 1: can have a jet ski, you know, push you onto not 669 00:36:25,000 --> 00:36:26,560 Speaker 1: only an eighty foot wave, but if you only 670 00:36:26,600 --> 00:36:28,279 Speaker 1: belong on a five foot wave, now you can 671 00:36:28,280 --> 00:36:30,759 Speaker 1: go on a twenty foot wave, and you pose risk 672 00:36:30,760 --> 00:36:32,279 Speaker 1: to other people when that happens, if you need to 673 00:36:32,320 --> 00:36:35,160 Speaker 1: be rescued. And now the big debate is on something 674 00:36:35,200 --> 00:36:38,480 Speaker 1: called inflatable vests, right, for when you go under, and 675 00:36:38,880 --> 00:36:42,080 Speaker 1: they don't always inflate, but it gives people a sense 676 00:36:42,120 --> 00:36:44,399 Speaker 1: that, all right, I'm okay, I could do anything now. Yeah, 677 00:36:44,440 --> 00:36:47,520 Speaker 1: and so, because they're new and just starting to 678 00:36:47,520 --> 00:36:50,080 Speaker 1: be sold. I think I mentioned Greg Long, this famous 679 00:36:50,080 --> 00:36:52,880 Speaker 1: surfer whose inflatable vest didn't open; that was, I 680 00:36:52,920 --> 00:36:55,480 Speaker 1: think, a sort of early version. But 681 00:36:55,560 --> 00:36:59,080 Speaker 1: now they're being widely sold, and this is really tearing 682 00:36:59,160 --> 00:37:01,640 Speaker 1: up the surf community over who should be allowed to 683 00:37:01,640 --> 00:37:04,239 Speaker 1: buy these. Is it irresponsible to allow anyone to have 684 00:37:04,280 --> 00:37:06,960 Speaker 1: these vests? So, in other words, we don't want to 685 00:37:06,960 --> 00:37:09,759 Speaker 1: give people a false sense of confidence.
Send an 686 00:37:09,760 --> 00:37:13,959 Speaker 1: amateur with no skills and minimal swimming ability out 687 00:37:14,000 --> 00:37:16,440 Speaker 1: into a dangerous area, and they feel, because they have 688 00:37:16,760 --> 00:37:21,640 Speaker 1: this inflatable vest, that they can surf with the big boys, 689 00:37:21,640 --> 00:37:23,279 Speaker 1: so to speak. But then the flip side of that 690 00:37:23,400 --> 00:37:25,880 Speaker 1: is, this is a potentially life saving piece of technology. 691 00:37:25,960 --> 00:37:28,200 Speaker 1: Are you going to deny people the chance to buy it? I mean, 692 00:37:28,320 --> 00:37:32,160 Speaker 1: there aren't easy answers to this, which is why 693 00:37:32,239 --> 00:37:33,920 Speaker 1: I think there's just such debate. But you know, if 694 00:37:33,960 --> 00:37:36,480 Speaker 1: you go to a finance conference and you debate systemic risk, 695 00:37:36,760 --> 00:37:40,080 Speaker 1: there aren't easy answers there either. What else do 696 00:37:40,120 --> 00:37:43,640 Speaker 1: you recall from your research that was, um, if not 697 00:37:43,719 --> 00:37:48,320 Speaker 1: quite as buff, um, really interesting and surprising? Because you 698 00:37:48,440 --> 00:37:50,319 Speaker 1: cover a lot of ground in the book. I've got 699 00:37:50,400 --> 00:37:53,919 Speaker 1: horse breeding, um, that was kind of fascinating also. Yeah, 700 00:37:54,000 --> 00:37:56,400 Speaker 1: again, I kind of went after things I 701 00:37:56,440 --> 00:37:59,880 Speaker 1: didn't really have much knowledge of before. Um, and I'd 702 00:38:00,120 --> 00:38:01,960 Speaker 1: never been like a horsey person. I wasn't like one of 703 00:38:01,960 --> 00:38:06,520 Speaker 1: these little girls who rode horses.
Uh, I was not. So 704 00:38:07,760 --> 00:38:12,480 Speaker 1: I didn't realize that the tax reform changed the 705 00:38:12,520 --> 00:38:15,480 Speaker 1: whole dynamics of horse breeding and the economics of horse breeding, 706 00:38:15,760 --> 00:38:18,359 Speaker 1: and all sorts of other tax shelters. These things 707 00:38:18,440 --> 00:38:21,120 Speaker 1: had been devised as a way to hide money from 708 00:38:21,200 --> 00:38:23,759 Speaker 1: Uncle Sam, and suddenly now they have to stand on 709 00:38:23,800 --> 00:38:27,560 Speaker 1: their own feet. They can't just be a faux investment. Exactly. 710 00:38:27,640 --> 00:38:29,839 Speaker 1: So the fact that there's less return to long term 711 00:38:29,840 --> 00:38:32,319 Speaker 1: capital gains, and they want to realize the returns from 712 00:38:32,320 --> 00:38:35,280 Speaker 1: their risk earlier, means now that everyone sells a horse 713 00:38:35,360 --> 00:38:37,080 Speaker 1: when it's one year old, when they don't have 714 00:38:37,120 --> 00:38:41,279 Speaker 1: complete information about how good a racer it's going to be. 715 00:38:41,640 --> 00:38:43,799 Speaker 1: The only information you have is who its parents are. 716 00:38:44,280 --> 00:38:47,000 Speaker 1: So this has led to this increase in inbreeding. So 717 00:38:47,080 --> 00:38:49,960 Speaker 1: we have all these horses that are sprinters, not long 718 00:38:50,000 --> 00:38:52,960 Speaker 1: distance runners, because that's the first thing that will show 719 00:38:53,000 --> 00:38:54,800 Speaker 1: and that will help sell a one year old horse. 720 00:38:55,080 --> 00:38:58,000 Speaker 1: It seems like the incentives are kind of weird and 721 00:38:58,080 --> 00:39:01,759 Speaker 1: not properly aligned. No, because what you really want is 722 00:39:01,800 --> 00:39:03,680 Speaker 1: you want a horse... Well, actually the real money in 723 00:39:03,760 --> 00:39:07,640 Speaker 1: horses is not winning races.
It's from the breeding, the 724 00:39:07,680 --> 00:39:10,520 Speaker 1: stud fees. But you know, a good race career is 725 00:39:10,560 --> 00:39:13,680 Speaker 1: necessary for that. But none of those things are necessarily 726 00:39:13,680 --> 00:39:16,760 Speaker 1: correlated with, and there have been studies on this, the prices 727 00:39:16,800 --> 00:39:19,239 Speaker 1: that you sell for as a yearling. So 728 00:39:19,400 --> 00:39:21,880 Speaker 1: as I was reading through the whole horse section, and 729 00:39:21,920 --> 00:39:24,640 Speaker 1: I rode in college, so I was kind of fascinated 730 00:39:24,640 --> 00:39:27,040 Speaker 1: by that. And I'm not a big fan of horse 731 00:39:27,200 --> 00:39:32,239 Speaker 1: racing and betting, but I enjoy riding. Um, I 732 00:39:32,280 --> 00:39:35,959 Speaker 1: was reminded of a book I read a hundred years 733 00:39:35,960 --> 00:39:42,560 Speaker 1: ago by William Goldman, who was the screenwriter for Butch Cassidy 734 00:39:42,600 --> 00:39:46,279 Speaker 1: and the Sundance Kid, The Princess Bride, and Marathon Man. His 735 00:39:46,440 --> 00:39:51,800 Speaker 1: screenplays are insane. And his book was called Adventures in 736 00:39:51,840 --> 00:39:54,480 Speaker 1: the Screen Trade. But a quote of his that I've 737 00:39:54,560 --> 00:39:59,360 Speaker 1: used repeatedly when writing about markets is, nobody knows anything. 738 00:40:00,040 --> 00:40:04,160 Speaker 1: And he refers to all the studios that passed on 739 00:40:04,200 --> 00:40:07,080 Speaker 1: Star Wars, all the studios that wanted nothing to do 740 00:40:07,120 --> 00:40:09,840 Speaker 1: with Raiders of the Lost Ark, and he uses example 741 00:40:09,920 --> 00:40:14,520 Speaker 1: after example after example of these people who are supposed 742 00:40:14,520 --> 00:40:17,800 Speaker 1: to be experts in the film industry and they're throwing darts.
743 00:40:17,880 --> 00:40:23,520 Speaker 1: And I'm reading your write-up about the various horses 744 00:40:23,560 --> 00:40:26,839 Speaker 1: that later went on to storied careers, that were picked 745 00:40:26,920 --> 00:40:30,520 Speaker 1: up for pennies because nobody recognized their value. And all 746 00:40:30,560 --> 00:40:36,920 Speaker 1: of these very storied studs mated to, uh, these mares 747 00:40:36,920 --> 00:40:40,560 Speaker 1: who were great runners, and the offspring don't 748 00:40:40,560 --> 00:40:43,160 Speaker 1: win anything. So is it the same sort 749 00:40:43,200 --> 00:40:46,960 Speaker 1: of situation when it comes to horse breeding, nobody knows anything? Well, 750 00:40:47,280 --> 00:40:49,840 Speaker 1: I think it's that the incentives are kind of off. 751 00:40:50,600 --> 00:40:53,680 Speaker 1: So you know, when people breed a horse, 752 00:40:53,719 --> 00:40:55,719 Speaker 1: they're looking for something that's gonna sell after one year, 753 00:40:55,760 --> 00:40:57,320 Speaker 1: which is very different from a horse that's going to 754 00:40:57,360 --> 00:41:00,759 Speaker 1: win a Kentucky Derby. You would think that there's so 755 00:41:00,880 --> 00:41:04,200 Speaker 1: much money in winning these big races that at least 756 00:41:04,360 --> 00:41:08,480 Speaker 1: some subsection of the breeding community would say, hey, if 757 00:41:08,520 --> 00:41:10,040 Speaker 1: you want to buy a horse to sell in a year, 758 00:41:10,080 --> 00:41:14,120 Speaker 1: don't come to us. We're trying to breed Triple Crown 759 00:41:14,280 --> 00:41:18,400 Speaker 1: competitive racers. How come that hasn't happened? Well, to some degree 760 00:41:18,480 --> 00:41:21,719 Speaker 1: you're right that it is just impossible to know.
I mean, 761 00:41:21,800 --> 00:41:25,399 Speaker 1: you're getting more information now with technology, because you're able 762 00:41:25,400 --> 00:41:28,600 Speaker 1: to do the genetic profiling of the horses, which gives 763 00:41:28,640 --> 00:41:30,960 Speaker 1: you some information. Like you can tell, you know, I 764 00:41:31,000 --> 00:41:32,840 Speaker 1: guess the horses that are best suited for, like, the 765 00:41:32,920 --> 00:41:36,120 Speaker 1: Kentucky Derby are sort of these hybrid half sprinter, half 766 00:41:36,120 --> 00:41:40,640 Speaker 1: distance runners, and you can test for that. You also reference, um, 767 00:41:40,680 --> 00:41:42,840 Speaker 1: and there's a really fascinating story, I'm trying to remember 768 00:41:42,840 --> 00:41:45,080 Speaker 1: where I saw it, about the size of one of 769 00:41:45,120 --> 00:41:48,080 Speaker 1: the ventricles in the horse's heart. You reference, they're 770 00:41:48,120 --> 00:41:52,319 Speaker 1: looking for horses with not the metaphorical heart of that horse, 771 00:41:52,719 --> 00:41:57,120 Speaker 1: but genuinely larger cardiac pumps. Yeah, although you have to 772 00:41:57,160 --> 00:42:00,719 Speaker 1: be careful, because everything about the horse, you know, has to 773 00:42:00,760 --> 00:42:03,640 Speaker 1: be like a racer, so this freakish combination of 774 00:42:03,719 --> 00:42:06,640 Speaker 1: factors has to come together. It's like being a Nobel 775 00:42:06,680 --> 00:42:08,879 Speaker 1: Prize winning supermodel. It's like you just need all these 776 00:42:09,040 --> 00:42:13,640 Speaker 1: genetic components in perfect alignment. Um, that's the, I'm 777 00:42:13,640 --> 00:42:17,560 Speaker 1: sure it's an urban myth, but the Einstein and Marilyn Monroe 778 00:42:17,800 --> 00:42:20,080 Speaker 1: back and forth. You may not get the qualities you 779 00:42:20,120 --> 00:42:24,080 Speaker 1: want from each of the parents of the horse.
Yeah, 780 00:42:24,120 --> 00:42:27,000 Speaker 1: you may get the worst qualities of both. So you know, 781 00:42:27,000 --> 00:42:28,719 Speaker 1: as one guy explained to me, 782 00:42:28,760 --> 00:42:31,520 Speaker 1: it's like getting that perfect heart in the wrong, 783 00:42:32,160 --> 00:42:34,800 Speaker 1: um, horse. It could be like having, as he mentioned, a Ferrari engine 784 00:42:34,840 --> 00:42:39,000 Speaker 1: in a Subaru. So it's really hard to predict. 785 00:42:39,120 --> 00:42:41,440 Speaker 1: So this is why that chapter is about modern 786 00:42:41,480 --> 00:42:44,240 Speaker 1: portfolio theory. By the way, that is the Subaru 787 00:42:44,480 --> 00:42:47,040 Speaker 1: WRX. They've done that and it's 788 00:42:47,040 --> 00:42:50,719 Speaker 1: been a very successful car. I don't know anything about cars. 789 00:42:50,480 --> 00:42:53,680 Speaker 1: So, but that was really a fascinating discussion, 790 00:42:53,680 --> 00:42:57,160 Speaker 1: and I really enjoyed that. Last question about the book: 791 00:42:57,719 --> 00:43:01,120 Speaker 1: anything else sort of stood out, like, wow, that 792 00:43:01,239 --> 00:43:04,040 Speaker 1: was really weird and unusual, that I was not expecting? 793 00:43:05,520 --> 00:43:08,120 Speaker 1: You know, every chapter sort of, I guess, 794 00:43:08,160 --> 00:43:11,040 Speaker 1: had those moments. I mean, there were certainly, in the 795 00:43:11,080 --> 00:43:14,480 Speaker 1: time I spent following around the paparazzi, also 796 00:43:14,560 --> 00:43:17,800 Speaker 1: a fascinating story, moments of, this is weird.
You know, I was 797 00:43:17,920 --> 00:43:19,920 Speaker 1: telling someone, you know, as I'm like crouching 798 00:43:19,920 --> 00:43:22,360 Speaker 1: behind a garbage can waiting for Alec Baldwin, I'm like, 799 00:43:22,440 --> 00:43:23,880 Speaker 1: this is just not what I expected when I went 800 00:43:23,920 --> 00:43:26,279 Speaker 1: to grad school, you know, to end up here 801 00:43:26,280 --> 00:43:28,520 Speaker 1: of all places. But you know, again, they also have 802 00:43:28,600 --> 00:43:31,480 Speaker 1: a fascinating risk story that's going on behind the scenes. 803 00:43:31,520 --> 00:43:33,839 Speaker 1: You would never know, because, as you can imagine, they 804 00:43:33,880 --> 00:43:37,480 Speaker 1: face just crazy amounts of idiosyncratic risk. The risk 805 00:43:37,600 --> 00:43:39,319 Speaker 1: that you're not going to get that one money shot on 806 00:43:39,320 --> 00:43:43,040 Speaker 1: any one day is so large. They have to form 807 00:43:43,080 --> 00:43:47,160 Speaker 1: these complex alliances to share tips and sometimes royalties. 808 00:43:47,239 --> 00:43:49,879 Speaker 1: Essentially it's pooling, so you're getting rid of your idiosyncratic risk. 809 00:43:50,320 --> 00:43:52,600 Speaker 1: But because all the money in celebrity photography is from 810 00:43:52,640 --> 00:43:55,000 Speaker 1: getting an exclusive, they also have this incentive to always 811 00:43:55,040 --> 00:43:59,000 Speaker 1: cheat, too. These alliances are inherently very unstable, so they're 812 00:43:59,040 --> 00:44:02,359 Speaker 1: always re-forming and breaking up these alliances, so they 813 00:44:02,400 --> 00:44:04,520 Speaker 1: all hate each other.
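The pooling she describes can be sketched as a toy simulation. This is purely illustrative: the hit probability, payout, and pool size below are made-up numbers, not figures from the book or the interview.

```python
import random
from statistics import mean, pvariance

def simulate(days=10000, pool_size=5, hit_prob=0.05, payout=10_000.0, seed=1):
    """Toy model of paparazzi income: each photographer lands the
    'money shot' with probability hit_prob on a given day. Compare
    going solo with an alliance that splits the pool's take equally."""
    rng = random.Random(seed)
    solo, pooled = [], []
    for _ in range(days):
        shots = [payout if rng.random() < hit_prob else 0.0
                 for _ in range(pool_size)]
        solo.append(shots[0])                  # keep only your own shot
        pooled.append(sum(shots) / pool_size)  # split the alliance's take
    return mean(solo), pvariance(solo), mean(pooled), pvariance(pooled)

m_solo, v_solo, m_pool, v_pool = simulate()
# Expected income is the same either way, but pooling shrinks the
# day-to-day variance by roughly the pool size: the idiosyncratic
# risk diversifies away, while the incentive to cheat on an
# exclusive remains.
```

The same arithmetic is why her point about instability bites: pooling only cuts variance, it does not raise the mean, so an exclusive that pays the full amount to one member always tempts defection.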
So this is something: you see 814 00:44:04,520 --> 00:44:07,000 Speaker 1: the paparazzi on the street and you're like, there's 815 00:44:07,000 --> 00:44:09,400 Speaker 1: a much more interesting story that's going on with this 816 00:44:09,400 --> 00:44:12,200 Speaker 1: lineup of paparazzi than you would ever know. So 817 00:44:12,360 --> 00:44:15,799 Speaker 1: the paparazzi are more interesting than catching a celebrity taking 818 00:44:15,800 --> 00:44:18,040 Speaker 1: their kid out for ice cream. Yeah, they'd always be surprised. 819 00:44:18,080 --> 00:44:19,919 Speaker 1: They'd have me wait for these celebrities for like six 820 00:44:20,000 --> 00:44:22,080 Speaker 1: or eight hours, and after maybe an hour or two 821 00:44:22,120 --> 00:44:23,719 Speaker 1: I'd just get bored, and I'm like, I've got a 822 00:44:23,719 --> 00:44:26,480 Speaker 1: good story here, I'm gonna go home. And they'd be shocked. 823 00:44:26,480 --> 00:44:28,480 Speaker 1: They're like, but she has an [inaudible]. I'm like, well, 824 00:44:28,520 --> 00:44:32,799 Speaker 1: I'm here for this supermodel who was friends with somebody else, 825 00:44:32,840 --> 00:44:38,800 Speaker 1: and that's how suddenly she blew up on Instagram. Yeah. Well, 826 00:44:38,840 --> 00:44:41,160 Speaker 1: the celebrities who do well with the paparazzi also play 827 00:44:41,239 --> 00:44:45,040 Speaker 1: the game with them. Huh. Quite interesting. You also 828 00:44:45,120 --> 00:44:50,360 Speaker 1: have been writing regularly on Vox for some time, and 829 00:44:50,360 --> 00:44:56,320 Speaker 1: some of the columns you've done sort of tangentially 830 00:44:56,320 --> 00:45:02,240 Speaker 1: involved risk in surprising ways. So everybody today is focused 831 00:45:02,280 --> 00:45:06,920 Speaker 1: on the Amazon HQ2 disaster that blew up earlier 832 00:45:06,960 --> 00:45:10,319 Speaker 1: this year.
Um, but you looked at it from 833 00:45:10,320 --> 00:45:14,880 Speaker 1: the context of the US has a talent problem, and 834 00:45:14,960 --> 00:45:19,680 Speaker 1: that presents a risk to corporations. Explain that. Well, so 835 00:45:20,040 --> 00:45:22,680 Speaker 1: this is the Richard Florida argument. 836 00:45:23,200 --> 00:45:25,440 Speaker 1: You know, you'd think anyone would want to work 837 00:45:25,440 --> 00:45:29,480 Speaker 1: for Amazon. Still, the competition for sort 838 00:45:29,520 --> 00:45:32,000 Speaker 1: of really good talent is very stiff, and it 839 00:45:32,120 --> 00:45:36,239 Speaker 1: does occur globally. And we're talking about engineers and programmers, 840 00:45:36,280 --> 00:45:40,520 Speaker 1: not necessarily the serfs that they have enslaved in 841 00:45:40,520 --> 00:45:43,520 Speaker 1: their warehouses. Yeah, the high skill tech workers. 842 00:45:43,560 --> 00:45:45,960 Speaker 1: I mean, they're kings of the labor market, and they 843 00:45:46,000 --> 00:45:47,520 Speaker 1: compete a lot for them, which is one of the 844 00:45:47,560 --> 00:45:50,279 Speaker 1: reasons why Amazon probably wanted to come to New York. 845 00:45:50,320 --> 00:45:53,239 Speaker 1: It wasn't just the tax incentives. It was that you 846 00:45:53,280 --> 00:45:55,359 Speaker 1: could get talent who wanted to move here. It's really 847 00:45:55,440 --> 00:45:58,239 Speaker 1: hard to get a cluster of talented young people to 848 00:45:58,440 --> 00:46:02,080 Speaker 1: want to move to, uh, the middle of the country, 849 00:46:02,160 --> 00:46:04,920 Speaker 1: and that's why Brooklyn is so hot these days. Yeah. Well, 850 00:46:04,960 --> 00:46:07,280 Speaker 1: and you can see why, because if you are talented, 851 00:46:07,320 --> 00:46:10,360 Speaker 1: I mean, human capital, you know, is something you have 852 00:46:10,400 --> 00:46:12,200 Speaker 1: to work towards your whole career.
You don't just go 853 00:46:12,239 --> 00:46:14,120 Speaker 1: to Harvard and then you're just set for life. You 854 00:46:14,160 --> 00:46:18,120 Speaker 1: have to manage a network. You have to keep your 855 00:46:18,160 --> 00:46:20,600 Speaker 1: skills sharp, and that's why you want to be around, 856 00:46:20,719 --> 00:46:22,960 Speaker 1: and not just be limited to your own company, be 857 00:46:23,040 --> 00:46:26,680 Speaker 1: around other people and intelligent companies. That way you keep your 858 00:46:26,680 --> 00:46:29,239 Speaker 1: skills fresh. You have the option of changing jobs; that's 859 00:46:29,239 --> 00:46:32,320 Speaker 1: always a very valuable option. I mean, if Amazon 860 00:46:32,440 --> 00:46:35,040 Speaker 1: moved to a place where there were no other good jobs, 861 00:46:35,040 --> 00:46:36,560 Speaker 1: you're kind of stuck at Amazon and you give up 862 00:46:36,600 --> 00:46:39,040 Speaker 1: the option of job switching, and you have to compensate 863 00:46:39,080 --> 00:46:42,319 Speaker 1: people for that. Company towns are problematic for that reason. Yeah, 864 00:46:42,360 --> 00:46:45,480 Speaker 1: I mean, it worked before, when you had people who 865 00:46:45,480 --> 00:46:48,520 Speaker 1: were more middle skill, and also back before, when 866 00:46:48,560 --> 00:46:51,960 Speaker 1: technology was such that the skills you would 867 00:46:52,040 --> 00:46:54,560 Speaker 1: learn would be very idiosyncratic to the company you worked for; 868 00:46:54,880 --> 00:46:57,640 Speaker 1: now they're very transferable. Exactly. So if you want 869 00:46:57,680 --> 00:47:02,480 Speaker 1: to remain competitive in the labor market, you 870 00:47:02,560 --> 00:47:04,520 Speaker 1: have to be part of sort of these clusters of 871 00:47:04,520 --> 00:47:07,080 Speaker 1: people that allow you to move around.
And Google 872 00:47:07,120 --> 00:47:09,960 Speaker 1: announced they're doubling their New York workforce from seven thousand 873 00:47:10,000 --> 00:47:14,000 Speaker 1: to fourteen thousand. Apple has dramatically expanded its presence. So 874 00:47:14,040 --> 00:47:16,479 Speaker 1: if you're an Amazon worker, theoretically, in New York, 875 00:47:17,160 --> 00:47:20,560 Speaker 1: they're competing for your skills with some of the biggest 876 00:47:20,640 --> 00:47:23,359 Speaker 1: companies in the world. Yeah. And you know, I 877 00:47:23,360 --> 00:47:25,719 Speaker 1: used to work for a company that was 878 00:47:25,880 --> 00:47:28,080 Speaker 1: far outside my cluster, and they kept trying to get 879 00:47:28,120 --> 00:47:30,200 Speaker 1: me to move there, and I'm like, but you're asking 880 00:47:30,239 --> 00:47:32,000 Speaker 1: me to give up a very valuable option; you're gonna 881 00:47:32,040 --> 00:47:34,880 Speaker 1: have to compensate me for it. That never really resonated 882 00:47:34,920 --> 00:47:39,000 Speaker 1: with them. But, um, it seems counterintuitive that Amazon would 883 00:47:39,000 --> 00:47:41,279 Speaker 1: want to be close to their job competition, but it's 884 00:47:41,280 --> 00:47:43,719 Speaker 1: also what they need to do to attract talent. It's 885 00:47:43,719 --> 00:47:47,360 Speaker 1: why cities haven't disappeared, despite the best predictions of people 886 00:47:47,760 --> 00:47:50,560 Speaker 1: half a century ago. What about, I thought this was 887 00:47:50,600 --> 00:47:54,480 Speaker 1: an interesting headline, what is the real reason people regret 888 00:47:54,719 --> 00:47:59,880 Speaker 1: not saving more? Risk.
So, you know, people think about 889 00:48:00,040 --> 00:48:02,799 Speaker 1: saving as something for, you know, often for 890 00:48:02,960 --> 00:48:04,600 Speaker 1: something that's gonna happen in the future that they want 891 00:48:04,600 --> 00:48:06,880 Speaker 1: to do. But when they looked at people at the 892 00:48:06,960 --> 00:48:09,799 Speaker 1: end of their life or in retirement, the reason why 893 00:48:09,800 --> 00:48:11,480 Speaker 1: they wish they had more money, they don't wish 894 00:48:11,520 --> 00:48:13,400 Speaker 1: they could have gone on better vacations when they retired, 895 00:48:13,400 --> 00:48:15,440 Speaker 1: or wish they could have had a better lifestyle. It was, 896 00:48:15,880 --> 00:48:18,200 Speaker 1: I didn't realize that divorce would blow all my savings. 897 00:48:18,480 --> 00:48:22,200 Speaker 1: It's not a luxury goal. It's a, hey, things happen 898 00:48:22,280 --> 00:48:25,279 Speaker 1: that are just unexpected, and I failed to plan for that. Yeah, 899 00:48:25,320 --> 00:48:27,360 Speaker 1: and people, I mean, the, like, four hundred dollar 900 00:48:27,400 --> 00:48:30,279 Speaker 1: emergency savings figure is a little controversial, but I don't 901 00:48:30,280 --> 00:48:32,359 Speaker 1: think it is a stretch to say people really don't 902 00:48:32,360 --> 00:48:36,680 Speaker 1: make emergency savings enough of a priority. Right, that's pretty fair. Um, 903 00:48:36,760 --> 00:48:39,439 Speaker 1: let's talk about charities. What's the best way to get 904 00:48:39,480 --> 00:48:41,800 Speaker 1: people to donate to charity? So I think it was 905 00:48:41,880 --> 00:48:44,200 Speaker 1: John List, who I mentioned earlier, who did a study in 906 00:48:44,800 --> 00:48:47,319 Speaker 1: Alaska.
You know, Alaska is just right for 907 00:48:47,440 --> 00:48:49,800 Speaker 1: a lot of economic studies, because they get the dividend 908 00:48:49,840 --> 00:48:53,080 Speaker 1: payment from all the oil reserves that they're 909 00:48:53,080 --> 00:48:56,759 Speaker 1: licensing and leasing out to the big 910 00:48:56,800 --> 00:48:59,440 Speaker 1: public oil companies. Yeah, so they have a program where 911 00:48:59,640 --> 00:49:01,920 Speaker 1: you can give some of that money to charity. And 912 00:49:01,960 --> 00:49:05,799 Speaker 1: they did a study where they 913 00:49:05,880 --> 00:49:08,240 Speaker 1: came with a card, and one said, you know, warm 914 00:49:08,280 --> 00:49:10,440 Speaker 1: your heart, the other was like, improve Alaska, and the 915 00:49:10,440 --> 00:49:12,680 Speaker 1: other was just nothing. And they found the warm your 916 00:49:12,680 --> 00:49:15,640 Speaker 1: heart group donated the most and were most likely to donate. So, 917 00:49:15,640 --> 00:49:19,200 Speaker 1: in other words, they made the charitable donation about the 918 00:49:19,239 --> 00:49:23,720 Speaker 1: person as opposed to the recipient. Exactly. So appeal 919 00:49:23,800 --> 00:49:26,800 Speaker 1: to ego. So over the past year or two, we've 920 00:49:26,840 --> 00:49:31,600 Speaker 1: gone through this giant Me Too movement. Um, and I 921 00:49:31,640 --> 00:49:35,960 Speaker 1: know that amongst my colleagues we've had debates about what 922 00:49:36,000 --> 00:49:39,400 Speaker 1: do you do with an artist who turns out to 923 00:49:39,440 --> 00:49:44,160 Speaker 1: be a less than, um, nice guy. Uh, and the 924 00:49:44,239 --> 00:49:47,200 Speaker 1: Michael Jackson HBO documentary just came out. I have yet 925 00:49:47,239 --> 00:49:50,600 Speaker 1: to see it, but I know Michael Jackson fans are 926 00:49:50,680 --> 00:49:53,319 Speaker 1: kind of split.
Some are defending him and others are 927 00:49:53,320 --> 00:49:55,759 Speaker 1: a little bereft. I was always a big fan of 928 00:49:55,800 --> 00:49:59,160 Speaker 1: Louis C.K. I'm not happy with what he did. 929 00:49:59,400 --> 00:50:03,879 Speaker 1: Go down the list. You know, it ranges from offensive 930 00:50:03,960 --> 00:50:08,240 Speaker 1: to criminal and everything in between. You raised the question, 931 00:50:08,480 --> 00:50:11,560 Speaker 1: what do you do when a brilliant economist is accused 932 00:50:11,600 --> 00:50:17,399 Speaker 1: of sexually harassing his research assistants? So what's the solution? Well, 933 00:50:17,560 --> 00:50:20,120 Speaker 1: I don't know, because, the thing is, I was writing 934 00:50:20,120 --> 00:50:23,760 Speaker 1: about Roland Fryer, who has been accused but not found guilty. Um, 935 00:50:24,000 --> 00:50:27,920 Speaker 1: but there he's been accused by a number of research assistants, 936 00:50:28,480 --> 00:50:32,879 Speaker 1: some of whom have incredibly claimed that he thwarted their 937 00:50:32,920 --> 00:50:37,919 Speaker 1: careers for their refusing to succumb to his charms. So, 938 00:50:38,320 --> 00:50:40,120 Speaker 1: which is hard. It's a terrible thing to say. This 939 00:50:40,120 --> 00:50:43,960 Speaker 1: is terrible. But what he always 940 00:50:44,040 --> 00:50:46,480 Speaker 1: defended himself with, and this is a valid point, although 941 00:50:46,719 --> 00:50:49,680 Speaker 1: it doesn't excuse his behavior in any way, is, I do 942 00:50:49,760 --> 00:50:54,000 Speaker 1: research that's socially critical. He was the economist who was 943 00:50:54,080 --> 00:50:57,320 Speaker 1: leading the charge in understanding why a lot of minority 944 00:50:57,320 --> 00:51:00,759 Speaker 1: students don't do well in school. So it's important.
945 00:51:00,800 --> 00:51:02,239 Speaker 1: But what does that have to do with whether or 946 00:51:02,280 --> 00:51:05,839 Speaker 1: not he's a pig? So here's the question: if 947 00:51:05,840 --> 00:51:08,799 Speaker 1: someone is doing research that's socially important, or suppose they're 948 00:51:08,800 --> 00:51:10,640 Speaker 1: on the verge of a cure for cancer, and we 949 00:51:10,680 --> 00:51:13,640 Speaker 1: find out they're a pig, you know, should 950 00:51:13,719 --> 00:51:15,839 Speaker 1: they still have a career? You know, if there are all 951 00:51:15,840 --> 00:51:20,160 Speaker 1: these other positive externalities for society. So you 952 00:51:20,239 --> 00:51:23,880 Speaker 1: just had the bishop's six year prison sentence in Australia. 953 00:51:24,320 --> 00:51:28,040 Speaker 1: You could see throughout history that the powerful will say, look, 954 00:51:28,320 --> 00:51:31,480 Speaker 1: there has been some bad behavior, but we're literally doing 955 00:51:31,560 --> 00:51:35,680 Speaker 1: God's work, and therefore we should be exempt from this. Um, 956 00:51:37,080 --> 00:51:43,440 Speaker 1: you know, pick a person. Um, John Lennon was supposed 957 00:51:43,440 --> 00:51:46,160 Speaker 1: to be a bit of a hardass, and you don't 958 00:51:46,200 --> 00:51:48,319 Speaker 1: want to go through the history of literature to find 959 00:51:48,320 --> 00:51:51,520 Speaker 1: out how big a jerk half the writers out there are. 960 00:51:52,200 --> 00:51:56,480 Speaker 1: And then artists and paintings and things 961 00:51:56,560 --> 00:51:59,239 Speaker 1: like that: if we're gonna have a moral purity test 962 00:51:59,280 --> 00:52:01,839 Speaker 1: on that stuff, your museums will be empty and there'll 963 00:52:01,840 --> 00:52:05,200 Speaker 1: be nothing on the airwaves.
However, it doesn't mean 964 00:52:05,200 --> 00:52:08,880 Speaker 1: they shouldn't suffer the ramifications. So should we separate 965 00:52:08,920 --> 00:52:11,480 Speaker 1: the value of the art from the artist, but that 966 00:52:11,560 --> 00:52:13,799 Speaker 1: doesn't give them a free pass in their career? Not 967 00:52:13,880 --> 00:52:15,880 Speaker 1: at all. And I honestly didn't know the answer, so I 968 00:52:15,960 --> 00:52:18,759 Speaker 1: spoke to a philosopher, which was just, I love talking to 969 00:52:18,760 --> 00:52:23,160 Speaker 1: philosophers, because they always remind you that you know nothing about anything. Um, 970 00:52:23,200 --> 00:52:26,200 Speaker 1: and he pointed out that if you are doing important work, 971 00:52:26,440 --> 00:52:30,120 Speaker 1: like scientific work, whether economics or hard science, and 972 00:52:30,160 --> 00:52:32,440 Speaker 1: you're on the verge of really doing something important for the world, 973 00:52:32,760 --> 00:52:35,640 Speaker 1: the pressure on you to behave 974 00:52:35,640 --> 00:52:38,400 Speaker 1: well is actually even higher, because, especially since no 975 00:52:38,400 --> 00:52:41,600 Speaker 1: good work happens alone, you're letting down everyone's effort, and 976 00:52:41,760 --> 00:52:44,640 Speaker 1: it's, you know, not everyone gets the opportunity and resources 977 00:52:44,640 --> 00:52:48,480 Speaker 1: to perform research like this, and so if you're threatening 978 00:52:48,480 --> 00:52:50,719 Speaker 1: it with your behavior, you actually should be held to 979 00:52:50,760 --> 00:52:54,400 Speaker 1: an even higher standard.
So the question is, according to 980 00:52:54,440 --> 00:52:58,400 Speaker 1: the philosopher, then, once this bad behavior is identified, do 981 00:52:58,480 --> 00:53:02,480 Speaker 1: you stop the person from doing research, or do you 982 00:53:02,520 --> 00:53:05,640 Speaker 1: just put a higher level of HR scrutiny in and 983 00:53:05,680 --> 00:53:08,440 Speaker 1: sit that person down and say, you are putting millions 984 00:53:08,440 --> 00:53:10,560 Speaker 1: of people's lives at risk because you're this close to 985 00:53:10,560 --> 00:53:12,640 Speaker 1: a cure for cancer, and if you can't keep your 986 00:53:12,640 --> 00:53:15,440 Speaker 1: hands off your research assistant, here's what's going to happen. 987 00:53:15,480 --> 00:53:17,359 Speaker 1: I mean, how do you resolve that? Well, 988 00:53:17,400 --> 00:53:21,480 Speaker 1: he was more in favor of just jettisoning them. I know, 989 00:53:21,600 --> 00:53:27,399 Speaker 1: jettisoning the assistants? No, the researcher. Um, the philosopher said 990 00:53:27,560 --> 00:53:29,759 Speaker 1: just fire the guy. Yeah, whoever it is. Yeah, 991 00:53:29,760 --> 00:53:31,719 Speaker 1: he's like, we need to have standards. And, 992 00:53:31,760 --> 00:53:34,399 Speaker 1: you know, as he said, yeah, we might 993 00:53:34,400 --> 00:53:37,919 Speaker 1: have a longer wait before a cure 994 00:53:37,920 --> 00:53:40,040 Speaker 1: for cancer. But he's like, what about the behavior 995 00:53:40,040 --> 00:53:41,520 Speaker 1: he did all along? Maybe someone would have come up 996 00:53:41,520 --> 00:53:44,279 Speaker 1: with a cure for cancer even earlier and he discouraged them, 997 00:53:44,960 --> 00:53:47,560 Speaker 1: and maybe we'd have had a cure for cancer fifteen years ago 998 00:53:47,680 --> 00:53:50,880 Speaker 1: but for his behavior. So you can't use it as an excuse.
999 00:53:51,160 --> 00:53:53,279 Speaker 1: Bad behavior has to be punished, because you don't know 1000 00:53:53,360 --> 00:53:57,239 Speaker 1: what sort of... That's it. I love a classic counterfactual, 1001 00:53:57,280 --> 00:54:02,000 Speaker 1: and that's a perfect one. Um, last one before 1002 00:54:02,040 --> 00:54:05,239 Speaker 1: we get to our favorite questions. Actually there are two here. 1003 00:54:05,239 --> 00:54:09,680 Speaker 1: They're fascinating. Um, let's start with this one. Are millennials 1004 00:54:09,719 --> 00:54:13,560 Speaker 1: the wealthiest generation? They could be, I think, you know, 1005 00:54:13,719 --> 00:54:17,319 Speaker 1: in time, eventually, maybe, but today they certainly don't feel 1006 00:54:17,360 --> 00:54:19,680 Speaker 1: that way. Well, I'm a life cyclist, right, so when 1007 00:54:19,719 --> 00:54:27,040 Speaker 1: I think about wealth, I think about all your assets. 1008 00:54:27,360 --> 00:54:31,080 Speaker 1: So I think of human capital, which in life cycle 1009 00:54:31,120 --> 00:54:34,400 Speaker 1: economics is the value of your future earnings. So for me, 1010 00:54:34,480 --> 00:54:38,840 Speaker 1: the idea that millennials don't own houses and 1011 00:54:38,880 --> 00:54:41,560 Speaker 1: instead they have student debt doesn't seem to me like a problem; 1012 00:54:41,600 --> 00:54:45,719 Speaker 1: they've made a huge investment in their future earnings and education. 1013 00:54:45,840 --> 00:54:49,640 Speaker 1: It's not perfect, but it is correlated with much higher lifetime earnings. True, 1014 00:54:49,719 --> 00:54:53,040 Speaker 1: but the issue that they've appropriately raised: you're at N 1015 00:54:53,120 --> 00:54:56,440 Speaker 1: Y U. It's the most expensive tuition in America, sixty 1016 00:54:56,640 --> 00:55:00,279 Speaker 1: three thousand dollars or something insane a year. Um, I 1017 00:55:00,320 --> 00:55:02,560 Speaker 1: went to a state school. I went to Stony Brook.
My 1018 00:55:02,600 --> 00:55:06,360 Speaker 1: tuition was four fifty a semester. Even today it's like five 1019 00:55:06,400 --> 00:55:10,600 Speaker 1: thousand a semester, which seems like a bargain. Um. Students 1020 00:55:10,600 --> 00:55:15,719 Speaker 1: today are paying prices for education that are just so 1021 00:55:15,880 --> 00:55:20,160 Speaker 1: vastly out of whack with what they were like forty, fifty, 1022 00:55:20,239 --> 00:55:23,480 Speaker 1: sixty years ago. Even twenty-five years ago, it was 1023 00:55:23,600 --> 00:55:26,840 Speaker 1: much, much cheaper relative to the total cost of living 1024 00:55:27,239 --> 00:55:30,160 Speaker 1: to go to school. So are they going to get 1025 00:55:30,280 --> 00:55:34,360 Speaker 1: the same return on their investment in education, or have 1026 00:55:34,480 --> 00:55:39,480 Speaker 1: prices just completely run away from them? Well, 1027 00:55:39,480 --> 00:55:41,720 Speaker 1: it's hard to say. I mean, we can't predict the future. 1028 00:55:41,760 --> 00:55:44,520 Speaker 1: I mean, so far it does seem... I mean, I'm 1029 00:55:44,520 --> 00:55:46,279 Speaker 1: not sure if there's much value in going to 1030 00:55:46,680 --> 00:55:51,719 Speaker 1: an expensive school over Stony Brook. Probably you're gonna do just as 1031 00:55:51,760 --> 00:55:54,480 Speaker 1: well going to a good public school. Yes. I couldn't 1032 00:55:54,480 --> 00:55:57,680 Speaker 1: get into Stony Brook today with my grades. Back then 1033 00:55:58,440 --> 00:56:00,600 Speaker 1: I was in a little bit of the valley between 1034 00:56:01,120 --> 00:56:03,120 Speaker 1: the boomers and the gen Xers. And the same thing 1035 00:56:03,120 --> 00:56:06,520 Speaker 1: with grad school. I couldn't get into, 1036 00:56:06,520 --> 00:56:10,480 Speaker 1: uh, the grad school that I 1037 00:56:10,520 --> 00:56:13,719 Speaker 1: got into way back when.
So some of it's just 1038 00:56:13,840 --> 00:56:16,320 Speaker 1: dumb luck when you're born. But the other aspect of it 1039 00:56:16,520 --> 00:56:21,040 Speaker 1: is, um, what sort of return are 1040 00:56:21,160 --> 00:56:25,720 Speaker 1: these current graduates... what should they expect going forward? Explain 1041 00:56:25,760 --> 00:56:28,440 Speaker 1: why you say potentially they're the wealthiest generation. Well, I 1042 00:56:28,480 --> 00:56:32,359 Speaker 1: think when people look at the outcome from education, they 1043 00:56:32,360 --> 00:56:35,680 Speaker 1: are too shortsighted, because you don't... often, like, I mean, 1044 00:56:35,760 --> 00:56:38,640 Speaker 1: I didn't enter the labor market really officially until I 1045 00:56:38,640 --> 00:56:41,920 Speaker 1: finished grad school, when I was almost thirty. And my earnings, well, 1046 00:56:41,920 --> 00:56:45,080 Speaker 1: actually my first job was unpaid, um, you know, weren't 1047 00:56:45,080 --> 00:56:46,719 Speaker 1: that great. But when you think of 1048 00:56:46,760 --> 00:56:50,440 Speaker 1: the payoff from human capital, it's your lifetime earnings. And 1049 00:56:50,520 --> 00:56:52,879 Speaker 1: often out of school, you're gonna earn less than someone 1050 00:56:52,880 --> 00:56:55,560 Speaker 1: who, you know, didn't go to college. But the trajectory 1051 00:56:55,560 --> 00:56:57,920 Speaker 1: of your earnings follows a much steeper path. So 1052 00:56:57,960 --> 00:56:59,840 Speaker 1: they've been working for a few years, they've had 1053 00:57:00,000 --> 00:57:02,680 Speaker 1: a series of raises. You're starting out below them, 1054 00:57:03,080 --> 00:57:06,080 Speaker 1: but as a college or grad student, you're 1055 00:57:06,120 --> 00:57:08,959 Speaker 1: gonna accelerate way past them. And you also face less risk.
1056 00:57:09,000 --> 00:57:11,040 Speaker 1: I mean, the unemployment rate for college grads is much, 1057 00:57:11,120 --> 00:57:13,640 Speaker 1: much lower. I'm sensing a theme with you. I don't 1058 00:57:13,680 --> 00:57:15,440 Speaker 1: know what it is. So I only have you for 1059 00:57:15,480 --> 00:57:19,320 Speaker 1: another ten minutes. Let's jump to my favorite questions. These 1060 00:57:19,320 --> 00:57:23,480 Speaker 1: are what we ask all of our guests. Um, tell 1061 00:57:23,600 --> 00:57:26,840 Speaker 1: us the first car you ever owned: year, make and model. Well, 1062 00:57:26,880 --> 00:57:29,400 Speaker 1: I've never actually technically owned a car, but in high 1063 00:57:29,400 --> 00:57:33,280 Speaker 1: school I did drive one. Okay, I love that car. 1064 00:57:33,440 --> 00:57:35,920 Speaker 1: Oh, it lasted, what, two hundred thousand miles, which in the 1065 00:57:35,960 --> 00:57:38,000 Speaker 1: nineties was a really big deal. That was a little 1066 00:57:38,040 --> 00:57:41,080 Speaker 1: two-seater. You couldn't destroy them. 1067 00:57:41,640 --> 00:57:43,360 Speaker 1: It wasn't the fastest car in the world. You could 1068 00:57:43,360 --> 00:57:45,280 Speaker 1: get them with a stick shift, which was nice back 1069 00:57:45,320 --> 00:57:47,800 Speaker 1: then. It was a stick shift. Oh, there you go. 1070 00:57:48,000 --> 00:57:50,400 Speaker 1: By the way, today I call stick 1071 00:57:50,440 --> 00:57:55,560 Speaker 1: shifts millennial anti-theft devices, because that's what they effectively are. 1072 00:57:55,840 --> 00:57:59,240 Speaker 1: So what's the most important thing we don't know about 1073 00:57:59,280 --> 00:58:03,280 Speaker 1: Allison Schrager? Um, well, you wouldn't know from the 1074 00:58:03,320 --> 00:58:06,400 Speaker 1: book title, but I'm just really a risk nerd. I mean, 1075 00:58:06,440 --> 00:58:08,600 Speaker 1: I guess I don't know if that's surprising.
Um, no, 1076 00:58:08,760 --> 00:58:11,720 Speaker 1: that's not a shocker. You read the book and 1077 00:58:11,720 --> 00:58:16,840 Speaker 1: it's like... it should say Allison Schrager, comma, PhD, 1078 00:58:17,160 --> 00:58:20,480 Speaker 1: risk nerd. So that's not a big surprise, that 1079 00:58:20,600 --> 00:58:23,000 Speaker 1: you think of yourself as a risk nerd. Maybe that's 1080 00:58:23,000 --> 00:58:25,720 Speaker 1: a... well, I guess. I also... I wasn't always. Like, 1081 00:58:25,840 --> 00:58:29,240 Speaker 1: in college, I had a job in Alaska selling incense. 1082 00:58:30,320 --> 00:58:34,240 Speaker 1: Incense in Alaska? Why would they need incense in Alaska? 1083 00:58:34,320 --> 00:58:36,840 Speaker 1: Is there that much... You would think, if anything, 1084 00:58:36,880 --> 00:58:40,360 Speaker 1: it smells like the great outdoors. It was a fishing village. Oh, 1085 00:58:40,520 --> 00:58:44,520 Speaker 1: say no more. Yeah, that's very funny. So 1086 00:58:44,800 --> 00:58:46,320 Speaker 1: I know the answer to this, but I have to 1087 00:58:46,320 --> 00:58:50,080 Speaker 1: ask, who were your early mentors? Obviously Bob Merton. So 1088 00:58:50,160 --> 00:58:52,000 Speaker 1: tell us a little bit about what he taught you, 1089 00:58:52,040 --> 00:58:57,600 Speaker 1: because he's clearly a fascinating and storied person. Well, he just 1090 00:58:57,720 --> 00:59:00,640 Speaker 1: taught me finance, which, you know... I 1091 00:59:00,680 --> 00:59:02,720 Speaker 1: did an econ PhD, so it wasn't like I wasn't 1092 00:59:02,760 --> 00:59:05,600 Speaker 1: exposed to it, but my program was very empirical, 1093 00:59:06,080 --> 00:59:09,800 Speaker 1: so I just thought financial economists just crunch data 1094 00:59:09,800 --> 00:59:12,600 Speaker 1: looking for deviations from the efficient markets hypothesis, and I 1095 00:59:12,640 --> 00:59:15,120 Speaker 1: thought it was not very interesting and sort of corrupt.
1096 00:59:15,760 --> 00:59:18,080 Speaker 1: But then when I met Bob and learned what financial 1097 00:59:18,120 --> 00:59:20,760 Speaker 1: theory was, it really turned me on to theory too. In 1098 00:59:20,760 --> 00:59:23,560 Speaker 1: a way, I was always much more empirically oriented. He taught me 1099 00:59:23,960 --> 00:59:25,880 Speaker 1: how to think about the world and how to see 1100 00:59:25,920 --> 00:59:29,120 Speaker 1: the world in terms of a risk lens, and how 1101 00:59:29,160 --> 00:59:32,640 Speaker 1: to see risk problems everywhere and how they're driving all markets, 1102 00:59:32,680 --> 00:59:35,280 Speaker 1: not just financial markets. So he was really like your 1103 00:59:35,320 --> 00:59:37,919 Speaker 1: postdoc work. He was. It was a very sort 1104 00:59:37,920 --> 00:59:40,560 Speaker 1: of long, intense postdoc. I had good training 1105 00:59:40,600 --> 00:59:43,480 Speaker 1: as a graduate student too, but it sort of readied 1106 00:59:43,560 --> 00:59:46,400 Speaker 1: me to really fully embrace all of his lessons. So 1107 00:59:46,600 --> 00:59:50,680 Speaker 1: let's talk about investors and others who influenced the way 1108 00:59:50,720 --> 00:59:52,920 Speaker 1: you look at the world of risk. Who do 1109 00:59:52,920 --> 00:59:57,200 Speaker 1: you consider to be, um, important thinkers that affected the 1110 00:59:57,200 --> 01:00:01,760 Speaker 1: way you perceive the world? Um, investors, or anyone, 1111 01:00:01,800 --> 01:00:04,800 Speaker 1: really. You mentioned Peter Bernstein. Who affected the way 1112 01:00:05,240 --> 01:00:09,040 Speaker 1: you perceive, um, the world of risk? Certainly Peter Bernstein. 1113 01:00:09,120 --> 01:00:10,800 Speaker 1: Also, as I said, I worked at DFA for a 1114 01:00:10,880 --> 01:00:14,800 Speaker 1: number of years. So David Booth. Um, when did you 1115 01:00:14,840 --> 01:00:19,960 Speaker 1: work at DFA? Two thousand ten. Oh,
so 1116 01:00:20,040 --> 01:00:23,560 Speaker 1: really... They were already fairly developed 1117 01:00:23,560 --> 01:00:25,760 Speaker 1: and running hundreds of billions of dollars by then. And 1118 01:00:25,760 --> 01:00:27,840 Speaker 1: it was a great experience for me, because 1119 01:00:27,920 --> 01:00:29,800 Speaker 1: I was working on a project that they were all 1120 01:00:29,880 --> 01:00:32,800 Speaker 1: very interested in. So every two weeks I would have 1121 01:00:32,920 --> 01:00:36,680 Speaker 1: these meetings with Gene Fama, Ken French, Merton, David Booth 1122 01:00:36,680 --> 01:00:39,720 Speaker 1: and Eduardo Repetto, who was the CIO. And, 1123 01:00:39,800 --> 01:00:42,680 Speaker 1: because we were having to calibrate this very complicated 1124 01:00:42,680 --> 01:00:45,160 Speaker 1: interest rate model I was working on, we would 1125 01:00:45,200 --> 01:00:47,960 Speaker 1: just debate the path of interest rates. And I learned... 1126 01:00:48,080 --> 01:00:49,720 Speaker 1: that's where I learned a lot of my finance, was 1127 01:00:49,840 --> 01:00:52,720 Speaker 1: just seeing, obviously, Gene and Bob go at it, 1128 01:00:52,920 --> 01:00:55,600 Speaker 1: you know, with the influences of David Booth, who's just 1129 01:00:55,640 --> 01:00:59,760 Speaker 1: a real market guy. And this was pre... uh, 1130 01:00:59,800 --> 01:01:03,000 Speaker 1: when did Fama win the Nobel? It was just before, 1131 01:01:06,040 --> 01:01:10,840 Speaker 1: so right before. Quite. That's some collection of mentors 1132 01:01:10,840 --> 01:01:14,720 Speaker 1: and influences. It's interesting because they're smart in such different ways. Huh. 1133 01:01:15,400 --> 01:01:20,240 Speaker 1: So we mentioned, um, Against the Gods. What are some 1134 01:01:20,320 --> 01:01:23,520 Speaker 1: of your other favorite books?
Well, obviously anything 1135 01:01:23,520 --> 01:01:27,439 Speaker 1: Peter Bernstein writes, but I also just... I love memoirs. Really? Yeah, 1136 01:01:27,560 --> 01:01:30,600 Speaker 1: give us an example. I love Just Kids by Patti Smith. 1137 01:01:30,720 --> 01:01:33,960 Speaker 1: I just read Educated, which... I hate loving popular books, 1138 01:01:33,960 --> 01:01:36,800 Speaker 1: but I really loved that. Who wrote Educated? Was it Tara Westover? 1139 01:01:38,040 --> 01:01:40,160 Speaker 1: Don't ask me. Yeah, it's just... it's outside of mine. 1140 01:01:40,200 --> 01:01:44,040 Speaker 1: It's just so beautifully written. Oh, no kidding. What 1141 01:01:44,040 --> 01:01:47,280 Speaker 1: other memoirs have you read that really resonated with you? 1142 01:01:47,560 --> 01:01:50,360 Speaker 1: By the way, I read two books on vacation. Yours 1143 01:01:50,440 --> 01:01:54,920 Speaker 1: was one of them, and then McCullough's, um, The Wright 1144 01:01:54,960 --> 01:01:58,320 Speaker 1: Brothers was quite fascinating, if you're interested in 1145 01:01:58,400 --> 01:02:02,200 Speaker 1: flight. Or, um... that's really less of a memoir, 1146 01:02:02,360 --> 01:02:05,440 Speaker 1: more of a biography. All right, skip that, give us 1147 01:02:05,440 --> 01:02:09,280 Speaker 1: one more book. So you mentioned two, um, 1148 01:02:09,440 --> 01:02:10,920 Speaker 1: hit us with the third. By the way, this 1149 01:02:11,000 --> 01:02:13,800 Speaker 1: is the question I get more email about 1150 01:02:13,840 --> 01:02:16,320 Speaker 1: than anything: what was that book the person 1151 01:02:16,360 --> 01:02:18,800 Speaker 1: mentioned? I get more email about this than anything. It's a stressful 1152 01:02:18,880 --> 01:02:23,120 Speaker 1: question, because I feel like it's so personal and revealing. Yes, um, 1153 01:02:23,160 --> 01:02:25,640 Speaker 1: and what makes it so good?
And especially because I 1154 01:02:25,680 --> 01:02:28,760 Speaker 1: read a lot of crap, because, you know, you finish crap. 1155 01:02:29,560 --> 01:02:32,040 Speaker 1: You know, when I was doing book research, I had 1156 01:02:32,080 --> 01:02:34,600 Speaker 1: this idea that Kris Jenner was this risk mastermind, because 1157 01:02:34,600 --> 01:02:37,560 Speaker 1: look at what she's built. And so I read 1158 01:02:37,600 --> 01:02:39,720 Speaker 1: her biography thinking maybe I'll include her, and it turns 1159 01:02:39,760 --> 01:02:41,200 Speaker 1: out she didn't have a good risk strategy, so I 1160 01:02:41,200 --> 01:02:44,160 Speaker 1: couldn't. Um, right place, right time? Is that what it's all about? 1161 01:02:44,280 --> 01:02:47,120 Speaker 1: She takes advantage. I mean, it's like this extreme 1162 01:02:47,200 --> 01:02:50,520 Speaker 1: level of diversification, where literally any opportunity that comes 1163 01:02:50,520 --> 01:02:52,920 Speaker 1: her way, she seizes on it, and she does work hard. 1164 01:02:53,240 --> 01:02:55,760 Speaker 1: There's nothing, like, strategic. It's sort of... or it's 1165 01:02:55,760 --> 01:02:57,280 Speaker 1: sort of, as well, like a Donald Trump thing. It's 1166 01:02:57,320 --> 01:02:59,560 Speaker 1: like, well, if this blows up, I have something else 1167 01:02:59,600 --> 01:03:02,200 Speaker 1: to distract people with, because I've got ninety gazillion things going. 1168 01:03:02,240 --> 01:03:03,920 Speaker 1: So it's like, we have something to throw 1169 01:03:04,000 --> 01:03:06,280 Speaker 1: people off. Oh look, here's a sex tape, you know. 1170 01:03:07,800 --> 01:03:10,120 Speaker 1: But I remember I was reading her book on the 1171 01:03:10,160 --> 01:03:13,080 Speaker 1: subway and I had this realization, like, I've never read 1172 01:03:13,080 --> 01:03:17,439 Speaker 1: Anna Karenina, but I've read this. Um.
I'm gonna say 1173 01:03:17,640 --> 01:03:21,320 Speaker 1: you probably picked the wrong one of the two, 1174 01:03:21,320 --> 01:03:25,000 Speaker 1: just to hypothesize. Um. All right, so here's a 1175 01:03:25,120 --> 01:03:29,280 Speaker 1: question that also is, uh, personal and probing. Tell us 1176 01:03:29,480 --> 01:03:31,840 Speaker 1: about a time you failed and what you learned from 1177 01:03:31,840 --> 01:03:36,080 Speaker 1: the experience. And by the way, you detailed some personal failures. Yeah, 1178 01:03:36,280 --> 01:03:38,240 Speaker 1: I failed a lot. You don't shy away 1179 01:03:38,240 --> 01:03:40,520 Speaker 1: from that. No. I mean, my first year of grad school, 1180 01:03:40,560 --> 01:03:42,600 Speaker 1: I think I failed almost all my comps, I mean, 1181 01:03:42,720 --> 01:03:45,160 Speaker 1: largely because I was doing a quantitative PhD with no 1182 01:03:45,240 --> 01:03:48,720 Speaker 1: math background. Um. And it was the hardest failure I've 1183 01:03:48,760 --> 01:03:51,360 Speaker 1: ever gone through, because the biggest intellectual achievement 1184 01:03:51,400 --> 01:03:54,240 Speaker 1: I'd ever had was teaching myself six years' 1185 01:03:54,240 --> 01:03:57,600 Speaker 1: worth of math in my free time my first 1186 01:03:57,640 --> 01:04:00,080 Speaker 1: year of grad school. I had never intellectually grown so 1187 01:04:00,160 --> 01:04:03,400 Speaker 1: much from anything and never achieved so much. And still, 1188 01:04:03,560 --> 01:04:04,800 Speaker 1: you know, at the end of the day, you're still 1189 01:04:04,800 --> 01:04:07,880 Speaker 1: taking a math exam and you're being judged against a Korean 1190 01:04:07,880 --> 01:04:11,160 Speaker 1: math champion. No matter how much math you learned quickly, 1191 01:04:11,960 --> 01:04:13,760 Speaker 1: you're not going to stack up, so you're going to fail.
1192 01:04:14,080 --> 01:04:19,480 Speaker 1: Meaning the rest of the student base was hardcore math people. Yeah, 1193 01:04:19,480 --> 01:04:21,680 Speaker 1: and I was just reading math textbooks in my free time, 1194 01:04:21,720 --> 01:04:24,640 Speaker 1: which wasn't all that much. I wasn't sleeping. I was just reading, 1195 01:04:25,040 --> 01:04:27,800 Speaker 1: working through entire math textbooks to try to do my homework. 1196 01:04:28,200 --> 01:04:30,720 Speaker 1: That sounds horrible. And yeah, it was horrible. I was 1197 01:04:30,760 --> 01:04:33,320 Speaker 1: such an unhappy person. I kind of started developing weird 1198 01:04:33,360 --> 01:04:36,040 Speaker 1: social tics. And then I went through all of this, 1199 01:04:36,600 --> 01:04:38,880 Speaker 1: never slept for a year, just working through math textbooks, 1200 01:04:38,920 --> 01:04:41,560 Speaker 1: and I still just failed everything. And I mean, 1201 01:04:41,640 --> 01:04:45,320 Speaker 1: I was obviously just devastated. I'm gonna blame the 1202 01:04:45,400 --> 01:04:47,680 Speaker 1: lack of sleep, because that affects cognitive functioning, 1203 01:04:47,960 --> 01:04:49,880 Speaker 1: you know. And I had time to 1204 01:04:50,240 --> 01:04:52,600 Speaker 1: retake them.
And when I finally got to 1205 01:04:52,680 --> 01:04:55,880 Speaker 1: rest and let all that knowledge sort of marinate in me, 1206 01:04:56,000 --> 01:04:58,400 Speaker 1: I realized how much I knew. And I mean, I 1207 01:04:58,440 --> 01:05:02,080 Speaker 1: think what I learned from that experience is, you know, 1208 01:05:02,200 --> 01:05:04,439 Speaker 1: if you really want something... you know, I mean, there's 1209 01:05:04,440 --> 01:05:07,120 Speaker 1: a point where you have to cut loose, but you 1210 01:05:07,640 --> 01:05:10,760 Speaker 1: can't take the first failure as the final word, because, you know, 1211 01:05:10,960 --> 01:05:14,800 Speaker 1: no one remembers that you failed your first-year exams. 1212 01:05:14,880 --> 01:05:18,040 Speaker 1: All they remember is that you graduated. Huh, that's an 1213 01:05:18,080 --> 01:05:21,600 Speaker 1: interesting observation. So now let me ask you the flip question: 1214 01:05:22,040 --> 01:05:23,480 Speaker 1: what do you do for fun? What do you do 1215 01:05:23,560 --> 01:05:28,520 Speaker 1: when you're not failing math exams? I play bridge. Really? Yeah, 1216 01:05:28,560 --> 01:05:31,280 Speaker 1: my mother plays bridge, my wife plays bridge. This has 1217 01:05:31,320 --> 01:05:34,200 Speaker 1: become like a giant thing now. I'm part of this 1218 01:05:34,520 --> 01:05:38,360 Speaker 1: great bridge group of all these intellectuals. There's a Fields 1219 01:05:38,200 --> 01:05:40,880 Speaker 1: Medal winner, not the last one, but... 1220 01:05:40,880 --> 01:05:43,200 Speaker 1: But it's just a very humble, low-key group. Huh, 1221 01:05:43,680 --> 01:05:47,680 Speaker 1: really quite interesting. Um, what are you most excited 1222 01:05:47,720 --> 01:05:52,840 Speaker 1: about in the future direction of the risk industry? I 1223 01:05:52,840 --> 01:05:56,200 Speaker 1: think technology.
I mean, everyone else is sort of scared 1224 01:05:56,280 --> 01:05:58,280 Speaker 1: of it, but I think it's gonna sort of take 1225 01:05:58,320 --> 01:06:03,960 Speaker 1: us to some interesting places. Interesting. I don't necessarily disagree. Um, 1226 01:06:04,000 --> 01:06:06,560 Speaker 1: so if a millennial or recent college student came up 1227 01:06:06,600 --> 01:06:10,560 Speaker 1: to you and asked you for career advice about going 1228 01:06:10,680 --> 01:06:15,120 Speaker 1: into economics, risk or academia, what sort of advice 1229 01:06:15,160 --> 01:06:17,440 Speaker 1: would you give them? Find the smartest person you can 1230 01:06:17,520 --> 01:06:20,520 Speaker 1: and attach yourself to them. It's funny you say that; 1231 01:06:20,520 --> 01:06:22,640 Speaker 1: that was my father's advice to me when I went 1232 01:06:22,680 --> 01:06:26,840 Speaker 1: to grad school. Did it work? Um, more or less. Actually, 1233 01:06:26,960 --> 01:06:30,600 Speaker 1: it began with joining track in high school: find the 1234 01:06:30,600 --> 01:06:33,120 Speaker 1: fastest guy, keep up with him. And then when I 1235 01:06:33,160 --> 01:06:35,560 Speaker 1: went to grad school, he said, hey, remember the advice 1236 01:06:35,560 --> 01:06:38,200 Speaker 1: I gave you about track? I'm going to give you 1237 01:06:38,240 --> 01:06:41,480 Speaker 1: the same exact advice: find the smartest person. And I'm like, Dad, 1238 01:06:41,520 --> 01:06:43,600 Speaker 1: I'm way ahead of you. I already 1239 01:06:43,600 --> 01:06:47,080 Speaker 1: thought about that. So how does that manifest itself for a 1240 01:06:47,800 --> 01:06:50,880 Speaker 1: millennial or college graduate? How would they actually go about 1241 01:06:50,920 --> 01:06:52,960 Speaker 1: doing that?
When you're in a job, I say, find 1242 01:06:52,960 --> 01:06:54,680 Speaker 1: the smartest person in the room, and it should never 1243 01:06:54,720 --> 01:06:59,040 Speaker 1: be you, um, or there's something wrong, um, and try 1244 01:06:59,080 --> 01:07:02,040 Speaker 1: to get them to mentor you, and be open to it. 1245 01:07:02,040 --> 01:07:04,760 Speaker 1: And, you know, anyway, contrary to all these perceptions of millennials 1246 01:07:04,840 --> 01:07:06,880 Speaker 1: being know-it-alls, I find that the millennials I 1247 01:07:06,920 --> 01:07:10,120 Speaker 1: work with are always looking to learn. Um, maybe I've 1248 01:07:10,120 --> 01:07:11,720 Speaker 1: just been lucky. I don't know if know-it-alls 1249 01:07:11,840 --> 01:07:16,800 Speaker 1: is the right, uh, right description. They're definitely hard working, 1250 01:07:16,880 --> 01:07:20,240 Speaker 1: and they have areas where they have great strengths, 1251 01:07:20,360 --> 01:07:23,840 Speaker 1: and I think the biggest knock is the weaknesses 1252 01:07:23,840 --> 01:07:28,440 Speaker 1: they're not interested in working on. Um. But I think 1253 01:07:28,480 --> 01:07:31,680 Speaker 1: they've gotten a bad... Having worked with them for five 1254 01:07:31,800 --> 01:07:34,960 Speaker 1: years in a firm, do you think they've gotten 1255 01:07:34,960 --> 01:07:36,760 Speaker 1: a bad rap? Yeah. I think they're the same as 1256 01:07:36,760 --> 01:07:39,600 Speaker 1: every other generation. You've got some noisy outliers, and I 1257 01:07:39,640 --> 01:07:42,320 Speaker 1: think with social media the noisy outliers' voices are amplified. 1258 01:07:42,520 --> 01:07:44,880 Speaker 1: That's a great observation. I think on average, they're just 1259 01:07:44,960 --> 01:07:48,320 Speaker 1: like everyone else. That makes perfect sense. Um.
Although 1260 01:07:48,320 --> 01:07:52,080 Speaker 1: they grew up in such a unique... like, think 1261 01:07:52,120 --> 01:07:54,360 Speaker 1: about what you grew up in, and then what people 1262 01:07:54,400 --> 01:07:56,840 Speaker 1: born ten or twenty years after you grew up in. 1263 01:07:56,880 --> 01:07:59,680 Speaker 1: Technology isn't this thing that's kind of cool on 1264 01:07:59,760 --> 01:08:02,440 Speaker 1: the side; they're immersed in it from birth. It's a 1265 01:08:02,440 --> 01:08:06,520 Speaker 1: whole different headspace. So yeah, I mean, it's very different. 1266 01:08:06,520 --> 01:08:08,560 Speaker 1: I think their brains probably formed differently to some degree. 1267 01:08:08,600 --> 01:08:11,360 Speaker 1: But again, I think ultimately even college students, 1268 01:08:11,400 --> 01:08:14,560 Speaker 1: the ones I've interacted with from teaching, are also really 1269 01:08:14,560 --> 01:08:18,439 Speaker 1: open to debating new ideas, and even uncomfortable ideas. Again, 1270 01:08:18,479 --> 01:08:23,440 Speaker 1: it's just the noisy outliers. That's quite fascinating. Um, 1271 01:08:23,479 --> 01:08:26,000 Speaker 1: and our final question: what do you know about the 1272 01:08:26,040 --> 01:08:30,040 Speaker 1: world of risk and economics today that you wish you 1273 01:08:30,120 --> 01:08:35,240 Speaker 1: knew fifteen, twenty years ago when you were just graduating? Well, 1274 01:08:35,280 --> 01:08:38,880 Speaker 1: that's when I was starting grad school, and, you know, 1275 01:08:39,200 --> 01:08:44,120 Speaker 1: I chose macro, and macro traditionally has not incorporated risk 1276 01:08:44,160 --> 01:08:47,120 Speaker 1: at all. I didn't realize how fundamental it was 1277 01:08:47,200 --> 01:08:49,759 Speaker 1: to the economy.
I thought, you know, I was studying 1278 01:08:49,800 --> 01:08:52,880 Speaker 1: sort of either the neoclassical Keynesian or New Keynesian models, 1279 01:08:52,880 --> 01:08:55,160 Speaker 1: where, you know, if you do government spending, this is 1280 01:08:55,200 --> 01:08:57,639 Speaker 1: what happens. Whereas if I was doing finance, I would 1281 01:08:57,640 --> 01:08:59,680 Speaker 1: think of how does that impact markets, what's the 1282 01:08:59,760 --> 01:09:02,679 Speaker 1: range of things that could happen. So I wish I knew 1283 01:09:02,680 --> 01:09:05,839 Speaker 1: about risk, because I didn't, really. I'm shocked 1284 01:09:05,840 --> 01:09:09,160 Speaker 1: to even hear that. I don't have an economics background; 1285 01:09:09,240 --> 01:09:14,400 Speaker 1: I just play an economist on TV. How 1286 01:09:14,479 --> 01:09:19,519 Speaker 1: is it possible that macroeconomics does not incorporate any 1287 01:09:19,560 --> 01:09:23,439 Speaker 1: analysis or study of risk? That seems shocking. It is. 1288 01:09:23,600 --> 01:09:25,960 Speaker 1: It's actually... you know, people don't talk about this, because 1289 01:09:25,960 --> 01:09:28,400 Speaker 1: other things get attention. But I just went to a 1290 01:09:28,400 --> 01:09:30,439 Speaker 1: conference two or three weeks ago that Lars Hansen and 1291 01:09:30,400 --> 01:09:34,320 Speaker 1: Andrew Lo put on about how can we incorporate finance into 1292 01:09:34,360 --> 01:09:37,160 Speaker 1: macro, how can we put risk into macro models. And 1293 01:09:37,200 --> 01:09:39,800 Speaker 1: I think this was, amongst academics, the big takeaway from 1294 01:09:39,840 --> 01:09:43,120 Speaker 1: the financial crisis: that all macro models have no meaningful 1295 01:09:43,200 --> 01:09:45,080 Speaker 1: role for the financial sector, so how are you gonna even 1296 01:09:45,160 --> 01:09:50,240 Speaker 1: capture systemic risk?
So this is really what, like, hardcore economists 1297 01:09:50,240 --> 01:09:52,920 Speaker 1: have been working on post-crisis, and it's 1298 01:09:52,920 --> 01:09:55,400 Speaker 1: a hard problem, because any economic model has to make 1299 01:09:55,439 --> 01:09:58,520 Speaker 1: choices, and once you bring risk in, they get more complicated. 1300 01:09:58,840 --> 01:10:00,880 Speaker 1: This is a controversial thing to say. I mean, it's 1301 01:10:00,880 --> 01:10:03,519 Speaker 1: not gonna sound sexy to most people, but to a small 1302 01:10:03,560 --> 01:10:06,280 Speaker 1: group of economists it is. I now feel strongly, and I don't think 1303 01:10:06,280 --> 01:10:09,920 Speaker 1: anyone agrees with me, that basic finance should be 1304 01:10:10,120 --> 01:10:13,600 Speaker 1: part of basic economics training, because it is such a 1305 01:10:13,640 --> 01:10:16,519 Speaker 1: fundamental part of understanding how the economy works. Meaning basic 1306 01:10:16,600 --> 01:10:19,680 Speaker 1: finance plus an understanding of risk? Well, yeah. Well, you 1307 01:10:19,800 --> 01:10:22,519 Speaker 1: get a basic understanding of risk if you learn 1308 01:10:22,560 --> 01:10:24,840 Speaker 1: finance, because finance is all the principles of risk, because 1309 01:10:25,160 --> 01:10:28,400 Speaker 1: finance in economics is just how economists study risk. It just 1310 01:10:28,439 --> 01:10:30,640 Speaker 1: happens to be in financial markets, because that's where the 1311 01:10:30,720 --> 01:10:33,519 Speaker 1: data is. So I think it should be, you know, 1312 01:10:33,960 --> 01:10:39,360 Speaker 1: undergraduate macro, micro, finance. Quite fascinating. We have been speaking 1313 01:10:39,439 --> 01:10:43,960 Speaker 1: to Allison Schrager.
She is the author of An Economist 1314 01:10:44,240 --> 01:10:48,080 Speaker 1: Walks Into a Brothel, as well as co-founder of 1315 01:10:48,120 --> 01:10:53,840 Speaker 1: LifeCycle Finance Partners, uh, and an adjunct professor 1316 01:10:53,880 --> 01:10:58,000 Speaker 1: at NYU and a journalist at Quartz. If you 1317 01:10:58,120 --> 01:11:00,599 Speaker 1: enjoyed this conversation, be sure and look up an inch 1318 01:11:00,680 --> 01:11:04,240 Speaker 1: or down an inch on Apple iTunes, uh, Stitcher, Overcast, 1319 01:11:04,240 --> 01:11:08,200 Speaker 1: Bloomberg dot com, wherever you find your favorite podcasts. We 1320 01:11:08,320 --> 01:11:12,960 Speaker 1: love your comments, feedback and suggestions. Write to us at 1321 01:11:13,640 --> 01:11:16,600 Speaker 1: MIB podcast at Bloomberg dot net. Be sure and 1322 01:11:16,680 --> 01:11:19,280 Speaker 1: go to Apple iTunes and give us a delightful review. 1323 01:11:19,920 --> 01:11:22,880 Speaker 1: Thanks for sharing all this time with us. I 1324 01:11:22,960 --> 01:11:25,679 Speaker 1: would be remiss if I did not thank the crack 1325 01:11:25,800 --> 01:11:31,240 Speaker 1: staff that helps put together, uh, these weekly conversations. Uh, 1326 01:11:31,280 --> 01:11:35,680 Speaker 1: Atika Valbrun is our project manager. Michael Batnick is my 1327 01:11:35,800 --> 01:11:39,800 Speaker 1: head of research. I'm Barry Ritholtz. You've been listening to 1328 01:11:39,960 --> 01:11:42,519 Speaker 1: Masters in Business on Bloomberg Radio.