Barry Ritholtz: This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

This week on the podcast, what can I say? Bill Bernstein is a brilliant author, neurologist, financial theorist, and investor. His most recent book, The Delusions of Crowds: Why People Go Mad in Groups, is going to be one of those must-read books in the pantheon of both bubbles and behavioral finance. He's written so many books, The Birth of Plenty, The Intelligent Asset Allocator, The Four Pillars of Investing, just too many to mention. I find Bill to be one of those really unique people who has a number of insights into the world of investing, partly because he's a neurologist and really spent a lot of time learning how people's brains function and what drives us in our decision-making process. But just as important, he's a historian and a deep researcher and author. So not only does he understand the neurology and the cognitive science of our brains, but he is very familiar with all of the academic literature and all of the actual history of what's taken place over the past, pick a timeframe, five hundred, a thousand, two thousand years. And so all of his work is always a deep dive; every page is filled with so much fascinating information. I'm really enjoying The Delusions of Crowds; I'm about halfway through it. All of his books are absolutely essential reading to me. So if you are at all interested in anything from cognitive theory, behavioral finance, investing, anthropology, bubble behavior, bitcoin, and Tesla, you name it, you're going to find this to be just a fascinating dive. So, with no further delay, my conversation with William Bernstein.

This is Masters in Business with Barry Ritholtz on Bloomberg Radio. My extra special guest this week is William Bernstein. He began his career as a neurologist before becoming an author, financial theorist, and investment advisor. He's written a dozen books, the most recent of which is The Delusions of Crowds: Why People Go Mad in Groups.
Who better to discuss people going insane than a neurologist and investor? William Bernstein, welcome back to Bloomberg.

Bill Bernstein: As always, a pleasure, Barry.

Ritholtz: So let's dive right into this. Before you were an investor and an author, you worked as a medical doctor with a specialty in neurology. How did that background help you as an investor?

Bernstein: Really only in a very indirect way. You would think that being a clinical neurologist would help you with psychology and neuropsychology, but really, practicing neurology is a very down-and-dirty, ground-level occupation. What I like to explain to people is that the great neuroscientists we think of, people like the Damasios, are like great artists, the Michelangelos and da Vincis, whereas what I did was more Sherwin-Williams: it was neck pain and back pain. Rather, what helped me write about finance is simply the scientific training that comes with being a doctor, and the importance of examining data, seeking it out and rigorously analyzing it.

Ritholtz: Interesting. So I mentioned to some colleagues that I was going to be speaking with you, and one of them, Michael Batnick, said to ask you: when you started writing Why People Go Mad in Groups, did you expect it to come out at a time when people were in fact going mad in groups?

Bernstein: No, this is the very definition of dumb luck. As it turns out, my publisher decided to delay bringing the book out for about six months, because they didn't want to bring it out right before the election; they figured it would suck up too much of people's bandwidth. And so it got delayed until just the moment when people started, you know, believing in QAnon and occupying the Capitol building and going nuts over bitcoin and GameStop.

Ritholtz: So we'll get to just about each and every one of those things. But it leads to an obvious question: how has the Internet changed the psychology of crowds?
How has it affected how people respond to these mass delusions?

Bernstein: Well, you can think of a mass delusion the same way you think of the pandemic. There is a causative agent. In the case of the pandemic, it's the coronavirus; in the case of the Black Death, it was Yersinia pestis. And then there's a vector. The vector for COVID is people coughing and spraying each other with droplets; with the Black Death, it was, you know, fleas and rats transmitting the disease across great distances. It's the same way with delusions: the agent is the narrative, and the more compelling the narrative, the more virulent, the more dangerous the agent. And then you've got the vector, and the vector is the medium through which the delusion spreads. What we saw over the past ten years is the evolution and the explosion of social media, and so that is one of the most powerful vectors of delusions that we've ever seen. It's kind of like we've gone from the days of the old media, when the infected were widely separated people, to a world where everybody's in the same small, airless room and they're coughing on each other.

Ritholtz: So let's stay with the idea of narrative. One of the things you refer to is that human beings are cognitive misers, and according to psychologists, we much prefer mental shortcuts and heuristics and a compelling narrative. What is the role of narratives in these manias?

Bernstein: Well, the narrative is the causative agent. You can think of the narrative as being viral to the extent that it's compelling. And what psychologists have found is that the more compelling a narrative is, the more corrosive it is to our cognitive ability, our analytical ability. And so what I did in the book was identify the most compelling narratives that we're exposed to.
And the one, of course, that's most financially applicable is the narrative that you can become effortlessly rich just by going online and clicking a few keys, with almost no effort involved. That's a very pleasing and a very compelling narrative. The other narrative that I examined at great length in the book is the most compelling religious narrative out there, which is the one that the world is going to end very quickly, which it turns out is far more prevalent than most people in your and my bubble think that it is.

Ritholtz: So let's stay with that, because there are some really fascinating combinations of apocalyptic end times and investment mania. One of the things I've found about the hardcore goldbugs, as well as the hyperinflationists, is that overlap between: hey, when it hits the fan and it all goes down, you'd better have some gold coins, because that paper money will be worthless. That seems to really combine the religious armageddon with the financial armageddon.

Bernstein: Yeah, and you can throw conspiracy theories into that. You know, social psychologists have found that the two biggest correlates of conspiracy theories are a belief in the end times, that is, the theological, religious end times, and the other characteristic it correlates with is Manichaean thinking: the belief that the world is divided into black and white, good people and bad people, with nothing in between. For the Manichaean personality, good people never do bad things and bad people never do good things, even though we know that happens all the time.

Ritholtz: Yeah, that binary approach to reality seems to be a gross oversimplification, and it certainly doesn't work for investing. I can't imagine it works in day-to-day life, although people certainly seem to be functioning despite holding some pretty insane beliefs.

Bernstein: Yeah, absolutely, I mean, you're right. Occasionally, you know, Jim Cramer gets things right.

Ritholtz: So Cramer is an interesting example, because he leads a group of followers.
You could say the same thing about Barstool Sports. You could say the same thing about a number of different financial or religious movements. But when we look at either Robinhood or Reddit's Wall Street Bets, there are no leaders there; there are no proselytizers there. Can you have a mob with no leader? I think something like ten million people follow Wall Street Bets, but no one person controls that group.

Bernstein: Oh, absolutely, that's the way it happens most of the time. You know, when someone goes up on a ledge and threatens to jump, and a crowd gathers below them, a small percentage of the time, maybe five or ten percent of the time, people start shouting for the person to jump. This is one of the sad, you know, accouterments of human nature. Well, certainly there's no leader there. And it's certainly possible; in fact, I think that history shows that most mass manias really don't have an identifiable leader.

Ritholtz: Quite interesting. So there are some really fascinating quotes throughout the book. I want to start with: "More often than not, we avoid contrary facts and data; when we cannot avoid them, our erroneous assessments will occasionally even harden, and, yet more amazingly, make us more likely to proselytize them." This sounds very much like cognitive dissonance. How key a factor is that in both religious and financial decision making?

Bernstein: Well, you said the magic words. You know, cognitive dissonance was something that was written about and talked about by Leon Festinger, although he didn't invent the term, and it's, you know, somewhat overdone, I think, especially in modern culture, but it's certainly still a fact.
Really, what cognitive dissonance and confirmation bias are about is not necessarily doubling down or reinforcing your beliefs when presented with contrary data, although that happens all the time, but more the avoidance of inconvenient facts and things that disconfirm your theory. To give you an example, there are people out there who believe that the Bible is perfectly prophetic, and there are a lot of very prophetic things in the Bible, things that came true. But there are a lot more things that the Bible prophesies that didn't come true, and those get conveniently ignored by religious fundamentalists.

Ritholtz: So you mentioned in the book the state of balance. Let's say you're not a Trump supporter, but a good friend of yours is a Trump supporter. That creates an inherent tension: not just the political debate, but the ability to reconcile, hey, here's a person I like, but they have views I disagree with. How does that state of balance get reconciled in a brain? And I don't know if that's the best example from the book, but just that concept of being able to manage two inconsistent beliefs at the same time.

Bernstein: Well, what you're talking about is a concept that was written about by a psychologist by the name of Fritz Heider, and it's an important concept because it explains a lot, so I'll spend just a little bit of time on it: that of the balanced state. So let's say that you're, you know, a Trump supporter, and you meet someone and they're a Trump supporter and you really like the person. Then you're in a balanced state. Now, if you meet someone who thinks that Donald Trump is Satan incarnate, and you think that that person is a blockhead, then you're in a balanced state too, because it enables you to dismiss his or her opinion.
But on the other hand, if you're a Trump supporter and you disagree with your very best friend about Donald Trump, then you're in an unbalanced state, and you have to resolve that: you either have to decide that Donald Trump isn't so good, or you have to decide that you don't like your friend so much anymore, which is much more likely to happen, because people find it easier to lose their friends than to completely obliterate their belief system. And so you can actually do functional magnetic resonance imaging and see these mechanisms at work: you can see certain areas of the brain lighting up when you're in a balanced state, certain other areas light up when you're in an unbalanced state, and then, when that gets resolved, you see other areas of the brain light up.

Ritholtz: So when we hear about family members having disputes with other family members over QAnon, and people literally being cut off by their parents and others who have fallen prey to, do we call it a cult or a belief system? Is that the balanced-state issue? What's underlying that schism, even within families?

Bernstein: Yes. To bring in a four-bit term, that's the classic Heiderian unbalanced state, and it has to be resolved one way or the other. And the way most people resolve it is they cut themselves off from people they have strong political disagreements with.

Ritholtz: Quite interesting. Let's move towards fear and greed. Those are the phrases I've always heard in finance. But if we want to get more specific, when we look at the limbic system, you discuss more precisely, and I'm going to mangle this, the nucleus accumbens and the amygdala. How important is the limbic system to our financial behavior?

Bernstein: Well, it's almost everything.
You know, you succeed in finance to the extent that you can suppress the limbic system, your System 1, which is the very fast-moving emotional system that we have. If you can't suppress that, you're probably going to die poor.

Ritholtz: That's interesting. I have a few more specific questions from the neurology perspective, and also from the perspective of evolutionary history. You reference a preference for, quote, rationalization over rationality. What evolutionary purpose does that serve?

Bernstein: Well, in a state of nature, you have to react very quickly. If you see, you know, black and yellow stripes in your peripheral vision, or you hear the hiss of a snake, you had better move, and move quickly. And that leads to a number of things, but number one is the dominance of your fast-moving limbic system. The limbic system is the fastest-moving central structure that you have, at least, you know, in your brain; the central nervous system extends below the brain, but that's getting too much into the weeds. So evolutionarily, it's got real survival value. But it also leads to something else, which is patternization: seeing patterns that really aren't there, because there's relatively little penalty for doing that. If you think you see yellow and black stripes in your peripheral vision and you jump, but it's really not a tiger, it's something else, you haven't lost much. But if you under-interpret the black and yellow stripes, then you're lunch.

Ritholtz: So it's a false-positive evolutionary drive that leads us to over-interpret things.

Bernstein: And the analogy I think that's best is that, you know, if you're a skunk, millions of years of evolution have told you that when you meet a large predator, you turn a hundred and eighty degrees, lift your tail, and spray. And that's, you know, very functional and very useful in a state of nature.
But in a semi-urban environment, where the biggest threat to your existence is a hunk of steel weighing two tons moving at sixty miles an hour, that is not a functional response. And unfortunately, that's the world we live in today.

Ritholtz: So false positives carry no cost, but false negatives are really significant, from an evolutionary perspective.

Bernstein: From an evolutionary perspective, yeah. But today it's the other way around: today, the false positives that we're prone to have a very high cost.

Ritholtz: So let's stay with that theme. Humans attend to bad news much more strongly than good news; we focus on negative outcomes. What's the genetic advantage of being more biased to spotting bad news than good news?

Bernstein: Well, again, it's got obvious survival value. In the first place, it's something that's almost so obvious that we never talk about it. You know, I mean, things generally getting better is not what makes it to the headlines, all right? So we don't attend to good news as much as we attend to bad news. And in a state of nature, once again, it had obvious survival value. One of the questions I used to get asked as a doctor, and people still ask me this question, is: you know, when I was ten years old, I ate some Chinese food and got horribly sick, and I've not been able to look at Chinese food ever since; why is it that I can't get past it? And the answer is very simple. In a state of nature, if you ate a certain mushroom and it made you sick, never wanting to have that mushroom again was a very useful response.

Ritholtz: Interesting. So I love the quote you used from Charles Kindleberger: "There is nothing so disturbing to one's well-being and judgment as to see a friend get rich." Why is that? Break that down for us. Wouldn't somebody in your tribe getting rich help your own survival prospects, from a historical basis?
Bernstein: Well, you know, what my book really is, is a meditation on human nature. So we've already discussed the fact that, you know, man is the ape that tells stories. The foremost one, which we haven't talked about yet, is that man is the ape that imitates. But the third most important characteristic of human nature I talk about is that man is the ape that seeks status. Why do we seek status? Well, because it helps us pass on our DNA, particularly if you're a male. You know, it's why rock stars and athletes do so well with women: because in a state of nature, if you're good at telling stories, or if you're good at hunting animals, you can support mates. So it's the same thing; it's a very basic part of sexual selection. We seek status so we can pass on our DNA.

Ritholtz: Quite interesting. So let's talk about some of the parallels between religious zealotry and financial mania. If we were to think about how social epidemics, like financial bubbles and violent end-times apocalyptic perspectives, originate and propagate, why are there such parallels between finance and religion?

Bernstein: Because they basically involve pretty much the same drives. You know, I've talked about how compelling financial narratives can sweep people up in their ambit, and the same thing happens with compelling religious narratives. Now, if we want to look at compelling narratives, and we want to go with the rubric of "bad is more compelling than good," or "bad is stronger than good," then the most compelling narrative in all the world has to be the one in which the world ends in a fiery inferno. And so that's the compelling narrative that really gets people's attention. And then, to bring the status part of it in,
you know, if the world is going to end in a fiery inferno, then what more pleasing outcome than if you're able to avoid it because you and your friends are the elect? So that speaks to our desire to acquire status. And, you know, the financial narrative is that you're going to become effortlessly rich; that's very pleasing. The religious narrative is that everybody is going to go to hell except for you and people who believe like you, and that's very pleasing: it gives you and your in-group status.

Ritholtz: So, a very related quote from the book, quote, "Immersion in narrative brings about isolation from the facts of the real world," end quote. How significant is that separation from real-world facts, both in finance and in religion? And what happens when people are confronted with the truth of their disconnect from real-world facts?

Bernstein: Well, in finance, that's one of the ways that you identify a mania or a bubble; it's one of the characteristics. When you present the disconfirming opinion to someone who is in the grips of a financial mania, this delusion that they're going to become effortlessly rich, they push back and they get very angry. And I don't know if this happened to you, whatever you were doing, you know, back in the late nineties, but I was still in middle age back then, and when I would express skepticism about the Internet bubble, people didn't just disagree with me; they got angry at me. They told me I just didn't get it. They told me I was an idiot, and one or two people cast aspersions on my parentage. They don't like that at all. And that was an experience that a lot of people had. And it's the same thing, obviously, with religions; it's why you don't ever argue with people about religion. You don't want to even think about disconfirming their religious beliefs.
Ritholtz: So I had a very similar experience leading up to the '08-'09 financial crisis. I was bearish on stocks, real estate, derivatives, and in the beginning, people just laughed. There was just complete and total, you know, "you're out of your mind." It was only a little later that it became anger. And then, once it turned out I got lucky, nobody wanted to talk about it, and everybody seemed to be convinced that they saw it coming also.

Bernstein: Yeah, my favorite example of that was, you know, sometime around '06 or '07, I was listening to Bob Shiller being interviewed by somebody about the real estate bubble that was still evolving at that point, and he was being interviewed alongside someone from, you know, the National Association of Realtors. And she wasn't having any part of it. And finally she got so frustrated with Professor Shiller that she stopped and said, "Professor Shiller, I don't know who you think you are, but you don't know the first thing about real estate."

Ritholtz: That's hilarious, Nobel Prize and all. So here's another quote from the book that I like: "Religious manias tend to play out in the worst of times, during which mankind desires deliverance from its troubles and a return to the," quote unquote, "good old days." So what is it about the good old days that seems so compelling to people?

Bernstein: I don't know the answer to that, but it is a near-constant feature of human nature to believe that things were always better a generation or two or three generations ago, when that's manifestly not the case. And you can see it all the way back to the dawn of literacy. I mean, one of the first archaic Greek poets was a fellow named Hesiod, who is now well known for a collection that he wrote called Works and Days.
And he talks about how, four or five generations earlier, there was a generation of golden men who lived wonderful lives and were virtuous and prosperous, and then, with each successive generation, things went to hell in a handbasket, until, you know, they arrived at the current generation, when no one respected their elders and the world was corrupt and things were generally awful. It was, you know, the origin of "kids these days."

Ritholtz: Two areas where people tend to go mad are religion and money. And some people have talked about some current assets that have run up as combining the two. I think you could look at bitcoin that way; you could look at Tesla that way. What are your thoughts on those two particular assets?

Bernstein: Well, I think bitcoin certainly exhibits all the behavior of a bubble. You see large groups of people who think that they're going to become effortlessly rich investing in bitcoin. You see people quitting their jobs to trade it. You see anger directed at you when you express skepticism. My favorite bit of anger was John McAfee's famous statement that he would perform an act on himself that requires great spinal flexibility on national TV if it didn't hit half a million dollars by a date that's finally already passed.

Ritholtz: Right, and he lost that bet.

Bernstein: Yeah, yeah. So you see that, certainly, with bitcoin. Now, Tesla is a more difficult case. I mean, Tesla may actually wind up being a very successful company; whether it will justify its valuation is another story. There are other people in the world who know how to make electric vehicles besides Elon Musk, and so it's not immediately clear what's going on. I don't see the kind of mania surrounding Tesla stock. What I see the mania surrounding is the Tesla car, which is a different thing.
The real passion that I see is not that people love their Tesla stock; it's the evangelism of people who love the Tesla cars.

Ritholtz: I'm going to take it a step further than that. I just had the Mustang Mach-E for a week to play with, and it's not even the cars; it's the software within the car. Whenever I've discussed the Mustang with other Tesla owners, the compare and contrast is not the build quality or the design, or some of the things that Ford does really well. It's the network of Superchargers, it's the over-the-air automatic updates, it's the self-driving. It's all these software things that Tesla has created beyond the actual vehicle itself. It's kind of fascinating.

Bernstein: Yeah, well, that's because you and I are older than dirt, and we're, you know, of a generation when cars were really, really important. To young people, their phones and the capability of their phones are much more important than the capability of their cars.

Ritholtz: So, to my generation, a car meant freedom. But the current generation, I don't think, feels the same way about escaping a small town as Bruce Springsteen did in Born to Run.

Bernstein: It's certainly a generational difference.

Ritholtz: So, what's interesting to me, and there are a lot of things interesting to me about this book, is that you discussed in the introduction how Charles Mackay was really the influence and the inspiration for this. What motivated you to want to update the original work by Charles Mackay about the madness of crowds?

Bernstein: Well, Mackay was a journalist who in 1841 wrote this book, Memoirs of Extraordinary Popular Delusions and the Madness of Crowds, and as you and I both well know, it's a seminal book in finance.
It's one of those books that you tell every practitioner at some point to read, because it described the two great bubbles of the eighteenth century, and also the supposed bubble that surrounded tulip bulbs in the seventeenth century, which turns out not to have been as society-wide a phenomenon as Mackay made it out to be. But it's the most famous chapter in the book, because he coins the word tulipomania and gives it to the English language. And so for generations after that, the book has remained in print and has saved people's bacon. Bernard Baruch famously read the book right before the crash of 1907 and recognized the signs of the times, and he loved the book so much that he actually wrote the introduction to the 1932 edition. And of course, I read the book in the mid-1990s, just before the tech bubble started to really get going, and I thought it was vaguely interesting, sort of like a bad B movie about the Roman Empire; it wasn't really relevant to the financial markets at the time, which were pretty well behaved in retrospect. And then, all of a sudden, before my very eyes, I saw this bubble blow up, and, you know, it saved my bacon the way it did probably most other people who had read the book. So that sort of stayed in my memory banks. And then, you know, five or six years ago, I observed, the way everybody else did, the ability of the Islamic State to attract tens of thousands of people from around the world, including from prosperous and secure Western nations, to go to one of the most dangerous and worst places in the world and, in many cases, to die or to become seriously injured.
And the way they did it, it turns out, was by deploying this End Times narrative, which Mackay had also written about in his book. And I realized at that point I had to update the book, because, you know, we now know a lot more about the neuropsychology of why these things occur, which Mackay couldn't know, because of the state of science at the time.

Ritholtz: So I love this quote from Charles Mackay: "Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one." Why is that?

Bernstein: Well, it's because disconfirmation is a solitary process. It's a process where you have to look inside yourself and say, my God, I was an idiot. And it's a very individual process, whereas the spreading of the delusion, the spreading of the virus, of the contagion, is a social process.

Ritholtz: So, you mentioned social networks earlier, whether it's Facebook or Twitter or TikTok or what have you. They seem to be accelerating both the creation of these memes and the way they're propagated. Do you think they're a key factor in some of these episodes: how quickly they seem to come out of nowhere, how big they blow up, and then what happens to them after the fact?

Bernstein: Yes, I think that's absolutely true. A mania can spread more rapidly and more widely today than it ever could before. The thing that's really interesting is that, you know, with each iteration, with each advance in communications technology, you have the potential for accelerating delusions. For example, it's no accident that the religious manias I wrote about in the book were sixteenth-century episodes that followed hard on the heels of the invention of the printing press. That wasn't an accident, all right, that we started to see, you know, several religious mass delusions in continental Europe starting around the year 1500 or so.
The really interesting thing is why we didn't see all that many manias with radio and television. And the answer was, they had built-in filters, all right? You know, serious journalistic ethics: Walter Cronkite and, you know, Edward R. Murrow tended not to lie to people. Although Adolf Hitler did, and he certainly spread a mass delusion, and he did it primarily with radio.

Ritholtz: That's interesting. You know, if you look at the book Pop! Why Bubbles Are Great for the Economy, Daniel Gross goes over how, every time a new technology comes out, if not a full-blown bubble, then just a massive land rush of over-investment and misdeployment of capital accompanied everything from railroads to radio, television, automobiles, even fiber optics. You end up with this giant investment because of a fear of missing the next big thing, and it usually ends up with a handful of winners, but most of the rest of the companies turn out to be worthless. But it doesn't necessarily become a full-blown bubble across all of society.

Bernstein: Yeah, that is something that's almost a constant of financial history: there are technologies that really change the world. I mean, you know, the Internet really did change everything. It's just that it didn't make the average investor who invested in the tech companies of the late nineties wealthy. The paradigm of that is something I write about in the book, which is Global Crossing, Gary Winnick's firm, which was, you know, a fiasco from a financial point of view. But the man did build, you know, a large percentage of the world's fiber-optic capacity. And he didn't foresee two things, at least one of which he should have foreseen, which was that, you know, there would be competing lines laid that would cut into his profits.
But the other thing, which he didn't perceive, 566 00:36:23,840 --> 00:36:27,120 Speaker 1: and I don't think anyone really foresaw, was that the 567 00:36:27,200 --> 00:36:30,360 Speaker 1: improvement in dry plant, that is, the relays and the 568 00:36:30,400 --> 00:36:35,400 Speaker 1: transmitters and the receivers along the fiber optic chains, would 569 00:36:35,440 --> 00:36:39,000 Speaker 1: improve so dramatically that the capacity of a 570 00:36:39,040 --> 00:36:44,640 Speaker 1: fiber to carry data, without changing the fiber 571 00:36:44,680 --> 00:36:48,040 Speaker 1: at all, would improve by orders of magnitude without laying 572 00:36:48,080 --> 00:36:50,759 Speaker 1: any new fibers. So between, I don't know, about two 573 00:36:50,840 --> 00:36:54,200 Speaker 1: thousand and two and two thousand and fifteen, there was 574 00:36:54,200 --> 00:36:58,280 Speaker 1: almost no new fiber laid, and yet the carrying 575 00:36:58,360 --> 00:37:01,840 Speaker 1: capacity of the fiber went up by about a thousandfold. 576 00:37:02,360 --> 00:37:08,400 Speaker 1: That's amazing. Arguably, Global Crossing's bandwidth capacity is responsible for 577 00:37:09,239 --> 00:37:13,360 Speaker 1: everything from YouTube to Facebook, which could not have existed 578 00:37:13,960 --> 00:37:18,120 Speaker 1: in the pre-fat-pipe days. Yeah, the concept 579 00:37:18,120 --> 00:37:21,440 Speaker 1: that I talked about in the book is that technology 580 00:37:21,480 --> 00:37:27,320 Speaker 1: investors tend to be capitalism's philanthropists. Okay, they tend to 581 00:37:27,560 --> 00:37:32,479 Speaker 1: put money into vitally needed infrastructure that benefits society at large, 582 00:37:32,520 --> 00:37:34,840 Speaker 1: that benefits all of us, whether it's the railroads or 583 00:37:34,920 --> 00:37:40,759 Speaker 1: radio or television or aircraft manufacturing. But they don't make 584 00:37:40,800 --> 00:37:43,319 Speaker 1: any money in the process. They'd probably be better off, 585 00:37:43,320 --> 00:37:46,120 Speaker 1: as the old joke goes, throwing half the money 586 00:37:46,120 --> 00:37:49,440 Speaker 1: out the window and burning the other half. So I 587 00:37:49,480 --> 00:37:52,280 Speaker 1: want to stick with some of the eighteenth-century 588 00:37:52,320 --> 00:37:55,440 Speaker 1: bubbles you write about, the South Sea bubble and the 589 00:37:55,480 --> 00:38:00,320 Speaker 1: Mississippi Company bubble in Europe. One of the data points 590 00:38:00,360 --> 00:38:07,480 Speaker 1: just stunned me: the share of eighteenth-century European stock issuance that occurred 591 00:38:07,560 --> 00:38:11,480 Speaker 1: in a single year, seventeen twenty. How on earth is 592 00:38:11,520 --> 00:38:16,480 Speaker 1: that even possible? Well, that was an 593 00:38:16,480 --> 00:38:19,560 Speaker 1: extreme example. I don't think we've seen anything quite like 594 00:38:19,640 --> 00:38:24,279 Speaker 1: it ever since. But certainly, you know, the 595 00:38:24,320 --> 00:38:28,320 Speaker 1: amount of IPO issuance of new shares 596 00:38:28,440 --> 00:38:33,000 Speaker 1: and of new company births in the late nineties 597 00:38:33,000 --> 00:38:35,960 Speaker 1: was a very similar phenomenon and left us with some, 598 00:38:36,280 --> 00:38:38,640 Speaker 1: you know, companies that are very important today, prime among 599 00:38:38,640 --> 00:38:42,480 Speaker 1: which is Amazon. No pun intended, to say the least.
600 00:38:42,800 --> 00:38:45,640 Speaker 1: I also want to stick with the parallel you draw 601 00:38:45,960 --> 00:38:49,920 Speaker 1: between the South Sea Company and real estate. You mentioned 602 00:38:50,040 --> 00:38:54,359 Speaker 1: the South Sea Company was worth twice the value of all 603 00:38:54,400 --> 00:38:57,640 Speaker 1: of the lands in England, and we saw an echo 604 00:38:57,719 --> 00:39:01,320 Speaker 1: of that in the nineteen eighties thanks to the Tokyo 605 00:39:01,400 --> 00:39:07,080 Speaker 1: real estate bubble, when the Imperial Palace was worth more 606 00:39:07,120 --> 00:39:10,200 Speaker 1: than all of the land in California. What is it 607 00:39:10,360 --> 00:39:18,759 Speaker 1: about real estate that encourages this sort of interesting, bubblicious behavior? Well, 608 00:39:19,160 --> 00:39:21,200 Speaker 1: you have to go back a step and ask, what 609 00:39:21,280 --> 00:39:25,880 Speaker 1: are the factors that underlie bubbles? And the biggest 610 00:39:25,880 --> 00:39:28,960 Speaker 1: factor of all, once you get past the new technology, is, 611 00:39:29,000 --> 00:39:33,080 Speaker 1: of course, the availability of easy credit. So invariably you 612 00:39:33,120 --> 00:39:40,640 Speaker 1: see bubbles in times of gradually falling interest rates, 613 00:39:40,680 --> 00:39:45,880 Speaker 1: which certainly has some relevance today. And what happens 614 00:39:46,000 --> 00:39:49,520 Speaker 1: is the classic real estate bubble cycle, which is, 615 00:39:49,560 --> 00:39:52,600 Speaker 1: you buy a piece of real estate, it appreciates 616 00:39:52,600 --> 00:39:55,680 Speaker 1: in value, which enables you to borrow more when you 617 00:39:55,760 --> 00:39:59,000 Speaker 1: collateralize it. So you take that capital and you 618 00:39:59,040 --> 00:40:02,720 Speaker 1: buy more real estate. The prices continue to rise, 619 00:40:03,560 --> 00:40:05,800 Speaker 1: which enables you to borrow more and more. It becomes 620 00:40:05,840 --> 00:40:09,439 Speaker 1: a self-inflating cycle until it finally blows up. That's 621 00:40:09,480 --> 00:40:12,520 Speaker 1: my favorite scene in the movie version of The Big Short, 622 00:40:13,280 --> 00:40:17,040 Speaker 1: where Steve Carell is talking to, I don't remember if it 623 00:40:17,120 --> 00:40:20,320 Speaker 1: was a waitress or a stripper, who owned a house 624 00:40:20,840 --> 00:40:24,600 Speaker 1: and was talking about how she financed it, and 625 00:40:25,040 --> 00:40:27,040 Speaker 1: he asked her a question and her answer was "houses," 626 00:40:27,120 --> 00:40:29,200 Speaker 1: plural. He's like, you have more than one? She says, 627 00:40:29,239 --> 00:40:33,120 Speaker 1: I have five. And it's just that classic moment of, oh, 628 00:40:33,160 --> 00:40:36,240 Speaker 1: that's what people do with free money and lots of leverage. 629 00:40:36,560 --> 00:40:39,399 Speaker 1: They go out and try and get 630 00:40:39,480 --> 00:40:42,959 Speaker 1: rich as quickly as they can. Yeah.
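A minimal sketch of that self-inflating collateral cycle, in Python, with purely hypothetical numbers: ten percent annual appreciation, an eighty percent loan-to-value limit, and a twenty percent final price drop. It illustrates only the mechanism described above, not anything from the book; each year's paper gains are refinanced and used as the down payment on still more property, so equity compounds far faster than prices until a modest decline wipes it out.

# Toy model of the self-inflating collateral cycle: each year's price
# appreciation is refinanced and used as a down payment on more property.
# All parameters are hypothetical, chosen only to illustrate the mechanism.
def collateral_cycle(initial_equity=100_000.0, ltv=0.80, growth=0.10, years=6):
    value = initial_equity / (1 - ltv)   # buy as much as the equity supports
    debt = value - initial_equity
    for year in range(1, years + 1):
        value *= 1 + growth              # prices rise
        cash_out = ltv * value - debt    # refinance up to the LTV cap
        debt = ltv * value
        purchase = cash_out / (1 - ltv)  # cash-out is the new down payment
        value += purchase
        debt += purchase - cash_out      # the rest of the purchase is a loan
        print(f"year {year}: property {value:,.0f}, debt {debt:,.0f}, "
              f"equity {value - debt:,.0f}")
    # At an 80% LTV, a single 20% price drop erases the equity entirely.
    print(f"after a 20% drop: equity {0.80 * value - debt:,.0f}")

collateral_cycle()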
My favorite story, 631 00:40:42,960 --> 00:40:44,279 Speaker 1: and I think it may have come from one of 632 00:40:44,320 --> 00:40:49,160 Speaker 1: your episodes, was when Dick Thaler did that scene 633 00:40:49,200 --> 00:40:53,080 Speaker 1: with, was it Jennifer Lopez? And it turned 634 00:40:53,400 --> 00:40:55,600 Speaker 1: out that Jennifer Lopez didn't know who 635 00:40:55,600 --> 00:40:57,760 Speaker 1: Dick Thaler was, and Dick Thaler didn't know who Jennifer 636 00:40:57,760 --> 00:41:01,160 Speaker 1: Lopez was. Was it J.Lo? Was it Selena Gomez? 637 00:41:01,200 --> 00:41:04,560 Speaker 1: I don't remember, you know, I honestly don't know. Tells 638 00:41:04,600 --> 00:41:06,840 Speaker 1: you what generation I belong to. But the actress didn't 639 00:41:06,880 --> 00:41:10,719 Speaker 1: know who Dick Thaler was, and vice versa. I 640 00:41:10,760 --> 00:41:14,400 Speaker 1: thought that was hilarious. To be fair, this was 641 00:41:14,480 --> 00:41:17,920 Speaker 1: before he won his Nobel Prize. So, you know, 642 00:41:18,040 --> 00:41:21,319 Speaker 1: maybe to her he was just 643 00:41:21,360 --> 00:41:27,520 Speaker 1: an obscure economics professor toiling in obscurity in Chicago. There 644 00:41:27,560 --> 00:41:32,160 Speaker 1: are a couple of really just fascinating digressions within the book. 645 00:41:33,000 --> 00:41:36,640 Speaker 1: I love this reference to some of Barry Eichen- 646 00:41:36,680 --> 00:41:40,440 Speaker 1: green's work. He's an authority on the gold standard, and 647 00:41:40,600 --> 00:41:45,279 Speaker 1: his observation was that nations recovered from the Great Depression 648 00:41:45,840 --> 00:41:49,880 Speaker 1: in the precise order they abandoned hard money and a 649 00:41:49,960 --> 00:41:54,440 Speaker 1: currency backed by gold. I have never seen that precise quote. 650 00:41:55,080 --> 00:42:01,880 Speaker 1: How significant was hard money and gold to the basic 651 00:42:02,040 --> 00:42:06,640 Speaker 1: narrative and the delusion that gold was somehow special? Well, 652 00:42:07,200 --> 00:42:10,840 Speaker 1: in the first place, we get that in the 653 00:42:10,880 --> 00:42:15,200 Speaker 1: Paris bubble, the Mississippi Company bubble. And 654 00:42:15,560 --> 00:42:19,000 Speaker 1: John Law's great innovation was he realized that hard money 655 00:42:20,480 --> 00:42:25,759 Speaker 1: was a drag on any country's economy and financial development. 656 00:42:25,800 --> 00:42:30,480 Speaker 1: And so he basically introduced paper currency to France, 657 00:42:30,520 --> 00:42:35,480 Speaker 1: and unfortunately he went overboard. But he's probably, 658 00:42:35,600 --> 00:42:38,880 Speaker 1: you know, I think he's arguably the father 659 00:42:39,200 --> 00:42:43,719 Speaker 1: of modern central banking and modern finance. He invented the 660 00:42:43,760 --> 00:42:46,120 Speaker 1: system that we have today. It's just that, you know, 661 00:42:46,440 --> 00:42:50,560 Speaker 1: he and the Duc d'Orleans, they blew 662 00:42:50,560 --> 00:42:55,080 Speaker 1: it up. So Eichengreen, of course, 663 00:42:55,120 --> 00:42:57,480 Speaker 1: you know, did the seminal research in this area. You're 664 00:42:57,480 --> 00:43:00,840 Speaker 1: talking about a book called Golden Fetters and the 665 00:43:00,880 --> 00:43:06,000 Speaker 1: research behind it.
Whenever you hear mainstream economists, 666 00:43:06,040 --> 00:43:09,520 Speaker 1: you know, gag when you mention the gold standard, 667 00:43:09,560 --> 00:43:13,000 Speaker 1: it's Barry Eichengreen they're channeling, and his work. 668 00:43:13,440 --> 00:43:16,240 Speaker 1: My favorite trope of the gold bugs is that gold 669 00:43:16,239 --> 00:43:19,439 Speaker 1: has been money for five thousand years. It certainly has 670 00:43:19,520 --> 00:43:23,319 Speaker 1: not been money for five thousand years. Gold, you know, 671 00:43:23,400 --> 00:43:26,480 Speaker 1: didn't start to be used as money until the 672 00:43:26,480 --> 00:43:31,200 Speaker 1: Hellenistic period in Greece, which was not much 673 00:43:31,200 --> 00:43:34,680 Speaker 1: more than two thousand years ago. And before that, lots 674 00:43:34,800 --> 00:43:36,680 Speaker 1: of things were money. You know, a liter of grain 675 00:43:36,800 --> 00:43:40,000 Speaker 1: was money, a head of cattle was money. If you 676 00:43:40,040 --> 00:43:43,080 Speaker 1: wanted something that looked more like hard money back in 677 00:43:43,120 --> 00:43:46,120 Speaker 1: the ancient world, silver was the real hard money 678 00:43:46,160 --> 00:43:48,719 Speaker 1: in that era, it wasn't gold. Yeah, it's going to 679 00:43:48,760 --> 00:43:51,759 Speaker 1: be curious to see what happens to the gold bugs when 680 00:43:51,760 --> 00:43:55,640 Speaker 1: we lasso some asteroid with, you know, a 681 00:43:55,719 --> 00:43:59,200 Speaker 1: billion metric tons of gold and platinum in it. It 682 00:43:59,480 --> 00:44:03,200 Speaker 1: might change the belief in hard money. There's another 683 00:44:03,520 --> 00:44:07,200 Speaker 1: issue we really didn't get to, which is the overwhelming 684 00:44:07,200 --> 00:44:11,440 Speaker 1: proclivity of human beings to imitate the behavior of 685 00:44:11,480 --> 00:44:15,320 Speaker 1: those around them, regardless of how baseless or self-destructive 686 00:44:15,360 --> 00:44:21,520 Speaker 1: that behavior may be. What's behind our propensity for imitation? Well, 687 00:44:21,560 --> 00:44:24,000 Speaker 1: in the first place, we really haven't talked about this. 688 00:44:24,239 --> 00:44:28,600 Speaker 1: When you ask what is the primary psychological mechanism 689 00:44:28,760 --> 00:44:33,800 Speaker 1: of manias, it's our proclivity to imitate, from something 690 00:44:33,840 --> 00:44:36,759 Speaker 1: as basic as just yawning. It's why yawning is infectious. 691 00:44:37,080 --> 00:44:39,640 Speaker 1: And the infectiousness of yawning is something that's actually been 692 00:44:39,680 --> 00:44:42,279 Speaker 1: studied in some detail, and I won't get into how 693 00:44:42,320 --> 00:44:45,719 Speaker 1: interesting it is, but trust me, yawning is very, 694 00:44:45,800 --> 00:44:50,560 Speaker 1: very interesting.
But what is behind that is something that's 695 00:44:50,640 --> 00:44:54,719 Speaker 1: very simple, and the paradigm to think of is 696 00:44:55,000 --> 00:44:59,000 Speaker 1: the human occupation of the New World, North and South America, 697 00:44:59,040 --> 00:45:03,560 Speaker 1: which began somewhere around fifteen to twenty thousand years ago, 698 00:45:03,920 --> 00:45:08,200 Speaker 1: and within the space of several thousand years, humankind 699 00:45:08,400 --> 00:45:13,000 Speaker 1: spread from the frozen Arctic wastes down through the North 700 00:45:13,040 --> 00:45:16,719 Speaker 1: American continent and into South America and the Andes and 701 00:45:16,800 --> 00:45:19,440 Speaker 1: all the way down to Tierra del Fuego. And 702 00:45:19,480 --> 00:45:22,160 Speaker 1: along the way human beings had to learn how to 703 00:45:22,200 --> 00:45:27,040 Speaker 1: make kayaks and how to hunt bison and how 704 00:45:27,080 --> 00:45:30,360 Speaker 1: to make poison blow darts, and all of those endeavors 705 00:45:30,400 --> 00:45:33,200 Speaker 1: are very complex, they're very hard to learn, and no 706 00:45:33,400 --> 00:45:35,880 Speaker 1: one person is going to figure out how to do 707 00:45:35,960 --> 00:45:39,239 Speaker 1: it well on their own. And so the way that 708 00:45:39,800 --> 00:45:42,200 Speaker 1: you will survive, and the way your tribe will survive, 709 00:45:42,560 --> 00:45:45,799 Speaker 1: is if you're very good at imitating. You know, if 710 00:45:45,920 --> 00:45:48,000 Speaker 1: Joe and his friends figure out how to build a 711 00:45:48,080 --> 00:45:52,440 Speaker 1: kayak from scratch, then rather than figure it out yourself, 712 00:45:52,480 --> 00:45:55,760 Speaker 1: you're going to imitate that, all right. So the ability 713 00:45:55,800 --> 00:45:59,880 Speaker 1: to imitate carries enormous survival value not only 714 00:46:00,000 --> 00:46:05,880 Speaker 1: for individuals, but for entire societies as well. And unfortunately, 715 00:46:06,120 --> 00:46:11,120 Speaker 1: it's a good deal more dysfunctional in 716 00:46:11,400 --> 00:46:14,480 Speaker 1: the modern world, because that ability to imitate carries with 717 00:46:14,560 --> 00:46:18,880 Speaker 1: it the proclivity to manias, and the proclivity to manias 718 00:46:19,080 --> 00:46:22,600 Speaker 1: may not be that dangerous in a 719 00:46:22,640 --> 00:46:25,280 Speaker 1: state of nature, but in a world of social media, 720 00:46:25,320 --> 00:46:30,040 Speaker 1: it's deadly. Interesting. So, you know, we briefly touched 721 00:46:30,080 --> 00:46:34,839 Speaker 1: on some of the evolutionary adaptations that work so 722 00:46:34,920 --> 00:46:39,239 Speaker 1: well on the savanna but hurt us in modern times. 723 00:46:40,040 --> 00:46:44,680 Speaker 1: What is some of the anthropology behind this? Why are 724 00:46:44,760 --> 00:46:49,920 Speaker 1: we simply status-seeking, narrative-believing imitators? Give us a 725 00:46:50,000 --> 00:46:53,680 Speaker 1: little more background about some of the academic work 726 00:46:53,680 --> 00:46:59,759 Speaker 1: that's been done in this space. Well, the narrative proclivity 727 00:47:00,120 --> 00:47:03,840 Speaker 1: is something that, when you think about it,
is fairly obvious, 728 00:47:03,920 --> 00:47:07,560 Speaker 1: which is that when you and your colleagues fifteen 729 00:47:07,560 --> 00:47:11,240 Speaker 1: thousand years ago went out to hunt woolly mammoth, 730 00:47:11,320 --> 00:47:14,680 Speaker 1: you didn't issue mathematical vectors to each other. That's not 731 00:47:14,719 --> 00:47:17,200 Speaker 1: the way the human mind works. We use 732 00:47:17,239 --> 00:47:21,799 Speaker 1: our left hemisphere's ability to communicate with words. So, 733 00:47:21,920 --> 00:47:24,399 Speaker 1: you know, Joe, you go left, Fred, you go right, 734 00:47:24,960 --> 00:47:29,399 Speaker 1: and you'll both spear the mastodon from both sides. That's 735 00:47:29,520 --> 00:47:32,600 Speaker 1: how we communicate with each other. And that's 736 00:47:32,640 --> 00:47:35,480 Speaker 1: how we transmit our values. We don't do it mathematically 737 00:47:35,560 --> 00:47:39,520 Speaker 1: or with analytical rigor. And the example of that 738 00:47:39,680 --> 00:47:43,000 Speaker 1: that I use in the book was very early in 739 00:47:43,040 --> 00:47:47,080 Speaker 1: the Republican nominating process in late two thousand 740 00:47:47,160 --> 00:47:50,800 Speaker 1: and fifteen, when, you know, the Republicans had these enormous 741 00:47:51,040 --> 00:47:54,960 Speaker 1: primary debates and, you know, no 742 00:47:55,080 --> 00:48:00,200 Speaker 1: one took Donald Trump seriously. Someone asked Ben Carson, who, 743 00:48:00,239 --> 00:48:04,200 Speaker 1: whatever you think of him, was a very famous neurosurgeon. 744 00:48:04,239 --> 00:48:06,040 Speaker 1: And they asked him what he thought of vaccination, and 745 00:48:06,080 --> 00:48:09,200 Speaker 1: he said the science in back of it was pretty 746 00:48:09,200 --> 00:48:12,400 Speaker 1: firm and, you know, gave the typical Republican answer, which was, 747 00:48:12,440 --> 00:48:13,799 Speaker 1: but, you know, no one's going to force you to 748 00:48:13,800 --> 00:48:15,840 Speaker 1: take it, you know, I don't want anybody to be forced 749 00:48:15,840 --> 00:48:19,839 Speaker 1: to take a vaccination. And Donald Trump interrupted him 750 00:48:20,360 --> 00:48:22,600 Speaker 1: and said, I had an employee who had a 751 00:48:22,680 --> 00:48:26,200 Speaker 1: daughter who got vaccinated, a beautiful child. This was a 752 00:48:26,200 --> 00:48:30,120 Speaker 1: beautiful child, and she developed autism. It's an epidemic, I 753 00:48:30,200 --> 00:48:33,560 Speaker 1: tell you, it's an epidemic. And he knew exactly what 754 00:48:33,600 --> 00:48:36,239 Speaker 1: he was doing. He knew that, you know, Ben 755 00:48:36,280 --> 00:48:41,200 Speaker 1: Carson's data were nothing compared to his narratives. All right, 756 00:48:41,680 --> 00:48:44,040 Speaker 1: so that's, you know, that's an example of 757 00:48:44,080 --> 00:48:49,279 Speaker 1: the importance of narratives. Now, status seeking, again, 758 00:48:49,600 --> 00:48:55,160 Speaker 1: is something that has to do with sexual selection. 759 00:48:55,200 --> 00:48:58,200 Speaker 1: If you're a female and you're looking for a mate, 760 00:48:58,600 --> 00:49:01,440 Speaker 1: your major concern is passing on your DNA.
761 00:49:01,560 --> 00:49:04,480 Speaker 1: And the way your DNA gets passed on is if 762 00:49:04,560 --> 00:49:07,759 Speaker 1: you and your progeny are well provided for, all right, 763 00:49:08,200 --> 00:49:11,279 Speaker 1: and your progeny will be well provided for if 764 00:49:11,400 --> 00:49:14,840 Speaker 1: your mate, if the father of the children, 765 00:49:15,280 --> 00:49:18,480 Speaker 1: is a macher, you know, someone who can bring home the 766 00:49:18,600 --> 00:49:21,480 Speaker 1: meat and command other people. So you're not 767 00:49:21,480 --> 00:49:24,640 Speaker 1: going to take a weakling, because if you take a weakling, 768 00:49:24,880 --> 00:49:27,560 Speaker 1: then that man will not provide for 769 00:49:27,600 --> 00:49:30,200 Speaker 1: you and your children, and you are less likely to 770 00:49:30,719 --> 00:49:34,759 Speaker 1: pass your genes on. And we see it today. 771 00:49:34,840 --> 00:49:37,360 Speaker 1: You know, in society today you're a lot more likely 772 00:49:37,400 --> 00:49:39,359 Speaker 1: to pass your genes on if you're a rock star 773 00:49:39,920 --> 00:49:44,359 Speaker 1: or an athlete than, you know, if you're someone who didn't 774 00:49:44,400 --> 00:49:48,360 Speaker 1: graduate high school and can't jump. Interesting. So let's go 775 00:49:48,560 --> 00:49:52,319 Speaker 1: to our favorite questions that we ask all of our guests, 776 00:49:52,400 --> 00:49:55,440 Speaker 1: starting with, what are you streaming these days? Give us 777 00:49:55,480 --> 00:50:00,319 Speaker 1: your favorite Netflix, podcast, Amazon Prime. What's keeping you 778 00:50:00,440 --> 00:50:04,920 Speaker 1: entertained during the lockdown? Well, I'm afraid I'm not much 779 00:50:05,040 --> 00:50:09,080 Speaker 1: on popular culture. The series that we're into 780 00:50:09,200 --> 00:50:13,640 Speaker 1: right now is Greenleaf, which is about an 781 00:50:13,640 --> 00:50:18,120 Speaker 1: African American evangelical church, and it's just a 782 00:50:18,120 --> 00:50:21,200 Speaker 1: great soap opera. The acting is spectacular, the 783 00:50:21,280 --> 00:50:26,080 Speaker 1: music is fantastic, the cinematography is, you know, 784 00:50:26,239 --> 00:50:29,160 Speaker 1: Emmy class. I mean, it's just an absolutely spectacular 785 00:50:29,200 --> 00:50:32,319 Speaker 1: series, and best of all, it has sixty-five episodes, so 786 00:50:32,520 --> 00:50:36,680 Speaker 1: it'll keep you occupied for a while. And, you know, 787 00:50:36,719 --> 00:50:39,439 Speaker 1: like everybody else of my ilk, I'm waiting for the next 788 00:50:39,440 --> 00:50:45,120 Speaker 1: season of Succession and Better Call Saul and Ozark. 789 00:50:45,160 --> 00:50:50,399 Speaker 1: As far as podcasts go, I'm afraid that most 790 00:50:50,440 --> 00:50:53,680 Speaker 1: of my reading bandwidth and my podcast bandwidth gets taken 791 00:50:53,719 --> 00:50:56,920 Speaker 1: up by The Economist, and it's almost overkill to do 792 00:50:57,040 --> 00:50:59,560 Speaker 1: both of those with The Economist. But the thing you 793 00:50:59,600 --> 00:51:02,680 Speaker 1: get from The Economist's podcasts that you don't get from reading 794 00:51:02,680 --> 00:51:05,560 Speaker 1: the magazine is the flavor of just how smart and 795 00:51:05,600 --> 00:51:07,920 Speaker 1: funny these people are. And you get to put names 796 00:51:07,960 --> 00:51:10,920 Speaker 1: on them too.
Of course, you know, 797 00:51:10,920 --> 00:51:14,719 Speaker 1: The Economist doesn't have bylines, so you actually get to 798 00:51:14,800 --> 00:51:17,960 Speaker 1: listen to the people who, excuse me, 799 00:51:18,000 --> 00:51:19,960 Speaker 1: you get to listen to the people who are writing the articles, 800 00:51:19,960 --> 00:51:23,920 Speaker 1: and that is marvelously entertaining. I love Slate Money, I 801 00:51:23,960 --> 00:51:28,719 Speaker 1: love your shows. And of course On the 802 00:51:28,800 --> 00:51:32,640 Speaker 1: Media is another series that's very, very smart. I also 803 00:51:32,760 --> 00:51:36,319 Speaker 1: like The New Yorker's, just because I simply can't 804 00:51:36,320 --> 00:51:41,239 Speaker 1: get enough of David Remnick. He's certainly amusing. Let's 805 00:51:41,280 --> 00:51:44,840 Speaker 1: talk about some mentors. Who are the people who influenced 806 00:51:45,320 --> 00:51:49,839 Speaker 1: your thought process, your philosophy, your career? Well, first of all, 807 00:51:49,920 --> 00:51:52,880 Speaker 1: career-wise, I have to give credit to two people 808 00:51:53,320 --> 00:51:57,600 Speaker 1: who most people haven't heard of, one of whom is 809 00:51:57,960 --> 00:52:02,560 Speaker 1: Scott Burns, who wrote finance for the Dallas Morning 810 00:52:02,760 --> 00:52:06,520 Speaker 1: News. He's not your typical personal finance writer. 811 00:52:06,640 --> 00:52:11,319 Speaker 1: He went to MIT, and then 812 00:52:11,400 --> 00:52:14,000 Speaker 1: he would go over to Harvard to take writing lessons 813 00:52:14,000 --> 00:52:17,560 Speaker 1: from Archibald MacLeish. And he was the first person 814 00:52:17,640 --> 00:52:22,160 Speaker 1: who basically encouraged me and told me that very 815 00:52:22,200 --> 00:52:24,400 Speaker 1: few people had the ability both to do the math 816 00:52:24,400 --> 00:52:26,799 Speaker 1: and write well. And if I had that ability, and 817 00:52:26,840 --> 00:52:29,880 Speaker 1: he thought I did, I should be encouraged to do it. 818 00:52:29,920 --> 00:52:32,200 Speaker 1: And I was, you know, very discouraged at that point 819 00:52:32,200 --> 00:52:34,440 Speaker 1: in my career. So he gave me the impetus. And 820 00:52:34,480 --> 00:52:37,440 Speaker 1: the other person is someone who's even more obscure, a financial advisor 821 00:52:37,440 --> 00:52:40,160 Speaker 1: named Frank Armstrong, who was the first person 822 00:52:40,200 --> 00:52:42,800 Speaker 1: I'm aware of to put a finance book on the web, 823 00:52:43,160 --> 00:52:45,759 Speaker 1: Investing for the Twentieth Century, or Investing for the Twenty- 824 00:52:45,800 --> 00:52:48,160 Speaker 1: First Century; I think he had two different editions. And 825 00:52:48,200 --> 00:52:49,920 Speaker 1: he was the one who told me to put my 826 00:52:50,040 --> 00:52:54,160 Speaker 1: book, The Intelligent Asset Allocator, on the web, which jump- 827 00:52:54,200 --> 00:52:57,440 Speaker 1: started my writing career. Now, as far as investing goes, 828 00:52:57,640 --> 00:53:00,239 Speaker 1: you know, it's the usual suspects. It's going 829 00:53:00,320 --> 00:53:02,560 Speaker 1: to be Jack Bogle,
whom I did have some personal 830 00:53:02,600 --> 00:53:05,560 Speaker 1: contact with, and of course Fama and French, whom I had 831 00:53:05,640 --> 00:53:09,960 Speaker 1: almost no personal contact with, who were my intellectual mentors. 832 00:53:10,000 --> 00:53:11,520 Speaker 1: And then there were the people who have helped me 833 00:53:11,560 --> 00:53:15,160 Speaker 1: along in my career, prime among whom are John Reken- 834 00:53:15,239 --> 00:53:19,880 Speaker 1: thaler at Morningstar and Jonathan Clements at 835 00:53:19,920 --> 00:53:23,719 Speaker 1: the Journal. And then he was succeeded by Jason Zweig, 836 00:53:24,280 --> 00:53:26,520 Speaker 1: who is someone that I derived a great deal of 837 00:53:26,760 --> 00:53:31,520 Speaker 1: personal inspiration from as well. That's a great list. 838 00:53:32,120 --> 00:53:34,680 Speaker 1: Let's talk about some of your favorite books and 839 00:53:35,080 --> 00:53:38,440 Speaker 1: what you're reading right now. Okay. Well, the two favorite 840 00:53:38,440 --> 00:53:41,320 Speaker 1: books that I always mention to everybody are Expert Political 841 00:53:41,360 --> 00:53:46,600 Speaker 1: Judgment by Phil Tetlock, which talks about just how difficult 842 00:53:46,840 --> 00:53:50,799 Speaker 1: it is to forecast well, how poorly people do 843 00:53:51,280 --> 00:53:56,200 Speaker 1: at it, and particularly the characteristics of poor forecasters, 844 00:53:56,719 --> 00:53:58,880 Speaker 1: which is immensely valuable, since that turns out to be 845 00:53:58,960 --> 00:54:01,760 Speaker 1: most of the people on TV. And he explains exactly 846 00:54:01,760 --> 00:54:05,000 Speaker 1: why TV talking heads tend to be 847 00:54:05,160 --> 00:54:10,279 Speaker 1: miserable forecasters, and identifies, in fact, 848 00:54:10,440 --> 00:54:13,400 Speaker 1: that they are. The other book that I recommend, 849 00:54:13,400 --> 00:54:15,600 Speaker 1: it's kind of a grim book and it seems irrelevant 850 00:54:15,600 --> 00:54:19,160 Speaker 1: to finance, but I think it's just so very important, 851 00:54:19,440 --> 00:54:23,399 Speaker 1: which is Laurence Rees's history of Auschwitz, Auschwitz: A New History. 852 00:54:23,440 --> 00:54:25,520 Speaker 1: And it was a history of Auschwitz from the perspective 853 00:54:25,960 --> 00:54:29,440 Speaker 1: of the camp personnel. And the bottom line is, 854 00:54:29,440 --> 00:54:34,640 Speaker 1: if you can understand how very ordinary Germans became mass murderers, 855 00:54:34,920 --> 00:54:38,919 Speaker 1: then you can certainly understand Enron and WorldCom 856 00:54:39,000 --> 00:54:44,000 Speaker 1: and WeWork, all right. And it's 857 00:54:44,040 --> 00:54:46,239 Speaker 1: just very important for people in finance if 858 00:54:46,280 --> 00:54:49,000 Speaker 1: you can make that jump. Now, the book that I'm 859 00:54:49,000 --> 00:54:50,880 Speaker 1: reading right now, and I'm not even done with it, but 860 00:54:51,040 --> 00:54:53,760 Speaker 1: it's one of the most brilliant books I've ever read, 861 00:54:54,200 --> 00:54:57,560 Speaker 1: is a book called The WEIRDest People in the World. 862 00:54:57,719 --> 00:55:01,080 Speaker 1: WEIRD is an acronym, W-E-I-R-D, which stands 863 00:55:01,080 --> 00:55:07,560 Speaker 1: for Western, Educated, Industrialized, Rich, and Democratic, which is 864 00:55:07,560 --> 00:55:10,520 Speaker 1: all of Western society.
And it's a book, first of all, 865 00:55:10,560 --> 00:55:13,560 Speaker 1: about how unusual we are. It's a book by a man 866 00:55:13,560 --> 00:55:17,320 Speaker 1: named Joseph Henrich, who is an interesting guy in himself, 867 00:55:17,320 --> 00:55:20,000 Speaker 1: which I'll get to in a minute. And he 868 00:55:20,040 --> 00:55:23,800 Speaker 1: has this bizarre thesis which, you know, strikes most people 869 00:55:23,920 --> 00:55:25,880 Speaker 1: as outrageous, and is the reason why the book isn't 870 00:55:26,080 --> 00:55:29,080 Speaker 1: more widely read than it is. And the thesis is 871 00:55:29,120 --> 00:55:31,560 Speaker 1: that we are who we are in Western society and 872 00:55:31,560 --> 00:55:35,760 Speaker 1: we're rich the way we are because the Church 873 00:55:35,920 --> 00:55:40,839 Speaker 1: forbade cousin marriage, which seems like a really weird hypothesis 874 00:55:41,080 --> 00:55:44,440 Speaker 1: until he explains it to you. And then he goes 875 00:55:44,480 --> 00:55:46,839 Speaker 1: into the data in back of it, which is just 876 00:55:46,960 --> 00:55:51,040 Speaker 1: absolutely steel-trap. He brings an enormous amount of 877 00:55:51,400 --> 00:55:56,040 Speaker 1: data from anthropology and social psychology to bear on it, 878 00:55:56,239 --> 00:55:59,879 Speaker 1: and he makes an extremely convincing case. And it's 879 00:56:00,200 --> 00:56:03,239 Speaker 1: just that, you know, the Church forbade the 880 00:56:03,280 --> 00:56:06,359 Speaker 1: marriage of first cousins. They forbade the marriage of even 881 00:56:06,440 --> 00:56:09,400 Speaker 1: fifth cousins in some instances. And if you can't marry 882 00:56:09,440 --> 00:56:11,560 Speaker 1: your fifth cousin, that basically means you have to get out 883 00:56:11,560 --> 00:56:15,040 Speaker 1: of town to find a mate. And what that does 884 00:56:15,680 --> 00:56:18,920 Speaker 1: is it greatly increases your radius of trust. Okay, you 885 00:56:18,960 --> 00:56:22,880 Speaker 1: start trusting people who are outside your immediate family and tribe, 886 00:56:23,320 --> 00:56:28,640 Speaker 1: and that is the essential characteristic of the wealthiest modern societies. 887 00:56:28,680 --> 00:56:32,440 Speaker 1: The reason why the Danes and the Norwegians and 888 00:56:32,480 --> 00:56:36,520 Speaker 1: the Germans do so well is because they trust strangers, 889 00:56:36,800 --> 00:56:38,719 Speaker 1: all right. And if you look at the places in 890 00:56:38,760 --> 00:56:42,719 Speaker 1: the world that don't do well, that have poor 891 00:56:42,760 --> 00:56:46,640 Speaker 1: economies and poor politics and poor institutions, it's societies with 892 00:56:46,880 --> 00:56:49,200 Speaker 1: a very low radius of trust. So, for example, it's the 893 00:56:49,239 --> 00:56:52,640 Speaker 1: difference between northern and southern Italy. The radius of trust 894 00:56:52,640 --> 00:56:55,600 Speaker 1: in northern Italy is very high. The radius of trust 895 00:56:55,800 --> 00:57:00,000 Speaker 1: in southern Italy is very, 896 00:57:00,120 --> 00:57:03,600 Speaker 1: very low, something that sociologists have known about for years, 897 00:57:03,640 --> 00:57:06,879 Speaker 1: for decades. So it's a book that explains all that, 898 00:57:07,200 --> 00:57:10,120 Speaker 1: and it's just an amazing read, and 899 00:57:10,200 --> 00:57:13,840 Speaker 1: I can't recommend the book highly enough.
Interesting. You mentioned 900 00:57:13,840 --> 00:57:16,560 Speaker 1: he has an unusual background. He didn't start out with 901 00:57:16,640 --> 00:57:21,600 Speaker 1: the usual four-star, you know, academic background. He started 902 00:57:21,600 --> 00:57:23,800 Speaker 1: out as an aerospace engineer, I think at a state 903 00:57:23,880 --> 00:57:27,480 Speaker 1: university somewhere, but he had been a minor in anthropology. And 904 00:57:27,520 --> 00:57:30,160 Speaker 1: you look 905 00:57:30,160 --> 00:57:32,720 Speaker 1: at his pictures and he looks like your typical aerospace engineer, 906 00:57:32,800 --> 00:57:35,560 Speaker 1: you know, with the horn-rims and glasses. He doesn't 907 00:57:35,600 --> 00:57:38,919 Speaker 1: have a pocket protector, but he should have one. 908 00:57:39,040 --> 00:57:42,920 Speaker 1: And after working in aerospace for a while, he gets a PhD 909 00:57:42,920 --> 00:57:46,240 Speaker 1: in anthropology, and he starts doing his research. And the 910 00:57:46,280 --> 00:57:50,680 Speaker 1: research is so remarkably good that he just gradually 911 00:57:50,680 --> 00:57:55,040 Speaker 1: ascends the academic ladder, first the University of 912 00:57:55,040 --> 00:57:58,960 Speaker 1: British Columbia, and now he's got an endowed chair of anthropology 913 00:57:59,080 --> 00:58:05,200 Speaker 1: at Harvard and this enormous team of multidisciplinary researchers 914 00:58:05,520 --> 00:58:11,480 Speaker 1: under him, you know, psychologists and economists and social psychologists, 915 00:58:11,560 --> 00:58:15,919 Speaker 1: and his work is just absolutely brilliant. Once 916 00:58:15,960 --> 00:58:18,800 Speaker 1: you start reading the book, if you're academically inclined, 917 00:58:18,920 --> 00:58:21,320 Speaker 1: you'll get lost in the thicket of his references, because 918 00:58:21,320 --> 00:58:24,880 Speaker 1: the references are so fascinating. So he's another left-brain, 919 00:58:25,000 --> 00:58:29,160 Speaker 1: right-brain person who could do both the math and 920 00:58:29,280 --> 00:58:32,960 Speaker 1: the narrative side. Yes, and I forgot to mention 921 00:58:32,960 --> 00:58:35,840 Speaker 1: this fact: he also writes very well. Right, you're another 922 00:58:35,840 --> 00:58:38,760 Speaker 1: one of those people who have the ability to deal 923 00:58:38,800 --> 00:58:43,280 Speaker 1: with the underlying mathematics and the language side. And that's 924 00:58:43,280 --> 00:58:47,920 Speaker 1: a relatively rare combination of skills. Shucks, Barry, I'm just 925 00:58:47,960 --> 00:58:52,200 Speaker 1: a simple country neurologist. That's right. Let's get to our 926 00:58:52,320 --> 00:58:56,080 Speaker 1: last two questions. What sort of advice would you give 927 00:58:56,120 --> 00:59:00,560 Speaker 1: to a recent college graduate who was interested in a career 928 00:59:00,880 --> 00:59:08,640 Speaker 1: in either neurology, investing, or writing? Well, the usual admonition 929 00:59:08,760 --> 00:59:13,680 Speaker 1: to follow your bliss is just awful advice.
930 00:59:13,720 --> 00:59:16,080 Speaker 1: You know, my favorite New Yorker cartoon 931 00:59:16,520 --> 00:59:18,920 Speaker 1: is the typical cartoon where they have a guy, 932 00:59:19,280 --> 00:59:22,240 Speaker 1: you know, sitting on the street in shambles 933 00:59:22,280 --> 00:59:23,840 Speaker 1: with a hat in front of him, and 934 00:59:23,920 --> 00:59:27,400 Speaker 1: his sign says, "Followed my bliss." You know, you 935 00:59:27,720 --> 00:59:30,479 Speaker 1: have to make a living. And 936 00:59:31,560 --> 00:59:33,480 Speaker 1: the best thing you can do is to meld those 937 00:59:33,520 --> 00:59:35,560 Speaker 1: two things. You shouldn't take a job 938 00:59:35,600 --> 00:59:38,760 Speaker 1: you despise just to make money. But on the other hand, 939 00:59:39,120 --> 00:59:42,480 Speaker 1: you know, you shouldn't get a degree in ethnomusicology 940 00:59:43,400 --> 00:59:47,080 Speaker 1: and expect to have a happy existence. 941 00:59:47,360 --> 00:59:50,080 Speaker 1: You know, you need to have a decent standard 942 00:59:50,120 --> 00:59:52,480 Speaker 1: of living, you need to provide for your family, and 943 00:59:52,520 --> 00:59:54,640 Speaker 1: you need to save up so that if you ever 944 00:59:54,680 --> 00:59:56,920 Speaker 1: get sick of your job, you can follow your bliss 945 00:59:56,920 --> 00:59:58,840 Speaker 1: at that point and not have to worry about your 946 00:59:58,840 --> 01:00:01,520 Speaker 1: next meal. And the best part of 947 01:00:01,560 --> 01:00:05,120 Speaker 1: doing that is, if you can wind up with reasonable 948 01:00:05,160 --> 01:00:08,160 Speaker 1: savings at age forty or fifty, you don't have to 949 01:00:08,160 --> 01:00:09,840 Speaker 1: worry about what to do with the last twenty or 950 01:00:09,880 --> 01:00:11,600 Speaker 1: thirty years of your life, because you'll have whole new 951 01:00:11,680 --> 01:00:14,440 Speaker 1: careers in front of you, and possibly things that 952 01:00:14,480 --> 01:00:18,560 Speaker 1: you genuinely enjoy doing. So don't follow your bliss, don't 953 01:00:18,600 --> 01:00:20,600 Speaker 1: take a job just for the money. Try and meld 954 01:00:20,680 --> 01:00:23,680 Speaker 1: those two things intelligently and balance them off. I have 955 01:00:23,800 --> 01:00:27,600 Speaker 1: to ask, since you referenced it: what's your take 956 01:00:27,720 --> 01:00:33,480 Speaker 1: on the whole early retirement FIRE community that seems 957 01:00:33,520 --> 01:00:37,880 Speaker 1: to believe you can save enough money and tap out 958 01:00:38,040 --> 01:00:41,520 Speaker 1: in your thirties or forties? Oh, it's a great idea, 959 01:00:41,640 --> 01:00:45,680 Speaker 1: as long as you don't get sick or have kids. So, 960 01:00:46,280 --> 01:00:49,760 Speaker 1: not a fan? Well, I think 961 01:00:49,760 --> 01:00:53,640 Speaker 1: their hearts are in the right place. I think they're a 962 01:00:53,640 --> 01:00:56,880 Speaker 1: little delusional, and of course they also ignore, you know, 963 01:00:56,840 --> 01:00:59,240 Speaker 1: the realities of life. And I think they also 964 01:00:59,440 --> 01:01:02,640 Speaker 1: ignore, you know, what do you do with your 965 01:01:02,680 --> 01:01:04,560 Speaker 1: life when you go to the beach at age thirty 966 01:01:04,640 --> 01:01:07,560 Speaker 1: or forty?
You better have something worthwhile that you want 967 01:01:07,560 --> 01:01:09,840 Speaker 1: to do with your life. You better have a mission 968 01:01:09,840 --> 01:01:12,640 Speaker 1: in life, because if you don't, you're going to wind 969 01:01:12,720 --> 01:01:16,280 Speaker 1: up suffering from industrial-grade ennui. I don't 970 01:01:16,280 --> 01:01:17,800 Speaker 1: want to trash them too much. I mean, I like 971 01:01:17,920 --> 01:01:20,400 Speaker 1: these people, and I think that their message for our 972 01:01:20,400 --> 01:01:23,840 Speaker 1: society, that we live in a corrosive consumer culture that 973 01:01:23,920 --> 01:01:26,320 Speaker 1: has to be resisted and you should keep your living 974 01:01:26,360 --> 01:01:30,680 Speaker 1: expenses down, is certainly a very salutary message. Fair enough. 975 01:01:30,840 --> 01:01:33,480 Speaker 1: And our final question: what do you know about the 976 01:01:33,520 --> 01:01:37,200 Speaker 1: world of investing today that you wish you knew thirty- 977 01:01:37,240 --> 01:01:40,720 Speaker 1: plus years ago or so, when you were first getting 978 01:01:40,760 --> 01:01:45,080 Speaker 1: started in finance? Well, you know, academics like to play 979 01:01:45,080 --> 01:01:51,160 Speaker 1: this parlor game of, do equities become, you know, 980 01:01:51,680 --> 01:01:55,280 Speaker 1: less risky or more risky over time? And 981 01:01:55,520 --> 01:01:57,800 Speaker 1: there are arguments on both sides, but most academicians 982 01:01:57,800 --> 01:02:00,240 Speaker 1: will tell you that they become riskier with time. But 983 01:02:00,400 --> 01:02:05,280 Speaker 1: what I only fully understood after 984 01:02:05,320 --> 01:02:07,720 Speaker 1: doing finance for 985 01:02:07,760 --> 01:02:09,920 Speaker 1: a couple of decades was that that's kind of a 986 01:02:09,920 --> 01:02:13,680 Speaker 1: silly question. The real question is, how risky are 987 01:02:13,960 --> 01:02:18,280 Speaker 1: stocks at a given stage in your life cycle? 988 01:02:18,320 --> 01:02:21,280 Speaker 1: And so if you are retired and you don't have 989 01:02:21,320 --> 01:02:25,560 Speaker 1: any human capital left, then stocks are Three Mile Island, 990 01:02:25,960 --> 01:02:30,720 Speaker 1: Chernobyl toxic. They're very dangerous, because, you know, if you 991 01:02:30,800 --> 01:02:34,840 Speaker 1: have a bad bear market that's prolonged, 992 01:02:35,080 --> 01:02:37,480 Speaker 1: you've got bad sequence risk, and you may wind up 993 01:02:37,480 --> 01:02:40,520 Speaker 1: eating into your capital. At the other end of the spectrum, 994 01:02:40,560 --> 01:02:43,840 Speaker 1: if you're young, there's almost no risk to owning stocks, 995 01:02:43,840 --> 01:02:47,200 Speaker 1: because you're periodically saving, and at some point in your 996 01:02:47,240 --> 01:02:49,760 Speaker 1: thirty- or forty-year saving career, you're gonna buy a 997 01:02:49,760 --> 01:02:52,240 Speaker 1: lot of stocks very cheap, and you're gonna wind up 998 01:02:52,360 --> 01:02:55,360 Speaker 1: doing very, very well. And I wish I had understood 999 01:02:55,400 --> 01:02:57,960 Speaker 1: that earlier in my career, so I could have been 1000 01:02:58,000 --> 01:03:01,200 Speaker 1: more aggressive earlier in my career.
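A minimal sketch of the sequence risk just mentioned, in Python, with purely hypothetical numbers: a retiree withdrawing a fixed fifty thousand a year from a million-dollar portfolio, facing the same five annual returns in two different orders. None of the figures come from the conversation; the point is only to show that the order of returns matters once you are withdrawing.

# Toy illustration of sequence-of-returns risk: identical returns,
# different order, fixed withdrawals; all numbers are hypothetical.
def retire(returns, start=1_000_000.0, withdrawal=50_000.0):
    # Take the fixed withdrawal at the start of each year,
    # then apply that year's return to what remains.
    balance = start
    for r in returns:
        balance = (balance - withdrawal) * (1 + r)
    return balance

bear_last = [0.20, 0.10, 0.05, -0.10, -0.30]  # bad years come late
bear_first = list(reversed(bear_last))        # same returns, bad years first

print(f"bear market late:  {retire(bear_last):,.0f}")   # ~694,000
print(f"bear market early: {retire(bear_first):,.0f}")  # ~572,000
# Identical average return, but withdrawals taken during an early bear
# market lock in losses, leaving the early-bear retiree noticeably poorer.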
But, you know, it was what 1001 01:03:01,200 --> 01:03:03,000 Speaker 1: it was at the time, so I don't feel too 1002 01:03:03,000 --> 01:03:04,680 Speaker 1: bad about it. But if there was one lesson I 1003 01:03:04,720 --> 01:03:08,480 Speaker 1: wish I had learned, I had internalized, 1004 01:03:08,680 --> 01:03:10,360 Speaker 1: it was that I could have been a lot more aggressive 1005 01:03:10,400 --> 01:03:13,760 Speaker 1: when I was younger. Very interesting. Bill Bernstein, thank you 1006 01:03:14,000 --> 01:03:16,960 Speaker 1: so much for being so generous with your time. We 1007 01:03:17,040 --> 01:03:21,280 Speaker 1: have been speaking with William Bernstein, author most recently of 1008 01:03:21,400 --> 01:03:24,360 Speaker 1: The Delusions of Crowds: Why People Go Mad in Groups, 1009 01:03:24,800 --> 01:03:28,040 Speaker 1: as well as a dozen other books. If you enjoyed 1010 01:03:28,120 --> 01:03:30,600 Speaker 1: this conversation, be sure and check out any of 1011 01:03:30,640 --> 01:03:34,600 Speaker 1: the previous three hundred and ninety conversations we've had over 1012 01:03:34,600 --> 01:03:39,600 Speaker 1: the past seven years. You can find those at iTunes, Spotify, 1013 01:03:39,760 --> 01:03:43,800 Speaker 1: wherever you feed your podcast fix. We love your comments, 1014 01:03:43,840 --> 01:03:48,440 Speaker 1: feedback, and suggestions. Write to us at MIB podcast 1015 01:03:48,480 --> 01:03:51,920 Speaker 1: at Bloomberg dot net. Give us a review on Apple iTunes. 1016 01:03:52,480 --> 01:03:55,840 Speaker 1: You can sign up for my daily reads at ritholtz 1017 01:03:55,920 --> 01:03:59,200 Speaker 1: dot com. Check out my weekly column on Bloomberg dot 1018 01:03:59,240 --> 01:04:03,680 Speaker 1: com slash Opinion. Follow me on Twitter at ritholtz. 1019 01:04:03,760 --> 01:04:05,640 Speaker 1: I would be remiss if I did not thank the 1020 01:04:05,720 --> 01:04:09,040 Speaker 1: crack staff that helps us put these conversations together each 1021 01:04:09,080 --> 01:04:13,360 Speaker 1: and every week. Tim Harrow is my audio engineer. Atika 1022 01:04:13,480 --> 01:04:17,880 Speaker 1: Valbrun is my project manager. Michael Boyle is my producer. 1023 01:04:18,000 --> 01:04:21,840 Speaker 1: Michael Batnick is my head of research. I'm Barry Ritholtz. 1024 01:04:22,160 --> 01:04:25,680 Speaker 1: You've been listening to Masters in Business on Bloomberg Radio.