Joe Weisenthal: Hello, and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Tracy Alloway: And I'm Tracy Alloway.

Joe: Hi, Tracy. We're in the same room together.

Tracy: I know, and I'm looking at you while I say hello, which is different to what we normally do. But it's nice to see you in person again.

Joe: Absolutely.

Tracy: Okay, so we're not talking about poker on this episode.

Joe: No. I love poker and chess and anything else like that, but we're not talking about poker on this episode. But remember a couple of weeks ago when we talked to Annie Duke, the former professional poker player?

Tracy: Of course. One of our best episodes in a while, I think.

Joe: And of course her whole thing, and she has this new book out called Thinking in Bets, and one of her big ideas that she talked about was that to be successful in poker or trading, which is particularly relevant to us, people have to not get attached to certain ideas. They have to be willing to discard things that are wrong, which is very tough for people.

Tracy: Yeah, there's a lot of confirmation bias in trading, and people kind of refuse to let go of trades that are going the wrong way.

Joe: Right, because they think it will come back eventually, they really want to be right, or they want to fit a market move into some existing worldview and really jam it in. And okay, so we say all that, we say, oh, people do this, and I think that's pretty easy for people to accept. But I raised the question of why we do this. Why is it so hard for us to update our beliefs when new facts emerge? Why are we so interested in coming up with an idea and fitting it into an ideology or worldview even if it doesn't fit?

Tracy: Yeah, why are we seemingly, incontrovertibly, forever doomed to repeat this one mistake? And why do we hate the truth?

Joe: Right. Okay. So today we're going to try to answer that question: why we hate the truth, why we all hate facts, and why this makes us all bad traders.
Tracy: Okay, who are we talking to on this one?

Joe: So I'm very excited. I came across this paper the other day. It's called The Partisan Brain: An Identity-Based Model of Political Belief, and it talks about various explanations for this phenomenon of our brains rejecting truth. And we have one of the authors, Jay Van Bavel. He is a professor at NYU, and he's going to walk us through some of his research.

Tracy: What does he do, specifically?

Joe: He's an associate professor of psychology and neural science at New York University. So he's going to explain to us why we like lies.

Tracy: Why we like lies, exactly. Let's do it.

Joe: Jay Van Bavel, thank you very much for joining us.

Jay Van Bavel: Thanks for having me.

Joe: So, is this accurate? Do we just all hate the truth?

Jay Van Bavel: Sometimes we love the truth. It really hinges on what our goals are. If we're looking for accuracy, and that's part of our identity, then we look to seek out evidence that we might be wrong, or at least verification that we know what we believe. And if we are motivated by other goals, for example if we have a commitment to a political belief or a political party, we might be compelled to believe them and ignore facts that contradict that particular identity.

Joe: So I set it up, obviously, within the context of Odd Lots; we talk about markets and trading. The paper itself, though, is about, as the title says, the partisan brain. My intuition and my sense is that a lot of what you've studied about how political ideologues see the world has applicability to the realm of trading and other areas. But just sort of give us the overview of what this paper tried to show.

Jay Van Bavel: So what this paper tries to do is bridge research in political science, which is about political parties and ideologies, with research in psychology, so what types of things motivate us to engage in certain behaviors or hold certain beliefs, as well as neuroscience.
So we looked inside the brain and tried to get a handle on, if you're being biased in some way by some commitment to an identity or political party, is that shaping the way that you're reasoning? So your prefrontal cortex, is it overactive and trying to argue or change your understanding of something? Or is it shaping our memory, or our emotional system, or maybe even our perceptual interpretations of the world, which is really damning, because if it's shaping how you see the world, it's going to be really hard to fix that. It's like a perceptual illusion where, you know, you see the dress one way, I see it another way with another set of colors, and we'll just forever argue about it.

Tracy: So what exactly did you find, then? Because in my mind I can imagine, like, I get a little bit of a shot of serotonin every time someone tells me I'm right or says that was a good question. So when I think about these sorts of confirmation biases, that's what I think about. But you seem to be talking about a wide variety here.

Jay Van Bavel: Yeah, so confirmation is certainly part of it. We want to confirm our pre-existing beliefs. But what matters more is confirming or supporting or affirming particular identities you have. And so if you're a trader who's committed to, you know, economic outcomes or improving your performance, you should care a lot about facts. If you're an investigative journalist or a scientist, if those are your identities, you care a lot about truth. But most people don't care about those things most of the time. And even professors or traders or journalists sometimes get led astray by other goals that they have: a goal to achieve status or power, or to affirm a sense of belonging, so to do what is popular. Or people might engage in what's called groupthink, where they're kind of fitting in with the group's beliefs or norms. And those types of political motives and social motives can lead us astray.
Joe: So you give some examples in your research of political ideologues, or people who belong to one party, actually remembering facts differently. So, for example, Democrats are more likely to believe, incorrectly, that George W. Bush was on vacation during Hurricane Katrina, things like that. But in the research you explain that, to some extent, it makes sense. In other words, the brain puts a higher priority on believing something that makes sense for the group, in this case Democrats, rather than putting a higher priority on factual accuracy. So why does the brain elevate that sense of the group above accuracy?

Jay Van Bavel: Yeah, so that's a great question. It goes back to thinking about why we have the brains we do. Our brains evolved over several hundred thousand years in small tribes in Africa, and humans are pretty flimsy creatures. We're not very strong or particularly fast, so we would get swallowed up by other creatures very quickly unless we cooperated and fit in. And if you didn't fit in, you were ostracized from the group, and that was incredibly threatening. It was literally a threat to your survival, and you certainly would have had fewer reproductive opportunities if you weren't well respected and liked in your group. And you can see the same thing in chimpanzees, one of our nearest genetic neighbors. They care a lot about what their status is in the group, and if they're excluded or if they fall down to the bottom of the ladder, the outcomes for them aren't very good, and they're less likely to survive or pass along their genes to future generations. And so through evolution we've developed these brains that are well adapted for fitting into groups and getting along with other people in our groups. And a lot of it is through psychological tricks that guide us towards things that now look irrational, but if you think about human history, they actually were rational.
They helped us survive, and that's why we're the descendants who have these particular quirks of cognition.

Joe: So if you were a Stone Age Cassandra preaching the truth, but it threatened the rest of your tribe, they would kick you out, or at the very least you wouldn't be able to find a mate.

Jay Van Bavel: There is no shortage of examples of that throughout history.

Tracy: So I imagine that that sort of evolution and that tribal heritage probably doesn't align very well with the modern world.

Jay Van Bavel: It doesn't, yeah. So we're here in Manhattan. It's incredibly safe. We're safer and more prosperous than at any time in human history, and so we really don't have a need to be afraid of a lot of the things that we're afraid of. But we are, because we have a brain that is oversensitive to things that might have killed our ancestors.

Tracy: What are some other examples of ways that the brain prioritizes things other than pure factual accuracy?

Jay Van Bavel: Yeah. So, as I was mentioning to you both earlier before we got on the show, I'm a big sports fan, and one of the first things professional gamblers will tell you is don't bet on your favorite teams. And the reason is, because we're so identified with those teams, and we want them to win to affirm our fandom and identity with them, we make bad bets. And so if you ever want to sucker somebody, get them to bet on some team that they really love or have a flag for flying on their front porch. That's the type of person who's going to be more likely to overestimate the probability of success of that team.

Joe: And is it that same groupthink, that social comfort, that causes that? Does that essentially go back to that same survival instinct?

Jay Van Bavel: Yeah, it's driven by our need to belong, our need for status, and also our need to be good, virtuous people. So we like to think our groups, especially our political groups, which are attached to morality, are virtuous and better than the other ones.
And if we didn't think that, we wouldn't vote for them in the first place. And so we're constantly looking for evidence that we're on the right side of political history.

Tracy: So you touched on this earlier, but if you are a trader, do you think that maybe, because your motivation is to make money, the bias doesn't come into it as much? Is that your general sense? I know the paper was about political biases.

Jay Van Bavel: Yeah, so let me give you an example of political bias. There's some research showing that if you put monetary stakes on what people say politically, they're more likely to be accurate. And so there is reason to believe that that kind of financial accountability can increase the extent to which you value accuracy and guide you to the right answer. But it doesn't get you all the way there, and sometimes that bias pops up in other places in your behavior. And so I think traders have pretty good built-in incentives to make them more accurate. But again, they're human, and when they're not thinking about the bottom line, they might be thinking about status or stature in the economic world or something like that. Those are the types of motives that are likely to lead them astray.

Joe: So I remember thinking about this very vividly in the wake of the financial crisis, because there were a lot of these legacy hedge funders who didn't like fiscal stimulus and they didn't like quantitative easing, and they were all kind of conservative and cranky and mean about it. A lot of them came of age during the Reagan era, and it was very clear that that experience and that ideology shaped them, and many of them were very bearish for too long. And I remember the hedge funder David Tepper, who was a Democrat, and he did phenomenally well. And I don't think it's because Democrats are better traders.
I think it was because, and I thought this at the time, he didn't have any particular reason to dislike the overall situation, and he didn't let that get in the way of his trading.

Jay Van Bavel: Yeah, so I'm glad you connected politics to trading. I think there's a really good lesson here. One is that through our experiences we build an ideology, a sense of how the world works, whether it's the financial system or social relations. And ideologies are really good because they can generate all kinds of predictions for us about what's going to happen and how we should play certain situations. But the problem with ideologies is that we become extremely committed to them. We start looking for evidence that they're true. And the reason for that is, think of how threatening it would be if I told you your whole belief system was wrong; you'd have to start from scratch. And so that's the reason we get defensive and cling to our worldviews and ideologies. And in situations like that, where they lead us astray, it often takes a lot of additional evidence to prove to those people that they're wrong. They're going to cling to that, especially if they've gone on the public record, or told their friends, that they believe in it. There's a social cost to admitting you're wrong.

Tracy: So when it comes to building ideologies or trade ideas or financial narratives, I feel like the big difference now is that there is so much data and information out there that you can pretty much cherry-pick whatever narrative you want. So how does that play into all of this?

Jay Van Bavel: Okay, so I'd say that's the same as what I think has happened in politics, and it's been really bad. It used to be that you had three major TV stations and a trusted news anchor who came on at five p.m. We were all watching the news, and it was vetted with similar editorial standards. What's happened is you've had the dispersion of news.
So now you can cherry-pick what news you want to go to about whatever political story of the day, and you can kind of drill down into some rabbit holes on the internet based on whatever you want to believe. And it's probably the same thing with financial information. There are so many people with so much information that, instead of relying on a small number of trusted sources, you could easily get stuck looking for evidence for your ideology, and probably find it no matter what your belief system is. And so information can be very helpful to people at times, if they use it in the right way. But the brain wants to find evidence that supports its beliefs, and that can lead us astray, because we'll just cherry-pick data that supports that belief. And scientists can do that too; it's just as dangerous for us as it is for traders.

Joe: And I'm thinking a lot, too, about the people who, as we know, have been very bearish all these years, and even the upward move in stocks and the economy is proof that they're right, because then that just shows that the whole thing is rigged or in a bubble. Or, for the bulls, it's exactly the same with every correction; it's the flip side.

Tracy: I'm curious if there are patterns of people who are just better than others at not falling into this trap.

Jay Van Bavel: Yeah. So I just saw a talk on this from a colleague of mine at Yale, who's measuring people's analytic abilities. There's a simple measure called the cognitive reflection test that people can take, and some people tend to go with intuitive, automatic responses. So one question might be: there's a pond with lilies growing on it, and after fifty days the pond is entirely covered, and the lilies double in size each day. So how full was the pond at day forty-nine?

Joe: Uh, half full?

Jay Van Bavel: Half full, right.
A lot of people really...

Joe: I got it, by the way. There's a really stressful quiz in the middle of the podcast.

Jay Van Bavel: Good. You know, a huge proportion of Ivy League students fail these types of questions. They're not easy. The reason is, even for people who are smart, the instinct is to say twenty-five days, because the lilies are growing at twice the rate every day, and you're thinking, if it's fifty days until the pond is full, the halfway point should be twenty-five days. And so a lot of people will give that response, and they do it automatically. Some people, like you, stop, catch themselves, analyze their thinking critically, and then give a response.

Joe: This just proves that I should be a trader, right?

Jay Van Bavel: Right. So a lot of us have intuitions about what we should do or what's right or wrong, but the best people among us say, wait a second, that sounds too good to be true, or that news doesn't seem quite right, I'm going to go fact-check it. And a lot of fact-checking now you can do in thirty seconds; you go to Snopes or something. But most people don't do that. They're like, oh, I knew that all along, now I'm going to share it.

Tracy: But those people who do take the extra thirty seconds to check something on Snopes that they saw on Facebook, or who catch themselves before they answer a riddle: did they descend from some other group of people? Why don't they have that instinct to, you know, just basically move more on instinct?

Jay Van Bavel: Yeah, I guess the way of thinking about it is that there are some situations where going with your instinct is helpful, and what that means is there are other situations where it isn't. So you end up with a normal distribution of humans, some of whom are very intuitive and guided by automatic reactions, and the rest of whom are kind of the opposite: they're catching themselves.
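[A quick aside: the doubling arithmetic in the lily-pad quiz above is easy to verify. Below is a minimal, purely illustrative sketch in Python; it is not from the episode, and the constant and function names are assumptions made for this example.]

# Check of the lily-pad puzzle: coverage doubles each day and the pond
# is entirely covered on day 50, so on day d it is 1 / 2**(50 - d) covered.
FULL_DAY = 50  # day on which the pond is fully covered (from the puzzle)

def coverage(day):
    """Fraction of the pond covered on the given day, assuming daily doubling."""
    return 1 / 2 ** (FULL_DAY - day)

print(coverage(49))  # 0.5, i.e. half full the day before the pond is full
print(coverage(25))  # ~3e-8, so the intuitive "twenty-five days" answer is far off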
Joe: I do think I would not have done well in the Stone Age, I'm just putting that out there right now. I think I would have been cast out pretty quickly and not found a mate. So obviously this has pretty profound implications for politics, especially as it gets easier and easier to craft our own narratives. And, as you know, there's been a fair amount of discussion of this, that we live in our own worlds, with our own facts and our own news sources. So that aspect of it people will grasp. But you talk a little bit in the paper about things that can be done to ameliorate some of these effects. So theoretically, if we want to save democracy, it would be nice if people could converge toward one set of shared facts. What works in terms of getting people to lower social identity in their mental hierarchies?

Jay Van Bavel: Yeah, so one thing is developing other identities that value accuracy. All of us have multiple identities. I am a father, a son, a professor, a Canadian, and all of these come online at different times. When I'm watching the Olympics, I'm thinking about my Canadian identity, and at that point I'm just cheering for my team, and I might make bad bets on the Canadian national hockey team. But when I'm thinking through the lens of a professor, that comes with a certain set of training and expectations, and I comport myself in a different way. I put on that set of glasses and see the world through that lens, and it's a lot more critical than my other identities. And so when I want to evaluate facts, or before I'm going to share something on Twitter, you know, among my followers, I might want to share something that is consistent with my other identities, but often I catch myself and ask, is there data to support that?
And that's basically because I realize my professor friends are going to be seeing it, and if I share things that are untrue or unfactual, they're going to jump all over it. And so that community I belong to is fiercely skeptical. And so you can build your own identities and hang out with communities that value accuracy. And some of us aren't comfortable with that; it's very threatening and upsetting to be told you're wrong, especially in public.

Tracy: Does democracy work if we are biologically destined to favor lies, and we have a lot of lies being flung at us by new forms of media and the internet?

Jay Van Bavel: Yeah. So democracy assumes that voters are informed and that they can determine what their best interests are, or the best interests of the country, and vote. But if they're fed inaccurate information, the collective wisdom of the population is going to go awry. And that is a huge risk to democracy, not just in the US but around the world right now. And that's why other countries are having bots generate news that serves their interests and not ours.

Joe: See, this is why markets are better, because it doesn't matter if a bunch of people believe lies; it all comes out in the wash, right? Whereas in a democracy you sort of choose one person over the other, in markets you converge on a single price, which tends to be, more or less, usually a pretty good price, although there are bubbles and stuff like that. You know, I'm thinking back to a really old episode we did on the podcast. Remember Dr. Brett Steenbarger, the trading psychologist? He was like the real-life Wendy Rhoades from Billions, who went into hedge funds and talked to traders about improving. So I'm curious: if you were brought in, if a hedge funder wanted to have you teach his traders, essentially, how to overcome their biases, how to overcome ideological ruts that might prevent them from seeing the truth in front of them.
How might you apply what you've learned about ameliorating the effects of politics to the world of markets?

Jay Van Bavel: Yeah, so maybe I'd come up with a handful of tips. The first one is: take your ego out of it. Take your own personal motives for things like status or belonging out of it as much as you can, because those are going to lead you astray on average. The next thing is: have a system where you check yourself. So when you feel like your intuition is to go one way, have some system where you wait an hour, or a day, or sleep on it before you make a big trade, so you're able to process more information and make the decision based on analysis. That also stops you from engaging in groupthink and just going with the flow of things. The other thing I would say is: try to take your political ideology out of the equation. As you mentioned, that happened in the recovery; some people had ideologies based on past experiences in the market and held onto those a little too long. So get your ego out of it, get your ideology out of it, and build in a process that allows you to deliberate. And the other thing I think you should do is have a process where you can call on the people around you to be critical of you, and create a culture of skepticism. I met with the CFO of eBay and gave a talk with him once, and he said that in their C-suite meetings, before each meeting, they would give a different person a black helmet, and that person was the designated dissenter. Because dissent is really hard, especially in groups, because people might think you're trying to undercut them or show off or sabotage the group. But it turns out the people who care most about the group's outcomes are the most likely to dissent; we just tend to suppress it to go along and fit in and to avoid being socially excluded.
Joe: And so, building systems to encourage dissent... it reminded me, I took my daughter to the Natural History Museum recently, and they showed a mask that judges in some tribe used to wear thousands of years ago, and it pointed out that the idea behind wearing the mask is to depersonalize the decision, so that it's the law speaking and not the individual. So when you say a black helmet, creating some visual symbol that this person is allowed to dissent, I can see how that has a long history of helping to break through and make communication better.

Jay Van Bavel: Yeah.

Joe: Well, that was a fantastic conversation, and I love this topic. Dr. Jay Van Bavel, he is the co-author of a new paper, The Partisan Brain: An Identity-Based Model of Political Belief. It's a pretty grim read from a future-of-democracy standpoint; I think this conversation left me very down on that front. But look, it can maybe have some implications for people who want to make money in the market, so that's good, right? Thank you.

Jay Van Bavel: Thanks a lot, Joe.

Tracy: On the next Odd Lots podcast, I'm showing up with a black helmet and I'm going to disagree with everything.

Joe: I like that idea. You know, what I really liked about this conversation is that there are a lot of people who talk about rationality, and I think there's kind of a cult of rationality among certain people, who pride themselves on being more rational than the next person. But what I like about this framing is that it's not that people are more or less rational; it's sort of a meta-rationality, in the sense that maybe it's not rational to refuse to acknowledge some particular fact, but maybe it is just more rational for people to want to agree with their neighbors, and ultimately that just might make more sense.
Tracy: Well, also, particularly in trading, we've talked a number of times about how markets can be wrong and you can be right, but ultimately you're going to lose money on that trade if everyone is going in a certain direction and you're trying to fight the tide. So that really complicates matters, and it's particularly challenging for traders and strategists, because sometimes out-of-consensus views are severely punished.

Joe: Yeah, because there's groupthink at internal organizations such as big investment firms; there's groupthink all over the place. So if you were someone who went long Lehman Brothers in September of 2008, now you're a complete idiot. But it could have been that they were bailed out, and Lehman Brothers could be worth twenty times what it was then, and everyone would say this person is a genius. So there are all kinds of incentives to just go with the crowd and not take the huge risk.

Tracy: Yeah, absolutely. Can I just say one more thing? I'm really into anthropology lately, so I like this idea that we're going to start looking into the evolution of humankind to explain markets. Let's do some more. All right, this has been another episode of the Odd Lots podcast. I'm Tracy Alloway. You can follow me on Twitter at @tracyalloway.

Joe: And I'm Joe Weisenthal. You can follow me on Twitter at @TheStalwart. Are you on Twitter?

Jay Van Bavel: I'm on Twitter at @jayvanbavel. That's J-A-Y-V-A-N-B-A-V-E-L.

Tracy: And you should follow our producer, Topher Forhecz, at Foreheads T, as well as the Bloomberg head of podcasts, Francesca Levy, at @FrancescaToday. Thanks for listening.