Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

Speaker 1: This week on the show, I have an extra special guest. I know you guys give me grief for saying that every week, but I really have an extra special guest: Professor Danny Kahneman, winner of the 2002 Nobel Prize in Economics, which is quite a feat when you realize that he's not an economist. He's a cognitive psychologist, author of the book Thinking, Fast and Slow. Before we get to the podcast, which is really quite fascinating, just a quick funny story. So we finished doing the interview and we're heading out of the Bloomberg building. Where are you heading? Oh, I'm going over here. Oh, I'm heading in that same direction, I tell Danny, which is what he insists everybody call him. And so we take a subway downtown, we get out at Grand Central, and we're walking someplace, and he says to me, listen, you don't have to walk me to where I'm going. I'm in New York; I know where I am. And I said, well, to tell you the truth, it's two o'clock and I haven't had lunch yet. A block from your destination is this lovely sandwich shop called Alidoro, just opened in midtown; there's a SoHo branch. It's really good, and I'm heading in your direction anyway. So he looks at me and says, the sandwich is really that good? Yes, they're delicious. So, long story short, we go into this place. It's essentially a takeout joint with one long picnic table in the back; most people take their sandwiches out. And it's two o'clock, so the lunch-hour rush is over. We order sandwiches, which takes a few minutes. I sit him down at the table, and I say, I'll go get the sandwiches.
Speaker 1: And as I'm coming back to the table, there is Professor Kahneman sitting at a picnic table surrounded by, I don't know, a dozen millennials, kids who could have been in his class a couple of years ago at Princeton, and they're just wholly oblivious to this kindly professor sitting amongst them. Here's a brilliant thinker who changed the way we think about how we think, and everybody is completely unaware that that brilliance is in their midst. The whole thing just cracked me up. I found the interview to be absolutely fascinating; I think you will also. So, with no further ado, my conversation with Danny Kahneman.

Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio. Our special guest this week — and I know I say this all too often, but we really do have an extra special guest this week — his name is Professor Daniel Kahneman. He taught at the Woodrow Wilson School at Princeton, in both psychology and public affairs. He is a fellow at the Center for Rationality at Hebrew University, winner of the Nobel Prize in Economic Sciences as well as the Lifetime Contribution Award of the American Psychological Association, and he is a recipient of the Presidential Medal of Freedom. I could go on and on about his curriculum vitae, but I would rather jump right into this. Danny Kahneman, welcome to Bloomberg.

Speaker 2: Pleased to be here.

Speaker 1: So I'm excited to talk to you about so many different things. Let's start way back at the beginning. You began your academic career in Israel. Did you expect to spend the rest of your life working in academia?

Speaker 2: Oh, yeah. I mean, I expected to be a professor when I was a kid.

Speaker 1: And so you win the Nobel Prize in 2002, not for psychology, which is your field of study, but for economics. How does a psychologist win a prize for economics?

Speaker 2: Well, I mean, there is no Nobel in psychology, but we won it in economics for work that is psychological.
Speaker 2: When I say "we" — Amos Tversky didn't actually win this, because they don't give it posthumously, but, you know, I always felt it's a joint prize, because the prize was awarded for work we had done together, and it was work that influenced economics. So it's on your influence in general that you get a Nobel, not on the quality of the work necessarily.

Speaker 1: So let's talk about your colleague, Amos Tversky. In your book Thinking, Fast and Slow, you described first seeing him speak. I think it was around 1969, and the subject was: do people make good intuitive statistical assumptions? Did you know right away the two of you were destined to be research partners?

Speaker 2: No. What happened was he visited a seminar I was teaching. I mean, I invited him, and he spoke about some research that was being done in Michigan at the time on whether people are good intuitive statisticians, and their conclusion was that they are. And so we had — well, it's now counterintuitive; at that time, it sounded quite intuitive — and we had a very heated discussion. You know, they had those in Israel; it was a very Israeli type of conversation, and we enjoyed it, both of us. We decided to have lunch together the following Friday, and we discussed ideas over that lunch. We didn't know that it was going to shape our lives, but we had a pretty good inkling of what we wanted to do next.

Speaker 1: So the two of you established a cognitive basis for analyzing common human errors and how they arise from biases. How did you happen across that discovery?

Speaker 2: We started, you know, from a combination of what Amos knew and what I specialized in, and my specialty at the time was visual perception. In visual perception you have illusions, and analyzing the illusions is interesting because the illusions teach you something about the mechanism of normal perception. And Amos was an expert on decision theory and on formal analysis, and we complemented each other very well.
Speaker 2: And the research that we eventually developed was really research on cognitive illusions and where they come from. We proposed a mechanism — mechanisms we called heuristics — and I now have a somewhat different interpretation, not very different, but slightly different from the one we had at that time. I mean, the general idea is quite simple. You ask people a complicated question, like what is the probability of an event, and they can't answer it because it's very difficult. But there are easier questions that are related to that one that they can answer, such as: is this a surprising event? That is something that people know right away. Is it a typical result of that kind of mechanism? People can answer that right away. And what happens is people take the answer to the easy question, they use it to answer the difficult question, and they think they have answered the difficult question. In fact, they haven't; they've answered an easier one. So that's the mechanism that we studied.

Speaker 1: You mentioned that your view on heuristics has changed somewhat over the years. What's the difference between what you believe today and way back when?

Speaker 2: Well, we started out with what we called a limited number of heuristics, and they became quite well known because we have a paper that we published in 1974 — we published it in Science magazine, so it was widely read across many disciplines. In that paper we spoke about three major heuristics, and we analyzed the biases that these heuristics lead to. Many people would know their names if they went to business school: it's representativeness, availability, and anchoring. What I'm thinking of now is that there is a more general process, and that's the process I described a minute ago. We call that attribute substitution. It's substituting one question for another.

Speaker 1: So instead of answering the complex, difficult question, you answer what you can, which is the easier question.

Speaker 2: That's right.
Speaker 2: So, an example that comes to mind: if I ask you, how happy are you these days — now, you know your mood right now, and you're very likely to tell me your mood right now and think that you have answered the more general question of how happy you are these days. So that's another example. There are many like them.

Speaker 1: I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. My special guest today is Nobel Prize-winning psychologist Danny Kahneman, who has specialized in both cognitive and decision-making processes. Let's jump right into some of the things that you discuss in Thinking, Fast and Slow, a book that I found to be just right in the sweet spot of my confirmation bias — it was everything I hoped it to be. You talk about "what you see is all there is" when discussing Systems One and Two; it's almost a theme throughout the book. Explain what that is, "what you see is all there is."

Speaker 2: Well, people are really not aware of information that they don't have. And so the idea which is emphasized in the book is that you take whatever information you have and you make the best story possible out of that information. And the information you don't have — you don't feel that it's necessary. I have an example that I think brings that out. If I tell you about a national leader, that she is intelligent and firm — now, do you have an impression already whether she's a good leader or a bad leader? And you certainly do: she's a good leader. Now, the third word that I was about to say is "corrupt."
Speaker 2: But, you know, the point being that you didn't wait for information that you didn't have. You formed an impression as we were going, with the information you did have. And this is "what you see is all there is."

Speaker 1: So the working assumption amongst people who are trying to draw a conclusion from available information is — they fail to calculate the impact of data they're either unaware of, don't know, or simply haven't encountered.

Speaker 2: I mean, if what people are trying to do is to make the best story possible out of the information they have, then this is what they're going to do. And the measure of confidence that people have in their beliefs, in their opinions, is really a reflection of the quality of the story that they've told themselves.

Speaker 1: So let's talk about that narrative, because one of my favorite errors in investing is that we tell ourselves these complex narratives that seem to fit whatever information is in front of us, and that very much creates a risk that the narrative, as emotional and compelling as it might be, is possibly misleading. How does that fit into your work?

Speaker 2: Taleb has a very nice example in his book The Black Swan. It's an example of something that happened on Bloomberg News, and it happened the day that Saddam was caught. Something happened in the bond market shortly thereafter — I forget whether it rose or it fell — and the headline was: Saddam caught, investors are fleeing from risk. And then a few hours later the market reversed, and there was a headline that said: Saddam caught, investors feel more secure taking risk. Now, what happened was — obviously it's not plausible that you could explain two opposite consequences by the same thing, but you can make a very good story. And what happens — commentators on Bloomberg Radio and all other pundits, what they do — there is a salient event today.
Speaker 2: So when you're looking at what happened in the market, you're looking back and you see that salient event, and irresistibly you think there is a causal connection. So whatever happened in the market, you attribute to the salient event — went up, went down, whatever. It always makes a good story.

Speaker 1: I have a very vivid recollection: early in the 2003 Iraq invasion, there was an important, highly regarded mosque that was blown up by American warplanes. And the headline — I don't remember if it was the Times or the Wall Street Journal, but it was a big publication — was: oil prices skyrocket on mosque accidentally being blown up. And later that day — you're looking at this online — the oil prices had come back down, and it was the same headline, only changing the conclusion: mosque destroyed, oil prices remain stable.

Speaker 2: Well, if it's going up and down after the event, it means the event isn't really significant, is it?

Speaker 1: Well, clearly. So let's talk a little bit about some of the biases — those three biases you discovered early on: availability, anchoring, and representativeness. These are three enormous biases. First question: why publish in Discover magazine or Science magazine, as opposed to some staid academic journal?

Speaker 2: Oh, we had done our homework. I mean, we published in staid academic journals first, and then we published in Science, which, you know, is a scientific magazine.

Speaker 1: I mean, it's not — it's a popular magazine, no?

Speaker 2: No, it's not. Science is a professional magazine, but it's addressed to scientists of all disciplines. But, you know, it's a perfectly respectable scientific magazine. No, we were academics. We published academically. We didn't publish in Discover — they looked us up and they interviewed us and they wrote about us, but we didn't seek that out. We were really academics. We were not trying to influence the world. And whatever happened, happened, but that wasn't really what we intended.
Speaker 1: Did you have any sense of the enormity of these three particular biases early on?

Speaker 2: Well, when we wrote for Science, we had a sense that this was an important story. We really did not anticipate the impact that it had on many fields, including investment and the law, and even economics. It had a lot of influence that we didn't anticipate.

Speaker 1: Really. So let's talk about a couple of these. You mentioned availability bias — it's just the information you have. Anchoring is an enormous factor in things like negotiations and pricing. I love the story of just asking people — ask a group of people in a room what the first digit of their phone number is, and then ask them a separate, unrelated question. A higher first digit will yield a higher number on the unrelated question, and a lower first digit yields a lower one. How did that discovery come about?

Speaker 2: Well, actually it takes a little more than that, but I'll give you an example. In the example of negotiation, many people think that you have an advantage if you go second, but actually the advantage is going first: throw out the first number, and that's where you anchor. And the reason is something in the way the mind works: the mind tries to make sense of whatever you put before it. So this built-in tendency that we have of trying to make sense of everything that we encounter — that is a mechanism for anchoring.

Speaker 1: I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. My special guest today is Nobel Prize-winning psychologist Danny Kahneman, who has specialized in both cognitive and decision-making processes. Let's talk a little bit about biases and money. Since this is Bloomberg and many of our listeners are investors, I want to start with prospect theory, which is something I find absolutely fascinating. The basic idea is that we feel losses much more severely than we enjoy gains.
Speaker 1: The first question I have for you about this is: how did you discover this?

Speaker 2: Well, the idea that came first was that the standard financial analysis — what is called the rational-agent model — fails in an important way. In the rational-agent model, you should always be thinking in terms of wealth. You know, what will be my wealth if this happens? What will be my wealth if that happens? All the theory is based on the idea that you are thinking in terms of your wealth. But it turns out you're not thinking in terms of your wealth at all. You're thinking in terms of gains and losses; you're thinking in terms of changes of wealth. So that was the first thing that was important to establish.

Speaker 1: It was clear that Homo economicus, as the professors call it, is not a really good representation of how people behave.

Speaker 2: That's right. And, you know, this is sort of immediately obvious, that people don't think in terms of their wealth. And it's easy to show that they don't, because you can describe the same decision in terms of wealth, you can give it two different descriptions, and people will make different choices depending on whether it's presented in gains and losses. So that was the first discovery, if you will. I mean, it had been stated before, by the way — Markowitz, Harry Markowitz, had written one article on this, about changes being important. He didn't follow through; we came upon it and we did follow through. Now, once you have that — that people are thinking in gains and losses — it's easy to see that there is an asymmetry. That is, losses, as we said, loom larger than gains. And we even have a pretty good idea of by how much they loom larger than gains: about two to one. An example of how to bring that out: I'll offer you a gamble on the toss of a coin.
Speaker 2: If it shows tails, you lose one hundred, and if it shows heads, you win X. Now, what would X have to be for that gamble to become really attractive to you?

Speaker 1: Well, more than a hundred.

Speaker 2: So it's more than — actually, you're a professional; there is a difference between professionals and other people, and that's been well established. Most people demand more than two hundred.

Speaker 1: Really. So they want two-to-one odds.

Speaker 2: Which is — it's not the odds. I mean, the odds are one to one; it's an equal chance. It's the payout: the payout has to be two to one, meaning that it takes two hundred dollars of potential gain to compensate for one hundred dollars of potential loss when the chances of the two are equal. So that's loss aversion. It turns out that loss aversion has enormous consequences.

Speaker 1: Enormous, absolutely. So that leads to a couple of really interesting questions. The first one is: what is it about losses that makes them so much more painful than gains are pleasurable? In other words, why does this two-to-one loss aversion even exist?

Speaker 2: Well, I mean, you know, this is evolutionary. You would imagine that in evolution, threats are more important than opportunities. That makes a lot of sense. And so it's a very general phenomenon that bad things, so to speak, preempt — they are stronger than good things in our experience. So loss aversion is a special case of something much broader.

Speaker 1: So there's always another opportunity coming along — another game, another deer coming by — but an actual genuine loss, hey, that's permanent. You don't recover from that.

Speaker 2: That's right; either way, you take it more seriously. So if there is a deer in your sights and a lion, you are going to be busy about the lion and not about the deer.

Speaker 1: Makes sense.
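A note for readers following the arithmetic: the two-to-one payout Danny describes falls directly out of a loss-averse value function. Here is a minimal sketch, assuming a piecewise-linear prospect-theory value function with a loss-aversion coefficient of 2 (the coefficient, function names, and dollar amounts are illustrative, not anything computed on the show):

```python
# A minimal sketch of the loss aversion described above, assuming a
# piecewise-linear prospect-theory value function with a loss-aversion
# coefficient of 2 -- the "about two to one" ratio from the conversation.

LOSS_AVERSION = 2.0  # losses are felt roughly twice as strongly as gains


def felt_value(x):
    """Subjective value of a dollar change x: losses loom larger than gains."""
    return x if x >= 0 else LOSS_AVERSION * x


def felt_value_of_coin_flip(win, lose=100.0):
    """Average felt value of a 50/50 gamble: win `win` or lose `lose`."""
    return 0.5 * felt_value(win) + 0.5 * felt_value(-lose)


print(felt_value_of_coin_flip(win=100))  # -50.0: a fair 100/100 flip feels like a loss
print(felt_value_of_coin_flip(win=150))  # -25.0: still unattractive
print(felt_value_of_coin_flip(win=200))  #  0.0: the payout must reach 2-to-1 to break even
```

Published estimates of the coefficient vary by study, and the full prospect-theory value function also curves near zero, but a flat 2.0 is enough to reproduce the break-even point in the exchange above.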
Speaker 1: That leads to the obvious question: well, what can investors do to protect themselves against this hardwired loss aversion?

Speaker 2: Well, there are things they can do. One is not to look at their results — not to look too often at how well they're doing. And today you could look tick by tick, minute by minute. It's a very, very bad idea to look too often. When you look very often, you are tempted to make changes, and where individual investors lose money is when they make changes in their allocation. Virtually on average, whenever an individual investor makes a move, it's likely to lose money, because there are professionals on the other side, betting against the kind of moves that individual investors make.

Speaker 1: I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio. My special guest today is Danny Kahneman. He is the Nobel Prize-winning psychologist, formerly a professor at Princeton, who has had an enormous influence on the fields of economics and investing. Let's get right back to the question of investing and the professionals versus the amateurs. Do we see the same sort of biases and errors amongst the professionals?

Speaker 2: They're clearly much attenuated among professionals. So I'll give you an example — and that's the biggest example; it's what makes loss aversion important. If I ask a regular person in the street — not you, because you're a professional — would you take a gamble where, if you lose, you lose a hundred dollars, and if you win, you win a hundred and eighty, on the toss of a coin?

Speaker 1: That's a no-brainer.

Speaker 2: Well, it's a no-brainer for professionals. Most people don't like it.

Speaker 1: Really? I'd do that all day long.

Speaker 2: Yeah, because you're a professional. Now, when you ask the same people in the street: okay, you don't want this one — would you take ten? We'll toss ten coins, and every time, if you lose, you lose a hundred, and if you win, you win a hundred and eighty. Everybody wants the ten. Nobody wants the one.

Speaker 1: So, in other words, they see the math — the statistics — of repeated play.
Speaker 2: When the game is repeated, then people become much closer to risk-neutral, and they see the advantage of gambling. Now, professionals are in a repeated-play situation. That's the biggest difference, and so they view each decision that they make as part of a stream of decisions that they're going to make. This, by the way, is true for investors as well. One question that I ask people when I tell them about that: so, you've turned down one hundred/one hundred and eighty now, but you would accept ten of those? Now, are you on your deathbed? That's the question I ask — is that the last decision you're going to make? And clearly there are going to be more opportunities to gamble, perhaps not exactly the same gamble, but there will be many more opportunities. You need to have a policy for how you deal with risks, and then make your individual decisions in terms of that broader policy. Then you will be much closer to rationality.
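The repeated-play arithmetic is easy to check. The sketch below uses the same lose-$100/win-$180 coin flip and the two-to-one loss weighting from earlier — my numbers, not anything computed on the show — and shows why a loss-averse chooser refuses one flip but happily accepts ten:

```python
# A sketch of why repeated play changes the picture, assuming each flip
# independently loses $100 or wins $180 with probability 1/2, and losses
# are weighted twice as heavily (the two-to-one loss aversion above).

from math import comb


def bundle_stats(n_flips, win=180, lose=100, loss_aversion=2.0):
    """Expected value, loss-averse 'felt' value, and P(net loss) of n flips."""
    ev = felt = p_net_loss = 0.0
    for k in range(n_flips + 1):              # k = number of winning flips
        p = comb(n_flips, k) / 2 ** n_flips   # binomial probability of k wins
        net = k * win - (n_flips - k) * lose  # net dollar outcome
        ev += p * net
        felt += p * (net if net >= 0 else loss_aversion * net)
        if net < 0:
            p_net_loss += p
    return ev, felt, p_net_loss


print(bundle_stats(1))   # (40.0, -10.0, 0.5): positive EV, but it *feels* like a loss
print(bundle_stats(10))  # (400.0, ~353.9, ~0.17): the bundle feels good despite loss aversion
```

Viewed one at a time, the flip has a negative felt value; bundled, the same flips have a strongly positive felt value and only about a 17 percent chance of ending down — which is the broad framing Danny recommends.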
Speaker 1: But on the single flip of a coin, the average individual is going to look at this and say, hey, I'll feel foolish if I lose a hundred, even though the payout is greater than my potential loss.

Speaker 2: That's exactly what happens, and you shouldn't feel that way. I mean, it's true that you will feel like a fool. It's very closely related to "what you see is all there is": we tend to see decisions in isolation. We don't see a decision about whether I take this gamble as one of many similar decisions that I'm going to make in the future.

Speaker 1: So are people overly outcome-focused, to the detriment of process?

Speaker 2: We call that narrow framing. They view the situation narrowly, and that is true in all domains. So, for example, we said that people are myopic, that they have a narrow time horizon. To be more rational, you want to look further in time, and then you'll make better decisions. If you're thinking of where you will be, you know, a long time from now, it's completely different from thinking about how I will feel tomorrow if I make this bet and I lose.

Speaker 1: I recall reading about a study — and I hope I don't mangle this too badly — where they would take a photo of somebody and then, using software, age their face. The people who had the current photo of their own face would give a very different answer than the group that were better able to imagine themselves twenty years later.

Speaker 2: Yeah, that's about saving. What actually happens is that when you show people a morphed image of their face as an old person, their tendency to save increases; it's easier for them to identify with their future self. But in general, that's not what we do. People aren't especially good at that.

Speaker 1: Let's talk about being wrong and being able to admit that you're wrong. John Kenneth Galbraith once famously said: given the choice between changing one's mind and proving there's no need to do so, most people get busy on the proof. You called this theory-induced blindness. So why are we so unwilling to admit when we're wrong?

Speaker 2: You know, you try to make the best story possible, and the best story possible frequently includes that actually I didn't make that mistake. So something occurred, and in fact I did not anticipate it, but in retrospect, I did anticipate it. This is called hindsight. And that's one of the main reasons that we don't admit that we're wrong: whatever happens, we have a story — we can make a story, we can make sense of it. We think we understand it, and when we think we understand it, we alter our image of what we thought earlier. I'll give you an example. You have two teams that are about to play, you know, football, and the two teams are about evenly balanced. Now one of them completely crushes the other.
Speaker 2: Now, after you have just seen that they're not equally strong, you perceive one of them as much stronger than the other, and that perception gives you the sense that, you know, this must have been visible in advance — that one of them was much stronger than the other. So hindsight is a big deal. It allows us to keep a coherent view of the world. It blinds us to surprises. It prevents us from learning the right thing; it allows us to learn the wrong thing. That is, whenever we're surprised by something, even if we do admit that we made a mistake, we say, oh, I'll never make that mistake again. But in fact, what you should learn when you make a mistake because you did not anticipate something is that the world is difficult to anticipate. That's the correct lesson to learn from surprises: that the world is surprising. It's not that my prediction is wrong; it's that predicting in general is almost impossible.

Speaker 1: You know, it's ironic. During the '08-'09 financial crisis, we saw very few people beforehand making warnings about housing, about derivatives, about the stock markets. A pair of academics, Reinhart and Rogoff, very famously put out a paper in January 2008 — widely ignored. Afterwards, I can't tell you how many people have claimed to have seen it coming — never in print, there's no recorded history — but they all started coming out saying they knew this was going to go south.

Speaker 2: Pure hindsight. And by the way, this is really hindsight bias. It's actually very pernicious, because it gives you the wrong impression. So — I love Michael Lewis, and I liked The Big Short, like everybody, but The Big Short really gives you the impression that this was obvious, you know.

Speaker 1: See, when I read that book, my interpretation was: here are a very small number of very odd characters who were amongst the only ones who saw it, and everybody else thought they were crazy. And the movie does a nice job portraying that.
Speaker 2: Yeah, but actually, at any one time, there are many people who are predicting that the world is going to end tomorrow, that there's going to be a crash next week. When there is no crash, those people get forgotten; when there is a crash, they're sort of geniuses.

Speaker 1: So we've seen that time and again — the broken-clock syndrome. Eventually they're right. Although in the modern world, thanks to Google, you can very easily go back and check if someone has said it. There's a famous pundit who, for the past seven years, has been saying a 2008-like crash is around the corner, it's coming. Well, maybe, but it's now seven years in a row you've been saying it. And there are many people like that.

Speaker 2: You know, the saying is: he predicted seven of the last three recessions. That's sort of it.

Speaker 1: So, speaking of Michael Lewis, you and Amos Tversky are the subject of his next book, which is coming out. Let's talk about that. What was that experience like?

Speaker 2: Michael Lewis is a very lovable character and is very pleasant to talk to, and he earns your confidence by his charm. You know, he is very good at what he does. I haven't seen the book. I have no idea what's in it. I don't even want to see it before it gets published. When it gets published, I'll read it, of course. But the process has been interesting, because he's made me think about my past, and that's been quite enjoyable. It has made me think about my collaboration with Amos Tversky. You know, Amos Tversky died twenty years ago, and he is bringing him to life. He also had access to all of Amos's papers, including notes that Amos took and conversations we had that I'd forgotten all about. So it's been an interesting experience reliving that period. What the experience of reading the book will be, I don't know yet.

Speaker 1: You can hang around a little bit? We'll keep chatting.
Speaker 1: We've been speaking with Professor Danny Kahneman, winner of the 2002 Nobel Prize in Economics and former psychology professor at Princeton. If you've enjoyed this conversation, be sure and stick around for our podcast extras, where we keep the tape rolling and continue discussing all things biases and heuristics. Be sure and check out my daily column on Bloomberg.com, or follow me on Twitter @ritholtz. I'm Barry Ritholtz. You've been listening to Masters in Business on Bloomberg Radio.

Speaker 1: Welcome to the podcast. By the way, it's so odd to call you Danny. Danny, if I haven't said thank you so much for being so generous with your time, let me do that now before I forget. I have been a big fan of your work for a long time. In my career, I started as a trader, and pretty early on I recognized: why are these four guys doing the exact same thing and yet each obtaining very different results? And the first book in the space I read was by Tom Gilovich at Cornell, who I believe you've worked with before — How We Know What Isn't So. But I've loved the work that you've put out, and I have a bazillion questions; let's see if we can work our way through some of these. So earlier on we were talking about anchoring and availability. When I mentioned to a friend that you were an upcoming guest, the question he suggested is: well, the people who study biases and the pitfalls of human cognition — they must be optimal decision makers, right? And I said, I'll ask.

Speaker 2: Well, certainly not. You know, the way that I describe it, there's System One and there's System Two, and System One is very difficult to educate, and System One is where our intuitions come from. And System Two, very often, is just the PR agent for System One: it just explains decisions that were made. It rationalizes.

Speaker 1: It rationalizes.
Speaker 1: But System One does a good job at keeping us alive, and not only on the savannah.

Speaker 2: You know, it keeps us alive, and most of the things that people do are good. You know, I believe Amos Tversky and I are often considered sort of the prophets of irrationality. We never wanted that label. We don't believe that people are irrational. People are not perfectly rational — they couldn't be, because they have a finite mind, and perfect rationality demands much more. But, you know, they're quite reasonable, except that they make predictable biases and mistakes in some conditions.

Speaker 1: So, by and large, most people get many of their important decisions right, and the ones they get wrong really stand out.

Speaker 2: Well, you know, it's hard to count. But most of the time — most of the day — you are on System One. We're doing things that are well practiced, and most of them work. So we manage to get through our day without thinking too much.

Speaker 1: And most of the time it works just fine.

Speaker 2: That's quite so. The answer to your question was a long one, but no, I'm not smarter than I was when I began this line of research more than fifty years ago, because my System One is just the same as it was. Sometimes I can recognize situations in which I'm likely to make a mistake — so, this person is trying to anchor me; yeah, I can recognize that. It still works on me, by the way, but I can recognize it.

Speaker 1: So let's talk a little bit about that. I was always under the belief, apparently mistakenly, that if you become self-enlightened about your own biases, then you can undertake a series of steps to prevent yourself from doing damage. We were speaking before the show started about indexing and global asset allocation.
Now, 576 00:37:47,080 --> 00:37:51,000 Speaker 1: are you picking stocks and timing markets, or have you 577 00:37:52,080 --> 00:37:55,600 Speaker 1: taken your System Two and allowed it to keep your System 578 00:37:55,880 --> 00:38:01,560 Speaker 1: One from doing damage? I personally don't do anything of the kind, 579 00:38:03,440 --> 00:38:06,320 Speaker 1: you know. But I'm a conservative person, and so there's 580 00:38:06,360 --> 00:38:08,520 Speaker 1: nothing to be learned from the way that I handle 581 00:38:08,520 --> 00:38:11,760 Speaker 1: my money. But I certainly would not advise picking 582 00:38:11,800 --> 00:38:15,520 Speaker 1: individual stocks, or any form of very active management. 583 00:38:15,760 --> 00:38:19,879 Speaker 1: You strike me as a passive indexer. You specifically said, 584 00:38:20,440 --> 00:38:23,200 Speaker 1: I don't want to check my portfolio too often; people 585 00:38:23,239 --> 00:38:26,000 Speaker 1: do that, and when they do, they tend to make mistakes. 586 00:38:26,360 --> 00:38:30,520 Speaker 1: So this seems a little closer to optimal decision making: 587 00:38:31,640 --> 00:38:35,480 Speaker 1: you've identified a couple of biases that affect how people 588 00:38:35,520 --> 00:38:39,920 Speaker 1: invest, and you're engaging in behavior to prevent yourself 589 00:38:39,920 --> 00:38:43,840 Speaker 1: from them. Well, I'm mostly very lazy, you know. That's it. 590 00:38:44,440 --> 00:38:47,160 Speaker 1: You can get to the same point by being 591 00:38:47,640 --> 00:38:51,400 Speaker 1: rational or by being lazy. In that case, it's laziness. 592 00:38:51,960 --> 00:38:53,799 Speaker 1: If that were true, I would have been a much 593 00:38:53,840 --> 00:38:59,200 Speaker 1: better student in high school. So, Jason Zweig 594 00:38:59,239 --> 00:39:03,279 Speaker 1: of the Wall Street Journal was an editor 595 00:39:03,440 --> 00:39:07,399 Speaker 1: with you on some work you did, and he said 596 00:39:07,480 --> 00:39:09,480 Speaker 1: you once said to him, and I want to 597 00:39:09,520 --> 00:39:15,040 Speaker 1: get the quote exactly right: you have no sunk costs. 598 00:39:15,080 --> 00:39:17,719 Speaker 1: So we're all familiar with the sunk cost fallacy. How 599 00:39:17,760 --> 00:39:19,960 Speaker 1: do you have no sunk costs? I mean, you know, 600 00:39:20,280 --> 00:39:22,880 Speaker 1: it was in a particular context. It was 601 00:39:22,920 --> 00:39:25,880 Speaker 1: in the context of how many drafts do you write. 602 00:39:26,560 --> 00:39:29,239 Speaker 1: And I have no sunk costs in the sense that 603 00:39:29,360 --> 00:39:32,879 Speaker 1: the mere fact that I have written something, even if 604 00:39:32,920 --> 00:39:36,399 Speaker 1: it's twenty or thirty pages: if now I decide it's 605 00:39:36,440 --> 00:39:40,759 Speaker 1: not good, I don't try to fix it. I start over: 606 00:39:40,960 --> 00:39:43,520 Speaker 1: throw it away and start over. That's having no 607 00:39:43,680 --> 00:39:48,120 Speaker 1: sunk costs. Now, the implication is some people would have said, hey, 608 00:39:48,160 --> 00:39:50,759 Speaker 1: I spent two hours on this, I'm going to go back 609 00:39:50,760 --> 00:39:53,080 Speaker 1: and edit this and try to make it work. For you, 610 00:39:53,080 --> 00:39:55,880 Speaker 1: that time and effort is spent as far as you're concerned.
611 00:39:56,280 --> 00:40:00,080 Speaker 1: You're starting from scratch. And I think you do 612 00:40:00,239 --> 00:40:05,000 Speaker 1: things better that way, if you forget about 613 00:40:05,040 --> 00:40:08,680 Speaker 1: the work that you've already invested, if 614 00:40:08,719 --> 00:40:12,160 Speaker 1: you have found a better solution. Now, instead of saying, oh, 615 00:40:12,280 --> 00:40:15,800 Speaker 1: I should have written that chapter differently, you just rewrite 616 00:40:15,800 --> 00:40:20,000 Speaker 1: that chapter differently. Now, don't most people suffer from the 617 00:40:20,200 --> 00:40:23,000 Speaker 1: sunk cost effect? They feel, well, I already spent 618 00:40:23,120 --> 00:40:25,920 Speaker 1: this money, and now I'm committed. I bought this stock, 619 00:40:26,080 --> 00:40:28,359 Speaker 1: and now I'm kind of committed to it. You know, 620 00:40:28,560 --> 00:40:34,759 Speaker 1: we're talking about writing, not those decisions. But 621 00:40:34,800 --> 00:40:38,160 Speaker 1: it's still a work process, it's still effort, and 622 00:40:38,280 --> 00:40:40,640 Speaker 1: people tend to think... I mean, most people are 623 00:40:40,719 --> 00:40:43,840 Speaker 1: highly sensitive to sunk costs, and I would not say 624 00:40:43,880 --> 00:40:47,440 Speaker 1: that I'm not, in other domains of life. I said 625 00:40:47,560 --> 00:40:50,160 Speaker 1: I'm free of sunk costs in the domain of writing. 626 00:40:50,800 --> 00:40:54,560 Speaker 1: And that's what Jason Zweig was writing about, because actually 627 00:40:55,440 --> 00:41:01,800 Speaker 1: he thinks that most writers are very reluctant to start over. Yes, absolutely: 628 00:41:01,880 --> 00:41:04,640 Speaker 1: you put the time and effort in; gee, I don't 629 00:41:04,640 --> 00:41:07,960 Speaker 1: want to throw this away, I have something to work with. 630 00:41:08,000 --> 00:41:11,600 Speaker 1: It's always easier to do that second draft than 631 00:41:11,640 --> 00:41:14,040 Speaker 1: it is to start with a blank sheet of paper. No, 632 00:41:14,040 --> 00:41:16,879 Speaker 1: no doubt about that. So we haven't really talked about 633 00:41:16,960 --> 00:41:21,320 Speaker 1: the endowment effect, but I have my favorite example 634 00:41:21,600 --> 00:41:24,879 Speaker 1: that I wanted to share with you, because I find 635 00:41:24,880 --> 00:41:29,040 Speaker 1: this such a fascinating subject. So I'm a car guy, 636 00:41:29,320 --> 00:41:32,560 Speaker 1: and that means very often friends and family come and 637 00:41:32,600 --> 00:41:35,719 Speaker 1: ask me: I'm thinking about this car or that car, 638 00:41:35,880 --> 00:41:40,759 Speaker 1: what's your view? And I've noticed a fascinating pattern, 639 00:41:40,840 --> 00:41:44,200 Speaker 1: like the two football teams who are so evenly matched. 640 00:41:44,640 --> 00:41:47,200 Speaker 1: When someone comes to you and says, I'm considering 641 00:41:47,360 --> 00:41:50,680 Speaker 1: these two cars. I'm thinking about the Toyota Camry or 642 00:41:50,719 --> 00:41:54,160 Speaker 1: the Honda Accord, and the Camry has this, this, and this, 643 00:41:54,280 --> 00:41:56,759 Speaker 1: but the Accord has that, that, and that, and it's 644 00:41:56,920 --> 00:41:59,960 Speaker 1: pretty evenly matched, and I'm having a hard 645 00:42:00,040 --> 00:42:05,360 Speaker 1: time making a decision. What's your opinion?
And so I'll say, well, 646 00:42:05,800 --> 00:42:07,839 Speaker 1: they're both great cars. You won't go wrong with either, 647 00:42:08,200 --> 00:42:10,399 Speaker 1: but I think this one is a little nicer. What 648 00:42:10,400 --> 00:42:12,400 Speaker 1: do you think? I'll try and ask them some questions. 649 00:42:13,239 --> 00:42:15,840 Speaker 1: Six months later, you see them, and they bought car 650 00:42:15,920 --> 00:42:18,360 Speaker 1: A over car B, and you ask them, how do 651 00:42:18,360 --> 00:42:20,879 Speaker 1: you like the car? And the answer is, I can't 652 00:42:20,880 --> 00:42:24,160 Speaker 1: imagine I was even considering that other car. This car 653 00:42:24,239 --> 00:42:27,880 Speaker 1: is fantastic. Now nothing has changed. These are two really 654 00:42:27,920 --> 00:42:32,000 Speaker 1: good automobiles, except for the fact that he now owns 655 00:42:32,040 --> 00:42:36,560 Speaker 1: this car. So what is it about ownership that makes 656 00:42:36,640 --> 00:42:41,160 Speaker 1: us think this is better, more valuable, whatever? Why do 657 00:42:41,239 --> 00:42:45,799 Speaker 1: we endow these objects with superiority? When you've 658 00:42:45,880 --> 00:42:49,719 Speaker 1: owned something for six months, you think highly of it 659 00:42:49,880 --> 00:42:55,759 Speaker 1: because it's become familiar, and almost everything that is familiar 660 00:42:57,239 --> 00:43:02,160 Speaker 1: you like better. Really? So familiarity doesn't breed contempt? 661 00:43:02,600 --> 00:43:06,920 Speaker 1: Familiarity makes you like things. That's a 662 00:43:06,960 --> 00:43:11,200 Speaker 1: big psychological rule in general. Even in the study where, 663 00:43:11,200 --> 00:43:13,680 Speaker 1: I think, it was a mug with the school's name 664 00:43:13,719 --> 00:43:16,440 Speaker 1: on it? That's a different story. What I said 665 00:43:16,480 --> 00:43:20,000 Speaker 1: is about when you own something for a long time. When you're 666 00:43:20,040 --> 00:43:25,560 Speaker 1: discussing trading, then if you're not a professional, if you're 667 00:43:25,600 --> 00:43:29,560 Speaker 1: an individual and you're thinking of a mug, then you're 668 00:43:29,600 --> 00:43:33,040 Speaker 1: not thinking of wealth. You're not thinking 669 00:43:33,080 --> 00:43:34,960 Speaker 1: of the state of the world in which you have 670 00:43:35,120 --> 00:43:37,479 Speaker 1: that mug against the state of the world in which 671 00:43:37,480 --> 00:43:42,680 Speaker 1: you have seven dollars in addition to your wealth. You're thinking, 672 00:43:42,920 --> 00:43:45,279 Speaker 1: if you have the mug, do I give it up? 673 00:43:45,719 --> 00:43:48,080 Speaker 1: And if you don't have the mug, do I get 674 00:43:48,120 --> 00:43:51,400 Speaker 1: it and give up the money for it? And giving 675 00:43:51,520 --> 00:43:56,120 Speaker 1: up is more painful than gaining. And that's how loss 676 00:43:56,120 --> 00:43:59,480 Speaker 1: aversion gets involved in the endowment effect. It doesn't 677 00:43:59,480 --> 00:44:02,480 Speaker 1: play any role in your story about the car, because 678 00:44:02,520 --> 00:44:05,720 Speaker 1: you're not about to trade the car, so your liking 679 00:44:05,800 --> 00:44:10,360 Speaker 1: for the car is more an effect of familiarity and of 680 00:44:10,520 --> 00:44:15,080 Speaker 1: a psychological process that's called dissonance reduction.
I chose 681 00:44:15,120 --> 00:44:18,600 Speaker 1: that car, therefore it must be good. So it is 682 00:44:18,640 --> 00:44:23,680 Speaker 1: true that almost anything that you chose becomes better because 683 00:44:23,719 --> 00:44:26,840 Speaker 1: you chose it. But that's a different process than the 684 00:44:26,920 --> 00:44:29,879 Speaker 1: endowment effect. That's totally... because it seems there's a big 685 00:44:29,920 --> 00:44:33,600 Speaker 1: overlap between "I chose it, therefore it's good" and "I 686 00:44:33,640 --> 00:44:39,359 Speaker 1: already own it, and therefore it's valuable." Well, because you 687 00:44:39,440 --> 00:44:43,040 Speaker 1: may own it without having chosen it. You know, I 688 00:44:43,200 --> 00:44:46,239 Speaker 1: give you that mug and I ask, would you sell it? 689 00:44:46,640 --> 00:44:50,839 Speaker 1: So you just had it for thirty seconds, and you 690 00:44:50,840 --> 00:44:53,400 Speaker 1: get the offer to sell it. So it's a 691 00:44:53,440 --> 00:44:58,920 Speaker 1: different process. That's quite... it's quite fascinating. So 692 00:44:59,080 --> 00:45:03,000 Speaker 1: I like this quote from Thinking, Fast and Slow: "Our 693 00:45:03,200 --> 00:45:07,840 Speaker 1: comforting conviction that the world makes sense rests on 694 00:45:07,920 --> 00:45:13,280 Speaker 1: a secure foundation: our almost unlimited ability to ignore 695 00:45:13,800 --> 00:45:17,759 Speaker 1: our own ignorance." Explain that. Since you mentioned dissonance, I 696 00:45:17,760 --> 00:45:20,920 Speaker 1: thought that was a good point to bring this up. Well, 697 00:45:22,040 --> 00:45:25,439 Speaker 1: earlier we talked about something I read about in the book, 698 00:45:25,480 --> 00:45:28,640 Speaker 1: which is "what you see is all there is." And 699 00:45:28,960 --> 00:45:33,319 Speaker 1: the idea that what you don't see, that it 700 00:45:33,480 --> 00:45:37,680 Speaker 1: could refute everything that you believe, just doesn't occur 701 00:45:37,760 --> 00:45:43,040 Speaker 1: to us. So it's another way of telling the story 702 00:45:43,080 --> 00:45:46,680 Speaker 1: I was telling earlier: that we construct the best possible 703 00:45:46,760 --> 00:45:50,680 Speaker 1: narrative about the world, and if we're successful in constructing 704 00:45:50,719 --> 00:45:53,479 Speaker 1: a good story, we believe it, we have high 705 00:45:53,520 --> 00:45:55,960 Speaker 1: confidence in it, and we don't want to change it. 706 00:45:56,719 --> 00:46:00,880 Speaker 1: That makes perfect sense. Let's talk about regression to 707 00:46:00,960 --> 00:46:05,480 Speaker 1: the mean, and your conversation with the fighter pilot instructor 708 00:46:05,880 --> 00:46:09,040 Speaker 1: who felt every time he yelled at a cadet they 709 00:46:09,080 --> 00:46:12,640 Speaker 1: would do better, and kind of ignored the fact that 710 00:46:13,520 --> 00:46:15,920 Speaker 1: maybe the cadet just happened to do poorly and 711 00:46:16,040 --> 00:46:21,560 Speaker 1: was due to do well. Well, you know, 712 00:46:21,800 --> 00:46:26,200 Speaker 1: it's actually quite remarkable that a statistical fact that, you know, 713 00:46:26,239 --> 00:46:30,759 Speaker 1: is as common as the air we breathe is 714 00:46:31,640 --> 00:46:35,400 Speaker 1: very nonintuitive, and that's regression to the mean.
So 715 00:46:37,360 --> 00:46:42,239 Speaker 1: the fact is that, you know, if you look at golfers, 716 00:46:42,360 --> 00:46:45,160 Speaker 1: that's the example I develop in the book. So you look 717 00:46:45,160 --> 00:46:50,320 Speaker 1: at the three guys who scored the best score yesterday; 718 00:46:51,160 --> 00:46:55,279 Speaker 1: chances are they're going to do less well today. And 719 00:46:55,400 --> 00:47:00,040 Speaker 1: I'll tell you why. It's because yesterday their 720 00:47:00,080 --> 00:47:04,040 Speaker 1: success was in part due to luck, and luck is not, 721 00:47:04,400 --> 00:47:08,880 Speaker 1: not guaranteed to follow them today. So we have to 722 00:47:08,920 --> 00:47:11,800 Speaker 1: think, with regression to the mean, that what we see 723 00:47:11,960 --> 00:47:16,120 Speaker 1: already has luck built into it, and luck is not 724 00:47:16,239 --> 00:47:20,439 Speaker 1: going to stay. So next time it's 725 00:47:20,480 --> 00:47:23,920 Speaker 1: likely to be less successful. You know, the golfer is 726 00:47:23,920 --> 00:47:26,839 Speaker 1: likely to be less successful than he was, because he's 727 00:47:26,880 --> 00:47:29,640 Speaker 1: likely to be less lucky than he was. And people 728 00:47:29,880 --> 00:47:34,360 Speaker 1: dramatically underestimate the impact of chance on their results. Certainly 729 00:47:34,880 --> 00:47:39,319 Speaker 1: separating skill from chance, especially in this field, is an 730 00:47:39,400 --> 00:47:43,400 Speaker 1: ongoing battle, and very often what looks like very skillful 731 00:47:43,440 --> 00:47:46,120 Speaker 1: managers turns out to be, hey, they were lucky for 732 00:47:46,160 --> 00:47:49,360 Speaker 1: a couple of quarters, and now that's gone. Sure. 733 00:47:49,560 --> 00:47:53,600 Speaker 1: And we have a very strong tendency, you know, everybody 734 00:47:53,680 --> 00:47:57,360 Speaker 1: has that, we have that about ourselves, to attribute 735 00:47:57,360 --> 00:48:03,239 Speaker 1: our successes to our skill and our failures, you know, 736 00:48:03,320 --> 00:48:06,719 Speaker 1: to bad luck. But actually there is a lot of 737 00:48:06,800 --> 00:48:10,480 Speaker 1: luck in our successes as well; we don't see that. 738 00:48:10,480 --> 00:48:15,480 Speaker 1: That is an ongoing issue amongst investors and traders. 739 00:48:16,040 --> 00:48:19,000 Speaker 1: It's always bad luck when the trade doesn't 740 00:48:19,000 --> 00:48:21,800 Speaker 1: work out, but when it does work out, it's because 741 00:48:21,880 --> 00:48:24,600 Speaker 1: I'm a genius. And we see that, we see that 742 00:48:24,680 --> 00:48:28,480 Speaker 1: all the time.
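To see that regression-to-the-mean point in miniature, here is a minimal simulation sketch (not part of the conversation; the golfers, names, and numbers are all hypothetical, assuming each round's score is stable skill plus random luck that is drawn fresh each day):

import random

# Hypothetical field of golfers: each has a stable skill level,
# which is just their long-run average score.
random.seed(0)
golfers = [{"skill": random.gauss(70, 2)} for _ in range(100)]

def play_round(golfer):
    # A single round is skill plus a luck term that does not carry over.
    return golfer["skill"] + random.gauss(0, 3)

# Day 1: everyone plays; in golf, lower scores are better.
day1 = [(play_round(g), g) for g in golfers]
day1.sort(key=lambda pair: pair[0])

# Re-play only the day-1 leaders on day 2.
for score, g in day1[:3]:
    print(f"skill {g['skill']:.1f}  day 1: {score:.1f}  day 2: {play_round(g):.1f}")

# The day-1 leaders got there partly through luck (their day-1 scores sit
# below their skill levels), so on average their day-2 scores drift back
# toward skill: regression to the mean, with no change in ability at all.

On a typical run, the three day-one leaders score a few strokes worse on day two, exactly because what we saw on day one already had luck built into it.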
So, having read a lot of what 743 00:48:28,600 --> 00:48:32,040 Speaker 1: you've written, and over a long time, I was 744 00:48:32,080 --> 00:48:37,200 Speaker 1: surprised to learn that you don't describe yourself as particularly optimistic. 745 00:48:38,000 --> 00:48:40,680 Speaker 1: And I thought that was kind of interesting, because you've 746 00:48:40,719 --> 00:48:45,040 Speaker 1: spent, what, more than a decade now looking at happiness 747 00:48:45,120 --> 00:48:48,440 Speaker 1: and utility. So the first question I have to 748 00:48:48,480 --> 00:48:53,200 Speaker 1: ask is, has studying this field made you any more 749 00:48:53,239 --> 00:48:56,920 Speaker 1: optimistic or any happier? No. I mean, you know, 750 00:48:57,000 --> 00:49:03,279 Speaker 1: optimism is genetic anyway... well, largely. You know, 751 00:49:03,920 --> 00:49:07,239 Speaker 1: I'm the son of a very pessimistic mother, and so 752 00:49:07,600 --> 00:49:12,480 Speaker 1: pessimism is sort of genetic too. I could see that. 753 00:49:12,640 --> 00:49:14,520 Speaker 1: Is it genetic, or is it the home you 754 00:49:14,560 --> 00:49:17,800 Speaker 1: were raised in? No, 755 00:49:18,280 --> 00:49:25,520 Speaker 1: it's actually genetic. Really? So, studying utility and 756 00:49:25,520 --> 00:49:30,880 Speaker 1: studying happiness, what conclusions have you reached based on 757 00:49:30,920 --> 00:49:36,360 Speaker 1: that research? Well, the main conclusion was that there is 758 00:49:36,360 --> 00:49:39,680 Speaker 1: a difference between what makes us satisfied and what makes 759 00:49:39,719 --> 00:49:44,200 Speaker 1: us happy. So draw the distinction between the two. 760 00:49:44,680 --> 00:49:49,640 Speaker 1: Being happy means having, you know, happy experiences, and 761 00:49:49,680 --> 00:49:52,200 Speaker 1: being satisfied is when you look at your life and 762 00:49:52,239 --> 00:49:55,200 Speaker 1: what you've accomplished: are you content, are you satisfied 763 00:49:55,239 --> 00:49:58,920 Speaker 1: with what you've accomplished? And those two are really not 764 00:49:59,000 --> 00:50:03,560 Speaker 1: the same. Which is more important? Well, it turns out that 765 00:50:03,840 --> 00:50:08,319 Speaker 1: for most people, what they try to do is 766 00:50:08,320 --> 00:50:11,840 Speaker 1: to be satisfied; they go for satisfaction, and 767 00:50:12,000 --> 00:50:15,680 Speaker 1: they are not thinking, how can I maximize the 768 00:50:15,760 --> 00:50:21,000 Speaker 1: quality of my life? Mm hmm. And many people, 769 00:50:21,040 --> 00:50:25,600 Speaker 1: I think, make big mistakes, in that they saddle themselves 770 00:50:25,600 --> 00:50:28,759 Speaker 1: with a big commute to have a larger home. You know, 771 00:50:28,840 --> 00:50:31,399 Speaker 1: the big commute means an hour or two a day 772 00:50:31,520 --> 00:50:36,600 Speaker 1: that are wasted. You know, that counts. Time should count 773 00:50:36,640 --> 00:50:40,080 Speaker 1: for a lot, because it's the only thing we've got, basically, 774 00:50:40,640 --> 00:50:45,240 Speaker 1: and people who waste time in order to achieve something 775 00:50:45,320 --> 00:50:49,879 Speaker 1: else are wasting a part of their life. So 776 00:50:49,960 --> 00:50:52,759 Speaker 1: the studies I've seen say the longer your commute is, 777 00:50:53,360 --> 00:50:56,799 Speaker 1: the less happy you are. Does that also mean 778 00:50:56,880 --> 00:51:02,160 Speaker 1: the less satisfied you are? Not necessarily, because if 779 00:51:02,280 --> 00:51:05,399 Speaker 1: having a big home is part of your satisfaction, if 780 00:51:05,400 --> 00:51:08,359 Speaker 1: you're sort of proud of the house you've got, 781 00:51:08,360 --> 00:51:11,040 Speaker 1: and you're paying for it with a long commute, 782 00:51:11,080 --> 00:51:13,440 Speaker 1: then in a way you're satisfied, although you're not happy. You could 783 00:51:13,480 --> 00:51:16,200 Speaker 1: get a bigger house closer to your job, but it's 784 00:51:16,239 --> 00:51:18,879 Speaker 1: going to be a lot more expensive.
So they're 785 00:51:18,920 --> 00:51:23,120 Speaker 1: trading the time for the money. That makes sense. What else 786 00:51:23,160 --> 00:51:27,160 Speaker 1: have you taken away from 787 00:51:27,200 --> 00:51:33,120 Speaker 1: studying happiness and satisfaction? Well, you know, I 788 00:51:33,160 --> 00:51:36,879 Speaker 1: think it turns out that people are happiest when 789 00:51:36,880 --> 00:51:41,680 Speaker 1: they're in the presence of friends, truly friends. And 790 00:51:42,080 --> 00:51:44,719 Speaker 1: you know, there is a special thrill, I think, for 791 00:51:44,800 --> 00:51:47,960 Speaker 1: people to be with friends, which is even more than 792 00:51:48,000 --> 00:51:52,120 Speaker 1: to be with family. Really? Why is that? Well, there's 793 00:51:52,239 --> 00:51:55,360 Speaker 1: something more relaxing; you know, when you're in a family, 794 00:51:55,400 --> 00:51:58,799 Speaker 1: there are many obligations and many stresses, and so this 795 00:51:59,000 --> 00:52:01,719 Speaker 1: is a pleasure that you get being with friends. 796 00:52:01,960 --> 00:52:04,120 Speaker 1: Those are some of the best moments of the week, 797 00:52:05,120 --> 00:52:09,799 Speaker 1: when you're not alone. There is a particular 798 00:52:10,920 --> 00:52:17,200 Speaker 1: joy for many people in having shared routines with friends: 799 00:52:17,719 --> 00:52:20,960 Speaker 1: you know, the weekly poker game, the weekly dinner and 800 00:52:21,040 --> 00:52:24,600 Speaker 1: movie that you share with friends. Over a period 801 00:52:24,640 --> 00:52:29,160 Speaker 1: of years, that becomes very precious, and, you know, 802 00:52:29,200 --> 00:52:34,360 Speaker 1: it becomes an important part of life. That's really 803 00:52:34,880 --> 00:52:38,480 Speaker 1: quite intriguing. You know, a lot of the discoveries 804 00:52:38,520 --> 00:52:42,640 Speaker 1: we've talked about have the feeling to me of epiphanies. 805 00:52:43,320 --> 00:52:46,719 Speaker 1: So one of the things I specifically wanted to ask 806 00:52:46,760 --> 00:52:50,320 Speaker 1: you about: when you were doing the officer candidate evaluations 807 00:52:50,360 --> 00:52:54,560 Speaker 1: for the Israeli Army, or when you would 808 00:52:54,600 --> 00:52:57,920 Speaker 1: ask colleagues how long they thought it would take 809 00:52:57,920 --> 00:53:01,959 Speaker 1: for a project to be done, the results of these 810 00:53:02,120 --> 00:53:06,640 Speaker 1: were just, really, wow, this is much different than everybody 811 00:53:06,680 --> 00:53:10,360 Speaker 1: believed. So what was it about your background and 812 00:53:10,480 --> 00:53:14,560 Speaker 1: your training that let you look at the world so 813 00:53:14,760 --> 00:53:19,840 Speaker 1: differently than everybody else around you? Obviously, these behaviors exist 814 00:53:19,960 --> 00:53:24,960 Speaker 1: amongst everybody. What led you to these really fascinating discoveries 815 00:53:25,000 --> 00:53:32,360 Speaker 1: that everyone else seems to have overlooked? Well, you know, 816 00:53:32,400 --> 00:53:37,160 Speaker 1: I was born to be a psychologist. I 817 00:53:37,200 --> 00:53:40,320 Speaker 1: really believe that. Not that it was fate, but, you know, the 818 00:53:40,440 --> 00:53:43,680 Speaker 1: thing was, I was better at that than 819 00:53:43,760 --> 00:53:48,799 Speaker 1: at anything else.
And I've always been interested in observing 820 00:53:48,880 --> 00:53:52,120 Speaker 1: people and then trying to figure out, you know, why 821 00:53:52,200 --> 00:53:55,600 Speaker 1: they are the way they are, what they feel. And 822 00:53:56,080 --> 00:53:59,480 Speaker 1: so that sort of curiosity about people, I think, has 823 00:54:00,360 --> 00:54:07,279 Speaker 1: somehow paid off for me. So let's get 824 00:54:07,320 --> 00:54:10,200 Speaker 1: to some of our favorite last few questions. I know 825 00:54:10,239 --> 00:54:13,319 Speaker 1: we only have you here for a finite amount of time. 826 00:54:13,640 --> 00:54:18,880 Speaker 1: These are the standard questions we ask all of our guests. 827 00:54:19,000 --> 00:54:23,759 Speaker 1: We discussed that you knew right away 828 00:54:23,760 --> 00:54:25,920 Speaker 1: you were going to be in academia your whole career. 829 00:54:26,320 --> 00:54:34,600 Speaker 1: Who were some of your early mentors? Well, I had professors. 830 00:54:34,680 --> 00:54:42,520 Speaker 1: I had professors I loved when I was an undergraduate 831 00:54:42,560 --> 00:54:45,680 Speaker 1: in Jerusalem. One in particular; I wouldn't call him a mentor, 832 00:54:45,840 --> 00:54:49,880 Speaker 1: but, you know, Yeshayahu Leibowitz, 833 00:54:49,920 --> 00:54:52,600 Speaker 1: that was his name. He was a hero to me 834 00:54:52,680 --> 00:54:55,759 Speaker 1: and to many other people. Really? Why is that? 835 00:54:55,880 --> 00:54:58,279 Speaker 1: What did he do that resonated with you so much? 836 00:54:59,120 --> 00:55:02,480 Speaker 1: He just was a very powerful personality. He had strong 837 00:55:02,520 --> 00:55:06,840 Speaker 1: opinions about everything. He knew a lot; he 838 00:55:06,840 --> 00:55:11,560 Speaker 1: had several doctorates; and he had a very 839 00:55:11,640 --> 00:55:16,480 Speaker 1: individual character. I loved him; he made a deep impression 840 00:55:16,520 --> 00:55:20,360 Speaker 1: on me. In graduate school, I had quite a 841 00:55:20,400 --> 00:55:25,160 Speaker 1: few teachers, especially the one who supervised my thesis, Ghiselli. 842 00:55:25,640 --> 00:55:27,920 Speaker 1: But I learned from quite a few people when I 843 00:55:28,000 --> 00:55:32,240 Speaker 1: was in graduate school. I don't think I ever had 844 00:55:32,600 --> 00:55:38,120 Speaker 1: a proper mentor in the sense that many graduate 845 00:55:38,200 --> 00:55:43,360 Speaker 1: students today do: they get into a lab and they're associated 846 00:55:43,400 --> 00:55:47,240 Speaker 1: with the same person throughout their graduate career. 847 00:55:47,440 --> 00:55:50,000 Speaker 1: That wasn't the case when I was in graduate school. 848 00:55:50,280 --> 00:55:53,279 Speaker 1: You didn't belong to a lab, and so I 849 00:55:53,320 --> 00:55:58,680 Speaker 1: didn't have that experience. So did any previous psychologists or 850 00:55:58,880 --> 00:56:04,600 Speaker 1: clinical experimenters influence your approach to what you did? Oh yeah, many. 851 00:56:04,680 --> 00:56:10,080 Speaker 1: Mainly a man named Paul Meehl, who 852 00:56:10,200 --> 00:56:15,239 Speaker 1: was the first to compare algorithms to clinical judgment and 853 00:56:15,360 --> 00:56:20,840 Speaker 1: to reach the conclusion that algorithms are actually more accurate 854 00:56:21,920 --> 00:56:27,919 Speaker 1: in many, many cases than intuitive judgment.
Now, 855 00:56:27,960 --> 00:56:32,160 Speaker 1: I hope I'm not misremembering it. Something you 856 00:56:32,239 --> 00:56:36,920 Speaker 1: wrote specifically said that even after we show people the success 857 00:56:37,040 --> 00:56:40,640 Speaker 1: ratio of the algorithms, they still wanted to stick with 858 00:56:40,760 --> 00:56:45,120 Speaker 1: their intuition. Yeah, that's absolutely true. I mean, people 859 00:56:45,120 --> 00:56:50,000 Speaker 1: don't love algorithms, and they'd better get 860 00:56:50,080 --> 00:56:52,319 Speaker 1: used to them, because they're taking over. Right, this 861 00:56:52,440 --> 00:56:55,839 Speaker 1: is happening. But, you know, even think about that: 862 00:56:59,160 --> 00:57:01,680 Speaker 1: when a doctor makes a mistake and a child dies, 863 00:57:01,800 --> 00:57:05,800 Speaker 1: this is terrible. But if it was a piece of 864 00:57:05,960 --> 00:57:09,360 Speaker 1: artificial intelligence that made a mistake and the child dies, 865 00:57:09,480 --> 00:57:13,120 Speaker 1: that would be worse, at least today, because when we trust 866 00:57:13,200 --> 00:57:17,200 Speaker 1: the machinery and the machinery fails, it's more shocking. It's more 867 00:57:17,280 --> 00:57:21,600 Speaker 1: shocking when it's impersonal. You know, when a self-driving 868 00:57:21,680 --> 00:57:25,120 Speaker 1: car has an accident... we just learned 869 00:57:25,120 --> 00:57:28,720 Speaker 1: about that... there's something more shocking about the idea. 870 00:57:30,280 --> 00:57:33,640 Speaker 1: So just think about somebody dying in 871 00:57:33,680 --> 00:57:37,800 Speaker 1: an ordinary accident, or somebody dying in an accident with a 872 00:57:37,840 --> 00:57:42,160 Speaker 1: self-driving car; that second idea is somehow 873 00:57:42,520 --> 00:57:46,400 Speaker 1: more shocking. So people really don't accept it, for the time being; 874 00:57:46,480 --> 00:57:49,000 Speaker 1: this is going to change. But there is such a thing; 875 00:57:49,040 --> 00:57:52,280 Speaker 1: it's been called algorithm aversion. Really, you know, when you 876 00:57:52,320 --> 00:57:54,919 Speaker 1: look at the statistics, let's use the self-driving car: 877 00:57:55,480 --> 00:57:59,919 Speaker 1: by all measures, they're safer, they're more reliable, they're 878 00:58:00,040 --> 00:58:02,800 Speaker 1: less likely to be involved in either a major or 879 00:58:02,840 --> 00:58:06,800 Speaker 1: a minor accident. That doesn't mean you're going to eliminate those sorts 880 00:58:06,800 --> 00:58:10,040 Speaker 1: of accidents, so when one happens, it seems to really 881 00:58:10,040 --> 00:58:15,200 Speaker 1: resonate with people. I mean, part of that is that there 882 00:58:15,200 --> 00:58:18,640 Speaker 1: are things that we view as natural, as, you know, 883 00:58:19,040 --> 00:58:23,080 Speaker 1: sort of acts of God, and those we've learned to accept. 884 00:58:23,440 --> 00:58:27,080 Speaker 1: But when it's acts of man, it's very different. So 885 00:58:27,720 --> 00:58:33,800 Speaker 1: the best example is vaccines. So vaccines, you know, could 886 00:58:33,880 --> 00:58:38,880 Speaker 1: cause side effects; they could. Let's take a child 887 00:58:38,960 --> 00:58:44,360 Speaker 1: who died from a vaccine. Now, how many children would 888 00:58:44,360 --> 00:58:48,800 Speaker 1: the vaccine have to save for us to accept the death of 889 00:58:48,880 --> 00:58:52,240 Speaker 1: one child from the vaccine?
It's clearly not one to one. 890 00:58:52,440 --> 00:58:55,280 Speaker 1: No, it's a lot more. Millions? Millions to one? I 891 00:58:55,280 --> 00:58:57,520 Speaker 1: don't know. I hope it's not millions, because that would 892 00:58:57,520 --> 00:59:00,840 Speaker 1: be crazy. But it's much more than one. 893 00:59:02,000 --> 00:59:05,320 Speaker 1: And so what happens here is that anything that 894 00:59:05,520 --> 00:59:09,520 Speaker 1: is man-made, we have a much stronger reaction to 895 00:59:10,360 --> 00:59:15,120 Speaker 1: than if the same thing occurs naturally. That's 896 00:59:15,200 --> 00:59:19,120 Speaker 1: quite fascinating. So let's talk about books. What are 897 00:59:19,160 --> 00:59:22,680 Speaker 1: some of your favorite books, be it fiction, nonfiction, Michael 898 00:59:22,720 --> 00:59:28,919 Speaker 1: Lewis or otherwise? Well, in nonfiction, the best book I've 899 00:59:28,920 --> 00:59:34,480 Speaker 1: read in several years is called Sapiens. Sure, it's 900 00:59:34,560 --> 00:59:41,040 Speaker 1: the brief history of humankind, Yuval Harari. Well, I 901 00:59:41,080 --> 00:59:44,439 Speaker 1: think it's superb. It's literally on my night table; it's 902 00:59:44,480 --> 00:59:46,400 Speaker 1: the next book up in my queue. If you're telling 903 00:59:46,440 --> 00:59:50,720 Speaker 1: me it's superb, I'd better start it soon. I read it twice. Really? 904 00:59:50,720 --> 00:59:54,000 Speaker 1: I don't do that with many books. Wow, that's 905 00:59:54,040 --> 00:59:57,480 Speaker 1: some endorsement. No, I think it's very impressive. 906 00:59:58,200 --> 01:00:01,800 Speaker 1: So that was one. Certainly Nassim 907 01:00:01,920 --> 01:00:06,560 Speaker 1: Taleb's book had a big effect on me. His last one, 908 01:00:06,680 --> 01:00:12,200 Speaker 1: or Fooled by Randomness? Mostly The Black Swan; you know, 909 01:00:12,280 --> 01:00:15,760 Speaker 1: I had liked Fooled by Randomness too, but The Black 910 01:00:15,800 --> 01:00:22,200 Speaker 1: Swan I learned a lot from. And then obviously there are 911 01:00:22,320 --> 01:00:29,560 Speaker 1: works by people I know, like Nudge, by Thaler. Thaler, 912 01:00:31,080 --> 01:00:35,960 Speaker 1: I mean. So those are some of your favorites. 913 01:00:36,000 --> 01:00:39,480 Speaker 1: So, you've mentioned you've been doing this for fifty years. 914 01:00:39,520 --> 01:00:43,040 Speaker 1: What has changed in the field of psychology that really 915 01:00:43,120 --> 01:00:47,320 Speaker 1: stands out to you over that long period? You know, 916 01:00:47,400 --> 01:00:55,320 Speaker 1: when I started my studies, a lot of psychology was 917 01:00:55,360 --> 01:01:01,360 Speaker 1: still concerned with rats running mazes, and so what 918 01:01:01,520 --> 01:01:06,480 Speaker 1: is called the cognitive revolution occurred during my career, early 919 01:01:06,520 --> 01:01:10,680 Speaker 1: in my career. So people sort of, you know, 920 01:01:11,080 --> 01:01:15,520 Speaker 1: just set rats aside, mostly, except for physiological work. And 921 01:01:15,560 --> 01:01:17,960 Speaker 1: when they wanted to study how the mind works, they 922 01:01:18,000 --> 01:01:21,560 Speaker 1: studied how people think. So that was a big change, 923 01:01:22,120 --> 01:01:25,840 Speaker 1: and there have been many other changes in recent years.
924 01:01:26,320 --> 01:01:30,000 Speaker 1: Mostly it's the study of the brain that is taking over, 925 01:01:30,680 --> 01:01:33,880 Speaker 1: and, you know, that's huge: the fMRIs. 926 01:01:34,000 --> 01:01:37,360 Speaker 1: That, you know, came too late for me 927 01:01:38,080 --> 01:01:40,960 Speaker 1: to be involved in, but if I had been younger, 928 01:01:41,200 --> 01:01:44,480 Speaker 1: I would have done what younger colleagues of mine did: 929 01:01:44,840 --> 01:01:48,360 Speaker 1: they switched. Is that going to be the biggest 930 01:01:48,440 --> 01:01:51,120 Speaker 1: change going forward, the ability to peer into the 931 01:01:51,200 --> 01:01:54,640 Speaker 1: brain while it's in operation? I'm convinced that that's the 932 01:01:54,720 --> 01:01:58,160 Speaker 1: case for the next few decades. Really, a few decades? Wow, 933 01:01:58,200 --> 01:02:00,560 Speaker 1: that's amazing. All right, so we're up to our 934 01:02:00,680 --> 01:02:05,840 Speaker 1: last two questions, our final two questions. If a millennial 935 01:02:06,160 --> 01:02:09,600 Speaker 1: or a recent college grad came up to you and said, hey, 936 01:02:09,640 --> 01:02:14,680 Speaker 1: I'm thinking about a career in experimental psychology, what 937 01:02:14,680 --> 01:02:18,600 Speaker 1: sort of advice would you give them, whether to 938 01:02:18,640 --> 01:02:21,800 Speaker 1: go into that or into something else? Or, well, they said, 939 01:02:21,800 --> 01:02:24,960 Speaker 1: I'm considering this career, what advice do you have 940 01:02:25,080 --> 01:02:27,400 Speaker 1: for me? What would your answer be? You must 941 01:02:27,400 --> 01:02:30,680 Speaker 1: have had students at Princeton ask you all the time. 942 01:02:30,760 --> 01:02:35,240 Speaker 1: Well, I would ask what kind of psychologist you 943 01:02:35,240 --> 01:02:37,600 Speaker 1: want to be, and are you sure you want to be 944 01:02:37,720 --> 01:02:41,800 Speaker 1: that kind of psychologist. And there are certain lines of study, 945 01:02:42,200 --> 01:02:46,160 Speaker 1: like going to graduate school, that are not for everybody, and 946 01:02:46,280 --> 01:02:49,680 Speaker 1: some very, very smart people who could succeed in graduate 947 01:02:49,760 --> 01:02:54,200 Speaker 1: school should really not do it, because they will 948 01:02:54,280 --> 01:02:57,360 Speaker 1: be happier doing something else than being an academic. An 949 01:02:57,400 --> 01:03:00,760 Speaker 1: academic's life is good for a 950 01:03:00,840 --> 01:03:04,520 Speaker 1: minority of people, and for most others, you know, 951 01:03:04,640 --> 01:03:10,480 Speaker 1: it's not great. So I would never have 952 01:03:10,520 --> 01:03:14,400 Speaker 1: ready-made advice. It really depends on 953 01:03:14,440 --> 01:03:18,360 Speaker 1: the individual. And our final question: what is it 954 01:03:18,440 --> 01:03:22,600 Speaker 1: that you know about psychology and the human mind today 955 01:03:22,640 --> 01:03:25,880 Speaker 1: that you wish you knew when you began fifty years ago? 956 01:03:32,480 --> 01:03:35,880 Speaker 1: I can't think of that, actually. I mean, it's very odd.
957 01:03:36,760 --> 01:03:40,640 Speaker 1: You know, fifty years ago I didn't know many of 958 01:03:40,680 --> 01:03:44,320 Speaker 1: the things I know today, and today I know a 959 01:03:44,440 --> 01:03:47,760 Speaker 1: negligible amount compared to what people will know, you know, 960 01:03:47,840 --> 01:03:51,120 Speaker 1: fifty years from now. Well, but you've also... I don't 961 01:03:51,320 --> 01:03:54,240 Speaker 1: regret not knowing then what I know now. I mean, 962 01:03:54,320 --> 01:03:58,480 Speaker 1: discovering things and learning things has been so much fun 963 01:03:58,840 --> 01:04:02,240 Speaker 1: that I have no regret about what I didn't know then. 964 01:04:02,320 --> 01:04:04,840 Speaker 1: Maybe regret's the wrong word. What would have been 965 01:04:04,920 --> 01:04:08,440 Speaker 1: helpful to have known that you discovered later on in 966 01:04:08,480 --> 01:04:13,919 Speaker 1: your career? I mean, there are things 967 01:04:14,000 --> 01:04:17,000 Speaker 1: that I learned even in recent years, in the last 968 01:04:17,120 --> 01:04:22,920 Speaker 1: few years. There's a big change in psychology, 969 01:04:23,400 --> 01:04:27,760 Speaker 1: in that people are suspicious, and there is what is 970 01:04:27,800 --> 01:04:32,720 Speaker 1: called the reproducibility crisis. Sure, that's across all sciences. 971 01:04:32,960 --> 01:04:36,400 Speaker 1: And it's across all sciences. That is something I wish I had 972 01:04:36,440 --> 01:04:41,960 Speaker 1: been more careful about, in particular because I had thought 973 01:04:41,960 --> 01:04:44,680 Speaker 1: about it; I had all the tools to see it, 974 01:04:45,040 --> 01:04:51,560 Speaker 1: and I was not sufficiently aware of it. Dr. Danny Kahneman, 975 01:04:51,720 --> 01:04:54,680 Speaker 1: this has been absolutely fascinating. Thank you so much for 976 01:04:54,760 --> 01:04:59,080 Speaker 1: being so generous with your time. If you have enjoyed 977 01:04:59,120 --> 01:05:01,320 Speaker 1: this conversation, be sure and look up an inch 978 01:05:01,440 --> 01:05:04,760 Speaker 1: or down an inch on Apple iTunes, and you can see 979 01:05:04,800 --> 01:05:09,280 Speaker 1: the other one hundred or so such conversations we've 980 01:05:09,320 --> 01:05:13,080 Speaker 1: had over the past two years. I would be 981 01:05:13,120 --> 01:05:18,240 Speaker 1: remiss if I did not thank my producers, Taylor Riggs 982 01:05:18,240 --> 01:05:22,320 Speaker 1: and Charlie Vollmer, Charlie also working as recording engineer today, 983 01:05:22,640 --> 01:05:25,520 Speaker 1: and of course our head of research, Michael Batnick, who 984 01:05:25,520 --> 01:05:29,240 Speaker 1: has been a huge help in preparing these questions. 985 01:05:29,800 --> 01:05:33,960 Speaker 1: We love your comments, questions, and suggestions. Be sure 986 01:05:33,960 --> 01:05:36,720 Speaker 1: and send us an email. You can write me at 987 01:05:36,760 --> 01:05:41,439 Speaker 1: BRitholtz3 at Bloomberg dot net. You've been 988 01:05:41,480 --> 01:05:44,800 Speaker 1: listening to Masters in Business on Bloomberg Radio.