Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

Speaker 1: This weekend, on the podcast, I have an extra special guest. His name is Tom Gilovich. He is a professor of psychology at Cornell University, and I have to say he is a person who has had, over the years, tremendous influence on my career, although admittedly unknowingly. I began in this industry as a trader, and I was frequently perplexed and fascinated by why the people on the desk around me were doing either poorly or well any given day, week, or month. And I eventually figured out it wasn't that one guy was smarter than another or someone had more knowledge; it was their own behavior that led to their success or failure. And so I started hunting for some information about why people made the behavioral decisions that they did. I ended up tracking down a book of his, How We Know What Isn't So, and that pretty much sent me down the rabbit hole of behavioral economics, which has been a tremendous asset to me both in the world of finance and in media. Understanding what motivates people to make either good or bad decisions with their money is a tremendous asset, both for what you should be doing with your own money and for what you should be advising other people to do with theirs. This is, to me, one of the more fascinating conversations that you'll hear if you are at all interested in, fill in the blank: psychology, behavioral economics, heuristics, biases, et cetera. I could have gone on for another three hours with him; I barely got to scratch the surface of all my questions. With no further ado, my conversation with Tom Gilovich.

Speaker 1: I have an extra special guest this week, and his name is Professor Thomas Gilovich. He is a professor of psychology at Cornell University. He has published numerous peer-reviewed works on cognition and heuristics and is among the most cited academics in the field working on behavioral psychology and economics.
His work has debunked the idea of the hot hand in basketball, and the spotlight effect, the bias blind spot, the clustering illusion, and numerous other cognitive issues are in his purview. He is the author of numerous books, including a textbook on heuristics and biases. He is the co-author of Why Smart People Make Big Money Mistakes and How to Correct Them. I became familiar with his work through How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. It was one of the first popular books on behavioral economics, and one that I found incredibly influential. Thomas Gilovich, welcome to Bloomberg.

Speaker 2: Happy to be here.

Speaker 1: Let's start out with How We Know What Isn't So. You begin the book with a quote from Artemus Ward: "It ain't the things we don't know that get us into trouble. It's the things we know that just ain't so." Tell us about that.

Speaker 2: Well, if we're convinced of things that aren't true, we're gonna go down certain paths that aren't going to be productive, and that quote captures the idea very well. And what I like about that quote is that most people attribute it to Mark Twain or Will Rogers, and the source I thought at the time was Artemus Ward. And maybe two years after the publication of the book, a reader wrote to me and said, well, it's really interesting, you're sort of telling everyone they got it wrong with Will Rogers and Mark Twain, but in fact it goes back even earlier than Artemus Ward, to someone named Josh Billings that I hadn't heard of. So it's kind of ironic: I started off a book on illusion and error, and the very first sentence of the book contains a citation error, at least.

Speaker 1: But that citation error did not lead you down a path filled with errors. Let's discuss a little bit about the things that you discovered and published in the book, and we'll begin with a very simple question: what are heuristics and biases?

Speaker 2: Well, the bias part is quite easy.
It's a term that people are familiar with: when there's a systematic departure between a belief that you have and reality, or a tendency to veer in one direction when you should be veering in another direction. So it's a systematic departure from reality, or from the best assessment of reality. Heuristics, that's a little more complicated for most folks. It's generally defined in the behavioral economics world as a rough approximation, seat of the pants.

Speaker 1: A rule of thumb is another way of describing it.

Speaker 2: Yes.

Speaker 1: And why do these heuristics lead people down the wrong path?

Speaker 2: Because they generally work pretty well. One of the earliest examples applied to psychology was Daniel Kahneman and Amos Tversky's: we use the clarity of an image as a cue for how far away it is. The farther away things are, the harder it is to see them clearly, so they will seem more indistinct, and that generally works pretty well; we're able to tell whether something's very far away or relatively close up. But on a hazy day, that's gonna make things seem farther away than they really are. And conversely, on a spectacularly clear day, you often have the reaction of, whoa, I didn't realize those mountains were that close.

Speaker 1: Let's use another example of some biases. You're on line in the supermarket and your line doesn't seem to be moving, and the line next to you really looks like it's flying. Or you're waiting at a toll booth, I know parts of the country still have toll booths, and you switch lines, and suddenly the line you're on comes to a dead halt and the line you just left seems to be moving. What is it about our life experience that causes that illusion? Or is it an illusion?

Speaker 2: That one,
as far as the grocery lines, I don't know if anyone has formally studied it, but you have to ask what principle of the universe there would be that systematically distorts things such that whenever you move to a line, it slows down, and the line that you were in suddenly speeds up. But it's easy to explain why people would believe that, even if it's not true. That is to say, those times when you stay in your overly busy line, you're tempted to move to another one, and it turns out that you can see that that would have been the better thing: your line stays slow, you can see people speeding through the other line. That bothers you, but you get over it. If, on the other hand, you make the opposite mistake, you switch to the seemingly speedy line and it slows down, and the line you were in suddenly speeds up, you're gonna kick yourself and say, wow, why did I do that? I was in the right line. I brought this on myself. And it's more annoying. And because it's more annoying, it's more memorable, and therefore you're gonna have a distorted sense in your head of how common it is. Very much like the belief in the sports world, the baseball world, that if your pitcher has a no-hitter in progress, you don't comment on it. And that's partly fed by the idea that if your pitcher does have a no-hitter and you say, oh, we've got a no-hitter going, this is great, and then they lose it, you draw an association between those two events. Those are gonna stand out, and you're gonna think that what you've done has played some determinative role, which of course it hasn't.

Speaker 1: So let's talk a little bit about mean reversion.
Very often, when we see things that are outliers to the upside or the downside, we're sort of surprised when the next item in that series is not as extreme, be it how fast the line is moving, or how easily a no-hitter is lost and things go back to normal. Why do people have such a hard time with mean reversion?

Speaker 2: That's a great question, and there are a number of things that contribute to this belief, to the failure to recognize that regression to the mean is happening. One of them is the same story that I've described before: when it reverts, and particularly when it reverts after you've done something, that stands out in your memory more and distorts the intuitive database that you have in your head. And so there are a lot of superstitions that are essentially a failure to recognize the operation of regression, the Sports Illustrated jinx being one of them. It's believed that if you get your picture on the cover of Sports Illustrated, that's bad luck. And it doesn't take that much insight to recognize that that really is a mean reversion account. That is, you only get your picture on the cover of Sports Illustrated if you've had a run of success, and extraordinary success at time one is going to be followed not by abject failure but by a somewhat less extreme run of success afterwards. So you're right there at the peak; on average, people are going to do less well the next time, and that gives rise to this belief that it's bad luck to be pictured on the cover of Sports Illustrated magazine. And people, athletes, truly believe it. Some of them have turned down the opportunity to be on the cover of Sports Illustrated simply because they thought it was bad luck.
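Regression to the mean is easy to demonstrate numerically. The following is a minimal Python sketch, not from the interview and with arbitrary illustrative parameters: each athlete's observed performance is modeled as stable skill plus random luck, and we track how the top period-one performers fare in period two.

```python
import random

random.seed(42)

N = 100_000
athletes = []
for _ in range(N):
    skill = random.gauss(100, 10)          # stable underlying ability
    period1 = skill + random.gauss(0, 10)  # observed result = skill + luck
    period2 = skill + random.gauss(0, 10)  # fresh luck draw the next period
    athletes.append((period1, period2))

# "Cover of Sports Illustrated": select the top 1% of period-one results.
athletes.sort(key=lambda a: a[0], reverse=True)
covers = athletes[: N // 100]

avg1 = sum(a[0] for a in covers) / len(covers)
avg2 = sum(a[1] for a in covers) / len(covers)
print(f"Top 1% average, period one: {avg1:.1f}")   # far above the mean of 100
print(f"Same athletes, period two:  {avg2:.1f}")   # falls partway back to 100
```

The "cover" athletes land roughly halfway back toward the population mean the next period, with no jinx required: the luck that got them selected simply isn't repeated.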
Speaker 1: Let's talk a little bit about the hot hand in basketball. You first wrote about this effect with Amos Tversky. Is it about thirty years ago? Is that right?

Speaker 2: Yes, that's about when the paper came out.

Speaker 1: So tell us what your studies found and how you went about proving that the hot hand wasn't everything it appeared to be.

Speaker 2: The belief in the hot hand is just one of the most powerful and firmly held beliefs that we have. If you've ever played or watched the game of basketball, it just seems like people get on these streaks where they can do no wrong.

Speaker 1: I am in the zone.

Speaker 2: Absolutely, and it's incredibly compelling. Once you've made a few shots, the game just seems easier. It seems like you don't even have to attend to the basket as much as normal, and it just goes in. It's absolutely compelling.

Speaker 1: Michael Jordan used to say there are times when the rim looks bigger.

Speaker 2: Absolutely. And we wondered whether there was less to that belief than basketball players think. We did not want to challenge the feeling. There's no question that when you've done well, you feel differently, and when you've done poorly, you feel differently; the basket seems to have shrunk, and it seems like the best-aimed shot is going to go partly into the cylinder and pop right out. No questioning that. But the belief in the hot hand is really a three-link chain: previous performance affects how you feel, and how you feel affects subsequent performance. We were interested in the link between the second and the third, and the thought was that we exaggerate it, for many of the same psychological principles we've already talked about. And when we tested it, our hypothesis was that people are going to exaggerate how much heat there is, how much streakiness there is in performance. And when we did the analyses, it turned out there was no connection between links two and three. That is, how you feel doesn't seem to influence how you perform in the future, at least at the professional level.

Speaker 1: So let's be really precise with that. When you say there is no link, there's not even a slight improvement?
Hey, I'm loose, I'm hitting shots, I'm feeling good with each subsequent basket. What is the correlation with future success, whether I made a few in a row or I missed a few in a row?

Speaker 2: Two things are important there. One is, there may be some ways in which, once you've experienced some success, you feel different, and it might influence your future behavior in ways that we haven't studied. Put aside the hot hand idea; maybe having done well, I'm a more energetic player, I play better defense, I get more rebounds. We don't know about that, and it may be true. What we did look at is, do you become a better shooter? Not, do you take more shots; you probably do take more shots, because you're feeling hot. The question is, are you more likely to make a shot after having succeeded in the past versus after having failed? And the data show, and this has remained, you know, controversial over thirty years, there's been back and forth; all the challenges to it for decades have not stood up, though there's a recent challenge that's among the more interesting ones, so I have to say it remains controversial. Our original hypothesis isn't controversial. That is to say, people wildly overestimate the extent to which people are hot, or how streaky people are. Whether there's any sort of carryover, that remains a bit controversial.

Speaker 1: So some of the pushback has been, you know, a player gets hot and the defense collapses on them, and they have less good looks at the basket, and so naturally after a certain streak they're gonna start missing. But you don't have that same defensive pressure with foul shooting, do you?

Speaker 2: That's right.

Speaker 1: And what does the data show with that?

Speaker 2: The data show that the outcome of the second free throw is completely independent of the outcome of the first.
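The free-throw result is the kind of conditional analysis the hot hand paper ran. Below is an illustrative Python sketch on simulated data, not the study's code or data: generate a shooter whose makes are independent coin flips, then compare hit rates right after streaks of makes versus streaks of misses.

```python
import random

random.seed(7)

P_MAKE = 0.5          # illustrative shooting percentage
N_SHOTS = 1_000_000
shots = [random.random() < P_MAKE for _ in range(N_SHOTS)]

def rate_after(streak_len, made):
    """Hit rate on shots that immediately follow `streak_len`
    consecutive makes (made=True) or misses (made=False)."""
    hits = total = 0
    for i in range(streak_len, N_SHOTS):
        if all(s == made for s in shots[i - streak_len:i]):
            total += 1
            hits += shots[i]
    return hits / total

print(f"P(make | 3 straight makes):  {rate_after(3, True):.3f}")
print(f"P(make | 3 straight misses): {rate_after(3, False):.3f}")
# Both print roughly 0.500: for an independent shooter, a streak carries
# no information about the next shot, the same pattern the real shooting
# records showed.
```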
Speaker 1: Totally. So if you hit or miss the first one, the outcome of the second one is statistically no different?

Speaker 2: Right. And a lot of basketball players and fans will say, well, I'm not so impressed by that, because you can't really be streaky when it comes to free throws. Now, they say that after the fact, but I mean, you're standing there, nobody's guarding you.

Speaker 1: Think of Reggie Miller of the Indiana Pacers; he was an outstanding free throw shooter. Why would people not assume, if there's a streak when you're on the court, why would you not assume there's a streak at the foul line?

Speaker 2: I think they say that after having seen the data, and they want to maintain that belief. But we went a step further, and we conducted, and other people have done this too, studies where you have people shoot in the gym for you. And the feeling, for all of us who've played basketball, is that you can feel it in warm-up sometimes; it doesn't have to be in the heat of the game that you can feel hot. And we had people take a series of shots along an arc equidistant from the basket, and before each shot they placed a bet on themselves: a risky bet if they're feeling hot, or a more conservative bet if they're not feeling so hot. It turns out we can predict the bets they're going to make. That is, if they've made several shots in a row, that is a very strong predictor of whether they're going to choose the risky bet or not. However, the bets that they choose are not very good predictors of what's going to happen next. Again, this three-link chain: no problem between the first and the second, how you've done influences how you feel. But surprisingly, how you feel has either no or very little impact on your likelihood of making the next shot. And again, I want to stress, that has to do with your ability to, as they now say, score the basketball, get the ball in the cylinder. It may affect you in other ways; it may make you a better defender, a better passer, or whatever.
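The warm-up betting study also lends itself to a quick simulation. This sketch uses assumed parameters (an independent shooter, and bettors who choose the risky bet 80 percent of the time when "hot" versus 20 percent otherwise), not the experiment's actual data.

```python
import random

random.seed(3)

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5

P_MAKE = 0.5                    # illustrative, fixed shooting skill
N = 200_000
shots = [random.random() < P_MAKE for _ in range(N)]

hot, bets, nxt = [], [], []
for i in range(2, N):
    feeling_hot = shots[i - 1] and shots[i - 2]   # made the last two shots
    # Shooters mostly, but not always, choose the risky bet when hot.
    risky = random.random() < (0.8 if feeling_hot else 0.2)
    hot.append(int(feeling_hot))
    bets.append(int(risky))
    nxt.append(int(shots[i]))

print(f"corr(recent makes, risky bet): {corr(hot, bets):.3f}")  # ~0.5
print(f"corr(risky bet, next make):    {corr(bets, nxt):.3f}")  # ~0.0
```

Streaks strongly predict the bets, but the bets say nothing about the next shot, mirroring the intact first link and the broken third link in the chain described above.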
Speaker 1: So about a year and a half ago I read the book by Michael Lewis, The Undoing Project, about Danny Kahneman and Amos Tversky. You wrote the hot hand paper with Amos Tversky. What was it like working with him?

Speaker 2: I mean, it was certainly a highlight of my career, and I subsequently had the great opportunity to work with Danny Kahneman as well. Amos was brilliant, as comes across in The Undoing Project, and one of the things that I most liked about working with him, one of the great lessons, and this applies equally well to Kahneman: they're both brilliant, and it's fun to be in close proximity to brilliance, but that's not enough. Both of these guys were incredibly hard working and detail oriented. And one of the things I learned from Amos is kind of similar to the saying people have about golf, that you drive for show and you putt for dough. When we were basically done with our hot hand paper, in fact, as a young person I thought we were done, he would, several times, say, Tom, come in here, I've moved this sentence around here, and I've changed this word. And the lesson was, if someone of his stature is really sweating the details right here at the very end, just making the paper a little better in these marginal ways, I can do that too. And it's just been a very helpful lesson: you're never really done, and things can always be made better. And both Kahneman and Tversky are, you know, into what they're doing, they passionately care about it, and it shows in their ability to throw themselves into it. Being brilliant is great, and both of them are, but to get to their level you also have to have a great capacity for hard work and great enthusiasm for what you're doing, which both of them did.

Speaker 1: Brilliance is necessary but not sufficient. Well said.
Let's talk a little bit about some of your research on happiness, and how to either get happy or avoid unhappiness. Let's begin with a discussion of regret. What is it that people regret more, the things they do or the things that they decide not to do?

Speaker 2: I think one thing that's interesting about that question is that it depends upon when you're talking about it. In the immediate aftermath, a mistake of action just hurts a lot more than an omission. And you see that everywhere. It plays out in the example that you started with, about the grocery store line: if you switch from a slow line to one that's seemingly faster and it slows down, you just kick yourself. I was in a better line, why did I switch? The example at a university, of course, that I think everyone can relate to is people taking multiple choice tests. You have the common experience of zipping along, oh, this one's B, and, oh, wait a minute, maybe it's D, and you go back and forth: should I stick with my initial hunch or should I switch to what I now think it is? It turns out people have studied that, and if you face that dilemma, you're much more likely to get the right answer by switching than by staying. Students believe the complete opposite, that it's better to go with your initial gut instinct. So their belief is incorrect; it doesn't fit the data that's been observed. But it's easy to see why they would believe it. That is, if you thought it was B, and then you convinced yourself, let me switch to D, you erase it, and now you're endorsing D, and it turns out B was right, you're gonna kick yourself: I had the right answer, I knew it.

Speaker 1: You did. So, short term, the bias against action is a manifestation of regret aversion; we don't want to do something that we'll ultimately regret. But then how does that play out with the sort of deathbed statements we've seen from people, I should have done X?
Speaker 2: Yeah, that's the interesting part, this change from close temporal proximity to a distant perspective. Over time, because our mistakes of action are so painful, we do things about them. We make amends to other people. We engage in what psychologists call cognitive dissonance reduction to make ourselves feel better about it, and part of that is identifying a silver lining. If you ask people about some of their biggest regrets, people will say, oh, I married the wrong person. But they will say, yeah, that was a mistake, but I wouldn't have these great kids if I hadn't done that. So the marriage ended, but it turned out to have been a good thing, because I have these kids that I love. And so there's all this mental stuff you do to make yourself more comfortable with at least many mistakes of action. Whereas the things that you didn't do, rather than shrinking over time, often grow. You often think, oh, you know, if I had only taken that class, I would have gone into this profession rather than this one, and you imagine all the great things that could have happened. So over time, regrets of inaction either don't shrink the way the action ones do, or they even grow. And so, as you said, on people's deathbeds they regret more the things that they didn't do. I could have been a contender; I could have been this, I could have been that.

Speaker 1: So actions become rationalized and we learn to deal with our mistakes, but inactions blow up in our minds to become these mythic if-onlys. So let's talk about something sort of related to that. You created a little bit of a stir not too long ago expressing the belief that, you know, we're a consumer society, and everybody wants the 2.3 kids and the four-bedroom house and the convertible car and the boat or what have you.
But your conclusion is that experiences are more valuable to individuals than these baubles and goods. Explain how you came to that conclusion and what it means.

Speaker 2: That line of research stemmed from a finding, probably the biggest finding in the now quite extensive research on happiness or well-being, which is that we have a remarkable capacity to adapt to things. And that's a good capacity when bad things happen to us. Oh no, I've lost my job, life's going to fall apart. Well, that is a bad thing, and you are miserable for a while, but we tend to adapt to it. Or many people think, oh, I don't think I'd even want to live if I lost the use of my legs, let's say. Well, it turns out that people who have experienced that, yes, they are miserable right away, but they adapt and live lives that are as fulfilling as those of people who haven't had that misfortune. And this is probably the biggest fact about the study of happiness. So, applying it to consumption, the things that we buy: you know, if I trade in my Camry and get a Lexus, I'll be happier. Yeah, that's true, you will be, for a little while. Pretty soon it's just a car like the other car, and you don't notice it as much. And so one challenge for happiness researchers is, if adaptation is an enemy of happiness, how do you combat that enemy? The great judgment and decision making researcher Robyn Dawes had a thought experiment where he was talking about this hedonic treadmill, I need more and more to get the same level of happiness. He said, imagine you devoted your life not to selfish pursuits, accumulating more, but to selfless ones; you were just trying to do good in the world. I don't know if he used, I think he did use, the Mother Teresa example. Mother Teresa saves five people this week. She probably isn't, the next week, going, five people doesn't cut it, I need to save more, I need to save more.
And it's a compelling thought experiment. It seemed like those kinds of experiences, at least, weren't subject to this adaptation, this hedonic treadmill. And so the question became, how broadly does that apply to experiences? And the hypothesis was, the money that you spend on experiences, the pleasure that you get out of your experiences, you don't adapt to as much. Which in some ways is paradoxical. That is, most people have limited resources, and people often say, you know, I would love a vacation right now, but we really need a new set of bookcases, or we need a new bed, or we need a new car or whatever, and at least that will always be there. And that's true in a material sense, but psychologically it's the reverse. You adapt to the new bookcase, you adapt to the upgrade in your car, the things that you buy, and it turns out you don't adapt as much to your experiences. Your experiences change who you are; you're a changed person, and you continue to benefit from that. Your experiences connect you to other people in ways that your material goods don't, and that continues to be a gift that keeps on giving. And so it turns out, and a lot of research has shown this, that even though the experience comes and goes, possibly in a flash, it lives on psychologically and provides more enduring satisfaction and enjoyment.

Speaker 1: How much of the material issue is based on the very natural tendency of ours to not be very good at predicting our own future happiness or our own future emotional state? It seems we build up these things we want, be it a house or a car, whatever, and then when we actually get it, it disappoints our expectations of this grand change of lifestyle.

Speaker 2: Well, you've identified a very interesting line of research. It would seem like one of the easiest things to predict would be how happy we're going to be; we're making predictions about ourselves.
Predicting the world, of course, is hard, but it would seem to be easy to predict how we're going to react. And I think it's just a very compelling line of research, which many investigators have now done, showing that we're not such good prognosticators about our own enjoyment. One of the examples that I most like in this area is that people feel, I'll be happier if I have a bigger house. And when you first move into it, you will be happier, again, before you adapt to this new amount of space and the bigger house becomes the norm, and now you feel like you need even more. Anyway, in order to afford that bigger house, people often have to live farther away from their job, and they have a longer commute. And one of the exceptions to this tendency that I described, this remarkable capacity for adaptation, is that people don't tend to adapt to the trauma of a long commute. It's not surprising why that would be the case. That is, it's an amorphous, highly variable thing; it's a different version of hell each time you're driving. Sometimes it's a snarl over here, another time it's a nasty motorist there. It's different, and you don't adapt to it. And so you've made this terrible trade-off: you think you're gonna be happier with the bigger house, you get the bigger house, you are happy for a while, then you adapt to it, and you're saddled with this long commute to which you don't adapt, and so you've lost out in the bargain.

Speaker 1: So here's a quote from one of your books: "People do not hold questionable beliefs because they have not been exposed to the relevant evidence." Why do they hold questionable beliefs?

Speaker 2: Well, for a variety of reasons, and that's what the book was about, exploring those different reasons. One is that the world doesn't play fair. That is, it doesn't put you through a series of controlled experiments.
It highlights certain data and says, hey, look at me, and it puts other data in the shadows. And as a result of that, the database we have in our heads is going to be somewhat biased. We've already talked about a few examples of that. That is to say, when you switch lines and it slows down, that's gonna leave more of an impression than when you switch lines and it speeds up, and that's going to distort your database. Cornell University, when it decides who gets to go there, probably congratulates itself, because we look around at the student body and they're terrific, and we say, boy, we're really good at selecting people. Now, of course, we don't really know that, because we don't know about the people we rejected who could be there, and maybe they'd do just as well. That is, people self-select; only really good students, by and large, are even applying to Cornell, and maybe we don't do as good a job as we think. The world can't show us the people that we didn't accept, and therefore it's hard to really know whether we're doing a good job. But the impression is there all the same. We look around at the student body, you can't turn that impression off, the students look great, and so we think we did a good job.

Speaker 1: You once called confirmation bias the mother of all biases. Tell us why.

Speaker 2: Boy, that's a term that has been in the news all over lately, and there are many things that people mean by confirmation bias; there is some nuance in confirmation bias and in the academic study of it.

Speaker 1: So let me break that down. I think what most people think of when they think of confirmation bias, if we're talking about Facebook and fake news, is people going out seeking things that confirm existing beliefs. That's the broad definition of confirmation bias. But in some of your research and some of your studies, I was kind of fascinated by this:
When people are presented with a problem, rather than seeking disproving evidence, they go out and seek confirming evidence, and you can end up with the exact wrong answer, or the same answer regardless of the question, based on how it is framed. The example in one of the books is a court that has to decide which parent to award custody of a child to. Parent A is pretty much average across all the major characteristics the court looks at. Parent B has some very high points and some very low points. Depending on how the question is phrased, which parent should be awarded custody or which parent should not be awarded custody, everybody ends up picking the parent with the high points and low points, because that's what confirms whatever the question is asking. Is that what you were referring to, the more nuanced aspect?

Speaker 2: Very much, and that's great that you broke it down that way. That is to say, people are very familiar with the idea that we're easier on evidence that supports what we want to believe, and we're hard on evidence that rebuts what we want to believe. That's one part of the confirmation bias, the part we recognize. People are also aware of the fact that if I want something to be true, I actively go out and look for evidence consistent with it, and if I want something not to be true, I'll actively go look for evidence that's inconsistent with it. We can call that the confirmation bias too; there's evidence to support that. But the confirmation bias is even more insidious, more pervasive than that. That is, even when you don't care what the right answer might be, like should you award custody to this person or that person, we still engage in it. One example that I think makes this clear: suppose I'm hosting a dinner party and I tell you you're gonna be sitting next to John. I think John's politically conservative.
You might want to test that out; I'm not sure. How would you test that out? Well, you would ask conservative-oriented questions. You would say, John, don't you think the government is too intrusive sometimes? And he, as a conservative, would say yes. But even a liberal would say, well, sometimes, sure, the government's too intrusive. If I told you, hey, you're gonna be sitting next to John, I think he's politically liberal, you might want to check that out, and you might ask a very different question: John, don't you think we need to do something about global warming? Don't you think we need to do something about income inequality? And if he's liberal, he'll say yes. But even if he's a conservative, at least when it comes to income inequality, he's gonna say, yeah, we need to do things about it, maybe different things than a liberal would, but the thrust is, you're gonna get answers that support what you initially believed. All of this is to say that the confirmation bias runs really deep. And one of my favorite examples of this is an old study of Kahneman and Tversky's, excuse me, of Amos Tversky's; this was before he was interested in the kind of judgmental biases that economists have become interested in. He was just interested in judgments of similarity. Why do we say that having a kid is a little bit like having a dog, but having a dog is nothing like having a kid? Why that discrepancy? Why is North Korea a lot like China, but China is not really much like North Korea? And so he did this study, this was in the nineteen eighties, where he asked people which two countries are more similar to one another, East Germany and West Germany, or Ceylon and Nepal. And the respondents all know more about East Germany and West Germany, so they can think of more reasons, more ways in which they are similar, and so they say East Germany and West Germany are more similar.
All of this is to say that the confirmation bias runs really deep. And one of my favorite examples of this is an old study of Kahneman and Tversky's, excuse me, of Amos Tversky's; this was before he was interested in the kind of judgmental biases that economists have become interested in, and he was just interested in judgments of similarity. Why do we say that having a kid is a little bit like having a dog, but having a dog is nothing like having a kid? Why that discrepancy? Why is North Korea a lot like China, but China is not really much like North Korea? And so he did this study, this was in the nineteen eighties, where he asked people which two countries are more similar to one another: East Germany and West Germany, or Ceylon and Nepal. And all the respondents know more about East Germany and West Germany, so they can think of more reasons, more ways in which they are similar, and so they say East Germany and West Germany are more similar. Okay, no big deal there; no one's made an error. He asked another group of participants which two countries are more dissimilar to one another, and they say East Germany and West Germany are more dissimilar to one another, because they know more about East Germany and West Germany and can think of more ways in which they are different. And you end up with the paradoxical result of East Germany and West Germany being both more similar and more dissimilar. Logically impossible, but psychologically, the rules of similarity and dissimilarity judgments are such that both can seem very compelling. And it's the confirmation bias: you ask me how similar they are, I look for evidence of similarity, and I can find more of it for the countries I know more about than for the countries I know less about. You ask me which two countries are more dissimilar, I'm looking for evidence of dissimilarity. I don't even care which two countries are more similar or dissimilar. It just shows you how deep the confirmation bias runs.
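[Editor's note: a toy illustration of the mechanism described here, not Tversky's actual model or data. If each question triggers a search for matching features, the better-known pair can top both rankings. The feature lists are invented.]

```python
# Sketch: judgments that count whatever features come to mind make a
# familiar pair look BOTH more similar and more dissimilar.

well_known_pair = {  # hypothetical features for two well-known countries
    "common": ["language", "history", "cuisine", "climate", "geography"],
    "distinctive": ["government", "economy", "alliances", "press freedom"],
}
obscure_pair = {     # you simply know fewer features about these
    "common": ["region", "terrain"],
    "distinctive": ["religion"],
}

def similarity(pair):      # "how similar?" retrieves shared features
    return len(pair["common"])

def dissimilarity(pair):   # "how different?" retrieves distinctive features
    return len(pair["distinctive"])

# Impossible on one underlying scale, but natural if each question runs
# its own confirming search:
print(similarity(well_known_pair) > similarity(obscure_pair))        # True
print(dissimilarity(well_known_pair) > dissimilarity(obscure_pair))  # True
```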
Ritholtz: So there was a slip of the tongue that you referenced, where somebody said "I'll see it when I believe it," as opposed to saying "I'll believe it when I see it." But really there's some Freudian truth in that. "I'll see it when I believe it" tells us how much our own ownership of ideas and desire to see things we already believe actually impacts us. How does this manifest itself in the field of investing? How do you see confirmation bias impacting the way people put capital at risk?

Gilovich: Well, it plays out there the way it plays out everywhere. That is to say, or let's back up a step: when you cite that quote, it can sound silly, and in some ways it does sound silly. On the other hand, what job does our brain have? It's to make sense of the world, and we draw upon every possible cue that would make that job easier. And one cue that we draw upon is what we already know, or what we already think we know. And therefore what we know has to influence new information that comes in. And so if we have a strong prior belief that this industry is better positioned for the environment we now face than that industry, of course we should take that into account. Pre-existing beliefs, if they're based on a solid foundation, should influence our evaluation of new information.

Ritholtz: So let me ask you a question about the money illusion. Why is it that we have these unfounded beliefs about the values of different things? Define what the money illusion is.

Gilovich: The money illusion is really interesting to psychologists because it's another version of something you see all over the place, which is a difficulty we have in taking context into account. In psychology there's this phenomenon known as the fundamental attribution error: if you see a dad and a kid in line at the grocery store and the dad yells at the kid, you can't help but think, that's a mean father. We don't know the context that led up to it, what's gone on in the person's life, or how exceptional that might have been; it may be the only time the dad ever yelled at his kid. But we can't help but look at the stimulus in front of us and draw conclusions. And with respect to the money illusion, twenty thousand dollars sounds like a lot more than ten thousand dollars, and we can lose sight of the context, that is to say, how much inflation has changed things since the time you had the ten thousand. There are obviously circumstances in which, in real dollars, the twenty thousand is worth less than the ten thousand used to be, but it's hard to get past that first impression of the monetary value and take the context into account.
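[Editor's note: a quick sketch of the context the money illusion ignores, deflating a nominal sum by cumulative inflation. The inflation rate and time horizon below are hypothetical.]

```python
# Real vs. nominal dollars: divide by cumulative inflation to compare
# amounts from different eras in the same purchasing power.

def real_value(nominal, annual_inflation, years):
    """Deflate a nominal amount back to base-year dollars."""
    return nominal / (1 + annual_inflation) ** years

# $20,000 received 25 years later, assuming 3% average inflation, buys
# less than $10,000 did back then:
print(round(real_value(20_000, 0.03, 25)))  # ~9552 base-year dollars
```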
Ritholtz: What is the issue with mental accounting?

Gilovich: Well, economists tell us, decision theorists tell us, that if we want to make the best decisions with our money, we should think of all of our assets and liabilities in terms of one overall, integrated balance sheet. And in fact that turns out to be true; we will make better decisions that way. Great. But we don't have the kinds of minds that are either capable of doing that or inclined to do it. Instead, we put money into different accounts and we treat it differently. We treat money differently depending upon where it came from and how we got it, and there are all sorts of examples; it's very easy to illustrate. If you received an inheritance from an aunt, let's say, who was very careful with her money, you're probably gonna be more careful with that money than if your aunt was a wild and crazy person who liked to spread money around, in which case you're gonna feel like, I should spread this around. Money comes with almost a personality. And people often talk about this in the context of getting a smallish bonus you didn't expect, or you get a tax refund and it's small. It feels like, hey, this is great, let's go spend it. And people do, and they often spend it several times: it's used to justify going to this restaurant you always wanted to go to, getting tickets to the Celtics game or whatever, and you've spent it several times over. If in fact it were a big inheritance, it comes with a lot of weightiness associated with it, and even though you have more money, you're less likely to spend it on that restaurant or those Celtics tickets, and instead you do something serious with it, because it's a serious amount of money.

Ritholtz: In Why Smart People Make Big Money Mistakes, you tell the joke of the honeymooners in Vegas, the mental accounting story where the groom, while the wife is sleeping, takes five dollars and goes down to the roulette table. Share that story.

Gilovich: I love that story. Well, do you recall it?

Ritholtz: Yeah, of course I recall it.
It's a long story. He takes the money, goes down, bets, wins, a long doubling accumulation of money that ultimately goes one round too far, and he loses. When he returns: "Honey, how did you do?" "Oh, not bad, I lost five dollars." Right? Each round it doubled and doubled and doubled, until it was tens of millions of dollars and he had to go to a different hotel that would take the bigger stakes. He bets it, loses it all. "Honey, I lost five dollars."

Gilovich: I love that story, and you tell it better than I do.
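[Editor's note: a back-of-the-envelope check on the story's arithmetic. The loop is illustrative; the story itself gives no exact figures.]

```python
# A $5 stake doubled on every winning even-money bet is 5 * 2**n after
# n straight wins. How many wins until "tens of millions"?
stake, wins = 5, 0
while stake < 10_000_000:
    stake *= 2
    wins += 1
print(wins, stake)  # 21 straight wins -> $10,485,760

# The punchline's ledger: he went down with $5 and came back with $0,
# so in his mental account the night cost exactly five dollars.
```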
Ritholtz: We have been speaking with Professor Tom Gilovich of Cornell University. If you enjoy this conversation, be sure and check out our podcast extras, where we keep the tape rolling and continue discussing all things cognitive. Be sure and check out my daily column; you can find that on Bloomberg View dot com. You can follow me on Twitter at Ritholtz. We love your comments, feedback and suggestions; write to us at MIB podcast at Bloomberg dot net. I'm Barry Ritholtz. You're listening to Masters in Business on Bloomberg Radio.

Welcome to the podcast.

Gilovich: Thank you.

Ritholtz: Tom, I feel odd calling you Tom, because I know you as Professor Gilovich.

Gilovich: I only hear Thomas when I've made a mistake of some sort, like putting that backhand into the net.

Ritholtz: So there's a lot of stuff I want to go over with you that we didn't get to during the broadcast. But I have to start with the discussion about experiences over consumer, material goods, because there are some experiences that have stayed with me and sent me down that rabbit hole a little bit. About five years ago, my wife and I were on vacation in St. Lucia. We were all the way on the north part of the island and the Pitons are on the south part, and one of the options at the hotel was that you could rent a sailboat and a crew, a two-man crew, for the day. At the time it seemed like an ungodly amount of money; it was less than a thousand dollars, but for a day, that's a lot of money. I ended up just grinding my teeth and doing it, and that experience she still talks about consistently. I could have bought a thousand-dollar piece of jewelry or spent it on any bauble and would never have gotten the ongoing mileage out of it. That's my favorite example of that. On the other hand, there are purchases I've made where I continue to derive pleasure, years later, from the consumer purchase, and my best explanation for that has to do with my expectations of what I'm gonna get. I'm gonna give you two quick examples. We moved houses. We lived in a fairly suburban neighborhood where you feel like you're the center square, with houses next to you, behind you, in front of you, and we had no plans on moving, but this really fascinating modern house popped up; my wife trawls Zillow all the time. It's adjacent to a big preserve, so during the winter I can see the house across the street, about a hundred yards away, but during the summer I feel like I'm living in a forest and you can't see any neighbors. I've only been there three years, but I still walk around and think, I can't believe we live here. I'm astonished by it. That house has not hit the adaptation level yet. But when we were in the old house, I was shopping for a new stereo. We were in that house for seven years, and my wife was constantly canceling the stereo: no, no, don't spend that money on this. Ultimately she said, tell you what, when we move to a new house one day, get whatever you want.
Ritholtz: So we move, I remind her of this, and I go out and spend probably too much money on an audio system. And truth be told, I hardly ever use it. Who has time to pop in a CD and sit and listen, uninterrupted, to music for forty-five minutes? It just doesn't happen. I'm shocked at how little actual pleasure I've derived from that purchase versus other things, cars and houses and what have you. It has always intrigued me: how adaptable are we to things, and how much of it depends upon our own expectations, whether they're too high or too low?

Gilovich: It's interesting that you raise that question. We're doing research right now on people's reasons for making certain purchases and whether those reasons are borne out. In particular, we saw in our other research that people often buy certain material possessions with the thought that it's going to give a boost to their social life. The prototypical example is, yeah, I need to get a forty-inch flat-screen TV, because I want to have everyone over for the Oscars and the Super Bowl. And sometimes that does happen, but people end up watching the Oscars by themselves or the Super Bowl by themselves more often than they expect. So the prediction is that you buy certain material things with the expectation that they're going to connect you to other people, and that turns out to be true less often than you expect. Whereas with experiences, some of those you also buy with an eye toward doing them with other people, and those expectations tend to get confirmed more often; that's part of the experiential advantage. So we have several studies in the hopper, ready to run, on that subject. And I think your example of buying this place that you love so much, where the pleasure hasn't diminished, fits: you right away talked about how you moved to this neighborhood and you're sort of connected to, and like, the other people around you.
And that's a case where you bought it for a social reason, and it turns out to have been confirmed, and you've got the great hiking access and the vista in the back. Compare that to a frequent motivation for people buying a new house: we need more square footage, or we need three bathrooms rather than two. Well, people with big families used to live in really tiny houses and never really wanted for more; they functioned just fine. We could function just fine. Those are the kinds of things that people adapt to. But if you move your house and suddenly you're in a better neighborhood, in the woods...

Ritholtz: Literally, you know, there are foxes running down the street. And I live on Long Island; there were not supposed to be deer in Nassau County. A deer went bounding across the front yard the other day. I was trying to figure out why the dogs were going crazy. Why is there a deer on our street? It never ceases to amaze me.

Gilovich: And that bigger TV you adapt to; in three months you don't even notice the size of the TV. Whereas for you, it's gonna be a fox one day, a sunset another day, a sunrise a third day, et cetera. It's going to be a different little gift each time, and that has provided this continuing enjoyment that you've received.

Ritholtz: I'm curious, since you've discussed investing and monetary issues, financial issues, how our own recognition of whether or not we're getting a bargain plays in. You know, there are lots of fast, beautiful cars. I have a hard time wrapping my head around walking into a Ferrari dealer and saying, here's two hundred and fifty thousand dollars, I'll take the red one. That seems like an awful lot of money. On the other hand, if you pick up a car that's two or three years old at half of the new price, it seems like a little more of a rational decision.
I'm curious if you've looked at how people feel about the cost, or relative cost, of what they're purchasing compared to its value, and whether that has any impact on how they either adapt to or continue to enjoy whatever that consumer bauble is.

Gilovich: Yeah, sure. We get utility from all sorts of things, and there's utility from the simple knowledge that it's a deal; that provides pleasure. And that fact is related to what I think is the single biggest psychological fact about human beings. It's such a simple idea, but it's really powerful, it says something about us, and it has sort of fallen to psychologists to remind the world of it. Economists often want to say, well, let's just incentivize it and then we'll get more of this behavior. Incentives work pretty well for things, but people don't respond to the incentives themselves; they respond to the meaning that they assign to those incentives. And more broadly, we don't respond to the stimuli we encounter; we respond to the meaning that we assign to them. And there's so much flexibility in terms of how we construe things that the same purchase, thought of for whatever reason as a deal versus not thought of that way, makes all the difference in terms of how we feel about it and how much continued enjoyment we get from it. The same is true of virtually everything in our lives. And therefore, understanding how people are going to interpret a given stimulus is the key to understanding how people are going to behave. It's the key to running a successful company, and certainly it's the key to running a successful political campaign.

Ritholtz: So the other issue I wanted to bring up, relevant to this, was the endowment effect and how people place more value on things they own. And we'll stay with cars for a moment. My favorite experience with this is speaking with someone, and I get this all the time: hey, you're a car guy.
I'm thinking about buying this or that. The most recent time it was, I'm looking at a Toyota Camry or the Honda Accord; which car do you suggest? And I've learned, it's like when someone says, I'm thinking of leaving my wife, what do you think? Whatever answer you give is going to come back to bite you. So I usually say, those are both really good cars; what's your experience with each of them? And I'll get a laundry list of all the great things about the Camry and then all the great things about the Accord. Well, I don't think you could do poorly with either of them; let me know what you decide. Six months later you speak to that person, and whichever car they picked: I'm so glad I picked the Accord, it is the greatest thing; that other one was boring and plain. It doesn't matter which one they select, but whichever one they select, that's the one, that's the winning car, that's the one they should have picked. It's amazing how what was a coin toss at one point becomes, oh no, this was the only way the decision could have gone.

Gilovich: Yeah, and I think there's an important lesson there. That is, when we find ourselves in these circumstances where, oh my god, this is a weighty decision, I don't know, six of one, half dozen of the other: well, if you find yourself where you really are torn, and there are strong arguments either way, and it's a close call, then just pick one. And what you just described, all that psychology that you bring to bear to make yourself happy with what you've chosen, will be brought to bear, and you'll be pretty happy with it. Those decisions, because it's six of one, half dozen of the other, feel like hard ones, but in reality they aren't, because if it's that balanced a decision, you can go either way and you'll end up feeling pretty good about it.
Ritholtz: So there's a concept discussed, and I believe it was in How We Know What Isn't So, that I have to bring up, because it just cracks me up so much. We're all familiar with the Kubler-Ross five stages of grief; that has been, you know, dogma for years. Your research more or less finds that it's pretty meaningless and there is no data that backs it up. Am I overstating that, or is that a fair assessment?

Gilovich: I talk about that in the book in the same chapter where I talk about the hot hand, and both of them are a reflection of the fact that we tend to see more patterns in the world than are actually there. We've got this incredible pattern-detection machinery in our heads, and it goes out and finds patterns. But no system is perfect, and so it overshoots sometimes, and we see faces in clouds, and a man on the moon, and canals on Mars that aren't there. The hot hand, where you are seeing more streakiness than is really there, is another manifestation of that. Psychologists are very fond of stage theories, the idea that we go through systematic stages that take us from the starting point to the endpoint. Kubler-Ross's is just one of them. And I'm not saying it's random, that people go willy-nilly through all sorts of stages at different times; maybe there is some order there, but certainly there's less than people think. People are complicated, and they act in all sorts of different ways. And there is a lot of research on people's reactions to grief, and not so much, but some, on reactions to their own impending death; there's research on that too. And people are all over the map. The most common pattern, of course, is that you're devastated initially and then you get over it. But many people never get over it, and the world treats them like, what's wrong with you? It's supposed to take about a year, and you seem to still be troubled by this.
We're adding an extra burden onto those people. Some people aren't troubled right from the beginning, and that can seem perverse: wait, you just lost a child, what's wrong with you that you don't feel that way? Well, that happens. And the problem with a firm belief in stage theories is that they take on this normative stance: this is what you should do. In fact, there are certain practitioners, and this sounds like I'm making it up, I wish I were, but there's a practitioners' manual for nurses that refers to people who don't go through Kubler-Ross's five stages in the order specified as "pathological diers." Just think how insidious that is. You've gotten the worst news you could possibly get, that your life here is ending and you don't have much time left on earth. You've got to deal with that, and now, in addition, you have to deal with the fact that the people presumably there to help you think you aren't doing it right. It's adding a horrible insult to an already terrible injury.

Ritholtz: In one of your presentations, you linked to the video of Homer Simpson going through the five stages, which is absolutely hilarious; I'll add the link to this. I have to ask the related question: what sort of pushback did you get from the establishment in this area when you basically said, hey, the data doesn't really support the classic Kubler-Ross five stages as much as we think it does?

Gilovich: Compared to the amount of pushback I got on the hot hand, hardly any at all.

Ritholtz: Really? That's amazing.

Gilovich: Well, basketball is serious; dying, hey, you know, we can't really pay much attention to that.

Ritholtz: So let me ask you the question about the basketball hot hand issue. You mentioned there's some new evidence that perhaps moderates it a little bit. I know when you first came out with it, everybody from Red Auerbach on back weighed in on it.
What was some of the crazier pushback you got on this, and how accurate do you think the original assessment is?

Gilovich: Well, I wouldn't say any of the pushback was crazy. It was very firm, and, you know, I understand that. Again, it's precisely because it's such a compelling phenomenon. When you're out there in the heat of the game and you've made several shots in a row, and then you hear about these psychologist statisticians doing some analysis, it's easy to say that can't be, and just dismiss it. And they were very dismissive: what do they know, they're a bunch of psychologists. Well, psychologists play basketball too, and maybe we wouldn't have the hot hand as much as a more skilled player would, but we do have it. So it wasn't crazy, but it was pretty dismissive. And again, with respect to our initial hypothesis, are people overestimating any streakiness that might be there? The answer to that is very clear. In fact, we're running additional studies right now where we have people shoot for us, and they just tell us when they're hot. And then afterwards we say, you said you were hot some number of times, and some number of times you didn't say that; what percentage of the shots do you think you made after you said you were hot, versus the times when you didn't say that? And the overestimation is staggering. We know what percentage of shots they made when they said they were hot, and they're wildly overestimating it, while modestly underestimating how well they did when they said they weren't hot. So again, a departure between belief and reality.
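[Editor's note: a minimal simulation in the spirit of the hot-hand analyses, not the study's actual code or data. A shooter with a fixed 50 percent make probability still produces long runs, and the make rate right after three straight hits stays near the base rate.]

```python
import random

random.seed(1)
shots = [random.random() < 0.5 for _ in range(100_000)]  # i.i.d. 50% shooter

# Make rate immediately after three consecutive makes:
after_streak = [shots[i] for i in range(3, len(shots))
                if shots[i - 3] and shots[i - 2] and shots[i - 1]]
print(sum(after_streak) / len(after_streak))  # ~0.50: no lift after a streak

# Longest run of makes that pure chance produces:
longest = current = 0
for made in shots:
    current = current + 1 if made else 0
    longest = max(longest, current)
print(longest)  # double-digit streaks, with no hot hand built in
```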
Ritholtz: So let me not look for confirming data on the thesis that there was a lot of pushback, and instead ask: did anybody at either the professional or college level get back to you and say, we want to adapt to the data on the non-hot hand; here's how we think it could affect our player rotation or game strategy? Did you ever hear from anybody who said, as hard as this might be to believe, we think you're accurate, and here's how it's going to change our coaching or playing strategy?

Gilovich: You know, I've talked to a lot of coaches who will say, one thing I have taken from that is that it reinforces what my job really is, which is to get my team the best shots. I used to factor in how hot someone is; now I'm gonna factor that in less. It's really about getting the ball into the hands of my best players in the best spots on the floor. So it's reinforcing an alternative belief that they already have, more than trying to get them to extinguish this other belief.

Ritholtz: There's a ton of other biases I wanted to go over, the spotlight effect, there's just a run of stuff, but I have to get to my favorite questions. These are the standard questions we ask all of our guests. Tell us the most important thing that people don't know about your background.

Gilovich: I'm the first person in my family to go to college. That could sound like, oh, that's a bad thing. I've been studying gratitude a lot, and I think part of it stems from having grown up in an amazing part of the world at an amazing time, that is, Silicon Valley before it was Silicon Valley. It's just a great place. It was a pretty egalitarian world: those of us whose parents didn't go to college mingled with those whose parents did, and you never felt like you were part of a lower caste as a result. And obviously the weather there was sensational, and you've got the beaches very close and Yosemite Valley very close. It was just an incredible place to have grown up.
Ritholtz: Tell us about some of your early mentors.

Gilovich: I had the great pleasure of going to Stanford University, the number one psychology department then and now (it has had that ranking for a long time), at a great time. It was the height of the cognitive revolution in psychology, with a lot of very exciting things happening, which is why I chose to go there. And then there were the surprising things: Amos Tversky and Daniel Kahneman were visiting there my very first year and started talking about this stuff I hadn't heard of that just seemed so compelling. It was sort of the dawn of the revolution in judgment and decision making, it was just a very heady time, and they were generous with their time. My advisers, Mark Lepper and Lee Ross, two incredibly insightful psychologists, were very helpful. It felt like being in Wittgenstein's Vienna: a great place at a great time, with terrific people, and terrifically giving people.

Ritholtz: So let me diverge from my normal questions and ask you two things about this. First, you write that you were originally planning on going to law school until you happened to see Tversky and Kahneman at school. Did you really completely shift your career plans based on that?

Gilovich: No, not quite. Those two components are true, but they're melded together. That is to say, I went to college, and again, I was the first in my family to go to college, so I didn't know what I wanted to do. But I kind of liked to argue with people, and I thought the law would be a good field for that. But there's no pre-law; you can take anything as a pre-law person. And so I took a bunch of stuff, and some of it was psychology courses. I liked them, took another, liked that, took another, and then suddenly it dawned on me: hey, I seem to like this psychology stuff. Can I make a career out of it?
And I took a year off after undergraduate to sort of decide, do I want to take that plunge and go to graduate school? And the answer was yes, and I'm really, really glad that the answer was yes.

Ritholtz: So one of the themes that comes up in all these conversations with people who have achieved either personal or financial success is, surprisingly, the role of luck and randomness in their careers. I can't count how many people have said, you know, but for this one thing happening, my entire professional career, personal life, whatever, could have easily gone in a different direction. You referenced, and again I think it was in Why Smart People Make Big Money Mistakes, the experience of seeing Tversky and Kahneman at Stanford. Was that a big deal, or was that just, hmm, this is really interesting and I want to stay with psychology? I may be reading too much into it.

Gilovich: No, you're not. Well, I'll tell an embarrassing story about myself, which is: I go to Stanford, and it's chock full of these famous people. Walter Mischel of the marshmallow test; Gordon Bower, a giant of the cognitive revolution; the people I planned to study with, Mark Lepper and Lee Ross. And Lee Ross ran a seminar as an introduction to the faculty: each week you'd bring in two other members of the faculty, we'd read their papers, and we'd find out what was going on among the faculty. A great way to start a program. And in the first organizational meeting, Lee says, okay, that's what we're gonna do, but the first week we've got these visitors here, Amos Tversky and Daniel Kahneman, and so we're gonna start with them. And internally I say to myself, why these guys? I want to hear from the famous people, because my undergraduate education was such that I hadn't heard of them. And I'm now embarrassed by that.
But they had us read the now epic 1974 paper in Science on heuristics and biases. So it's an embarrassing story that I would say, gee, when are we going to hear from the famous people? But the one thing I can say in my defense, at least the one thing that makes me feel better, is that when we read that paper I recognized, hey, this is really great stuff. And that changed what I planned to do in graduate school.

Ritholtz: That's fascinating. Tell us about some of your favorite books.

Gilovich: Man, there are so many great books. Well, I'll give you a category for one: anything that Ian McEwan writes is just brilliant. It's fiction, but it's fiction that touches on so many different areas of life and so many different themes, some of it very much related to judgment and decision making and game theory. The opening scene of his novel Enduring Love is a brilliant fictional depiction of game theory; everything you need to know about game theory happens in that first, very riveting scene. So anything by Ian McEwan. Oh, and his fairly recent one, Nutshell, which is a bizarrely fascinating take on the Hamlet tale, is brilliant. Moving out of fiction to nonfiction: Guns, Germs, and Steel, just for the sheer scope of it. He asks so many questions that would never occur to me, at least, to ask, and then brings some interesting analysis to bear on them, some of it convincing, some of it less so. But the ambitiousness of it and the questions asked are just terrific.
And you know, we started with Kahneman and Tversky, so it feels right to say that Thinking, Fast and Slow is chock full of wisdom, just a brilliant explication of all the research in this area, with Danny Kahneman's gift for putting things just right.

Ritholtz: I love that book; it was tremendous. You work with a lot of students and a lot of millennials. What sort of advice would you give someone who is interested in psychology as a career?

Gilovich: A bit of a complicated answer to that question. That is to say, people are often taught, oh, just follow your passion and so on, and I chafe at that a little bit, because it makes a lot of kids feel bad. They think, oh, I don't have a passion, and then the problem is finding one. And it's okay to not have a passion. It's okay to be a person who does a variety of different jobs, and you can be a good person and live a good life just being a taxpayer, being someone who's a good neighbor, and so on. You don't have to have a great passion and be a giant success. I want to put that out there because I firmly believe it, and because the advice I'm going to give could sound like I'm recommending the opposite, and I'm not. Which is: just do stuff. Do what's engaging to you. If you're a student, don't worry so much about getting your undergraduate business major so you can go right away into a job, because if you do that, and it's a job you like, they're probably gonna want you to get your MBA anyway, and you're gonna relearn that stuff. You never know what you're gonna draw upon in your life. So take the kinds of courses, and do the kinds of things, that you're really engaged in. Push yourself a little bit. That's what I would say.
You know, the world changes so quickly, 1173 01:11:24,200 --> 01:11:28,240 Speaker 1: you can't anticipate what the skills of the marketplace are 1174 01:11:28,280 --> 01:11:30,679 Speaker 1: going to be with that much accuracy. So just keep 1175 01:11:30,680 --> 01:11:34,400 Speaker 1: building your intellectual capital and don't worry so much 1176 01:11:34,439 --> 01:11:37,680 Speaker 1: about the outcome. And our final question: what is it 1177 01:11:37,720 --> 01:11:41,240 Speaker 1: that you know about psychology today that you wish you 1178 01:11:41,320 --> 01:11:44,960 Speaker 1: knew thirty-plus years ago when you were first starting out? Yeah, 1179 01:11:45,040 --> 01:11:47,760 Speaker 1: great question, and a simple answer to it. I 1180 01:11:47,840 --> 01:11:50,960 Speaker 1: knew this in psychology, but I kept it 1181 01:11:51,080 --> 01:11:56,400 Speaker 1: sort of walled off there and didn't appreciate its breadth. 1182 01:11:56,439 --> 01:12:00,160 Speaker 1: It's an idea often attributed to Kurt Lewin, which 1183 01:12:00,280 --> 01:12:04,280 Speaker 1: is: when we're trying to change behavior, other people's or 1184 01:12:04,280 --> 01:12:07,560 Speaker 1: our own, we often try to do it by increasing motivation. 1185 01:12:07,680 --> 01:12:12,000 Speaker 1: Psych people up, hire motivational speakers to get the sales force 1186 01:12:12,120 --> 01:12:14,760 Speaker 1: charged up, and so on. And there are times in 1187 01:12:14,760 --> 01:12:18,880 Speaker 1: which that is helpful. When motivation is the problem, 1188 01:12:19,000 --> 01:12:22,759 Speaker 1: getting more motivation is great. But oftentimes that's not the problem. 1189 01:12:22,800 --> 01:12:25,320 Speaker 1: People are perfectly well motivated, and they just can't figure 1190 01:12:25,320 --> 01:12:29,400 Speaker 1: out how to translate their strong motivations into effective action. 1191 01:12:29,760 --> 01:12:33,439 Speaker 1: And Lewin's idea, a very simple one, is: when that's the case, 1192 01:12:33,640 --> 01:12:37,240 Speaker 1: don't try to push people more. Figure out what's preventing 1193 01:12:37,280 --> 01:12:42,240 Speaker 1: them and take away those blockages. And behavioral economists have 1194 01:12:42,320 --> 01:12:46,160 Speaker 1: been using that a lot, exemplified best in 1195 01:12:46,240 --> 01:12:50,599 Speaker 1: the Richard Thaler and Cass Sunstein book Nudge. Figure out 1196 01:12:50,680 --> 01:12:56,120 Speaker 1: what's preventing this, and rearrange the environment a little 1197 01:12:56,120 --> 01:12:58,800 Speaker 1: bit so that the behavior is easier. You want more of this? 1198 01:12:59,360 --> 01:13:03,480 Speaker 1: Make it a little bit easier. And I think if 1199 01:13:03,520 --> 01:13:06,760 Speaker 1: I had known that sooner, I would have been more 1200 01:13:06,840 --> 01:13:11,320 Speaker 1: effective in, you know, consulting on political campaigns or helping 1201 01:13:11,360 --> 01:13:15,640 Speaker 1: people in their personal lives as well. That's absolutely fascinating. 1202 01:13:16,240 --> 01:13:21,160 Speaker 1: We have been speaking with Professor Tom Gilovich of Cornell University.
1203 01:13:21,200 --> 01:13:23,519 Speaker 1: If you enjoyed this conversation, be sure and look up 1204 01:13:23,520 --> 01:13:26,559 Speaker 1: an inch or down an inch on Apple iTunes, or 1205 01:13:26,600 --> 01:13:29,679 Speaker 1: wherever finer podcasts are sold, and you can see any 1206 01:13:29,720 --> 01:13:32,880 Speaker 1: of the other hundred and eighty or so such conversations 1207 01:13:32,880 --> 01:13:35,880 Speaker 1: that we've had. I would be remiss if I did 1208 01:13:35,880 --> 01:13:39,280 Speaker 1: not thank the crack staff who help put this podcast 1209 01:13:39,320 --> 01:13:43,240 Speaker 1: together each week. Taylor Riggs is our booker, Michael Batnick 1210 01:13:43,320 --> 01:13:46,360 Speaker 1: is my head of research, and Medina Parwana is our 1211 01:13:46,439 --> 01:13:50,640 Speaker 1: audio engineer slash producer. We love your comments, feedback, and 1212 01:13:50,760 --> 01:13:54,760 Speaker 1: suggestions. Write to us at MIB podcast at Bloomberg 1213 01:13:54,800 --> 01:13:57,920 Speaker 1: dot net. I'm Barry Ritholtz. You're listening to Masters in 1214 01:13:57,960 --> 01:14:06,479 Speaker 1: Business on Bloomberg Radio.