Speaker 1: Pushkin. As humans, we do a lot of things that don't make all that much sense. I sometimes imagine a completely rational alien species somewhere out there in the universe that's observing us earthlings and is totally shocked by how often we act against our own best interests. We eat foods that we know aren't good for us. We avoid small, simple actions like exercising or flossing, preventive care that we know will pay off in the long run. But we also do some truly beautiful things, behaviors that might puzzle those rational aliens. We donate blood, or even an organ, to someone in need, at real risk to our own health. We're kind to strangers who, in many cases, we'll never even see again. I also wonder what those aliens would make of the way humans experience happiness, how complex and counterintuitive the things that promote our well-being must look to a totally rational being. Unfortunately, it's unlikely I'll ever get a chance to chat with a rational extraterrestrial like one of the Vulcans from Star Trek, those creatures who embody pure logic and self-discipline, about the anomalies of human behavior.
But in this episode, I get the chance to do the next best thing: I get to chat with a Nobel Prize-winning economist, although, as you'll hear in this episode, he doesn't exactly fit the stereotype.

Speaker 2: Uh, let's see, what's my name? I'm Richard Thaler. I'm a professor of behavioral economics at the University of Chicago's Booth School of Business.

Speaker 1: You didn't mic drop the, like, Nobel laureate. Did you bring it? Did you bring the medal, like? Do you bring it in your pocket in case, pull it up on video?

Speaker 2: It's a little heavy, so I don't.

Speaker 1: Doctor Richard Thaler started writing his seminal book The Winner's Curse: Paradoxes and Anomalies of Economic Life all the way back in the nineteen eighties, so you might be surprised to hear that I'm featuring it in this series about my favorite books of twenty twenty five. But several decades after creating the book that transformed the way I think about human behavior, Richard is republishing it with new research and new reflections on the irrationalities of everyday life, all with the help of a new co-author.
Speaker 3: My name is Alex Imas. I'm also a professor of behavioral economics and behavioral science at the Booth School of Business. I don't have any Nobel laureate stuff to mention, so.

Speaker 1: You'll come back in a couple of years on the podcast, and I'm sure things will be there. So we're featuring The Winner's Curse as one of my favorite books of twenty twenty five. But that feels kind of odd, because I remember reading The Winner's Curse when I was, like, in graduate school. So it's kind of an odd book to have as my favorite book of twenty twenty five, and one of the books that really taught me important things. But maybe I'll have you, Richard, set the stage of, like, the origin story of this book.

Speaker 2: Okay. Yeah, the origin story is, when I was about Alex's age, about forty years ago, I had just come back from spending a year with Danny Kahneman, my mentor.

Speaker 1: Just for context, the late Danny Kahneman is also a Nobel Prize winner, but he was a psychologist, a heavyweight in my field, who is credited with pioneering the field of behavioral economics.
Richard, then a young professor at Cornell, was building on Kahneman's insights as he forged his own path.

Speaker 2: And somebody suggested that I write a series of columns in a new journal called the Journal of Economic Perspectives. The journal was aimed at, like, the general economist, so it would be articles that non-specialists could understand. The idea was, in each issue I would write about something, an anomaly. And what is an anomaly? An anomaly is something that is unexpected, right? So an elephant in real life would be an anomaly for an economist. An anomaly is something that the theory says won't happen, like if price goes up and demand goes up, that would surprise us. So I did this for about four years, and when I had enough of these that it looked like a book, I kind of stapled them together and called it The Winner's Curse, because it's an intriguing phrase and it was one of the anomalies.

Speaker 1: At the time, Alex was in a very different place in life: he was being born.

Speaker 2: Where was it?

Speaker 3: I was in Moldova at the time. I was born in Moldova, in Eastern Europe.
Basically, I wasn't around when the book first came out. He was shirking, yeah. But then Richard and I met when I was a graduate student at San Diego, and our offices ended up being adjacent. I was super interested in behavioral economics, super shy, kind of didn't want to bug him, but eventually I kind of got the courage to be like, what do you think about this idea? What do you think about that idea? We started chatting. I got my first faculty job at Carnegie Mellon, and then at some point I joined Booth.

Speaker 2: And then a few years ago, the publisher of this book said, hey, you know, this book is getting old, like you, and it's going to go out of print. Maybe you want to freshen it up or something. And I think they had in mind a new cover, and stupidly I thought, oh, well, maybe there's something more ambitious we could do.

Speaker 3: Fortunately for me, by the way. Maybe stupidly for you.

Speaker 2: Yeah. Well, I got the clever idea to get Alex to help. So I had this pile of anomalies.
I wrote another half dozen of these after the book came out, and the question was, you know, it's forty years later, how does this stuff hold up?

Speaker 3: So that was the concept. We got on the phone, and it sounded kind of straightforward and easy, like, let's knock this thing out in six months. It ended up being anything but that. It took us about four years to actually finish it up. That's largely because two-thirds of the book is brand new.

Speaker 1: The new version of The Winner's Curse includes the original anomalies Richard began exploring in the eighties, but also new research examining whether the old ideas hold up today.

Speaker 3: Really, how this has held up in terms of the empirical robustness of the results. Have these results been replicated? Do they show up in the real world? Like, the original results were largely, you know, with some exceptions, in lab experiments with college students, relatively small samples in some cases. And, you know, we've had forty years, there's been hundreds of replications. Where are we now?
So part of the book, and we emphasize this throughout, is we went ahead and replicated all of the studies ourselves. Findings are really robust. Everything replicates, everything has been replicated.

Speaker 1: The anomalies the book explores are all about the surprising ways humans act differently from what's known as the standard economic model, the model of rational behavior that both Vulcans and economists tend to expect from us.

Speaker 2: The standard economic model is really that people, or, as economists call them, agents, solve problems by optimizing. So, what route did you take to get to your office today? You took the best route. Why are you doing this podcast? Because it's the best possible use of your time. That, plus markets, that is what distinguishes economics from psychology, say.

Speaker 1: And so this idea of these agents, optimizing in the best way, interacting in these markets, produced a certain kind of view of what people tended to do when making these decisions.
And that's what's often been called Homo economicus. Give me the, like, Homo economicus one-oh-one. Like, you know, what do we think this guy's doing?

Speaker 3: Homo economicus is a rational agent who obeys principles of rationality. So they have rational expectations. They don't have any memory issues, they don't have any sorts of biases about what's out there in the real world. They're fully rational. They take these beliefs as inputs and then they optimize. They make the best decision given their correct beliefs about the world.

Speaker 2: We should add that these agents are also selfish jerks, so they don't really care about anybody else. Possibly members of their family, though possibly not. And they also have no self-control problems. So no need for any weight-loss drugs in this world; everybody weighs just the right amount.

Speaker 3: No AA, no alcoholics, nobody gets addicted to drugs unless they really want to.

Speaker 2: And no need to worry about saving for retirement. People will do that, because otherwise they're going to starve when they're old.
Speaker 1: Of course, you're saying all this stuff quite facetiously, because real people don't tend to do this. We are in a world with AA and weight-loss drugs and people who haven't saved for retirement. And these are the kinds of anomalies that you pointed out in your book, these cases where you looked and you said, hey, people are supposed to be obeying the standard economic model, and they're just not. And so, what does that mean for human psychology? And maybe, how can we avoid these anomalies, especially in cases where they're kind of hurting us and messing us up? And so that's what I want to go through today. I want to go through my favorites of your anomalies that you cover in the book, six of these. But anomaly number one that I love is the anomaly that the book is named for, the so-called winner's curse. So, Alex, tell me what the winner's curse is, with the famous jar example showing this anomaly.

Speaker 3: Yeah, so imagine you go into a bar with a jar of coins.
The jar has a certain amount of money in it, and then you tell everybody at the bar, look, whoever bids the most for this jar gets the jar and all the money that's in the jar. What's the winner's curse? Pretty simple. The person who wins the jar will end up losing money, in the sense that their bid is going to be higher than the amount of money that's in the jar. So they'll get the jar, they go home, they take all the money out, and they're like, oh crap, I paid twenty bucks and this has fifteen dollars in it. So that's the winner's curse: by winning the auction, you're actually losing. The jar example is really kind of easy to demonstrate. You could do it in a class. Richard, you've done this in a class before.

Speaker 1: I'm sure you do it in classrooms, but do you ever do it in a bar? Is this, like, your bar trick?

Speaker 2: It better be a friendly bar.

Speaker 3: I don't want to get beaten up.

Speaker 1: So that's the bar version, you know, the kind of toy version that economists talk about. But you've talked about lots of examples where we see this in real life.
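Alex's jar auction is easy to sketch as a toy simulation. Nothing below comes from the book; the jar value, the noise level, and the bid-your-own-estimate rule are all illustrative assumptions, but they reproduce the logic he describes: every bidder's guess is unbiased, yet the winner is selected for being the most over-optimistic.

```python
import random

def average_winner_profit(true_value=15.0, noise=5.0, n_bidders=10, trials=20000):
    """Toy jar auction: each bidder draws an unbiased but noisy estimate of
    the jar's value and naively bids that estimate. The highest bid wins,
    so the winner is whoever happened to be most over-optimistic."""
    total = 0.0
    for _ in range(trials):
        bids = [true_value + random.uniform(-noise, noise) for _ in range(n_bidders)]
        total += true_value - max(bids)  # winner's profit, usually negative
    return total / trials

# Every individual estimate is right on average, yet the winner loses
# on average, and the loss grows as more bidders join the auction.
for n in (2, 5, 20):
    print(n, round(average_winner_profit(n_bidders=n), 2))
```

On a typical run the winner's average profit is already negative with two bidders and falls further with twenty, which is why the counterintuitive advice that comes up in the conversation, bid lower when more bidders show up, is the rational response.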
A curious example I didn't know until I read your book was the case of oil companies. Richard, kind of, how do oil companies fall prey to this winner's curse?

Speaker 2: Yeah. In fact, the winner's curse was discovered by oil engineers at a company that was called Arco. Oil companies were bidding for leases for a certain plot of land in the Gulf of Mexico. We're going to still stick with the original name, the Gulf of Mexico. And what they found was that every time they won one of these auctions, there was less oil there than their engineers had predicted. And they're saying, what's the story? Lousy engineers, or are we just unlucky? And then they figured out, no, actually, it's that if there are lots of people bidding, the winner is going to be the one whose engineers on this plot had the most optimistic forecasts.
Speaker 1: And the winner's curse stems from the fact that we're not tracking the fact that, well, hey, if everybody's bidding as if there's less money in this jar than I think, or if all the other oil engineers are bidding less than I am for this oil, maybe I'm wrong about how much oil or money is really out there. Perspective-taking failures are ones that we talk about a lot on this podcast, because many of them really impact our happiness in these bad ways. Right? We don't realize that other people want to connect with us, or we don't realize that other people don't know we're super grateful, or don't know that it'd be fine if they asked us for help. So there's all these cases in the happiness science where perspective-taking failures kind of mess up how much happiness we could be getting from other people. But this is one where it seems to be messing up a lot of the value that we get from winning, like, quite ironically. Yeah. And so, how do we overcome it?
How do we take into account what other people are bidding, and other people's perspectives, a little bit more, to break this?

Speaker 2: Well, a simple lesson is, in the actual bidding circumstances, the more bidders there are, the less you should bid. Now, that is really counterintuitive advice. If Alex is auctioning off this jar of coins and there are ten of his friends at the bar, and then twenty more people come in and we let them bid, all of Alex's friends should lower their bids. And that's just hard to get your head around.

Speaker 1: Yeah, I feel like my intuition is exactly the opposite.

Speaker 2: Right. It's because you have the intuition, oh, I want to win. No, the goal should be to submit a bid that, if it wins, will be lower than the amount of money. It's very counterintuitive, which is why people get it wrong. It's a classic anomaly.

Speaker 1: Okay, so that was my number one. Number two classic irrationality, which is one that just makes me feel good about the human race, is that people aren't the self-interested jerks that economists think they are.
You mentioned this in your description of Homo economicus, that Homo economicus is just out for themselves. But explain why standard economic theory sort of predicts that.

Speaker 2: In a sense, economists have talked about a world in which people don't care at all about anybody else. Paul Samuelson, one of the great economists of the twentieth century, wrote a paper about what he called the public goods problem. A public good is something that, if you provide it to one person, you provide it to everybody, like a fireworks display. And he said it will be underprovided, because no one will contribute, because they can watch it for free. Now, of course, people do contribute to charities and to public radio and all kinds of other good causes. If you open your eyes and look out the window, you'll notice that people aren't always selfish jerks.

Speaker 1: And this is the kind of thing that experimenters were starting to notice around the time of The Winner's Curse, too. Alex, tell me about some of the classic violations of selfishness that folks talked about early in the literature.
Speaker 3: The real anomaly came when Werner Güth and his colleagues ran something called the ultimatum game. It's super simple. Richard and I are playing the ultimatum game. I have ten dollars, and I decide how to split that with Richard. I give him an ultimatum: I'm the proposer, Richard is the receiver. Richard sees my offer to him. Let's say I say, look, I like money. I'm going to keep nine dollars, I'm going to give a buck to Richard. And then Richard says, I want the dollar. Everything's fine: I end up with nine, Richard ends up with one. But if Richard says no, both of us end up with zero.

Speaker 1: Mm hmm. And so if Homo economicus is the proposer, he should give, you know, one penny, right, and keep nine dollars and ninety-nine cents for himself. And if Richard is a Homo economicus, Richard would be like, wow, one penny is better than nothing, I should go for that. But if you're playing with real humans, if you offer one penny of your ten bucks, most real humans are like, [expletive] you, you jerk. Like, you know, I'll, like, take a hit, right?
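The payoff rules of the game are small enough to write out directly. This is just a sketch of the setup described above, in cents; the two-dollar rejection threshold is the rough cutoff mentioned in the conversation, not a universal constant.

```python
def ultimatum(offer_cents, min_acceptable_cents, pie_cents=1000):
    """One round of the ultimatum game over a $10 pie, in cents.
    Accepting splits the pie; rejecting leaves both players with nothing."""
    if offer_cents >= min_acceptable_cents:
        return pie_cents - offer_cents, offer_cents  # (proposer, receiver)
    return 0, 0

# A textbook Homo economicus receiver accepts even a single penny...
print(ultimatum(1, min_acceptable_cents=1))      # -> (999, 1)
# ...but a receiver with a two-dollar threshold torches the deal instead.
print(ultimatum(100, min_acceptable_cents=200))  # -> (0, 0)
```

The second call is the anomaly: the receiver walks away from a free dollar in order to punish an insulting split.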
And so what did Güth find in the original ultimatum game experiments? Are people Homo economicus, or do they throw some expletives in there?

Speaker 3: That's exactly what he found: basically, people are not Homo economicus. Essentially, under twenty percent of the pie, those offers are rejected. So if I offer something less than two dollars, Richard's going to reject my offer, and both of us end up with nothing. And so this is a real puzzle. So the proposer's decision of how much to give, that could be driven by a lot of different things. But the real puzzle is the receiver saying, I would prefer nothing

Speaker 2: to two dollars.

Speaker 1: So this is in the case of these, like, standard economics games, where we're really setting up these kind of arbitrary situations. You know, I'm giving you ten bucks, these really specific rules. But folks were also finding that people are nicer than you expect when you give them more real-world contexts. Richard, tell me about these famous wallet studies and how they showed that people were nicer than we think.
Speaker 2: Yeah, so there's a guy called Alain Cohn, who was a postdoc here for a while, and he undertook this unbelievably ambitious project where they would take a little wallet that would have some kind of ID. Suppose it's Alex's wallet, and it has something like his email address on it, and sometimes money, and sometimes a key. And they would turn it in at some place like a hotel lobby or a train station. And they did this with thousands of wallets, all around the world. And the question is, what do people do? Are they more or less likely to try and find the owner of the wallet if it has money in it? And if people are selfish jerks, the more money that's there, the less likely they are to turn it in.

Speaker 1: Or at least to turn it in with the money, at least to turn it.

Speaker 2: Right, exactly. Yeah. And the opposite happens: the more money that's there, the more likely that it gets returned, with the money, to the fictional owner.
Speaker 1: And so these are anomalies when it comes to the standard economic model, but they're kind of great when it comes to human nature, that we're kind of naturally not selfish jerks. But a question I had for you, given all the evidence, is, what can we do to get people to become even more cooperative? Are there ways that we can bump this lack of selfishness up even more?

Speaker 3: Yeah. So one of the things that's been shown to matter a lot is how much people can connect interpersonally. So any sort of communication between people, before some sort of potential exchange takes place or any sort of decision takes place, facilitates more cooperative, less selfish behavior. People kind of get to know each other, that connection brings them closer together, and then they're a lot less selfish. So that's one thing that really, very intuitively, boosts cooperation. The other thing is that in society we have a bunch of norms.
Some of those norms involve punishment of people 346 00:19:00,836 --> 00:19:03,596 Speaker 3: who don't cooperate, and when you introduce these sorts of 347 00:19:03,636 --> 00:19:06,636 Speaker 3: norms into these kinds of very abstract games, it turns out 348 00:19:06,716 --> 00:19:10,236 Speaker 3: that boosts cooperation tremendously. One of the largest effects in the 349 00:19:10,236 --> 00:19:12,916 Speaker 3: behavioral economics literature is that when you take a standard 350 00:19:12,916 --> 00:19:15,436 Speaker 3: sort of abstract game where people can contribute to the 351 00:19:15,476 --> 00:19:18,276 Speaker 3: public good or something like that, if you introduce the 352 00:19:18,316 --> 00:19:22,436 Speaker 3: opportunity for other people to punish non-cooperators at a 353 00:19:22,516 --> 00:19:24,916 Speaker 3: cost to themselves, by the way, which is itself an 354 00:19:24,956 --> 00:19:28,316 Speaker 3: anomaly in the standard economic model. If it costs me something 355 00:19:28,356 --> 00:19:30,836 Speaker 3: to punish somebody else, I would never do it. But 356 00:19:30,996 --> 00:19:35,716 Speaker 3: it turns out people do, and because there's this threat of punishment, 357 00:19:35,996 --> 00:19:39,196 Speaker 3: everybody ends up cooperating. So this is kind of like 358 00:19:39,236 --> 00:19:42,356 Speaker 3: a little model of society where you take a group 359 00:19:42,396 --> 00:19:45,796 Speaker 3: of people interacting, you introduce the types of things that 360 00:19:45,836 --> 00:19:47,636 Speaker 3: we see in the real world, like norms and the 361 00:19:47,676 --> 00:19:49,916 Speaker 3: ability to punish, and all of a sudden, the world 362 00:19:49,956 --> 00:19:51,396 Speaker 3: looks a lot less selfish. 363 00:19:51,796 --> 00:19:54,196 Speaker 1: So it seems like there are some fun happiness implications from 364 00:19:54,196 --> 00:19:56,756 Speaker 1: this anomaly. It seems like happiness.
Implication number one is like, 365 00:19:56,836 --> 00:19:59,716 Speaker 1: we can just trust people more than we might think. 366 00:20:00,076 --> 00:20:02,276 Speaker 1: The other is that there are ways you can boost cooperation 367 00:20:02,396 --> 00:20:05,516 Speaker 1: even more: talk to people more, get more social connection, 368 00:20:05,636 --> 00:20:08,716 Speaker 1: which honestly is great for happiness anyway, and if possible, 369 00:20:08,756 --> 00:20:11,436 Speaker 1: give people the opportunity to set up norms where 370 00:20:11,476 --> 00:20:13,516 Speaker 1: you can call out bad actors, which is good. 371 00:20:13,996 --> 00:20:18,996 Speaker 2: Yeah. The one finding is that if you are trusting, 372 00:20:19,956 --> 00:20:25,996 Speaker 2: you produce more trustworthy behavior. You know, in the old 373 00:20:26,276 --> 00:20:32,876 Speaker 2: book I talked about farm stands in Ithaca, where I 374 00:20:32,956 --> 00:20:36,196 Speaker 2: used to teach at Cornell, where a farmer would put 375 00:20:36,276 --> 00:20:41,196 Speaker 2: out like fresh corn on the honor system, and people 376 00:20:41,276 --> 00:20:46,196 Speaker 2: would put money in the box. And Alex and some 377 00:20:46,276 --> 00:20:51,316 Speaker 2: of our behavioral economics friends were out hiking in the 378 00:20:51,436 --> 00:20:56,836 Speaker 2: Swiss Alps this summer and came across a place that 379 00:20:56,956 --> 00:20:59,916 Speaker 2: had wine and cheese. What were they selling, Alex? 380 00:20:59,956 --> 00:21:02,356 Speaker 3: There were like, you know, these little cabins 381 00:21:02,396 --> 00:21:05,676 Speaker 3: secluded in the Alps. There's nobody there. You come in, 382 00:21:05,756 --> 00:21:08,596 Speaker 3: there's wine and cheese set up. You take however much 383 00:21:08,596 --> 00:21:10,116 Speaker 3: you want, then you put your money in the little 384 00:21:10,156 --> 00:21:10,916 Speaker 3: box and you leave.
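The costly-punishment result Alex describes can be sketched as a toy public-goods game. The endowment, multiplier, fine, and threshold below are illustrative assumptions, not the parameters of any published experiment:

```python
# Toy public-goods game: everyone gets an endowment, contributes some of it
# to a common pot, and the pot is multiplied and split equally.
def payoffs(contributions, endowment=10, multiplier=1.6):
    share = sum(contributions) * multiplier / len(contributions)
    return [endowment - c + share for c in contributions]

# Costly punishment: each high contributor pays `cost` to fine each player
# who contributed below `threshold`. Paying to punish is itself an anomaly
# in the standard model, but experimentally people do it.
def punish(contributions, base_payoffs, fine=4, cost=1, threshold=5):
    out = list(base_payoffs)
    for i, ci in enumerate(contributions):
        for j, cj in enumerate(contributions):
            if i != j and ci >= threshold and cj < threshold:
                out[i] -= cost  # the punisher pays something
                out[j] -= fine  # the free rider is fined
    return out

group = [10, 10, 10, 0]      # three cooperators and one free rider
base = payoffs(group)        # free rider earns the most: 22 vs 12
after = punish(group, base)  # now the free rider earns the least: 10 vs 11
```

Under these made-up numbers, free riding beats cooperating until punishment is on the table; once it is, contributing (a payoff of 16 if all four chip in) beats free riding, which is the flip toward cooperation Alex describes.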
385 00:21:11,116 --> 00:21:14,036 Speaker 1: It's less quaint than Cornell, I guess, with fresh corn 386 00:21:14,156 --> 00:21:16,036 Speaker 1: versus really nice French wine. 387 00:21:16,036 --> 00:21:18,236 Speaker 2: But yeah, but you have to hike up there to 388 00:21:18,236 --> 00:21:18,556 Speaker 2: get it. 389 00:21:21,196 --> 00:21:23,436 Speaker 1: Coming up after the break, we'll dive into more of 390 00:21:23,516 --> 00:21:26,996 Speaker 1: my favorite irrationalities, like why I keep paying for a gym 391 00:21:27,036 --> 00:21:29,276 Speaker 1: that I've only ever used once, and why it's so 392 00:21:29,396 --> 00:21:32,636 Speaker 1: hard to actually redeem the frequent flyer miles I've been hoarding. 393 00:21:33,276 --> 00:21:43,996 Speaker 1: The Happiness Lab will be right back. All right, so 394 00:21:44,036 --> 00:21:47,916 Speaker 1: now we're moving on to irrationality number three, something that 395 00:21:47,956 --> 00:21:51,476 Speaker 1: I struggle with a lot personally, which is that as humans, 396 00:21:51,516 --> 00:21:53,956 Speaker 1: we seem to have more of a problem with inertia 397 00:21:54,036 --> 00:21:57,556 Speaker 1: than the standard economic model might predict. Richard, tell me 398 00:21:57,596 --> 00:22:01,036 Speaker 1: about the classic mug study that showed us the problems 399 00:22:01,036 --> 00:22:02,476 Speaker 1: that people face with inertia. 400 00:22:03,236 --> 00:22:07,676 Speaker 2: Yeah. So there's something that I originally called the endowment effect, 401 00:22:07,996 --> 00:22:12,636 Speaker 2: because your endowment is something you own, and the empirical 402 00:22:12,676 --> 00:22:18,396 Speaker 2: result is that we value stuff we have more than 403 00:22:19,116 --> 00:22:20,956 Speaker 2: we would be willing to pay to get it.
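Richard's definition can be written down as a tiny loss-aversion model, the account Alex gives a bit later in the conversation: a loss is weighted about twice as heavily as an equivalent gain. The coefficient of 2 and the mug's dollar value are illustrative assumptions, not fitted estimates:

```python
LOSS_AVERSION = 2.0  # "a loss hurts about twice ... an equivalent gain feels good"

def willingness_to_pay(mug_value):
    # A buyer without a mug typically codes the money spent as a forgone
    # gain, so the most they will offer is just the mug's value to them.
    return mug_value

def willingness_to_accept(mug_value):
    # An owner codes giving up the mug as a loss, so its value gets
    # weighted by the loss-aversion coefficient before they will part with it.
    return LOSS_AVERSION * mug_value

mug = 3.0                            # hypothetical value of the mug, in dollars
buyers = willingness_to_pay(mug)     # buyers offer about $3
sellers = willingness_to_accept(mug) # owners demand about $6
```

The sellers-to-buyers ratio comes out at the factor of roughly two that the mug experiments found, even though both groups value the mug itself identically.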
404 00:22:21,236 --> 00:22:23,236 Speaker 1: So maybe have me and Alex play. If you're putting 405 00:22:23,276 --> 00:22:25,356 Speaker 1: me in an endowment effect study, how would you do it? 406 00:22:26,036 --> 00:22:29,036 Speaker 2: You and Alex are in a class, sitting next to 407 00:22:29,116 --> 00:22:33,236 Speaker 2: each other, and I go around and I give half 408 00:22:33,356 --> 00:22:38,636 Speaker 2: of you a Yale University coffee mug. It says Boola 409 00:22:38,756 --> 00:22:42,196 Speaker 2: Boola or something on the mug. 410 00:22:41,956 --> 00:22:44,236 Speaker 1: Boola Boola, which, of course, for those that don't know, 411 00:22:44,396 --> 00:22:47,556 Speaker 1: is the Yale fight chant. There are literally those mugs 412 00:22:47,556 --> 00:22:48,196 Speaker 1: in the bookstore. 413 00:22:48,236 --> 00:22:51,556 Speaker 2: So yeah, yeah, yeah. And then we say, all right, 414 00:22:51,596 --> 00:22:53,956 Speaker 2: what we're going to do is we're going to have 415 00:22:53,996 --> 00:22:57,476 Speaker 2: a market for these mugs. Laurie, you have a mug, 416 00:22:57,796 --> 00:23:00,876 Speaker 2: and you're asked, for each of the following prices, are 417 00:23:00,876 --> 00:23:03,876 Speaker 2: you willing to sell it or not? So ten dollars, 418 00:23:04,276 --> 00:23:07,996 Speaker 2: nine fifty, and so forth down. And Alex is asked, 419 00:23:08,356 --> 00:23:11,476 Speaker 2: at each of these, are you willing to buy one? Now, 420 00:23:12,116 --> 00:23:17,036 Speaker 2: a principle of economic rationality is if you wouldn't pay 421 00:23:17,156 --> 00:23:21,076 Speaker 2: six dollars to get it, then you should be willing 422 00:23:21,116 --> 00:23:24,316 Speaker 2: to sell it for six dollars.
Suppose we hand out 423 00:23:24,356 --> 00:23:27,836 Speaker 2: five dollar bills. Well, people will be willing to trade 424 00:23:27,876 --> 00:23:31,436 Speaker 2: those for six dollars, and they won't pay six dollars 425 00:23:31,476 --> 00:23:34,836 Speaker 2: to get one. Right? So mugs should be like five 426 00:23:34,876 --> 00:23:39,196 Speaker 2: dollar bills. But when we run those experiments, the people 427 00:23:39,196 --> 00:23:43,156 Speaker 2: who get mugs demand twice as much to give them 428 00:23:43,236 --> 00:23:47,756 Speaker 2: up as the people who randomly didn't get a mug 429 00:23:48,156 --> 00:23:51,356 Speaker 2: are willing to pay to get it. And notice it's 430 00:23:51,436 --> 00:23:56,916 Speaker 2: not like we're asking about your favorite hat, right? This 431 00:23:57,116 --> 00:24:01,476 Speaker 2: mug has been in your possession for about two minutes 432 00:24:02,076 --> 00:24:05,916 Speaker 2: and you haven't grown to love it, but you act 433 00:24:05,996 --> 00:24:06,436 Speaker 2: like you do. 434 00:24:06,836 --> 00:24:09,516 Speaker 1: So Alex, explain how the endowment effect is an example of 435 00:24:09,556 --> 00:24:12,236 Speaker 1: us showing inertia, because I think that it's part of 436 00:24:12,276 --> 00:24:14,476 Speaker 1: a broader set of biases that you talk about in 437 00:24:14,516 --> 00:24:16,356 Speaker 1: the book that I find really fascinating. 438 00:24:16,996 --> 00:24:17,236 Speaker 2: Yeah. 439 00:24:17,236 --> 00:24:21,156 Speaker 3: So basically, there's a bunch of different biases, each 440 00:24:21,276 --> 00:24:24,356 Speaker 3: essentially an effect that's observed in a particular context, that 441 00:24:24,356 --> 00:24:27,756 Speaker 3: are kind of driven by the same sort of underlying psychology, 442 00:24:27,796 --> 00:24:31,676 Speaker 3: which you can describe as inertia, and that is driven by 443 00:24:31,716 --> 00:24:36,276 Speaker 3: something that we discuss as loss aversion in particular.
Look, I'm 444 00:24:36,396 --> 00:24:41,156 Speaker 3: already here. I have this thing or this activity that 445 00:24:41,196 --> 00:24:43,756 Speaker 3: I'm doing, or whatever you want to call it. This 446 00:24:43,836 --> 00:24:46,276 Speaker 3: is now kind of my status quo. This is where 447 00:24:46,316 --> 00:24:49,876 Speaker 3: I'm at. Lose that thing, and that loss really hurts. 448 00:24:50,316 --> 00:24:54,076 Speaker 3: In fact, a loss hurts about twice, maybe sometimes more, 449 00:24:54,436 --> 00:24:56,916 Speaker 3: than an equivalent gain feels good. This is what we 450 00:24:56,956 --> 00:24:58,636 Speaker 3: call loss aversion, and this 451 00:24:58,596 --> 00:25:00,756 Speaker 1: fits with the mug results that Richard was just talking about, 452 00:25:00,756 --> 00:25:03,316 Speaker 1: because people are demanding twice as much to sell this 453 00:25:03,436 --> 00:25:06,036 Speaker 1: mug as buyers are willing to pay. It's like losing 454 00:25:06,076 --> 00:25:08,556 Speaker 1: the mug kind of hits them twice as hard financially. 455 00:25:09,116 --> 00:25:09,396 Speaker 2: Exactly. 456 00:25:09,516 --> 00:25:11,956 Speaker 3: Now this mug is mine. This is kind of now my 457 00:25:11,996 --> 00:25:14,836 Speaker 3: status quo; giving it up will hurt a lot. So 458 00:25:14,916 --> 00:25:18,156 Speaker 3: I need more money to be compensated for that pain. 459 00:25:18,396 --> 00:25:20,036 Speaker 1: And so that's in the context of, again, this sort 460 00:25:20,036 --> 00:25:23,036 Speaker 1: of toy example where you're selling mugs, but this kind 461 00:25:23,076 --> 00:25:25,916 Speaker 1: of status quo inertia bias is something that lots of 462 00:25:25,956 --> 00:25:29,076 Speaker 1: companies are using against us all the time. I have 463 00:25:29,116 --> 00:25:31,676 Speaker 1: my own example right now. I have paid to join 464 00:25:31,716 --> 00:25:34,276 Speaker 1: a gym as part of this summer program.
This really 465 00:25:34,276 --> 00:25:36,276 Speaker 1: cool bouldering gym near my house was like, we have 466 00:25:36,356 --> 00:25:38,196 Speaker 1: yoga classes, and I was like, oh great, I'm going 467 00:25:38,276 --> 00:25:40,436 Speaker 1: to join the bouldering gym. And it turns out that 468 00:25:40,516 --> 00:25:42,876 Speaker 1: so far this summer, I've had a very expensive single 469 00:25:42,956 --> 00:25:46,396 Speaker 1: yoga class for the price of the entire summer enrollment. 470 00:25:46,476 --> 00:25:48,436 Speaker 1: It's like a three hundred dollar yoga class, which is 471 00:25:48,436 --> 00:25:51,716 Speaker 1: really embarrassing. But this is my status quo bias at work. 472 00:25:51,996 --> 00:25:53,356 Speaker 1: Richard, explain why this is. 473 00:25:54,076 --> 00:25:59,556 Speaker 2: Yeah, our friends, the married couple of Stefano DellaVigna and Ulrike Malmendier, 474 00:26:00,036 --> 00:26:02,476 Speaker 2: wrote a paper about this when they were grad students 475 00:26:02,716 --> 00:26:05,156 Speaker 2: that was called Paying Not to Go to the Gym, 476 00:26:05,796 --> 00:26:10,276 Speaker 2: which is what Laurie did this summer. You know, this 477 00:26:10,356 --> 00:26:14,516 Speaker 2: is an example of what's called status quo bias. Whatever 478 00:26:14,556 --> 00:26:17,996 Speaker 2: the status quo is, you tend to stick with. We 479 00:26:18,116 --> 00:26:24,156 Speaker 2: all know this happens. Suppose you're watching some show, say 480 00:26:24,196 --> 00:26:29,516 Speaker 2: you're streaming, and an episode ends and then the next 481 00:26:29,516 --> 00:26:33,316 Speaker 2: one starts.
So if you do nothing, which people are 482 00:26:33,356 --> 00:26:37,276 Speaker 2: really good at, then you start watching the next one, 483 00:26:37,316 --> 00:26:40,036 Speaker 2: and the next thing you know, you've watched four episodes of 484 00:26:40,116 --> 00:26:44,556 Speaker 2: this stupid sitcom that really wasn't all that much fun, 485 00:26:44,636 --> 00:26:47,716 Speaker 2: but you just found yourself doing it. You know, you 486 00:26:47,756 --> 00:26:51,596 Speaker 2: can use this for good or for bad, and firms 487 00:26:51,636 --> 00:26:57,356 Speaker 2: have figured this out. So that gym, warning, Laurie: they 488 00:26:57,396 --> 00:27:04,836 Speaker 2: are going to automatically renew your subscription. Yeah, right? And 489 00:27:04,876 --> 00:27:07,436 Speaker 2: you know what, you're only going to go once this 490 00:27:07,716 --> 00:27:11,876 Speaker 2: fall because you'll be even busier. So I'm gonna save 491 00:27:11,916 --> 00:27:14,876 Speaker 2: you a lot of money here and suggest you quit now. 492 00:27:15,556 --> 00:27:21,156 Speaker 2: But we've used exactly that trick for good. When the 493 00:27:21,356 --> 00:27:25,196 Speaker 2: new kind of retirement plans, 401(k)-type plans, 494 00:27:25,956 --> 00:27:29,396 Speaker 2: first came on the scene, one of the problems was 495 00:27:29,516 --> 00:27:33,476 Speaker 2: that people just didn't sign up even if their employer 496 00:27:33,716 --> 00:27:37,716 Speaker 2: was matching their contributions dollar for dollar, which is really 497 00:27:37,756 --> 00:27:42,156 Speaker 2: really stupid. And so we got the clever idea, why 498 00:27:42,196 --> 00:27:46,156 Speaker 2: don't we just change the default and say, hey, Laurie, 499 00:27:46,556 --> 00:27:48,436 Speaker 2: we have a new retirement plan.
We're going to put 500 00:27:48,476 --> 00:27:51,676 Speaker 2: you in unless you fill out some form and say 501 00:27:51,676 --> 00:27:55,916 Speaker 2: you don't want it, and boom, that gets enrollment up 502 00:27:55,956 --> 00:28:00,236 Speaker 2: to ninety percent. And notice, that's something economists would say 503 00:28:00,276 --> 00:28:04,356 Speaker 2: will have no effect, because what kind of idiot would 504 00:28:04,436 --> 00:28:09,116 Speaker 2: turn down a dollar for dollar match of six percent 505 00:28:09,276 --> 00:28:13,076 Speaker 2: of their salary? No idiot would say no because, oh, 506 00:28:13,116 --> 00:28:15,436 Speaker 2: I have to fill out a one-page form. But 507 00:28:16,196 --> 00:28:19,516 Speaker 2: you know, enrollments went from fifty percent to ninety percent 508 00:28:20,076 --> 00:28:24,556 Speaker 2: just by doing that. So that was using inertia for good. 509 00:28:25,076 --> 00:28:30,316 Speaker 2: Automatically renewing your gym membership is for profit, and we 510 00:28:30,396 --> 00:28:35,116 Speaker 2: observe both. And there are lots of policy decisions. One 511 00:28:35,156 --> 00:28:37,476 Speaker 2: that is a little up in the air: in the last 512 00:28:37,756 --> 00:28:42,916 Speaker 2: weeks of the Biden administration, they passed a rule saying 513 00:28:43,636 --> 00:28:47,036 Speaker 2: that you have to be able to unsubscribe the same 514 00:28:47,116 --> 00:28:51,916 Speaker 2: way you subscribed. So if it was with one click, 515 00:28:52,556 --> 00:28:57,756 Speaker 2: it should be one click. Some gyms, during COVID, 516 00:28:58,356 --> 00:29:02,236 Speaker 2: were requiring people to come to the gym, which was 517 00:29:02,356 --> 00:29:08,596 Speaker 2: closed, in person to quit. That's really evil, right? So 518 00:29:09,116 --> 00:29:11,116 Speaker 2: this rule is up in the air.
I don't know 519 00:29:11,396 --> 00:29:14,796 Speaker 2: whether it's going to be enforced or not, but I 520 00:29:14,836 --> 00:29:17,516 Speaker 2: strongly believe that's a rule for good. 521 00:29:17,916 --> 00:29:20,356 Speaker 1: So it seems like the happiness lesson from this problem 522 00:29:20,396 --> 00:29:23,436 Speaker 1: we have with inertia is like, if you're stuck in 523 00:29:23,516 --> 00:29:26,076 Speaker 1: some status quo, make sure it's a status quo that 524 00:29:26,076 --> 00:29:28,356 Speaker 1: you like, or build your own status quos that might 525 00:29:28,396 --> 00:29:31,516 Speaker 1: be helpful. But try to notice when you're getting stuck 526 00:29:31,556 --> 00:29:33,156 Speaker 1: in something just because it was the thing that you're 527 00:29:33,236 --> 00:29:33,476 Speaker 1: used to. 528 00:29:33,716 --> 00:29:34,636 Speaker 2: Yeah, okay. So. 529 00:29:34,596 --> 00:29:38,196 Speaker 1: That was irrationality number three that I think has interesting 530 00:29:38,236 --> 00:29:41,156 Speaker 1: implications for happiness. Now we get to irrationality number four, 531 00:29:41,636 --> 00:29:43,596 Speaker 1: which is, I think, a very very big one when 532 00:29:43,636 --> 00:29:45,916 Speaker 1: it comes to our happiness. And this is this idea 533 00:29:46,036 --> 00:29:50,356 Speaker 1: that we have a defective telescope, as economist Arthur Pigou 534 00:29:50,436 --> 00:29:53,556 Speaker 1: put it. What does Pigou mean by a defective telescope? 535 00:29:54,316 --> 00:30:00,316 Speaker 2: The idea is that the difference between today and tomorrow 536 00:30:01,036 --> 00:30:06,596 Speaker 2: seems bigger than the difference between two adjacent days a 537 00:30:06,636 --> 00:30:12,956 Speaker 2: year from now, irrationally so. So Laurie is going to say, well, today, 538 00:30:13,076 --> 00:30:16,636 Speaker 2: she's busy, she's taping this podcast, she doesn't have time 539 00:30:16,676 --> 00:30:19,556 Speaker 2: to quit that gym.
She's going to do it tomorrow, 540 00:30:19,876 --> 00:30:23,956 Speaker 2: and tomorrow there's going to be something else, so we 541 00:30:24,076 --> 00:30:27,556 Speaker 2: all procrastinate. That's maybe one reason it took just four 542 00:30:27,636 --> 00:30:31,636 Speaker 2: years to do this simple revision of this book; Alex 543 00:30:31,676 --> 00:30:33,876 Speaker 2: had a couple of kids, so he has an excuse. 544 00:30:34,436 --> 00:30:40,076 Speaker 2: So that defective telescope affects things like saving for retirement, 545 00:30:40,476 --> 00:30:45,636 Speaker 2: because retirement seems like it's a long way off, and 546 00:30:45,676 --> 00:30:48,876 Speaker 2: then suddenly you wake up and you're old like me. 547 00:30:49,196 --> 00:30:51,196 Speaker 1: I mean, I'll give you an even closer to home 548 00:30:51,276 --> 00:30:53,876 Speaker 1: example for listeners who might not be as old as 549 00:30:53,916 --> 00:30:54,956 Speaker 1: you are, Richard, which 550 00:30:54,796 --> 00:30:58,236 Speaker 2: Imagine that. 551 00:30:57,236 --> 00:31:00,276 Speaker 1: which is just what's happening in your calendar weeks or 552 00:31:00,316 --> 00:31:03,556 Speaker 1: months from now. You know, today Laurie, if you ask me, hey, 553 00:31:03,596 --> 00:31:05,356 Speaker 1: would you want to write a chapter for a book 554 00:31:05,356 --> 00:31:08,036 Speaker 1: that doesn't even sound that interesting, I'm like, no way. 555 00:31:08,156 --> 00:31:10,876 Speaker 1: But if you ask me, does December Laurie want to 556 00:31:10,876 --> 00:31:12,956 Speaker 1: put some time into writing a chapter that sounds kind 557 00:31:12,956 --> 00:31:15,036 Speaker 1: of interesting? I'm like, oh my god, December Laurie would 558 00:31:15,036 --> 00:31:17,276 Speaker 1: love that. She would love to spend her time doing 559 00:31:17,276 --> 00:31:19,956 Speaker 1: that thing.
And so I feel like my personal calendar 560 00:31:20,236 --> 00:31:24,316 Speaker 1: is filled with these instances of past Laurie's myopia. Like 561 00:31:24,436 --> 00:31:27,516 Speaker 1: somehow she thought that, you know, doing this podcast interview 562 00:31:27,516 --> 00:31:29,636 Speaker 1: today would be totally fine to do with a 563 00:31:29,676 --> 00:31:32,196 Speaker 1: bunch of student meetings, and she doesn't need lunch that day. 564 00:31:32,236 --> 00:31:33,636 Speaker 1: She'll just squeeze other things in. 565 00:31:34,236 --> 00:31:35,996 Speaker 3: And I think that, you know, in terms of like 566 00:31:36,916 --> 00:31:39,676 Speaker 3: rules in order to kind of solve these sorts of issues, 567 00:31:39,876 --> 00:31:42,636 Speaker 3: my favorite one is: if that person asks you 568 00:31:42,716 --> 00:31:44,716 Speaker 3: to do that today, would you do it? If the 569 00:31:44,756 --> 00:31:46,516 Speaker 3: answer is no, don't do it a year from now, 570 00:31:46,676 --> 00:31:49,676 Speaker 3: because a year from now, unless something real bad happens, 571 00:31:49,716 --> 00:31:52,516 Speaker 3: will at some point become today, and you will be like, 572 00:31:52,916 --> 00:31:53,916 Speaker 3: holy, you 573 00:31:53,876 --> 00:31:57,716 Speaker 1: know. Yeah. No, I think this is so powerful, right, 574 00:31:57,756 --> 00:32:00,436 Speaker 1: because I think one of the big happiness implications of 575 00:32:00,476 --> 00:32:03,716 Speaker 1: this defective telescope is that we're just constantly screwing over 576 00:32:03,876 --> 00:32:07,316 Speaker 1: our future selves. Interestingly, we talked recently on a podcast 577 00:32:07,396 --> 00:32:09,716 Speaker 1: about a different way we screw over our future selves 578 00:32:09,756 --> 00:32:11,996 Speaker 1: that I'd be curious to think how behavioral economists think 579 00:32:12,036 --> 00:32:14,916 Speaker 1: about this.
Behavioral economists are constantly talking about cases of 580 00:32:14,956 --> 00:32:18,596 Speaker 1: myopia, but happiness scientists often consider these cases of 581 00:32:18,596 --> 00:32:22,156 Speaker 1: what's called hyperopia, right, where you kind of wind up 582 00:32:22,196 --> 00:32:24,916 Speaker 1: saving these good things for the future that you never 583 00:32:25,036 --> 00:32:27,436 Speaker 1: end up enjoying. So I'm thinking of cases of like 584 00:32:27,556 --> 00:32:30,076 Speaker 1: my frequent flyer miles. I'm sitting on one hundred thousand 585 00:32:30,076 --> 00:32:32,756 Speaker 1: frequent flyer miles that I'm like, someday I'll want to 586 00:32:33,116 --> 00:32:35,956 Speaker 1: enjoy these, but I never actually plan the vacation to 587 00:32:36,076 --> 00:32:38,236 Speaker 1: use them. Or, you know, a really nice bottle of 588 00:32:38,276 --> 00:32:40,476 Speaker 1: wine that a good friend gave me for a birthday, 589 00:32:40,676 --> 00:32:42,476 Speaker 1: and I'm like, oh, I want to wait for the 590 00:32:42,516 --> 00:32:45,356 Speaker 1: special day to use that, and then I never use it, 591 00:32:45,396 --> 00:32:47,436 Speaker 1: and years later I'll probably open it and it's corked 592 00:32:47,436 --> 00:32:50,036 Speaker 1: and I've forgotten about it. Are there hyperopic cases 593 00:32:50,076 --> 00:32:51,436 Speaker 1: that behavioral economists think about? 594 00:32:52,116 --> 00:32:56,556 Speaker 2: Yeah. So I have a student named Suzanne Shu, and 595 00:32:56,636 --> 00:33:00,956 Speaker 2: she talks about the example that you have two tomatoes 596 00:33:00,996 --> 00:33:05,716 Speaker 2: sitting on your kitchen counter, and one is perfect, it's 597 00:33:05,916 --> 00:33:12,836 Speaker 2: like God's tomato, and one is two days past its prime. Which 598 00:33:12,916 --> 00:33:18,356 Speaker 2: one do you eat tonight? And in my family we 599 00:33:18,436 --> 00:33:23,396 Speaker 2: refer to the perfect one as Suzanne's tomato.
My wife 600 00:33:23,436 --> 00:33:27,396 Speaker 2: is more frugal than me, and she can't bear the 601 00:33:27,516 --> 00:33:33,956 Speaker 2: thought of throwing away that almost still good tomato, whereas 602 00:33:34,036 --> 00:33:36,316 Speaker 2: I always want to go for the perfect one. 603 00:33:36,636 --> 00:33:39,836 Speaker 1: And that's important because if you wait till tomorrow, Suzanne's 604 00:33:39,836 --> 00:33:41,916 Speaker 1: tomato is going to be not so good too. 605 00:33:42,236 --> 00:33:46,916 Speaker 2: You never eat Suzanne's tomato, that's the problem. So you 606 00:33:47,036 --> 00:33:50,516 Speaker 2: gotta go for Suzanne's tomato when it's there. 607 00:33:51,156 --> 00:33:53,596 Speaker 3: Our friend George Loewenstein has, and we talk about this 608 00:33:53,636 --> 00:33:57,236 Speaker 3: in the book, this idea of anticipatory utility. You were 609 00:33:57,236 --> 00:33:59,956 Speaker 3: talking about a nice bottle of wine or all of 610 00:33:59,956 --> 00:34:02,916 Speaker 3: these travel miles or something like that. You're kind of 611 00:34:02,916 --> 00:34:05,236 Speaker 3: saving it for something. You have this dream in your 612 00:34:05,236 --> 00:34:07,676 Speaker 3: head that there's gonna be this perfect event where you 613 00:34:07,956 --> 00:34:10,516 Speaker 3: open up the bottle, you have this incredible dinner party, 614 00:34:10,636 --> 00:34:13,076 Speaker 3: everything is amazing. It's gonna happen at some point in 615 00:34:13,116 --> 00:34:15,116 Speaker 3: the future. And what this allows you to do is 616 00:34:15,156 --> 00:34:17,436 Speaker 3: kind of just like go on imagining that that's gonna happen, 617 00:34:17,756 --> 00:34:20,716 Speaker 3: which provides you some sort of happiness over time. And 618 00:34:20,756 --> 00:34:25,276 Speaker 3: this sort of anticipatory utility leads to what looks like 619 00:34:25,796 --> 00:34:28,916 Speaker 3: putting nice things into the future too much.
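Pigou's defective telescope from earlier in this irrationality is often formalized as quasi-hyperbolic ("beta-delta") discounting: everything after today is scaled down by an extra present-bias factor. The values of beta and delta below are illustrative assumptions, not estimates from any study:

```python
BETA, DELTA = 0.5, 0.999  # present bias and ordinary daily discounting

def weight(days_from_now):
    # How much a reward that many days away counts, relative to today.
    return 1.0 if days_from_now == 0 else BETA * DELTA ** days_from_now

# Today vs tomorrow looks like a cliff...
near_gap = weight(0) - weight(1)
# ...while day 365 vs day 366 looks flat, which is why "December Laurie"
# happily agrees to things that "today Laurie" would refuse.
far_gap = weight(365) - weight(366)
```

With these made-up numbers, the today-tomorrow gap is over a thousand times the gap between two adjacent days a year out, which is why the commitment keeps sliding to "tomorrow."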
620 00:34:29,276 --> 00:34:31,716 Speaker 1: And so it seems like we want to embrace the savoring. 621 00:34:31,756 --> 00:34:34,836 Speaker 1: We want to embrace the anticipatory utility. But maybe just 622 00:34:34,876 --> 00:34:38,036 Speaker 1: put in the calendar that, like, you know, October twenty first, 623 00:34:38,196 --> 00:34:41,716 Speaker 1: I'm gonna drink the good wine, or maybe, like, in January 624 00:34:41,756 --> 00:34:44,556 Speaker 1: write in your calendar: use the frequent flyer miles this month. 625 00:34:44,356 --> 00:34:47,316 Speaker 2: And next time Thaler's in New Haven, that's when you 626 00:34:47,396 --> 00:34:47,756 Speaker 2: open that. 627 00:34:49,916 --> 00:34:52,276 Speaker 1: We'll see if we have some Suzanne's tomatoes then. I 628 00:34:52,276 --> 00:34:54,516 Speaker 1: don't know if we will. Okay. When we get back 629 00:34:54,516 --> 00:34:57,796 Speaker 1: from the break, we'll dive into my final two favorite anomalies, 630 00:34:58,436 --> 00:35:01,756 Speaker 1: like why Dustin Hoffman hit up Gene Hackman for lunch money, 631 00:35:02,196 --> 00:35:05,996 Speaker 1: and why people buy convertibles in snowy climates. The Happiness 632 00:35:06,076 --> 00:35:16,916 Speaker 1: Lab will be back in a moment. Irrationality number five 633 00:35:17,076 --> 00:35:20,476 Speaker 1: gets us into monetary territory, and it's the idea that 634 00:35:20,716 --> 00:35:24,316 Speaker 1: money is actually fungible, but we don't treat it like that. Alex, 635 00:35:24,396 --> 00:35:25,956 Speaker 1: what's this idea of fungibility?
It'll buy you 640 00:35:36,836 --> 00:35:39,116 Speaker 3: the same thing. So you should spend it the same way, 641 00:35:39,156 --> 00:35:41,396 Speaker 3: regardless of how you got it. So let's say you 642 00:35:41,476 --> 00:35:44,716 Speaker 3: got a nice bonus at work, come home, you think 643 00:35:44,716 --> 00:35:47,396 Speaker 3: about how you want to spend that five hundred dollars, or 644 00:35:47,836 --> 00:35:50,356 Speaker 3: you found five hundred dollars in an envelope on the ground, you 645 00:35:50,396 --> 00:35:52,316 Speaker 3: think about how you want to spend it. You should 646 00:35:52,356 --> 00:35:54,916 Speaker 3: basically spend it the same way. It doesn't really matter 647 00:35:54,956 --> 00:35:57,036 Speaker 3: how you got it. The money is worth the same 648 00:35:57,076 --> 00:35:59,876 Speaker 3: to you. You should spend it exactly the same way. 649 00:36:00,276 --> 00:36:05,716 Speaker 3: And this idea of fungibility underlies the very very basic 650 00:36:05,796 --> 00:36:07,116 Speaker 3: principles of economics. 651 00:36:07,396 --> 00:36:09,676 Speaker 1: Except the problem is that we tend not to do that. 652 00:36:09,956 --> 00:36:13,076 Speaker 1: In fact, we violate this fungibility principle in pretty much 653 00:36:13,076 --> 00:36:15,756 Speaker 1: every behavior we engage in. Richard, in the book, you 654 00:36:15,836 --> 00:36:17,956 Speaker 1: use an example of the way we violate it when 655 00:36:17,996 --> 00:36:21,116 Speaker 1: we're purchasing gas under certain situations. I hadn't heard about 656 00:36:21,116 --> 00:36:22,516 Speaker 1: this one. Share the gas example. 657 00:36:23,196 --> 00:36:26,836 Speaker 2: Yeah. So one of the big themes in the book 658 00:36:26,876 --> 00:36:30,436 Speaker 2: in terms of methodology, Alex referred to this earlier.
is that a lot of 659 00:36:30,876 --> 00:36:34,316 Speaker 2: stuff that was demonstrated with thought experiments 660 00:36:34,556 --> 00:36:39,476 Speaker 2: or lab experiments has now been replicated with actual data. 661 00:36:40,076 --> 00:36:43,436 Speaker 2: And this is a good example of that. Our friends 662 00:36:43,636 --> 00:36:48,596 Speaker 2: Jesse Shapiro and Justine Hastings looked at what happened when 663 00:36:48,996 --> 00:36:53,516 Speaker 2: the price of gasoline fell by fifty percent during the 664 00:36:53,596 --> 00:36:59,116 Speaker 2: financial crisis, right? So this was really bad times. Also unemployment. 665 00:36:59,476 --> 00:37:04,396 Speaker 2: People are cutting back on everything. But gas has gotten cheap. 666 00:37:04,996 --> 00:37:08,236 Speaker 2: So they were spending eighty dollars a week on gas 667 00:37:08,276 --> 00:37:12,036 Speaker 2: and now it's forty. So there's forty dollars extra in 668 00:37:12,076 --> 00:37:15,476 Speaker 2: their budget. And what do they do with that windfall? Well, 669 00:37:15,476 --> 00:37:20,996 Speaker 2: they spend some of it very stupidly on better grade gas. 670 00:37:21,276 --> 00:37:23,716 Speaker 1: So it'd be like I would buy the eighty-seven octane, 671 00:37:23,876 --> 00:37:26,516 Speaker 1: you know, the cheapo gas, the cheapest one on the board. 672 00:37:26,796 --> 00:37:29,836 Speaker 1: But then when the price of gas falls, instead of 673 00:37:29,836 --> 00:37:31,476 Speaker 1: saying like, oh, I can save that money and go 674 00:37:31,516 --> 00:37:34,276 Speaker 1: get an extra coffee or something like that, I say, today 675 00:37:34,316 --> 00:37:37,836 Speaker 1: I'm going to get the ninety-two octane, the premium, right? 676 00:37:37,756 --> 00:37:41,356 Speaker 2: Exactly, and that will do exactly no good. The car 677 00:37:41,436 --> 00:37:44,796 Speaker 2: won't appreciate it. Go buy a better bottle of olive oil. 678 00:37:44,996 --> 00:37:45,196 Speaker 3: You know.
679 00:37:46,036 --> 00:37:49,756 Speaker 2: So that's a good example of what I call mental accounting. 680 00:37:50,036 --> 00:37:52,276 Speaker 1: Yeah, and so what's mental accounting? Because I think it's 681 00:37:52,276 --> 00:37:53,876 Speaker 1: so intuitive once you explain it. 682 00:37:54,156 --> 00:37:56,876 Speaker 2: Mental accounting is sort of the way we keep track 683 00:37:56,916 --> 00:38:01,236 Speaker 2: of stuff. There's a great video you can find online 684 00:38:01,596 --> 00:38:07,956 Speaker 2: of Dustin Hoffman and Gene Hackman and they're having this discussion. 685 00:38:08,036 --> 00:38:12,636 Speaker 2: It's about back when they were starving actors, and Hackman 686 00:38:13,116 --> 00:38:17,556 Speaker 2: tells this story about going to, he calls him Dusty, 687 00:38:18,116 --> 00:38:23,196 Speaker 2: going to Dusty's apartment in Pasadena, and Hoffman says he 688 00:38:23,276 --> 00:38:25,916 Speaker 2: needs some money, can Hackman lend him some money? And 689 00:38:25,956 --> 00:38:29,236 Speaker 2: then Hackman goes into the kitchen and he says, hey, 690 00:38:29,276 --> 00:38:33,956 Speaker 2: you need money? There are these jars in your kitchen and 691 00:38:33,996 --> 00:38:36,636 Speaker 2: they've got money in them. Why do you need money? 692 00:38:37,076 --> 00:38:39,276 Speaker 2: And Hoffman says, yeah, but there's no money in the 693 00:38:39,316 --> 00:38:46,596 Speaker 2: food jar, right? So that is mental accounting. Now, I 694 00:38:46,636 --> 00:38:51,076 Speaker 2: should say budgeting per se is not stupid, you know, 695 00:38:51,156 --> 00:38:53,796 Speaker 2: and making sure that you have enough money to pay 696 00:38:53,796 --> 00:38:57,996 Speaker 2: the rent each month, that's smart, and in fact, creating 697 00:38:58,436 --> 00:39:03,156 Speaker 2: the equivalent of those jars can be helpful. But spending 698 00:39:03,276 --> 00:39:07,156 Speaker 2: the gas windfall on gas, that's stupid.
699 00:39:07,316 --> 00:39:09,916 Speaker 1: But we can harness these biases for good, too. 700 00:39:10,276 --> 00:39:12,956 Speaker 1: You mentioned before the example of the tool you were 701 00:39:13,036 --> 00:39:15,556 Speaker 1: using to get people to save more, and this is 702 00:39:15,596 --> 00:39:17,796 Speaker 1: another spot where you were able to use people's mental 703 00:39:17,796 --> 00:39:21,396 Speaker 1: accounts to help them do a little better, using people's raises, 704 00:39:21,476 --> 00:39:23,636 Speaker 1: for example, to get them to save a little bit more. 705 00:39:23,636 --> 00:39:25,156 Speaker 1: Explain what you did in that program. 706 00:39:25,836 --> 00:39:29,396 Speaker 2: Yeah, so I mentioned before that the first problem we 707 00:39:29,436 --> 00:39:32,116 Speaker 2: had to solve was getting people to sign up. Then 708 00:39:32,156 --> 00:39:35,356 Speaker 2: the second problem was getting them to put more money in. 709 00:39:36,076 --> 00:39:41,236 Speaker 2: And the trick there was, another one of my former students, 710 00:39:41,356 --> 00:39:44,596 Speaker 2: Shlomo Benartzi, and I created a program we called Save 711 00:39:44,876 --> 00:39:47,796 Speaker 2: More Tomorrow. And this goes back to what we were talking 712 00:39:47,836 --> 00:39:48,756 Speaker 2: about before, because it's. 713 00:39:48,596 --> 00:39:51,116 Speaker 1: Tomorrow. Tomorrow is far away. I'm happy to save. Laurie's going 714 00:39:51,116 --> 00:39:52,596 Speaker 1: to be a great saver tomorrow. 715 00:39:52,436 --> 00:39:55,196 Speaker 2: Right. So we would go to people and say, how 716 00:39:55,236 --> 00:39:59,796 Speaker 2: about if you increase your saving contribution when you get 717 00:39:59,836 --> 00:40:04,236 Speaker 2: your next raise? All right, so that's combining two of 718 00:40:04,276 --> 00:40:07,836 Speaker 2: the things.
So first of all, it's tomorrow, right, 719 00:40:08,236 --> 00:40:10,836 Speaker 2: you know, it's in January when I get the raise, 720 00:40:11,036 --> 00:40:14,556 Speaker 2: so sure, and I'm going to have more money, so 721 00:40:14,796 --> 00:40:18,436 Speaker 2: I'll take some of that new money and I'll save 722 00:40:18,556 --> 00:40:23,156 Speaker 2: that. And that program, at the first company we tried it at, 723 00:40:23,156 --> 00:40:28,036 Speaker 2: we tripled saving rates. Wow. That has created billions and 724 00:40:28,076 --> 00:40:31,596 Speaker 2: billions of dollars in savings around the world. 725 00:40:31,876 --> 00:40:33,516 Speaker 1: And this kind of fits with the ideas we've been 726 00:40:33,516 --> 00:40:35,516 Speaker 1: talking about for so many of these other anomalies, right, 727 00:40:35,516 --> 00:40:38,636 Speaker 1: which is that mental accounting in some contexts looks sort 728 00:40:38,636 --> 00:40:40,996 Speaker 1: of silly, when you're just kind of spending your gas 729 00:40:41,036 --> 00:40:43,996 Speaker 1: budget on higher priced gas just because you can. But 730 00:40:44,116 --> 00:40:46,196 Speaker 1: you can also use mental accounting to do things that 731 00:40:46,276 --> 00:40:47,156 Speaker 1: make you happier. 732 00:40:47,236 --> 00:40:47,396 Speaker 3: Right. 733 00:40:47,396 --> 00:40:49,356 Speaker 1: You can think of the account of, oh, I'm going 734 00:40:49,396 --> 00:40:51,916 Speaker 1: to get this future raise that's not allocated yet, 735 00:40:51,956 --> 00:40:54,076 Speaker 1: so I can put that into savings.
I feel like 736 00:40:54,116 --> 00:40:56,196 Speaker 1: I do this all the time when the so-called 737 00:40:56,276 --> 00:40:58,756 Speaker 1: pain of paying is kind of high. Like, I want 738 00:40:58,756 --> 00:41:01,516 Speaker 1: to do something, but it feels sort of luxurious, so if 739 00:41:01,516 --> 00:41:03,676 Speaker 1: I take some random windfall I get, say I get 740 00:41:03,676 --> 00:41:06,476 Speaker 1: an extra honorarium from a talk I wasn't expecting, I 741 00:41:06,556 --> 00:41:09,356 Speaker 1: put that towards the stuff that normally I'd feel a little 742 00:41:09,356 --> 00:41:11,876 Speaker 1: bit embarrassed about getting, and then that makes me feel 743 00:41:12,076 --> 00:41:14,156 Speaker 1: not so bad. That's like a happiness hack that I 744 00:41:14,236 --> 00:41:15,876 Speaker 1: use for mental accounting all the time. 745 00:41:16,196 --> 00:41:19,556 Speaker 2: I'll tell you a funny story about my daughter, my 746 00:41:19,676 --> 00:41:24,156 Speaker 2: daughter Maggie, who has been raised by a behavioral economist, right. 747 00:41:24,556 --> 00:41:27,516 Speaker 2: She lives in Rhode Island, and one of her neighbors 748 00:41:27,876 --> 00:41:31,276 Speaker 2: grew up to be a Major League Baseball pitcher playing for 749 00:41:31,316 --> 00:41:34,836 Speaker 2: the Mets. And I noticed the Mets were going to 750 00:41:34,876 --> 00:41:37,476 Speaker 2: have a game in which this kid was going to 751 00:41:37,476 --> 00:41:41,316 Speaker 2: be pitching, and so I called Maggie and said, hey, 752 00:41:41,716 --> 00:41:44,156 Speaker 2: would you want to go to that game? I'll treat 753 00:41:44,196 --> 00:41:47,476 Speaker 2: you to two tickets, and she said, okay, great. So 754 00:41:47,716 --> 00:41:49,956 Speaker 2: this game was like in another day and a half, 755 00:41:50,316 --> 00:41:52,716 Speaker 2: so we had to act fast. So I look online.
756 00:41:53,196 --> 00:41:56,516 Speaker 2: I send her a link and say, look, Mag, you 757 00:41:56,596 --> 00:41:59,436 Speaker 2: can get what looked like pretty nice tickets for three 758 00:41:59,516 --> 00:42:02,436 Speaker 2: hundred dollars each. Buy the ones you want, I'll send 759 00:42:02,436 --> 00:42:06,516 Speaker 2: you one thousand dollars, and go have fun. And she 760 00:42:06,996 --> 00:42:10,276 Speaker 2: texted me back and said, well, this is just like 761 00:42:10,356 --> 00:42:13,316 Speaker 2: in your book. If you send me one thousand dollars, 762 00:42:13,596 --> 00:42:15,356 Speaker 2: I'm not going to use it on going to a 763 00:42:15,396 --> 00:42:16,196 Speaker 2: baseball game. 764 00:42:18,996 --> 00:42:21,516 Speaker 1: All right, so now we get to my final irrationality 765 00:42:21,596 --> 00:42:24,156 Speaker 1: that I think matters for happiness, which is the problem 766 00:42:24,356 --> 00:42:26,516 Speaker 1: that we don't always know what we're going to like 767 00:42:26,996 --> 00:42:29,436 Speaker 1: in the future. Alex, in the book, you use this 768 00:42:29,516 --> 00:42:31,276 Speaker 1: so-called hungry shopper example. 769 00:42:31,356 --> 00:42:34,676 Speaker 3: What's that? Well, I think everybody's probably familiar 770 00:42:34,756 --> 00:42:37,436 Speaker 3: with this one. You shouldn't shop on an empty 771 00:42:37,476 --> 00:42:41,036 Speaker 3: stomach, because everything feels like it's going to be really good, 772 00:42:41,076 --> 00:42:43,356 Speaker 3: and you fill your grocery cart with all of these 773 00:42:43,396 --> 00:42:46,796 Speaker 3: delicious looking things. You go home, you eat dinner, and 774 00:42:46,796 --> 00:42:49,156 Speaker 3: then you're like, why did I buy seventeen different types 775 00:42:49,196 --> 00:42:52,116 Speaker 3: of potato chips?
And it's the idea that you can't 776 00:42:52,156 --> 00:42:55,516 Speaker 3: really accurately imagine what you're going to be like and 777 00:42:55,516 --> 00:42:58,356 Speaker 3: what you're going to want in the future. So if 778 00:42:58,396 --> 00:43:00,236 Speaker 3: you're hungry, you kind of think you're always going to 779 00:43:00,276 --> 00:43:03,116 Speaker 3: be hungry, always going to want these things. You're going 780 00:43:03,196 --> 00:43:05,876 Speaker 3: to fill your grocery cart with those things, and then 781 00:43:05,916 --> 00:43:08,516 Speaker 3: all of a sudden, you eat and your house is 782 00:43:08,556 --> 00:43:10,796 Speaker 3: filled with stuff you don't want, especially, you know, 783 00:43:10,876 --> 00:43:13,236 Speaker 3: stuff you really don't want, like chips and other sorts 784 00:43:13,276 --> 00:43:17,076 Speaker 3: of unhealthy things. It's this idea we started with, perspective taking, 785 00:43:17,396 --> 00:43:20,156 Speaker 3: only it's not perspective taking with respect to other people; it's 786 00:43:20,196 --> 00:43:22,876 Speaker 3: perspective taking with respect to what you're going to be like 787 00:43:22,956 --> 00:43:23,556 Speaker 2: in the future. 788 00:43:23,836 --> 00:43:25,716 Speaker 3: So people struggle with both. 789 00:43:26,516 --> 00:43:28,076 Speaker 1: In some ways, it makes sense that we struggle with 790 00:43:28,076 --> 00:43:30,596 Speaker 1: both when it comes to hunger, right, it's hard 791 00:43:30,636 --> 00:43:33,036 Speaker 1: to kind of imagine a different state than the one you're in. 792 00:43:33,356 --> 00:43:35,676 Speaker 1: But we see these kinds of effects more broadly than 793 00:43:35,756 --> 00:43:38,356 Speaker 1: just what's happening in the grocery store. Alex, tell me 794 00:43:38,396 --> 00:43:41,356 Speaker 1: about the effect of a person's weather in 795 00:43:41,396 --> 00:43:43,396 Speaker 1: the present day on their car purchases.
796 00:43:43,876 --> 00:43:46,476 Speaker 3: Yeah. So this is by our colleague Devin Pope and 797 00:43:46,516 --> 00:43:48,756 Speaker 3: his co-authors. They basically looked at data. This is 798 00:43:48,756 --> 00:43:51,636 Speaker 3: again going into this kind of revolution in behavioral economics. 799 00:43:51,636 --> 00:43:54,876 Speaker 3: We started out with these experiments with college students, and 800 00:43:54,916 --> 00:43:57,036 Speaker 3: now most of the results that we're seeing are with 801 00:43:57,356 --> 00:44:01,076 Speaker 3: real data, millions of car purchases. They take this database of 802 00:44:01,116 --> 00:44:04,116 Speaker 3: what sort of cars people are buying and when, and 803 00:44:04,156 --> 00:44:06,436 Speaker 3: they find that when it's sunny, people are a lot 804 00:44:06,476 --> 00:44:09,876 Speaker 3: more likely to buy a convertible. And the funny thing 805 00:44:09,956 --> 00:44:12,556 Speaker 3: is that, you know, if you're in Minnesota, you're 806 00:44:12,596 --> 00:44:15,676 Speaker 3: out on that one sunny day, June seventh. 807 00:44:15,836 --> 00:44:17,836 Speaker 1: Yeah, we all lived in Chicago. We know the one 808 00:44:17,876 --> 00:44:18,956 Speaker 1: sunny day in Chicago. 809 00:44:19,316 --> 00:44:22,756 Speaker 3: Yeah, and, you know, people are buying these convertibles and 810 00:44:23,396 --> 00:44:27,276 Speaker 3: they're going out, they're happily driving off the lot, and 811 00:44:27,276 --> 00:44:30,276 Speaker 3: then, you know, the rest of the year starts and 812 00:44:30,316 --> 00:44:33,116 Speaker 3: they're like, oh crap.
And the idea is that when 813 00:44:33,116 --> 00:44:36,756 Speaker 3: it's sunny, you kind of imagine yourself, wind flowing through 814 00:44:36,796 --> 00:44:39,876 Speaker 3: your hair, driving down the highway, and it's really hard 815 00:44:39,916 --> 00:44:42,796 Speaker 3: for you to imagine, wait, it's probably going to snow tomorrow 816 00:44:42,916 --> 00:44:45,356 Speaker 3: and I'm going to have to close it up 817 00:44:45,716 --> 00:44:47,316 Speaker 3: and it's just going to be kind of a windy, 818 00:44:47,516 --> 00:44:50,396 Speaker 3: cold car. And then similarly, you know, when it's a 819 00:44:50,436 --> 00:44:53,036 Speaker 3: cold day, people are really not likely to buy convertibles 820 00:44:53,076 --> 00:44:55,636 Speaker 3: because they can't imagine the sunny day where they're actually 821 00:44:55,676 --> 00:44:56,396 Speaker 3: going to enjoy it. 822 00:44:56,516 --> 00:45:01,236 Speaker 2: Danny Kahneman, my mentor, coined the phrase the focusing illusion: 823 00:45:02,116 --> 00:45:06,676 Speaker 2: nothing is as important as the thing you're thinking 824 00:45:06,716 --> 00:45:07,596 Speaker 2: about right now. 825 00:45:08,036 --> 00:45:10,396 Speaker 1: So this is a big problem, right, because we need 826 00:45:10,556 --> 00:45:13,556 Speaker 1: our predictions to make decisions about the kinds of things 827 00:45:13,556 --> 00:45:15,156 Speaker 1: that we're going to engage with in the future that 828 00:45:15,196 --> 00:45:17,196 Speaker 1: are going to make us happy, and it seems like 829 00:45:17,236 --> 00:45:20,516 Speaker 1: this focusing illusion really messes us up. Right, whatever our 830 00:45:20,516 --> 00:45:23,236 Speaker 1: attention is focused on, we start thinking about that and 831 00:45:23,276 --> 00:45:25,756 Speaker 1: we ignore all the other stuff.
So how can we 832 00:45:25,796 --> 00:45:29,036 Speaker 1: do better? Alex, any advice for how we can kind 833 00:45:29,036 --> 00:45:30,916 Speaker 1: of notice what we're going to like in the future 834 00:45:30,956 --> 00:45:33,316 Speaker 1: a little bit better and maybe open up our narrow attention? 835 00:45:34,196 --> 00:45:36,356 Speaker 3: Yeah, so I have some work on this myself. I 836 00:45:36,396 --> 00:45:40,156 Speaker 3: mean, the kind of easy advice is just to think 837 00:45:40,196 --> 00:45:43,796 Speaker 3: about it for longer. People become a lot better calibrated 838 00:45:43,876 --> 00:45:46,756 Speaker 3: just by giving them what we call a waiting period. 839 00:45:47,076 --> 00:45:50,236 Speaker 3: And these waiting periods are actually not something we invented. 840 00:45:50,316 --> 00:45:53,276 Speaker 3: As you probably know, waiting periods are all over the place. 841 00:45:53,716 --> 00:45:56,116 Speaker 3: Many states have them as part of gun laws. 842 00:45:56,516 --> 00:45:58,436 Speaker 3: The idea is that if I'm in a hot state, 843 00:45:58,476 --> 00:46:00,516 Speaker 3: I can't imagine what it's going to feel like when 844 00:46:00,556 --> 00:46:02,156 Speaker 3: I'm in a cold state. If I'm in a hot state, 845 00:46:02,196 --> 00:46:04,436 Speaker 3: I'm angry at somebody, I go out there and buy 846 00:46:04,436 --> 00:46:06,516 Speaker 3: a gun and do something really stupid that I'm going 847 00:46:06,556 --> 00:46:09,716 Speaker 3: to regret. So states impose a waiting period to kind 848 00:46:09,716 --> 00:46:12,516 Speaker 3: of simulate, like, let me think about it for a 849 00:46:12,556 --> 00:46:14,756 Speaker 3: little longer, and then maybe I won't need it. It's 850 00:46:14,796 --> 00:46:18,916 Speaker 3: there for marriages, too, right. Somebody meets somebody at a 851 00:46:18,956 --> 00:46:20,516 Speaker 3: bar and is like, you're the love of my life.
852 00:46:20,556 --> 00:46:24,156 Speaker 3: Let's go, and it's like, wait a second, think about 853 00:46:24,156 --> 00:46:26,196 Speaker 3: it for a little bit. And this same sort of 854 00:46:26,396 --> 00:46:29,956 Speaker 3: very simple intervention, we found this in careful experimental tests, 855 00:46:29,996 --> 00:46:32,396 Speaker 3: allows you to kind of simulate what you're going to 856 00:46:32,476 --> 00:46:34,276 Speaker 3: feel like in the future a lot more accurately. 857 00:46:34,436 --> 00:46:35,796 Speaker 1: And so it seems like there are two pieces of 858 00:46:35,796 --> 00:46:38,356 Speaker 1: advice there when you're making a big life decision, right. 859 00:46:38,436 --> 00:46:41,156 Speaker 1: One is give your attention a moment to catch up 860 00:46:41,196 --> 00:46:43,756 Speaker 1: with all the stuff it's not currently paying attention to. 861 00:46:44,236 --> 00:46:46,236 Speaker 1: And the other piece of advice is, just like when 862 00:46:46,316 --> 00:46:48,836 Speaker 1: in doubt, give it a little time, because over time 863 00:46:48,956 --> 00:46:51,196 Speaker 1: more stuff will reveal itself. And whatever state you're in now, 864 00:46:51,196 --> 00:46:53,916 Speaker 1: if that's hungry, or sitting out in the sunshine thinking you 865 00:46:53,916 --> 00:46:56,556 Speaker 1: want a convertible, you might revert to a different state, 866 00:46:56,596 --> 00:46:58,796 Speaker 1: which might be more of a baseline for how you're 867 00:46:58,796 --> 00:46:59,836 Speaker 1: going to feel in the future. 868 00:47:00,476 --> 00:47:03,876 Speaker 2: And there's one other extension of that. When I'm 869 00:47:03,916 --> 00:47:09,036 Speaker 2: talking to students about career choices, the mistake I find 870 00:47:09,196 --> 00:47:12,996 Speaker 2: people make all the time is they decide on a career 871 00:47:13,116 --> 00:47:16,356 Speaker 2: based on what they liked to study in school.
And 872 00:47:16,796 --> 00:47:20,116 Speaker 2: I always say, no, think about what you can imagine 873 00:47:20,156 --> 00:47:23,796 Speaker 2: doing every day for the rest of your life, and 874 00:47:24,516 --> 00:47:28,236 Speaker 2: that may not be the same thing. Go, you know, 875 00:47:28,596 --> 00:47:32,436 Speaker 2: shadow somebody for two weeks. See if that seems 876 00:47:32,796 --> 00:47:35,756 Speaker 2: exciting. I think there are lots of careers that 877 00:47:36,556 --> 00:47:41,116 Speaker 2: sound good until you think about doing it every day. 878 00:47:41,996 --> 00:47:44,036 Speaker 1: Well, Richard, I'm so glad that what you decided to 879 00:47:44,036 --> 00:47:45,676 Speaker 1: do every day for the rest of your life was 880 00:47:45,716 --> 00:47:47,876 Speaker 1: to be a behavioral economist, because it's been so fun 881 00:47:47,916 --> 00:47:50,676 Speaker 1: to learn about these anomalies from you through the years. 882 00:47:50,716 --> 00:47:52,316 Speaker 1: It was fun to read this book back in the 883 00:47:52,356 --> 00:47:54,196 Speaker 1: late nineties, and it was even more fun to read 884 00:47:54,236 --> 00:47:56,476 Speaker 1: it anew in twenty twenty five, when we have all 885 00:47:56,476 --> 00:47:59,236 Speaker 1: of Alex's new insights there. So everyone should go out 886 00:47:59,236 --> 00:48:01,196 Speaker 1: and check out The Winner's Curse. There are so many cool 887 00:48:01,196 --> 00:48:03,396 Speaker 1: anomalies that we didn't get a chance to go into. 888 00:48:03,556 --> 00:48:05,356 Speaker 1: Richard and Alex, thanks so much for being on the show. 889 00:48:05,676 --> 00:48:06,996 Speaker 2: Thanks, Laurie, great to see you. 890 00:48:08,396 --> 00:48:12,076 Speaker 1: So it seems humans are not Vulcans.
We don't always 891 00:48:12,076 --> 00:48:16,516 Speaker 1: act rationally, but as these six anomalies show, understanding the 892 00:48:16,516 --> 00:48:19,876 Speaker 1: irrational quirks of our species can help us live richer, 893 00:48:19,956 --> 00:48:23,116 Speaker 1: happier lives in ways that I think pure logic may 894 00:48:23,116 --> 00:48:26,476 Speaker 1: not have predicted. The new and improved version of The 895 00:48:26,516 --> 00:48:29,356 Speaker 1: Winner's Curse is out in October, and it's available for 896 00:48:29,436 --> 00:48:33,076 Speaker 1: pre-order now. Next time on The Happiness Lab, we'll 897 00:48:33,076 --> 00:48:35,836 Speaker 1: hear about another of my favorite books of twenty twenty five, 898 00:48:36,396 --> 00:48:39,676 Speaker 1: one that's also from a scholar trained in economics and who, 899 00:48:39,956 --> 00:48:43,796 Speaker 1: like Richard Thaler, also wants to break the rules. But 900 00:48:43,956 --> 00:48:46,756 Speaker 1: her rule breaking involves teaching us why we should all 901 00:48:46,836 --> 00:48:49,756 Speaker 1: be working a bit less. All that next time on 902 00:48:49,836 --> 00:48:55,996 Speaker 1: the Happiness Lab with me, Doctor Laurie Santos.