Pushkin. Angela, you don't know what a crossover is? Did you google it? Well, I did have to google it. The Simpsons did a crossover with 24. That's amazing. I think I get it, though. It's like the Doritos Taco Bell mashup thing. The most horrifying new one of these is Heinz just did this crazy new thing where they're doing all these weird mayo ketchups and stuff. But they did Oreo mayo ketchup. Which sounds terrible. But this will not be terrible, because today we are doing a cognitive science mashup of the two best cognitive science podcasts out there, let's be real: Laurie Santos from The Happiness Lab, along with Angela Duckworth from No Stupid Questions. And we are using the No Stupid Questions format that my usual partner, Stephen Dubner, and I like so much, which is that one of us asks a question of the other and then we just have a rambling conversation. I'm in for the rambling conversation. That's usually what we have together when we meet up. Angela, let's do it.
Thank the Lord that you're here. I'm Angela Duckworth, and this is Laurie Santos, and you're listening to No Stupid Questions meets The Happiness Lab. Today on the show: Why do we mirror the accents and mannerisms of the people we meet? Monkey A observes monkey B do something different with a banana. I don't know if that's stereotyping. They do like bananas. Totally true. Also, why can happiness seem so elusive even when things are going well? Holy schmoley, this is the best life I could possibly be living. Why am I not at ten out of ten on happiness? I think you have the first question for me, correct? I do. And this comes from a listener named Sabika Shavan, who hails from Qatar and is a graduate student there. The question is as follows: I have a question about mimicking the mannerisms of people we meet, especially in our multicultural environments. Often in conversations with people with very marked accents different from my own, I find my own accent taking on nuances of theirs, or interjecting typical expressions from their language culture. Is this a typically observed behavior?
And on the flip side, are there many whose accents and mannerisms never change regardless of who they speak to, and is there a reason why some do and some don't? Laurie, I love this question. I am so vibing with it, Sabika. That is me. I spent two years in Oxford, and I ended up speaking, almost involuntarily, with this faux posh British accent. And I am from southern New Jersey, so I should not be doing that. Does this happen to you too? Oh my gosh. I moved to the UK when I was in graduate school, and I too took on not just a British accent, but the most painful British mannerisms. Like what? Like, if you told me something shocking, like, "Oh my gosh, Laurie, I just went to the pub and I had like eight pints," I would respond to that with the phrase "You what?" And I brought that back to the US for like a year, and my friends were like, stop. It'd be one thing if you just talked with a fake British accent, which would be bad enough, but the fact that you're "you what"-ing all the time is just terrible. Wait, you knew this and you still did it, right?
It wasn't an affectation that you were doing in an ironic Brooklyn way. Yeah, and I think you know this about me, Angela. I grew up in New Bedford, Massachusetts, and had what, to the uninitiated, would sound like a terrible Boston accent. If you were from Boston, you'd know it was a New Beefit accent, like "pahk ya cah in Hahvid Yahd." Basically, I couldn't say "R," to the point that I actually lost a research assistant position my freshman year in college. So I started out my freshman year working in Stephen Kosslyn's lab with Kevin Ochsner, who's a professor at Columbia. He was doing some study where he just needed a female voice to say letters, and so I started recording these letters: A, B, C. And then I got to Q, "aw," and he was like, well, you can't say "aw," you have to say "R." I was physically incapable of doing that except sounding like a pirate. I was like "arrr," and he was like, no, it's just "R." Anyway. So now you notice that in my perfect podcast speak, I can say "R."
But the reason I think that happened was that my freshman year I ended up being paired with a roommate from New Orleans, who also had an incredibly thick accent, this time Southern. Somehow, again perfectly unconsciously, we took on each other's vocal cadences, to the point that by our senior year (this is back when your college room would have a single phone that someone would call, and you wouldn't know who had picked up except by their voice), people couldn't tell me and my roommate Catherine apart, because our voices had converged so much. You took her R's. Did she lose hers? She stopped saying y'all as much. In fact, we both picked up another expression from our Pittsburgh roommate, which was "yinz." I like that, because "you guys" is apparently an offensive, gender-presumptuous, possibly hostile-sounding appellation, or so my own students tell me. I think yinz and y'all are better. But the beauty is we just do this naturally. In fact, this is an evolved part of human cognition. Researchers call this behavioral contagion.
This is the kind of thing that you see in animals, you know, classic cases. If you watch fish, they tend to school around, and it looks like they're all kind of copying each other's behavior. There's some lovely work by this guy Iain Couzin, who does all these detailed mathematical network analyses of how fish school around. But the upshot is they're just soaking up each other's behavior quite naturally. How do you know that they are really mimicking each other, as opposed to all responding to the same little piece of floating kelp? He does these incredibly detailed mathy things that I'm not going to be able to pull up on the fly. He can actually do some predictive coding based on one fish's behavior about what's going to happen with the schooling, and so it really does seem to be behavioral contagion. But you don't need to look to fish; this is something that we do quite naturally. One of the most famous examples of this came from my colleague here at Yale, John Bargh, and his colleague Tanya Chartrand.
They found this effect that they called the chameleon effect, well labeled because it covers cases where people just chameleon-like copy other people's behavior. What's an example? They had subjects get interviewed by a scary experimenter, this high-status person you think you're being interviewed by. And what the experimenter did, unbeknownst to the subjects, was just occasionally take on strange movements: she would touch her face, or put her arm in a particular way, or cross her legs. And what you find is that as they're being interviewed, subjects unconsciously copy all these behaviors. So as the experimenter is touching her face, they touch their face more; as she's folding her legs, they fold their legs more, again totally unconsciously. And their later work shows that this happens more in the high-status direction, so you're more likely to unconsciously copy high-status people, maybe just because you're watching them more, honestly. This is the classic work of Al Bandura too, right, where the little children watch an adult take a toy like a Bobo doll and beat it with a hammer.
Then, when entering a room with a Bobo doll just like the one they saw, they will walk over and start beating it with a hammer, whereas, Bandura points out, they don't do that in a control condition where they have not seen an adult model this. So I guess the question I'm asking is: is the phenomenon that you're describing different from modeling, or basically the same? It's probably the mechanism that leads to this kind of stuff. I mean, you know this well: in cognitive science, we often don't know the basic mechanisms that lead to other stuff down the line. And there are lots of hypotheses that things like behavioral contagion lead to lots of nasty stuff. In fact, Bandura's is about people beating up a Bobo doll. But there's evidence from people like Francesca Gino and Dan Ariely that behavioral contagion can lead to some truly unethical behavior in some contexts: lying, cheating, stealing, and worse. Exactly. The Dan Ariely and Francesca Gino cheating study is fantastic. So they bring these subjects, college students, into the lab and give them a super, super hard math test. It's basically impossible.
And so if you're a subject, you're experiencing it like, oh my god, this is terrible, I'm never going to finish this. And then you watch one subject raise their hand immediately, like two seconds into the experiment, and say, yeah, I'm done, what can I do? And the experimenter is like, if you're done, well, you can just shred your answers so no one sees them, and we'll pay you. And so if you're the subject, you think, wait a minute, they're not even going to check that I did them; they're just going to shred it. And what Ariely and colleagues find is you're more likely to cheat on these problems if you see somebody else cheating. But the neat thing is that it's not just if you see somebody else cheating; it actually has to be somebody who relates to you. And so they manipulated this in a cute way. They're doing this stuff at Duke University, so the Duke students are there in this study, and the person who raises their hand and cheats is in a Duke University sweatshirt. Now, all of a sudden, cheating goes up a lot.
But if the person's in a UNC sweatshirt (they're like the losers from the other school), now all of a sudden people are like, oh my god, I'm not going to cheat, and it actually reinforces moral behavior. And so this is another thing we know about behavioral contagion, which is kind of weird: we're more likely to contagiously pick up on the behaviors of people who we see as our in-group members, who we see as high status, who we pay attention to. So I wanted to ask you what you thought about mirror neurons. I, as more or less an outsider to this literature, have only read articles about these specialized neurons that, if I see you doing a particular action, are lighting up in my brain as if I were doing the same action. So, A, is that an accurate description of mirror neurons? And B, what's up with mirror neurons? Laurie, I'm kind of not a fan of mirror neurons, I'll be totally honest. There's a lot of hype around mirror neurons, but what they actually do might not be as cool as we sometimes think.
Basically, these were discovered in monkeys in a very famous set of experiments back in the early nineties, where monkeys were watching humans engaging in these actions, and areas of the monkey motor cortex, the spot that would fire if they were grabbing for something, tended to fire when they were watching these humans grabbing for something. So it seems super cool: oh my gosh, the same neuron in me that fires when I reach fires when I see you reach. Maybe these neurons are the code for empathy; maybe these neurons are the seat of our perspective taking, all that stuff. But what we know about them is they mostly exist in motor cortex, so it's for very specific motoric movements like grabbing and reaching. There are a couple of mirror neurons in other spots; there are some that might be in attention regions, so for eye gaze turning and stuff like that, but it's not as rich as you'd think.
And I think there is some argument that human beings are unique in their ability to learn through observation, whereas a dog or even a chimpanzee can't do it, or at least can't do it as well. Which is kind of right, because the mirror neurons were mostly found in monkeys. In terms of learning by observation, animals do do that, but what they don't learn by is imitation: like, I see you behave in this very specific sequence, and I copy all of those very mechanical behaviors perfectly. That's literally what monkeys don't do. Wait, what do monkeys do? So monkey A observes monkey B do something different with a banana. I don't know if that's stereotyping. They do like bananas. Totally true. Okay, good, I'm glad that holds up. Anyway, what does happen? If it's not imitation, what is it? Here's one study that looked at this. This is not with monkeys but with chimpanzees. For our listeners out there, a pet peeve of people who work with primates: monkeys are not chimpanzees. Chimpanzees actually eat monkeys, so totally different. It's like saying a human is a tuna fish sandwich or something.
So with chimpanzees, they have this task where there's a bunch of food outside of some enclosure, and the chimp has to use a tool to get it. They give the chimp a rake, basically, where you could try to use the rake with the tines down, like we'd normally rake leaves, but if they're tiny pieces of food, that works sort of, but not super well, because the food goes through the tines. Whereas if you flip the rake over and use the part of the rake that's flat, you can scoop the food up more effectively. So they show kids this behavior, and what you find is the kid will copy whatever the human does, whereas if you do the same thing with chimpanzees, they don't necessarily fully copy what the human does. They realize, like, oh, I can use a rake to try to get the food, and then they trial-and-error it. So they're kind of copying the fact that you're using this tool and that you can do it, but what they're not copying is the exact actions that go with it. It sounds smarter, doesn't it? Doesn't it sound a little bit more evolved, as it were? It is smarter. Yeah.
In fact, there's a wonderful bias that is perhaps a uniquely human bias (we have some evidence that you don't see it in primates or in dogs), which is called overimitation, this idea that we imitate too much. If you see somebody doing something that's inefficient, or, in the case of this Ariely study we talked about, bad or immoral, you inadvertently copy it. Anyway, have you read the studies of Cristine Legare, this developmental psychology work on children imitating others? Yeah, and she finds, with overimitation, part of it seems to be automatic, but part of it seems to be because this kind of behavioral contagion is our way of showing, hey, I'm in the group like you. And this gets to another way that you could think about switching accents in particular: this idea of code switching. So code switching is, if you're a member of a minority group and you're in a majority-group situation, you sort of switch your behavior around to match what the majority group is doing, which is sometimes considered not a great thing, but arguably, as you're pointing out, is adaptive. Totally.
If I look back at my own accent switching, my New Beefit accent wasn't necessarily going to work super well in Ivy League classes. That wasn't the way these high-status, higher-class people talked. It's no secret that my accent switched more towards an Ivy League vernacular English. Right. So we were both in England, which was a higher-status accent, one could argue, certainly than my native South Jersey, so I start speaking a faux British accent. But if a British scholar, for whatever reason, came to southern New Jersey and had to spend a summer down the shore, they would not adopt the local vernacular, right? That would be the prediction, because of the status. Status is part of it, for sure, but I think it's also just functionally getting the inside scoop and seeming like you belong, like you're an insider at that place. So my prediction is the Brit might do it less in southern New Jersey than this southern New Jerseyan would do it in the UK, but they would to a certain extent.
And this is the reason that, again, I have sadly lost my New Bedford accent, until I go back to New Bedford for a couple of days, and then all of a sudden I sound like I've been there my whole life. I want to hear a New Bedford Happiness Lab. I think you should do it full-on, and maybe you could record it there. Another time it happens (maybe you got this too when you were in the UK) is when I'm drunk; those more automatic accents come back. It's weird. I haven't been drunk since I was eighteen, but that is probably a different question, so I'll have to actually get data on that and come back to you. There's an experiment we could do, Angela. Okay, so now I get to ask a question, right? Which is so cool. We don't normally do this on The Happiness Lab. You need another person to be hanging out with. That's true. You're welcome anytime on The Happiness Lab, Angela. Thank you. But here's question number two. Amelia asks: why is it that so many people are restless or unsatisfied even in terrific or satisfying circumstances?
Her context is that she's in a really lovely place with lots of supportive and happy colleagues. She's exceeded all these expectations she had for herself professionally, but she still finds herself looking around at other opportunities, kind of feeling unhappy, thinking she should switch. And then she goes on to ask: why can't I just wait it out? Why the need for change? Maybe at some level people don't want to be happy? What is the deal, scientifically? This is such a great question. I think it is timely because, as we both were very sad to learn, Ed Diener, the scientist who, arguably more than anyone, put the scientific study of happiness on the map, passed away very recently. So I feel like this question is also a way for us to honor the great Ed Diener, who was amazing and whose stuff we talk about all the time on The Happiness Lab. So I will begin by saying that I had this experience myself. I remember, when I was I think eighteen years old, writing to Dear Abby, saying how unhappy I was. There were extenuating circumstances; mostly, I was an adolescent.
So that's partly your job as an adolescent, to be unhappy with your circumstances. Wait, wait, wait, wait, time out. You actually wrote to Dear Abby? Legit wrote to Dear Abby? Yes. This was the nineteen eighties, so I wrote a letter, put it in an envelope, licked it, sealed it, put a stamp on it, and mailed it away. And what my letter said was, more or less, that I felt like I had a perfect life. I had just gotten into college; I knew I was going to Harvard, so I was getting lots of praise from my Asian family. And my boyfriend at the time was somebody I thought I would maybe spend the rest of my life with. And I had a wonderful best friend, and all these great things were happening to me. And there was this contrast between the objective awesomeness of my life and my unhappiness. I was just like, I'm not happy, Abby. What's wrong with me? What did Abby say? She said to go see a therapist. Go see a therapist? Four excellent words of advice. But I was very disappointed at the time.
And you know what, Laurie, I wish I 339 00:18:04,636 --> 00:18:06,916 Speaker 1: had listened to her, because it took me a couple 340 00:18:07,036 --> 00:18:11,156 Speaker 1: more decades to realize that if I'm in the state of 341 00:18:11,236 --> 00:18:13,316 Speaker 1: mind where I'm so unhappy I need to write a 342 00:18:13,356 --> 00:18:16,796 Speaker 1: letter to a stranger to ask them for help, then 343 00:18:16,876 --> 00:18:19,116 Speaker 1: really what I should do is go see a therapist. 344 00:18:19,196 --> 00:18:22,316 Speaker 1: But there are times where you just look around at 345 00:18:22,356 --> 00:18:26,316 Speaker 1: your life and in any objective sense you realize, holy shmoley, 346 00:18:26,716 --> 00:18:28,716 Speaker 1: this is the best life I could possibly be living, 347 00:18:28,836 --> 00:18:32,396 Speaker 1: and you have this gnawing sense of dissatisfaction, like, why 348 00:18:32,436 --> 00:18:35,276 Speaker 1: am I not a ten out of ten on happiness? Now, 349 00:18:35,316 --> 00:18:38,996 Speaker 1: I want to start our scientific discussion of this with 350 00:18:39,036 --> 00:18:42,756 Speaker 1: this very famous idea of the hedonic treadmill, which I 351 00:18:42,836 --> 00:18:46,276 Speaker 1: know you've probably already discussed on your podcast. Am I 352 00:18:46,276 --> 00:18:48,956 Speaker 1: right on that? Yeah, we had a fantastic guest on 353 00:18:48,996 --> 00:18:52,516 Speaker 1: to talk about the hedonic treadmill, Clay Cockrell. He's a 354 00:18:52,556 --> 00:18:56,236 Speaker 1: wealth psychologist, so he's a mental health professional for the 355 00:18:56,516 --> 00:18:59,676 Speaker 1: point zero zero one percent. First off, that's telling, right, 356 00:18:59,756 --> 00:19:01,876 Speaker 1: that we have to have mental health professionals for the 357 00:19:01,916 --> 00:19:03,916 Speaker 1: point zero zero one percent.
You'd think these people would 358 00:19:03,956 --> 00:19:05,956 Speaker 1: be so happy that they'd be like, no, 359 00:19:06,036 --> 00:19:09,436 Speaker 1: I'm good. Yeah, they need him, and the problems he 360 00:19:09,476 --> 00:19:11,716 Speaker 1: sees in his patients are just, I mean, if you're 361 00:19:11,836 --> 00:19:13,516 Speaker 1: not in the point zero zero one percent, you kind 362 00:19:13,516 --> 00:19:15,636 Speaker 1: of get a little bit of schadenfreude, because they're things like 363 00:19:15,876 --> 00:19:18,236 Speaker 1: I don't know where to park my yacht, and you're like, well, dude, 364 00:19:18,276 --> 00:19:20,276 Speaker 1: maybe don't have a yacht. But the point is, 365 00:19:20,756 --> 00:19:26,476 Speaker 1: these ostensibly objectively terrific circumstances don't always feel terrific, and 366 00:19:26,516 --> 00:19:28,836 Speaker 1: that is the hedonic treadmill. We kind of just get 367 00:19:28,916 --> 00:19:31,556 Speaker 1: used to stuff. So if you have something objectively awesome happen, 368 00:19:31,876 --> 00:19:33,996 Speaker 1: you notice and you feel that it is good, and 369 00:19:34,036 --> 00:19:36,956 Speaker 1: it affects your happiness for a short while, but then 370 00:19:37,196 --> 00:19:38,996 Speaker 1: you kind of just get used to it. And that's 371 00:19:39,036 --> 00:19:40,916 Speaker 1: the idea of the treadmill: you keep running and running, 372 00:19:40,916 --> 00:19:42,316 Speaker 1: but you stay in the same place. But I think we 373 00:19:42,356 --> 00:19:45,436 Speaker 1: would both agree that there is, to some extent, a 374 00:19:45,596 --> 00:19:49,956 Speaker 1: phenomenon by which, through things that we do either intentionally 375 00:19:50,236 --> 00:19:55,076 Speaker 1: or unconsciously, we do come back from either extreme, like 376 00:19:55,156 --> 00:19:57,596 Speaker 1: too happy or the opposite.
The flip side is that 377 00:19:57,916 --> 00:20:01,196 Speaker 1: we also get used to circumstances that are pretty awful. 378 00:20:01,436 --> 00:20:04,356 Speaker 1: They don't continue to affect our psychology as badly as 379 00:20:04,396 --> 00:20:07,116 Speaker 1: when they first happen. So you break up or you 380 00:20:07,156 --> 00:20:10,796 Speaker 1: lose a job, those things hurt for a while and they 381 00:20:10,796 --> 00:20:13,516 Speaker 1: feel like they're going to suck forever. I think that's 382 00:20:13,556 --> 00:20:16,236 Speaker 1: one of the fascinating things about emotions. When you're in 383 00:20:16,236 --> 00:20:19,756 Speaker 1: the middle of one, like when you're anxious or lonely 384 00:20:20,036 --> 00:20:23,556 Speaker 1: or extremely sad, you can't really see around the corner, 385 00:20:23,676 --> 00:20:26,596 Speaker 1: even if intellectually you realize like, oh yeah, I've been 386 00:20:26,636 --> 00:20:29,116 Speaker 1: in this kind of place before and I've seen things 387 00:20:29,116 --> 00:20:32,996 Speaker 1: get better. But it doesn't feel that way in any visceral sense. 388 00:20:33,116 --> 00:20:34,916 Speaker 1: But though we would agree with that, I think one 389 00:20:34,916 --> 00:20:38,716 Speaker 1: of the nuances here that is important to underscore is 390 00:20:38,756 --> 00:20:42,676 Speaker 1: that the returning to the set point isn't always exactly 391 00:20:42,716 --> 00:20:46,916 Speaker 1: to where you were before. So the famous nineteen seventy 392 00:20:46,916 --> 00:20:51,676 Speaker 1: eight study of accident victims who became paraplegics, it's often 393 00:20:51,996 --> 00:20:55,356 Speaker 1: described as: follow them long enough and you see that they 394 00:20:55,396 --> 00:20:58,676 Speaker 1: come back to where they were before their accident. But, 395 00:20:59,436 --> 00:21:03,996 Speaker 1: sadly, not quite.
Yes, they adapt hedonically, but not 396 00:21:04,116 --> 00:21:07,036 Speaker 1: all the way back on average to where they were before. Yeah, 397 00:21:07,036 --> 00:21:10,876 Speaker 1: and there's a few cases like that where adaptation isn't perfect. 398 00:21:10,956 --> 00:21:13,916 Speaker 1: I think another one is in the context of unemployment. 399 00:21:14,116 --> 00:21:17,276 Speaker 1: That's another case where you go down a little bit. Actually, 400 00:21:17,356 --> 00:21:20,156 Speaker 1: one that's the opposite is divorce. You have a hit 401 00:21:20,196 --> 00:21:22,196 Speaker 1: to your happiness when you first get divorced, but you 402 00:21:22,196 --> 00:21:26,076 Speaker 1: actually pop up past baseline. This is the thing about 403 00:21:26,156 --> 00:21:28,756 Speaker 1: these happiness set points. They're not perfect. Sometimes you go 404 00:21:28,796 --> 00:21:30,836 Speaker 1: a teensy bit down or a teensy bit up in 405 00:21:30,836 --> 00:21:34,156 Speaker 1: the good cases. But the point is that it moves. It's 406 00:21:34,196 --> 00:21:36,196 Speaker 1: not like this person's going to break up with me 407 00:21:36,236 --> 00:21:39,396 Speaker 1: and I'm stuck there forever. It has to move. Let's 408 00:21:39,436 --> 00:21:42,396 Speaker 1: talk about coming down from the highs. People do, at 409 00:21:42,436 --> 00:21:45,356 Speaker 1: least a lot of us, walk around basically shooting for 410 00:21:45,356 --> 00:21:47,676 Speaker 1: the ten out of ten. Like why can't I be 411 00:21:47,716 --> 00:21:50,756 Speaker 1: a ten out of ten every day? Is there something 412 00:21:50,876 --> 00:21:54,756 Speaker 1: we can say about the adaptiveness of not living life 413 00:21:54,796 --> 00:21:58,076 Speaker 1: at the extreme end of like, everything is great? When 414 00:21:58,116 --> 00:22:00,356 Speaker 1: you think about the extreme end, this reminds me not 415 00:22:00,476 --> 00:22:03,836 Speaker 1: of a scientific tip but a philosophical one.
Aristotle thought 416 00:22:03,836 --> 00:22:06,076 Speaker 1: that virtue was living in the middle. You know, if 417 00:22:06,076 --> 00:22:07,996 Speaker 1: you're shooting to be brave, you don't want to be 418 00:22:08,036 --> 00:22:10,356 Speaker 1: like the bravest dude, such that you're reckless, but 419 00:22:10,396 --> 00:22:13,276 Speaker 1: you also don't want to be cowardly. And so there's 420 00:22:13,276 --> 00:22:15,876 Speaker 1: something to be said for this with happiness too. Happiness 421 00:22:15,996 --> 00:22:18,236 Speaker 1: is going to be elusive if you're constantly analyzing, do 422 00:22:18,276 --> 00:22:19,476 Speaker 1: I have it yet? Do I have it yet? Do I 423 00:22:19,516 --> 00:22:21,076 Speaker 1: have it yet? We really want to get to a 424 00:22:21,116 --> 00:22:24,036 Speaker 1: point where we're feeling grateful, noticing the good stuff in 425 00:22:24,076 --> 00:22:26,516 Speaker 1: our lives, doing everything we can to savor what we have. 426 00:22:26,916 --> 00:22:30,196 Speaker 1: But pushing, pushing, pushing might not necessarily be the best 427 00:22:30,196 --> 00:22:33,316 Speaker 1: thing for your happiness anyway. I have long pondered this 428 00:22:33,396 --> 00:22:37,556 Speaker 1: Aristotle golden mean idea, in part because with the things I study, 429 00:22:37,756 --> 00:22:41,076 Speaker 1: like self control or grit, people always ask, can you 430 00:22:41,116 --> 00:22:44,476 Speaker 1: be too self controlled? What if you're too gritty? What's 431 00:22:44,516 --> 00:22:47,756 Speaker 1: the dark side of excessive grit? And when I think 432 00:22:47,796 --> 00:22:50,956 Speaker 1: about what Aristotle is saying, I'm a little confused.
What is 433 00:22:50,996 --> 00:22:54,796 Speaker 1: the deeper reason why something in between the extremes is, 434 00:22:55,036 --> 00:22:58,556 Speaker 1: as a rule, better, not just in the case of 435 00:22:58,796 --> 00:23:02,396 Speaker 1: bravery and cowardice, but as a general truth about human nature? 436 00:23:02,436 --> 00:23:06,876 Speaker 1: And I wonder whether there is some cost to being 437 00:23:06,916 --> 00:23:08,636 Speaker 1: at the ten out of ten which makes us not 438 00:23:08,676 --> 00:23:10,476 Speaker 1: want to be there all the time. Well, I think 439 00:23:10,556 --> 00:23:13,116 Speaker 1: one is if you were always at a ten out 440 00:23:13,116 --> 00:23:16,036 Speaker 1: of ten and you never changed, you wouldn't notice any change. 441 00:23:16,156 --> 00:23:18,596 Speaker 1: And I think this actually gets to Amelia's question. She's 442 00:23:18,596 --> 00:23:21,316 Speaker 1: asking why the need for change. The need for change 443 00:23:21,316 --> 00:23:24,116 Speaker 1: is that we don't notice our absolute objective status. We 444 00:23:24,196 --> 00:23:26,596 Speaker 1: only notice when we change from it. People who live 445 00:23:26,636 --> 00:23:29,356 Speaker 1: in southern California don't appreciate the weather because it's just 446 00:23:29,476 --> 00:23:32,596 Speaker 1: perfect all the time. But when you live in the Northeast, 447 00:23:32,876 --> 00:23:34,996 Speaker 1: you get enough sucky days that all of a sudden, 448 00:23:35,036 --> 00:23:36,636 Speaker 1: when it's sunny out, you're like, oh my gosh, it's 449 00:23:36,636 --> 00:23:39,476 Speaker 1: sunny and eighty! Thank you, you know, whatever divinity you're 450 00:23:39,516 --> 00:23:41,356 Speaker 1: praying to. The other thing is, I think you're totally 451 00:23:41,436 --> 00:23:45,596 Speaker 1: right on the cost.
Sometimes, if you're pushing happiness too much, 452 00:23:45,676 --> 00:23:47,676 Speaker 1: that can be costly, and I think we see that 453 00:23:47,756 --> 00:23:51,076 Speaker 1: in the context of clinical disorders like mania. Those people 454 00:23:51,116 --> 00:23:53,596 Speaker 1: would report I'm ten out of ten on a happiness scale, 455 00:23:53,596 --> 00:23:56,156 Speaker 1: but they're gambling and wrecking their car and hurting their 456 00:23:56,156 --> 00:23:58,196 Speaker 1: family and things like that. And so I think 457 00:23:58,356 --> 00:24:00,796 Speaker 1: Aristotle might not have been perfect with the middle road, 458 00:24:00,956 --> 00:24:02,916 Speaker 1: but he was onto something. And the thing I think he 459 00:24:02,956 --> 00:24:05,196 Speaker 1: was onto that's most relevant to Amelia gets back to 460 00:24:05,236 --> 00:24:08,196 Speaker 1: this idea of the power of change. When things are just 461 00:24:08,276 --> 00:24:12,756 Speaker 1: consistently good, we kind of don't notice it. The consequence of 462 00:24:12,796 --> 00:24:16,116 Speaker 1: that is what Danny Kahneman and Amos Tversky referred to 463 00:24:16,116 --> 00:24:20,436 Speaker 1: as diminishing sensitivity. We can get small changes that objectively 464 00:24:20,596 --> 00:24:23,876 Speaker 1: are good, but we just don't notice them, which is 465 00:24:23,916 --> 00:24:26,396 Speaker 1: sort of sad. The example I give my students is 466 00:24:26,436 --> 00:24:27,996 Speaker 1: I try to be hip, right, like I try to 467 00:24:28,076 --> 00:24:30,876 Speaker 1: know what a crossover is and know all the new songs. 468 00:24:31,036 --> 00:24:33,796 Speaker 1: And there's this DJ Khaled song called All I Do 469 00:24:33,916 --> 00:24:35,596 Speaker 1: Is Win. Do you know this song? No, of course 470 00:24:35,636 --> 00:24:38,796 Speaker 1: I don't.
It's like, all I do is win, win, win, 471 00:24:38,916 --> 00:24:41,636 Speaker 1: no matter what. The idea is, he just wins all the time. 472 00:24:41,916 --> 00:24:43,916 Speaker 1: And I tell my students that is like a crappy 473 00:24:43,956 --> 00:24:46,156 Speaker 1: way to live a life, because if you're literally winning 474 00:24:46,196 --> 00:24:49,236 Speaker 1: all the time, you don't actually notice the subtle changes. 475 00:24:49,476 --> 00:24:53,036 Speaker 1: What is the optimal design, then, of a good life? 476 00:24:53,076 --> 00:24:56,076 Speaker 1: Maybe it's just ninety nine great days and one really 477 00:24:56,116 --> 00:24:58,356 Speaker 1: bad one to make you appreciate the rest of the 478 00:24:58,396 --> 00:25:00,396 Speaker 1: ninety nine. What do you think? I think about this 479 00:25:00,436 --> 00:25:03,156 Speaker 1: one a lot. What you want to do is maximize 480 00:25:03,196 --> 00:25:05,836 Speaker 1: the change somehow, and it's optimal if that change is 481 00:25:05,876 --> 00:25:07,996 Speaker 1: going in a positive direction, but you actually want it 482 00:25:08,036 --> 00:25:11,156 Speaker 1: to go down sometimes, because another feature of this diminishing 483 00:25:11,156 --> 00:25:14,316 Speaker 1: sensitivity I mentioned from Danny Kahneman and Amos Tversky: it comes 484 00:25:14,316 --> 00:25:16,676 Speaker 1: from their famous idea about prospect theory, which is this 485 00:25:16,716 --> 00:25:19,916 Speaker 1: idea that we don't evaluate prospects or things in our 486 00:25:19,956 --> 00:25:23,076 Speaker 1: lives in terms of absolute values. We recognize them and 487 00:25:23,116 --> 00:25:25,236 Speaker 1: represent them in terms of changes. You know, if I 488 00:25:25,276 --> 00:25:27,276 Speaker 1: was like, Angela, right now the Happiness Lab is going 489 00:25:27,316 --> 00:25:30,636 Speaker 1: to give you one million dollars, you'd be like, that's amazing.
490 00:25:30,716 --> 00:25:32,476 Speaker 1: But if I was like, right now the Happiness Lab 491 00:25:32,516 --> 00:25:34,236 Speaker 1: is going to give you two million dollars, I mean, 492 00:25:34,276 --> 00:25:37,276 Speaker 1: that's better, but you're not like twice as happy. And 493 00:25:37,356 --> 00:25:41,396 Speaker 1: so that is diminishing sensitivity, and that sucks. It means 494 00:25:41,436 --> 00:25:43,756 Speaker 1: for you to get that extra happiness benefit from the 495 00:25:43,876 --> 00:25:46,236 Speaker 1: extra million of the two million, you would have wanted 496 00:25:46,236 --> 00:25:48,316 Speaker 1: to go back to baseline first, so it feels like 497 00:25:48,356 --> 00:25:51,316 Speaker 1: two separate gains instead of one big gain. That's another 498 00:25:51,316 --> 00:25:54,036 Speaker 1: reason not to obsess about being at ten exactly. You 499 00:25:54,156 --> 00:25:55,916 Speaker 1: think going from nine to ten is going to be 500 00:25:56,116 --> 00:25:59,236 Speaker 1: just as good as going from eight to nine or 501 00:25:59,276 --> 00:26:03,356 Speaker 1: seven to eight. But according to diminishing returns, like, eh, 502 00:26:03,396 --> 00:26:06,356 Speaker 1: it's better, but not as much better as it was to go 503 00:26:06,636 --> 00:26:08,876 Speaker 1: from seven to eight. That gives us some hints about 504 00:26:08,916 --> 00:26:10,716 Speaker 1: how to do it better, right? So one is: split 505 00:26:10,796 --> 00:26:13,036 Speaker 1: your gains. You don't want two million at once; you 506 00:26:13,076 --> 00:26:15,036 Speaker 1: want one million and then come back a couple months 507 00:26:15,076 --> 00:26:17,676 Speaker 1: later and get another million. This is something I actually 508 00:26:17,796 --> 00:26:20,316 Speaker 1: try to do. How do you do it?
Sometimes 509 00:26:20,356 --> 00:26:21,956 Speaker 1: my husband and I will have a date night and 510 00:26:21,956 --> 00:26:23,436 Speaker 1: we're like, all right, we're gonna see the movie we 511 00:26:23,476 --> 00:26:25,236 Speaker 1: really wanted and get the dinner we really wanted and 512 00:26:25,316 --> 00:26:27,676 Speaker 1: get ice cream too. And it's like, wait, let's do 513 00:26:27,756 --> 00:26:30,236 Speaker 1: the nice dinner and then do the ice cream tomorrow. 514 00:26:30,356 --> 00:26:32,396 Speaker 1: A really stupid way I do this is sometimes when 515 00:26:32,396 --> 00:26:34,756 Speaker 1: I'm buying stuff. This is not very ecologically savvy, so 516 00:26:34,796 --> 00:26:36,756 Speaker 1: maybe this is not helping with my climate change goals. 517 00:26:36,756 --> 00:26:38,596 Speaker 1: But you know what it's like: you get the package 518 00:26:38,596 --> 00:26:40,396 Speaker 1: of a bunch of stuff that you bought on Amazon. 519 00:26:40,516 --> 00:26:42,356 Speaker 1: It's not as fun as if you got the shirt 520 00:26:42,396 --> 00:26:44,276 Speaker 1: one day and then the next day you got the 521 00:26:44,356 --> 00:26:47,036 Speaker 1: shoes and you're like, oh yeah. So you split your gains. 522 00:26:47,316 --> 00:26:50,116 Speaker 1: I have a proposal that may or may not have 523 00:26:50,316 --> 00:26:54,876 Speaker 1: as severe consequences as getting two Amazon packages. You know, vacations 524 00:26:54,916 --> 00:26:57,316 Speaker 1: are something that I don't know how to take very well.
525 00:26:57,476 --> 00:26:59,996 Speaker 1: But according to the principles we've been discussing, rather than 526 00:27:00,036 --> 00:27:03,956 Speaker 1: taking seven days off and cramming in all of your 527 00:27:04,076 --> 00:27:07,516 Speaker 1: dinners out and your extra desserts and your walks around 528 00:27:07,596 --> 00:27:09,676 Speaker 1: whatever city you're in with iced coffee, which would be 529 00:27:09,756 --> 00:27:13,996 Speaker 1: my preferred vacation, why don't we have seven three-day weekends? 530 00:27:14,236 --> 00:27:17,436 Speaker 1: I think that that could result in a massive global 531 00:27:17,556 --> 00:27:21,836 Speaker 1: gain in happiness without any obvious downside. I love it. 532 00:27:22,116 --> 00:27:24,356 Speaker 1: My other tip on this, I don't know if you 533 00:27:24,716 --> 00:27:27,916 Speaker 1: like Hostess cupcakes. I'm from Philly. We have Tastykakes. Oh, 534 00:27:27,956 --> 00:27:30,076 Speaker 1: I think Tastykakes are similar. But the key to 535 00:27:30,116 --> 00:27:32,556 Speaker 1: the Hostess cupcake is that you get two of them. Like, 536 00:27:32,676 --> 00:27:35,916 Speaker 1: Hostess could have made that much chocolate cakiness in a 537 00:27:35,996 --> 00:27:38,916 Speaker 1: single big cupcake, but if you got that cupcake, you'd 538 00:27:39,076 --> 00:27:41,676 Speaker 1: just plow through it. They had the insight to break 539 00:27:41,716 --> 00:27:43,756 Speaker 1: those up, and like, what happens? You eat the 540 00:27:43,796 --> 00:27:47,396 Speaker 1: first one, you wait, and then you come back to it. 541 00:27:47,516 --> 00:27:50,356 Speaker 1: You're kind of at baseline again. You get more happiness. Wow, 542 00:27:50,516 --> 00:27:54,036 Speaker 1: you think that the Hostess people really knew their behavioral science? 543 00:27:54,396 --> 00:27:57,676 Speaker 1: They read prospect theory.
They're like, wait a minute, this 544 00:27:57,756 --> 00:28:00,236 Speaker 1: is why people like the mini black and white cookies 545 00:28:00,596 --> 00:28:02,836 Speaker 1: better than the one huge black and white cookie. You 546 00:28:02,876 --> 00:28:05,196 Speaker 1: can pause in between them, and you go back to 547 00:28:05,236 --> 00:28:08,116 Speaker 1: happiness baseline, no cookie, and then you're like, oh my gosh, 548 00:28:08,116 --> 00:28:10,396 Speaker 1: another cookie, and then spike back up. I don't know how 549 00:28:10,396 --> 00:28:13,596 Speaker 1: many people, by the way, eat just the one Hostess cupcake or 550 00:28:13,756 --> 00:28:16,276 Speaker 1: Tastykake. In the Tastykake version, there's no white 551 00:28:16,276 --> 00:28:18,836 Speaker 1: squiggly line across the top, but it's basically the same cupcake. 552 00:28:18,876 --> 00:28:20,996 Speaker 1: But there's two, right? There is two. And I do 553 00:28:21,076 --> 00:28:24,396 Speaker 1: think spacing out our gains could be helpful, just as 554 00:28:24,396 --> 00:28:28,916 Speaker 1: you recommend, and maybe just reframing the inevitable bad days. 555 00:28:29,156 --> 00:28:32,316 Speaker 1: I mean, here's a trivial example. Last night 556 00:28:32,356 --> 00:28:36,356 Speaker 1: I made kasha, you know, the buckwheat thing. Oh yeah. 557 00:28:36,396 --> 00:28:39,916 Speaker 1: I followed the recipe to the letter because I had a friend 558 00:28:39,916 --> 00:28:41,596 Speaker 1: who was like, I'm gonna call my grandmother, we're going to 559 00:28:41,676 --> 00:28:45,476 Speaker 1: get this exactly right. And then I left the pot 560 00:28:45,556 --> 00:28:48,436 Speaker 1: on the stove, not even thinking, and it just burned 561 00:28:48,676 --> 00:28:51,756 Speaker 1: to a crisp, and it was horrible and both mushy 562 00:28:51,796 --> 00:28:53,236 Speaker 1: and burned at the same time, which I didn't think 563 00:28:53,276 --> 00:28:57,116 Speaker 1: was physically possible.
So that was a bad experience. I 564 00:28:57,236 --> 00:29:00,556 Speaker 1: grieved a bit. But maybe if I reframe that as 565 00:29:00,916 --> 00:29:05,796 Speaker 1: hooray for the burned kasha, now the next 566 00:29:05,916 --> 00:29:10,276 Speaker 1: non-burnt batch will be all that much more delicious and appreciated. 567 00:29:10,356 --> 00:29:13,156 Speaker 1: Totally. And in fact, this gets back to a 568 00:29:13,236 --> 00:29:16,436 Speaker 1: different form of ancient wisdom. This was exactly the strategy 569 00:29:16,516 --> 00:29:19,636 Speaker 1: that the Stoics had. So the Stoics thought you should 570 00:29:19,636 --> 00:29:22,916 Speaker 1: every morning do what they call negative visualization. You wake 571 00:29:23,036 --> 00:29:26,196 Speaker 1: up and you say, my kasha is going to get burned, 572 00:29:26,356 --> 00:29:29,116 Speaker 1: my husband's gonna leave me, I'm going to lose my job, 573 00:29:29,396 --> 00:29:32,316 Speaker 1: I'm gonna trip and break my leg. You don't ruminate 574 00:29:32,316 --> 00:29:33,876 Speaker 1: on that forever, but you do that as a little kind 575 00:29:33,876 --> 00:29:35,516 Speaker 1: of five-to-ten-minute meditation, and you go about 576 00:29:35,516 --> 00:29:37,276 Speaker 1: your day and you're like, oh my gosh, my kasha 577 00:29:37,276 --> 00:29:40,116 Speaker 1: didn't burn today. So the Stoics were really into this 578 00:29:40,196 --> 00:29:43,116 Speaker 1: idea that you don't necessarily have to have the change 579 00:29:43,156 --> 00:29:45,916 Speaker 1: to notice the change. You could just imagine the change, 580 00:29:45,996 --> 00:29:47,956 Speaker 1: and it gives you a lot of gratitude for the 581 00:29:47,996 --> 00:29:50,196 Speaker 1: stuff you have.
One technique I use in some of 582 00:29:50,196 --> 00:29:52,516 Speaker 1: my talks is I look out in the audience and I say, 583 00:29:52,556 --> 00:29:55,196 Speaker 1: all of you people who have kids, imagine whatever the 584 00:29:55,276 --> 00:29:56,916 Speaker 1: last time you saw them was, that was the last time. 585 00:29:56,956 --> 00:29:59,476 Speaker 1: It's over. You're never going to see them again. And the idea is, 586 00:29:59,516 --> 00:30:01,236 Speaker 1: the next time you hug your kids, you're going to 587 00:30:01,356 --> 00:30:03,916 Speaker 1: hug them much more tightly. You didn't have to have 588 00:30:03,996 --> 00:30:06,196 Speaker 1: a horrible thing happen to them. The reference point didn't 589 00:30:06,196 --> 00:30:07,956 Speaker 1: have to change in a bad way for you to get 590 00:30:07,996 --> 00:30:11,116 Speaker 1: the appreciation. I had a shudder, I just have to say, Laurie. 591 00:30:11,236 --> 00:30:12,756 Speaker 1: That was rough. But now you're going to be so 592 00:30:12,836 --> 00:30:15,876 Speaker 1: nice to your kids today. Yeah, even if they're annoying, 593 00:30:15,876 --> 00:30:17,996 Speaker 1: you'd be like, but I'm so happy they're alive. Like, 594 00:30:18,236 --> 00:30:22,596 Speaker 1: thank the Lord that you're here. Okay, you have given 595 00:30:22,676 --> 00:30:24,036 Speaker 1: us one thing you could do. You could wake up 596 00:30:24,036 --> 00:30:26,676 Speaker 1: and think of three bad things, and they are just imaginary, 597 00:30:26,756 --> 00:30:28,196 Speaker 1: and then the whole rest of your day is going 598 00:30:28,236 --> 00:30:30,596 Speaker 1: to go better. But I recall the studies where you 599 00:30:30,636 --> 00:30:33,196 Speaker 1: wake up and think of three good things, right, 600 00:30:33,236 --> 00:30:38,396 Speaker 1: the classic gratitude exercise. These are opposite recommendations.
So should 601 00:30:38,396 --> 00:30:40,796 Speaker 1: people wake up and think of three good things, or 602 00:30:41,116 --> 00:30:43,236 Speaker 1: should they wake up and think of three bad things? 603 00:30:43,356 --> 00:30:45,916 Speaker 1: I'm going to vote for the three good things. I 604 00:30:45,996 --> 00:30:48,716 Speaker 1: did this, as I usually do, this morning, and I 605 00:30:48,756 --> 00:30:51,516 Speaker 1: actually thought about the kasha. Thank god the house didn't 606 00:30:51,516 --> 00:30:54,716 Speaker 1: burn down, because I did discover the pot of burning 607 00:30:54,796 --> 00:30:58,316 Speaker 1: buckwheat in time to prevent a fire. Yay. And then 608 00:30:58,356 --> 00:31:00,156 Speaker 1: I thought of a couple of other things. My daughter 609 00:31:00,476 --> 00:31:04,676 Speaker 1: got home safely. I really love this collaborator. And built 610 00:31:04,756 --> 00:31:09,716 Speaker 1: in is a contrast to the counterfactual, like, my collaborator could 611 00:31:09,716 --> 00:31:12,036 Speaker 1: be a jerk, but they're not, and my daughter 612 00:31:12,196 --> 00:31:15,716 Speaker 1: could have not gotten home safely. So maybe the Stoics 613 00:31:15,756 --> 00:31:18,076 Speaker 1: had a good idea, but I think it's improved upon 614 00:31:18,236 --> 00:31:21,916 Speaker 1: by this much more positive experience of thinking about three 615 00:31:21,956 --> 00:31:23,836 Speaker 1: good things. To be fair, I think that's what the 616 00:31:23,876 --> 00:31:25,516 Speaker 1: Stoics mean. They don't mean like, oh my god, my 617 00:31:25,556 --> 00:31:27,036 Speaker 1: house is going to burn down. They think you should 618 00:31:27,036 --> 00:31:30,556 Speaker 1: do that because immediately afterwards you're going to think about 619 00:31:30,596 --> 00:31:33,516 Speaker 1: the positive thing too.
You're gonna be grateful for your kids 620 00:31:33,596 --> 00:31:35,516 Speaker 1: leaving their stuff on the floor because you had that 621 00:31:35,556 --> 00:31:36,996 Speaker 1: moment of thinking about what it could be like to 622 00:31:37,036 --> 00:31:38,676 Speaker 1: not have kids at all. I think naturally, in the 623 00:31:38,716 --> 00:31:40,476 Speaker 1: way the Stoics are talking about it, they focus on 624 00:31:40,516 --> 00:31:42,396 Speaker 1: the negative side, but they're hoping you're going to get 625 00:31:42,396 --> 00:31:44,796 Speaker 1: to the blessings really fast. And I think the negative 626 00:31:44,836 --> 00:31:48,396 Speaker 1: side is important when you're feeling really down, like the 627 00:31:48,436 --> 00:31:51,196 Speaker 1: example of breaking your leg. Because I'm clumsy, this actually 628 00:31:51,236 --> 00:31:53,876 Speaker 1: happens to me with reasonable frequency. Like, I recently broke 629 00:31:53,996 --> 00:31:57,396 Speaker 1: my knee. You literally mean this happens to you with frequency? 630 00:31:57,556 --> 00:32:00,636 Speaker 1: You injure yourself in a serious way? Yeah, this was 631 00:32:00,716 --> 00:32:03,476 Speaker 1: the second time I'd broken the same kneecap when I 632 00:32:03,476 --> 00:32:07,596 Speaker 1: fell on it. Oh my god, Laurie, it's terrible. Yeah, 633 00:32:07,636 --> 00:32:10,276 Speaker 1: I was like, woe is me, I broke my kneecap, 634 00:32:10,316 --> 00:32:12,996 Speaker 1: this sucks. And then I actually went back to the 635 00:32:12,996 --> 00:32:15,236 Speaker 1: Stoics, because I knew I needed hardcore people who were 636 00:32:15,236 --> 00:32:17,076 Speaker 1: going to help me with this. And I read a 637 00:32:17,076 --> 00:32:19,716 Speaker 1: book by this current practicing Stoic, Bill Irvine, and he 638 00:32:19,756 --> 00:32:22,236 Speaker 1: went through like, let's talk about some cases that you 639 00:32:22,276 --> 00:32:24,436 Speaker 1: could have.
He's like, you could be locked in. 640 00:32:24,636 --> 00:32:27,516 Speaker 1: These are people who have some sort of accident happen 641 00:32:27,596 --> 00:32:30,356 Speaker 1: who are fully conscious but so paralyzed that they can't 642 00:32:30,396 --> 00:32:32,276 Speaker 1: move any part of their body. They have to like 643 00:32:32,356 --> 00:32:35,596 Speaker 1: blink an eye to communicate with people. And I was like, okay, well, 644 00:32:35,596 --> 00:32:37,556 Speaker 1: at least I don't have that. But sometimes if you 645 00:32:37,596 --> 00:32:40,116 Speaker 1: get the right negative visualization, you're like, wait a minute, 646 00:32:40,156 --> 00:32:41,996 Speaker 1: I can actually be grateful for the broken knee too, 647 00:32:42,036 --> 00:32:44,516 Speaker 1: because at least it's not X, Y, and Z. And 648 00:32:44,636 --> 00:32:47,076 Speaker 1: I think this is a nice way to solve Amelia's problem. 649 00:32:47,236 --> 00:32:50,396 Speaker 1: You don't necessarily have to get the change from your 650 00:32:50,476 --> 00:32:54,036 Speaker 1: real actions. You can make your current reference points seem 651 00:32:54,156 --> 00:32:57,476 Speaker 1: good just through these imaginations. Do you think that would 652 00:32:57,596 --> 00:33:00,676 Speaker 1: change Amelia's set point? Do you think that if she 653 00:33:01,316 --> 00:33:05,636 Speaker 1: were chronically comparing her pretty awesome life, she says she 654 00:33:05,756 --> 00:33:08,956 Speaker 1: is in a lovely department, she's doing really well, if she 655 00:33:09,316 --> 00:33:16,476 Speaker 1: regularly did these mental counterfactuals, that she would be enduringly happier?
656 00:33:16,716 --> 00:33:19,676 Speaker 1: You know, if every morning she could have the counterfactual 657 00:33:19,676 --> 00:33:22,916 Speaker 1: of what if I didn't have this lovely, supportive job 658 00:33:22,996 --> 00:33:25,196 Speaker 1: with my interesting colleagues, as she mentions, what if my 659 00:33:25,236 --> 00:33:28,476 Speaker 1: colleagues sucked? That bumps up the appreciation you have. It 660 00:33:28,556 --> 00:33:31,596 Speaker 1: breaks your hedonic adaptation. So I actually do 661 00:33:31,676 --> 00:33:33,436 Speaker 1: think it would be kind of a nice strategy. I 662 00:33:33,436 --> 00:33:36,756 Speaker 1: think we need Amelia to agree to be a pilot 663 00:33:36,796 --> 00:33:40,756 Speaker 1: subject in a study with only one subject. So, Laurie, 664 00:33:40,956 --> 00:33:42,996 Speaker 1: you want Amelia to wake up every day for a 665 00:33:43,036 --> 00:33:45,876 Speaker 1: week and what? Think of three bad things? Of the 666 00:33:45,916 --> 00:33:48,756 Speaker 1: things she loves about her life and her job, imagine 667 00:33:48,756 --> 00:33:51,476 Speaker 1: that those weren't there. I would propose the second week 668 00:33:51,516 --> 00:33:54,836 Speaker 1: be that she tried the three good things exercise, and 669 00:33:55,036 --> 00:33:58,676 Speaker 1: after a month we could all get together and find 670 00:33:58,716 --> 00:34:01,396 Speaker 1: out which week was better. I want a third condition 671 00:34:01,396 --> 00:34:04,076 Speaker 1: where she does both, where she imagines the bad thing 672 00:34:04,196 --> 00:34:06,396 Speaker 1: and then thinks, oh my gosh, I am so lucky 673 00:34:06,436 --> 00:34:07,996 Speaker 1: to have these colleagues, because I think if you just 674 00:34:07,996 --> 00:34:10,636 Speaker 1: do the bad, then it could backfire. And to be fair 675 00:34:10,676 --> 00:34:12,916 Speaker 1: to the Stoics, that's not really what they meant.
Okay, 676 00:34:13,036 --> 00:34:14,956 Speaker 1: now we need six weeks of your life, Amelia, right, 677 00:34:15,756 --> 00:34:22,396 Speaker 1: love it. To be continued. The Happiness Lab is co 678 00:34:22,436 --> 00:34:25,316 Speaker 1: written and produced by Ryan Dilley. Our original music was 679 00:34:25,356 --> 00:34:28,996 Speaker 1: composed by Zachary Silver, with additional scoring, mixing, and mastering 680 00:34:29,036 --> 00:34:31,596 Speaker 1: by Evan Viola. The Happiness Lab is brought to you 681 00:34:31,636 --> 00:34:36,796 Speaker 1: by Pushkin Industries and me, Dr. Laurie Santos. No Stupid Questions 682 00:34:36,796 --> 00:34:40,276 Speaker 1: is part of the Freakonomics Radio Network, which also includes 683 00:34:40,356 --> 00:34:45,956 Speaker 1: Freakonomics Radio, People I (Mostly) Admire, and Freakonomics, M.D. This 684 00:34:45,996 --> 00:34:49,476 Speaker 1: episode was produced by me, Rebecca Lee Douglas. For our 685 00:34:49,556 --> 00:34:52,596 Speaker 1: Happiness Lab listeners who are new to No Stupid Questions, 686 00:34:52,876 --> 00:34:54,596 Speaker 1: this is the time in the show where I do 687 00:34:54,676 --> 00:34:58,156 Speaker 1: a quick fact check of the conversations. Early on in 688 00:34:58,156 --> 00:35:01,556 Speaker 1: the episode, Laurie and Angela say that they're horrified by 689 00:35:01,596 --> 00:35:05,436 Speaker 1: the idea of Heinz Oreo mayonnaise. I'm sure they will 690 00:35:05,516 --> 00:35:07,876 Speaker 1: be thrilled to hear that this is not, in fact, 691 00:35:07,996 --> 00:35:10,876 Speaker 1: a real product. In June of twenty twenty one, the 692 00:35:10,996 --> 00:35:15,636 Speaker 1: Instagram account doctor Photograph posted a convincing photo of Mayo 693 00:35:15,796 --> 00:35:19,756 Speaker 1: Oreo sauce that immediately went viral, but the image was 694 00:35:19,876 --> 00:35:23,116 Speaker 1: later proven to be altered.
For those who are disappointed 695 00:35:23,116 --> 00:35:27,476 Speaker 1: that this crossover product doesn't actually exist, fret not: plenty 696 00:35:27,636 --> 00:35:31,956 Speaker 1: of other Heinz mashups are actually real. The company now 697 00:35:31,996 --> 00:35:36,556 Speaker 1: produces Mayochup, a combination of mayonnaise and ketchup; Mayomust, 698 00:35:36,676 --> 00:35:40,676 Speaker 1: a mix of mayonnaise and mustard; and Kranch, a 699 00:35:40,756 --> 00:35:45,316 Speaker 1: blend of ketchup and ranch, among others. Later, Angela and 700 00:35:45,396 --> 00:35:51,196 Speaker 1: Laurie discuss recently deceased psychologist Albert Bandura's seminal Bobo doll experiment. 701 00:35:51,636 --> 00:35:54,236 Speaker 1: I was unfamiliar with the concept of a Bobo doll, 702 00:35:54,476 --> 00:35:56,956 Speaker 1: and I surmised that many listeners would be as well. 703 00:35:57,196 --> 00:35:59,596 Speaker 1: I found that the toy isn't really a doll at all, 704 00:35:59,796 --> 00:36:03,956 Speaker 1: but rather a large, inflatable plastic clown with a heavy, 705 00:36:04,076 --> 00:36:08,316 Speaker 1: rounded bottom. When it's pushed over, the clown temporarily wobbles 706 00:36:08,396 --> 00:36:11,676 Speaker 1: but quickly bounces back to center, making it a perfect 707 00:36:11,716 --> 00:36:16,756 Speaker 1: toy for children and adults to beat up in Bandura's experiments. Finally, 708 00:36:16,876 --> 00:36:20,516 Speaker 1: Angela says that Tastykakes, like Hostess cupcakes, come two 709 00:36:20,556 --> 00:36:24,076 Speaker 1: to a package. Standard Tastykake boxes do include six 710 00:36:24,156 --> 00:36:26,996 Speaker 1: packs of two cupcakes, but you can also opt for 711 00:36:27,036 --> 00:36:30,636 Speaker 1: a single package of three cupcakes.
Either way, if you 712 00:36:30,676 --> 00:36:33,476 Speaker 1: have the self-restraint, you can still enjoy the happiness 713 00:36:33,516 --> 00:36:36,756 Speaker 1: that comes with more dessert later that day. That's it 714 00:36:36,836 --> 00:36:41,116 Speaker 1: for the fact check. No Stupid Questions is produced by 715 00:36:41,116 --> 00:36:45,836 Speaker 1: Freakonomics Radio and Stitcher. Our staff includes Alison Craiglow, Greg Rippin, 716 00:36:46,196 --> 00:36:51,436 Speaker 1: James Foster, Joel Meyer, Tricia Bobeda, Emma Tyrrell, Lyric Bowditch, 717 00:36:51,756 --> 00:36:54,436 Speaker 1: Jasmin Klinger, and Jacob Clemente.