1 00:00:15,476 --> 00:00:30,956 Speaker 1: Pushkin. The sounds of laughter. There are people who study it, 2 00:00:31,636 --> 00:00:33,236 Speaker 1: people like Ben Glenn. 3 00:00:34,156 --> 00:00:38,516 Speaker 2: This extended sequence of laughs from the Jack Benny Program 4 00:00:38,636 --> 00:00:40,596 Speaker 2: is one of the all-time greats. 5 00:00:45,276 --> 00:00:48,436 Speaker 1: Jack Benny was the biggest comedian of his day. His 6 00:00:48,476 --> 00:00:51,796 Speaker 1: show ran on radio and TV for more than thirty years. 7 00:00:52,396 --> 00:00:55,076 Speaker 1: It's said that the Jack Benny Show was the only 8 00:00:55,116 --> 00:00:59,596 Speaker 1: program President John F. Kennedy would watch religiously. His jokes 9 00:00:59,636 --> 00:01:02,876 Speaker 1: have probably generated more laughs than anyone else's in the history 10 00:01:02,876 --> 00:01:07,196 Speaker 1: of comedy. In this scene, comedian Johnny Carson has just 11 00:01:07,276 --> 00:01:10,036 Speaker 1: remarked that he has no idea how Jack Benny stays 12 00:01:10,036 --> 00:01:12,956 Speaker 1: so young. The gag that's slaying the audience is the 13 00:01:12,996 --> 00:01:17,076 Speaker 1: reveal that Jack stays young because he's a robot, and 14 00:01:17,116 --> 00:01:19,916 Speaker 1: his staff are dismantling him bit by bit, putting him 15 00:01:19,916 --> 00:01:28,316 Speaker 1: in storage for the night. But the laughter you're hearing 16 00:01:28,316 --> 00:01:30,916 Speaker 1: from this gag isn't pure. It's sweetened. 17 00:01:31,636 --> 00:01:36,556 Speaker 2: Sweetening is augmenting the authentic reactions of a sitting audience 18 00:01:36,796 --> 00:01:38,396 Speaker 2: with prerecorded reactions. 19 00:01:39,196 --> 00:01:43,636 Speaker 1: Augmenting authentic reactions. That's the media industry trick you might 20 00:01:43,676 --> 00:01:46,716 Speaker 1: know as canned laughter or a laugh track.
The man 21 00:01:46,796 --> 00:01:50,276 Speaker 1: I'm talking to, Ben Glenn, is a television historian and 22 00:01:50,316 --> 00:01:53,316 Speaker 1: an expert on the history of this technique. He's listened 23 00:01:53,316 --> 00:01:56,436 Speaker 1: to every faked shriek and spliced-in guffaw in US 24 00:01:56,476 --> 00:01:57,236 Speaker 1: TV history. 25 00:01:57,876 --> 00:02:02,916 Speaker 2: I'm fifty-four years old, I am the television generation, 26 00:02:03,756 --> 00:02:08,636 Speaker 2: and I thought to myself, why does nobody ever talk 27 00:02:08,676 --> 00:02:14,196 Speaker 2: about this? Because the laugh track was a critical, integral 28 00:02:14,236 --> 00:02:16,476 Speaker 2: part of these shows and of their success. 29 00:02:17,316 --> 00:02:20,196 Speaker 1: Canned laughter is pretty common today, but I was surprised 30 00:02:20,236 --> 00:02:23,636 Speaker 1: to hear the secret history of how this technique developed. 31 00:02:24,076 --> 00:02:28,476 Speaker 2: The laugh track began in earnest in nineteen fifty when 32 00:02:28,516 --> 00:02:32,956 Speaker 2: an engineer named Charles Douglas, a sound engineer at CBS, 33 00:02:33,996 --> 00:02:37,036 Speaker 2: developed not only the idea that you could insert pre 34 00:02:37,156 --> 00:02:41,796 Speaker 2: recorded laughter, but he also built and designed himself in 35 00:02:41,836 --> 00:02:48,356 Speaker 2: his garage the apparatus to use tape loops of carefully 36 00:02:48,396 --> 00:02:53,356 Speaker 2: selected reactions that could be inserted into television shows. 37 00:02:53,796 --> 00:02:56,436 Speaker 1: And so where did he get those, the originals? Where 38 00:02:56,436 --> 00:02:58,436 Speaker 1: did those reactions come from? 39 00:02:58,756 --> 00:03:02,236 Speaker 2: That is the question that everybody asks, so I took 40 00:03:02,236 --> 00:03:06,276 Speaker 2: it upon myself to start digging.
The facts are very 41 00:03:06,276 --> 00:03:09,756 Speaker 2: hard to uncover because it was meant to 42 00:03:09,756 --> 00:03:12,236 Speaker 2: be a well-kept Hollywood secret. 43 00:03:12,636 --> 00:03:15,316 Speaker 1: Ben's aural detective work led him to listen over and 44 00:03:15,356 --> 00:03:19,476 Speaker 1: over, both to classic laugh tracks and early television comedies. 45 00:03:19,996 --> 00:03:22,276 Speaker 1: In the end, he was able to figure out that 46 00:03:22,356 --> 00:03:25,596 Speaker 1: Douglas made his laugh tracks by recording the real studio 47 00:03:25,676 --> 00:03:29,796 Speaker 1: audiences who watched early comedians like Red Skelton, Lucille Ball, 48 00:03:29,956 --> 00:03:30,916 Speaker 1: and Abbott and Costello. 49 00:03:31,196 --> 00:03:35,756 Speaker 2: There's also a report that some of the earliest ones 50 00:03:35,796 --> 00:03:39,636 Speaker 2: in the nineteen fifties came from a Marcel Marceau performance 51 00:03:39,636 --> 00:03:42,476 Speaker 2: in Los Angeles, and of course that would make sense, 52 00:03:42,476 --> 00:03:45,916 Speaker 2: since he was a mime and there was no interfering 53 00:03:46,036 --> 00:03:50,196 Speaker 2: music or any sort of distraction noise, etc. 54 00:03:51,116 --> 00:03:54,236 Speaker 1: Ben told me something that totally blew my mind. Those 55 00:03:54,276 --> 00:03:58,156 Speaker 1: early laugh tracks are still in use today. The exact 56 00:03:58,156 --> 00:04:00,836 Speaker 1: same guffaws and shrieks that people heard in the fifties 57 00:04:01,036 --> 00:04:03,276 Speaker 1: are still sweetening modern TV shows. 58 00:04:04,676 --> 00:04:11,156 Speaker 2: We are hearing reactions that were recorded decades ago. They're 59 00:04:11,156 --> 00:04:13,316 Speaker 2: dead people, yes, but they live on. 60 00:04:15,676 --> 00:04:20,116 Speaker 1: Television comedies are still using these laughs because they work.
61 00:04:21,116 --> 00:04:23,716 Speaker 1: As I listened to Ben's clips of early comedies, I 62 00:04:23,756 --> 00:04:26,756 Speaker 1: realized that the canned laughter does seem to make jokes funnier, 63 00:04:27,276 --> 00:04:30,556 Speaker 1: even if the gags are pretty dumb, which, honestly, many 64 00:04:30,596 --> 00:04:35,076 Speaker 1: of the gags were. But why was my brain reacting 65 00:04:35,156 --> 00:04:38,156 Speaker 1: like this? How can the prerecorded laughter of dead 66 00:04:38,196 --> 00:04:41,556 Speaker 1: people I've never met affect my experience of a mostly 67 00:04:41,596 --> 00:04:45,756 Speaker 1: not-funny television show today? I mean, are we literally 68 00:04:45,836 --> 00:04:49,756 Speaker 1: catching other people's emotions like the common cold? And if 69 00:04:49,756 --> 00:04:53,196 Speaker 1: we are, how can we inoculate ourselves against the sorts 70 00:04:53,196 --> 00:04:56,116 Speaker 1: of feelings we might not want to catch from other people? 71 00:05:01,516 --> 00:05:03,636 Speaker 1: Our minds are constantly telling us what to do to 72 00:05:03,676 --> 00:05:06,636 Speaker 1: be happy. But what if our minds are wrong? What 73 00:05:06,716 --> 00:05:09,356 Speaker 1: if our minds are lying to us, leading us away 74 00:05:09,436 --> 00:05:12,756 Speaker 1: from what will really make us happy? The good news 75 00:05:12,956 --> 00:05:15,356 Speaker 1: is that understanding the science of the mind can point 76 00:05:15,436 --> 00:05:18,996 Speaker 1: us all back in the right direction. You're listening to 77 00:05:19,076 --> 00:05:29,236 Speaker 1: The Happiness Lab with Dr. Laurie Santos. Modern-day laugh 78 00:05:29,276 --> 00:05:32,436 Speaker 1: tracks are only a few decades old, but comedy designers 79 00:05:32,476 --> 00:05:35,036 Speaker 1: have been using similar techniques for centuries. 80 00:05:35,356 --> 00:05:41,436 Speaker 2: If you think about it.
In the sixteenth century, Shakespearean plays, 81 00:05:42,236 --> 00:05:46,516 Speaker 2: they were performed in front of an extremely raucous audience, 82 00:05:46,996 --> 00:05:52,396 Speaker 2: where the audience cheered the protagonist and booed the villain. 83 00:05:52,876 --> 00:05:57,636 Speaker 2: There are actually accounts that Shakespeare planted people in 84 00:05:57,676 --> 00:06:00,316 Speaker 2: the audience to react in a certain way, to spur 85 00:06:00,476 --> 00:06:03,756 Speaker 2: on the audience members around them. 86 00:06:04,396 --> 00:06:07,916 Speaker 1: Shakespeare wasn't alone. There's a long, long history of seeding 87 00:06:07,956 --> 00:06:11,276 Speaker 1: live audiences like this. The opera houses of France employed 88 00:06:11,316 --> 00:06:15,276 Speaker 1: whole groups of claqueurs, hired guns who sat in the auditorium, 89 00:06:15,596 --> 00:06:18,276 Speaker 1: primed to lead the paying audience in whatever reaction the 90 00:06:18,276 --> 00:06:22,476 Speaker 1: script required. Some members of the claque even had specializations. 91 00:06:23,076 --> 00:06:27,356 Speaker 1: Rieurs were expert laughers, while pleureuses could summon tears at will. 92 00:06:28,956 --> 00:06:32,476 Speaker 1: Early television producers also realized the power of these techniques. 93 00:06:33,116 --> 00:06:35,836 Speaker 1: They would have loved to hire professional claqueurs to enhance 94 00:06:35,876 --> 00:06:39,756 Speaker 1: their studio audience's reaction, but they couldn't afford something that extravagant. 95 00:06:40,756 --> 00:06:44,836 Speaker 1: Charles Douglas's new advance in laughter technology offered a tantalizing, 96 00:06:44,996 --> 00:06:47,276 Speaker 1: albeit totally fabricated, alternative. 97 00:06:47,676 --> 00:06:52,796 Speaker 2: The entire process was artificial from beginning to end.
98 00:06:53,476 --> 00:06:55,836 Speaker 1: In a few years, Douglas was able to raise his 99 00:06:55,916 --> 00:06:59,076 Speaker 1: laugh track from simple canned laughter to an art form. 100 00:06:59,476 --> 00:07:02,396 Speaker 1: He wasn't just lifting laughs. It was as though he 101 00:07:02,476 --> 00:07:05,956 Speaker 1: was a conductor and the laughing audience members were his orchestra. 102 00:07:06,356 --> 00:07:08,796 Speaker 2: He would sit up late at night in his living 103 00:07:08,916 --> 00:07:14,036 Speaker 2: room and he would be listening to tape loops and 104 00:07:14,196 --> 00:07:23,156 Speaker 2: splicing and extracting reactions. He engineered them to bring a 105 00:07:23,196 --> 00:07:28,996 Speaker 2: certain individual laugh to the surface and to suppress ambient laughter. 106 00:07:29,236 --> 00:07:32,876 Speaker 2: They often were sped up to make it seem even 107 00:07:32,956 --> 00:07:37,796 Speaker 2: more kind of jolly or, you know, more intensely funny. 108 00:07:38,236 --> 00:07:41,956 Speaker 2: The machine he built held up to three hundred and 109 00:07:41,996 --> 00:07:44,196 Speaker 2: twenty reactions. 110 00:07:45,036 --> 00:07:48,076 Speaker 1: Douglas's new machine allowed him to become the master of 111 00:07:48,156 --> 00:07:52,516 Speaker 1: television emotional manipulation, but most shows couldn't afford his exquisite 112 00:07:52,516 --> 00:07:53,036 Speaker 1: new craft. 113 00:07:53,556 --> 00:07:58,316 Speaker 2: Charles Douglas charged about one hundred dollars per day. If 114 00:07:58,316 --> 00:08:01,836 Speaker 2: that was over budget for some production companies, they found 115 00:08:02,076 --> 00:08:05,516 Speaker 2: some, I have to say, sort of low-budget pre- 116 00:08:05,596 --> 00:08:10,036 Speaker 2: existing laugh tracks.
For example, The Adventures of Ozzie and Harriet, which 117 00:08:10,076 --> 00:08:16,636 Speaker 2: was an enormously popular show in its day, used only 118 00:08:16,876 --> 00:08:21,116 Speaker 2: one reaction regardless of situation, so it's like the same laugh, 119 00:08:21,596 --> 00:08:23,156 Speaker 2: the same laugh, the exact same laugh. 120 00:08:27,996 --> 00:08:30,516 Speaker 1: But shows soon recognized they needed to be a little 121 00:08:30,516 --> 00:08:33,276 Speaker 1: bit more professional with their sweetening to make the jokes stick, 122 00:08:33,796 --> 00:08:37,436 Speaker 1: which put Douglas's laugh track in even more demand. And 123 00:08:37,476 --> 00:08:40,516 Speaker 1: that's when Douglas realized he could start sweetening more than 124 00:08:40,596 --> 00:08:41,236 Speaker 1: just the laughs. 125 00:08:41,876 --> 00:08:45,716 Speaker 2: What we see is that the laugh track expands beyond 126 00:08:45,956 --> 00:08:49,636 Speaker 2: just laughter, and these are some of my favorite reactions. 127 00:08:49,956 --> 00:08:56,796 Speaker 2: There is shock, there are, you know, women kind of squealing. 128 00:08:56,516 --> 00:08:59,316 Speaker 3: Oh oh oh. 129 00:09:00,036 --> 00:09:03,956 Speaker 2: So the laugh track expanded beyond just laughter to include 130 00:09:04,276 --> 00:09:07,116 Speaker 2: the full range of how an audience might react. 131 00:09:07,956 --> 00:09:11,516 Speaker 1: But why were Douglas's techniques so powerful? Why does hearing 132 00:09:11,516 --> 00:09:14,156 Speaker 1: the squeals of people we've never met change our own 133 00:09:14,236 --> 00:09:17,276 Speaker 1: reaction to a television show? Well, if you've listened to 134 00:09:17,316 --> 00:09:20,756 Speaker 1: some of the previous episodes, you know that experiencing an event, 135 00:09:20,996 --> 00:09:23,036 Speaker 1: say eating a piece of chocolate, at the same time 136 00:09:23,076 --> 00:09:26,436 Speaker 1: as another person, can make those events more intense.
But 137 00:09:26,556 --> 00:09:30,716 Speaker 1: psychology shows that experiencing an event with another person can 138 00:09:30,796 --> 00:09:33,596 Speaker 1: also change our experience of that event in other ways 139 00:09:33,596 --> 00:09:33,996 Speaker 1: as well. 140 00:09:34,476 --> 00:09:38,636 Speaker 2: Our thinking, our reactions, and really the psychology of 141 00:09:38,676 --> 00:09:43,596 Speaker 2: what we're watching can be changed by the reactions around us. 142 00:09:44,876 --> 00:09:47,956 Speaker 1: Psychologists have long documented the fact that we tend to 143 00:09:48,036 --> 00:09:51,956 Speaker 1: copy other people's behavior unconsciously, and most of the time 144 00:09:52,076 --> 00:09:56,036 Speaker 1: without even realizing it. In one study by Tanya Chartrand 145 00:09:56,076 --> 00:09:58,796 Speaker 1: and colleagues, subjects were brought into the lab and told 146 00:09:58,796 --> 00:10:01,316 Speaker 1: they needed to work with another person to describe a 147 00:10:01,316 --> 00:10:04,436 Speaker 1: set of magazine photos, but the photos weren't the real point 148 00:10:04,476 --> 00:10:08,076 Speaker 1: of the experiment. Unbeknownst to the subjects, the people chosen 149 00:10:08,116 --> 00:10:12,716 Speaker 1: as their partners were actually experimental claqueurs, or confederates, as 150 00:10:12,756 --> 00:10:16,036 Speaker 1: scientists call them. They were people hired by the experimenters 151 00:10:16,316 --> 00:10:19,476 Speaker 1: to behave in a very specific way. They rubbed their 152 00:10:19,476 --> 00:10:22,636 Speaker 1: faces a bunch, or sometimes touched their feet over and over. 153 00:10:23,516 --> 00:10:26,556 Speaker 1: Chartrand and colleagues wanted to know how these behaviors affected 154 00:10:26,556 --> 00:10:30,076 Speaker 1: the subjects.
It turns out being around someone touching their 155 00:10:30,076 --> 00:10:33,316 Speaker 1: face caused the research subjects to touch their own faces 156 00:10:33,316 --> 00:10:36,396 Speaker 1: more often, whereas being around someone who touched their feet 157 00:10:36,756 --> 00:10:39,636 Speaker 1: caused participants to touch their feet more often. Even though 158 00:10:39,796 --> 00:10:42,836 Speaker 1: both of these behaviors are really weird, this result, and 159 00:10:42,956 --> 00:10:45,556 Speaker 1: lots of others, shows that we tend to unconsciously mimic 160 00:10:45,636 --> 00:10:49,116 Speaker 1: the behavior of others, a phenomenon that researchers christened the 161 00:10:49,196 --> 00:10:54,276 Speaker 1: chameleon effect. We're literally catching other people's behavior. But we 162 00:10:54,316 --> 00:10:58,156 Speaker 1: don't just catch other people's behavior. Researchers have long realized 163 00:10:58,236 --> 00:11:00,396 Speaker 1: that there's a tight link between behaving in a certain 164 00:11:00,436 --> 00:11:03,876 Speaker 1: way and feeling a certain way. The act of behaving 165 00:11:03,876 --> 00:11:07,156 Speaker 1: in a certain way, say smiling or frowning, can influence 166 00:11:07,236 --> 00:11:11,076 Speaker 1: how we feel. Studies have shown that adopting a happy facial 167 00:11:11,076 --> 00:11:15,196 Speaker 1: expression can unconsciously improve our mood. There are also studies 168 00:11:15,196 --> 00:11:18,436 Speaker 1: showing that when you can't make facial expressions, it becomes 169 00:11:18,436 --> 00:11:23,316 Speaker 1: harder to experience emotions. Chartrand showed this in an unusual experiment. 170 00:11:23,596 --> 00:11:27,596 Speaker 1: She tested whether women who receive Botox injections, which paralyze 171 00:11:27,636 --> 00:11:31,996 Speaker 1: facial muscles, have trouble recognizing emotions in others.
She found 172 00:11:31,996 --> 00:11:34,756 Speaker 1: that not being able to make a particular facial expression 173 00:11:35,076 --> 00:11:39,116 Speaker 1: makes it harder to recognize that emotion in others. What 174 00:11:39,196 --> 00:11:41,636 Speaker 1: does all this have to do with the laugh track? Well, 175 00:11:41,836 --> 00:11:45,396 Speaker 1: if audience members naturally copy the reactions of those around them, 176 00:11:45,716 --> 00:11:48,036 Speaker 1: then they'll not only behave differently when they hear a 177 00:11:48,116 --> 00:11:51,716 Speaker 1: laugh track, chuckling a bit more, but they'll also feel 178 00:11:51,756 --> 00:11:57,156 Speaker 1: differently because of emotional contagion. We're literally catching other people's emotions. 179 00:11:57,716 --> 00:12:00,596 Speaker 1: Hearing other people laugh at a joke makes us think 180 00:12:00,716 --> 00:12:02,556 Speaker 1: that the joke is actually funnier. 181 00:12:02,996 --> 00:12:05,556 Speaker 2: It's reassuring, like, oh, yes, I'm not the only one 182 00:12:05,556 --> 00:12:09,356 Speaker 2: who finds that funny. That validates me; that is funny. 183 00:12:09,516 --> 00:12:13,676 Speaker 1: Humans are not just behavioral chameleons but emotional chameleons as well. 184 00:12:14,156 --> 00:12:17,076 Speaker 1: We're as susceptible to the emotions around us as we 185 00:12:17,116 --> 00:12:20,836 Speaker 1: are to a highly contagious disease. But we don't often 186 00:12:20,916 --> 00:12:22,916 Speaker 1: like to think that our emotions can be so fickle, 187 00:12:23,436 --> 00:12:26,796 Speaker 1: which meant that, despite its pervasiveness, not everyone was a 188 00:12:26,796 --> 00:12:28,876 Speaker 1: fan of Douglas's laugh track technology.
189 00:12:29,356 --> 00:12:33,116 Speaker 2: I think the term canned laughter was pejorative and was 190 00:12:33,236 --> 00:12:37,476 Speaker 2: used by critics and probably producers and writers who looked 191 00:12:37,556 --> 00:12:40,876 Speaker 2: down upon the use of prerecorded reactions. 192 00:12:41,476 --> 00:12:45,236 Speaker 1: But laugh tracks persist because they really do work. They 193 00:12:45,276 --> 00:12:47,756 Speaker 1: make a show seem funnier, whether or not we like 194 00:12:47,796 --> 00:12:50,436 Speaker 1: to admit it. That becomes clear when you listen to 195 00:12:50,476 --> 00:12:52,476 Speaker 1: shows that forgo a little sweetening. 196 00:12:52,956 --> 00:12:57,236 Speaker 2: Some networks tried to embark upon what they called a 197 00:12:57,316 --> 00:13:02,156 Speaker 2: prestige sitcom, which had no laugh track at all. One 198 00:13:02,236 --> 00:13:06,076 Speaker 2: was called Frank's Place, on CBS. There was one called The 199 00:13:06,156 --> 00:13:08,916 Speaker 2: Days and Nights of Molly Dodd, and they wore this 200 00:13:09,156 --> 00:13:11,676 Speaker 2: as a badge of honor, like, we need no laugh track. 201 00:13:11,876 --> 00:13:15,436 Speaker 2: They went nowhere, they fizzled, and who remembers them? 202 00:13:16,076 --> 00:13:18,836 Speaker 1: Right. When I teach my class at Yale, I show 203 00:13:18,876 --> 00:13:22,116 Speaker 1: students a modern-day TV show without its claqueurs, just 204 00:13:22,156 --> 00:13:23,516 Speaker 1: to make the very same point. 205 00:13:24,316 --> 00:13:29,036 Speaker 3: Ah, nothing makes beer taste better than cool, clear Rocky 206 00:13:29,076 --> 00:13:32,396 Speaker 3: Mountain spring water, so... 207 00:13:32,396 --> 00:13:33,716 Speaker 2: Big Bang Theory on YouTube. 208 00:13:33,956 --> 00:13:35,996 Speaker 1: Somebody's gone in and clipped out the laugh track and 209 00:13:36,036 --> 00:13:37,636 Speaker 1: it sounds super weird.
210 00:13:37,956 --> 00:13:44,356 Speaker 3: We're not in the Rocky Mountains anyway. Philadelphia. 211 00:13:44,596 --> 00:13:44,876 Speaker 1: They're not. 212 00:13:45,436 --> 00:13:48,516 Speaker 2: They're not jokes, but they just fill 213 00:13:48,556 --> 00:13:51,756 Speaker 2: it with this to sort of propel you into thinking 214 00:13:51,796 --> 00:13:56,476 Speaker 2: everything is funny. How we hear other people reacting is contagious. 215 00:13:57,396 --> 00:13:59,716 Speaker 1: Laugh tracks are one of the most common cases of 216 00:13:59,756 --> 00:14:02,796 Speaker 1: emotional contagion in action. Whether or not we want to 217 00:14:02,836 --> 00:14:05,876 Speaker 1: admit it, we do enjoy TV shows more when there's 218 00:14:05,916 --> 00:14:09,956 Speaker 1: canned laughter. But emotional contagion can also be used in 219 00:14:09,956 --> 00:14:13,756 Speaker 1: more insidious ways, because not everyone out there wants to 220 00:14:13,796 --> 00:14:17,716 Speaker 1: make us laugh. Some companies today make money not by 221 00:14:17,756 --> 00:14:20,516 Speaker 1: making us feel good, but by making us feel a 222 00:14:20,516 --> 00:14:24,076 Speaker 1: whole lot worse. The Happiness Lab will be right back. 223 00:14:32,276 --> 00:14:34,076 Speaker 3: I opened up my email and there were hundreds of 224 00:14:34,116 --> 00:14:38,396 Speaker 3: emails that were all mostly like, how dare you do this, 225 00:14:38,556 --> 00:14:41,196 Speaker 3: and you're so unethical? Or other friends writing me saying, hey, 226 00:14:41,236 --> 00:14:43,676 Speaker 3: you better get a lawyer, and I hope you're okay. 227 00:14:44,196 --> 00:14:47,316 Speaker 1: This is my friend Jeff Hancock. Back in twenty fourteen, 228 00:14:47,756 --> 00:14:51,356 Speaker 1: Jeff opened up a firestorm of hate in his inbox.
229 00:14:51,396 --> 00:14:54,436 Speaker 3: And over the course of the next five or six days, 230 00:14:54,516 --> 00:14:56,676 Speaker 3: you know, I could literally tell where the sun was 231 00:14:56,756 --> 00:14:59,596 Speaker 3: up around the world by where I was receiving hate 232 00:14:59,636 --> 00:15:00,716 Speaker 3: mail from. 233 00:15:00,836 --> 00:15:04,236 Speaker 1: I distinctly remember watching all this happen, just the crazy 234 00:15:04,316 --> 00:15:07,396 Speaker 1: level of outrage that Jeff had elicited. I was worried 235 00:15:07,396 --> 00:15:10,796 Speaker 1: about him. Many of our colleagues were, too. What had 236 00:15:10,876 --> 00:15:14,716 Speaker 1: Jeff done to generate so much fury from complete strangers? 237 00:15:14,956 --> 00:15:18,596 Speaker 1: He ran an experiment to manipulate people's emotions online, and 238 00:15:18,636 --> 00:15:19,516 Speaker 1: they didn't like it. 239 00:15:20,476 --> 00:15:23,676 Speaker 3: I still remember the first time I read about emotional contagion, 240 00:15:23,716 --> 00:15:26,116 Speaker 3: and the metaphor that the author used was you could 241 00:15:26,156 --> 00:15:28,796 Speaker 3: cut the emotions in the room with a knife, and 242 00:15:28,876 --> 00:15:31,756 Speaker 3: I know that feeling. You walk into a room and 243 00:15:32,036 --> 00:15:33,796 Speaker 3: you can be like, oh, it's tense in here. 244 00:15:34,436 --> 00:15:37,836 Speaker 1: Most psychologists study emotional contagion in the context of real- 245 00:15:37,836 --> 00:15:40,756 Speaker 1: world social interactions like these, but Jeff wanted to know 246 00:15:40,876 --> 00:15:43,716 Speaker 1: whether we could catch people's emotions in more subtle ways 247 00:15:43,716 --> 00:15:46,756 Speaker 1: as well, like through a relatively new form of human 248 00:15:46,796 --> 00:15:48,476 Speaker 1: communication: writing.
249 00:15:48,956 --> 00:15:53,436 Speaker 3: So until the nineteen forties, over half of the planet's 250 00:15:53,436 --> 00:15:56,156 Speaker 3: population was illiterate, so they never left any record of 251 00:15:56,196 --> 00:15:58,796 Speaker 3: any of their communication. And now, you know, if you 252 00:15:58,836 --> 00:16:01,796 Speaker 3: were to ask one of your podcast listeners if they'd 253 00:16:01,796 --> 00:16:03,956 Speaker 3: written something today, almost everybody would say yes. 254 00:16:03,876 --> 00:16:06,996 Speaker 1: We forget that text is a completely new form 255 00:16:07,036 --> 00:16:10,436 Speaker 1: of communication for our species. And for a long time, 256 00:16:10,676 --> 00:16:14,476 Speaker 1: most scientists simply assumed that emotions couldn't transmit as easily 257 00:16:14,516 --> 00:16:16,796 Speaker 1: through the written word as they did face to face. 258 00:16:17,196 --> 00:16:21,156 Speaker 3: I really hated this sort of idea that you can't 259 00:16:21,196 --> 00:16:24,356 Speaker 3: communicate emotions in text, like in emails and things like that. 260 00:16:24,636 --> 00:16:28,276 Speaker 1: Jeff's research showed that participants pick up other people's emotions 261 00:16:28,276 --> 00:16:30,836 Speaker 1: through text in, say, a quick email note or an 262 00:16:30,876 --> 00:16:34,236 Speaker 1: online comment just as easily as they do in face 263 00:16:34,276 --> 00:16:36,036 Speaker 1: to face, real-world interactions. 264 00:16:36,636 --> 00:16:38,596 Speaker 3: On the one hand, it's like, surprising, like, wow, how 265 00:16:38,596 --> 00:16:40,916 Speaker 3: can you tell emotions in text? And then you think 266 00:16:40,956 --> 00:16:43,956 Speaker 3: about your favorite author, and of course you have like 267 00:16:43,996 --> 00:16:48,436 Speaker 3: these super powerful feelings from text.
I remember reading Game 268 00:16:48,476 --> 00:16:52,516 Speaker 3: of Thrones, and there's this Red Wedding event where people 269 00:16:52,556 --> 00:16:55,236 Speaker 3: that you aren't expecting to be killed get killed, and 270 00:16:55,836 --> 00:16:57,796 Speaker 3: I was so furious I had to stop reading and 271 00:16:57,796 --> 00:16:59,996 Speaker 3: go for a walk. And it was just one of 272 00:16:59,996 --> 00:17:01,876 Speaker 3: these examples of like, yeah, why do we think that 273 00:17:01,916 --> 00:17:05,276 Speaker 3: you can't communicate emotion through text or online? 274 00:17:05,716 --> 00:17:07,956 Speaker 1: But there are a number of reasons why the text 275 00:17:08,036 --> 00:17:11,516 Speaker 1: we read online is a different communication medium than the 276 00:17:11,596 --> 00:17:15,356 Speaker 1: quick emotions we experience around other people. The first difference 277 00:17:15,436 --> 00:17:18,356 Speaker 1: is longevity. When I laugh at a funny joke, you 278 00:17:18,476 --> 00:17:21,676 Speaker 1: only hear that signal for a few seconds. But text, 279 00:17:21,916 --> 00:17:24,836 Speaker 1: especially the text we post online, is really different. 280 00:17:25,236 --> 00:17:28,356 Speaker 3: It's true that up until very recently in human history, 281 00:17:28,756 --> 00:17:33,156 Speaker 3: everything we said and did disappeared, and now we leave records. 282 00:17:33,716 --> 00:17:36,876 Speaker 1: The records people leave can continue affecting our emotions for 283 00:17:37,036 --> 00:17:40,356 Speaker 1: years and years to come. My favorite writer, Kurt Vonnegut's 284 00:17:40,356 --> 00:17:43,356 Speaker 1: been dead for years, but his quotes still make me 285 00:17:43,476 --> 00:17:47,676 Speaker 1: laugh and sometimes cringe. But there's a second way that text, 286 00:17:47,916 --> 00:17:51,156 Speaker 1: especially text on the Internet, is different.
We don't have 287 00:17:51,196 --> 00:17:55,036 Speaker 1: as much control over who is affecting our emotions. We 288 00:17:55,076 --> 00:17:57,836 Speaker 1: evolved to be around a small number of people, which 289 00:17:57,876 --> 00:18:01,356 Speaker 1: means that most face-to-face emotional contagion only occurs 290 00:18:01,436 --> 00:18:04,796 Speaker 1: between people who know each other well. But on the Internet, 291 00:18:04,956 --> 00:18:08,396 Speaker 1: we're emotionally affected both by people we're close to and 292 00:18:08,436 --> 00:18:10,076 Speaker 1: by people we never even meet. 293 00:18:10,156 --> 00:18:14,516 Speaker 3: In real life, I should have a contagious sort of feeling. 294 00:18:14,596 --> 00:18:17,356 Speaker 3: If my mom posts something that's upsetting for her, I 295 00:18:17,356 --> 00:18:21,436 Speaker 3: should feel some empathy and negative response. But if I'm 296 00:18:21,516 --> 00:18:24,796 Speaker 3: looking through the comments of a hockey game that took 297 00:18:24,796 --> 00:18:28,276 Speaker 3: place last night, and I'm reading the comments, somebody named 298 00:18:28,396 --> 00:18:32,596 Speaker 3: Poster Number Eighty-Three is upset or angry, that should 299 00:18:32,596 --> 00:18:36,436 Speaker 3: have zero bearing on my emotional situation. And yet, because 300 00:18:36,476 --> 00:18:38,756 Speaker 3: of some of the automatic aspects of the way we 301 00:18:38,796 --> 00:18:41,756 Speaker 3: respond to others' emotions, it can trigger things in me. 302 00:18:42,436 --> 00:18:44,716 Speaker 1: When we're on the Internet, we don't only succumb to 303 00:18:44,716 --> 00:18:47,596 Speaker 1: the emotions of people like our family members, people we 304 00:18:47,636 --> 00:18:48,516 Speaker 1: know and care about. 305 00:18:48,836 --> 00:18:51,676 Speaker 3: Then there's these sort of, like, unknown people that I 306 00:18:51,876 --> 00:18:54,236 Speaker 3: never expect to meet again.
I don't have any idea 307 00:18:54,276 --> 00:18:56,316 Speaker 3: who they actually are, and we can call that the 308 00:18:56,436 --> 00:18:59,836 Speaker 3: unknown network. And this could be some Russian agent, could 309 00:18:59,836 --> 00:19:03,956 Speaker 3: be some fraudster out of Nigeria. And typically, in the 310 00:19:03,956 --> 00:19:06,116 Speaker 3: face-to-face world, it's really easy to keep those 311 00:19:06,156 --> 00:19:10,956 Speaker 3: two kinds of contacts separate, so those two networks aren't overlapping. 312 00:19:10,996 --> 00:19:13,756 Speaker 3: But on the Internet, in something like a Facebook news feed, 313 00:19:14,436 --> 00:19:18,756 Speaker 3: those two worlds are completely overlaid and they sit right 314 00:19:18,796 --> 00:19:21,276 Speaker 3: on top of each other. It's just as easy for 315 00:19:21,356 --> 00:19:24,436 Speaker 3: a Russian agent to put something into my feed or 316 00:19:24,476 --> 00:19:27,236 Speaker 3: my advertising space as it is for Laurie. 317 00:19:27,796 --> 00:19:30,556 Speaker 1: And this means that our emotions aren't just affected by 318 00:19:30,556 --> 00:19:33,436 Speaker 1: the people that matter. We're also, in theory, catching the 319 00:19:33,556 --> 00:19:38,916 Speaker 1: emotions of third cousins we haven't seen in years, random strangers, advertisers, bots, 320 00:19:39,276 --> 00:19:41,516 Speaker 1: really anything that winds up in our news feed. 321 00:19:41,836 --> 00:19:45,836 Speaker 3: Everybody's news feed is super huge, it turns out, because of 322 00:19:45,876 --> 00:19:48,956 Speaker 3: network effects. If you have, say, three hundred friends on Facebook, 323 00:19:49,076 --> 00:19:53,396 Speaker 3: you'll have thousands of possible pieces of content to look 324 00:19:53,436 --> 00:19:56,716 Speaker 3: at in your news feed.
And what Facebook very quickly 325 00:19:56,756 --> 00:19:59,156 Speaker 3: found out was that much of that content people found 326 00:19:59,276 --> 00:20:03,276 Speaker 3: uninteresting and unengaging, and so they developed algorithms to try 327 00:20:03,316 --> 00:20:07,676 Speaker 3: and predict what the, say, twenty to thirty most interesting 328 00:20:07,916 --> 00:20:11,476 Speaker 3: pieces for a person would be. Basically, what happens is a 329 00:20:11,516 --> 00:20:14,236 Speaker 3: piece of content would come in and the algorithm would rank 330 00:20:14,316 --> 00:20:16,316 Speaker 3: it and say, well, Jeff would find this really interesting, we're 331 00:20:16,316 --> 00:20:18,276 Speaker 3: going to put it at the top of his feed, or 332 00:20:18,316 --> 00:20:19,796 Speaker 3: he wouldn't at all, we're going to put it down, 333 00:20:19,916 --> 00:20:22,716 Speaker 3: you know, in fifteen-hundredth place. And so everybody's news feed 334 00:20:22,836 --> 00:20:27,356 Speaker 3: is algorithmically curated. And now, in twenty nineteen, most people 335 00:20:27,476 --> 00:20:31,756 Speaker 3: know this, but I think in twenty fourteen this was 336 00:20:32,036 --> 00:20:33,996 Speaker 3: not well known, and I think that was a big part 337 00:20:34,036 --> 00:20:35,036 Speaker 3: of the outrage. 338 00:20:35,556 --> 00:20:38,916 Speaker 1: Ah, yes, the outrage. So why did everyone get so 339 00:20:39,036 --> 00:20:42,116 Speaker 1: mad at Jeff? Well, Facebook was interested to see if 340 00:20:42,156 --> 00:20:45,916 Speaker 1: its users were experiencing emotional contagion from the posts they read. 341 00:20:46,316 --> 00:20:48,636 Speaker 3: In particular, they were worried about some studies that were 342 00:20:48,636 --> 00:20:52,356 Speaker 3: coming out suggesting that people might feel bad when they 343 00:20:52,396 --> 00:20:56,276 Speaker 3: look at their news feed.
It'd be very bad for 344 00:20:56,316 --> 00:20:58,556 Speaker 3: a lot of obvious reasons if one of their 345 00:20:58,596 --> 00:21:03,676 Speaker 3: main key products was causing their users to be depressed 346 00:21:03,756 --> 00:21:04,916 Speaker 3: or causing negative emotion. 347 00:21:05,636 --> 00:21:08,476 Speaker 1: The social media giant designed an experiment to figure out 348 00:21:08,476 --> 00:21:12,636 Speaker 1: the emotional impact of Facebook posts. Unbeknownst to the subjects, 349 00:21:12,956 --> 00:21:15,796 Speaker 1: it decided to tweak the newsfeed algorithm for two different 350 00:21:15,836 --> 00:21:19,916 Speaker 1: test groups of users. The first group, the negative content group, 351 00:21:20,276 --> 00:21:22,916 Speaker 1: was picked to see less bad content in their news feed. 352 00:21:23,316 --> 00:21:25,916 Speaker 3: It never went away, it was always in your feed, 353 00:21:26,076 --> 00:21:28,036 Speaker 3: just, you know, you'd be less likely to see it. 354 00:21:28,436 --> 00:21:30,916 Speaker 1: But there was another group, or condition, who saw their 355 00:21:30,956 --> 00:21:33,796 Speaker 1: newsfeed altered in the other emotional direction. 356 00:21:34,276 --> 00:21:37,156 Speaker 3: So if you were in the positive emotion condition, you know, 357 00:21:37,196 --> 00:21:41,036 Speaker 3: the even more controversial condition, then when posts had positive emotion 358 00:21:41,116 --> 00:21:43,316 Speaker 3: in them, they'd be moved down in your feed and 359 00:21:43,316 --> 00:21:44,836 Speaker 3: you'd be less likely to see them. 360 00:21:45,316 --> 00:21:48,036 Speaker 1: How did this change to a person's feed affect the 361 00:21:48,036 --> 00:21:51,716 Speaker 1: emotions that they themselves expressed in their own posts? Were 362 00:21:51,836 --> 00:21:55,116 Speaker 1: users automatically catching the emotions they saw in their feeds? 363 00:21:55,756 --> 00:21:58,036 Speaker 3: That's what we found.
And so if you were in 364 00:21:58,116 --> 00:22:03,476 Speaker 3: the fewer positive posts condition, then you would write about 365 00:22:03,876 --> 00:22:10,196 Speaker 3: four positive words fewer over the next thousand words that 366 00:22:10,236 --> 00:22:11,556 Speaker 3: you post on Facebook. 367 00:22:11,836 --> 00:22:14,716 Speaker 1: Jeff and his colleagues had shown that people do catch 368 00:22:14,716 --> 00:22:17,236 Speaker 1: the emotions they see in their feed. It was an 369 00:22:17,236 --> 00:22:21,476 Speaker 1: important result. People online, many of whom you'd never even met, 370 00:22:21,716 --> 00:22:24,636 Speaker 1: were able to make you feel happier or sadder in 371 00:22:24,716 --> 00:22:26,196 Speaker 1: your real, offline life. 372 00:22:26,316 --> 00:22:28,236 Speaker 3: I think it got a little bit of media attention. I 373 00:22:28,276 --> 00:22:31,716 Speaker 3: remember Jimmy Fallon actually cracked a joke about it in 374 00:22:31,796 --> 00:22:34,556 Speaker 3: one of his monologues. I remember being like, okay, right on, 375 00:22:34,716 --> 00:22:37,356 Speaker 3: Jimmy Fallon cracks a joke, that's a check mark, great. And 376 00:22:37,436 --> 00:22:39,756 Speaker 3: after that I thought, like, okay, I guess that'll be sort 377 00:22:39,756 --> 00:22:40,596 Speaker 3: of it for the study. 378 00:22:40,996 --> 00:22:44,636 Speaker 1: Jeff couldn't have been more wrong. The hate mail started arriving, 379 00:22:44,836 --> 00:22:47,756 Speaker 1: and not just for weeks after his study, but for years. 380 00:22:48,196 --> 00:22:51,516 Speaker 3: How dare you manipulate my newsfeed? 381 00:22:51,236 --> 00:22:54,036 Speaker 1: And the public anger he stirred didn't just hurt him. 382 00:22:54,316 --> 00:22:57,156 Speaker 3: My wife was really negatively affected emotionally by it.
I 383 00:22:57,196 --> 00:23:00,156 Speaker 3: mean, she would, you know, turn on the radio, the TV, 384 00:23:00,436 --> 00:23:02,916 Speaker 3: or fire up her email or go on Twitter, and 385 00:23:02,916 --> 00:23:05,916 Speaker 3: would just be seeing all these, like, takedowns of her 386 00:23:05,956 --> 00:23:09,556 Speaker 3: husband being an unethical monster. And you forget that, like, 387 00:23:09,676 --> 00:23:11,996 Speaker 3: your work isn't just you, but it's also all the 388 00:23:12,036 --> 00:23:13,676 Speaker 3: people around you that have supported it. 389 00:23:14,076 --> 00:23:16,876 Speaker 1: Jeff's study raised questions that had never before been asked 390 00:23:16,916 --> 00:23:19,796 Speaker 1: about the role of social media in our emotional lives. 391 00:23:20,316 --> 00:23:22,676 Speaker 3: I think for a long time people kind of viewed 392 00:23:22,716 --> 00:23:24,876 Speaker 3: those as two different worlds. There's, like, stuff that happens 393 00:23:24,916 --> 00:23:27,636 Speaker 3: on social media or online and it's like whatever, and 394 00:23:27,676 --> 00:23:31,116 Speaker 3: then there's the real world. People really recognized all of 395 00:23:31,116 --> 00:23:33,916 Speaker 3: a sudden that, hey, these are both real worlds, 396 00:23:34,476 --> 00:23:37,876 Speaker 3: and what you read and the emotions you get from 397 00:23:37,956 --> 00:23:41,916 Speaker 3: learning about your social network, they don't drop off once 398 00:23:41,956 --> 00:23:44,116 Speaker 3: you turn the screen off. They actually stay with you 399 00:23:44,196 --> 00:23:46,996 Speaker 3: and they can influence your physical interactions. 400 00:23:48,636 --> 00:23:51,476 Speaker 1: One thing has changed for Jeff, though: his study has 401 00:23:51,556 --> 00:23:54,636 Speaker 1: changed the kinds of emotional things he posts online. 402 00:23:54,796 --> 00:23:59,476 Speaker 3: I certainly try to be a positive person.
Whenever I'm 403 00:23:59,956 --> 00:24:03,756 Speaker 3: using any sort of form of technology, I'm very aware 404 00:24:03,836 --> 00:24:06,676 Speaker 3: of the possible effects of writing negative things. 405 00:24:07,076 --> 00:24:10,836 Speaker 1: Jeff's onto something here that we'll explore in more detail after 406 00:24:10,876 --> 00:24:13,836 Speaker 1: the break. When we first hear about the Facebook study, 407 00:24:14,236 --> 00:24:17,876 Speaker 1: we can feel existentially threatened. From laugh tracks to biased 408 00:24:17,996 --> 00:24:21,316 Speaker 1: news feeds, it can seem like other people are constantly 409 00:24:21,316 --> 00:24:24,476 Speaker 1: affecting us, and as though our emotions are completely out 410 00:24:24,476 --> 00:24:28,236 Speaker 1: of our control, even subject to manipulation. But when you 411 00:24:28,276 --> 00:24:31,436 Speaker 1: start to understand how these techniques work, when you begin 412 00:24:31,636 --> 00:24:34,916 Speaker 1: realizing that these forces are being used against you, you 413 00:24:34,996 --> 00:24:37,636 Speaker 1: can react in a more positive way to all of 414 00:24:37,676 --> 00:24:41,756 Speaker 1: this emotional contagion. Because just as we are affected by 415 00:24:41,796 --> 00:24:45,476 Speaker 1: the emotions of others, so too do our emotions affect 416 00:24:45,556 --> 00:24:48,276 Speaker 1: other people. And that means that we can have a 417 00:24:48,316 --> 00:24:51,476 Speaker 1: lot more control over our own emotional climate than we 418 00:24:51,516 --> 00:24:54,956 Speaker 1: often think. We can become the laugh track we want 419 00:24:54,996 --> 00:24:59,236 Speaker 1: to hear in the world. The Happiness Lab will be 420 00:24:59,276 --> 00:25:00,236 Speaker 1: back in a moment.
421 00:25:05,716 --> 00:25:09,436 Speaker 4: When my youngest child was very, very young, probably about 422 00:25:09,476 --> 00:25:12,916 Speaker 4: four or so, and I was reprimanding her for something, 423 00:25:13,156 --> 00:25:16,716 Speaker 4: not particularly aggressively, but just reprimanding, she said to me, 424 00:25:17,516 --> 00:25:24,436 Speaker 4: you are making me have negative emotional contagion. And I'm 425 00:25:24,476 --> 00:25:26,516 Speaker 4: kind of like, well, that's sort of the point. 426 00:25:26,676 --> 00:25:31,716 Speaker 1: Actually, this is Sigal Barsade. She's a professor at the 427 00:25:31,716 --> 00:25:34,876 Speaker 1: Wharton School of Business. Sigal is an expert on emotional 428 00:25:34,876 --> 00:25:38,596 Speaker 1: contagion and specifically how we can use this phenomenon to 429 00:25:38,636 --> 00:25:41,676 Speaker 1: make ourselves and our organizations a little bit happier. 430 00:25:42,236 --> 00:25:45,676 Speaker 4: The reason I started to study emotional contagion was because 431 00:25:45,756 --> 00:25:48,156 Speaker 4: I was in a workplace and there was a woman 432 00:25:48,156 --> 00:25:50,396 Speaker 4: who worked there. We'll call her Megan, 433 00:25:50,596 --> 00:25:53,116 Speaker 4: and I didn't report to her. I didn't actually need 434 00:25:53,116 --> 00:25:54,556 Speaker 4: to even work with her a lot. But she was 435 00:25:54,556 --> 00:25:58,076 Speaker 4: a very negative person. And one day she left for vacation, 436 00:25:58,836 --> 00:26:01,636 Speaker 4: and I noticed that there was a palpable difference in 437 00:26:01,676 --> 00:26:04,396 Speaker 4: the workplace among everybody who was in this open office.
438 00:26:04,556 --> 00:26:06,916 Speaker 4: People's shoulders almost, like, seemed to lower, and they were 439 00:26:06,916 --> 00:26:10,716 Speaker 4: more relaxed and happy. Then when she came back from vacation, 440 00:26:10,836 --> 00:26:15,796 Speaker 4: everything went back to what it was, and I remember thinking, wow, 441 00:26:15,996 --> 00:26:19,036 Speaker 4: this person is having this tremendous effect on our mood, 442 00:26:19,276 --> 00:26:22,076 Speaker 4: even when we don't literally have to engage with her 443 00:26:22,116 --> 00:26:23,356 Speaker 4: around workplace issues. 444 00:26:23,716 --> 00:26:27,556 Speaker 1: Sigal has explored how moods transmit through an organization, how 445 00:26:27,556 --> 00:26:31,036 Speaker 1: emotional contagion can shape an entire group or even an 446 00:26:31,196 --> 00:26:32,396 Speaker 1: entire workplace culture. 447 00:26:32,756 --> 00:26:35,716 Speaker 4: You can imagine a work group where, you know, somebody 448 00:26:35,756 --> 00:26:38,436 Speaker 4: comes in in a really great mood, everybody else is, 449 00:26:38,476 --> 00:26:41,356 Speaker 4: let's say, kind of neutral. That person comes in, they're bubbly, 450 00:26:41,396 --> 00:26:46,916 Speaker 4: they're warm, and that infects everybody else. Or you can 451 00:26:46,956 --> 00:26:49,716 Speaker 4: also imagine the opposite, which is somebody comes in, they're 452 00:26:49,716 --> 00:26:52,836 Speaker 4: in a really bad mood, and everybody was kind of fine, 453 00:26:53,396 --> 00:26:57,516 Speaker 4: but now they're feeling it. Most often, it's coming as 454 00:26:57,556 --> 00:27:01,596 Speaker 4: a very automatic process as a result of behavioral mimicry, 455 00:27:02,036 --> 00:27:06,636 Speaker 4: mimicking the facial expressions and body language, and then through 456 00:27:06,676 --> 00:27:10,716 Speaker 4: a variety of physiological processes, we're actually feeling those emotions.
457 00:27:10,916 --> 00:27:15,076 Speaker 1: The process Sigal describes is pretty intuitive. It's probably obvious 458 00:27:15,116 --> 00:27:18,476 Speaker 1: that people can affect others' moods in the workplace. But as 459 00:27:18,316 --> 00:27:22,036 Speaker 4: an organizational theorist, I was also interested in, okay, if 460 00:27:22,076 --> 00:27:26,556 Speaker 4: this happens, does it matter? You know, does it influence anything behaviorally? 461 00:27:27,076 --> 00:27:30,716 Speaker 1: Sigal tested whether this contagion led to bad workplace outcomes. 462 00:27:31,356 --> 00:27:34,396 Speaker 1: She had workers do a bunch of financial decision tasks, 463 00:27:34,836 --> 00:27:39,116 Speaker 1: including making hard decisions about monetary allocations. Did workers make 464 00:27:39,196 --> 00:27:41,916 Speaker 1: worse decisions when in the presence of a single 465 00:27:41,996 --> 00:27:42,876 Speaker 1: bad-mooded confederate? 466 00:27:43,316 --> 00:27:48,876 Speaker 4: What we found is that in the positive emotional contagion conditions, 467 00:27:49,156 --> 00:27:54,356 Speaker 4: the groups were less conflictual, more cooperative, and they literally 468 00:27:54,516 --> 00:27:58,156 Speaker 4: allocated the money differently. With the pot of money, they were 469 00:27:58,156 --> 00:28:00,836 Speaker 4: more collectivistic in how they did it. This was very 470 00:28:00,956 --> 00:28:06,516 Speaker 4: powerful evidence for the idea that, A, yes, emotions are contagious, 471 00:28:06,956 --> 00:28:11,796 Speaker 4: and, B, that they actually influence work outcomes. 472 00:28:12,156 --> 00:28:14,876 Speaker 1: Sigal's study has now been replicated in a number of 473 00:28:14,916 --> 00:28:19,036 Speaker 1: experimental and field settings, from cricket teams to bank branches 474 00:28:19,076 --> 00:28:22,716 Speaker 1: to coffee shops. Much like the flu, a single bad 475 00:28:22,756 --> 00:28:25,876 Speaker 1: mood can transmit fast.
It can influence the performance of 476 00:28:25,916 --> 00:28:29,116 Speaker 1: an entire team. It can even affect the way frontline 477 00:28:29,156 --> 00:28:32,956 Speaker 1: workers interact with their customers. One of the most insidious 478 00:28:32,956 --> 00:28:35,796 Speaker 1: parts of workplace contagion is the fact that the process 479 00:28:35,876 --> 00:28:39,796 Speaker 1: is reciprocal. One negative-feeling Megan in the office doesn't 480 00:28:39,836 --> 00:28:42,716 Speaker 1: just make the folks around her grumpy. Those folks then 481 00:28:42,756 --> 00:28:47,316 Speaker 1: become their own influences, making everyone more annoyed and more stressed, 482 00:28:47,636 --> 00:28:51,076 Speaker 1: and on and on and on. This sort of emotional 483 00:28:51,116 --> 00:28:54,996 Speaker 1: feedback loop is what Sigal refers to as an affective spiral. 484 00:28:55,276 --> 00:28:59,316 Speaker 4: We literally do spiral, and we can have upward spirals 485 00:28:59,356 --> 00:29:01,236 Speaker 4: and we can have downward spirals. 486 00:29:01,876 --> 00:29:04,756 Speaker 1: The problem is that we often forget such spirals are 487 00:29:04,796 --> 00:29:06,156 Speaker 1: as common as they really are. 488 00:29:05,956 --> 00:29:09,396 Speaker 4: Time and time again in research studies, it's shown 489 00:29:09,436 --> 00:29:13,276 Speaker 4: that people don't actually know it's occurring. And in some ways, 490 00:29:13,276 --> 00:29:15,196 Speaker 4: what I find so exciting about being able to talk 491 00:29:15,236 --> 00:29:17,916 Speaker 4: about this to people is that by letting them know 492 00:29:18,116 --> 00:29:22,716 Speaker 4: that it is a phenomenon that is happening, then you 493 00:29:22,756 --> 00:29:25,676 Speaker 4: are much more aware of it if you want to 494 00:29:25,716 --> 00:29:28,556 Speaker 4: protect yourself against the emotional contagion.
495 00:29:28,636 --> 00:29:30,796 Speaker 1: Because we all have our Megans in the office, you know, 496 00:29:30,836 --> 00:29:31,916 Speaker 1: for better or for worse. 497 00:29:32,036 --> 00:29:35,556 Speaker 4: Yes, exactly. Well, and you know what, we're all Megans 498 00:29:35,596 --> 00:29:38,036 Speaker 4: in some ways. And what I mean by that is 499 00:29:38,076 --> 00:29:41,876 Speaker 4: that, you know, we have moods, right? We all vary, 500 00:29:42,036 --> 00:29:45,476 Speaker 4: and quite dramatically, throughout the day, throughout the week, and 501 00:29:45,556 --> 00:29:50,956 Speaker 4: so being really conscious of that can help us understand 502 00:29:51,036 --> 00:29:53,236 Speaker 4: how we're affecting other people. 503 00:29:53,916 --> 00:29:56,796 Speaker 1: Sigal's onto something really important here. If we don't 504 00:29:56,796 --> 00:30:00,076 Speaker 1: work on our own emotions, we can inadvertently start a 505 00:30:00,116 --> 00:30:04,956 Speaker 1: downward spiral ourselves. Our negativity will likely get caught by 506 00:30:04,996 --> 00:30:08,476 Speaker 1: others around us, which can then get transmitted back to us. 507 00:30:09,236 --> 00:30:12,396 Speaker 1: We can inadvertently be the first bad step in a causal 508 00:30:12,516 --> 00:30:17,636 Speaker 1: chain that leads us to experience more misery. Luckily, Sigal's 509 00:30:17,676 --> 00:30:19,996 Speaker 1: research has shown lots of ways that we can alter 510 00:30:20,116 --> 00:30:24,236 Speaker 1: our expressions to smooth out otherwise negative emotional situations, both 511 00:30:24,276 --> 00:30:25,756 Speaker 1: at work and beyond. 512 00:30:26,116 --> 00:30:29,196 Speaker 4: Let's say you're in a job interview with somebody and 513 00:30:29,236 --> 00:30:31,716 Speaker 4: you see that they're very nervous.
You know, one of 514 00:30:31,756 --> 00:30:33,876 Speaker 4: the worst things you can say to somebody is, you know, 515 00:30:34,036 --> 00:30:37,356 Speaker 4: calm down, right? If you sort of slow your pace 516 00:30:37,396 --> 00:30:40,836 Speaker 4: a little bit, you look encouragingly, you change your tone, 517 00:30:40,996 --> 00:30:42,996 Speaker 4: as you can probably hear in what I'm doing right now, 518 00:30:43,556 --> 00:30:47,156 Speaker 4: that will naturally calm that person down so that you 519 00:30:47,196 --> 00:30:50,476 Speaker 4: can get kind of a clearer interview. As a leader, 520 00:30:52,036 --> 00:30:56,316 Speaker 4: you know, if you have people panicking around you, for example, 521 00:30:56,716 --> 00:30:59,436 Speaker 4: model for them the emotions that are going to be 522 00:30:59,476 --> 00:31:04,236 Speaker 4: the most productive in that situation. If you're sophisticated about 523 00:31:04,316 --> 00:31:09,836 Speaker 4: understanding that, you have this other way of getting your 524 00:31:09,916 --> 00:31:13,036 Speaker 4: team on board and where they need to be. 525 00:31:13,516 --> 00:31:16,036 Speaker 1: Sigal has focused a lot of her research on the 526 00:31:16,116 --> 00:31:18,676 Speaker 1: one member of a team that has the most powerful 527 00:31:18,756 --> 00:31:21,316 Speaker 1: role in a team's emotions: the boss. 528 00:31:21,916 --> 00:31:24,516 Speaker 4: When I hear about a group or an organization that 529 00:31:24,556 --> 00:31:27,316 Speaker 4: has really low morale, one of the first questions I'm 530 00:31:27,356 --> 00:31:29,956 Speaker 4: asking is, how's the leader coming in in the morning?
531 00:31:30,316 --> 00:31:33,836 Speaker 4: You know, is the leader coming in excited, enthusiastic, energetic, 532 00:31:33,956 --> 00:31:36,756 Speaker 4: talking with people, or is the leader coming in looking 533 00:31:36,796 --> 00:31:38,356 Speaker 4: like they have the weight of the world on their 534 00:31:38,356 --> 00:31:42,556 Speaker 4: shoulders and really stressed? People are always paying attention to leaders, 535 00:31:42,636 --> 00:31:45,876 Speaker 4: and so they literally catch the leader's mood. 536 00:31:46,556 --> 00:31:49,396 Speaker 1: Sigal is currently working on a number of training programs 537 00:31:49,516 --> 00:31:52,516 Speaker 1: to get leaders to regulate their feelings, both internally and 538 00:31:52,596 --> 00:31:55,836 Speaker 1: externally, in ways that best help the group. But the 539 00:31:55,876 --> 00:31:58,996 Speaker 1: first step is simply realizing that one's emotions can affect 540 00:31:59,036 --> 00:32:01,156 Speaker 1: a team's mood and its performance. 541 00:32:01,556 --> 00:32:05,036 Speaker 4: Knowledge is empowering. Things like you're doing now with this 542 00:32:05,116 --> 00:32:08,476 Speaker 4: episode allow people to know that this exists, and it's 543 00:32:08,516 --> 00:32:11,956 Speaker 4: empowering because there are things that you can do to 544 00:32:12,076 --> 00:32:16,716 Speaker 4: try to not catch somebody else's emotions. The other empowering 545 00:32:16,756 --> 00:32:20,556 Speaker 4: piece of this, though, is knowing that we can change 546 00:32:20,556 --> 00:32:21,516 Speaker 4: each other's moods. 547 00:32:22,076 --> 00:32:24,916 Speaker 1: The good news is that Sigal's work suggests that we 548 00:32:24,996 --> 00:32:27,316 Speaker 1: can be the emotional change we want to see in 549 00:32:27,356 --> 00:32:30,196 Speaker 1: the world.
I asked her if this work has made 550 00:32:30,196 --> 00:32:33,156 Speaker 1: her hopeful, if she thinks we really can tackle all 551 00:32:33,196 --> 00:32:35,036 Speaker 1: the negative office Megans in our lives. 552 00:32:35,596 --> 00:32:39,356 Speaker 4: Absolutely. The affective spiral can begin with us. Absolutely. 553 00:32:41,516 --> 00:32:44,236 Speaker 1: So what have we learned in this episode? First, we 554 00:32:44,316 --> 00:32:47,356 Speaker 1: are affected by other people's emotions way, way more than 555 00:32:47,396 --> 00:32:53,836 Speaker 1: we realize. We unconsciously and automatically catch the grief, despair, excitement, rage, 556 00:32:54,276 --> 00:32:58,156 Speaker 1: joy, sadness, and serenity of all the individuals we encounter, 557 00:32:58,516 --> 00:33:00,596 Speaker 1: whether that person is a close friend at a party, 558 00:33:00,756 --> 00:33:02,756 Speaker 1: the guy behind us in line at a coffee shop, 559 00:33:02,876 --> 00:33:05,836 Speaker 1: some ranting idiot on Reddit, or even the voice of 560 00:33:05,876 --> 00:33:11,956 Speaker 1: some long-dead fifties man laughing hysterically on a bad eighties sitcom. 561 00:33:12,076 --> 00:33:16,476 Speaker 1: By instinctually copying others' behaviors, we irresistibly take on what 562 00:33:16,516 --> 00:33:20,876 Speaker 1: they feel, whether we like that feeling or not. But 563 00:33:20,956 --> 00:33:23,716 Speaker 1: we've also learned that we have more agency than we think. 564 00:33:24,236 --> 00:33:26,156 Speaker 1: We can be more aware of how the people we 565 00:33:26,196 --> 00:33:29,476 Speaker 1: interact with online are affecting us, and mindful of what 566 00:33:29,516 --> 00:33:33,196 Speaker 1: we ourselves post. We can have control over whether we're 567 00:33:33,236 --> 00:33:35,836 Speaker 1: the office Megan, or that burst of calm in a 568 00:33:35,876 --> 00:33:39,356 Speaker 1: tough moment, or the laugh that everyone really, really needs.
569 00:33:40,636 --> 00:33:43,396 Speaker 1: And that means that listening to this podcast and following 570 00:33:43,396 --> 00:33:47,636 Speaker 1: its advice won't just help your happiness. Making changes in 571 00:33:47,636 --> 00:33:50,916 Speaker 1: your own life can be the positive seed that transforms 572 00:33:50,956 --> 00:33:54,276 Speaker 1: the well-being of those around you. So if you'd 573 00:33:54,316 --> 00:33:56,316 Speaker 1: like to be a force for good, to help make 574 00:33:56,396 --> 00:33:59,156 Speaker 1: joy just a little bit more viral, I hope you'll 575 00:33:59,196 --> 00:34:02,116 Speaker 1: return for the next episode of The Happiness Lab with 576 00:34:02,236 --> 00:34:19,476 Speaker 1: me, Doctor Laurie Santos. The Happiness Lab is co-written 577 00:34:19,476 --> 00:34:22,076 Speaker 1: and produced by Ryan Dilley. The show is mixed and 578 00:34:22,156 --> 00:34:25,716 Speaker 1: mastered by Evan Viola and edited by Julia Barton, with 579 00:34:25,796 --> 00:34:29,636 Speaker 1: fact-checking by Joseph Fridman, and our original music was composed 580 00:34:29,676 --> 00:34:35,156 Speaker 1: by Zachary Silver. Special thanks to Mia Lobel, Carly Migliori, 581 00:34:35,516 --> 00:34:40,956 Speaker 1: Heather Fain, Maggie Taylor, Maya Koenig, and Jacob Weisberg. The 582 00:34:40,956 --> 00:34:43,716 Speaker 1: Happiness Lab is brought to you by Pushkin Industries and 583 00:34:43,836 --> 00:34:51,516 Speaker 1: me, Doctor Laurie Santos.