Pushkin.

The sounds of laughter. There are people who study it, people like Ben Glenn. This extended sequence of laughs from the Jack Benny Program is one of the all-time greats. Jack Benny was the biggest comedian of his day. His show ran on radio and TV for more than thirty years. It's said that the Jack Benny Show was the only program President John F. Kennedy would watch religiously. His jokes have probably generated more laughs than anyone in the history of comedy. In this scene, comedian Johnny Carson has just remarked that he has no idea how Jack Benny stays so young. The gag that's slaying the audience is the reveal that Jack stays young because he's a robot, and his staff are dismantling him bit by bit, putting him in storage for the night. But the laughter you're hearing from this gag isn't pure. It's sweetened. Sweetening is augmenting the authentic reactions of a sitting audience with prerecorded reactions. Augmenting authentic reactions: that's the media industry trick you might know as canned laughter, or a laugh track.
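For the digitally minded, the idea of sweetening can be sketched in a few lines of code. This is only a loose modern analogy, not how Douglass's tape machinery actually worked; the function name and parameters here are invented for illustration.

```python
import numpy as np

def sweeten(audience: np.ndarray, laugh: np.ndarray, start: int,
            gain: float = 0.6) -> np.ndarray:
    """Overlay a prerecorded laugh onto live audience audio.

    The live track is kept intact; a canned reaction is mixed in on top
    at a given sample offset, scaled down so it blends with the room.
    """
    out = audience.astype(float).copy()
    end = min(start + len(laugh), len(out))
    out[start:end] += gain * laugh[: end - start]
    # Keep the mix within a normalized [-1, 1] range.
    return np.clip(out, -1.0, 1.0)
```

The point of the sketch is simply that the original audience reaction survives underneath: sweetening adds to it rather than replacing it.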
The man I'm talking to, Ben Glenn, is a television historian and an expert on the history of this technique. He's listened to every faked shriek and spliced-in guffaw in US TV history. "I'm fifty-four years old. I am the television generation. And I thought to myself, why does nobody ever talk about that? Because the laugh track was a critical, integral part of these shows and of their success." Canned laughter is pretty common today, but I was surprised to hear the secret history of how this technique developed. "The laugh track began in earnest in nineteen fifty, when an engineer named Charles Douglass, a sound engineer at CBS, developed not only the idea that you could insert prerecorded laughter, but he also built and designed himself, in his garage, the apparatus to use tape loops of carefully selected reactions that could be inserted into television shows." And so where did he get those in the original? Where did those laughs come from? "That is the question that everybody asks. So I took it upon myself to start digging. The facts are very hard to uncover, because it was meant to be a well-kept Hollywood secret."
Ben's aural detective work led him to listen over and over, both to classic laugh tracks and early television comedies. In the end, he was able to figure out that Douglass made his laugh tracks by recording the real studio audiences who watched early comedians like Red Skelton, Lucille Ball, and Abbott and Costello. "There's also a report that some of the earliest ones in the nineteen fifties came from a Marcel Marceau performance in Los Angeles, and of course that would make sense, since he was a mime and there was no interfering music or any sort of distracting noise." Ben told me something that totally blew my mind: those early laugh tracks are still in use today. The exact same guffaws and shrieks that people heard in the fifties are still sweetening modern TV shows. "We are hearing reactions that were recorded decades ago. They're dead people, yes, but they live on television." Comedies are still using these laughs because they work.
As I listened to Ben's clips of early comedies, I realized that the canned laughter does seem to make jokes funnier, even if the gags are pretty dumb, which, honestly, many of the gags were. But why was my brain reacting like this? How can the prerecorded laughter of dead people I've never met affect my experience of a mostly-not-funny television show today? I mean, are we literally catching other people's emotions, like the common cold? And if we are, how can we inoculate ourselves against the sorts of feelings we might not want to catch from other people? Our minds are constantly telling us what to do to be happy. But what if our minds are wrong? What if our minds are lying to us, leading us away from what will really make us happy? The good news is that understanding the science of the mind can point us all back in the right direction. You're listening to The Happiness Lab with Dr. Laurie Santos. Modern-day laugh tracks are only a few decades old, but comedy designers have been using similar techniques for centuries.
"If you think about it, in the sixteenth century, Shakespearean plays were performed in front of an extremely raucous audience, where the audience cheered the protagonist and booed the villain. There are actually accounts of Shakespeare planting people in the audience to react in a certain way, to spur on the audience members around them." Shakespeare wasn't alone. There's a long, long history of seeding live audiences like this. The opera houses of France employed whole groups of claqueurs, hired guns who sat in the auditorium, primed to lead the paying audience in whatever reaction the script required. Some members of the claque even had specializations: rieurs were expert laughers, while pleureurs could summon tears at will. Early television producers also realized the power of these techniques. They would have loved to hire professional claqueurs to enhance their studio audiences' reactions, but they couldn't afford something that extravagant. Charles Douglass's new advance in laughter technology offered a tantalizing, albeit totally fabricated, alternative. "The entire process was artificial from beginning to end."
In a few years, Douglass was able to raise his laugh track from simple canned laughter to an art form. He wasn't just lifting laughs. It was as though he was a conductor, and the laughing audience members were his orchestra. "He would sit up late at night in his living room, and he would be listening to tape loops and splicing and extracting reactions. He engineered them to bring a certain individual laugh to the surface and to suppress ambient laughter. They often were sped up to make it seem even more kind of jolly, or more intensely funny. The machine he built held up to three hundred twenty reactions." Douglass's new machine allowed him to become the master of television emotional manipulation, but most shows couldn't afford his exquisite new craft. "Charles Douglass charged about one hundred dollars per day. If that was over budget for some production companies, they found some, I have to say, sort of low-budget pre-existing laugh tracks. For example, The Adventures of Ozzie and Harriet, which was an enormously popular show in its day, used only one reaction, regardless of situation."
So it's like the same laugh? "It's the same laugh. The exact same laugh." But shows soon recognized they needed to be a little bit more professional with their sweetening to make the jokes stick, which put Douglass's laugh track in even more demand. And that's when Douglass realized he could start sweetening more than just the laughs. "What we see is that the laugh track expands beyond just laughter, and these are some of my favorite reactions. There is shock. There are, you know, women kind of squealing. Ooh! Oh!" So the laugh track expanded beyond just laughter to include the full range of how an audience might react. But why was Douglass's technique so powerful? Why does hearing the squeals of people we've never met change our own reaction to a television show? Well, if you've listened to some of the previous episodes, you know that experiencing an event, say eating a piece of chocolate, at the same time as another person can make those events more intense. But psychology shows that experiencing an event with another person can also change our experience of that event in other ways as well.
"Our thinking, our reactions, and really the psychology of what we're watching can be changed by the reactions around us." Psychologists have long documented the fact that we tend to copy other people's behavior unconsciously, most of the time without even realizing it. In one study by Tanya Chartrand and colleagues, subjects were brought into the lab and told they needed to work with another person to describe a set of magazine photos. But the photos were not the point of the experiment. Unbeknownst to the subjects, the people chosen as their partners were actually experimental claqueurs, or confederates, as scientists call them. They were people hired by the experimenters to behave in a very specific way: they rubbed their faces a bunch, or sometimes touched their feet over and over. Chartrand and colleagues wanted to know how these behaviors affected the subjects. It turns out being around someone touching their face caused the research subjects to touch their own faces more often, whereas being around someone who touched their feet caused participants to touch their feet more often.
Even though both of these behaviors are really weird, this result, and lots of others, shows that we tend to unconsciously mimic the behavior of others, a phenomenon that researchers christened the chameleon effect. We're literally catching other people's behavior. But we don't just catch other people's behavior. Researchers have long realized that there's a tight link between behaving in a certain way and feeling a certain way. The act of behaving in a certain way, say smiling or frowning, can influence how we feel. Research has shown that adopting a happy facial expression can unconsciously improve our mood. There are also studies showing that when you can't make facial expressions, it becomes harder to experience emotions. Chartrand showed this in an unusual experiment. She tested whether women who receive Botox injections, which paralyze facial muscles, have trouble recognizing emotions in others. She found that not being able to make a particular facial expression makes it harder to recognize that emotion in others. What does all this have to do with the laugh track?
Well, if audience members naturally copy the reactions of those around them, then they'll not only behave differently when they hear a laugh track, chuckling a bit more, but they'll also feel differently, because of emotional contagion. We're literally catching other people's emotions. "Hearing other people laugh at a joke makes us think that the joke is actually funnier. It's reassuring. Like, oh, yes, I'm not the only one who finds that funny. That validates me. That is funny." Humans are not just behavioral chameleons, but emotional chameleons as well. We're as susceptible to the emotions around us as we are to a highly contagious disease. But we don't often like to think that our emotions can be so fickle, which meant that, despite its pervasiveness, not everyone was a fan of Douglass's laugh track technology. "I think the term canned laughter was pejorative, and was used by critics, and probably producers and writers, who looked down upon the use of prerecorded reactions." But laugh tracks persist because they really do work. They make a show seem funnier, whether or not we like to admit it.
That becomes clear when you listen to shows that forgo a little sweetening. "Some networks tried to embark upon what they called a prestige sitcom, which had no laugh track at all. One was called Frank's Place, on CBS. There was one called The Days and Nights of Molly Dodd, and they wore this as a badge of honor, like, we need no laugh track. They went nowhere. They fizzled. And who remembers them?" Right. When I teach my class at Yale, I show students a modern-day TV show without its claqueurs, just to make the very same point. "Ah, nothing makes beer taste better than cool, clear Rocky Mountain spring water." So, Big Bang Theory on YouTube: somebody's gone in and clipped out the laugh track, and it sounds super weird. "Where are the Rocky Mountains, anyway?" "Philadelphia?" It's not like they're not jokes, but they just fill it with this to sort of propel you into thinking everything is funny. How we hear other people reacting is contagious. Laugh tracks are one of the most common cases of emotional contagion in action. Whether or not we want to admit it.
We do enjoy TV shows more when there's canned laughter. But emotional contagion can also be used in more insidious ways, because not everyone out there wants to make us laugh. Some companies today make money not by making us feel good, but by making us feel a whole lot worse. The Happiness Lab will be right back.

"I opened up my email and there were hundreds of emails that were all mostly like, how dare you do this? And, you're so unethical. Or other friends writing me saying, hey, you better get a lawyer, and I hope you're okay." This is my friend Jeff Hancock. Back in twenty fourteen, Jeff opened up a firestorm of hate in his inbox. "And over the course of the next five or six days, you know, I could literally tell where the sun was up around the world by where I was receiving hate mail from." I distinctly remember watching all this happen, just the crazy level of outrage that Jeff had elicited. I was worried about him. Many of our colleagues were too. What had Jeff done to generate so much fury from complete strangers?
He ran an experiment to manipulate people's emotions online, and they didn't like it. "I often remember the first time I read about emotional contagion, and the metaphor that the author used was, you could cut the emotions in the room with a knife. And I know that feeling. You walk into a room and you can be like, oh, it's tense in here." Most psychologists study emotional contagion in the context of real-world social interactions like these, but Jeff wanted to know whether we could catch people's emotions in more subtle ways as well, like through a relatively new form of human communication: writing. "So until the nineteen forties, over half of the planet's population was illiterate, so they never left any record of their communication. And now, you know, if you would ask one of your podcast listeners if they'd written something today, almost everybody would say yes." We forget that text is a completely new form of communication for our species. And for a long time, most scientists simply assumed that emotions couldn't transmit as easily through the written word as they did face to face.
"I really hated this sort of idea that you can't communicate emotions in text, like in emails and things like that." Jeff's research showed that participants pick up other people's emotions through text, in say a quick email note or an online comment, just as easily as they do in face-to-face, real-world interactions. "On the one hand, it's, like, surprising. Like, wow, how can you tell emotions in text? And then you think about your favorite author, and of course you have these super powerful feelings from text. I remember reading Game of Thrones, and there's this Red Wedding event, where people that you aren't expecting to be killed get killed, and I was so furious I had to stop reading and go for a walk. And it was just one of these examples of, like, yeah, why do we think that you can't communicate emotion through text or online?" But there are a number of reasons why the text we read online is a different communication medium than the quick emotions we experience around other people. The first difference is longevity.
When I laugh at a funny joke, you only hear that signal for a few seconds. But text, especially the text we post online, is really different. "It's true that up until very recently in human history, everything we said and did disappeared. And now we leave records." The records people leave can continue affecting our emotions for years and years to come. My favorite writer, Kurt Vonnegut, has been dead for years, but his quotes still make me laugh, and sometimes cringe. But there's a second way that text, especially text on the Internet, is different: we don't have as much control over who is affecting our emotions. We evolved to be around a small number of people, which means that most face-to-face emotional contagion only occurs between people who know each other well. But on the Internet, we're emotionally affected both by people who are close to us and by people we'll never even meet. "In real life, I should have a contagious sort of feeling. If my mom posts something that's upsetting for her, I should feel some empathy and negative response."
"But if I'm looking through the comments of a hockey game that took place last night, and I'm reading the comments, somebody named Poster Number Eighty-Three is upset or angry, that should have zero bearing on my emotional situation. And yet, because of some of the automatic aspects of the way we respond to others' emotions, it can trigger things in me." When we're on the Internet, we don't only succumb to the emotions of people like our family members, people we know and care about. "Then there's these sort of, like, unknown people that I never expect to meet again. I don't have any idea who they actually are. And we can call that the unknown network. And this could be some Russian agent, could be some fraudster in Nigeria. And typically, in the face-to-face world, it's really easy to keep those two kinds of contacts separate, so those two networks aren't overlapping. But on the Internet, in something like a Facebook news feed, those two worlds are completely overlaid, and they sit right on top of each other."
"It's just as easy for a Russian agent to put something in my feed or my advertising space as it is for Laurie." And this means that our emotions aren't just affected by the people that matter. We're also, in theory, catching the emotions of third cousins we haven't seen in years, random strangers, advertisers, bots, really anything that winds up in our news feed. "Everybody's news feed is super huge, it turns out, because of network effects. If you have, say, three hundred friends on Facebook, you'll have thousands of possible pieces of content to look at in your news feed. And what Facebook very quickly found out was that much of that, people found uninteresting and unengaging. And so they developed algorithms to try and predict what the, say, twenty to thirty most interesting pieces for a person would be. Basically, what happens is a piece of content would come in, and the algorithm would rank it and say, well, Jeff would be really interested in this, we're going to put it at the top of his feed, or he wouldn't be at all, we're going to put it down. You know?"
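Jeff's description of the curation step boils down to score-and-truncate. Facebook's actual ranking model and its signals aren't public, so this is only a minimal sketch of that shape; `predict_interest` stands in for whatever the real system computes, and all names here are hypothetical.

```python
def rank_feed(posts, predict_interest, k=25):
    """Return the k candidate posts with the highest predicted interest.

    Every candidate gets a score from a prediction function; only the
    top handful out of thousands of candidates ever reaches the feed.
    """
    return sorted(posts, key=predict_interest, reverse=True)[:k]

# Toy usage: score posts by a dummy field standing in for a model output.
candidates = [{"id": i, "score": i % 7} for i in range(100)]
feed = rank_feed(candidates, lambda p: p["score"], k=5)
```

The design point the sketch captures is that the ranking function, not the user, decides which of the thousands of candidates are ever seen, which is exactly the lever the experiment described next would pull on.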
"And so everybody's news feed is algorithmically curated. Now, in twenty nineteen, most people know this. I think in twenty fourteen, this was not well known, and I think that was a big part of the outrage." Ah, yes, the outrage. So why did everyone get so mad at Jeff? Well, Facebook was interested to see if its users were experiencing emotional contagion from the posts they read. In particular, they were worried about some studies that were coming out suggesting that people might feel bad when they look at their news feed. "It would be very bad, for a lot of obvious reasons, if one of their main key products was causing their users to be depressed, or causing negative emotion." The social media giant designed an experiment to figure out the emotional impact of Facebook posts. Unbeknownst to the subjects, it decided to tweak the news feed algorithm for two different test groups of users. The first group, the negative content group, was picked to see less bad content in their news feed. "It never went away. It was always in your feed. Just, you know, you'd be less likely to see it."
331 00:21:28,396 --> 00:21:30,836 Speaker 1: But there was another group or condition who saw their 332 00:21:30,916 --> 00:21:34,516 Speaker 1: news feed altered in the other emotional direction. So if 333 00:21:34,556 --> 00:21:37,356 Speaker 1: you were in the positive emotion condition, you know, an even 334 00:21:37,436 --> 00:21:41,316 Speaker 1: more controversial condition, then when posts had positive emotion in them, 335 00:21:41,356 --> 00:21:43,836 Speaker 1: they'd be moved down in your feed and you'd be less 336 00:21:43,876 --> 00:21:46,396 Speaker 1: likely to see them. How did this change to a 337 00:21:46,476 --> 00:21:49,996 Speaker 1: person's feed affect the emotions that they themselves expressed in 338 00:21:50,036 --> 00:21:54,116 Speaker 1: their own posts? Were users automatically catching the emotions they 339 00:21:54,196 --> 00:21:57,236 Speaker 1: saw in their feeds? That's what we found. And so 340 00:21:57,356 --> 00:22:01,756 Speaker 1: if you were in the fewer positive posts condition, then 341 00:22:01,916 --> 00:22:07,596 Speaker 1: you would write about four positive words fewer over the 342 00:22:07,716 --> 00:22:12,156 Speaker 1: next thousand words that you post on Facebook. Jeff and 343 00:22:12,236 --> 00:22:15,236 Speaker 1: his colleagues had shown that people do catch the emotions 344 00:22:15,276 --> 00:22:18,036 Speaker 1: they see in their feed. It was an important result. 345 00:22:18,796 --> 00:22:22,116 Speaker 1: People online, many of whom you've never even met, are able 346 00:22:22,196 --> 00:22:25,036 Speaker 1: to make you feel happier or sadder in your real 347 00:22:25,276 --> 00:22:27,836 Speaker 1: offline life. It got a little bit of media attention. 348 00:22:28,116 --> 00:22:31,676 Speaker 1: I remember Jimmy Fallon actually cracked a joke about it 349 00:22:31,676 --> 00:22:34,196 Speaker 1: in one of his monologues.
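[Editor's note: the outcome measure Jeff describes, positive words per thousand words posted, is easy to sketch. The tiny word list below is a toy stand-in; the study itself used the much larger LIWC word-count dictionaries.]

```python
# Toy sketch of counting positive words per thousand words of text.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "good"}

def positive_words_per_thousand(text):
    """Rate of positive words per 1,000 words, after basic punctuation stripping."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in POSITIVE_WORDS)
    return 1000 * hits / len(words)

# 2 positive words out of 7 → 2000/7 ≈ 285.7 per thousand.
print(round(positive_words_per_thousand("What a great day, I love it"), 1))  # → 285.7
```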
I remember being like, okay, 350 00:22:34,316 --> 00:22:36,836 Speaker 1: right on, Jimmy Fallon cracks a joke, that's a check mark, great, 351 00:22:37,276 --> 00:22:39,436 Speaker 1: and after that I thought like, okay, I guess that'll be 352 00:22:39,556 --> 00:22:42,036 Speaker 1: sort of it for the study. Jeff couldn't have been 353 00:22:42,196 --> 00:22:45,236 Speaker 1: more wrong. The hate mail started arriving, and not just 354 00:22:45,436 --> 00:22:48,676 Speaker 1: for weeks after his study, but for years. How dare 355 00:22:48,756 --> 00:22:52,276 Speaker 1: you manipulate my news feed? And the public anger he 356 00:22:52,356 --> 00:22:56,076 Speaker 1: stirred didn't just hurt him. My wife was really negatively 357 00:22:56,116 --> 00:22:58,396 Speaker 1: affected emotionally by it. I mean, she would turn on 358 00:22:58,476 --> 00:23:02,356 Speaker 1: the radio or the TV, or fire up her email or 359 00:23:02,396 --> 00:23:04,796 Speaker 1: go on Twitter, and would just be seeing all these 360 00:23:04,916 --> 00:23:07,676 Speaker 1: like takedowns of her husband being an unethical monster, and 361 00:23:08,036 --> 00:23:11,076 Speaker 1: you forget that, like, your work isn't just you, but 362 00:23:11,156 --> 00:23:13,596 Speaker 1: it's also all the people around you that have supported it. 363 00:23:14,036 --> 00:23:16,796 Speaker 1: Jeff's study raised questions that had never before been asked 364 00:23:16,876 --> 00:23:19,716 Speaker 1: about the role of social media in our emotional lives. 365 00:23:20,316 --> 00:23:22,596 Speaker 1: I think for a long time people kind of viewed 366 00:23:22,676 --> 00:23:24,836 Speaker 1: those as two different worlds. There's like stuff that happens 367 00:23:24,876 --> 00:23:27,556 Speaker 1: on social media or online and it's like whatever, and 368 00:23:27,676 --> 00:23:31,076 Speaker 1: then there's the real world.
People really recognized all of 369 00:23:31,116 --> 00:23:33,916 Speaker 1: a sudden that like, hey, these are both real worlds, 370 00:23:34,436 --> 00:23:37,836 Speaker 1: and what you read and the emotions you get from 371 00:23:37,956 --> 00:23:41,876 Speaker 1: learning about your social network, they don't drop off once 372 00:23:41,916 --> 00:23:44,076 Speaker 1: you turn the screen off. They actually stay with you 373 00:23:44,156 --> 00:23:49,236 Speaker 1: and they can influence your physical interactions. One thing has 374 00:23:49,356 --> 00:23:52,236 Speaker 1: changed for Jeff, though: his study has changed the kinds 375 00:23:52,276 --> 00:23:55,796 Speaker 1: of emotional things he posts online. I certainly try to 376 00:23:55,956 --> 00:24:00,756 Speaker 1: be a positive person. Whenever I'm using any sort of 377 00:24:00,836 --> 00:24:05,196 Speaker 1: form of technology, I'm very aware of the possible effects 378 00:24:05,276 --> 00:24:09,396 Speaker 1: of writing negative things. Jeff's onto something here. Something we'll 379 00:24:09,436 --> 00:24:12,396 Speaker 1: explore in more detail after the break. When we first 380 00:24:12,476 --> 00:24:15,996 Speaker 1: hear about the Facebook study, we can feel existentially threatened. 381 00:24:16,436 --> 00:24:19,596 Speaker 1: From laugh tracks to biased news feeds, it can seem 382 00:24:19,636 --> 00:24:22,876 Speaker 1: like other people are constantly affecting us, and as though 383 00:24:22,916 --> 00:24:26,476 Speaker 1: our emotions are completely out of our control, even subject 384 00:24:26,516 --> 00:24:29,556 Speaker 1: to manipulation.
But when you start to understand how these 385 00:24:29,596 --> 00:24:33,196 Speaker 1: techniques work, when you begin realizing that these forces are 386 00:24:33,236 --> 00:24:35,916 Speaker 1: being used against you, you can react in a more 387 00:24:35,996 --> 00:24:40,476 Speaker 1: positive way to all of this emotional contagion, because just 388 00:24:40,716 --> 00:24:43,396 Speaker 1: as we are affected by the emotions of others, so 389 00:24:43,596 --> 00:24:47,396 Speaker 1: too do our emotions affect other people. And that means 390 00:24:47,516 --> 00:24:49,756 Speaker 1: that we can have a lot more control over our 391 00:24:49,796 --> 00:24:53,476 Speaker 1: own emotional climate than we often think. We can become 392 00:24:53,556 --> 00:24:55,756 Speaker 1: the laugh track we want to hear in the world. 393 00:24:57,996 --> 00:25:05,876 Speaker 1: The Happiness Lab will be back in a moment. When 394 00:25:05,996 --> 00:25:09,636 Speaker 1: my youngest child was very, very young, probably about four 395 00:25:09,756 --> 00:25:13,196 Speaker 1: or so, and I was reprimanding her for something, not 396 00:25:13,436 --> 00:25:17,556 Speaker 1: particularly aggressively, but just reprimanding. She said to me, you 397 00:25:17,676 --> 00:25:24,636 Speaker 1: are making me have negative emotional contagion, and I'm kind 398 00:25:24,676 --> 00:25:29,156 Speaker 1: of like, well, that's sort of the point, actually. This 399 00:25:29,316 --> 00:25:32,356 Speaker 1: is Seagal Barsade. She's a professor at the Wharton School 400 00:25:32,356 --> 00:25:35,756 Speaker 1: of Business. Seagal is an expert on emotional contagion and 401 00:25:35,916 --> 00:25:39,316 Speaker 1: specifically how we can use this phenomenon to make ourselves 402 00:25:39,396 --> 00:25:42,676 Speaker 1: and our organizations a little bit happier.
The reason I 403 00:25:42,756 --> 00:25:46,236 Speaker 1: started to study emotional contagion was because I was in 404 00:25:46,276 --> 00:25:48,796 Speaker 1: a workplace and there was a woman in the workplace 405 00:25:48,836 --> 00:25:51,076 Speaker 1: who worked there. We'll call her Megan, and I didn't 406 00:25:51,196 --> 00:25:53,676 Speaker 1: report to her. I didn't actually need to even work 407 00:25:53,716 --> 00:25:55,636 Speaker 1: with her a lot. But she was a very negative person. 408 00:25:56,276 --> 00:25:59,636 Speaker 1: And one day she left for vacation, and I noticed 409 00:25:59,676 --> 00:26:02,636 Speaker 1: that there was a palpable difference in the workplace among 410 00:26:02,716 --> 00:26:05,796 Speaker 1: everybody who was in this open office. People's shoulders almost 411 00:26:05,796 --> 00:26:08,396 Speaker 1: like seemed to lower, and they were more relaxed and happy. 412 00:26:08,836 --> 00:26:11,556 Speaker 1: Then when she came back from vacation, everything went back 413 00:26:11,596 --> 00:26:15,996 Speaker 1: to what it was, and I remember thinking, Wow, this 414 00:26:16,196 --> 00:26:19,476 Speaker 1: person is having this tremendous effect on our mood even 415 00:26:19,556 --> 00:26:22,396 Speaker 1: when we don't literally have to engage with her around 416 00:26:22,436 --> 00:26:26,756 Speaker 1: workplace issues. Seagal has explored how moods transmit through an organization, 417 00:26:27,356 --> 00:26:30,796 Speaker 1: how emotional contagion can shape an entire group or even 418 00:26:30,876 --> 00:26:34,156 Speaker 1: an entire workplace culture. You can imagine a work group 419 00:26:34,516 --> 00:26:37,516 Speaker 1: where you know, somebody comes in in a really great mood, 420 00:26:37,716 --> 00:26:40,316 Speaker 1: everybody else is, let's say, kind of neutral. That person 421 00:26:40,396 --> 00:26:45,716 Speaker 1: comes in, they're bubbly, they're warm, and that infects everybody else.
422 00:26:46,596 --> 00:26:49,156 Speaker 1: Or you can also imagine the opposite, which is somebody 423 00:26:49,196 --> 00:26:51,956 Speaker 1: comes in, they're in a really bad mood, and everybody 424 00:26:52,036 --> 00:26:56,076 Speaker 1: was kind of fine, but now they're feeling it. Most often, 425 00:26:56,636 --> 00:26:59,876 Speaker 1: it's coming as a very automatic process as a result 426 00:26:59,956 --> 00:27:05,356 Speaker 1: of behavioral mimicry, mimicking the facial expressions and body language, 427 00:27:06,076 --> 00:27:09,076 Speaker 1: and then through a variety of physiological processes we're actually 428 00:27:09,596 --> 00:27:13,516 Speaker 1: feeling those emotions. The process Seagal describes is pretty intuitive. 429 00:27:13,996 --> 00:27:16,716 Speaker 1: It's probably obvious that people can affect others' moods in 430 00:27:16,756 --> 00:27:20,276 Speaker 1: the workplace, but as an organizational theorist, I was also 431 00:27:20,396 --> 00:27:24,236 Speaker 1: interested in, Okay, if this happens, does it matter, you know, 432 00:27:24,316 --> 00:27:28,836 Speaker 1: does it influence anything behaviorally? Seagal tested whether the contagion 433 00:27:29,076 --> 00:27:32,396 Speaker 1: led to bad workplace outcomes. She had her workers do 434 00:27:32,476 --> 00:27:36,236 Speaker 1: a bunch of financial decision tasks, including making hard decisions 435 00:27:36,276 --> 00:27:40,516 Speaker 1: about monetary allocations. Did workers make worse decisions when in 436 00:27:40,556 --> 00:27:43,516 Speaker 1: the presence of a single bad-mooded confederate? What we 437 00:27:43,716 --> 00:27:49,156 Speaker 1: found is that in the positive emotional contagion conditions, the 438 00:27:49,276 --> 00:27:55,116 Speaker 1: groups were less conflictual, more cooperative, and they literally allocated 439 00:27:55,316 --> 00:27:58,276 Speaker 1: the money differently.
The pot of money, they were more 440 00:27:58,356 --> 00:28:01,396 Speaker 1: collectivistic in how they did it. This was very powerful 441 00:28:01,516 --> 00:28:06,476 Speaker 1: evidence for the idea that (a) yes, emotions are contagious 442 00:28:06,916 --> 00:28:12,916 Speaker 1: and (b) that they actually influence work outcomes. Seagal's study 443 00:28:12,996 --> 00:28:15,516 Speaker 1: has now been replicated in a number of experimental and 444 00:28:15,636 --> 00:28:19,876 Speaker 1: field settings, from cricket teams to bank branches to coffee shops. 445 00:28:20,836 --> 00:28:24,036 Speaker 1: Much like the flu, a single bad mood can transmit fast. 446 00:28:24,396 --> 00:28:27,276 Speaker 1: It can influence the performance of an entire team. It 447 00:28:27,356 --> 00:28:30,996 Speaker 1: can even affect the way frontline workers interact with their customers. 448 00:28:31,836 --> 00:28:34,716 Speaker 1: One of the most insidious parts of workplace contagion is 449 00:28:34,756 --> 00:28:38,276 Speaker 1: the fact that the process is reciprocal. One negative-feeling 450 00:28:38,356 --> 00:28:40,756 Speaker 1: Megan in the office doesn't just make the folks around 451 00:28:40,796 --> 00:28:44,956 Speaker 1: her grumpy. Those folks then become their own influences, making 452 00:28:45,076 --> 00:28:48,516 Speaker 1: everyone more annoyed and more stressed, and on and on 453 00:28:48,796 --> 00:28:52,396 Speaker 1: and on. This sort of emotional feedback loop is what 454 00:28:52,516 --> 00:28:57,596 Speaker 1: Seagal refers to as an affective spiral. We literally do spiral, 455 00:28:57,676 --> 00:28:59,796 Speaker 1: and we can have upward spirals and we can have 456 00:29:00,196 --> 00:29:03,956 Speaker 1: downward spirals.
The problem is that we often don't realize how 457 00:29:04,036 --> 00:29:06,916 Speaker 1: common such spirals really are. Time and 458 00:29:06,996 --> 00:29:10,276 Speaker 1: time again, research studies have shown that people don't 459 00:29:10,316 --> 00:29:13,436 Speaker 1: actually know it's occurring. And in some ways, what I 460 00:29:13,556 --> 00:29:15,396 Speaker 1: find so exciting about being able to talk about this 461 00:29:15,516 --> 00:29:18,356 Speaker 1: to people is that by letting them know that it 462 00:29:18,516 --> 00:29:23,236 Speaker 1: is a phenomenon that is happening, then you are much 463 00:29:23,356 --> 00:29:26,476 Speaker 1: more aware of it if you want to protect yourself 464 00:29:26,796 --> 00:29:29,836 Speaker 1: against the emotional contagion. Because we all have our Megans 465 00:29:29,916 --> 00:29:32,196 Speaker 1: in the office, you know, for better or for worse, Yes, 466 00:29:32,596 --> 00:29:35,596 Speaker 1: exactly. Well, and you know what, we're all Megans in 467 00:29:35,676 --> 00:29:38,076 Speaker 1: some ways. And what I mean by that is that, 468 00:29:38,676 --> 00:29:42,036 Speaker 1: you know, we have moods, right. We all vary, and 469 00:29:42,276 --> 00:29:45,556 Speaker 1: quite dramatically, throughout the day, throughout the week, and so 470 00:29:46,236 --> 00:29:51,156 Speaker 1: being really conscious of that can help us understand how 471 00:29:51,236 --> 00:29:55,876 Speaker 1: we're affecting other people. Seagal is onto something really important here. 472 00:29:56,276 --> 00:29:58,876 Speaker 1: If we don't work on our own emotions, we can 473 00:29:58,956 --> 00:30:04,236 Speaker 1: inadvertently start a downward spiral ourselves. Our negativity will likely 474 00:30:04,276 --> 00:30:07,036 Speaker 1: get caught by others around us, which can then get 475 00:30:07,116 --> 00:30:10,876 Speaker 1: transmitted back to us.
We can inadvertently be the first 476 00:30:10,956 --> 00:30:13,756 Speaker 1: bad step in a causal chain that leads us to 477 00:30:13,876 --> 00:30:18,756 Speaker 1: experience more misery. Luckily, Seagal's research has shown lots of 478 00:30:18,836 --> 00:30:21,756 Speaker 1: ways that we can alter our expressions to smooth out 479 00:30:21,796 --> 00:30:26,316 Speaker 1: otherwise negative emotional situations, both at work and beyond. Let's 480 00:30:26,316 --> 00:30:29,276 Speaker 1: say you're in a job interview with somebody and you 481 00:30:29,396 --> 00:30:31,756 Speaker 1: see that they're very nervous. You know, one of the 482 00:30:31,836 --> 00:30:33,796 Speaker 1: worst things you can say to somebody is, you know, 483 00:30:33,996 --> 00:30:37,316 Speaker 1: calm down, right. If you sort of slow your pace 484 00:30:37,356 --> 00:30:40,756 Speaker 1: a little bit, you look encouragingly, you change your tone, 485 00:30:40,956 --> 00:30:42,876 Speaker 1: as you can probably hear in what I'm doing right now, 486 00:30:43,556 --> 00:30:47,116 Speaker 1: that will naturally calm that person down so that you 487 00:30:47,196 --> 00:30:50,436 Speaker 1: can get kind of a clearer interview. As a leader, 488 00:30:52,036 --> 00:30:56,276 Speaker 1: you know, if you have people panicking around you, for example, 489 00:30:56,676 --> 00:30:59,356 Speaker 1: model for them the emotions that are going to be 490 00:30:59,476 --> 00:31:04,196 Speaker 1: the most productive in that situation. If you're sophisticated about 491 00:31:04,276 --> 00:31:09,836 Speaker 1: understanding that, you have this other way of getting your 492 00:31:09,916 --> 00:31:13,916 Speaker 1: team on board and where they need to be.
Seagal 493 00:31:13,996 --> 00:31:16,276 Speaker 1: has focused a lot of her research on the one 494 00:31:16,396 --> 00:31:18,916 Speaker 1: member of a team that has the most powerful role 495 00:31:18,996 --> 00:31:23,076 Speaker 1: in a team's emotions, the boss. When I hear about 496 00:31:23,156 --> 00:31:25,556 Speaker 1: a group or an organization that has really low morale, 497 00:31:26,036 --> 00:31:28,676 Speaker 1: one of the first questions I'm asking is, how's the 498 00:31:28,796 --> 00:31:30,716 Speaker 1: leader coming in in the morning? You know, is the 499 00:31:30,836 --> 00:31:35,396 Speaker 1: leader coming in excited, enthusiastic, energetic, talking with people, or 500 00:31:35,596 --> 00:31:37,236 Speaker 1: is the leader coming in looking like they have the 501 00:31:37,316 --> 00:31:40,316 Speaker 1: weight of the world on their shoulders and really stressed? 502 00:31:40,716 --> 00:31:43,396 Speaker 1: People are always paying attention to leaders, and so they 503 00:31:43,636 --> 00:31:47,836 Speaker 1: literally catch the leader's mood. Seagal is currently working on 504 00:31:47,956 --> 00:31:50,716 Speaker 1: a number of training programs to get leaders to regulate 505 00:31:50,756 --> 00:31:54,156 Speaker 1: their feelings, both internally and externally, in ways that best 506 00:31:54,236 --> 00:31:57,636 Speaker 1: help the group. But the first step is simply realizing 507 00:31:57,676 --> 00:32:01,076 Speaker 1: that one's emotions can affect a team's mood and its performance. 508 00:32:01,556 --> 00:32:04,916 Speaker 1: Knowledge is empowering. Things like you're doing now with this 509 00:32:05,076 --> 00:32:08,476 Speaker 1: episode allow people to know that this exists, and it's 510 00:32:08,516 --> 00:32:11,876 Speaker 1: empowering because there are things that you can do to 511 00:32:12,076 --> 00:32:16,636 Speaker 1: try to not catch somebody else's emotions.
The other empowering 512 00:32:16,756 --> 00:32:20,516 Speaker 1: piece of this, though, is knowing that we can change 513 00:32:20,556 --> 00:32:23,836 Speaker 1: each other's moods. The good news is that Seagal's work 514 00:32:23,916 --> 00:32:26,916 Speaker 1: suggests that we can be the emotional change we want 515 00:32:26,956 --> 00:32:29,516 Speaker 1: to see in the world. I asked her if this 516 00:32:29,636 --> 00:32:31,956 Speaker 1: work has made her hopeful, if she thinks we really 517 00:32:32,036 --> 00:32:36,236 Speaker 1: can tackle all the negative office Megans in our lives. Absolutely, 518 00:32:36,356 --> 00:32:41,716 Speaker 1: the affective spiral can begin with us. Absolutely. So what 519 00:32:41,836 --> 00:32:44,876 Speaker 1: have we learned in this episode? First, we are affected 520 00:32:44,916 --> 00:32:47,996 Speaker 1: by other people's emotions, way way more than we realize. 521 00:32:48,676 --> 00:32:53,796 Speaker 1: We unconsciously and automatically catch the grief, despair, excitement, rage, 522 00:32:54,236 --> 00:32:58,116 Speaker 1: joy, sadness, and serenity of all the individuals we encounter, 523 00:32:58,476 --> 00:33:00,516 Speaker 1: whether that person is a close friend at a party, 524 00:33:00,716 --> 00:33:02,676 Speaker 1: the guy behind us in line at a coffee shop, 525 00:33:02,876 --> 00:33:05,716 Speaker 1: some ranting idiot on Reddit, or even the voice of 526 00:33:05,836 --> 00:33:09,636 Speaker 1: some long-dead fifties man laughing hysterically on a bad sitcom. 527 00:33:12,036 --> 00:33:16,396 Speaker 1: By instinctually copying others' behaviors, we irresistibly take on what 528 00:33:16,516 --> 00:33:20,796 Speaker 1: they feel, whether we like that feeling or not. But 529 00:33:20,956 --> 00:33:23,596 Speaker 1: we've also learned that we have more agency than we think.
530 00:33:24,236 --> 00:33:26,076 Speaker 1: We can be more aware of how the people we 531 00:33:26,156 --> 00:33:29,396 Speaker 1: interact with online are affecting us, and mindful of what 532 00:33:29,516 --> 00:33:33,156 Speaker 1: we ourselves post. We can have control over whether we're 533 00:33:33,236 --> 00:33:35,756 Speaker 1: the office Megan, or that burst of calm in a 534 00:33:35,836 --> 00:33:39,316 Speaker 1: tough moment, or the laugh that everyone really really needs. 535 00:33:40,556 --> 00:33:43,316 Speaker 1: And that means that listening to this podcast and following 536 00:33:43,396 --> 00:33:47,556 Speaker 1: its advice won't just help your happiness. Making changes in 537 00:33:47,636 --> 00:33:50,876 Speaker 1: your own life can be the positive seed that transforms 538 00:33:50,916 --> 00:33:54,236 Speaker 1: the well-being of those around you. So if you'd 539 00:33:54,276 --> 00:33:56,276 Speaker 1: like to be a force for good to help make 540 00:33:56,396 --> 00:33:59,116 Speaker 1: joy just a little bit more viral, I hope you'll 541 00:33:59,156 --> 00:34:02,076 Speaker 1: return for the next episode of The Happiness Lab with 542 00:34:02,276 --> 00:34:19,436 Speaker 1: me, Doctor Laurie Santos. The Happiness Lab is co-written 543 00:34:19,476 --> 00:34:22,036 Speaker 1: and produced by Ryan Dilley. The show is mixed and 544 00:34:22,116 --> 00:34:25,716 Speaker 1: mastered by Evan Viola and edited by Julia Barton, fact 545 00:34:25,796 --> 00:34:29,596 Speaker 1: checking by Joseph Fridman, and our original music was composed 546 00:34:29,676 --> 00:34:35,076 Speaker 1: by Zachary Silver. Special thanks to Mia La Belle, Carly mcgliori, 547 00:34:35,516 --> 00:34:40,876 Speaker 1: Heather Faine, Maggie Taylor, Maya Kanig, and Jacob Weisberg. The 548 00:34:40,956 --> 00:34:43,636 Speaker 1: Happiness Lab is brought to you by Pushkin Industries and 549 00:34:43,836 --> 00:34:45,396 Speaker 1: me, Doctor Laurie Santos.