Speaker 1: Pushkin. The fighting in the streets of Saigon during the New Year, or Tet, Offensive made the war too real. Tonight I have ordered our aircraft and our naval vessels to make no attacks on North Vietnam. It's nineteen sixty-eight, a pretty tumultuous year. [Garbled archival news audio.] But there was one event in nineteen sixty-eight that didn't make the headlines, even though it's still having a huge effect on your well-being. I had to wait, wait, wait, and I really got a little bit agitated because I knew I had the money there. So all I had to do is, you know, cash my check and get out of there. This is Don Wetzel. He's recalling a fateful day in November of that year when he was trying to do something simple. He just wanted to withdraw some cash at his bank.
Speaker 1: I was scheduled to take a trip on a Monday morning, so on Friday, on my lunch hour, I went to my bank to get some money. I would say maybe eight to ten people in line. My guess is, you know, maybe I was in that line for like eighteen to twenty minutes just to cash a check. Don's time was really valuable. He was a talented engineer and vice president of a technology company that was on the hunt for a new business. But instead of problem solving at his desk at work, he was stuck in a bank lobby. So my job was to come up with one or more new products, and I was getting nowhere. Half a century later, we still share Don's misery. We're stuck in lines all the time: when we wait for coffee in a cafe, when we stand on a crowded train platform, when we get stuck for hours at airport security. We know exactly what he was feeling, watching time slip through his fingers. My brother Aaron wrote a book called How Many Licks? Hey, Santos, how's it going? Or, How to Estimate Damn Near Anything.
Speaker 1: I asked him to calculate for us how much time we're likely to spend waiting in line over our entire lifetimes. Yes, so I think the number we came up with was seven thousand hours. Seven thousand hours waiting in line. That's more than six months of our lives stuck in some queue. That's crazy, right? With seven thousand hours, you could take a massive vacation. You could learn a new instrument, or a new language, or a new sport. But you're not doing any of that. You're just waiting, staring at the back of someone's head, and it sucks. We tell ourselves that standing in line is an awful, annoying, happiness-draining waste of time. But what if we could see that line not as a huge pain in the butt, but as an opportunity to be happier? Our minds are constantly telling us what to do to be happy. What if our minds are wrong? What if our minds are lying to us, leading us away from what will really make us happy? The good news is that understanding the science of the mind can point us all back in the right direction. You're listening to the Happiness Lab with me, Dr. Laurie Santos.
Speaker 1: Don Wetzel doesn't have the same name recognition as Thomas Edison or Steve Jobs, but he's an inventor too, and it turns out his irritation with waiting in line led to a creation that revolutionized the financial sector. It has also completely changed the daily routines of ordinary people around the world. Before I met Don, I had a certain image of him in my mind. I rang the doorbell, expecting to meet a slick, self-important inventor guy. But then ninety-year-old Don welcomed me into the cozy Dallas home that he shares with his wife, Eleanor, and I realized Don wasn't the Elon Musk type I had imagined. Well, I'm delighted you're here. Thank you so much. Don was like the friendliest grandpa you've ever met. I sat with Don and Eleanor in their living room, which was filled with comfy pillows, smiling photos of their twelve children, and clocks. Lots of clocks. They were lovely, but clocks are kind of the nemesis of the podcaster. And wait till this one stops in a second and I'll finish. [Garbled; a clock chimes.] No, that's okay.
Speaker 1: The clocks are kind of fitting, though, because Don understands the value of time. In fact, it was that feeling of wasted time back in nineteen sixty-eight that led to his life-changing idea. So while I was in line, I thought, seems to me a teller's job mostly is cashing checks and taking deposits. So I just got the idea that I think a machine could do that. That's right. Don had just dreamed up the ATM, the automated teller machine that millions upon millions of busy people use every day. Nowadays, the idea of an ATM seems really obvious, but Don faced a lot of resistance when he first pitched it. On the board of our company, there was a banker. He thought it was the dumbest idea he had ever heard. He said, we have tellers to do that. Has anybody told you that? Yeah, you know, we do have tellers to do exactly what you're saying your machine can do. So why do you think anybody would buy this? That board member wasn't entirely crazy. There had been earlier attempts at automated bank machines, and they'd all failed, including one that took deposits.
Speaker 1: Its inventor, Luther Simjian, lamented that the only people using the machine were prostitutes and gamblers who didn't want to deal with a teller face to face. The genius of Don's ATM is that it won the trust of millions of regular customers who loved its convenience. You know, everybody prefers to get things done quicker, and you know, the ATM was a quick and easy way. Anybody could use an ATM, really, because it's very simple. Stick the card in, key in your PIN number, and bingo, here comes the money, if you got it in the bank, of course. But now in hindsight, can you see that this started, you know, in some ways, a revolution of convenience? Well, I never thought of it that way, really, Laurie. Now I'll tell you a story about that. You know, I had to come up with a forecast as to how many of these ATMs we were going to sell, and I felt like we could sell four thousand of these machines. I think there might have been four thousand just in the airport where I was today, DFW.
Speaker 1: Well, at the latest report that I heard, throughout the world they estimated there are one point three million ATMs installed nowadays. But the real success of the ATM, according to Don, is that it improves people's well-being. It gets them out of those annoying lines. It just made sense that nobody wanted to wait in the teller line like I did, so it makes every bank customer happy to get in and get out and do some other things. A bit more free time is something we all need, and Don's simple idea has probably freed up millions, possibly billions, of hours the world over. But it turns out there's an awful downside to all this convenience and saved time, one that our lying minds don't even realize. Don Wetzel's intuition was that most people want a bit of extra free time, that it'll make us happier, and the science backs him up. Simply put, we all feel way too busy today. Many of us experience what scientists call time famine. We're really starving for time, and that famished feeling has a negative effect on our well-being.
Speaker 1: In fact, people who report feeling short on time are more likely to be depressed, anxious, and less happy than people who feel like they have lots of free time. Psychologists have even come up with a term for that amazing feeling you get when, say, a meeting is canceled and you suddenly have a free hour you didn't expect. We call it time affluence, and those rare moments when we feel wealthy in time can make us feel amazing. It's one of the reasons that every once in a while I surprise my Yale students by canceling my happiness class, and their reactions show just how important a little unexpected time off can be. One student even burst into tears. She said it was the first time she'd had an hour off all semester. She'd almost forgotten what it was like to have some free time. So adding even a few extra minutes to our perceived time banks can feel really good. But recent studies also suggest something rather counterintuitive. That is, we misestimate just how busy we really are.
Speaker 1: While there's lots of work showing that we feel busier than ever before, there is very little evidence showing that we actually are busier, which is kind of weird. It's as though our minds tell us we're super busy all the time, but in reality it's not as bad as we think. But there's another, even more insidious way our mind leads us astray when we try to save some time. It turns out there's an opportunity cost that comes from avoiding those bank lines, and the cost is a social one. Long lines are frustrating, but they're also an opportunity to be around other people, and the sheer amount of time we spend around other people actually predicts how happy we are. Take one famous study by positive psychologists Ed Diener and Marty Seligman. They looked at people who scored in the highest tenth percentile on happiness surveys and tried to figure out what makes them so much happier than the rest of us. The researchers discovered that these happy people didn't spend any more time exercising or doing religious activities. What did these happy folks do differently? They were more social.
Speaker 1: They spent more time around other humans than people with average levels of happiness. The results were so strong that these researchers deemed being around other people a necessary condition for very high happiness. Another study, by Nobel Prize-winning psychologist Danny Kahneman, confirmed this. He and his colleagues tested which daily activities make us feel best. The winner? Socializing with others. It's better than eating, shopping, relaxing, or even watching TV. Just being with other people makes us feel good, even if those people are strangers. There are lots of sources of well-being standing around you. You just have to tap into them. My friend Nick Epley is a professor of behavioral science at the University of Chicago's Booth School of Business. Happiness isn't about the intensity of the experiences that we have. It's about the frequency of them. Happiness is like, is like a, you know, a leaky tire on your car. You don't have a nice conversation with somebody and then are happy forever. But if you're having a nice conversation with somebody on a plane, that plane ride is more enjoyable than it would have been otherwise.
Speaker 1: But then, you know, once you're off the plane ride, you know, your tire goes flat a little bit. You got to do something else to pump it back up. And so I find a lot of these conversations are like, uh, you know, air compressors for my tires. Nick studies why we're so resistant to being more social. Why don't we take more time to fill up our leaky happiness tires with a quick conversation? People get the consequences of social interaction wrong, particularly with strangers. Not engaging in conversation with somebody else gives you a cost somewhere else, and people don't always seem to recognize that. It turns out the cost of not being social, of not taking enough time to connect with other people, is that it makes us feel pretty awful. Feeling lonely or isolated just kind of stinks. Loneliness is now a growing epidemic around the world. People today report feeling lonely at double the rate they did in the nineteen eighties. Take college campuses, like where I work at Yale. Nationally in the US right now, over sixty percent of college students report feeling very lonely most of the time.
Speaker 1: This is higher than in any other previous generation. A stressor like that impairs your well-being, and it impairs your health. Recent research shows that the physical consequences of our increased loneliness are staggering. Feeling isolated is said to be as bad for our health as smoking fifteen cigarettes a day. If loneliness had a health warning, it would sound like this: may cause increased risk of inflammation, disrupted sleep, abnormal immune responses, depression, anxiety, higher stress levels, early cognitive decline, alcoholism, cardiovascular disease, stroke, Alzheimer's, diabetes, suicide, and even early death. So what can we do to fight this loneliness epidemic? Well, you can get a few hints from people who don't feel all that lonely, people like Eleanor Wetzel. I'm half extrovert and half introvert, and so that part of my personality enjoys the connection with people. From the moment Don Wetzel's wife welcomed me into her home, it was obvious that this old-fashioned grandmother was the opposite of lonely. She was one of the most sociable people I had met in a while. She had a story for everything, including how she met Don.
Speaker 1: It was a blind date, so we were starting at zero, and I think there was just a chemistry there. I had planned to spend only thirty minutes or so on this interview, but I ended up chatting with Eleanor for over two hours. We talked about our families, what her life was like growing up, how she was able to raise so many children, and other stuff too. I asked what her secret was. How did she connect with people so easily? It turns out she just chats with strangers whenever she can. I had no problem with direct eye contact and smiles. You know, that's who we are. That's how you relate to people. But I can see a lot of downsides with the technology that we have available. The ATM doesn't smile back at you, or marvel, or show me their pretty eyes or whatever. So we don't want to lose all of that. It's true that Don's ATMs have given us back time, but they've also robbed us of an important opportunity to connect with human tellers and our fellow bank customers.
Speaker 1: They steal one of the small chances we have each day to fill up our leaky happiness tires with a quick conversation, which is why Eleanor has taken a relatively shocking stance on ATMs. Well, I've actually never used one, period. That's right, Eleanor has never used an ATM, even though her husband is the guy who invented them. She just prefers to chat with the teller. I don't think we even know yet how much is being lost without that interaction of human beings, the whole bit. There are so many components, I wouldn't even have time to go into all of them, and I'm sure I haven't even thought of all of them. Eleanor's right here. We're automating the humans out of everything. Take music, for example. Back in nineteen sixty-eight, if Eleanor wanted to hear a new song, she'd have to interact with a bunch of people. She'd have to find a record store, ask the clerk where to find that new song, stand in line with other folks to buy it, and only then could she drive home with her kids to throw it on her record player. But today it's different.
Speaker 1: Who is this Alexa that you can have do everything for you? Every new automated convenience we introduce into our lives has a cost, and that cost, all too often, is a social one. The problem is it's not often a cost we even realize. The question is why. But first we need some music to send us into the break, so let's tease what's coming up next. Alexa, play anything by the Talking Heads. I'm having trouble connecting to the internet. Oh, I'm so sorry. Give me a moment. The Happiness Lab will be right back.

Speaker 1: I ride the train into Chicago every day to my office in Hyde Park from one of the far South Side suburbs, and every day I get on the train and I see exactly the same kind of phenomenon. I've seen it for years. Science begins with observation, and Nick Epley observes something on his daily commute that is so commonplace yet so odd when you really think about it. People would get on and sit down next to their neighbors, perfectly decent, lovely people going into Chicago to work for the day.
Speaker 1: They would sit down, cheek to jowl, next to somebody else, and they would then ignore each other for forty-five minutes. Most train cars are full of people, which means they're also full of knowledge, stories, and jokes, but most are also deathly quiet. I mean, almost nobody ever talks on the train. The question is why. Nick decided to test this. He recruited passengers sharing his commute to work, dividing them into three different groups, or conditions, as we researchers call them. He asked each group to act in a certain way while they were on the train. In one condition, we told them to keep to themselves: just focus on your day ahead, don't engage others around you in conversation this morning. In the second condition, we asked them to do whatever they normally do, which is typically the same as what happens in the solitude condition. Almost nobody talks to strangers on the train. And in the third condition, we asked them to do something radical. We asked them to try to make a connection with the person who sits down next to you this morning on the train.
Speaker 1: Try to get to know something about him or her. So they were going to have a conversation. Let's think about these different groups for a second. Which one would you be happiest in: the group in which you could enjoy your solitude, or the one that forced you to talk to a complete stranger? You might naturally have a pretty strong intuition here, but I bet that intuition is wrong. People reported the most positive commute in the connection condition, less positive in the control condition, and least positive in the solitude condition, where they kept to themselves. Being forced to talk with a stranger was far and away the most pleasurable experience. Simply making a connection with someone we don't know makes us feel really good. Nick's done this very same study in a number of different contexts: on city buses and in cabs, at the airport, in waiting rooms. They all find the same result. People are happiest when they're being social with someone. But what about that other person? You could imagine that we were potentially spreading misery, that the person being talked to maybe was unhappy about this.
We were like 323 00:19:11,436 --> 00:19:16,236 Speaker 1: polluting the train with all of this unwanted conversation. So 324 00:19:16,396 --> 00:19:20,076 Speaker 1: does your conversation make other people miserable? Well, Nick tested 325 00:19:20,116 --> 00:19:23,636 Speaker 1: that too, by creating a fake waiting room in his laboratory. 326 00:19:24,116 --> 00:19:27,676 Speaker 1: They were also happier when they were talked to than 327 00:19:27,796 --> 00:19:32,276 Speaker 1: when they were not talked to, and that effect was 328 00:19:32,436 --> 00:19:35,716 Speaker 1: just as big as the effect on the people who 329 00:19:35,796 --> 00:19:38,636 Speaker 1: were instructed to talk. So I don't think 330 00:19:38,676 --> 00:19:42,716 Speaker 1: we're spreading misery on the trains or the buses. Connecting 331 00:19:42,756 --> 00:19:45,676 Speaker 1: with someone is pleasant, whether you are the one who's 332 00:19:45,676 --> 00:19:48,996 Speaker 1: initiating it or the one who's receiving it. Note that 333 00:19:49,156 --> 00:19:52,876 Speaker 1: Nick's not advocating harassing someone on the train or continuing 334 00:19:52,916 --> 00:19:55,156 Speaker 1: to try to talk to someone who clearly doesn't want 335 00:19:55,196 --> 00:19:57,636 Speaker 1: you to speak to them. All Nick's saying is that 336 00:19:57,716 --> 00:20:01,116 Speaker 1: a quick conversation can make us feel good. The problem 337 00:20:01,276 --> 00:20:03,796 Speaker 1: is that's not what we think is going to happen. 338 00:20:04,516 --> 00:20:07,516 Speaker 1: When Nick asked people to imagine how they'd feel getting 339 00:20:07,556 --> 00:20:11,356 Speaker 1: into a conversation with a stranger, they wrongly predicted that 340 00:20:11,476 --> 00:20:14,996 Speaker 1: it wouldn't be fun or uplifting. The reason that's interesting 341 00:20:15,156 --> 00:20:19,156 Speaker 1: is because our expectations guide our behavior. 
So if you 342 00:20:19,276 --> 00:20:21,316 Speaker 1: expect it's going to be freezing cold outside, you'll pick 343 00:20:21,396 --> 00:20:22,916 Speaker 1: up a jacket and you'll wear it when you go outside. 344 00:20:22,956 --> 00:20:24,836 Speaker 1: If you expect that it's going to be really warm outside, 345 00:20:24,876 --> 00:20:27,316 Speaker 1: you won't wear a jacket. If I expect that talking 346 00:20:27,356 --> 00:20:28,956 Speaker 1: to somebody will be pleasant, I'll do it. If I 347 00:20:28,996 --> 00:20:31,356 Speaker 1: expect that it'll be miserable, I won't. But I bet 348 00:20:31,436 --> 00:20:34,796 Speaker 1: you're thinking, what if you're shy around people? Maybe all 349 00:20:34,836 --> 00:20:37,716 Speaker 1: this talking-to-strangers stuff works if you're really outgoing, 350 00:20:38,196 --> 00:20:41,116 Speaker 1: but maybe it sucks for introverts. And we did measure 351 00:20:41,156 --> 00:20:44,276 Speaker 1: this and we found actually no difference at all between 352 00:20:44,316 --> 00:20:48,756 Speaker 1: introverts and extroverts across these conditions. That is, 353 00:20:49,036 --> 00:20:55,556 Speaker 1: introverts enjoyed connecting with others just as much as extroverts did. Introverts did 354 00:20:55,636 --> 00:20:59,636 Speaker 1: not enjoy keeping to themselves in solitude, and extroverts didn't 355 00:20:59,716 --> 00:21:04,156 Speaker 1: enjoy that either. What tends to vary are people's expectations 356 00:21:04,196 --> 00:21:07,876 Speaker 1: about how they're going to feel. So an introvert, because 357 00:21:07,916 --> 00:21:09,476 Speaker 1: they think they're not going to enjoy it already, is 358 00:21:09,516 --> 00:21:12,676 Speaker 1: going to choose not to go, whereas an extrovert who 359 00:21:12,836 --> 00:21:16,236 Speaker 1: enjoys a party might choose to go. 
On average, people 360 00:21:16,316 --> 00:21:19,156 Speaker 1: tend to feel happier when they are connecting with others, 361 00:21:19,516 --> 00:21:24,076 Speaker 1: and that's true for both extroverts and introverts. Nick's results 362 00:21:24,116 --> 00:21:26,356 Speaker 1: are quite challenging for a lot of people to hear. 363 00:21:26,876 --> 00:21:29,836 Speaker 1: No matter what your personality type is, you will increase 364 00:21:29,876 --> 00:21:32,516 Speaker 1: your happiness if you interact with people you randomly meet 365 00:21:32,796 --> 00:21:35,996 Speaker 1: in stores or on public transport. It creates a social connection. 366 00:21:36,036 --> 00:21:38,956 Speaker 1: It keeps you connected on the right level. I made 367 00:21:38,996 --> 00:21:41,836 Speaker 1: this very point on the CBS Morning News recently. Happy 368 00:21:41,916 --> 00:21:44,636 Speaker 1: people take time for social connection. They try to make 369 00:21:44,716 --> 00:21:46,796 Speaker 1: connections with the people on the street, and I got 370 00:21:46,876 --> 00:21:50,436 Speaker 1: some interesting reactions from the viewers. Here's one tweet from 371 00:21:50,476 --> 00:21:52,676 Speaker 1: someone who says, quote, talk to a stranger on the bus? 372 00:21:52,796 --> 00:21:55,996 Speaker 1: Are you insane? Don't talk to strangers. It's dangerous. Didn't 373 00:21:55,996 --> 00:21:59,076 Speaker 1: your mama teach you anything? Here's another one, one of 374 00:21:59,116 --> 00:22:02,796 Speaker 1: my personal favorites. If a stranger talks to me on 375 00:22:02,876 --> 00:22:05,036 Speaker 1: a bus, I will go nuts. 
People die because of 376 00:22:05,076 --> 00:22:09,476 Speaker 1: shite like this, hell no. So do you get 377 00:22:09,516 --> 00:22:11,876 Speaker 1: similar reactions, where people hear these data and 378 00:22:11,916 --> 00:22:14,996 Speaker 1: are just like, not true, not me? Oh yeah, yeah, yeah, 379 00:22:15,076 --> 00:22:17,876 Speaker 1: I get it all the time. I get 380 00:22:17,916 --> 00:22:20,516 Speaker 1: a lot of pushback on this because the expectations are 381 00:22:20,596 --> 00:22:24,196 Speaker 1: so strong. So what people are imagining, I think, are 382 00:22:24,796 --> 00:22:28,356 Speaker 1: random people who might come up to you and talk 383 00:22:28,916 --> 00:22:33,956 Speaker 1: to you, and they imagine sort of the worst case outcome. 384 00:22:34,956 --> 00:22:38,516 Speaker 1: So they're imagining homeless people or mentally ill people or 385 00:22:38,596 --> 00:22:42,436 Speaker 1: something, who are dangerous to them, or psychopaths, whatever. 386 00:22:44,196 --> 00:22:47,396 Speaker 1: But that's a different situation from what we're asking people 387 00:22:47,916 --> 00:22:51,276 Speaker 1: to do here. We're just asking you to talk to 388 00:22:51,356 --> 00:22:53,076 Speaker 1: a person who happens to be sitting next to you, 389 00:22:53,196 --> 00:22:54,996 Speaker 1: and the person who happens to be sitting next to 390 00:22:55,076 --> 00:22:59,556 Speaker 1: you is likely to just be a normal person, not 391 00:22:59,676 --> 00:23:04,116 Speaker 1: a psychopath. We don't do something that's almost certain to 392 00:23:04,236 --> 00:23:06,636 Speaker 1: make us happier because we think we'll be preyed upon 393 00:23:06,796 --> 00:23:10,956 Speaker 1: by some imaginary psycho killer. Actually, we're going into the 394 00:23:11,036 --> 00:23:14,716 Speaker 1: break again. Alexa, play Psycho Killer by the Talking Heads. I'm 395 00:23:14,756 --> 00:23:17,956 Speaker 1: having trouble connecting to the internet. 
That's so annoying. I'm 396 00:23:17,956 --> 00:23:19,836 Speaker 1: so sorry. The show will be back in a moment. 397 00:23:32,356 --> 00:23:34,836 Speaker 1: Nick Epley thinks we're too scared of falling victim to 398 00:23:34,956 --> 00:23:37,876 Speaker 1: some psycho killer to strike up conversations on a train. 399 00:23:38,596 --> 00:23:40,916 Speaker 1: Such unfounded fears are part of why we seem to 400 00:23:40,996 --> 00:23:45,436 Speaker 1: find the automation revolution so alluring. Don's ATM was the 401 00:23:45,476 --> 00:23:48,556 Speaker 1: first step. But now we're killing the human part of 402 00:23:48,676 --> 00:23:52,196 Speaker 1: so many of our interactions. I want to introduce you 403 00:23:52,316 --> 00:23:56,156 Speaker 1: to someone who's deeply worried about this new direction, someone 404 00:23:56,396 --> 00:23:59,156 Speaker 1: who is making changes in his own life to fight back. 405 00:23:59,996 --> 00:24:03,276 Speaker 1: David Byrne. We're losing something, and a lot of the 406 00:24:04,116 --> 00:24:08,836 Speaker 1: efficiency that we think is there is kind of an illusion. 407 00:24:09,316 --> 00:24:12,716 Speaker 1: This is insane. You probably know David as the Talking 408 00:24:12,756 --> 00:24:15,276 Speaker 1: Heads frontman, but what you may not know is that 409 00:24:15,396 --> 00:24:19,316 Speaker 1: David also writes brilliant social essays. You recently authored a 410 00:24:19,396 --> 00:24:23,156 Speaker 1: fantastic article for the MIT Technology Review on the hidden 411 00:24:23,236 --> 00:24:28,476 Speaker 1: dangers of automation. It's titled Eliminating the Human. If these 412 00:24:28,516 --> 00:24:34,396 Speaker 1: things are becoming so ubiquitous, the elimination of the human interaction, 413 00:24:35,236 --> 00:24:40,036 Speaker 1: what does that mean for us as individuals, as a society, 414 00:24:40,276 --> 00:24:43,796 Speaker 1: as a community? 
David's thesis is that humans have developed 415 00:24:43,836 --> 00:24:47,156 Speaker 1: over millions of years to work, trade, have fun, and 416 00:24:47,316 --> 00:24:52,116 Speaker 1: form relationships face to face. You're getting all these different signals. 417 00:24:52,196 --> 00:24:57,396 Speaker 1: You're getting signals from their body language, their facial expression, 418 00:24:57,716 --> 00:25:01,076 Speaker 1: what their eyes are doing, the tone of their voice. 419 00:25:01,716 --> 00:25:05,676 Speaker 1: We are social animals. That's what we are. We're like 420 00:25:05,796 --> 00:25:11,156 Speaker 1: ants and wolves, and we are an animal that flourishes 421 00:25:11,556 --> 00:25:15,076 Speaker 1: because we are social. And you wonder what will happen, 422 00:25:15,196 --> 00:25:20,276 Speaker 1: or what is happening, when that aspect of our deep 423 00:25:20,396 --> 00:25:23,676 Speaker 1: makeup starts to be taken away from us, or, not 424 00:25:23,876 --> 00:25:27,156 Speaker 1: so much taken away, we give it up voluntarily. David's 425 00:25:27,156 --> 00:25:29,916 Speaker 1: worried that we're all voluntarily turning our backs on our 426 00:25:29,916 --> 00:25:33,156 Speaker 1: fellow humans every day, thanks to new products which promise 427 00:25:33,276 --> 00:25:36,556 Speaker 1: us ease and convenience, be it an ATM or an 428 00:25:36,596 --> 00:25:39,636 Speaker 1: app to preorder our groceries, or a film streaming service 429 00:25:39,756 --> 00:25:41,956 Speaker 1: that saves us a trip to a crowded movie theater. 430 00:25:42,356 --> 00:25:46,476 Speaker 1: I'm not saying that whoever designed these things had in 431 00:25:46,516 --> 00:25:48,916 Speaker 1: the front of their mind, can I come up with 432 00:25:49,116 --> 00:25:52,236 Speaker 1: a technology to eliminate some of the human interaction in 433 00:25:52,356 --> 00:25:56,276 Speaker 1: my life? But it sure seems to be the result. 
434 00:25:56,956 --> 00:25:59,756 Speaker 1: No disrespect to inventors like Don Wetzel, who is as 435 00:25:59,836 --> 00:26:03,316 Speaker 1: social as social can be. But David worries that a relatively 436 00:26:03,436 --> 00:26:07,036 Speaker 1: small section of society, namely the engineers who design all 437 00:26:07,076 --> 00:26:09,796 Speaker 1: this stuff, they're creating a world the rest of us 438 00:26:09,876 --> 00:26:13,556 Speaker 1: must inhabit, and they are creating it in their own image. 439 00:26:14,156 --> 00:26:19,116 Speaker 1: My father was an engineer. I enjoy that mindset of 440 00:26:19,636 --> 00:26:22,836 Speaker 1: looking at things from an engineer's point of view, but 441 00:26:23,036 --> 00:26:28,716 Speaker 1: I recognize a lot of that in a lot of programmers, 442 00:26:28,876 --> 00:26:32,836 Speaker 1: coders, engineers who are designing a lot of the things 443 00:26:32,916 --> 00:26:37,676 Speaker 1: that kind of envelop us in the contemporary world. You 444 00:26:37,796 --> 00:26:41,356 Speaker 1: can sense that a lot of these guys, and most 445 00:26:41,396 --> 00:26:46,676 Speaker 1: of them are guys, are not comfortable in social situations. 446 00:26:46,876 --> 00:26:52,796 Speaker 1: So even if they would not vocally say, I want 447 00:26:52,836 --> 00:26:54,676 Speaker 1: to make a world where I never have to interact 448 00:26:54,716 --> 00:26:59,516 Speaker 1: with a person, they might unconsciously do that. That's the 449 00:26:59,636 --> 00:27:03,276 Speaker 1: world that has been made for us, and whether we 450 00:27:03,396 --> 00:27:07,276 Speaker 1: want to or not, we're living in their world. Whether 451 00:27:07,436 --> 00:27:10,116 Speaker 1: or not you completely buy the stereotype that all 452 00:27:10,236 --> 00:27:13,596 Speaker 1: engineers shun human company, most of us can admit that 453 00:27:13,716 --> 00:27:17,516 Speaker 1: what they've designed is often pretty tempting. 
Many of us 454 00:27:17,556 --> 00:27:20,836 Speaker 1: have moments where we relish opportunities to be by ourselves 455 00:27:21,636 --> 00:27:24,756 Speaker 1: or just hide away a bit. When I was much younger, 456 00:27:24,876 --> 00:27:29,916 Speaker 1: I was much shyer, I was much much more uncomfortable 457 00:27:29,996 --> 00:27:34,836 Speaker 1: in social situations. I would create a kind of facade 458 00:27:34,996 --> 00:27:39,476 Speaker 1: or character or persona that would be my face for 459 00:27:39,756 --> 00:27:42,556 Speaker 1: a social interaction, and it was a little bit artificial 460 00:27:43,396 --> 00:27:47,636 Speaker 1: in that sense. I can identify and understand that a 461 00:27:47,716 --> 00:27:51,276 Speaker 1: lot of people feel like, oh no, if I can 462 00:27:52,036 --> 00:27:54,956 Speaker 1: figure out a way to navigate the world with as 463 00:27:55,116 --> 00:28:02,156 Speaker 1: few annoying interactions with the human, then very good. Let's 464 00:28:02,476 --> 00:28:08,356 Speaker 1: design interfaces that speed things along and help someone who 465 00:28:08,556 --> 00:28:12,636 Speaker 1: is uncomfortable in social situations, for example, get through them 466 00:28:12,796 --> 00:28:17,676 Speaker 1: without the pesky human. At the end of his MIT 467 00:28:17,836 --> 00:28:21,436 Speaker 1: Tech Review article, David argues that as we spend less 468 00:28:21,476 --> 00:28:24,556 Speaker 1: and less time talking and listening to each other, we'll 469 00:28:24,596 --> 00:28:28,156 Speaker 1: become less tolerant of each other's differences, we'll become more 470 00:28:28,236 --> 00:28:33,396 Speaker 1: inclined to envy and antagonism. It's a chilling prospect. But 471 00:28:33,556 --> 00:28:37,236 Speaker 1: can science save us? 
Can researchers like Nick convince the 472 00:28:37,356 --> 00:28:40,996 Speaker 1: champions of automation that they're getting the balance between convenience 473 00:28:41,156 --> 00:28:45,796 Speaker 1: and happiness all wrong? I'm afraid to say it doesn't 474 00:28:45,836 --> 00:28:51,636 Speaker 1: look promising. Remember Nick's experiment using train passengers, how he 475 00:28:51,716 --> 00:28:54,316 Speaker 1: found that the people he forced into conversation with their 476 00:28:54,356 --> 00:28:58,556 Speaker 1: fellow commuters had happier journeys? Well, Nick reported these findings 477 00:28:58,636 --> 00:29:00,796 Speaker 1: back to the head of marketing at the railroad company, 478 00:29:01,396 --> 00:29:05,156 Speaker 1: and here was her response. Nick, you're not going to 479 00:29:05,236 --> 00:29:08,196 Speaker 1: believe what we're about to do, she said. We're going 480 00:29:08,236 --> 00:29:10,996 Speaker 1: to roll out a new policy on the trains. We're 481 00:29:10,996 --> 00:29:14,316 Speaker 1: going to put in place a quiet car. I said, 482 00:29:14,316 --> 00:29:18,276 Speaker 1: oh really? And the quiet car is one, she explained, 483 00:29:18,316 --> 00:29:21,756 Speaker 1: where people are not allowed to engage in conversation, they're 484 00:29:21,756 --> 00:29:23,556 Speaker 1: not allowed to talk on their cell phones, they're not 485 00:29:23,596 --> 00:29:25,276 Speaker 1: allowed to talk to somebody sitting next to them. It's 486 00:29:25,276 --> 00:29:28,236 Speaker 1: supposed to be absolutely quiet. 
Nick was surprised. Why had the 487 00:29:28,276 --> 00:29:31,396 Speaker 1: train company made a decision that completely contradicted his 488 00:29:31,476 --> 00:29:34,676 Speaker 1: well-being research? And she said, well, because we asked 489 00:29:34,756 --> 00:29:36,676 Speaker 1: people on a survey what they wanted, and this is 490 00:29:36,756 --> 00:29:40,956 Speaker 1: what they said they wanted. Which, of course, I pointed out, 491 00:29:40,996 --> 00:29:43,876 Speaker 1: is exactly what our participants said they wanted too, 492 00:29:44,756 --> 00:29:46,996 Speaker 1: and it just turned out not quite to be right, 493 00:29:47,036 --> 00:29:50,276 Speaker 1: at least in terms of their well-being. Nick, being 494 00:29:50,316 --> 00:29:53,316 Speaker 1: a good scientist, wanted to know if the railroad people 495 00:29:53,556 --> 00:29:56,356 Speaker 1: had carried out an experiment with the opposite of a 496 00:29:56,436 --> 00:29:59,876 Speaker 1: quiet car. Have you ever just put a chatty car 497 00:30:00,596 --> 00:30:03,796 Speaker 1: on the line, where people can just get together, you know, 498 00:30:03,876 --> 00:30:05,996 Speaker 1: maybe you got snacks or something, and you can get 499 00:30:05,996 --> 00:30:08,236 Speaker 1: together and you just talk. You just get to know 500 00:30:08,276 --> 00:30:10,116 Speaker 1: your neighbors a little, get to know your fellow commuters. 501 00:30:10,156 --> 00:30:13,156 Speaker 1: A chatty car. And she laughed and she said, no, 502 00:30:13,276 --> 00:30:16,996 Speaker 1: we've never done the chatty car. But we used 503 00:30:17,036 --> 00:30:20,516 Speaker 1: to have bar cars on the trains where people would 504 00:30:20,516 --> 00:30:22,476 Speaker 1: get together and often they would then connect with each 505 00:30:22,516 --> 00:30:24,236 Speaker 1: other there. And I asked her, do you still have 506 00:30:24,316 --> 00:30:26,316 Speaker 1: the bar cars anymore? She said, no, we don't. 
We 507 00:30:26,396 --> 00:30:28,276 Speaker 1: don't have them anymore. And I asked why not. I 508 00:30:28,436 --> 00:30:30,876 Speaker 1: was imagining, you know, her telling stories about people stumbling 509 00:30:30,916 --> 00:30:33,356 Speaker 1: off the trains drunk or something. But she said the 510 00:30:33,436 --> 00:30:38,876 Speaker 1: real problem was they were too crowded. That is, they 511 00:30:38,916 --> 00:30:41,316 Speaker 1: were too popular, so there were too many people who 512 00:30:41,356 --> 00:30:43,796 Speaker 1: wanted to be in there. That's the point at which, 513 00:30:43,876 --> 00:30:47,716 Speaker 1: as a behavioral scientist, you just sort of sigh. They 514 00:30:47,956 --> 00:30:51,836 Speaker 1: have clear data that people 515 00:30:51,956 --> 00:30:54,956 Speaker 1: really enjoy being able to connect with each other, and 516 00:30:55,196 --> 00:30:58,956 Speaker 1: yet that service doesn't get extended. They canceled 517 00:30:58,996 --> 00:31:01,996 Speaker 1: it because the chatty car, or the equivalent of it, 518 00:31:02,196 --> 00:31:07,316 Speaker 1: was too crowded. So if banks and railroad companies 519 00:31:07,436 --> 00:31:10,676 Speaker 1: and app designers and store owners aren't going to come 520 00:31:10,716 --> 00:31:13,316 Speaker 1: to our rescue, what are we to do to stop 521 00:31:13,396 --> 00:31:18,156 Speaker 1: feeling so isolated? The answer is pretty simple. We just 522 00:31:18,316 --> 00:31:21,116 Speaker 1: need to connect with other people, and not just our 523 00:31:21,156 --> 00:31:24,276 Speaker 1: friends and family members. We also need to make the 524 00:31:24,316 --> 00:31:27,556 Speaker 1: effort to connect more with strangers, the random people around 525 00:31:27,676 --> 00:31:31,236 Speaker 1: us in lines and on our commute. They matter more 526 00:31:31,276 --> 00:31:37,436 Speaker 1: than we think. David Byrne realized this. 
Despite his natural shyness, 527 00:31:37,916 --> 00:31:40,196 Speaker 1: he's trying to be part of the cure. He now 528 00:31:40,236 --> 00:31:44,236 Speaker 1: embraces opportunities to connect with the people who cross his path. Yeah, 529 00:31:44,236 --> 00:31:46,836 Speaker 1: it happened the other day. The subways were messed up 530 00:31:46,876 --> 00:31:52,396 Speaker 1: and there was a Chinese guy who had some luggage with 531 00:31:52,436 --> 00:31:54,596 Speaker 1: him and he was 532 00:31:54,676 --> 00:31:57,636 Speaker 1: really having a hard time. Everything had changed. 533 00:31:57,636 --> 00:31:59,516 Speaker 1: You know, this train's now on this line, this train's 534 00:31:59,556 --> 00:32:01,516 Speaker 1: now running on this line. This used to be an 535 00:32:01,556 --> 00:32:03,356 Speaker 1: express, now it's a local. And it was all this. 536 00:32:03,756 --> 00:32:08,076 Speaker 1: We kind of figured it out together, which was kind 537 00:32:08,116 --> 00:32:14,156 Speaker 1: of sweet. You have made a connection. And what I 538 00:32:14,276 --> 00:32:18,916 Speaker 1: discover is very often they'll smile when you are sharing 539 00:32:19,036 --> 00:32:23,116 Speaker 1: this acknowledgement with them. They might laugh, you'll laugh, and 540 00:32:23,596 --> 00:32:26,276 Speaker 1: so it kind of, well, it sounds a bit cliché, 541 00:32:26,396 --> 00:32:29,796 Speaker 1: but it brightens your day for another fifteen minutes at least. 542 00:32:32,796 --> 00:32:35,236 Speaker 1: So what have we learned in this episode? For one thing, 543 00:32:35,476 --> 00:32:39,276 Speaker 1: we too readily assume that convenience, efficiency and near instant 544 00:32:39,316 --> 00:32:43,316 Speaker 1: gratification are the routes to happiness, but that assumption is 545 00:32:43,356 --> 00:32:47,236 Speaker 1: often wrong. 
Tiny human interactions are the burst of air 546 00:32:47,356 --> 00:32:50,436 Speaker 1: we need for our happiness tires, to steal Nick's metaphor. 547 00:32:51,356 --> 00:32:53,796 Speaker 1: Your mind might tell you a quick conversation is going 548 00:32:53,876 --> 00:32:57,036 Speaker 1: to be awkward, too much time, not worth it. But 549 00:32:57,196 --> 00:33:01,556 Speaker 1: those intuitions are wrong, even for shy folks. So get 550 00:33:01,596 --> 00:33:04,796 Speaker 1: out there and make a new connection. Next time you 551 00:33:04,916 --> 00:33:07,276 Speaker 1: are standing in line, talk to the person next to you. 552 00:33:07,996 --> 00:33:10,396 Speaker 1: If you can't think of something to say, you could 553 00:33:10,436 --> 00:33:13,036 Speaker 1: tell them that lines are an opportunity, and that the 554 00:33:13,156 --> 00:33:16,316 Speaker 1: guy who had the inspiration for the ATM did 555 00:33:16,356 --> 00:33:19,076 Speaker 1: so while waiting in a bank line, and that his 556 00:33:19,236 --> 00:33:22,836 Speaker 1: wife has never used that invention. You could even tell 557 00:33:22,876 --> 00:33:25,996 Speaker 1: them that you heard that on a podcast, a podcast 558 00:33:26,076 --> 00:33:40,716 Speaker 1: called The Happiness Lab with Doctor Laurie Santos. If you 559 00:33:40,876 --> 00:33:43,116 Speaker 1: enjoyed the show, I'd be super grateful if you could 560 00:33:43,116 --> 00:33:45,676 Speaker 1: spread the word by leaving a rating and a review. 561 00:33:46,076 --> 00:33:49,036 Speaker 1: It really does help other listeners find us, and don't 562 00:33:49,076 --> 00:33:51,876 Speaker 1: forget to tell your friends. If you want to learn 563 00:33:51,916 --> 00:33:53,916 Speaker 1: more about the science you heard on the show, then 564 00:33:53,996 --> 00:33:57,076 Speaker 1: check out our website, Happiness Lab dot fm. You can 565 00:33:57,116 --> 00:33:59,636 Speaker 1: also sign up for our newsletter to get exclusive content. 
566 00:34:01,116 --> 00:34:03,956 Speaker 1: The Happiness Lab is co-written and produced by Ryan Dilley. 567 00:34:04,316 --> 00:34:06,636 Speaker 1: The show is mixed and mastered by Evan Viola and 568 00:34:06,876 --> 00:34:10,716 Speaker 1: edited by Julia Barton, fact-checked by Joseph Fridman, and 569 00:34:11,076 --> 00:34:15,316 Speaker 1: our original music was composed by Zachary Silver. Special thanks 570 00:34:15,516 --> 00:34:20,796 Speaker 1: to Mia Lobel, Carly Migliori, Heather Fain, Maggie Taylor, Maya Koenig, 571 00:34:21,076 --> 00:34:23,956 Speaker 1: and Jacob Weisberg. The Happiness Lab is brought to you 572 00:34:24,076 --> 00:34:27,116 Speaker 1: by Pushkin Industries and me, Doctor Laurie Santos.