Hello, I'm Dessa, and this is Deeply Human: why you do the things you do.

So let me ask you a question. Have you been on any of these online dating sites? Yes. And have you been on one of those where you have to enter your age and weight? Yes. And how honest were you? I would say... I would say mostly honest. I probably shaved off three pounds on that day. Yeah, and you probably said, you know, I used to be three pounds less, like just a few months ago. It just so happened that now, when I'm filling it out, it's after the holidays. You're in my head, Dan. You're in my head.

That's Dan Ariely, a professor of psychology and behavioral economics at Duke University in North Carolina, calling me out on a lie. High-stakes lies have all sorts of consequences. Perjurers are imprisoned, slanderers are sued. In the eighteen hundreds, liars and gossips in Scotland could be punished by being forced to wear a scold's bridle, a metal cage worn over the head with a painful bit to stop the tongue from moving.
But little lies, they sometimes feel almost obligatory. Like if your coworker Andy asks, do you like my haircut? And you say, yes! It adds a new shortness to the ends of it.

Lying to children can be a joyful community activity. Good news, little Maxie: there's someone who would like to buy your old teeth. She's nocturnal, she can fly, but she deals exclusively with unconscious clientele, so you will just have to trust that you are receiving market rate.

Lie detector tests, child psychology, moral vigilance, bad haircuts. We're here to explore why lying is a developmental milestone for toddlers, but maybe a slippery slope for you.

Honesty is a really good thing, and most people acknowledge it and they want to be honest. But on a day-to-day basis, we have lots of other motivations that play out as well. So motivation to be honest is one of them. Motivation to gain money is another one. Motivation to have our political party be successful, motivation to decrease global warming, motivation to whatever, whatever. I mean, there's lots of them.
Dan is the author of a book called The Honest Truth About Dishonesty: How We Lie to Everyone, Especially Ourselves. Most of us want to be honest, but we also want to find love, we want financial security, we want the best for our kids, and not all of these desires are compatible all of the time. The team here at Deeply Human solicited some lies from friends and colleagues and strangers on the Internet. And here's one that I found very clever.

So my senior year of college, I had a truly all-consuming crush on the teaching assistant for the photography seminar that I was taking. And near the end of the year, I was applying for jobs. I got a bunch of professors to write me letters of recommendation, and once I had them all, I was like, you know, this is a really easy way to get people to give me a bunch of compliments. So I told the TA that I needed an additional letter of recommendation from him, even though I did not.

Our liar Allison found a way to extract two full written pages of compliments from her crush.
If you could somehow force a letter of rec out of someone in your life now, in the same circumstances, who would it be? Oh my god, um, Curtis Sittenfeld, maybe. Me too! She's the best. God, she's my favorite. How did you get so smart in just one life? I have absolutely no idea, but every time I read one of her books, I'm just like, every sentence is perfect. How?

Listener, if you're unfamiliar, Curtis Sittenfeld is the name of a best-selling fiction writer. Highly recommend her short stories. All right, back to Dan Ariely, our lying expert, on how we conceptualize dishonesty.

So in general, when we think about dishonesty, we think about people as being either good or bad. We tend to consider honesty as a character trait, a fixed feature. Like: I am good at languages and bad at directions, I prefer cheap milk chocolate to the fancy dark stuff, and I am honest. But this kind of thinking can cause problems, because if we consider our honesty as intrinsic and constant, then it's like a tattoo. It's permanently part of who we are. And with that attitude, we might get lax in policing our actual behavior.
And we're really good at preserving the idea of ourselves as honest people even while we're telling a lie.

Imagine that you're late for meeting some friends, right? You could say, oh, you know, I left too late, or I didn't check my watch, or something like that. Or you could say the subway was too busy. You have a motivation to lie, and what our amazing brain can do is allow us to rationalize things that are not exactly perfect. So you can go to your friends and say, oh yes, you know, the traffic was terrible. Now, it's not really the traffic that is terrible, but the traffic was not great. So you basically add this up, and in your mind, the moment you've finished telling this lie, you're not a liar. You just slightly exaggerated. As we tell more and more lies of a certain type, we get more and more used to it, and by the end our brain stops reacting.
Dan and several colleagues put research subjects into an fMRI machine to watch their brain activity when they behaved dishonestly, and they found that usually, when someone first lied for personal gain, they felt bad about it, and there was this big bloom of activity in the amygdala, a region of the brain associated with fear and emotion. But as people continued to lie, that brain activity declined. And this may be evidence that lying is a slippery slope, where telling little lies can desensitize us and make it easier to tell bigger ones in the future.

But the news from Dan's research isn't all bad. He also found ways to promote honest behavior by priming people with cues that put morality top of mind.

We asked people to do things like try to recall the Ten Commandments or write down an honor code, and we saw that those things increased honesty.

This might have implications for how our important documents are designed. A small change in the layout of our forms might get us to fill them out more honestly. Will you tell me about where the signature line should be on a form?
So the signature line should be at the beginning of the form. Why? Okay, so think about what we usually do when we fill out a form. You fill it out, and then at the end it says, please sign. You know, the lying has been done already. It's over. You did it. You rationalized it, you forgot about it. Instead, what priming tells us is that we should start by getting people to be at a higher level of thinking about their morality, right? So what we want is to get people to be aware of their own honesty, so we want people to do it first.

At this point, if you're feeling pangs of guilt about having been dishonest in the past, well, here's another thought: you might be able to blame your mom or dad.

We teach our kids to be dishonest in the social politeness world. It's the first time that you tell a kid, don't always tell the truth. For example, you don't want to offend somebody else's feelings. You don't like this kid? Don't tell it to them. Other things are more important.
So I'm wondering whether this is kind of the beginning of the way that we teach kids this very, very delicate calculus about how much to care about the truth, how much to care about themselves, and how much to care about other people, and how to make this very delicate trade-off.

You don't want your kid at the mall announcing who's old and who's smelly in a fit of unchecked honesty.

On the other hand, a lot of parents actually talk to us about how they are so worried that kids are lying at such a young age, and they're worried they're going to turn into some kind of psychopaths when they grow up. My name is Kang Lee. I'm a professor at the University of Toronto. I've been studying how children learn to tell lies for the last twenty-five years.

Kang Lee himself was not a master of deception as a boy.

I was a terrible liar as a child. So my sister always could figure out when I lied, and she'd always say, you know, when you lie, your eyes turn red. I don't know if that's true or not, but that's how she caught me.
In a lab outfitted with hidden cameras, Kang Lee and his team tried to discover why some kids start lying earlier than others.

We wanted to find out, you know, who are these two-year-olds who already lie at this tender age. You know, the majority of them actually were very honest. So then we measured their IQ. There's no difference between those kids who lied and those kids who did not lie. And then we looked at their moral understanding of lying. You know, there's no difference. So then, well, maybe it's just gender. You know, maybe boys are more likely to lie than girls, or vice versa. And it turned out that there's no difference between boys and girls. So then we looked closely at two very important cognitive skills.

Tiny kids don't fully grasp that other people are having their own, unique, subjective experiences. They just presume that you know their grandma, that you're also caught up on the most recent episodes of Paw Patrol, that you, too, might enjoy a bite of room-temperature pureed pumpkin. But to lie, they've got to develop what's called theory of mind.
They have to realize that each of us has different motives and thoughts and perceptions. They have to understand that the thoughts in their heads aren't the thoughts in Mom's, or yours, or mine. They've also got to develop executive function that allows them to strategize and inhibit any impulse to just blurt out the truth. Both theory of mind and executive function take time to develop.

We know that kids begin to tell lies around two and a half years of age, but at that time only a minority of two-and-a-half-year-olds would lie. But when the child reaches three years of age, about half of kids will lie. By the time the kids reach four years of age, about eighty percent of kids will lie. And by seven years of age, almost all kids would lie. Basically, the story is, almost all kids lie at some point in their life, and there seems to be this kind of very universal developmental trajectory. And it turned out these young two-year-olds who lied had much better executive functioning abilities than those kids who were honest.

So it's almost like there's a precociousness that's indicated by lying early.
It's not like these two-year-olds are all smoking cigarettes behind the schoolyard in little leather jackets. It's that they're developing theory of mind early. Yeah, exactly.

So, am I correct in thinking you're a father? When I hang around my friends who have young kids, it's easy for them to brag a little bit, about, like, my kid's recognizing shapes and letters and she's only nine months old, and my kid is walking, and my kid is reading, and we're beating all the milestones. Were you, like, hoping that your kid was gonna lie really early?

So, because I have a lab for testing kids' lying, when Nathan turned three (that's his name), exactly on his third birthday, I brought him to my lab and I had my students test him, and I watched with anxiety. You know, I thought, is he going to be honest, or is he going to tell a lie? And he lied, so I was very happy.

So he figured out, you know, theory of mind and figured out executive functioning at three years of age? Exactly. I even have a video recording of that.
So you throw a hell of a birthday party. Indeed.

When my son was little, I told him that when the ice cream van drove down our street playing its music, that meant it had run out of ice cream. Mealtimes are a classic setting for parental lies. No, that is clearly not an airplane. That is a rubberized spoon, Dad, if that is your real name.

I lie to my daughter all day long. Okay, how old is your daughter? Two. And, like, what are the kinds of lies that you're telling your daughter? Oh, this tastes great. Yeah, you can definitely have some vegetables.

The woman gleefully deceiving her daughter is Sophie van der Zee of the Erasmus School of Economics in the Netherlands. She runs experiments to study how we lie to one another.

I remember my mom lying to me when I was given a jump rope as a present as a little girl. She was worried that her muscular, not-very-coordinated daughter was going to spin herself upside down and knock her teeth out with it.
And so she lied to me and told me that the way to use the toy was to lay it gently on the ground and then jump all around it. And so when my dad came home, he found me playing with my jump rope as one would stamp out a fire. But I could never tell when my mom was lying. And if she still lies, then I still can't.

Well, we're basically terrible at lie detection, and we're quite good at lying, and that's quite problematic.

Experiments suggest that humans suck at spotting lies. Like, genuinely suck. When research participants were asked to identify the liars among a group of people, half of whom were telling the truth and half of whom were lying, they were right only fifty-four percent of the time, which is an abysmal success rate. It's just barely better than chance. And as for all that stuff on forensic TV shows, forget about it.

One of the first claims you usually encounter is that liars look away. Something you often hear is that they look to the top left corner.
They behave a bit sort of shifty. But those are not actually cues to deceit, as we know from research and from meta-analyses. That doesn't stop people from thinking they are cues, though. One of the cues that we did find is that looking someone, like, really in the eye, like sort of uncomfortable eye contact, can be a cue to deceit. So that makes it really complicated. And one of the other problems is that lying is something you get better at with practice. So the more you lie, the emotional responses to the lie become less and less severe.

Yeah. When I was five or six, my uncle took me aside, and he had a chalkboard, and he did all these drawings of all the trees that we could see around where he lived. And he said, have you ever seen the trees at night? And I said, well, no, because I'm asleep. And he said, well, you know what happens to trees at night?
And I said no, and he said, well, they turn upside down and they're actually rockets, and they fly into space, and then they come back before the morning and get planted again, and that's what trees do.

Polygraph machines, the favorite lie detection devices of TV dramas, are legitimately better than humans at spotting deceit, but they're not perfectly reliable either. A polygraph doesn't directly test the truthfulness of a statement. It measures our physical arousal through blood pressure and breathing patterns and perspiration. The idea is that lying usually stresses us out, and we can measure stress. But sometimes, of course, we can be stressed out for other reasons as well, so it's not a perfect indicator. And polygraph examination involves some interpretation. Two different examiners given the exact same polygraph output might come back with conflicting results.

As a kid, I designed my own lie detector test to be used exclusively on my little brother, Max.
I drew a series of buttons and dials onto an index card, added a bright red circle where the suspected liar was to place his thumb, and my tiny brown-eyed brother dutifully submitted, pressing his thumb on the circle, at which point I whisked the index card into my room, shut the door, and turned on my lie detector machine, which ran with a horrible roar. In truth, I was really running the vacuum cleaner.

Listener, I cannot tell you how guilty I am in recalling this story. Maxie, I am so sorry. Next time we hang out, I am buying every round. As Professor Kang Lee pointed out, children's first lies should be celebrated as a milestone, not interrogated with the Hoover. But by the time you grow up, you might lie more than you realize.

In an experimental setting where people had to talk for ten minutes to someone they didn't know, in a waiting room that was secretly recorded, people already lied twice on average in this ten-minute conversation with a stranger. So we all lie much more than we think we do.
So there was an experimental design that put people in a room with a stranger for ten minutes, and even in the course of that ten-minute conversation, they clocked two lies on average. And what's interesting is they didn't have anyone to impress. They were just in the waiting room, waiting to take part in an experiment, not realizing that this was already the experiment, that they were secretly being filmed. So to your point, you're saying there's nothing to be gained, right? Like, this isn't your boss, this isn't your mom, this isn't your lover. There are no real social stakes here, and we still do it. We still lie.

Remember Allison, the liar that we met at the very beginning of this episode, the one who got the letter of rec from a crush? We dragged her back into the studio to present a little gift for serving as the liar laureate of this episode of Deeply Human.

Okay, well, thanks so much, Allison, for coming in again. No problem. Why am I back? Actually, I just wanted to play you some audio.
Oh, all right.

To whomsoever it may concern: I hereby commend Allison Cherry on the sly design and flawless execution of her elegant con. May her work serve as an inspiration for future generations of scheming, lovesick coeds. Sincerely, Curtis Sittenfeld.

Are you kidding me? What? How did you make that happen? I wish you could see my face right now. Thank you so much for that.

Curtis Sittenfeld, in addition to being a best-selling fiction writer, you are a total boss. And next time we meet, I am buying every round. Allison, for her part, has zero regrets about lying to obtain her letter from her crush-worthy TA, and she rereads it every few years. Kang Lee, our expert on children's lies, thinks that lying can actually be an important part of a happy life.

The fact that lies exist in our society is because sometimes they serve as a lubricant of social interaction, and that makes our lives better. But put yourself in this situation. Just think about this: if you are honest all the time, every minute, think about what kind of person you're going to be. Just try it for one whole day.
I can guarantee you you're 339 00:20:07,240 --> 00:20:13,080 Speaker 1: going to be friendless, spouseless, and then maybe jobless. 340 00:20:13,160 --> 00:20:15,480 Speaker 1: You know, you just, nobody would like you if you're 341 00:20:15,560 --> 00:20:18,520 Speaker 1: always very, very honest. You know, you mentioned that you 342 00:20:18,520 --> 00:20:21,560 Speaker 1: were a lousy liar when you were a kid. Are 343 00:20:21,600 --> 00:20:24,800 Speaker 1: you any better now? No, I'm not. You know what's 344 00:20:24,800 --> 00:20:26,359 Speaker 1: funny is I don't know if I should believe you 345 00:20:26,440 --> 00:20:31,960 Speaker 1: or not. There's no way, there's no way. Embarrassingly, I'm 346 00:20:32,000 --> 00:20:35,040 Speaker 1: pretty sure that I lied during the course of this podcast. 347 00:20:35,480 --> 00:20:38,680 Speaker 1: Remember when Dan Ariely, the Duke professor, asked me about 348 00:20:38,840 --> 00:20:41,639 Speaker 1: dating sites and I admitted to shaving off a few pounds. 349 00:20:42,160 --> 00:20:46,080 Speaker 1: After we spoke, I thought to myself, wait, which dating 350 00:20:46,160 --> 00:20:49,040 Speaker 1: site asks for an exact weight? I mean, I know 351 00:20:49,119 --> 00:20:51,080 Speaker 1: I fibbed on that stuff before, but I think it 352 00:20:51,200 --> 00:20:53,720 Speaker 1: was like on my driver's license. I just, I didn't 353 00:20:53,720 --> 00:20:55,720 Speaker 1: want to interrupt the flow of the conversation, and I 354 00:20:55,760 --> 00:20:58,240 Speaker 1: didn't stop to really make sure I had my facts straight. 355 00:20:58,800 --> 00:21:01,640 Speaker 1: And just like that, I'm sucked up in this meta-vortex, 356 00:21:01,800 --> 00:21:06,400 Speaker 1: lying to a lying expert about lying. It's not all intentional, 357 00:21:07,560 --> 00:21:13,280 Speaker 1: it's not all perfectly calculated. 
The temptations to be dishonest, right, 358 00:21:13,320 --> 00:21:15,280 Speaker 1: they just pop up all the time, and that means we 359 00:21:15,320 --> 00:21:21,640 Speaker 1: need to be very careful. That asks for vigilance, tremendous vigilance. Dan, 360 00:21:22,440 --> 00:21:25,399 Speaker 1: I'm sorry about the fib. If we ever meet in person, 361 00:21:25,800 --> 00:21:28,000 Speaker 1: I'm buying every round. Man, this is becoming like the 362 00:21:28,040 --> 00:21:32,560 Speaker 1: most expensive podcast ever. Learning about Dan's research on how 363 00:21:32,600 --> 00:21:35,800 Speaker 1: little transgressions can pave the way for bigger ones, it's 364 00:21:35,880 --> 00:21:38,600 Speaker 1: changed the way I behave just a little bit. I'm 365 00:21:38,640 --> 00:21:40,680 Speaker 1: more likely to pause during an anecdote to get the 366 00:21:40,760 --> 00:21:44,120 Speaker 1: names right, even if it dampens the comic timing. And 367 00:21:44,200 --> 00:21:46,840 Speaker 1: when declining an invitation to hang out, I'm trying to 368 00:21:46,960 --> 00:21:49,360 Speaker 1: lean less on the old excuse of having work to do, 369 00:21:49,520 --> 00:21:52,720 Speaker 1: which is always sort of true, but rarely the whole story. 370 00:21:52,920 --> 00:21:55,119 Speaker 1: And if you'd like to lie less, it might be 371 00:21:55,240 --> 00:21:59,119 Speaker 1: that honest answers about the daily, unimportant stuff can serve 372 00:21:59,160 --> 00:22:01,439 Speaker 1: as cross-training for the muscles that we need to 373 00:22:01,480 --> 00:22:05,479 Speaker 1: answer the hard questions truthfully. If lying is a slippery slope, 374 00:22:05,640 --> 00:22:09,399 Speaker 1: we can at least wear cleats. So to coworker Andy, 375 00:22:10,080 --> 00:22:13,960 Speaker 1: I don't love the haircut, but you are awesome and 376 00:22:14,119 --> 00:22:16,280 Speaker 1: I like you no matter what is on or near 377 00:22:16,359 --> 00:22:20,280 Speaker 1: your head. 
To the Minnesota Department of Motor Vehicles: at 378 00:22:20,440 --> 00:22:23,000 Speaker 1: last weigh-in, I was one hundred and forty-six 379 00:22:23,040 --> 00:22:26,880 Speaker 1: and a half pounds. To my mom: you totally 380 00:22:26,920 --> 00:22:29,200 Speaker 1: made the right call about the jump rope. I would 381 00:22:29,200 --> 00:22:31,960 Speaker 1: have knocked my theory of mind right out the back 382 00:22:32,000 --> 00:22:37,240 Speaker 1: of my little head. On the next Deeply Human, we'll 383 00:22:37,280 --> 00:22:39,560 Speaker 1: ask why do humans have sex at times that our 384 00:22:39,560 --> 00:22:45,480 Speaker 1: fellow mammals don't. So for most female mammals, it's two 385 00:22:45,600 --> 00:22:48,000 Speaker 1: or three days out of a cycle that they have sex, 386 00:22:48,480 --> 00:22:51,919 Speaker 1: and otherwise they're not willing to have sex with males 387 00:22:51,960 --> 00:22:55,919 Speaker 1: that initiate it. And they really aren't that attractive to 388 00:22:55,960 --> 00:23:00,359 Speaker 1: males either. Males seem to know. Deeply Human is a 389 00:23:00,440 --> 00:23:05,040 Speaker 1: BBC World Service and American Public Media co-production with iHeartMedia, 390 00:23:05,400 --> 00:23:07,240 Speaker 1: and it's hosted by me, Dessa