1 00:00:03,120 --> 00:00:06,519 Speaker 1: You're listening to Part Time Genius, a production of Kaleidoscope 2 00:00:06,640 --> 00:00:09,280 Speaker 1: and iHeartRadio. 3 00:00:11,720 --> 00:00:12,440 Speaker 2: Guess what, Mango? 4 00:00:12,640 --> 00:00:13,280 Speaker 1: What's that, Will? 5 00:00:13,440 --> 00:00:15,440 Speaker 2: So I came across this study, and you know I 6 00:00:15,480 --> 00:00:19,240 Speaker 2: love a good study, Mango. But this was from researchers 7 00:00:19,280 --> 00:00:22,200 Speaker 2: at the University of Wisconsin-La Crosse, and you know I 8 00:00:22,239 --> 00:00:25,840 Speaker 2: love University of Wisconsin-La Crosse. And this is a pretty 9 00:00:25,880 --> 00:00:28,400 Speaker 2: weird one. So here's what they did. They asked six 10 00:00:28,560 --> 00:00:32,920 Speaker 2: hundred and thirty two people to document every single lie 11 00:00:33,000 --> 00:00:37,040 Speaker 2: they told over the course of three months. That is incredible. 12 00:00:37,040 --> 00:00:40,159 Speaker 2: So what did they find out? Well, among other things, they 13 00:00:40,159 --> 00:00:43,279 Speaker 2: discovered that the number one reason people lied was to 14 00:00:43,360 --> 00:00:46,159 Speaker 2: avoid others. Oh, I mean that kind of makes sense 15 00:00:46,200 --> 00:00:48,520 Speaker 2: to me. Yeah, it does. And the second most common 16 00:00:48,560 --> 00:00:51,800 Speaker 2: reason people lied was for jokes or pranks, which also makes... 17 00:00:51,640 --> 00:00:54,600 Speaker 1: Sense, like the time you told me the CEO of 18 00:00:54,680 --> 00:00:56,520 Speaker 1: Vegemite was a huge fan of the show and wanted 19 00:00:56,520 --> 00:00:59,840 Speaker 1: to send us a lifetime supply of Vegemite. That is, 20 00:01:00,160 --> 00:01:02,760 Speaker 1: that's a good example of a lie, yep. But from 21 00:01:02,800 --> 00:01:06,240 Speaker 1: what you're describing, these are fairly like harmless lies, right? 22 00:01:06,280 --> 00:01:10,039 Speaker 1: So is the lesson that lying isn't necessarily so bad? 23 00:01:10,680 --> 00:01:12,920 Speaker 2: I do like that that's your conclusion. I mean, the 24 00:01:13,040 --> 00:01:17,000 Speaker 2: lesson isn't quite that simple. Like, it's a highly complex 25 00:01:17,080 --> 00:01:20,400 Speaker 2: form of social interaction that can serve many different purposes, 26 00:01:20,880 --> 00:01:23,360 Speaker 2: and so lies can range from lighthearted and funny to 27 00:01:23,640 --> 00:01:27,640 Speaker 2: of course malicious to downright dangerous, depending on the intention 28 00:01:27,800 --> 00:01:31,280 Speaker 2: behind them and the cultural context too. So people have 29 00:01:31,319 --> 00:01:34,720 Speaker 2: been trying to understand lying for thousands of years, and 30 00:01:34,760 --> 00:01:38,160 Speaker 2: it definitely merits more research, or at least a podcast episode, 31 00:01:38,200 --> 00:01:38,880 Speaker 2: I'd say. 32 00:01:38,959 --> 00:01:41,520 Speaker 1: So, Will, I do have something to confess. 33 00:01:42,400 --> 00:01:43,520 Speaker 2: Okay, I'm nervous. 34 00:01:43,800 --> 00:01:46,920 Speaker 1: I read that University of Wisconsin study last week, 35 00:01:46,959 --> 00:01:49,040 Speaker 1: and I just pretended I didn't know about it, so 36 00:01:49,120 --> 00:01:50,480 Speaker 1: this show would sound better.
37 00:01:51,360 --> 00:01:54,400 Speaker 2: Oh man, well, you know what, I actually was lying too, 38 00:01:54,480 --> 00:01:56,600 Speaker 2: because I knew that you had read the University of 39 00:01:56,600 --> 00:01:59,240 Speaker 2: Wisconsin study and I pretended that I didn't. So we're 40 00:01:59,320 --> 00:02:03,240 Speaker 2: just off to an interesting start. But actually it's 41 00:02:03,240 --> 00:02:05,400 Speaker 2: the perfect way to kick off an episode about lying. 42 00:02:05,760 --> 00:02:08,000 Speaker 2: There is a whole lot to talk about, and I 43 00:02:08,040 --> 00:02:10,639 Speaker 2: guarantee from this point forward we will be one hundred 44 00:02:10,720 --> 00:02:35,240 Speaker 2: percent truthful. So let's dive in. Hey there, podcast listeners, 45 00:02:35,280 --> 00:02:38,120 Speaker 2: welcome to Part Time Genius. I'm Will Pearson, and as always, 46 00:02:38,120 --> 00:02:40,520 Speaker 2: I'm joined by my good friend Mangesh Hattikudur, and 47 00:02:40,639 --> 00:02:43,840 Speaker 2: over there in the booth waving a red hot poker, 48 00:02:44,240 --> 00:02:46,680 Speaker 2: that's our pal and producer Dylan Fagan, and he's actually 49 00:02:46,720 --> 00:02:49,600 Speaker 2: waving it pretty aggressively. I'm not really sure what he's doing, 50 00:02:49,639 --> 00:02:52,040 Speaker 2: but it looks a little bit dangerous, if I'm being honest. 51 00:02:52,200 --> 00:02:52,799 Speaker 2: I am. 52 00:02:52,720 --> 00:02:55,399 Speaker 1: Glad that he's wearing oven mitts at least. But believe 53 00:02:55,480 --> 00:02:57,359 Speaker 1: it or not, I think I know what he's trying 54 00:02:57,400 --> 00:02:59,960 Speaker 1: to demonstrate. So, okay. Back in the Middle Ages, 55 00:03:00,400 --> 00:03:04,919 Speaker 1: lying was considered a terrible sin, and people, specifically Christian people, 56 00:03:04,960 --> 00:03:07,040 Speaker 1: believed that the only way to know if someone was 57 00:03:07,120 --> 00:03:10,960 Speaker 1: lying was by testing them with a trial by ordeal. So, 58 00:03:11,440 --> 00:03:14,160 Speaker 1: in other words, the accused was put through a physical challenge, and, 59 00:03:14,200 --> 00:03:17,400 Speaker 1: according to the logic at the time, if they were honest, 60 00:03:17,800 --> 00:03:20,760 Speaker 1: God would prevent them from getting hurt. Now, one of 61 00:03:20,800 --> 00:03:22,840 Speaker 1: the ordeals was touching the person's tongue to a red 62 00:03:22,880 --> 00:03:26,480 Speaker 1: hot poker. Obviously, this resulted in a lot of burnt tongues, 63 00:03:26,720 --> 00:03:29,760 Speaker 1: which was seen as proof of dishonesty. Yikes. 64 00:03:29,919 --> 00:03:32,640 Speaker 2: All right, well, thank you, Dylan, for that terrifying visual. 65 00:03:32,680 --> 00:03:35,680 Speaker 2: It is not common that we see Dylan looking terrifying, 66 00:03:35,720 --> 00:03:38,000 Speaker 2: as he's one of the nicest humans on the planet. But 67 00:03:38,080 --> 00:03:40,480 Speaker 2: good job there, Dylan. All right. So, Mango, it sounds 68 00:03:40,560 --> 00:03:43,160 Speaker 2: like people took lying very seriously back then. 69 00:03:43,360 --> 00:03:47,080 Speaker 1: They definitely did. So, other trials by ordeal included walking 70 00:03:47,080 --> 00:03:51,119 Speaker 1: over burning coals and being forced to swallow poison. And 71 00:03:51,320 --> 00:03:53,920 Speaker 1: if you're thinking, wow, that seems kind of extreme, this 72 00:03:54,120 --> 00:03:57,200 Speaker 1: actually all traces back to Saint Augustine.
He wrote a 73 00:03:57,200 --> 00:04:01,960 Speaker 1: famous treatise on lying called On Lying, and Augustine argued that 74 00:04:02,080 --> 00:04:06,240 Speaker 1: lying is like a disease. It supposedly destroys your integrity 75 00:04:06,320 --> 00:04:09,560 Speaker 1: and puts you in direct opposition to God. He's also 76 00:04:09,640 --> 00:04:11,520 Speaker 1: the guy who came up with the idea that if 77 00:04:11,560 --> 00:04:14,120 Speaker 1: you always tell the truth, God will actually swoop in 78 00:04:14,200 --> 00:04:17,520 Speaker 1: to protect you from harm. But medieval Europe didn't 79 00:04:17,520 --> 00:04:19,840 Speaker 1: invent the idea of testing for truth. They just kind 80 00:04:19,839 --> 00:04:22,919 Speaker 1: of took it to this new, more brutal level. In 81 00:04:23,160 --> 00:04:26,159 Speaker 1: ancient China, accused liars had to chew a mouthful of 82 00:04:26,240 --> 00:04:29,839 Speaker 1: dry rice while someone recited the accusations against them, and 83 00:04:29,880 --> 00:04:32,560 Speaker 1: then they'd spit the rice out, and if it seemed 84 00:04:32,600 --> 00:04:34,680 Speaker 1: too dry when they spit it out, because of their 85 00:04:34,760 --> 00:04:38,200 Speaker 1: dry mouth from nervousness, then that was considered proof that 86 00:04:38,240 --> 00:04:41,600 Speaker 1: they had lied. What a weird way to test if 87 00:04:41,600 --> 00:04:45,719 Speaker 1: somebody's lying. Yeah, these early lie detectors were very, very ineffective, 88 00:04:45,800 --> 00:04:48,159 Speaker 1: but I guess it shows that concern about lying is 89 00:04:48,200 --> 00:04:52,800 Speaker 1: practically universal. There's actually this ancient Hindu text, the Yajurveda, 90 00:04:53,440 --> 00:04:56,159 Speaker 1: and it dates back to one thousand BCE, and it 91 00:04:56,200 --> 00:04:58,960 Speaker 1: has tips for royalty who are worried that their servants 92 00:04:58,960 --> 00:05:03,920 Speaker 1: are undercover mercenaries lying about their allegiance. So it says, quote, 93 00:05:04,040 --> 00:05:07,080 Speaker 1: a person who intends to poison food may be recognized. 94 00:05:07,320 --> 00:05:10,400 Speaker 1: He does not answer questions, or they are evasive answers. 95 00:05:10,680 --> 00:05:14,040 Speaker 1: He speaks nonsense, rubs the great toe along the ground 96 00:05:14,080 --> 00:05:17,640 Speaker 1: and shivers. His face is discolored. He rubs the roots 97 00:05:17,640 --> 00:05:19,680 Speaker 1: of his hair with his fingers, and he tries by 98 00:05:19,800 --> 00:05:21,719 Speaker 1: every means to leave the house. 99 00:05:22,960 --> 00:05:24,560 Speaker 2: It kind of feels like they would not be very 100 00:05:24,600 --> 00:05:26,640 Speaker 2: good mercenaries if they're doing all that stuff. 101 00:05:26,680 --> 00:05:29,479 Speaker 1: I know, but this gets at something really interesting, which 102 00:05:29,520 --> 00:05:33,240 Speaker 1: is that throughout history, people have believed that observing someone's 103 00:05:33,279 --> 00:05:36,880 Speaker 1: nonverbal behaviors, cues like fidgeting or 104 00:05:36,960 --> 00:05:39,880 Speaker 1: rubbing their nose, is an easy way to tell if 105 00:05:39,880 --> 00:05:42,680 Speaker 1: they're lying. So you don't really need the hot pokers 106 00:05:42,760 --> 00:05:44,920 Speaker 1: or the vials of poison, I guess.
107 00:05:45,200 --> 00:05:48,080 Speaker 2: And that's basically the idea behind lie detectors, looking for 108 00:05:48,080 --> 00:05:49,040 Speaker 2: those sorts of clues. 109 00:05:49,120 --> 00:05:53,239 Speaker 1: Right. Yeah, so modern lie detectors, or polygraphs, were invented 110 00:05:53,279 --> 00:05:55,960 Speaker 1: in the early nineteen hundreds, but some of the technology 111 00:05:56,040 --> 00:05:59,200 Speaker 1: they use was developed in the eighteen hundreds. And all 112 00:05:59,240 --> 00:06:02,200 Speaker 1: they really do is measure changes in certain vital signs, 113 00:06:02,240 --> 00:06:06,880 Speaker 1: including things like blood pressure, breathing, perspiration. And although they're 114 00:06:06,880 --> 00:06:10,560 Speaker 1: still used by law enforcement and government agencies, they're widely 115 00:06:10,600 --> 00:06:14,560 Speaker 1: considered a pseudoscience. And that's because the physiological changes they 116 00:06:14,560 --> 00:06:16,520 Speaker 1: measure can be triggered by lots of things, 117 00:06:16,680 --> 00:06:18,440 Speaker 1: not just the stress of lying. 118 00:06:18,560 --> 00:06:19,960 Speaker 2: Well, if you think about it, like the fear 119 00:06:20,000 --> 00:06:22,400 Speaker 2: of being arrested or the stress of being interrogated. 120 00:06:22,560 --> 00:06:24,919 Speaker 1: Right. In fact, the ACLU has opposed the use of 121 00:06:24,960 --> 00:06:28,080 Speaker 1: polygraph machines for decades, saying they don't work and they're 122 00:06:28,120 --> 00:06:31,240 Speaker 1: a violation of people's privacy rights. And remember the machine 123 00:06:31,240 --> 00:06:34,600 Speaker 1: itself doesn't offer any conclusions. It just kind of records 124 00:06:34,600 --> 00:06:36,760 Speaker 1: this data for people. But this is the part I 125 00:06:36,760 --> 00:06:40,680 Speaker 1: didn't realize: a human polygraph examiner actually has 126 00:06:40,720 --> 00:06:44,320 Speaker 1: to interpret that data to decide what it means. Is 127 00:06:44,360 --> 00:06:46,840 Speaker 1: that a normal spike in sweating because the room is hot, 128 00:06:46,960 --> 00:06:49,840 Speaker 1: or does it indicate a lie? Which means the results 129 00:06:49,839 --> 00:06:52,800 Speaker 1: of a polygraph test can actually be tainted by human... 130 00:06:52,520 --> 00:06:56,520 Speaker 2: Bias. I mean, like crazy bias. It does feel like 131 00:06:57,160 --> 00:06:59,559 Speaker 2: it's just so weird that in the twenty first century, 132 00:06:59,600 --> 00:07:03,560 Speaker 2: we're using what seems like the same lie detection concepts 133 00:07:03,600 --> 00:07:06,159 Speaker 2: ancient people did. We just exchanged the dry rice for 134 00:07:06,400 --> 00:07:07,600 Speaker 2: some sort of fancy machine. 135 00:07:07,600 --> 00:07:08,760 Speaker 1: I guess, yeah, pretty much. 136 00:07:09,840 --> 00:07:12,360 Speaker 2: All right. Well, you know, regardless of what goes on physically, 137 00:07:12,480 --> 00:07:15,520 Speaker 2: telling a lie does require a whole bunch of complicated 138 00:07:15,600 --> 00:07:18,320 Speaker 2: brain activity. Like first you have to come up with 139 00:07:18,360 --> 00:07:22,120 Speaker 2: the lie and mentally detach yourself from the truth. Then 140 00:07:22,160 --> 00:07:24,640 Speaker 2: you have to choose your words and behaviors carefully,
so as 141 00:07:24,680 --> 00:07:28,000 Speaker 2: to sell the lie and avoid suspicion, and that requires 142 00:07:28,080 --> 00:07:31,800 Speaker 2: understanding what your audience knows now and then making guesses 143 00:07:31,840 --> 00:07:35,040 Speaker 2: about what they might discover later. Then you have to 144 00:07:35,080 --> 00:07:37,720 Speaker 2: process the response to the initial lie and spin out 145 00:07:37,760 --> 00:07:40,400 Speaker 2: more lies as you need to back yourself up. The 146 00:07:40,440 --> 00:07:43,760 Speaker 2: scientific term for all of this is cognitive load. 147 00:07:44,120 --> 00:07:46,760 Speaker 1: That is so funny. Like, it feels like lying comes 148 00:07:46,840 --> 00:07:49,680 Speaker 1: so naturally at this point, and you need it to, right? 149 00:07:49,720 --> 00:07:52,440 Speaker 1: But like it sounds exhausting when you put it that way. 150 00:07:52,560 --> 00:07:55,120 Speaker 1: So does this mean like there's a specific area of 151 00:07:55,120 --> 00:07:57,600 Speaker 1: the brain that controls our ability to lie? 152 00:07:58,680 --> 00:08:01,440 Speaker 2: You know, actually it's the opposite. Because there's 153 00:08:01,480 --> 00:08:04,920 Speaker 2: this heavy cognitive load, lying involves multiple areas of the 154 00:08:04,960 --> 00:08:08,280 Speaker 2: brain really working together. So these are areas responsible for 155 00:08:08,320 --> 00:08:12,200 Speaker 2: critical stuff like self-awareness and social cognition. And one 156 00:08:12,200 --> 00:08:14,200 Speaker 2: thing that really fascinates me is the fact that lying 157 00:08:14,240 --> 00:08:18,000 Speaker 2: can actually change our memories. That's because memories aren't set 158 00:08:18,040 --> 00:08:20,240 Speaker 2: in stone, and so the way we remember something can 159 00:08:20,280 --> 00:08:23,480 Speaker 2: be distorted over time, and if a lie is repeated 160 00:08:23,600 --> 00:08:26,720 Speaker 2: often enough, the truth is then suppressed. It can actually 161 00:08:26,720 --> 00:08:28,360 Speaker 2: distort our memory of an event. 162 00:08:29,200 --> 00:08:31,960 Speaker 1: It also explains a lot about history and also like 163 00:08:32,000 --> 00:08:34,440 Speaker 1: collective memories, right? It really does. 164 00:08:34,559 --> 00:08:36,600 Speaker 2: Yeah, I mean you might be wondering how and when 165 00:08:36,640 --> 00:08:39,640 Speaker 2: do we develop the ability to lie? And it turns 166 00:08:39,640 --> 00:08:42,080 Speaker 2: out there's a whole body of research about little kids 167 00:08:42,120 --> 00:08:44,320 Speaker 2: and lying. And we're going to chat more about all 168 00:08:44,360 --> 00:08:51,040 Speaker 2: of that right after a quick break. 169 00:08:55,040 --> 00:09:02,800 Speaker 1: Okay, well, so I know you've done a ton 170 00:09:02,840 --> 00:09:05,120 Speaker 1: of research about kids' brains and how they develop the 171 00:09:05,120 --> 00:09:07,280 Speaker 1: ability to lie. But first I have to ask, like, 172 00:09:08,160 --> 00:09:09,920 Speaker 1: did you tell lies when you were a kid? Or, 173 00:09:10,320 --> 00:09:12,679 Speaker 1: you know, can you remember what your kids' first lies 174 00:09:12,720 --> 00:09:13,800 Speaker 1: were to you? 175 00:09:13,800 --> 00:09:17,240 Speaker 2: You know, I know that I did.
I don't remember 176 00:09:17,280 --> 00:09:19,800 Speaker 2: a whole lot of things specifically, but I remember that 177 00:09:19,920 --> 00:09:22,800 Speaker 2: the lies that I would tell were usually just like 178 00:09:22,960 --> 00:09:26,840 Speaker 2: ridiculous stories, like coming in from playing outside or something 179 00:09:26,880 --> 00:09:30,319 Speaker 2: and then just making up some absurd story. Probably 180 00:09:30,400 --> 00:09:32,480 Speaker 2: deep down I felt like my parents knew that I 181 00:09:32,559 --> 00:09:35,560 Speaker 2: was lying, but I'd still go for it anyway. But 182 00:09:35,840 --> 00:09:36,360 Speaker 2: how about you? 183 00:09:37,080 --> 00:09:41,240 Speaker 1: Uh, yeah, I remember like being four and telling my 184 00:09:41,280 --> 00:09:44,360 Speaker 1: mom that I went to see Superman at the movie 185 00:09:44,360 --> 00:09:46,720 Speaker 1: theater and like pretending to fly around like him, but 186 00:09:46,760 --> 00:09:48,679 Speaker 1: I'd actually seen Return of the Jedi. And when she 187 00:09:48,760 --> 00:09:54,559 Speaker 1: found out, she was upset, not about me pretending to see Superman, 188 00:09:54,679 --> 00:09:57,120 Speaker 1: just because she thought this like innocent kid could never 189 00:09:57,200 --> 00:09:57,600 Speaker 1: lie to her. 190 00:09:58,400 --> 00:10:01,559 Speaker 2: Oh yeah, that's a heartbreak moment for Mom. 191 00:10:01,920 --> 00:10:05,200 Speaker 1: Ruby actually had an amazing first lie that I remember. Like, 192 00:10:05,320 --> 00:10:08,000 Speaker 1: Ruby was also about four at this time, and Henry 193 00:10:08,040 --> 00:10:11,120 Speaker 1: had had a birthday party at his school, and so 194 00:10:11,679 --> 00:10:13,520 Speaker 1: Ruby was like, oh yeah, we had cake at my 195 00:10:13,679 --> 00:10:16,200 Speaker 1: school too. I was like, oh, was there a birthday party? 196 00:10:16,360 --> 00:10:20,079 Speaker 1: And Ruby was like, no, no, no birthday party. And 197 00:10:20,720 --> 00:10:22,400 Speaker 1: I said, oh yeah, then why was there cake? 198 00:10:22,640 --> 00:10:26,720 Speaker 1: And Ruby was like, uh, the classroom anniversary? It was 199 00:10:26,760 --> 00:10:29,920 Speaker 1: like a classroom anniversary. And I was like, can I text 200 00:10:29,960 --> 00:10:32,199 Speaker 1: your teachers to find out about this classroom anniversary? And 201 00:10:32,240 --> 00:10:33,840 Speaker 1: Ruby was like, no, no, no, they don't. 202 00:10:33,679 --> 00:10:34,320 Speaker 2: Like the text. 203 00:10:34,800 --> 00:10:39,400 Speaker 1: And so I said, you know, oh, so, uh, 204 00:10:39,800 --> 00:10:43,080 Speaker 1: so did you sing the traditional classroom anniversary song? And 205 00:10:43,120 --> 00:10:46,160 Speaker 1: Ruby was like, yeah. And I said, oh, would you 206 00:10:46,200 --> 00:10:48,800 Speaker 1: sing it? And Ruby, without like skipping a beat, goes, 207 00:10:49,520 --> 00:10:54,000 Speaker 1: It's a classroom anniversary. It's a classroom anniversary. We do 208 00:10:54,080 --> 00:10:55,199 Speaker 1: it in the big building. 209 00:10:56,240 --> 00:10:59,200 Speaker 2: Oh yeah, I heard that one. That's, that is a classic. 210 00:10:59,200 --> 00:11:00,000 Speaker 2: That's pretty amazing. 211 00:11:01,360 --> 00:11:05,600 Speaker 1: So anyway, I think it's time to get to the science, 212 00:11:05,679 --> 00:11:07,240 Speaker 1: so let's get to it.
213 00:11:07,240 --> 00:11:09,080 Speaker 2: I don't know, I feel like it's time for more songs, 214 00:11:09,120 --> 00:11:11,160 Speaker 2: so let's do more of that. All right. Well, the 215 00:11:11,280 --> 00:11:14,400 Speaker 2: general consensus is that kids start lying when they're about 216 00:11:14,400 --> 00:11:17,079 Speaker 2: two or three years old, and usually they do it 217 00:11:17,120 --> 00:11:19,400 Speaker 2: when they've done something that they know they shouldn't do. 218 00:11:19,520 --> 00:11:22,079 Speaker 2: But you know, their brains aren't sophisticated enough to take 219 00:11:22,120 --> 00:11:24,959 Speaker 2: that extra step of thinking, like, will the person 220 00:11:25,000 --> 00:11:27,960 Speaker 2: I'm talking to actually believe what I'm saying? And so 221 00:11:28,000 --> 00:11:30,880 Speaker 2: you get situations like a toddler standing there covered in 222 00:11:30,960 --> 00:11:33,560 Speaker 2: paint and you say, did you open the paint? And 223 00:11:33,600 --> 00:11:34,960 Speaker 2: they of course deny it. 224 00:11:35,520 --> 00:11:39,679 Speaker 1: So at what age like does this start to change? 225 00:11:40,040 --> 00:11:43,520 Speaker 2: Usually around four they start to take the listener's perspective 226 00:11:43,559 --> 00:11:47,160 Speaker 2: into account, and there's something psychologists call the theory of mind, 227 00:11:47,520 --> 00:11:50,439 Speaker 2: which is the ability to reason about other people's mental 228 00:11:50,480 --> 00:11:53,679 Speaker 2: states and then act accordingly. So the kid realizes that 229 00:11:53,720 --> 00:11:56,360 Speaker 2: you can see the paint and you'll be wondering how 230 00:11:56,400 --> 00:11:58,920 Speaker 2: that paint got there, and so they might say something like, 231 00:11:59,040 --> 00:12:01,200 Speaker 2: you know, my brother did it, which is a nice try, 232 00:12:01,240 --> 00:12:04,960 Speaker 2: of course, but maybe the brother's been upstairs all day. 233 00:12:05,520 --> 00:12:08,640 Speaker 1: So they're close to getting this lie right. But there's 234 00:12:08,679 --> 00:12:10,240 Speaker 1: still some holes in the story. 235 00:12:10,400 --> 00:12:13,280 Speaker 2: Yeah, usually there are. It actually, just as a side 236 00:12:13,320 --> 00:12:16,200 Speaker 2: note, reminds me of our good friend Adam, who talked 237 00:12:16,240 --> 00:12:20,120 Speaker 2: about a high school experience when he was sitting next 238 00:12:20,160 --> 00:12:22,920 Speaker 2: to a girl that he had a crush on and 239 00:12:22,960 --> 00:12:26,880 Speaker 2: he sneezed into his hands. She looks over at him. 240 00:12:27,280 --> 00:12:29,960 Speaker 2: He clearly has snot all in his hands, and he 241 00:12:30,120 --> 00:12:32,120 Speaker 2: just looks at her and he says, it wasn't me. 242 00:12:32,320 --> 00:12:33,199 Speaker 1: You knew the story? 243 00:12:33,360 --> 00:12:36,800 Speaker 2: Yes, it's pretty great. I just thought that was such 244 00:12:36,800 --> 00:12:41,320 Speaker 2: a great response. Anyway, so things really start to change 245 00:12:41,360 --> 00:12:44,000 Speaker 2: around seven or eight, and that's when they can account 246 00:12:44,080 --> 00:12:46,920 Speaker 2: for a wider range of facts and contexts in 247 00:12:46,960 --> 00:12:49,640 Speaker 2: the lie.
So maybe they say, you know, my brother 248 00:12:49,720 --> 00:12:51,880 Speaker 2: did it, and you say, but he's in his room. 249 00:12:51,960 --> 00:12:55,040 Speaker 2: And then a slightly older kid has the capacity to 250 00:12:55,080 --> 00:12:57,720 Speaker 2: react with something like he came down for a snack 251 00:12:57,760 --> 00:12:59,240 Speaker 2: and he was playing with the paint, but then he 252 00:12:59,280 --> 00:13:02,559 Speaker 2: went back upstairs, you know, just taking it to another level. 253 00:13:02,559 --> 00:13:05,000 Speaker 1: Really, it is wild to think of little kids like 254 00:13:05,200 --> 00:13:07,400 Speaker 1: almost like new versions of AI. Like they just keep 255 00:13:07,440 --> 00:13:09,400 Speaker 1: getting better and better at lying as the... 256 00:13:09,280 --> 00:13:11,680 Speaker 2: Yeah, yeah, it's a good way to think about 257 00:13:11,720 --> 00:13:14,120 Speaker 2: it like that, that training over time. And it's even 258 00:13:14,120 --> 00:13:16,280 Speaker 2: more wild to realize that this is something parents have 259 00:13:16,440 --> 00:13:19,800 Speaker 2: known about for a long time. Actually, Charles Darwin wrote 260 00:13:19,840 --> 00:13:23,160 Speaker 2: about toddler liars after doing some observational research on his 261 00:13:23,200 --> 00:13:25,880 Speaker 2: own kids. This was, of course, back in the eighteen forties. 262 00:13:26,360 --> 00:13:28,480 Speaker 2: So what happened is one day he caught his two 263 00:13:28,480 --> 00:13:30,280 Speaker 2: and a half year old son coming out of the 264 00:13:30,280 --> 00:13:34,240 Speaker 2: dining room with something very obviously wrapped up in a pinafore. 265 00:13:34,760 --> 00:13:37,000 Speaker 2: And this is, you know, a dress-like garment. And 266 00:13:37,080 --> 00:13:39,760 Speaker 2: Darwin says, hey, buddy, what, what do you got there? 267 00:13:39,840 --> 00:13:42,360 Speaker 2: And the kid's like, nothing, go away. And so Darwin 268 00:13:42,480 --> 00:13:44,880 Speaker 2: checks, and it turns out he'd stolen a... 269 00:13:44,880 --> 00:13:48,720 Speaker 1: Pickle. The old pickle in a pinafore... 270 00:13:48,800 --> 00:13:52,160 Speaker 2: Bit, yeah, you know, the whole bit there. But the 271 00:13:52,240 --> 00:13:55,680 Speaker 2: point is, there's ample evidence that very young children have 272 00:13:55,760 --> 00:13:59,600 Speaker 2: the executive functions needed for lying. More complicated skills like 273 00:13:59,640 --> 00:14:03,120 Speaker 2: memory and inhibition control, they develop a little bit later, 274 00:14:03,320 --> 00:14:06,880 Speaker 2: and that makes their lies more elaborate and believable. And 275 00:14:06,880 --> 00:14:09,920 Speaker 2: what researchers have found is that kids with strong cognitive 276 00:14:09,960 --> 00:14:13,680 Speaker 2: abilities actually make better liars. So, in a weird way, 277 00:14:13,800 --> 00:14:17,079 Speaker 2: lying is a sign of reaching important developmental milestones. 278 00:14:17,400 --> 00:14:19,480 Speaker 1: I like the idea of grade schools giving out stickers 279 00:14:19,480 --> 00:14:24,640 Speaker 1: for lying. You know, the only thing cuter than 280 00:14:24,640 --> 00:14:27,880 Speaker 1: a toddler telling whoppers is an animal who's trying to 281 00:14:27,920 --> 00:14:28,320 Speaker 1: trick you. 282 00:14:28,560 --> 00:14:31,200 Speaker 2: So are you suggesting that animals also lie? 283 00:14:31,520 --> 00:14:33,560 Speaker 1: It kind of depends on what you mean by lies.
284 00:14:33,720 --> 00:14:36,320 Speaker 1: So usually when we say someone is lying, we mean 285 00:14:36,360 --> 00:14:40,480 Speaker 1: they're knowingly making a false statement with the intent to deceive. Right? So, 286 00:14:40,680 --> 00:14:43,480 Speaker 1: animals can't talk obviously, so they don't make statements the 287 00:14:43,480 --> 00:14:46,600 Speaker 1: way we do. But if you expand this definition to 288 00:14:46,640 --> 00:14:50,480 Speaker 1: include behaviors, then you know animals do engage in deceptive 289 00:14:50,480 --> 00:14:53,360 Speaker 1: behavior all the time. So if you think about possums, 290 00:14:53,400 --> 00:14:55,960 Speaker 1: when they're threatened, they pretend to be dead. I also 291 00:14:56,040 --> 00:14:59,200 Speaker 1: read about this bird called the fork-tailed drongo, which is 292 00:14:59,280 --> 00:15:02,520 Speaker 1: native to the Kalahari Desert in Africa, and when 293 00:15:02,520 --> 00:15:05,000 Speaker 1: it sees a predator like an eagle overhead, it 294 00:15:05,080 --> 00:15:08,440 Speaker 1: lets out a squeaky alarm call. Now, other desert animals, 295 00:15:08,480 --> 00:15:10,800 Speaker 1: including meerkats, have learned to hide when they hear this 296 00:15:10,920 --> 00:15:14,200 Speaker 1: alarm as well. But sometimes the drongo plays a trick. 297 00:15:14,480 --> 00:15:17,200 Speaker 1: If it sees a meerkat holding food it wants, maybe 298 00:15:17,200 --> 00:15:20,360 Speaker 1: like a nice juicy lizard, it starts squeaking the alarm, 299 00:15:20,520 --> 00:15:22,840 Speaker 1: and the meerkat drops the food and runs for cover, 300 00:15:23,160 --> 00:15:26,120 Speaker 1: and then the drongo swoops in and takes the meal. Wow. 301 00:15:26,200 --> 00:15:28,520 Speaker 2: I mean that definitely sounds like the bird equivalent of 302 00:15:28,640 --> 00:15:31,000 Speaker 2: making a false statement with the intent to deceive, don't 303 00:15:31,040 --> 00:15:31,320 Speaker 2: you think? 304 00:15:31,480 --> 00:15:34,400 Speaker 1: Yeah. So weirdly, it's that last part that people actually 305 00:15:34,480 --> 00:15:37,800 Speaker 1: disagree about, whether or not there's intent. So clearly, the 306 00:15:37,840 --> 00:15:40,480 Speaker 1: drongo knows that if it makes the alarm call, the 307 00:15:40,560 --> 00:15:42,920 Speaker 1: meerkat will run, and the possum knows that if it 308 00:15:42,960 --> 00:15:46,000 Speaker 1: lies really still, its predator may walk away. But some 309 00:15:46,000 --> 00:15:50,920 Speaker 1: scientists actually call this functional deception, not intentional deception, and 310 00:15:50,960 --> 00:15:53,440 Speaker 1: they argue that the intent depends on something you mentioned 311 00:15:53,480 --> 00:15:56,480 Speaker 1: a few minutes ago, the theory of mind. And it's 312 00:15:56,520 --> 00:15:58,680 Speaker 1: just not clear that animals have the ability to think 313 00:15:58,680 --> 00:16:00,760 Speaker 1: about what other creatures know or believe. 314 00:16:02,160 --> 00:16:04,800 Speaker 2: So I guess, in other words, the drongo isn't thinking, 315 00:16:04,880 --> 00:16:07,000 Speaker 2: if I yell, the dumb meerkat will believe there's 316 00:16:07,000 --> 00:16:09,680 Speaker 2: an eagle. It just thinks, when I yell, the meerkat 317 00:16:09,760 --> 00:16:10,320 Speaker 2: runs. 318 00:16:10,560 --> 00:16:13,360 Speaker 1: Yeah, pretty much.
But there have been some studies that 319 00:16:13,360 --> 00:16:16,560 Speaker 1: suggest animals, particularly great apes, may have a theory of mind. 320 00:16:16,720 --> 00:16:20,080 Speaker 1: And in one experiment, researchers gave an ape eye trackers 321 00:16:20,280 --> 00:16:22,760 Speaker 1: so they could tell where it was looking. And then, 322 00:16:22,840 --> 00:16:25,200 Speaker 1: and I swear I'm not making this up, an assistant 323 00:16:25,240 --> 00:16:28,640 Speaker 1: in an ape costume ran into the cage area and put 324 00:16:28,640 --> 00:16:31,400 Speaker 1: an object under one of the two boxes on the floor. 325 00:16:31,800 --> 00:16:34,920 Speaker 1: A researcher pretends to watch this happen and then leaves 326 00:16:34,920 --> 00:16:39,640 Speaker 1: the room. And then when this assistant in the ape costume 327 00:16:40,160 --> 00:16:43,080 Speaker 1: is there alone, he puts the object under the other 328 00:16:43,160 --> 00:16:45,280 Speaker 1: box for a moment, and then he picks it up 329 00:16:45,280 --> 00:16:48,240 Speaker 1: and runs away. So when the researcher comes back and 330 00:16:48,360 --> 00:16:51,360 Speaker 1: reaches for the boxes, the ape looks at him and 331 00:16:51,400 --> 00:16:53,720 Speaker 1: then stares at the first box, as if he knew 332 00:16:53,720 --> 00:16:57,520 Speaker 1: the researcher was expecting the object to be there. Which means, like, 333 00:16:57,640 --> 00:16:59,960 Speaker 1: you know, the ape is picking up on all of this 334 00:17:00,320 --> 00:17:03,360 Speaker 1: and is more aware of others' behaviors. And actually there's 335 00:17:03,440 --> 00:17:04,800 Speaker 1: video of this on YouTube. 336 00:17:05,560 --> 00:17:08,000 Speaker 2: Oh man, I've got to see this. It's such a 337 00:17:08,000 --> 00:17:10,720 Speaker 2: fascinating idea. And I also love the idea of like 338 00:17:10,720 --> 00:17:13,119 Speaker 2: a guy with several PhDs having to run around in 339 00:17:13,160 --> 00:17:15,560 Speaker 2: a costume pretending to be an ape. So this is 340 00:17:16,119 --> 00:17:18,280 Speaker 2: definitely something I'm going to watch. All right. So to 341 00:17:18,320 --> 00:17:20,600 Speaker 2: switch gears here, Mango, I want to tell you about 342 00:17:20,760 --> 00:17:24,320 Speaker 2: lying and culture clashes. So you mentioned earlier that lying 343 00:17:24,440 --> 00:17:28,120 Speaker 2: means knowingly making a false statement with the intention to deceive. 344 00:17:28,800 --> 00:17:31,879 Speaker 2: That's a good definition, but some types of deception aren't 345 00:17:31,920 --> 00:17:35,520 Speaker 2: so clear-cut, and the context really does matter. So 346 00:17:35,600 --> 00:17:39,760 Speaker 2: omitting information or being purposely ambiguous, those can be types 347 00:17:39,800 --> 00:17:43,560 Speaker 2: of lies. Even accurate information can be used in misleading ways. 348 00:17:43,600 --> 00:17:46,760 Speaker 2: And the way we interpret these nuances and the way 349 00:17:46,800 --> 00:17:49,520 Speaker 2: we assess other people's honesty, it really comes down to 350 00:17:49,560 --> 00:17:51,760 Speaker 2: the norms that we've learned from our culture.
351 00:17:52,680 --> 00:17:55,320 Speaker 1: You know, there's a great quote by Montaigne that 352 00:17:56,080 --> 00:17:59,240 Speaker 1: Mary actually pointed me to, and it goes, quote, the 353 00:17:59,320 --> 00:18:02,879 Speaker 1: reverse of truth has one hundred thousand forms. And it 354 00:18:02,880 --> 00:18:04,480 Speaker 1: sounds like that, right? Like, there's so many ways to 355 00:18:04,560 --> 00:18:06,480 Speaker 1: lie and just like one way to simply tell the truth. 356 00:18:06,520 --> 00:18:09,200 Speaker 1: But can you give me some examples of what you're 357 00:18:09,240 --> 00:18:09,920 Speaker 1: thinking about here? 358 00:18:10,680 --> 00:18:13,320 Speaker 2: Yeah, for sure. But by the way, Mary, great quote, 359 00:18:13,359 --> 00:18:16,920 Speaker 2: that is, that's pretty awesome. All right. So, according to research, 360 00:18:17,080 --> 00:18:20,640 Speaker 2: highly individualistic cultures like the United States, they see lies 361 00:18:20,840 --> 00:18:24,879 Speaker 2: very differently than more collectivist cultures like many in Asia. So, 362 00:18:24,920 --> 00:18:28,040 Speaker 2: for example, an American business manager might consider it dishonest 363 00:18:28,080 --> 00:18:31,919 Speaker 2: to withhold negative feedback from an employee, even if it's harsh, 364 00:18:31,960 --> 00:18:35,040 Speaker 2: but a Japanese manager might not see it that way 365 00:18:35,080 --> 00:18:38,800 Speaker 2: at all because they're considering potential harm to an employee 366 00:18:38,800 --> 00:18:41,879 Speaker 2: and to their relationship. So for them, that's part of 367 00:18:41,920 --> 00:18:45,280 Speaker 2: being a trustworthy person. So maybe they downplay the bad 368 00:18:45,320 --> 00:18:48,840 Speaker 2: review or deliver it in an indirect fashion. But if you imagine 369 00:18:48,840 --> 00:18:52,280 Speaker 2: an American employee getting vague feedback, it might make them think, 370 00:18:52,640 --> 00:18:53,680 Speaker 2: what are they hiding from me? 371 00:18:54,160 --> 00:18:57,240 Speaker 1: Yeah, and I guess in both scenarios, people are 372 00:18:57,359 --> 00:19:00,160 Speaker 1: trying to do the right thing, and there's 373 00:19:00,200 --> 00:19:01,840 Speaker 1: no actual intent to deceive there. 374 00:19:01,920 --> 00:19:04,520 Speaker 2: No, that's exactly right. I mean this extends to speech 375 00:19:04,560 --> 00:19:07,320 Speaker 2: patterns too. So most native English speakers in the United 376 00:19:07,359 --> 00:19:11,119 Speaker 2: States consider clear, direct language to be a sign of honesty, 377 00:19:11,600 --> 00:19:14,600 Speaker 2: and we're a little more suspicious of verbal meandering or 378 00:19:14,880 --> 00:19:17,840 Speaker 2: someone using a lot of euphemisms, but that's just how 379 00:19:17,920 --> 00:19:21,960 Speaker 2: some cultures communicate. Meanwhile, they might interpret our bluntness as 380 00:19:22,160 --> 00:19:25,400 Speaker 2: bad manners. And you know, of course, then there's body language. 381 00:19:25,480 --> 00:19:28,520 Speaker 2: So in the US, holding eye contact is usually considered 382 00:19:28,520 --> 00:19:31,480 Speaker 2: proof that someone's being honest, but in other parts of 383 00:19:31,520 --> 00:19:34,480 Speaker 2: the world that can actually be seen as rude or aggressive. 384 00:19:35,040 --> 00:19:37,720 Speaker 1: Yeah, well this is a podcast, so you all 385 00:19:37,800 --> 00:19:39,600 Speaker 1: can't see my eyes.
But I'm going to use the 386 00:19:39,720 --> 00:19:43,320 Speaker 1: straightforward American approach to honest communication. Right now, it is 387 00:19:43,440 --> 00:19:45,720 Speaker 1: time for an ad break, so we can make money 388 00:19:45,800 --> 00:19:47,880 Speaker 1: and keep the show going. That's how honest I'm being. 389 00:19:47,960 --> 00:19:50,320 Speaker 1: But we'll be back soon. 390 00:20:05,640 --> 00:20:07,720 Speaker 2: Welcome back to Part Time Genius, where we're learning the 391 00:20:07,800 --> 00:20:10,960 Speaker 2: truth about lies. All right. So, Mango, what have you 392 00:20:11,000 --> 00:20:12,200 Speaker 2: got for us next? 393 00:20:12,480 --> 00:20:14,800 Speaker 1: So far, we've mostly been talking about lies at a 394 00:20:14,840 --> 00:20:17,880 Speaker 1: personal level, but lying isn't just about how we relate 395 00:20:17,920 --> 00:20:21,000 Speaker 1: to one another. It can also have legal consequences, including 396 00:20:21,040 --> 00:20:24,280 Speaker 1: what we call fraud. So under US law, fraud can 397 00:20:24,359 --> 00:20:27,960 Speaker 1: be a civil or a criminal matter, depending on the situation, 398 00:20:28,520 --> 00:20:32,120 Speaker 1: and in civil cases it's defined as a misrepresentation of fact, 399 00:20:32,400 --> 00:20:36,359 Speaker 1: either intentional or negligent, that causes injury or harm 400 00:20:36,400 --> 00:20:40,639 Speaker 1: to another person. Criminal fraud usually falls into specific categories 401 00:20:40,680 --> 00:20:43,080 Speaker 1: that have their own definitions, so you think about things 402 00:20:43,119 --> 00:20:47,080 Speaker 1: like financial fraud, or insurance fraud, or even something like forgery. Right. 403 00:20:47,760 --> 00:20:50,879 Speaker 1: And although none of this is new, technology has caused 404 00:20:50,880 --> 00:20:54,040 Speaker 1: the scale of criminal fraud to explode in recent years. 405 00:20:54,400 --> 00:20:57,040 Speaker 1: So just to give you a sense, from twenty eighteen 406 00:20:57,160 --> 00:21:00,880 Speaker 1: to twenty twenty two, the US federal government lost between 407 00:21:00,920 --> 00:21:03,879 Speaker 1: two hundred and thirty three billion and five hundred and 408 00:21:03,920 --> 00:21:07,520 Speaker 1: twenty one billion dollars each year to fraud. And wow, 409 00:21:07,600 --> 00:21:09,960 Speaker 1: that doesn't even count money lost at the state level. 410 00:21:10,800 --> 00:21:14,280 Speaker 1: And that's billion with a B. Like, that is staggering. Yeah, 411 00:21:14,320 --> 00:21:17,480 Speaker 1: and if you thought those numbers were big, according to 412 00:21:17,520 --> 00:21:22,280 Speaker 1: the World Economic Forum, cybercrime, which often involves fraud, obviously, 413 00:21:22,600 --> 00:21:26,840 Speaker 1: will cost the world ten point five trillion dollars this year alone. 414 00:21:27,240 --> 00:21:30,480 Speaker 1: But even more than the economic impact, there are personal costs. 415 00:21:30,520 --> 00:21:32,840 Speaker 1: So think about identity theft, one of the 416 00:21:32,880 --> 00:21:35,800 Speaker 1: most common types of cyber fraud. In a survey 417 00:21:35,840 --> 00:21:39,040 Speaker 1: by the Identity Theft Resource Center, sixty percent of ID 418 00:21:39,200 --> 00:21:42,359 Speaker 1: theft victims found themselves struggling to cover their bills in 419 00:21:42,400 --> 00:21:46,720 Speaker 1: the aftermath of the incident, and sixteen percent reported feeling suicidal.
420 00:21:46,880 --> 00:21:48,760 Speaker 1: It's really a serious problem. 421 00:21:49,080 --> 00:21:51,560 Speaker 2: That is wild. All right. So when you say technology 422 00:21:51,680 --> 00:21:53,919 Speaker 2: is driving all this, what exactly do you mean by that? 423 00:21:54,480 --> 00:21:56,480 Speaker 1: So some of it's the obvious stuff, like we do 424 00:21:56,560 --> 00:21:59,919 Speaker 1: our banking online, we give out our email addresses like candy, 425 00:22:00,160 --> 00:22:03,320 Speaker 1: we share personal information on social media, and a lot 426 00:22:03,320 --> 00:22:05,919 Speaker 1: of us aren't great about checking privacy settings or changing 427 00:22:05,920 --> 00:22:09,560 Speaker 1: our passwords. So fraudsters have plenty of material to work with, 428 00:22:09,800 --> 00:22:12,080 Speaker 1: and they also have more points of entry to our 429 00:22:12,119 --> 00:22:15,320 Speaker 1: lives and our finances now. But the World Economic Forum 430 00:22:15,440 --> 00:22:18,840 Speaker 1: reports that the biggest problem right now is generative AI. 431 00:22:19,320 --> 00:22:22,199 Speaker 1: So criminals are using AI language tools to crank out 432 00:22:22,240 --> 00:22:26,320 Speaker 1: phishing emails that seem really authentic. They're using AI image 433 00:22:26,320 --> 00:22:29,520 Speaker 1: generators to create deepfakes of people's faces that let 434 00:22:29,560 --> 00:22:34,040 Speaker 1: them pass identity verification checks. According to a recent FBI warning, 435 00:22:34,080 --> 00:22:37,240 Speaker 1: they're even using AI voice cloning to create deceptive audio 436 00:22:37,280 --> 00:22:41,119 Speaker 1: clips that quote impersonate a close relative in a crisis situation, 437 00:22:41,480 --> 00:22:45,679 Speaker 1: asking for immediate financial assistance or demanding a ransom. 438 00:22:46,040 --> 00:22:50,240 Speaker 2: So, I mean, this is obviously just terrifying, and I'm wondering, though, 439 00:22:50,240 --> 00:22:52,360 Speaker 2: before I throw my phone out the window, is there 440 00:22:52,520 --> 00:22:54,040 Speaker 2: anything I can do about this? 441 00:22:54,720 --> 00:22:56,760 Speaker 1: So from a tech standpoint, there's been a lot of 442 00:22:56,760 --> 00:23:01,000 Speaker 1: interest in digital identity wallets that would combine biometrics, 443 00:23:01,119 --> 00:23:04,199 Speaker 1: ID documents, and personal information to create a secure login 444 00:23:04,320 --> 00:23:07,679 Speaker 1: that you can use in multiple places. Proponents say it's like 445 00:23:07,720 --> 00:23:10,520 Speaker 1: being asked to show your physical driver's license, so it's much 446 00:23:10,600 --> 00:23:14,199 Speaker 1: safer than an easily hacked password. But civil liberties groups 447 00:23:14,200 --> 00:23:16,320 Speaker 1: warn that this could mean sacrificing privacy. 448 00:23:16,560 --> 00:23:18,120 Speaker 2: I thought you were just gonna say something like change 449 00:23:18,160 --> 00:23:19,920 Speaker 2: your passwords. I was kind of hoping that was where 450 00:23:19,960 --> 00:23:20,679 Speaker 2: you were going with it. 451 00:23:20,960 --> 00:23:22,800 Speaker 1: I mean you could do that too. You should do 452 00:23:22,880 --> 00:23:25,560 Speaker 1: that, actually. But the FBI has another tip that is 453 00:23:25,600 --> 00:23:28,240 Speaker 1: so low-tech it's kind of brilliant, and it's just: 454 00:23:28,520 --> 00:23:30,960 Speaker 1: create a secret passphrase with your family and close friends.
455 00:23:31,040 --> 00:23:33,119 Speaker 1: So if you ever get a call that sounds like 456 00:23:33,359 --> 00:23:36,479 Speaker 1: me asking you to wire ten thousand dollars, you can 457 00:23:36,680 --> 00:23:39,480 Speaker 1: ask for the phrase to confirm it's me. Wow. 458 00:23:39,480 --> 00:23:41,919 Speaker 2: Okay, that's a good idea. All right, but all of 459 00:23:41,920 --> 00:23:45,480 Speaker 2: this raises another question, and that's what makes someone lie 460 00:23:45,720 --> 00:23:48,760 Speaker 2: or commit fraud? Like, we all know it's wrong, so 461 00:23:48,800 --> 00:23:51,879 Speaker 2: why does it keep happening? So Dan Ariely is a 462 00:23:51,880 --> 00:23:55,200 Speaker 2: professor of psychology and behavioral economics at Duke, and he's 463 00:23:55,240 --> 00:23:57,160 Speaker 2: done a lot of research about this kind of thing. 464 00:23:57,560 --> 00:24:00,919 Speaker 2: So he pointed out that our entire legal system operates 465 00:24:00,960 --> 00:24:04,159 Speaker 2: on the theory of cost-benefit analysis. Like, if you 466 00:24:04,200 --> 00:24:07,479 Speaker 2: know there's a serious penalty for being dishonest, you'll decide 467 00:24:07,480 --> 00:24:09,720 Speaker 2: if it's worth it or not, and you won't do 468 00:24:09,800 --> 00:24:12,000 Speaker 2: it if you decide that it isn't. And the problem 469 00:24:12,040 --> 00:24:14,840 Speaker 2: here is that it's just not how lying works in 470 00:24:14,880 --> 00:24:17,840 Speaker 2: real life, and that's because humans have the ability to 471 00:24:18,080 --> 00:24:22,840 Speaker 2: rationalize dishonesty. So Ariely did this experiment with a vending machine. 472 00:24:23,119 --> 00:24:26,440 Speaker 2: He set the price mechanism to zero cents, but put 473 00:24:26,520 --> 00:24:29,840 Speaker 2: labels on the snacks saying that each item cost seventy 474 00:24:29,920 --> 00:24:32,560 Speaker 2: five cents. He also taped a sign to the machine 475 00:24:32,560 --> 00:24:35,480 Speaker 2: that read, quote, if there's something wrong with this machine, 476 00:24:35,520 --> 00:24:38,280 Speaker 2: please call this number, and then he put his own 477 00:24:38,359 --> 00:24:39,160 Speaker 2: cell phone number. 478 00:24:39,440 --> 00:24:43,760 Speaker 1: That's clever. So what happened when people used the machine? 479 00:24:42,920 --> 00:24:45,080 Speaker 2: Well, they'd put their money in and the machine would 480 00:24:45,080 --> 00:24:48,560 Speaker 2: dispense multiple snacks and return all of their cash, and 481 00:24:48,640 --> 00:24:52,240 Speaker 2: not a single person called to report the malfunction, although 482 00:24:52,240 --> 00:24:56,040 Speaker 2: Ariely noted that nobody took more than four bags of 483 00:24:56,080 --> 00:24:59,119 Speaker 2: free candy either. So his theory is that people were 484 00:24:59,240 --> 00:25:03,840 Speaker 2: rationalizing the decision based on previous vending machine experiences, like 485 00:25:03,880 --> 00:25:06,240 Speaker 2: we've all had that situation where you put your money 486 00:25:06,280 --> 00:25:08,400 Speaker 2: in and it doesn't give you a snack, or, as 487 00:25:08,400 --> 00:25:10,800 Speaker 2: he puts it, people were just kind of sorting out 488 00:25:10,800 --> 00:25:12,800 Speaker 2: the vending karma in the world, I guess.
489 00:25:13,200 --> 00:25:16,160 Speaker 1: I mean that's pretty fascinating, although obviously there's a big 490 00:25:16,160 --> 00:25:19,520 Speaker 1: difference between taking candy and committing identity theft. 491 00:25:19,840 --> 00:25:22,960 Speaker 2: Yeah, of course, but understanding the ways criminals make sense 492 00:25:22,960 --> 00:25:25,679 Speaker 2: of what they're doing could help stop fraud before it 493 00:25:25,720 --> 00:25:29,400 Speaker 2: even starts. So, for example, Ariely found that people 494 00:25:29,440 --> 00:25:32,600 Speaker 2: would behave more honestly if they spent time reflecting on 495 00:25:32,640 --> 00:25:36,000 Speaker 2: their personal morals or their ethics. So he tested this 496 00:25:36,040 --> 00:25:39,040 Speaker 2: by asking one group of people to list ten books 497 00:25:39,040 --> 00:25:41,959 Speaker 2: they read in high school and another group to recall 498 00:25:42,320 --> 00:25:46,000 Speaker 2: the Ten Commandments. Then he gave both groups reward-based 499 00:25:46,040 --> 00:25:49,119 Speaker 2: tests that included opportunities to cheat, and he found that 500 00:25:49,160 --> 00:25:52,760 Speaker 2: people who thought about books engaged in widespread cheating, while 501 00:25:52,800 --> 00:25:55,679 Speaker 2: the Ten Commandments group did not cheat at all. So 502 00:25:55,720 --> 00:25:57,960 Speaker 2: this is a reminder: don't read, kids. You know what 503 00:25:58,040 --> 00:26:02,440 Speaker 2: I mean. And what's really interesting is that he repeated 504 00:26:02,440 --> 00:26:06,000 Speaker 2: the test with people who identified as atheists. Because they 505 00:26:06,000 --> 00:26:08,240 Speaker 2: didn't know the Ten Commandments, he just had them swear 506 00:26:08,280 --> 00:26:11,919 Speaker 2: on a Bible, and after doing that, they still didn't cheat. 507 00:26:12,600 --> 00:26:15,199 Speaker 2: That is incredible. So, well, we're almost at the end 508 00:26:15,200 --> 00:26:15,800 Speaker 2: of the episode. 509 00:26:15,880 --> 00:26:18,320 Speaker 1: Tell me the truth. Did you prepare anything for the fact-off? 510 00:26:18,560 --> 00:26:20,239 Speaker 2: I sure did, Mango. Let's get to it. 511 00:26:25,119 --> 00:26:28,560 Speaker 1: If there's one man whose name has become synonymous with dishonesty, 512 00:26:28,680 --> 00:26:31,280 Speaker 1: it is Charles Ponzi. And we've all heard his name, 513 00:26:31,320 --> 00:26:33,360 Speaker 1: but I thought we'd get into his story. So Ponzi 514 00:26:33,520 --> 00:26:36,560 Speaker 1: was born in northern Italy in eighteen eighty two, and 515 00:26:36,600 --> 00:26:39,720 Speaker 1: apparently his family had at one time been pretty well off, 516 00:26:39,880 --> 00:26:41,680 Speaker 1: but by the time he came along, they had lost 517 00:26:41,720 --> 00:26:44,199 Speaker 1: their fortune. He tried going to university in Rome, but 518 00:26:44,280 --> 00:26:47,240 Speaker 1: dropped out and decided to take his chances in America. Now,
519 00:26:47,280 --> 00:26:49,600 Speaker 1: Ponzi later claimed that he had set sail with two 520 00:26:49,680 --> 00:26:52,119 Speaker 1: hundred dollars in savings, most of which he lost on 521 00:26:52,200 --> 00:26:54,760 Speaker 1: the ship crossing the Atlantic because he got swindled by 522 00:26:54,840 --> 00:26:58,199 Speaker 1: a card sharp. Once in the US, though, he bounced 523 00:26:58,240 --> 00:27:00,760 Speaker 1: around a series of low-level jobs, but in 524 00:27:00,880 --> 00:27:04,800 Speaker 1: nineteen twenty he launched the scheme that made him famous. Basically, 525 00:27:05,000 --> 00:27:08,960 Speaker 1: there was this thing called an International Reply Coupon, which 526 00:27:09,080 --> 00:27:11,679 Speaker 1: was like a voucher many countries accepted in exchange for 527 00:27:11,760 --> 00:27:15,040 Speaker 1: local postage stamps. Ponzi realized that he could buy cheap 528 00:27:15,080 --> 00:27:18,119 Speaker 1: IRCs in Italy and exchange them for more expensive stamps 529 00:27:18,160 --> 00:27:20,800 Speaker 1: in the US. As he put it in his memoir, quote, 530 00:27:20,880 --> 00:27:23,280 Speaker 1: the racket fell into my lap like a ripe apple. 531 00:27:23,640 --> 00:27:27,840 Speaker 1: It looked good, luscious. I examined it for flaws, found none, 532 00:27:28,000 --> 00:27:30,160 Speaker 1: and I had to bite. I wouldn't have been human 533 00:27:30,200 --> 00:27:33,720 Speaker 1: if I didn't. And just to be clear, that part 534 00:27:33,760 --> 00:27:37,000 Speaker 1: of the scheme was not illegal. It's just simple arbitrage, 535 00:27:37,000 --> 00:27:39,560 Speaker 1: which happens in the stock market all the time. The 536 00:27:39,640 --> 00:27:42,399 Speaker 1: problem was when he started tricking thousands of people into 537 00:27:42,440 --> 00:27:45,840 Speaker 1: investing in his operation, claiming they'd earn a fifty percent 538 00:27:45,960 --> 00:27:49,320 Speaker 1: return in ninety days. Now, every time an investor gave money, 539 00:27:49,400 --> 00:27:51,720 Speaker 1: he used it to pay off earlier investors. So people 540 00:27:51,720 --> 00:27:54,119 Speaker 1: thought they were getting huge returns and thought it was 541 00:27:54,160 --> 00:27:56,719 Speaker 1: all real. And in the first eight months of nineteen 542 00:27:56,800 --> 00:28:00,000 Speaker 1: twenty, Ponzi actually made fifteen million dollars, which is about 543 00:28:00,200 --> 00:28:02,480 Speaker 1: two hundred and thirty five million dollars in today's money. 544 00:28:02,480 --> 00:28:04,320 Speaker 1: I had no idea how much money he'd made. 545 00:28:04,680 --> 00:28:05,520 Speaker 2: That's huge. 546 00:28:05,720 --> 00:28:08,520 Speaker 1: It all came crashing down that August when Boston Post 547 00:28:08,560 --> 00:28:12,200 Speaker 1: reporters launched an investigation that ended in eighty six counts 548 00:28:12,240 --> 00:28:14,439 Speaker 1: of mail fraud and a lengthy prison sentence. 549 00:28:15,160 --> 00:28:18,160 Speaker 2: All right, well, speaking of elaborate and profitable lies, the 550 00:28:18,240 --> 00:28:21,840 Speaker 2: FTC reports that in twenty twenty two, almost seventy thousand 551 00:28:21,880 --> 00:28:25,080 Speaker 2: people were the victims of romance scammers. Now, this is 552 00:28:25,080 --> 00:28:27,800 Speaker 2: a type of fraud that often begins on dating apps.
553 00:28:27,920 --> 00:28:30,560 Speaker 2: That's where someone lies about who they are, usually in 554 00:28:30,680 --> 00:28:32,640 Speaker 2: order to get you to send them money, and it's 555 00:28:32,680 --> 00:28:35,920 Speaker 2: an extra malicious form of catfishing. Now, one of 556 00:28:35,960 --> 00:28:38,680 Speaker 2: the trickiest things for a catfisher to explain is why 557 00:28:38,720 --> 00:28:42,120 Speaker 2: they can't meet up in person. So, according to the FTC, 558 00:28:42,320 --> 00:28:45,120 Speaker 2: they'll often build a fake identity around a job that 559 00:28:45,240 --> 00:28:48,720 Speaker 2: keeps them far away. And among romance scammers, the most 560 00:28:48,720 --> 00:28:52,320 Speaker 2: popular lie is quote, I'm stationed on a military base 561 00:28:52,360 --> 00:28:54,840 Speaker 2: in another country. But another one that comes up a 562 00:28:54,840 --> 00:28:57,440 Speaker 2: lot is, I'm an offshore oil rig worker. 563 00:28:57,640 --> 00:28:59,120 Speaker 1: I mean, I think what's funny about that is it's 564 00:28:59,160 --> 00:29:01,680 Speaker 1: going to make online dating so hard for actual 565 00:29:01,760 --> 00:29:03,000 Speaker 1: oil rig workers. 566 00:29:03,240 --> 00:29:06,640 Speaker 2: Like, sure, I didn't think about this. I feel so 567 00:29:06,800 --> 00:29:09,080 Speaker 2: bad for the rig workers just trying to get dates. 568 00:29:09,600 --> 00:29:12,400 Speaker 1: Anyway, I was curious where the term white lie comes from, 569 00:29:12,440 --> 00:29:15,440 Speaker 1: and it turns out it's older 570 00:29:15,440 --> 00:29:19,160 Speaker 1: than I expected. The first known use occurred on April tenth, 571 00:29:19,360 --> 00:29:23,400 Speaker 1: fifteen sixty seven, when an English landowner named Ralph Adderley 572 00:29:23,480 --> 00:29:25,560 Speaker 1: wrote a letter to a friend describing his brother in 573 00:29:25,600 --> 00:29:28,320 Speaker 1: law by saying, quote, I do assure you he is 574 00:29:28,400 --> 00:29:32,600 Speaker 1: unsuspected of any untruth or other notable crime, except a 575 00:29:32,640 --> 00:29:35,440 Speaker 1: white lie, which is taken for a small fault in 576 00:29:35,480 --> 00:29:36,400 Speaker 1: these parts. 577 00:29:36,840 --> 00:29:39,280 Speaker 2: I mean, it's definitely sort of a compliment, I think. 578 00:29:39,440 --> 00:29:42,560 Speaker 2: But all right, well, for our final fact, here's the 579 00:29:42,600 --> 00:29:45,680 Speaker 2: story of a lie that was anything but harmless, because 580 00:29:45,680 --> 00:29:49,240 Speaker 2: it almost threw off decades of archaeological research. So it 581 00:29:49,320 --> 00:29:52,000 Speaker 2: was back in nineteen twelve. You have this amateur fossil 582 00:29:52,040 --> 00:29:55,800 Speaker 2: hunter named Charles Dawson. He wrote to the British palaeontologist 583 00:29:55,840 --> 00:29:59,280 Speaker 2: Sir Arthur Smith Woodward, and he was claiming he'd unearthed 584 00:29:59,280 --> 00:30:01,800 Speaker 2: parts of a human skull in Piltdown. This was a 585 00:30:01,880 --> 00:30:05,560 Speaker 2: village near his home in Sussex, so Woodward helped expand 586 00:30:05,600 --> 00:30:09,520 Speaker 2: the excavation and eventually he and Dawson dug up parts 587 00:30:09,520 --> 00:30:12,920 Speaker 2: of a mandible, pieces of a skull, and multiple teeth there.
588 00:30:13,440 --> 00:30:16,040 Speaker 2: And so intriguingly, some of these teeth appeared to be 589 00:30:16,240 --> 00:30:19,600 Speaker 2: larger than a human's but smaller than an ape's, and 590 00:30:19,640 --> 00:30:22,200 Speaker 2: so it was pointing to the possibility of an ancient 591 00:30:22,280 --> 00:30:26,400 Speaker 2: ancestor some five hundred thousand years old. Now the British 592 00:30:26,400 --> 00:30:29,560 Speaker 2: scientific community was understandably excited about the evidence of a 593 00:30:29,600 --> 00:30:32,920 Speaker 2: creature dubbed the Piltdown Man, in part because a few 594 00:30:33,000 --> 00:30:36,440 Speaker 2: years earlier, an ancient pre-human jawbone had been discovered 595 00:30:36,480 --> 00:30:39,600 Speaker 2: in Germany. Now, in the run-up to World War One, 596 00:30:39,800 --> 00:30:43,040 Speaker 2: tensions between Germany and the UK were running high, and 597 00:30:43,080 --> 00:30:45,880 Speaker 2: so the Piltdown Man was proof of England's importance in 598 00:30:45,960 --> 00:30:50,560 Speaker 2: the fossil record, until nineteen fifty three, when new dating 599 00:30:50,600 --> 00:30:54,080 Speaker 2: techniques showed that his bones weren't all the same age, 600 00:30:54,400 --> 00:30:57,160 Speaker 2: and what's more, they were a mix of human and 601 00:30:57,480 --> 00:31:00,640 Speaker 2: ape bones. So this was proof of the hoax, but 602 00:31:00,680 --> 00:31:03,880 Speaker 2: the question was who did it. So, finally, in two 603 00:31:03,920 --> 00:31:08,320 Speaker 2: thousand and nine, a paleoanthropologist named Isabelle De Groote used 604 00:31:08,360 --> 00:31:12,040 Speaker 2: DNA analysis and CT scanning to get an even closer 605 00:31:12,080 --> 00:31:15,200 Speaker 2: look at the Piltdown samples. Here's what she found. She 606 00:31:15,240 --> 00:31:18,520 Speaker 2: found that all the teeth came from one orangutan, and 607 00:31:18,560 --> 00:31:20,640 Speaker 2: that the bones had been coated in a sort of 608 00:31:20,720 --> 00:31:24,520 Speaker 2: putty to make them appear uniform and heavy. And so, 609 00:31:24,560 --> 00:31:27,320 Speaker 2: she concluded that, given the consistency of the cover up, 610 00:31:27,400 --> 00:31:31,360 Speaker 2: it was the work of one person: Charles Dawson. As 611 00:31:31,400 --> 00:31:35,640 Speaker 2: an amateur who dabbled in geology, archaeology, and anthropology, he 612 00:31:35,680 --> 00:31:39,240 Speaker 2: would have known exactly what a real prehistoric find should 613 00:31:39,280 --> 00:31:43,080 Speaker 2: look like, and he had access to samples and to tools. 614 00:31:43,560 --> 00:31:46,640 Speaker 2: And as it turned out, he had perpetrated other smaller 615 00:31:46,640 --> 00:31:49,200 Speaker 2: frauds, all in an attempt to gain recognition 616 00:31:49,320 --> 00:31:51,160 Speaker 2: by the British scientific community. 617 00:31:52,040 --> 00:31:54,200 Speaker 1: That's so weird. You know, I'd heard of Piltdown Man, 618 00:31:54,280 --> 00:31:56,640 Speaker 1: but I'd never realized that they'd figured out that Dawson 619 00:31:56,760 --> 00:31:59,160 Speaker 1: was behind it. And I guess he did end up 620 00:31:59,160 --> 00:32:02,120 Speaker 1: getting some recognition, but not in the way he wanted.
Yeah, 621 00:32:02,160 --> 00:32:05,360 Speaker 1: you know, if I'm being honest, which I am obviously, 622 00:32:05,440 --> 00:32:09,160 Speaker 1: I think you deserve today's trophy for explaining how kids 623 00:32:09,160 --> 00:32:12,000 Speaker 1: become such expert liars. I think that bit of the 624 00:32:12,080 --> 00:32:13,480 Speaker 1: story was my favorite today. 625 00:32:14,240 --> 00:32:16,680 Speaker 2: Thank you. It is fun, so fun to think about. 626 00:32:18,200 --> 00:32:20,640 Speaker 1: Well, that does it for this episode. If you like 627 00:32:20,680 --> 00:32:22,680 Speaker 1: our show, remember to rate and review us on the 628 00:32:22,720 --> 00:32:25,560 Speaker 1: Apple Store. Actually, I cannot help but look at our 629 00:32:25,560 --> 00:32:29,200 Speaker 1: reviews, and we've been sitting at seventeen ninety nine, seventeen 630 00:32:29,240 --> 00:32:32,000 Speaker 1: hundred and ninety nine reviews, just hovering there for the last 631 00:32:32,040 --> 00:32:34,960 Speaker 1: few weeks, and it hasn't gone up to eighteen hundred. 632 00:32:35,080 --> 00:32:38,520 Speaker 1: So someone out there, help us out, help my OCD 633 00:32:38,680 --> 00:32:41,719 Speaker 1: out and get it to an even number. Also, if 634 00:32:41,720 --> 00:32:43,280 Speaker 1: you want to keep up with us on the socials, 635 00:32:43,320 --> 00:32:45,360 Speaker 1: remember you can find us at Part Time Genius on 636 00:32:45,400 --> 00:32:48,680 Speaker 1: the gram. Anyway, we'll be back soon with another new episode, 637 00:32:48,720 --> 00:32:52,760 Speaker 1: but in the meantime, from Gabe, Mary, Dylan, Will and myself, 638 00:32:52,920 --> 00:33:06,320 Speaker 1: thank you so much for listening. Part Time Genius is 639 00:33:06,320 --> 00:33:10,160 Speaker 1: a production of Kaleidoscope and iHeartRadio. This show is hosted 640 00:33:10,160 --> 00:33:14,440 Speaker 1: by Will Pearson and me, Mangesh Hattikudur, and research by 641 00:33:14,480 --> 00:33:18,680 Speaker 1: our good pal Mary Philip Sandy. Today's episode was engineered 642 00:33:18,720 --> 00:33:21,680 Speaker 1: and produced by the wonderful Dylan Fagan with support from 643 00:33:21,720 --> 00:33:25,440 Speaker 1: Tyler Klang. The show is executive produced for iHeart by 644 00:33:25,480 --> 00:33:28,959 Speaker 1: Katrina Norvell and Ali Perry, with social media support from 645 00:33:29,000 --> 00:33:33,560 Speaker 1: Sasha Gay, Trustee Dara Potts and Viney Shoory. For more 646 00:33:33,600 --> 00:33:38,920 Speaker 1: podcasts from Kaleidoscope and iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 647 00:33:39,040 --> 00:33:55,960 Speaker 1: or wherever you listen to your favorite shows.