Hey, everybody, this is Chuck, and welcome to what I guess is week three of SYSK Selects, where we give you one of our favorite, or maybe a timely, episodes from our vault that you might not have heard. We're curating these individually, and this is my pick: "Lying Liars: How Lying Works." I think that was... it sounds like a Josh title. It's a great one. This is from June of 2012, and I don't know, I just remember this being a pretty good episode, and lying is just a pretty interesting concept to me. So I wanted to make this my selection this week, and I hope you enjoy it.

Welcome to Stuff You Should Know from HowStuffWorks.com. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant. Right. Yeah, take it, Chuck. Josh, that shirt looks great on you today, I just want to say that. I really like a checked pattern. Well, I see what you're doing. You're blinking and you're scratching your face. It looks like you have scabies, like my friend Dirty Mitch. Yeah.
But that doesn't necessarily indicate that you're lying. I'm glad you said "necessarily," because they didn't say that in this article, and that was a very important omission. They, meaning Tom, didn't say "necessarily." I had to write it in, because what we're talking about are potential tells for whether you can recognize a lie. People often associate blinking, scratching the face, and covering the mouth as surefire indicators, and that's not necessarily the case. No, it's not, and let's just get that out right now. It's because there aren't any surefire behaviors where you can say, "Oh, you're lying." The one I always heard was touching your face. Right, and that's based on the idea that somebody might do that if they don't normally do it. If, before this whole conversation started, they weren't doing it, and then they're placed in a position where they're choosing to lie and they start doing it, they are now deviating from their baseline behavior. And that's the important thing. Exactly. That's where you add the qualifier "necessarily."
Like Tom said, maybe the dude has an eyelash in his eye, so he's rubbing his eyes or blinking a lot. So yeah, it's interesting. Yeah, that's why he's scratching his face. Or maybe someone's self-conscious because of a missing front tooth, so they cover their mouth when they talk. I cover my mouth when I talk, especially when I'm eating. Oh yeah, sure, but you're just not supposed to talk while you eat. Yeah, Emily does that because she can't ever stop talking, so she'll just talk while she eats and cover her mouth. So I'm not even going to bother to ask you if you've ever lied. I've lied today. Have you? Some of you? I'm sure. Apparently, according to studies, I have. You found a few studies that I thought were a little... let's get into them. Okay. I mean, they're surprising. Like, one of them found that a quarter of all of our daily interactions involve lies. That's really high. I don't think I'm naive. I went back and just kind of evaluated my behavior, and I'm like, it still seems high.
Agreed. So, is everyone else lying more than us? Were these just really rotten people who happened to be part of the random sample in the study? Well, you know, you pull up any study online and you're going to find a different result: people lie every ninety minutes, people lie every ninety seconds, people lie twice a day, that kind of thing. Yeah. So psychologists are failing miserably at studying lying, is what you're saying, as far as coming up with hard numbers. Because I don't think there are any hard and fast rules. I might not lie at all today, or I might lie ten times tomorrow. If you don't lie at all today, they'll make a movie about you. Don't they make movies like that? You either can't lie, in Jim Carrey's case, or... yeah, that's usually the plot. Yeah. So I guess before we get into the studies, do you want to talk about what a lie is? Yeah. You dug this up. I like these definitions.
So there's this philosopher named Sissela Bok, a Swedish philosopher, who published a book in 1978 that apparently, still to this day, is like the authority on lying, philosophically speaking. The work in the seventies in psychology... I think they were smoking a lot of weed, that's what it was. Yeah, and she's a philosopher, so this is the philosophical, ethical approach to lying. And Sissela Bok came up with basically a definition of a lie, and she said that it has three features. One: it communicates some information. Now, philosophers tend to choose their words carefully; they're almost the poets of the humanities, especially if you don't include poetry as a humanity. Two: the liar intends to deceive or mislead, of course. And three: the liar believes that what they're saying is not true. So they haven't tricked themselves into believing it, like O.J. believes that he did not kill his wife. Yeah. I don't know. I think that's a different conversation altogether, because that begs the question: is it really possible to fully trick yourself into believing that something you know is not true is true?
You always hear that that can happen, like you become so entrenched in a lie for so long that you don't even realize it's a lie anymore. I'd like to see the study on that, though. Yeah, I'm sure somebody's got one out there. But so, you put those three things together and you have a lie: you're communicating information, you intend to deceive, and you don't believe what you're saying is true. And apologies to O.J., by the way; that was really uncalled for. Yeah, really sorry, that was very uncalled for, Chuck. Shame on you. So there's a big debate, though, about whether the lie has to have false information in it. Yeah, that's what I don't quite understand. Can you explain that? Yes. So, for example, I like your shirt, and I know that you hate shirts that have armpit stains on them. But I also know that at the present time you have a pinched nerve in your neck, so you can't see very clearly. Okay. So I want your shirt.
I know you'll give it to me if I tell you there's an armpit stain. I don't think there's an armpit stain; I don't believe what I'm saying is true. But I say, "Chuck, that modest masterpiece has an armpit stain. You should probably give it to me." And you take it off and give it to me, and the shirt actually has an armpit stain. And someone pops their head in and goes, "womp, womp, womp." That was not false information, even though I intended to deceive you and I didn't believe what I was saying. I was just communicating information. That's the point. But that brings up another philosophical question: is a lie of omission then a lie, since you're not communicating anything? In Tom's article, they talk about the reason the U.S. court system says "the whole truth and nothing but the truth": a lie of omission sometimes is not as looked down upon, because you're not actually constructing some falsehood. You're just not telling the whole truth. Interesting. Yeah, a lighter version of the lie. Yeah, you know, sure. I think ultimately you can follow it.
The conclusion is that most philosophers believe that lying is bad, but that there are exceptions to the rule. Like, if a murderer comes up and says, "Hey, I'm looking for my next victim. Have you seen this man?" and you have seen this man and you know where he is... I'm making this really basic, I guess, but that's an example of when a lie is beneficial. You're thwarting a murderer and saving somebody's life. Yes, you should lie. Or, obviously, a white lie, which kids learn early on, is something you do oftentimes to spare the feelings of someone else: "That top does not make you look bad. I think it's cute, just fun. Sure, but maybe you should wear that other one, because you look great in black." Well, I imagine if you're dealing with an intelligent person, you're showing your hand at that point. Not necessarily. Yeah. But you did mention kids, you mentioned white lies. Yeah. So you found a study.
Supposedly, according to Tom, around age two or three we realize that we're not being constantly supervised, that reality exists outside of other people's view. Our reality sometimes does. When we go into another room, or Mom's in the other room, I'm still here playing. Yes. And it becomes clear to us that we are responsible for conveying that information, but we have a choice, i.e., free will, that allows us to decide whether we convey that information truthfully or dishonestly. Right. Yeah, and then after that it just takes off like a rocket. Yeah. This one study found that nine in ten of the kids that they spied on... and this is just cruel. They put kids in a room with, like, a stuffed animal, or they called it a "soft toy," which was a little creepy, behind them, and said, "Don't look at that behind you." And they would leave the room and videotape them, and of course nine out of ten kids turned around and looked at it. And then... did it actually say how many of them lied? I don't even see that.
I guess they were just told not to look at it, and I don't know if that's a lie. You know, that's a little wonky. It's not lying; it's disobeying, not following orders. All right, well, how about this: the second part of that study was older kids. They would give them a test with the answers on the back and tell them not to look at the answers. There'd be a fake answer, and when they were asked to explain the fake answer, they would make up a lie like, "Oh, I learned that in history class." Right. And actually, the question was "Who discovered Tunisia?" That's one of them, and on the back the answer was a fake one, Presidious Aikman, which sounds like somebody from Star Wars, doesn't it? The dumb kids who cheated put that down. But then, when given the opportunity to fess up, they lied. What I just disagreed with about this study was that it said the smartest kids told the best lies. How do you qualify that, you know, the "best" lies? If it's believable and conveyed in a way that's...
But that's all subjective. With confidence. Totally subjective. It's methodologically unsound. But the whole point of these studies was that they believe children who are able to lie have faster-developing brains, especially in the areas involved in executive functioning. So they conclude that a child who lies early on might be more successful later in life. Yeah. Well, I mean, you can make a case that lying is basically using your imagination, especially when you're a child. And they have other studies that support the idea that rich people are more prone to lie and manipulate. So rich people are more likely to do a lot of shady stuff, you think? Well, yeah, that's how a lot of them get rich. Huh. Yeah. Chuck, you want to talk about why people tell lies, like some of the great categories of lies, the Big Six, as you call them? Sure. Number one, Josh, is to conceal a misdeed and stay out of trouble.
And even at the advanced age of forty-one, this is the reason I will most often lie, because I hate being in trouble and I will try to cover for myself sometimes, even though in the end, in the long run, it's better to tell the truth. Agreed. In the short term you might get away with something, but you're not doing yourself any favors. I call these lies of cowardice. Thank you. It basically sucks. I mean, you're having to go through this terrible, uncomfortable moment or whatever it is. But I'm really glad Tom pointed out that at least in the short term you might not gain an advantage, but overall there is a positive outcome: you're building trust by fessing up, or whatever, you know. Number two: to preserve a reputation. The example Tom uses is a recovering drug addict who might lie about having gone to rehab or something like that, to, like, a prospective romantic partner. Yeah, and I think these are kind of understandable to an extent. It's like, you know, we have walls up.
Yeah. You know, if you're just a normal person, you don't walk down the street like, "Hey, Josh, good to meet you. I spent some time in rehab." There's just a certain amount, and then once you get to that point, well, then maybe you are kind of short-shrifting somebody who you consider close to you if you're not telling them that. But as you gain trust, then you open up more, and you should. Number three, this one of course we all get: the white lie, to avoid hurting someone's feelings. We already kind of covered that. Number four: to increase stature or reputation. Yeah, these are just the boastful people. They're like, "Yeah, I scored forty-two in a high school basketball game one time, and won county in track," and you do some investigation and you realize they weren't even on the track team. Right. And I've realized that when you have a lot of these people in your life, it's usually time to get introspective, because there's something wrong with the way that you're living
if you have people like this orbiting you a lot. Yeah. Those people wear me out, they really do. I've done a pretty good job of shaking everybody like that off. I can't think of anybody in my life, even acquaintance-wise, who just makes up lies about themselves. Agreed. And I have known people like that. They don't stick around. No, especially once you get out of, like, your mid-twenties; that's about the edge of where you should still be interacting with those people. Well, and I think that's sort of an immaturity thing. If you're still doing that as you grow older, then you've got some serious problems. But everyone, when they're younger, probably stretches the truth about things they've done. Sure, but even still, don't you remember just kind of being uncomfortable when somebody was doing this, and everyone in the group they're talking to is like, "We know you're lying. Please stop"? You might as well be putting bamboo shoots under a fingernail. Number five: to manipulate.
And this is probably, like I said, where the rich people are more prone to lie. This is evasive or defensive; it's to gain something for yourself. It's probably the most vicious type of lying, because it also can involve telling lies about other people to get ahead, or to destroy someone else's reputation. Man, those people, there's a special place in one of Dante's circles for you. Agreed. Now, those people you start to run into more of, especially as you enter the corporate world. And you know what, they may be the same people who were telling the other lies when they were younger, you know what I'm saying? They just evolved to be manipulators. And the last one, Josh, is to control information. This is what we mentioned about indirect lying, or withholding parts of the truth, a lie of omission. Yeah, which I'm more able to forgive, something like that. So that's, as you called it, the Big Six. Huh, the Big Six. Chuck, one of the running themes that we've had here is that everybody lies.
I don't think that we've... Yeah, you said it explicitly. The Dalai Lama doesn't lie. I don't know. I could see him wanting to preserve harmony or balance or something like that at a diplomatic dinner or something, or to spare one of his followers' feelings, like, "Yes, that is a really nice sand mandala." Sure, you know. Yeah, he gets put in a lot of high-pressure social situations. So at the very least, everyone tells little white lies, I imagine. Right. But there are some types of personalities that are much more prone to lying, the big five. Yes, like pathological liars. Yes, they are the worst of the worst, because they are sociopathic and they don't understand right and wrong. And they're probably really good liars, because since they don't understand right or wrong, they're not going to have the stresses associated with guilt in lying. Bad people. Yeah, their consciences don't put them through their paces when they're lying, because they lack, in whole or in part, consciences compared to non-sociopathic people. Right. Number two.
I've known one of these people, a compulsive liar, and I really felt bad for him, because they almost seem like they can't help it. Yeah, and you would think a pathological liar and a compulsive liar are one and the same. They're not. No. A pathological liar lies as a strategy; it's a means of gaining advantage. A compulsive liar can't help it. Their brain has been trained to lie as a first resort; even if there's no gain whatsoever in lying, they will still just generally lie. It's the first thing, their gut reaction. With this guy, it was really clear that he lied as a first option, as his go-to, when he didn't need to, and it was so frustrating. I had sort of a big-brotherly relationship with this guy, so he wasn't a peer, and I tried to help him through that. But I haven't talked to him in a long time. I don't know if it worked.
So you 323 00:18:59,240 --> 00:19:03,679 Speaker 1: abandoned him. I had faith in him. Um. And one of the... 324 00:19:03,920 --> 00:19:06,120 Speaker 1: and this of course makes sense in his case too, 325 00:19:06,160 --> 00:19:09,879 Speaker 1: without getting specific, but they said that living in an abusive environment, 326 00:19:10,280 --> 00:19:14,080 Speaker 1: where lying is necessary to self-preservation, might be where that 327 00:19:14,119 --> 00:19:16,600 Speaker 1: stems from. And I think that was the case for him. Yeah, 328 00:19:16,600 --> 00:19:20,200 Speaker 1: it's like the brain has been trained to, um, lie, 329 00:19:20,640 --> 00:19:24,000 Speaker 1: like, this is what you do. But the good 330 00:19:24,000 --> 00:19:26,879 Speaker 1: thing is, if you run across somebody in your life 331 00:19:26,880 --> 00:19:29,320 Speaker 1: who cares enough about you, it can be trained out 332 00:19:29,359 --> 00:19:32,040 Speaker 1: of you, I imagine, although I'm sure it's a painful 333 00:19:32,080 --> 00:19:36,800 Speaker 1: process for both people. Yeah, it's probably... are you really waiting for me to say it? Like 334 00:19:36,800 --> 00:19:41,720 Speaker 1: deprogramming cult people. UM. Narcissists 335 00:19:42,000 --> 00:19:45,040 Speaker 1: of course will lie. These are the people who, um, 336 00:19:45,240 --> 00:19:49,400 Speaker 1: lie to increase stature and reputation, but they do themselves 337 00:19:49,440 --> 00:19:53,760 Speaker 1: no favors. Borderline personalities. I thought this one was kind 338 00:19:53,800 --> 00:19:56,960 Speaker 1: of interesting. They will go through wild mood swings and 339 00:19:57,040 --> 00:20:00,520 Speaker 1: engage in really risky behavior, but then they come back 340 00:20:00,560 --> 00:20:04,359 Speaker 1: down to normalcy and they're like, oh, um, I just 341 00:20:04,480 --> 00:20:07,320 Speaker 1: gambled away our savings account, and then they'll lie to 342 00:20:07,359 --> 00:20:11,439 Speaker 1: cover those up, right. Uh.
And then histrionic personalities. Uh, 343 00:20:11,560 --> 00:20:14,560 Speaker 1: these are people... if you have like a true histrionic 344 00:20:14,560 --> 00:20:19,919 Speaker 1: personality disorder, you are attention seeking, excessively needy, uh, for 345 00:20:19,960 --> 00:20:24,960 Speaker 1: approval, and emotional. Um. Apparently women are more prone to 346 00:20:25,000 --> 00:20:28,919 Speaker 1: be histrionic by a ratio of four to one. Oh, is that 347 00:20:29,000 --> 00:20:30,480 Speaker 1: right? That's what they say, or at least have the 348 00:20:30,560 --> 00:20:35,440 Speaker 1: disorder. And, um, these are people desperate for attention, 349 00:20:35,480 --> 00:20:37,440 Speaker 1: like, if you leave, if you walk out that door, 350 00:20:37,480 --> 00:20:40,240 Speaker 1: I'm going to kill myself. That was a great example Tom used. 351 00:20:40,600 --> 00:20:42,919 Speaker 1: Yeah, that's a lie. You and me, we were on this 352 00:20:42,960 --> 00:20:45,440 Speaker 1: AirTran flight once, and UM, we were waiting 353 00:20:45,480 --> 00:20:49,439 Speaker 1: to take off, and this caterer was backing off of 354 00:20:49,480 --> 00:20:52,160 Speaker 1: his little sky ramp, he was backing away 355 00:20:52,160 --> 00:20:56,639 Speaker 1: from the other door, the catering door, UM, and apparently 356 00:20:56,640 --> 00:20:58,359 Speaker 1: it was still hooked, a part of it or whatever. 357 00:20:58,400 --> 00:21:00,639 Speaker 1: She was like, no, no, don't do it, and he 358 00:21:00,760 --> 00:21:03,200 Speaker 1: kept going, and she turned around, was like, well, we're 359 00:21:03,200 --> 00:21:05,679 Speaker 1: not going anywhere for a while. And the way she 360 00:21:05,720 --> 00:21:08,800 Speaker 1: said it, you could tell... we were discussing this, 361 00:21:08,920 --> 00:21:11,160 Speaker 1: like, neither one of us believed her.
We just knew 362 00:21:11,200 --> 00:21:14,480 Speaker 1: she was wrong because of the way she just threw it out there, like, 363 00:21:14,480 --> 00:21:16,600 Speaker 1: well, we're not going anywhere for a while. And 364 00:21:16,640 --> 00:21:19,040 Speaker 1: of course, like, everything was fine within like fifteen minutes, 365 00:21:19,440 --> 00:21:21,080 Speaker 1: but the way she sounded, it was gonna be like 366 00:21:21,280 --> 00:21:24,240 Speaker 1: stuck on a tarmac for three hours. And it was weird. 367 00:21:24,280 --> 00:21:28,280 Speaker 1: We were afforded this glimpse into this woman's true personality, and 368 00:21:28,520 --> 00:21:32,320 Speaker 1: I wouldn't say full-blown histrionic. She was a 369 00:21:32,320 --> 00:21:35,560 Speaker 1: bit histrionic, and that was probably low level, you could tell. 370 00:21:37,960 --> 00:21:41,080 Speaker 1: All right, so stay away from her. How can you 371 00:21:41,119 --> 00:21:44,280 Speaker 1: tell if someone's lying? We've talked about micro expressions, yeah, 372 00:21:45,280 --> 00:21:49,600 Speaker 1: in the micro expression episode. Yeah, it was like three years ago. 373 00:21:50,680 --> 00:21:54,400 Speaker 1: Uh, yeah, yeah, it has been. Well, basically, for 374 00:21:54,480 --> 00:21:57,200 Speaker 1: those people who don't know, we have a good 375 00:21:57,640 --> 00:22:01,680 Speaker 1: hundred and thirty, hundred and fifty episodes still 376 00:22:01,960 --> 00:22:05,920 Speaker 1: not on iTunes, um, including the episode we did 377 00:22:05,920 --> 00:22:11,800 Speaker 1: on micro expressions. Basically, a micro expression is an uncontrollable, fleeting, 378 00:22:11,880 --> 00:22:17,240 Speaker 1: like millisecond long facial expression that is linked to your 379 00:22:17,280 --> 00:22:19,840 Speaker 1: true feelings. That was longer than a millisecond. I just 380 00:22:19,880 --> 00:22:22,399 Speaker 1: made a quick frowny face. That was a grimace, was it?
381 00:22:22,680 --> 00:22:25,720 Speaker 1: You looked like Megan Amram. I don't know her. This 382 00:22:25,880 --> 00:22:30,840 Speaker 1: girl on Twitter, she makes faces like that. It's hilarious. Um, anyway. Uh, 383 00:22:31,680 --> 00:22:34,080 Speaker 1: it's... you can't control it. It's linked to 384 00:22:34,119 --> 00:22:38,679 Speaker 1: your true feelings, and people have been shown to be 385 00:22:38,720 --> 00:22:42,160 Speaker 1: able to pick up on these without even consciously knowing it. 386 00:22:42,720 --> 00:22:44,959 Speaker 1: You just get what we would call a gut feeling 387 00:22:45,040 --> 00:22:50,400 Speaker 1: about somebody, because their smile and that sudden fleeting micro 388 00:22:50,480 --> 00:22:54,720 Speaker 1: expression of, like, contempt, back into a smile, that you 389 00:22:54,880 --> 00:22:59,040 Speaker 1: didn't really see but you caught, don't add up, and 390 00:22:59,119 --> 00:23:01,200 Speaker 1: your body is like, why am I having a weird 391 00:23:01,240 --> 00:23:03,960 Speaker 1: reaction to this person? So that's a micro expression, and 392 00:23:03,960 --> 00:23:08,800 Speaker 1: they're apparently linked to lies and lying. So basically, 393 00:23:08,800 --> 00:23:11,280 Speaker 1: if you have a gut feeling that somebody's lying, you 394 00:23:11,359 --> 00:23:13,320 Speaker 1: might be onto something. Yes, and it might be because 395 00:23:13,359 --> 00:23:16,640 Speaker 1: of a micro expression that you saw. Interesting. You know this, 396 00:23:17,119 --> 00:23:19,280 Speaker 1: I know, but I still find it interesting. Okay, I'm 397 00:23:19,280 --> 00:23:21,919 Speaker 1: not lying. That's your new thing? Interesting? Did you know 398 00:23:21,960 --> 00:23:24,800 Speaker 1: that, that I said that? Um, I've been on that 399 00:23:24,880 --> 00:23:31,920 Speaker 1: for a while. Yeah, I've ramped it up lately. Interesting. Uh.
400 00:23:31,960 --> 00:23:34,679 Speaker 1: These are nonverbal cues, by the way, micro expressions. 401 00:23:34,720 --> 00:23:37,480 Speaker 1: Another one is a forced smile. And I think we've 402 00:23:37,480 --> 00:23:39,360 Speaker 1: talked about this too. That's when you smile with your 403 00:23:39,359 --> 00:23:43,720 Speaker 1: mouth only. Yeah. It's so creepy seeing it. 404 00:23:43,760 --> 00:23:45,760 Speaker 1: You're just like, you know, what are you doing? Right? 405 00:23:45,880 --> 00:23:49,760 Speaker 1: You look weird right now. Um, but seeing it described, 406 00:23:49,840 --> 00:23:52,000 Speaker 1: the way Tom wrote it, which is a perfect description, 407 00:23:52,119 --> 00:23:57,080 Speaker 1: was creepy to me. Yes, it was. Um. Another thing 408 00:23:57,119 --> 00:23:59,840 Speaker 1: you can look for is someone nodding yes during 409 00:23:59,840 --> 00:24:02,400 Speaker 1: a denial of some kind. That could be a dead giveaway. 410 00:24:02,560 --> 00:24:06,520 Speaker 1: It's terrible. Sometimes it's like way more outward, like you 411 00:24:06,600 --> 00:24:10,240 Speaker 1: get literally in a defensive position, right, like crossing your 412 00:24:10,359 --> 00:24:13,800 Speaker 1: arms or moving away, turning away from somebody who's questioning you, 413 00:24:14,080 --> 00:24:18,520 Speaker 1: and fidgeting. Finally we get to fidgeting. Like, fidgeting is 414 00:24:19,040 --> 00:24:23,719 Speaker 1: significant if the person is normally calm or doesn't normally fidget. 415 00:24:23,800 --> 00:24:27,960 Speaker 1: If it's a fidgety, nervous person to begin with, then 416 00:24:28,160 --> 00:24:32,120 Speaker 1: it doesn't mean that they're lying. If they're 417 00:24:32,160 --> 00:24:36,520 Speaker 1: fidgety and then they get calm all of a sudden 418 00:24:36,680 --> 00:24:41,200 Speaker 1: while you think that they're lying, then you're onto something there.
419 00:24:41,320 --> 00:24:44,159 Speaker 1: But they're probably the weirdest person in the world, and 420 00:24:44,200 --> 00:24:46,320 Speaker 1: you should force them into lies just to watch them 421 00:24:46,600 --> 00:24:50,200 Speaker 1: go from fidgety to normal, because it'd be really odd 422 00:24:50,240 --> 00:24:54,520 Speaker 1: to see. Um, verbal cues. Sometimes, you know, you can pick 423 00:24:54,600 --> 00:24:58,760 Speaker 1: up on nonverbal cues. Sometimes they are quite verbal. Um, 424 00:24:58,800 --> 00:25:02,240 Speaker 1: so, like, ask me a question. Ask me why 425 00:25:02,240 --> 00:25:04,520 Speaker 1: I didn't come over last night and help you with 426 00:25:04,560 --> 00:25:10,680 Speaker 1: your lawn mowing. Uh, that's a terrible question. 427 00:25:10,720 --> 00:25:14,000 Speaker 1: All right, ask me, why did you tell me 428 00:25:14,119 --> 00:25:17,240 Speaker 1: that the Sweetwater thing doesn't have a date when it 429 00:25:17,280 --> 00:25:21,359 Speaker 1: clearly does? Well, I'll fake it, and then I'll tell 430 00:25:21,359 --> 00:25:25,960 Speaker 1: you the real answer, because that actually does have a real answer. Um, 431 00:25:26,000 --> 00:25:29,480 Speaker 1: so what you're saying is that you looked and saw that 432 00:25:29,840 --> 00:25:32,600 Speaker 1: my band's gig for Sweetwater didn't have a date on it? 433 00:25:32,720 --> 00:25:36,360 Speaker 1: Is that what you're asking me? Yes. Well, I mean, 434 00:25:36,520 --> 00:25:41,639 Speaker 1: I could tell you, uh, right now exactly. There's a 435 00:25:41,800 --> 00:25:44,600 Speaker 1: very good reason for that, Josh. Then why don't you, liar? 436 00:25:45,400 --> 00:25:48,040 Speaker 1: So those are some of the actual verbal cues: 437 00:25:48,600 --> 00:25:52,360 Speaker 1: taking too long, not using contractions, repeating the question, basically 438 00:25:52,359 --> 00:25:55,880 Speaker 1: buying yourself time. Sometimes not using contractions.
It's hilarious to me, like, 439 00:25:56,000 --> 00:26:00,080 Speaker 1: I cannot believe that you want to know why. I will 440 00:26:00,320 --> 00:26:03,280 Speaker 1: tell you why I did not tell you this, and 441 00:26:03,320 --> 00:26:06,879 Speaker 1: the reason why, the real reason why... No, the reason 442 00:26:06,920 --> 00:26:09,520 Speaker 1: why you're not using contractions? Well, to draw it out 443 00:26:09,880 --> 00:26:12,480 Speaker 1: so your brain, the other part of you, can 444 00:26:12,560 --> 00:26:15,880 Speaker 1: come up with the answer. That's so hilarious. The real 445 00:26:16,000 --> 00:26:19,080 Speaker 1: reason why I said that about Sweetwater is because I 446 00:26:19,160 --> 00:26:21,440 Speaker 1: just found out two days ago that they have gone 447 00:26:21,440 --> 00:26:27,640 Speaker 1: with another band, and it's officially canceled. No. Yeah, I'm 448 00:26:27,640 --> 00:26:31,720 Speaker 1: a little bummed. Do you want me to talk to somebody? No. Now, 449 00:26:31,760 --> 00:26:33,919 Speaker 1: I was told that one of the reasons why they 450 00:26:33,920 --> 00:26:35,840 Speaker 1: went with another band is because we didn't have, like, 451 00:26:36,760 --> 00:26:41,199 Speaker 1: video and things to send, to show. You've clogged up 452 00:26:41,240 --> 00:26:45,399 Speaker 1: your two fifty gig computer with HD videos. Yeah, but 453 00:26:46,000 --> 00:26:47,400 Speaker 1: a lot of stuff we don't want to send out, 454 00:26:48,880 --> 00:26:52,639 Speaker 1: like a lot of pants lists, pencils, l cheap. Well, 455 00:26:52,760 --> 00:26:55,879 Speaker 1: I think that stinks. If you're at Sweetwater Brewery right now, 456 00:26:56,240 --> 00:26:59,639 Speaker 1: shame on you. Now, this wasn't them. This was an NGO. 457 00:26:59,800 --> 00:27:03,159 Speaker 1: This was a fundraiser. So to the fan who wrote in 458 00:27:03,320 --> 00:27:09,720 Speaker 1: to ask: you bumped me. You bumped me, really, for another band.
459 00:27:09,800 --> 00:27:11,840 Speaker 1: And I've looked at the other band too, like, they 460 00:27:11,880 --> 00:27:16,760 Speaker 1: have a website and videos and... lady, stop listening right now. No, 461 00:27:17,040 --> 00:27:19,760 Speaker 1: don't do that, at least for the rest of the episode. 462 00:27:19,840 --> 00:27:22,639 Speaker 1: It's okay. It was humbling. Well, I'm sorry that I 463 00:27:22,720 --> 00:27:25,200 Speaker 1: brought it up. That's okay. Might as well get it 464 00:27:25,200 --> 00:27:26,680 Speaker 1: out there. I'm glad you did, because you probably thought 465 00:27:26,680 --> 00:27:30,680 Speaker 1: I was lying to you. Where are we? We are 466 00:27:31,600 --> 00:27:33,639 Speaker 1: now going to teach people how to lie, which I 467 00:27:33,720 --> 00:27:36,920 Speaker 1: thought was an interesting feature of this article. Okay, I, um, 468 00:27:37,880 --> 00:27:40,520 Speaker 1: I thought it was interesting. I also refuse to do it. 469 00:27:40,640 --> 00:27:43,439 Speaker 1: I find it wholly amoral to teach people how 470 00:27:43,520 --> 00:27:45,000 Speaker 1: to lie. Do you want to skip this, then? No, 471 00:27:45,359 --> 00:27:48,720 Speaker 1: but it's still information. I feel like we can say 472 00:27:49,080 --> 00:27:53,480 Speaker 1: people who are successful at lying tend to do certain things, 473 00:27:54,080 --> 00:27:57,280 Speaker 1: but teaching somebody how to lie, I think, is just 474 00:27:57,480 --> 00:28:00,400 Speaker 1: utterly wrong. Well, I imagine there's been more than one attorney 475 00:28:00,480 --> 00:28:03,639 Speaker 1: giving these instructions to their clients. Definitely on the stand. 476 00:28:03,760 --> 00:28:07,840 Speaker 1: But I mean, like, not all attorneys. Boy, we're making all 477 00:28:07,920 --> 00:28:13,720 Speaker 1: kinds of friends today. Attorneys in Detroit especially, P.U. Alright, 478 00:28:13,840 --> 00:28:17,240 Speaker 1: so here's how to lie, so says Tom Scheve.
So 479 00:28:17,960 --> 00:28:23,000 Speaker 1: this is what successful liars do. Okay, stay calm. That's 480 00:28:23,040 --> 00:28:24,600 Speaker 1: what they'll probably tell you before they put you up on the 481 00:28:24,640 --> 00:28:27,240 Speaker 1: witness stand, especially with the polygraph. 482 00:28:28,200 --> 00:28:31,760 Speaker 1: Yeah, sure, because it measures all those fluctuations in, what, 483 00:28:32,000 --> 00:28:36,000 Speaker 1: temperature and heart rate? And yeah, 484 00:28:36,160 --> 00:28:41,040 Speaker 1: those two things. Just keep it simple, stupid. And this one, boy, 485 00:28:41,120 --> 00:28:43,640 Speaker 1: you see this one when someone's cooking up a lie, 486 00:28:43,840 --> 00:28:45,920 Speaker 1: and you know they're lying when they start adding 487 00:28:45,960 --> 00:28:49,320 Speaker 1: all these details, because they think, well, the more detail, 488 00:28:50,080 --> 00:28:52,560 Speaker 1: like, who could make this up? I was thinking about 489 00:28:52,600 --> 00:28:54,600 Speaker 1: this, and while I was reading this, 490 00:28:54,720 --> 00:28:56,520 Speaker 1: I was like going over my own behavior, 491 00:28:56,600 --> 00:29:00,200 Speaker 1: using myself as a model. It's impossible not to, I guess, right? Um, 492 00:29:00,640 --> 00:29:05,120 Speaker 1: this... this I did not get. Like, I tried 493 00:29:05,160 --> 00:29:07,600 Speaker 1: to think back to, like, any lie or story I've 494 00:29:07,640 --> 00:29:11,719 Speaker 1: ever told, um, like just adding details 495 00:29:11,840 --> 00:29:14,479 Speaker 1: and, like, information that has nothing to do with anything. 496 00:29:14,600 --> 00:29:17,200 Speaker 1: Like, I don't get that. Yeah, well, you know, it 497 00:29:17,200 --> 00:29:19,200 Speaker 1: probably means you're not a very good liar. That's a 498 00:29:19,240 --> 00:29:22,000 Speaker 1: good thing. I hope that's what it means.
But that's 499 00:29:22,040 --> 00:29:26,040 Speaker 1: like the Eddie Murphy... uh, remember his joke about, you know, 500 00:29:26,200 --> 00:29:29,000 Speaker 1: the lady called the guy, like, the car was at 501 00:29:29,040 --> 00:29:30,880 Speaker 1: the other lady's house, and it was just, wasn't me? 502 00:29:32,000 --> 00:29:33,600 Speaker 1: That's what he kept saying. Yeah, but I caught you, 503 00:29:33,720 --> 00:29:35,280 Speaker 1: and, you know, I caught you red-handed, and... it 504 00:29:35,480 --> 00:29:37,960 Speaker 1: wasn't me. Oh yeah, yeah. And I guess that 505 00:29:38,040 --> 00:29:39,320 Speaker 1: kind of goes back to it: keep it as 506 00:29:39,320 --> 00:29:42,600 Speaker 1: simple as possible. Richard Pryor apparently also had a little 507 00:29:42,640 --> 00:29:45,200 Speaker 1: bit where, like, um, he said that he came in 508 00:29:45,480 --> 00:29:48,560 Speaker 1: and found, like, his wife or his girlfriend in bed 509 00:29:48,640 --> 00:29:50,600 Speaker 1: with another guy, and she's like, who are you gonna 510 00:29:50,640 --> 00:29:56,920 Speaker 1: believe, me or your lying eyes? That's pretty good. Um, 511 00:29:57,000 --> 00:30:02,440 Speaker 1: remain steady. Um. And this goes back to... you 512 00:30:02,520 --> 00:30:05,960 Speaker 1: will be, you're being studied. Let's go ahead and 513 00:30:06,000 --> 00:30:08,360 Speaker 1: use the courtroom example, like we're coaching you to go 514 00:30:08,440 --> 00:30:10,000 Speaker 1: on the stand. This jury is gonna be watching you. 515 00:30:10,640 --> 00:30:14,560 Speaker 1: Just remain steady. If you're all fidgety, stay fidgety. People 516 00:30:14,600 --> 00:30:18,400 Speaker 1: who are good liars tend to remain steady, right. And 517 00:30:18,960 --> 00:30:20,520 Speaker 1: what they mean by that is not just, like, 518 00:30:20,760 --> 00:30:23,800 Speaker 1: you have to keep a steady hand. That was already 519 00:30:23,880 --> 00:30:27,600 Speaker 1: covered in staying calm.
People who are good liars continue 520 00:30:28,080 --> 00:30:31,719 Speaker 1: the behavior that they had before. If they were 521 00:30:31,760 --> 00:30:34,760 Speaker 1: relaxed before the line of questioning started, they're relaxed 522 00:30:34,840 --> 00:30:38,040 Speaker 1: during it, and they're relaxed after. If they're fidgety, then 523 00:30:38,080 --> 00:30:41,360 Speaker 1: they're fidgety before, during, and after. Yeah, because they say 524 00:30:41,400 --> 00:30:43,760 Speaker 1: one of the telltale signs is, um, or one way 525 00:30:43,840 --> 00:30:45,400 Speaker 1: to trip up a liar, which we'll 526 00:30:45,400 --> 00:30:48,600 Speaker 1: talk about, is to change the subject and see if 527 00:30:48,760 --> 00:30:51,480 Speaker 1: you see them relax, because they think, oh man, that's over, 528 00:30:51,600 --> 00:30:55,520 Speaker 1: thank god. Yeah, Tom says, um, once the questioning is over, 529 00:30:55,560 --> 00:31:00,080 Speaker 1: don't suddenly relax and appear relieved, like, all right, I 530 00:31:00,120 --> 00:31:04,240 Speaker 1: can get down off the stand now. Um. And then, 531 00:31:04,720 --> 00:31:09,400 Speaker 1: good liars generally are affable, which makes sense. They make 532 00:31:09,640 --> 00:31:12,960 Speaker 1: people want to believe them, because I think if you 533 00:31:13,000 --> 00:31:17,040 Speaker 1: don't like somebody, it's easier to be suspicious of them. Yeah. Well, 534 00:31:17,600 --> 00:31:20,120 Speaker 1: how many people have gotten away with horrible stuff because 535 00:31:20,480 --> 00:31:24,480 Speaker 1: they just seem likable? What about Ted Bundy? How many people? 536 00:31:24,720 --> 00:31:27,440 Speaker 1: How many more people was he able to kill until 537 00:31:27,600 --> 00:31:30,200 Speaker 1: he got caught?
Remember we talked about him before? He 538 00:31:30,280 --> 00:31:32,760 Speaker 1: got caught because he went on, like, just a completely reckless, 539 00:31:32,880 --> 00:31:35,960 Speaker 1: like, killing rampage in a sorority house. Yeah, he killed 540 00:31:36,040 --> 00:31:39,160 Speaker 1: for years before that. Yeah, but he was white and 541 00:31:39,280 --> 00:31:42,000 Speaker 1: he had a good haircut, so, I mean, he 542 00:31:42,080 --> 00:31:46,080 Speaker 1: couldn't have been any threat to anybody. I love that 543 00:31:46,160 --> 00:31:49,200 Speaker 1: serial killer stuff. I could do, like, every other show 544 00:31:49,280 --> 00:31:53,040 Speaker 1: on some aspect of that. We'll do another serial killer one. Okay. Okay, 545 00:31:53,160 --> 00:31:57,960 Speaker 1: so, Chuck, well, I don't find it amoral 546 00:31:58,040 --> 00:32:00,960 Speaker 1: to teach people to tell when someone is lying, so 547 00:32:01,120 --> 00:32:05,120 Speaker 1: we can do the five steps. The Big 548 00:32:05,160 --> 00:32:07,880 Speaker 1: Five, Part Two. Part two. Um, and we kind of already 549 00:32:07,920 --> 00:32:12,120 Speaker 1: talked about establishing the baseline. Like, if you think someone 550 00:32:12,240 --> 00:32:14,920 Speaker 1: might be lying and you're quizzing them, look at their 551 00:32:14,960 --> 00:32:20,520 Speaker 1: behavior very closely and determine how they just normally act, 552 00:32:20,640 --> 00:32:22,160 Speaker 1: and just go ahead and log that in your brain. 553 00:32:22,280 --> 00:32:24,440 Speaker 1: That's step one, right. And I mean, this is, like, 554 00:32:24,560 --> 00:32:27,920 Speaker 1: I guess, if you're a professional interrogator, 555 00:32:28,520 --> 00:32:31,440 Speaker 1: like the lady in that terrible show, um, or on 556 00:32:31,600 --> 00:32:35,000 Speaker 1: Law and Order, the great shows, um, or if you're 557 00:32:35,000 --> 00:32:36,920 Speaker 1: on a jury, this is good advice.
If you sit 558 00:32:36,960 --> 00:32:38,800 Speaker 1: on a jury, you get to watch these people. That's a good 559 00:32:38,840 --> 00:32:43,560 Speaker 1: one too, for sure. Um, or if you are, um, something... uh, 560 00:32:44,320 --> 00:32:47,479 Speaker 1: if you are really hell-bent on finding out if 561 00:32:47,520 --> 00:32:50,320 Speaker 1: somebody that you interact with is lying to you, and 562 00:32:50,440 --> 00:32:52,080 Speaker 1: you do a lot of pre planning, you could do 563 00:32:52,160 --> 00:32:55,320 Speaker 1: this too. But yes, before you let on that there's 564 00:32:55,400 --> 00:32:57,239 Speaker 1: a line of questioning that's going to be coming up, 565 00:32:57,360 --> 00:32:59,920 Speaker 1: you want to interact with the person and, um, make 566 00:33:00,160 --> 00:33:03,320 Speaker 1: notes about their behavior. Are they a Fidgety Joe? Not 567 00:33:03,480 --> 00:33:07,280 Speaker 1: literally making notes in front of them. Or if you 568 00:33:07,360 --> 00:33:10,720 Speaker 1: do do that, just don't let them see what you write. Alright? 569 00:33:10,760 --> 00:33:13,120 Speaker 1: So number two is, once you've established this, look 570 00:33:13,160 --> 00:33:17,080 Speaker 1: for deviations from that. Pretty much a no-brainer. Like, 571 00:33:17,160 --> 00:33:19,560 Speaker 1: did they start fidgeting, all that stuff? Yeah. Or if 572 00:33:19,560 --> 00:33:23,040 Speaker 1: they're Fidgety Joe, do they turn into a Smooth Samuel 573 00:33:24,360 --> 00:33:26,680 Speaker 1: all of a sudden? That's weird. That would 574 00:33:26,720 --> 00:33:28,960 Speaker 1: just be so weird. It would be... it's weird. Or 575 00:33:29,000 --> 00:33:32,120 Speaker 1: if they're Smooth Samuel and they're turning into a Fidgety Joe, 576 00:33:32,800 --> 00:33:35,800 Speaker 1: there you go. That's a deviation from the baseline. 577 00:33:35,840 --> 00:33:40,000 Speaker 1: That's right. Uh.
Step three, you really gotta listen. Um, so 578 00:33:40,200 --> 00:33:43,360 Speaker 1: they might be pretty steely with their nonverbal cues. So 579 00:33:43,520 --> 00:33:46,239 Speaker 1: just listen to what they're saying, and is it adding up? 580 00:33:46,480 --> 00:33:51,760 Speaker 1: If they're spouting off all kinds of details, maybe lead 581 00:33:51,800 --> 00:33:54,000 Speaker 1: them down a different path and then jump back to 582 00:33:54,080 --> 00:33:56,440 Speaker 1: those details and see if they're still on those. And 583 00:33:56,520 --> 00:33:59,680 Speaker 1: then you pick out a detail that seems a little 584 00:33:59,720 --> 00:34:02,680 Speaker 1: hinky to you, as you would say, and start asking 585 00:34:02,720 --> 00:34:06,240 Speaker 1: them questions about that, because then they may have to 586 00:34:06,320 --> 00:34:09,400 Speaker 1: lie about the lie, right. And if none of it's fitting, 587 00:34:10,000 --> 00:34:12,480 Speaker 1: are they having to make up more information to explain 588 00:34:12,560 --> 00:34:17,680 Speaker 1: why certain things aren't fitting into this? Um, and eventually, 589 00:34:17,719 --> 00:34:20,279 Speaker 1: if you draw the line of questioning out enough, you're 590 00:34:20,320 --> 00:34:24,120 Speaker 1: going to drive the person totally insane, because their brains 591 00:34:24,160 --> 00:34:26,480 Speaker 1: are working overtime. Yeah, The Tell-Tale Heart, I think. 592 00:34:26,560 --> 00:34:30,520 Speaker 1: And that would happen, except that they actually didn't 593 00:34:30,560 --> 00:34:32,799 Speaker 1: even know what was going on, those guys. Oh yeah, 594 00:34:32,840 --> 00:34:35,040 Speaker 1: that's right. They were sort of innocently questioning, and it 595 00:34:35,160 --> 00:34:37,080 Speaker 1: was all in his head.
Either that or it was 596 00:34:37,200 --> 00:34:40,759 Speaker 1: really a phantom heart, right? That's the... I guess you 597 00:34:40,880 --> 00:34:44,000 Speaker 1: just shouldn't murder. Yeah. Boy, that John Cusack movie about 598 00:34:44,040 --> 00:34:47,160 Speaker 1: Poe looks like one of the worst pieces of garbage 599 00:34:47,400 --> 00:34:50,879 Speaker 1: to me, it really does. Whose idea was that? Let's 600 00:34:51,040 --> 00:34:53,920 Speaker 1: make a movie about Edgar Allan Poe as 601 00:34:53,960 --> 00:34:59,960 Speaker 1: a murderer hunter. And I'm thinking, John Cusack? Yeah, really, 602 00:35:00,680 --> 00:35:04,560 Speaker 1: although I have to say, Abraham Lincoln: Vampire Hunter looks awesome. Yeah, 603 00:35:04,560 --> 00:35:06,239 Speaker 1: I totally want to see that. Man, I can't wait to see that. 604 00:35:07,200 --> 00:35:10,920 Speaker 1: No, but I'm sorry, I saw the 605 00:35:11,760 --> 00:35:14,920 Speaker 1: preview for the Poe one and for the, um, Abraham 606 00:35:14,960 --> 00:35:17,320 Speaker 1: Lincoln one, like, right next to each other. So I 607 00:35:17,760 --> 00:35:21,560 Speaker 1: linked the two, and they look similar in mood and tone. 608 00:35:22,760 --> 00:35:26,239 Speaker 1: Um, oh, hold on. With everything adding up 609 00:35:26,480 --> 00:35:28,759 Speaker 1: and drawing it out, one of the other things we 610 00:35:28,880 --> 00:35:32,840 Speaker 1: talked about was body language, where, um, body language 611 00:35:32,920 --> 00:35:35,360 Speaker 1: might belie what they're really thinking or what they 612 00:35:35,440 --> 00:35:38,360 Speaker 1: really believe.
Where if you're, if you're denying something but 613 00:35:38,480 --> 00:35:42,200 Speaker 1: you're nodding your head. And she talked about, um, A-Rod 614 00:35:42,440 --> 00:35:45,880 Speaker 1: or somebody like that who was involved in steroids, 615 00:35:46,000 --> 00:35:48,960 Speaker 1: and he was on, like, 60 Minutes, and someone broke 616 00:35:49,000 --> 00:35:52,440 Speaker 1: down their micro expressions, yeah, and found clearly that he 617 00:35:52,600 --> 00:35:54,399 Speaker 1: was, like, he was nodding his head when he said 618 00:35:54,520 --> 00:35:57,640 Speaker 1: yes, or something like that, at one point. So, um, 619 00:35:58,120 --> 00:36:00,560 Speaker 1: our bodies are so dumb? Yeah. And why, why are you doing that? Normally, 620 00:36:00,600 --> 00:36:03,399 Speaker 1: our body language matches up to our thoughts, because they're 621 00:36:03,640 --> 00:36:07,359 Speaker 1: accurate and true and instinctual, right. And when 622 00:36:07,400 --> 00:36:09,760 Speaker 1: we're lying, not only do we have to think about 623 00:36:09,840 --> 00:36:14,200 Speaker 1: the words we're saying and fabricate this alternate reality that 624 00:36:14,280 --> 00:36:17,520 Speaker 1: doesn't really exist except in our heads, we also have 625 00:36:17,600 --> 00:36:19,640 Speaker 1: to come up with the body language that's supposed to 626 00:36:19,760 --> 00:36:24,239 Speaker 1: match us being calm, us being truthful, whatever. And so 627 00:36:24,400 --> 00:36:27,759 Speaker 1: all of this thinky thinky can be confusing to, um, 628 00:36:28,000 --> 00:36:31,640 Speaker 1: a liar. And if you stretch it out over 629 00:36:31,680 --> 00:36:35,600 Speaker 1: the course of enough time, a line of questioning, they're 630 00:36:35,640 --> 00:36:38,120 Speaker 1: probably going to be like, why are you interrogating me, 631 00:36:38,239 --> 00:36:41,439 Speaker 1: or whatever.
And you've broken them. And at that point 632 00:36:41,520 --> 00:36:45,359 Speaker 1: you just drive the hammer home and literally beat them 633 00:36:45,480 --> 00:36:48,880 Speaker 1: to death with the hammer. That's the inevitable conclusion of 634 00:36:48,920 --> 00:36:51,600 Speaker 1: any line of questioning, right. Now, what you could do, though, 635 00:36:51,680 --> 00:36:56,440 Speaker 1: is pause, because a pause in the conversation might make 636 00:36:56,680 --> 00:36:59,440 Speaker 1: just a regular conversation feel uncomfortable. You know, the 637 00:36:59,440 --> 00:37:02,600 Speaker 1: awkward pause. But man, if someone's cooking up a lie 638 00:37:02,760 --> 00:37:05,160 Speaker 1: and you're asking them questions and you pause, and this 639 00:37:05,360 --> 00:37:08,040 Speaker 1: is a big time tactic by an attorney with someone 640 00:37:08,080 --> 00:37:11,000 Speaker 1: on the stand, that will seem like an eternity to 641 00:37:11,120 --> 00:37:14,319 Speaker 1: those people. Yeah, they may get fidgety or whatever. They'll 642 00:37:14,360 --> 00:37:17,719 Speaker 1: turn into, like, Miranda July on that videotape in Me 643 00:37:17,840 --> 00:37:19,839 Speaker 1: and You and Everyone We Know, where she's like, I can 644 00:37:19,920 --> 00:37:24,719 Speaker 1: do anything right now. Did you see that? 645 00:37:24,800 --> 00:37:26,319 Speaker 1: I did. I enjoyed that. You know what I'm talking 646 00:37:26,360 --> 00:37:27,920 Speaker 1: about? I know exactly what you're talking about. So if 647 00:37:27,960 --> 00:37:31,399 Speaker 1: somebody does that in the middle of a pause during 648 00:37:31,400 --> 00:37:34,800 Speaker 1: a line of questioning, yeah, they're lying. Uh. Or, finally, 649 00:37:35,000 --> 00:37:39,160 Speaker 1: they're adorable and artsy, right, and/or into mumble- 650 00:37:39,200 --> 00:37:42,160 Speaker 1: core. Um. And then we talked about the last one.
651 00:37:42,200 --> 00:37:44,719 Speaker 1: Step five is change the subject, and see if you 652 00:37:44,800 --> 00:37:50,319 Speaker 1: see them visibly relax, and then go ah-ha. Yeah, 653 00:37:50,400 --> 00:37:53,239 Speaker 1: where they're like, yeah, and they're like, oh wait, no, 654 00:37:53,520 --> 00:37:58,920 Speaker 1: I wasn't relaxing, right, I'm on meth. Yeah, that's a 655 00:37:58,960 --> 00:38:01,160 Speaker 1: good way to catch somebody off guard. Or you could 656 00:38:01,160 --> 00:38:05,239 Speaker 1: do the Columbo method, let them off the hook, let 657 00:38:05,320 --> 00:38:08,239 Speaker 1: them calm down, and then turn around and be like, one 658 00:38:08,280 --> 00:38:10,799 Speaker 1: more question. Was that what he would do? Oh yeah, 659 00:38:10,880 --> 00:38:12,560 Speaker 1: every time. That's how he would catch somebody, and they 660 00:38:12,600 --> 00:38:14,960 Speaker 1: would be like, like they didn't have an answer 661 00:38:15,000 --> 00:38:16,880 Speaker 1: for that, and it totally caught them off guard, and 662 00:38:17,280 --> 00:38:18,839 Speaker 1: then he beat them to death with the hammer at 663 00:38:18,840 --> 00:38:23,000 Speaker 1: the end of every episode. Robert Blake, right? No, 664 00:38:23,080 --> 00:38:25,760 Speaker 1: Columbo was Peter Falk. Peter Falk. I always get those guys confused. 665 00:38:26,120 --> 00:38:29,799 Speaker 1: Robert Blake really did go to jail for murder. Yeah, 666 00:38:29,960 --> 00:38:32,080 Speaker 1: that's because he really killed his wife. He was Baretta. 667 00:38:33,000 --> 00:38:36,160 Speaker 1: That's right. I always get them confused. But not with the 668 00:38:36,239 --> 00:38:42,440 Speaker 1: Rockford Files. Oh no, that's Jim. Yeah, it's great. Um, 669 00:38:42,640 --> 00:38:44,880 Speaker 1: all right, so I didn't look at any of the 670 00:38:44,920 --> 00:38:47,879 Speaker 1: famous lies in history. They weren't.
They were pretty much 671 00:38:48,800 --> 00:38:53,160 Speaker 1: whatever, standard. So we talked about the kids study. Um, 672 00:38:54,480 --> 00:39:00,239 Speaker 1: I found an interesting link from the University of Southern California, huh, 673 00:39:00,680 --> 00:39:04,120 Speaker 1: where they found what they believe is proof that, uh, 674 00:39:04,560 --> 00:39:10,520 Speaker 1: the brain structure of a pathological liar is different. They're 675 00:39:10,520 --> 00:39:13,160 Speaker 1: actually wired differently than the rest of us. Oh yeah, 676 00:39:13,200 --> 00:39:15,800 Speaker 1: I could totally see that, because the brain is subject 677 00:39:15,840 --> 00:39:20,000 Speaker 1: to plasticity, and so structurally it's different. Right. Well, that's 678 00:39:20,040 --> 00:39:23,160 Speaker 1: what they say. They took subjects, volunteers, 679 00:39:23,239 --> 00:39:26,600 Speaker 1: and then interviewed them with psychological tests and placed them 680 00:39:26,600 --> 00:39:30,080 Speaker 1: in different categories of, like, are you a repeated liar? 681 00:39:30,239 --> 00:39:34,440 Speaker 1: Are you an antisocial and pathological liar? Or are you normal? 682 00:39:35,560 --> 00:39:38,080 Speaker 1: I guess a normal liar. And then they hooked them 683 00:39:38,120 --> 00:39:41,240 Speaker 1: up to the old MRI, the wonder machine, 684 00:39:41,280 --> 00:39:44,120 Speaker 1: and they found that liars had significantly more white matter 685 00:39:44,680 --> 00:39:50,359 Speaker 1: and less gray matter, which they believe equates to, um, 686 00:39:50,680 --> 00:39:56,600 Speaker 1: liars are quicker thinkers. Basically, white matter's the stuff that 687 00:39:56,680 --> 00:40:00,680 Speaker 1: transmits the electrical impulse. They're more equipped to lie, physically, 688 00:40:00,760 --> 00:40:05,560 Speaker 1: their brains are. I'm like one big continuous 689 00:40:05,840 --> 00:40:08,759 Speaker 1: lump of gray matter.
Yeah, I don't think I have 690 00:40:08,800 --> 00:40:11,680 Speaker 1: any white matter whatsoever. There's like a little, there's a 691 00:40:11,800 --> 00:40:14,399 Speaker 1: donkey on like a ferry that goes across the mighty 692 00:40:14,480 --> 00:40:20,279 Speaker 1: Mississippi. That's my neural transmission, with Huck Finn. Yeah. Um, 693 00:40:20,719 --> 00:40:24,560 Speaker 1: but Huck Finn is like really fat and kind of dumb. 694 00:40:26,760 --> 00:40:31,560 Speaker 1: There you go. Right? Now I can, Chuck. Yeah. Uh, 695 00:40:31,640 --> 00:40:34,759 Speaker 1: and then this other thing we've touched on, the thinking cap, 696 00:40:35,520 --> 00:40:41,680 Speaker 1: transcranial magnetic stimulation. Yeah, and what was that again? TMS. Oh, 697 00:40:41,920 --> 00:40:43,799 Speaker 1: it's the, it was the thinking cap. We did an 698 00:40:43,840 --> 00:40:47,600 Speaker 1: episode on this too, where there's a cap, I guess, that 699 00:40:47,680 --> 00:40:52,560 Speaker 1: uses magnetic pulses that can target very specific parts 700 00:40:52,600 --> 00:40:55,440 Speaker 1: of the brain without affecting others. Laying a magnet that 701 00:40:55,560 --> 00:40:58,000 Speaker 1: can pulse right over a specific part of your brain 702 00:40:58,120 --> 00:41:01,319 Speaker 1: and going woom woomp. And then high-frequency stuff makes 703 00:41:01,320 --> 00:41:04,400 Speaker 1: you twitch and stuff like that. Low frequency 704 00:41:04,480 --> 00:41:07,160 Speaker 1: can give you a stutter, depending on the region of 705 00:41:07,200 --> 00:41:10,120 Speaker 1: the brain. It's basically just messing with your brain and 706 00:41:10,280 --> 00:41:13,080 Speaker 1: the neural firing. And it can make you more creative too. 707 00:41:13,160 --> 00:41:15,040 Speaker 1: Wasn't that one of them?
Yeah, that's what they found, 708 00:41:15,200 --> 00:41:18,560 Speaker 1: more creative, and, um, people could pick out, like, prime 709 00:41:18,680 --> 00:41:21,960 Speaker 1: numbers out of a huge block of numbers when they 710 00:41:22,000 --> 00:41:24,480 Speaker 1: couldn't do that right before. They could draw a horse really 711 00:41:24,520 --> 00:41:27,640 Speaker 1: well all of a sudden. So apparently, um, these 712 00:41:27,719 --> 00:41:32,320 Speaker 1: volunteers were hooked up to the TMS thinking cap machine, 713 00:41:32,520 --> 00:41:36,680 Speaker 1: let's call it, and volunteered to have their dorsolateral 714 00:41:37,160 --> 00:41:42,040 Speaker 1: prefrontal cortex stimulated, which is, um, complex thought and deception 715 00:41:42,160 --> 00:41:46,200 Speaker 1: decision making. And it has two sides, 716 00:41:46,520 --> 00:41:49,240 Speaker 1: like every other part of the brain. And they found 717 00:41:49,280 --> 00:41:54,279 Speaker 1: that people who had the left side stimulated lied more often, uh, 718 00:41:54,400 --> 00:41:57,360 Speaker 1: and people on the right side were more likely to 719 00:41:57,400 --> 00:41:59,960 Speaker 1: tell the truth. And they asked them, like, obvious questions, 720 00:42:00,040 --> 00:42:02,440 Speaker 1: like, what color is this piece of paper? Lie 721 00:42:02,480 --> 00:42:06,600 Speaker 1: to me or don't lie to me. And basically it's 722 00:42:06,640 --> 00:42:08,319 Speaker 1: early in the going here. I think this is from 723 00:42:08,320 --> 00:42:11,080 Speaker 1: two thousand eleven, but they think this could lead to, 724 00:42:11,239 --> 00:42:16,040 Speaker 1: possibly, one day, instead of people taking a lie detector test, 725 00:42:16,080 --> 00:42:19,720 Speaker 1: taking a lie prevention test, like hooking them up and basically, 726 00:42:19,800 --> 00:42:22,600 Speaker 1: you cannot lie to me while you're getting pulsed like this.
727 00:42:23,120 --> 00:42:26,360 Speaker 1: That is crazy. So did you kill your wife? That 728 00:42:26,600 --> 00:42:31,160 Speaker 1: is very crazy. Did I? No. I wasn't asking you, 729 00:42:32,560 --> 00:42:36,720 Speaker 1: although I haven't seen her in a while. Okay, sorry. This morning, 730 00:42:37,560 --> 00:42:41,200 Speaker 1: I'll have her send you an email. Sure. Well, actually, 731 00:42:41,239 --> 00:42:43,680 Speaker 1: it's funny you brought that up, because, um, this last 732 00:42:43,719 --> 00:42:49,479 Speaker 1: little study, um, found that people are only likely 733 00:42:49,560 --> 00:42:52,720 Speaker 1: to lie in email fourteen percent of the time, compared 734 00:42:52,760 --> 00:42:55,880 Speaker 1: to thirty-seven percent on the phone, twenty-seven percent in 735 00:42:56,040 --> 00:43:02,439 Speaker 1: person, and twenty-one percent via text. And I found that interesting. I find 736 00:43:02,480 --> 00:43:07,520 Speaker 1: it interesting too. The phone makes sense because it's verbal communication, 737 00:43:07,880 --> 00:43:12,759 Speaker 1: so you're more frightened. There's more intimidation, which I think 738 00:43:12,840 --> 00:43:14,880 Speaker 1: probably is one of the things that leads to lying 739 00:43:14,920 --> 00:43:19,799 Speaker 1: among people more frequently than anything else, being intimidated face 740 00:43:19,840 --> 00:43:22,840 Speaker 1: to face, which probably helps. Exactly. So the phone is 741 00:43:22,920 --> 00:43:29,800 Speaker 1: the most, um, lie-laden form of communication, but email 742 00:43:29,920 --> 00:43:32,880 Speaker 1: is the least lie-laden, and I think it's because 743 00:43:32,920 --> 00:43:35,720 Speaker 1: you don't have to vocally express it. Plus the internet 744 00:43:35,800 --> 00:43:40,239 Speaker 1: makes us all very brassy. Uh, well, plus with email, 745 00:43:40,400 --> 00:43:45,680 Speaker 1: also, there's like documented evidence, the paper trail, or an 746 00:43:45,719 --> 00:43:49,279 Speaker 1: electronic paper trail.
Yeah, that's a good point too. Like, 747 00:43:49,320 --> 00:43:51,319 Speaker 1: I wouldn't lie in an email, because somebody could 748 00:43:51,320 --> 00:43:53,160 Speaker 1: take that email later on and bust me on it. 749 00:43:53,760 --> 00:43:56,759 Speaker 1: That's true. I would only do that fourteen percent of 750 00:43:56,800 --> 00:43:59,920 Speaker 1: the time, evidently. Right. Well, I record all of our 751 00:44:00,120 --> 00:44:04,040 Speaker 1: conversations, so I've got it all documented, 752 00:44:05,760 --> 00:44:11,359 Speaker 1: all of it. Oh boy. I guess that's it. That's 753 00:44:11,360 --> 00:44:13,160 Speaker 1: all I got. That was a lot. That was a 754 00:44:13,239 --> 00:44:17,040 Speaker 1: lot on lying. We covered the philosophical aspects, you mentioned 755 00:44:17,120 --> 00:44:21,520 Speaker 1: the brain, we pooh-poohed psychology. Um, we talked about 756 00:44:21,719 --> 00:44:24,200 Speaker 1: the Big five, the Big six, the other Big five. 757 00:44:24,360 --> 00:44:27,040 Speaker 1: We covered everything in it. Yeah. And let us finish 758 00:44:27,080 --> 00:44:30,319 Speaker 1: by saying, to kids out there: although it may seem 759 00:44:30,360 --> 00:44:32,920 Speaker 1: like a good idea at the time, lying is not 760 00:44:33,040 --> 00:44:35,439 Speaker 1: a good idea in the long run. In your life, it 761 00:44:35,560 --> 00:44:38,319 Speaker 1: may just make things worse. In fact, it will very 762 00:44:38,400 --> 00:44:41,520 Speaker 1: likely make things much worse on you. Agree to 763 00:44:41,560 --> 00:44:43,520 Speaker 1: tell the truth? Agreed. It's a good habit to get 764 00:44:43,560 --> 00:44:45,920 Speaker 1: into as you grow. Oh yeah, those are the people 765 00:44:45,960 --> 00:44:49,320 Speaker 1: who really kind of become the best, agreed, later on.
766 00:44:49,560 --> 00:44:54,280 Speaker 1: Not necessarily the richest, although, it's not like Richard Branson 767 00:44:54,480 --> 00:44:56,759 Speaker 1: never told a lie in his life, probably, but there 768 00:44:56,800 --> 00:44:59,919 Speaker 1: are riches beyond the dollar. Exactly. That's what I'm driving 769 00:45:00,040 --> 00:45:02,640 Speaker 1: at. Being honest is one, and the true trust of 770 00:45:02,680 --> 00:45:07,680 Speaker 1: another person, there isn't much wealthier than that. Agreed. Uh, if 771 00:45:07,760 --> 00:45:11,040 Speaker 1: you want to learn more about lying, you can look it 772 00:45:11,160 --> 00:45:14,279 Speaker 1: up by typing L-Y-I-N-G in the 773 00:45:14,360 --> 00:45:16,879 Speaker 1: search bar at how stuff works dot com. I said 774 00:45:16,920 --> 00:45:20,200 Speaker 1: search bar in there, no lie. This means it's time 775 00:45:20,239 --> 00:45:26,680 Speaker 1: for a listener mail. Whoa, Nelly, whoa. We got a 776 00:45:26,719 --> 00:45:29,600 Speaker 1: couple of quick announcements to make before listener mail. The 777 00:45:29,680 --> 00:45:32,880 Speaker 1: first is, if you've listened to our Halloween episodes 778 00:45:32,960 --> 00:45:36,120 Speaker 1: the past two years, we do readings that are royalty 779 00:45:36,200 --> 00:45:39,359 Speaker 1: free because they're old, right, and they're good, from Poe 780 00:45:39,560 --> 00:45:42,320 Speaker 1: and Lovecraft and the like. This year, we want to 781 00:45:42,400 --> 00:45:46,040 Speaker 1: read one of your horror stories. Absolutely true, um, and 782 00:45:46,239 --> 00:45:49,920 Speaker 1: we're going to do that through this long, complicated process 783 00:45:50,000 --> 00:45:53,040 Speaker 1: by which, starting, uh, well, a little while ago, 784 00:45:53,160 --> 00:45:58,239 Speaker 1: June eighth, and running until July.
You can submit your own, 785 00:45:58,960 --> 00:46:03,520 Speaker 1: um, horror fiction that has not been published anywhere else and 786 00:46:03,640 --> 00:46:06,759 Speaker 1: that is between three thousand and four thousand words. That's right. 787 00:46:06,880 --> 00:46:09,240 Speaker 1: You can send it in an email to how Stuff 788 00:46:09,280 --> 00:46:14,120 Speaker 1: Works Underscore Contests at Discovery dot com. Right, yeah, and 789 00:46:14,200 --> 00:46:17,480 Speaker 1: do yourself a favor. Uh, go read the rules, because 790 00:46:17,880 --> 00:46:19,440 Speaker 1: you don't want to take the time to do this 791 00:46:19,600 --> 00:46:21,960 Speaker 1: and then be disqualified. No, there's a blog post on 792 00:46:22,040 --> 00:46:24,759 Speaker 1: the blogs at how Stuff Works, and it's titled something 793 00:46:24,920 --> 00:46:28,200 Speaker 1: like Stuff You Should Know's Horror Fiction Contest colon 794 00:46:28,560 --> 00:46:31,120 Speaker 1: Get Your Official Rules Here, something like that, and it's 795 00:46:31,160 --> 00:46:33,320 Speaker 1: got all the rules. It has like a pithy introduction 796 00:46:33,400 --> 00:46:36,359 Speaker 1: to the rules that we came up with, and then, um, 797 00:46:37,000 --> 00:46:40,759 Speaker 1: so the key here, though, is no matter what, um, 798 00:46:41,239 --> 00:46:44,920 Speaker 1: you have to, in the email, write the words, by 799 00:46:45,160 --> 00:46:49,360 Speaker 1: entering into the contest, I agree to abide by the 800 00:46:49,440 --> 00:46:53,160 Speaker 1: contest rules. Any email that has a submission that 801 00:46:53,200 --> 00:46:55,960 Speaker 1: doesn't have those words in it is automatically disqualified. Yeah, 802 00:46:55,960 --> 00:46:57,640 Speaker 1: and we don't want that, because if you worked on this, 803 00:46:58,360 --> 00:47:00,640 Speaker 1: you know, we want you to be able to win. Yeah.
Um, 804 00:47:00,760 --> 00:47:03,279 Speaker 1: it's only open to residents of the US eighteen or 805 00:47:03,320 --> 00:47:06,360 Speaker 1: older, eighteen as of June eighteenth. That's right. But anyway, 806 00:47:06,360 --> 00:47:07,920 Speaker 1: you send them to us, we're gonna read all of them, 807 00:47:07,960 --> 00:47:10,560 Speaker 1: we're gonna judge them. Uh, and then we're gonna 808 00:47:10,600 --> 00:47:13,160 Speaker 1: pick the top sixteen, enter them into a bracket, and it's 809 00:47:13,200 --> 00:47:16,640 Speaker 1: gonna be like Thunderdome until everybody votes on 810 00:47:16,760 --> 00:47:18,600 Speaker 1: their favorite, and that favorite one is the one we'll 811 00:47:18,600 --> 00:47:21,760 Speaker 1: read for our Halloween episode. Pretty cool. Great idea. Josh's idea. 812 00:47:21,880 --> 00:47:25,520 Speaker 1: I think he's already regretting having thought of it, because, 813 00:47:25,560 --> 00:47:29,400 Speaker 1: boy, are contests complicated. They really, really are. Yeah. But 814 00:47:29,560 --> 00:47:31,520 Speaker 1: I don't regret it. No, I'm, I'm very anxious. And, 815 00:47:32,719 --> 00:47:36,239 Speaker 1: um, so, Comic-Con? What? Yeah, we're gonna be 816 00:47:36,280 --> 00:47:38,719 Speaker 1: at Comic-Con in San Diego. You're going to Comic-Con? 817 00:47:39,000 --> 00:47:41,360 Speaker 1: You are too, pal. Oh, that's right, I know. I 818 00:47:41,440 --> 00:47:44,279 Speaker 1: booked my ticket. Oh, you booked your own. I had 819 00:47:44,320 --> 00:47:47,960 Speaker 1: people book mine for me. Well, well, um, so, uh, 820 00:47:48,080 --> 00:47:50,560 Speaker 1: let's see, we're gonna be at Comic-Con on Thursday, 821 00:47:50,719 --> 00:47:54,279 Speaker 1: July twelfth, and we're going to do a live podcast 822 00:47:54,560 --> 00:47:57,560 Speaker 1: at a panel, right, yeah, right there at the convention center. Uh, 823 00:47:57,960 --> 00:48:01,120 Speaker 1: we do not know what time exactly yet.
Uh, well, 824 00:48:01,160 --> 00:48:03,160 Speaker 1: we'll announce that on Facebook and the like, and Twitter. 825 00:48:03,200 --> 00:48:06,160 Speaker 1: And there's supposed to possibly be a special guest. Maybe 826 00:48:06,560 --> 00:48:11,359 Speaker 1: there's gonna be trapezes, monkeys, circus peanuts, the whole thing. Yeah, 827 00:48:11,360 --> 00:48:13,800 Speaker 1: and we can't announce the guest yet because we haven't 828 00:48:13,840 --> 00:48:16,720 Speaker 1: locked it down, but hopefully they are in the fringe 829 00:48:16,800 --> 00:48:22,319 Speaker 1: of society. All right, so Comic-Con, horror fiction contest. It's 830 00:48:22,440 --> 00:48:25,520 Speaker 1: time now for listener mail. Uh, Josh, I'm gonna 831 00:48:25,560 --> 00:48:30,560 Speaker 1: call this SYSK, uh, Teaching America's Utes. Okay. Um, 832 00:48:30,920 --> 00:48:35,400 Speaker 1: hi, guys and Jerry. I've written before. You even mentioned 833 00:48:35,440 --> 00:48:38,600 Speaker 1: my podcast, Our List, on your show once. Remember, we 834 00:48:38,680 --> 00:48:41,120 Speaker 1: were the doppelgangers, because we sent you T-shirts that 835 00:48:41,280 --> 00:48:43,680 Speaker 1: you originally thought were yours. Sure. And it was us. 836 00:48:43,719 --> 00:48:45,800 Speaker 1: I remember those guys. Anyway, I wanted to write and 837 00:48:45,840 --> 00:48:48,440 Speaker 1: thank you. Besides being a podcaster, I'm a teacher. My 838 00:48:48,520 --> 00:48:51,919 Speaker 1: fourth graders were going to be dissecting earthworms, and to help 839 00:48:52,280 --> 00:48:55,600 Speaker 1: prepare them for their first-ever dissection, I gave them 840 00:48:55,680 --> 00:49:01,200 Speaker 1: a homework assignment of listening to the earthworm podcast. Great idea. Um, 841 00:49:01,840 --> 00:49:04,680 Speaker 1: many had never listened to a podcast before, and I 842 00:49:04,840 --> 00:49:07,280 Speaker 1: was wondering how they would react.
And the overall response 843 00:49:07,400 --> 00:49:10,440 Speaker 1: was great, except for the three that ended up with seizures 844 00:49:10,480 --> 00:49:13,640 Speaker 1: for some weird reason. They found it funny that you 845 00:49:13,719 --> 00:49:18,120 Speaker 1: discussed species pronunciation, which turns out I was wrong. Remember 846 00:49:18,160 --> 00:49:21,319 Speaker 1: I said I can't say species? Apparently you can say either. 847 00:49:22,239 --> 00:49:25,520 Speaker 1: Uh, no, I said species. You said it's species or 848 00:49:25,680 --> 00:49:29,680 Speaker 1: species or species or species, and I said, no, it's 849 00:49:29,719 --> 00:49:33,520 Speaker 1: just species. So you were wrong in that. I just, 850 00:49:33,719 --> 00:49:35,480 Speaker 1: just wanted to make sure. I don't really remember it, 851 00:49:35,560 --> 00:49:40,600 Speaker 1: but that sounds right, though. Uh, they found the 852 00:49:40,640 --> 00:49:43,799 Speaker 1: facts that you shared amazing. I gave them a questionnaire 853 00:49:44,239 --> 00:49:46,359 Speaker 1: to fill out, and Mr. Zach spelled the word 854 00:49:46,440 --> 00:49:50,640 Speaker 1: question with a Z, as they listened in. Almost everyone completed their homework, 855 00:49:50,719 --> 00:49:53,279 Speaker 1: which is no small feat. So thank you for giving 856 00:49:53,280 --> 00:49:55,840 Speaker 1: me another way to reinforce my lessons. You may have 857 00:49:55,960 --> 00:49:58,120 Speaker 1: some new fans in exchange. And that is Mr. Zach 858 00:49:58,520 --> 00:50:02,520 Speaker 1: at East Cooper Montessori Charter in Mount Pleasant, South Carolina. 859 00:50:03,120 --> 00:50:08,359 Speaker 1: Go Coops. Uh, Coops? Go barrel makers. Yes, that's 860 00:50:08,440 --> 00:50:12,200 Speaker 1: what coopers do. Oh, was that a cooper? Sure, 861 00:50:12,400 --> 00:50:14,800 Speaker 1: I think so, man. I hope it is. Good. I 862 00:50:14,880 --> 00:50:18,120 Speaker 1: hope it is. Um, thanks, Zach.
It's good to hear 863 00:50:18,200 --> 00:50:21,480 Speaker 1: from you guys again. Uh, and, uh, where can they 864 00:50:21,520 --> 00:50:25,440 Speaker 1: find their podcast, Chuck? It's called Our List. On iTunes, 865 00:50:25,560 --> 00:50:30,440 Speaker 1: probably? Yeah, sure. Okay. Um, the usual, old Google. Cool. Uh, 866 00:50:30,520 --> 00:50:34,040 Speaker 1: if we have helped shape children's lives, man, we are 867 00:50:34,280 --> 00:50:37,920 Speaker 1: crazy for that. We always love hearing about that kind 868 00:50:37,960 --> 00:50:40,640 Speaker 1: of stuff. Um, so we want to hear it. Send 869 00:50:40,680 --> 00:50:42,759 Speaker 1: it to us. You can tweet to us at S 870 00:50:42,880 --> 00:50:45,640 Speaker 1: Y S K Podcast. You can join us on Facebook 871 00:50:46,000 --> 00:50:48,319 Speaker 1: at Stuff You Should Know, Facebook dot com slash stuff 872 00:50:48,320 --> 00:50:50,160 Speaker 1: you should know, um, and you can send us an 873 00:50:50,200 --> 00:51:00,680 Speaker 1: email at stuff podcast at Discovery dot com. For more 874 00:51:00,760 --> 00:51:03,360 Speaker 1: on this and thousands of other topics, visit how stuff 875 00:51:03,360 --> 00:51:06,960 Speaker 1: works dot com.