1 00:00:10,119 --> 00:00:14,520 Speaker 1: Hi, guys, it's Andrea with a bonus episode this season 2 00:00:14,560 --> 00:00:18,280 Speaker 1: on Betrayal. We're telling the story of Caroline Brega. After 3 00:00:18,360 --> 00:00:21,840 Speaker 1: two decades of marriage, she discovered that her entire life 4 00:00:21,880 --> 00:00:26,000 Speaker 1: was a mirage. Her husband, Joel, an honorable cop, was 5 00:00:26,040 --> 00:00:29,840 Speaker 1: anything but. For years, he'd been spending his time on 6 00:00:29,880 --> 00:00:33,080 Speaker 1: the clock having sex in his police car. On top 7 00:00:33,120 --> 00:00:37,839 Speaker 1: of that, he'd had dozens of affairs. For Caroline, this 8 00:00:37,960 --> 00:00:41,280 Speaker 1: betrayal was not just about what Joel did. It was 9 00:00:41,320 --> 00:00:43,879 Speaker 1: about the lengths he went to to cover it all up. 10 00:00:44,440 --> 00:00:47,199 Speaker 2: Our marriage has just been lie after lie after 11 00:00:47,040 --> 00:00:51,720 Speaker 1: lie. Day after day, Joel deceived her. He lied about 12 00:00:51,720 --> 00:00:54,880 Speaker 1: where he was, who he was with, and what he 13 00:00:54,960 --> 00:00:57,760 Speaker 1: was really up to all those long nights on duty. 14 00:00:58,400 --> 00:01:02,120 Speaker 1: And even during his investigation by the Colorado Springs Police Department, 15 00:01:02,600 --> 00:01:06,560 Speaker 1: when he signed a document guaranteeing honesty, he continued to 16 00:01:06,640 --> 00:01:07,560 Speaker 1: hide the truth. 17 00:01:07,800 --> 00:01:08,640 Speaker 3: To me, this is the 18 00:01:08,600 --> 00:01:11,560 Speaker 4: most disturbing piece of the entire case. The fact that 19 00:01:11,640 --> 00:01:13,640 Speaker 4: you... the fact that you're willing to put this 20 00:01:13,680 --> 00:01:17,520 Speaker 4: on a third person is absolutely horrific and constitutes a 21 00:01:17,600 --> 00:01:20,120 Speaker 4: violation of your oath of office. 
22 00:01:20,160 --> 00:01:23,680 Speaker 1: While reporting on Caroline's story, our team has been fascinated 23 00:01:23,760 --> 00:01:27,640 Speaker 1: by the idea of liars, people who refuse to be 24 00:01:27,720 --> 00:01:30,600 Speaker 1: honest even when their back is up against the wall. 25 00:01:31,800 --> 00:01:35,319 Speaker 1: We wanted to understand why people lie and how someone 26 00:01:35,400 --> 00:01:39,280 Speaker 1: like Joel could have kept lying for so long. So 27 00:01:40,560 --> 00:01:44,160 Speaker 1: we tracked down two of the world's leading experts in deception. 28 00:01:49,040 --> 00:01:51,800 Speaker 2: I'm Drew Curtis, and my name's Chris Hart. 29 00:01:52,480 --> 00:01:56,720 Speaker 1: They're both psychology researchers and professors. Together they wrote a 30 00:01:56,720 --> 00:02:01,040 Speaker 1: book called Big Liars: What Psychological Science Tells Us About 31 00:02:01,080 --> 00:02:04,840 Speaker 1: Lying and How You Can Avoid Being Duped. They've spent 32 00:02:04,960 --> 00:02:09,000 Speaker 1: years studying pathological lying, so I asked them to define 33 00:02:09,000 --> 00:02:09,400 Speaker 1: it for me. 34 00:02:10,400 --> 00:02:12,959 Speaker 3: Most people are honest most of the time, but it's 35 00:02:12,960 --> 00:02:16,639 Speaker 3: a small percentage of the population who tell excessive amounts 36 00:02:16,639 --> 00:02:20,200 Speaker 3: of lies. So there's these groups of prolific or big 37 00:02:20,280 --> 00:02:24,400 Speaker 3: liars who tell lots of lies, and those lies don't 38 00:02:24,400 --> 00:02:27,320 Speaker 3: always put them at some disadvantage. And then there's a 39 00:02:27,400 --> 00:02:30,600 Speaker 3: smaller subset of individuals who we would say are pathological liars, 40 00:02:30,639 --> 00:02:35,320 Speaker 3: where their lies do disadvantage them, typically in their relationships, 41 00:02:35,760 --> 00:02:37,680 Speaker 3: causing them distress and so forth. 
42 00:02:38,919 --> 00:02:43,800 Speaker 1: You guys say in your book Big Liars, that lying, 43 00:02:43,840 --> 00:02:47,519 Speaker 1: at its core, is the attempt to persuade. Can you 44 00:02:47,560 --> 00:02:49,000 Speaker 1: tell us a little bit more about what you mean 45 00:02:49,040 --> 00:02:49,359 Speaker 1: by that? 46 00:02:50,200 --> 00:02:54,320 Speaker 2: Oftentimes our goals and ambitions are in alignment with other people's, 47 00:02:54,400 --> 00:02:57,680 Speaker 2: but there's always a certain degree to which that's not true, 48 00:02:58,040 --> 00:03:01,880 Speaker 2: and so we're always navigating that tension between satisfying our 49 00:03:01,919 --> 00:03:05,640 Speaker 2: own goals and trying to match someone else's goals. But 50 00:03:05,639 --> 00:03:09,919 Speaker 2: I think ultimately we all find ourselves bending the truth 51 00:03:10,080 --> 00:03:13,320 Speaker 2: and sometimes outright lying when we feel like that's our 52 00:03:13,360 --> 00:03:17,360 Speaker 2: best option for persuading other people to essentially do what 53 00:03:17,400 --> 00:03:17,800 Speaker 2: we want. 54 00:03:18,520 --> 00:03:20,480 Speaker 1: People are coming to the show because in some ways 55 00:03:20,520 --> 00:03:24,320 Speaker 1: they relate to either Caroline's story or Ashley's or Stacey's 56 00:03:24,360 --> 00:03:27,200 Speaker 1: story from past seasons. In a lot of the cases, 57 00:03:27,200 --> 00:03:31,320 Speaker 1: they were with someone that deceived them for their own gain. 58 00:03:32,040 --> 00:03:35,160 Speaker 1: What kind of resources could we give to anybody who's 59 00:03:35,160 --> 00:03:39,680 Speaker 1: trying to help someone they care about who's a liar? Where 60 00:03:39,680 --> 00:03:42,960 Speaker 1: do you start? Where do you go to help advocate 61 00:03:43,000 --> 00:03:45,240 Speaker 1: for them to get help? Is there actually a path 62 00:03:45,320 --> 00:03:47,040 Speaker 1: forward for these individuals? 
63 00:03:47,600 --> 00:03:49,800 Speaker 3: What you're saying makes me think of two pieces to this, 64 00:03:49,880 --> 00:03:54,560 Speaker 3: and one is how do we overcome deception within our 65 00:03:54,600 --> 00:03:59,160 Speaker 3: relationships, or betrayals that are coupled with deception. One of 66 00:03:59,160 --> 00:04:02,680 Speaker 3: the challenges with deception is that it really damages trust, 67 00:04:03,160 --> 00:04:05,720 Speaker 3: and so the restoration of trust is kind of 68 00:04:05,720 --> 00:04:08,200 Speaker 3: at the seat of this. But you're right, there's not 69 00:04:08,280 --> 00:04:11,480 Speaker 3: a lot of help. And to make this clear, pathological 70 00:04:11,560 --> 00:04:15,400 Speaker 3: lying is not currently recognized as a formal diagnostic entity 71 00:04:15,440 --> 00:04:16,760 Speaker 3: in the DSM. 72 00:04:17,240 --> 00:04:20,120 Speaker 1: For those unfamiliar with the term, the DSM is a 73 00:04:20,160 --> 00:04:24,560 Speaker 1: manual for mental health professionals. It lays out diagnoses recognized 74 00:04:24,600 --> 00:04:28,360 Speaker 1: by the medical establishment, and Doctor Curtis is saying that 75 00:04:28,360 --> 00:04:32,839 Speaker 1: pathological lying is not something clinicians can formally diagnose. 76 00:04:32,680 --> 00:04:34,800 Speaker 3: And so that leaves a lot of people helpless, you know, 77 00:04:35,040 --> 00:04:38,440 Speaker 3: who might reach out to me or Chris or experts saying, hey, 78 00:04:38,440 --> 00:04:39,039 Speaker 3: can you help me? 79 00:04:40,200 --> 00:04:43,520 Speaker 1: Why do you think that this isn't a formal diagnosis 80 00:04:43,520 --> 00:04:44,640 Speaker 1: in the DSM? 81 00:04:45,080 --> 00:04:48,800 Speaker 3: It's surprising to me, because some of the most prolific 82 00:04:48,880 --> 00:04:54,279 Speaker 3: writers in psychiatry and psychology identified pathological lying, and it 83 00:04:54,279 --> 00:04:57,280 Speaker 3: comes with different names. 
And that's one of our hypotheses, 84 00:04:57,320 --> 00:04:59,599 Speaker 3: that maybe it was too fragmented. We called it 85 00:04:59,640 --> 00:05:03,279 Speaker 3: all these different things, and maybe it didn't cohesively come together. 86 00:05:03,760 --> 00:05:05,359 Speaker 3: The other part of this is, a lot of the 87 00:05:05,400 --> 00:05:09,160 Speaker 3: research on pathological lying and the case studies were from the late 88 00:05:09,200 --> 00:05:13,680 Speaker 3: eighteen hundreds, early nineteen hundreds, but after about nineteen fifteen, 89 00:05:14,760 --> 00:05:17,080 Speaker 3: there's really not a lot of writing on it until 90 00:05:17,360 --> 00:05:21,119 Speaker 3: maybe the nineteen eighties. So as the DSM was really 91 00:05:21,160 --> 00:05:24,720 Speaker 3: being developed in the fifties, you know, it doesn't necessarily 92 00:05:24,760 --> 00:05:27,160 Speaker 3: make its way in there. But I'm hopeful. I've been 93 00:05:27,160 --> 00:05:30,640 Speaker 3: working with some colleagues, psychiatrists from Yale and Columbia, and 94 00:05:30,680 --> 00:05:32,880 Speaker 3: we're working actively to get it recognized. 95 00:05:33,640 --> 00:05:37,880 Speaker 1: How would saying definitively, this is a diagnosis, help the 96 00:05:37,920 --> 00:05:41,799 Speaker 1: individual or help other people? Like, why would that be important? 97 00:05:43,000 --> 00:05:45,320 Speaker 3: One of the most important reasons is just a standard 98 00:05:45,400 --> 00:05:48,960 Speaker 3: label by which we can communicate as professionals but also 99 00:05:49,000 --> 00:05:52,160 Speaker 3: communicate with patients. You know, so you think of any 100 00:05:52,240 --> 00:05:56,040 Speaker 3: kind of disorder, like major depressive disorder. When we say that, 101 00:05:56,120 --> 00:05:59,919 Speaker 3: all clinical professionals understand the cluster of symptoms that come 102 00:06:00,200 --> 00:06:03,080 Speaker 3: with that. 
But then also people who receive that diagnosis, 103 00:06:03,440 --> 00:06:06,680 Speaker 3: they can associate that label with the symptoms they already feel. 104 00:06:07,160 --> 00:06:09,600 Speaker 3: So it gives a standard language for people to communicate. 105 00:06:09,640 --> 00:06:11,880 Speaker 3: That's kind of the very basic aspect of it. 106 00:06:13,040 --> 00:06:17,839 Speaker 3: More pragmatically, looking at, like, insurance reimbursement: insurance is 107 00:06:17,880 --> 00:06:20,960 Speaker 3: not going to reimburse treatment of something, because what are 108 00:06:21,000 --> 00:06:23,920 Speaker 3: you treating if you're not treating anything that actually exists 109 00:06:23,960 --> 00:06:28,360 Speaker 3: or that's formally recognized? Another pragmatic concern: we did 110 00:06:28,360 --> 00:06:33,479 Speaker 3: a study looking at psychotherapists, and the majority of psychotherapists 111 00:06:33,480 --> 00:06:35,920 Speaker 3: indicated they had worked with someone who they considered to 112 00:06:35,960 --> 00:06:39,400 Speaker 3: be a pathological liar, but in the absence of this label, 113 00:06:39,480 --> 00:06:42,840 Speaker 3: they end up giving another diagnosis. And so when you 114 00:06:42,880 --> 00:06:47,279 Speaker 3: do that, you're somewhat misdiagnosing and then maybe even arguably 115 00:06:47,480 --> 00:06:51,120 Speaker 3: ineffectively offering a treatment. And that's the last piece of 116 00:06:51,120 --> 00:06:56,440 Speaker 3: this too: if you can identify a formal diagnosis, 117 00:06:56,839 --> 00:06:59,200 Speaker 3: then you can set forth research to look at what 118 00:06:59,279 --> 00:07:00,880 Speaker 3: is the most effective treatment for this. 119 00:07:22,720 --> 00:07:25,360 Speaker 1: Where Caroline is left today is that she's kind of 120 00:07:25,400 --> 00:07:28,800 Speaker 1: living with two different realities. 
There was her perspective of 121 00:07:28,840 --> 00:07:30,880 Speaker 1: what her life was, what her family looked like, 122 00:07:30,960 --> 00:07:32,640 Speaker 1: what she thought her family looked like, and on 123 00:07:32,680 --> 00:07:35,800 Speaker 1: the other track, there's the life that Joel was living 124 00:07:35,840 --> 00:07:38,880 Speaker 1: behind the scenes, and she now has to kind of 125 00:07:38,920 --> 00:07:43,320 Speaker 1: integrate those two realities, because she has to look back 126 00:07:43,360 --> 00:07:47,720 Speaker 1: on major memories and wonder what was real, what wasn't real. 127 00:07:48,640 --> 00:07:51,880 Speaker 1: And so when I look at someone like Caroline, or 128 00:07:51,920 --> 00:07:54,720 Speaker 1: if I'm Caroline, I don't even know where to start 129 00:07:54,760 --> 00:07:58,200 Speaker 1: on rebuilding trust or understanding the world in which I live. 130 00:07:58,880 --> 00:08:03,320 Speaker 1: That's why I find this topic fascinating, because he lied 131 00:08:03,320 --> 00:08:04,320 Speaker 1: to her for twenty years. 132 00:08:05,160 --> 00:08:08,480 Speaker 2: Our research shows that most people are really good at lying. 133 00:08:08,520 --> 00:08:11,320 Speaker 2: It's a pretty easy thing for most humans to pull off. 134 00:08:12,000 --> 00:08:15,400 Speaker 2: And I think we go through the world trusting everyone 135 00:08:15,520 --> 00:08:17,720 Speaker 2: as being honest with us, and especially those people who 136 00:08:17,800 --> 00:08:21,320 Speaker 2: are close with us. But it's important to remember that 137 00:08:21,680 --> 00:08:24,520 Speaker 2: they're probably not being fully honest with us all the time, 138 00:08:24,600 --> 00:08:27,240 Speaker 2: even the people who are the very closest people in 139 00:08:27,280 --> 00:08:30,440 Speaker 2: our lives. 
If we catch someone close to us telling 140 00:08:30,520 --> 00:08:33,360 Speaker 2: us a rather minor lie, it has the same effect 141 00:08:33,520 --> 00:08:36,360 Speaker 2: as these bigger lies that we're talking about in this case, 142 00:08:36,440 --> 00:08:39,080 Speaker 2: where we start to question, well, if they lie about this, 143 00:08:39,120 --> 00:08:40,480 Speaker 2: what else are they lying about? 144 00:08:41,120 --> 00:08:43,760 Speaker 3: It's a natural proclivity, I believe, to go back and 145 00:08:43,800 --> 00:08:46,960 Speaker 3: start investigating. And one of the pieces of advice 146 00:08:47,000 --> 00:08:50,520 Speaker 3: I'd give, too, is to not necessarily let that cloud 147 00:08:50,600 --> 00:08:54,200 Speaker 3: or overshadow places where you did have good experiences. 148 00:08:54,760 --> 00:08:56,319 Speaker 1: But it's easier said than done. 149 00:08:57,080 --> 00:09:00,439 Speaker 3: Sure. I think another part of that is really commitment 150 00:09:00,559 --> 00:09:04,080 Speaker 3: to where do you want to be now and where 151 00:09:04,080 --> 00:09:06,280 Speaker 3: do you want to go forward. And I imagine, for anyone 152 00:09:06,280 --> 00:09:08,760 Speaker 3: who's been lied to for a very long time, that 153 00:09:09,520 --> 00:09:14,040 Speaker 3: going back, you know, is going to impact trust 154 00:09:14,120 --> 00:09:17,280 Speaker 3: in other relationships, or at least, you know... The analogy 155 00:09:17,320 --> 00:09:20,319 Speaker 3: I use is walls. 
You know, when you've lowered your 156 00:09:20,360 --> 00:09:24,160 Speaker 3: wall and you've been vulnerable and you've gotten crushed, the 157 00:09:24,240 --> 00:09:26,480 Speaker 3: walls are going to come up, probably higher than before, 158 00:09:26,679 --> 00:09:28,520 Speaker 3: and you're probably going to have a hard time letting 159 00:09:28,559 --> 00:09:31,480 Speaker 3: people in, because you've seen what people can do to 160 00:09:31,559 --> 00:09:34,480 Speaker 3: you, and you're developing these new beliefs that if I 161 00:09:34,559 --> 00:09:37,080 Speaker 3: let people in, they will crush me, they will lie 162 00:09:37,120 --> 00:09:39,920 Speaker 3: to me, they will take advantage of me. And those thoughts, 163 00:09:40,120 --> 00:09:42,800 Speaker 3: those are hard to guard against, right? But you are 164 00:09:43,440 --> 00:09:45,880 Speaker 3: making decisions about what it is you want to do, 165 00:09:46,400 --> 00:09:48,320 Speaker 3: and maybe you do want to keep the walls up. 166 00:09:48,720 --> 00:09:50,760 Speaker 3: But there's a consequence of that too, and it's not 167 00:09:50,840 --> 00:09:53,400 Speaker 3: letting people in who may not do that to 168 00:09:53,360 --> 00:09:57,760 Speaker 1: you. Right. I mean, I imagine your brain is helping 169 00:09:57,800 --> 00:10:00,920 Speaker 1: you create that story for a sense of safety, because 170 00:10:00,960 --> 00:10:03,960 Speaker 1: your world has just kind of been taken away from you, 171 00:10:04,080 --> 00:10:06,520 Speaker 1: or your perception of what your life was like has 172 00:10:06,559 --> 00:10:09,480 Speaker 1: been taken away. As much as you want to beat 173 00:10:09,520 --> 00:10:12,480 Speaker 1: yourself up, people who lie all the time are very 174 00:10:12,520 --> 00:10:13,720 Speaker 1: good at it, you know. 
175 00:10:14,480 --> 00:10:16,679 Speaker 2: We do see that people who are really practiced at 176 00:10:16,720 --> 00:10:18,640 Speaker 2: lying get good at it. And one of the things 177 00:10:18,640 --> 00:10:23,120 Speaker 2: we see is, for people that lie prolifically, they have 178 00:10:23,240 --> 00:10:27,120 Speaker 2: this diminished fear response when they're lying. So probably if 179 00:10:27,120 --> 00:10:29,280 Speaker 2: any of us were lying, we'd be really nervous about 180 00:10:29,280 --> 00:10:32,120 Speaker 2: being caught, you know, for a lot of reasons, 181 00:10:32,120 --> 00:10:35,599 Speaker 2: like it would destroy our reputations and cause ruptures in 182 00:10:35,640 --> 00:10:38,360 Speaker 2: our relationships. But people who lie a lot and 183 00:10:38,400 --> 00:10:42,840 Speaker 2: do it every day, that fear response subsides, and so 184 00:10:43,559 --> 00:10:46,679 Speaker 2: they can lie and their emotional reactions are going to 185 00:10:46,760 --> 00:10:48,840 Speaker 2: be about the same as if they're telling you what 186 00:10:48,880 --> 00:10:51,560 Speaker 2: they had for dinner last night. There's just not much there. 187 00:10:52,200 --> 00:10:54,160 Speaker 3: And the other part you mentioned is blame, you know. 188 00:10:54,280 --> 00:10:57,760 Speaker 3: You can beat yourself up, like you said: what did 189 00:10:57,800 --> 00:10:58,640 Speaker 3: I not see? 190 00:10:58,760 --> 00:10:58,880 Speaker 2: Right? 191 00:10:58,960 --> 00:11:01,960 Speaker 3: Hindsight's twenty-twenty. How did I not see all these things? 192 00:11:01,960 --> 00:11:05,280 Speaker 3: And maybe you see them much clearer now. You know, 193 00:11:05,400 --> 00:11:09,200 Speaker 3: most of us, you know, don't want to catch those 194 00:11:09,240 --> 00:11:11,120 Speaker 3: awful things. We don't want to be confronted with that, 195 00:11:11,200 --> 00:11:13,640 Speaker 3: even if it's true. 
And so I think, you know, 196 00:11:14,360 --> 00:11:17,720 Speaker 3: that aspect too is helping someone deal with beating themselves 197 00:11:17,760 --> 00:11:21,000 Speaker 3: up for not being a super lie detector. But there is 198 00:11:21,040 --> 00:11:25,680 Speaker 3: an initial impulse to not necessarily want to know that 199 00:11:25,760 --> 00:11:28,880 Speaker 3: the person's lying, because of what that brings about, or the 200 00:11:28,920 --> 00:11:31,200 Speaker 3: consequences of what they were lying about. 201 00:11:31,679 --> 00:11:34,880 Speaker 2: Yeah, and especially within the context of, you know, romantic 202 00:11:34,960 --> 00:11:39,040 Speaker 2: relationships and marriage: if I'm going to call my 203 00:11:39,520 --> 00:11:43,200 Speaker 2: spouse out for lying, does that mean we have to 204 00:11:43,240 --> 00:11:47,120 Speaker 2: split up? And it gets really complicated and scary really quickly. 205 00:11:47,600 --> 00:11:51,120 Speaker 2: And it's just so much easier and less frightening to 206 00:11:51,320 --> 00:11:54,720 Speaker 2: just turn a blind eye to that thing that's giving 207 00:11:54,840 --> 00:11:55,840 Speaker 2: rise to suspicion. 208 00:11:56,880 --> 00:12:01,600 Speaker 1: Can people who are pathological liars change? Is there a 209 00:12:01,720 --> 00:12:04,760 Speaker 1: path for them to move about life in a more 210 00:12:04,800 --> 00:12:06,640 Speaker 1: honest way if they want to work on it? 211 00:12:07,280 --> 00:12:11,200 Speaker 3: I think people always have the opportunity to change, and 212 00:12:11,320 --> 00:12:13,839 Speaker 3: change is kind of the business we're in, and one 213 00:12:13,880 --> 00:12:17,319 Speaker 3: of those ways is really cognitive behavioral therapy. 
You know, it's aspects 214 00:12:17,400 --> 00:12:21,000 Speaker 3: like modeling honesty even when it's hard, so trying to 215 00:12:21,080 --> 00:12:24,600 Speaker 3: encourage people to be honest even when it's hard, really 216 00:12:24,600 --> 00:12:28,839 Speaker 3: having those tough conversations, showing that you're willing to have 217 00:12:28,920 --> 00:12:30,400 Speaker 3: tough conversations with people. 218 00:12:31,360 --> 00:12:33,120 Speaker 2: Yeah, I think a lot of it is just the 219 00:12:33,120 --> 00:12:38,120 Speaker 2: intention to change. Lying is really a social strategy that 220 00:12:38,200 --> 00:12:43,520 Speaker 2: people adopt and cultivate and reinforce over decades and decades. 221 00:12:43,520 --> 00:12:46,800 Speaker 2: And it's just like any behavioral pattern, whether it's, you know, 222 00:12:47,120 --> 00:12:51,920 Speaker 2: alcohol consumption, smoking, using sarcasm, anything that you've been doing 223 00:12:51,920 --> 00:12:54,040 Speaker 2: for decades. It's hard just to flip the switch and 224 00:12:54,080 --> 00:12:57,360 Speaker 2: turn it off. But the key and the first step, 225 00:12:57,360 --> 00:13:00,439 Speaker 2: and Drew and I both hear from these people periodically, 226 00:13:00,640 --> 00:13:04,280 Speaker 2: is people decide they finally want to change. They finally 227 00:13:04,840 --> 00:13:07,400 Speaker 2: hit some point in their lives where they realize that 228 00:13:07,480 --> 00:13:12,079 Speaker 2: their patterns of lying are causing such upheaval and turmoil 229 00:13:12,160 --> 00:13:15,640 Speaker 2: that they really have a strong desire to change. I 230 00:13:15,679 --> 00:13:18,360 Speaker 2: think we can all become more honest than we are 231 00:13:18,440 --> 00:13:20,840 Speaker 2: right now, but we have to make that a goal, 232 00:13:20,880 --> 00:13:23,199 Speaker 2: we have to make it a priority. 
And if we just 233 00:13:23,240 --> 00:13:25,680 Speaker 2: take one moment every day and think, how can I 234 00:13:25,760 --> 00:13:28,760 Speaker 2: be more honest about this situation with someone who I 235 00:13:28,800 --> 00:13:31,800 Speaker 2: care about that I'm interacting with, we can move that needle. 236 00:13:32,320 --> 00:13:35,520 Speaker 2: And each day, as we practice that habit, we start 237 00:13:35,559 --> 00:13:37,840 Speaker 2: to see some change, and the change might be gradual. 238 00:13:38,280 --> 00:13:40,840 Speaker 2: But I assume if everyone made an intention to be 239 00:13:40,960 --> 00:13:43,439 Speaker 2: more honest every day, if they looked at themselves a 240 00:13:43,520 --> 00:13:46,600 Speaker 2: year from now, they'd find they've made some considerable progress. 241 00:13:49,720 --> 00:13:51,720 Speaker 1: If you want to hear more of this conversation and 242 00:13:51,800 --> 00:13:55,280 Speaker 1: see it in video, check out our brand new Substack. 243 00:13:56,040 --> 00:14:01,280 Speaker 1: Just head to Betrayal dot substack, that's s-u-b-s-t-a-c-k, or 244 00:14:01,360 --> 00:14:04,360 Speaker 1: just go to substack dot com, search Beyond Betrayal, and 245 00:14:04,480 --> 00:14:09,200 Speaker 1: hit subscribe. You can find Curtis and Hart's book Big Liars on 246 00:14:09,240 --> 00:14:13,520 Speaker 1: the American Psychological Association website, Amazon, or Barnes and Noble. 247 00:14:19,680 --> 00:14:22,360 Speaker 1: Thank you for listening to Betrayal season four. If you 248 00:14:22,360 --> 00:14:24,640 Speaker 1: would like to reach out to the Betrayal team, email 249 00:14:24,720 --> 00:14:28,960 Speaker 1: us at Betrayalpod at gmail dot com. That's Betrayal Pod 250 00:14:29,600 --> 00:14:32,920 Speaker 1: at gmail dot com. 
Also, please be sure to follow 251 00:14:33,000 --> 00:14:37,240 Speaker 1: us on Instagram at Betrayal Pod and me, Andrea Gunning, 252 00:14:37,240 --> 00:14:40,240 Speaker 1: for all Betrayal content, news, and updates. One way to 253 00:14:40,280 --> 00:14:42,640 Speaker 1: support the series is by subscribing to our show on 254 00:14:42,680 --> 00:14:47,120 Speaker 1: Apple Podcasts. Please rate and review Betrayal. Five-star reviews 255 00:14:47,200 --> 00:14:51,080 Speaker 1: help us know you appreciate what we do. Betrayal is 256 00:14:51,080 --> 00:14:54,800 Speaker 1: a production of Glass Podcasts, a division of Glass Entertainment Group, 257 00:14:54,840 --> 00:14:59,040 Speaker 1: in partnership with iHeart Podcasts. The show is executive produced 258 00:14:59,040 --> 00:15:02,680 Speaker 1: by Nancy Glass and Jennifer Fasin. Betrayal is hosted and 259 00:15:02,720 --> 00:15:06,600 Speaker 1: produced by me, Andrea Gunning, written and produced by Caitlin Golden, 260 00:15:07,480 --> 00:15:11,360 Speaker 1: also produced by Carrie Hartman and Ben Fetterman. Our associate 261 00:15:11,360 --> 00:15:15,360 Speaker 1: producer is Kristin Melcurie. Our iHeart team is Ali Perry 262 00:15:15,400 --> 00:15:20,240 Speaker 1: and Jessica Krincheck. Story editing by Monique Leboard, audio editing 263 00:15:20,320 --> 00:15:24,680 Speaker 1: and mixing by Matt Delvecchio, editing by Tanner Robbins, and 264 00:15:24,720 --> 00:15:28,120 Speaker 1: special thanks to Caroline and her family. Betrayal's theme is 265 00:15:28,120 --> 00:15:32,880 Speaker 1: composed by Oliver Baines. Music library provided by my Music. 266 00:15:33,360 --> 00:15:36,120 Speaker 1: For more podcasts from iHeart, visit the iHeartRadio app, 267 00:15:36,320 --> 00:15:39,080 Speaker 1: Apple Podcasts, or wherever you get your podcasts.