Speaker 1: Pushkin. Welcome to the second of our two-part series focused on the twenty twenty-five World Happiness Report. In the last episode, we talked about the decline of shared meals, but in this episode we'll turn to a chapter that focuses on a different decline. In it, the French economists Yann Algan and Claudia Senik look at the changes in social trust, particularly the decreases in trust that so many of us have observed post-pandemic. The economists find that there's been a huge reduction in social trust globally, and they argue that these changes in trust have led not just to a big hit on our collective happiness, but also to increases in what they call anti-system thinking, with people rejecting traditional political parties and turning to populism. I wanted to understand how to make sense of these findings and the implications they're having for well-being worldwide. So to help me out, I decided to turn to one of my favorite experts on the science of trust.

Speaker 2: Hi everyone, my name is Rachel Botsman. I've been studying trust, well, for over fifteen years now.
The latest book is called How to Trust and Be Trusted, intentionally a two-way title, because that's how trust works: we trust other people, and then we want others to trust us as well. And you can get it on Audible or Spotify or Pushkin.fm. People tell me that it's really changed the way they trust others and others trust them, and that was the reason for making it.

Speaker 1: And so, as a trust expert, I'm curious about your reaction to the fact that there's an entire chapter of the World Happiness Report devoted to trust. I mean, is that something you think is long overdue? Is that something you were surprised by?

Speaker 2: I'm surprised it's taken so long to make that connection, if I'm honest, because, I mean, I've always struggled with the word happiness, but satisfaction and joy are very much tied to social trust. So not just the people we have in our lives, but how much we can trust ourselves to take risks and explore new things, how much we can take risks in new relationships, how much confidence we can place in systems and society.
So yeah, there's a very strong correlation there. So it's not a surprise, but a surprise it took so long.

Speaker 1: I mean, when you read the report, was there anything in particular that you found especially striking?

Speaker 2: I found it very alarming. Looking at the... it's not even a decline, it, like, falls off a cliff, interpersonal trust. So that's the trust in families, friends, coworkers, people close to us. The marker is twenty twenty, so you think, hmm, that's the pandemic. But there's no recovery from it. I could understand if it was social trust, like trust in strangers and other people, but the fact that there's no repair in that close circle... And for me, this ties into what I think is a huge societal problem that's not getting enough attention. And it's not just loneliness. It's that people are spending more time alone and at home than ever before. Yes, we worry about loneliness as an epidemic, but I fear the rise of the anti-social society: our ability to be with other people, even people close to us, and to want to go out and connect with people. That was what I found the most alarming in the report.
Speaker 1: Let's just start by defining trust. You've had a kind of curious definition of it, one that I haven't seen: this idea of a confident relationship with the unknown. What do you mean there?

Speaker 2: Yeah, so a confident relationship with the unknown describes the need for trust, the reason it exists. So if you know the outcome of something, or if you know how something's going to turn out, or there's very little risk in a situation, you don't actually need a lot of trust. It's in those situations where there is a really high unknown, or there is a lot of uncertainty, that you need the most trust. And that's why there's such a strong tie between trust and uncertainty, which is the flip of how many people think about trust. So I've asked many people to define trust, and the answers you get back are typically around stability and expectations, things like reliability, which I find really interesting: that we're so wired to that sort of solid side of the spectrum, and that's what we attach trust to, when really trust is needed in those unknown situations.
Speaker 1: It's also needed across all kinds of different contexts, and I think this is something you've so nicely pointed out in your work. Give me an example of the different domains in which we see trust and where trust seems to matter.

Speaker 2: One of the things that really frustrates me is when you hear these very generalized ways of talking about trust. Oh, trust is in a state of total decline. That's the media headline, and it's really not helpful. It's not helpful for our own state, it's not helpful for society or any system. Because I think of trust in different circles. So I'll give you a couple, because they might ring true. The first is sort of more academic. You can think of institutional trust. That's the trust you place in institutions: the legal system, healthcare, education, government, whatever it might be. That's trust in institutions, trust in an entity. Then you have what we call interpersonal trust. This is the trust that really impacts us day in, day out. It's our family, it's our friends, it's our close circle, our coworkers, and those bonds are really, really important.
They're the ones I'm actually most worried about. And then, more broadly, we have social trust, and that is the trust that we can place in strangers, the people we don't know: our belief in things like integrity and moral good. So those are the sort of three academic ways of framing trust. Another way of framing trust is to think of trust in yourself, trust in others, and then the trust others place in you. So that's more of a concentric-circles approach, and trust issues can arise in any of those three circles. Some people find it very difficult to trust themselves, but can trust others easily. Others find that people naturally trust them for some reason, but they don't necessarily trust other people. So I find those circles really interesting to think about.

Speaker 1: They seem to be so important for happiness, but also really dynamic, right? You know, if you don't trust yourself, then what's that going to lead other people to think of you? And things like that. I mean, it must be really complicated to kind of get at some of these dynamics.
Speaker 2: And constantly evolving. So trust is not like this fixed asset. I don't like it when people talk about banking trust, you know, like it's a reservoir. It changes with age, it changes with experience, it changes with environment, and most importantly, it changes with context. Trust is so, so contextual, and this is the part that we often miss.

Speaker 1: And so it seems like trust in all these different dimensions is super important for our happiness. But it seems like there are also two problems that we could have with trust, right? One is the idea of being too trusting, and then, you know, you kind of get let down by the people around you, maybe even by yourself.

Speaker 2: Right.

Speaker 1: The second is this idea of being not trusting enough, right? Kind of not realizing that other people actually have your back and the uncertainty is not as scary as you think.
And so I wanted to go through each of these in turn, maybe starting with this idea of being too trusting, partly because my understanding is that this is one of the reasons you got interested in trust in the first place. Share with me the story of what happened in your childhood, where folks in your world were maybe too trusting.

Speaker 2: Yeah. I don't think people often think of it as a problem, but it is. And the interesting thing is there's actually a high correlation between being very trusting and considering yourself emotionally intelligent and intuitive. And the reason why is that you are told that you are very good at reading people, that you can pick up on signals, so you're someone that says, oh, I have very strong intuition about people. And that's something I felt from a really young age. And the story I tell in the book... I guess everyone has an origin story, and mine is around a nanny who turned out to be a drug dealer, who used our family's car, a Volvo no less, as a getaway car in an armed robbery.

Speaker 1: Oh my god.
Yeah, it's, like, a very true-crime trust origin story.

Speaker 2: The thing that's crazy about the story is that she lived with us for a year. It wasn't like she moved in and she did this in eight weeks. And she was this incredible nanny. Like, my memories of her were that she played with us, she was very attentive, she was a really good cook, she was very peaceful. But she had a complete other life that my parents discovered over time.

Speaker 1: So I'm sure that, looking back, there must have been some red flags about this nanny that would make you question the extent to which she should have been trusted. As you look back, what are some red flags in this situation that you've kind of generalized to other situations in which we might wonder whether we should be trusting somebody?
Speaker 2: I think there were red flags I noticed as a child. But now that my parents have told the story, they realize that when they hired her, they were both entrepreneurs in a very intense time of building their companies and traveling a lot. So making trust decisions when you're under pressure, high-stakes trust decisions like who to leave your children with, often leads to bad decision making, because you want to believe that person. So that's the first lesson.

Speaker 1: And that one, I have to say, is so important, right? I feel like there are so many situations in which there's just the convenience of being able to trust somebody. You're like, I can't even question this right now, because I just have to make this work, so I'm just going to assume everything's going to be fine. You could totally see how that plays out in so many different domains.

Speaker 2: Most hires at work, right? Like, it's just left too late, and convenience, no pun intended, often trumps trust. So we'll often give our trust away if it is convenient. That is a real life lesson.
The second thing is, I think, the way... when I asked them, because there was no email, there was no video calling, there were no social networks at the time, you know, why did you believe this person? And they said it was things like: she had a Scottish accent, and she even came to work wearing a Salvation Army uniform, because she said she really liked helping people, and she played the piano and the tambourine. So it's all these stereotypes of who is trustworthy. Accents are a really big influence on that. So that's been a life lesson in, when I'm meeting someone, what signals am I tuning into? And so many signals really play to our biases, so we will look for people that are familiar or fit that kind of stereotype. So that's number two. Number three is: listen to your children. Because I knew they were being lied to, and I noticed things going missing around the house. I noticed on Wednesdays we'd go to this strange flat and there was this strange man, and there was something that just didn't add up.
And I would say to Mom, these people come round and they talk about strange things, and I would get told off for making things up. So trust children, because they are really observant, and they often don't have an agenda in the same way that adults do. So I'd listen to that feedback more.

Speaker 1: It also seems like you just need to perspective-take a little bit, like get a real outside perspective, a bit of a distanced perspective. Sometimes that can be your child, but I bet sometimes that can just be another person, who might not be in the tight situation you are and might not have noticed the same things that you're noticing about how cool this person is in this Salvation Army, you know, uniform they're wearing, and so on. It seems like we often get into trouble when we trust our gut and don't take outside input when it comes to trust. Does the research bear that out?
Speaker 2: Totally. I mean, they say trust has two enemies: bad character and poor information. And it's when we either don't slow down to get enough information or, as you say, Laurie, we don't get a different perspective. And I'm sure, with one of Mom's friends, like, when she said that she just found all this money under a tree in a park, like the money tree, someone would have said, I don't think those exist.

Speaker 1: That sounds sketchy. It sounds a little sketchy. Yeah.

Speaker 2: Yeah.

Speaker 1: So let's walk through the things that really maybe are good indicators of trust, like these sort of so-called traits of trustworthiness, as you talk about them. Can you break these down for us?

Speaker 2: Yes. So, and I'm going to say, I think some of them need updating. This is based on social science that has now been tracking these traits for forty years, and the traits are changing, which I think is really interesting. So imagine there are two parts.
You have capability, which is really about what you do, and then you have character, which is why you do things, but, really importantly, how you do things. So how would you describe, Laurie, what you do?

Speaker 1: I would say that I'm a podcaster, I'm a teacher. You know, I try to be really there for my students. I try to use evidence in a really capable way. Like, I have lots of things that make me capable, but also maybe I'm, like, a warm person, right? I want to take care of my students and help my listeners and so on.

Speaker 2: So there you're talking about capability and character. The capability, if you imagine, is your competence. So you have the skills, you have the expertise, you have the knowledge, you have the resources, you have Yale as an institution. You have all these things that allow you to do what you say you're going to do. So you're really credible on those things. But maybe there are some other things. I don't know, what's something you completely can't do?

Speaker 1: I'm terrible at... oh my god, so many things. Driving. I can't drive.
Biking. Biking, I'm really bad; I don't know how to bike. Most physical things. I'm very clumsy, I fall a lot. Skiing, not great.

Speaker 2: Yeah, that actually makes you more trustworthy, because you can be honest about things that you can't do, and then you're really comfortable with saying, don't ever get in a car with me, or don't ride with me. So that's your competence. What you're talking about with your students, wanting to be there for them consistently: I'd imagine that you're this person that likes them to know that they can depend on you. That's the reliability trait, and that is really, really important when it comes to trust. So you know those, like, really inconsistent people that are high energy, and they show up sometimes, and then they completely disappear? There's no follow-through. So that's it on the capability side. And then on the character side, we have empathy, which you spoke about. I prefer the word compassion; you know, empathy doesn't really speak to the action side, the follow-through. And then the last trait, which I think is the most important trait, is integrity.
And that's all about your interests being aligned with the best interests of other people. So you, the professor: your interests are aligned with the interests of the students. You, the host: the audience feels that this isn't self-serving, that you are there to be generous and to care about them and their learning. So that's what we talk about when we talk about trustworthiness.

Speaker 1: And I imagine that in these different domains of trust, the importance of, say, capability, this kind of combination of competence and reliability, versus character, this combination of compassion and integrity, might go up and down depending on what you need, right? Like, I might not need a surgeon filled with a lot of character; I just really want him to be very competent and capable. But a best friend, right? Really, you know, I don't necessarily care that my best friend is good at her job, but I really want her to be really empathic when it comes to, you know, helping me with my problems and so on.

Speaker 2: It really comes back to context, and it's such an important point, because... I don't know if you've heard.
Actually, my dad said it the other day. He's having some work done on his hip, and he said, I really don't feel like the surgeon cares. I was like, yeah, but he's a great surgeon, right? Like, he's going to fix the hip. And I think that has become an expectation in sort of a feelings-led society: that sometimes we can place too much emphasis on compassion and empathy, and that person can seem incredibly kind but not capable. So this is why I think this alchemy of traits, and thinking about the particular situation, is almost like a compass for making really good decisions about people.

Speaker 1: And so if you're a person who's maybe meeting a new surgeon for the first time, or a new business partner, or a new love interest, and you really want to kind of make sure you're trusting appropriately, what are some strategies you'd use to do that well?

Speaker 2: I think the business situation is probably the easiest, and it's also where most people go wrong, because most people start with the competence piece.
If you think about most job interviews or promotion interviews: so tell me what you've done, tell me about your experience. Like, those are the easiest things to get from a resume or a reference check, and so many interviews don't get to the why and the how. The how is really interesting: like, how people approach things, how they break down problems, how they are in difficult conversations. So I would focus questions around that. The second thing that I would do is really try to understand someone's interests, intentions, and motives. Not, like, why do you want the job, but where are they really coming from? And, again, asking yourself this question of: does that align with the role, does that align with the organization? Because it's when you have that misalignment, in a professional or a personal context, that trust issues can arise. You can even think, I'm a bit uncomfortable saying this, but, like, of dating situations: if someone really doesn't want children and the other person wants children, right, there's a misalignment there.
If someone 344 00:17:00,676 --> 00:17:03,196 Speaker 2: doesn't want a long term relationship, if someone doesn't want 345 00:17:03,236 --> 00:17:05,956 Speaker 2: a monogamous relationship, someone doesn't want to live with you, 346 00:17:06,196 --> 00:17:09,956 Speaker 2: it's that misalignment that really is the problem. Now, you're 347 00:17:09,956 --> 00:17:11,396 Speaker 2: not going to get there on the first date, you'll 348 00:17:11,396 --> 00:17:15,156 Speaker 2: probably scare someone away, but over time, if you feel 349 00:17:15,196 --> 00:17:20,236 Speaker 2: like there are trust issues emerging, there's probably some unsaid conversation 350 00:17:20,556 --> 00:17:23,036 Speaker 2: around I want this one thing and you want this 351 00:17:23,076 --> 00:17:26,236 Speaker 2: other thing, and we're both too scared to say it. 352 00:17:26,716 --> 00:17:29,076 Speaker 1: And then how do we overcome the kind of biases 353 00:17:29,116 --> 00:17:31,196 Speaker 1: that we talked about earlier? Is there anything we can 354 00:17:31,236 --> 00:17:33,596 Speaker 1: do to kind of get that perspective, maybe not fall 355 00:17:33,636 --> 00:17:35,796 Speaker 1: for familiarity and some of the other biases we 356 00:17:35,876 --> 00:17:36,516 Speaker 1: talked about? 357 00:17:36,636 --> 00:17:39,036 Speaker 2: It's so hard. I mean, if someone can come up 358 00:17:39,036 --> 00:17:41,276 Speaker 2: with a solution around that, then tell me. But I 359 00:17:41,276 --> 00:17:44,316 Speaker 2: think, I mean, it's really obvious advice. It's becoming 360 00:17:44,396 --> 00:17:47,996 Speaker 2: aware of what those biases are for you. What does 361 00:17:48,036 --> 00:17:51,316 Speaker 2: familiarity look like? What are the signals? And that's different 362 00:17:51,356 --> 00:17:54,676 Speaker 2: for different people. So some people are very influenced by 363 00:17:54,676 --> 00:18:00,596 Speaker 2: looks and appearance.
Other people are really influenced by cultural 364 00:18:00,636 --> 00:18:05,076 Speaker 2: background and accents. Other people are really influenced by education. 365 00:18:05,596 --> 00:18:07,356 Speaker 2: A way of sort of tuning into this is when you 366 00:18:07,436 --> 00:18:10,516 Speaker 2: meet people for the first time, where do you sort 367 00:18:10,556 --> 00:18:14,076 Speaker 2: of focus? Do you notice what someone's wearing, do you 368 00:18:14,156 --> 00:18:17,796 Speaker 2: notice what they're saying? Where does the conversation orientate itself 369 00:18:17,836 --> 00:18:21,316 Speaker 2: as well? Like, these are really powerful signals as to 370 00:18:21,356 --> 00:18:24,876 Speaker 2: what is important to you. That may be where your biases 371 00:18:24,916 --> 00:18:25,476 Speaker 2: are rooted. 372 00:18:26,196 --> 00:18:28,636 Speaker 1: You've also suggested doing something that we talk about a 373 00:18:28,716 --> 00:18:31,516 Speaker 1: lot on the Happiness Lab, which is like, take a pause, 374 00:18:31,636 --> 00:18:34,076 Speaker 1: take a breath. How can taking what you've called the 375 00:18:34,116 --> 00:18:35,596 Speaker 1: trust pause be helpful here? 376 00:18:35,916 --> 00:18:38,796 Speaker 2: So a trust pause is something I invented for myself. 377 00:18:40,956 --> 00:18:42,396 Speaker 1: All research is me-search, right? 378 00:18:44,836 --> 00:18:48,116 Speaker 2: I tend to move quickly, do things quickly, think quickly, 379 00:18:48,596 --> 00:18:52,636 Speaker 2: speak quickly, and I just realized there were certain situations 380 00:18:52,716 --> 00:18:57,356 Speaker 2: where slowing down and really asking myself, you know, was 381 00:18:57,396 --> 00:19:02,636 Speaker 2: this person, this piece of information, this situation, this partnership, 382 00:19:02,796 --> 00:19:06,436 Speaker 2: did they actually deserve my trust?
And it's really placing 383 00:19:06,556 --> 00:19:09,036 Speaker 2: value on your trust, that you have trust to give 384 00:19:09,836 --> 00:19:13,636 Speaker 2: and you don't have to give it to everyone. I 385 00:19:13,876 --> 00:19:18,436 Speaker 2: found that to be quite empowering because in certain situations 386 00:19:18,476 --> 00:19:21,676 Speaker 2: you go, you know what, I'm gonna just hold back 387 00:19:21,716 --> 00:19:24,596 Speaker 2: a little bit. And it's not like a self-protection mechanism. 388 00:19:24,716 --> 00:19:27,356 Speaker 2: It's just saying I don't think I want to give 389 00:19:27,396 --> 00:19:30,396 Speaker 2: you my trust at this particular moment. And you can 390 00:19:30,436 --> 00:19:34,476 Speaker 2: think about that even in the context of online information, right, 391 00:19:34,556 --> 00:19:37,556 Speaker 2: like when you just share something without reading it, that 392 00:19:37,636 --> 00:19:39,596 Speaker 2: would benefit from a trust pause. 393 00:19:40,876 --> 00:19:43,076 Speaker 1: So far, we've talked about cases where we trust a 394 00:19:43,076 --> 00:19:45,276 Speaker 1: little bit too much or a little bit too early. 395 00:19:45,556 --> 00:19:47,436 Speaker 1: But when we get back from the break, we're going 396 00:19:47,516 --> 00:19:50,316 Speaker 1: to discuss the other problem when it comes to trust and happiness, 397 00:19:50,836 --> 00:19:54,036 Speaker 1: trusting too little. The Happiness Lab will be right back. 398 00:20:02,596 --> 00:20:05,596 Speaker 1: Rachel Botsman, author of How to Trust and Be Trusted, 399 00:20:05,956 --> 00:20:08,556 Speaker 1: is an expert on the science of trust. But I 400 00:20:08,596 --> 00:20:12,436 Speaker 1: was curious how she thinks about the opposite. What is distrust?
401 00:20:12,756 --> 00:20:15,676 Speaker 2: Oh, so the first thing I'd say is there's a 402 00:20:15,716 --> 00:20:20,916 Speaker 2: difference between low trust and distrust, and that's really important 403 00:20:20,956 --> 00:20:23,796 Speaker 2: to understand. So low trust can just be you don't 404 00:20:23,796 --> 00:20:26,196 Speaker 2: have enough information, like you're new to a situation or 405 00:20:26,196 --> 00:20:29,396 Speaker 2: a relationship. It's not necessarily a bad thing. And also 406 00:20:29,796 --> 00:20:32,036 Speaker 2: this is why I hate a lot of polls and surveys. 407 00:20:32,236 --> 00:20:34,356 Speaker 2: If you try to live your whole life in a 408 00:20:34,436 --> 00:20:37,916 Speaker 2: high trust state, it'll be pretty exhausting, right. Like, there's 409 00:20:37,956 --> 00:20:40,596 Speaker 2: certain things that just don't require a high degree of trust. 410 00:20:40,636 --> 00:20:43,796 Speaker 2: So that's the first thing I would say. Distrust is 411 00:20:43,916 --> 00:20:46,956 Speaker 2: very difficult to define. I have not yet come up 412 00:20:47,476 --> 00:20:49,836 Speaker 2: with or seen a definition that I really like. But the 413 00:20:49,876 --> 00:20:53,356 Speaker 2: way I think of distrust is more through the lens 414 00:20:53,436 --> 00:20:57,236 Speaker 2: of behaviors. So I find it helpful to think of 415 00:20:57,276 --> 00:21:01,636 Speaker 2: these three D's. So when someone is distrusting, you tend 416 00:21:01,636 --> 00:21:04,876 Speaker 2: to see this spectrum play out where you see a 417 00:21:04,916 --> 00:21:10,116 Speaker 2: defensiveness set in.
And that defensiveness is because you have 418 00:21:10,396 --> 00:21:14,036 Speaker 2: made yourself vulnerable or you've placed something of value, and 419 00:21:14,716 --> 00:21:16,756 Speaker 2: in some way you feel that it's being exploited or 420 00:21:16,796 --> 00:21:19,276 Speaker 2: it's not being taken care of, and so the first 421 00:21:19,356 --> 00:21:22,276 Speaker 2: instinct is to be quite defensive about that. Now, the 422 00:21:22,316 --> 00:21:24,956 Speaker 2: thing about that stage of distrust is you can still 423 00:21:24,996 --> 00:21:29,716 Speaker 2: fix the situation because the person cares. The second phase 424 00:21:30,436 --> 00:21:34,596 Speaker 2: is disengagement. Disengagement is when you start to pull back. 425 00:21:34,796 --> 00:21:37,556 Speaker 2: So you might have experienced this at work where you're like, 426 00:21:38,116 --> 00:21:40,596 Speaker 2: I'm not really sure I trust this person or this situation or 427 00:21:40,636 --> 00:21:42,716 Speaker 2: this boss, so I'm just going to pull back. I'm 428 00:21:42,716 --> 00:21:44,956 Speaker 2: not sure I'm going to really show up or really care. 429 00:21:45,276 --> 00:21:49,236 Speaker 2: And then the last phase, which is incredibly dangerous and 430 00:21:49,276 --> 00:21:52,036 Speaker 2: I think it's how we talk about distrust in society today, 431 00:21:52,396 --> 00:21:56,996 Speaker 2: is disenchantment. And disenchantment means you have turned against that person, 432 00:21:57,036 --> 00:22:00,876 Speaker 2: that organization, that system, and you're in a downward spiral 433 00:22:00,996 --> 00:22:05,196 Speaker 2: because your only motive is to bring that thing down. 434 00:22:05,556 --> 00:22:09,236 Speaker 2: You have become anti, you are pushing against.
And the 435 00:22:09,276 --> 00:22:12,156 Speaker 2: reason why this is so dangerous is because it can 436 00:22:12,236 --> 00:22:17,516 Speaker 2: become all consuming for yourself, for others, and very very toxic. 437 00:22:17,956 --> 00:22:20,596 Speaker 2: So you see this in sports teams where someone turns 438 00:22:20,636 --> 00:22:23,276 Speaker 2: against the coach. You see it in workplaces where they 439 00:22:23,276 --> 00:22:26,156 Speaker 2: want everyone to leave. We see it wider in society, 440 00:22:26,156 --> 00:22:28,676 Speaker 2: where you want to turn against a party. That's how 441 00:22:28,716 --> 00:22:31,356 Speaker 2: I tend to think of distrust: moving through these 442 00:22:31,396 --> 00:22:32,036 Speaker 2: three phases. 443 00:22:32,116 --> 00:22:35,196 Speaker 1: And you mentioned sort of society, especially politics. It seems 444 00:22:35,236 --> 00:22:38,276 Speaker 1: like this idea of disenchantment is running rampant. At least 445 00:22:38,316 --> 00:22:40,276 Speaker 1: that's what we hear a lot from the news. But 446 00:22:40,316 --> 00:22:41,876 Speaker 1: I know this is an idea that you've pushed back 447 00:22:41,916 --> 00:22:44,436 Speaker 1: against a little bit, that the kind of 448 00:22:44,516 --> 00:22:46,996 Speaker 1: freefall of trust might not be as bad as we think. 449 00:22:47,556 --> 00:22:48,636 Speaker 1: Explain why that's the case. 450 00:22:48,756 --> 00:22:51,716 Speaker 2: Yeah, I mean, I find it really difficult to listen 451 00:22:51,716 --> 00:22:55,836 Speaker 2: to news for so many reasons. But it's often because 452 00:22:55,956 --> 00:23:01,956 Speaker 2: everything is described in free fall, including trust.
So you'll 453 00:23:01,996 --> 00:23:05,516 Speaker 2: see these graphs where it's just like this downward line, 454 00:23:05,836 --> 00:23:08,836 Speaker 2: and from about twenty twenty it's like falling off a 455 00:23:08,836 --> 00:23:12,916 Speaker 2: cliff, and that's across the spectrum, actually: institutional, social, 456 00:23:12,916 --> 00:23:16,636 Speaker 2: and interpersonal trust. To me, that is problematic because 457 00:23:17,076 --> 00:23:19,716 Speaker 2: the way I think of trust is more like energy, 458 00:23:19,796 --> 00:23:22,836 Speaker 2: that it's not getting destroyed, it's changing form. And so 459 00:23:23,796 --> 00:23:26,756 Speaker 2: what might look like low trust or distrust to you 460 00:23:27,596 --> 00:23:31,276 Speaker 2: is just someone trusting differently. And once you see this, 461 00:23:31,476 --> 00:23:34,316 Speaker 2: it explains so many things, and it stops so much judgment. 462 00:23:34,516 --> 00:23:37,116 Speaker 2: So it doesn't even have to be between political parties. 463 00:23:37,156 --> 00:23:39,556 Speaker 2: You see it within generations where people will say, well, 464 00:23:39,636 --> 00:23:42,356 Speaker 2: that Gen Z or Gen Alpha, they just don't trust anymore. 465 00:23:42,516 --> 00:23:46,276 Speaker 2: That is not true. Their trust is sideways, like they 466 00:23:46,316 --> 00:23:49,996 Speaker 2: trust their peers and friends and influencers because that's where 467 00:23:49,996 --> 00:23:53,996 Speaker 2: they get their information. They don't trust upwards. So thinking 468 00:23:54,116 --> 00:23:58,796 Speaker 2: of flows of trust versus amounts of trust can be 469 00:23:58,876 --> 00:24:03,876 Speaker 2: really helpful in understanding dynamics in relationships and then bigger 470 00:24:03,956 --> 00:24:05,276 Speaker 2: paradigm shifts happening.
471 00:24:05,316 --> 00:24:07,276 Speaker 1: It seems like one of those bigger paradigm shifts is 472 00:24:07,356 --> 00:24:09,596 Speaker 1: kind of the trust that we have that in some sense, 473 00:24:09,636 --> 00:24:11,396 Speaker 1: for lack of a better word, is kind of distributed, right. 474 00:24:11,476 --> 00:24:14,596 Speaker 1: You mentioned kind of Gen Z and Gen Alpha trusting 475 00:24:14,636 --> 00:24:17,236 Speaker 1: influencers, trusting what they read on social media. Is that 476 00:24:17,316 --> 00:24:20,156 Speaker 1: a cultural shift that you're seeing in this trust research too? 477 00:24:20,396 --> 00:24:24,676 Speaker 2: Yes, it's huge. So what we see is, like, if 478 00:24:24,676 --> 00:24:29,436 Speaker 2: you imagine an evolution of trust. We had institutional trust, 479 00:24:29,516 --> 00:24:33,356 Speaker 2: so everything was very top down and hierarchical and centralized 480 00:24:33,356 --> 00:24:36,276 Speaker 2: and defined by fixed boundaries. So we used to get 481 00:24:36,276 --> 00:24:40,276 Speaker 2: a newspaper, we'd turn on the TV; that's institutional trust. 482 00:24:40,596 --> 00:24:45,116 Speaker 2: And technology inherently blew that up and distributed it through 483 00:24:45,196 --> 00:24:49,196 Speaker 2: networks and marketplaces and platforms and now through artificial intelligence. 484 00:24:49,636 --> 00:24:52,676 Speaker 2: So the easiest way to understand this is imagine trust 485 00:24:52,716 --> 00:24:58,156 Speaker 2: that for decades flowed upwards, and now it's distributed sideways. 486 00:24:58,836 --> 00:25:01,796 Speaker 2: And this has so much influence over so many things, 487 00:25:01,836 --> 00:25:06,356 Speaker 2: because who we believe is trustworthy, who influences our opinions 488 00:25:06,356 --> 00:25:09,756 Speaker 2: and beliefs, what we decide to act on in the 489 00:25:09,756 --> 00:25:15,796 Speaker 2: context of health, politics, education, life decisions, wellness.
This is 490 00:25:15,916 --> 00:25:20,636 Speaker 2: all moving sideways, and sometimes the people sharing that information 491 00:25:20,676 --> 00:25:23,276 Speaker 2: are not the most trustworthy. I mean, you actually 492 00:25:23,316 --> 00:25:26,196 Speaker 2: see this in the happiness space, and you know, 493 00:25:26,196 --> 00:25:29,196 Speaker 2: it really bothers me. Like when I see influencers online, 494 00:25:29,276 --> 00:25:32,676 Speaker 2: they're usually alone. Like, do you know what I mean? 495 00:25:32,756 --> 00:25:35,836 Speaker 2: Like they wake up alone, they meditate, they have their coffee, 496 00:25:35,876 --> 00:25:39,556 Speaker 2: they stretch, they run alone, right, they work alone. It just 497 00:25:39,756 --> 00:25:42,956 Speaker 2: really bothers me that that's the image of happiness. 498 00:25:43,116 --> 00:25:45,036 Speaker 1: And I think we're falling prey to a lot of 499 00:25:45,036 --> 00:25:48,116 Speaker 1: the biases that you talked about earlier about trusting too much, right. 500 00:25:48,156 --> 00:25:51,116 Speaker 1: Like many of these influencers have, you know, this beautiful 501 00:25:51,196 --> 00:25:53,236 Speaker 1: home that they go into, and they tend to be 502 00:25:53,316 --> 00:25:56,716 Speaker 1: really beautiful, right. Like we're falling for these tropes of, 503 00:25:57,156 --> 00:25:59,836 Speaker 1: like, familiarity and maybe some of the things we like, 504 00:25:59,996 --> 00:26:01,916 Speaker 1: and a halo effect, an effect where we kind 505 00:26:01,916 --> 00:26:04,156 Speaker 1: of like people who have other good things happening to them.
506 00:26:04,276 --> 00:26:06,916 Speaker 1: Like it is true that where we're putting our trust, 507 00:26:06,996 --> 00:26:09,596 Speaker 1: this distributed trust, winds up maybe falling prey to as 508 00:26:09,596 --> 00:26:11,516 Speaker 1: many of those same biases as you just talked about 509 00:26:11,516 --> 00:26:13,436 Speaker 1: in the interpersonal trust domain earlier. 510 00:26:13,716 --> 00:26:16,756 Speaker 2: Yeah, and it's in sound bites and fragments and very 511 00:26:16,796 --> 00:26:20,596 Speaker 2: carefully curated images, and I know all this stuff. And 512 00:26:20,876 --> 00:26:24,036 Speaker 2: I started running last year and I fell down this 513 00:26:24,116 --> 00:26:27,596 Speaker 2: rabbit hole so hard, and then I realized no one 514 00:26:27,636 --> 00:26:31,996 Speaker 2: looks like that when they run, right? Because I'd be like, 515 00:26:32,036 --> 00:26:34,916 Speaker 2: why am I in so much pain and sweaty and disgusting? 516 00:26:34,956 --> 00:26:36,956 Speaker 2: And they're all, like, you know, hair swaying and stuff, 517 00:26:36,996 --> 00:26:41,356 Speaker 2: and because they're making money off me. And you've just 518 00:26:41,356 --> 00:26:43,236 Speaker 2: made me think of something I hadn't thought of. But 519 00:26:43,276 --> 00:26:46,276 Speaker 2: if you think about the trust you place in influencers, 520 00:26:46,476 --> 00:26:50,476 Speaker 2: like just the scrolling behavior, it's a one way thing, right? 521 00:26:50,516 --> 00:26:53,476 Speaker 2: It doesn't flow back to you. And so much of 522 00:26:53,516 --> 00:26:58,356 Speaker 2: healthy trust is reciprocation. So I do something for you, 523 00:26:58,636 --> 00:27:01,076 Speaker 2: which then creates the space for you to do something 524 00:27:01,196 --> 00:27:05,556 Speaker 2: for me. And those loops are what form trust. These 525 00:27:05,596 --> 00:27:08,276 Speaker 2: one way forms of trust.
Let's not call them shallow trust, 526 00:27:08,276 --> 00:27:11,796 Speaker 2: but these one-way forms of trust, they're breaking those loops, 527 00:27:11,796 --> 00:27:15,516 Speaker 2: those possibilities for reciprocation, and that's damaging the social glue, 528 00:27:15,756 --> 00:27:19,516 Speaker 2: which I'd imagine is impacting our happiness and satisfaction. I'd 529 00:27:19,556 --> 00:27:21,556 Speaker 2: be interested if the research correlates to that. 530 00:27:22,196 --> 00:27:23,956 Speaker 1: So that's kind of the over trust we put in 531 00:27:23,956 --> 00:27:26,676 Speaker 1: these distributed networks, you know, to these influencers and things 532 00:27:26,756 --> 00:27:29,916 Speaker 1: like that. And that's, as you mentioned, happening alongside a 533 00:27:29,996 --> 00:27:33,876 Speaker 1: real kind of emergence of distrust for institutions, whether that's, 534 00:27:33,996 --> 00:27:37,356 Speaker 1: you know, governments or academic institutions like the one I'm 535 00:27:37,396 --> 00:27:39,036 Speaker 1: at and so on. And this was the kind of 536 00:27:39,076 --> 00:27:41,916 Speaker 1: thing that the World Happiness Report was really looking at, right? 537 00:27:41,956 --> 00:27:44,316 Speaker 1: The chapter of the World Happiness Report that's on trust 538 00:27:44,356 --> 00:27:46,836 Speaker 1: is trying to look at this puzzle about why there's 539 00:27:46,876 --> 00:27:50,196 Speaker 1: been so many kind of voting behaviors that have shifted 540 00:27:50,476 --> 00:27:53,996 Speaker 1: kind of more anti-institutional, right, for candidates that maybe 541 00:27:54,076 --> 00:27:56,356 Speaker 1: want smaller government or kind of want to do away 542 00:27:56,396 --> 00:27:59,676 Speaker 1: with government, you know, kind of candidates that really represent 543 00:27:59,756 --> 00:28:02,076 Speaker 1: these sort of views that are kind of anti-establishment.
544 00:28:02,396 --> 00:28:03,956 Speaker 1: Walk me through some of the things that this chapter 545 00:28:04,076 --> 00:28:06,436 Speaker 1: found and how trust was really important for some of 546 00:28:06,436 --> 00:28:07,836 Speaker 1: these changes in voting behavior. 547 00:28:07,996 --> 00:28:10,836 Speaker 2: The report is really interesting. I don't agree with all 548 00:28:10,916 --> 00:28:13,276 Speaker 2: of the framing. Sort of at the highest level, it's 549 00:28:13,276 --> 00:28:17,036 Speaker 2: basically saying the far left have high trust and higher 550 00:28:17,036 --> 00:28:21,236 Speaker 2: life satisfaction and the far right have low trust and 551 00:28:21,276 --> 00:28:24,916 Speaker 2: lower life satisfaction. That's like top level, which is a 552 00:28:24,956 --> 00:28:29,316 Speaker 2: problematic framing for me. And one of the reasons why 553 00:28:29,476 --> 00:28:31,956 Speaker 2: is they're saying that when you're in a low trust state, 554 00:28:32,276 --> 00:28:35,476 Speaker 2: you tend to be anti. You invest your energy in 555 00:28:35,676 --> 00:28:38,396 Speaker 2: pushing against things, the status quo; you want to 556 00:28:38,396 --> 00:28:41,516 Speaker 2: break down the system. And when you are high trust, 557 00:28:41,756 --> 00:28:45,956 Speaker 2: you are for, so you're more progressive. Where the report 558 00:28:46,116 --> 00:28:50,596 Speaker 2: gets really interesting is when you dig further down. So 559 00:28:50,956 --> 00:28:53,636 Speaker 2: what I found interesting is that if you stick with 560 00:28:53,676 --> 00:28:56,436 Speaker 2: this far left and far right dichotomy, the far 561 00:28:56,596 --> 00:29:01,836 Speaker 2: right had higher distrust in strangers, but much higher trust 562 00:29:01,916 --> 00:29:06,996 Speaker 2: in their private circles, close knit family, friends, work colleagues, 563 00:29:07,276 --> 00:29:10,516 Speaker 2: really high bonds of social trust.
And then on the 564 00:29:10,636 --> 00:29:14,916 Speaker 2: left hand side there are actually signs of lower interpersonal trust, 565 00:29:15,316 --> 00:29:19,876 Speaker 2: higher signs of loneliness and disengagement. And that I think 566 00:29:20,036 --> 00:29:24,196 Speaker 2: is really interesting. Essentially, what the report is saying is 567 00:29:24,196 --> 00:29:27,356 Speaker 2: that the social fabric is damaged for both sides. 568 00:29:27,156 --> 00:29:30,796 Speaker 1: Yeah, and damaged in different ways. That 569 00:29:30,876 --> 00:29:33,076 Speaker 1: might lead to the fact that it's very hard to 570 00:29:33,076 --> 00:29:36,436 Speaker 1: see across the aisle, right, because people are thinking about 571 00:29:36,436 --> 00:29:39,676 Speaker 1: trust in different ways across different sides of the political spectrum, 572 00:29:39,716 --> 00:29:42,636 Speaker 1: which is maybe what's leading to a disconnect in what 573 00:29:42,676 --> 00:29:45,436 Speaker 1: people want governments to be doing over time. 574 00:29:45,516 --> 00:29:47,636 Speaker 2: Exactly. And so this idea of high trust and low trust, 575 00:29:47,716 --> 00:29:50,756 Speaker 2: that's the problem, because both sides have high trust, but 576 00:29:50,796 --> 00:29:53,876 Speaker 2: in different things. It's not that they lack trust, they 577 00:29:53,916 --> 00:29:56,436 Speaker 2: just trust differently.
And this is what we see is 578 00:29:56,476 --> 00:29:59,076 Speaker 2: that when you take trust away from one area of 579 00:29:59,076 --> 00:30:02,356 Speaker 2: our lives, it creates a vacuum, and that vacuum has 580 00:30:02,436 --> 00:30:06,076 Speaker 2: to be filled with something, and that could be different beliefs, 581 00:30:06,076 --> 00:30:08,516 Speaker 2: it could be conspiracy theories, but you have to fill 582 00:30:08,636 --> 00:30:11,476 Speaker 2: that vacuum. But I think holding on to this idea 583 00:30:11,556 --> 00:30:15,116 Speaker 2: that the social fabric is damaged, and that the root 584 00:30:15,156 --> 00:30:19,036 Speaker 2: causes driving trust issues and problems 585 00:30:19,036 --> 00:30:23,916 Speaker 2: with happiness are insecurity and loneliness on both sides. It's 586 00:30:23,956 --> 00:30:27,276 Speaker 2: those two things that are merging together to really cause 587 00:30:27,316 --> 00:30:29,356 Speaker 2: this sort of reconfiguration of trust. 588 00:30:29,756 --> 00:30:32,116 Speaker 1: So is there any hope that our tattered social fabric 589 00:30:32,156 --> 00:30:34,396 Speaker 1: can be repaired? And if so, how can we go 590 00:30:34,476 --> 00:30:36,956 Speaker 1: about creating a future where we trust each other and 591 00:30:37,036 --> 00:30:40,036 Speaker 1: our institutions? We'll look at that when the Happiness Lab 592 00:30:40,076 --> 00:30:49,956 Speaker 1: returns in a moment. In her audiobook How to Trust 593 00:30:49,956 --> 00:30:53,196 Speaker 1: and Be Trusted, Rachel Botsman has lots of great suggestions 594 00:30:53,236 --> 00:30:56,476 Speaker 1: for increasing trust in our lives. One interesting concept is 595 00:30:56,516 --> 00:30:58,076 Speaker 1: what she calls a trust leap. 596 00:30:58,436 --> 00:31:01,916 Speaker 2: So, a trust leap is whenever you take a risk 597 00:31:02,516 --> 00:31:05,516 Speaker 2: to do something new or to do something differently.
Now, 598 00:33:05,516 --> 00:33:07,796 Speaker 2: when you talk about trust leaps, people often think about 599 00:33:07,876 --> 00:33:11,756 Speaker 2: big things, getting in a self-driving car or buying bitcoin, 600 00:33:11,916 --> 00:33:14,996 Speaker 2: that involve like a new technology, and that is a 601 00:33:14,996 --> 00:33:17,796 Speaker 2: trust leap, that is how we change behavior. But trust 602 00:33:17,836 --> 00:33:22,676 Speaker 2: leaps can also be relatively small in our lives. So 603 00:33:22,996 --> 00:33:26,196 Speaker 2: it can be choosing to put your hand up at 604 00:33:26,236 --> 00:33:29,636 Speaker 2: work to do something completely differently. It can be choosing 605 00:33:29,876 --> 00:33:32,876 Speaker 2: to speak up in a meeting. And one of the 606 00:33:32,876 --> 00:33:35,756 Speaker 2: problems I've actually seen around trust leaps is that first 607 00:33:35,796 --> 00:33:38,396 Speaker 2: of all, people focus on the outcome, so they focus 608 00:33:38,436 --> 00:33:40,236 Speaker 2: on where they want the leap to go, and they 609 00:33:40,276 --> 00:33:45,596 Speaker 2: imagine these leaps being really really big versus small, consistent 610 00:33:45,716 --> 00:33:48,556 Speaker 2: ways of doing something new or doing something differently and 611 00:33:48,636 --> 00:33:51,196 Speaker 2: seeing where that takes us. So that is the concept 612 00:33:51,196 --> 00:33:52,116 Speaker 2: of trust leaps. 613 00:33:52,516 --> 00:33:54,596 Speaker 1: So I'm guessing I know the answer to this, but 614 00:33:54,636 --> 00:33:56,836 Speaker 1: what are some of the things that prevent people from 615 00:33:56,916 --> 00:33:58,916 Speaker 1: making these trust leaps? It seems like part of it 616 00:33:58,956 --> 00:34:01,396 Speaker 1: maybe is a bias to assume the trust leap is 617 00:34:01,436 --> 00:34:03,036 Speaker 1: going to go badly in some form.
618 00:32:03,396 --> 00:32:05,596 Speaker 2: It's interesting you say badly, because we often don't even 619 00:32:05,636 --> 00:32:08,756 Speaker 2: get that far. Because if you imagine, a trust leap 620 00:32:08,796 --> 00:32:13,396 Speaker 2: involves going from the known and the safe and the familiar, 621 00:32:13,756 --> 00:32:16,236 Speaker 2: and that's, as you know, Laurie, where we all love 622 00:32:16,276 --> 00:32:18,156 Speaker 2: to be, like, that's where we're wired to be, and 623 00:32:18,196 --> 00:32:21,796 Speaker 2: it involves like going to the unknown, because that's when 624 00:32:21,836 --> 00:32:25,996 Speaker 2: we discover new things. For most of us, it's so hard 625 00:32:26,036 --> 00:32:29,476 Speaker 2: to move from that place that is safe and familiar 626 00:32:29,836 --> 00:32:32,676 Speaker 2: that we never even really take the first leap. So 627 00:32:32,716 --> 00:32:35,636 Speaker 2: it's not that we are assessing risk. We're not thinking 628 00:32:35,636 --> 00:32:38,796 Speaker 2: about all these bad things that could happen. It's often just 629 00:32:39,716 --> 00:32:42,556 Speaker 2: the getting started and breaking the leap down that is 630 00:32:42,636 --> 00:32:43,196 Speaker 2: the problem. 631 00:32:43,356 --> 00:32:45,876 Speaker 1: I love this idea because it's really about like recognizing 632 00:32:45,916 --> 00:32:48,276 Speaker 1: that getting out of your comfort zone, which so 633 00:32:48,276 --> 00:32:49,916 Speaker 1: many of us struggle with, is in some ways a 634 00:32:49,916 --> 00:32:52,596 Speaker 1: trust leap, right. It's like embracing that unknown and kind 635 00:32:52,636 --> 00:32:55,596 Speaker 1: of trusting yourself that it's going to be fine no 636 00:32:55,636 --> 00:32:57,476 Speaker 1: matter how it turns out, even if it doesn't go 637 00:32:57,596 --> 00:32:58,516 Speaker 1: perfectly as planned.
638 00:32:58,596 --> 00:33:00,636 Speaker 2: Yeah, and I think, you know, you said you don't 639 00:33:00,716 --> 00:33:03,076 Speaker 2: like being on a bike. I hate being on a bike. 640 00:33:03,316 --> 00:33:06,036 Speaker 2: It fills me with dread. I don't like skiing on 641 00:33:06,076 --> 00:33:08,556 Speaker 2: the edges of things because I don't like ledges, right. 642 00:33:08,716 --> 00:33:11,796 Speaker 2: So often where we find it hard to take trust 643 00:33:11,876 --> 00:33:15,476 Speaker 2: leaps is associated with risks. So there are some risks, 644 00:33:15,836 --> 00:33:18,276 Speaker 2: like physical risks, that are really hard for me to take, 645 00:33:18,396 --> 00:33:20,676 Speaker 2: like when I swim in the ocean. My husband's Australian 646 00:33:20,756 --> 00:33:23,796 Speaker 2: and we go to Australia. I'm so afraid of the sharks. 647 00:33:23,796 --> 00:33:26,636 Speaker 2: But no shark is coming in three centimeters of water, right. 648 00:33:27,636 --> 00:33:30,516 Speaker 2: Those trust leaps are really really hard, but creative 649 00:33:30,516 --> 00:33:32,916 Speaker 2: trust leaps are really easy for me to take. It's 650 00:33:32,956 --> 00:33:37,356 Speaker 2: a very different type of risk, and so understanding that, 651 00:33:37,596 --> 00:33:39,676 Speaker 2: again, can help, and it's something I talk about in 652 00:33:39,676 --> 00:33:42,436 Speaker 2: the book because I really believe it can help you 653 00:33:43,316 --> 00:33:46,476 Speaker 2: understand where you're stuck. Where you like things to 654 00:33:46,516 --> 00:33:48,756 Speaker 2: be comfortable and where you really find it hard to 655 00:33:48,796 --> 00:33:52,556 Speaker 2: stretch yourself is usually to do with different types of risk: 656 00:33:52,716 --> 00:33:57,196 Speaker 2: financial risk, emotional risk, physical risk, creative risk. They all 657 00:33:57,236 --> 00:33:59,076 Speaker 2: have a very different makeup in our lives.
658 00:33:59,236 --> 00:34:01,676 Speaker 1: You've also argued that we'd be helped in terms of 659 00:34:01,796 --> 00:34:05,036 Speaker 1: finding more trust by trying to sort out our trust barriers. 660 00:34:05,316 --> 00:34:07,036 Speaker 1: What do you mean by trust barriers, and what are 661 00:34:07,076 --> 00:34:08,396 Speaker 1: some ways that we can understand them? 662 00:34:08,676 --> 00:34:11,036 Speaker 2: So the trust barrier, as you can tell, I like 663 00:34:11,076 --> 00:34:15,076 Speaker 2: my metaphors, is the thing that gets in the way. 664 00:34:15,356 --> 00:34:17,036 Speaker 2: You know, like the number of people who say I 665 00:34:17,076 --> 00:34:20,236 Speaker 2: can't move, like they've never lived in a different country, 666 00:34:20,476 --> 00:34:23,476 Speaker 2: or they can't move jobs, or in some ways they 667 00:34:23,476 --> 00:34:26,116 Speaker 2: can't get out of a relationship, they can't take up 668 00:34:26,116 --> 00:34:29,556 Speaker 2: a new hobby. They are describing, well, being stuck. That 669 00:34:29,636 --> 00:34:32,676 Speaker 2: they are in some way paralyzed, and that is because 670 00:34:32,756 --> 00:34:35,756 Speaker 2: there is some kind of barrier in the way. Now, 671 00:34:35,796 --> 00:34:38,676 Speaker 2: that trust barrier, it might be a very practical thing 672 00:34:38,756 --> 00:34:42,116 Speaker 2: like money, financial security, but it can also be things 673 00:34:42,196 --> 00:34:47,076 Speaker 2: like companionship, being really frightened to do it alone. Not 674 00:34:47,196 --> 00:34:49,356 Speaker 2: to go on about running, but I wouldn't have been able 675 00:34:49,396 --> 00:34:51,636 Speaker 2: to run if I didn't find a friend, because I 676 00:34:51,636 --> 00:34:54,396 Speaker 2: wouldn't run around the parks in the dark, and I'd 677 00:34:54,436 --> 00:34:55,996 Speaker 2: be really scared of what was going to happen if 678 00:34:56,036 --> 00:34:59,436 Speaker 2: I got lost.
That's a trust barrier, and in companies 679 00:34:59,556 --> 00:35:02,676 Speaker 2: these barriers get bigger. It's actually a really useful framework 680 00:35:02,716 --> 00:35:05,836 Speaker 2: if you're launching a new product or service, because often 681 00:35:05,876 --> 00:35:08,556 Speaker 2: what you think is the trust barrier is very different 682 00:35:08,556 --> 00:35:10,876 Speaker 2: from what the customer thinks is the trust barrier. 683 00:35:10,996 --> 00:35:14,996 Speaker 2: So understanding what that perception is around risk, whether it's 684 00:35:15,076 --> 00:35:18,156 Speaker 2: real or perceived, can be really helpful for launching a 685 00:35:18,156 --> 00:35:19,356 Speaker 2: new product or service as well. 686 00:35:19,396 --> 00:35:21,356 Speaker 1: And I imagine there are lots of trust barriers that come 687 00:35:21,396 --> 00:35:24,076 Speaker 1: up in the domain of politics, which is what the 688 00:35:24,076 --> 00:35:26,516 Speaker 1: World Happiness Report is about. What are some examples of 689 00:35:26,516 --> 00:35:28,476 Speaker 1: trust barriers that come up in that domain? 690 00:35:28,676 --> 00:35:32,516 Speaker 2: I'm not really sure where to begin, but a trust barrier, 691 00:35:32,596 --> 00:35:34,916 Speaker 2: actually it's a funny thing, but it can be like, 692 00:35:35,316 --> 00:35:39,196 Speaker 2: nothing's going to change. So it can result in apathy, 693 00:35:39,516 --> 00:35:41,716 Speaker 2: like, I'm just not going to vote, I'm going to abstain. 694 00:35:42,276 --> 00:35:46,716 Speaker 2: So believing that there is no way things are going 695 00:35:46,756 --> 00:35:50,916 Speaker 2: to change, that no direction or system is going to change. Financial insecurity, 696 00:35:51,156 --> 00:35:54,836 Speaker 2: that is a massive trust barrier.
Not believing there's going 697 00:35:54,916 --> 00:35:57,276 Speaker 2: to be any kind of redistribution of wealth, or that 698 00:35:57,396 --> 00:36:00,356 Speaker 2: my life is going to get better. Trust barriers can 699 00:36:00,436 --> 00:36:03,156 Speaker 2: be much closer to home. You're probably seeing in the 700 00:36:03,196 --> 00:36:05,596 Speaker 2: research the number of people, and it's a real worry, 701 00:36:05,716 --> 00:36:07,476 Speaker 2: who believe they're never going to own a home, they're never 702 00:36:07,516 --> 00:36:09,636 Speaker 2: going to be able to retire, and so it can 703 00:36:09,716 --> 00:36:12,436 Speaker 2: actually paralyze people from even getting started, even though that 704 00:36:12,476 --> 00:36:14,956 Speaker 2: life stage might be forty to fifty years away. So 705 00:36:15,076 --> 00:36:19,836 Speaker 2: these are real trust barriers that are impacting people's happiness 706 00:36:19,876 --> 00:36:21,436 Speaker 2: and immediate decisions. 707 00:36:21,756 --> 00:36:23,836 Speaker 1: And so what are some questions we can ask ourselves 708 00:36:23,876 --> 00:36:25,716 Speaker 1: to overcome these trust barriers? I mean, some of the 709 00:36:25,796 --> 00:36:27,756 Speaker 1: ones you're talking about are structural, right, they don't have 710 00:36:27,836 --> 00:36:29,996 Speaker 1: enough money for a home. These things might be harder, 711 00:36:30,316 --> 00:36:32,036 Speaker 1: but some of the trust barriers, like, well, I can't 712 00:36:32,036 --> 00:36:34,476 Speaker 1: do it by myself, they might be easier to overcome 713 00:36:34,516 --> 00:36:36,596 Speaker 1: if we confront our fears a bit. So what are 714 00:36:36,596 --> 00:36:39,436 Speaker 1: some questions we can ask ourselves to overcome these trust barriers? 715 00:36:39,516 --> 00:36:42,276 Speaker 2: It's a good question. Where did it come from?
Was 716 00:36:42,476 --> 00:36:45,156 Speaker 2: it something I developed, or was it something that I 717 00:36:45,236 --> 00:36:47,476 Speaker 2: was told as a child? That's a really big one. 718 00:36:47,596 --> 00:36:49,876 Speaker 2: So if you grew up in a family that was always 719 00:36:49,916 --> 00:36:53,556 Speaker 2: like, be careful, don't go that far, or get down 720 00:36:53,596 --> 00:36:57,116 Speaker 2: from there. A lot of it starts really young, Laurie, 721 00:36:57,156 --> 00:37:00,676 Speaker 2: and that's not surprising, because our relationship to risk and 722 00:37:00,676 --> 00:37:03,316 Speaker 2: trust is really formed around the age of four. The 723 00:37:03,436 --> 00:37:07,476 Speaker 2: second thing I would ask is, again, how much of 724 00:37:07,516 --> 00:37:10,516 Speaker 2: this is real and how much of this is perception? 725 00:37:11,316 --> 00:37:14,396 Speaker 2: So how much of this is rooted in facts and 726 00:37:14,476 --> 00:37:19,276 Speaker 2: data versus my own fears? That would be the second, 727 00:37:19,596 --> 00:37:23,756 Speaker 2: and then the third is not necessarily getting rid of 728 00:37:23,796 --> 00:37:26,716 Speaker 2: the barrier, but lowering the leap. You know, I feel 729 00:37:26,756 --> 00:37:28,596 Speaker 2: so lucky to teach at Oxford, but then I have 730 00:37:28,676 --> 00:37:30,876 Speaker 2: friends who are like, I'd love my child to go there, 731 00:37:31,196 --> 00:37:34,916 Speaker 2: and you're like, that is a very ambitious leap for 732 00:37:35,036 --> 00:37:38,236 Speaker 2: many children. Like, why don't we just lower that leap 733 00:37:38,716 --> 00:37:41,516 Speaker 2: a little bit? And if it ends up like that, great. 734 00:37:41,716 --> 00:37:44,676 Speaker 2: It's not about not being ambitious and wanting achievement. I 735 00:37:44,796 --> 00:37:47,836 Speaker 2: just think so much pressure comes in life because we 736 00:37:47,916 --> 00:37:50,236 Speaker 2: make these leaps way too high.
737 00:37:50,756 --> 00:37:53,316 Speaker 1: And I think that that happens in politics too, right. 738 00:37:53,396 --> 00:37:56,116 Speaker 1: I think both because of the misinformation you talked about, right, 739 00:37:56,156 --> 00:38:00,076 Speaker 1: you know, is my information accurate? Have I gotten the right facts? 740 00:38:00,356 --> 00:38:03,396 Speaker 1: But also because we make the barriers to entry so high, right. 741 00:38:03,436 --> 00:38:05,796 Speaker 1: I think when we envision maybe talking to somebody from 742 00:38:05,836 --> 00:38:09,676 Speaker 1: across the aisle, we picture someone really extreme on the other 743 00:38:09,716 --> 00:38:11,556 Speaker 1: side of the aisle, right, you know, because that's, you know, 744 00:38:11,596 --> 00:38:13,636 Speaker 1: what we see in our distributed trust networks, that's what 745 00:38:13,636 --> 00:38:16,556 Speaker 1: the influencers are attacking us about. Are there ways that we 746 00:38:16,596 --> 00:38:19,596 Speaker 1: can maybe overcome these trust barriers in those domains too? 747 00:38:20,276 --> 00:38:23,076 Speaker 2: I think it's a really important point, because the 748 00:38:23,116 --> 00:38:25,476 Speaker 2: way it's sort of pictured often is taking on 749 00:38:25,516 --> 00:38:27,996 Speaker 2: the system or taking on the other side. And I 750 00:38:28,036 --> 00:38:31,956 Speaker 2: would start by just having a conversation that makes you uncomfortable. 751 00:38:32,156 --> 00:38:35,196 Speaker 2: Being comfortable with the discomfort of a difficult situation is 752 00:38:35,236 --> 00:38:38,156 Speaker 2: the starting place. So I have a lot of friends 753 00:38:38,196 --> 00:38:42,156 Speaker 2: that have very difficult and different views on the war 754 00:38:42,196 --> 00:38:45,436 Speaker 2: in Gaza.
And you know, at my children's school, they're 755 00:38:45,436 --> 00:38:48,876 Speaker 2: banned from talking about wars or politics because, and I quote, 756 00:38:48,916 --> 00:38:51,956 Speaker 2: it brings up big feelings. And I have a real 757 00:38:51,996 --> 00:38:55,356 Speaker 2: objection to that, because what are we teaching our children? 758 00:38:55,436 --> 00:38:57,916 Speaker 2: That they can't deal with that discomfort, that they can't 759 00:38:57,956 --> 00:39:01,396 Speaker 2: have a difficult conversation, that they can't hold that space 760 00:39:01,436 --> 00:39:04,036 Speaker 2: with another human being that maybe has a different belief 761 00:39:04,116 --> 00:39:06,516 Speaker 2: or viewpoint from them? So you know, you can start 762 00:39:06,556 --> 00:39:09,036 Speaker 2: with your own friends and your family and people that 763 00:39:09,276 --> 00:39:11,276 Speaker 2: you know, and at the end they're just going to 764 00:39:11,316 --> 00:39:13,436 Speaker 2: hug you and tell you that they still love you, right, 765 00:39:13,476 --> 00:39:16,876 Speaker 2: and know that nothing changes in that relationship. There's so 766 00:39:17,116 --> 00:39:20,036 Speaker 2: much we can learn just from being with that discomfort 767 00:39:20,116 --> 00:39:22,236 Speaker 2: and that heat and learning that everything is okay. 768 00:39:22,436 --> 00:39:24,356 Speaker 1: And then on the other side of that discomfort, if 769 00:39:24,356 --> 00:39:26,916 Speaker 1: you push through, it might be a trusting relationship that 770 00:39:27,156 --> 00:39:28,956 Speaker 1: you're going to value tremendously and it's going to make 771 00:39:28,996 --> 00:39:29,796 Speaker 1: you much happier.
772 00:39:29,876 --> 00:39:33,036 Speaker 2: Yeah, or, like, not to go back to my children, but 773 00:39:33,076 --> 00:39:34,916 Speaker 2: like, this did happen in their school, where they had 774 00:39:34,996 --> 00:39:38,196 Speaker 2: quite a difficult conversation around the Holocaust, where some children, 775 00:39:38,516 --> 00:39:41,196 Speaker 2: you know, now don't believe the Holocaust happened. Now, 776 00:39:41,476 --> 00:39:43,916 Speaker 2: they're nine and ten. That's not coming from them, that's 777 00:39:43,956 --> 00:39:47,396 Speaker 2: come from somewhere. And the fact that the history teacher 778 00:39:47,796 --> 00:39:50,716 Speaker 2: then backed them up and gave this very factual assembly. 779 00:39:51,236 --> 00:39:54,116 Speaker 2: They were like, wow, there's someone who cares, right, and 780 00:39:54,156 --> 00:39:57,196 Speaker 2: there's information that we can trust. And he explained where 781 00:39:57,196 --> 00:39:59,956 Speaker 2: he got this information from. Now, regardless of what 782 00:40:00,276 --> 00:40:03,796 Speaker 2: side you're on, that assembly would have impacted both sets 783 00:40:03,796 --> 00:40:06,276 Speaker 2: of children, the non-believers and the children for whom this 784 00:40:06,396 --> 00:40:09,556 Speaker 2: was really important, that this chapter in history was remembered. 785 00:40:09,596 --> 00:40:13,636 Speaker 2: So I feel like those moments, particularly with children, where it's 786 00:40:13,676 --> 00:40:15,996 Speaker 2: really like, I can trust this person, I can trust 787 00:40:15,996 --> 00:40:18,276 Speaker 2: this situation, are really important to teach.
788 00:40:18,396 --> 00:40:20,236 Speaker 1: So as we think back to the World Happiness Report 789 00:40:20,236 --> 00:40:22,716 Speaker 1: and the importance it's placed on trust, and how trust 790 00:40:22,836 --> 00:40:25,876 Speaker 1: might be going down institutionally, how lack of trust is 791 00:40:25,956 --> 00:40:28,676 Speaker 1: kind of affecting our politics, and so much more, any last 792 00:40:28,716 --> 00:40:31,196 Speaker 1: advice for how people on both sides can become a 793 00:40:31,236 --> 00:40:32,196 Speaker 1: little bit more trusting? 794 00:40:32,716 --> 00:40:35,876 Speaker 2: I think too much of the conversation is around institutional trust. 795 00:40:36,476 --> 00:40:40,676 Speaker 2: These are systems that ninety nine percent of us cannot fix. 796 00:40:40,916 --> 00:40:43,156 Speaker 2: They are too big, they're too far out of reach, right. 797 00:40:43,316 --> 00:40:47,796 Speaker 2: So I would say, focus on local trust. It doesn't even 798 00:40:47,836 --> 00:40:51,636 Speaker 2: have to be your family and friends. Focus on things 799 00:40:51,716 --> 00:40:55,796 Speaker 2: going on in your community, in your neighborhood, in your street, 800 00:40:56,276 --> 00:40:59,356 Speaker 2: and really get involved in something like that. It can 801 00:40:59,396 --> 00:41:01,556 Speaker 2: do so much for your social ties. It can do 802 00:41:01,636 --> 00:41:04,876 Speaker 2: so much for the glue and make you feel in 803 00:41:04,916 --> 00:41:07,796 Speaker 2: some way back in control, because you can impact the 804 00:41:07,836 --> 00:41:11,036 Speaker 2: people around you. So that would be my advice: 805 00:41:11,276 --> 00:41:15,076 Speaker 2: just stop looking outward and upward so much at these 806 00:41:15,076 --> 00:41:18,276 Speaker 2: big problems and letting them consume so much energy. And 807 00:41:18,316 --> 00:41:20,996 Speaker 2: it's not about withdrawal.
I'm not saying that, I'm 808 00:41:21,036 --> 00:41:23,036 Speaker 2: not saying, like, be at home alone more. We don't 809 00:41:23,076 --> 00:41:25,956 Speaker 2: need more of that. It's about local ties and community 810 00:41:25,956 --> 00:41:29,196 Speaker 2: ties and connections that you could in some way put 811 00:41:29,236 --> 00:41:30,556 Speaker 2: into action tomorrow. 812 00:41:32,356 --> 00:41:34,916 Speaker 1: I hope you've enjoyed my conversation with Rachel and that 813 00:41:35,036 --> 00:41:36,796 Speaker 1: you've learned a bit more about what you can do 814 00:41:36,836 --> 00:41:39,836 Speaker 1: to trust and be trusted more effectively. If you want 815 00:41:39,876 --> 00:41:42,276 Speaker 1: to hear more, be sure to check out Rachel's fabulous 816 00:41:42,276 --> 00:41:46,076 Speaker 1: new audiobook, How to Trust and Be Trusted, available everywhere 817 00:41:46,116 --> 00:41:48,676 Speaker 1: you get your audiobooks. That's a wrap on our 818 00:41:48,716 --> 00:41:51,396 Speaker 1: special series on the World Happiness Report. But not to 819 00:41:51,436 --> 00:41:53,956 Speaker 1: worry, because the Happiness Lab will be back next week. 820 00:41:54,156 --> 00:41:56,716 Speaker 1: I'll be chatting with my friend, the happiness expert, Gretchen 821 00:41:56,756 --> 00:42:00,036 Speaker 1: Rubin about her secrets to adulthood. So I hope you'll 822 00:42:00,036 --> 00:42:03,076 Speaker 1: return soon for the Happiness Lab with me, Doctor Laurie 823 00:42:03,076 --> 00:42:10,876 Speaker 1: Santos.