Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Speaker 2: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Speaker 3: And I am Joe McCormick, and we're back with part two in our series on cynicism, the tendency to believe other people are selfish, untrustworthy, and immoral. In part one, we talked about what cynicism means in its modern usage, and we contrasted that with similar but distinct concepts like pessimism, and also with Cynic philosophy, the latter being a school of philosophy born in ancient Greece that emphasized moral integrity, self-sufficiency, and virtue, achieved in part by shedding pretensions, like ignoring the pressure to conform and living in accordance with our nature. The word cynic comes from the Greek word for dog, and so a Cynic philosopher in the ancient sense might say that we can all learn something from watching the honest way that a dog lives according to its nature. This, of course, is quite different from what we mean by cynicism today in common language, which is a disposition of low social trust: the tendency to believe, as we put it last time, that people are bad, people are selfish, and morals are fake.

Speaker 2: Yeah. Ancient philosophic cynics: be like a dog. Modern cynics: dog eat dog.

Speaker 3: Yes. Now, last time we also talked a bit about cynicism in literature, with my observation that really, the most cynical writing I could find anywhere was all in the Bible. I'm not sure exactly what to make of that, but it is kind of surprising and counterintuitive perhaps. It's especially the books of the prophets in the Hebrew Bible, like Micah and Jeremiah, that are full of just very eloquent, devastating condemnations of human nature. "The heart is deceitful above all things," "there is none that doeth good, no, not one," that sort of thing.
We also discussed medical, psychological, and sociological research on the correlates of cynicism, with the overwhelming conclusion being that holding cynical beliefs about human nature and low trust in others is quite harmful to us in many different ways. It appears to be bad for our health, mental and physical. It is correlated with all kinds of undesirable outcomes, including early death. We didn't really get into this research last time, but you probably won't be surprised to learn it is associated with low-quality social relationships. And despite the archetype of the ruthless, cynical striver and achiever, it also tends, on average, to make it harder for us to reach even our material goals, like making money and attaining positions of leadership, at least in part because cynical people waste a lot of time and resources trying to avoid being made a sucker, and miss out on opportunities to cooperate with others for mutual benefit. So I think it's safe to say that believing everyone is just in it for themselves and cannot be trusted is in so many ways bad, bad, bad for us. It's bad for our lives. But of course, the somewhat heartwarming implied inverse of all this is the evidence that it really does us good to cooperate and trust people.

Speaker 2: Definite silver lining there to all of this research we've been doing here.

Speaker 3: Now, in the last episode we also raised a number of questions that we weren't able to fully answer yet, and we'll come back to them throughout the series. I think we'll look at at least one study today that sheds some light on this. But these were questions like: do more cynical or less cynical people actually have a better predictive model of the world? Whose model of how other people behave is more accurate? And in any given situation, how can we know if we're being too trusting, too cynical, or if we're striking just the most reasonable balance?
And also, given that it comes with so many clear downsides, what, if anything, is the benefit of cynicism?

Speaker 2: Well, I'm not sure this will actually help us answer any of these questions, but maybe it'll give us just a little more elbow room to work with the concept. Reading through Ansgar Allen's book Cynicism, from MIT Press, which I referenced in the last episode, he covers several subsets of modern cynicism, and here are a few that I thought might help us out. So one category is insider cynics: the cynicism of contemporary professionals who believe that people are ultimately selfish, and these individuals do their best to survive in their organizations by dealing with their colleagues on those terms. So we might think of it as an "I'm not here to make friends" approach, but applied only to work, or specifically to work, with the idea being that maybe outside of that work environment they are less cynical.

Speaker 3: Oh, okay. Well, I would certainly have questions about how skilled people actually are at turning cynicism on and off when switching between contexts. But you can certainly see how that would be useful, maybe because, you know, it's quite reasonable to understand that certain professional environments require a lot less trust, require you to be more doubting of people. Maybe if you are, I don't know, investigating murders or something like that, you really need to not just trust people.

Speaker 2: There are so many movies about murder detectives being able to neatly separate their work life from their home life.

Speaker 3: Yeah. But hopefully, if you were in a job like that, or if you're just in kind of a cutthroat business environment where everybody's trying to edge other people out in deals, you would hope to be able to turn that off when you come out of it and get into your relationships in life. I guess it's questionable to what extent people can do that very well.
Speaker 2: I would agree with that. Yes. Another classification that he singles out is the master cynics: rich and powerful contemporary cynics who hide their own cynicism by adopting the values and beliefs of people they hold power over. In a weird sense, this is kind of a puzzle, because a person who's really cynical about politics would no doubt assume that every politician is a master cynic, that when they're glad-handing and so forth, they're just pretending to take on these values, where deep down they know the truth: "I know that people are just selfish, and they're exploiting everyone as well."

Speaker 3: We're going to have to come back to the question of cynicism in politics and political participation, because I think that raises all sorts of interesting questions.

Speaker 2: Yeah, and there's been a lot of research, a lot of writing, just on that area alone. Now, the third of the categories that I'm going to reference here (there are some additional ones that I'm not getting into) is paternalistic cynicism. In this one, a person holds a cynical view of human motivations and seeks to capitalize on those motivations, not for personal gain, but for the greater good. And that one, you know, made me think a little bit more, and maybe gives us a little more room to play when considering the effectiveness, or possible effectiveness, of cynicism. I was thinking, let's say you wanted to encourage a certain behavior in the general public, and you're presented with two messaging options: one that works based on a cynical view of human motivations, you know, greed and self-interest, while the other appeals to the higher angels of their nature. Well, which is more likely to work? It's going to depend on the messaging, of course, and the exact details of the target audience, you know, a general audience versus some narrower audience. But yeah, which worldview is a better starting place?
I feel like this might work as a good sort of practical thought experiment, because it entails making a choice about how you're going to model the motivations of a given population. But on the other hand, we kind of come back to that sliding scale of cynicism. If one is cynical enough about the intended audience, then would any messaging seem like it would work? Like, why are you even bothering? If you're just assuming that everyone out there is just selfish, you might not be able to get them to stop littering, or to recycle, or to, I don't know, wear seat belts.

Speaker 3: But maybe the paternalistic cynicism model is like: you must stop littering and you must be kind to your fellow human being, or you will go to hell. Something, you know, like appealing to your base personal interest in order to get you to do something that the messenger sees as good.

Speaker 2: Yeah, bring it back to fear, right. Maybe I'm just muddying the waters, but there are so many different areas you can get into in a discussion of modern cynicism, Rob.

Speaker 3: One thing you said in the last episode that I thought was really insightful, and that I have kept thinking about ever since we recorded, is that maybe one of the appeals of cynicism is that expressions of cynicism are cathartic statements, like "people suck." When you say things like that, or when somebody else says that and you get to agree with them, it feels like a psychological pressure release valve; by making a statement of that sort, you're sort of blowing off steam. And of course this can be true even for people who are not especially cynical on average, having these little moments of situational cynicism. But I was thinking more about the catharsis element, and I wonder if this association of cynicism with cathartic relief also tells us something about how people can acquire generalized dispositional cynicism.
Like, what if the person who is very cynical on average gets to be that way by habituating themselves to a desire for that momentary catharsis that you get from saying "people suck"? Does that make any sense?

Speaker 2: Yeah. Like, you dip your hand into the cynicism cookie jar a few times too often and the crumbs begin to stick. Yeah, I was thinking about this when I was looking at some material related to cynicism in the workplace, which is its own huge area of consideration that we're not really going to get into in this episode per se. But I was just thinking, okay, if one's really cynical regarding one's employers, one's corporate overlords, you might excuse any amount of inaction or slacking based on the view that, well, they don't really care about me, they're not invested in me, they're not paying me enough, and so forth, any grievance you might imagine. And maybe, just maybe, in small doses, this gives you space to let things glide in ways that ease your work burden or create space for something else you want to do. And yeah, maybe it's a situation where, if you stick your hand into that cookie jar too often, it does become your default view, at least of your work situation. And maybe it bleeds over into other areas as well.

Speaker 3: That is an interesting possibility. So you're in a kind of pressure-inducing scenario, and, for whatever reason, you can get little moments of relief from that pressure by resorting to cynical evaluations. And so does that create a kind of addiction? Like, you associate the moment of cathartic cynicism with relaxation of the pain, or with pleasure even, or something like that, and you just kind of keep pressing the pleasure button until that's just what your personality is. Yeah, I don't have research to back up that interpretation of where cynicism comes from, but I think that is an interesting possibility for how it might sometimes set in.
Speaker 2: Yeah, yeah. I also was thinking, you know, in terms of dealing with a corporation or a company: it's one thing to sort of initially think cynically about this faceless thing, this organization. But of course organizations are made up of people, and so I wonder how the cynicism might spread, where you might generally have cynical ideas about a company, but then those cynical ideas end up applying to certain heads of that company. And then it could potentially trickle down, and where does it stop? Like, who stops being the face of the company? I guess they have to be cynical enough; they have to share your cynicism in order to be, like, your brothers in arms against the company, that sort of thing.

Speaker 3: Yeah, yeah. You know, actually, this sort of gets a little bit into something I'm going to talk about in the paper I'm about to explain. But there are different sorts of environments and contexts that encourage, or warrant, different levels of cynicism. And so there can be very cynicism-positive organizations. Like, if you are within a company that is very cruel, and in which you know you don't do very well by placing trust in people, it can be quite reasonable to end up responding with a generalized cynicism about interactions within that company. Organizational culture is a thing. So anyway, I want to turn to a concept in psychological research on cynicism that I found really interesting, and that is the so-called cynical genius illusion. I was reading about this in a paper by a couple of researchers that I cited in part one of the series. These scientists are Olga Stavrova, a professor of psychology at Tilburg University in the Netherlands, and Daniel Ehlebracht at the University of Cologne in Germany.
These two published a paper in the year twenty eighteen in the journal Personality and Social Psychology Bulletin, and the paper was called "The Cynical Genius Illusion: Exploring and Debunking Lay Beliefs About Cynicism and Competence." Now, they begin by acknowledging a lot of the things we talked about in the last episode. They define cynicism as, you know, the main cognitive component of hostility. They're talking about it in pretty much the same terms we are: a belief that you should be suspicious of other people's motives, that they are primarily motivated by self-interest, that they can't be trusted and will harm you. And then they run through the long list of ways that cynicism appears to be bad for us, bad for our lives, in health, in relationships, in ability to attain goals, and so forth. However, the authors complicate that picture by noting that if you just look at popular culture and literature and folk wisdom, cynicism does not seem to have, on the whole, a negative reputation. To read from their introduction here, quote: "Among nineteenth and twentieth century writers and popular figures, cynicism has often been seen as a sign of intelligence and wit. American writers Ambrose Bierce and Lillian Hellman praised cynicism as an art of seeing the true nature of things. Bernard Shaw referred to cynicism as a 'power of accurate observation,' and John Stuart Mill noticed that 'it is thought essential to a man who has any knowledge of the world to have an extremely bad opinion of it.'" And as for the other authors they named, I looked up some of the cynical quotes. I can't believe I didn't think of Ambrose Bierce as a good source of literary cynicism in the last episode. But in The Devil's Dictionary, Bierce defines a cynic as someone "whose faulty vision sees things as they are, not as they ought to be." They also cited Lillian Hellman.
Her version of this was, quote, "Cynicism is an unpleasant way of saying the truth." And I do think there's something interesting in Hellman's phrasing here, because of the emphasis on tone. It is an unpleasant way of saying what is true. So what is the difference between somebody just "being real" and somebody being cynical? It might be in the substance of what they say and how they think; it might be, you know, material, substantive differences. But I think sometimes we make that distinction based on whether there is negative emotion in their expression. Like, if they are counseling us against trust, did they deliver that counsel with anger or contempt? But anyway, you've got all this literature that equates cynicism with a kind of wisdom and, you know, the power to see what is really going on. Also, the authors here point out that if you draw up a list of cynical characters in popular culture, they don't tend to be pitiable wretches dealing with setbacks imposed by their lack of faith in humankind. More often, cynicism in fictional characters is presented as gruff, hard-won realism and wisdom. The cynical character has knowledge, insights, and powers of deduction not available to their more trusting peers. So think of Sherlock Holmes, or, the authors give the example of House from House, M.D. I'm not a watcher of House, but I'm familiar with the character.

Speaker 2: I mean, you could do a full stop after Sherlock, because Sherlock, of course, influences so many different similar characters and casts a long shadow across the English language, and fiction in other languages; he casts a long shadow across our media. And it is interesting to think about Sherlock Holmes in these terms, because yes, Sherlock Holmes is presented as being, you know, somewhat emotionally detached, but certainly he's fighting the good fight.
He is on the side of the good guys, and will sometimes even, you know, break the rules a little bit, or bend them, in order to make sure that justice is served. But on the other hand, I think if you look closely enough at Sherlock, I mean, he's also a character who at times admits that he's never loved anyone, or has certainly never had a romantic love in his life. He also struggles horribly with addiction at one point. You know, so he's not an angel. But again, I guess this is part of his presentation: he's hard-boiled. It's hard-won cynicism that he uses in order to solve the crimes that he's presented with.

Speaker 3: Yeah, yeah. I mean, I think that is actually generally true. Cynical characters are often presented as suffering as a result of their own cynicism, but not wrong because of it. Like, their cynicism is something that hurts them and makes them sad and lonely, but it also gives them cognitive superiority. It gives them intelligence and wisdom and the power to see through the facade and see what's really happening.

Speaker 2: Yeah.

Speaker 3: But anyway, based on this background of the cynical geniuses in fiction, and the sort of cynical wit and wisdom from literature, the authors conducted a number of different studies. They did four studies to explore common beliefs about the link between cynicism and cognitive superiority and competence, and then three more studies to look at whether there actually is a link. So: do people in general think that cynicism is a sign of knowledge, intellect, and competence? Do cynics actually seem smarter? And are they actually smarter and more competent than the rest of us?

Speaker 2: Well, Sherlock is above reproach, but I'm curious to hear how this relates to real people.

Speaker 3: So the authors begin by acknowledging some existing research that touches on these questions.
For example, there was a study by Evans and van de Calseyde published in Personality and Social Psychology Bulletin in twenty eighteen called "The Reputational Consequences of Generalized Trust." And this study was just sort of a survey of what we tend to think of people when we know that they are high in trust or low in cynicism. The findings were that high-trust individuals are seen as moral and seen as sociable, but also seen as less competent. And this kind of makes sense as a familiar personality archetype, right? Like, Johnny is so trusting; he's a good guy, he's friendly, but he doesn't know what he's doing. So, if showing generalized trust makes people think we're less competent, does that imply that showing generalized distrust makes people think we're more competent? Perhaps. However, the authors also found here that people see you as more competent if you display what they call discriminant ability, which is the ability to tell the difference between situations in which you should trust and situations in which you should not. And this takes us back to the question we mentioned in part one. Obviously, nobody either trusts or distrusts in every situation. So how do you determine how cynical it is reasonable to be in a given situation? And how do we know if we're off balance?

Speaker 3: Now, the authors also discuss reasons that people might think it is wise to be cynical. One is pretty familiar: better safe than sorry reasoning.
They write, quote: "In many domains, the consequences of false negative errors, e.g., believing that someone is trustworthy when they really are not, have often been more costly than false positive errors, e.g., believing that someone is untrustworthy when they really are trustworthy, over human evolutionary history, making the cognitive system of modern humans biased toward false alarms." Which is hard to argue with, right? Like, yeah, in this series we are showing lots of evidence that it is bad for you to be highly, chronically cynical. And yet it's true that, more often than not, if you distrust a trustworthy person, the immediate consequences are fairly limited, but if you trust an untrustworthy person, the consequences can be disastrous.

Speaker 2: Yeah, we talked about this a little bit in the last episode, type one errors and cognition. You know, it's like you've got to make your way across an open field, and you know there's going to be a time cost, and probably, like, you know, an anxiety cost, to checking every bush along the way to make sure there's not a tiger in there to jump out and get you. But you know, the way our brains work and the way we're hardwired, it's like we know that that's one sort of risk: I'm going to lose some time, and I might, you know, feel horrible the whole way, versus getting eaten by a tiger. One of those looms far larger in our short-term threat analysis.

Speaker 3: One side of the balance has an infinite cost on it. Yeah, it's kind of hard to outbalance that, even though, like, wasting all your time and resources checking every bush, that really does matter; over time, that hugely impacts your quality of life.

Speaker 2: Yeah, yeah, especially of course, you know, when you get into not only real tigers but all the paper tigers in one's life.

Speaker 3: So anyway, they say that a general appreciation for the merits of the better safe than sorry framework could lead to the widespread notion that cynics are smarter people, that they're more knowledgeable and more competent.
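To make that better safe than sorry asymmetry concrete, here is a minimal sketch in Python. The costs and the base rate are illustrative assumptions of ours, not figures from the episode or from any of the papers discussed:

```python
# Illustrative only: toy numbers of our own, not figures from the research.
# Compares the expected cost of defaulting to trust vs. defaulting to
# distrust when the two kinds of error carry very different penalties.

P_TRUSTWORTHY = 0.85   # assumed base rate of trustworthy partners
COST_BETRAYAL = 100.0  # large, vivid loss from trusting a betrayer
COST_MISSED = 5.0      # small, invisible loss from distrusting a good partner

# Always trusting: you only pay when you run into a betrayer.
ev_always_trust = (1 - P_TRUSTWORTHY) * COST_BETRAYAL

# Never trusting: you forfeit a small gain on every good partner.
ev_never_trust = P_TRUSTWORTHY * COST_MISSED

print(f"always trust: expected cost {ev_always_trust:.2f} per interaction")
print(f"never trust:  expected cost {ev_never_trust:.2f} per interaction")
# 0.15 * 100 = 15.00 vs. 0.85 * 5 = 4.25: blanket distrust looks "smart"
# here even though 85% of partners are trustworthy, because the rare
# betrayal is priced so much higher than the routine missed opportunity.
```

The point of the sketch is just that a big enough betrayal cost can make indiscriminate distrust the lower-cost default, which is the intuition behind the heuristic, even though the research discussed earlier suggests chronic cynicism loses out over the long run.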
Now, continuing the background review, the authors also get into existing research on whether there is an actual link between cynicism and competence. So now we're asking not about how cynics are perceived, but about what their relative competence level actually is. And this, I guess, comes back to another question we brought up in part one: do cynics or non-cynics have a better predictive model of the world? And one very interesting way of studying this is the so-called trust game. So here's an example of a type of trust game. This was described in a paper called "Why So Cynical? Asymmetric Feedback Underlies Misguided Skepticism Regarding the Trustworthiness of Others," by Detlef Fetchenhauer and David Dunning, in the journal Psychological Science in twenty ten, and it describes what is sometimes called an investing game. They call it a trust game, and it goes like this, quote: "In the game, the truster is given money that can be kept or handed to a completely random and anonymous stranger, the trustee. If the truster hands his or her money over, the amount of money is quadrupled, e.g., five dollars becomes twenty dollars, and trustees have two options: they can either split the money evenly between themselves and the truster, e.g., give ten dollars back and keep ten dollars for themselves, or they can keep all the money for themselves." So the way the game works is, I'm the truster; I'm the person who gets to make the first decision. If I trust you and you are trustworthy, we both benefit and I double my money. If I trust you and you are not trustworthy, I get nothing. So the authors ran this experiment lots of times, and some interesting patterns came out. They found that trusters estimate that the rate of trustworthiness of anonymous strangers in the game will be between forty-five and sixty percent. So it seems that most people think it's a little better than a coin-flip chance that the other person will honor their trust and split the money for mutual benefit. In reality, the trustees honored the trust and split the money around eighty to ninety percent of the time.
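As a quick illustration of the quoted payoff structure, here is a minimal sketch in Python. The five-dollar stake and the quadrupling come from the description above; the break-even threshold is our own arithmetic, not an analysis from the paper:

```python
# The trust game as quoted above: a $5 stake that is quadrupled when
# handed over, after which the trustee either splits the pot or keeps it.
# The break-even calculation below is our illustration, not the paper's.

STAKE = 5.0       # truster's endowment
MULTIPLIER = 4    # handed-over money is quadrupled: $5 -> $20

def truster_payoff(trusts: bool, trustee_splits: bool) -> float:
    """Truster's payoff for a single round."""
    if not trusts:
        return STAKE                              # keep the $5
    pot = STAKE * MULTIPLIER                      # $20
    return pot / 2 if trustee_splits else 0.0     # $10 back, or nothing

def ev_of_trusting(p_split: float) -> float:
    """Expected payoff of handing the money over, given the probability
    that the trustee splits the pot."""
    return p_split * (STAKE * MULTIPLIER / 2)

# Trusting beats keeping the stake whenever EV > $5, i.e. p_split > 0.5.
for p in (0.45, 0.60, 0.85):  # participants' typical estimates vs. observed rate
    print(f"p(split) = {p:.2f}: EV of trusting = ${ev_of_trusting(p):.2f}")
```

At the estimated forty-five percent rate, trusting loses on average ($4.50 versus a sure $5.00); at the actually observed eighty-five percent rate, it is worth $8.50. So the typical underestimate is enough to flip what looks like the rational decision.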
So people in this game massively underestimated how trustworthy random strangers would be. At least in the context of this game, anonymous strangers were something like twenty to fifty percent more trustworthy and cooperative than people expected them to be. Isn't that interesting? That's crazy.

Speaker 2: Yeah, eighty to ninety percent of the time. That's higher than I would have guessed. But then again, I'd like to think that if someone offered me this scenario, and I was not too cynical and trusted that it was not some sort of a scam, I would be as trustworthy.

Speaker 3: Can I offer a thought? I think maybe one thing that could be working in this particular scenario is that it makes sense to be wary of people who are offering to double your money in financial transactions. I mean, when people come to you and they say, like, hey, you give me some money and I'll double it, if somebody says that, they're not telling you the truth, almost always. So, like, there's a good reason to be wary there. This is a different thing, because the trustee in this game is not somebody who is coming out of nowhere to offer you money if you just give them some first. They're a random stranger who has been pulled into this experiment designed by somebody else. And so I think what this shows is that, most of the time, if given the opportunity to be trustworthy and cooperate, most people will. But also, it makes sense to be wary of people who are claiming they're trying to help you cooperate, you know, for mutual benefit; if they're coming out of nowhere with this, you know, that's often going to be a scam. Does that make sense?

Speaker 2: Yeah, yeah, I think so.

Speaker 3: But anyway. So yeah, in the trust game, most people are very trustworthy, and players are on average way too cynical about their fellow human beings.
They are missing out on lots of opportunities to double their money. And this is consistent with research by Miller in nineteen ninety-eight and ninety-nine finding that people just tend to grossly overestimate the selfishness and underestimate the trustworthiness of strangers. In this particular paper, the authors note that cynicism might grow from what they call asymmetric feedback. And the way that works is this: when you trust somebody and you get betrayed, you get very clear feedback that it was wrong to trust. The downside of granting your trust is very apparent to you. They walk away with the money, you get nothing, and it's clear to you what happened. But when you refrain from trusting people, the downsides are often invisible to you, because you don't actually see the lost opportunity as a scenario that plays out in front of you. You have to imagine it as a counterfactual. It's not concrete and in your face like being betrayed is, so you don't really get conditioned by feedback from instances where you harmed yourself by withholding trust. Does that make sense?

Speaker 2: Yeah, yeah. It's kind of like, here's a scenario: let's imagine that you're just really pedantic when looking at the checks when you go out to eat with friends. You know, you're like, all right, I want to see everybody's work; I've got to make one hundred percent sure this is fair. And maybe it's because at some point someone really did stick you over one of these situations, and so maybe that is more apparent; like, you're never going to forget that you were wronged in this way. But if you're just overly pedantic when it comes to the bills, eventually people might stop asking you to join them for dinner. And that might be very invisible to you, that that's happening.

Speaker 3: Yes, you don't realize. Yeah, it's just like, things are not as good now, and I don't know why I'm feeling lonely.
Yeah. Or, as a more direct comparison: if you're not looking at the checks all the time, you might not notice the times when somebody made a mistake in your favor. That just passes; you never even notice it, it just goes right by you. That is a good comparison. But anyway, what the study found is that if you give subjects symmetric feedback about the trustworthiness of others, it tends to reduce the subjects' cynicism. So, like, let's say you play the trust game. Symmetric feedback would be, Rob, whether or not you decide to hand somebody the five dollars and try to cooperate to quadruple it, you get to find out what they would have done either way. So, you know, you get to keep playing the game that way. And it turns out, if you play it that way, where people keep seeing, oh, I kept the money, but I saw that they would have doubled my money if I'd just trusted them, and you get to see that happen over and over, that actually does decrease people's cynicism. Which is also interesting in that it gives you at least a little bit of an idea of where some elements of cynicism could be coming from. It could be related in part to this asymmetric buildup of information: we get to see where trust fails very clearly, but with the opportunities we lose out on by not granting trust, often we don't even realize what's happened. We don't even know what we're missing. Yeah, so this research does not give us a complete picture, but I think some evidence is starting to accumulate that the cynic does not have a highly accurate internal model of the world. They might in some scenarios, but generalized cynicism is not, as some of these writers were saying, seeing things as they really are. In fact, cynicism often causes us to incorrectly predict the behavior of other people, assuming they will be more selfish and treacherous than they really are.
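Here is a toy simulation of that asymmetric feedback mechanism in Python. It is a sketch under our own assumed numbers (an 85 percent split rate, a mildly optimistic starting belief), not a reconstruction of the study's actual procedure:

```python
import random

# Toy model of asymmetric vs. symmetric feedback in repeated trust games.
# All numbers are our own assumptions for illustration; this is not the
# procedure from the Fetchenhauer and Dunning study.

TRUE_SPLIT_RATE = 0.85   # assumed real rate of trustworthy trustees
ROUNDS = 200
AGENTS = 2000

def run_agent(symmetric_feedback: bool, rng: random.Random) -> float:
    """Play repeated rounds; return the final estimate of the split rate."""
    honored, observed = 2, 3          # mildly optimistic prior (~0.67)
    for _ in range(ROUNDS):
        estimate = honored / observed
        trusts = estimate > 0.5       # trust only when it looks worthwhile
        partner_splits = rng.random() < TRUE_SPLIT_RATE
        # Asymmetric feedback: you learn the partner's choice only if you
        # trusted. Symmetric feedback: you learn it either way.
        if trusts or symmetric_feedback:
            observed += 1
            honored += partner_splits
    return honored / observed

rng = random.Random(42)
for symmetric, label in ((False, "asymmetric"), (True, "symmetric")):
    finals = [run_agent(symmetric, rng) for _ in range(AGENTS)]
    locked = sum(f <= 0.5 for f in finals) / AGENTS
    mean = sum(finals) / len(finals)
    print(f"{label:10s} feedback: mean estimate {mean:.2f}, "
          f"locked into distrust {locked:.0%}")
# Under asymmetric feedback, an unlucky early betrayal can drag the
# estimate to 0.5 or below; the agent then stops trusting and never sees
# the evidence that would correct it. With symmetric feedback the
# estimate keeps updating and converges near the true 0.85.
```

The design choice doing the work is the update rule: withholding trust freezes the belief, which is exactly the invisible-counterfactual problem described above.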
Now, coming back to the main paper I was talking about, by Stavrova and Ehlebracht: they note some other research on the link between cynicism and competence. Again, contrary to the cynical genius archetype, the authors are able to cite a long list of studies looking at links between cynicism and various types of cognitive performance and ability, and they find it's exactly the opposite of what you might guess from the Sherlock Holmes example. Higher performance on various types of cognitive, academic, and IQ tests is negatively correlated with cynicism; it is instead positively correlated with an increased tendency to trust. There are a few confounding results here; it's not like every single study has found this, but the vast majority have. Some of the confounding results are, for example, they cite a twenty thirteen study that found that higher IQ does not, on average, improve a person's ability to correctly predict who will be trustworthy and who will not. So, like, you do better on cognitive tests; that doesn't mean that if we pair you up with, you know, Johnny and Billy in the trust game experiment, you can predict whether Johnny or Billy will be more likely to help you out. It just doesn't help us in that regard.

Speaker 2: Yeah, it's like the compass is already pulling you in one direction or another, and however high one's IQ is, I mean, that's just the kind of mental energy that ends up being wrapped around the initial impulse.

Speaker 3: Another thing to keep in mind here is that, of course, while various cognitive and IQ and academic tests can tell you a lot of things about cognitive ability, they don't tell you everything. So, you know, they can tell you about certain kinds of skills with reasoning, certain kinds of intelligence, but there are always going to be elements of intelligence that are not perfectly captured by these sorts of tests. So the authors begin to develop a possible hypothetical model to explain what's going on here.
576 00:34:15,960 --> 00:34:20,920 Speaker 3: They say, what if intelligence, knowledge, and competence don't really 577 00:34:21,000 --> 00:34:24,759 Speaker 3: help you very much in identifying who to trust in 578 00:34:24,800 --> 00:34:27,800 Speaker 3: a given scenario? They don't tell you, really, if Johnny 579 00:34:27,920 --> 00:34:30,640 Speaker 3: or Billy, both of whom you've just met, is more trustworthy. 580 00:34:31,200 --> 00:34:36,040 Speaker 3: But instead they help you evaluate the scenario itself to 581 00:34:36,160 --> 00:34:39,680 Speaker 3: decide whether to deploy a more cynical or a more 582 00:34:39,840 --> 00:34:45,200 Speaker 3: trusting framework given the environment and the circumstances. The authors 583 00:34:45,280 --> 00:34:49,279 Speaker 3: write, quote, high levels of competence might allow individuals to 584 00:34:49,360 --> 00:34:54,080 Speaker 3: correctly identify the corruptness of their environment and adjust their 585 00:34:54,160 --> 00:34:57,839 Speaker 3: level of cynicism to match it. Following this reasoning, high 586 00:34:57,840 --> 00:35:02,440 Speaker 3: competence individuals might hold adaptive attitudes and recur to cynicism 587 00:35:02,560 --> 00:35:06,640 Speaker 3: only when it seems warranted, while their less competent counterparts 588 00:35:06,719 --> 00:35:11,080 Speaker 3: might show more cognitive rigidity and, relying on the better 589 00:35:11,160 --> 00:35:16,719 Speaker 3: safe than sorry heuristic, tend to endorse cynicism indiscriminately. So 590 00:35:16,840 --> 00:35:19,440 Speaker 3: if this model is correct, they're saying it can be 591 00:35:19,520 --> 00:35:23,640 Speaker 3: efficient to just remain in better-safe-than-sorry mode 592 00:35:24,040 --> 00:35:26,840 Speaker 3: when you lack the ability to tell whether you're dealing 593 00:35:26,920 --> 00:35:29,719 Speaker 3: with a corrupt, untrustworthy environment or not. 594 00:35:30,520 --> 00:35:32,080 Speaker 2: Yeah, yeah, this makes sense. We can all think of 595 00:35:32,120 --> 00:35:36,200 Speaker 2: examples where the scenario is very clear, like, okay, even 596 00:35:36,239 --> 00:35:38,279 Speaker 2: if someone is out to get me, this is not 597 00:35:38,360 --> 00:35:40,799 Speaker 2: the environment where they can just really take me for 598 00:35:40,840 --> 00:35:44,000 Speaker 2: all I'm worth. I'm literally handing somebody a five dollar bill. 599 00:35:44,239 --> 00:35:46,200 Speaker 2: What are they going to do? They're going to run 600 00:35:46,239 --> 00:35:49,320 Speaker 2: away into the woods and keep my dollar fifty in 601 00:35:49,440 --> 00:35:51,640 Speaker 2: change? The risks seem low. 602 00:35:52,040 --> 00:35:57,279 Speaker 3: So on to Stavrova and Ehlebracht's actual experiments, and I'll start 603 00:35:57,320 --> 00:36:00,480 Speaker 3: with the very short version of their findings. First of all, 604 00:36:00,480 --> 00:36:05,160 Speaker 3: they find: yes, on average, across multiple experiments, regular people 605 00:36:05,200 --> 00:36:09,000 Speaker 3: tend to believe that cynicism is a sign of cognitive 606 00:36:09,040 --> 00:36:12,600 Speaker 3: superiority in others. If you think people are bad, people 607 00:36:12,680 --> 00:36:16,280 Speaker 3: are selfish, and morals are fake, then on balance 608 00:36:16,360 --> 00:36:19,719 Speaker 3: people will tend to assume you are smarter and more competent, 609 00:36:20,480 --> 00:36:26,120 Speaker 3: especially at certain types of cognitive tasks.
Things involving logic 610 00:36:26,239 --> 00:36:28,960 Speaker 3: and numbers and stuff. They're more likely to assign you 611 00:36:29,440 --> 00:36:34,799 Speaker 3: important cognitive tasks like doing mathematical calculations and logical analysis 612 00:36:34,840 --> 00:36:37,960 Speaker 3: of documents if they think you're cynical. And on the 613 00:36:38,000 --> 00:36:41,280 Speaker 3: other hand, the authors found in their experiments: no, on average, 614 00:36:41,280 --> 00:36:46,360 Speaker 3: cynicism is not associated with cognitive superiority or greater competence. 615 00:36:47,160 --> 00:36:49,600 Speaker 3: They had three studies based on data from about two 616 00:36:49,680 --> 00:36:53,399 Speaker 3: hundred thousand subjects across thirty different countries and showed that, 617 00:36:53,440 --> 00:36:58,120 Speaker 3: on average, cynicism was negatively correlated with tests of cognitive 618 00:36:58,120 --> 00:37:01,400 Speaker 3: ability and tests of academic knowledge and competency. So this 619 00:37:01,480 --> 00:37:05,920 Speaker 3: included all kinds of things like reading comprehension, mathematical skills, 620 00:37:06,040 --> 00:37:11,320 Speaker 3: scientific literacy, technological literacy, and so forth. And this negative 621 00:37:11,320 --> 00:37:15,520 Speaker 3: association between cynicism and cognitive tests was true even after 622 00:37:15,560 --> 00:37:21,080 Speaker 3: controlling for confounding variables like age, gender, household income, wealth, 623 00:37:21,160 --> 00:37:26,400 Speaker 3: test-language proficiency, and Big Five personality traits. Now, one major 624 00:37:26,440 --> 00:37:30,320 Speaker 3: distinction here is that they found that people who tested 625 00:37:30,400 --> 00:37:36,960 Speaker 3: higher in competence tended to have attitudes of contingent trust. They 626 00:37:37,040 --> 00:37:41,400 Speaker 3: might be trusting by default, but were not rigid in 627 00:37:41,480 --> 00:37:44,319 Speaker 3: that regard and would become more cynical if it was 628 00:37:44,400 --> 00:37:49,239 Speaker 3: warranted situationally or based on the environment and cultural context, 629 00:37:49,760 --> 00:37:54,920 Speaker 3: whereas people scoring lower in competence tended to accept an unconditionally 630 00:37:54,960 --> 00:37:59,040 Speaker 3: cynical worldview. In the words of the authors, quote, suggesting 631 00:37:59,080 --> 00:38:02,240 Speaker 3: that at low levels of competence, holding a cynical 632 00:38:02,239 --> 00:38:06,400 Speaker 3: worldview might represent an adaptive default strategy to avoid the 633 00:38:06,400 --> 00:38:10,120 Speaker 3: potential costs of falling prey to others' cunning. Now, I 634 00:38:10,239 --> 00:38:12,520 Speaker 3: wanted to expand on these findings with a few notes. 635 00:38:13,480 --> 00:38:16,440 Speaker 3: One of the things about the early tests of people's 636 00:38:16,520 --> 00:38:20,799 Speaker 3: perceptions of cynicism, a control they had here, is that 637 00:38:20,840 --> 00:38:24,439 Speaker 3: the authors didn't just ask about cognitive competence. They also 638 00:38:24,560 --> 00:38:28,200 Speaker 3: asked about social and moral competence, and quite along the 639 00:38:28,200 --> 00:38:31,120 Speaker 3: lines you might expect.
People tended to think that low 640 00:38:31,200 --> 00:38:35,240 Speaker 3: trust individuals would be better at cognitive things like math, logic, 641 00:38:35,320 --> 00:38:38,920 Speaker 3: and critical thinking, but they thought that high trust individuals 642 00:38:38,920 --> 00:38:41,600 Speaker 3: would be better at social tasks like cheering up a 643 00:38:41,640 --> 00:38:44,840 Speaker 3: depressed friend or taking care of a stray animal. So 644 00:38:44,920 --> 00:38:47,480 Speaker 3: it wasn't just like, across the board, we think cynics 645 00:38:47,480 --> 00:38:50,279 Speaker 3: are great, we think cynics are better at everything. It's 646 00:38:50,360 --> 00:38:52,640 Speaker 3: that people tend to think cynics are better at certain 647 00:38:52,760 --> 00:38:56,880 Speaker 3: types of intelligence-based skills, things like math and logic 648 00:38:56,960 --> 00:38:57,600 Speaker 3: and so forth. 649 00:39:00,000 --> 00:39:02,279 Speaker 2: This is sounding kind of like when you see 650 00:39:02,280 --> 00:39:05,600 Speaker 2: somebody smoking a cigarette. You know, they can look pretty cool, 651 00:39:05,960 --> 00:39:09,040 Speaker 2: especially in movies, but we all know deep 652 00:39:09,080 --> 00:39:11,840 Speaker 2: down that, like, well, smoking a cigarette doesn't actually 653 00:39:11,880 --> 00:39:14,800 Speaker 2: make you cool, but we can't help it. And likewise, 654 00:39:14,840 --> 00:39:17,480 Speaker 2: you know, you might say, okay, 655 00:39:17,800 --> 00:39:20,960 Speaker 2: doctor smoking a cigarette? I have questions. Maybe this is 656 00:39:21,000 --> 00:39:24,680 Speaker 2: not the doctor for me. But private detective smoking a cigarette? Well, 657 00:39:25,160 --> 00:39:27,960 Speaker 2: obviously that's the guy I want looking after my interests. 658 00:39:28,480 --> 00:39:30,960 Speaker 3: Oh, do you mean like the smoking of the cigarette 659 00:39:31,000 --> 00:39:34,399 Speaker 3: implies like a rejection of the consensus about the health 660 00:39:34,440 --> 00:39:36,520 Speaker 3: effects of it, or just that... I mean, I 661 00:39:36,520 --> 00:39:39,560 Speaker 3: guess that often is suggested. It's like, I don't think 662 00:39:39,560 --> 00:39:41,640 Speaker 3: it's hurting me, I don't care what people say. I 663 00:39:41,640 --> 00:39:43,759 Speaker 3: guess there are two ways of going with it. It 664 00:39:43,800 --> 00:39:45,880 Speaker 3: would be cynical, I think it's part of a cynical 665 00:39:45,880 --> 00:39:48,239 Speaker 3: worldview, to say, like, ah, these doctors who say it 666 00:39:48,280 --> 00:39:50,399 Speaker 3: causes cancer or heart disease, they don't know what they're 667 00:39:50,440 --> 00:39:54,080 Speaker 3: talking about, I can just smoke, it's fine. Or there's 668 00:39:54,120 --> 00:39:56,520 Speaker 3: the version that's like, I don't care what happens to me, 669 00:39:56,760 --> 00:39:59,080 Speaker 3: which I think is a little bit different than cynicism. 670 00:39:59,120 --> 00:40:01,120 Speaker 3: Maybe, though, it could go along with cynicism. 671 00:40:01,360 --> 00:40:02,319 Speaker 4: Yeah, yeah, but. 672 00:40:02,320 --> 00:40:05,480 Speaker 3: I follow you in general, because, yeah, apart 673 00:40:05,560 --> 00:40:08,839 Speaker 3: from thinking that cynical people are smart, there is also 674 00:40:08,880 --> 00:40:11,240 Speaker 3: a tendency to think that cynical people are cool. 675 00:40:11,600 --> 00:40:12,040 Speaker 2: Yeah.
676 00:40:12,160 --> 00:40:14,520 Speaker 3: That's... in fact, I've got a section where maybe we'll 677 00:40:14,560 --> 00:40:16,120 Speaker 3: get more into that in just a minute 678 00:40:16,120 --> 00:40:16,279 Speaker 3: here. 679 00:40:16,640 --> 00:40:19,160 Speaker 2: Scientific analysis of the coolness. 680 00:40:18,760 --> 00:40:32,000 Speaker 3: Another thing here is that they tried different 681 00:40:32,120 --> 00:40:37,000 Speaker 3: wordings and types of questions across multiple replication attempts to 682 00:40:37,040 --> 00:40:39,800 Speaker 3: make sure that the cynical genius effect was robust. 683 00:40:39,840 --> 00:40:43,239 Speaker 3: And it was robust, but the effects were modulated a 684 00:40:43,239 --> 00:40:46,000 Speaker 3: little bit by changes in phrasing, such as whether you 685 00:40:46,080 --> 00:40:50,319 Speaker 3: describe the opposite of cynicism as an idealistic versus a 686 00:40:50,440 --> 00:40:54,960 Speaker 3: positive view of human nature. Apparently people think being idealistic 687 00:40:55,040 --> 00:40:57,520 Speaker 3: about human nature is a little bit dumber than being 688 00:40:57,640 --> 00:41:01,680 Speaker 3: positive about human nature. It's always funny how just 689 00:41:01,719 --> 00:41:04,480 Speaker 3: swapping a word out can have some effects there. 690 00:41:05,239 --> 00:41:08,520 Speaker 3: They also replicated these findings in different samples, so they 691 00:41:08,520 --> 00:41:12,040 Speaker 3: did some online surveys, international online surveys, and they did 692 00:41:12,080 --> 00:41:15,440 Speaker 3: some in-person tests of university students in Germany. They 693 00:41:15,440 --> 00:41:18,720 Speaker 3: did some with British adults, and the cynical genius effect 694 00:41:18,800 --> 00:41:22,680 Speaker 3: appeared to varying degrees in all the groups tested here. However, 695 00:41:23,280 --> 00:41:27,880 Speaker 3: in some of these experiments, respondents got to rate essentially 696 00:41:28,200 --> 00:41:32,480 Speaker 3: how cynical they would like a person assigned to a 697 00:41:32,520 --> 00:41:35,839 Speaker 3: cognitive task to be, and the breakdown, to be clear, 698 00:41:35,960 --> 00:41:40,279 Speaker 3: was not toward a preference for extreme cynicism, but for 699 00:41:40,680 --> 00:41:44,640 Speaker 3: higher than average cynicism. So one example here is that 700 00:41:44,880 --> 00:41:48,759 Speaker 3: in a group of British adults selecting between hypothetical candidates 701 00:41:48,800 --> 00:41:54,040 Speaker 3: to solve intellectual problems, participants' quote, desired mix of cynical 702 00:41:54,080 --> 00:41:58,080 Speaker 3: and non-cynical tendencies, end quote, was fifty-six percent cynical to 703 00:41:58,200 --> 00:42:02,000 Speaker 3: forty-four percent non-cynical. So on average, the group 704 00:42:02,200 --> 00:42:05,279 Speaker 3: thought: we need somebody smart, we want somebody who 705 00:42:05,400 --> 00:42:07,600 Speaker 3: is a little bit more cynical than the median. 706 00:42:08,400 --> 00:42:10,480 Speaker 2: That makes sense, you know, if you were able to 707 00:42:11,160 --> 00:42:14,640 Speaker 2: move the slider on, like, your android doctor, 708 00:42:14,640 --> 00:42:17,319 Speaker 2: your android lawyer, or whatever it happens to be. Yeah, 709 00:42:17,360 --> 00:42:20,080 Speaker 2: you want the right mix of cynicism. A little more 710 00:42:20,120 --> 00:42:23,359 Speaker 2: than the average person, but not too much.
This 711 00:42:23,400 --> 00:42:25,319 Speaker 2: will be interesting to get into later when we start 712 00:42:25,320 --> 00:42:29,680 Speaker 2: talking about, like, absolute cynicism and what that is 713 00:42:29,719 --> 00:42:31,600 Speaker 2: and where we stand in relation to it. 714 00:42:31,840 --> 00:42:36,480 Speaker 3: Yeah. So, as for the actual inverse link between cynicism 715 00:42:36,520 --> 00:42:39,920 Speaker 3: and competence, when broken down by test domain, I was 716 00:42:39,960 --> 00:42:44,000 Speaker 3: interested to see that the effect was strongest in reading 717 00:42:44,040 --> 00:42:50,600 Speaker 3: skills and weakest in information processing speed. So in these tests, 718 00:42:50,640 --> 00:42:54,400 Speaker 3: apparently, highly cynical people held up relatively okay on speed 719 00:42:54,440 --> 00:42:58,000 Speaker 3: of reasoning but did a lot worse in, like, reading comprehension. 720 00:42:59,280 --> 00:43:03,279 Speaker 3: And finally, getting to the element of the paper comparing cynicism, competence, 721 00:43:03,320 --> 00:43:08,480 Speaker 3: and environment: the authors tested levels of cynicism, cross-referenced 722 00:43:08,480 --> 00:43:12,320 Speaker 3: with these cognitive tests, in subjects across thirty different countries, 723 00:43:13,040 --> 00:43:16,680 Speaker 3: and they found that in countries that scored low in 724 00:43:16,800 --> 00:43:20,279 Speaker 3: corruption and high in rule of law according to an 725 00:43:20,320 --> 00:43:24,319 Speaker 3: international database called the Worldwide Governance Indicators, the effect we've 726 00:43:24,360 --> 00:43:27,759 Speaker 3: been talking about did hold true, but in countries with 727 00:43:27,960 --> 00:43:32,080 Speaker 3: high corruption and eroded rule of law, the effect was 728 00:43:32,280 --> 00:43:33,400 Speaker 3: greatly diminished. 729 00:43:34,080 --> 00:43:34,440 Speaker 3: Quote. 730 00:43:34,680 --> 00:43:38,480 Speaker 3: The harsher the social climate, the more these high-competence 731 00:43:38,520 --> 00:43:42,239 Speaker 3: people embraced a cynical worldview. So kind of along the 732 00:43:42,239 --> 00:43:44,640 Speaker 3: lines of results we talked about in the last episode, 733 00:43:44,800 --> 00:43:49,200 Speaker 3: it hurts you materially to hold cynical views unless those 734 00:43:49,320 --> 00:43:53,280 Speaker 3: views are correct in the environment where you operate. Along 735 00:43:53,360 --> 00:43:57,040 Speaker 3: these lines, the authors discuss ways that cynicism might be 736 00:43:57,160 --> 00:44:01,520 Speaker 3: learned directly from personal experience. Despite the fact that they 737 00:44:01,680 --> 00:44:05,600 Speaker 3: tried to control for the influence of variables like age, gender, 738 00:44:05,640 --> 00:44:09,360 Speaker 3: and wealth, it's still possible that, quote, higher levels of 739 00:44:09,400 --> 00:44:14,920 Speaker 3: cognitive ability, academic competence, and education might protect from adverse 740 00:44:15,000 --> 00:44:19,240 Speaker 3: life experiences, not only as they allow discovering potential fraud, 741 00:44:19,560 --> 00:44:22,200 Speaker 3: but also as they increase the chances of living in 742 00:44:22,239 --> 00:44:25,560 Speaker 3: a safe and friendly environment, providing more evidence for a 743 00:44:25,600 --> 00:44:28,560 Speaker 3: positive than for a negative view of human nature, and 744 00:44:28,960 --> 00:44:34,000 Speaker 3: consequently preventing cynicism development.
So that's talking about the idea that, 745 00:44:34,120 --> 00:44:38,480 Speaker 3: like, education and cognitive skills might not just 746 00:44:38,560 --> 00:44:41,799 Speaker 3: be about how accurately you're seeing the world around you. 747 00:44:41,880 --> 00:44:45,440 Speaker 3: They might actually, over time, influence what the world around 748 00:44:45,480 --> 00:44:49,960 Speaker 3: you is like. On the other hand, since cynicism entails 749 00:44:50,120 --> 00:44:55,520 Speaker 3: generalized distrust, quote, cynical versus less cynical individuals might be 750 00:44:55,600 --> 00:44:59,120 Speaker 3: more distrustful of the opinions and knowledge of others, a 751 00:44:59,160 --> 00:45:02,840 Speaker 3: behavior that can eventually prevent them from expanding their knowledge 752 00:45:03,160 --> 00:45:04,240 Speaker 3: and understanding. 753 00:45:04,640 --> 00:45:07,200 Speaker 2: Well, that seems like it tracks, the idea that 754 00:45:07,239 --> 00:45:12,839 Speaker 2: if you're cynical about potential information sources, you're 755 00:45:12,880 --> 00:45:15,080 Speaker 2: more likely to sort of back your way into a 756 00:45:15,120 --> 00:45:19,680 Speaker 2: corner where you have very few information sources coming in, 757 00:45:19,719 --> 00:45:21,400 Speaker 2: and the only ones you're going to 758 00:45:21,400 --> 00:45:24,600 Speaker 2: accept are the ones that back up your existing cynicism. 759 00:45:24,920 --> 00:45:27,959 Speaker 3: Yes, but they say, of course, there is more work 760 00:45:28,000 --> 00:45:32,200 Speaker 3: to do exploring the different possible causal mechanisms here. So 761 00:45:32,320 --> 00:45:35,960 Speaker 3: this paper does find good robust evidence for the cynical 762 00:45:36,000 --> 00:45:39,839 Speaker 3: genius illusion, that the illusion is widely present, and it 763 00:45:39,880 --> 00:45:43,640 Speaker 3: is in fact an illusion, but the questions about why 764 00:45:43,719 --> 00:45:48,120 Speaker 3: are still largely open. One thing I wonder about: a 765 00:45:48,120 --> 00:45:51,760 Speaker 3: lot of the cognitive tasks that subjects said they would 766 00:45:51,920 --> 00:45:55,520 Speaker 3: entrust to a cynical person more than a non-cynical person. 767 00:45:56,440 --> 00:45:58,200 Speaker 3: I was looking through the inventory, and a lot of 768 00:45:58,200 --> 00:46:03,720 Speaker 3: these tasks involved scrutiny of details, like crunching numbers, following 769 00:46:03,800 --> 00:46:08,440 Speaker 3: complex logic, analyzing scientific results, things like that. And I 770 00:46:08,680 --> 00:46:13,240 Speaker 3: wonder if the same pattern would hold for cognitive tasks 771 00:46:13,280 --> 00:46:18,440 Speaker 3: that people associate less with scrutiny of details and instead 772 00:46:18,480 --> 00:46:22,359 Speaker 3: with things like creativity and imagination. And to be clear, 773 00:46:22,400 --> 00:46:25,520 Speaker 3: the cynical genius effect would be an illusion even if 774 00:46:25,560 --> 00:46:29,040 Speaker 3: it were only applied to scrutinizing cognition. But I wonder 775 00:46:29,080 --> 00:46:32,240 Speaker 3: if the illusion is actually more specific to certain kinds 776 00:46:32,239 --> 00:46:32,960 Speaker 3: of cognition. 777 00:46:33,600 --> 00:46:37,040 Speaker 2: Yeah, yeah, that's very much the Sherlock Holmes scenario.
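[A brief technical aside on the cross-country analysis mentioned a moment ago: a moderation finding like "the cynicism-competence link weakens as corruption rises" is typically specified as an interaction term in a regression. The sketch below uses simulated data, not the paper's, and every coefficient in it is made up for illustration.]

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
governance = rng.normal(size=n)   # higher = less corruption, stronger rule of law
cynicism = rng.normal(size=n)
# Simulate the pattern described on the show: cynicism drags down measured
# competence mainly where governance is good, and hardly at all where it is bad.
competence = ((-0.2 - 0.2 * governance) * cynicism
              + 0.3 * governance
              + rng.normal(scale=0.5, size=n))

# Regress competence on cynicism, governance, and their product. The slope of
# cynicism at a given governance level is b_cynicism + b_interaction * governance.
X = sm.add_constant(np.column_stack([cynicism, governance, cynicism * governance]))
fit = sm.OLS(competence, X).fit()
for name, b in zip(["const", "cynicism", "governance", "interaction"], fit.params):
    print(f"{name:>12}: {b:+.2f}")
```

[With governance coded so that higher means cleaner, a negative interaction coefficient is what "the effect was greatly diminished in high-corruption countries" looks like in model form: the cynicism slope shrinks toward zero as governance scores fall.]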
778 00:46:37,680 --> 00:46:39,960 Speaker 3: But also coming back to this question we've asked several 779 00:46:40,000 --> 00:46:44,560 Speaker 3: times now: are there any benefits to generalized cynicism? It 780 00:46:44,840 --> 00:46:47,560 Speaker 3: comes with tons of harms for the cynic; it hurts 781 00:46:47,560 --> 00:46:50,479 Speaker 3: you to be cynical. But are there any benefits? Well, 782 00:46:50,560 --> 00:46:53,120 Speaker 3: first of all, this paper does find that if you're in 783 00:46:53,239 --> 00:46:58,239 Speaker 3: a really corrupt, untrustworthy environment, obviously it does make more 784 00:46:58,280 --> 00:47:00,520 Speaker 3: sense to be more cynical. That's just like a correct 785 00:47:00,640 --> 00:47:05,400 Speaker 3: understanding of how your environment operates. Number two, even if 786 00:47:05,440 --> 00:47:10,400 Speaker 3: you're not in a more corrupt, untrustworthy environment, if you 787 00:47:10,520 --> 00:47:15,040 Speaker 3: don't understand your environment and you're basically out of your depth, 788 00:47:15,400 --> 00:47:19,640 Speaker 3: cynicism may protect you from catastrophic outcomes. It's like, I 789 00:47:19,640 --> 00:47:21,640 Speaker 3: don't really know what's going on here, don't know if 790 00:47:21,680 --> 00:47:24,359 Speaker 3: I can trust or not, so by default I'm not 791 00:47:24,560 --> 00:47:28,200 Speaker 3: going to trust. That's better safe than sorry. And then third, 792 00:47:28,239 --> 00:47:30,600 Speaker 3: I think this is going to be mainly related to 793 00:47:30,680 --> 00:47:35,560 Speaker 3: the cynical genius illusion. Having a reputation for cynicism may 794 00:47:35,680 --> 00:47:39,200 Speaker 3: have the effect of convincing people around you that you 795 00:47:39,239 --> 00:47:43,160 Speaker 3: are very smart and intellectually savvy, even though on average 796 00:47:43,239 --> 00:47:46,120 Speaker 3: the opposite is more likely to be true. So there's 797 00:47:46,160 --> 00:47:50,160 Speaker 3: a kind of social premium, an incentive to appear to be cynical. 798 00:47:50,239 --> 00:47:52,359 Speaker 3: In a lot of cases it's going to make 799 00:47:52,440 --> 00:47:55,439 Speaker 3: people think that you know something they don't, and you're 800 00:47:55,480 --> 00:47:58,600 Speaker 3: a wise and world-weary and intelligent person. 801 00:47:59,440 --> 00:48:02,840 Speaker 2: Yes, it's often kind of a safe gamble at, like, 802 00:48:02,880 --> 00:48:07,280 Speaker 2: a cocktail party or a mixer, right? If politics should 803 00:48:07,280 --> 00:48:09,880 Speaker 2: come up, which of course is bad manners anyway, but 804 00:48:09,920 --> 00:48:11,560 Speaker 2: if it were to come up, you might say something 805 00:48:11,600 --> 00:48:13,640 Speaker 2: that is just kind of a, you know, a blanket 806 00:48:13,760 --> 00:48:17,120 Speaker 2: statement of cynicism, like, oh, well, politicians are all the same. 807 00:48:17,520 --> 00:48:19,200 Speaker 2: And then what? People are gonna have to double down. 808 00:48:19,239 --> 00:48:22,160 Speaker 2: They're gonna have to come back and try to convince 809 00:48:22,200 --> 00:48:25,480 Speaker 2: you, no, no, not all politicians, some are great. And 810 00:48:25,520 --> 00:48:28,600 Speaker 2: they're going to look like the person who's naive, where 811 00:48:28,680 --> 00:48:32,520 Speaker 2: you've already, you know, mounted your cynicism high horse. 812 00:48:32,719 --> 00:48:35,879 Speaker 3: We've talked about this before.
Yeah, the, like, all 813 00:48:35,920 --> 00:48:38,000 Speaker 3: politicians are the same is the kind of statement that 814 00:48:38,040 --> 00:48:42,280 Speaker 3: I think is just facially untrue. It could not be true, 815 00:48:42,440 --> 00:48:47,040 Speaker 3: it's obviously wrong, but you feel foolish trying to argue with it. Yeah, 816 00:48:47,080 --> 00:48:49,440 Speaker 3: and I think that goes beyond politics. I mean, just 817 00:48:49,600 --> 00:48:53,920 Speaker 3: generally trying to argue with the cynic is so difficult. 818 00:48:54,400 --> 00:48:59,919 Speaker 3: Statements of cynicism often come with this a priori 819 00:49:00,000 --> 00:49:04,120 Speaker 3: texture of factuality; it just feels self-evidently true, 820 00:49:04,280 --> 00:49:07,600 Speaker 3: even when it's obviously wrong, when it would be absurd 821 00:49:07,640 --> 00:49:07,960 Speaker 3: for it 822 00:49:07,960 --> 00:49:18,239 Speaker 4: to be true. 823 00:49:18,400 --> 00:49:21,040 Speaker 3: One more thing before I wrap up from the Stavrova 824 00:49:21,200 --> 00:49:25,080 Speaker 3: and Ehlebracht study here. In their discussion, they 825 00:49:25,080 --> 00:49:28,400 Speaker 3: talk about why we tend to assume highly cynical 826 00:49:28,440 --> 00:49:30,759 Speaker 3: people are smarter than the rest of us, even though 827 00:49:30,760 --> 00:49:34,840 Speaker 3: this is usually not the case. We touched on this earlier, 828 00:49:34,880 --> 00:49:38,680 Speaker 3: but the authors do offer a few ideas based on 829 00:49:39,080 --> 00:49:44,080 Speaker 3: common cognitive biases. In particular, they call out negativity bias 830 00:49:44,239 --> 00:49:49,040 Speaker 3: and loss aversion. Negativity bias is the observation that we 831 00:49:49,200 --> 00:49:53,560 Speaker 3: are more psychologically affected by negative things than we are 832 00:49:53,600 --> 00:49:57,480 Speaker 3: by positive things of equal intensity. And loss aversion 833 00:49:57,520 --> 00:50:00,000 Speaker 3: is very similar. It's the finding that we're more strongly 834 00:50:00,280 --> 00:50:03,880 Speaker 3: motivated to avoid a loss than we are to achieve 835 00:50:04,000 --> 00:50:07,720 Speaker 3: a gain of the same value. So here's an example. 836 00:50:07,880 --> 00:50:10,359 Speaker 3: I find a five-dollar bill on the sidewalk: oh, 837 00:50:10,400 --> 00:50:14,520 Speaker 3: that's nice, quickly forget about it. Versus I drop a 838 00:50:14,560 --> 00:50:17,520 Speaker 3: five-dollar bill down a storm drain: oh, you know, 839 00:50:17,640 --> 00:50:21,640 Speaker 3: why me? Ah, I hate this. You know, it's like 840 00:50:21,880 --> 00:50:26,520 Speaker 3: the dollar value is exactly the same, but the loss 841 00:50:26,760 --> 00:50:30,480 Speaker 3: is more memorable, it's more salient, and will cause a 842 00:50:30,520 --> 00:50:33,799 Speaker 3: greater emotional reaction. And I think for those reasons, like, 843 00:50:34,160 --> 00:50:37,400 Speaker 3: we are more likely to learn something from it, to 844 00:50:37,440 --> 00:50:40,359 Speaker 3: try to draw a general inference that we will take 845 00:50:40,400 --> 00:50:43,000 Speaker 3: and apply to the rest of life from these moments 846 00:50:43,000 --> 00:50:45,960 Speaker 3: of loss than from gains of the exact same value.
847 00:50:46,360 --> 00:50:48,640 Speaker 2: Yeah, yeah, I mean, I'd go as far as to 848 00:50:48,680 --> 00:50:51,560 Speaker 2: say that at the very least, you're more likely to 849 00:50:51,600 --> 00:50:55,280 Speaker 2: remember dropping that five than finding a ten. Yeah, and yeah, 850 00:50:55,560 --> 00:50:58,719 Speaker 2: there's probably an interesting thought experiment to be had 851 00:50:58,719 --> 00:51:02,720 Speaker 2: in just trying to determine at which point the found value 852 00:51:02,760 --> 00:51:06,279 Speaker 2: would be equal to a much lesser loss value, I think. 853 00:51:06,760 --> 00:51:09,120 Speaker 3: I think work on that exact question has been done. 854 00:51:09,160 --> 00:51:10,520 Speaker 3: I don't have it pulled up in front of me, 855 00:51:10,600 --> 00:51:13,640 Speaker 3: but I think we've looked at that before. Yeah, I'm 856 00:51:13,680 --> 00:51:16,000 Speaker 3: sure for some reason seeing the exact numbers is going 857 00:51:16,040 --> 00:51:20,520 Speaker 3: to be really funny. But anyway, so you apply 858 00:51:21,000 --> 00:51:25,000 Speaker 3: these biases, negativity bias and loss aversion, to the domain 859 00:51:25,040 --> 00:51:28,439 Speaker 3: of trust and cynicism, and they could mean that the 860 00:51:28,480 --> 00:51:32,640 Speaker 3: pain of being betrayed is much greater than the pleasure 861 00:51:32,800 --> 00:51:36,680 Speaker 3: of having our trust rewarded, even given the exact same 862 00:51:36,719 --> 00:51:40,200 Speaker 3: original act of trust. And this is back to the 863 00:51:40,360 --> 00:51:42,480 Speaker 3: mental cherry-picking that you mentioned last time, 864 00:51:42,560 --> 00:51:42,759 Speaker 3: Rob. 865 00:51:42,800 --> 00:51:44,760 Speaker 3: You know, you can always, like, think of these really 866 00:51:44,920 --> 00:51:48,760 Speaker 3: sticky examples of times when you shouldn't have trusted someone 867 00:51:48,880 --> 00:51:52,880 Speaker 3: or something. We may trust somebody twenty times, it works 868 00:51:52,880 --> 00:51:56,319 Speaker 3: out great nineteen times, but the one time it did 869 00:51:56,320 --> 00:51:59,720 Speaker 3: not work out is shocking and painful and we feel 870 00:51:59,760 --> 00:52:03,240 Speaker 3: so hurt. And so from this we form an idea 871 00:52:03,320 --> 00:52:06,600 Speaker 3: that people who do not trust easily have learned a 872 00:52:06,640 --> 00:52:10,840 Speaker 3: lot of valuable lessons, therefore they are generally knowledgeable, wise, 873 00:52:10,960 --> 00:52:15,200 Speaker 3: and smart. Another explanation comes back to that study from 874 00:52:15,239 --> 00:52:18,040 Speaker 3: the background section that we talked about briefly, about 875 00:52:18,040 --> 00:52:23,200 Speaker 3: the invisibility of consequences in situations where we refrain 876 00:52:23,239 --> 00:52:27,399 Speaker 3: from giving trust, to our detriment. So again, you get 877 00:52:27,440 --> 00:52:31,280 Speaker 3: to see what happens when you trust and that trust 878 00:52:31,360 --> 00:52:34,880 Speaker 3: is betrayed. But when you withhold trust and you just 879 00:52:35,040 --> 00:52:38,200 Speaker 3: miss out on an opportunity to gain, you don't really 880 00:52:38,239 --> 00:52:40,920 Speaker 3: get to see that loss made concrete. It's just like 881 00:52:41,040 --> 00:52:43,680 Speaker 3: it's another path you could have taken. You can even 882 00:52:43,719 --> 00:52:44,880 Speaker 3: go on without thinking about it.
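[For listeners who want the numbers Joe couldn't pull up: the standard reference point is Tversky and Kahneman's 1992 cumulative prospect theory estimates, which put the value function at roughly]

$$
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx 0.88,\quad \lambda \approx 2.25.
$$

[Taking the linear case ($\alpha = 1$) as a back-of-the-envelope answer to Rob's thought experiment, the sting of dropping a five-dollar bill is worth about $\lambda \times \$5 \approx \$11$ of found money, so losing a five really should be roughly as memorable as finding a ten or eleven.]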
883 00:52:45,400 --> 00:52:47,680 Speaker 2: Yeah, that's right. Unless you're visited by, you know, Christmas 884 00:52:47,719 --> 00:52:49,680 Speaker 2: spirits or something, you're just not going to have any 885 00:52:49,719 --> 00:52:50,560 Speaker 2: alternate views. 886 00:52:50,840 --> 00:52:52,960 Speaker 3: That's a really... you know what, I think A Christmas 887 00:52:53,000 --> 00:52:56,080 Speaker 3: Carol is a great example here. That is something: the 888 00:52:56,120 --> 00:52:58,799 Speaker 3: Ghost of Christmas Past has to come and make the 889 00:52:59,120 --> 00:53:03,800 Speaker 3: lost opportunities concrete. And then one last point the authors 890 00:53:03,840 --> 00:53:06,680 Speaker 3: make that I thought was very interesting. They raise: 891 00:53:07,360 --> 00:53:11,200 Speaker 3: what if the cynical genius illusion arises in part from 892 00:53:11,760 --> 00:53:17,719 Speaker 3: biases of storytelling? We fill our lives with fictional stories. 893 00:53:18,200 --> 00:53:23,359 Speaker 3: Fictional stories need to be entertaining. Stories are usually more 894 00:53:23,520 --> 00:53:28,600 Speaker 3: entertaining if danger and conflict are heightened, if villains are 895 00:53:28,760 --> 00:53:33,000 Speaker 3: meaner and more dangerous, if the stakes are high, if 896 00:53:33,160 --> 00:53:35,960 Speaker 3: no one can be trusted. You can hear 897 00:53:36,000 --> 00:53:39,200 Speaker 3: all these phrases in the Don LaFontaine movie-trailer voice, 898 00:53:39,239 --> 00:53:42,279 Speaker 3: can't you? You know, it's like that's what stories are 899 00:53:42,320 --> 00:53:46,960 Speaker 3: made out of. Fictional storytelling selects for narratives about the 900 00:53:47,120 --> 00:53:51,200 Speaker 3: dangers of trust and the risk of betrayal, because stories 901 00:53:51,239 --> 00:53:54,279 Speaker 3: like that are captivating to our attention and we want 902 00:53:54,320 --> 00:53:58,200 Speaker 3: to know what happens next. So hostile and treacherous worlds 903 00:53:58,280 --> 00:54:02,040 Speaker 3: may be more entertaining in narrative. But it's possible that 904 00:54:02,080 --> 00:54:06,160 Speaker 3: we draw incorrect inferences from those fictional worlds. We learn 905 00:54:06,280 --> 00:54:11,040 Speaker 3: too much about how life works from unrealities that are 906 00:54:11,080 --> 00:54:15,960 Speaker 3: specifically crafted to hack our attention. And who are the smart, 907 00:54:16,040 --> 00:54:19,160 Speaker 3: savvy characters in these worlds? I think very often they 908 00:54:19,160 --> 00:54:21,480 Speaker 3: are cynics who are very reluctant to trust. 909 00:54:22,080 --> 00:54:23,520 Speaker 2: That's right. That's a great point. 910 00:54:23,880 --> 00:54:25,799 Speaker 3: So anyway, that's all I've got on the study for now. 911 00:54:25,840 --> 00:54:28,680 Speaker 3: But I think the cynical genius illusion is so interesting. 912 00:54:28,680 --> 00:54:30,200 Speaker 3: I'm going to be thinking about this a lot in 913 00:54:30,440 --> 00:54:31,560 Speaker 3: the days and weeks to come.
914 00:54:32,040 --> 00:54:34,360 Speaker 2: Yeah, this will be an interesting one to bring into 915 00:54:34,440 --> 00:54:37,800 Speaker 2: our Weird House Cinema discussions as we inevitably come around 916 00:54:37,800 --> 00:54:41,719 Speaker 2: to a film that has a cynical genius in it, 917 00:54:42,800 --> 00:54:44,279 Speaker 2: and I'm sure if I were to go back and 918 00:54:44,320 --> 00:54:46,760 Speaker 2: look at some of the titles we've covered, we've probably 919 00:54:46,840 --> 00:54:50,879 Speaker 2: encountered these sorts of characters before, probably played by someone 920 00:54:50,920 --> 00:54:54,600 Speaker 2: like Christopher Lee. Yes, all right, well, on that note, 921 00:54:54,640 --> 00:54:56,680 Speaker 2: we're going to go ahead and close out this episode, 922 00:54:56,680 --> 00:54:58,439 Speaker 2: but we're going to come back with at least one 923 00:54:58,440 --> 00:55:02,480 Speaker 2: more episode on cynicism. Again, this is a huge topic. 924 00:55:03,280 --> 00:55:04,960 Speaker 2: In the next episode, I believe we're going to get 925 00:55:04,960 --> 00:55:09,839 Speaker 2: into cynicism, politics, and social media, so that should be 926 00:55:10,880 --> 00:55:15,680 Speaker 2: a fun discussion. Either way, tune in. We're looking forward to 927 00:55:15,960 --> 00:55:18,960 Speaker 2: getting into it. In the meantime, I'd like to remind 928 00:55:18,960 --> 00:55:21,520 Speaker 2: everyone that Stuff to Blow Your Mind is primarily a 929 00:55:21,600 --> 00:55:25,239 Speaker 2: science and culture podcast, with core episodes on Tuesdays and Thursdays. 930 00:55:25,480 --> 00:55:27,520 Speaker 2: We have a short-form episode on Wednesdays, and on 931 00:55:27,560 --> 00:55:29,920 Speaker 2: Fridays we set aside most serious concerns to just talk 932 00:55:29,960 --> 00:55:32,040 Speaker 2: about a weird film on Weird House Cinema. 933 00:55:32,600 --> 00:55:36,239 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 934 00:55:36,320 --> 00:55:37,799 Speaker 3: If you would like to get in touch with us 935 00:55:37,800 --> 00:55:40,240 Speaker 3: with feedback on this episode or any other, to suggest 936 00:55:40,320 --> 00:55:42,400 Speaker 3: a topic for the future, or just to say hello, 937 00:55:42,680 --> 00:55:45,240 Speaker 3: you can email us at contact at Stuff to Blow 938 00:55:45,239 --> 00:56:01,239 Speaker 3: Your Mind dot com. 939 00:55:53,760 --> 00:55:56,719 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 940 00:55:56,800 --> 00:56:00,640 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 941 00:56:00,719 --> 00:56:02,480 Speaker 1: or wherever you're listening to your favorite shows.