Speaker 1: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And it's Saturday, time to head into the old vault for a classic episode of the show. This one originally aired in February, and it's part two of our series about overconfidence. All right, let's just dive right into it. Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's HowStuffWorks. Hey you, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and we're back with part two of our discussion of overconfidence. That's right. If you did not listen to the previous episode, do go back and listen to that episode, because we lay the groundwork there. We discuss overconfidence and hubris in mythology and in human history, and then get into the psychology of it: what various psychological studies have revealed and continue to reveal about the nature of overconfidence, and how we can divide this sort of amorphous concept of overconfidence out into categories that can be more easily studied and understood. That's right.
Now, in the last episode, one of the main things we talked about was this huge new review of the scientific literature on something known as the better than average effect, which is the tendency for people to rate themselves as better than average with respect to their peers on all kinds of stuff. One classic example is that something like ninety-three percent of people think they're a better than average driver. So if you're listening to this as you drive, eyes back on the road, and make sure you use those turn signals. Turn signals save lives; turn signals let other drivers and pedestrians know what you intend to do. Even if you think you're a great driver, drive like you're less good than you are, and it will make you a better driver. Drive like you can't see all the other cars and pedestrians around you, because sometimes you cannot. Drive like you're driving a murder weapon, because potentially you are. It's quite true. All right.
Now, one of the things we talked about in the last episode was a paper from 2017 by Don Moore and Derek Schatz called The Three Faces of Overconfidence, which actually broke overconfidence down into three distinct categories of bias or misperception. We talked about those a little bit last time, and we're going to be exploring more of what that paper had to say, and its critiques of overconfidence research, specifically with reference to these three types of overconfidence. As a brief refresher, the three types are overestimation, overplacement, and overprecision. Overestimation is thinking that you're better than you are, and this would be with reference to some kind of, you know, objective measure out in the world. So if you think that you are taller than you are, if you think that you can jump higher than you can, if you think that you would get a better score on a test than you actually could, that's overestimation. The next one, overplacement, is similar, but instead it's comparing yourself with other people.
So the better than average effect would be an example of overplacement. It's, you know, thinking you are better than average compared to your peers at some task. Or it would be thinking that you work harder than other people, or thinking that you are smarter than other people. Of course, if it's overconfidence, that means those are not actually accurate assessments. And then finally the other one would be overprecision, which is being too sure that you know the truth. Again, this might be called epistemic overconfidence. It's just being too certain that your beliefs are correct. Now, to get into Moore and Schatz's paper: one of the questions that they address is what actually drives some of these different effects as they are manifested. So they start with overestimation. What causes us to think we would get a better score on a test than we do, or to think we have more money in the bank than we do? A common answer that people give to this is the idea of wishful thinking: it would feel good if this were true, therefore I believe it, right?
The authors don't think this explanation is very plausible, and they offer several problems with it, and we can interrogate these, maybe disagree with them, as we go on. But first of all, they say, you know, self-delusion is demonstrably maladaptive. For example, a tendency toward wishful thinking about the safety of kissing sharks, with tongue, is not a trait that the environment will tend to select for. People overconfident about their academic abilities will tend not to study and actually do worse. People who believe themselves invulnerable will take risks that sometimes get them killed. This might seem obvious, but there is actually plenty of research on this. I mean, people who are overconfident about their abilities do face a lot of downsides when those abilities are put to the test. Yeah, I mean, one example from literature that comes to mind is that of Macbeth, who believes himself protected by prophecy and then, of course, snuffs it. Exactly. But then again, I think, okay, so it is true that these people will face a lot of downsides. But then again, people do engage in self-destructive, self-deluded behavior all the time.
This is a common feature of human life. Yeah, I mean, for instance, we were just recently talking about the placebo effect on our movie episode where we talked about The Fly, and about the possibility that the placebo effect is basically due to, you know, this innate tendency towards self-delusion, which may very well be adaptive, at least in this scenario, where we benefit from being able to believe something is going to work, and from experiencing at least a small physical benefit from it, like a small curative benefit. And then, you know, I also can't help but think that self-delusion entails far more than just overconfidence. It also entails all manner of paranoia. And there is a strong case for the adaptive nature of, say, making a type one error in cognition, a false positive: the belief that the rustle in the tall grass is that of a tiger when it's not, because if you make the type two error, you're more likely to be eaten by the tiger. Right, right.
Yeah, having accurate information about the world is actually very useful, and having inaccurate information can kill you. Yeah, but I'm not so much, you know, trying to disagree with the maladaptive self-delusion argument that we mentioned earlier, but rather, you know, to point out that the human experience is rife with self-delusions. So might a dash of overconfidence, even in the form of overestimation, serve to balance out this alchemy of, you know, our perception of reality? For example, say you have a karaoke singer, and granted, karaoke is very low stakes, but it could involve social embarrassment, which you could fear would lead to ostracism, and that's actually one of the most powerful negative motivators on human behavior, right? But again, karaoke is also one of these things where, like, sometimes it's cool to do it badly, so this is not a perfect example. But say you have a karaoke singer that imbibes a little liquid courage before taking the microphone, as most karaoke participants are wont to do.
But yeah, they get a little liquid courage because they know they don't have the greatest voice in the world, and they feel a little awkward getting up there, but they know that a little bit of booze-induced overconfidence might help matters. I think you're exactly right there, and it's funny to start here, because I think while the authors make tons of good points, this is one of the ones they make that I might disagree with the most. I think that there are antagonistic adaptations in human behavior. One pressure might favor having an accurate picture of the world, assessing things in a clear and accurate way, while a cross pressure favors self-deception, especially self-deception in the form of overconfidence. For example, you might be more likely to survive if you have accurate assessments of your own abilities, but you might be more likely to take big risks with potentially big rewards if you overestimate your abilities. Or self-delusional overconfidence could be adaptive because it helps us persuade or even deceive other people about our worth. Yeah.
Ultimately, you have to believe in yourself if, you know, other people are not going to believe in you for you. Right, right. I mean, we talked in the last episode about how it's probably not a coincidence that you really often notice overconfidence in people who occupy high-status leadership roles, and in how they get there. I mean, it's not hard to imagine that overconfidence helped them get to that point. Yeah, and it's a sometimes fun, sometimes terrifying exercise, like, if you engage with people like this, when you realize, oh, they're just really overconfident. That's not to say they're not skilled, though sometimes they're not. But sometimes you realize, oh, there is this gap between ability and what they're saying they're going to deliver on, or what they are estimating the future will consist of.
Yeah, I mean, it is kind of shocking how often in life you will suddenly come to a realization that, you know, the boss or the leader or whoever's main skill is BS-ing, like they can just go out there and wing it in a way that you would be too timid and reserved to do. Right. Now, this idea of, you know, accurate assessments playing into our own abilities, I couldn't help but think of the film Butch Cassidy and the Sundance Kid in this scenario, because it relates to two specific points in the film. One is the whole "would you make that jump if you didn't have to" scenario, where they're being tracked, they're being hunted, and they've come to this cliff overlooking this river, and they realize that if they jump off this cliff and they land in that river and they don't die, they'll get away, because the stakes are such that those pursuing them will not follow them. They will not make that jump if they don't need to.
So there's that, and then at the very end there's kind of a going-out-the-old-fashioned-way, guns-a-blazing scenario where they're cornered, they're going to slowly be killed, and they decide to just go for it, to just bust out shooting and fight. Right. So the incentives, like the evolutionary incentives on a brain generating accurate pictures of the world versus self-deluded overconfidence, those could very well be just the contrast between a low-risk, low-reward strategy versus a high-risk, high-reward strategy. Right, yeah. So the first example is definitely high risk, high reward. Like, it was pretty much their best option for survival at that point, and they took it, and in the film they survive. At the end of the film, it's pretty much implied that they die, but at the same time, it still seems to be their best option; if not their best option for surviving, it's at least, kind of, the psychological best option.
You know, are we gonna stay in here and die like rats, or are we gonna, you know, just burst out there and die like heroes in a film that is named after them? You know, man, that's a great movie. I want to go back to it; I don't remember much of it except the end. I mean, the ending is kind of a downer, but yeah, it's surprisingly sweet for a violent outlaw movie. Yeah, it's a good one. And you know, I mentioned Macbeth earlier and the whole idea of, you know, draping himself in prophecy and using that to pump himself up. But that does bring to mind the role of religion in all of this. You know, certainly a lot of the things that religion can do to your estimation of ability can revolve around, you know, the survivability of the soul, for example, and, like, what will happen if I act a certain way in life?
Yeah, and I think there could possibly be cross pressures going the same way with that. I mean that there are some evolutionary drawbacks and some advantages to it, right. And then of course that's not to say that religious motivations, you know, exist free of social ones, of course. I guess, you know, there's going to be a rich interplay between those, and that's, you know, something that comes up, for instance, when you look at studies of, say, suicide bombers, where on one hand you can look at it and just go with the simple scenario of, oh, here's a person who believes that if they die doing this act then they'll be rewarded in the afterlife. But then behind that there's a whole social scenario as well, of other humans, you know, telling them that this is the thing to do, etcetera. Yeah, motivations are a rich stew of many different influences. I mean, it's usually hard to nail down a single inciting incident or cause that leads people down a path in life.
And in fact, I think a lot of times, even when people do that with themselves and say, this was the reason I became whatever, or did whatever, I think a lot of times they're wrong about themselves. So basically, self-delusion: we're all just houses of cards, just ready to be knocked down at any point. Well, that's another way of putting it. Now, one thing you and I could be getting wrong here is whether we're talking properly about self-delusion or some other type of bias or, like, misperception in the brain. Because the authors here, they're addressing self-delusion specifically: maybe self-delusion implies that there's a sort of transformation going on somewhere in the brain, like the brain gets accurate information about the world and then just somehow presents it to the conscious mind in a skewed way. The authors here think that, especially if you're talking about wishful thinking as the brand of self-delusion, you know, getting false perceptions about the world in order to feel better.
257 00:14:00,920 --> 00:14:03,520 Speaker 1: They think that doesn't really work from a like unconscious 258 00:14:03,520 --> 00:14:07,319 Speaker 1: mind to conscious mind model, because emotions and moods also 259 00:14:07,400 --> 00:14:09,920 Speaker 1: seem to emerge from the unconscious mind, not from the 260 00:14:09,920 --> 00:14:13,120 Speaker 1: conscious mind. But then there's another thing they go to, 261 00:14:13,480 --> 00:14:16,560 Speaker 1: which is that they argue the empirical evidence for true 262 00:14:16,600 --> 00:14:20,800 Speaker 1: self deception in overestimation it's actually kind of weak and 263 00:14:20,880 --> 00:14:24,040 Speaker 1: kind of mixed. Why would this be, Well, first of all, 264 00:14:24,080 --> 00:14:27,360 Speaker 1: they say, it's hard to separate true self deception from 265 00:14:27,400 --> 00:14:31,440 Speaker 1: attempts to deceive others, including the researchers. So how can 266 00:14:31,520 --> 00:14:36,640 Speaker 1: you tell when somebody truly overestimates their own traits or 267 00:14:36,640 --> 00:14:41,040 Speaker 1: abilities versus they just tell you that they think their 268 00:14:41,280 --> 00:14:43,760 Speaker 1: traits or abilities are better than they are. In a 269 00:14:43,760 --> 00:14:48,400 Speaker 1: lot of cases, both would manifest equally as outward over confidence. Now, 270 00:14:48,440 --> 00:14:50,840 Speaker 1: you can come up with some methodologies and some tests 271 00:14:50,840 --> 00:14:52,760 Speaker 1: to try to get around this, Like you can make 272 00:14:52,840 --> 00:14:56,960 Speaker 1: people bet sums of money that would where the outcome 273 00:14:57,000 --> 00:14:59,240 Speaker 1: of the bet would be dependent on how good they 274 00:14:59,240 --> 00:15:02,360 Speaker 1: actually are at a task or something. 
But in a lot of cases, they say, it's hard to tell the difference between true self-deception and just attempts to deceive other people. Another thing they point out is that you don't actually have to be deceiving yourself to overestimate your abilities. You could be genuinely, completely ignorant of the fact that you're not as good as you think you are. And here's one place that the famous Dunning-Kruger effect comes in. Now, you may have heard about the Dunning-Kruger effect, but here's a very short sketch of it; it of course overlaps with a lot of what we're talking about today. Participants less skilled in a task or subject area can be prone to show even greater overestimation of their abilities in that skill or subject area. So with some skills, the worse you are, the more you overestimate your awesomeness. Now, why on earth would this be? Well, the authors here mention this could simply come from your low skills providing you with a poor frame of reference.
You don't know enough about this task or skill or subject area to even understand how much you don't know. So, like, the Dunning-Kruger effect would show not self-deception but genuine ignorance: you lack enough information to understand how badly you're failing. Like, I think a good example of this is, you know, you read one theory about some phenomenon, and it can be rather convincing. It can be so convincing that you think, well, this is it, it made a great case. But if you don't actually look at some of the other theories out there, or look at, you know, writings or pieces that actually compare them, or do some sort of meta-analysis, then you don't really have a proper frame of reference, or, I wouldn't even say a perfect frame of reference, but even, say, a healthy frame of reference.
Yes. So, like, you read one article about a subject and then you're an expert, and then you start reading more and you realize, like, oh wait a minute, you know, there's so much I don't understand, and your estimation of your own expertise drops sharply after that. Yeah, like you might realize, oh well, there are other theories, or you might realize, oh well, this was just one person's summary of this particular theory, and, oh, on top of that, perhaps they had a particular axe to grind in writing it, etcetera. Yeah, that's a great example. I mean, not to say that you should doubt everything you read, but, I mean, yeah, you should have healthy doubt, not, you know, not denialism, but, you know, just be aware that you don't know everything, and you should be especially suspicious when you have dipped your toes into a subject and now feel that you fully understand it. And we say that as professional toe-dippers. Now, finally, they point out the empirical evidence for wishful thinking itself, in general, as a psychological phenomenon.
They say this is not actually strong. Uh, 329 00:17:55,920 --> 00:17:59,320 Speaker 1: if there were strong evidence for wishful thinking, wouldn't 330 00:17:59,359 --> 00:18:03,119 Speaker 1: it be the case that more desirable outcomes would be 331 00:18:03,200 --> 00:18:07,480 Speaker 1: more strongly believed? And they say, no, studies that try 332 00:18:07,520 --> 00:18:09,480 Speaker 1: to test this out do not find this to be 333 00:18:09,560 --> 00:18:11,760 Speaker 1: the case. It's not the case that the more you 334 00:18:11,800 --> 00:18:14,119 Speaker 1: want something, the more you believe it to be true. 335 00:18:14,560 --> 00:18:17,600 Speaker 1: And there are only a few types of scenarios where 336 00:18:17,640 --> 00:18:20,480 Speaker 1: there's any evidence of this at all, such as scenarios 337 00:18:20,480 --> 00:18:23,840 Speaker 1: where all outcomes are equally likely, like a dice roll 338 00:18:24,040 --> 00:18:26,399 Speaker 1: or something. Now, that is interesting to think of in 339 00:18:26,520 --> 00:18:30,200 Speaker 1: terms of Dungeons and Dragons, where frequently one is either 340 00:18:30,240 --> 00:18:32,919 Speaker 1: making an attack or attempting some 341 00:18:33,000 --> 00:18:36,040 Speaker 1: sort of act that requires a skill check. And I 342 00:18:36,080 --> 00:18:39,280 Speaker 1: find myself doing this. You know, you go up there and 343 00:18:39,320 --> 00:18:41,880 Speaker 1: you begin to explain what your character is going to do, 344 00:18:42,160 --> 00:18:44,760 Speaker 1: as if you hit that natural twenty. That's kind of 345 00:18:44,800 --> 00:18:47,800 Speaker 1: the idea. Yeah.
So I find myself engaging in a lot 346 00:18:47,840 --> 00:18:51,360 Speaker 1: of that level of overconfidence with my character, because ultimately 347 00:18:51,359 --> 00:18:53,520 Speaker 1: it all comes down to the roll of the dice, 348 00:18:53,920 --> 00:18:57,160 Speaker 1: you know, unless I'm trying to, you know, leap off 349 00:18:57,200 --> 00:18:59,360 Speaker 1: of the Demogorgon's head or something that is going 350 00:18:59,400 --> 00:19:02,879 Speaker 1: to be extremely difficult, because there are going to be additional 351 00:19:03,000 --> 00:19:06,800 Speaker 1: numerical, you know, values added or subtracted 352 00:19:06,840 --> 00:19:09,000 Speaker 1: from the attempt. You know, ultimately it's still gonna be 353 00:19:09,040 --> 00:19:11,879 Speaker 1: one to twenty, one being, you know, a pretty 354 00:19:11,960 --> 00:19:14,560 Speaker 1: much complete fail. You know, that's gonna be the 355 00:19:14,560 --> 00:19:17,359 Speaker 1: one where you slip and stab yourself with your own sword, 356 00:19:17,800 --> 00:19:19,679 Speaker 1: or it's gonna be that natural twenty, which is going 357 00:19:19,720 --> 00:19:22,239 Speaker 1: to be, you know, the wonder hit where you do 358 00:19:22,320 --> 00:19:25,919 Speaker 1: extra damage. That is a fantastic example. I was 359 00:19:25,960 --> 00:19:29,159 Speaker 1: trying to think of cases where I thought I really 360 00:19:29,280 --> 00:19:31,920 Speaker 1: did engage in wishful thinking, and I couldn't think. I'm 361 00:19:31,920 --> 00:19:34,720 Speaker 1: sure I do sometimes, but yeah, they say it's not 362 00:19:34,800 --> 00:19:37,280 Speaker 1: actually as common as people think it is. And here's 363 00:19:37,320 --> 00:19:39,520 Speaker 1: maybe one case. I think in Dungeons and Dragons, I 364 00:19:39,920 --> 00:19:42,240 Speaker 1: have yet to meet a player or be a player 365 00:19:42,640 --> 00:19:45,160 Speaker 1: that does not engage in wishful thinking.
Every 366 00:19:45,200 --> 00:19:47,560 Speaker 1: time you roll the dice. Like, nobody rolls that 367 00:19:47,640 --> 00:19:49,320 Speaker 1: dice and says, all right, this is how I'm 368 00:19:49,320 --> 00:19:52,920 Speaker 1: gonna fall off this table, or this is how I'm 369 00:19:52,920 --> 00:19:56,560 Speaker 1: going to fall into the next trap and, you know, 370 00:19:56,720 --> 00:19:59,960 Speaker 1: skewer myself on a stake. No, we want the 371 00:20:00,040 --> 00:20:02,400 Speaker 1: best outcome, and we have it in our mind 372 00:20:02,520 --> 00:20:06,119 Speaker 1: before the dice puts us in our place. Now, another 373 00:20:06,200 --> 00:20:10,000 Speaker 1: thing that the authors here bring up is that overestimation itself. 374 00:20:10,080 --> 00:20:12,560 Speaker 1: Remember again, that's just thinking that you are better than 375 00:20:12,600 --> 00:20:15,000 Speaker 1: you are in some way, in terms of abilities or 376 00:20:15,040 --> 00:20:18,760 Speaker 1: traits or something. This actually has a mixed 377 00:20:18,760 --> 00:20:22,320 Speaker 1: evidential record. It's not always the case that we overestimate 378 00:20:22,359 --> 00:20:25,720 Speaker 1: ourselves on all qualities or tasks. It's more the case 379 00:20:25,800 --> 00:20:28,040 Speaker 1: for some things in particular. And they give a couple 380 00:20:28,040 --> 00:20:30,960 Speaker 1: of examples of things where there really is a ton 381 00:20:31,040 --> 00:20:34,800 Speaker 1: of evidence for consistent overestimation. One is something you brought 382 00:20:34,880 --> 00:20:38,680 Speaker 1: up in the last episode, Robert, the planning fallacy. There 383 00:20:38,800 --> 00:20:43,720 Speaker 1: is really good evidence that people consistently overestimate how fast 384 00:20:43,800 --> 00:20:46,800 Speaker 1: they'll be able to get things done or complete a 385 00:20:46,840 --> 00:20:50,560 Speaker 1: project of some kind.
And this is especially true if 386 00:20:50,600 --> 00:20:54,360 Speaker 1: the project is difficult and novel. So like, if I, 387 00:20:54,359 --> 00:20:56,560 Speaker 1: you know, put together some complex 388 00:20:56,600 --> 00:20:58,680 Speaker 1: thing for you to do that's hard and you've never 389 00:20:58,720 --> 00:21:02,919 Speaker 1: done it before, you are really likely to massively underestimate 390 00:21:02,960 --> 00:21:04,840 Speaker 1: how much time it's going to take you. Right, if 391 00:21:04,840 --> 00:21:06,920 Speaker 1: you're like, well, you know, I'm not a handyman, but 392 00:21:07,040 --> 00:21:10,560 Speaker 1: I think I'm gonna install this sink myself, and then 393 00:21:10,600 --> 00:21:14,679 Speaker 1: you watch a weekend just vanish. Yeah, I know that feeling. 394 00:21:15,359 --> 00:21:18,120 Speaker 1: Another one that they cite is the illusion of control. 395 00:21:18,240 --> 00:21:22,560 Speaker 1: People pretty consistently overestimate how much control they have, or they 396 00:21:22,600 --> 00:21:26,000 Speaker 1: will have, over future outcomes, even things that they 397 00:21:26,000 --> 00:21:31,080 Speaker 1: should understand are basically random. Right. You see this financially 398 00:21:31,200 --> 00:21:33,840 Speaker 1: and business-wise a lot of times, where someone will 399 00:21:33,880 --> 00:21:36,399 Speaker 1: think they have a clear idea of like 400 00:21:36,440 --> 00:21:38,600 Speaker 1: how things are going to flow, but they're just 401 00:21:38,680 --> 00:21:42,560 Speaker 1: not taking into account all the factors they cannot control 402 00:21:43,200 --> 00:21:46,760 Speaker 1: in, say, the economy, or just the 403 00:21:47,080 --> 00:21:49,840 Speaker 1: industry that they're a part of.
But they're kind of 404 00:21:49,920 --> 00:21:52,719 Speaker 1: acting, making choices, based on sort of like 405 00:21:52,760 --> 00:21:55,320 Speaker 1: not even a best-case scenario, but sort of 406 00:21:55,320 --> 00:21:58,760 Speaker 1: like a standard scenario, you know. Yeah, I think I 407 00:21:58,760 --> 00:22:00,560 Speaker 1: know what you mean. Like, they're not counting on 408 00:22:00,600 --> 00:22:02,920 Speaker 1: the storm, and they're also not counting on the wind 409 00:22:02,920 --> 00:22:06,240 Speaker 1: to completely die away, and that's how they're basing, you know, 410 00:22:06,280 --> 00:22:08,240 Speaker 1: their estimate of how long it's going to take 411 00:22:08,280 --> 00:22:10,920 Speaker 1: to sail across the sea. Oh yeah, I mean, 412 00:22:11,040 --> 00:22:13,879 Speaker 1: that's another thing. There's actually a name for this. 413 00:22:13,920 --> 00:22:16,000 Speaker 1: I've forgotten it at the moment, but maybe I'll call 414 00:22:16,040 --> 00:22:18,240 Speaker 1: it to mind in a second. But it's the 415 00:22:18,240 --> 00:22:21,080 Speaker 1: assumption that the future will be like the present. 416 00:22:21,600 --> 00:22:26,000 Speaker 1: Maybe it's called the continuation fallacy or something. But now 417 00:22:26,040 --> 00:22:28,080 Speaker 1: there's one more thing that they bring up with respect 418 00:22:28,119 --> 00:22:32,600 Speaker 1: to overestimation specifically, and this is a standard finding 419 00:22:32,640 --> 00:22:35,399 Speaker 1: that applies to a lot of the research on overestimation. 420 00:22:35,680 --> 00:22:40,120 Speaker 1: It's called the hard-easy distinction, or the hard-easy effect. 421 00:22:40,600 --> 00:22:43,119 Speaker 1: And, uh, this one is interesting because we'll see some 422 00:22:43,200 --> 00:22:45,919 Speaker 1: variations with it in other types of overconfidence.
But it 423 00:22:45,960 --> 00:22:49,600 Speaker 1: goes like this: we are more likely to overestimate our 424 00:22:49,640 --> 00:22:54,679 Speaker 1: abilities on hard tasks and underestimate our abilities on easy ones. 425 00:22:55,640 --> 00:22:59,280 Speaker 1: So again, like the hard project comes up with the 426 00:22:59,320 --> 00:23:03,280 Speaker 1: planning fallacy. You massively underestimate how much time it's 427 00:23:03,280 --> 00:23:06,320 Speaker 1: going to take you to do that hard, complex, novel thing, 428 00:23:06,720 --> 00:23:09,240 Speaker 1: but then you might overestimate how much time it's going 429 00:23:09,280 --> 00:23:12,159 Speaker 1: to take you to do something that's a common, easy task. 430 00:23:12,960 --> 00:23:14,800 Speaker 1: I guess the main example that's coming to mind 431 00:23:14,800 --> 00:23:18,399 Speaker 1: on this one would be the scenario where driving across 432 00:23:18,480 --> 00:23:22,040 Speaker 1: town or going to a particular destination takes less time 433 00:23:22,040 --> 00:23:23,840 Speaker 1: than you think it will, and then you show up 434 00:23:23,920 --> 00:23:27,000 Speaker 1: like fifteen minutes early. Yeah, or worse. Yeah. And 435 00:23:27,200 --> 00:23:30,359 Speaker 1: the authors here have some explanations for how exactly this 436 00:23:30,400 --> 00:23:31,800 Speaker 1: is working that we'll get to in a bit. Should we 437 00:23:31,800 --> 00:23:33,760 Speaker 1: take a break? Yes, we should, and when we come 438 00:23:33,760 --> 00:23:40,119 Speaker 1: back we will continue our journey through overconfidence. Alright, 439 00:23:40,119 --> 00:23:42,680 Speaker 1: we're back. All right. So we've been talking about this 440 00:23:42,720 --> 00:23:46,120 Speaker 1: paper about overconfidence, about the three types of overconfidence, 441 00:23:46,200 --> 00:23:49,320 Speaker 1: by Schatz and Moore.
We were just talking about overestimation, 442 00:23:49,400 --> 00:23:51,479 Speaker 1: the belief that you're better than you are, especially with 443 00:23:51,560 --> 00:23:55,240 Speaker 1: respect to some kind of objective measure or independent measure. 444 00:23:55,600 --> 00:23:57,440 Speaker 1: And so we want to move on to the next 445 00:23:57,520 --> 00:24:00,760 Speaker 1: type of overconfidence they talk about, which is overplacement. 446 00:24:00,840 --> 00:24:03,719 Speaker 1: And again, this is different from overestimation, because this is 447 00:24:04,160 --> 00:24:07,439 Speaker 1: thinking that you're better than you are with respect to 448 00:24:07,520 --> 00:24:11,240 Speaker 1: other people, judging yourself compared to others. Now, the 449 00:24:11,280 --> 00:24:14,920 Speaker 1: authors have some methodological critiques of some of the literature here, 450 00:24:14,920 --> 00:24:17,680 Speaker 1: but they acknowledge there's a lot of evidence for overplacement, 451 00:24:18,000 --> 00:24:21,200 Speaker 1: citing the better than average effect in all its beastly forms, 452 00:24:21,240 --> 00:24:23,600 Speaker 1: like in the other paper that we talked about that 453 00:24:23,680 --> 00:24:27,240 Speaker 1: was, you know, recently found to be extremely robust. 454 00:24:27,440 --> 00:24:30,120 Speaker 1: They've got some quibbles about methodology in some of these 455 00:24:30,119 --> 00:24:34,040 Speaker 1: studies, like using ambiguous scales or measures. Robert, you were talking, 456 00:24:34,080 --> 00:24:36,320 Speaker 1: I think in the last episode, about, you know, how 457 00:24:36,359 --> 00:24:38,600 Speaker 1: some of these types of things, like, uh, you know, 458 00:24:38,680 --> 00:24:43,440 Speaker 1: how people rate themselves in terms of attractiveness or intelligence 459 00:24:43,560 --> 00:24:46,919 Speaker 1: or something.
These can of course suffer from ambiguous criteria, 460 00:24:47,200 --> 00:24:51,480 Speaker 1: right? Yeah, or sometimes just straight up unfair criteria, racist criteria, 461 00:24:51,600 --> 00:24:56,120 Speaker 1: misogynistic criteria, etcetera. Well, yeah, it absolutely has all those 462 00:24:56,160 --> 00:24:59,360 Speaker 1: negative effects. I mean, I think overplacement is like, it's 463 00:24:59,400 --> 00:25:01,399 Speaker 1: behind a lot of the worst types of prejudices that 464 00:25:01,440 --> 00:25:04,800 Speaker 1: make themselves known. But even if you're just like looking 465 00:25:04,800 --> 00:25:07,840 Speaker 1: at what is the quality you're trying to measure, you know, 466 00:25:08,359 --> 00:25:11,679 Speaker 1: attractiveness or something like that, there's usually not like 467 00:25:11,880 --> 00:25:14,320 Speaker 1: an objective way of rating that. It's all based on 468 00:25:14,359 --> 00:25:18,159 Speaker 1: these kinds of ambiguous, subjective judgments. One great example of 469 00:25:18,200 --> 00:25:20,880 Speaker 1: this is something that we've brought up several times already, 470 00:25:20,960 --> 00:25:25,960 Speaker 1: like the driving example. So Svenson in 1981 did a 471 00:25:26,000 --> 00:25:30,720 Speaker 1: study where he discovered that ninety-three percent of American drivers 472 00:25:30,840 --> 00:25:35,720 Speaker 1: rated themselves above the median in driving ability. Obviously, whatever 473 00:25:35,880 --> 00:25:40,240 Speaker 1: criterion you use, it's impossible for that many to be above the median. 474 00:25:40,240 --> 00:25:43,400 Speaker 1: It would have to be, you know, like, um, 475 00:25:43,440 --> 00:25:46,280 Speaker 1: the majority can be above average, but the majority cannot 476 00:25:46,359 --> 00:25:48,960 Speaker 1: be above the median.
And the authors point out this 477 00:25:48,960 --> 00:25:51,720 Speaker 1: would be more impressive if it were more specific, because, 478 00:25:52,119 --> 00:25:55,080 Speaker 1: due to this problem with, like, ambiguous scales or measures, 479 00:25:55,400 --> 00:25:59,400 Speaker 1: anybody could technically have their own definition of what makes 480 00:25:59,400 --> 00:26:02,080 Speaker 1: a good driver. So you could be answering that 481 00:26:02,200 --> 00:26:04,640 Speaker 1: thinking, like, well, there are things that I do well 482 00:26:04,760 --> 00:26:07,639 Speaker 1: when I drive, and maybe they're different from what somebody 483 00:26:07,640 --> 00:26:09,879 Speaker 1: else thinks that they do well when they drive, and 484 00:26:09,920 --> 00:26:13,280 Speaker 1: that's their criterion. Right, I mean, your definition of 485 00:26:13,320 --> 00:26:15,560 Speaker 1: being a good driver could just be, 486 00:26:16,040 --> 00:26:17,400 Speaker 1: you know, I wasn't in a wreck on the way 487 00:26:17,440 --> 00:26:19,520 Speaker 1: to work this morning, you know. Or it could 488 00:26:19,560 --> 00:26:22,840 Speaker 1: be, I get the places I need to go fast. Yeah, 489 00:26:22,880 --> 00:26:25,800 Speaker 1: like, those are definitely, you know, 490 00:26:25,920 --> 00:26:29,040 Speaker 1: not necessarily the same vision of good driving. Or, I 491 00:26:29,080 --> 00:26:32,480 Speaker 1: look really cool when I do it, you know. Uh. 492 00:26:32,520 --> 00:26:34,760 Speaker 1: There's another thing they bring up which is interesting, which 493 00:26:34,800 --> 00:26:39,359 Speaker 1: is the role of self-selection in increasing the apparent 494 00:26:39,520 --> 00:26:43,280 Speaker 1: prevalence of overconfidence in the real world.
So an example 495 00:26:43,320 --> 00:26:47,480 Speaker 1: would be like this: on average, more overconfident people are 496 00:26:47,560 --> 00:26:50,639 Speaker 1: likely to apply for jobs, just sort of by definition, right? 497 00:26:51,400 --> 00:26:54,880 Speaker 1: More overconfident people are likely to start businesses, to run 498 00:26:54,920 --> 00:26:59,639 Speaker 1: for office. So we're exposed to more of these people, 499 00:27:00,200 --> 00:27:03,480 Speaker 1: and this could lead to us thinking that their confidence 500 00:27:03,600 --> 00:27:06,760 Speaker 1: level is more represented in the general population than it 501 00:27:06,840 --> 00:27:09,720 Speaker 1: actually is. Oh yeah, you turn on the television, it's 502 00:27:09,800 --> 00:27:13,439 Speaker 1: almost exclusively overly confident people. That's true. Yeah. So 503 00:27:13,480 --> 00:27:17,840 Speaker 1: if you just look at, like, business leadership, politics, celebrities, 504 00:27:18,760 --> 00:27:20,919 Speaker 1: all this, I think you will see 505 00:27:20,960 --> 00:27:24,479 Speaker 1: in general way more overconfidence than you will 506 00:27:24,720 --> 00:27:27,680 Speaker 1: just talking to your friends and relatives and co-workers. Now 507 00:27:27,720 --> 00:27:30,199 Speaker 1: here's a really interesting thing. Remember we talked about the 508 00:27:30,320 --> 00:27:34,359 Speaker 1: hard-easy effect, or the easy-hard effect, with overestimation, 509 00:27:34,359 --> 00:27:39,520 Speaker 1: where people tend to overestimate their abilities on hard jobs 510 00:27:39,560 --> 00:27:44,760 Speaker 1: and underestimate their abilities on easy jobs. Apparently, for overplacement, 511 00:27:45,400 --> 00:27:48,200 Speaker 1: there's also a hard-easy effect, but it's in 512 00:27:48,240 --> 00:27:54,520 Speaker 1: the exact opposite direction.
With overplacement, we overplace ourselves relative 513 00:27:54,680 --> 00:27:58,520 Speaker 1: to others on easy, common tasks 514 00:27:58,880 --> 00:28:02,960 Speaker 1: and underplace ourselves relative to others on difficult, unusual, 515 00:28:03,200 --> 00:28:06,240 Speaker 1: or rare ones. So again, what would be some examples 516 00:28:06,280 --> 00:28:09,399 Speaker 1: of this? You think you're in the ninetieth percentile 517 00:28:09,480 --> 00:28:12,080 Speaker 1: of drivers, but really you're in the fortieth. This is 518 00:28:12,119 --> 00:28:16,000 Speaker 1: an easy, common task. On the other hand, people think 519 00:28:16,080 --> 00:28:20,680 Speaker 1: that they are less likely than others to win difficult competitions. 520 00:28:20,720 --> 00:28:23,840 Speaker 1: Studies show that when there's a teacher that decides to 521 00:28:24,160 --> 00:28:27,560 Speaker 1: make an exam harder and grade it on a curve, students 522 00:28:27,600 --> 00:28:30,880 Speaker 1: expect their grades to be worse than others', even when 523 00:28:30,920 --> 00:28:33,040 Speaker 1: there's common knowledge that there will be a curve. So 524 00:28:33,080 --> 00:28:36,240 Speaker 1: as the test gets harder, students perceive that they will 525 00:28:36,280 --> 00:28:40,920 Speaker 1: do worse relative to other classmates. That's kind of interesting. 526 00:28:40,960 --> 00:28:43,840 Speaker 1: They point out that people believe they are worse jugglers 527 00:28:43,920 --> 00:28:47,200 Speaker 1: than other people. They believe that they are less likely 528 00:28:47,280 --> 00:28:49,960 Speaker 1: to win the lottery than other people, again a difficult, 529 00:28:50,040 --> 00:28:54,240 Speaker 1: rare thing. And here's a very 530 00:28:54,240 --> 00:28:57,600 Speaker 1: interesting version, just in terms of ages.
People believe they 531 00:28:57,600 --> 00:29:01,400 Speaker 1: are less likely than other people to live past one hundred, 532 00:29:01,800 --> 00:29:04,560 Speaker 1: but they also think they're more likely than other people 533 00:29:04,640 --> 00:29:08,440 Speaker 1: to live past seventy. Interesting. Well, of course, both of 534 00:29:08,440 --> 00:29:10,440 Speaker 1: those kind of depend on where you are in the 535 00:29:10,480 --> 00:29:13,280 Speaker 1: age spectrum when you're making that estimation, you know. 536 00:29:13,280 --> 00:29:16,200 Speaker 1: I mean, but apparently it's 537 00:29:16,200 --> 00:29:19,440 Speaker 1: true of all ages. Yeah, but also your quality of life, right? 538 00:29:19,440 --> 00:29:22,240 Speaker 1: I mean, for some people, the prospect of living to one hundred, 539 00:29:22,760 --> 00:29:25,840 Speaker 1: depending on where you are healthwise, that might be terrifying. 540 00:29:25,920 --> 00:29:29,080 Speaker 1: It might be wishful thinking that you'll 541 00:29:29,120 --> 00:29:31,920 Speaker 1: expire sooner than that. Or it could be the 542 00:29:31,920 --> 00:29:35,160 Speaker 1: other way around, you know. Um, what kind of explanations 543 00:29:35,160 --> 00:29:38,280 Speaker 1: are they throwing out? Yeah, this was interesting. So yeah, 544 00:29:38,320 --> 00:29:40,840 Speaker 1: why do we fail in opposite directions here, depending on 545 00:29:40,880 --> 00:29:44,480 Speaker 1: whether we're imagining our performance against objective measures versus relative 546 00:29:44,520 --> 00:29:47,560 Speaker 1: to others? The authors cite solutions from some of 547 00:29:47,640 --> 00:29:52,280 Speaker 1: Moore's previous work with other authors. They write this quote:
548 00:29:52,520 --> 00:29:55,880 Speaker 1: If people make any errors estimating how well they've done 549 00:29:56,080 --> 00:29:58,840 Speaker 1: or will do, then it stands to reason they're more 550 00:29:58,880 --> 00:30:01,680 Speaker 1: likely to overestimate a low score and 551 00:30:01,840 --> 00:30:05,360 Speaker 1: more likely to underestimate a high score. That's the hard-easy 552 00:30:05,440 --> 00:30:09,040 Speaker 1: effect. As long as people have more uncertainty about 553 00:30:09,160 --> 00:30:12,520 Speaker 1: others' scores, they will tend to make even more regressive 554 00:30:12,640 --> 00:30:16,040 Speaker 1: estimates of others than of self. The consequence would be 555 00:30:16,120 --> 00:30:20,040 Speaker 1: that they overestimate others even more than themselves on difficult 556 00:30:20,080 --> 00:30:23,000 Speaker 1: tasks and come to believe that they are worse than others. 557 00:30:23,400 --> 00:30:26,160 Speaker 1: The opposite would hold true for easy tasks. People would 558 00:30:26,240 --> 00:30:30,040 Speaker 1: underestimate others more than themselves and wind up believing that 559 00:30:30,080 --> 00:30:33,040 Speaker 1: they are better than others. So that took me a 560 00:30:33,080 --> 00:30:34,840 Speaker 1: minute to get my head around, but then I finally 561 00:30:34,880 --> 00:30:37,440 Speaker 1: made sense of it. So when you're not sure how 562 00:30:37,520 --> 00:30:39,680 Speaker 1: you will do at something, as we're always, you know, 563 00:30:39,800 --> 00:30:42,840 Speaker 1: not sure, there's a ton of uncertainty in life, or 564 00:30:43,000 --> 00:30:46,000 Speaker 1: you're not sure how others will do, there's simply more 565 00:30:46,080 --> 00:30:50,000 Speaker 1: room for possibility to guess high if your performance is 566 00:30:50,040 --> 00:30:52,920 Speaker 1: likely to be low, and more room to guess low 567 00:30:53,080 --> 00:30:55,920 Speaker 1: if your performance is likely to be high.
And this 568 00:30:55,960 --> 00:30:59,480 Speaker 1: applies to both the self and other people. Since we 569 00:30:59,560 --> 00:31:02,960 Speaker 1: know even less about other people than we do about ourselves, 570 00:31:03,200 --> 00:31:05,960 Speaker 1: we're going to spend more time guessing wrong in these 571 00:31:06,080 --> 00:31:09,480 Speaker 1: vast over and under zones for other people, depending on 572 00:31:09,560 --> 00:31:11,960 Speaker 1: what type of task it is. Now we're going to 573 00:31:12,080 --> 00:31:15,880 Speaker 1: talk about overprecision from this two thousand seventeen study. Now, 574 00:31:15,920 --> 00:31:18,760 Speaker 1: overprecision, again, that's like having way more 575 00:31:18,800 --> 00:31:22,440 Speaker 1: confidence than you should about what you believe to be true. 576 00:31:22,840 --> 00:31:24,440 Speaker 1: So I could, you know, ask 577 00:31:24,480 --> 00:31:26,320 Speaker 1: you to answer a question and then ask you 578 00:31:26,440 --> 00:31:30,080 Speaker 1: how confident you are that your answer is correct. 579 00:31:30,160 --> 00:31:33,920 Speaker 1: And the authors here write, quote: results routinely find that 580 00:31:34,080 --> 00:31:39,520 Speaker 1: hit rates inside confidence intervals are below fifty percent, implying 581 00:31:39,560 --> 00:31:42,720 Speaker 1: that people set their ranges too precisely, acting as if 582 00:31:42,760 --> 00:31:46,360 Speaker 1: they're inappropriately confident that their beliefs are accurate. So if 583 00:31:46,360 --> 00:31:49,960 Speaker 1: you take a quiz, you say you're highly confident 584 00:31:50,040 --> 00:31:52,360 Speaker 1: on average about your answers, and you're actually more like 585 00:31:52,480 --> 00:31:56,040 Speaker 1: fifty percent correct on average. This is something that's been 586 00:31:56,080 --> 00:31:59,280 Speaker 1: found a bunch of times.
It's quite clear that there's 587 00:31:59,360 --> 00:32:03,360 Speaker 1: tons of overprecision in human behavior. The authors have 588 00:32:03,360 --> 00:32:06,719 Speaker 1: a few critiques about, like, common research paradigms that are 589 00:32:06,800 --> 00:32:09,600 Speaker 1: used to study this. One example is, they say, you know, 590 00:32:09,720 --> 00:32:11,920 Speaker 1: it may be that normal people don't have a very 591 00:32:11,960 --> 00:32:14,959 Speaker 1: solid understanding of how to use confidence intervals, so there 592 00:32:14,960 --> 00:32:18,320 Speaker 1: have been other ways of trying to measure it. However, 593 00:32:18,440 --> 00:32:21,520 Speaker 1: the authors here believe that overprecision is the most 594 00:32:21,640 --> 00:32:26,240 Speaker 1: pervasive form of overconfidence. You find it absolutely everywhere, 595 00:32:26,280 --> 00:32:29,520 Speaker 1: even in experts talking about their own subject matter. I 596 00:32:29,520 --> 00:32:31,160 Speaker 1: think that's come up on the show before, though I 597 00:32:31,160 --> 00:32:33,960 Speaker 1: don't remember when. After this, the authors here turn to 598 00:32:33,960 --> 00:32:36,760 Speaker 1: a question we talked about a little before: 599 00:32:36,840 --> 00:32:40,800 Speaker 1: could overconfidence actually be useful? Like, why 600 00:32:40,800 --> 00:32:44,280 Speaker 1: does it make sense for a brain to be overconfident? 601 00:32:44,320 --> 00:32:47,440 Speaker 1: And they talk about explanations in two main categories, intrapersonal 602 00:32:47,640 --> 00:32:52,200 Speaker 1: and interpersonal.
The authors generally think the evidence 603 00:32:52,240 --> 00:32:56,640 Speaker 1: for the interpersonal explanations, the explanations of how it works 604 00:32:56,680 --> 00:33:00,320 Speaker 1: on other people, is stronger than for the intrapersonal ones, 605 00:33:00,360 --> 00:33:03,880 Speaker 1: though there could be some good intrapersonal ones. For example, 606 00:33:04,320 --> 00:33:07,400 Speaker 1: you know, maybe overconfidence doesn't just make you feel good; it, 607 00:33:07,560 --> 00:33:11,040 Speaker 1: as we hypothesized earlier, makes you more likely to take 608 00:33:11,160 --> 00:33:14,160 Speaker 1: risks that can pay off big. Yeah. Well, I mean, 609 00:33:14,200 --> 00:33:16,400 Speaker 1: for instance, to go to, like, a 610 00:33:16,400 --> 00:33:22,160 Speaker 1: predator-prey scenario, one is reminded of, you know, 611 00:33:22,200 --> 00:33:25,640 Speaker 1: how effective your average predator is. You know, they are 612 00:33:25,680 --> 00:33:29,520 Speaker 1: going to fail a lot. And granted, a leopard is 613 00:33:29,560 --> 00:33:34,040 Speaker 1: not really subject to human, you know, overconfidence or 614 00:33:34,120 --> 00:33:37,320 Speaker 1: underconfidence, but certainly, if you 615 00:33:37,360 --> 00:33:39,840 Speaker 1: look at a human scenario, if you look at human hunters, 616 00:33:40,480 --> 00:33:42,640 Speaker 1: you know, it's certainly a situation where it would 617 00:33:42,640 --> 00:33:46,800 Speaker 1: pay to be overconfident, to a certain degree.
Yeah, 618 00:33:46,800 --> 00:33:49,360 Speaker 1: I think you can find some analogies of confidence and 619 00:33:49,400 --> 00:33:52,120 Speaker 1: overconfidence in animals. Like, you know, how likely are 620 00:33:52,160 --> 00:33:55,400 Speaker 1: you to try to take down a prey animal that you're 621 00:33:55,480 --> 00:33:58,320 Speaker 1: very unlikely to succeed against but that would provide 622 00:33:58,320 --> 00:34:00,120 Speaker 1: you with a lot of meat and energy if you 623 00:34:00,160 --> 00:34:02,720 Speaker 1: do? Right. Of course, the other side 624 00:34:02,760 --> 00:34:04,280 Speaker 1: of that is that you would not want to 625 00:34:04,280 --> 00:34:07,200 Speaker 1: be so overconfident that you were going after prey that was 626 00:34:07,240 --> 00:34:09,919 Speaker 1: extremely likely to kill you if you tried to bring 627 00:34:09,960 --> 00:34:12,879 Speaker 1: it down. Right. But the authors here do think 628 00:34:12,920 --> 00:34:17,040 Speaker 1: that there's really good evidence for interpersonal benefits of overconfidence. 629 00:34:17,560 --> 00:34:20,240 Speaker 1: One example would be all the empirical evidence that already 630 00:34:20,239 --> 00:34:24,080 Speaker 1: exists that just outwardly projecting confidence has all these benefits 631 00:34:24,080 --> 00:34:26,880 Speaker 1: affecting how other people see us.
There are studies that 632 00:34:26,960 --> 00:34:31,240 Speaker 1: show that highly confident people are more persuasive, they're more influential, 633 00:34:31,560 --> 00:34:34,480 Speaker 1: they're perceived as more sexually attractive, they tend to get 634 00:34:34,520 --> 00:34:38,480 Speaker 1: promoted to positions of authority in groups, um, and it's 635 00:34:38,520 --> 00:34:42,239 Speaker 1: possible that confidence is actually more important than competence in 636 00:34:42,280 --> 00:34:46,560 Speaker 1: determining who gets promoted to high-status positions. The authors 637 00:34:46,640 --> 00:34:49,640 Speaker 1: write, quote: while a preference for confident leaders may make 638 00:34:49,680 --> 00:34:54,120 Speaker 1: sense if there is a correlation, however weak, between confidence and competence, 639 00:34:54,280 --> 00:34:58,160 Speaker 1: there is real risk in selecting overconfident leaders. Well, 640 00:34:58,200 --> 00:35:00,560 Speaker 1: I mean, because on one hand, you want a boss 641 00:35:00,680 --> 00:35:03,120 Speaker 1: that can, you know, do their job and 642 00:35:03,200 --> 00:35:06,640 Speaker 1: keep the company afloat and actually grow the business, etcetera, 643 00:35:06,680 --> 00:35:09,680 Speaker 1: all the various catchphrases. But you also want a boss 644 00:35:09,719 --> 00:35:11,640 Speaker 1: that you can kind of 645 00:35:11,800 --> 00:35:13,920 Speaker 1: trust that they're doing their thing. Like, they 646 00:35:14,000 --> 00:35:17,600 Speaker 1: seem confident, I guess they have their whole end of 647 00:35:17,640 --> 00:35:19,560 Speaker 1: it figured out, and maybe I can focus on my 648 00:35:19,640 --> 00:35:22,240 Speaker 1: own thing, my own role in the company, 649 00:35:22,280 --> 00:35:25,520 Speaker 1: without, you know, freaking out about what's going to happen tomorrow.
Well, 650 00:35:25,560 --> 00:35:28,600 Speaker 1: maybe in the business scenario we should pivot to talking 651 00:35:28,640 --> 00:35:32,000 Speaker 1: about the Icarus paradox. Oh, all right, we're gonna take 652 00:35:32,000 --> 00:35:33,640 Speaker 1: a quick break, but we'll be right back with more. 653 00:35:37,280 --> 00:35:39,759 Speaker 1: And we're back. So, Robert, you wanted to bring in 654 00:35:39,800 --> 00:35:44,080 Speaker 1: a concept from the business world about overconfidence, the Icarus paradox. Yeah, 655 00:35:44,239 --> 00:35:46,520 Speaker 1: and I'm as surprised as maybe some of you are 656 00:35:46,560 --> 00:35:49,319 Speaker 1: that I'm bringing in something business-wise, but it 657 00:35:49,360 --> 00:35:51,239 Speaker 1: caught my attention when I was looking 658 00:35:52,120 --> 00:35:55,000 Speaker 1: for papers and so forth on overconfidence, and also 659 00:35:55,160 --> 00:35:58,120 Speaker 1: looking at Greek mythology and so forth, because, yeah, the 660 00:35:58,160 --> 00:36:02,400 Speaker 1: Icarus paradox invokes the story of Icarus rather directly. And 661 00:36:02,440 --> 00:36:04,359 Speaker 1: I think it also makes sense at another level, because, 662 00:36:04,400 --> 00:36:07,080 Speaker 1: you know, we don't have gods so much anymore, 663 00:36:07,120 --> 00:36:11,800 Speaker 1: like, these are not the driving, commanding forces in our world. 664 00:36:12,200 --> 00:36:16,239 Speaker 1: But we do have institutions, industries, global economies, and these 665 00:36:16,239 --> 00:36:19,000 Speaker 1: are not unlike the concepts of the gods. Right. You know, 666 00:36:19,040 --> 00:36:22,560 Speaker 1: they're sometimes lawful, sometimes chaotic entities that are likely to 667 00:36:22,640 --> 00:36:25,279 Speaker 1: destroy you if you question their authority or if you 668 00:36:25,960 --> 00:36:29,000 Speaker 1: turn against them. 
But anyway, I ran across 669 00:36:29,160 --> 00:36:32,920 Speaker 1: this interesting concept of the Icarus paradox. It was 670 00:36:32,960 --> 00:36:36,799 Speaker 1: devised by Canadian economist Danny Miller, and he points out 671 00:36:36,840 --> 00:36:40,839 Speaker 1: that businesses are often like Icarus in the myth: they 672 00:36:40,840 --> 00:36:46,000 Speaker 1: start out confident and competent, they rise, but then they perish. 673 00:36:46,120 --> 00:36:48,440 Speaker 1: The irony, he writes, is that many of the 674 00:36:48,480 --> 00:36:52,839 Speaker 1: most dramatically successful companies are prone to this exact sort 675 00:36:52,880 --> 00:36:56,520 Speaker 1: of failure. And in the business sense, 676 00:36:56,560 --> 00:37:00,560 Speaker 1: businesses are sometimes more like 677 00:37:00,560 --> 00:37:03,120 Speaker 1: the gods. You know, they have downfalls, but they 678 00:37:03,120 --> 00:37:06,720 Speaker 1: may very well be immortal. In some cases, 679 00:37:07,160 --> 00:37:09,240 Speaker 1: they're still alive. They're just chained to a rock getting 680 00:37:09,239 --> 00:37:12,560 Speaker 1: their liver pecked out by an eagle. Right, for eternity. 681 00:37:12,600 --> 00:37:16,000 Speaker 1: You know, maybe they just declare multiple bankruptcies. An immortal declaring 682 00:37:16,040 --> 00:37:18,680 Speaker 1: bankruptcy is a different thing than a corporation doing it. 683 00:37:19,239 --> 00:37:22,000 Speaker 1: Wait, what is that eagle? It's like a private 684 00:37:22,080 --> 00:37:26,080 Speaker 1: equity eagle. Yeah, the private equity eagle. Yeah. 
He 685 00:37:26,080 --> 00:37:28,640 Speaker 1: mentioned several companies in discussing this, and most of them, 686 00:37:28,680 --> 00:37:30,440 Speaker 1: I think, are still around, like 687 00:37:30,520 --> 00:37:34,320 Speaker 1: they have survived their downfalls. But yeah. He 688 00:37:34,360 --> 00:37:36,720 Speaker 1: writes that the irony is that many of these 689 00:37:36,840 --> 00:37:39,560 Speaker 1: dramatically successful companies are prone to these sorts of failures. 690 00:37:39,560 --> 00:37:42,759 Speaker 1: And what's more, the very factors that drive success, when 691 00:37:42,760 --> 00:37:46,120 Speaker 1: taken to excess, are the factors that bring about decline. 692 00:37:47,200 --> 00:37:49,440 Speaker 1: So I think this is an interesting model to look at, 693 00:37:49,520 --> 00:37:51,640 Speaker 1: not only because it's 694 00:37:51,719 --> 00:37:53,279 Speaker 1: a take on this from the business world, but I 695 00:37:53,320 --> 00:37:55,320 Speaker 1: think it can also serve as sort of a reflecting 696 00:37:55,360 --> 00:37:58,600 Speaker 1: pool for some of the individual concepts that we've discussed 697 00:37:58,640 --> 00:38:01,799 Speaker 1: already, by placing them outside of the human psyche and 698 00:38:01,840 --> 00:38:04,319 Speaker 1: looking at them in the context of an organization or 699 00:38:04,320 --> 00:38:09,080 Speaker 1: a culture. So Miller wrote a book about this, The 700 00:38:09,160 --> 00:38:12,920 Speaker 1: Icarus Paradox: How Exceptional Companies Bring About Their Own Downfall: 701 00:38:13,280 --> 00:38:17,080 Speaker 1: New Lessons in the Dynamics of Corporate Success, Decline, and Renewal. 
702 00:38:17,120 --> 00:38:19,960 Speaker 1: I know you love a long title like that, 703 00:38:20,120 --> 00:38:24,200 Speaker 1: but I was mainly looking at his Business 704 00:38:24,520 --> 00:38:27,440 Speaker 1: Horizons article that he wrote on the subject, and he 705 00:38:27,480 --> 00:38:31,400 Speaker 1: summarizes a lot of the key points. So Miller identified 706 00:38:31,560 --> 00:38:37,719 Speaker 1: four key trajectories in the riches-to-rags business scenario. 707 00:38:38,400 --> 00:38:41,800 Speaker 1: So the first one that he mentions is the focusing trajectory. 708 00:38:41,840 --> 00:38:46,880 Speaker 1: In this trajectory, the craftsman becomes the tinkerer, quote, firms 709 00:38:46,880 --> 00:38:54,080 Speaker 1: whose insular, technocratic monocultures alienate customers with perfect but irrelevant offerings. 710 00:38:54,120 --> 00:38:56,319 Speaker 1: So I guess a scenario would be, you know, 711 00:38:56,800 --> 00:39:00,399 Speaker 1: the company that creates a really groundbreaking product, and then 712 00:39:00,480 --> 00:39:04,120 Speaker 1: all they do is tinker with that concept. All they 713 00:39:04,160 --> 00:39:07,960 Speaker 1: do is make adjustments to that concept, and eventually, like, 714 00:39:08,120 --> 00:39:10,520 Speaker 1: somebody else is going to create a better widget or 715 00:39:10,560 --> 00:39:13,920 Speaker 1: a better, you know, smartphone, or whatever the situation may be. 716 00:39:14,400 --> 00:39:16,640 Speaker 1: Somebody else is going to take some, you know, 717 00:39:16,719 --> 00:39:21,440 Speaker 1: wider swings. Next comes the venturing trajectory, in which 718 00:39:21,520 --> 00:39:25,120 Speaker 1: builders become imperialists. And this one, I 719 00:39:25,160 --> 00:39:27,160 Speaker 1: think, is the one that really has the smack of 720 00:39:27,239 --> 00:39:31,920 Speaker 1: overconfidence to it. 
In this trajectory, the strategy of building 721 00:39:32,120 --> 00:39:36,440 Speaker 1: feeds into overexpansion. The goal of growth becomes grandeur, 722 00:39:36,880 --> 00:39:41,400 Speaker 1: and an entrepreneurial culture becomes one of gamesmen. A 723 00:39:41,480 --> 00:39:46,080 Speaker 1: divisionalized structure becomes fractured. And then on top of that 724 00:39:46,120 --> 00:39:49,319 Speaker 1: there's the inventing trajectory. This is where you go 725 00:39:49,400 --> 00:39:53,040 Speaker 1: from pioneer to escapist. In this trajectory, innovation feeds into 726 00:39:53,120 --> 00:39:58,640 Speaker 1: high-tech escapism. Science for society transforms into technological utopianism. 727 00:39:58,680 --> 00:40:01,319 Speaker 1: Research and development gives way to think tank culture, and 728 00:40:01,360 --> 00:40:05,439 Speaker 1: the overall culture goes from organic to chaotic. And then 729 00:40:05,600 --> 00:40:09,200 Speaker 1: he rounds us out with the decoupling trajectory, from salesman 730 00:40:09,320 --> 00:40:15,280 Speaker 1: to drifter, quote: Finally, the decoupling trajectory transforms salesman organizations 731 00:40:15,280 --> 00:40:19,520 Speaker 1: with unparalleled marketing skills, prominent brand names, and broad markets 732 00:40:19,840 --> 00:40:24,880 Speaker 1: into aimless, bureaucratic drifters whose sales fetish obscures design issues 733 00:40:25,080 --> 00:40:27,799 Speaker 1: and who produce a stale and disjointed line of me-too 734 00:40:27,920 --> 00:40:31,040 Speaker 1: offerings. Okay, so that's the company that's more based 735 00:40:31,080 --> 00:40:34,279 Speaker 1: around marketing culture than around having a good product, right. 736 00:40:34,400 --> 00:40:36,960 Speaker 1: You know, thinking about these trajectories kind of reminds me 737 00:40:37,040 --> 00:40:40,520 Speaker 1: of the peaking-in-high-school stereotype in life trajectories. 
738 00:40:41,000 --> 00:40:43,040 Speaker 1: It's a stereotype, but there is some truth to it. 739 00:40:43,160 --> 00:40:45,880 Speaker 1: I think it's possible that too much success early on 740 00:40:45,920 --> 00:40:48,520 Speaker 1: in life can kind of corrupt a person, can 741 00:40:48,560 --> 00:40:51,000 Speaker 1: kind of corrupt your work ethic, your ability to learn 742 00:40:51,080 --> 00:40:54,560 Speaker 1: from mistakes and mature. It's important for people to experience 743 00:40:54,600 --> 00:40:58,200 Speaker 1: both successes and failures early in life. Yeah, I 744 00:40:58,440 --> 00:41:00,360 Speaker 1: think so. It kind of comes back to what we 745 00:41:00,440 --> 00:41:05,160 Speaker 1: talked about considering Aristotle's summary of hubris, 746 00:41:05,200 --> 00:41:07,640 Speaker 1: saying that the young and the rich were the most 747 00:41:07,680 --> 00:41:10,280 Speaker 1: likely to engage in it, and the idea that perhaps 748 00:41:10,320 --> 00:41:13,680 Speaker 1: in some scenarios... certainly there are plenty of examples 749 00:41:13,680 --> 00:41:17,200 Speaker 1: of very wealthy people who got there through defeat, like 750 00:41:17,280 --> 00:41:20,400 Speaker 1: through learning the lessons of defeat. There are also examples 751 00:41:20,440 --> 00:41:23,400 Speaker 1: of people who have, you know, arguably, 752 00:41:23,880 --> 00:41:26,759 Speaker 1: never suffered a true defeat. They have 753 00:41:26,800 --> 00:41:30,600 Speaker 1: remained sort of man-children, kind of failed upward. Yeah, 754 00:41:30,680 --> 00:41:33,280 Speaker 1: that sort of thing. And again, we're dealing in broad 755 00:41:33,320 --> 00:41:36,000 Speaker 1: tropes here. But, you know, to some extent, I think 756 00:41:36,000 --> 00:41:40,600 Speaker 1: it's useful to consider these. 
But 757 00:41:40,719 --> 00:41:42,920 Speaker 1: also I feel like it gets into the 758 00:41:43,000 --> 00:41:47,320 Speaker 1: idea that when we're dealing with the trajectory of 759 00:41:47,320 --> 00:41:51,439 Speaker 1: a human life, you know, part of it 760 00:41:51,480 --> 00:41:54,480 Speaker 1: perhaps comes down to just our ability to forecast 761 00:41:54,520 --> 00:41:56,880 Speaker 1: the future, our ability to engage in 762 00:41:56,920 --> 00:42:00,440 Speaker 1: long-term planning. Like, as humans, we're generally not 763 00:42:00,520 --> 00:42:03,040 Speaker 1: that good at that. We're certainly not good at planning 764 00:42:03,040 --> 00:42:05,680 Speaker 1: beyond, you know, the scope of a human life, 765 00:42:05,719 --> 00:42:08,160 Speaker 1: but even beyond the scope of 766 00:42:08,200 --> 00:42:10,920 Speaker 1: a few years. We're better at the short-term goals, 767 00:42:11,360 --> 00:42:14,640 Speaker 1: and it's only with considerable effort that we get 768 00:42:14,640 --> 00:42:18,439 Speaker 1: better at considering long-term goals. So I think 769 00:42:18,440 --> 00:42:21,160 Speaker 1: that's important to keep in mind in all of this. Now, 770 00:42:21,239 --> 00:42:23,320 Speaker 1: one thing that I think is also interesting in Miller's 771 00:42:23,320 --> 00:42:26,319 Speaker 1: writings is that he talks a bit about 772 00:42:26,400 --> 00:42:30,040 Speaker 1: overconfidence, you know, as a symptom of underlying issues, 773 00:42:30,440 --> 00:42:34,479 Speaker 1: as opposed to being like an intrinsic quality. 774 00:42:34,800 --> 00:42:39,640 Speaker 1: He writes: Unfortunately, configuration and synergy are usually attained at 775 00:42:39,640 --> 00:42:43,520 Speaker 1: the cost of myopia. Stellar performers view the world through 776 00:42:43,600 --> 00:42:47,640 Speaker 1: narrowing telescopes. 
One point of view takes over, one set 777 00:42:47,640 --> 00:42:51,640 Speaker 1: of assumptions comes to dominate. The result is complacency and 778 00:42:51,760 --> 00:42:54,920 Speaker 1: overconfidence. And I think that plays into a lot 779 00:42:54,920 --> 00:42:58,080 Speaker 1: of what we spoke about earlier. You know... Sorry, 780 00:42:58,120 --> 00:42:59,840 Speaker 1: I'm trying to make sense of it. Given that this 781 00:43:00,000 --> 00:43:02,279 Speaker 1: has the word synergy in it, I shouldn't know what 782 00:43:02,320 --> 00:43:04,880 Speaker 1: that means. By now I've purged it from my brain. 783 00:43:05,320 --> 00:43:07,880 Speaker 1: Oh, okay, I see now. Yeah, I'm looking at the quote. Okay, 784 00:43:07,880 --> 00:43:11,160 Speaker 1: so no, this is a very standard thing. 785 00:43:11,200 --> 00:43:15,160 Speaker 1: You know, if you have successfully hammered several nails in 786 00:43:15,480 --> 00:43:18,280 Speaker 1: and you've still got the hammer, everything really starts 787 00:43:18,320 --> 00:43:21,959 Speaker 1: to look like a nail. Just because, like, if you've 788 00:43:21,960 --> 00:43:25,440 Speaker 1: had success with a strategy in the past, you don't switch. 789 00:43:25,600 --> 00:43:27,920 Speaker 1: You just keep doing what you've done before, even if 790 00:43:27,960 --> 00:43:29,799 Speaker 1: it doesn't make sense anymore. Like, yeah, this is 791 00:43:29,920 --> 00:43:32,319 Speaker 1: what works, this is what our product is, 792 00:43:32,880 --> 00:43:35,400 Speaker 1: and this is what we're going to stick to. 
And 793 00:43:35,560 --> 00:43:37,280 Speaker 1: then, yeah, he talks a good bit 794 00:43:37,520 --> 00:43:40,600 Speaker 1: about overconfidence as just a result of success, 795 00:43:40,719 --> 00:43:44,600 Speaker 1: writing, quote: Failure teaches leaders valuable lessons, but good results 796 00:43:44,640 --> 00:43:48,440 Speaker 1: only reinforce their preconceptions and tether them more firmly to 797 00:43:48,560 --> 00:43:53,319 Speaker 1: their tried and true recipes. Success also makes managers overconfident, 798 00:43:53,400 --> 00:43:56,239 Speaker 1: more prone to excess and neglect, and more given to 799 00:43:56,280 --> 00:43:59,520 Speaker 1: shape strategies to reflect their own preferences rather than those 800 00:43:59,560 --> 00:44:02,600 Speaker 1: of the customers. And, you know, he also 801 00:44:02,680 --> 00:44:05,920 Speaker 1: points to one of the key aspects of the Icarus 802 00:44:05,920 --> 00:44:11,000 Speaker 1: paradox being that overconfident, complacent executives extend the very factors 803 00:44:11,040 --> 00:44:14,920 Speaker 1: that contributed to success to the point where they cause decline. 804 00:44:15,280 --> 00:44:18,040 Speaker 1: So the thing that's working, you know, the button 805 00:44:18,080 --> 00:44:21,160 Speaker 1: we're pushing that is leading to success, let's just really 806 00:44:21,239 --> 00:44:24,040 Speaker 1: jam that sucker. You're like the rat in the experiment 807 00:44:24,080 --> 00:44:27,960 Speaker 1: that keeps pushing the cocaine button. Yeah. So he summarizes 808 00:44:28,000 --> 00:44:30,880 Speaker 1: that there are really two aspects of the Icarus paradox. 809 00:44:30,960 --> 00:44:33,160 Speaker 1: One is that success can lead to failure via the 810 00:44:33,160 --> 00:44:36,239 Speaker 1: fostering of overconfidence and other factors. 
And two, the 811 00:44:36,320 --> 00:44:39,320 Speaker 1: aspects of a business that bring success can also hasten 812 00:44:39,400 --> 00:44:42,960 Speaker 1: failure, or, quote, the very causes of success, when extended, 813 00:44:43,239 --> 00:44:46,719 Speaker 1: may become the causes of failure. And as far as, 814 00:44:47,120 --> 00:44:50,160 Speaker 1: you know, ways to fight these transformations, because that's 815 00:44:50,200 --> 00:44:52,520 Speaker 1: one question I had: is it just the trajectory, 816 00:44:52,520 --> 00:44:55,759 Speaker 1: is this just what happens? Like, things can't just 817 00:44:55,800 --> 00:45:01,240 Speaker 1: go up forever, and a business cannot just exist, you know, indefinitely. 818 00:45:01,320 --> 00:45:05,399 Speaker 1: Things have to die, right? I mean, these corporations 819 00:45:05,400 --> 00:45:08,440 Speaker 1: are not like us mortals, but they are given to 820 00:45:09,000 --> 00:45:12,560 Speaker 1: life and death. They're not eternal. And he 821 00:45:12,760 --> 00:45:15,520 Speaker 1: argues that there are ways to fight these transformations. 822 00:45:15,920 --> 00:45:20,320 Speaker 1: He suggests self-reflection and intelligence gathering that guards against excess, 823 00:45:20,440 --> 00:45:24,600 Speaker 1: overconfidence, and irrelevance. And this, I think, matches 824 00:45:24,640 --> 00:45:26,239 Speaker 1: up with what we've been talking about so far on 825 00:45:26,280 --> 00:45:29,640 Speaker 1: the individual level. Like, if you think you're great, 826 00:45:29,680 --> 00:45:32,719 Speaker 1: take a step back and 827 00:45:32,719 --> 00:45:35,200 Speaker 1: question whether you actually 828 00:45:35,200 --> 00:45:37,080 Speaker 1: are, and to what extent you are. 
What else could 829 00:45:37,080 --> 00:45:39,240 Speaker 1: you be doing to ensure that you were actually 830 00:45:39,280 --> 00:45:42,319 Speaker 1: living up to your overestimation of self? If you think 831 00:45:42,320 --> 00:45:44,960 Speaker 1: it's true about yourself, prove it, prove it to yourself. Yeah, 832 00:45:45,000 --> 00:45:47,840 Speaker 1: I have to say, Miller is 833 00:45:47,840 --> 00:45:49,719 Speaker 1: a really good writer about this, because normally I'm 834 00:45:49,719 --> 00:45:54,120 Speaker 1: not interested in business-culture-type stuff like this. But 835 00:45:54,120 --> 00:45:55,640 Speaker 1: he does a great job of tying it 836 00:45:55,680 --> 00:45:59,320 Speaker 1: back into just the basic human scenario as well. Like, 837 00:45:59,320 --> 00:46:02,720 Speaker 1: he points out that excellence in any human endeavor, 838 00:46:02,760 --> 00:46:05,160 Speaker 1: be it arts or sports or what have you, you know, 839 00:46:05,160 --> 00:46:07,560 Speaker 1: tends to come at a price. We cannot excel 840 00:46:07,680 --> 00:46:10,759 Speaker 1: at everything. We have to make sacrifices and choose what's 841 00:46:10,800 --> 00:46:13,520 Speaker 1: important. A middle-of-the-road 842 00:46:13,560 --> 00:46:17,280 Speaker 1: or jack-of-all-trades approach is not going to lead to greatness, 843 00:46:17,719 --> 00:46:20,120 Speaker 1: you know, unless it's a story or some fable where 844 00:46:20,160 --> 00:46:26,040 Speaker 1: greatness is just thrust upon somebody. You know, there's 845 00:46:26,080 --> 00:46:28,640 Speaker 1: got to be some sort of trade-off there. 846 00:46:28,960 --> 00:46:30,719 Speaker 1: And he says it goes for the individual, but it 847 00:46:30,719 --> 00:46:33,720 Speaker 1: also applies to companies as well. 
You can only sharpen 848 00:46:33,800 --> 00:46:36,399 Speaker 1: your blade if you first realize that it is dull 849 00:46:36,480 --> 00:46:38,839 Speaker 1: and must be sharpened, you know. Like, if you 850 00:46:39,000 --> 00:46:42,359 Speaker 1: believe yourself eternally and infinitely sharp, you're not 851 00:46:42,400 --> 00:46:44,680 Speaker 1: going to do the sharpening, right. And it's interesting to 852 00:46:44,719 --> 00:46:46,680 Speaker 1: come back and think about the individual level and think 853 00:46:46,680 --> 00:46:49,880 Speaker 1: about overconfidence in this scenario. Like, one basic 854 00:46:49,920 --> 00:46:52,799 Speaker 1: trope he mentions is the idea of, say, you know, the 855 00:46:52,920 --> 00:46:56,160 Speaker 1: artist who neglects their family to focus on their art, 856 00:46:56,280 --> 00:46:58,719 Speaker 1: or a business person who does the same thing. In 857 00:46:58,760 --> 00:47:02,880 Speaker 1: those scenarios, you know, I guess, 858 00:47:03,120 --> 00:47:05,440 Speaker 1: maybe in a warped sense, or in a way that 859 00:47:05,440 --> 00:47:08,520 Speaker 1: caters to their prime focus, might it 860 00:47:08,520 --> 00:47:11,080 Speaker 1: be beneficial to just think, oh, I'm a good dad, I'm 861 00:47:11,080 --> 00:47:14,400 Speaker 1: a good husband, even when they're not? But it enables 862 00:47:14,400 --> 00:47:16,680 Speaker 1: them to then put more of their 863 00:47:16,680 --> 00:47:20,080 Speaker 1: resources into the pursuit of the thing that ultimately 864 00:47:20,120 --> 00:47:22,840 Speaker 1: matters to them, like, you know, cold hard cash, 865 00:47:23,000 --> 00:47:27,000 Speaker 1: or the pursuit of art. Yeah. 
Well, so to be clear, 866 00:47:27,160 --> 00:47:30,320 Speaker 1: obviously this wouldn't mean that it's actually useful in being 867 00:47:30,360 --> 00:47:32,839 Speaker 1: a good person or having a good life, but it may 868 00:47:33,120 --> 00:47:36,920 Speaker 1: very well be useful, in focusing on whatever it is 869 00:47:36,960 --> 00:47:40,160 Speaker 1: that really matters to you, to be overconfident about 870 00:47:40,239 --> 00:47:43,880 Speaker 1: your crappy efforts in other areas of life. Yeah, I 871 00:47:44,160 --> 00:47:47,000 Speaker 1: think that's entirely true. So it's, yeah, it's interesting 872 00:47:47,000 --> 00:47:49,400 Speaker 1: to compare the corporate version of this and 873 00:47:49,440 --> 00:47:51,719 Speaker 1: the individual version of this. And certainly, if you want 874 00:47:51,719 --> 00:47:54,840 Speaker 1: to read more about this idea of the Icarus paradox, 875 00:47:55,000 --> 00:47:58,400 Speaker 1: I certainly recommend checking out more of his writings. 876 00:47:58,440 --> 00:48:01,080 Speaker 1: So we have all these economic metaphors, like, you know, 877 00:48:01,160 --> 00:48:04,080 Speaker 1: from Adam Smith, the invisible hand or whatever. I feel 878 00:48:04,120 --> 00:48:07,560 Speaker 1: like we need an economic metaphor of the nemesis, like 879 00:48:07,600 --> 00:48:10,799 Speaker 1: the Nemesis that is this force in the market that 880 00:48:10,840 --> 00:48:15,640 Speaker 1: swoops in to punish hubris and overconfidence in business. Yeah, sometimes 881 00:48:15,680 --> 00:48:20,360 Speaker 1: it seems like Nemesis is a little 882 00:48:20,840 --> 00:48:22,960 Speaker 1: resistant to doing that. I don't know. I mean, part 883 00:48:22,960 --> 00:48:24,840 Speaker 1: of it comes down again to the fact that 884 00:48:25,360 --> 00:48:30,080 Speaker 1: businesses and corporations are less mortal. 
Well, yeah, 885 00:48:30,120 --> 00:48:32,680 Speaker 1: I mean, it is funny that we've discussed in a 886 00:48:32,719 --> 00:48:35,640 Speaker 1: lot of these episodes how overconfidence can both lead to 887 00:48:35,840 --> 00:48:39,759 Speaker 1: disaster and negative outcomes, but can also in some 888 00:48:39,840 --> 00:48:44,040 Speaker 1: cases be highly rewarded and be very lucrative. Yeah. 889 00:48:44,840 --> 00:48:49,040 Speaker 1: You know, I'm also reminded, talking about, you know, curbing 890 00:48:49,160 --> 00:48:52,520 Speaker 1: overconfidence or the perception of overconfidence, of making statements that 891 00:48:52,600 --> 00:48:55,520 Speaker 1: cannot be put to the test. You of course see 892 00:48:55,520 --> 00:48:57,920 Speaker 1: that a lot in the business world. You know that 893 00:48:58,480 --> 00:49:01,200 Speaker 1: if you're saying you're the best car dealership in the galaxy, 894 00:49:01,880 --> 00:49:03,640 Speaker 1: you know that you can get away with that. Say you're 895 00:49:03,680 --> 00:49:07,400 Speaker 1: the best car dealership in town, well, then people can say, well, 896 00:49:07,440 --> 00:49:09,800 Speaker 1: let's see your sales numbers, let's compare you to the dealerships 897 00:49:09,800 --> 00:49:11,879 Speaker 1: across town. No, even that would be easy to get 898 00:49:11,880 --> 00:49:13,400 Speaker 1: away with. Like, the one that would be hard to 899 00:49:13,400 --> 00:49:15,520 Speaker 1: get away with is saying, like, we have the lowest 900 00:49:15,640 --> 00:49:18,400 Speaker 1: prices in town, something like that. Then you're stuck: 901 00:49:18,800 --> 00:49:21,480 Speaker 1: either that's true or it's not, right. 
902 00:49:22,239 --> 00:49:24,320 Speaker 1: But then again, on the personal level, 903 00:49:24,360 --> 00:49:26,440 Speaker 1: you know, you can have your mug that says World's 904 00:49:26,440 --> 00:49:29,480 Speaker 1: Greatest Dad, and nobody's gonna call you on that. What, 905 00:49:29,600 --> 00:49:33,560 Speaker 1: you're gonna get out your dad ruler and measure me? Yeah, 906 00:49:33,680 --> 00:49:36,120 Speaker 1: all right. So I think we've reached the end 907 00:49:36,160 --> 00:49:39,239 Speaker 1: of our discussion here for the week on overconfidence, 908 00:49:39,880 --> 00:49:41,799 Speaker 1: but clearly there's a lot of material here. 909 00:49:42,080 --> 00:49:44,799 Speaker 1: It will be interesting to hear back from listeners, because 910 00:49:44,800 --> 00:49:46,560 Speaker 1: I think we all have some perspective on this. We 911 00:49:46,600 --> 00:49:50,279 Speaker 1: all have experience with overconfidence in others, or 912 00:49:50,719 --> 00:49:54,040 Speaker 1: certainly overconfidence in ourselves, or the management of 913 00:49:54,080 --> 00:49:56,960 Speaker 1: overconfidence in ourselves, and so we'd love to hear 914 00:49:56,960 --> 00:49:59,640 Speaker 1: everyone's thoughts on this. In the meantime, if you want 915 00:49:59,640 --> 00:50:01,640 Speaker 1: to check out other episodes of Stuff to Blow Your Mind, 916 00:50:02,040 --> 00:50:05,160 Speaker 1: you can find them wherever you find your podcasts. If 917 00:50:05,200 --> 00:50:06,840 Speaker 1: you go to Stuff to Blow Your Mind dot com, 918 00:50:06,880 --> 00:50:09,320 Speaker 1: that will lead you over to the iHeart listing 919 00:50:09,360 --> 00:50:12,440 Speaker 1: for our show, but you can find us anywhere, and 920 00:50:12,480 --> 00:50:15,360 Speaker 1: wherever that happens to be, just make sure you rate, review, 921 00:50:15,440 --> 00:50:18,920 Speaker 1: and subscribe. 
That really helps us out. Huge thanks, as 922 00:50:18,920 --> 00:50:22,239 Speaker 1: always, to our excellent audio producer Seth Nicholas Johnson. If 923 00:50:22,280 --> 00:50:23,600 Speaker 1: you would like to get in touch with us with 924 00:50:23,680 --> 00:50:26,360 Speaker 1: feedback on this episode or any other, to suggest a topic 925 00:50:26,400 --> 00:50:28,760 Speaker 1: for the future, or just to say hey, you can email 926 00:50:28,840 --> 00:50:39,239 Speaker 1: us at contact at Stuff to Blow Your Mind dot com. 927 00:50:39,320 --> 00:50:41,160 Speaker 1: Stuff to Blow Your Mind is a production of iHeart 928 00:50:41,239 --> 00:50:43,560 Speaker 1: Radio's How Stuff Works. For more podcasts from iHeart 929 00:50:43,640 --> 00:50:46,600 Speaker 1: Radio, visit the iHeartRadio app, Apple Podcasts, or wherever 930 00:50:46,640 --> 00:51:04,040 Speaker 1: you listen to your favorite shows.