Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's How Stuff Works. Hey, you, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick, and we're back with part two of our discussion of overconfidence. That's right. If you did not listen to the previous episode, do go back and listen to that episode, because that's where we lay the groundwork. We discuss overconfidence and hubris in mythology and human history, and then get into the psychology of it and what various psychological studies have revealed and continue to reveal about the nature of overconfidence, and how we can divide this sort of amorphous concept of overconfidence out into categories that can be more easily studied and understood. That's right. Now, in the last episode, one of the main things we talked about was this huge new review of the scientific literature on something known as the better-than-average effect, which is the tendency for people to rate themselves as better than average with respect to their peers on all kinds of stuff.
One classic example is that something like ninety-three percent of people think they're a better-than-average driver. So if you're listening to this as you drive, eyes back on the road, and make sure you use your turn signals. Turn signals save lives. Turn signals let other drivers and pedestrians know what you intend to do. Even if you think you're a great driver, drive like you're less good than you are, and it will make you a better driver. Drive like you can't see all the other cars and pedestrians around you, because sometimes you cannot. Drive like you're driving a murder weapon, because potentially you are. It's quite true. All right. Now, one of the things we talked about in the last episode was a paper from 2017 by Don Moore and Derek Schatz called "The Three Faces of Overconfidence," which actually broke overconfidence down into three distinct categories of bias or misperception, and we talked about those a little bit last time.
We're going to be exploring more of what that paper had to say, and its critiques of overconfidence research, specifically with reference to these three types of overconfidence. As a brief refresher, the three types are overestimation, overplacement, and overprecision. Overestimation is thinking that you're better than you are, and this would be with reference to some kind of, you know, objective measure out in the world. So if you think that you are taller than you are, if you think that you can jump higher than you can, if you think that you would get a better score on a test than you actually could, that's overestimation. The next one, overplacement, is similar, but instead it's comparing yourself with other people. So the better-than-average effect would be an example of overplacement. It's thinking you are better than average compared to your peers at some task, or it would be thinking that you work harder than other people, or thinking that you are smarter than other people. Of course, that's only overconfidence if those are not actually accurate assessments.
And then finally, the other one would be overprecision, which is being too sure that you know the truth. Again, this might be called epistemic overconfidence. It's just being too certain that your beliefs are correct. Now, to get further into Moore and Schatz's paper: one of the questions that they address is what actually drives some of these different effects as they are manifested. So they start with overestimation. What causes us to, say, think we would get a better score on a test than we do, or to think we have more money in the bank than we do? A common answer that people give to this is the idea of wishful thinking: it would feel good if this were true, therefore I believe it. Right. The authors don't think that this explanation is very plausible, and they offer several problems with it, and we can interrogate these, maybe disagree with them as we go on. But first of all, they say, you know, self-delusion is demonstrably maladaptive. For example, a tendency toward wishful thinking about the safety of kissing sharks, with tongue, is not a trait that the environment will tend to select for.
People overconfident about their academic abilities will tend not to study and actually do worse. People who believe themselves invulnerable will take risks that sometimes get them killed. This might seem obvious, but there is actually plenty of research on this. I mean, people who are overconfident about their abilities do face a lot of downsides when those abilities are put to the test. Yeah, I mean, one example from literature that comes to mind is that of Macbeth, who believes himself protected by prophecy and then, of course, snuffs it. Yeah, exactly. But then again, I think, okay, so it is true that these people will face a lot of downsides. But then again, people do engage in self-destructive, self-deluded behavior all the time. I mean, this is a common feature of human life. Yeah. I mean, for instance, we were just recently talking about the placebo effect on our movie episode where we talked about The Fly, and about the possibility that the placebo effect is basically due to, you know, this innate tendency toward self-delusion that may very well be adaptive.
At least in this scenario, we benefit from being able to believe something is going to work, and from experiencing at least a small physical benefit from it, like a small curative benefit. And then, you know, I also can't help but think that self-delusion entails far more than just overconfidence. It also entails all manner of paranoia. And there is a strong case for the adaptive nature of, say, making a type one error in cognition, a false positive: the belief that the rustle in the tall grass is that of a tiger when it's not. Because if you make the type two error instead, the false negative, you're more likely to be eaten by the tiger. Right, right. Yeah, having accurate information about the world is actually very useful, and having inaccurate information can kill you. Yeah, but I'm not so much, you know, trying to disagree with the maladaptive-self-delusion argument that we mentioned earlier, but rather, you know, to point out that the human experience is rife with self-delusions.
So might a 114 00:06:21,279 --> 00:06:25,279 Speaker 1: dash of overconfidence, even in the form of overestimation, served 115 00:06:25,320 --> 00:06:28,040 Speaker 1: to balance outh this alchemy of you know, of our 116 00:06:28,120 --> 00:06:31,839 Speaker 1: perception of reality. For example, so you have a karaoke singer, 117 00:06:32,000 --> 00:06:35,880 Speaker 1: and granted karaoke is very low stakes, but you could 118 00:06:35,920 --> 00:06:40,039 Speaker 1: involve social embarrassment, which you could fear would lead to ostracism, 119 00:06:40,120 --> 00:06:42,960 Speaker 1: and that's actually one of the most powerful negative motivators 120 00:06:42,960 --> 00:06:46,080 Speaker 1: on human behavior. Right, But again, at karaoke is also 121 00:06:46,120 --> 00:06:47,919 Speaker 1: one of these things where like sometimes it's cool to 122 00:06:47,920 --> 00:06:50,240 Speaker 1: do it badly. So this is not a perfect example, 123 00:06:50,600 --> 00:06:53,600 Speaker 1: but so you have a kara karaoke singer that imbibes 124 00:06:53,640 --> 00:06:56,480 Speaker 1: in a little liquid courage before taking the microphone, as 125 00:06:56,680 --> 00:07:01,040 Speaker 1: most karaoke participants are are are wont to do. Uh, 126 00:07:01,040 --> 00:07:03,240 Speaker 1: but yeah, they get a little liquid courage because they 127 00:07:03,279 --> 00:07:05,240 Speaker 1: know they don't have the greatest voice in the world. 128 00:07:05,320 --> 00:07:07,440 Speaker 1: And then they feel a little awkward getting up there. 129 00:07:07,440 --> 00:07:09,560 Speaker 1: But but they know that a little bit of booze 130 00:07:09,560 --> 00:07:12,800 Speaker 1: induced over confidence might help matters. 
I think you're exactly 131 00:07:12,880 --> 00:07:14,800 Speaker 1: right there, and this this is funny to start here 132 00:07:14,840 --> 00:07:17,239 Speaker 1: because I think while the authors make tons of good points, 133 00:07:17,240 --> 00:07:18,720 Speaker 1: this is one of the ones they make that I 134 00:07:18,800 --> 00:07:21,840 Speaker 1: might disagree with the most. I think that there are 135 00:07:21,960 --> 00:07:26,560 Speaker 1: antagonistic adaptations in human behavior. One pressure might favor having 136 00:07:26,640 --> 00:07:29,120 Speaker 1: an accurate picture of the world, assessing things in a 137 00:07:29,200 --> 00:07:33,040 Speaker 1: clear and accurate way, while a cross pressure favor self deception, 138 00:07:33,440 --> 00:07:36,920 Speaker 1: especially self deception in the form of over confidence. For example, 139 00:07:37,440 --> 00:07:40,280 Speaker 1: you might be more likely to survive if you have 140 00:07:40,480 --> 00:07:43,640 Speaker 1: accurate assessments of your own abilities, but you might be 141 00:07:43,760 --> 00:07:47,640 Speaker 1: more likely to take big risks with potentially big rewards 142 00:07:47,960 --> 00:07:52,480 Speaker 1: if you overestimate your abilities or self delusional. Over Confidence 143 00:07:52,480 --> 00:07:55,680 Speaker 1: could be adaptive because it helps us persuade or even 144 00:07:55,720 --> 00:08:00,520 Speaker 1: deceive other people about our worth. Yeah, Ultimately you have 145 00:08:00,560 --> 00:08:03,200 Speaker 1: to you have to believe in yourself if you know 146 00:08:03,240 --> 00:08:05,160 Speaker 1: other people are not going to believe in you for you, 147 00:08:05,240 --> 00:08:07,320 Speaker 1: right right. 
I mean, we talked in the last episode about how it's probably not a coincidence that you really often notice overconfidence in people who occupy high-status leadership roles, and in how they got there. I mean, it's not hard to imagine that overconfidence helped them get to that point. Yeah, and it's sometimes a fun, sometimes terrifying exercise when you engage with people like this and you realize, oh, they're just really overconfident. That's not to say they're not skilled, but sometimes they're not. And sometimes you realize, oh, there is this gap between ability and what they're saying they're going to deliver on, or what they are estimating the future will consist of. Yeah, I mean, it is kind of shocking how often in life you will suddenly come to a realization that, you know, the boss or the leader or whoever's main skill is BSing, like they can just go out there and wing it in a way that you would be too timid and reserved to do.
Right. Now, this idea of, you know, accurate assessments of our own abilities, I couldn't help but think of the film Butch Cassidy and the Sundance Kid in this scenario, because it relates to two specific points in the film. One is the whole "would you make that jump if you didn't have to" scenario, where they're being tracked, they're being hunted, and they've come to this cliff overlooking a river, and they realize that if they jump off this cliff and they land in that river and they don't die, they'll get away, because the stakes are such that those pursuing them will not follow them; they will not make that jump if they don't need to. So there's that, and then at the very end there's kind of a going-out-the-old-fashioned-way, guns-ablazing scenario, where they're cornered, they're going to slowly be killed, and they decide to just go for it, to just bust out shooting and fight. Right.
So the incentives, like the evolutionary incentives, on a brain generating accurate pictures of the world versus self-deluded overconfidence, those could very well be just the contrast between a low-risk, low-reward strategy and a high-risk, high-reward strategy. Right. Yeah, the first example is definitely high risk, high reward. Like, it was pretty much their only, or at least their best, option for survival at that point, and they took it, and in the film they survive. At the end of the film, it's pretty much implied that they die. But at the same time, it still seems to be their best option; if not their best option for surviving, at least the psychological best option. You know, are we gonna stay in here and die like rats, or are we gonna, you know, just burst out there and die like heroes in a film that is named after them? You know, man, that's a great movie.
I want to go back and watch it. Gosh, I barely remember it, except the end. I mean, the ending is kind of a downer, but yeah, it's surprisingly sweet for a violent outlaw movie. Yeah, it's a good one. And you know, I mentioned Macbeth earlier and the whole idea of, you know, him draping himself in prophecy and using that to pump himself up. But doesn't that bring to mind the role of religion in all of this? You know, I mean, certainly a lot of the things that religion can do to your estimation of ability can revolve around, you know, the survivability of the soul, for example, and, like, what will happen if I act a certain way in life. Yeah, and I think there could possibly be cross pressures going the same way with that. I mean that there are some evolutionary drawbacks and some advantages to it. Right.
And then of course, that's not to say that religious motivations, you know, exist free of social ones. Of course, I guess, you know, there's going to be a rich interplay between those, and that's, you know, something that comes up, for instance, when you look at studies of, say, suicide bombers, where on one hand, you can look at it and just go with the simple scenario of, oh, here's a person who believes that if they die doing this act, then they'll be rewarded in the afterlife. But then behind that there's a whole social scenario as well, of other humans, you know, telling them that this is the thing to do, et cetera. Yeah, motivations are a rich stew of many different influences. I mean, it's usually hard to nail down a single inciting incident or cause that leads people down a path in life. And in fact, I think a lot of times, even when people do that with themselves and say, this was the reason I became whatever, or did whatever, I think a lot of times they're wrong about themselves.
Yeah, so basically self-delusion; we're all just houses of cards, just ready to be knocked down at any point. Well, but there's another way of putting it. Now, one thing you and I could be getting wrong here is whether we're properly talking about self-delusion or some other type of bias or misperception in the brain. Because the authors here are saying, okay, self-delusion specifically: maybe self-delusion implies that there's a sort of transformation going on somewhere in the brain, like the brain gets accurate information about the world and then just somehow presents it to the conscious mind in a skewed way. The authors here think that, especially if you're talking about wishful thinking as the brand of self-delusion, you know, getting false perceptions about the world in order to feel better, that doesn't really work from an unconscious-mind-to-conscious-mind model, because emotions and moods also seem to emerge from the unconscious mind, not from the conscious mind.
But then there's another thing they go to, which is that they argue the empirical evidence for true self-deception in overestimation is actually kind of weak and kind of mixed. Why would this be? Well, first of all, they say, it's hard to separate true self-deception from attempts to deceive others, including the researchers. So how can you tell when somebody truly overestimates their own traits or abilities, versus when they just tell you that they think their traits or abilities are better than they are? In a lot of cases, both would manifest equally as outward overconfidence. Now, you can come up with some methodologies and some tests to try to get around this. Like, you can make people bet sums of money where the outcome of the bet would depend on how good they actually are at a task or something. But in a lot of cases, they say, it's hard to tell the difference between true self-deception and just attempts to deceive other people. Another thing they point out is that you don't actually have to be deceiving yourself to overestimate your abilities.
274 00:14:53,040 --> 00:14:56,880 Speaker 1: You could be genuinely completely ignorant of the fact that 275 00:14:56,920 --> 00:14:59,440 Speaker 1: you're not as good as you think you are. Uh. 276 00:14:59,440 --> 00:15:02,680 Speaker 1: And here's in place that the famous Dunning Krueger effect 277 00:15:02,760 --> 00:15:05,920 Speaker 1: comes in. Now, you may have heard about the Dunning 278 00:15:05,960 --> 00:15:09,360 Speaker 1: Krueger effect, but very short sketch on it. Of course, 279 00:15:09,400 --> 00:15:11,440 Speaker 1: overlaps with a lot of what we're talking about today. 280 00:15:11,920 --> 00:15:16,240 Speaker 1: Participants less skilled in a task or subject area can 281 00:15:16,280 --> 00:15:20,440 Speaker 1: be prone to show even greater overestimation of their abilities 282 00:15:20,480 --> 00:15:24,120 Speaker 1: in that skill or subject area. So with some skills, 283 00:15:24,160 --> 00:15:28,880 Speaker 1: the worst you are, the more you overestimate your awesomeness. Now, 284 00:15:29,080 --> 00:15:31,320 Speaker 1: why why on earth would this be? Well, the authors 285 00:15:31,360 --> 00:15:34,240 Speaker 1: here mentioned this could just simply come from your low 286 00:15:34,360 --> 00:15:38,080 Speaker 1: skills providing you with a poor frame of reference. You 287 00:15:38,160 --> 00:15:42,080 Speaker 1: don't know enough about this task or skill or subject 288 00:15:42,160 --> 00:15:46,200 Speaker 1: area to even understand how much you don't know, so 289 00:15:46,320 --> 00:15:49,400 Speaker 1: like the Dunning Krueger effect would show not self deception 290 00:15:49,760 --> 00:15:54,160 Speaker 1: but genuine ignorance. You lack enough information to understand how 291 00:15:54,240 --> 00:15:57,400 Speaker 1: bad you're failing. 
Like, I think a good example of this is, you know, you read one theory about some phenomenon, and it can be rather convincing. It can be so convincing that you think, well, this is it; they made a great case. But if you don't actually look at some of the other theories out there, or look at, you know, writings or pieces that actually compare them, or do some sort of meta-analysis, then you don't really have a proper frame of reference. I wouldn't even say a perfect frame of reference, but even, say, a healthy frame of reference.
Yes. So, like, you read one article about a subject and then you're an expert, and then you start reading more and you realize, oh, wait a minute, you know, there's so much I don't understand, and your estimation of your own expertise drops sharply after that. Yeah, you might realize, oh, well, there are other theories, or you might realize, oh, well, this was just one person's summary of this particular theory, and on top of that, perhaps they had a particular axe to grind in writing it, et cetera. Yeah, that's a great example. I mean, that's not to say that you should doubt everything you read, but yeah, you should have healthy doubt. Not, you know, denialism, but, you know, just be aware that you don't know everything, and you should be especially suspicious when you have dipped your toes into a subject and now feel that you fully understand it. And we say that as professional toe-dippers. Now, finally, they point out the empirical evidence for wishful thinking itself, in general, as a psychological phenomenon.
They say, this is 324 00:17:30,000 --> 00:17:33,840 Speaker 1: not actually strong. Uh. If there were strong evidence 325 00:17:33,920 --> 00:17:37,439 Speaker 1: for wishful thinking, wouldn't it be the case that more 326 00:17:37,480 --> 00:17:42,240 Speaker 1: desirable outcomes would be more strongly believed? And they say, no, 327 00:17:42,840 --> 00:17:45,240 Speaker 1: studies that try to test this out do not find 328 00:17:45,280 --> 00:17:47,159 Speaker 1: this to be the case. It's not the case that 329 00:17:47,280 --> 00:17:49,720 Speaker 1: the more you want something, the more you believe it 330 00:17:49,760 --> 00:17:52,719 Speaker 1: to be true. And there are only a few types 331 00:17:52,760 --> 00:17:55,320 Speaker 1: of scenarios where there's any evidence of this at all, 332 00:17:55,640 --> 00:17:59,119 Speaker 1: such as scenarios where all outcomes are equally likely, like 333 00:17:59,160 --> 00:18:02,040 Speaker 1: a dice roll or something. Now that is interesting to 334 00:18:02,080 --> 00:18:05,639 Speaker 1: think of in terms of Dungeons and Dragons, where frequently 335 00:18:05,720 --> 00:18:07,920 Speaker 1: one is either making an attack or attempting some sort 336 00:18:07,960 --> 00:18:10,960 Speaker 1: of act that requires a skill check. 337 00:18:11,680 --> 00:18:15,080 Speaker 1: And I find myself doing this. You go up 338 00:18:15,119 --> 00:18:17,439 Speaker 1: there and you begin to explain what your character is 339 00:18:17,480 --> 00:18:20,040 Speaker 1: going to do as if you hit that natural twenty. 340 00:18:20,400 --> 00:18:23,600 Speaker 1: That's kind of the expectation. Yeah. So I find myself engaging 341 00:18:23,600 --> 00:18:26,359 Speaker 1: in a lot of that level of overconfidence with my 342 00:18:26,440 --> 00:18:29,119 Speaker 1: character, because ultimately it all comes down to the roll 343 00:18:29,200 --> 00:18:31,919 Speaker 1: of the dice.
You know, unless I'm trying to, you know, 344 00:18:32,840 --> 00:18:35,159 Speaker 1: leap off of the Demogorgon's head or something, that 345 00:18:35,280 --> 00:18:38,399 Speaker 1: is going to be extremely difficult because there are going to 346 00:18:38,480 --> 00:18:42,480 Speaker 1: be additional numerical, you know, values 347 00:18:42,520 --> 00:18:45,120 Speaker 1: added or subtracted from the attempt. You know, ultimately it's still gonna 348 00:18:45,119 --> 00:18:47,760 Speaker 1: be one to twenty, one being, uh, you know, a 349 00:18:47,800 --> 00:18:50,639 Speaker 1: pretty much complete fail. Uh, you know, that's gonna be 350 00:18:50,680 --> 00:18:52,960 Speaker 1: the one where you slip and stab yourself with your 351 00:18:52,960 --> 00:18:55,640 Speaker 1: own sword, or it's gonna be that natural twenty, which 352 00:18:55,720 --> 00:18:58,200 Speaker 1: is going to be, you know, the wonder hit where 353 00:18:58,200 --> 00:19:01,959 Speaker 1: you do extra damage. That is a fantastic example. I 354 00:19:02,040 --> 00:19:04,879 Speaker 1: was trying to think of cases where I thought I 355 00:19:04,960 --> 00:19:07,919 Speaker 1: really did engage in wishful thinking, and I couldn't think of any. 356 00:19:07,960 --> 00:19:10,800 Speaker 1: I'm sure I do sometimes. But yeah, they say it's 357 00:19:10,840 --> 00:19:12,920 Speaker 1: not actually as common as people think it is. And 358 00:19:13,200 --> 00:19:15,640 Speaker 1: here's maybe one case. Yeah, I think in Dungeons and Dragons, 359 00:19:15,680 --> 00:19:18,040 Speaker 1: I have yet to meet a player or be a 360 00:19:18,080 --> 00:19:21,360 Speaker 1: player that does not engage in wishful thinking.
Every 361 00:19:21,359 --> 00:19:23,760 Speaker 1: time you roll the dice. Like, nobody rolls the 362 00:19:23,800 --> 00:19:25,520 Speaker 1: dice and says, all right, this is how I'm 363 00:19:25,520 --> 00:19:28,880 Speaker 1: going to fall off this table, or this is how 364 00:19:28,920 --> 00:19:32,159 Speaker 1: I'm going to fall into the next trap, 365 00:19:32,359 --> 00:19:35,639 Speaker 1: and, you know, skewer myself on a stake. No, 366 00:19:35,800 --> 00:19:37,639 Speaker 1: we want the best outcome, and we have it 367 00:19:37,640 --> 00:19:41,960 Speaker 1: in our mind before the dice puts us in our place. Now, 368 00:19:42,000 --> 00:19:44,120 Speaker 1: another thing that the authors here bring up is that 369 00:19:44,720 --> 00:19:48,280 Speaker 1: overestimation itself, remember, again, that's just thinking that you are 370 00:19:48,320 --> 00:19:50,439 Speaker 1: better than you are in some way, in terms of 371 00:19:50,480 --> 00:19:54,320 Speaker 1: abilities or traits or something, actually has 372 00:19:54,359 --> 00:19:57,480 Speaker 1: a mixed evidential record. It's not always the case that 373 00:19:57,520 --> 00:20:01,440 Speaker 1: we overestimate ourselves on all qualities or tasks. It's more 374 00:20:01,560 --> 00:20:03,840 Speaker 1: the case for some things in particular. And they give 375 00:20:03,880 --> 00:20:06,679 Speaker 1: a couple of examples of things where there really is 376 00:20:06,720 --> 00:20:10,680 Speaker 1: a ton of evidence for consistent overestimation. One is something 377 00:20:10,680 --> 00:20:13,720 Speaker 1: you brought up in the last episode, Robert, the planning fallacy. 378 00:20:14,680 --> 00:20:19,399 Speaker 1: There is really good evidence that people consistently overestimate how 379 00:20:19,520 --> 00:20:22,920 Speaker 1: fast they'll be able to get things done or complete 380 00:20:22,920 --> 00:20:26,320 Speaker 1: a project of some kind.
And this is especially true 381 00:20:26,600 --> 00:20:30,320 Speaker 1: if the project is difficult and novel. So like, if 382 00:20:30,359 --> 00:20:32,719 Speaker 1: I, you know, put together some complex 383 00:20:32,800 --> 00:20:34,880 Speaker 1: thing for you to do that's hard and you've never 384 00:20:34,920 --> 00:20:39,119 Speaker 1: done it before, you are really likely to massively underestimate 385 00:20:39,119 --> 00:20:41,000 Speaker 1: how much time it's going to take you. Right, if 386 00:20:41,040 --> 00:20:43,000 Speaker 1: you're like, well, you know, I'm not a handyman, but 387 00:20:43,240 --> 00:20:46,760 Speaker 1: I think I'm gonna install this sink myself, and then 388 00:20:46,800 --> 00:20:50,880 Speaker 1: you watch a weekend just vanish. Yeah, I know that feeling. 389 00:20:51,520 --> 00:20:54,320 Speaker 1: Another one that they cite is the illusion of control. 390 00:20:54,400 --> 00:20:58,760 Speaker 1: People pretty consistently overestimate how much control they have or 391 00:20:58,760 --> 00:21:02,000 Speaker 1: will have over future outcomes, even things that 392 00:21:02,080 --> 00:21:05,199 Speaker 1: they should understand are basically random. Right, you see this 393 00:21:06,600 --> 00:21:09,800 Speaker 1: financially, business-wise, a lot of times, where someone 394 00:21:09,840 --> 00:21:12,439 Speaker 1: thinks they have a clear idea of 395 00:21:12,480 --> 00:21:14,639 Speaker 1: like how things are going to flow, but they're 396 00:21:14,680 --> 00:21:18,720 Speaker 1: just not taking into account all the factors they cannot control 397 00:21:19,359 --> 00:21:22,639 Speaker 1: in, say, the economy, or, you know, 398 00:21:22,720 --> 00:21:25,640 Speaker 1: just the industry that they're a part of.
But they're 399 00:21:25,720 --> 00:21:28,640 Speaker 1: kind of acting, making choices, based on sort 400 00:21:28,640 --> 00:21:31,080 Speaker 1: of, like, not even a best-case scenario, but 401 00:21:31,200 --> 00:21:34,679 Speaker 1: sort of like a standard scenario, you know. Yeah, I 402 00:21:34,680 --> 00:21:36,280 Speaker 1: think I know what you mean. Like, they're not 403 00:21:36,400 --> 00:21:38,639 Speaker 1: counting on the storm, and they're also not counting on 404 00:21:38,720 --> 00:21:41,919 Speaker 1: the wind to completely die away, and that's how they're basing, 405 00:21:42,240 --> 00:21:44,080 Speaker 1: you know, their estimate of how long it's going 406 00:21:44,160 --> 00:21:46,879 Speaker 1: to take to sail across the sea. Oh yeah, I 407 00:21:46,880 --> 00:21:49,600 Speaker 1: mean, that's another thing. There's actually a 408 00:21:49,680 --> 00:21:51,880 Speaker 1: name for this. I've forgotten it at the moment. Maybe 409 00:21:51,920 --> 00:21:54,120 Speaker 1: I'll call it to mind in a second. But it's 410 00:21:54,160 --> 00:21:57,680 Speaker 1: the assumption that the future will be like the present. Yeah, 411 00:21:57,760 --> 00:22:01,880 Speaker 1: maybe it's called the continuation fallacy or something. Um. But 412 00:22:02,000 --> 00:22:03,919 Speaker 1: now there's one more thing that they bring up with 413 00:22:03,960 --> 00:22:08,320 Speaker 1: respect to overestimation specifically, uh, and this is a standard 414 00:22:08,359 --> 00:22:11,600 Speaker 1: finding that applies to a lot of the research on overestimation. 415 00:22:11,840 --> 00:22:16,280 Speaker 1: It's called the hard-easy distinction or the hard-easy effect. 416 00:22:16,800 --> 00:22:19,200 Speaker 1: And, uh, this one is interesting because we'll see 417 00:22:19,200 --> 00:22:21,919 Speaker 1: some variations with it in other types of overconfidence.
But 418 00:22:22,040 --> 00:22:25,639 Speaker 1: it goes like this: we are more likely to overestimate 419 00:22:25,640 --> 00:22:30,040 Speaker 1: our abilities on hard tasks and underestimate our abilities on 420 00:22:30,119 --> 00:22:34,600 Speaker 1: easy ones. So again, like the hard project comes up 421 00:22:35,240 --> 00:22:39,240 Speaker 1: with the planning fallacy, you massively underestimate how much time 422 00:22:39,280 --> 00:22:41,879 Speaker 1: it's going to take you to do that hard, complex, 423 00:22:41,920 --> 00:22:45,120 Speaker 1: novel thing, but then you might overestimate how much time 424 00:22:45,160 --> 00:22:47,040 Speaker 1: it's going to take you to do something that's a 425 00:22:47,119 --> 00:22:50,480 Speaker 1: common, easy task. I guess the main example that's 426 00:22:50,480 --> 00:22:52,960 Speaker 1: coming to mind on this one would be the scenario 427 00:22:53,080 --> 00:22:57,480 Speaker 1: where driving across town, or going to a particular destination, 428 00:22:57,560 --> 00:22:59,480 Speaker 1: takes less time than you think it will, and then 429 00:22:59,480 --> 00:23:03,080 Speaker 1: you show up like fifteen minutes early, or worse. Yeah, 430 00:23:03,080 --> 00:23:06,400 Speaker 1: and the authors here have some explanations for how exactly 431 00:23:06,400 --> 00:23:07,679 Speaker 1: this is working that we'll get to in a bit. 432 00:23:07,680 --> 00:23:09,720 Speaker 1: Should we take a break? Yes, we should, and when 433 00:23:09,720 --> 00:23:12,320 Speaker 1: we come back we will continue our journey through overconfidence. 434 00:23:13,920 --> 00:23:18,080 Speaker 1: All right, we're back. So we've been talking 435 00:23:18,119 --> 00:23:21,200 Speaker 1: about this paper about overconfidence, about the three types 436 00:23:21,240 --> 00:23:24,320 Speaker 1: of overconfidence, by Schatz and Moore.
We were just talking 437 00:23:24,320 --> 00:23:27,040 Speaker 1: about overestimation, the belief that you're better than you are, 438 00:23:27,119 --> 00:23:30,399 Speaker 1: especially with respect to some kind of objective measure or 439 00:23:30,440 --> 00:23:33,199 Speaker 1: independent measure. And so we want to move on to 440 00:23:33,240 --> 00:23:36,919 Speaker 1: the next type of overconfidence they talk about, which is overplacement. 441 00:23:37,040 --> 00:23:39,879 Speaker 1: And again this is different from overestimation because this is 442 00:23:40,359 --> 00:23:43,600 Speaker 1: thinking that you're better than you are with respect to 443 00:23:43,680 --> 00:23:47,439 Speaker 1: other people and judging yourself compared to others. Now, the 444 00:23:47,440 --> 00:23:51,080 Speaker 1: authors have some methodological critiques of some of the literature here, 445 00:23:51,080 --> 00:23:53,879 Speaker 1: but they acknowledge there's a lot of evidence for overplacement, 446 00:23:54,200 --> 00:23:57,400 Speaker 1: citing the better than average effect in all its beastly forms, 447 00:23:57,440 --> 00:23:59,879 Speaker 1: like in the other paper that we talked about that was, 448 00:24:00,200 --> 00:24:03,840 Speaker 1: you know, recently found to be extremely robust. Uh. They've 449 00:24:03,840 --> 00:24:06,800 Speaker 1: got some quibbles about methodology and some of these studies, 450 00:24:06,920 --> 00:24:10,240 Speaker 1: like using ambiguous scales or measures. Robert, you were talking 451 00:24:10,240 --> 00:24:12,520 Speaker 1: I think in the last episode about you know how 452 00:24:12,560 --> 00:24:14,800 Speaker 1: some of these types of things like uh, you know, 453 00:24:14,880 --> 00:24:19,640 Speaker 1: how people rate themselves in terms of attractiveness or intelligence 454 00:24:19,720 --> 00:24:23,119 Speaker 1: or something. 
These can of course suffer from ambiguous criteria, 455 00:24:23,400 --> 00:24:27,640 Speaker 1: right. Yeah, or sometimes just straight up unfair criteria, racist criteria, um, 456 00:24:27,800 --> 00:24:32,320 Speaker 1: misogynistic criteria, etcetera. Well, yeah, it absolutely has all those 457 00:24:32,359 --> 00:24:35,560 Speaker 1: negative effects. I mean, I think overplacement is 458 00:24:35,600 --> 00:24:37,560 Speaker 1: behind a lot of the worst types of prejudices that 459 00:24:37,600 --> 00:24:40,960 Speaker 1: make themselves known. But even if you're just looking 460 00:24:41,000 --> 00:24:44,040 Speaker 1: at what is the quality you're trying to measure, you know, 461 00:24:44,520 --> 00:24:47,879 Speaker 1: attractiveness or something like that, there's usually not 462 00:24:48,080 --> 00:24:50,480 Speaker 1: an objective way of rating it; it's all based on 463 00:24:50,520 --> 00:24:54,359 Speaker 1: these kind of ambiguous, subjective judgments. One great example of 464 00:24:54,400 --> 00:24:57,040 Speaker 1: this is something that we've brought up several times already, 465 00:24:57,119 --> 00:25:02,159 Speaker 1: like the driving example. So Svenson, back in eighty one, did a 466 00:25:02,200 --> 00:25:06,880 Speaker 1: study where he discovered that ninety three percent of American drivers 467 00:25:07,000 --> 00:25:11,920 Speaker 1: rated themselves above the median in driving ability. Obviously, whatever 468 00:25:12,080 --> 00:25:16,439 Speaker 1: criterion you use, it's impossible for ninety three percent to be above the median. 469 00:25:16,440 --> 00:25:19,560 Speaker 1: You know, like, um, 470 00:25:19,640 --> 00:25:22,520 Speaker 1: the majority can be above average, but the majority cannot 471 00:25:22,520 --> 00:25:25,119 Speaker 1: be above the median.
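That mean-versus-median point is easy to see with a quick sketch. The numbers here are purely made up for illustration, not from Svenson's study: one terrible driver drags the average down, so most drivers genuinely can be above average, yet by definition no more than half can sit above the median.

```python
from statistics import mean, median

# Hypothetical "driving skill" scores for five drivers (made-up numbers).
# One very bad driver pulls the mean down, so most drivers really are
# above average -- but strictly above the median is a different story.
scores = [2, 9, 9, 9, 9]

avg = mean(scores)    # 7.6
med = median(scores)  # 9

above_avg = sum(s > avg for s in scores)  # 4 of 5 drivers beat the mean
above_med = sum(s > med for s in scores)  # 0 of 5 strictly beat the median

print(f"mean={avg}, median={med}")
print(f"above average: {above_avg}/5, above median: {above_med}/5")
```

So "most people are better than average drivers" can occasionally be literally true for a skewed distribution, but ninety-three percent above the median never can.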
And the authors point out this 472 00:25:25,160 --> 00:25:27,919 Speaker 1: would be more impressive if it were more specific, because, 473 00:25:28,320 --> 00:25:31,240 Speaker 1: due to this problem with ambiguous scales or measures, 474 00:25:31,600 --> 00:25:35,600 Speaker 1: anybody could technically have their own definition of what makes 475 00:25:35,600 --> 00:25:39,040 Speaker 1: a good driver. So you could be answering that thinking, like, well, 476 00:25:39,080 --> 00:25:41,760 Speaker 1: there are things that I do well when I drive, 477 00:25:41,840 --> 00:25:44,520 Speaker 1: and maybe they're different from what somebody else thinks that 478 00:25:44,640 --> 00:25:48,040 Speaker 1: they do well when they drive, and that's their criterion. Right, 479 00:25:48,080 --> 00:25:50,480 Speaker 1: I mean, your definition of being a good driver 480 00:25:50,640 --> 00:25:52,800 Speaker 1: could just be, you know, I wasn't 481 00:25:52,840 --> 00:25:54,480 Speaker 1: in a wreck on the way to work this morning, 482 00:25:54,480 --> 00:25:56,639 Speaker 1: or it could be, I get to the 483 00:25:56,680 --> 00:26:00,560 Speaker 1: places I need to go fast. Those 484 00:26:00,560 --> 00:26:03,919 Speaker 1: are definitely, you know, not necessarily the same vision of 485 00:26:03,960 --> 00:26:06,840 Speaker 1: good driving. Or, I look really cool when I do it, 486 00:26:06,960 --> 00:26:10,120 Speaker 1: you know. Uh, there's another thing they bring up which 487 00:26:10,160 --> 00:26:14,240 Speaker 1: is interesting, which is the role of self selection in 488 00:26:14,240 --> 00:26:18,240 Speaker 1: increasing the apparent prevalence of overconfidence in the real world.
489 00:26:18,600 --> 00:26:22,080 Speaker 1: So an example would be like this: on average, more 490 00:26:22,119 --> 00:26:25,760 Speaker 1: overconfident people are likely to apply for jobs, just sort 491 00:26:25,760 --> 00:26:29,520 Speaker 1: of by definition, right? More overconfident people are likely to 492 00:26:29,600 --> 00:26:34,240 Speaker 1: start businesses, to run for office. So we're exposed 493 00:26:34,440 --> 00:26:37,760 Speaker 1: to more of these people, and this could lead to 494 00:26:37,880 --> 00:26:41,480 Speaker 1: us thinking that their confidence level is more represented in 495 00:26:41,520 --> 00:26:44,399 Speaker 1: the general population than it actually is. Oh yeah, you 496 00:26:44,440 --> 00:26:47,719 Speaker 1: turn on the television, it's almost exclusively overly 497 00:26:47,760 --> 00:26:50,280 Speaker 1: confident people. That's true. Yeah, so if you just look 498 00:26:50,320 --> 00:26:56,120 Speaker 1: at, like, business leadership, politics, celebrities, all this, you're gonna see, 499 00:26:56,160 --> 00:26:59,000 Speaker 1: I think, in general way more 500 00:26:59,080 --> 00:27:02,520 Speaker 1: overconfidence than just talking to your friends and relatives 501 00:27:02,520 --> 00:27:05,639 Speaker 1: and co workers. Now here's a really interesting thing. Remember 502 00:27:05,680 --> 00:27:08,200 Speaker 1: we talked about the hard-easy effect 503 00:27:08,240 --> 00:27:13,440 Speaker 1: with overestimation, where people tend to overestimate their 504 00:27:13,480 --> 00:27:19,359 Speaker 1: abilities on hard jobs and underestimate their abilities on easy jobs. Apparently, 505 00:27:19,600 --> 00:27:23,840 Speaker 1: for overplacement, there's also a hard-easy effect, but 506 00:27:24,000 --> 00:27:29,360 Speaker 1: it's in the exact opposite direction. With overplacement, you overplace 507 00:27:29,440 --> 00:27:33,600 Speaker 1: yourself relative.
We overplace ourselves relative to others on easy, 508 00:27:33,760 --> 00:27:39,160 Speaker 1: common tasks and underplace ourselves relative to others on difficult, unusual, 509 00:27:39,359 --> 00:27:42,440 Speaker 1: or rare ones. So again, what would be some examples 510 00:27:42,480 --> 00:27:45,560 Speaker 1: of this? Uh, you think you're in the ninetieth percentile 511 00:27:45,640 --> 00:27:48,280 Speaker 1: of drivers, but really you're in the fortieth. This is 512 00:27:48,320 --> 00:27:52,200 Speaker 1: an easy, common task. On the other hand, people think 513 00:27:52,280 --> 00:27:56,880 Speaker 1: that they are less likely than others to win difficult competitions. Uh, 514 00:27:56,920 --> 00:28:00,200 Speaker 1: studies show that when a teacher decides to 515 00:28:00,359 --> 00:28:03,760 Speaker 1: make an exam harder and grade it on a curve, students 516 00:28:03,800 --> 00:28:07,080 Speaker 1: expect their grades to be worse than others', even when 517 00:28:07,080 --> 00:28:09,240 Speaker 1: there's common knowledge that there will be a curve. So 518 00:28:09,280 --> 00:28:12,400 Speaker 1: as the test gets harder, students perceive that they will 519 00:28:12,440 --> 00:28:17,120 Speaker 1: do worse relative to other classmates. That's kind of interesting. Uh, 520 00:28:17,160 --> 00:28:20,080 Speaker 1: they point out that people believe they are worse jugglers 521 00:28:20,080 --> 00:28:23,399 Speaker 1: than other people. They believe that they are less likely 522 00:28:23,480 --> 00:28:26,160 Speaker 1: to win the lottery than other people. Again, a difficult, 523 00:28:26,240 --> 00:28:30,400 Speaker 1: rare thing. Uh, and here's a very 524 00:28:30,440 --> 00:28:33,800 Speaker 1: interesting version, just in terms of ages.
People believe they 525 00:28:33,800 --> 00:28:37,600 Speaker 1: are less likely than other people to live past one hundred, 526 00:28:37,960 --> 00:28:40,760 Speaker 1: but they also think they're more likely than other people 527 00:28:40,800 --> 00:28:44,600 Speaker 1: to live past seventy. Interesting. Well, of course, both of 528 00:28:44,640 --> 00:28:46,640 Speaker 1: those kind of depend on where you are in the 529 00:28:46,640 --> 00:28:49,440 Speaker 1: age spectrum when you're making that estimation, you know, 530 00:28:49,480 --> 00:28:52,360 Speaker 1: but apparently it's 531 00:28:52,360 --> 00:28:55,600 Speaker 1: true of all ages. But also your quality of life, right? 532 00:28:55,640 --> 00:28:58,360 Speaker 1: I mean, for some people the prospect of living to one hundred, 533 00:28:58,920 --> 00:29:02,040 Speaker 1: depending on where you are health-wise, that might be terrifying. 534 00:29:02,120 --> 00:29:05,280 Speaker 1: It might be wishful thinking that you'll 535 00:29:05,280 --> 00:29:08,080 Speaker 1: expire sooner than that, uh, or it could be the 536 00:29:08,120 --> 00:29:11,360 Speaker 1: other way around, you know. Um, what kind of explanations 537 00:29:11,360 --> 00:29:14,440 Speaker 1: are they throwing out? Yeah, this was interesting. So yeah, 538 00:29:14,480 --> 00:29:17,000 Speaker 1: why do we fail in opposite directions here, depending on 539 00:29:17,080 --> 00:29:20,680 Speaker 1: whether we're imagining our performance against objective measures versus relative 540 00:29:20,680 --> 00:29:23,760 Speaker 1: to others? The authors cite solutions from some of 541 00:29:23,840 --> 00:29:28,479 Speaker 1: Moore's previous work with other authors. They write this, quote:
542 00:29:28,720 --> 00:29:32,080 Speaker 1: If people make any errors estimating how well they've done 543 00:29:32,240 --> 00:29:35,040 Speaker 1: or will do, then it stands to reason they're more 544 00:29:35,080 --> 00:29:38,520 Speaker 1: likely to overestimate a low score and more likely 545 00:29:38,560 --> 00:29:42,400 Speaker 1: to underestimate a high score. That's the hard-easy effect. 546 00:29:42,760 --> 00:29:46,240 Speaker 1: As long as people have more uncertainty about others' scores, 547 00:29:46,560 --> 00:29:49,800 Speaker 1: they'll tend to make even more regressive estimates of others 548 00:29:49,880 --> 00:29:53,600 Speaker 1: than of self. The consequence would be that they overestimate 549 00:29:53,640 --> 00:29:57,440 Speaker 1: others even more than themselves on difficult tasks and come 550 00:29:57,480 --> 00:29:59,920 Speaker 1: to believe that they are worse than others. The opposite 551 00:30:00,080 --> 00:30:03,200 Speaker 1: would hold true for easy tasks. People would underestimate 552 00:30:03,240 --> 00:30:06,360 Speaker 1: others more than themselves and wind up believing that they 553 00:30:06,360 --> 00:30:09,480 Speaker 1: are better than others. So that took me a minute 554 00:30:09,480 --> 00:30:11,240 Speaker 1: to get my head around, but then I finally made 555 00:30:11,280 --> 00:30:13,800 Speaker 1: sense of it. So when you're not sure how you 556 00:30:13,800 --> 00:30:16,120 Speaker 1: will do at something, as we're always, you know, not 557 00:30:16,200 --> 00:30:19,320 Speaker 1: sure, there's a ton of uncertainty in life, or you're 558 00:30:19,360 --> 00:30:22,520 Speaker 1: not sure how others will do, there's simply more room 559 00:30:22,680 --> 00:30:26,480 Speaker 1: to guess high if your performance is likely 560 00:30:26,520 --> 00:30:29,440 Speaker 1: to be low, and more room to guess low if 561 00:30:29,440 --> 00:30:32,640 Speaker 1: your performance is likely to be high.
And this applies 562 00:30:32,680 --> 00:30:35,920 Speaker 1: to both the self and other people. Since we know 563 00:30:36,080 --> 00:30:39,160 Speaker 1: even less about other people than we do about ourselves, 564 00:30:39,360 --> 00:30:42,160 Speaker 1: we're going to spend more time guessing wrong in these 565 00:30:42,240 --> 00:30:45,680 Speaker 1: vast over and under zones for other people, depending on 566 00:30:45,760 --> 00:30:48,160 Speaker 1: what type of task it is. Now we're going to 567 00:30:48,240 --> 00:30:52,080 Speaker 1: talk about overprecision from this two thousand seventeen study. Now, 568 00:30:52,080 --> 00:30:54,880 Speaker 1: overprecision, again, that's like having way more 569 00:30:55,000 --> 00:30:58,640 Speaker 1: confidence than you should about what you believe to be true. 570 00:30:59,040 --> 00:31:00,640 Speaker 1: So I could ask 571 00:31:00,680 --> 00:31:02,520 Speaker 1: you to answer a question, then I could ask you 572 00:31:02,600 --> 00:31:06,280 Speaker 1: how confident you are that your answer is correct. Uh, 573 00:31:06,360 --> 00:31:10,080 Speaker 1: and the authors here write, quote: results routinely find that 574 00:31:10,240 --> 00:31:15,720 Speaker 1: hit rates inside confidence intervals are below fifty percent, implying 575 00:31:15,720 --> 00:31:18,920 Speaker 1: that people set their ranges too precisely, acting as if 576 00:31:18,920 --> 00:31:22,520 Speaker 1: they're inappropriately confident that their beliefs are accurate. So, if 577 00:31:22,560 --> 00:31:26,120 Speaker 1: you take a quiz, you say you're highly confident 578 00:31:26,200 --> 00:31:28,560 Speaker 1: on average about your answers, and you're actually more like 579 00:31:28,640 --> 00:31:32,239 Speaker 1: fifty percent correct on average. This is something that's been 580 00:31:32,240 --> 00:31:35,480 Speaker 1: found a bunch of times.
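What a sub-fifty-percent hit rate looks like can be sketched with a toy simulation. The numbers are purely illustrative, not from the paper: imagine a forecaster whose guesses actually miss the truth by a lot, but who sizes their "ninety percent confident" intervals as if their errors were small.

```python
import random

random.seed(42)

# Toy sketch of overprecision (made-up parameters). The forecaster's
# real errors have standard deviation 10, but they report 90% intervals
# sized for an assumed error of only 2 (i.e. +/- 1.645 * 2).
TRUE_ERROR_SD = 10
ASSUMED_ERROR_SD = 2
HALF_WIDTH = 1.645 * ASSUMED_ERROR_SD  # a 90% interval for the assumed sd

TRIALS = 100_000
hits = sum(
    abs(random.gauss(0, TRUE_ERROR_SD)) <= HALF_WIDTH  # truth inside interval?
    for _ in range(TRIALS)
)

hit_rate = hits / TRIALS
print(f"claimed 90% intervals contain the truth {hit_rate:.0%} of the time")
```

With these made-up numbers the "ninety percent" intervals capture the truth only around a quarter of the time, which is the shape of the finding the authors describe: ranges set far too precisely for the forecaster's actual uncertainty.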
It's quite clear that there's 581 00:31:35,560 --> 00:31:39,520 Speaker 1: tons of overprecision in human behavior. The authors have 582 00:31:39,560 --> 00:31:42,720 Speaker 1: a few critiques about common research paradigms that 583 00:31:42,760 --> 00:31:45,240 Speaker 1: are used to study this. One example is, they say, 584 00:31:45,560 --> 00:31:47,640 Speaker 1: you know, it may be that normal people don't 585 00:31:47,640 --> 00:31:50,880 Speaker 1: have a very solid understanding of how to use confidence intervals, 586 00:31:50,880 --> 00:31:53,080 Speaker 1: so there have been other ways of trying to measure it. 587 00:31:53,840 --> 00:31:57,040 Speaker 1: However, the authors here believe that overprecision is 588 00:31:57,120 --> 00:32:00,560 Speaker 1: the most pervasive form of overconfidence. You find it 589 00:32:00,840 --> 00:32:04,640 Speaker 1: absolutely everywhere, even in experts talking about their own 590 00:32:04,720 --> 00:32:06,760 Speaker 1: subject matter. I think that's come up on the show 591 00:32:06,800 --> 00:32:09,600 Speaker 1: before, but I don't remember when. After this, the authors 592 00:32:09,640 --> 00:32:12,120 Speaker 1: here turn to a question we talked 593 00:32:12,120 --> 00:32:16,000 Speaker 1: about a little before: could overconfidence actually be useful? Like, 594 00:32:16,360 --> 00:32:18,440 Speaker 1: why does it make sense for a brain 595 00:32:18,520 --> 00:32:21,960 Speaker 1: to be overconfident? Uh, and they talk about explanations in 596 00:32:22,040 --> 00:32:27,000 Speaker 1: two main categories, intrapersonal and interpersonal.
The authors 597 00:32:27,160 --> 00:32:31,320 Speaker 1: generally think the evidence for the interpersonal explanations, the 598 00:32:31,360 --> 00:32:34,840 Speaker 1: explanations of how it works on other people, is stronger 599 00:32:34,880 --> 00:32:37,840 Speaker 1: than for the intrapersonal ones, though there could be 600 00:32:37,880 --> 00:32:41,000 Speaker 1: some good intrapersonal ones. For example, you know, maybe 601 00:32:41,000 --> 00:32:43,920 Speaker 1: overconfidence doesn't just make you feel good; it, as 602 00:32:43,960 --> 00:32:47,760 Speaker 1: we hypothesized earlier, makes you more likely to take risks 603 00:32:47,840 --> 00:32:50,840 Speaker 1: that can pay off big. Yeah. Well, I mean, for instance, 604 00:32:50,880 --> 00:32:53,360 Speaker 1: to go to like a predator-prey 605 00:32:53,400 --> 00:32:58,520 Speaker 1: scenario, one is reminded of, you know, how 606 00:32:58,560 --> 00:33:02,280 Speaker 1: effective your average predator is. You know, they're going to 607 00:33:02,320 --> 00:33:06,200 Speaker 1: fail a lot. And granted, a leopard is not really 608 00:33:06,520 --> 00:33:11,200 Speaker 1: subject to human, uh, you know, overconfidence or underconfidence, 609 00:33:11,560 --> 00:33:13,840 Speaker 1: but certainly if you look at 610 00:33:13,840 --> 00:33:16,600 Speaker 1: a human scenario, if you look at human hunters, uh, 611 00:33:16,680 --> 00:33:18,840 Speaker 1: you know, it's certainly a situation where it would 612 00:33:18,840 --> 00:33:22,960 Speaker 1: pay to be overconfident, uh, to a certain degree.
Yeah, 613 00:33:23,000 --> 00:33:25,520 Speaker 1: I think you can find some analogies of confidence and 614 00:33:25,560 --> 00:33:28,320 Speaker 1: overconfidence in animals, like, you know, how likely are 615 00:33:28,320 --> 00:33:31,600 Speaker 1: you to try to take down a prey animal that you're 616 00:33:31,680 --> 00:33:34,480 Speaker 1: very unlikely to succeed against, but, you know, would provide 617 00:33:34,520 --> 00:33:36,240 Speaker 1: you with a lot of meat and energy if you 618 00:33:36,320 --> 00:33:38,640 Speaker 1: do? Right. Though, of course, the other 619 00:33:38,680 --> 00:33:40,400 Speaker 1: side of that is that you would not want 620 00:33:40,400 --> 00:33:42,840 Speaker 1: to be so overconfident that you were going after prey 621 00:33:43,120 --> 00:33:45,240 Speaker 1: that was extremely likely to kill you if you tried 622 00:33:45,320 --> 00:33:48,680 Speaker 1: to bring it down. Right. But the authors here 623 00:33:48,720 --> 00:33:52,280 Speaker 1: do think that there's really good evidence for interpersonal benefits 624 00:33:52,280 --> 00:33:55,560 Speaker 1: of overconfidence. One example would be all the empirical 625 00:33:55,600 --> 00:33:59,160 Speaker 1: evidence that already exists that just outwardly projecting confidence has 626 00:33:59,200 --> 00:34:01,920 Speaker 1: all these benefits affecting how other people see us. 627 00:34:02,280 --> 00:34:05,040 Speaker 1: There are studies that show that highly confident people are 628 00:34:05,080 --> 00:34:09,880 Speaker 1: more persuasive, they're more influential, they're perceived as more sexually attractive, 629 00:34:09,960 --> 00:34:14,240 Speaker 1: they tend to get promoted to positions of authority in groups.
Um, 630 00:34:14,320 --> 00:34:17,640 Speaker 1: and it's possible that confidence is actually more important than 631 00:34:17,680 --> 00:34:21,680 Speaker 1: competence in determining who gets promoted to high status positions. 632 00:34:22,239 --> 00:34:25,360 Speaker 1: The authors write, quote: While a preference for confident leaders 633 00:34:25,440 --> 00:34:28,920 Speaker 1: may make sense if there's a correlation, however weak, between 634 00:34:28,960 --> 00:34:32,359 Speaker 1: confidence and competence, there is real risk in selecting 635 00:34:32,400 --> 00:34:35,959 Speaker 1: overconfident leaders. Well, I mean, because on one hand, 636 00:34:35,960 --> 00:34:38,239 Speaker 1: you want a boss that can, you know, do 637 00:34:38,320 --> 00:34:41,280 Speaker 1: their job and keep the company afloat and actually 638 00:34:41,480 --> 00:34:45,000 Speaker 1: grow the business, etcetera, all the various catchphrases. But you 639 00:34:45,040 --> 00:34:46,839 Speaker 1: also want a boss that you can kind of, 640 00:34:47,280 --> 00:34:49,040 Speaker 1: you know, trust that they're doing 641 00:34:49,080 --> 00:34:53,080 Speaker 1: their thing, like, they seem confident, I guess they have 642 00:34:53,200 --> 00:34:54,960 Speaker 1: their whole end of it figured out, and maybe I 643 00:34:54,960 --> 00:34:57,560 Speaker 1: can focus on my own thing, my own 644 00:34:57,680 --> 00:35:00,279 Speaker 1: role in the company, without, you know, freaking out about what's 645 00:35:00,280 --> 00:35:03,520 Speaker 1: going to happen tomorrow. Well, maybe in the business scenario 646 00:35:03,600 --> 00:35:07,319 Speaker 1: we should pivot to talking about the Icarus paradox. Oh, 647 00:35:07,400 --> 00:35:08,960 Speaker 1: all right, we're gonna take a quick break, but we'll 648 00:35:09,000 --> 00:35:14,680 Speaker 1: be right back with more. And we're back.
So, Robert, 649 00:35:14,760 --> 00:35:17,240 Speaker 1: you wanted to bring in a concept from the business 650 00:35:17,239 --> 00:35:21,239 Speaker 1: world about overconfidence, the Icarus paradox. Yeah, and I'm as 651 00:35:21,280 --> 00:35:23,280 Speaker 1: surprised as maybe some of you are that I'm bringing 652 00:35:23,280 --> 00:35:26,280 Speaker 1: in something business-wise, but it caught my attention 653 00:35:26,320 --> 00:35:29,040 Speaker 1: when I was looking for papers and 654 00:35:29,040 --> 00:35:32,319 Speaker 1: so forth on overconfidence, and also looking at Greek 655 00:35:32,360 --> 00:35:35,920 Speaker 1: mythology and so forth, because, yeah, the Icarus paradox invokes 656 00:35:36,080 --> 00:35:38,920 Speaker 1: the story of Icarus rather directly. And I think it 657 00:35:38,960 --> 00:35:40,800 Speaker 1: also makes sense at another level, because, you know, we 658 00:35:40,840 --> 00:35:43,839 Speaker 1: don't have gods so much anymore. Like, these are not 659 00:35:43,880 --> 00:35:48,799 Speaker 1: the driving, commanding forces in our world. But we do 660 00:35:48,840 --> 00:35:53,120 Speaker 1: have institutions, industries, global economies. And these are not unlike 661 00:35:53,120 --> 00:35:56,240 Speaker 1: the concepts of the gods, right? You know, they're sometimes lawful, 662 00:35:56,320 --> 00:35:59,640 Speaker 1: sometimes chaotic entities that are likely to destroy you if 663 00:35:59,680 --> 00:36:02,759 Speaker 1: you question their authority, or if you turn 664 00:36:02,800 --> 00:36:06,480 Speaker 1: against them. But anyway, I ran across this interesting concept 665 00:36:06,560 --> 00:36:10,240 Speaker 1: of the Icarus paradox. Um, it was devised by Canadian 666 00:36:10,239 --> 00:36:14,120 Speaker 1: economist Danny Miller, and he points out that businesses are 667 00:36:14,200 --> 00:36:18,080 Speaker 1: often like Icarus in the myth.
They start out confident 668 00:36:18,120 --> 00:36:22,680 Speaker 1: and competent, they rise, but then they perish. Uh, the 669 00:36:22,680 --> 00:36:24,839 Speaker 1: irony, he writes, is that many of the most 670 00:36:24,920 --> 00:36:30,480 Speaker 1: dramatically successful companies are prone to this exact sort of failure. 671 00:36:30,520 --> 00:36:33,000 Speaker 1: And it doesn't necessarily mean death in the business sense, since 672 00:36:33,080 --> 00:36:36,719 Speaker 1: businesses are sometimes less likely to die. They're more like 673 00:36:36,760 --> 00:36:39,279 Speaker 1: the gods, you know. They have downfalls, but they 674 00:36:39,320 --> 00:36:43,319 Speaker 1: may very well be immortal in some cases. Right, 675 00:36:43,360 --> 00:36:45,399 Speaker 1: they're still alive. They're just chained to a rock getting 676 00:36:45,440 --> 00:36:48,680 Speaker 1: their liver pecked out by an eagle, right, for eternity. 677 00:36:48,760 --> 00:36:52,680 Speaker 1: You know, maybe they just declare multiple bankruptcies. An immortal declaring bankruptcy 678 00:36:52,680 --> 00:36:56,080 Speaker 1: is a different thing than a corporation doing it. Um, wait, 679 00:36:56,120 --> 00:36:59,239 Speaker 1: what is that eagle? It's like a private equity eagle. Yeah, 680 00:36:59,239 --> 00:37:02,799 Speaker 1: the private equity eagle. Um, and yeah, he mentions several 681 00:37:02,800 --> 00:37:04,920 Speaker 1: companies in discussing this, and most of them, I 682 00:37:04,920 --> 00:37:06,839 Speaker 1: think, are still around, like they 683 00:37:06,840 --> 00:37:10,799 Speaker 1: have survived their downfalls. But yeah. He writes 684 00:37:10,800 --> 00:37:13,520 Speaker 1: that the irony is that many of these dramatically 685 00:37:13,560 --> 00:37:15,839 Speaker 1: successful companies are prone to these sorts of failures.
And 686 00:37:15,880 --> 00:37:19,240 Speaker 1: what's more, the very factors that drive success, when taken 687 00:37:19,280 --> 00:37:23,600 Speaker 1: to excess, are the factors that bring about decline. So 688 00:37:23,880 --> 00:37:25,640 Speaker 1: I think this is an interesting model to look at, 689 00:37:25,680 --> 00:37:27,840 Speaker 1: not only because it's a 690 00:37:27,880 --> 00:37:29,480 Speaker 1: take on it from the business world, but I 691 00:37:29,480 --> 00:37:31,520 Speaker 1: think it can also serve as sort of a reflecting 692 00:37:31,560 --> 00:37:34,800 Speaker 1: pool for some of the individual concepts that we've discussed 693 00:37:34,800 --> 00:37:37,880 Speaker 1: already, by placing them outside of the human psyche 694 00:37:37,880 --> 00:37:40,319 Speaker 1: and looking at them in the context of an organization 695 00:37:40,440 --> 00:37:44,920 Speaker 1: or a culture. So Miller wrote a book about this, 696 00:37:45,200 --> 00:37:49,120 Speaker 1: The Icarus Paradox: How Exceptional Companies Bring About Their Own Downfall; 697 00:37:49,440 --> 00:37:53,280 Speaker 1: New Lessons in the Dynamics of Corporate Success, Decline, and Renewal. 698 00:37:53,320 --> 00:37:56,240 Speaker 1: I know you love a long title like that. Um, 699 00:37:56,280 --> 00:38:00,440 Speaker 1: but I was mainly looking at his Business 700 00:38:00,719 --> 00:38:03,640 Speaker 1: Horizons article that he wrote on the subject, in which he 701 00:38:03,680 --> 00:38:07,560 Speaker 1: summarizes a lot of the key points. So Miller identified 702 00:38:07,760 --> 00:38:13,920 Speaker 1: four key trajectories in the riches-to-rags business scenario. 703 00:38:14,600 --> 00:38:17,960 Speaker 1: So the first one that he mentions is the focusing trajectory.
704 00:38:18,040 --> 00:38:23,040 Speaker 1: In this trajectory, the craftsman becomes the tinkerer, quote, firms 705 00:38:23,080 --> 00:38:30,279 Speaker 1: whose insular, technocratic monocultures alienate customers with perfect but irrelevant offerings. 706 00:38:30,320 --> 00:38:32,480 Speaker 1: So I guess a scenario would be, you know, 707 00:38:32,960 --> 00:38:36,560 Speaker 1: the company that creates a really groundbreaking product, and then 708 00:38:36,640 --> 00:38:40,319 Speaker 1: all they do is tinker with that concept, all they 709 00:38:40,360 --> 00:38:44,160 Speaker 1: do is make adjustments to that concept, and eventually, like, 710 00:38:44,280 --> 00:38:46,719 Speaker 1: somebody else is going to create a better widget or 711 00:38:46,760 --> 00:38:50,279 Speaker 1: a better, you know, smartphone, or whatever the situation may be; 712 00:38:50,600 --> 00:38:52,839 Speaker 1: somebody else is going to take some, you know, 713 00:38:52,880 --> 00:38:57,600 Speaker 1: wider swings. Next comes the venturing trajectory, and this 714 00:38:57,680 --> 00:39:01,200 Speaker 1: is by which builders become imperialists. And this one, 715 00:39:01,239 --> 00:39:03,160 Speaker 1: I think, is the one that really has the smack 716 00:39:03,239 --> 00:39:07,520 Speaker 1: of overconfidence to it. In this trajectory, the strategy of 717 00:39:07,600 --> 00:39:12,040 Speaker 1: building feeds into overexpansion, the goal of growth becomes 718 00:39:12,120 --> 00:39:16,719 Speaker 1: grandeur, an entrepreneurial culture becomes one of the gamesman, 719 00:39:17,520 --> 00:39:22,080 Speaker 1: and a divisionalized structure becomes fractured. And then on top of 720 00:39:22,120 --> 00:39:25,400 Speaker 1: that there's the inventing trajectory. This is where you 721 00:39:25,440 --> 00:39:29,040 Speaker 1: go from pioneering to escapist. In this trajectory, innovation feeds 722 00:39:29,080 --> 00:39:34,799 Speaker 1: into high-tech escapism.
Science for society transforms into technological utopianism, 723 00:39:34,840 --> 00:39:37,520 Speaker 1: research and development gives way to think-tank culture, and 724 00:39:37,520 --> 00:39:41,600 Speaker 1: the overall culture goes from organic to chaotic. And then 725 00:39:41,800 --> 00:39:45,400 Speaker 1: he rounds us out with the decoupling trajectory, from salesman 726 00:39:45,480 --> 00:39:51,480 Speaker 1: to drifter, quote: finally, the decoupling trajectory transforms salesman organizations 727 00:39:51,480 --> 00:39:55,680 Speaker 1: with unparalleled marketing skills, prominent brand names, and broad markets 728 00:39:56,000 --> 00:40:00,600 Speaker 1: into aimless, bureaucratic drifters whose sales fetish obscures design 729 00:40:00,640 --> 00:40:03,799 Speaker 1: issues and who produce a stale and disjointed line of 730 00:40:03,880 --> 00:40:06,879 Speaker 1: me-too offerings. Okay, so that's the company that's more 731 00:40:06,920 --> 00:40:10,480 Speaker 1: based around marketing culture than around having a good product, right. 732 00:40:10,600 --> 00:40:13,160 Speaker 1: You know, thinking about these trajectories kind of reminds me 733 00:40:13,239 --> 00:40:16,719 Speaker 1: of the peaking-in-high-school stereotype in life trajectories. 734 00:40:17,160 --> 00:40:19,239 Speaker 1: It's a stereotype, but there is some truth to it. 735 00:40:19,360 --> 00:40:22,040 Speaker 1: I think it's possible that too much success early on 736 00:40:22,120 --> 00:40:24,680 Speaker 1: in life can kind of corrupt a person, can 737 00:40:24,760 --> 00:40:27,200 Speaker 1: kind of corrupt your work ethic, your ability to learn 738 00:40:27,280 --> 00:40:30,759 Speaker 1: from mistakes and mature. It's important for people to experience 739 00:40:30,800 --> 00:40:34,360 Speaker 1: both successes and failures early in life. Yeah, I 740 00:40:34,600 --> 00:40:36,560 Speaker 1: think so.
It kind of comes back to what we 741 00:40:36,600 --> 00:40:41,360 Speaker 1: talked about in considering Aristotle's summary of hubris, 742 00:40:41,360 --> 00:40:43,560 Speaker 1: saying that the young and the rich were the 743 00:40:43,600 --> 00:40:46,080 Speaker 1: most likely to engage in it. And the idea that 744 00:40:46,120 --> 00:40:49,400 Speaker 1: perhaps in some scenarios... certainly there are plenty of 745 00:40:49,440 --> 00:40:53,080 Speaker 1: examples of very wealthy people who got there through defeat, 746 00:40:53,280 --> 00:40:56,040 Speaker 1: like, through learning the lessons of defeat. There are also 747 00:40:56,080 --> 00:40:59,320 Speaker 1: examples of people who have, you know, arguably, maybe, 748 00:40:59,360 --> 00:41:02,279 Speaker 1: you know, never suffered a true defeat, like they have 749 00:41:02,680 --> 00:41:05,680 Speaker 1: remained sort of man-children, kind of 750 00:41:05,719 --> 00:41:08,600 Speaker 1: failed upward. Yeah, that sort of thing. Um, and again 751 00:41:08,600 --> 00:41:11,880 Speaker 1: we're dealing in broad tropes here. But, you know, to some extent, 752 00:41:11,960 --> 00:41:16,280 Speaker 1: I think it's useful to consider these. Um, 753 00:41:16,320 --> 00:41:18,480 Speaker 1: but also I feel like it does get, 754 00:41:18,480 --> 00:41:22,319 Speaker 1: too, into the idea that when we're dealing with 755 00:41:22,400 --> 00:41:25,880 Speaker 1: the trajectory of a human life, you know, um, 756 00:41:27,080 --> 00:41:29,239 Speaker 1: part of it perhaps comes down to just our 757 00:41:29,280 --> 00:41:32,279 Speaker 1: ability to forecast the future, our ability 758 00:41:32,480 --> 00:41:35,920 Speaker 1: to engage in long-term planning. Like, as humans, 759 00:41:35,920 --> 00:41:38,000 Speaker 1: we're generally not that good at it.
We're certainly not 760 00:41:38,040 --> 00:41:40,960 Speaker 1: good at planning beyond, uh, you know, the scope of 761 00:41:41,000 --> 00:41:44,000 Speaker 1: a human life, but even beyond the scope 762 00:41:44,040 --> 00:41:45,520 Speaker 1: of a few years. We're better at 763 00:41:45,960 --> 00:41:49,600 Speaker 1: short-term goals, and it's only with considerable effort that 764 00:41:49,680 --> 00:41:54,160 Speaker 1: we get better at considering long-term goals. Um, 765 00:41:54,200 --> 00:41:55,840 Speaker 1: so I think that's important to keep in mind in 766 00:41:55,880 --> 00:41:58,239 Speaker 1: all of this. Now, one thing that I think is 767 00:41:58,239 --> 00:42:01,200 Speaker 1: also interesting in Miller's writings is that he 768 00:42:01,440 --> 00:42:04,120 Speaker 1: talks a bit about overconfidence, you know, as 769 00:42:04,120 --> 00:42:08,239 Speaker 1: a symptom of underlying issues, you know, as opposed to 770 00:42:08,280 --> 00:42:13,680 Speaker 1: being like an intrinsic quality. He writes: unfortunately, configuration 771 00:42:13,760 --> 00:42:17,360 Speaker 1: and synergy are usually attained at the cost of myopia. 772 00:42:17,840 --> 00:42:21,960 Speaker 1: Stellar performers view the world through narrowing telescopes. One point 773 00:42:21,960 --> 00:42:25,759 Speaker 1: of view takes over, one set of assumptions comes to dominate. 774 00:42:25,960 --> 00:42:30,319 Speaker 1: The result is complacency and overconfidence. And I think 775 00:42:30,360 --> 00:42:32,560 Speaker 1: that plays into a lot of what we 776 00:42:32,640 --> 00:42:35,040 Speaker 1: spoke about earlier, you know. Sorry, I'm trying to make 777 00:42:35,080 --> 00:42:37,440 Speaker 1: sense of it, given that it's got the word synergy in it. 778 00:42:37,840 --> 00:42:40,280 Speaker 1: I should know what that means by now; I've purged 779 00:42:40,320 --> 00:42:42,719 Speaker 1: it from my brain.
Oh okay, I see now. Yeah, 780 00:42:42,719 --> 00:42:45,080 Speaker 1: I'm looking at the quote. Okay, so no, this is 781 00:42:45,160 --> 00:42:47,799 Speaker 1: a very standard thing. Uh, you know, if 782 00:42:47,840 --> 00:42:52,480 Speaker 1: you have successfully hammered several nails in and you've 783 00:42:52,480 --> 00:42:55,040 Speaker 1: still got the hammer, everything really starts to look like 784 00:42:55,080 --> 00:42:59,120 Speaker 1: a nail. Just because, like, if you've had success with 785 00:42:59,160 --> 00:43:02,399 Speaker 1: a strategy in the past, you don't switch. You just keep 786 00:43:02,480 --> 00:43:04,560 Speaker 1: doing what you've done before, even if it doesn't make 787 00:43:04,600 --> 00:43:07,080 Speaker 1: sense. Yeah, that's it: this is what works, 788 00:43:07,080 --> 00:43:09,520 Speaker 1: this is what our product is, and this is what 789 00:43:09,560 --> 00:43:12,439 Speaker 1: we're going to stick to. Um, and then, yeah, 790 00:43:12,440 --> 00:43:14,600 Speaker 1: he talks a good bit about 791 00:43:14,640 --> 00:43:18,200 Speaker 1: overconfidence as just a result of success, writing, quote: failure 792 00:43:18,239 --> 00:43:21,960 Speaker 1: teaches leaders valuable lessons, but good results only reinforce their 793 00:43:22,000 --> 00:43:25,680 Speaker 1: preconceptions and tether them more firmly to their tried and 794 00:43:25,760 --> 00:43:30,160 Speaker 1: true recipes. Success also makes managers overconfident, more prone to 795 00:43:30,280 --> 00:43:33,400 Speaker 1: excess and neglect, and more given to shape strategies to 796 00:43:33,440 --> 00:43:36,600 Speaker 1: reflect their own preferences rather than those of the customers. 797 00:43:37,560 --> 00:43:40,680 Speaker 1: And, uh, you know.
He also points to one of 798 00:43:40,680 --> 00:43:44,280 Speaker 1: the key aspects of the Icarus paradox being that overconfident, 799 00:43:44,360 --> 00:43:48,719 Speaker 1: complacent executives extend the very factors that contributed to success 800 00:43:49,000 --> 00:43:52,239 Speaker 1: to the point where they cause decline. So the thing 801 00:43:52,280 --> 00:43:55,800 Speaker 1: that's working, you know, the button we're pushing that is 802 00:43:55,920 --> 00:43:58,560 Speaker 1: leading to success, let's just really jam that sucker. You're 803 00:43:58,600 --> 00:44:01,120 Speaker 1: like the rat in the experiment that keeps pushing the 804 00:44:01,160 --> 00:44:04,959 Speaker 1: cocaine button. Yeah. So he summarizes that there are really 805 00:44:04,960 --> 00:44:08,000 Speaker 1: two aspects of the Icarus paradox. One is that success 806 00:44:08,000 --> 00:44:10,719 Speaker 1: can lead to failure via the fostering of overconfidence and 807 00:44:10,760 --> 00:44:13,640 Speaker 1: other factors. And then two, the aspects of a business 808 00:44:13,640 --> 00:44:16,840 Speaker 1: that bring success can also hasten failure, or, quote, the 809 00:44:17,000 --> 00:44:20,520 Speaker 1: very causes of success, when extended, may become the causes 810 00:44:20,560 --> 00:44:24,360 Speaker 1: of failure. And as far as, you know, ways to 811 00:44:24,480 --> 00:44:27,200 Speaker 1: fight these transformations, because that's one question I had: 812 00:44:27,200 --> 00:44:29,640 Speaker 1: is this just the trajectory? Is this what happens? 813 00:44:29,719 --> 00:44:33,360 Speaker 1: Like, things can't just go up forever, and 814 00:44:33,600 --> 00:44:38,480 Speaker 1: a business cannot just exist, you know, indefinitely. Like, things 815 00:44:38,520 --> 00:44:41,920 Speaker 1: have to die, right? I mean, these corporations are not 816 00:44:42,040 --> 00:44:45,919 Speaker 1: like us mortals, but they are given to life and death.
817 00:44:45,960 --> 00:44:49,479 Speaker 1: They're not eternal. And he argues 818 00:44:49,520 --> 00:44:52,719 Speaker 1: that there are ways to fight these transformations. He suggests 819 00:44:52,719 --> 00:44:56,799 Speaker 1: self-reflection and intelligence gathering that guards against excess, 820 00:44:56,880 --> 00:45:00,960 Speaker 1: overconfidence, and irrelevance. And this, I think, matches up 821 00:45:00,960 --> 00:45:02,560 Speaker 1: with what we've been talking about so far on the 822 00:45:02,560 --> 00:45:05,839 Speaker 1: individual level. Like, if you think you're great, 823 00:45:05,880 --> 00:45:08,919 Speaker 1: take a step back and 824 00:45:08,920 --> 00:45:11,360 Speaker 1: question whether you actually 825 00:45:11,400 --> 00:45:13,239 Speaker 1: are, to what extent you are, what else you could 826 00:45:13,239 --> 00:45:15,440 Speaker 1: be doing to ensure that you were actually 827 00:45:15,480 --> 00:45:18,480 Speaker 1: living up to your overestimation of self. If you think 828 00:45:18,520 --> 00:45:21,160 Speaker 1: it's true about yourself, prove it. Prove it to yourself. Yeah, 829 00:45:21,160 --> 00:45:24,000 Speaker 1: I have to say, Miller is 830 00:45:24,040 --> 00:45:25,879 Speaker 1: a really good writer about this, because normally I'm 831 00:45:25,920 --> 00:45:30,279 Speaker 1: not interested in business-culture-type stuff like this, 832 00:45:30,280 --> 00:45:31,840 Speaker 1: but he does a great job of tying it 833 00:45:31,880 --> 00:45:35,520 Speaker 1: back into just the basic human scenario as well. Like, 834 00:45:35,520 --> 00:45:38,920 Speaker 1: he points out that excellence in any human endeavor, 835 00:45:38,960 --> 00:45:41,319 Speaker 1: be it arts or sports or what have you, you know, 836 00:45:41,360 --> 00:45:43,760 Speaker 1: it tends to come at a price.
We cannot excel 837 00:45:43,880 --> 00:45:46,960 Speaker 1: at everything. We have to make sacrifices and choose what's 838 00:45:46,960 --> 00:45:49,719 Speaker 1: important. A middle-of-the-road or a jack- 839 00:45:49,760 --> 00:45:53,520 Speaker 1: of-all-trades approach is not going to lead to greatness, 840 00:45:53,880 --> 00:45:56,319 Speaker 1: you know, unless it's a story or some fable where 841 00:45:56,360 --> 00:46:02,239 Speaker 1: greatness is just thrust upon somebody, you know. Uh, there's 842 00:46:02,280 --> 00:46:05,360 Speaker 1: got to be some sort of trade-off there. And 843 00:46:05,360 --> 00:46:07,080 Speaker 1: he says it goes for the individual, but it also 844 00:46:07,120 --> 00:46:10,120 Speaker 1: applies to companies as well. You can only sharpen your 845 00:46:10,120 --> 00:46:12,719 Speaker 1: blade if you first realize that it is dull and 846 00:46:12,800 --> 00:46:15,359 Speaker 1: must be sharpened. You know, like, if 847 00:46:15,400 --> 00:46:18,640 Speaker 1: you believe yourself eternally and infinitely sharp, you're not going 848 00:46:18,719 --> 00:46:21,040 Speaker 1: to do the sharpening, right. And it's interesting to come 849 00:46:21,040 --> 00:46:23,120 Speaker 1: back and think about the individual level and think about 850 00:46:23,120 --> 00:46:26,520 Speaker 1: overconfidence in this scenario. Like, one basic trope 851 00:46:26,560 --> 00:46:29,319 Speaker 1: he mentions is the idea of, say, you know, the artist 852 00:46:29,400 --> 00:46:32,520 Speaker 1: who neglects their family to focus on their art, or 853 00:46:32,520 --> 00:46:35,919 Speaker 1: a business person that does the same thing. In those scenarios, 854 00:46:36,040 --> 00:46:39,560 Speaker 1: might it be, you know... I guess, you know, maybe 855 00:46:39,560 --> 00:46:41,719 Speaker 1: in a warped sense, or in a way that 856 00:46:41,840 --> 00:46:45,319 Speaker 1: caters to their prime focus.
Might it be beneficial 857 00:46:45,480 --> 00:46:47,360 Speaker 1: to just think, oh, I'm a good dad, I'm a 858 00:46:47,360 --> 00:46:50,799 Speaker 1: good husband, even when they're not? But it enables them 859 00:46:50,840 --> 00:46:53,400 Speaker 1: to then put more of their resources 860 00:46:53,400 --> 00:46:56,640 Speaker 1: into the pursuit of the thing that ultimately matters 861 00:46:56,680 --> 00:46:59,319 Speaker 1: to them, like, you know, cold, hard cash, or 862 00:46:59,480 --> 00:47:03,160 Speaker 1: the pursuit of art. Yeah. Well, so, to be clear, 863 00:47:03,320 --> 00:47:06,520 Speaker 1: obviously this wouldn't mean that it's actually useful in being 864 00:47:06,560 --> 00:47:09,000 Speaker 1: a good person or having a good life, but it may 865 00:47:09,280 --> 00:47:13,080 Speaker 1: very well be useful, in focusing on whatever it is 866 00:47:13,160 --> 00:47:16,319 Speaker 1: that really matters to you, to be overconfident about 867 00:47:16,440 --> 00:47:20,080 Speaker 1: your crappy efforts in other areas of life. Yeah, I 868 00:47:20,360 --> 00:47:23,160 Speaker 1: think that's entirely true. So, yeah, it's interesting 869 00:47:23,200 --> 00:47:25,600 Speaker 1: to compare the two, the corporate version of this and 870 00:47:25,640 --> 00:47:27,879 Speaker 1: the individual version of this. And certainly, if you want 871 00:47:27,880 --> 00:47:31,040 Speaker 1: to read more about this idea of the Icarus paradox, 872 00:47:31,920 --> 00:47:34,759 Speaker 1: I certainly recommend checking out more of his writings.
So 873 00:47:34,800 --> 00:47:37,520 Speaker 1: we have all these economic metaphors, like, you know, from 874 00:47:37,520 --> 00:47:40,360 Speaker 1: Adam Smith, the invisible hand and whatever. I feel like 875 00:47:40,400 --> 00:47:43,920 Speaker 1: we need an economic metaphor of the nemesis, like the 876 00:47:43,960 --> 00:47:47,440 Speaker 1: Nemesis that is this force in the market that swoops 877 00:47:47,440 --> 00:47:51,880 Speaker 1: in to punish hubris and overconfidence in business. Yeah, sometimes 878 00:47:51,880 --> 00:47:56,359 Speaker 1: it seems like Nemesis is a little, a little, uh, 879 00:47:57,040 --> 00:47:59,120 Speaker 1: resistant to doing that. I don't know. I mean, part 880 00:47:59,120 --> 00:48:01,040 Speaker 1: of it comes down again to the fact that 881 00:48:01,520 --> 00:48:06,279 Speaker 1: businesses and corporations are less mortal. Well, yeah, 882 00:48:06,280 --> 00:48:08,880 Speaker 1: I mean, it is funny that we've discussed a 883 00:48:08,920 --> 00:48:11,840 Speaker 1: lot in these episodes how overconfidence can both lead to 884 00:48:12,040 --> 00:48:15,960 Speaker 1: disaster and negative outcomes, but can also in some 885 00:48:16,000 --> 00:48:20,280 Speaker 1: cases be highly rewarded and be very lucrative. Yeah. 886 00:48:21,040 --> 00:48:25,200 Speaker 1: You know, I'm also reminded, talking about, you know, curbing 887 00:48:25,360 --> 00:48:28,680 Speaker 1: overconfidence or the perception of overconfidence, of making statements that 888 00:48:28,760 --> 00:48:31,719 Speaker 1: cannot be put to the test. You of course see 889 00:48:31,719 --> 00:48:34,120 Speaker 1: that a lot in the business world. You know that 890 00:48:34,440 --> 00:48:37,360 Speaker 1: if you're saying you're the best car dealership in the galaxy, 891 00:48:38,040 --> 00:48:39,839 Speaker 1: you can get away with that more than saying you're 892 00:48:39,880 --> 00:48:43,560 Speaker 1: the best car dealership in town.
Well, then people can say, well, 893 00:48:43,640 --> 00:48:45,960 Speaker 1: let's see your sales numbers, let's compare you to Jim's 894 00:48:46,000 --> 00:48:48,080 Speaker 1: across town. No, even that would be easier to get 895 00:48:48,080 --> 00:48:49,560 Speaker 1: away with. Like, the one that would be hard to 896 00:48:49,560 --> 00:48:51,720 Speaker 1: get away with is saying, like, we have the lowest 897 00:48:51,800 --> 00:48:54,600 Speaker 1: prices in town, or something like that. Then 898 00:48:55,000 --> 00:48:57,640 Speaker 1: you're stuck: either that's true or that's not, right. 899 00:48:58,400 --> 00:49:00,520 Speaker 1: But then again, on a personal level, 900 00:49:00,560 --> 00:49:02,600 Speaker 1: you know, you can have your mug that says World's 901 00:49:02,640 --> 00:49:05,680 Speaker 1: Greatest Dad, and nobody's gonna call you on that. What, 902 00:49:05,800 --> 00:49:09,759 Speaker 1: you can get out your dad ruler and measure me? Yeah, 903 00:49:09,840 --> 00:49:12,319 Speaker 1: all right. So I think we've reached the end 904 00:49:12,360 --> 00:49:16,120 Speaker 1: of our discussion here for the week on overconfidence, but 905 00:49:16,200 --> 00:49:18,640 Speaker 1: clearly there's a lot of material here. It'll 906 00:49:18,680 --> 00:49:21,200 Speaker 1: be interesting to hear back from listeners, because I think 907 00:49:21,200 --> 00:49:23,040 Speaker 1: we all have some perspective on this. We all have 908 00:49:23,120 --> 00:49:27,719 Speaker 1: experience with overconfidence in others, or certainly with 909 00:49:27,880 --> 00:49:31,600 Speaker 1: overconfidence in ourselves or the management of overconfidence in ourselves, 910 00:49:31,920 --> 00:49:34,160 Speaker 1: and so we'd love to hear everyone's thoughts on this.
911 00:49:34,880 --> 00:49:36,400 Speaker 1: In the meantime, if you want to check out other 912 00:49:36,440 --> 00:49:38,680 Speaker 1: episodes of Stuff to Blow Your Mind, you can find 913 00:49:38,719 --> 00:49:41,640 Speaker 1: them wherever you find your podcasts. If you go to 914 00:49:41,640 --> 00:49:43,600 Speaker 1: Stuff to Blow Your Mind dot com, that will lead 915 00:49:43,600 --> 00:49:46,240 Speaker 1: you over to the iHeart listing for our show, 916 00:49:46,760 --> 00:49:49,520 Speaker 1: but you can find us anywhere, and wherever that happens 917 00:49:49,560 --> 00:49:52,480 Speaker 1: to be, just make sure you rate, review, and subscribe. 918 00:49:52,760 --> 00:49:55,520 Speaker 1: That really helps us out. Huge thanks, as always, to 919 00:49:55,520 --> 00:49:58,719 Speaker 1: our excellent audio producer, Seth Nicholas Johnson. If you would 920 00:49:58,760 --> 00:50:00,440 Speaker 1: like to get in touch with us with feedback on 921 00:50:00,480 --> 00:50:03,120 Speaker 1: this episode or any other, to suggest a topic for the future, 922 00:50:03,239 --> 00:50:06,000 Speaker 1: or just to say hey, you can email us at contact 923 00:50:06,080 --> 00:50:15,840 Speaker 1: at Stuff to Blow Your Mind dot com. Stuff to 924 00:50:15,840 --> 00:50:17,839 Speaker 1: Blow Your Mind is a production of iHeartRadio's How 925 00:50:17,840 --> 00:50:20,239 Speaker 1: Stuff Works. For more podcasts from iHeartRadio, visit 926 00:50:20,280 --> 00:50:22,920 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 927 00:50:22,920 --> 00:50:33,080 Speaker 1: listen to your favorite shows.