1 00:00:01,480 --> 00:00:04,960 Speaker 1: Welcome to Stuff you Should Know, a production of iHeartRadio. 2 00:00:11,400 --> 00:00:13,920 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:14,040 --> 00:00:16,560 Speaker 2: Chuck and Jerry's here too, and we are getting down 4 00:00:16,600 --> 00:00:19,319 Speaker 2: to business, getting right to it here on stuff you 5 00:00:19,320 --> 00:00:22,440 Speaker 2: should know, because we've got a lot to cover here. 6 00:00:22,960 --> 00:00:26,320 Speaker 2: That's right. So, Chuck, I got a little bit of 7 00:00:26,360 --> 00:00:26,800 Speaker 2: an intro. 8 00:00:27,680 --> 00:00:30,360 Speaker 1: Let's hear it was that it? That wasn't it? 9 00:00:32,960 --> 00:00:37,720 Speaker 2: Do you remember how homeostasis used to come up a lot? Yes, So, 10 00:00:38,000 --> 00:00:39,960 Speaker 2: for those of you who haven't been listening that long, 11 00:00:40,040 --> 00:00:42,720 Speaker 2: homeostasis is what your body, in your mind and your 12 00:00:42,720 --> 00:00:45,320 Speaker 2: brain wants to return to. Right. You just want everything 13 00:00:45,400 --> 00:00:48,919 Speaker 2: nice and even, keel and normal and without exerting too 14 00:00:49,000 --> 00:00:51,360 Speaker 2: much effort and energy. Right, that's homeostasis. 15 00:00:52,200 --> 00:00:54,440 Speaker 1: That's are you asking me? Sure? 16 00:00:54,800 --> 00:00:59,640 Speaker 2: Okay? So one of the ways that your brain returns 17 00:00:59,680 --> 00:01:02,520 Speaker 2: to homeostasis as fast as it can is to use 18 00:01:02,560 --> 00:01:07,160 Speaker 2: shortcuts in making decisions, right, Because if you're having to 19 00:01:07,160 --> 00:01:10,480 Speaker 2: decide something, you're actively being challenged, you have to You're 20 00:01:10,480 --> 00:01:15,000 Speaker 2: not in your homeostatic space. So if you use a shortcut, 21 00:01:15,240 --> 00:01:19,319 Speaker 2: you can say something like I've had the red apple 22 00:01:19,480 --> 00:01:22,720 Speaker 2: in the past and it was delicious, I've eaten the 23 00:01:22,760 --> 00:01:25,640 Speaker 2: brown mushy one before and it was awful. I'm going 24 00:01:25,720 --> 00:01:28,240 Speaker 2: to eat this red apple, right. Rather than going to 25 00:01:28,280 --> 00:01:31,360 Speaker 2: the trouble of pulling both apples out and like analyzing 26 00:01:31,400 --> 00:01:33,320 Speaker 2: them with a microscope and all that, you can just 27 00:01:33,400 --> 00:01:36,200 Speaker 2: kind of use a little shortcut. That's a heuristic, and 28 00:01:36,280 --> 00:01:38,560 Speaker 2: it makes a lot of sense because your brain is like, great, 29 00:01:38,600 --> 00:01:41,319 Speaker 2: I didn't use that much energy. I made the right decision, 30 00:01:41,680 --> 00:01:45,959 Speaker 2: and we're good to go. The problem that comes about, though, 31 00:01:46,080 --> 00:01:49,360 Speaker 2: is that with heuristics, you're not always right. You don't 32 00:01:49,400 --> 00:01:51,840 Speaker 2: always make the right decision, you're not always taking all 33 00:01:51,840 --> 00:01:55,040 Speaker 2: of the information into account, and when that happens, you 34 00:01:55,120 --> 00:01:59,720 Speaker 2: start stumbling into cognitive biases. 35 00:02:00,680 --> 00:02:04,040 Speaker 1: Yeah. Like, this is a frustrating episode because I feel 36 00:02:04,080 --> 00:02:08,000 Speaker 1: like the title could be cognitive bias is everything you 37 00:02:08,080 --> 00:02:09,120 Speaker 1: think you know is wrong. 
38 00:02:09,600 --> 00:02:11,520 Speaker 2: Yeah, well that's a great title. Let's go with that. 39 00:02:12,760 --> 00:02:14,760 Speaker 1: It just, it made me feel like a dummy the 40 00:02:14,760 --> 00:02:15,200 Speaker 1: whole time. 41 00:02:15,800 --> 00:02:19,400 Speaker 2: Oh, don't. You're not a dummy. All humans are 42 00:02:19,440 --> 00:02:22,360 Speaker 2: dummies as far as cognitive biases go. It's not just you, 43 00:02:22,840 --> 00:02:25,280 Speaker 2: and this stuff is hardwired into us because, like I 44 00:02:25,400 --> 00:02:28,760 Speaker 2: just said, we take mental shortcuts, and the problem, Chuck, 45 00:02:28,840 --> 00:02:33,160 Speaker 2: is what we're talking about mostly today are unconscious biases. Right, 46 00:02:33,480 --> 00:02:37,079 Speaker 2: there's conscious biases. We just usually call those biases, right, 47 00:02:37,120 --> 00:02:40,919 Speaker 2: those are the active challenges that you need to overcome 48 00:02:40,960 --> 00:02:45,359 Speaker 2: to be a better version of yourself. These are, like, unconscious. 49 00:02:45,400 --> 00:02:48,680 Speaker 2: So there's not a lot you can do about it. 50 00:02:48,960 --> 00:02:50,600 Speaker 2: Although at the end we're going to kind of give 51 00:02:50,600 --> 00:02:53,800 Speaker 2: you some tips and pointers, but it's a challenge for 52 00:02:53,840 --> 00:02:56,280 Speaker 2: absolutely everybody. It doesn't, it doesn't make you dumb. 53 00:02:56,440 --> 00:02:59,080 Speaker 1: Yeah, I mean I think the tips and things can help, 54 00:02:59,200 --> 00:03:02,400 Speaker 1: for sure, But it's just part of being human, you know, 55 00:03:02,600 --> 00:03:05,280 Speaker 1: the unconscious bias, and there's not a lot we can 56 00:03:05,320 --> 00:03:09,360 Speaker 1: do to completely eradicate them. And if it's a, you know, 57 00:03:10,120 --> 00:03:13,320 Speaker 1: if it's a real problem, then I'm sorry. 58 00:03:14,919 --> 00:03:17,040 Speaker 2: Well. One of the big problems that we all kind 59 00:03:17,040 --> 00:03:24,440 Speaker 2: of face is that we are predictably irrational, as 60 00:03:24,600 --> 00:03:30,480 Speaker 2: was said by Dan Ariely, who is a behavioral economist. 61 00:03:30,840 --> 00:03:35,720 Speaker 2: And because of that, corporations, marketers, basically everybody who wants 62 00:03:35,720 --> 00:03:38,320 Speaker 2: to sell you something knows about these things, and they 63 00:03:38,320 --> 00:03:40,840 Speaker 2: can manipulate those things. They can trick you into making 64 00:03:40,880 --> 00:03:42,440 Speaker 2: decisions you wouldn't otherwise make. 65 00:03:42,800 --> 00:03:45,880 Speaker 1: Yeah, for sure. And we wouldn't even be here probably 66 00:03:45,920 --> 00:03:48,920 Speaker 1: talking about this had it not been for two kind of 67 00:03:49,040 --> 00:03:52,360 Speaker 1: revolutionary thinkers who ended up being some of the more 68 00:03:52,360 --> 00:03:56,600 Speaker 1: oft-cited researchers in the history of research, as we'll learn, 69 00:03:56,680 --> 00:04:00,400 Speaker 1: especially when it comes to economics. And they were a 70 00:04:00,440 --> 00:04:04,920 Speaker 1: couple of psychologists, Israeli psychologists, named Amos... and I looked 71 00:04:05,000 --> 00:04:06,840 Speaker 1: up different ways to pronounce this because we always get 72 00:04:06,920 --> 00:04:12,160 Speaker 1: guff, and I've heard everything from a hard-T "Tversky" to his 73 00:04:12,440 --> 00:04:16,600 Speaker 1: colleague Daniel Kahneman doing more of one of those soft "Versky"s.
74 00:04:17,920 --> 00:04:19,840 Speaker 2: Oh, I've just heard him referred to as Big T. 75 00:04:20,880 --> 00:04:24,039 Speaker 1: The Big T, because it's T-V. But those were the 76 00:04:24,040 --> 00:04:26,240 Speaker 1: two guys working together. They developed this concept in the 77 00:04:26,279 --> 00:04:32,839 Speaker 1: seventies at the Hebrew University of Jerusalem and really like 78 00:04:33,279 --> 00:04:36,760 Speaker 1: got down to it pretty quickly as a result of 79 00:04:37,279 --> 00:04:41,839 Speaker 1: Kahneman, I think, taking some issue with the Big T's research, 80 00:04:41,920 --> 00:04:44,320 Speaker 1: and I guess they kind of bonded over that or something. 81 00:04:45,320 --> 00:04:50,159 Speaker 2: Yeah, it was pretty cool because Tversky, basically he was 82 00:04:50,279 --> 00:04:54,760 Speaker 2: a mathematical psychologist, which anytime you hear mathematical and it's 83 00:04:55,120 --> 00:04:57,239 Speaker 2: to do with something other than math, what that means 84 00:04:57,320 --> 00:05:00,640 Speaker 2: is you've taken something and you've set it out in 85 00:05:00,640 --> 00:05:03,359 Speaker 2: a very standardized way, so you can explore it, you 86 00:05:03,360 --> 00:05:06,080 Speaker 2: can teach it based on certain facets. And the upshot 87 00:05:06,120 --> 00:05:09,760 Speaker 2: of mathematical psychology as far as human behavior goes: these 88 00:05:09,760 --> 00:05:12,279 Speaker 2: are the people who came up with the idea 89 00:05:12,800 --> 00:05:17,680 Speaker 2: that humans behave as rational actors. We're self-interested, we 90 00:05:17,760 --> 00:05:20,359 Speaker 2: take all the best information available to make the best 91 00:05:20,920 --> 00:05:25,000 Speaker 2: decision for ourselves. And Daniel Kahneman was like, this is 92 00:05:25,040 --> 00:05:29,320 Speaker 2: not at all true, and he started challenging Amos Tversky's 93 00:05:30,279 --> 00:05:34,000 Speaker 2: theories, and Tversky, instead of saying like, no, you shut up, 94 00:05:34,440 --> 00:05:36,200 Speaker 2: he was like, all right, let's go figure it out, let's 95 00:05:36,200 --> 00:05:38,240 Speaker 2: get to the bottom of this. And because of that, Yeah, 96 00:05:38,480 --> 00:05:42,479 Speaker 2: they formed this partnership that had a huge impact on the world. Yeah. 97 00:05:42,480 --> 00:05:46,159 Speaker 1: I think it's kind of heartening that they, as academics, 98 00:05:46,600 --> 00:05:49,160 Speaker 1: you know, got together. There were no ruffled feathers, or 99 00:05:49,200 --> 00:05:51,360 Speaker 1: at least it didn't end up that way, and they 100 00:05:51,400 --> 00:05:53,480 Speaker 1: worked together. It's kind of a heartening thing, I think, 101 00:05:53,520 --> 00:05:54,000 Speaker 1: these days. 102 00:05:54,120 --> 00:05:55,440 Speaker 2: Yeah, there's got to be at least one. 103 00:05:56,240 --> 00:05:59,080 Speaker 1: Yeah, that's right.
They came up with a program called 104 00:05:59,080 --> 00:06:04,320 Speaker 1: the Heuristics and Biases Program to basically, you know, study 105 00:06:04,320 --> 00:06:06,719 Speaker 1: how human beings make their decisions, how they go through 106 00:06:06,760 --> 00:06:11,480 Speaker 1: life making choices when they don't have like all the 107 00:06:11,480 --> 00:06:14,240 Speaker 1: information at hand, all the most perfect information to make 108 00:06:14,279 --> 00:06:17,479 Speaker 1: that choice, or if they don't have like all the 109 00:06:17,520 --> 00:06:19,719 Speaker 1: time in the world to look at the information that 110 00:06:19,760 --> 00:06:22,240 Speaker 1: they do have to make that choice. So, like, how 111 00:06:22,240 --> 00:06:25,880 Speaker 1: are people making decisions? How are they making mistakes in 112 00:06:25,960 --> 00:06:28,839 Speaker 1: their decision making? And they ended up coming up with 113 00:06:28,880 --> 00:06:32,840 Speaker 1: a couple of different systems, one which is super quick, 114 00:06:32,839 --> 00:06:34,320 Speaker 1: and one which is much more deliberate. 115 00:06:35,160 --> 00:06:37,960 Speaker 2: Yeah, Daniel Kahneman came out with Thinking Fast and Slow, 116 00:06:38,000 --> 00:06:42,440 Speaker 2: which was one of those super popular airport books. 117 00:06:42,360 --> 00:06:45,480 Speaker 1: You know, yeah, Thinking, comma, Fast and Slow. 118 00:06:45,760 --> 00:06:50,159 Speaker 2: Yes, thank you, Eats, Shoots and Leaves, that's right. And 119 00:06:50,200 --> 00:06:53,520 Speaker 2: in it he basically lays out this kind of shorthand model. 120 00:06:53,640 --> 00:06:56,240 Speaker 2: He's very explicit to say, like, this is not, this 121 00:06:56,320 --> 00:06:58,920 Speaker 2: is not how, like, your brain is actually laid out, 122 00:06:59,000 --> 00:07:01,120 Speaker 2: but it's a good metaphor for it. And System one 123 00:07:01,640 --> 00:07:05,400 Speaker 2: is how you think quickly, You think almost unconsciously, you 124 00:07:05,440 --> 00:07:09,400 Speaker 2: make rapid decisions, and that is kind of how we 125 00:07:09,520 --> 00:07:13,520 Speaker 2: generally navigate life. System two is much more deliberate. It's 126 00:07:13,520 --> 00:07:16,840 Speaker 2: where we take into account like different ideas, It's where 127 00:07:16,880 --> 00:07:20,120 Speaker 2: we really stop and think about something before making a decision, 128 00:07:20,600 --> 00:07:26,040 Speaker 2: and they're essentially competing. There's something called interference, and system 129 00:07:26,120 --> 00:07:29,720 Speaker 2: one has a really great tendency to interfere with system two. 130 00:07:30,360 --> 00:07:32,640 Speaker 2: And there was a psychologist working all the way back 131 00:07:32,640 --> 00:07:37,440 Speaker 2: in nineteen thirty five named John Ridley Stroop who basically 132 00:07:37,480 --> 00:07:40,200 Speaker 2: discovered the Stroop effect, which is a way of demonstrating 133 00:07:40,280 --> 00:07:44,560 Speaker 2: how system one interferes with the slower, more deliberate system two. 134 00:07:44,960 --> 00:07:47,000 Speaker 1: Yeah, I bet. Oh boy, I bet he patted himself 135 00:07:47,040 --> 00:07:49,520 Speaker 1: on the back after this one, because it's one of 136 00:07:49,520 --> 00:07:52,400 Speaker 1: those things that's so simple, But I bet he winked 137 00:07:52,400 --> 00:07:55,040 Speaker 1: at everyone like, watch this. Yeah, this is going to 138 00:07:55,080 --> 00:07:55,720 Speaker 1: break your brain.
139 00:07:55,960 --> 00:07:56,760 Speaker 2: It's genius. 140 00:07:57,000 --> 00:07:59,280 Speaker 1: It really kind of is. So what they did was 141 00:07:59,320 --> 00:08:04,920 Speaker 1: they simply wrote down the names of colors, but they 142 00:08:04,920 --> 00:08:06,760 Speaker 1: would write down the name of that color in a 143 00:08:06,800 --> 00:08:10,440 Speaker 1: different color, and then he would just ask 144 00:08:10,480 --> 00:08:12,960 Speaker 1: the person to say out loud the color that the 145 00:08:13,000 --> 00:08:15,360 Speaker 1: word is written in, not the word that is written. 146 00:08:15,960 --> 00:08:19,280 Speaker 1: And it is surprisingly difficult to do that. It's just 147 00:08:19,360 --> 00:08:21,160 Speaker 1: a little weird brain-breaking thing. 148 00:08:21,760 --> 00:08:25,480 Speaker 2: Yeah. Yeah, so he's showing that your system one just 149 00:08:25,480 --> 00:08:27,760 Speaker 2: wants to hurry up and read it, yeah, and it's 150 00:08:27,800 --> 00:08:31,760 Speaker 2: getting it wrong, and that's interference, right. So that kind 151 00:08:31,760 --> 00:08:34,360 Speaker 2: of like started to lay the groundwork for this idea 152 00:08:34,440 --> 00:08:37,000 Speaker 2: that we do have kind of competing ways of seeing 153 00:08:37,080 --> 00:08:43,000 Speaker 2: the world and making decisions. And what Kahneman was saying 154 00:08:43,240 --> 00:08:47,200 Speaker 2: is that most of the decisions we're walking around making 155 00:08:47,240 --> 00:08:52,000 Speaker 2: are actually the System one super fast shorthand decisions. But 156 00:08:52,160 --> 00:08:55,280 Speaker 2: we think that we're using our more rational mind because 157 00:08:55,320 --> 00:08:59,080 Speaker 2: we make post hoc explanations for why we decided that. 158 00:09:00,080 --> 00:09:02,400 Speaker 2: And that's not to say we're all walking around with 159 00:09:02,440 --> 00:09:05,559 Speaker 2: these creepy little secrets that we know, like we're fooling ourselves. 160 00:09:05,800 --> 00:09:08,600 Speaker 2: We don't realize we're doing this. That's why these biases 161 00:09:08,600 --> 00:09:11,280 Speaker 2: are unconscious. Even if you stop and think about what 162 00:09:11,360 --> 00:09:13,960 Speaker 2: you're doing, you may still not come up with the 163 00:09:14,040 --> 00:09:17,440 Speaker 2: answer like, oh, yeah, I was making up explanations after 164 00:09:17,559 --> 00:09:20,680 Speaker 2: the fact to explain why I actually used System two 165 00:09:20,720 --> 00:09:23,240 Speaker 2: when I didn't. It's really hard to do that. 166 00:09:23,720 --> 00:09:26,480 Speaker 1: Yeah, for sure, Livia gives a pretty good example of that. 167 00:09:26,520 --> 00:09:29,560 Speaker 1: As far as like hiring somebody, someone may make an 168 00:09:29,600 --> 00:09:33,960 Speaker 1: impression in an interview that kind of locks it up 169 00:09:33,960 --> 00:09:36,040 Speaker 1: from the second they walk in. Maybe they look like 170 00:09:36,080 --> 00:09:39,200 Speaker 1: their mom or dad, or a relative, or maybe they 171 00:09:39,280 --> 00:09:41,920 Speaker 1: remind them of themselves, or maybe, like, who knows what 172 00:09:41,960 --> 00:09:44,280 Speaker 1: it could be.
Then they end up getting that job, 173 00:09:44,480 --> 00:09:46,640 Speaker 1: and later if you ask the person who hired them, 174 00:09:47,120 --> 00:09:49,400 Speaker 1: you might say, oh, what it was because of this, 175 00:09:49,400 --> 00:09:51,720 Speaker 1: this and this and this, when in fact that's really 176 00:09:51,760 --> 00:09:55,160 Speaker 1: just system too kind of confirming, Like now it's because 177 00:09:55,200 --> 00:09:58,120 Speaker 1: the guy walked in wearing a New York Giants T shirt. 178 00:09:58,679 --> 00:10:01,840 Speaker 2: Yeah, and we'll get into some of the problems with 179 00:10:01,880 --> 00:10:04,559 Speaker 2: the stuff throughout, but this is a good example. Right 180 00:10:04,600 --> 00:10:07,760 Speaker 2: if the opposite happened, If like you didn't hire somebody 181 00:10:07,840 --> 00:10:11,319 Speaker 2: because they weren't quite like you, that's an example of 182 00:10:11,440 --> 00:10:14,480 Speaker 2: a bias too, even if you don't think that that's 183 00:10:14,520 --> 00:10:16,480 Speaker 2: why you did it. If you're looking at their CV 184 00:10:16,600 --> 00:10:18,760 Speaker 2: afterward and you're like, oh, they didn't graduate from college, 185 00:10:18,760 --> 00:10:22,160 Speaker 2: it's why. But really it was not because you're racist, 186 00:10:22,160 --> 00:10:24,920 Speaker 2: it's not because you're a woman hater. It was because 187 00:10:24,960 --> 00:10:28,520 Speaker 2: you're preserving your own level of comfort because other groups 188 00:10:28,520 --> 00:10:31,560 Speaker 2: that are different than you make you uncomfortable. And that's 189 00:10:31,559 --> 00:10:35,200 Speaker 2: how groups can become entrenched. Right You just once one 190 00:10:35,240 --> 00:10:38,200 Speaker 2: group kind of dominates an organization, they tend to continue 191 00:10:38,240 --> 00:10:41,760 Speaker 2: doing that because people hire other people who they're comfortable around, 192 00:10:41,880 --> 00:10:45,040 Speaker 2: rather than pushing themselves outside of their comfort zone and 193 00:10:45,200 --> 00:10:49,520 Speaker 2: probably improving their organization. And that's why diversity programs exist 194 00:10:49,600 --> 00:10:51,959 Speaker 2: in the first place, because of that human tendency. 195 00:10:52,320 --> 00:10:54,040 Speaker 1: Yeah, or maybe they were just a Jets fan. 196 00:10:54,720 --> 00:10:57,520 Speaker 2: That's possible. I mean, no Jets is going to hire 197 00:10:57,559 --> 00:10:58,320 Speaker 2: a Giants fan. 198 00:10:58,679 --> 00:11:00,480 Speaker 1: Yeah. Well, here's a tip. I don't know a lot 199 00:11:00,480 --> 00:11:02,559 Speaker 1: about interviewing other than be yourself and try and get 200 00:11:02,559 --> 00:11:05,199 Speaker 1: someone to like you. But don't go into any interview 201 00:11:05,320 --> 00:11:08,560 Speaker 1: wearing any sort of branded sports apparel. 202 00:11:09,480 --> 00:11:13,440 Speaker 2: Yeah, especially a jersey. Yeah, I think that says quite 203 00:11:13,440 --> 00:11:13,719 Speaker 2: a bit. 204 00:11:14,679 --> 00:11:17,640 Speaker 1: Yeah, you wear that Giants jersey in there. Well, I 205 00:11:17,640 --> 00:11:20,000 Speaker 1: guess you're rolling the dice. You've either got that job 206 00:11:20,280 --> 00:11:21,800 Speaker 1: right or there's no way you're going to get it. 207 00:11:21,840 --> 00:11:24,720 Speaker 1: So maybe it's not a bad idea. Then I don't 208 00:11:24,720 --> 00:11:26,160 Speaker 1: even know what I'm I might be wrong. 
209 00:11:27,280 --> 00:11:29,240 Speaker 2: Yeah, I mean, I guess if you dressed it up 210 00:11:29,240 --> 00:11:31,440 Speaker 2: with a bow tie, maybe you could get away with it. 211 00:11:31,480 --> 00:11:33,719 Speaker 2: But yes, it is still a gamble regardless. 212 00:11:33,960 --> 00:11:35,600 Speaker 1: Well, not all jobs you have to wear a suit 213 00:11:35,640 --> 00:11:36,079 Speaker 1: and tie to, you realize. 214 00:11:36,040 --> 00:11:39,240 Speaker 2: I know. But I'm saying, like, you dress for 215 00:11:39,280 --> 00:11:41,520 Speaker 2: the part you want. If you're wearing a Giants jersey 216 00:11:41,559 --> 00:11:44,160 Speaker 2: with the bow tie, I think you're making a good 217 00:11:44,160 --> 00:11:45,200 Speaker 2: impression out of the gate. 218 00:11:46,160 --> 00:11:50,320 Speaker 1: All right. So I guess we can talk about... We're 219 00:11:50,320 --> 00:11:52,480 Speaker 1: going to go through a list of about ten different 220 00:11:52,520 --> 00:11:56,040 Speaker 1: biases, and they're all pretty interesting, and I know everyone 221 00:11:56,080 --> 00:11:58,480 Speaker 1: can identify with probably each of these at some point. 222 00:11:58,760 --> 00:12:01,800 Speaker 1: But before we do that, we need to point out that, like, 223 00:12:02,360 --> 00:12:05,120 Speaker 1: these are all mental shortcuts or the result of a 224 00:12:05,160 --> 00:12:07,840 Speaker 1: mental shortcut, but not all of them work in the 225 00:12:07,840 --> 00:12:10,160 Speaker 1: same way. And in how our brains work, a lot of, 226 00:12:10,240 --> 00:12:11,760 Speaker 1: you know, there could be a lot of things at play. 227 00:12:12,720 --> 00:12:16,640 Speaker 1: Emotions can come into play, maybe, like we were just 228 00:12:16,679 --> 00:12:20,120 Speaker 1: talking about, like it's hard to reassess something after you've 229 00:12:20,160 --> 00:12:24,720 Speaker 1: gotten a first impression. People, humans, historically through their lives 230 00:12:24,800 --> 00:12:27,600 Speaker 1: tend to make bad guesses at things, because if you 231 00:12:27,640 --> 00:12:31,040 Speaker 1: make great guesses, then things like, you know, gambling would 232 00:12:31,040 --> 00:12:33,800 Speaker 1: be super easy. So all of these things come into play. 233 00:12:35,720 --> 00:12:37,880 Speaker 1: There's not just, like, a single way that this 234 00:12:37,880 --> 00:12:40,280 Speaker 1: system is broken. 235 00:12:39,880 --> 00:12:44,000 Speaker 2: Right, yeah, yeah, yeah. But people generally, it seems like 236 00:12:44,160 --> 00:12:46,800 Speaker 2: universally in a lot of these cases, behave in these 237 00:12:47,240 --> 00:12:49,760 Speaker 2: ways under the same circumstances. 238 00:12:50,440 --> 00:12:53,440 Speaker 1: Yeah, like some of their stuff wasn't replicable, but that's 239 00:12:53,520 --> 00:12:57,160 Speaker 1: sort of standard for studies in psychology. Like a lot 240 00:12:57,160 --> 00:12:59,920 Speaker 1: of this stuff, as we'll see, has checked out across cultures. 241 00:13:00,240 --> 00:13:03,520 Speaker 2: Yeah, which is huge, you know, considering the whole WEIRD 242 00:13:03,760 --> 00:13:05,640 Speaker 2: problem in psychology in particular. 243 00:13:05,559 --> 00:13:09,040 Speaker 1: That WEIRD people. 244 00:13:08,760 --> 00:13:14,559 Speaker 2: Are Western, Educated, Industrialized... yeah, Rich, and Democratic, I think. 245 00:13:15,520 --> 00:13:18,040 Speaker 1: All right, so we'll start.
We'll leave the biggest guy 246 00:13:18,120 --> 00:13:20,440 Speaker 1: for last, I think, which will be after a break probably, 247 00:13:20,440 --> 00:13:23,600 Speaker 1: but we'll start then with maybe hindsight bias. And this 248 00:13:23,679 --> 00:13:27,240 Speaker 1: is the idea that after something has occurred and we 249 00:13:27,320 --> 00:13:28,959 Speaker 1: talked about this one before here and there, that we 250 00:13:30,080 --> 00:13:31,800 Speaker 1: tend to think like, oh, well, of course that was 251 00:13:31,840 --> 00:13:34,400 Speaker 1: going to happen. In fact, not only was that should 252 00:13:34,440 --> 00:13:37,160 Speaker 1: I have seen that coming, It was probably inevitable that 253 00:13:37,240 --> 00:13:40,800 Speaker 1: that happened. And a lot of a time maybe because 254 00:13:40,840 --> 00:13:45,640 Speaker 1: you're misremembering your expectation before it even happened, right. 255 00:13:45,960 --> 00:13:49,600 Speaker 2: Like we can rearrange our memory of how we felt 256 00:13:49,600 --> 00:13:52,040 Speaker 2: about the event or the outcome of the event afterward 257 00:13:52,080 --> 00:13:55,120 Speaker 2: to basically match the outcome. Yeah, I guess because we 258 00:13:55,160 --> 00:13:56,840 Speaker 2: have this number ending need to be right. 259 00:13:57,559 --> 00:13:59,000 Speaker 1: Yeah, that probably had something to do with it. 260 00:13:59,640 --> 00:14:05,080 Speaker 2: I knew you were going to say that, so I've 261 00:14:05,080 --> 00:14:08,240 Speaker 2: got another one for you, Chuck all right, self serving 262 00:14:08,280 --> 00:14:13,280 Speaker 2: bias combined with a little fundamental attribution error on the side. 263 00:14:13,240 --> 00:14:15,359 Speaker 1: Yeah, that's a good one. It's a good side dish. 264 00:14:15,640 --> 00:14:19,760 Speaker 2: So these things basically go hand in hand. It's basically 265 00:14:19,760 --> 00:14:22,360 Speaker 2: how we see ourselves in a great light and how 266 00:14:22,400 --> 00:14:24,760 Speaker 2: we see other people in a more negative light. Self 267 00:14:24,800 --> 00:14:28,240 Speaker 2: serving bias is basically saying, if something good happens to you, 268 00:14:28,400 --> 00:14:31,400 Speaker 2: it's because you are good, like you earned it. It's 269 00:14:31,440 --> 00:14:34,640 Speaker 2: because of you doing something right. Something bad happens to you, 270 00:14:35,040 --> 00:14:41,000 Speaker 2: it's external forces that made that happen, right. Fundamental attribution 271 00:14:41,160 --> 00:14:43,720 Speaker 2: error is the exact opposite with other people, If they 272 00:14:43,800 --> 00:14:46,320 Speaker 2: do something right, it was just luck. If something bad 273 00:14:46,360 --> 00:14:49,680 Speaker 2: happens to them, it's their own fault. So good example 274 00:14:49,680 --> 00:14:52,000 Speaker 2: of this is like if a coworker comes in late 275 00:14:52,080 --> 00:14:55,040 Speaker 2: one day, you're like, they're just lazy and slack, but 276 00:14:55,080 --> 00:14:57,040 Speaker 2: then you come in late the next day and you're 277 00:14:57,080 --> 00:14:58,160 Speaker 2: like it was traffic. 278 00:14:58,600 --> 00:14:58,840 Speaker 1: Right. 279 00:14:59,160 --> 00:15:01,720 Speaker 2: That's basically the two things going hand in hand, and 280 00:15:01,760 --> 00:15:03,520 Speaker 2: those are those are both biases. 
281 00:15:04,000 --> 00:15:07,320 Speaker 1: Yeah, and I hope people understand that, like, all of 282 00:15:07,360 --> 00:15:11,480 Speaker 1: those things can also be true. You know. Sure, so 283 00:15:11,880 --> 00:15:14,040 Speaker 1: if you're if you're thinking like, well, no, but some... 284 00:15:14,440 --> 00:15:17,520 Speaker 1: you know, sometimes I did deserve the thing, and sometimes 285 00:15:17,520 --> 00:15:20,520 Speaker 1: it was someone's fault. Yeah, sure, that can happen. 286 00:15:20,600 --> 00:15:22,840 Speaker 1: We're not... These aren't absolutes. 287 00:15:22,960 --> 00:15:27,800 Speaker 2: No, it's more just, yeah, your, your tendency to 288 00:15:27,840 --> 00:15:30,480 Speaker 2: think in certain ways. Yeah, sometimes you're going to be right. 289 00:15:30,520 --> 00:15:31,640 Speaker 2: Sometimes you're going to be wrong. 290 00:15:31,880 --> 00:15:34,480 Speaker 1: Like humanity's tendency. Yeah, you got to take a big 291 00:15:34,480 --> 00:15:35,160 Speaker 1: broad view here. 292 00:15:35,800 --> 00:15:41,080 Speaker 2: Yeah, but also you specifically, right, you, James Kirkland, listening 293 00:15:41,160 --> 00:15:41,840 Speaker 2: in Baltimore. 294 00:15:42,080 --> 00:15:44,960 Speaker 1: There's one. Oh, man, James Kirkland is going to pull 295 00:15:44,960 --> 00:15:47,600 Speaker 1: over to the side of the road right now and really 296 00:15:47,640 --> 00:15:48,120 Speaker 1: freak out. 297 00:15:48,320 --> 00:15:50,320 Speaker 2: I hope, man, I hope I nailed it. 298 00:15:50,800 --> 00:15:52,520 Speaker 1: Yeah, I think you picked a common enough name. 299 00:15:52,920 --> 00:15:53,880 Speaker 2: We'll see, all right. 300 00:15:54,400 --> 00:15:58,160 Speaker 1: So anchoring bias is another one. This one I've fallen 301 00:15:58,160 --> 00:15:59,960 Speaker 1: prey to. I'm gonna say that probably about all these, 302 00:16:00,080 --> 00:16:02,760 Speaker 1: but that is the first piece of info that you 303 00:16:02,800 --> 00:16:06,520 Speaker 1: get about something can really affect, and even in a 304 00:16:06,760 --> 00:16:10,960 Speaker 1: very disproportionate way, things that happen after that. Like, once 305 00:16:10,960 --> 00:16:13,960 Speaker 1: something is kind of locked in, it's hard to unwind that. 306 00:16:14,560 --> 00:16:18,560 Speaker 2: Yeah, that first piece of information, it's like, oh, okay, 307 00:16:17,800 --> 00:16:21,640 Speaker 2: this is going to basically prime you in your answer, 308 00:16:21,760 --> 00:16:24,400 Speaker 2: your decision. Right. So a good example I saw is 309 00:16:24,680 --> 00:16:27,880 Speaker 2: there was a study that says, like, okay, the Mississippi 310 00:16:27,960 --> 00:16:30,840 Speaker 2: River is less than two thousand miles long, how long is it? 311 00:16:31,160 --> 00:16:33,560 Speaker 2: And those people would say something like fifteen hundred miles. 312 00:16:34,240 --> 00:16:36,960 Speaker 2: And then other people would say, okay, the Mississippi River 313 00:16:37,120 --> 00:16:40,240 Speaker 2: is less than five hundred miles long, and people would 314 00:16:40,240 --> 00:16:42,760 Speaker 2: say like, it's like three hundred miles. And then another 315 00:16:42,840 --> 00:16:45,760 Speaker 2: group was told the Mississippi River is less than eighty miles long. 316 00:16:45,920 --> 00:16:49,280 Speaker 2: Those people would answer like sixty. It's the same thing, 317 00:16:49,360 --> 00:16:52,479 Speaker 2: the length of the Mississippi River.
But they were presented 318 00:16:52,520 --> 00:16:55,640 Speaker 2: with this basically this priming number, a large one, a 319 00:16:55,680 --> 00:16:59,000 Speaker 2: middle number, or a smaller number, and their answers were 320 00:16:59,040 --> 00:17:02,640 Speaker 2: related to that first piece of information that they got, 321 00:17:02,840 --> 00:17:04,159 Speaker 2: and that's anchoring bias. 322 00:17:04,640 --> 00:17:07,600 Speaker 1: Yeah, And Livia pointed out another little side this year, 323 00:17:07,600 --> 00:17:09,760 Speaker 1: which is called the decoy effect, and that's when you 324 00:17:10,480 --> 00:17:12,399 Speaker 1: will go into a restaurant and they might and this 325 00:17:12,520 --> 00:17:15,800 Speaker 1: is how just kind of one way this can affect economics, 326 00:17:16,000 --> 00:17:17,840 Speaker 1: which will come up a lot. But you'll go into 327 00:17:17,880 --> 00:17:21,200 Speaker 1: a restaurant and they might have one super expensive bottle 328 00:17:21,240 --> 00:17:24,320 Speaker 1: of wine on the menu, and maybe it's even placed 329 00:17:24,359 --> 00:17:26,600 Speaker 1: at the top so you see it first, and then 330 00:17:26,760 --> 00:17:29,199 Speaker 1: the other bottles of wine might seem like a decent 331 00:17:29,280 --> 00:17:31,680 Speaker 1: deal after that, even if they're also overpriced. 332 00:17:31,840 --> 00:17:34,200 Speaker 2: Yeah, exploitation, but. 333 00:17:34,240 --> 00:17:37,840 Speaker 1: All wine and restaurants is overpriced. I hope everyone realizes that. 334 00:17:37,840 --> 00:17:39,560 Speaker 2: I mean buy a lot, right, Yeah, I think that 335 00:17:39,640 --> 00:17:41,040 Speaker 2: most of their margin on that. 336 00:17:41,280 --> 00:17:45,360 Speaker 1: Oh, totally. It's it's frustrating, but it's that's that's the business. 337 00:17:45,680 --> 00:17:48,080 Speaker 2: Well, that's also I think, chuck. This anchoring bias is 338 00:17:48,080 --> 00:17:50,600 Speaker 2: why they say you should never lead in a negotiation 339 00:17:50,720 --> 00:17:53,200 Speaker 2: with your actual price that you want to go high 340 00:17:53,280 --> 00:17:55,400 Speaker 2: or lower depending on your position. 341 00:17:56,080 --> 00:17:58,119 Speaker 1: Oh like if they ask what you want to get paid? 342 00:17:58,560 --> 00:18:01,399 Speaker 2: Yeah, or like you're trying question or something like that. 343 00:18:01,480 --> 00:18:04,040 Speaker 1: You know, yeah, I hate all that stuff. 344 00:18:04,680 --> 00:18:07,199 Speaker 2: Yeah. Yeah. If somebody it's like, well, what do you 345 00:18:07,280 --> 00:18:08,359 Speaker 2: think I should get paid? 346 00:18:08,440 --> 00:18:10,359 Speaker 1: You know, what are you looking to make at this job? 347 00:18:10,680 --> 00:18:12,840 Speaker 1: But that's what I'm saying, what you make is should 348 00:18:12,840 --> 00:18:13,719 Speaker 1: always be the answer. 349 00:18:14,040 --> 00:18:17,280 Speaker 2: Exactly. You just want to add I don't know, fifty 350 00:18:17,320 --> 00:18:19,280 Speaker 2: percent to what you actually want, and then you're away 351 00:18:19,320 --> 00:18:22,560 Speaker 2: for negotiation. There's the one other thing that's related to 352 00:18:22,600 --> 00:18:25,520 Speaker 2: this framing bias, and that's basically the same thing, but 353 00:18:25,600 --> 00:18:28,520 Speaker 2: rather than the first piece of information guiding you, this 354 00:18:28,680 --> 00:18:33,959 Speaker 2: is more directly guiding you. 
So for example, some drug 355 00:18:35,440 --> 00:18:38,760 Speaker 2: maker says ten percent of patients die, You're like, oh god, 356 00:18:38,880 --> 00:18:41,680 Speaker 2: that's a lot, right. You could say at the opposite weight, 357 00:18:41,800 --> 00:18:44,480 Speaker 2: ninety percent of patients live and you're like, oh that's great, 358 00:18:44,920 --> 00:18:47,840 Speaker 2: the same amount of people dying. It's just framed differently 359 00:18:47,920 --> 00:18:51,600 Speaker 2: to exploit your your response. 360 00:18:51,680 --> 00:18:53,440 Speaker 1: To exploit your aversion to dying. 361 00:18:55,600 --> 00:18:57,399 Speaker 2: And that's a human thing, isn't it? 362 00:18:57,480 --> 00:19:01,160 Speaker 1: For sure? Shall we take a break? Yeah, all right, 363 00:19:01,280 --> 00:19:03,080 Speaker 1: Josh said human things. So it's time for a break 364 00:19:03,080 --> 00:19:04,840 Speaker 1: and we're going to come back with more biases right 365 00:19:04,880 --> 00:19:05,359 Speaker 1: after this. 366 00:19:34,880 --> 00:19:37,640 Speaker 2: All right, chuck up to bat is number twenty three 367 00:19:38,119 --> 00:19:39,720 Speaker 2: availability heuristic? 368 00:19:40,680 --> 00:19:42,760 Speaker 1: Can we put like a stadium echo effect on that? 369 00:19:44,160 --> 00:19:46,440 Speaker 2: Next to bat? Many motos? 370 00:19:47,359 --> 00:19:47,800 Speaker 1: Very nice? 371 00:19:49,160 --> 00:19:53,639 Speaker 2: So what is that availability heuristic? And I'm sure this 372 00:19:53,720 --> 00:19:54,920 Speaker 2: has happened to you before. 373 00:19:54,760 --> 00:19:58,399 Speaker 1: Right, Yeah, none of these have ever happened to you, 374 00:19:58,440 --> 00:20:00,640 Speaker 1: which is a funny thing about us doing this episode. 375 00:20:00,640 --> 00:20:06,439 Speaker 1: But the availability heuristic is what you have available to 376 00:20:06,600 --> 00:20:08,679 Speaker 1: call up in your brain at any given moment, So 377 00:20:08,720 --> 00:20:12,400 Speaker 1: you're you're going to rely more on what you can 378 00:20:12,480 --> 00:20:16,439 Speaker 1: immediately think of in the moment, and chances are what 379 00:20:16,520 --> 00:20:18,800 Speaker 1: you're immediately able to think of in the moment as 380 00:20:18,800 --> 00:20:22,360 Speaker 1: something that probably aligns with your worldview or something like that, 381 00:20:22,400 --> 00:20:24,560 Speaker 1: which is a sort of a well, we won't talk 382 00:20:24,560 --> 00:20:27,280 Speaker 1: about the sea bias because that's coming up. 383 00:20:27,560 --> 00:20:30,160 Speaker 2: Well, yeah, or something that like really kind of goosed 384 00:20:30,200 --> 00:20:34,440 Speaker 2: you emotionally, like that's that's very available because it's wow, 385 00:20:34,560 --> 00:20:38,080 Speaker 2: you know, loud and scary in your mind kind of 386 00:20:38,119 --> 00:20:42,119 Speaker 2: you know, like if you saw something about a plane 387 00:20:42,200 --> 00:20:44,640 Speaker 2: crash in the last like day or so, right when 388 00:20:44,640 --> 00:20:47,639 Speaker 2: somebody asks you how frequent plane crashes are, you're probably 389 00:20:47,640 --> 00:20:49,600 Speaker 2: going to give a much higher estimate than you would 390 00:20:49,600 --> 00:20:52,400 Speaker 2: have before that, you know, maybe based on the number 391 00:20:52,400 --> 00:20:54,400 Speaker 2: of times you've flown and nothing bad happened. 392 00:20:54,720 --> 00:20:56,200 Speaker 1: Yeah, that's a good one. 
393 00:20:57,119 --> 00:21:01,719 Speaker 2: There's also an attentional blindness. And before anybody, before we 394 00:21:01,760 --> 00:21:04,359 Speaker 2: talk about this, because we're gonna spoil it. Yeah, yeah, 395 00:21:04,440 --> 00:21:06,919 Speaker 2: I want to send everybody, if you have the means 396 00:21:06,960 --> 00:21:10,800 Speaker 2: to do this, go onto YouTube and search for selective 397 00:21:10,920 --> 00:21:15,719 Speaker 2: attention test. And this is Daniel Simon's YouTube channel, and 398 00:21:15,760 --> 00:21:17,960 Speaker 2: then watch the test where the people are passing the 399 00:21:18,000 --> 00:21:20,919 Speaker 2: basketball back and forth. We'll wait a second. 400 00:21:21,240 --> 00:21:26,359 Speaker 1: Dude, Okay, that's enough. 401 00:21:26,400 --> 00:21:29,320 Speaker 2: All right, great, so hopefully you press pause and you 402 00:21:29,359 --> 00:21:31,680 Speaker 2: didn't just try to watch it while we were while 403 00:21:31,760 --> 00:21:33,240 Speaker 2: Chuck was doing the Jeopardy theme. 404 00:21:33,560 --> 00:21:35,000 Speaker 1: It is short, but it's not that short. 405 00:21:35,119 --> 00:21:37,320 Speaker 2: It's like a minute and a half or something, right, Yeah, 406 00:21:37,359 --> 00:21:40,520 Speaker 2: So tell them about this video, Chuck, because it's pretty great. 407 00:21:40,880 --> 00:21:44,119 Speaker 1: That's right. In the video, they have a group of 408 00:21:44,880 --> 00:21:48,800 Speaker 1: what was it like six people, probably six on the nose, 409 00:21:49,200 --> 00:21:52,280 Speaker 1: six college students I guess, Three are wearing white shirts, 410 00:21:52,320 --> 00:21:54,960 Speaker 1: three are not wearing white shirts, and they're in a 411 00:21:55,400 --> 00:21:59,520 Speaker 1: very tight, small circle. It looks very awkward. They have 412 00:21:59,680 --> 00:22:03,800 Speaker 1: a I think was it two basketballs. There's two groups, yeah, 413 00:22:03,800 --> 00:22:07,040 Speaker 1: two groups, two basketballs, and you're what you're told is 414 00:22:07,080 --> 00:22:09,520 Speaker 1: the tasket hand is to count the number of times 415 00:22:09,880 --> 00:22:14,080 Speaker 1: that people in white the white team are passing the basketball. 416 00:22:14,200 --> 00:22:18,680 Speaker 1: So you're counting, right, one, two, three, four, four, five, six, seven, 417 00:22:19,359 --> 00:22:21,040 Speaker 1: and that's all you're supposed to do. And at the 418 00:22:21,119 --> 00:22:23,160 Speaker 1: end you're supposed to say, you know how many times 419 00:22:23,240 --> 00:22:24,400 Speaker 1: they pass a basketball? 420 00:22:24,440 --> 00:22:24,680 Speaker 2: Right? 421 00:22:24,920 --> 00:22:27,000 Speaker 1: And now, now, hit them with the good stuff. 422 00:22:27,680 --> 00:22:32,440 Speaker 2: So apparently half of the people who do this, which 423 00:22:32,480 --> 00:22:35,680 Speaker 2: is astounding to me, half of the people who watch 424 00:22:35,760 --> 00:22:38,080 Speaker 2: this video and take this test don't notice that in 425 00:22:38,160 --> 00:22:41,440 Speaker 2: the middle of it, a person in a gorilla suit 426 00:22:41,520 --> 00:22:44,840 Speaker 2: walks into frame and turns to the camera and I 427 00:22:44,880 --> 00:22:48,119 Speaker 2: think beats on their chest and then walks out of frame. 
428 00:22:48,440 --> 00:22:50,840 Speaker 2: Like in the middle of these people throwing these basketballs 429 00:22:50,840 --> 00:22:54,040 Speaker 2: around the gorilla, half of the people are paying such 430 00:22:54,080 --> 00:22:58,600 Speaker 2: close attention to counting how many times the people wearing 431 00:22:58,600 --> 00:23:00,760 Speaker 2: white T shirts are passing the basket well that they 432 00:23:00,880 --> 00:23:03,760 Speaker 2: do not notice the gorilla until the end of the 433 00:23:03,840 --> 00:23:05,240 Speaker 2: video when it's pointed out. 434 00:23:05,880 --> 00:23:08,920 Speaker 1: Yeah, and we're assuming it's a person in a gorilla costume. 435 00:23:09,040 --> 00:23:10,520 Speaker 2: I'm hoping, first of all. 436 00:23:10,400 --> 00:23:11,919 Speaker 1: That might be a bias at play, that it's not 437 00:23:11,960 --> 00:23:12,879 Speaker 1: a real gorilla. 438 00:23:13,040 --> 00:23:15,200 Speaker 2: Well, I guess it depends on the amount of funding 439 00:23:15,240 --> 00:23:15,560 Speaker 2: they had. 440 00:23:15,800 --> 00:23:20,439 Speaker 1: Yeah, it looks actually like the gorilla from trading places. 441 00:23:19,320 --> 00:23:23,920 Speaker 2: Totally, which is like, were they even trying? No, Okay, kid, 442 00:23:24,119 --> 00:23:25,360 Speaker 2: I just wanted to make sure. 443 00:23:25,520 --> 00:23:28,159 Speaker 1: Uh, did you watch this video Before you knew about this. 444 00:23:28,520 --> 00:23:32,120 Speaker 2: I had heard it from some friends who were who 445 00:23:32,160 --> 00:23:34,720 Speaker 2: do magic, and they were basically talking about this on 446 00:23:34,800 --> 00:23:36,440 Speaker 2: a little podcast that they made. 447 00:23:36,880 --> 00:23:39,480 Speaker 1: Who Bo Bobo, You are friends who do magic? 448 00:23:41,119 --> 00:23:43,960 Speaker 2: So you know our friend Toby. Oh, yeah, he has 449 00:23:44,119 --> 00:23:45,760 Speaker 2: very good friends that do magic. 450 00:23:46,200 --> 00:23:46,440 Speaker 1: Wow. 451 00:23:46,680 --> 00:23:50,560 Speaker 2: I became Yes, I became kind of friends with him. 452 00:23:50,560 --> 00:23:52,399 Speaker 2: So yeah, I guess I do. I have friends that 453 00:23:52,440 --> 00:23:53,000 Speaker 2: do magic. 454 00:23:53,480 --> 00:23:55,879 Speaker 1: Well, buddy, next time we are in Los Angeles at 455 00:23:55,920 --> 00:23:58,360 Speaker 1: the same time, our good friend and friend of the show, 456 00:23:58,400 --> 00:24:02,120 Speaker 1: Adam Pranica, Yeah, is a member of the Magic Castle 457 00:24:03,000 --> 00:24:05,600 Speaker 1: and it's one of his and his wife Elaine's favorite 458 00:24:05,600 --> 00:24:08,119 Speaker 1: thing to do is to take friends to the Magic Castle. 459 00:24:08,200 --> 00:24:09,440 Speaker 1: So have you ever been? 460 00:24:09,800 --> 00:24:10,000 Speaker 2: No? 461 00:24:10,280 --> 00:24:11,480 Speaker 1: And it is great fun. 462 00:24:11,800 --> 00:24:14,960 Speaker 2: Adam Pranika just keeps getting better and better, doesn't he. 463 00:24:15,440 --> 00:24:17,080 Speaker 1: Yeah. I haven't been with him, but I've been a 464 00:24:17,119 --> 00:24:19,639 Speaker 1: couple of times, once many many years ago, then another 465 00:24:19,720 --> 00:24:21,560 Speaker 1: probably ten years ago. But it's a lot of fun. 466 00:24:21,880 --> 00:24:25,280 Speaker 1: I'm a big fan of magic. Yeah, And it's pretty magical. 
467 00:24:25,280 --> 00:24:27,760 Speaker 1: When people don't see that gorilla in a very tight frame, 468 00:24:28,359 --> 00:24:31,240 Speaker 1: it's not like it's on a big basketball court and 469 00:24:31,280 --> 00:24:33,439 Speaker 1: the gorilla sneaks in there like there's six people and 470 00:24:33,480 --> 00:24:35,919 Speaker 1: then there's a seventh. Very clearly it is. 471 00:24:36,080 --> 00:24:38,000 Speaker 2: It is very obvious so I mean, what this is 472 00:24:38,040 --> 00:24:41,720 Speaker 2: showing is that our attention is limited, right, and we're 473 00:24:41,760 --> 00:24:45,560 Speaker 2: really focused on a task. You you saw that gorilla. 474 00:24:46,080 --> 00:24:48,560 Speaker 2: Those half of the people who didn't notice the gorilla, 475 00:24:48,600 --> 00:24:50,960 Speaker 2: you still saw it, but you're still focused on the task. 476 00:24:51,000 --> 00:24:53,199 Speaker 2: That your brain was just getting rid of information that 477 00:24:53,280 --> 00:24:57,840 Speaker 2: was unrelated to the task because it's not pertinent. It 478 00:24:57,840 --> 00:25:00,880 Speaker 2: can become pertinent though, when that gorilla decide to attack you. 479 00:25:01,680 --> 00:25:04,800 Speaker 2: And so this is a cognitive bias we have where 480 00:25:04,800 --> 00:25:10,159 Speaker 2: we're ignoring potentially unimportant information to take in the stuff 481 00:25:10,200 --> 00:25:12,480 Speaker 2: that's related to the task at hand. Yeah. 482 00:25:12,520 --> 00:25:14,199 Speaker 1: Well, you know where they could really get away with 483 00:25:14,280 --> 00:25:17,080 Speaker 1: this because where you have great concentration. Is it a 484 00:25:17,119 --> 00:25:20,240 Speaker 1: professional sports game on the jumbo tron when they have 485 00:25:20,359 --> 00:25:23,480 Speaker 1: the baseball under the helmet or whatever, uh huh, and 486 00:25:23,520 --> 00:25:25,440 Speaker 1: then they're moving them around and you got to find 487 00:25:25,520 --> 00:25:27,840 Speaker 1: you know, it's like three card money, yep, Because you're 488 00:25:27,840 --> 00:25:30,040 Speaker 1: concentrating so hard on that, they could they could put 489 00:25:30,080 --> 00:25:32,320 Speaker 1: whatever they wanted on that screen while that's going on, 490 00:25:32,560 --> 00:25:35,320 Speaker 1: And I bet you most people would not or maybe 491 00:25:35,359 --> 00:25:37,159 Speaker 1: I guess it's half if that's what they found. 492 00:25:37,640 --> 00:25:40,000 Speaker 2: Yeah, I'll bet you're right. I'll bet you're right, man. 493 00:25:40,200 --> 00:25:41,520 Speaker 1: I was just trying to think of something where you're 494 00:25:41,520 --> 00:25:44,560 Speaker 1: super trying to follow because I was happy I came 495 00:25:44,640 --> 00:25:46,520 Speaker 1: up with the correct amount of passes at the end. 496 00:25:46,440 --> 00:25:48,919 Speaker 2: You did. You said, what does it mean if I 497 00:25:49,000 --> 00:25:52,000 Speaker 2: noticed the gorilla and got the correct number of passes? 498 00:25:52,040 --> 00:25:54,119 Speaker 2: And I said, it means you're a perfect human. 499 00:25:53,960 --> 00:25:55,639 Speaker 1: That's right, which we all know is not true. 500 00:25:57,240 --> 00:25:58,880 Speaker 2: None of us are, Chuck, None of us are. 501 00:25:59,119 --> 00:25:59,520 Speaker 1: Yeah, I know. 
502 00:26:00,080 --> 00:26:02,359 Speaker 2: So there's another one that you may have heard of before, 503 00:26:02,400 --> 00:26:04,200 Speaker 2: even if you've not heard of any of these other ones, 504 00:26:04,200 --> 00:26:07,000 Speaker 2: called the Dunning Kruger effect. It became kind of viral 505 00:26:07,119 --> 00:26:11,320 Speaker 2: because if you take it through the pop culture meat grinder, 506 00:26:11,800 --> 00:26:14,159 Speaker 2: it becomes much more simplified and kind of loses some 507 00:26:14,240 --> 00:26:18,199 Speaker 2: of its actuality. But yeah, people still like it because 508 00:26:18,520 --> 00:26:20,240 Speaker 2: it's a good way to put other people down. 509 00:26:20,720 --> 00:26:23,320 Speaker 1: Yeah, it is. This is the idea that the correct 510 00:26:23,520 --> 00:26:27,000 Speaker 1: idea is that people with a little understanding in an 511 00:26:27,040 --> 00:26:31,119 Speaker 1: area tend to overestimate their ability and their knowledge about something, 512 00:26:31,359 --> 00:26:34,400 Speaker 1: right because they don't They know so little they don't 513 00:26:34,400 --> 00:26:37,080 Speaker 1: even know what they don't know. Kind of right exactly, 514 00:26:37,560 --> 00:26:39,760 Speaker 1: But what you were talking about, it's kind of been 515 00:26:39,800 --> 00:26:44,679 Speaker 1: transformed into like morons have them are the most like braggadocious, 516 00:26:44,800 --> 00:26:46,320 Speaker 1: which can be true. 517 00:26:47,000 --> 00:26:48,960 Speaker 2: It can be you know. I think that's one of 518 00:26:49,000 --> 00:26:51,159 Speaker 2: the things. Like you said, you can be right with 519 00:26:51,240 --> 00:26:53,680 Speaker 2: cognitive biases, you're not wrong with them all the time. 520 00:26:53,720 --> 00:26:56,119 Speaker 2: So yeah, that kind of supports that, But that's not 521 00:26:56,160 --> 00:26:58,919 Speaker 2: what the Dunning Kruger effect actually says. You said it. 522 00:26:59,240 --> 00:27:02,639 Speaker 2: And then there's the opposite way too, where the more 523 00:27:02,720 --> 00:27:06,040 Speaker 2: experience you have, the more expert you are in a field, 524 00:27:06,640 --> 00:27:10,320 Speaker 2: the more you assume that it should be easier for 525 00:27:10,400 --> 00:27:11,199 Speaker 2: you than it is. 526 00:27:12,240 --> 00:27:15,600 Speaker 1: Yeah. That's a very valuable thing to understand, I think, 527 00:27:15,800 --> 00:27:19,360 Speaker 1: and you get much further in life if people are like, well, 528 00:27:19,400 --> 00:27:22,719 Speaker 1: you're the expert, and the experts usually the one going yeah, 529 00:27:22,760 --> 00:27:25,360 Speaker 1: but I don't know, maybe we should hold off because 530 00:27:25,440 --> 00:27:26,560 Speaker 1: you know X, Y and Z. 531 00:27:27,040 --> 00:27:30,440 Speaker 2: Right. Yeah, So that's the actual Dunning Kruger effect, and 532 00:27:30,520 --> 00:27:33,520 Speaker 2: I saw that it's it's being assailed right now. People 533 00:27:33,560 --> 00:27:36,480 Speaker 2: are showing a question even the basic version of it, 534 00:27:36,560 --> 00:27:39,120 Speaker 2: like the actually academic version of it. Yeah, so we'll 535 00:27:39,119 --> 00:27:40,080 Speaker 2: see what happens with that. 536 00:27:40,400 --> 00:27:44,199 Speaker 1: Oh interesting. We've got the gambler's fallacy next, and that 537 00:27:44,359 --> 00:27:49,760 Speaker 1: is oh boy, if you have ever gambled anywhere. 
But 538 00:27:49,800 --> 00:27:51,560 Speaker 1: if you like go to casinos and stuff like that, 539 00:27:52,200 --> 00:27:54,360 Speaker 1: you're going to see this all over the place. You're 540 00:27:54,359 --> 00:27:56,600 Speaker 1: going to hear it spoken out loud. And this is 541 00:27:56,640 --> 00:28:00,960 Speaker 1: the idea that you find patterns where there are no patterns. 542 00:28:01,119 --> 00:28:01,639 Speaker 2: Yeah. 543 00:28:01,720 --> 00:28:04,120 Speaker 1: So if you're at the blackjack table and you hear 544 00:28:04,200 --> 00:28:08,040 Speaker 1: the person next to you like, well, oh, man, see, 545 00:28:08,040 --> 00:28:10,280 Speaker 1: I've lost four in a row, so I'm gonna bet, like, 546 00:28:10,840 --> 00:28:13,160 Speaker 1: I'm gonna go all in on this because I'm bound 547 00:28:13,200 --> 00:28:14,760 Speaker 1: to win because I've lost four in a row. There's 548 00:28:14,760 --> 00:28:16,280 Speaker 1: no way I'm gonna lose five in a row. 549 00:28:16,400 --> 00:28:19,440 Speaker 2: Right. The problem is, each of those hands 550 00:28:19,440 --> 00:28:22,480 Speaker 2: of blackjack are unrelated to one another. They don't form 551 00:28:22,520 --> 00:28:26,080 Speaker 2: a pattern. But you are predicting a pattern that just 552 00:28:26,119 --> 00:28:29,560 Speaker 2: doesn't exist. Yeah, that means you're a fallacious gambler. 553 00:28:29,960 --> 00:28:33,040 Speaker 1: It can get you in real trouble. I mean, 554 00:28:33,200 --> 00:28:35,040 Speaker 1: you can do the same thing on the playground with 555 00:28:35,119 --> 00:28:37,600 Speaker 1: coin tosses. In fact, coin tosses, I think, is a 556 00:28:37,600 --> 00:28:39,920 Speaker 1: lot of, a lot of times, the way they sort 557 00:28:39,920 --> 00:28:41,600 Speaker 1: of try and prove this. 558 00:28:41,440 --> 00:28:45,440 Speaker 2: Yeah, because each coin toss, considering like you're playing 559 00:28:45,480 --> 00:28:48,480 Speaker 2: with a perfect, unflawed coin that has no bias whatsoever, 560 00:28:49,120 --> 00:28:51,840 Speaker 2: each coin toss is totally unrelated to the last. So 561 00:28:52,440 --> 00:28:54,560 Speaker 2: you could get one hundred heads in a row and 562 00:28:54,560 --> 00:28:56,880 Speaker 2: that doesn't mean anything. It doesn't mean a tail is coming, 563 00:28:57,320 --> 00:29:00,720 Speaker 2: because each of those hundred heads, each of 564 00:29:00,760 --> 00:29:03,440 Speaker 2: those coin tosses, had nothing to do with the last 565 00:29:03,440 --> 00:29:04,760 Speaker 2: one or the next one. 566 00:29:05,120 --> 00:29:07,240 Speaker 1: I know. That's hard to break out of, though, because 567 00:29:07,240 --> 00:29:10,000 Speaker 1: it seems very human to think, like, they've flipped four heads 568 00:29:10,000 --> 00:29:11,360 Speaker 1: in a row, there's no way there's going to be 569 00:29:11,360 --> 00:29:11,640 Speaker 1: a fifth. 570 00:29:12,080 --> 00:29:14,320 Speaker 2: Well, that's another reason why this is so hard. We're 571 00:29:14,400 --> 00:29:17,560 Speaker 2: hardwired to find patterns and stuff. It's a way to 572 00:29:17,680 --> 00:29:20,240 Speaker 2: navigate the world. The way we navigate the world is 573 00:29:20,280 --> 00:29:22,920 Speaker 2: by finding patterns so that we can recognize things in 574 00:29:22,920 --> 00:29:26,440 Speaker 2: the future and thus spend less energy getting back to homeostasis. 575 00:29:26,800 --> 00:29:28,920 Speaker 1: That's right. This is all so interesting to me.
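A quick way to see the coin-toss independence point in code, a minimal Python sketch that is not from the episode; the streak length and number of flips are arbitrary choices for the demo:

```python
import random

def heads_after_streak(num_flips=1_000_000, streak_len=4, seed=42):
    """Flip a fair simulated coin num_flips times and measure how often
    the flip that immediately follows `streak_len` heads in a row is
    also heads. If the gambler's fallacy were right, this rate would
    drop below 0.5; for independent flips it stays right around 0.5."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(num_flips)]  # True = heads
    # Collect every flip that comes right after a run of streak_len heads.
    followers = [flips[i] for i in range(streak_len, num_flips)
                 if all(flips[i - streak_len:i])]
    return sum(followers) / len(followers)

print(f"P(heads | four heads in a row) ≈ {heads_after_streak():.3f}")  # ~0.500
```

However long the streak, the rate stays right around 0.5: the pattern the fallacious gambler is betting on just isn't there.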
576 00:29:29,160 --> 00:29:30,120 Speaker 2: I love this stuff. 577 00:29:30,880 --> 00:29:33,240 Speaker 1: I knew that you loved it. This is Josh Clark Central. 578 00:29:33,680 --> 00:29:36,840 Speaker 2: I love observing it because I can't grasp what it 579 00:29:36,880 --> 00:29:39,000 Speaker 2: feels like to suffer any of these, So just to 580 00:29:39,080 --> 00:29:42,480 Speaker 2: discuss it and in this way is really fascinating to me. 581 00:29:42,960 --> 00:29:45,880 Speaker 1: All right, let's talk about the base rate fallacy That 582 00:29:45,960 --> 00:29:49,480 Speaker 1: means you put more weight on just like one very 583 00:29:49,480 --> 00:29:52,600 Speaker 1: specific piece of information instead of looking at all the 584 00:29:52,640 --> 00:29:54,640 Speaker 1: pieces of information that have come your way. 585 00:29:54,840 --> 00:29:58,920 Speaker 2: Yeah, and usually it's individuated information, meaning like say some 586 00:29:59,120 --> 00:30:02,920 Speaker 2: quality or careacteristic of one person, and then you're ignoring 587 00:30:03,000 --> 00:30:06,440 Speaker 2: the base rate, which is like pure statistical information about 588 00:30:06,520 --> 00:30:09,440 Speaker 2: what you're trying to figure out. And a really good 589 00:30:09,480 --> 00:30:11,880 Speaker 2: example of this is like, let's say that you are 590 00:30:12,000 --> 00:30:14,600 Speaker 2: looking at somebody who is super fit, a woman who's 591 00:30:14,680 --> 00:30:18,000 Speaker 2: very fit and athletic, and you're asked, do you think 592 00:30:18,000 --> 00:30:21,640 Speaker 2: that woman is a personal trainer or a teacher, Because 593 00:30:21,800 --> 00:30:24,280 Speaker 2: the basically the only evidence you have there is that 594 00:30:24,320 --> 00:30:28,280 Speaker 2: this woman is very athletic and fit. You might say 595 00:30:28,360 --> 00:30:31,480 Speaker 2: personal trainer. But if you took all the base rate 596 00:30:31,560 --> 00:30:35,880 Speaker 2: information into account, you would know that even the very say, 597 00:30:36,040 --> 00:30:40,280 Speaker 2: very small portion of teachers who are very fit and 598 00:30:40,320 --> 00:30:44,080 Speaker 2: athletic may be small compared to the total number of teachers, 599 00:30:44,400 --> 00:30:48,040 Speaker 2: it's still much larger than the total number of personal 600 00:30:48,080 --> 00:30:51,960 Speaker 2: trainers in the world. So statistically speaking, it's much likelier 601 00:30:52,280 --> 00:30:55,479 Speaker 2: that that very fit athletic woman is a teacher and 602 00:30:55,520 --> 00:30:57,960 Speaker 2: not a personal trainer. You don't do that because you 603 00:30:58,000 --> 00:31:01,840 Speaker 2: think personal trainer athletic fit must be a personal trainer. 604 00:31:02,080 --> 00:31:05,280 Speaker 2: You've just fallen prey to the base rate fallacy. 605 00:31:05,360 --> 00:31:08,520 Speaker 1: My friend, Yeah, but she has on yoga pants and 606 00:31:08,560 --> 00:31:12,880 Speaker 1: hokuhs exactly. That doesn't narrow down anything these days. 607 00:31:13,560 --> 00:31:18,080 Speaker 2: You know, how about the mere exposure effect, Chuck, and like, 608 00:31:18,240 --> 00:31:20,320 Speaker 2: mirror is part of it. I'm not making a judgment 609 00:31:20,360 --> 00:31:20,960 Speaker 2: about it. 610 00:31:21,120 --> 00:31:25,760 Speaker 1: That's right. That means just merely being exposed to something 611 00:31:26,640 --> 00:31:30,840 Speaker 1: has a vast impact. 
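Putting rough numbers on the teacher-versus-personal-trainer example from a minute ago, a minimal Python sketch of the base rate logic; every figure here is made up purely for illustration:

```python
# Hypothetical counts, only to illustrate the base rate point above.
teachers = 4_000_000          # assumed number of teachers
trainers = 300_000            # assumed number of personal trainers
p_fit_given_teacher = 0.10    # assume 10% of teachers are "very fit and athletic"
p_fit_given_trainer = 0.90    # assume 90% of trainers are

fit_teachers = teachers * p_fit_given_teacher   # 400,000
fit_trainers = trainers * p_fit_given_trainer   # 270,000

# Of all the very fit people in these two groups, what share are teachers?
p_teacher_given_fit = fit_teachers / (fit_teachers + fit_trainers)
print(f"P(teacher | very fit) ≈ {p_teacher_given_fit:.2f}")  # ≈ 0.60
```

Even with the assumptions stacked so that fitness points hard toward "trainer," the sheer number of teachers means the fit woman is still more likely to be a teacher; that's the base rate doing the work.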
So the more we experience something, 612 00:31:31,240 --> 00:31:33,200 Speaker 1: the more you like it, which is why you see 613 00:31:33,200 --> 00:31:36,400 Speaker 1: that commercial for the thing over and over and over, 614 00:31:36,520 --> 00:31:40,160 Speaker 1: that Burger King ad over and over and over. Although 615 00:31:40,200 --> 00:31:42,920 Speaker 1: I wouldn't say that you might like that one the 616 00:31:42,920 --> 00:31:44,200 Speaker 1: more you heard it. 617 00:31:44,320 --> 00:31:45,800 Speaker 2: That's the outlier for me too. 618 00:31:46,600 --> 00:31:50,840 Speaker 1: But that's the idea. Though there's just just mere exposure, 619 00:31:51,080 --> 00:31:51,800 Speaker 1: we'll get you there. 620 00:31:52,000 --> 00:31:54,880 Speaker 2: And then there's a related thing called the illusory truth effect, 621 00:31:54,880 --> 00:32:00,600 Speaker 2: which is basically that repeated exposure to a lie causes 622 00:32:00,640 --> 00:32:02,560 Speaker 2: you to eventually believe in it if you hear it 623 00:32:02,680 --> 00:32:06,920 Speaker 2: enough times, even if you initially knew that it wasn't true. 624 00:32:07,160 --> 00:32:10,000 Speaker 2: So that makes me wonder if like it just wears 625 00:32:10,000 --> 00:32:12,040 Speaker 2: you down over time, like your brain is tired of 626 00:32:12,080 --> 00:32:15,560 Speaker 2: defending itself against being assailed with a lie. And it's 627 00:32:15,600 --> 00:32:17,520 Speaker 2: just like, fine, that's true. I don't I don't care. 628 00:32:18,440 --> 00:32:22,200 Speaker 1: Yeah, I mean sure, politics certainly comes to mind. Repeat 629 00:32:22,240 --> 00:32:23,840 Speaker 1: the lie, repeat the lie, repeat the lie. 630 00:32:23,920 --> 00:32:26,280 Speaker 2: Yeah, And I mean like it's a viable way to 631 00:32:26,360 --> 00:32:30,000 Speaker 2: exploit people's cognitive biases in that In that respect. 632 00:32:30,680 --> 00:32:33,240 Speaker 1: Should we end up should we close out with the 633 00:32:33,240 --> 00:32:35,200 Speaker 1: big daddy of them all, the big C, the big 634 00:32:35,200 --> 00:32:36,200 Speaker 1: confirmation bias? 635 00:32:36,320 --> 00:32:38,000 Speaker 2: Yeah, let's do it, baby. 636 00:32:38,080 --> 00:32:40,640 Speaker 1: All right, why don't you start this one? 637 00:32:41,320 --> 00:32:43,880 Speaker 2: Okay, So there's a guy named Peter Wason back in 638 00:32:43,920 --> 00:32:48,800 Speaker 2: the sixties. He coined the term confirmation bias, and he 639 00:32:48,880 --> 00:32:51,680 Speaker 2: basically had an experiment that's really clever. It's hard to 640 00:32:51,760 --> 00:32:55,440 Speaker 2: understand at first, but it's very clever. He basically said, hey, 641 00:32:55,600 --> 00:32:58,360 Speaker 2: here is a sequence of numbers two, four, and six. 642 00:32:59,480 --> 00:33:02,680 Speaker 2: Figure out what the pattern is. Just to be clear, 643 00:33:02,760 --> 00:33:06,960 Speaker 2: this is really hard to explain. If if you find 644 00:33:07,000 --> 00:33:10,600 Speaker 2: somebody who can explain this, well you'll get it. But 645 00:33:10,840 --> 00:33:13,360 Speaker 2: I don't think I'm a candidate for that. I think 646 00:33:13,400 --> 00:33:15,400 Speaker 2: we all know that I'm not going to explain this 647 00:33:15,560 --> 00:33:16,000 Speaker 2: very well. 648 00:33:16,360 --> 00:33:17,680 Speaker 1: Oh I don't think that's true. 649 00:33:17,720 --> 00:33:18,720 Speaker 2: Do you want to take a crack?
650 00:33:19,240 --> 00:33:23,120 Speaker 1: Okay, The original numbers were two four six, and people 651 00:33:23,240 --> 00:33:26,240 Speaker 1: might tend to go with, like, all right, eight ten twelve, 652 00:33:26,960 --> 00:33:28,920 Speaker 1: and they thinking it might be all right, it's even 653 00:33:29,000 --> 00:33:32,960 Speaker 1: number sequence ascending even number, right, and they would say, no, 654 00:33:33,080 --> 00:33:35,560 Speaker 1: that's not correct. And you said, well, maybe it's four 655 00:33:35,640 --> 00:33:38,360 Speaker 1: eight twelve and it's like doubled or something, and they 656 00:33:38,400 --> 00:33:43,120 Speaker 1: would say, well, that's also incorrect. And then you're at 657 00:33:43,120 --> 00:33:46,200 Speaker 1: wit's end because what you haven't done is just done 658 00:33:46,240 --> 00:33:50,320 Speaker 1: any ascending order. You didn't go one seventy nine, three hundred. 659 00:33:50,760 --> 00:33:53,920 Speaker 2: All right, let me take a crack at it. You ready? 660 00:33:54,280 --> 00:33:54,640 Speaker 1: Sure? 661 00:33:55,120 --> 00:33:58,320 Speaker 2: So the original numbers two four six, and the participants 662 00:33:58,360 --> 00:34:00,760 Speaker 2: would try to come up with the explanation of why, 663 00:34:01,080 --> 00:34:04,880 Speaker 2: like what are the what pattern are those numbers following? 664 00:34:05,040 --> 00:34:05,200 Speaker 1: Right? 665 00:34:05,880 --> 00:34:11,200 Speaker 2: So you might say the like does eight ten twelve 666 00:34:11,280 --> 00:34:14,560 Speaker 2: work and they would say yes, and you'd say, okay, 667 00:34:14,600 --> 00:34:16,680 Speaker 2: well then you're just looking at even numbers and they 668 00:34:16,680 --> 00:34:20,040 Speaker 2: would say, no, you still got this right, This still 669 00:34:20,080 --> 00:34:23,280 Speaker 2: fits the pattern, but your your hypothesis for it is wrong. 670 00:34:23,800 --> 00:34:25,200 Speaker 1: Right, that's yeah, that's the key. 671 00:34:25,360 --> 00:34:28,439 Speaker 2: Right. Here's where the confirmation bias came in. People would 672 00:34:28,520 --> 00:34:32,799 Speaker 2: then go back and continue trying to find versions that 673 00:34:32,960 --> 00:34:36,600 Speaker 2: fit their hypothesis to explain this even though it was wrong. Yeah, 674 00:34:36,719 --> 00:34:39,920 Speaker 2: rather than take their hypothesis and say, Okay, this is right, 675 00:34:40,160 --> 00:34:42,880 Speaker 2: this fits the pattern, but it's still not correct and 676 00:34:42,920 --> 00:34:47,120 Speaker 2: start trying to break their original hypothesis by coming up 677 00:34:47,120 --> 00:34:50,520 Speaker 2: with like just completely random stuff that doesn't fit their 678 00:34:50,800 --> 00:34:54,240 Speaker 2: original hypothesis, in which case they might have said something 679 00:34:54,320 --> 00:34:57,160 Speaker 2: like does one, six, twenty seven work and they 680 00:34:57,200 --> 00:35:00,680 Speaker 2: would say yes, that fits, and then that 681 00:35:00,760 --> 00:35:03,480 Speaker 2: might lead the person to see that actually, the only 682 00:35:03,640 --> 00:35:06,279 Speaker 2: the only thing that has to be correct to be 683 00:35:06,360 --> 00:35:08,160 Speaker 2: part of the model is that the numbers have to 684 00:35:08,239 --> 00:35:13,479 Speaker 2: ascend in order. That was it. But people, people, man, 685 00:35:14,200 --> 00:35:14,440 Speaker 2: just you.
686 00:35:14,480 --> 00:35:16,799 Speaker 1: Might even try, you might even try and break it 687 00:35:16,840 --> 00:35:20,400 Speaker 1: by saying three, five, seven, but you're still using that 688 00:35:20,480 --> 00:35:25,440 Speaker 1: original a version of that original hypothesis. 689 00:35:25,000 --> 00:35:27,839 Speaker 2: Exactly that that there's yes, say like that you think 690 00:35:27,840 --> 00:35:29,760 Speaker 2: it goes up by two or something like that. Yes, 691 00:35:30,239 --> 00:35:33,719 Speaker 2: You're like very few people go back and try to 692 00:35:33,760 --> 00:35:37,479 Speaker 2: break their own hypothesis, And that's the point of confirmation bias. 693 00:35:37,560 --> 00:35:40,360 Speaker 2: Let's move on from that experiment. The point of confirmation 694 00:35:40,480 --> 00:35:43,120 Speaker 2: bias that this shows if you actually can understand it 695 00:35:43,160 --> 00:35:47,279 Speaker 2: from other people than us, is that we tend to 696 00:35:47,320 --> 00:35:51,200 Speaker 2: take our initial ideas, our beliefs in a lot of cases, 697 00:35:51,600 --> 00:35:54,960 Speaker 2: and look for information that supports those and discard information 698 00:35:55,080 --> 00:35:56,200 Speaker 2: that doesn't support it. 699 00:35:57,000 --> 00:35:59,400 Speaker 1: Right, And that's of course, you know, we mentioned politics a 700 00:35:59,440 --> 00:36:01,680 Speaker 1: minute ago. That's where you most firmly see that these 701 00:36:01,760 --> 00:36:06,880 Speaker 1: days is you are in a media bubble. Probably, I 702 00:36:06,880 --> 00:36:09,399 Speaker 1: don't know a ton of people that get their news 703 00:36:09,480 --> 00:36:14,160 Speaker 1: sources from completely disparate points of view, and news these 704 00:36:14,280 --> 00:36:17,680 Speaker 1: days that you're getting is so a lot of it 705 00:36:17,719 --> 00:36:19,839 Speaker 1: is so slanted to begin with. It's probably not even 706 00:36:19,880 --> 00:36:23,840 Speaker 1: the best example anymore to use, but that's a 707 00:36:23,840 --> 00:36:26,160 Speaker 1: long way of saying, you're probably going to be seeking 708 00:36:26,200 --> 00:36:28,520 Speaker 1: out news that confirms your beliefs because you don't want 709 00:36:28,560 --> 00:36:29,360 Speaker 1: your beliefs challenged. 710 00:36:29,560 --> 00:36:32,440 Speaker 2: Yes, I mean, I, like everybody else, I have trouble 711 00:36:32,480 --> 00:36:34,440 Speaker 2: with that as well. But I have to boast, 712 00:36:34,520 --> 00:36:38,000 Speaker 2: Umi is actually really good about getting news from different sources. 713 00:36:38,719 --> 00:36:39,160 Speaker 1: Yeah. 714 00:36:39,200 --> 00:36:41,480 Speaker 2: And one reason that I find it difficult to do 715 00:36:41,560 --> 00:36:45,640 Speaker 2: is because I have like physical reactions sometimes. Yeah, And 716 00:36:45,800 --> 00:36:47,799 Speaker 2: that is a thing that's one of the reasons why 717 00:36:47,800 --> 00:36:50,200 Speaker 2: they think we have we use confirmation biases because it 718 00:36:50,280 --> 00:36:53,319 Speaker 2: sucks to be to have your beliefs challenged, right, It's 719 00:36:53,360 --> 00:36:56,319 Speaker 2: really difficult to overcome that.
And there's this thing called 720 00:36:56,360 --> 00:37:01,160 Speaker 2: belief perseverance, which is, even when your beliefs are challenged with, 721 00:37:01,280 --> 00:37:06,799 Speaker 2: say like an indisputable fact, you can still use confirmation 722 00:37:06,880 --> 00:37:12,440 Speaker 2: bias to preserve that belief because we usually attach our identity, 723 00:37:12,520 --> 00:37:15,040 Speaker 2: or build our identity around our beliefs. That's who we are. 724 00:37:15,200 --> 00:37:18,200 Speaker 2: So it's like we're being personally attacked. And then even 725 00:37:18,239 --> 00:37:20,560 Speaker 2: more than that, there's the backfire effect. Right did you 726 00:37:20,560 --> 00:37:21,000 Speaker 2: see that? 727 00:37:21,760 --> 00:37:22,160 Speaker 1: I did not. 728 00:37:22,520 --> 00:37:26,000 Speaker 2: So the backfire effect says that in the in the 729 00:37:26,040 --> 00:37:30,880 Speaker 2: face of being presented with information that is that basically 730 00:37:30,880 --> 00:37:34,200 Speaker 2: counters your own beliefs, it can make you actually solidify 731 00:37:34,800 --> 00:37:39,000 Speaker 2: your original incorrect belief in the first place. Right, You'll 732 00:37:39,040 --> 00:37:42,680 Speaker 2: you'll you'll believe it even more strongly, even though you've 733 00:37:42,719 --> 00:37:46,560 Speaker 2: just been given facts that contradict it. So we really 734 00:37:46,920 --> 00:37:51,320 Speaker 2: really hang on to our beliefs as much as possible. 735 00:37:51,320 --> 00:37:54,680 Speaker 2: And that is a huge, huge thing that humans trip over. 736 00:37:54,840 --> 00:37:58,759 Speaker 2: That confirmation bias is probably the granddaddy of all biases. 737 00:37:58,840 --> 00:38:03,120 Speaker 1: I think, Yeah, that's why I saved it for last Yeah, 738 00:38:03,160 --> 00:38:04,680 Speaker 1: and you know, a lot of reasons people do this. 739 00:38:04,840 --> 00:38:08,520 Speaker 1: You might be protecting your yourself, like your self esteem, 740 00:38:08,600 --> 00:38:11,880 Speaker 1: because otherwise you can you're admitting that you may have 741 00:38:11,920 --> 00:38:14,759 Speaker 1: been wrong about something, and it, you know, takes a 742 00:38:14,760 --> 00:38:17,839 Speaker 1: big person to do that. You want to believe that 743 00:38:17,960 --> 00:38:21,440 Speaker 1: you're right about stuff. And it also might just be 744 00:38:21,480 --> 00:38:25,239 Speaker 1: difficult to process more than one hypothesis at once. It 745 00:38:25,520 --> 00:38:27,120 Speaker 1: might just be a little too brain breaking. 746 00:38:27,440 --> 00:38:30,440 Speaker 2: Yeah, because once you lock into an explanation, your brain 747 00:38:30,600 --> 00:38:32,879 Speaker 2: just it's like, I know, we've got it. We don't 748 00:38:32,880 --> 00:38:35,960 Speaker 2: have to figure this other thing out. Homeostasis, homeostasis. You know. 749 00:38:36,520 --> 00:38:39,200 Speaker 2: It's it is very hard to entertain something that is 750 00:38:40,200 --> 00:38:42,160 Speaker 2: counter to what we already think is true. 751 00:38:42,680 --> 00:38:45,319 Speaker 1: That's right, all right everyone. As you can tell by 752 00:38:45,360 --> 00:38:48,600 Speaker 1: the clock, we are taking our second break, this is 753 00:38:48,640 --> 00:38:49,880 Speaker 1: a long one, and we're going to come back and 754 00:38:49,920 --> 00:38:51,880 Speaker 1: talk about behavioral economics right after that.
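Since the two-four-six experiment is tricky to pin down in conversation, here is a minimal Python sketch of the setup, an illustration rather than Wason's actual materials; the function name and the sample guesses are invented. The hidden rule is simply that the numbers ascend, and only a guess designed to break your own hypothesis, rather than confirm it, reveals that.

def fits_hidden_rule(a, b, c):
    # Wason's hidden rule: the three numbers simply have to ascend.
    return a < b < c

# Someone who believes the rule is "goes up by two" and only tests confirming
# triples hears "yes" every time and never learns the hypothesis is too narrow.
for triple in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    print(triple, fits_hidden_rule(*triple))     # all True

# A guess that deliberately violates the "+2" hypothesis is far more informative.
print((1, 6, 27), fits_hidden_rule(1, 6, 27))    # True, so "+2" cannot be the rule
print((6, 4, 2), fits_hidden_rule(6, 4, 2))      # False, descending breaks the rule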
755 00:38:52,080 --> 00:38:54,600 Speaker 2: Well, wait, before we do that, let's try to explain 756 00:38:54,640 --> 00:38:56,200 Speaker 2: this confirmation bias study again. 757 00:38:56,600 --> 00:39:29,600 Speaker 1: Yeah, we should all right, we'll be right back, Okay, 758 00:39:29,840 --> 00:39:33,000 Speaker 1: I promised we'd talk about behavioral economics. A lot of the 759 00:39:33,000 --> 00:39:38,319 Speaker 1: work that Tversky and Kahneman did was super applicable and 760 00:39:38,520 --> 00:39:40,239 Speaker 1: kind of revolutionary in a lot of ways for the 761 00:39:40,239 --> 00:39:44,480 Speaker 1: world of economics and how people's buying behavior is affected. 762 00:39:45,200 --> 00:39:47,520 Speaker 1: They didn't invent it, like Adam Smith wrote about stuff 763 00:39:47,560 --> 00:39:50,919 Speaker 1: like this, and starting about World War two is when 764 00:39:51,440 --> 00:39:54,239 Speaker 1: they started really kind of homing in on stuff like this, 765 00:39:54,760 --> 00:39:58,640 Speaker 1: like using mathematical models, and it all kind of started 766 00:39:58,640 --> 00:40:03,359 Speaker 1: with the assumption that people and companies and organizations are 767 00:40:03,440 --> 00:40:06,839 Speaker 1: really just trying to pursue their self interest at the end. 768 00:40:06,719 --> 00:40:08,160 Speaker 2: Of the day. Yeah, and they're going to make the 769 00:40:08,160 --> 00:40:12,040 Speaker 2: most rational decision. Yeah, and that's just they And they 770 00:40:12,040 --> 00:40:15,080 Speaker 2: were like, yes, we know people make irrational decisions, but 771 00:40:15,320 --> 00:40:17,480 Speaker 2: these are outliers. Like if you take all of the 772 00:40:17,520 --> 00:40:20,960 Speaker 2: information and their data in aggregate, you will see that 773 00:40:21,080 --> 00:40:24,279 Speaker 2: humans generally try to make the most rational decision. That's 774 00:40:24,440 --> 00:40:28,120 Speaker 2: just not true. People don't do that. We make all 775 00:40:28,160 --> 00:40:32,840 Speaker 2: sorts of irrational decisions that very frequently run counter to 776 00:40:32,920 --> 00:40:37,160 Speaker 2: our own best interests. And again we'll even reject stuff 777 00:40:37,280 --> 00:40:40,480 Speaker 2: like information that would help us make decisions to our 778 00:40:40,480 --> 00:40:44,120 Speaker 2: own best interests if they counter our beliefs. So there's 779 00:40:44,120 --> 00:40:47,120 Speaker 2: a guy named Richard Thaler who ended up becoming a 780 00:40:47,160 --> 00:40:50,600 Speaker 2: colleague of Tversky and Kahneman, and he took some of 781 00:40:50,640 --> 00:40:56,080 Speaker 2: their papers, and he realized that these mistakes, these cognitive biases, 782 00:40:56,480 --> 00:41:00,960 Speaker 2: they can be predictable. Right, you can actually map how 783 00:41:01,040 --> 00:41:04,600 Speaker 2: somebody's going to make a bad decision. And this became 784 00:41:04,640 --> 00:41:06,960 Speaker 2: the basis of behavioral economics. 785 00:41:07,200 --> 00:41:12,880 Speaker 1: Yeah, he well, let's talk about this this prospect theory, 786 00:41:12,880 --> 00:41:16,640 Speaker 1: because this was from Tversky and Kahneman.
It was an 787 00:41:16,680 --> 00:41:21,080 Speaker 1: article from nineteen seventy nine on this idea of prospect 788 00:41:21,160 --> 00:41:25,160 Speaker 1: theory, Prospect Theory: An Analysis of Decision under Risk, and Livia 789 00:41:25,160 --> 00:41:27,919 Speaker 1: says it's probably the most cited economics paper of all time. 790 00:41:28,040 --> 00:41:31,600 Speaker 1: Like this was a revolutionary, landmark paper and they didn't 791 00:41:31,600 --> 00:41:33,799 Speaker 1: write a ton of papers for researchers. They did like 792 00:41:33,840 --> 00:41:37,319 Speaker 1: eight total, which just shows what an outsize impact they had. 793 00:41:38,440 --> 00:41:41,600 Speaker 1: But they talk about in this paper a lot of 794 00:41:41,880 --> 00:41:46,640 Speaker 1: attitudes about risk. One is loss aversion, which is 795 00:41:46,680 --> 00:41:49,760 Speaker 1: the idea that you're going to experience more emotional suffering 796 00:41:49,800 --> 00:41:53,600 Speaker 1: when you lose money than you will gain happiness if 797 00:41:53,600 --> 00:41:58,120 Speaker 1: you gain something, So you may pass up an offer 798 00:41:58,520 --> 00:42:00,759 Speaker 1: that gives you equal odds of winning twenty five or 799 00:42:00,800 --> 00:42:04,080 Speaker 1: losing twenty. There was another example I think kind of 800 00:42:04,080 --> 00:42:06,360 Speaker 1: gets it across more is there was an experiment in 801 00:42:06,400 --> 00:42:09,560 Speaker 1: nineteen ninety six where they gave participants a lottery ticket 802 00:42:10,160 --> 00:42:12,680 Speaker 1: and before you scratched it off or whatever, let's say 803 00:42:12,719 --> 00:42:14,120 Speaker 1: it was a scratch off. They said, all right, well, 804 00:42:14,160 --> 00:42:17,680 Speaker 1: hold on, before you do that, I'll give you another 805 00:42:17,760 --> 00:42:21,840 Speaker 1: lottery ticket plus ten dollars in cash. And for no 806 00:42:22,400 --> 00:42:26,359 Speaker 1: logical reason at all, people tended to think that that 807 00:42:26,400 --> 00:42:29,279 Speaker 1: first ticket was the one, even though there was no 808 00:42:29,400 --> 00:42:31,399 Speaker 1: it was a lottery ticket, there was no difference at all. 809 00:42:31,719 --> 00:42:33,560 Speaker 1: They would turn down that extra ten bucks. I think 810 00:42:33,640 --> 00:42:35,480 Speaker 1: less than fifty percent of them took that deal. 811 00:42:35,760 --> 00:42:38,760 Speaker 2: Yeah, because in giving away or trading that first ticket, 812 00:42:38,840 --> 00:42:42,680 Speaker 2: they risked a loss even though the gain was right there. 813 00:42:42,880 --> 00:42:46,480 Speaker 2: Just trading the ticket, you got an extra ten bucks, right, Yeah, 814 00:42:45,960 --> 00:42:51,120 Speaker 2: that's fairly irrational. We also have a lot of trouble 815 00:42:51,760 --> 00:42:56,160 Speaker 2: with rare events. Yeah, we tend to overestimate them. It 816 00:42:56,200 --> 00:42:57,680 Speaker 2: can be a positive event and it can be a 817 00:42:57,680 --> 00:43:02,480 Speaker 2: negative event. But we're really bad at probabilities and statistics. 818 00:43:02,960 --> 00:43:09,960 Speaker 2: And this is essentially it's like you won't let your 819 00:43:10,040 --> 00:43:12,960 Speaker 2: kid walk to school because you're afraid of your kid 820 00:43:12,960 --> 00:43:15,560 Speaker 2: being kidnapped, even though the chance of your kid being 821 00:43:15,640 --> 00:43:21,200 Speaker 2: kidnapped is just ridiculously low.
It's technically irrational, even though 822 00:43:21,360 --> 00:43:24,440 Speaker 2: very few people would fault you for that, but it's 823 00:43:24,480 --> 00:43:26,040 Speaker 2: still an irrational decision. 824 00:43:26,640 --> 00:43:29,560 Speaker 1: Yeah, for sure. We talked about that in the... 825 00:43:31,200 --> 00:43:33,280 Speaker 2: Well was it free range kids? 826 00:43:33,960 --> 00:43:36,560 Speaker 1: Maybe I can't remember, tied in with the satanic panic 827 00:43:36,600 --> 00:43:38,920 Speaker 1: and stuff like that I think definitely did back in 828 00:43:38,960 --> 00:43:42,800 Speaker 1: the day. Relative rather than absolute terms. That is a theory, 829 00:43:43,960 --> 00:43:48,319 Speaker 1: a monetary theory, and there's a great example. You might 830 00:43:48,400 --> 00:43:50,399 Speaker 1: drive an extra ten minutes in a car to buy 831 00:43:50,400 --> 00:43:52,920 Speaker 1: a shirt from a store that you know is selling the shirt for 832 00:43:52,960 --> 00:43:55,120 Speaker 1: twenty bucks rather than the one closer to you that 833 00:43:55,160 --> 00:43:57,520 Speaker 1: sells it for thirty. And you're going to do 834 00:43:57,560 --> 00:44:00,040 Speaker 1: that because that saves you ten bucks. You won't save 835 00:44:00,200 --> 00:44:04,279 Speaker 1: twenty dollars on a car even though you may only 836 00:44:04,400 --> 00:44:06,960 Speaker 1: have to drive five minutes down the road, because you're like, oh, 837 00:44:07,040 --> 00:44:10,120 Speaker 1: it's twenty dollars, the car is twenty thousand. But it's 838 00:44:10,200 --> 00:44:13,440 Speaker 1: really a relative versus, you know, absolute thing, as you're saving 839 00:44:13,800 --> 00:44:15,360 Speaker 1: twice as much money as you did on that T 840 00:44:15,480 --> 00:44:16,080 Speaker 1: shirt purchase. 841 00:44:16,320 --> 00:44:19,840 Speaker 2: Right, But it's like you said, it's all relative again, 842 00:44:20,160 --> 00:44:25,040 Speaker 2: totally irrational. But all this stuff relates to economics, and 843 00:44:25,480 --> 00:44:27,520 Speaker 2: like you said, this stuff can be replicated. There's a 844 00:44:27,560 --> 00:44:32,560 Speaker 2: twenty twenty study that looked at the prospect theory in particular, 845 00:44:33,120 --> 00:44:37,279 Speaker 2: and this major study was conducted in nineteen countries and 846 00:44:37,360 --> 00:44:40,960 Speaker 2: thirteen different languages and held up not bad. No, that's 847 00:44:41,040 --> 00:44:44,440 Speaker 2: not bad at all. And so it's not just economics. 848 00:44:44,480 --> 00:44:48,160 Speaker 2: It's not just being exploited by the wine list or 849 00:44:49,400 --> 00:44:53,480 Speaker 2: you know, Kentucky Fried Chicken or something like that to 850 00:44:53,600 --> 00:44:57,960 Speaker 2: make you buy their stuff. This actually this can have 851 00:44:58,040 --> 00:45:01,360 Speaker 2: like life and death consequences too, although I guess so 852 00:45:01,480 --> 00:45:04,279 Speaker 2: can wine and Kentucky Fried Chicken. 853 00:45:04,719 --> 00:45:06,960 Speaker 1: You know what you're gonna get at Kentucky Fried Chicken? 854 00:45:07,000 --> 00:45:08,080 Speaker 1: What? Pepsi. 855 00:45:08,760 --> 00:45:11,359 Speaker 2: That's right, you will get some Pepsi. You know how 856 00:45:11,400 --> 00:45:13,880 Speaker 2: I know that? Cause you just had Kentucky Fried Chicken. 857 00:45:14,480 --> 00:45:16,720 Speaker 1: I did. After our tour. I was a little tired 858 00:45:16,760 --> 00:45:19,240 Speaker 1: and needed just some fried chicken.
So I got fried chicken. 859 00:45:19,280 --> 00:45:21,360 Speaker 2: What'd you get, just the original or extra crispy? 860 00:45:21,360 --> 00:45:23,280 Speaker 2: Because you're crazy if you don't get extra crispy. 861 00:45:23,760 --> 00:45:28,400 Speaker 1: I get extra crispy. But they were out, they couldn't satisfy it. 862 00:45:28,480 --> 00:45:30,400 Speaker 1: I got the three piece. They had two more pieces 863 00:45:30,400 --> 00:45:32,160 Speaker 1: of extra crispy, and they asked if one piece of 864 00:45:33,239 --> 00:45:36,319 Speaker 1: O.R. was available, and I was like, yeah, sure, 865 00:45:36,320 --> 00:45:37,920 Speaker 1: I'm not gonna not eat a piece of chicken. 866 00:45:40,400 --> 00:45:43,200 Speaker 2: It is good. They do chicken, right, Yeah they do. 867 00:45:43,920 --> 00:45:45,520 Speaker 2: Did you get the mashed potatoes and gravy? 868 00:45:45,960 --> 00:45:48,680 Speaker 1: You know it, buddy, times two, and an extra biscuit. I 869 00:45:48,680 --> 00:45:52,760 Speaker 1: went all in. It was a rare treat eating frenzy. 870 00:45:52,880 --> 00:45:53,920 Speaker 2: Did you drink a Pepsi? 871 00:45:54,360 --> 00:45:54,719 Speaker 1: I did. 872 00:45:54,840 --> 00:45:58,160 Speaker 2: Awesome. Well that all fits somehow. I don't know how, 873 00:45:58,239 --> 00:46:02,960 Speaker 2: but it somehow fits this this episode. So where I 874 00:46:03,040 --> 00:46:05,120 Speaker 2: was saying that this can be life and death, as 875 00:46:05,160 --> 00:46:08,840 Speaker 2: with medicine, because although doctors have God complexes and like 876 00:46:08,880 --> 00:46:11,759 Speaker 2: to present themselves as infallible, they are quite fallible. They're 877 00:46:11,840 --> 00:46:15,359 Speaker 2: humans and they can suffer the same cognitive biases as us. 878 00:46:15,800 --> 00:46:19,240 Speaker 2: But they have your life in their hands. We rarely 879 00:46:19,280 --> 00:46:21,200 Speaker 2: have others' lives in our hands. 880 00:46:21,880 --> 00:46:23,439 Speaker 1: Yeah. Do you watch the show The Pit? 881 00:46:24,000 --> 00:46:26,759 Speaker 2: I tried and it just did not grab me. I 882 00:46:26,800 --> 00:46:28,960 Speaker 2: gave it like ten minutes, but I hear nothing but 883 00:46:29,040 --> 00:46:29,640 Speaker 2: good things. 884 00:46:30,160 --> 00:46:31,799 Speaker 1: Yeah, I mean, I really like it. I've never been 885 00:46:31,840 --> 00:46:33,680 Speaker 1: a hospital show guy, so this is kind of one 886 00:46:33,719 --> 00:46:36,000 Speaker 1: of my first forays into it, but I like it 887 00:46:36,000 --> 00:46:38,320 Speaker 1: a lot. I haven't started season two, but I noticed 888 00:46:38,320 --> 00:46:42,440 Speaker 1: when reading through these bias like medical biases that they do, 889 00:46:43,320 --> 00:46:45,520 Speaker 1: or at least Noah Wyle does a really good job 890 00:46:45,560 --> 00:46:49,759 Speaker 1: on the show with these younger residents trying to bust 891 00:46:49,840 --> 00:46:51,840 Speaker 1: through and a lot of this stuff comes up. He 892 00:46:51,880 --> 00:46:57,040 Speaker 1: doesn't say, hey, that's affect heuristic. He just will talk 893 00:46:57,080 --> 00:46:59,440 Speaker 1: about what that is. And now that I know the definitions, 894 00:46:59,440 --> 00:47:01,640 Speaker 1: I'm like, oh, he's talking about an outcome bias 895 00:47:02,239 --> 00:47:05,200 Speaker 1: or an anchoring bias. It's fairly interesting. 896 00:47:05,320 --> 00:47:08,160 Speaker 2: Yeah.
Rather than say, like being presented with a really 897 00:47:08,239 --> 00:47:10,680 Speaker 2: high price for a bottle of wine to make the 898 00:47:10,719 --> 00:47:14,080 Speaker 2: other overpriced wine seem like a bargain, this can be 899 00:47:14,200 --> 00:47:17,760 Speaker 2: like your first lab work comes back and that forms 900 00:47:17,800 --> 00:47:23,480 Speaker 2: the anchoring-biased impression of your condition. And even as 901 00:47:23,640 --> 00:47:27,080 Speaker 2: new lab work comes back, that doctor may fail to 902 00:47:27,520 --> 00:47:31,440 Speaker 2: adjust their view of your condition because they're not taking 903 00:47:31,480 --> 00:47:34,040 Speaker 2: into account this new stuff. They're giving more weight to 904 00:47:34,080 --> 00:47:37,560 Speaker 2: that original, that original number. So yeah, and that's just 905 00:47:37,600 --> 00:47:39,959 Speaker 2: the anchoring bias. The way that it can affect there's, 906 00:47:40,040 --> 00:47:41,759 Speaker 2: like you said, there's all sorts of other ways for 907 00:47:41,800 --> 00:47:44,279 Speaker 2: it to happen, and all of it can result in 908 00:47:44,719 --> 00:47:48,800 Speaker 2: poorer outcomes for patients just because their doctors are humans, 909 00:47:49,239 --> 00:47:52,919 Speaker 2: and we don't really approach cognitive biases in a really 910 00:47:53,960 --> 00:47:55,560 Speaker 2: methodical or deliberate way. 911 00:47:56,640 --> 00:47:58,640 Speaker 1: Yeah. They In fact, now that I'm thinking about it, 912 00:47:58,640 --> 00:48:00,359 Speaker 1: they do this so much on the show. The show 913 00:48:00,400 --> 00:48:05,600 Speaker 1: could be called medical confirmation bias the show because you 914 00:48:05,640 --> 00:48:08,000 Speaker 1: see it all the time. Outcome bias is when a 915 00:48:08,080 --> 00:48:10,960 Speaker 1: shift in the patient's health you're convinced is the result 916 00:48:10,960 --> 00:48:13,600 Speaker 1: of a treatment, like it's because of that thing I did, 917 00:48:13,880 --> 00:48:17,759 Speaker 1: or the affect heuristic that I mentioned, an emotional reaction to 918 00:48:17,800 --> 00:48:21,799 Speaker 1: a patient, you know, kind of overrunning, you know, deliberating 919 00:48:21,800 --> 00:48:24,000 Speaker 1: on this thing in a logical way. This happens all 920 00:48:24,080 --> 00:48:25,000 Speaker 1: the time on the show. 921 00:48:25,480 --> 00:48:28,600 Speaker 2: Yeah. Well, another field that it happens with is forensic science, 922 00:48:28,640 --> 00:48:31,000 Speaker 2: which we've gone to great lengths to kind of point out 923 00:48:31,120 --> 00:48:35,120 Speaker 2: is junk in most cases, and a 924 00:48:35,160 --> 00:48:37,759 Speaker 2: lot of that junk is just based on cognitive biases. 925 00:48:38,719 --> 00:48:41,080 Speaker 1: Yeah, for sure. I mean certainly the way they do 926 00:48:41,120 --> 00:48:45,120 Speaker 1: lineups is flawed. I mean the way they I feel like, 927 00:48:45,200 --> 00:48:46,759 Speaker 1: you're right, we've done this a lot on the on 928 00:48:46,800 --> 00:48:49,239 Speaker 1: the show. The way they have done a lot of 929 00:48:49,280 --> 00:48:53,359 Speaker 1: this is super flawed. And I think maybe they're looking 930 00:48:53,360 --> 00:48:54,960 Speaker 1: at it some but not a lot. 931 00:48:55,239 --> 00:49:00,759 Speaker 2: No, So if you want to fight cognitive bias in 932 00:49:00,800 --> 00:49:03,920 Speaker 2: your own mind, Chuck, what do you do?
What do 933 00:49:04,000 --> 00:49:04,279 Speaker 2: you do? 934 00:49:05,800 --> 00:49:07,560 Speaker 1: Well, there's a list of good tips here, and I 935 00:49:07,560 --> 00:49:10,640 Speaker 1: think these are pretty good tips. The first one is 936 00:49:10,719 --> 00:49:12,640 Speaker 1: just being aware that you have these, which is something 937 00:49:12,680 --> 00:49:14,400 Speaker 1: that we've already kind of kind of worked through on 938 00:49:14,440 --> 00:49:16,200 Speaker 1: this episode, except for you, of course, because you don't 939 00:49:16,200 --> 00:49:21,680 Speaker 1: have these. Sure, but studies show that like just being aware, 940 00:49:22,360 --> 00:49:24,359 Speaker 1: it's not one of those things where like, well, being 941 00:49:24,360 --> 00:49:26,440 Speaker 1: aware is half the problem. It's like being aware seems 942 00:49:26,480 --> 00:49:27,600 Speaker 1: like two percent of the problem. 943 00:49:27,760 --> 00:49:32,480 Speaker 2: Yeah, it's like you're aware that you have an unconscious bias. 944 00:49:32,520 --> 00:49:35,560 Speaker 2: It doesn't make you understand the bias. You just know 945 00:49:35,719 --> 00:49:39,200 Speaker 2: that they're there, Right, that's the problem, it's unconscious. 946 00:49:40,000 --> 00:49:40,359 Speaker 1: What else? 947 00:49:40,520 --> 00:49:43,400 Speaker 2: There are some like actual things you can do like 948 00:49:43,560 --> 00:49:48,040 Speaker 2: delay decision making. Yeah, don't don't come to snap judgments. 949 00:49:48,480 --> 00:49:52,279 Speaker 2: Go get more information, Go get information from a contradictory 950 00:49:52,320 --> 00:49:55,800 Speaker 2: source or different source or something like that. And then, 951 00:49:55,920 --> 00:49:57,839 Speaker 2: like kind of tied into that, you can have 952 00:49:57,960 --> 00:50:02,080 Speaker 2: like personal rules, like if there's a big decision, 953 00:50:02,320 --> 00:50:06,520 Speaker 2: you will not make that decision until you've slept on it. Yeah, 954 00:50:06,560 --> 00:50:10,799 Speaker 2: for example, don't buy a TV unless your friend says, yeah, 955 00:50:11,040 --> 00:50:12,040 Speaker 2: good idea. 956 00:50:12,960 --> 00:50:17,200 Speaker 1: Try and consider your past experience for sure, because optimism 957 00:50:17,280 --> 00:50:21,600 Speaker 1: bias could come into play, like hey, worked out last time, yeah, 958 00:50:21,640 --> 00:50:23,880 Speaker 1: Like why would I why would I take more time 959 00:50:24,080 --> 00:50:24,600 Speaker 1: this time? 960 00:50:24,920 --> 00:50:27,160 Speaker 2: Yeah? And that's another way that you can kind of 961 00:50:27,200 --> 00:50:29,680 Speaker 2: do that. An exercise you can do is write down 962 00:50:29,880 --> 00:50:33,000 Speaker 2: your expectations for an outcome and then go back and 963 00:50:33,040 --> 00:50:34,960 Speaker 2: look at it afterward and see if you were right 964 00:50:35,040 --> 00:50:38,000 Speaker 2: or not. Can kind of help you realize like, uh, 965 00:50:38,080 --> 00:50:40,759 Speaker 2: I do kind of tend toward the optimism bias. 966 00:50:41,000 --> 00:50:43,280 Speaker 1: Yeah, because I believe that was one of the other biases. 967 00:50:43,480 --> 00:50:46,440 Speaker 1: Is even like it is hard to recognize because you're 968 00:50:46,440 --> 00:50:49,520 Speaker 1: biased and that you misremember what you thought going into it. 969 00:50:49,520 --> 00:50:51,280 Speaker 1: So writing it down is a good that's a good.
970 00:50:51,120 --> 00:50:54,360 Speaker 2: One, right, But if you're super super unconsciously biased, you 971 00:50:54,440 --> 00:50:57,319 Speaker 2: might be like someone else wrote this in my handwriting, Right, 972 00:50:58,040 --> 00:50:59,120 Speaker 2: I've never been this wrong. 973 00:51:00,920 --> 00:51:03,320 Speaker 1: What about Thomas Bayes and Baysian reasoning. 974 00:51:03,840 --> 00:51:06,480 Speaker 2: So he was a minister from the eighteenth century, and 975 00:51:06,560 --> 00:51:10,640 Speaker 2: he basically came up with a standardized formula for taking 976 00:51:10,680 --> 00:51:14,840 Speaker 2: into account the probability of an outcome, right that things 977 00:51:14,960 --> 00:51:18,120 Speaker 2: aren't essentially so I saw this on less wrong dot 978 00:51:18,200 --> 00:51:21,080 Speaker 2: org founded by one of the guys who wrote, if 979 00:51:21,120 --> 00:51:24,480 Speaker 2: anyone builds it, everyone dies about Ai Eliotz or Yukowski. 980 00:51:25,080 --> 00:51:27,120 Speaker 2: The whole point of less wrong dot org is to 981 00:51:27,360 --> 00:51:31,439 Speaker 2: overcome your biases in a methodical way. And they love 982 00:51:31,520 --> 00:51:34,600 Speaker 2: Baysian reasoning, and it basically says, there's no such thing 983 00:51:34,640 --> 00:51:39,440 Speaker 2: as something is just true. Everything is just a probability, 984 00:51:39,480 --> 00:51:42,560 Speaker 2: and you can kind of try to determine how probable 985 00:51:42,640 --> 00:51:46,600 Speaker 2: something is based on whatever evidence you can gather about it. 986 00:51:46,680 --> 00:51:48,440 Speaker 2: Just basically going through life like that. 987 00:51:49,080 --> 00:51:53,600 Speaker 1: You know, who hates that website? Who l E s 988 00:51:53,800 --> 00:51:57,120 Speaker 1: R O NNG That dude who started his own personal 989 00:51:57,280 --> 00:52:01,000 Speaker 1: comedy website right, less wrong dot God. That's right, he's 990 00:52:01,040 --> 00:52:01,880 Speaker 1: just getting smashed. 991 00:52:03,560 --> 00:52:04,279 Speaker 2: What else, Chuck? 992 00:52:06,800 --> 00:52:08,840 Speaker 1: What else is I cultivate a growth mindset? 993 00:52:09,080 --> 00:52:09,920 Speaker 2: That's a big one. 994 00:52:10,239 --> 00:52:13,480 Speaker 1: Hey, I make mistakes. I screw things up, and like 995 00:52:13,600 --> 00:52:16,120 Speaker 1: I need to recognize that and try and grow from 996 00:52:16,120 --> 00:52:21,080 Speaker 1: that rather than you know, just being confirmed with my 997 00:52:21,120 --> 00:52:22,280 Speaker 1: own biases constantly. 998 00:52:22,640 --> 00:52:24,960 Speaker 2: Yeah, maybe like looking around at some of the ways 999 00:52:25,000 --> 00:52:29,319 Speaker 2: that you're commonly exploited, say like by advertisers, Like scarcity 1000 00:52:29,400 --> 00:52:31,880 Speaker 2: is one when somebody says act now supplies are limited. 1001 00:52:31,920 --> 00:52:37,719 Speaker 2: They're creating a scarcity mindset in you social proof basically 1002 00:52:37,800 --> 00:52:40,200 Speaker 2: like these people like this, so you probably should too, 1003 00:52:40,200 --> 00:52:41,719 Speaker 2: and you're like, oh, I should like that too. 1004 00:52:42,320 --> 00:52:42,760 Speaker 1: Yeah. 1005 00:52:42,800 --> 00:52:45,680 Speaker 2: And then two other things I saw. There's something called 1006 00:52:46,360 --> 00:52:54,280 Speaker 2: cognitive bias mod modification I think is what it is. 
Okay, 1007 00:52:54,400 --> 00:52:57,400 Speaker 2: you can use this for like treating anxiety, right, Like 1008 00:52:57,480 --> 00:53:02,879 Speaker 2: people like tend to seek out negative facial expressions. Oh yeah, 1009 00:53:02,920 --> 00:53:07,040 Speaker 2: and this treatment is like like here's a thousand frowny faces. 1010 00:53:07,520 --> 00:53:10,840 Speaker 2: Find the smiley face in there, and just screen after screen, 1011 00:53:10,880 --> 00:53:13,359 Speaker 2: you're looking for the smiley face, and you're training your 1012 00:53:13,360 --> 00:53:18,640 Speaker 2: brain to stop putting as much weight on negative facial expressions, 1013 00:53:19,160 --> 00:53:23,839 Speaker 2: just using like basically exploiting your cognitive bias to get 1014 00:53:23,880 --> 00:53:25,440 Speaker 2: over your cognitive bias. 1015 00:53:25,840 --> 00:53:26,480 Speaker 1: Oh wow. 1016 00:53:26,719 --> 00:53:30,839 Speaker 2: And then the last thing, Chuck, is apparently AIs are 1017 00:53:30,880 --> 00:53:35,560 Speaker 2: starting to show signs of emergent cognitive biases because they 1018 00:53:35,640 --> 00:53:40,279 Speaker 2: use heuristics too, so they're starting to make cognitive they're 1019 00:53:40,280 --> 00:53:44,000 Speaker 2: starting to make errors in judgment in predictable ways, which 1020 00:53:44,000 --> 00:53:46,600 Speaker 2: are cognitive biases, just like humans. 1021 00:53:47,280 --> 00:53:48,879 Speaker 1: Rob Zombie, more human than human. 1022 00:53:49,080 --> 00:53:51,479 Speaker 2: That's right, you got anything else? 1023 00:53:51,920 --> 00:53:52,759 Speaker 1: I got nothing else. 1024 00:53:52,840 --> 00:53:56,520 Speaker 2: This is a good one. This is fun, Chuck. I'm going, well, 1025 00:53:56,560 --> 00:53:59,160 Speaker 2: since Chuck and I both liked this episode, that means 1026 00:53:59,160 --> 00:54:02,440 Speaker 2: we have no choice but for listener mail to be triggered. 1027 00:54:02,719 --> 00:54:06,799 Speaker 1: Right now, I'm going to call this follow up on 1028 00:54:07,080 --> 00:54:11,920 Speaker 1: Sebastopol, because I wondered what the connection there was. Hey, guys, 1029 00:54:12,200 --> 00:54:15,960 Speaker 1: because if you didn't listen, Sebastopol, California, and we were 1030 00:54:15,960 --> 00:54:20,160 Speaker 1: talking about the Sevastopol in the Crimean War, and I 1031 00:54:20,200 --> 00:54:22,879 Speaker 1: was like, there's no way that's a coincidence. And it's not. Hey, guys, 1032 00:54:22,920 --> 00:54:26,200 Speaker 1: listening to the podcast on the Light Brigade from Sonoma County. 1033 00:54:26,320 --> 00:54:31,400 Speaker 1: Our Sebastopol was named after Sevastopol. And here's a little information. 1034 00:54:31,640 --> 00:54:35,440 Speaker 1: The settlement was apparently originally named Pine Grove, and the 1035 00:54:35,520 --> 00:54:40,040 Speaker 1: name change to Sebastopol was attributed to a bar fight 1036 00:54:40,120 --> 00:54:44,879 Speaker 1: in the eighteen fifties which was allegedly compared by a bystander 1037 00:54:45,440 --> 00:54:49,799 Speaker 1: to the long siege of the seaport of Sevastopol during 1038 00:54:49,840 --> 00:54:53,440 Speaker 1: the Crimean War. Wow, so the original name survives in 1039 00:54:53,480 --> 00:54:56,440 Speaker 1: the name of the Pine Grove General Store downtown only, 1040 00:54:57,280 --> 00:55:00,920 Speaker 1: and that is it.
There's also the Russian 1041 00:55:00,960 --> 00:55:04,000 Speaker 1: River Valley, so apparently there is some Russian influence in 1042 00:55:04,040 --> 00:55:05,480 Speaker 1: that area which I didn't know about. And that is 1043 00:55:05,480 --> 00:55:06,279 Speaker 1: from Marsha Ford. 1044 00:55:06,920 --> 00:55:09,879 Speaker 2: Yeah. Also, we want to apologize to all of our 1045 00:55:10,600 --> 00:55:13,759 Speaker 2: Iron Maiden fans who wrote in to be like, yeah, 1046 00:55:14,120 --> 00:55:17,000 Speaker 2: that song, The Trooper, is about that whole battle. 1047 00:55:18,000 --> 00:55:21,120 Speaker 1: Yeah, I didn't know. I am not. I like Iron Maiden, 1048 00:55:21,640 --> 00:55:23,600 Speaker 1: but I didn't have as much shame upon my head 1049 00:55:23,640 --> 00:55:26,760 Speaker 1: as you. But you didn't... Reading the lyrics, it doesn't 1050 00:55:26,800 --> 00:55:31,640 Speaker 1: say, you know, Crimean War and Charge of the Light Brigade, 1051 00:55:31,640 --> 00:55:31,920 Speaker 1: does it? 1052 00:55:32,400 --> 00:55:34,880 Speaker 2: I don't know. I haven't heard it in a while. 1053 00:55:35,200 --> 00:55:37,239 Speaker 2: I'm a big fan of the poster. I love the 1054 00:55:37,280 --> 00:55:38,120 Speaker 2: poster a lot. 1055 00:55:38,440 --> 00:55:39,040 Speaker 1: Yeah, me too. 1056 00:55:39,920 --> 00:55:42,399 Speaker 2: Well, sorry all of you Iron Maiden fans out there. 1057 00:55:42,440 --> 00:55:43,879 Speaker 2: We'll try to do better next time. 1058 00:55:44,040 --> 00:55:45,160 Speaker 1: Yeah, missed opportunity. 1059 00:55:45,200 --> 00:55:47,480 Speaker 2: Who is that that wrote in about Sebastopol? 1060 00:55:48,000 --> 00:55:49,040 Speaker 1: That was Marsha, I believe. 1061 00:55:49,080 --> 00:55:53,120 Speaker 2: Thanks Marsha. Marcia, Marcia, Marcia. We really appreciate you, and 1062 00:55:53,160 --> 00:55:55,600 Speaker 2: if you want to be like Marsha, you can email 1063 00:55:55,680 --> 00:55:58,520 Speaker 2: us as well. Send it off to Stuff podcast at 1064 00:55:58,520 --> 00:56:03,520 Speaker 2: iHeartRadio dot com. 1065 00:56:03,680 --> 00:56:06,560 Speaker 1: Stuff you Should Know is a production of iHeartRadio. For 1066 00:56:06,640 --> 00:56:10,839 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 1067 00:56:10,920 --> 00:56:12,760 Speaker 1: or wherever you listen to your favorite shows.