1 00:00:02,240 --> 00:00:06,840 Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio. 2 00:00:07,880 --> 00:00:10,840 Speaker 1: This week on the podcast, I have an extra special guest, 3 00:00:11,080 --> 00:00:13,520 Speaker 1: and what can I tell you? His name is on 4 00:00:13,560 --> 00:00:16,720 Speaker 1: the tip of your tongue. You know all about his research, 5 00:00:16,800 --> 00:00:20,239 Speaker 1: you know all about the charts that the Internet created 6 00:00:20,360 --> 00:00:23,080 Speaker 1: based on his research. You probably didn't know that that 7 00:00:23,200 --> 00:00:27,480 Speaker 1: wasn't originally his work. David Dunning, famous for the Dunning 8 00:00:27,560 --> 00:00:33,320 Speaker 1: Kruger effect, professor of psychology at Michigan. We talk about 9 00:00:33,360 --> 00:00:37,280 Speaker 1: everything: his research, why people don't know what they don't know, 10 00:00:37,720 --> 00:00:41,080 Speaker 1: how we could get better at decision making. Just absolutely 11 00:00:41,080 --> 00:00:43,760 Speaker 1: a fascinating conversation. If you're at all interested in human 12 00:00:43,800 --> 00:00:47,800 Speaker 1: cognition and psychology, and why we think we're better at 13 00:00:47,840 --> 00:00:51,280 Speaker 1: tasks than we really are, then you're gonna find this 14 00:00:51,360 --> 00:00:55,120 Speaker 1: to be an absolutely fascinating discussion. So, with no further ado, 15 00:00:55,760 --> 00:01:02,080 Speaker 1: my conversation with David Dunning. This is Masters in Business 16 00:01:02,360 --> 00:01:07,360 Speaker 1: with Barry Ritholtz on Bloomberg Radio. My extra special guest 17 00:01:07,440 --> 00:01:10,680 Speaker 1: this week is David Dunning. He is a professor of 18 00:01:10,760 --> 00:01:14,880 Speaker 1: psychology at the University of Michigan, where he focuses on 19 00:01:14,959 --> 00:01:20,320 Speaker 1: the psychology underlying human misbelief. He is best known for 20 00:01:20,360 --> 00:01:26,080 Speaker 1: his study with colleague Justin Kruger, "Unskilled and Unaware of It: 21 00:01:26,440 --> 00:01:31,800 Speaker 1: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated 22 00:01:31,840 --> 00:01:35,920 Speaker 1: Self-Assessments." Dunning and Kruger showed that people who were the 23 00:01:35,920 --> 00:01:41,320 Speaker 1: worst performers significantly overestimated how good they were. He is 24 00:01:41,360 --> 00:01:45,080 Speaker 1: also the author of the book Self-Insight: Roadblocks and 25 00:01:45,160 --> 00:01:49,320 Speaker 1: Detours on the Path to Knowing Thyself. David Dunning, welcome 26 00:01:49,320 --> 00:01:51,560 Speaker 1: to Bloomberg. It's a pleasure to be here. I have 27 00:01:51,680 --> 00:01:54,919 Speaker 1: been looking forward to this conversation for a long time. 28 00:01:55,440 --> 00:01:58,560 Speaker 1: I am a giant fan of your work, and I 29 00:01:58,680 --> 00:02:02,800 Speaker 1: have to start with a really simple question. What's the 30 00:02:02,840 --> 00:02:05,880 Speaker 1: origin of the study? What led you to a thesis 31 00:02:05,960 --> 00:02:10,000 Speaker 1: that we're really bad at self-evaluation? Well, if you're 32 00:02:10,000 --> 00:02:13,880 Speaker 1: an academic, you meet up with many students, and you 33 00:02:13,919 --> 00:02:17,640 Speaker 1: meet up with many colleagues, who say outrageous things, and 34 00:02:17,720 --> 00:02:20,520 Speaker 1: you just have to wonder, don't they know that what they're saying
35 00:02:20,760 --> 00:02:26,359 Speaker 1: is, let me say this diplomatically, odd, suboptimal? And over 36 00:02:26,400 --> 00:02:29,079 Speaker 1: the years I just was intrigued with finding out whether 37 00:02:29,120 --> 00:02:31,560 Speaker 1: or not people knew when they were saying things that 38 00:02:31,600 --> 00:02:35,400 Speaker 1: were outrageous, were obviously wrong on the face of it. 39 00:02:35,880 --> 00:02:39,080 Speaker 1: And so one day Justin Kruger walked into my office, 40 00:02:39,120 --> 00:02:41,720 Speaker 1: said he wanted to do a study with me, and I said, well, 41 00:02:41,720 --> 00:02:44,880 Speaker 1: I have this high-risk, high-reward study to do, 42 00:02:44,960 --> 00:02:46,760 Speaker 1: and it has to do with a question I've often 43 00:02:46,800 --> 00:02:50,760 Speaker 1: wondered about. And so we did the first original series 44 00:02:50,760 --> 00:02:54,600 Speaker 1: of studies and were astonished at how little the people who 45 00:02:54,639 --> 00:02:57,720 Speaker 1: didn't know, knew about how little they knew. 46 00:02:58,240 --> 00:03:01,399 Speaker 1: So I was under the impression that most academics 47 00:03:01,520 --> 00:03:04,480 Speaker 1: have a thesis and there's some data supporting it, and 48 00:03:04,520 --> 00:03:07,680 Speaker 1: when they go out and test it, they have a 49 00:03:07,680 --> 00:03:11,200 Speaker 1: little confirmation bias and they see what they expected to see. 50 00:03:11,800 --> 00:03:15,120 Speaker 1: You're saying you guys were just shocked by the results 51 00:03:15,160 --> 00:03:17,160 Speaker 1: of this study. That's right. I mean, we expected it 52 00:03:17,160 --> 00:03:18,880 Speaker 1: to work, because if you think about the logic of it, 53 00:03:18,960 --> 00:03:21,799 Speaker 1: it has to work. The question was one of magnitude. 54 00:03:22,520 --> 00:03:24,919 Speaker 1: When a student was failing the course, for example, or 55 00:03:24,919 --> 00:03:28,960 Speaker 1: we're giving them a pop quiz on grammar, did 56 00:03:29,560 --> 00:03:32,320 Speaker 1: they have some inkling that they were performing really poorly? 57 00:03:32,919 --> 00:03:36,000 Speaker 1: And the answer was maybe a little, but not much, 58 00:03:36,040 --> 00:03:38,880 Speaker 1: and they were missing their true performance level by a 59 00:03:38,920 --> 00:03:42,680 Speaker 1: mile. By a mile. So this 60 00:03:42,680 --> 00:03:46,280 Speaker 1: really raises a number of questions. I 61 00:03:46,600 --> 00:03:51,160 Speaker 1: love the phrase metacognition, the ability to self-evaluate your 62 00:03:51,200 --> 00:03:54,840 Speaker 1: skill set, and your findings essentially find that this is 63 00:03:54,960 --> 00:03:58,360 Speaker 1: highly correlated with an underlying skill. Whenever I try and 64 00:03:58,400 --> 00:04:01,440 Speaker 1: explain this to a layperson, it's: pro golfers know 65 00:04:01,640 --> 00:04:04,119 Speaker 1: how good they are and where the weaknesses in their 66 00:04:04,120 --> 00:04:08,320 Speaker 1: games are. Amateurs have no idea that they're not remotely 67 00:04:08,360 --> 00:04:10,080 Speaker 1: as good as they think they are. Is that 68 00:04:10,120 --> 00:04:13,240 Speaker 1: a fair... Oh, I'm a perfect example of this.
So 69 00:04:13,320 --> 00:04:16,039 Speaker 1: when I go out and golf, I often end up 70 00:04:16,080 --> 00:04:18,360 Speaker 1: in the rough when I drive 71 00:04:18,440 --> 00:04:21,000 Speaker 1: the ball, and then I see the ball going into the 72 00:04:21,080 --> 00:04:22,680 Speaker 1: rough and I go out to find it later on, 73 00:04:22,720 --> 00:04:25,280 Speaker 1: and I'm always over-guessing how far the ball went 74 00:04:25,800 --> 00:04:29,000 Speaker 1: in the rough by about thirty yards. And I know this, 75 00:04:29,200 --> 00:04:31,480 Speaker 1: yet every time I drive the ball into the rough, 76 00:04:31,640 --> 00:04:34,359 Speaker 1: I'm looking in the wrong place. So, yeah, I 77 00:04:34,360 --> 00:04:37,960 Speaker 1: mean, amateur golfers don't know such terms as course 78 00:04:38,000 --> 00:04:41,280 Speaker 1: management, for example. There are a number of concepts and a 79 00:04:41,360 --> 00:04:43,760 Speaker 1: number of ideas they just simply don't have available to them. 80 00:04:43,760 --> 00:04:45,880 Speaker 1: And as a consequence, they think they're doing the 81 00:04:45,880 --> 00:04:48,480 Speaker 1: best possible job, when in fact there's a whole realm 82 00:04:48,520 --> 00:04:50,719 Speaker 1: of competencies they don't know about. They're just wholly 83 00:04:50,760 --> 00:04:53,240 Speaker 1: unaware of what they don't know about. That's right. So 84 00:04:53,480 --> 00:04:59,920 Speaker 1: you begin the paper with an amusing anecdote. Tell us about 85 00:05:00,000 --> 00:05:04,560 Speaker 1: the Pittsburgh bank robber McArthur Wheeler. Well, McArthur Wheeler 86 00:05:05,320 --> 00:05:11,159 Speaker 1: was an aspiring bank robber who decided to go out 87 00:05:11,440 --> 00:05:14,400 Speaker 1: and rob, but needed a disguise. And he had heard 88 00:05:14,400 --> 00:05:17,560 Speaker 1: that if you rub your face with lemon juice, it 89 00:05:17,640 --> 00:05:22,880 Speaker 1: renders the face fuzzy or even invisible 90 00:05:23,000 --> 00:05:27,320 Speaker 1: to bank security cameras. And so he actually did test 91 00:05:27,360 --> 00:05:30,080 Speaker 1: it out. He actually rubbed his face with lemon juice at home, 92 00:05:30,360 --> 00:05:33,960 Speaker 1: pointed a Polaroid camera or whatever at his face, and 93 00:05:34,000 --> 00:05:37,240 Speaker 1: then he wasn't there. He had misaimed the camera, 94 00:05:37,480 --> 00:05:39,960 Speaker 1: but he thought he was invisible. 95 00:05:40,000 --> 00:05:43,720 Speaker 1: He went out with no actual disguise, robbed two Pittsburgh 96 00:05:43,720 --> 00:05:49,039 Speaker 1: area banks during the daytime, was immediately caught 97 00:05:49,080 --> 00:05:53,480 Speaker 1: on security cameras. Those tapes were broadcast on the news, 98 00:05:54,120 --> 00:05:58,159 Speaker 1: and he himself was caught before the eleven o'clock news hour, 99 00:05:58,480 --> 00:06:01,320 Speaker 1: and he was incredulous because, as he said, I wore 100 00:06:01,360 --> 00:06:06,280 Speaker 1: the juice. I wore the juice. So thus 101 00:06:06,400 --> 00:06:09,560 Speaker 1: ended his career. But these are the sorts of mistakes we 102 00:06:09,600 --> 00:06:11,200 Speaker 1: make all the time. We think we have a 103 00:06:11,200 --> 00:06:13,960 Speaker 1: strategy that's going to work, and to our surprise, the 104 00:06:14,000 --> 00:06:17,040 Speaker 1: world has a different lesson for us to learn.
So 105 00:06:18,040 --> 00:06:22,400 Speaker 1: metacognition sometimes looks a little bit like overconfidence. 106 00:06:22,800 --> 00:06:25,960 Speaker 1: How similar or different are the two? Well, metacognition 107 00:06:26,120 --> 00:06:29,040 Speaker 1: is a number of things, a number of skills that 108 00:06:29,120 --> 00:06:35,000 Speaker 1: underlie being able to evaluate your judgments, evaluate your decisions. 109 00:06:35,000 --> 00:06:38,040 Speaker 1: So sometimes it's overconfidence. Usually it's overconfidence. 110 00:06:38,640 --> 00:06:40,880 Speaker 1: It can be underconfidence, thinking you can't do something 111 00:06:40,880 --> 00:06:44,479 Speaker 1: that you can do. It might be overconfidence 112 00:06:44,560 --> 00:06:48,880 Speaker 1: or underconfidence, but does your confidence rise and fall 113 00:06:49,000 --> 00:06:51,120 Speaker 1: with the accuracy of your judgment? So is there a 114 00:06:51,240 --> 00:06:54,920 Speaker 1: relationship, whether or not your confidence is a speedometer 115 00:06:55,040 --> 00:06:58,919 Speaker 1: that overstates or understates how well you're doing? But 116 00:06:59,040 --> 00:07:03,000 Speaker 1: it also is knowing how to make a judgment, 117 00:07:03,279 --> 00:07:08,240 Speaker 1: knowing when to stop thinking and start acting, knowing 118 00:07:08,680 --> 00:07:12,360 Speaker 1: when there's a doubt that you really should be 119 00:07:12,400 --> 00:07:16,120 Speaker 1: following up on. So overconfidence as a phenomenon, I 120 00:07:16,160 --> 00:07:18,760 Speaker 1: think, lies within a whole family of skills that you 121 00:07:18,760 --> 00:07:23,120 Speaker 1: can call metacognition, which is basically skill in knowing how 122 00:07:23,120 --> 00:07:26,960 Speaker 1: to evaluate your thinking and control your thinking. Quite fascinating. 123 00:07:27,440 --> 00:07:32,600 Speaker 1: Let's talk a little bit about your "Unskilled and Unaware 124 00:07:32,640 --> 00:07:35,320 Speaker 1: of It." This blew up into one of the most 125 00:07:35,400 --> 00:07:39,480 Speaker 1: famous psychology papers ever. When you and Kruger were 126 00:07:39,520 --> 00:07:42,480 Speaker 1: writing this, did you have any idea that it was 127 00:07:42,520 --> 00:07:45,800 Speaker 1: going to be this explosive? No, because I thought it 128 00:07:45,840 --> 00:07:48,040 Speaker 1: was going to have trouble being published, because it actually 129 00:07:48,160 --> 00:07:51,840 Speaker 1: is an unusual piece of work given the usual structure 130 00:07:51,880 --> 00:07:55,000 Speaker 1: of a paper in the journal we ultimately submitted to. 131 00:07:55,600 --> 00:07:59,160 Speaker 1: So the fact that it blew up was a big surprise. 132 00:07:59,640 --> 00:08:02,160 Speaker 1: The fact that it got published was also a big surprise, 133 00:08:02,200 --> 00:08:04,600 Speaker 1: which made me very, very happy, because internally I thought it 134 00:08:04,600 --> 00:08:05,840 Speaker 1: was a good piece of work, but I didn't know 135 00:08:05,840 --> 00:08:08,520 Speaker 1: if the world was going to agree. So I've 136 00:08:08,600 --> 00:08:12,640 Speaker 1: seen your work misstated in a variety of ways. I'm 137 00:08:12,680 --> 00:08:16,040 Speaker 1: sure you have also. The one that I notice all 138 00:08:16,080 --> 00:08:20,280 Speaker 1: the time is: stupid people don't know they're stupid.
And 139 00:08:20,320 --> 00:08:23,680 Speaker 1: while that could very well be true, that is not 140 00:08:24,120 --> 00:08:26,880 Speaker 1: the basic theme of your research, is it? No, 141 00:08:27,640 --> 00:08:30,120 Speaker 1: we were very clear from the outset that the Dunning 142 00:08:30,200 --> 00:08:34,079 Speaker 1: Kruger effect is something that can visit anybody at any time. 143 00:08:34,520 --> 00:08:36,960 Speaker 1: That is, each of us has our own pockets of 144 00:08:37,600 --> 00:08:40,520 Speaker 1: incompetence and we just don't know when we wander into them. 145 00:08:40,760 --> 00:08:44,839 Speaker 1: Often the one mistake that people 146 00:08:44,880 --> 00:08:47,520 Speaker 1: make is thinking that the Dunning Kruger effect is about them, 147 00:08:48,400 --> 00:08:50,880 Speaker 1: those, as you say, stupid people out there, and the 148 00:08:50,880 --> 00:08:55,480 Speaker 1: paper really was about us and ourselves and being 149 00:08:56,559 --> 00:08:59,000 Speaker 1: vigilant about the fact that sometimes we're going to wander 150 00:08:59,160 --> 00:09:02,800 Speaker 1: into our own little personal disaster, not knowing that 151 00:09:02,880 --> 00:09:06,480 Speaker 1: a disaster is imminent. So people trying to explain Dunning 152 00:09:06,559 --> 00:09:09,760 Speaker 1: Kruger themselves are suffering from the Dunning Kruger effect in 153 00:09:09,800 --> 00:09:12,560 Speaker 1: many different ways. So if you give me a moment, 154 00:09:12,920 --> 00:09:16,120 Speaker 1: there are two different ways that people get it wrong. The first 155 00:09:16,200 --> 00:09:19,240 Speaker 1: is to think it's about other people and it's not about me. 156 00:09:20,280 --> 00:09:23,320 Speaker 1: The second is thinking that incompetent people are the most 157 00:09:23,360 --> 00:09:26,400 Speaker 1: confident people in the room. That's not necessarily true. Occasionally 158 00:09:26,440 --> 00:09:28,960 Speaker 1: that shows up in our data, but they are usually 159 00:09:29,080 --> 00:09:33,760 Speaker 1: less confident than the really competent people, but not by that much. 160 00:09:33,800 --> 00:09:36,839 Speaker 1: But the real thing that I think is fascinating, 161 00:09:36,880 --> 00:09:38,640 Speaker 1: and this has only happened in the past five years, 162 00:09:39,240 --> 00:09:41,640 Speaker 1: is that if you Google images of the Dunning Kruger effect, 163 00:09:41,760 --> 00:09:45,760 Speaker 1: the charts... The chart. Well, we didn't do that. Those aren't 164 00:09:45,760 --> 00:09:48,439 Speaker 1: our charts. So you didn't do Mount Stupid or the 165 00:09:49,720 --> 00:09:53,640 Speaker 1: Valley of Despair? No, we did not. That has 166 00:09:53,720 --> 00:09:56,440 Speaker 1: nothing to do whatsoever with our '99 paper or anything 167 00:09:56,440 --> 00:10:01,160 Speaker 1: that we did subsequently. And two notes on that. First, 168 00:10:01,200 --> 00:10:03,120 Speaker 1: I think it's delicious that a lot of people 169 00:10:03,240 --> 00:10:06,160 Speaker 1: think they're talking about the 170 00:10:06,200 --> 00:10:09,320 Speaker 1: Dunning Kruger effect, they're videotaping talks on the Dunning Kruger effect, 171 00:10:09,320 --> 00:10:12,760 Speaker 1: and what they're talking about is not the Dunning Kruger effect. 172 00:10:12,800 --> 00:10:19,199 Speaker 1: They're suffering the effect, about the effect itself. That's 173 00:10:19,320 --> 00:10:22,800 Speaker 1: the first.
The second note, though, is given this situation, 174 00:10:22,920 --> 00:10:27,000 Speaker 1: we did face a dilemma in the lab: how do 175 00:10:27,040 --> 00:10:30,280 Speaker 1: we fix this? How do we correct this? And so, 176 00:10:30,920 --> 00:10:34,080 Speaker 1: and this is true, in part we decided the most efficient, 177 00:10:34,120 --> 00:10:36,520 Speaker 1: ethical thing to do was to steal the idea from 178 00:10:36,559 --> 00:10:39,719 Speaker 1: the internet, because the other problem with the idea, other 179 00:10:39,760 --> 00:10:41,480 Speaker 1: than it not being the Dunning Kruger effect, is that 180 00:10:41,559 --> 00:10:44,120 Speaker 1: it's more interesting than the Dunning Kruger effect. So 181 00:10:44,200 --> 00:10:46,200 Speaker 1: we stole the idea, tested it, and it turns out 182 00:10:46,240 --> 00:10:49,840 Speaker 1: that Mount Stupid, the Valley of Despair, the Plateau of Enlightenment, 183 00:10:50,240 --> 00:10:52,840 Speaker 1: the time course that people see, we pretty much get 184 00:10:52,920 --> 00:10:57,040 Speaker 1: that pattern as we take people through a completely 185 00:10:57,080 --> 00:11:01,199 Speaker 1: novel task. So the internet is right. So, in other words, 186 00:11:01,679 --> 00:11:05,760 Speaker 1: and I'm intrigued and fascinated by this, you never 187 00:11:05,840 --> 00:11:09,800 Speaker 1: put out a chart. I always assumed that that chart 188 00:11:09,920 --> 00:11:12,679 Speaker 1: had to come from your data, because, what, were people 189 00:11:12,720 --> 00:11:16,280 Speaker 1: just drawing lines and making it up? And PS, it 190 00:11:16,480 --> 00:11:21,240 Speaker 1: intuitively looks right. You would assume... Hey, so I 191 00:11:21,320 --> 00:11:25,080 Speaker 1: play tennis. I only started recently, less than ten years ago. 192 00:11:25,520 --> 00:11:27,440 Speaker 1: And when you start out and you're starting to hit the 193 00:11:27,440 --> 00:11:29,240 Speaker 1: ball and you feel like you have some control and 194 00:11:29,280 --> 00:11:32,400 Speaker 1: you have some skill, then you're working 195 00:11:32,440 --> 00:11:35,280 Speaker 1: your way up that Mount Stupid. And then when you 196 00:11:35,320 --> 00:11:38,960 Speaker 1: actually start to develop some skill, not that I really have, 197 00:11:39,320 --> 00:11:42,160 Speaker 1: but I'm better than I was five years ago, you realize, oh, 198 00:11:42,200 --> 00:11:43,880 Speaker 1: I didn't know what the heck I was doing, I was 199 00:11:44,000 --> 00:11:47,240 Speaker 1: just hitting the ball and getting lucky when it catches the tape, 200 00:11:47,360 --> 00:11:49,720 Speaker 1: and all of a sudden you realize, oh, I'm way 201 00:11:49,760 --> 00:11:52,520 Speaker 1: down this curve. And then you continue playing, you get a 202 00:11:52,520 --> 00:11:54,560 Speaker 1: little better and a little better. I don't know if 203 00:11:54,600 --> 00:12:00,160 Speaker 1: this is all rationalization, but it intuitively seems to make sense. Well, 204 00:12:00,240 --> 00:12:02,079 Speaker 1: not only does it intuitively make sense, it turns out 205 00:12:02,120 --> 00:12:06,560 Speaker 1: to make sense. In a paper with Carmen Sanchez, 206 00:12:06,600 --> 00:12:10,240 Speaker 1: we were able to demonstrate that basically what happens is 207 00:12:10,280 --> 00:12:12,400 Speaker 1: when you start a task... And what we did is 208 00:12:12,440 --> 00:12:15,679 Speaker 1: we had people...
We put people in a post-apocalyptic 209 00:12:15,720 --> 00:12:19,240 Speaker 1: world where they had to, without supervision but with feedback, 210 00:12:19,360 --> 00:12:25,120 Speaker 1: diagnose who was infected with a zombie disease, 211 00:12:25,200 --> 00:12:27,640 Speaker 1: hoping that that wasn't something that people had experience with. 212 00:12:28,679 --> 00:12:31,959 Speaker 1: And basically what happens is, if you're a beginner, you 213 00:12:32,280 --> 00:12:35,760 Speaker 1: start out way at the beginning being appropriately cautious. You 214 00:12:35,760 --> 00:12:37,080 Speaker 1: really don't know what you're doing, and you know it. 215 00:12:37,559 --> 00:12:39,760 Speaker 1: But the problem is that you have a few successes, 216 00:12:39,840 --> 00:12:42,360 Speaker 1: they're probably due to luck more than skill, and you 217 00:12:42,360 --> 00:12:46,040 Speaker 1: think you have it. That is, people arrive at a 218 00:12:46,120 --> 00:12:49,880 Speaker 1: theory based on data which is far too early, far 219 00:12:49,920 --> 00:12:52,240 Speaker 1: too sparse, and far too unreliable, but they think, I've 220 00:12:52,320 --> 00:12:54,720 Speaker 1: got it. And then the next phase that they have 221 00:12:54,760 --> 00:12:58,600 Speaker 1: to go through is realizing, oh, that theory really doesn't work. 222 00:12:58,640 --> 00:13:00,120 Speaker 1: And so we've been able to track that and to 223 00:13:00,160 --> 00:13:04,000 Speaker 1: show that in a number of studies. So the 224 00:13:04,000 --> 00:13:07,120 Speaker 1: internet is right. I'm very pleased with its intuition 225 00:13:07,160 --> 00:13:11,040 Speaker 1: on this one. But it is a little bit 226 00:13:11,160 --> 00:13:13,840 Speaker 1: odd to get credit for an insight that we never had 227 00:13:14,120 --> 00:13:16,720 Speaker 1: but were very happy to steal. So essentially, when you 228 00:13:17,120 --> 00:13:23,200 Speaker 1: run the data showing the correlation between skill and 229 00:13:24,360 --> 00:13:28,079 Speaker 1: the ability to self-evaluate, you end up with a 230 00:13:28,160 --> 00:13:32,800 Speaker 1: chart that, in this paper, looks remarkably similar to 231 00:13:32,840 --> 00:13:37,080 Speaker 1: all the various pop psychology Mount Stupid charts that 232 00:13:37,080 --> 00:13:39,719 Speaker 1: are out there. Well, yeah. As you gain experience, you 233 00:13:39,800 --> 00:13:42,640 Speaker 1: unfortunately start with a burst of overconfidence. I've got this. 234 00:13:42,800 --> 00:13:46,679 Speaker 1: No, you don't. And then experience basically is correcting 235 00:13:46,760 --> 00:13:50,280 Speaker 1: your flattering impression of your skill as time goes on, 236 00:13:50,440 --> 00:13:55,960 Speaker 1: until at some point learning stops, because the experience is not 237 00:13:56,120 --> 00:14:02,320 Speaker 1: new, or learning hits human limits. But that 238 00:14:02,480 --> 00:14:05,720 Speaker 1: is a pattern. By the way, if anybody flies an airplane, 239 00:14:05,800 --> 00:14:09,080 Speaker 1: they perfectly understand this pattern. It's not beginning pilots who 240 00:14:09,120 --> 00:14:12,520 Speaker 1: are the most dangerous. It's pilots with, let's say, six 241 00:14:13,160 --> 00:14:17,480 Speaker 1: hundred flight hours. They have enough experience
to think 242 00:14:17,480 --> 00:14:19,800 Speaker 1: that they've got this, and they enter into what's referred 243 00:14:19,800 --> 00:14:23,200 Speaker 1: to as the killing zone, where accidents are most likely 244 00:14:23,240 --> 00:14:27,000 Speaker 1: to happen. All of this raises the question of how 245 00:14:27,120 --> 00:14:31,320 Speaker 1: much of an independent skill is self-assessment? Or, asked differently, 246 00:14:31,800 --> 00:14:34,720 Speaker 1: do you have to be skilled at the underlying task 247 00:14:34,800 --> 00:14:37,920 Speaker 1: at hand in order to have any skill set in 248 00:14:37,960 --> 00:14:41,440 Speaker 1: evaluating it, or can they be learned independently? I think 249 00:14:41,480 --> 00:14:45,560 Speaker 1: research actually has to look at this a little bit more. 250 00:14:45,760 --> 00:14:47,800 Speaker 1: One of the things that we know, and we've followed 251 00:14:47,880 --> 00:14:51,880 Speaker 1: up on this, is there's direct skill in doing 252 00:14:51,880 --> 00:14:55,040 Speaker 1: the task, direct skill in doing the judgment, and then 253 00:14:55,040 --> 00:14:59,160 Speaker 1: there is potentially another layer, which is evaluating the judgment. 254 00:15:00,080 --> 00:15:02,640 Speaker 1: The question is how much does that second judgment rely 255 00:15:02,720 --> 00:15:06,000 Speaker 1: on knowledge in the first? And from our data, 256 00:15:06,560 --> 00:15:11,160 Speaker 1: it's clear that accuracy in knowing whether or not you're 257 00:15:11,240 --> 00:15:15,400 Speaker 1: right is very correlated with accuracy in the first place. 258 00:15:15,800 --> 00:15:17,760 Speaker 1: Are you really good at the skill? Can you reach 259 00:15:17,800 --> 00:15:21,000 Speaker 1: an accurate judgment? Now, it's not true in everything. It's 260 00:15:21,000 --> 00:15:24,960 Speaker 1: not true in golf. I know just how bad 261 00:15:25,280 --> 00:15:27,600 Speaker 1: my golf game is because I tend to score my 262 00:15:28,160 --> 00:15:30,320 Speaker 1: rounds not in terms of shots, but in terms of 263 00:15:30,320 --> 00:15:34,400 Speaker 1: how many balls did I lose on the course, and that's 264 00:15:34,440 --> 00:15:37,160 Speaker 1: a metric that gives me a pretty good 265 00:15:37,160 --> 00:15:39,800 Speaker 1: indication of how bad I am. So you could 266 00:15:39,880 --> 00:15:44,200 Speaker 1: self-evaluate without even seeing your skill, your actual scorecard score. 267 00:15:44,520 --> 00:15:47,000 Speaker 1: You just count the lost balls. Yeah, that's the 268 00:15:47,040 --> 00:15:51,400 Speaker 1: real thing. But there are a lot of 269 00:15:51,520 --> 00:15:56,080 Speaker 1: skills, though, where accuracy at the metacognitive task, judging 270 00:15:56,120 --> 00:15:59,440 Speaker 1: whether or not you're right, really depends on 271 00:15:59,480 --> 00:16:00,960 Speaker 1: your skill in the first task, which is gaining a 272 00:16:01,040 --> 00:16:06,840 Speaker 1: right judgment. Financial forecasting would be 273 00:16:06,880 --> 00:16:11,800 Speaker 1: an example. That's easy pickings, that's fishing in a barrel, from 274 00:16:11,800 --> 00:16:15,760 Speaker 1: what I hear. And giving a good lecture, in my world. Well, 275 00:16:15,920 --> 00:16:20,640 Speaker 1: you do have to judge internally, am I really giving 276 00:16:20,680 --> 00:16:22,480 Speaker 1: a good lecture or not? You can't really depend on 277 00:16:22,520 --> 00:16:25,400 Speaker 1: the audience.
Audiences can be good, audiences can be bad. 278 00:16:26,120 --> 00:16:30,920 Speaker 1: And so the choices you make, well, they 279 00:16:30,960 --> 00:16:34,200 Speaker 1: depend on skill, but your evaluation of those choices probably 280 00:16:34,240 --> 00:16:36,200 Speaker 1: depends on how good you are at knowing what a 281 00:16:36,240 --> 00:16:39,160 Speaker 1: good lecture looks like, what a good lecture sounds like. 282 00:16:39,440 --> 00:16:44,000 Speaker 1: So let's talk a little bit about academic psychology and 283 00:16:44,440 --> 00:16:47,440 Speaker 1: your background and what it's like teaching these days. You 284 00:16:47,560 --> 00:16:52,600 Speaker 1: got your PhD at Stanford at a time when, I 285 00:16:52,640 --> 00:16:54,600 Speaker 1: guess you could still say it today, it was the 286 00:16:54,680 --> 00:16:58,120 Speaker 1: mecca of psychology, wasn't it? Yes, it was. So who 287 00:16:58,120 --> 00:17:01,920 Speaker 1: did you study under? I studied under Ross primarily, and was 288 00:17:01,960 --> 00:17:05,480 Speaker 1: also mentored a little bit by Phoebe Ellsworth, who these last 289 00:17:05,480 --> 00:17:08,320 Speaker 1: few years I've been a colleague of at Michigan. But 290 00:17:08,480 --> 00:17:12,920 Speaker 1: it really was a village. Everybody among the faculty 291 00:17:13,280 --> 00:17:15,600 Speaker 1: was on the same page, so to speak. And so 292 00:17:15,880 --> 00:17:18,440 Speaker 1: I'd have to say that entire faculty raised me, as 293 00:17:18,440 --> 00:17:21,240 Speaker 1: it did a lot of other people. Quite interesting. So 294 00:17:21,680 --> 00:17:24,960 Speaker 1: you've been studying psychology for a long time. Have you 295 00:17:25,080 --> 00:17:30,480 Speaker 1: found in the rest of your life's decision making that 296 00:17:30,560 --> 00:17:35,760 Speaker 1: you've become more rational and a better decision maker? I 297 00:17:35,800 --> 00:17:40,040 Speaker 1: think life has provided those lessons, yes, and I've certainly 298 00:17:40,040 --> 00:17:44,200 Speaker 1: become more experienced in my work. So I bear 299 00:17:44,280 --> 00:17:47,239 Speaker 1: the scars, I bear the wounds, but I do 300 00:17:47,359 --> 00:17:50,240 Speaker 1: think that I am a little wiser because of it. 301 00:17:51,040 --> 00:17:53,880 Speaker 1: One of the things, or one of the principles 302 00:17:53,920 --> 00:17:57,439 Speaker 1: I often live by, is: are you vaguely embarrassed by 303 00:17:57,640 --> 00:18:02,000 Speaker 1: something you did five, ten years ago? And so I'll 304 00:18:02,040 --> 00:18:04,760 Speaker 1: read things that I did five or ten years ago, 305 00:18:04,840 --> 00:18:07,040 Speaker 1: and I find myself thinking, I shouldn't have done it that way, 306 00:18:07,359 --> 00:18:11,119 Speaker 1: and I take that as a pleasant emotion. It suggests 307 00:18:11,119 --> 00:18:13,720 Speaker 1: I'm in a different place now than I was back then. 308 00:18:14,040 --> 00:18:17,080 Speaker 1: So I go through something similar. Every five years, 309 00:18:17,080 --> 00:18:20,720 Speaker 1: I'm mortified by the five-year-younger version of me. 310 00:18:20,760 --> 00:18:23,000 Speaker 1: But I never took the next step to say, well, 311 00:18:23,040 --> 00:18:25,919 Speaker 1: I guess this means I'm growing. I've always been just 312 00:18:26,000 --> 00:18:30,880 Speaker 1: so horrified at the younger version. I didn't 313 00:18:30,920 --> 00:18:32,760 Speaker 1: make the leap that,
oh, I guess this means 314 00:18:32,880 --> 00:18:37,000 Speaker 1: that's progress. So let's talk a little bit about 315 00:18:37,040 --> 00:18:40,399 Speaker 1: things like that, about learning and norms. You write 316 00:18:40,400 --> 00:18:43,240 Speaker 1: a lot about social norms. Why do you find this 317 00:18:43,359 --> 00:18:46,760 Speaker 1: topic so fascinating? Well, social norms, I think, are 318 00:18:47,359 --> 00:18:51,600 Speaker 1: a surprisingly understudied thing in the behavioral sciences. There are 319 00:18:51,600 --> 00:18:55,720 Speaker 1: people who study it, but social norms are an incredible 320 00:18:55,760 --> 00:18:59,960 Speaker 1: guide to successful human behavior, not only for individuals 321 00:19:00,080 --> 00:19:04,919 Speaker 1: but for society, but also, at times, the source 322 00:19:04,960 --> 00:19:09,200 Speaker 1: of the greatest calamity, if you will. So why 323 00:19:09,280 --> 00:19:12,919 Speaker 1: is it? Give us some examples to better understand that. Well, 324 00:19:12,960 --> 00:19:16,879 Speaker 1: I think that the clearest example that comes to mind is, 325 00:19:17,440 --> 00:19:20,199 Speaker 1: let's take norms of politeness. And let's talk about the 326 00:19:20,200 --> 00:19:24,399 Speaker 1: fact that the FAA has recorded, I believe, I'm not 327 00:19:24,400 --> 00:19:27,720 Speaker 1: sure of the number, sixteen times where the crew in the 328 00:19:27,760 --> 00:19:32,160 Speaker 1: cockpit of an airliner knew that the pilot 329 00:19:32,200 --> 00:19:34,000 Speaker 1: was doing something wrong and they were going to crash 330 00:19:34,000 --> 00:19:37,520 Speaker 1: into a mountain. The pilot didn't seem to know, but 331 00:19:37,600 --> 00:19:40,480 Speaker 1: they're polite, and so they indirectly keep telling the pilot, 332 00:19:41,240 --> 00:19:44,600 Speaker 1: you better change things up, but they don't say it directly. 333 00:19:44,880 --> 00:19:47,760 Speaker 1: And if you listen to the black box recordings, those 334 00:19:47,760 --> 00:19:52,280 Speaker 1: planes crash. So there's a norm that we 335 00:19:52,720 --> 00:19:55,639 Speaker 1: try not to embarrass the other person. It's a 336 00:19:55,720 --> 00:19:59,000 Speaker 1: very important norm for day-to-day life. Imagine day 337 00:19:59,000 --> 00:20:01,000 Speaker 1: to day life without it. But it can go to 338 00:20:01,040 --> 00:20:04,960 Speaker 1: extremes in terms of not telling pilots that they're 339 00:20:04,960 --> 00:20:09,280 Speaker 1: on a course to disaster, or not telling doctors that 340 00:20:09,280 --> 00:20:12,440 Speaker 1: they're operating on the wrong leg, for example. Really. And 341 00:20:12,920 --> 00:20:15,120 Speaker 1: so to me, that sounds a lot like just deferral 342 00:20:15,160 --> 00:20:19,520 Speaker 1: to authority. How much of that is just being a 343 00:20:19,560 --> 00:20:22,720 Speaker 1: good little soldier, and how much of that is social norms? 344 00:20:22,760 --> 00:20:25,080 Speaker 1: Or are they, you know, two sides of the same coin? Well, 345 00:20:25,080 --> 00:20:26,520 Speaker 1: they're two sides of the same coin. I mean, we 346 00:20:26,560 --> 00:20:28,680 Speaker 1: defer to authority, but we also defer to each other, 347 00:20:29,359 --> 00:20:32,240 Speaker 1: and by and large it's there because it has an 348 00:20:32,240 --> 00:20:35,960 Speaker 1: overall positive impact. But it can go too far. 349 00:20:36,119 --> 00:20:39,160 Speaker 1: So...
And the question becomes knowing when it's going 350 00:20:39,160 --> 00:20:41,480 Speaker 1: too far and being able to break the norm. And 351 00:20:41,520 --> 00:20:44,800 Speaker 1: what I find interesting, though, is that norms permeate our life. 352 00:20:45,520 --> 00:20:49,080 Speaker 1: For example, there are norms that we know that we 353 00:20:49,320 --> 00:20:52,280 Speaker 1: don't know that we know. So, just to 354 00:20:52,359 --> 00:20:56,679 Speaker 1: give you one example: we know it's Teenage 355 00:20:57,080 --> 00:21:00,359 Speaker 1: Mutant Ninja Turtles as opposed to Teenage Ninja Mutant Turtles, 356 00:21:00,880 --> 00:21:05,280 Speaker 1: as opposed to Mutant Ninja Teenage Turtles. That sounds odd. 357 00:21:06,200 --> 00:21:08,760 Speaker 1: There's a rule in how you stack up adjectives before 358 00:21:08,760 --> 00:21:11,480 Speaker 1: a noun, and we all follow that rule and 359 00:21:11,480 --> 00:21:13,240 Speaker 1: we know when that rule's being violated, but we don't 360 00:21:13,240 --> 00:21:15,919 Speaker 1: know that rule. But there are a lot of rules 361 00:21:15,920 --> 00:21:18,040 Speaker 1: in our language, a lot of rules in our behavior, 362 00:21:18,680 --> 00:21:21,200 Speaker 1: a lot of rules in our etiquette that we're following, 363 00:21:21,320 --> 00:21:23,040 Speaker 1: but we're so skilled at them we don't know that 364 00:21:23,040 --> 00:21:25,320 Speaker 1: we're following them. We just internalize them and we're not aware 365 00:21:25,320 --> 00:21:27,879 Speaker 1: of that. That's right. And so how does that 366 00:21:27,960 --> 00:21:30,679 Speaker 1: come back? How do you deal with that when you 367 00:21:30,760 --> 00:21:34,480 Speaker 1: have a deferring copilot and the pilot's about to 368 00:21:34,560 --> 00:21:37,199 Speaker 1: hit a mountain? You have to train people to have 369 00:21:37,240 --> 00:21:40,680 Speaker 1: a different norm. So you just completely break the underlying 370 00:21:40,720 --> 00:21:43,959 Speaker 1: norm and replace it with something for safety purposes. That's right. 371 00:21:44,000 --> 00:21:47,040 Speaker 1: Either you invent a procedure or you invent a piece 372 00:21:47,040 --> 00:21:48,639 Speaker 1: of equipment that's going to tell the pilot that 373 00:21:48,680 --> 00:21:52,360 Speaker 1: they're in error, or a piece of equipment that 374 00:21:52,359 --> 00:21:55,000 Speaker 1: prevents the error in the first place. So, for example, 375 00:21:55,080 --> 00:21:57,040 Speaker 1: in terms of wrong-side surgery, this is a 376 00:21:57,080 --> 00:21:59,600 Speaker 1: thing that can happen, but it happens much less than 377 00:21:59,600 --> 00:22:03,360 Speaker 1: it used to, basically because the medical profession has instituted 378 00:22:03,400 --> 00:22:07,000 Speaker 1: procedures to just avoid the error, another norm, if you will. 379 00:22:07,080 --> 00:22:10,280 Speaker 1: So I remember when I had eye surgery.
I'm having 380 00:22:10,280 --> 00:22:14,560 Speaker 1: a pleasant conversation with the eye surgeon beforehand, and 381 00:22:14,600 --> 00:22:16,520 Speaker 1: at the end he asks, oh, by the way, it's 382 00:22:16,560 --> 00:22:20,919 Speaker 1: your right eye we're doing today, right? And I 383 00:22:20,960 --> 00:22:22,840 Speaker 1: go, yes. Well, he knew it was the right eye, 384 00:22:22,880 --> 00:22:26,040 Speaker 1: but he had to check, and then he signed, you know, 385 00:22:26,359 --> 00:22:30,359 Speaker 1: my forehead above my right eye, just to make sure, 386 00:22:30,480 --> 00:22:34,200 Speaker 1: to avoid wrong-side surgery. So I'm just horrified 387 00:22:34,200 --> 00:22:36,679 Speaker 1: at the thought that there's a room full of surgeons 388 00:22:37,000 --> 00:22:39,840 Speaker 1: and someone starts sawing off the wrong leg and nobody 389 00:22:39,880 --> 00:22:44,840 Speaker 1: says anything. Yes, well, because it is the case that 390 00:22:45,119 --> 00:22:49,320 Speaker 1: people may be uncertain, they don't know how to intervene: hey, 391 00:22:49,440 --> 00:22:53,800 Speaker 1: that's the wrong leg. Not to be funny, but 392 00:22:53,840 --> 00:22:57,280 Speaker 1: it's just terrifying. Oh, I know. But remember, 393 00:22:57,520 --> 00:22:58,840 Speaker 1: in some sense, it goes all the way back to 394 00:22:58,840 --> 00:23:02,119 Speaker 1: the Milgram experiment. And the key about the Milgram 395 00:23:02,200 --> 00:23:04,920 Speaker 1: experiment is not that people gleefully went all the way 396 00:23:04,960 --> 00:23:08,920 Speaker 1: to shock another person and basically commit involuntary manslaughter. 397 00:23:09,400 --> 00:23:11,719 Speaker 1: That's what the Milgram experiment was. They didn't know how 398 00:23:11,720 --> 00:23:16,399 Speaker 1: to get out. And what I'm intrigued by in the film 399 00:23:16,480 --> 00:23:20,800 Speaker 1: of the Milgram experiment, for example, is that the second thing 400 00:23:20,920 --> 00:23:22,760 Speaker 1: the subjects tend to say when they're trying to get 401 00:23:22,760 --> 00:23:24,840 Speaker 1: out is, you can have your four fifty back. 402 00:23:25,880 --> 00:23:28,800 Speaker 1: That is, the social contract is a norm, it 403 00:23:28,840 --> 00:23:32,000 Speaker 1: has to be followed, and they have to abrogate that 404 00:23:32,080 --> 00:23:36,360 Speaker 1: contract before they can stop doing involuntary manslaughter, essentially. But 405 00:23:36,680 --> 00:23:38,480 Speaker 1: the real thing about that experiment is people don't know 406 00:23:38,520 --> 00:23:42,120 Speaker 1: how to dissent. It's not something we're necessarily well trained 407 00:23:42,160 --> 00:23:46,120 Speaker 1: in. We're trained in cooperating, we are trained in deferring. 408 00:23:46,160 --> 00:23:48,679 Speaker 1: That's not true all the time, but if you start 409 00:23:48,720 --> 00:23:52,040 Speaker 1: looking around in life, you realize we do it a 410 00:23:52,080 --> 00:23:54,440 Speaker 1: lot more than we think we're doing it. But we're 411 00:23:54,440 --> 00:23:59,040 Speaker 1: not really well trained in the psychology of dissent 412 00:23:59,200 --> 00:24:02,160 Speaker 1: or the psychology of objection. That's just not something 413 00:24:02,200 --> 00:24:07,280 Speaker 1: we do. So how much of this is institutional, schools, family, whatever, 414 00:24:07,320 --> 00:24:10,399 Speaker 1: and how much of this is biological?
Hey, we're social 415 00:24:10,440 --> 00:24:13,439 Speaker 1: primates and that's how we've evolved. I think it 416 00:24:13,520 --> 00:24:16,840 Speaker 1: has to be both. That is, both people and 417 00:24:16,920 --> 00:24:21,840 Speaker 1: institutions evolved to create norms that do the best to 418 00:24:21,840 --> 00:24:25,680 Speaker 1: make the day pleasant, survivable, to make the day efficient. 419 00:24:26,359 --> 00:24:29,320 Speaker 1: And it does have that. Norms do have that effect. 420 00:24:29,720 --> 00:24:32,560 Speaker 1: Imagine a world in which we didn't have norms. Curb Your 421 00:24:32,640 --> 00:24:36,639 Speaker 1: Enthusiasm, that is a whole show about what happens 422 00:24:36,680 --> 00:24:39,320 Speaker 1: if one person decides he's not going to pay attention 423 00:24:39,359 --> 00:24:41,040 Speaker 1: to any of the social norms. That's absolutely right, and it's 424 00:24:41,040 --> 00:24:43,320 Speaker 1: incredibly entertaining, but I wouldn't want to live in it. 425 00:24:43,320 --> 00:24:47,000 Speaker 1: It's sometimes difficult to watch. It just goes to show 426 00:24:47,040 --> 00:24:51,239 Speaker 1: you how ingrained those norms are. Not 427 00:24:51,280 --> 00:24:55,439 Speaker 1: to become a television critic, but the first couple of 428 00:24:55,480 --> 00:24:58,520 Speaker 1: seasons of that show, I remember having to pause it 429 00:24:58,800 --> 00:25:02,000 Speaker 1: and just take a break because it was so cringe- 430 00:25:02,000 --> 00:25:06,399 Speaker 1: worthy and so difficult and uncomfortable to watch, even as 431 00:25:06,440 --> 00:25:10,359 Speaker 1: it was hilarious. I never really thought of it 432 00:25:10,359 --> 00:25:12,200 Speaker 1: in terms of norms. You just think of him as, 433 00:25:12,240 --> 00:25:15,800 Speaker 1: you know, a cranky, difficult person. But I guess it's 434 00:25:15,840 --> 00:25:17,880 Speaker 1: all norms. Well, it is all norms. And if there's 435 00:25:17,880 --> 00:25:22,040 Speaker 1: a biology to it, it's that we are primed 436 00:25:22,080 --> 00:25:24,879 Speaker 1: to have anxiety mechanisms that are really ready to go 437 00:25:25,320 --> 00:25:28,919 Speaker 1: when we're in a situation of norm violation. So 438 00:25:28,960 --> 00:25:31,560 Speaker 1: it's interesting that you're watching something on television, separated from you, 439 00:25:31,560 --> 00:25:34,119 Speaker 1: you know it's fictional, and yet you're feeling real emotion, 440 00:25:34,200 --> 00:25:37,280 Speaker 1: and the emotion is exactly the emotion you feel around 441 00:25:37,320 --> 00:25:42,639 Speaker 1: norm violations. It's anxiety, it's nervousness, it's tension. That's 442 00:25:42,640 --> 00:25:45,719 Speaker 1: fascinating and potentially speaks to how powerful that mechanism is 443 00:25:45,800 --> 00:25:50,879 Speaker 1: within the body, within the species, and why norms 444 00:25:51,160 --> 00:25:54,560 Speaker 1: hopefully work in society. So before we get off this topic, 445 00:25:54,600 --> 00:25:58,480 Speaker 1: I have to circle back to the Milgram experiment and, 446 00:25:58,720 --> 00:26:03,720 Speaker 1: unrelated, the marshmallow experiments as well, all these things 447 00:26:03,760 --> 00:26:06,800 Speaker 1: that, listen,
I've been out of college for a hundred years, 448 00:26:06,840 --> 00:26:10,080 Speaker 1: but the things that I read through in college- 449 00:26:10,160 --> 00:26:14,320 Speaker 1: level psychology, I keep reading about different studies where they're 450 00:26:14,359 --> 00:26:17,960 Speaker 1: going back and saying, well, maybe there was a false 451 00:26:18,040 --> 00:26:20,879 Speaker 1: bias built into the way the test was done, and 452 00:26:20,920 --> 00:26:23,639 Speaker 1: when we try and recreate this, we're not getting the 453 00:26:23,680 --> 00:26:29,720 Speaker 1: same level of effect. Is the Milgram experiment still 454 00:26:30,400 --> 00:26:33,840 Speaker 1: the operative obedience-to-authority study in the world of psychology, 455 00:26:33,920 --> 00:26:36,200 Speaker 1: or has that been rolled back a little bit? I 456 00:26:36,400 --> 00:26:40,479 Speaker 1: think people are reevaluating it as we speak. I know 457 00:26:40,680 --> 00:26:43,720 Speaker 1: there has been some journalism that's been antagonistic to the 458 00:26:43,760 --> 00:26:46,080 Speaker 1: Milgram effects. So I've actually gone back, because I teach 459 00:26:46,200 --> 00:26:50,000 Speaker 1: this stuff, in this specific case and read the journalism, 460 00:26:50,000 --> 00:26:52,760 Speaker 1: going back to the original study, and I think 461 00:26:53,040 --> 00:26:56,040 Speaker 1: the Milgram experiment itself is still solid. But you do 462 00:26:56,080 --> 00:26:58,359 Speaker 1: have to go back on a case-by-case basis, 463 00:26:58,640 --> 00:27:02,080 Speaker 1: because it is the case that a lot of 464 00:27:02,119 --> 00:27:06,080 Speaker 1: classic work is being reevaluated, and you really 465 00:27:06,119 --> 00:27:10,440 Speaker 1: do have to go back and review the original work, 466 00:27:10,480 --> 00:27:13,760 Speaker 1: and you have to review the replications, or review the 467 00:27:14,119 --> 00:27:17,760 Speaker 1: rethinking, if you will, and case by case there are 468 00:27:17,800 --> 00:27:21,359 Speaker 1: different issues that you really have to think through. So 469 00:27:21,600 --> 00:27:23,840 Speaker 1: in the case of the Milgram experiment, I think that's 470 00:27:24,000 --> 00:27:26,440 Speaker 1: solid. In the case of the marshmallow experiment, 471 00:27:26,840 --> 00:27:32,679 Speaker 1: clearly the headline is still the same: 472 00:27:32,840 --> 00:27:36,399 Speaker 1: kids who wait a long time when they're young have 473 00:27:36,560 --> 00:27:39,640 Speaker 1: different life outcomes when they're teenagers, and so on. 474 00:27:39,680 --> 00:27:41,959 Speaker 1: The argument is over what exactly does that represent? Does 475 00:27:42,000 --> 00:27:45,320 Speaker 1: that represent personality, or does that represent social class, or 476 00:27:45,320 --> 00:27:48,959 Speaker 1: what environment you grew up in? 477 00:27:49,040 --> 00:27:51,879 Speaker 1: So the issue has changed depending on which specific topic 478 00:27:52,040 --> 00:27:56,200 Speaker 1: you are reviewing. Quite interesting. You write about a lot 479 00:27:56,240 --> 00:27:59,880 Speaker 1: of things beyond metacognition. You cover a whole bunch of 480 00:28:00,320 --> 00:28:03,760 Speaker 1: other areas we haven't really talked about.
Your book, which 481 00:28:03,800 --> 00:28:07,360 Speaker 1: is a couple of years old already, Self-Insight: Roadblocks 482 00:28:07,440 --> 00:28:11,800 Speaker 1: and Detours on the Path to Knowing Thyself. There was 483 00:28:11,840 --> 00:28:14,360 Speaker 1: something in the book that just cracked me up, which 484 00:28:14,359 --> 00:28:18,880 Speaker 1: you don't normally get in an academic book: you're special. 485 00:28:19,560 --> 00:28:22,480 Speaker 1: And it turns out, no, most of us are not special, 486 00:28:22,880 --> 00:28:26,959 Speaker 1: and we are wholly unaware of that. We've been told 487 00:28:27,119 --> 00:28:30,080 Speaker 1: most of our lives how special we are. Tell us 488 00:28:30,240 --> 00:28:33,320 Speaker 1: why so few of us are actually special. Well, the 489 00:28:33,359 --> 00:28:36,960 Speaker 1: problem is that, well, if you look at the 490 00:28:37,000 --> 00:28:40,120 Speaker 1: complete person, each of us is special. But if you 491 00:28:40,160 --> 00:28:43,800 Speaker 1: put us in any situation or any circumstance, we're 492 00:28:43,840 --> 00:28:47,680 Speaker 1: mostly going to act like everybody else. Most of 493 00:28:47,760 --> 00:28:49,680 Speaker 1: us are average. Most of us are average. Most of 494 00:28:49,720 --> 00:28:52,680 Speaker 1: us are typical. I mean that in any specific circumstance. 495 00:28:53,120 --> 00:28:55,200 Speaker 1: So if you aggregate all that, all of who we 496 00:28:55,200 --> 00:28:57,960 Speaker 1: are together, yeah, we are special. But when it 497 00:28:58,000 --> 00:29:02,000 Speaker 1: comes to specific situations, no, we're not special. And so 498 00:29:02,080 --> 00:29:06,320 Speaker 1: what that does leave people with, though, is people 499 00:29:06,360 --> 00:29:09,160 Speaker 1: do have this idea that they are unique, that they 500 00:29:09,160 --> 00:29:13,640 Speaker 1: are exceptional, and as a consequence, they can't... 501 00:29:14,520 --> 00:29:17,680 Speaker 1: I'm just doing the checkboxes. Yep, right, of course. Oh, absolutely. 502 00:29:17,920 --> 00:29:21,800 Speaker 1: And so what that means is that it turns out 503 00:29:21,920 --> 00:29:24,800 Speaker 1: people have a good rough understanding of human nature. I'm 504 00:29:24,800 --> 00:29:26,800 Speaker 1: not going to say it's perfect, that's my work, but 505 00:29:26,920 --> 00:29:29,720 Speaker 1: they do have a good understanding of human nature. The 506 00:29:29,760 --> 00:29:33,160 Speaker 1: mistake they make is that they think they stand outside 507 00:29:33,160 --> 00:29:36,400 Speaker 1: that human nature, that they are different, that 508 00:29:36,440 --> 00:29:40,280 Speaker 1: they're special. So, for example, we've done studies where we 509 00:29:40,360 --> 00:29:44,480 Speaker 1: ask people, there's going to be a food 510 00:29:44,600 --> 00:29:47,200 Speaker 1: drive at your campus, let's say in a month. Will 511 00:29:47,240 --> 00:29:50,320 Speaker 1: you contribute to it? And what percentage of people 512 00:29:50,320 --> 00:29:52,560 Speaker 1: will contribute to it? They're pretty good at nailing the 513 00:29:52,600 --> 00:29:55,040 Speaker 1: percentage of people on their campus who are going to contribute 514 00:29:55,040 --> 00:29:57,760 Speaker 1: to the food drive. They sort of figure out 515 00:29:57,840 --> 00:30:00,640 Speaker 1: what the situation is, they can think about their experience.
516 00:30:00,680 --> 00:30:02,520 Speaker 1: They come up with a good answer, and that 517 00:30:02,520 --> 00:30:05,360 Speaker 1: answer turns out to be right. But when we ask them, okay, 518 00:30:05,360 --> 00:30:07,080 Speaker 1: what are you gonna do? Are you going to contribute? 519 00:30:07,320 --> 00:30:09,640 Speaker 1: They way overestimate how much they're going to do the 520 00:30:09,720 --> 00:30:11,680 Speaker 1: right thing. They're going to do the good thing. They're 521 00:30:11,720 --> 00:30:15,520 Speaker 1: going to do the social thing, basically because they understand 522 00:30:15,560 --> 00:30:21,240 Speaker 1: how the situation and external forces will prompt people to 523 00:30:21,360 --> 00:30:24,040 Speaker 1: donate and to not donate, but they think they stand 524 00:30:24,040 --> 00:30:26,680 Speaker 1: outside those forces. For them, it's just simply a decision: 525 00:30:27,280 --> 00:30:29,000 Speaker 1: do I want to donate or not? And a lot 526 00:30:29,040 --> 00:30:31,120 Speaker 1: of people want to donate, so yeah, I'm going to donate. 527 00:30:31,520 --> 00:30:34,160 Speaker 1: It turns out when the time comes, no, they're subject 528 00:30:34,160 --> 00:30:38,280 Speaker 1: to all these external forces that push against donation as 529 00:30:38,320 --> 00:30:41,360 Speaker 1: well as push toward donation. So they turn out to 530 00:30:42,280 --> 00:30:45,080 Speaker 1: be typical, just like everybody else. So let's talk 531 00:30:45,120 --> 00:30:48,920 Speaker 1: about a related topic, again from the book, about 532 00:30:49,000 --> 00:30:52,520 Speaker 1: moral fortitude. You tell the story about being on a 533 00:30:52,720 --> 00:30:57,400 Speaker 1: radio show around the time of the Clinton impeachment, 534 00:30:57,440 --> 00:31:00,280 Speaker 1: not the Trump impeachment. But this is funny. Twenty- 535 00:31:00,440 --> 00:31:04,520 Speaker 1: plus years ago, the radio host goes off on a 536 00:31:04,520 --> 00:31:08,600 Speaker 1: tirade about infidelity and the moral inferiority and failings of 537 00:31:08,640 --> 00:31:12,200 Speaker 1: other people. And you had at your fingertips a bunch 538 00:31:12,280 --> 00:31:17,920 Speaker 1: of research about how everybody's expectations of their own moral 539 00:31:17,960 --> 00:31:22,120 Speaker 1: superiority sort of fit into the Dunning Kruger framework. We 540 00:31:22,240 --> 00:31:24,680 Speaker 1: think we're much better at that than we really are. Well, 541 00:31:24,720 --> 00:31:26,840 Speaker 1: that's true. That is because when you move to the 542 00:31:26,840 --> 00:31:31,560 Speaker 1: moral domain, the ethical domain, people definitely have this 543 00:31:31,640 --> 00:31:33,880 Speaker 1: holier-than-thou attitude. I won't do it, but other 544 00:31:33,960 --> 00:31:37,480 Speaker 1: people will do it, if it's bad. For example, I 545 00:31:37,520 --> 00:31:41,080 Speaker 1: would never cheat on my beloved, but other people, of course, 546 00:31:41,080 --> 00:31:44,080 Speaker 1: they're gonna cheat on their beloved. And it turns 547 00:31:44,080 --> 00:31:47,120 Speaker 1: out we did a number of studies, not on infidelity, 548 00:31:47,160 --> 00:31:50,120 Speaker 1: but rather: will you vote, will you be charitable, will 549 00:31:50,120 --> 00:31:54,120 Speaker 1: you obey traffic laws, for example?
550 00:31:54,600 --> 00:31:57,840 Speaker 1: And it turns out that people widely overestimate themselves. That 551 00:31:58,080 --> 00:32:02,000 Speaker 1: is, they overestimate how moral, ethical, and good they will 552 00:32:02,000 --> 00:32:05,280 Speaker 1: be relative to what they think about other people. And 553 00:32:05,400 --> 00:32:08,680 Speaker 1: they also overestimate how moral and good they're going to 554 00:32:08,720 --> 00:32:13,160 Speaker 1: be relative to the reality, when we actually test either 555 00:32:13,280 --> 00:32:19,120 Speaker 1: them or an equivalent group of people. So the question 556 00:32:19,160 --> 00:32:22,479 Speaker 1: for us is, when people contend they're morally superior, are they 557 00:32:22,520 --> 00:32:24,680 Speaker 1: making a mistake about other people? Are they being too 558 00:32:24,680 --> 00:32:27,840 Speaker 1: cynical about other people? Are they being too optimistic about 559 00:32:27,880 --> 00:32:31,440 Speaker 1: the self? And it turns out to be, to my surprise, 560 00:32:31,520 --> 00:32:33,800 Speaker 1: and this is completely the reverse of what I expected, 561 00:32:34,120 --> 00:32:38,520 Speaker 1: people are wrong about themselves, exactly because they think they're special. Huh. 562 00:32:38,560 --> 00:32:41,200 Speaker 1: So they're not being cynical about the rest 563 00:32:41,240 --> 00:32:44,080 Speaker 1: of humanity. They pretty much have them nailed; they just 564 00:32:44,160 --> 00:32:47,160 Speaker 1: think they're better than everybody. That's right. With maybe one 565 00:32:47,240 --> 00:32:52,760 Speaker 1: or two glaring exceptions, people are surprisingly accurate about the 566 00:32:52,800 --> 00:32:56,560 Speaker 1: general rate, about human nature in general, how other people 567 00:32:56,560 --> 00:32:59,320 Speaker 1: are gonna be buffeted around by external forces. They just 568 00:32:59,440 --> 00:33:03,080 Speaker 1: think they themselves are exempt from those forces. All right. 569 00:33:03,160 --> 00:33:05,920 Speaker 1: So we have metacognition issues when we're trying to 570 00:33:05,960 --> 00:33:10,640 Speaker 1: do a specific task that requires skills. There's a similar 571 00:33:10,680 --> 00:33:13,520 Speaker 1: issue with our own sense of self and ethics and 572 00:33:13,640 --> 00:33:17,880 Speaker 1: moral turpitude. What other areas are subject to the 573 00:33:18,000 --> 00:33:20,640 Speaker 1: Dunning Kruger effect? Well, I don't know what else there 574 00:33:20,720 --> 00:33:24,240 Speaker 1: might be, but isn't that everything? Is it thoughts and 575 00:33:24,320 --> 00:33:27,480 Speaker 1: action, and everything else is left over? No, there's also 576 00:33:27,520 --> 00:33:31,680 Speaker 1: the future, if you think about it. People are also over- 577 00:33:31,800 --> 00:33:36,880 Speaker 1: optimistic about their prospects, if you will. Really? Oh, absolutely. 578 00:33:36,920 --> 00:33:40,600 Speaker 1: That is, people really underestimate how long it's going to 579 00:33:40,680 --> 00:33:45,640 Speaker 1: take to complete projects. They underestimate how long 580 00:33:45,680 --> 00:33:48,880 Speaker 1: it's going to take for their business to be profitable.
581 00:33:48,920 --> 00:33:53,920 Speaker 1: When they're thinking about the future, they tend to 582 00:33:54,080 --> 00:33:58,040 Speaker 1: base their planning and their ideas on the most optimistic 583 00:33:58,040 --> 00:34:00,680 Speaker 1: scenario rather than the most pessimistic scenario, or maybe 584 00:34:00,680 --> 00:34:05,320 Speaker 1: even the most realistic scenario. So there are things 585 00:34:05,400 --> 00:34:07,720 Speaker 1: we miss, not only in terms of competence and character, 586 00:34:07,800 --> 00:34:11,279 Speaker 1: but also about our prospects. So how do we explain that? 587 00:34:11,320 --> 00:34:15,360 Speaker 1: I can imagine I could concoct a lovely narrative tale 588 00:34:15,440 --> 00:34:19,000 Speaker 1: as to why having an optimism bias is good for 589 00:34:19,040 --> 00:34:22,240 Speaker 1: the species. Even if you're the guy from Cave Seventy- 590 00:34:22,280 --> 00:34:25,440 Speaker 1: Three who doesn't come back from the mammoth hunt, everybody 591 00:34:25,440 --> 00:34:29,000 Speaker 1: else has meat for the winter. Is this just 592 00:34:29,080 --> 00:34:32,760 Speaker 1: a crazy narrative story, or is there some evolutionary component 593 00:34:32,800 --> 00:34:35,600 Speaker 1: to us? Well, there is an evolutionary component to it 594 00:34:36,120 --> 00:34:39,919 Speaker 1: and an adaptability component to it, but it's complicated. So 595 00:34:40,000 --> 00:34:43,000 Speaker 1: the fact that people commit to things far too optimistically 596 00:34:43,360 --> 00:34:48,760 Speaker 1: really does create those things. I mean, books are written, 597 00:34:48,760 --> 00:34:53,879 Speaker 1: businesses are developed, movies are made, even though 598 00:34:53,920 --> 00:34:56,280 Speaker 1: the people who started them did far more work 599 00:34:56,360 --> 00:34:59,920 Speaker 1: and are now far more depressed and tired than they 600 00:35:00,000 --> 00:35:02,600 Speaker 1: ever imagined they would be at the end of those projects. 601 00:35:02,640 --> 00:35:05,920 Speaker 1: But if they had only been prepared for how 602 00:35:05,920 --> 00:35:07,360 Speaker 1: long it was going to take, they probably would have 603 00:35:07,360 --> 00:35:09,080 Speaker 1: come up with a better project, a better business, and 604 00:35:09,120 --> 00:35:14,240 Speaker 1: a better book. So things get made, but people 605 00:35:14,280 --> 00:35:17,720 Speaker 1: will fail, or they won't produce really what they're capable 606 00:35:17,760 --> 00:35:21,440 Speaker 1: of producing. Very interesting. All of which leads to one 607 00:35:21,480 --> 00:35:26,279 Speaker 1: big question, which is, why do we seem to make 608 00:35:26,360 --> 00:35:30,160 Speaker 1: these same errors in judgment? Is it something about the 609 00:35:30,160 --> 00:35:32,840 Speaker 1: way we learn? Is it something about our fragile egos? 610 00:35:33,560 --> 00:35:37,279 Speaker 1: Why as a species are we unable to get past 611 00:35:37,480 --> 00:35:41,080 Speaker 1: some of these fairly obvious flaws? Well, I think there 612 00:35:41,080 --> 00:35:44,480 Speaker 1: are two things involved.
One comes from the holier-than- 613 00:35:44,520 --> 00:35:48,000 Speaker 1: thou work, which is, we're overweighting our intentions and the 614 00:35:48,040 --> 00:35:50,319 Speaker 1: power of our personality to produce things. That's part 615 00:35:50,320 --> 00:35:52,560 Speaker 1: of what's going on: we overweight the power 616 00:35:52,600 --> 00:35:56,640 Speaker 1: of our personality, because, well, I will do this 617 00:35:56,719 --> 00:36:01,000 Speaker 1: because I want to do this. And 618 00:36:01,080 --> 00:36:04,520 Speaker 1: that's something that we overestimate. The 619 00:36:04,560 --> 00:36:07,399 Speaker 1: other is the competence angle, which is we really don't 620 00:36:07,440 --> 00:36:11,520 Speaker 1: know what we don't know. Rumsfeld's unknown unknowns? Well, 621 00:36:11,920 --> 00:36:16,000 Speaker 1: the world is filled with unknown unknowns, and 622 00:36:16,080 --> 00:36:19,080 Speaker 1: we don't know... well, not only do we not know them, 623 00:36:19,200 --> 00:36:21,279 Speaker 1: we don't pay attention to the fact we don't know them. 624 00:36:21,320 --> 00:36:23,440 Speaker 1: I mean, to many people out there, the idea of 625 00:36:23,520 --> 00:36:27,279 Speaker 1: unknown unknowns is still a novel concept, but the fact is 626 00:36:27,320 --> 00:36:31,480 Speaker 1: that they don't know what they don't know. And 627 00:36:31,520 --> 00:36:33,319 Speaker 1: there is a lot of work showing that people just 628 00:36:33,360 --> 00:36:36,080 Speaker 1: don't pay attention to what they don't know when they're 629 00:36:36,120 --> 00:36:38,879 Speaker 1: making predictions or when they're planning things out. They don't 630 00:36:39,200 --> 00:36:41,520 Speaker 1: sit back and ask, okay, what is it that I 631 00:36:41,560 --> 00:36:45,280 Speaker 1: don't know here? What's still open? What are the possibilities 632 00:36:45,520 --> 00:36:49,160 Speaker 1: that I'm not considering? Not only that, am I considering 633 00:36:49,200 --> 00:36:53,120 Speaker 1: the fact that there are unknown unknowns, and should I 634 00:36:53,160 --> 00:36:58,000 Speaker 1: be planning for that possibility? So you mentioned planning earlier. 635 00:36:58,840 --> 00:37:02,200 Speaker 1: I saw something kind of interesting: around January ninth of 636 00:37:02,239 --> 00:37:05,439 Speaker 1: this year, that's the date when most people's New Year's 637 00:37:05,480 --> 00:37:10,520 Speaker 1: resolutions fail. Does that sound remotely plausible, or is that 638 00:37:10,680 --> 00:37:14,080 Speaker 1: just something else from the internet? I'm surprised that 639 00:37:14,160 --> 00:37:18,400 Speaker 1: our resolutions last that long. Oh really? No kidding. So 640 00:37:18,400 --> 00:37:22,839 Speaker 1: why? That raises the next question. If we have 641 00:37:22,920 --> 00:37:26,800 Speaker 1: all the best intentions and we want to, fill in 642 00:37:26,840 --> 00:37:31,520 Speaker 1: the blank, stop smoking, exercise, lose weight, whatever it is, 643 00:37:32,160 --> 00:37:34,840 Speaker 1: why is it that when we make these sorts of plans, 644 00:37:35,320 --> 00:37:37,600 Speaker 1: all as a group on the same date every year, 645 00:37:37,760 --> 00:37:40,839 Speaker 1: I can't imagine why that would not work? Well, it 646 00:37:40,880 --> 00:37:45,239 Speaker 1: doesn't work because the world is waiting for us in 647 00:37:45,280 --> 00:37:47,400 Speaker 1: some sense.
It does have those unknown unknowns, and it 648 00:37:47,480 --> 00:37:50,280 Speaker 1: does have external forces that are going to defeat us. 649 00:37:50,400 --> 00:37:52,520 Speaker 1: And what we tend to do is we tend to 650 00:37:52,560 --> 00:37:55,319 Speaker 1: focus on our plans. What am I going to do, 651 00:37:56,360 --> 00:37:58,839 Speaker 1: what are my intentions, what are the steps that I'm 652 00:37:58,840 --> 00:38:02,360 Speaker 1: going to take? What we really should do is interview 653 00:38:02,400 --> 00:38:04,160 Speaker 1: people who tried to do this before and find out 654 00:38:04,160 --> 00:38:06,960 Speaker 1: what the real difficulties are. There are gonna be many difficulties 655 00:38:07,000 --> 00:38:09,880 Speaker 1: that we haven't anticipated; there are gonna be many difficulties that 656 00:38:09,920 --> 00:38:13,960 Speaker 1: we don't know about. And not only that, 657 00:38:14,000 --> 00:38:17,920 Speaker 1: there are probably tricks, strategies, tactics, plans that we 658 00:38:17,960 --> 00:38:20,640 Speaker 1: could make that we wouldn't think of, but someone else 659 00:38:20,680 --> 00:38:22,839 Speaker 1: has thought of them and they actually work. So if 660 00:38:22,840 --> 00:38:25,919 Speaker 1: we actually consulted with people who have traveled the road 661 00:38:26,000 --> 00:38:28,719 Speaker 1: before us, we would do a much better job, I think, 662 00:38:28,719 --> 00:38:33,000 Speaker 1: anticipating the difficulties we have lying ahead, as well as 663 00:38:33,040 --> 00:38:36,479 Speaker 1: being better armed with strategies that have a better chance 664 00:38:36,560 --> 00:38:39,000 Speaker 1: of success. All right, so let me push back on 665 00:38:39,080 --> 00:38:41,680 Speaker 1: that a little bit. The dieting industry is like a 666 00:38:41,760 --> 00:38:45,680 Speaker 1: twenty-six-billion-dollar sector of the economy, and they 667 00:38:45,719 --> 00:38:50,640 Speaker 1: all have the magic bullet, and yet everybody in 668 00:38:50,680 --> 00:38:55,400 Speaker 1: this country seems to be increasingly overweight. Diabetes is 669 00:38:55,440 --> 00:38:58,560 Speaker 1: a problem. There are all these weight-related issues. If 670 00:38:58,600 --> 00:39:01,440 Speaker 1: we could speak to other people who have been successful and have that 671 00:39:01,520 --> 00:39:06,360 Speaker 1: conversation, how does that work, given 672 00:39:06,440 --> 00:39:11,680 Speaker 1: the vast numbers of people who need assistance losing weight? 673 00:39:11,800 --> 00:39:14,319 Speaker 1: That's a very good question. By the way, evolutionarily, this 674 00:39:14,360 --> 00:39:18,560 Speaker 1: is a very novel task for us, because having extra weight 675 00:39:18,800 --> 00:39:21,960 Speaker 1: is a good survival thing if you have a shorter lifespan. 676 00:39:22,040 --> 00:39:26,879 Speaker 1: We now live beyond that adaptation. I don't think cholesterol 677 00:39:26,960 --> 00:39:29,680 Speaker 1: was a big problem ten thousand years ago. I think that's right, 678 00:39:29,760 --> 00:39:31,200 Speaker 1: and it probably wasn't a big problem even up to 679 00:39:31,239 --> 00:39:34,680 Speaker 1: a hundred years ago. I mean, getting calories was hard up 680 00:39:34,719 --> 00:39:37,799 Speaker 1: until very, very recently. So as a species, we are 681 00:39:37,840 --> 00:39:42,120 Speaker 1: dealing with a very novel task in trying to lose weight.
682 00:39:42,920 --> 00:39:46,520 Speaker 1: I think that there are some common-sense things that 683 00:39:47,200 --> 00:39:50,080 Speaker 1: people can do. But one of the things they 684 00:39:50,120 --> 00:39:55,239 Speaker 1: can do is reset two things. The first is, what's 685 00:39:55,239 --> 00:40:00,000 Speaker 1: a realistic outcome in terms of losing weight? But also 686 00:40:00,040 --> 00:40:03,319 Speaker 1: having more realism in terms of how much 687 00:40:03,320 --> 00:40:06,040 Speaker 1: effort and how much time it is going to 688 00:40:06,120 --> 00:40:09,080 Speaker 1: take to get there, for example, and also 689 00:40:09,160 --> 00:40:10,960 Speaker 1: thinking more in terms of the long term as 690 00:40:10,960 --> 00:40:12,719 Speaker 1: opposed to the short term. I mean, a lot of 691 00:40:12,719 --> 00:40:14,680 Speaker 1: people think, how do I lose weight this month? No, 692 00:40:14,800 --> 00:40:16,160 Speaker 1: the question is, how do you keep the weight off? How 693 00:40:16,160 --> 00:40:17,759 Speaker 1: do you lose weight and keep the weight off for 694 00:40:17,840 --> 00:40:21,480 Speaker 1: years and years and years? But certainly as 695 00:40:21,520 --> 00:40:24,280 Speaker 1: a society, I think it's taking a while 696 00:40:24,400 --> 00:40:27,399 Speaker 1: for the collective wisdom to form, because it does turn 697 00:40:27,440 --> 00:40:30,640 Speaker 1: out to be a particularly difficult task. So, I 698 00:40:30,680 --> 00:40:33,200 Speaker 1: go for an annual physical every year. My GP is 699 00:40:33,280 --> 00:40:36,680 Speaker 1: also a cardiologist, and he's one of these old-school doctors. 700 00:40:36,840 --> 00:40:38,960 Speaker 1: When they're done with the tests, you go into their office, 701 00:40:39,000 --> 00:40:41,719 Speaker 1: you sit down and you have a conversation. And when 702 00:40:41,719 --> 00:40:44,000 Speaker 1: you go through everything, it's all good. And he says, 703 00:40:44,040 --> 00:40:46,279 Speaker 1: do you have any questions for me? I'm like, yeah, I'd 704 00:40:46,280 --> 00:40:49,440 Speaker 1: like to drop a few pounds. What do you suggest? 705 00:40:49,440 --> 00:40:54,040 Speaker 1: And he very conspiratorially looked over each shoulder and then 706 00:40:54,120 --> 00:40:58,879 Speaker 1: leaned forward and whispered to me: eat less food. And 707 00:40:58,920 --> 00:41:01,760 Speaker 1: I'm like, Doc, you know there's a giant industry whose 708 00:41:01,800 --> 00:41:05,200 Speaker 1: whole purpose is to not share that advice. But it 709 00:41:05,200 --> 00:41:08,440 Speaker 1: turns out to be good advice. Yes, so eating a 710 00:41:08,440 --> 00:41:12,200 Speaker 1: little less food, you can lose some weight. 711 00:41:12,239 --> 00:41:15,480 Speaker 1: It's quite fascinating, and yet it's harder to do than 712 00:41:15,520 --> 00:41:19,000 Speaker 1: you would imagine. Harder than I certainly imagined. No, 713 00:41:19,080 --> 00:41:20,760 Speaker 1: I think that's right. Well, certainly in the United States 714 00:41:20,760 --> 00:41:23,320 Speaker 1: it's harder. One thing that I think is interesting, 715 00:41:23,360 --> 00:41:25,680 Speaker 1: and this isn't me as a psychologist, it's just my personal life, is 716 00:41:25,719 --> 00:41:29,759 Speaker 1: every so often I spend time in Germany, and I 717 00:41:29,800 --> 00:41:32,960 Speaker 1: always lose weight in Germany, without even trying. Now, why 718 00:41:33,080 --> 00:41:36,600 Speaker 1: is that?
Do you not like bratwurst and beer? Well, 719 00:41:37,200 --> 00:41:39,520 Speaker 1: German cuisine is more than that; not much more, by 720 00:41:39,520 --> 00:41:43,080 Speaker 1: the way, but it is more than that. But 721 00:41:43,480 --> 00:41:45,560 Speaker 1: it's a lot of schnitzel when you don't know anything else. 722 00:41:46,920 --> 00:41:49,759 Speaker 1: That's a safe choice. It's a safe choice. But I 723 00:41:49,800 --> 00:41:53,400 Speaker 1: think mostly, well, in Germany the portions are small. 724 00:41:54,400 --> 00:41:56,200 Speaker 1: In the rest of the world, the portions are small. 725 00:41:56,280 --> 00:41:59,360 Speaker 1: That's exactly right, and that's an issue. Most of the 726 00:41:59,360 --> 00:42:02,239 Speaker 1: calories in the meal are conveyed by the sauce, and 727 00:42:02,280 --> 00:42:05,000 Speaker 1: the inevitable beer you're going to drink, or 728 00:42:05,040 --> 00:42:07,520 Speaker 1: the wine you're going to drink. But there's also just 729 00:42:07,560 --> 00:42:10,680 Speaker 1: much more walking. Oh really? Bike riding? Yeah. But 730 00:42:10,719 --> 00:42:13,600 Speaker 1: can you walk off that many calories? I mean, if 731 00:42:13,640 --> 00:42:16,680 Speaker 1: you're Michael Phelps, sure. But for the rest of us, 732 00:42:17,160 --> 00:42:20,560 Speaker 1: we're not putting in three hours a day of sweating. Well, 733 00:42:20,560 --> 00:42:24,360 Speaker 1: that's certainly true. But if you just walk, and walking 734 00:42:24,480 --> 00:42:27,640 Speaker 1: is one physical act our species was built for, 735 00:42:27,760 --> 00:42:31,440 Speaker 1: it does bring things under control. This isn't scientific; 736 00:42:31,480 --> 00:42:33,640 Speaker 1: I just know my brother lost quite a bit of 737 00:42:33,680 --> 00:42:37,400 Speaker 1: weight by buying a beagle and then taking the beagle 738 00:42:37,440 --> 00:42:42,240 Speaker 1: out for eight-to-nine-mile walks every weekend. 739 00:42:42,280 --> 00:42:45,840 Speaker 1: That worked for him. And so there are strategies 740 00:42:45,880 --> 00:42:49,120 Speaker 1: that work. Maybe different strategies work for different people. 741 00:42:49,160 --> 00:42:53,959 Speaker 1: But the key is often... what we will tend 742 00:42:53,960 --> 00:42:55,560 Speaker 1: to do is we'll tend to try to solve the 743 00:42:55,640 --> 00:42:59,960 Speaker 1: question ourselves, using only ourselves as the source of knowledge. 744 00:43:00,560 --> 00:43:03,359 Speaker 1: It's good to consult. It's good to find out who's 745 00:43:03,360 --> 00:43:06,200 Speaker 1: had success. It's good to confer with other people. 746 00:43:06,400 --> 00:43:10,080 Speaker 1: That can only broaden the knowledge and the wisdom that 747 00:43:10,160 --> 00:43:13,200 Speaker 1: we have at our disposal whenever we have a 748 00:43:13,200 --> 00:43:16,560 Speaker 1: difficult task like losing weight. And I gotta ask, why 749 00:43:16,640 --> 00:43:19,680 Speaker 1: Germany? Each year I have a collaboration there 750 00:43:19,719 --> 00:43:23,520 Speaker 1: in Cologne with a couple of researchers doing work on trust. 751 00:43:23,640 --> 00:43:26,600 Speaker 1: This is where the interest in norms comes in. 752 00:43:26,640 --> 00:43:29,439 Speaker 1: And that's been going on for many, many years.
And 753 00:43:29,480 --> 00:43:34,000 Speaker 1: there have been many, many meals during the course of 754 00:43:34,040 --> 00:43:38,320 Speaker 1: that collaboration, and much weight has been lost over there. Interesting. 755 00:43:38,320 --> 00:43:41,000 Speaker 1: I have a bunch more questions, including some on trust. 756 00:43:41,040 --> 00:43:43,760 Speaker 1: Can you stick around a bit? Sure. We have been 757 00:43:43,840 --> 00:43:47,960 Speaker 1: speaking with David Dunning, professor of psychology at the University 758 00:43:48,120 --> 00:43:51,680 Speaker 1: of Michigan. If you enjoy this conversation, well, be sure 759 00:43:51,719 --> 00:43:54,319 Speaker 1: and stick around and check out our podcast extras, where 760 00:43:54,360 --> 00:43:57,400 Speaker 1: we keep the tape rolling and continue discussing all things 761 00:43:57,800 --> 00:44:03,280 Speaker 1: psychology related. You can find that on Apple iTunes, Spotify, Stitcher, 762 00:44:03,560 --> 00:44:07,240 Speaker 1: wherever your finer podcasts are sold. We love your comments, 763 00:44:07,239 --> 00:44:11,719 Speaker 1: feedback, and suggestions. Write to us at MIB podcast 764 00:44:11,760 --> 00:44:14,839 Speaker 1: at Bloomberg dot net. Check out my weekly column on 765 00:44:14,880 --> 00:44:18,560 Speaker 1: Bloomberg dot com slash Opinion. Follow me on Twitter at 766 00:44:18,640 --> 00:44:21,960 Speaker 1: Ridholts. I'm Barry Ridholts. You're listening to Masters in 767 00:44:22,080 --> 00:44:27,880 Speaker 1: Business on Bloomberg Radio. Professor Dunning. I don't even know 768 00:44:27,920 --> 00:44:29,640 Speaker 1: what to call you. David. Thank you so much for 769 00:44:29,680 --> 00:44:31,960 Speaker 1: doing this. I have been looking forward to this for 770 00:44:32,000 --> 00:44:34,520 Speaker 1: a long time. I have all these formal 771 00:44:34,600 --> 00:44:36,880 Speaker 1: questions and we kind of work our way through them; 772 00:44:36,880 --> 00:44:40,400 Speaker 1: that's my crutch. But I have all these other questions 773 00:44:40,440 --> 00:44:43,640 Speaker 1: that I've been dying to ask you, and 774 00:44:43,760 --> 00:44:47,200 Speaker 1: the big one was on that chart, which you surprised 775 00:44:47,239 --> 00:44:51,080 Speaker 1: me with. I didn't realize you guys hadn't created 776 00:44:51,120 --> 00:44:55,439 Speaker 1: that, and that only later did you end up validating 777 00:44:55,440 --> 00:44:59,840 Speaker 1: what the Internet intuited about your work. So that's 778 00:44:59,840 --> 00:45:06,160 Speaker 1: fascinating. The thing that intrigues me so much: why 779 00:45:06,160 --> 00:45:11,440 Speaker 1: is it that the way we learn is to 780 00:45:11,640 --> 00:45:16,640 Speaker 1: start from zero, assume we have knowledge that we don't, 781 00:45:17,640 --> 00:45:21,920 Speaker 1: and then build on that, and all of a sudden 782 00:45:21,960 --> 00:45:25,680 Speaker 1: there's an insight and we realize, oh, we are idiots, 783 00:45:25,760 --> 00:45:28,520 Speaker 1: we don't know half of what we're talking about? And 784 00:45:28,640 --> 00:45:33,200 Speaker 1: from that broken-down position, are we able to 785 00:45:33,280 --> 00:45:39,759 Speaker 1: rebuild some true confidence relative to skills, versus the false confidence? 786 00:45:39,760 --> 00:45:42,400 Speaker 1: And so the big question is, what is it about 787 00:45:42,440 --> 00:45:46,480 Speaker 1: the species that has this inherent in it?
Because it 788 00:45:46,640 --> 00:45:52,480 Speaker 1: seems to cause widespread problems across society. Well, two things. 789 00:45:52,600 --> 00:45:54,720 Speaker 1: I mean, let me start off with the things we 790 00:45:54,719 --> 00:45:57,200 Speaker 1: don't pay attention to. We don't pay attention to 791 00:45:57,239 --> 00:45:59,520 Speaker 1: what we don't know; we've already talked about that. And 792 00:45:59,560 --> 00:46:02,040 Speaker 1: we also don't pay attention to luck and its potential 793 00:46:02,120 --> 00:46:05,759 Speaker 1: role in success and failure, for example. So we set 794 00:46:05,800 --> 00:46:09,759 Speaker 1: that aside. Where this comes from, in terms of 795 00:46:09,920 --> 00:46:13,080 Speaker 1: we think we've got this, is that actually, in many 796 00:46:13,080 --> 00:46:16,359 Speaker 1: situations we start from zero and we do get it. 797 00:46:17,160 --> 00:46:21,399 Speaker 1: That is, every situation we face is new in one 798 00:46:21,400 --> 00:46:24,360 Speaker 1: way or another. This interview, for example, is a new situation. 799 00:46:24,480 --> 00:46:28,040 Speaker 1: It doesn't exactly replicate the past. It's unique, 800 00:46:28,200 --> 00:46:31,080 Speaker 1: and our brain is able to fetch a lot of 801 00:46:31,120 --> 00:46:34,160 Speaker 1: little elements of knowledge from everywhere to figure out, okay, 802 00:46:34,160 --> 00:46:35,719 Speaker 1: what is this? How do I deal with this? What's 803 00:46:35,719 --> 00:46:37,839 Speaker 1: the next move? I mean, the genius of our brain 804 00:46:37,920 --> 00:46:40,719 Speaker 1: is taking something novel and coming to an understanding of it. 805 00:46:40,880 --> 00:46:43,799 Speaker 1: This is similar enough to that that I could use 806 00:46:43,800 --> 00:46:46,160 Speaker 1: what I learned last time to work my way through. 807 00:46:46,440 --> 00:46:50,040 Speaker 1: That's right. And that's essential for the species to survive. 808 00:46:50,600 --> 00:46:53,720 Speaker 1: But sometimes, you know, that skill is going to derail; 809 00:46:53,880 --> 00:46:57,520 Speaker 1: it's going to lead us to something that's absolutely wrong. 810 00:46:57,560 --> 00:46:59,440 Speaker 1: But it will look exactly right. That is, it will 811 00:46:59,480 --> 00:47:01,640 Speaker 1: look like all the experiences where it was novel, we 812 00:47:01,719 --> 00:47:04,239 Speaker 1: figured out what was going on, we figured out what 813 00:47:04,280 --> 00:47:07,000 Speaker 1: we should do. So for example, if you have a 814 00:47:07,000 --> 00:47:09,520 Speaker 1: friend who's drowning in the lake, you're on the dock, 815 00:47:09,960 --> 00:47:12,080 Speaker 1: and you don't have life preservers, but you 816 00:47:12,120 --> 00:47:14,560 Speaker 1: do have a basketball and a bowling ball next to you. 817 00:47:14,680 --> 00:47:17,520 Speaker 1: You know which ball to throw them. Depending on how 818 00:47:18,400 --> 00:47:22,600 Speaker 1: much you like him. Exactly. We can innovate; 819 00:47:22,640 --> 00:47:27,080 Speaker 1: that's what we were built to do. The 820 00:47:27,120 --> 00:47:31,279 Speaker 1: problem is those innovations may become misapplied. And that's where 821 00:47:31,280 --> 00:47:34,880 Speaker 1: the Dunning Kruger effect comes in.
We've worked from 822 00:47:34,920 --> 00:47:38,640 Speaker 1: this genius, we've worked from this amazing database we have 823 00:47:38,680 --> 00:47:42,680 Speaker 1: in our squishy little organic drive, in our server in 824 00:47:42,719 --> 00:47:46,960 Speaker 1: our head, but we've misapplied it, and we don't 825 00:47:47,000 --> 00:47:51,359 Speaker 1: realize that until well after the disaster has happened. So 826 00:47:51,400 --> 00:47:53,880 Speaker 1: we're not aware of what we don't know, our blind 827 00:47:53,920 --> 00:47:59,600 Speaker 1: spots, or where we underestimate luck. And I've seen some writings 828 00:47:59,640 --> 00:48:03,759 Speaker 1: that say when we're successful, we credit it to our own 829 00:48:03,800 --> 00:48:07,000 Speaker 1: skill, and when we're unsuccessful, we credit it to bad luck. 830 00:48:07,440 --> 00:48:09,840 Speaker 1: And not only that, but we do the 831 00:48:09,840 --> 00:48:13,680 Speaker 1: opposite with other people: when they're successful, well, they got lucky, 832 00:48:13,719 --> 00:48:17,000 Speaker 1: and when they're unsuccessful, it's because they're not very skillful. 833 00:48:17,680 --> 00:48:21,120 Speaker 1: That sort of goes back to the I'm-special thing that 834 00:48:21,160 --> 00:48:24,360 Speaker 1: seems to permeate everything, doesn't it? It does, and 835 00:48:24,800 --> 00:48:27,520 Speaker 1: it's exactly the I'm-special thing. But one thing I 836 00:48:27,520 --> 00:48:30,239 Speaker 1: should mention, though, is the I'm-special thing might 837 00:48:30,480 --> 00:48:33,719 Speaker 1: be constrained in other parts of the globe; it still 838 00:48:33,760 --> 00:48:36,319 Speaker 1: could be cultural. Oh, there's a cultural element, no doubt. 839 00:48:36,320 --> 00:48:39,719 Speaker 1: We've actually studied that. This is something that attaches 840 00:48:39,840 --> 00:48:42,840 Speaker 1: much more to people with a heritage that's American, Canadian, 841 00:48:42,880 --> 00:48:48,360 Speaker 1: Western European. If you're coming from an Eastern culture, 842 00:48:48,600 --> 00:48:52,400 Speaker 1: you don't do as much, or at all, this 843 00:48:52,560 --> 00:48:55,719 Speaker 1: overestimation of self or this I'm-special stuff that you'll 844 00:48:55,719 --> 00:48:58,400 Speaker 1: find Americans do all the time. Huh. And now I 845 00:48:58,400 --> 00:49:01,240 Speaker 1: would imagine in China, where there's a billion-plus people, 846 00:49:01,760 --> 00:49:04,640 Speaker 1: it's harder to just assume you're special, or is that 847 00:49:04,719 --> 00:49:07,839 Speaker 1: not even relevant? It's cultural more than anything. Well, it's 848 00:49:07,840 --> 00:49:11,000 Speaker 1: cultural in the sense of, is the emphasis on me 849 00:49:11,080 --> 00:49:13,120 Speaker 1: and what I can do and what can I impose 850 00:49:13,200 --> 00:49:16,920 Speaker 1: upon the world, that's very Western, as opposed to how 851 00:49:16,960 --> 00:49:18,799 Speaker 1: do I fit in? How do I harmonize? How do 852 00:49:18,840 --> 00:49:22,040 Speaker 1: I fulfill the role that I've been assigned or the 853 00:49:22,160 --> 00:49:24,400 Speaker 1: role that I've fallen into? And that's much more Eastern. 854 00:49:24,920 --> 00:49:26,480 Speaker 1: And you're just gonna have a very different way of 855 00:49:26,480 --> 00:49:28,279 Speaker 1: thinking if you're in the first culture as opposed to 856 00:49:28,360 --> 00:49:32,320 Speaker 1: the second culture.
So earlier we were talking about 857 00:49:32,360 --> 00:49:36,839 Speaker 1: trust, and I'm kind of intrigued by that. 858 00:49:36,920 --> 00:49:39,800 Speaker 1: There's a question that is, I guess, sort of obvious: 859 00:49:40,600 --> 00:49:43,560 Speaker 1: why do we trust strangers? Why are we so susceptible 860 00:49:43,640 --> 00:49:47,680 Speaker 1: to being defrauded or scammed? It seems that every other 861 00:49:47,760 --> 00:49:51,319 Speaker 1: day I'm reading about some different Ponzi scheme or some 862 00:49:51,440 --> 00:49:57,080 Speaker 1: different insanity where people trusted someone they clearly shouldn't 863 00:49:57,200 --> 00:50:00,520 Speaker 1: have, and it got them into a lot of trouble. I 864 00:50:00,560 --> 00:50:02,799 Speaker 1: think that comes from what we 865 00:50:02,800 --> 00:50:06,080 Speaker 1: talked about earlier, norms. And one of the norms we 866 00:50:06,120 --> 00:50:09,040 Speaker 1: have, one that goes right down deep in the heart of 867 00:50:09,080 --> 00:50:11,720 Speaker 1: what it means to have a conversation, is we assume 868 00:50:11,760 --> 00:50:13,960 Speaker 1: what the other person is telling us is true unless 869 00:50:14,000 --> 00:50:17,440 Speaker 1: there is evidence otherwise. The assumption is truth. That's 870 00:50:17,440 --> 00:50:20,560 Speaker 1: the presumption that we have, and that makes sense. Imagine 871 00:50:20,560 --> 00:50:23,400 Speaker 1: a world in which we all distrusted what 872 00:50:23,440 --> 00:50:25,960 Speaker 1: the other person is telling us; there would not be 873 00:50:26,040 --> 00:50:29,160 Speaker 1: much coordination going on in the world. So, if 874 00:50:29,160 --> 00:50:30,920 Speaker 1: you ask for directions, the person tells you how to 875 00:50:30,960 --> 00:50:33,799 Speaker 1: get to the Bloomberg building, you assume they're telling you 876 00:50:33,840 --> 00:50:36,399 Speaker 1: the truth, because imagine if you said, no, I don't 877 00:50:36,400 --> 00:50:39,360 Speaker 1: trust them. What are you gonna do? Why do you 878 00:50:39,360 --> 00:50:41,839 Speaker 1: even ask them in the first place? Exactly. So there 879 00:50:42,080 --> 00:50:46,319 Speaker 1: is a normal presumption of truth that serves us 880 00:50:46,360 --> 00:50:49,919 Speaker 1: well for the most part in life. But if 881 00:50:49,960 --> 00:50:53,960 Speaker 1: the other person is malevolent, if the other person is incompetent, 882 00:50:54,640 --> 00:50:59,920 Speaker 1: that presumption is going to lead to potential folly, for example. 883 00:51:00,400 --> 00:51:03,719 Speaker 1: But we do actually have ongoing work looking at 884 00:51:04,239 --> 00:51:09,000 Speaker 1: people's ability to tell true science headlines from fake 885 00:51:09,080 --> 00:51:14,120 Speaker 1: science headlines. And what's interesting to us is that 886 00:51:15,000 --> 00:51:17,080 Speaker 1: the error people tend to make is they tend 887 00:51:17,080 --> 00:51:19,040 Speaker 1: to believe fake things are true. They make that error 888 00:51:19,080 --> 00:51:21,439 Speaker 1: much more than they do the reverse error, thinking a true 889 00:51:21,480 --> 00:51:25,759 Speaker 1: thing is fake. So in general, people are gullible, so 890 00:51:25,840 --> 00:51:29,040 Speaker 1: to speak. What's interesting, though, is when you ask people. This 891 00:51:29,120 --> 00:51:32,879 Speaker 1: is one of the rare areas for people:
they 892 00:51:32,960 --> 00:51:36,040 Speaker 1: don't say, oh, I have no bias, I see it 893 00:51:36,080 --> 00:51:38,480 Speaker 1: the way it is. Rather, what they say is they 894 00:51:38,520 --> 00:51:41,399 Speaker 1: do have a bias: they're too skeptical, they're too 895 00:51:41,440 --> 00:51:44,440 Speaker 1: wary of information out there, they're more likely to 896 00:51:44,600 --> 00:51:47,560 Speaker 1: distrust a true thing than to accept a false thing. 897 00:51:48,040 --> 00:51:50,200 Speaker 1: So this is the first time I've ever seen something 898 00:51:50,520 --> 00:51:53,480 Speaker 1: like this: a bias where most people are gullible, 899 00:51:53,600 --> 00:51:55,800 Speaker 1: but they actually believe they have the reverse bias, that 900 00:51:55,840 --> 00:52:00,000 Speaker 1: they're too skeptical. But it all comes from 901 00:52:00,320 --> 00:52:03,480 Speaker 1: a norm, if you will, that for the most part, 902 00:52:03,520 --> 00:52:05,520 Speaker 1: in life and day-to-day living, works. It 903 00:52:05,600 --> 00:52:09,080 Speaker 1: makes life eminently easier if we assume what the 904 00:52:09,120 --> 00:52:11,600 Speaker 1: other person's telling us is true, because at the very 905 00:52:11,680 --> 00:52:15,000 Speaker 1: least, what the other person is telling us is sincere. Right. 906 00:52:15,120 --> 00:52:18,799 Speaker 1: That's quite interesting. I'm surprised, in this era of 907 00:52:19,040 --> 00:52:23,120 Speaker 1: misinformation and all the false memes all over the Internet, 908 00:52:23,480 --> 00:52:27,840 Speaker 1: that people still think their problem is, well, I'm too skeptical. 909 00:52:27,960 --> 00:52:32,680 Speaker 1: It's clear, at least from the popular culture, that we 910 00:52:32,880 --> 00:52:36,759 Speaker 1: too easily believe things we shouldn't. Oh, that's absolutely right, 911 00:52:36,800 --> 00:52:39,239 Speaker 1: and I have to admit we don't exactly have a 912 00:52:39,280 --> 00:52:43,279 Speaker 1: handle on why people think the reverse. That's fascinating. 913 00:52:43,480 --> 00:52:47,239 Speaker 1: And once again, it's one of those findings we 914 00:52:47,280 --> 00:52:49,200 Speaker 1: get where I look at it and I go, I 915 00:52:49,239 --> 00:52:51,759 Speaker 1: have no idea why this is happening. That happens far 916 00:52:51,800 --> 00:52:55,080 Speaker 1: too often in my work. So what about nudges? Is 917 00:52:55,120 --> 00:53:00,040 Speaker 1: there a way to, and I'm referencing 918 00:53:00,160 --> 00:53:06,000 Speaker 1: Sunstein and Thaler's work on small, little systemic 919 00:53:06,040 --> 00:53:09,640 Speaker 1: ways to steer people in the right direction, is that 920 00:53:09,920 --> 00:53:13,520 Speaker 1: something that can help people make better decisions? Or are 921 00:53:13,600 --> 00:53:18,640 Speaker 1: we just left to our own faulty devices? Well, our 922 00:53:18,640 --> 00:53:20,880 Speaker 1: devices are always going to be somewhat faulty, but we 923 00:53:20,920 --> 00:53:22,799 Speaker 1: can reduce the fault, if you will. We can never 924 00:53:22,840 --> 00:53:27,320 Speaker 1: be perfect, but we can reduce our vulnerability. And for example, 925 00:53:27,360 --> 00:53:31,040 Speaker 1: we talked about gullibility.
There are a number of 926 00:53:31,400 --> 00:53:34,600 Speaker 1: resources that are being developed on the Internet even 927 00:53:34,600 --> 00:53:37,960 Speaker 1: as we speak that are focused on how do we 928 00:53:37,960 --> 00:53:40,520 Speaker 1: get people to better evaluate what they're hearing over the internet. 929 00:53:41,400 --> 00:53:46,040 Speaker 1: And just to go over what the key move is: 930 00:53:46,040 --> 00:53:48,279 Speaker 1: typically, what people do when they see 931 00:53:48,360 --> 00:53:52,400 Speaker 1: something with a provocative headline, for example, is they look at 932 00:53:52,400 --> 00:53:54,920 Speaker 1: the website and try to figure out, just based on 933 00:53:55,120 --> 00:53:58,239 Speaker 1: that headline and the website's design itself, is this 934 00:53:58,360 --> 00:54:01,799 Speaker 1: something that I can believe? And so if it has 935 00:54:01,800 --> 00:54:04,719 Speaker 1: a classy, professional picture, for example, they decide it must 936 00:54:04,719 --> 00:54:07,920 Speaker 1: be more believable. That's not the way to decide 937 00:54:07,920 --> 00:54:10,480 Speaker 1: whether or not something is true. Instead of 938 00:54:10,480 --> 00:54:12,839 Speaker 1: internal reading, what you have to do is something that 939 00:54:13,040 --> 00:54:15,560 Speaker 1: all fact-checkers know, which is you have to do 940 00:54:15,680 --> 00:54:19,000 Speaker 1: lateral reading. You have to go to other sources. You 941 00:54:19,040 --> 00:54:23,000 Speaker 1: have to go to other people once again and find 942 00:54:23,000 --> 00:54:26,040 Speaker 1: out: are other sources saying the same thing? Is there 943 00:54:26,040 --> 00:54:29,279 Speaker 1: any comment on the reliability of the source you're looking 944 00:54:29,280 --> 00:54:32,640 Speaker 1: at now, from other places? Am I looking 945 00:54:32,680 --> 00:54:38,040 Speaker 1: at something mainstream, or am I looking at something that's made up? 946 00:54:38,160 --> 00:54:40,839 Speaker 1: What people have to become is a little bit more 947 00:54:40,920 --> 00:54:43,839 Speaker 1: like a journalist. And what journalists do and what fact- 948 00:54:43,920 --> 00:54:46,279 Speaker 1: checkers do is they check from multiple sources. They go 949 00:54:46,320 --> 00:54:48,840 Speaker 1: to other sources to take a look at whether this 950 00:54:48,840 --> 00:54:51,680 Speaker 1: piece of information is one that I can rely on. 951 00:54:52,200 --> 00:54:56,160 Speaker 1: And so in terms of nudges, there are thematic nudges, 952 00:54:56,320 --> 00:55:00,360 Speaker 1: nudges like the lateral reading, but there are 953 00:55:00,360 --> 00:55:02,640 Speaker 1: also more specific things now that are popping up on 954 00:55:02,640 --> 00:55:05,799 Speaker 1: the Internet that can be quite helpful, at least 955 00:55:05,880 --> 00:55:11,200 Speaker 1: on this issue. That's quite intriguing. Although I guess 956 00:55:11,280 --> 00:55:13,240 Speaker 1: you could do the same thing with the deep fakes 957 00:55:13,320 --> 00:55:17,280 Speaker 1: that are coming out. Some of the videos are really 958 00:55:17,440 --> 00:55:21,799 Speaker 1: horrifying because they just look so real. How can you 959 00:55:22,160 --> 00:55:24,560 Speaker 1: do a lateral check and find out if something like 960 00:55:24,600 --> 00:55:28,040 Speaker 1: that is real?
Well, I actually do this: 961 00:55:28,239 --> 00:55:31,279 Speaker 1: google and see if anybody else has basically said, oh, well, 962 00:55:31,320 --> 00:55:35,560 Speaker 1: this is a deep fake. So you can't 963 00:55:35,600 --> 00:55:38,960 Speaker 1: tell from the video itself, because they are incredibly good 964 00:55:39,000 --> 00:55:41,200 Speaker 1: now. You really have to go to other sources and 965 00:55:41,200 --> 00:55:44,400 Speaker 1: find out what the other sources are saying. And 966 00:55:44,480 --> 00:55:46,440 Speaker 1: often what you find, for example, if you do that, is 967 00:55:47,200 --> 00:55:49,239 Speaker 1: you'll find out this videotape was created by such 968 00:55:49,280 --> 00:55:52,359 Speaker 1: and such, or this videotape actually comes from some 969 00:55:52,400 --> 00:55:56,400 Speaker 1: other incident and has nothing to do with what's going on here. 970 00:55:56,440 --> 00:56:00,399 Speaker 1: But basically, in terms of dealing with misinformation, I think, 971 00:56:00,920 --> 00:56:03,719 Speaker 1: whether we're talking about students in school or whether we're 972 00:56:03,719 --> 00:56:05,719 Speaker 1: talking about adults, the thing we have to do is 973 00:56:06,239 --> 00:56:10,640 Speaker 1: learn a little journalism. By the way, just quickly, other 974 00:56:10,800 --> 00:56:13,719 Speaker 1: countries have actually gone this route in a big way. 975 00:56:13,760 --> 00:56:17,319 Speaker 1: So Finland, because it's right next to Russia and it's 976 00:56:17,360 --> 00:56:20,799 Speaker 1: been in a cold war with Russia for the last 977 00:56:20,880 --> 00:56:24,880 Speaker 1: hundred years, knows that Russian disinformation is coming over. And 978 00:56:24,920 --> 00:56:29,000 Speaker 1: so they're actually training students and training adults in how 979 00:56:29,000 --> 00:56:32,720 Speaker 1: to tell fake from real, you know, in an intensive way. 980 00:56:32,840 --> 00:56:35,600 Speaker 1: Well, we could borrow a few of their techniques. 981 00:56:36,080 --> 00:56:38,520 Speaker 1: Quite interesting. There was something else in the book 982 00:56:38,560 --> 00:56:42,560 Speaker 1: I had to ask you about. What is 983 00:56:43,360 --> 00:56:46,200 Speaker 1: anosognosia? I don't know if I pronounced 984 00:56:46,200 --> 00:56:48,840 Speaker 1: that right. Neither do I. I certainly... no, I 985 00:56:48,960 --> 00:56:51,239 Speaker 1: know that I have no idea if I pronounced that right. 986 00:56:51,680 --> 00:56:55,040 Speaker 1: I could barely spit it out. That's true. But 987 00:56:55,160 --> 00:56:57,439 Speaker 1: I know enough to think I know, and I really 988 00:56:57,440 --> 00:56:59,440 Speaker 1: haven't checked in a while to figure out if I 989 00:56:59,440 --> 00:57:02,160 Speaker 1: really know how to pronounce it well. Anosognosia is 990 00:57:02,920 --> 00:57:07,120 Speaker 1: actually a term that comes from medicine and has 991 00:57:07,160 --> 00:57:13,319 Speaker 1: to do with issues where, because of brain injury, people 992 00:57:13,360 --> 00:57:17,080 Speaker 1: are paralyzed but don't know that they're paralyzed. Oh yeah.
993 00:57:17,160 --> 00:57:21,440 Speaker 1: So for example, if a person is paralyzed, 994 00:57:21,600 --> 00:57:25,000 Speaker 1: I believe it's the left arm, and you put a cup 995 00:57:25,000 --> 00:57:26,400 Speaker 1: of water in front of them and say, okay, pick 996 00:57:26,480 --> 00:57:28,560 Speaker 1: up the cup, well, the person can't move their arm. 997 00:57:29,160 --> 00:57:31,240 Speaker 1: They're paralyzed. They can't move their arm. But if you 998 00:57:31,240 --> 00:57:33,520 Speaker 1: ask the person why they aren't picking up the cup, 999 00:57:33,640 --> 00:57:37,120 Speaker 1: they may say something like, I'm not thirsty, why would 1000 00:57:37,160 --> 00:57:38,440 Speaker 1: I want to pick up the cup? That is, they 1001 00:57:38,440 --> 00:57:42,160 Speaker 1: have no awareness. Yeah, sort of like the split-brain experiments. 1002 00:57:42,200 --> 00:57:45,840 Speaker 1: Well, the split-brain experiments are exactly that, where 1003 00:57:46,240 --> 00:57:48,360 Speaker 1: one side of the brain can point to the right object, 1004 00:57:48,960 --> 00:57:50,960 Speaker 1: but that's not the side of the brain that controls 1005 00:57:51,160 --> 00:57:55,560 Speaker 1: talking, that controls verbal skills. But if you 1006 00:57:55,600 --> 00:57:57,320 Speaker 1: ask the person why they pointed to that, they 1007 00:57:57,320 --> 00:57:59,520 Speaker 1: can come up with something. That is, that's part of 1008 00:57:59,520 --> 00:58:03,400 Speaker 1: how our brain is very good at interpreting, at understanding 1009 00:58:03,440 --> 00:58:06,640 Speaker 1: novel situations, so we can come up with justifications. We 1010 00:58:06,640 --> 00:58:08,320 Speaker 1: can come up with rationales for why we 1011 00:58:08,400 --> 00:58:10,600 Speaker 1: do what we do quite easily. Our brain is an 1012 00:58:10,600 --> 00:58:16,160 Speaker 1: incredible storyteller, but, you know, incredible storytellers sometimes tell fiction, 1013 00:58:16,600 --> 00:58:18,760 Speaker 1: and our brain is quite good at coming up with 1014 00:58:18,800 --> 00:58:22,400 Speaker 1: fiction at times. That's quite interesting. I didn't know where you 1015 00:58:22,440 --> 00:58:25,520 Speaker 1: were going to go with that. 1016 00:58:26,080 --> 00:58:31,520 Speaker 1: The idea of that injury and paralysis started 1017 00:58:31,520 --> 00:58:34,479 Speaker 1: to remind me a little bit of the aphasias, where 1018 00:58:34,520 --> 00:58:38,240 Speaker 1: people lose the ability to speak but they can sing, 1019 00:58:38,480 --> 00:58:41,200 Speaker 1: or they can't write but they can still read. And 1020 00:58:41,440 --> 00:58:44,840 Speaker 1: it seems like there's almost a very specific part of 1021 00:58:44,880 --> 00:58:49,000 Speaker 1: the brain that performs very specific functions, and if it's injured, 1022 00:58:49,480 --> 00:58:53,200 Speaker 1: everything else related still works; just that one skill seems 1023 00:58:53,240 --> 00:58:55,360 Speaker 1: to go away. That's right. And our work can be thought 1024 00:58:55,360 --> 00:58:59,040 Speaker 1: of as a metaphorical extension of that to intellectual capabilities. 1025 00:58:59,040 --> 00:59:04,720 Speaker 1: But the issue with a lot of physical maladies is that
a 1026 00:59:04,760 --> 00:59:07,280 Speaker 1: lot of people don't know the physical maladies that they've got. 1027 00:59:07,360 --> 00:59:10,120 Speaker 1: So as people become hard of hearing, they often don't 1028 00:59:10,120 --> 00:59:12,440 Speaker 1: know that they're becoming hard of hearing, and so they 1029 00:59:12,520 --> 00:59:16,320 Speaker 1: wonder why everybody's mumbling, for example. A lot of people 1030 00:59:16,320 --> 00:59:19,280 Speaker 1: who are colorblind don't know they're colorblind, 1031 00:59:19,320 --> 00:59:22,000 Speaker 1: because they've never not been colorblind. I had no idea. 1032 00:59:22,040 --> 00:59:24,360 Speaker 1: I would have thought, like, when you look at 1033 00:59:24,360 --> 00:59:27,320 Speaker 1: a stoplight, you'd think, what are people talking about 1034 00:59:27,320 --> 00:59:29,360 Speaker 1: with red lights and green lights? They all look gray 1035 00:59:29,360 --> 00:59:33,200 Speaker 1: to me. Does that not register? Apparently 1036 00:59:33,240 --> 00:59:35,400 Speaker 1: not, because you've never experienced red or you've 1037 00:59:35,400 --> 00:59:37,360 Speaker 1: never experienced green, so you don't know what you're missing. 1038 00:59:38,120 --> 00:59:41,520 Speaker 1: You mentioned everybody's mumbling. When I turned fifty, I 1039 00:59:41,560 --> 00:59:44,080 Speaker 1: remember having, and this is absolutely true, a conversation with 1040 00:59:44,120 --> 00:59:47,240 Speaker 1: my wife. I was sitting at the breakfast table one 1041 00:59:47,280 --> 00:59:49,120 Speaker 1: Sunday and I said, I don't know what's going on with 1042 00:59:49,160 --> 00:59:52,160 Speaker 1: the New York Times, but they're using some cheaper paper. 1043 00:59:52,760 --> 00:59:56,320 Speaker 1: Look how fuzzy the words are. And then I said, 1044 00:59:56,320 --> 00:59:58,600 Speaker 1: look at the Wall Street Journal, it's the same thing. And 1045 00:59:58,680 --> 01:00:03,080 Speaker 1: my wife says, you need glasses. And I'm like, what? No, no, 1046 01:00:03,200 --> 01:00:05,800 Speaker 1: I have perfect vision. She hands me her glasses and 1047 01:00:05,800 --> 01:00:10,919 Speaker 1: I'm like, oh. I had no idea my vision had 1048 01:00:11,040 --> 01:00:14,400 Speaker 1: decayed so much at the ripe old age of fifty, 1049 01:00:15,080 --> 01:00:18,400 Speaker 1: some years ago, and it's exactly the same thing: 1050 01:00:18,680 --> 01:00:22,720 Speaker 1: you have no idea that the gradual decay is taking place. 1051 01:00:23,880 --> 01:00:26,400 Speaker 1: So what else are you working on? Your 1052 01:00:26,440 --> 01:00:30,400 Speaker 1: field of study has very much evolved since the 1053 01:00:30,440 --> 01:00:33,720 Speaker 1: original Dunning Kruger work. What else are you looking at 1054 01:00:33,800 --> 01:00:37,320 Speaker 1: these days? A related idea that we've been looking at 1055 01:00:37,400 --> 01:00:44,240 Speaker 1: quite a bit is this idea of hypocognition. Hypocognition? 1056 01:00:44,280 --> 01:00:46,360 Speaker 1: And the best way to explain it is, if you 1057 01:00:46,360 --> 01:00:50,680 Speaker 1: don't know what hypocognition is, congratulations, you've just experienced it. 1058 01:00:51,680 --> 01:00:56,480 Speaker 1: Hypocognition is not having a concept, if you will; so, 1059 01:00:56,920 --> 01:00:59,960 Speaker 1: not having the idea of unknown unknowns.
In finance, 1060 01:01:00,480 --> 01:01:03,240 Speaker 1: a lot of people invest, but they don't really have 1061 01:01:03,320 --> 01:01:07,600 Speaker 1: the concept of exponential growth or compound interest. And compounding, 1062 01:01:07,720 --> 01:01:13,520 Speaker 1: like most probability and statistical things, is very counterintuitive. People 1063 01:01:14,200 --> 01:01:16,360 Speaker 1: just can't wrap their heads around it. And when you 1064 01:01:16,400 --> 01:01:21,680 Speaker 1: show people compounding charts, they're very often incredulous. Incredulous that, wait, 1065 01:01:21,800 --> 01:01:24,640 Speaker 1: it can be this much money? I had a whole discussion about 1066 01:01:24,680 --> 01:01:28,320 Speaker 1: the number of four-oh-one-k millionaires, and the 1067 01:01:28,320 --> 01:01:30,840 Speaker 1: person said, well, maybe years ago, but you couldn't do 1068 01:01:30,920 --> 01:01:34,320 Speaker 1: that now. Why can't you do that now? You've 1069 01:01:34,560 --> 01:01:37,600 Speaker 1: still got however many years it is, and here's what 1070 01:01:37,640 --> 01:01:41,000 Speaker 1: your expected returns are over forty years. Oh, and PS, 1071 01:01:41,520 --> 01:01:45,160 Speaker 1: your contribution levels are up. It's easier today than 1072 01:01:45,200 --> 01:01:47,480 Speaker 1: it was years ago. That's right.
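[Editor's note: a quick worked example of the compounding arithmetic discussed above. This sketch is an addition for illustration, not part of the conversation; the $19,500 annual contribution, 7% average return, and 40-year horizon are hypothetical round numbers, not figures either speaker cites.]

```python
# Hypothetical 401(k)-style compounding sketch; all inputs are assumed
# round numbers for illustration (see editor's note above).
def future_value(annual_contribution: float, rate: float, years: int) -> float:
    """Grow a balance with one contribution at the start of each year."""
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_contribution) * (1 + rate)
    return balance

fv = future_value(19_500, 0.07, 40)
print(f"Total contributed: ${19_500 * 40:,.0f}")  # $780,000
print(f"Ending balance:    ${fv:,.0f}")           # about $4,165,000
```

[Under these assumptions, most of the ending balance is growth rather than contributions, which is exactly the part of the chart people find hard to believe.]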
But if you don't 1073 01:01:47,520 --> 01:01:51,640 Speaker 1: have the concept, what you are talking about seems alien, foreign, 1074 01:01:51,840 --> 01:01:54,720 Speaker 1: or a little bit of a con. So 1075 01:01:55,400 --> 01:01:58,360 Speaker 1: we're studying that in a number of ways, because we're interested. 1076 01:01:58,400 --> 01:02:02,520 Speaker 1: For example, what if people don't have a concept of 1077 01:02:02,520 --> 01:02:05,400 Speaker 1: scientific rigor? They don't know all the rules that I 1078 01:02:05,440 --> 01:02:09,640 Speaker 1: have to live under, for example, to verify or make 1079 01:02:09,680 --> 01:02:12,120 Speaker 1: the case for any sort of conclusion that I want 1080 01:02:12,120 --> 01:02:15,600 Speaker 1: to reach. And that turns out to be related to 1081 01:02:15,600 --> 01:02:17,840 Speaker 1: two perceptions out there in the world. The first perception 1082 01:02:17,960 --> 01:02:20,880 Speaker 1: is scientists can say whatever they want. Is that a 1083 01:02:20,880 --> 01:02:25,960 Speaker 1: real perception that people really hold? Not a majority, 1084 01:02:26,040 --> 01:02:28,520 Speaker 1: but a clear percentage of people believe that. Is that 1085 01:02:28,600 --> 01:02:31,120 Speaker 1: specific to this country, or is that global? That I 1086 01:02:31,160 --> 01:02:34,320 Speaker 1: don't know; I've only studied it within this country. 1087 01:02:34,360 --> 01:02:36,800 Speaker 1: And it's also related, by the way, to distrust of science, 1088 01:02:36,840 --> 01:02:38,920 Speaker 1: that you just don't have to listen to scientists, 1089 01:02:39,120 --> 01:02:42,200 Speaker 1: what they have to say really isn't useful. And 1090 01:02:42,280 --> 01:02:45,440 Speaker 1: it all does trace back, in part, but an 1091 01:02:45,480 --> 01:02:49,200 Speaker 1: important part, to not knowing how much work it 1092 01:02:49,280 --> 01:02:52,520 Speaker 1: is to produce a piece of scientific knowledge. You don't 1093 01:02:52,520 --> 01:02:55,080 Speaker 1: have the idea of control conditions, random assignment, I can 1094 01:02:55,080 --> 01:02:59,080 Speaker 1: go on and on, you can't cherry-pick. People 1095 01:02:59,200 --> 01:03:01,480 Speaker 1: don't know these rules, and as a consequence, they think 1096 01:03:01,480 --> 01:03:06,440 Speaker 1: scientists are just some professors in their office dreaming 1097 01:03:06,520 --> 01:03:09,439 Speaker 1: up a conclusion and then collecting some data to window- 1098 01:03:09,560 --> 01:03:13,120 Speaker 1: dress it, for example. And yet we use technology to 1099 01:03:13,240 --> 01:03:16,080 Speaker 1: such a great degree. Do people think these are like, oh, 1100 01:03:16,080 --> 01:03:18,880 Speaker 1: look, a magic box that I can speak to people 1101 01:03:18,920 --> 01:03:22,760 Speaker 1: on, it's magic? Do they not get that technology and engineering 1102 01:03:22,960 --> 01:03:27,400 Speaker 1: are based on fundamental science? I mean, that seems pretty obvious. 1103 01:03:27,960 --> 01:03:30,800 Speaker 1: If science doesn't work, then how could you fly on 1104 01:03:30,840 --> 01:03:33,040 Speaker 1: a plane? How could you take medicine? How could you 1105 01:03:33,120 --> 01:03:36,400 Speaker 1: use, you know... we get into an elevator, at 1106 01:03:36,440 --> 01:03:39,920 Speaker 1: least in cities, every day. Is it a magic box, 1107 01:03:40,000 --> 01:03:42,400 Speaker 1: or is there science behind it? It just seems 1108 01:03:42,440 --> 01:03:46,480 Speaker 1: so hard to accept that people are really science-skeptical. 1109 01:03:47,320 --> 01:03:50,040 Speaker 1: Well, I agree, but I assure you that that 1110 01:03:50,120 --> 01:03:54,280 Speaker 1: percentage of people does exist. What percentage of people 1111 01:03:54,320 --> 01:03:57,920 Speaker 1: that you study are truly science skeptics? Well, we're not 1112 01:03:58,000 --> 01:04:00,560 Speaker 1: using representative samples, but in the samples we get, 1113 01:04:00,560 --> 01:04:03,440 Speaker 1: and they're actually better educated than the average American, 1114 01:04:03,800 --> 01:04:07,000 Speaker 1: it's a clear percentage. But I can't say; I 1115 01:04:07,040 --> 01:04:08,800 Speaker 1: don't know what the real percentage is, because I haven't 1116 01:04:08,800 --> 01:04:12,480 Speaker 1: done anything that's a good representative snapshot, let's say, in 1117 01:04:12,560 --> 01:04:15,240 Speaker 1: the United States. But you have to understand that a 1118 01:04:15,280 --> 01:04:19,400 Speaker 1: lot of people... I mean, the ignorance of the scientific 1119 01:04:19,480 --> 01:04:21,320 Speaker 1: method runs so deep that a lot of people don't 1120 01:04:21,400 --> 01:04:26,440 Speaker 1: understand that scientists collect data. They don't understand that 1121 01:04:26,640 --> 01:04:30,400 Speaker 1: that's the process, and that data have the 1122 01:04:30,440 --> 01:04:32,680 Speaker 1: final authority in what you're able to conclude and what 1123 01:04:32,680 --> 01:04:35,000 Speaker 1: you're able to say. It just doesn't appear to them. 1124 01:04:35,040 --> 01:04:38,840 Speaker 1: So if you ask students, let's say in college 1125 01:04:38,880 --> 01:04:41,680 Speaker 1: or in high school, do they believe in oxygen or 1126 01:04:41,800 --> 01:04:45,320 Speaker 1: do they believe in the electron, they'll go yes, yes. 1127 01:04:45,440 --> 01:04:49,360 Speaker 1: Why? And they don't cite an experiment, they don't cite data.
1128 01:04:49,400 --> 01:04:51,840 Speaker 1: They basically say, that's what everybody says, that's what my 1129 01:04:51,880 --> 01:04:55,360 Speaker 1: teacher says, that's what my parents say. So for a 1130 01:04:55,360 --> 01:04:58,600 Speaker 1: lot of people, the idea of data is not 1131 01:04:58,640 --> 01:05:01,160 Speaker 1: what they think about. They're basing their beliefs on what 1132 01:05:01,200 --> 01:05:02,920 Speaker 1: other people say, by the way, which is the same 1133 01:05:02,960 --> 01:05:06,480 Speaker 1: basis they use to believe in things like reincarnation or 1134 01:05:06,560 --> 01:05:11,720 Speaker 1: ghosts or karma. That is, the basis for people's scientific 1135 01:05:11,760 --> 01:05:14,920 Speaker 1: beliefs tends to be the same as the basis of 1136 01:05:14,960 --> 01:05:19,120 Speaker 1: their supernatural beliefs. So it's just whatever the societal consensus is; 1137 01:05:19,160 --> 01:05:23,120 Speaker 1: that's their acceptance. It's social proof. That's exactly it. And you 1138 01:05:23,160 --> 01:05:25,400 Speaker 1: know, one last question before I get to my 1139 01:05:25,440 --> 01:05:28,800 Speaker 1: favorite questions. One thing I wanted to ask you earlier 1140 01:05:29,480 --> 01:05:33,400 Speaker 1: but didn't get to comes back to the 1141 01:05:35,920 --> 01:05:40,600 Speaker 1: paper blowing up and becoming so popular. After that happened, 1142 01:05:41,080 --> 01:05:44,160 Speaker 1: how did that affect your subsequent research? Did it affect 1143 01:05:44,560 --> 01:05:47,200 Speaker 1: the topics you picked? Did it affect the options you 1144 01:05:47,240 --> 01:05:51,840 Speaker 1: had available? Like, what did this paper blowing 1145 01:05:51,960 --> 01:05:55,520 Speaker 1: up do to your subsequent research? Well, for many years, 1146 01:05:55,560 --> 01:06:00,080 Speaker 1: it didn't do anything, because it was known, but the 1147 01:06:00,120 --> 01:06:03,479 Speaker 1: Internet wasn't fully in place yet; it wasn't a thing yet. 1148 01:06:03,640 --> 01:06:06,480 Speaker 1: I think that's happened much more recently. So I 1149 01:06:06,520 --> 01:06:09,280 Speaker 1: went off and studied whatever I studied, but then the 1150 01:06:09,320 --> 01:06:11,200 Speaker 1: world sort of told me, no, we want you to 1151 01:06:11,240 --> 01:06:16,280 Speaker 1: look at this. And that's okay, because this was 1152 01:06:16,320 --> 01:06:19,800 Speaker 1: always the paper I didn't know how to follow up. Yes. 1153 01:06:20,080 --> 01:06:23,440 Speaker 1: So you do have follow-ups. What else came out of 1154 01:06:23,440 --> 01:06:25,280 Speaker 1: this paper? Oh, a number of things have come out 1155 01:06:25,320 --> 01:06:27,080 Speaker 1: of this paper. So the question is, when are people 1156 01:06:27,080 --> 01:06:31,040 Speaker 1: most vulnerable to the Dunning Kruger effect? And the 1157 01:06:31,040 --> 01:06:33,560 Speaker 1: answer is when they have an answer, when they believe 1158 01:06:33,640 --> 01:06:37,000 Speaker 1: they have expertise or they can spin a yarn, if 1159 01:06:37,000 --> 01:06:38,920 Speaker 1: you will. I mean, there are times when you just 1160 01:06:39,040 --> 01:06:41,160 Speaker 1: simply cannot come up with an answer and you know 1161 01:06:41,240 --> 01:06:44,160 Speaker 1: that you don't know. You know when you're guessing. 1162 01:06:44,360 --> 01:06:47,800 Speaker 1: And that's some recent work we now have under review.
1163 01:06:48,080 --> 01:06:50,520 Speaker 1: It shows that people know when they're guessing. The problem,
1164 01:06:50,560 --> 01:06:54,400 Speaker 1: the Dunning-Kruger effect, is when you don't think you're guessing
1165 01:06:54,440 --> 01:06:57,080 Speaker 1: and you're coming up with a wrong answer. It's led
1166 01:06:57,120 --> 01:07:00,280 Speaker 1: to this work on hypocognition. It's led to this work
1167 01:07:00,360 --> 01:07:05,560 Speaker 1: on gullibility. We're now looking at, do people know when
1168 01:07:06,200 --> 01:07:10,640 Speaker 1: they really need to ask for advice? That's an important consequence.
1169 01:07:10,920 --> 01:07:13,240 Speaker 1: But a lot of these questions really weren't formed in
1170 01:07:13,280 --> 01:07:16,280 Speaker 1: my head until I started interacting with people like you,
1171 01:07:16,920 --> 01:07:20,680 Speaker 1: or reporters, or people in the airport, for example. Are
1172 01:07:20,720 --> 01:07:24,720 Speaker 1: people randomly stopping you to ask Dunning-Kruger questions? Well,
1173 01:07:24,760 --> 01:07:28,720 Speaker 1: it has happened. I mean, there's no escaping the baggage carousel.
1174 01:07:28,760 --> 01:07:32,080 Speaker 1: You're a prisoner over there? Well, no. Luckily,
1175 01:07:32,080 --> 01:07:34,160 Speaker 1: no one can see my little label on
1176 01:07:34,240 --> 01:07:36,920 Speaker 1: the luggage. But if my name gets called, you know,
1177 01:07:37,160 --> 01:07:39,520 Speaker 1: to get a seat assignment or whatever, something like that,
1178 01:07:40,480 --> 01:07:42,840 Speaker 1: occasionally a person will come over and say, are you that Dunning?
1179 01:07:43,120 --> 01:07:46,720 Speaker 1: And I go, this is wild. So it's had
1180 01:07:46,800 --> 01:07:52,960 Speaker 1: that impact. But basically, I'm in, let's say,
1181 01:07:53,000 --> 01:07:56,000 Speaker 1: the last act of my research career, and the world
1182 01:07:56,000 --> 01:07:57,720 Speaker 1: has told me this is what it wants me to
1183 01:07:57,960 --> 01:08:03,600 Speaker 1: look at. So I'm now really asking the question,
1184 01:08:03,640 --> 01:08:05,800 Speaker 1: do people really not know what they don't know?
1185 01:08:05,880 --> 01:08:09,040 Speaker 1: And what implications does that have? Quite fascinating. When
1186 01:08:09,120 --> 01:08:12,520 Speaker 1: is that research coming out? Hopefully soon, to a journal
1187 01:08:12,560 --> 01:08:16,679 Speaker 1: and eventually a book near you. Excellent. Alright, so
1188 01:08:16,760 --> 01:08:19,400 Speaker 1: let me jump to my favorite questions that we ask
1189 01:08:20,000 --> 01:08:22,400 Speaker 1: all of our guests. Feel free to go as long
1190 01:08:22,479 --> 01:08:25,280 Speaker 1: or short as you like with this, and these
1191 01:08:25,320 --> 01:08:28,479 Speaker 1: are really designed to tell us who you are,
1192 01:08:28,520 --> 01:08:30,840 Speaker 1: because we may not know who you are. What
1193 01:08:30,880 --> 01:08:33,320 Speaker 1: was the first car you ever owned? Year, make and model?
1194 01:08:34,040 --> 01:08:37,040 Speaker 1: The first car I owned was a nineteen seventy-six
1195 01:08:37,240 --> 01:08:41,320 Speaker 1: Ford Pinto. It was a Mint Julep green Ford Pinto.
1196 01:08:41,800 --> 01:08:44,920 Speaker 1: So if anybody is interested, you should google Mint Julep
1197 01:08:45,120 --> 01:08:48,439 Speaker 1: green Ford Pinto and you will see pictures of a
1198 01:08:48,479 --> 01:08:51,400 Speaker 1: color that exists nowhere else in this world. Yeah,
1199 01:08:51,400 --> 01:08:54,240 Speaker 1: that is insult to injury, a terrible car in an
1200 01:08:54,320 --> 01:08:57,639 Speaker 1: awful color. Oh, and this car was the epitome
1201 01:08:57,680 --> 01:09:00,880 Speaker 1: of all of that. So, a little more
1202 01:09:00,960 --> 01:09:04,519 Speaker 1: interesting question. What are you streaming or listening to
1203 01:09:04,720 --> 01:09:09,680 Speaker 1: or watching these days? Well, in terms of streaming,
1204 01:09:10,280 --> 01:09:15,240 Speaker 1: my tastes these days run to, let's say,
1205 01:09:15,320 --> 01:09:21,080 Speaker 1: intellectual fantasy series like Watchmen, or Westworld, which is about
1206 01:09:21,120 --> 01:09:25,960 Speaker 1: to come on, or Star Trek: Picard, for example.
1207 01:09:26,000 --> 01:09:30,240 Speaker 1: In terms of streaming music, well, I'm a BBC
1208 01:09:30,280 --> 01:09:34,800 Speaker 1: Two, excuse me, a BBC Six, CBC Two kind of guy.
1209 01:09:35,520 --> 01:09:37,639 Speaker 1: I'm listening to a lot of Canadian pop music at
1210 01:09:37,640 --> 01:09:39,600 Speaker 1: the moment. Okay, I was going to say, what is
1211 01:09:39,640 --> 01:09:43,639 Speaker 1: BBC Six? BBC Six is basically British pop music. British,
1212 01:09:44,120 --> 01:09:47,440 Speaker 1: like great pop from the seventies, or Morrissey? No, it's contemporary;
1213 01:09:47,439 --> 01:09:49,800 Speaker 1: it's more alternative, if you will. But I find what's
1214 01:09:49,840 --> 01:09:51,759 Speaker 1: going on in Britain and Canada to be more interesting
1215 01:09:51,800 --> 01:09:54,040 Speaker 1: than what's going on in the United States in terms
1216 01:09:54,040 --> 01:09:56,760 Speaker 1: of pop. I've been listening to Bob Harris on the BBC
1217 01:09:56,840 --> 01:10:00,240 Speaker 1: forever, and I love the sort of... he covers old
1218 01:10:00,439 --> 01:10:03,759 Speaker 1: genres and decades, always interesting. That's exactly
1219 01:10:03,760 --> 01:10:08,000 Speaker 1: what these two channels do. Yeah, that's very interesting.
1220 01:10:08,439 --> 01:10:10,200 Speaker 1: And if you like Watchmen, I just had this
1221 01:10:10,240 --> 01:10:16,000 Speaker 1: conversation yesterday. Have you seen, on Amazon Prime, The Boys? No? All right,
1222 01:10:16,160 --> 01:10:22,840 Speaker 1: so really, very quickly, it's a sort of anti-superhero
1223 01:10:22,920 --> 01:10:30,200 Speaker 1: world where all the superheroes are these corporate-owned entities,
1224 01:10:30,320 --> 01:10:35,759 Speaker 1: and they turn out to really not be saving
1225 01:10:35,840 --> 01:10:39,200 Speaker 1: society as they appear to be, so much as earning
1226 01:10:39,200 --> 01:10:42,240 Speaker 1: a corporate buck. And it's really quite fascinating if you're
1227 01:10:42,280 --> 01:10:46,599 Speaker 1: at all interested. It's not quite Watchmen, but there
1228 01:10:46,600 --> 01:10:52,000 Speaker 1: are some parallels there. It was really fascinating.
1229 01:10:52,040 --> 01:10:55,400 Speaker 1: Parts of it are a little grisly, but it's cartoonish,
1230 01:10:55,439 --> 01:10:58,599 Speaker 1: so it's not real violence. It's
1231 01:10:58,960 --> 01:11:02,519 Speaker 1: cartoon violence, although, you know, it can get a little gory.
1232 01:11:02,600 --> 01:11:05,920 Speaker 1: But it's reimagining, you know,
1233 01:11:05,960 --> 01:11:08,800 Speaker 1: this sort of genre in light of contemporary themes; that
1234 01:11:08,840 --> 01:11:11,680 Speaker 1: would be very interesting. Yeah, exactly. So what's the
1235 01:11:11,720 --> 01:11:14,799 Speaker 1: most important thing that people don't know about David Dunning?
1236 01:11:15,200 --> 01:11:19,360 Speaker 1: Hmm, interesting question. Well, originally, when I was
1237 01:11:19,400 --> 01:11:22,720 Speaker 1: a kid, I first wanted to be a cartoonist and
1238 01:11:22,800 --> 01:11:26,240 Speaker 1: then a screenwriter. In fact, when I was thirteen, I
1239 01:11:26,280 --> 01:11:31,600 Speaker 1: actually submitted a spec script to the TV show M*A*S*H.
1240 01:11:31,600 --> 01:11:33,519 Speaker 1: It was rejected, but I had in my hand, I've
1241 01:11:33,560 --> 01:11:37,479 Speaker 1: since lost them and I regret losing them,
1242 01:11:37,479 --> 01:11:39,920 Speaker 1: little handwritten notes from Larry Gelbart, the producer
1243 01:11:39,920 --> 01:11:41,880 Speaker 1: of the show, who was then and is now a hero
1244 01:11:42,040 --> 01:11:45,559 Speaker 1: of mine. So yeah, he's an interesting guy. Who were
1245 01:11:45,640 --> 01:11:49,719 Speaker 1: some of your early mentors? What psychologists influenced your approach
1246 01:11:50,120 --> 01:11:52,640 Speaker 1: to what you do? I would have to say I
1247 01:11:52,720 --> 01:11:54,160 Speaker 1: had a great set of mentors, both as an
1248 01:11:54,200 --> 01:11:59,280 Speaker 1: undergraduate and as a graduate student. Michigan State professors
1249 01:11:59,280 --> 01:12:03,360 Speaker 1: Larry Messé and Joel Aronoff were very influential.
1250 01:12:03,439 --> 01:12:05,280 Speaker 1: Then I went to Stanford, and I was a Lee
1251 01:12:05,360 --> 01:12:09,799 Speaker 1: Ross student. Michigan State taught me rigor;
1252 01:12:10,439 --> 01:12:12,960 Speaker 1: Stanford and Lee taught me humanity, how to put
1253 01:12:13,040 --> 01:12:15,639 Speaker 1: humanity into the work, make it an interesting human story.
1254 01:12:16,640 --> 01:12:20,080 Speaker 1: And I think everybody who
1255 01:12:20,120 --> 01:12:23,160 Speaker 1: was around Amos Tversky thinks of him as an influence,
1256 01:12:23,880 --> 01:12:27,880 Speaker 1: because if you want to know what smart looks like,
1257 01:12:27,960 --> 01:12:31,040 Speaker 1: Amos was smart. And this is something I often tell undergraduates:
1258 01:12:31,600 --> 01:12:34,360 Speaker 1: pick a professor who everybody says is the smartest,
1259 01:12:34,400 --> 01:12:36,800 Speaker 1: because you need to see what smart looks like.
1260 01:12:36,840 --> 01:12:39,479 Speaker 1: The content doesn't matter. You want to see
1261 01:12:39,520 --> 01:12:43,840 Speaker 1: what smart looks like. So Amos Tversky
1262 01:12:44,280 --> 01:12:49,280 Speaker 1: and Phoebe Ellsworth were tremendous influences on basically how
1263 01:12:49,280 --> 01:12:53,400 Speaker 1: I spend my day. Quite interesting. Tell us
1264 01:12:53,439 --> 01:12:55,759 Speaker 1: about some of your favorite books. What are you reading
1265 01:12:55,800 --> 01:12:58,560 Speaker 1: these days? What do you like? Well, the problem with
1266 01:12:58,600 --> 01:13:00,400 Speaker 1: the books I read now is they're all related to
1267 01:13:00,520 --> 01:13:04,719 Speaker 1: my work, and reading is a little bit tough because
1268 01:13:04,720 --> 01:13:07,840 Speaker 1: I do it for the job so much. But
1269 01:13:08,280 --> 01:13:11,160 Speaker 1: actually, what I've been doing is going back
1270 01:13:11,240 --> 01:13:15,560 Speaker 1: to classics from my youth. So the book form of
1271 01:13:15,680 --> 01:13:19,120 Speaker 1: Swimming to Cambodia is something I recently read, and I'm trying
1272 01:13:19,160 --> 01:13:22,840 Speaker 1: to find Gödel, Escher, Bach. I can't believe you're bringing
1273 01:13:22,920 --> 01:13:25,320 Speaker 1: up some of my old-time classics. There you go. Well,
1274 01:13:25,360 --> 01:13:27,120 Speaker 1: I want to go back now that I'm older, and
1275 01:13:27,160 --> 01:13:29,720 Speaker 1: what do I think of them now, for example, is
1276 01:13:30,040 --> 01:13:32,200 Speaker 1: the way to think about it. But a lot
1277 01:13:32,240 --> 01:13:34,360 Speaker 1: of what I do is I just read long form
1278 01:13:35,000 --> 01:13:39,040 Speaker 1: on the web. So every morning I get the Ritholtz Reads.
1279 01:13:39,880 --> 01:13:43,160 Speaker 1: And do you find them interesting? Because I really sift
1280 01:13:43,200 --> 01:13:46,680 Speaker 1: through a ton of stuff to find ten really interesting
1281 01:13:46,760 --> 01:13:50,559 Speaker 1: things. Your sifting, at least for me, works very
1282 01:13:50,560 --> 01:13:53,639 Speaker 1: well, if you will, because I find great things to read.
1283 01:13:53,720 --> 01:13:56,080 Speaker 1: The thing that I have to do is discipline myself
1284 01:13:56,120 --> 01:13:59,040 Speaker 1: not to tweet all the readings you're suggesting, because then
1285 01:13:59,040 --> 01:14:02,040 Speaker 1: I'd just be ripping you off. Feel free. Listen, I'm just
1286 01:14:02,040 --> 01:14:05,000 Speaker 1: putting together a list of ten, except for Tuesdays, where it's
1287 01:14:05,000 --> 01:14:07,439 Speaker 1: fifteen instead of ten. I don't know where fifteen for
1288 01:14:07,479 --> 01:14:12,519 Speaker 1: Tuesday came from, but somehow that's become... I am
1289 01:14:12,560 --> 01:14:15,000 Speaker 1: a creature of habit, and I've learned that if I
1290 01:14:15,040 --> 01:14:18,000 Speaker 1: want to do something, if I can turn it
1291 01:14:18,000 --> 01:14:21,760 Speaker 1: into a habit, I can make it repetitive. And it's
1292 01:14:21,800 --> 01:14:24,800 Speaker 1: really... once you start doing something for a month
1293 01:14:24,920 --> 01:14:28,280 Speaker 1: or two, it becomes ingrained. Forget a decade or two;
1294 01:14:28,360 --> 01:14:33,000 Speaker 1: that's a whole different thing. And the Reads began
1295 01:14:33,080 --> 01:14:35,760 Speaker 1: as a way of just being organized. There's so much
1296 01:14:35,760 --> 01:14:38,760 Speaker 1: stuff to read. Let me eliminate all the junk and
1297 01:14:38,840 --> 01:14:43,000 Speaker 1: let me see what's left that's good. People don't
1298 01:14:43,080 --> 01:14:47,280 Speaker 1: realize this is really a golden age of journalism and writing.
1299 01:14:47,760 --> 01:14:49,759 Speaker 1: I used to go through the process in the morning
1300 01:14:49,800 --> 01:14:52,360 Speaker 1: of figuring out what's relevant and what do I want
1301 01:14:52,360 --> 01:14:59,720 Speaker 1: to read? That sort of concept of curation by extreme prejudice,
1302 01:14:59,800 --> 01:15:04,200 Speaker 1: by saying, if this isn't well done and well
1303 01:15:04,240 --> 01:15:07,400 Speaker 1: researched and well written and on a topic that's interesting,
1304 01:15:07,479 --> 01:15:10,040 Speaker 1: I can't be bothered with it, because everything else is so
1305 01:15:10,080 --> 01:15:15,400 Speaker 1: ephemeral and superficial... I used to do that manually,
1306 01:15:15,439 --> 01:15:18,040 Speaker 1: used to print it out. This is a hundred years ago,
1307 01:15:18,520 --> 01:15:20,160 Speaker 1: and someone said, hey, could you just give me a
1308 01:15:20,200 --> 01:15:22,960 Speaker 1: list of what you're reading instead of a hard copy?
1309 01:15:23,080 --> 01:15:28,840 Speaker 1: And, okay, that eventually became the
1310 01:15:28,880 --> 01:15:31,200 Speaker 1: Morning Reads. And I think I've been doing that for
1311 01:15:31,240 --> 01:15:35,120 Speaker 1: like twenty years or so. I'm at the
1312 01:15:35,160 --> 01:15:37,599 Speaker 1: point now where I could be a sentence or two
1313 01:15:37,680 --> 01:15:40,679 Speaker 1: into a piece and I'm like, nope. Like, I can
1314 01:15:40,720 --> 01:15:44,400 Speaker 1: tell immediately if something is good or bad.
1315 01:15:44,520 --> 01:15:46,080 Speaker 1: So you're not reading a whole lot of books
1316 01:15:46,120 --> 01:15:49,920 Speaker 1: otherwise? No, basically because I do so much reading
1317 01:15:50,000 --> 01:15:54,080 Speaker 1: that I prefer shorter, punchier things. And you're
1318 01:15:54,080 --> 01:15:57,479 Speaker 1: absolutely right. There's so much terrific information, some terrific blogs
1319 01:15:57,479 --> 01:15:59,760 Speaker 1: on the web, for example. Can you give
1320 01:15:59,800 --> 01:16:02,639 Speaker 1: us some blog names? The blog
1321 01:16:02,720 --> 01:16:04,559 Speaker 1: I would point out actually is a blog called
1322 01:16:04,600 --> 01:16:07,760 Speaker 1: Stumbling and Mumbling. Oh sure, I remember that; it
1323 01:16:07,800 --> 01:16:11,160 Speaker 1: became big about ten, twelve, fifteen years ago. Well, it still
1324 01:16:11,200 --> 01:16:16,200 Speaker 1: goes on, and I find the blogger to be
1325 01:16:16,439 --> 01:16:19,760 Speaker 1: extremely persuasive. It's about England, so it's not about the
1326 01:16:19,800 --> 01:16:24,080 Speaker 1: United States. So that's good. And it often has
1327 01:16:24,200 --> 01:16:28,280 Speaker 1: some insights I would dearly love to steal.
1328 01:16:28,280 --> 01:16:31,719 Speaker 1: That one I find to be quite good. In
1329 01:16:31,840 --> 01:16:36,679 Speaker 1: terms of political commentary, the blog Progress Pond I find
1330 01:16:36,720 --> 01:16:41,559 Speaker 1: to be extremely interesting. I'm not familiar. Well, it's
1331 01:16:41,960 --> 01:16:45,720 Speaker 1: a Democratic activist, if you will. But he's rather
1332 01:16:45,760 --> 01:16:49,799 Speaker 1: clear-eyed. He does stand apart from the Sturm
1333 01:16:49,880 --> 01:16:51,880 Speaker 1: und Drang of the day to really try to figure out
1334 01:16:51,920 --> 01:16:54,240 Speaker 1: what's going on, or to project what's going on. A
1335 01:16:54,280 --> 01:16:57,040 Speaker 1: Bernie bro? No, in fact, he is not a Bernie
1336 01:16:57,040 --> 01:17:00,680 Speaker 1: bro, that's absolutely clear. By the time this broadcasts,
1337 01:17:00,880 --> 01:17:03,960 Speaker 1: we will already have had the Super Tuesday results; we
1338 01:17:04,040 --> 01:17:10,760 Speaker 1: will be pretty deep into the primary season. We
1339 01:17:10,800 --> 01:17:13,360 Speaker 1: may even have a nominee by then. That will
1340 01:17:13,400 --> 01:17:16,400 Speaker 1: be kind of interesting. When you
1341 01:17:16,400 --> 01:17:19,920 Speaker 1: look at politics, do you ever find yourself with opinions
1342 01:17:19,960 --> 01:17:22,680 Speaker 1: and then catch yourself saying, I have no
1343 01:17:22,720 --> 01:17:25,720 Speaker 1: expertise in this, this is just my own opinion? Are
1344 01:17:25,760 --> 01:17:30,120 Speaker 1: you self-aware of your own Dunning-Kruger? Well, in politics,
1345 01:17:30,200 --> 01:17:33,479 Speaker 1: absolutely so. Whenever I pronounce something in politics, I usually
1346 01:17:34,080 --> 01:17:36,880 Speaker 1: preface it, or preamble it, with, well,
1347 01:17:36,880 --> 01:17:41,639 Speaker 1: this is for entertainment value only. Quite interesting.
1348 01:17:41,680 --> 01:17:43,680 Speaker 1: Tell us about a time you failed and what you
1349 01:17:43,840 --> 01:17:48,599 Speaker 1: learned from the experience. Well, a chronic failure
1350 01:17:48,680 --> 01:17:52,120 Speaker 1: I had, though ultimately the project was successful,
1351 01:17:52,120 --> 01:17:55,759 Speaker 1: but it took fifteen years, was this work on trust,
1352 01:17:56,080 --> 01:18:01,120 Speaker 1: where basically the finding is that people trust complete strangers,
1353 01:18:01,640 --> 01:18:05,160 Speaker 1: even though economics tells us they shouldn't, because why would
1354 01:18:05,160 --> 01:18:08,360 Speaker 1: a person ever honor your trust? They're a complete stranger. But
1355 01:18:08,800 --> 01:18:12,280 Speaker 1: people do trust, and our civilization profits because of that.
1356 01:18:13,040 --> 01:18:14,960 Speaker 1: And I looked at that, I said, okay, clearly the
1357 01:18:15,000 --> 01:18:19,400 Speaker 1: economics is failing. Clearly, two years and a psychological
1358 01:18:19,400 --> 01:18:21,840 Speaker 1: team will be able to figure this out. So I
1359 01:18:21,920 --> 01:18:25,519 Speaker 1: tried hypothesis after hypothesis after hypothesis and ran hundreds and
1360 01:18:25,600 --> 01:18:29,799 Speaker 1: hundreds and hundreds of subjects. All my hypotheses failed.
1361 01:18:29,960 --> 01:18:32,599 Speaker 1: They often failed in interesting ways, they failed in ways
1362 01:18:32,600 --> 01:18:34,760 Speaker 1: that cohered with one another, but for the life of me I
1363 01:18:34,800 --> 01:18:36,920 Speaker 1: couldn't figure out what was going on. That ultimately led
1364 01:18:36,960 --> 01:18:40,120 Speaker 1: to this emphasis on norms, and the norm of respect
1365 01:18:40,120 --> 01:18:43,840 Speaker 1: and politeness with other people. We trust other people
1366 01:18:43,920 --> 01:18:46,559 Speaker 1: because we have to respect them, and to distrust them
1367 01:18:46,640 --> 01:18:49,240 Speaker 1: is to disrespect them. That took fifteen years in the
1368 01:18:49,320 --> 01:18:52,080 Speaker 1: making to get to. What I learned from that, though,
1369 01:18:53,200 --> 01:18:57,880 Speaker 1: is I learned that there can be rules of human nature,
1370 01:18:58,479 --> 01:19:00,920 Speaker 1: but they can be so deep that none of our
1371 01:19:00,920 --> 01:19:03,880 Speaker 1: subjects knew what was going on. People could never explain it.
1372 01:19:04,600 --> 01:19:07,160 Speaker 1: And I'm the professional, and I couldn't explain it. Some
1373 01:19:07,200 --> 01:19:10,040 Speaker 1: things can run that deep. So that's what I learned.
1374 01:19:10,040 --> 01:19:15,080 Speaker 1: But that was fifteen years of failed data, which I
1375 01:19:15,120 --> 01:19:19,080 Speaker 1: could only bear because of the good graces of tenure. Huh,
1376 01:19:19,439 --> 01:19:24,120 Speaker 1: that's really interesting. There's a book that's
1377 01:19:24,160 --> 01:19:27,920 Speaker 1: sort of related to the normative issue and the
1378 01:19:28,000 --> 01:19:30,599 Speaker 1: trust issue, and a whole bunch of other cognitive
1379 01:19:30,680 --> 01:19:36,200 Speaker 1: things, by Will Storr, called The Heretics: Adventures
1380 01:19:36,200 --> 01:19:38,719 Speaker 1: with the Enemies of Science. I'm not familiar
1381 01:19:38,720 --> 01:19:44,360 Speaker 1: with it. So he is a journalist who embeds himself
1382 01:19:44,560 --> 01:19:48,360 Speaker 1: with all sorts of groups that you would otherwise think
1383 01:19:48,400 --> 01:19:53,439 Speaker 1: of as wacky, extreme, crazy, and whether it's flat-earthers
1384 01:19:54,000 --> 01:19:58,880 Speaker 1: or science deniers or climate change deniers, it's one group after
1385 01:19:58,920 --> 01:20:02,599 Speaker 1: another that's very relevant to the science denial issue.
1386 01:20:03,200 --> 01:20:07,840 Speaker 1: And his sort of thesis is, these people aren't mad
1387 01:20:08,000 --> 01:20:13,400 Speaker 1: or evil or dumb. There's something fundamentally wrong with their
1388 01:20:13,439 --> 01:20:17,679 Speaker 1: basic model of the world. And once that building block
1389 01:20:17,800 --> 01:20:20,280 Speaker 1: is set, you know, it's like aiming for the moon.
1390 01:20:20,320 --> 01:20:22,400 Speaker 1: If you're off just a little bit, an inch or two here,
1391 01:20:22,520 --> 01:20:25,160 Speaker 1: you're off by millions of miles as you whiz by.
1392 01:20:25,880 --> 01:20:30,240 Speaker 1: When their fundamental model of the universe is off, everything
1393 01:20:30,320 --> 01:20:33,599 Speaker 1: constructed on top of that just takes them in these
1394 01:20:33,760 --> 01:20:37,840 Speaker 1: crazy directions. And, hey, maybe some
1395 01:20:37,880 --> 01:20:41,960 Speaker 1: of these are evil people, but that's not necessarily how
1396 01:20:41,960 --> 01:20:45,960 Speaker 1: they went so far astray. It's a fundamental error
1397 01:20:46,040 --> 01:20:50,200 Speaker 1: that just keeps compounding. And it's quite fascinating. It's
1398 01:20:50,280 --> 01:20:53,120 Speaker 1: really an interesting book, if you've
1399 01:20:53,160 --> 01:20:57,760 Speaker 1: never seen it before. So what do
1400 01:20:57,800 --> 01:20:59,120 Speaker 1: you do for fun? What do you do when you're
1401 01:20:59,120 --> 01:21:03,160 Speaker 1: not reading academic research papers? Well, I'm older, so
1402 01:21:03,479 --> 01:21:05,960 Speaker 1: a lot of what I do is I watch stuff
1403 01:21:06,040 --> 01:21:10,000 Speaker 1: on a screen, whether it be television or not.
1404 01:21:10,280 --> 01:21:14,000 Speaker 1: When the term is in session, I will
1405 01:21:14,040 --> 01:21:16,720 Speaker 1: tend to watch a lot of sports, but not the
1406 01:21:16,760 --> 01:21:21,080 Speaker 1: typical sports. So I'm a big fan of Arsenal, the
1407 01:21:21,120 --> 01:21:23,839 Speaker 1: soccer team in England. And I know that your knowledgeable
1408 01:21:24,439 --> 01:21:29,280 Speaker 1: listeners out there are thinking, oh, I'm so sorry. No, well,
1409 01:21:29,320 --> 01:21:31,920 Speaker 1: the World Cup is fascinating, and when you get to what
1410 01:21:32,080 --> 01:21:35,759 Speaker 1: World Cup soccer really is, there are no commercial breaks.
1411 01:21:35,800 --> 01:21:38,960 Speaker 1: It's not the American sports you're used to.
1412 01:21:39,360 --> 01:21:41,760 Speaker 1: Exactly. You know, you watch the World Cup and, like,
1413 01:21:42,120 --> 01:21:44,719 Speaker 1: there have been times where it's like, gee, it's been sixty minutes
1414 01:21:44,760 --> 01:21:47,439 Speaker 1: and we haven't had a break yet. It's kind of amazing.
1415 01:21:47,479 --> 01:21:49,799 Speaker 1: And there's a flow to that game that is really unique,
1416 01:21:49,800 --> 01:21:53,600 Speaker 1: and it's a beautiful sport if you appreciate it for
1417 01:21:53,640 --> 01:21:55,920 Speaker 1: what it is. It really is the beautiful game. And
1418 01:21:56,320 --> 01:21:58,120 Speaker 1: there's a lot of strategy and a lot of incident
1419 01:21:58,640 --> 01:22:00,920 Speaker 1: going on, once you've been around it enough to realize
1420 01:22:00,960 --> 01:22:03,720 Speaker 1: what incident is. I mean, there's not much scoring, but
1421 01:22:03,760 --> 01:22:06,200 Speaker 1: that actually makes the games more exciting, because a goal
1422 01:22:06,320 --> 01:22:11,040 Speaker 1: matters so much. The games are always on edge, and
1423 01:22:11,640 --> 01:22:14,479 Speaker 1: things could change in a minute. So
1424 01:22:14,560 --> 01:22:17,040 Speaker 1: it can truly lead to excitement, but it's also a
1425 01:22:17,040 --> 01:22:20,639 Speaker 1: sport that can truly lead to despair, I've found, uniquely. Well,
1426 01:22:20,680 --> 01:22:22,680 Speaker 1: I live in New York, so between the Mets and
1427 01:22:22,680 --> 01:22:26,559 Speaker 1: the Knicks, I know all about despair. I wish
1428 01:22:26,640 --> 01:22:30,679 Speaker 1: they would stop with the flopping in the World Cup
1429 01:22:30,720 --> 01:22:34,040 Speaker 1: and in soccer. It's gotten to be way too much. So,
1430 01:22:34,200 --> 01:22:37,280 Speaker 1: within your field, what are you most optimistic about
1431 01:22:37,320 --> 01:22:40,880 Speaker 1: today, and what are you most pessimistic about? The most
1432 01:22:40,920 --> 01:22:44,599 Speaker 1: exciting thing in my field right now is the introduction
1433 01:22:44,640 --> 01:22:47,240 Speaker 1: of big data, if you will. That is, there are
1434 01:22:47,240 --> 01:22:50,800 Speaker 1: many social psychological questions, and also questions of interest to people
1435 01:22:50,840 --> 01:22:55,400 Speaker 1: in the world, that can be addressed with big data.
1436 01:22:55,439 --> 01:23:00,120 Speaker 1: There's just great sources of data out there. And how
1437 01:23:00,120 --> 01:23:01,960 Speaker 1: it's going to be exploited, I have no idea, but
1438 01:23:02,040 --> 01:23:03,920 Speaker 1: I bet it's going to be great. So in the
1439 01:23:03,960 --> 01:23:06,639 Speaker 1: field of behavioral science in general, I'm very much looking
1440 01:23:06,640 --> 01:23:09,479 Speaker 1: forward to that, as long as the people who have the
1441 01:23:09,560 --> 01:23:14,960 Speaker 1: data and the people who know traditional theory
1442 01:23:15,520 --> 01:23:18,760 Speaker 1: join up. Because it is the case that a lot
1443 01:23:18,800 --> 01:23:21,840 Speaker 1: of people who do traditional theory don't know that these
1444 01:23:21,920 --> 01:23:26,080 Speaker 1: data sources exist, and so opportunities are missed, and the
1445 01:23:26,080 --> 01:23:28,800 Speaker 1: people who have big data don't realize that they can
1446 01:23:28,800 --> 01:23:32,040 Speaker 1: be quite naive in their thinking about how to test
1447 01:23:32,080 --> 01:23:33,880 Speaker 1: the ideas that they have. They need to connect up
1448 01:23:33,920 --> 01:23:36,040 Speaker 1: with the theory people. If that happens, it's going to
1449 01:23:36,120 --> 01:23:39,479 Speaker 1: be great. Quite interesting. I always look at Facebook, which
1450 01:23:39,520 --> 01:23:42,600 Speaker 1: I'm not a big fan of as a user, and
1451 01:23:42,640 --> 01:23:46,280 Speaker 1: I just imagine they must have unbelievable reams of data
1452 01:23:46,360 --> 01:23:51,520 Speaker 1: about all sorts of individuals and groups, and
1453 01:23:52,160 --> 01:23:55,360 Speaker 1: how they behave in certain situations. I gotta think
1454 01:23:55,880 --> 01:23:59,080 Speaker 1: a team of research psychologists could have a field day
1455 01:23:59,080 --> 01:24:03,439 Speaker 1: with that. Oh, anthropologists, sociologists, economists, you name it.
1456 01:24:03,479 --> 01:24:07,559 Speaker 1: Absolutely. Interesting. And our final two questions: what sort
1457 01:24:07,560 --> 01:24:10,960 Speaker 1: of advice would you give to a recent college graduate
1458 01:24:11,280 --> 01:24:15,280 Speaker 1: who is interested in a career in psychology and research?
1459 01:24:16,160 --> 01:24:20,720 Speaker 1: Get some mentors, and get more than
1460 01:24:20,760 --> 01:24:24,959 Speaker 1: one, essentially. Absolutely, whether they be from your home institution
1461 01:24:25,000 --> 01:24:27,519 Speaker 1: or the one you're going to, or wherever. People
1462 01:24:27,520 --> 01:24:29,120 Speaker 1: are willing to give advice, and some of it is
1463 01:24:29,160 --> 01:24:34,720 Speaker 1: actually good. But also meet people, be
1464 01:24:35,200 --> 01:24:39,400 Speaker 1: somewhat aggressive about that, but also present yourself: give talks, have
1465 01:24:39,520 --> 01:24:42,719 Speaker 1: a blog, for example. It forces you to think,
1466 01:24:43,080 --> 01:24:46,360 Speaker 1: but it also gets you out there for people to see.
1467 01:24:46,400 --> 01:24:49,920 Speaker 1: And I don't think younger folks do that much.
1468 01:24:49,920 --> 01:24:51,880 Speaker 1: There are younger folks who do that, but I think
1469 01:24:52,240 --> 01:24:54,200 Speaker 1: there could be many more voices added to the mix.
1470 01:24:54,400 --> 01:24:57,080 Speaker 1: And our final question: what do you know about the
1471 01:24:57,120 --> 01:25:01,160 Speaker 1: world of psychology today that you wish you knew thirty
1472 01:25:01,240 --> 01:25:04,639 Speaker 1: years ago or so, when you were just beginning your career?
1473 01:25:05,479 --> 01:25:12,639 Speaker 1: Oh boy, that's an extremely interesting question. I sort
1474 01:25:12,680 --> 01:25:15,160 Speaker 1: of wish I had known what the trends were going
1475 01:25:15,240 --> 01:25:18,439 Speaker 1: to be in my field, because I've been around
1476 01:25:18,560 --> 01:25:21,000 Speaker 1: the block for quite a bit, and the
1477 01:25:21,000 --> 01:25:24,040 Speaker 1: reason I'm in psychology is because of these specific issues.
1478 01:25:24,479 --> 01:25:27,919 Speaker 1: They were at the forefront of psychology and social psychology
1479 01:25:27,960 --> 01:25:31,680 Speaker 1: at that point, and then it was really about misbelief,
1480 01:25:32,479 --> 01:25:35,000 Speaker 1: errors that people made, and so forth. That's sort of
1481 01:25:35,000 --> 01:25:38,320 Speaker 1: the foundation on which I built my career. Now, and
1482 01:25:38,760 --> 01:25:40,240 Speaker 1: by the way, what we weren't asked to do is,
1483 01:25:40,280 --> 01:25:43,040 Speaker 1: we weren't asked to solve those questions. The idea of
1484 01:25:43,120 --> 01:25:46,160 Speaker 1: nudging was several decades into the future, and now the
1485 01:25:46,200 --> 01:25:48,439 Speaker 1: field is very much about, okay, what do you do
1486 01:25:48,479 --> 01:25:52,439 Speaker 1: about it? And I'm a little bit behind the
1487 01:25:52,479 --> 01:25:55,519 Speaker 1: younger generation, because I didn't have to pay attention to it.
1488 01:25:56,200 --> 01:25:58,160 Speaker 1: And I wish I had known that at some point
1489 01:25:58,400 --> 01:26:00,479 Speaker 1: the field was going to get to the obvious question
1490 01:26:00,560 --> 01:26:04,719 Speaker 1: of, we have all this knowledge about what people do
1491 01:26:05,080 --> 01:26:07,800 Speaker 1: that is a mistake; how do you get people to
1492 01:26:08,240 --> 01:26:11,240 Speaker 1: avoid those mistakes or repair those mistakes, or how in
1493 01:26:11,240 --> 01:26:14,599 Speaker 1: general do you improve people's lives? Finally the field got
1494 01:26:14,640 --> 01:26:16,800 Speaker 1: to that. I wish someone had come to me and
1495 01:26:16,800 --> 01:26:19,040 Speaker 1: basically said, that question is going to be the question
1496 01:26:19,080 --> 01:26:21,880 Speaker 1: in the future, you should prepare. But you know, not
1497 01:26:22,000 --> 01:26:25,920 Speaker 1: too long ago, it wasn't really thought of as academics' job.
1498 01:26:25,960 --> 01:26:28,120 Speaker 1: It's like, hey, just tell us what the knowledge is
1499 01:26:28,160 --> 01:26:30,519 Speaker 1: and the policymakers will figure it out. That's absolutely right. It
1500 01:26:30,600 --> 01:26:33,439 Speaker 1: was going to be... that was going to be
1501 01:26:33,479 --> 01:26:36,599 Speaker 1: offloaded to somebody else. But it's finally come into the field,
1502 01:26:36,840 --> 01:26:40,120 Speaker 1: and I think in part because science does react to
1503 01:26:40,360 --> 01:26:45,360 Speaker 1: society. And now people are developing apps to
1504 01:26:45,400 --> 01:26:48,160 Speaker 1: do this, computer programs to do that, new technology that
1505 01:26:48,200 --> 01:26:50,719 Speaker 1: helps us do the other thing. So the idea that the endpoint
1506 01:26:50,920 --> 01:26:53,080 Speaker 1: is, how do you develop something that people can use,
1507 01:26:53,640 --> 01:26:56,439 Speaker 1: is much more in the heads of younger researchers than
1508 01:26:56,479 --> 01:26:59,080 Speaker 1: it is for older researchers. They think of that
1509 01:26:59,120 --> 01:27:03,240 Speaker 1: as a natural endpoint of research. And I
1510 01:27:03,280 --> 01:27:05,920 Speaker 1: should have been prepared for that
1511 01:27:06,320 --> 01:27:10,519 Speaker 1: shift in time. Quite interesting. Thank you, David, for
1512 01:27:10,520 --> 01:27:13,200 Speaker 1: being so generous with your time. We have been speaking
1513 01:27:13,560 --> 01:27:17,559 Speaker 1: with Professor David Dunning of the University of Michigan. If
1514 01:27:17,640 --> 01:27:20,679 Speaker 1: you enjoyed this conversation, well, look up an inch or down
1515 01:27:20,680 --> 01:27:23,360 Speaker 1: an inch on Apple iTunes and you can see any
1516 01:27:23,400 --> 01:27:26,960 Speaker 1: of the previous three hundred plus conversations we've had over
1517 01:27:26,960 --> 01:27:30,439 Speaker 1: the past five and a half years. We love your comments,
1518 01:27:30,479 --> 01:27:34,280 Speaker 1: feedback and suggestions; write to us at MIB podcast
1519 01:27:34,360 --> 01:27:38,280 Speaker 1: at Bloomberg dot net. Leave comments. What was I gonna say?
1520 01:27:38,360 --> 01:27:41,760 Speaker 1: Leave a review on Apple iTunes. If you want
1521 01:27:41,760 --> 01:27:45,120 Speaker 1: to see the daily reads that Professor Dunning referenced, you
1522 01:27:45,120 --> 01:27:47,960 Speaker 1: can find those at ritholtz dot com and sign
1523 01:27:48,040 --> 01:27:51,800 Speaker 1: up there. Check out my weekly column on Bloomberg dot com.
1524 01:27:51,840 --> 01:27:54,960 Speaker 1: Follow me on Twitter at ritholtz. I would be
1525 01:27:55,040 --> 01:27:57,720 Speaker 1: remiss if I did not thank the crack staff who
1526 01:27:57,760 --> 01:28:02,360 Speaker 1: helps me put together this conversation each week. Sam
1527 01:28:02,560 --> 01:28:07,000 Speaker 1: Chivraj is my producer slash booker. Michael Batnick is my
1528 01:28:07,120 --> 01:28:12,559 Speaker 1: head of research. Nick Falco is my audio engineer. I'm
1529 01:28:12,640 --> 01:28:16,160 Speaker 1: Barry Ritholtz. You've been listening to Masters in Business on
1530 01:28:16,240 --> 01:28:17,160 Speaker 1: Bloomberg Radio.