Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio. This week on the podcast, really, I have a super extra special guest. Everybody makes fun of me for saying that each week, but I have an extra special guest. I was fortunate enough to go to a dinner one night that Annie Duke was hosting, and each person at the table was more fascinating and accomplished than the next, from Michael Mauboussin and Josh Wolfe to Danny Kahneman. And at the end of the evening, one of the women at the table pulls me aside to discuss my interview with Michael Lewis, and that turned out to be Barbara Tversky, an experimental psychologist, author of hundreds of research papers, oh, and also the spouse of Amos Tversky. She told me how much she enjoyed my conversation with Mike Lewis, and we started chatting, and it took me, I don't know, maybe four seconds to say, oh my god, this woman is fascinating and I have to sit down and have a conversation with her. But she's back and forth between Stanford and Columbia, and it took us a while to home in on a time, and I'm really glad we did. She wrote this fascinating book on how the brain works, how we perceive things, whether it's language or spatial perception, and why action shapes thought and how motion impacts cognitive processes. It's not dry and clinical; it's really a very fascinating, abstract conversation. And we just babbled, at least I babbled, for two hours. It really was an intriguing conversation. If you're at all interested in cognitive psychology, how the brain works, the way language affects thought and vice versa, the way thought affects language, as well as her nine laws of cognition, you're going to find this to be absolutely fascinating. So, with no further ado, my conversation with Barbara Tversky.
Speaker 1: This is Masters in Business with Barry Ritholtz on Bloomberg Radio. My extra special guest this week is Barbara Tversky. She is a professor of psychology at Stanford University.
Speaker 1: She is also a psychology and education professor at Teachers College at Columbia University. She has published more than two hundred articles on cognition, psychology, memory, all sorts of fascinating topics. Her new book is called Mind in Motion: How Action Shapes Thought. Barbara Tversky, welcome to Bloomberg.
Speaker 2: Thank you. I'm happy to be here.
Speaker 1: So let's talk a little bit about cognition and psychology. How did you find your way to that field?
Speaker 2: I'm a bit of a contrarian, and when I entered the field, the cognitive revolution was well in progress, so you could open up the mind, if you were clever, and find ways of revealing how people thought. But thought, at that time, was heavily dominated by language, by propositional thinking that came from philosophy and linguistics, and people thought at that time that the way we thought about the spatial world, the visual world, was by putting it into propositional format.
Speaker 1: Explain what that means.
Speaker 2: So a proposition, in philosophy, is a minimal statement, like "the cup is round" or "the desk is flat." They're minimal statements where you attribute something to something else. And it felt to me like you could never begin to describe faces that way. We're very bad at describing faces. Emotions are difficult to describe, fairly easy to detect. Spaces that we're in are hard to describe. So thinking about reducing that to propositions, to these simple, minimal statements, didn't make sense to me. It made sense to me that the spatial world and the visual world had their own logic, and that logic came first evolutionarily, because, you know, babies don't talk. It takes them a while to talk, and even when they talk, everything sounds like "baba," "banana," "bottle," and they aren't saying very deep things for a long time. But they do very intelligent things. Animals do very intelligent things without speech.
Speaker 2: So it felt to me that, if anything, the spatial, visual world preceded language, evolutionarily and in development, and had a richness of its own that needed to be explored independent of language.
Speaker 1: So let's talk about language for a second, because it's funny: I first wrote this question as "how did people think before language?" And then I kind of said, well, that's the wrong way to think about it, based on some of the things you wrote in the book. The better way to ask that question is, how has language changed the way we think?
Speaker 2: That's a great question, and it's one that people grapple with, and, you know, people want simple answers. It is probably complicated. Let me start first with how we speak. A lot of the way we speak about thinking is as if thoughts were actions on objects, and I think that kind of thinking about acting on objects got internalized into thinking about thoughts as objects. So we raise ideas, we push them forward, we tear them apart. All of those are the ways we talk about thinking, and it's as if we were handling actual objects. So I don't think that's metaphor; I think we have no other way of talking about thinking except as if there were actions. Those are things we do with our hands, with our feet. We go from place to place, and our thoughts go from idea to idea along conceptual paths, the way our feet go from place to place along spatial paths. And it now turns out that the same brain structures in humans are coding both of them: they're coding both our paths in actual space and our paths in conceptual space.
Speaker 1: Explain that a little bit. When you say that we're coding in conceptual space when we go from thought to thought, are you suggesting our thought processes are somewhat predestined, that we all have the same approach to solving the same problems, more or less, at least structurally? What do you mean by that?
Speaker 2: It's again a great question.
Speaker 2: It's really associations, just the way you'll walk from A to B in a different way than I might walk from A to B, for different reasons, but it's still A to B, right. And some of the problem with thinking is we don't know where B is, right; we start off with A, and we're trying to solve a problem, maybe, or we're just letting our minds wander. No, I think, if anything, if you look at human beings, we're incredibly diverse and think in different ways, and our associations are built from experience, from many different experiences. So thoughts go from association to association, and those associations are going to be wildly different. I'll give you a small example. When my husband and I used to walk the streets in cities (my husband, of course, being Amos; this is years ago, because unfortunately he hasn't been with us for all too long), we would walk in Paris, in New York, places where we were tourists at the time, and when we'd come back, he would say, did you see all those prostitutes? I didn't pick up a single one. And, you know, I was looking at other things, the architecture, maybe what people were wearing. I wasn't seeing that.
Speaker 1: But you also noticed, and I picked this up from the book, the way the structure of the cities was laid out. You talk about how Japan is very confusing to foreigners, but they break cities into quadrants, and those quadrants are pretty consistent from Japanese city to Japanese city. It becomes very helpful if you know the code, but for anybody who doesn't, it's just a perplexing mess.
Speaker 2: Yeah. And that again is illustrating different ways of thinking and different ways of designing. But both are designed, and I think one of the points I'm trying to make in the book is that we design the world the way we design our minds.
Speaker 2: So both Japan and the US: New York with its grid system, and many other cities with grid systems; even the Romans had grid systems, and some Chinese cities have grid systems. In fact, Japan has grid systems; they just label them differently from the way we do. But those designs go across cultures, with variations, and they have to do with the way we design our minds.
Speaker 1: So let me ask you this question, since we're talking about visual cognition. How does our sense of spatial understanding and visual cognition affect the way we think?
Speaker 2: So, how to answer that? One way is things that I've already said: we think about actions on ideas the way we think about actions on objects, and our thoughts go from idea to idea the way we go from place to place. That probably affects the structure of language; that's my hunch, and I haven't worked that out very well. You asked earlier how language bootstraps our thinking, and I think language is one of many cognitive tools that we use. We can then use language to reason about other things, the way we can use math to reason about other things, or logic, or computer programming. We have a number of these cognitive tools that we have built through culture. They're culturally evolved, we're not born with them, and they enable leaps of thought. We now have computers that help us think and help us design, and so we've developed a number of these cognitive tools that help us structure our thinking and leapfrog our thinking; we can use them to think farther. So I don't need to compute square roots anymore; I have them on my calculator. I can use that to leapfrog and go to other levels of understanding.
Speaker 1: So, last question on this topic: what is cognitive collage? I love that phrase.
Speaker 2: Okay, thank you.
Speaker 2: So we did some work on how people understand the environments that they walk through every day, and there are a number of distortions that people have that indicate that people are constructing their images of environments. So, the grid pattern: we tend to line things up in parallel and perpendicular lines and ignore diagonals, and ignore five-sided things like the Boston Common; people often act as if it were four-sided, a kind of quadrilateral, not a pentagon, when it really is a pentagon. So we distort. Some of the distortions are really quite remarkable. The most remarkable is probably that people think that distances to a landmark are smaller than distances from a landmark.
Speaker 1: Explain that, because that sounds so obviously incorrect.
Speaker 2: Exactly. So a number of studies have been done where people on college campuses pick landmarks (they have the students pick the landmarks), and then another group of students is asked about distances, and you get those asymmetric distances. The example I use is that people think that Jacques's house is closer to the Eiffel Tower than the Eiffel Tower is to Jacques's house. And one explanation for that might be that the Eiffel Tower defines a neighborhood. Columbus Circle is a neighborhood, the Village is a neighborhood. We tend to form neighborhoods around these landmarks, and if someone says where do you live, you can say I live near Columbus Circle, and that tells people the environment. And so the neighborhood is quite expanded; it includes Jacques's house, but Jacques's house only includes itself, so you get that asymmetry. This is research I didn't do; I pointed to it, but it's very related to research that my husband had done earlier (this is going way back in history) showing that people think that North Korea is more like communist China than communist China is like North Korea, or the son is more like the father than the father is like the son.
Speaker 2: So we have these landmarks, or prototypes, and they draw similar things into them. Eleanor Rosch had earlier done work showing that people think magenta is more like red than red is like magenta.
Speaker 1: I could see that.
Speaker 2: Yeah. They're defining categories, really, and these peripheral instances don't. So those are spatial ideas. The distance, whether it's distance in color, or distance in political leanings, or distance in some kind of psychological or physical similarity, the son to the father, those are all spatial concepts, and we can show them in space, too. But they are affecting our thinking. We're spatializing those concepts, North Korea, red and magenta, communist China, and thinking about other things in relation to them, but in relation to spatial distance. So everywhere I look, I can find spatial distortions that are reflected in conceptual ones. In-group, out-group would be one. We tend to think that if we're of one political leaning, everybody of another political leaning is all alike. My group is highly differentiated, but that's because it's close to me and I can see the differences. I don't see the differences in those others, so we tend to think of them as all alike, but our group as differentiated.
Speaker 1: That's quite fascinating. One of my favorite parts of the book was the nine laws of cognition, and I have to ask you: how did you develop nine rules, and how long have you been working on these?
Speaker 2: You know, I started writing the book and I realized that certain of the things that I was saying, again, many of them about space, had generality to thinking in general or behavior in general, and I wanted to point to those and encapsulate them in a way. Then it became laws. And I have a good friend who comes from medical science, a very hard-nosed researcher. This was after the book was written, when I couldn't change anything.
Speaker 2: She was confused by that, because she was thinking of physics laws, which are laws; you can compute them and get answers that will in fact hold. These are not like that. They're generalizations, and I think that fits with the way the social sciences see laws, as kinds of generalizations.
Speaker 1: Okay, let's go over some of these generalizations. We'll start with rule number one: there are no benefits without costs, meaning creativity versus learning. How do they offset each other?
Speaker 2: Interesting, and I mean, it's a nice example. So I think, again, the mind is quite simple; we want simple, straight answers. Things are always this, and things are always that. And we categorize because the number of things in the world is huge. If we had to think about each chair individually, each cup individually, it would be overwhelming. So we categorize things as chairs and tables and dogs and cats, and that enables us to know how to recognize them and how to behave toward them. Now, we can miscategorize, and that happens, often tragically, in certain situations, but we forget that we need those categories on some level, because we have to behave very quickly and we have to respond very quickly. Is someone throwing something at me to hurt me, or is someone throwing something at me so I can catch it? Our behavior is going to be very different.
Speaker 1: So when you say costs versus benefits, sometimes the trade-off is speed versus accuracy, or making a defensive decision because it's a matter of existential survival versus, hey, I may not be accurate here, but better safe than sorry? Is that what you mean by the costs of these decisions?
Speaker 2: Well, I mean, it's a cost-benefit. I think economists understand that everything is a trade-off. I'm not sure that psychologists understand that as well, so they'll say people miscategorize; somebody had a toy gun, not a real gun, so we shouldn't be categorizing at all.
Speaker 2: And I want to say we have to; we do. It seems to be the way we think.
Speaker 1: Occasionally with tragic consequences.
Speaker 2: Exactly right, and missteps. So, on creativity and learning: we have to learn routines to get through the day, because otherwise, if everything is a new problem, it's going to take too long. How do we get a key in a door? How do we make toast in the morning? So we have to get into those routines. But once we have those routines, it's hard to change them. So creativity requires thinking in new ways.
Speaker 1: Meaning outside of the routines.
Speaker 2: Exactly, and thinking of a new way to build a teacup or a new way to design a chair. These are what designers do. A new way to design a school; maybe the old way of designing schools isn't as good as it could be. Libraries have changed enormously, now that in a little while there'll be no paper books. I mean, I'm happy they're still hanging on; they are hanging on, and I'm glad to see it, because I'm a fan. So in order to design, we have to get rid of those old ways and think in new ones. And we just finished an experiment asking people to think of new ways to use old things, and the good answers come at about the ninth answer.
Speaker 1: Meaning it takes that long before they overcome their natural tendency toward routine.
Speaker 2: Exactly.
Speaker 1: Quite interesting.
Speaker 2: What I think is also interesting is the way we got people to generate new ideas for these old things. How do you use an umbrella in creative ways? Well, it can be sticks to hold kebabs, but that was the ninth idea, right, and it's clever and cute. The way we got people to think creatively was to ask them to think about different roles of people. How would a doctor use this? How would a gardener use it? So we asked people to put themselves in the mindsets of other professions. And professions are something we know a lot about, and have since we were three.
Speaker 2: People ask us what we want to be when we grow up, and we interact with people in different roles, so we know a lot about what those roles do, and that helped people generate new uses.
Speaker 1: So there's a television show that focuses on improv called Whose Line Is It Anyway?, and as you were speaking, I could only think of the segment they do with props, where they give each group a different set of props, and it's amazingly creative. And it seems some people really have a skill set for applying these in unexpected ways.
Speaker 2: Yeah, and improv is exactly the right example for that kind of creativity. I have a former graduate student who talks about art making (she's an artist, and an excellent one) as improvisational, and you have to keep your mind open to new ideas. And I think you're right, you can develop really good skills for doing it.
Speaker 1: Let's go to another rule, number three: the mind can override perception. Really, cognitive dissonance is the way I was looking at that. If you're perceiving something and you're not going to believe it, how is it that we ignore what's in front of our very eyes if we're not happy with what we're seeing?
Speaker 2: Well, I don't know if it's not being happy; it's that it doesn't fit our hypotheses.
Speaker 1: So that's really more confirmation bias than anything, or disconfirmation. We don't want to see that which disconfirms our existing beliefs.
Speaker 2: Exactly. And one of the early studies that was done on that was by Jerry Bruner, an old friend, and Molly Potter. They showed out-of-focus photographs of odd things, like a fire hydrant at an odd angle, and they gradually brought them into focus and asked people to keep guessing what they were, and compared them to a group that saw them in focus.
Speaker 1: So with these odd angles, like an odd angle of a fire hydrant, people came up with wild hypotheses, and when it was in full focus they couldn't identify it, because they had already anchored to that previous guess. Then let me go to my favorite question, my favorite rule: the mind fills in missing information. I have a pet theory that we're walking around with a model of the universe in our head that's just wildly wrong, and that it's mostly misinformation. I'm curious what your thoughts are on the mind filling in missing information. Is it just little patches behind our vision that get filled in, or is our worldview, our model of the universe, completely wrong?
Speaker 2: Well, I don't know that I would say it's one or the other; it's something in between. But sure, we're filling in all the time. There are some lovely experiments on our worldview now, where you show a photograph and then another photograph where something has changed, like even an engine is off of a jet plane, and show them in rapid succession, and people think they've seen the whole scene, but they cannot identify what's changed. You can find them online; they're wild. So you don't know what's changed, but you know you've seen a jet plane, people going up on it, cargo being loaded in the background. You get this feeling that you see a rich scene, but it's because we're refreshing it all the time, internally refreshing it. And I'm sure we're filling in the gaps.
Speaker 1: So if, in a series of photos, in one of them the engine is missing from the plane but it should be there, do we visually fill that in in our own minds?
Speaker 2: I think we just don't even notice that it's missing. We see "airplane"; we're coding it on that level and adding those details. This is an example from Scott McCloud, who wrote a brilliant book on comics. He says, you know, someone is sitting at a desk; you can't see their legs because the desk is covering them up, but you know they have legs.
Speaker 2: So we're filling in in that way. We fill in in language; we fill in missing information all the time. We're guessing, and usually it's right, because we've learned those contingencies in the world.
Speaker 1: I think almost everybody has at some point experienced an argument with someone they're close to who, say, misinterprets what they're saying emotionally, and that can lead to a disastrous escalation of an argument: I mean, you were angry; no, I wasn't angry, I was sad. And we get to those sorts of impasses. So I have to share something with you, because I had an experience with filling in that was just astonishing, and it stayed with me. Many, many years ago I would occasionally ride a motorcycle, and when the time came to get the motorcycle license, the state requires you to go through this training program, most of which, if you're an experienced rider, you don't need, but they fill it in with a lot of safety things. And the one thing that stayed with me, probably why I don't ride motorcycles anymore, is they wanted to explain to you how limited your field of vision is. When you're looking straight ahead, you have about a three percent range of vision, and everything around you is more or less a reasonable guess, your brain constructing a model. But that means if something enters that field and you're not aware of it, it's a danger. And the way they showed this to you was they put us in a room, a regular rectangular room, and you stand at one wall, and a person stands directly opposite you, about twenty feet away, and then in your peripheral vision, on your same wall in the corner, someone holds a reasonable-size playing card, and they walk along the wall toward the person opposite you, and you have to guess when you can identify the card as either red or black, and then when you can identify the actual suit and number of the card.
Speaker 1: And it's not like a regular four-inch deck of playing cards; they're like big magician's cards, eight or ten inches. And I assumed that I would be able to identify it pretty rapidly, maybe thirty or forty-five degrees off, and I was shocked to learn that it's almost dead on; instead of straight across from you, maybe it's a hundred and twenty degrees before you can identify just the color. You couldn't even say whether it's a club or a spade; you could only say the card is black, and then it has to come a little closer before you can identify it. They're practically dead opposite you, while you're staring straight ahead, before you can identify the card. It was shocking how little acuity you have outside of straight ahead of you. In your peripheral vision you can see images, you can see rough shapes, but there's no specificity at all.
Speaker 2: It's a beautiful demonstration, and we should all have it.
Speaker 1: Shocking, just absolutely shocking. So, we discussed cognitive collage earlier, and I'm fascinated by that concept. The way we put together our models of the world: how much of that is based on what we visually perceive in reality, and how much of it is based on what we're creating to fill in the holes?
Speaker 2: It's going to vary. And what I liked about the collage metaphor is that it's multimedia. If you go back and look at Picasso and Braque, they put newspaper clippings in and paintings in and all kinds of things. And again, people think, or many researchers thought, that our views of our environment are more or less veridical, and I think our research and the research of many other people shows it's filled with small biases that aren't coherent. You try to put them together, different perspectives on the world, different landmarks, and you don't get anything that would work in a Euclidean space.
Speaker 2: And in addition, it's multimodal. So if I'm wandering around New York or another city, there are some things I know from language, that I need to go four blocks this way and turn; some things I know from recognizing the world; some from my recollections of maps that I've seen. So I'm gathering information from many different places to decide: is the entrance to the Bloomberg building going to be on Lexington or on Fifty-ninth, and how do I find it? And I'm making those decisions that way, balancing information gathered from many sources. It's not a coherent system, and I do think that's a model for all kinds of judgment. The context is going to determine what information is salient and what information isn't, what I'm bringing up now, what I'm bringing up in other cases. So I do feel that this cognitive collage idea is really a model for the way we make judgments in many situations. If you think about what's in the brain, there aren't calculations in the brain, there aren't maps in the brain, there aren't photographs of people; it's all neurons. We use terms like language and spatial representations and images of faces and so forth as ways of talking. And in fact there are places in the brain that are dedicated to recognizing faces or scenes, and even rudimentary concepts of number; there are places that are activated. But in the end it's neurons, and these are ways of talking. And again, the idea that we gather information from all over the cortex to make a judgment, whatever seems relevant, seems to me a model not just for space but for all judgment.
Speaker 1: So you reference the comparison of people using either verbal or visual thinking. But, and maybe this is the American schooling system, I tend to think about the way different people approach the world in either verbal or mathematical thinking, or at least maybe that's what we do with kids coming out of school: he's a numbers person, or she's a language person.
Speaker 1: How did you come up with the dichotomy between verbal and visual, and are there any parallels in academia, where there's a tendency for the math and science people to go this way and the literature and language people to go that way?
Speaker 2: Right. So people think of themselves as visual thinkers or as verbal thinkers or computational thinkers, or they think kinesthetically, meaning in terms of movement, the way dancers might think. Those are, again, ways of talking; they don't have a lot of evidence behind them. Even spatial or visual thinking turns out to be quite complicated; it's many different features. Verbal, too. We know people who can produce words but can't think straight, and vice versa, people who find it hard to come up with words but think very logically. So verbal abilities are quite varied and spatial abilities are quite varied. The bad news is you can be good at both or bad at both; it's not that one compensates for the other, though to some extent they clearly compensate. I recently had the wonderful opportunity, because of a talk I needed to give, of delving into Leonardo, who is by all accounts one of the most brilliant thinkers of all time. He thought visually, spatially, and he thought through sketching, and he used sketches as a way of understanding dynamic processes, not just static ones, even though sketches are static. In fact, he used the way he drew as a way of understanding the way vortices happen in water, and he used many different perspectives. So he was very much a visual thinker, but he was able to get to enormous abstractions through that visual-spatial thinking, and there wasn't much math then.
Speaker 1: Quite interesting. One of the things you talk about in the book is our hands expressing our thinking. New Yorkers are notorious with their hands, big, big hand gestures. But what is the significance of gestures to cognition?
Speaker 1: How important is it? You can typically understand what someone is saying on the radio regardless of what their hands are doing. That said, it's helpful to try and express certain ideas with your hands as you speak. Why is that?
Speaker 2: Yeah, I know, it's lovely. And I think what you're saying is that we can get information just through hearing, and we can get it just through reading; human beings are enormously adept at learning from different media, and what we don't get, we fill in. But face-to-face conversation does involve gesture, and we did a number of experiments showing that the way people gesture when they're explaining something changes the thought of other people.
Speaker 1: Really?
Speaker 2: Yeah.
Speaker 1: So depending on what you're doing with your hands, it's not necessarily for the speaker, it's for the listener?
Speaker 2: Both. So think of cyclical thinking, going from a seed to a flower back to a seed. If we ask people just to represent that, they tend to represent lines, not cycles. But if we gesture in a circle, they'll put down circles when we ask them to put something on paper. We have some other examples of that, but I think the most striking ones are gestures for yourself. We put people in a room, alone, not talking to anybody. They're reading a complicated description of a space, locating, say, eight landmarks in a larger space. It's new to them and they're going to be tested, so they have to learn it. And if you watch them studying, they're looking at the screen and their hands are making a map: they're drawing lines on the table for the paths, and points, with emphasis, for the landmarks. When they do that, they're more likely to be correct on the exam. And if we tell them to sit on their hands when they're reading, they do worse.
Speaker 1: Really. So the process of emoting, or maybe that's the wrong word, of cognitively expressing what they're learning through their hands helps them learn and helps them retain it.
Speaker 2: So, if you think about it, language is arbitrary, and it's very hard to understand a spatial description. What I can do with my hands is model the environment, and that's what I do: I turn the words into a model with my hands, so my hands are representing the information in the description.
Speaker 1: And when you say words are arbitrary, you specifically make a point in the book that most words are completely arbitrary, with a handful of onomatopoetic exceptions. And that's true: for, you know, a cup, there's a different word for it in every language, none of which sound like the word cup, or like whatever sound this physical object would make, if it made a sound. Right?
Speaker 2: Right. And so we find that with environments. We also find it in teaching people how a car brake works; they model it with their hands. What's again extra interesting about that is that the hands are representing the information. Language also represents the information, and a sketch will represent the information. So there are many different ways of representation: one is abstract, one is physical, and one is attached to us, actually part of our body. And what I try to argue is that this kind of spatial thinking is more direct. I'm expressing it through a diagram or expressing it through my hands; it's a direct representation of the knowledge. So even if I'm explaining, if I'm talking about a situation where people are arguing or whatever, I use "on the one hand, on the other": I've created a diagram. I've put all the things that go with one hand in one space, and the things that go with the other in another space. If I talk about people rising in the corporate world, I'm going to use my hand to go up.
So I'm 572 00:38:44,640 --> 00:38:49,279 Speaker 1: illustrating all those ways of thinking with my hands, and 573 00:38:49,760 --> 00:38:54,640 Speaker 1: it helps you understand and it helps me express. Quite interesting. 574 00:38:54,960 --> 00:38:58,640 Speaker 1: Let's talk a little bit about the project that Michael 575 00:38:58,719 --> 00:39:03,279 Speaker 1: Lewis did, the book he wrote called The Undoing Project, and 576 00:39:03,360 --> 00:39:07,840 Speaker 1: he very specifically said, without you, there would be no 577 00:39:08,000 --> 00:39:12,520 Speaker 1: book, no Undoing Project. First question, and the book is about 578 00:39:12,600 --> 00:39:16,920 Speaker 1: Danny Kahneman and his partner Amos Tversky, who was your husband. 579 00:39:17,239 --> 00:39:19,319 Speaker 1: They worked for many years together in Israel and then 580 00:39:19,360 --> 00:39:22,240 Speaker 1: came here to the United States. I get the sense 581 00:39:22,280 --> 00:39:25,120 Speaker 1: from the book that in the beginning you were a 582 00:39:25,120 --> 00:39:30,080 Speaker 1: little reluctant to participate. Is that a misinterpretation, or were 583 00:39:30,120 --> 00:39:33,120 Speaker 1: you ready to jump in with both feet right from 584 00:39:33,120 --> 00:39:38,960 Speaker 1: the beginning? No, I wasn't reluctant. Danny Kahneman and Michael 585 00:39:39,000 --> 00:39:41,799 Speaker 1: had struck up a friendship. They lived pretty close to 586 00:39:41,800 --> 00:39:45,200 Speaker 1: each other at the time. Danny no longer has 587 00:39:45,239 --> 00:39:48,680 Speaker 1: a home there. And the way Michael tells the story 588 00:39:48,760 --> 00:39:51,640 Speaker 1: of how they met, Michael did go to business school, 589 00:39:51,719 --> 00:39:56,800 Speaker 1: but he went too early to have learned Kahneman and Tversky, 590 00:39:57,400 --> 00:39:59,879 Speaker 1: and he came to it quite late, and somewhat 591 00:40:00,000 --> 00:40:04,520 Speaker 1: indirectly. He'd written Moneyball, an amazing book, and 592 00:40:04,640 --> 00:40:08,160 Speaker 1: Dick Thaler and, I think, Cass Sunstein wrote a review 593 00:40:08,200 --> 00:40:11,120 Speaker 1: in The New Republic, and they said the book is great 594 00:40:11,160 --> 00:40:14,240 Speaker 1: and all that, but Michael needs to know about Kahneman 595 00:40:14,280 --> 00:40:20,760 Speaker 1: and Tversky to understand why coaches and scouts were misled, 596 00:40:20,760 --> 00:40:25,279 Speaker 1: and Billy Beane. So Michael at that time and now 597 00:40:25,480 --> 00:40:28,120 Speaker 1: is living in Berkeley. One of his closest friends is 598 00:40:28,680 --> 00:40:33,360 Speaker 1: Dacher Keltner, who was a graduate student in social psychology 599 00:40:33,360 --> 00:40:37,640 Speaker 1: at Stanford and actually TA'd for my husband. As Michael 600 00:40:37,640 --> 00:40:41,920 Speaker 1: tells the story, over beer Michael asked Dacher about this 601 00:40:42,000 --> 00:40:45,759 Speaker 1: work, and Dacher says, sure, Danny lives up the hill, 602 00:40:45,920 --> 00:40:50,719 Speaker 1: I'll introduce you. So Danny was willing, in his generous 603 00:40:50,800 --> 00:40:54,440 Speaker 1: way, to talk about the work, explain it to Michael, 604 00:40:54,480 --> 00:40:57,440 Speaker 1: because Michael realized he needed to know about it: if 605 00:40:57,480 --> 00:41:00,360 Speaker 1: he was interested in statistics, he needed to know 606 00:41:00,480 --> 00:41:04,160 Speaker 1: about how people misuse them.
So I think Danny said, 607 00:41:04,160 --> 00:41:07,520 Speaker 1: if you walk with me, we'll talk. Because Danny walked, 608 00:41:07,840 --> 00:41:11,440 Speaker 1: and they gradually I think struck up a friendship. And 609 00:41:11,600 --> 00:41:14,440 Speaker 1: I think Michael tells it that he got the idea 610 00:41:14,440 --> 00:41:18,920 Speaker 1: of writing a book about their friendship, about Danny and 611 00:41:19,120 --> 00:41:23,320 Speaker 1: amesis friendship. And Danny came to me and he said, 612 00:41:23,760 --> 00:41:26,800 Speaker 1: Michael wants to write a book and if he doesn't 613 00:41:26,840 --> 00:41:30,719 Speaker 1: do it, somebody else will. And Michael likes us, and 614 00:41:31,239 --> 00:41:34,040 Speaker 1: he named another person who was waiting in line and 615 00:41:34,080 --> 00:41:38,880 Speaker 1: said that person doesn't like us. So I said, Danny, whatever, 616 00:41:38,920 --> 00:41:43,359 Speaker 1: I trust you completely. Whatever you think, I'll go along with. 617 00:41:43,719 --> 00:41:46,759 Speaker 1: So the side part on that story is, I think 618 00:41:46,800 --> 00:41:49,799 Speaker 1: for a year or two Michael taught a course in 619 00:41:50,120 --> 00:41:55,240 Speaker 1: finance journalism at the Berkeley School of Journalism. One point 620 00:41:55,239 --> 00:41:58,560 Speaker 1: he opened it up to be school students. Our oldest son, 621 00:41:58,840 --> 00:42:02,560 Speaker 1: Oran was a student has at that point and took 622 00:42:02,600 --> 00:42:06,200 Speaker 1: Michael's course and they hit it off, and he obviously 623 00:42:06,239 --> 00:42:09,080 Speaker 1: figured out who Oran was. He didn't figure it out 624 00:42:09,200 --> 00:42:12,319 Speaker 1: at all. Really, he didn't figure it out at all. 625 00:42:12,600 --> 00:42:17,680 Speaker 1: And I don't know exactly he never put that together, 626 00:42:17,680 --> 00:42:20,759 Speaker 1: didn't put it together. And I don't remember if that 627 00:42:20,880 --> 00:42:24,000 Speaker 1: was before or after money well, but he didn't put 628 00:42:24,040 --> 00:42:28,080 Speaker 1: it together. And the upshot was that it was Oran 629 00:42:28,160 --> 00:42:32,200 Speaker 1: who introduced me to Michael, which is sweet over email, 630 00:42:32,840 --> 00:42:36,360 Speaker 1: which I mean, I think Michael asked Oran Michael. He 631 00:42:36,560 --> 00:42:39,440 Speaker 1: ended up talking a great deal with my three children 632 00:42:40,000 --> 00:42:42,719 Speaker 1: and ended up being quite fond of them, which of 633 00:42:42,760 --> 00:42:45,840 Speaker 1: course warms the mother's heart. So how did you and 634 00:42:46,560 --> 00:42:49,680 Speaker 1: Michael collaborate when he was doing the research part of 635 00:42:49,680 --> 00:42:52,839 Speaker 1: the book. I don't know if his collaboration. Michael sat 636 00:42:53,080 --> 00:42:58,640 Speaker 1: in my office at Stanford going through Emmessis papers. He 637 00:42:58,680 --> 00:43:03,719 Speaker 1: really does his search beautifully, complete concentration, and he would 638 00:43:03,760 --> 00:43:07,000 Speaker 1: ask me questions and I would answer them. If something 639 00:43:07,120 --> 00:43:10,319 Speaker 1: was in Hebrew, I'd try to translate it for him. 640 00:43:10,440 --> 00:43:14,319 Speaker 1: He asked me questions by email, and I answered them 641 00:43:14,360 --> 00:43:17,200 Speaker 1: at great length, and that was I think an easy 642 00:43:17,280 --> 00:43:20,200 Speaker 1: way for us to communicate. 
At one point, you know, 643 00:43:20,280 --> 00:43:23,160 Speaker 1: I came to Israel a young bride, in the middle of 644 00:43:23,160 --> 00:43:26,440 Speaker 1: graduate school, with no Hebrew whatsoever. Where did you go 645 00:43:26,480 --> 00:43:29,400 Speaker 1: to Israel from, from the US? From the US, from graduate school. 646 00:43:30,360 --> 00:43:36,719 Speaker 1: That was the sixty-seven Six-Day War, right, 647 00:43:36,920 --> 00:43:40,040 Speaker 1: and Amos was drafted on the twenty-second of May, 648 00:43:40,120 --> 00:43:44,839 Speaker 1: and the war broke out ten days later. And 649 00:43:44,880 --> 00:43:48,040 Speaker 1: by then I'd learned enough Hebrew that I could understand 650 00:43:48,120 --> 00:43:51,880 Speaker 1: what the chief of staff, Rabin, said on the 651 00:43:52,080 --> 00:43:55,600 Speaker 1: ten o'clock news. So that's a longer story. I can 652 00:43:55,640 --> 00:43:59,440 Speaker 1: tell it. And in fact, Michael asked about that, and 653 00:43:59,480 --> 00:44:03,560 Speaker 1: I told him in great detail. Turn on your novelist's eyes, 654 00:44:03,760 --> 00:44:06,880 Speaker 1: what's it like to be an American coming to Israel 655 00:44:07,440 --> 00:44:10,120 Speaker 1: in the sixties? So, yeah, I wrote him 656 00:44:10,120 --> 00:44:13,799 Speaker 1: at great length what that was like. And later I 657 00:44:14,000 --> 00:44:17,840 Speaker 1: found letters, I think my brother sent them, I found 658 00:44:17,920 --> 00:44:21,440 Speaker 1: letters that I'd written my parents. Everything that I remembered 659 00:44:21,640 --> 00:44:26,399 Speaker 1: was correct, which is astounding. You write about memory, which 660 00:44:26,440 --> 00:44:28,880 Speaker 1: we know is not only fallible, but every time we 661 00:44:28,880 --> 00:44:32,640 Speaker 1: recall an event, we're reconstructing the event, and memories are 662 00:44:33,440 --> 00:44:37,399 Speaker 1: essentially replaced by a series of bad carbon copies. So 663 00:44:37,480 --> 00:44:40,080 Speaker 1: it's nice when something is that vivid 664 00:44:40,120 --> 00:44:43,560 Speaker 1: and you remember it accurately. Maybe it's because it was 665 00:44:43,640 --> 00:44:47,680 Speaker 1: so vivid, or that I retold it. I mean, 666 00:44:47,719 --> 00:44:51,920 Speaker 1: memories start getting distorted when you use language, because 667 00:44:51,960 --> 00:44:54,480 Speaker 1: they don't happen in language. I mean, they might happen 668 00:44:54,520 --> 00:44:58,000 Speaker 1: in part, but they get distorted from the get-go, 669 00:44:58,320 --> 00:45:02,240 Speaker 1: from your perception and so forth. I think I helped 670 00:45:02,280 --> 00:45:05,799 Speaker 1: him that way, answering his emails. I gave him 671 00:45:05,800 --> 00:45:08,799 Speaker 1: a long list of people he might want to speak with, 672 00:45:08,840 --> 00:45:12,879 Speaker 1: both in the US and people who knew Amos well, 673 00:45:12,960 --> 00:45:15,920 Speaker 1: people like Ken Arrow. He was a close friend 674 00:45:16,320 --> 00:45:19,800 Speaker 1: and stayed a close friend of mine for many years 675 00:45:19,840 --> 00:45:24,200 Speaker 1: after Amos died. Kenneth stayed a close friend.
So, 676 00:45:24,280 --> 00:45:27,799 Speaker 1: Kenneth was one of the early Nobel Prize winners in economics, 677 00:45:27,800 --> 00:45:31,280 Speaker 1: and he quickly bought into the work; it was clear 678 00:45:31,320 --> 00:45:33,920 Speaker 1: to him that the work was right from the get-go, 679 00:45:34,120 --> 00:45:38,160 Speaker 1: in contrast to many other economists. So I sent him 680 00:45:38,200 --> 00:45:40,759 Speaker 1: to a whole set of people that I thought 681 00:45:40,880 --> 00:45:44,320 Speaker 1: might give him a picture of Amos, because he knew 682 00:45:44,440 --> 00:45:48,279 Speaker 1: Danny well, but he didn't know Amos. I sent him 683 00:45:48,320 --> 00:45:51,719 Speaker 1: to many people, friends of ours in Israel. He 684 00:45:51,840 --> 00:45:55,560 Speaker 1: went to Israel three or four times and met all of them. 685 00:45:55,640 --> 00:45:59,600 Speaker 1: I mean, in Michael's research he met my sister-in-law, 686 00:45:59,800 --> 00:46:04,759 Speaker 1: my niece, and talked with them. He was extraordinary, 687 00:46:04,800 --> 00:46:10,319 Speaker 1: amazing, and he got Amos in many ways. I mean, 688 00:46:10,320 --> 00:46:13,319 Speaker 1: there are errors in the book, but some people were 689 00:46:13,360 --> 00:46:16,560 Speaker 1: disturbed by the portrait of Amos as only being interested 690 00:46:16,600 --> 00:46:20,600 Speaker 1: in his work, because he really was helpful in every way, 691 00:46:20,760 --> 00:46:24,520 Speaker 1: both personally to people and in the departments in which 692 00:46:24,600 --> 00:46:28,759 Speaker 1: he participated in the university. He was a super good citizen, 693 00:46:29,000 --> 00:46:32,160 Speaker 1: so that he was single-minded about his work isn't 694 00:46:32,239 --> 00:46:35,359 Speaker 1: quite right. That's hard to depict when you're going back 695 00:46:35,400 --> 00:46:37,920 Speaker 1: twenty-five years later, I would imagine, when you didn't 696 00:46:37,920 --> 00:46:39,920 Speaker 1: know the guy, and you know, all of us are 697 00:46:39,960 --> 00:46:45,200 Speaker 1: complicated and we're different with different people in different situations, 698 00:46:45,480 --> 00:46:49,440 Speaker 1: and he probably caricatured Danny, and then you're capturing them 699 00:46:49,440 --> 00:46:53,000 Speaker 1: at a particular point in time and we're always changing. 700 00:46:53,320 --> 00:46:57,000 Speaker 1: I mean, the book's a great story, and what impressed 701 00:46:57,000 --> 00:47:01,239 Speaker 1: me is that Michael learned the work, and he 702 00:47:01,600 --> 00:47:04,640 Speaker 1: kept telling me, I feel like a B student studying 703 00:47:04,680 --> 00:47:08,480 Speaker 1: A-plus work, but he really learned it. And I 704 00:47:08,640 --> 00:47:12,080 Speaker 1: think his portrayal both of the history, that Meehl 705 00:47:12,360 --> 00:47:15,479 Speaker 1: and Lew Goldberg and other people had done similar work 706 00:47:15,520 --> 00:47:19,759 Speaker 1: before their work, Robyn Dawes, people who were influences on 707 00:47:19,920 --> 00:47:23,960 Speaker 1: them or in the same ecology, and of the work itself, was right. 708 00:47:24,040 --> 00:47:27,279 Speaker 1: I thought he did a masterful job of explaining the 709 00:47:27,320 --> 00:47:30,160 Speaker 1: work to lay people, though he might have gotten some 710 00:47:30,200 --> 00:47:34,840 Speaker 1: of the nuances about Amos wrong.
Did he find anything 711 00:47:34,920 --> 00:47:39,399 Speaker 1: from speaking to friends, colleagues, relatives that surprised you. Did 712 00:47:39,440 --> 00:47:41,560 Speaker 1: anything show up in the book that you said, huh? 713 00:47:41,880 --> 00:47:43,879 Speaker 1: I don't really know about that. No, I don't think. 714 00:47:43,920 --> 00:47:47,080 Speaker 1: So it's not that Amos didn't have secrets, although a 715 00:47:47,080 --> 00:47:49,719 Speaker 1: few from me, But no, I don't. I don't think 716 00:47:49,719 --> 00:47:53,399 Speaker 1: there was anything there that I didn't know. So you 717 00:47:53,440 --> 00:47:56,000 Speaker 1: wrote or said, I don't remember where I pulled this 718 00:47:56,080 --> 00:48:00,439 Speaker 1: quote from about Danny and Amos. Their relationship was more 719 00:48:00,600 --> 00:48:04,279 Speaker 1: intense than a marriage, so that had to be a 720 00:48:04,440 --> 00:48:07,400 Speaker 1: difficult thing to balance with your own marriage. What was 721 00:48:07,480 --> 00:48:09,759 Speaker 1: that like living through that? I'm not sure if I 722 00:48:09,760 --> 00:48:12,560 Speaker 1: said more intense than a marriage, but it was certainly 723 00:48:12,600 --> 00:48:16,520 Speaker 1: intense like a marriage. They really loved each other, and 724 00:48:16,600 --> 00:48:20,200 Speaker 1: they formed a close friendship, and the way they worked 725 00:48:20,200 --> 00:48:24,279 Speaker 1: together was gleeful and joyful until it wasn't. When you 726 00:48:24,320 --> 00:48:28,239 Speaker 1: say gleeful and joyful. There are stories parts of the 727 00:48:28,320 --> 00:48:31,279 Speaker 1: Undoing Project where the two of them are locked in 728 00:48:31,320 --> 00:48:35,360 Speaker 1: a classroom by themselves and all people in the hallway 729 00:48:35,400 --> 00:48:39,520 Speaker 1: here is just peals of laughter for hours. They're back 730 00:48:39,560 --> 00:48:42,520 Speaker 1: and forth debating stuff and just laughing their butts off. 731 00:48:43,160 --> 00:48:45,400 Speaker 1: Was it work or was it fun? I had the 732 00:48:45,480 --> 00:48:50,319 Speaker 1: advantage of understanding Hebrew and English, and the conversations would 733 00:48:50,320 --> 00:48:54,240 Speaker 1: go back and forth and be mixed between English and Hebrew. Yeah, 734 00:48:54,320 --> 00:48:57,680 Speaker 1: and you know, they'd come out for tea or a 735 00:48:57,680 --> 00:49:00,440 Speaker 1: lot of the conversations were in my house. They'd come 736 00:49:00,440 --> 00:49:02,960 Speaker 1: out for tea, or they'd come out to tell me 737 00:49:03,120 --> 00:49:06,880 Speaker 1: something that they were dying to tell me. And when 738 00:49:06,960 --> 00:49:09,400 Speaker 1: Danny would leave and Amos would be with me, I 739 00:49:09,440 --> 00:49:13,240 Speaker 1: would hear a recap of the discussion and the conversations 740 00:49:13,280 --> 00:49:18,080 Speaker 1: and the stories the questions they were asking students. So 741 00:49:18,239 --> 00:49:20,600 Speaker 1: I had a front row seat to everything that was 742 00:49:20,640 --> 00:49:23,239 Speaker 1: going on. It didn't interfere with mine. So this wasn't 743 00:49:23,239 --> 00:49:26,280 Speaker 1: an imposition. 
This was just your husband and a professional 744 00:49:26,320 --> 00:49:29,760 Speaker 1: relationship that worked for him and worked for everybody involved, 745 00:49:29,800 --> 00:49:33,040 Speaker 1: and gave me a great deal of intellectual pleasure, of 746 00:49:33,120 --> 00:49:36,799 Speaker 1: personal pleasure. Danny would often visit us, stay in our 747 00:49:36,800 --> 00:49:39,799 Speaker 1: house when we were at Stanford, he was at Vancouver. 748 00:49:40,360 --> 00:49:43,280 Speaker 1: Amus tended to work late at night and come wake 749 00:49:43,600 --> 00:49:46,120 Speaker 1: late in the morning, so I'd have breakfast with Danny. 750 00:49:46,440 --> 00:49:50,560 Speaker 1: Danny's great company, and that was a pleasure. So he's 751 00:49:50,560 --> 00:49:52,960 Speaker 1: in New York. Now you're in New York half the year. 752 00:49:53,120 --> 00:49:56,359 Speaker 1: You guys still see each other? Sure? Sure? I mean, 753 00:49:56,400 --> 00:49:58,600 Speaker 1: I'd like to say we we lived down the street 754 00:49:58,600 --> 00:50:01,680 Speaker 1: from each other, because We both from the corner of Broadway, 755 00:50:01,920 --> 00:50:06,760 Speaker 1: but hundred blocks apart. No, he's been a really loyal 756 00:50:06,880 --> 00:50:10,239 Speaker 1: friend and I appreciate that. Can you stick around a 757 00:50:10,239 --> 00:50:11,960 Speaker 1: little bit. I have a bunch more questions for you. 758 00:50:13,239 --> 00:50:16,800 Speaker 1: We have been speaking with Barbara Tversky, professor at Stanford 759 00:50:16,800 --> 00:50:20,960 Speaker 1: in Colombia and author of Mind in Motion, How Action 760 00:50:21,080 --> 00:50:24,080 Speaker 1: Shapes Thoughts. If you enjoy this conversation, be sure and 761 00:50:24,200 --> 00:50:26,840 Speaker 1: check out the podcast extras. Will we keep the tape 762 00:50:26,920 --> 00:50:31,440 Speaker 1: rolling and continue discussing all things cognitive and psychology related. 763 00:50:31,840 --> 00:50:37,000 Speaker 1: You can find that at Apple iTunes, Google Podcast, Stitcher, Spotify, Overcast, 764 00:50:37,320 --> 00:50:40,960 Speaker 1: wherever your finer podcasts are found. Be sure to check 765 00:50:41,000 --> 00:50:44,360 Speaker 1: out my weekly column on Bloomberg dot com. Follow me 766 00:50:44,400 --> 00:50:47,760 Speaker 1: on Twitter at rid Halts. Sign up for my daily 767 00:50:47,840 --> 00:50:51,799 Speaker 1: reads at rid Halts dot com. I'm Barry Ridholts. You're 768 00:50:51,840 --> 00:51:00,600 Speaker 1: listening to Masters in Business on Bloomberg Radio. Welcome to 769 00:51:00,600 --> 00:51:03,040 Speaker 1: the podcast, Barbara, Thank you so much for doing this. 770 00:51:03,120 --> 00:51:06,319 Speaker 1: You you and I can speak off Mike for as 771 00:51:06,400 --> 00:51:09,440 Speaker 1: long as we're talking on Mike, because these are really 772 00:51:09,520 --> 00:51:15,680 Speaker 1: really fascinating subjects. And I didn't realize the person who 773 00:51:15,719 --> 00:51:19,840 Speaker 1: sent me down the behavioral finance Rob rabbit Hole. Um 774 00:51:19,880 --> 00:51:24,840 Speaker 1: Thomas Gilovich was a student of Amoses and Lee Ross 775 00:51:24,880 --> 00:51:27,960 Speaker 1: back in the day. Now, tell us a little bit 776 00:51:27,960 --> 00:51:31,759 Speaker 1: about Lee Ross. What was his relationship with Amos? How 777 00:51:31,760 --> 00:51:35,960 Speaker 1: did how did he um have anything to do with Stanford? 778 00:51:36,080 --> 00:51:39,319 Speaker 1: And and you and Amos? 
So Lee Ross, who is 779 00:51:39,480 --> 00:51:43,440 Speaker 1: a dear friend, was at Stanford working with 780 00:51:43,560 --> 00:51:48,399 Speaker 1: Dick Nisbett, and they were working on, essentially, biases; they're 781 00:51:48,520 --> 00:51:52,160 Speaker 1: social psychologists, and they were working on biases in the 782 00:51:52,239 --> 00:51:57,200 Speaker 1: way that we interpret other people's behavior and our own behavior. 783 00:51:58,239 --> 00:52:04,600 Speaker 1: In those years, cognition was really active in social psychology 784 00:52:04,640 --> 00:52:07,960 Speaker 1: and thinking about the individual. So one of the things 785 00:52:08,040 --> 00:52:12,000 Speaker 1: they came up with is called the fundamental attribution error, 786 00:52:12,840 --> 00:52:15,719 Speaker 1: and that's that we attribute our own behavior to the 787 00:52:15,840 --> 00:52:21,759 Speaker 1: circumstances around us, and other people's behavior to enduring personality traits. 788 00:52:21,760 --> 00:52:24,600 Speaker 1: So when I do 789 00:52:24,719 --> 00:52:27,879 Speaker 1: something right, it's because I'm so skillful, but when they 790 00:52:27,920 --> 00:52:30,800 Speaker 1: do something wrong, it's that they're not that smart, 791 00:52:31,080 --> 00:52:32,960 Speaker 1: they're not good people, that's why they messed up. It's 792 00:52:33,040 --> 00:52:36,120 Speaker 1: something like that. Or if I get angry, 793 00:52:36,160 --> 00:52:39,759 Speaker 1: it's something that you did. It's the situation, it's not 794 00:52:39,880 --> 00:52:44,000 Speaker 1: my fault, it's the circumstances, maybe you provoked me. And 795 00:52:44,320 --> 00:52:47,560 Speaker 1: if you're behaving that way, it's because you're an aggressive person, 796 00:52:47,880 --> 00:52:50,600 Speaker 1: or an angry person, or a shy person. So we 797 00:52:50,719 --> 00:52:56,279 Speaker 1: all do that. So that's kind of interesting, and 798 00:52:56,360 --> 00:53:00,200 Speaker 1: it just goes back to the filter I view these 799 00:53:00,239 --> 00:53:04,759 Speaker 1: things through, which is finance and investing and trading, and the 800 00:53:04,880 --> 00:53:08,839 Speaker 1: greatest thing to do is to speak to traders who 801 00:53:08,880 --> 00:53:11,959 Speaker 1: were either making money or losing money. And when they're 802 00:53:11,960 --> 00:53:16,000 Speaker 1: making money, it's because they're brilliant. Hey, I had this 803 00:53:16,080 --> 00:53:19,040 Speaker 1: trade figured out, I knew where to jump into it, 804 00:53:19,480 --> 00:53:22,279 Speaker 1: I understood the value of this company. And when the 805 00:53:22,320 --> 00:53:25,320 Speaker 1: trade goes south, it's never, oh, I had it wrong, 806 00:53:25,400 --> 00:53:28,600 Speaker 1: it's, well, the Federal Reserve did this, and who knew 807 00:53:28,640 --> 00:53:32,800 Speaker 1: about this attack, and Iran. And it's always externalities 808 00:53:32,840 --> 00:53:36,440 Speaker 1: why they lose money, but it's their skills why they 809 00:53:36,480 --> 00:53:40,520 Speaker 1: make money. Yeah, and you wonder why we're built that way, 810 00:53:40,560 --> 00:53:45,040 Speaker 1: because it does interfere with learning what's happening in the 811 00:53:45,080 --> 00:53:48,440 Speaker 1: world and how to interpret it. It does feel 812 00:53:48,560 --> 00:53:51,520 Speaker 1: very strongly like we're built that way.
So so if 813 00:53:51,520 --> 00:53:56,879 Speaker 1: that's the case, is there an evolutionary benefit to that 814 00:53:57,080 --> 00:54:02,520 Speaker 1: sort of self confidence? Um? And ignoring things that that 815 00:54:02,719 --> 00:54:06,480 Speaker 1: perhaps might even be your fault. Why would that why 816 00:54:06,480 --> 00:54:09,799 Speaker 1: would that be hardwired? And you know, you could write 817 00:54:09,840 --> 00:54:13,480 Speaker 1: an evolutionary story, sure that. I don't know how you 818 00:54:13,719 --> 00:54:16,719 Speaker 1: know if it was correct or not, but it makes 819 00:54:16,719 --> 00:54:20,040 Speaker 1: a great narrative. It makes a great story, right, And 820 00:54:20,880 --> 00:54:25,520 Speaker 1: I mean evolutionary psychology has taken hold a bit in psychology, 821 00:54:25,960 --> 00:54:30,520 Speaker 1: sometimes in an annoying way because in some sense all 822 00:54:30,560 --> 00:54:35,560 Speaker 1: of us were doing it anyway, but it it there's 823 00:54:35,680 --> 00:54:38,440 Speaker 1: there's a very little way that you can check those 824 00:54:38,480 --> 00:54:44,160 Speaker 1: deep psychological hypotheses. You can check other sorts of hypotheses 825 00:54:44,200 --> 00:54:49,160 Speaker 1: about structure eye or structure by raising food flies for 826 00:54:49,239 --> 00:54:53,920 Speaker 1: many generations. But it's those deep psychological ones that the 827 00:54:54,000 --> 00:54:59,160 Speaker 1: connection between any gene and psycho and behavior, and add 828 00:54:59,200 --> 00:55:04,480 Speaker 1: to that epigenetics and our BioGenome in in our stomach 829 00:55:04,680 --> 00:55:08,960 Speaker 1: and so genetics has become a huge field recently. Hasn't 830 00:55:08,960 --> 00:55:16,280 Speaker 1: it that that our experiences somehow impact our our genetics? 831 00:55:16,600 --> 00:55:19,839 Speaker 1: Am I oversimple? Well, again, I'm not an expert on this. 832 00:55:19,960 --> 00:55:23,560 Speaker 1: I went to UH symposium on it and quiz the 833 00:55:23,560 --> 00:55:29,360 Speaker 1: biologists mercilessly on the mechanisms, and yes, they seem to 834 00:55:30,080 --> 00:55:32,040 Speaker 1: they seem to believe in this is some of this 835 00:55:32,120 --> 00:55:35,759 Speaker 1: is animal work where you can check it that that 836 00:55:36,080 --> 00:55:41,200 Speaker 1: actually it is affecting the genome, the germ cells that 837 00:55:41,280 --> 00:55:45,359 Speaker 1: are being passed on to the next generation. And so 838 00:55:45,600 --> 00:55:49,839 Speaker 1: if you starve the grandfather, the grandchild who never knew 839 00:55:50,120 --> 00:55:54,560 Speaker 1: rat that never knew the grandfather has different eating behavior 840 00:55:54,640 --> 00:55:59,160 Speaker 1: than if the grandfather wasn't solved. So a lot of 841 00:55:59,160 --> 00:56:04,160 Speaker 1: that is looking at negative things that maternal deprivation, starvation, 842 00:56:04,719 --> 00:56:07,799 Speaker 1: and so forth. So I asked, does it work for 843 00:56:07,920 --> 00:56:11,919 Speaker 1: positive things? If you enrich an environment, does that get 844 00:56:12,000 --> 00:56:15,480 Speaker 1: passed to the grandchildren? And they said it looks like 845 00:56:16,000 --> 00:56:20,360 Speaker 1: it might. It does. And then I asked, are these 846 00:56:20,480 --> 00:56:25,400 Speaker 1: big effects? And they said no, they're small effects. 
In 847 00:56:25,560 --> 00:56:30,480 Speaker 1: the larger picture, they're small effects, but they're detectable. So 848 00:56:31,360 --> 00:56:35,440 Speaker 1: since you mentioned rats, I have to ask this question, um, 849 00:56:35,480 --> 00:56:39,760 Speaker 1: how do animals? How do the way animals think differ 850 00:56:39,920 --> 00:56:44,200 Speaker 1: from human thinking? Or are there many parallels? Do animals 851 00:56:44,200 --> 00:56:47,080 Speaker 1: and humans have a lot of similar thought processes? So 852 00:56:47,200 --> 00:56:50,560 Speaker 1: again we're getting out of my own research research that 853 00:56:50,680 --> 00:56:57,000 Speaker 1: I reviewed. But if you look, primates can't count the 854 00:56:57,120 --> 00:56:59,720 Speaker 1: way we can count. But then there are many civil 855 00:56:59,760 --> 00:57:03,400 Speaker 1: as nations still around in the world that don't have 856 00:57:03,520 --> 00:57:09,120 Speaker 1: number words, and numbers are a cultural phenomenon. Not hardwired, yes, 857 00:57:09,920 --> 00:57:13,440 Speaker 1: but one to one correspondences. Things that you have in 858 00:57:13,600 --> 00:57:18,680 Speaker 1: tallies are old and are We have an estimation system 859 00:57:18,720 --> 00:57:21,120 Speaker 1: as well as an accurate system. There are kind of 860 00:57:21,200 --> 00:57:24,760 Speaker 1: two mass systems in the brain, and they are somewhat 861 00:57:24,760 --> 00:57:29,600 Speaker 1: integrated and somewhat independent. But making estimates, are there eighty 862 00:57:29,760 --> 00:57:34,520 Speaker 1: three things or ninety things? Primates can make those estimates 863 00:57:34,640 --> 00:57:40,000 Speaker 1: quite well without counting, right. That's interesting, No, it's fascinating. 864 00:57:40,240 --> 00:57:43,120 Speaker 1: And one of the things you mentioned earlier that I 865 00:57:43,160 --> 00:57:47,400 Speaker 1: was kind of intrigued with, I wanna I wanna just 866 00:57:47,440 --> 00:57:52,120 Speaker 1: do a slight um variation of so you mentioned some 867 00:57:52,200 --> 00:57:56,760 Speaker 1: people are good with language and other people are good 868 00:57:56,840 --> 00:58:02,120 Speaker 1: with thinking processes, and not every he has both. But 869 00:58:02,160 --> 00:58:04,720 Speaker 1: one of the things I was kind of fascinated with 870 00:58:04,760 --> 00:58:08,920 Speaker 1: about language and creativity and thinking. So I write a 871 00:58:08,960 --> 00:58:13,160 Speaker 1: lot and I speak a lot. But I found that 872 00:58:13,320 --> 00:58:16,680 Speaker 1: my writing it's much more intel actually sharp, and at 873 00:58:16,720 --> 00:58:21,520 Speaker 1: a higher grade level than my speaking, and I was 874 00:58:21,640 --> 00:58:25,760 Speaker 1: kind of surprised. So when I worked on my first 875 00:58:25,800 --> 00:58:28,680 Speaker 1: book over a decade ago, I thought, oh, this will 876 00:58:28,680 --> 00:58:32,439 Speaker 1: be easy I'll dictate a bunch of stuff and it'll 877 00:58:32,480 --> 00:58:34,080 Speaker 1: take me a couple of weekends, and I'll have a 878 00:58:34,160 --> 00:58:37,680 Speaker 1: hundred thousand words, and I'm shocked as I'm rereading my 879 00:58:38,000 --> 00:58:42,240 Speaker 1: spoken word. This is terrible. Why are the things that 880 00:58:42,320 --> 00:58:46,760 Speaker 1: I laboriously pound out on a keyboard so much more 881 00:58:46,840 --> 00:58:50,720 Speaker 1: articulate and intelligent than what I say? 
And eventually it 882 00:58:50,760 --> 00:58:53,200 Speaker 1: wasn't a big leap to think, well, you have a 883 00:58:53,240 --> 00:58:55,200 Speaker 1: part of the brain for speech and a different part 884 00:58:55,200 --> 00:58:58,200 Speaker 1: of the brain for creativity and writing, and hey, maybe 885 00:58:58,240 --> 00:59:01,439 Speaker 1: that speech part in is well developed as your writing part. 886 00:59:02,240 --> 00:59:04,480 Speaker 1: Is that an oversimplification or is that a fair way 887 00:59:04,520 --> 00:59:06,440 Speaker 1: to look at it? And no, I think it's probably. 888 00:59:06,480 --> 00:59:08,480 Speaker 1: I don't think there are separate parts of the brain 889 00:59:08,560 --> 00:59:11,040 Speaker 1: for speaking and writing. What happens when you write is 890 00:59:11,120 --> 00:59:14,240 Speaker 1: you put something in a page and you would edit. Yeah, 891 00:59:14,240 --> 00:59:18,080 Speaker 1: but my first drafts of writing are much more articulate 892 00:59:18,120 --> 00:59:22,040 Speaker 1: than my first drafts of speaking. And the best speaking 893 00:59:22,120 --> 00:59:25,080 Speaker 1: things I do are when I write them out in 894 00:59:25,160 --> 00:59:29,680 Speaker 1: advance and come up with the structural language that I want. Okay, 895 00:59:29,680 --> 00:59:33,480 Speaker 1: so then you're you're putting on your your writing hat 896 00:59:33,680 --> 00:59:37,640 Speaker 1: and supposed to your speaking is more spontaneous, sure, and 897 00:59:37,840 --> 00:59:40,920 Speaker 1: the writing hat you're deliberately thinking about what what am 898 00:59:40,920 --> 00:59:43,840 Speaker 1: I going to? What are the thoughts I want to express, 899 00:59:43,880 --> 00:59:47,160 Speaker 1: and how's the best way to express them? So you 900 00:59:47,280 --> 00:59:49,920 Speaker 1: some particularly adept at that. I had to. I have 901 00:59:49,960 --> 00:59:53,760 Speaker 1: a colleague at Stanford named alband Or who writes many books, 902 00:59:53,760 --> 00:59:56,200 Speaker 1: and they're all good. And you ask them a question 903 00:59:56,640 --> 01:00:01,240 Speaker 1: and it comes out in paragraphs and pages full answers, 904 01:00:01,280 --> 01:00:03,840 Speaker 1: like first draft. I have a friend like that. It's 905 01:00:03,920 --> 01:00:08,880 Speaker 1: just fully formed, coherent, organized like I wish I could 906 01:00:08,920 --> 01:00:11,040 Speaker 1: do that. I could do that on on pen and paper. 907 01:00:11,360 --> 01:00:14,160 Speaker 1: I can't do that, verbon No. It's astounding, and I 908 01:00:14,200 --> 01:00:17,000 Speaker 1: think it's he's practiced so much so if you think 909 01:00:17,040 --> 01:00:21,160 Speaker 1: about musicians that write music or play music, they can 910 01:00:21,200 --> 01:00:23,960 Speaker 1: do it very rapidly. They have the scheme as they 911 01:00:23,960 --> 01:00:29,200 Speaker 1: can generated very quickly. It's highly practiced, like any sport 912 01:00:29,240 --> 01:00:33,240 Speaker 1: would be highly practiced. So perhaps you're writing, you're thinking, 913 01:00:33,280 --> 01:00:35,480 Speaker 1: at this metal level, how am I going to organize 914 01:00:35,520 --> 01:00:38,760 Speaker 1: my thoughts. I've got an outline for organizing them. I'm 915 01:00:38,840 --> 01:00:43,520 Speaker 1: gesturing the outline and and you're thinking that through and 916 01:00:43,640 --> 01:00:49,840 Speaker 1: filling it through. 
This interview is different from spontaneous conversations, 917 01:00:50,320 --> 01:00:55,160 Speaker 1: because again I'm crafting and thinking ahead and crafting in 918 01:00:55,200 --> 01:00:57,760 Speaker 1: that way. So my convenent of dissonances. I'm going to 919 01:00:57,840 --> 01:01:01,640 Speaker 1: stick with the two different brain sections because I like 920 01:01:01,760 --> 01:01:05,840 Speaker 1: that idea, but also the way various aphasi acts and 921 01:01:05,920 --> 01:01:10,120 Speaker 1: people who have had brain damage lose the ability to speak, 922 01:01:10,200 --> 01:01:13,040 Speaker 1: but they can sing, or they could they could write, 923 01:01:13,040 --> 01:01:15,640 Speaker 1: but they can't read. That's what led me to think 924 01:01:16,160 --> 01:01:19,520 Speaker 1: it's a speech center and a writing center. You're telling 925 01:01:19,520 --> 01:01:24,040 Speaker 1: me there is no difference. I don't know. I sort 926 01:01:24,080 --> 01:01:29,560 Speaker 1: of doubt it because I think each involves many areas Aphacians. Yeah, 927 01:01:29,600 --> 01:01:32,960 Speaker 1: Aphacia's brain damage, and it's usually not pointed. It's not 928 01:01:33,160 --> 01:01:38,160 Speaker 1: it's a cluster of neurons probably and and and there. 929 01:01:38,200 --> 01:01:42,720 Speaker 1: It is losing certain kinds of words and not others. Right, 930 01:01:43,160 --> 01:01:47,120 Speaker 1: So this is a general So so the other thing 931 01:01:47,200 --> 01:01:51,120 Speaker 1: I want to ask. Art comes up in the book 932 01:01:51,440 --> 01:01:55,960 Speaker 1: in several places. I'm curious as to why you use 933 01:01:56,120 --> 01:01:59,400 Speaker 1: art as an example. And then there's one specific example 934 01:01:59,440 --> 01:02:02,520 Speaker 1: I have to bring up to you because I was 935 01:02:02,560 --> 01:02:08,040 Speaker 1: intrigued by it. Um, what's the relationship between art and 936 01:02:08,320 --> 01:02:13,919 Speaker 1: thinking and between the concept of spatial motion and how 937 01:02:14,000 --> 01:02:18,919 Speaker 1: we express ourselves artistically. Yeah, so right, So how would 938 01:02:18,960 --> 01:02:21,680 Speaker 1: I come to naturally? I drew a lot as a kid, 939 01:02:21,720 --> 01:02:24,360 Speaker 1: and my mother's an artist, and my cousins that are 940 01:02:24,600 --> 01:02:27,920 Speaker 1: so art is very much in my life writing too, 941 01:02:28,120 --> 01:02:32,960 Speaker 1: for that matter. M but I happen to have I 942 01:02:33,040 --> 01:02:36,160 Speaker 1: got interested in design, So I first got interested in 943 01:02:36,240 --> 01:02:39,480 Speaker 1: how do we put the world in our mind? How 944 01:02:39,520 --> 01:02:42,040 Speaker 1: do we get space in our mind? And then I 945 01:02:42,160 --> 01:02:44,920 Speaker 1: got more and more interested in the spaces that we 946 01:02:45,080 --> 01:02:50,000 Speaker 1: create to improve our own cognition. So diagrams would be one. 947 01:02:50,360 --> 01:02:54,240 Speaker 1: Even the alphabet would be at one. Sure, developing the 948 01:02:54,280 --> 01:03:00,040 Speaker 1: alphabet which was invented apparently only once, but invent it 949 01:03:00,160 --> 01:03:04,200 Speaker 1: only once? What do you mean? The sound sound too, 950 01:03:04,600 --> 01:03:08,920 Speaker 1: symbol correspondence was developed once since spread and then just 951 01:03:09,040 --> 01:03:14,840 Speaker 1: variations based and otherwise. 
Alphabets were representing meaning the way 952 01:03:14,960 --> 01:03:20,600 Speaker 1: Chinese went from symbology to phonetics. It's whether it's representing 953 01:03:20,680 --> 01:03:24,680 Speaker 1: meanings directly the way Chinese does, or whether it's representing 954 01:03:24,720 --> 01:03:29,040 Speaker 1: the sound of language of speaking. And that's what the 955 01:03:29,080 --> 01:03:33,640 Speaker 1: Phoenician alphabet that spread everywhere and got varied and was 956 01:03:33,680 --> 01:03:39,720 Speaker 1: apparently only invented once and then spread it is fascinating. 957 01:03:39,800 --> 01:03:42,120 Speaker 1: So so let me have you disabused me of another 958 01:03:42,760 --> 01:03:46,720 Speaker 1: thing I probably have wrong? Um? Have have you seen 959 01:03:46,880 --> 01:03:53,240 Speaker 1: the trick in the Federal Express? Uh? So, the way 960 01:03:53,280 --> 01:03:55,960 Speaker 1: that was first explained to me is the reason we 961 01:03:56,080 --> 01:03:59,640 Speaker 1: don't perceive the arrow in the FedEx. And just pull 962 01:03:59,720 --> 01:04:02,800 Speaker 1: up a FedEx, any picture of FedEx, and you'll see 963 01:04:02,840 --> 01:04:06,439 Speaker 1: between the E and the x um there there's an arrow. 964 01:04:06,880 --> 01:04:10,640 Speaker 1: That's the part of your brain that recognizes language is 965 01:04:10,640 --> 01:04:14,360 Speaker 1: a different part of the brain that recognizes symbols, And 966 01:04:14,400 --> 01:04:18,320 Speaker 1: when you're reading the letters, your brain isn't primed for 967 01:04:18,480 --> 01:04:21,840 Speaker 1: seeing a symbol like an arrow. So that's fascinating, and 968 01:04:21,880 --> 01:04:25,640 Speaker 1: it would be arrows are fascinating anyway, and they've gotten 969 01:04:25,680 --> 01:04:30,360 Speaker 1: me fascinated, and somebody needs to do that work. The 970 01:04:30,360 --> 01:04:36,600 Speaker 1: the lateral occipital parietal juncture, that area of the brain 971 01:04:36,680 --> 01:04:41,280 Speaker 1: that is recognizing objects and recognizes fruits and vegetables and 972 01:04:41,320 --> 01:04:43,920 Speaker 1: so forth. There are many different sub areas, is like 973 01:04:43,960 --> 01:04:48,400 Speaker 1: a mosaic. There is only one area in all of 974 01:04:48,440 --> 01:04:55,440 Speaker 1: those areas that recognizes left right asymmetries. Otherwise, your face 975 01:04:56,240 --> 01:05:00,040 Speaker 1: mirror reversed is more places in a good example, but 976 01:05:00,320 --> 01:05:03,400 Speaker 1: it's mostly the same the special there's a special area 977 01:05:03,480 --> 01:05:06,400 Speaker 1: for faces, but most things, it doesn't matter. If it's 978 01:05:06,440 --> 01:05:10,320 Speaker 1: a left right, turning it upside down matters, but left 979 01:05:10,440 --> 01:05:13,840 Speaker 1: right doesn't matter. There's one area of the brain that 980 01:05:14,000 --> 01:05:18,040 Speaker 1: is primed to recognize left right aberration, so we can 981 01:05:18,080 --> 01:05:22,000 Speaker 1: tell a small B from a small D and differentiate. 982 01:05:22,400 --> 01:05:25,240 Speaker 1: So that area is used for reading no matter what 983 01:05:25,400 --> 01:05:29,240 Speaker 1: language you read. Now what that area would do with arrows, 984 01:05:29,880 --> 01:05:34,000 Speaker 1: I'd be absolutely fascinated to know. So maybe the way 985 01:05:34,040 --> 01:05:35,760 Speaker 1: I heard it might be right, I'm not going to 986 01:05:35,840 --> 01:05:39,240 Speaker 1: be exactly. 
And and the problem there is the arrows 987 01:05:39,320 --> 01:05:43,200 Speaker 1: embedded in the letters. So the embedding is going to 988 01:05:43,360 --> 01:05:46,920 Speaker 1: interfere with the perception anyway I'm not. It's like a relief. 989 01:05:46,960 --> 01:05:50,280 Speaker 1: Are you looking at the white or the black? Exactly? 990 01:05:50,320 --> 01:05:53,360 Speaker 1: The negative? There could be another the negative space. So 991 01:05:53,440 --> 01:05:56,000 Speaker 1: let me bring that back to ouray, because we we 992 01:05:56,040 --> 01:06:01,040 Speaker 1: started talking about that. You reference the linear how how 993 01:06:01,080 --> 01:06:04,840 Speaker 1: linear things are when we're moving through space, the way 994 01:06:04,920 --> 01:06:08,400 Speaker 1: language words after another appear. You use a whole bunch 995 01:06:08,440 --> 01:06:12,880 Speaker 1: of examples, and you talk about, um, the space within 996 01:06:12,960 --> 01:06:17,640 Speaker 1: an art and painting and how it had a form 997 01:06:17,800 --> 01:06:24,120 Speaker 1: of linear progression. Until Mark Rothko and Jackson Pollock come 998 01:06:24,160 --> 01:06:28,240 Speaker 1: along where they just explode that concept. And the the 999 01:06:28,280 --> 01:06:32,720 Speaker 1: reason that stood out to me is so my wife 1000 01:06:32,960 --> 01:06:35,960 Speaker 1: used to she's now retired, but she taught fashion, illustration 1001 01:06:36,000 --> 01:06:39,800 Speaker 1: and design. I've been dragged at every museum in the world, 1002 01:06:40,440 --> 01:06:45,160 Speaker 1: and initially a lot of some modern art just didn't 1003 01:06:45,200 --> 01:06:49,440 Speaker 1: resonate with me. Stella, I would look at just nonsense. 1004 01:06:49,680 --> 01:06:54,400 Speaker 1: Always was fascinated by Jackson Pollock, um, but I didn't 1005 01:06:54,440 --> 01:06:58,720 Speaker 1: care much for Mark Rothko until I don't know, fifteen 1006 01:06:58,800 --> 01:07:03,560 Speaker 1: twenty years ago, and I can't explain what happened. But suddenly, 1007 01:07:04,600 --> 01:07:08,360 Speaker 1: maybe having just seen it enough time, suddenly this is 1008 01:07:08,440 --> 01:07:12,760 Speaker 1: really fascinating stuff. There's there's it's not only abstract, but 1009 01:07:12,800 --> 01:07:16,120 Speaker 1: there's all sorts of different things going on, whether it's 1010 01:07:16,240 --> 01:07:22,439 Speaker 1: the space, the border, the color choices. Suddenly I went 1011 01:07:22,560 --> 01:07:27,080 Speaker 1: from not caring anything about roth Go to really, this 1012 01:07:27,120 --> 01:07:29,840 Speaker 1: is one of the most fascinating modern painters there are. 1013 01:07:30,080 --> 01:07:32,240 Speaker 1: And the fact that you used it as him as 1014 01:07:32,280 --> 01:07:35,640 Speaker 1: an example. I'm not a big fan of the black period, 1015 01:07:35,680 --> 01:07:37,720 Speaker 1: the later stuff he did when it's all black and 1016 01:07:37,800 --> 01:07:41,320 Speaker 1: gray and white, but hold that aside. The example of 1017 01:07:41,360 --> 01:07:48,000 Speaker 1: that having no linear narrative and no structural. Here's where 1018 01:07:48,040 --> 01:07:52,640 Speaker 1: your eyes going to naturally lead by the figures. I 1019 01:07:52,680 --> 01:07:55,480 Speaker 1: thought that was fascinating as an example early in the 1020 01:07:55,520 --> 01:07:58,800 Speaker 1: book Thank you. 
I'm not an art critic, just an 1021 01:07:58,840 --> 01:08:03,440 Speaker 1: appreciator. So it gets back to 1022 01:08:03,560 --> 01:08:07,120 Speaker 1: learning versus creativity. And when you're learning, you want things 1023 01:08:07,320 --> 01:08:11,400 Speaker 1: very straightforward and structured. When you're being creative, you want 1024 01:08:11,400 --> 01:08:14,680 Speaker 1: to go in many different directions. And what I think 1025 01:08:14,720 --> 01:08:18,000 Speaker 1: both Rothko and Pollock do, and it took me a 1026 01:08:18,040 --> 01:08:23,200 Speaker 1: while to appreciate them as well, is there's ambiguity built 1027 01:08:23,240 --> 01:08:27,240 Speaker 1: into the paintings, and every time you look, 1028 01:08:27,280 --> 01:08:31,360 Speaker 1: you see something different. And as you're looking, it configures 1029 01:08:31,360 --> 01:08:36,160 Speaker 1: and reconfigures. And what Rothko especially does is you get 1030 01:08:36,200 --> 01:08:41,120 Speaker 1: the color adapters of the eye adapting, so colors 1031 01:08:41,280 --> 01:08:44,479 Speaker 1: change in them because of the way you're looking. You 1032 01:08:44,520 --> 01:08:47,599 Speaker 1: can get afterimages that are very interesting with Rothko: 1033 01:08:47,760 --> 01:08:51,720 Speaker 1: look at a blank wall, and those adapters in your 1034 01:08:51,800 --> 01:08:55,960 Speaker 1: eye to the different colors keep changing, and that means 1035 01:08:56,000 --> 01:09:00,639 Speaker 1: that what you're seeing changes, because it's not there, 1036 01:09:01,080 --> 01:09:05,960 Speaker 1: it's in your eye. And Leonardo, by the way, 1037 01:09:06,120 --> 01:09:09,800 Speaker 1: knew that, that it wasn't in the thing out there. 1038 01:09:09,920 --> 01:09:14,160 Speaker 1: It was in your mind, through your eye. So 1039 01:09:14,200 --> 01:09:19,200 Speaker 1: I think, to me, that's what's intriguing about Rothko. You 1040 01:09:19,320 --> 01:09:23,960 Speaker 1: commune with it and different structures appear. It stops 1041 01:09:24,000 --> 01:09:28,080 Speaker 1: being flat and gets into depth. Oh, it's definitely dimensional. 1042 01:09:29,120 --> 01:09:32,000 Speaker 1: And maybe that's the switch that flicked on for me. Suddenly 1043 01:09:32,000 --> 01:09:35,240 Speaker 1: it wasn't just an orange or purple square. It's like, 1044 01:09:35,720 --> 01:09:38,679 Speaker 1: this really has a dimensionality to it and a depth. 1045 01:09:39,200 --> 01:09:42,560 Speaker 1: And I watch people going through those galleries, 1046 01:09:42,600 --> 01:09:46,160 Speaker 1: in the Tate and in the 1047 01:09:46,200 --> 01:09:49,080 Speaker 1: East Wing of the National Gallery, where there are lots 1048 01:09:49,120 --> 01:09:51,639 Speaker 1: of Rothkos, and I watch people just going through 1049 01:09:51,680 --> 01:09:54,639 Speaker 1: and looking very quickly, and you have to sit there 1050 01:09:54,840 --> 01:10:00,280 Speaker 1: and commune, and then it becomes spiritual. So 1051 01:10:00,880 --> 01:10:05,080 Speaker 1: that got me. But it wasn't that; earlier on 1052 01:10:05,240 --> 01:10:08,400 Speaker 1: I started looking at design and looking at how architects 1053 01:10:08,439 --> 01:10:13,160 Speaker 1: design.
We looked at experienced architects and drawing while 1054 01:10:13,200 --> 01:10:17,760 Speaker 1: they were designing, and there were early sketches are ambiguous, 1055 01:10:18,280 --> 01:10:21,760 Speaker 1: and it allows them to make discoveries in their own sketches. 1056 01:10:22,760 --> 01:10:25,880 Speaker 1: So they drew for one reason, and when they look 1057 01:10:25,920 --> 01:10:29,280 Speaker 1: at their sketch, they see new things. They see patterns, 1058 01:10:29,800 --> 01:10:33,200 Speaker 1: they see implications like this is a building, the traffic 1059 01:10:33,280 --> 01:10:36,919 Speaker 1: will not be right, or the light will fall poorly. 1060 01:10:37,439 --> 01:10:40,439 Speaker 1: So they see new things in their sketches, and it's 1061 01:10:40,479 --> 01:10:44,200 Speaker 1: the ambiguity that allows it. And then one of my 1062 01:10:44,280 --> 01:10:50,320 Speaker 1: graduate students started We studied them drawing and studied that process, 1063 01:10:50,400 --> 01:10:53,800 Speaker 1: and one of my graduate students started studying artists for 1064 01:10:53,880 --> 01:10:59,120 Speaker 1: whom drawing is their major practice, and for them, the 1065 01:10:59,320 --> 01:11:04,400 Speaker 1: drawing is a conversation between the eye and the hand 1066 01:11:04,439 --> 01:11:07,840 Speaker 1: and the page. There are no words, and if they 1067 01:11:07,840 --> 01:11:10,760 Speaker 1: try to talk about it, they can't. It interferes with 1068 01:11:10,840 --> 01:11:13,960 Speaker 1: the whole process. So this is a different way of 1069 01:11:14,000 --> 01:11:20,080 Speaker 1: thinking than language. It's thinking with the objects that I'm creating, 1070 01:11:20,160 --> 01:11:23,479 Speaker 1: with a body that's creating them, and with the thing 1071 01:11:23,600 --> 01:11:27,960 Speaker 1: that's perceiving them. And I'm sure something similar goes on 1072 01:11:28,080 --> 01:11:32,639 Speaker 1: in creating music and even in imagining your words. If 1073 01:11:32,680 --> 01:11:36,080 Speaker 1: you're thinking about speaking, you're thinking about how the words 1074 01:11:36,120 --> 01:11:38,599 Speaker 1: are going to sound, if you're if you're practicing it. 1075 01:11:39,400 --> 01:11:44,200 Speaker 1: So I wanted to call attention to that way of thinking. 1076 01:11:44,280 --> 01:11:47,639 Speaker 1: It's not going through language. It's an important way of thinking. 1077 01:11:47,840 --> 01:11:50,000 Speaker 1: It's the way I find my way in the world, 1078 01:11:50,680 --> 01:11:53,400 Speaker 1: and it contributes to many other kinds of It's the 1079 01:11:53,439 --> 01:11:57,640 Speaker 1: way I understand other people when I'm watching their bodies 1080 01:11:58,280 --> 01:12:02,960 Speaker 1: and their faces as are responding. So I wanted to 1081 01:12:03,000 --> 01:12:06,320 Speaker 1: call attention to those ways of thinking with the body 1082 01:12:06,439 --> 01:12:09,400 Speaker 1: and the world and the things that we create in 1083 01:12:09,439 --> 01:12:14,400 Speaker 1: the world. Is an important way of thinking that compliments language. 1084 01:12:14,479 --> 01:12:17,960 Speaker 1: That's different from language. I mean, I love language, So 1085 01:12:18,360 --> 01:12:20,880 Speaker 1: it's it's obvious you do. So I know I don't 1086 01:12:20,920 --> 01:12:23,640 Speaker 1: have you all day. I only have you for a 1087 01:12:23,640 --> 01:12:26,240 Speaker 1: little bit of time. 
Let me jump to my favorite 1088 01:12:26,320 --> 01:12:29,200 Speaker 1: questions that we ask all our guests, and let's see 1089 01:12:29,200 --> 01:12:33,840 Speaker 1: if we can learn a little bit more about Barbara Tversky. 1090 01:12:33,880 --> 01:12:37,680 Speaker 1: So what are you listening to, 1091 01:12:38,080 --> 01:12:42,360 Speaker 1: watching, or downloading? Are you watching anything on 1092 01:12:42,479 --> 01:12:50,439 Speaker 1: Netflix? Interestingly, movies affect me too much, too much. Yeah, 1093 01:12:51,560 --> 01:12:54,920 Speaker 1: so I'm very picky about what I see. So what 1094 01:12:55,240 --> 01:13:01,280 Speaker 1: have you seen that you've liked recently? Synonyms. 1095 01:13:01,320 --> 01:13:04,920 Speaker 1: It's an Israeli film, and because I lived in 1096 01:13:05,040 --> 01:13:09,919 Speaker 1: Israel it especially resonates, but it has universal appeal. 1097 01:13:10,240 --> 01:13:12,680 Speaker 1: It's about a man who was traumatized by being in 1098 01:13:12,720 --> 01:13:15,840 Speaker 1: the army, by what he had to do, and he 1099 01:13:15,960 --> 01:13:19,280 Speaker 1: went to Paris and decides he needs a new identity. 1100 01:13:19,800 --> 01:13:22,920 Speaker 1: Quite interesting. It sounds like a fascinating movie. I'll 1101 01:13:22,920 --> 01:13:25,719 Speaker 1: add that to my Netflix queue. What's the most important 1102 01:13:25,720 --> 01:13:29,519 Speaker 1: thing people don't know about Barbara Tversky? Oh, I don't know. 1103 01:13:29,600 --> 01:13:32,840 Speaker 1: I'm pretty much out there. I think you probably don't 1104 01:13:32,840 --> 01:13:35,679 Speaker 1: know, or maybe I hinted, that I have three 1105 01:13:35,720 --> 01:13:41,599 Speaker 1: wonderful children, and they have produced eight wonderful grandchildren who 1106 01:13:41,640 --> 01:13:46,760 Speaker 1: are all lively individuals with personalities and who adore each other. 1107 01:13:47,520 --> 01:13:51,640 Speaker 1: Who are some of your early mentors? So that's interesting. 1108 01:13:51,680 --> 01:13:53,880 Speaker 1: I don't think I had much in the way of 1109 01:13:53,960 --> 01:13:58,599 Speaker 1: human beings. It was more books that were influential, and 1110 01:13:58,720 --> 01:14:07,360 Speaker 1: when I was a teenager, the existentialists particularly were influential. 1111 01:14:07,880 --> 01:14:14,320 Speaker 1: Later, philosophers like Russell, Quine, Wittgenstein, especially late Wittgenstein, were 1112 01:14:14,360 --> 01:14:20,719 Speaker 1: all influential. Eventually, when I got into psychology, the early 1113 01:14:20,880 --> 01:14:27,960 Speaker 1: cognitive people like Chomsky, Miller, Bruner, Broadbent, Kuhn, who 1114 01:14:28,080 --> 01:14:32,920 Speaker 1: talked about scientific revolutions, but they were really intellectual revolutions. 1115 01:14:32,960 --> 01:14:37,000 Speaker 1: They were about the human mind. So those, but more 1116 01:14:37,080 --> 01:14:41,400 Speaker 1: than that, the colleagues. I mean, I always, even as 1117 01:14:41,439 --> 01:14:45,559 Speaker 1: an undergraduate, hung out with the graduate students and 1118 01:14:46,640 --> 01:14:50,479 Speaker 1: learned an enormous amount from them. They were indulgent, and 1119 01:14:51,040 --> 01:14:54,320 Speaker 1: that stayed with me.
I'm fortunate to have had amazing 1120 01:14:55,040 --> 01:14:58,680 Speaker 1: colleagues who were also friends, including the one that I 1121 01:14:58,800 --> 01:15:05,120 Speaker 1: lived with for thirty years, and our overlapping friendship groups. 1122 01:15:05,680 --> 01:15:10,679 Speaker 1: So in your work, what other psychologists affect the way 1123 01:15:10,960 --> 01:15:16,600 Speaker 1: you approach the world of psychology? I think that, like you, 1124 01:15:16,760 --> 01:15:23,720 Speaker 1: I'm a generalist, and I mean I had Amos in 1125 01:15:23,840 --> 01:15:26,679 Speaker 1: the house and that whole circle, and Danny and other 1126 01:15:26,800 --> 01:15:30,200 Speaker 1: colleagues at Hebrew University early on, and then I 1127 01:15:30,240 --> 01:15:35,640 Speaker 1: moved to Stanford. Stanford's way of hiring people is wallpaper: 1128 01:15:36,320 --> 01:15:39,200 Speaker 1: you want the whole field covered, but you don't want 1129 01:15:39,200 --> 01:15:44,799 Speaker 1: to overlap. And I've seen other departments build little 1130 01:15:44,920 --> 01:15:49,000 Speaker 1: nuclei where everybody's working on speech perception, and then there's 1131 01:15:49,000 --> 01:15:53,160 Speaker 1: a nucleus somewhere else working on something else, and they don't communicate. 1132 01:15:53,360 --> 01:15:56,720 Speaker 1: So what was really wonderful to me at Stanford was 1133 01:15:56,800 --> 01:16:00,800 Speaker 1: having wonderful people in developmental and social and brain, 1134 01:16:01,000 --> 01:16:04,519 Speaker 1: and people in cognitive doing different things from me. And 1135 01:16:04,600 --> 01:16:07,000 Speaker 1: I learned a great deal from that. I loved that, 1136 01:16:07,400 --> 01:16:12,679 Speaker 1: and now I have people from the arts and people 1137 01:16:12,720 --> 01:16:18,400 Speaker 1: from technology, people from many arts, music, drama, painting, 1138 01:16:18,680 --> 01:16:23,080 Speaker 1: and dance, who are influencing me. And everything 1139 01:16:23,120 --> 01:16:26,200 Speaker 1: goes through the human mind. I mean, you've said that, well, 1140 01:16:26,360 --> 01:16:28,759 Speaker 1: you see it in your book. You talk about everything; 1141 01:16:28,840 --> 01:16:32,280 Speaker 1: you reference dance and moving through space as well 1142 01:16:32,320 --> 01:16:35,280 Speaker 1: as art and music. It's clear that all those 1143 01:16:35,320 --> 01:16:39,280 Speaker 1: different folks are influencing you. Let's talk about books. What 1144 01:16:39,320 --> 01:16:41,360 Speaker 1: are some of your favorite books? What are you reading 1145 01:16:41,400 --> 01:16:44,519 Speaker 1: these days? What do you like to recommend? So 1146 01:16:44,680 --> 01:16:50,519 Speaker 1: I went from reading fiction voraciously, to 1147 01:16:50,840 --> 01:16:56,719 Speaker 1: postmodern fiction, to reading nonfiction, so a long history 1148 01:16:56,840 --> 01:17:00,439 Speaker 1: and a lot of foreign fiction that I've loved and 1149 01:17:00,560 --> 01:17:03,000 Speaker 1: still love because it brings you to other worlds. But 1150 01:17:03,080 --> 01:17:08,520 Speaker 1: I'm reading Sapiens, which I recommend to everybody. 1151 01:17:08,640 --> 01:17:13,599 Speaker 1: Danny's book Thinking, Fast and Slow is wonderful.
Anything by 1152 01:17:13,680 --> 01:17:22,360 Speaker 1: Jared Diamond or Sapolsky, as you recommended. And 1153 01:17:22,520 --> 01:17:29,200 Speaker 1: Hans Rosling's Factfulness I found very uplifting, and it 1154 01:17:29,240 --> 01:17:33,000 Speaker 1: gives you the right perspective on things. 1155 01:17:33,280 --> 01:17:38,280 Speaker 1: You mentioned Jared Diamond; Guns, Germs, and Steel would 1156 01:17:38,320 --> 01:17:42,520 Speaker 1: be one of them. Right. And that again is a broad 1157 01:17:42,560 --> 01:17:46,000 Speaker 1: way of thinking that makes sense to me. I 1158 01:17:46,120 --> 01:17:50,840 Speaker 1: loved Misbehaving; Dick Thaler's book was just a lot of fun. 1159 01:17:51,280 --> 01:17:54,840 Speaker 1: So those are probably it. That's a great list that'll keep 1160 01:17:54,880 --> 01:17:58,200 Speaker 1: someone busy for a full semester, to say the least. 1161 01:17:58,360 --> 01:18:01,280 Speaker 1: Tell us about a time you failed and what 1162 01:18:01,360 --> 01:18:04,719 Speaker 1: you learned from the experience. So that was a hard 1163 01:18:04,800 --> 01:18:09,200 Speaker 1: question for me. I don't try things that are 1164 01:18:09,280 --> 01:18:11,679 Speaker 1: really out of reach. But if you think about being 1165 01:18:12,280 --> 01:18:16,680 Speaker 1: an experimental psychologist, which I have been, every experiment 1166 01:18:16,800 --> 01:18:19,439 Speaker 1: is a risk. You build it, and you build it 1167 01:18:19,439 --> 01:18:23,400 Speaker 1: on previous work, on your previous experience, and many of 1168 01:18:23,439 --> 01:18:25,920 Speaker 1: them fail. I mean, I think the failure rate is 1169 01:18:26,000 --> 01:18:29,599 Speaker 1: lower than the failure rate for startups. So the failure 1170 01:18:29,680 --> 01:18:33,040 Speaker 1: rate is about, meaning that you just don't reach any 1171 01:18:33,080 --> 01:18:36,360 Speaker 1: conclusion from the experiment, and something happens that you 1172 01:18:36,400 --> 01:18:39,559 Speaker 1: didn't expect and you might be disappointed in. But for me, 1173 01:18:39,680 --> 01:18:43,240 Speaker 1: that's the adventure and the fun, and you learn something 1174 01:18:43,280 --> 01:18:49,400 Speaker 1: from that. Now, learning from failure is problematic because it's 1175 01:18:49,439 --> 01:18:53,479 Speaker 1: just one experience and it's so easy in hindsight to 1176 01:18:53,880 --> 01:18:57,000 Speaker 1: explain why you failed. I mean, you see that with 1177 01:18:57,280 --> 01:19:00,920 Speaker 1: people and stock picks all the time. Annie Duke calls 1178 01:19:01,000 --> 01:19:05,479 Speaker 1: this resulting, where the poker players learn the 1179 01:19:05,600 --> 01:19:08,880 Speaker 1: wrong lesson from the result as opposed to from 1180 01:19:08,880 --> 01:19:14,479 Speaker 1: the process. Exactly. So it's only one sample, 1181 01:19:14,560 --> 01:19:18,120 Speaker 1: and you're interpreting it in hindsight. So 1182 01:19:19,240 --> 01:19:23,040 Speaker 1: failure has become popular; now everybody's talking about how failing is 1183 01:19:23,080 --> 01:19:26,160 Speaker 1: good and you learn from failure, but you don't necessarily 1184 01:19:26,240 --> 01:19:29,400 Speaker 1: learn the right thing from failure. So we're failing at failing.
1185 01:19:30,320 --> 01:19:34,559 Speaker 1: It's getting a little fractal there, 1186 01:19:34,680 --> 01:19:37,640 Speaker 1: but I think not letting failures get you down is 1187 01:19:37,680 --> 01:19:41,080 Speaker 1: probably a good lesson. Okay, that's good. 1188 01:19:41,120 --> 01:19:42,479 Speaker 1: What do you do for fun? What do you do 1189 01:19:42,520 --> 01:19:45,840 Speaker 1: when you're not doting on the grandkids and doing research? Oh, 1190 01:19:46,160 --> 01:19:48,719 Speaker 1: research is fun. I'm one of the few people 1191 01:19:48,760 --> 01:19:55,840 Speaker 1: that loves writing, because it's making things, sculpting. Sculpting. That's 1192 01:19:55,920 --> 01:20:00,000 Speaker 1: such a good turn of phrase. The Librarian 1193 01:20:00,000 --> 01:20:04,360 Speaker 1: of Congress, Daniel Boorstin, used to say, I write 1194 01:20:04,400 --> 01:20:06,920 Speaker 1: to figure out what I think, but you've reduced that 1195 01:20:06,960 --> 01:20:11,280 Speaker 1: to one word: sculpting. Yeah, I'm a fan of Daniel Boorstin 1196 01:20:11,400 --> 01:20:17,439 Speaker 1: too. Exactly. They're great books. Yeah, really uplifting and 1197 01:20:17,560 --> 01:20:20,840 Speaker 1: just so well researched and so beautifully written. Yeah, and 1198 01:20:20,920 --> 01:20:24,559 Speaker 1: he gets the essences of these things in an exciting way. 1199 01:20:24,640 --> 01:20:27,439 Speaker 1: Not small books. That's a summer; each 1200 01:20:27,439 --> 01:20:29,879 Speaker 1: of those books is like, there's your July and August. 1201 01:20:30,080 --> 01:20:34,320 Speaker 1: Yeah, and they're great. New York is 1202 01:20:34,400 --> 01:20:37,360 Speaker 1: full of fun. I mean, the most fun is good 1203 01:20:37,400 --> 01:20:41,439 Speaker 1: conversation with friends, and I happen to have good friends 1204 01:20:41,439 --> 01:20:47,080 Speaker 1: who are good conversationalists. But I love music. I've developed 1205 01:20:47,120 --> 01:20:52,400 Speaker 1: a late passion for opera. For opera? Really interesting. And 1206 01:20:52,439 --> 01:20:55,160 Speaker 1: that took many years, and now it's over the top, 1207 01:20:55,240 --> 01:20:58,599 Speaker 1: and the stories are often, you know, the men are 1208 01:20:58,640 --> 01:21:01,320 Speaker 1: bastards and the women are saints and the women die. 1209 01:21:01,880 --> 01:21:08,360 Speaker 1: But in the end, yeah, Bohème, right. 1210 01:21:08,720 --> 01:21:12,679 Speaker 1: But I've learned to love opera just by going. I'm 1211 01:21:12,760 --> 01:21:17,599 Speaker 1: no expert, but you learn by experiencing. And it's again 1212 01:21:17,640 --> 01:21:21,920 Speaker 1: a different way of learning than book learning. And I 1213 01:21:21,920 --> 01:21:24,280 Speaker 1: think that's what you pointed out: you learn from 1214 01:21:24,360 --> 01:21:28,160 Speaker 1: art just by looking, just watching. There's stuff to 1215 01:21:28,200 --> 01:21:30,160 Speaker 1: be picked up.
So you're in New York a couple 1216 01:21:30,160 --> 01:21:33,200 Speaker 1: of months a year in California? Well, now I'm 1217 01:21:33,240 --> 01:21:36,439 Speaker 1: actually emerita at Stanford, so I'm there for summers and 1218 01:21:36,600 --> 01:21:40,880 Speaker 1: many breaks, but I'm still teaching at Columbia, so I'm 1219 01:21:40,960 --> 01:21:47,639 Speaker 1: more here. Right, and hence, I understand, 1220 01:21:47,680 --> 01:21:50,679 Speaker 1: there's an opera or two here. I understand there's 1221 01:21:50,720 --> 01:21:55,599 Speaker 1: a couple of operas here all the time. The next 1222 01:21:55,640 --> 01:21:58,200 Speaker 1: one on my list is Wozzeck, which is a 1223 01:21:58,439 --> 01:22:04,679 Speaker 1: very hard opera. It's human tragedy at its worst. 1224 01:22:04,760 --> 01:22:09,640 Speaker 1: But William Kentridge is doing this production and the sets, 1225 01:22:09,680 --> 01:22:12,880 Speaker 1: and he, in my mind, is the most inventive and 1226 01:22:13,000 --> 01:22:19,640 Speaker 1: interesting artist alive by a long shot. Wow. That's quite interesting. 1227 01:22:19,760 --> 01:22:22,920 Speaker 1: So what are you optimistic about in the world of 1228 01:22:22,960 --> 01:22:25,880 Speaker 1: psychology today, and what are you a little pessimistic about? 1229 01:22:27,360 --> 01:22:31,200 Speaker 1: I'm optimistic in general about the arts and sciences, 1230 01:22:31,520 --> 01:22:35,400 Speaker 1: and they both have young people who are doing really 1231 01:22:35,439 --> 01:22:39,880 Speaker 1: innovative and creative things. In my own field, I happen 1232 01:22:39,960 --> 01:22:44,040 Speaker 1: to be past president of the Association for Psychological Science. 1233 01:22:44,320 --> 01:22:47,200 Speaker 1: I gave out a lot of prizes last year, including 1234 01:22:47,240 --> 01:22:51,360 Speaker 1: to young investigators. And they know math, and they know 1235 01:22:51,640 --> 01:22:54,439 Speaker 1: big data, and they know the brain, and they know 1236 01:22:54,640 --> 01:22:58,479 Speaker 1: behavior, and they're doing mind-blowing things, and you can't 1237 01:22:58,479 --> 01:23:03,360 Speaker 1: help but be in awe of these young people. Reasons 1238 01:23:03,400 --> 01:23:07,880 Speaker 1: to be optimistic about the future. And about the arts, politics, 1239 01:23:07,920 --> 01:23:10,920 Speaker 1: global warming? I'm worried about the same things. You know, 1240 01:23:10,960 --> 01:23:15,639 Speaker 1: a reckless leader of a major country doing impulsive things, 1241 01:23:15,720 --> 01:23:20,680 Speaker 1: the kind of thing we thought would never happen. You know, of course the 1242 01:23:20,720 --> 01:23:23,759 Speaker 1: adults are going to take charge; no one would behave 1243 01:23:23,800 --> 01:23:28,040 Speaker 1: recklessly like that. Yeah, you have to be optimistic that 1244 01:23:28,120 --> 01:23:30,559 Speaker 1: we will get past all that sort of thing. Well, global 1245 01:23:30,600 --> 01:23:33,439 Speaker 1: warming is more of a worry. So I'm worried 1246 01:23:33,479 --> 01:23:36,400 Speaker 1: about the things that normal people are worried about. 1247 01:23:37,000 --> 01:23:40,160 Speaker 1: I agree with you on those. I 1248 01:23:40,320 --> 01:23:44,280 Speaker 1: recently read something.
So there are reasons to be frightened 1249 01:23:44,280 --> 01:23:47,200 Speaker 1: about global warming, but there are also reasons to be 1250 01:23:47,280 --> 01:23:53,880 Speaker 1: optimistic that we'll transition to sustainable energy and we'll find 1251 01:23:53,960 --> 01:23:57,880 Speaker 1: some technological solution that will reduce the negative effects. And 1252 01:23:57,920 --> 01:24:01,000 Speaker 1: then I read this column where a person 1253 01:24:01,080 --> 01:24:06,640 Speaker 1: looked at all these surveys of people, and my optimistic 1254 01:24:06,760 --> 01:24:10,799 Speaker 1: viewpoint on technology, what happens if it's wrong? And lots 1255 01:24:10,800 --> 01:24:14,080 Speaker 1: of people seem to believe, oh yeah, we'll 1256 01:24:14,160 --> 01:24:17,639 Speaker 1: seed the skies, we'll find some way to reflect 1257 01:24:17,640 --> 01:24:21,160 Speaker 1: the sun temporarily and lower temperatures. There seems to be a 1258 01:24:21,240 --> 01:24:23,960 Speaker 1: belief amongst a lot of people that yeah, well, 1259 01:24:24,000 --> 01:24:26,000 Speaker 1: we'll come up with a magic bullet, we'll be fine, 1260 01:24:26,520 --> 01:24:29,080 Speaker 1: and that's not usually how things work. There usually aren't 1261 01:24:29,120 --> 01:24:32,920 Speaker 1: magic bullets. And when somebody explained that lots of 1262 01:24:32,960 --> 01:24:35,880 Speaker 1: people think this, I'm like, gee, maybe it really is 1263 01:24:35,960 --> 01:24:38,680 Speaker 1: much worse than I thought. I know it's bad, but I'm 1264 01:24:38,680 --> 01:24:41,240 Speaker 1: trying to be optimistic. And that kind of was like 1265 01:24:41,320 --> 01:24:44,920 Speaker 1: a reality check that maybe it's gonna be harder to 1266 01:24:44,960 --> 01:24:48,000 Speaker 1: fix this than we think. Yeah, and you look for 1267 01:24:48,040 --> 01:24:50,799 Speaker 1: optimistic things, and I think we share 1268 01:24:50,880 --> 01:24:53,800 Speaker 1: that, looking for it. I don't know that there's going 1269 01:24:53,840 --> 01:24:57,800 Speaker 1: to be a magic, single solution. It's more gonna be many. 1270 01:24:57,920 --> 01:25:02,519 Speaker 1: But what is impressive: despite the government's policy, which is 1271 01:25:02,600 --> 01:25:07,800 Speaker 1: not pro-green, many companies have discovered that they're better off. 1272 01:25:07,880 --> 01:25:13,120 Speaker 1: It's economically in their interests, and so that's happening. 1273 01:25:13,200 --> 01:25:16,080 Speaker 1: And you look at younger people and, you know, 1274 01:25:16,280 --> 01:25:19,080 Speaker 1: I once left the water running while I was brushing 1275 01:25:19,080 --> 01:25:23,080 Speaker 1: my teeth and one of my four-year-old grandchildren said, Safta, 1276 01:25:23,760 --> 01:25:27,439 Speaker 1: turn off the water, stop wasting, stop wasting water. So 1277 01:25:27,560 --> 01:25:30,559 Speaker 1: coal is a perfect example of exactly what you're talking about. 1278 01:25:31,360 --> 01:25:39,200 Speaker 1: Coal has just plummeted in usage, but 1279 01:25:39,400 --> 01:25:44,000 Speaker 1: we have made natural gas, not as good as solar 1280 01:25:44,080 --> 01:25:48,519 Speaker 1: but much better than coal, so inexpensive that 1281 01:25:49,560 --> 01:25:53,559 Speaker 1: coal-fired electrical plants are rapidly going away.
It's so 1282 01:25:53,640 --> 01:25:58,320 Speaker 1: much cheaper to switch to natural gas that the economic 1283 01:25:58,360 --> 01:26:00,679 Speaker 1: incentives are doing a lot of it on their own, 1284 01:26:01,360 --> 01:26:03,840 Speaker 1: just the cost of the material. You don't have to 1285 01:26:03,840 --> 01:26:06,400 Speaker 1: have scrubbers with natural gas, you don't have to have 1286 01:26:06,479 --> 01:26:12,960 Speaker 1: all these complicated carbon recapture systems. Now, mining for natural 1287 01:26:13,000 --> 01:26:16,040 Speaker 1: gas releases methane, that's another thing. Natural gas has a lot 1288 01:26:16,040 --> 01:26:19,920 Speaker 1: of its own problems, but on any comparison basis, it's 1289 01:26:20,000 --> 01:26:22,840 Speaker 1: just so much better than coal. Hopefully we see more 1290 01:26:22,920 --> 01:26:28,120 Speaker 1: of that moving in the right direction organically, but 1291 01:26:28,240 --> 01:26:34,840 Speaker 1: we'll see. And solar and wind, and people 1292 01:26:34,840 --> 01:26:38,439 Speaker 1: are moving. The real question is, can we go fast enough? 1293 01:26:38,479 --> 01:26:41,839 Speaker 1: Because the warming has already happened. The glaciers are melting, 1294 01:26:41,880 --> 01:26:45,520 Speaker 1: the coral reefs are dying, and that's our fish population. 1295 01:26:46,040 --> 01:26:50,599 Speaker 1: You can see the Great Barrier Reef dying from space. 1296 01:26:50,680 --> 01:26:54,160 Speaker 1: There are satellite images that are showing it bleaching for 1297 01:26:54,320 --> 01:26:58,200 Speaker 1: miles at a time. I'm trying to remember, Douglas Adams 1298 01:26:58,240 --> 01:27:00,840 Speaker 1: wrote a book, maybe it was, he's no longer with us, 1299 01:27:00,880 --> 01:27:04,240 Speaker 1: so it had to be years ago, called Last 1300 01:27:04,320 --> 01:27:09,760 Speaker 1: Chance to See, and it's all these environments and species, 1301 01:27:09,840 --> 01:27:13,439 Speaker 1: this was before eco-tourism was a big thing, 1302 01:27:14,040 --> 01:27:15,840 Speaker 1: and he said, hey, if you want to see these, 1303 01:27:16,200 --> 01:27:18,120 Speaker 1: you better go see them now because they're not going 1304 01:27:18,160 --> 01:27:21,080 Speaker 1: to be here in fifty years. And the Great Barrier 1305 01:27:21,240 --> 01:27:24,200 Speaker 1: Reef is literally, you know, much more sensitive to 1306 01:27:24,360 --> 01:27:28,719 Speaker 1: a one-degree increase in sea temperature than, you know, even 1307 01:27:28,800 --> 01:27:32,639 Speaker 1: giant populations of fish. So that's a really interesting book 1308 01:27:33,080 --> 01:27:36,759 Speaker 1: if you want to be depressed. So my last two questions. 1309 01:27:36,840 --> 01:27:40,360 Speaker 1: Let me ask you this: what sort of advice 1310 01:27:40,400 --> 01:27:42,920 Speaker 1: would you give to a recent college graduate who was 1311 01:27:43,080 --> 01:27:50,800 Speaker 1: interested in a career in experimental psychology? If it's experimental psychology, 1312 01:27:50,880 --> 01:27:56,280 Speaker 1: you have to learn brain and data. Right. It's 1313 01:27:56,280 --> 01:27:59,760 Speaker 1: hard now, much harder in some ways than when 1314 01:28:00,000 --> 01:28:03,160 Speaker 1: I was coming in. It's harder to get grant money, 1315 01:28:03,240 --> 01:28:07,559 Speaker 1: and you need grants, and so it's hard. I would 1316 01:28:07,560 --> 01:28:12,240 Speaker 1: tell people to be strategic. I wasn't, but it worked out.
1317 01:28:13,040 --> 01:28:18,000 Speaker 1: I was very lucky. That is a theme on this show. 1318 01:28:18,160 --> 01:28:21,679 Speaker 1: Lots of people say how fortunate they were and 1319 01:28:21,800 --> 01:28:25,200 Speaker 1: how lucky they were in their circumstances, and you just 1320 01:28:25,240 --> 01:28:28,559 Speaker 1: can't count on that happening always. Right. And I think 1321 01:28:28,600 --> 01:28:32,720 Speaker 1: some of it was the meandering that I did. It seemed 1322 01:28:32,840 --> 01:28:36,439 Speaker 1: at first like meandering, this is the research track 1323 01:28:36,520 --> 01:28:40,480 Speaker 1: that I took, but eventually I saw this isn't meandering, 1324 01:28:40,600 --> 01:28:43,639 Speaker 1: this is deliberate, and then I was able to craft 1325 01:28:43,720 --> 01:28:48,160 Speaker 1: what I was doing along the bigger vision that I had. 1326 01:28:48,800 --> 01:28:51,679 Speaker 1: But it took a while to get that. And if 1327 01:28:51,720 --> 01:28:56,120 Speaker 1: I look at Picasso, or at an artist like Rothko, if you 1328 01:28:56,160 --> 01:28:58,760 Speaker 1: look at early Rothko, it's very different from later. It 1329 01:28:58,840 --> 01:29:03,200 Speaker 1: was representational, it wasn't even abstract. And I think that 1330 01:29:03,360 --> 01:29:05,920 Speaker 1: you do a certain amount of meandering, and 1331 01:29:06,000 --> 01:29:10,280 Speaker 1: that's probably a good thing, of exploring, and exploring widely, 1332 01:29:10,400 --> 01:29:13,120 Speaker 1: to help you get a vision, and 1333 01:29:13,160 --> 01:29:16,559 Speaker 1: also to give you the tools that you need to 1334 01:29:16,680 --> 01:29:20,200 Speaker 1: do something bigger. And so you build up the technical 1335 01:29:20,240 --> 01:29:23,559 Speaker 1: skills and then you get a leap off into the conceptual skills, 1336 01:29:23,880 --> 01:29:27,559 Speaker 1: and you've tried many different things. It took a while 1337 01:29:27,600 --> 01:29:30,760 Speaker 1: before Picasso got his vision, and then he ended up 1338 01:29:30,800 --> 01:29:36,000 Speaker 1: having many visions because he was especially fertile in that way. 1339 01:29:36,200 --> 01:29:38,559 Speaker 1: And our final question: what do you know about the 1340 01:29:38,600 --> 01:29:42,200 Speaker 1: world of psychology today that you wish you knew thirty 1341 01:29:42,280 --> 01:29:45,760 Speaker 1: or forty years ago when you were a young student? Well, 1342 01:29:46,120 --> 01:29:50,679 Speaker 1: here's something I'm glad I didn't know. It's political, really, 1343 01:29:51,439 --> 01:29:54,720 Speaker 1: feeling like who gets tenure at universities, or what gets 1344 01:29:54,760 --> 01:29:58,080 Speaker 1: published, a little bit everywhere, and I thought, 1345 01:29:58,120 --> 01:30:01,200 Speaker 1: it was grant money too. I thought I was 1346 01:30:01,280 --> 01:30:05,000 Speaker 1: going into a field that was just intellectual. 1347 01:30:05,120 --> 01:30:09,719 Speaker 1: And you know, I'm aberrant that way, of really enjoying 1348 01:30:09,840 --> 01:30:13,280 Speaker 1: ideas and playing with ideas and contributing to them and 1349 01:30:13,320 --> 01:30:16,240 Speaker 1: wanting to be around people who are thinking. And I 1350 01:30:16,280 --> 01:30:20,519 Speaker 1: thought academics was going to be the purest place for that. 1351 01:30:21,320 --> 01:30:25,120 Speaker 1: When I was studying, there weren't that many opportunities for women.
1352 01:30:25,160 --> 01:30:28,839 Speaker 1: And I was self-supporting my last two years in college, 1353 01:30:28,840 --> 01:30:32,519 Speaker 1: so I knew there was no, I had nobody 1354 01:30:32,560 --> 01:30:36,200 Speaker 1: to fall back on. I had to make a living. 1355 01:30:36,200 --> 01:30:41,240 Speaker 1: But sure, it's even who you cite in your articles 1356 01:30:42,160 --> 01:30:48,000 Speaker 1: and so forth. So I thought, this is 1357 01:30:48,040 --> 01:30:51,400 Speaker 1: a level playing field, all that matters is good ideas. 1358 01:30:51,439 --> 01:30:56,920 Speaker 1: But it's not quite a meritocracy. Not quite right, and meritocracy 1359 01:30:56,960 --> 01:31:00,559 Speaker 1: has come to be a bad word, but really only in 1360 01:31:00,640 --> 01:31:05,240 Speaker 1: some cases. But I thought all that would matter were 1361 01:31:05,600 --> 01:31:10,160 Speaker 1: the ideas, but which ideas get picked up on and 1362 01:31:10,200 --> 01:31:12,920 Speaker 1: who they get attributed to, here being a woman 1363 01:31:13,320 --> 01:31:17,360 Speaker 1: was a bit of a disadvantage, and again I was oblivious, 1364 01:31:17,520 --> 01:31:23,360 Speaker 1: and I'm glad I was oblivious. But those 1365 01:31:23,479 --> 01:31:27,200 Speaker 1: political things are really social dynamics, and they're 1366 01:31:27,200 --> 01:31:30,479 Speaker 1: about human beings. And in the end, even if it's 1367 01:31:30,520 --> 01:31:35,080 Speaker 1: intellectual and ideas about science, it's still human beings that 1368 01:31:35,120 --> 01:31:39,240 Speaker 1: are making the market. Quite fascinating. Barbara, thank you for 1369 01:31:39,280 --> 01:31:41,960 Speaker 1: being so generous with your time. We've been talking for 1370 01:31:42,040 --> 01:31:45,200 Speaker 1: about two hours, and I could go for another two hours, 1371 01:31:45,600 --> 01:31:48,120 Speaker 1: but I know you have places to go and people 1372 01:31:48,120 --> 01:31:52,400 Speaker 1: to see. We have been speaking with Barbara Tversky, professor 1373 01:31:52,439 --> 01:31:56,559 Speaker 1: of psychology at Columbia and Stanford and author of the 1374 01:31:56,600 --> 01:32:01,040 Speaker 1: book Mind in Motion: How Action Shapes Thought. If you 1375 01:32:01,320 --> 01:32:04,479 Speaker 1: enjoy this conversation, be sure and look up an 1376 01:32:04,479 --> 01:32:07,360 Speaker 1: inch or down an inch on Apple iTunes and you can 1377 01:32:07,400 --> 01:32:11,160 Speaker 1: see any of the three hundred prior such conversations we've had. 1378 01:32:11,680 --> 01:32:15,000 Speaker 1: You can find that on iTunes, Google Podcasts, Stitcher, 1379 01:32:15,000 --> 01:32:19,919 Speaker 1: Spotify, Overcast, wherever fine podcasts are sold. We love your comments, 1380 01:32:19,920 --> 01:32:23,799 Speaker 1: feedback, and suggestions; write to us at MIB podcast 1381 01:32:23,880 --> 01:32:27,760 Speaker 1: at Bloomberg dot net. Give us a review on Apple iTunes. 1382 01:32:28,280 --> 01:32:30,240 Speaker 1: I would be remiss if I did not thank the 1383 01:32:30,280 --> 01:32:34,519 Speaker 1: crack staff that helps put these conversations together each week. 1384 01:32:35,120 --> 01:32:41,280 Speaker 1: Paris Wald is my producer, Mark Sinnascalce is my audio engineer. 1385 01:32:41,760 --> 01:32:45,320 Speaker 1: I'm Barry Ritholtz. You've been listening to Masters in Business 1386 01:32:45,600 --> 01:32:46,759 Speaker 1: on Bloomberg Radio