Speaker 1: Today we're going to talk about intuition. Why do you experience a feeling when you walk into a place, that you don't like this restaurant and you want to go to this other restaurant? Or you trust this person, but you don't quite trust that person. Or you might have a sense that something is wrong with your pet or your child, even if you can't articulate what the issue is. So in all these cases, what are the signals that the brain is picking up on, and what fraction of those signals does your consciousness have access to? And importantly, does intuition sometimes steer us wrong? What we call intuition is not something to be trusted automatically; it quite often steers us the wrong way. So what are the features of the situation to look for where you might feel like, okay, my intuition is more likely to be useful here than in this other situation, where my intuition is more likely to steer me wrong? And what is the future of intuition as we build new technologies that might take the myriad signals racing around in the darkness of our brains and bodies and bring them to light?
Speaker 1: Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and I've spent my career at the intersection between how the brain works and how we experience life.

Today's episode is about that gut feeling that we call intuition. Now, if you've been listening to this podcast for a while or reading my books, you've definitely heard me talk about all the stuff that transpires in the unconscious mind. This is all the computations we have no access to and really no acquaintance with. So think about how you walk: all the sophisticated coordination of muscles and balance that's required there. Until very recently, it was an impossibly difficult task to get robots to do that. But ever since you were two years old or so, you've been doing it just fine. Now, you can't articulate how you walk and how you keep your balance; it's much like riding a bicycle in that way. And presumably you can do several things while you're walking or riding a bike: you can have a conversation, or maybe punch out a text on your phone while you're doing it.
Speaker 1: But if I ask you to describe to me what you are doing with all that beautiful, sophisticated musculature, it's simply not possible. You put the years of work into learning how to walk and how to ride a bicycle, and it resulted in expertise, but an inability to articulate what was going on there under the hood. And it's like this with most of the actions that you take, whether that's deciding what you want to eat, or who you're going to marry, or how you're identifying a smell from some mixture of molecules, or whatever. You just find that your brain can magically do it, because all these operations are happening in silence, in the darkness of this vast computational vault. So your conscious mind is like the broom closet in the mansion of your brain, and all of the rest involves work that's happening that you can't see. And why is that the case? It's because the data being pulled in from all your senses is way too detailed to be useful. Your retina at the back of your eye transmits visual input at about ten million bits per second, which is similar to an ethernet connection.
Speaker 1: But if you had conscious access to all that data, you wouldn't be able to operate at your level of space and time. At your level, what you care about is: where the heck did I put my toothbrush? Or what can I read on this menu? Or where is my friend's face in this crowd of people? You just want simple, high-level answers to important questions. You want the abstractions, not the details. So your brain takes in enormous amounts of data, but what you want is the headline. In Silicon Valley, where I live, for example, all my venture capitalist friends talk about this as pattern matching. They have thousands of new companies coming in to give them pitches, and in each of those pitches there are a million details that they can't know. But they think: you know, in my career I've seen all these pitches, and patterns emerge. I don't even know why, but I sometimes feel like, wow, this team of founders is really great, or this team of founders is never going to make it, even though the idea seems strong.
Speaker 1: So those assessments rely on their intuition, and what they're trying to do is make their intuition good, and refine it and refine it with practice. So in many situations, in all of our lives, we have lots of practice at something, and we see a new situation and we make an assessment like yes or no, or good or bad, or take it or leave it. And the extraordinary thing about brains is that they can summarize a complex situation and say, look, although there are a million details here, the big picture is that you should approach that or avoid that. And you rarely have access to the details about why. Sometimes you might think you have access, but that is often just a retrospective story. Now, in neuroscience jargon we call this unconscious inference, meaning you are inferring something about the situation and you're not doing it consciously, presumably because of the amount of data that is involved. And in popular parlance we call this intuition. But intuition sometimes gets a bad name, because it's often hijacked by spiritual guru types who say you should always trust your intuition. Now here's the truth: you should not always trust your intuition.
Speaker 1: After all, your intuition is not a perfect predictor. In Silicon Valley, for example, four out of ten companies that are backed by those venture capitalists lose money or go bankrupt. And many times the intuitive decisions that we make, let's say about relationships, turn out to be bad decisions. Just because the computations are happening under the hood and they're vast and complex, it doesn't mean they're guaranteed to be correct. Intuition is quite often wrong, or makes a decision based on things that you wouldn't feel proud of if you had access to them. For example, what if your intuition tells you not to sell your house to this guy, but it's because of some deep-down, unconscious racism that you weren't even aware of? What if your intuition tells you that there's something creepy about that guy over there, and you don't want to do business with him, and you say, look, I'm just trusting my gut on this one?
Speaker 1: But it turns out the poor guy has early Parkinson's disease, and as a result he's unable to make expressive facial expressions; instead his face is mostly blank, and that creeps you out because you don't get what's happening, because, let's say, you're not a clinician. Or what if you're sitting on a jury and you think, there's something about this guy that makes me feel like he should be put away for a long time. But it turns out it's because he's not handsome but ugly. There's an extensive literature on this: that ugly people get longer sentences than good-looking people. So the idea that we should always trust our intuition turns out to be mistaken. But the good news is, it turns out that we can be a little bit smarter about when to look to our intuition or ignore it. Researchers have been working to figure out the circumstances under which your intuition might be more trustworthy or less trustworthy. So for this I decided to have a conversation with my friend and colleague Joel Pearson. Joel's a professor of cognitive neuroscience at the University of New South Wales in Sydney, Australia.
Speaker 1: He's recently written a book about this called The Intuition Toolkit. So Joel and I sat down together to dig into this topic. Joel, tell us about intuition.

Speaker 2: Yeah, so, a topic I love. We've been studying it for over a decade now, and I've recently just published a book, The Intuition Toolkit. So a decade ago we started looking at the landscape of intuition, and some people define it as a spiritual thing; other people in psychology and neuroscience define it very, very differently. You have all these different definitions. So I wanted to come up with a clear definition, the most practical and what I think is the most useful definition, and that is: the productive, learnt use of unconscious information for better decisions and actions.

Speaker 1: By the way, does intuition always give better decisions? This is me asking.

Speaker 2: Okay, yeah. So this was part of the problem, so I'm going to introduce a couple of words. This is what I did at the beginning: I came up with another word called misintuition.
Speaker 2: Right, I know, it sounds like a Miss Universe contest, but it's not. I wanted a word that... so I wanted to sort of split apart the misfiring of intuition, when it goes wrong, because there are times when you should absolutely not use it and other times when it is useful. So I wanted to split that apart, and came up with this other word, misintuition, alongside intuition. So that's why I define intuition as a productive use, right, of unconscious information, and it's learnt, right. We'll get to this in a second, I'm sure, but you can only use intuition really for something you have experience with, and that turns out to be important.

Speaker 1: So give us more of a sense of this, then.

Speaker 2: Yeah.
Speaker 2: So the example I like to give is: you walk into a cafe, right, and the second you walk through the door, your brain's processing a thousand different things, right? The music, what the staff are wearing, if there's tablecloths or no tablecloths, the temperature, how clean the floor is, smells, hundreds of things, and you're not consciously aware of them all, right. And as you walk in, you're going to go, yeah, let's get coffee here, or let's go across the road to the other place, right. And people will sometimes report that they don't know why, but there was something off that they felt, sometimes in the gut, right, the gut response. That's kind of the way I tend to introduce intuition. And what's happening in those few seconds as you walk in the door is that things in the environment are triggering positive or negative associations, and that is based on past learning, right: the classic Pavlov's dogs, classical conditioning. Right.
Speaker 2: So if you go to enough cafes where it's really hot in there and there's bad music and the coffee is terrible, your brain will learn that. Whether you consciously remember it or not, your brain will start to learn these associations, right. If you get food poisoning and get sick, it's going to learn it really quickly. And so you go to enough cafes, you go to hundreds or thousands, and your brain will learn which cues in the environment predict good or bad food or coffee. And that's what's happening. So all the things in the environment are triggering red flag, green flag, right, and you feel it and you don't know why, right, and you either get the coffee or you go somewhere else. So that's, I think, the best way to think about it.

Speaker 1: Cool. And this is why I think about intuition as just having a wide-angle lens on the scene. There's a lot of interest in neuroscience, obviously, about cognition, where you're deciding, okay, what's the next chess move here, and so on.
Speaker 1: And that's what lots of AI is about: how do I make a really smart decision here? But in a sense, with intuition, it's: I'm going to take in the whole cafe, and something's going to bubble up saying go or stay. So I'm really interested in this issue of misintuition, though, as you phrase it, because there are lots of times that people have intuitions that are totally incorrect. Like, oh, this coffee shop, there's something that creeps me out about it. Like the guy in the corner there, I feel like he's a mass murderer, even though maybe the guy has, you know, something like Parkinson's disease, where people have blank facial expressions. I've seen this a bunch of times, where someone has Parkinson's and other people think, oh my gosh, that guy's creepy, there's something wrong with that guy, because they don't know about it. Right. Okay, so give us a sense of misintuition, and of when you feel like... is intuition mostly right, mostly wrong, totally random?

Speaker 2: So I'm glad you asked.
Speaker 2: So the second half of the book unpacks this, based on these five rules, and I use the acronym SMILE so people can try to remember it, right. The first half of the book is really about the science we've done, the lab, how you measure it, how you create it in the lab. And it's not always good: sometimes you should use it, sometimes you shouldn't, and that's what the rules unpack. So let's sort of tell people when it's safe to trust their intuition. An example of this I give often is Steve Jobs, who talked about using his intuition at Apple, both for product design and running the company, in all different ways, you know, and he was a master at it. He went to India and studied it; he was really into it. But when it came to his health decisions later in life, and some of his other decisions, not so much, right. And that's interesting, because...

Speaker 1: Wait, can you unpack that for people who don't know about his health?
Speaker 2: So he died from cancer in the end. His doctors and his family wanted him to get treatment, to get an operation, and he said no; he didn't want to be invaded surgically like that, and he put off the treatment, put it off and put it off, until it was basically too late. And this speaks to the idea that intuition is specific to a particular environment or context, and it makes sense, because the learning that I'm saying it seems to be based on is context-specific. And this is one of those rules; it's actually the E at the end of SMILE, for environment. So if you craft your intuition expertly at, you know, work, then when you go home, or when you're on holiday in a different context, a different environment, that learning is not going to apply so well. So we need to be careful when we change environments, because when you learn something, the environment gets sort of attached to that.
Speaker 2: Right: when you're studying for the exam in your bedroom, not only are you learning all the information, you're learning the information imprinted to your bedroom. So we have to be careful with that. So that's an example of where you can be led astray. Now, the S at the beginning of SMILE is self-awareness, and that really is about emotions. So we shouldn't trust our intuition when we're emotional and we're stressed, anxious, or depressed, for a number of reasons, including something called arousal misattribution. Humans are not very good at understanding where particular feelings or stress or emotions come from, and we can deep dive into this; there are classic experiments in psychology, and now a lot of data to support this, right. But basically, you know, if there was a snake right here on the floor, we'd both get adrenaline, we'd both be sweating and scared of the snake, and then later on we'd confuse that and say, man, that podcast was amazing, right? We'd confuse the emotion from the situation with the emotion from the snake. We're just bad at understanding the source of these things.
Speaker 2: So that's the number one rule: if you're emotional or stressed, don't follow your intuition. Just put it aside and follow the logic as best you can, right. So that's S. Then M is for mastery, and we mentioned learning before. That's really that you need to build these associations in your brain between the environment, the triggers, and good or bad outcomes. So if you've never played chess before, you can't sit down and just be a grandmaster intuitive chess player, right. Your brain needs to learn the associations between patterns and outcomes, what's good, what's bad. Likewise for anything, so you need to sort of put the time in and build that experience. How much experience? Ten thousand hours? Not so much; we don't know, right. It depends how emotional things are, right. We know that with something like PTSD you can learn something very strongly, you know, in a moment. If it was just good or bad coffee, it's going to take a lot of learning, a lot of iterations. But if something is highly emotional, then you'll learn it very, very quickly.
Speaker 2: So yeah, you need to have experience with that. And now I, that's a really interesting one: I is really for impulses, but also addiction. So I started hearing from people that with things that have an addictive nature, so alcohol, drugs, but also social media, checking your email, behavioral addictions, people have these urges, a strong pull towards them, and they can confuse that with, or call it, intuition. But I didn't want that to be part of intuition; I think that's something a little bit different. So one of the rules is just to steer clear of anything that's addictive and not confuse that with intuition, right, including food. And that's a controversial one, because there's a whole movement of intuitive eating, which is a hot topic we can deep dive into.

Speaker 1: What is that movement?

Speaker 2: There are all these books on intuitive eating, and this is the idea that you just follow how you feel about the food, right? You eat as much as you want, whenever you want, and your body will naturally come to an equilibrium, more or less, right? I'm summarizing it.
Speaker 2: The problem is, modern food is highly engineered, right? Billions of dollars go into making food basically like a drug, addictive, and that's a very dangerous thing. So I'm against that idea, unless you're out in the country and you're eating a whole-food diet; then maybe that can work. And then L is for low probability, but really it's for anything around probabilities and numbers, right. So psychology textbooks are full of examples where people just get these probability tests completely wrong, right. Our brains just don't process numbers very well, or probabilities particularly. And so again, for anything around probabilities and numbers, stay away from intuition, right. Just don't use your intuition for anything around numbers.

Speaker 1: Right, as in: I don't want to drive over the bridge because I've seen lots of stories of bridges collapsing.

Speaker 2: Yeah.

Speaker 1: When in fact it's a terribly low probability. People get scared; they get the gut feeling of not wanting to do that.

Speaker 2: So, I live in Sydney.
340 00:18:19,000 --> 00:18:21,400 Speaker 2: Sometimes I'll call it the shark attack rule, right, where 341 00:18:21,720 --> 00:18:24,679 Speaker 2: the sharks, you know. I think, to 342 00:18:24,720 --> 00:18:27,040 Speaker 2: give you an Aussie stat, you're more likely to 343 00:18:27,080 --> 00:18:29,600 Speaker 2: be injured by a kangaroo than a shark, right, and 344 00:18:29,640 --> 00:18:31,840 Speaker 2: certainly more likely to be hurt in a car crash 345 00:18:32,000 --> 00:18:34,720 Speaker 2: than by a shark. But people don't think twice about getting 346 00:18:34,720 --> 00:18:36,399 Speaker 2: in the car or patting a kangaroo, but, 347 00:18:36,880 --> 00:18:38,800 Speaker 2: you know, once you start imagining a shark, you know, 348 00:18:38,840 --> 00:18:40,879 Speaker 2: in the water, then you get really scared of it. 349 00:18:40,920 --> 00:18:43,760 Speaker 2: So for a range of reasons, we're just not good 350 00:18:43,760 --> 00:18:47,400 Speaker 2: at following the numbers and probabilities. Emotions, other things creep in. 351 00:18:47,840 --> 00:18:51,000 Speaker 2: So anything around that, don't follow the feelings, don't follow intuition. 352 00:18:51,119 --> 00:18:53,280 Speaker 2: Stick with the numbers and the probabilities, you know. 353 00:18:53,359 --> 00:18:55,640 Speaker 1: One example here in Silicon Valley is with self 354 00:18:55,720 --> 00:18:59,440 Speaker 1: driving cars. The probabilities on self driving cars are terrific, 355 00:18:59,480 --> 00:19:01,880 Speaker 1: as in you're much less likely to get in an accident 356 00:19:01,880 --> 00:19:05,000 Speaker 1: with those. But there's something about control and the other 357 00:19:05,240 --> 00:19:08,480 Speaker 1: emotions involved there that makes it so people just can't 358 00:19:08,560 --> 00:19:12,879 Speaker 1: do it. They'd rather have a much higher risk drive 359 00:19:13,160 --> 00:19:14,359 Speaker 1: than do something safe.
360 00:19:14,760 --> 00:19:17,320 Speaker 3: Yeah, it's funny. Yeah, yeah, it's an odd thing. 361 00:19:17,680 --> 00:19:19,679 Speaker 1: So for the L, what's the advice to people, though? 362 00:19:19,720 --> 00:19:21,840 Speaker 1: That if it's a low probability event, don't trust it? 363 00:19:22,040 --> 00:19:26,440 Speaker 2: Anything around probabilities, smoking, climate change, or anything around numbers 364 00:19:26,440 --> 00:19:30,200 Speaker 2: and probabilities, don't feel it. Follow the numbers, 365 00:19:30,240 --> 00:19:32,840 Speaker 2: do the math, just follow whatever the numbers say, don't 366 00:19:32,880 --> 00:19:34,200 Speaker 2: try and feel your way through it. 367 00:19:34,680 --> 00:19:34,880 Speaker 1: Right. 368 00:19:35,160 --> 00:19:36,640 Speaker 3: And then E was the environment. 369 00:19:37,000 --> 00:19:40,399 Speaker 2: So I mentioned before with Steve Jobs that the learning 370 00:19:40,480 --> 00:19:42,400 Speaker 2: is context and environment specific. 371 00:19:42,480 --> 00:19:43,040 Speaker 3: So when you're on 372 00:19:43,080 --> 00:19:46,119 Speaker 2: holiday, right. I'm in a different country now, you know, 373 00:19:46,160 --> 00:19:47,840 Speaker 2: I look to the wrong side, and I feel like 374 00:19:47,840 --> 00:19:50,920 Speaker 2: I should look to the right, 375 00:19:50,960 --> 00:19:53,800 Speaker 2: but the cars are coming from the left here in America. 376 00:19:54,080 --> 00:19:56,240 Speaker 2: And, you know, there's all kinds 377 00:19:56,280 --> 00:19:59,159 Speaker 2: of very subtle things like that when you're traveling that 378 00:19:59,280 --> 00:20:02,040 Speaker 2: I have to really pay attention to, not just do 379 00:20:02,119 --> 00:20:05,119 Speaker 2: my usual intuitive thing. I have to go back 380 00:20:05,160 --> 00:20:07,119 Speaker 2: to first principles and think logically about what 381 00:20:07,160 --> 00:20:07,560 Speaker 2: I'm doing. 382 00:20:07,600 --> 00:20:09,160 Speaker 3: So that's SMILE. 383 00:20:09,320 --> 00:20:11,520 Speaker 2: There's these five rules that fall out of the science 384 00:20:11,800 --> 00:20:12,560 Speaker 2: around intuition. 385 00:20:12,960 --> 00:20:14,800 Speaker 1: Just out of curiosity, why did you call it SMILE 386 00:20:14,920 --> 00:20:17,000 Speaker 1: instead of MILES or SLIME? 387 00:20:20,000 --> 00:20:22,280 Speaker 3: My wife helped me with that. She essentially came up with it. 388 00:20:22,320 --> 00:20:24,040 Speaker 3: I wanted something positive, because a lot of
389 00:20:24,040 --> 00:20:25,879 Speaker 2: the rules are don't do this, don't use your intuition. 390 00:20:25,960 --> 00:20:28,800 Speaker 2: I like the idea of smile before you intuit, 391 00:20:29,200 --> 00:20:32,239 Speaker 2: smile before you use your intuition. And I like the idea 392 00:20:32,440 --> 00:20:35,040 Speaker 2: of people having a daily practice and, you know, 393 00:20:35,160 --> 00:20:37,440 Speaker 2: getting really used to those rules, like, you know, going 394 00:20:37,440 --> 00:20:40,200 Speaker 2: to the gym or something and practicing, right. I suggest 395 00:20:40,280 --> 00:20:43,000 Speaker 2: people have a sort of an intuition practice diary or 396 00:20:43,119 --> 00:20:45,480 Speaker 2: a table where they'll keep track of things and learn 397 00:20:45,560 --> 00:20:47,600 Speaker 2: to use it. So one of the things is that people 398 00:20:47,640 --> 00:20:50,560 Speaker 2: may not use intuition for small decisions, but when 399 00:20:50,600 --> 00:20:53,320 Speaker 2: they're going to get married or get divorced, or 400 00:20:53,640 --> 00:20:55,919 Speaker 2: live somewhere else, move overseas, or take a new job or 401 00:20:55,960 --> 00:20:58,360 Speaker 2: leave a job, all of a sudden they start talking about, oh, 402 00:20:58,400 --> 00:21:00,879 Speaker 2: my gut says this, I'm feeling it this way or 403 00:21:00,880 --> 00:21:03,840 Speaker 2: that way. And they're not well practiced at following these 404 00:21:03,920 --> 00:21:06,360 Speaker 2: kinds of rules. They might be feeling highly anxious, right, 405 00:21:06,400 --> 00:21:07,560 Speaker 2: and their intuition 406 00:21:07,280 --> 00:21:09,000 Speaker 3: is not going to work when they're anxious. 407 00:21:09,040 --> 00:21:11,280 Speaker 2: So I like the idea of starting with small decisions 408 00:21:11,600 --> 00:21:13,159 Speaker 2: and working up to the larger ones. 409 00:21:13,359 --> 00:21:17,560 Speaker 1: That's excellent.
That's so good, because people write about intuition, 410 00:21:17,760 --> 00:21:21,960 Speaker 1: and I feel like they often give way too much 411 00:21:22,000 --> 00:21:24,440 Speaker 1: credit to intuition. So what you're doing is so important: 412 00:21:24,440 --> 00:21:26,199 Speaker 1: to say, look, here are the times to listen to it, 413 00:21:26,200 --> 00:21:28,840 Speaker 1: and here are the times not to listen to it, because 414 00:21:28,880 --> 00:21:31,600 Speaker 1: you might drift into the realm of misintuition there, which 415 00:21:31,640 --> 00:21:34,639 Speaker 1: people do all the time. And obviously, people with anxieties 416 00:21:34,680 --> 00:21:41,639 Speaker 1: about things or misapprehensions about things, including, you know, whatever 417 00:21:41,680 --> 00:21:44,000 Speaker 1: it is, maybe they're divorced from reality or they just 418 00:21:44,400 --> 00:21:49,400 Speaker 1: overrepresent something. People will often follow their gut, but incorrectly. 419 00:21:49,560 --> 00:21:52,520 Speaker 1: So this is a really important thing that you're doing here. 420 00:21:52,160 --> 00:21:54,639 Speaker 2: If you're anxious about flying and getting on the plane, 421 00:21:54,640 --> 00:21:56,679 Speaker 2: it's probably not your intuition telling you something is going 422 00:21:56,760 --> 00:22:00,280 Speaker 2: to happen, unless, you know, you're a mechanic, an airline 423 00:22:00,280 --> 00:22:02,600 Speaker 2: mechanic, and you're an expert, right. It's not that there's something; 424 00:22:02,600 --> 00:22:04,960 Speaker 2: it's just your anxiety, so don't confuse the two. 425 00:22:05,400 --> 00:22:06,840 Speaker 1: How do you measure this in the lab?
426 00:22:07,119 --> 00:22:09,840 Speaker 2: Yes, we started off on this quest almost a decade 427 00:22:09,840 --> 00:22:13,720 Speaker 2: ago now, well, actually, a full decade. So we had 428 00:22:13,720 --> 00:22:16,000 Speaker 2: this sort of rough definition of intuition, and we'd been 429 00:22:16,040 --> 00:22:18,879 Speaker 2: studying consciousness. So we do a lot of consciousness research 430 00:22:18,920 --> 00:22:22,520 Speaker 2: in my lab, and we have ways of rendering things unconscious. 431 00:22:22,560 --> 00:22:23,840 Speaker 2: What do I mean by that? So we have a 432 00:22:23,840 --> 00:22:26,479 Speaker 2: way to show people a picture in one eye and 433 00:22:26,480 --> 00:22:29,439 Speaker 2: then flash bright colors in the other eye, and that 434 00:22:29,480 --> 00:22:32,720 Speaker 2: can render that picture completely unconscious. So it's kind of 435 00:22:32,800 --> 00:22:34,320 Speaker 2: like me just doing this, right, I see you in 436 00:22:34,440 --> 00:22:36,000 Speaker 2: this eye and my hand there. 437 00:22:36,240 --> 00:22:39,800 Speaker 1: For the audio audience, he's covering one of his eyes. 438 00:22:39,880 --> 00:22:42,320 Speaker 2: Yes, so if you're on YouTube there, yes. So that 439 00:22:42,440 --> 00:22:45,040 Speaker 2: puts the visual system into binocular rivalry, 440 00:22:45,080 --> 00:22:47,680 Speaker 2: this state of competition. Right, and if the thing 441 00:22:47,720 --> 00:22:49,919 Speaker 2: in the other eye is bright enough and you 442 00:22:49,960 --> 00:22:52,359 Speaker 2: make it flicker, you'll never see the picture in the 443 00:22:52,400 --> 00:22:52,960 Speaker 2: other eye. 444 00:22:53,280 --> 00:22:56,120 Speaker 1: So let me just unpack this, which is that you've 445 00:22:56,119 --> 00:22:58,480 Speaker 1: got two eyes.
They normally fuse their pictures so you 446 00:22:58,520 --> 00:23:00,879 Speaker 1: see the world, but in fact, you can in the 447 00:23:00,920 --> 00:23:03,720 Speaker 1: laboratory give a stimulus to one eye and a different stimulus 448 00:23:03,760 --> 00:23:05,720 Speaker 1: to the other eye, and you can have them compete. 449 00:23:06,200 --> 00:23:08,679 Speaker 1: And by doing this cleverly, you can make it so 450 00:23:08,720 --> 00:23:12,239 Speaker 1: that the input going into one eye you're not conscious of, 451 00:23:12,440 --> 00:23:15,960 Speaker 1: even though your visual system still sees it. Exactly. Yeah. 452 00:23:16,040 --> 00:23:19,159 Speaker 2: So I call this inception, and we're doing it with 453 00:23:19,400 --> 00:23:22,240 Speaker 2: emotional images. I call it emotional inception, like the Christopher 454 00:23:22,280 --> 00:23:24,600 Speaker 2: Nolan film. Right. So we're not hacking dreams 455 00:23:24,720 --> 00:23:26,800 Speaker 2: or anything, but this is what we need. 456 00:23:26,840 --> 00:23:28,560 Speaker 2: It's like the ingredients you need if you want to 457 00:23:28,560 --> 00:23:30,639 Speaker 2: study intuition. How can you get something into the brain 458 00:23:31,080 --> 00:23:33,160 Speaker 2: and know it's unconscious? And this is a way 459 00:23:33,160 --> 00:23:35,120 Speaker 2: of doing that. So what we do is we show 460 00:23:35,160 --> 00:23:39,040 Speaker 2: people sort of nasty images, snakes, spiders, sharks, guns, these 461 00:23:39,119 --> 00:23:40,920 Speaker 2: kinds of things, in one eye, and we have a bright, 462 00:23:41,080 --> 00:23:43,880 Speaker 2: flickering stimulus in the other eye, and so they never 463 00:23:43,960 --> 00:23:46,639 Speaker 2: see the scary image, but we know their brain is 464 00:23:46,680 --> 00:23:49,439 Speaker 2: processing it.
We can put a little thing around their 465 00:23:49,480 --> 00:23:51,159 Speaker 2: finger and we can see that they start sweating a 466 00:23:51,200 --> 00:23:53,280 Speaker 2: little bit more in those conditions when there's an emotional 467 00:23:53,280 --> 00:23:55,879 Speaker 2: thing there, so we know it is being processed. So 468 00:23:55,960 --> 00:23:58,960 Speaker 2: that's the first ingredient. The second ingredient is there needs to 469 00:23:58,960 --> 00:24:01,480 Speaker 2: be a decision, right, decision making. So at the 470 00:24:01,520 --> 00:24:04,119 Speaker 2: same time, we have this noisy cloud of dots just 471 00:24:04,160 --> 00:24:06,120 Speaker 2: moving all over the place on the screen, a bit 472 00:24:06,200 --> 00:24:08,639 Speaker 2: like the old school analog TVs, right, with that 473 00:24:09,000 --> 00:24:11,520 Speaker 2: fuzzy snow. So a bit like that, but it's slightly 474 00:24:11,600 --> 00:24:14,280 Speaker 2: drifting left or right a little bit more, and it's 475 00:24:14,359 --> 00:24:16,240 Speaker 2: kind of hard to pick if it's going left or right. 476 00:24:16,560 --> 00:24:18,800 Speaker 2: So super simple, all people have to do is say, oh, 477 00:24:18,840 --> 00:24:19,600 Speaker 2: it's going left. 478 00:24:20,200 --> 00:24:21,960 Speaker 3: That's it. That's the end of a trial. 479 00:24:22,000 --> 00:24:23,960 Speaker 2: Then we do it again and again. And so they 480 00:24:24,000 --> 00:24:25,800 Speaker 2: don't know we're doing this, but at the same time 481 00:24:25,840 --> 00:24:27,640 Speaker 2: they're deciding whether it's moving left or right, we're 482 00:24:27,640 --> 00:24:31,439 Speaker 2: showing them a positive or a negative image.
And then 483 00:24:31,480 --> 00:24:35,080 Speaker 2: what we see over time is their brain starts to 484 00:24:35,080 --> 00:24:38,280 Speaker 2: associate the positive or negative with the direction, and their 485 00:24:38,280 --> 00:24:39,440 Speaker 2: performance starts to go up. 486 00:24:39,480 --> 00:24:40,719 Speaker 3: They get better and better at it. 487 00:24:41,240 --> 00:24:43,520 Speaker 2: Right, their reaction times get faster and faster, so their 488 00:24:43,560 --> 00:24:46,600 Speaker 2: accuracy goes up, their responses are faster, and if you 489 00:24:46,600 --> 00:24:49,679 Speaker 2: ask them how confident they are, their confidence goes up 490 00:24:49,760 --> 00:24:53,040 Speaker 2: as well. Right. And then if you give them a 491 00:24:53,119 --> 00:24:56,240 Speaker 2: questionnaire and say, how do you make decisions in everyday life, 492 00:24:56,600 --> 00:24:57,919 Speaker 3: you know, outside the lab, the 493 00:24:57,920 --> 00:25:00,600 Speaker 2: people that report making more intuitive decisions are much 494 00:25:00,640 --> 00:25:02,680 Speaker 2: better, or they get more of a boost from 495 00:25:02,720 --> 00:25:06,159 Speaker 2: these unconscious images. So I know, to a lot 496 00:25:06,160 --> 00:25:07,720 Speaker 2: of people, they're like, wow, this is kind of a 497 00:25:07,760 --> 00:25:10,520 Speaker 2: strange way to measure intuition. And it is, but I 498 00:25:10,520 --> 00:25:13,680 Speaker 2: think it's an interesting sort of technology, in a way, 499 00:25:13,720 --> 00:25:16,520 Speaker 2: to measure and dissect something in the lab so we 500 00:25:16,560 --> 00:25:19,080 Speaker 2: can understand it, right, but it is different to how a 501 00:25:19,080 --> 00:25:21,080 Speaker 3: lot of people may think about intuition. 502 00:25:21,160 --> 00:25:23,679 Speaker 1: And tell us about intuition and AI.
503 00:25:24,200 --> 00:25:25,520 Speaker 2: Yeah. So at the end of the book, in the 504 00:25:25,560 --> 00:25:27,760 Speaker 2: very last chapter, I started talking about this, 505 00:25:28,359 --> 00:25:30,800 Speaker 2: and it's something that came up early; journalists would 506 00:25:30,800 --> 00:25:34,840 Speaker 2: ask this question. And I realized that the way I'm 507 00:25:34,840 --> 00:25:37,960 Speaker 2: defining it, right, unconscious learning, things in the world become 508 00:25:37,960 --> 00:25:42,160 Speaker 2: associated with positive or negative outcomes in your brain. Unconscious learning. 509 00:25:42,680 --> 00:25:46,720 Speaker 2: That's kind of how these new versions of AI operate. 510 00:25:46,840 --> 00:25:47,040 Speaker 3: Right. 511 00:25:47,640 --> 00:25:50,640 Speaker 2: We don't think they're conscious and they're learning, so unconscious 512 00:25:50,720 --> 00:25:53,760 Speaker 2: learning, and they're learning this thing predicts this thing, positive or 513 00:25:53,840 --> 00:25:57,080 Speaker 2: negative outcome. So there's something interesting between those two; 514 00:25:57,560 --> 00:26:01,960 Speaker 2: it's just a nice parallel there, and that's interesting in itself. 515 00:26:02,240 --> 00:26:05,120 Speaker 2: The second is the question of will we be able 516 00:26:05,160 --> 00:26:08,600 Speaker 2: to outsource our intuition to an AI assistant? 517 00:26:08,840 --> 00:26:09,000 Speaker 1: Right? 518 00:26:09,040 --> 00:26:09,760 Speaker 3: And could we, you 519 00:26:09,760 --> 00:26:11,560 Speaker 2: know, if we have all these wearables, and we're going 520 00:26:11,600 --> 00:26:14,080 Speaker 2: to get a lot more of them soon, could the 521 00:26:14,160 --> 00:26:17,800 Speaker 2: AI use all these wearables to basically tell us what 522 00:26:17,840 --> 00:26:18,879 Speaker 2: our intuition is saying? 523 00:26:18,960 --> 00:26:19,200 Speaker 3: Right?
524 00:26:19,359 --> 00:26:22,040 Speaker 2: So can we outsource it? And this, you know, would 525 00:26:22,040 --> 00:26:24,000 Speaker 2: be interesting to everyone, but it'd be interesting to certain 526 00:26:24,040 --> 00:26:29,200 Speaker 2: populations. People with mental disorders or addiction 527 00:26:29,359 --> 00:26:32,280 Speaker 2: is one interesting area, where we know decision making doesn't 528 00:26:32,320 --> 00:26:35,240 Speaker 2: work that well in people that have an addiction, and 529 00:26:35,280 --> 00:26:39,679 Speaker 2: intuition certainly doesn't work. So could you outsource intuition to 530 00:26:39,760 --> 00:26:41,959 Speaker 2: an AI, your personal AI assistant? 531 00:26:42,000 --> 00:26:43,920 Speaker 3: I think it's a really fascinating question. 532 00:26:44,440 --> 00:26:47,280 Speaker 1: So we're actually doing something at my company, Neosensory, 533 00:26:47,359 --> 00:26:50,640 Speaker 1: which is an interesting version of this. So, just as 534 00:26:50,760 --> 00:26:54,000 Speaker 1: one example, there were some students at USC who 535 00:26:54,040 --> 00:26:58,120 Speaker 1: took the wristband, and for people with autism who 536 00:26:58,160 --> 00:27:00,520 Speaker 1: have a hard time reading the emotions of the person 537 00:27:00,560 --> 00:27:03,800 Speaker 1: they're talking to, the wristband listens in real time and 538 00:27:03,920 --> 00:27:07,280 Speaker 1: makes a machine learning decision, oh, the person is happy, sad, angry, 539 00:27:07,359 --> 00:27:09,960 Speaker 1: things like that, and then just buzzes to tell you, 540 00:27:10,040 --> 00:27:12,640 Speaker 1: oh, that person. So if you're a kid with autism 541 00:27:12,640 --> 00:27:15,240 Speaker 1: who can't read that, you're just being told the answer 542 00:27:15,400 --> 00:27:15,760 Speaker 1: to that. 543 00:27:16,440 --> 00:27:18,360 Speaker 3: And do you see learning?
Do they get better as 544 00:27:18,400 --> 00:27:18,720 Speaker 3: they learn? 545 00:27:18,840 --> 00:27:21,240 Speaker 1: I don't know yet, but presumably, because you're just telling 546 00:27:21,280 --> 00:27:24,400 Speaker 1: them, like, hey, this guy's angry, so adjust your 547 00:27:24,440 --> 00:27:27,320 Speaker 1: behavior appropriately. So it's just a memorization thing. But I'll 548 00:27:27,359 --> 00:27:29,280 Speaker 1: give you another example. Something we did is we got 549 00:27:29,280 --> 00:27:32,959 Speaker 1: these smart watches that measure heart rate and heart rate variability 550 00:27:33,000 --> 00:27:36,520 Speaker 1: and galvanic skin response and so on, and we used 551 00:27:36,560 --> 00:27:39,320 Speaker 1: the API to take that data out and feed it 552 00:27:39,359 --> 00:27:42,880 Speaker 1: into this wristband that vibrates in different ways so that 553 00:27:42,960 --> 00:27:46,520 Speaker 1: these invisible states of your body can become visible to you. 554 00:27:46,600 --> 00:27:49,840 Speaker 1: So you can actually feel, oh, it's not this thing 555 00:27:50,040 --> 00:27:53,439 Speaker 1: that is making me feel this way, you know. It's not the cafe's fault, 556 00:27:53,880 --> 00:27:56,560 Speaker 1: but it's that I drank too much coffee earlier, or 557 00:27:56,640 --> 00:27:59,960 Speaker 1: whatever the issue is. You can start actually reading these 558 00:28:00,000 --> 00:28:04,120 Speaker 1: signals from your body in ways that are typically invisible 559 00:28:04,119 --> 00:28:06,280 Speaker 1: to us. I'll tell you the really interesting thing we did, 560 00:28:06,280 --> 00:28:09,240 Speaker 1: which is we fed that data through the Internet to 561 00:28:09,320 --> 00:28:10,639 Speaker 1: the wristband.
And the reason we did that is 562 00:28:10,680 --> 00:28:13,520 Speaker 1: so that, for example, your spouse could wear the watch 563 00:28:13,640 --> 00:28:17,359 Speaker 1: and you're wearing the vibrating wristband, so you're feeling your 564 00:28:17,400 --> 00:28:21,560 Speaker 1: spouse's physiology, so you know she's feeling nervous or stressed, 565 00:28:22,040 --> 00:28:23,440 Speaker 1: even if you're on the other side of the world. 566 00:28:23,520 --> 00:28:24,879 Speaker 3: I love it, she'll cut the argument short. 567 00:28:26,040 --> 00:28:28,680 Speaker 2: But no, this must have an application for, 568 00:28:28,720 --> 00:28:32,159 Speaker 2: you know, emotional intelligence and emotional awareness. 569 00:28:32,200 --> 00:28:34,919 Speaker 2: Some people don't have good emotional awareness. They're not aware 570 00:28:34,960 --> 00:28:37,080 Speaker 2: when they're angry or stressed, right. So this is part 571 00:28:37,080 --> 00:28:39,960 Speaker 2: of that S, that first rule for using intuition. People 572 00:28:40,000 --> 00:28:42,280 Speaker 2: just don't realize when they're stressed or anxious. But this 573 00:28:42,320 --> 00:28:43,959 Speaker 2: could be a really interesting way of sort of training 574 00:28:44,000 --> 00:28:46,760 Speaker 2: people up in this self-awareness of their own physiology. 575 00:28:46,840 --> 00:28:48,640 Speaker 1: I like that, you know, I think about this all the time. 576 00:28:48,800 --> 00:28:50,560 Speaker 1: I just heard a statistic this morning. I don't know 577 00:28:50,600 --> 00:28:53,200 Speaker 1: if it's true, but it sounds true, though, 578 00:28:53,240 --> 00:28:56,880 Speaker 1: which is that eighty five percent of the jobs 579 00:28:56,960 --> 00:29:00,000 Speaker 1: that will exist when our kids are adults don't even 580 00:29:00,200 --> 00:29:03,160 Speaker 1: exist yet. And so who knows if the number is 581 00:29:03,160 --> 00:29:05,560 Speaker 1: exactly right, but it strikes me as, wow, what a 582 00:29:05,600 --> 00:29:08,720 Speaker 1: world we're entering into. But in this context, maybe 583 00:29:08,880 --> 00:29:11,479 Speaker 1: everyone's just going to have the opportunity to be 584 00:29:11,600 --> 00:29:15,160 Speaker 1: much more emotionally intelligent, because we'll all tap into this 585 00:29:15,160 --> 00:29:18,120 Speaker 1: stuff and get these signals and know, because my, you know, 586 00:29:18,160 --> 00:29:21,160 Speaker 1: Apple Vision Pro glasses or something will say, hey, Eagleman, your 587 00:29:21,280 --> 00:29:24,320 Speaker 1: heart rate's going up, your galvanic skin response is spiking. 588 00:29:24,600 --> 00:29:27,360 Speaker 1: And I'll say, oh, okay, I wouldn't have known that, 589 00:29:27,480 --> 00:29:30,560 Speaker 1: but now, thank goodness, I'm being told that. Piece of cake. 590 00:29:30,800 --> 00:29:32,479 Speaker 2: I hope so, because, I mean, there's data showing that 591 00:29:32,520 --> 00:29:36,480 Speaker 2: emotional intelligence has actually gone down over the last decade. 592 00:29:36,080 --> 00:29:37,560 Speaker 3: And it's been linked to tech use. 593 00:29:37,680 --> 00:29:40,280 Speaker 2: So it's hard to pull out the causation here, but 594 00:29:40,320 --> 00:29:43,560 Speaker 2: there's some data suggesting the more young people use tech, 595 00:29:43,600 --> 00:29:45,320 Speaker 2: the lower their emotional intelligence. 596 00:29:45,360 --> 00:29:47,920 Speaker 1: Anyway. Well, good, so we'll rescue it with tech, then. 597 00:29:47,960 --> 00:29:51,760 Speaker 1: Oh yeah, yeah, that's fascinating.
Cool. Yeah. I think, 598 00:29:51,800 --> 00:29:54,640 Speaker 1: as you and I have studied our whole careers, 599 00:29:55,200 --> 00:29:59,240 Speaker 1: the brain is this entire cosmos happening in there. There's 600 00:29:59,360 --> 00:30:01,800 Speaker 1: so much information that you're picking up from the world 601 00:30:02,200 --> 00:30:05,680 Speaker 1: that you have no access to. We are so low bandwidth 602 00:30:05,760 --> 00:30:09,200 Speaker 1: in terms of our conscious understanding of any of this stuff. 603 00:30:09,560 --> 00:30:12,920 Speaker 1: But because the data is in there, it should be extractable. 604 00:30:13,440 --> 00:30:17,360 Speaker 1: And what that means is that, with what intuition becomes in 605 00:30:17,480 --> 00:30:19,880 Speaker 1: fifteen years, maybe, you know, you'll write a sequel 606 00:30:19,920 --> 00:30:22,120 Speaker 1: to the book with this new world. 607 00:30:22,440 --> 00:30:25,320 Speaker 2: But yeah, I mean, interoception, right. That's kind of what 608 00:30:25,360 --> 00:30:27,480 Speaker 2: it is. You can think of it as 609 00:30:27,520 --> 00:30:30,560 Speaker 2: our bodies, our physiology, tapping in. So our bodies have 610 00:30:30,640 --> 00:30:33,720 Speaker 2: this access to the unconscious information in our brains, which 611 00:30:33,720 --> 00:30:35,800 Speaker 2: sounds like a funny way to put it. Our bodies 612 00:30:35,840 --> 00:30:38,400 Speaker 2: get access to something we don't get through our consciousness, 613 00:30:38,480 --> 00:30:40,560 Speaker 2: and it kind of does so by tapping into your 614 00:30:40,760 --> 00:30:44,480 Speaker 2: gut response, your heart rate, your sweating or not. You're 615 00:30:44,720 --> 00:30:47,360 Speaker 2: getting this extra source of information, the unconscious. 616 00:30:47,520 --> 00:30:48,360 Speaker 3: And it's kind of like that.
617 00:30:48,400 --> 00:30:52,080 Speaker 2: It's just sort of using interoception, this internal perceptual state 618 00:30:52,080 --> 00:30:54,960 Speaker 2: of the body, to get access to extra information. And 619 00:30:55,000 --> 00:30:57,280 Speaker 2: who wouldn't want that, if we can trust 620 00:30:56,960 --> 00:30:59,720 Speaker 1: it. Exactly, because maybe you can think of an example, 621 00:31:00,120 --> 00:31:02,280 Speaker 1: thinking off the top of my head, where we confuse 622 00:31:03,040 --> 00:31:08,040 Speaker 1: our emotional state or intuition about what's, you know, we 623 00:31:08,080 --> 00:31:10,760 Speaker 1: misinterpret what's happening. You gave the example before, as if 624 00:31:10,760 --> 00:31:12,680 Speaker 1: there's a snake on the floor, and then, wow, 625 00:31:12,720 --> 00:31:15,520 Speaker 1: that was a really exciting podcast. But we're mixing, 626 00:31:15,600 --> 00:31:17,960 Speaker 1: we're conflating some things. I mean, it is a very 627 00:31:18,000 --> 00:31:18,760 Speaker 1: exciting podcast. 628 00:31:18,760 --> 00:31:22,040 Speaker 2: But a great story, a great story from the book 629 00:31:22,120 --> 00:31:24,080 Speaker 2: is I went on a date many years before I got married, 630 00:31:24,560 --> 00:31:26,160 Speaker 2: and it was a first date and we went rock 631 00:31:26,200 --> 00:31:29,000 Speaker 2: climbing at one of those indoor rock climbing gyms, right, and 632 00:31:29,040 --> 00:31:31,360 Speaker 2: we're climbing and falling and belaying, and then we swap, 633 00:31:31,400 --> 00:31:33,440 Speaker 2: and it's exciting. It's like, wow, and we were like, 634 00:31:33,440 --> 00:31:36,560 Speaker 2: the chemistry is amazing. And the next time we meet up, 635 00:31:36,560 --> 00:31:39,520 Speaker 2: it's like, huh, not so much. Turns out we were 636 00:31:39,520 --> 00:31:41,840 Speaker 2: not suited for each other at all.
And I was like, 637 00:31:42,160 --> 00:31:44,560 Speaker 2: but it felt so amazing that first date, and it 638 00:31:44,680 --> 00:31:47,360 Speaker 2: puzzled me for years. And then I was like, ah, 639 00:31:47,520 --> 00:31:51,840 Speaker 2: arousal misattribution, that's exactly what's going on. All the adrenaline 640 00:31:51,880 --> 00:31:53,360 Speaker 2: from the falling and the climbing. We thought it was 641 00:31:53,360 --> 00:31:54,960 Speaker 2: coming from each other and it wasn't. 642 00:31:55,000 --> 00:31:57,880 Speaker 1: Right. Aren't there these studies where you 643 00:31:57,960 --> 00:32:00,600 Speaker 1: have people crossing a rope bridge over a cliff, and 644 00:32:00,680 --> 00:32:03,680 Speaker 1: you have the female assistant with the clipboard who asks 645 00:32:03,720 --> 00:32:07,040 Speaker 1: the person questions while they're on the rope bridge, and 646 00:32:07,080 --> 00:32:09,200 Speaker 1: then later the question is something like, hey, would 647 00:32:09,240 --> 00:32:11,040 Speaker 1: you like to get her number and go 648 00:32:11,080 --> 00:32:13,960 Speaker 1: on a date, something like that, versus the exact same 649 00:32:14,000 --> 00:32:16,200 Speaker 1: assistant with the same clipboard asks the same questions but 650 00:32:16,240 --> 00:32:18,400 Speaker 1: in a boring situation on campus. 651 00:32:18,760 --> 00:32:21,000 Speaker 2: Yes, it's the same thing. Yeah, the rickety bridge experiment. 652 00:32:21,000 --> 00:32:24,600 Speaker 2: It's the very famous rickety bridge experiment. And yeah, people 653 00:32:24,680 --> 00:32:28,040 Speaker 2: just confuse these feelings of adrenaline from the height with 654 00:32:28,120 --> 00:32:29,160 Speaker 2: the person in front of them.
655 00:32:29,360 --> 00:32:32,920 Speaker 1: Yes, yes. So we're entering a very exciting future 656 00:32:32,960 --> 00:32:36,640 Speaker 1: as we get better sensors and better AI being able 657 00:32:36,680 --> 00:32:40,320 Speaker 1: to essentially summarize things and tell them to our low 658 00:32:40,400 --> 00:32:42,760 Speaker 1: bandwidth conscious mind and just say, hey, pal, here's 659 00:32:42,800 --> 00:32:46,080 Speaker 1: how you're feeling, you know, don't confuse this. So maybe 660 00:32:46,080 --> 00:32:48,280 Speaker 1: you'll have to add another letter to your SMILE. 661 00:32:49,480 --> 00:32:50,960 Speaker 3: Yeah, I think absolutely, David. 662 00:32:51,040 --> 00:32:53,080 Speaker 2: I think of the things we can imagine now. This is 663 00:32:53,120 --> 00:32:55,320 Speaker 2: like a passion of mine in my lab: measuring 664 00:32:55,360 --> 00:32:58,320 Speaker 2: things in the mind that we thought were too hard 665 00:32:58,320 --> 00:32:59,760 Speaker 2: to measure, right. And I call it like a blood 666 00:32:59,760 --> 00:33:03,000 Speaker 2: test for the mind. We want objective, reliable measurements of 667 00:33:03,120 --> 00:33:05,720 Speaker 2: mental things, like a blood test, that we can rely on. 668 00:33:06,080 --> 00:33:09,360 Speaker 2: When you interview someone or give them a questionnaire, it's good, 669 00:33:09,440 --> 00:33:12,280 Speaker 2: but it's not that reliable. We need this objective test. 670 00:33:12,320 --> 00:33:14,480 Speaker 2: It's a different way of thinking about tech and technology, 671 00:33:14,840 --> 00:33:15,880 Speaker 2: but it can be done now. 672 00:33:15,880 --> 00:33:16,720 Speaker 3: We can do this now. 673 00:33:17,040 --> 00:33:20,360 Speaker 1: The book is The Intuition Toolkit. And thank you so 674 00:33:20,400 --> 00:33:21,040 Speaker 1: much for joining me, Joel. 675 00:33:21,080 --> 00:33:21,720 Speaker 3: My pleasure, David.
676 00:33:21,760 --> 00:33:23,200 Speaker 2: And before we go, I want to say thank you 677 00:33:23,520 --> 00:33:27,440 Speaker 2: for inspiring me with expanding what academia can be, and 678 00:33:27,520 --> 00:33:30,560 Speaker 2: writing books and TV shows and companies and all the 679 00:33:30,560 --> 00:33:33,800 Speaker 2: cool things you do. I think it's groundbreaking in that 680 00:33:33,840 --> 00:33:36,440 Speaker 2: you're breaking open the classic academic model. 681 00:33:36,480 --> 00:33:37,640 Speaker 3: So thank you. Thank you, Joel. 682 00:33:38,040 --> 00:33:40,520 Speaker 1: So that was Joel Pearson talking about his new book, 683 00:33:40,560 --> 00:33:56,560 Speaker 1: The Intuition Toolkit, and now I want to bring it 684 00:33:56,600 --> 00:33:59,920 Speaker 1: back to the big picture. I'm always struck by how 685 00:34:00,360 --> 00:34:03,600 Speaker 1: much we do at the unconscious level that we don't 686 00:34:03,600 --> 00:34:06,240 Speaker 1: have access to. And a couple of episodes ago, I 687 00:34:06,280 --> 00:34:09,880 Speaker 1: was talking with Ed Catmull, who's the founder of Pixar Films, 688 00:34:10,200 --> 00:34:12,640 Speaker 1: and one of the topics we touched on in our conversation, 689 00:34:13,239 --> 00:34:16,200 Speaker 1: which I didn't include in that episode, was this topic 690 00:34:16,280 --> 00:34:21,000 Speaker 1: of intuition, the kind of intuition that an animator might have. 691 00:34:21,400 --> 00:34:24,839 Speaker 1: So here's a short clip from that conversation with Ed's 692 00:34:24,880 --> 00:34:25,640 Speaker 1: thoughts about that.
693 00:34:31,040 --> 00:34:36,080 Speaker 4: The ability to animate means you understand motion and emotion, 694 00:34:37,440 --> 00:34:40,960 Speaker 4: and you are observing things, and you're conveying things 695 00:34:41,040 --> 00:34:45,800 Speaker 4: at a subconscious level, and that's not really a visualization 696 00:34:47,000 --> 00:34:49,640 Speaker 4: element. What I know in the case of our 697 00:34:50,640 --> 00:34:54,560 Speaker 4: brains is that we take in a lot of subtle cues, 698 00:34:55,840 --> 00:34:59,359 Speaker 4: but we're taking them in unconsciously. So the way we move 699 00:34:59,440 --> 00:35:00,719 Speaker 4: our face, 700 00:35:00,360 --> 00:35:05,560 Speaker 5: or our lips, or our body motion, we're kind of 701 00:35:05,560 --> 00:35:10,000 Speaker 5: aware that we convey information with our body, but we 702 00:35:10,040 --> 00:35:15,120 Speaker 5: don't know exactly what that is or how we do it. 703 00:35:15,200 --> 00:35:16,360 Speaker 1: If you actually 704 00:35:16,320 --> 00:35:20,320 Speaker 4: understand how important that is, then a good animator 705 00:35:20,360 --> 00:35:23,040 Speaker 4: will sort of get that and see it and then 706 00:35:23,280 --> 00:35:26,040 Speaker 4: convey it if they can. If they put it in 707 00:35:26,120 --> 00:35:30,279 Speaker 4: the animation, then people will see it and they'll get 708 00:35:30,400 --> 00:35:34,000 Speaker 4: information at this level, not knowing that they're getting it. 709 00:35:34,400 --> 00:35:34,680 Speaker 2: I mean, 710 00:35:34,680 --> 00:35:37,160 Speaker 4: one of my favorite examples is in Toy Story 2, where 711 00:35:37,640 --> 00:35:42,000 Speaker 4: Jessie, the doll, was very upset because her owner 712 00:35:42,000 --> 00:35:44,720 Speaker 4: had actually discarded her. But while she was talking, 713 00:35:44,840 --> 00:35:47,319 Speaker 1: she was taking her pigtail and she was twisting it.
714 00:35:47,760 --> 00:35:54,759 Speaker 4: Almost nobody would notice that she was twisting her pigtail. 715 00:35:54,960 --> 00:35:59,040 Speaker 4: They wouldn't notice at a conscious level, but at an unconscious level, 716 00:36:00,080 --> 00:36:05,160 Speaker 4: that would mean that she's really torn up inside, along 717 00:36:05,200 --> 00:36:07,480 Speaker 4: with the other things that are there. So we've got 718 00:36:07,640 --> 00:36:11,719 Speaker 4: different levels of getting information from the world. There's the 719 00:36:11,840 --> 00:36:15,920 Speaker 4: direct words we're saying, and sometimes the direct words we're 720 00:36:15,960 --> 00:36:19,080 Speaker 4: saying are the opposite of what we really mean, and 721 00:36:19,160 --> 00:36:22,000 Speaker 4: so you've got, like, that's one level, and the other 722 00:36:22,160 --> 00:36:27,480 Speaker 4: is the body motion conveying something. That whole process means 723 00:36:27,520 --> 00:36:31,880 Speaker 4: that the animator is observing these different levels and putting 724 00:36:31,920 --> 00:36:36,040 Speaker 4: them into the character, and of course the actor also 725 00:36:36,120 --> 00:36:39,480 Speaker 4: provides layers to that. And when you get all of 726 00:36:39,480 --> 00:36:43,279 Speaker 4: that right, you have something which is complex and is interesting; 727 00:36:43,640 --> 00:36:49,560 Speaker 4: it's observing multiple levels and isn't just about the pictures, it's 728 00:36:48,719 --> 00:36:56,240 Speaker 1: about all of this coming together. That was Ed Catmull, 729 00:36:56,400 --> 00:37:00,440 Speaker 1: founder of Pixar Films, talking about the kind of intuition 730 00:37:00,520 --> 00:37:04,640 Speaker 1: an animator has. Sometimes they can consciously articulate what they're up to, 731 00:37:04,719 --> 00:37:07,680 Speaker 1: and sometimes they can't.
But a good animator 732 00:37:07,920 --> 00:37:10,680 Speaker 1: will pick up on those cues and transmit those cues, 733 00:37:10,920 --> 00:37:15,080 Speaker 1: sometimes all under the radar of consciousness. And of course 734 00:37:15,239 --> 00:37:18,719 Speaker 1: these animators are working in areas where their intuition makes 735 00:37:18,760 --> 00:37:22,560 Speaker 1: good sense, because they have mastered their data and they're 736 00:37:22,600 --> 00:37:27,000 Speaker 1: working in familiar and predictable contexts. So let's wrap up 737 00:37:27,000 --> 00:37:31,680 Speaker 1: for today. Intuition is about putting together vast information that 738 00:37:31,719 --> 00:37:34,560 Speaker 1: your brain has picked up on to give you a 739 00:37:34,600 --> 00:37:38,400 Speaker 1: final nudge in one direction or another, and it's massively 740 00:37:38,440 --> 00:37:41,960 Speaker 1: important for our functioning in the world. But we shouldn't 741 00:37:42,000 --> 00:37:45,560 Speaker 1: romanticize it to believe that it's always correct and always 742 00:37:45,800 --> 00:37:48,920 Speaker 1: to be listened to. Instead, we need to be clever 743 00:37:49,000 --> 00:37:53,000 Speaker 1: about when to trust it and under what circumstances. As 744 00:37:53,080 --> 00:37:56,480 Speaker 1: Joel suggests, we need to take into account whether we're 745 00:37:56,600 --> 00:38:00,440 Speaker 1: feeling too emotional, in which case, don't trust your intuition. 746 00:38:00,840 --> 00:38:05,520 Speaker 1: You need to master the data before trusting your intuition on something. 747 00:38:06,080 --> 00:38:12,040 Speaker 1: Don't mistake impulses and addiction for intuition, don't use your 748 00:38:12,080 --> 00:38:16,600 Speaker 1: intuition for very low probability judgments, and you should really 749 00:38:16,640 --> 00:38:22,400 Speaker 1: only trust your intuition in familiar and predictable contexts.
In 750 00:38:22,440 --> 00:38:27,759 Speaker 1: the end, each of us carries vast computational resources comparable 751 00:38:27,800 --> 00:38:31,160 Speaker 1: to Google or Meta or Apple. But instead of it 752 00:38:31,320 --> 00:38:34,240 Speaker 1: weighing eight ounces and us carrying it in our back pocket, 753 00:38:34,320 --> 00:38:38,279 Speaker 1: we have three pounds of this computational material and we 754 00:38:38,400 --> 00:38:40,839 Speaker 1: lug it around on our shoulders the way that we 755 00:38:40,920 --> 00:38:44,759 Speaker 1: carry other things that are heavy and important. This is 756 00:38:44,880 --> 00:38:47,480 Speaker 1: all you have to make your decisions in the world. 757 00:38:48,040 --> 00:38:50,360 Speaker 1: And think about it this way: in the same way 758 00:38:50,800 --> 00:38:53,920 Speaker 1: that you might tell a student how to leverage the 759 00:38:54,040 --> 00:39:00,800 Speaker 1: computational might of the Internet while retaining appropriate skepticism about 760 00:39:00,800 --> 00:39:04,280 Speaker 1: some of the results, so it goes with the massive 761 00:39:04,880 --> 00:39:09,759 Speaker 1: networked biological landscape of our own brains. It is up 762 00:39:09,840 --> 00:39:17,400 Speaker 1: to us to learn how to become good users. Go 763 00:39:17,480 --> 00:39:21,040 Speaker 1: to Eagleman dot com slash podcast for more information and 764 00:39:21,120 --> 00:39:24,879 Speaker 1: to find further reading. Send me an email at podcasts 765 00:39:24,880 --> 00:39:28,359 Speaker 1: at eagleman dot com with questions or discussion, and check 766 00:39:28,400 --> 00:39:31,920 Speaker 1: out and subscribe to Inner Cosmos on YouTube for videos 767 00:39:31,920 --> 00:39:35,799 Speaker 1: of each episode and to leave comments. Until next time, 768 00:39:35,960 --> 00:39:39,040 Speaker 1: I'm David Eagleman and this is Inner Cosmos.