1 00:00:05,519 --> 00:00:08,840 Speaker 1: So what's happening in your brain when you stand at 2 00:00:08,840 --> 00:00:11,720 Speaker 1: the ice cream aisle at the store and you stare 3 00:00:11,800 --> 00:00:14,720 Speaker 1: at all the different options. It doesn't look like there's 4 00:00:14,760 --> 00:00:18,599 Speaker 1: a lot going on there, but inside there is a 5 00:00:18,920 --> 00:00:21,480 Speaker 1: war of brain networks 6 00:00:20,960 --> 00:00:21,760 Speaker 2: that is raging. 7 00:00:22,280 --> 00:00:25,320 Speaker 1: So how does your brain make its decision about what 8 00:00:25,440 --> 00:00:29,440 Speaker 1: it's going to buy? And how is that influenced by price, 9 00:00:29,600 --> 00:00:33,440 Speaker 1: by your emotions, by your group of friends? How do 10 00:00:33,520 --> 00:00:37,159 Speaker 1: you decide whether you'd rather have a chocolate bar in 11 00:00:37,200 --> 00:00:41,120 Speaker 1: the shape of a computer keyboard or a nice candle 12 00:00:41,240 --> 00:00:44,080 Speaker 1: with an integrated matchstick holder? And what does this have 13 00:00:44,120 --> 00:00:47,120 Speaker 1: to do with Starbucks or Tiger Woods or Burger King? 14 00:00:49,520 --> 00:00:54,200 Speaker 1: Welcome to Inner Cosmos with me, David Eagleman. I'm a 15 00:00:54,240 --> 00:00:58,920 Speaker 1: neuroscientist and an author at Stanford University, and in these 16 00:00:58,960 --> 00:01:02,959 Speaker 1: episodes we sail deeply into our three pound universe 17 00:01:03,280 --> 00:01:06,559 Speaker 1: to understand why and how our lives look the way 18 00:01:06,560 --> 00:01:17,160 Speaker 1: they do. Today's episode is the second part of a 19 00:01:17,240 --> 00:01:18,360 Speaker 1: three part series. 20 00:01:19,000 --> 00:01:20,399 Speaker 2: We're looking at how the 21 00:01:20,360 --> 00:01:25,640 Speaker 1: brain is a machine built of networks that live in conflict. 22 00:01:25,760 --> 00:01:28,800 Speaker 1: In the last episode, we saw that your brain is 23 00:01:28,880 --> 00:01:34,240 Speaker 1: a team of rivals. You have different networks with potentially 24 00:01:34,280 --> 00:01:38,480 Speaker 1: opposing opinions, and they all compete to steer the ship 25 00:01:38,560 --> 00:01:41,720 Speaker 1: of state. And today we're going to see how these 26 00:01:41,959 --> 00:01:46,840 Speaker 1: rival networks determine how we decide what to buy, like 27 00:01:46,959 --> 00:01:50,520 Speaker 1: which ice cream brand or which car brand. How much 28 00:01:50,800 --> 00:01:55,160 Speaker 1: should something cost? How do you get steered or manipulated 29 00:01:55,200 --> 00:01:58,400 Speaker 1: by emotions? And does it matter if your friends think 30 00:01:58,640 --> 00:02:01,560 Speaker 1: something is cool or lame? We're going to see how 31 00:02:01,560 --> 00:02:05,680 Speaker 1: an understanding of the brain exposes how we make our 32 00:02:05,800 --> 00:02:09,880 Speaker 1: daily decisions.
As we found out in the last episode, 33 00:02:10,240 --> 00:02:13,960 Speaker 1: different networks in your brain care about different things, and 34 00:02:14,040 --> 00:02:18,200 Speaker 1: brain scientists all over the world have used imaging technologies 35 00:02:18,280 --> 00:02:21,240 Speaker 1: to see what the brain is up to when we 36 00:02:21,320 --> 00:02:26,960 Speaker 1: are faced with decisions like decisions about which fast food 37 00:02:27,000 --> 00:02:30,160 Speaker 1: restaurant you're going to choose on the block, or which 38 00:02:30,600 --> 00:02:33,920 Speaker 1: shirt to buy, or which internet service provider you're going 39 00:02:34,000 --> 00:02:36,880 Speaker 1: to sign up with. So let's start with the field 40 00:02:37,160 --> 00:02:41,160 Speaker 1: of economics. All of us who studied some economics in 41 00:02:41,200 --> 00:02:45,639 Speaker 1: college learned how humans make decisions, or that's what we 42 00:02:45,800 --> 00:02:49,120 Speaker 1: thought we were learning. But what we were actually learning 43 00:02:49,320 --> 00:02:55,560 Speaker 1: was an idealized version. We learned about Homo economicus, which 44 00:02:55,600 --> 00:02:59,840 Speaker 1: is a rational decision maker, and that decision maker always 45 00:02:59,840 --> 00:03:06,200 Speaker 1: works to maximize gain and minimize loss, and he's objective, 46 00:03:06,280 --> 00:03:11,440 Speaker 1: he's able to delay gratification. His decisions about what to 47 00:03:11,520 --> 00:03:15,919 Speaker 1: buy are unswayed by his emotions. And by the way, 48 00:03:15,919 --> 00:03:18,880 Speaker 1: if you put him in the same situation twice, he'll 49 00:03:18,919 --> 00:03:24,640 Speaker 1: behave consistently. But imagine I offer you two generic quarts 50 00:03:24,680 --> 00:03:27,720 Speaker 1: of Rocky Road ice cream, and for one, I'm charging 51 00:03:27,760 --> 00:03:30,640 Speaker 1: you two dollars fifty cents. For the other one, I'm 52 00:03:30,720 --> 00:03:33,920 Speaker 1: charging you three dollars twenty five cents. As far as 53 00:03:33,960 --> 00:03:36,600 Speaker 1: you can tell, it's the same ice cream inside. So 54 00:03:36,680 --> 00:03:41,760 Speaker 1: if you're an economist, if you're Homo economicus, the choice is easy. 55 00:03:41,960 --> 00:03:46,880 Speaker 1: It's straightforward. But now imagine that I rotate the quarts 56 00:03:46,960 --> 00:03:50,040 Speaker 1: so you can see the labels, and one says Ben 57 00:03:50,080 --> 00:03:53,520 Speaker 1: and Jerry's and one says Häagen-Dazs. So which one 58 00:03:53,520 --> 00:03:53,960 Speaker 1: do you take 59 00:03:54,040 --> 00:03:59,960 Speaker 3: now? Well, it's not so straightforward for real humans, because 60 00:04:00,080 --> 00:04:04,960 Speaker 3: suddenly there's branding and the emotions that you anticipate from 61 00:04:05,040 --> 00:04:08,880 Speaker 3: eating these and what your friends think of these brands, 62 00:04:08,920 --> 00:04:12,560 Speaker 3: and all your past experience with these brands, and all 63 00:04:12,600 --> 00:04:15,200 Speaker 3: this has led, in the last thirty years or so, 64 00:04:15,520 --> 00:04:20,080 Speaker 1: to a new field called neuroeconomics. It's also known as 65 00:04:20,160 --> 00:04:23,680 Speaker 1: behavioral economics, and it comes down to the fact that 66 00:04:23,760 --> 00:04:29,400 Speaker 1: we have generally misunderstood how humans actually make decisions.
So 67 00:04:29,560 --> 00:04:34,719 Speaker 1: psychologists and neuroscientists have been running experiments for decades now 68 00:04:35,240 --> 00:04:39,839 Speaker 1: to get a realistic understanding of the way people actually 69 00:04:39,920 --> 00:04:43,560 Speaker 1: behave rather than the way they're supposed to behave. And 70 00:04:43,640 --> 00:04:47,120 Speaker 1: what has emerged from this field is that real humans 71 00:04:47,279 --> 00:04:54,800 Speaker 1: are irrational. We care about immediate gratification, we ignore consequences. 72 00:04:55,200 --> 00:04:58,599 Speaker 1: Our decisions are massively swayed by our emotions. 73 00:04:59,080 --> 00:05:00,080 Speaker 2: We behave 74 00:05:00,240 --> 00:05:04,480 Speaker 1: inconsistently, not making the same decision each time. We're easily 75 00:05:04,560 --> 00:05:09,480 Speaker 1: confused by calculations about risk. We're always influenced by branding. 76 00:05:10,080 --> 00:05:12,120 Speaker 1: So this is what we're going to talk about today: 77 00:05:12,560 --> 00:05:15,560 Speaker 1: how an understanding of the team of rivals in the 78 00:05:15,560 --> 00:05:21,279 Speaker 1: brain gives a clearer picture of how we behave. As 79 00:05:21,279 --> 00:05:24,599 Speaker 1: we talked about in the previous episode, you are not 80 00:05:24,960 --> 00:05:29,280 Speaker 1: an individual. In other words, you're not a single program 81 00:05:29,320 --> 00:05:34,360 Speaker 1: that's running. Instead, your brain is made of competing pieces 82 00:05:34,440 --> 00:05:38,800 Speaker 1: and parts, a society of mind, and these are always 83 00:05:38,880 --> 00:05:42,960 Speaker 1: battling to control your decisions. The outcome of the battle 84 00:05:43,360 --> 00:05:46,440 Speaker 1: determines what you do in the world. So, for example, 85 00:05:46,480 --> 00:05:49,240 Speaker 1: if I take some warm chocolate chip cookies out of 86 00:05:49,279 --> 00:05:51,960 Speaker 1: the oven and place them in front of you, part 87 00:05:51,960 --> 00:05:55,159 Speaker 1: of your brain says, great, that's a good source of energy. 88 00:05:55,400 --> 00:05:57,839 Speaker 1: Another part of your brain says, don't eat them, you'll 89 00:05:57,839 --> 00:06:01,480 Speaker 1: get fat. Another says, okay, I'll eat them this time, 90 00:06:01,480 --> 00:06:03,960 Speaker 1: but I promise I'll go to the gym tonight. And 91 00:06:04,000 --> 00:06:07,040 Speaker 1: you can argue with yourself and eventually get mad at 92 00:06:07,080 --> 00:06:11,280 Speaker 1: yourself for your decision. You can curse at yourself, cajole yourself, 93 00:06:11,839 --> 00:06:13,680 Speaker 1: and as we'll see in the next episode, you can 94 00:06:13,720 --> 00:06:15,520 Speaker 1: even make contracts with yourself. 95 00:06:16,000 --> 00:06:19,479 Speaker 2: But who is talking with whom here? It's all you, right, 96 00:06:19,560 --> 00:06:21,080 Speaker 2: but it's different parts of you. 97 00:06:21,839 --> 00:06:27,080 Speaker 1: And this is the sense in which humans can be conflicted. Now, 98 00:06:27,160 --> 00:06:30,400 Speaker 1: there are lots of drives that you have, but today 99 00:06:30,400 --> 00:06:33,040 Speaker 1: we're just going to concentrate on three of the main 100 00:06:33,279 --> 00:06:36,600 Speaker 1: networks that have come out of the last couple of 101 00:06:36,680 --> 00:06:41,240 Speaker 1: decades of brain scanning experiments.
And these three networks are 102 00:06:41,279 --> 00:06:45,360 Speaker 1: among the biggest drivers of your decision making when you're 103 00:06:45,360 --> 00:06:48,480 Speaker 1: standing in the store. The first network that we're going 104 00:06:48,520 --> 00:06:52,680 Speaker 1: to talk about cares about the price of something. It's 105 00:06:52,720 --> 00:06:55,120 Speaker 1: made of a couple regions that sit on the side 106 00:06:55,120 --> 00:06:58,640 Speaker 1: of your brain in the northern part here, above where 107 00:06:58,680 --> 00:07:04,080 Speaker 1: your ears are. The second network cares about the emotional experience, 108 00:07:04,320 --> 00:07:09,640 Speaker 1: or more specifically, the anticipated emotional experience. This is located 109 00:07:10,040 --> 00:07:12,520 Speaker 1: on the part of your brain just above your eyeballs. 110 00:07:12,600 --> 00:07:15,280 Speaker 2: This is called the orbitofrontal cortex 111 00:07:14,840 --> 00:07:18,000 Speaker 1: because it's just above the orbits of the eyes, in 112 00:07:18,080 --> 00:07:21,040 Speaker 1: the frontal lobe. And this network cares about things like 113 00:07:21,160 --> 00:07:23,960 Speaker 1: what will these chips taste like compared to these? Will 114 00:07:24,000 --> 00:07:25,480 Speaker 1: they be delicious or gross? 115 00:07:25,560 --> 00:07:29,720 Speaker 2: Will this make me happy? Regretful? And the third network 116 00:07:29,480 --> 00:07:34,320 Speaker 1: responds to social context, as in do my friends think 117 00:07:34,400 --> 00:07:37,400 Speaker 1: this is cool or lame? And these regions sit right along 118 00:07:37,440 --> 00:07:40,800 Speaker 1: the midline of your brain between the two hemispheres, near 119 00:07:40,840 --> 00:07:44,880 Speaker 1: the top. So I'm going to tell you this story 120 00:07:45,000 --> 00:07:49,800 Speaker 1: in three acts, one about each of these networks. So 121 00:07:49,920 --> 00:07:53,400 Speaker 1: let's start with the first, which is about how much 122 00:07:53,480 --> 00:07:57,920 Speaker 1: you think something is worth. A great book that summarizes 123 00:07:58,000 --> 00:08:01,320 Speaker 1: the psychology of this is Predictably Irrational by 124 00:08:01,400 --> 00:08:04,960 Speaker 1: my colleague Dan Ariely, and I'll use some examples from that. 125 00:08:05,720 --> 00:08:08,040 Speaker 1: So let's say you're in the market for a pair 126 00:08:08,080 --> 00:08:12,360 Speaker 1: of Bluetooth headphones and you find them at a local 127 00:08:12,440 --> 00:08:15,760 Speaker 1: store and they're charging twenty nine dollars, so you think great. 128 00:08:16,320 --> 00:08:19,000 Speaker 1: So right when you're about to buy it, I ring 129 00:08:19,040 --> 00:08:21,840 Speaker 1: you up on your phone and I say, hey, I'm 130 00:08:21,880 --> 00:08:24,480 Speaker 1: two blocks away and I just found that headset you've 131 00:08:24,480 --> 00:08:28,880 Speaker 1: been looking for for nineteen dollars, so it's ten dollars cheaper. 132 00:08:29,240 --> 00:08:32,280 Speaker 1: The question is will you walk over to where I 133 00:08:32,360 --> 00:08:35,440 Speaker 1: am to buy the nineteen dollar headset instead of the 134 00:08:35,600 --> 00:08:39,680 Speaker 1: completely overpriced twenty nine dollar one.
Now, let's say you're in 135 00:08:39,720 --> 00:08:43,320 Speaker 1: the market for a new cell phone and you find 136 00:08:43,360 --> 00:08:45,800 Speaker 1: the one that you want at this store and it's 137 00:08:45,880 --> 00:08:49,679 Speaker 1: five hundred and sixty seven dollars, and right when 138 00:08:49,679 --> 00:08:52,840 Speaker 1: you're about to purchase it, I call you and I say, hey, 139 00:08:52,880 --> 00:08:55,360 Speaker 1: I'm two blocks away and I found that phone you're 140 00:08:55,360 --> 00:08:59,559 Speaker 1: looking for for five hundred and fifty seven dollars, ten 141 00:08:59,600 --> 00:09:03,160 Speaker 1: dollars cheaper. The question is are you going to walk 142 00:09:03,360 --> 00:09:06,400 Speaker 1: all the way over two blocks to save ten dollars, 143 00:09:06,520 --> 00:09:09,400 Speaker 1: five hundred sixty seven dollars versus five hundred and fifty 144 00:09:09,400 --> 00:09:12,559 Speaker 1: seven dollars? Now, the thing to notice is that these 145 00:09:12,559 --> 00:09:16,800 Speaker 1: two scenarios are exactly the same. I'm asking is ten 146 00:09:16,880 --> 00:09:20,839 Speaker 1: dollars worth the walk, or isn't it? It's the same 147 00:09:20,920 --> 00:09:24,959 Speaker 1: ten dollars either way. But in the first case, most 148 00:09:24,960 --> 00:09:28,080 Speaker 1: people will do it to pay nineteen instead of twenty nine. 149 00:09:28,120 --> 00:09:31,120 Speaker 1: And in the second case, almost nobody's gonna make that 150 00:09:31,240 --> 00:09:34,040 Speaker 1: walk two blocks just to pay five fifty seven 151 00:09:34,040 --> 00:09:37,080 Speaker 1: instead of five sixty seven. And this illustrates an 152 00:09:37,080 --> 00:09:40,840 Speaker 1: important point, which is that nothing is judged by itself. 153 00:09:40,920 --> 00:09:47,560 Speaker 1: Everything is judged in context. You are not Homo economicus. Okay, 154 00:09:47,600 --> 00:09:49,880 Speaker 1: now I'm gonna make you an offer. You can have 155 00:09:49,960 --> 00:09:52,679 Speaker 1: one of two things. I'm either gonna give you a 156 00:09:53,120 --> 00:09:58,240 Speaker 1: chocolate bar shaped like a computer keyboard or a candle 157 00:09:58,320 --> 00:10:01,199 Speaker 1: with a little drawer built into it that holds matchsticks. 158 00:10:01,679 --> 00:10:04,000 Speaker 1: So which one are you going to take? This is 159 00:10:04,040 --> 00:10:08,040 Speaker 1: a difficult decision, right? Because how do you translate these 160 00:10:08,080 --> 00:10:11,640 Speaker 1: into a common currency so you know which one has 161 00:10:11,840 --> 00:10:15,000 Speaker 1: more value to you? The reason this is a challenge 162 00:10:15,040 --> 00:10:19,720 Speaker 1: is because everything has to be converted into a common 163 00:10:19,760 --> 00:10:23,280 Speaker 1: currency to make a comparison, and this has to happen 164 00:10:23,640 --> 00:10:27,199 Speaker 1: at the level of neural activity, and that's tough when 165 00:10:27,240 --> 00:10:30,199 Speaker 1: you have such different items. This is the same if 166 00:10:30,240 --> 00:10:33,320 Speaker 1: I give you some other comparison, like I'll give you 167 00:10:33,679 --> 00:10:38,840 Speaker 1: three apples, or a case for your sunglasses that's shaped 168 00:10:38,840 --> 00:10:41,560 Speaker 1: like a hot dog. How do you convert that into 169 00:10:41,600 --> 00:10:45,360 Speaker 1: a common currency to make a direct comparison?
So what 170 00:10:45,400 --> 00:10:49,520 Speaker 1: the brain is always looking for is context. It can't 171 00:10:49,559 --> 00:10:53,200 Speaker 1: make a decision about the value of something unless it 172 00:10:53,240 --> 00:10:56,839 Speaker 1: can do a direct head to head comparison. And this 173 00:10:56,920 --> 00:11:00,440 Speaker 1: ends up being an important clue for people who are 174 00:11:00,440 --> 00:11:03,240 Speaker 1: trying to sell you something, because none of us know 175 00:11:03,320 --> 00:11:07,360 Speaker 1: how to price something, so we need to be told. 176 00:11:07,800 --> 00:11:11,640 Speaker 1: So here's an example. This happened with Williams Sonoma some 177 00:11:12,320 --> 00:11:16,760 Speaker 1: years ago. They introduced a home bread bakery. It's like 178 00:11:16,800 --> 00:11:19,800 Speaker 1: a large toaster oven that makes a loaf of 179 00:11:19,840 --> 00:11:22,040 Speaker 1: bread for you. And they put a price tag on 180 00:11:22,120 --> 00:11:25,120 Speaker 1: it around two hundred and seventy five dollars, and it 181 00:11:25,160 --> 00:11:29,559 Speaker 1: was a perfectly good product, but sales wise, it failed completely. 182 00:11:29,920 --> 00:11:31,760 Speaker 2: No one was buying it. Why? 183 00:11:31,880 --> 00:11:34,760 Speaker 1: It's because people saw it on the shelf and they thought, 184 00:11:35,120 --> 00:11:36,840 Speaker 1: what is a home bread bakery anyway? 185 00:11:36,920 --> 00:11:38,640 Speaker 2: Is this a good one? Do I need it? 186 00:11:39,160 --> 00:11:42,720 Speaker 1: So Williams Sonoma brought in a research firm, who advised 187 00:11:42,760 --> 00:11:46,640 Speaker 1: them to get hold of a second home bread bakery, 188 00:11:46,679 --> 00:11:49,960 Speaker 1: a slightly bigger one that was more expensive, and put 189 00:11:50,000 --> 00:11:51,559 Speaker 1: it on the shelves next to 190 00:11:51,520 --> 00:11:55,080 Speaker 2: the first one. So they did that, and you know what, 191 00:11:55,160 --> 00:11:59,000 Speaker 2: the first bread bakery started selling. Why? 192 00:11:59,080 --> 00:12:01,480 Speaker 1: It's because people didn't have to make a decision in 193 00:12:01,559 --> 00:12:05,400 Speaker 1: a vacuum now. Now they could think, okay, well, I 194 00:12:05,440 --> 00:12:08,600 Speaker 1: don't know much about home breadmakers, but this one seems 195 00:12:08,640 --> 00:12:11,960 Speaker 1: to take up less counter space and it's less expensive, 196 00:12:12,040 --> 00:12:14,080 Speaker 1: so that seems like it's the right choice. 197 00:12:14,559 --> 00:12:17,480 Speaker 2: So the customers' brains were given something that they 198 00:12:17,320 --> 00:12:21,640 Speaker 1: could compare, and that allowed them to secure a decision. 199 00:12:22,400 --> 00:12:24,520 Speaker 1: And this trick of giving context, by the way, is 200 00:12:24,559 --> 00:12:27,360 Speaker 1: an old trick in real estate. So let's say you're 201 00:12:27,720 --> 00:12:31,200 Speaker 1: an agent and you're trying to sell someone a medium 202 00:12:31,240 --> 00:12:32,440 Speaker 1: sized traditional home. 203 00:12:32,800 --> 00:12:34,439 Speaker 2: But the prospective buyers 204 00:12:34,040 --> 00:12:38,280 Speaker 1: are torn between that one and some modern home, and 205 00:12:38,320 --> 00:12:42,080 Speaker 1: they're stuck because these two homes are as different as 206 00:12:42,160 --> 00:12:45,480 Speaker 1: the chocolate keyboard and the candlestick.
So what do you 207 00:12:45,640 --> 00:12:48,920 Speaker 1: do as the real estate agent to get them unstuck? 208 00:12:49,280 --> 00:12:52,000 Speaker 1: What you do is you show them another mid sized 209 00:12:52,040 --> 00:12:55,440 Speaker 1: traditional home, one that's a little bit worse. Maybe the 210 00:12:55,520 --> 00:12:58,760 Speaker 1: kitchen is outdated, or it's covered in wood paneling. So 211 00:12:59,360 --> 00:13:02,760 Speaker 1: this is known as the decoy effect, because you don't 212 00:13:02,800 --> 00:13:05,320 Speaker 1: expect them to buy this other home you've just shown them. 213 00:13:05,679 --> 00:13:09,320 Speaker 1: But what you're doing is you're giving their valuation systems 214 00:13:09,440 --> 00:13:13,640 Speaker 1: something they can understand. The brain can now compare the 215 00:13:13,720 --> 00:13:17,240 Speaker 1: slightly worse traditional home against the slightly better traditional home, 216 00:13:17,400 --> 00:13:19,200 Speaker 1: the one where the kitchen doesn't need an update. 217 00:13:19,760 --> 00:13:22,600 Speaker 2: So they see this in context 218 00:13:22,120 --> 00:13:25,560 Speaker 1: now, and it allows them to lock down a decision. 219 00:13:30,000 --> 00:13:33,959 Speaker 1: In general, everything in your brain is priced by comparison 220 00:13:34,360 --> 00:13:36,640 Speaker 1: to other things that are like it, and this is 221 00:13:36,679 --> 00:13:41,600 Speaker 1: because the brain stores its knowledge by association: what is 222 00:13:41,640 --> 00:13:44,439 Speaker 1: related to what and how. So you're not going to 223 00:13:44,520 --> 00:13:47,440 Speaker 1: pay twenty five dollars for a bag of chips, because 224 00:13:47,480 --> 00:13:50,440 Speaker 1: you have lots of experience with bags of chips and 225 00:13:50,440 --> 00:13:54,120 Speaker 1: they're not supposed to cost that much. But interestingly, there 226 00:13:54,240 --> 00:13:58,080 Speaker 1: is a way that companies can change where an item 227 00:13:58,200 --> 00:14:02,679 Speaker 1: sits in your network of associations. So, for example, some 228 00:14:02,720 --> 00:14:04,760 Speaker 1: of you may remember a time when you could drive 229 00:14:05,000 --> 00:14:07,480 Speaker 1: anywhere in the nation and get a cup of coffee 230 00:14:07,520 --> 00:14:11,480 Speaker 1: for fifty cents. But when Starbucks launched, they didn't want 231 00:14:11,520 --> 00:14:14,560 Speaker 1: to sell their coffee for fifty cents, so they built 232 00:14:14,600 --> 00:14:20,080 Speaker 1: Starbucks to feel like a continental European coffee house, and 233 00:14:20,160 --> 00:14:24,240 Speaker 1: they had the place filled with the smell of roasted beans, 234 00:14:24,280 --> 00:14:28,760 Speaker 1: and they sold coffee presses and biscotti and croissants, and 235 00:14:28,960 --> 00:14:34,160 Speaker 1: they were creating a different experience, so different that we 236 00:14:34,200 --> 00:14:38,880 Speaker 1: wouldn't use the prices of the diners as an anchor 237 00:14:38,960 --> 00:14:42,160 Speaker 1: anymore for the price of coffee. So yeah, they even 238 00:14:42,400 --> 00:14:45,920 Speaker 1: changed the sizes they sold, so small, medium, and large 239 00:14:45,960 --> 00:14:50,240 Speaker 1: became tall, grande, and venti.
And this made it so 240 00:14:50,280 --> 00:14:54,479 Speaker 1: that in your network of associations, you didn't link Starbucks 241 00:14:54,560 --> 00:14:59,680 Speaker 1: coffee with diner coffee, but instead with something different and fancier, 242 00:15:00,160 --> 00:15:04,920 Speaker 1: and therefore your valuation of the cup of coffee doesn't 243 00:15:04,960 --> 00:15:09,080 Speaker 1: feel off. So I was in Seattle a while ago, 244 00:15:09,120 --> 00:15:12,280 Speaker 1: and I saw a McDonald's billboard that was taking a 245 00:15:12,320 --> 00:15:16,240 Speaker 1: shot at Starbucks. The sign said four dollars for a 246 00:15:16,320 --> 00:15:19,760 Speaker 1: coffee is dumb. And what struck me as funny is 247 00:15:19,800 --> 00:15:24,760 Speaker 1: that Homo economicus totally gets McDonald's message, supply and demand and 248 00:15:24,760 --> 00:15:28,600 Speaker 1: so on, but real humans simply don't care. We get 249 00:15:28,640 --> 00:15:31,920 Speaker 1: something out of our experience with Starbucks, and therefore we're 250 00:15:32,000 --> 00:15:52,680 Speaker 1: willing to value a blonde vanilla latte differently. So now 251 00:15:52,720 --> 00:15:55,800 Speaker 1: I've told you how the brain thinks about value, how 252 00:15:55,880 --> 00:15:58,520 Speaker 1: much is something worth? But as I said at the beginning, 253 00:15:58,560 --> 00:16:02,280 Speaker 1: this is only part of how the brain makes decisions. There 254 00:16:02,320 --> 00:16:06,440 Speaker 1: are other competing areas involved, and we turn to the 255 00:16:06,560 --> 00:16:12,160 Speaker 1: second act now, which is about emotions. Economists used to 256 00:16:12,160 --> 00:16:15,480 Speaker 1: think that all decision making in humans was a rational 257 00:16:15,640 --> 00:16:19,600 Speaker 1: process: you weigh costs and benefits, and that tips your decision. 258 00:16:20,040 --> 00:16:23,680 Speaker 1: But recently the importance of emotions has come into the 259 00:16:23,800 --> 00:16:29,120 Speaker 1: spotlight, because emotions are a critical part of the language 260 00:16:29,160 --> 00:16:33,640 Speaker 1: of the unconscious. It's not all about logic or rationality. We're 261 00:16:33,640 --> 00:16:38,480 Speaker 1: not just information processing devices. Our lives are richly painted 262 00:16:38,520 --> 00:16:43,400 Speaker 1: by emotion, and these influence decision making. So one basic 263 00:16:43,440 --> 00:16:48,440 Speaker 1: example is that you will make harsher moral judgments when 264 00:16:48,440 --> 00:16:51,920 Speaker 1: you're in a room that smells bad. This shows the 265 00:16:52,040 --> 00:16:56,840 Speaker 1: role of emotions like disgust in decision making about things 266 00:16:56,920 --> 00:17:00,560 Speaker 1: that you might think are higher level concepts. Okay, now 267 00:17:00,600 --> 00:17:03,880 Speaker 1: I'll give you another example, one that's been making the 268 00:17:04,000 --> 00:17:07,840 Speaker 1: rounds in philosophy for a little while. In part one 269 00:17:07,880 --> 00:17:11,040 Speaker 1: of this series, I mentioned the trolley dilemma. 270 00:17:11,440 --> 00:17:12,560 Speaker 2: Here's a quick reminder. 271 00:17:12,880 --> 00:17:16,240 Speaker 1: A trolley is barreling down the train tracks out of control.
272 00:17:16,680 --> 00:17:20,280 Speaker 1: Five workers are making repairs way down the track, and you, 273 00:17:20,480 --> 00:17:23,639 Speaker 1: a bystander, quickly realize that they're all going to be 274 00:17:23,760 --> 00:17:27,359 Speaker 1: killed by the trolley. But you also notice there's a 275 00:17:27,480 --> 00:17:30,800 Speaker 1: lever nearby that you can throw, and that will divert 276 00:17:30,880 --> 00:17:33,959 Speaker 1: the trolley down a different track where only a single 277 00:17:34,000 --> 00:17:35,080 Speaker 1: worker will be killed. 278 00:17:35,640 --> 00:17:36,440 Speaker 2: So what do you do? 279 00:17:36,880 --> 00:17:40,200 Speaker 1: If you're like most people, you pull the lever, because 280 00:17:40,200 --> 00:17:43,240 Speaker 1: it's better to have one person killed than five. Now, 281 00:17:43,280 --> 00:17:46,400 Speaker 1: in the second scenario of the trolley dilemma, the same 282 00:17:46,440 --> 00:17:49,600 Speaker 1: trolley's barreling down the tracks. The same five workers are 283 00:17:49,640 --> 00:17:53,840 Speaker 1: in harm's way. But this time you are a bystander 284 00:17:53,880 --> 00:17:57,080 Speaker 1: on a footbridge that goes over the tracks, and you 285 00:17:57,200 --> 00:18:00,000 Speaker 1: notice that there's a large man standing on the footbridge, 286 00:18:00,560 --> 00:18:02,840 Speaker 1: and you realize that if you were to push him 287 00:18:02,920 --> 00:18:06,880 Speaker 1: off the bridge, his bulk would be sufficient to stop 288 00:18:06,960 --> 00:18:09,040 Speaker 1: the train and save the five workers. 289 00:18:09,680 --> 00:18:14,800 Speaker 2: Do you push him off? 290 00:18:14,840 --> 00:18:18,040 Speaker 1: Now, most people here just won't do it, even though the 291 00:18:18,200 --> 00:18:21,600 Speaker 1: math is exactly the same. You're trading one life for 292 00:18:21,720 --> 00:18:25,880 Speaker 1: five lives. The only difference is the issue of touching 293 00:18:25,920 --> 00:18:28,880 Speaker 1: the person with your bare hands or not. And as 294 00:18:28,920 --> 00:18:31,800 Speaker 1: I mentioned last time, when my colleagues did brain scanning, 295 00:18:31,840 --> 00:18:34,680 Speaker 1: they found that in the first case, the networks that 296 00:18:34,840 --> 00:18:39,920 Speaker 1: come online are just those involved with doing math problems, essentially. 297 00:18:40,200 --> 00:18:43,840 Speaker 1: But in the second case, you've got these emotional networks 298 00:18:43,880 --> 00:18:47,320 Speaker 1: coming online too, these networks in the orbitofrontal cortex, 299 00:18:47,800 --> 00:18:54,520 Speaker 1: and that entirely changes your decision. So the networks involved 300 00:18:54,560 --> 00:18:59,080 Speaker 1: in emotion are very powerful drivers of decision making, and 301 00:18:59,119 --> 00:19:02,720 Speaker 1: this plays a big role in how we select products, 302 00:19:02,720 --> 00:19:06,920 Speaker 1: how we make economic decisions. So when companies are interested 303 00:19:06,960 --> 00:19:10,639 Speaker 1: in plugging into these networks, they don't use terms like 304 00:19:11,359 --> 00:19:14,959 Speaker 1: we have an integrated approach or we give a comprehensive solution. 305 00:19:15,359 --> 00:19:19,720 Speaker 1: Those things don't talk to these orbitofrontal networks at all.
Instead, 306 00:19:20,400 --> 00:19:24,440 Speaker 1: companies say things like don't hate me because I'm beautiful, 307 00:19:25,240 --> 00:19:29,240 Speaker 1: or are you man enough to eat this burger? And 308 00:19:29,280 --> 00:19:32,840 Speaker 1: the idea is to plug right into the emotional networks 309 00:19:32,920 --> 00:19:36,800 Speaker 1: to steer your decision making. And it doesn't have to 310 00:19:36,840 --> 00:19:41,120 Speaker 1: be big and explicit; it also includes everything you've ever 311 00:19:40,960 --> 00:19:42,320 Speaker 2: seen in any ad. 312 00:19:42,880 --> 00:19:48,679 Speaker 1: Beautiful people loving the product being advertised. They smile beautifully 313 00:19:48,720 --> 00:19:52,440 Speaker 1: when they touch or look at the product. If it's cereal, 314 00:19:52,520 --> 00:19:55,199 Speaker 1: it's a family that's so happy and attractive and they 315 00:19:55,200 --> 00:19:58,560 Speaker 1: don't have any problems. If it's for an adult audience, 316 00:19:58,680 --> 00:20:01,920 Speaker 1: the stars of the commercial are smiling at each other. 317 00:20:02,119 --> 00:20:05,320 Speaker 1: The camera lingers for just a moment longer to make 318 00:20:05,359 --> 00:20:08,280 Speaker 1: it clear that this is all a prelude to something 319 00:20:08,320 --> 00:20:12,560 Speaker 1: that's inevitably going to happen between them. In other words, 320 00:20:12,600 --> 00:20:16,720 Speaker 1: the appeal isn't to these networks that are concerned with valuation, 321 00:20:17,200 --> 00:20:21,920 Speaker 1: but instead the networks that drive primitive emotions. 322 00:20:22,000 --> 00:20:24,199 Speaker 2: These have tremendous 323 00:20:23,440 --> 00:20:32,080 Speaker 1: power in the neural parliament that drives your decisions. So 324 00:20:32,200 --> 00:20:36,520 Speaker 1: now I've mentioned the brain networks involved in valuation and 325 00:20:36,560 --> 00:20:40,119 Speaker 1: those involved in emotion, and there's one more, which brings 326 00:20:40,160 --> 00:20:44,200 Speaker 1: us to Act three. Traditionally, we study the brain by 327 00:20:44,280 --> 00:20:47,520 Speaker 1: looking at different regions and mapping what they're involved in, 328 00:20:47,680 --> 00:20:50,040 Speaker 1: like this is the region for vision, and over here 329 00:20:50,160 --> 00:20:52,760 Speaker 1: is where hearing takes place, and this area is involved 330 00:20:52,760 --> 00:20:55,879 Speaker 1: in touch and so on. But an enormous amount of 331 00:20:55,920 --> 00:21:00,439 Speaker 1: the circuitry of the brain has to do with other people, 332 00:21:00,720 --> 00:21:04,880 Speaker 1: understanding other people. We have an enormous amount of circuitry 333 00:21:05,280 --> 00:21:11,760 Speaker 1: devoted to issues of trust or integrity or the reputation 334 00:21:11,920 --> 00:21:14,879 Speaker 1: of other people, in our tribe or, in modern life, 335 00:21:14,880 --> 00:21:17,080 Speaker 1: others we meet all over the world, as well 336 00:21:17,119 --> 00:21:21,639 Speaker 1: as celebrities we've never even met. Bizarrely, so much of 337 00:21:21,680 --> 00:21:26,080 Speaker 1: our circuitry is devoted to social concerns. I can even 338 00:21:26,119 --> 00:21:30,000 Speaker 1: ask you, hey, does that person in your thousand contacts 339 00:21:30,040 --> 00:21:33,840 Speaker 1: that you know know this other person? And you'll pretty 340 00:21:33,880 --> 00:21:36,879 Speaker 1: much know immediately whether they know each other
and how. 341 00:21:37,440 --> 00:21:40,159 Speaker 1: Think about that. It's like a thousand by a thousand 342 00:21:40,280 --> 00:21:43,800 Speaker 1: matrix of data that's being stored there that you can 343 00:21:43,880 --> 00:21:46,879 Speaker 1: just call up in an instant. And these sorts of 344 00:21:47,640 --> 00:21:52,440 Speaker 1: considerations about how much our brains care about other people 345 00:21:53,119 --> 00:21:58,760 Speaker 1: have led to a new subfield called social neuroscience. Now, 346 00:21:58,880 --> 00:22:02,040 Speaker 1: something quite amazing and surprising that I've studied in my 347 00:22:02,240 --> 00:22:07,240 Speaker 1: lab with brain imaging is that, to the brain, companies 348 00:22:07,880 --> 00:22:11,520 Speaker 1: are just like people. We evolved in small groups and 349 00:22:11,560 --> 00:22:15,280 Speaker 1: developed this very rich social circuitry to understand each other 350 00:22:15,320 --> 00:22:18,639 Speaker 1: and to understand issues of trust and integrity and reputation 351 00:22:19,240 --> 00:22:22,800 Speaker 1: of every person in the group. But companies came along 352 00:22:23,040 --> 00:22:26,959 Speaker 1: just in the last second of our evolutionary history, and 353 00:22:27,080 --> 00:22:31,480 Speaker 1: obviously Mother Nature hasn't had time to rewrite the brain 354 00:22:31,760 --> 00:22:35,920 Speaker 1: to understand companies, so we use exactly the same circuitry 355 00:22:36,040 --> 00:22:39,800 Speaker 1: as we do when we think about other people. So 356 00:22:39,840 --> 00:22:42,520 Speaker 1: we study this in my lab by having people read 357 00:22:42,600 --> 00:22:46,440 Speaker 1: a vignette about a person that does something either 358 00:22:46,640 --> 00:22:49,960 Speaker 1: good or bad or suspicious, and we measure how different 359 00:22:50,080 --> 00:22:54,479 Speaker 1: brain regions respond to a person behaving this way. And 360 00:22:54,520 --> 00:22:58,199 Speaker 1: then we show these same vignettes to other people, but 361 00:22:58,280 --> 00:23:01,119 Speaker 1: we swap out the name of the person with the 362 00:23:01,200 --> 00:23:04,840 Speaker 1: name of a company, and the brain regions that respond 363 00:23:04,840 --> 00:23:10,359 Speaker 1: are exactly the same. Evolutionarily, companies are something new and 364 00:23:10,400 --> 00:23:14,359 Speaker 1: we haven't evolved new circuitry to understand them. So we 365 00:23:14,560 --> 00:23:17,680 Speaker 1: understand a company exactly the same way, with the same 366 00:23:17,680 --> 00:23:21,480 Speaker 1: circuitry that we use to understand other people. Is that 367 00:23:21,640 --> 00:23:26,000 Speaker 1: company trustworthy? Does the company have integrity? What is that 368 00:23:26,080 --> 00:23:33,840 Speaker 1: company's social reputation? And you see this on Facebook, where 369 00:23:33,840 --> 00:23:38,240 Speaker 1: people are friends with people and they're friends with companies. 370 00:23:39,119 --> 00:23:41,520 Speaker 1: And by the way, in the legal system, there's a 371 00:23:42,000 --> 00:23:46,879 Speaker 1: legal fiction in which we treat corporations like individuals. So 372 00:23:46,920 --> 00:23:49,400 Speaker 1: the corporation can be guilty of a crime, can be 373 00:23:49,640 --> 00:23:52,320 Speaker 1: liable for a crime, and so on. In fact, the 374 00:23:52,359 --> 00:23:58,359 Speaker 1: word corporation comes from corpus, meaning body.
Now it might seem hard to 375 00:23:58,440 --> 00:24:03,560 Speaker 1: believe that companies and social reputation are tied so closely. 376 00:24:04,200 --> 00:24:07,800 Speaker 1: But let's take a typical example. Look at what happened 377 00:24:07,840 --> 00:24:13,200 Speaker 1: when one major company tied their reputation to a clean cut, 378 00:24:13,280 --> 00:24:16,439 Speaker 1: admirable young man and they got all the benefits from that. 379 00:24:16,960 --> 00:24:19,760 Speaker 1: I'm talking about Nike, who tied themselves to Tiger Woods. 380 00:24:19,840 --> 00:24:24,480 Speaker 1: He was in their advertising and his superstar power directly 381 00:24:24,560 --> 00:24:27,840 Speaker 1: lifted their economics, because he was awesome, and so people 382 00:24:27,880 --> 00:24:31,439 Speaker 1: felt like, cool, Nike is awesome in the same way. But 383 00:24:31,560 --> 00:24:34,520 Speaker 1: then one day in two thousand and nine, the news 384 00:24:34,600 --> 00:24:37,600 Speaker 1: hit Twitter that he was cheating on his wife. Now 385 00:24:37,680 --> 00:24:39,600 Speaker 1: you can go back and look at a chart of 386 00:24:39,720 --> 00:24:44,240 Speaker 1: Nike stock that day. The news of his infidelity hit 387 00:24:44,240 --> 00:24:47,560 Speaker 1: Twitter at eleven forty am, and you can see the 388 00:24:47,720 --> 00:24:52,119 Speaker 1: immediate effect on the Nike stock price, which continued to 389 00:24:52,280 --> 00:24:55,639 Speaker 1: drop precipitously for the rest of the day and for 390 00:24:55,680 --> 00:24:56,880 Speaker 1: the next six months. 391 00:24:57,440 --> 00:24:58,520 Speaker 2: Even though Tiger 392 00:24:58,200 --> 00:25:02,040 Speaker 1: Woods has nothing to do with the manufacturing or distribution 393 00:25:02,240 --> 00:25:06,760 Speaker 1: or the quality of the shoes, his infidelity soiled the 394 00:25:06,760 --> 00:25:10,520 Speaker 1: company's reputation in the social brain. If he didn't have 395 00:25:10,560 --> 00:25:15,480 Speaker 1: integrity at that moment, then something becomes wrong with the shoes. Now, 396 00:25:15,480 --> 00:25:16,960 Speaker 1: what I want to point out is this is not 397 00:25:17,000 --> 00:25:21,400 Speaker 1: just an economic phenomenon, it's a social phenomenon, and fundamentally, 398 00:25:21,680 --> 00:25:28,639 Speaker 1: it's a neuroscientific phenomenon. These social circuits evolved when we 399 00:25:28,720 --> 00:25:32,200 Speaker 1: operated in small tribes. They weren't really built to operate 400 00:25:32,240 --> 00:25:36,240 Speaker 1: at the level of multinational companies, and as a result, 401 00:25:36,800 --> 00:25:40,159 Speaker 1: companies get to ride on this circuitry for free. But 402 00:25:40,240 --> 00:25:43,280 Speaker 1: it also means that companies need to be aware that 403 00:25:43,320 --> 00:25:47,359 Speaker 1: they're operating with these basic friend-or-foe circuits. It 404 00:25:47,400 --> 00:25:51,440 Speaker 1: means that companies need to treat their customers like tribe members. 405 00:25:51,880 --> 00:25:54,920 Speaker 1: And I'm not saying this just as a friendly business philosophy. 406 00:25:54,960 --> 00:25:58,760 Speaker 1: I'm saying this because the same circuitry is being used, 407 00:25:59,160 --> 00:26:03,280 Speaker 1: and because of social media, these sorts of social exchanges 408 00:26:03,320 --> 00:26:08,920 Speaker 1: have taken an unexpected evolutionary leap forward, because they are 409 00:26:09,160 --> 00:26:13,560 Speaker 1: unerasable now. So here's an example.
Some years ago, there 410 00:26:13,640 --> 00:26:16,800 Speaker 1: was a young man traveling on United Airlines and he 411 00:26:16,920 --> 00:26:20,399 Speaker 1: had to check his guitar as baggage. So 412 00:26:20,560 --> 00:26:23,240 Speaker 1: he's sitting on the plane when they're on the runway 413 00:26:23,320 --> 00:26:26,280 Speaker 1: and he's watching the guys load the luggage in, and 414 00:26:26,320 --> 00:26:29,800 Speaker 1: he sees the luggage handlers toss his guitar as they're 415 00:26:29,840 --> 00:26:32,560 Speaker 1: loading it. And when this guy gets to the destination, 416 00:26:32,720 --> 00:26:35,960 Speaker 1: he discovers his guitar is broken. So he complains to 417 00:26:36,040 --> 00:26:39,560 Speaker 1: United Airlines, but they tell him they can't take responsibility 418 00:26:39,560 --> 00:26:41,479 Speaker 1: for it and he has to pay for his own guitar. 419 00:26:42,040 --> 00:26:45,160 Speaker 1: So he gets mad and he writes a guitar song 420 00:26:45,200 --> 00:26:48,360 Speaker 1: and he puts it on YouTube, and it's called United 421 00:26:48,400 --> 00:26:53,560 Speaker 1: Breaks Guitars, and this goes viral, and the Economist magazine 422 00:26:53,600 --> 00:26:56,640 Speaker 1: wrote an article in which they estimated that this guy 423 00:26:56,840 --> 00:27:01,600 Speaker 1: cost United Airlines one hundred and thirty million dollars. Why? 424 00:27:01,880 --> 00:27:07,119 Speaker 1: Because social reputation matters, and the Internet has made reputational 425 00:27:07,160 --> 00:27:12,639 Speaker 1: issues fast and instant and unerasable. And all this ties 426 00:27:12,680 --> 00:27:17,120 Speaker 1: into how companies use social media. So when Facebook first 427 00:27:17,200 --> 00:27:21,240 Speaker 1: got introduced as a concept many years ago, all companies 428 00:27:21,280 --> 00:27:24,280 Speaker 1: immediately started thinking about this as, okay, how do we 429 00:27:24,359 --> 00:27:27,679 Speaker 1: start a Facebook account and post on our feeds and 430 00:27:27,720 --> 00:27:31,000 Speaker 1: get people to buy our product? But this turned out 431 00:27:31,040 --> 00:27:34,480 Speaker 1: to be a fail, because people weren't going to Facebook 432 00:27:34,520 --> 00:27:38,440 Speaker 1: with the intention of being sold to. So companies realized 433 00:27:38,480 --> 00:27:43,119 Speaker 1: pretty quickly it's not about selling directly. Instead, the reason 434 00:27:43,160 --> 00:27:46,800 Speaker 1: to have a Facebook account was about branding. In other words, 435 00:27:47,240 --> 00:27:51,240 Speaker 1: how could they develop a social reputation? How could they 436 00:27:51,760 --> 00:27:54,199 Speaker 1: talk to these networks in the brain to make you 437 00:27:54,280 --> 00:27:58,400 Speaker 1: feel like they are your friend? So I just looked 438 00:27:58,400 --> 00:28:01,600 Speaker 1: this up. As of this morning, Burger King has over 439 00:28:01,720 --> 00:28:05,639 Speaker 1: eight point six million people following them. That's more than 440 00:28:05,680 --> 00:28:10,560 Speaker 1: Stephen Colbert. Now what sober adult clicks to be friends 441 00:28:10,720 --> 00:28:14,400 Speaker 1: with a fast food restaurant? The key is that they're 442 00:28:14,400 --> 00:28:19,880 Speaker 1: not selling on their feed. They're strengthening their connection with you, 443 00:28:19,960 --> 00:28:23,560 Speaker 1: like you're a friend. They say chatty things like you 444 00:28:23,760 --> 00:28:27,159 Speaker 1: like us, we love you.
They're just like a living, 445 00:28:27,400 --> 00:28:31,639 Speaker 1: breathing person. And this is what companies do to plug 446 00:28:31,640 --> 00:28:47,120 Speaker 1: into these networks. So I was in Texas recently 447 00:28:47,160 --> 00:28:49,960 Speaker 1: and I noticed a billboard that said the most quote 448 00:28:50,240 --> 00:28:54,080 Speaker 1: liked energy company in Texas. You can choose your own 449 00:28:54,160 --> 00:28:57,920 Speaker 1: energy provider there. So I wondered, what is there to 450 00:28:58,680 --> 00:29:01,560 Speaker 1: like about an energy company? In other words, why on social 451 00:29:01,600 --> 00:29:04,000 Speaker 1: media would you click to like them? So I went 452 00:29:04,040 --> 00:29:06,480 Speaker 1: and looked them up on Facebook. They had three hundred 453 00:29:06,560 --> 00:29:10,560 Speaker 1: thousand friends. So a month ago, I was on an 454 00:29:10,600 --> 00:29:13,040 Speaker 1: airplane and sitting next to a guy and we started 455 00:29:13,120 --> 00:29:16,320 Speaker 1: chatting about this, and he says, oh, yeah, I'm a 456 00:29:16,360 --> 00:29:19,160 Speaker 1: member of that company. So first I was struck that 457 00:29:19,320 --> 00:29:22,560 Speaker 1: he didn't say customer, he said member. So I thought, 458 00:29:22,600 --> 00:29:25,320 Speaker 1: I need to understand what exactly this company is doing 459 00:29:25,320 --> 00:29:28,040 Speaker 1: that makes them so liked. So I went to their 460 00:29:28,040 --> 00:29:31,720 Speaker 1: web page, and on their site, they're hitting all three 461 00:29:31,800 --> 00:29:34,440 Speaker 1: of these neural networks that I talked about. So in 462 00:29:34,440 --> 00:29:37,040 Speaker 1: one part of their homepage, they're plugging right into the 463 00:29:37,160 --> 00:29:42,080 Speaker 1: valuation networks. They say they're the least expensive company and 464 00:29:42,120 --> 00:29:44,360 Speaker 1: they'll save you tons of money, and they have pictures 465 00:29:44,400 --> 00:29:48,360 Speaker 1: of dollar bills snowing down. Now next to that, they 466 00:29:48,400 --> 00:29:51,560 Speaker 1: have a picture that shows when you buy your energy 467 00:29:51,600 --> 00:29:54,600 Speaker 1: from them, they'll give you points that can be turned 468 00:29:54,600 --> 00:29:58,160 Speaker 1: into rewards, and they have pictures of airline tickets and 469 00:29:58,280 --> 00:30:02,800 Speaker 1: coffee and cash. And this section plugs right into the 470 00:30:02,920 --> 00:30:04,080 Speaker 1: orbitofrontal networks. 471 00:30:04,080 --> 00:30:07,040 Speaker 2: They care about reward. And right below that, 472 00:30:07,000 --> 00:30:09,480 Speaker 1: they have a section on the page that plugs into 473 00:30:09,600 --> 00:30:14,680 Speaker 1: social context. It says, when you earn, your friends earn. 474 00:30:15,360 --> 00:30:18,280 Speaker 1: And there's an attractive young woman holding a sign that 475 00:30:18,360 --> 00:30:21,040 Speaker 1: tells us her name is Stacy, and behind her are 476 00:30:21,680 --> 00:30:24,160 Speaker 1: a group of her attractive friends, and they're holding a 477 00:30:24,200 --> 00:30:27,960 Speaker 1: sign that says friends of Stacy. What you can read 478 00:30:28,040 --> 00:30:30,760 Speaker 1: on their faces is that they're all so taken with 479 00:30:30,800 --> 00:30:35,000 Speaker 1: Stacy because she's earning money for them.
She's earning and 480 00:30:35,040 --> 00:30:38,960 Speaker 1: everyone likes her, and they're happy and thriving and good 481 00:30:38,960 --> 00:30:42,880 Speaker 1: looking and have a tight relationship, and they're cheering 482 00:30:42,480 --> 00:30:46,560 Speaker 2: for Stacy with no hesitation. They really like her. 483 00:30:46,640 --> 00:30:49,120 Speaker 1: Possibly some of them are in love with her. And 484 00:30:49,160 --> 00:30:52,320 Speaker 1: this plugs right into the social networks that tell you 485 00:30:52,360 --> 00:30:56,800 Speaker 1: that your friends think this is cool. So this company 486 00:30:56,880 --> 00:31:00,480 Speaker 1: is doing a textbook job of hitting all the apexes 487 00:31:00,560 --> 00:31:04,080 Speaker 1: to align these networks in your brain so you don't 488 00:31:04,120 --> 00:31:07,479 Speaker 1: have to feel like, yeah, they're inexpensive but they're boring, 489 00:31:08,160 --> 00:31:11,520 Speaker 1: or they're socially cool but they're too expensive or whatever. 490 00:31:11,720 --> 00:31:15,240 Speaker 1: They're firing on all cylinders and that's how they're able 491 00:31:15,280 --> 00:31:19,000 Speaker 1: to attract so many likes. Now, I'll mention something else. 492 00:31:19,040 --> 00:31:21,600 Speaker 1: When I went to their page, it told me that 493 00:31:22,040 --> 00:31:26,080 Speaker 1: two of my friends like this energy company. Now, who 494 00:31:26,120 --> 00:31:30,240 Speaker 1: spends their time liking an energy company? But I want 495 00:31:30,280 --> 00:31:31,760 Speaker 1: to point out that this has been one of the 496 00:31:31,800 --> 00:31:35,840 Speaker 1: most important tools of social media, not just the like button, 497 00:31:36,280 --> 00:31:41,200 Speaker 1: but the notification that your friends like this. Because brains 498 00:31:41,200 --> 00:31:45,400 Speaker 1: are so social, we really care about that situation. We 499 00:31:45,600 --> 00:31:49,160 Speaker 1: buy what our friends buy. We care about what our 500 00:31:49,240 --> 00:31:53,320 Speaker 1: friends care about. We are social animals, and things have 501 00:31:53,600 --> 00:31:57,960 Speaker 1: enormous sway if our friends like them. That little message 502 00:31:58,280 --> 00:32:01,959 Speaker 1: plugs right into these networks telling us that it's cool. 503 00:32:02,320 --> 00:32:06,120 Speaker 1: And this is why social marketing is so effective. Think 504 00:32:06,120 --> 00:32:09,000 Speaker 1: about it this way. Would you pay one hundred and 505 00:32:09,040 --> 00:32:12,520 Speaker 1: twenty nine dollars for some weird new earbuds that were 506 00:32:12,560 --> 00:32:16,960 Speaker 1: green and hung halfway down your face? Maybe you would 507 00:32:16,960 --> 00:32:19,600 Speaker 1: be more likely to do it if your friends were 508 00:32:19,600 --> 00:32:22,280 Speaker 1: all doing it, if when you saw the ad for 509 00:32:22,320 --> 00:32:26,080 Speaker 1: those you saw that Bob and Tonica and Wag were 510 00:32:26,120 --> 00:32:29,920 Speaker 1: all doing it. Because we're social animals, we care 511 00:32:30,320 --> 00:32:33,479 Speaker 1: what our friends do. If they like it, we like it. 512 00:32:33,520 --> 00:32:37,120 Speaker 1: This is the basis of all trends, which are almost 513 00:32:37,120 --> 00:32:39,440 Speaker 1: always things that we look back on after a decade 514 00:32:39,480 --> 00:32:40,360 Speaker 1: and we can't
515 00:32:40,080 --> 00:32:42,520 Speaker 2: believe what we did or what we wore or what 516 00:32:42,560 --> 00:32:43,120 Speaker 2: we bought. 517 00:32:43,440 --> 00:32:46,080 Speaker 1: But at the time we did it, everyone was doing it, 518 00:32:46,600 --> 00:32:49,640 Speaker 1: and that was the basis of the coolness. That was 519 00:32:49,680 --> 00:32:53,000 Speaker 1: what plugged into these networks in the midline of your brain 520 00:32:53,040 --> 00:32:57,720 Speaker 1: here. Economists talk about faith in the market, but the 521 00:32:57,840 --> 00:32:59,960 Speaker 1: reason this all has so much influence on our 522 00:33:00,000 --> 00:33:04,160 Speaker 1: behavior is because of faith in the social market. We 523 00:33:04,600 --> 00:33:08,720 Speaker 1: can't possibly have time to research everything, but we assume, 524 00:33:09,720 --> 00:33:13,720 Speaker 1: almost always erroneously, that if our friend has bought that thing, 525 00:33:13,920 --> 00:33:18,080 Speaker 1: it must be because he or she has invested the 526 00:33:18,280 --> 00:33:22,720 Speaker 1: time researching it and weighing all the options and selecting 527 00:33:23,320 --> 00:33:27,040 Speaker 1: the killer perfect brand or model. So that gives us 528 00:33:27,160 --> 00:33:30,800 Speaker 1: faith and we can save time, because our friend, who 529 00:33:30,800 --> 00:33:34,040 Speaker 1: we like because they're smart, or simply good looking, or 530 00:33:34,080 --> 00:33:37,040 Speaker 1: simply because they like us back, if that person has 531 00:33:37,160 --> 00:33:40,200 Speaker 1: chosen this brand, then it must be awesome, or they 532 00:33:40,240 --> 00:33:43,160 Speaker 1: wouldn't have done it. And that's why the social brain 533 00:33:43,360 --> 00:33:48,640 Speaker 1: is so critical to our decisions. So zooming out, we 534 00:33:48,720 --> 00:33:54,680 Speaker 1: make our purchasing decisions based on these three networks: price point, 535 00:33:54,960 --> 00:33:59,240 Speaker 1: the emotional feeling, and social context. And there are other 536 00:33:59,280 --> 00:34:01,840 Speaker 1: networks involved as well, but these are some of the biggies. 537 00:34:02,560 --> 00:34:05,160 Speaker 1: And the key is that these are all running as 538 00:34:05,200 --> 00:34:09,600 Speaker 1: a team of rivals. These systems are always battling it 539 00:34:09,640 --> 00:34:13,359 Speaker 1: out under the surface, and it's all unconscious, which means 540 00:34:13,360 --> 00:34:15,400 Speaker 1: you don't have access to the details. 541 00:34:15,800 --> 00:34:17,680 Speaker 2: So when you choose to get 542 00:34:17,520 --> 00:34:20,640 Speaker 1: a burger at one fast food place over another, you 543 00:34:20,680 --> 00:34:24,320 Speaker 1: don't ask, am I buying this because of the price point, 544 00:34:24,400 --> 00:34:27,760 Speaker 1: because of the emotional salience of the salt and fat 545 00:34:27,840 --> 00:34:30,759 Speaker 1: and the colors used in the restaurant, or because their 546 00:34:30,800 --> 00:34:34,920 Speaker 1: ads feature beautiful young people in love and that reminds 547 00:34:34,960 --> 00:34:37,040 Speaker 1: me of my friends and my social life, or what 548 00:34:37,080 --> 00:34:39,760 Speaker 1: I wish it were.
When you're at the store choosing 549 00:34:39,800 --> 00:34:43,759 Speaker 1: between ice cream brands, when you're scanning the Ben and 550 00:34:43,840 --> 00:34:47,080 Speaker 1: Jerry's and Blue Bell and Häagen-Dazs, you don't have access to 551 00:34:47,120 --> 00:34:49,919 Speaker 1: the details of the battles going on under the hood. 552 00:34:50,280 --> 00:34:53,359 Speaker 1: You simply wait for a feeling to get served up 553 00:34:53,400 --> 00:34:56,240 Speaker 1: to your consciousness, and then you reach out and grab 554 00:34:56,719 --> 00:34:59,319 Speaker 1: one container off the shelf and not the others. So 555 00:34:59,400 --> 00:35:02,120 Speaker 1: keep this in mind the next time you're choosing between brands. 556 00:35:02,160 --> 00:35:05,080 Speaker 1: Why are you choosing this one? What are the elements 557 00:35:05,120 --> 00:35:08,640 Speaker 1: that are influencing your decision? Might you make a different 558 00:35:08,719 --> 00:35:11,640 Speaker 1: choice if you become aware of the parts that are 559 00:35:11,680 --> 00:35:15,160 Speaker 1: influencing you? You can't change the wiring of your brain, 560 00:35:15,239 --> 00:35:24,800 Speaker 1: but you can become more aware of what steers you. Okay, 561 00:35:24,840 --> 00:35:28,480 Speaker 1: so the previous episode was called How Is the Brain Like 562 00:35:28,520 --> 00:35:32,439 Speaker 1: a Team of Rivals? And in this episode we dove 563 00:35:32,440 --> 00:35:36,880 Speaker 1: into specific examples of different drives that pull our behavior 564 00:35:36,920 --> 00:35:39,600 Speaker 1: in different directions. So I hope you'll join me for 565 00:35:39,719 --> 00:35:43,439 Speaker 1: the next episode, Part three, where we will discuss how 566 00:35:43,480 --> 00:35:47,680 Speaker 1: we can take this knowledge and use it to navigate 567 00:35:47,760 --> 00:35:50,800 Speaker 1: our own behavior. Because you have systems in your brain 568 00:35:50,840 --> 00:35:52,879 Speaker 1: that deal not only with what is right in front 569 00:35:52,920 --> 00:35:55,920 Speaker 1: of you, but also that do your long term thinking. 570 00:35:56,520 --> 00:35:58,719 Speaker 1: So how do you get yourself to do the right 571 00:35:58,840 --> 00:36:02,640 Speaker 1: thing in the moment? How do you balance networks that 572 00:36:02,880 --> 00:36:06,399 Speaker 1: care about the now with networks that care about who 573 00:36:06,480 --> 00:36:09,000 Speaker 1: you are and who you want to be? How do 574 00:36:09,080 --> 00:36:14,120 Speaker 1: you forgo temptation that's not aligned with your long term goals? 575 00:36:14,440 --> 00:36:18,239 Speaker 1: How is your decision about eating that cookie related to 576 00:36:18,920 --> 00:36:22,000 Speaker 1: the hero of the Trojan War? I'll see you there 577 00:36:22,040 --> 00:36:25,360 Speaker 1: in the next episode for the culmination of everything that 578 00:36:25,400 --> 00:36:26,879 Speaker 1: we've been talking about so far. 579 00:36:30,160 --> 00:36:31,360 Speaker 2: That's all for this week. 580 00:36:31,600 --> 00:36:34,080 Speaker 1: To find out more and to share your thoughts, head 581 00:36:34,080 --> 00:36:37,920 Speaker 1: over to Eagleman dot com slash Podcasts, and you can 582 00:36:37,960 --> 00:36:42,400 Speaker 1: also watch full episodes of Inner Cosmos on YouTube. Subscribe 583 00:36:42,400 --> 00:36:44,719 Speaker 1: to my channel so you can follow along each week 584 00:36:44,760 --> 00:36:48,279 Speaker 1: for new updates.
I'd love to hear your questions, so 585 00:36:48,360 --> 00:36:52,480 Speaker 1: please send those to podcasts at eagleman dot com and 586 00:36:52,520 --> 00:36:55,280 Speaker 1: I will do a special episode where I answer questions. 587 00:36:55,680 --> 00:36:59,520 Speaker 1: Until next time, I'm David Eagleman, and this is Inner 588 00:36:59,600 --> 00:37:00,640 Speaker 1: Cosmos.