1 00:00:05,080 --> 00:00:08,320 Speaker 1: Why is it so hard to keep a secret? 2 00:00:08,560 --> 00:00:10,799 Speaker 2: And what does this have to do with Abraham Lincoln's 3 00:00:10,800 --> 00:00:16,120 Speaker 2: political cabinet, or when bosses should keep secrets within companies? 4 00:00:16,680 --> 00:00:19,240 Speaker 2: What does any of this have to do with political 5 00:00:19,440 --> 00:00:25,200 Speaker 2: hierarchies or chimpanzees or the formula for Coca-Cola, or 6 00:00:25,680 --> 00:00:30,160 Speaker 2: whether AI in the near future will be keeping secrets 7 00:00:30,240 --> 00:00:34,560 Speaker 2: from you? Today's episode is all about secrets. We've all 8 00:00:34,840 --> 00:00:37,879 Speaker 2: got them, from the little white lies we tell to 9 00:00:37,920 --> 00:00:41,640 Speaker 2: spare somebody's feelings to the deep unspoken things that we 10 00:00:41,680 --> 00:00:45,879 Speaker 2: try to bury. Secrets are a part of being a primate. 11 00:00:45,960 --> 00:00:48,680 Speaker 2: But why do we have them? Why do we keep them? 12 00:00:49,080 --> 00:00:52,440 Speaker 2: And what happens when they start to unravel? As we'll see, 13 00:00:52,960 --> 00:00:57,880 Speaker 2: neuroscience tells us that secrets aren't just stashed away. They're active. 14 00:00:58,000 --> 00:01:02,040 Speaker 2: They weigh on us, they shape our relationships, and sometimes 15 00:01:02,560 --> 00:01:09,600 Speaker 2: they grow into tangled webs we never meant to weave. Today, 16 00:01:09,600 --> 00:01:12,880 Speaker 2: we'll dive into the neuroscience of secrets, how they form, 17 00:01:12,959 --> 00:01:15,560 Speaker 2: how they grow, and what it takes to keep them 18 00:01:15,720 --> 00:01:18,720 Speaker 2: or spill them. This is Inner Cosmos, and I'm 19 00:01:18,800 --> 00:01:22,679 Speaker 2: David Eagleman. Let's dive into our three-pound universe to 20 00:01:22,959 --> 00:01:36,520 Speaker 2: untangle the web.
Now, before diving into the science of secrets, 21 00:01:36,560 --> 00:01:40,120 Speaker 2: we'll start with an observation that researchers made some time ago. 22 00:01:40,880 --> 00:01:45,560 Speaker 2: Keeping a secret is unhealthy. So there's a psychologist named 23 00:01:45,600 --> 00:01:49,040 Speaker 2: James Pennebaker, and he and his colleagues studied what happened 24 00:01:49,480 --> 00:01:53,680 Speaker 2: when victims of rape or incest acted out 25 00:01:53,720 --> 00:01:59,200 Speaker 2: of shame or guilt and chose to keep secrets inside. So, 26 00:01:59,240 --> 00:02:04,000 Speaker 2: after years of study, Pennebaker concluded that, quote, the act 27 00:02:04,240 --> 00:02:08,840 Speaker 2: of not discussing or confiding the event with another may be 28 00:02:09,000 --> 00:02:13,560 Speaker 2: more damaging than having experienced the event per se. He 29 00:02:13,600 --> 00:02:17,880 Speaker 2: and his team discovered that when subjects confessed or wrote 30 00:02:17,960 --> 00:02:23,120 Speaker 2: about their deeply held secrets, their health improved. Their number 31 00:02:23,160 --> 00:02:27,359 Speaker 2: of doctor visits went down, and there were measurable decreases 32 00:02:27,520 --> 00:02:31,080 Speaker 2: in their stress hormone levels. So the results are clear enough. 33 00:02:31,440 --> 00:02:34,720 Speaker 2: But some years ago I began to ask myself how 34 00:02:34,760 --> 00:02:38,320 Speaker 2: to understand these findings from the point of view of 35 00:02:38,440 --> 00:02:41,040 Speaker 2: brain science, and that led to a question that I 36 00:02:41,120 --> 00:02:47,919 Speaker 2: realized was unaddressed in the scientific literature. What is a secret, neurobiologically? 37 00:02:48,280 --> 00:02:53,760 Speaker 2: Imagine constructing an artificial neural network of millions of interconnected neurons. 38 00:02:54,320 --> 00:02:56,040 Speaker 1: What would a secret look like?
39 00:02:56,400 --> 00:03:01,720 Speaker 2: Could a toaster with its interconnected parts harbor 40 00:03:01,840 --> 00:03:07,639 Speaker 2: a secret? We have useful scientific frameworks for understanding things 41 00:03:07,720 --> 00:03:12,080 Speaker 2: like Parkinson's disease, or color perception, or sensing temperature, but 42 00:03:12,160 --> 00:03:16,440 Speaker 2: we don't really have any scientific framework for understanding what 43 00:03:16,560 --> 00:03:20,160 Speaker 2: it means for the brain to have and to hold 44 00:03:20,480 --> 00:03:23,480 Speaker 2: a secret. So that's what we're going to put into 45 00:03:23,480 --> 00:03:26,839 Speaker 2: place today. Now, before we get there, let's start with 46 00:03:27,360 --> 00:03:32,000 Speaker 2: Abraham Lincoln. In eighteen sixty, the young Lincoln emerged as 47 00:03:32,040 --> 00:03:35,400 Speaker 2: the Republican Party's candidate for president. And this 48 00:03:35,720 --> 00:03:40,360 Speaker 2: was a surprise because he was relatively unknown compared to 49 00:03:40,440 --> 00:03:44,600 Speaker 2: the seasoned politicians he was going up against, like William 50 00:03:44,640 --> 00:03:47,840 Speaker 2: Seward and Salmon Chase and Edward Bates, who were 51 00:03:47,880 --> 00:03:51,400 Speaker 2: also vying for that nomination. But here's what Lincoln did 52 00:03:51,520 --> 00:03:57,000 Speaker 2: after he won the presidency. He chose those people, his rivals, 53 00:03:57,560 --> 00:04:02,360 Speaker 2: to comprise his presidential cabinet. And not only them, but 54 00:04:02,440 --> 00:04:06,440 Speaker 2: he also put in other people with totally different political 55 00:04:06,480 --> 00:04:10,440 Speaker 2: ideologies from his. And that's how he formed his cabinet. 56 00:04:10,840 --> 00:04:17,120 Speaker 2: The historian Doris Kearns Goodwin labeled this a team of rivals.
Now, 57 00:04:17,200 --> 00:04:20,760 Speaker 2: I've always loved this fact about Lincoln, first because it's 58 00:04:20,800 --> 00:04:24,800 Speaker 2: actually quite sensible politically, but very few people do it. 59 00:04:24,800 --> 00:04:30,400 Speaker 2: It's sensible because bringing together people with conflicting perspectives reflected 60 00:04:30,680 --> 00:04:33,480 Speaker 2: his deep commitment to the Union and his focus on 61 00:04:34,000 --> 00:04:38,440 Speaker 2: the bigger picture rather than personal grudges. He believed that 62 00:04:38,480 --> 00:04:42,120 Speaker 2: the challenges of leading a divided nation through the Civil 63 00:04:42,160 --> 00:04:46,080 Speaker 2: War required the best minds available, even if they disagreed 64 00:04:46,120 --> 00:04:48,760 Speaker 2: with him or each other. So he got a spread 65 00:04:48,760 --> 00:04:51,880 Speaker 2: of opinions this way. But the key thing is that 66 00:04:51,920 --> 00:04:57,560 Speaker 2: the internal conflict gave strength to his presidency, because no 67 00:04:57,600 --> 00:05:00,679 Speaker 2: one gets to enjoy the delusion that there's a 68 00:05:00,760 --> 00:05:04,760 Speaker 2: right and a wrong answer. Instead, every disagreement gets 69 00:05:04,920 --> 00:05:07,840 Speaker 2: fought out, and sometimes a good solution is reached, and 70 00:05:08,080 --> 00:05:12,400 Speaker 2: sometimes the problem festers and is revisited over and over. Now, 71 00:05:12,440 --> 00:05:15,600 Speaker 2: the argument I put forward in my book Incognito is that 72 00:05:15,640 --> 00:05:21,360 Speaker 2: the brain fundamentally can best be understood as a team 73 00:05:21,720 --> 00:05:24,599 Speaker 2: of rivals. So what does that mean? Well, it means 74 00:05:24,640 --> 00:05:28,880 Speaker 2: that you are made up of many different neural networks, 75 00:05:29,040 --> 00:05:33,640 Speaker 2: each of which sees different data and has its own goals.
76 00:05:34,240 --> 00:05:37,240 Speaker 2: And this is why you can argue with yourself, or 77 00:05:37,800 --> 00:05:41,800 Speaker 2: get mad at yourself, or try to control yourself to 78 00:05:41,839 --> 00:05:45,320 Speaker 2: do something, because you are 79 00:05:45,360 --> 00:05:50,680 Speaker 2: not one thing, you are a multitude. I'll give you 80 00:05:50,720 --> 00:05:52,720 Speaker 2: an example of this when it comes to how we 81 00:05:52,760 --> 00:05:56,640 Speaker 2: make purchasing decisions, as in which restaurant to choose, or 82 00:05:56,640 --> 00:06:00,000 Speaker 2: which clothing brand to buy, or which car to purchase. 83 00:06:00,720 --> 00:06:05,159 Speaker 2: In economics, we were taught about Homo economicus, the rational 84 00:06:05,200 --> 00:06:09,200 Speaker 2: decision maker who maximizes gain and minimizes loss and is 85 00:06:09,279 --> 00:06:10,320 Speaker 2: unswayed by emotion. 86 00:06:10,880 --> 00:06:13,839 Speaker 1: But that is an idealized model. 87 00:06:14,240 --> 00:06:19,000 Speaker 2: Real humans are emotional and inconsistent and easily influenced by 88 00:06:19,120 --> 00:06:24,120 Speaker 2: context and branding, and the gap between theory and reality 89 00:06:24,440 --> 00:06:28,960 Speaker 2: has given rise to the field of neuroeconomics. So consider this. 90 00:06:29,040 --> 00:06:32,480 Speaker 2: I give you two quarts of generic ice cream. One 91 00:06:32,560 --> 00:06:35,240 Speaker 2: is priced lower than the other, and so the rational 92 00:06:35,360 --> 00:06:38,600 Speaker 2: choice is pretty easy.
But if I reveal the labels, 93 00:06:38,680 --> 00:06:41,560 Speaker 2: let's say one is Ben and Jerry's, one is Häagen-Dazs, 94 00:06:42,040 --> 00:06:46,479 Speaker 2: your decisions might shift, because all your experience with their 95 00:06:46,600 --> 00:06:50,920 Speaker 2: branding kicks in, and your emotional predictions about how good 96 00:06:50,960 --> 00:06:54,440 Speaker 2: something is likely to taste based on your past experiences, 97 00:06:54,520 --> 00:06:56,960 Speaker 2: and maybe your sense of whether your friends like this 98 00:06:57,120 --> 00:07:00,680 Speaker 2: or not. And all of this shifts the decision making 99 00:07:01,040 --> 00:07:03,840 Speaker 2: one way or the other. What two decades of neuroimaging 100 00:07:03,880 --> 00:07:07,559 Speaker 2: has shown is that our decisions about what to buy 101 00:07:08,120 --> 00:07:11,640 Speaker 2: are driven by at least three major brain networks. You 102 00:07:11,720 --> 00:07:15,480 Speaker 2: have one network along the midline that calculates the price 103 00:07:15,640 --> 00:07:19,840 Speaker 2: or worth of something. Then you've got a separate network 104 00:07:19,920 --> 00:07:22,960 Speaker 2: involved in emotion, mostly in the orbitofrontal cortex, 105 00:07:23,400 --> 00:07:27,480 Speaker 2: and that anticipates how good or bad something will feel. 106 00:07:27,720 --> 00:07:30,760 Speaker 2: Then you've got another network that's all about your choices 107 00:07:30,960 --> 00:07:34,840 Speaker 2: in a social context, as in, what do your friends 108 00:07:34,840 --> 00:07:35,560 Speaker 2: think about this? 109 00:07:35,680 --> 00:07:37,000 Speaker 1: Is it cool or is it lame? 110 00:07:37,360 --> 00:07:40,640 Speaker 2: And all of this tells us why companies don't just 111 00:07:40,840 --> 00:07:45,080 Speaker 2: advertise with bullet points about their rational advantages. 112 00:07:45,400 --> 00:07:48,000 Speaker 1: They spend millions of dollars on ads.
113 00:07:48,200 --> 00:07:51,280 Speaker 2: To assure us that this is the best value, and 114 00:07:51,480 --> 00:07:54,960 Speaker 2: appeal to our feelings, and to assure us that everyone 115 00:07:55,040 --> 00:07:59,200 Speaker 2: else loves this product as well. So, as a result 116 00:07:59,240 --> 00:08:03,840 Speaker 2: of the complicated voting of our neural parliaments, decisions 117 00:08:03,880 --> 00:08:07,480 Speaker 2: aren't always straightforward to decode. So the next time you're 118 00:08:07,480 --> 00:08:11,240 Speaker 2: thinking about your choice between ice cream brands or fast food 119 00:08:11,280 --> 00:08:15,880 Speaker 2: restaurants or car brands, ask yourself, what is driving this decision? 120 00:08:16,040 --> 00:08:19,640 Speaker 2: Is it price? Is it emotion? Is there a social 121 00:08:19,680 --> 00:08:23,240 Speaker 2: influence on what I'm choosing? All these networks are fighting 122 00:08:23,320 --> 00:08:27,720 Speaker 2: it out, each trying to have control in steering the 123 00:08:27,760 --> 00:08:30,520 Speaker 2: ship of state. If you're interested in more about this, 124 00:08:30,600 --> 00:08:34,240 Speaker 2: check out episodes eight and nine. Now, I mention all this 125 00:08:34,440 --> 00:08:39,000 Speaker 2: to illustrate this notion of battles in the brain. 126 00:08:39,400 --> 00:08:41,960 Speaker 2: It's not just with buying decisions, but with all the 127 00:08:42,040 --> 00:08:44,920 Speaker 2: choices you make in life. You have some networks that 128 00:08:45,040 --> 00:08:48,600 Speaker 2: care about the short term in decision making, and you 129 00:08:48,640 --> 00:08:51,600 Speaker 2: have some that care about the long term.
You have 130 00:08:51,679 --> 00:08:55,400 Speaker 2: networks that are monitoring and making predictions about the outside 131 00:08:55,440 --> 00:08:59,080 Speaker 2: world and others that care about your inside world, the 132 00:08:59,160 --> 00:09:02,360 Speaker 2: state of your body. Some of your networks provide a 133 00:09:02,440 --> 00:09:06,400 Speaker 2: drive towards novelty and others towards familiarity, and these can 134 00:09:06,480 --> 00:09:09,520 Speaker 2: both be active at the same time and having an 135 00:09:09,679 --> 00:09:13,520 Speaker 2: arm wrestle. So think about these rivaling networks like a 136 00:09:14,160 --> 00:09:18,720 Speaker 2: neural parliament. You're built of different political parties, all of 137 00:09:18,720 --> 00:09:22,000 Speaker 2: whom love their country, but they just have different opinions 138 00:09:22,080 --> 00:09:25,360 Speaker 2: about the best way forward. So this gives us a 139 00:09:25,840 --> 00:09:29,120 Speaker 2: sense of these battling networks, and now I want to 140 00:09:29,200 --> 00:09:32,760 Speaker 2: zoom in on something surprising. What do you get out of 141 00:09:32,800 --> 00:09:36,960 Speaker 2: a brain that is composed of multiple rivaling networks? You 142 00:09:37,040 --> 00:09:42,160 Speaker 2: get a universe of secrets. The team of rivals framework 143 00:09:42,200 --> 00:09:46,000 Speaker 2: allows us to address a mystery that would be inexplicable 144 00:09:46,280 --> 00:09:49,720 Speaker 2: if we took the point of view of traditional computer programs. 145 00:09:49,960 --> 00:09:51,840 Speaker 2: So let's come back to the question I asked at 146 00:09:51,880 --> 00:09:56,679 Speaker 2: the beginning. What is a secret, neurobiologically? And could your 147 00:09:56,720 --> 00:09:59,960 Speaker 2: toaster keep a secret? Within the team of rivals framework, 148 00:10:00,320 --> 00:10:04,959 Speaker 2: a secret is easily understood.
It's the result of struggle 149 00:10:05,120 --> 00:10:08,360 Speaker 2: between competing parties in the brain. One part of the 150 00:10:08,360 --> 00:10:12,480 Speaker 2: brain wants to reveal something and another part does not 151 00:10:12,640 --> 00:10:16,199 Speaker 2: want to. When there are competing votes in the brain, 152 00:10:16,320 --> 00:10:19,559 Speaker 2: one for telling and one for withholding, that's a secret. 153 00:10:20,200 --> 00:10:24,120 Speaker 2: If neither party cares to tell, that's merely a boring fact. 154 00:10:24,400 --> 00:10:27,000 Speaker 2: And if both parties want to tell, that's just a 155 00:10:27,000 --> 00:10:30,960 Speaker 2: good story. Without the framework of rivalry, we would have 156 00:10:31,000 --> 00:10:34,840 Speaker 2: no way to understand a secret. The reason a secret 157 00:10:34,960 --> 00:10:40,080 Speaker 2: is experienced consciously is because it results from a conflict. 158 00:10:40,120 --> 00:10:44,240 Speaker 2: It's not business as usual, and so quite often our 159 00:10:44,360 --> 00:10:48,440 Speaker 2: consciousness is called upon to deal with it. Now, the 160 00:10:48,520 --> 00:10:53,160 Speaker 2: main reason not to tell a secret is aversion to 161 00:10:53,240 --> 00:10:57,560 Speaker 2: the long term consequences. A friend might think ill of you, 162 00:10:58,040 --> 00:11:02,800 Speaker 2: or a relationship might be irrevocably damaged, or a community 163 00:11:02,880 --> 00:11:07,920 Speaker 2: might ostracize you. This concern about the outcome is evidenced 164 00:11:07,920 --> 00:11:10,360 Speaker 2: by the fact that people are more likely to tell 165 00:11:10,400 --> 00:11:14,640 Speaker 2: their secrets to total strangers. With somebody that you don't know, 166 00:11:15,240 --> 00:11:19,400 Speaker 2: the neural conflict can be dissipated with none of the costs.
167 00:11:20,000 --> 00:11:25,040 Speaker 2: And that's why strangers are occasionally so forthcoming on airplanes 168 00:11:25,160 --> 00:11:28,320 Speaker 2: or in bars, telling you all the details of their 169 00:11:28,760 --> 00:11:32,200 Speaker 2: marital troubles. And we can see a modern twist on 170 00:11:32,320 --> 00:11:35,960 Speaker 2: this ancient need to confess to strangers in the form 171 00:11:36,000 --> 00:11:40,439 Speaker 2: of websites like postsecret dot com. This is a site 172 00:11:40,440 --> 00:11:45,600 Speaker 2: where people go to anonymously disclose their confessions. Here are 173 00:11:45,760 --> 00:11:46,600 Speaker 2: some examples. 174 00:11:47,080 --> 00:11:47,520 Speaker 1: Quote. 175 00:11:47,640 --> 00:11:50,800 Speaker 2: When my daughter was stillborn, I not only thought about 176 00:11:50,960 --> 00:11:54,120 Speaker 2: kidnapping a baby, I planned it out in my head. 177 00:11:54,520 --> 00:11:58,320 Speaker 2: I even found myself watching new mothers with their babies, 178 00:11:58,679 --> 00:12:01,520 Speaker 2: trying to pick the perfect one. Now, we don't know who 179 00:12:01,600 --> 00:12:05,640 Speaker 2: posted this, but for that person something was released by 180 00:12:05,679 --> 00:12:09,320 Speaker 2: confessing this. Here's another one. I was adopted, and my 181 00:12:09,400 --> 00:12:13,080 Speaker 2: biggest fear is passing my biological father on the street. 182 00:12:13,600 --> 00:12:16,080 Speaker 2: Or here's another one. I'm almost certain that your son 183 00:12:16,200 --> 00:12:19,280 Speaker 2: has autism, but I have no idea how to tell you. 184 00:12:19,440 --> 00:12:22,280 Speaker 2: There are tens of thousands of secrets on this website. 185 00:12:22,840 --> 00:12:27,400 Speaker 2: And why?
It's because just telling your secret to the website, 186 00:12:27,440 --> 00:12:30,319 Speaker 2: where you believe it will be read by other people, 187 00:12:30,559 --> 00:12:35,680 Speaker 2: strangers to you, is sufficient. And this reduction of tension 188 00:12:35,760 --> 00:12:39,280 Speaker 2: by releasing a secret, this, I think, also explains a 189 00:12:39,360 --> 00:12:43,920 Speaker 2: staple in one of the world's largest religions, the confessional booth. 190 00:12:44,040 --> 00:12:48,040 Speaker 2: You get something off your chest and without the consequences 191 00:12:48,040 --> 00:12:49,679 Speaker 2: that might otherwise accrue. 192 00:12:49,760 --> 00:12:51,520 Speaker 1: This same need to reveal 193 00:12:51,200 --> 00:12:54,880 Speaker 2: ourselves might also shine light on the appeal of prayer, 194 00:12:55,080 --> 00:12:58,720 Speaker 2: especially with religions that have a very personal God who 195 00:12:58,840 --> 00:13:03,120 Speaker 2: lends an ear with undivided attention and non-judgmental love. 196 00:13:03,520 --> 00:13:06,120 Speaker 2: So why is a stranger or a website or a 197 00:13:06,160 --> 00:13:10,040 Speaker 2: deity useful? It's because it's the act of telling the 198 00:13:10,080 --> 00:13:13,160 Speaker 2: secret that matters. Things don't have to get solved; the 199 00:13:13,440 --> 00:13:16,880 Speaker 2: cognitive tension just has to be reduced. We've all seen 200 00:13:16,920 --> 00:13:19,920 Speaker 2: this: people often will vent a secret for its 201 00:13:19,960 --> 00:13:24,720 Speaker 2: own sake, not as an invitation for advice.
For example, 202 00:13:24,760 --> 00:13:27,400 Speaker 2: somebody tells you a confession and you make the mistake 203 00:13:27,440 --> 00:13:31,600 Speaker 2: of suggesting some obvious solution that only frustrates the other person, 204 00:13:31,920 --> 00:13:35,960 Speaker 2: because all they really wanted was to tell it, to reduce 205 00:13:36,000 --> 00:13:39,920 Speaker 2: the neural tension. The act of telling the secret was 206 00:13:40,000 --> 00:13:43,520 Speaker 2: the solution. Now an open question is why the receiver 207 00:13:43,800 --> 00:13:47,440 Speaker 2: of the secret has to be human, or human-like 208 00:13:47,480 --> 00:13:50,160 Speaker 2: in the case of deities. If you tell your secret 209 00:13:50,320 --> 00:13:53,640 Speaker 2: to a wall or a lizard or a goat, that's 210 00:13:53,679 --> 00:13:57,000 Speaker 2: going to be a lot less satisfying. Now, one thing 211 00:13:57,040 --> 00:13:59,720 Speaker 2: that strikes me as interesting in this regard is the 212 00:13:59,800 --> 00:14:03,439 Speaker 2: rising popularity of AI therapist bots. 213 00:14:03,679 --> 00:14:05,280 Speaker 1: Where will these sit in 214 00:14:05,280 --> 00:14:09,640 Speaker 2: the hierarchy of human to non-human? I think it'll 215 00:14:09,640 --> 00:14:12,640 Speaker 2: be pretty close to a person or a deity. I 216 00:14:12,679 --> 00:14:15,160 Speaker 2: suspect that in the future we'll be pretty happy to 217 00:14:15,240 --> 00:14:19,320 Speaker 2: let our secrets out to our AI therapist, but only 218 00:14:19,400 --> 00:14:22,720 Speaker 2: if we're convinced about the security of the company. Is 219 00:14:22,760 --> 00:14:26,520 Speaker 2: it just a matter of time before the FBI says 220 00:14:26,560 --> 00:14:29,880 Speaker 2: to the tech company, Hey, we can subpoena your therapy 221 00:14:29,960 --> 00:14:34,360 Speaker 2: history under special circumstances?
So it's still to be seen 222 00:14:34,480 --> 00:14:37,800 Speaker 2: how much people will unload on their bots, because the 223 00:14:37,880 --> 00:14:41,040 Speaker 2: key with unloading your secrets is to get the cognitive 224 00:14:41,040 --> 00:15:00,040 Speaker 2: benefit without the social consequences. Okay, so I've been 225 00:15:00,160 --> 00:15:03,640 Speaker 2: referring to this cognitive load of keeping a secret. But 226 00:15:03,680 --> 00:15:06,720 Speaker 2: the question is, why do we have difficulty? Why isn't 227 00:15:06,720 --> 00:15:10,600 Speaker 2: it easy to keep a secret? Here's why: because keeping 228 00:15:10,680 --> 00:15:17,000 Speaker 2: a secret requires ongoing mental effort to monitor your speech 229 00:15:17,120 --> 00:15:20,520 Speaker 2: and actions. You have to make sure that the hidden 230 00:15:20,560 --> 00:15:25,280 Speaker 2: information isn't accidentally revealed. Even if this is happening at 231 00:15:25,280 --> 00:15:30,000 Speaker 2: a subconscious level, it's a state of constant vigilance to 232 00:15:30,120 --> 00:15:32,920 Speaker 2: keep from blurting out the secret. So how do you 233 00:15:33,000 --> 00:15:36,440 Speaker 2: measure this? So some years ago, a scientist named Sean 234 00:15:36,520 --> 00:15:40,600 Speaker 2: Spence and his colleagues interviewed participants on what activities they 235 00:15:40,600 --> 00:15:43,160 Speaker 2: had done the previous day, and then they put them 236 00:15:43,360 --> 00:15:46,960 Speaker 2: in an fMRI brain scanner. These people were given questions 237 00:15:46,960 --> 00:15:49,560 Speaker 2: about what they did and they were given two buttons, 238 00:15:49,680 --> 00:15:52,760 Speaker 2: yes or no.
Now, on each round, they were given 239 00:15:52,840 --> 00:15:55,480 Speaker 2: either a red light or a green light to tell 240 00:15:55,520 --> 00:15:59,480 Speaker 2: them whether they should lie or whether they should tell 241 00:15:59,520 --> 00:16:02,680 Speaker 2: the truth on that trial. So the first thing the 242 00:16:02,680 --> 00:16:05,200 Speaker 2: researchers noted is that it always takes a little bit 243 00:16:05,240 --> 00:16:07,720 Speaker 2: longer to lie, about a fifth of a second longer, 244 00:16:07,720 --> 00:16:10,680 Speaker 2: and this is because you are taking the answer that 245 00:16:10,720 --> 00:16:13,800 Speaker 2: you already know, the true answer, and you're squelching that. 246 00:16:14,600 --> 00:16:19,000 Speaker 2: So they demonstrated that this squelching of the truth responses 247 00:16:19,520 --> 00:16:22,360 Speaker 2: activates a part of the prefrontal cortex, a part that's 248 00:16:22,640 --> 00:16:26,600 Speaker 2: down low, near the side. It's called the ventrolateral prefrontal cortex. 249 00:16:27,120 --> 00:16:30,280 Speaker 2: The thing to appreciate is that this region becomes active 250 00:16:30,800 --> 00:16:34,080 Speaker 2: whenever you need to suppress a behavior that you would normally 251 00:16:34,240 --> 00:16:36,800 Speaker 2: just go with, in this case telling the truth. And 252 00:16:36,920 --> 00:16:39,840 Speaker 2: other brain areas are involved too. In a study by 253 00:16:39,960 --> 00:16:42,920 Speaker 2: Langleben and colleagues the next year, participants went into a 254 00:16:42,960 --> 00:16:45,880 Speaker 2: scanner and they saw a playing card on the screen 255 00:16:45,960 --> 00:16:47,480 Speaker 2: and they were asked to remember it. 256 00:16:48,160 --> 00:16:49,320 Speaker 1: Then they were shown 257 00:16:49,080 --> 00:16:51,640 Speaker 2: a series of cards and they were asked, was that 258 00:16:51,760 --> 00:16:52,720 Speaker 2: your card or not?
259 00:16:53,480 --> 00:16:57,560 Speaker 1: And they were sometimes instructed to lie. And here's the key. 260 00:16:58,040 --> 00:17:01,040 Speaker 2: When they lied, researchers saw a lot of activity in 261 00:17:01,080 --> 00:17:04,359 Speaker 2: an area called the anterior cingulate cortex, which is a 262 00:17:04,400 --> 00:17:09,159 Speaker 2: region that detects conflict between other brain areas. So in 263 00:17:09,200 --> 00:17:12,159 Speaker 2: both these studies, people were just saying yes or no 264 00:17:12,359 --> 00:17:15,200 Speaker 2: to lie or not lie. But in the next study 265 00:17:15,400 --> 00:17:18,200 Speaker 2: by Spence and colleagues, they wanted to see what happened 266 00:17:18,240 --> 00:17:21,520 Speaker 2: in the brain when people are forced to get more 267 00:17:21,840 --> 00:17:25,359 Speaker 2: imaginative, when they go beyond yes-or-no answers to 268 00:17:25,480 --> 00:17:28,199 Speaker 2: make up a new story. So imagine I ask you, 269 00:17:28,840 --> 00:17:32,359 Speaker 2: where were you on the evening of February twenty second? 270 00:17:33,000 --> 00:17:34,639 Speaker 2: And you can figure out the answer. 271 00:17:34,760 --> 00:17:36,600 Speaker 1: Let's say you were at a great party that night. 272 00:17:37,000 --> 00:17:40,120 Speaker 2: But imagine that you lie to me and you say, oh, 273 00:17:40,240 --> 00:17:43,480 Speaker 2: I met my friend for a quiet movie on Netflix 274 00:17:43,480 --> 00:17:46,080 Speaker 2: that night. Now, in order to do this, in order 275 00:17:46,160 --> 00:17:49,679 Speaker 2: to lie, you have to think of the true response 276 00:17:49,720 --> 00:17:53,679 Speaker 2: and suppress that, like we saw before, but also you 277 00:17:53,800 --> 00:17:56,400 Speaker 2: then have to make something up. So you're now doing 278 00:17:56,440 --> 00:18:00,120 Speaker 2: two things.
You're suppressing the true answer, which cranks up 279 00:18:00,119 --> 00:18:04,080 Speaker 2: the ventrolateral prefrontal cortex like we saw, and then you 280 00:18:04,160 --> 00:18:07,119 Speaker 2: need to cook up the lie, and that activates a 281 00:18:07,160 --> 00:18:11,919 Speaker 2: different region, the dorsolateral prefrontal cortex. The dorsolateral area cranks 282 00:18:12,000 --> 00:18:15,440 Speaker 2: up when you generate something new, like a new story, 283 00:18:15,840 --> 00:18:17,720 Speaker 2: and especially if this is the first time that you're 284 00:18:17,720 --> 00:18:20,200 Speaker 2: making up that story. And we see other areas too, 285 00:18:20,400 --> 00:18:24,199 Speaker 2: like the anterior cingulate cortex, which reflects internal conflict. And 286 00:18:24,240 --> 00:18:28,479 Speaker 2: we see another part of the prefrontal cortex, the ventromedial region, 287 00:18:28,720 --> 00:18:32,280 Speaker 2: which cranks up when you're trying to regulate your emotions. 288 00:18:32,960 --> 00:18:37,800 Speaker 2: So covering up the truth is a high-energy endeavor 289 00:18:37,840 --> 00:18:41,560 Speaker 2: for the brain. We never see areas where there's more 290 00:18:41,600 --> 00:18:43,359 Speaker 2: activity when you tell the truth. 291 00:18:43,880 --> 00:18:46,560 Speaker 1: There's only more activity when you're lying. 292 00:18:47,119 --> 00:18:50,080 Speaker 2: So the idea that it's easier to tell the truth 293 00:18:50,520 --> 00:18:54,399 Speaker 2: holds as a rule in life, presumably because it holds 294 00:18:54,400 --> 00:18:58,440 Speaker 2: as a rule in the brain. And if the cooked-up 295 00:18:58,520 --> 00:19:01,960 Speaker 2: story doesn't go over so well, we end up painting 296 00:19:02,000 --> 00:19:05,720 Speaker 2: ourselves into a corner, which brings with it lots more 297 00:19:06,240 --> 00:19:10,320 Speaker 2: brain energy expenditure.
And this observation is one that the 298 00:19:10,440 --> 00:19:15,480 Speaker 2: Scottish poet and novelist Sir Walter Scott immortalized in his poem 299 00:19:15,640 --> 00:19:18,360 Speaker 2: Marmion in eighteen oh eight. This is where he penned 300 00:19:18,400 --> 00:19:22,280 Speaker 2: the immortal line, Oh what a tangled web we weave, 301 00:19:22,760 --> 00:19:26,679 Speaker 2: when first we practice to deceive. He was pointing to 302 00:19:26,880 --> 00:19:30,560 Speaker 2: a common problem that arises when we make up a 303 00:19:30,600 --> 00:19:33,159 Speaker 2: lie to cover a secret, and then that leads to 304 00:19:33,200 --> 00:19:35,879 Speaker 2: another necessary lie, which leads to another, and now you 305 00:19:35,920 --> 00:19:39,080 Speaker 2: have to keep track of this whole web of lies 306 00:19:39,480 --> 00:19:42,440 Speaker 2: to try to keep your story consistent, which can lead, 307 00:19:42,600 --> 00:19:45,639 Speaker 2: as Sir Walter Scott may have intuited, to a high 308 00:19:45,760 --> 00:19:50,439 Speaker 2: cognitive load. And cognitive load matters because it equates to 309 00:19:50,560 --> 00:19:55,080 Speaker 2: emotional stress. We all know from experience that holding on 310 00:19:55,160 --> 00:19:59,199 Speaker 2: to a secret creates anxiety, and often these bad feelings 311 00:19:59,440 --> 00:20:03,560 Speaker 2: increase our urge to confess or share the secret to 312 00:20:03,640 --> 00:20:08,160 Speaker 2: relieve the burden. So secret keeping requires a lot of work. 313 00:20:08,680 --> 00:20:12,240 Speaker 2: But what I want to highlight is the double-edged 314 00:20:12,440 --> 00:20:17,639 Speaker 2: nature of the secret, because, perhaps surprisingly, for better or worse, 315 00:20:18,119 --> 00:20:21,159 Speaker 2: the ability to keep a secret is one of the 316 00:20:21,200 --> 00:20:26,000 Speaker 2: main drivers of the shape of our species and society.
Now, 317 00:20:26,040 --> 00:20:29,359 Speaker 2: we can talk all day about the downsides of keeping 318 00:20:29,400 --> 00:20:32,520 Speaker 2: secrets from one another, but in fact, the ability to 319 00:20:32,600 --> 00:20:36,200 Speaker 2: keep secrets seems to have evolved many millions of years 320 00:20:36,240 --> 00:20:41,080 Speaker 2: ago as a survival mechanism. As it turns out, concealing 321 00:20:41,119 --> 00:20:46,520 Speaker 2: information about food sources or about shelter or defensive strategies 322 00:20:46,680 --> 00:20:50,080 Speaker 2: is often what made the difference between life and death. 323 00:20:50,480 --> 00:20:54,359 Speaker 2: And it's not just humans who practice deception. Other primates 324 00:20:54,440 --> 00:20:57,560 Speaker 2: do this as well. Now I'm not talking about deception 325 00:20:57,720 --> 00:21:01,879 Speaker 2: like using camouflage. Instead I'm talking about active, clever 326 00:21:02,119 --> 00:21:06,480 Speaker 2: deception that requires theory of mind, in other words, the 327 00:21:06,520 --> 00:21:10,440 Speaker 2: ability to step into the perspective of another to understand 328 00:21:10,480 --> 00:21:13,320 Speaker 2: what they know and don't know. It turns out that 329 00:21:13,520 --> 00:21:16,680 Speaker 2: only primates seem to do this, and in fact they're 330 00:21:16,840 --> 00:21:20,400 Speaker 2: constantly doing things to deceive one another. On this note, 331 00:21:20,440 --> 00:21:23,480 Speaker 2: here's a short clip from my interview with Max Bennett 332 00:21:23,480 --> 00:21:25,520 Speaker 2: from last year, who wrote a great book called A 333 00:21:25,560 --> 00:21:29,760 Speaker 2: Brief History of Intelligence. And here's Max describing an example 334 00:21:29,880 --> 00:21:32,040 Speaker 2: of deception in chimpanzees.
335 00:21:32,480 --> 00:21:35,760 Speaker 3: So you can see non human apes do things like 336 00:21:36,119 --> 00:21:38,800 Speaker 3: they will hide transgressions from other people to try and 337 00:21:38,840 --> 00:21:42,080 Speaker 3: prevent themselves from getting in trouble. There's this famous study 338 00:21:42,440 --> 00:21:45,240 Speaker 3: that I love by Emil Menzel. I think it was 339 00:21:45,240 --> 00:21:48,679 Speaker 3: in the seventies where he put two chimpanzees in this 340 00:21:48,720 --> 00:21:52,280 Speaker 3: sort of one acre forest and he showed the location 341 00:21:52,400 --> 00:21:56,520 Speaker 3: of treats to one of the chimpanzees named Belle, and 342 00:21:56,880 --> 00:22:00,119 Speaker 3: she initially would share the treat with another chimpanzee 343 00:22:00,160 --> 00:22:03,280 Speaker 3: named Rock, but then Rock started just stealing the treat 344 00:22:03,320 --> 00:22:06,199 Speaker 3: from her. So what she started doing is when she 345 00:22:06,280 --> 00:22:08,520 Speaker 3: knew the location of the treat, she would wait for 346 00:22:08,560 --> 00:22:10,400 Speaker 3: Rock to look away, and then she would run 347 00:22:10,440 --> 00:22:13,720 Speaker 3: over and grab it. So then Rock, in response to this, 348 00:22:13,920 --> 00:22:17,199 Speaker 3: decided to pretend to look away so that when she 349 00:22:17,280 --> 00:22:19,800 Speaker 3: started running, then he would turn around and run. Then 350 00:22:19,800 --> 00:22:21,800 Speaker 3: in response to this, what she would do is she 351 00:22:21,800 --> 00:22:24,439 Speaker 3: would pretend to run in the wrong direction, lead him 352 00:22:24,480 --> 00:22:26,040 Speaker 3: to the wrong place, and then run back. 353 00:22:26,359 --> 00:22:30,520 Speaker 2: So the tricky deceptive capacity we have to not spill 354 00:22:30,600 --> 00:22:37,360 Speaker 2: the beans enhances the chances of individual survival.
Now, when 355 00:22:37,359 --> 00:22:40,600 Speaker 2: we zoom in on humans and study them very carefully, 356 00:22:40,720 --> 00:22:44,280 Speaker 2: what we find is that keeping secrets and more generally, 357 00:22:44,640 --> 00:22:49,200 Speaker 2: lying is a cognitive skill that develops in children. In other words, 358 00:22:49,240 --> 00:22:51,400 Speaker 2: at first young children 359 00:22:51,080 --> 00:22:52,240 Speaker 1: don't know how to lie. 360 00:22:52,720 --> 00:22:55,440 Speaker 2: And then they hit a milestone one day, let's say 361 00:22:55,440 --> 00:22:57,919 Speaker 2: between two and four years old, and they get a 362 00:22:57,960 --> 00:23:02,280 Speaker 2: great idea, maybe they can assert something that's not true 363 00:23:02,760 --> 00:23:06,800 Speaker 2: and maybe the other person won't know. So they carefully 364 00:23:07,200 --> 00:23:11,800 Speaker 2: float this trial balloon and sometimes it works. Suddenly they 365 00:23:11,840 --> 00:23:16,320 Speaker 2: have eaten the cookie but avoided punishment, so they start 366 00:23:16,359 --> 00:23:19,399 Speaker 2: to lie. As soon as they develop theory of mind, 367 00:23:19,480 --> 00:23:23,240 Speaker 2: they start to understand that their thoughts and minds are 368 00:23:23,320 --> 00:23:26,320 Speaker 2: separate from those of their parents. They also begin to 369 00:23:26,320 --> 00:23:30,720 Speaker 2: realize that people aren't perfect mind readers, so the child 370 00:23:30,760 --> 00:23:34,680 Speaker 2: can say something that isn't true, and miraculously the other 371 00:23:34,800 --> 00:23:39,760 Speaker 2: person doesn't always know. Obviously, sometimes the child doesn't get 372 00:23:39,800 --> 00:23:41,600 Speaker 2: away with it, and so they conclude that they need 373 00:23:41,640 --> 00:23:45,040 Speaker 2: to work even harder to cook up better fabrications that 374 00:23:45,160 --> 00:23:47,920 Speaker 2: might fool the other person.
And for this they need 375 00:23:47,960 --> 00:23:51,320 Speaker 2: to get better and better at stepping into the other 376 00:23:51,400 --> 00:23:55,280 Speaker 2: person's shoes to understand what that person might fall for. 377 00:23:55,840 --> 00:23:59,000 Speaker 2: So children start to lie more frequently as they get 378 00:23:59,000 --> 00:24:01,639 Speaker 2: older and they get better at lying. They learn to 379 00:24:01,760 --> 00:24:05,520 Speaker 2: match their facial expressions and tone of voice to what 380 00:24:05,760 --> 00:24:09,120 Speaker 2: they're saying, and they shape their lies based on their 381 00:24:09,200 --> 00:24:12,679 Speaker 2: assumptions of what the other person may or may not know. 382 00:24:13,000 --> 00:24:15,920 Speaker 2: And lying sucks, especially when it's your own kids doing it. 383 00:24:15,960 --> 00:24:19,520 Speaker 2: But I'll just mention that some researchers view lying as 384 00:24:19,560 --> 00:24:24,720 Speaker 2: an important developmental stage that indicates early intelligence. So while 385 00:24:24,760 --> 00:24:27,240 Speaker 2: none of us appreciate being lied to by our kids 386 00:24:27,359 --> 00:24:30,600 Speaker 2: or having secrets kept from us by our colleagues, it's 387 00:24:30,720 --> 00:24:35,879 Speaker 2: one of the signatures of the extraordinarily intelligent primate brain. 388 00:24:36,359 --> 00:24:39,040 Speaker 2: And in fact, when you look across human history, the 389 00:24:39,119 --> 00:24:43,320 Speaker 2: thing that becomes clear is that secrets quickly expanded beyond 390 00:24:43,359 --> 00:24:47,879 Speaker 2: the individual level to the group level.
So trust within 391 00:24:48,000 --> 00:24:52,359 Speaker 2: small groups usually depended on the ability to keep certain 392 00:24:52,440 --> 00:24:56,400 Speaker 2: knowledge hidden from outsiders, and by the way, keeping internal 393 00:24:56,400 --> 00:25:00,480 Speaker 2: group secrets has the added benefit of fostering a 394 00:25:00,560 --> 00:25:05,199 Speaker 2: sense of belonging and mutual protection. And secrets have always 395 00:25:05,200 --> 00:25:09,399 Speaker 2: been a key part of military strategy. The immortal book 396 00:25:09,480 --> 00:25:12,920 Speaker 2: The Art of War by Sun Tzu paints a picture 397 00:25:13,080 --> 00:25:19,560 Speaker 2: of deception and secrecy as absolutely essential for successful warfare, 398 00:25:20,200 --> 00:25:23,320 Speaker 2: and more generally, secrets have always been a lever for 399 00:25:23,400 --> 00:25:29,399 Speaker 2: power within larger groups. As civilizations grew, secrets became pervasive 400 00:25:29,720 --> 00:25:34,160 Speaker 2: wherever people maintained and exercised power. You see this all 401 00:25:34,200 --> 00:25:38,919 Speaker 2: over ancient empires like Egypt and China, where they leveraged 402 00:25:39,040 --> 00:25:43,600 Speaker 2: secrecy to control knowledge and to reinforce hierarchies. And as 403 00:25:43,640 --> 00:25:46,840 Speaker 2: we all know from our history classes and from Game 404 00:25:46,880 --> 00:25:51,960 Speaker 2: of Thrones, the secrets swirling around among advisors in royal 405 00:25:52,040 --> 00:25:57,360 Speaker 2: courts often dictated the rise and fall of leaders. Now 406 00:25:57,400 --> 00:26:01,960 Speaker 2: people have been grappling with the tension between keeping secrets 407 00:26:01,960 --> 00:26:05,760 Speaker 2: in a society versus telling everything to the public, and 408 00:26:05,800 --> 00:26:08,439 Speaker 2: they've been debating these points for a long time.
So, 409 00:26:08,480 --> 00:26:11,919 Speaker 2: for example, the Enlightenment was characterized by arguments going on 410 00:26:12,000 --> 00:26:16,760 Speaker 2: about the value of secrecy versus transparency. You have some 411 00:26:16,840 --> 00:26:21,680 Speaker 2: philosophers like Jeremy Bentham who hated the idea of secrecy 412 00:26:21,760 --> 00:26:26,320 Speaker 2: in governance, but secrecy has always persisted in areas like 413 00:26:26,680 --> 00:26:31,159 Speaker 2: diplomacy and statecraft, where its defenders deem it essential 414 00:26:31,359 --> 00:26:34,639 Speaker 2: for stability. And you can see the fingerprint of this 415 00:26:34,760 --> 00:26:39,119 Speaker 2: very old debate in modern questions about surveillance. To what 416 00:26:39,280 --> 00:26:42,720 Speaker 2: extent should a government be able to keep track of 417 00:26:42,760 --> 00:26:48,000 Speaker 2: who's doing what with cameras or data tracking? Should anyone 418 00:26:48,080 --> 00:26:51,080 Speaker 2: be able to keep total privacy from the state, such 419 00:26:51,119 --> 00:26:54,439 Speaker 2: that I can work on 3D printing guns and 420 00:26:54,480 --> 00:26:57,520 Speaker 2: making nuclear bombs in the privacy of my home? These 421 00:26:57,560 --> 00:27:00,679 Speaker 2: debates will presumably go on forever because there is 422 00:27:00,720 --> 00:27:05,280 Speaker 2: no single right answer for balancing the privacy of the 423 00:27:05,320 --> 00:27:10,359 Speaker 2: individual against the security of the society. So back to secrets, 424 00:27:10,560 --> 00:27:14,720 Speaker 2: it's not just governments that traffic in these. Religious traditions 425 00:27:14,720 --> 00:27:21,600 Speaker 2: almost always involve sacred secrets accessible only to initiates or clergy.
426 00:27:21,800 --> 00:27:24,800 Speaker 2: In ancient Greece, there were rituals performed every year called 427 00:27:25,160 --> 00:27:29,640 Speaker 2: the Eleusinian Mysteries, and to participate you had to swear 428 00:27:29,840 --> 00:27:33,400 Speaker 2: a vow of secrecy, and in return you were promised 429 00:27:33,640 --> 00:27:38,520 Speaker 2: spiritual enlightenment. And you then learned and safeguarded the secret rites. 430 00:27:38,760 --> 00:27:41,080 Speaker 2: And these were kept so secret that all these centuries 431 00:27:41,119 --> 00:27:44,280 Speaker 2: later we still don't know what happened in those rituals. 432 00:27:44,920 --> 00:27:48,879 Speaker 2: Or similarly, in the medieval era, knowledge of alchemy or 433 00:27:49,000 --> 00:27:52,520 Speaker 2: mystical texts was always hidden so that it was accessible 434 00:27:52,960 --> 00:27:56,639 Speaker 2: only to a select few. More generally, this is the 435 00:27:56,720 --> 00:28:01,040 Speaker 2: kind of glue that keeps people locked in, whether that's 436 00:28:01,240 --> 00:28:06,679 Speaker 2: religions or organizations or fraternities. It's the enticement of letting 437 00:28:06,720 --> 00:28:10,159 Speaker 2: them know that as they level up, they'll get access 438 00:28:10,320 --> 00:28:13,159 Speaker 2: to the next big secrets, so they'll be part of 439 00:28:13,200 --> 00:28:15,960 Speaker 2: the in group, and they'll link arms to keep those 440 00:28:16,000 --> 00:28:18,680 Speaker 2: secrets from everyone else. As I mentioned a moment ago, 441 00:28:18,880 --> 00:28:25,879 Speaker 2: sharing a secret builds strong bonds between individuals. It fosters intimacy, 442 00:28:25,920 --> 00:28:46,360 Speaker 2: and it creates alliances. I'll just mention one more thing 443 00:28:46,400 --> 00:28:51,000 Speaker 2: about secrets on a societal level.
Beyond warfare and politics 444 00:28:51,000 --> 00:28:55,320 Speaker 2: and religions, subgroups in society have to keep secrets to 445 00:28:55,440 --> 00:29:02,480 Speaker 2: maintain competitive advantages in technology. Take something like trade secrets. Historically, 446 00:29:02,520 --> 00:29:06,959 Speaker 2: you've got things like the exact technique for making Chinese silk, 447 00:29:07,280 --> 00:29:11,560 Speaker 2: or currently the exact formula for Coca cola, or the 448 00:29:12,000 --> 00:29:16,480 Speaker 2: software details in cybersecurity firms. In all of these cases, 449 00:29:17,200 --> 00:29:22,600 Speaker 2: knowledge is closely guarded to keep a competitive advantage, and 450 00:29:22,600 --> 00:29:26,200 Speaker 2: people have studied the issue of keeping secrets even inside 451 00:29:26,560 --> 00:29:29,640 Speaker 2: a company, like the upper management knows something but the 452 00:29:29,720 --> 00:29:34,240 Speaker 2: employees don't. That sounds bad, but researchers at Stanford 453 00:29:34,280 --> 00:29:37,800 Speaker 2: just published a paper on the benefits that companies derive 454 00:29:38,200 --> 00:29:42,600 Speaker 2: from what they call the confidentiality effect. The research suggests 455 00:29:42,640 --> 00:29:48,320 Speaker 2: that keeping company secrets has the benefit of boosting feelings 456 00:29:48,360 --> 00:29:51,120 Speaker 2: of privilege and status among those who know the secret 457 00:29:51,440 --> 00:29:55,120 Speaker 2: and protecting those who do not. Now, I want to 458 00:29:55,120 --> 00:29:58,080 Speaker 2: tie this into something I talked about before in the 459 00:29:58,160 --> 00:30:03,080 Speaker 2: episode titled Why Do Our Brains Love Conspiracy Theories? The 460 00:30:03,160 --> 00:30:07,960 Speaker 2: brain is always seeking explanations to reduce cognitive dissonance and 461 00:30:08,080 --> 00:30:12,480 Speaker 2: provide a sense of certainty.
And when we solve puzzles, 462 00:30:12,480 --> 00:30:17,880 Speaker 2: that's deeply rewarding. And that's why uncovering patterns, even ones 463 00:30:17,880 --> 00:30:21,920 Speaker 2: that aren't accurate, feels so rewarding. Now, 464 00:30:22,320 --> 00:30:25,240 Speaker 2: one can never say that all conspiracy theories are false 465 00:30:25,320 --> 00:30:29,200 Speaker 2: because certainly, at moments in history, people can try to 466 00:30:29,200 --> 00:30:32,760 Speaker 2: get away with something, and they are very incentivized to 467 00:30:32,800 --> 00:30:38,880 Speaker 2: cover their tracks. But most conspiracy theories fall apart under scrutiny. 468 00:30:39,200 --> 00:30:43,720 Speaker 2: Why? Well, just think about conspiracy theories through the lens 469 00:30:43,880 --> 00:30:49,360 Speaker 2: of human behavior and probabilities. Imagine a grand conspiracy like 470 00:30:49,440 --> 00:30:53,200 Speaker 2: faking the moon landing or hiding evidence of alien life 471 00:30:53,200 --> 00:30:58,120 Speaker 2: in Roswell. These cover ups would require hundreds, if not thousands, 472 00:30:58,200 --> 00:31:04,640 Speaker 2: of people to remain silent across agencies, across generations, across families. 473 00:31:05,280 --> 00:31:08,320 Speaker 2: But real life doesn't work that way. People get drunk, 474 00:31:08,360 --> 00:31:11,320 Speaker 2: they talk in their sleep, they have moments of guilt, 475 00:31:11,440 --> 00:31:16,040 Speaker 2: they decide to confess on their deathbeds. Secrets are fragile, 476 00:31:16,120 --> 00:31:18,920 Speaker 2: and the longer they need to be kept, the more 477 00:31:19,120 --> 00:31:22,440 Speaker 2: likely they are to leak. Now, you can model the 478 00:31:22,600 --> 00:31:26,440 Speaker 2: odds of someone spilling the secret using a math technique 479 00:31:26,440 --> 00:31:29,800 Speaker 2: called survival analysis, and it goes like this.
Say there's 480 00:31:30,240 --> 00:31:35,720 Speaker 2: a tiny daily chance that someone involved lets something slip. 481 00:31:36,080 --> 00:31:38,600 Speaker 2: Let's say it's a really small chance, like one in one 482 00:31:38,640 --> 00:31:42,040 Speaker 2: thousand on any given day that they'd screw up, but 483 00:31:42,360 --> 00:31:45,440 Speaker 2: multiply that over hundreds of people and years of time 484 00:31:46,000 --> 00:31:50,240 Speaker 2: and the probability of the secret being kept approaches zero. 485 00:31:50,520 --> 00:31:53,520 Speaker 2: You're rolling the dice over and over that somebody is 486 00:31:53,560 --> 00:31:56,720 Speaker 2: going to spill the beans. And note that with most 487 00:31:56,760 --> 00:32:01,560 Speaker 2: conspiracy theories, the incentive to spill the beans is usually compelling. 488 00:32:02,160 --> 00:32:05,800 Speaker 2: In the end, people typically act in their self interest, 489 00:32:06,200 --> 00:32:09,959 Speaker 2: and people who are in on a conspiracy sometimes find 490 00:32:10,160 --> 00:32:14,440 Speaker 2: the rational choice is to defect before someone else does. 491 00:32:14,800 --> 00:32:17,840 Speaker 2: In other words, if you are the one who spills 492 00:32:17,840 --> 00:32:21,000 Speaker 2: the big secret, let's say of a fake moon landing 493 00:32:21,120 --> 00:32:26,000 Speaker 2: or a planned murder of JFK, then suddenly, not only 494 00:32:26,160 --> 00:32:28,800 Speaker 2: have you released the stress of a secret, but more importantly, 495 00:32:28,800 --> 00:32:32,320 Speaker 2: you're now famous and protected by the legal system and 496 00:32:32,440 --> 00:32:36,520 Speaker 2: probably signing book deals with Penguin Random House.
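For readers following along in text, the survival-analysis intuition above can be made concrete in a few lines of Python. This is a minimal sketch, and the specific numbers (a one-in-one-thousand daily slip chance, one hundred insiders) are illustrative assumptions chosen to match the example in the episode, not figures from any study.

```python
# Sketch of the survival-analysis argument: treat every person-day as an
# independent chance for the secret to slip. The secret survives only if
# nobody slips on any day, so multiply the per-day survival probability
# over all people and all days.

def prob_secret_kept(daily_slip_chance: float, people: int, days: int) -> float:
    """Probability that nobody slips over the whole period,
    assuming independent person-days."""
    return (1.0 - daily_slip_chance) ** (people * days)

p = 1 / 1000        # assumed daily slip chance per person
insiders = 100      # assumed number of people in on the secret

for years in (1, 5, 10):
    kept = prob_secret_kept(p, insiders, 365 * years)
    print(f"{years:>2} year(s): P(secret still kept) = {kept:.3e}")
```

Even with a slip chance this tiny, the probability that the secret holds for a single year is already astronomically small; that's the "rolling the dice over and over" point made concrete.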
So game 497 00:32:36,680 --> 00:32:41,360 Speaker 2: theory asks why risk being the one left behind when 498 00:32:41,360 --> 00:32:45,600 Speaker 2: you have so much incentive structure encouraging you to blurt 499 00:32:45,640 --> 00:32:49,600 Speaker 2: out the truth before someone else does. So, when you're 500 00:32:49,640 --> 00:32:54,120 Speaker 2: confronted with a conspiracy theory, the question isn't just could 501 00:32:54,160 --> 00:32:58,760 Speaker 2: that happen, but more importantly, how many people would need 502 00:32:58,520 --> 00:32:59,560 Speaker 1: to keep the secret? 503 00:33:00,080 --> 00:33:05,160 Speaker 2: And how much time has passed without anyone defecting? And 504 00:33:05,200 --> 00:33:07,840 Speaker 2: this is why when you do the math and factor 505 00:33:07,920 --> 00:33:12,400 Speaker 2: in human behavior, the vast majority of conspiracies become not 506 00:33:12,600 --> 00:33:17,040 Speaker 2: just improbable, but implausible. So now I want to return 507 00:33:17,040 --> 00:33:19,360 Speaker 2: to a question that I posed at the very beginning. 508 00:33:19,840 --> 00:33:24,959 Speaker 2: Can a computer keep a secret? Well, a toaster can't 509 00:33:24,960 --> 00:33:28,000 Speaker 2: keep a secret from you, presumably 510 00:33:28,080 --> 00:33:31,680 Speaker 2: because it doesn't have complex enough circuitry, it doesn't have 511 00:33:31,680 --> 00:33:34,720 Speaker 2: a team of rivals fighting it out under the hood. 512 00:33:35,520 --> 00:33:38,760 Speaker 1: But could AI keep a secret from you? 513 00:33:39,440 --> 00:33:42,600 Speaker 2: Well, people started asking this question some years ago using 514 00:33:42,640 --> 00:33:46,200 Speaker 2: what are called adversarial networks. Just think of this like 515 00:33:46,520 --> 00:33:49,800 Speaker 2: networks that are pitted against one another, and they each 516 00:33:49,920 --> 00:33:54,680 Speaker 2: might have a different incentive structure.
So back in twenty sixteen, 517 00:33:55,000 --> 00:33:58,520 Speaker 2: a group of researchers at Google Brain wanted to know 518 00:33:59,160 --> 00:34:03,240 Speaker 2: whether neural networks could develop their own encryption methods 519 00:34:03,360 --> 00:34:08,960 Speaker 2: independently without human guidance. So they set up three independent networks. 520 00:34:09,520 --> 00:34:13,120 Speaker 2: The first was named Alice, and Alice's task was to 521 00:34:13,200 --> 00:34:17,200 Speaker 2: send a secret message to Bob. The second network was Bob; 522 00:34:17,680 --> 00:34:21,480 Speaker 2: his job was to decode the message sent by Alice. 523 00:34:21,520 --> 00:34:24,960 Speaker 2: And then there was a third network named Eve. Now 524 00:34:25,040 --> 00:34:29,160 Speaker 2: Eve represented an eavesdropper. Her role was to try to 525 00:34:29,239 --> 00:34:33,719 Speaker 2: intercept and decrypt Alice's message, so they set these three 526 00:34:33,800 --> 00:34:38,520 Speaker 2: networks running. Alice and Bob were trained collaboratively to ensure 527 00:34:38,960 --> 00:34:44,440 Speaker 2: Bob could correctly decode the messages, and Eve was trained adversarially. 528 00:34:44,680 --> 00:34:49,160 Speaker 2: Her job was to improve her decryption attempts, and the 529 00:34:49,200 --> 00:34:53,839 Speaker 2: system evolved dynamically. As Alice and Bob became better at 530 00:34:53,880 --> 00:34:58,800 Speaker 2: secure communication, Eve became more sophisticated in breaking their encryption, 531 00:34:59,400 --> 00:35:03,240 Speaker 2: forcing Alice and Bob to improve further. Now, the outcome 532 00:35:03,719 --> 00:35:08,080 Speaker 2: was kind of incredible because Alice and Bob's neural networks 533 00:35:08,160 --> 00:35:13,279 Speaker 2: learned to create their own encryption strategies, and this was 534 00:35:13,360 --> 00:35:17,840 Speaker 2: without explicit programming.
In other words, the encryption strategy was 535 00:35:17,880 --> 00:35:20,840 Speaker 2: never told to them by a human. So over time 536 00:35:20,920 --> 00:35:26,960 Speaker 2: those two succeeded in creating encrypted communications that Eve couldn't decrypt. 537 00:35:27,200 --> 00:35:31,839 Speaker 2: They outpaced Eve's efforts. The key here is that the 538 00:35:31,960 --> 00:35:37,840 Speaker 2: encryption methods developed by Alice were not human readable or interpretable. 539 00:35:38,239 --> 00:35:42,759 Speaker 2: The system invented its own cryptographic methods based on its 540 00:35:42,800 --> 00:35:47,239 Speaker 2: training objectives. Very rapidly, the two neural networks figured out 541 00:35:47,560 --> 00:35:50,640 Speaker 2: how to keep a secret not only from Eve, but 542 00:35:50,719 --> 00:35:57,200 Speaker 2: from everybody. They developed a totally novel and dynamic cryptographic system. 543 00:35:57,480 --> 00:36:00,560 Speaker 2: So that was one study. Then in twenty seventeen, the 544 00:36:00,800 --> 00:36:05,640 Speaker 2: Facebook AI research group was working on developing AI agents 545 00:36:06,040 --> 00:36:08,719 Speaker 2: that could negotiate. But when they left these agents to 546 00:36:08,719 --> 00:36:11,920 Speaker 2: their own devices, they saw unexpected behaviors. 547 00:36:12,200 --> 00:36:13,280 Speaker 1: So here was the setup. 548 00:36:13,640 --> 00:36:17,560 Speaker 2: The goal of the experiment was to train these two bots, 549 00:36:17,640 --> 00:36:21,560 Speaker 2: these two AI agents, to negotiate with each other to 550 00:36:21,719 --> 00:36:26,880 Speaker 2: achieve their objectives. So the two AI agents are tasked 551 00:36:26,920 --> 00:36:31,320 Speaker 2: with dividing items between them, like books, or balls or hats.
552 00:36:31,719 --> 00:36:35,240 Speaker 2: Each item has a different value to each of the bots, 553 00:36:35,480 --> 00:36:39,520 Speaker 2: and each bot has the objective to maximize their score 554 00:36:39,640 --> 00:36:42,640 Speaker 2: by getting the items that they value the most, and 555 00:36:42,680 --> 00:36:45,000 Speaker 2: they're rewarded based on how well they do. Okay, so 556 00:36:45,040 --> 00:36:49,920 Speaker 2: they set these bots running, but pretty soon two surprises emerged. 557 00:36:50,400 --> 00:36:54,799 Speaker 2: The first was that the bots started using tricky tactics. 558 00:36:55,200 --> 00:36:59,200 Speaker 2: They would sometimes pretend to be uninterested in a particular 559 00:36:59,239 --> 00:37:03,480 Speaker 2: item to convince the other bot to concede it. This 560 00:37:03,560 --> 00:37:07,400 Speaker 2: is called feigned disinterest, and of course humans do this 561 00:37:07,520 --> 00:37:11,920 Speaker 2: in negotiations to manipulate the perceived value of something that 562 00:37:11,960 --> 00:37:15,040 Speaker 2: they are interested in, but it certainly wasn't expected that 563 00:37:15,080 --> 00:37:19,080 Speaker 2: this would emerge on its own in an artificial neural network. 564 00:37:19,680 --> 00:37:23,600 Speaker 2: The second thing was that the bots started to communicate strangely. 565 00:37:23,680 --> 00:37:29,000 Speaker 2: They developed their own shorthand language to negotiate. So Bot 566 00:37:29,040 --> 00:37:32,680 Speaker 2: one would say I want want, want want one book 567 00:37:32,760 --> 00:37:36,080 Speaker 2: and Bot two would respond okay, you you you you 568 00:37:36,360 --> 00:37:39,719 Speaker 2: two balls. Now why did their language start to turn 569 00:37:39,840 --> 00:37:44,760 Speaker 2: funny like this? It's because the AI wasn't explicitly incentivized 570 00:37:45,040 --> 00:37:49,680 Speaker 2: to stick with human readable language.
Instead, the reward system 571 00:37:49,760 --> 00:37:56,080 Speaker 2: only focused on achieving successful negotiations, so the bots optimized 572 00:37:56,160 --> 00:38:01,520 Speaker 2: their communication for efficiency rather than for clarity to humans. 573 00:38:01,880 --> 00:38:04,440 Speaker 2: So it made sense to the bots, but eventually became 574 00:38:04,960 --> 00:38:08,880 Speaker 2: unintelligible to the human reader. Now, this experiment gives us 575 00:38:08,920 --> 00:38:10,560 Speaker 2: a few deep insights. 576 00:38:10,600 --> 00:38:11,279 Speaker 1: The first thing it 577 00:38:11,280 --> 00:38:17,680 Speaker 2: shows us is emergent behaviors. AI agents can develop creative 578 00:38:17,760 --> 00:38:21,960 Speaker 2: strategies to achieve their goals, and this is typically not 579 00:38:22,200 --> 00:38:25,960 Speaker 2: anticipated by the researchers. In this case, what developed was 580 00:38:26,320 --> 00:38:29,040 Speaker 2: something like a secret, or at least we would say, 581 00:38:29,080 --> 00:38:33,160 Speaker 2: a non truth, because the AI would pretend one thing 582 00:38:33,560 --> 00:38:36,920 Speaker 2: when it wanted something else. The experiment also demonstrated that 583 00:38:37,200 --> 00:38:41,239 Speaker 2: if it's not explicitly constrained, AI will optimize for its 584 00:38:41,280 --> 00:38:46,040 Speaker 2: objectives in ways that maybe don't align with human readability, 585 00:38:46,440 --> 00:38:49,440 Speaker 2: and if you extrapolate this, you can see this is 586 00:38:49,480 --> 00:38:53,120 Speaker 2: another way that AI might keep a secret from us, 587 00:38:53,160 --> 00:38:56,600 Speaker 2: even if it's not intending to. Just like the networks 588 00:38:56,640 --> 00:38:59,800 Speaker 2: Alice and Bob, it might just learn how to speak 589 00:38:59,840 --> 00:39:04,799 Speaker 2: a language that we simply don't understand.
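For readers following along in text, here is a structural sketch of the Alice, Bob, and Eve setup described a moment ago. This is not the original Google Brain code, and nothing is actually trained here: the "networks" are stand-in random linear maps, and the point is only the shape of the three objectives. Bob tries to reconstruct the plaintext using the shared key, Eve tries to do the same from the ciphertext alone, and Alice and Bob's joint loss rewards Bob's success while pushing Eve toward chance.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16  # bits per message, encoded as +1 / -1

def reconstruction_error(guess, plaintext):
    """Mean absolute error between a network's guess and the true bits."""
    return float(np.mean(np.abs(guess - plaintext)))

# Stand-in "networks": fixed random linear maps squashed by tanh.
# In the real experiment these were trainable neural networks.
W_alice = rng.normal(size=(2 * N, N))   # Alice sees plaintext + key
W_bob   = rng.normal(size=(2 * N, N))   # Bob sees ciphertext + key
W_eve   = rng.normal(size=(N, N))       # Eve sees ciphertext only

plaintext = rng.choice([-1.0, 1.0], size=N)
key       = rng.choice([-1.0, 1.0], size=N)

ciphertext = np.tanh(np.concatenate([plaintext, key]) @ W_alice)
bob_guess  = np.tanh(np.concatenate([ciphertext, key]) @ W_bob)
eve_guess  = np.tanh(ciphertext @ W_eve)

loss_bob = reconstruction_error(bob_guess, plaintext)
loss_eve = reconstruction_error(eve_guess, plaintext)
# Alice and Bob jointly minimize Bob's error while pushing Eve's error
# toward chance (an error of about 1.0 on +/-1 bits); Eve independently
# minimizes her own error. Gradient updates would alternate between the two.
loss_alice_bob = loss_bob + (1.0 - loss_eve) ** 2

print(f"Bob's error: {loss_bob:.3f}, Eve's error: {loss_eve:.3f}")
```

The adversarial dynamic in the study comes from alternating updates against these two opposed losses, which is why Alice and Bob's encryption kept improving as Eve got better at breaking it.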
So let's wrap 590 00:39:04,880 --> 00:39:09,160 Speaker 2: up today's exploration of secrets and the brain. As we saw, 591 00:39:09,320 --> 00:39:13,319 Speaker 2: secrets are something we're often embarrassed to address, but they're 592 00:39:13,440 --> 00:39:17,760 Speaker 2: a massive feature of primate lives, shaping everything from personal 593 00:39:17,800 --> 00:39:22,759 Speaker 2: relationships to large scale politics. The act of keeping a 594 00:39:22,840 --> 00:39:27,320 Speaker 2: secret emerges from the rivalry within our brains, a team 595 00:39:27,480 --> 00:39:31,160 Speaker 2: of neural networks vying for control over what's revealed and 596 00:39:31,200 --> 00:39:36,879 Speaker 2: what's hidden. This internal competition creates the cognitive and emotional 597 00:39:36,960 --> 00:39:40,920 Speaker 2: weight of secret keeping, which can sometimes push us to 598 00:39:41,000 --> 00:39:46,319 Speaker 2: seek relief by confessing to strangers, or by writing anonymous 599 00:39:46,320 --> 00:39:50,680 Speaker 2: posts on the web, or even whispering to an AI therapist. 600 00:39:51,280 --> 00:39:54,920 Speaker 2: The relief that we feel from telling a secret comes 601 00:39:55,280 --> 00:40:01,200 Speaker 2: not from solving the external problem, but from reducing the tension. 602 00:40:02,040 --> 00:40:04,960 Speaker 2: And as we saw, secrets don't stop at the individual level. 603 00:40:05,160 --> 00:40:09,400 Speaker 2: They've been a key driver of human society, from military 604 00:40:09,440 --> 00:40:13,719 Speaker 2: strategy and political hierarchies to the rise of religions and 605 00:40:13,800 --> 00:40:21,759 Speaker 2: social institutions. Secrets bind groups together. They foster trust among allies. 606 00:40:22,200 --> 00:40:26,200 Speaker 2: They create competitive advantages, whether that's the formula for Coca 607 00:40:26,200 --> 00:40:30,799 Speaker 2: cola or the sacred rites of ancient mystery cults.
But 608 00:40:30,880 --> 00:40:35,680 Speaker 2: they also lead to struggles over societal transparency, and there's 609 00:40:35,680 --> 00:40:39,480 Speaker 2: always a healthy debate about how much governments should be 610 00:40:39,520 --> 00:40:41,279 Speaker 2: allowed to conceal from us. 611 00:40:42,120 --> 00:40:43,520 Speaker 1: And what about AI? 612 00:40:44,160 --> 00:40:46,960 Speaker 2: I told you about two experiments where neural networks, left 613 00:40:47,000 --> 00:40:52,040 Speaker 2: to their own devices, developed encryption methods and private languages 614 00:40:52,400 --> 00:40:57,440 Speaker 2: that humans couldn't decode. These aren't conscious secrets in the 615 00:40:57,480 --> 00:41:01,760 Speaker 2: way we experience them, but they point to a future where 616 00:41:01,800 --> 00:41:06,800 Speaker 2: machines might possess knowledge inaccessible to us, not necessarily because 617 00:41:06,800 --> 00:41:11,880 Speaker 2: they're deliberately hiding it, but because their optimization processes have 618 00:41:12,080 --> 00:41:18,160 Speaker 2: outpaced our ability to understand. As AI systems grow more sophisticated, 619 00:41:18,640 --> 00:41:21,680 Speaker 2: the question of what they know and what they're going 620 00:41:21,719 --> 00:41:25,799 Speaker 2: to keep from us is only going to evolve. So 621 00:41:26,400 --> 00:41:30,520 Speaker 2: secrets are woven into the fabric of our minds and societies. 622 00:41:30,560 --> 00:41:34,360 Speaker 2: They've been with us since before the sunrise of human cognition, 623 00:41:34,760 --> 00:41:38,400 Speaker 2: as we see from our primate cousins, and secrets and 624 00:41:38,480 --> 00:41:41,799 Speaker 2: lies may soon take on new dimensions as we come 625 00:41:41,840 --> 00:41:46,920 Speaker 2: to share our world with increasingly intelligent machines.
Whether secrets 626 00:41:47,000 --> 00:41:50,400 Speaker 2: are helping us to bond or they're dividing us, they 627 00:41:50,440 --> 00:41:54,640 Speaker 2: are a reminder of the complexity of our inner cosmos 628 00:41:55,200 --> 00:41:59,319 Speaker 2: and the tangled web of our relationships with each other 629 00:42:00,080 --> 00:42:08,879 Speaker 2: and soon with our technology. Go to eagleman dot com 630 00:42:08,880 --> 00:42:12,280 Speaker 2: slash podcast for more information and to find further reading. 631 00:42:12,760 --> 00:42:15,840 Speaker 2: Send me an email at podcast at eagleman dot com 632 00:42:15,880 --> 00:42:19,000 Speaker 2: with questions or discussion, and check out and subscribe to 633 00:42:19,160 --> 00:42:22,840 Speaker 2: Inner Cosmos on YouTube for videos of each episode and 634 00:42:22,920 --> 00:42:23,880 Speaker 2: to leave comments. 635 00:42:25,120 --> 00:42:25,879 Speaker 1: Until next time. 636 00:42:26,000 --> 00:42:38,080 Speaker 2: I'm David Eagleman, and this is Inner Cosmos.