Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Speaker 2: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

Speaker 3: And I'm Joe McCormick. And in today's episode, we're going to begin a series in which we look at a phenomenon studied in psychology, economics, and decision theory which is called ambiguity aversion. Ambiguity aversion is when we prefer risks with known probabilities over risks with unknown probabilities, even if you have no reason for thinking that the unknown risks will be worse than the known risks. And we'll get to some concrete examples and foundational research on this in a bit, but to an approximation, a good way to think about this is that ambiguity aversion is the sentiment expressed in the saying, better the devil you know than the devil you don't. This is an aphorism I feel like people often quote when they're about to justify making a really bad or indefensible decision. And the whole point of this saying is that the devil you know is already bad. That's usually implied, and there's no indication that the devil you don't know will be worse. It could well be a kinder, gentler devil with a less pointy pitchfork. But still we often find ourselves thinking, I'd just rather stick with the one where I know how pointy it is up front.

Speaker 2: Yeah, it's often a form of indecision, like refusing to do anything about the current devil situation, but, you know, at least I know this devil. The next devil could be worse. I don't want a worse devil, that's right.

Speaker 3: This is another one of those aphorisms. We were talking about this just recently in the Saint Swithin episode. It's one of these aphorisms that I think is only wisdom given certain assumptions or conditions that are not really part of the saying. Like, as stated, it could be purely terrible advice. Like, what if the devil you know is really bad?
And what if going with the devil you don't know gives you a really good shot at improving your stay in hell? But also, I think it does have a kind of wisdom if you don't think of it as advice, but rather as an ironic observation about human nature, in which case I think it's sort of dead-on. This is often how we think, and that is the core observation also of the ambiguity aversion literature.

Speaker 2: Yeah, yeah. I mean, because you can also ask, well, what if we just didn't have a devil? Is that on the table? Can we just not have a devil? Why do we have to worry about the devil we have versus the devil we don't have yet? It's, you know, it can be, and I guess it's rather condemning of human nature in that aspect as well, where the answer is, well, you've got to have a devil. We're humans, we make our own, and you've got to have one in play.

Speaker 3: You should have thought of that before you double-parked, Rob.

Speaker 2: Yeah, now you're here in hell with...

Speaker 3: ...the rest of us.

Speaker 2: There's a rather keen Chinese saying that goes along these lines as well. It's a well-known idiom that states that it's, quote, easy to dodge the spear in the open, hard to avoid a stab in the dark.

Speaker 3: Oh, that's interesting, because, yeah, that's the same idea. It isn't necessarily true in all circumstances. Like, in the dark, the person stabbing at you also can't see you very well, so it might be hard to dodge, but it's also hard for them to hit you. But in the dark, you just don't really know, you don't really know what's going on, so it just seems scarier.

Speaker 2: Yeah, absolutely. And it reminds me a bit of another Chinese saying, better to be a dog in times of tranquility than a human in times of chaos. And this has some of the same sentiments as a fourteenth-century Islamic saying that is sometimes misattributed as being of Chinese origin itself, and that is, better a century of tyranny than one day of chaos.
Speaker 3: Also, I don't know if I actually agree with that.

Speaker 2: Well, I don't know. I think about it a lot for various reasons, and I do, I do agree with the basic sentiment of it. When you find yourself dipping into chaos, and I'm not talking about, like, capital C chaos, like absolute chaos, but these little moments of chaos and disorder, you can easily line yourself up with the idea of, like, well, yeah, this is why people, you know, fall for tyranny. It's because they have these moments of chaos and uncertainty and they're like, well, I would just rather have this sort of known top-down oppression in place rather than some sort of ambiguous chaos bubbling up around me.

Speaker 3: Better something that's definitely bad than something I don't understand and can't predict.

Speaker 2: Yeah. Now, ambiguity and chaos, we should note here, are not the same thing. Ambiguity is a lack of clarity, while chaos is a state of disorder, and the former may well lead into the other. And chaos may be seen as a state of absolute ambiguity and uncertainty as well. So in that respect, it becomes even more difficult when we try and weigh chaos and ambiguity in our concerns about the future.

Speaker 3: I'm just trying to think of familiar, everyday examples of ambiguity aversion, and I think I came across one internally in the now commonly acknowledged sentiment of not wanting to go out and try anything, you know. So, like, do you ever find yourself trying to find a way to avoid going through an experience that is new and unfamiliar and thus unpredictable to you? It seems, I don't know, it just seems kind of fraught, like a hassle. And instead of doing that, choosing to have an experience that you pretty much know that you will probably not enjoy.

Speaker 2: Yeah, I think I know what you're saying here.
Yeah, I mean, I mean, life is perpetually a situation where it's like, am I going to stick with what's comfortable, or am I going to say yes to adventure? Am I going to try something new? And I often find myself, you know, at the crossroads of that moment, because there's a huge part of me that doesn't want to try new things, and I get anxious about the various experiences aligned with those kinds of quests. But on the other hand, I have to recognize that, like, most of the really fulfilling things in my life have been because I said yes to adventure and because I took a chance on something and I went out of my comfort zone. But then again, we call it our comfort zone because it is comfortable. It is a nice place, it is a comforting place. So, you know, it's give and take.

Speaker 3: Yes, I think that's absolutely correct. But I think the really weird thing is that sometimes your quote comfort zone isn't comfortable. I mean, sometimes we prefer an experience that's not even very nice. It's still, like, something that we know we won't like, but it's just, it's, you know, it's familiar. We at least know what we're getting with it, over something that is just question marks.

Speaker 2: Right, right. We'll choose a known okayness over a potentially amazing experience.

Speaker 3: So one of the most important early writings on ambiguity aversion is a classic paper in the Quarterly Journal of Economics from nineteen sixty one by the American economist, whistleblower, and political activist Daniel Ellsberg. The paper was called Risk, Ambiguity, and the Savage Axioms. That's Savage with a capital S. That's a person's name. It's not describing the axioms as savage. And just to go ahead and mention it now, because I also used this as a source in writing about Ellsberg and this paper's impact: I was also reading a paper called Ambiguity and Ambiguity Aversion in the Handbook of the Economics of Risk and Uncertainty. This is by Mark J.
Machina and Marciano Siniscalchi, published by North-Holland in twenty fourteen. This is a sort of general overview of the concept of ambiguity aversion and a survey of a bunch of different models of it. Now, before we get into the idea that Ellsberg explored in this very important paper, Rob, you had a bit of bio on Ellsberg, didn't you?

Speaker 2: Yeah, yeah. Daniel Ellsberg, quite an interesting fellow, nineteen thirty one through twenty twenty three. In addition to Ellsberg's paradox, which we're getting into here, he's also famous, perhaps more so in a general sense, as the RAND Corporation employee who photocopied and released what would come to be known as the Pentagon Papers. This is a United States Department of Defense history of the US's political and military involvement in Vietnam from nineteen forty five through nineteen sixty eight.

Speaker 3: The key point being that Ellsberg believed that getting these papers out there was important in informing the public about things that they didn't know about how the Vietnam War was being prosecuted and represented.

Speaker 2: Right, right. It was very much a whistleblowing act on Ellsberg's part, and these papers were published by the New York Times in nineteen seventy one. This would be a decade after the publication regarding the Ellsberg paradox. And yeah, they detailed secret developments about the scope of the US military efforts in Vietnam, the real sticking points being the deception involving the Gulf of Tonkin incident, unreported expansion of the war, the true objective of the war post nineteen sixty five, which basically was more about containing China and preventing a humiliating US defeat rather than any specific objectives to help the people of Vietnam, and the lack of a clear plan for victory as well.
So as I mentioned, a lot of you have at least heard of the Pentagon Papers, even if you're not familiar with the full story, because this incident led to conspiracy and espionage charges against Ellsberg that were then later dismissed, greatly fueled the anti-war movement as well as distrust in the government at the time, and led to the Nixon White House's use of the infamous White House Plumbers to discredit Ellsberg. And of course, all of this ends up feeding into the Watergate scandal, which is another one of those major historical touchstones from the time period that, again, if you're not super familiar with Watergate, you've at least heard of Watergate, and you've heard about the other different gates out there that allude back to what a scandal it was.

Speaker 3: Now everything gets a gate suffix.

Speaker 2: Right, no matter how deserved or undeserved. Now, Ellsberg remained an outspoken activist and whistleblower throughout his life, particularly regarding the US's military actions under multiple presidents, including the current one, the Russian invasion of Ukraine, and also in support of various other whistleblowers. He was a vocal opponent of the use and the threat of nuclear weapons and the rhetoric around their use, and he is sometimes described as a nuclear war planner turned anti-nuclear-war activist, and his work at the RAND Corporation did involve such contemplations.

Speaker 3: Yeah. I think a big part of his pitch in his anti-war and anti-nuclear-weapons activism was, like, look, I've been there where the power players are executing, you know, global military and nuclear strategy. You should be worried. Things are not okay.

Speaker 2: Yeah. One of his major overarching criticisms of nuclear weapons, and I feel like we're probably preaching to the choir on this one, I can't imagine we have any big nuclear war fans out there, was that their use as a deterrent, one of the major positive spins that are often given about having a vast nuclear arsenal,
this idea of the nuclear deterrent, was itself highly unreliable and unstable. There were simply too many variables, human, technological, situational, that could lead to not only catastrophic but existential unintended consequences.

Speaker 3: Yeah, too many ways for it to go wrong, and if it does go wrong, the consequences are too great.

Speaker 2: By the way, Ellsberg has been portrayed, I believe, twice in major motion pictures, once by James Spader and once by Matthew Rhys.

Speaker 3: So coming back to the Ellsberg paper from nineteen sixty one, again, that's Risk, Ambiguity, and the Savage Axioms. This is the really important one. I looked it up. I think it had, like, I don't know, like, tens of thousands of citations. It's a very, very famous and influential economics paper. In this paper he illustrates the idea of ambiguity aversion with a number of thought experiments, and this is one of them. So this is the three-color urn. Imagine that there's an urn, like a jar, that is filled with ninety little ping-pong balls that are a mix of different colors. You cannot see inside the urn; it's opaque. But you're going to play a game where you reach into the urn and you pull out one ball at a time, and you can win prizes based on the color of the ball that you pull out. You know that thirty of the balls inside the urn are red. So since thirty out of ninety are red, there's a one in three chance that a random ball is going to be red. The odds there are pretty clear. The other sixty balls, however, are a mix of yellow and black. And here's the really important part: you don't know the composition of that group. So those sixty balls could be thirty yellow and thirty black. It could be sixty yellow and zero black. It could be sixty black and zero yellow, or any mix in between. Since you don't know the mix, you don't know the probability of drawing a yellow ball or a black ball ahead of time. For red, it's one in three.
For either yellow or black, it's anywhere between zero and two out of three. Now, imagine it is your turn to draw a ball out of the urn, and you can take one of two bets. Bet A: you get one hundred dollars if your ball is red. So if you pick this option, your chance of winning one hundred dollars is one in three. In bet B, you get one hundred dollars if your ball is black. Since you don't know the mix of yellow and black, you don't know your odds of winning, but they are somewhere between zero and two out of three, with the midpoint being one out of three, which is the same as the odds on red. Which one do you pick?

Speaker 2: Well, it sounds like bet A is the surer thing, right?

Speaker 3: If somebody offered me this game, I would probably pick bet A. Yeah, it just seems safer. It's like, okay, I know what I'm dealing with there: one in three. The other option, your odds could be better than one in three, but you just don't know.

Speaker 2: They could be worse.

Speaker 3: It could be zero. They could be worse, could be better. You have no way of knowing either way. Ellsberg suggested, and later experiments would empirically demonstrate, that most people would bet on red instead of black, even though there is no objective reason to think that red is a better bet. Following the pick-red strategy will not necessarily earn you more money in this game. But people really do, in fact, just like it better to pick red. It feels right; that's just what we think we should do. That in itself is kind of weird and interesting, like, why do we prefer to avoid ambiguity even when there is no clear advantage in doing so?
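To make that arithmetic concrete, here is a minimal Python sketch of this first round. The ball counts follow the setup described above; the simulation itself and its function names are illustrative, not something from Ellsberg's paper.

```python
import random

# Ellsberg's three-color urn, round one: 90 balls, 30 known to be red,
# and an unknown split of the remaining 60 between yellow and black.
def make_urn(n_yellow):
    """Build one possible urn: 30 red, n_yellow yellow, the rest black."""
    assert 0 <= n_yellow <= 60
    return ["red"] * 30 + ["yellow"] * n_yellow + ["black"] * (60 - n_yellow)

def win_rate(urn, winning_colors, trials=100_000):
    """Estimate the chance that a single random draw pays off."""
    wins = sum(random.choice(urn) in winning_colors for _ in range(trials))
    return wins / trials

# Bet A (red) pays about 1/3 of the time no matter what the hidden mix is;
# bet B (black) pays anywhere from 0 to 2/3 depending on that mix.
for n_yellow in (0, 30, 60):
    urn = make_urn(n_yellow)
    print(f"yellow={n_yellow:2d}  bet A ~ {win_rate(urn, {'red'}):.3f}  "
          f"bet B ~ {win_rate(urn, {'black'}):.3f}")
```

Whatever split you assume, bet A stays pinned at one in three, while bet B slides anywhere from zero to two thirds with that same one-in-three midpoint, which is exactly the ambiguity people shy away from.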
Then there's a really interesting wrinkle. Imagine you're playing the same game. You are drawing a ball out of an urn, and this time you're given assurance that the mix of the balls is the same as it was the first time. So whatever was the case with the mix of yellow and black in there, it's going to be the same as it was with your last draw. And you get a chance to make a second bet. You can take either bet C, where you get one hundred dollars if the ball drawn out is either yellow or red, and this one has an unknown chance of winning, because of course you don't know how many yellow or black balls there are. You know that your chance of winning is somewhere between one in three, if there are, you know, the thirty red balls and zero yellow balls, and one hundred percent, if there are sixty yellow balls. Or you can take bet D, where you get one hundred dollars if the ball is either yellow or black. This one has a guaranteed two-thirds chance to win, because we know that together yellow and black make up sixty of the ninety balls, even though we don't know what the mix is. In this second trial, people will tend to pick bet D, once again because the probabilities are known. With D, it's a guaranteed two-thirds chance to win. With bet C, it's somewhere between one in three and one hundred percent. Picking bet D allows them to avoid placing bet C, which, again, has ambiguous chances. Here's the really interesting thing that Ellsberg pointed out: there's actually a contradiction in people's behavior between the first and second round of this game, assuming you are actually doing your best to win the money. In round one, if you bet on red instead of black, that implies you believe that fewer than thirty of the unknown sixty balls are black, right? It implies that red has a better chance of winning, so less than half of them have to be black. However, in the second round, if you take the second option, picking yellow and black, it implies that you think that more than thirty of the unknown balls are black. Otherwise the red and yellow combination option would have the better chance of winning.
Right. These two bets in combination make no sense, because they imply self-contradictory beliefs. It says that I believe that more balls are black and fewer balls are black at the same time. Obviously that can't really be the case. But there could be another issue driving this behavior, which is simply that people don't like ambiguity, and they will pay up for known odds, even if doing so implies mutually exclusive assumptions. I was trying to think, you know, as I said a minute ago, I think I would probably pick red if given the option to play this game. Obviously, I sort of already knew the deal here, so I can't know what I would do going into it blind like most people, but I think I would, for no rational reason, prefer the uncertainty-reducing options. I think the most likely reason for that is, even if I were given assurances that this game was on the up and up, I would suspect some kind of trick. Like, if I'm being asked to bet on uncertainty, there's some part of me that just kind of rises up and says, uh, there's a scam here that you are not seeing.

Speaker 2: Yeah, yeah. I mean, you kind of expect that within the constraints of a testing environment, and you definitely expect that out in the world. You know, whatever form this was taking, you'd assume that somebody had a vested interest in manipulating how you were going to behave, how you were going to respond to the two possibilities.

Speaker 3: Yeah. Ellsberg also, in this paper, by the way, came up with a second illustration of the same principle. So if you go reading about this, you'll read about the three-color urn and then the two-urn experiment. The second illustration used two urns instead of one, because it made the contradiction between the different bets even clearer. And this self-contradictory behavior is now called the Ellsberg paradox. Ellsberg himself didn't call it that; that's what other people called it. They put his name on it.
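That contradiction can be checked mechanically. Here is a small sketch, assuming a bettor who holds a single fixed belief b about how many of the sixty unknown balls are black and who simply bets by expected value (the framing as code is an illustration, not Ellsberg's own presentation):

```python
# A bettor with one fixed belief b = number of black balls among the
# 60 unknowns (0 <= b <= 60), choosing bets by expected value.
def prefers_a_over_b(b):
    # Bet A (red, 30/90) beats bet B (black, b/90) only if b < 30.
    return 30 / 90 > b / 90

def prefers_d_over_c(b):
    # Bet D (yellow+black, 60/90) beats bet C (yellow+red, (60 - b + 30)/90)
    # only if b > 30.
    return 60 / 90 > (60 - b + 30) / 90

consistent = [b for b in range(61) if prefers_a_over_b(b) and prefers_d_over_c(b)]
print(consistent)  # [] -- no single belief rationalizes both popular picks
```

The list comes back empty: no value of b makes both of the popular choices the better bet, which is the paradox in a nutshell.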
Speaker 3: Anyway, Ellsberg's paper about ambiguity aversion was not just about observing a weird little quirk of human behavior. It was especially of interest to the fields of economics and decision theory, the modeling of how people make decisions, because it sought to undermine some of the major underpinnings of those fields, specifically providing evidence against some of the tenets of a framework called subjective expected utility theory, or Savage's axioms, named after the American mathematician Leonard Savage, who formulated them. Subjective expected utility theory, or SEU theory, is a way of modeling how people make decisions when we don't have perfect or complete information, which, of course, in life we rarely do, right? So, you know, this is going to be relevant in understanding how people behave in most real-world situations. Most situations in life are not a casino table game where you have known odds of winning.

Speaker 2: Or, you know, you can easily imagine a situation where you're playing some sort of a game, be it a tabletop role-playing game or some sort of, like, you know, video game, and you're given a choice between two items for your character. Right? You have the stats of those items right there. You know what kind of enemies you go up against in the game. Maybe you've played through the game before. You have something at least approaching perfect knowledge of the simplified world. And our real world is not so simplified, and there are so many variables to it.

Speaker 3: That's exactly right. So in the real world, we're constantly dealing with uncertainty, ambiguous situations where we don't have all the information to judge what outcomes are most likely, and yet somehow we navigate this world making decisions all the time. The question being addressed here is how do people make those decisions when they don't actually know what the relative likelihoods of different outcomes are.
Subjective expected utility theory has several rules, but a simplified version is that it says that when people don't know the objective likelihood of an outcome, they form beliefs about how likely that outcome is, and then they act consistently with those beliefs in order to maximize their personal benefit. I came across a passage by the American economist Frank Knight which I think summarizes this idea well. Knight writes, quote: We must observe at the outset that when an individual instance only is at issue, i.e., a one-time event, there is no difference for conduct between a measurable risk and an unmeasurable uncertainty. The individual, as already observed, throws his estimate of the value of an opinion into the probability form of a successes in b trials, a/b being a proper fraction, and feels toward it as toward any other probability situation. So essentially we form a belief or a feeling about how likely something is, and then we act on it as if we knew it, the way we know the odds of a coin flip coming up heads are one in two, whether or not we're actually right about those feelings or beliefs. Ellsberg used the principle of ambiguity aversion to say, actually, no, people do not always behave as if they have consistent beliefs about what is more or less likely to happen. And you can prove this, because they make bets that would imply self-contradictory beliefs in the same situation, meaning they either don't actually have, or act on, consistent beliefs about a situation, or they don't always act to maximize their own monetary benefit. Ellsberg's point here was that we can have obscure, previously unacknowledged motivations, like the desire to avoid dealing with ambiguity, and that particular desire is so strong in some cases that it overrides our consistent thinking and a simple dollar-value understanding of rational self-interest. And by the way, the technical name of the Savage axiom that Ellsberg was attacking here is called the sure-thing principle.
Another thing to understand about Ellsberg's paper was that he was arguing that we need to make a stronger distinction between two different concepts. Those are risk and ambiguity. Ellsberg gives the definition that risk is when we have a stake in an outcome and we know the theoretical likelihood of that outcome. So this is like the casino games or a coin toss. You know, a fair coin toss is fifty-fifty. We know what the outcome likelihood is, and we can take a risk. A fair die roll is one in six to get a particular number. Ambiguity is when we don't have an objective way of knowing what the probability of our desired outcome is. To Ellsberg, ambiguity was, quote, the nature of one's information concerning the relative likelihood of events, a quality depending on the amount, type, reliability, and unanimity of information, giving rise to one's degree of confidence in an estimation of relative likelihoods. And so it's worth noting that you can have confidence, or even what feels like certainty, or act as if you have confidence or certainty, without actually being correct. You can go through life having high confidence in objectively low-probability things. One of the actually really useful things about SEU theory, the subjective expected utility theory, was that it made things simple and easier to understand, because it treated risk and ambiguity the same. So a person making bets on a die roll will know that the probability of rolling a specific number is one in six, and they will behave accordingly. And a person who has a stake in an outcome with an unknown probability will, according to Savage's axioms, mentally, even subconsciously, form an internal belief about the probability of that outcome. So an example is, you imagine there is a one in four chance it's going to rain today. That might not have anything to do with reality, but you just take that belief on, and you make all of your decisions based on that belief. They will act according to that belief the same way they would act knowing that a coin flip is fifty-fifty.
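As a toy sketch of that SEU move, using the one-in-four rain belief from just above (the bet names, payoffs, and numbers here are invented for illustration, not taken from Savage or Ellsberg):

```python
# Toy SEU agent: adopt a subjective belief, then maximize expected
# payoff exactly as if the belief were a known probability.
def best_bet(bets, belief):
    """bets: name -> {outcome: payoff}; belief: outcome -> subjective prob."""
    def expected(payoffs):
        return sum(belief[outcome] * value for outcome, value in payoffs.items())
    return max(bets, key=lambda name: expected(bets[name]))

belief = {"rain": 0.25, "dry": 0.75}  # the imagined one-in-four chance of rain
bets = {
    "bring umbrella": {"rain": 10, "dry": -1},
    "leave it home":  {"rain": -20, "dry": 0},
}
print(best_bet(bets, belief))  # -> "bring umbrella", acting on the belief
```

The simplification is exactly this: once the subjective probability is written down, ambiguity gets handled with the same machinery as risk.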
And Ellsberg argued, no, that is not how we make decisions. SEU theory with Savage's axioms failed to account for other complexities in human behavior, for example, our deep desire to get away from ambiguity. Like, we dislike ambiguity so strongly that we will make decisions that imply self-contradictory beliefs just to avoid it. Now, there have been a lot of arguments over the years about how to interpret ambiguity aversion. Is it actually rational in ways that were not previously recognized, or is it more like some kind of systematic error or cognitive bias? We might come back and revisit that debate in a subsequent part of this series, but we've got a couple other things we wanted to talk about today. First, Rob, I know you have a really interesting place you want to take this, but first I wanted to mention the idea just of experimental confirmation, because one important thing to understand is that Ellsberg's original paper did not have an experimental component. It was a thought experiment, and he realized the phenomenon needed to be tested with real human subjects. In the decades since his paper, it has been tested many, many times, many different ways, and generally it has proved robust. There are some nuances and exceptions, but generally people behave the way Ellsberg predicted they would. A majority of people structure their choices to avoid ambiguity, even when there's no clear objective benefit to doing so, and they even make bets that would imply, according to SEU theory, self-contradictory beliefs, if those bets can get them out of dealing with ambiguous probabilities; so, you know, I don't have to mess with that. Regarding those experiments, my main source here is again that survey article by Machina and Siniscalchi. Again, that's Ambiguity and Ambiguity Aversion, from twenty fourteen.
The authors here list a whole bunch of experiments where people carried out versions of, like, the three-color urn or this other experiment that Ellsberg described, the two-urn experiment. They list Fellner in nineteen sixty one, which found that people would rather take fifty-fifty odds than unknown odds, which could be better or worse. They also cite Becker and Brownson in sixty-four, MacCrimmon in sixty-eight, Slovic and Tversky from nineteen seventy-four, and Curley and Yates from eighty-nine, among others, all generally finding confirmation of Ellsberg's predictions about ambiguity aversion. Quote: Although most of these experiments used students as subjects, researchers such as MacCrimmon sixty-five, Hogarth and Kunreuther eighty-nine, Einhorn and Hogarth eighty-six, Viscusi and Chesson ninety-nine, Ho, Keller, and Keltyka two thousand and two, and Maffioletti and Santoni two thousand and five have examined the ambiguity preferences of business owners, trade union leaders, actuaries, managers, and executives, with the same overall findings. So it appears quite robust. The authors here also report some interesting findings from MacCrimmon and Larsson from seventy-nine, where they looked into it and found that while the majority of people did try to avoid ambiguity, some minority of people did not and even just chose to embrace it. They also found, as you might expect, that our relative tolerance for ambiguity went up or down depending on how good the known odds were in the known-odds bet condition. So, for example, if you do this three-color urn experiment, if you take the number of red balls, that's the ones where you know how many there are, if you take that down to zero, or even just down to five or ten, most people will take the gamble with the unknown odds on yellow and black, because the odds on red are clearly very bad. If you make the odds on red really good, even more people will stick with the known odds, and ambiguity aversion becomes even stronger. So that shouldn't be all that surprising.
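For comparison, here is what a purely ambiguity-neutral bettor would do in that varied-odds variant, a sketch assuming r known red balls out of ninety and treating the unknown mix by its midpoint (this benchmark is an illustrative framing, not a model taken from the survey):

```python
# Ambiguity-neutral benchmark: r known red balls out of 90; the other
# 90 - r are an unknown yellow/black mix, so under uniform ignorance
# the expected share of black is the midpoint (90 - r) / 2.
def ambiguity_neutral_choice(r, total=90):
    known_odds = r / total
    midpoint = (total - r) / 2 / total
    return "known-odds bet" if known_odds >= midpoint else "ambiguous bet"

for r in (5, 30, 60):
    print(r, ambiguity_neutral_choice(r))
# 5 -> ambiguous bet, 30 -> known-odds bet (an exact tie), 60 -> known-odds bet
```

The finding described above is that real subjects sit to one side of this benchmark: many demand better-than-midpoint known odds before accepting ambiguity, and, as comes up next, a stubborn minority never accepts it at all.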
Speaker 3: But yeah, in the middle category, where the odds on red are somewhere in between good and bad, you're just kind of like, I don't know. But still, people tend to go for the known odds rather than the unknown odds. However, the authors also found that even when the idea of subjective expected utility and choice consistency was explained to the people doing this experiment, explained to the subjects, or even in cases where the known odds were bad, some subjects just stubbornly stuck to red and avoided betting on any ambiguous conditions. So most of us seem to dislike ambiguity in the majority of cases, and some of us just don't like it at all and will not tolerate it. Now, as is often the case with stuff in decision theory and economics, you start off playing these little, like, gambling games with, like, balls in a jar or something like that, and so it seems like, well, how consequential could this really be? But actually, I think you can take the idea of ambiguity aversion further. Number one, it has been applied to very consequential situations in real life, and thus understanding it can have major consequences, not only for people's individual lives, but for world events.

Speaker 2: That's right. And that brings us back to Ellsberg, who, again, in addition to crafting this highly influential paper, was also a highly vocal critic of nuclear weapons and, again, the rhetoric surrounding nuclear weapons. And I was reading a good bit from his book The Doomsday Machine, which came out in twenty seventeen, which is quite a good read if you want a much deeper dive into his thoughts on all of this. But, you know, in that book the subject of ambiguous data comes up numerous times: how will a nuclear power react in response to ambiguous data about a potential incoming attack? And additionally, it talks about the purported advantage of ambiguity in intent slash response, namely in the case of madman theory.
This is very closely associated with Richard Nixon, but also tied to various other figures, domestic and international, including the current US president. A kind of supposed or perceived madness or volatility that would lead adversaries to second-guess any plans to move against them.

Speaker 3: Right. And we're going to get to some major criticisms of the madman theory strategy in a bit. But the idea of it is that you can leverage a reputation for unpredictability to your advantage in negotiations.

Speaker 2: That's right. You know, this is very much that nobody-knows-what-I'm-going-to-do approach. That's the message you put out there. And part of the idea here is that it makes logically empty or extreme threats seem more possible. Now, madman theory, you know, is a term that Nixon even used himself when talking about this approach. It was outlined by Daniel Ellsberg and Thomas C. Schelling as well, but the basic con is pretending to be erratic, irrational, or unhinged to influence the other side of a bargaining table. This goes back a long ways. Machiavelli wrote about it in fifteen seventeen, and it's probably as old as the first prehistoric human to, like, go around hitting trees randomly with a club, you know, and say, who knows what Tharg is going to do? Why would you mess with Tharg on this?

Speaker 3: Yeah, we better just give Tharg what he wants.

Speaker 2: Yeah. So again, the Ellsberg paradox highlights a core flaw in human decision making, the aversion to ambiguity. The madman theory as a political strategy seeks to exploit this by cultivating a reputation of irrationality, creating an ambiguous threat. The adversary, in their aversion to this ambiguity and the unknown risk of an unpredictable response, may be coerced into backing down. Or at least this is the idea; this is the logic behind the approach. Yeah.
583 00:34:55,040 --> 00:34:57,880 Speaker 3: The classic example of madman theory in practice in 584 00:34:57,920 --> 00:35:02,560 Speaker 3: real life is that the Nixon administration could, for example, 585 00:35:02,840 --> 00:35:07,000 Speaker 3: have Kissinger call up his counterparts in the Kremlin and say, look, 586 00:35:07,239 --> 00:35:09,239 Speaker 3: you know, we're all trying to calm Nixon down, but 587 00:35:09,280 --> 00:35:12,959 Speaker 3: he's boiling mad at you. He's borderline crazy. You've got 588 00:35:12,960 --> 00:35:14,759 Speaker 3: to do X, Y, and Z. You know, you've got 589 00:35:14,760 --> 00:35:16,879 Speaker 3: to give him these concessions he wants, or we don't 590 00:35:16,920 --> 00:35:18,560 Speaker 3: know if we'll be able to control him. 591 00:35:19,000 --> 00:35:20,960 Speaker 2: Yeah. Kind of a good cop, bad cop thing too, 592 00:35:21,040 --> 00:35:21,719 Speaker 2: I imagine. 593 00:35:21,840 --> 00:35:25,799 Speaker 3: Yeah. And I think 594 00:35:25,840 --> 00:35:27,399 Speaker 3: a big part of it was that it was a way 595 00:35:27,440 --> 00:35:31,120 Speaker 3: of making the nuclear deterrent feel once again like it 596 00:35:31,200 --> 00:35:35,440 Speaker 3: had force in matters other than just deterring a first strike. 597 00:35:36,440 --> 00:35:39,880 Speaker 3: Because the problem with using nuclear weapons as leverage in 598 00:35:39,960 --> 00:35:44,759 Speaker 3: negotiation between two nuclear-armed powers is, obviously, if the 599 00:35:44,760 --> 00:35:48,480 Speaker 3: weapons get used at all, everyone loses. There is not 600 00:35:48,600 --> 00:35:52,280 Speaker 3: a winner in mutually assured destruction. So if the White 601 00:35:52,280 --> 00:35:54,960 Speaker 3: House was trying to get the Kremlin to do something 602 00:35:55,040 --> 00:35:58,160 Speaker 3: with a threat like, I'll nuke you if you don't do it, that was 603 00:35:58,200 --> 00:36:01,560 Speaker 3: not a credible threat, because both parties knew that 604 00:36:01,600 --> 00:36:04,760 Speaker 3: the other party knew that war would be the destruction 605 00:36:04,840 --> 00:36:08,920 Speaker 3: of them both. So no rational actor would ever strike first. 606 00:36:09,400 --> 00:36:12,480 Speaker 3: Between rational actors, I'll nuke you if you don't do 607 00:36:12,719 --> 00:36:16,600 Speaker 3: X, Y, and Z is an empty threat. The goal of madman theory 608 00:36:17,080 --> 00:36:21,000 Speaker 3: is to gain leverage in negotiations by introducing some amount 609 00:36:21,080 --> 00:36:25,400 Speaker 3: of fear that your counterparty may not be a rational actor. 610 00:36:25,719 --> 00:36:29,279 Speaker 3: He might be crazy enough to do it, we don't know. Therefore, 611 00:36:29,560 --> 00:36:33,960 Speaker 3: the madman strategy makes direct use of the opponent's ambiguity 612 00:36:34,040 --> 00:36:37,640 Speaker 3: aversion as a tactic to get leverage over them. The 613 00:36:37,680 --> 00:36:40,880 Speaker 3: madman player in this game is betting that the fear 614 00:36:40,960 --> 00:36:45,920 Speaker 3: and uncertainty about how to handle their unpredictability will cause 615 00:36:45,960 --> 00:36:49,520 Speaker 3: their counterparty to make concessions they might not make otherwise. 616 00:36:49,680 --> 00:36:52,319 Speaker 3: Like, you end up thinking, well, I'll just take a 617 00:36:52,360 --> 00:36:56,120 Speaker 3: bad deal rather than wander into territory with unclear levels of.
618 00:36:56,200 --> 00:37:00,520 Speaker 2: Risk. Right, right. And not to derail the point here, but 619 00:37:00,560 --> 00:37:04,120 Speaker 2: I want to bring it back to Ellsberg's other criticisms 620 00:37:04,160 --> 00:37:07,399 Speaker 2: of nuclear weapons and the rhetoric surrounding nuclear weapons, that being: 621 00:37:07,560 --> 00:37:10,640 Speaker 2: there are all these other uncertainty factors and 622 00:37:10,760 --> 00:37:12,960 Speaker 2: ambiguous data that would be coming in about what the 623 00:37:13,080 --> 00:37:17,040 Speaker 2: enemy is or is not doing, and therefore any given 624 00:37:17,080 --> 00:37:20,319 Speaker 2: situation like this is going to be just fraught 625 00:37:20,320 --> 00:37:24,279 Speaker 2: with potential ambiguity, potential missteps, and potential mistakes. So, like, 626 00:37:24,360 --> 00:37:26,759 Speaker 2: if it were only as easy as we're laying it 627 00:37:26,800 --> 00:37:29,760 Speaker 2: out here, it would almost be a different situation. 628 00:37:29,800 --> 00:37:34,120 Speaker 3: Totally. I mean, Ellsberg's point about all of the uncertainty 629 00:37:34,360 --> 00:37:38,240 Speaker 3: in the actual command and control of, you know, nuclear 630 00:37:38,239 --> 00:37:43,959 Speaker 3: strategy: that introduces real levels of existential 631 00:37:44,040 --> 00:37:47,839 Speaker 3: risk into playing games like this. There are a lot 632 00:37:47,920 --> 00:37:50,680 Speaker 3: of reasonable criticisms of madman theory, but I think 633 00:37:50,680 --> 00:37:52,839 Speaker 3: a major one, at least from my point of view, 634 00:37:52,920 --> 00:37:57,120 Speaker 3: is that at best it is a strategy for short 635 00:37:57,239 --> 00:38:01,040 Speaker 3: term gain at the expense of long term stability and 636 00:38:01,120 --> 00:38:05,719 Speaker 3: negotiating power, because one way or another it undercuts your 637 00:38:05,719 --> 00:38:09,080 Speaker 3: credibility and your ability to be seen as an honest 638 00:38:09,120 --> 00:38:13,600 Speaker 3: and reliable broker in future negotiations. Either you come 639 00:38:13,680 --> 00:38:17,160 Speaker 3: to be seen as actually irrational, in which case people 640 00:38:17,200 --> 00:38:18,960 Speaker 3: don't want to deal with you, and, you know, 641 00:38:19,040 --> 00:38:21,799 Speaker 3: they may be 642 00:38:21,800 --> 00:38:24,480 Speaker 3: motivated to strike first 643 00:38:24,480 --> 00:38:26,840 Speaker 3: because you're too dangerous to be let loose, 644 00:38:26,960 --> 00:38:29,440 Speaker 3: or they may try to 645 00:38:29,480 --> 00:38:32,440 Speaker 3: find ways to cut you out of negotiations entirely, so that 646 00:38:32,440 --> 00:38:35,120 Speaker 3: could be one consequence; or you are revealed 647 00:38:35,160 --> 00:38:38,680 Speaker 3: to have been bluffing, in which case you lose your credibility. 648 00:38:38,520 --> 00:38:41,000 Speaker 2: Right, right, and yeah, this is all valid. 649 00:38:41,520 --> 00:38:43,759 Speaker 2: I was looking at a couple of papers on madman 650 00:38:43,800 --> 00:38:47,880 Speaker 2: theory, and madman theory, like ambiguity aversion itself, 651 00:38:47,920 --> 00:38:50,520 Speaker 2: is something that has been written about a lot.
652 00:38:50,640 --> 00:38:53,279 Speaker 2: There is no shortage of papers out there, but I 653 00:38:53,280 --> 00:38:56,040 Speaker 2: looked at just a couple. I looked at one titled 654 00:38:56,160 --> 00:38:59,240 Speaker 2: Crazy Like a Fox? Are Leaders with Reputations for Madness 655 00:38:59,239 --> 00:39:04,480 Speaker 2: More Successful at International Coercion?, by Roseanne W. McManus. This came 656 00:39:04,520 --> 00:39:07,840 Speaker 2: out in twenty nineteen from Cambridge University Press, and in 657 00:39:07,880 --> 00:39:10,880 Speaker 2: this the author points out that a reputation for madness 658 00:39:11,120 --> 00:39:15,160 Speaker 2: would seem to be more often harmful than helpful in 659 00:39:15,160 --> 00:39:20,359 Speaker 2: international coercion. It undercuts the leader's ability to make 660 00:39:20,440 --> 00:39:24,920 Speaker 2: believable peace commitments, treaties, and so forth, which, and it 661 00:39:25,000 --> 00:39:27,160 Speaker 2: really almost feels absurd that we need to stress this, 662 00:39:27,719 --> 00:39:31,719 Speaker 2: are all vital for peaceful relations between nations. No 663 00:39:31,760 --> 00:39:34,480 Speaker 2: one knows what I'll do easily slides into no one 664 00:39:34,560 --> 00:39:37,799 Speaker 2: knows what I'll honor, no one knows what treaties I 665 00:39:37,880 --> 00:39:41,880 Speaker 2: will actually stand by, and so forth. You know, 666 00:39:41,920 --> 00:39:43,640 Speaker 2: to bring this back to Dungeons and Dragons, I think 667 00:39:43,640 --> 00:39:45,680 Speaker 2: any D&D player worth your salt knows that while 668 00:39:45,719 --> 00:39:48,120 Speaker 2: you might well enter into a pact with a lawful 669 00:39:48,200 --> 00:39:51,160 Speaker 2: evil devil, you never enter into a pact with a 670 00:39:51,239 --> 00:39:54,480 Speaker 2: chaotic evil demon, because the lords of the Nine Hells 671 00:39:54,520 --> 00:39:56,400 Speaker 2: are going to stick to the letter of the contract, 672 00:39:56,400 --> 00:39:59,400 Speaker 2: if not the spirit of the contract. But demon lords 673 00:39:59,440 --> 00:40:02,560 Speaker 2: such as Demogorgon will honor nothing, and there is 674 00:40:02,719 --> 00:40:06,600 Speaker 2: no coherence even within themselves. They're just pure ambiguity, and 675 00:40:06,640 --> 00:40:09,680 Speaker 2: you can't strike any sort of deal, because they won't 676 00:40:09,719 --> 00:40:12,359 Speaker 2: stand by it no matter what. Whereas the devils, being 677 00:40:12,440 --> 00:40:15,960 Speaker 2: lawful evil, may look for that wiggle room; they 678 00:40:16,000 --> 00:40:19,600 Speaker 2: may find ways to, you know, avoid the spirit 679 00:40:19,719 --> 00:40:21,720 Speaker 2: of the deal, but they're still bound to the letter. 680 00:40:22,040 --> 00:40:24,399 Speaker 3: Right. So if you discover that your 681 00:40:24,440 --> 00:40:27,440 Speaker 3: counterparty is actually just chaotic evil, all you can do 682 00:40:27,520 --> 00:40:29,960 Speaker 3: is roll for initiative. Like, there's no making a. 683 00:40:29,920 --> 00:40:33,080 Speaker 2: Deal, yeah. And then you slide into chaos, pure chaos. 684 00:40:34,160 --> 00:40:37,120 Speaker 2: Back to McManus's paper, though: she found that madman 685 00:40:37,160 --> 00:40:40,960 Speaker 2: theory may be helpful in crisis bargaining, but only 686 00:40:41,080 --> 00:40:46,240 Speaker 2: under certain conditions, namely when employed by militarily weak leaders.
687 00:40:46,640 --> 00:40:49,520 Speaker 2: So we're not talking about a true superpower here that 688 00:40:49,560 --> 00:40:53,759 Speaker 2: can engage in, like, true mutually assured destruction. Rather, we 689 00:40:53,800 --> 00:40:56,440 Speaker 2: would be dealing with a state that could do a 690 00:40:56,440 --> 00:41:00,640 Speaker 2: lot of damage, but in striking out would just thoroughly 691 00:41:00,640 --> 00:41:03,759 Speaker 2: destroy itself. So the idea here is that no one 692 00:41:03,800 --> 00:41:07,360 Speaker 2: would take this hypothetical nation or the leader of this 693 00:41:07,440 --> 00:41:12,759 Speaker 2: hypothetical nation seriously unless they presented an air of madness, 694 00:41:12,800 --> 00:41:16,160 Speaker 2: and then, perhaps, hey, they might do it anyway if provoked. 695 00:41:16,200 --> 00:41:18,480 Speaker 2: You know, they're unhinged, and thus it 696 00:41:18,520 --> 00:41:23,040 Speaker 2: can be a form of asymmetric leverage. So we're talking 697 00:41:23,239 --> 00:41:28,000 Speaker 2: low probability but high consequence. And McManus also points out 698 00:41:28,000 --> 00:41:29,560 Speaker 2: that there would seem to be more of an 699 00:41:29,560 --> 00:41:33,200 Speaker 2: advantage if you had a mild reputation for madness rather 700 00:41:33,239 --> 00:41:35,480 Speaker 2: than an extreme one. You don't want to come off like, 701 00:41:35,880 --> 00:41:41,200 Speaker 2: you know, a complete eye-rolling maniac. 702 00:41:41,560 --> 00:41:43,720 Speaker 2: In one of these cases, the idea would be like, well, 703 00:41:44,000 --> 00:41:46,000 Speaker 2: you know, we can still get this individual 704 00:41:46,080 --> 00:41:48,960 Speaker 2: to the negotiation table, but we just have to be 705 00:41:49,239 --> 00:41:53,160 Speaker 2: hypersensitive. And I guess, again, it's kind of avoiding that 706 00:41:53,320 --> 00:41:57,840 Speaker 2: complete fall-off into chaotic evil demonhood and instead dealing 707 00:41:57,920 --> 00:41:59,879 Speaker 2: with devils that can still be bound by some sort 708 00:41:59,880 --> 00:42:03,640 Speaker 2: of law. All right, I have a hypothetical example here. 709 00:42:03,760 --> 00:42:08,799 Speaker 2: So we have the demigod hero Hercules and we have his 710 00:42:09,000 --> 00:42:13,240 Speaker 2: mortal cousin Eurystheus, and they're grabbing lunch at the local 711 00:42:13,239 --> 00:42:18,239 Speaker 2: gyro place. Great. So Eurystheus warns his cousin that if 712 00:42:18,239 --> 00:42:20,480 Speaker 2: he tries to steal any of his fries, he's going 713 00:42:20,520 --> 00:42:22,120 Speaker 2: to flip the table and all their food's going to 714 00:42:22,160 --> 00:42:24,480 Speaker 2: wind up on the floor, and he acts just really 715 00:42:24,520 --> 00:42:29,040 Speaker 2: sensitive and unhinged about the whole thing. Yes. Okay, Hercules 716 00:42:29,200 --> 00:42:33,000 Speaker 2: obviously knows that Eurystheus can't take him in a fight 717 00:42:33,120 --> 00:42:35,280 Speaker 2: and might not even be able to flip that table 718 00:42:35,320 --> 00:42:41,760 Speaker 2: over before Hercules stops him. And rationally, Eurystheus should realize 719 00:42:41,760 --> 00:42:44,080 Speaker 2: this and just let big Herc have a few extra 720 00:42:44,120 --> 00:42:45,839 Speaker 2: fries if he wants, like, what does it matter?
721 00:42:46,520 --> 00:42:50,280 Speaker 2: But from Hercules's standpoint, does he really want to risk 722 00:42:50,400 --> 00:42:53,280 Speaker 2: his lunch winding up on the floor? Maybe he should 723 00:42:53,320 --> 00:42:57,680 Speaker 2: just let Eurystheus keep all of his fries. Okay, the 724 00:42:57,800 --> 00:43:02,600 Speaker 2: downside, of course, here is that maybe Hercules just won't 725 00:43:02,640 --> 00:43:05,439 Speaker 2: invite Eurystheus out for lunch next time, 726 00:43:05,440 --> 00:43:07,160 Speaker 2: and he's certainly not going to pick up the check. 727 00:43:07,719 --> 00:43:10,879 Speaker 2: And on top of all that, how are they going 728 00:43:10,920 --> 00:43:13,759 Speaker 2: to work together to defeat the hydra? Because, as you 729 00:43:13,880 --> 00:43:16,400 Speaker 2: remember from past episodes and from Greek mythology in general, 730 00:43:16,640 --> 00:43:18,920 Speaker 2: Hercules can't do that on his own. He has to 731 00:43:18,960 --> 00:43:21,479 Speaker 2: have his cousin's help to burn the stumps after each 732 00:43:21,640 --> 00:43:23,160 Speaker 2: head of the hydra is cut off. 733 00:43:23,080 --> 00:43:27,200 Speaker 3: Very good points. I mean, apart from any, like, 734 00:43:27,239 --> 00:43:31,400 Speaker 3: moral considerations or honesty considerations about deploying something like 735 00:43:31,480 --> 00:43:33,959 Speaker 3: madman theory, just from a strategic point of view, 736 00:43:35,200 --> 00:43:38,680 Speaker 3: it seems like it is probably best for situations where 737 00:43:39,160 --> 00:43:42,520 Speaker 3: your counterparty has to deal with you, where it is 738 00:43:42,600 --> 00:43:46,759 Speaker 3: not optional for them, where you don't have to like 739 00:43:47,080 --> 00:43:50,480 Speaker 3: or respect each other or ever work together on anything, 740 00:43:51,640 --> 00:43:54,600 Speaker 3: and where you don't care about long term goals. You are 741 00:43:54,680 --> 00:43:58,719 Speaker 3: only interested in this situation right now, in extracting a 742 00:43:58,760 --> 00:44:00,120 Speaker 3: short term advantage. 743 00:44:00,760 --> 00:44:04,160 Speaker 2: That's right, that's right. But some papers out there that 744 00:44:04,239 --> 00:44:06,640 Speaker 2: have crunched the numbers on all this and done some 745 00:44:07,080 --> 00:44:10,719 Speaker 2: experiments, questionnaires, and so forth do acknowledge, you know, 746 00:44:10,719 --> 00:44:14,640 Speaker 2: that there are multiple dimensions to any of these situations. Namely, 747 00:44:15,000 --> 00:44:17,000 Speaker 2: and this is something I think everyone can relate to: 748 00:44:17,040 --> 00:44:20,520 Speaker 2: you can have a leader that is engaging in an 749 00:44:20,520 --> 00:44:26,120 Speaker 2: international situation, but there is still the domestic view of 750 00:44:26,160 --> 00:44:29,839 Speaker 2: that situation. There's still the domestic response, the domestic relationship. 751 00:44:31,000 --> 00:44:32,839 Speaker 2: And so one of the papers that I was looking 752 00:44:32,880 --> 00:44:35,440 Speaker 2: at, this one is titled Madman or Mad Genius? 753 00:44:35,440 --> 00:44:38,840 Speaker 2: The International Benefits and Domestic Costs of the Madman Strategy, 754 00:44:38,880 --> 00:44:42,840 Speaker 2: by Joshua A.
Schwartz, published twenty twenty three in Security Studies, 755 00:44:43,640 --> 00:44:46,400 Speaker 2: and in this the author found that the madman approach 756 00:44:46,560 --> 00:44:51,160 Speaker 2: can work in negotiations with foreign adversaries, but quote entails 757 00:44:51,200 --> 00:44:55,560 Speaker 2: significant domestic costs that potentially erode its efficacy. He also 758 00:44:55,640 --> 00:44:58,200 Speaker 2: points out that madman theory may simply not work 759 00:44:58,239 --> 00:45:01,880 Speaker 2: against major powers, because while it might make a threat 760 00:45:01,920 --> 00:45:05,480 Speaker 2: more credible, that doesn't necessarily make the threat more effective 761 00:45:06,040 --> 00:45:08,960 Speaker 2: or make the adversary willing to cave to those demands. 762 00:45:09,000 --> 00:45:12,280 Speaker 2: So he points out that this is perhaps why Nixon's 763 00:45:12,320 --> 00:45:15,400 Speaker 2: use of madman theory didn't work against the Soviets. Like, okay, 764 00:45:15,440 --> 00:45:20,480 Speaker 2: it made Nixon's threat more credible, but did it actually 765 00:45:20,719 --> 00:45:22,799 Speaker 2: make it more effective? Did he actually get what he 766 00:45:22,880 --> 00:45:26,560 Speaker 2: wanted out of these bluffs, these threats? And then 767 00:45:26,600 --> 00:45:30,480 Speaker 2: you have the domestic side of things, with the hypothetical 768 00:45:30,640 --> 00:45:36,000 Speaker 2: leader's own citizens not loving the heightened stakes. Because, you know, 769 00:45:36,120 --> 00:45:38,920 Speaker 2: obviously, I mean, I say obviously, but this is one 770 00:45:38,920 --> 00:45:40,480 Speaker 2: of those things that we often have to be reminded 771 00:45:40,520 --> 00:45:43,680 Speaker 2: of: in a case of mutually assured destruction, or even 772 00:45:43,719 --> 00:45:47,680 Speaker 2: a case of asymmetric exchange involving nuclear weapons or some 773 00:45:47,719 --> 00:45:50,280 Speaker 2: other kind of, you know, horrible weapon of mass destruction, 774 00:45:51,160 --> 00:45:54,000 Speaker 2: a leader is always bargaining with the lives of their 775 00:45:54,000 --> 00:45:56,719 Speaker 2: own citizens. You are the chips on the board in 776 00:45:56,760 --> 00:45:59,600 Speaker 2: a no one knows what I'll do wager. Yeah. And 777 00:45:59,600 --> 00:46:02,520 Speaker 2: he also notes that, okay, this is going 778 00:46:02,600 --> 00:46:04,760 Speaker 2: to also differ depending on how much of a voice 779 00:46:04,760 --> 00:46:07,120 Speaker 2: the people have in a given nation versus how much 780 00:46:07,280 --> 00:46:10,799 Speaker 2: power the ruler has, and there's going to be less 781 00:46:10,840 --> 00:46:13,520 Speaker 2: backlash in places where the people have less of a 782 00:46:13,640 --> 00:46:15,800 Speaker 2: voice and the ruler has more absolute power. 783 00:46:16,000 --> 00:46:18,280 Speaker 3: That's right. But one thing I did want to clarify 784 00:46:18,520 --> 00:46:21,640 Speaker 3: is that the sense in which you are the chips 785 00:46:21,680 --> 00:46:23,680 Speaker 3: on the board in a no one knows what I'll do 786 00:46:23,800 --> 00:46:28,879 Speaker 3: wager is not just limited to, like, the 787 00:46:28,920 --> 00:46:32,080 Speaker 3: worst possible scenario like nuclear warfare.
I mean, it's also 788 00:46:32,680 --> 00:46:35,640 Speaker 3: the case in, say, trade negotiations or something like that. 789 00:46:35,680 --> 00:46:39,120 Speaker 3: Like, your economic prospects are, in a way, 790 00:46:39,160 --> 00:46:43,080 Speaker 3: the chips on the board. Like, the citizens' fates and 791 00:46:43,160 --> 00:46:45,759 Speaker 3: futures are the things that are being negotiated with. 792 00:46:46,120 --> 00:46:51,160 Speaker 2: That's right. Yeah, in contemplating potential nuclear exchanges, everything is 793 00:46:51,200 --> 00:46:54,000 Speaker 2: a bit more stark and a bit more black and white. 794 00:46:54,040 --> 00:46:56,800 Speaker 2: But obviously, you know, anyone can think of 795 00:46:56,800 --> 00:46:59,640 Speaker 2: any number of scenarios where the stakes are still 796 00:46:59,719 --> 00:47:01,600 Speaker 2: quite high for the individual. 797 00:47:01,800 --> 00:47:03,719 Speaker 3: So, yes, it's not hard to see at all why 798 00:47:03,800 --> 00:47:07,839 Speaker 3: a citizenry can easily become upset and annoyed if their 799 00:47:07,960 --> 00:47:11,080 Speaker 3: leader, who's supposed to be representing their interests in negotiations, 800 00:47:11,160 --> 00:47:15,879 Speaker 3: is acting unreliable and unpredictable. Like, the leader may think 801 00:47:15,920 --> 00:47:17,960 Speaker 3: that they can get good gains out of that in 802 00:47:18,000 --> 00:47:21,120 Speaker 3: the short term, but the citizens are probably thinking about 803 00:47:21,120 --> 00:47:23,680 Speaker 3: a lot of these downsides, even if not, you know, thinking 804 00:47:23,719 --> 00:47:26,840 Speaker 3: specifically about them, just thinking intuitively about it: seems like 805 00:47:26,840 --> 00:47:28,239 Speaker 3: there's a lot of downside to this. 806 00:47:28,920 --> 00:47:32,400 Speaker 2: Yeah, and again coming back to what we mentioned earlier, 807 00:47:32,440 --> 00:47:35,920 Speaker 2: the idea too that if you swing around 808 00:47:36,560 --> 00:47:39,000 Speaker 2: the sword of madman theory too much, then it 809 00:47:39,040 --> 00:47:42,800 Speaker 2: makes it more difficult to engage in the other highly 810 00:47:42,840 --> 00:47:47,719 Speaker 2: important tools of statecraft: peace treaties and agreements and 811 00:47:47,760 --> 00:47:50,520 Speaker 2: so forth. So it's like you can only swing that 812 00:47:50,560 --> 00:47:53,600 Speaker 2: sword around for so long, and then you have to 813 00:47:53,600 --> 00:47:57,120 Speaker 2: be able to engage in these other acts as well. Again, 814 00:47:57,360 --> 00:48:01,239 Speaker 2: short term gain with madman theory, not so in 815 00:48:01,280 --> 00:48:05,960 Speaker 2: the long term. But, you know, unfortunately this ends up 816 00:48:06,040 --> 00:48:09,640 Speaker 2: rolling out into so many things about human experience and 817 00:48:09,719 --> 00:48:12,400 Speaker 2: human perception: we are so focused on the 818 00:48:12,400 --> 00:48:14,800 Speaker 2: short term that we don't think about the long term. 819 00:48:15,040 --> 00:48:16,840 Speaker 3: So we've just been talking about a lot of reasons 820 00:48:16,880 --> 00:48:20,000 Speaker 3: that it might actually be bad, and not as clever 821 00:48:20,080 --> 00:48:23,480 Speaker 3: as it first seems, to try to leverage knowledge of 822 00:48:23,520 --> 00:48:28,800 Speaker 3: ambiguity aversion offensively in negotiations.
But I think one way 823 00:48:28,880 --> 00:48:32,240 Speaker 3: that it can definitely be useful is to use knowledge 824 00:48:32,239 --> 00:48:35,840 Speaker 3: of ambiguity aversion defensively: to think about it in 825 00:48:35,960 --> 00:48:41,800 Speaker 3: analyzing your own behavior, being aware of your bias 826 00:48:41,840 --> 00:48:45,440 Speaker 3: to avoid ambiguity, and sort of checking yourself and thinking, like, 827 00:48:45,520 --> 00:48:48,520 Speaker 3: wait a minute, am I actually making the right decision here? 828 00:48:49,040 --> 00:48:52,080 Speaker 3: Is this actually what's rational? Or am I just having 829 00:48:52,160 --> 00:48:56,880 Speaker 3: an irrational bias against situations with unknown probabilities that 830 00:48:57,800 --> 00:49:00,840 Speaker 3: take on an outsized scariness in my mind because of the 831 00:49:00,880 --> 00:49:04,520 Speaker 3: ambiguity involved? Like the example we talked about earlier, with 832 00:49:04,560 --> 00:49:07,319 Speaker 3: people being afraid to try new experiences because there's some 833 00:49:07,360 --> 00:49:08,400 Speaker 3: amount of ambiguity. 834 00:49:08,800 --> 00:49:11,600 Speaker 2: Yeah, absolutely. Again, there's always ambiguity in any new 835 00:49:11,640 --> 00:49:14,160 Speaker 2: experience. There are so many ways it could go wrong, 836 00:49:14,200 --> 00:49:15,759 Speaker 2: but there are so many ways it could go right. 837 00:49:16,600 --> 00:49:18,640 Speaker 2: You know, a lot of us do tend to focus 838 00:49:18,680 --> 00:49:19,960 Speaker 2: more on the ways it could go wrong. 839 00:49:20,200 --> 00:49:22,520 Speaker 3: The devil you know is not necessarily better. 840 00:49:24,239 --> 00:49:26,120 Speaker 2: Yeah, the devil you don't know could be really fun. 841 00:49:26,600 --> 00:49:27,320 Speaker 2: Give them a shot. 842 00:49:27,440 --> 00:49:30,960 Speaker 3: Sometimes the devil you don't know could be like a chaotic 843 00:49:31,000 --> 00:49:33,920 Speaker 3: good devil. Those exist, right? 844 00:49:33,960 --> 00:49:38,359 Speaker 2: You know, maybe it's possible under 845 00:49:38,400 --> 00:49:40,800 Speaker 2: the new rules, but it's. 846 00:49:40,640 --> 00:49:42,359 Speaker 3: Actually just a friendly tiefling that. 847 00:49:42,280 --> 00:49:44,840 Speaker 2: You can get mistaken. You could have a 848 00:49:44,840 --> 00:49:47,960 Speaker 2: good tiefling. There you go, it's entirely possible. All right. 849 00:49:48,000 --> 00:49:50,040 Speaker 3: Well, that's going to do it for part one of 850 00:49:50,239 --> 00:49:53,560 Speaker 3: our look at ambiguity aversion. But we're going to be 851 00:49:53,600 --> 00:49:56,640 Speaker 3: back again next time to talk about how this applies 852 00:49:56,680 --> 00:50:00,440 Speaker 3: to some other domains of life and maybe some subsequent research. 853 00:50:00,600 --> 00:50:03,839 Speaker 3: I might get into some of the different ways that the 854 00:50:03,880 --> 00:50:09,120 Speaker 3: ambiguity aversion observations have been interpreted in terms of different 855 00:50:09,120 --> 00:50:11,880 Speaker 3: types of decision theory, like is it an error or 856 00:50:11,920 --> 00:50:14,680 Speaker 3: is it actually a type of rationality that needs to 857 00:50:14,719 --> 00:50:17,560 Speaker 3: be better understood? So yeah, we'll talk about that 858 00:50:17,600 --> 00:50:18,520 Speaker 3: kind of stuff next time.
859 00:50:18,760 --> 00:50:22,240 Speaker 2: Who knows what angles we'll actually discuss in the next episode? 860 00:50:22,280 --> 00:50:24,640 Speaker 2: You can tune in to find out. In the meantime, 861 00:50:25,000 --> 00:50:27,160 Speaker 2: certainly write in, we'd love to hear from you, and 862 00:50:27,239 --> 00:50:29,360 Speaker 2: we'd like to remind you that Stuff to Blow Your 863 00:50:29,400 --> 00:50:31,799 Speaker 2: Mind is primarily a science and culture podcast, with core 864 00:50:31,840 --> 00:50:35,640 Speaker 2: episodes on Tuesdays and Thursdays, short form episodes on Wednesdays, 865 00:50:35,640 --> 00:50:37,719 Speaker 2: and on Fridays we set aside most serious concerns to 866 00:50:37,760 --> 00:50:40,719 Speaker 2: just talk about a weird film on Weird House Cinema. 867 00:50:40,960 --> 00:50:44,440 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 868 00:50:44,760 --> 00:50:46,400 Speaker 3: If you would like to get in touch with us 869 00:50:46,400 --> 00:50:48,840 Speaker 3: with feedback on this episode or any other, to suggest 870 00:50:48,840 --> 00:50:50,680 Speaker 3: a topic for the future, or just to say hello, 871 00:50:50,840 --> 00:50:53,520 Speaker 3: you can email us at contact at stuff to blow 872 00:50:53,520 --> 00:51:01,800 Speaker 3: your mind dot com. 873 00:51:01,960 --> 00:51:04,920 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 874 00:51:05,000 --> 00:51:07,759 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 875 00:51:07,920 --> 00:51:25,360 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.