Speaker 1: It's not pathways switched on by propaganda, it's pathways switched off. It turns out that what propaganda does is it suppresses those networks so that I come to understand some other group more like an object rather than another person. You associate somebody with something bad, like pollution or viruses or whatever. But the point is you don't see them as a human, and that's the power of propaganda and dehumanization.

Speaker 2: I'm John Sipher. I served in the CIA's Clandestine Service for twenty-eight years, living undercover all around the world.

Speaker 3: And I'm Jerry O'Shea. In my thirty-three years with the CIA, I served in Africa, Asia, Europe, and the Middle East.

Speaker 2: Although we don't usually look at it this way, we created conspiracies.

Speaker 3: In our operations, we got people to believe things that weren't true.

Speaker 2: Now we're investigating the conspiracy theories we see in the news almost every day.

Speaker 3: We'll break them down for you to determine whether they could be real or whether we're being manipulated.

Speaker 2: Welcome to Mission Implausible. Our guest today is David Eagleman. He's a neuroscientist and a teacher at Stanford, a best-selling author of numerous books and articles, and the host of the outstanding podcast Inner Cosmos with David Eagleman. David, welcome to Mission Implausible.

Speaker 1: Great to be here, guys.

Speaker 4: So David, I'll kick it off, maybe with an observation and a question to get us going. The observation is that the CIA and neuroscience, at least when it comes to conspiracy theories, are actually much closer than you'd think, because as agency officers we deal not only with people having conspiracy theories about us as a secret organization, but in our work we dealt with people who are conspiracy theorists, and we deal with societies that view life through a conspiratorial or conspiracy-theory lens. So if John or I sit down with a Pakistani scientist who's convinced that the Jews run the world.
You have to deal with this guy, right? He's got information. We have to figure out how to work with him and how to talk to him. And on the other hand, there was a really smart, intelligent person from a major country. He was in their security service. He and I were getting close, and I realized he was trying to recruit me, and what his government wanted to know, and what he wanted to know, was what was the role of the Masonic lodges in the United States, because clearly they run the US. So, just to kick it off, I'd love your thoughts on why people are prone to conspiracy theories and why societies are as well.

Speaker 1: It's because the job of the brain is to make hypotheses about the world. Your brain is locked in silence and darkness inside the skull, and it's trying to figure out what the heck is going on out there, and the way it does this is by generating ideas and testing them, because we're totally limited in what we can actually know about the world, not only because our reach is very small, but also because the world is very complicated. There are billions of people in the world, and we're always trying to figure things out. Now, brains do this naturally. So if I'm trying to figure out where I put my phone down, my first hypothesis is, oh, I must have left it where I normally do, and then it's not there, so I think, oh, maybe I carried it with me over to the bedroom and left it there, and then I search that and I find that's not true. And as I continue to not find my phone, what the brain naturally does, and appropriately, is it turns up the temperature on these ideas and comes up with wackier and wackier hypotheses. And eventually, if you can't find your phone, you might say, well, I think the electrician who was here maybe took the phone. This is what brains do, and this is a natural part of what they're supposed to do.
Now, interestingly, this is what leads to conspiracy theories, because when we're generating a hypothesis about what it was that landed in Roswell, or how JFK actually got killed, or nine eleven, how did the towers actually fall, when your brain comes up with a really crazy hypothesis, usually you dismiss that one in favor of things that are more likely. And so my interest has been in why people are attracted to certain hypotheses, and I actually think there are several reasons here. One of them is that puzzle solving is very rewarding to the brain. And of course we're trained on this since we're kids. You're in school, you have all kinds of flavors of puzzles that you do all the time, and this actually activates these reward systems when you solve them. And when you generate a good theory to explain a mysterious event, or just to have a completely different explanation of an event, there's something really rewarding about cracking that puzzle. You are the Sherlock Holmes in that moment. And I do want to make clear that people often say things like conspiracy theories are not rational, but in fact they're completely in line with what brains do to solve problems. And when someone buys into a conspiracy theory, it's not necessarily that they're being disingenuous. It's that the world is large and complex, and the best any of us can do is fit together Lego pieces to try to figure out what the heck is going on out there. I'll just say a couple other things, which is that, fascinatingly, there are many reasons why brains try to solve puzzles all the time. One of them is to reduce cognitive dissonance, which is to say, when we're faced with facts that don't fit or information that conflicts, which happens all the time, that's uncomfortable for us. And so when you can come up with a good conspiracy theory that fits, it can reduce cognitive load and complexity and dissonance.
So just a couple other things I think are fascinating here, which is that we're also all yoked with a confirmation bias, which just means that we seek out, and we like, information that confirms our pre-existing beliefs.

Speaker 4: I've got a ten-dollar word I want to use, so I'll probably mispronounce it: pareidolia, right, pareidolia. So from an evolutionary point of view, as I understand it, we're more likely to survive and breed if we see patterns. If there's a rustle in the bushes or a shadow, it could be a tiger, so you run. You don't know if it's a tiger. And of course the person who goes over and says, I'm going to go check this out and determine whether it is or isn't, of course they're going to get eaten, right, even if it's only one out of a hundred times. So we've evolved this innate capacity to see patterns, and even if they're false, evolutionarily speaking, it's still an advantage, or it has been an advantage to us.

Speaker 1: Yeah, exactly. So first, let me define the ten-dollar word: pareidolia is seeing patterns in random visual data. Probably the most common example of pareidolia is seeing faces. You look at the electrical plug on the wall and it looks like two eyes and a mouth, and you go, oh, there's a face there. That's pareidolia. And of course we see faces in everything, the man in the moon or some burn marks on a piece of toast or whatever. Okay, but I'm going to introduce another ten-dollar word, which is apophenia, which is the more generalized thing. So pareidolia is about visual patterns, but apophenia is about assigning meaning to random patterns of any kind. And I think that's really at the heart of conspiracy theories: our brain's natural inclination to find meaningful connections in random data.
Speaker 2: The thing that I don't get is how, when people come out the other end, they believe in lizard people and QAnon and that politicians drink blood. Those things aren't sensible answers or sensible patterns of solving any kind of puzzle. How does that work? How could someone go down a rabbit hole and believe that politicians drink the blood of children?

Speaker 1: The thing to appreciate is that it's all on a spectrum, right? So you can believe a wackier and wackier hypothesis about something, and eventually you find yourself pretty far out on something. But if it confirms the bias you have about this politician, and if it seems to answer some sort of, if it cracks a puzzle for you, I don't know what it is in this case, all these children have disappeared and these politicians seem to be getting healthier or whatever the thing is, then to your mind, that might make sense. But there's actually another thing that I think might shed some light here, which is that there is a social component to conspiracy theories. And I've been very interested in this issue. So actually I would say there are two aspects of the social component. One is that brains are very predisposed to have in-groups and out-groups. You've got the people that you know and like and trust, presumably the group that you are in, and then there are the others, who are in that out-group, and we trust them less; we think they're capable of anything, any sort of moral turpitude, and that allows for wackier and wackier hypotheses to be possible. But I think there's a second social component that's even more important, which is that if you are the one who comes up with a conspiracy theory, you get a lot of attention for that, or at least you think you do. You get to say, hey, look, I've pieced all these things together, and I know something that no one has ever known before in the history of humankind.
I mean, look, this kind of thing drives science, right? If you are Einstein and you explain this little loop-de-loop that the planet Mercury does with the theory of relativity, and suddenly everything is clear, and no one in the history of humankind has ever understood this before. What Einstein said is he felt like something snapped inside him. I mean, what a feeling, what a moment. And by the way, I think this social component matters even if you're just the guy repeating the story, not the one who came up with it. You get something out of telling it. Why? Because conspiracy theories are interesting, and you get attention at the party for saying this, or on social media. And the reason they are interesting is because they challenge someone else's internal model. They think they understood something: oh, I know how nine eleven happened, or how JFK got shot, and suddenly you tell them something and voila, there's a totally different framework for explaining the same thing, and that holds their attention. And even if you weren't the original Sherlock who came up with the theory, you get to glow in the credit of it, because you're the one spreading the word. And with a good story, it's like a gift certificate that doesn't get used up; it gives you really great social feedback.

Speaker 4: Okay, let's take a break from the craziness, just for a minute or two.

Speaker 5: And we're back.

Speaker 4: I know somebody who lost a relative at Jonestown. Jim Jones and the Temple, famously, nine hundred and eighteen people. The word is they committed suicide, they literally drank the Kool-Aid, arsenic in the grape Kool-Aid. But Jim Jones at the end had come up with this intricate conspiracy theory about how the US government was going to attack them. They were going to be murdered and tortured, their children would be taken away, and this was the only way out.
And in the end, as people have studied this more, probably only about, I'll just say half, a significant number, actually drank the Kool-Aid voluntarily. But a lot of them, when push came to shove, when somebody almost literally put a gun to their head and said, do you believe this conspiracy, more than half of them, or around half, didn't, you know? They went along with it for the social component, to be in the in-group, but when it was, like, my life's at stake, yeah, they knew it wasn't true. Deep in their hearts they knew it wasn't. They still drank it because somebody literally put a gun to their head, or a crossbow. And so these people did kill themselves because they were forced to, but it wasn't a mass suicide. A lot of them did it because they bought into it, and a lot of them did it even though they didn't really believe. In the end, it was a sham; in their inner hearts, they didn't believe in the conspiracy.

Speaker 1: I just want to address one point here that I think matters, which is that when it comes to believing in something, it's not that there's one true answer, as in, oh, I didn't really believe it, or I did. You have lots of different networks in your brain all running. The way I've described it in some of my books is that it's a team of rivals under the hood, and so part of you can believe something, and part of you doesn't believe it at all, and part of you thinks, hey, this may be really true. All those things can be running in your head at once. So it's hard to say that they didn't believe it at all and they knew it was a sham. It's possible that they were questioning back and forth: wow, is this true, is it not true? With a good mid-level conspiracy theory, ninety-nine percent of your cognitive mind says, okay, that is definitely not true. But then you think, hey, what if? You can partially entertain it.
Speaker 2: You talked about sort of truth and the relativism of truth, and that we each have our own sort of worldview in our own heads, that maybe there's not really unimpeachable truth. But there are things that are clearly not true, like, back to my original question, the view that there are lizard people running the world and running our politics. No one has ever seen a lizard person. So how is it that people can convince themselves of things that there's just no evidence for?

Speaker 1: Yeah, okay. So I think this is a really important question, because it shows a couple things. One is, first of all, the social component. If you hear about the lizard people running politics and you go and you tell someone else, there is some social component to, hey, I get to be the one telling you about this; even if it's unlikely to be true, there's still some social reward to get. But more importantly, our world has mental illness. About one percent of the population, for example, has schizophrenia, which means that they're divorced from reality, and they can have a thought like, lizard people are running society, and to them there's nothing strange about that. They feel like, okay, I know that's true. I don't know if you, or any of the listeners, have ever known someone, or had a loved one or just a friend or someone you talked to, with schizophrenia, but when somebody is delusional, they will believe whatever coinage their brain is coughing up. So in the same way that we believe our dreams, completely wacky, ridiculous dreams, while we're in them, that's exactly what it's like for a person with schizophrenia. I talked with one gentleman who was telling me that he had just had brunch with the president and advised him on all these things and so on. He was locked in a mental institution, and clearly this was not true, but he believed it. I came back and saw him about ten days later.
He was now not having hallucinations, and I asked him about that. I said, hey, what's your view on that now? And he said, oh, I guess that wasn't true. Now it feels like a dream to me. But when I was there ten days ago, I felt like that was true. So there's some interesting interaction between the one percent of the population that has schizophrenia and then other people who are maybe just adjacent to them. And so I have a suspicion this is a little bit, sometimes, how things can spread.

Speaker 4: So if I understand correctly, hate speech, of course, dictators and hate groups use these dehumanizing metaphors, and they switch on neural pathways, and these neural pathways bypass higher cognitive reasoning centers, and over time these mental patterns become entrenched and it becomes more and more difficult to get out of them. And of course for CIA officers this is something that's very important when we're dealing with people or societies that view the world this way. And I know John's a Russia expert, and looking at Russia now, I think there's a lot of this neural pathway change right now, where they simply don't view the world exactly the same way we do. And let me just follow it up with one quick, sort of personal story. So when I was eighteen, I went to Germany and I had a German girlfriend, and I met her mother, and she was nice, but she was from the Sudetenland, driven out after World War Two as a young child. And one night she had a couple of drinks too many, and she said to me, look, it was wrong what happened to the Jews, but with everything they did, they kind of had something coming to them. And I thought, she's never going to wash that out of her head. She's a well-educated sixty-year-old woman, and she is never going to change it, despite all the contradictory evidence and everything, and even though she's very happy living in a democratic West Germany.
How her views formed in her formative years was basically never going to change. The only way it was going to change was, like, she and her generation had to go.

Speaker 1: Here's what I would say. I've actually studied a lot about propaganda and dehumanization, and you're mostly right about what happens there, except, by the way, it's not pathways switched on by propaganda, it's pathways switched off. It turns out that you can start by asking the question of what is required for pro-social behavior. In other words, what is required for me to care about you and not want to see you get hurt? And what is required are very particular networks, mostly in the frontal lobe behind your forehead. These are networks that allow me to understand you as a fellow human. But it turns out that what propaganda does is it suppresses those networks, so that I come to understand some other group more like an object rather than another person. And this is the trick that all propaganda, across place and time, uses: you dehumanize by comparing them to, for example, animals or viruses. Because the interesting thing is, I've collected up, for example, propaganda posters from all over the world, and they all do this. So I don't know what you were thinking of, but whatever you were thinking of generalizes to everybody. That's the interesting part. So look, American World War One and World War Two posters showed the enemy as, like, a gorilla. Or I've actually collected quotations from, like, the Hutu and the Tutsi in Rwanda. The Hutu viewed them as rats. That's a common one across place and time, viewing them as rats or viruses, things like that. Okay, so the point is, what this does is it turns down these areas, these networks in the brain, that are required to see another person as a person. And by the way, there are lots of ways to turn this down. Moral pollution is one as well, where you associate somebody with something bad like pollution or viruses or whatever.
There are many ways to do this, but the point is you don't see them as a human, and that's the power of propaganda and dehumanization. And the reason I study this and give talks about it is because once you're aware of these tricks, it's easier to see them, and the next time it's, oh, this group is like a virus, or like rats, or whatever, you can think, okay, I've seen this trick before; I'm not going to allow that to happen. So clearly the problem is, as you're pointing out with this German woman, for example, once you come to believe this about in-groups and out-groups, it's not so easy to change. Now, people do change. There are really interesting examples of a guy who was in the KKK who changed his view, or a young woman who was raised in a very strict, kind of awful church environment where she was taught to hate all kinds of groups of people, and she's really changed her ways. Yeah, these things do happen. But to your point, it's tough. Once you've been taught how to categorize the world, those are not the kind of people I hang out with, and they're capable of any kind of devious behavior, that tends to stick.

Speaker 2: Do you think that social media, and the way it gets everybody in to find these communities of people who believe in these things, has made this worse for us today, or has it always been the same?

Speaker 1: I do not think that social media has anything to do with this, actually, for several reasons. One is that, and you guys must be experts in this part, when you look back through history, there is no moment or place where there haven't been conspiracy theories.
There was a great big fire that destroyed a lot of Rome in sixty-four CE, and there were all kinds of conspiracy theories about it, where people thought, hey, wait, I think the Emperor Nero was responsible for the fire because he wanted to build this new palace and, for that, clear land, and so consequently all these conspiracy theories blossomed.

Speaker 4: Nero blamed the Christians, by the way.

Speaker 1: Yeah, exactly, everyone, exactly. And probably it was some random spark from whatever, like, who knows, somebody's cow. Yeah, exactly. And the thing is, we'll never know, and of course there were no good forensic tools back then, so people then couldn't have known. You know, there were conspiracy theories. And by the way, this was the same thing behind the assassination of Julius Caesar, the death of Alexander the Great, or who actually wanted Socrates tried and executed. So as far back as we have written history, we have conspiracy theories. So, first of all, it's not a new thing. But more generally, the idea that social media reduces our need for proof, I think, might be a retrospective romanticization, to think that we ever used to care about proof. For example, we're all old enough to remember the pre-internet world pretty clearly, and one of the things everybody did at the time was mail pamphlets. You know, you subscribed and you got pamphlets, not to your inbox but in your physical mail, and they were the craziest stuff, you know. I mean, you could get American Nazi Party pamphlets or whatever, and I've read those, and the idea of proof or whatever is the last thing on anybody's mind there. It's just the same kind of stuff that you might find now.

Speaker 2: It may make it easier to weaponize for partisan or malign purposes. I mean, the fact that the Russians, for example, can pump stuff into our system. They did do that for decades and decades, the exact same thing.

Speaker 1: It just is.

Speaker 5: Easier to do it.
Speaker 1: That might be right. I do wonder about this, though. I was just talking to someone the other day who was actually making this argument, that the internet is responsible for the proliferation of conspiracy theories because it makes it so easy. And I said, okay, look, why don't you start a conspiracy theory, like that Joe Biden actually has an alien baby, and post it on your Twitter account, which has one hundred followers. Do you really think this is going to cause the conflagration of a conspiracy theory?

Speaker 2: You know, what the Russians can do is, they used to create them and try to push them, and usually they didn't catch. Sometimes they would. They made up that the AIDS virus was created by the Pentagon, for example, and that one caught and had an effect. Usually they don't. But one of the things they can do now is they can just monitor bad actors on Twitter, and what have you, on social media, and then use bots and things to pump that stuff. In other words, it takes this person who has one hundred followers and pumps it fifty million times into the system and hopes it catches. And they can do that over and over, and only a few of them have to catch to create a problem. But you're right, the fact that our brains take it in is the same as it's always been.

Speaker 1: So that's really interesting. I'm glad you brought that up, because what that suggests is that for a conspiracy theory to catch fire, it has to be a good one, in a sense. So if you post that Joe Biden has an alien baby from an alien mistress or something, the Russians could pump that all they want on their bot farms and it wouldn't catch, presumably.

Speaker 2: But that fits with our old world, right? So one of the things that intelligence services do is deception, right?
So deception is an effort to try to fool people into believing something that's not true, and of course that works best when you know what the other side wants to believe. The better you understand the other side's biases, the easier it is to fool them by giving them a piece of what they want to believe, wrapped in something that's not true.

Speaker 1: But even still, you have to make sure that it's a good conspiracy, one that's just on the edge of possibility.

Speaker 2: Let's take a break. We'll be right back. And we're back.

Speaker 4: I have a question about cognitive dissonance that in a way touches on my own family and how we process information. So Alex Jones, on his show, he goes on and he sells things like Super Male Vitality drops and lung-cleansing spray, with no FDA approval. Nobody knows what's in this stuff, and people buy millions of dollars of it. At the same time, often these very same people are mistrustful of the entire medical community, tens or even hundreds of thousands of medical professionals who have dedicated their lives to saving people, and whom many of them know, like they know their doctor or surgeon and things like that. And yet on COVID vaccines they don't buy it. It's like, no, the entire medical establishment is unethical, although the Super Male Vitality drops, they're fine, right, I'll put those in my body. And yet at the same time, their kid gets sick with cancer, and those same doctors who they think are unethical and don't trust come to them and say, I'm going to have to put poison in your kid's body, chemotherapy, just enough poison to kill the cancer, and it's going to make your kid really sick, but it could save them. And none of those people, I'll bet you, zero, say, whoa, I want to know exactly what's in that stuff. They're going to say, if you can save them, do it. So these views are conflicting and, I think, completely irreconcilable.
So how does the brain work, that it functions as a unified whole despite this, and with real consequences?

Speaker 1: The general story is that the brain is perfectly happy to hold contradictions in behavior all the time. I mentioned earlier that the brain is made up of a team of rivals, by which I mean you've got all these different networks that want different things at different times. Think of it like a neural parliament. In a parliament in a country, you've got all these different political parties with different drives. They want different things to happen. They all love their country; they just have different opinions on what the right moves are. And this is exactly what's going on in the brain. So for example, if I put some warm chocolate chip cookies in front of you, part of your brain wants to eat them; it's a rich energy source. Part of your brain says, don't eat it, you'll get fat. Part of your brain says, okay, I'll eat it, but I'll go to the gym tonight. You can argue with yourself, you can cuss at yourself, you can cajole yourself. Who is talking to whom here? It's all you, but it's different parts of you. And that's why we constantly hold contradictory information, or ways of being, in our heads. And it's easy to say you don't trust the medical system when you've got a cold or something, but when something has really gotten serious, then you might change your view on that.

Speaker 4: Is it often a response to anxiety, to fear? So if a society is unsettled, anxious, or made that way, does that cause more conspiracy theories? John, I mean, you know the Serbs really well. You spent a lot of time in Yugoslavia. I think Milosevic created anxiety among the Serb population that made them more likely to believe conspiracy theories.

Speaker 2: And again, it's very easy to use the other.
And so the thing in the former Yugoslavia is that the Serbs weren't the majority of the ethnic population, but they were the largest of the minorities. So they would say forty percent of the former Yugoslavia was Serb, and the Slovenes and the Croats and the Montenegrins and Kosovars and others were smaller groups. And so when the biggest group, the Serbs, claimed that they were the victims of those smaller groups coming after them, it scared the other groups, right? So the most powerful and biggest group was claiming to be the victim, and that therefore they could act with impunity. Milosevic was very good at that. He knew communism was falling apart, he knew that he had to grasp onto nationalism, and therefore he used that victimhood for himself, but probably without realizing that it was going to create fear in the other ethnic groups and create what became a nasty, horrible civil war.

Speaker 4: And so the question is, can you create the preconditions for people to believe conspiracy theories by making people afraid and making them anxious?

Speaker 1: Yeah, I think that's fascinating. On the individual level, one of the things that brains do all the time is threat detection. It's one of the most important jobs of the brain. And when you are sensing that there's trouble or something's happening, you've got this sort of special network in your brain, centered on the amygdala, which is an emergency control center that says, okay, look, there's some threat going on; I really need to rev everything up and operate with heightened suspicion and vigilance. And there's one related thing, which is another thing that brains do: agency detection. Hey, something just happened, was there somebody behind this? Was there some group, some person behind this? Is my phone missing because I put it somewhere stupid, or is there somebody behind my phone being missing here? And so these are natural things that brains are always doing.
And to your point, these things can be cranked up if the message from the government is, look, there are threats all around you, there are people doing shadowy things around you. I have a hypothesis that I don't think there's any way for me to prove at the moment, but I noticed, starting in about twenty twenty, right after COVID hit, conspiracy theories seemed to become more popular. Now, I'd be really curious, before I go on: what do you guys think? Do you think they actually became slightly more popular in twenty twenty?

Speaker 2: That's a good question. I think so. Was it a form of entertainment while we were staying at home? I don't know the answer to that.

Speaker 1: Okay, so I have a hypothesis, what in my lab I call an idiothesis, which is an idiotic hypothesis. But here's what I think it is. I mentioned earlier that your brain's job is to make an internal model of the world, and typically we're pretty good at that. We say, look, I know how politics work, how people act, that sort of thing. But what happened in March of twenty twenty is that suddenly all bets were off. Suddenly society shut down, and we didn't know what was going on. And what appears to me to happen is that the brain makes farther and farther reaches for an explanatory framework. And just as an example, I don't know if you can really put yourself back in March of twenty twenty, let's say late March or early April, but what I noticed is that people would read one article that said this is all going to be over within a week, and they said, okay, got it, I can repeat the arguments, I've got this. And then the very next article you read says, wow, this is going to go on for years and hundreds of thousands or millions will die, and you read it and you think, okay, that's convincing, okay, I got it, and you can repeat that.
And so I just noticed that for all 592 00:31:27,080 --> 00:31:29,040 Speaker 1: of us, we were going back and forth on our 593 00:31:29,080 --> 00:31:32,840 Speaker 1: frameworks for everything. And that's my idiothesis on why I 594 00:31:32,880 --> 00:31:37,440 Speaker 1: think conspiracy theories got more popular: our prediction 595 00:31:37,760 --> 00:31:42,440 Speaker 1: abilities were frayed by the pandemic and we just weren't 596 00:31:42,480 --> 00:31:45,880 Speaker 1: able to make good predictions anymore. Then it certainly seems 597 00:31:45,920 --> 00:31:47,800 Speaker 1: to me that if you are in a country and 598 00:31:47,840 --> 00:31:51,720 Speaker 1: you're trying to make the population more anxious and nervous and 599 00:31:51,760 --> 00:31:55,080 Speaker 1: crank up their threat detection, possibly their agency detection, another 600 00:31:55,200 --> 00:31:58,360 Speaker 1: piece would be just making it so that their brains 601 00:31:58,360 --> 00:32:00,880 Speaker 1: aren't good prediction machines and they feel like, wow, I 602 00:32:00,960 --> 00:32:03,440 Speaker 1: need a better framework to hang on to, and this 603 00:32:03,520 --> 00:32:05,760 Speaker 1: guy is offering me a framework that gives me a 604 00:32:05,800 --> 00:32:08,040 Speaker 1: really good, stable foundation. 605 00:32:08,640 --> 00:32:10,120 Speaker 4: Yeah, you're describing North Korea. 606 00:32:10,360 --> 00:32:13,600 Speaker 2: Yeah, quick question. We have talked a lot about how 607 00:32:14,120 --> 00:32:18,200 Speaker 2: the brain and societies get us into thinking about conspiracies. 608 00:32:18,200 --> 00:32:20,200 Speaker 2: How do we get into conspiracies, how do people get 609 00:32:20,200 --> 00:32:22,040 Speaker 2: caught up in them? How do we get out? 610 00:32:22,440 --> 00:32:25,000 Speaker 1: I'll tell you, I think the most important part is 611 00:32:25,120 --> 00:32:31,040 Speaker 1: making just a scientific assessment of the likelihoods of things. 612 00:32:31,040 --> 00:32:35,320 Speaker 1: All of us might get fooled someday, where somebody 613 00:32:35,600 --> 00:32:37,520 Speaker 1: is pushing a conspiracy theory and we think, oh, we know 614 00:32:37,560 --> 00:32:39,960 Speaker 1: what the right answer is. Maybe there are conspiracy theories 615 00:32:39,960 --> 00:32:42,360 Speaker 1: that happen in the world, tame ones maybe. 616 00:32:42,520 --> 00:32:46,160 Speaker 1: Who knows, but it probably makes sense to always make 617 00:32:46,200 --> 00:32:48,920 Speaker 1: sure that we're asking what are the chances. So I'll 618 00:32:48,920 --> 00:32:52,080 Speaker 1: give an example. Take the moon landing. Apparently there are still 619 00:32:52,080 --> 00:32:54,000 Speaker 1: some people who say it was done in a production 620 00:32:54,080 --> 00:32:56,040 Speaker 1: studio and we didn't actually land on the Moon. Now, 621 00:32:56,280 --> 00:33:00,360 Speaker 1: obviously that's not true, because I actually knew Neil Armstrong, 622 00:33:00,400 --> 00:33:02,000 Speaker 1: and one of the things he had done was put 623 00:33:02,040 --> 00:33:03,720 Speaker 1: a mirror on the Moon so you could bounce a 624 00:33:03,760 --> 00:33:06,520 Speaker 1: laser off it, so you could do ranging and find 625 00:33:06,520 --> 00:33:07,880 Speaker 1: the exact distance to the Moon. So you can 626 00:33:07,920 --> 00:33:11,000 Speaker 1: prove to yourself it's not fake.
But anyway, let's imagine that 627 00:33:11,120 --> 00:33:14,120 Speaker 1: you thought, I wonder if the moon landing is a conspiracy. 628 00:33:14,280 --> 00:33:16,840 Speaker 1: The key is how many people would have had to 629 00:33:16,880 --> 00:33:21,320 Speaker 1: be involved: presumably everybody at NASA, the president, the thousands 630 00:33:21,360 --> 00:33:23,840 Speaker 1: of news broadcasters, whatever. Lots of people would have 631 00:33:23,920 --> 00:33:26,880 Speaker 1: to be involved in that. And the question is how 632 00:33:26,960 --> 00:33:30,240 Speaker 1: long can a secret be kept? And the fact is 633 00:33:30,680 --> 00:33:32,520 Speaker 1: that if a bunch of people are holding onto a 634 00:33:32,560 --> 00:33:35,520 Speaker 1: secret, every day there's some chance that it's going to spill out, 635 00:33:35,560 --> 00:33:39,120 Speaker 1: either because someone gets drunk or they have a religious 636 00:33:39,120 --> 00:33:41,680 Speaker 1: conversion or, you know, a spasm of guilt or whatever, and 637 00:33:41,720 --> 00:33:43,880 Speaker 1: they decide, I'm going to do this. But also, it 638 00:33:43,920 --> 00:33:46,800 Speaker 1: should be noted, from a game theory perspective there's a 639 00:33:46,840 --> 00:33:50,720 Speaker 1: lot of reward if you're the defector. If you think, okay, look, 640 00:33:50,840 --> 00:33:52,920 Speaker 1: somebody on this team of one hundred people is going 641 00:33:52,920 --> 00:33:54,760 Speaker 1: to defect at some point, and I'm going to end 642 00:33:54,800 --> 00:33:56,000 Speaker 1: up in jail for the rest of my life. But 643 00:33:56,000 --> 00:33:58,440 Speaker 1: if I'm the defector, then I get the New York 644 00:33:58,480 --> 00:34:01,560 Speaker 1: Times bestselling book, and I get to be on CNN 645 00:34:01,640 --> 00:34:03,600 Speaker 1: every day talking about this thing. And so there's all 646 00:34:03,640 --> 00:34:06,160 Speaker 1: kinds of rewards for me, the defector. 647 00:34:06,080 --> 00:34:08,080 Speaker 2: And I was in CIA for thirty years. If I knew that we 648 00:34:08,120 --> 00:34:10,719 Speaker 2: did the JFK assassination, like some people claim, I would 649 00:34:10,760 --> 00:34:13,200 Speaker 2: be out on the rooftop screaming. 650 00:34:12,960 --> 00:34:16,120 Speaker 1: Exactly. You'd write a book, you'd be super famous for it. Yeah, exactly. 651 00:34:16,360 --> 00:34:17,960 Speaker 1: Let's imagine you say, okay, well, if these are a 652 00:34:18,000 --> 00:34:21,760 Speaker 1: bunch of really tough people who aren't going to defect, 653 00:34:21,880 --> 00:34:24,520 Speaker 1: you know, just somebody getting drunk, like the woman from 654 00:34:24,560 --> 00:34:27,000 Speaker 1: Germany or whatever. What are the chances each day? And 655 00:34:27,040 --> 00:34:29,600 Speaker 1: if you multiply that over a year, over twenty years, 656 00:34:30,120 --> 00:34:33,120 Speaker 1: at some point you have to allow that mathematically, this 657 00:34:33,160 --> 00:34:36,080 Speaker 1: isn't really making sense anymore.
Also, I think that even 658 00:34:36,120 --> 00:34:40,000 Speaker 1: if you imagine some really tough guys who pulled off 659 00:34:40,000 --> 00:34:43,400 Speaker 1: a moon landing conspiracy, their son or their Gen Z 660 00:34:43,800 --> 00:34:46,839 Speaker 1: grandchild is going to say, oh my god, look 661 00:34:46,840 --> 00:34:49,280 Speaker 1: what I found in the attic. And so anyway, 662 00:34:49,600 --> 00:34:52,960 Speaker 1: I think that it's probably useful to just take a 663 00:34:53,040 --> 00:34:56,879 Speaker 1: scientific lens on any conspiracy theory and ask what are 664 00:34:56,960 --> 00:34:57,600 Speaker 1: the chances. 665 00:34:57,880 --> 00:35:00,920 Speaker 2: So it seems you're saying that data and logic can 666 00:35:00,960 --> 00:35:03,160 Speaker 2: help somebody pull out of that. But data and logic 667 00:35:03,160 --> 00:35:05,560 Speaker 2: didn't get them into that viewpoint. I could come to 668 00:35:05,600 --> 00:35:08,320 Speaker 2: them with logic, but I don't know that logic works 669 00:35:08,320 --> 00:35:10,480 Speaker 2: on them unless all of a sudden they 670 00:35:10,520 --> 00:35:12,839 Speaker 2: are ready to get out of it. And I don't 671 00:35:12,840 --> 00:35:14,240 Speaker 2: know how people get to that phase. 672 00:35:14,760 --> 00:35:16,920 Speaker 1: So I agree with that. For all these points 673 00:35:16,920 --> 00:35:19,480 Speaker 1: that we touched on during this episode, I think it's 674 00:35:19,600 --> 00:35:21,359 Speaker 1: unlikely that you'll be able to talk someone out 675 00:35:21,360 --> 00:35:23,960 Speaker 1: of it. Why? There's the social component, which is they're 676 00:35:23,960 --> 00:35:27,279 Speaker 1: getting something out of it. And if you engage with 677 00:35:27,320 --> 00:35:30,200 Speaker 1: them and say, hey, let's argue this out, whatever, it's 678 00:35:30,320 --> 00:35:32,600 Speaker 1: joyful to them. They're having a great time arguing with you, 679 00:35:32,680 --> 00:35:35,120 Speaker 1: and they leave feeling smarter, and they say, oh, I 680 00:35:35,160 --> 00:35:37,640 Speaker 1: talked to these guys who've got this podcast, and I 681 00:35:37,680 --> 00:35:39,319 Speaker 1: told them, you know, I was able to hold my 682 00:35:39,440 --> 00:35:40,279 Speaker 1: position on this. 683 00:35:40,800 --> 00:35:42,399 Speaker 4: It was really fun talking to you, man. 684 00:35:42,520 --> 00:35:44,600 Speaker 2: Yeah, David, thank you so much for spending time with us. 685 00:35:44,640 --> 00:35:47,399 Speaker 2: It was really enlightening and we can't thank you enough. 686 00:35:47,520 --> 00:35:49,680 Speaker 1: It was a blast talking with you guys. Thanks very much. 687 00:35:50,360 --> 00:35:53,040 Speaker 2: We will see you next time on Mission Implausible. 688 00:35:58,239 --> 00:36:03,320 Speaker 6: Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Seipher, 689 00:36:03,600 --> 00:36:08,000 Speaker 6: and Jonathan Sterner. The associate producer is Rachel Harner. Mission 690 00:36:08,040 --> 00:36:12,080 Speaker 6: Implausible is a production of Honorable Mention and Abominable Pictures 691 00:36:12,120 --> 00:36:13,480 Speaker 6: for iHeart Podcasts.