1 00:00:15,082 --> 00:00:17,041 Speaker 1: John and I are on break. Now we're on a 2 00:00:17,082 --> 00:00:21,081 Speaker 1: secret mission, and before all-new Mission Implausible episodes 3 00:00:21,122 --> 00:00:23,642 Speaker 1: come out this fall. But for now, we'll bring you 4 00:00:23,682 --> 00:00:26,842 Speaker 1: one of our favorite past episodes, and we'll soon be 5 00:00:26,922 --> 00:00:29,162 Speaker 1: launching our YouTube channel. See you there. 6 00:00:30,322 --> 00:00:33,442 Speaker 2: I'm John Cipher and I'm Jerry O'Shea. I was a 7 00:00:33,482 --> 00:00:36,762 Speaker 2: CIA officer stationed around the world in high threat posts 8 00:00:36,842 --> 00:00:38,522 Speaker 2: in Europe, Russia, and in Asia. 9 00:00:38,562 --> 00:00:41,642 Speaker 1: And I served in Africa, Asia, Europe, the Middle East 10 00:00:41,762 --> 00:00:45,842 Speaker 1: and in war zones. We sometimes created conspiracies to deceive 11 00:00:45,882 --> 00:00:46,802 Speaker 1: our adversaries. 12 00:00:47,042 --> 00:00:50,722 Speaker 2: Now we're going to use our expertise to deconstruct conspiracy 13 00:00:50,762 --> 00:00:52,522 Speaker 2: theories large and small. 14 00:00:52,802 --> 00:00:55,242 Speaker 1: Could they be true? Or are we being manipulated? 15 00:00:55,802 --> 00:00:57,842 Speaker 2: This is Mission Implausible. 16 00:01:00,562 --> 00:01:04,562 Speaker 1: We're three experts: two experts in intelligence, one in journalism, 17 00:01:04,842 --> 00:01:07,322 Speaker 1: and we're going to talk about the death of expertise 18 00:01:07,322 --> 00:01:09,722 Speaker 1: and why people like us don't matter anymore. 19 00:01:09,842 --> 00:01:14,242 Speaker 3: Apparently, it sure does feel like all the things that 20 00:01:14,282 --> 00:01:19,482 Speaker 3: I devoted my life to have lost meaning. And 21 00:01:19,562 --> 00:01:22,922 Speaker 3: I'm excited for today's guest, Tom Nichols.
He has been 22 00:01:23,002 --> 00:01:28,202 Speaker 3: monitoring this forever, this collapse of respect for expertise and 23 00:01:28,241 --> 00:01:30,122 Speaker 3: what it means to live in a country where that 24 00:01:30,241 --> 00:01:31,202 Speaker 3: just doesn't win the day. 25 00:01:31,521 --> 00:01:34,441 Speaker 2: Yeah, Tom writes for The Atlantic and he's on social media, 26 00:01:34,482 --> 00:01:36,401 Speaker 2: but he wrote a book called The Death of Expertise. 27 00:01:36,562 --> 00:01:39,241 Speaker 2: It's really a good study of how we've gotten here. 28 00:01:39,241 --> 00:01:42,522 Speaker 2: But we're seeing his thesis playing out at the highest levels now. 29 00:01:42,801 --> 00:01:44,881 Speaker 2: It's one thing when your uncle thinks he knows better 30 00:01:44,922 --> 00:01:47,522 Speaker 2: because he was on Google, but we now have conspiracy 31 00:01:47,602 --> 00:01:50,242 Speaker 2: theorists and unqualified people at the top of government. 32 00:01:50,642 --> 00:01:51,162 Speaker 4: It is true. 33 00:01:51,202 --> 00:01:54,242 Speaker 3: I was thinking about my uncle, who when I was 34 00:01:54,282 --> 00:01:56,282 Speaker 3: a kid, I was like, how does he know everything? 35 00:01:56,362 --> 00:01:59,082 Speaker 3: That guy knows everything. Like whatever topic came up, he 36 00:01:59,162 --> 00:02:02,522 Speaker 3: seemed to have an opinion on it. And then at some point 37 00:02:02,682 --> 00:02:05,522 Speaker 3: I came across Newsweek. And this is back in the 38 00:02:05,562 --> 00:02:09,082 Speaker 3: eighties, when Newsweek was like a major deal. I realized he 39 00:02:09,242 --> 00:02:10,162 Speaker 3: just read Newsweek. 40 00:02:10,242 --> 00:02:13,002 Speaker 1: That reminds me of my grandpa. My grandmother used to say, 41 00:02:13,082 --> 00:02:15,561 Speaker 1: he's not always right, but he's always certain. 42 00:02:18,642 --> 00:02:21,242 Speaker 2: I have a favorite cartoon.
It's like this old guy 43 00:02:21,322 --> 00:02:23,602 Speaker 2: sitting at a desk with a computer and his wife's 44 00:02:23,602 --> 00:02:25,562 Speaker 2: standing at the other side of the room, and he says, honey, 45 00:02:25,561 --> 00:02:28,242 Speaker 2: come look, I found some information all the world's top 46 00:02:28,322 --> 00:02:31,762 Speaker 2: scientists and doctors missed. Like, people 47 00:02:31,762 --> 00:02:34,642 Speaker 2: think that they don't need experts anymore, they can just 48 00:02:34,642 --> 00:02:35,722 Speaker 2: do it all themselves. 49 00:02:36,282 --> 00:02:40,441 Speaker 1: I might have one word for you: Smoot. Reed Smoot. I 50 00:02:40,442 --> 00:02:42,882 Speaker 1: don't know why I know this, but he was elected 51 00:02:42,922 --> 00:02:46,282 Speaker 1: to the Senate in nineteen oh eight, and he was 52 00:02:46,442 --> 00:02:49,482 Speaker 1: so unqualified. He was one of twenty seven children from 53 00:02:49,482 --> 00:02:52,802 Speaker 1: this Mormon polygamous father, and he only got voted in because the 54 00:02:52,841 --> 00:02:56,121 Speaker 1: Mormon community was voting for him. But anyway, he was so unqualified 55 00:02:56,522 --> 00:02:59,842 Speaker 1: that the Senate had a special commission they put together 56 00:03:00,322 --> 00:03:04,482 Speaker 1: that determined he was uniquely unqualified for this job. But 57 00:03:04,522 --> 00:03:07,402 Speaker 1: they couldn't get him kicked out because the Republicans needed 58 00:03:07,401 --> 00:03:10,082 Speaker 1: the votes. And Smoot came along and he gave us 59 00:03:10,242 --> 00:03:16,962 Speaker 1: possibly the worst bit of legislation ever in American history to this day. Yeah, yeah, 60 00:03:17,282 --> 00:03:21,362 Speaker 1: the Smoot-Hawley Act, that made the recession a depression 61 00:03:21,401 --> 00:03:23,442 Speaker 1: that ended up with thirty percent of banks going out 62 00:03:23,482 --> 00:03:26,922 Speaker 1: of business, fifty percent unemployment.
But this guy was like, 63 00:03:27,242 --> 00:03:29,442 Speaker 1: he was not an expert, but he was in charge. 64 00:03:29,522 --> 00:03:32,482 Speaker 3: Here's what I love, though. We as human beings learn 65 00:03:33,002 --> 00:03:36,002 Speaker 3: through time. So we now know that one hundred years ago, 66 00:03:36,522 --> 00:03:41,002 Speaker 3: some very ignorant people proposed massive tariffs to somehow help 67 00:03:41,082 --> 00:03:43,362 Speaker 3: the US economy without ever explaining how it would help 68 00:03:43,402 --> 00:03:45,802 Speaker 3: the US economy. No one would ever propose that again, 69 00:03:46,242 --> 00:03:50,402 Speaker 3: because we now understand. So we should listen 70 00:03:50,482 --> 00:03:52,842 Speaker 3: to experts, right? Yeah, you should listen to experts. 71 00:03:53,002 --> 00:03:55,202 Speaker 1: We have two CIA guys here and I just want 72 00:03:55,242 --> 00:03:58,482 Speaker 1: to throw one out on expertise. John, you don't know who this is, 73 00:03:58,522 --> 00:04:01,962 Speaker 1: but we had a very senior guy in CIA. This 74 00:04:02,002 --> 00:04:05,042 Speaker 1: is like decades ago, and he got involved at the 75 00:04:05,082 --> 00:04:08,402 Speaker 1: senior-most level on the operations side, although he had almost 76 00:04:08,442 --> 00:04:11,922 Speaker 1: no experience. And there was this operation, I don't remember 77 00:04:12,002 --> 00:04:13,282 Speaker 1: what was going on. I think it was someplace. 78 00:04:13,402 --> 00:04:15,762 Speaker 1: It wasn't Cairo, but let's just say it was. And 79 00:04:15,802 --> 00:04:19,282 Speaker 1: they're planning out exactly where they're going to do this thing, right? 80 00:04:19,322 --> 00:04:21,642 Speaker 1: They're going to break into some embassy or something, really, 81 00:04:21,842 --> 00:04:25,442 Speaker 1: and he's looking at the map and he starts giving directions.
82 00:04:25,882 --> 00:04:28,282 Speaker 1: He's never been to Cairo, he has no idea what 83 00:04:28,322 --> 00:04:30,921 Speaker 1: the traffic's like or what the police are like, and he starts giving detailed 84 00:04:30,962 --> 00:04:33,482 Speaker 1: instructions on like where people are supposed to stand and 85 00:04:33,602 --> 00:04:35,562 Speaker 1: how many there are supposed to be, and then in the 86 00:04:35,562 --> 00:04:38,242 Speaker 1: middle of it he stops and goes, maybe I shouldn't 87 00:04:38,282 --> 00:04:41,002 Speaker 1: be doing this, right, I've never been there. And everybody 88 00:04:41,082 --> 00:04:43,042 Speaker 1: sort of like didn't want to say, no shit, man, 89 00:04:43,202 --> 00:04:45,522 Speaker 1: but they all went, well, sir, we appreciate your input. 90 00:04:45,642 --> 00:04:48,122 Speaker 1: And they totally disregarded everything he said, and it went 91 00:04:48,162 --> 00:04:51,042 Speaker 1: through the hallways like crazy. It was like, never let 92 00:04:51,042 --> 00:04:54,482 Speaker 1: this senior guy near your operation, because, like my grandfather, 93 00:04:54,642 --> 00:04:56,962 Speaker 1: he may not be right, but he's freaking certain. 94 00:04:57,002 --> 00:04:59,922 Speaker 3: When we set up Planet Money, when I created Planet Money, 95 00:05:00,002 --> 00:05:01,722 Speaker 3: this was in two thousand and eight, right after the 96 00:05:01,762 --> 00:05:05,162 Speaker 3: financial crisis, and we felt that many of the experts 97 00:05:05,202 --> 00:05:07,522 Speaker 3: led us astray, and so we didn't want the voice 98 00:05:07,522 --> 00:05:10,842 Speaker 3: of the show to be that kind of implied voice 99 00:05:10,842 --> 00:05:13,482 Speaker 3: of certainty that you heard on a lot of business reporting.
100 00:05:13,562 --> 00:05:16,122 Speaker 4: The Dow Jones fell today three point eight percent due 101 00:05:16,122 --> 00:05:18,522 Speaker 4: to late stage trading and blah blah blah, which is 102 00:05:18,522 --> 00:05:21,842 Speaker 4: always bs, and so we deliberately worked on this voice 103 00:05:21,842 --> 00:05:25,322 Speaker 4: that was more, here's this thing that happened, we're figuring 104 00:05:25,322 --> 00:05:28,642 Speaker 4: it out. So we didn't want to be falsely certain. 105 00:05:28,962 --> 00:05:31,762 Speaker 4: But that doesn't mean then just anything goes. Like, I 106 00:05:31,802 --> 00:05:35,082 Speaker 4: remember in Iraq one story I did that first year 107 00:05:35,162 --> 00:05:38,082 Speaker 4: of the presence. There were all these newspapers that appeared, 108 00:05:38,082 --> 00:05:40,402 Speaker 4: and I did a story about one of the more 109 00:05:40,442 --> 00:05:43,642 Speaker 4: popular new newspapers, and I went in and said, basically, 110 00:05:43,682 --> 00:05:45,682 Speaker 4: I found a polite way to say, every article is 111 00:05:45,722 --> 00:05:48,882 Speaker 4: just a rumor that someone heard, usually a conspiracy. And 112 00:05:48,922 --> 00:05:51,762 Speaker 4: he said, yes, we're a free country now, we print 113 00:05:51,922 --> 00:05:54,082 Speaker 4: all the rumors and the reader can decide. 114 00:05:54,402 --> 00:05:57,042 Speaker 1: So I have one more word for you, Adam, and 115 00:05:57,122 --> 00:06:01,002 Speaker 1: that is Dunning-Kruger, which is the cognitive bias. 116 00:06:01,122 --> 00:06:01,282 Speaker 2: Right.
117 00:06:01,322 --> 00:06:05,082 Speaker 1: It means the less you know, the more certain you are, 118 00:06:05,602 --> 00:06:08,282 Speaker 1: and the more you know, the less certain you are, 119 00:06:08,362 --> 00:06:12,322 Speaker 1: because you understand it's complicated, right? As opposed to like tariffs, like, 120 00:06:12,362 --> 00:06:14,162 Speaker 1: oh well, just put them on and everything will be great, 121 00:06:14,162 --> 00:06:17,602 Speaker 1: because you don't know enough to realize it's complicated. Absolutely. 122 00:06:17,642 --> 00:06:19,682 Speaker 4: All right, let's talk to an expert on the death 123 00:06:19,682 --> 00:06:20,402 Speaker 4: of expertise. 124 00:06:21,882 --> 00:06:25,282 Speaker 2: All right, we're lucky today we have Tom Nichols. Tom 125 00:06:25,402 --> 00:06:28,282 Speaker 2: is a recently retired professor from the Naval War College, 126 00:06:28,482 --> 00:06:31,442 Speaker 2: where he taught national security. He's better known these days 127 00:06:31,442 --> 00:06:34,082 Speaker 2: as a writer at The Atlantic and a popular curmudgeon 128 00:06:34,122 --> 00:06:37,322 Speaker 2: on television and social media, with a particular skill for 129 00:06:37,402 --> 00:06:40,122 Speaker 2: riling up the crazies. He's the author of several books 130 00:06:40,122 --> 00:06:43,042 Speaker 2: on Russia and nuclear weapons. Lastly, I learned when preparing 131 00:06:43,082 --> 00:06:45,002 Speaker 2: for this episode that Tom is a five 132 00:06:45,082 --> 00:06:49,482 Speaker 2: time Jeopardy champion.
WHOA, yeah, man, I've been working on 133 00:06:49,522 --> 00:06:52,242 Speaker 2: some pieces related to how the reelection of President Trump 134 00:06:52,562 --> 00:06:55,082 Speaker 2: might impact the intelligence community, and I keep coming back 135 00:06:55,122 --> 00:06:58,041 Speaker 2: to something former CIA and NSA Director General Michael 136 00:06:58,082 --> 00:07:01,562 Speaker 2: Hayden wrote about the incompatibility of intelligence and the MAGA 137 00:07:01,602 --> 00:07:05,762 Speaker 2: embrace of post-truth politics. You wrote about much of 138 00:07:05,762 --> 00:07:07,922 Speaker 2: this in The Death of Expertise. Can you help us 139 00:07:08,002 --> 00:07:12,682 Speaker 2: understand why people so often embrace conspiracy theories rather than facts? Oh, 140 00:07:12,762 --> 00:07:15,602 Speaker 5: yeah. Conspiracy theories are a subset of the whole death 141 00:07:15,602 --> 00:07:18,682 Speaker 5: of expertise problem. I mean, let me 142 00:07:18,722 --> 00:07:21,042 Speaker 5: start with the most innocent explanation, which is that people 143 00:07:21,082 --> 00:07:23,682 Speaker 5: are just inundated with information. You know, none of our 144 00:07:23,682 --> 00:07:26,602 Speaker 5: brains are equipped to handle petabytes of information. We may 145 00:07:26,642 --> 00:07:29,722 Speaker 5: be highly evolved, but you know, we still 146 00:07:29,762 --> 00:07:33,242 Speaker 5: have brains that sit inside our skulls that literally can't 147 00:07:33,282 --> 00:07:36,482 Speaker 5: take in that much information. And so what people do 148 00:07:36,562 --> 00:07:41,602 Speaker 5: is they start using internal heuristics and relying on prior 149 00:07:41,722 --> 00:07:44,442 Speaker 5: knowledge and also using kind of a sorting function to 150 00:07:44,482 --> 00:07:48,122 Speaker 5: try and simplify everything.
Because once you have to decide 151 00:07:48,122 --> 00:07:50,282 Speaker 5: what's true, then you spend all day trying to figure 152 00:07:50,282 --> 00:07:52,242 Speaker 5: out what's true, and so you just start taking the 153 00:07:52,282 --> 00:07:55,922 Speaker 5: things that are interesting or that pique your interest. That's 154 00:07:56,002 --> 00:07:58,562 Speaker 5: kind of a way that people can fall down rabbit 155 00:07:58,562 --> 00:08:01,122 Speaker 5: holes innocently, because they're like, look, I don't really know 156 00:08:01,162 --> 00:08:04,882 Speaker 5: what's true about COVID, and I saw this thing about 157 00:08:04,922 --> 00:08:08,122 Speaker 5: how it was released by Bill Gates so you could 158 00:08:08,122 --> 00:08:10,282 Speaker 5: stick nanobots in our heads. You know, I don't 159 00:08:10,322 --> 00:08:12,082 Speaker 5: know what's true and I don't know what's not true. 160 00:08:12,442 --> 00:08:16,082 Speaker 5: There's a bigger problem, though, that conspiracy theories are 161 00:08:16,122 --> 00:08:20,802 Speaker 5: concentrated in time. They tend to rise after terrible events like 162 00:08:20,842 --> 00:08:24,082 Speaker 5: a pandemic, like an assassination, like a world war. But 163 00:08:24,242 --> 00:08:26,722 Speaker 5: there's also a problem that they tend to arise in 164 00:08:26,882 --> 00:08:31,002 Speaker 5: decadent and bored societies. And by decadent I mean a 165 00:08:31,002 --> 00:08:34,921 Speaker 5: lot of leisure time, the collapse of kind of moral guardrails, 166 00:08:35,002 --> 00:08:38,242 Speaker 5: this sort of erosion of common sense, of trust in 167 00:08:38,282 --> 00:08:43,562 Speaker 5: other people, very self absorbed, very narcissistic, very hedonistic. And 168 00:08:43,722 --> 00:08:48,202 Speaker 5: conspiracy theories are really attractive because they place you at 169 00:08:48,202 --> 00:08:51,282 Speaker 5: the center of a great drama.
They place you at 170 00:08:51,282 --> 00:08:54,202 Speaker 5: the center of your own Jason Bourne movie. You know, 171 00:08:54,322 --> 00:08:57,882 Speaker 5: you sheeple, you don't get it that the reason I'm 172 00:08:57,882 --> 00:09:01,562 Speaker 5: buying all these bullshit gold coins from the guy on 173 00:09:01,682 --> 00:09:06,242 Speaker 5: TV is because I know and you don't. I am 174 00:09:06,282 --> 00:09:10,362 Speaker 5: one of the elect who knows that the American dollar 175 00:09:10,442 --> 00:09:13,482 Speaker 5: is going to be replaced by the Amero, a North 176 00:09:13,522 --> 00:09:17,162 Speaker 5: American currency that will be worthless. And when you poor 177 00:09:17,242 --> 00:09:20,402 Speaker 5: bastards are coming to me asking for one of my 178 00:09:21,082 --> 00:09:26,882 Speaker 5: gold plated buffalo head recreations, you'll be sorry. I am 179 00:09:26,922 --> 00:09:29,562 Speaker 5: in the know and powerful. You are weak and stupid. 180 00:09:30,242 --> 00:09:33,122 Speaker 5: And that's a very tempting thing for people who feel 181 00:09:33,162 --> 00:09:37,122 Speaker 5: like events are out of control. It's simple yet complicated, 182 00:09:37,202 --> 00:09:39,802 Speaker 5: and it makes you feel smart even though you're 183 00:09:39,842 --> 00:09:44,002 Speaker 5: actually very confused. The emergence of a lot of conspiracy 184 00:09:44,042 --> 00:09:46,282 Speaker 5: theories at the same time is really the sign of 185 00:09:46,282 --> 00:09:49,122 Speaker 5: a society in decline. When I worked in the Senate, 186 00:09:49,442 --> 00:09:51,242 Speaker 5: I remember the day I got a call and they said, oh, 187 00:09:51,282 --> 00:09:53,802 Speaker 5: you've got a constituent on line one, and like I took the call, 188 00:09:53,842 --> 00:09:55,922 Speaker 5: and, uh, yes, I'm, you know, calling. I have a 189 00:09:55,922 --> 00:09:57,762 Speaker 5: foreign policy question.
I want to know about the senator's 190 00:09:57,802 --> 00:10:02,842 Speaker 5: relationship to the Trilateralists and the Bilderbergers, and the, you know, 191 00:10:03,562 --> 00:10:05,042 Speaker 5: I can't even remember what the other one was. There was 192 00:10:05,042 --> 00:10:07,002 Speaker 5: always one other that used to come up. I think it was 193 00:10:07,042 --> 00:10:09,082 Speaker 5: the Council on Foreign Relations. I think that one he was 194 00:10:09,082 --> 00:10:09,642 Speaker 5: a member of. 195 00:10:10,042 --> 00:10:11,442 Speaker 1: Not the Illuminati? 196 00:10:11,562 --> 00:10:13,402 Speaker 5: We didn't get the Illuminati calls, but we got a 197 00:10:13,442 --> 00:10:15,602 Speaker 5: lot of the weird stuff. And that leads me to 198 00:10:15,642 --> 00:10:19,522 Speaker 5: my third thing. The conspiracy theories become mainstreamed because now, 199 00:10:19,882 --> 00:10:23,042 Speaker 5: unlike in previous times, there are people who can really 200 00:10:23,082 --> 00:10:25,242 Speaker 5: make money off of it. There are people in right 201 00:10:25,282 --> 00:10:28,202 Speaker 5: wing media and in the right wing ecosystem who can 202 00:10:28,242 --> 00:10:31,002 Speaker 5: play the, well, we're just asking questions game, and of 203 00:10:31,002 --> 00:10:33,402 Speaker 5: course you're gonna click. You're gonna sit there and stay 204 00:10:33,442 --> 00:10:36,202 Speaker 5: on that channel. You're gonna stay glued to your talk 205 00:10:36,322 --> 00:10:40,402 Speaker 5: radio station because it's interesting. Because, let's face it, all 206 00:10:40,442 --> 00:10:43,722 Speaker 5: of us have worked for the federal government in various 207 00:10:43,722 --> 00:10:47,202 Speaker 5: capacities in national defense and intelligence, places that you'd think 208 00:10:47,242 --> 00:10:51,882 Speaker 5: would be pretty sexy.
But my description of ninety percent 209 00:10:51,882 --> 00:10:54,362 Speaker 5: of government work, even when I was in DC, when 210 00:10:54,362 --> 00:10:56,002 Speaker 5: I was on the Hill, when I was, you know, 211 00:10:56,082 --> 00:10:58,842 Speaker 5: consulting with the agency, when I was working for the Pentagon, 212 00:10:59,322 --> 00:11:03,362 Speaker 5: ninety percent of it's boring. It's boring. It's incredibly boring. 213 00:11:03,402 --> 00:11:07,482 Speaker 5: Government work is tedious and detail oriented. You know, just 214 00:11:07,522 --> 00:11:09,722 Speaker 5: before we came on, we were all talking about, you 215 00:11:09,762 --> 00:11:12,922 Speaker 5: know, what some of these Trump nominees will encounter on 216 00:11:12,962 --> 00:11:15,282 Speaker 5: their first day in office, and it's gonna be some 217 00:11:15,362 --> 00:11:17,362 Speaker 5: guy with an iPad going, all right, so you need 218 00:11:17,402 --> 00:11:19,042 Speaker 5: to sign off on this thing. And the budget's going 219 00:11:19,082 --> 00:11:21,202 Speaker 5: to be due on here, and you've got fourteen meetings, 220 00:11:21,202 --> 00:11:23,602 Speaker 5: and you got HR and you got the guy from 221 00:11:23,682 --> 00:11:27,602 Speaker 5: downstairs, and, uh, it's boring. So rather 222 00:11:27,682 --> 00:11:30,802 Speaker 5: than try to explain to American citizens the ins and 223 00:11:30,842 --> 00:11:34,282 Speaker 5: outs of how things actually happen in this country, why 224 00:11:34,282 --> 00:11:36,682 Speaker 5: would you do that? Just sit there and say, you know, 225 00:11:36,922 --> 00:11:39,882 Speaker 5: I'm just asking questions. But it sure seems to me like 226 00:11:39,962 --> 00:11:43,362 Speaker 5: those vaccines seem to have a lot of, you know, 227 00:11:44,002 --> 00:11:47,602 Speaker 5: those Star Wars midi-chlorians in them.
And you get people 228 00:11:47,762 --> 00:11:50,722 Speaker 5: who want to be entertained listening to you instead of 229 00:11:50,762 --> 00:11:53,642 Speaker 5: people who, like our parents used to do, snap open 230 00:11:53,642 --> 00:11:55,882 Speaker 5: the newspaper at the end of a long day and say, look, 231 00:11:55,922 --> 00:11:57,482 Speaker 5: tell me stuff I need to know so I can 232 00:11:57,482 --> 00:11:58,882 Speaker 5: go have a beer and relax. 233 00:11:59,322 --> 00:12:01,362 Speaker 4: Let's take a quick break. We'll be right back. 234 00:12:08,602 --> 00:12:11,722 Speaker 1: These people are obviously not our audience, because they are listening to 235 00:12:11,882 --> 00:12:15,881 Speaker 1: the expertise on this podcast. But I'm interested in what you 236 00:12:15,882 --> 00:12:20,042 Speaker 1: said about trust. So my mom got really sick ten 237 00:12:20,122 --> 00:12:23,042 Speaker 1: years ago and died of this terrible disease, and we went 238 00:12:23,082 --> 00:12:28,242 Speaker 1: to doctors and I trusted them, right? Who else am 239 00:12:28,282 --> 00:12:30,922 Speaker 1: I gonna trust? And yet I could have gone to 240 00:12:31,202 --> 00:12:35,362 Speaker 1: Alex Jones or to someone else and looked up herbal 241 00:12:35,442 --> 00:12:37,842 Speaker 1: remedies and things like that. It didn't make sense to 242 00:12:37,882 --> 00:12:39,602 Speaker 1: me to do that; it made sense to go to people who seem 243 00:12:39,682 --> 00:12:42,562 Speaker 1: to care, who have devoted their lives to this. As 244 00:12:42,602 --> 00:12:45,482 Speaker 1: a general consensus, doctors may not agree on everything, but 245 00:12:45,522 --> 00:12:49,281 Speaker 1: they're doing the best they can.
And yet in our society, 246 00:12:49,522 --> 00:12:55,562 Speaker 1: people are choosing not to trust professionals and expertise, and instead, 247 00:12:55,642 --> 00:13:00,082 Speaker 1: increasingly they're turning to hacks like Alex Jones or even 248 00:13:00,082 --> 00:13:04,682 Speaker 1: to RFK Junior. I mean, one of his closest cohorts 249 00:13:04,722 --> 00:13:09,042 Speaker 1: is this British doctor Andrew Wakefield. Oh yeah, he had 250 00:13:09,082 --> 00:13:12,522 Speaker 1: his medical license taken away, and he came up with 251 00:13:12,562 --> 00:13:14,881 Speaker 1: this paper in the medical journal The Lancet that was 252 00:13:15,042 --> 00:13:19,522 Speaker 1: retracted, claiming that there was a link between autism and MMR 253 00:13:19,562 --> 00:13:22,682 Speaker 1: and that his studies proved it, and it turned out that instead 254 00:13:22,762 --> 00:13:26,162 Speaker 1: he manipulated that study and stood to make forty three 255 00:13:26,202 --> 00:13:31,562 Speaker 1: million dollars from his test kits. And still, RFK Junior, 256 00:13:31,602 --> 00:13:35,322 Speaker 1: who's going to be running our healthcare, is saying 257 00:13:35,322 --> 00:13:38,202 Speaker 1: that we should be erecting statues to this guy. 258 00:13:38,122 --> 00:13:39,602 Speaker 6: So, who to trust? 259 00:13:39,722 --> 00:13:44,842 Speaker 1: Why would people trust Wakefield or RFK but not, like, real doctors? Well, 260 00:13:44,882 --> 00:13:47,282 Speaker 1: here, this is a guy with his medical license revoked. 261 00:13:47,802 --> 00:13:51,442 Speaker 5: Here's where the magic word comes in: narcissism. Because to 262 00:13:51,482 --> 00:13:55,322 Speaker 5: trust a real doctor, right, I shepherded a parent through 263 00:13:55,402 --> 00:13:59,322 Speaker 5: the hospice process. I had a loved one that I 264 00:13:59,362 --> 00:14:02,162 Speaker 5: had to help get through cancer.
It's an intimidating thing, 265 00:14:02,282 --> 00:14:04,761 Speaker 5: right, that a guy in a white jacket comes in 266 00:14:05,042 --> 00:14:06,442 Speaker 5: and says, okay, I'm going to tell you a lot 267 00:14:06,442 --> 00:14:07,482 Speaker 5: of stuff you don't want. 268 00:14:07,362 --> 00:14:09,962 Speaker 1: To hear and maybe don't understand completely. 269 00:14:10,042 --> 00:14:13,322 Speaker 5: And well, if they're good, they'll break it down for you. Maybe. 270 00:14:13,362 --> 00:14:15,482 Speaker 5: But you know that they're used to dealing with other 271 00:14:15,482 --> 00:14:17,442 Speaker 5: professionals all day long. They may not have the best 272 00:14:17,482 --> 00:14:20,962 Speaker 5: bedside manner, and they say, look, here's some terrible things 273 00:14:21,002 --> 00:14:22,722 Speaker 5: that you're gonna have to hear, and here's some stuff 274 00:14:22,762 --> 00:14:25,682 Speaker 5: you gotta do. There are two things that happen right away. 275 00:14:25,722 --> 00:14:29,122 Speaker 5: It feels very disempowering. It makes you feel very helpless, 276 00:14:30,082 --> 00:14:33,322 Speaker 5: and it's a little bit offensive. Who are you to 277 00:14:33,402 --> 00:14:36,722 Speaker 5: tell me what I have to do? You know, that 278 00:14:36,962 --> 00:14:39,122 Speaker 5: I have to stick this poison in my veins, or 279 00:14:39,122 --> 00:14:42,442 Speaker 5: I have to stick my head in this radiation machine, 280 00:14:42,522 --> 00:14:45,762 Speaker 5: or whatever it is. It's the Fredo moment from Godfather Two, 281 00:14:46,162 --> 00:14:50,082 Speaker 5: where the outraged ego yells, I'm smart, I can do things, 282 00:14:50,122 --> 00:14:53,242 Speaker 5: not like people say. No, the fact is you're not 283 00:14:53,322 --> 00:14:56,762 Speaker 5: smart enough to know this.
That's why these people have 284 00:14:56,842 --> 00:15:00,122 Speaker 5: to have these jobs and are vetted by other people 285 00:15:00,162 --> 00:15:02,402 Speaker 5: who have these jobs. I mean, the way Wakefield got 286 00:15:02,402 --> 00:15:06,642 Speaker 5: caught is other doctors said, hey, well, okay, fine, you're 287 00:15:06,682 --> 00:15:08,602 Speaker 5: making this incredible claim. We're gonna look at this. We're 288 00:15:08,642 --> 00:15:10,842 Speaker 5: gonna reproduce your data, we're gonna go through this. And 289 00:15:10,882 --> 00:15:15,762 Speaker 5: it doesn't stand up. But that belies the 290 00:15:15,842 --> 00:15:19,282 Speaker 5: great movie that you think you're in. One brave doctor 291 00:15:19,562 --> 00:15:23,002 Speaker 5: tells the world this terrible secret. If there was a 292 00:15:23,042 --> 00:15:26,762 Speaker 5: connection between vaccines and autism, every doctor in the world 293 00:15:26,762 --> 00:15:28,442 Speaker 5: would be all over it, because most of them are parents. 294 00:15:28,962 --> 00:15:30,762 Speaker 5: I'm sorry, they're not gonna stick this stuff in their 295 00:15:30,762 --> 00:15:34,042 Speaker 5: own kids? Which they do every day. But again, that 296 00:15:34,522 --> 00:15:38,842 Speaker 5: undermines that narcissistic narrative of only I know and I've 297 00:15:38,882 --> 00:15:41,842 Speaker 5: done my research, and you people. And so the other 298 00:15:41,882 --> 00:15:44,082 Speaker 5: thing that happens with a kind of an Alex Jones 299 00:15:44,162 --> 00:15:47,122 Speaker 5: thing, or the other charlatans out there who do this stuff: 300 00:15:47,282 --> 00:15:51,282 Speaker 5: they say, I'm like you, you and I together are 301 00:15:51,322 --> 00:15:54,082 Speaker 5: figuring this out.
I remember the late Rush Limbaugh, who 302 00:15:54,122 --> 00:15:57,442 Speaker 5: relied on a lot of experts, railing about those white 303 00:15:57,642 --> 00:16:01,922 Speaker 5: jacketed creeps who tell us what to do. 304 00:16:02,122 --> 00:16:04,442 Speaker 5: God rest his soul, but I'm sure when he died 305 00:16:04,522 --> 00:16:07,402 Speaker 5: he had a lot of white jacketed guys trying to 306 00:16:07,442 --> 00:16:10,082 Speaker 5: extend his life and take care of him. But it's 307 00:16:10,122 --> 00:16:14,242 Speaker 5: an easy hit, right? It's an easy kind of bullshit 308 00:16:15,362 --> 00:16:20,362 Speaker 5: hit on other people, to say, oh, white jacketed elitists. Well, yeah, 309 00:16:20,522 --> 00:16:23,322 Speaker 5: I hope the guy flying my plane is an elitist 310 00:16:23,362 --> 00:16:26,482 Speaker 5: with four bars on his shoulder who knows a lot 311 00:16:26,522 --> 00:16:29,082 Speaker 5: more about airplanes than I do. I'd like to sit 312 00:16:29,122 --> 00:16:30,962 Speaker 5: in the back and have a drink, thank you very much. 313 00:16:31,042 --> 00:16:33,562 Speaker 5: I don't want to be, you know, looking up clear 314 00:16:33,602 --> 00:16:35,602 Speaker 5: air turbulence and then having to go to the cockpit 315 00:16:35,642 --> 00:16:37,762 Speaker 5: and say, listen, I want to talk to you 316 00:16:37,762 --> 00:16:41,762 Speaker 5: guys about how this flight's going. But that narcissistic 317 00:16:42,362 --> 00:16:46,682 Speaker 5: sense that we can all know anything, if we will it hard 318 00:16:46,762 --> 00:16:50,562 Speaker 5: enough, has become, speaking of RFK, a kind 319 00:16:50,602 --> 00:16:54,762 Speaker 5: of international brainworm. And 320 00:16:54,762 --> 00:16:57,762 Speaker 5: I'm just on this rant because you guys are 321 00:16:57,762 --> 00:17:01,562 Speaker 5: asking me stuff that makes me ranty.
I was giving 322 00:17:01,562 --> 00:17:03,162 Speaker 5: a talk about the death of expertise and a guy said, 323 00:17:03,162 --> 00:17:05,641 Speaker 5: why should I trust doctors when every issue of the 324 00:17:05,722 --> 00:17:08,522 Speaker 5: Lancet and the New England Journal of Medicine is available 325 00:17:08,522 --> 00:17:10,442 Speaker 5: to me now through the internet and online? And he 326 00:17:10,482 --> 00:17:12,722 Speaker 5: got really upset when I said, because they weren't written 327 00:17:12,762 --> 00:17:16,962 Speaker 5: for you, because you're not equipped to understand them, and 328 00:17:17,042 --> 00:17:20,762 Speaker 5: neither am I. They were written by doctors for doctors. 329 00:17:21,202 --> 00:17:23,802 Speaker 5: They were written for people who have a foundational knowledge 330 00:17:23,922 --> 00:17:28,762 Speaker 5: of medicine, chemistry, statistics, probability. You know, they weren't 331 00:17:28,802 --> 00:17:31,322 Speaker 5: written for a guy who says, oh, uh, I think 332 00:17:31,362 --> 00:17:34,202 Speaker 5: I have a lump in my groin or, you know, 333 00:17:34,562 --> 00:17:37,122 Speaker 5: a sore on my neck, and so I'll just go 334 00:17:37,162 --> 00:17:39,841 Speaker 5: study the medical journals because it'll just be there and 335 00:17:39,882 --> 00:17:42,762 Speaker 5: I'll use an AI chatbot to summarize it for me. 336 00:17:43,082 --> 00:17:47,762 Speaker 5: That's insane.
This is literally a form of psychosis where 337 00:17:47,802 --> 00:17:51,282 Speaker 5: you just lose touch with reality and just can't process 338 00:17:51,362 --> 00:17:53,642 Speaker 5: that there is someone who knows more about this than 339 00:17:53,642 --> 00:17:55,802 Speaker 5: you do, and that you should go to them, even 340 00:17:55,842 --> 00:17:58,882 Speaker 5: if they make you feel uncomfortable, even if they're talking 341 00:17:58,962 --> 00:18:00,642 Speaker 5: down to you, even if they seem to have a 342 00:18:00,682 --> 00:18:03,561 Speaker 5: lot more education than you do. Put your big boy 343 00:18:03,682 --> 00:18:07,242 Speaker 5: pants on, stop whining, and ask them how to save 344 00:18:07,282 --> 00:18:07,882 Speaker 5: your life. 345 00:18:08,242 --> 00:18:10,842 Speaker 2: I feel bad for pilots and doctors, because people 346 00:18:10,842 --> 00:18:12,482 Speaker 2: probably come to them all the time now and act 347 00:18:12,522 --> 00:18:14,402 Speaker 2: like they know stuff and talk to them and try. 348 00:18:14,242 --> 00:18:18,482 Speaker 5: Not just pilots, John, I'll tell you. I 349 00:18:18,482 --> 00:18:21,122 Speaker 5: mean, no, seriously, I mean people in the trades. Talk 350 00:18:21,202 --> 00:18:27,482 Speaker 5: to plumbers, contractors, electricians, carpenters, you know, anybody with 351 00:18:27,522 --> 00:18:31,682 Speaker 5: specialized knowledge. Electricians are telling me people walk 352 00:18:31,722 --> 00:18:35,722 Speaker 5: up and go, what are you putting in there? Oh, 353 00:18:35,802 --> 00:18:40,082 Speaker 5: this wire? This is Linguini number seven, 354 00:18:40,202 --> 00:18:42,682 Speaker 5: I got it in the pasta aisle. What difference does it make? 355 00:18:42,762 --> 00:18:46,121 Speaker 5: You don't understand the difference. 
Assume that I am a 356 00:18:46,162 --> 00:18:49,482 Speaker 5: licensed electrician and that I will explain everything to you 357 00:18:49,642 --> 00:18:51,922 Speaker 5: and then bill you. But people feel the need to 358 00:18:51,922 --> 00:18:54,962 Speaker 5: get in there and say, uh, you got a three 359 00:18:54,962 --> 00:18:57,162 Speaker 5: eighths inch wrench? They're also like Cliff Clavin, right? You 360 00:18:57,162 --> 00:18:59,522 Speaker 5: got a three eighths inch wrench in there? I mean, 361 00:18:59,562 --> 00:19:03,042 Speaker 5: it's just ridiculous. Not only is it the problem of narcissism, 362 00:19:03,362 --> 00:19:05,922 Speaker 5: but it's a way to fight back against feeling helpless. 363 00:19:06,042 --> 00:19:08,402 Speaker 5: It's a way to fight back against feeling disempowered. 364 00:19:09,402 --> 00:19:21,682 Speaker 6: More of this after a quick break, and we're back. 365 00:19:22,322 --> 00:19:24,282 Speaker 2: So it sounds like the people who are doing this 366 00:19:24,442 --> 00:19:27,802 Speaker 2: aren't the most destitute among us or the least educated 367 00:19:28,242 --> 00:19:31,882 Speaker 2: among us. How do class and education factor into this, 368 00:19:32,002 --> 00:19:34,402 Speaker 2: since some of it has to do with entertainment and boredom, 369 00:19:34,402 --> 00:19:35,842 Speaker 2: not just ignorance? 370 00:19:36,162 --> 00:19:39,682 Speaker 5: Yeah, there is a low information voter problem among people 371 00:19:39,762 --> 00:19:43,402 Speaker 5: who are working three jobs and just trying to stay alive. 372 00:19:43,802 --> 00:19:47,282 Speaker 5: But the boredom problem, where that becomes a threat to 373 00:19:47,402 --> 00:19:52,162 Speaker 5: democracy and to trusting information and not falling off the 374 00:19:52,242 --> 00:19:55,361 Speaker 5: rails and into these rabbit holes, tends to be a 375 00:19:55,682 --> 00:20:00,082 Speaker 5: middle class phenomenon, in part because again we're a leisure society. 
376 00:20:00,122 --> 00:20:02,242 Speaker 5: We have a lot of spare time on our hands. 377 00:20:02,722 --> 00:20:05,042 Speaker 5: Notice that the people that are the most prone to 378 00:20:05,082 --> 00:20:08,202 Speaker 5: these conspiracy theories tend to be our age, because 379 00:20:08,202 --> 00:20:10,522 Speaker 5: they're semi retired. They spend a lot of time on 380 00:20:10,562 --> 00:20:14,602 Speaker 5: the internet. The craziest conspiracy theorists I've ever heard, 381 00:20:14,922 --> 00:20:17,162 Speaker 5: every time I'm on the road and someone talks to 382 00:20:17,242 --> 00:20:19,762 Speaker 5: me about this, it's always people that are over fifty five. 383 00:20:20,642 --> 00:20:24,121 Speaker 5: Inevitably people over fifty five. Younger people follow this too, 384 00:20:24,162 --> 00:20:26,642 Speaker 5: but they ask it as a question. They say, like, Tom, 385 00:20:27,082 --> 00:20:30,562 Speaker 5: is it true? Whereas I've heard older people say, listen 386 00:20:30,642 --> 00:20:33,722 Speaker 5: to me, Tom, here's what's going on. Eric Hoffer in 387 00:20:33,802 --> 00:20:36,002 Speaker 5: nineteen fifty one wrote a book called The True Believer 388 00:20:36,442 --> 00:20:40,642 Speaker 5: about how mass authoritarian movements develop, and he said, if 389 00:20:40,642 --> 00:20:43,282 Speaker 5: you're hoping to start one of these movements, the best 390 00:20:43,282 --> 00:20:46,561 Speaker 5: news you could get is not that people are poor. 391 00:20:46,922 --> 00:20:50,162 Speaker 5: It's that the society you're targeting is one that is 392 00:20:50,282 --> 00:20:54,162 Speaker 5: riven by unrelieved boredom. This was five years after World 393 00:20:54,162 --> 00:20:57,682 Speaker 5: War Two. He was absolutely right. This kind of decadence 394 00:20:57,722 --> 00:21:00,602 Speaker 5: and boredom leads you to say life must be more 395 00:21:00,642 --> 00:21:04,321 Speaker 5: interesting than this. 
And let's face it, authoritarians and fascists 396 00:21:04,402 --> 00:21:07,882 Speaker 5: and bad guys in general, of both the far right 397 00:21:07,922 --> 00:21:11,722 Speaker 5: and the far left, tell you interesting narratives instead of saying, look, 398 00:21:11,762 --> 00:21:13,642 Speaker 5: life is hard, we have to work together. We have 399 00:21:13,682 --> 00:21:16,002 Speaker 5: to kind of slog this through. You got to pay 400 00:21:16,002 --> 00:21:18,321 Speaker 5: your taxes, you got to vote, get up every day 401 00:21:18,322 --> 00:21:20,082 Speaker 5: and go to work. Nobody wants to hear that shit. 402 00:21:20,362 --> 00:21:23,082 Speaker 5: They want to hear that you are the central character 403 00:21:23,682 --> 00:21:27,402 Speaker 5: fighting against SPECTRE or Thanos or something. 404 00:21:27,602 --> 00:21:30,602 Speaker 1: You can't use the word decadence without coming to the 405 00:21:30,682 --> 00:21:36,282 Speaker 1: Roman empire, right? And so Caligula, famously, his horse Incitatus. 406 00:21:36,282 --> 00:21:38,962 Speaker 1: The story goes that he appointed his horse as a 407 00:21:38,962 --> 00:21:42,442 Speaker 1: senator. It was actually smart, in that the Senate had 408 00:21:42,562 --> 00:21:46,642 Speaker 1: expertise in how to run the empire and run it effectively. 409 00:21:47,122 --> 00:21:52,202 Speaker 1: He wanted loyalty over expertise. Expertise was dangerous to a 410 00:21:52,282 --> 00:21:54,401 Speaker 1: guy at the top who wants to run it as 411 00:21:54,442 --> 00:21:58,682 Speaker 1: his personal fiefdom. So appointing your horse and making everyone 412 00:21:59,282 --> 00:22:02,402 Speaker 1: pledge to support it is a way of stealing their 413 00:22:02,522 --> 00:22:08,121 Speaker 1: soul and either destroying or manipulating expertise so that it 414 00:22:08,162 --> 00:22:11,841 Speaker 1: doesn't matter. I mean, Stalin did the same thing with Lysenkoism. 
415 00:22:12,282 --> 00:22:14,561 Speaker 5: The main reason Caligula did what he did was to 416 00:22:14,642 --> 00:22:18,162 Speaker 5: show utter contempt for the Senate. The Senate is so 417 00:22:18,282 --> 00:22:20,922 Speaker 5: worthless that I'd appoint my horse. And in a 418 00:22:20,962 --> 00:22:23,841 Speaker 5: way, I mean, not to draw strong parallels today, but 419 00:22:23,922 --> 00:22:26,362 Speaker 5: in a way, you know, I wrote the other day 420 00:22:26,402 --> 00:22:29,162 Speaker 5: that we're being trolled by some of these appointments. To 421 00:22:29,282 --> 00:22:31,722 Speaker 5: show how little I think of the justice system, I'm 422 00:22:31,722 --> 00:22:34,482 Speaker 5: going to appoint Matt Gaetz, and screw you, by the way. 423 00:22:34,762 --> 00:22:37,561 Speaker 5: And fortunately, even in 424 00:22:37,602 --> 00:22:40,682 Speaker 5: the GOP, thank goodness, there still seem to be people saying, 425 00:22:40,682 --> 00:22:42,642 Speaker 5: you know, there are limits to the amount of trolling 426 00:22:42,682 --> 00:22:45,322 Speaker 5: and carny barking we can take while trying to run 427 00:22:45,362 --> 00:22:48,561 Speaker 5: the government. But your point about expertise and authoritarians is 428 00:22:48,602 --> 00:22:53,242 Speaker 5: really important, because authoritarians rely on expertise like everybody else does, 429 00:22:53,282 --> 00:22:55,722 Speaker 5: but they do it by command, and they don't want 430 00:22:55,762 --> 00:22:59,162 Speaker 5: the most important service an expert performs. 431 00:22:59,562 --> 00:23:02,042 Speaker 5: And I did this advising a senator when I was 432 00:23:02,042 --> 00:23:04,042 Speaker 5: a young guy. You have to be able to walk 433 00:23:04,082 --> 00:23:05,762 Speaker 5: in and tell the boss stuff he doesn't want to hear. 
434 00:23:06,802 --> 00:23:08,121 Speaker 5: You have to be able to walk in, and if 435 00:23:08,122 --> 00:23:09,682 Speaker 5: you're a scientist or engineer, you have to be able to 436 00:23:09,682 --> 00:23:13,522 Speaker 5: walk in and tell Stalin, comrade, the wing covering on 437 00:23:13,602 --> 00:23:16,442 Speaker 5: the new jets just, it doesn't, you know, it's not 438 00:23:16,562 --> 00:23:19,442 Speaker 5: gonna work. We screwed up. This prototype isn't gonna fly. 439 00:23:19,922 --> 00:23:23,522 Speaker 5: Because then he could say, okay, thank you, comrade engineer, 440 00:23:23,602 --> 00:23:26,682 Speaker 5: or he could say, clearly you're a saboteur, I told 441 00:23:26,722 --> 00:23:28,762 Speaker 5: you to make this work. There's a kind of rule 442 00:23:28,762 --> 00:23:33,802 Speaker 5: among political scientists: high coercion systems are low information systems, 443 00:23:34,362 --> 00:23:36,402 Speaker 5: specifically because of that problem. And if you have 444 00:23:36,442 --> 00:23:39,362 Speaker 5: a system that relies on a central leader, where there's 445 00:23:39,362 --> 00:23:43,442 Speaker 5: a lot of coercion and exercise of control, then those 446 00:23:43,442 --> 00:23:46,522 Speaker 5: tend to be very low information systems, because nobody wants 447 00:23:46,522 --> 00:23:49,002 Speaker 5: to share information, and certainly nobody wants to share information 448 00:23:49,042 --> 00:23:51,442 Speaker 5: that could get them killed. We saw this all the 449 00:23:51,482 --> 00:23:54,762 Speaker 5: time with Stalinism. And yet the punchline to the Stalin 450 00:23:54,842 --> 00:23:57,321 Speaker 5: story is that when he looks over, he sees the 451 00:23:57,402 --> 00:24:00,922 Speaker 5: Americans developing supersonic jets and missiles and all this stuff. 
452 00:24:00,962 --> 00:24:04,082 Speaker 5: He literally takes some of the guys that he's imprisoned 453 00:24:04,482 --> 00:24:08,161 Speaker 5: and he tells them to open workshops in prison. You know, 454 00:24:08,242 --> 00:24:11,242 Speaker 5: like Tupolev, you know, you've seen these 455 00:24:11,282 --> 00:24:14,282 Speaker 5: Soviet aircraft, the Tu series of aircraft. He had that guy 456 00:24:14,362 --> 00:24:17,922 Speaker 5: working on aircraft designs while he was in prison. On 457 00:24:17,962 --> 00:24:20,802 Speaker 5: the one hand, he wants to squeeze these guys, because 458 00:24:20,842 --> 00:24:24,321 Speaker 5: experts are always dangerous in an authoritarian government. But 459 00:24:24,362 --> 00:24:26,482 Speaker 5: then when it comes time to design a new bomber, 460 00:24:27,122 --> 00:24:28,882 Speaker 5: he's like, where is that guy? Well, you threw 461 00:24:28,922 --> 00:24:30,522 Speaker 5: him in prison. All right, we'll send him a bunch 462 00:24:30,562 --> 00:24:32,922 Speaker 5: of drafting tables and give him some extra rations and 463 00:24:32,922 --> 00:24:34,522 Speaker 5: tell him to get to work. I'm not letting him 464 00:24:34,522 --> 00:24:36,082 Speaker 5: out of prison, you understand. 465 00:24:36,002 --> 00:24:38,042 Speaker 2: A lot of his generals came out of prison in World War Two. 466 00:24:38,042 --> 00:24:40,922 Speaker 5: Once the war started, yes. Trump Junior said something very revealing: 467 00:24:41,242 --> 00:24:43,522 Speaker 5: we don't want anybody in this administration who thinks they're smarter 468 00:24:43,602 --> 00:24:47,402 Speaker 5: than my father. Well, you know, that rules out 469 00:24:47,442 --> 00:24:50,642 Speaker 5: a lot of people, because most people are smarter 470 00:24:50,802 --> 00:24:53,242 Speaker 5: than his father. You can see that in this pattern 471 00:24:53,282 --> 00:24:56,242 Speaker 5: of appointments: I want loyalists who look good on TV. 
472 00:24:56,722 --> 00:24:58,762 Speaker 5: I don't want people actually knowing about any of the 473 00:24:58,762 --> 00:24:59,841 Speaker 5: stuff we're talking about. 474 00:25:00,122 --> 00:25:03,042 Speaker 2: Let's move into politics a little bit. Politicians obviously 475 00:25:03,642 --> 00:25:06,682 Speaker 2: like to use fear and other powerful emotions to get support, 476 00:25:07,122 --> 00:25:09,842 Speaker 2: and certainly conspiracy theories are something they can weaponize when 477 00:25:09,962 --> 00:25:12,482 Speaker 2: voters aren't paying attention. But is there any way to 478 00:25:12,562 --> 00:25:16,321 Speaker 2: regain a serious interest in governing, embracing facts, or tackling 479 00:25:16,362 --> 00:25:16,962 Speaker 2: hard problems? 480 00:25:17,082 --> 00:25:19,402 Speaker 5: I'm gonna step back before everybody yells at us about 481 00:25:19,402 --> 00:25:22,162 Speaker 5: both sides of this, because the right is really the 482 00:25:22,242 --> 00:25:25,322 Speaker 5: home of this stuff now. But I will say, and I'm 483 00:25:25,442 --> 00:25:28,362 Speaker 5: sorry to have to point this out, I was getting 484 00:25:28,762 --> 00:25:33,402 Speaker 5: lots of messages and tweets and posts on social media 485 00:25:33,402 --> 00:25:36,162 Speaker 5: saying, you have to look into how Trump won, clearly 486 00:25:36,162 --> 00:25:38,841 Speaker 5: it was fraud. And I'm like, no, this whole election 487 00:25:39,002 --> 00:25:42,762 Speaker 5: denier thing cannot start on the left after we just 488 00:25:42,802 --> 00:25:45,002 Speaker 5: had to deal with it. You notice that Trump wins, right, 489 00:25:45,042 --> 00:25:46,762 Speaker 5: and it's, oh, I guess elections are fair. 490 00:25:47,242 --> 00:25:48,482 Speaker 2: They forgot to rig this one. 491 00:25:48,602 --> 00:25:52,202 Speaker 5: Yeah, they forgot to rig it, right, exactly. 
The most incompetent, doddering 492 00:25:52,242 --> 00:25:54,962 Speaker 5: old man in the White House somehow couldn't rig this election, 493 00:25:55,402 --> 00:25:57,682 Speaker 5: but managed to pull it off while Trump was in 494 00:25:57,842 --> 00:26:01,282 Speaker 5: office the first time. Which just shows you another 495 00:26:01,282 --> 00:26:03,802 Speaker 5: thing to understand about conspiracy theories: they don't 496 00:26:03,802 --> 00:26:06,602 Speaker 5: have to be internally consistent. They just have to explain 497 00:26:06,682 --> 00:26:10,922 Speaker 5: whatever they explain at that moment. Conspiracy theories live 498 00:26:11,082 --> 00:26:13,802 Speaker 5: in the moment. They're not meant for the long haul, 499 00:26:14,002 --> 00:26:17,162 Speaker 5: because you'd have to sustain them over time. If it's, there's 500 00:26:17,202 --> 00:26:20,122 Speaker 5: going to be mass arrests just before the election, 501 00:26:20,402 --> 00:26:22,601 Speaker 5: oh, that didn't happen? Never mind that I said that; 502 00:26:22,682 --> 00:26:24,282 Speaker 5: there's a reason for that, and I'll get to it 503 00:26:24,282 --> 00:26:26,722 Speaker 5: in a minute. So how do you get away from this? 504 00:26:26,722 --> 00:26:29,242 Speaker 5: How do you get back? I think one thing, 505 00:26:29,322 --> 00:26:32,202 Speaker 5: and this is really hard for people to do: you 506 00:26:32,362 --> 00:26:35,561 Speaker 5: have to stop, even in your personal life, you have 507 00:26:35,642 --> 00:26:37,762 Speaker 5: to stop giving oxygen to the people who want to 508 00:26:37,842 --> 00:26:41,242 Speaker 5: suck your attention like vampires about this stuff. Because the 509 00:26:41,362 --> 00:26:45,362 Speaker 5: other thing that breeds conspiracy theories is loneliness. It makes 510 00:26:45,362 --> 00:26:48,002 Speaker 5: people feel part of a community. 
It gives them something 511 00:26:48,002 --> 00:26:49,442 Speaker 5: to talk about to other people. 512 00:26:49,802 --> 00:26:50,002 Speaker 2: You know. 513 00:26:50,442 --> 00:26:52,762 Speaker 5: I tried to explain this years ago. A CBS 514 00:26:52,842 --> 00:26:57,122 Speaker 5: reporter did a story on conspiracy theories and whacked out 515 00:26:57,202 --> 00:26:59,482 Speaker 5: thoughts, and he said, yeah, we're going to talk to 516 00:26:59,522 --> 00:27:01,922 Speaker 5: some flat earthers. And I stopped him. I said, do 517 00:27:01,962 --> 00:27:04,321 Speaker 5: you really believe they think the Earth is flat? Or 518 00:27:04,482 --> 00:27:06,122 Speaker 5: has it occurred to you that this is the way 519 00:27:06,162 --> 00:27:09,722 Speaker 5: that they can get a reporter from CBS News to 520 00:27:09,882 --> 00:27:12,362 Speaker 5: talk to them for three hours? Some of these people 521 00:27:12,442 --> 00:27:15,442 Speaker 5: are like the intellectual equivalent of shut-ins who just 522 00:27:15,522 --> 00:27:18,601 Speaker 5: want company. It's like, I'll say the earth is flat, 523 00:27:18,722 --> 00:27:21,641 Speaker 5: talk to me online for six hours about it. And 524 00:27:22,602 --> 00:27:24,962 Speaker 5: there's a sadness to it. So you have to stop 525 00:27:24,962 --> 00:27:28,321 Speaker 5: doing that. If someone says, John, the votes went through Venezuela, you have 526 00:27:28,362 --> 00:27:29,522 Speaker 5: to put your hand up and say, look, I'm not 527 00:27:29,522 --> 00:27:33,202 Speaker 5: having this conversation, not happening. Don't argue with them. The 528 00:27:33,242 --> 00:27:37,362 Speaker 5: people who are not prone to these things can't help themselves. 
529 00:27:37,442 --> 00:27:40,802 Speaker 5: If your kooky uncle says, I think Michelle Obama's a 530 00:27:40,802 --> 00:27:43,042 Speaker 5: man, that one goes around every now and then, just 531 00:27:43,042 --> 00:27:45,522 Speaker 5: stop him and say, I don't care what you think 532 00:27:45,562 --> 00:27:49,121 Speaker 5: about this, I'm not having this perfectly ridiculous conversation with 533 00:27:49,162 --> 00:27:52,881 Speaker 5: you, pass the potatoes. Look, when you argue with conspiracy theorists, 534 00:27:52,922 --> 00:27:56,042 Speaker 5: they take that as evidence of being right. Otherwise, why 535 00:27:56,042 --> 00:27:58,922 Speaker 5: would you be arguing so strenuously? I think when you 536 00:27:58,962 --> 00:28:01,042 Speaker 5: do the thing you ought to do, which is to 537 00:28:01,162 --> 00:28:03,842 Speaker 5: dismiss this as though someone just told you that they 538 00:28:03,842 --> 00:28:06,482 Speaker 5: have a leprechaun on their shoulder, which is the way 539 00:28:06,522 --> 00:28:09,682 Speaker 5: you ought to dismiss it, I think it does produce 540 00:28:09,722 --> 00:28:12,322 Speaker 5: some second thoughts, of saying, wow, not only is this 541 00:28:12,762 --> 00:28:16,922 Speaker 5: person making me wonder, but it's socially crippling to hear no, 542 00:28:16,962 --> 00:28:20,401 Speaker 5: I'm not gonna do this with you. The media has 543 00:28:20,402 --> 00:28:22,922 Speaker 5: to kind of brush these away. Stop going to diners 544 00:28:23,242 --> 00:28:26,082 Speaker 5: and asking people if they thought the election was stolen. 545 00:28:26,962 --> 00:28:30,282 Speaker 5: It wasn't. And the sooner you stop asking people this, 546 00:28:30,522 --> 00:28:32,642 Speaker 5: the sooner they're going to stop trying to sink their 547 00:28:32,642 --> 00:28:35,402 Speaker 5: teeth into your neck to suck that attention out of 548 00:28:35,402 --> 00:28:39,002 Speaker 5: you for three hours. 
On a macro level, I don't know, guys, 549 00:28:39,802 --> 00:28:41,922 Speaker 5: I think some of this has to burn itself out. 550 00:28:42,162 --> 00:28:45,002 Speaker 5: If you're told over and over again that the spaceship 551 00:28:45,042 --> 00:28:47,522 Speaker 5: is coming to spirit us all away, and finally, 552 00:28:47,562 --> 00:28:50,842 Speaker 5: after the tenth time, it doesn't come, maybe some of 553 00:28:50,842 --> 00:28:54,082 Speaker 5: those folks will reassess. At some point they do get 554 00:28:54,082 --> 00:28:57,042 Speaker 5: bored and they do start to feel taken, and I 555 00:28:57,042 --> 00:28:59,362 Speaker 5: think at that point it's really important for other people 556 00:28:59,482 --> 00:29:00,442 Speaker 5: not to say I told you so. 557 00:29:00,402 --> 00:29:03,002 Speaker 2: Tom, thank you very much for your time. It 558 00:29:03,042 --> 00:29:05,562 Speaker 2: was fun, all right, both of you. 559 00:29:05,602 --> 00:29:05,882 Speaker 1: Thank you. 560 00:29:05,882 --> 00:29:07,802 Speaker 2: You get to talk about horses all the time. 561 00:29:16,002 --> 00:29:21,082 Speaker 7: Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Sipher, 562 00:29:21,362 --> 00:29:25,722 Speaker 7: and Jonathan Stern. The associate producer is Rachel Harner. Mission 563 00:29:25,762 --> 00:29:29,842 Speaker 7: Implausible is a production of Honorable Mention and Abominable Pictures 564 00:29:29,882 --> 00:29:31,242 Speaker 7: for iHeart podcasts.