Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. This time we have a special episode for you, which we recorded live on stage here in Boston in front of an audience of more than a thousand people. My guest was none other than Malcolm Gladwell. He's a staff writer at The New Yorker and the author of best-selling books including Blink, The Tipping Point, and Outliers. He's also the co-founder of Pushkin Industries, the company that makes this podcast. Malcolm just published a new book called Talking to Strangers. We started there.

Noah Feldman: Thank you all very much for coming. Thank you, Malcolm, for coming to Boston. I want to start by just asking you about the person to whom this book is dedicated, namely your father.

Malcolm Gladwell: Yes.
Noah Feldman: Yes, the book starts with a very sweet anecdote about him, and then somewhere buried in the middle of the book, like at the beating heart of the book, is an actually deeply moving and a little bit shocking story in which he figures. What led you to think that he should be the... is he the inspiring figure for this book?

Malcolm Gladwell: Well, he was. I lost my father in twenty seventeen, so he was clearly on my mind. But he does in some way inspire this book, because I tell the story at the very, very beginning of the book of how, when my parents would come to Manhattan, I would put them up at the Mercer Hotel, which is the most sort of celebrity-ridden hotel in New York. And it's a joke, because no two individuals know less about celebrity culture than my parents, and so it was always a kind of inside joke on my part to think of them mingling with rock stars.

Noah Feldman: A joke you shared with no one but yourself, until now.

Malcolm Gladwell: Yes, my inside joke. So one time my dad is staying there, and I asked him what he had done the previous afternoon.
Malcolm Gladwell: He said, oh, I had a wonderful conversation with someone in the lobby about gardening. The only problem was that people kept coming up to this man and asking him to sign bits of paper and taking his picture. And so I said, well, you know, who was it? He said, I have no idea. And it was clearly, this just being the Mercer, some massive celebrity, and I've spent the last ten years, this was ten years ago, trying to figure out who it was. And you can be a pretty big celebrity in the Mercer Hotel and no one will come over to you, because it's not cool to do that. First of all, there are several key facts here. One is that it's in a part of the Mercer that's only open to guests, and this is where they were. So right away, the fact that even other celebrities were coming up to this celebrity is crucial. Second is my father. All I can say is he had a clear preference (a) for talking with other Englishmen, and (b) for people his own age.
Malcolm Gladwell: So I think it's someone born in the nineteen thirties in England who liked gardening, and he was a major celebrity. So I've been asking people, and someone recently made what I think is a very good guess: Michael Caine. You can imagine people approaching Michael Caine. You can imagine Michael Caine talking about gardening; apparently he's a big gardener.

Noah Feldman: That actually surprises me.

Malcolm Gladwell: And he's also kind of lower middle class, as my father was. And you know, when Brits congregate outside of England, they do the class thing instantly.

Noah Feldman: And then when they realize... they also do that inside of England.

Malcolm Gladwell: Yes, they just do that everywhere. They might do it more aggressively outside of England. But once they discover they're from the same narrow slice... Like, I once saw my father chatting with someone who was from the same town in Kent that he was from, and they hadn't done the class thing yet. It was a woman he was talking to.
Malcolm Gladwell: He told this woman that he used to break into some estate and play in the garden, and she paused and said, oh, that was my estate. And that was the end of the conversation. But the point is, my father did like to talk to strangers, and he embodied one of the principles of the book, which is that he did not expect a conversation with a stranger to yield much about the stranger. In other words, he could talk for an hour about gardening with Michael Caine and never discover that Michael Caine was a famous actor.

Noah Feldman: And is that because talking about gardening is a form of small talk, or is it the opposite? Is it that talking about gardening is so deeply interesting to the gardeners that you can have a deep, substantive, meaningful conversation, and it never comes up, you know, "what do you do for a living?"

Malcolm Gladwell: Well, it's very English to have a subject that allows you to avoid all intimacy, like the weather.

Noah Feldman: Like the weather.

Malcolm Gladwell: Or gardening. But also, it's something you do genuinely.
Malcolm Gladwell: A wonderful fact about my father is that I think he understood that if you spend too much time gathering information on a person you're talking to, you're only going to find ways of alienating yourself from that person.

Noah Feldman: The more you know, the less...

Malcolm Gladwell: Yeah, you don't want to discover what you don't have in common with them. So my father was, you know, a mathematics PhD whose head was somewhere up here, but he was always falling into conversation with people. He was great friends with our neighbor, who probably had a seventh-grade education, and it's because they never got past... they would always talk about gardening. But I sort of love that, and that's sort of part of where I end up in the book, which is that we expect way too much from these conversations, and they're dangerous when they get too aggressive.

Noah Feldman: But in your book you've got lots and lots of examples of conversations that go awry.

Malcolm Gladwell: Yeah.
Noah Feldman: In fact, there are very few conversations in the whole book, other than your father's conversation with the unnamed Englishman, that go okay at all. So I'm sort of wondering: do you agree with your father's idea that, you know, maybe we should just talk less to everybody, or at least less to people with whom we don't have an immediate and total connection? Or is the whole point that we should talk to them but not expect very much from the conversation?

Malcolm Gladwell: I think the whole point is that we should talk to them and not expect very much from the conversation.

Noah Feldman: Then why talk at all?

Malcolm Gladwell: No, because you can have a meaningful conversation in the absence of drawing aggressive conclusions about the nature of the person you're talking to.
Malcolm Gladwell: I mean, the real issue in the book, and the reason the book begins and ends with the story of Sandra Bland, is this: she's pulled over by the side of the road in small-town Texas by a police officer, and the thing that leads that encounter to end in tragedy is that the police officer takes it upon himself to reach enormously consequential conclusions about Sandra Bland, both in the space of thirty seconds and in the absence of any real evidence at all. What is wrong with what he does, among other things, is that he is in such a hurry, and his ambitions are so enormous. He pulls her over because he thinks there is a chance... he pulls her over because she's out of state, black, and driving a Hyundai.

Noah Feldman: Right. And there are policies covering at least two of those things officially, and one of them unofficially.

Malcolm Gladwell: Yeah, there are. And he has been trained to pull over people who are, you know... the lovely, in quotation marks, phrase that law enforcement uses is "curiosity ticklers."
Malcolm Gladwell: So she has activated several of his curiosity ticklers. He had noticed her while she was turning out of the campus of Prairie View University, and he notices that while she's on university territory, she rolls through a stop sign. So: she has rolled through a stop sign, she's black, she's twenty-eight, she's driving a Hyundai, she's got Chicago plates. And he says, that's four curiosity ticklers, and so he contrives a reason to pull her over. He creates the reason by driving up fast behind her, which leads her to get out of his way, and then she doesn't use her turn signal. He has his pretext, and then he creates a fantasy about this person he's pulled over. And the fantasy is that she's dangerous. And this comes up later, in the course of the investigation, when he gives this deposition. And what's weird about the deposition is that even though the case itself, when it happened, attracted attention around the world, as far as I can find...
Malcolm Gladwell: There was just one minor story in the Austin Statesman about the deposition, and that was it. The deposition was completely ignored, even though the deposition is deeply revealing. You realize what a bizarre fantasy he creates in that moment: that she is dangerous and harboring some deep criminal intent, on the basis of the flimsiest and most misleading of cues. And it's that that I think is problematic and that we need to zero in on.

Noah Feldman: One of the fascinating things, though, about your treatment of that whole deposition is that you believe him when he says he thought she was dangerous.

Malcolm Gladwell: I do.

Noah Feldman: You can imagine that other readers might be more skeptical and say, and you consider this view in the book, in passing at least: no, he was just a racist, not only in an individual way, but also as a product of the structural racism of the cops and the stop-and-frisk, or rather the pulling over for supposed traffic violations.
Noah Feldman: And then later, in a deposition under oath, he wanted to explain why he stopped her, and he made up these reasons after the fact. But you don't think that; you buy what he's saying. And that's partly because, I don't want to say this, but maybe you're defaulting to truth and believing his account.

Malcolm Gladwell: I am defaulting to truth.

Noah Feldman: You've done the magic thing in the book, defaulting to truth. And you're not supposed to do it.

Malcolm Gladwell: Well, you are supposed to do it, but let me explain why I believe him when he says that. Two reasons. One is that the Texas Highway Patrol has on file, you can find it, a complete record of every traffic stop he ever made as a member of the police force. And what you see when you look at that is that him stopping Sandra Bland on the flimsiest of pretexts was not an anomaly. It was what he did.
Malcolm Gladwell: Further, you can say that, by the standards of what the Texas Highway Patrol wants a cop to be, he's not a bad cop. He's an ideal cop. They are trained to stop people on the flimsiest of pretexts, and he did exactly as he was trained to do. His whole career was stopping people on flimsy pretexts. First of all, if you look at the number of people he stopped: he's jumping out of his car four, five, six times an hour. And to put this in context, if you go back twenty-five years and do a similar search of what police officers did, you will discover they never got out of their car. If you were a police officer in small-town Texas in nineteen seventy-five, you got out of your car to get donuts. The last thing you did was pull over people who were driving down the road. The whole notion of policing was different.
Malcolm Gladwell: You sat in your car and you waited for a call, and you reacted to something that happened. This is a whole new way of policing, and he was the perfect embodiment of this new philosophy. And you look at the list of things he's stopping people for: you know, the light above their license plate is out, and he pulls you over. Why? Because he thinks there may be reason to believe that if you're the kind of person who didn't attend to the broken bulb above your license plate, then you're also the kind of person who is smuggling drugs. That's the theory. He never found anything. If you look at his whole time on the force, the man stops dozens upon dozens of people, and the sum total of all of that frenzied activity is: once he stops a kid who, I think, has a token amount of marijuana in his car. He may have got a gun once. But basically he's coming up with nothing. So reason number one for me to believe him is that what he does with Sandra Bland is so much a part of who he is.
Malcolm Gladwell: It's not an expression of some kind of latent prejudice. It is an expression of his training as a police officer. Secondly, it's not inconsistent to say that he's scared of her because he's a racist. In fact, it's entirely consistent. He's scared of her because she's actually bigger than he is, and she's black. And because he assumes that big black people are scary, it makes it really easy for him to construct a scenario where she's a threatening criminal and he is trying to uphold law and order. So the racist argument and the argument that she's scary are consistent, not inconsistent. And thirdly, there are actual things that he does. He approaches her the first time on the passenger side, which is what you do when you're not scared. Then he goes back to his car, and the second time, after he sits in his car and constructs this fantasy, he approaches her on the driver's side, exposing himself to the traffic. And there's only one reason you would do that on a highway traffic stop.
Malcolm Gladwell: You do that if you think the person has a gun, because you can defend yourself if you approach on the driver's side, and you are a target if you approach on the passenger's side. So all those things make me think, yeah, the man's terrified. And by the way, the way we train police officers in the present day is to be terrified. The people who deny that, who think he's making up a cover story, are wholly ignorant of the totally perverse way modern law enforcement works. He's trained to be terrified. Of course he's terrified.

Noah Feldman: So I'm feeling a little terrified. That was a very effective invocation of the experience. When you think about people who talk to strangers for a living, cops are an example, and it's particularly egregious because, after all, he's talked to all those people that he stopped and tried to make some determination about how dangerous they are. But there are two other kinds of people who immediately leap to my mind when I think about people who talk to strangers.
Noah Feldman: One is salespeople, because you're often selling to strangers, and it's sort of an interesting counterexample, because there you're doing everything you can to comprehend them in order to produce an effective sale. And I just wonder, I have no idea, whether salespeople do better at talking to strangers insofar as they're trying to make a sale. To the extent this comes up in your book, it's from the other side. It's from the buyer, as it were: the person who believes the pitch that the salesman is making. And then the other group of people who also come up in the book are anthropologists. You take us into Central America, a very intrepid pair of anthropologists...

Malcolm Gladwell: Actually Indonesia.

Noah Feldman: To Indonesia, sorry. Who go to a village where people drink alcohol that turns out to be, is it one hundred and...

Malcolm Gladwell: Oh, that story.

Noah Feldman: Yes, I'm mixing up my anthropology stories.

Malcolm Gladwell: There are several anthropologists in the book.

Noah Feldman: Yes, there are. I'm talking about the drunk anthropologists.
Malcolm Gladwell: Yes.

Noah Feldman: You know, as you can guess, and as you will know as soon as you read this book, it's a classic, a great read. And every anecdote could have a name, and this one is the Adventure of the Drunken Anthropologists. You can play a game where every Malcolm Gladwell anecdote could be turned into the name of a Sherlock Holmes story. So these anthropologists, they do really well at talking to strangers, it seems, at least in your telling. I'm sure not all anthropologists do well, but that's what they're trying to do. So I wonder: is there something we could learn from either salesmen or anthropologists about how to go about talking to strangers?

Malcolm Gladwell: The salesman part is really interesting, because what is it that a successful salesperson does? The first thing they do, before they size up the person they're trying to sell to, is they sell themselves. They establish their credibility, their friendliness, their interest.
Malcolm Gladwell: You know, in one of my earlier books I had a profile of the number one car salesman in America, who was this guy in rural New Jersey. And he was really, really fascinating. His record was, I mean, he was like an order of magnitude better than anybody else. And there was nothing... he wasn't a fast-talking, slick guy. He was the opposite. When you met him, he just oozed authenticity, and he impressed upon you, before he even sized you up, the fact that he was a straight-shooting, normal guy who was not trying to hustle you. On the contrary, he just cared about serving your interests. And he was genuine. I think he genuinely believed that's what he was doing.

Noah Feldman: But there are two kinds of genuine. There's he genuinely believed that's what he was doing, and there's that is genuinely what he was doing.

Malcolm Gladwell: I think that's genuinely what he was doing, because he pointed out he was just doing people the favor of selling to them.
338 00:18:19,116 --> 00:18:21,636 Speaker 1: Because he made this really interesting point, which is 339 00:18:21,676 --> 00:18:24,116 Speaker 1: that if you're a car salesman, and I love the 340 00:18:24,156 --> 00:18:27,756 Speaker 1: fact that we've now detoured into car salesmen, but he 341 00:18:27,836 --> 00:18:30,796 Speaker 1: made this point about the mistakes. I asked 342 00:18:30,836 --> 00:18:32,796 Speaker 1: him to talk about, what are the mistakes your peers make? 343 00:18:32,996 --> 00:18:35,436 Speaker 1: Why does everyone else do such a bad 344 00:18:35,476 --> 00:18:37,556 Speaker 1: job of selling cars and you are so good? And 345 00:18:37,556 --> 00:18:39,316 Speaker 1: he said, well, the mistake they make is that they 346 00:18:39,356 --> 00:18:43,876 Speaker 1: think it's all about the person in front of them. 347 00:18:43,916 --> 00:18:45,876 Speaker 1: But he said, no, no, success in this business is 348 00:18:45,876 --> 00:18:49,836 Speaker 1: about referrals. He's like, a successful sale for me is, 349 00:18:50,076 --> 00:18:52,076 Speaker 1: even if I don't sell the person in front of me 350 00:18:52,116 --> 00:18:54,076 Speaker 1: a car, they liked me so much that they 351 00:18:54,116 --> 00:18:56,796 Speaker 1: go home, and when their friends are buying a car, 352 00:18:57,276 --> 00:18:59,196 Speaker 1: they say, oh, you should go talk to that guy. 353 00:18:59,236 --> 00:19:03,596 Speaker 1: He was really nice. It was a volume business. 354 00:19:03,676 --> 00:19:06,236 Speaker 1: Yes, you want that, and that person might tell 355 00:19:06,316 --> 00:19:09,836 Speaker 1: four people and you might sell three cars because of 356 00:19:09,836 --> 00:19:12,396 Speaker 1: that one successful sale.
So the other rule he had was 357 00:19:13,156 --> 00:19:16,956 Speaker 1: you can never dismiss anyone: anyone who comes into the 358 00:19:16,996 --> 00:19:21,356 Speaker 1: dealership deserves your full attention. So he never judged anyone. 359 00:19:21,596 --> 00:19:25,516 Speaker 1: If you were a seven-year-old kid, he would 360 00:19:25,556 --> 00:19:28,236 Speaker 1: treat you as seriously as if you were, you know, 361 00:19:28,356 --> 00:19:30,436 Speaker 1: a millionaire. Why? Because you don't know who the 362 00:19:30,516 --> 00:19:33,956 Speaker 1: seven-year-old kid's father or uncle or grandfather is. So 363 00:19:34,036 --> 00:19:35,836 Speaker 1: he would spend a whole afternoon with the seven-year-old kid. 364 00:19:35,836 --> 00:19:37,196 Speaker 1: I thought it was kind of fascinating. But 365 00:19:37,196 --> 00:19:41,196 Speaker 1: the point was, he begins with establishing his own credibility. 366 00:19:41,276 --> 00:19:44,156 Speaker 1: And this is of course exactly what Brian Encinia, 367 00:19:44,236 --> 00:19:47,076 Speaker 1: the cop in the Sandra Bland case, does not do. 368 00:19:47,876 --> 00:19:51,076 Speaker 1: He behaves without regard for his own credibility. In fact, 369 00:19:51,116 --> 00:19:55,316 Speaker 1: he blows his credibility before she even meets him by 370 00:19:56,116 --> 00:19:59,596 Speaker 1: pulling her over on a nonsensical traffic stop. And the 371 00:19:59,636 --> 00:20:02,956 Speaker 1: idea that we have trained police officers to behave in 372 00:20:02,996 --> 00:20:05,876 Speaker 1: such a way that their credibility is destroyed before they 373 00:20:05,956 --> 00:20:09,956 Speaker 1: even meet the person they're pulling over is incredible. Right. 374 00:20:10,556 --> 00:20:13,916 Speaker 1: We should be taking the people training cops and sending 375 00:20:13,916 --> 00:20:16,676 Speaker 1: them to meet with this car salesman, and he could 376 00:20:16,676 --> 00:20:20,316 Speaker 1: teach them something, right.
So, I mean, it's funny because 377 00:20:21,636 --> 00:20:22,916 Speaker 1: I spent a lot of time in the book with 378 00:20:22,956 --> 00:20:26,836 Speaker 1: these two brilliant criminologists who have rethought a lot of 379 00:20:28,196 --> 00:20:31,996 Speaker 1: policing, David Weisburd and Larry Sherman, and they think endlessly about this. 380 00:20:32,396 --> 00:20:35,076 Speaker 1: It's like, the first task of a police officer in 381 00:20:35,196 --> 00:20:40,476 Speaker 1: dealing with any member of the public is to establish 382 00:20:40,556 --> 00:20:44,756 Speaker 1: their credibility and integrity. You can't do anything before you 383 00:20:44,796 --> 00:20:48,036 Speaker 1: do that, whereas presumably what a lot of people think 384 00:20:48,076 --> 00:20:51,556 Speaker 1: is that they should establish their authority. Yes, exactly, and 385 00:20:51,596 --> 00:20:54,556 Speaker 1: there's a huge difference between those 386 00:20:54,556 --> 00:20:58,636 Speaker 1: two things. Now, when you talk, though, about the salesman 387 00:20:58,676 --> 00:21:03,436 Speaker 1: who established his authority, that's also what your spies did 388 00:21:03,676 --> 00:21:05,676 Speaker 1: in the book. Yeah, right, they were really good at 389 00:21:05,756 --> 00:21:09,196 Speaker 1: establishing their authority. Yeah, there are a lot of spies in 390 00:21:09,196 --> 00:21:10,556 Speaker 1: the book, in case you love that, a lot 391 00:21:10,556 --> 00:21:13,916 Speaker 1: of spies and a lot of really cool spies. Yeah, 392 00:21:13,956 --> 00:21:19,036 Speaker 1: and so you establish authority, and then the other person 393 00:21:19,636 --> 00:21:22,356 Speaker 1: will believe whatever you say.
Do you think people believed 394 00:21:22,396 --> 00:21:25,276 Speaker 1: what the salesman said, or did they discount what he was saying, 395 00:21:25,396 --> 00:21:27,156 Speaker 1: but they needed to buy a car, and they figured 396 00:21:27,156 --> 00:21:28,316 Speaker 1: he was a nice guy and they might as well 397 00:21:28,356 --> 00:21:31,876 Speaker 1: buy the car from him? No, I think that 398 00:21:31,876 --> 00:21:34,076 Speaker 1: when they enter a dealership, 399 00:21:34,116 --> 00:21:39,516 Speaker 1: they are desperate to find someone who seems to have 400 00:21:39,556 --> 00:21:41,676 Speaker 1: their best interest at heart. I mean, I think, 401 00:21:41,796 --> 00:21:44,116 Speaker 1: I mean, his argument was that people enter a car 402 00:21:44,156 --> 00:21:48,796 Speaker 1: dealership and their expectations are really, really low. They've had 403 00:21:48,796 --> 00:21:51,396 Speaker 1: so many bad experiences over the years, and just 404 00:21:51,436 --> 00:21:53,916 Speaker 1: by being a normal human being he can outsell 405 00:21:53,956 --> 00:22:00,156 Speaker 1: anyone else in America. It's a stunning thought. But spies. 406 00:22:00,516 --> 00:22:04,036 Speaker 1: So since writing the book, the more I think 407 00:22:04,036 --> 00:22:07,356 Speaker 1: about this, the more of a radical position I take 408 00:22:07,356 --> 00:22:09,676 Speaker 1: on spies. Okay, good, let's 409 00:22:09,676 --> 00:22:13,516 Speaker 1: get into some spy radicalism. So spies. I 410 00:22:13,556 --> 00:22:15,636 Speaker 1: was reading this article in one of the, you know, 411 00:22:15,636 --> 00:22:20,116 Speaker 1: there are all these journals devoted to spies. They're academic, 412 00:22:20,796 --> 00:22:23,156 Speaker 1: and everybody who writes in there, like, they're all 413 00:22:23,236 --> 00:22:26,876 Speaker 1: ex-CIA officers or ex-MI6 officers.
And there was one 414 00:22:26,996 --> 00:22:30,156 Speaker 1: really, really brilliant essay I read recently by a guy 415 00:22:30,156 --> 00:22:32,076 Speaker 1: who said, you know, if you take the long view 416 00:22:32,796 --> 00:22:35,236 Speaker 1: and you see that, well, during the Cold War we 417 00:22:35,316 --> 00:22:40,596 Speaker 1: had Aldrich Ames and Robert Hanssen and a couple 418 00:22:40,596 --> 00:22:42,836 Speaker 1: of other ones. But if you look at all of 419 00:22:42,876 --> 00:22:46,236 Speaker 1: the damage they did, they basically gave away all 420 00:22:46,276 --> 00:22:48,196 Speaker 1: of our key secrets. It only took a few, but 421 00:22:48,276 --> 00:22:50,516 Speaker 1: they gave away all of the secrets, because 422 00:22:50,516 --> 00:22:52,356 Speaker 1: they were really high up, both in the CIA and the FBI. 423 00:22:52,516 --> 00:22:54,556 Speaker 1: And on the flip side, we had a couple of 424 00:22:54,596 --> 00:22:56,796 Speaker 1: people who came over from the Soviet side who basically 425 00:22:56,796 --> 00:22:58,996 Speaker 1: gave away all their key secrets. He said, so it 426 00:22:59,076 --> 00:23:01,916 Speaker 1: was a wash in the end, like the Cold War itself. 427 00:23:01,996 --> 00:23:03,916 Speaker 1: Like the Cold War itself. And so 428 00:23:03,956 --> 00:23:05,716 Speaker 1: this guy, who is himself a spy guy...
429 00:23:05,756 --> 00:23:07,636 Speaker 1: He's like, looking at the evidence of the Cold War, 430 00:23:07,916 --> 00:23:10,756 Speaker 1: we should just give up. We should 431 00:23:10,756 --> 00:23:14,196 Speaker 1: just have saved ourselves billions of dollars by shutting 432 00:23:14,196 --> 00:23:19,236 Speaker 1: down all of the covert espionage operations of the CIA. 433 00:23:19,316 --> 00:23:22,676 Speaker 1: We ended up no further ahead than 434 00:23:22,716 --> 00:23:24,356 Speaker 1: we would have been if we had had no spies 435 00:23:24,436 --> 00:23:26,116 Speaker 1: at all. And I tell the story in the book. 436 00:23:26,236 --> 00:23:27,556 Speaker 1: We need a treaty for that, though, right? Because it 437 00:23:27,636 --> 00:23:29,596 Speaker 1: has to be bilateral. It doesn't work if we give up 438 00:23:29,636 --> 00:23:31,116 Speaker 1: our spies and they don't give up their spies. It 439 00:23:31,116 --> 00:23:32,796 Speaker 1: has to be, you have to sign a treaty. We 440 00:23:32,836 --> 00:23:35,836 Speaker 1: have to have total disclosure of everything. Total disclosure, which everyone 441 00:23:35,836 --> 00:23:41,196 Speaker 1: would then lie about. But no, 442 00:23:41,196 --> 00:23:42,156 Speaker 1: no, no, you don't have to do it. You 443 00:23:42,156 --> 00:23:44,716 Speaker 1: don't have to do it bilaterally. Think about this: it can 444 00:23:44,756 --> 00:23:47,996 Speaker 1: be done unilaterally. If we shut down 445 00:23:47,996 --> 00:23:50,676 Speaker 1: our spy service, then there is no spy service for 446 00:23:50,716 --> 00:23:55,036 Speaker 1: them to infiltrate. Right, we have removed the target, 447 00:23:55,036 --> 00:23:57,796 Speaker 1: there are no spies, nobody to spy on, 448 00:23:57,796 --> 00:23:59,596 Speaker 1: there's nothing. What are they gonna do?
So the Soviets 449 00:23:59,596 --> 00:24:01,876 Speaker 1: send a spy? Like, go and spy on the Department of 450 00:24:01,916 --> 00:24:07,076 Speaker 1: Agriculture? Like, go ahead, it's all yours, go right in. No, 451 00:24:07,116 --> 00:24:09,316 Speaker 1: there's no more... well, there is the Defense Department. I 452 00:24:09,316 --> 00:24:10,796 Speaker 1: mean, I like this. I like that, I like where 453 00:24:10,796 --> 00:24:13,796 Speaker 1: you're going here. And maybe the Defense... but there is, 454 00:24:13,836 --> 00:24:17,516 Speaker 1: like, the Defense Department, I know. But mostly what the 455 00:24:17,556 --> 00:24:20,276 Speaker 1: Soviets are doing is spying on our spies, and we're 456 00:24:20,276 --> 00:24:21,916 Speaker 1: spying on theirs. Although I guess the Chinese do 457 00:24:21,916 --> 00:24:25,076 Speaker 1: it differently, because they are mostly trying to steal technology, 458 00:24:25,116 --> 00:24:28,796 Speaker 1: technology and secrets. Okay, technology and secrets. 459 00:24:28,956 --> 00:24:31,556 Speaker 1: But it's clear that an awful lot of this activity 460 00:24:31,636 --> 00:24:34,836 Speaker 1: is just a merry-go-round, yes. A merry-go-round. 461 00:24:34,956 --> 00:24:37,356 Speaker 1: And billions and billions and billions of dollars were spent 462 00:24:37,396 --> 00:24:40,236 Speaker 1: on the merry-go-round, right. So I found this, 463 00:24:40,356 --> 00:24:41,996 Speaker 1: like, for a spy to say this... And I 464 00:24:42,236 --> 00:24:45,516 Speaker 1: tell the story, I open the book 465 00:24:45,516 --> 00:24:48,916 Speaker 1: with a story of how a 466 00:24:48,916 --> 00:24:52,956 Speaker 1: defector comes over from Cuba and he calls together all the leading 467 00:24:53,076 --> 00:24:57,956 Speaker 1: CIA people running our Cuban espionage operations.
He says, 468 00:24:57,996 --> 00:25:00,956 Speaker 1: not one, not two, not three, but every 469 00:25:00,996 --> 00:25:03,636 Speaker 1: single spy you have in Cuba right now is a 470 00:25:03,676 --> 00:25:06,876 Speaker 1: double agent working for Castro. The whole thing. So 471 00:25:06,996 --> 00:25:10,756 Speaker 1: it was pointless. Why bother? You know, we 472 00:25:10,796 --> 00:25:13,556 Speaker 1: had an operation in Havana inside the, whatever it is, 473 00:25:13,556 --> 00:25:16,196 Speaker 1: the Swiss embassy, for years and years, and the whole 474 00:25:16,196 --> 00:25:18,676 Speaker 1: thing was a wash, worse than a wash. 475 00:25:18,796 --> 00:25:20,676 Speaker 1: But okay, let me try to up 476 00:25:20,676 --> 00:25:23,876 Speaker 1: the radicalism here. So you can imagine someone saying, well, 477 00:25:23,916 --> 00:25:27,076 Speaker 1: this is true of all arms races, right? Each side 478 00:25:27,116 --> 00:25:29,916 Speaker 1: goes like crazy. And there is a theory that 479 00:25:29,916 --> 00:25:31,836 Speaker 1: the way the United States quote unquote 480 00:25:31,836 --> 00:25:33,836 Speaker 1: won the Cold War was just by outspending the Soviets. 481 00:25:33,876 --> 00:25:35,756 Speaker 1: At some point it's just, we waste more 482 00:25:35,756 --> 00:25:37,276 Speaker 1: money than you do, and since we can afford to, 483 00:25:37,356 --> 00:25:41,196 Speaker 1: we win. Yeah. Depressing, but maybe it's true. But if 484 00:25:41,196 --> 00:25:43,276 Speaker 1: you ask the Defense Department people, they would say, 485 00:25:43,516 --> 00:25:47,036 Speaker 1: what do you mean, intelligence services are expensive?
They're really cheap, 486 00:25:47,756 --> 00:25:50,556 Speaker 1: like, you can have an entire intelligence service of thousands 487 00:25:50,556 --> 00:25:52,796 Speaker 1: and thousands of spies for the price of, like, one 488 00:25:52,916 --> 00:25:57,636 Speaker 1: fighter bomber. Yeah, the equipment is really expensive, but 489 00:25:57,756 --> 00:26:01,596 Speaker 1: humans are relatively inexpensive, and they don't actually have, 490 00:26:01,676 --> 00:26:04,196 Speaker 1: you know, the James Bond, you know, flying cars. I 491 00:26:04,196 --> 00:26:07,236 Speaker 1: mean, there is no Q, strictly speaking. Well, so, you know, 492 00:26:07,676 --> 00:26:11,916 Speaker 1: yes, all right. Although I will say, remember 493 00:26:11,956 --> 00:26:16,596 Speaker 1: after Snowden gives away the 494 00:26:16,636 --> 00:26:19,436 Speaker 1: store at the NSA, there was all this handwringing about 495 00:26:19,436 --> 00:26:21,836 Speaker 1: how much it was going to cost, and the numbers 496 00:26:21,836 --> 00:26:24,036 Speaker 1: were, I mean, they were not trivial. They were talking 497 00:26:24,076 --> 00:26:26,596 Speaker 1: about many, many, many billions of dollars. Which again raised 498 00:26:26,596 --> 00:26:29,276 Speaker 1: the point: if this thing that you've constructed over 499 00:26:29,356 --> 00:26:33,396 Speaker 1: many, many years, costing tens of billions of dollars, can 500 00:26:33,436 --> 00:26:36,796 Speaker 1: be essentially destroyed by one guy working for Dell, not 501 00:26:36,876 --> 00:26:40,596 Speaker 1: even, like, some random dude way off in the middle 502 00:26:40,596 --> 00:26:44,436 Speaker 1: of nowhere who is a subcontractor for Dell and who 503 00:26:44,876 --> 00:26:48,556 Speaker 1: himself, you know, never even got a college education.
Yeah, 504 00:26:48,636 --> 00:26:50,596 Speaker 1: and he tries to work for the CIA, 505 00:26:50,676 --> 00:26:52,276 Speaker 1: gets a job because, like, his uncle gets him in, 506 00:26:52,476 --> 00:26:56,476 Speaker 1: gets kicked out. Why? Because he hacks into the personnel 507 00:26:56,556 --> 00:26:59,556 Speaker 1: database and changes his job evaluations so he looks good. 508 00:26:59,756 --> 00:27:02,916 Speaker 1: They discover this and are like, oh, this guy is, 509 00:27:02,916 --> 00:27:05,996 Speaker 1: a, a fraud, b, a hacker, and in the movies they would 510 00:27:05,996 --> 00:27:08,676 Speaker 1: have promoted him, and, c, he only got a 511 00:27:08,716 --> 00:27:10,836 Speaker 1: job because his uncle weighed in. So what do they do? 512 00:27:10,836 --> 00:27:13,076 Speaker 1: They fire him. What does he do? He resurfaces at 513 00:27:13,116 --> 00:27:16,196 Speaker 1: Dell and gets in again. Like, if this is the system, 514 00:27:16,356 --> 00:27:19,716 Speaker 1: is there any point to having the system? There 515 00:27:19,836 --> 00:27:22,676 Speaker 1: is a point where 516 00:27:23,076 --> 00:27:26,316 Speaker 1: you have to wonder. And it's funny, when you think 517 00:27:26,356 --> 00:27:28,756 Speaker 1: about it, how many American institutions are in the grip 518 00:27:28,796 --> 00:27:33,636 Speaker 1: of arms races. I mean, basically, what twenty-first-century 519 00:27:33,676 --> 00:27:37,756 Speaker 1: capitalism is is a series of arms races. It 520 00:27:37,836 --> 00:27:39,716 Speaker 1: is, you start with the marketplace and you 521 00:27:39,836 --> 00:27:43,876 Speaker 1: rapidly move to an arms race that has no 522 00:27:43,916 --> 00:27:47,156 Speaker 1: productive function. Coke versus Pepsi.
I was going to say 523 00:27:47,476 --> 00:27:51,916 Speaker 1: Harvard versus Yale, but that would maybe be a better, 524 00:27:51,956 --> 00:27:55,236 Speaker 1: more expensive, example. Or, you know, Harvard Deaconess 525 00:27:55,356 --> 00:27:58,596 Speaker 1: versus, you know, some other hospital. I mean, there 526 00:27:58,636 --> 00:27:59,996 Speaker 1: are all of these things, and they all have the 527 00:28:00,036 --> 00:28:02,516 Speaker 1: same function, which is, they're on the same kind of treadmill. 528 00:28:02,636 --> 00:28:05,196 Speaker 1: And there's a point at which the spending 529 00:28:05,236 --> 00:28:07,636 Speaker 1: no longer has any productive function. So there's, 530 00:28:07,636 --> 00:28:09,956 Speaker 1: I mean, there's a standard answer to the question of 531 00:28:09,956 --> 00:28:11,996 Speaker 1: why we bother to compete, even though a lot 532 00:28:11,996 --> 00:28:15,556 Speaker 1: of competition is idiotic and pointless. Usually it's, the alternative 533 00:28:16,236 --> 00:28:19,356 Speaker 1: is that we're colluding, and if we collude, we'll have 534 00:28:19,396 --> 00:28:23,076 Speaker 1: no incentive to try to do our jobs well. Yeah, 535 00:28:23,156 --> 00:28:26,276 Speaker 1: do you think that's just basically ridiculous? Well, you 536 00:28:26,276 --> 00:28:32,436 Speaker 1: can compete on things that matter, on what counts. Well, 537 00:28:32,476 --> 00:28:35,196 Speaker 1: it would be nice, for example, if universities competed on 538 00:28:35,356 --> 00:28:37,796 Speaker 1: how well they were educating the children who needed to be 539 00:28:37,916 --> 00:28:42,916 Speaker 1: educated, for example. I mean, that would seem 540 00:28:42,956 --> 00:28:47,676 Speaker 1: to be... We think we are competing on that, 541 00:28:48,116 --> 00:28:49,956 Speaker 1: like, that's what we tell ourselves. That's the sad thing 542 00:28:49,956 --> 00:28:52,636 Speaker 1: about it.
Maybe it's a sad institution that you work for. Yes, 543 00:28:52,756 --> 00:28:54,676 Speaker 1: I did hear the podcast about 544 00:28:54,676 --> 00:28:58,596 Speaker 1: the LSAT. So yeah, I was 545 00:28:58,636 --> 00:29:00,756 Speaker 1: going to tell you, by the way, probably most 546 00:29:00,756 --> 00:29:03,436 Speaker 1: people here have heard Malcolm's episode about the LSAT. 547 00:29:03,636 --> 00:29:08,196 Speaker 1: I only give eight-hour, open-book, open-everything exams. Yeah, I've 548 00:29:08,236 --> 00:29:10,116 Speaker 1: never done the... So good for you. Yeah. 549 00:29:10,116 --> 00:29:12,196 Speaker 1: And as you say, it seems... You 550 00:29:12,236 --> 00:29:14,676 Speaker 1: revealed to us what your LSAT score was. Well, I 551 00:29:14,756 --> 00:29:16,836 Speaker 1: revealed to you what my score was. I got a 552 00:29:16,876 --> 00:29:18,836 Speaker 1: lower score on the LSAT than I ever got 553 00:29:18,876 --> 00:29:23,116 Speaker 1: on any other standardized test. Okay. Is that good? No. 554 00:29:27,036 --> 00:29:29,916 Speaker 1: Were you impressed by its idiocy? I 555 00:29:29,996 --> 00:29:33,716 Speaker 1: was deeply impressed by the idiocy. Yeah, the logic games, 556 00:29:33,796 --> 00:29:36,716 Speaker 1: I just thought, truth be told, 557 00:29:36,716 --> 00:29:39,756 Speaker 1: they bear no relationship to anything that one ever does 558 00:29:39,876 --> 00:29:41,996 Speaker 1: in the law. Even leaving the speed out of it. 559 00:29:42,196 --> 00:29:44,516 Speaker 1: The speed, yeah. For those of you who haven't listened, the episode was all 560 00:29:44,556 --> 00:29:49,156 Speaker 1: about, I didn't understand why they had time limits. Yeah.
561 00:29:49,636 --> 00:29:52,196 Speaker 1: And then when I went to the people who 562 00:29:52,236 --> 00:29:54,276 Speaker 1: make and administer the LSAT and asked that question, 563 00:29:54,396 --> 00:29:59,116 Speaker 1: it's like, no one had a good answer. Well, no, 564 00:29:59,156 --> 00:30:00,836 Speaker 1: it was worse than that. It was like it had 565 00:30:00,876 --> 00:30:02,556 Speaker 1: never occurred to them that that would 566 00:30:02,596 --> 00:30:05,556 Speaker 1: be an issue. Well, I think you got 567 00:30:05,556 --> 00:30:07,196 Speaker 1: the right answer. I mean, you buried the answer a 568 00:30:07,236 --> 00:30:09,556 Speaker 1: little bit in the episode. But you pointed out that 569 00:30:10,156 --> 00:30:12,756 Speaker 1: law school exams are typically timed, and they measure whether 570 00:30:12,756 --> 00:30:14,156 Speaker 1: the LSAT is any good by asking whether it 571 00:30:14,196 --> 00:30:17,476 Speaker 1: predicts first-year grades. So 572 00:30:17,476 --> 00:30:19,276 Speaker 1: it's really the law schools' fault. It's not the fault 573 00:30:19,316 --> 00:30:21,156 Speaker 1: of the exam; it really lies with the law schools. 574 00:30:21,156 --> 00:30:23,596 Speaker 1: But it was weird to find someone who was engaged 575 00:30:23,676 --> 00:30:27,436 Speaker 1: in doing something and was radically incurious about the reasons 576 00:30:27,436 --> 00:30:30,556 Speaker 1: for doing that thing. Right, that's only weird to you, Malcolm. 577 00:30:31,076 --> 00:30:33,316 Speaker 1: Most of the world is people doing whatever they do 578 00:30:33,356 --> 00:30:35,436 Speaker 1: every day and not thinking at all about what 579 00:30:35,436 --> 00:30:37,836 Speaker 1: it is that they're doing. Yeah, no, it's funny. And 580 00:30:37,876 --> 00:30:40,356 Speaker 1: the other thing... no, I won't.
I was going 581 00:30:40,436 --> 00:30:43,236 Speaker 1: to say something disparaging, but I won't. You can, 582 00:30:42,556 --> 00:30:47,196 Speaker 1: you can be disparaging. No. The other thing, this is 583 00:30:47,236 --> 00:30:52,196 Speaker 1: totally tangential and parenthetical, is that you have 584 00:30:52,236 --> 00:30:59,116 Speaker 1: an institution which is constructing these tests and administering 585 00:30:59,156 --> 00:31:02,676 Speaker 1: them and has been doing so for seventy years, and, 586 00:31:02,956 --> 00:31:04,836 Speaker 1: by the way, it makes a lot of money doing it, 587 00:31:05,196 --> 00:31:08,996 Speaker 1: and, you know, no one's particularly challenging their right to 588 00:31:08,996 --> 00:31:12,996 Speaker 1: do so. And they don't serve any real productive function. 589 00:31:13,156 --> 00:31:17,236 Speaker 1: So the only reason to persist is if they're having fun. Right? 590 00:31:17,796 --> 00:31:19,436 Speaker 1: So that's what I was looking for. I was like, okay, 591 00:31:19,476 --> 00:31:21,516 Speaker 1: behind the closed door 592 00:31:21,516 --> 00:31:23,956 Speaker 1: they would all be chortling to themselves, you know. So 593 00:31:23,996 --> 00:31:26,556 Speaker 1: my thought was, there's no reason 594 00:31:26,636 --> 00:31:28,676 Speaker 1: for you guys to be doing what you're doing unless 595 00:31:28,916 --> 00:31:30,556 Speaker 1: you wake up in the morning with joy in your 596 00:31:30,556 --> 00:31:33,076 Speaker 1: heart and think, I'm going to come up with some 597 00:31:33,156 --> 00:31:37,236 Speaker 1: really, like, killer questions. And, I don't even... but this 598 00:31:37,276 --> 00:31:39,996 Speaker 1: example actually underscores the question 599 00:31:40,036 --> 00:31:42,436 Speaker 1: I was raising a couple of minutes ago. Namely, so 600 00:31:42,916 --> 00:31:45,276 Speaker 1: the LSAC, the law school...
I don't even know what 601 00:31:45,276 --> 00:31:48,916 Speaker 1: that all stands for. They have no competition, right? I mean, 602 00:31:48,916 --> 00:31:50,396 Speaker 1: as you say, they're doing it for fun. I mean, 603 00:31:50,436 --> 00:31:52,236 Speaker 1: in fact, in recent years a couple of law schools, 604 00:31:52,236 --> 00:31:54,716 Speaker 1: including mine, have started experimenting by saying you could 605 00:31:54,716 --> 00:31:57,596 Speaker 1: take the GRE. Yeah, I love that. That's what they 606 00:31:57,636 --> 00:32:01,316 Speaker 1: call an experiment. I mean, it's a ridiculous thing: instead of one ridiculous 607 00:32:01,316 --> 00:32:07,396 Speaker 1: standardized test, you can take another standardized one, which does 608 00:32:07,436 --> 00:32:09,836 Speaker 1: not have logic games on it, unless I'm forgetting. 609 00:32:10,036 --> 00:32:13,276 Speaker 1: But I mean, the justification, if there is one for that, 610 00:32:13,636 --> 00:32:16,196 Speaker 1: it would only be, if you have two different tests, 611 00:32:16,516 --> 00:32:19,476 Speaker 1: then maybe that would create some competition, and it would 612 00:32:19,556 --> 00:32:21,876 Speaker 1: lead the people who make up the tests, in theory, 613 00:32:21,876 --> 00:32:23,996 Speaker 1: I mean, this is all in theory, to think about 614 00:32:24,036 --> 00:32:27,196 Speaker 1: whether, you know, they could do something differently, instead of, 615 00:32:27,236 --> 00:32:29,676 Speaker 1: just as you say, just being out there to have fun. 616 00:32:29,756 --> 00:32:31,596 Speaker 1: Not that having fun is such a bad motive. No, no, 617 00:32:31,636 --> 00:32:33,476 Speaker 1: that's why I did my job.
If I 618 00:32:33,516 --> 00:32:35,396 Speaker 1: had thought they were having fun, I would have called 619 00:32:35,396 --> 00:32:37,636 Speaker 1: the whole thing off and not attacked them. Right, 620 00:32:37,836 --> 00:32:39,916 Speaker 1: but, right, but they appeared not to be having fun. 621 00:32:40,196 --> 00:32:42,876 Speaker 1: They committed the sin of not having fun, unlike the 622 00:32:42,996 --> 00:32:48,436 Speaker 1: drunken Anthropologists. Yes, yes, yes, who did have fun. Yeah. 623 00:32:48,476 --> 00:32:52,716 Speaker 1: I was fascinated by the chapter about alcohol. There 624 00:32:52,756 --> 00:32:54,796 Speaker 1: was tons, I mean, there's always stuff in every chapter 625 00:32:54,916 --> 00:32:56,876 Speaker 1: that I don't know, but in that chapter I feel 626 00:32:56,876 --> 00:32:58,996 Speaker 1: like I didn't know anything. Like, everything I thought about 627 00:32:59,036 --> 00:33:03,236 Speaker 1: alcohol consumption was wrong. Can you tell the story of 628 00:33:03,276 --> 00:33:08,956 Speaker 1: the drunken Anthropologists? I can, I can. So that chapter 629 00:33:10,716 --> 00:33:14,036 Speaker 1: began because, since I was interested 630 00:33:14,076 --> 00:33:18,196 Speaker 1: in this question of conversations between strangers going awry, naturally 631 00:33:18,236 --> 00:33:24,916 Speaker 1: I thought that campus sexual assault would be a reasonable 632 00:33:24,916 --> 00:33:27,636 Speaker 1: place to start. That seems to be, albeit a highly 633 00:33:27,596 --> 00:33:30,796 Speaker 1: controversial, highly contested area, a place where some part of 634 00:33:30,836 --> 00:33:34,396 Speaker 1: that problem is that conversations are clearly going 635 00:33:34,436 --> 00:33:37,516 Speaker 1: badly awry, right?
So I began to go and talk 636 00:33:37,516 --> 00:33:39,796 Speaker 1: to people who studied this problem, and there are many 637 00:33:39,836 --> 00:33:42,796 Speaker 1: of them, and all of them, five minutes into the 638 00:33:42,836 --> 00:33:44,956 Speaker 1: conversation, would say, well, you know, this is about alcohol. 639 00:33:45,476 --> 00:33:47,556 Speaker 1: They all said this, and I realized, okay, so maybe 640 00:33:47,556 --> 00:33:50,356 Speaker 1: I should rethink and start thinking more about alcohol. 641 00:33:52,116 --> 00:33:56,956 Speaker 1: And that's interesting on a number of levels, 642 00:33:58,476 --> 00:34:03,116 Speaker 1: in part because if you read books about campus assault, 643 00:34:03,356 --> 00:34:05,956 Speaker 1: there are books written about it that don't mention alcohol, 644 00:34:06,076 --> 00:34:08,996 Speaker 1: which is quite incredible. You really do see this 645 00:34:09,156 --> 00:34:11,516 Speaker 1: gap between the way an issue is discussed in public 646 00:34:11,756 --> 00:34:14,556 Speaker 1: and the way an issue is discussed by the research community. 647 00:34:14,836 --> 00:34:18,036 Speaker 1: Anyway, so it led to this long question about, okay, 648 00:34:18,076 --> 00:34:23,276 Speaker 1: so what happens when you're drunk that 649 00:34:23,396 --> 00:34:28,156 Speaker 1: might impair the conversation that's being had between 650 00:34:28,356 --> 00:34:30,916 Speaker 1: two strangers at a party, or what have you?
And 651 00:34:31,436 --> 00:34:33,956 Speaker 1: the common position is that what happens when you drink 652 00:34:34,076 --> 00:34:37,156 Speaker 1: is you become disinhibited. Which is what I thought. Yes, 653 00:34:37,316 --> 00:34:40,676 Speaker 1: that, simply, the normal, the kind of surface 654 00:34:40,756 --> 00:34:45,156 Speaker 1: constraints on your personality melt away, and some kind 655 00:34:45,196 --> 00:34:48,676 Speaker 1: of purer version of yourself emerges, in vino veritas, right, 656 00:34:48,716 --> 00:34:51,596 Speaker 1: for better or worse. For better or worse. In fact, 657 00:34:51,676 --> 00:34:56,636 Speaker 1: the contemporary position on drunkenness is 658 00:34:56,916 --> 00:34:59,876 Speaker 1: very different from that, and that is that drunkenness causes myopia. 659 00:34:59,956 --> 00:35:02,156 Speaker 1: And what that means is that when you're drunk, what 660 00:35:02,316 --> 00:35:05,956 Speaker 1: happens is your higher cognitive functions start to shut down 661 00:35:06,236 --> 00:35:09,676 Speaker 1: and you're capable only of making sense of things in 662 00:35:09,716 --> 00:35:12,356 Speaker 1: the immediate term, things that are right in front of you. 663 00:35:12,796 --> 00:35:15,236 Speaker 1: And that sounds like a subtle difference, 664 00:35:15,276 --> 00:35:22,076 Speaker 1: but it's significant, because your normal personality 665 00:35:22,476 --> 00:35:25,676 Speaker 1: is a function of a careful weighing of 666 00:35:25,756 --> 00:35:29,956 Speaker 1: short-term versus long-term consequences. Right? Noah, you 667 00:35:30,036 --> 00:35:32,316 Speaker 1: are who you are because you're not just thinking about 668 00:35:32,316 --> 00:35:35,116 Speaker 1: what's happening now. You're thinking about tomorrow and next week.
669 00:35:35,196 --> 00:35:39,116 Speaker 1: And if you say something rude or stupid or offensive 670 00:35:39,156 --> 00:35:42,236 Speaker 1: to me right now, you know it'll matter tomorrow. Right. 671 00:35:42,956 --> 00:35:46,076 Speaker 1: But if you're drunk, that falls away, and what's left 672 00:35:46,436 --> 00:35:49,396 Speaker 1: is not Noah anymore. Because Noah is someone who thinks 673 00:35:49,396 --> 00:35:51,796 Speaker 1: about tomorrow. What's left is the version of Noah that 674 00:35:51,796 --> 00:35:54,956 Speaker 1: doesn't think about tomorrow, which is not Noah. Right. And 675 00:35:55,036 --> 00:35:57,636 Speaker 1: to the extent that we... it's a version of Noah 676 00:35:57,676 --> 00:36:00,876 Speaker 1: that doesn't think about tomorrow. Yes, but that's not a 677 00:36:00,956 --> 00:36:03,916 Speaker 1: version you approve of. True, but it's a separate question. 678 00:36:03,996 --> 00:36:05,716 Speaker 1: I think we can discuss it without a view 679 00:36:05,716 --> 00:36:09,076 Speaker 1: of, like, which is the essential Noah? Oh no, no, 680 00:36:09,276 --> 00:36:11,516 Speaker 1: I would say it's the non essential Noah. Yeah, okay, 681 00:36:14,236 --> 00:36:16,996 Speaker 1: isn't your non essential self the portion of yourself that 682 00:36:17,076 --> 00:36:21,076 Speaker 1: you would not willingly choose to be? Well, that would 683 00:36:21,076 --> 00:36:23,116 Speaker 1: be super nice if that were true. But it might 684 00:36:23,156 --> 00:36:28,476 Speaker 1: be the other way around, right? I mean, it could 685 00:36:28,516 --> 00:36:30,556 Speaker 1: be that the portion of myself that I don't want 686 00:36:30,596 --> 00:36:33,116 Speaker 1: to be is actually the essential me, and that the me 687 00:36:33,316 --> 00:36:36,276 Speaker 1: that all the work goes into producing, the public me, 688 00:36:36,636 --> 00:36:38,876 Speaker 1: is the inessential me.
Do you want to lie on 689 00:36:38,876 --> 00:36:42,796 Speaker 1: the couch? I mean, I'm getting to that. 690 00:36:42,876 --> 00:36:44,876 Speaker 1: Believe me, I'm getting to... I'm going to get to 691 00:36:44,956 --> 00:36:49,196 Speaker 1: psychoanalysis and talking to strangers, I promise you. Okay. Regardless 692 00:36:49,196 --> 00:36:53,676 Speaker 1: of whether drunken Noah is the anti Noah or simply 693 00:36:54,676 --> 00:36:59,756 Speaker 1: altered Noah, it's not typical Noah, and it's not ideal Noah. Yes. Right. 694 00:37:00,316 --> 00:37:04,436 Speaker 1: So if the drunken version of ourselves 695 00:37:04,596 --> 00:37:10,036 Speaker 1: is this radically altered, less than optimal version, then that's 696 00:37:10,116 --> 00:37:14,316 Speaker 1: hugely problematic, right? And it makes it hard to understand, 697 00:37:14,356 --> 00:37:18,276 Speaker 1: for example, how consent can ever be appropriate when people 698 00:37:18,276 --> 00:37:22,436 Speaker 1: are very drunk, because the notion of consent assumes that 699 00:37:22,476 --> 00:37:24,916 Speaker 1: it is your essential self 700 00:37:24,916 --> 00:37:28,996 Speaker 1: that's consenting, not your altered self. But also, 701 00:37:28,996 --> 00:37:33,636 Speaker 1: it raises the 702 00:37:33,716 --> 00:37:38,196 Speaker 1: question of whether the transformed drunken self is someone who 703 00:37:38,236 --> 00:37:41,476 Speaker 1: is much more likely to engage in criminal behavior, which 704 00:37:41,996 --> 00:37:44,156 Speaker 1: turns out to be true.
Right. That's what a lot 705 00:37:44,196 --> 00:37:47,436 Speaker 1: of this sexual assault is about, is that people get very, 706 00:37:47,516 --> 00:37:55,076 Speaker 1: very drunk thinking that it is a harmless, fun state, 707 00:37:55,396 --> 00:37:57,316 Speaker 1: and in fact it's a state that radically increases their 708 00:37:57,356 --> 00:38:01,476 Speaker 1: chances of being criminal sexual predators, right, on the one 709 00:38:01,516 --> 00:38:04,596 Speaker 1: side of the equation. So anyway, what does this have 710 00:38:04,636 --> 00:38:06,516 Speaker 1: to do with the drunken anthropologists? Well, a lot of 711 00:38:06,556 --> 00:38:09,076 Speaker 1: this rethinking of drinking begins in the fifties, with all 712 00:38:09,236 --> 00:38:12,196 Speaker 1: the anthropological work that starts going around the 713 00:38:12,196 --> 00:38:14,756 Speaker 1: world and observing that in other parts of the world 714 00:38:15,156 --> 00:38:18,636 Speaker 1: drunkenness doesn't look like drunkenness in the United States. Right, 715 00:38:18,636 --> 00:38:22,716 Speaker 1: so drunkenness is culturally determined. So 716 00:38:22,756 --> 00:38:24,836 Speaker 1: it's not just how often you get drunk, but 717 00:38:24,876 --> 00:38:26,996 Speaker 1: what you do when you get 718 00:38:27,076 --> 00:38:31,236 Speaker 1: drunk differs dramatically from culture to culture. So these anthropologists, 719 00:38:31,236 --> 00:38:35,836 Speaker 1: these two lovely... this couple, um, who taught at 720 00:38:35,876 --> 00:38:38,876 Speaker 1: Brown for many years. They went to Bolivia in the 721 00:38:38,956 --> 00:38:41,876 Speaker 1: fifties and they observed that this tribe they were living 722 00:38:41,916 --> 00:38:46,156 Speaker 1: with would drink essentially grain alcohol, about the 723 00:38:46,196 --> 00:38:50,756 Speaker 1: most potent liquor imaginable.
Every Friday night they would get 724 00:38:50,796 --> 00:38:54,876 Speaker 1: together as a group and they would get so wasted, 725 00:38:55,076 --> 00:38:58,436 Speaker 1: like, I mean, they would just drink and drink. But there 726 00:38:58,516 --> 00:39:06,396 Speaker 1: was no observable pathology. It didn't lead to fights, 727 00:39:06,436 --> 00:39:09,356 Speaker 1: it didn't lead to, you know, absenteeism from work, it 728 00:39:09,396 --> 00:39:11,676 Speaker 1: didn't lead to broken marriages. And 729 00:39:11,676 --> 00:39:13,276 Speaker 1: they were, like, so stunned by this: how could 730 00:39:13,276 --> 00:39:15,436 Speaker 1: this be? And the answer is, it goes to this 731 00:39:15,476 --> 00:39:18,756 Speaker 1: question of myopia, that when you are drunk, you are 732 00:39:18,916 --> 00:39:21,036 Speaker 1: at the mercy of your immediate environment. And they had 733 00:39:21,076 --> 00:39:24,356 Speaker 1: constructed an immediate environment that was entirely benign, more than benign, 734 00:39:24,556 --> 00:39:28,036 Speaker 1: that was socially positive. So when they got drunk, yeah, 735 00:39:28,076 --> 00:39:30,556 Speaker 1: they were surrounded by, like, happy things. They sang songs 736 00:39:30,556 --> 00:39:32,716 Speaker 1: and they held hands and it was all love. It was 737 00:39:32,756 --> 00:39:34,516 Speaker 1: like a rave, but with grain alcohol. It was like 738 00:39:34,556 --> 00:39:39,116 Speaker 1: a rave with grain alcohol, yes, exactly. If that's what frat 739 00:39:39,156 --> 00:39:42,796 Speaker 1: parties were, we would not have sexual assaults, right? But 740 00:39:42,876 --> 00:39:46,196 Speaker 1: frat parties are the opposite.
They are places where the 741 00:39:46,236 --> 00:39:49,116 Speaker 1: thing that is immediately in front of these, 742 00:39:49,676 --> 00:39:53,756 Speaker 1: you know, wasted eighteen year olds is not something that 743 00:39:53,796 --> 00:39:55,796 Speaker 1: brings out their best self. It is rather something that 744 00:39:55,836 --> 00:39:58,476 Speaker 1: brings out, in many cases, their worst self. The fact 745 00:39:58,556 --> 00:40:01,236 Speaker 1: that we allow this to happen on campuses and we 746 00:40:01,276 --> 00:40:05,276 Speaker 1: are seemingly oblivious to its consequences enrages me, because the 747 00:40:05,276 --> 00:40:10,036 Speaker 1: answer is to engage in some sort of cultural 748 00:40:10,076 --> 00:40:14,636 Speaker 1: experiment to change the cultural norms. Because, I mean, that's 749 00:40:14,636 --> 00:40:16,796 Speaker 1: the thing about anthropologists, right? They go all over the world. 750 00:40:16,996 --> 00:40:19,516 Speaker 1: They see incredible things, and they report back that what 751 00:40:19,596 --> 00:40:23,036 Speaker 1: we think is intuitive isn't intuitive. But their explanation tends 752 00:40:23,036 --> 00:40:25,116 Speaker 1: to be a teeny bit different from the explanations that 753 00:40:25,516 --> 00:40:27,676 Speaker 1: often come up in the work of other social scientists. 754 00:40:27,716 --> 00:40:30,876 Speaker 1: They tend to say, it's a different culture. So those 755 00:40:30,916 --> 00:40:32,636 Speaker 1: folks had put a lot of time and effort into 756 00:40:32,636 --> 00:40:35,076 Speaker 1: figuring out a culture, maybe it was luck, maybe it 757 00:40:35,156 --> 00:40:37,876 Speaker 1: wasn't, where getting drunk actually gave them some kind of 758 00:40:37,876 --> 00:40:41,516 Speaker 1: communal solidarity, and so maybe we should be trying to 759 00:40:41,556 --> 00:40:44,436 Speaker 1: produce a culture like that.
Yeah, I agree, I think 760 00:40:44,436 --> 00:40:48,596 Speaker 1: we should be. So some 761 00:40:48,636 --> 00:40:51,196 Speaker 1: people have been trying to produce a culture around drinking. 762 00:40:51,236 --> 00:40:53,196 Speaker 1: It's just that the culture that has been produced around 763 00:40:53,276 --> 00:40:56,756 Speaker 1: drinking on campuses in the last twenty five years is 764 00:40:56,796 --> 00:41:02,116 Speaker 1: the most monstrously maladaptive culture imaginable. So we know you 765 00:41:02,156 --> 00:41:06,036 Speaker 1: can produce powerful cultures around it, but we have surrendered 766 00:41:06,076 --> 00:41:07,716 Speaker 1: that task to people who do not have the best 767 00:41:07,716 --> 00:41:12,036 Speaker 1: interests of nineteen year old college students at heart. And 768 00:41:12,076 --> 00:41:16,396 Speaker 1: it's time we went back and took back that particular culture. 769 00:41:16,676 --> 00:41:19,956 Speaker 1: It is quite possible to have fun at a party 770 00:41:19,996 --> 00:41:24,036 Speaker 1: without getting blackout drunk, right? At the very least, it 771 00:41:24,156 --> 00:41:27,996 Speaker 1: makes sense. You know, in retelling the story 772 00:41:28,076 --> 00:41:30,756 Speaker 1: of the Stanford rape case, there are many things that 773 00:41:31,276 --> 00:41:34,196 Speaker 1: strike you. One is that here is a party at 774 00:41:34,236 --> 00:41:36,596 Speaker 1: Stanford where a lot of very young people 775 00:41:36,596 --> 00:41:38,436 Speaker 1: are getting very, very drunk, and there appeared to be 776 00:41:38,476 --> 00:41:44,116 Speaker 1: no adults anywhere, no sober people present. How is that a good idea? Like, 777 00:41:44,236 --> 00:41:46,356 Speaker 1: here's a college that... it's not like there is a 778 00:41:46,436 --> 00:41:49,356 Speaker 1: lack of resources at Stanford for this kind of thing.
779 00:41:50,396 --> 00:41:53,476 Speaker 1: When I was, you know, getting drunk in college 780 00:41:53,516 --> 00:41:55,516 Speaker 1: at the University of Toronto in the nineteen eighties, there 781 00:41:55,516 --> 00:41:58,876 Speaker 1: were always sober adults at our parties. It just 782 00:41:58,916 --> 00:42:01,716 Speaker 1: was the way it was constructed, and so whenever something 783 00:42:01,716 --> 00:42:03,636 Speaker 1: got out of hand, the sober adult came in and 784 00:42:03,756 --> 00:42:06,636 Speaker 1: made sure it didn't go too far. I suspect that's 785 00:42:06,636 --> 00:42:11,516 Speaker 1: why the number of these incredibly problematic incidents in my 786 00:42:11,556 --> 00:42:14,636 Speaker 1: college years was small. I never knew of a single 787 00:42:14,916 --> 00:42:18,356 Speaker 1: person who went to the hospital suffering from alcohol poisoning. 788 00:42:18,396 --> 00:42:20,836 Speaker 1: I never knew of a single person... although I got drunk. 789 00:42:20,916 --> 00:42:23,796 Speaker 1: Although I have read, and this is maybe about the early nineties, 790 00:42:23,876 --> 00:42:26,116 Speaker 1: which can't be so different from the late eighties, studies 791 00:42:26,156 --> 00:42:29,836 Speaker 1: suggesting that the rate of sexual assault on campuses was 792 00:42:29,876 --> 00:42:33,996 Speaker 1: actually not so different then than it is now, that 793 00:42:34,116 --> 00:42:40,276 Speaker 1: is to say, outrageously, shockingly, terribly high. Yeah. Yeah. All 794 00:42:40,316 --> 00:42:43,556 Speaker 1: I know is that the gathering of statistics 795 00:42:43,556 --> 00:42:46,276 Speaker 1: in this area is an incredibly difficult task. Yes, precisely 796 00:42:46,276 --> 00:42:49,356 Speaker 1: because so much goes unreported. Yeah.
All I can say 797 00:42:49,356 --> 00:42:51,596 Speaker 1: for certain is that our best efforts at the moment 798 00:42:52,796 --> 00:42:56,836 Speaker 1: suggest the sexual assault problem is way, way, way worse 799 00:42:56,876 --> 00:43:00,556 Speaker 1: than people imagine. I promised one question 800 00:43:00,596 --> 00:43:06,436 Speaker 1: about, broadly speaking, psychoanalysis. So here's the question. You talk 801 00:43:06,516 --> 00:43:08,796 Speaker 1: a lot in the book about default to truth as 802 00:43:08,836 --> 00:43:11,196 Speaker 1: something that we do and maybe something that we're 803 00:43:11,196 --> 00:43:15,556 Speaker 1: hard wired to do. Yeah. So, you know, your spies 804 00:43:15,796 --> 00:43:18,196 Speaker 1: are being spies and they say, oh, I'm not a spy, 805 00:43:18,236 --> 00:43:19,556 Speaker 1: and then people say, oh, I guess you're not a 806 00:43:19,556 --> 00:43:22,156 Speaker 1: spy, because, you know, it's pro social to believe 807 00:43:22,156 --> 00:43:25,316 Speaker 1: people unless you have really strong evidence not to. How 808 00:43:25,356 --> 00:43:28,356 Speaker 1: do you distinguish that from the situation where we kind 809 00:43:28,396 --> 00:43:32,076 Speaker 1: of know that you're lying, but we really don't want 810 00:43:32,116 --> 00:43:35,796 Speaker 1: to think that, where we're actually eager on some level, 811 00:43:36,236 --> 00:43:37,796 Speaker 1: and, you know, it could be on different levels, but 812 00:43:37,796 --> 00:43:40,196 Speaker 1: it could actually be on a, you know, a subconscious level.
813 00:43:40,556 --> 00:43:42,276 Speaker 1: We're eager to believe you because it would be just 814 00:43:42,316 --> 00:43:45,436 Speaker 1: too terrible to believe the truth: that my co worker 815 00:43:45,476 --> 00:43:47,996 Speaker 1: in the next cubicle, who's an award winning, you know, 816 00:43:48,036 --> 00:43:51,436 Speaker 1: Cuba analyst, is actually, you know, palling around personally with 817 00:43:51,476 --> 00:43:54,476 Speaker 1: Fidel Castro, you know, on odd weekends. Yeah, it's 818 00:43:54,556 --> 00:43:56,236 Speaker 1: devastating to think that. I mean, in all of your 819 00:43:56,276 --> 00:43:58,076 Speaker 1: cases about the spies, you always have the person saying, 820 00:43:58,076 --> 00:44:01,116 Speaker 1: oh my god, we were just devastated to discover this. Yeah. So, 821 00:44:01,396 --> 00:44:06,036 Speaker 1: how do you distinguish the kind of idea that we're 822 00:44:06,196 --> 00:44:08,236 Speaker 1: just built to believe people, so we believe people, from 823 00:44:08,276 --> 00:44:10,436 Speaker 1: the idea that, no, it's not that, it's that we 824 00:44:10,516 --> 00:44:12,516 Speaker 1: kind of know they're lying on some level, 825 00:44:13,196 --> 00:44:15,236 Speaker 1: but we know and we don't know at the same 826 00:44:15,236 --> 00:44:19,156 Speaker 1: time because we just really, really, really don't want to 827 00:44:19,156 --> 00:44:21,956 Speaker 1: be disillusioned. Well, there are many ways I 828 00:44:21,996 --> 00:44:24,556 Speaker 1: would answer that. So a lot of my ideas in 829 00:44:24,596 --> 00:44:27,796 Speaker 1: this book about why we do such a bad job 830 00:44:27,836 --> 00:44:31,116 Speaker 1: of knowing when others are lying come from the work 831 00:44:31,116 --> 00:44:34,196 Speaker 1: of this psychologist Tim Levine, and Levine's answer would be 832 00:44:35,556 --> 00:44:38,796 Speaker 1: about real liars.
So in the psychological literature, 833 00:44:38,796 --> 00:44:40,756 Speaker 1: there is a big argument about 834 00:44:40,756 --> 00:44:43,716 Speaker 1: what is a lie, and what doesn't count as a lie. 835 00:44:44,636 --> 00:44:48,316 Speaker 1: Untruths that are told with the intent of preserving social 836 00:44:48,356 --> 00:44:52,516 Speaker 1: relationships are not considered lies, what we would call white lies, 837 00:44:52,756 --> 00:44:56,196 Speaker 1: but the category of white lies is quite large. Yes. Um, so, 838 00:44:56,276 --> 00:44:59,436 Speaker 1: a true lie is a lie that's told deliberately and 839 00:44:59,476 --> 00:45:04,796 Speaker 1: maliciously with the intent of severing or rupturing social relationships. 840 00:45:04,836 --> 00:45:08,956 Speaker 1: The percentage of the population who tell 841 00:45:09,196 --> 00:45:14,356 Speaker 1: large numbers of true lies is really, really small. So 842 00:45:14,396 --> 00:45:16,276 Speaker 1: square that with the statistics on the 843 00:45:16,356 --> 00:45:19,396 Speaker 1: number of people who cheat on their spouses. I mean, 844 00:45:19,396 --> 00:45:21,276 Speaker 1: depending on the numbers you look at, that's, 845 00:45:21,316 --> 00:45:24,116 Speaker 1: you know, between a third and sixty percent of 846 00:45:24,156 --> 00:45:26,556 Speaker 1: Americans, depending on which studies you believe. That's 847 00:45:26,596 --> 00:45:28,356 Speaker 1: a lot of people, and they're all telling lies, unless you 848 00:45:28,356 --> 00:45:33,516 Speaker 1: think that's a white lie. Now, so cheating on 849 00:45:33,556 --> 00:45:38,436 Speaker 1: your... I mean, cheating 850 00:45:38,676 --> 00:45:43,476 Speaker 1: on your spouse is an active deception, yeah, 851 00:45:43,516 --> 00:45:46,316 Speaker 1: but it may not involve an actual lie.
In other words, 852 00:45:46,876 --> 00:45:49,916 Speaker 1: I'm talking about the moment when the two people confront 853 00:45:49,956 --> 00:45:53,316 Speaker 1: each other and one of the parties says, 854 00:45:53,356 --> 00:45:55,116 Speaker 1: you're having an affair, and the other party says, I'm 855 00:45:55,156 --> 00:45:58,356 Speaker 1: not. Okay. So, no, I don't know what percentage 856 00:45:58,396 --> 00:46:00,916 Speaker 1: includes that moment. Yeah, I'm more interested in that act 857 00:46:00,956 --> 00:46:03,636 Speaker 1: where they confront each other. You're 858 00:46:03,636 --> 00:46:07,916 Speaker 1: talking about the commission of anti social acts, or maybe 859 00:46:08,156 --> 00:46:11,236 Speaker 1: not overtly anti social acts, 860 00:46:11,276 --> 00:46:16,996 Speaker 1: but, um, not deeply malicious acts 861 00:46:17,196 --> 00:46:20,756 Speaker 1: meant to destroy. So Madoff, Madoff level kinds 862 00:46:20,796 --> 00:46:24,276 Speaker 1: of lies are really, really rare. Um. So to answer 863 00:46:24,316 --> 00:46:25,716 Speaker 1: you: I agree with you. How do we know 864 00:46:25,796 --> 00:46:29,276 Speaker 1: that? Well, there is 865 00:46:29,276 --> 00:46:32,076 Speaker 1: a healthy literature in psychology trying to figure this 866 00:46:32,156 --> 00:46:35,116 Speaker 1: question out. Yeah. Um, you know, it's hard to tell. 867 00:46:35,556 --> 00:46:38,956 Speaker 1: But, you know, we do know, because, you know, 868 00:46:38,956 --> 00:46:44,516 Speaker 1: what percentage of investment advisors are running massive Ponzi schemes? 869 00:46:44,636 --> 00:46:47,236 Speaker 1: It's actually quite small, right? I mean, 870 00:46:47,276 --> 00:46:52,156 Speaker 1: look, where's your money, in, like, a sock? 871 00:46:52,716 --> 00:46:55,956 Speaker 1: No, no, it's in an index fund. Yeah.
872 00:46:56,076 --> 00:46:59,356 Speaker 1: The Wall Street Journal is not lying. I mean, 873 00:46:59,396 --> 00:47:01,356 Speaker 1: the index is the index. So, you know, 874 00:47:01,396 --> 00:47:03,636 Speaker 1: well, the person running the index fund, you're 875 00:47:03,636 --> 00:47:05,956 Speaker 1: saying, is not lying. You know, you don't 876 00:47:05,956 --> 00:47:07,836 Speaker 1: even need a person. It's just an index, I know. 877 00:47:07,916 --> 00:47:10,316 Speaker 1: But so therefore it's not, whereas a human being 878 00:47:10,316 --> 00:47:12,116 Speaker 1: with a, with a, you know, with an investment strategy 879 00:47:12,156 --> 00:47:14,356 Speaker 1: could be. That's the nature of a Ponzi scheme, 880 00:47:14,396 --> 00:47:18,276 Speaker 1: isn't it? Actually, yeah. So we do know that 881 00:47:18,756 --> 00:47:21,596 Speaker 1: there aren't ten Madoffs out there. Yeah, there are, you know, 882 00:47:21,596 --> 00:47:24,156 Speaker 1: there is a small number. And similarly, when we look 883 00:47:24,156 --> 00:47:27,636 Speaker 1: at, you know, what is the incidence of... since 884 00:47:27,676 --> 00:47:29,996 Speaker 1: I talk about pedophilia in this book, you know, 885 00:47:29,996 --> 00:47:32,476 Speaker 1: what is the real incidence of pedophilia in the population, 886 00:47:32,516 --> 00:47:37,476 Speaker 1: so someone who is systematically deceiving those around him 887 00:47:37,476 --> 00:47:39,756 Speaker 1: in the name of pursuing a 888 00:47:39,836 --> 00:47:43,116 Speaker 1: deviant sexual agenda? It's actually quite small. It's like 889 00:47:44,196 --> 00:47:48,876 Speaker 1: two, three percent. And that's small? Well, two, three percent.
890 00:47:48,916 --> 00:47:51,116 Speaker 1: Now, we're not saying that these are people who actively 891 00:47:51,196 --> 00:47:55,036 Speaker 1: pursue their pedophilia, but who have those kinds of inclinations. 892 00:47:55,076 --> 00:48:00,356 Speaker 1: But that's quite small. Sure, yeah, okay, two or 893 00:48:00,396 --> 00:48:03,516 Speaker 1: three people out of one hundred. That sounds 894 00:48:03,516 --> 00:48:05,436 Speaker 1: like a lot of people to me. I don't know, 895 00:48:05,476 --> 00:48:07,556 Speaker 1: maybe it's because I have kids. I don't know. But 896 00:48:07,596 --> 00:48:10,916 Speaker 1: if you have as a baseline position 897 00:48:10,996 --> 00:48:14,316 Speaker 1: that most people you deal with are not pedophiles, that's 898 00:48:14,356 --> 00:48:16,876 Speaker 1: not an irrational position. It's not irrational, not by any stretch. 899 00:48:16,956 --> 00:48:19,516 Speaker 1: So you can send your kids to the Boy Scouts of the world, 900 00:48:19,836 --> 00:48:22,756 Speaker 1: and you shouldn't lie awake at night worrying about 901 00:48:22,756 --> 00:48:27,076 Speaker 1: the Boy Scouts. But verify. Yes, that's right. Anyway, so 902 00:48:27,276 --> 00:48:30,516 Speaker 1: I don't think that your position is wrong. But I 903 00:48:30,556 --> 00:48:34,876 Speaker 1: would say that it is probably rational to want 904 00:48:34,916 --> 00:48:37,436 Speaker 1: to believe that the person you're dealing with is telling 905 00:48:37,476 --> 00:48:39,476 Speaker 1: the truth, because most people actually are telling the truth. 906 00:48:39,476 --> 00:48:42,036 Speaker 1: That's Levine's argument. I think there's something to that, and it's 907 00:48:42,036 --> 00:48:47,036 Speaker 1: a rationalist argument, presumably built into some evolutionary theory. Right, 908 00:48:47,036 --> 00:48:49,916 Speaker 1: it's rational, and so we've evolved this tendency.
You know, 909 00:48:49,956 --> 00:48:51,636 Speaker 1: for whatever reasons. And I guess 910 00:48:51,676 --> 00:48:53,796 Speaker 1: my worry is, I'm not sure that covers the cases, 911 00:48:54,156 --> 00:48:56,756 Speaker 1: many of which you write about, where there's, like, stuff 912 00:48:56,876 --> 00:48:59,116 Speaker 1: staring the person in the face, you know, the person 913 00:48:59,196 --> 00:49:01,396 Speaker 1: who suspects the spy and says, gee, I suspect this 914 00:49:01,476 --> 00:49:03,636 Speaker 1: spy, and goes and asks the questions, and the spy 915 00:49:03,716 --> 00:49:06,796 Speaker 1: gives a lame answer, and the person's like, okay, I 916 00:49:06,836 --> 00:49:09,436 Speaker 1: believe you. And your answer is, you know, this 917 00:49:09,556 --> 00:49:12,116 Speaker 1: kind of evolutionary default to truth. But my instinct, at 918 00:49:12,156 --> 00:49:14,516 Speaker 1: least on reading those anecdotes, was that's not enough to 919 00:49:14,556 --> 00:49:17,596 Speaker 1: explain that, because the person's doubts were already raised. It's 920 00:49:17,636 --> 00:49:19,596 Speaker 1: really that they just didn't want to face it. They 921 00:49:19,596 --> 00:49:22,276 Speaker 1: didn't want to take on board the painful reality. You 922 00:49:22,316 --> 00:49:25,516 Speaker 1: know, spies... thinking about spies, as someone who has consumed 923 00:49:25,596 --> 00:49:28,516 Speaker 1: huge numbers of real and imaginary spy stories: if 924 00:49:28,516 --> 00:49:31,236 Speaker 1: you just look at the real ones, spies never get caught. 925 00:49:31,396 --> 00:49:35,196 Speaker 1: Like, show me a case where, the first day that 926 00:49:35,636 --> 00:49:39,156 Speaker 1: Joe Hill decided to turn coat and spy for the 927 00:49:39,156 --> 00:49:42,236 Speaker 1: Soviet Union, even though he was a high ranking CIA officer, 928 00:49:42,436 --> 00:49:46,516 Speaker 1: he was caught by counterintelligence. Never ever happens.
If you do, 929 00:49:46,556 --> 00:49:47,956 Speaker 1: like, I would want to write a book about it, 930 00:49:47,996 --> 00:49:50,796 Speaker 1: because... no, if you talk to these counterintelligence 931 00:49:50,796 --> 00:49:53,156 Speaker 1: officers about their record in uncovering spies: first of all, 932 00:49:53,156 --> 00:49:55,796 Speaker 1: most counterintelligence officers never catch any spies, and to the 933 00:49:55,796 --> 00:49:58,516 Speaker 1: extent they do, they catch them, like, after ten years. 934 00:49:58,676 --> 00:50:02,436 Speaker 1: Like, think about how long Aldrich Ames lasted, possibly the worst 935 00:50:02,476 --> 00:50:06,316 Speaker 1: spy this country ever had. The man is a buffoon, 936 00:50:06,756 --> 00:50:11,756 Speaker 1: he's a drunk. His performance reviews are, like, terrible. He 937 00:50:11,796 --> 00:50:14,516 Speaker 1: starts getting huge amounts of money from the Soviets for 938 00:50:14,556 --> 00:50:16,396 Speaker 1: his spying, and what does he do? He spends it 939 00:50:16,436 --> 00:50:19,356 Speaker 1: wildly and shows up at Langley in, like, a Jaguar, 940 00:50:19,436 --> 00:50:22,036 Speaker 1: with his teeth capped and wearing a fancy, you know, 941 00:50:22,196 --> 00:50:27,716 Speaker 1: Brioni suit, and, like, nobody looks askance. So, like, 942 00:50:28,116 --> 00:50:33,116 Speaker 1: I'm sorry, at 943 00:50:33,116 --> 00:50:34,796 Speaker 1: a certain point, you have to believe that this is 944 00:50:34,836 --> 00:50:38,676 Speaker 1: something that is... I just read this book about 945 00:50:38,756 --> 00:50:42,676 Speaker 1: Klaus Fuchs. You know, Klaus Fuchs, he's the 946 00:50:42,716 --> 00:50:45,916 Speaker 1: one who betrays the atomic bomb to the Soviet Union. 947 00:50:46,436 --> 00:50:49,636 Speaker 1: I hadn't realized... about Klaus Fuchs, I thought he 948 00:50:49,676 --> 00:50:52,156 Speaker 1: was a kind of minor figure at Los Alamos. He's not.
949 00:50:52,196 --> 00:50:54,516 Speaker 1: He's, like, one of the greatest nuclear physicists 950 00:50:54,556 --> 00:50:58,476 Speaker 1: of his generation. And also, his ties to the Communists 951 00:50:58,516 --> 00:51:01,036 Speaker 1: go way back. He's a Communist from the 952 00:51:01,076 --> 00:51:03,156 Speaker 1: get go, and actually a good number of the nuclear 953 00:51:03,196 --> 00:51:06,836 Speaker 1: scientists were deeply sympathetic. So it's like, you know, 954 00:51:06,876 --> 00:51:08,916 Speaker 1: all kinds of people were, like, raising their hands and saying, 955 00:51:08,956 --> 00:51:12,996 Speaker 1: I don't know about Klaus. And then, 956 00:51:13,116 --> 00:51:15,716 Speaker 1: Klaus was at Los Alamos. They had you on lockdown 957 00:51:15,916 --> 00:51:17,796 Speaker 1: there, and, you know, to get out 958 00:51:17,796 --> 00:51:20,676 Speaker 1: of Los Alamos to meet with his Soviet handler, he 959 00:51:20,676 --> 00:51:24,516 Speaker 1: had to construct these elaborate reasons. Sure enough, like, Klaus 960 00:51:24,556 --> 00:51:26,236 Speaker 1: is always getting in a car, like, driving off to 961 00:51:26,316 --> 00:51:28,436 Speaker 1: the desert, and they're like, oh, where's Klaus? I don't 962 00:51:28,436 --> 00:51:30,956 Speaker 1: know, he's going to, like, you know... Where would 963 00:51:30,956 --> 00:51:33,956 Speaker 1: you go? You're at Los Alamos. What were you going for? I 964 00:51:33,956 --> 00:51:36,436 Speaker 1: think this supports my theory. I think... no, your theory? 965 00:51:36,436 --> 00:51:38,836 Speaker 1: I think this supports that people, as you say, 966 00:51:38,836 --> 00:51:41,156 Speaker 1: really didn't want to believe this about Klaus. Of course 967 00:51:41,196 --> 00:51:43,276 Speaker 1: they didn't want to believe it, but they were...
But 968 00:51:43,356 --> 00:51:45,396 Speaker 1: they didn't want to believe it, in part because 969 00:51:45,716 --> 00:51:49,516 Speaker 1: most people at Los Alamos were not spies, right? Klaus... 970 00:51:49,516 --> 00:51:53,356 Speaker 1: It's Klaus, and, like, basically Klaus. I mean, there's, like, 971 00:51:53,356 --> 00:51:55,836 Speaker 1: one other guy who's a spy, and maybe someone we 972 00:51:55,876 --> 00:51:59,796 Speaker 1: don't know about, but most of them, not spies, right? Um. 973 00:52:00,756 --> 00:52:02,836 Speaker 1: So I want to make sure we ask some of 974 00:52:02,836 --> 00:52:05,516 Speaker 1: the audience questions. And this is the first question, from 975 00:52:05,556 --> 00:52:08,916 Speaker 1: Tyler from Westfield in New Jersey, and I'll 976 00:52:09,196 --> 00:52:11,116 Speaker 1: tell you why I'm interested in this question. The question says, you're 977 00:52:11,156 --> 00:52:13,796 Speaker 1: known to be an avid runner. How does running factor 978 00:52:13,836 --> 00:52:16,036 Speaker 1: into your process as a writer and as a thinker? 979 00:52:16,036 --> 00:52:18,316 Speaker 1: And the reason I was especially attracted to this question is 980 00:52:18,356 --> 00:52:20,116 Speaker 1: that I didn't exactly meet you, but the first time 981 00:52:20,116 --> 00:52:23,396 Speaker 1: I ever saw you was in the Equinox in the 982 00:52:23,396 --> 00:52:26,716 Speaker 1: West Village when I lived in New York, and there 983 00:52:26,716 --> 00:52:30,156 Speaker 1: were several rows of treadmills, and in the front row 984 00:52:30,196 --> 00:52:34,356 Speaker 1: of treadmills, in the center, was always Malcolm, running at, 985 00:52:34,396 --> 00:52:37,676 Speaker 1: like, an unimaginable speed, and for a long time he 986 00:52:37,716 --> 00:52:40,236 Speaker 1: was by far the fastest person in that gym. And 987 00:52:40,276 --> 00:52:43,716 Speaker 1: then an early two thousands supermodel.
I was trying to 988 00:52:43,716 --> 00:52:45,636 Speaker 1: figure out which one it was. One of the blond ones? 989 00:52:46,036 --> 00:52:49,676 Speaker 1: Which one was it? She was right next to you, 990 00:52:49,756 --> 00:52:52,356 Speaker 1: and she would run right next to you, really, at, 991 00:52:53,996 --> 00:52:56,716 Speaker 1: now you tell me, at super high speed. This 992 00:52:56,756 --> 00:52:58,476 Speaker 1: happened on more than one occasion. This is like the 993 00:52:58,516 --> 00:53:00,516 Speaker 1: story about you and your father. Like, you didn't even notice, 994 00:53:01,716 --> 00:53:04,116 Speaker 1: and she was, like, pretty much as fast as 995 00:53:04,156 --> 00:53:08,916 Speaker 1: you are, and comparably skinny. And so I thought to myself... 996 00:53:09,116 --> 00:53:10,516 Speaker 1: first, I didn't know who you were. I 997 00:53:10,556 --> 00:53:12,436 Speaker 1: was just like, wow, that guy is incredibly fast. And 998 00:53:12,516 --> 00:53:14,276 Speaker 1: then I was like, that guy's incredibly fast and doesn't notice 999 00:53:14,316 --> 00:53:17,156 Speaker 1: the girl next to him. And then later I found 1000 00:53:17,196 --> 00:53:19,796 Speaker 1: out that it was you. So here's the question: how 1001 00:53:19,796 --> 00:53:22,636 Speaker 1: does that running factor into your process as a writer 1002 00:53:22,676 --> 00:53:25,476 Speaker 1: and as a thinker, other than making you oblivious to supermodels? 1003 00:53:27,516 --> 00:53:33,556 Speaker 1: I don't know.
I suppose, well, you 1004 00:53:33,596 --> 00:53:39,836 Speaker 1: know how iterative writing is, that finding 1005 00:53:39,836 --> 00:53:43,756 Speaker 1: a way to express what you really mean takes forever, 1006 00:53:44,596 --> 00:53:47,356 Speaker 1: and an awful lot of 1007 00:53:47,956 --> 00:53:50,116 Speaker 1: what you discover when you write something down is how 1008 00:53:50,156 --> 00:53:52,396 Speaker 1: difficult it is to put your thoughts into words, or 1009 00:53:53,036 --> 00:53:55,396 Speaker 1: how difficult it is to know what you think. And 1010 00:53:55,476 --> 00:53:58,436 Speaker 1: so I always think that running is really, among 1011 00:53:58,516 --> 00:54:01,916 Speaker 1: other things, a way in which I can simply take 1012 00:54:01,956 --> 00:54:05,636 Speaker 1: time out to ruminate. All writers, I think, have to 1013 00:54:05,676 --> 00:54:09,756 Speaker 1: have some space in their lives for rumination. Darthy from 1014 00:54:09,796 --> 00:54:12,636 Speaker 1: Boston. And this, I don't think it's meant to 1015 00:54:12,676 --> 00:54:14,396 Speaker 1: be a mean question, but it's a teeny bit mean. 1016 00:54:15,436 --> 00:54:16,676 Speaker 1: When I was on my way over here, I asked 1017 00:54:16,716 --> 00:54:18,356 Speaker 1: my daughter what happens if I have to ask Malcolm 1018 00:54:18,356 --> 00:54:20,196 Speaker 1: a mean question? She said, you have to do it 1019 00:54:20,236 --> 00:54:25,236 Speaker 1: as a compliment sandwich, and I didn't, I didn't know 1020 00:54:25,276 --> 00:54:27,396 Speaker 1: what that was, but she explained it to me. So I 1021 00:54:27,436 --> 00:54:33,476 Speaker 1: wanted to say, Malcolm, this is an amazing book. Yeah, 1022 00:54:33,796 --> 00:54:37,036 Speaker 1: after the aha moment that births a new theory, how 1023 00:54:37,036 --> 00:54:41,676 Speaker 1: do you avoid confirmation bias while evidence gathering?
And also great, 1024 00:54:41,756 --> 00:54:49,396 Speaker 1: great shirt. Did I do it right? Um, well, one 1025 00:54:49,436 --> 00:54:51,556 Speaker 1: answer is to say that all journalism is an expression 1026 00:54:51,596 --> 00:54:55,316 Speaker 1: of confirmation bias, but then that just confirms the bias 1027 00:54:55,516 --> 00:55:00,716 Speaker 1: Darthy holds. Yeah, I mean, I think, I guess 1028 00:55:00,716 --> 00:55:05,436 Speaker 1: the defense I would offer is that, unlike other forms 1029 00:55:05,476 --> 00:55:09,396 Speaker 1: of writing, the kind of journalism that I'm a part 1030 00:55:09,436 --> 00:55:14,436 Speaker 1: of is meant to provoke, not convince. That is to say, 1031 00:55:14,516 --> 00:55:17,036 Speaker 1: I'm not, what I'd like to do 1032 00:55:17,116 --> 00:55:20,996 Speaker 1: is to present an argument, maybe an argument that readers 1033 00:55:20,996 --> 00:55:24,116 Speaker 1: haven't seen before. Not necessarily because I think I can 1034 00:55:24,356 --> 00:55:27,276 Speaker 1: persuade you to adopt it, or necessarily because I believe 1035 00:55:27,276 --> 00:55:29,556 Speaker 1: the argument is one hundred percent correct, but because I 1036 00:55:29,556 --> 00:55:33,956 Speaker 1: think it's incredibly useful to consider the problem from that angle. 1037 00:55:34,396 --> 00:55:38,676 Speaker 1: So an example would be, you know, I don't talk 1038 00:55:38,676 --> 00:55:40,276 Speaker 1: a lot, like we went over this, I don't talk 1039 00:55:40,276 --> 00:55:41,996 Speaker 1: a lot about race in the case of Sandra Bland. 1040 00:55:42,236 --> 00:55:44,836 Speaker 1: Why? Because it's really useful to think through the problem 1041 00:55:44,876 --> 00:55:48,036 Speaker 1: of what to do about police shootings and put race 1042 00:55:48,076 --> 00:55:50,956 Speaker 1: to the side for a moment.
Right, that's a good, 1043 00:55:51,156 --> 00:55:54,876 Speaker 1: do I think that race doesn't belong? No, race totally belongs, 1044 00:55:54,916 --> 00:55:58,676 Speaker 1: but it's useful to take a couple hours and set 1045 00:55:58,676 --> 00:56:00,916 Speaker 1: it aside and say, what if I thought about this 1046 00:56:01,396 --> 00:56:04,916 Speaker 1: outside of that particular prism? And that's what, 1047 00:56:05,796 --> 00:56:10,836 Speaker 1: that's what I think useful nonfiction does. And so part 1048 00:56:10,916 --> 00:56:13,516 Speaker 1: of, part of the production of that kind of 1049 00:56:13,916 --> 00:56:17,196 Speaker 1: argument is confirmation bias, happily, confirmation bias. Yeah, let's gather 1050 00:56:17,236 --> 00:56:18,836 Speaker 1: all the evidence we can for a particular point of 1051 00:56:18,876 --> 00:56:21,236 Speaker 1: view and lean into the bias. Let's see what happens. Yeah, 1052 00:56:21,236 --> 00:56:23,076 Speaker 1: and if you default to truth and believe it, that's 1053 00:56:23,116 --> 00:56:25,356 Speaker 1: your own fault. You have, you've got to read it skeptically, right, 1054 00:56:25,396 --> 00:56:29,596 Speaker 1: I mean you've got to read skeptically. Yeah. Nicole from Massachusetts, 1055 00:56:29,676 --> 00:56:31,876 Speaker 1: this is a very Massachusetts question, and it's kind of 1056 00:56:31,916 --> 00:56:36,476 Speaker 1: a deep one. Is Trump derangement syndrome a similar phenomenon 1057 00:56:36,676 --> 00:56:42,076 Speaker 1: to the Belgian Coke crisis? This question requires a great 1058 00:56:42,076 --> 00:56:46,356 Speaker 1: deal of unpacking. The Belgian, Nicole is from Massachusetts, yes, 1059 00:56:46,636 --> 00:56:49,476 Speaker 1: Coke crisis was something I did a podcast episode about.
1060 00:56:49,516 --> 00:56:54,036 Speaker 1: It was an episode in the nineties or eighties in Belgium 1061 00:56:54,036 --> 00:56:58,116 Speaker 1: where Belgians became convinced, by the way, every time I 1062 00:56:58,236 --> 00:57:00,516 Speaker 1: say the word Belgian, I'm reminded of that great Monty 1063 00:57:00,516 --> 00:57:03,276 Speaker 1: Python episode where they wanted to come up with a 1064 00:57:03,356 --> 00:57:06,116 Speaker 1: nickname for Belgians, remember, and one of them was sprouts, 1065 00:57:06,676 --> 00:57:08,556 Speaker 1: one of them was Flems, and then the winner was 1066 00:57:08,636 --> 00:57:16,076 Speaker 1: dirty Belgian bastards. So, so fantastic. Um. But Belgians 1067 00:57:16,116 --> 00:57:21,036 Speaker 1: became convinced that Coke was poisoning them, and all these 1068 00:57:21,156 --> 00:57:23,916 Speaker 1: kids got sick. And then it turns out there was 1069 00:57:23,956 --> 00:57:26,316 Speaker 1: no contamination, and yeah, it was, it was just all, 1070 00:57:26,556 --> 00:57:29,916 Speaker 1: um, hysteria, a moral panic. Anyway, this person wants to 1071 00:57:29,916 --> 00:57:35,756 Speaker 1: suggest that Trump derangement syndrome is something similar. 1072 00:57:36,316 --> 00:57:37,756 Speaker 1: You know, which is, you know, that 1073 00:57:37,756 --> 00:57:41,996 Speaker 1: phenomenon where, um, one observes people who, as 1074 00:57:42,236 --> 00:57:44,996 Speaker 1: the name implies, maybe shouldn't, but who 1075 00:57:45,116 --> 00:57:47,956 Speaker 1: just can't stop thinking about Trump and how angry 1076 00:57:47,996 --> 00:57:50,276 Speaker 1: they are about Trump, 1077 00:57:50,396 --> 00:57:53,156 Speaker 1: and it just distorts their whole frame of engagement 1078 00:57:53,156 --> 00:57:56,156 Speaker 1: with the world.
Slash gives them great clarity and accuracy, 1079 00:57:56,196 --> 00:58:01,756 Speaker 1: depending on whether you're inside the syndrome or outside the syndrome. Um, 1080 00:58:01,916 --> 00:58:08,956 Speaker 1: I don't know. Maybe Nicole wins, she gets a free copy 1081 00:58:09,316 --> 00:58:10,836 Speaker 1: of the, well, she already bought a copy of the book. 1082 00:58:12,836 --> 00:58:15,596 Speaker 1: This is a serious question, and it connects up to 1083 00:58:16,596 --> 00:58:19,476 Speaker 1: the fact that this book really is about very serious things. 1084 00:58:19,756 --> 00:58:21,076 Speaker 1: It has, it has all of the fun of a 1085 00:58:21,076 --> 00:58:24,236 Speaker 1: Malcolm Gladwell book, but it's framed as an inquiry into 1086 00:58:24,356 --> 00:58:26,876 Speaker 1: one of the deepest, hardest, and most serious questions that 1087 00:58:27,196 --> 00:58:30,476 Speaker 1: we're facing as a society. And the question is, do 1088 00:58:30,516 --> 00:58:33,356 Speaker 1: you think that we as Americans, this is Christina from Boston, 1089 00:58:33,756 --> 00:58:37,396 Speaker 1: have improved in civil dialogue in the years, say, since 1090 00:58:37,516 --> 00:58:40,196 Speaker 1: nine eleven, the last almost two decades, or have 1091 00:58:40,276 --> 00:58:45,756 Speaker 1: we regressed when it comes to talking to strangers? It's 1092 00:58:45,836 --> 00:58:48,676 Speaker 1: kind of a deep question. Well, in some ways we 1093 00:58:48,796 --> 00:58:55,356 Speaker 1: have regressed, in the sense, um, that, you know, 1094 00:58:55,396 --> 00:58:58,556 Speaker 1: we're in a peculiar moment where we're, we're obsessed with 1095 00:58:58,556 --> 00:59:01,276 Speaker 1: each other's differences and not what we have in common. 1096 00:59:02,796 --> 00:59:05,956 Speaker 1: And you know, at other times, other countries don't have 1097 00:59:06,036 --> 00:59:09,676 Speaker 1: these kinds of conversations.
Canadians don't get together and enumerate 1098 00:59:09,676 --> 00:59:12,436 Speaker 1: all the ways in which they disagree. They do the opposite, right, 1099 00:59:12,676 --> 00:59:16,076 Speaker 1: they talk about things they all share. Um, that's the aim, 1100 00:59:16,236 --> 00:59:18,036 Speaker 1: and they do it nicely. They do it nicely. The 1101 00:59:18,076 --> 00:59:21,996 Speaker 1: great Canadian project is this endless search for common ground, 1102 00:59:22,276 --> 00:59:26,076 Speaker 1: gets a little tedious at times, but, um, this country 1103 00:59:26,196 --> 00:59:29,636 Speaker 1: has decided, but less shooting each other. This country, society 1104 00:59:29,676 --> 00:59:31,476 Speaker 1: doesn't want to do that. I guess that would be 1105 00:59:31,476 --> 00:59:34,276 Speaker 1: point number one. But two, slash, we've never done it. 1106 00:59:34,756 --> 00:59:37,196 Speaker 1: Maybe we've never done it. Maybe so. Yeah, 1107 00:59:37,236 --> 00:59:39,356 Speaker 1: but, you know, to go back to the 1108 00:59:39,436 --> 00:59:42,396 Speaker 1: theme of this, this book is a lot about misunderstandings 1109 00:59:41,956 --> 00:59:46,476 Speaker 1: that arise from hasty generalizations about strangers. I'm always struck 1110 00:59:46,516 --> 00:59:51,836 Speaker 1: on Twitter about how, you know, seventy-five percent of what 1111 00:59:51,916 --> 00:59:54,876 Speaker 1: are classified as disagreements on Twitter are not disagreements, 1112 00:59:54,916 --> 00:59:58,316 Speaker 1: they are misunderstandings. Yeah, people just haven't 1113 00:59:58,316 --> 01:00:00,556 Speaker 1: bothered to figure out what the other person is arguing, 1114 01:00:00,836 --> 01:00:03,396 Speaker 1: or they're defeated by the medium, which is designed to 1115 01:00:03,436 --> 01:00:05,236 Speaker 1: make it impossible for you to explain your argument in 1116 01:00:05,316 --> 01:00:07,396 Speaker 1: any depth.
It is astonishing to me what a bad 1117 01:00:07,396 --> 01:00:10,316 Speaker 1: idea Twitter is. Like, it's one of 1118 01:00:10,356 --> 01:00:13,156 Speaker 1: those things where, in retrospect, like, who 1119 01:00:13,356 --> 01:00:15,836 Speaker 1: thought this was going to end up 1120 01:00:15,876 --> 01:00:18,676 Speaker 1: being, remember there was that golden period of like 1121 01:00:18,756 --> 01:00:21,556 Speaker 1: a year and a half when there were serious people 1122 01:00:21,556 --> 01:00:23,156 Speaker 1: in this country who thought that Twitter was going to 1123 01:00:23,236 --> 01:00:26,516 Speaker 1: save the world from tyranny. And I will say I 1124 01:00:26,596 --> 01:00:29,556 Speaker 1: was a skeptic in that moment, and people were 1125 01:00:29,636 --> 01:00:32,196 Speaker 1: so angry with me for not believing in the redemptive 1126 01:00:32,236 --> 01:00:33,916 Speaker 1: power of Twitter, and I was like, I don't know, 1127 01:00:34,076 --> 01:00:36,676 Speaker 1: like, I don't see how, you know. And sure enough, 1128 01:00:36,876 --> 01:00:39,196 Speaker 1: what happens? It turns out Twitter is good for, like, 1129 01:00:39,556 --> 01:00:43,036 Speaker 1: cat videos. That's useful. But other than that, we had 1130 01:00:43,076 --> 01:00:45,236 Speaker 1: Facebook for that. We already had Facebook for that. But 1131 01:00:45,276 --> 01:00:49,316 Speaker 1: it is astonishing to me how, like, pointless it is. 1132 01:00:49,476 --> 01:00:51,716 Speaker 1: But it is fascinating then that it has the reach 1133 01:00:51,756 --> 01:00:54,756 Speaker 1: that it has.
I mean specifically among people who talk 1134 01:00:55,036 --> 01:00:57,956 Speaker 1: at length for a living, people who should be the 1135 01:00:58,036 --> 01:01:02,476 Speaker 1: best at going deeper conversationally, at least in principle, seem 1136 01:01:02,516 --> 01:01:06,356 Speaker 1: to have this deep desire to communicate on Twitter, the 1137 01:01:06,436 --> 01:01:08,876 Speaker 1: framework in which all of the things that in theory 1138 01:01:08,916 --> 01:01:12,036 Speaker 1: make them worth listening to are stripped away. 1139 01:01:12,076 --> 01:01:13,836 Speaker 1: I mean, it's like, you know, your, your example, 1140 01:01:14,556 --> 01:01:17,996 Speaker 1: in the LSAT podcast, your example of speed chess compared 1141 01:01:18,036 --> 01:01:20,356 Speaker 1: to real chess. All these people who are actually 1142 01:01:20,396 --> 01:01:25,316 Speaker 1: supposedly writers, you know, scholars, people who think in 1143 01:01:25,476 --> 01:01:28,396 Speaker 1: units longer than a sentence or two, and that's who spends 1144 01:01:28,396 --> 01:01:29,916 Speaker 1: a lot of time yelling at each other and 1145 01:01:30,076 --> 01:01:32,436 Speaker 1: misunderstanding each other on Twitter. I have no answer for 1146 01:01:32,556 --> 01:01:35,116 Speaker 1: why we have the impulse to do that, but apparently 1147 01:01:35,276 --> 01:01:38,036 Speaker 1: we do, and distinctively that group of people, because it's 1148 01:01:38,036 --> 01:01:40,676 Speaker 1: not like Twitter has as many users as some of 1149 01:01:40,676 --> 01:01:44,076 Speaker 1: the formats that allow for longer form communication. Yeah, yeah, 1150 01:01:44,396 --> 01:01:45,916 Speaker 1: I don't, I don't get it. It's 1151 01:01:45,956 --> 01:01:48,556 Speaker 1: like the id, it's like the id of people 1152 01:01:48,596 --> 01:01:50,796 Speaker 1: who like to express themselves at length.
It's like they really 1153 01:01:50,876 --> 01:01:53,036 Speaker 1: think they can sum it up in one sentence, which 1154 01:01:53,076 --> 01:01:56,436 Speaker 1: actually they can't. Yeah. Yeah, it joins this idea I've 1155 01:01:56,436 --> 01:02:02,316 Speaker 1: always had, that everything, every institution, uh, 1156 01:02:03,356 --> 01:02:05,876 Speaker 1: in a kind of functioning society should have a sunset 1157 01:02:05,916 --> 01:02:10,116 Speaker 1: clause, and then at that moment of sunset, everyone sits 1158 01:02:10,156 --> 01:02:12,796 Speaker 1: around and says, well, up or down? Good idea? 1159 01:02:12,996 --> 01:02:15,156 Speaker 1: Should we start over? Like, imagine 1160 01:02:15,156 --> 01:02:17,476 Speaker 1: if we had, like, so, if we had a sunset 1161 01:02:17,476 --> 01:02:23,596 Speaker 1: clause on, I mentioned higher education before. Yeah. Super useful 1162 01:02:24,276 --> 01:02:27,596 Speaker 1: for us to say, in, say, twenty twenty-five, 1163 01:02:27,836 --> 01:02:29,996 Speaker 1: we all get together and we say, okay, let's start 1164 01:02:29,996 --> 01:02:32,396 Speaker 1: over and see what we come up with. Right, we 1165 01:02:32,396 --> 01:02:34,956 Speaker 1: would come up with something very, very different. Similarly with Twitter, 1166 01:02:34,996 --> 01:02:37,756 Speaker 1: if we decided, in twenty twenty-five, let's go, we're gonna 1167 01:02:37,756 --> 01:02:39,836 Speaker 1: shut it down for six months and then have a 1168 01:02:39,876 --> 01:02:42,516 Speaker 1: meeting and figure out what we want, whether we want 1169 01:02:42,516 --> 01:02:44,316 Speaker 1: to replace it or what we would replace it with. 1170 01:02:44,556 --> 01:02:46,556 Speaker 1: We would come up with something very different. There's no 1171 01:02:46,636 --> 01:02:51,436 Speaker 1: reason these things persist long past their useful stage.
1172 01:02:52,076 --> 01:02:57,116 Speaker 1: Last question, and it connects to Canada. When 1173 01:02:57,156 --> 01:02:59,556 Speaker 1: and why was the last time you applied your pull 1174 01:02:59,636 --> 01:03:04,156 Speaker 1: the goalie rule in real life? Another reason to ask that, 1175 01:03:04,196 --> 01:03:07,676 Speaker 1: because, you know, we're almost done here, I'm pulling the goalie. 1176 01:03:07,796 --> 01:03:11,156 Speaker 1: But this is another reference to one of my podcasts, 1177 01:03:11,476 --> 01:03:15,356 Speaker 1: where I described the work of this hilarious hedge fund 1178 01:03:15,396 --> 01:03:19,396 Speaker 1: guy who published a paper on SSRN, the research paper website, 1179 01:03:19,796 --> 01:03:23,236 Speaker 1: about when you should, what is the optimal time to 1180 01:03:23,236 --> 01:03:25,196 Speaker 1: pull a goalie in a hockey game if you 1181 01:03:25,196 --> 01:03:28,796 Speaker 1: are down a goal. And his answer is, like, I forget, 1182 01:03:28,876 --> 01:03:31,476 Speaker 1: it's like with six minutes to go, far, far longer 1183 01:03:31,476 --> 01:03:33,676 Speaker 1: than anyone would begin to imagine, and he does all 1184 01:03:33,716 --> 01:03:35,036 Speaker 1: the math. And no one ever does that, 1185 01:03:35,196 --> 01:03:37,316 Speaker 1: no one ever does it that way. But the idea is 1186 01:03:37,316 --> 01:03:40,996 Speaker 1: that you should take big risks, take big risks 1187 01:03:41,076 --> 01:03:43,716 Speaker 1: when you're way behind, when all is hopeless. Um, 1188 01:03:44,076 --> 01:03:47,556 Speaker 1: when was the last time I took a massive, unparalleled 1189 01:03:47,636 --> 01:03:54,956 Speaker 1: risk because I was otherwise almost certain to lose? Yeah, 1190 01:03:55,036 --> 01:03:57,476 Speaker 1: it's a really good question. It is a good question. 1191 01:03:57,956 --> 01:04:04,716 Speaker 1: Jim from Bridgewater.
Well, nothing, I should add, that you're 1192 01:04:04,716 --> 01:04:08,396 Speaker 1: willing to share. What about you? Have you ever pulled? 1193 01:04:09,596 --> 01:04:16,636 Speaker 1: Do you ever? Um, that's a good question, I think. Um, yeah, 1194 01:04:16,676 --> 01:04:18,196 Speaker 1: I think I've, I think I've done it in my 1195 01:04:18,276 --> 01:04:22,276 Speaker 1: personal life on multiple occasions. Um, you know, gone for, 1196 01:04:22,356 --> 01:04:25,676 Speaker 1: like, the big, the big risk. Yeah, um, you know, 1197 01:04:26,316 --> 01:04:29,196 Speaker 1: hoping that it would make things work out. But as 1198 01:04:29,196 --> 01:04:31,116 Speaker 1: we know from the statistics of pulling the goalie, usually 1199 01:04:31,116 --> 01:04:32,876 Speaker 1: what happens is they just score a goal on you 1200 01:04:32,916 --> 01:04:35,356 Speaker 1: and it doesn't work out. Yeah, you usually fail. Yeah, 1201 01:04:35,396 --> 01:04:38,316 Speaker 1: so that would be me. Elizabeth Holmes pulled the goalie, didn't she? 1202 01:04:39,476 --> 01:04:42,036 Speaker 1: I don't think, Elizabeth Theranos? I don't think, Elizabeth, what's 1203 01:04:42,036 --> 01:04:50,316 Speaker 1: her last name? Of Theranos? Elizabeth Holmes. Holmes, Elizabeth Holmes 1204 01:04:50,316 --> 01:04:54,196 Speaker 1: of Theranos was playing without a goalie from the very start. 1205 01:04:54,956 --> 01:04:58,676 Speaker 1: This is not true. So think about her position if 1206 01:04:58,716 --> 01:05:01,356 Speaker 1: you think about this rationally. So, she had this idea 1207 01:05:01,636 --> 01:05:05,476 Speaker 1: that you could completely take over a huge corner of 1208 01:05:05,516 --> 01:05:10,076 Speaker 1: the healthcare market if you could do reliable diagnoses from 1209 01:05:10,076 --> 01:05:13,756 Speaker 1: a drop of blood.
She was, let's say, generously, twenty 1210 01:05:13,756 --> 01:05:16,196 Speaker 1: five percent of the way there, right, but we don't, 1211 01:05:16,196 --> 01:05:17,836 Speaker 1: we don't really know how far she was, and her, 1212 01:05:18,156 --> 01:05:20,796 Speaker 1: her gamble was, I can get to one hundred percent. 1213 01:05:20,836 --> 01:05:23,556 Speaker 1: Her undergraduate advisor appeared, at least in the documentary, 1214 01:05:23,596 --> 01:05:25,636 Speaker 1: and said she was zero percent of the way, because, okay, 1215 01:05:25,716 --> 01:05:27,956 Speaker 1: because I explained to her that it was impossible. Yeah, 1216 01:05:28,036 --> 01:05:29,836 Speaker 1: that's right. And then the other guy, the 1217 01:05:29,876 --> 01:05:31,836 Speaker 1: other advisor from Stanford, got on and said, but she 1218 01:05:31,916 --> 01:05:34,796 Speaker 1: was a genius, so it was possible. Yeah. So let's say, okay, 1219 01:05:34,836 --> 01:05:36,636 Speaker 1: let's say that she was fifteen percent of the way 1220 01:05:36,636 --> 01:05:39,196 Speaker 1: there. Okay. So she was, in a sense, pulling the goalie. 1221 01:05:39,276 --> 01:05:42,556 Speaker 1: She was faced with an all but impossible task, 1222 01:05:42,636 --> 01:05:44,836 Speaker 1: but there was this little glimmer of hope, if she 1223 01:05:44,916 --> 01:05:47,436 Speaker 1: gambled everything, that she could pull it off. And if 1224 01:05:47,436 --> 01:05:50,236 Speaker 1: she pulled it off, she would be a mega billionaire. 1225 01:05:50,636 --> 01:05:53,596 Speaker 1: I don't know, I mean. So she chose to play 1226 01:05:53,796 --> 01:05:57,036 Speaker 1: very, very, very long odds. So did Bernie Madoff.
According 1227 01:05:57,036 --> 01:05:58,636 Speaker 1: to that theory. Well no, no, no, remember, he could have had 1228 01:05:58,636 --> 01:06:01,476 Speaker 1: the first Ponzi scheme that worked, but Bernie Madoff never 1229 01:06:01,556 --> 01:06:06,316 Speaker 1: made an honest attempt to actually invest people's money. 1230 01:06:06,396 --> 01:06:08,036 Speaker 1: He was a fraud from the start. Hers was not 1231 01:06:08,076 --> 01:06:10,316 Speaker 1: a fraud from the start. I mean, no one claims 1232 01:06:10,316 --> 01:06:12,916 Speaker 1: that what she did was all fictional, that 1233 01:06:13,316 --> 01:06:16,076 Speaker 1: all the activity going on in her firm was just 1234 01:06:16,116 --> 01:06:17,916 Speaker 1: for show. No, people were trying to solve the problem. 1235 01:06:17,956 --> 01:06:20,356 Speaker 1: It's just the problem was virtually impossible to solve. But 1236 01:06:20,396 --> 01:06:23,476 Speaker 1: there was some little glimmer of, not, I don't mean 1237 01:06:23,556 --> 01:06:27,156 Speaker 1: to defend her, but I do want to point out, 1238 01:06:27,276 --> 01:06:30,196 Speaker 1: there is a kind of little sliver of 1239 01:06:30,236 --> 01:06:33,716 Speaker 1: logic to what she was doing. She was classically pulling 1240 01:06:33,796 --> 01:06:36,876 Speaker 1: the goalie, like she was down by three goals, 1241 01:06:37,716 --> 01:06:39,676 Speaker 1: you know, deep in the third quarter, and she decided, 1242 01:06:39,676 --> 01:06:42,476 Speaker 1: I'm going to go for broke. I'm gonna convince a 1243 01:06:42,476 --> 01:06:44,156 Speaker 1: bunch of rich people to give me lots of money, 1244 01:06:44,196 --> 01:06:45,356 Speaker 1: and I'm going to try and do this thing. 1245 01:06:45,716 --> 01:06:48,756 Speaker 1: And had she succeeded, she would be right in 1246 01:06:48,876 --> 01:06:53,276 Speaker 1: the pantheon.
Well, none of you has pulled the goalie 1247 01:06:53,316 --> 01:06:57,356 Speaker 1: by coming here tonight, and you've had a sure thing 1248 01:06:57,356 --> 01:07:00,836 Speaker 1: from the beginning because you bought the book, and I'm 1249 01:07:00,876 --> 01:07:02,956 Speaker 1: thrilled that you did come, and thrilled that Malcolm came 1250 01:07:02,996 --> 01:07:04,716 Speaker 1: and joined us, and I'm really grateful to you, Malcolm, 1251 01:07:04,756 --> 01:07:06,636 Speaker 1: for the conversation, and thank you all for coming. Thank you. 1252 01:07:14,596 --> 01:07:17,516 Speaker 1: Deep Background is brought to you by Pushkin Industries. Our 1253 01:07:17,556 --> 01:07:20,916 Speaker 1: producer is Lydia Jean Kott, with engineering by Jason Gambrell and 1254 01:07:20,996 --> 01:07:24,876 Speaker 1: Jason Rostkowski. Our showrunner is Sophie McKibben. Our theme music 1255 01:07:24,956 --> 01:07:28,196 Speaker 1: is composed by Luis Guerra. Special thanks to the Pushkin brass, 1256 01:07:28,396 --> 01:07:36,076 Speaker 1: Malcolm Gladwell, Jacob Weisberg, and Mia Lobell. This week, we 1257 01:07:36,076 --> 01:07:37,756 Speaker 1: would also like to give a shout out to the 1258 01:07:37,756 --> 01:07:41,356 Speaker 1: Harvard Bookstore, my home bookstore, for helping organize the conversation 1259 01:07:41,396 --> 01:07:44,716 Speaker 1: between me and Malcolm. I'm Noah Feldman. You can follow 1260 01:07:44,756 --> 01:07:48,356 Speaker 1: me on Twitter at Noah R. Feldman. This is Deep Background.