1 00:00:05,160 --> 00:00:09,720 Speaker 1: Why are we humans so easy to deceive? What are 2 00:00:09,800 --> 00:00:12,520 Speaker 1: the tricks of the trade, and how can we train 3 00:00:12,600 --> 00:00:16,640 Speaker 1: ourselves to be more aware of these? And what does 4 00:00:16,680 --> 00:00:20,240 Speaker 1: any of this have to do with Theranos or forging 5 00:00:20,320 --> 00:00:28,200 Speaker 1: letters or the shell game? Welcome to Inner Cosmos with 6 00:00:28,320 --> 00:00:31,400 Speaker 1: me, David Eagleman. I'm a neuroscientist and an author at 7 00:00:31,440 --> 00:00:36,159 Speaker 1: Stanford, and in these episodes I examine the intersection between 8 00:00:36,240 --> 00:00:41,920 Speaker 1: our brains and our lives, and today's episode is about deception. 9 00:00:49,800 --> 00:00:53,559 Speaker 1: You presumably wouldn't do something to cheat a stranger out 10 00:00:53,600 --> 00:00:56,760 Speaker 1: of twenty dollars. So why are there people who would 11 00:00:56,960 --> 00:00:57,279 Speaker 1: do that? 12 00:00:57,720 --> 00:00:59,200 Speaker 2: And what can we do 13 00:00:59,560 --> 00:01:04,120 Speaker 1: to be a little more thoughtful and aware and immune 14 00:01:04,440 --> 00:01:09,080 Speaker 1: against deception? So in a previous episode I talked about 15 00:01:09,120 --> 00:01:13,000 Speaker 1: a really impactful event when I was a neuroscience graduate 16 00:01:13,040 --> 00:01:16,480 Speaker 1: student getting my PhD. I was a second year student 17 00:01:16,520 --> 00:01:19,480 Speaker 1: in the department, and this new young woman came in 18 00:01:19,520 --> 00:01:22,560 Speaker 1: as a first year student. We'll call her Tanya, and 19 00:01:22,640 --> 00:01:26,720 Speaker 1: everyone could see that Tanya was great. She had great grades, 20 00:01:27,200 --> 00:01:31,480 Speaker 1: top standardized test scores, terrific letters of recommendation, and in 21 00:01:31,520 --> 00:01:35,560 Speaker 1: the interviews she even won over my graduate advisor, who 22 00:01:35,640 --> 00:01:39,440 Speaker 1: was famously spiky towards people. And I tell her full 23 00:01:39,480 --> 00:01:42,920 Speaker 1: story in episode sixteen, but the short version is that 24 00:01:43,000 --> 00:01:47,480 Speaker 1: she faked everything on her graduate school application. She faked 25 00:01:47,720 --> 00:01:52,240 Speaker 1: the school transcript and the GRE scores and the letters 26 00:01:52,240 --> 00:01:56,800 Speaker 1: of recommendation, and she was only caught because an administrator 27 00:01:56,840 --> 00:01:59,520 Speaker 1: at the school was so impressed with her that she 28 00:01:59,640 --> 00:02:03,040 Speaker 1: decided to call the professors who had written the letters 29 00:02:03,080 --> 00:02:06,880 Speaker 1: of recommendation to ask how they'd produced a student like Tanya. 30 00:02:07,400 --> 00:02:11,360 Speaker 1: And that's how the whole house of cards came tumbling down. Now, 31 00:02:11,400 --> 00:02:14,359 Speaker 1: for those of you who listened to episode sixteen, you'll 32 00:02:14,400 --> 00:02:19,160 Speaker 1: remember that Tanya's story then got much weirder, because she 33 00:02:19,280 --> 00:02:22,200 Speaker 1: went to Yale University and tried to pull exactly the 34 00:02:22,240 --> 00:02:25,079 Speaker 1: same trick, and when she was caught there, they put 35 00:02:25,080 --> 00:02:28,320 Speaker 1: her in jail. And then she and her mother got 36 00:02:28,360 --> 00:02:32,440 Speaker 1: caught doing a drug deal with two undercover agents.
And 37 00:02:32,480 --> 00:02:36,720 Speaker 1: then Tanya decided to try murdering a girl who looked 38 00:02:36,800 --> 00:02:40,480 Speaker 1: vaguely like her to avoid going to prison. Now that 39 00:02:40,600 --> 00:02:43,800 Speaker 1: plot failed, but only barely. So that's the quick recap 40 00:02:43,840 --> 00:02:46,079 Speaker 1: of the story. But the part I want to concentrate 41 00:02:46,120 --> 00:02:50,400 Speaker 1: on today is why did none of us see this coming? 42 00:02:50,919 --> 00:02:53,680 Speaker 1: We all thought she was great. And this was a 43 00:02:53,840 --> 00:02:59,000 Speaker 1: neuroscience graduate program full of people who were aspiring learners 44 00:02:59,040 --> 00:03:03,120 Speaker 1: about the human brain and faculty who were presumably already 45 00:03:03,400 --> 00:03:06,320 Speaker 1: experts in the brain. And yet every single one of 46 00:03:06,400 --> 00:03:09,639 Speaker 1: us thought that Tanya was great. None of us even 47 00:03:09,680 --> 00:03:13,799 Speaker 1: had the briefest glimpse of doubt or suspicion when she 48 00:03:13,919 --> 00:03:19,240 Speaker 1: started school. And we were all maximally surprised when we 49 00:03:19,280 --> 00:03:22,200 Speaker 1: saw how completely we had been fooled. 50 00:03:23,120 --> 00:03:25,120 Speaker 2: So why were we so blind? 51 00:03:26,200 --> 00:03:29,280 Speaker 1: Well, first of all, none of us would have thought 52 00:03:29,360 --> 00:03:33,360 Speaker 1: about faking our transcripts and writing fake letters and so on. 53 00:03:33,440 --> 00:03:34,160 Speaker 3: That kind of 54 00:03:34,720 --> 00:03:40,040 Speaker 1: deception didn't exist in our mental models, and so it 55 00:03:40,160 --> 00:03:42,600 Speaker 1: was totally invisible to us when it was sitting there 56 00:03:42,680 --> 00:03:45,440 Speaker 1: right in front of us. And one of the themes 57 00:03:45,480 --> 00:03:48,040 Speaker 1: of this podcast and of my next book is that 58 00:03:48,480 --> 00:03:52,280 Speaker 1: we need to get better at seeing outside the garden 59 00:03:52,320 --> 00:03:56,640 Speaker 1: walls of our own internal models. This is really what 60 00:03:56,760 --> 00:04:01,840 Speaker 1: the passage into maturity is about: seeing the limitations of 61 00:04:01,880 --> 00:04:06,200 Speaker 1: our own thinking and realizing that what's going on in 62 00:04:06,240 --> 00:04:09,960 Speaker 1: someone else's head might be very different from what's going 63 00:04:10,000 --> 00:04:13,640 Speaker 1: on inside ours. Even if we're not the kind of 64 00:04:13,720 --> 00:04:18,240 Speaker 1: person to do something, even if it seems absolutely unimaginable 65 00:04:18,279 --> 00:04:20,200 Speaker 1: to us, it doesn't mean that 66 00:04:20,120 --> 00:04:21,560 Speaker 2: it seems that way to someone else. 67 00:04:22,080 --> 00:04:24,560 Speaker 1: And if you heard episodes twenty and twenty one, you'll 68 00:04:24,560 --> 00:04:27,159 Speaker 1: know that we dove into some of the really awful 69 00:04:27,200 --> 00:04:31,480 Speaker 1: things that happened during wartime. And again, just because you 70 00:04:31,680 --> 00:04:35,880 Speaker 1: can't imagine hacking your neighbors to death with a machete 71 00:04:35,960 --> 00:04:39,360 Speaker 1: or shooting your neighbors or bayonetting them, it doesn't mean 72 00:04:39,880 --> 00:04:44,880 Speaker 1: that someone else can't imagine that and won't foment violence 73 00:04:44,960 --> 00:04:48,800 Speaker 1: without having much compunction about it.
So an understanding of 74 00:04:49,040 --> 00:04:55,440 Speaker 1: history requires an expansion of our mental models, and that's 75 00:04:55,440 --> 00:04:59,520 Speaker 1: what's required for navigating day to day life as well, 76 00:04:59,600 --> 00:05:03,719 Speaker 1: because not everyone is just like you on the inside. 77 00:05:03,800 --> 00:05:09,000 Speaker 1: For example, psychopaths make up about one percent of the population, 78 00:05:09,520 --> 00:05:11,240 Speaker 1: and by the way, they make up about twenty to 79 00:05:11,240 --> 00:05:15,880 Speaker 1: thirty percent of the prison population. They don't care about you, 80 00:05:15,920 --> 00:05:18,960 Speaker 1: they don't simulate what it is like to be you, 81 00:05:19,640 --> 00:05:22,320 Speaker 1: and they can be violent towards you because they just 82 00:05:22,400 --> 00:05:25,320 Speaker 1: see you as an obstacle to 83 00:05:25,400 --> 00:05:27,080 Speaker 2: flow around to get what they want. 84 00:05:27,520 --> 00:05:30,240 Speaker 1: And I'm going to do an episode on psychopathy soon, 85 00:05:30,279 --> 00:05:31,839 Speaker 1: but the point I want to make right now is 86 00:05:31,839 --> 00:05:35,520 Speaker 1: that if you are not a psychopath, it is very 87 00:05:35,520 --> 00:05:39,120 Speaker 1: difficult to imagine someone behaving that way. But you'll be 88 00:05:39,320 --> 00:05:43,040 Speaker 1: smarter in your daily life if you understand how other 89 00:05:43,160 --> 00:05:47,160 Speaker 1: people can be different from you. Now, sometimes people are 90 00:05:47,200 --> 00:05:50,279 Speaker 1: different in wonderful ways, like when you see some situation 91 00:05:50,960 --> 00:05:53,960 Speaker 1: in which someone is braver than you, or just more 92 00:05:54,080 --> 00:05:57,680 Speaker 1: charitable with a higher percentage of their money, or more 93 00:05:57,800 --> 00:06:00,599 Speaker 1: willing to do the right thing, like to climb the 94 00:06:00,680 --> 00:06:03,560 Speaker 1: side of a building to save the toddler hanging off 95 00:06:03,600 --> 00:06:07,080 Speaker 1: the balcony, even though you would be more scared. But 96 00:06:07,279 --> 00:06:11,160 Speaker 1: sometimes we see people different from us in the other direction, 97 00:06:11,279 --> 00:06:15,200 Speaker 1: people who cheat and lie and steal, and it's hard 98 00:06:15,240 --> 00:06:17,640 Speaker 1: to understand because we don't have a good model of that, 99 00:06:17,800 --> 00:06:21,920 Speaker 1: and so we're often caught completely by surprise. I'll give 100 00:06:21,960 --> 00:06:24,640 Speaker 1: you an example of this when I was young. When 101 00:06:24,680 --> 00:06:27,880 Speaker 1: I was sixteen years old, I was traveling with my 102 00:06:28,000 --> 00:06:31,800 Speaker 1: parents in Barcelona, and I was spending an afternoon walking 103 00:06:31,800 --> 00:06:34,880 Speaker 1: around by myself, and I saw a crowd of people 104 00:06:34,920 --> 00:06:38,400 Speaker 1: playing a shell game. You know, this is the game 105 00:06:38,440 --> 00:06:41,839 Speaker 1: where a person puts a small ball under one of 106 00:06:41,920 --> 00:06:45,120 Speaker 1: three cups and then rotates the cups around and around 107 00:06:45,640 --> 00:06:48,360 Speaker 1: and then you have to guess which cup the ball 108 00:06:48,440 --> 00:06:52,040 Speaker 1: is under.
So I stopped to watch because there was 109 00:06:52,080 --> 00:06:55,760 Speaker 1: a small crowd and the dealer was moving the cups around, 110 00:06:55,839 --> 00:06:58,280 Speaker 1: and there was this pedestrian like me who had put 111 00:06:58,320 --> 00:07:02,320 Speaker 1: down some money. And the pedestrian watched the cups go around 112 00:07:02,360 --> 00:07:04,919 Speaker 1: and round, and when they stopped, he pointed to a 113 00:07:04,960 --> 00:07:06,160 Speaker 1: cup and it was 114 00:07:06,120 --> 00:07:06,960 Speaker 2: the wrong cup. 115 00:07:07,600 --> 00:07:10,200 Speaker 1: But I could see where the cups had moved, and 116 00:07:10,280 --> 00:07:12,240 Speaker 1: I knew it was the cup on the left, but 117 00:07:12,280 --> 00:07:15,400 Speaker 1: this pedestrian pointed to the middle. So the dealer uncovered 118 00:07:15,440 --> 00:07:18,400 Speaker 1: the middle cup, and the whole crowd made a whooping sound, 119 00:07:18,800 --> 00:07:21,520 Speaker 1: and so the pedestrian put down more money to play 120 00:07:21,560 --> 00:07:24,800 Speaker 1: another round, and the dealer shows the ball under the 121 00:07:24,880 --> 00:07:27,680 Speaker 1: left cup, and then he rotates the cups around faster 122 00:07:27,720 --> 00:07:30,800 Speaker 1: and faster, but I kept my eyes locked on the 123 00:07:30,800 --> 00:07:34,520 Speaker 1: correct cup, and again the pedestrian guessed wrong, but I 124 00:07:34,600 --> 00:07:37,160 Speaker 1: knew where the ball was. So this happens for a few 125 00:07:37,200 --> 00:07:40,760 Speaker 1: more rounds, and the pedestrian gives up, and the dealer 126 00:07:40,840 --> 00:07:42,920 Speaker 1: looks at me and motions for me to put up 127 00:07:42,960 --> 00:07:45,960 Speaker 1: some money, so I did. So he shows me the 128 00:07:45,960 --> 00:07:48,600 Speaker 1: ball and rotates the cups around and around, and I 129 00:07:48,760 --> 00:07:50,560 Speaker 1: keep my eye on it, and when he stops, I 130 00:07:50,680 --> 00:07:54,679 Speaker 1: point to the correct cup and the cup was empty. 131 00:07:55,200 --> 00:07:57,960 Speaker 2: I got it wrong, what was going on? 132 00:07:58,160 --> 00:08:00,240 Speaker 1: So he motions for me to put down more money, 133 00:08:00,280 --> 00:08:02,840 Speaker 1: and I want to win my lost money back, so 134 00:08:02,880 --> 00:08:05,520 Speaker 1: I put down more and he runs the rotations again, 135 00:08:05,800 --> 00:08:08,200 Speaker 1: and I point to the cup where the ball should be, 136 00:08:08,680 --> 00:08:12,440 Speaker 1: and again it's empty. And before I know it, someone 137 00:08:12,440 --> 00:08:14,640 Speaker 1: in the crowd makes a whoop sound, and suddenly the 138 00:08:14,680 --> 00:08:17,840 Speaker 1: dealer folds up the board and the entire crowd disappears, 139 00:08:18,440 --> 00:08:21,280 Speaker 1: and I'm standing there all by myself in the street, 140 00:08:21,360 --> 00:08:24,160 Speaker 1: and I felt like such a fool because I had 141 00:08:24,200 --> 00:08:27,280 Speaker 1: just been deceived. Now this is embarrassing for me to 142 00:08:27,320 --> 00:08:29,520 Speaker 1: tell the story, and even all these years later, there 143 00:08:29,600 --> 00:08:31,400 Speaker 1: is some pain in the remembrance. 144 00:08:31,720 --> 00:08:32,600 Speaker 3: But my hope in
145 00:08:32,520 --> 00:08:36,480 Speaker 1: relating the story is that at least one teenage listener 146 00:08:36,600 --> 00:08:39,840 Speaker 1: gets an expansion of their mental model from this and 147 00:08:40,120 --> 00:08:42,600 Speaker 1: doesn't have to play this game. Just in case you 148 00:08:42,600 --> 00:08:45,720 Speaker 1: don't know, there is no honest version of the shell game. 149 00:08:45,760 --> 00:08:49,040 Speaker 1: It's always performed by hucksters who use sleight of hand 150 00:08:49,080 --> 00:08:51,840 Speaker 1: to move the ball from one cup to another, and 151 00:08:51,880 --> 00:08:55,760 Speaker 1: the whole crowd is in on the deception. Now, when 152 00:08:55,800 --> 00:08:57,680 Speaker 1: I did some research on this, I found the shell 153 00:08:57,720 --> 00:09:01,480 Speaker 1: game is very old, so the game of people trying 154 00:09:01,520 --> 00:09:03,920 Speaker 1: to deceive other people is ancient. 155 00:09:04,920 --> 00:09:05,160 Speaker 3: Now. 156 00:09:05,240 --> 00:09:10,160 Speaker 1: Sometimes deception is planned in advance like this, and sometimes 157 00:09:10,200 --> 00:09:12,400 Speaker 1: people are just trying to get out of a bad situation 158 00:09:12,559 --> 00:09:15,800 Speaker 1: and they make it worse. Sometimes it's hard to tell. I mean, 159 00:09:15,800 --> 00:09:18,800 Speaker 1: look at the company, Theranos, which you've probably heard of. 160 00:09:18,840 --> 00:09:21,800 Speaker 1: They were a health tech company founded by a young 161 00:09:21,840 --> 00:09:25,280 Speaker 1: woman named Elizabeth Holmes, and they were out to develop 162 00:09:25,400 --> 00:09:29,720 Speaker 1: a biological test that could measure a whole bunch of 163 00:09:29,760 --> 00:09:33,240 Speaker 1: things using just a drop or two of blood. So 164 00:09:33,320 --> 00:09:37,240 Speaker 1: they raised seven hundred and twenty four million dollars from investors, 165 00:09:37,760 --> 00:09:41,880 Speaker 1: including the media mogul Rupert Murdoch and former Secretary of 166 00:09:41,920 --> 00:09:45,560 Speaker 1: State Henry Kissinger, and all kinds of big players were 167 00:09:45,679 --> 00:09:51,080 Speaker 1: enthusiastic about this revolutionary technology. But this all came crashing 168 00:09:51,160 --> 00:09:55,480 Speaker 1: down in twenty fifteen when it surfaced that Theranos had 169 00:09:55,520 --> 00:09:59,680 Speaker 1: been lying about what their technology could do. Holmes was 170 00:09:59,760 --> 00:10:03,760 Speaker 1: charged with fraud and conspiracy, and she was found guilty 171 00:10:03,800 --> 00:10:06,960 Speaker 1: in twenty twenty two. Now you probably know this story, 172 00:10:06,960 --> 00:10:10,560 Speaker 1: but the interesting backside of this story is that so many 173 00:10:10,600 --> 00:10:13,240 Speaker 1: people here in Silicon Valley have said things to me 174 00:10:13,840 --> 00:10:18,959 Speaker 1: suggesting that they would have never fallen for Theranos, like, oh, 175 00:10:19,000 --> 00:10:21,800 Speaker 1: I would have known right away that couldn't work. But 176 00:10:21,920 --> 00:10:25,480 Speaker 1: that's silly, and I generally don't believe them because it's 177 00:10:25,559 --> 00:10:28,960 Speaker 1: easy to get duped.
And if the other people who 178 00:10:28,960 --> 00:10:31,200 Speaker 1: believe in it and invest in it and sit on 179 00:10:31,240 --> 00:10:35,079 Speaker 1: the board are billionaires and big shots, what's to make 180 00:10:35,120 --> 00:10:38,800 Speaker 1: you think that wouldn't have a great gravitational pull on you? 181 00:10:39,080 --> 00:10:41,400 Speaker 1: Because the truth is, when we look at things that 182 00:10:41,760 --> 00:10:45,320 Speaker 1: other people believe in, or even simply things that match 183 00:10:45,360 --> 00:10:51,120 Speaker 1: our expectations, we often don't do any further looking into it. Now, 184 00:10:51,120 --> 00:10:53,959 Speaker 1: this puts us in a tough spot because we have 185 00:10:54,080 --> 00:10:57,960 Speaker 1: to trust other people. We really have no choice, because 186 00:10:58,000 --> 00:11:01,839 Speaker 1: we can't disbelieve everything or have time to check 187 00:11:01,880 --> 00:11:05,679 Speaker 1: on everything. So how do we work around this bias? 188 00:11:06,080 --> 00:11:08,880 Speaker 1: How can we take some of the tools of science, 189 00:11:09,320 --> 00:11:13,160 Speaker 1: which are all about clear thinking, and import these into 190 00:11:13,200 --> 00:11:17,040 Speaker 1: our daily lives? Well, I'm no expert on this. I 191 00:11:17,080 --> 00:11:20,080 Speaker 1: often err on the side of believing everyone, but I 192 00:11:20,120 --> 00:11:24,360 Speaker 1: knew who to call. My colleagues Dan Simons and Christopher 193 00:11:24,360 --> 00:11:27,959 Speaker 1: Chabris recently wrote a terrific book all about deception called 194 00:11:28,360 --> 00:11:30,760 Speaker 1: Nobody's Fool. So I rang them up. 195 00:11:31,559 --> 00:11:34,760 Speaker 4: The most interesting thing we found about what all the 196 00:11:34,800 --> 00:11:38,679 Speaker 4: cons and deceptions have in common is that the con artists, 197 00:11:38,760 --> 00:11:41,040 Speaker 4: the scammers, the swindlers, whatever you want to call them, 198 00:11:41,559 --> 00:11:44,800 Speaker 4: all seem to be taking advantage of sort of the 199 00:11:44,840 --> 00:11:50,920 Speaker 4: same set of our cognitive proclivities and our attentional biases. 200 00:11:50,960 --> 00:11:53,560 Speaker 4: What we like to pay attention to, what attracts us, 201 00:11:54,160 --> 00:11:58,520 Speaker 4: and what mistakes we tend to make, and decision making 202 00:11:58,559 --> 00:12:00,400 Speaker 4: that we may not be aware of. 203 00:12:01,080 --> 00:12:05,240 Speaker 1: That's Christopher Chabris, a professor of psychology and director of 204 00:12:05,320 --> 00:12:08,720 Speaker 1: Decision Sciences at Geisinger Research Institute. 205 00:12:08,760 --> 00:12:10,800 Speaker 4: They may not be consciously aware of, but somehow they 206 00:12:10,840 --> 00:12:13,840 Speaker 4: have sort of learned to adapt their schemes to things 207 00:12:13,920 --> 00:12:16,800 Speaker 4: that tend to exploit these loopholes in our thinking. Not 208 00:12:16,920 --> 00:12:21,959 Speaker 4: loopholes that are sort of design flaws necessarily, they're actually usually, 209 00:12:22,040 --> 00:12:24,440 Speaker 4: you know, good things about how we think. But when 210 00:12:24,440 --> 00:12:26,320 Speaker 4: someone is really trying to take advantage of us, they 211 00:12:26,320 --> 00:12:28,920 Speaker 4: can cleverly exploit those and gain the advantage.
212 00:12:29,360 --> 00:12:33,600 Speaker 1: So what are the important lessons for all of us 213 00:12:33,720 --> 00:12:36,760 Speaker 1: to think about? Given that, I'd say there are a few. 214 00:12:37,840 --> 00:12:41,480 Speaker 1: And that's Dan Simons, a professor of psychology at the 215 00:12:41,600 --> 00:12:48,120 Speaker 1: University of Illinois. One is that we have a tendency 216 00:12:48,160 --> 00:12:52,400 Speaker 1: to assume that only the most gullible or naive or 217 00:12:52,800 --> 00:12:57,080 Speaker 1: uneducated people fall for scams. And that's partly because we 218 00:12:57,200 --> 00:13:00,959 Speaker 1: generally only see the results of cons and scams after 219 00:13:00,960 --> 00:13:03,000 Speaker 1: they're over, right? So it's easy to see what the 220 00:13:03,000 --> 00:13:05,800 Speaker 1: red flags were when you know it was a scam, 221 00:13:05,880 --> 00:13:08,240 Speaker 1: when you found out it was a scam, in the same 222 00:13:08,280 --> 00:13:12,160 Speaker 1: way that we can easily spot, you know, the obvious 223 00:13:12,200 --> 00:13:15,160 Speaker 1: ones in advance, things like the Nigerian prince email scam 224 00:13:15,160 --> 00:13:17,640 Speaker 1: that we know about. We can spot those red flags 225 00:13:17,679 --> 00:13:20,880 Speaker 1: because we've seen them before. But we tend to assume 226 00:13:20,880 --> 00:13:23,560 Speaker 1: that all scams prey on the people who are gullible. 227 00:13:23,840 --> 00:13:26,560 Speaker 1: And one of the key insights we've found across all of 228 00:13:26,559 --> 00:13:29,320 Speaker 1: the sorts of scams that we've encountered is that scams 229 00:13:29,320 --> 00:13:32,800 Speaker 1: can affect anybody. Cons can affect anybody if they're targeted 230 00:13:33,080 --> 00:13:37,240 Speaker 1: in the right way to our wants and desires and needs. Yeah, 231 00:13:37,240 --> 00:13:39,880 Speaker 1: you know, I thought about this a lot with Theranos 232 00:13:40,160 --> 00:13:45,320 Speaker 1: here in Silicon Valley. Retrospectively, everyone acted like I would 233 00:13:45,320 --> 00:13:47,960 Speaker 1: have never fallen for that. But it's obvious that a 234 00:13:48,000 --> 00:13:51,560 Speaker 1: lot of good and smart people got sucked up into that. 235 00:13:52,120 --> 00:13:55,160 Speaker 1: And so how do you, how do you interpret that? 236 00:13:55,200 --> 00:13:58,119 Speaker 5: Well, I think the way we see something like Theranos 237 00:13:58,600 --> 00:14:01,120 Speaker 5: is in hindsight, after the fact, in the same way 238 00:14:01,160 --> 00:14:03,040 Speaker 5: that we might watch a heist movie or a 239 00:14:03,120 --> 00:14:05,839 Speaker 5: whodunit movie, where we know there's a heist, right, 240 00:14:05,880 --> 00:14:08,520 Speaker 5: we know there's a con artist, we know that. In 241 00:14:09,760 --> 00:14:11,520 Speaker 5: the context of the movie, we know to look for 242 00:14:11,559 --> 00:14:14,120 Speaker 5: those red flags. We're trying to figure it out, and 243 00:14:14,280 --> 00:14:17,000 Speaker 5: the characters in the movie aren't. But when it's viewed 244 00:14:17,000 --> 00:14:20,360 Speaker 5: from the outside, it's kind of obvious, right? So Theranos, 245 00:14:20,480 --> 00:14:22,760 Speaker 5: after the fact, yeah, there are lots of red flags 246 00:14:22,760 --> 00:14:24,880 Speaker 5: along the way, and they've been reported thoroughly, and it's 247 00:14:25,000 --> 00:14:28,040 Speaker 5: a great narrative.
But when you're immersed in it and you're 248 00:14:28,040 --> 00:14:30,360 Speaker 5: trying to figure out what's the next best investment or 249 00:14:30,400 --> 00:14:31,920 Speaker 5: what do I want to get in on really quick? 250 00:14:31,920 --> 00:14:34,520 Speaker 5: If you're a venture capitalist trying to kind of get 251 00:14:34,600 --> 00:14:37,320 Speaker 5: in on the next big thing, spotting all those red 252 00:14:37,320 --> 00:14:40,960 Speaker 5: flags is more difficult because you're incentivized to act with 253 00:14:41,040 --> 00:14:43,560 Speaker 5: efficiency and to try and catch things before they take 254 00:14:43,600 --> 00:14:46,480 Speaker 5: off and before people know about them. So those are 255 00:14:46,520 --> 00:14:49,440 Speaker 5: the contexts in which they're the marks, rather than watching 256 00:14:49,520 --> 00:14:51,520 Speaker 5: some interesting, engaging movie. 257 00:14:51,560 --> 00:14:54,320 Speaker 4: In the case of Theranos also, you know, there were 258 00:14:54,360 --> 00:14:57,200 Speaker 4: people who didn't invest and who didn't join the board 259 00:14:57,200 --> 00:14:59,040 Speaker 4: of directors and so on. They don't get as much 260 00:14:59,080 --> 00:15:02,800 Speaker 4: publicity as the unfortunate ones who did and look like 261 00:15:02,880 --> 00:15:05,800 Speaker 4: marks and suckers and so on in retrospect. But some 262 00:15:05,840 --> 00:15:09,760 Speaker 4: professional investors who specialized in biotech and healthcare investing, they 263 00:15:09,760 --> 00:15:13,200 Speaker 4: asked a lot more questions about the product, about the technology, 264 00:15:13,240 --> 00:15:15,640 Speaker 4: about clinical data, about all of that stuff, and then 265 00:15:16,080 --> 00:15:20,480 Speaker 4: they walked away. And I think one other important point 266 00:15:20,520 --> 00:15:22,480 Speaker 4: about Theranos is, I think, although I don't know, 267 00:15:22,480 --> 00:15:24,240 Speaker 4: because I'm not inside the heads of all those people, 268 00:15:24,280 --> 00:15:27,239 Speaker 4: but I think a lot of people didn't even consciously 269 00:15:27,400 --> 00:15:29,640 Speaker 4: consider the idea that there might have been a scam 270 00:15:29,720 --> 00:15:33,680 Speaker 4: or a fraud going on. Everything seemed good, everybody was optimistic, 271 00:15:33,760 --> 00:15:36,480 Speaker 4: there was a great vision. Little things that seem kind 272 00:15:36,480 --> 00:15:38,280 Speaker 4: of odd, maybe you can explain away: this is just 273 00:15:38,320 --> 00:15:42,080 Speaker 4: a quirky company. The CEO is a little odd. You know, well, 274 00:15:42,120 --> 00:15:43,840 Speaker 4: they've got all these famous people on the board. That 275 00:15:43,920 --> 00:15:47,240 Speaker 4: must be a good thing. Simply considering the possibility in 276 00:15:47,280 --> 00:15:49,760 Speaker 4: a big decision making situation that you maybe are being 277 00:15:49,800 --> 00:15:52,600 Speaker 4: scammed or there's something going on that you're not aware of, 278 00:15:53,080 --> 00:15:55,280 Speaker 4: you know, could be the first step towards like starting 279 00:15:55,280 --> 00:15:57,960 Speaker 4: to see those red flags or look for those red flags, 280 00:15:57,960 --> 00:16:00,280 Speaker 4: and maybe you can actually find some of them if 281 00:16:00,320 --> 00:16:02,480 Speaker 4: you were even thinking about the possibility that they might 282 00:16:02,520 --> 00:16:06,440 Speaker 4: be out there somewhere.
We require a lot of trust 283 00:16:06,840 --> 00:16:09,440 Speaker 4: just to get by in life. And so how do 284 00:16:09,560 --> 00:16:13,120 Speaker 4: you guys think about striking a balance of trust and 285 00:16:13,160 --> 00:16:14,200 Speaker 4: a little bit of suspicion? 286 00:16:15,240 --> 00:16:17,760 Speaker 5: Well, trust is essential, right. In fact, we tend to 287 00:16:17,840 --> 00:16:21,560 Speaker 5: assume that when we hear something from somebody it's true 288 00:16:21,640 --> 00:16:23,960 Speaker 5: until we take time to think about it otherwise. 289 00:16:24,000 --> 00:16:26,320 Speaker 5: And most of the time that's a great thing because 290 00:16:26,320 --> 00:16:29,120 Speaker 5: in most conversations, nobody's trying to lie to you. In 291 00:16:29,280 --> 00:16:31,840 Speaker 5: most interactions, nobody's trying to con you. I mean, the 292 00:16:31,880 --> 00:16:34,400 Speaker 5: odds of any of us being a victim of a 293 00:16:34,440 --> 00:16:37,200 Speaker 5: Bernie Madoff or a Theranos is pretty low. 294 00:16:37,680 --> 00:16:40,520 Speaker 5: The odds of any of us receiving fake information on 295 00:16:40,560 --> 00:16:43,240 Speaker 5: social media is pretty high. But we tend to be 296 00:16:43,280 --> 00:16:46,440 Speaker 5: trusting of the information we get, and it's a good 297 00:16:46,480 --> 00:16:47,000 Speaker 5: thing that we are. 298 00:16:47,120 --> 00:16:47,240 Speaker 4: Right. 299 00:16:47,240 --> 00:16:50,080 Speaker 5: If we were constantly skeptical of everything we encountered, we 300 00:16:50,080 --> 00:16:52,840 Speaker 5: could just never do anything. We could never have a conversation. 301 00:16:53,240 --> 00:16:56,440 Speaker 5: You can't check everything. You can't be a perpetual skeptic 302 00:16:56,520 --> 00:16:59,880 Speaker 5: or cynic about everything. You're not going to go check, 303 00:17:00,720 --> 00:17:04,520 Speaker 5: in the grocery store, if you buy an organic apple, right? 304 00:17:04,560 --> 00:17:05,840 Speaker 5: You're not going to go out to the farm and 305 00:17:05,840 --> 00:17:09,080 Speaker 5: make sure they didn't use any pesticides, right? It's too much. 306 00:17:09,760 --> 00:17:12,120 Speaker 5: We can't really check that. We have to accept that 307 00:17:12,320 --> 00:17:13,960 Speaker 5: some of the time we're going to have to be trusting. 308 00:17:14,560 --> 00:17:16,240 Speaker 5: And the key is to kind of figure out when 309 00:17:16,280 --> 00:17:18,840 Speaker 5: are those times when we're at the greatest risk, when 310 00:17:18,880 --> 00:17:22,880 Speaker 5: are those times when the consequences could be bad enough 311 00:17:22,920 --> 00:17:24,520 Speaker 5: that we really would want to check and see if 312 00:17:24,520 --> 00:17:25,160 Speaker 5: we're being scammed. 313 00:17:25,840 --> 00:17:28,280 Speaker 2: So before we go on to some other topics, just 314 00:17:28,560 --> 00:17:31,719 Speaker 2: can you give a few examples of hoaxes or swindles 315 00:17:31,800 --> 00:17:35,720 Speaker 2: or scams so that our listeners can understand what we're 316 00:17:35,720 --> 00:17:36,240 Speaker 2: talking about? 317 00:17:37,480 --> 00:17:39,439 Speaker 4: I have a good one that I think a lot 318 00:17:39,440 --> 00:17:41,679 Speaker 4: of people probably haven't heard of, but they really should have, 319 00:17:42,200 --> 00:17:45,800 Speaker 4: which is called sometimes the president scam or the CEO scam, 320 00:17:46,000 --> 00:17:48,840 Speaker 4: and I didn't discover it.
It had been going 321 00:17:48,880 --> 00:17:50,960 Speaker 4: on for a while and it was documented elsewhere. But 322 00:17:51,280 --> 00:17:53,560 Speaker 4: I think it's a great example of some 323 00:17:53,640 --> 00:17:59,000 Speaker 4: of the key ideas. So this French-Israeli fraudster named 324 00:17:59,080 --> 00:18:03,920 Speaker 4: Gilbert Chikli developed a scam in which he would call 325 00:18:04,040 --> 00:18:08,040 Speaker 4: up sort of mid level employees of French companies, pretending 326 00:18:08,080 --> 00:18:11,760 Speaker 4: to be the CEO of the company, reaching down through 327 00:18:11,800 --> 00:18:13,880 Speaker 4: the ranks and calling up some middle manager and giving 328 00:18:13,920 --> 00:18:17,720 Speaker 4: them a task to do directly for him. And the 329 00:18:17,800 --> 00:18:21,359 Speaker 4: task always wound up with money being transferred directly to 330 00:18:21,440 --> 00:18:26,160 Speaker 4: some bank account or person or something like that, where 331 00:18:26,240 --> 00:18:28,280 Speaker 4: of course it wound up with Chikli and whoever his 332 00:18:28,359 --> 00:18:32,240 Speaker 4: associates were in their bank account somehow. And I think 333 00:18:32,240 --> 00:18:35,880 Speaker 4: it's kind of an audacious con because it's one guy 334 00:18:35,960 --> 00:18:38,360 Speaker 4: with a telephone calling people up who he's never met 335 00:18:38,400 --> 00:18:43,359 Speaker 4: before and talking them into essentially giving him a lot 336 00:18:43,400 --> 00:18:47,160 Speaker 4: of money. But it does illustrate sort of the idea 337 00:18:47,160 --> 00:18:49,280 Speaker 4: of truth bias that Dan was just talking about, that 338 00:18:49,720 --> 00:18:51,879 Speaker 4: if you don't believe that the person on the phone 339 00:18:51,920 --> 00:18:54,760 Speaker 4: is the CEO calling you, the whole thing goes nowhere. 340 00:18:55,200 --> 00:18:57,280 Speaker 4: But once you believe that, then the scam has a 341 00:18:57,359 --> 00:18:59,720 Speaker 4: chance to get through. And it also illustrates sort of 342 00:18:59,720 --> 00:19:02,640 Speaker 4: the selection bias we see 343 00:19:02,640 --> 00:19:04,720 Speaker 4: in cons. Like, we hear about the ones that worked, 344 00:19:04,920 --> 00:19:06,399 Speaker 4: but we don't know about all the people who just 345 00:19:06,400 --> 00:19:08,560 Speaker 4: like hung up the phone or deleted the email when 346 00:19:08,600 --> 00:19:10,480 Speaker 4: he tried to, you know, start talking 347 00:19:10,520 --> 00:19:12,680 Speaker 4: them into it. Just like the millions and millions 348 00:19:12,720 --> 00:19:15,320 Speaker 4: of people who delete the Nigerian prince emails never get 349 00:19:15,320 --> 00:19:17,280 Speaker 4: mentioned anywhere. You know, it's just a few people 350 00:19:17,320 --> 00:19:20,680 Speaker 4: who actually wind up going through with it.
That CEO 351 00:19:20,800 --> 00:19:24,120 Speaker 4: scam or president scam went along for quite a long 352 00:19:24,160 --> 00:19:26,560 Speaker 4: time and sort of morphed and changed into different versions 353 00:19:26,600 --> 00:19:29,240 Speaker 4: where eventually people were pretending to be the Defense Minister 354 00:19:29,320 --> 00:19:34,520 Speaker 4: of France calling, you know, contacting wealthy individuals, especially with 355 00:19:34,600 --> 00:19:37,560 Speaker 4: French ties, and saying that the government of France needed 356 00:19:37,560 --> 00:19:41,919 Speaker 4: their help getting hostages, secret hostages, out of Syria and 357 00:19:41,960 --> 00:19:44,720 Speaker 4: from ISIS and so on, and wound up taking I think 358 00:19:44,760 --> 00:19:47,359 Speaker 4: something like eighty million dollars or eighty million euros in 359 00:19:47,400 --> 00:19:50,520 Speaker 4: total from a number of, you know, French companies and 360 00:19:50,560 --> 00:19:53,240 Speaker 4: wealthy individuals by sort of similar tactics. 361 00:19:53,760 --> 00:19:55,719 Speaker 5: There's a new modern version of this which is much 362 00:19:55,800 --> 00:19:58,080 Speaker 5: dumber and much simpler. It doesn't require any sort of 363 00:19:58,080 --> 00:20:02,439 Speaker 5: sophisticated persuasion. People just send an email purportedly from the 364 00:20:02,440 --> 00:20:04,520 Speaker 5: boss of a company and saying, Hey, I'm in a 365 00:20:04,560 --> 00:20:06,480 Speaker 5: meeting right now, but I need to transfer these funds 366 00:20:06,520 --> 00:20:08,720 Speaker 5: right away or I need to close this sale. Can 367 00:20:08,760 --> 00:20:10,960 Speaker 5: you just go ahead and make this transfer for me? 368 00:20:11,680 --> 00:20:14,919 Speaker 5: And if the email happens to reach somebody who is 369 00:20:15,000 --> 00:20:18,159 Speaker 5: responsible for doing that, then they might go ahead and 370 00:20:18,200 --> 00:20:20,560 Speaker 5: do it without even double checking, when in reality the 371 00:20:20,560 --> 00:20:23,320 Speaker 5: money would just get paid to some other account of 372 00:20:23,320 --> 00:20:27,119 Speaker 5: the scammers. And this is so pervasive that I have 373 00:20:27,160 --> 00:20:29,560 Speaker 5: a cousin who teaches tennis. She runs a tennis club, 374 00:20:29,600 --> 00:20:34,440 Speaker 5: and regularly her underlings, the other people teaching there, 375 00:20:34,600 --> 00:20:37,879 Speaker 5: regularly get emails from her, not really from her, but 376 00:20:37,920 --> 00:20:40,119 Speaker 5: purportedly from her, saying, hey, I'm in a meeting, can you 377 00:20:40,200 --> 00:20:44,280 Speaker 5: run this? Can you make this payment? And her employees 378 00:20:44,520 --> 00:20:47,239 Speaker 5: and coworkers know that that's not true. She's pretty much 379 00:20:47,240 --> 00:20:49,720 Speaker 5: never in meetings, and they're not the sorts of people 380 00:20:49,720 --> 00:20:52,520 Speaker 5: who make purchases. But again, if you send it out 381 00:20:52,520 --> 00:20:55,119 Speaker 5: to enough people, you're going to happen to hit some 382 00:20:55,200 --> 00:20:57,359 Speaker 5: who in that moment are busy doing things, are used 383 00:20:57,359 --> 00:20:59,119 Speaker 5: to getting emails from their boss asking them to do 384 00:20:59,160 --> 00:21:01,800 Speaker 5: something really quickly, and will go along with it without 385 00:21:01,880 --> 00:21:05,240 Speaker 5: questioning it. And this is a major source of business.
386 00:21:05,320 --> 00:21:05,560 Speaker 3: Run. 387 00:21:06,200 --> 00:21:09,000 Speaker 1: Do you guys have any other examples, just something that, 388 00:21:09,240 --> 00:21:12,480 Speaker 1: some hoax or swindle or something that you came across 389 00:21:12,920 --> 00:21:15,160 Speaker 1: that you think is really illustrative? 390 00:21:15,600 --> 00:21:19,000 Speaker 4: I can give another example from the world of chess. 391 00:21:19,160 --> 00:21:21,240 Speaker 3: So I'm a chess player. Well, Dan is also a 392 00:21:21,320 --> 00:21:21,840 Speaker 3: chess player. 393 00:21:22,119 --> 00:21:24,560 Speaker 4: I'm probably a more serious player than Dan is, and 394 00:21:25,320 --> 00:21:29,159 Speaker 4: I'm a funnier player. Dan's a funny player. I'm a 395 00:21:29,160 --> 00:21:33,399 Speaker 4: serious player. I'll try to be funnier though. When you 396 00:21:33,440 --> 00:21:36,560 Speaker 4: play chess online, you don't see the person you're playing. 397 00:21:36,880 --> 00:21:39,879 Speaker 4: It's just a screen name and all you see is 398 00:21:39,920 --> 00:21:42,479 Speaker 4: the moves they play on an animated chess board, kind 399 00:21:42,520 --> 00:21:43,920 Speaker 4: of like you're playing a video game, right. You just 400 00:21:43,960 --> 00:21:47,080 Speaker 4: see the moves being made. And so I was playing 401 00:21:47,080 --> 00:21:49,120 Speaker 4: a game once. This has happened to me more than once. 402 00:21:49,119 --> 00:21:50,760 Speaker 4: But the occasion that I remember is I was playing 403 00:21:50,760 --> 00:21:51,959 Speaker 4: a game and it was a guy I had never 404 00:21:52,000 --> 00:21:55,240 Speaker 4: played before and the game started, and he had a 405 00:21:55,240 --> 00:21:57,880 Speaker 4: similar rating to mine, meaning, you know, we were both 406 00:21:57,880 --> 00:22:00,119 Speaker 4: pretty good players. I should be ready for, you know, 407 00:22:00,200 --> 00:22:03,000 Speaker 4: I should be ready for a good game. And every 408 00:22:03,040 --> 00:22:05,879 Speaker 4: move I made, it seemed like he always found like 409 00:22:05,920 --> 00:22:08,880 Speaker 4: a great response, and he never made a mistake. And 410 00:22:09,240 --> 00:22:11,080 Speaker 4: times when I thought I was winning, I really wasn't 411 00:22:11,080 --> 00:22:14,320 Speaker 4: winning because he found the escape. And moreover, he was 412 00:22:14,359 --> 00:22:16,919 Speaker 4: moving quite quickly, like he would make every move in 413 00:22:16,960 --> 00:22:18,280 Speaker 4: like five to ten seconds. And I was like, wow, 414 00:22:18,320 --> 00:22:19,800 Speaker 4: this guy's putting a lot of pressure on me. You know, 415 00:22:20,160 --> 00:22:22,040 Speaker 4: I'm thinking for a minute sometimes on these moves, and 416 00:22:22,080 --> 00:22:24,679 Speaker 4: he comes back in ten seconds all the time. And 417 00:22:24,720 --> 00:22:26,680 Speaker 4: in the end I got checkmated and I lost the game, 418 00:22:27,080 --> 00:22:29,800 Speaker 4: and I thought, wow, that guy, like, that guy played 419 00:22:29,800 --> 00:22:31,480 Speaker 4: a really good game against me.
But then when I 420 00:22:31,520 --> 00:22:34,359 Speaker 4: looked at the game afterwards, chess dot com where we 421 00:22:34,359 --> 00:22:37,160 Speaker 4: played this game shows you exactly how much time both 422 00:22:37,160 --> 00:22:40,040 Speaker 4: players used on every move after the game, and I 423 00:22:40,080 --> 00:22:42,080 Speaker 4: noticed that he was making all of his moves within 424 00:22:42,119 --> 00:22:45,760 Speaker 4: that very tight band of five to ten seconds per move, basically, 425 00:22:46,040 --> 00:22:48,879 Speaker 4: never less than five seconds, never more than ten seconds. 426 00:22:48,960 --> 00:22:50,560 Speaker 4: Maybe it was twelve seconds, I don't remember, but if 427 00:22:50,560 --> 00:22:52,560 Speaker 4: you looked at my own graph, there were a couple 428 00:22:52,560 --> 00:22:54,119 Speaker 4: of moves that I took like one or two minutes 429 00:22:54,160 --> 00:22:56,719 Speaker 4: on and some moves I played almost instantly, like one second. 430 00:22:57,640 --> 00:23:01,199 Speaker 4: The consistency of his timing, and also the consistency of 431 00:23:01,240 --> 00:23:03,320 Speaker 4: the fact that he never made a mistake. All of 432 00:23:03,359 --> 00:23:06,000 Speaker 4: his moves were, you know, almost the best move, if 433 00:23:06,040 --> 00:23:10,480 Speaker 4: not the best move according to computer analysis, really reveals 434 00:23:10,520 --> 00:23:12,439 Speaker 4: that all he was doing was being a conduit for 435 00:23:12,480 --> 00:23:14,960 Speaker 4: a computer. Like, he was just typing my moves into 436 00:23:14,960 --> 00:23:17,240 Speaker 4: a computer and putting back into chess dot 437 00:23:17,280 --> 00:23:19,480 Speaker 4: com the moves that the computer told him to play. 438 00:23:19,840 --> 00:23:23,240 Speaker 4: And here's an example where the behavior of a human, 439 00:23:23,760 --> 00:23:26,280 Speaker 4: you know, really ought to be more noisy in some 440 00:23:26,400 --> 00:23:29,360 Speaker 4: fundamental way than the behavior of a computer. Humans never 441 00:23:29,400 --> 00:23:33,880 Speaker 4: play moves with robotic cadences. They never play the correct 442 00:23:33,880 --> 00:23:37,200 Speaker 4: move forty or fifty moves in a row, and there's 443 00:23:37,240 --> 00:23:40,280 Speaker 4: just much more variability in human decision making and almost 444 00:23:40,280 --> 00:23:44,320 Speaker 4: any human activity than there is in computer based activity. 445 00:23:44,320 --> 00:23:46,320 Speaker 4: So I filed a report. You know, you can report 446 00:23:46,320 --> 00:23:48,200 Speaker 4: any player, and sure enough, like a day or two later, 447 00:23:48,280 --> 00:23:49,919 Speaker 4: chess dot com came back and said, we're giving you 448 00:23:49,960 --> 00:23:52,600 Speaker 4: back the rating points you lost. This guy has violated, 449 00:23:52,640 --> 00:23:54,439 Speaker 4: you know, violated our fair play policy. And that kind 450 00:23:54,440 --> 00:23:56,320 Speaker 4: of thing happens all the time in online chess because 451 00:23:56,320 --> 00:23:59,280 Speaker 4: computers are so good that they can be used easily 452 00:23:59,320 --> 00:24:02,000 Speaker 4: to cheat. The real problem is whether people are using it 453 00:24:02,040 --> 00:24:03,840 Speaker 4: sort of over the board, in serious 454 00:24:03,840 --> 00:24:06,480 Speaker 4: tournaments and matches. And that's a whole other controversy.
But 455 00:24:06,560 --> 00:24:09,360 Speaker 4: it certainly was a case where, you know, I was essentially 456 00:24:09,359 --> 00:24:11,359 Speaker 4: the victim of a little minor scam. I happened to 457 00:24:11,359 --> 00:24:14,679 Speaker 4: figure it out, but it was a scam based on people 458 00:24:14,800 --> 00:24:19,000 Speaker 4: not noticing that, you know, not noticing the absence of variability, 459 00:24:19,040 --> 00:24:20,639 Speaker 4: which is a critical thing in 460 00:24:20,720 --> 00:24:22,280 Speaker 4: a lot of cons. 461 00:24:22,280 --> 00:24:25,119 Speaker 5: I was gonna say one thing about that particular case. It's 462 00:24:25,160 --> 00:24:29,440 Speaker 5: a fairly minor scam, right. It's just one element of 463 00:24:29,960 --> 00:24:33,200 Speaker 5: the kinds of habits and hooks that we find really compelling, 464 00:24:33,880 --> 00:24:37,159 Speaker 5: the hook of consistency, right. Yeah, it's just 465 00:24:37,240 --> 00:24:40,520 Speaker 5: that we don't look for the noise when we should. 466 00:24:40,520 --> 00:24:43,080 Speaker 5: But if you look at bigger scams, things like Bernie 467 00:24:43,080 --> 00:24:46,159 Speaker 5: Madoff's Ponzi scheme or Theranos, they rely on a 468 00:24:46,200 --> 00:24:48,760 Speaker 5: whole bunch of our cognitive tendencies and they appeal to 469 00:24:49,359 --> 00:24:51,359 Speaker 5: a lot of kinds of information that we find really 470 00:24:51,440 --> 00:24:54,120 Speaker 5: valuable and that do help us most of the time, 471 00:24:54,680 --> 00:24:55,679 Speaker 5: but they take advantage of 472 00:24:55,680 --> 00:24:58,760 Speaker 3: those to dupe us. 473 00:24:59,119 --> 00:25:01,840 Speaker 1: So can you unpack that a little bit, about noise 474 00:25:02,119 --> 00:25:04,840 Speaker 1: in data and what we should be looking for? 475 00:25:05,520 --> 00:25:08,000 Speaker 5: Well, I mean, really, in any human behavior, anything that's 476 00:25:08,280 --> 00:25:12,879 Speaker 5: governed by interaction of people, we don't expect people to 477 00:25:12,920 --> 00:25:15,120 Speaker 5: perform the same way every single time. We don't expect 478 00:25:15,440 --> 00:25:18,040 Speaker 5: a three hundred hitter or a three thirty three baseball hitter 479 00:25:18,119 --> 00:25:20,639 Speaker 5: to get a hit in exactly one out 480 00:25:20,640 --> 00:25:23,560 Speaker 5: of every three at bats, right? They will, on average, 481 00:25:24,480 --> 00:25:26,920 Speaker 5: average about one out of every three. But in any 482 00:25:26,960 --> 00:25:29,200 Speaker 5: game that doesn't guarantee they're going to get at least 483 00:25:29,240 --> 00:25:32,439 Speaker 5: one hit, right? We tend to confuse the sort of 484 00:25:32,480 --> 00:25:35,840 Speaker 5: on average performance with what happens every single time. So 485 00:25:35,960 --> 00:25:38,800 Speaker 5: take the case of Bernie Madoff's Ponzi scheme. Right, this 486 00:25:38,920 --> 00:25:41,240 Speaker 5: wasn't a sort of classic Ponzi scheme where he promised 487 00:25:41,240 --> 00:25:43,720 Speaker 5: fifty percent returns in six months like a, you know, 488 00:25:43,800 --> 00:25:48,359 Speaker 5: current crypto scam would.
His returns were eight to fourteen percent 489 00:25:48,400 --> 00:25:51,439 Speaker 5: or eight to twelve percent almost every year for the 490 00:25:51,560 --> 00:25:53,560 Speaker 5: entire life of the Ponzi scheme, with never a down 491 00:25:53,640 --> 00:25:55,520 Speaker 5: year and almost never a down month. It was like 492 00:25:55,560 --> 00:25:59,639 Speaker 5: a smooth, steady growth. And that's not what you expect 493 00:25:59,640 --> 00:26:01,760 Speaker 5: for something as complex as a financial system. You 494 00:26:01,840 --> 00:26:05,040 Speaker 5: expect ups and downs. Sometimes you'll be up twenty five 495 00:26:05,040 --> 00:26:07,000 Speaker 5: percent in a year, sometimes you'll be down ten percent 496 00:26:07,040 --> 00:26:09,159 Speaker 5: in a year, and the average might be eight to 497 00:26:09,200 --> 00:26:11,920 Speaker 5: twelve percent, and that'd be pretty good. But you don't 498 00:26:11,960 --> 00:26:14,399 Speaker 5: expect the average to be true of every single case. 499 00:26:15,160 --> 00:26:18,439 Speaker 5: And this plays out in many, many contexts where usually 500 00:26:18,480 --> 00:26:21,119 Speaker 5: consistency is a sign that we have great understanding of 501 00:26:21,119 --> 00:26:22,640 Speaker 5: how things work. We can do it the same way 502 00:26:22,680 --> 00:26:25,760 Speaker 5: every single time. We want things to be reliable, but 503 00:26:26,840 --> 00:26:30,000 Speaker 5: the tendency to have things be too consistent can be 504 00:26:30,040 --> 00:26:33,320 Speaker 5: so appealing to us that we don't realize when noise 505 00:26:33,320 --> 00:26:36,000 Speaker 5: should be present. This is common in a lot of 506 00:26:36,040 --> 00:26:38,800 Speaker 5: science fraud as well, that you find results that are 507 00:26:38,800 --> 00:26:41,919 Speaker 5: just too consistent to be believable, but the people who 508 00:26:41,960 --> 00:26:43,879 Speaker 5: are making it up don't realize that they need to 509 00:26:43,880 --> 00:26:44,760 Speaker 5: make up noise too. 510 00:27:00,359 --> 00:27:02,919 Speaker 1: So a lot of the reasons that we fall for 511 00:27:03,040 --> 00:27:06,840 Speaker 1: hoaxes or scams is because of cognitive shortcuts that we're 512 00:27:06,880 --> 00:27:09,919 Speaker 1: taking. So tell us about that and what we can 513 00:27:09,960 --> 00:27:11,000 Speaker 1: do about those shortcuts. 514 00:27:12,000 --> 00:27:14,760 Speaker 4: Well, one of the most important shortcuts, I think, is, 515 00:27:15,160 --> 00:27:18,000 Speaker 4: it's not even so much a shortcut, it's just our 516 00:27:18,119 --> 00:27:22,680 Speaker 4: standard operating procedure. We are very good at paying 517 00:27:22,680 --> 00:27:23,440 Speaker 4: attention to things. 518 00:27:23,520 --> 00:27:23,679 Speaker 3: You know. 519 00:27:23,720 --> 00:27:26,320 Speaker 4: Attention is a wonderful thing. We can do things with 520 00:27:26,359 --> 00:27:28,560 Speaker 4: attention that we can't do without attention. Like we couldn't 521 00:27:28,560 --> 00:27:31,040 Speaker 4: even follow a soccer game or a football game or 522 00:27:31,040 --> 00:27:33,119 Speaker 4: a basketball game without attention, and otherwise it would just be 523 00:27:33,119 --> 00:27:35,120 Speaker 4: a big blur of bodies moving around and the little 524 00:27:35,200 --> 00:27:37,640 Speaker 4: round thing, like, you know, flying back and forth occasionally. 525 00:27:37,920 --> 00:27:40,240 Speaker 4: We'd have no hope of understanding it.
But with attention 526 00:27:40,320 --> 00:27:42,440 Speaker 4: we can focus on selected aspects of it and sort 527 00:27:42,440 --> 00:27:44,399 Speaker 4: of put together the plot and the sequence of events 528 00:27:44,440 --> 00:27:46,480 Speaker 4: and what people are trying to do, and understand the 529 00:27:46,480 --> 00:27:48,560 Speaker 4: intentions behind it, all the way up to the strategies 530 00:27:48,600 --> 00:27:48,959 Speaker 4: and so on. 531 00:27:49,040 --> 00:27:49,760 Speaker 3: It's great. 532 00:27:50,080 --> 00:27:52,920 Speaker 4: However, the downside of attention is that when we're paying 533 00:27:52,920 --> 00:27:56,679 Speaker 4: attention to something, we may not notice other things that 534 00:27:56,720 --> 00:28:01,919 Speaker 4: we're not paying attention to. And a fraudster knows that, 535 00:28:02,400 --> 00:28:04,119 Speaker 4: and they know that if they can get our attention 536 00:28:04,320 --> 00:28:07,240 Speaker 4: on one thing, kind of like a magician, then we 537 00:28:07,359 --> 00:28:09,639 Speaker 4: might not notice other important things that are happening. And 538 00:28:09,640 --> 00:28:12,520 Speaker 4: of course they're not doing magic, they're actually trying to 539 00:28:12,560 --> 00:28:13,800 Speaker 4: deceive us for profit. 540 00:28:14,240 --> 00:28:15,800 Speaker 3: So many of 541 00:28:15,760 --> 00:28:21,480 Speaker 4: the basic sort of deceptions in areas like marketing, where 542 00:28:21,480 --> 00:28:23,800 Speaker 4: it's sort of not even deception in some cases, it's 543 00:28:23,800 --> 00:28:25,600 Speaker 4: just kind of like the way business is done. You 544 00:28:25,720 --> 00:28:30,040 Speaker 4: get the recipient to focus on what you're showing them, 545 00:28:30,400 --> 00:28:32,640 Speaker 4: and you can count on them usually to not ask 546 00:28:32,720 --> 00:28:35,040 Speaker 4: questions about what you're not showing them. So, for example, 547 00:28:35,080 --> 00:28:39,280 Speaker 4: like a product demo video, like this startup company called Nikola, 548 00:28:39,320 --> 00:28:43,920 Speaker 4: which is still around, trying to build electric vehicles, trucks 549 00:28:43,920 --> 00:28:46,840 Speaker 4: in this case. They created a demo video of one 550 00:28:46,840 --> 00:28:50,360 Speaker 4: of their trucks tooling down a highway, looking like it was going 551 00:28:50,360 --> 00:28:52,680 Speaker 4: at a nice rate of speed, with impressive music behind 552 00:28:52,720 --> 00:28:56,200 Speaker 4: it and so on, counting on people not to realize, 553 00:28:56,240 --> 00:28:59,080 Speaker 4: not to think, well, wait a minute, what happened before 554 00:28:59,120 --> 00:29:01,840 Speaker 4: the demo started? What was the angle that the camera 555 00:29:02,000 --> 00:29:04,240 Speaker 4: was at? Actually the camera was tilted a little bit, 556 00:29:04,280 --> 00:29:07,120 Speaker 4: so what appeared to be rolling along 557 00:29:07,160 --> 00:29:09,600 Speaker 4: a flat surface was actually rolling down a hill. So the 558 00:29:09,640 --> 00:29:12,080 Speaker 4: thing actually had no functioning motor, you know, and so on. 559 00:29:12,160 --> 00:29:14,520 Speaker 4: It just rolled down a hill slowly, and then the 560 00:29:14,560 --> 00:29:17,200 Speaker 4: positioning of the camera and the video cutting, you know, 561 00:29:17,200 --> 00:29:18,880 Speaker 4: did the rest of the work. And those are things 562 00:29:18,920 --> 00:29:20,880 Speaker 4: we just don't think about.
Right? We're focusing on the truck. 563 00:29:20,920 --> 00:29:23,400 Speaker 4: It looks nice, it's moving, nice background, and so on, 564 00:29:23,960 --> 00:29:26,320 Speaker 4: and we don't ask what's missing. Like, what information are 565 00:29:26,320 --> 00:29:28,160 Speaker 4: we missing about what's here? What information are they not 566 00:29:28,200 --> 00:29:29,920 Speaker 4: providing to us? Are they telling us about all the 567 00:29:29,920 --> 00:29:31,640 Speaker 4: times they tried to make the vehicle work but it 568 00:29:31,720 --> 00:29:33,520 Speaker 4: just didn't work at all, and just showing us the 569 00:29:33,520 --> 00:29:38,120 Speaker 4: one time that it did? So attention focus is really useful, 570 00:29:38,480 --> 00:29:41,600 Speaker 4: but it creates, you know, it sort of creates a loophole. 571 00:29:41,920 --> 00:29:45,040 Speaker 4: It creates a way for other people to, you know, 572 00:29:45,120 --> 00:29:46,800 Speaker 4: to exploit that. 573 00:29:46,840 --> 00:29:50,360 Speaker 1: Well, one cognitive shortcut that you mentioned in the book is prediction. 574 00:29:50,600 --> 00:29:53,560 Speaker 1: So how is it that we become victims of our 575 00:29:53,560 --> 00:29:54,800 Speaker 1: own life experience? 576 00:29:56,440 --> 00:29:58,560 Speaker 5: Yeah, and I think that's a great way of phrasing it, 577 00:29:58,600 --> 00:30:01,120 Speaker 5: that it's our life experience. It makes sense for us 578 00:30:01,120 --> 00:30:03,560 Speaker 5: to have expectations based on our past experience and to 579 00:30:03,640 --> 00:30:06,400 Speaker 5: use those predictions. And the vast majority of the time 580 00:30:07,200 --> 00:30:09,040 Speaker 5: we can use our past behavior to predict what's going 581 00:30:09,080 --> 00:30:10,920 Speaker 5: to happen in the future, right? That's a really 582 00:30:10,920 --> 00:30:13,880 Speaker 5: important thing to be able to do. The challenge comes 583 00:30:13,920 --> 00:30:18,560 Speaker 5: in that we don't tend to question enough the information 584 00:30:18,600 --> 00:30:22,040 Speaker 5: we get when it's perfectly consistent with what we predicted. So, 585 00:30:22,760 --> 00:30:24,480 Speaker 5: and this is something that I think is really interesting 586 00:30:24,480 --> 00:30:27,520 Speaker 5: in the context of scientific errors. So let's say you 587 00:30:27,640 --> 00:30:30,840 Speaker 5: run an experiment and you've got an experimental group and 588 00:30:30,840 --> 00:30:32,640 Speaker 5: a placebo group, and you want to see which one 589 00:30:32,680 --> 00:30:35,760 Speaker 5: does better, right? And you're predicting your new experimental intervention 590 00:30:35,840 --> 00:30:38,040 Speaker 5: is going to do great. And let's say that you 591 00:30:38,200 --> 00:30:41,600 Speaker 5: find that the placebo condition actually does better than the 592 00:30:41,600 --> 00:30:45,240 Speaker 5: experimental condition. Well, you're going to really dig into those results. 593 00:30:45,280 --> 00:30:46,680 Speaker 5: You're going to dig into the data. You're going to 594 00:30:46,680 --> 00:30:48,200 Speaker 5: look at your code. You're going to make sure that 595 00:30:48,240 --> 00:30:51,240 Speaker 5: everything was coded correctly, that there weren't any data points 596 00:30:51,240 --> 00:30:54,280 Speaker 5: that didn't make sense.
You're going to make sure you didn't 597 00:30:54,280 --> 00:30:56,120 Speaker 5: swap the names of the conditions so that you got 598 00:30:56,160 --> 00:30:58,400 Speaker 5: it wrong. You're going to look into it pretty carefully 599 00:30:58,440 --> 00:31:00,400 Speaker 5: because it didn't match what you were predicting. 600 00:31:01,280 --> 00:31:02,080 Speaker 3: Had it come out 601 00:31:01,960 --> 00:31:04,160 Speaker 5: exactly the way you predicted, you might not dig as 602 00:31:04,200 --> 00:31:07,280 Speaker 5: closely, and that's been something that's led to a lot 603 00:31:07,280 --> 00:31:10,320 Speaker 5: of errors. Right, So you have a spreadsheet that produces 604 00:31:10,320 --> 00:31:12,000 Speaker 5: the right results, and you don't double check to make 605 00:31:12,040 --> 00:31:15,440 Speaker 5: sure you didn't fill down the column incorrectly because it 606 00:31:15,480 --> 00:31:18,280 Speaker 5: matched what you were predicting. So that sort of error 607 00:31:19,000 --> 00:31:21,800 Speaker 5: is I think a really common one. We're really good 608 00:31:21,920 --> 00:31:24,600 Speaker 5: at applying our critical faculties when we see something we 609 00:31:24,640 --> 00:31:27,040 Speaker 5: don't like, that we didn't expect, that we didn't predict. 610 00:31:27,480 --> 00:31:30,200 Speaker 5: Somebody shares something on social media that was counter to 611 00:31:30,240 --> 00:31:32,920 Speaker 5: your views, you can rip into that, and we're all 612 00:31:32,960 --> 00:31:35,640 Speaker 5: pretty good at doing that. But when it perfectly matches, 613 00:31:35,840 --> 00:31:38,080 Speaker 5: we're much more likely to just quickly pass it along 614 00:31:38,120 --> 00:31:41,840 Speaker 5: and retweet it and not necessarily think through carefully, is 615 00:31:41,840 --> 00:31:43,280 Speaker 5: it really true? 616 00:31:43,760 --> 00:31:45,800 Speaker 1: Indeed, when we see things that are familiar and matching 617 00:31:45,800 --> 00:31:48,120 Speaker 1: our expectations, we don't look further into it. So how 618 00:31:48,160 --> 00:31:51,240 Speaker 1: do we work around that bias? 619 00:31:51,960 --> 00:31:54,520 Speaker 4: Well, I would say obviously the first thing is to 620 00:31:54,520 --> 00:31:57,080 Speaker 4: be aware that we're doing this. That's the first step. 621 00:31:57,440 --> 00:32:03,120 Speaker 4: Second step is to, and again like not every moment 622 00:32:03,120 --> 00:32:05,680 Speaker 4: of every day, but when you're making a big decision, 623 00:32:05,840 --> 00:32:07,600 Speaker 4: or when you think that the stakes are high, or 624 00:32:07,600 --> 00:32:11,440 Speaker 4: when someone might be trying to deceive you, ask consciously, 625 00:32:11,480 --> 00:32:16,120 Speaker 4: explicitly whether you predicted what just happened, and if you 626 00:32:16,200 --> 00:32:20,000 Speaker 4: did predict it, then actually check it out as well. 627 00:32:20,080 --> 00:32:21,640 Speaker 4: I think a lot of times we don't even sort 628 00:32:21,680 --> 00:32:24,959 Speaker 4: of stop to wonder whether this is coming out exactly 629 00:32:24,960 --> 00:32:27,120 Speaker 4: the way I predicted it, because you know, things rarely 630 00:32:27,160 --> 00:32:29,800 Speaker 4: happen exactly the way we predict, you know, especially in 631 00:32:29,800 --> 00:32:32,440 Speaker 4: an environment we don't have a lot of experience with before.
632 00:32:32,440 --> 00:32:34,640 Speaker 4: When we're doing a new experiment, testing a new theory, 633 00:32:34,920 --> 00:32:37,360 Speaker 4: should it really come out exactly like we predict I 634 00:32:37,360 --> 00:32:40,200 Speaker 4: mean maybe if we're the big best scientist ever, you know, 635 00:32:41,240 --> 00:32:43,960 Speaker 4: but often it doesn't go that way. So we should 636 00:32:44,000 --> 00:32:46,000 Speaker 4: be vigilant at those points also to see like, is 637 00:32:46,040 --> 00:32:47,800 Speaker 4: our code right, did we you know, did we make 638 00:32:47,800 --> 00:32:50,320 Speaker 4: a mistake or something like that. The example that Dan 639 00:32:50,600 --> 00:32:53,120 Speaker 4: gave about switching the columns, you know, or switching the 640 00:32:53,200 --> 00:32:56,000 Speaker 4: variable names or something like that is actually exactly what 641 00:32:56,080 --> 00:32:59,720 Speaker 4: happened in a you know, a fairly recently uncovered case 642 00:32:59,760 --> 00:33:03,080 Speaker 4: where where the data totally did not support the claim 643 00:33:03,120 --> 00:33:06,120 Speaker 4: that was being made. This was a study of the 644 00:33:06,200 --> 00:33:10,040 Speaker 4: idea that signing a declaration at the top versus at 645 00:33:10,080 --> 00:33:12,440 Speaker 4: the bottom would make you more honest in what you 646 00:33:12,520 --> 00:33:14,440 Speaker 4: declared on that form. So, in this case it was 647 00:33:14,880 --> 00:33:18,280 Speaker 4: an automobile insurance company. They were asking people to report 648 00:33:18,280 --> 00:33:20,320 Speaker 4: how many miles they had driven their vehicles in the 649 00:33:20,360 --> 00:33:24,160 Speaker 4: previous year. And the test was sign at the top 650 00:33:24,160 --> 00:33:26,160 Speaker 4: saying you're going to be honest in you're reporting, versus 651 00:33:26,160 --> 00:33:27,880 Speaker 4: sign at the bottom saying you've been honest in what 652 00:33:27,920 --> 00:33:30,680 Speaker 4: you reported. The idea was like, signing first would draw 653 00:33:30,680 --> 00:33:33,080 Speaker 4: your attention to honesty, and you'd you know, produce more 654 00:33:34,000 --> 00:33:36,440 Speaker 4: you know, more accurate, more honest results in that case. 655 00:33:36,760 --> 00:33:37,760 Speaker 3: And when. 656 00:33:39,000 --> 00:33:42,120 Speaker 4: When this experiment was done and the data file was 657 00:33:42,160 --> 00:33:45,840 Speaker 4: being looked at by some of the researchers, initially it 658 00:33:45,880 --> 00:33:47,480 Speaker 4: seemed like there was no effect at all, or the 659 00:33:47,480 --> 00:33:49,600 Speaker 4: effect was even the opposite of what they had expected. 660 00:33:50,040 --> 00:33:52,480 Speaker 4: But then one of them said, oh, well, accidentally switched 661 00:33:52,800 --> 00:33:56,360 Speaker 4: the columns, you know. So once the columns were switched back, 662 00:33:56,400 --> 00:33:58,880 Speaker 4: then the effect, you know, turned out to be right 663 00:33:58,960 --> 00:34:01,040 Speaker 4: basically exactly as Dan said, you know, you switched sort 664 00:34:01,040 --> 00:34:03,680 Speaker 4: of the you know, the treatment and the placebo in 665 00:34:03,680 --> 00:34:07,280 Speaker 4: this case, the sign first and sign and sign later columns. Well, 666 00:34:07,280 --> 00:34:09,520 Speaker 4: it turned out in retrospect that the entire data set 667 00:34:09,600 --> 00:34:13,759 Speaker 4: was fraudulent. 
But once they got the result that you know, 668 00:34:13,800 --> 00:34:15,960 Speaker 4: that fit the theory, or fit the prediction, or fit 669 00:34:16,000 --> 00:34:19,399 Speaker 4: the expectations, then apparently they stopped looking to see does 670 00:34:19,400 --> 00:34:21,040 Speaker 4: the rest of the data make sense? Are there any 671 00:34:21,040 --> 00:34:22,319 Speaker 4: obvious red flags in there? 672 00:34:22,360 --> 00:34:22,839 Speaker 3: And so on. 673 00:34:23,520 --> 00:34:25,759 Speaker 4: I think a perfect example of at least, you know, 674 00:34:25,840 --> 00:34:30,080 Speaker 4: some authors of that paper being satisfied that their, you know, 675 00:34:30,120 --> 00:34:33,080 Speaker 4: theory had been confirmed and not looking deeply enough. 676 00:34:33,440 --> 00:34:35,920 Speaker 4: Of course, you know, researchers are taught to look at 677 00:34:35,920 --> 00:34:37,920 Speaker 4: the distributions of their variables, look at all of these 678 00:34:38,040 --> 00:34:40,600 Speaker 4: kinds of stuff and so on, before getting too excited 679 00:34:40,640 --> 00:34:43,359 Speaker 4: about just you know, confirming their hypothesis. But sometimes that's 680 00:34:43,360 --> 00:34:47,359 Speaker 4: hard even for experienced scientists to do. So in our 681 00:34:47,400 --> 00:34:51,120 Speaker 4: own everyday life, we should be more aware of when 682 00:34:51,120 --> 00:34:54,120 Speaker 4: our expectations are being sort of exquisitely satisfied. That could 683 00:34:54,160 --> 00:34:58,000 Speaker 4: be someone deliberately designing something to you know, to take 684 00:34:58,040 --> 00:34:58,680 Speaker 4: advantage of 685 00:34:58,680 --> 00:35:02,200 Speaker 5: us. Say, in a much more mundane case, you know, 686 00:35:02,280 --> 00:35:05,120 Speaker 5: before you repost something or share it on social media, 687 00:35:05,320 --> 00:35:06,760 Speaker 5: just ask yourself, is it really true? 688 00:35:07,280 --> 00:35:07,440 Speaker 3: Right? 689 00:35:07,440 --> 00:35:08,839 Speaker 5: And what would I need to know to be sure 690 00:35:08,880 --> 00:35:11,239 Speaker 5: that it was really true? And that's something you can do, 691 00:35:11,280 --> 00:35:13,719 Speaker 5: whether or not you agree with it, and it just 692 00:35:13,800 --> 00:35:16,640 Speaker 5: takes a second. But once you ask that question, you 693 00:35:16,719 --> 00:35:18,719 Speaker 5: might realize, I have no idea how I'd know if 694 00:35:18,719 --> 00:35:20,400 Speaker 5: that were actually true. You know, I'd have to do 695 00:35:20,400 --> 00:35:22,000 Speaker 5: a lot of digging. And then, you know, maybe just 696 00:35:22,000 --> 00:35:23,880 Speaker 5: don't reshare things that you haven't been able to verify. 697 00:35:24,640 --> 00:35:27,640 Speaker 5: That might actually help prevent the spread of misinformation. 698 00:35:28,320 --> 00:35:30,279 Speaker 4: I think most people really do want to only share 699 00:35:30,320 --> 00:35:32,279 Speaker 4: true stuff. I don't think people deliberately want to spread 700 00:35:32,320 --> 00:35:33,920 Speaker 4: false information a lot of the time. I think they're just 701 00:35:33,960 --> 00:35:35,920 Speaker 4: not often thinking about whether it might be false; 702 00:35:36,360 --> 00:35:38,240 Speaker 4: they're being swept along by other cues.
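To make the kind of double-checking described above concrete, here is a minimal, purely illustrative sketch (not from the episode or the book; the data file and column names are hypothetical) of running the same basic sanity checks on an experiment's data whether or not the result matches the prediction:

# Illustrative sketch: run the same sanity checks whether or not the result
# matches what you predicted. The file and column names are hypothetical.
import pandas as pd

def sanity_check(df: pd.DataFrame) -> None:
    # Catch swapped or mislabeled condition labels before interpreting anything.
    expected_labels = {"treatment", "placebo"}
    assert set(df["condition"].unique()) == expected_labels, "unexpected condition labels"

    # Per-group counts and summary statistics: a fill-down error or a swapped
    # label usually shows up here as an implausible group size or mean.
    print(df.groupby("condition")["outcome"].describe())

    # Flag missing or out-of-range values instead of assuming the data are clean.
    suspicious = df[df["outcome"].isna() | (df["outcome"] < 0)]
    if not suspicious.empty:
        print(f"{len(suspicious)} suspicious rows:")
        print(suspicious)

df = pd.read_csv("experiment.csv")  # hypothetical data file
sanity_check(df)

The checks themselves are cheap; as the conversation notes, the hard part is remembering to run them when the answer already looks the way you expected.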
703 00:35:38,280 --> 00:35:41,000 Speaker 1: Besides that, I wanted to jump back to the science, 704 00:35:41,680 --> 00:35:43,759 Speaker 1: the practice of science, for just a second, which is, 705 00:35:44,040 --> 00:35:46,560 Speaker 1: I was just talking to some colleagues who got a big 706 00:35:46,640 --> 00:35:49,560 Speaker 1: data set and they wanted to prove something in particular 707 00:35:49,600 --> 00:35:51,600 Speaker 1: about it. They got this big thing of police records 708 00:35:51,600 --> 00:35:54,719 Speaker 1: and they had a particular thing that they wanted to demonstrate, 709 00:35:55,440 --> 00:35:59,799 Speaker 1: and they analyzed it and couldn't find evidence for their hypothesis, 710 00:36:00,040 --> 00:36:02,239 Speaker 1: figured out another way to look at it statistically, and 711 00:36:02,320 --> 00:36:05,080 Speaker 1: then another way, and they still couldn't find it, and finally 712 00:36:05,200 --> 00:36:09,040 Speaker 1: came up with some way, some statistical trick, and they 713 00:36:09,040 --> 00:36:13,000 Speaker 1: were very proud, they said, to have found finally this 714 00:36:13,160 --> 00:36:15,520 Speaker 1: evidence of this bias that they were looking for in 715 00:36:15,560 --> 00:36:18,279 Speaker 1: these police records. But it made me wonder about it, 716 00:36:18,320 --> 00:36:21,279 Speaker 1: because they clearly went in to find this thing. And 717 00:36:21,320 --> 00:36:25,319 Speaker 1: the question is did they do the right thing by 718 00:36:25,760 --> 00:36:29,160 Speaker 1: continuing to search and search and search with different statistical methods, 719 00:36:29,360 --> 00:36:32,359 Speaker 1: or is it purely that they were trying to make 720 00:36:32,400 --> 00:36:35,040 Speaker 1: the duck quack in a particular way. How do you 721 00:36:35,080 --> 00:36:37,000 Speaker 1: think about these issues? 722 00:36:37,800 --> 00:36:40,239 Speaker 5: It's a really complicated problem because, of course, you want 723 00:36:40,280 --> 00:36:43,280 Speaker 5: to be able to explore your data, right? The problem 724 00:36:43,360 --> 00:36:45,719 Speaker 5: comes when you don't think about all of the alternative 725 00:36:45,719 --> 00:36:47,520 Speaker 5: paths you could take to get to the outcome that 726 00:36:47,560 --> 00:36:50,560 Speaker 5: you want to report, right? And Andrew Gelman refers to 727 00:36:50,560 --> 00:36:52,640 Speaker 5: this as the garden of forking paths. And I think 728 00:36:52,640 --> 00:36:55,560 Speaker 5: it doesn't imply any sort of malicious intent or intent 729 00:36:55,640 --> 00:36:58,000 Speaker 5: to deceive at all. But we make lots of choices 730 00:36:58,040 --> 00:37:01,600 Speaker 5: along the way that can influence the result, and sometimes 731 00:37:01,600 --> 00:37:04,560 Speaker 5: we don't even think about what those choices were. So 732 00:37:04,640 --> 00:37:07,239 Speaker 5: I think the problem comes not in exploring your data 733 00:37:07,239 --> 00:37:10,400 Speaker 5: really fully. It comes in only reporting the thing that worked, 734 00:37:10,800 --> 00:37:14,120 Speaker 5: the one example, the one analysis that was successful. And 735 00:37:14,160 --> 00:37:15,960 Speaker 5: what you really want to know is, hey, is this 736 00:37:16,040 --> 00:37:19,120 Speaker 5: hypothesis robust to a whole bunch of different ways of 737 00:37:19,160 --> 00:37:21,640 Speaker 5: testing it? And it sounds like in that particular case 738 00:37:21,680 --> 00:37:23,799 Speaker 5: it wasn't at all, right?
All of the other ways 739 00:37:23,840 --> 00:37:25,480 Speaker 5: you look at it, you don't find anything. You only 740 00:37:25,480 --> 00:37:28,680 Speaker 5: find it if you look in this one particular way. Well, 741 00:37:29,160 --> 00:37:31,160 Speaker 5: that would be an important thing to know, right, That 742 00:37:31,239 --> 00:37:33,799 Speaker 5: would be important for the science in the field to 743 00:37:33,920 --> 00:37:37,080 Speaker 5: know that this only works in this one study if 744 00:37:37,080 --> 00:37:39,080 Speaker 5: you measure it this way, and if you fish around enough, 745 00:37:39,080 --> 00:37:42,200 Speaker 5: you'll find something that could be consistent, which means that 746 00:37:42,200 --> 00:37:44,040 Speaker 5: maybe we shouldn't trust that a whole lot until we 747 00:37:44,040 --> 00:37:46,839 Speaker 5: can replicate with that particular way of analyzing the data 748 00:37:46,840 --> 00:37:49,839 Speaker 5: and see if that holds up consistently. We should also 749 00:37:49,880 --> 00:37:51,560 Speaker 5: check to make sure that it holds up for real 750 00:37:51,600 --> 00:37:54,080 Speaker 5: reasons as opposed to just something odd about how you've 751 00:37:54,080 --> 00:37:57,760 Speaker 5: constructed the measure. Right, it might be that it's completely 752 00:37:57,800 --> 00:38:00,359 Speaker 5: reliable when you measure it that way, but it's sort 753 00:38:00,360 --> 00:38:03,560 Speaker 5: of an artifact of the structure of data of that sort. 754 00:38:04,160 --> 00:38:06,279 Speaker 5: So I think the most powerful way to do that 755 00:38:06,320 --> 00:38:07,920 Speaker 5: to say, hey, I want to make a claim that 756 00:38:07,960 --> 00:38:11,400 Speaker 5: we've discovered some relationship with some bias. Well, if you 757 00:38:11,400 --> 00:38:14,200 Speaker 5: want to claim that it's a general truth about how 758 00:38:14,200 --> 00:38:16,439 Speaker 5: the world works, you want to be able to show 759 00:38:16,440 --> 00:38:18,279 Speaker 5: that it works under a range of different ways of 760 00:38:18,320 --> 00:38:20,680 Speaker 5: measuring it and under a range of different conditions, not 761 00:38:20,800 --> 00:38:23,880 Speaker 5: just the one special one that you identified. And I 762 00:38:23,880 --> 00:38:25,320 Speaker 5: think this has been an issue in our field for 763 00:38:25,360 --> 00:38:28,040 Speaker 5: a long time, is that you know, there's obviously a 764 00:38:28,080 --> 00:38:30,120 Speaker 5: goal to try and support the theories that you're working under. 765 00:38:30,400 --> 00:38:32,719 Speaker 5: That's a natural thing to be doing, and it's not 766 00:38:32,760 --> 00:38:34,120 Speaker 5: necessarily a terrible. 767 00:38:33,800 --> 00:38:34,399 Speaker 3: Thing to be doing. 768 00:38:35,120 --> 00:38:39,239 Speaker 5: But we haven't been completely straightforward as a field in 769 00:38:39,760 --> 00:38:43,239 Speaker 5: reporting all the things we've tried, and depending on the 770 00:38:43,320 --> 00:38:46,000 Speaker 5: kind of approach you're taking, that can be really misleading. 771 00:38:46,040 --> 00:38:47,960 Speaker 5: If you don't report everything the way it was done, 772 00:38:48,000 --> 00:38:50,640 Speaker 5: it's cherry picking in a sense. 
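One way to make "robust to a whole bunch of different ways of testing it" concrete is a small specification sweep: run every reasonable variant of the analysis and report all of them, not just the one that worked. A minimal sketch, with hypothetical data, outcome measures, and subset rules:

# Sketch of a specification sweep: run all reasonable analysis variants and
# report every result, not only the one that supports the hypothesis.
# The data file, measures, and subset rules are hypothetical placeholders.
from itertools import product
import pandas as pd
from scipy import stats

df = pd.read_csv("records.csv")  # hypothetical columns: group, year, outcome_raw, outcome_log

measures = ["outcome_raw", "outcome_log"]              # different ways of measuring the outcome
subsets = {"all": lambda d: d,
           "recent": lambda d: d[d["year"] >= 2015]}   # different inclusion rules

results = []
for measure, (label, subset) in product(measures, subsets.items()):
    d = subset(df)
    a = d.loc[d["group"] == "A", measure]
    b = d.loc[d["group"] == "B", measure]
    t, p = stats.ttest_ind(a, b, equal_var=False)      # Welch's t-test for each variant
    results.append({"measure": measure, "subset": label, "t": round(t, 3), "p": round(p, 4)})

# Reporting the whole table shows whether the effect holds up broadly or only
# under one particular combination of analytic choices.
print(pd.DataFrame(results))

Reporting the whole grid, rather than the single cell that came out the way you hoped, is the opposite of the cherry-picking being described here.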
You're taking out the 773 00:38:50,640 --> 00:38:53,200 Speaker 5: results that you wanted and ignoring all the ones that 774 00:38:53,239 --> 00:38:57,080 Speaker 5: didn't work, and you're left with the reader only having 775 00:38:57,120 --> 00:38:59,560 Speaker 5: in the paper that you read about it, focusing on 776 00:38:59,600 --> 00:39:01,400 Speaker 5: the information that you've told them. And it's just like 777 00:39:01,440 --> 00:39:04,239 Speaker 5: a magician who's directed you to the thing they want 778 00:39:04,280 --> 00:39:07,160 Speaker 5: you to see and hidden all of the secret methods. 779 00:39:06,840 --> 00:39:07,719 Speaker 3: That get you there. 780 00:39:08,840 --> 00:39:11,040 Speaker 4: Yeah, you, in a way, as the researcher who did that, 781 00:39:11,040 --> 00:39:13,440 Speaker 4: have inadvertently become the con artist, although you had no 782 00:39:13,480 --> 00:39:15,880 Speaker 4: intention to do it, and you're not actually trying to deceive, 783 00:39:15,880 --> 00:39:18,080 Speaker 4: but you're accidentally sort of using some of those very 784 00:39:18,080 --> 00:39:22,239 Speaker 4: same techniques that people could use, you know, to do 785 00:39:22,320 --> 00:39:25,160 Speaker 4: worse things than publish a paper that didn't you know, 786 00:39:25,160 --> 00:39:27,920 Speaker 4: that didn't actually have good evidence for its conclusions. 787 00:39:28,560 --> 00:39:30,680 Speaker 1: Yeah, it strikes me that one of the things as 788 00:39:30,680 --> 00:39:34,719 Speaker 1: I was reading this excellent book, a lot of this 789 00:39:34,880 --> 00:39:37,520 Speaker 1: just has to do with taking the tools of good 790 00:39:37,719 --> 00:39:41,239 Speaker 1: science to the way that we interpret the world around us. 791 00:39:41,320 --> 00:39:45,040 Speaker 1: So the things about asking more questions and digging deeper 792 00:39:45,080 --> 00:39:48,439 Speaker 1: and so on. But it's interesting that even scientists don't 793 00:39:48,440 --> 00:39:49,840 Speaker 1: always do good science. 794 00:39:50,840 --> 00:39:53,880 Speaker 5: Yeah, we're all capable of being fooled, right in. Scientists 795 00:39:54,160 --> 00:39:57,359 Speaker 5: and maybe journalists are trained to dig more and ask 796 00:39:57,440 --> 00:39:59,840 Speaker 5: more questions and to think critically about what they're hearing. 797 00:40:00,640 --> 00:40:03,120 Speaker 5: But you know, we're human. We have the same sorts 798 00:40:03,160 --> 00:40:05,680 Speaker 5: of habits and ways of thinking. We tend to like 799 00:40:05,760 --> 00:40:08,600 Speaker 5: results that support what we predicted, and we're drawn to 800 00:40:08,640 --> 00:40:11,600 Speaker 5: the same kinds of information, and if somebody's looking to 801 00:40:11,640 --> 00:40:15,800 Speaker 5: sort of hide what they're doing, they can fool scientists. 802 00:40:15,840 --> 00:40:18,880 Speaker 5: And there are plenty of fraudulent papers out there that 803 00:40:19,560 --> 00:40:22,200 Speaker 5: got through peer review even though there were red flags. 804 00:40:22,239 --> 00:40:24,800 Speaker 5: And just like that sort of heist movie that you 805 00:40:24,960 --> 00:40:27,040 Speaker 5: watch from the outside and you see all of the 806 00:40:27,080 --> 00:40:28,839 Speaker 5: red flags along the way that people are falling for, 807 00:40:28,920 --> 00:40:30,839 Speaker 5: but you're not falling for them because you're watching them. 
808 00:40:31,480 --> 00:40:34,400 Speaker 5: In hindsight, they're all obvious, right, but in that moment, 809 00:40:34,600 --> 00:40:37,319 Speaker 5: you don't necessarily see them as red flags until you know, oh, wait, 810 00:40:37,360 --> 00:40:39,640 Speaker 5: that paper was fraudulent. I found out through other means. 811 00:40:40,239 --> 00:40:41,960 Speaker 5: Now I can see all the red flags that are there. 812 00:40:42,360 --> 00:40:45,440 Speaker 1: Yes, So as I was reading the book, the way 813 00:40:45,480 --> 00:40:48,000 Speaker 1: I was thinking about it was, you know, the brain 814 00:40:48,120 --> 00:40:51,200 Speaker 1: is of course locked in silence and darkness inside the skull, 815 00:40:51,239 --> 00:40:54,279 Speaker 1: and it's just trying to make an internal model of 816 00:40:54,320 --> 00:40:56,800 Speaker 1: the world out there, a mental model, and we're always 817 00:40:57,200 --> 00:41:01,240 Speaker 1: we're always very limited in what our internal models can detect, 818 00:41:01,360 --> 00:41:05,640 Speaker 1: can see. And so one of the most important things 819 00:41:05,680 --> 00:41:11,920 Speaker 1: to expanding our model is to ask questions. And in 820 00:41:11,960 --> 00:41:14,680 Speaker 1: a sense this is the same as paying attention to something. 821 00:41:14,880 --> 00:41:17,400 Speaker 1: We ask a question that forces us to attend to 822 00:41:17,480 --> 00:41:20,960 Speaker 1: some aspect and then that updates our model a little bit. 823 00:41:21,239 --> 00:41:25,160 Speaker 1: So in chapter four, you guys had an example of 824 00:41:25,200 --> 00:41:29,320 Speaker 1: a chess grandmaster who asks his students to always 825 00:41:29,360 --> 00:41:31,720 Speaker 1: ask three questions when they're looking at the board? 826 00:41:31,800 --> 00:41:33,160 Speaker 2: Can you tell us about that? 827 00:41:34,560 --> 00:41:36,160 Speaker 4: Well, I would love to tell you. I would love 828 00:41:36,200 --> 00:41:40,800 Speaker 4: to tell you about that. So I actually took a, 829 00:41:40,920 --> 00:41:43,760 Speaker 4: during COVID, I took a summer chess camp on Zoom 830 00:41:43,800 --> 00:41:48,000 Speaker 4: with this guy and it was me and like twelve 831 00:41:48,040 --> 00:41:51,080 Speaker 4: people aged ten and under, which was fun, you know, 832 00:41:51,120 --> 00:41:53,240 Speaker 4: which was fun again because who goes to summer 833 00:41:53,280 --> 00:41:55,680 Speaker 4: chess camps, right? It's like, you know, kids aged ten 834 00:41:55,719 --> 00:41:59,400 Speaker 4: and under. Very good players by the way. And you 835 00:41:59,400 --> 00:42:01,759 Speaker 4: know one thing that he, that the coach, would 836 00:42:01,760 --> 00:42:03,600 Speaker 4: often do. His name is Jacob Aagaard. He's one of 837 00:42:03,680 --> 00:42:05,480 Speaker 4: the most famous chess coaches in the world, and I 838 00:42:05,560 --> 00:42:07,239 Speaker 4: was privileged to be able to sort of be in 839 00:42:07,200 --> 00:42:08,320 Speaker 3: his camp for a few hours. 840 00:42:08,600 --> 00:42:11,200 Speaker 4: He would constantly say, we need more, you know, you 841 00:42:11,239 --> 00:42:13,280 Speaker 4: need to think of more moves. Right? In a chess position, 842 00:42:13,360 --> 00:42:15,399 Speaker 4: there's like thirty to forty moves you can typically play, 843 00:42:15,600 --> 00:42:18,160 Speaker 4: you know, with all your pieces, and people often 844 00:42:18,160 --> 00:42:21,160 Speaker 4: become too focused on one.
So he would say, think 845 00:42:21,200 --> 00:42:23,080 Speaker 4: of more candidate moves. Think of more moves you might 846 00:42:23,120 --> 00:42:25,000 Speaker 4: want to play and analyze. And if you're having trouble 847 00:42:25,000 --> 00:42:27,560 Speaker 4: thinking of them, he has specific questions that you can 848 00:42:27,640 --> 00:42:30,080 Speaker 4: use to try to generate ideas, and one of them 849 00:42:30,160 --> 00:42:33,399 Speaker 4: is what's your worst-placed piece? Maybe you should move 850 00:42:33,400 --> 00:42:35,080 Speaker 4: it if that's the one that's in the worst position. 851 00:42:35,440 --> 00:42:37,600 Speaker 4: Or what's the opponent's idea? Well, maybe you should come 852 00:42:37,640 --> 00:42:39,839 Speaker 4: up with a move that stops their idea. The third 853 00:42:39,880 --> 00:42:41,520 Speaker 4: one is what are the weaknesses? Maybe you should come 854 00:42:41,600 --> 00:42:43,239 Speaker 4: up with a move that attacks something that's weak. I 855 00:42:43,280 --> 00:42:45,440 Speaker 4: mean, this is, this will make sense to people who 856 00:42:45,520 --> 00:42:48,000 Speaker 4: play chess, but these are, they're in almost all fields. 857 00:42:48,000 --> 00:42:50,520 Speaker 4: There are sort of general kinds of things you can 858 00:42:50,520 --> 00:42:53,319 Speaker 4: look at and principles you can use to generate, you know, 859 00:42:53,360 --> 00:42:55,360 Speaker 4: to generate more information, and as you say, like you 860 00:42:55,840 --> 00:42:57,520 Speaker 4: said, I like your way of putting it, to sort of improve 861 00:42:57,600 --> 00:43:00,440 Speaker 4: your mental model of what's really going on, because in 862 00:43:00,520 --> 00:43:02,880 Speaker 4: order to, in order to play good chess moves, you 863 00:43:02,960 --> 00:43:05,000 Speaker 4: have to have a good internal model of what's going 864 00:43:05,040 --> 00:43:07,839 Speaker 4: on on the board someplace in your brain. You've got 865 00:43:07,880 --> 00:43:09,840 Speaker 4: to have it, and that's a way of sort of 866 00:43:09,880 --> 00:43:13,640 Speaker 4: generating more ideas, more analysis that then updates, that then 867 00:43:13,719 --> 00:43:14,520 Speaker 4: updates the model. 868 00:43:15,400 --> 00:43:17,040 Speaker 5: And most of the time the models that we have 869 00:43:17,160 --> 00:43:18,680 Speaker 5: for how the world works are pretty 870 00:43:18,400 --> 00:43:18,759 Speaker 3: good, right. 871 00:43:18,800 --> 00:43:22,359 Speaker 5: I mean, we're not, you know, constantly getting conned. We're 872 00:43:22,360 --> 00:43:25,200 Speaker 5: not, you know, we don't have trouble getting around, we 873 00:43:25,200 --> 00:43:28,520 Speaker 5: don't have trouble communicating with other people. Most of the time, 874 00:43:28,520 --> 00:43:31,680 Speaker 5: our models of how the world is working are great. 875 00:43:31,800 --> 00:43:34,560 Speaker 5: They work very effectively. And it's only in those cases 876 00:43:34,600 --> 00:43:36,520 Speaker 5: where we need to dig a little more to update 877 00:43:36,520 --> 00:43:39,320 Speaker 5: our model for the possibility we're being cheated or deceived 878 00:43:40,120 --> 00:43:42,879 Speaker 5: that we need to ask a lot more questions. Right, 879 00:43:43,280 --> 00:43:45,000 Speaker 5: most of the time, we've built up these models from 880 00:43:45,040 --> 00:43:48,279 Speaker 5: a ton of experience, and they generally do okay. Yeah.
881 00:43:48,080 --> 00:43:52,280 Speaker 1: And we have expectations about what we're looking for given 882 00:43:52,320 --> 00:43:55,840 Speaker 1: these models, such that most of the time we're filling 883 00:43:55,880 --> 00:43:58,839 Speaker 1: in the blanks. And that is at the heart of 884 00:43:58,880 --> 00:44:01,600 Speaker 1: all these hoaxes and scams that you talk about throughout 885 00:44:01,600 --> 00:44:03,840 Speaker 1: the book, is we're filling in the blanks, and the 886 00:44:03,920 --> 00:44:06,799 Speaker 1: things that aren't said, we assume we 887 00:44:06,719 --> 00:44:08,160 Speaker 2: know what they mean. 888 00:44:08,480 --> 00:44:11,760 Speaker 1: So I'm curious what you guys think about the Turing 889 00:44:11,880 --> 00:44:16,520 Speaker 1: test, and for the listeners, in case someone doesn't know: 890 00:44:16,719 --> 00:44:19,760 Speaker 1: the Turing test was proposed by Alan Turing to figure 891 00:44:19,760 --> 00:44:22,719 Speaker 1: out when a machine has become as smart as a human. 892 00:44:22,800 --> 00:44:26,239 Speaker 1: The idea is that you are the evaluator and you're 893 00:44:26,280 --> 00:44:28,760 Speaker 1: talking to a machine, and you're talking to a human, 894 00:44:29,160 --> 00:44:31,640 Speaker 1: let's say, by text, and you don't know which is which, 895 00:44:31,680 --> 00:44:33,640 Speaker 1: and the question is can you tell the difference between 896 00:44:33,640 --> 00:44:36,279 Speaker 1: the human and the machine. And the interesting part is, 897 00:44:36,360 --> 00:44:38,799 Speaker 1: because we bring so much to the table in any 898 00:44:38,840 --> 00:44:42,319 Speaker 1: conversation and we fill in the blanks, what do you 899 00:44:42,320 --> 00:44:45,040 Speaker 1: guys think? Is that a good, meaningful test or is 900 00:44:45,080 --> 00:44:46,120 Speaker 1: it flawed in that way? 901 00:44:46,920 --> 00:44:49,080 Speaker 4: I think we have a lot of evidence that it's 902 00:44:49,120 --> 00:44:53,560 Speaker 4: not so good from ChatGPT and large language models, 903 00:44:53,560 --> 00:44:56,640 Speaker 4: which I don't think are actually intelligent in the way 904 00:44:56,680 --> 00:44:59,080 Speaker 4: that we should. I mean, I realize there might be 905 00:44:59,080 --> 00:45:00,800 Speaker 4: some dispute about this. Some people make some sort of 906 00:45:00,840 --> 00:45:04,600 Speaker 4: extravagant claims about signs of general intelligence, but I don't 907 00:45:04,600 --> 00:45:06,920 Speaker 4: think they're actually intelligent, or at least not in a 908 00:45:07,040 --> 00:45:09,279 Speaker 4: useful way. And yet they are extremely convincing. I mean, 909 00:45:09,320 --> 00:45:11,360 Speaker 4: I think they show that you can sort of dissociate, 910 00:45:11,840 --> 00:45:15,239 Speaker 4: you know, producing what humans expect to see next, which 911 00:45:15,280 --> 00:45:18,680 Speaker 4: is basically, you know, basically what large language models do 912 00:45:18,760 --> 00:45:21,040 Speaker 4: because they've been trained to sort of, you know, output 913 00:45:21,120 --> 00:45:23,319 Speaker 4: the most probable next token or word and so on. 914 00:45:23,560 --> 00:45:27,480 Speaker 4: You can dissociate that capability from having sort of a 915 00:45:27,520 --> 00:45:30,359 Speaker 4: true understanding of what's going on. For example, you could 916 00:45:30,440 --> 00:45:32,600 Speaker 4: ask an LLM to play chess with you, and it 917 00:45:32,640 --> 00:45:34,279 Speaker 4: wouldn't do very well.
It would produce a lot of 918 00:45:34,280 --> 00:45:36,080 Speaker 4: stuff that sounds like chess and so on. But if 919 00:45:36,080 --> 00:45:38,239 Speaker 4: you really knew the game of chess, you would know 920 00:45:38,320 --> 00:45:40,279 Speaker 4: that this is sort of gibberish. If you don't know, 921 00:45:40,400 --> 00:45:43,040 Speaker 4: it sounds perfectly good, right? You just, you just sort 922 00:45:43,080 --> 00:45:45,040 Speaker 4: of, you fill it in with sort of the assumption 923 00:45:45,120 --> 00:45:47,840 Speaker 4: that this guy sounds like he knows what he's talking about, 924 00:45:48,200 --> 00:45:51,359 Speaker 4: you know, which is, which is not, not, you know, 925 00:45:51,440 --> 00:45:54,480 Speaker 4: not necessarily intelligence. 926 00:45:54,280 --> 00:45:59,759 Speaker 5: Things like ChatGPT, they speak with absolute confidence and certainty, right, 927 00:46:00,200 --> 00:46:03,280 Speaker 5: and that's regardless of whether or not they're generating true content. 928 00:46:03,920 --> 00:46:06,920 Speaker 5: You know, they're the consummate bullshitter, right, in that they 929 00:46:07,440 --> 00:46:10,600 Speaker 5: are equally confident when they're completely wrong and when they're 930 00:46:10,600 --> 00:46:14,239 Speaker 5: completely right. Because there's no grounding to any sort of 931 00:46:14,280 --> 00:46:16,719 Speaker 5: reality in the world. All they're doing is predicting what 932 00:46:16,760 --> 00:46:17,320 Speaker 5: comes next. 933 00:46:18,280 --> 00:46:20,319 Speaker 1: That's true, although I have to say one of the 934 00:46:20,360 --> 00:46:24,440 Speaker 1: things about ChatGPT that I've really come to appreciate 935 00:46:24,560 --> 00:46:28,600 Speaker 1: is that almost any question that you ask it, it'll say, look, 936 00:46:28,680 --> 00:46:31,960 Speaker 1: some people think this, some people think that, and in conclusion, 937 00:46:32,080 --> 00:46:35,120 Speaker 1: we need to balance these points of view. And at 938 00:46:35,160 --> 00:46:37,759 Speaker 1: first I found that really annoying, but I came to 939 00:46:37,920 --> 00:46:42,400 Speaker 1: understand and appreciate that it is trying, you know, because 940 00:46:42,440 --> 00:46:45,239 Speaker 1: of the reinforcement learning and so on, it's trying to 941 00:46:46,160 --> 00:46:49,680 Speaker 1: give different perspectives instead of sounding totally confident about just 942 00:46:49,760 --> 00:46:50,600 Speaker 1: a single answer. 943 00:46:52,320 --> 00:46:53,920 Speaker 4: You kind of wish politicians would talk to you that 944 00:46:53,960 --> 00:46:57,879 Speaker 4: way sometimes exactly instead of the way they do it. Yeah, 945 00:46:57,920 --> 00:47:01,879 Speaker 4: well, it's, it's, it's reasonable. I just don't think Chat 946 00:47:01,920 --> 00:47:04,600 Speaker 4: GPT believes that both of those things are equally, you know, 947 00:47:04,640 --> 00:47:06,440 Speaker 4: are equally likely to be true and so on in 948 00:47:06,480 --> 00:47:09,200 Speaker 4: any, in any meaningful way.
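As an aside on the "output the most probable next token or word" point, a toy next-word predictor makes it easy to see how fluent-sounding continuations can be produced with no notion of whether they are true. This is only an illustrative sketch, not how any real chatbot is implemented:

# Toy illustration of next-word prediction: the model only knows which word
# tends to follow which; it has no grounding in whether the output is true.
from collections import Counter, defaultdict

corpus = "the truck rolled down the hill the truck drove down the road".split()

follows = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    follows[w1][w2] += 1

def continue_text(word, n=5):
    out = [word]
    for _ in range(n):
        candidates = follows[out[-1]]
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])  # always pick the most probable next word
    return " ".join(out)

print(continue_text("the"))  # fluent-looking, but nothing here checks whether it is accurate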
I think, you know, it's 949 00:47:09,880 --> 00:47:13,280 Speaker 4: the part of the danger I think of of models 950 00:47:13,320 --> 00:47:16,200 Speaker 4: like this is if you don't understand how they work, 951 00:47:17,040 --> 00:47:19,000 Speaker 4: and if you just see this is a I like 952 00:47:19,040 --> 00:47:20,560 Speaker 4: a lot of times you see these posts on Twitter 953 00:47:20,640 --> 00:47:24,160 Speaker 4: that say and A n AI did this that's just 954 00:47:24,239 --> 00:47:26,839 Speaker 4: designed to impress you, right and to mislead you by 955 00:47:26,880 --> 00:47:29,840 Speaker 4: thinking that therefore the results must be created by genius 956 00:47:29,880 --> 00:47:32,479 Speaker 4: and totally accurate. But if you actually understand something about 957 00:47:32,480 --> 00:47:34,600 Speaker 4: how it works, then you will have the reaction you did. 958 00:47:34,640 --> 00:47:36,600 Speaker 3: You'll say, well, this is this is kind. 959 00:47:36,440 --> 00:47:38,319 Speaker 4: Of interesting that it's trying to give you know, sort 960 00:47:38,360 --> 00:47:41,799 Speaker 4: of too equally like probable you know, schools of thought here, 961 00:47:42,000 --> 00:47:44,480 Speaker 4: or you know common you know, views on the topic 962 00:47:44,600 --> 00:47:47,040 Speaker 4: or something like that, but you understand something about how 963 00:47:47,080 --> 00:47:49,520 Speaker 4: they work, so it doesn't you know, they their output 964 00:47:49,600 --> 00:47:51,799 Speaker 4: sort of is more sensible to you than it might 965 00:47:51,840 --> 00:47:53,160 Speaker 4: be to people who don't know. 966 00:48:08,239 --> 00:48:10,840 Speaker 1: I want to come back to just something you mentioned 967 00:48:10,840 --> 00:48:13,160 Speaker 1: about politicians. One of the things I was thinking about 968 00:48:13,239 --> 00:48:16,520 Speaker 1: as I was reading the book was politicians often get 969 00:48:16,560 --> 00:48:19,480 Speaker 1: points deducted if they change their mind on something. 970 00:48:19,520 --> 00:48:20,880 Speaker 2: They're called flip floppers. 971 00:48:21,520 --> 00:48:26,000 Speaker 1: And it's such a shame because we know that if 972 00:48:26,000 --> 00:48:30,040 Speaker 1: people are using scientific reasoning, they might reasonably change their 973 00:48:30,160 --> 00:48:32,840 Speaker 1: mind about some issue. 974 00:48:32,719 --> 00:48:33,560 Speaker 2: At some point. 975 00:48:34,080 --> 00:48:36,479 Speaker 1: The question is, how would you guys see a way 976 00:48:37,040 --> 00:48:40,520 Speaker 1: to change something about the way we educate the public 977 00:48:40,719 --> 00:48:44,240 Speaker 1: so that politicians who change their mind on a topic 978 00:48:44,680 --> 00:48:47,760 Speaker 1: are not considered flip floppers and deducted points. 979 00:48:48,560 --> 00:48:53,239 Speaker 4: Now, I guess I would say that this refers back 980 00:48:53,280 --> 00:48:56,960 Speaker 4: to our taste for consistency. So there's something appealing in 981 00:48:57,120 --> 00:49:00,360 Speaker 4: consistency of a wide variety and consistency and a person's 982 00:49:00,400 --> 00:49:03,919 Speaker 4: behavior is also appealing to us. And many times that's good. 983 00:49:04,000 --> 00:49:04,120 Speaker 3: Right. 984 00:49:04,160 --> 00:49:06,200 Speaker 4: You want someone who keeps their word. 
You want someone 985 00:49:06,239 --> 00:49:08,040 Speaker 4: who says what they are going to do and then 986 00:49:08,040 --> 00:49:09,600 Speaker 4: does what they said they would do. You want someone 987 00:49:09,600 --> 00:49:12,240 Speaker 4: who's always on time. Like those are generally positive things. 988 00:49:12,560 --> 00:49:16,240 Speaker 4: But when you're talking about complex subjects like what should 989 00:49:16,239 --> 00:49:19,000 Speaker 4: climate policy be? You know, what should tax policy be? 990 00:49:19,160 --> 00:49:22,000 Speaker 4: Any of these kinds of things and so on, facts 991 00:49:22,080 --> 00:49:25,680 Speaker 4: change over time, you know, and that alone, never mind 992 00:49:25,880 --> 00:49:27,840 Speaker 4: you know, using scientific reasoning, but just the changing of 993 00:49:27,880 --> 00:49:31,480 Speaker 4: facts might, you know, might change people's views. I think 994 00:49:32,920 --> 00:49:37,960 Speaker 4: teaching people about the trap of consistency, you know, may 995 00:49:38,000 --> 00:49:39,879 Speaker 4: start to help. I'm not sure there's like an easy 996 00:49:39,960 --> 00:49:42,440 Speaker 4: nudge that is going to make people suddenly prefer the 997 00:49:42,440 --> 00:49:45,000 Speaker 4: flip flopper, because, after all, sometimes the flip flopper is 998 00:49:45,040 --> 00:49:47,160 Speaker 4: just being expedient, right. It's not as simple as saying, like, well, 999 00:49:47,160 --> 00:49:49,320 Speaker 4: anytime someone flip flops, we should just assume 1000 00:49:49,400 --> 00:49:52,600 Speaker 4: that they have done some complex ratiocination and 1001 00:49:52,680 --> 00:49:55,719 Speaker 4: integration of new information, and now they've changed their mind. 1002 00:49:55,760 --> 00:49:57,600 Speaker 4: It could be they're just saying something to a different 1003 00:49:57,600 --> 00:50:00,560 Speaker 4: audience because that's what is expedient for them at the moment. 1004 00:50:00,640 --> 00:50:03,040 Speaker 4: It's a tough problem to sort out, but certainly I 1005 00:50:03,040 --> 00:50:05,080 Speaker 4: don't think we should have the bias against changing one's 1006 00:50:05,120 --> 00:50:06,640 Speaker 4: mind that really seems to be built into 1007 00:50:06,640 --> 00:50:08,960 Speaker 4: the political system, because I do think it contributes to 1008 00:50:08,960 --> 00:50:11,719 Speaker 4: some polarization also, right? In order to be a consistent conservative, 1009 00:50:11,719 --> 00:50:13,000 Speaker 4: you always have to do this, in order to be 1010 00:50:13,080 --> 00:50:17,640 Speaker 4: a strong, you know... it definitely, it definitely has its cost. 1011 00:50:18,440 --> 00:50:21,560 Speaker 5: Well, and you know, scientists also aren't necessarily the greatest 1012 00:50:21,600 --> 00:50:23,160 Speaker 5: at updating their beliefs in light of evidence. 1013 00:50:23,400 --> 00:50:23,600 Speaker 2: Right.
1014 00:50:24,600 --> 00:50:26,840 Speaker 5: There are plenty of people who get evidence that should 1015 00:50:26,880 --> 00:50:29,560 Speaker 5: contradict their claims, but they continue arguing for the old 1016 00:50:29,760 --> 00:50:33,719 Speaker 5: position without fully updating their beliefs accordingly, right? Every time 1017 00:50:33,760 --> 00:50:36,879 Speaker 5: somebody encounters a failure to replicate their own work, right, 1018 00:50:37,480 --> 00:50:39,560 Speaker 5: what's their reaction? Well, I'm going to try and find 1019 00:50:39,560 --> 00:50:41,200 Speaker 5: all of the things that were done differently in that 1020 00:50:41,760 --> 00:50:44,880 Speaker 5: replication attempt that could maybe excuse why they didn't find 1021 00:50:45,520 --> 00:50:48,360 Speaker 5: what I found. And what they really should do is say, okay, 1022 00:50:48,640 --> 00:50:50,279 Speaker 5: you know, that may or may not be right. I 1023 00:50:50,360 --> 00:50:52,600 Speaker 5: might disagree with it, but it should make me a 1024 00:50:52,640 --> 00:50:55,879 Speaker 5: little less likely to believe in my original result. And 1025 00:50:55,920 --> 00:50:58,080 Speaker 5: maybe there are alternatives and I need to go test those, 1026 00:50:58,120 --> 00:51:00,880 Speaker 5: and if those alternatives don't work out, then I should 1027 00:51:00,880 --> 00:51:02,759 Speaker 5: be changing my beliefs. But you don't see that all 1028 00:51:02,760 --> 00:51:05,360 Speaker 5: that often. You don't see the person going back to 1029 00:51:05,360 --> 00:51:07,440 Speaker 5: the study and saying, okay, I'm going to prove that 1030 00:51:07,520 --> 00:51:10,799 Speaker 5: it was this moderating factor that explained why they 1031 00:51:10,800 --> 00:51:13,200 Speaker 5: didn't get it, but I did. More often than not, 1032 00:51:13,320 --> 00:51:16,600 Speaker 5: you have a dismissal, right? Well, that's really not all 1033 00:51:16,600 --> 00:51:19,920 Speaker 5: that different than what a politician does when they refuse 1034 00:51:19,960 --> 00:51:21,920 Speaker 5: to change their views in light of new data. 1035 00:51:22,200 --> 00:51:23,400 Speaker 3: Yeah, you know. 1036 00:51:23,440 --> 00:51:25,919 Speaker 1: I think this is back to this issue of complexity 1037 00:51:25,960 --> 00:51:29,680 Speaker 1: in science, which is that it essentially works by the 1038 00:51:29,719 --> 00:51:33,640 Speaker 1: advocacy system, which is that you're supposed to defend an 1039 00:51:33,719 --> 00:51:36,200 Speaker 1: idea all the way down until you can't defend it 1040 00:51:36,200 --> 00:51:38,239 Speaker 1: anymore and then you give up on it. But so 1041 00:51:38,400 --> 00:51:41,520 Speaker 1: when someone says, oh, I didn't replicate your thing, you 1042 00:51:41,600 --> 00:51:43,680 Speaker 1: might be the only one who says, hey, I'm willing 1043 00:51:43,719 --> 00:51:47,040 Speaker 1: to defend this and really fight for the idea until 1044 00:51:47,080 --> 00:51:49,759 Speaker 1: I reach some point. The question is what is the 1045 00:51:49,800 --> 00:51:54,040 Speaker 1: proper point? And as we said, scientists are humans too, 1046 00:51:54,120 --> 00:51:57,400 Speaker 1: and so they really care about their reputation and their 1047 00:51:57,440 --> 00:51:58,480 Speaker 1: previous publications. 1048 00:51:59,160 --> 00:52:01,440 Speaker 5: And that's fine. So it's, what's, what's the point? But 1049 00:52:01,480 --> 00:52:04,160 Speaker 5: also what evidence do you use to defend it?
1050 00:52:04,360 --> 00:52:05,040 Speaker 3: Yeah? Right? 1051 00:52:05,200 --> 00:52:08,040 Speaker 5: So if, if the evidence you used to defend it is, hey, 1052 00:52:08,040 --> 00:52:10,319 Speaker 5: we can still get our result and we can show 1053 00:52:10,360 --> 00:52:13,400 Speaker 5: why you didn't get it, that's great, right? Then you, 1054 00:52:13,560 --> 00:52:16,719 Speaker 5: then you've bolstered your position. You've led everybody to update 1055 00:52:16,719 --> 00:52:19,560 Speaker 5: their beliefs. Or if your defense is they're incompetent, 1056 00:52:20,400 --> 00:52:22,680 Speaker 5: that's not a great defense. It might be true, but 1057 00:52:23,280 --> 00:52:25,040 Speaker 5: you'd have to show why and then show that if 1058 00:52:25,040 --> 00:52:26,640 Speaker 5: you do it in the incompetent way, you don't get 1059 00:52:26,680 --> 00:52:27,080 Speaker 5: the result. 1060 00:52:28,480 --> 00:52:32,240 Speaker 4: Well, according to the adage that science progresses one funeral 1061 00:52:32,239 --> 00:52:34,799 Speaker 4: at a time, that point is the point of death. Right, 1062 00:52:34,920 --> 00:52:37,200 Speaker 4: And so there is a generational aspect to some of 1063 00:52:37,239 --> 00:52:39,160 Speaker 4: these, to some of these things. I think there's, there seriously 1064 00:52:39,200 --> 00:52:41,960 Speaker 4: is, right, there's sort of generations. Generations often are associated 1065 00:52:41,960 --> 00:52:44,520 Speaker 4: with particular schools of thought or views that sort of 1066 00:52:44,560 --> 00:52:46,640 Speaker 4: pass 1067 00:52:45,760 --> 00:52:46,400 Speaker 3: and evolve. 1068 00:52:46,400 --> 00:52:49,080 Speaker 4: It would be great if we could get there before then, right, 1069 00:52:50,080 --> 00:52:54,280 Speaker 4: So we should get there before that point. But it's, 1070 00:52:54,400 --> 00:52:56,200 Speaker 4: I don't think, I'm not so bleak, I'm not so 1071 00:52:56,280 --> 00:52:58,520 Speaker 4: bleak on this maybe as some people, because I think 1072 00:52:58,520 --> 00:52:59,920 Speaker 4: a lot of times what happens is people just sort 1073 00:52:59,920 --> 00:53:01,960 Speaker 4: of drop out of the debate. Right. They may not 1074 00:53:02,000 --> 00:53:04,440 Speaker 4: have changed their mind, but they don't become an important 1075 00:53:04,440 --> 00:53:06,799 Speaker 4: person in that, in that area anymore, or at least 1076 00:53:06,840 --> 00:53:10,080 Speaker 4: the next generation who's doing all the interesting work doesn't 1077 00:53:10,120 --> 00:53:13,840 Speaker 4: take those people that seriously anymore. They're still alive, where, 1078 00:53:13,880 --> 00:53:16,399 Speaker 4: you know, their funeral hasn't happened yet, but they maybe 1079 00:53:16,440 --> 00:53:18,279 Speaker 4: moved on to another topic. They're doing something else in 1080 00:53:18,320 --> 00:53:20,640 Speaker 4: science maybe, but they're not like exerting like an iron, 1081 00:53:21,280 --> 00:53:23,880 Speaker 4: you know, an iron fist, you know, rule over some area, 1082 00:53:23,920 --> 00:53:26,120 Speaker 4: like when a science works that way, that's pathological. 1083 00:53:26,200 --> 00:53:26,359 Speaker 1: Right. 1084 00:53:26,440 --> 00:53:29,040 Speaker 4: But you know, when, when someone's views, like, you know, 1085 00:53:29,160 --> 00:53:30,799 Speaker 4: must be accepted, then, then, you know, you're not talking 1086 00:53:30,840 --> 00:53:32,359 Speaker 4: about science anymore, right.
I think a lot of people 1087 00:53:32,440 --> 00:53:35,200 Speaker 4: just sort of move on instead of before it's too late. 1088 00:53:35,640 --> 00:53:37,759 Speaker 1: Yeah, And this is the great part about science is 1089 00:53:37,800 --> 00:53:41,120 Speaker 1: that it is the only endeavor that's constantly knocking down 1090 00:53:41,160 --> 00:53:41,759 Speaker 1: its own wall. 1091 00:53:41,920 --> 00:53:44,800 Speaker 2: So with enough time, the truth will out. 1092 00:53:45,239 --> 00:53:48,799 Speaker 1: Things are changing really rapidly ever since you, you know, 1093 00:53:48,880 --> 00:53:51,759 Speaker 1: finished writing the book and sent it off, And so 1094 00:53:51,920 --> 00:53:54,080 Speaker 1: one of the things that I want to ask about 1095 00:53:54,080 --> 00:53:57,040 Speaker 1: are things like what's happening with AI and deep fakes. 1096 00:53:57,120 --> 00:54:00,200 Speaker 1: What are the new kinds of hoaxes that you see 1097 00:54:00,360 --> 00:54:02,640 Speaker 1: coming down the line. 1098 00:54:02,920 --> 00:54:05,319 Speaker 5: Well, I'll raise one that's not new, And I think 1099 00:54:05,360 --> 00:54:08,360 Speaker 5: it's important to point out that none of the hoaxes 1100 00:54:08,400 --> 00:54:11,200 Speaker 5: that have happened over the thousands of years are really 1101 00:54:11,280 --> 00:54:13,480 Speaker 5: fundamentally all that different in the way that they take 1102 00:54:13,520 --> 00:54:16,480 Speaker 5: advantage of our tendencies, right, And that's the thing that 1103 00:54:16,480 --> 00:54:19,160 Speaker 5: we noticed across all of these different domains, from chess 1104 00:54:19,200 --> 00:54:22,600 Speaker 5: to sports, to art to science, that they all take 1105 00:54:22,640 --> 00:54:25,720 Speaker 5: advantage of the same sorts of tendencies. And new scams 1106 00:54:25,719 --> 00:54:27,239 Speaker 5: are going to do that too, they just might do 1107 00:54:27,280 --> 00:54:30,800 Speaker 5: it more effectively. And even the Nigerian Prince email scam 1108 00:54:31,080 --> 00:54:35,279 Speaker 5: was originally a Nigerian prince mail scam back when people 1109 00:54:35,320 --> 00:54:35,960 Speaker 5: sent letters. 1110 00:54:36,520 --> 00:54:38,080 Speaker 3: It's more effective in. 1111 00:54:38,000 --> 00:54:40,400 Speaker 5: That they don't have to spend as much time and 1112 00:54:40,440 --> 00:54:44,640 Speaker 5: effort finding potential victims. Right, some of these scams with 1113 00:54:44,719 --> 00:54:47,840 Speaker 5: the advent of AI are going to become more effective. 1114 00:54:48,040 --> 00:54:48,200 Speaker 3: Right. 1115 00:54:48,239 --> 00:54:51,680 Speaker 5: So one that's common right now is it's either the 1116 00:54:51,880 --> 00:54:55,759 Speaker 5: you know your kid's been arrested scam or has been 1117 00:54:55,800 --> 00:54:58,160 Speaker 5: in an accident or is being held hostage. It's a 1118 00:54:58,200 --> 00:55:00,840 Speaker 5: horrible thing, preying on people's fears. They'll call up a 1119 00:55:00,920 --> 00:55:04,359 Speaker 5: parent or a grandparent and say that the kid needs 1120 00:55:04,400 --> 00:55:07,399 Speaker 5: to be bailed out immediately, right, and often they'll call up, 1121 00:55:07,600 --> 00:55:10,600 Speaker 5: you know, a relative like a cousin or something. 
Now, 1122 00:55:10,600 --> 00:55:14,879 Speaker 5: that scam's pretty effective because people want to quickly solve 1123 00:55:14,880 --> 00:55:17,640 Speaker 5: this problem, right, They want to quickly fix what is wrong, 1124 00:55:18,520 --> 00:55:21,240 Speaker 5: and often we don't have the preventive measures in place 1125 00:55:21,960 --> 00:55:25,040 Speaker 5: to stop that. But imagine how much more powerful that 1126 00:55:25,239 --> 00:55:27,719 Speaker 5: is if you're using AI voice synthesis to make the call, 1127 00:55:28,200 --> 00:55:30,920 Speaker 5: right, and it actually sounds like it's coming from that person. 1128 00:55:31,800 --> 00:55:34,399 Speaker 5: That's going to ramp that one up, same principles, it's 1129 00:55:34,440 --> 00:55:35,080 Speaker 5: just more potent. 1130 00:55:36,280 --> 00:55:36,719 Speaker 3: I think a 1131 00:55:38,440 --> 00:55:40,759 Speaker 4: whole area which is rife with scams, which is a 1132 00:55:40,800 --> 00:55:43,560 Speaker 4: new area, but, you know, sort of re-scamming based 1133 00:55:43,560 --> 00:55:47,520 Speaker 4: on old principles, is cryptocurrency. So there are thousands and 1134 00:55:47,560 --> 00:55:51,080 Speaker 4: thousands and thousands of cryptocurrencies and coins being issued and 1135 00:55:51,640 --> 00:55:54,160 Speaker 4: so on, and you know, as far as I can tell, 1136 00:55:54,400 --> 00:55:58,040 Speaker 4: most of that is mostly fraud. But yet it relies 1137 00:55:58,040 --> 00:55:59,640 Speaker 4: on all the same principles. You've got sort of like 1138 00:55:59,680 --> 00:56:03,120 Speaker 4: familiar celebrities advertising these things. You've got time pressure, 1139 00:56:03,160 --> 00:56:04,920 Speaker 4: there's a limited offering, you know, you've got to make 1140 00:56:04,920 --> 00:56:09,080 Speaker 4: a decision now. You've got this sort of like fake consistency, 1141 00:56:09,200 --> 00:56:11,319 Speaker 4: Like people will claim that, like, our crypto fund has, 1142 00:56:11,480 --> 00:56:13,520 Speaker 4: you know, never had a down month in all of 1143 00:56:13,520 --> 00:56:15,719 Speaker 4: its three months of existence or something like that, you know, 1144 00:56:15,800 --> 00:56:18,319 Speaker 4: but consistently going up, and all the same stuff just 1145 00:56:18,360 --> 00:56:21,040 Speaker 4: being applied to a whole new, being applied to a 1146 00:56:21,040 --> 00:56:23,120 Speaker 4: whole new thing. And I don't really see that, you know, 1147 00:56:23,160 --> 00:56:26,560 Speaker 4: getting any better. As far as AI and deep 1148 00:56:26,560 --> 00:56:30,359 Speaker 4: fakes and so on, I do have some optimism that 1149 00:56:30,920 --> 00:56:33,960 Speaker 4: it's going to increase the value of truly trusted sources 1150 00:56:33,960 --> 00:56:35,480 Speaker 4: who bother to check that stuff. 1151 00:56:35,920 --> 00:56:36,600 Speaker 3: Right, So I 1152 00:56:36,640 --> 00:56:39,200 Speaker 4: noticed, not long ago, there was this 1153 00:56:39,239 --> 00:56:42,120 Speaker 4: sort of pseudo attempted coup revolution weird thing that 1154 00:56:42,200 --> 00:56:46,279 Speaker 4: happened in Russia. You know, a sort of paramilitary group 1155 00:56:46,360 --> 00:56:49,080 Speaker 4: kind of turned on the military and started marching to 1156 00:56:49,160 --> 00:56:51,799 Speaker 4: Moscow and so on.
And I was fascinated by this 1157 00:56:51,920 --> 00:56:53,640 Speaker 4: and paying attention to Twitter, and there were all kinds 1158 00:56:53,640 --> 00:56:56,879 Speaker 4: of reports on Twitter, people claiming to be like eyewitnesses 1159 00:56:56,920 --> 00:56:59,560 Speaker 4: to things and so on, and very little of that 1160 00:56:59,600 --> 00:57:02,839 Speaker 4: made it to the mainstream media or to legitimate sources. 1161 00:57:03,400 --> 00:57:06,080 Speaker 4: And I thought about it afterwards, I thought, why is that? 1162 00:57:06,200 --> 00:57:08,400 Speaker 4: You know, well, maybe some of it was true, but 1163 00:57:08,400 --> 00:57:10,520 Speaker 4: probably most of it just couldn't be verified. You know. 1164 00:57:10,560 --> 00:57:12,080 Speaker 4: It was like one guy said they saw something, they 1165 00:57:12,080 --> 00:57:13,799 Speaker 4: couldn't find someone else who saw the same thing, They 1166 00:57:13,800 --> 00:57:17,520 Speaker 4: couldn't find the underlying you know, whatever it was that 1167 00:57:17,640 --> 00:57:20,120 Speaker 4: was supposedly the source of the evidence. But nonetheless the 1168 00:57:20,160 --> 00:57:24,160 Speaker 4: story that emerged, although a little more vague and abstract 1169 00:57:24,240 --> 00:57:26,480 Speaker 4: with less detail, was probably much more likely to be 1170 00:57:26,520 --> 00:57:30,200 Speaker 4: true because it was sort of filtered through agents that 1171 00:57:30,280 --> 00:57:33,280 Speaker 4: bother to check and try to only pass on verifiable information. 1172 00:57:33,360 --> 00:57:35,080 Speaker 4: And they are now faced with the problem of how 1173 00:57:35,120 --> 00:57:37,320 Speaker 4: do you tell whether this video of Trump doing X 1174 00:57:37,720 --> 00:57:40,480 Speaker 4: is actually Trump doing it or some fake that someone created, right, 1175 00:57:40,520 --> 00:57:42,800 Speaker 4: But I don't know who else to put more trust in, 1176 00:57:43,200 --> 00:57:46,840 Speaker 4: you know, for sorting that out than journalists and or 1177 00:57:46,880 --> 00:57:48,760 Speaker 4: and there are some organizations that they work with who 1178 00:57:48,760 --> 00:57:51,200 Speaker 4: are experts at detecting these kinds of things and so on. 1179 00:57:51,720 --> 00:57:55,440 Speaker 4: So I think maybe it might paradoxically increase the value 1180 00:57:55,680 --> 00:57:59,120 Speaker 4: and the attention paid to more legitimate sources, which I 1181 00:57:59,120 --> 00:58:00,800 Speaker 4: think would probably be a good thing on balance. 1182 00:58:01,200 --> 00:58:03,240 Speaker 5: I mean, the pessimistic view is that these things get 1183 00:58:03,280 --> 00:58:05,600 Speaker 5: increased in scale, right, it makes it much easier to 1184 00:58:06,520 --> 00:58:10,080 Speaker 5: scam at large scale and make it sound plausible, right. 1185 00:58:10,200 --> 00:58:14,000 Speaker 5: But the optimistic take is exactly what Chris was saying, 1186 00:58:14,000 --> 00:58:17,640 Speaker 5: that once we realize that these things are possible at scale, 1187 00:58:18,320 --> 00:58:20,800 Speaker 5: maybe we start being more skeptical of most of the 1188 00:58:20,880 --> 00:58:24,280 Speaker 5: sort of rapid information that we get, and we withhold 1189 00:58:24,320 --> 00:58:26,600 Speaker 5: judgment just a little bit longer until we can have 1190 00:58:26,640 --> 00:58:29,160 Speaker 5: some verified sources. 
And the idea would be if we 1191 00:58:29,160 --> 00:58:32,240 Speaker 5: could actually have verified sources again, we haven't had that 1192 00:58:32,280 --> 00:58:34,440 Speaker 5: for a while, now that anybody can start up a 1193 00:58:34,520 --> 00:58:35,919 Speaker 5: cable network and say whatever they want. 1194 00:58:36,560 --> 00:58:39,040 Speaker 1: This is something I've been wondering about recently, with all 1195 00:58:39,040 --> 00:58:41,600 Speaker 1: of these things that we're very concerned about, like deep fakes: 1196 00:58:42,320 --> 00:58:46,120 Speaker 1: will the younger generations be much less susceptible to them 1197 00:58:46,160 --> 00:58:48,400 Speaker 1: because they're well aware that if you see a video 1198 00:58:48,440 --> 00:58:50,200 Speaker 1: of something, it might be real, it might be fake, 1199 00:58:50,640 --> 00:58:53,480 Speaker 1: as opposed to, you know, those of us who are older, 1200 00:58:53,680 --> 00:58:56,080 Speaker 1: who are really concerned about it in a way that we 1201 00:58:56,160 --> 00:58:56,960 Speaker 1: might not need to be? 1202 00:58:57,720 --> 00:58:59,080 Speaker 3: I think that's a really good question. 1203 00:58:59,160 --> 00:59:01,320 Speaker 4: I think the jury is still out on that, because I 1204 00:59:01,360 --> 00:59:04,560 Speaker 4: think in some ways younger people are a bit more 1205 00:59:04,680 --> 00:59:07,720 Speaker 4: naive about some things. They don't have certain experiences and 1206 00:59:07,720 --> 00:59:10,240 Speaker 4: so on. On the other hand, as you say, they 1207 00:59:10,280 --> 00:59:12,360 Speaker 4: may be more used to the idea that videos are 1208 00:59:12,360 --> 00:59:15,440 Speaker 4: not proof, in a way that people who grew up in an 1209 00:59:15,480 --> 00:59:19,080 Speaker 4: era of less video and less awareness of video editing 1210 00:59:19,120 --> 00:59:21,640 Speaker 4: and so on might not be. I'm reminded of the 1211 00:59:21,680 --> 00:59:24,960 Speaker 4: sort of discussion that you heard, you know, fifteen to 1212 00:59:25,000 --> 00:59:27,520 Speaker 4: twenty years ago about the so-called digital natives and 1213 00:59:27,560 --> 00:59:29,840 Speaker 4: how, having grown up with technology, they were so smart 1214 00:59:29,840 --> 00:59:30,680 Speaker 4: in using it and so on. 1215 00:59:30,760 --> 00:59:32,520 Speaker 3: And then when I became a college. 1216 00:59:32,200 --> 00:59:34,240 Speaker 4: Professor, I found out that students didn't really know how 1217 00:59:34,240 --> 00:59:36,600 Speaker 4: to do a proper Google search, you know, and so on, 1218 00:59:36,640 --> 00:59:39,880 Speaker 4: even though they were supposedly natives, which is like, in America, 1219 00:59:40,000 --> 00:59:43,440 Speaker 4: not being able to, you know, speak English correctly. 1220 00:59:43,720 --> 00:59:47,600 Speaker 4: So that gives me less optimism. But I think in general, 1221 00:59:47,760 --> 00:59:50,240 Speaker 4: across generations, I think there's going to be a rise 1222 00:59:50,480 --> 00:59:54,280 Speaker 4: in skepticism, maybe somewhat of a 1223 00:59:54,320 --> 00:59:57,760 Speaker 4: decline of truth bias. Truth bias can't decline too far, 1224 00:59:57,840 --> 01:00:00,439 Speaker 4: otherwise we just can't interact with anybody anymore.
But maybe 1225 01:00:00,480 --> 01:00:03,240 Speaker 4: a sort of a decline or a specialization of truth 1226 01:00:03,280 --> 01:00:04,960 Speaker 4: bias, where you have sort of a little bit more 1227 01:00:04,960 --> 01:00:07,520 Speaker 4: truth bias in some areas, like when you're talking to 1228 01:00:07,560 --> 01:00:09,800 Speaker 4: an actual human being standing in front of you, and 1229 01:00:09,960 --> 01:00:11,960 Speaker 4: less when you're watching a video on TikTok. Like that 1230 01:00:12,000 --> 01:00:14,480 Speaker 4: would be a nice balance to have, right? And not 1231 01:00:14,560 --> 01:00:16,480 Speaker 4: to pick on TikTok, but there seems to be more 1232 01:00:16,480 --> 01:00:19,200 Speaker 4: nonsense there than most other places, just from what I've noticed. 1233 01:00:20,120 --> 01:00:25,160 Speaker 1: Okay, so zooming out, give us some practical advice for people, 1234 01:00:25,720 --> 01:00:29,520 Speaker 1: some tips they can take home. Well. 1235 01:00:29,600 --> 01:00:32,000 Speaker 5: I'd say one quick one is that whenever you're in 1236 01:00:32,040 --> 01:00:35,760 Speaker 5: a situation where the consequences could be big, be willing 1237 01:00:35,800 --> 01:00:38,360 Speaker 5: to ask more questions. And it can be socially awkward 1238 01:00:38,400 --> 01:00:40,800 Speaker 5: to do that, right, to kind of press for more information, 1239 01:00:41,520 --> 01:00:44,720 Speaker 5: but doing that is essential if the consequences of being deceived 1240 01:00:44,760 --> 01:00:47,880 Speaker 5: are big. And sometimes you can kind of get started 1241 01:00:47,920 --> 01:00:51,200 Speaker 5: on asking questions without actually, you know, being hostile and aggressive. 1242 01:00:51,440 --> 01:00:53,520 Speaker 5: Like, can you tell me more? is a way of 1243 01:00:53,520 --> 01:00:55,080 Speaker 5: getting somebody to talk a little bit more, give you 1244 01:00:55,120 --> 01:00:57,360 Speaker 5: a little more information, and that might actually make it more 1245 01:00:57,400 --> 01:01:01,120 Speaker 5: comfortable to ask questions about that information they give you. Right? 1246 01:01:01,240 --> 01:01:04,600 Speaker 5: So it's the sorts of skills that many of us develop 1247 01:01:04,600 --> 01:01:06,520 Speaker 5: in academia: you're at a talk and you can 1248 01:01:06,920 --> 01:01:09,600 Speaker 5: stand up and ask the hostile question, or you could 1249 01:01:09,960 --> 01:01:13,280 Speaker 5: ask a question that reveals more information, and the goal 1250 01:01:13,320 --> 01:01:15,360 Speaker 5: is to try and reveal more information and remain a 1251 01:01:15,360 --> 01:01:19,320 Speaker 5: little uncertain until you have that information. One broader one 1252 01:01:19,360 --> 01:01:21,920 Speaker 5: is: if somebody were trying to scam me in this situation, 1253 01:01:22,000 --> 01:01:24,720 Speaker 5: let's say you're investing in something, if somebody were trying 1254 01:01:24,760 --> 01:01:26,000 Speaker 5: to scam me, how would I know? 1255 01:01:26,720 --> 01:01:26,840 Speaker 1: Right? 1256 01:01:26,920 --> 01:01:29,000 Speaker 5: So, if I'm thinking about investing in crypto, I say, 1257 01:01:29,040 --> 01:01:30,760 Speaker 5: is that a scam? How would I know if that's 1258 01:01:30,760 --> 01:01:34,080 Speaker 5: a scam? If you can't answer that question, then you 1259 01:01:34,080 --> 01:01:35,080 Speaker 5: probably should walk away. 1260 01:01:35,960 --> 01:01:36,520 Speaker 3: So if you.
1261 01:01:36,440 --> 01:01:40,840 Speaker 5: Don't understand how blockchain works and how crypto coins work, 1262 01:01:41,000 --> 01:01:43,400 Speaker 5: you probably shouldn't be investing in crypto, regardless of what 1263 01:01:43,440 --> 01:01:46,200 Speaker 5: a celebrity tells you. If it were a scam, how 1264 01:01:46,240 --> 01:01:49,080 Speaker 5: would you tell? Well, it would be really hard to tell 1265 01:01:49,120 --> 01:01:52,000 Speaker 5: if you don't understand how it works intimately. I'll give 1266 01:01:52,040 --> 01:01:55,880 Speaker 5: two practical ideas also. I think one is don't make 1267 01:01:55,920 --> 01:01:59,520 Speaker 5: really important decisions all by yourself. We came across many 1268 01:01:59,560 --> 01:02:03,400 Speaker 5: examples where people were about to make mistakes, big mistakes. Like, 1269 01:02:03,440 --> 01:02:05,320 Speaker 5: for example, one of those guys was about 1270 01:02:06,240 --> 01:02:08,960 Speaker 5: to wire money to the fake French defense minister, and 1271 01:02:09,120 --> 01:02:11,200 Speaker 5: his friend walked into the room where he was having 1272 01:02:11,440 --> 01:02:13,560 Speaker 5: this call, and he immediately, right away, said 1273 01:02:13,600 --> 01:02:15,600 Speaker 5: this can't be real, this must be a scam. And 1274 01:02:15,640 --> 01:02:17,920 Speaker 5: why was the friend able to notice but the victim, 1275 01:02:18,200 --> 01:02:21,360 Speaker 5: the intended victim, wasn't? Well, probably the friend had not 1276 01:02:21,400 --> 01:02:23,840 Speaker 5: been in on all the previous conversations, so there wasn't 1277 01:02:23,880 --> 01:02:26,920 Speaker 5: sort of that sunk costs idea, that idea of a relationship. 1278 01:02:26,360 --> 01:02:26,840 Speaker 3: And so on. 1279 01:02:27,320 --> 01:02:29,200 Speaker 4: And maybe it was just he had a different mindset 1280 01:02:29,200 --> 01:02:31,040 Speaker 4: that day, he had a different attitude, he was thinking 1281 01:02:31,040 --> 01:02:33,640 Speaker 4: different things, and he never got sucked into the whole thing. 1282 01:02:33,680 --> 01:02:37,520 Speaker 4: So ask a friend, get an outside view before you 1283 01:02:37,560 --> 01:02:39,400 Speaker 4: make a big decision. Should I really send all my 1284 01:02:39,400 --> 01:02:42,080 Speaker 4: life savings to this guy, you know, just because everybody 1285 01:02:42,080 --> 01:02:44,200 Speaker 4: says he's the greatest thing, or is there any other 1286 01:02:44,320 --> 01:02:46,640 Speaker 4: consideration I should be using when investing my money? 1287 01:02:46,680 --> 01:02:47,439 Speaker 3: So that's one. 1288 01:02:47,960 --> 01:02:51,480 Speaker 4: The second one is like, do your work on deadlines, 1289 01:02:51,520 --> 01:02:53,640 Speaker 4: but don't, like, give away your money on deadline.
So 1290 01:02:53,640 --> 01:02:55,680 Speaker 4: if anybody ever says, like, you know, you've got to 1291 01:02:55,680 --> 01:02:58,240 Speaker 4: do this within a certain period of time, the police 1292 01:02:58,240 --> 01:03:00,000 Speaker 4: are coming to your house if you don't, like, pay 1293 01:03:00,120 --> 01:03:04,720 Speaker 4: this bill right away, or this offer's exploding very quickly, 1294 01:03:05,040 --> 01:03:07,240 Speaker 4: you know, or there's only one of these things left 1295 01:03:07,400 --> 01:03:10,800 Speaker 4: or whatever, just be aware that, like, that's a prime 1296 01:03:10,960 --> 01:03:13,960 Speaker 4: environment to not have time to ask questions, not have 1297 01:03:13,960 --> 01:03:15,800 Speaker 4: time to think about the information you're missing, not go 1298 01:03:15,840 --> 01:03:18,040 Speaker 4: through any of this. And realize that, like, you can 1299 01:03:18,080 --> 01:03:20,200 Speaker 4: still buy that thing the next day if you really 1300 01:03:20,240 --> 01:03:22,000 Speaker 4: want it, you can still invest your money next week 1301 01:03:22,040 --> 01:03:23,120 Speaker 4: after you have checked the guy out. 1302 01:03:23,560 --> 01:03:26,200 Speaker 3: You're not going to lose much. I would go with 1303 01:03:26,240 --> 01:03:26,640 Speaker 3: those two. 1304 01:03:31,520 --> 01:03:34,920 Speaker 1: So that was Dan Simons and Christopher Chabris. Now what 1305 01:03:35,000 --> 01:03:38,280 Speaker 1: we learn from them is that a lot of protecting 1306 01:03:38,280 --> 01:03:42,640 Speaker 1: ourselves against deception is about taking the tools of science, 1307 01:03:42,680 --> 01:03:46,440 Speaker 1: which are nothing but the tools of thinking clearly, and 1308 01:03:46,520 --> 01:03:49,840 Speaker 1: applying those to our daily lives. So, for example, if 1309 01:03:49,880 --> 01:03:54,440 Speaker 1: somebody says something is true, whether from their position of 1310 01:03:54,480 --> 01:03:58,360 Speaker 1: authority or religious status, or with a trust me on 1311 01:03:58,400 --> 01:04:03,480 Speaker 1: this one vibe, the key is to trust, but verify. 1312 01:04:03,600 --> 01:04:06,200 Speaker 1: The important thing to get in the habit of is 1313 01:04:06,280 --> 01:04:10,680 Speaker 1: just asking the next question. And it's tough, because life 1314 01:04:10,720 --> 01:04:14,560 Speaker 1: doesn't allow questioning everything. Our schedules just don't allow that, 1315 01:04:15,080 --> 01:04:17,920 Speaker 1: and we have to operate on trust for most of 1316 01:04:17,960 --> 01:04:20,480 Speaker 1: what we do. And sometimes we find ourselves in a 1317 01:04:20,520 --> 01:04:23,760 Speaker 1: situation where someone doesn't quite answer the question we've asked, 1318 01:04:24,080 --> 01:04:27,200 Speaker 1: and it feels impolite to keep pressing on it. And 1319 01:04:27,320 --> 01:04:30,760 Speaker 1: also, what is life if we don't trust? But the 1320 01:04:30,880 --> 01:04:34,480 Speaker 1: fact is we can always get a little smarter, a 1321 01:04:34,520 --> 01:04:39,040 Speaker 1: little less gullible, by knowing that reality can be different 1322 01:04:39,080 --> 01:04:42,760 Speaker 1: in different heads.
And whether we're talking about Tanya, my 1323 01:04:42,840 --> 01:04:46,360 Speaker 1: fellow graduate student, or the dealer of the shell game, 1324 01:04:46,880 --> 01:04:52,240 Speaker 1: or Elizabeth Holmes and Theranos, or whatever, it's incredibly useful 1325 01:04:52,720 --> 01:04:57,960 Speaker 1: to stretch beyond the parochial limits of our mental models 1326 01:04:58,040 --> 01:05:02,000 Speaker 1: of the world, because with more knowledge comes a 1327 01:05:02,080 --> 01:05:06,960 Speaker 1: bit more immunity, and understanding the character of our brains 1328 01:05:07,040 --> 01:05:10,240 Speaker 1: allows us to move through the world a little bit 1329 01:05:10,280 --> 01:05:20,920 Speaker 1: more smoothly than we would without that knowledge. Go to 1330 01:05:20,960 --> 01:05:24,360 Speaker 1: Eagleman dot com slash podcast for more information and to 1331 01:05:24,400 --> 01:05:28,680 Speaker 1: find further reading. Send me an email at podcasts at 1332 01:05:28,680 --> 01:05:32,520 Speaker 1: eagleman dot com with questions or discussion, and I'll address 1333 01:05:32,600 --> 01:05:40,040 Speaker 1: those in a special episode. Until next time, I'm David Eagleman, 1334 01:05:40,240 --> 01:05:42,160 Speaker 1: and this is Inner Cosmos.