1 00:00:01,440 --> 00:00:04,920 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. 2 00:00:11,160 --> 00:00:13,640 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:13,720 --> 00:00:17,200 Speaker 2: Chuck and we're flying solo again, which means we hopefully 4 00:00:17,239 --> 00:00:22,000 Speaker 2: won't crash this joint. And this is Stuff. That's right, 5 00:00:22,960 --> 00:00:26,640 Speaker 2: that's right. How are you man? You're still sick? Huh? 6 00:00:26,720 --> 00:00:28,920 Speaker 1: Yeah, I mean, and this is kind of I mean, 7 00:00:29,040 --> 00:00:30,760 Speaker 1: I don't like playing it this close, but it's kind 8 00:00:30,760 --> 00:00:33,839 Speaker 1: of fun to be a little more current with like 9 00:00:33,920 --> 00:00:35,680 Speaker 1: listener mails and updates and stuff. 10 00:00:36,040 --> 00:00:37,919 Speaker 2: Yeah, it keeps us on the edge where we need 11 00:00:37,960 --> 00:00:38,159 Speaker 2: to be. 12 00:00:38,320 --> 00:00:40,120 Speaker 1: Yeah. So this will be out on Tuesday, I guess. 13 00:00:40,120 --> 00:00:42,199 Speaker 1: And in real time, this is the day after the 14 00:00:43,080 --> 00:00:44,919 Speaker 1: automat oyster stew debacle. 15 00:00:46,840 --> 00:00:48,440 Speaker 2: I don't know if it was a debacle. That turned 16 00:00:48,440 --> 00:00:49,400 Speaker 2: out to be a pretty good. 17 00:00:49,240 --> 00:00:51,400 Speaker 1: Yep, and Aaron Cooper already came through. 18 00:00:52,080 --> 00:00:53,159 Speaker 2: Oh good, I haven't seen it yet. 19 00:00:53,200 --> 00:00:57,680 Speaker 1: Yeah, it's funny. I'm okay. You know, things subside in 20 00:00:57,720 --> 00:00:59,639 Speaker 1: the early afternoon, so I'm actually feeling a little better 21 00:00:59,640 --> 00:01:01,560 Speaker 1: than I was like twenty minutes ago even. 22 00:01:01,360 --> 00:01:02,720 Speaker 2: But yeah, it's crazy. 23 00:01:02,760 --> 00:01:04,080 Speaker 1: Yeah, I just I gotta go to the doctor and 24 00:01:04,280 --> 00:01:07,000 Speaker 1: just get it over with. You do. 25 00:01:07,120 --> 00:01:10,559 Speaker 2: I heard that there's a really bad norovirus going around, 26 00:01:10,640 --> 00:01:13,800 Speaker 2: and that's gotta be what you got, man. Going around 27 00:01:13,800 --> 00:01:16,080 Speaker 2: Mexico City, going around the world. 28 00:01:16,200 --> 00:01:20,520 Speaker 1: Oh really, yeah, Mexico City is included. This feels bacterial, 29 00:01:20,600 --> 00:01:21,840 Speaker 1: diverticulitis related. 30 00:01:22,520 --> 00:01:24,800 Speaker 2: Oh that's true. I forgot you got that. Yeah, well, 31 00:01:24,800 --> 00:01:27,440 Speaker 2: there's still a norovirus going around, so don't catch that too. 32 00:01:28,680 --> 00:01:32,280 Speaker 1: I'll try, but I'm hanging in there. I am working 33 00:01:32,319 --> 00:01:35,400 Speaker 1: on less than three hundred calories a day for five 34 00:01:35,480 --> 00:01:38,160 Speaker 1: days now, so I am a shell of a human. 35 00:01:38,640 --> 00:01:41,039 Speaker 2: You're gonna look lean and mean, have you been doing 36 00:01:41,040 --> 00:01:41,640 Speaker 2: push ups? 37 00:01:42,280 --> 00:01:44,160 Speaker 1: I can't do one push up right now, there's no way. 38 00:01:45,400 --> 00:01:48,520 Speaker 2: Well, Chuck, I guess it's entirely possible since I haven't 39 00:01:48,520 --> 00:01:52,520 Speaker 2: seen you, I've just you know, been talking to you.
Yeah, 40 00:01:52,560 --> 00:01:55,880 Speaker 2: while we record, I have no idea whether you're actually 41 00:01:55,920 --> 00:01:58,320 Speaker 2: sick or not. And it's entirely possible that you're fooling 42 00:01:58,320 --> 00:02:02,040 Speaker 2: me right now. Oh and if you are, I would 43 00:02:02,200 --> 00:02:05,840 Speaker 2: argue that doesn't make me gullible because I generally believe 44 00:02:05,840 --> 00:02:10,640 Speaker 2: you're trustworthy. There's no reason to believe that you're not sick. 45 00:02:11,200 --> 00:02:14,360 Speaker 2: So really, you'd just be a shameful, dirty liar, and 46 00:02:14,440 --> 00:02:17,240 Speaker 2: I would be the hero in this situation. 47 00:02:17,840 --> 00:02:22,640 Speaker 1: That's right. This is on gullibility and this you know, 48 00:02:22,800 --> 00:02:25,160 Speaker 1: we were just talking offline that there are I think 49 00:02:25,200 --> 00:02:27,639 Speaker 1: a hundred different ways to approach this kind of topic, 50 00:02:27,680 --> 00:02:32,120 Speaker 1: and sometimes that's like freeing, and sometimes that's really frustrating, 51 00:02:33,040 --> 00:02:35,520 Speaker 1: and I think this one was a little frustrating. Livia 52 00:02:35,560 --> 00:02:38,200 Speaker 1: put together a great article, I think, but it's just 53 00:02:38,240 --> 00:02:40,680 Speaker 1: a hard one. When I pitched it to her, I 54 00:02:40,720 --> 00:02:43,040 Speaker 1: was like, you know what I feel like, especially here 55 00:02:43,040 --> 00:02:47,760 Speaker 1: in America, we're at peak gullibility as a nation, and 56 00:02:47,840 --> 00:02:49,920 Speaker 1: like I just wondered, like, is there any science of that, 57 00:02:50,080 --> 00:02:52,359 Speaker 1: or like, are there people more gullible than others? And 58 00:02:52,720 --> 00:02:55,680 Speaker 1: can science be gullible? And this is what we came 59 00:02:55,760 --> 00:02:55,960 Speaker 1: up with. 60 00:02:56,720 --> 00:03:00,400 Speaker 2: Yeah, Interestingly, yes, science can be gullible. On the other hand, 61 00:03:01,280 --> 00:03:05,040 Speaker 2: you could argue that Americans aren't more gullible than usual, 62 00:03:05,520 --> 00:03:08,959 Speaker 2: that there's actually just different factors involved that make people 63 00:03:09,000 --> 00:03:13,000 Speaker 2: want to believe things. Maybe it's, it's weird. 64 00:03:13,040 --> 00:03:14,799 Speaker 2: I think one of the reasons why it's so hard 65 00:03:14,840 --> 00:03:18,560 Speaker 2: to wrap our head around is social psychologists are still 66 00:03:18,560 --> 00:03:20,040 Speaker 2: trying to wrap their head around it. 67 00:03:20,080 --> 00:03:20,480 Speaker 1: Totally. 68 00:03:20,560 --> 00:03:23,680 Speaker 2: And you know what happens when social psychologists get 69 00:03:23,680 --> 00:03:24,520 Speaker 2: a hold of something. 70 00:03:24,680 --> 00:03:26,359 Speaker 1: Oh yeah, it's an oyster stew party. 71 00:03:26,600 --> 00:03:28,919 Speaker 2: It's a little unsteady as they figure it out. That's right, 72 00:03:29,240 --> 00:03:32,480 Speaker 2: it's an oyster stew party. So I think it's not 73 00:03:32,600 --> 00:03:34,640 Speaker 2: it's not us, is what I'm trying to say. And you, 74 00:03:34,760 --> 00:03:37,200 Speaker 2: dear listener, if you're like, what is going on, it's 75 00:03:37,240 --> 00:03:39,960 Speaker 2: not you either. It's social psychology. 76 00:03:39,480 --> 00:03:42,440 Speaker 1: That's right.
I guess we can start by talking about 77 00:03:42,840 --> 00:03:44,120 Speaker 1: I mean, we're going to talk about a 78 00:03:44,120 --> 00:03:45,960 Speaker 1: lot of different people, a lot of different people that 79 00:03:46,000 --> 00:03:48,600 Speaker 1: study this kind of stuff, a lot of different studies, 80 00:03:49,240 --> 00:03:52,280 Speaker 1: some of which make more sense than others. But this guy, 81 00:03:52,320 --> 00:03:55,840 Speaker 1: Stephen Greenspan, is an author. He wrote a book. 82 00:03:55,880 --> 00:03:59,880 Speaker 1: He wrote the book on it, Annals of Gullibility: 83 00:04:00,600 --> 00:04:02,480 Speaker 1: Why We Get Duped and How to Avoid It. And 84 00:04:03,000 --> 00:04:05,800 Speaker 1: one sort of important thing he does upfront is say, hey, 85 00:04:05,800 --> 00:04:10,280 Speaker 1: there's a difference between credulity and gullibility. Credulity is if, 86 00:04:10,400 --> 00:04:13,320 Speaker 1: you know, you'll believe something just without looking at all 87 00:04:13,360 --> 00:04:17,800 Speaker 1: the evidence, and gullibility means you have an active response 88 00:04:18,320 --> 00:04:20,400 Speaker 1: to perhaps being conned. 89 00:04:21,360 --> 00:04:23,479 Speaker 2: I take issue with this right out of the gate. 90 00:04:23,560 --> 00:04:24,320 Speaker 1: I kind of think so too. 91 00:04:24,920 --> 00:04:27,880 Speaker 2: I think that's a terrible distinction because I think you 92 00:04:27,960 --> 00:04:31,480 Speaker 2: can totally fall for something and be duped. Yeah, and 93 00:04:32,000 --> 00:04:34,880 Speaker 2: you'd be the only person who knew that, who knows it. 94 00:04:34,960 --> 00:04:38,000 Speaker 2: You know, somebody could say something that duped you, and 95 00:04:37,720 --> 00:04:41,919 Speaker 2: they don't stop and focus to get like that, that 96 00:04:41,920 --> 00:04:44,080 Speaker 2: question of whether they duped you or not answered. 97 00:04:44,120 --> 00:04:46,760 Speaker 2: They just keep going on. But you know you've been duped. 98 00:04:46,880 --> 00:04:50,120 Speaker 2: You don't have to respond to a Nigerian prince email 99 00:04:50,279 --> 00:04:52,560 Speaker 2: or send somebody a bunch of Walmart cards to get 100 00:04:52,600 --> 00:04:56,720 Speaker 2: out of some random federal case that's against you. To 101 00:04:57,360 --> 00:05:02,039 Speaker 2: have been gullible, you just have to believe, and usually 102 00:05:02,080 --> 00:05:06,200 Speaker 2: in the absence of any kind of supporting evidence, and 103 00:05:06,240 --> 00:05:10,960 Speaker 2: sometimes in the presence of contradictory evidence. That's gullibility in 104 00:05:11,400 --> 00:05:17,200 Speaker 2: my, in my understanding. You're, you're believing something without 105 00:05:17,240 --> 00:05:21,400 Speaker 2: bothering to go check it out, and that to me 106 00:05:21,520 --> 00:05:23,040 Speaker 2: is the baseline of gullibility. 107 00:05:23,120 --> 00:05:26,000 Speaker 1: I totally agree. I thought that definition was really weird, 108 00:05:26,040 --> 00:05:28,000 Speaker 1: and I'm glad both are in here though, because sometimes 109 00:05:28,000 --> 00:05:30,880 Speaker 1: it's a nice contrast. But along the lines of what 110 00:05:30,920 --> 00:05:35,800 Speaker 1: you were saying, there's a group of researchers, social psychologists, 111 00:05:36,160 --> 00:05:38,960 Speaker 1: from Macquarie University. There can be a lot of Aussies 112 00:05:39,040 --> 00:05:39,240 Speaker 1: in this.
113 00:05:40,200 --> 00:05:42,880 Speaker 2: You can say that name better than that. Macquarie. 114 00:05:43,120 --> 00:05:46,120 Speaker 1: Oh, like Aussie style? Yeah, Macquarie. 115 00:05:48,080 --> 00:05:50,560 Speaker 2: Although anytime you do that you sound like Murray from 116 00:05:50,640 --> 00:05:52,040 Speaker 2: Flight of the Conchords. 117 00:05:51,760 --> 00:05:56,880 Speaker 1: Murray, present. Alessandra Teunisse. 118 00:05:57,000 --> 00:05:59,320 Speaker 2: Maybe that's what I'm going to find it. 119 00:05:59,279 --> 00:06:00,839 Speaker 1: Much in the way you and I would, and I 120 00:06:00,839 --> 00:06:02,599 Speaker 1: think a lot of people would, which is simply the 121 00:06:02,600 --> 00:06:05,120 Speaker 1: propensity to accept a false premise in the presence of 122 00:06:05,279 --> 00:06:11,719 Speaker 1: untrustworthiness cues. That's it. That's it. You don't have 123 00:06:11,760 --> 00:06:12,360 Speaker 1: to act on it. 124 00:06:12,839 --> 00:06:15,960 Speaker 2: No, you could just believe and no one in the 125 00:06:16,000 --> 00:06:19,160 Speaker 2: world could know besides you that you believed, and you're 126 00:06:19,240 --> 00:06:22,279 Speaker 2: still gullible in that sense. The thing that really 127 00:06:22,279 --> 00:06:24,479 Speaker 2: stood out to me that we'll talk about a lot more, though, 128 00:06:24,600 --> 00:06:27,840 Speaker 2: is you could make a really good case that people 129 00:06:27,920 --> 00:06:32,279 Speaker 2: aren't as gullible as other people think they are. And 130 00:06:32,320 --> 00:06:34,719 Speaker 2: I found that kind of reassuring. We'll talk about that later, 131 00:06:34,800 --> 00:06:37,680 Speaker 2: but I don't want anybody to get the impression that 132 00:06:37,720 --> 00:06:40,280 Speaker 2: we're just like, yep, people are generally stupid, yeah, and 133 00:06:40,320 --> 00:06:43,000 Speaker 2: here's how they fall for stupid stuff now and you're 134 00:06:43,080 --> 00:06:46,479 Speaker 2: probably stupid too. That's not actually what the science of 135 00:06:46,560 --> 00:06:47,880 Speaker 2: gullibility has turned up. 136 00:06:48,360 --> 00:06:50,160 Speaker 1: No. And there's a lot of factors, and this is 137 00:06:50,160 --> 00:06:52,200 Speaker 1: where I think Greenspan did kind of hit on something: 138 00:06:52,839 --> 00:06:57,279 Speaker 1: his four factors of gullibility. Situational: like if there's a 139 00:06:57,320 --> 00:06:59,280 Speaker 1: lot of, if everyone else is doing it, and there's 140 00:06:59,279 --> 00:07:01,839 Speaker 1: a lot of social pressure, like all the bros 141 00:07:01,839 --> 00:07:05,039 Speaker 1: are investing in the same cryptocurrency and it's at a 142 00:07:05,040 --> 00:07:06,400 Speaker 1: great price, and you're like, oh man, I got to 143 00:07:06,400 --> 00:07:07,760 Speaker 1: get in there. All the guys are, right, you know, 144 00:07:07,839 --> 00:07:10,440 Speaker 1: everyone's in on that. So there's social pressure where you 145 00:07:10,480 --> 00:07:15,920 Speaker 1: can fall for something. Cognitive issues, like, well, as we'll 146 00:07:15,920 --> 00:07:18,920 Speaker 1: get to later with, you know, our senior friends. Sometimes 147 00:07:18,960 --> 00:07:22,200 Speaker 1: there's like legit brain cognitive issues, that's a different thing 148 00:07:22,240 --> 00:07:26,240 Speaker 1: than this.
But this is just lacking expertise, and you know, 149 00:07:26,440 --> 00:07:29,520 Speaker 1: you can't evaluate what you're being told because you're just, 150 00:07:29,560 --> 00:07:32,200 Speaker 1: not, I don't want to say smart enough, you're just 151 00:07:32,240 --> 00:07:33,720 Speaker 1: not an expert in whatever that is. 152 00:07:34,160 --> 00:07:36,400 Speaker 2: Yeah, you're not informed enough in that particular thing. 153 00:07:36,520 --> 00:07:37,560 Speaker 1: Yeah, what else? 154 00:07:38,120 --> 00:07:40,720 Speaker 2: Personality is another one. If you're impulsive, this is a 155 00:07:40,720 --> 00:07:44,120 Speaker 2: big one, big one. If you're low in curiosity and you're like, 156 00:07:44,200 --> 00:07:46,560 Speaker 2: I don't care, just tell me. What difference? I'm too 157 00:07:46,640 --> 00:07:49,320 Speaker 2: lazy to go figure it out myself. Uh huh, I 158 00:07:49,400 --> 00:07:52,760 Speaker 2: got better things to do than think. Or if you 159 00:07:52,800 --> 00:07:55,280 Speaker 2: have a high need for independence. And this struck me 160 00:07:55,440 --> 00:07:58,600 Speaker 2: quite a bit because if you were, if you're independence-minded, 161 00:07:59,320 --> 00:08:04,680 Speaker 2: you don't need smarty-pants, pencil-neck college boys telling 162 00:08:04,760 --> 00:08:08,120 Speaker 2: you what's right or what's wrong, or what's true or 163 00:08:08,160 --> 00:08:11,760 Speaker 2: what's false. You can figure it out yourself. And those 164 00:08:11,800 --> 00:08:14,680 Speaker 2: people are actually at high risk of being duped, which 165 00:08:14,720 --> 00:08:17,760 Speaker 2: is really surprising, but if you stop and think about it, 166 00:08:17,760 --> 00:08:21,080 Speaker 2: it makes total sense. They're overconfident and that's a huge 167 00:08:21,120 --> 00:08:22,880 Speaker 2: factor in being gullible. 168 00:08:23,080 --> 00:08:26,400 Speaker 1: Yeah, I totally think it makes sense, you know, because 169 00:08:26,440 --> 00:08:27,480 Speaker 1: it happened to his cousin. 170 00:08:29,160 --> 00:08:29,640 Speaker 2: That's right. 171 00:08:31,360 --> 00:08:34,480 Speaker 1: Emotion can play a big factor in a lot of ways, 172 00:08:34,480 --> 00:08:36,280 Speaker 1: and we'll talk about some of those with some studies 173 00:08:36,320 --> 00:08:40,400 Speaker 1: later on, but one way is, like, let's say we're 174 00:08:40,440 --> 00:08:42,480 Speaker 1: specifically talking about being conned. If it gives you a 175 00:08:42,520 --> 00:08:46,280 Speaker 1: positive feeling, whether it's somebody catfishing you and making 176 00:08:46,280 --> 00:08:50,439 Speaker 1: you feel loved, or you know, some sort of financial 177 00:08:50,480 --> 00:08:52,480 Speaker 1: thing that you think might provide for your long term 178 00:08:52,520 --> 00:08:56,760 Speaker 1: security, or like, man, no one else knows about the 179 00:08:56,800 --> 00:08:58,760 Speaker 1: steal but me, I'm so smart for getting in on 180 00:08:58,800 --> 00:09:01,840 Speaker 1: the ground floor here, that kind of thing. 181 00:09:02,440 --> 00:09:06,000 Speaker 2: Right, and, strangely, ironically, almost as if he did 182 00:09:06,000 --> 00:09:08,760 Speaker 2: it on purpose, because it supports everything he wrote about.
183 00:09:09,440 --> 00:09:13,040 Speaker 2: Stephen Greenspan, the author of that book about gullibility, he 184 00:09:13,120 --> 00:09:16,000 Speaker 2: finished his book and shortly afterward he was informed by 185 00:09:16,080 --> 00:09:18,680 Speaker 2: I guess his stockbroker that he had lost a bunch 186 00:09:18,720 --> 00:09:22,960 Speaker 2: of a bunch of money by investing in Bernie Madoff's Ponzi scheme. 187 00:09:22,559 --> 00:09:24,440 Speaker 1: With the ironies, huh. 188 00:09:24,480 --> 00:09:26,640 Speaker 2: So he was like, even the guy that researched this 189 00:09:26,720 --> 00:09:29,160 Speaker 2: and wrote the book on gullibility can fall for it. 190 00:09:29,720 --> 00:09:33,080 Speaker 2: That's a really great little tidbit. Yeah, but I think 191 00:09:33,120 --> 00:09:38,480 Speaker 2: it also goes to show just how specific gullibility is, 192 00:09:39,000 --> 00:09:42,000 Speaker 2: because I don't get the impression that Stephen Greenspan was like, 193 00:09:42,320 --> 00:09:44,679 Speaker 2: this Madoff guy is making a lot of really great 194 00:09:44,720 --> 00:09:48,080 Speaker 2: points and this is incredibly high risk, but I'm going 195 00:09:48,160 --> 00:09:50,080 Speaker 2: to go along with it anyway. Like he went through 196 00:09:50,120 --> 00:09:52,640 Speaker 2: a stockbroker and everything. So, yeah, there's only a certain 197 00:09:52,640 --> 00:09:53,679 Speaker 2: amount of gullibility. 198 00:09:53,679 --> 00:09:54,160 Speaker 1: It's just. 199 00:09:55,679 --> 00:09:59,319 Speaker 2: Bernie Madoff is like shorthand for fooling people, you know 200 00:09:59,360 --> 00:10:03,080 Speaker 2: what I mean, not to pick on Stephen Greenspan 201 00:10:03,200 --> 00:10:04,040 Speaker 2: or anything like that. 202 00:10:04,200 --> 00:10:07,600 Speaker 1: Now I feel very bad for him despite his poor definition. 203 00:10:08,679 --> 00:10:12,960 Speaker 2: Right. So some other people have said, well, we really 204 00:10:13,000 --> 00:10:15,079 Speaker 2: want to show off as social psychologists, we're going to 205 00:10:15,120 --> 00:10:19,280 Speaker 2: create a gullibility scale, and in fact, Alessandra Teunisse 206 00:10:19,600 --> 00:10:23,240 Speaker 2: from Macquarie University, I'm not even gonna try that one. 207 00:10:24,120 --> 00:10:30,480 Speaker 2: But it's Australian for university. Sorry, Australians. There's this beer 208 00:10:30,559 --> 00:10:33,960 Speaker 2: called Fosters that here in America we think you drink 209 00:10:34,000 --> 00:10:38,280 Speaker 2: a lot of. And in America the ad campaign says Fosters, 210 00:10:38,520 --> 00:10:40,280 Speaker 2: it's Australian for beer. 211 00:10:41,000 --> 00:10:43,440 Speaker 1: I love that you barely use an accent. You just 212 00:10:43,440 --> 00:10:46,280 Speaker 1: say it seriously and that gets the point across. 213 00:10:47,960 --> 00:10:49,160 Speaker 2: It makes people pay attention. 214 00:10:49,360 --> 00:10:50,960 Speaker 1: Australian for beer. 215 00:10:52,480 --> 00:10:54,640 Speaker 2: That's the best I can do. That's how I think 216 00:10:54,679 --> 00:10:55,640 Speaker 2: Australians talk. 217 00:10:56,520 --> 00:11:00,240 Speaker 1: Uh yeah. So this gullibility scale was self reported, 218 00:11:01,120 --> 00:11:05,280 Speaker 1: basically like, do you, self reported meaning, do you think, 219 00:11:05,360 --> 00:11:07,760 Speaker 1: do you perceive yourself as gullible? And do you 220 00:11:07,800 --> 00:11:11,800 Speaker 1: think others perceive you as gullible?
And then they, you know, 221 00:11:11,840 --> 00:11:14,319 Speaker 1: they filled in with some other questions like how persuadable 222 00:11:14,360 --> 00:11:16,800 Speaker 1: are you? And stuff like that, and it actually, for 223 00:11:16,840 --> 00:11:19,560 Speaker 1: a self reported study, which you know a lot of 224 00:11:19,559 --> 00:11:22,840 Speaker 1: those can be tough. This seemed to work out pretty 225 00:11:22,880 --> 00:11:25,080 Speaker 1: good for them, don't you think? It did. 226 00:11:25,120 --> 00:11:28,679 Speaker 2: Because they backed it up. They, I can't remember what 227 00:11:28,720 --> 00:11:32,640 Speaker 2: it's called, but they tested the validity. They tested the 228 00:11:32,720 --> 00:11:37,520 Speaker 2: validity of this self reporting panel and found that the 229 00:11:37,600 --> 00:11:42,360 Speaker 2: people who reported themselves or scored the highest on gullibility 230 00:11:42,520 --> 00:11:45,600 Speaker 2: on this test were more likely to click a link 231 00:11:45,640 --> 00:11:48,760 Speaker 2: on a phishing email than people who scored low. Right, 232 00:11:48,800 --> 00:11:51,400 Speaker 2: So it seems like a valid test. And one of 233 00:11:51,440 --> 00:11:53,160 Speaker 2: the things I went and looked up, Chuck, and 234 00:11:53,200 --> 00:11:55,360 Speaker 2: one of the questions wasn't even a question. It was, 235 00:11:55,400 --> 00:11:59,080 Speaker 2: you are very persuadable. And the only option to check 236 00:11:59,200 --> 00:12:02,200 Speaker 2: was yes. Ah, what? I'm kidding. 237 00:12:03,120 --> 00:12:05,199 Speaker 1: Oh man, this is so upsetting. 238 00:12:06,160 --> 00:12:08,640 Speaker 2: That's all right, You're not at one hundred percent at all. 239 00:12:08,640 --> 00:12:10,440 Speaker 1: I didn't think you would take advantage of this today. 240 00:12:11,920 --> 00:12:14,880 Speaker 2: It was more of a joke. I wasn't trying to 241 00:12:14,920 --> 00:12:17,480 Speaker 2: take advantage of you, although I realize now that I did. 242 00:12:17,600 --> 00:12:22,440 Speaker 1: It's okay. All for the show. On that scale, they 243 00:12:22,480 --> 00:12:25,839 Speaker 1: found some traits that were common among those that scored 244 00:12:25,920 --> 00:12:29,480 Speaker 1: high in gullibility. Low social intelligence was one of them, that'll 245 00:12:29,520 --> 00:12:33,719 Speaker 1: keep coming back over and over. Vulnerability, emotionality, which we've 246 00:12:33,720 --> 00:12:36,240 Speaker 1: talked about a little bit, a weak sense of self, 247 00:12:36,920 --> 00:12:40,640 Speaker 1: which also comes up in different ways. I think you 248 00:12:40,679 --> 00:12:43,040 Speaker 1: found an article about how parents can wreck kids by 249 00:12:43,040 --> 00:12:45,360 Speaker 1: not giving them self confidence, right, and they'll end up gullible. 250 00:12:45,679 --> 00:12:49,160 Speaker 2: Yeah, pretty much, and depending. And it doesn't even 251 00:12:49,200 --> 00:12:51,240 Speaker 2: have to be like you're such a stupid kid every day. 252 00:12:51,240 --> 00:12:53,960 Speaker 2: It can just be things like where your opinion is 253 00:12:54,000 --> 00:12:57,040 Speaker 2: not really heard or validated, or just all sorts of 254 00:12:57,080 --> 00:12:59,800 Speaker 2: little missteps that parents can make that make parenting a 255 00:12:59,760 --> 00:13:03,600 Speaker 2: little nightmare you can carry on as an adult.
And 256 00:13:03,640 --> 00:13:05,959 Speaker 2: it can make you doubt your own opinion, so you're 257 00:13:06,000 --> 00:13:08,880 Speaker 2: not gonna speak up. It can make you be afraid 258 00:13:08,920 --> 00:13:12,000 Speaker 2: of looking stupid, so you don't ask questions because you 259 00:13:12,000 --> 00:13:14,760 Speaker 2: don't want to seem like, I don't, I didn't immediately 260 00:13:14,800 --> 00:13:16,400 Speaker 2: get it, so I'm gonna look dumb if I ask 261 00:13:16,520 --> 00:13:20,080 Speaker 2: these questions. There's like, it just sets you up for 262 00:13:20,679 --> 00:13:23,600 Speaker 2: being more likely to be a victim of being duped 263 00:13:23,880 --> 00:13:26,040 Speaker 2: than somebody who has a lot of confidence. 264 00:13:26,360 --> 00:13:29,280 Speaker 1: Yeah, I have a good friend who had a pretty 265 00:13:29,320 --> 00:13:34,120 Speaker 1: bad stepfather, and the abuse in this situation was exclusively 266 00:13:34,440 --> 00:13:37,599 Speaker 1: he made him feel stupid at every opportunity. 267 00:13:37,720 --> 00:13:39,760 Speaker 2: That is so wrong. Like, he should be in jail. 268 00:13:40,800 --> 00:13:44,920 Speaker 1: He's passed on now. But it's, I can't think of 269 00:13:44,960 --> 00:13:46,839 Speaker 1: any, I mean, there are all kinds of things that 270 00:13:46,880 --> 00:13:50,560 Speaker 1: are worse, obviously, but something so damaging for such a 271 00:13:50,559 --> 00:13:54,080 Speaker 1: small person to do that to a child. Yeah, and 272 00:13:54,120 --> 00:13:56,720 Speaker 1: literally like, oh, you think so, like, you know, just 273 00:13:56,880 --> 00:13:58,319 Speaker 1: that's how he was talked to his whole life 274 00:13:58,320 --> 00:14:02,080 Speaker 1: growing up. It's awful. That is rough. And he's super gullible. 275 00:14:03,000 --> 00:14:03,760 Speaker 2: Oh is he really? 276 00:14:03,920 --> 00:14:04,800 Speaker 1: No? Actually, I don't know. 277 00:14:05,520 --> 00:14:08,120 Speaker 2: Oh, you got me back there. We should just do 278 00:14:08,200 --> 00:14:10,160 Speaker 2: that to one another, like every day or two. 279 00:14:10,800 --> 00:14:12,560 Speaker 1: One thing we should mention though, because this pops up 280 00:14:12,559 --> 00:14:14,760 Speaker 1: a couple of times and I think it's super fascinating, 281 00:14:15,200 --> 00:14:17,680 Speaker 1: is another trait they found on the gullibility scale, if 282 00:14:17,720 --> 00:14:22,280 Speaker 1: you're very gullible, was belief in paranormal activity. Yeah, just 283 00:14:22,320 --> 00:14:23,040 Speaker 1: park it right there. 284 00:14:24,080 --> 00:14:26,800 Speaker 2: But I guess that depends on whether paranormal activity is 285 00:14:26,840 --> 00:14:27,320 Speaker 2: real or not. 286 00:14:27,400 --> 00:14:28,720 Speaker 1: You know, well, I guess so. 287 00:14:29,960 --> 00:14:32,840 Speaker 2: I mean that's described from a point of view where 288 00:14:32,840 --> 00:14:36,560 Speaker 2: you're just like, that's all fake anyway. So yeah, duh. 289 00:14:36,920 --> 00:14:39,720 Speaker 2: One of the things about social intelligence that's worth pointing out. 290 00:14:39,760 --> 00:14:44,760 Speaker 2: So that's basically a package that you can have.
Some 291 00:14:44,760 --> 00:14:47,040 Speaker 2: people are much better at it than others, but basically 292 00:14:47,080 --> 00:14:50,920 Speaker 2: everyone alive in a society has some degree or other 293 00:14:51,320 --> 00:14:54,360 Speaker 2: of this package of skills that forms social intelligence. Like 294 00:14:55,040 --> 00:14:58,680 Speaker 2: whether or not you're good at conversation, whether you are 295 00:14:58,720 --> 00:15:02,480 Speaker 2: good at effective listening, what your knowledge of like social 296 00:15:02,600 --> 00:15:06,200 Speaker 2: roles and social scripts are, and then awareness of like 297 00:15:06,240 --> 00:15:09,160 Speaker 2: what makes other people tick, and then what people think 298 00:15:09,200 --> 00:15:11,120 Speaker 2: of you. And you put all this together, and if 299 00:15:11,160 --> 00:15:14,480 Speaker 2: you have like high emotional or social intelligence, you're going 300 00:15:14,520 --> 00:15:19,440 Speaker 2: to be able to navigate interactions with other people much 301 00:15:19,600 --> 00:15:22,720 Speaker 2: better than somebody with low social intelligence. Part of that is 302 00:15:22,760 --> 00:15:25,360 Speaker 2: not getting scammed by somebody by being able to be like, 303 00:15:25,440 --> 00:15:27,920 Speaker 2: you're a scammer and I'm not going to send you 304 00:15:27,960 --> 00:15:29,240 Speaker 2: a Walmart gift card now. 305 00:15:29,440 --> 00:15:32,000 Speaker 1: Yeah, And it's a trait I think that you can't 306 00:15:32,000 --> 00:15:34,920 Speaker 1: necessarily teach, but is really beneficial to have as a human. 307 00:15:35,360 --> 00:15:38,200 Speaker 2: Yeah. I admire people with high social intelligence because it's 308 00:15:38,200 --> 00:15:42,720 Speaker 2: not just, you know, being able to spot a scammer, 309 00:15:42,760 --> 00:15:45,200 Speaker 2: it's being able to see the best in other people, 310 00:15:45,240 --> 00:15:46,880 Speaker 2: and I think to bring out the best in other 311 00:15:46,920 --> 00:15:48,720 Speaker 2: people and let them bring out the best in you. 312 00:15:48,840 --> 00:15:52,600 Speaker 2: And that's just, it's, maybe in another life, maybe in 313 00:15:52,600 --> 00:15:53,440 Speaker 2: the next lifetime. 314 00:15:53,680 --> 00:15:57,760 Speaker 1: Oh buddy, I think you're great. They did another study 315 00:15:57,800 --> 00:16:02,360 Speaker 1: at the University of Leicester where they found that childhood 316 00:16:02,400 --> 00:16:05,320 Speaker 1: traumas can really affect you later in life in terms 317 00:16:05,360 --> 00:16:08,480 Speaker 1: of gullibility, like any kind of bullying, death of a 318 00:16:08,480 --> 00:16:11,600 Speaker 1: family member or something like that. It leaves you more 319 00:16:12,040 --> 00:16:16,600 Speaker 1: susceptible to fall for tricks later in life. And apparently 320 00:16:16,640 --> 00:16:19,120 Speaker 1: they say it could be because that kind of trauma 321 00:16:19,200 --> 00:16:21,720 Speaker 1: just makes it hard to trust your own judgments and, 322 00:16:22,400 --> 00:16:24,800 Speaker 1: you know, I guess everyone else's intent. 323 00:16:25,480 --> 00:16:28,840 Speaker 2: For sure. And then some people, because it's actually kind 324 00:16:28,840 --> 00:16:30,840 Speaker 2: of counterintuitive, if you think, if you've gone through the 325 00:16:30,840 --> 00:16:33,280 Speaker 2: school of hard knocks, I think is the way that 326 00:16:33,320 --> 00:16:35,560 Speaker 2: the study put it.
Yeah, you could think that they'd 327 00:16:35,560 --> 00:16:40,000 Speaker 2: come out like much more world-weary and like suspicious 328 00:16:40,000 --> 00:16:41,840 Speaker 2: of people, and so they'd be less likely to fall 329 00:16:41,880 --> 00:16:44,680 Speaker 2: for a scam. But no, instead, like you said, they just, 330 00:16:44,760 --> 00:16:47,040 Speaker 2: they question their own judgment for having gone through what 331 00:16:47,080 --> 00:16:47,720 Speaker 2: they went through. 332 00:16:47,720 --> 00:16:48,640 Speaker 1: So that's terrible. 333 00:16:49,040 --> 00:16:52,560 Speaker 2: It is, It is very terrible. Childhood is just fraught. 334 00:16:52,880 --> 00:16:54,560 Speaker 1: You know, it really is. 335 00:16:54,920 --> 00:16:57,080 Speaker 2: It's a wonder any of us can function in any 336 00:16:57,240 --> 00:16:58,240 Speaker 2: like real way. 337 00:16:58,560 --> 00:17:01,280 Speaker 1: Oh I know. I mean we're pretty good parents, but 338 00:17:01,360 --> 00:17:04,080 Speaker 1: I often think like, how are we messing her up? 339 00:17:04,200 --> 00:17:06,240 Speaker 1: Because I know we are in some way. 340 00:17:06,600 --> 00:17:09,120 Speaker 2: Yep, I mean I can't, I can't imagine. Like that's 341 00:17:09,119 --> 00:17:10,959 Speaker 2: got to just keep you up at nights sometimes if 342 00:17:10,960 --> 00:17:13,160 Speaker 2: you think about it too much. You know, I sleep 343 00:17:13,200 --> 00:17:16,600 Speaker 2: pretty good. Good, you just wake up to throw up 344 00:17:16,640 --> 00:17:17,160 Speaker 2: every hour. 345 00:17:18,160 --> 00:17:20,160 Speaker 1: Yeah, I think, just try to limit that stuff as 346 00:17:20,160 --> 00:17:22,800 Speaker 1: a parent. Like, there's, you can't be perfect. I mean, 347 00:17:22,920 --> 00:17:24,960 Speaker 1: my brother is a perfect parent, but there's only, there's 348 00:17:24,960 --> 00:17:27,920 Speaker 1: only one. 349 00:17:28,160 --> 00:17:28,600 Speaker 2: Scott. 350 00:17:29,080 --> 00:17:31,440 Speaker 1: Another thing I thought was interesting, and this makes total sense, 351 00:17:31,520 --> 00:17:34,520 Speaker 1: is if you rely on your intuition a lot, you're 352 00:17:34,520 --> 00:17:37,600 Speaker 1: a lot more vulnerable to being duped by something, 353 00:17:37,800 --> 00:17:41,560 Speaker 1: just like, you know, some people have a good gut, 354 00:17:41,960 --> 00:17:43,600 Speaker 1: and some people think they have a good gut but 355 00:17:43,720 --> 00:17:44,040 Speaker 1: do not. 356 00:17:45,280 --> 00:17:48,040 Speaker 2: Yes. Another one that really stood out to me, though, 357 00:17:48,080 --> 00:17:50,560 Speaker 2: that, this I would not have predicted, is, the 358 00:17:50,560 --> 00:17:53,800 Speaker 2: more cynical you are, studies have found, the 359 00:17:54,119 --> 00:17:58,080 Speaker 2: likelier you are to be gullible or duped. And 360 00:17:58,119 --> 00:18:02,359 Speaker 2: the reason why actually makes total sense. Again, if you're cynical, 361 00:18:02,640 --> 00:18:05,159 Speaker 2: you think you've got everything figured out, like you're just, 362 00:18:05,480 --> 00:18:07,800 Speaker 2: you think the world sucks and everybody's trying to take 363 00:18:07,840 --> 00:18:10,720 Speaker 2: advantage of you, and the government's constantly screwing you over, 364 00:18:11,400 --> 00:18:14,800 Speaker 2: and everyone's going to try to get an angle on you. 365 00:18:15,240 --> 00:18:19,119 Speaker 2: That's cynicism, right, at least in the modern sense.
And 366 00:18:20,200 --> 00:18:24,520 Speaker 2: it's actually a lazy shortcut to experiencing reality because on 367 00:18:24,560 --> 00:18:27,800 Speaker 2: the one hand, you lose out on opportunity costs, you 368 00:18:27,840 --> 00:18:29,760 Speaker 2: miss a lot of great stuff. Like you might not 369 00:18:29,880 --> 00:18:32,159 Speaker 2: make friends that you could have made because you were 370 00:18:32,160 --> 00:18:35,480 Speaker 2: suspicious of this stranger chatting you up at the outset 371 00:18:35,600 --> 00:18:36,199 Speaker 2: or something like that. 372 00:18:36,280 --> 00:18:37,320 Speaker 1: Yeah. 373 00:18:37,240 --> 00:18:41,960 Speaker 2: But as far as gullibility goes, if somebody comes along and talks to 374 00:18:42,040 --> 00:18:44,959 Speaker 2: you in your language, they can pull one over on 375 00:18:45,000 --> 00:18:49,480 Speaker 2: you much more easily because they are tapping into your cynicism, 376 00:18:49,640 --> 00:18:54,000 Speaker 2: which again is just lazy shorthand for experiencing reality. It's 377 00:18:54,160 --> 00:18:58,800 Speaker 2: based largely on intuition and supposition and not necessarily taking 378 00:18:58,840 --> 00:19:01,919 Speaker 2: each experience and looking at it, based on the facts, 379 00:19:02,000 --> 00:19:05,320 Speaker 2: as a unique thing. It all has this one cast 380 00:19:05,480 --> 00:19:08,440 Speaker 2: to it that's the same, and that's just not how 381 00:19:08,480 --> 00:19:09,679 Speaker 2: the world actually works. 382 00:19:10,080 --> 00:19:12,040 Speaker 1: Yeah, and I think, you know, that kind of suggests 383 00:19:12,040 --> 00:19:17,159 Speaker 1: that if there's like a country with an authoritarian leader 384 00:19:17,280 --> 00:19:21,880 Speaker 1: in place, like the simple sort of easy to understand 385 00:19:21,960 --> 00:19:25,000 Speaker 1: radical solutions that are pitched out oftentimes in those situations 386 00:19:25,480 --> 00:19:28,600 Speaker 1: are very easy to fall for if you're a gullible person, 387 00:19:28,640 --> 00:19:32,359 Speaker 1: because that itself is a mental shortcut. Well, we 388 00:19:32,440 --> 00:19:32,960 Speaker 1: just got to do this. 389 00:19:32,960 --> 00:19:38,560 Speaker 2: For sure. And then conversely too, not being cynical 390 00:19:38,680 --> 00:19:44,480 Speaker 2: requires way more brain power and thought and just participation 391 00:19:45,359 --> 00:19:48,520 Speaker 2: than being cynical does. Like you have to actually like 392 00:19:48,680 --> 00:19:53,159 Speaker 2: ask yourself like, is this true? What kind of source 393 00:19:53,280 --> 00:19:55,280 Speaker 2: is this coming from? I might need to go do 394 00:19:55,359 --> 00:19:58,000 Speaker 2: some research. I might need to ask people. It's just 395 00:19:58,040 --> 00:20:00,560 Speaker 2: so much easier to be like, nope, they're trying to screw me over, 396 00:20:00,640 --> 00:20:02,600 Speaker 2: I don't even need to bother to look into that, 397 00:20:03,280 --> 00:20:05,880 Speaker 2: because you're also defending yourself at the same time from 398 00:20:05,880 --> 00:20:09,159 Speaker 2: getting taken advantage of again, until somebody comes along and 399 00:20:09,240 --> 00:20:12,680 Speaker 2: is talking your language, and then you will oftentimes fall 400 00:20:12,720 --> 00:20:13,720 Speaker 2: for whatever they're saying.
401 00:20:13,960 --> 00:20:18,000 Speaker 1: Yeah, should we take a break? Yeah, all right, we'll 402 00:20:18,000 --> 00:20:22,359 Speaker 1: take a break and talk about mood right after this. 403 00:20:47,720 --> 00:20:49,440 Speaker 1: All right, we're back. We promised to talk a little 404 00:20:49,440 --> 00:20:52,879 Speaker 1: bit about mood because the fact is you are not 405 00:20:53,400 --> 00:20:58,600 Speaker 1: always gullible or always not gullible. Everybody could get duped 406 00:20:58,680 --> 00:21:01,359 Speaker 1: at any time. You know, that changes from day 407 00:21:01,400 --> 00:21:03,520 Speaker 1: to day, sometimes from hour to hour, depending on a 408 00:21:03,520 --> 00:21:06,560 Speaker 1: lot of factors, like mood. If you're really really tired, 409 00:21:06,600 --> 00:21:10,960 Speaker 1: if you're super distracted, if you're upset, you may not 410 00:21:11,119 --> 00:21:14,720 Speaker 1: notice something that can, you know, make you fall for 411 00:21:14,760 --> 00:21:17,879 Speaker 1: a scam. Also, the same holds if you're in a 412 00:21:17,880 --> 00:21:20,960 Speaker 1: really good mood. You know, if you're just feeling great 413 00:21:20,960 --> 00:21:24,160 Speaker 1: about everything, you're like, yeah, yes to life, yes to everything. 414 00:21:25,080 --> 00:21:26,919 Speaker 1: There was a study in nineteen thirty eight by a 415 00:21:26,960 --> 00:21:31,480 Speaker 1: researcher named Gregory Razran who found that giving a free 416 00:21:31,520 --> 00:21:35,520 Speaker 1: lunch made people more receptive to a political message. And 417 00:21:35,800 --> 00:21:39,480 Speaker 1: apparently that is sort of where like the sales lunch started. 418 00:21:40,359 --> 00:21:42,760 Speaker 1: Taking people out to sell them something and feeding them, 419 00:21:43,119 --> 00:21:45,840 Speaker 1: you're more likely to close a deal. And I'm sure 420 00:21:45,960 --> 00:21:50,320 Speaker 1: the same thing with like golf course sales, things like, the 421 00:21:50,359 --> 00:21:52,960 Speaker 1: salesperson's not out there beating the person in golf that 422 00:21:52,960 --> 00:21:55,719 Speaker 1: they're selling to. I guarantee it. I don't know how 423 00:21:55,720 --> 00:21:58,879 Speaker 1: that works, but I imagine you're letting them win and feel 424 00:21:58,880 --> 00:21:59,679 Speaker 1: good about stuff. 425 00:22:00,280 --> 00:22:02,199 Speaker 2: Yeah, think about how good you have to be to 426 00:22:02,280 --> 00:22:03,879 Speaker 2: purposely lose at golf. 427 00:22:05,000 --> 00:22:07,200 Speaker 1: Oh, I could play bad golf on purpose and I'm not. 428 00:22:07,359 --> 00:22:11,280 Speaker 2: Really? Yeah, okay, well I take that one back. And 429 00:22:11,320 --> 00:22:16,879 Speaker 2: on accident. So yes. But on the contrary, if you 430 00:22:16,960 --> 00:22:20,119 Speaker 2: are upset, if you're sad, if you're depressed, if 431 00:22:20,119 --> 00:22:25,320 Speaker 2: you're mad, if you're in a low mood, you are 432 00:22:25,440 --> 00:22:28,840 Speaker 2: actually more likely to pay attention to granular things. I 433 00:22:28,880 --> 00:22:32,719 Speaker 2: think it actually kind of ties into rumination. You're just 434 00:22:32,840 --> 00:22:35,960 Speaker 2: thinking about stuff. You're turned inwards.
So if somebody comes 435 00:22:35,960 --> 00:22:38,200 Speaker 2: along and tries to sell you something, yeah, that makes sense, 436 00:22:38,200 --> 00:22:39,840 Speaker 2: it's going to be harder to slip it past you 437 00:22:39,880 --> 00:22:43,880 Speaker 2: because you're paying attention more than somebody who's like, yeah, whatever, 438 00:22:43,960 --> 00:22:45,240 Speaker 2: let's have another round. 439 00:22:45,600 --> 00:22:50,040 Speaker 1: Right. So overall, if you think about people who might 440 00:22:50,080 --> 00:22:53,240 Speaker 1: be gullible, you might think, and you know, if you're 441 00:22:53,240 --> 00:22:57,080 Speaker 1: going to stereotype it, like people like kids, very young people, 442 00:22:58,119 --> 00:23:01,840 Speaker 1: very old people, and people that aren't very well educated obviously, 443 00:23:02,400 --> 00:23:08,080 Speaker 1: but it's not necessarily true. There's a lot 444 00:23:08,119 --> 00:23:12,400 Speaker 1: of factors, one of which I mentioned earlier. You can 445 00:23:12,440 --> 00:23:14,520 Speaker 1: get, you know, a lot of skewed studies about the 446 00:23:14,520 --> 00:23:18,320 Speaker 1: gullibility of someone who's older, because if you're older, you're 447 00:23:18,320 --> 00:23:21,200 Speaker 1: more likely to have a cognitive disability that's literally keeping 448 00:23:21,240 --> 00:23:24,960 Speaker 1: you from being able to determine whether something is true. 449 00:23:25,840 --> 00:23:28,280 Speaker 1: But they've also conversely found that sometimes they're a little 450 00:23:28,320 --> 00:23:31,840 Speaker 1: more protected because they constantly have their children and everyone 451 00:23:31,840 --> 00:23:34,040 Speaker 1: else saying like, no, no, no, watch out for scams. They're 452 00:23:34,040 --> 00:23:35,959 Speaker 1: trying to scam you. Everyone's trying to scam you. 453 00:23:36,080 --> 00:23:39,200 Speaker 2: Right, Yeah, so it's like a self fulfilling prophecy that 454 00:23:39,440 --> 00:23:43,639 Speaker 2: they are less likely to be scammed because they're so vigilant. 455 00:23:43,880 --> 00:23:45,040 Speaker 2: That's amazing to me. 456 00:23:45,320 --> 00:23:45,960 Speaker 1: Yeah. 457 00:23:46,240 --> 00:23:48,960 Speaker 2: So there was this one study that kind of backed 458 00:23:48,960 --> 00:23:53,120 Speaker 2: all this up from the University of Toronto, and they 459 00:23:53,240 --> 00:23:56,600 Speaker 2: found, they looked at adults sixty to ninety who handled 460 00:23:56,600 --> 00:24:00,000 Speaker 2: their own finances. They didn't have any diagnosed cognitive issues, 461 00:24:00,119 --> 00:24:05,720 Speaker 2: and they found that people who had reported being 462 00:24:05,880 --> 00:24:09,720 Speaker 2: victims of a fraud, there was nothing that really stood out, 463 00:24:10,160 --> 00:24:15,000 Speaker 2: there was no characteristic, demographically, anything like that, that 464 00:24:15,119 --> 00:24:19,880 Speaker 2: made them different from anybody else. The only thing that 465 00:24:20,040 --> 00:24:24,359 Speaker 2: seemed to really kind of stick out was that the 466 00:24:24,600 --> 00:24:29,800 Speaker 2: people who had been scammed before had 467 00:24:29,840 --> 00:24:35,280 Speaker 2: low conscientiousness, one of the Big Five.
They were lower on honesty-humility, 468 00:24:35,760 --> 00:24:39,320 Speaker 2: which is another kind of personality trait from a 469 00:24:39,320 --> 00:24:44,720 Speaker 2: different scale, and, from what I could see, the 470 00:24:44,840 --> 00:24:49,000 Speaker 2: honesty thing means, they explained it like, if you 471 00:24:49,359 --> 00:24:53,719 Speaker 2: are low on honesty, you're more likely to try something 472 00:24:53,760 --> 00:24:56,680 Speaker 2: that might be a scam because you might get rich 473 00:24:56,800 --> 00:24:58,520 Speaker 2: quick or something like that. You're more willing to take 474 00:24:58,560 --> 00:25:02,000 Speaker 2: a shortcut, say, than somebody who would score higher on honesty, 475 00:25:02,040 --> 00:25:05,600 Speaker 2: which puts you at greater risk. But that was 476 00:25:05,720 --> 00:25:08,119 Speaker 2: about it. There wasn't like, you know, the older you 477 00:25:08,240 --> 00:25:11,119 Speaker 2: get or the less educated you are in this group, 478 00:25:11,560 --> 00:25:13,800 Speaker 2: you're more likely to get scammed. It was some other 479 00:25:13,840 --> 00:25:19,359 Speaker 2: stuff entirely. But they found also that people who do 480 00:25:20,400 --> 00:25:26,160 Speaker 2: experience cognitive decline do tend to get taken advantage of more, 481 00:25:26,400 --> 00:25:30,600 Speaker 2: which is really messed up, sad, but it's true. And 482 00:25:30,640 --> 00:25:33,040 Speaker 2: as a matter of fact, they've started to, some people 483 00:25:33,119 --> 00:25:35,440 Speaker 2: have started to push this idea like, if you fall 484 00:25:35,520 --> 00:25:39,640 Speaker 2: for a scam, you should immediately be tested for Alzheimer's 485 00:25:39,680 --> 00:25:44,080 Speaker 2: or dementia because there's a high correlation with getting scammed 486 00:25:44,119 --> 00:25:48,080 Speaker 2: as an older person and the early developments of 487 00:25:48,200 --> 00:25:51,639 Speaker 2: cognitive decline. Yes, you've got to feel terrible. I mean, 488 00:25:51,680 --> 00:25:53,840 Speaker 2: it's bad enough to feel like you're getting scammed, but 489 00:25:53,920 --> 00:25:56,600 Speaker 2: then to stop and be like, well, is this it 490 00:25:56,640 --> 00:25:57,600 Speaker 2: for me in my mind? 491 00:25:58,080 --> 00:26:01,840 Speaker 1: Yeah? Absolutely, I mean thankfully, nothing like that's ever happened 492 00:26:01,840 --> 00:26:05,000 Speaker 1: to my parents. But you hear the stories all 493 00:26:05,040 --> 00:26:07,399 Speaker 1: the time, and it's just, you know, it's tragic and 494 00:26:07,560 --> 00:26:11,080 Speaker 1: shameful for sure. There was a study in twenty eighteen 495 00:26:11,119 --> 00:26:13,840 Speaker 1: that I thought was pretty interesting by a woman named Monica 496 00:26:13,880 --> 00:26:20,120 Speaker 1: Whitty, another Aussie, when we talk about like being catfished, 497 00:26:20,240 --> 00:26:22,159 Speaker 1: which is, I guess I threw that word out, 498 00:26:22,200 --> 00:26:24,760 Speaker 1: assuming everyone knows that. That's like when you get scammed 499 00:26:24,800 --> 00:26:29,080 Speaker 1: in sort of a romantic thing online by someone 500 00:26:29,080 --> 00:26:30,639 Speaker 1: who's pretending to be someone they're not. 501 00:26:30,800 --> 00:26:33,400 Speaker 2: Generally. We should do an episode on that sometime because 502 00:26:33,480 --> 00:26:36,040 Speaker 2: I just don't, I don't.
I mean I get it, 503 00:26:36,119 --> 00:26:38,880 Speaker 2: but I don't understand like where it started or anything 504 00:26:38,960 --> 00:26:39,239 Speaker 2: like that. 505 00:26:39,560 --> 00:26:42,520 Speaker 1: Yeah, let's put that down. That would be super interesting. Okay, 506 00:26:42,560 --> 00:26:44,200 Speaker 1: do you remember the Notre Dame football player? 507 00:26:44,800 --> 00:26:47,920 Speaker 2: Yeah? I thought he was. Isn't he like the Dolphins quarterback? 508 00:26:47,560 --> 00:26:48,480 Speaker 1: Now, or no? 509 00:26:48,920 --> 00:26:49,080 Speaker 2: He? 510 00:26:49,160 --> 00:26:50,760 Speaker 1: I don't think he's in the league anymore. He played 511 00:26:50,760 --> 00:26:52,000 Speaker 1: the NFL for a little while, but he was a 512 00:26:52,000 --> 00:26:55,440 Speaker 1: linebacker for Notre Dame that, oh gotcha, was famously catfished, 513 00:26:55,440 --> 00:26:59,040 Speaker 1: and like, you know, smart, handsome, young athlete guy. So 514 00:26:59,119 --> 00:27:02,159 Speaker 1: it's not like just, you know, the lonely loser in 515 00:27:02,160 --> 00:27:03,719 Speaker 1: the basement that falls for stuff like that. 516 00:27:04,359 --> 00:27:06,040 Speaker 2: Have you heard about the lonesome loser? 517 00:27:06,200 --> 00:27:12,000 Speaker 1: Yeah, he still keeps on trying, man. Little River Band, so good. 518 00:27:13,560 --> 00:27:16,520 Speaker 1: In twenty eighteen, Monica Whitty did one on sort of 519 00:27:16,520 --> 00:27:19,199 Speaker 1: catfishing, but really just romance scams is what they called it, 520 00:27:19,960 --> 00:27:21,600 Speaker 1: and she said, if you fall for something like that, 521 00:27:21,640 --> 00:27:24,840 Speaker 1: you obviously will be a little more impulsive and sensation seeking. 522 00:27:26,119 --> 00:27:28,160 Speaker 1: And so if someone's building up all these great 523 00:27:28,160 --> 00:27:31,560 Speaker 1: stories and these big travels, and you know, it's always, 524 00:27:31,600 --> 00:27:33,239 Speaker 1: it's never just like, wow, I just kind of sit 525 00:27:33,320 --> 00:27:36,040 Speaker 1: around at home, like they always present themselves as offering 526 00:27:36,080 --> 00:27:40,760 Speaker 1: some new, exciting life, it seems like. But she also 527 00:27:40,800 --> 00:27:45,040 Speaker 1: found that they were more highly educated than average, and Livia, 528 00:27:45,240 --> 00:27:46,919 Speaker 1: I think, is on the money, kind of speculates that 529 00:27:47,000 --> 00:27:50,320 Speaker 1: could be, and I think it's true. When we did 530 00:27:50,359 --> 00:27:52,919 Speaker 1: our thing on online dating, it's generally people that are 531 00:27:52,920 --> 00:27:57,120 Speaker 1: college educated that participate in online dating a little more, statistically. 532 00:27:57,280 --> 00:28:01,480 Speaker 1: But also maybe that if you're more educated, you just 533 00:28:01,480 --> 00:28:04,080 Speaker 1: think like, I'm not going to fall for catfishing. I 534 00:28:04,080 --> 00:28:06,560 Speaker 1: know all about that, and this is not that. Right, 535 00:28:06,920 --> 00:28:09,200 Speaker 1: overconfidence, right, and then you're on that hook. 536 00:28:10,040 --> 00:28:12,200 Speaker 2: And then another thing about being online too.
The Better 537 00:28:12,200 --> 00:28:15,600 Speaker 2: Business Bureau back in twenty fifteen, I think, they looked 538 00:28:15,640 --> 00:28:20,719 Speaker 2: at, I guess, a bunch of their like scam 539 00:28:20,760 --> 00:28:24,040 Speaker 2: complaints that came in just to see who reported them, 540 00:28:24,080 --> 00:28:27,439 Speaker 2: and they found that people between twenty five and thirty 541 00:28:27,480 --> 00:28:31,080 Speaker 2: five were more likely to lose money on a scam 542 00:28:31,119 --> 00:28:35,000 Speaker 2: than older people, which is totally contrary to what people 543 00:28:35,040 --> 00:28:37,120 Speaker 2: think of when they think of people who get scammed. 544 00:28:37,880 --> 00:28:41,560 Speaker 2: And one of the explanations that they came up with is, 545 00:28:41,760 --> 00:28:45,280 Speaker 2: in part, younger people are just online more, so they're 546 00:28:45,320 --> 00:28:50,080 Speaker 2: just more likely, by the numbers, to have scams presented 547 00:28:50,120 --> 00:28:52,720 Speaker 2: to them, which means that they're more likely to probably 548 00:28:52,960 --> 00:28:55,360 Speaker 2: go for a scam than, say, people who are online 549 00:28:55,480 --> 00:28:56,240 Speaker 2: less, right. 550 00:28:57,440 --> 00:28:59,360 Speaker 1: I agree with that, in the old days. I think 551 00:28:59,360 --> 00:29:02,600 Speaker 1: that's changing because I've never seen a generation as phone 552 00:29:02,600 --> 00:29:05,360 Speaker 1: addicted as boomers are with smartphones. 553 00:29:05,400 --> 00:29:05,960 Speaker 2: Oh really? Good. 554 00:29:06,160 --> 00:29:09,560 Speaker 1: Oh man, they have, a lot of boomers do, 555 00:29:09,960 --> 00:29:13,400 Speaker 1: they have Gen Z beat. Every boomer I know just 556 00:29:13,520 --> 00:29:17,120 Speaker 1: obsessively stares at their phone and looks things up and. 557 00:29:18,160 --> 00:29:21,120 Speaker 2: Yeah, I thought they all had like flip phones that 558 00:29:21,320 --> 00:29:22,920 Speaker 2: only dial numbers. 559 00:29:23,360 --> 00:29:26,320 Speaker 1: No, no, no, they want to show you all the 560 00:29:26,320 --> 00:29:27,920 Speaker 1: information in the moment. 561 00:29:28,520 --> 00:29:28,920 Speaker 2: I got it. 562 00:29:29,000 --> 00:29:30,320 Speaker 1: Yeah, right in the middle of dinner at a 563 00:29:30,400 --> 00:29:31,200 Speaker 1: nice restaurant, even. 564 00:29:31,480 --> 00:29:33,920 Speaker 2: I guess I've not experienced 565 00:29:33,280 --> 00:29:38,320 Speaker 1: that. But I do think that that generally is true. Okay, 566 00:29:39,840 --> 00:29:41,280 Speaker 1: I just want to take a shot at boomers. 567 00:29:41,600 --> 00:29:43,920 Speaker 2: Well, then that makes it, that makes it even less 568 00:29:44,000 --> 00:29:47,560 Speaker 2: understandable that twenty five to thirty five year olds would 569 00:29:47,600 --> 00:29:50,880 Speaker 2: be more likely to be scammed. I don't know, maybe 570 00:29:50,880 --> 00:29:53,400 Speaker 2: that generation is just more trusting these days or something 571 00:29:53,480 --> 00:29:56,440 Speaker 2: like that, or, actually, I got to take that back, because, 572 00:29:56,440 --> 00:29:59,920 Speaker 2: as we'll see, trusting, being trusting is not necessarily correlated 573 00:30:00,200 --> 00:30:01,080 Speaker 2: with being gullible. 574 00:30:01,640 --> 00:30:04,640 Speaker 1: Yeah, which I think we'll get to in a minute 575 00:30:04,680 --> 00:30:06,800 Speaker 1: before or after the next break.
But can we talk 576 00:30:06,840 --> 00:30:09,760 Speaker 1: about science, because this is one thing. When I sent 577 00:30:09,800 --> 00:30:12,120 Speaker 1: Livia the idea, I was like, I think I read 578 00:30:12,120 --> 00:30:16,080 Speaker 1: an article about scientists being gullible, and I was like, no, 579 00:30:16,200 --> 00:30:19,640 Speaker 1: not scientists. But it turns out they very much can 580 00:30:19,720 --> 00:30:23,480 Speaker 1: be, because a lot of times when you are that 581 00:30:25,520 --> 00:30:29,920 Speaker 1: well versed in a field, you might, you might kind 582 00:30:29,920 --> 00:30:32,200 Speaker 1: of think you know it all and like, oh no, 583 00:30:32,280 --> 00:30:34,680 Speaker 1: I know what I'm doing, and so you might be 584 00:30:35,040 --> 00:30:40,719 Speaker 1: more apt to believe a result that isn't accurate because 585 00:30:40,800 --> 00:30:43,320 Speaker 1: you think you did it the right way. Like that's 586 00:30:43,320 --> 00:30:44,200 Speaker 1: just one aspect of it. 587 00:30:44,680 --> 00:30:48,080 Speaker 2: Yeah. Another aspect is, like you said, people, people in 588 00:30:48,240 --> 00:30:51,280 Speaker 2: science typically know a tremendous amount about their field, but 589 00:30:51,360 --> 00:30:55,320 Speaker 2: they can make a mistake and think that that understanding, 590 00:30:55,720 --> 00:30:58,920 Speaker 2: that depth of understanding, will just apply to other fields 591 00:30:58,960 --> 00:31:01,600 Speaker 2: as well, that they just don't know as much about. Yeah, 592 00:31:01,640 --> 00:31:04,280 Speaker 2: and that's another way they can fall prey to it. 593 00:31:04,320 --> 00:31:07,600 Speaker 2: But also scientists like to be right as much as 594 00:31:07,600 --> 00:31:11,600 Speaker 2: anybody else. And, you know, I don't remember what episode 595 00:31:11,640 --> 00:31:14,960 Speaker 2: we did this in, and I think it was about the 596 00:31:14,960 --> 00:31:20,440 Speaker 2: reproducibility crisis in science papers, if I remember correctly. 597 00:31:20,920 --> 00:31:25,720 Speaker 2: But just how like the scientists don't set up experiments 598 00:31:25,760 --> 00:31:28,640 Speaker 2: to disprove their hypothesis. They set them up to prove 599 00:31:28,680 --> 00:31:31,600 Speaker 2: their hypothesis. That's how you get published, that's how you 600 00:31:31,640 --> 00:31:35,120 Speaker 2: get celebrated. Like nobody wants to hear about you failing, 601 00:31:35,760 --> 00:31:38,960 Speaker 2: even though that's what science is meant to be. That's 602 00:31:39,000 --> 00:31:41,400 Speaker 2: a, that's a part of it as well, just wanting 603 00:31:41,520 --> 00:31:43,680 Speaker 2: to be right. So if somebody comes along, it's like, yep, 604 00:31:43,880 --> 00:31:47,320 Speaker 2: you're right, let's, let's uh, let's use that to explain 605 00:31:47,400 --> 00:31:50,280 Speaker 2: this other thing that's actually not true. The scientists might 606 00:31:50,280 --> 00:31:52,040 Speaker 2: go along with it because if it is true, then 607 00:31:52,120 --> 00:31:55,960 Speaker 2: it will prove their, their hypothesis and make them very famous, 608 00:31:56,000 --> 00:31:59,840 Speaker 2: and they'll probably end up having an HBO movie made. 609 00:32:01,120 --> 00:32:04,720 Speaker 1: Well, that was probably the scientific method one. Huh.
610 00:32:04,880 --> 00:32:08,680 Speaker 2: Maybe, maybe. But I mean, we definitely talked about papers, 611 00:32:08,720 --> 00:32:12,040 Speaker 2: some of them, just being outright fraudulent because 612 00:32:12,080 --> 00:32:14,840 Speaker 2: their experiments are set up incorrectly. It could have been 613 00:32:14,920 --> 00:32:15,760 Speaker 2: the Scientific Method one. 614 00:32:16,040 --> 00:32:19,800 Speaker 1: Yeah, or like the little student in Rushmore that faked 615 00:32:19,840 --> 00:32:20,360 Speaker 1: the results. 616 00:32:21,800 --> 00:32:22,880 Speaker 2: I remember that part. 617 00:32:23,440 --> 00:32:26,240 Speaker 1: You know, Max has his sort of little budding girlfriend 618 00:32:26,280 --> 00:32:29,520 Speaker 1: at the end, and he says something about how she won 619 00:32:29,600 --> 00:32:31,120 Speaker 1: some science award, and I think she had to give 620 00:32:31,120 --> 00:32:32,760 Speaker 1: it back or something. He's like, why? And 621 00:32:33,120 --> 00:32:35,760 Speaker 1: she said, I faked the results. It didn't work, so 622 00:32:35,800 --> 00:32:36,400 Speaker 1: I faked it. 623 00:32:36,800 --> 00:32:39,760 Speaker 2: Yeah. I thought her line where she 624 00:32:39,920 --> 00:32:42,320 Speaker 2: tells Bill Murray that she won't dance with him 625 00:32:42,360 --> 00:32:44,240 Speaker 2: was a little out of nowhere. 626 00:32:44,560 --> 00:32:47,480 Speaker 1: Oh, interesting. Yeah, I get that. 627 00:32:47,520 --> 00:32:49,680 Speaker 2: A little harsh, I think, is what I'm trying to say. 628 00:32:49,840 --> 00:32:52,320 Speaker 1: Yeah, who wouldn't want to dance with Bill Murray? 629 00:32:52,560 --> 00:32:52,920 Speaker 2: I do. 630 00:32:54,360 --> 00:32:55,200 Speaker 1: You? And Lucy Liu? 631 00:32:56,200 --> 00:32:59,000 Speaker 2: All right, wait, Lucy Liu doesn't, or does, want to 632 00:32:59,080 --> 00:32:59,960 Speaker 2: dance with Bill Murray? 633 00:33:00,120 --> 00:33:02,200 Speaker 1: No, I don't think she does. They were on Charlie's 634 00:33:02,200 --> 00:33:03,680 Speaker 1: Angels together and had some words. 635 00:33:04,080 --> 00:33:05,440 Speaker 2: Oh, that's right, I remember that. 636 00:33:05,720 --> 00:33:09,040 Speaker 1: Yeah, so I doubt she's dancing with Bill. Okay, all right, 637 00:33:09,040 --> 00:33:10,520 Speaker 1: should we take a break? Wait? 638 00:33:10,560 --> 00:33:12,200 Speaker 2: Just before we go to a break, I was 639 00:33:12,200 --> 00:33:15,360 Speaker 2: saying I would like to dance with Bill Murray. Oh yeah, okay. 640 00:33:15,360 --> 00:33:16,920 Speaker 2: I just want to make sure that no one walks 641 00:33:16,960 --> 00:33:19,240 Speaker 2: away to this ad break thinking that I don't want 642 00:33:19,280 --> 00:33:20,240 Speaker 2: to dance with Bill Murray. 643 00:33:20,400 --> 00:33:23,720 Speaker 1: Yeah, I was being sort of opposite with my 644 00:33:23,840 --> 00:33:26,960 Speaker 1: Lucy Liu joke. Okay, you know I'm not firing 645 00:33:26,960 --> 00:33:28,320 Speaker 1: on all cylinders. I'm doing my best. 646 00:33:28,760 --> 00:33:29,960 Speaker 2: I'm not either, apparently. 647 00:33:30,480 --> 00:33:32,640 Speaker 1: All right, we'll be right back, and Josh will lead 648 00:33:32,680 --> 00:33:45,040 Speaker 1: off with a little bit on trust. 649 00:34:00,120 --> 00:34:04,240 Speaker 2: Okay, we're back, everybody.
And I mentioned before that trust 650 00:34:04,320 --> 00:34:08,840 Speaker 2: is not necessarily correlated with gullibility, and I love that. 651 00:34:08,840 --> 00:34:11,319 Speaker 2: That just makes me feel good about the world again. 652 00:34:11,600 --> 00:34:14,960 Speaker 1: You know, you can trust people and think the best 653 00:34:15,000 --> 00:34:16,400 Speaker 1: of people and still not be gullible. 654 00:34:16,800 --> 00:34:20,600 Speaker 2: Yeah, and so we'll kind of explain why. But there 655 00:34:20,680 --> 00:34:23,960 Speaker 2: have been study after study after study that basically say, yeah, 656 00:34:24,080 --> 00:34:26,439 Speaker 2: that's actually true. Like, you can have a high level 657 00:34:26,440 --> 00:34:28,839 Speaker 2: of trust, be tested for that kind of thing, and 658 00:34:28,880 --> 00:34:31,280 Speaker 2: you are not more likely to be gullible. And in fact, 659 00:34:31,840 --> 00:34:35,600 Speaker 2: it seems that if you are a higher-trusting person, 660 00:34:35,640 --> 00:34:39,239 Speaker 2: you're actually less likely to be gullible compared to, say, 661 00:34:39,280 --> 00:34:43,240 Speaker 2: the cynic, right? Like, there's this researcher named Toshio 662 00:34:43,480 --> 00:34:47,640 Speaker 2: Yamagishi, who's considered one of the most prominent researchers in 663 00:34:48,160 --> 00:34:53,680 Speaker 2: gullibility and trust, out of Hokkaido University. I know how 664 00:34:53,680 --> 00:34:55,640 Speaker 2: to say Hokkaido. I don't know why I had trouble 665 00:34:55,680 --> 00:34:58,040 Speaker 2: with that at first. But one of the things that 666 00:34:58,320 --> 00:35:02,319 Speaker 2: Yamagishi did in the nineteen nineties was to tell 667 00:35:02,440 --> 00:35:05,359 Speaker 2: people who scored high in trustingness, and other people 668 00:35:05,400 --> 00:35:10,000 Speaker 2: who scored low, about the story of Bill. And Chuck, 669 00:35:10,040 --> 00:35:11,920 Speaker 2: I think you should take it, because Bill's got a 670 00:35:11,960 --> 00:35:12,760 Speaker 2: great story. 671 00:35:13,120 --> 00:35:15,920 Speaker 1: Yeah, I kind of understand this, but not one hundred percent. 672 00:35:16,000 --> 00:35:17,759 Speaker 1: But I think I get it. So what he would 673 00:35:17,800 --> 00:35:20,920 Speaker 1: say is: your friend Bill stayed at a hotel 674 00:35:20,960 --> 00:35:24,319 Speaker 1: for a week, but he was only charged for one day. Do 675 00:35:24,360 --> 00:35:26,759 Speaker 1: you think he would tell the cashier about this, even 676 00:35:26,800 --> 00:35:28,960 Speaker 1: though, let's say, there's no chance 677 00:35:28,960 --> 00:35:31,000 Speaker 1: of him getting caught later on? Do you think he 678 00:35:31,040 --> 00:35:34,759 Speaker 1: would do that? And people who scored high on their 679 00:35:34,960 --> 00:35:38,799 Speaker 1: trust score, like people who were trusting of others, were more 680 00:35:38,920 --> 00:35:41,000 Speaker 1: likely to say that Bill would do the honest thing. 681 00:35:41,560 --> 00:35:44,400 Speaker 1: But then he added in a twist here, which is 682 00:35:44,480 --> 00:35:47,200 Speaker 1: to tell them some negative things about Bill, like, by 683 00:35:47,239 --> 00:35:50,160 Speaker 1: the way, just want to let you know, Bill also 684 00:35:50,200 --> 00:35:51,359 Speaker 1: cut in line the other day. 685 00:35:51,680 --> 00:35:53,760 Speaker 2: He also makes his stepson feel stupid.
686 00:35:53,880 --> 00:35:57,120 Speaker 1: He also makes his stepson feel stupid. But if 687 00:35:57,160 --> 00:35:59,440 Speaker 1: he added in a couple of nuggets like that, negative 688 00:35:59,480 --> 00:36:02,600 Speaker 1: things about Bill, the people who had high trust in 689 00:36:02,680 --> 00:36:06,320 Speaker 1: people generally put a lot more weight on that additional 690 00:36:06,360 --> 00:36:10,879 Speaker 1: information than the other people did, the people who were 691 00:36:10,880 --> 00:36:12,799 Speaker 1: low in trust. 692 00:36:12,280 --> 00:36:16,360 Speaker 2: Right. But the bottom line was, even with that added information, 693 00:36:16,719 --> 00:36:20,239 Speaker 2: like Bill littered, or he also cut in line, if 694 00:36:20,239 --> 00:36:22,040 Speaker 2: you took all of the tallies, you would see that 695 00:36:22,080 --> 00:36:25,000 Speaker 2: people who are low in trusting others and people who 696 00:36:25,040 --> 00:36:27,600 Speaker 2: are high in trusting others had about the same scores. 697 00:36:28,200 --> 00:36:33,000 Speaker 2: So this research from Yamagishi and others shows that 698 00:36:33,560 --> 00:36:36,279 Speaker 2: you can trust other people and it doesn't open you 699 00:36:36,440 --> 00:36:40,040 Speaker 2: up to being taken advantage of. And that just doesn't seem to 700 00:36:40,040 --> 00:36:44,160 Speaker 2: make any sense, because the very idea of being gullible 701 00:36:44,239 --> 00:36:47,239 Speaker 2: means that you're trusting what somebody else is saying. That's 702 00:36:47,320 --> 00:36:51,440 Speaker 2: the popular conception of it. But as we've seen, really 703 00:36:51,480 --> 00:36:55,000 Speaker 2: the idea of gullibility is trusting what somebody says because 704 00:36:55,040 --> 00:36:57,560 Speaker 2: you don't care enough to go figure it out yourself, or 705 00:36:58,480 --> 00:37:02,640 Speaker 2: because you don't feel like thinking for yourself, or because what 706 00:37:02,680 --> 00:37:07,799 Speaker 2: they're saying confirms your biased beliefs, not that you just 707 00:37:07,960 --> 00:37:11,399 Speaker 2: trust people in general. And the explanation that I saw 708 00:37:11,440 --> 00:37:14,000 Speaker 2: that really kind of drives it home for me, Chuck, 709 00:37:14,640 --> 00:37:19,920 Speaker 2: is that people who have high trust are also more discerning, 710 00:37:21,200 --> 00:37:24,200 Speaker 2: so they probably have better social intelligence than 711 00:37:24,239 --> 00:37:27,040 Speaker 2: people who don't trust as much. And that makes sense, 712 00:37:27,040 --> 00:37:29,920 Speaker 2: because if you don't trust people, like the cynic, you're 713 00:37:29,960 --> 00:37:33,400 Speaker 2: actually protecting yourself. You're guarding yourself.
You know that you 714 00:37:33,400 --> 00:37:37,359 Speaker 2: are probably not as discerning as other people, and so, 715 00:37:37,640 --> 00:37:40,120 Speaker 2: rather than get yourself into trouble time and time again, 716 00:37:40,360 --> 00:37:42,520 Speaker 2: you just keep people at arm's length; you don't really 717 00:37:42,520 --> 00:37:46,600 Speaker 2: trust them. Whereas if you are high trusting, you are 718 00:37:46,600 --> 00:37:50,680 Speaker 2: better at discerning. And that either means that, because you're 719 00:37:50,680 --> 00:37:53,719 Speaker 2: good at discerning, you have the freedom to trust other 720 00:37:53,760 --> 00:37:56,160 Speaker 2: people, because you can be confident in your judgment of 721 00:37:56,200 --> 00:37:59,200 Speaker 2: other people and you're probably not going to be taken 722 00:37:59,200 --> 00:38:02,440 Speaker 2: advantage of. Or, if you are just a trusting person 723 00:38:02,480 --> 00:38:05,440 Speaker 2: by nature, you have to have higher discernment or 724 00:38:05,480 --> 00:38:08,120 Speaker 2: else you're going to be taken advantage of. Either way, 725 00:38:08,600 --> 00:38:10,879 Speaker 2: high discernment and high trust go hand in hand. 726 00:38:11,320 --> 00:38:13,440 Speaker 1: Yeah, and that can be a very freeing thing. And 727 00:38:13,440 --> 00:38:15,840 Speaker 1: that's how Yamagishi sort of thought about it when he 728 00:38:15,880 --> 00:38:21,520 Speaker 1: talked about his emancipation theory, which is: if you're untrusting, 729 00:38:21,560 --> 00:38:24,719 Speaker 1: you're 730 00:38:24,800 --> 00:38:27,680 Speaker 1: kind of shackled in a way, because you may 731 00:38:27,800 --> 00:38:31,560 Speaker 1: just be stuck in a place. Because why hire 732 00:38:31,680 --> 00:38:33,520 Speaker 1: a different person to do it? They're just going 733 00:38:33,560 --> 00:38:36,120 Speaker 1: to be a scammer too. And so you can get 734 00:38:36,160 --> 00:38:38,960 Speaker 1: stuck in this cycle. But if you free yourself from 735 00:38:39,000 --> 00:38:42,680 Speaker 1: that, per his emancipation theory, and you break those shackles 736 00:38:42,719 --> 00:38:45,319 Speaker 1: and you start trusting people, it makes you much more 737 00:38:45,360 --> 00:38:48,600 Speaker 1: apt to make a positive change in life, because you 738 00:38:48,719 --> 00:38:51,600 Speaker 1: trust somebody or something or some situation. 739 00:38:52,480 --> 00:38:55,080 Speaker 2: Yeah, because, at base, you can go through life not 740 00:38:55,120 --> 00:38:57,319 Speaker 2: trusting other people, and you can make it all the 741 00:38:57,320 --> 00:39:00,880 Speaker 2: way to old age and die at pretty much 742 00:39:00,880 --> 00:39:03,800 Speaker 2: the same age that you would have had you trusted people. 743 00:39:04,200 --> 00:39:07,880 Speaker 2: But again, you're missing out. There are opportunity costs to 744 00:39:08,200 --> 00:39:11,439 Speaker 2: not trusting other people that people who do trust other 745 00:39:11,480 --> 00:39:14,759 Speaker 2: people are not paying, and you're just not 746 00:39:15,080 --> 00:39:19,280 Speaker 2: as connected socially. And research after research after research shows 747 00:39:19,280 --> 00:39:22,680 Speaker 2: that social connections are like the number one predictor of 748 00:39:22,760 --> 00:39:26,040 Speaker 2: living to a healthy older age.
So you're actually robbing 749 00:39:26,080 --> 00:39:29,080 Speaker 2: yourself by just not trusting other people. But again, it's 750 00:39:29,160 --> 00:39:32,600 Speaker 2: kind of understandable if you were taught that your judgment 751 00:39:33,000 --> 00:39:37,000 Speaker 2: is questionable, whether through trauma, through a jerk stepdad, or whatever. 752 00:39:37,719 --> 00:39:41,880 Speaker 2: It's understandable. And I'm not sure if that's something that 753 00:39:41,920 --> 00:39:44,400 Speaker 2: you can learn to break out of, although I sincerely 754 00:39:44,400 --> 00:39:44,920 Speaker 2: hope it is. 755 00:39:45,560 --> 00:39:48,600 Speaker 1: Yeah, for sure. There are people that think we are 756 00:39:48,600 --> 00:39:52,000 Speaker 1: actually not as gullible as everyone thinks. There's this writer, 757 00:39:52,200 --> 00:39:56,200 Speaker 1: Hugo Mercier, who wrote a book in twenty twenty called 758 00:39:56,280 --> 00:39:59,440 Speaker 1: Not Born Yesterday. Great title for a book like that. 759 00:40:00,120 --> 00:40:04,320 Speaker 1: And he's like, people are less gullible than we think, 760 00:40:05,560 --> 00:40:07,719 Speaker 1: and there are a lot of criteria people use 761 00:40:07,719 --> 00:40:11,319 Speaker 1: to work out whether they believe something or not, 762 00:40:11,400 --> 00:40:13,640 Speaker 1: and we're better at it than we all think we 763 00:40:13,680 --> 00:40:17,800 Speaker 1: are. In 764 00:40:17,880 --> 00:40:21,840 Speaker 1: his idea, most people are actually checking whether information comes from a well-informed 765 00:40:23,280 --> 00:40:28,160 Speaker 1: or well-intentioned source, or whether it has logic 766 00:40:28,200 --> 00:40:32,560 Speaker 1: to it, whether it's logically strong. And maybe 767 00:40:32,680 --> 00:40:36,160 Speaker 1: people are less like this, which is, I'm not going to 768 00:40:36,239 --> 00:40:39,200 Speaker 1: accept something as a new piece of information because it's 769 00:40:39,239 --> 00:40:42,399 Speaker 1: not something that I have found to be true. 770 00:40:42,480 --> 00:40:45,560 Speaker 1: He 771 00:40:45,680 --> 00:40:47,919 Speaker 1: argues that people are less like that than people say. 772 00:40:48,480 --> 00:40:53,400 Speaker 2: Yeah, and people also judge other people to be more 773 00:40:53,640 --> 00:40:56,040 Speaker 2: likely to be duped than they themselves are, more gullible than 774 00:40:56,040 --> 00:40:58,400 Speaker 2: they are. But yeah, his whole message is like, no, 775 00:40:58,440 --> 00:41:01,240 Speaker 2: we're actually, as a group, as a species, 776 00:41:01,360 --> 00:41:06,239 Speaker 2: not all that gullible. What appears to be gullibility is 777 00:41:06,280 --> 00:41:09,560 Speaker 2: actually just somebody not caring enough to argue a point, 778 00:41:09,920 --> 00:41:15,040 Speaker 2: or they're accepting information but hanging on to it loosely. Olivia, 779 00:41:15,120 --> 00:41:18,399 Speaker 2: I thought this was awesome. She pointed out that if 780 00:41:18,400 --> 00:41:22,680 Speaker 2: you are shown, like, an AI-generated baby peacock that 781 00:41:22,760 --> 00:41:25,440 Speaker 2: looks super cute and has huge eyes and is colorful, 782 00:41:25,440 --> 00:41:27,839 Speaker 2: and it is nothing like what a baby peacock really 783 00:41:27,880 --> 00:41:32,080 Speaker 2: looks like.
If you're not like a peacock researcher, or 784 00:41:32,480 --> 00:41:36,920 Speaker 2: your job doesn't depend on positively identifying baby peacocks, it 785 00:41:36,920 --> 00:41:39,640 Speaker 2: doesn't really matter if you think that's what they 786 00:41:39,680 --> 00:41:43,080 Speaker 2: look like, because you're holding onto it loosely enough that 787 00:41:43,120 --> 00:41:45,640 Speaker 2: if somebody comes along and says that's not what 788 00:41:45,719 --> 00:41:48,279 Speaker 2: baby peacocks actually look like, you're not going to, like, 789 00:41:48,400 --> 00:41:50,520 Speaker 2: that's not the hill you're going to die on. You're 790 00:41:50,520 --> 00:41:53,120 Speaker 2: going to be like, oh, that's crazy what AI can do, 791 00:41:53,239 --> 00:41:55,719 Speaker 2: or, oh, it got me, or just be like, great, 792 00:41:56,000 --> 00:42:00,439 Speaker 2: I now know what baby peacocks look like. And that's 793 00:42:00,480 --> 00:42:03,840 Speaker 2: his point: that's not gullibility. That's just not stopping 794 00:42:04,200 --> 00:42:06,799 Speaker 2: to analyze, you know, whether it's true or not, because 795 00:42:06,840 --> 00:42:08,760 Speaker 2: it just isn't that important right then. 796 00:42:08,640 --> 00:42:11,920 Speaker 1: Yeah, exactly. He also points out in the book, 797 00:42:12,320 --> 00:42:15,360 Speaker 1: when it comes to propaganda, that propaganda isn't something 798 00:42:15,400 --> 00:42:20,600 Speaker 1: that can usually completely change someone's mind. What propaganda 799 00:42:20,680 --> 00:42:23,040 Speaker 1: is good at is taking someone who already has those 800 00:42:23,080 --> 00:42:27,120 Speaker 1: beliefs and putting them on turbo speed, reinforcing them. 801 00:42:27,719 --> 00:42:31,760 Speaker 1: Even the Nazi propaganda machine, you know, he contends, 802 00:42:31,840 --> 00:42:36,160 Speaker 1: probably wasn't making someone anti-Semitic, but if you were 803 00:42:36,200 --> 00:42:39,359 Speaker 1: anti-Semitic, then it really drove you down that road 804 00:42:39,480 --> 00:42:40,640 Speaker 1: at a pretty fast pace. 805 00:42:41,080 --> 00:42:43,880 Speaker 2: Yeah, because it came at your beliefs and said, yep, 806 00:42:44,040 --> 00:42:47,319 Speaker 2: go for it. Like, that's the official line now, 807 00:42:47,440 --> 00:42:49,200 Speaker 2: anti-Semitism. 808 00:42:49,520 --> 00:42:49,840 Speaker 1: Yeah. 809 00:42:49,960 --> 00:42:54,560 Speaker 2: Yeah. And also, similarly, political ads don't really work. 810 00:42:55,040 --> 00:42:55,880 Speaker 1: That's what they say. 811 00:42:56,120 --> 00:42:58,120 Speaker 2: Yeah. And that makes me wonder, though, if that's just 812 00:42:58,400 --> 00:43:02,440 Speaker 2: being suspicious of the messenger because of polarization. You're 813 00:43:02,480 --> 00:43:06,200 Speaker 2: not gonna be like, hmm, let's hear what this opposing 814 00:43:06,280 --> 00:43:09,440 Speaker 2: political party has to say about Medicare, I'm really interested, 815 00:43:09,480 --> 00:43:11,879 Speaker 2: I'm gonna keep an open mind. No, it's like, this 816 00:43:11,960 --> 00:43:14,640 Speaker 2: message is from the opposing party, I'm just gonna laugh 817 00:43:14,719 --> 00:43:16,319 Speaker 2: at it because it's just so full of it. 818 00:43:16,960 --> 00:43:20,120 Speaker 1: Yeah.
I mean, I think political ads are terrible and 819 00:43:20,200 --> 00:43:24,279 Speaker 1: ridiculous and so overvalued, but I feel like these 820 00:43:24,360 --> 00:43:27,840 Speaker 1: days it's more just beating that drum 821 00:43:27,880 --> 00:43:29,759 Speaker 1: of, like, aren't you mad? Aren't you mad? Go vote! 822 00:43:29,800 --> 00:43:30,280 Speaker 1: Go vote! 823 00:43:30,320 --> 00:43:30,879 Speaker 2: I know, man. 824 00:43:31,280 --> 00:43:31,720 Speaker 1: Yeah. 825 00:43:31,880 --> 00:43:33,680 Speaker 2: The thing is, though, is this: none of this is 826 00:43:33,719 --> 00:43:38,120 Speaker 2: to say that people don't get scammed. There's 827 00:43:38,160 --> 00:43:42,280 Speaker 2: a group called the Global Anti-Scam Alliance, which sounds 828 00:43:42,280 --> 00:43:45,520 Speaker 2: like a scam itself, 829 00:43:45,560 --> 00:43:48,359 Speaker 2: doesn't it? They came up with a report that found 830 00:43:48,400 --> 00:43:54,160 Speaker 2: that, worldwide, people lose a trillion dollars to scams every year. Man, 831 00:43:54,520 --> 00:43:56,919 Speaker 2: that's a lot of money. But some of these same 832 00:43:56,960 --> 00:44:00,920 Speaker 2: researchers are like, hey, there's actually some short, easy 833 00:44:00,960 --> 00:44:04,440 Speaker 2: stuff you can walk around in your head with, to 834 00:44:04,640 --> 00:44:08,520 Speaker 2: apply to new information, to protect yourself from being 835 00:44:08,600 --> 00:44:10,360 Speaker 2: gulled, which is actually a word. 836 00:44:10,800 --> 00:44:11,839 Speaker 1: Let's hear it. Do you have a list? 837 00:44:12,280 --> 00:44:14,680 Speaker 2: Yeah. The first step is to 838 00:44:14,719 --> 00:44:17,719 Speaker 2: admit that you're as susceptible to being scammed as anybody else. 839 00:44:18,040 --> 00:44:20,479 Speaker 1: Okay, yeah, just a reality check. Yeah. 840 00:44:20,520 --> 00:44:23,359 Speaker 2: Well, it also puts the kibosh on being overconfident, which, 841 00:44:23,360 --> 00:44:25,560 Speaker 2: again, can increase your chance of being duped. 842 00:44:25,760 --> 00:44:26,520 Speaker 1: Yeah. Yeah. 843 00:44:26,600 --> 00:44:30,640 Speaker 2: Don't make emotional decisions, like we talked about; keep a 844 00:44:30,640 --> 00:44:35,000 Speaker 2: lid on impulsivity. Don't respond to, like, act now, supplies 845 00:44:35,040 --> 00:44:38,680 Speaker 2: are running out kinds of come-ons. Don't respond 846 00:44:38,680 --> 00:44:43,600 Speaker 2: to false scarcity. Like, remember people hoarding toilet paper? Yeah, oh yeah, 847 00:44:43,680 --> 00:44:46,560 Speaker 2: those are emotional decisions. You want to just stay cool 848 00:44:46,600 --> 00:44:49,600 Speaker 2: and level headed. Another one is ask questions, ask for 849 00:44:49,640 --> 00:44:53,040 Speaker 2: more information, don't be afraid to look dumb. That's a 850 00:44:53,480 --> 00:44:55,239 Speaker 2: big one. Yeah, that's a big one. And 851 00:44:55,280 --> 00:44:59,319 Speaker 2: then consider the source. Is there any supporting information? And 852 00:44:59,360 --> 00:45:02,759 Speaker 2: when you put it all together, you are probably going to 853 00:45:02,840 --> 00:45:06,920 Speaker 2: come up with a good decision or understanding. And if 854 00:45:06,920 --> 00:45:10,719 Speaker 2: you're being gulled by somebody, that's a real word, you 855 00:45:11,040 --> 00:45:15,200 Speaker 2: are probably going to say, I don't believe what you're saying.
You, sir, 856 00:45:15,400 --> 00:45:18,160 Speaker 2: are a cad and a scoundrel. Right? Please get out 857 00:45:18,200 --> 00:45:20,160 Speaker 2: of my face before I smack you with my glove 858 00:45:20,520 --> 00:45:20,839 Speaker 2: and we 859 00:45:20,840 --> 00:45:25,040 Speaker 1: have to duel. We get these. I'm sure anyone who works 860 00:45:25,040 --> 00:45:27,360 Speaker 1: at big companies gets these, and maybe even small companies 861 00:45:27,400 --> 00:45:29,760 Speaker 1: do this. But when they send out the test, 862 00:45:29,960 --> 00:45:33,480 Speaker 1: like the test phishing emails, and then, like, the next day, 863 00:45:34,080 --> 00:45:36,200 Speaker 1: you'll get an email that's like, did you fall for it? 864 00:45:36,600 --> 00:45:36,839 Speaker 2: Right? 865 00:45:37,280 --> 00:45:39,239 Speaker 1: I'm always nervous. I'm like, oh God, did 866 00:45:39,280 --> 00:45:41,759 Speaker 1: I click on that thing from, you know, 867 00:45:42,040 --> 00:45:48,759 Speaker 1: Facebook dot gold dot au? It's usually there in the 868 00:45:48,800 --> 00:45:50,040 Speaker 1: email address, you know. 869 00:45:50,280 --> 00:45:52,080 Speaker 2: Well, at least the next day, when they send out 870 00:45:52,080 --> 00:45:54,000 Speaker 2: the email, they ask if you fell for it. They 871 00:45:54,000 --> 00:45:55,279 Speaker 2: don't show like a list with 872 00:45:55,280 --> 00:45:57,480 Speaker 1: pictures of all the people who did fall for it. 873 00:45:57,719 --> 00:46:00,160 Speaker 1: They should do that, just pictures of everyone. 874 00:46:01,400 --> 00:46:02,160 Speaker 2: You got anything else? 875 00:46:02,600 --> 00:46:04,160 Speaker 1: I got nothing else. I think you know we did 876 00:46:04,200 --> 00:46:05,319 Speaker 1: a pretty good job on this one. 877 00:46:06,200 --> 00:46:10,600 Speaker 2: I agree, and that is no fooling. And if you 878 00:46:10,640 --> 00:46:13,160 Speaker 2: want to know more about gullibility, go do some research 879 00:46:13,200 --> 00:46:15,480 Speaker 2: yourself on it. That's kind of the point of not 880 00:46:15,600 --> 00:46:18,319 Speaker 2: being gulled, which is a real word. And since I 881 00:46:18,360 --> 00:46:20,160 Speaker 2: said that, it's time for listener mail. 882 00:46:22,760 --> 00:46:25,839 Speaker 1: This is a great, current listener mail from yesterday's, I'm 883 00:46:25,880 --> 00:46:31,360 Speaker 1: sorry, Thursday's episode, rather, yesterday to us, on automats. Right? Hey, guys, 884 00:46:31,640 --> 00:46:33,520 Speaker 1: two friends and I gave each other a graduation present 885 00:46:33,600 --> 00:46:35,680 Speaker 1: from high school in nineteen seventy and spent a week 886 00:46:35,960 --> 00:46:39,000 Speaker 1: by ourselves in New York, where we went to the Automat, 887 00:46:39,000 --> 00:46:41,479 Speaker 1: and it was still great in seventy. 888 00:46:41,640 --> 00:46:44,080 Speaker 1: This gets so good. Four years later, as a senior 889 00:46:44,120 --> 00:46:46,480 Speaker 1: in college, a group of us did an independent study 890 00:46:46,640 --> 00:46:49,640 Speaker 1: in humor and music as an excuse to do a 891 00:46:49,680 --> 00:46:53,520 Speaker 1: concert of Bach stuff. I got to be the soloist 892 00:46:53,560 --> 00:46:57,360 Speaker 1: in the Concerto for Horn and Hardart. Nice. And 893 00:46:57,400 --> 00:46:59,520 Speaker 1: he sent a video. Unfortunately, it was just audio.
I 894 00:46:59,520 --> 00:47:01,719 Speaker 1: mean, it sounded like a hoot, and really it was great, 895 00:47:01,800 --> 00:47:04,520 Speaker 1: but I wanted to see everything, because here's what they did. 896 00:47:04,760 --> 00:47:07,920 Speaker 1: This piece is for orchestra and also a table filled 897 00:47:07,920 --> 00:47:11,400 Speaker 1: with various household items to play. Ideally they should have 898 00:47:11,400 --> 00:47:14,399 Speaker 1: been picked out of an automat on stage in order 899 00:47:14,440 --> 00:47:17,600 Speaker 1: to play them. However, this was beyond our set construction abilities. 900 00:47:18,200 --> 00:47:21,360 Speaker 1: We did at least have 901 00:47:21,440 --> 00:47:24,960 Speaker 1: the recommended banner overhead reading, in Latin, Less Work for 902 00:47:25,080 --> 00:47:28,840 Speaker 1: Mother. Along with trying to master the rather challenging music, 903 00:47:29,080 --> 00:47:32,839 Speaker 1: it involved me running around Gettysburg with a pitch pipe, 904 00:47:32,880 --> 00:47:37,440 Speaker 1: trying to find bells, pots, ooga horns, and lots of 905 00:47:37,440 --> 00:47:41,200 Speaker 1: other items that played specific notes. This is so great. 906 00:47:41,360 --> 00:47:44,840 Speaker 1: Thanks for sparking those wonderful memories. I 907 00:47:44,880 --> 00:47:47,600 Speaker 1: discovered you during COVID and have been an extremely faithful 908 00:47:47,640 --> 00:47:51,760 Speaker 1: listener ever since. And that is from the Reverend Doctor 909 00:47:51,840 --> 00:47:57,719 Speaker 1: Mark Oldenburg, Steck-Miller Professor Emeritus of the Art of 910 00:47:57,760 --> 00:48:02,000 Speaker 1: Worship and the music chair at Gettysburg's United Lutheran Seminary. 911 00:48:02,600 --> 00:48:04,320 Speaker 1: Pronouns: him. Wow. 912 00:48:04,680 --> 00:48:06,640 Speaker 2: Also the most interesting person we know. 913 00:48:06,680 --> 00:48:11,240 Speaker 1: Now, totally. Reverend Mark, write in more, please. The Doctor 914 00:48:11,280 --> 00:48:15,560 Speaker 1: Reverend Mark. The Doctor Reverend Esquire. The Reverend Doctor. 915 00:48:15,760 --> 00:48:16,960 Speaker 2: Sorry, either way. 916 00:48:17,200 --> 00:48:18,440 Speaker 1: Yeah, pretty impressive. 917 00:48:18,840 --> 00:48:20,919 Speaker 2: Thanks a lot, Mark. I'm just gonna call you Mark 918 00:48:20,960 --> 00:48:22,520 Speaker 2: for now, because I feel like we're on a first 919 00:48:22,600 --> 00:48:25,680 Speaker 2: name basis. That was a great email, great story, and 920 00:48:25,760 --> 00:48:28,160 Speaker 2: if you want to see if you can top Mark, 921 00:48:28,640 --> 00:48:31,040 Speaker 2: you can send us an email too. Send it off 922 00:48:31,080 --> 00:48:37,040 Speaker 2: to Stuff Podcasts at iHeartRadio dot com. 923 00:48:37,200 --> 00:48:40,080 Speaker 1: Stuff You Should Know is a production of iHeartRadio. For 924 00:48:40,160 --> 00:48:44,319 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 925 00:48:44,440 --> 00:48:46,280 Speaker 1: or wherever you listen to your favorite shows.