1 00:00:01,440 --> 00:00:07,680 Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. 2 00:00:11,600 --> 00:00:14,240 Speaker 2: Hey, and welcome to the podcast. I'm Josh, and there's 3 00:00:14,400 --> 00:00:18,640 Speaker 2: Chuck, Charles W. Chuck Bryant, the best all around boy, 4 00:00:19,120 --> 00:00:22,840 Speaker 2: and there's Jerry Rowland. And this is Stuff You Should 5 00:00:22,920 --> 00:00:23,279 Speaker 2: Know about. 6 00:00:23,320 --> 00:00:26,480 Speaker 1: Really wish I hadn't have told you that. I think 7 00:00:26,480 --> 00:00:28,240 Speaker 1: I've told you that before, though, so you forgot the 8 00:00:28,240 --> 00:00:29,840 Speaker 1: first time, so maybe you'll forget again. 9 00:00:30,480 --> 00:00:32,919 Speaker 2: I can't. You know how people keep lists of, like, 10 00:00:33,120 --> 00:00:35,800 Speaker 2: episodes we say we should do, or movies we've mentioned, 11 00:00:35,840 --> 00:00:38,159 Speaker 2: or something like that. They should keep a list of 12 00:00:38,200 --> 00:00:41,160 Speaker 2: the different things that we've talked about and then completely 13 00:00:41,200 --> 00:00:43,920 Speaker 2: forgot we talked about, because I'm sure it would be extensive. 14 00:00:44,000 --> 00:00:44,800 Speaker 1: Yeah, big list. 15 00:00:46,120 --> 00:00:50,720 Speaker 2: So speaking of big lists, Chuck, we're talking today about priming, 16 00:00:51,600 --> 00:00:55,040 Speaker 2: which is the present tense of Primus, and what 17 00:00:55,080 --> 00:01:00,920 Speaker 2: it refers to is a psychological term where you 18 00:01:01,120 --> 00:01:05,040 Speaker 2: are prompted to respond in a certain way, behave in 19 00:01:05,080 --> 00:01:10,440 Speaker 2: a certain way, choose a certain selection based on some 20 00:01:11,120 --> 00:01:14,800 Speaker 2: prompt that was given to you without your knowledge. It 21 00:01:14,800 --> 00:01:17,520 Speaker 2: could either be flashed so fast on something like 22 00:01:17,560 --> 00:01:21,400 Speaker 2: a computer screen that your conscious awareness didn't pick up 23 00:01:21,400 --> 00:01:25,240 Speaker 2: on it, or it could just be presented to you 24 00:01:25,480 --> 00:01:27,800 Speaker 2: in a way that you're not aware that 25 00:01:28,000 --> 00:01:30,720 Speaker 2: it's actually related to the thing that you're being, say, 26 00:01:30,760 --> 00:01:31,280 Speaker 2: tested on. 27 00:01:31,640 --> 00:01:35,959 Speaker 1: Yeah, exactly. And this is a big, big thing in 28 00:01:36,000 --> 00:01:39,360 Speaker 1: cognitive psychology, and you know, we're going to kind of 29 00:01:39,360 --> 00:01:41,120 Speaker 1: get into the ins and outs of it. 30 00:01:41,200 --> 00:01:46,119 Speaker 1: For some of it, there's 31 00:01:46,160 --> 00:01:48,600 Speaker 1: a lot of studies that sort of indicate that there's 32 00:01:48,640 --> 00:01:51,400 Speaker 1: a lot to it. There's another part of this that 33 00:01:51,440 --> 00:01:53,840 Speaker 1: we'll get to where there's a lot of studies that 34 00:01:53,880 --> 00:01:57,760 Speaker 1: aren't so great or that have been fudged or data's 35 00:01:57,800 --> 00:02:02,520 Speaker 1: been massaged or completely fabricated, where it seems like sometimes 36 00:02:02,520 --> 00:02:06,960 Speaker 1: they've been found out as far as researchers go. But 37 00:02:07,000 --> 00:02:08,840 Speaker 1: it all comes down to memory.
And if 38 00:02:08,880 --> 00:02:11,440 Speaker 1: you're talking to a cognitive psychologist, they'll say that you 39 00:02:11,440 --> 00:02:14,639 Speaker 1: have a couple of ways that you remember things. One 40 00:02:14,720 --> 00:02:18,079 Speaker 1: is your explicit memory or your active or conscious memory. 41 00:02:18,080 --> 00:02:20,560 Speaker 1: Like if you're trying to think of something and you 42 00:02:20,720 --> 00:02:24,240 Speaker 1: actively engage your memory to think of a thing, that's 43 00:02:24,240 --> 00:02:24,760 Speaker 1: what that is. 44 00:02:25,880 --> 00:02:28,800 Speaker 2: If somebody asked, like, what was Chuck's senior superlative in 45 00:02:28,880 --> 00:02:31,680 Speaker 2: high school? I would think back to a fact that 46 00:02:31,720 --> 00:02:35,360 Speaker 2: I learned, and I would say best all around boy. 47 00:02:35,360 --> 00:02:38,840 Speaker 1: Is it going to be one of those? And then implicit 48 00:02:38,960 --> 00:02:41,000 Speaker 1: memory is the other one. And that's just what you 49 00:02:41,320 --> 00:02:46,040 Speaker 1: remember on an unconscious level. So if someone asked me 50 00:02:47,160 --> 00:02:49,560 Speaker 1: what was your senior superlative, I wouldn't even have to think 51 00:02:49,560 --> 00:02:51,720 Speaker 1: about it, best all around boy, boom, it just pops 52 00:02:51,800 --> 00:02:53,079 Speaker 1: right out as if it were my name. 53 00:02:53,840 --> 00:02:57,800 Speaker 2: Right. So those are, and this is really kind of 54 00:02:57,880 --> 00:03:00,880 Speaker 2: like breaking it down to a really rough, basic level. 55 00:03:00,919 --> 00:03:03,320 Speaker 2: It's much more complex than that. But we bring that 56 00:03:03,480 --> 00:03:08,040 Speaker 2: up because it's tapping into the different ways that 57 00:03:08,160 --> 00:03:13,440 Speaker 2: you access memories that priming is based on. Yeah, and 58 00:03:13,480 --> 00:03:18,160 Speaker 2: you said it falls under the rubric of cognitive psychology, 59 00:03:19,160 --> 00:03:22,880 Speaker 2: and that is true. Cognitive psychology has shown very clearly 60 00:03:22,919 --> 00:03:27,920 Speaker 2: that this actually works. That if you give somebody, say, 61 00:03:28,000 --> 00:03:31,400 Speaker 2: a word, and you show them a list of associated words, 62 00:03:31,720 --> 00:03:34,960 Speaker 2: it's going to just happen that they're able to 63 00:03:35,000 --> 00:03:39,200 Speaker 2: pick out the associated word faster than other words there. Well, 64 00:03:39,280 --> 00:03:42,360 Speaker 2: we'll give you some examples. Rather than just me mushing 65 00:03:42,360 --> 00:03:44,680 Speaker 2: it all together, let's tease it out a little 66 00:03:44,680 --> 00:03:46,360 Speaker 2: bit like an eighties perm. 67 00:03:46,560 --> 00:03:50,720 Speaker 1: Yeah, oh boy. Yeah, let's do that, because I 68 00:03:50,760 --> 00:03:52,520 Speaker 1: think if we give some of these examples it'll make 69 00:03:52,520 --> 00:03:55,760 Speaker 1: a little more sense. For instance, this first exercise that 70 00:03:56,520 --> 00:03:58,560 Speaker 1: Dave helped us out with, they found this one.
71 00:03:58,560 --> 00:04:01,840 Speaker 1: It's called the lexical decision task, and that is the idea 72 00:04:01,920 --> 00:04:04,080 Speaker 1: that if you put a word up on a screen 73 00:04:04,120 --> 00:04:07,640 Speaker 1: for a couple of seconds as a priming word, then 74 00:04:09,320 --> 00:04:12,000 Speaker 1: you, as the subject, will indicate whether the next word on the 75 00:04:12,000 --> 00:04:14,600 Speaker 1: screen is a real word or a nonsense word. 76 00:04:14,880 --> 00:04:17,880 Speaker 1: That's your lexical decision you're making, 77 00:04:18,120 --> 00:04:20,640 Speaker 1: but that will be influenced by that priming word that 78 00:04:20,720 --> 00:04:22,640 Speaker 1: you saw. So in this case, Dave used 79 00:04:22,680 --> 00:04:27,400 Speaker 1: the example doctor. You're sitting there in the lab, Bill 80 00:04:27,480 --> 00:04:31,800 Speaker 1: Murray's on the other side, got the shocker ready. Doctor 81 00:04:31,839 --> 00:04:33,760 Speaker 1: pops up on the screen for a couple of seconds. 82 00:04:33,839 --> 00:04:36,280 Speaker 1: That's your priming word, and then all of a sudden 83 00:04:37,279 --> 00:04:40,360 Speaker 1: other words pop up after that, and our job as 84 00:04:40,360 --> 00:04:43,479 Speaker 1: a subject is to say whether that's a real word 85 00:04:43,560 --> 00:04:47,520 Speaker 1: or a nonsense word. And obviously you would think if 86 00:04:47,520 --> 00:04:51,080 Speaker 1: the word nurse pops up, you're going to be recognizing 87 00:04:51,120 --> 00:04:53,200 Speaker 1: that a lot faster. So it's all about sort of 88 00:04:53,200 --> 00:04:55,520 Speaker 1: the speed at which you recognize this as a real 89 00:04:55,600 --> 00:04:59,000 Speaker 1: or nonsense word. Nurse would be a much faster decision 90 00:04:59,440 --> 00:05:02,440 Speaker 1: than if it was basketball, right. 91 00:05:03,040 --> 00:05:08,520 Speaker 2: And it's basically words that share a similar category are 92 00:05:08,600 --> 00:05:11,919 Speaker 2: more easily accessed once you've been primed, almost like the 93 00:05:11,960 --> 00:05:14,719 Speaker 2: way that we sort things is by putting them into 94 00:05:14,920 --> 00:05:19,480 Speaker 2: large categories like doctor, nurse, hospital, stethoscope, right, and then 95 00:05:19,520 --> 00:05:24,279 Speaker 2: once you open that category by thinking of doctor, you're 96 00:05:24,320 --> 00:05:26,080 Speaker 2: going to be able to access the other stuff in 97 00:05:26,120 --> 00:05:28,640 Speaker 2: that category much more easily than, say, something in a 98 00:05:28,680 --> 00:05:32,880 Speaker 2: totally different category, like cheese being in the same category as delicious, 99 00:05:32,920 --> 00:05:36,719 Speaker 2: that kind of stuff. Right. That, for sure, 100 00:05:36,800 --> 00:05:40,520 Speaker 2: seems like what it's tapping into, and it actually seems 101 00:05:40,520 --> 00:05:43,799 Speaker 2: to reveal that that's kind of how we store memories. 102 00:05:44,320 --> 00:05:49,400 Speaker 2: There's another demonstration that was pretty famous that shows, 103 00:05:49,480 --> 00:05:51,679 Speaker 2: for sure, again, I just want to get this across, 104 00:05:51,720 --> 00:05:55,559 Speaker 2: cognitive priming really works.
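For the curious, here is a minimal, hypothetical sketch in Python of the lexical decision setup Chuck just described: a prime word is shown, then a target string appears, and the reaction time for the real-word-or-nonsense judgment is measured. The word lists, the two-second prime duration, and the keyboard handling are invented for illustration; none of it comes from the episode or any particular study.

# Hypothetical lexical decision priming trial: show a prime ("doctor"), then time how
# fast the participant classifies a target as a real word or a nonsense word.
# A priming effect would show up as faster times for related targets ("nurse")
# than for unrelated ones ("basketball").
import random
import time

REAL_WORDS = {"doctor", "nurse", "basketball", "soup", "soap"}  # "blick" plays the nonsense target

def run_trial(prime: str, target: str) -> dict:
    print(f"PRIME: {prime}")
    time.sleep(2)  # the prime stays on screen for a couple of seconds
    print(f"TARGET: {target} (real word? type y or n)")
    start = time.perf_counter()
    says_real = input("> ").strip().lower() == "y"
    rt_ms = (time.perf_counter() - start) * 1000
    return {"prime": prime, "target": target, "correct": says_real == (target in REAL_WORDS), "rt_ms": round(rt_ms)}

if __name__ == "__main__":
    targets = ["nurse", "basketball", "blick"]
    random.shuffle(targets)
    for result in (run_trial("doctor", t) for t in targets):
        print(result)

In a real experiment the effect is an average reaction-time difference across many trials and many participants, not a single keypress.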
If you give people a list 105 00:05:55,600 --> 00:05:58,800 Speaker 2: of words and one of those words has a letter missing, 106 00:05:58,800 --> 00:06:02,039 Speaker 2: say a vowel, that could make multiple different words, they're 107 00:06:02,080 --> 00:06:04,400 Speaker 2: going to choose or they're going to fill it in 108 00:06:04,520 --> 00:06:07,880 Speaker 2: differently based on the other words in the list. So, 109 00:06:08,040 --> 00:06:11,920 Speaker 2: for example, if you have a list of words bread, milk, hot, 110 00:06:12,320 --> 00:06:15,440 Speaker 2: and then the last word is S, O, blank, P. 111 00:06:15,440 --> 00:06:20,440 Speaker 2: Poop? Sure, or soup. Oh, I think soup would be a 112 00:06:20,440 --> 00:06:23,880 Speaker 2: good one, right? Yeah. And depending on whether you know 113 00:06:23,920 --> 00:06:25,719 Speaker 2: how to spell, you might spell it with an O 114 00:06:25,880 --> 00:06:29,080 Speaker 2: or a U, doesn't matter, you're still getting the point across, right. 115 00:06:29,760 --> 00:06:33,039 Speaker 1: Or in another example of the same, you know, missing letter, 116 00:06:33,600 --> 00:06:38,400 Speaker 1: if it was shower, water, wash, S, O, blank, P, you 117 00:06:38,480 --> 00:06:42,160 Speaker 1: might say poop there too. Actually, soap would probably be 118 00:06:42,279 --> 00:06:45,000 Speaker 1: the word. So you know, it's just a very intuitive 119 00:06:45,279 --> 00:06:47,719 Speaker 1: kind of thing happening because your brain is working in 120 00:06:47,760 --> 00:06:51,440 Speaker 1: a very unconscious way, or I guess nonconscious, 121 00:06:51,920 --> 00:06:55,440 Speaker 1: because it is primed by the words that precede the 122 00:06:55,480 --> 00:06:56,360 Speaker 1: one with a missing letter. 123 00:06:57,360 --> 00:07:01,320 Speaker 2: Right. And then there's also another one. There's a number 124 00:07:01,320 --> 00:07:04,600 Speaker 2: of different ways of priming people cognitively, but one that 125 00:07:04,720 --> 00:07:06,640 Speaker 2: just makes total sense, and I think we've all run 126 00:07:06,640 --> 00:07:09,960 Speaker 2: into, is repetition priming, where if you are shown, 127 00:07:10,120 --> 00:07:14,800 Speaker 2: like say, a pair of words, maybe sometimes one of 128 00:07:14,840 --> 00:07:16,520 Speaker 2: which is a nonsense word, and you have to pick 129 00:07:16,520 --> 00:07:20,520 Speaker 2: out the nonsense words. If you see, like, in a 130 00:07:20,560 --> 00:07:23,080 Speaker 2: list over and over again, doctor and nurse, doctor and nurse, 131 00:07:23,080 --> 00:07:25,840 Speaker 2: it comes up three times out of a seven-word 132 00:07:25,920 --> 00:07:29,160 Speaker 2: pair list, you're going to move through those much more 133 00:07:29,240 --> 00:07:33,520 Speaker 2: quickly because it's right there in the forefront of your mind, 134 00:07:33,640 --> 00:07:36,440 Speaker 2: so it hasn't faded yet, so you're just going to 135 00:07:36,480 --> 00:07:38,880 Speaker 2: pick it out faster and faster the more it's repeated. 136 00:07:39,240 --> 00:07:41,000 Speaker 2: That's a form of priming too. 137 00:07:41,760 --> 00:07:44,520 Speaker 1: Yeah, and these are all really super basic, you know, 138 00:07:44,720 --> 00:07:48,320 Speaker 1: examples of cognitive priming, but it does show a lot 139 00:07:48,360 --> 00:07:51,560 Speaker 1: about how our memory works.
That we're not just computers 140 00:07:52,560 --> 00:07:54,640 Speaker 1: where we can just access a file by clicking on 141 00:07:54,680 --> 00:07:59,800 Speaker 1: it very easily. We develop shortcuts in our brain. We 142 00:08:00,240 --> 00:08:03,240 Speaker 1: take shortcuts when we see words, relating them to other words, 143 00:08:03,920 --> 00:08:06,720 Speaker 1: and so it just makes sense, you know, you could 144 00:08:06,800 --> 00:08:12,680 Speaker 1: access something much much faster because you have been primed. 145 00:08:13,000 --> 00:08:16,680 Speaker 1: In other words, our implicit memory can be tapped into 146 00:08:16,720 --> 00:08:18,560 Speaker 1: a lot quicker than our explicit memory. 147 00:08:19,520 --> 00:08:23,240 Speaker 2: Right. And at the end of each of these experiments, 148 00:08:23,360 --> 00:08:26,800 Speaker 2: it kind of quickly became tradition where the researcher would 149 00:08:26,840 --> 00:08:28,760 Speaker 2: stand up and point at them and be like, you've 150 00:08:28,800 --> 00:08:32,959 Speaker 2: been primed, maybe adding a booyah once in a while. 151 00:08:33,160 --> 00:08:35,640 Speaker 1: Yeah, and Deion Sanders just moonwalks through the background. 152 00:08:36,280 --> 00:08:39,720 Speaker 2: Really. So we know that this is true also, not 153 00:08:39,840 --> 00:08:42,960 Speaker 2: just because study after study has shown that this is 154 00:08:43,080 --> 00:08:46,920 Speaker 2: actually correct. But when you put somebody in the Wonder Machine, 155 00:08:46,960 --> 00:08:50,480 Speaker 2: the fMRI, gotta do it. Yeah, they found that, like 156 00:08:50,520 --> 00:08:56,720 Speaker 2: if you ask somebody the name of, let's say, Anya 157 00:08:56,760 --> 00:08:59,720 Speaker 2: Taylor-Joy, say, who was the star of The Witch, 158 00:09:00,160 --> 00:09:02,160 Speaker 2: but also The Gorge, which isn't that good? 159 00:09:03,040 --> 00:09:05,600 Speaker 1: I thought that was fairly entertaining for a not great movie. 160 00:09:05,600 --> 00:09:09,080 Speaker 1: By the way, The Gorge, yeah, it was highly watchable, 161 00:09:09,120 --> 00:09:11,120 Speaker 1: on like a rainy, not feeling so great day kind 162 00:09:11,160 --> 00:09:11,440 Speaker 1: of movie. 163 00:09:11,520 --> 00:09:15,280 Speaker 2: Agreed. Yeah, agreed. It just, wow, it takes a 164 00:09:15,400 --> 00:09:19,120 Speaker 2: sudden turn. It's just really surprising. But yes, I agree. 165 00:09:19,160 --> 00:09:20,880 Speaker 2: I think that's a good way to put it. You 166 00:09:20,880 --> 00:09:24,560 Speaker 2: would stop and think like, oh, Anya Taylor-Joy. And 167 00:09:24,600 --> 00:09:26,360 Speaker 2: by the way, The Queen's Gambit is one of the 168 00:09:26,360 --> 00:09:28,319 Speaker 2: best things I've ever seen in my entire life. 169 00:09:28,720 --> 00:09:31,719 Speaker 1: Oh yeah, yeah, the chess one. Yeah, that was great. 170 00:09:32,040 --> 00:09:37,920 Speaker 2: Okay, but if you also said, you know, name an 171 00:09:37,920 --> 00:09:40,800 Speaker 2: animal that you think of when we'd say the word dog. 172 00:09:41,320 --> 00:09:44,079 Speaker 2: What they found in the fMRI is that different parts 173 00:09:44,080 --> 00:09:46,240 Speaker 2: of your brain light up. So we do know that 174 00:09:46,360 --> 00:09:50,040 Speaker 2: priming does have a certain effect and it is different 175 00:09:50,400 --> 00:09:53,559 Speaker 2: than our normal kind of conscious recall.
176 00:09:53,559 --> 00:09:58,040 Speaker 1: I wonder if, when they go into the fMRI Wonder Machine room 177 00:09:58,160 --> 00:10:01,199 Speaker 1: now, the person running it just goes, I'm telling you 178 00:10:01,240 --> 00:10:03,600 Speaker 1: what's gonna happen: some part will light up and then 179 00:10:03,640 --> 00:10:06,240 Speaker 1: another part will light up, and you'll all be super happy. 180 00:10:06,880 --> 00:10:08,400 Speaker 2: Right, you've been primed. 181 00:10:08,559 --> 00:10:11,479 Speaker 1: You've been primed, and Deion Sanders very slowly moonwalks. 182 00:10:13,000 --> 00:10:17,760 Speaker 2: So, okay, that's cognitive priming. Now, imagine if you were 183 00:10:17,880 --> 00:10:21,120 Speaker 2: a psychologist and you said cognitive priming is kind of boring. 184 00:10:21,720 --> 00:10:25,400 Speaker 2: What if we could use that same stuff to get 185 00:10:25,400 --> 00:10:28,560 Speaker 2: people to eat more cheeseburgers, yeah, or to vote for 186 00:10:28,600 --> 00:10:34,319 Speaker 2: a particular candidate in a political election? Like, what if 187 00:10:34,360 --> 00:10:35,560 Speaker 2: priming works for that? 188 00:10:36,160 --> 00:10:38,600 Speaker 1: Yeah. And that's, you know, it's all fun and games 189 00:10:38,600 --> 00:10:40,559 Speaker 1: to just sort of look at these little experiments on 190 00:10:40,640 --> 00:10:45,480 Speaker 1: a campus somewhere. But if it has a real world use, 191 00:10:46,480 --> 00:10:49,240 Speaker 1: especially when it comes to marketing and advertising and stuff 192 00:10:49,280 --> 00:10:51,720 Speaker 1: like that, you can bet corporations are going to sink 193 00:10:51,760 --> 00:10:52,720 Speaker 1: some money into doing that. 194 00:10:53,640 --> 00:10:56,679 Speaker 2: Yes. So this is the handoff right here from cognitive 195 00:10:56,679 --> 00:11:02,439 Speaker 2: psychology to social psychology. And social psychology has studied priming 196 00:11:02,840 --> 00:11:05,480 Speaker 2: in great detail. It was a huge hit. You'll remember 197 00:11:05,520 --> 00:11:09,000 Speaker 2: back to nudge economics, we talked about it in our 198 00:11:09,120 --> 00:11:13,120 Speaker 2: PR Live episode, like it was a big deal in 199 00:11:13,200 --> 00:11:17,360 Speaker 2: like the late aughts to about twenty twelve, I 200 00:11:17,360 --> 00:11:19,679 Speaker 2: think, something like that. It was just a big deal, and 201 00:11:19,840 --> 00:11:23,079 Speaker 2: it makes a lot of sense. Like it's the basis 202 00:11:23,200 --> 00:11:27,679 Speaker 2: of things like, ideas like, McDonald's uses red and yellow 203 00:11:28,200 --> 00:11:31,840 Speaker 2: in its logo because those colors are associated with excitement 204 00:11:31,960 --> 00:11:34,920 Speaker 2: or energy or happiness, or they call it a Happy 205 00:11:35,040 --> 00:11:40,280 Speaker 2: Meal because over time your kid will associate McDonald's with 206 00:11:40,360 --> 00:11:44,439 Speaker 2: being happy, or I'm Lovin' It. It's just a jingle 207 00:11:44,520 --> 00:11:47,600 Speaker 2: or whatever, and it makes sense. It's very catchy, but 208 00:11:47,840 --> 00:11:50,240 Speaker 2: there's some part of your mind that has been 209 00:11:50,320 --> 00:11:55,880 Speaker 2: primed to later on associate McDonald's with love, a positive feeling.
210 00:11:56,200 --> 00:12:00,520 Speaker 2: All of these are examples of social psychology research 211 00:12:00,800 --> 00:12:04,120 Speaker 2: supporting this idea that you can prime human beings to 212 00:12:04,160 --> 00:12:06,880 Speaker 2: behave in a certain way just by using these same 213 00:12:06,960 --> 00:12:12,040 Speaker 2: techniques that cognitive psychologists proved work. Dog, cat, soup, soap, 214 00:12:12,120 --> 00:12:12,920 Speaker 2: that kind of stuff. 215 00:12:13,080 --> 00:12:15,080 Speaker 1: Yeah, as long as you beat them over the head 216 00:12:15,080 --> 00:12:17,840 Speaker 1: with it over and over and over again, because repetition 217 00:12:17,960 --> 00:12:21,040 Speaker 1: is one of the real keys here. Your brain just 218 00:12:21,080 --> 00:12:25,120 Speaker 1: being repeatedly exposed to those stimuli is going to, you know, 219 00:12:25,160 --> 00:12:28,160 Speaker 1: strengthen that association over time, and brands know that, and 220 00:12:28,160 --> 00:12:31,319 Speaker 1: that's why it drives you crazy during, you know, especially 221 00:12:31,360 --> 00:12:33,319 Speaker 1: like sports playoffs or something, when you see those same 222 00:12:33,400 --> 00:12:36,360 Speaker 1: ads over and over and over again. Yeah, it's 223 00:12:36,640 --> 00:12:38,880 Speaker 1: big business. They're putting a lot of money 224 00:12:39,480 --> 00:12:43,160 Speaker 1: into, you know, sort of manipulating your brain essentially. 225 00:12:43,840 --> 00:12:46,400 Speaker 2: Yeah, I've said it before and I'll say it again. 226 00:12:46,679 --> 00:12:52,480 Speaker 2: For some reason, some ad exec chose morning reruns of Murder, 227 00:12:52,559 --> 00:12:55,520 Speaker 2: She Wrote, all right, Start TV, the over-the-air channel 228 00:12:55,600 --> 00:12:59,520 Speaker 2: network, for the Burger King terrible singing ad. 229 00:13:01,040 --> 00:13:06,320 Speaker 1: There had to be some research into their viewership or something, right? 230 00:13:06,160 --> 00:13:09,120 Speaker 2: It had to be. But the thing is, every single other 231 00:13:09,360 --> 00:13:13,280 Speaker 2: ad is for like human life insurance or health insurance 232 00:13:13,320 --> 00:13:16,679 Speaker 2: or dental insurance, because you're just retired and now 233 00:13:16,679 --> 00:13:21,120 Speaker 2: you have Medicaid. Like, next to every other ad, it almost stood out, 234 00:13:21,320 --> 00:13:21,920 Speaker 2: you know what I mean. 235 00:13:22,080 --> 00:13:26,480 Speaker 1: Well, you know, our senior friends in the world are 236 00:13:26,760 --> 00:13:29,439 Speaker 1: very concerned about their health and their healthcare, and they 237 00:13:29,480 --> 00:13:31,800 Speaker 1: still love those cheeseburgers. That's two things we know. 238 00:13:32,400 --> 00:13:34,800 Speaker 2: I guess that's it. But it really grated on my 239 00:13:34,840 --> 00:13:36,720 Speaker 2: nerves because I watched that almost every day. 240 00:13:37,240 --> 00:13:38,959 Speaker 1: All right, why don't we take a break here. That's 241 00:13:38,960 --> 00:13:42,000 Speaker 1: a good table setting, I think, and we'll get back 242 00:13:42,040 --> 00:13:45,840 Speaker 1: to how this is used in media and politics, should 243 00:13:45,880 --> 00:14:37,840 Speaker 1: be no surprise, right after this. All right, so we're back, 244 00:14:37,840 --> 00:14:39,680 Speaker 1: as promised.
We're going to talk a little bit about 245 00:14:39,880 --> 00:14:42,520 Speaker 1: the media and politics, hopefully in a way that's just 246 00:14:42,520 --> 00:14:44,800 Speaker 1: sort of quick and easy and doesn't ruffle too many feathers, 247 00:14:44,840 --> 00:14:49,120 Speaker 1: because what we've seen is media of all stripes do this, 248 00:14:49,800 --> 00:14:52,880 Speaker 1: and politicians of all stripes do this. So it's not 249 00:14:53,120 --> 00:14:56,480 Speaker 1: singling anybody out. But when the media gets involved in 250 00:14:56,480 --> 00:14:59,520 Speaker 1: this kind of thing to shape public opinion, they do 251 00:14:59,640 --> 00:15:05,440 Speaker 1: so with sort of a three-pronged fork, which is agenda setting, framing, 252 00:15:05,560 --> 00:15:10,080 Speaker 1: and priming. Agenda setting being like, hey, what are 253 00:15:10,080 --> 00:15:13,040 Speaker 1: we going to focus on editorially, maybe in today's paper 254 00:15:13,480 --> 00:15:16,480 Speaker 1: or every day until the election? How are you going 255 00:15:16,520 --> 00:15:19,800 Speaker 1: to frame this thing to suit what we feel is 256 00:15:19,960 --> 00:15:25,440 Speaker 1: probably our slant? And then priming, it's basically, you know, 257 00:15:25,560 --> 00:15:28,320 Speaker 1: similar to the first two, but this is the subconscious 258 00:15:28,320 --> 00:15:30,360 Speaker 1: thing, where we're going to repeat things. We're going to 259 00:15:30,440 --> 00:15:34,680 Speaker 1: use certain words, certain images, so as to just sort of 260 00:15:34,880 --> 00:15:36,240 Speaker 1: reinforce how we framed it. 261 00:15:37,000 --> 00:15:39,600 Speaker 2: Yeah, so just for a quick example, let's say the 262 00:15:39,640 --> 00:15:44,840 Speaker 2: news decides to focus on crime, and that the framing 263 00:15:44,880 --> 00:15:48,440 Speaker 2: they use is that crime is on the rise, and you 264 00:15:48,520 --> 00:15:50,840 Speaker 2: hear about this over and over again, night after night 265 00:15:50,880 --> 00:15:52,720 Speaker 2: on the news, or you read about it over and 266 00:15:52,760 --> 00:15:56,680 Speaker 2: over again on your favorite news site. Eventually you're going 267 00:15:56,720 --> 00:15:59,680 Speaker 2: to be primed to think that crime is on the rise, 268 00:16:00,120 --> 00:16:02,080 Speaker 2: and you might even be a little scared of it. 269 00:16:02,240 --> 00:16:07,280 Speaker 2: Or relating to politics, a politician might latch onto that 270 00:16:07,320 --> 00:16:09,040 Speaker 2: and be like, crime's really hot right now. We're going 271 00:16:09,120 --> 00:16:12,720 Speaker 2: to make crime like the main point. We're going to 272 00:16:12,880 --> 00:16:15,800 Speaker 2: use it to set the agenda of our campaign, and 273 00:16:15,840 --> 00:16:19,360 Speaker 2: we're going to tap into that fear, that 274 00:16:19,440 --> 00:16:21,720 Speaker 2: lack of safety, that the people out there watching 275 00:16:21,760 --> 00:16:24,080 Speaker 2: the news feel because they're being told over and over 276 00:16:24,120 --> 00:16:26,400 Speaker 2: again crime is on the rise. Whether it is or not 277 00:16:26,480 --> 00:16:27,600 Speaker 2: is irrelevant. 278 00:16:27,160 --> 00:16:30,040 Speaker 1: Yeah, there's no data, they just feel that it is, exactly. 279 00:16:29,840 --> 00:16:32,760 Speaker 2: They feel that crime is on the rise.
So we, 280 00:16:33,440 --> 00:16:35,840 Speaker 2: a politician, are going to tap into that and use 281 00:16:35,880 --> 00:16:39,000 Speaker 2: it to hopefully win an election, because now we can 282 00:16:39,080 --> 00:16:42,000 Speaker 2: prime them to get that emotional response out of them 283 00:16:42,040 --> 00:16:44,400 Speaker 2: over and over again until we finally move them to 284 00:16:44,440 --> 00:16:48,479 Speaker 2: the voting stations to vote for me, the politician. 285 00:16:48,720 --> 00:16:50,160 Speaker 1: That's right, I'd vote for you. 286 00:16:50,200 --> 00:16:54,760 Speaker 2: By the way, I would never ever run for public life. 287 00:16:54,840 --> 00:16:56,920 Speaker 1: Oh, of course you wouldn't. But I would, I 288 00:16:56,920 --> 00:16:59,280 Speaker 1: would vote for you for like, you know, Neighborhood Watch 289 00:16:59,320 --> 00:16:59,720 Speaker 1: or something. 290 00:17:00,280 --> 00:17:02,680 Speaker 2: Would you vote for me for Best All Around Boy? 291 00:17:03,560 --> 00:17:06,440 Speaker 1: Oh man, I mean you got that one in the bag, buddy. 292 00:17:06,480 --> 00:17:07,120 Speaker 1: You don't need my. 293 00:17:07,119 --> 00:17:08,399 Speaker 2: Vote, thank you. 294 00:17:10,240 --> 00:17:12,480 Speaker 1: Dog whistling is something that, you know, you've heard more 295 00:17:12,520 --> 00:17:15,560 Speaker 1: and more as a sort of a term used in, 296 00:17:16,760 --> 00:17:19,240 Speaker 1: you know, social media, politics, on the news, stuff like that. 297 00:17:19,320 --> 00:17:22,160 Speaker 1: And that is when you use a word or phrase 298 00:17:22,280 --> 00:17:25,560 Speaker 1: that you sort of are, no, not sort of, you 299 00:17:25,600 --> 00:17:30,359 Speaker 1: are very much intentionally associating with a negative racial stereotype. 300 00:17:30,800 --> 00:17:35,560 Speaker 1: But it's not explicit, so it operates on the unconscious level. 301 00:17:36,280 --> 00:17:40,600 Speaker 1: And this is where they tap into implicit bias, something 302 00:17:40,640 --> 00:17:45,040 Speaker 1: that everybody has, and you know, using these coded words. 303 00:17:45,080 --> 00:17:48,760 Speaker 1: Basically, there are things that, I will never take one 304 00:17:48,800 --> 00:17:50,760 Speaker 1: of these. It scares me to death. So I would 305 00:17:50,760 --> 00:17:53,840 Speaker 1: never sit down for an implicit bias test because I 306 00:17:53,840 --> 00:17:57,360 Speaker 1: don't want to know what my implicit biases are. It's 307 00:17:57,359 --> 00:18:01,040 Speaker 1: absolutely terrifying because we all have them. I guess that's 308 00:18:01,040 --> 00:18:03,520 Speaker 1: probably cowardly and I should know what my implicit biases 309 00:18:03,520 --> 00:18:05,960 Speaker 1: are so I can try and correct those. But it's 310 00:18:06,000 --> 00:18:08,200 Speaker 1: basically when they sit you down and then they say, 311 00:18:09,040 --> 00:18:11,880 Speaker 1: all right, respond very quickly, as quick as you can, 312 00:18:12,000 --> 00:18:16,119 Speaker 1: to a series of images or words, and anything between 313 00:18:16,160 --> 00:18:20,040 Speaker 1: two hundred and six hundred milliseconds is your implicit memory. 314 00:18:20,560 --> 00:18:24,199 Speaker 1: Anything after that you're thinking about, and that's explicit memory. 315 00:18:24,920 --> 00:18:30,280 Speaker 2: Right. Two things about that.
One, if implicit 316 00:18:30,320 --> 00:18:34,359 Speaker 2: bias is a universal thing and everybody has it, you're free, okay, you don't 317 00:18:34,359 --> 00:18:36,600 Speaker 2: have to feel as bad about it. I mean, 318 00:18:36,640 --> 00:18:38,800 Speaker 2: it's definitely something you should work to correct, as you say, 319 00:18:38,960 --> 00:18:41,040 Speaker 2: but you don't have to feel like it's just you. 320 00:18:41,440 --> 00:18:41,640 Speaker 1: Right. 321 00:18:42,359 --> 00:18:45,040 Speaker 2: That's one thing. And then the second thing, too, Chuck, 322 00:18:45,200 --> 00:18:49,560 Speaker 2: is if it is true that we all are implicitly biased, 323 00:18:50,320 --> 00:18:54,480 Speaker 2: say like along racial lines, I feel like you could 324 00:18:54,480 --> 00:18:58,760 Speaker 2: explain it by saying it's an example of our 325 00:18:58,920 --> 00:19:03,600 Speaker 2: evolutionary history not being caught up yet to our current 326 00:19:03,760 --> 00:19:08,639 Speaker 2: social history. Like when we formed modern societies, we jumped 327 00:19:08,800 --> 00:19:12,520 Speaker 2: light years ahead as far as evolution goes, and the 328 00:19:13,880 --> 00:19:16,080 Speaker 2: way that we think and see the world just completely 329 00:19:16,480 --> 00:19:21,200 Speaker 2: just hit warp speed. But evolutionarily there's still a big 330 00:19:21,320 --> 00:19:24,600 Speaker 2: lag catching up to it. So we're fearful of people 331 00:19:24,840 --> 00:19:27,720 Speaker 2: who don't look like us, or have a slightly different culture, 332 00:19:27,840 --> 00:19:31,800 Speaker 2: live in a different nation, just evolutionarily speaking, even though 333 00:19:31,840 --> 00:19:34,520 Speaker 2: we know we shouldn't feel that way, that that's not 334 00:19:34,560 --> 00:19:39,040 Speaker 2: actually how things are. That conflict between those two things 335 00:19:39,160 --> 00:19:40,560 Speaker 2: is what the real issue is. 336 00:19:40,840 --> 00:19:43,880 Speaker 1: Yeah, oh boy, that really lets us all off the hook. 337 00:19:43,960 --> 00:19:47,520 Speaker 2: I guess, right? So we don't have to do anything 338 00:19:47,560 --> 00:19:50,760 Speaker 2: about it. It'll work itself out in ten thousand years. 339 00:19:50,920 --> 00:19:53,320 Speaker 1: Well, here's the thing, though, is when people sit down 340 00:19:53,320 --> 00:19:56,440 Speaker 1: for these tests, they found that even people who explicitly 341 00:19:56,480 --> 00:19:59,000 Speaker 1: disagree with any kind of racial stereotype and are like, I 342 00:19:59,000 --> 00:20:00,760 Speaker 1: would never do that, I don't have those at all, 343 00:20:01,560 --> 00:20:04,160 Speaker 1: they even, I was about to say, fail those tests, 344 00:20:04,200 --> 00:20:07,480 Speaker 1: they even chart on those tests as showing implicit bias. 345 00:20:07,880 --> 00:20:12,520 Speaker 2: Right, so we've made it pretty squarely into social psychology 346 00:20:12,600 --> 00:20:15,879 Speaker 2: territory now. And there were some 347 00:20:16,000 --> 00:20:20,120 Speaker 2: early experiments trying to figure out how you can persuade 348 00:20:20,119 --> 00:20:21,280 Speaker 2: people using priming. 349 00:20:21,640 --> 00:20:24,080 Speaker 1: Right, this is like the seventies, right? Yeah.
350 00:20:24,040 --> 00:20:28,160 Speaker 2: Late seventies, early eighties, and so one of the first 351 00:20:28,160 --> 00:20:31,280 Speaker 2: ones came out of nineteen seventy nine, and it was 352 00:20:31,280 --> 00:20:36,280 Speaker 2: a social psychology experiment where you gave people scrambled, like 353 00:20:36,400 --> 00:20:38,920 Speaker 2: a scrambled word list, and said, make a sentence out 354 00:20:38,960 --> 00:20:42,760 Speaker 2: of this, right? And so you would get something neutral 355 00:20:42,800 --> 00:20:48,159 Speaker 2: like "her found knew I," and "I knew her" is 356 00:20:48,320 --> 00:20:51,240 Speaker 2: a sentence you could make out of that, or you 357 00:20:51,240 --> 00:20:55,480 Speaker 2: would give something like "leg break arm his." You could say 358 00:20:55,520 --> 00:20:59,119 Speaker 2: "break his arm." And so one group was given more 359 00:20:59,440 --> 00:21:03,720 Speaker 2: hostile word scrambles to make sentences out of, the other 360 00:21:03,880 --> 00:21:07,200 Speaker 2: was more neutral. And then that was part one. After 361 00:21:07,240 --> 00:21:12,760 Speaker 2: that they were asked to consider a hypothetical scenario in 362 00:21:12,800 --> 00:21:17,439 Speaker 2: which somebody's kind of ambiguously responding or interacting with somebody. 363 00:21:17,440 --> 00:21:20,120 Speaker 2: I think it was that Donald is refusing to pay 364 00:21:20,160 --> 00:21:23,440 Speaker 2: rent until his landlord paints his apartment, and the people 365 00:21:23,520 --> 00:21:28,240 Speaker 2: who were given the hostile word scrambles rated Donald as 366 00:21:28,359 --> 00:21:30,719 Speaker 2: much more hostile than the people who were given the 367 00:21:30,760 --> 00:21:33,639 Speaker 2: neutral word scrambles. So what they're showing is that you 368 00:21:33,680 --> 00:21:37,840 Speaker 2: can nudge people toward forming an impression about someone they 369 00:21:37,880 --> 00:21:42,119 Speaker 2: know basically nothing about, based on priming them, yeah, to 370 00:21:42,200 --> 00:21:45,080 Speaker 2: feel one way or another about them. And then that 371 00:21:45,280 --> 00:21:47,800 Speaker 2: was like, if we can do that, man, what else 372 00:21:47,840 --> 00:21:50,040 Speaker 2: can we do? That really opened the floodgates. 373 00:21:50,320 --> 00:21:53,359 Speaker 1: Yeah, and you know we're about to tick through a 374 00:21:53,400 --> 00:21:57,160 Speaker 1: series of little things like that that might be mind 375 00:21:57,200 --> 00:21:58,879 Speaker 1: blowing for you, where you're like, oh my god, I 376 00:21:58,880 --> 00:22:01,439 Speaker 1: can't believe that works, but just, you know, sort of 377 00:22:01,600 --> 00:22:05,159 Speaker 1: put that in your pocket for now. There was one 378 00:22:05,240 --> 00:22:08,960 Speaker 1: study where, again, it was unscrambling sentences. Half 379 00:22:09,000 --> 00:22:12,359 Speaker 1: the people were given sentences that had words that were, 380 00:22:12,560 --> 00:22:16,199 Speaker 1: you know, like healthier, healthy, and active, lifestyle words. The 381 00:22:16,240 --> 00:22:19,760 Speaker 1: other half got neutral words, and afterward they said, all right, 382 00:22:19,840 --> 00:22:22,280 Speaker 1: you guys are great. You can go ahead and leave.
383 00:22:22,359 --> 00:22:25,359 Speaker 1: There, you can, you know, go up the stairs there and 384 00:22:25,680 --> 00:22:27,959 Speaker 1: get out of here to the parking deck up there, 385 00:22:28,080 --> 00:22:31,199 Speaker 1: or you can take the elevator. And people who unscrambled 386 00:22:31,200 --> 00:22:34,200 Speaker 1: the healthy words were more likely to take the stairs afterward, 387 00:22:35,160 --> 00:22:37,440 Speaker 1: so you know, they're like, hey, that's pretty much proof 388 00:22:37,520 --> 00:22:39,879 Speaker 1: right there. They had been primed with these healthy words 389 00:22:40,119 --> 00:22:41,840 Speaker 1: to make a healthier decision afterward. 390 00:22:42,680 --> 00:22:45,479 Speaker 2: Right. And well, like you said, we'll talk about some 391 00:22:45,520 --> 00:22:49,320 Speaker 2: of the more shocking or surprising studies, but before that, 392 00:22:49,400 --> 00:22:52,520 Speaker 2: we need to mention a guy named John Bargh who 393 00:22:53,000 --> 00:22:55,960 Speaker 2: became basically the rock star of this field starting in 394 00:22:56,000 --> 00:23:02,399 Speaker 2: the nineties. He essentially wrote a paper that said, like, 395 00:23:02,880 --> 00:23:05,879 Speaker 2: all of this is possible. Like, here's how you do that. 396 00:23:05,920 --> 00:23:09,879 Speaker 2: Here's how you take the findings of cognitive priming and 397 00:23:09,960 --> 00:23:13,320 Speaker 2: turn it into social or behavioral priming. And he was 398 00:23:13,480 --> 00:23:16,439 Speaker 2: very famous for a couple of studies, many studies, but 399 00:23:16,480 --> 00:23:19,600 Speaker 2: there's two that really stuck out to me. Let 400 00:23:19,600 --> 00:23:22,040 Speaker 2: me give you an example. One is that he tested 401 00:23:22,560 --> 00:23:27,119 Speaker 2: whether something as random as temperature could affect your 402 00:23:27,160 --> 00:23:28,600 Speaker 2: impression of another person. 403 00:23:29,000 --> 00:23:30,679 Speaker 1: Yeah, I thought that was interesting because it was like 404 00:23:30,720 --> 00:23:32,959 Speaker 1: they'd used words, and finally, like, hey, I wonder if, 405 00:23:33,359 --> 00:23:36,520 Speaker 1: words worked, if images alone could work? And then 406 00:23:36,560 --> 00:23:39,840 Speaker 1: he was like, hold my beer, right, what about just 407 00:23:39,880 --> 00:23:44,359 Speaker 1: temperature alone? So in this one, he asked participants to 408 00:23:44,680 --> 00:23:47,320 Speaker 1: hold something warm, like a cup of coffee or 409 00:23:47,359 --> 00:23:50,800 Speaker 1: warm cup of soup or something, or a frosty cold beverage, 410 00:23:50,840 --> 00:23:53,119 Speaker 1: and then they bring them in a room where they 411 00:23:53,119 --> 00:23:56,320 Speaker 1: have a conversation with a stranger, and they found that, 412 00:23:56,800 --> 00:23:59,479 Speaker 1: or he found, he claimed to find, that people who 413 00:23:59,600 --> 00:24:02,840 Speaker 1: were with that warm beverage had a warmer impression 414 00:24:02,880 --> 00:24:05,880 Speaker 1: of that stranger afterward, and people who had that cold 415 00:24:05,960 --> 00:24:10,080 Speaker 1: beverage had a, you know, they felt frostier toward 416 00:24:10,160 --> 00:24:10,600 Speaker 1: that person. 417 00:24:11,359 --> 00:24:13,679 Speaker 2: Right. So that means that you can nudge people to 418 00:24:13,720 --> 00:24:18,840 Speaker 2: feel a certain way based on metaphor.
Priming using metaphor, 419 00:24:18,920 --> 00:24:21,119 Speaker 2: and not even words or images, but temperature. 420 00:24:21,280 --> 00:24:21,720 Speaker 1: Yeah. 421 00:24:22,119 --> 00:24:24,639 Speaker 2: Another one that he figured out was, and this is 422 00:24:24,680 --> 00:24:26,720 Speaker 2: the one he's really famous for. I think this is 423 00:24:26,760 --> 00:24:30,159 Speaker 2: the one that came from his nineteen ninety six paper. 424 00:24:30,840 --> 00:24:34,880 Speaker 2: But he tested to see how certain kinds of words 425 00:24:34,920 --> 00:24:38,800 Speaker 2: affect certain kinds of behavior. And he took some nineteen-, 426 00:24:38,880 --> 00:24:41,919 Speaker 2: twenty-, twenty-one-year-old students and had them do 427 00:24:42,080 --> 00:24:46,840 Speaker 2: that famous word scramble. Priming researchers love word scrambles. Yeah, 428 00:24:47,600 --> 00:24:51,960 Speaker 2: and some had just a normal neutral set of words 429 00:24:52,600 --> 00:24:55,760 Speaker 2: to pick out from. Another had sets of words that 430 00:24:56,440 --> 00:25:03,000 Speaker 2: were associated with being old, not so straight ahead that you'd 431 00:25:03,000 --> 00:25:05,600 Speaker 2: be like, these are all old people words, but they 432 00:25:05,600 --> 00:25:09,640 Speaker 2: were like bingo or Florida or wrinkles, that kind of stuff. Right. 433 00:25:10,400 --> 00:25:12,959 Speaker 2: That was part one. And then the students who were 434 00:25:13,000 --> 00:25:16,080 Speaker 2: participating thought that they were done at that point, but 435 00:25:16,240 --> 00:25:18,320 Speaker 2: he said, Okay, now we've got to do part two. 436 00:25:18,400 --> 00:25:20,159 Speaker 2: But we're gonna have to go to the end of 437 00:25:20,200 --> 00:25:22,440 Speaker 2: the hall and turn right, to where there's another lab 438 00:25:22,480 --> 00:25:24,119 Speaker 2: we need to go to for the second part of 439 00:25:24,160 --> 00:25:27,000 Speaker 2: this test. But the real thing he was doing was 440 00:25:27,080 --> 00:25:29,639 Speaker 2: clocking how long it took the students to make it 441 00:25:29,680 --> 00:25:32,119 Speaker 2: from one end of the hall to the other. And 442 00:25:32,160 --> 00:25:35,439 Speaker 2: he found the people who had worked with 443 00:25:35,520 --> 00:25:40,040 Speaker 2: word scrambles that had age- or elder-related words 444 00:25:40,920 --> 00:25:44,760 Speaker 2: walked slower than the people who had the neutral words. 445 00:25:45,280 --> 00:25:48,560 Speaker 2: And this was the one, this experiment, Chuck, is what 446 00:25:49,119 --> 00:25:52,879 Speaker 2: broke it open. This is what led to nudge economics. This 447 00:25:52,920 --> 00:25:56,640 Speaker 2: is what led to governments saying like, man, we could 448 00:25:56,720 --> 00:25:59,359 Speaker 2: use this to like move people in a way that 449 00:25:59,440 --> 00:26:02,600 Speaker 2: we want them to, that's healthier and happier. This is 450 00:26:02,640 --> 00:26:03,679 Speaker 2: the study that did that. 451 00:26:04,520 --> 00:26:06,720 Speaker 1: Yeah, and here's my, I mean, I hope I'm not 452 00:26:06,760 --> 00:26:09,679 Speaker 1: giving anything away for act three when we sort of 453 00:26:10,240 --> 00:26:13,880 Speaker 1: rain down our judgment upon this stuff.
But like even 454 00:26:13,880 --> 00:26:15,760 Speaker 1: when I was first going through this, I was like, 455 00:26:16,560 --> 00:26:19,000 Speaker 1: I don't even know, there are so many variables to account 456 00:26:19,040 --> 00:26:21,400 Speaker 1: for in these experiments that there's no way they're accounting 457 00:26:21,440 --> 00:26:24,720 Speaker 1: for them. Like the taking the stairs or taking the elevator, 458 00:26:25,240 --> 00:26:27,639 Speaker 1: like who was tired that day, who had a bum 459 00:26:27,680 --> 00:26:31,280 Speaker 1: ankle, or who, you know, there's just so many different 460 00:26:31,359 --> 00:26:33,639 Speaker 1: variables to account for on why someone would take the 461 00:26:33,640 --> 00:26:37,520 Speaker 1: stairs or an elevator, or why somebody would walk slower 462 00:26:37,520 --> 00:26:39,600 Speaker 1: down the hall. Maybe they were super bored by this 463 00:26:39,720 --> 00:26:42,480 Speaker 1: experiment and so they walked a little more sluggish or 464 00:26:42,480 --> 00:26:45,719 Speaker 1: something like that. And those types of variables, they 465 00:26:45,720 --> 00:26:47,840 Speaker 1: weren't accounting for, ever, it seems like. 466 00:26:48,480 --> 00:26:51,800 Speaker 2: I think we can just go ahead and reveal now. 467 00:26:52,119 --> 00:26:52,600 Speaker 2: What do you think? 468 00:26:53,080 --> 00:26:54,119 Speaker 1: Yeah? Sure, so. 469 00:26:54,400 --> 00:26:57,199 Speaker 2: I wish you had been a luminary in the field of 470 00:26:57,200 --> 00:27:00,520 Speaker 2: social psychology and priming research back in the late nineties 471 00:27:00,560 --> 00:27:02,480 Speaker 2: or early two thousands, because you could have derailed this 472 00:27:02,560 --> 00:27:05,800 Speaker 2: whole thing before it ever got started. Because if that 473 00:27:05,840 --> 00:27:09,200 Speaker 2: seemed ridiculous to you, that idea that you could suggest 474 00:27:10,040 --> 00:27:13,520 Speaker 2: old-related or age-related words to twenty-year-olds 475 00:27:13,600 --> 00:27:15,840 Speaker 2: and they're going to walk slower because they were just 476 00:27:15,880 --> 00:27:19,840 Speaker 2: thinking about being elderly, yeah, if that seems ridiculous to you, 477 00:27:19,840 --> 00:27:23,520 Speaker 2: you are one hundred percent right. Okay, it's a ridiculous study, 478 00:27:24,000 --> 00:27:28,320 Speaker 2: and it's ridiculous that the entire field of social psychology, economics, 479 00:27:28,640 --> 00:27:32,960 Speaker 2: politics paid attention to this and went all in 480 00:27:33,080 --> 00:27:35,800 Speaker 2: on it. But what we have, what we're actually talking 481 00:27:35,800 --> 00:27:39,600 Speaker 2: about today, is one of the biggest black eyes in 482 00:27:39,640 --> 00:27:44,400 Speaker 2: the history of psychology that didn't involve torturing human beings. 483 00:27:44,640 --> 00:27:47,399 Speaker 1: Yeah, but you know, let's tick through a few of 484 00:27:47,440 --> 00:27:50,200 Speaker 1: these that you found kind of quickly, because I think 485 00:27:50,240 --> 00:27:54,000 Speaker 1: it just illustrates, though, how a company or a 486 00:27:54,040 --> 00:27:56,800 Speaker 1: political party would really latch onto this when they see 487 00:27:56,800 --> 00:27:59,520 Speaker 1: stuff like this, right, without kind of like critically thinking 488 00:27:59,520 --> 00:28:03,200 Speaker 1: on how they got there.
One study: exposure to fishy 489 00:28:03,240 --> 00:28:07,919 Speaker 1: smells would induce suspicion in trust-based economic exchanges in 490 00:28:07,920 --> 00:28:11,040 Speaker 1: a trust game. In other words, like, they smelled something fishy, 491 00:28:11,160 --> 00:28:13,480 Speaker 1: so they're going to carry that over, as in, something 492 00:28:13,480 --> 00:28:14,199 Speaker 1: smells fishy. 493 00:28:15,320 --> 00:28:18,439 Speaker 2: Can you imagine reading that in a scholarly journal 494 00:28:18,440 --> 00:28:20,520 Speaker 2: and being like, man, that's really crazy. 495 00:28:20,600 --> 00:28:22,639 Speaker 1: I would say this paper smells like tuna. 496 00:28:25,040 --> 00:28:28,400 Speaker 2: There's another one. Remember power poses? I specifically remember John 497 00:28:28,480 --> 00:28:32,120 Speaker 2: Hodgman realizing that I was nervous backstage at the Bellhouse 498 00:28:32,119 --> 00:28:35,000 Speaker 2: once and telling me to do a power pose. Oh really? 499 00:28:35,040 --> 00:28:36,800 Speaker 2: To get over my stage fright. And I was like, 500 00:28:36,840 --> 00:28:39,400 Speaker 2: this isn't working. The reason it doesn't work is because 501 00:28:39,440 --> 00:28:41,920 Speaker 2: that was a finding from priming. 502 00:28:43,040 --> 00:28:46,760 Speaker 1: Oh man, did you text him and say 503 00:28:47,040 --> 00:28:48,040 Speaker 1: you're full of crap? 504 00:28:48,640 --> 00:28:52,960 Speaker 2: No, I'm gonna let him hear this episode. Okay, there's 505 00:28:53,000 --> 00:28:55,760 Speaker 2: another one. If you make a frownie face and you're 506 00:28:55,760 --> 00:28:58,800 Speaker 2: shown upsetting pictures, you will self-report, that should be 507 00:28:58,840 --> 00:29:02,400 Speaker 2: a red flag in and of itself, that you were 508 00:29:02,560 --> 00:29:07,880 Speaker 2: upset by pictures of starving children, people arguing, accident victims 509 00:29:07,880 --> 00:29:11,720 Speaker 2: that had been maimed, more than people who weren't making 510 00:29:11,720 --> 00:29:14,200 Speaker 2: a frownie face at the time they saw the pictures. 511 00:29:14,520 --> 00:29:19,320 Speaker 1: Yeah, here's one that's interesting to me. Money-primed people 512 00:29:19,360 --> 00:29:22,320 Speaker 1: are more selfish. So if you're, I guess they did 513 00:29:22,360 --> 00:29:25,120 Speaker 1: this experiment with people who made a lot of money 514 00:29:25,120 --> 00:29:26,280 Speaker 1: and people who didn't. 515 00:29:27,200 --> 00:29:30,760 Speaker 2: I think it was they were winning money in like games. 516 00:29:30,960 --> 00:29:33,520 Speaker 1: Oh okay, okay. But the long and short of it is, 517 00:29:34,040 --> 00:29:36,200 Speaker 1: if you were one of the money people, quote unquote, 518 00:29:36,760 --> 00:29:39,680 Speaker 1: then you would not, if someone spilled their pencils, like 519 00:29:39,680 --> 00:29:41,640 Speaker 1: one of the researchers, like, oh look at me, I 520 00:29:41,640 --> 00:29:44,160 Speaker 1: spilled all these pencils, they wouldn't pick up as many 521 00:29:44,200 --> 00:29:46,720 Speaker 1: pencils as someone who didn't have the money. So people 522 00:29:46,720 --> 00:29:51,280 Speaker 1: without money are kinder. I can see where that's going 523 00:29:51,320 --> 00:29:53,480 Speaker 1: in a way, but there are also just so many 524 00:29:53,560 --> 00:29:55,200 Speaker 1: variables in that, you know.
525 00:29:55,160 --> 00:29:59,480 Speaker 2: Right. And also, if your study is supporting just 526 00:29:59,520 --> 00:30:02,000 Speaker 2: a general moral judgment against 527 00:30:01,600 --> 00:30:04,120 Speaker 1: A certain group, yeah. 528 00:30:03,440 --> 00:30:06,160 Speaker 2: It may have been biased in and of itself. Right, 529 00:30:06,200 --> 00:30:08,880 Speaker 2: it's a good point. There's another one. I love this. 530 00:30:08,960 --> 00:30:10,160 Speaker 1: At least the last two are great. 531 00:30:10,480 --> 00:30:14,600 Speaker 2: If you think about stabbing a coworker in the back metaphorically, 532 00:30:15,360 --> 00:30:19,080 Speaker 2: you are more inclined when given a choice to buy soap, detergent, 533 00:30:19,200 --> 00:30:22,240 Speaker 2: or disinfectant than you are to buy batteries, juice, or 534 00:30:22,280 --> 00:30:23,440 Speaker 2: candy bars. 535 00:30:25,160 --> 00:30:26,560 Speaker 1: Out, damned spot. 536 00:30:26,680 --> 00:30:29,200 Speaker 2: Yes, exactly. Why don't you give them the last one, Chuck? 537 00:30:30,440 --> 00:30:33,920 Speaker 1: Okay, if you were in this experiment, you were induced 538 00:30:33,920 --> 00:30:37,480 Speaker 1: to lie to an imaginary person, either in an email 539 00:30:37,520 --> 00:30:39,560 Speaker 1: or a phone call. And then, in a test that 540 00:30:39,600 --> 00:30:44,280 Speaker 1: followed that of the desirability of different products, people who 541 00:30:44,320 --> 00:30:47,280 Speaker 1: lied on the telephone, they 542 00:30:47,280 --> 00:30:52,160 Speaker 1: actually said the lie out loud, preferred mouthwash over soap, 543 00:30:52,240 --> 00:30:54,120 Speaker 1: and people who typed it out and lied in the 544 00:30:54,120 --> 00:30:56,640 Speaker 1: email preferred soap to mouthwash. 545 00:30:56,840 --> 00:31:00,280 Speaker 2: Yeah, the first group preferred mouthwash to soap, unless 546 00:31:00,280 --> 00:31:05,960 Speaker 2: the soap was Lifebuoy. Right. So, okay, yes, we should 547 00:31:05,960 --> 00:31:08,880 Speaker 2: probably rein it back a little bit. Yeah, because our 548 00:31:09,000 --> 00:31:11,640 Speaker 2: bias is showing. But for good reason. I mean, we 549 00:31:11,680 --> 00:31:14,840 Speaker 2: should say priming is not just this point of ridicule. It's 550 00:31:15,680 --> 00:31:19,960 Speaker 2: just, the bottom completely fell out of this really hot, 551 00:31:20,120 --> 00:31:24,120 Speaker 2: super sexy field of research that everyone had bought into, 552 00:31:24,880 --> 00:31:28,440 Speaker 2: like a mudslide going down a mountain. Like, it 553 00:31:28,720 --> 00:31:32,920 Speaker 2: just erupted. It went so south so fast 554 00:31:33,280 --> 00:31:37,760 Speaker 2: that today, about almost fifteen years on, since then, 555 00:31:37,920 --> 00:31:42,400 Speaker 2: everybody was like, this is all made up. It's essentially 556 00:31:42,480 --> 00:31:45,160 Speaker 2: a discredited field. Like there's almost no one working in 557 00:31:45,200 --> 00:31:47,920 Speaker 2: this anymore, because most people are like, this isn't true. 558 00:31:50,040 --> 00:31:52,160 Speaker 1: Yeah, but that's not to say that, like, people 559 00:31:52,160 --> 00:31:55,680 Speaker 1: didn't get a lot of like legitimate recognition for this stuff.
560 00:31:55,720 --> 00:31:59,040 Speaker 1: Like there were people who won Nobel Prizes who worked on 561 00:31:59,080 --> 00:32:02,000 Speaker 1: this stuff and wrote best-selling books, who worked on this 562 00:32:02,040 --> 00:32:05,240 Speaker 1: stuff and made millions of dollars, like speaking to corporations 563 00:32:05,920 --> 00:32:08,520 Speaker 1: about how they can better, you know, take advantage of 564 00:32:08,520 --> 00:32:12,360 Speaker 1: their consumers. So it was swallowed hook, line, 565 00:32:12,400 --> 00:32:12,920 Speaker 1: and sinker. 566 00:32:13,400 --> 00:32:16,520 Speaker 2: So let's give you an example. Daniel Kahneman. He was 567 00:32:16,560 --> 00:32:22,120 Speaker 2: already a Nobel Prize-winning economist. He wrote 568 00:32:22,160 --> 00:32:25,640 Speaker 2: Thinking, Fast and Slow, and it was essentially an introduction 569 00:32:26,200 --> 00:32:32,760 Speaker 2: to nudge economics and priming for the average person. And 570 00:32:32,800 --> 00:32:35,680 Speaker 2: it was a huge bestseller. I mean, everybody was reading 571 00:32:35,760 --> 00:32:37,280 Speaker 2: that book back in the day. I think it was 572 00:32:37,360 --> 00:32:39,280 Speaker 2: two thousand and two. 573 00:32:40,000 --> 00:32:41,080 Speaker 1: Okay, yeah. 574 00:32:41,880 --> 00:32:47,760 Speaker 2: In this book he says, this is a quote: Disbelief 575 00:32:47,880 --> 00:32:50,720 Speaker 2: is not an option. The results are not made up, 576 00:32:51,000 --> 00:32:54,120 Speaker 2: nor are they statistical flukes. You have no choice but 577 00:32:54,200 --> 00:32:57,400 Speaker 2: to accept that the major conclusions of these studies are true. 578 00:32:58,120 --> 00:33:00,680 Speaker 2: You have no choice but to say that this is true. 579 00:33:00,720 --> 00:33:03,600 Speaker 2: So just get on board. This was like a Nobel 580 00:33:03,640 --> 00:33:07,400 Speaker 2: Prize-winning economist, who was a psychologist himself, who wrote 581 00:33:07,440 --> 00:33:09,600 Speaker 2: that to the rest of the world, saying like, don't 582 00:33:09,600 --> 00:33:12,600 Speaker 2: even question priming. This is true. Let's figure out how 583 00:33:12,640 --> 00:33:14,960 Speaker 2: to use it to persuade people to do what we want. 584 00:33:15,400 --> 00:33:18,480 Speaker 1: Yeah, there was another guy. We talked about the nudge earlier. 585 00:33:19,080 --> 00:33:21,680 Speaker 1: In two thousand and eight, another Nobel-winning economist named 586 00:33:21,720 --> 00:33:26,360 Speaker 1: Richard Thaler co-wrote a bestseller called Nudge: Improving 587 00:33:26,400 --> 00:33:29,360 Speaker 1: Decisions About Health, Wealth, and Happiness. And in that book 588 00:33:29,400 --> 00:33:34,400 Speaker 1: he talked about social priming experiments, benign paternalism, or that 589 00:33:34,520 --> 00:33:36,640 Speaker 1: nudge that, you know, you kind of mentioned earlier, towards 590 00:33:36,760 --> 00:33:40,200 Speaker 1: making a better decision. And then there's this other economist, 591 00:33:40,240 --> 00:33:43,800 Speaker 1: they're all economists, Daniel, what's his name, Ariely? 592 00:33:44,320 --> 00:33:46,480 Speaker 2: I think so, yeah. 593 00:33:46,360 --> 00:33:49,920 Speaker 1: And he tested a lot of these supposed nudges.
In one experiment, 594 00:33:50,000 --> 00:33:53,800 Speaker 1: he had students grade their own tests, but beforehand 595 00:33:53,960 --> 00:33:57,600 Speaker 1: had half of them write out as many of the 596 00:33:57,600 --> 00:34:00,440 Speaker 1: Ten Commandments as they could remember. And what he found 597 00:34:00,560 --> 00:34:03,680 Speaker 1: was that those students were less likely to cheat on 598 00:34:03,760 --> 00:34:07,200 Speaker 1: their self-graded tests because they had just been primed 599 00:34:07,600 --> 00:34:09,319 Speaker 1: to sort of think of like, hey, what's the right 600 00:34:09,360 --> 00:34:09,799 Speaker 1: thing to do? 601 00:34:10,520 --> 00:34:14,160 Speaker 2: Totally makes sense. Why even question it? He worked with 602 00:34:14,400 --> 00:34:18,840 Speaker 2: a car insurance company for another study. Car insurance companies 603 00:34:18,880 --> 00:34:21,680 Speaker 2: will sometimes have you say how many miles you've driven 604 00:34:21,800 --> 00:34:26,200 Speaker 2: in a year, and you can just lie, and your 605 00:34:26,239 --> 00:34:28,560 Speaker 2: insurance rates will be affected by whether you lie or not. 606 00:34:28,840 --> 00:34:30,520 Speaker 2: So to find out if you could make people be 607 00:34:30,640 --> 00:34:35,560 Speaker 2: more honest about that, Ariely introduced an honesty pledge that 608 00:34:35,600 --> 00:34:38,520 Speaker 2: the person would sign at the top of their insurance 609 00:34:38,920 --> 00:34:42,839 Speaker 2: agreement or contract saying I won't fudge these numbers. And 610 00:34:42,960 --> 00:34:46,920 Speaker 2: those people fudged their numbers less than people who didn't 611 00:34:46,960 --> 00:34:50,640 Speaker 2: have that pledge to sign. So if you prompt people to 612 00:34:50,719 --> 00:34:54,360 Speaker 2: be honest, whether it's the Ten Commandments or pledging honesty 613 00:34:54,480 --> 00:34:57,399 Speaker 2: or whatever, they're going to be more honest. And that's 614 00:34:57,400 --> 00:35:01,080 Speaker 2: a perfect example of the nudge economics that just swept 615 00:35:01,120 --> 00:35:04,200 Speaker 2: the world at the turn of the two thousands into 616 00:35:04,280 --> 00:35:05,360 Speaker 2: the twenty tens. 617 00:35:05,640 --> 00:35:08,120 Speaker 1: Yeah, I wonder if they asked any of the participants 618 00:35:08,120 --> 00:35:09,960 Speaker 1: coming in, are you. 619 00:35:09,960 --> 00:35:13,279 Speaker 2: Honest generally or not? Self-report? 620 00:35:15,840 --> 00:35:17,920 Speaker 1: Maybe we should take a break here. Yeah, it's a 621 00:35:17,920 --> 00:35:19,640 Speaker 1: good time for a break, and we'll come back and 622 00:35:19,680 --> 00:35:22,239 Speaker 1: sort of talk about the big problem with all this, 623 00:35:22,440 --> 00:35:25,920 Speaker 1: which is, you know, scientifically speaking, the replication of these 624 00:35:25,920 --> 00:35:26,920 Speaker 1: studies, right after this. 625 00:36:16,640 --> 00:36:19,680 Speaker 2: Okay, Chuck, So we've talked about this before, the replication 626 00:36:19,800 --> 00:36:24,880 Speaker 2: crisis in science. I think it's even in the physical sciences. 627 00:36:24,920 --> 00:36:29,040 Speaker 2: It's all over, where people have had these landmark papers, 628 00:36:29,080 --> 00:36:32,439 Speaker 2: groundbreaking papers that have changed the world, changed how people act, 629 00:36:32,840 --> 00:36:37,200 Speaker 2: changed where research dollars go.
When somebody finally gets around 630 00:36:37,520 --> 00:36:40,520 Speaker 2: to trying to replicate that exact same experiment, they come 631 00:36:40,600 --> 00:36:44,080 Speaker 2: up with different results, often negative results. They can't replicate 632 00:36:44,120 --> 00:36:47,120 Speaker 2: the results that that study found. And like I said, 633 00:36:47,200 --> 00:36:50,600 Speaker 2: that's throughout science; this is a big problem. But in 634 00:36:50,640 --> 00:36:56,440 Speaker 2: psychology, and more specifically social psychology, the replication crisis is 635 00:36:56,680 --> 00:37:02,000 Speaker 2: shaking the foundations of the field. And as to the reason why, 636 00:37:02,200 --> 00:37:05,319 Speaker 2: let's do a little thought experiment here. Imagine that you're 637 00:37:05,320 --> 00:37:09,920 Speaker 2: a social psychology researcher, okay, you're at the university and 638 00:37:09,960 --> 00:37:12,960 Speaker 2: you've come up with an experiment, the 639 00:37:13,080 --> 00:37:17,920 Speaker 2: results of which you believe will be used by corporations, political campaigns, 640 00:37:18,360 --> 00:37:22,200 Speaker 2: all over the place to persuade everyday people to change 641 00:37:22,239 --> 00:37:26,400 Speaker 2: their behavior, which will shape the world for years to come. 642 00:37:27,239 --> 00:37:30,359 Speaker 2: But you don't really feel like leaving your office that day, 643 00:37:30,640 --> 00:37:32,920 Speaker 2: so you just bring in a bunch of students, maybe 644 00:37:32,960 --> 00:37:36,359 Speaker 2: fifteen of them, experiment on them, and then extrapolate those 645 00:37:36,400 --> 00:37:41,000 Speaker 2: findings to a human universality, and then it gets packaged 646 00:37:41,200 --> 00:37:45,280 Speaker 2: and exported to corporations and political campaigns. That, in a nutshell, 647 00:37:45,320 --> 00:37:46,440 Speaker 2: is social psychology. 648 00:37:46,760 --> 00:37:49,759 Speaker 1: Yeah, I mean, that's what happened. We have some specific 649 00:37:49,800 --> 00:37:53,960 Speaker 1: examples of like real malfeasance that happened in these studies. 650 00:37:54,600 --> 00:37:57,719 Speaker 1: In some cases it's not even just like, oh, you know, 651 00:37:58,280 --> 00:38:00,879 Speaker 1: they didn't account for all the variables; in some 652 00:38:00,920 --> 00:38:04,680 Speaker 1: cases they massaged data. That Daniel Ariely guy that we 653 00:38:04,680 --> 00:38:09,400 Speaker 1: were talking about, yeah, and his colleague Francesca Gino, they 654 00:38:09,400 --> 00:38:12,480 Speaker 1: were accused of massaging the data just to fit their hypothesis. 655 00:38:13,040 --> 00:38:16,320 Speaker 1: Turns out that car insurance company came back and said 656 00:38:16,880 --> 00:38:20,000 Speaker 1: that data doesn't even match what we sent you and 657 00:38:20,040 --> 00:38:24,000 Speaker 1: what you published in that honesty pledge situation. Yeah, there 658 00:38:24,040 --> 00:38:29,360 Speaker 1: was another guy, a Dutch guy named Diederik Stapel. He 659 00:38:29,520 --> 00:38:32,400 Speaker 1: was a, sort of, big name in this field. 660 00:38:33,120 --> 00:38:35,240 Speaker 1: He fabricated studies out of whole cloth. 661 00:38:36,120 --> 00:38:39,160 Speaker 2: Yeah, for sure. So it's bad for psychology, bad for 662 00:38:39,320 --> 00:38:43,400 Speaker 2: social psychology.
But for priming research, it 663 00:38:43,760 --> 00:38:46,279 Speaker 2: must have just felt like a bloodbath day 664 00:38:46,320 --> 00:38:49,160 Speaker 2: after day if you were in the field, just bad 665 00:38:49,239 --> 00:38:52,960 Speaker 2: news after bad news. It got so bad that ten 666 00:38:53,040 --> 00:38:56,160 Speaker 2: years after he wrote Thinking, Fast and Slow, Daniel Kahneman 667 00:38:56,520 --> 00:38:59,160 Speaker 2: wrote an open letter, but it was actually directed to 668 00:38:59,480 --> 00:39:03,719 Speaker 2: John Bargh, saying something like, this is really bad, that 669 00:39:05,320 --> 00:39:08,160 Speaker 2: priming research is the poster child for doubts about the 670 00:39:08,160 --> 00:39:11,719 Speaker 2: integrity of psychological research. And he warned of it becoming a train 671 00:39:11,800 --> 00:39:15,280 Speaker 2: wreck, and was basically saying like, hey man, you guys, 672 00:39:15,320 --> 00:39:17,520 Speaker 2: I'm just a believer. I'm just a fan. You guys 673 00:39:17,560 --> 00:39:21,120 Speaker 2: screwed this up. You better fix it. And I actually 674 00:39:21,160 --> 00:39:24,840 Speaker 2: found a really great website called the Replicability-Index. 675 00:39:24,880 --> 00:39:29,560 Speaker 2: It's a blog by Ulrich Schimmack at the University of Toronto, 676 00:39:30,200 --> 00:39:33,960 Speaker 2: and they have something called the Reconstruction of a 677 00:39:34,120 --> 00:39:37,040 Speaker 2: Train Wreck, where they go through and pick out like 678 00:39:37,280 --> 00:39:41,200 Speaker 2: how Kahneman sold this to the public and misrepresented it 679 00:39:41,280 --> 00:39:44,959 Speaker 2: all sorts of different ways himself. So he's passed on, RIP. 680 00:39:45,200 --> 00:39:47,319 Speaker 2: I don't like to speak ill of the dead. And 681 00:39:47,360 --> 00:39:50,520 Speaker 2: everything leading up to that, from what I understand, was 682 00:39:50,600 --> 00:39:54,000 Speaker 2: a great career, but he just hitched his wagon to the 683 00:39:54,040 --> 00:39:57,240 Speaker 2: wrong thing and then tried to distance himself from it as 684 00:39:57,480 --> 00:39:58,520 Speaker 2: much as he could. 685 00:39:58,840 --> 00:40:01,560 Speaker 1: Yeah, and you know, the community that worked in that 686 00:40:01,600 --> 00:40:04,120 Speaker 1: field certainly woke up, and they weren't just like, this 687 00:40:04,160 --> 00:40:05,560 Speaker 1: is no big deal. They were like, all right, this 688 00:40:05,600 --> 00:40:08,919 Speaker 1: is a real problem. So they have tried since 689 00:40:08,960 --> 00:40:10,440 Speaker 1: then to clean things up a bit and 690 00:40:10,480 --> 00:40:15,640 Speaker 1: address what they call QRPs, or questionable research practices, of 691 00:40:15,680 --> 00:40:17,799 Speaker 1: which there are many. We're going to talk about some 692 00:40:17,840 --> 00:40:21,520 Speaker 1: of them right now, but one, obviously, you sort of 693 00:40:21,560 --> 00:40:24,160 Speaker 1: mentioned this, was that they're just really small studies. You can't 694 00:40:24,160 --> 00:40:27,480 Speaker 1: make these big, huge conclusions about human behavior when you 695 00:40:27,520 --> 00:40:31,040 Speaker 1: studied twelve, you know, college freshmen, you know, on a 696 00:40:31,080 --> 00:40:36,160 Speaker 1: Saturday morning. As part of the open science movement, and 697 00:40:36,200 --> 00:40:38,200 Speaker 1: this is pretty cool,
I didn't know about this, but 698 00:40:38,400 --> 00:40:42,720 Speaker 1: researchers are now encouraged to register their studies ahead of time, 699 00:40:42,920 --> 00:40:46,600 Speaker 1: including what their hypothesis is before they collect the data, 700 00:40:47,440 --> 00:40:51,000 Speaker 1: and that'll keep them from what's called HARKing, hypothesizing after 701 00:40:51,200 --> 00:40:54,000 Speaker 1: results are known. So basically like, hey, here's what I 702 00:40:54,040 --> 00:40:55,920 Speaker 1: think is going to happen. Now, let's do the study. 703 00:40:56,000 --> 00:40:59,600 Speaker 1: But it's officially registered, so I can't sort of pick 704 00:40:59,640 --> 00:41:01,520 Speaker 1: and choose what I look at. 705 00:41:02,080 --> 00:41:05,640 Speaker 2: Right, there's another big problem called p-hacking, which is 706 00:41:06,440 --> 00:41:10,319 Speaker 2: taking data and then making it work statistically so that 707 00:41:10,440 --> 00:41:15,440 Speaker 2: these random flukes suddenly become statistically significant. Yeah, that's a 708 00:41:15,440 --> 00:41:20,080 Speaker 2: big problem. And I read that it's not so 709 00:41:20,239 --> 00:41:24,440 Speaker 2: much that researchers were sitting there purposely massaging their data 710 00:41:24,480 --> 00:41:27,080 Speaker 2: over and over and over again to tease out some 711 00:41:27,120 --> 00:41:30,319 Speaker 2: results that they could publish, but it was more like 712 00:41:30,320 --> 00:41:35,880 Speaker 2: they were just falling for flukes that seemed more significant than 713 00:41:35,960 --> 00:41:38,520 Speaker 2: they were. That's what I read, that that was really 714 00:41:38,520 --> 00:41:40,760 Speaker 2: the big problem, that it wasn't like an entire field 715 00:41:40,760 --> 00:41:41,600 Speaker 2: of bad actors. 716 00:41:41,680 --> 00:41:44,719 Speaker 1: Yeah, I mean I sort of get it in a way. 717 00:41:44,719 --> 00:41:48,040 Speaker 1: And we've talked about this in our Scientific Method show 718 00:41:48,080 --> 00:41:50,719 Speaker 1: and other episodes too, where you know, you put in all 719 00:41:50,719 --> 00:41:52,480 Speaker 1: this time and you 720 00:41:52,520 --> 00:41:55,040 Speaker 1: want your thing that you think is true to pan out. 721 00:41:55,120 --> 00:41:58,520 Speaker 1: So I get the inclination, but you can't 722 00:41:58,520 --> 00:42:00,640 Speaker 1: fudge numbers, you can't look at stuff that just backs 723 00:42:00,719 --> 00:42:02,959 Speaker 1: up your conclusion. You can't throw stuff in the file 724 00:42:03,040 --> 00:42:05,600 Speaker 1: drawer that doesn't, and this is one of 725 00:42:05,640 --> 00:42:07,320 Speaker 1: the things they talked about, the file drawer problem, like you've 726 00:42:07,400 --> 00:42:11,319 Speaker 1: got to make all this stuff public. And part of 727 00:42:11,360 --> 00:42:13,200 Speaker 1: the problem is the media, because they want to write 728 00:42:13,200 --> 00:42:17,000 Speaker 1: about something splashy and super interesting, right, and so like 729 00:42:17,040 --> 00:42:18,799 Speaker 1: there's a lot of things at play here that go 730 00:42:18,840 --> 00:42:20,880 Speaker 1: into why somebody would do this beyond just being like, 731 00:42:20,960 --> 00:42:22,120 Speaker 1: you know, you're a bad person.
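To make that p-hacking point concrete, here is a minimal sketch, not taken from the episode or from any of the studies discussed, of how testing many outcomes on a tiny sample can manufacture a "significant" result even when a prime does nothing at all. The group sizes, the number of outcome measures, and every number below are assumptions chosen purely for illustration.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

def one_study(n_per_group=15, n_outcomes=20, alpha=0.05):
    """Return True if at least one of many outcomes looks 'significant'
    even though the prime has no real effect on any of them."""
    for _ in range(n_outcomes):
        primed = rng.normal(0, 1, n_per_group)    # primed group, no true effect
        control = rng.normal(0, 1, n_per_group)   # control group
        _, p = ttest_ind(primed, control)
        if p < alpha:
            return True                           # a "publishable" fluke was found
    return False

n_studies = 2000
false_positive_rate = sum(one_study() for _ in range(n_studies)) / n_studies
print(f"Chance of a 'significant' finding with no real effect: {false_positive_rate:.0%}")
# Roughly 60-65%, far above the nominal 5% threshold.

With twenty shots at a five percent threshold, the simulated researcher stumbles onto at least one fluke in well over half of these no-effect studies, which is exactly why preregistering a single hypothesis ahead of time, as described above, takes the temptation off the table.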
732 00:42:22,560 --> 00:42:25,560 Speaker 2: Yeah, and so there's something called publish or perish, like you 733 00:42:25,680 --> 00:42:28,560 Speaker 2: basically are advancing your career if you get published in 734 00:42:28,600 --> 00:42:32,200 Speaker 2: an academic journal. The problem is academic journals are like 735 00:42:32,239 --> 00:42:34,480 Speaker 2: the media. They want it to be splashy and sexy, 736 00:42:34,640 --> 00:42:37,759 Speaker 2: so they don't really publish negative results anyway. So even 737 00:42:37,800 --> 00:42:40,280 Speaker 2: if you wanted to, you'd have a hard time getting 738 00:42:40,280 --> 00:42:43,560 Speaker 2: it published in a legitimate journal these days. So that's 739 00:42:43,560 --> 00:42:47,000 Speaker 2: a big problem. Ultimately, Chuck, the biggest problem, and this 740 00:42:47,080 --> 00:42:50,920 Speaker 2: is what really tripped up social psychology, it's the drum 741 00:42:50,960 --> 00:42:53,920 Speaker 2: that you've been beating this whole time. Humans are not 742 00:42:54,880 --> 00:42:58,759 Speaker 2: predictable computers who will respond in a predictable way if 743 00:42:58,760 --> 00:43:01,960 Speaker 2: you give them a specific stimulus. That's just not how 744 00:43:02,040 --> 00:43:02,879 Speaker 2: humans work. 745 00:43:02,920 --> 00:43:06,160 Speaker 1: And even the same person will react differently on a different 746 00:43:06,200 --> 00:43:09,799 Speaker 1: day to the same exact experiment, depending on a host 747 00:43:09,880 --> 00:43:10,480 Speaker 1: of factors. 748 00:43:11,040 --> 00:43:15,040 Speaker 2: Sure, so if the same person will respond differently, you'd better 749 00:43:15,040 --> 00:43:18,480 Speaker 2: believe different people will respond differently to the same stimulus. 750 00:43:18,840 --> 00:43:22,040 Speaker 2: And then it also depends on who's presenting the stimulus. 751 00:43:22,760 --> 00:43:26,640 Speaker 2: Is it being presented by a grad student who's posing 752 00:43:26,680 --> 00:43:29,000 Speaker 2: as one of the participants, or is it like a 753 00:43:29,040 --> 00:43:33,360 Speaker 2: professor wearing a white lab coat for some reason? That's 754 00:43:33,480 --> 00:43:36,680 Speaker 2: definitely going to shape the information or the stimulus that's 755 00:43:36,719 --> 00:43:41,160 Speaker 2: being received. And when you put all this together, it's 756 00:43:41,520 --> 00:43:48,160 Speaker 2: essentially impossible to replicate a priming study, and if you 757 00:43:48,239 --> 00:43:51,640 Speaker 2: get the same results, that's basically a statistical fluke, from 758 00:43:51,640 --> 00:43:52,440 Speaker 2: what I understand. 759 00:43:53,040 --> 00:43:56,000 Speaker 1: Yeah, I mean, we've talked about the Stanford prison experiment 760 00:43:56,040 --> 00:43:59,200 Speaker 1: a few times, and looking back, like especially when you 761 00:43:59,239 --> 00:44:02,520 Speaker 1: see the movie, talk about who's presenting the experiment, 762 00:44:02,560 --> 00:44:05,839 Speaker 1: it's like when those guards showed up. It's like, you guys, 763 00:44:05,920 --> 00:44:08,720 Speaker 1: you look ridiculous. You know, you look like an eighteen 764 00:44:08,800 --> 00:44:11,560 Speaker 1: year old who painted on a mustache and put on 765 00:44:11,600 --> 00:44:14,239 Speaker 1: some, you know, mirrored aviator sunglasses trying to be a 766 00:44:14,239 --> 00:44:14,759 Speaker 1: prison guard. 767 00:44:15,000 --> 00:44:17,600 Speaker 2: Yeah, and developed a southern accent.
That one always stuck 768 00:44:17,640 --> 00:44:18,040 Speaker 2: out to me. 769 00:44:18,239 --> 00:44:18,880 Speaker 1: Yeah, exactly. 770 00:44:19,480 --> 00:44:22,040 Speaker 2: So one reason why, I mean, when you're looking back, 771 00:44:22,280 --> 00:44:24,719 Speaker 2: we should say that there are still people, I 772 00:44:24,719 --> 00:44:27,680 Speaker 2: think I did say, working in this field earnestly, but 773 00:44:27,719 --> 00:44:30,080 Speaker 2: they're essentially going through and picking out what could be 774 00:44:30,800 --> 00:44:31,799 Speaker 2: salvaged from it. 775 00:44:31,840 --> 00:44:32,040 Speaker 1: Yeah. 776 00:44:32,080 --> 00:44:35,640 Speaker 2: What they're finding is that priming does work, but, 777 00:44:35,840 --> 00:44:38,240 Speaker 2: like we said, it's going to work differently for different people. 778 00:44:38,280 --> 00:44:42,239 Speaker 2: There are very few universalities, if any. But one thing that 779 00:44:42,280 --> 00:44:44,759 Speaker 2: they have figured out that is legit with social or 780 00:44:44,800 --> 00:44:48,440 Speaker 2: behavioral priming is that you can be primed most easily 781 00:44:48,480 --> 00:44:51,520 Speaker 2: and most reliably if it's pointing you in a direction 782 00:44:51,640 --> 00:44:54,279 Speaker 2: you already want to go. So let's say you want 783 00:44:54,280 --> 00:44:55,200 Speaker 2: to lose weight. 784 00:44:55,239 --> 00:44:55,880 Speaker 1: Uh huh. 785 00:44:55,920 --> 00:44:58,719 Speaker 2: If you're given a menu that has words like light 786 00:44:59,320 --> 00:45:02,000 Speaker 2: or diet on it or something like that, you're more 787 00:45:02,160 --> 00:45:05,920 Speaker 2: likely to choose those items than somebody who isn't interested 788 00:45:05,960 --> 00:45:10,520 Speaker 2: in dieting. Yeah, that's essentially what it got reduced back to, 789 00:45:10,640 --> 00:45:15,320 Speaker 2: which is just barely beyond cognitive priming. But it's legitimate, 790 00:45:15,320 --> 00:45:17,920 Speaker 2: and that's where they're starting out from. Again, that's the 791 00:45:17,960 --> 00:45:20,600 Speaker 2: current state of social priming. But just one more thing 792 00:45:20,640 --> 00:45:24,160 Speaker 2: I wanted to talk about, Chuck, is why everyone bought 793 00:45:24,200 --> 00:45:24,719 Speaker 2: into this. 794 00:45:26,280 --> 00:45:28,680 Speaker 1: Well, why do you think everyone bought into it? 795 00:45:28,880 --> 00:45:32,880 Speaker 2: I'll tell you why. Thank you for asking. Because it 796 00:45:32,920 --> 00:45:36,719 Speaker 2: reduces humans to an understandable, predictable state. 797 00:45:37,120 --> 00:45:39,640 Speaker 1: Yeah, which is very easy to market and sell to. 798 00:45:40,120 --> 00:45:43,719 Speaker 2: Yeah, and understand, not feel threatened by, but also to 799 00:45:43,800 --> 00:45:47,640 Speaker 2: feel superior to. I ran across one explanation called NPC 800 00:45:47,840 --> 00:45:51,800 Speaker 2: theory, non-player character theory, like referencing background characters in 801 00:45:51,920 --> 00:45:54,680 Speaker 2: video games who don't think for themselves. They're just kind 802 00:45:54,680 --> 00:46:00,560 Speaker 2: of automated, and something like social priming underscores that 803 00:46:00,640 --> 00:46:03,480 Speaker 2: idea that other people are like that. I'm not like that. 804 00:46:03,760 --> 00:46:06,479 Speaker 2: I can think for myself.
Nobody's gonna dupe me into 805 00:46:06,480 --> 00:46:09,920 Speaker 2: eating a cheeseburger or voting for them. Yeah. But other people, 806 00:46:10,000 --> 00:46:14,000 Speaker 2: that happens to, and that's what priming research supported, that 807 00:46:14,200 --> 00:46:17,359 Speaker 2: other people are non-player characters, which elevates your sense 808 00:46:17,400 --> 00:46:22,200 Speaker 2: of superiority and your sense of intelligence while also deflating 809 00:46:22,560 --> 00:46:24,719 Speaker 2: that of other people in your mind. 810 00:46:25,120 --> 00:46:27,480 Speaker 1: I have to say, though, we said the word cheeseburger 811 00:46:27,520 --> 00:46:29,719 Speaker 1: so many times I can't remember the last cheeseburger I had. 812 00:46:29,760 --> 00:46:30,400 Speaker 1: All I want right now. 813 00:46:30,440 --> 00:46:31,759 Speaker 2: Is a cheeseburger, is that right? 814 00:46:32,239 --> 00:46:33,200 Speaker 1: Yeah? 815 00:46:33,280 --> 00:46:35,239 Speaker 2: Okay, well we'll get you one. What kind do you want? 816 00:46:36,520 --> 00:46:40,200 Speaker 1: Well, you know, Dave said something about nasty McDonald's cheeseburgers, 817 00:46:40,200 --> 00:46:42,600 Speaker 1: and I was like, what are you talking about? 818 00:46:42,680 --> 00:46:45,880 Speaker 2: McDonald's? It's a classic, it's the best of the best. 819 00:46:46,360 --> 00:46:48,839 Speaker 1: Yeah. Well, you know, they may not like it. Maybe 820 00:46:48,840 --> 00:46:49,919 Speaker 1: he's a Hardee's guy. 821 00:46:50,280 --> 00:46:52,560 Speaker 2: Maybe he's one of those guys who eats his cheeseburgers 822 00:46:52,560 --> 00:46:53,520 Speaker 2: on a brioche bun. 823 00:46:54,120 --> 00:46:57,800 Speaker 1: Oh, an egg roll? Uh? 824 00:46:57,920 --> 00:46:58,680 Speaker 2: You got anything else? 825 00:46:58,960 --> 00:47:01,279 Speaker 1: I got nothing else. These are always fun. I like 826 00:47:01,320 --> 00:47:02,439 Speaker 1: these, they are. 827 00:47:02,440 --> 00:47:05,319 Speaker 2: We don't very frequently tee off on something, but it 828 00:47:05,360 --> 00:47:07,400 Speaker 2: does feel good when we do once in a while. 829 00:47:09,239 --> 00:47:12,319 Speaker 2: Chuck just said yeah, and as everybody who's ever listened 830 00:47:12,360 --> 00:47:15,279 Speaker 2: to the podcast knows, he just unlocked listener mail. 831 00:47:18,600 --> 00:47:20,879 Speaker 1: Hey guys, I listen to your wonderful podcast about three 832 00:47:20,880 --> 00:47:22,359 Speaker 1: hours a day on my way to and from work. 833 00:47:22,360 --> 00:47:24,840 Speaker 1: I want to say thank you for all of that 834 00:47:24,880 --> 00:47:27,400 Speaker 1: over the years. I had a very funny incident the 835 00:47:27,480 --> 00:47:29,759 Speaker 1: other day listening to the Sea Monkeys episode. I was 836 00:47:29,800 --> 00:47:32,200 Speaker 1: coming home after a long day and that episode came on. 837 00:47:32,280 --> 00:47:35,520 Speaker 1: I was really tired, and when I originally fell asleep 838 00:47:35,920 --> 00:47:38,520 Speaker 1: against the window, I take it the person isn't driving. 839 00:47:40,480 --> 00:47:42,319 Speaker 1: When I originally fell asleep, I was listening to the 840 00:47:42,360 --> 00:47:44,439 Speaker 1: history of sea monkeys. To my surprise, though, I woke 841 00:47:44,520 --> 00:47:49,080 Speaker 1: up and heard White Supremacists and the KKK.
I panicked 842 00:47:49,160 --> 00:47:50,719 Speaker 1: and jumped out of my seat, thinking I had missed 843 00:47:50,719 --> 00:47:53,040 Speaker 1: my stop by a long shot and I had finished 844 00:47:53,040 --> 00:47:55,920 Speaker 1: an episode and started a completely different one. I was 845 00:47:55,960 --> 00:47:58,359 Speaker 1: not expecting the sea monkey episode to take that turn, 846 00:47:58,400 --> 00:48:00,520 Speaker 1: you guys. I was relieved to find out that not 847 00:48:00,600 --> 00:48:02,520 Speaker 1: much time had passed and I didn't miss my stop, 848 00:48:02,800 --> 00:48:04,680 Speaker 1: but also a little distressed to find out the dark 849 00:48:04,760 --> 00:48:08,120 Speaker 1: history of such an innocent children's product. Yeah. I live 850 00:48:08,120 --> 00:48:10,600 Speaker 1: in Istanbul as a foreigner, and your podcast gives me 851 00:48:10,600 --> 00:48:12,879 Speaker 1: a little taste of home. It's comforting on those days 852 00:48:12,880 --> 00:48:15,200 Speaker 1: when I want to tune in to two smart guys talking 853 00:48:15,239 --> 00:48:18,520 Speaker 1: about something interesting. Really appreciate you for all the work 854 00:48:18,560 --> 00:48:20,040 Speaker 1: you do and all the fun moments you've given me 855 00:48:20,040 --> 00:48:24,400 Speaker 1: on the bus. And that is from Katie Sesenler. 856 00:48:25,160 --> 00:48:28,480 Speaker 2: Katie knows how to speak our language, doesn't she? Katie 857 00:48:28,520 --> 00:48:31,760 Speaker 2: really knows, she knows how to flatter. Thanks a lot, Katie, 858 00:48:31,880 --> 00:48:34,560 Speaker 2: have fun in Istanbul. That's very exciting and thrilling, and 859 00:48:34,600 --> 00:48:37,080 Speaker 2: I'm glad you did not miss your stop. If you 860 00:48:37,120 --> 00:48:39,040 Speaker 2: want to be like Katie and let us know where 861 00:48:39,080 --> 00:48:41,920 Speaker 2: you live and some funny story about Stuff You Should Know, 862 00:48:42,200 --> 00:48:44,960 Speaker 2: we love that kind of thing. You can wrap it up, 863 00:48:45,160 --> 00:48:47,040 Speaker 2: spank it on the bottom, and send it off to 864 00:48:47,280 --> 00:48:53,160 Speaker 2: Stuff podcast at iHeartRadio dot com. 865 00:48:53,280 --> 00:48:56,160 Speaker 1: Stuff You Should Know is a production of iHeartRadio. For 866 00:48:56,239 --> 00:48:59,680 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple 867 00:48:59,719 --> 00:49:02,360 Speaker 1: Podcasts, or wherever you listen to your favorite shows.