1 00:00:00,240 --> 00:00:02,960 Speaker 1: Hey, everyone, we're coming to Salt Lake City, Utah and Phoenix, 2 00:00:03,000 --> 00:00:06,600 Speaker 1: Arizona this fall. Yeah, October, we're going to be at 3 00:00:06,600 --> 00:00:09,479 Speaker 1: Salt Lake City's Grand Theater, and then the next night 4 00:00:09,520 --> 00:00:13,880 Speaker 1: in October we'll be in Phoenix. And we added a second 5 00:00:13,880 --> 00:00:16,840 Speaker 1: show to our Melbourne show, right, that's right, a second 6 00:00:16,840 --> 00:00:19,239 Speaker 1: earlier show in Melbourne. So you can get all the 7 00:00:19,280 --> 00:00:22,119 Speaker 1: information for all of these shows at S Y S 8 00:00:22,160 --> 00:00:26,600 Speaker 1: K live dot com. Welcome to Stuff You Should Know 9 00:00:27,200 --> 00:00:36,040 Speaker 1: from HowStuffWorks dot com. Hey, and welcome 10 00:00:36,080 --> 00:00:39,120 Speaker 1: to the podcast. I'm Josh Clark, and there's Charles W. 11 00:00:39,280 --> 00:00:42,519 Speaker 1: Chuck Bryant, Jerry's over there. So why don't you pull 12 00:00:42,600 --> 00:00:45,720 Speaker 1: up a chair, kick back and tell us about your problems? 13 00:00:45,720 --> 00:00:50,519 Speaker 1: Because this is psychology stuff. We should just call this 14 00:00:50,600 --> 00:00:57,720 Speaker 1: episode the Stanford Prison Experiment, a.k.a. perhaps the 15 00:00:57,800 --> 00:01:01,000 Speaker 1: crappiest experiment of all time, and it's really not an 16 00:01:01,000 --> 00:01:07,240 Speaker 1: experiment anyway, but it's the most famous psychology experiment ever. Yeah. 17 00:01:07,280 --> 00:01:09,600 Speaker 1: I got kind of ticked off while I was researching this. 18 00:01:10,520 --> 00:01:12,600 Speaker 1: You should, man, because I used to think it was cool, like, 19 00:01:12,600 --> 00:01:15,320 Speaker 1: oh man, what a cool experiment. Yeah, everybody's evil at 20 00:01:15,440 --> 00:01:18,399 Speaker 1: its core. Yeah. Then I researched it, and I was like, 21 00:01:18,800 --> 00:01:21,160 Speaker 1: this is a bunch of BS, all of it. This 22 00:01:21,200 --> 00:01:24,280 Speaker 1: is one of the worst executed experiments I've ever heard of. 23 00:01:24,440 --> 00:01:27,200 Speaker 1: That is so funny, because while I was researching this, 24 00:01:27,280 --> 00:01:29,039 Speaker 1: I was like, I'm gonna have to keep it together. 25 00:01:29,160 --> 00:01:31,160 Speaker 1: Maybe at the end I can really go off. 26 00:01:31,440 --> 00:01:33,959 Speaker 1: Let's go off at the beginning. That's great. Man, I 27 00:01:34,000 --> 00:01:38,200 Speaker 1: watched the movie today too. Yeah, how was it? How 28 00:01:38,240 --> 00:01:41,240 Speaker 1: was Billy Crudup? Because I loved him in Almost 29 00:01:41,319 --> 00:01:45,639 Speaker 1: Famous. Well, I'm a fan. He was good. Um, 30 00:01:45,680 --> 00:01:49,000 Speaker 1: but like, I don't know, the movie was pretty 31 00:01:49,040 --> 00:01:52,960 Speaker 1: sensationalized as far as the violence. Like they showed a 32 00:01:52,960 --> 00:01:56,240 Speaker 1: lot of straight up physical violence in the movie which 33 00:01:56,240 --> 00:02:01,480 Speaker 1: supposedly didn't occur, um, like beating them with billy clubs 34 00:02:01,560 --> 00:02:06,760 Speaker 1: and hog tying them, like, like real violence. Hollywood, actually, 35 00:02:06,800 --> 00:02:11,280 Speaker 1: these days I should say Atlanta. Yeah. Uh, Y'allywood 36 00:02:11,400 --> 00:02:13,560 Speaker 1: is what they call it. Oh, there you go.
Perfect, 37 00:02:14,240 --> 00:02:17,160 Speaker 1: that's, that's perfect. That sounds like a Norman Reedus creation. 38 00:02:17,440 --> 00:02:23,799 Speaker 1: It might have been. Uh, and then, uh, what was 39 00:02:23,800 --> 00:02:26,679 Speaker 1: I saying? Oh, um, I don't feel like it came 40 00:02:26,720 --> 00:02:31,800 Speaker 1: down hard enough on this yahoo. What was the 41 00:02:31,800 --> 00:02:37,480 Speaker 1: guy's name? Zimbardo. Yeah, Zimbardo, for just 42 00:02:37,520 --> 00:02:40,519 Speaker 1: doing a very poor job at crafting this supposedly 43 00:02:40,560 --> 00:02:43,440 Speaker 1: scientific experiment. No, he was like the driving force behind 44 00:02:43,480 --> 00:02:45,959 Speaker 1: that movie getting made. Apparently he'd been trying to get 45 00:02:45,960 --> 00:02:48,639 Speaker 1: a movie made in America. He seems to be a 46 00:02:48,639 --> 00:02:51,640 Speaker 1: pretty shameless self-promoter. For decades. Yes. Yeah, it's not a 47 00:02:51,639 --> 00:02:55,320 Speaker 1: good quality in a social psychologist. No. So we're gonna see. 48 00:02:56,480 --> 00:02:57,840 Speaker 1: I guess we'll let the cat out of the bag. 49 00:02:57,919 --> 00:03:03,799 Speaker 1: But, well, we shall see. Um. The Stanford prison 50 00:03:03,880 --> 00:03:08,880 Speaker 1: experiment, one of the most famous experiments in the annals 51 00:03:08,919 --> 00:03:13,880 Speaker 1: of psychology. It's not an experiment at all. Its 52 00:03:13,960 --> 00:03:18,360 Speaker 1: findings are wide open to interpretation, and it was conducted 53 00:03:18,400 --> 00:03:22,399 Speaker 1: by a showman, basically. Yeah, I mean, you know, it's 54 00:03:22,760 --> 00:03:25,680 Speaker 1: a red flag when you don't publish your findings in 55 00:03:25,720 --> 00:03:28,120 Speaker 1: a medical journal. You publish them in a New York 56 00:03:28,160 --> 00:03:33,200 Speaker 1: magazine, the New York Times Magazine, Hodgman's rag. Well, 57 00:03:33,400 --> 00:03:36,760 Speaker 1: great rag, but that's not the place to go publish 58 00:03:37,120 --> 00:03:40,560 Speaker 1: scientific findings. No, peer-reviewed journals are, and they 59 00:03:40,920 --> 00:03:44,040 Speaker 1: circumvented that, but for very good reasons. All right, so 60 00:03:44,120 --> 00:03:46,520 Speaker 1: let's talk about the outline. So let's go back 61 00:03:46,520 --> 00:03:48,680 Speaker 1: to the beginning, right. Yeah, back to the year of 62 00:03:48,760 --> 00:03:56,960 Speaker 1: my birth, and Stanford, at Stanford University, which is, what, 63 00:03:57,120 --> 00:04:02,760 Speaker 1: Palo Alto? Yeah. Uh, a fine big sequoia is what is there? 64 00:04:02,920 --> 00:04:05,200 Speaker 1: They have like a big old sequoia on their logo. 65 00:04:05,840 --> 00:04:07,400 Speaker 1: I think it's like a... and then they have a, 66 00:04:08,240 --> 00:04:13,600 Speaker 1: uh, sequoia with its fists up. That's Notre Dame I'm 67 00:04:13,640 --> 00:04:17,919 Speaker 1: thinking of. I do feel like it has something. Chuck's 68 00:04:17,960 --> 00:04:19,800 Speaker 1: looking it up, everybody, so let me stall. It is 69 00:04:19,800 --> 00:04:22,839 Speaker 1: a tree, the Stanford Tree. I don't know what the 70 00:04:22,880 --> 00:04:25,840 Speaker 1: mascot is, but there's definitely a tree associated. I looked 71 00:04:25,839 --> 00:04:29,159 Speaker 1: it up, the Stanford Tree. Okay. And the first question 72 00:04:29,240 --> 00:04:32,840 Speaker 1: is why is it a tree? Huh?
Well, what's the answer? Well, 73 00:04:32,880 --> 00:04:35,320 Speaker 1: I mean, I'm sure it's just because of where it 74 00:04:35,360 --> 00:04:38,840 Speaker 1: is in California, but that doesn't answer the real question, 75 00:04:38,960 --> 00:04:41,279 Speaker 1: which is why would you have a tree? Right. Philip 76 00:04:41,279 --> 00:04:43,840 Speaker 1: Zimbardo's sitting there like, quit stalling and get to the, 77 00:04:44,440 --> 00:04:47,960 Speaker 1: get to the heckling. He's still around? Yeah, he is. 78 00:04:48,320 --> 00:04:51,279 Speaker 1: So, um, all right, we're at Stanford. It's nineteen seventy one. Yeah. 79 00:04:51,320 --> 00:04:53,960 Speaker 1: We're actually in the basement of one of the buildings 80 00:04:54,000 --> 00:04:56,680 Speaker 1: at Stanford, Stanford University. I think like Campbell Hall or 81 00:04:56,720 --> 00:05:00,279 Speaker 1: something like that. And I think August of nineteen seventy one, 82 00:05:00,480 --> 00:05:05,680 Speaker 1: there were, um, twenty four young men, almost all of 83 00:05:05,680 --> 00:05:07,520 Speaker 1: them white, I think one of them was Asian American, 84 00:05:08,480 --> 00:05:12,440 Speaker 1: and, um, they are doing something pretty bizarre in this 85 00:05:12,520 --> 00:05:17,600 Speaker 1: basement in August of nineteen seventy one. They've been divided into two groups, 86 00:05:18,360 --> 00:05:23,400 Speaker 1: guards and prisoners, supposedly average kids, right, and they are, 87 00:05:23,680 --> 00:05:30,880 Speaker 1: um, acting out this basically role playing game of guards 88 00:05:31,000 --> 00:05:34,440 Speaker 1: versus prisoners for fifteen bucks a day in a simulated 89 00:05:34,480 --> 00:05:38,839 Speaker 1: prison in the basement of this hall at Stanford University. Yeah, 90 00:05:38,960 --> 00:05:42,400 Speaker 1: which would be about ninety dollars today, funded by the 91 00:05:42,520 --> 00:05:45,240 Speaker 1: U S Office of Naval Research. Is that right? So 92 00:05:45,279 --> 00:05:47,520 Speaker 1: it would be ninety three bucks a day, and it 93 00:05:47,560 --> 00:05:49,479 Speaker 1: was originally gonna be two weeks. So I'm sure some 94 00:05:49,520 --> 00:05:51,760 Speaker 1: of these guys were like, hey, yeah, yeah. I mean, 95 00:05:52,040 --> 00:05:53,400 Speaker 1: I kind of forgot what it was like to be 96 00:05:53,400 --> 00:05:56,880 Speaker 1: a college student. That would be, uh, you know, 97 00:05:56,920 --> 00:06:00,920 Speaker 1: what, between twelve and fourteen hundred bucks starting off your summer? 98 00:06:01,800 --> 00:06:06,680 Speaker 1: It would be about thirteen hundred dollars, if my quick math is correct. 99 00:06:07,720 --> 00:06:11,800 Speaker 1: Good scratch for a college kid. Yeah, two weeks on summer break, 100 00:06:12,520 --> 00:06:15,520 Speaker 1: that's right. So, uh, they were divided into two lots, 101 00:06:15,560 --> 00:06:19,159 Speaker 1: like you said. Um, they asked people, um, supposedly, what 102 00:06:19,240 --> 00:06:21,960 Speaker 1: you wanted to be, unless this was purely a movie creation, 103 00:06:22,279 --> 00:06:23,880 Speaker 1: and I did try and look it up and try and 104 00:06:24,000 --> 00:06:28,720 Speaker 1: find out the differences. Um. But they supposedly asked them, 105 00:06:28,720 --> 00:06:33,520 Speaker 1: and most everyone said, or in fact everyone said, prisoner. Uh.
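A quick sanity check on the arithmetic being tossed around here, assuming the hosts' rough inflation figure of about ninety-three of today's dollars per day and the originally planned two-week run:

$\$93/\text{day} \times 14~\text{days} = \$1{,}302 \approx \$1{,}300$

which is the ballpark behind the "about thirteen hundred dollars" figure above.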
106 00:06:33,600 --> 00:06:35,640 Speaker 1: In one of the reactions, from who ended up being 107 00:06:35,680 --> 00:06:39,760 Speaker 1: the bad guard, they asked him why, 108 00:06:39,800 --> 00:06:42,720 Speaker 1: and he's like, because nobody likes guards. It's like, why 109 00:06:42,760 --> 00:06:45,479 Speaker 1: would anyone want to be a guard? Because they thought, 110 00:06:45,760 --> 00:06:48,000 Speaker 1: we'll just be prisoners, because they'd just lay around and 111 00:06:48,000 --> 00:06:51,960 Speaker 1: smoke cigarettes. So we'll kind of unpack 112 00:06:52,040 --> 00:06:55,560 Speaker 1: what that suggests later on. Sure. Okay, so you've got 113 00:06:55,600 --> 00:06:58,320 Speaker 1: these guys and they're down here for this experiment, 114 00:06:58,360 --> 00:07:00,479 Speaker 1: and so coming at it this way. This is 115 00:07:00,520 --> 00:07:05,920 Speaker 1: the popular interpretation of what happened at the Stanford prison experiment. Okay, 116 00:07:06,920 --> 00:07:10,720 Speaker 1: you've got twelve guards and twelve prisoners. The 117 00:07:10,760 --> 00:07:13,720 Speaker 1: prisoners had been arrested, by the way, by the real 118 00:07:13,760 --> 00:07:16,320 Speaker 1: Palo Alto police. Yeah, they weren't told when, but like, 119 00:07:16,360 --> 00:07:19,400 Speaker 1: the real cops came by, arrested each one of them 120 00:07:19,600 --> 00:07:23,280 Speaker 1: for, you know, a variety of crimes, booked them at 121 00:07:23,320 --> 00:07:26,200 Speaker 1: the Palo Alto police station and then transported them to 122 00:07:26,800 --> 00:07:31,320 Speaker 1: the jail, the fake jail, uh, at Stanford. Yeah, 123 00:07:31,320 --> 00:07:33,360 Speaker 1: they called it the Stanford County Jail. And they did 124 00:07:33,440 --> 00:07:36,600 Speaker 1: a legit job. They put up signs, they had these 125 00:07:37,160 --> 00:07:42,120 Speaker 1: rooms decked out like jail cells, they had a whole, um... 126 00:07:42,200 --> 00:07:45,040 Speaker 1: they did a really believable job of making this seem 127 00:07:45,120 --> 00:07:49,120 Speaker 1: like a prison environment, at least. So, um, you've got 128 00:07:49,160 --> 00:07:52,240 Speaker 1: these prisoners who have been delivered, you've got these 129 00:07:52,240 --> 00:07:56,320 Speaker 1: guards who are waiting there for them, and, um, as 130 00:07:56,400 --> 00:07:59,280 Speaker 1: far as Zimbardo has ever said, these guards 131 00:07:59,280 --> 00:08:04,240 Speaker 1: were told, you have to protect the prison, and everything 132 00:08:04,240 --> 00:08:06,480 Speaker 1: else is up to you. The only rule is there's 133 00:08:06,480 --> 00:08:10,480 Speaker 1: no physical punishment. We're just here to observe. Like, here's 134 00:08:10,480 --> 00:08:14,320 Speaker 1: your uniforms, here's your sunglasses. Yeah. And then the prisoners 135 00:08:14,320 --> 00:08:19,440 Speaker 1: were booked in wearing smocks, no shoes, no underwear. Yeah, 136 00:08:19,520 --> 00:08:22,760 Speaker 1: naked under the smocks, chained at the ankles.
And then 137 00:08:22,760 --> 00:08:26,360 Speaker 1: they wore, like, um, those stocking cap do rags, a 138 00:08:26,400 --> 00:08:31,080 Speaker 1: panty on their head to simulate having 139 00:08:31,120 --> 00:08:34,920 Speaker 1: their heads shaved, right. Uh, and you know, this is 140 00:08:34,960 --> 00:08:37,360 Speaker 1: the early seventies, so most of them had these 141 00:08:37,400 --> 00:08:44,600 Speaker 1: big afros and long hair and stuff under these panties. Right. So, um, 142 00:08:44,679 --> 00:08:47,840 Speaker 1: at first, everything's pretty normal. The guards 143 00:08:47,840 --> 00:08:49,920 Speaker 1: don't quite know what to do. They're a little timid. 144 00:08:50,840 --> 00:08:56,960 Speaker 1: The prisoners apparently relished this immediately and started, like, finding 145 00:08:56,960 --> 00:09:00,800 Speaker 1: where the guards' boundaries were, and they started to band together. 146 00:09:01,360 --> 00:09:06,200 Speaker 1: And there was actually, I think on day two, the 147 00:09:06,200 --> 00:09:09,760 Speaker 1: turnover from day one to two, there was 148 00:09:09,800 --> 00:09:14,160 Speaker 1: a prisoner riot. Yeah. I mean, like you said, 149 00:09:14,160 --> 00:09:15,959 Speaker 1: they were sort of laughing at first. And I think 150 00:09:15,960 --> 00:09:18,240 Speaker 1: we didn't mention, too, and this will end up 151 00:09:18,240 --> 00:09:21,439 Speaker 1: being very, very problematic and the first sign that he 152 00:09:21,480 --> 00:09:24,520 Speaker 1: didn't do a good job: Zimbardo actually acted as the 153 00:09:24,559 --> 00:09:28,319 Speaker 1: superintendent of the prison, involved himself in his own experiment. 154 00:09:28,960 --> 00:09:32,040 Speaker 1: He had some graduate assistants 155 00:09:32,040 --> 00:09:35,400 Speaker 1: that were assisting in the program. They acted as the parole board, 156 00:09:35,520 --> 00:09:38,600 Speaker 1: and one of them was the warden. That was, yeah, 157 00:09:39,080 --> 00:09:44,239 Speaker 1: an undergrad actually. Well, the warden, Jaffe, his last name was Jaffe, 158 00:09:44,320 --> 00:09:46,760 Speaker 1: he was an undergrad at the time, and actually he 159 00:09:46,800 --> 00:09:49,360 Speaker 1: had come up with the experiment on his own. Oh, 160 00:09:49,400 --> 00:09:51,800 Speaker 1: he was the guy, huh? Huh. And then, um, Zimbardo 161 00:09:51,880 --> 00:09:53,360 Speaker 1: was like, this is a really good idea. Let's do 162 00:09:53,400 --> 00:09:57,280 Speaker 1: this for real. Imagine the press, right. So, yeah, like 163 00:09:57,360 --> 00:09:59,840 Speaker 1: he said, it escalated pretty quickly. After kind of laughing 164 00:09:59,880 --> 00:10:04,080 Speaker 1: at first, these guards got into their roles, to say 165 00:10:04,080 --> 00:10:09,160 Speaker 1: the least, and, um, really kind of started being jerks 166 00:10:09,640 --> 00:10:13,520 Speaker 1: in quick order.
And after the prisoners were like, hey, 167 00:10:13,559 --> 00:10:15,720 Speaker 1: this is kind of funny, like, you're not 168 00:10:15,760 --> 00:10:18,480 Speaker 1: being very cool, they were, you know, kind of 169 00:10:18,480 --> 00:10:21,040 Speaker 1: smacked down and, you know, made to do things 170 00:10:21,080 --> 00:10:24,160 Speaker 1: like push ups and jumping jacks, and, uh, they would 171 00:10:24,200 --> 00:10:27,360 Speaker 1: withhold food, and eventually they would, like, take their beds 172 00:10:27,360 --> 00:10:29,320 Speaker 1: away from them and stuff. Like, it just got worse 173 00:10:29,320 --> 00:10:31,640 Speaker 1: and worse. And there was, I think, like you said, 174 00:10:31,640 --> 00:10:36,920 Speaker 1: on day two, an uprising. They got together, threw 175 00:10:36,960 --> 00:10:40,480 Speaker 1: the cots off their beds and threw the bed frames 176 00:10:40,480 --> 00:10:43,439 Speaker 1: against the door and wouldn't let them in. So there 177 00:10:43,520 --> 00:10:47,760 Speaker 1: was a prisoner riot. Yeah. That's pretty significant, right? Um. 178 00:10:47,840 --> 00:10:51,360 Speaker 1: And what's equally significant is that the guards by the 179 00:10:51,440 --> 00:10:55,080 Speaker 1: second day started to show signs of, like, real cruelty 180 00:10:55,280 --> 00:10:58,880 Speaker 1: toward the prisoners. They started treating them very poorly. Um, 181 00:10:59,000 --> 00:11:01,920 Speaker 1: they started engaging in basically acts of torture, like 182 00:11:01,960 --> 00:11:03,839 Speaker 1: waking them up randomly in the middle of the night, 183 00:11:03,920 --> 00:11:06,600 Speaker 1: making them get up, like you said, push ups, which 184 00:11:06,640 --> 00:11:10,720 Speaker 1: is interpreted as, um, physical punishment, because again, you couldn't 185 00:11:10,760 --> 00:11:12,600 Speaker 1: hit them with the rubber hose, you couldn't 186 00:11:12,640 --> 00:11:15,520 Speaker 1: hit them with the baton, you couldn't punch them. But 187 00:11:16,040 --> 00:11:17,880 Speaker 1: if you make somebody do a bunch of push ups, 188 00:11:18,240 --> 00:11:21,640 Speaker 1: that's physical punishment too. Yeah, and it was within the 189 00:11:21,640 --> 00:11:24,400 Speaker 1: bounds, apparently. Yeah. They were referred to only by their 190 00:11:24,440 --> 00:11:27,199 Speaker 1: prison numbers. They would never say their names. They were 191 00:11:27,200 --> 00:11:30,360 Speaker 1: made to memorize everyone else's prison number, and, like, they 192 00:11:30,360 --> 00:11:32,720 Speaker 1: would line them up and tell them to repeat their 193 00:11:32,800 --> 00:11:35,400 Speaker 1: numbers for, like, an hour. If they didn't do it 194 00:11:35,440 --> 00:11:38,320 Speaker 1: fast enough, and then in reverse order, they would 195 00:11:38,360 --> 00:11:41,079 Speaker 1: get punished. They would do the kind of the classic 196 00:11:41,120 --> 00:11:44,960 Speaker 1: moves of holding one responsible for the punishment of others. Yeah, 197 00:11:44,960 --> 00:11:46,640 Speaker 1: that's a big one. Like, if you didn't 198 00:11:46,679 --> 00:11:48,200 Speaker 1: make your bed good enough, then no one could go 199 00:11:48,240 --> 00:11:51,439 Speaker 1: to sleep, stuff like that. The guards also innovated, um, 200 00:11:51,559 --> 00:11:54,880 Speaker 1: with carrots there too.
They actually made one cell 201 00:11:55,040 --> 00:11:57,600 Speaker 1: like a good cell, like they put a bed in 202 00:11:57,679 --> 00:12:00,920 Speaker 1: it with, like, bedding. Um, if you were in that cell, 203 00:12:01,000 --> 00:12:04,160 Speaker 1: you were eligible for, like, good meals, better than what 204 00:12:04,200 --> 00:12:07,040 Speaker 1: the other prisoners had. And there was room for three 205 00:12:07,040 --> 00:12:10,520 Speaker 1: inmates in there at a time, and so it instilled this 206 00:12:10,600 --> 00:12:16,479 Speaker 1: sense of competition and, um, skullduggery, I guess, backstabbery, 207 00:12:17,000 --> 00:12:21,040 Speaker 1: among the prisoners to curry favor with the guards, like, 208 00:12:21,120 --> 00:12:23,720 Speaker 1: by informing on the other ones so that you could 209 00:12:23,800 --> 00:12:26,199 Speaker 1: get a chance to be in, like, the nice cell. Yeah. 210 00:12:26,240 --> 00:12:28,160 Speaker 1: And I think even before that, like, when they went 211 00:12:28,200 --> 00:12:30,439 Speaker 1: to stage the uprising, 212 00:12:31,040 --> 00:12:33,400 Speaker 1: I think there were three rooms of three, and 213 00:12:33,440 --> 00:12:35,920 Speaker 1: I think six of them, two of the rooms, participated 214 00:12:36,320 --> 00:12:40,000 Speaker 1: and one of the rooms did not. And, uh, because 215 00:12:40,040 --> 00:12:42,920 Speaker 1: not all the guys, um, you know, not all the 216 00:12:42,960 --> 00:12:45,160 Speaker 1: prisoners, like, rebelled as much. Some of them just kind 217 00:12:45,160 --> 00:12:48,480 Speaker 1: of went along with it. Interestingly, some of the guards 218 00:12:48,520 --> 00:12:52,000 Speaker 1: did not descend into cruelty. Some of them actually 219 00:12:52,040 --> 00:12:53,760 Speaker 1: did, like, favors, went out of their way to be 220 00:12:53,880 --> 00:12:57,480 Speaker 1: nice to the prisoners. But, um, the Grabster, who 221 00:12:57,520 --> 00:13:01,559 Speaker 1: wrote this article, points out very significantly, they didn't stand 222 00:13:01,720 --> 00:13:05,520 Speaker 1: up to the cruel guards or officially object to 223 00:13:05,600 --> 00:13:09,800 Speaker 1: their behavior. They went along with it. But then, in 224 00:13:09,840 --> 00:13:12,840 Speaker 1: their own right, in their own way, they did what 225 00:13:12,920 --> 00:13:15,439 Speaker 1: they could to retain their humanity. So there are two 226 00:13:15,520 --> 00:13:19,280 Speaker 1: huge points: one among the 227 00:13:19,280 --> 00:13:21,959 Speaker 1: guards and one among the prisoners. And the one among 228 00:13:22,000 --> 00:13:24,640 Speaker 1: the prisoners comes thirty six hours after the beginning of 229 00:13:24,679 --> 00:13:28,920 Speaker 1: the experiment, and this prisoner, his name, it 230 00:13:28,960 --> 00:13:33,280 Speaker 1: would later be revealed, was Douglas Korpi.
Um, he had 231 00:13:33,360 --> 00:13:37,880 Speaker 1: an emotional breakdown, a nervous breakdown, thirty six hours after 232 00:13:37,920 --> 00:13:42,160 Speaker 1: this experiment starts. One of the prisoners 233 00:13:42,480 --> 00:13:48,640 Speaker 1: becomes so emotionally involved in this simulated prison, at this cruelty, 234 00:13:48,720 --> 00:13:51,400 Speaker 1: the supposed cruelty of the guards, that he had 235 00:13:51,440 --> 00:13:55,400 Speaker 1: a nervous breakdown and had to be 236 00:13:55,480 --> 00:13:58,640 Speaker 1: removed from the experiment. And this is, like, this 237 00:13:58,720 --> 00:14:01,360 Speaker 1: is Zimbardo's, this is the official line for the Stanford 238 00:14:01,400 --> 00:14:06,880 Speaker 1: prison experiment, and has been for decades. Yeah. He also 239 00:14:06,920 --> 00:14:09,480 Speaker 1: said that one of them broke out in a psychosomatic rash. 240 00:14:09,640 --> 00:14:14,960 Speaker 1: There was, um, all manner of various levels of 241 00:14:14,960 --> 00:14:19,440 Speaker 1: psychological breakdowns happening. On the other side, the big star 242 00:14:19,600 --> 00:14:22,040 Speaker 1: among the guards was a guy named John Wayne, who 243 00:14:22,080 --> 00:14:27,280 Speaker 1: you referenced earlier. Yeah, his name was Dave Eshleman, and 244 00:14:27,400 --> 00:14:30,360 Speaker 1: he was the ringleader. He's 245 00:14:30,360 --> 00:14:32,600 Speaker 1: the one that came out as the most brutal guard 246 00:14:33,160 --> 00:14:34,920 Speaker 1: of them all, and all the other guards kind of 247 00:14:34,920 --> 00:14:37,640 Speaker 1: fell in line behind him and took their cues from him. 248 00:14:37,720 --> 00:14:39,880 Speaker 1: So this whole thing is going on. This is crazy 249 00:14:39,920 --> 00:14:43,440 Speaker 1: town in this place. In six days, six days, 250 00:14:43,480 --> 00:14:46,840 Speaker 1: this thing descends into chaos. Supposed to be two weeks. Yes. 251 00:14:46,960 --> 00:14:49,800 Speaker 1: There were rumors that there was going to 252 00:14:49,840 --> 00:14:53,160 Speaker 1: be a breakout, and so they moved the experiment. Um, 253 00:14:53,240 --> 00:14:56,880 Speaker 1: that guy Douglas Korpi, who had a 254 00:14:56,920 --> 00:15:00,280 Speaker 1: nervous breakdown, ended up getting put into the hole, um, 255 00:15:00,320 --> 00:15:04,960 Speaker 1: this broom closet, for, I think, overnight, and was finally 256 00:15:05,000 --> 00:15:09,920 Speaker 1: released because the researchers actually stepped in and said, 257 00:15:10,000 --> 00:15:13,120 Speaker 1: you should probably let him out. Um, it was, it 258 00:15:13,200 --> 00:15:17,560 Speaker 1: was just utter chaos. And then eventually, um, Philip Zimbardo's 259 00:15:17,560 --> 00:15:20,360 Speaker 1: girlfriend at the time, a woman named Christine 260 00:15:20,560 --> 00:15:24,280 Speaker 1: Maslach, his wife-to-be. Um. Oh, she married him, 261 00:15:24,360 --> 00:15:28,600 Speaker 1: huh? Um. So she came and just dropped in to 262 00:15:28,640 --> 00:15:32,320 Speaker 1: see how things were going and was so outraged at 263 00:15:32,360 --> 00:15:34,920 Speaker 1: what she saw that she was like, you're so 264 00:15:35,000 --> 00:15:37,320 Speaker 1: far beyond the line. You have to stop this now. 265 00:15:37,360 --> 00:15:39,960 Speaker 1: Like, this has descended into chaos. You can't 266 00:15:40,000 --> 00:15:44,120 Speaker 1: do this.
These people are treating these prisoners horribly. 267 00:15:44,240 --> 00:15:49,400 Speaker 1: Like, how are you letting this go on? Fine. And 268 00:15:49,480 --> 00:15:52,600 Speaker 1: so the next day he canceled the experiment, again, after 269 00:15:52,680 --> 00:15:54,800 Speaker 1: six days, and it was scheduled to go on for 270 00:15:54,840 --> 00:15:59,920 Speaker 1: two weeks. And so he comes out, tells the world 271 00:16:00,600 --> 00:16:04,720 Speaker 1: in this New York Times Magazine piece, guys, 272 00:16:04,880 --> 00:16:06,840 Speaker 1: if I took you, Josh, and I took you, Chuck, 273 00:16:07,160 --> 00:16:10,080 Speaker 1: and put you as guard and prisoner in even a 274 00:16:10,160 --> 00:16:14,440 Speaker 1: simulated prison, and put a smock on Josh and took 275 00:16:14,440 --> 00:16:18,120 Speaker 1: his underwear off and, uh, put a stocking on his head, 276 00:16:18,480 --> 00:16:21,320 Speaker 1: and gave Chuck a baton and some glasses, Chuck would 277 00:16:21,880 --> 00:16:25,640 Speaker 1: beat Josh up, and Josh would probably have his spirit broken 278 00:16:25,640 --> 00:16:29,000 Speaker 1: and have a nervous breakdown. It's in everybody. Evil is 279 00:16:29,000 --> 00:16:32,720 Speaker 1: in everybody. Crumbling at the first sign of adversity is in 280 00:16:32,720 --> 00:16:37,240 Speaker 1: everybody. We're all just pathetic weaklings. Stanford prison experiment. And 281 00:16:37,280 --> 00:16:40,960 Speaker 1: he ran off and said, I'm famous. All right, that's 282 00:16:40,960 --> 00:16:44,080 Speaker 1: a great setup. So we'll take a break here and 283 00:16:44,240 --> 00:16:46,280 Speaker 1: come back and talk a little bit more 284 00:16:46,320 --> 00:16:48,520 Speaker 1: about the experiment and the realities of it right after this. 285 00:17:10,280 --> 00:17:13,119 Speaker 1: All right. So you've got John Wayne in there. I 286 00:17:13,160 --> 00:17:15,160 Speaker 1: don't think we mentioned that he took on the persona 287 00:17:15,359 --> 00:17:19,440 Speaker 1: of the prison boss in Cool Hand Luke. He did 288 00:17:19,440 --> 00:17:22,680 Speaker 1: a fake Southern accent and everything and dove right into 289 00:17:22,720 --> 00:17:25,600 Speaker 1: this role. Um, if you talk to Dave Eshleman today, 290 00:17:26,359 --> 00:17:29,120 Speaker 1: he will say, he's very much on record as saying, 291 00:17:30,000 --> 00:17:33,520 Speaker 1: I'm not some jerk, uh, and I didn't get off 292 00:17:33,600 --> 00:17:36,960 Speaker 1: on being sadistic. He said, I wanted to do what 293 00:17:37,000 --> 00:17:39,320 Speaker 1: they paid me fifteen dollars a day to do, which 294 00:17:39,359 --> 00:17:42,480 Speaker 1: was to be a prison guard and to treat these 295 00:17:42,520 --> 00:17:45,040 Speaker 1: guys poorly. And so, you know, he said, I 296 00:17:45,080 --> 00:17:47,600 Speaker 1: did some drama in high school, and I literally acted 297 00:17:47,600 --> 00:17:51,000 Speaker 1: this part, as well as I could, that I 298 00:17:51,160 --> 00:17:54,800 Speaker 1: felt was expected and wanted from me, right. And I 299 00:17:55,040 --> 00:17:57,800 Speaker 1: put on this fake Southern accent.
And if you, like, 300 00:17:58,000 --> 00:18:00,359 Speaker 1: asked people, friends and family, today, they would laugh at 301 00:18:00,400 --> 00:18:02,439 Speaker 1: this, because I'm really not this guy at all. Right, 302 00:18:02,480 --> 00:18:04,800 Speaker 1: because he really comes off as a bit of 303 00:18:04,800 --> 00:18:08,560 Speaker 1: a villain in this movie, for sure. Well, he perpetrated 304 00:18:08,760 --> 00:18:12,560 Speaker 1: real cruelty on other people, and we'll get to that later. 305 00:18:14,200 --> 00:18:17,800 Speaker 1: And he should, because the other people actually did suffer 306 00:18:18,720 --> 00:18:23,240 Speaker 1: under this guy's leadership as the ringleader of the mean guards. 307 00:18:23,640 --> 00:18:28,880 Speaker 1: Like they wore pink on Wednesdays. It was terrible everywhere, right? So, um, 308 00:18:28,920 --> 00:18:31,240 Speaker 1: he really should feel bad, and apparently he does. I 309 00:18:31,280 --> 00:18:33,439 Speaker 1: saw that all over the place too, that he feels 310 00:18:33,440 --> 00:18:36,520 Speaker 1: bad for it. But the point is that he 311 00:18:36,600 --> 00:18:43,200 Speaker 1: has said, like, this didn't happen organically. 312 00:18:43,280 --> 00:18:47,280 Speaker 1: I felt encouraged to play this role. That's 313 00:18:47,320 --> 00:18:50,119 Speaker 1: a big deal, because the findings of the Stanford prison 314 00:18:50,160 --> 00:18:52,880 Speaker 1: experiment say, if you take some people and say, you're 315 00:18:52,920 --> 00:18:56,200 Speaker 1: a guard, and give them power, they will 316 00:18:56,240 --> 00:19:00,200 Speaker 1: turn evil within a day. A day, they said, about 317 00:19:00,200 --> 00:19:02,560 Speaker 1: this guy. And this guy's like, no, I was, just 318 00:19:02,800 --> 00:19:05,040 Speaker 1: like you said, doing my job that they're paying me 319 00:19:05,080 --> 00:19:07,520 Speaker 1: fifteen bucks a day for. Let's put that one to 320 00:19:07,600 --> 00:19:10,560 Speaker 1: the side. Let's go visit with Douglas Korpi, 321 00:19:10,640 --> 00:19:14,639 Speaker 1: who was the prisoner who, in thirty six short hours 322 00:19:14,840 --> 00:19:19,840 Speaker 1: of this simulated prison experiment, lost his marbles and had 323 00:19:19,880 --> 00:19:23,400 Speaker 1: a nervous breakdown and had to go home. Right, one 324 00:19:23,440 --> 00:19:26,080 Speaker 1: of the other two pillars of the findings, that people 325 00:19:26,119 --> 00:19:28,879 Speaker 1: are either evil or easily crumble in the face of 326 00:19:28,920 --> 00:19:32,480 Speaker 1: adversity, from the Stanford prison experiment. And again, this is 327 00:19:32,480 --> 00:19:35,600 Speaker 1: how this thing has been taught for, like, fifty years. Okay. Yeah. 328 00:19:35,760 --> 00:19:39,000 Speaker 1: So Korpi comes out and says, I was faking that, 329 00:19:39,600 --> 00:19:41,600 Speaker 1: and I put on a big act so I could 330 00:19:41,600 --> 00:19:44,160 Speaker 1: get out of there, because it sucked and I didn't 331 00:19:44,200 --> 00:19:47,159 Speaker 1: want to be there anymore. So I faked like I was. 332 00:19:47,400 --> 00:19:50,480 Speaker 1: And he, like, one of his quotes was, I 333 00:19:50,480 --> 00:19:53,280 Speaker 1: don't have it here, but basically said, like, any trained 334 00:19:53,280 --> 00:19:56,159 Speaker 1: clinician would have been able to see right through this.
335 00:19:56,280 --> 00:19:59,320 Speaker 1: Like, when I hear the tapes years later, it's like, 336 00:19:59,359 --> 00:20:02,320 Speaker 1: I'm not an actor. I wasn't... like, apparently the John 337 00:20:02,359 --> 00:20:04,040 Speaker 1: Wayne guy at least had been in, like, high school 338 00:20:04,040 --> 00:20:06,480 Speaker 1: plays, and in college too, I think. Yeah. And he was like, 339 00:20:06,520 --> 00:20:08,760 Speaker 1: I was not an actor. And it was so clear 340 00:20:08,800 --> 00:20:11,239 Speaker 1: to me looking back at these tapes that I was 341 00:20:11,280 --> 00:20:14,520 Speaker 1: faking it, faking a nervous breakdown 342 00:20:14,560 --> 00:20:17,399 Speaker 1: to get out of there. So the reason why he 343 00:20:17,480 --> 00:20:20,240 Speaker 1: said later that he did fake this nervous breakdown is 344 00:20:20,240 --> 00:20:22,879 Speaker 1: because he took the job because he thought he'd just 345 00:20:22,920 --> 00:20:26,000 Speaker 1: be laying around, like you said, smoking cigarettes, being a prisoner, 346 00:20:26,520 --> 00:20:28,240 Speaker 1: and he would get to study for the GRE. 347 00:20:28,480 --> 00:20:31,720 Speaker 1: He was entering grad school. And, well, they said, no, 348 00:20:31,880 --> 00:20:34,000 Speaker 1: you can't have your books. Now, they didn't give him anything, 349 00:20:34,000 --> 00:20:36,000 Speaker 1: and this guy was like, whoa, whoa, wait a minute. 350 00:20:36,040 --> 00:20:37,919 Speaker 1: This is day one. He's like, whoa, whoa, whoa, like, 351 00:20:38,040 --> 00:20:40,080 Speaker 1: I need those books. I'm taking the GRE. 352 00:20:40,280 --> 00:20:43,400 Speaker 1: I'm basically leaving here after two weeks and going to take 353 00:20:43,440 --> 00:20:46,320 Speaker 1: the test, like, I've got to spend these two weeks studying. 354 00:20:46,520 --> 00:20:49,000 Speaker 1: They're like, you can't have your books. So he quickly 355 00:20:49,040 --> 00:20:52,240 Speaker 1: saw that the only way out was to fake this 356 00:20:52,320 --> 00:20:54,880 Speaker 1: nervous breakdown. And Billy Crudup went in there and said, 357 00:20:54,880 --> 00:20:57,600 Speaker 1: why is everyone saying whoa, whoa, whoa? Only I can 358 00:20:57,680 --> 00:21:01,960 Speaker 1: say whoa, whoa, whoa, whoa, whoa, whoa. Uh, yeah. So 359 00:21:02,600 --> 00:21:05,639 Speaker 1: we've kind of poo-pooed the two major findings from 360 00:21:05,640 --> 00:21:08,240 Speaker 1: the study already. So that's a huge deal, right, 361 00:21:08,280 --> 00:21:11,320 Speaker 1: because again, the idea is that if you put people, 362 00:21:11,480 --> 00:21:14,359 Speaker 1: any random people... remember, these are just average, like, 363 00:21:14,560 --> 00:21:18,320 Speaker 1: middle class white kids, um, which is another problem. Right, 364 00:21:18,359 --> 00:21:21,640 Speaker 1: if you put any... well, that means 365 00:21:21,680 --> 00:21:24,760 Speaker 1: everybody, that's the whole world. Right. If you put anybody 366 00:21:24,760 --> 00:21:26,760 Speaker 1: in the world in this situation, they're going to either 367 00:21:26,800 --> 00:21:31,399 Speaker 1: turn evil or lose their marbles. So, um, those are 368 00:21:31,440 --> 00:21:33,960 Speaker 1: the two findings. That's what everybody took it as at first. 369 00:21:34,119 --> 00:21:36,959 Speaker 1: It later came out, no, this guy was acting. This 370 00:21:37,000 --> 00:21:40,439 Speaker 1: guy was faking. So what else do we have?
Then, well, 371 00:21:40,760 --> 00:21:46,960 Speaker 1: we have this idea that Zimbardo insinuated himself as part 372 00:21:47,040 --> 00:21:50,920 Speaker 1: of the experiment, and that actually created the findings from 373 00:21:50,960 --> 00:21:54,600 Speaker 1: the Stanford prison experiment. So should we put a pin 374 00:21:54,640 --> 00:21:56,480 Speaker 1: in that? You want to talk about that now? No, no, 375 00:21:56,520 --> 00:21:57,879 Speaker 1: I want to go where you 376 00:21:57,920 --> 00:21:59,359 Speaker 1: want to go. All right, let's put a pin in 377 00:21:59,400 --> 00:22:01,399 Speaker 1: that, then, and talk a little bit more about 378 00:22:01,400 --> 00:22:05,520 Speaker 1: what went on that week. Um, they had everything from 379 00:22:06,320 --> 00:22:09,480 Speaker 1: visitation, like, you could write a letter to your family 380 00:22:09,600 --> 00:22:11,680 Speaker 1: or girlfriend or whoever you wanted to come visit you, 381 00:22:12,359 --> 00:22:14,840 Speaker 1: to ask for visitation rights, and the families came in, 382 00:22:15,480 --> 00:22:17,199 Speaker 1: and they did, they came in and visited for an 383 00:22:17,200 --> 00:22:20,120 Speaker 1: hour, and in some cases parents were like, 384 00:22:20,880 --> 00:22:23,720 Speaker 1: I don't know about this. This just 385 00:22:23,800 --> 00:22:27,080 Speaker 1: seems like a really weird thing. And Zimbardo would be like, 386 00:22:27,119 --> 00:22:30,400 Speaker 1: oh no, it's totally fine. Like, you know, they're the psychologists. Yeah, 387 00:22:30,400 --> 00:22:31,960 Speaker 1: like, they want to be here, like, ask them. And 388 00:22:32,000 --> 00:22:35,360 Speaker 1: the kids, you know, they did say that they 389 00:22:36,520 --> 00:22:42,479 Speaker 1: wanted to stay. Okay, which is important. Okay, 390 00:22:42,920 --> 00:22:45,520 Speaker 1: so what else is important is, like, no one 391 00:22:45,560 --> 00:22:47,320 Speaker 1: in the visiting hour, I don't think, was like, get 392 00:22:47,320 --> 00:22:49,640 Speaker 1: me out of here. They're like, no, this is all 393 00:22:49,680 --> 00:22:53,360 Speaker 1: part of the act, essentially. Um, they 394 00:22:53,400 --> 00:22:57,520 Speaker 1: had parole hearings, inside the course of a week somehow. 395 00:22:57,840 --> 00:23:00,639 Speaker 1: They said that they could be released if 396 00:23:00,680 --> 00:23:03,520 Speaker 1: they would forfeit the money. Uh, and this is after, 397 00:23:03,920 --> 00:23:06,800 Speaker 1: I don't know how many, of the six days, but, um, 398 00:23:06,880 --> 00:23:10,840 Speaker 1: they could not get paid and be paroled if 399 00:23:10,840 --> 00:23:12,399 Speaker 1: they went in front of the parole board. They went 400 00:23:12,440 --> 00:23:14,120 Speaker 1: in front of the parole board, some of them did, 401 00:23:14,760 --> 00:23:16,800 Speaker 1: and most of the prisoners said that they would give 402 00:23:16,840 --> 00:23:20,240 Speaker 1: up their money, in fact. And the parole board members, 403 00:23:20,560 --> 00:23:22,400 Speaker 1: like I said, they were the graduate assistants. They even 404 00:23:22,400 --> 00:23:27,040 Speaker 1: had one, um, former prisoner, this guy that, like, 405 00:23:27,080 --> 00:23:30,840 Speaker 1: was a fifteen-year, yeah, inmate, a fifteen- or seventeen-year 406 00:23:30,880 --> 00:23:34,359 Speaker 1: inmate, on the board, that, I guess... Zimbardo.
I 407 00:23:34,400 --> 00:23:37,359 Speaker 1: want to call him Zamboni. So he actually was a 408 00:23:37,400 --> 00:23:42,320 Speaker 1: friend of Jaffe's, the guy who originally came up with the experiment as an undergrad, 409 00:23:42,440 --> 00:23:44,239 Speaker 1: so he brought him in on it, right. So he 410 00:23:44,280 --> 00:23:46,439 Speaker 1: was on the parole board, and he was kind of 411 00:23:46,440 --> 00:23:48,840 Speaker 1: one of the ones, um, at least in the film 412 00:23:48,960 --> 00:23:51,280 Speaker 1: version, that was kind of saying, like, no, this is 413 00:23:51,320 --> 00:23:53,359 Speaker 1: like how it is, like, you should keep it going. 414 00:23:54,480 --> 00:23:56,280 Speaker 1: But I don't know how much of that was dramatized. 415 00:23:56,520 --> 00:23:58,840 Speaker 1: I don't either. That's one of 416 00:23:58,880 --> 00:24:01,280 Speaker 1: the problems with this, is, you know, so much of 417 00:24:01,320 --> 00:24:04,320 Speaker 1: the documentation has not been released over the years, and 418 00:24:04,320 --> 00:24:06,840 Speaker 1: when it does get released, it contradicts the official line. 419 00:24:06,880 --> 00:24:10,280 Speaker 1: And, um, it's very tough to separate truth from fiction, 420 00:24:10,600 --> 00:24:13,640 Speaker 1: especially when you introduce a Hollywood movie into the whole thing, 421 00:24:13,760 --> 00:24:17,560 Speaker 1: just to drive those nails into the coffin too. Yeah, 422 00:24:17,560 --> 00:24:19,960 Speaker 1: and so, in reality, in fact, there have been a lot of 423 00:24:20,280 --> 00:24:22,919 Speaker 1: complaints in the years since that a 424 00:24:22,920 --> 00:24:25,280 Speaker 1: lot of these, you know, kids were screaming, I want 425 00:24:25,280 --> 00:24:27,359 Speaker 1: to go home, I want to go home. And for 426 00:24:27,480 --> 00:24:31,639 Speaker 1: his part, Zimbardo said, the contract says, I 427 00:24:31,720 --> 00:24:36,080 Speaker 1: want to exit the experiment, is the official line to say, 428 00:24:36,160 --> 00:24:38,159 Speaker 1: and they could have gone home. And he was like, 429 00:24:38,240 --> 00:24:39,800 Speaker 1: but you hear, no one ever said, I want to 430 00:24:39,800 --> 00:24:41,960 Speaker 1: exit the experiment. They would say, I want my mommy, 431 00:24:42,359 --> 00:24:45,479 Speaker 1: or I'm going crazy, or my god, please stop this, 432 00:24:45,680 --> 00:24:48,840 Speaker 1: please stop this. But they never said those exact words, 433 00:24:48,920 --> 00:24:51,480 Speaker 1: this safe phrase, let's say. Yeah, the safe phrase. But 434 00:24:51,800 --> 00:24:54,359 Speaker 1: it turns out that's bunk too, right? Yeah, it turns 435 00:24:54,359 --> 00:24:56,800 Speaker 1: out that if you look at the contract that they 436 00:24:56,840 --> 00:24:59,159 Speaker 1: had, that he's referencing, that lays out the rules and 437 00:24:59,200 --> 00:25:02,880 Speaker 1: everything, there's no safe word mentioned in it. 438 00:25:03,040 --> 00:25:05,360 Speaker 1: It certainly doesn't say, if you say, I want to quit 439 00:25:05,400 --> 00:25:08,600 Speaker 1: the experiment, you get released from the experiment. So he's 440 00:25:08,640 --> 00:25:10,800 Speaker 1: just flat out lying about that, then, from what 441 00:25:10,880 --> 00:25:13,600 Speaker 1: I understand. Yes. And what article was this that you sent?
442 00:25:14,080 --> 00:25:18,960 Speaker 1: There's a really good takedown, um, on Medium called 443 00:25:18,960 --> 00:25:23,520 Speaker 1: The Lifespan of a Lie, um, and that title's based on, 444 00:25:23,680 --> 00:25:28,240 Speaker 1: I think, a documentary or 445 00:25:28,440 --> 00:25:34,040 Speaker 1: book by a French filmmaker 446 00:25:34,600 --> 00:25:38,199 Speaker 1: who titled his version The Birth of a Lie, and 447 00:25:38,240 --> 00:25:42,080 Speaker 1: it's basically about how the Stanford prison experiment was just 448 00:25:42,280 --> 00:25:44,760 Speaker 1: basically bunk from the get-go, which we'll 449 00:25:44,880 --> 00:25:46,959 Speaker 1: kind of pick apart in a little bit, and 450 00:25:47,000 --> 00:25:51,560 Speaker 1: that it's just, fascinatingly, been perpetuated over, basically, 451 00:25:51,640 --> 00:25:56,200 Speaker 1: fifty years. It just entered the cultural zeitgeist and just 452 00:25:56,280 --> 00:26:00,080 Speaker 1: stayed, like an infection. All right. Some other things that 453 00:26:00,200 --> 00:26:03,359 Speaker 1: happened to make it realistic: they brought in, um, a 454 00:26:03,480 --> 00:26:06,919 Speaker 1: lawyer when parents asked for one, and played along like 455 00:26:07,000 --> 00:26:09,640 Speaker 1: it was real. They brought in a chaplain who came 456 00:26:09,680 --> 00:26:13,080 Speaker 1: in to speak to prisoners, uh, and he played along 457 00:26:13,119 --> 00:26:18,840 Speaker 1: with it too. Uh, they basically did everything that you 458 00:26:18,840 --> 00:26:22,159 Speaker 1: would think would happen in a real prison, um, on 459 00:26:22,280 --> 00:26:25,240 Speaker 1: a slightly scaled-down level. Right. But the upshot 460 00:26:25,240 --> 00:26:28,399 Speaker 1: of all of this is Zimbardo saying, like, do you 461 00:26:28,480 --> 00:26:32,040 Speaker 1: see what's going on here, everybody? Like, I just put 462 00:26:32,200 --> 00:26:35,320 Speaker 1: some guys in, like, nine guys in at a time, 463 00:26:35,400 --> 00:26:38,359 Speaker 1: or twelve guys as guards, twelve guys as prisoners, and 464 00:26:38,440 --> 00:26:43,280 Speaker 1: their parents came for visiting hours, a lawyer came. That's 465 00:26:43,320 --> 00:26:46,919 Speaker 1: how real the simulated prison became in people's minds. 466 00:26:46,960 --> 00:26:51,000 Speaker 1: Just imagine what a real prison's like, right? So, um, 467 00:26:51,040 --> 00:26:52,600 Speaker 1: and he was saying they could have left at any 468 00:26:52,600 --> 00:26:54,159 Speaker 1: time if they just said the safe word, and no 469 00:26:54,160 --> 00:26:57,600 Speaker 1: one ever said the safe word. There is some evidence 470 00:26:57,680 --> 00:27:00,400 Speaker 1: that these people were basically kept there against their will, 471 00:27:01,600 --> 00:27:08,919 Speaker 1: especially after Douglas Korpi basically faked his emotional breakdown and 472 00:27:08,960 --> 00:27:13,800 Speaker 1: then was thrown into a broom closet in retaliation for it. 473 00:27:14,480 --> 00:27:17,919 Speaker 1: Um, he very, very clearly should have 474 00:27:17,960 --> 00:27:20,919 Speaker 1: been let go, or allowed to leave. And to even be 475 00:27:21,040 --> 00:27:24,280 Speaker 1: led to think that you couldn't leave, which is apparently 476 00:27:24,320 --> 00:27:28,200 Speaker 1: the idea that spread throughout the prisoners,
Um, that would 477 00:27:28,200 --> 00:27:30,760 Speaker 1: be like keeping someone against their will. Yeah, and he 478 00:27:30,800 --> 00:27:33,920 Speaker 1: did leave, but was supposed to agree to come back, supposedly, 479 00:27:34,880 --> 00:27:38,560 Speaker 1: uh, to, like, play a different role, as a prisoner 480 00:27:38,600 --> 00:27:41,280 Speaker 1: who, like, maybe escaped and came back, I think, but 481 00:27:41,480 --> 00:27:45,480 Speaker 1: didn't come back, right. And, um, I think five people 482 00:27:45,560 --> 00:27:49,000 Speaker 1: were released early before the whole experiment was called off, 483 00:27:49,000 --> 00:27:53,600 Speaker 1: all prisoners. No guards left the experiment, which is telling. Well, 484 00:27:53,640 --> 00:27:55,840 Speaker 1: and they were working in shifts, which is important. Okay, 485 00:27:55,840 --> 00:27:58,280 Speaker 1: that is a big one too. Um. But if 486 00:27:58,320 --> 00:28:00,760 Speaker 1: you consider that no one asked to be a guard, 487 00:28:00,960 --> 00:28:03,159 Speaker 1: they all asked to be prisoners, but then none of 488 00:28:03,200 --> 00:28:06,960 Speaker 1: the guards left the experiment, that's, to me, that's interesting 489 00:28:07,000 --> 00:28:10,480 Speaker 1: on its face, right. There's something to that. But, um, 490 00:28:10,600 --> 00:28:15,359 Speaker 1: the whole thing just kind of fell apart after 491 00:28:15,680 --> 00:28:21,399 Speaker 1: Zimbardo's girlfriend, um, at the time, came. Um, the idea 492 00:28:21,480 --> 00:28:24,560 Speaker 1: that up to this point, these people had engaged in 493 00:28:24,600 --> 00:28:27,919 Speaker 1: this fantasy and thought that they couldn't leave when they 494 00:28:27,960 --> 00:28:32,159 Speaker 1: really could, that's controversial in and of itself, because again, 495 00:28:32,520 --> 00:28:35,359 Speaker 1: there's evidence that they were led to believe they couldn't leave, 496 00:28:35,920 --> 00:28:41,480 Speaker 1: and that's different. That changes things entirely. So, you want 497 00:28:41,480 --> 00:28:44,720 Speaker 1: to take another break and then pick this apart some more? Yeah, 498 00:28:44,840 --> 00:29:09,040 Speaker 1: let's do it. Kind of fun. All right, the final takedown. 499 00:29:09,240 --> 00:29:12,440 Speaker 1: I'm waiting for Philip Zimbardo to release 500 00:29:12,480 --> 00:29:17,240 Speaker 1: a book about, like, our Jackhammer episode. That's fine, I 501 00:29:17,280 --> 00:29:20,600 Speaker 1: would read it. All right, so where are we here? Basically, 502 00:29:21,080 --> 00:29:24,120 Speaker 1: we're at the point where, uh, he has 503 00:29:24,160 --> 00:29:27,760 Speaker 1: ended the experiment, and now we're dealing with the fallout 504 00:29:27,840 --> 00:29:31,120 Speaker 1: since nineteen seventy one and how this should be viewed. One of 505 00:29:31,160 --> 00:29:33,480 Speaker 1: the big things that came out of that French book, 506 00:29:33,560 --> 00:29:38,120 Speaker 1: The Birth of a Lie, is, um, the filmmaker 507 00:29:38,640 --> 00:29:43,840 Speaker 1: unearthed a recording. I don't know where he 508 00:29:43,880 --> 00:29:46,920 Speaker 1: found it, but he found it and released the 509 00:29:47,000 --> 00:29:52,280 Speaker 1: transcript of it, and it clearly has, um, if not Zimbardo, 510 00:29:52,400 --> 00:29:57,520 Speaker 1: at least Jaffe, definitely Jaffe, coaching the, um, the guards 511 00:29:57,880 --> 00:30:01,400 Speaker 1: to be more brutal. Right, be a tough guard.
Just 512 00:30:01,440 --> 00:30:03,200 Speaker 1: think of, like, how the pigs do it, and do 513 00:30:03,240 --> 00:30:06,120 Speaker 1: it like that, I think is what the quote was, right? Yeah, 514 00:30:06,120 --> 00:30:07,920 Speaker 1: when the whole idea of this thing is to try 515 00:30:07,960 --> 00:30:12,520 Speaker 1: and prove that, without any influence, yes, this is what happens. Right. 516 00:30:12,840 --> 00:30:15,920 Speaker 1: So there's a couple of things that happened... methodologically, 517 00:30:15,960 --> 00:30:17,920 Speaker 1: there's a lot of things that happened the moment they 518 00:30:17,920 --> 00:30:21,960 Speaker 1: started coaching those guards. Number one, they took any organic-ness 519 00:30:22,040 --> 00:30:24,600 Speaker 1: out of their behavior. They were then doing what 520 00:30:24,640 --> 00:30:26,800 Speaker 1: they thought they were expected to do, like John Wayne 521 00:30:27,080 --> 00:30:28,800 Speaker 1: for sure, who just went over the top, is what 522 00:30:28,880 --> 00:30:32,800 Speaker 1: it was, um. And then number two, they made them 523 00:30:33,080 --> 00:30:36,120 Speaker 1: co-experimenters. Like, the whole thing was supposed to be 524 00:30:36,160 --> 00:30:38,720 Speaker 1: guards and prisoners, and we're going to watch you as test 525 00:30:38,760 --> 00:30:43,960 Speaker 1: subjects or participants. And when you coach the guards, you're 526 00:30:44,120 --> 00:30:47,440 Speaker 1: their co-experimenters now. Now the experiment's entirely on the 527 00:30:47,600 --> 00:30:50,520 Speaker 1: prisoners, which, you can say, okay, well, then 528 00:30:50,560 --> 00:30:52,960 Speaker 1: those findings still work. Well, that gets thrown out when 529 00:30:53,000 --> 00:30:54,640 Speaker 1: you base the whole thing on a guy who is 530 00:30:54,680 --> 00:30:58,600 Speaker 1: faking, right. But you make the guards co-experimenters 531 00:30:58,760 --> 00:31:02,360 Speaker 1: and you just completely take out any objectivity from 532 00:31:02,360 --> 00:31:06,680 Speaker 1: this experiment. That's problem one with the methodology. Well, and 533 00:31:06,760 --> 00:31:08,720 Speaker 1: the fact we already mentioned that one of 534 00:31:08,760 --> 00:31:11,960 Speaker 1: the researchers was a warden, and Zim... uh, I keep 535 00:31:12,040 --> 00:31:17,680 Speaker 1: wanting to call him Zambrano. That's fine. Zimbardo, Zamboni, himself was 536 00:31:17,760 --> 00:31:20,560 Speaker 1: the superintendent. Like, the minute he decided to do that... 537 00:31:21,000 --> 00:31:23,400 Speaker 1: like, I looked it up, I think he was, like, in his 538 00:31:23,480 --> 00:31:26,360 Speaker 1: late thirties when he did this. How did he not, like... 539 00:31:26,520 --> 00:31:29,040 Speaker 1: was he that bad at doing his job? How did 540 00:31:29,120 --> 00:31:31,400 Speaker 1: he not know, like, wait a minute, this will taint 541 00:31:31,440 --> 00:31:34,000 Speaker 1: the experiment? Do you want to talk about why 542 00:31:34,240 --> 00:31:38,320 Speaker 1: people think that he was? So, yeah, okay. So he 543 00:31:38,480 --> 00:31:41,440 Speaker 1: was, uh, he was, and I think still is, a 544 00:31:41,520 --> 00:31:45,479 Speaker 1: social activist, for sure. And he had decided, um, and 545 00:31:45,520 --> 00:31:48,320 Speaker 1: I can't really disagree with him, that prisons were brutal 546 00:31:48,400 --> 00:31:53,240 Speaker 1: places where brutality lived, and that they were inherently brutal.
547 00:31:53,680 --> 00:31:55,600 Speaker 1: And so if you take somebody and put them into 548 00:31:55,640 --> 00:31:58,680 Speaker 1: this place, you're doing a real disservice to humanity by 549 00:31:58,680 --> 00:32:00,920 Speaker 1: throwing somebody in a brutal place that you know 550 00:32:01,040 --> 00:32:03,920 Speaker 1: is brutal. His aim was to get reform to happen. Yes, 551 00:32:04,720 --> 00:32:06,920 Speaker 1: from the outset. Well, I mean, I can't fault that, 552 00:32:06,960 --> 00:32:10,800 Speaker 1: but you can't call it a scientific experiment either. And 553 00:32:10,840 --> 00:32:15,959 Speaker 1: it actually supposedly backfired as well, because one interpretation of 554 00:32:15,960 --> 00:32:20,240 Speaker 1: his findings is that it's all or nothing with prisons. 555 00:32:20,360 --> 00:32:25,000 Speaker 1: Prisons are inherently brutal, or you can't have them. So 556 00:32:25,040 --> 00:32:27,640 Speaker 1: either you have prisons and you have brutal prisons, or 557 00:32:27,800 --> 00:32:30,959 Speaker 1: you have no prisons. And so, faced with that choice 558 00:32:31,000 --> 00:32:34,320 Speaker 1: and with rising crime rates in the seventies, um, a 559 00:32:34,320 --> 00:32:37,120 Speaker 1: lot of people doubled down on getting tough and made 560 00:32:37,120 --> 00:32:39,840 Speaker 1: prisons even worse and built more prisons and said, yes, 561 00:32:40,200 --> 00:32:42,960 Speaker 1: we're not even gonna try to, like, reform you anymore, 562 00:32:43,280 --> 00:32:45,520 Speaker 1: we're just gonna send you to these brutal places 563 00:32:45,560 --> 00:32:47,640 Speaker 1: that are inherently brutal, and there's nothing we can do 564 00:32:47,720 --> 00:32:50,800 Speaker 1: about it. So it would have backfired 565 00:32:50,840 --> 00:32:53,400 Speaker 1: in that sense. But in the idea that 566 00:32:53,440 --> 00:32:56,360 Speaker 1: he was doing something with the best interests of his 567 00:32:56,440 --> 00:33:00,320 Speaker 1: fellow people at heart, again, like you said, it's 568 00:33:00,320 --> 00:33:03,840 Speaker 1: tough to fault him for that. He just really, really 569 00:33:04,200 --> 00:33:08,720 Speaker 1: gave social psychology a black eye. Yeah. So one of 570 00:33:08,720 --> 00:33:10,920 Speaker 1: the other things he did wrong, um, and this one 571 00:33:11,000 --> 00:33:13,880 Speaker 1: I just can't figure out either, is he didn't have 572 00:33:13,920 --> 00:33:17,440 Speaker 1: a control group. And one of his, um, this guy 573 00:33:17,520 --> 00:33:20,680 Speaker 1: wasn't in the experiment, but one of his colleagues, came 574 00:33:20,680 --> 00:33:23,440 Speaker 1: by one day and was like, you know, what's 575 00:33:23,440 --> 00:33:26,120 Speaker 1: your control, what's your independent variable? Yeah. And he was 576 00:33:26,200 --> 00:33:29,640 Speaker 1: like, what? Yeah. He's like, I don't have one. So 577 00:33:29,720 --> 00:33:33,440 Speaker 1: if you run an experiment of any sort, 578 00:33:33,840 --> 00:33:36,720 Speaker 1: um, the Grabster uses a great analogy, where if you're trying 579 00:33:36,720 --> 00:33:39,760 Speaker 1: to figure out what the effects of radiation are on tomatoes, 580 00:33:40,280 --> 00:33:42,720 Speaker 1: you pick a bunch of tomatoes, you weigh them, you 581 00:33:42,800 --> 00:33:45,360 Speaker 1: check them for color, um, you make sure that they're 582 00:33:45,400 --> 00:33:49,040 Speaker 1: identical to another set of tomatoes.
So you have two 583 00:33:49,080 --> 00:33:52,680 Speaker 1: sets of basically identical tomatoes. One you irradiate, one you 584 00:33:52,720 --> 00:33:54,800 Speaker 1: do not, and after a set amount of time you 585 00:33:54,840 --> 00:33:57,000 Speaker 1: go back and see what the differences are. And then 586 00:33:57,040 --> 00:34:00,680 Speaker 1: you can say, probably, that when you irradiate tomatoes, 587 00:34:00,720 --> 00:34:02,680 Speaker 1: these are the effects, and the effects are the differences 588 00:34:02,720 --> 00:34:06,120 Speaker 1: between the two. Same thing with the prison experiment. What 589 00:34:06,160 --> 00:34:10,080 Speaker 1: you would have here is two different cell blocks, and one 590 00:34:10,160 --> 00:34:13,520 Speaker 1: that literally isn't coached and is completely left alone. That's what 591 00:34:13,560 --> 00:34:15,359 Speaker 1: I would have done for sure. And then one where 592 00:34:15,360 --> 00:34:18,120 Speaker 1: you're saying, hey, be brutal, and yeah, we'll see if 593 00:34:18,160 --> 00:34:20,840 Speaker 1: everyone falls into these roles. Exactly. That would have 594 00:34:20,880 --> 00:34:23,880 Speaker 1: been great. And actually some researchers in two thousand and one, 595 00:34:24,360 --> 00:34:28,520 Speaker 1: they did exactly that. They basically ran the experiment with 596 00:34:28,680 --> 00:34:31,839 Speaker 1: just that control group you suggested. Um, it was called 597 00:34:31,880 --> 00:34:35,520 Speaker 1: the BBC Prison Study. Yeah, Haslam and Reicher. Yeah, 598 00:34:35,600 --> 00:34:38,040 Speaker 1: and basically they did the same thing. They 599 00:34:38,120 --> 00:34:41,320 Speaker 1: did not do any coaching, they didn't do any intervention. 600 00:34:41,360 --> 00:34:44,279 Speaker 1: They did the thing exactly like you're supposed to, 601 00:34:44,400 --> 00:34:47,720 Speaker 1: or like Zimbardo should have from the outset. And, um, 602 00:34:47,760 --> 00:34:50,880 Speaker 1: they found, and again they were basically the control group to 603 00:34:50,920 --> 00:34:54,000 Speaker 1: the original Stanford prison experiment, they found that the exact 604 00:34:54,080 --> 00:34:58,080 Speaker 1: opposite happened. The prisoners stayed banded together, the guards were 605 00:34:58,120 --> 00:35:04,200 Speaker 1: totally in disarray, um, and disorganized. The brutality never emerged, um, 606 00:35:04,239 --> 00:35:06,680 Speaker 1: and there wasn't any violence, from what I understand. And 607 00:35:06,719 --> 00:35:08,879 Speaker 1: this is where it gets really scummy, if you ask me. 608 00:35:09,719 --> 00:35:14,520 Speaker 1: Zimbardo found out about this, and supposedly Haslam and Reicher said 609 00:35:14,520 --> 00:35:18,480 Speaker 1: they discovered he was privately writing editors, uh, to keep 610 00:35:18,520 --> 00:35:22,520 Speaker 1: them from getting published and claiming that they were fraudulent. Yeah, 611 00:35:22,600 --> 00:35:25,560 Speaker 1: in the journal where they released their findings, and 612 00:35:25,680 --> 00:35:30,480 Speaker 1: he wrote an appendix to their article and said, 613 00:35:30,960 --> 00:35:33,440 Speaker 1: just don't even listen to these guys. 614 00:35:33,640 --> 00:35:37,520 Speaker 1: I'm Philip Zimbardo, man. So yeah, I thought that was 615 00:35:37,560 --> 00:35:40,960 Speaker 1: pretty scummy too, if he did that. Um, so you've 616 00:35:40,960 --> 00:35:45,120 Speaker 1: got, methodologically, there's even more problems too.
In the 617 00:35:45,239 --> 00:35:50,840 Speaker 1: original newspaper advertisement, Chuck, he said, um, prison experiment. 618 00:35:50,920 --> 00:35:53,680 Speaker 1: Prison experiment! Everybody sign up. Yeah, that was a problem 619 00:35:53,719 --> 00:35:56,040 Speaker 1: in itself. They shouldn't have known what they were doing, no, 620 00:35:56,160 --> 00:35:58,759 Speaker 1: exactly, until they showed up, right. So you're gonna get 621 00:35:58,800 --> 00:36:02,200 Speaker 1: a big, wide swath of people, and then once they 622 00:36:02,200 --> 00:36:04,480 Speaker 1: find out what the experiment is, maybe they'll say no 623 00:36:04,600 --> 00:36:08,640 Speaker 1: thanks or whatever. But this was attracting, um, as a 624 00:36:08,680 --> 00:36:14,160 Speaker 1: two thousand and seven follow-up study found, narcissistic, hostile, 625 00:36:15,080 --> 00:36:21,239 Speaker 1: overly aggressive, authoritarian types like flies to honey. Or the opposite. Well, 626 00:36:21,280 --> 00:36:23,239 Speaker 1: that seems to be the case in this case. Yeah, 627 00:36:23,239 --> 00:36:25,640 Speaker 1: and in fact, one of them was a liberal 628 00:36:25,680 --> 00:36:29,120 Speaker 1: activist who kind of purposely went in there because he 629 00:36:29,120 --> 00:36:32,759 Speaker 1: thought maybe these findings could be used one day for 630 00:36:33,080 --> 00:36:36,520 Speaker 1: prison reform. Well, I think also, most of the, um, 631 00:36:36,800 --> 00:36:39,319 Speaker 1: what I got from Jaffe coaching the people to say, like, 632 00:36:39,360 --> 00:36:41,600 Speaker 1: think about what the pigs would do, and then 633 00:36:41,600 --> 00:36:43,960 Speaker 1: do that, because we really got to show them 634 00:36:44,040 --> 00:36:47,839 Speaker 1: how brutal prisons are. Um, I think everybody who showed 635 00:36:47,920 --> 00:36:52,160 Speaker 1: up basically was against prisons. But whether you're against prisons 636 00:36:52,200 --> 00:36:56,080 Speaker 1: or for them, you were automatically tainted before you even showed 637 00:36:56,160 --> 00:37:00,200 Speaker 1: up for the interview, because they wrote prison experiment in 638 00:37:00,239 --> 00:37:03,440 Speaker 1: the ad. So from the outset there was bias, there 639 00:37:03,480 --> 00:37:06,720 Speaker 1: was no control group, and it attracted a biased cross-section 640 00:37:06,760 --> 00:37:11,240 Speaker 1: of the people who participated. He was a participant, and that actually, 641 00:37:11,320 --> 00:37:17,040 Speaker 1: Chuck, led to the second set of findings, that Zimbardo 642 00:37:17,120 --> 00:37:23,480 Speaker 1: had influenced this and become a participant himself. And here's 643 00:37:23,520 --> 00:37:27,000 Speaker 1: the current interpretation of all of it. Okay, this seems 644 00:37:27,040 --> 00:37:31,560 Speaker 1: to be the current du jour interpretation of the Stanford prison experiment. 645 00:37:32,560 --> 00:37:37,040 Speaker 1: Not that people are inherently cruel and inherently will just 646 00:37:37,080 --> 00:37:39,239 Speaker 1: crumble in the face of authority, although that might 647 00:37:39,360 --> 00:37:45,719 Speaker 1: still stand, but that people are capable of 648 00:37:45,760 --> 00:37:50,359 Speaker 1: cruelty if they're recruited by an authority figure. The second set, 649 00:37:50,360 --> 00:37:54,160 Speaker 1: and there's actually been three sets of interpretations.
The second 650 00:37:54,160 --> 00:37:58,040 Speaker 1: set was that Zimbardo inserted himself and that it actually 651 00:37:58,400 --> 00:38:02,759 Speaker 1: demonstrated what's called situationist theory. Yeah, and that's basically 652 00:38:02,840 --> 00:38:07,839 Speaker 1: that external circumstances are the drivers of human behavior. Right. 653 00:38:07,920 --> 00:38:10,760 Speaker 1: So the point was not that people are inherently cruel 654 00:38:10,880 --> 00:38:13,879 Speaker 1: on an individual level, but that, in the situation that they're put in, 655 00:38:14,280 --> 00:38:18,000 Speaker 1: they will quickly find those roles if there's a power 656 00:38:18,040 --> 00:38:22,200 Speaker 1: structure above them that has normalized this and 657 00:38:22,320 --> 00:38:25,960 Speaker 1: is expecting them to fulfill those roles. And this really 658 00:38:26,000 --> 00:38:29,120 Speaker 1: tied in with, you know, this was '71. People were 659 00:38:29,160 --> 00:38:32,279 Speaker 1: still really trying to figure out what the heck had 660 00:38:32,360 --> 00:38:36,120 Speaker 1: just happened with the Nazis. It was only a couple of decades before. 661 00:38:36,840 --> 00:38:40,239 Speaker 1: So this idea, this banality of evil, made 662 00:38:40,440 --> 00:38:43,320 Speaker 1: perfect sense in that respect, right. There was 663 00:38:43,360 --> 00:38:47,440 Speaker 1: a bureaucracy that had normalized evil and you were just 664 00:38:47,600 --> 00:38:51,480 Speaker 1: following orders. That was the second interpretation of the Stanford 665 00:38:51,520 --> 00:38:54,440 Speaker 1: prison experiment. Yeah. Well, and not just the Nazis, but 666 00:38:54,560 --> 00:38:58,640 Speaker 1: everything like the Vietnam War, which was, I mean, this 667 00:38:58,719 --> 00:39:02,799 Speaker 1: was '71, and like the My Lai massacre, and, you know, 668 00:39:02,840 --> 00:39:06,080 Speaker 1: I was just following orders. Like, this tied in, this 669 00:39:06,120 --> 00:39:08,920 Speaker 1: had its fingers in a lot of the relevant politics of 670 00:39:08,960 --> 00:39:12,520 Speaker 1: the day. Right. So, um, apparently it also tied in 671 00:39:12,600 --> 00:39:15,800 Speaker 1: really well to Attica, and Zimbardo must have just not been 672 00:39:15,800 --> 00:39:18,640 Speaker 1: able to believe his good fortune that the 673 00:39:18,600 --> 00:39:22,919 Speaker 1: bloodiest prison riot in American history happened like a couple 674 00:39:22,960 --> 00:39:25,000 Speaker 1: of weeks after he made the news in the New 675 00:39:25,080 --> 00:39:27,840 Speaker 1: York Times Magazine with this article 676 00:39:27,920 --> 00:39:30,600 Speaker 1: that he wrote, right. But that actually played into it too, 677 00:39:30,640 --> 00:39:33,960 Speaker 1: because apparently, following orders, a lot of guards just fired 678 00:39:34,000 --> 00:39:37,799 Speaker 1: blindly into the tear gas smoke of this prison riot 679 00:39:37,840 --> 00:39:44,239 Speaker 1: and killed tons of unarmed prisoners and hostages. So 680 00:39:44,239 --> 00:39:46,839 Speaker 1: Zimbardo is like, okay, that's fine, however we're going 681 00:39:46,880 --> 00:39:50,799 Speaker 1: to interpret this, I'm cool with that. But the third one, 682 00:39:50,880 --> 00:39:53,239 Speaker 1: I'm not quite sure that he would be cool with: 683 00:39:53,719 --> 00:39:57,719 Speaker 1: the current one, which is bad science, I think.
So 684 00:39:57,760 --> 00:39:59,960 Speaker 1: what I saw is that a lot of social psychologists said, 685 00:40:00,360 --> 00:40:02,160 Speaker 1: we've known this is bad science all along, but the 686 00:40:02,160 --> 00:40:06,399 Speaker 1: findings were really interesting and worthwhile, so we didn't throw 687 00:40:06,440 --> 00:40:09,080 Speaker 1: the baby out with the bathwater. The third one is 688 00:40:09,160 --> 00:40:14,360 Speaker 1: that Zimbardo inserted himself, and what this 689 00:40:14,360 --> 00:40:19,120 Speaker 1: study really showed was that people will engage in 690 00:40:19,200 --> 00:40:22,880 Speaker 1: acts of cruelty if there is a figure of authority 691 00:40:22,920 --> 00:40:26,080 Speaker 1: recruiting them to what they think is a righteous cause. 692 00:40:26,400 --> 00:40:30,800 Speaker 1: And in this case it was Zimbardo making the guards 693 00:40:30,880 --> 00:40:34,400 Speaker 1: co-experimenters by coaching them to be cruel, 694 00:40:34,920 --> 00:40:38,359 Speaker 1: in the name of prison reform, ultimately, when they 695 00:40:38,400 --> 00:40:41,320 Speaker 1: showed the world what happens when you put normal people 696 00:40:41,320 --> 00:40:44,040 Speaker 1: in a prison situation. Yeah, which is what the John 697 00:40:44,080 --> 00:40:46,960 Speaker 1: Wayne guy very much has said all his life since then: 698 00:40:47,080 --> 00:40:49,800 Speaker 1: that this is what I thought they wanted, 699 00:40:49,960 --> 00:40:52,480 Speaker 1: for me to be a bad guard so we 700 00:40:52,520 --> 00:40:57,520 Speaker 1: could prove, uh, ultimately, that prisons need reform. And 701 00:40:57,520 --> 00:41:00,760 Speaker 1: that is why he's still complicit, because he still engaged 702 00:41:00,800 --> 00:41:04,719 Speaker 1: in these acts of genuine cruelty against the prisoners in 703 00:41:04,760 --> 00:41:07,440 Speaker 1: the study, and that's why he should still feel bad 704 00:41:07,440 --> 00:41:09,800 Speaker 1: and still does feel bad. But he did it because 705 00:41:10,000 --> 00:41:13,080 Speaker 1: he was recruited in the name of this righteous 706 00:41:13,160 --> 00:41:15,960 Speaker 1: cause by somebody who was in authority. So is this 707 00:41:16,040 --> 00:41:18,640 Speaker 1: being taught this way in classes now? I don't... I 708 00:41:18,719 --> 00:41:21,640 Speaker 1: think that, especially once it came out that Zimbardo, 709 00:41:21,760 --> 00:41:24,719 Speaker 1: and at the very least his warden co-experimenter, 710 00:41:24,880 --> 00:41:27,600 Speaker 1: was coaching them to do this, the 711 00:41:27,760 --> 00:41:31,319 Speaker 1: organic cruelty is just totally out the window. I think 712 00:41:31,600 --> 00:41:33,319 Speaker 1: they don't know what to do with it right now. 713 00:41:33,400 --> 00:41:36,439 Speaker 1: They're trying to figure out, like, how to get these 714 00:41:36,440 --> 00:41:39,239 Speaker 1: findings across or what to make of them. Because 715 00:41:39,440 --> 00:41:41,400 Speaker 1: one of these quotes from the article you sent, the 716 00:41:41,440 --> 00:41:43,920 Speaker 1: guy said, I don't think it's scientific fraud in the 717 00:41:43,920 --> 00:41:46,960 Speaker 1: typical sense. It was never considered to be scientific. It's 718 00:41:46,960 --> 00:41:52,000 Speaker 1: typically represented in classrooms as a demonstration, uh, not an experiment, 719 00:41:52,400 --> 00:41:56,160 Speaker 1: and as a notorious case of ethical malfeasance.
So that's 720 00:41:56,160 --> 00:42:00,200 Speaker 1: almost a fourth takeaway, is that it's an example of 721 00:42:00,239 --> 00:42:04,280 Speaker 1: how to not do a study correctly, which is interesting. 722 00:42:04,680 --> 00:42:09,120 Speaker 1: Oh yeah, I mean, methodologically, inserting yourself, like, lying about 723 00:42:09,160 --> 00:42:14,200 Speaker 1: the findings later on, or misinterpreting the results, or using spin. 724 00:42:14,400 --> 00:42:16,680 Speaker 1: It's, yeah, there's a lot here. But it was approved 725 00:42:16,680 --> 00:42:20,400 Speaker 1: by the Stanford Human Subjects Review Committee at the time. 726 00:42:20,440 --> 00:42:24,239 Speaker 1: Those were Zimbardo's words when this was presented to him, uh, and 727 00:42:24,280 --> 00:42:28,879 Speaker 1: you know, he still says that it was ethical. Well, 728 00:42:28,920 --> 00:42:31,160 Speaker 1: it was. At the time, under the guidelines, it was ethical, 729 00:42:31,320 --> 00:42:34,680 Speaker 1: but then they changed the guidelines. You couldn't do 730 00:42:34,680 --> 00:42:38,719 Speaker 1: this today, or at least not like he did it. So, 731 00:42:39,040 --> 00:42:43,080 Speaker 1: did you remember the very brief Psychology Is Nuts series 732 00:42:43,360 --> 00:42:46,160 Speaker 1: that I did? One was on the Stanford Prison Experiment. Yeah, 733 00:42:46,160 --> 00:42:48,160 Speaker 1: I watched that today. Did you? What did you think? 734 00:42:48,239 --> 00:42:50,840 Speaker 1: It was good. Thanks, man. Cute little background. Yeah, I 735 00:42:50,840 --> 00:42:55,719 Speaker 1: thought so too. Um, and let's see, you got anything else? No, 736 00:42:55,960 --> 00:42:57,920 Speaker 1: I mean, boy, I thought we were pretty scathing, but 737 00:42:58,200 --> 00:43:01,440 Speaker 1: we were. This is like vaping-level scathing. This 738 00:43:01,520 --> 00:43:03,600 Speaker 1: is way worse than vaping. I'm sure the vapers are 739 00:43:03,640 --> 00:43:05,880 Speaker 1: like, wow, they were really hard on that guy. Yeah. 740 00:43:05,920 --> 00:43:09,440 Speaker 1: The movie, uh, you know, the documentary is probably a 741 00:43:09,480 --> 00:43:13,279 Speaker 1: little more accurate, but the movie wasn't bad. I mean, 742 00:43:13,360 --> 00:43:16,000 Speaker 1: it's not great, yeah, but it was okay. It felt 743 00:43:16,000 --> 00:43:20,319 Speaker 1: like a movie of the week. It's an airplane movie. Yeah, 744 00:43:20,400 --> 00:43:24,120 Speaker 1: watch it on your next plane. That's my recommendation. Thanks, buddy. Well, 745 00:43:24,160 --> 00:43:28,000 Speaker 1: if you want to know more about the Stanford Prison Experiment, um, 746 00:43:28,239 --> 00:43:30,279 Speaker 1: type those words in the search bar at how stuff works 747 00:43:30,280 --> 00:43:32,560 Speaker 1: dot com and it will bring up this Grabster article. 748 00:43:32,680 --> 00:43:34,719 Speaker 1: Since I said Grabster, it's time for listener mail. 749 00:43:37,880 --> 00:43:40,319 Speaker 1: I'm gonna call this beautiful landscaping. Hey, guys, I've spent 750 00:43:40,360 --> 00:43:42,680 Speaker 1: the last two years fixing up the yard, uh, 751 00:43:42,760 --> 00:43:47,080 Speaker 1: at our house, uh, in Point Pleasant, Pennsylvania. Oh, that 752 00:43:47,120 --> 00:43:49,680 Speaker 1: sounds like a pleasant place. Yeah, it is.
My husband 753 00:43:49,680 --> 00:43:52,359 Speaker 1: actually introduced me to your show a few years back, and thank 754 00:43:52,400 --> 00:43:54,120 Speaker 1: god he did, because I've literally listened to you for 755 00:43:54,200 --> 00:43:57,000 Speaker 1: hours and hours while working in the yard. It was 756 00:43:57,000 --> 00:43:59,480 Speaker 1: a huge undertaking. I have a more flexible work schedule 757 00:43:59,800 --> 00:44:02,000 Speaker 1: than he does, so I volunteered to absorb most of 758 00:44:02,040 --> 00:44:04,840 Speaker 1: the responsibility, although he did a lot of heavy lifting. 759 00:44:04,880 --> 00:44:07,840 Speaker 1: I enjoyed the show so much, I stopped allowing myself 760 00:44:07,880 --> 00:44:10,680 Speaker 1: to listen to it any other time. You were 761 00:44:10,719 --> 00:44:13,279 Speaker 1: only allowed during yard work. This made me much more 762 00:44:13,280 --> 00:44:15,760 Speaker 1: ready to get outside and get into it. You guys 763 00:44:15,760 --> 00:44:17,600 Speaker 1: were with me while I carried literally tons of red 764 00:44:17,640 --> 00:44:20,720 Speaker 1: stone uphill in buckets, hauling rocks for a fire ring landing, 765 00:44:21,080 --> 00:44:27,840 Speaker 1: planted, uh, pachysandra, ferns, and hostas in the rockiest 766 00:44:27,960 --> 00:44:30,440 Speaker 1: soil I've ever had to work with, and just clearing 767 00:44:30,440 --> 00:44:33,360 Speaker 1: away overgrowth, which, it sounds like Tonya Harding training 768 00:44:33,400 --> 00:44:36,719 Speaker 1: for the Olympics in that one montage, which, 769 00:44:36,719 --> 00:44:38,520 Speaker 1: it turned out, included a fair amount of poison ivy. 770 00:44:38,640 --> 00:44:40,720 Speaker 1: During it all, I learned about a tiny, adorable little creature 771 00:44:40,719 --> 00:44:44,480 Speaker 1: called the tardigrade, the business of head transplants, the 772 00:44:44,520 --> 00:44:48,320 Speaker 1: hookworm, her favorite episode, and some haunting information I cannot 773 00:44:48,320 --> 00:44:51,799 Speaker 1: unhear, such as you provided in the bullfighting and 774 00:44:51,840 --> 00:44:55,879 Speaker 1: drowning episodes. You're always very entertaining, full of information. Even 775 00:44:55,920 --> 00:44:58,000 Speaker 1: when I think it's boring, you make it fun. There 776 00:44:58,000 --> 00:44:59,480 Speaker 1: were times you had me LOLing in 777 00:44:59,560 --> 00:45:02,760 Speaker 1: my backyard, alone and covered in dirt and sweat 778 00:45:02,840 --> 00:45:05,840 Speaker 1: like a crazy person. Attached are some pictures of the progress. 779 00:45:06,800 --> 00:45:11,440 Speaker 1: All from your climate-controlled studio. That is from Sharon Prashynski. 780 00:45:11,880 --> 00:45:15,560 Speaker 1: And Sharon, you did a great job. That is one 781 00:45:15,680 --> 00:45:18,680 Speaker 1: beautiful yard you've got going. Yeah, for sure, it is lovely. 782 00:45:18,840 --> 00:45:21,040 Speaker 1: It is nice work. We're glad we could be there 783 00:45:21,040 --> 00:45:24,000 Speaker 1: with you to help you get up that hill. Yeah, 784 00:45:24,040 --> 00:45:26,000 Speaker 1: and down the hill and then back up the hill 785 00:45:26,080 --> 00:45:28,279 Speaker 1: and back down the hill. That's right, and then back 786 00:45:28,360 --> 00:45:30,839 Speaker 1: up again. Uh.
If you want to get in touch 787 00:44:30,840 --> 00:44:32,560 Speaker 1: with us to let us know how we've helped you out, 788 00:44:32,600 --> 00:44:35,160 Speaker 1: we love hearing that kind of stuff. If you're Phillip Lombardo, 789 00:44:35,280 --> 00:44:39,000 Speaker 1: we expect to hear from your lawyer. Um. And in 790 00:44:39,080 --> 00:44:40,880 Speaker 1: the meantime you can hang out with us at our 791 00:44:40,920 --> 00:44:42,520 Speaker 1: home on the web, Stuff You Should Know dot com, 792 00:44:42,560 --> 00:44:44,920 Speaker 1: where you can find all of our social links, and 793 00:44:45,000 --> 00:44:47,280 Speaker 1: you can also send us an email to Stuff Podcast 794 00:44:47,360 --> 00:44:53,680 Speaker 1: at how stuff works dot com. For more on this 795 00:44:53,880 --> 00:44:56,359 Speaker 1: and thousands of other topics, visit how stuff works 796 00:44:56,360 --> 00:46:03,920 Speaker 1: dot com.