1 00:00:00,720 --> 00:00:03,040 Speaker 1: Hi, everybody, Chuck here. Hope you're having a great day. 2 00:00:03,360 --> 00:00:05,440 Speaker 1: Hope you're having a great weekend. I'm thinking about you, 3 00:00:05,800 --> 00:00:08,959 Speaker 1: each one of you individually. I'm thinking about you. I 4 00:00:09,000 --> 00:00:12,039 Speaker 1: know where you live. I'm standing right behind you. Actually 5 00:00:12,080 --> 00:00:14,320 Speaker 1: I'm just kidding, but I hope you're doing well. This 6 00:00:14,360 --> 00:00:18,000 Speaker 1: one goes back to July fifth, twenty eighteen. It's a 7 00:00:18,000 --> 00:00:20,400 Speaker 1: good one. I think I picked this one because I 8 00:00:20,400 --> 00:00:26,160 Speaker 1: had just recently seen the movie on the Stanford Prison Experiment. 9 00:00:26,520 --> 00:00:30,319 Speaker 1: The movie is okay, it's not great, it's not bad, 10 00:00:30,560 --> 00:00:34,559 Speaker 1: it's fine. The podcast episode was pretty good from what 11 00:00:34,600 --> 00:00:36,760 Speaker 1: I remember, though. But you should do both. If you've 12 00:00:36,760 --> 00:00:38,880 Speaker 1: seen the movie, listen to the show. If you listen 13 00:00:38,920 --> 00:00:40,720 Speaker 1: to the show, then give the movie a shot. It's 14 00:00:40,760 --> 00:00:43,720 Speaker 1: got Billy Crudup. He's always awesome. And it's called How 15 00:00:43,760 --> 00:00:48,080 Speaker 1: the Stanford Prison Experiment Worked. Where a friend becomes foe, 16 00:00:48,320 --> 00:00:52,040 Speaker 1: foe becomes friend. What will happen in the end? Listen 17 00:00:52,080 --> 00:00:58,920 Speaker 1: in to find out. Welcome to Stuff You Should Know, 18 00:00:59,240 --> 00:01:00,920 Speaker 1: a production of iHeartRadio. 19 00:01:07,600 --> 00:01:10,400 Speaker 2: Hey, and welcome to the podcast. I'm Josh Clark, and 20 00:01:10,520 --> 00:01:12,400 Speaker 2: there's Charles W. Chuck Bryant. 21 00:01:12,240 --> 00:01:13,160 Speaker 1: Jerry's over there.
22 00:01:13,240 --> 00:01:16,280 Speaker 2: So why don't you pull up a chair, kick back 23 00:01:16,319 --> 00:01:20,320 Speaker 2: and tell us about your problems. Because this is psychology stuff. 24 00:01:21,880 --> 00:01:27,760 Speaker 1: We should just call this episode the Stanford Prison Experiment, 25 00:01:27,959 --> 00:01:32,720 Speaker 1: aka perhaps the hackiest experiment of all time. And it's 26 00:01:32,760 --> 00:01:34,720 Speaker 1: really not an experiment anyway. 27 00:01:34,480 --> 00:01:39,000 Speaker 2: No, but it's the most famous psychology experiment ever. 28 00:01:39,400 --> 00:01:40,920 Speaker 1: Yeah. I got kind of ticked off while I was 29 00:01:41,240 --> 00:01:44,040 Speaker 1: researching this. Yeah you should, man, because I used to 30 00:01:44,080 --> 00:01:46,360 Speaker 1: think it was cool, like, oh man, what a cool experiment. 31 00:01:46,560 --> 00:01:48,880 Speaker 2: Yeah, everybody's evil to the core. 32 00:01:49,000 --> 00:01:51,160 Speaker 1: Yeah. Then I researched it and I was like, this 33 00:01:51,280 --> 00:01:53,560 Speaker 1: is a bunch of bs, all of it. This is 34 00:01:53,600 --> 00:01:56,520 Speaker 1: one of the worst executed experiments I've ever heard of. 35 00:01:56,760 --> 00:01:59,480 Speaker 2: That is so funny, because while I was researching this, 36 00:01:59,560 --> 00:02:01,360 Speaker 2: I was like, I'm gonna have to keep it together, 37 00:02:01,480 --> 00:02:03,639 Speaker 2: maybe at the end I can really go off. 38 00:02:03,760 --> 00:02:04,880 Speaker 1: Let's go off at the beginning. 39 00:02:04,960 --> 00:02:06,200 Speaker 2: That's great, man. Yeah. 40 00:02:06,200 --> 00:02:07,880 Speaker 1: I watched the movie today. 41 00:02:07,600 --> 00:02:09,119 Speaker 2: Too, the twenty fifteen one. 42 00:02:09,320 --> 00:02:10,280 Speaker 1: Yeah, how was it? 43 00:02:10,360 --> 00:02:12,840 Speaker 2: How was Billy Crudup?
Because I loved him in 44 00:02:13,200 --> 00:02:14,519 Speaker 2: Almost Famous. Uh. 45 00:02:14,639 --> 00:02:18,320 Speaker 1: Well, I'm a fan, he was good, but like, 46 00:02:18,960 --> 00:02:22,520 Speaker 1: I don't know, the movie, eh, was pretty sensationalized. As 47 00:02:22,520 --> 00:02:25,600 Speaker 1: far as the violence, like, they showed a lot of 48 00:02:25,639 --> 00:02:30,280 Speaker 1: straight up physical violence in the movie which supposedly didn't occur, right, 49 00:02:32,080 --> 00:02:35,519 Speaker 1: like beating them with billy clubs and hog tying them 50 00:02:35,560 --> 00:02:37,400 Speaker 1: and like real violence. 51 00:02:38,720 --> 00:02:40,639 Speaker 2: Actually, these days I should say Atlanta. 52 00:02:40,960 --> 00:02:44,440 Speaker 1: Yeah, Y'allywood is what they call it. 53 00:02:44,520 --> 00:02:47,959 Speaker 2: Oh, there you go. Perfect. That's perfect. That sounds 54 00:02:47,960 --> 00:02:49,400 Speaker 2: like a Norman Reedus creation. 55 00:02:49,720 --> 00:02:57,680 Speaker 1: It might have been. And then what was I saying? Oh, 56 00:02:58,000 --> 00:03:01,799 Speaker 1: I don't feel like it came down hard enough on 57 00:03:03,080 --> 00:03:06,600 Speaker 1: this yahoo. What was the guy's name, Zimbardo? Yeah, Zimbardo 58 00:03:06,760 --> 00:03:11,000 Speaker 1: for just doing a very poor 59 00:03:11,040 --> 00:03:13,880 Speaker 1: job at crafting a supposedly scientific experiment. 60 00:03:13,960 --> 00:03:16,200 Speaker 2: No, he was like the driving force behind that movie 61 00:03:16,240 --> 00:03:18,760 Speaker 2: getting made. Apparently he'd been trying to get a movie 62 00:03:18,800 --> 00:03:19,840 Speaker 2: made in America for decades. 63 00:03:20,400 --> 00:03:22,720 Speaker 1: He seems to be a pretty shameless self-promoter. 64 00:03:22,800 --> 00:03:26,000 Speaker 2: Yes, yeah, it's not a good quality in a social psychologist.
65 00:03:26,120 --> 00:03:27,280 Speaker 1: No, so we're going to see. 66 00:03:27,280 --> 00:03:30,200 Speaker 2: I guess we let the cat out of the bag. 67 00:03:30,240 --> 00:03:36,720 Speaker 2: But well, we shall see. The Stanford prison experiment, 68 00:03:37,000 --> 00:03:42,120 Speaker 2: one of the most famous experiments in the annals of psychology, 69 00:03:42,960 --> 00:03:46,920 Speaker 2: it's not an experiment at all. No. Its findings are 70 00:03:47,080 --> 00:03:51,520 Speaker 2: wide open to interpretation, and it was conducted by a 71 00:03:51,640 --> 00:03:53,040 Speaker 2: showman, basically. 72 00:03:53,720 --> 00:03:56,200 Speaker 1: Yeah. I mean, you know, it's a red flag when 73 00:03:56,600 --> 00:03:59,040 Speaker 1: you don't publish your findings in a medical journal. You 74 00:03:59,040 --> 00:04:01,280 Speaker 1: publish them in New York, was it? New York Magazine? 75 00:04:01,280 --> 00:04:02,960 Speaker 2: New York Times Magazine. 76 00:04:03,600 --> 00:04:08,360 Speaker 1: Hodgman's rag. Well, great rag, but that's not the place 77 00:04:08,400 --> 00:04:10,560 Speaker 1: to go publish scientific findings. 78 00:04:10,640 --> 00:04:14,760 Speaker 2: No, peer reviewed journals are. Yeah, and they circumvented that, yeah, 79 00:04:14,800 --> 00:04:17,279 Speaker 2: for very good reasons. All right, so let's talk 80 00:04:17,320 --> 00:04:19,880 Speaker 2: about the outline. So let's go back to the beginning, right. 81 00:04:20,000 --> 00:04:23,080 Speaker 1: Yeah, back to the year of my birth, nineteen seventy 82 00:04:23,120 --> 00:04:30,080 Speaker 1: one, at Stanford University, which is in, what, Palo Alto. 83 00:04:30,279 --> 00:04:33,960 Speaker 2: Yeah, go fighting sequoias. 84 00:04:34,480 --> 00:04:35,040 Speaker 1: What is there? 85 00:04:35,240 --> 00:04:37,600 Speaker 2: They have like a big old sequoia on their logo.
86 00:04:38,160 --> 00:04:40,719 Speaker 2: I think it's like a... and then they have a 87 00:04:40,960 --> 00:04:45,120 Speaker 2: sequoia with its fists up. Or is that? Oh, that's 88 00:04:45,200 --> 00:04:48,760 Speaker 2: Notre Dame I'm thinking of. I do feel like it 89 00:04:48,800 --> 00:04:52,000 Speaker 2: has sometimes. Chuck's looking it up, everybody, so let me stall. 90 00:04:51,760 --> 00:04:53,719 Speaker 1: It is a tree, the Stanford Tree. 91 00:04:54,480 --> 00:04:56,400 Speaker 2: I don't know what the mascot is, but there's definitely 92 00:04:56,480 --> 00:04:57,760 Speaker 2: a tree associated in it. 93 00:04:57,839 --> 00:04:59,279 Speaker 1: No, I looked it up, the Stanford Tree. 94 00:04:59,560 --> 00:05:00,599 Speaker 2: Oh okay, cool. 95 00:05:00,680 --> 00:05:02,520 Speaker 1: And the first question is why is it a tree? 96 00:05:02,720 --> 00:05:04,760 Speaker 2: Uh huh. Well, what's the answer? 97 00:05:04,920 --> 00:05:07,480 Speaker 1: Well, I mean, I'm sure it's just because of where 98 00:05:07,480 --> 00:05:11,120 Speaker 1: it is in California. But that doesn't answer the real question, 99 00:05:11,279 --> 00:05:12,680 Speaker 1: which is why would you have a tree? 100 00:05:12,920 --> 00:05:17,120 Speaker 2: Right, Philip Zimbardo's sitting there like, quit stalling, get to 101 00:05:17,160 --> 00:05:18,320 Speaker 2: the heckling. 102 00:05:18,560 --> 00:05:21,720 Speaker 1: He's still around? Yeah, he is. So all right, we're 103 00:05:21,760 --> 00:05:23,280 Speaker 1: at Stanford. It's nineteen seventy one. 104 00:05:23,360 --> 00:05:25,919 Speaker 2: Yeah, we're actually in the basement of one of the 105 00:05:25,920 --> 00:05:29,000 Speaker 2: buildings at Stanford University, I think like Campbell Hall or 106 00:05:29,040 --> 00:05:32,440 Speaker 2: something like that.
And I think August of nineteen seventy one, 107 00:05:32,800 --> 00:05:38,080 Speaker 2: there were twenty four young men, almost all of them 108 00:05:38,080 --> 00:05:40,960 Speaker 2: white, I think one of them was Asian American, and 109 00:05:41,960 --> 00:05:45,880 Speaker 2: they are doing something pretty bizarre in this basement in 110 00:05:45,920 --> 00:05:49,920 Speaker 2: August of nineteen seventy one. They've been divided into two groups, 111 00:05:50,720 --> 00:05:51,440 Speaker 2: guards and. 112 00:05:51,440 --> 00:05:54,360 Speaker 1: Prisoners, supposedly average kids. 113 00:05:54,200 --> 00:06:00,960 Speaker 2: Right, and they are acting out this basically role playing 114 00:06:01,000 --> 00:06:03,560 Speaker 2: game of guards. 115 00:06:03,279 --> 00:06:05,560 Speaker 1: Versus prisoners for fifteen bucks a day. 116 00:06:05,680 --> 00:06:09,440 Speaker 2: In a simulated prison in the basement of this hall 117 00:06:09,480 --> 00:06:10,720 Speaker 2: at Stanford University. 118 00:06:10,920 --> 00:06:14,560 Speaker 1: Yeah, which would be about ninety three dollars today, funded 119 00:06:14,560 --> 00:06:17,200 Speaker 1: by the US Office of Naval Research. Is that right? 120 00:06:17,560 --> 00:06:19,719 Speaker 2: So it would be ninety three bucks a day, and 121 00:06:19,760 --> 00:06:21,680 Speaker 2: it was originally gonna be two weeks. So I'm sure 122 00:06:21,680 --> 00:06:23,000 Speaker 2: some of these guys were like, heck. 123 00:06:22,880 --> 00:06:25,280 Speaker 1: Yeah, yeah, I mean I kind of forgot what it 124 00:06:25,320 --> 00:06:27,960 Speaker 1: was like to be a college student. That'd be, uh, 125 00:06:28,960 --> 00:06:32,560 Speaker 1: you know, what, between twelve and fourteen hundred bucks starting 126 00:06:32,560 --> 00:06:33,520 Speaker 1: off your summer.
127 00:06:34,160 --> 00:06:37,720 Speaker 2: It'd be about thirteen... thirteen hundred and two dollars, if 128 00:06:37,760 --> 00:06:39,080 Speaker 2: my quick math is correct. 129 00:06:40,040 --> 00:06:42,440 Speaker 1: Good scratch, yeah, for a twenty one year old. 130 00:06:42,520 --> 00:06:44,160 Speaker 2: Yeah, two weeks on summer break. 131 00:06:44,839 --> 00:06:48,000 Speaker 1: That's right. So you were divided into two lots. Like 132 00:06:48,040 --> 00:06:52,120 Speaker 1: you said, they asked people supposedly what you wanted to be, 133 00:06:52,200 --> 00:06:55,039 Speaker 1: unless this was purely a movie creation, and they did 134 00:06:55,040 --> 00:06:59,080 Speaker 1: try and look up and try and find out the differences. Yeah, 135 00:06:59,440 --> 00:07:03,480 Speaker 1: but they supposedly asked them. And most everyone said, or 136 00:07:03,480 --> 00:07:06,880 Speaker 1: in fact everyone said, prisoner. And one of the reactions, 137 00:07:06,880 --> 00:07:10,480 Speaker 1: from the guy who ended up being the bad guard: 138 00:07:11,360 --> 00:07:14,680 Speaker 1: they asked him why, and he's like, because nobody likes guards, right? 139 00:07:14,680 --> 00:07:16,400 Speaker 1: It's like, why would anyone want to be a guard? 140 00:07:16,760 --> 00:07:19,640 Speaker 1: They thought, we'll just be prisoners, because they 141 00:07:19,680 --> 00:07:21,120 Speaker 1: will just lay around and smoke cigarettes. 142 00:07:21,360 --> 00:07:24,520 Speaker 2: Right. So we'll kind of unpack what 143 00:07:24,560 --> 00:07:26,000 Speaker 2: that suggests later on. 144 00:07:26,160 --> 00:07:26,440 Speaker 1: Sure.
145 00:07:26,520 --> 00:07:29,560 Speaker 2: Okay, so you've got these guys and they're down 146 00:07:29,560 --> 00:07:31,800 Speaker 2: here for this experiment. And so, coming at it this 147 00:07:31,800 --> 00:07:36,280 Speaker 2: way, this is the popular interpretation of what happened 148 00:07:36,280 --> 00:07:40,200 Speaker 2: at the Stanford prison experiment. Okay, yes, you've 149 00:07:40,240 --> 00:07:42,600 Speaker 2: got twelve guards and twelve prisoners. 150 00:07:42,960 --> 00:07:45,800 Speaker 1: The prisoners had been arrested, by the way, by the 151 00:07:45,840 --> 00:07:48,440 Speaker 1: real Palo Alto police. Yeah, they weren't told when, but 152 00:07:48,560 --> 00:07:50,840 Speaker 1: like the real cops came by, arrested each one of 153 00:07:50,880 --> 00:07:55,000 Speaker 1: them for, you know, a variety of crimes. 154 00:07:54,680 --> 00:07:57,440 Speaker 2: Booked them at the Palo Alto police station, and then 155 00:07:57,520 --> 00:08:03,360 Speaker 2: transported them to the jail, the fake jail, at Stanford. 156 00:08:03,400 --> 00:08:05,560 Speaker 1: Yeah, they called it the Stanford County Jail. And they 157 00:08:05,560 --> 00:08:08,520 Speaker 1: did a legit job. They put up signs, they had 158 00:08:08,560 --> 00:08:13,400 Speaker 1: these rooms decked out like jail cells, they had a hole. 159 00:08:14,520 --> 00:08:17,360 Speaker 1: They did a really believable job of making this seem 160 00:08:17,520 --> 00:08:19,720 Speaker 1: like a prison environment, at least. Right? 161 00:08:20,080 --> 00:08:24,120 Speaker 2: So you've got these prisoners who've been delivered. You've 162 00:08:24,160 --> 00:08:28,320 Speaker 2: got these guards who are waiting there for them.
And 163 00:08:28,400 --> 00:08:31,600 Speaker 2: as far as Zimbardo's ever said, these guards 164 00:08:31,600 --> 00:08:36,560 Speaker 2: were told, you have to protect the prison and everything 165 00:08:36,600 --> 00:08:38,719 Speaker 2: else is up to you. The only rule is there's 166 00:08:38,800 --> 00:08:41,800 Speaker 2: no physical punishment. We're just here to observe. 167 00:08:42,120 --> 00:08:45,240 Speaker 1: Yeah. Like, here's your uniforms, here's your sunglasses. 168 00:08:45,559 --> 00:08:49,640 Speaker 2: Yeah. And then the prisoners were booked in wearing smocks. 169 00:08:49,320 --> 00:08:53,520 Speaker 1: Yeah, no shoes, no underwear, yeah, naked under the smocks. 170 00:08:53,080 --> 00:08:56,560 Speaker 2: Chained at the ankles. And then they wore like those 171 00:08:56,640 --> 00:08:59,280 Speaker 2: stocking cap do rags. They had a panty on their 172 00:08:59,320 --> 00:09:05,160 Speaker 2: head to simulate having their head shaved. 173 00:09:04,920 --> 00:09:08,480 Speaker 1: Right, and you know, this is the early seventies, 174 00:09:08,480 --> 00:09:11,640 Speaker 1: so most of them had these big afros and long 175 00:09:11,679 --> 00:09:14,479 Speaker 1: hair and stuff under these panties. 176 00:09:14,720 --> 00:09:19,640 Speaker 2: Right. So at first, everything's pretty normal. 177 00:09:19,720 --> 00:09:21,559 Speaker 2: The guards don't quite know what to do. They're a 178 00:09:21,600 --> 00:09:27,600 Speaker 2: little timid. The prisoners apparently relished this immediately and started 179 00:09:28,720 --> 00:09:32,320 Speaker 2: like finding where the guards' boundaries were, and they started 180 00:09:32,360 --> 00:09:35,720 Speaker 2: to band together, and there was actually, I think on 181 00:09:35,960 --> 00:09:41,959 Speaker 2: day two, the turnover from day one to two, there 182 00:09:42,040 --> 00:09:43,199 Speaker 2: was a prisoner riot.
183 00:09:44,320 --> 00:09:46,920 Speaker 1: Yeah. I mean, like you said, they were sort 184 00:09:46,920 --> 00:09:48,880 Speaker 1: of laughing at first. And I think we didn't mention 185 00:09:48,960 --> 00:09:52,400 Speaker 1: too, and this will end up being very, very problematic, 186 00:09:52,720 --> 00:09:54,640 Speaker 1: and the first sign that he didn't do a good job: 187 00:09:54,679 --> 00:09:59,240 Speaker 1: Zimbardo actually acted as the superintendent of the prison, involved 188 00:09:59,280 --> 00:10:02,160 Speaker 1: himself in his own experiment. And 189 00:10:02,280 --> 00:10:05,920 Speaker 1: he had some graduate assistants that were assisting in the program. 190 00:10:06,120 --> 00:10:09,000 Speaker 1: They acted as the parole board, and one of them was 191 00:10:09,040 --> 00:10:09,880 Speaker 1: the warden. 192 00:10:09,800 --> 00:10:13,679 Speaker 2: That was, yeah, an undergrad actually. Were they undergrad assistants? Well, 193 00:10:13,720 --> 00:10:17,040 Speaker 2: the warden, Jaffe, his last name was Jaffe, he was 194 00:10:17,080 --> 00:10:19,400 Speaker 2: an undergrad at the time, and actually he had come 195 00:10:19,480 --> 00:10:21,240 Speaker 2: up with the experiment on his own. 196 00:10:21,440 --> 00:10:22,240 Speaker 1: Oh, he was the guy? 197 00:10:22,320 --> 00:10:24,599 Speaker 2: Uh huh. And then Zimbardo was like, this is a 198 00:10:24,679 --> 00:10:27,839 Speaker 2: really good idea. Let's do this for real. Imagine the press, right? 199 00:10:29,000 --> 00:10:31,600 Speaker 1: So, yeah, like you said, it escalated pretty quickly.
After 200 00:10:31,679 --> 00:10:36,000 Speaker 1: kind of laughing at first, these guards got into their roles, 201 00:10:36,040 --> 00:10:40,760 Speaker 1: to say the least, and really kind of started being 202 00:10:41,080 --> 00:10:45,840 Speaker 1: jerks in quick order. And after the prisoners were like, hey, 203 00:10:45,840 --> 00:10:48,080 Speaker 1: this is kind of funny, like, you're not 204 00:10:48,080 --> 00:10:50,719 Speaker 1: being very cool, yeah, they were, you know, kind 205 00:10:50,760 --> 00:10:53,360 Speaker 1: of smacked down and, you know, made to do things 206 00:10:53,400 --> 00:10:56,960 Speaker 1: like push ups and jumping jacks, and they would withhold 207 00:10:57,000 --> 00:10:59,800 Speaker 1: food, and eventually they would like take their beds away 208 00:10:59,800 --> 00:11:02,000 Speaker 1: from them and stuff. Like, it just got worse and worse. 209 00:11:02,040 --> 00:11:04,480 Speaker 1: And there was, I think, like you said, on day two, 210 00:11:05,240 --> 00:11:10,000 Speaker 1: an uprising. They got together, threw the cots off their 211 00:11:10,000 --> 00:11:13,520 Speaker 1: beds and threw the bed frames against the door and 212 00:11:13,520 --> 00:11:14,320 Speaker 1: wouldn't let them in. 213 00:11:14,600 --> 00:11:19,120 Speaker 2: Right. So there was a prisoner riot. Yeah, that's pretty significant, right? 214 00:11:20,160 --> 00:11:23,680 Speaker 2: And what's equally significant is that the guards by the 215 00:11:23,760 --> 00:11:27,360 Speaker 2: second day started to show signs of like real cruelty 216 00:11:27,600 --> 00:11:31,480 Speaker 2: towards the prisoners. They started treating them very poorly.
They 217 00:11:31,480 --> 00:11:34,720 Speaker 2: started engaging in basically acts of torture, like waking them 218 00:11:34,760 --> 00:11:36,679 Speaker 2: up randomly in the middle of the night, making them 219 00:11:36,679 --> 00:11:39,760 Speaker 2: get up, like you said, push ups, which is interpreted 220 00:11:39,800 --> 00:11:43,839 Speaker 2: as physical punishment, because again, you couldn't hit them with 221 00:11:43,880 --> 00:11:45,840 Speaker 2: the rubber hose, you couldn't hit them with the baton, 222 00:11:46,240 --> 00:11:49,320 Speaker 2: you couldn't punch them. But if you make somebody do 223 00:11:49,360 --> 00:11:52,880 Speaker 2: a bunch of push ups, that's physical punishment too. Yeah, 224 00:11:52,920 --> 00:11:54,959 Speaker 2: and it was within the bounds, apparently. 225 00:11:55,120 --> 00:11:57,400 Speaker 1: Yeah. They were referred to only by their prison numbers. 226 00:11:57,400 --> 00:12:00,200 Speaker 1: They would never say their names. They were made to memorize 227 00:12:00,080 --> 00:12:03,320 Speaker 1: everyone else's prison number, and like they would line them 228 00:12:03,360 --> 00:12:06,160 Speaker 1: up and tell them to repeat their numbers for like 229 00:12:06,240 --> 00:12:08,480 Speaker 1: an hour; if they didn't do it fast enough, or 230 00:12:08,480 --> 00:12:12,240 Speaker 1: then in reverse order, they would get punishment. They would 231 00:12:12,320 --> 00:12:14,960 Speaker 1: do kind of the classic moves of holding one 232 00:12:15,040 --> 00:12:16,880 Speaker 1: responsible for the punishment of others. 233 00:12:17,000 --> 00:12:18,600 Speaker 2: Yeah, that's a big one. Like. 234 00:12:18,559 --> 00:12:20,079 Speaker 1: If you didn't make your bed good enough and no 235 00:12:20,120 --> 00:12:21,640 Speaker 1: one could go to sleep, stuff like that. 236 00:12:21,800 --> 00:12:25,680 Speaker 2: The guards also innovated with carrots here and there too.
237 00:12:25,720 --> 00:12:28,920 Speaker 2: They actually made one cell like a good cell. Like 238 00:12:29,040 --> 00:12:31,280 Speaker 2: they put a bed in it, yeah, with like bedding. 239 00:12:32,080 --> 00:12:34,200 Speaker 2: If you were in that cell, you were eligible for 240 00:12:34,280 --> 00:12:37,560 Speaker 2: like good meals, yeah, better than what the other prisoners had. 241 00:12:37,840 --> 00:12:40,200 Speaker 2: And there was room for three inmates in there at 242 00:12:40,240 --> 00:12:44,640 Speaker 2: a time. And so it instilled this sense of competition 243 00:12:45,200 --> 00:12:52,120 Speaker 2: and skullduggery, I guess, backstabbery, among the prisoners to curry 244 00:12:52,120 --> 00:12:55,320 Speaker 2: favor with the guards, like by informing on the other ones. Yeah, 245 00:12:55,360 --> 00:12:57,240 Speaker 2: so that you could get a chance to be in 246 00:12:57,360 --> 00:12:58,200 Speaker 2: like the nice cell. 247 00:12:58,320 --> 00:13:00,480 Speaker 1: Yeah. And I think even before that, when they went 248 00:13:00,520 --> 00:13:02,800 Speaker 1: to stage the uprising, 249 00:13:03,360 --> 00:13:05,720 Speaker 1: I think there were three rooms of three, and 250 00:13:05,760 --> 00:13:08,280 Speaker 1: I think six of them, two of the rooms, participated 251 00:13:08,640 --> 00:13:12,480 Speaker 1: and one of the rooms did not. Because not 252 00:13:12,520 --> 00:13:15,720 Speaker 1: all the guys, you know, not all the prisoners 253 00:13:15,720 --> 00:13:17,560 Speaker 1: like rebelled as much. Some of them just kind of 254 00:13:17,559 --> 00:13:18,240 Speaker 1: went along with it. 255 00:13:18,480 --> 00:13:22,680 Speaker 2: Interestingly, some of the guards did not descend into cruelty, right. 256 00:13:23,000 --> 00:13:25,560 Speaker 2: Some of them actually did, like, favors, went out 257 00:13:25,559 --> 00:13:27,720 Speaker 2: of their way to be nice to the prisoners.
But 258 00:13:28,000 --> 00:13:32,200 Speaker 2: and the Grabster, who wrote this article, points out very significantly, 259 00:13:32,800 --> 00:13:36,720 Speaker 2: they didn't stand up to the cruel guards or officially 260 00:13:36,720 --> 00:13:40,199 Speaker 2: object to their behavior. Right, they went along with it. 261 00:13:40,520 --> 00:13:41,920 Speaker 1: But then they thought they had to. 262 00:13:42,040 --> 00:13:45,040 Speaker 2: In their own right, in their own way, they did 263 00:13:45,080 --> 00:13:47,439 Speaker 2: what they could to retain their humanity. So there are 264 00:13:47,440 --> 00:13:51,480 Speaker 2: two huge points here: there's one among 265 00:13:51,520 --> 00:13:53,960 Speaker 2: the guards and one among the prisoners. And the one 266 00:13:54,000 --> 00:13:56,840 Speaker 2: among the prisoners comes thirty six hours after the beginning 267 00:13:56,840 --> 00:14:01,080 Speaker 2: of the experiment. And this prisoner, his name, 268 00:14:01,160 --> 00:14:05,640 Speaker 2: it would later be revealed, was Douglas Korpi. He had 269 00:14:05,679 --> 00:14:10,199 Speaker 2: an emotional breakdown, a nervous breakdown, thirty six hours after 270 00:14:10,240 --> 00:14:15,280 Speaker 2: this experiment starts. One of the prisoners, he becomes so 271 00:14:15,400 --> 00:14:21,120 Speaker 2: emotionally involved in this simulated prison, at the 272 00:14:21,200 --> 00:14:23,880 Speaker 2: supposed cruelty of the guards, that he had a 273 00:14:23,920 --> 00:14:27,720 Speaker 2: nervous breakdown and had to be 274 00:14:27,800 --> 00:14:31,120 Speaker 2: removed from the experiment. And this is 275 00:14:31,160 --> 00:14:35,920 Speaker 2: Zimbardo's, this is the official line for the Stanford Prison experiment, right? 276 00:14:36,240 --> 00:14:37,560 Speaker 2: It has been for decades. 277 00:14:38,440 --> 00:14:40,520 Speaker 1: Yeah.
He also said that one of them broke out 278 00:14:40,560 --> 00:14:46,840 Speaker 1: in a psychosomatic rash. There was all manner of various 279 00:14:46,880 --> 00:14:49,240 Speaker 1: levels of psychological breakdowns happening. 280 00:14:49,200 --> 00:14:52,960 Speaker 2: On the other side, the big star among the guards was 281 00:14:53,000 --> 00:14:55,480 Speaker 2: a guy named John Wayne, who you referenced earlier. 282 00:14:55,600 --> 00:15:00,360 Speaker 1: Yeah, his name was Dave Eshleman, and he was 283 00:15:00,400 --> 00:15:02,920 Speaker 1: the ringleader. He's the one 284 00:15:02,920 --> 00:15:06,000 Speaker 1: that came out as the most brutal guard of them all, 285 00:15:06,080 --> 00:15:07,760 Speaker 1: and all the other guards kind of fell in line 286 00:15:07,800 --> 00:15:09,800 Speaker 1: behind him and took their cues from him. 287 00:15:10,040 --> 00:15:12,480 Speaker 2: So this whole thing's going on. This is crazy town 288 00:15:12,520 --> 00:15:16,160 Speaker 2: in this place. In six days, six days, this thing 289 00:15:16,200 --> 00:15:17,560 Speaker 2: descends into chaos. 290 00:15:17,640 --> 00:15:19,479 Speaker 1: Supposed to be two weeks. Yes. 291 00:15:19,280 --> 00:15:22,160 Speaker 2: There were rumors that there was going to 292 00:15:22,160 --> 00:15:25,720 Speaker 2: be a breakout, and so they moved the experiment. 293 00:15:25,800 --> 00:15:29,600 Speaker 2: That guy Douglas Korpi, who had a nervous 294 00:15:29,600 --> 00:15:33,200 Speaker 2: breakdown, ended up getting put into the hole, this broom 295 00:15:33,240 --> 00:15:38,200 Speaker 2: closet, for I think overnight, and was finally released because 296 00:15:38,240 --> 00:15:42,360 Speaker 2: the researchers actually stepped in and said, you 297 00:15:42,400 --> 00:15:45,600 Speaker 2: should probably let him out. It was 298 00:15:45,680 --> 00:15:50,520 Speaker 2: just utter chaos.
And then eventually Philip Zimbardo's girlfriend at 299 00:15:50,520 --> 00:15:54,360 Speaker 2: the time, a woman named Christine Maslach. 300 00:15:53,680 --> 00:15:57,080 Speaker 1: Yeah, his wife to be. Oh, she married him? Huh, yeah, 301 00:15:57,120 --> 00:15:57,640 Speaker 1: still married. 302 00:15:58,240 --> 00:16:01,480 Speaker 2: So she came and just dropped in to see how 303 00:16:01,560 --> 00:16:04,960 Speaker 2: things were going and was so outraged at what she 304 00:16:05,120 --> 00:16:08,000 Speaker 2: saw that she was like, you're so far beyond 305 00:16:08,040 --> 00:16:10,000 Speaker 2: the line. You have to stop this now. Like, this 306 00:16:10,080 --> 00:16:12,600 Speaker 2: has descended into chaos. You can't do this. 307 00:16:12,960 --> 00:16:16,880 Speaker 2: These people are treating these prisoners horribly. Like, how 308 00:16:16,880 --> 00:16:21,760 Speaker 2: are you letting this go on? He went, okay, fine, and 309 00:16:21,800 --> 00:16:24,920 Speaker 2: so the next day he canceled the experiment, again after 310 00:16:25,000 --> 00:16:27,120 Speaker 2: six days, and it was scheduled to go on for 311 00:16:27,160 --> 00:16:32,240 Speaker 2: two weeks. And so he comes out and tells the world 312 00:16:32,920 --> 00:16:37,080 Speaker 2: in this New York Times Magazine: guys, 313 00:16:37,200 --> 00:16:39,160 Speaker 2: if I took you, Josh, and I took you, Chuck, 314 00:16:39,480 --> 00:16:43,440 Speaker 2: and put you as guard and prisoner in even a simulated prison, 315 00:16:44,160 --> 00:16:47,400 Speaker 2: and put a smock on Josh and took his underwear 316 00:16:47,440 --> 00:16:51,080 Speaker 2: off and put a stocking on his head, and gave 317 00:16:51,200 --> 00:16:54,880 Speaker 2: Chuck a baton and some glasses, Chuck would beat Josh 318 00:16:54,920 --> 00:16:58,080 Speaker 2: up, and Josh would probably have his spirit broken and 319 00:16:58,120 --> 00:17:02,360 Speaker 2: have a nervous breakdown.
It's in everybody. Evil is in everybody. Yeah, 320 00:17:02,440 --> 00:17:05,560 Speaker 2: crumbling at the first sign of adversity is in everybody. 321 00:17:05,680 --> 00:17:09,679 Speaker 2: We're all just pathetic weaklings. Stanford Prison experiment. And he 322 00:17:09,760 --> 00:17:11,240 Speaker 2: ran off and said, I'm famous. 323 00:17:11,960 --> 00:17:15,320 Speaker 1: All right, that's a great setup. So we'll take a 324 00:17:15,359 --> 00:17:17,680 Speaker 1: break here and come back and talk a little bit 325 00:17:17,720 --> 00:17:20,159 Speaker 1: more about the experiment and the realities of 326 00:17:20,200 --> 00:17:43,960 Speaker 1: it right after this. All right. So you've got John 327 00:17:44,000 --> 00:17:46,560 Speaker 1: Wayne in there. I don't think we mentioned that he 328 00:17:46,600 --> 00:17:50,240 Speaker 1: took on the persona of the prison boss in Cool 329 00:17:50,280 --> 00:17:50,760 Speaker 1: Hand Luke. 330 00:17:50,960 --> 00:17:51,200 Speaker 2: Yeah. 331 00:17:51,480 --> 00:17:54,480 Speaker 1: He did a fake Southern accent and everything and dove 332 00:17:54,560 --> 00:17:57,919 Speaker 1: right into this role. If you talk to Dave Eshleman today, 333 00:17:58,680 --> 00:18:01,440 Speaker 1: he will say, he's very much on record as saying, 334 00:18:02,320 --> 00:18:05,960 Speaker 1: I'm not some jerk, and I didn't get off on 335 00:18:06,080 --> 00:18:09,480 Speaker 1: being sadistic. He said, I wanted to do what they 336 00:18:09,640 --> 00:18:11,760 Speaker 1: paid me fifteen dollars a day to do, which was 337 00:18:11,800 --> 00:18:15,879 Speaker 1: to be a prison guard and to treat these guys poorly, right? 338 00:18:15,960 --> 00:18:17,520 Speaker 1: And so I created, you know, he said, I did 339 00:18:17,520 --> 00:18:20,000 Speaker 1: some drama in high school and I literally acted this 340 00:18:20,119 --> 00:18:21,920 Speaker 1: part as well as I could.
341 00:18:22,240 --> 00:18:26,400 Speaker 2: That was I felt was expected and wanted from. 342 00:18:26,240 --> 00:18:28,800 Speaker 1: Me, right, And I put on this fake Southern accent. 343 00:18:29,320 --> 00:18:32,040 Speaker 1: And if you like ask friends and family today, they 344 00:18:32,040 --> 00:18:34,120 Speaker 1: would laugh at this because I'm really not this guy 345 00:18:34,160 --> 00:18:36,640 Speaker 1: at all, right, because he really comes off as as 346 00:18:36,760 --> 00:18:38,600 Speaker 1: a bit of a villain in this movie. For sure. 347 00:18:38,760 --> 00:18:44,160 Speaker 2: Well, he perpetrated real cruelty on other people, and we'll 348 00:18:44,160 --> 00:18:45,399 Speaker 2: get to that later. He said he. 349 00:18:45,400 --> 00:18:46,680 Speaker 1: Feels bad about it too, and. 350 00:18:46,640 --> 00:18:50,160 Speaker 2: He should, yeah, because the other people actually did suffer 351 00:18:51,040 --> 00:18:54,720 Speaker 2: under this guy's leadership as the ring leader of the 352 00:18:54,800 --> 00:18:57,760 Speaker 2: mean guards, right, like they wore pink on Wednesday. It 353 00:18:57,800 --> 00:19:02,440 Speaker 2: was terrible everywhere, right, So he really should feel bad 354 00:19:02,440 --> 00:19:04,639 Speaker 2: and apparently he does. I saw that all over the 355 00:19:04,680 --> 00:19:06,960 Speaker 2: place too, that he feels bad for it. But the 356 00:19:07,119 --> 00:19:11,399 Speaker 2: point is is that he has said, like this didn't 357 00:19:11,440 --> 00:19:17,040 Speaker 2: happen organically, like I I wasn't. I felt encouraged to 358 00:19:17,560 --> 00:19:18,320 Speaker 2: play this role. 359 00:19:18,960 --> 00:19:20,679 Speaker 1: Right. That's a big deal. 360 00:19:20,520 --> 00:19:23,760 Speaker 2: Because the findings of the Stanford Prison experiment say if 361 00:19:23,800 --> 00:19:25,919 Speaker 2: you take some people and say you're a guard. 
362 00:19:25,840 --> 00:19:28,000 Speaker 1: Give them power, and they will turn evil. 363 00:19:28,160 --> 00:19:31,800 Speaker 2: They will turn evil within a day. A day they 364 00:19:31,840 --> 00:19:34,719 Speaker 2: said about this guy. And this guy's like, no, I was, 365 00:19:34,760 --> 00:19:37,000 Speaker 2: just like you said, doing my job what they were 366 00:19:37,000 --> 00:19:39,400 Speaker 2: paying me fifteen bucks a day for. Yeah, let's put 367 00:19:39,440 --> 00:19:42,879 Speaker 2: that one to the side. Let's go visit with Douglas Korpi, 368 00:19:42,960 --> 00:19:47,000 Speaker 2: who was the prisoner who in thirty six short hours 369 00:19:47,160 --> 00:19:52,160 Speaker 2: of this simulated prison experiment lost his marbles and had 370 00:19:52,200 --> 00:19:55,760 Speaker 2: a nervous breakdown and had to go home. Right, one 371 00:19:55,760 --> 00:19:58,359 Speaker 2: of the other two pillars of the findings that people 372 00:19:58,440 --> 00:20:01,840 Speaker 2: are either evil or crumble in the face of adversity 373 00:20:02,200 --> 00:20:04,960 Speaker 2: from the Stanford prison experiment, and again this is how 374 00:20:04,960 --> 00:20:06,680 Speaker 2: this thing's been taught for like fifty years. 375 00:20:06,720 --> 00:20:10,679 Speaker 1: Okay, Yeah, So Korpi comes out and says, I was 376 00:20:10,720 --> 00:20:13,560 Speaker 1: faking that, and I put on a big act so 377 00:20:13,640 --> 00:20:15,919 Speaker 1: I could get out of there because it sucked and 378 00:20:16,119 --> 00:20:19,000 Speaker 1: I didn't want to be there anymore. So I faked 379 00:20:19,080 --> 00:20:21,560 Speaker 1: like I was.
And he like one of his quotes 380 00:20:21,720 --> 00:20:24,240 Speaker 1: was, I don't have it here, but he basically said, 381 00:20:24,280 --> 00:20:27,880 Speaker 1: like any trained clinician would have been able to see 382 00:20:27,920 --> 00:20:30,480 Speaker 1: right through this, Like when I hear the tapes years later, 383 00:20:31,240 --> 00:20:34,160 Speaker 1: it's like, I'm not an actor. I wasn't like apparently 384 00:20:34,200 --> 00:20:35,800 Speaker 1: the John Wayne guy at least had been in like 385 00:20:35,880 --> 00:20:36,600 Speaker 1: high school. 386 00:20:36,359 --> 00:20:38,560 Speaker 2: Plays in college too, I think, yeah. 387 00:20:38,400 --> 00:20:40,080 Speaker 1: And he was like, I was not an actor. And 388 00:20:40,119 --> 00:20:42,199 Speaker 1: it was so clear to me looking back at these 389 00:20:42,280 --> 00:20:44,320 Speaker 1: tapes that I was faking. 390 00:20:44,000 --> 00:20:45,439 Speaker 2: It, faking a nervous breakdown. 391 00:20:45,520 --> 00:20:47,880 Speaker 1: Yeah, faking a nervous breakdown to get out of there. Right. 392 00:20:48,119 --> 00:20:51,240 Speaker 2: So the reason why he said later that he did 393 00:20:51,320 --> 00:20:53,840 Speaker 2: fake this nervous breakdown is because he took the job 394 00:20:54,200 --> 00:20:56,400 Speaker 2: because he thought he'd just be laying around, like you said, 395 00:20:56,400 --> 00:20:59,400 Speaker 2: smoking cigarettes, being a prisoner. Yeah, and he would get 396 00:20:59,400 --> 00:21:02,439 Speaker 2: to study for the GRE. He was about to enter grad school. 397 00:21:02,400 --> 00:21:03,040 Speaker 1: I see that. 398 00:21:03,119 --> 00:21:05,199 Speaker 2: Well, they said, no, you can't have your books. 399 00:21:05,280 --> 00:21:06,440 Speaker 1: Now. They didn't give him anything, and. 400 00:21:06,400 --> 00:21:08,359 Speaker 2: this guy was like, whoa, whoa, whoa, Wait a minute, 401 00:21:08,359 --> 00:21:10,239 Speaker 2: this is day one.
He's like whoa, whoa, whoa, Like, 402 00:21:10,320 --> 00:21:13,399 Speaker 2: I need those books. I'm taking the GRE basically leaving 403 00:21:13,440 --> 00:21:16,240 Speaker 2: here after two weeks and going to take the test, 404 00:21:16,760 --> 00:21:19,159 Speaker 2: like I've got to spend these two weeks studying. They're like, 405 00:21:19,160 --> 00:21:21,760 Speaker 2: you can't have your books. So he quickly saw that 406 00:21:21,840 --> 00:21:26,199 Speaker 2: the only way out was to fake this nervous breakdown. 407 00:21:25,760 --> 00:21:27,800 Speaker 1: And Billy Crudup went in and said, why is everyone 408 00:21:27,840 --> 00:21:30,560 Speaker 1: saying whoa whoa whoa? Only I can say whoa whoa 409 00:21:30,600 --> 00:21:36,440 Speaker 1: whoa whoa whoa whoa. Yeah, So we've kind of pooh 410 00:21:36,480 --> 00:21:38,960 Speaker 1: poohed the two major findings from this study already. 411 00:21:39,040 --> 00:21:42,080 Speaker 2: So that's a huge deal, right, because again, the 412 00:21:42,160 --> 00:21:44,680 Speaker 2: idea is that if you put people, any random people. 413 00:21:44,880 --> 00:21:47,920 Speaker 2: Remember these are just average, like middle middle class white. 414 00:21:47,800 --> 00:21:50,840 Speaker 1: Kids, which is another problem, right. 415 00:21:50,680 --> 00:21:52,639 Speaker 2: If you put any, well, you know 416 00:21:52,760 --> 00:21:55,880 Speaker 2: nineteen seventy one, that means everybody, Right, that's the whole world. Right, 417 00:21:56,119 --> 00:21:58,280 Speaker 2: If you put anybody in the world in this situation, 418 00:21:58,359 --> 00:22:01,080 Speaker 2: they're going to either turn evil or lose their marbles. 419 00:22:01,640 --> 00:22:05,560 Speaker 2: So those are the two findings. That's what everybody took 420 00:22:05,600 --> 00:22:08,119 Speaker 2: it as at first. It later came out, no, this 421 00:22:08,160 --> 00:22:11,040 Speaker 2: guy was acting, this guy was faking.
So what else 422 00:22:11,080 --> 00:22:14,680 Speaker 2: do we have then? Well, we have this idea that 423 00:22:15,680 --> 00:22:20,960 Speaker 2: Zimbardo insinuated himself as part of the experiment and that 424 00:22:21,200 --> 00:22:24,720 Speaker 2: actually created the findings from the Stanford prison experiment. 425 00:22:25,960 --> 00:22:27,840 Speaker 1: So should we put a pin in that? Should we 426 00:22:27,960 --> 00:22:28,399 Speaker 1: talk about that? 427 00:22:28,400 --> 00:22:28,560 Speaker 2: Now? 428 00:22:28,600 --> 00:22:28,760 Speaker 1: No? 429 00:22:28,760 --> 00:22:30,080 Speaker 2: No, I want to go. I want to go where 430 00:22:30,119 --> 00:22:30,560 Speaker 2: you want to go. 431 00:22:30,640 --> 00:22:32,199 Speaker 1: All right, let's put a pin in that then and 432 00:22:32,280 --> 00:22:34,159 Speaker 1: talk a little bit more about what went on 433 00:22:34,240 --> 00:22:40,560 Speaker 1: that week. They had everything from visitation like you could 434 00:22:40,560 --> 00:22:43,040 Speaker 1: write a letter to your family or girlfriend or whoever 435 00:22:43,040 --> 00:22:46,160 Speaker 1: you want to come visit you to ask for visitation rights. 436 00:22:46,200 --> 00:22:48,640 Speaker 1: And the family came in and they did. They came 437 00:22:48,680 --> 00:22:51,040 Speaker 1: in and visited for an hour, and they were in 438 00:22:51,040 --> 00:22:54,399 Speaker 1: some cases parents were like, I don't know about this. 439 00:22:54,400 --> 00:22:57,680 Speaker 1: This is like this seems like a really weird thing. 440 00:22:58,400 --> 00:23:00,720 Speaker 1: And Zimbardo would be like, oh no, it's totally fine, 441 00:23:00,760 --> 00:23:03,159 Speaker 1: Like you know, they're psychologists. Yeah, like they want to 442 00:23:03,200 --> 00:23:05,600 Speaker 1: be here, like ask them and the kids. You know.
443 00:23:05,640 --> 00:23:12,840 Speaker 1: They did say that they wanted to stay, okay, which 444 00:23:12,880 --> 00:23:14,200 Speaker 1: is which is important. 445 00:23:14,440 --> 00:23:17,119 Speaker 2: Okay, So what else is important is. 446 00:23:17,240 --> 00:23:19,000 Speaker 1: like, no one in the visiting hour. I don't think 447 00:23:19,000 --> 00:23:20,280 Speaker 1: were like, get me out of here. 448 00:23:20,600 --> 00:23:20,840 Speaker 2: Okay. 449 00:23:20,880 --> 00:23:22,399 Speaker 1: They are like, no, this is all part of the 450 00:23:22,440 --> 00:23:26,400 Speaker 1: part of the act essentially, all right. They had parole 451 00:23:27,280 --> 00:23:30,680 Speaker 1: hearings inside the course of a week. Somehow they said 452 00:23:30,760 --> 00:23:33,280 Speaker 1: that they could be released, if they would 453 00:23:33,280 --> 00:23:36,600 Speaker 1: forfeit the money. And this is after I don't know 454 00:23:36,600 --> 00:23:40,560 Speaker 1: how many of the six days, but they could not 455 00:23:40,680 --> 00:23:43,560 Speaker 1: both get paid and be paroled if they went in 456 00:23:43,560 --> 00:23:45,080 Speaker 1: front of the parole board. They went in front of 457 00:23:45,080 --> 00:23:47,800 Speaker 1: the parole board, some of them did, and most of 458 00:23:47,840 --> 00:23:49,680 Speaker 1: the prisoners said that they would give up their money 459 00:23:49,680 --> 00:23:53,280 Speaker 1: in fact, and the parole members, like like I said, 460 00:23:53,280 --> 00:23:57,000 Speaker 1: they were the graduate assistants. Even had one former prisoner, 461 00:23:57,720 --> 00:24:02,120 Speaker 1: this guy that like was a fifteen year San Quentin, yeah, inmate, 462 00:24:02,160 --> 00:24:04,760 Speaker 1: fifteen or seventeen year inmate on the board that I 463 00:24:04,800 --> 00:24:08,159 Speaker 1: guess Zimbardo. I want to call him Zamboni.
464 00:24:08,400 --> 00:24:11,280 Speaker 2: So he actually was a friend of Jaffe's, the guy 465 00:24:11,280 --> 00:24:14,680 Speaker 2: who originally actually that's where he came in as an undergrad. 466 00:24:14,760 --> 00:24:16,159 Speaker 2: So he brought him in on it. 467 00:24:16,080 --> 00:24:18,159 Speaker 1: right, So he was on the parole board, and he 468 00:24:18,320 --> 00:24:20,840 Speaker 1: was kind of one of the ones, at least in 469 00:24:20,880 --> 00:24:23,120 Speaker 1: the film version that was kind of saying like, no, 470 00:24:23,280 --> 00:24:25,320 Speaker 1: this is like how it is, like, you should keep 471 00:24:25,320 --> 00:24:27,680 Speaker 1: it going, right, But I don't know how much of 472 00:24:27,720 --> 00:24:28,600 Speaker 1: that was dramatized. 473 00:24:28,840 --> 00:24:31,720 Speaker 2: I don't either. That's one of the problems 474 00:24:31,760 --> 00:24:34,480 Speaker 2: with this is, you know, so much of the documentation 475 00:24:34,600 --> 00:24:36,919 Speaker 2: has not been released over the years, and when it 476 00:24:36,920 --> 00:24:40,360 Speaker 2: does get released, it contradicts the official line, and it's 477 00:24:40,560 --> 00:24:43,719 Speaker 2: very tough to separate truth from fiction, especially when you 478 00:24:43,800 --> 00:24:47,040 Speaker 2: introduce a Hollywood movie into the whole thing, just to 479 00:24:47,080 --> 00:24:49,880 Speaker 2: just to drive those nails in the coffin too. Yeah, 480 00:24:49,920 --> 00:24:50,800 Speaker 2: and so reality. 481 00:24:50,840 --> 00:24:53,000 Speaker 1: In fact, there's been, in the years 482 00:24:53,119 --> 00:24:55,760 Speaker 1: since, a lot of complaints that a lot of these 483 00:24:56,119 --> 00:24:58,000 Speaker 1: you know, kids were screaming, I want to go home, 484 00:24:58,040 --> 00:25:01,680 Speaker 1: I want to go home.
And for his part, Zimbardo said, 485 00:25:02,400 --> 00:25:05,760 Speaker 1: in the contract it says I want to exit the experiment, 486 00:25:06,119 --> 00:25:09,040 Speaker 1: as the official line to say, and they could have 487 00:25:09,080 --> 00:25:11,199 Speaker 1: gone home. And he was like, but you hear no 488 00:25:11,240 --> 00:25:13,119 Speaker 1: one ever said I want to exit the experiment. They 489 00:25:13,119 --> 00:25:15,640 Speaker 1: would say I want my mommy, or I'm going crazy, 490 00:25:16,240 --> 00:25:19,120 Speaker 1: or my god, please stop this, please stop this, right, 491 00:25:19,440 --> 00:25:22,840 Speaker 1: but they never said those exact words, this safe phrase, say, yeah, 492 00:25:22,880 --> 00:25:25,280 Speaker 1: the safe phrase. But it turns out that's bunk too. 493 00:25:25,400 --> 00:25:27,720 Speaker 2: Right, Yeah, it turns out that if you look at 494 00:25:27,760 --> 00:25:30,920 Speaker 2: the contract that they had that he's referencing that says 495 00:25:30,960 --> 00:25:33,919 Speaker 2: the rules and everything in the agreement, there's no safe 496 00:25:33,920 --> 00:25:36,920 Speaker 2: word to be mentioned. Certainly doesn't say if you say 497 00:25:36,960 --> 00:25:39,760 Speaker 2: I want to quit the experiment, you get released from 498 00:25:39,800 --> 00:25:40,480 Speaker 2: the experiment. 499 00:25:40,600 --> 00:25:42,080 Speaker 1: So he's just flat out lying about that. 500 00:25:42,160 --> 00:25:44,640 Speaker 2: Then that's from what I understand. Yes, and what. 501 00:25:44,720 --> 00:25:45,960 Speaker 1: article was this that you sent? 502 00:25:46,400 --> 00:25:51,359 Speaker 2: There's a really good takedown on Medium called The 503 00:25:52,240 --> 00:25:53,400 Speaker 2: Lifespan of a Lie. 504 00:25:53,520 --> 00:25:55,160 Speaker 1: Yeah, it's a good one, and.
505 00:25:55,119 --> 00:25:58,600 Speaker 2: It's based on, that title's based on, I think, 506 00:25:58,880 --> 00:26:04,040 Speaker 2: a documentary or book by a, yeah, 507 00:26:04,119 --> 00:26:09,359 Speaker 2: French filmmaker who titled his version The Birth of 508 00:26:09,400 --> 00:26:13,479 Speaker 2: a Lie. And it's basically about how the Stanford prison 509 00:26:13,520 --> 00:26:16,560 Speaker 2: experiment was just basically it was bunk from the get go, 510 00:26:16,640 --> 00:26:18,680 Speaker 2: which we'll kind of pick that apart in a little bit, 511 00:26:19,160 --> 00:26:23,920 Speaker 2: and that just fascinatingly has been perpetuated over and over again for basically 512 00:26:23,960 --> 00:26:28,560 Speaker 2: fifty years. It just entered the cultural zeitgeist and just 513 00:26:28,600 --> 00:26:30,240 Speaker 2: stayed like an infection. 514 00:26:31,400 --> 00:26:33,760 Speaker 1: All right. Some other things that happened to make it realistic. 515 00:26:33,800 --> 00:26:37,520 Speaker 1: They brought in a lawyer when parents asked for one, 516 00:26:38,119 --> 00:26:40,320 Speaker 1: and played along like it was real. They brought in 517 00:26:40,359 --> 00:26:44,560 Speaker 1: a chaplain who came in to speak to prisoners and 518 00:26:44,680 --> 00:26:50,840 Speaker 1: he played along with it too. They basically did everything 519 00:26:50,880 --> 00:26:52,879 Speaker 1: that you would think would happen in a real prison 520 00:26:54,359 --> 00:26:56,119 Speaker 1: on a slightly scaled down level. 521 00:26:56,240 --> 00:26:59,840 Speaker 2: Right, But the upshot of all of this is Zimbardo saying, like, 522 00:27:00,480 --> 00:27:03,280 Speaker 2: do you see what's going on here? Everybody?
Yeah, Like, 523 00:27:03,680 --> 00:27:07,080 Speaker 2: I just put some guys in, like nine guys in 524 00:27:07,160 --> 00:27:09,400 Speaker 2: at a time, or twelve guys as guards, twelve guys 525 00:27:09,400 --> 00:27:14,159 Speaker 2: as prisoners, and their parents came for visiting hours, a 526 00:27:14,280 --> 00:27:18,399 Speaker 2: lawyer came. That's how real the simulated prison became in 527 00:27:18,440 --> 00:27:22,359 Speaker 2: people's minds. Just imagine what a real prison's like. Right, So, 528 00:27:23,359 --> 00:27:24,880 Speaker 2: and he was saying they could have left at any 529 00:27:24,920 --> 00:27:26,640 Speaker 2: time if they just said the safe word, and no one 530 00:27:26,640 --> 00:27:30,640 Speaker 2: ever said the safe word. There is some evidence that these 531 00:27:30,680 --> 00:27:35,280 Speaker 2: people were basically kept there against their will, especially after 532 00:27:35,840 --> 00:27:41,600 Speaker 2: Douglas Korpi basically faked his emotional breakdown and then was 533 00:27:41,640 --> 00:27:47,520 Speaker 2: thrown into a broom closet in retaliation for it. He 534 00:27:47,880 --> 00:27:50,760 Speaker 2: very, very clearly should have been let out 535 00:27:50,880 --> 00:27:53,720 Speaker 2: or allowed to leave, and to even be led to 536 00:27:53,760 --> 00:27:57,119 Speaker 2: think that you couldn't leave, which is apparently the idea 537 00:27:57,160 --> 00:28:01,200 Speaker 2: that spread throughout the prisoners. That would be like keeping 538 00:28:01,240 --> 00:28:02,359 Speaker 2: someone against their will. 539 00:28:02,600 --> 00:28:05,000 Speaker 1: Yeah, and he did leave, but agreed 540 00:28:05,040 --> 00:28:10,120 Speaker 1: to come back, supposedly to like play a different role 541 00:28:10,240 --> 00:28:12,440 Speaker 1: as a prisoner who like maybe escaped and came back, 542 00:28:12,480 --> 00:28:15,400 Speaker 1: I think, but didn't come back, right.
543 00:28:15,520 --> 00:28:19,560 Speaker 2: And I think five people were released early before the 544 00:28:20,000 --> 00:28:23,480 Speaker 2: whole experiment was called off. All prisoners. No guards left 545 00:28:23,520 --> 00:28:25,199 Speaker 2: the experiment, which is telling. 546 00:28:25,320 --> 00:28:27,240 Speaker 1: Yes, well, and they were working in shifts though, which 547 00:28:27,280 --> 00:28:27,800 Speaker 1: is important. 548 00:28:27,840 --> 00:28:30,720 Speaker 2: Okay, that is a big one too. But if you 549 00:28:30,800 --> 00:28:33,359 Speaker 2: consider that no one asked to be a guard, they 550 00:28:33,400 --> 00:28:35,600 Speaker 2: all asked to be prisoners, but then none of the 551 00:28:35,640 --> 00:28:39,360 Speaker 2: guards left the experiment, right to me, that's interesting on 552 00:28:39,480 --> 00:28:44,320 Speaker 2: its face, right, There's something to that, But the whole 553 00:28:44,320 --> 00:28:50,200 Speaker 2: thing just kind of fell apart after Zimbardo's girlfriend at 554 00:28:50,240 --> 00:28:55,400 Speaker 2: the time came. The idea that up to this point, 555 00:28:55,600 --> 00:28:59,120 Speaker 2: these people had engaged in this fantasy and thought that 556 00:28:59,160 --> 00:29:02,400 Speaker 2: they couldn't leave, and they really could. That's controversial in 557 00:29:02,440 --> 00:29:05,920 Speaker 2: and of itself, because again there's evidence that they were 558 00:29:06,000 --> 00:29:09,800 Speaker 2: led to believe they couldn't leave, and that's different. That 559 00:29:09,960 --> 00:29:12,200 Speaker 2: changes things entirely. 560 00:29:12,480 --> 00:29:13,720 Speaker 1: Yeah, so you. 561 00:29:13,640 --> 00:29:15,280 Speaker 2: want to take another break and then pick this apart 562 00:29:15,320 --> 00:29:15,680 Speaker 2: some more. 563 00:29:16,880 --> 00:29:40,400 Speaker 1: Yeah, let's do it. Kind of fun, all right, the 564 00:29:40,480 --> 00:29:41,840 Speaker 1: final takedown.
565 00:29:41,560 --> 00:29:44,760 Speaker 2: I'm waiting for. I'm waiting for Philip Zimbardo to release 566 00:29:44,760 --> 00:29:47,120 Speaker 2: a book about like our Jackhammer episode. 567 00:29:48,160 --> 00:29:51,680 Speaker 1: That's fine, I would read it all right, so where 568 00:29:51,680 --> 00:29:55,680 Speaker 1: are we here. Basically, we're at the point where he 569 00:29:56,280 --> 00:29:59,440 Speaker 1: has ended the experiment and now we're dealing with the 570 00:29:59,560 --> 00:30:02,920 Speaker 1: fallout since nineteen seventy one and how this should be viewed. 571 00:30:03,160 --> 00:30:04,960 Speaker 2: One of the big things that came out of that 572 00:30:05,160 --> 00:30:10,400 Speaker 2: French book, The Birth of a Lie, is the filmmaker 573 00:30:10,960 --> 00:30:16,120 Speaker 2: unearthed a recording that was I don't know where he 574 00:30:16,240 --> 00:30:19,200 Speaker 2: found it, but he found it and released the 575 00:30:19,360 --> 00:30:24,800 Speaker 2: transcript of it that clearly has if not Zimbardo, at 576 00:30:24,880 --> 00:30:30,360 Speaker 2: least Jaffe, definitely Jaffe coaching the guards. Yeah, to be 577 00:30:30,480 --> 00:30:34,000 Speaker 2: more brutal, Right, be a tough guard. Just think of 578 00:30:34,280 --> 00:30:35,960 Speaker 2: how the pigs do it, and do it like that, 579 00:30:36,280 --> 00:30:38,520 Speaker 2: I think is what the quote was, right, Yeah. 580 00:30:38,440 --> 00:30:40,200 Speaker 1: When the whole idea of this thing is to try 581 00:30:40,240 --> 00:30:42,880 Speaker 1: and prove that without any influence. 582 00:30:43,240 --> 00:30:46,040 Speaker 2: Yes, this is what happens. Right, So there's a couple 583 00:30:46,080 --> 00:30:49,080 Speaker 2: of things that happened. Methodologically, there's a lot of things 584 00:30:49,120 --> 00:30:52,560 Speaker 2: that happened. The moment they started coaching those guards.
Number one, 585 00:30:52,920 --> 00:30:56,040 Speaker 2: they took any organicness out of their behavior. They were 586 00:30:56,120 --> 00:30:58,240 Speaker 2: then doing what they thought they were expected to do. 587 00:30:58,400 --> 00:31:00,840 Speaker 2: Like John Wayne, Yeah, for sure, just went over the top, 588 00:31:00,960 --> 00:31:04,840 Speaker 2: is what it was. And then number two, they made 589 00:31:04,880 --> 00:31:08,280 Speaker 2: them co experimenters. Like the whole thing was supposed to 590 00:31:08,280 --> 00:31:10,640 Speaker 2: be guards and prisoners, and we're going to watch as 591 00:31:10,800 --> 00:31:15,600 Speaker 2: test subjects or participants. And when you coach the guards, 592 00:31:16,040 --> 00:31:19,480 Speaker 2: they're co experimenters. Now the experiment's entirely on 593 00:31:19,600 --> 00:31:22,640 Speaker 2: the prisoners, which you can say, okay, well, 594 00:31:22,680 --> 00:31:25,080 Speaker 2: then those findings still worked. Well, that gets thrown out 595 00:31:25,120 --> 00:31:26,800 Speaker 2: when you base the whole thing on a guy who 596 00:31:26,880 --> 00:31:30,560 Speaker 2: is faking, right, But you make the guards 597 00:31:30,840 --> 00:31:34,440 Speaker 2: co experimenters, and you just completely take out any objectivity 598 00:31:34,480 --> 00:31:38,320 Speaker 2: from this experiment. That's problem one with the methodology. 599 00:31:38,560 --> 00:31:40,680 Speaker 1: Well, and the fact we already mentioned that one of 600 00:31:41,040 --> 00:31:44,520 Speaker 1: the researchers was a warden, and Zim, I want to call 601 00:31:44,560 --> 00:31:49,960 Speaker 1: him Sambrano, that's fine, go ahead. Zimbardo Zamboni himself was 602 00:31:50,080 --> 00:31:52,760 Speaker 1: the superintendent, like the minute he decided to do that, 603 00:31:53,360 --> 00:31:55,680 Speaker 1: Like I looked up.
I think he's like in his 604 00:31:55,760 --> 00:31:58,320 Speaker 1: late thirties when he did this. How did he not 605 00:31:58,640 --> 00:32:01,200 Speaker 1: like was he that bad at doing his job? How 606 00:32:01,240 --> 00:32:03,480 Speaker 1: did he not know? Like wait a minute, this will 607 00:32:03,480 --> 00:32:04,400 Speaker 1: taint the experiment. 608 00:32:04,520 --> 00:32:07,080 Speaker 2: Do you want to talk about why the people think 609 00:32:07,160 --> 00:32:11,080 Speaker 2: that he was so yeah, okay, so he was, and 610 00:32:12,320 --> 00:32:15,120 Speaker 2: I think still is, a social activist for sure. 611 00:32:15,760 --> 00:32:19,160 Speaker 2: And he had decided, and I can't really disagree with him, 612 00:32:19,400 --> 00:32:23,240 Speaker 2: that prisons were brutal places where brutality lived, and that 613 00:32:23,360 --> 00:32:27,160 Speaker 2: they were inherently brutal. And so if you take somebody 614 00:32:27,240 --> 00:32:29,320 Speaker 2: and put them into this place, you're doing a real 615 00:32:29,400 --> 00:32:32,800 Speaker 2: disservice to humanity by throwing somebody in a brutal place 616 00:32:32,880 --> 00:32:33,800 Speaker 2: that you know is brutal. 617 00:32:33,880 --> 00:32:35,680 Speaker 1: So his aim was to get reform to happen. 618 00:32:35,880 --> 00:32:37,800 Speaker 2: Yes, from the outset. 619 00:32:38,000 --> 00:32:39,800 Speaker 1: Well, I mean, I can't fault that, but you can't 620 00:32:39,920 --> 00:32:42,000 Speaker 1: call it a scientific experiment either. 621 00:32:42,200 --> 00:32:47,240 Speaker 2: No, And it actually supposedly backfired as well, because one 622 00:32:47,320 --> 00:32:51,840 Speaker 2: interpretation of his findings is that it's all or nothing 623 00:32:51,880 --> 00:32:57,040 Speaker 2: with prisons. Prisons are inherently brutal or you can't have them.
624 00:32:57,200 --> 00:32:59,720 Speaker 2: So either you have prisons and you have brutal prisons, 625 00:32:59,840 --> 00:33:02,680 Speaker 2: or you have no prisons. And so, faced with that 626 00:33:02,920 --> 00:33:06,600 Speaker 2: choice and with rising crime rates in the seventies, a 627 00:33:06,640 --> 00:33:09,400 Speaker 2: lot of people doubled down on getting tough and made 628 00:33:09,440 --> 00:33:12,160 Speaker 2: prisons even worse and built more prisons and said to yes, 629 00:33:12,480 --> 00:33:15,240 Speaker 2: we're not even going to try to like, reform you anymore. 630 00:33:15,760 --> 00:33:17,640 Speaker 2: We're just going to send you to these brutal places 631 00:33:17,880 --> 00:33:19,920 Speaker 2: that are inherently brutal and there's nothing we can do 632 00:33:20,000 --> 00:33:23,080 Speaker 2: about it. So it would have it would have backfired 633 00:33:23,160 --> 00:33:26,040 Speaker 2: in that sense, but in the idea that he was 634 00:33:26,120 --> 00:33:29,440 Speaker 2: doing something with the best interests of his fellow people 635 00:33:30,400 --> 00:33:33,320 Speaker 2: at heart. Again, like you said, it's tough to fault 636 00:33:33,400 --> 00:33:38,520 Speaker 2: him for that. He just really really gave social psychology 637 00:33:38,600 --> 00:33:39,280 Speaker 2: a black eye. 638 00:33:40,280 --> 00:33:42,080 Speaker 1: Yeah. So one of the other things he did wrong, 639 00:33:42,760 --> 00:33:45,440 Speaker 1: and this one I just can't figure out either, is 640 00:33:45,680 --> 00:33:48,120 Speaker 1: he didn't have a control group and one of his 641 00:33:49,400 --> 00:33:51,200 Speaker 1: this guy wasn't in the experiment, but one of his 642 00:33:51,320 --> 00:33:55,280 Speaker 1: colleagues came by one day and was like, you know, 643 00:33:55,360 --> 00:33:55,960 Speaker 1: what's your. 644 00:33:55,880 --> 00:33:57,760 Speaker 2: Control what's your independent variable? 
645 00:33:57,880 --> 00:34:01,000 Speaker 1: Yeah, and he was like what, Yeah, He's like, I 646 00:34:01,040 --> 00:34:01,600 Speaker 1: don't have one. 647 00:34:01,840 --> 00:34:06,960 Speaker 2: So if you run an experiment of any sort, Grabster 648 00:34:07,120 --> 00:34:09,400 Speaker 2: uses a great analogy where if you're trying to figure 649 00:34:09,400 --> 00:34:12,640 Speaker 2: out what the effects of radiation are on tomatoes, you 650 00:34:12,760 --> 00:34:15,239 Speaker 2: pick a bunch of tomatoes, you weigh them, you check 651 00:34:15,280 --> 00:34:18,719 Speaker 2: them for color, you make sure that they're identical to 652 00:34:18,880 --> 00:34:21,719 Speaker 2: another set of tomatoes, So you have two sets of 653 00:34:21,840 --> 00:34:25,279 Speaker 2: basically identical tomatoes. One you radiate, one you do not, 654 00:34:25,719 --> 00:34:27,480 Speaker 2: and after a set amount of time you go back 655 00:34:27,560 --> 00:34:29,600 Speaker 2: and see what the differences are, and then you can 656 00:34:29,680 --> 00:34:33,760 Speaker 2: say probably that when you radiate tomatoes, these are the effects, 657 00:34:33,800 --> 00:34:36,479 Speaker 2: and the effects are the differences between the two. Same 658 00:34:36,560 --> 00:34:37,920 Speaker 2: thing with the prison experiment. 659 00:34:38,000 --> 00:34:41,360 Speaker 1: Yeah, what would you have here? Two different cell blocks 660 00:34:41,960 --> 00:34:45,480 Speaker 1: and one that literally isn't coached and completely left alone. 661 00:34:45,600 --> 00:34:46,919 Speaker 2: That's what I would have done, for sure. 662 00:34:47,000 --> 00:34:49,799 Speaker 1: And then one where you're saying, hey, be brutal, and yeah, 663 00:34:49,960 --> 00:34:52,200 Speaker 1: we'll see if everyone falls into these roles exactly. 664 00:34:52,560 --> 00:34:54,920 Speaker 2: That would have been great.
And actually some researchers in 665 00:34:55,000 --> 00:34:57,160 Speaker 2: two thousand and one, oh yeah, they did. They did 666 00:34:57,719 --> 00:35:01,320 Speaker 2: exactly that. They basically ran the experiment with just that 667 00:35:01,560 --> 00:35:06,080 Speaker 2: control group you suggested. It was called the BBC Prison Study. 668 00:35:05,920 --> 00:35:07,120 Speaker 1: Yeah, Haslam and Reicher. 669 00:35:07,480 --> 00:35:10,279 Speaker 2: Yeah, and basically they did the same thing. They 670 00:35:10,440 --> 00:35:13,520 Speaker 2: did not do any coaching, they didn't do any intervention. 671 00:35:13,680 --> 00:35:16,759 Speaker 2: They did the thing exactly like you're supposed to, or 672 00:35:16,840 --> 00:35:20,600 Speaker 2: like Zimbardo should have from the outset, and, 673 00:35:20,680 --> 00:35:23,680 Speaker 2: again, they made the control group to the original 674 00:35:23,719 --> 00:35:27,240 Speaker 2: Stanford prison experiment, and they found that the exact opposite happened. 675 00:35:27,560 --> 00:35:31,360 Speaker 2: The prisoners stayed banded together, the guards were totally in 676 00:35:31,480 --> 00:35:37,080 Speaker 2: disarray and disorganized. The brutality never emerged, and there wasn't 677 00:35:37,080 --> 00:35:38,720 Speaker 2: any violence. Yeah, I understand. 678 00:35:38,800 --> 00:35:40,719 Speaker 1: And this is where it gets really scummy, if you 679 00:35:40,760 --> 00:35:46,240 Speaker 1: ask me. Zimbardo found out about this, and supposedly Haslam and 680 00:35:46,320 --> 00:35:50,239 Speaker 1: Reicher said they discovered he was privately writing editors to 681 00:35:50,600 --> 00:35:54,480 Speaker 1: keep them from getting published and claiming that they were fraudulent.
682 00:35:54,640 --> 00:35:57,840 Speaker 2: Yeah, in the journal where they released their findings, and 683 00:35:58,000 --> 00:36:03,439 Speaker 2: he wrote an appendage to their article and said, just 684 00:36:03,480 --> 00:36:06,120 Speaker 2: don't even listen to these guys. I'm 685 00:36:06,160 --> 00:36:10,120 Speaker 2: Philip Zimbardo, man. So yeah, I thought that was pretty 686 00:36:10,120 --> 00:36:14,520 Speaker 2: scummy too, if he did that. So methodologically, 687 00:36:14,640 --> 00:36:20,080 Speaker 2: there's even more problems too. In the original newspaper advertisement, Chuck, 688 00:36:20,480 --> 00:36:24,960 Speaker 2: he said, prison experiment, the prison experiment, everybody sign up. 689 00:36:25,120 --> 00:36:27,080 Speaker 1: Yeah, that was a problem in and of itself. They shouldn't 690 00:36:27,080 --> 00:36:28,120 Speaker 1: have known what they were doing. 691 00:36:28,280 --> 00:36:30,919 Speaker 2: No, exactly, until they showed up, right. So you're gonna 692 00:36:30,960 --> 00:36:34,120 Speaker 2: get a big wide swath of people, and then once 693 00:36:34,160 --> 00:36:36,680 Speaker 2: they find out what the experiment is, maybe they'll say 694 00:36:36,719 --> 00:36:40,920 Speaker 2: no thanks or whatever. But this was, like a 695 00:36:41,000 --> 00:36:46,480 Speaker 2: two thousand and seven follow-up study found, attracting narcissistic, hostile, 696 00:36:47,400 --> 00:36:51,520 Speaker 2: overly aggressive authoritarian types like flies to honey. 697 00:36:51,680 --> 00:36:53,040 Speaker 1: Yeah, or the opposite. 698 00:36:53,400 --> 00:36:55,400 Speaker 2: Well, that seems to be the case in this case.
699 00:36:55,320 --> 00:36:57,560 Speaker 1: Yeah, and in fact one of them was a 700 00:36:57,600 --> 00:37:01,360 Speaker 1: liberal activist who kind of purposely got in there because he 701 00:37:01,440 --> 00:37:04,920 Speaker 1: thought maybe these findings could be used one day for 702 00:37:05,400 --> 00:37:06,040 Speaker 1: prison reform. 703 00:37:06,239 --> 00:37:09,879 Speaker 2: Well, I think also, most of what I got from 704 00:37:09,960 --> 00:37:12,120 Speaker 2: Jaffe was coaching the people to say, like, think about what 705 00:37:12,200 --> 00:37:15,040 Speaker 2: the pigs would do, and then do that, because we 706 00:37:15,160 --> 00:37:18,640 Speaker 2: really got to show them how brutal prisons are. I 707 00:37:18,719 --> 00:37:23,000 Speaker 2: think everybody who showed up basically was against prisons. But 708 00:37:23,160 --> 00:37:26,840 Speaker 2: whether you're against prisons or for them, you were automatically tainted 709 00:37:27,239 --> 00:37:30,680 Speaker 2: before you even showed up for the interview. Yeah, because 710 00:37:30,760 --> 00:37:33,920 Speaker 2: they wrote prison experiment in the ad. So from the 711 00:37:34,200 --> 00:37:37,160 Speaker 2: outset there was bias. There was no control group. It 712 00:37:37,280 --> 00:37:39,520 Speaker 2: attracted a biased cross section of people. 713 00:37:39,880 --> 00:37:42,000 Speaker 1: Zimbardo participated, he was. 714 00:37:42,000 --> 00:37:45,440 Speaker 2: A participant, and that actually, Chuck, led to the second 715 00:37:45,719 --> 00:37:51,959 Speaker 2: set of interpretations, that Zimbardo had influenced this and become 716 00:37:52,040 --> 00:37:57,920 Speaker 2: a participant himself. And here's the current interpretation of all 717 00:37:58,000 --> 00:38:00,759 Speaker 2: of it.
Okay, this seems to be the current du jour 718 00:38:01,840 --> 00:38:05,920 Speaker 2: interpretation of the Stanford prison experiment, not that people are 719 00:38:06,000 --> 00:38:10,120 Speaker 2: inherently cruel and inherently will just crumble in the face 720 00:38:10,160 --> 00:38:15,160 Speaker 2: of authority, although that might still stand, but that people 721 00:38:16,040 --> 00:38:20,120 Speaker 2: are capable of cruelty if they're recruited by 722 00:38:20,200 --> 00:38:23,400 Speaker 2: an authority figure. The second set, and there's actually been 723 00:38:23,520 --> 00:38:28,160 Speaker 2: three sets of interpretations. The second set was that Zimbardo 724 00:38:28,320 --> 00:38:33,000 Speaker 2: inserted himself and that it actually demonstrated what's called situationist theory. 725 00:38:33,880 --> 00:38:37,840 Speaker 1: Yeah, and that's basically that external circumstances are the drivers 726 00:38:37,960 --> 00:38:38,880 Speaker 1: of human behavior. 727 00:38:39,920 --> 00:38:42,640 Speaker 2: Right. So the point was not that people are inherently 728 00:38:42,719 --> 00:38:44,640 Speaker 2: cruel on an individual level, but. 729 00:38:44,760 --> 00:38:48,120 Speaker 1: The situation that they're put in. They will quickly find 730 00:38:48,160 --> 00:38:49,320 Speaker 1: those roles if. 731 00:38:49,520 --> 00:38:54,040 Speaker 2: There's a power structure above them that has normalized this 732 00:38:54,400 --> 00:38:57,680 Speaker 2: and is expecting them to fulfill those roles. And this 733 00:38:57,920 --> 00:39:00,880 Speaker 2: really tied in with, you know, this was nineteen seventy one, 734 00:39:00,960 --> 00:39:04,160 Speaker 2: people were still really trying to figure out what the 735 00:39:04,239 --> 00:39:06,399 Speaker 2: heck had just happened with the Nazis. It was only 736 00:39:06,520 --> 00:39:09,480 Speaker 2: like twenty five, twenty six years before.
Yeah, so this 737 00:39:09,640 --> 00:39:13,479 Speaker 2: idea of the banality of evil made perfect sense 738 00:39:13,560 --> 00:39:17,239 Speaker 2: in that respect. Right, there was a bureaucracy that had 739 00:39:17,360 --> 00:39:21,320 Speaker 2: normalized evil and you were just following orders, right. That 740 00:39:21,440 --> 00:39:24,600 Speaker 2: was the second interpretation of the Stanford prison experiment. 741 00:39:24,760 --> 00:39:27,400 Speaker 1: Yeah. Well, and not just the Nazis, but everything like 742 00:39:27,640 --> 00:39:31,480 Speaker 1: the Vietnam War, which was, I mean, this was nineteen 743 00:39:31,520 --> 00:39:35,080 Speaker 1: seventy one, right, and like the My Lai massacre, and, you know, 744 00:39:35,160 --> 00:39:38,320 Speaker 1: I was just following orders. Like, this tied in, this 745 00:39:38,440 --> 00:39:41,600 Speaker 1: has its fingers in a lot of relevant politics. 746 00:39:41,160 --> 00:39:44,800 Speaker 2: Of the day, right. So apparently it also tied in 747 00:39:44,920 --> 00:39:48,080 Speaker 2: really well to Attica, and Zimbardo must have just not been able to 748 00:39:48,120 --> 00:39:50,919 Speaker 2: believe his good fortune that the 749 00:39:51,040 --> 00:39:55,200 Speaker 2: deadliest prison riot in American history happened like a couple 750 00:39:55,239 --> 00:39:57,279 Speaker 2: of weeks after he made the news in the New 751 00:39:57,400 --> 00:39:59,960 Speaker 2: York Times magazine with this article 752 00:40:00,239 --> 00:40:02,560 Speaker 2: that he wrote. Right. Yeah, but that actually played into 753 00:40:02,600 --> 00:40:05,640 Speaker 2: it too, because apparently, following orders, a lot of guards 754 00:40:05,760 --> 00:40:09,239 Speaker 2: just fired blindly into the tear gas smoke of this 755 00:40:09,440 --> 00:40:14,680 Speaker 2: prison riot and killed tons of unarmed prisoners and hostages.
756 00:40:15,320 --> 00:40:19,200 Speaker 2: So Zimbardo's like, Okay, that's fine, however we're going 757 00:40:19,239 --> 00:40:23,080 Speaker 2: to interpret this, I'm cool with that. But the third one, 758 00:40:23,200 --> 00:40:25,400 Speaker 2: I'm not quite sure that he would be cool with 759 00:40:26,040 --> 00:40:26,960 Speaker 2: the current. 760 00:40:26,719 --> 00:40:29,520 Speaker 1: One, which is bad science, I think. 761 00:40:29,880 --> 00:40:31,560 Speaker 2: So what I saw is that a lot of social 762 00:40:31,640 --> 00:40:34,200 Speaker 2: psychologists said, we've known this was bad science all along, 763 00:40:34,280 --> 00:40:38,239 Speaker 2: but the findings were really interesting and worthwhile, so we 764 00:40:38,280 --> 00:40:40,959 Speaker 2: didn't throw the baby out with the bathwater. The third 765 00:40:41,000 --> 00:40:46,400 Speaker 2: one is that Zimbardo inserted himself, and what this 766 00:40:46,680 --> 00:40:51,440 Speaker 2: study really showed was that people will engage in 767 00:40:51,520 --> 00:40:54,920 Speaker 2: acts of cruelty if there is a figure of authority 768 00:40:55,239 --> 00:40:58,400 Speaker 2: recruiting them to what they think is a righteous cause. 769 00:40:58,719 --> 00:41:03,040 Speaker 2: And in this case, it was Zimbardo making the guards 770 00:41:03,200 --> 00:41:06,400 Speaker 2: co-experimenters by coaching them to be cruel, right, 771 00:41:07,239 --> 00:41:10,959 Speaker 2: in the name of prison reform, ultimately, to show 772 00:41:11,000 --> 00:41:13,680 Speaker 2: the world what happens when you put normal people in 773 00:41:13,760 --> 00:41:14,560 Speaker 2: a prison situation.
774 00:41:14,760 --> 00:41:17,560 Speaker 1: Yeah, which is what the John Wayne guy very much 775 00:41:17,600 --> 00:41:20,440 Speaker 1: has said all his life since then, is that this 776 00:41:20,640 --> 00:41:22,880 Speaker 1: is what I thought they wanted, was for me 777 00:41:23,000 --> 00:41:25,400 Speaker 1: to be a bad guard, right, so we could prove 778 00:41:26,880 --> 00:41:28,920 Speaker 1: ultimately that prisons need reform. 779 00:41:29,200 --> 00:41:32,000 Speaker 2: And that is why he's still complicit, because he still 780 00:41:32,640 --> 00:41:36,920 Speaker 2: engaged in these acts of genuine cruelty against the prisoners 781 00:41:36,960 --> 00:41:39,560 Speaker 2: in the study, and that's why he should still feel 782 00:41:39,560 --> 00:41:41,560 Speaker 2: bad and still does feel bad. But he did it 783 00:41:41,719 --> 00:41:45,440 Speaker 2: because he was recruited in the name of this righteous 784 00:41:45,520 --> 00:41:47,719 Speaker 2: cause by somebody who was in authority. 785 00:41:47,880 --> 00:41:50,040 Speaker 1: So is this being taught this way in classes now? 786 00:41:50,320 --> 00:41:53,239 Speaker 2: I think that, especially once it came out that 787 00:41:53,400 --> 00:41:57,040 Speaker 2: Zimbardo, or at the very least his warden, a co-experimenter, 788 00:41:57,239 --> 00:42:00,480 Speaker 2: was coaching them to do this, the idea of organic 789 00:42:00,600 --> 00:42:04,040 Speaker 2: cruelty is just totally out the window. I think they 790 00:42:04,080 --> 00:42:05,879 Speaker 2: don't know what to do with it right now. They're 791 00:42:05,960 --> 00:42:08,680 Speaker 2: trying to figure it out, like how to get these 792 00:42:08,760 --> 00:42:10,680 Speaker 2: findings across or what to make of them.
793 00:42:11,000 --> 00:42:13,319 Speaker 1: Because in one of these quotes from the article you sent, 794 00:42:13,640 --> 00:42:16,120 Speaker 1: the guy said, I don't think it's scientific fraud in 795 00:42:16,160 --> 00:42:19,040 Speaker 1: the typical sense. It was never considered to be scientific. 796 00:42:19,120 --> 00:42:24,239 Speaker 1: It's typically represented in classrooms as a demonstration, not an experiment, 797 00:42:24,719 --> 00:42:28,440 Speaker 1: and as a notorious case of ethical malfeasance. So that's 798 00:42:28,440 --> 00:42:32,440 Speaker 1: almost a fourth takeaway, is that it's an example of 799 00:42:32,560 --> 00:42:36,520 Speaker 1: how to not do a study correctly, right, which is interesting. 800 00:42:37,040 --> 00:42:41,400 Speaker 2: Oh yeah, I mean methodologically inserting yourself, like lying about 801 00:42:41,440 --> 00:42:46,000 Speaker 2: the findings later on, or misinterpreting the results, or using 802 00:42:46,160 --> 00:42:48,440 Speaker 2: spin, it's, yeah, there's a lot here, but it. 803 00:42:48,520 --> 00:42:51,520 Speaker 1: Was approved by the Stanford Human Subjects Review Committee 804 00:42:52,120 --> 00:42:55,240 Speaker 1: at the time. That was who Zimbardo presented 805 00:42:55,280 --> 00:42:59,840 Speaker 1: this to, and, you know, he still says that 806 00:42:59,880 --> 00:43:00,640 Speaker 1: it was ethical. 807 00:43:01,000 --> 00:43:02,920 Speaker 2: Well, at the time, under the guidelines, it 808 00:43:03,040 --> 00:43:05,759 Speaker 2: was ethical, but then after that they changed the guidelines. 809 00:43:06,480 --> 00:43:08,719 Speaker 1: You couldn't do this today, no, or at least not 810 00:43:08,800 --> 00:43:10,000 Speaker 1: like he did it. 811 00:43:10,840 --> 00:43:14,640 Speaker 2: So, did you remember the very brief psychology is 812 00:43:14,719 --> 00:43:15,400 Speaker 2: not serious? 813 00:43:15,560 --> 00:43:16,000 Speaker 1: I watched that.
814 00:43:16,160 --> 00:43:18,200 Speaker 2: I did one on the Stanford prison experiment. 815 00:43:18,280 --> 00:43:19,080 Speaker 1: Yeah, I watched that today. 816 00:43:19,239 --> 00:43:21,439 Speaker 2: Did you? What did you think, it was good? Thanks, 817 00:43:21,520 --> 00:43:22,720 Speaker 1: man. Cute little background. 818 00:43:22,880 --> 00:43:26,640 Speaker 2: Yeah, I thought so too. And let's see, you got 819 00:43:26,680 --> 00:43:27,239 Speaker 2: anything else? 820 00:43:27,960 --> 00:43:30,160 Speaker 1: No, I mean, boy, I thought we were pretty scathing, but. 821 00:43:30,520 --> 00:43:33,840 Speaker 2: We were. This is like vaping-level scathing. This is 822 00:43:33,920 --> 00:43:36,239 Speaker 2: way worse than vaping. I'm sure the vapers are like going, 823 00:43:36,320 --> 00:43:37,600 Speaker 2: they were really hard on that guy. 824 00:43:37,920 --> 00:43:41,720 Speaker 1: Yeah. The movie, uh, you know, the documentary is probably 825 00:43:41,719 --> 00:43:45,160 Speaker 1: a little more accurate. But the movie wasn't bad. Yeah, 826 00:43:45,280 --> 00:43:47,960 Speaker 1: I mean it's not great, yeah, but it was okay. 827 00:43:48,000 --> 00:43:48,840 Speaker 1: It felt like a movie of the. 828 00:43:48,840 --> 00:43:51,560 Speaker 2: Week. Gotcha, it's an airplane movie. 829 00:43:52,400 --> 00:43:54,719 Speaker 1: Yeah, watch it on your next flight. That's my recommendation. 830 00:43:54,880 --> 00:43:58,000 Speaker 2: Thanks, buddy. Well, if you want to know more about 831 00:43:58,040 --> 00:44:01,680 Speaker 2: the Stanford prison experiment, type those words in the search 832 00:44:01,719 --> 00:44:03,759 Speaker 2: bar at howstuffworks dot com and it'll bring up this 833 00:44:03,880 --> 00:44:07,040 Speaker 2: Grabster article. Since I said Grabster, it's time for listener mail. 834 00:44:10,200 --> 00:44:12,640 Speaker 1: I'm gonna call this Beautiful Landscaping.
Hey, guys, I spent 835 00:44:12,680 --> 00:44:15,359 Speaker 1: the last two years fixing up the yard in our 836 00:44:15,480 --> 00:44:19,040 Speaker 1: house in Point Pleasant, Pennsylvania. 837 00:44:19,160 --> 00:44:20,520 Speaker 2: Oh, that sounds like a pleasant place. 838 00:44:20,760 --> 00:44:23,040 Speaker 1: Yeah, it is. My husband actually introduced me to your show 839 00:44:23,040 --> 00:44:25,399 Speaker 1: a few years back, and thank god he did, because 840 00:44:25,440 --> 00:44:27,919 Speaker 1: I've literally listened to you for hours and hours while 841 00:44:27,960 --> 00:44:28,640 Speaker 1: working in the yard. 842 00:44:28,800 --> 00:44:29,000 Speaker 2: Nice. 843 00:44:29,120 --> 00:44:31,160 Speaker 1: It was a huge undertaking. I have a more flexible 844 00:44:31,200 --> 00:44:33,960 Speaker 1: work schedule than he does, so I volunteered to absorb 845 00:44:34,040 --> 00:44:36,520 Speaker 1: most of the responsibility, although he did a lot of 846 00:44:36,560 --> 00:44:39,120 Speaker 1: heavy lifting too. I enjoyed the show so much I 847 00:44:39,160 --> 00:44:41,879 Speaker 1: stopped allowing myself to listen to it at any 848 00:44:41,960 --> 00:44:44,840 Speaker 1: other time. You were only allowed during yard work. This 849 00:44:44,960 --> 00:44:46,719 Speaker 1: made me much more ready to get outside and get 850 00:44:46,760 --> 00:44:48,959 Speaker 1: into it. You guys were with me while I carried 851 00:44:49,000 --> 00:44:52,040 Speaker 1: literally tons of red stone uphill in buckets, hauling rocks for 852 00:44:52,120 --> 00:44:59,640 Speaker 1: a fire ring and landing, planting pachysandra, ferns, and hostas in 853 00:44:59,719 --> 00:45:02,320 Speaker 1: the worst soil I've ever had to work with, and just 854 00:45:02,440 --> 00:45:03,960 Speaker 1: clearing away overgrowth. 855 00:45:03,600 --> 00:45:06,400 Speaker 2: Which, it sounds like Tonya Harding training for the Olympics.
856 00:45:06,440 --> 00:45:09,800 Speaker 1: And that one montage, which, it turned out, included a 857 00:45:09,840 --> 00:45:11,600 Speaker 1: fair amount of poison ivy. During it all, I learned 858 00:45:11,600 --> 00:45:15,120 Speaker 1: about a tiny, adorable little creature called the tardigrade, the business 859 00:45:15,160 --> 00:45:19,080 Speaker 1: of head transplants, the hookworm, her favorite episode, and 860 00:45:19,160 --> 00:45:23,240 Speaker 1: some haunting information I cannot unhear, such as you provided 861 00:45:23,320 --> 00:45:26,480 Speaker 1: in the bullfighting and drowning episodes. You're always very entertaining, 862 00:45:26,520 --> 00:45:29,080 Speaker 1: full of information. Even when I think it's boring, you 863 00:45:29,160 --> 00:45:31,680 Speaker 1: make it fun. There were times you had me LOLing 864 00:45:31,760 --> 00:45:35,080 Speaker 1: in my backyard alone and covered in dirt and sweat 865 00:45:35,160 --> 00:45:38,120 Speaker 1: like a crazy person. Attached are some pictures of the progress, 866 00:45:39,120 --> 00:45:43,719 Speaker 1: all from your climate-controlled studio, that is. From Sharon Prashinsky. 867 00:45:44,200 --> 00:45:47,839 Speaker 1: And Sharon, you did a great job. That is one 868 00:45:48,000 --> 00:45:50,960 Speaker 1: beautiful yard you got going. Yeah, for sure, it is lovely. 869 00:45:51,160 --> 00:45:53,319 Speaker 2: It is nice work. We're glad we could be there 870 00:45:53,360 --> 00:45:55,759 Speaker 2: with you to help you get up that hill. 871 00:45:56,000 --> 00:45:58,000 Speaker 1: Yeah, and down the hill and then back up the 872 00:45:58,120 --> 00:45:59,560 Speaker 1: hill and back down the hill. 873 00:45:59,480 --> 00:46:02,520 Speaker 2: That's right, and then back up again. If you want 874 00:46:02,600 --> 00:46:03,920 Speaker 2: to get in touch with us, let us know 875 00:46:03,960 --> 00:46:05,839 Speaker 2: how we've helped you out.
We love hearing that kind 876 00:46:05,880 --> 00:46:08,359 Speaker 2: of stuff. If you're Philip Zimbardo, we expect to hear 877 00:46:08,440 --> 00:46:12,560 Speaker 2: from your lawyer. And in the meantime, you can hang 878 00:46:12,600 --> 00:46:14,040 Speaker 2: out with us at our home on the web, Stuff 879 00:46:14,080 --> 00:46:15,759 Speaker 2: YouShouldKnow dot com, where you can find all of 880 00:46:15,800 --> 00:46:18,279 Speaker 2: our social links, and you can also send us an 881 00:46:18,320 --> 00:46:20,799 Speaker 2: email to Stuff Podcast at HowStuffWorks dot com. 882 00:46:23,320 --> 00:46:26,160 Speaker 1: Stuff You Should Know is a production of iHeartRadio. For 883 00:46:26,320 --> 00:46:30,440 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 884 00:46:30,600 --> 00:46:32,400 Speaker 1: or wherever you listen to your favorite shows.