1 00:00:01,480 --> 00:00:07,720 Speaker 1: Welcome to Stuff you Should Know, a production of iHeartRadio. 2 00:00:11,119 --> 00:00:14,040 Speaker 2: Hey, and welcome to the podcast. I'm Josh, there's Chuck, 3 00:00:14,400 --> 00:00:18,639 Speaker 2: there's Jerry, and she squeezes together in a one meter 4 00:00:18,800 --> 00:00:22,680 Speaker 2: square space. We're still doing pretty good on Stuff you 5 00:00:22,720 --> 00:00:24,640 Speaker 2: Should Know. 6 00:00:24,600 --> 00:00:27,440 Speaker 3: You know, it's funny, as I was reading, and then we'll get 7 00:00:27,480 --> 00:00:30,560 Speaker 3: to it, but when things start to become problematic as 8 00:00:30,560 --> 00:00:34,040 Speaker 3: far as people per square meter, I started to panic. 9 00:00:34,920 --> 00:00:38,800 Speaker 2: Oh really? Yeah, that didn't get me, but it occurred 10 00:00:38,840 --> 00:00:40,599 Speaker 2: to me that some people listening to this are going 11 00:00:40,680 --> 00:00:41,560 Speaker 2: to feel that way too. 12 00:00:42,200 --> 00:00:44,920 Speaker 3: Yeah, because we're talking about crowds, and if even the 13 00:00:45,000 --> 00:00:49,080 Speaker 3: discussion of being in very close proximity to someone else 14 00:00:49,200 --> 00:00:51,520 Speaker 3: triggers you, then consider this your warning. 15 00:00:52,600 --> 00:00:55,920 Speaker 2: Yeah, great, great job, Chuck. Yeah, sure. So we are 16 00:00:55,960 --> 00:00:59,160 Speaker 2: talking crowds today. Chuck, you spoiled the entire episode by 17 00:00:59,240 --> 00:01:05,280 Speaker 2: mentioning that. Yeah, I'm just kidding, buddy. And generally you think 18 00:01:05,319 --> 00:01:07,640 Speaker 2: of this huge mass of people, say like at 19 00:01:07,640 --> 00:01:10,000 Speaker 2: a concert or something like that, or a show if 20 00:01:10,000 --> 00:01:14,520 Speaker 2: you're into indie bands, but it can be any assemblage 21 00:01:14,560 --> 00:01:17,200 Speaker 2: of people, any group of people.
I'm guessing more than 22 00:01:17,240 --> 00:01:20,880 Speaker 2: two if that old adage is correct, about three being 23 00:01:20,920 --> 00:01:23,240 Speaker 2: a crowd, but it can be anything from a bunch 24 00:01:23,240 --> 00:01:26,319 Speaker 2: of people in an elevator to a bunch of people 25 00:01:26,880 --> 00:01:29,600 Speaker 2: going to Mecca, flocking to Mecca for the Hajj in 26 00:01:29,680 --> 00:01:30,440 Speaker 2: any given year. 27 00:01:31,000 --> 00:01:34,240 Speaker 3: Yeah, I think like technically just any grouping. Who 28 00:01:34,280 --> 00:01:35,040 Speaker 3: put this one together, 29 00:01:35,080 --> 00:01:36,640 Speaker 4: by the way? 30 00:01:36,600 --> 00:01:39,760 Speaker 2: Initially, this was a Julia joint. And thank you for reminding 31 00:01:39,800 --> 00:01:44,080 Speaker 2: me before we go further. Kimberly from the Prison Labor 32 00:01:44,240 --> 00:01:46,720 Speaker 2: listener mail is the person who got this one going. 33 00:01:46,760 --> 00:01:47,760 Speaker 2: So thanks, Kimberly. 34 00:01:47,880 --> 00:01:50,080 Speaker 3: Okay, yeah, thanks big time, because I thought this was 35 00:01:50,120 --> 00:01:53,120 Speaker 3: super interesting. But yeah, yeah, I mean Julia did some 36 00:01:53,160 --> 00:01:55,800 Speaker 3: research and basically came back with this: a crowd is 37 00:01:55,880 --> 00:01:59,760 Speaker 3: any group of people temporarily gathered in the same physical 38 00:02:00,920 --> 00:02:03,720 Speaker 3: space, relatively close to each other. Elevators. 39 00:02:05,040 --> 00:02:07,480 Speaker 2: Sure, I mean it's a little surprising if you think 40 00:02:07,520 --> 00:02:09,320 Speaker 2: about it. For some reason, I got hooked on the 41 00:02:09,360 --> 00:02:12,480 Speaker 2: elevator thing. But there's a lot of different reasons those 42 00:02:12,480 --> 00:02:15,240 Speaker 2: people could be together.
They're all going to different floors 43 00:02:15,240 --> 00:02:17,040 Speaker 2: in the same building, say if they happen to be 44 00:02:17,080 --> 00:02:20,280 Speaker 2: on an elevator, or they could all have a shared 45 00:02:20,320 --> 00:02:25,240 Speaker 2: interest, they're all at a gun show, or they have 46 00:02:25,280 --> 00:02:29,799 Speaker 2: a shared goal, they're trying to overthrow their regime, their 47 00:02:29,880 --> 00:02:32,840 Speaker 2: ruling regime. There's all sorts of different reasons people come 48 00:02:32,880 --> 00:02:35,720 Speaker 2: together in crowds. Sometimes it's on purpose, sometimes it's not 49 00:02:36,360 --> 00:02:39,480 Speaker 2: planned by that person, but it just happens. And one 50 00:02:39,480 --> 00:02:42,360 Speaker 2: of the cool things is we've been studying crowds through 51 00:02:42,520 --> 00:02:46,919 Speaker 2: all sorts of different lenses, and we've kind of whittled 52 00:02:46,960 --> 00:02:51,480 Speaker 2: down to the fact that humans are essentially innately good 53 00:02:52,000 --> 00:02:54,519 Speaker 2: at navigating crowds for the most part. 54 00:02:54,760 --> 00:02:57,760 Speaker 3: Yeah, you know, some of the stuff Julia came back 55 00:02:57,760 --> 00:03:01,000 Speaker 3: with was fairly reassuring. Like, you might be a little scared 56 00:03:01,240 --> 00:03:04,000 Speaker 3: of crowds, but it is good to know that that 57 00:03:04,120 --> 00:03:06,679 Speaker 3: is sort of outdated thinking, and while we will talk 58 00:03:06,720 --> 00:03:10,680 Speaker 3: about bad things that can happen in crowds, the modern 59 00:03:10,960 --> 00:03:14,320 Speaker 3: understanding of crowds, you're right, is that people are generally 60 00:03:14,480 --> 00:03:20,280 Speaker 3: pretty orderly, even in the face of disaster, and the 61 00:03:20,320 --> 00:03:23,400 Speaker 3: things that make a crowd go bad, a lot of 62 00:03:23,440 --> 00:03:26,040 Speaker 3: times it's not the crowd's fault.
63 00:03:27,040 --> 00:03:30,559 Speaker 2: Yeah, they get blamed by authorities very frequently in retrospect, 64 00:03:30,600 --> 00:03:33,280 Speaker 2: and then in further retrospect it turns out like, no, actually, 65 00:03:33,360 --> 00:03:35,560 Speaker 2: the authorities are probably at fault in this case. 66 00:03:36,000 --> 00:03:38,880 Speaker 3: Yeah, I mean it's almost as if sometimes a peaceful 67 00:03:38,880 --> 00:03:45,520 Speaker 3: protest can turn bad when an armed military shows up, right, 68 00:03:45,640 --> 00:03:48,320 Speaker 3: just the presence of that, just the presence and then 69 00:03:48,520 --> 00:03:49,960 Speaker 3: other things that happen after. 70 00:03:50,240 --> 00:03:53,360 Speaker 2: Right. So yes, and we'll talk about that for sure, 71 00:03:53,360 --> 00:03:55,720 Speaker 2: because there is a huge role for law enforcement in 72 00:03:55,760 --> 00:03:59,240 Speaker 2: dealing with crowds, like it's part of their job. Sure, 73 00:03:59,440 --> 00:04:03,040 Speaker 2: they say crowd control, but that's apparently an 74 00:04:03,040 --> 00:04:06,360 Speaker 2: outdated term as well. But dealing with crowds, managing crowds, 75 00:04:07,080 --> 00:04:09,080 Speaker 2: that's part of their job. That's not gonna go away. 76 00:04:09,200 --> 00:04:11,920 Speaker 2: We don't really want it to go away. Instead, we 77 00:04:11,960 --> 00:04:15,440 Speaker 2: want law enforcement to do it using the 78 00:04:15,480 --> 00:04:18,360 Speaker 2: best practices that have been proven over and over again 79 00:04:18,600 --> 00:04:21,599 Speaker 2: to cut down the chance of a crowd turning ugly 80 00:04:22,160 --> 00:04:24,400 Speaker 2: by huge percentages. 81 00:04:24,960 --> 00:04:27,560 Speaker 3: Yeah, but you know, one thing is for sure, like 82 00:04:28,040 --> 00:04:32,240 Speaker 3: people behave differently in crowds, and sometimes it's great.
You know, 83 00:04:33,080 --> 00:04:35,400 Speaker 3: I never dance like this, man, but I'm 84 00:04:35,400 --> 00:04:38,599 Speaker 3: at Burning Man and look at me all of a sudden, right? 85 00:04:39,600 --> 00:04:41,560 Speaker 3: Or it can turn bad, and we're gonna 86 00:04:41,560 --> 00:04:42,360 Speaker 3: cover all angles. 87 00:04:42,440 --> 00:04:44,320 Speaker 2: Well, let's talk about some of the bad ones, because 88 00:04:44,360 --> 00:04:48,000 Speaker 2: there are some famous ones. This happens, you don't 89 00:04:48,000 --> 00:04:49,480 Speaker 2: want to say a lot, but it's one of those 90 00:04:49,520 --> 00:04:51,640 Speaker 2: things like a plane crash, where it seems like it 91 00:04:51,680 --> 00:04:55,839 Speaker 2: happens all the time because the details of it are just so shocking 92 00:04:56,600 --> 00:04:58,960 Speaker 2: whenever it does happen. It seems like 93 00:04:59,000 --> 00:05:02,200 Speaker 2: it happens way more frequently. That's just not the case. 94 00:05:02,200 --> 00:05:03,720 Speaker 2: But when it does happen, like I said, it can 95 00:05:03,760 --> 00:05:04,440 Speaker 2: be pretty bad.
104 00:05:28,440 --> 00:05:30,640 Speaker 2: Yeah, there was one I hadn't heard of that was 105 00:05:30,680 --> 00:05:34,320 Speaker 2: pretty bad, quite bad. It happened at the coronation of 106 00:05:34,400 --> 00:05:38,360 Speaker 2: Tsar Nicholas the Second in Moscow in eighteen ninety six. Yeah, 107 00:05:38,400 --> 00:05:41,200 Speaker 2: there was a crowd of half a million people who 108 00:05:41,240 --> 00:05:43,599 Speaker 2: were all there for the coronation, and they knew that 109 00:05:43,640 --> 00:05:47,560 Speaker 2: they were giving out free souvenirs. Essentially, I saw half 110 00:05:47,560 --> 00:05:50,479 Speaker 2: a pound of sausage, bags of nuts, a souvenir cup, 111 00:05:50,920 --> 00:05:53,400 Speaker 2: and apparently they had enough for everybody, but a rumor 112 00:05:53,440 --> 00:05:56,279 Speaker 2: spread in the crowd that they didn't have enough for everybody, 113 00:05:56,400 --> 00:05:58,919 Speaker 2: so the people in the back started pushing. A stampede 114 00:05:58,960 --> 00:06:01,559 Speaker 2: broke out, and thirteen hundred people died. 115 00:06:02,040 --> 00:06:03,520 Speaker 4: Yeah, it feels like concerts. 116 00:06:04,240 --> 00:06:07,040 Speaker 3: And sporting events, it can be dozens of people, which 117 00:06:07,040 --> 00:06:09,600 Speaker 3: is all incredibly sad. But when people on the order 118 00:06:09,640 --> 00:06:13,640 Speaker 3: of thousands are dying, yeah, from an event, that's pretty striking. 119 00:06:13,400 --> 00:06:16,960 Speaker 2: And that really follows the model. It seems like for 120 00:06:17,480 --> 00:06:20,680 Speaker 2: loss of life in crowd crushes, there's some sort of 121 00:06:20,760 --> 00:06:24,240 Speaker 2: bottleneck toward the front, and it's wider in back, so 122 00:06:24,360 --> 00:06:27,400 Speaker 2: people in back start pushing and the people in front 123 00:06:27,440 --> 00:06:30,839 Speaker 2: can't go forward, so they get crushed.
That seems to 124 00:06:30,839 --> 00:06:33,359 Speaker 2: be like the case. That's what happened at Astroworld, 125 00:06:33,520 --> 00:06:38,080 Speaker 2: that's what happened in Moscow, that's what happened in Duisburg, Germany, 126 00:06:38,520 --> 00:06:41,880 Speaker 2: at the world's largest techno festival, the Love Parade. 127 00:06:42,040 --> 00:06:42,760 Speaker 4: Yeah, I remember that. 128 00:06:42,920 --> 00:06:46,640 Speaker 3: Twenty ten. Twenty-one people died, and it was a 129 00:06:46,640 --> 00:06:47,559 Speaker 3: tunnel in that case. 130 00:06:48,000 --> 00:06:52,680 Speaker 2: Yeah, it also happened. I mentioned the Hajj, and two 131 00:06:52,720 --> 00:06:58,640 Speaker 2: thousand people, two thousand people died at the Hajj as 132 00:06:58,720 --> 00:07:02,320 Speaker 2: part of that annual pilgrimage to Mecca that all Muslims 133 00:07:02,320 --> 00:07:05,920 Speaker 2: are required to do at least once in their lifetime. Well, 134 00:07:06,040 --> 00:07:07,920 Speaker 2: they have like on the order of one and a 135 00:07:08,000 --> 00:07:11,360 Speaker 2: half million people, up to two million people every year 136 00:07:11,480 --> 00:07:15,080 Speaker 2: do this thing. And in twenty fifteen, two thousand people 137 00:07:15,120 --> 00:07:18,120 Speaker 2: died because there was a bottleneck and people pushed from the back, 138 00:07:18,720 --> 00:07:22,520 Speaker 2: and that actually broke the previous record of the most 139 00:07:22,560 --> 00:07:25,680 Speaker 2: people killed in a crowd crush. That was also at 140 00:07:25,720 --> 00:07:27,720 Speaker 2: the Hajj, that time in nineteen ninety. 141 00:07:28,200 --> 00:07:33,400 Speaker 3: Yeah, super super sad. Because of all these tragedies, people 142 00:07:33,560 --> 00:07:36,800 Speaker 3: have been studying how people move through crowds, crowd dynamics. 143 00:07:36,840 --> 00:07:39,920 Speaker 3: It's you know, they use a lot of different sciences.
144 00:07:40,160 --> 00:07:42,320 Speaker 3: Physics definitely is one, and we're going to talk about 145 00:07:42,320 --> 00:07:47,840 Speaker 3: all these, but also physiology, psychology certainly, and what they're 146 00:07:47,880 --> 00:07:51,240 Speaker 3: looking at is human behavior, how people behave when they 147 00:07:51,280 --> 00:07:54,760 Speaker 3: get into crowds, because it changes once you have sort 148 00:07:54,760 --> 00:07:57,400 Speaker 3: of a veil of anonymity going on, and again it 149 00:07:57,400 --> 00:07:59,840 Speaker 3: can be good or bad. But they use this in 150 00:07:59,840 --> 00:08:02,400 Speaker 3: all kinds of modeling, whether it's you know, in 151 00:08:02,440 --> 00:08:08,120 Speaker 3: city planning or certainly event preparation, disaster management, crowd management, 152 00:08:08,120 --> 00:08:10,560 Speaker 3: all kinds of people will pay people a lot 153 00:08:10,560 --> 00:08:12,920 Speaker 3: of money to study and give their findings. 154 00:08:13,080 --> 00:08:16,920 Speaker 2: And you mentioned applying the principles of physics to studying this. 155 00:08:17,040 --> 00:08:20,000 Speaker 2: I think that started in the fifties. And it's not 156 00:08:20,080 --> 00:08:24,000 Speaker 2: like tit for tat, but the movement and the formation 157 00:08:24,120 --> 00:08:27,840 Speaker 2: of crowds resembles it so much that you can basically 158 00:08:27,920 --> 00:08:33,479 Speaker 2: use physics terms like orbital motion or oscillators to basically 159 00:08:34,000 --> 00:08:37,679 Speaker 2: describe how people behave in a crowd and how crowds 160 00:08:38,200 --> 00:08:40,520 Speaker 2: behave themselves. One of the ways of looking at it is 161 00:08:40,640 --> 00:08:45,160 Speaker 2: systems theory, and complex adaptive systems are what crowds do. 162 00:08:45,240 --> 00:08:47,600 Speaker 2: It's chaotic at first, but then they start to move 163 00:08:47,640 --> 00:08:51,880 Speaker 2: together as a whole.
Lines of communication develop without speech. 164 00:08:51,920 --> 00:08:55,120 Speaker 2: It's just kind of like the crowd learns through feedback 165 00:08:55,480 --> 00:09:00,480 Speaker 2: and it just starts to become a cohesive whole 166 00:09:00,880 --> 00:09:03,680 Speaker 2: rather than just a bunch of people together in a space. 167 00:09:04,280 --> 00:09:07,920 Speaker 3: Yeah, I mean, certainly fluid dynamics is a factor, and 168 00:09:07,920 --> 00:09:10,440 Speaker 3: they kind of use some of that terminology as well 169 00:09:10,480 --> 00:09:14,440 Speaker 3: when talking about crowds, because if you're a fluid, like 170 00:09:14,480 --> 00:09:16,880 Speaker 3: let's say you're a liquid or even a gas, the 171 00:09:16,920 --> 00:09:20,559 Speaker 3: particles are really tightly packed, but they're never actually colliding 172 00:09:20,600 --> 00:09:23,640 Speaker 3: because they have electrons repelling each other. Yeah, we do 173 00:09:23,800 --> 00:09:26,959 Speaker 3: a similar thing, but it's what's known as social force. 174 00:09:27,840 --> 00:09:30,720 Speaker 3: It's that thing when you're in a crowd and you 175 00:09:30,920 --> 00:09:37,440 Speaker 3: just unconsciously, instinctively navigate without constantly bumping into other people, 176 00:09:38,320 --> 00:09:40,679 Speaker 3: at least if 177 00:09:39,760 --> 00:09:41,160 Speaker 4: you're doing it right, right. 178 00:09:41,600 --> 00:09:43,960 Speaker 3: I've been in crowds where people are a little more unaware, 179 00:09:43,960 --> 00:09:45,880 Speaker 3: and there may be reasons for that, so I'm not 180 00:09:45,960 --> 00:09:50,640 Speaker 3: like casting aspersions. But generally speaking, your body is just 181 00:09:50,720 --> 00:09:54,720 Speaker 3: automatically adjusting because you want that path of least resistance. 182 00:09:54,880 --> 00:09:57,840 Speaker 3: You don't want to be banging into people all around you.
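[Editor's note: the "social force" idea described here has a standard formalization in crowd research, often credited to Helbing and Molnár. As a rough illustration only, with made-up parameters A and B rather than anything from the episode, each walker can be modeled as steering toward a desired velocity while being pushed away from neighbors by a force that decays exponentially with distance:]

```python
import math

# Illustrative social-force-style sketch (parameters are arbitrary).
A = 2.0  # repulsion strength (assumed)
B = 0.3  # repulsion range in meters (assumed)

def repulsion(p, q):
    """Repulsive force on the pedestrian at point p from the one at q (2D)."""
    dx, dy = p[0] - q[0], p[1] - q[1]
    d = math.hypot(dx, dy) or 1e-9  # avoid division by zero
    mag = A * math.exp(-d / B)      # decays exponentially with distance
    return (mag * dx / d, mag * dy / d)

def step(positions, desired_vel, dt=0.1):
    """Advance every pedestrian one time step: desired velocity plus
    the sum of repulsions from every other pedestrian."""
    new = []
    for i, p in enumerate(positions):
        fx, fy = desired_vel[i]
        for j, q in enumerate(positions):
            if i != j:
                rx, ry = repulsion(p, q)
                fx, fy = fx + rx, fy + ry
        new.append((p[0] + fx * dt, p[1] + fy * dt))
    return new
```

[Two stationary walkers placed 0.2 m apart will drift away from each other on the next step, which is the "never actually colliding" behavior the hosts describe.]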
183 00:09:58,400 --> 00:10:02,360 Speaker 3: And even in great crowds, like a huge excited stadium 184 00:10:03,480 --> 00:10:06,520 Speaker 3: after a big sports win or after a big energetic 185 00:10:06,520 --> 00:10:09,960 Speaker 3: concert or something, you still find your way out of 186 00:10:09,960 --> 00:10:14,920 Speaker 3: there generally, you know, not making contact with other people. 187 00:10:15,000 --> 00:10:18,520 Speaker 3: Maybe slight, you know, bumps here and there, but unless 188 00:10:18,520 --> 00:10:21,319 Speaker 3: someone really has their head up their butt or they're 189 00:10:21,360 --> 00:10:23,680 Speaker 3: super drunk, or there may be some other, you know, 190 00:10:24,200 --> 00:10:26,680 Speaker 3: genuine factor they can't help, they're not just banging right 191 00:10:26,720 --> 00:10:27,720 Speaker 3: into people constantly. 192 00:10:28,000 --> 00:10:31,640 Speaker 2: Yeah. That made me wonder, if it's electrons repelling each 193 00:10:31,679 --> 00:10:35,520 Speaker 2: other that keeps liquid particles from bumping into one another, I 194 00:10:35,600 --> 00:10:39,079 Speaker 2: wonder if the collective group of electrons in our bodies 195 00:10:39,600 --> 00:10:43,679 Speaker 2: is what makes us inherently avoid it, 196 00:10:43,760 --> 00:10:45,840 Speaker 2: almost a sixth sense for that kind of thing. 197 00:10:46,200 --> 00:10:46,960 Speaker 4: I like that, Eddie. 198 00:10:47,320 --> 00:10:49,560 Speaker 2: It's like the quantum explanation of swerving. 199 00:10:49,760 --> 00:10:50,960 Speaker 4: Yeah, a little hippie dippy. 200 00:10:51,320 --> 00:10:53,440 Speaker 2: So there are a couple things that we've figured out 201 00:10:53,480 --> 00:10:57,480 Speaker 2: about crowds, the way that they behave. Again, 202 00:10:57,520 --> 00:10:59,720 Speaker 2: this is spontaneous. I don't even know if we've said it, 203 00:10:59,760 --> 00:11:04,240 Speaker 2: so maybe not.
Again, it's spontaneous, it's unconscious typically, and 204 00:11:04,880 --> 00:11:09,520 Speaker 2: it's collective. Right? So we're actually moving cooperatively with other people, 205 00:11:10,080 --> 00:11:12,360 Speaker 2: whether we realize it or not. We think we're just 206 00:11:12,440 --> 00:11:14,559 Speaker 2: trying to make it to the exit because we want 207 00:11:14,600 --> 00:11:16,160 Speaker 2: to get to our car first so we can get 208 00:11:16,160 --> 00:11:18,400 Speaker 2: the heck out of the parking lot as soon as possible. 209 00:11:18,840 --> 00:11:23,200 Speaker 2: But we're actually unconsciously moving in conjunction with other people. 210 00:11:23,559 --> 00:11:27,240 Speaker 2: One of the really great ways that that expresses itself 211 00:11:27,320 --> 00:11:31,440 Speaker 2: is in lane formation, which is, well, it's exactly what 212 00:11:31,480 --> 00:11:32,440 Speaker 2: it sounds like, right. 213 00:11:33,559 --> 00:11:36,640 Speaker 3: Yeah, it's people kind of gathering and moving in one 214 00:11:36,720 --> 00:11:42,200 Speaker 3: direction together, forming a lane. No one's taking the lead 215 00:11:42,240 --> 00:11:45,520 Speaker 3: and saying, everyone, this is the United States, so we 216 00:11:45,640 --> 00:11:47,840 Speaker 3: generally walk down the right side of the hallway or 217 00:11:47,880 --> 00:11:50,680 Speaker 3: the corridor and other people on the left. That is 218 00:11:50,720 --> 00:11:52,640 Speaker 3: something when I've traveled abroad I had to get used 219 00:11:52,640 --> 00:11:55,200 Speaker 3: to, because I didn't realize the rules of the road 220 00:11:55,280 --> 00:11:59,199 Speaker 3: typically apply to moving around the world as well. So 221 00:11:59,760 --> 00:12:01,840 Speaker 3: I didn't know that until I went over to 222 00:12:02,000 --> 00:12:05,480 Speaker 3: England for the first time and was bumping into people constantly.
223 00:12:06,600 --> 00:12:09,559 Speaker 3: But yeah, here in the United States, it's generally right side, 224 00:12:09,720 --> 00:12:12,400 Speaker 3: left side, or I guess I didn't explain it well, 225 00:12:12,400 --> 00:12:14,360 Speaker 3: but you know, you move along the right side 226 00:12:14,400 --> 00:12:14,959 Speaker 3: of a hallway. 227 00:12:15,040 --> 00:12:18,520 Speaker 2: Sure. And that's just a difference in the electron spin 228 00:12:18,640 --> 00:12:21,120 Speaker 2: of people in Europe and the people in the US. 229 00:12:21,400 --> 00:12:25,040 Speaker 2: Oh man. I also mentioned orbital motion too, right? So 230 00:12:25,360 --> 00:12:29,520 Speaker 2: apparently in a crowd, people moving in the same direction, 231 00:12:30,200 --> 00:12:32,520 Speaker 2: or if you're in a crowd that's just basically in 232 00:12:32,559 --> 00:12:35,280 Speaker 2: one place, I think they studied the Festival of the 233 00:12:35,360 --> 00:12:38,679 Speaker 2: Running of the Bulls in Pamplona to get this information, 234 00:12:39,120 --> 00:12:43,319 Speaker 2: you basically move in a circle, an 235 00:12:43,400 --> 00:12:46,080 Speaker 2: orbit is what they call it, and you complete this 236 00:12:46,200 --> 00:12:49,880 Speaker 2: orbit in about eighteen seconds. And when I was reading this, 237 00:12:50,000 --> 00:12:52,199 Speaker 2: I'm like, that's just not true. And then I went 238 00:12:52,240 --> 00:12:56,200 Speaker 2: and watched video of this study, and yes, people just 239 00:12:56,320 --> 00:13:00,360 Speaker 2: move around in a tight circle. Basically, you're moving because 240 00:13:00,400 --> 00:13:04,240 Speaker 2: other people are moving, but ultimately you're keeping your 241 00:13:04,240 --> 00:13:08,000 Speaker 2: same space, this one orbital circle, which is awesome. 242 00:13:08,000 --> 00:13:11,400 Speaker 3: I mean, to be clear, because that sounded slightly confusing.
243 00:13:12,080 --> 00:13:15,480 Speaker 3: People are not walking in a tight circle. You're just 244 00:13:15,679 --> 00:13:20,000 Speaker 3: moving your body within a circular space, right, like maybe 245 00:13:20,080 --> 00:13:22,800 Speaker 3: raising your elbow to scratch your head or something like that. 246 00:13:22,920 --> 00:13:27,480 Speaker 2: Right. So imagine the person in front of you. 247 00:13:28,960 --> 00:13:30,839 Speaker 2: Imagine the person in front of you is backing up, 248 00:13:30,840 --> 00:13:32,640 Speaker 2: so you back up, and then you maybe move to 249 00:13:32,679 --> 00:13:34,360 Speaker 2: the right, and then they move this way and you 250 00:13:34,480 --> 00:13:37,160 Speaker 2: kind of come forward to your left, and then you 251 00:13:37,280 --> 00:13:38,880 Speaker 2: kind of back up a little bit to your left, 252 00:13:38,880 --> 00:13:41,240 Speaker 2: and then you back up center, like that. So you're 253 00:13:41,280 --> 00:13:43,160 Speaker 2: facing the same way the whole time, but you're just 254 00:13:43,240 --> 00:13:45,640 Speaker 2: shuffling your feet, and in the way that you're doing this, 255 00:13:45,720 --> 00:13:47,559 Speaker 2: you're ultimately creating an orbit. 256 00:13:48,520 --> 00:13:51,600 Speaker 3: Yeah, and I think that also applies to just taking 257 00:13:51,640 --> 00:13:53,319 Speaker 3: up your personal space with your general body. 258 00:13:53,679 --> 00:13:54,840 Speaker 2: Yes, the electrons. 259 00:13:56,120 --> 00:13:58,840 Speaker 3: So this works out pretty great, this idea of 260 00:13:58,840 --> 00:14:02,360 Speaker 3: social force, it seems to work pretty well, even 261 00:14:02,360 --> 00:14:06,000 Speaker 3: in big crowds. When it starts to get a little 262 00:14:06,480 --> 00:14:09,360 Speaker 3: more highly dense is when it's a problem. And this 263 00:14:09,520 --> 00:14:11,720 Speaker 3: is where you might get triggered. Like I was.
And 264 00:14:11,760 --> 00:14:14,960 Speaker 3: I don't even mind crowds, but when it started talking 265 00:14:15,200 --> 00:14:17,959 Speaker 3: about getting, like, denser and denser, I just found myself 266 00:14:17,960 --> 00:14:20,280 Speaker 3: getting a little, you know, my breathing sort of increased. 267 00:14:20,560 --> 00:14:21,200 Speaker 2: Yeah. 268 00:14:21,240 --> 00:14:24,240 Speaker 3: So at two people per square meter, a crowd moves 269 00:14:24,320 --> 00:14:27,160 Speaker 3: a little slower, because they're still trying to, you know, 270 00:14:27,200 --> 00:14:29,840 Speaker 3: get that distance between you and your friend next 271 00:14:29,840 --> 00:14:33,360 Speaker 3: to you. At four people, you're going to start to bump 272 00:14:33,400 --> 00:14:36,920 Speaker 3: around a little bit. At six people, things get truly difficult. 273 00:14:36,960 --> 00:14:39,920 Speaker 3: And at ten people per square meter, and I measured 274 00:14:39,920 --> 00:14:42,760 Speaker 3: that out on the floor, that made me panic a 275 00:14:42,760 --> 00:14:46,520 Speaker 3: little bit, that means individual movement is basically impossible, and 276 00:14:46,640 --> 00:14:50,360 Speaker 3: in a big crowd that's that dense, 277 00:14:51,240 --> 00:14:53,080 Speaker 3: something bad is likely to happen. 278 00:14:53,440 --> 00:14:56,520 Speaker 2: Yeah, this is the point where you can't raise your arms, 279 00:14:56,800 --> 00:15:00,360 Speaker 2: they're at your side. If a crush starts to happen, 280 00:15:00,440 --> 00:15:04,760 Speaker 2: you can't breathe, so you can die of asphyxiation.
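[Editor's note: the per-square-meter thresholds Chuck walks through (two, four, six, ten) can be summarized as a simple lookup. The band labels below paraphrase the episode, and the exact cutoffs are approximate, so treat this as an illustration rather than a safety standard:]

```python
def crowd_mobility(people_per_m2):
    """Rough mobility bands by crowd density, paraphrasing the
    thresholds discussed above (boundaries are approximate)."""
    if people_per_m2 < 2:
        return "free movement"
    elif people_per_m2 < 4:
        return "moving, but slower"
    elif people_per_m2 < 6:
        return "bumping into people"
    elif people_per_m2 < 10:
        return "movement truly difficult"
    else:
        return "individual movement basically impossible"
```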
281 00:15:05,320 --> 00:15:09,000 Speaker 2: There was a very famous tragedy in twenty twenty two 282 00:15:09,080 --> 00:15:13,920 Speaker 2: in Seoul, the Itaewon tragedy, where kids in their twenties, 283 00:15:14,080 --> 00:15:16,800 Speaker 2: healthy kids in their twenties, died of heart attacks because 284 00:15:16,840 --> 00:15:21,760 Speaker 2: they asphyxiated and their hearts stopped because of a crowd crush. Yeah. 285 00:15:21,760 --> 00:15:26,000 Speaker 3: And if you're wondering what this all looks like, we 286 00:15:26,040 --> 00:15:27,800 Speaker 3: didn't do Big Macs, we should do one in Big Macs, but 287 00:15:28,320 --> 00:15:31,360 Speaker 3: six people per square meter, where you know it's not tragic 288 00:15:31,400 --> 00:15:35,640 Speaker 3: but movement is difficult, that's about seventeen hundred people packed 289 00:15:35,680 --> 00:15:36,880 Speaker 3: onto a tennis court. 290 00:15:37,160 --> 00:15:38,720 Speaker 4: Yep. So that's a lot. 291 00:15:39,320 --> 00:15:42,680 Speaker 2: Yeah, and that was at six. We're talking ten is 292 00:15:42,800 --> 00:15:45,280 Speaker 2: when you can really be in trouble. Now, it is 293 00:15:45,360 --> 00:15:50,880 Speaker 2: not guaranteed that something's going to go wrong. In 294 00:15:51,000 --> 00:15:55,240 Speaker 2: Itaewon, this situation was going on 295 00:15:55,400 --> 00:15:58,520 Speaker 2: for an hour or so before it turned problematic. So 296 00:15:58,640 --> 00:16:00,480 Speaker 2: as long as the people in the back are doing 297 00:16:00,520 --> 00:16:03,440 Speaker 2: good and not pushing, as long as somebody in the 298 00:16:03,480 --> 00:16:07,560 Speaker 2: center doesn't faint and fall over or trip and fall down, 299 00:16:08,160 --> 00:16:11,240 Speaker 2: this can work.
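[Editor's note: the tennis-court figure is easy to sanity-check. Using the standard doubles playing area (23.77 m by 10.97 m, about 261 square meters; whether the episode's figure also counts run-off space around the lines is an assumption we can't verify), six people per square meter gives roughly fifteen to sixteen hundred people, in the same ballpark as the seventeen hundred quoted:]

```python
# Back-of-the-envelope check of the "tennis court" comparison.
# Dimensions are the standard doubles playing area; run-off space
# around the lines is not included (assumption).
COURT_LENGTH_M = 23.77
COURT_WIDTH_M = 10.97
DENSITY = 6  # people per square meter, the "truly difficult" level

area_m2 = COURT_LENGTH_M * COURT_WIDTH_M  # about 261 square meters
people = DENSITY * area_m2                # roughly 1,560 people
print(round(people))
```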
It's not inherently going to be deadly, 300 00:16:11,640 --> 00:16:14,440 Speaker 2: but the chances of it becoming deadly are just sitting 301 00:16:14,440 --> 00:16:17,160 Speaker 2: there, balancing on the razor's edge. At that point, it's 302 00:16:17,200 --> 00:16:18,760 Speaker 2: a really dangerous place to be. 303 00:16:19,320 --> 00:16:22,560 Speaker 3: Yeah, that one triggering incident is when it can 304 00:16:22,640 --> 00:16:23,600 Speaker 3: really go south there. 305 00:16:23,840 --> 00:16:24,880 Speaker 2: Yeah, for sure, it's scary. 306 00:16:25,640 --> 00:16:28,560 Speaker 3: Shall we take a break? Yes. All right, we'll be 307 00:16:28,680 --> 00:16:41,280 Speaker 3: right back with more crowds. 308 00:16:34,360 --> 00:16:58,560 Speaker 1: Stuff you should... 309 00:16:55,640 --> 00:16:58,960 Speaker 4: All right. So we covered, partially covered, physics. 310 00:16:59,040 --> 00:17:03,640 Speaker 3: We should talk a little bit about psychology, because crowds 311 00:17:03,800 --> 00:17:06,639 Speaker 3: have their own distinct psychology around them as well, and 312 00:17:07,280 --> 00:17:10,159 Speaker 3: people can behave in ways that they don't normally behave 313 00:17:10,240 --> 00:17:15,159 Speaker 3: in a crowd just because of that psychology. And early 314 00:17:15,200 --> 00:17:17,560 Speaker 3: on, the explanation for that was like, if there's a 315 00:17:17,640 --> 00:17:21,640 Speaker 3: violent crowd, then it was just full of violent people, right. 316 00:17:21,680 --> 00:17:26,560 Speaker 3: And there was a French psychologist named Gustave Le Bon in 317 00:17:26,600 --> 00:17:29,040 Speaker 3: eighteen ninety five who wrote the literal book on this, 318 00:17:29,200 --> 00:17:33,160 Speaker 3: like the first one, called The Crowd: A Study 319 00:17:33,200 --> 00:17:37,520 Speaker 3: of the Popular Mind, and it was very influential.
But 320 00:17:37,760 --> 00:17:40,239 Speaker 3: this is one of those that's a little outdated in 321 00:17:40,280 --> 00:17:41,160 Speaker 3: a lot of ways. 322 00:17:41,560 --> 00:17:44,760 Speaker 2: Yeah, it's right in some ways. But his whole thing was, like, 323 00:17:44,800 --> 00:17:48,800 Speaker 2: you have any crowd together, they're inherently mindless, they form 324 00:17:48,840 --> 00:17:52,360 Speaker 2: a collective mind, they become capable of anything, they're very suggestible, 325 00:17:52,520 --> 00:17:55,800 Speaker 2: they're going to kill everybody in their path, right? And 326 00:17:56,160 --> 00:17:59,000 Speaker 2: like, there are crowds that have done that before. Sure, 327 00:17:59,440 --> 00:18:04,560 Speaker 2: but that's not the inherent, I guess, trait of crowds. 328 00:18:04,560 --> 00:18:07,440 Speaker 2: Like crowds are actually the opposite. They're actually way more 329 00:18:07,440 --> 00:18:12,520 Speaker 2: peaceful and pro social than that. But Le Bon essentially set 330 00:18:12,680 --> 00:18:15,879 Speaker 2: the view of crowds that still persists today in a 331 00:18:15,920 --> 00:18:19,720 Speaker 2: lot of quarters. And he did. Yeah, he just had 332 00:18:19,720 --> 00:18:21,360 Speaker 2: a huge impact on this. 333 00:18:21,920 --> 00:18:22,120 Speaker 4: Yeah. 334 00:18:22,160 --> 00:18:27,679 Speaker 3: Absolutely. If you look at the different explanations over the 335 00:18:27,760 --> 00:18:31,240 Speaker 3: years of like this collective behavior, because collective behavior is 336 00:18:31,280 --> 00:18:32,000 Speaker 3: definitely happening. 337 00:18:32,359 --> 00:18:34,160 Speaker 5: Yeah, there are a few different theories. 338 00:18:34,520 --> 00:18:36,680 Speaker 3: One is contagion theory, and that was put forth 339 00:18:36,720 --> 00:18:41,480 Speaker 3: by Le Bon himself, which is that irrational behavior that 340 00:18:41,480 --> 00:18:45,080 Speaker 3: happens in a crowd spreads like a virus.
Yeah, that 341 00:18:45,160 --> 00:18:48,119 Speaker 3: can happen, and it also can't. So I'm sort of 342 00:18:48,160 --> 00:18:49,240 Speaker 3: fifty-fifty on that one. 343 00:18:49,640 --> 00:18:52,760 Speaker 2: Yeah. Again, like all of this stuff makes sense. It's 344 00:18:52,880 --> 00:18:56,440 Speaker 2: just that this is the most salacious, rare version 345 00:18:56,520 --> 00:18:59,600 Speaker 2: of crowds, right. Yeah, you can also make the case 346 00:18:59,680 --> 00:19:05,240 Speaker 2: like good, positive behavior can be contagious in a crowd 347 00:19:05,280 --> 00:19:08,240 Speaker 2: as well, because, like you said, people behave differently in crowds. 348 00:19:08,280 --> 00:19:09,240 Speaker 2: That's just a fact. 349 00:19:09,520 --> 00:19:11,400 Speaker 4: Yeah, like that dancing at Burning Man. 350 00:19:11,520 --> 00:19:15,480 Speaker 2: Yeah, exactly. There's also convergence theory, which also makes a 351 00:19:15,480 --> 00:19:19,639 Speaker 2: bit of sense, that explains why crowds have a collective mind, 352 00:19:19,920 --> 00:19:22,240 Speaker 2: and it just basically says that's because crowds are made 353 00:19:22,280 --> 00:19:25,920 Speaker 2: up of similar people. Like usually there's not a lot 354 00:19:25,960 --> 00:19:30,959 Speaker 2: of, like, mindless crowd behavior, say, on an elevator, because 355 00:19:31,000 --> 00:19:34,200 Speaker 2: everybody there didn't come together on the elevator to protest something. 356 00:19:34,600 --> 00:19:37,679 Speaker 2: But if you have a protest, there's way more potential 357 00:19:38,080 --> 00:19:42,520 Speaker 2: for collective mind behavior, because there's people who are there 358 00:19:42,520 --> 00:19:44,840 Speaker 2: for the same reason, so they're sharing kind of a 359 00:19:44,880 --> 00:19:46,600 Speaker 2: wavelength already when they get there.
360 00:19:47,000 --> 00:19:49,399 Speaker 3: Yeah, if you're at that concert together, you're there because 361 00:19:49,440 --> 00:19:53,560 Speaker 3: you want to see Bob Dylan in person, all right, 362 00:19:53,960 --> 00:19:55,160 Speaker 3: for the same reason. 363 00:19:55,080 --> 00:19:58,240 Speaker 2: Sure, Or you want to hear David Crosby harangue you. 364 00:19:58,800 --> 00:20:01,359 Speaker 5: Oh man, he will too. Or he would have. He did. 365 00:20:02,040 --> 00:20:06,639 Speaker 2: Oh yeah, Okay, so that's convergence. There's also group mind 366 00:20:06,680 --> 00:20:09,679 Speaker 2: theory, a little on the nose. It's basically saying like, 367 00:20:10,200 --> 00:20:14,720 Speaker 2: you lose your individual identity and it's replaced as a part, 368 00:20:14,880 --> 00:20:19,560 Speaker 2: a cog, of this larger group's identity. Yeah, yeah again, 369 00:20:20,080 --> 00:20:24,119 Speaker 2: I guess. Yeah, it's not totally off. Social identity theory 370 00:20:24,160 --> 00:20:28,640 Speaker 2: seems to be the prevalent, dominant view of, I guess, 371 00:20:28,720 --> 00:20:30,800 Speaker 2: kind of, deindividuation in crowds. 372 00:20:31,320 --> 00:20:33,880 Speaker 3: Yeah, that's where you're just acting like everyone else 373 00:20:33,960 --> 00:20:35,800 Speaker 3: is acting because you just want to fit in. 374 00:20:36,680 --> 00:20:38,399 Speaker 2: Yeah, you can't really put it better than that. 375 00:20:39,760 --> 00:20:43,000 Speaker 4: So how do you become a crowd member? There are 376 00:20:43,040 --> 00:20:44,359 Speaker 4: a few key concepts there. 377 00:20:44,720 --> 00:20:47,080 Speaker 2: First, you have to get training, go to school. 378 00:20:48,840 --> 00:20:51,800 Speaker 3: Got to go to crowd school, all right. Uh, And 379 00:20:51,840 --> 00:20:53,520 Speaker 3: the first thing you need in crowd school is a 380 00:20:53,520 --> 00:20:56,080 Speaker 3: lot of people in a very tight spot.
381 00:20:56,520 --> 00:20:57,600 Speaker 4: It's called panic school. 382 00:20:59,080 --> 00:21:01,360 Speaker 3: Feeling right now, by the way, I'm okay. And it's 383 00:21:01,400 --> 00:21:05,800 Speaker 3: weird, because I don't have, like, what's it called, claustrophobia necessarily. 384 00:21:08,400 --> 00:21:11,280 Speaker 3: I think it's more for me, just like I can't 385 00:21:11,280 --> 00:21:14,040 Speaker 3: wait to get out of that crowd because I just 386 00:21:14,040 --> 00:21:15,000 Speaker 3: want to have a little space. 387 00:21:15,160 --> 00:21:17,920 Speaker 5: Maybe I do have a little claustrophobia. 388 00:21:17,800 --> 00:21:19,480 Speaker 2: Yeah, I mean it sounds like it, but I think 389 00:21:19,520 --> 00:21:20,840 Speaker 2: everybody does to a degree. 390 00:21:20,840 --> 00:21:21,640 Speaker 4: It's just I guess. 391 00:21:22,840 --> 00:21:25,480 Speaker 3: Yeah, but I've never, you know, I can accept hugs. 392 00:21:25,520 --> 00:21:27,680 Speaker 3: And I was never one of those kids like under 393 00:21:27,680 --> 00:21:29,240 Speaker 3: the dog pile that was like freaking out or 394 00:21:29,200 --> 00:21:29,880 Speaker 4: anything like that. 395 00:21:30,040 --> 00:21:31,560 Speaker 2: Oh yeah, I did not like that. 396 00:21:32,040 --> 00:21:33,880 Speaker 5: Yeah, well you have a little claustrophobia, right? 397 00:21:34,800 --> 00:21:38,680 Speaker 2: Yeah, a little bit. I've kind of outgrown it a lot. 398 00:21:38,760 --> 00:21:40,679 Speaker 2: But yeah, if you put me in like a sewer 399 00:21:40,720 --> 00:21:45,520 Speaker 2: culvert, or hearing about like a caving accident, I mean 400 00:21:45,640 --> 00:21:48,080 Speaker 2: I'll start... Yeah, I can't handle that. 401 00:21:48,440 --> 00:21:50,200 Speaker 4: Well, maybe you have a fear of dying. 402 00:21:51,119 --> 00:21:52,119 Speaker 2: That could be it. 403 00:21:52,400 --> 00:21:55,400 Speaker 5: That might be what's going on, which is rational.
404 00:21:56,240 --> 00:21:57,480 Speaker 4: But what are we talking about? 405 00:21:57,480 --> 00:22:01,800 Speaker 3: We're talking about the three key concepts that form these 406 00:22:01,800 --> 00:22:06,440 Speaker 3: foundations of collective behavior, and they are... d... oh man, 407 00:22:06,560 --> 00:22:13,720 Speaker 3: deindividuation, yeah, emotional contagion, and suggestibility. And that first one, 408 00:22:13,800 --> 00:22:18,280 Speaker 3: deindividuation, is basically, like, kind of what you were 409 00:22:18,320 --> 00:22:21,879 Speaker 3: hinting at earlier: there's this new social identity and 410 00:22:22,000 --> 00:22:24,639 Speaker 3: your individual identity is taking a break. 411 00:22:25,680 --> 00:22:28,159 Speaker 2: Yeah, And this is a good example of how this 412 00:22:28,240 --> 00:22:30,600 Speaker 2: stuff can kind of make sense, but it's also like 413 00:22:30,640 --> 00:22:33,080 Speaker 2: you really just paid attention to the worst part of it. 414 00:22:34,080 --> 00:22:37,440 Speaker 2: And this came from Philip Zimbardo. Remember him from the 415 00:22:37,480 --> 00:22:41,600 Speaker 2: Stanford Prison experiment. Oh yeah, that was all about deindividuation. 416 00:22:41,760 --> 00:22:45,280 Speaker 2: So yeah, he focused some work on crowds, and essentially 417 00:22:45,320 --> 00:22:47,960 Speaker 2: what he said is that, yeah, you just basically leave 418 00:22:48,040 --> 00:22:50,520 Speaker 2: your own identity at the door. You take on a 419 00:22:50,600 --> 00:22:55,240 Speaker 2: new identity of crowd member, group member, and basically whatever 420 00:22:55,280 --> 00:22:59,640 Speaker 2: the group's up for, we're up for, too. You feel anonymous, 421 00:23:00,119 --> 00:23:04,399 Speaker 2: you feel unidentified.
You also feel connected to those other people, 422 00:23:04,840 --> 00:23:07,600 Speaker 2: and so if those other people start, you know, looting 423 00:23:07,720 --> 00:23:12,560 Speaker 2: or something like that, you would probably never loot by yourself, right, 424 00:23:12,760 --> 00:23:15,760 Speaker 2: But since you're in that group, you've lost your individual 425 00:23:15,800 --> 00:23:19,440 Speaker 2: identity that would prevent you from looting. Now it's like, well, yeah, 426 00:23:19,480 --> 00:23:21,639 Speaker 2: I mean I'm part of this group and we're looting, 427 00:23:21,680 --> 00:23:22,680 Speaker 2: so let's get to it. 428 00:23:23,480 --> 00:23:26,439 Speaker 3: I think that's fascinating and I think that is so 429 00:23:26,600 --> 00:23:30,200 Speaker 3: true and happens all the time, and we've seen it 430 00:23:30,240 --> 00:23:33,600 Speaker 3: all across. Like, I feel like in recent years, 431 00:23:33,680 --> 00:23:38,560 Speaker 3: especially with various protests and, you know, things that people 432 00:23:38,680 --> 00:23:41,560 Speaker 3: might call riots, things that people might call insurrections, where 433 00:23:41,760 --> 00:23:45,560 Speaker 3: people that normally wouldn't behave in a certain way, like, Hey, 434 00:23:45,880 --> 00:23:49,480 Speaker 3: I was going down to the Capitol Building to voice 435 00:23:49,520 --> 00:23:51,919 Speaker 3: my opinion about how this country is being run, and 436 00:23:52,400 --> 00:23:54,280 Speaker 3: next thing you know, I'm beating a cop down with 437 00:23:54,320 --> 00:23:56,919 Speaker 3: a flagpole, and normally I would not do something 438 00:23:56,680 --> 00:23:59,040 Speaker 2: Like that, exactly. And that's a great example of that.
439 00:23:59,160 --> 00:24:02,560 Speaker 2: On the other side of it, you might never dance 440 00:24:02,760 --> 00:24:06,320 Speaker 2: with your shirt off, wearing nothing but a loincloth. Burning 441 00:24:06,359 --> 00:24:09,600 Speaker 2: Man, yes. But in a different situation, you could be 442 00:24:09,880 --> 00:24:13,119 Speaker 2: having just one of the greatest moments of your life, totally. 443 00:24:13,160 --> 00:24:16,119 Speaker 2: And that's not just the ecstasy speaking, friend, that is 444 00:24:16,200 --> 00:24:18,600 Speaker 2: you being a part of a crowd and feeling that 445 00:24:18,680 --> 00:24:22,840 Speaker 2: kind of exhilaration of being connected to something larger 446 00:24:22,880 --> 00:24:26,640 Speaker 2: than yourself. So it's the same thing in a sense. 447 00:24:26,680 --> 00:24:30,320 Speaker 2: You're deindividuated one way or another. It's just how is 448 00:24:30,359 --> 00:24:32,880 Speaker 2: it going? Is it going positive? Is it going negative? 449 00:24:33,080 --> 00:24:34,320 Speaker 2: Or is it even neutral? 450 00:24:34,960 --> 00:24:35,480 Speaker 4: Yeah? 451 00:24:35,640 --> 00:24:38,119 Speaker 3: Yeah, I think this is all just very fascinating. The 452 00:24:38,119 --> 00:24:42,320 Speaker 3: second one we mentioned was emotional contagion, and that is 453 00:24:42,480 --> 00:24:45,119 Speaker 3: just the fact that, you know, emotions in a crowd 454 00:24:45,119 --> 00:24:47,800 Speaker 3: are heightened anyway if you're there at that concert or 455 00:24:47,840 --> 00:24:52,400 Speaker 3: that protest, you're highly emotional, probably, for whatever reason you're there, 456 00:24:53,240 --> 00:24:54,879 Speaker 3: or if your team just won the big game and 457 00:24:54,920 --> 00:24:59,520 Speaker 3: you're leaving in a big crowd, so you're aware of 458 00:24:59,600 --> 00:25:05,200 Speaker 3: the people around you.
But there's that one person, and 459 00:25:05,640 --> 00:25:08,480 Speaker 3: you've seen it before in any of those circumstances, where 460 00:25:08,520 --> 00:25:13,439 Speaker 3: one person again triggers something because they're extra emo, and 461 00:25:13,480 --> 00:25:15,720 Speaker 3: all of a sudden, everyone rises to meet that level 462 00:25:15,720 --> 00:25:16,159 Speaker 3: of emotion. 463 00:25:16,800 --> 00:25:18,720 Speaker 2: Yeah. One thing I saw, though, I think it's part 464 00:25:18,720 --> 00:25:21,520 Speaker 2: of social identity theory, is that it depends on the 465 00:25:21,520 --> 00:25:23,400 Speaker 2: mood of the group or the norms of the group. 466 00:25:23,480 --> 00:25:25,879 Speaker 2: So if it's like a group that is generally saying like, 467 00:25:25,920 --> 00:25:28,920 Speaker 2: we're a peaceful protest, somebody can come up and throw 468 00:25:28,960 --> 00:25:32,120 Speaker 2: a Molotov cocktail and everybody else in the group's gonna 469 00:25:32,119 --> 00:25:33,960 Speaker 2: look at them like, what the heck are you doing? Yeah, 470 00:25:34,000 --> 00:25:37,440 Speaker 2: they will isolate that person, ostracize that person, and continue 471 00:25:37,440 --> 00:25:40,439 Speaker 2: on with their peaceful protest. But if the norm of 472 00:25:40,480 --> 00:25:43,520 Speaker 2: the group is like, yeah, we're being, like, repressed here, 473 00:25:43,560 --> 00:25:46,200 Speaker 2: and somebody throws a Molotov cocktail, there's a good chance 474 00:25:46,280 --> 00:25:50,440 Speaker 2: that that crowd will change its norms to include throwing 475 00:25:50,480 --> 00:25:52,840 Speaker 2: Molotov cocktails and more people will join in. 476 00:25:53,280 --> 00:25:56,159 Speaker 3: Yeah, I mean you used to hang out in a 477 00:25:56,200 --> 00:25:58,640 Speaker 3: moshpit or two back in the day, I'm sure, right? Yeah, 478 00:25:58,680 --> 00:26:01,600 Speaker 3: here and there, the moshpit.
I mean, I think they 479 00:26:01,600 --> 00:26:03,960 Speaker 3: could be, and they probably have been, studied within 480 00:26:04,000 --> 00:26:07,280 Speaker 3: crowd dynamics, because it's sort of just a microcosm. But 481 00:26:07,800 --> 00:26:10,800 Speaker 3: I remember in those days, and I wasn't like, oh man, 482 00:26:10,840 --> 00:26:13,040 Speaker 3: I can't wait to get in that mosh pit, but 483 00:26:13,359 --> 00:26:15,679 Speaker 3: if one broke out around me, I would find myself 484 00:26:15,720 --> 00:26:17,840 Speaker 3: in it, like, having a good time, and the same 485 00:26:17,840 --> 00:26:20,320 Speaker 3: thing would happen. I was in moshpits where there was 486 00:26:20,359 --> 00:26:24,399 Speaker 3: one jerk throwing elbows and everyone's like, get out of here, dude. 487 00:26:24,880 --> 00:26:26,800 Speaker 4: And then I've been in others where all of a 488 00:26:26,840 --> 00:26:30,160 Speaker 4: sudden that one dude triggered another dude, and then all 489 00:26:30,160 --> 00:26:31,080 Speaker 4: of a sudden it got 490 00:26:31,000 --> 00:26:33,080 Speaker 5: a little scary in there. And that's when 491 00:26:33,000 --> 00:26:33,880 Speaker 4: Chuck took a walk. 492 00:26:34,000 --> 00:26:35,000 Speaker 2: Yeah, that's a good idea. 493 00:26:35,359 --> 00:26:35,760 Speaker 4: Yeah. 494 00:26:35,800 --> 00:26:38,439 Speaker 2: And then also there's often that jerk that's like 495 00:26:38,520 --> 00:26:40,920 Speaker 2: not even in the pit. He just pushes people in 496 00:26:41,160 --> 00:26:42,800 Speaker 2: who weren't planning on joining. 497 00:26:43,080 --> 00:26:44,679 Speaker 4: Yeah, that should be a choice. 498 00:26:44,680 --> 00:26:46,120 Speaker 2: That guy needs to soak his head. 499 00:26:46,720 --> 00:26:47,000 Speaker 4: Yeah.
500 00:26:47,400 --> 00:26:50,240 Speaker 3: I had a quick story here about the first, or 501 00:26:50,280 --> 00:26:52,000 Speaker 3: not the first, Lollapalooza, the one with the Beastie 502 00:26:52,000 --> 00:26:57,080 Speaker 3: Boys, at Lakewood Amphitheater, big outdoor amphitheater; on the lawn, 503 00:26:58,040 --> 00:26:59,600 Speaker 3: the mosh pit behind us. 504 00:26:59,640 --> 00:27:01,919 Speaker 4: I was in this one, but it got so big. 505 00:27:02,520 --> 00:27:04,640 Speaker 3: How big was it? It was about a 506 00:27:04,680 --> 00:27:07,760 Speaker 3: third of the size of the amphitheater lawn, and it 507 00:27:07,800 --> 00:27:11,040 Speaker 3: was swirling in a circle, and people started throwing up 508 00:27:11,119 --> 00:27:15,000 Speaker 3: their cups and stuff, and the trash around them, 509 00:27:15,040 --> 00:27:20,600 Speaker 3: and that stuff formed a whirlpool, a tunnel of trash. Wow, 510 00:27:21,080 --> 00:27:23,679 Speaker 3: as if a tornado was taking it up. 511 00:27:23,920 --> 00:27:26,880 Speaker 5: That's amazing, or at least that's what I saw. 512 00:27:27,400 --> 00:27:28,960 Speaker 4: That's, you know what I mean. 513 00:27:29,080 --> 00:27:31,040 Speaker 2: That's a world class mosh pit right there. 514 00:27:31,359 --> 00:27:31,719 Speaker 4: It was. 515 00:27:32,160 --> 00:27:33,960 Speaker 3: I've never seen anything like it. And I've been around 516 00:27:34,000 --> 00:27:35,600 Speaker 3: a lot of big shows like that, and I've never 517 00:27:35,640 --> 00:27:36,560 Speaker 3: seen anything like that. 518 00:27:37,119 --> 00:27:39,720 Speaker 2: That's like that part from American Beauty where the kid 519 00:27:39,800 --> 00:27:42,440 Speaker 2: sees the plastic bag caught up in the little whirlwind. 520 00:27:42,480 --> 00:27:44,520 Speaker 2: It's like, this is the most beautiful thing I've ever seen.
521 00:27:44,600 --> 00:27:46,359 Speaker 4: Yeah, I used to think that was the best movie 522 00:27:46,359 --> 00:27:48,760 Speaker 4: ever. I don't think it's aged well. Not just... 523 00:27:50,240 --> 00:27:52,320 Speaker 4: not just because of Spacey. The whole thing is kind 524 00:27:52,320 --> 00:27:53,000 Speaker 4: of corny to me, 525 00:27:53,040 --> 00:27:54,800 Speaker 2: now when I think about it. Well, I'm not gonna 526 00:27:54,800 --> 00:27:59,359 Speaker 2: watch it again then. Yeah, okay, there's one more, I 527 00:27:59,600 --> 00:28:04,160 Speaker 2: guess, aspect that you mentioned too, which is suggestibility, and 528 00:28:04,359 --> 00:28:06,720 Speaker 2: this is, I think, another Le Bon thing, and Le Bon, by the way, 529 00:28:06,720 --> 00:28:08,480 Speaker 2: as far as I can tell, is not related to 530 00:28:08,520 --> 00:28:14,480 Speaker 2: Simon Le Bon. Huh. Yeah, but Le Bon basically was like, they're 531 00:28:14,520 --> 00:28:19,360 Speaker 2: deindividuated, they are basically capable of anything, and if 532 00:28:19,400 --> 00:28:21,960 Speaker 2: they're like that, you could tell them whatever and they'll 533 00:28:21,960 --> 00:28:25,399 Speaker 2: go do it. This definitely, I mean, just from recent 534 00:28:25,480 --> 00:28:30,760 Speaker 2: experience, like, this can happen, but it typically requires a 535 00:28:30,800 --> 00:28:34,480 Speaker 2: central leader or organizer that people are looking to, 536 00:28:35,600 --> 00:28:39,920 Speaker 2: a charismatic speaker, somebody who can actually tap into that 537 00:28:40,080 --> 00:28:43,120 Speaker 2: collective mind and push it one direction or another. That 538 00:28:43,280 --> 00:28:46,960 Speaker 2: is possible, but again, that is fairly rare, 539 00:28:47,120 --> 00:28:51,160 Speaker 2: when something like that happens, but again, it is possible. 540 00:28:51,200 --> 00:28:52,640 Speaker 2: It does happen sometimes.
541 00:28:53,000 --> 00:28:55,880 Speaker 4: Well, I mean, I don't think we mentioned earlier: Hitler 542 00:28:55,920 --> 00:28:58,240 Speaker 4: and Mussolini both studied the works of Le Bon. 543 00:28:58,400 --> 00:29:01,560 Speaker 2: Mm, yeah, and I mean, like, they mobilized it to 544 00:29:01,600 --> 00:29:04,560 Speaker 2: their own ends. They used it to generate nationalism and 545 00:29:04,640 --> 00:29:09,640 Speaker 2: xenophobia and essentially create, like, their own fascist states. Because 546 00:29:09,920 --> 00:29:13,080 Speaker 2: it does work, if you do it right and you 547 00:29:13,240 --> 00:29:15,800 Speaker 2: have a crowd in the right mindset. 548 00:29:16,280 --> 00:29:20,200 Speaker 5: That's right, that's nuts, dude. Should we break or keep going? 549 00:29:21,480 --> 00:29:22,480 Speaker 2: I say we. 550 00:29:25,680 --> 00:29:26,640 Speaker 4: All right, we can break. 551 00:29:27,760 --> 00:29:49,320 Speaker 6: You have my permission. Stuff you should know. 552 00:29:54,080 --> 00:29:57,520 Speaker 2: So one of the great things, Chuck, about studying crowds 553 00:29:57,640 --> 00:30:00,960 Speaker 2: is that you can figure out what makes crowds 554 00:30:01,000 --> 00:30:03,840 Speaker 2: do what, and they have in a lot of ways, 555 00:30:03,880 --> 00:30:06,200 Speaker 2: not just how they behave, like the physics, or what the 556 00:30:06,240 --> 00:30:10,680 Speaker 2: psychology is behind crowds, but like what triggers crowds to, say, 557 00:30:10,840 --> 00:30:13,760 Speaker 2: change their mood. Because, remember, I said, like, depending on 558 00:30:13,800 --> 00:30:16,720 Speaker 2: the norms, they may or may not behave violently 559 00:30:16,840 --> 00:30:20,200 Speaker 2: or unlawfully, and that can change with the same crowd 560 00:30:20,240 --> 00:30:23,640 Speaker 2: depending on the circumstances.
And a 561 00:30:23,680 --> 00:30:27,120 Speaker 2: good example of that is, if you're 562 00:30:27,640 --> 00:30:30,840 Speaker 2: a law enforcement officer and you're there, and 563 00:30:30,880 --> 00:30:34,760 Speaker 2: basically your whole squad is like, it's a protest, 564 00:30:34,800 --> 00:30:36,480 Speaker 2: so of course they're going to turn violent and they're 565 00:30:36,480 --> 00:30:38,080 Speaker 2: going to throw rocks at us and stuff like that. 566 00:30:38,400 --> 00:30:41,520 Speaker 2: Just being primed to believe that actually makes it more 567 00:30:41,720 --> 00:30:45,920 Speaker 2: likely that the crowd will behave that way. It's 568 00:30:46,000 --> 00:30:50,120 Speaker 2: like we have it reversed. So, like you said, 569 00:30:50,200 --> 00:30:54,360 Speaker 2: just the presence of a law enforcement group or 570 00:30:54,360 --> 00:30:58,280 Speaker 2: force in riot gear makes it more likely that a 571 00:30:58,360 --> 00:30:59,600 Speaker 2: crowd will turn violent. 572 00:31:00,600 --> 00:31:04,760 Speaker 3: Yeah, for sure. They have this new, I don't know 573 00:31:05,520 --> 00:31:07,920 Speaker 3: how widespread it is yet, but one of the newer 574 00:31:07,960 --> 00:31:11,280 Speaker 3: developments in this kind of, quote unquote, crowd control is 575 00:31:11,320 --> 00:31:14,640 Speaker 3: something called the dialogue team.
Yeah, and that's when you 576 00:31:14,760 --> 00:31:18,240 Speaker 3: get your nicest cop, I guess, yeah, and make them 577 00:31:18,280 --> 00:31:22,200 Speaker 3: the public-facing face of that unit, and they put 578 00:31:22,240 --> 00:31:26,600 Speaker 3: them out in the lead and they communicate their purpose there, 579 00:31:27,040 --> 00:31:31,160 Speaker 3: their presence there, how they'd like things to go, and 580 00:31:31,200 --> 00:31:33,920 Speaker 3: apparently it works a lot better than just 581 00:31:33,960 --> 00:31:36,600 Speaker 3: showing up and standing there, beating that billy club into 582 00:31:36,600 --> 00:31:39,480 Speaker 3: your hand and staring down at people in all your 583 00:31:39,480 --> 00:31:42,280 Speaker 3: heavy gear. Apparently that has quite a calming 584 00:31:42,320 --> 00:31:45,960 Speaker 3: effect on crowds, and can even make them see the police presence 585 00:31:46,040 --> 00:31:49,920 Speaker 3: as a little more benevolent and cause the crowd to 586 00:31:50,080 --> 00:31:51,400 Speaker 3: maybe self-police a little more. 587 00:31:51,760 --> 00:31:55,000 Speaker 2: Yeah. Yeah. So the chances of the crowd turning ugly 588 00:31:55,240 --> 00:31:59,120 Speaker 2: are way decreased when you have that kind of, 589 00:31:59,240 --> 00:32:02,680 Speaker 2: like you said, public-facing kind of law enforcement. Like, 590 00:32:02,840 --> 00:32:05,520 Speaker 2: you still have the riot gear. You may even still 591 00:32:05,560 --> 00:32:08,800 Speaker 2: have a line of guys with assault rifles with rubber 592 00:32:08,840 --> 00:32:13,240 Speaker 2: bullets, and there's, like, the MRAP off in 593 00:32:13,320 --> 00:32:16,440 Speaker 2: the distance, but that's not what you have up against 594 00:32:16,440 --> 00:32:20,800 Speaker 2: the crowd.
You have friendly officers that are explaining that 595 00:32:20,800 --> 00:32:23,040 Speaker 2: they're there to protect your First Amendment rights but also 596 00:32:23,120 --> 00:32:26,040 Speaker 2: keep you safe. And yeah, like you said, self-police. 597 00:32:26,080 --> 00:32:29,160 Speaker 2: Getting a crowd to police itself, yeah, I mean, 598 00:32:29,360 --> 00:32:32,760 Speaker 2: that's just the pinnacle of crowd control, totally. And it 599 00:32:32,800 --> 00:32:36,479 Speaker 2: isn't very widespread here, but it's a model that Europe 600 00:32:36,560 --> 00:32:39,440 Speaker 2: as a whole has adopted since the early two thousands, 601 00:32:39,480 --> 00:32:43,000 Speaker 2: the aughts. Yeah, and so it's starting to spread 602 00:32:43,040 --> 00:32:46,720 Speaker 2: over here, but it feels like it's more 603 00:32:47,320 --> 00:32:50,520 Speaker 2: being handed over from Europe to the academics in 604 00:32:50,840 --> 00:32:54,400 Speaker 2: America, who are trying to give it to law enforcement 605 00:32:54,440 --> 00:32:56,240 Speaker 2: in America to take and run with. 606 00:32:56,840 --> 00:32:59,480 Speaker 3: Yeah, and the key to any, you know, regardless of 607 00:32:59,480 --> 00:33:02,600 Speaker 3: the dialogue team out in front, the key to any 608 00:33:02,760 --> 00:33:06,360 Speaker 3: sort of peaceful situation in a protest and dealing with 609 00:33:06,400 --> 00:33:09,800 Speaker 3: cops is, these cops have to be trained, like, 610 00:33:10,000 --> 00:33:10,680 Speaker 3: super trained. 611 00:33:11,160 --> 00:33:11,880 Speaker 4: Training is the 612 00:33:11,920 --> 00:33:17,920 Speaker 3: key to every officer interaction that they take.
And most 613 00:33:17,960 --> 00:33:21,239 Speaker 3: of these big tragedies that you can point to, and 614 00:33:21,280 --> 00:33:24,000 Speaker 3: not just in protests, but just any kind of these 615 00:33:24,280 --> 00:33:29,920 Speaker 3: crushing tragedies, yeah, are the result of untrained officers in 616 00:33:29,960 --> 00:33:30,719 Speaker 3: a lot of cases. 617 00:33:30,880 --> 00:33:33,480 Speaker 2: Yeah. Sometimes even the people who are in charge of 618 00:33:33,560 --> 00:33:35,960 Speaker 2: all of the police there might not be trained in 619 00:33:36,000 --> 00:33:39,560 Speaker 2: what they're doing. Like the Hillsborough tragedy in Sheffield in, I think, 620 00:33:39,680 --> 00:33:42,720 Speaker 2: nineteen eighty nine. The police were blamed for that, although 621 00:33:42,720 --> 00:33:45,400 Speaker 2: the police initially blamed soccer hooligans, and it was like, no, 622 00:33:45,680 --> 00:33:48,720 Speaker 2: the cops who were in charge of this had no 623 00:33:49,120 --> 00:33:51,200 Speaker 2: reason to be in charge of this. They had no 624 00:33:51,320 --> 00:33:55,120 Speaker 2: training whatsoever. Yeah. So that is a huge one for sure. 625 00:33:56,720 --> 00:33:59,120 Speaker 3: A couple of sociologists came along to do a lot 626 00:33:59,120 --> 00:34:03,160 Speaker 3: of the sort of official refutation of Le Bon in the 627 00:34:03,200 --> 00:34:07,360 Speaker 3: twentieth century, and it was the Italian guy. 628 00:34:07,400 --> 00:34:10,040 Speaker 4: You mentioned this guy, Enrico. 629 00:34:10,920 --> 00:34:13,120 Speaker 2: I didn't. Did I mention an Italian guy? 630 00:34:13,160 --> 00:34:14,399 Speaker 4: I thought you mentioned an Italian guy.
631 00:34:14,440 --> 00:34:16,920 Speaker 3: But regardless, whether he's showing up for the first time 632 00:34:17,000 --> 00:34:21,680 Speaker 3: or the second, okay: a guy named Enrico Quarantelli. He 633 00:34:21,719 --> 00:34:24,439 Speaker 3: looked at a lot of emergency evacuations and studied them, 634 00:34:24,640 --> 00:34:29,200 Speaker 3: and he basically came to the conclusion that usually people 635 00:34:29,239 --> 00:34:33,160 Speaker 3: flee from these things, obviously, because it's the sensible thing 636 00:34:33,200 --> 00:34:37,359 Speaker 3: to do, but it's not necessarily a panicked, irrational group 637 00:34:37,360 --> 00:34:40,840 Speaker 3: of people. They do so generally in an orderly fashion. 638 00:34:41,040 --> 00:34:43,840 Speaker 2: Yeah, let's talk about that. Let's talk about emergency identity, 639 00:34:43,880 --> 00:34:46,960 Speaker 2: because one of the things, one of the ways that 640 00:34:46,960 --> 00:34:50,120 Speaker 2: crowds kind of create, like, a different 641 00:34:50,440 --> 00:34:53,399 Speaker 2: way of thinking is... Let's say you have a group 642 00:34:53,440 --> 00:34:56,640 Speaker 2: of people on a subway, and the subway, like, it's 643 00:34:56,719 --> 00:35:00,520 Speaker 2: running normally. They are just a group of strangers trying 644 00:35:00,560 --> 00:35:02,520 Speaker 2: not to make eye contact with one another, on their 645 00:35:02,560 --> 00:35:04,720 Speaker 2: way home, on their way to work, on their way wherever. 646 00:35:05,360 --> 00:35:07,840 Speaker 2: They have no affiliation with one another aside from the 647 00:35:07,880 --> 00:35:10,680 Speaker 2: fact that they're currently sharing space with one another. If 648 00:35:10,719 --> 00:35:15,880 Speaker 2: that subway breaks down, those strangers become a group almost 649 00:35:15,920 --> 00:35:20,719 Speaker 2: instantly once they realize, okay, we're stuck.
The human tendency 650 00:35:20,800 --> 00:35:24,680 Speaker 2: that recent scholarship has found is to come together as 651 00:35:24,680 --> 00:35:27,640 Speaker 2: a group, help one another, start to actually care what 652 00:35:27,760 --> 00:35:31,520 Speaker 2: happens to every other member of the group, trust one another, 653 00:35:31,760 --> 00:35:35,719 Speaker 2: treat one another with respect, and that happens in 654 00:35:35,840 --> 00:35:40,840 Speaker 2: disaster after disaster after disaster, or subway breakdown after subway breakdown. 655 00:35:40,960 --> 00:35:44,319 Speaker 2: There was another sociologist, I don't have his name, but 656 00:35:44,440 --> 00:35:48,200 Speaker 2: he studied like one hundred and eighty different peacetime disasters 657 00:35:48,400 --> 00:35:51,960 Speaker 2: and basically found, yeah, there's jerks in every crowd, but 658 00:35:52,239 --> 00:35:55,000 Speaker 2: for the most part, the vast majority of 659 00:35:55,040 --> 00:35:58,919 Speaker 2: people act prosocially and positively to come together as 660 00:35:58,960 --> 00:36:01,320 Speaker 2: a group in the face of a disaster. 661 00:36:01,920 --> 00:36:05,400 Speaker 3: Yeah, that was Charles Fritz. And I couldn't help but 662 00:36:05,400 --> 00:36:09,920 Speaker 3: think of Elaine Benes the second you mentioned the subway stopping. 663 00:36:10,440 --> 00:36:11,680 Speaker 2: Oh, I don't remember that one. 664 00:36:11,920 --> 00:36:14,800 Speaker 3: Oh, it was the great Seinfeld episode where Elaine was 665 00:36:14,840 --> 00:36:18,680 Speaker 3: stuck on an out-of-service, malfunctioning subway. 666 00:36:18,880 --> 00:36:19,560 Speaker 4: It was pretty great.
667 00:36:19,920 --> 00:36:22,919 Speaker 3: I had a great experience in New York years ago, 668 00:36:22,960 --> 00:36:24,760 Speaker 3: and I might have told this one before, but where 669 00:36:25,560 --> 00:36:27,520 Speaker 3: there was a guy kind of stomping up and down 670 00:36:27,560 --> 00:36:31,160 Speaker 3: the subway platform saying he was going to shove someone 671 00:36:32,120 --> 00:36:37,160 Speaker 3: onto the oncoming train, and just instinctively, a group of 672 00:36:37,360 --> 00:36:42,799 Speaker 3: twenty New Yorkers and me went and gathered together in 673 00:36:42,840 --> 00:36:46,400 Speaker 3: a group with a mother and her baby at the 674 00:36:46,400 --> 00:36:48,160 Speaker 3: center of that atom. 675 00:36:48,480 --> 00:36:49,880 Speaker 4: Wow, at the nucleus. 676 00:36:49,920 --> 00:36:53,560 Speaker 3: And it wasn't... no one said, all right, everyone, let's 677 00:36:53,600 --> 00:36:57,040 Speaker 3: get together here. We all just stood together, just 678 00:36:57,239 --> 00:37:00,759 Speaker 3: wandered closer to one another, probably all making our 679 00:37:00,800 --> 00:37:03,480 Speaker 3: way toward that woman and her baby. And before you 680 00:37:03,560 --> 00:37:05,600 Speaker 3: know it, there was a big group of us standing 681 00:37:05,640 --> 00:37:08,560 Speaker 3: in a circle. And that's the one where I went 682 00:37:08,640 --> 00:37:13,520 Speaker 3: and met the cops upstairs and led them to the guy. 683 00:37:14,000 --> 00:37:14,320 Speaker 2: Awesome. 684 00:37:14,840 --> 00:37:17,720 Speaker 3: It was pretty obvious who it was, but I was like, hey, 685 00:37:17,760 --> 00:37:20,800 Speaker 3: he's down here, and boy, New York City's finest. 686 00:37:21,440 --> 00:37:24,920 Speaker 4: They acted with intention, that's all I'll say. 687 00:37:25,000 --> 00:37:29,160 Speaker 2: I see, yeah, so what you just said.
That story 688 00:37:29,320 --> 00:37:35,720 Speaker 2: undermines a very famous, misguided idea about crowds, the bystander effect, 689 00:37:36,120 --> 00:37:38,120 Speaker 2: which we talked about at great length in our episode 690 00:37:39,000 --> 00:37:40,440 Speaker 2: on Kitty Genovese. 691 00:37:40,880 --> 00:37:41,399 Speaker 4: That's right. 692 00:37:41,560 --> 00:37:44,239 Speaker 2: But one study I saw recently, they were 693 00:37:44,239 --> 00:37:46,800 Speaker 2: studying all sorts of fights that were caught on like 694 00:37:48,120 --> 00:37:51,560 Speaker 2: security cameras or whatever. Uh huh, and I think in ninety 695 00:37:51,600 --> 00:37:56,600 Speaker 2: percent of these fights at least one person intervened. Most 696 00:37:56,600 --> 00:38:00,160 Speaker 2: of the time more than one person intervened. And the 697 00:38:00,160 --> 00:38:02,880 Speaker 2: bystander effect says the more people there are, the less 698 00:38:02,920 --> 00:38:05,600 Speaker 2: likely anyone is to act to help. They found the 699 00:38:05,640 --> 00:38:08,279 Speaker 2: opposite was true. The more people there were, the more 700 00:38:08,360 --> 00:38:12,719 Speaker 2: people helped intervene in this violent fight. So that whole 701 00:38:12,760 --> 00:38:17,240 Speaker 2: idea that people just don't do that is actually untrue. 702 00:38:17,280 --> 00:38:20,080 Speaker 2: It's based on misreporting by a New York Times article. 703 00:38:20,480 --> 00:38:24,160 Speaker 2: That's right, there's evidence that there are other 704 00:38:24,200 --> 00:38:29,120 Speaker 2: reasons people might, not necessarily not render aid, but say, 705 00:38:29,239 --> 00:38:31,960 Speaker 2: not speak up or share their opinions, because they're in 706 00:38:32,000 --> 00:38:33,040 Speaker 2: a group too. Right. 707 00:38:34,160 --> 00:38:37,719 Speaker 3: Uh.
Yeah, that's sort of like if you're in a 708 00:38:38,239 --> 00:38:42,160 Speaker 3: classroom and the teacher asks if anyone has anything else 709 00:38:42,200 --> 00:38:45,960 Speaker 3: to say or whatever, or has any questions, and nobody 710 00:38:46,000 --> 00:38:49,000 Speaker 3: says anything, even though they might have questions. 711 00:38:49,320 --> 00:38:50,319 Speaker 2: I remember doing that. 712 00:38:51,040 --> 00:38:53,120 Speaker 4: Sure, I would still do that. 713 00:38:53,960 --> 00:38:56,440 Speaker 2: Yeah, I'm sure I would too in a classroom setting, 714 00:38:56,840 --> 00:38:59,160 Speaker 2: but I would just assume like I was 715 00:38:59,200 --> 00:39:03,560 Speaker 2: the only one who didn't get it, or I, and 716 00:39:03,760 --> 00:39:05,839 Speaker 2: probably a lot of other people, were just ready to leave. 717 00:39:06,719 --> 00:39:10,480 Speaker 3: Well, in a classroom, I don't think that factors in. 718 00:39:10,520 --> 00:39:13,080 Speaker 3: But as an adult, I think you're right on the money. 719 00:39:13,080 --> 00:39:16,239 Speaker 3: I've been in situations where it's a, oh, I don't know. 720 00:39:16,280 --> 00:39:19,000 Speaker 3: If you're in a public meeting for something, or if 721 00:39:19,000 --> 00:39:23,200 Speaker 3: you're on the board of a local garden or whatever, and. 722 00:39:23,800 --> 00:39:26,440 Speaker 4: Everything, you can get the sense that everyone wants to 723 00:39:26,480 --> 00:39:26,879 Speaker 4: go home. 724 00:39:27,080 --> 00:39:29,840 Speaker 3: Yeah, you don't want to be that person that's like, 725 00:39:30,280 --> 00:39:32,280 Speaker 3: I have another point at the end of it. 726 00:39:32,400 --> 00:39:35,360 Speaker 2: Right. Well, there's a difference between asking a question and 727 00:39:35,440 --> 00:39:36,840 Speaker 2: liking to hear yourself talk. 728 00:39:37,239 --> 00:39:38,479 Speaker 4: Well, very, very good point.
729 00:39:38,640 --> 00:39:42,360 Speaker 2: So yeah, that's called pluralistic ignorance, where you feel like, 730 00:39:43,200 --> 00:39:46,160 Speaker 2: even though your opinions differ, you are making an assumption 731 00:39:46,160 --> 00:39:47,839 Speaker 2: about the rest of the group that they 732 00:39:47,880 --> 00:39:49,720 Speaker 2: feel this way. So you don't want to make waves, 733 00:39:50,040 --> 00:39:52,360 Speaker 2: when it turns out that most of the group probably 734 00:39:52,360 --> 00:39:53,360 Speaker 2: feels the same way you do. 735 00:39:54,040 --> 00:39:57,160 Speaker 3: Yeah, for sure. I did want to follow up with 736 00:39:57,239 --> 00:39:59,640 Speaker 3: a stat that you found. Yeah, I want to follow 737 00:39:59,680 --> 00:40:02,439 Speaker 3: up on your information because when you were talking 738 00:40:02,440 --> 00:40:06,040 Speaker 3: about Fritz, I think this idea that, you know, if 739 00:40:06,320 --> 00:40:08,919 Speaker 3: you watch the news, you might think every protest leads 740 00:40:08,960 --> 00:40:11,640 Speaker 3: to violence, or, you know, the summer of violence with 741 00:40:12,040 --> 00:40:15,160 Speaker 3: George Floyd and Black Lives Matter and stuff like that, 742 00:40:15,719 --> 00:40:20,000 Speaker 3: or pro-Palestinian campus protests. There are stats on this stuff, 743 00:40:20,360 --> 00:40:22,560 Speaker 3: like there's raw numbers, and of the five hundred and 744 00:40:22,600 --> 00:40:28,080 Speaker 3: fifty three pro-Palestine campus protests between April 745 00:40:28,080 --> 00:40:30,760 Speaker 3: eighteenth and May third, twenty twenty four, ninety seven percent 746 00:40:31,280 --> 00:40:35,520 Speaker 3: were what was classified as overwhelmingly peaceful. And the same 747 00:40:35,560 --> 00:40:39,440 Speaker 3: with Black Lives Matter.
There were twenty four hundred demonstrations 748 00:40:40,280 --> 00:40:43,600 Speaker 3: after the murder of George Floyd, and I think it 749 00:40:43,640 --> 00:40:46,400 Speaker 3: was ninety three percent of them remained entirely peaceful. There 750 00:40:46,400 --> 00:40:47,880 Speaker 3: were fewer than two hundred and twenty out of the 751 00:40:47,920 --> 00:40:50,640 Speaker 3: twenty four hundred that had any kind of violence whatsoever. 752 00:40:50,960 --> 00:40:52,919 Speaker 4: Right, So that's good news. 753 00:40:53,000 --> 00:40:54,799 Speaker 2: I mean, yeah, that's great news. And I think that 754 00:40:55,320 --> 00:40:58,239 Speaker 2: the Black Lives Matter George Floyd protests were one of 755 00:40:58,239 --> 00:41:01,279 Speaker 2: the things that really kind of changed a lot of 756 00:41:01,280 --> 00:41:03,400 Speaker 2: scholarship, or gave a lot of weight to some of 757 00:41:03,440 --> 00:41:07,640 Speaker 2: the emerging scholarship, that crowds actually aren't necessarily bad and 758 00:41:07,640 --> 00:41:09,560 Speaker 2: they can be beneficial socially too. 759 00:41:10,160 --> 00:41:11,600 Speaker 4: Yeah, because they don't. 760 00:41:11,640 --> 00:41:14,879 Speaker 3: The news doesn't cover twenty four hundred peaceful protests, right, 761 00:41:15,920 --> 00:41:17,960 Speaker 3: they cover the ones, you know, And I guess I 762 00:41:18,040 --> 00:41:20,920 Speaker 3: get it because it's the news, And if it's. 763 00:41:22,880 --> 00:41:24,239 Speaker 4: I guess I get it. That's all I'm saying. 764 00:41:24,440 --> 00:41:26,960 Speaker 2: Yeah, I want to do an episode on how the 765 00:41:27,000 --> 00:41:29,520 Speaker 2: news affects the world and how it has so far, 766 00:41:29,960 --> 00:41:32,640 Speaker 2: because I think it's done a lot of damage. 767 00:41:32,880 --> 00:41:34,560 Speaker 5: Yeah, I agree.
768 00:41:34,800 --> 00:41:37,680 Speaker 2: There's one other thing too we mentioned, like say, the 769 00:41:37,719 --> 00:41:42,040 Speaker 2: presence of police in riot gear can actually trigger violence 770 00:41:42,080 --> 00:41:44,600 Speaker 2: that otherwise might not have taken place in a crowd. 771 00:41:45,840 --> 00:41:48,319 Speaker 2: That's not the only thing. If there are weapons, or 772 00:41:48,520 --> 00:41:51,799 Speaker 2: whoever has the weapons, that can change things. And then 773 00:41:51,840 --> 00:41:56,120 Speaker 2: another really big one that cannot be overlooked is the 774 00:41:56,160 --> 00:42:00,399 Speaker 2: presence of a lot of people on alcohol. Yeah, that 775 00:42:00,440 --> 00:42:04,400 Speaker 2: has caused a lot of riots. It just has. Like 776 00:42:04,480 --> 00:42:07,400 Speaker 2: you're not going to set a couch on fire in 777 00:42:07,480 --> 00:42:10,640 Speaker 2: the street normally, but if you're drunk and you're part 778 00:42:10,640 --> 00:42:13,360 Speaker 2: of a crowd whose basketball team just lost in 779 00:42:13,400 --> 00:42:15,759 Speaker 2: the Final Four, there's a good chance you're going to 780 00:42:15,840 --> 00:42:19,080 Speaker 2: do something like that. And alcohol fuels a lot of 781 00:42:19,120 --> 00:42:22,840 Speaker 2: the problematic, unlawful crowds that turn ugly. 782 00:42:23,360 --> 00:42:25,960 Speaker 3: Yeah, a lot of times it's the 783 00:42:26,000 --> 00:42:28,800 Speaker 3: winning team, and it's always very frustrating to 784 00:42:28,840 --> 00:42:30,759 Speaker 3: see the news after a city has won a big, 785 00:42:30,800 --> 00:42:34,760 Speaker 3: important championship. Yeah, important to the fans. And then there's 786 00:42:35,200 --> 00:42:38,280 Speaker 3: a big group of people, like I've been among those parties.
787 00:42:39,120 --> 00:42:41,000 Speaker 3: I remember in Athens when the Braves won their first 788 00:42:41,040 --> 00:42:42,160 Speaker 3: World Series back in the day. 789 00:42:42,280 --> 00:42:43,600 Speaker 2: Oh, that was just a great night. 790 00:42:43,880 --> 00:42:47,359 Speaker 3: The streets flooded and it was amazing. Yeah, but you know, 791 00:42:47,560 --> 00:42:49,359 Speaker 3: we've seen that same thing. All of a sudden, people 792 00:42:49,360 --> 00:42:50,640 Speaker 3: are flipping over cop cars. 793 00:42:50,800 --> 00:42:53,600 Speaker 2: Yeah, setting them on fire, and it's like, yeah, 794 00:42:53,640 --> 00:42:55,480 Speaker 2: I'm so happy that I'm going to set this car 795 00:42:55,560 --> 00:42:56,040 Speaker 2: on fire. 796 00:42:56,920 --> 00:42:57,480 Speaker 4: Yeah. 797 00:42:57,560 --> 00:43:01,000 Speaker 3: Yeah, it's a shame. It's like ruining it 798 00:43:01,120 --> 00:43:03,040 Speaker 3: for the rest of us and the best of us. 799 00:43:03,160 --> 00:43:05,080 Speaker 2: I saw that same kind of thing ruined the 800 00:43:05,200 --> 00:43:07,720 Speaker 2: Keene Pumpkin Festival years back. 801 00:43:08,440 --> 00:43:11,719 Speaker 3: I think we talked about that, did we? I think 802 00:43:11,840 --> 00:43:14,919 Speaker 3: years ago you definitely mentioned that. Otherwise, I don't see 803 00:43:14,920 --> 00:43:15,600 Speaker 3: how I would have known. 804 00:43:15,760 --> 00:43:18,000 Speaker 2: Yeah, I don't remember it at all, but I'll go 805 00:43:18,080 --> 00:43:19,080 Speaker 2: with your interpretation. 806 00:43:19,560 --> 00:43:22,720 Speaker 3: Yeah, it might have been during our early Pumpkin Chunkin 807 00:43:22,880 --> 00:43:23,720 Speaker 3: coverage days. 808 00:43:24,560 --> 00:43:26,760 Speaker 2: Didn't we do a whole episode on Punkin Chunkin? 809 00:43:27,239 --> 00:43:28,919 Speaker 4: Oh yeah, we worked for Discovery Channel.
810 00:43:29,000 --> 00:43:32,920 Speaker 2: We sure did. They made it so, yes, you got 811 00:43:32,960 --> 00:43:33,480 Speaker 2: anything else? 812 00:43:34,719 --> 00:43:37,239 Speaker 3: I got nothing else. I thought that was super interesting. Yeah, 813 00:43:37,280 --> 00:43:37,920 Speaker 3: did you pick that one? 814 00:43:38,000 --> 00:43:39,280 Speaker 4: Yeah, thank Kimberly. 815 00:43:39,640 --> 00:43:43,040 Speaker 2: Yeah, yeah, thanks Kimberly. But it was interesting, Chuck, and 816 00:43:43,200 --> 00:43:46,239 Speaker 2: thanks for doing that, and I'll thank myself from you 817 00:43:46,360 --> 00:43:49,720 Speaker 2: as well. Okay, since I just thanked myself on behalf 818 00:43:49,760 --> 00:43:51,799 Speaker 2: of Chuck, I think it's time for listener mail. 819 00:43:55,280 --> 00:43:55,719 Speaker 4: That's right. 820 00:43:55,800 --> 00:43:57,640 Speaker 5: I was thinking about the way you diverted that 821 00:43:57,680 --> 00:43:58,200 Speaker 5: to Kimberly. 822 00:43:58,640 --> 00:44:00,520 Speaker 4: Okay, afflicted. 823 00:44:01,520 --> 00:44:03,359 Speaker 3: Hey, guys, I was listening to the short stuff about 824 00:44:03,360 --> 00:44:04,920 Speaker 3: color psychology, and I got to say I was a 825 00:44:04,920 --> 00:44:07,800 Speaker 3: little disappointed there was no mention of drunk tank pink. 826 00:44:09,120 --> 00:44:10,720 Speaker 4: Didn't know this. If it's not obvious. 827 00:44:10,760 --> 00:44:12,359 Speaker 3: It gets its name from being the go-to color 828 00:44:12,360 --> 00:44:15,360 Speaker 3: of choice for drunk tanks because of the noticeable calming 829 00:44:15,360 --> 00:44:18,360 Speaker 3: effect it has on belligerent people. I was so intrigued 830 00:44:18,400 --> 00:44:20,840 Speaker 3: by this that I painted my bathroom drunk tank pink. 831 00:44:21,640 --> 00:44:24,520 Speaker 3: That's Hannah. Hannah sent in a picture too.
I didn't 832 00:44:24,520 --> 00:44:26,160 Speaker 3: ask Hannah why she felt the need to do this, 833 00:44:26,239 --> 00:44:26,839 Speaker 3: but she did it. 834 00:44:26,920 --> 00:44:27,200 Speaker 2: Okay. 835 00:44:27,760 --> 00:44:30,760 Speaker 3: There's a book called Drunk Tank Pink by author Adam 836 00:44:30,840 --> 00:44:33,560 Speaker 3: Alter that dives into the psychology behind it, and I 837 00:44:33,600 --> 00:44:35,920 Speaker 3: think you would both find this super interesting. 838 00:44:36,120 --> 00:44:38,680 Speaker 2: Man, I'm sorry we missed that, because that is awesome. 839 00:44:39,200 --> 00:44:39,800 Speaker 4: I agree. 840 00:44:40,360 --> 00:44:42,239 Speaker 3: And by the way, Hannah got tickets to our show 841 00:44:42,239 --> 00:44:45,520 Speaker 3: in Akron, Ohio as a birthday present for her partner Awesome, 842 00:44:45,640 --> 00:44:48,360 Speaker 3: who had been listening since he was fifteen years old. 843 00:44:48,440 --> 00:44:50,399 Speaker 2: Wow, and now seventy two. 844 00:44:51,040 --> 00:44:53,960 Speaker 3: Yeah, that's right, and asked if we could give an 845 00:44:54,000 --> 00:44:56,319 Speaker 3: on stage shout out, and Hannah, sadly, we don't do 846 00:44:56,400 --> 00:44:58,160 Speaker 3: that because we get a lot of requests and it's 847 00:44:58,160 --> 00:45:00,879 Speaker 3: no fun sitting around as an audience member for ten 848 00:45:00,920 --> 00:45:02,440 Speaker 3: minutes while we read people's names. 849 00:45:03,160 --> 00:45:04,319 Speaker 4: So we're gonna do it right here. 850 00:45:04,560 --> 00:45:09,560 Speaker 3: Wish a very future happy birthday to Isaac Canise. 851 00:45:09,840 --> 00:45:11,480 Speaker 2: Nice. Happy birthday, Isaac.
852 00:45:11,520 --> 00:45:15,640 Speaker 7: That was quite magnanimous of you, Chuck. Yeah, because we usually 853 00:45:15,680 --> 00:45:17,640 Speaker 7: don't do it on the episode either. But since Drunk 854 00:45:17,640 --> 00:45:20,640 Speaker 7: Tank Pink was such a great email, Hannah, we want 855 00:45:20,640 --> 00:45:24,000 Speaker 7: to say happy, happy birthday to Isaac, and we hope 856 00:45:24,000 --> 00:45:25,800 Speaker 7: you both have a great time in Akron. 857 00:45:25,960 --> 00:45:28,560 Speaker 2: Yes, yeah, we'll see you guys there. Make sure you 858 00:45:28,560 --> 00:45:30,719 Speaker 2: stand up and go drunk tank pink in the middle 859 00:45:30,760 --> 00:45:31,280 Speaker 2: of the show. 860 00:45:31,440 --> 00:45:32,200 Speaker 4: Oh no, don't. 861 00:45:32,520 --> 00:45:34,759 Speaker 2: If you want to be like Hannah, you can send 862 00:45:34,880 --> 00:45:37,000 Speaker 2: us an email too. Say whatever you like in it 863 00:45:37,680 --> 00:45:40,160 Speaker 2: and we will love it. Just send it off to 864 00:45:40,200 --> 00:45:42,680 Speaker 2: Stuff Podcasts at iHeartRadio dot. 865 00:45:42,400 --> 00:45:47,839 Speaker 4: Com. Stuff you Should Know is a production of iHeartRadio. 866 00:45:48,320 --> 00:45:51,520 Speaker 5: For more podcasts from iHeartRadio, visit the iHeartRadio app, 867 00:45:51,719 --> 00:46:00,799 Speaker 5: Apple Podcasts, or wherever you listen to your favorite shows.