1 00:00:02,960 --> 00:00:06,800 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. 2 00:00:12,760 --> 00:00:14,640 Speaker 1: Hey you, welcome to Stuff to Blow Your Mind. My 3 00:00:14,720 --> 00:00:18,320 Speaker 1: name is Robert Lamb and I'm Joe McCormick. And hey, 4 00:00:18,360 --> 00:00:21,160 Speaker 1: fair warning, folks. If I sound like I 5 00:00:21,200 --> 00:00:24,119 Speaker 1: speak with the voice and mind of some kind of 6 00:00:24,160 --> 00:00:27,800 Speaker 1: decrepit bog monster today, it is because I'm 7 00:00:27,840 --> 00:00:30,680 Speaker 1: on the mend from a bad cold. So, 8 00:00:31,440 --> 00:00:34,560 Speaker 1: so apologies for whatever's happening in your ears right now. 9 00:00:34,640 --> 00:00:38,240 Speaker 1: But here I am on mic. Well, sometimes 10 00:00:38,320 --> 00:00:41,200 Speaker 1: bog monsters are quite wise, so you know, it depends 11 00:00:41,200 --> 00:00:43,280 Speaker 1: on the story you're looking at. I hope to bring 12 00:00:43,320 --> 00:00:48,640 Speaker 1: a real Meg Mucklebones energy to today's episode, so you'll 13 00:00:48,680 --> 00:00:50,680 Speaker 1: have to tell me how I do. But yeah, what 14 00:00:50,920 --> 00:00:53,080 Speaker 1: are we talking about today, Rob? Well, we're gonna be 15 00:00:53,120 --> 00:00:56,120 Speaker 1: talking about a little something called the minimal group paradigm, 16 00:00:56,120 --> 00:00:58,840 Speaker 1: which, I know, if you're not familiar with it, 17 00:00:58,880 --> 00:01:03,040 Speaker 1: sounds a bit duffy, perhaps sounds a 18 00:01:03,040 --> 00:01:05,760 Speaker 1: little bit clinical, but I think it's a very 19 00:01:05,840 --> 00:01:09,440 Speaker 1: fascinating little topic.
Should make for a nice one-part 20 00:01:09,959 --> 00:01:13,280 Speaker 1: episode here because it attempts to come down to some 21 00:01:13,360 --> 00:01:19,760 Speaker 1: of the major concerns regarding human civilization and human interactions, 22 00:01:20,480 --> 00:01:22,680 Speaker 1: basically coming out of the question of, like, just how 23 00:01:22,880 --> 00:01:26,399 Speaker 1: divisive are human beings, and how little does it take 24 00:01:26,440 --> 00:01:30,160 Speaker 1: for us to split into factions over something, or next 25 00:01:30,160 --> 00:01:33,080 Speaker 1: to nothing even? And I think for many of us, 26 00:01:33,080 --> 00:01:35,240 Speaker 1: the answer seems to be that we're, you know, 27 00:01:35,240 --> 00:01:39,080 Speaker 1: very divisive, and that it doesn't take much at 28 00:01:39,120 --> 00:01:41,200 Speaker 1: all for us to split off into factions. And I 29 00:01:41,200 --> 00:01:43,920 Speaker 1: think this has been played to great effect in literature 30 00:01:44,040 --> 00:01:48,160 Speaker 1: and cinema, especially comedically, and two examples always come to 31 00:01:48,240 --> 00:01:51,520 Speaker 1: my mind. So one of them, Joe, I'm not sure 32 00:01:51,520 --> 00:01:52,720 Speaker 1: if you're familiar with this. I don't know if we've 33 00:01:52,720 --> 00:01:55,880 Speaker 1: talked about this before, but it's the nineteen fifty three 34 00:01:55,920 --> 00:01:59,760 Speaker 1: story from Doctor Seuss, The Sneetches. Is this the butter, 35 00:02:00,120 --> 00:02:02,840 Speaker 1: the bread one? No, no, you're thinking of the Butter 36 00:02:02,840 --> 00:02:06,480 Speaker 1: Battle Book, which does get into a similar situation.
That's 37 00:02:06,520 --> 00:02:10,320 Speaker 1: the one where you have two different groups, and one side 38 00:02:10,320 --> 00:02:12,919 Speaker 1: thinks you should eat the butter side down, the others 39 00:02:12,960 --> 00:02:15,600 Speaker 1: butter side up, and they get into this big Cold 40 00:02:15,600 --> 00:02:19,679 Speaker 1: War stalemate. There's an arms race, an escalation of 41 00:02:20,800 --> 00:02:25,320 Speaker 1: their weaponry based on the butter ideology difference. Yeah, so 42 00:02:25,320 --> 00:02:27,440 Speaker 1: that's a good one to bring up too. The Sneetches 43 00:02:27,880 --> 00:02:33,440 Speaker 1: concerns this population of avian creatures whose entire 44 00:02:33,520 --> 00:02:36,480 Speaker 1: social hierarchy is based on which ones have a star 45 00:02:36,560 --> 00:02:38,440 Speaker 1: on their bellies and which ones don't have a star 46 00:02:38,600 --> 00:02:41,240 Speaker 1: on their bellies, and the star-bellied Sneetches are the 47 00:02:41,240 --> 00:02:44,160 Speaker 1: ones that live at the top and the rest live 48 00:02:44,200 --> 00:02:46,519 Speaker 1: at the bottom. But then a con artist moves into 49 00:02:46,560 --> 00:02:49,280 Speaker 1: town with a star-on machine, and then later a 50 00:02:49,360 --> 00:02:53,800 Speaker 1: star-off machine, to capitalize on their divisiveness. Though at 51 00:02:53,800 --> 00:02:56,000 Speaker 1: the end of that, the Sneetches move beyond all of 52 00:02:56,000 --> 00:02:58,320 Speaker 1: this and they unite as a single people. So it's 53 00:02:58,600 --> 00:03:01,440 Speaker 1: kind of a nice message. Oh, that's very nice. 54 00:03:01,480 --> 00:03:04,200 Speaker 1: That's a much happier ending than the Butter Battle Book, which, 55 00:03:04,639 --> 00:03:07,720 Speaker 1: as I recall, ends with basically both sides on 56 00:03:07,720 --> 00:03:11,119 Speaker 1: a hair trigger with their ultimate weaponry. Yeah.
Yeah, it's 57 00:03:11,520 --> 00:03:15,000 Speaker 1: a real clincher, that one. But another example that 58 00:03:15,000 --> 00:03:16,400 Speaker 1: comes to mind, and I know you're familiar with this 59 00:03:16,400 --> 00:03:20,239 Speaker 1: one, is of course Monty Python's Life of Brian. There's 60 00:03:20,240 --> 00:03:23,480 Speaker 1: a memorable scene in which the anti-Roman resistance is 61 00:03:23,520 --> 00:03:27,600 Speaker 1: split, more than split, between the Judean People's Front and 62 00:03:27,639 --> 00:03:30,680 Speaker 1: the People's Front of Judea and various other fragments of 63 00:03:30,720 --> 00:03:34,400 Speaker 1: their independence group. One of the characters in the People's 64 00:03:34,440 --> 00:03:37,600 Speaker 1: Front of Judea proudly proclaims that the only people they hate 65 00:03:37,600 --> 00:03:40,560 Speaker 1: more than the Romans is the Judean People's Front. 66 00:03:40,800 --> 00:03:42,760 Speaker 1: I think this is meant to play on a concept 67 00:03:42,880 --> 00:03:47,440 Speaker 1: that was called the narcissism of small differences by Sigmund Freud. 68 00:03:47,800 --> 00:03:49,520 Speaker 1: I don't know if Freud was the first person ever 69 00:03:49,560 --> 00:03:52,040 Speaker 1: to observe this, but I think that's where the phrase 70 00:03:52,120 --> 00:03:55,680 Speaker 1: comes from: his writings about the idea that 71 00:03:55,720 --> 00:04:00,400 Speaker 1: the most bitter, hateful, divisive struggles in the 72 00:04:00,440 --> 00:04:03,640 Speaker 1: world tend to be between people who actually share a 73 00:04:03,680 --> 00:04:07,880 Speaker 1: lot of things in common but have some difference that 74 00:04:08,040 --> 00:04:13,440 Speaker 1: really appears minor to people looking in from the outside. Now, 75 00:04:13,480 --> 00:04:14,760 Speaker 1: I think a lot of you out there.
You may 76 00:04:14,800 --> 00:04:17,400 Speaker 1: be able to think of examples from other works of fiction, 77 00:04:17,720 --> 00:04:20,800 Speaker 1: or certainly from real life, many of the various serious 78 00:04:20,800 --> 00:04:24,120 Speaker 1: things we divide ourselves over, or, you know, some of 79 00:04:24,120 --> 00:04:27,320 Speaker 1: the seemingly silly, at least from the outside, things 80 00:04:27,320 --> 00:04:29,359 Speaker 1: that we're very divisive over. And to 81 00:04:29,400 --> 00:04:33,960 Speaker 1: your point, sometimes they're within, like, subgroups and fandoms even. 82 00:04:34,360 --> 00:04:38,040 Speaker 1: All manner of brand and sports team loyalty can lead 83 00:04:38,080 --> 00:04:42,000 Speaker 1: to division that doesn't necessarily make much sense on closer inspection. 84 00:04:43,040 --> 00:04:46,919 Speaker 1: Perhaps you prefer Puma shoes and this other person prefers Adidas. 85 00:04:46,920 --> 00:04:48,920 Speaker 1: How could the two of you ever see eye to eye? 86 00:04:49,240 --> 00:04:52,640 Speaker 1: And this specific example ties into, I think, what is 87 00:04:52,720 --> 00:04:56,760 Speaker 1: a great example of division in human beings and human groups, 88 00:04:56,920 --> 00:05:00,320 Speaker 1: one I originally saw pointed out, and it's 89 00:05:00,440 --> 00:05:03,360 Speaker 1: been well documented for a while, by Jay Van Bavel 90 00:05:03,440 --> 00:05:06,560 Speaker 1: and Dominique Packer in a TED-Ed video. This is 91 00:05:06,560 --> 00:05:09,839 Speaker 1: like an animated educational short that TED-Ed puts out. 92 00:05:09,960 --> 00:05:14,400 Speaker 1: Wonderful shorts. It's regular viewing in my household with my family. 93 00:05:14,720 --> 00:05:17,200 Speaker 1: But the title of this one is The Sibling Rivalry 94 00:05:17,440 --> 00:05:20,159 Speaker 1: That Divided a Town.
So I thought I'd cover the 95 00:05:20,200 --> 00:05:24,560 Speaker 1: basics of this sibling rivalry. All right, so are you 96 00:05:24,600 --> 00:05:27,520 Speaker 1: familiar with this story, Joe? I'm not. Well, it all starts 97 00:05:27,520 --> 00:05:31,279 Speaker 1: in around nineteen nineteen. That's when these two brothers, Adolf 98 00:05:31,320 --> 00:05:35,680 Speaker 1: and Rudolf Dassler, found a shoe company called Gebrüder Dassler 99 00:05:36,440 --> 00:05:40,320 Speaker 1: Schuhfabrik, or Geda, in their hometown of Herzogenaurach 100 00:05:40,400 --> 00:05:43,400 Speaker 1: in Bavaria. It turns out they were very successful. 101 00:05:43,520 --> 00:05:46,920 Speaker 1: These shoes really took off. You even had a 102 00:05:47,360 --> 00:05:50,440 Speaker 1: situation where in the nineteen thirty six Olympics American runner 103 00:05:50,520 --> 00:05:54,240 Speaker 1: Jesse Owens apparently was wearing some of these shoes. But 104 00:05:54,320 --> 00:05:57,240 Speaker 1: then World War Two breaks out. This disrupts everything, to 105 00:05:57,240 --> 00:06:00,560 Speaker 1: say the least. Rudolf is drafted into the German army, 106 00:06:00,600 --> 00:06:04,760 Speaker 1: the factory is transformed into a weapons factory, and again 107 00:06:04,800 --> 00:06:07,440 Speaker 1: everything is just super disrupted until after the war the 108 00:06:07,480 --> 00:06:12,119 Speaker 1: brothers reunite. Their work continues as is until nineteen forty 109 00:06:12,160 --> 00:06:15,600 Speaker 1: eight, when they split over some personal issues. And I 110 00:06:15,640 --> 00:06:18,800 Speaker 1: think there are a few different analyses of what those 111 00:06:18,839 --> 00:06:22,600 Speaker 1: personal issues might have been, but the results are the same. 112 00:06:22,640 --> 00:06:25,239 Speaker 1: Meaning any way you cut it, the company is split in 113 00:06:25,760 --> 00:06:31,360 Speaker 1: two. That means material, workforce, and so forth.
Rudolf founds Ruda, 114 00:06:31,480 --> 00:06:35,919 Speaker 1: which becomes Puma, and Adolf starts Adidas. Now, that's not 115 00:06:35,960 --> 00:06:38,560 Speaker 1: that crazy, right? It's just one shoe company splitting into 116 00:06:38,560 --> 00:06:42,520 Speaker 1: two shoe companies. Now, the interesting thing about this, though, 117 00:06:42,520 --> 00:06:45,360 Speaker 1: according to Jay Van Bavel and Dominique Packer in that 118 00:06:45,560 --> 00:06:47,960 Speaker 1: TED-Ed video, is that the brothers' feud and business division 119 00:06:48,040 --> 00:06:53,120 Speaker 1: ultimately divides the entire town. Quote, residents became fiercely loyal 120 00:06:53,160 --> 00:06:56,560 Speaker 1: to one brand of shoe, local businesses chose sides, and 121 00:06:56,760 --> 00:07:01,560 Speaker 1: marriage across lines was discouraged. Herzogenaurach eventually became known as 122 00:07:01,839 --> 00:07:05,160 Speaker 1: the town of bent necks because its residents looked down 123 00:07:05,200 --> 00:07:09,239 Speaker 1: to ensure they were interacting with members of their group. Oh, 124 00:07:09,279 --> 00:07:12,840 Speaker 1: look down at the shoes. That one took me a second. Yeah. 125 00:07:13,000 --> 00:07:15,600 Speaker 1: So I think it's a great example, not 126 00:07:15,640 --> 00:07:18,960 Speaker 1: only because it has some sort of comical 127 00:07:19,000 --> 00:07:21,520 Speaker 1: elements to it, kind of like the belly stars, but also 128 00:07:21,640 --> 00:07:25,120 Speaker 1: we do see these various elements to the division, the personal, 129 00:07:25,280 --> 00:07:29,240 Speaker 1: the business, the social, and the schism is quite real, 130 00:07:30,440 --> 00:07:32,920 Speaker 1: and it is funny to think how split people 131 00:07:32,960 --> 00:07:35,880 Speaker 1: can be about brands. I mean, sometimes I think 132 00:07:36,240 --> 00:07:40,600 Speaker 1: it's meant jokingly.
You see a lot of joking comments today, 133 00:07:40,800 --> 00:07:44,679 Speaker 1: even about things like Coke versus Pepsi, or Twizzlers versus 134 00:07:44,720 --> 00:07:46,960 Speaker 1: Red Vines or something. And then also things that are 135 00:07:46,960 --> 00:07:51,160 Speaker 1: not even brand oriented, like overhand versus underhand toilet paper rolls. 136 00:07:51,560 --> 00:07:54,280 Speaker 1: I recall divisions of this type were big on, like, 137 00:07:54,440 --> 00:07:58,080 Speaker 1: early Facebook, like mid two thousands Facebook, where people would 138 00:07:58,120 --> 00:08:00,400 Speaker 1: make all these joke groups, and it would be like, 139 00:08:00,560 --> 00:08:04,000 Speaker 1: you know, for people who like Red Vines, because Twizzlers 140 00:08:04,000 --> 00:08:09,200 Speaker 1: are for cowards. I mean, it's still, I think, very 141 00:08:09,200 --> 00:08:11,600 Speaker 1: prominent in, like, meme making. You know, people like to 142 00:08:11,600 --> 00:08:13,760 Speaker 1: get in on this sort of thing. I don't know, maybe, 143 00:08:14,480 --> 00:08:16,920 Speaker 1: especially when it's meant jokingly, it's kind of like 144 00:08:16,960 --> 00:08:21,080 Speaker 1: low-stakes things to sort of mock-disagree about. I'm not sure. 145 00:08:21,200 --> 00:08:24,600 Speaker 1: But then at what point does just sort 146 00:08:24,600 --> 00:08:26,840 Speaker 1: of trolling and mock fun, at what point does that 147 00:08:26,960 --> 00:08:30,720 Speaker 1: then become, like, an actual entrenched belief or opinion? Oh,
148 00:08:30,840 --> 00:08:35,080 Speaker 1: I think rather quickly, actually. So in this episode of 149 00:08:35,080 --> 00:08:36,640 Speaker 1: Stuff to Blow Your Mind, we're going to look 150 00:08:36,640 --> 00:08:39,920 Speaker 1: at a social psychology concept that ties into 151 00:08:39,920 --> 00:08:42,480 Speaker 1: all of this, the minimal group paradigm, a method for 152 00:08:42,559 --> 00:08:46,679 Speaker 1: sussing out what might be the absolute minimal conditions for 153 00:08:46,840 --> 00:08:50,400 Speaker 1: discrimination to take place between two groups. Will their findings 154 00:08:50,440 --> 00:08:53,000 Speaker 1: be Twizzlers versus Red Vines? Is that the minimal thing? 155 00:08:53,400 --> 00:08:55,719 Speaker 1: I don't know, you'll just have to find out. All right, 156 00:08:55,760 --> 00:08:59,120 Speaker 1: so where does this minimal group paradigm come from? All right, so, 157 00:09:00,000 --> 00:09:02,480 Speaker 1: one of the sources that I was looking at specifically 158 00:09:02,520 --> 00:09:04,880 Speaker 1: in order to understand the minimal group paradigm and its 159 00:09:04,920 --> 00:09:07,720 Speaker 1: history was The Origins of the Minimal Group Paradigm by 160 00:09:07,760 --> 00:09:11,080 Speaker 1: Rupert Brown of the University of Sussex, twenty twenty, published 161 00:09:11,120 --> 00:09:15,520 Speaker 1: by the American Psychological Association. Brown points out that the 162 00:09:16,120 --> 00:09:19,120 Speaker 1: basis of prejudice and intergroup discrimination has of course 163 00:09:19,320 --> 00:09:21,480 Speaker 1: been a human concern for a long time, and certainly 164 00:09:21,600 --> 00:09:25,880 Speaker 1: was a longtime concern of people in psychology.
But 165 00:09:25,960 --> 00:09:28,640 Speaker 1: the MGP, or the minimal group paradigm as we know 166 00:09:28,760 --> 00:09:34,000 Speaker 1: it, generally is attributed to Polish social psychologist Henri 167 00:09:34,080 --> 00:09:37,640 Speaker 1: Tajfel, who lived nineteen nineteen through nineteen eighty two, and 168 00:09:37,960 --> 00:09:41,560 Speaker 1: also a British social psychologist, Michael Billig, who worked with him, 169 00:09:41,800 --> 00:09:44,560 Speaker 1: born nineteen forty seven. Typically, I see a lot 170 00:09:44,600 --> 00:09:47,760 Speaker 1: of references to work they did in the early seventies. 171 00:09:48,000 --> 00:09:52,000 Speaker 1: One of the main citations is Henri Tajfel, Michael Billig, 172 00:09:52,080 --> 00:09:55,800 Speaker 1: Robert Bundy, and Claude Flament, and the title is Social 173 00:09:55,840 --> 00:10:00,240 Speaker 1: Categorization and Intergroup Behaviour, published in the European Journal of 174 00:10:00,280 --> 00:10:03,599 Speaker 1: Social Psychology, nineteen seventy one, if you want to go 175 00:10:03,679 --> 00:10:07,840 Speaker 1: back to the source. Now, Tajfel, it's worth noting, 176 00:10:08,000 --> 00:10:11,240 Speaker 1: was a survivor of the Holocaust, and this is important 177 00:10:11,240 --> 00:10:13,880 Speaker 1: to keep in mind because much of his work does 178 00:10:13,920 --> 00:10:17,000 Speaker 1: ponder the question of what drives groups of people to 179 00:10:17,080 --> 00:10:21,360 Speaker 1: take up extremely prejudiced views, and does the transference rely 180 00:10:21,559 --> 00:10:24,560 Speaker 1: on extreme personality types, or is it something more mundane?
181 00:10:26,120 --> 00:10:30,040 Speaker 1: So yeah, in the early nineteen seventies, Tajfel et 182 00:10:30,040 --> 00:10:34,000 Speaker 1: al. conducted a series of experiments, MGP studies, that 183 00:10:34,040 --> 00:10:36,360 Speaker 1: would end up having an enormous impact on the field 184 00:10:36,400 --> 00:10:39,400 Speaker 1: of social psychology. More on this in a second, but 185 00:10:39,440 --> 00:10:42,320 Speaker 1: I also want to point out that Brown stresses that 186 00:10:42,400 --> 00:10:45,880 Speaker 1: there is also a pre-Tajfel origin in the 187 00:10:45,880 --> 00:10:50,840 Speaker 1: work of Dutch social psychologist Jaap Rabbie in nineteen sixty four. 188 00:10:51,400 --> 00:10:55,240 Speaker 1: So Rabbie suspected that common fate was the essential component 189 00:10:55,280 --> 00:10:58,800 Speaker 1: for a group to hold together and for intergroup discrimination 190 00:10:58,840 --> 00:11:03,120 Speaker 1: to occur. Common fate is a Gestalt psychology concept that 191 00:11:03,760 --> 00:11:06,640 Speaker 1: says that objects functioning or moving in the same direction 192 00:11:07,000 --> 00:11:09,400 Speaker 1: appear to belong together, kind of like we're off to 193 00:11:09,440 --> 00:11:11,680 Speaker 1: see the wizard, right? I mean, you're going to see 194 00:11:11,679 --> 00:11:13,240 Speaker 1: the wizard, and I'm also going to see 195 00:11:13,240 --> 00:11:14,960 Speaker 1: the wizard, or I'm going down this road? Well, I 196 00:11:14,960 --> 00:11:18,080 Speaker 1: guess we're a group. Okay, so under this view, 197 00:11:18,280 --> 00:11:22,040 Speaker 1: the thing that would make you prefer and show favoritism 198 00:11:22,040 --> 00:11:25,160 Speaker 1: to members of your in-group is a basic belief 199 00:11:25,200 --> 00:11:27,640 Speaker 1: that the same kind of thing is going to happen 200 00:11:27,679 --> 00:11:30,640 Speaker 1: to all the members of this group. Yeah.
Yeah, and 201 00:11:30,720 --> 00:11:32,120 Speaker 1: I think you can probably, you know, cut it a 202 00:11:32,160 --> 00:11:34,600 Speaker 1: few different ways, but yeah, it's like there's something about 203 00:11:34,679 --> 00:11:37,720 Speaker 1: your sort of common direction, common fate, if you 204 00:11:37,760 --> 00:11:39,920 Speaker 1: want to put it that way, that sort of 205 00:11:39,920 --> 00:11:45,679 Speaker 1: binds you together. Now, Rabbie's experiments involved classifying subjects into 206 00:11:45,720 --> 00:11:49,280 Speaker 1: groups to explore intergroup discrimination, but he ultimately concluded 207 00:11:49,280 --> 00:11:54,080 Speaker 1: that mere classification was not enough to elicit in-group favoritism. 208 00:11:54,640 --> 00:11:57,360 Speaker 1: So, um, again worth noting that he was looking at 209 00:11:57,400 --> 00:11:59,960 Speaker 1: some of the same stuff that would 210 00:12:00,080 --> 00:12:03,120 Speaker 1: become important to the minimal group paradigm, and what kind 211 00:12:03,160 --> 00:12:06,240 Speaker 1: of lays some of the groundwork for it even, but 212 00:12:06,520 --> 00:12:11,360 Speaker 1: his findings were different. Now, this raises a question that 213 00:12:11,360 --> 00:12:15,439 Speaker 1: Brown explores: why was Rabbie overlooked, and why is he 214 00:12:15,520 --> 00:12:19,400 Speaker 1: still sort of overlooked in some of the documentation surrounding 215 00:12:19,640 --> 00:12:23,240 Speaker 1: MGP? And Brown breaks it down and attributes it to 216 00:12:23,280 --> 00:12:28,000 Speaker 1: three reasons. So, first, Tajfel's findings were counterintuitive and 217 00:12:28,040 --> 00:12:30,959 Speaker 1: therefore more newsworthy. That's one of the big things about 218 00:12:31,040 --> 00:12:34,040 Speaker 1: MGP, is that, you know, a lot of people going 219 00:12:34,080 --> 00:12:37,080 Speaker 1: into it, you don't expect to see the results you see.
220 00:12:37,120 --> 00:12:39,080 Speaker 1: You don't expect to see this thing that seems to 221 00:12:39,280 --> 00:12:41,640 Speaker 1: explain a lot of the division that goes on in 222 00:12:41,679 --> 00:12:47,320 Speaker 1: groups and the discrimination that occurs between groups, just based 223 00:12:47,400 --> 00:12:49,880 Speaker 1: on, as we'll get into, like, just sort of random 224 00:12:49,880 --> 00:12:54,200 Speaker 1: grouping of people. It makes more sense to assume that 225 00:12:54,240 --> 00:12:57,760 Speaker 1: if people are showing in-group favoritism, it would be 226 00:12:57,760 --> 00:13:00,800 Speaker 1: because, I don't know, they assume that all of the 227 00:13:00,840 --> 00:13:03,120 Speaker 1: members of the group are sharing a common fate or 228 00:13:03,160 --> 00:13:06,520 Speaker 1: something like that. Yeah. Yeah. The other thing to keep 229 00:13:06,520 --> 00:13:10,200 Speaker 1: in mind is Tajfel's MGP work helped inspire and 230 00:13:10,400 --> 00:13:13,960 Speaker 1: lay the groundwork for social identity theory, which became huge, 231 00:13:14,360 --> 00:13:18,160 Speaker 1: so that in turn elevated his work with MGP. 232 00:13:18,559 --> 00:13:22,000 Speaker 1: And in fact, social identity theory was formulated by 233 00:13:22,800 --> 00:13:25,200 Speaker 1: Tajfel and John Turner, who lived nineteen forty seven 234 00:13:25,240 --> 00:13:28,120 Speaker 1: through twenty eleven, in the nineteen seventies and the nineteen eighties. 235 00:13:28,440 --> 00:13:30,560 Speaker 1: And then the third factor that Brown points out is 236 00:13:30,600 --> 00:13:33,600 Speaker 1: the personality differences between the two men.
So Tajfel 237 00:13:33,720 --> 00:13:37,000 Speaker 1: has been characterized as more of a go-getter, essentially 238 00:13:37,040 --> 00:13:41,480 Speaker 1: someone who really, you know, took full advantage of any 239 00:13:41,520 --> 00:13:46,160 Speaker 1: opportunity to, you know, sort of explore his ideas and 240 00:13:46,200 --> 00:13:50,679 Speaker 1: get his ideas out there, whereas Rabbie was more unassuming. 241 00:13:51,480 --> 00:13:55,439 Speaker 1: So, some combination of these three factors, according to Brown. 242 00:13:55,720 --> 00:13:58,000 Speaker 1: Tajfel was quite aware of these studies, but 243 00:13:58,200 --> 00:14:02,559 Speaker 1: suspected that the opposite was true, and was already experimenting 244 00:14:02,559 --> 00:14:05,080 Speaker 1: with social comparison theory. So fast forward to the 245 00:14:05,120 --> 00:14:09,360 Speaker 1: nineteen seventies and the first MGP experiments. I'm not going 246 00:14:09,360 --> 00:14:12,920 Speaker 1: to bust these experiments out blow by blow necessarily, but 247 00:14:13,080 --> 00:14:17,360 Speaker 1: certainly hitting the really important parts, the basics of the 248 00:14:17,480 --> 00:14:21,760 Speaker 1: MGP experiments. So the first part is you have subjects 249 00:14:21,840 --> 00:14:24,520 Speaker 1: carry out a task, and the task is often described 250 00:14:24,520 --> 00:14:26,680 Speaker 1: as something like estimating the number of dots on an 251 00:14:26,720 --> 00:14:30,480 Speaker 1: image or answering an opinion question about a work of 252 00:14:30,520 --> 00:14:35,400 Speaker 1: abstract art. All right. Next, presumably based on these results, 253 00:14:35,680 --> 00:14:39,360 Speaker 1: subjects are placed into groups. But known only to the researchers, 254 00:14:39,360 --> 00:14:41,800 Speaker 1: not to the subjects, is the fact that the group 255 00:14:41,840 --> 00:14:45,640 Speaker 1: assignment is actually random.
Okay. So, for example, if the 256 00:14:46,000 --> 00:14:48,480 Speaker 1: question you were given had to do with, like, estimating 257 00:14:48,520 --> 00:14:52,160 Speaker 1: the number of dots, you might break people into groups, 258 00:14:52,360 --> 00:14:55,280 Speaker 1: say, and tell them that, okay, group A is the 259 00:14:55,320 --> 00:14:58,840 Speaker 1: people who overestimated the number of dots in the image, 260 00:14:58,840 --> 00:15:01,640 Speaker 1: and group B is the people who underestimated 261 00:15:01,680 --> 00:15:04,160 Speaker 1: the number of dots in the image. Or with the 262 00:15:04,240 --> 00:15:08,720 Speaker 1: question about art, you might separate people into different taste categories. 263 00:15:08,760 --> 00:15:10,760 Speaker 1: You say, like, oh, you were the people who preferred 264 00:15:10,760 --> 00:15:13,360 Speaker 1: the art by this artist, and group B is the 265 00:15:13,560 --> 00:15:17,560 Speaker 1: people who preferred the art by this other artist. Yeah. Yeah, 266 00:15:17,640 --> 00:15:19,120 Speaker 1: you can certainly break it down like that. But I 267 00:15:19,120 --> 00:15:21,040 Speaker 1: think on the other end, you could also just not 268 00:15:21,160 --> 00:15:24,600 Speaker 1: explain what the methodology is at all. Like, 269 00:15:24,960 --> 00:15:27,640 Speaker 1: you could just put people into groups, and it's just 270 00:15:27,680 --> 00:15:31,600 Speaker 1: the idea that there's something about data that originates in 271 00:15:31,640 --> 00:15:35,520 Speaker 1: you that informed this choice, because it's actually random. But 272 00:15:35,600 --> 00:15:37,480 Speaker 1: you don't want people to think it's random; you 273 00:15:37,520 --> 00:15:40,400 Speaker 1: want them to think that it's based on something.
Yes. 274 00:15:40,920 --> 00:15:44,200 Speaker 1: Now, there's no interaction between the resulting groups, no room 275 00:15:44,240 --> 00:15:47,600 Speaker 1: for interpersonal bonds, you know, not enough to be like, hey, 276 00:15:47,640 --> 00:15:51,320 Speaker 1: those are the people who apparently guessed differently from me about 277 00:15:51,440 --> 00:15:54,240 Speaker 1: jelly beans in a jar. They seem a little stuck up, 278 00:15:54,320 --> 00:15:55,720 Speaker 1: or they seem a little stingy. You know, there's no 279 00:15:55,840 --> 00:15:57,760 Speaker 1: room for that at all, or likewise, no room for 280 00:15:57,760 --> 00:16:00,160 Speaker 1: you to say, well, they seem like decent people, even 281 00:16:00,200 --> 00:16:02,720 Speaker 1: if they count dots differently or estimate dots differently than 282 00:16:02,720 --> 00:16:05,920 Speaker 1: I do. In fact, group membership was anonymous, right? 283 00:16:05,960 --> 00:16:07,920 Speaker 1: So you didn't know who was in the other group. You 284 00:16:07,920 --> 00:16:09,920 Speaker 1: couldn't, like, look around the room and be like, right, here 285 00:16:09,920 --> 00:16:13,440 Speaker 1: are the group A people. Yeah, and that's important to 286 00:16:13,520 --> 00:16:16,440 Speaker 1: stress too, because, yeah, really the beauty of 287 00:16:16,480 --> 00:16:19,240 Speaker 1: this experiment and the attractiveness of it is that it 288 00:16:19,320 --> 00:16:23,200 Speaker 1: does just strip everything else away, everything that you could 289 00:16:23,360 --> 00:16:27,640 Speaker 1: use and that could therefore muddy and complicate the findings. Okay, 290 00:16:27,680 --> 00:16:30,640 Speaker 1: so people are assigned into these random groups. They think 291 00:16:30,680 --> 00:16:33,040 Speaker 1: there is a reason for the assignment. They don't know 292 00:16:33,080 --> 00:16:35,160 Speaker 1: who's in the groups. They just know they're in one 293 00:16:35,160 --> 00:16:38,640 Speaker 1: of them.
Right. So now it's time to get busy here. 294 00:16:39,320 --> 00:16:42,040 Speaker 1: A second task is assigned, in which subjects had to 295 00:16:42,080 --> 00:16:47,000 Speaker 1: assign rewards to anonymous individuals, either two from the 296 00:16:47,040 --> 00:16:49,760 Speaker 1: in-group, two from the out-group, or one from each. 297 00:16:50,400 --> 00:16:53,160 Speaker 1: These individuals would be marked by a code number, and 298 00:16:53,440 --> 00:16:55,960 Speaker 1: your code number would never come up, so it wasn't 299 00:16:55,960 --> 00:16:59,440 Speaker 1: completely self-serving, right? And these would be what are 300 00:16:59,480 --> 00:17:02,960 Speaker 1: known as allocation tasks, tasks that are used in a 301 00:17:03,040 --> 00:17:06,040 Speaker 1: number of different experiments to try to see what people 302 00:17:06,240 --> 00:17:10,000 Speaker 1: value or reward. And generally these are just experiments where 303 00:17:10,000 --> 00:17:14,120 Speaker 1: subjects play some kind of game that involves distributing rewards, 304 00:17:14,119 --> 00:17:17,040 Speaker 1: often monetary rewards, like a number of dollars or 305 00:17:18,200 --> 00:17:20,840 Speaker 1: tokens of some kind that can be exchanged for something, 306 00:17:21,400 --> 00:17:24,840 Speaker 1: to these anonymous players belonging to the in-group or 307 00:17:24,880 --> 00:17:28,439 Speaker 1: the out-group or both. Yeah, and you might think that 308 00:17:28,480 --> 00:17:31,480 Speaker 1: this would all favor even-handed distribution, since there's just 309 00:17:31,520 --> 00:17:34,840 Speaker 1: so little to go on aside from group affiliation in 310 00:17:34,960 --> 00:17:38,160 Speaker 1: divvying it up. And I've read that in some cases 311 00:17:38,480 --> 00:17:41,560 Speaker 1: many subjects did try to distribute things kind of fairly.
312 00:17:41,720 --> 00:17:46,000 Speaker 1: Like, one of the later reviews of the minimal group paradigm 313 00:17:46,040 --> 00:17:48,840 Speaker 1: I was looking at, by Sabine Otten from twenty sixteen, 314 00:17:49,800 --> 00:17:54,919 Speaker 1: said that basically, quote, fairness concerns strongly guided intergroup allocations, 315 00:17:55,119 --> 00:17:59,200 Speaker 1: but that didn't always hold true. There were a number 316 00:17:59,240 --> 00:18:04,560 Speaker 1: of exceptions. Yeah. Ultimately, subjects consistently engaged in 317 00:18:04,680 --> 00:18:08,439 Speaker 1: in-group bias. So the groups were, again, entirely made up 318 00:18:08,480 --> 00:18:12,280 Speaker 1: by chance, there was no contact here, but it was 319 00:18:12,400 --> 00:18:15,640 Speaker 1: enough to generate a sense of group belonging. It created 320 00:18:15,680 --> 00:18:18,359 Speaker 1: an us, and it also created a them, which you 321 00:18:18,480 --> 00:18:22,800 Speaker 1: then see borne out in the study results. So that's 322 00:18:22,840 --> 00:18:25,600 Speaker 1: the big take-home from the minimal group paradigm: even 323 00:18:25,600 --> 00:18:30,480 Speaker 1: without factors such as religion, race, nationality, socioeconomic class, even 324 00:18:30,520 --> 00:18:33,200 Speaker 1: without things like, what do people 325 00:18:33,200 --> 00:18:35,720 Speaker 1: in the outside group look like or act like, or, 326 00:18:35,760 --> 00:18:37,879 Speaker 1: you know, what do I have in common with the 327 00:18:37,880 --> 00:18:40,120 Speaker 1: people around me? Even stripping all that away, 328 00:18:40,440 --> 00:18:45,200 Speaker 1: humans rather swiftly form factions and discriminate against others.
So, 329 00:18:45,240 --> 00:18:48,240 Speaker 1: to offer a little bit more detail from that later 330 00:18:48,280 --> 00:18:51,719 Speaker 1: piece I mentioned, this was a paper called The Minimal 331 00:18:51,760 --> 00:18:56,000 Speaker 1: Group Paradigm and its Maximal Impact in Research on Social Categorization. 332 00:18:56,119 --> 00:18:58,919 Speaker 1: This was published in Current Opinion in Psychology in twenty 333 00:18:58,960 --> 00:19:03,359 Speaker 1: sixteen by Sabine Otten. One thing Otten mentions is that when 334 00:19:03,520 --> 00:19:08,080 Speaker 1: Tajfel and colleagues first came up with the minimal group paradigm, 335 00:19:08,680 --> 00:19:14,080 Speaker 1: their original intention was apparently to investigate whether people would 336 00:19:14,119 --> 00:19:18,159 Speaker 1: display in group favoritism even in situations where there was 337 00:19:18,200 --> 00:19:21,679 Speaker 1: no actual conflict for resources between the two groups. So 338 00:19:21,720 --> 00:19:24,680 Speaker 1: their original question was a little bit different, but as 339 00:19:24,680 --> 00:19:28,919 Speaker 1: a preliminary avenue of research to that project, to address 340 00:19:28,960 --> 00:19:32,479 Speaker 1: this other question first, they wanted to just find out 341 00:19:32,520 --> 00:19:35,560 Speaker 1: what were the minimum criteria that could be leveraged to 342 00:19:35,600 --> 00:19:38,919 Speaker 1: cause people to show in group favoritism. So this was 343 00:19:38,920 --> 00:19:42,479 Speaker 1: originally supposed to be just like trying to establish what 344 00:19:42,520 --> 00:19:45,480 Speaker 1: they would need to do in this other test. Otten 345 00:19:45,560 --> 00:19:48,520 Speaker 1: writes, quote, They planned to start with a most minimal 346 00:19:48,600 --> 00:19:52,320 Speaker 1: setup and to successively add elements to the design until 347 00:19:52,520 --> 00:19:57,639 Speaker 1: intergroup discrimination would emerge.
So they started with these novel 348 00:19:57,720 --> 00:20:00,880 Speaker 1: social categorizations based on things that they expected to have 349 00:20:01,119 --> 00:20:05,159 Speaker 1: no power of social cohesion at all, like tendencies in, 350 00:20:05,400 --> 00:20:08,760 Speaker 1: you know, the numerical estimation game, the dots game, or 351 00:20:09,240 --> 00:20:12,440 Speaker 1: preferences for the types of paintings. And again these were 352 00:20:12,480 --> 00:20:15,520 Speaker 1: only pretenses. The people were actually assigned to groups randomly 353 00:20:15,600 --> 00:20:18,800 Speaker 1: in most or all cases. And instead what they found 354 00:20:18,920 --> 00:20:23,240 Speaker 1: was that these fake, made up bases for social categorization 355 00:20:23,400 --> 00:20:27,199 Speaker 1: were good enough to kickstart in group favoritism. So they 356 00:20:27,200 --> 00:20:30,359 Speaker 1: were trying to find the minimum criteria, and it turns 357 00:20:30,359 --> 00:20:32,400 Speaker 1: out they just didn't really have to look very hard. 358 00:20:32,400 --> 00:20:36,200 Speaker 1: There's barely a minimum at all. Yeah, and that again 359 00:20:36,240 --> 00:20:38,680 Speaker 1: I think is the thing that floored everybody, 360 00:20:38,720 --> 00:20:41,119 Speaker 1: and I think still floors people when they hear about 361 00:20:41,119 --> 00:20:54,520 Speaker 1: it for the first time or are reminded of it. Otten identifies 362 00:20:54,640 --> 00:20:59,320 Speaker 1: three experimental features for recognizing what authors in this experimental 363 00:20:59,359 --> 00:21:02,720 Speaker 1: domain call the mere categorization effect. That's what's going on in 364 00:21:02,760 --> 00:21:06,239 Speaker 1: the minimal group paradigm.
It's like people are behaving in 365 00:21:06,280 --> 00:21:10,840 Speaker 1: ways that indicate group favoritism, but only based on merely 366 00:21:10,880 --> 00:21:13,960 Speaker 1: being categorized in a group, and like nothing happening in 367 00:21:13,960 --> 00:21:17,800 Speaker 1: the real world. The three features, as Otten lists them, are: 368 00:21:18,000 --> 00:21:22,040 Speaker 1: Number one, categorization is novel and arbitrary, no history of 369 00:21:22,080 --> 00:21:25,640 Speaker 1: experiences with in group and or out group, so it's got 370 00:21:25,640 --> 00:21:28,600 Speaker 1: to be all new in the experiment. Number two, categorization 371 00:21:28,720 --> 00:21:32,720 Speaker 1: is anonymous, no face to face interaction between group members, 372 00:21:32,760 --> 00:21:36,119 Speaker 1: because you can obviously see how that would introduce complications. 373 00:21:36,680 --> 00:21:42,160 Speaker 1: And number three, no utilitarian self interest can be directly served 374 00:21:42,240 --> 00:21:46,119 Speaker 1: by intergroup evaluations or allocations. So you don't want to 375 00:21:46,119 --> 00:21:50,080 Speaker 1: complicate your study by having people have the ability to 376 00:21:50,320 --> 00:21:54,000 Speaker 1: pay money out to themselves, because that would obviously add 377 00:21:54,000 --> 00:21:56,960 Speaker 1: in new variables. Right right, That would be the self 378 00:21:57,000 --> 00:22:01,600 Speaker 1: interest kicking in for sure. But a really interesting thing 379 00:22:01,760 --> 00:22:05,920 Speaker 1: emerges with the allocation tasks, that is, along the lines 380 00:22:05,960 --> 00:22:09,480 Speaker 1: of self interest. Instead of individual self interest, it is 381 00:22:09,640 --> 00:22:13,479 Speaker 1: in group interest.
So Otten says, as we've discussed, there 382 00:22:13,520 --> 00:22:17,240 Speaker 1: were fairness concerns that did guide some intergroup allocations, 383 00:22:17,280 --> 00:22:21,280 Speaker 1: but also there was evidence of in group favoritism even 384 00:22:21,320 --> 00:22:24,119 Speaker 1: when the group had just been formed. It meant essentially 385 00:22:24,119 --> 00:22:27,280 Speaker 1: nothing and the members were anonymous. And here's a really 386 00:22:27,320 --> 00:22:32,560 Speaker 1: interesting thing. In some cases quote, the tendency to positively 387 00:22:32,720 --> 00:22:37,720 Speaker 1: differentiate the in group from the outgroup was stronger than 388 00:22:37,760 --> 00:22:43,000 Speaker 1: the tendency to maximize the in group's profit. So the 389 00:22:43,080 --> 00:22:47,280 Speaker 1: example Otten gives here would be, instead of giving twelve 390 00:22:47,440 --> 00:22:50,959 Speaker 1: dollars to the in group and eleven dollars to the outgroup, 391 00:22:51,440 --> 00:22:56,000 Speaker 1: some subjects would select a strategy that gave eleven dollars 392 00:22:56,080 --> 00:22:59,280 Speaker 1: to the in group and nine dollars to the outgroup. 393 00:23:00,280 --> 00:23:04,439 Speaker 1: Everybody gets less, but the difference between the rewards of 394 00:23:04,480 --> 00:23:07,520 Speaker 1: the two groups is greater.
And if you were on 395 00:23:07,560 --> 00:23:10,720 Speaker 1: the top of that difference, even if you got less, 396 00:23:10,760 --> 00:23:14,480 Speaker 1: some people preferred that. Oh wow. And this thing about 397 00:23:14,640 --> 00:23:19,639 Speaker 1: sacrificing the overall objective gains of the in group for 398 00:23:19,880 --> 00:23:23,320 Speaker 1: a greater distinction in gains between the in group and 399 00:23:23,359 --> 00:23:26,440 Speaker 1: out group made me think about a thing I read 400 00:23:26,440 --> 00:23:29,159 Speaker 1: in the context of a different paper exploring a different theory, 401 00:23:29,800 --> 00:23:33,280 Speaker 1: but it was one by the Harvard psychologist Jim Sidanius 402 00:23:33,400 --> 00:23:38,199 Speaker 1: and co authors called Vladimir's Choice and the Distribution of 403 00:23:38,240 --> 00:23:42,760 Speaker 1: Social Resources: A Group Dominance Perspective. This was exploring a 404 00:23:42,800 --> 00:23:46,640 Speaker 1: different theory called social dominance theory, but it starts off 405 00:23:46,680 --> 00:23:51,720 Speaker 1: with this anecdote that apparently comes from an Eastern European fable. 406 00:23:52,480 --> 00:23:56,480 Speaker 1: The authors relate it as follows: one day, God came down 407 00:23:56,520 --> 00:23:59,720 Speaker 1: to Vladimir, a poor peasant, and said, Vladimir, I will 408 00:23:59,720 --> 00:24:03,640 Speaker 1: grant you one wish. Anything you want will be yours. However, 409 00:24:03,760 --> 00:24:07,119 Speaker 1: God added, there is one condition. Anything I give to 410 00:24:07,160 --> 00:24:10,560 Speaker 1: you will be granted to your neighbor twice over. 411 00:24:11,240 --> 00:24:15,240 Speaker 1: Vladimir immediately answered, saying, okay, take out one of my eyes. 412 00:24:15,880 --> 00:24:19,399 Speaker 1: Oh that's grim, that's very grim.
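[Editor's aside, for readers following along in text: the allocation choice the hosts just described, eleven and nine dollars versus twelve and eleven dollars, can be sketched in a few lines of Python. The function names and the option list below are illustrative only; they are not taken from Otten's paper or from the actual Tajfel allocation matrices.]

```python
# Illustrative sketch of the two allocation strategies discussed above.
# Each option is a pair: (dollars to the in group, dollars to the out group).
options = [(12, 11), (11, 9)]

def max_ingroup_profit(opts):
    """Pick the option that gives the in group the largest absolute payout."""
    return max(opts, key=lambda o: o[0])

def max_difference(opts):
    """Pick the option that maximizes the in group's lead over the out group."""
    return max(opts, key=lambda o: o[0] - o[1])

print(max_ingroup_profit(options))  # (12, 11): more money for the in group
print(max_difference(options))      # (11, 9): less money, but a bigger gap
```

Subjects favoring the second strategy are, in effect, optimizing relative standing rather than absolute reward, which is the pattern Otten highlights.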
Now, the stakes in 413 00:24:19,440 --> 00:24:22,760 Speaker 1: these minimal group paradigm experiments are certainly not that high, 414 00:24:22,960 --> 00:24:25,720 Speaker 1: but I think we can all think of examples where, 415 00:24:26,400 --> 00:24:29,639 Speaker 1: you know, sometimes you just see a case where what 416 00:24:29,720 --> 00:24:33,560 Speaker 1: appears to be spite or maybe something else like that 417 00:24:34,440 --> 00:24:38,280 Speaker 1: overrides a person's own objective self interest, like they would 418 00:24:38,440 --> 00:24:43,359 Speaker 1: rather have a higher degree of advantage over a known 419 00:24:43,480 --> 00:24:49,320 Speaker 1: neighbor or adversary than a greater objective advantage overall. It'd 420 00:24:49,359 --> 00:24:51,399 Speaker 1: be like if you sort of had it in for 421 00:24:52,000 --> 00:24:54,159 Speaker 1: your buddy and it was your turn to pick the 422 00:24:55,480 --> 00:24:57,360 Speaker 1: type of pizza you get, and you'd make sure you 423 00:24:57,359 --> 00:25:00,439 Speaker 1: got a flavor that you weren't crazy about, but you 424 00:25:00,520 --> 00:25:04,400 Speaker 1: knew that your friend hated. Yeah, like you're 425 00:25:04,440 --> 00:25:08,800 Speaker 1: willing to choke it down just because, more refreshing to you, 426 00:25:08,920 --> 00:25:12,160 Speaker 1: more delicious to you, is the fact that they 427 00:25:12,160 --> 00:25:15,720 Speaker 1: are going to dislike it more than you do. Which, 428 00:25:15,720 --> 00:25:18,280 Speaker 1: again, is illogical. It shouldn't be a thing that someone 429 00:25:18,320 --> 00:25:20,640 Speaker 1: would do.
But I think we can all easily imagine 430 00:25:20,640 --> 00:25:24,719 Speaker 1: a scenario where someone's pettiness and spite would lead to 431 00:25:24,720 --> 00:25:26,840 Speaker 1: such an occurrence, and maybe this one, like, that's a 432 00:25:26,920 --> 00:25:28,600 Speaker 1: version of it, and maybe a little more real 433 00:25:28,640 --> 00:25:32,160 Speaker 1: world accurate as opposed to the blinding. The gulf between 434 00:25:32,359 --> 00:25:36,760 Speaker 1: your okayness and your friend's misery is more valuable to 435 00:25:36,840 --> 00:25:39,680 Speaker 1: you than the extra pleasure you would get from getting 436 00:25:39,680 --> 00:25:42,639 Speaker 1: a topping you really liked. Right. And for some reason, 437 00:25:42,680 --> 00:25:45,280 Speaker 1: this whole scenario makes more sense concerning friends 438 00:25:45,359 --> 00:25:48,199 Speaker 1: than it does like enemies of any sort. I don't know, 439 00:25:48,440 --> 00:25:52,479 Speaker 1: perhaps that suggests a lot about 440 00:25:52,640 --> 00:25:55,480 Speaker 1: the way relationships work. Now there's some caveats to this 441 00:25:55,600 --> 00:25:57,440 Speaker 1: that I want to get into in a second, because, 442 00:25:57,680 --> 00:26:01,280 Speaker 1: to come back to that paper by Otten, one thing 443 00:26:01,320 --> 00:26:05,880 Speaker 1: I was interested in was criticisms of the minimal group paradigm. 444 00:26:05,920 --> 00:26:10,120 Speaker 1: It does seem that the MGP findings have been widely 445 00:26:10,119 --> 00:26:13,520 Speaker 1: replicated with a lot of superficial variations, So it does 446 00:26:13,600 --> 00:26:16,520 Speaker 1: look to me like the finding is robust.
But while 447 00:26:16,520 --> 00:26:18,880 Speaker 1: the finding itself is sound, you could argue that people 448 00:26:18,960 --> 00:26:23,040 Speaker 1: might be drawing the wrong conclusions from it, and so 449 00:26:23,080 --> 00:26:26,960 Speaker 1: there are a number of criticisms along those lines. One 450 00:26:27,000 --> 00:26:29,960 Speaker 1: thing that comes up in Otten's paper here is, is 451 00:26:30,000 --> 00:26:33,920 Speaker 1: the minimal group paradigm really revealing something about how people 452 00:26:33,960 --> 00:26:38,040 Speaker 1: would behave in the real world, or does the experiment 453 00:26:38,160 --> 00:26:43,200 Speaker 1: quote merely create a situation in which social category information 454 00:26:43,359 --> 00:26:48,440 Speaker 1: receives unrealistic attention? I was like, Oh, I think that's interesting, 455 00:26:48,480 --> 00:26:53,040 Speaker 1: because, Okay, you're in a contrived laboratory scenario. Your membership 456 00:26:53,040 --> 00:26:56,080 Speaker 1: in one group or the other is highlighted to you, 457 00:26:56,280 --> 00:26:59,000 Speaker 1: people are telling you about it, and the situation is 458 00:26:59,000 --> 00:27:02,600 Speaker 1: stripped of a lot of other contextual information that would 459 00:27:02,640 --> 00:27:06,640 Speaker 1: exist in the real world that would normally inform your behavior. 460 00:27:06,920 --> 00:27:10,840 Speaker 1: Maybe people are placing undue weight on group membership even 461 00:27:10,880 --> 00:27:15,000 Speaker 1: though it's arbitrary, because it's really like the only variable 462 00:27:15,040 --> 00:27:18,960 Speaker 1: they're made aware of in this situation.
On the other hand, 463 00:27:19,640 --> 00:27:21,840 Speaker 1: while that criticism makes a lot of sense to me, 464 00:27:22,880 --> 00:27:26,000 Speaker 1: I think these experiments are just as valuable if you 465 00:27:26,080 --> 00:27:29,560 Speaker 1: think about them with that caveat in mind. Like, they 466 00:27:29,640 --> 00:27:33,520 Speaker 1: show a certain irrational way that some people behave, showing 467 00:27:33,640 --> 00:27:38,920 Speaker 1: in group preference for utterly arbitrary groups when group membership 468 00:27:39,119 --> 00:27:42,200 Speaker 1: is made salient, when it is brought to your attention 469 00:27:42,320 --> 00:27:45,439 Speaker 1: and people are talking about it, which is something that 470 00:27:45,520 --> 00:27:47,879 Speaker 1: does happen in the real world all the time. Actually, like, 471 00:27:48,000 --> 00:27:53,200 Speaker 1: there is some category distinction between people that was maybe 472 00:27:53,280 --> 00:27:58,080 Speaker 1: not previously much noted, and for some reason suddenly it 473 00:27:58,240 --> 00:28:01,920 Speaker 1: is made salient. People start paying attention to this difference 474 00:28:02,000 --> 00:28:05,240 Speaker 1: and talking about it. It seems to me that in 475 00:28:05,359 --> 00:28:09,399 Speaker 1: reality this is enough to trigger minimal group paradigm effects. 476 00:28:11,119 --> 00:28:13,679 Speaker 1: This is only partially related, but it reminds me of 477 00:28:13,720 --> 00:28:18,880 Speaker 1: that thing when an arbitrary, factual question that previously had 478 00:28:18,960 --> 00:28:24,440 Speaker 1: no political valence suddenly becomes politicized for some reason.
Yeah, 479 00:28:25,040 --> 00:28:28,800 Speaker 1: maybe by like a prominent politician taking a stance one 480 00:28:28,840 --> 00:28:32,239 Speaker 1: way or another on this question, and now suddenly, like, 481 00:28:33,080 --> 00:28:36,760 Speaker 1: what you think about this question that previously involved 482 00:28:36,840 --> 00:28:40,840 Speaker 1: no political values now is a major part of your identity, 483 00:28:41,320 --> 00:28:45,360 Speaker 1: and people will factionalize on the basis of it. Yeah, 484 00:28:45,360 --> 00:28:47,280 Speaker 1: And sometimes it takes the form of just sort 485 00:28:47,360 --> 00:28:51,360 Speaker 1: of, you know, fear mongering about something that normally 486 00:28:51,440 --> 00:28:54,120 Speaker 1: had no real kind of like fear weight to it. 487 00:28:54,440 --> 00:28:56,720 Speaker 1: Like I instantly think of various things going on during, 488 00:28:56,760 --> 00:29:00,640 Speaker 1: say, the Satanic Panic, where, you know, suddenly 489 00:29:00,640 --> 00:29:02,880 Speaker 1: there's some sort of an outrage over 490 00:29:03,200 --> 00:29:07,480 Speaker 1: a particular piece of music that is interpreted by somebody 491 00:29:07,520 --> 00:29:11,160 Speaker 1: as having some sort of subliminal, demonic message inside it, 492 00:29:11,680 --> 00:29:13,560 Speaker 1: even if there's little or no proof that that is 493 00:29:13,880 --> 00:29:17,200 Speaker 1: even possibly the case or certainly the intent of the artist. 494 00:29:17,360 --> 00:29:19,160 Speaker 1: It ends up picking up steam all its own. And 495 00:29:19,200 --> 00:29:21,960 Speaker 1: then where do you fall on this divide? Totally.
Now, 496 00:29:22,080 --> 00:29:25,400 Speaker 1: to be fair, things like that are not purely minimal 497 00:29:25,480 --> 00:29:28,720 Speaker 1: group paradigm, because once you're talking about like cultural artifacts 498 00:29:28,720 --> 00:29:32,000 Speaker 1: and preferences, you do start bringing in like, well, maybe 499 00:29:32,000 --> 00:29:35,520 Speaker 1: that already touches certain things about, you know, cultural identity, 500 00:29:35,520 --> 00:29:38,160 Speaker 1: which people would have opinions about and would have some 501 00:29:38,320 --> 00:29:41,160 Speaker 1: in group outgroup associations and so forth. But it's sort 502 00:29:41,160 --> 00:29:43,440 Speaker 1: of halfway there. I wonder if there might be a 503 00:29:43,440 --> 00:29:45,640 Speaker 1: comparison to draw here. There were two things 504 00:29:45,680 --> 00:29:47,720 Speaker 1: in recent years. There was the whole like, what 505 00:29:47,800 --> 00:29:51,400 Speaker 1: color is this dress? Right? And people were split over that. 506 00:29:51,440 --> 00:29:53,000 Speaker 1: I don't know, I mean, not to the point where 507 00:29:53,040 --> 00:29:58,040 Speaker 1: I guess you really saw outgroup discrimination. But it was 508 00:29:58,080 --> 00:30:01,680 Speaker 1: interesting to see how quickly people became, like, they were 509 00:30:01,760 --> 00:30:05,600 Speaker 1: quick to state what their interpretation of it was and 510 00:30:05,760 --> 00:30:08,400 Speaker 1: become a part of that group that saw it a certain way. 511 00:30:08,560 --> 00:30:10,360 Speaker 1: I do not remember what I thought of this dress 512 00:30:10,440 --> 00:30:13,440 Speaker 1: or even who wore it. Oh, I just remember being 513 00:30:14,000 --> 00:30:16,240 Speaker 1: amused that it was a thing at all. I think 514 00:30:16,240 --> 00:30:18,600 Speaker 1: I remember when I first saw it, it looked blue 515 00:30:18,720 --> 00:30:21,800 Speaker 1: and black to me. So hate me if you want.
516 00:30:32,440 --> 00:30:34,640 Speaker 1: But anyway, coming back to this issue, so it may 517 00:30:34,680 --> 00:30:37,719 Speaker 1: be a good criticism that this has some limitation in 518 00:30:37,760 --> 00:30:39,800 Speaker 1: how it applies to the real world, once you bring 519 00:30:39,800 --> 00:30:43,160 Speaker 1: in all the context of culture and all that. But 520 00:30:43,320 --> 00:30:46,560 Speaker 1: I do think it still probably highlights something very interesting, 521 00:30:46,760 --> 00:30:50,440 Speaker 1: which is that sort of in group favoritism can 522 00:30:50,440 --> 00:30:54,400 Speaker 1: emerge with minimal stimulation, just by like drawing a lot 523 00:30:54,440 --> 00:30:58,800 Speaker 1: of attention to the presence and division and differentiation of the groups. 524 00:31:00,080 --> 00:31:04,479 Speaker 1: Another interesting limitation that Otten mentions: subsequent research has shown 525 00:31:04,560 --> 00:31:09,080 Speaker 1: that in group favoritism with the minimal group paradigm is 526 00:31:09,520 --> 00:31:15,720 Speaker 1: quote mostly restricted to allocations of positive resources and to evaluations 527 00:31:15,720 --> 00:31:20,320 Speaker 1: regarding positive traits. So when you're talking about things like 528 00:31:20,440 --> 00:31:27,320 Speaker 1: assigning actual punishments or negative personal assessments, it seems that 529 00:31:27,360 --> 00:31:33,520 Speaker 1: the mere categorization effect no longer reliably produces results, which 530 00:31:33,600 --> 00:31:36,240 Speaker 1: should be a good result, right? Yeah, yeah, knowing that 531 00:31:36,920 --> 00:31:40,040 Speaker 1: eye gouging actually wouldn't play out all that well in 532 00:31:40,120 --> 00:31:43,040 Speaker 1: this scenario.
Right, So, maybe experiments show the minimal group 533 00:31:43,080 --> 00:31:46,680 Speaker 1: stuff is enough to make you treat your in group 534 00:31:46,760 --> 00:31:50,680 Speaker 1: better and maybe even in some cases prefer them to 535 00:31:50,720 --> 00:31:53,560 Speaker 1: get a better leg up over the other group as 536 00:31:53,560 --> 00:31:57,920 Speaker 1: opposed to more payout overall. But it doesn't extend 537 00:31:57,960 --> 00:32:01,800 Speaker 1: to actually wanting to hurt or punish the outgroup. Yes, 538 00:32:01,960 --> 00:32:05,520 Speaker 1: though not to imply, though, that just not wanting or 539 00:32:05,560 --> 00:32:07,880 Speaker 1: not thinking about actively hurting the group means that, 540 00:32:08,320 --> 00:32:11,880 Speaker 1: in the real world implications of the minimal 541 00:32:11,880 --> 00:32:15,160 Speaker 1: group paradigm, plenty of hurt might not be inflicted, 542 00:32:15,720 --> 00:32:17,720 Speaker 1: you know, especially if you're dealing with, like, you know, any 543 00:32:17,800 --> 00:32:22,160 Speaker 1: kind of outgroup discrimination could of course have terrible effects 544 00:32:22,800 --> 00:32:25,520 Speaker 1: in the real world, but in those situations you'd be 545 00:32:25,560 --> 00:32:28,320 Speaker 1: going beyond the conditions of the minimal group paradigm and 546 00:32:28,360 --> 00:32:31,479 Speaker 1: sort of bringing in the real world. Yeah, but anyway, 547 00:32:31,560 --> 00:32:33,840 Speaker 1: I thought this was an interesting dynamic. So people might 548 00:32:33,880 --> 00:32:37,520 Speaker 1: be more willing to allocate monetary payments to their own group, 549 00:32:38,120 --> 00:32:42,280 Speaker 1: even if that group is novel or arbitrary. But studies 550 00:32:42,320 --> 00:32:45,080 Speaker 1: don't reliably show people to be willing to dole out 551 00:32:45,120 --> 00:32:49,640 Speaker 1: punishments or disparagements against a novel, arbitrary outgroup.
Why might 552 00:32:49,680 --> 00:32:54,000 Speaker 1: this be? One interpretation given in this paper is it's 553 00:32:54,040 --> 00:32:57,320 Speaker 1: possible that the in group favoritism in minimal group 554 00:32:57,320 --> 00:33:02,200 Speaker 1: paradigm experiments shows up because people have positive associations with 555 00:33:02,240 --> 00:33:06,160 Speaker 1: themselves, and hey, I'm part of the in group, so 556 00:33:06,600 --> 00:33:09,120 Speaker 1: I'm good and deserving, and I'm part of group A, 557 00:33:09,280 --> 00:33:12,120 Speaker 1: and therefore group A is good and deserving, and there 558 00:33:12,200 --> 00:33:16,840 Speaker 1: might not really be an equivalent mechanism of comparison with 559 00:33:16,880 --> 00:33:19,600 Speaker 1: the outgroup. So the same logic doesn't lead someone to 560 00:33:19,640 --> 00:33:24,280 Speaker 1: conclude that group B is bad and undeserving, So you 561 00:33:24,360 --> 00:33:27,240 Speaker 1: might not actually go so far as to select punishments 562 00:33:27,280 --> 00:33:31,640 Speaker 1: and disparagements for them. Yet this would raise interesting questions. 563 00:33:31,640 --> 00:33:34,840 Speaker 1: It would bring me back to that thing about why 564 00:33:34,960 --> 00:33:39,880 Speaker 1: people so often in these experiments will sacrifice overall rewards 565 00:33:39,880 --> 00:33:42,520 Speaker 1: of the in group to get a bigger leg up 566 00:33:42,640 --> 00:33:46,440 Speaker 1: on the outgroup. Because again, remember, like, you know, a 567 00:33:46,440 --> 00:33:48,280 Speaker 1: lot of these findings are, if I'm in group A 568 00:33:48,560 --> 00:33:52,360 Speaker 1: but not personally receiving any rewards, I might choose a 569 00:33:52,400 --> 00:33:55,720 Speaker 1: plan where group A gets ten and group B gets 570 00:33:55,720 --> 00:33:58,720 Speaker 1: seven, instead of a plan where group A gets twelve 571 00:33:58,800 --> 00:34:01,840 Speaker 1: and group B gets eleven.
If this is not 572 00:34:01,960 --> 00:34:04,960 Speaker 1: to be interpreted as an attempt to punish group B, 573 00:34:05,840 --> 00:34:10,120 Speaker 1: what does it mean? Maybe it means that some people 574 00:34:10,280 --> 00:34:15,439 Speaker 1: sometimes interpret it as a greater personal reward to get 575 00:34:15,520 --> 00:34:18,719 Speaker 1: significantly more than your neighbor than it would be to 576 00:34:18,800 --> 00:34:22,640 Speaker 1: get a greater objective reward overall. Like some people would 577 00:34:22,680 --> 00:34:26,040 Speaker 1: just rather come in second place and have Jeff come 578 00:34:26,080 --> 00:34:29,320 Speaker 1: in sixth place than come in first place 579 00:34:29,360 --> 00:34:32,600 Speaker 1: and have Jeff come in second. Yeah, it's interesting to 580 00:34:32,680 --> 00:34:35,160 Speaker 1: sort of crunch that and try and apply it 581 00:34:35,200 --> 00:34:40,640 Speaker 1: to some sort of, you know, hunter gatherer scenario and 582 00:34:40,719 --> 00:34:43,240 Speaker 1: try and figure out how that makes sense, even, 583 00:34:43,280 --> 00:34:46,640 Speaker 1: you know, like in those situations. But yeah, 584 00:34:46,680 --> 00:34:48,960 Speaker 1: I don't know, that is a weird little wrinkle in 585 00:34:49,040 --> 00:34:53,160 Speaker 1: human nature. Yeah, you know, sometimes I see this discussed 586 00:34:53,200 --> 00:34:54,719 Speaker 1: and I think of it in terms of it's kind 587 00:34:54,719 --> 00:34:57,919 Speaker 1: of like the idea here is that MGP is kind 588 00:34:57,920 --> 00:35:02,520 Speaker 1: of like a bedrock scenario, you know, and that again, 589 00:35:02,600 --> 00:35:04,360 Speaker 1: when you bring it into the real world, everything else is 590 00:35:04,360 --> 00:35:06,040 Speaker 1: going to be built on top of that bedrock.
Or 591 00:35:06,080 --> 00:35:07,440 Speaker 1: you could think of it in terms of like just 592 00:35:07,440 --> 00:35:11,080 Speaker 1: sort of the initial, like, laying out with stakes of 593 00:35:11,120 --> 00:35:15,839 Speaker 1: what will become a cathedral. And so you're not necessarily 594 00:35:15,880 --> 00:35:17,799 Speaker 1: going to get the full picture of the cathedral looking 595 00:35:17,840 --> 00:35:20,960 Speaker 1: at the basic shape that you've marked out in the dirt, 596 00:35:22,080 --> 00:35:24,200 Speaker 1: but you may be able to figure out some things, 597 00:35:24,200 --> 00:35:27,759 Speaker 1: some of the sweeping ideas that will be present in 598 00:35:27,800 --> 00:35:30,680 Speaker 1: the final design. But then again, you have no idea 599 00:35:31,520 --> 00:35:34,080 Speaker 1: like what all the different cultural structures on top of 600 00:35:34,120 --> 00:35:36,640 Speaker 1: it are ultimately going to produce. But it's still an 601 00:35:36,640 --> 00:35:40,000 Speaker 1: interesting exercise to sort of strip things down to this level. Now, 602 00:35:40,040 --> 00:35:42,160 Speaker 1: I want to come back to Brown for just one 603 00:35:42,239 --> 00:35:46,360 Speaker 1: last thing here, because in that paper Brown stresses the 604 00:35:46,440 --> 00:35:49,480 Speaker 1: historical context of MGP and says it is also important to 605 00:35:49,520 --> 00:35:54,480 Speaker 1: consider, especially as it regards two major points.
So first 606 00:35:54,480 --> 00:35:56,759 Speaker 1: of all, he says that during the mid to late 607 00:35:56,840 --> 00:35:59,719 Speaker 1: sixties there was a so called crisis in social psychology 608 00:36:00,160 --> 00:36:03,239 Speaker 1: in which North American scholars in particular were questioning whether 609 00:36:03,280 --> 00:36:07,520 Speaker 1: European studies involving a great deal of laboratory experimentation could 610 00:36:07,520 --> 00:36:10,560 Speaker 1: actually apply to real world social issues of the time. 611 00:36:11,200 --> 00:36:13,160 Speaker 1: So this led to a lot of soul searching and 612 00:36:13,280 --> 00:36:16,920 Speaker 1: changes in Western psychology in general. And ironically, there were 613 00:36:16,920 --> 00:36:20,640 Speaker 1: a lot of questions about quote unquote experiments in a vacuum. 614 00:36:20,719 --> 00:36:23,920 Speaker 1: Now it's ironic because, I mean, as we've been discussing, 615 00:36:24,400 --> 00:36:26,880 Speaker 1: the minimal group paradigm is very much an experiment in 616 00:36:26,880 --> 00:36:29,560 Speaker 1: a vacuum like that. A lot of effort goes into 617 00:36:30,200 --> 00:36:33,040 Speaker 1: sucking all of the real world complexity, sucking all the 618 00:36:33,120 --> 00:36:37,000 Speaker 1: air, out of the chamber of this experiment. But it 619 00:36:37,160 --> 00:36:40,040 Speaker 1: also could be seen, especially in the time period, as 620 00:36:40,080 --> 00:36:43,240 Speaker 1: kind of like a stripping down to a new bedrock, 621 00:36:43,320 --> 00:36:46,960 Speaker 1: to a new level upon which to try and understand, 622 00:36:47,080 --> 00:36:49,719 Speaker 1: like sort of like sweeping out, removing all those other 623 00:36:49,760 --> 00:36:54,440 Speaker 1: elements that were potentially complicating things.
And Brown also stresses 624 00:36:54,480 --> 00:36:58,120 Speaker 1: that prior to the minimal group paradigm, the main ideas for 625 00:36:58,200 --> 00:37:01,240 Speaker 1: why you had social prejudices in the real world 626 00:37:01,680 --> 00:37:06,880 Speaker 1: were tied to personality dynamics often connected to things in 627 00:37:06,920 --> 00:37:12,840 Speaker 1: your upbringing, built up frustration, and negative interdependence among groups, 628 00:37:13,400 --> 00:37:17,719 Speaker 1: and all of these ideas as sort of sweeping definitions 629 00:37:17,719 --> 00:37:22,680 Speaker 1: were challenged by experimental data. Instead, the minimal group paradigm 630 00:37:22,760 --> 00:37:27,640 Speaker 1: creates this, again, super stripped down, simple experiment that does 631 00:37:27,920 --> 00:37:30,560 Speaker 1: seem to reveal a lot about something, like the basic 632 00:37:30,640 --> 00:37:34,680 Speaker 1: mechanics of how we think about our group and outside groups. 633 00:37:35,320 --> 00:37:38,640 Speaker 1: Coming back to memes and so forth, it also reminds 634 00:37:38,640 --> 00:37:40,640 Speaker 1: me of a common thing you, I think, still see, 635 00:37:40,840 --> 00:37:42,560 Speaker 1: and that is people saying, well, there are two types 636 00:37:42,600 --> 00:37:44,200 Speaker 1: of people in the world. There are the people that 637 00:37:44,440 --> 00:37:46,759 Speaker 1: do or believe X and those who do or believe Y.
638 00:37:47,560 --> 00:37:50,359 Speaker 1: And I guess the thing that often makes them funny, 639 00:37:50,520 --> 00:37:53,680 Speaker 1: or potentially makes them funny, is 640 00:37:53,680 --> 00:37:56,120 Speaker 1: that they'll hit on a division you did not realize 641 00:37:56,200 --> 00:37:59,480 Speaker 1: was a thing, but then suddenly you're just presented with 642 00:37:59,520 --> 00:38:02,120 Speaker 1: this spark of an idea that this is truly a 643 00:38:02,200 --> 00:38:05,759 Speaker 1: defining choice to make. And you know, even though it's 644 00:38:05,800 --> 00:38:08,560 Speaker 1: generally played for laughs, you know, you can kind of 645 00:38:08,600 --> 00:38:12,319 Speaker 1: feel it, sort of, you can feel the divide sort 646 00:38:12,320 --> 00:38:14,719 Speaker 1: of moving, and you're sort of forced to step to 647 00:38:14,760 --> 00:38:17,400 Speaker 1: one side or the other, even if you don't actually 648 00:38:17,440 --> 00:38:20,719 Speaker 1: engage with said meme or said conversation. Well, I think 649 00:38:20,719 --> 00:38:23,160 Speaker 1: one of the things that's interesting about those memes is 650 00:38:23,280 --> 00:38:26,879 Speaker 1: they tend to, it's kind of, I feel like we should 651 00:38:26,880 --> 00:38:28,920 Speaker 1: give an example. What do they say? There are two 652 00:38:28,960 --> 00:38:31,680 Speaker 1: types of people, those who peel back the Slim Jim 653 00:38:31,760 --> 00:38:33,840 Speaker 1: wrapper as they eat it, or those who take the 654 00:38:33,840 --> 00:38:35,759 Speaker 1: Slim Jim out in one go and then eat it, 655 00:38:35,800 --> 00:38:38,160 Speaker 1: hold it with their fingers. You know, do 656 00:38:38,200 --> 00:38:40,640 Speaker 1: you get your fingers greasy or not?
Those memes are 657 00:38:40,640 --> 00:38:44,000 Speaker 1: funny because they ask people to read a lot into 658 00:38:44,040 --> 00:38:46,560 Speaker 1: a behavior that, on its face, we would assume does 659 00:38:46,560 --> 00:38:49,640 Speaker 1: not tell you much about a person. And exactly, the 660 00:38:49,719 --> 00:38:52,680 Speaker 1: humor is in trying to, like, extrapolate everything you could 661 00:38:52,680 --> 00:38:55,360 Speaker 1: possibly want to know about a person from that 662 00:38:55,440 --> 00:38:57,480 Speaker 1: one thing. Though that is often kind of what 663 00:38:57,520 --> 00:39:00,960 Speaker 1: we do. Like, you can imagine sitting in these 664 00:39:00,960 --> 00:39:05,480 Speaker 1: early experiments with Tajfel and saying like, you know, okay, 665 00:39:05,520 --> 00:39:07,799 Speaker 1: what about the people who counted the dots, you know, 666 00:39:07,840 --> 00:39:10,160 Speaker 1: the people who counted the dots differently? What does that 667 00:39:10,280 --> 00:39:14,080 Speaker 1: say about their personality? And trying to, like, work that 668 00:39:14,160 --> 00:39:18,120 Speaker 1: up into something meaningful about reality. Yeah. I mean, as humans, 669 00:39:18,120 --> 00:39:20,080 Speaker 1: we tend to look for the patterns in things. So 670 00:39:20,200 --> 00:39:24,040 Speaker 1: even when there's a random splitting, like if 671 00:39:24,400 --> 00:39:27,760 Speaker 1: it's supposedly based on how we counted the jellybeans 672 00:39:27,800 --> 00:39:29,880 Speaker 1: in a jar, how we saw the dots in 673 00:39:29,920 --> 00:39:31,680 Speaker 1: some sort of an array, we're going to think about 674 00:39:31,680 --> 00:39:34,000 Speaker 1: all the ways that that could potentially define who or 675 00:39:34,040 --> 00:39:36,360 Speaker 1: what we are.
You would say that because you're a 676 00:39:36,440 --> 00:39:41,040 Speaker 1: dot undercounter, probably. Yeah, I mean it does make you 677 00:39:41,080 --> 00:39:44,040 Speaker 1: think like, oh, does that mean I'm a pessimist? 678 00:39:45,320 --> 00:39:47,560 Speaker 1: Does that mean I'm just not that into sugar? What 679 00:39:47,640 --> 00:39:50,279 Speaker 1: does it mean? We can't help but try 680 00:39:50,320 --> 00:39:51,879 Speaker 1: and figure that out and come up with all sorts 681 00:39:51,920 --> 00:39:55,560 Speaker 1: of ridiculous theories as to what it says. All right, well, 682 00:39:55,560 --> 00:39:57,759 Speaker 1: we're gonna go and close it out right there, but 683 00:39:57,880 --> 00:40:00,799 Speaker 1: hopefully we gave you just a good taste of the 684 00:40:00,840 --> 00:40:04,759 Speaker 1: minimal group paradigm, like where it came from, what 685 00:40:04,840 --> 00:40:08,359 Speaker 1: it seems to mean, what it seems to tell us 686 00:40:08,400 --> 00:40:11,240 Speaker 1: about human nature. Obviously, we'd love to hear from everyone 687 00:40:11,239 --> 00:40:15,360 Speaker 1: out there. If you have some more great fictional examples 688 00:40:15,440 --> 00:40:19,080 Speaker 1: or real world examples of some of what's 689 00:40:19,080 --> 00:40:22,000 Speaker 1: going on here, write in; we'd love to hear from you. 690 00:40:22,719 --> 00:40:25,640 Speaker 1: We read listener mail every Monday in the Stuff 691 00:40:25,640 --> 00:40:27,520 Speaker 1: to Blow Your Mind podcast feed in our Stuff 692 00:40:27,520 --> 00:40:29,640 Speaker 1: to Blow Your Mind listener mail episodes. On Wednesdays we 693 00:40:29,680 --> 00:40:32,080 Speaker 1: do a short form artifact or monster fact. Tuesdays and 694 00:40:32,120 --> 00:40:35,200 Speaker 1: Thursdays are our core episodes, and on Fridays we set 695 00:40:35,239 --> 00:40:38,320 Speaker 1: aside most serious concerns to just watch a weird film.
696 00:40:38,760 --> 00:40:42,080 Speaker 1: Huge thanks to our audio producer JJ Posway. If you 697 00:40:42,080 --> 00:40:44,080 Speaker 1: would like to get in touch with us with feedback 698 00:40:44,120 --> 00:40:46,920 Speaker 1: on this episode or any other, to suggest a topic for 699 00:40:46,920 --> 00:40:49,160 Speaker 1: the future, or just to say hello, you can email 700 00:40:49,239 --> 00:40:59,600 Speaker 1: us at contact at Stuff to Blow Your Mind dot com. 701 00:41:00,000 --> 00:41:02,759 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 702 00:41:02,880 --> 00:41:05,640 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 703 00:41:05,800 --> 00:41:12,439 Speaker 1: Apple Podcasts, or wherever you're listening to your favorite shows.