1 00:00:06,280 --> 00:00:07,960 Speaker 1: Hey you, welcome to Stuff to Blow your Mind. My 2 00:00:08,039 --> 00:00:09,320 Speaker 1: name is Robert Lamb. 3 00:00:09,200 --> 00:00:12,119 Speaker 2: And I am Joe McCormick, and it's Saturday. Time to 4 00:00:12,160 --> 00:00:14,360 Speaker 2: go into the vault for an older episode of the show. 5 00:00:14,840 --> 00:00:18,239 Speaker 2: This one was on the Minimal Group Paradigm, and it 6 00:00:18,360 --> 00:00:21,800 Speaker 2: originally published March thirtieth, twenty twenty three. Is this from 7 00:00:21,840 --> 00:00:23,840 Speaker 2: exactly one year ago today? I think so. 8 00:00:24,280 --> 00:00:31,240 Speaker 1: Yeah, hey, the jackpot. Hope you enjoy. 9 00:00:30,080 --> 00:00:33,960 Speaker 3: Welcome to Stuff to Blow your Mind, a production of iHeartRadio. 10 00:00:39,880 --> 00:00:41,720 Speaker 1: Hey you, welcome to Stuff to Blow your Mind. My 11 00:00:41,800 --> 00:00:42,639 Speaker 1: name is Robert. 12 00:00:42,479 --> 00:00:46,519 Speaker 2: Lamb and I'm Joe McCormick. And hey, fair warning, folks. 13 00:00:46,560 --> 00:00:49,440 Speaker 2: If I sound like I speak with the voice and 14 00:00:49,640 --> 00:00:53,320 Speaker 2: mind of some kind of decrepit bog monster today, it 15 00:00:53,479 --> 00:00:57,200 Speaker 2: is because I'm on the mend from a bad cold. 16 00:00:57,400 --> 00:01:01,640 Speaker 2: So apologies on what's happening in your ears right now. 17 00:01:01,720 --> 00:01:04,400 Speaker 2: But, but, but here I am on mic. 18 00:01:04,760 --> 00:01:07,560 Speaker 1: Well, sometimes bog monsters are quite wise, so you know, 19 00:01:07,640 --> 00:01:09,320 Speaker 1: it depends on the story you're looking at. 20 00:01:09,720 --> 00:01:13,240 Speaker 2: I hope to bring a real Meg Mucklebones energy 21 00:01:13,920 --> 00:01:16,679 Speaker 2: to today's episode, so you'll have to tell me how I do. 22 00:01:16,800 --> 00:01:19,240 Speaker 2: But yeah, what are we talking about today, Rob?
23 00:01:19,560 --> 00:01:21,800 Speaker 1: Well, we're gonna be talking about a little something called 24 00:01:21,800 --> 00:01:25,160 Speaker 1: the minimal group paradigm, which sounds, I know, if you're 25 00:01:25,200 --> 00:01:27,480 Speaker 1: not familiar with it, a bit stuffy, 26 00:01:28,319 --> 00:01:32,000 Speaker 1: perhaps a little bit clinical, but I think 27 00:01:32,000 --> 00:01:35,240 Speaker 1: it's a very fascinating little topic. It should make 28 00:01:35,280 --> 00:01:39,759 Speaker 1: for a nice one part episode here, because it attempts 29 00:01:39,760 --> 00:01:43,440 Speaker 1: to get down to some of the major concerns regarding 30 00:01:43,680 --> 00:01:48,480 Speaker 1: human civilization and human interactions, basically coming out of the 31 00:01:48,560 --> 00:01:52,360 Speaker 1: question of, like, just how divisive are human beings, and 32 00:01:52,400 --> 00:01:54,880 Speaker 1: how little does it take for us to split into 33 00:01:54,920 --> 00:01:59,000 Speaker 1: factions over something, or next to nothing even. And I 34 00:01:59,240 --> 00:02:01,240 Speaker 1: think for many of us, the answer seems to be 35 00:02:01,400 --> 00:02:05,360 Speaker 1: that we're, you know, very divisive and that it 36 00:02:05,360 --> 00:02:07,440 Speaker 1: doesn't take much at all for us to split off 37 00:02:07,440 --> 00:02:09,320 Speaker 1: into factions. And I think this has been played to 38 00:02:09,360 --> 00:02:14,079 Speaker 1: great effect in literature and cinema, especially comedically, and two 39 00:02:14,120 --> 00:02:17,919 Speaker 1: examples always come to my mind. So one of them, Joe, 40 00:02:17,960 --> 00:02:19,520 Speaker 1: I'm not sure if you're familiar with this. I don't 41 00:02:19,520 --> 00:02:22,000 Speaker 1: know if we've talked about this before, but it's the 42 00:02:22,080 --> 00:02:26,520 Speaker 1: nineteen fifty three story from Doctor Seuss, The Sneetches. Is this.
43 00:02:26,520 --> 00:02:27,960 Speaker 2: The butter on the bread one. 44 00:02:28,639 --> 00:02:30,840 Speaker 1: No, no, you're thinking of the Butter Battle Book, which 45 00:02:30,960 --> 00:02:34,880 Speaker 1: does get into a similar situation. That's one where you 46 00:02:34,919 --> 00:02:38,000 Speaker 1: have two different groups, and one side thinks you should 47 00:02:38,000 --> 00:02:41,239 Speaker 1: do the butter side down, the other buttered side up, and 48 00:02:41,480 --> 00:02:43,880 Speaker 1: they get into this big Cold War stalemate. 49 00:02:44,280 --> 00:02:48,640 Speaker 2: There's an arms race, an escalation of their weaponry 50 00:02:48,720 --> 00:02:51,639 Speaker 2: based on the butter ideology difference. 51 00:02:52,040 --> 00:02:53,679 Speaker 1: Yeah, so that's a good one to bring up too. 52 00:02:53,800 --> 00:02:59,840 Speaker 1: The Sneetches concerns this population of avian creatures whose 53 00:03:00,000 --> 00:03:03,240 Speaker 1: entire social hierarchy is based on which ones have 54 00:03:03,280 --> 00:03:05,160 Speaker 1: a star on their bellies and which ones don't have 55 00:03:05,200 --> 00:03:08,160 Speaker 1: a star on their bellies, and the star-bellied Sneetches 56 00:03:08,160 --> 00:03:10,800 Speaker 1: are the ones that live at the top and the 57 00:03:10,840 --> 00:03:13,120 Speaker 1: rest live at the bottom. But then a con artist 58 00:03:13,160 --> 00:03:15,880 Speaker 1: moves into town with a star-on machine, and then 59 00:03:15,919 --> 00:03:19,840 Speaker 1: later a star-off machine, to capitalize on their divisiveness. 60 00:03:20,480 --> 00:03:22,760 Speaker 1: Though at the end of that, the Sneetches move beyond 61 00:03:22,840 --> 00:03:25,120 Speaker 1: all of this and they unite as a single people. 62 00:03:25,160 --> 00:03:26,720 Speaker 1: So it's kind of a nice message. 63 00:03:26,919 --> 00:03:29,880 Speaker 2: Oh, that's very nice.
That's a much happier ending 64 00:03:29,880 --> 00:03:33,080 Speaker 2: than the Butter Battle Book, which, as I recall, 65 00:03:33,200 --> 00:03:35,720 Speaker 2: ends with basically both sides on a hair trigger with 66 00:03:35,760 --> 00:03:36,880 Speaker 2: their ultimate weaponry. 67 00:03:37,400 --> 00:03:40,480 Speaker 1: Yeah, yeah, it's a real clincher, that one. But 68 00:03:41,200 --> 00:03:42,920 Speaker 1: another example that comes to mind, and I know you're 69 00:03:42,920 --> 00:03:45,840 Speaker 1: familiar with this one, is, of course, Monty Python's Life 70 00:03:45,880 --> 00:03:49,360 Speaker 1: of Brian. There's a memorable scene in which the anti-Roman 71 00:03:49,480 --> 00:03:53,920 Speaker 1: resistance is split, more than split, between the Judean 72 00:03:53,920 --> 00:03:56,440 Speaker 1: People's Front and the People's Front of Judea and 73 00:03:56,560 --> 00:03:59,480 Speaker 1: various other fragments of their independence group. One of the 74 00:03:59,560 --> 00:04:03,480 Speaker 1: characters in the Judean People's Front proudly proclaims that 75 00:04:03,520 --> 00:04:06,120 Speaker 1: the only people they hate more than the Romans is 76 00:04:06,160 --> 00:04:07,360 Speaker 1: the People's Front of Judea. 77 00:04:07,920 --> 00:04:09,880 Speaker 2: I think this is meant to play on a concept 78 00:04:09,960 --> 00:04:14,560 Speaker 2: that was called the narcissism of small differences by Sigmund Freud.
79 00:04:14,920 --> 00:04:16,760 Speaker 2: I don't know if Freud was the first person ever to 80 00:04:16,800 --> 00:04:19,680 Speaker 2: observe this, but I think that's where the phrase comes from, 81 00:04:19,720 --> 00:04:23,560 Speaker 2: his writings about the idea that, actually, like, 82 00:04:23,720 --> 00:04:28,120 Speaker 2: the most bitter, hateful, divisive struggles in the world tend 83 00:04:28,120 --> 00:04:31,200 Speaker 2: to be between people who actually share a lot of 84 00:04:31,279 --> 00:04:35,960 Speaker 2: things in common but have some difference that really appears 85 00:04:36,080 --> 00:04:38,400 Speaker 2: minor to people looking in from the outside. 86 00:04:40,360 --> 00:04:41,760 Speaker 1: Now, I think a lot of you out there 87 00:04:41,760 --> 00:04:43,960 Speaker 1: may be able to think of examples from other works 88 00:04:43,960 --> 00:04:46,920 Speaker 1: of fiction, or certainly from real life, many of the 89 00:04:47,000 --> 00:04:50,919 Speaker 1: various serious things we divide ourselves over, or, you know, 90 00:04:50,960 --> 00:04:54,200 Speaker 1: some of the equally, seemingly silly, at least from the outside, 91 00:04:54,200 --> 00:04:56,320 Speaker 1: things that we get very divisive over. And 92 00:04:56,360 --> 00:05:00,800 Speaker 1: to your point, sometimes they're within, like, subgroups and fandoms 93 00:05:00,839 --> 00:05:04,880 Speaker 1: even. All manner of brand and sports team loyalty can 94 00:05:04,960 --> 00:05:08,000 Speaker 1: lead to division that doesn't necessarily make much sense on 95 00:05:08,120 --> 00:05:12,320 Speaker 1: closer inspection. Perhaps you prefer Puma shoes and this other 96 00:05:12,360 --> 00:05:15,200 Speaker 1: person prefers Adidas. How could the two of you ever 97 00:05:15,240 --> 00:05:18,680 Speaker 1: see eye to eye?
And this specific example ties into, 98 00:05:19,080 --> 00:05:22,479 Speaker 1: I think, what is a great example of division in 99 00:05:22,560 --> 00:05:25,480 Speaker 1: human beings and human groups, one I originally saw pointed 100 00:05:25,480 --> 00:05:28,640 Speaker 1: out by, though it's been well documented for a 101 00:05:28,640 --> 00:05:32,160 Speaker 1: while, Jay Van Bavel and Dominic Packer in a 102 00:05:32,240 --> 00:05:36,080 Speaker 1: TED-Ed video. This is like an animated educational short 103 00:05:36,080 --> 00:05:39,120 Speaker 1: of the kind TED-Ed puts out. Wonderful shorts; it's regular viewing 104 00:05:39,160 --> 00:05:42,760 Speaker 1: in my household with my family. But the title of 105 00:05:42,760 --> 00:05:45,599 Speaker 1: this one is The Sibling Rivalry That Divided a Town. 106 00:05:46,279 --> 00:05:50,080 Speaker 1: So I thought I'd cover the basics of this sibling rivalry. 107 00:05:50,520 --> 00:05:52,920 Speaker 1: All right, so are you familiar with this story, Joe? 108 00:05:53,080 --> 00:05:56,279 Speaker 1: I'm not. Well, it all starts in around nineteen nineteen. 109 00:05:56,360 --> 00:06:00,240 Speaker 1: That's when these two brothers, Adolf and Rudolf Dassler, founded 110 00:06:00,279 --> 00:06:05,240 Speaker 1: a shoe company called Gebrüder Dassler Schuhfabrik, or Geda, 111 00:06:05,360 --> 00:06:09,480 Speaker 1: in their hometown of Herzogenaurach in Bavaria. Turns out 112 00:06:09,480 --> 00:06:13,280 Speaker 1: they were very successful. These shoes really took off. You 113 00:06:13,400 --> 00:06:16,080 Speaker 1: even had the situation where in the nineteen thirty six 114 00:06:16,120 --> 00:06:19,760 Speaker 1: Olympics, American runner Jesse Owens apparently was wearing some of 115 00:06:19,800 --> 00:06:23,320 Speaker 1: these shoes. But then World War Two breaks out.
This 116 00:06:23,360 --> 00:06:26,880 Speaker 1: disrupts everything, to say the least. Rudolf is drafted into 117 00:06:26,920 --> 00:06:30,200 Speaker 1: the German Army, the factory is transformed into a weapons factory, 118 00:06:30,640 --> 00:06:34,400 Speaker 1: and again, everything's just super disrupted until after the war, when 119 00:06:34,480 --> 00:06:38,840 Speaker 1: the brothers reunite. Their work continues, that is, until nineteen 120 00:06:38,880 --> 00:06:42,640 Speaker 1: forty eight, when they split over some personal issues. And 121 00:06:42,680 --> 00:06:45,599 Speaker 1: I think there are a few different analyses of what 122 00:06:45,720 --> 00:06:49,400 Speaker 1: those personal issues might have been, but the results are 123 00:06:49,440 --> 00:06:51,560 Speaker 1: the same, meaning any way you cut it, the company is 124 00:06:51,600 --> 00:06:57,359 Speaker 1: split in two: that means material, workforce, and so forth. Rudolf 125 00:06:57,520 --> 00:07:01,839 Speaker 1: founds Ruda, which becomes Puma, and Adolf starts Adidas. 126 00:07:02,200 --> 00:07:04,799 Speaker 1: Now that's not that crazy, right? It's just one shoe 127 00:07:04,800 --> 00:07:09,000 Speaker 1: company splitting into two shoe companies. Now, the interesting thing 128 00:07:09,000 --> 00:07:11,440 Speaker 1: about this, though, according to Jay Van Bavel and Dominic 129 00:07:11,440 --> 00:07:13,880 Speaker 1: Packer in that TED-Ed video, is that the brothers' 130 00:07:13,920 --> 00:07:18,360 Speaker 1: feud and business division ultimately divides the entire town. Quote, 131 00:07:18,520 --> 00:07:21,800 Speaker 1: residents became fiercely loyal to one brand of shoe, local 132 00:07:21,840 --> 00:07:27,080 Speaker 1: businesses chose sides, and marriage across lines was discouraged.
Herzogenaurach 133 00:07:27,240 --> 00:07:30,920 Speaker 1: eventually became known as the town of bent necks, because 134 00:07:30,920 --> 00:07:34,160 Speaker 1: its residents looked down to ensure they were interacting with 135 00:07:34,280 --> 00:07:35,360 Speaker 1: members of their group. 136 00:07:35,960 --> 00:07:38,080 Speaker 2: Oh, look down at the shoes. That one took 137 00:07:38,080 --> 00:07:38,800 Speaker 2: me a second. 138 00:07:38,880 --> 00:07:42,440 Speaker 1: Yeah. So I think it's a great example, 139 00:07:42,560 --> 00:07:45,240 Speaker 1: not only because it kind of has some sort of 140 00:07:45,600 --> 00:07:48,160 Speaker 1: comical elements to it, kind of like belly stars, but 141 00:07:48,200 --> 00:07:51,600 Speaker 1: also we do see these various elements to the division, 142 00:07:51,640 --> 00:07:55,640 Speaker 1: the personal, the business, the social, and the schism is 143 00:07:55,640 --> 00:07:58,880 Speaker 1: quite real. And it is funny to think how 144 00:07:59,040 --> 00:08:02,320 Speaker 1: split people can be about brands. I mean, sometimes 145 00:08:02,400 --> 00:08:05,000 Speaker 1: I think it's meant jokingly. You see a lot of 146 00:08:05,080 --> 00:08:10,080 Speaker 1: joking comments today, even about things like Coke versus Pepsi, 147 00:08:10,560 --> 00:08:13,560 Speaker 1: or Twizzlers versus Red Vines or something, and then also 148 00:08:13,600 --> 00:08:16,720 Speaker 1: things that are not even brand oriented, like overhand versus 149 00:08:16,760 --> 00:08:18,200 Speaker 1: underhand toilet paper rolls.
150 00:08:18,680 --> 00:08:21,400 Speaker 2: I recall divisions of this type were big on, like, 151 00:08:21,520 --> 00:08:25,200 Speaker 2: early Facebook, like mid two thousands Facebook, where people would 152 00:08:25,200 --> 00:08:27,520 Speaker 2: make all these joke groups, and it would be like, 153 00:08:27,640 --> 00:08:31,080 Speaker 2: you know, for people who like Red Vines because Twizzlers 154 00:08:31,120 --> 00:08:32,080 Speaker 2: are for cowards. 155 00:08:34,320 --> 00:08:37,080 Speaker 1: I mean, it's still, I think, very prominent in, like, 156 00:08:37,160 --> 00:08:39,080 Speaker 1: meme making. You know, people like to get in on 157 00:08:39,120 --> 00:08:42,280 Speaker 1: this sort of thing. I don't know, maybe especially when 158 00:08:42,320 --> 00:08:44,920 Speaker 1: it's meant jokingly. It's kind of like low stakes things 159 00:08:44,960 --> 00:08:48,280 Speaker 1: to sort of mock disagree about. I'm not sure. But 160 00:08:48,679 --> 00:08:52,600 Speaker 1: then at what point does just sort of trolling and 161 00:08:52,640 --> 00:08:54,840 Speaker 1: mock fun, at what point does that then become, like, 162 00:08:54,920 --> 00:08:57,400 Speaker 1: an actual entrenched belief or opinion? 163 00:08:57,720 --> 00:09:01,600 Speaker 2: Oh, I think rather quickly, actually. So. 164 00:09:01,520 --> 00:09:03,320 Speaker 1: In this episode of Stuff to Blow Your Mind, we're 165 00:09:03,320 --> 00:09:06,400 Speaker 1: going to look at a social psychology concept that ties 166 00:09:06,440 --> 00:09:09,120 Speaker 1: into all of this, the minimal group paradigm, a 167 00:09:09,160 --> 00:09:12,880 Speaker 1: method for sussing out what might be the absolute minimal 168 00:09:12,880 --> 00:09:16,840 Speaker 1: conditions for discrimination to take place between two groups. Will 169 00:09:16,880 --> 00:09:20,120 Speaker 1: their findings be Twizzlers versus Red Vines?
Is that the minimal thing? 170 00:09:20,480 --> 00:09:22,040 Speaker 1: I don't know. You'll just have to find out. 171 00:09:22,440 --> 00:09:25,320 Speaker 2: All right. So where does this minimal group paradigm come from? 172 00:09:25,640 --> 00:09:28,439 Speaker 1: All right. So one of the sources that I was 173 00:09:28,480 --> 00:09:31,280 Speaker 1: looking at specifically in order to understand the minimal group 174 00:09:31,320 --> 00:09:33,960 Speaker 1: paradigm and its history was The Origins of the Minimal 175 00:09:34,000 --> 00:09:36,680 Speaker 1: Group Paradigm by Rupert Brown of the University of Sussex, 176 00:09:37,000 --> 00:09:42,160 Speaker 1: twenty twenty, published by the American Psychological Association. Brown points 177 00:09:42,160 --> 00:09:45,640 Speaker 1: out that the basis of prejudice and intergroup discrimination 178 00:09:45,720 --> 00:09:48,040 Speaker 1: has of course been a human concern for a long time, 179 00:09:48,080 --> 00:09:52,479 Speaker 1: and certainly was a longtime concern of people in psychology. 180 00:09:52,880 --> 00:09:55,600 Speaker 1: But the MGP, or the minimal group paradigm, as we 181 00:09:55,640 --> 00:10:01,400 Speaker 1: know it is generally attributed to Polish social psychologist Henri Tajfel, 182 00:10:01,760 --> 00:10:05,319 Speaker 1: who lived nineteen nineteen through nineteen eighty two, and also 183 00:10:05,600 --> 00:10:09,000 Speaker 1: British social psychologist Michael Billig, who worked with him and was 184 00:10:09,040 --> 00:10:10,280 Speaker 1: born nineteen forty seven. 185 00:10:10,720 --> 00:10:13,160 Speaker 2: Typically, I see a lot of references to work they 186 00:10:13,160 --> 00:10:16,679 Speaker 2: did in the early seventies.
One of the main citations 187 00:10:16,960 --> 00:10:21,120 Speaker 2: is Henri Tajfel, Michael Billig, Robert Bundy, and Claude Flament, 188 00:10:21,600 --> 00:10:25,800 Speaker 2: and the title is Social Categorization and Intergroup Behavior, published 189 00:10:25,840 --> 00:10:29,800 Speaker 2: in the European Journal of Social Psychology, nineteen seventy one, 190 00:10:30,200 --> 00:10:31,320 Speaker 2: if you want to go back to the source. 191 00:10:31,200 --> 00:10:35,720 Speaker 1: Now, Tajfel, it's worth noting, was a survivor 192 00:10:35,760 --> 00:10:38,640 Speaker 1: of the Holocaust, and this is important to keep in 193 00:10:38,720 --> 00:10:41,800 Speaker 1: mind because much of his work does ponder the question 194 00:10:41,880 --> 00:10:45,240 Speaker 1: of what drives groups of people to take up extremely 195 00:10:45,400 --> 00:10:50,040 Speaker 1: prejudiced views, and does the transference rely on extreme personality 196 00:10:50,040 --> 00:10:53,760 Speaker 1: types, or is it something more mundane? So yeah, in 197 00:10:53,800 --> 00:10:58,840 Speaker 1: the early nineteen seventies, Tajfel et al. conducted a series 198 00:10:58,880 --> 00:11:02,079 Speaker 1: of experiments, MGP studies that would end up having 199 00:11:02,080 --> 00:11:05,560 Speaker 1: an enormous impact on the field of social psychology. More 200 00:11:05,559 --> 00:11:07,320 Speaker 1: on this in a second, but I also want to 201 00:11:07,320 --> 00:11:10,520 Speaker 1: point out that Brown stresses that there is also a 202 00:11:10,679 --> 00:11:14,120 Speaker 1: pre-Tajfel origin in the work of Dutch social 203 00:11:14,200 --> 00:11:19,920 Speaker 1: psychologist Jaap Rabbie in nineteen sixty four. So Rabbie suspected 204 00:11:19,960 --> 00:11:22,840 Speaker 1: that common fate was the essential component for a group 205 00:11:22,880 --> 00:11:26,800 Speaker 1: to hold together and for intergroup discrimination to occur.
Common 206 00:11:26,800 --> 00:11:31,840 Speaker 1: fate is a Gestalt psychology concept that says that objects 207 00:11:31,880 --> 00:11:35,400 Speaker 1: functioning or moving in the same direction appear to belong together, 208 00:11:35,840 --> 00:11:37,400 Speaker 1: kind of like we're off to see the wizard, right? 209 00:11:37,480 --> 00:11:39,439 Speaker 1: I mean, you're going to see the wizard? Well, I'm 210 00:11:39,440 --> 00:11:41,920 Speaker 1: going to see the wizard, or I'm going down this road. Well, 211 00:11:41,960 --> 00:11:43,120 Speaker 1: I guess we're a group. 212 00:11:43,559 --> 00:11:46,680 Speaker 2: Okay. So under this view, the thing that would make 213 00:11:46,720 --> 00:11:50,000 Speaker 2: you prefer and show favoritism to members of your in-group 214 00:11:50,120 --> 00:11:53,400 Speaker 2: is a basic belief that the same kind of 215 00:11:53,520 --> 00:11:55,800 Speaker 2: thing is going to happen to all the members of 216 00:11:55,840 --> 00:11:56,960 Speaker 2: this group. Yeah. 217 00:11:57,080 --> 00:11:59,199 Speaker 1: Yeah. And I think you can probably cut it a 218 00:11:59,240 --> 00:12:01,760 Speaker 1: few different ways, but yeah, it's like there's something about 219 00:12:01,760 --> 00:12:05,000 Speaker 1: your sort of common direction, common fate, if you want 220 00:12:05,040 --> 00:12:07,320 Speaker 1: to put it that way, that sort of binds 221 00:12:07,360 --> 00:12:13,160 Speaker 1: you together. Now, Rabbie's experiments involved classifying subjects into groups 222 00:12:13,200 --> 00:12:16,480 Speaker 1: to explore intergroup discrimination, but he ultimately concluded that 223 00:12:16,559 --> 00:12:21,120 Speaker 1: mere classification was not enough to elicit in-group favoritism.
224 00:12:21,760 --> 00:12:24,560 Speaker 1: So again, worth noting that he was looking at some 225 00:12:24,600 --> 00:12:27,360 Speaker 1: of the same stuff that would become 226 00:12:27,400 --> 00:12:30,280 Speaker 1: important to the minimal group paradigm and that kind of 227 00:12:30,320 --> 00:12:33,720 Speaker 1: lays some of the groundwork for it, even. But his 228 00:12:33,880 --> 00:12:39,319 Speaker 1: findings were different. Now, this raises a question that Brown explores: 229 00:12:39,720 --> 00:12:43,320 Speaker 1: why was Rabbie overlooked, and why is he still sort 230 00:12:43,320 --> 00:12:47,640 Speaker 1: of overlooked in some of the documentation surrounding MGP? And 231 00:12:47,679 --> 00:12:52,360 Speaker 1: Brown breaks it down and attributes it to three reasons. So, first, 232 00:12:52,400 --> 00:12:57,120 Speaker 1: Tajfel's findings were counterintuitive and therefore more newsworthy. That's one 233 00:12:57,120 --> 00:13:00,360 Speaker 1: of the big things about MGP: you know, 234 00:13:00,400 --> 00:13:02,599 Speaker 1: a lot of people going into it, you don't expect 235 00:13:02,760 --> 00:13:04,880 Speaker 1: to see the results you see. You don't expect to 236 00:13:04,880 --> 00:13:07,560 Speaker 1: see this thing that seems to explain a lot of 237 00:13:07,559 --> 00:13:10,680 Speaker 1: the division that goes on in groups and the discrimination 238 00:13:10,720 --> 00:13:15,640 Speaker 1: that occurs between groups, just based on, as we'll get into, 239 00:13:15,679 --> 00:13:18,240 Speaker 1: like, just sort of random grouping of people.
240 00:13:19,240 --> 00:13:21,880 Speaker 2: It makes more sense to assume that if people are 241 00:13:21,880 --> 00:13:25,800 Speaker 2: showing in-group favoritism, it would be because, I don't know, 242 00:13:25,880 --> 00:13:28,720 Speaker 2: they assume that all of the members of the group 243 00:13:28,760 --> 00:13:30,920 Speaker 2: are sharing a common fate or something like that. 244 00:13:31,679 --> 00:13:31,920 Speaker 1: Yeah. 245 00:13:32,000 --> 00:13:32,320 Speaker 2: Yeah. 246 00:13:32,880 --> 00:13:35,120 Speaker 1: The other thing to keep in mind is Tajfel's MGP 247 00:13:35,280 --> 00:13:39,440 Speaker 1: work helped inspire and lay the groundwork for social identity theory, 248 00:13:39,800 --> 00:13:44,000 Speaker 1: which became huge, so that in turn elevated his 249 00:13:44,160 --> 00:13:48,160 Speaker 1: work with MGP. And in fact, social identity theory 250 00:13:48,200 --> 00:13:51,439 Speaker 1: was formulated by Tajfel and John Turner, who lived 251 00:13:51,520 --> 00:13:54,319 Speaker 1: nineteen forty seven through twenty eleven, in the nineteen seventies 252 00:13:54,320 --> 00:13:56,760 Speaker 1: and the nineteen eighties. And then the third factor that 253 00:13:56,800 --> 00:13:59,640 Speaker 1: Brown points out is the personality differences between the two men. 254 00:14:00,040 --> 00:14:04,080 Speaker 1: So Tajfel has been characterized as more of a go-getter, essentially, 255 00:14:04,120 --> 00:14:09,360 Speaker 1: someone who really took full advantage of any opportunity to, 256 00:14:09,480 --> 00:14:14,000 Speaker 1: you know, sort of explore his ideas and get his 257 00:14:14,200 --> 00:14:20,080 Speaker 1: ideas out there, whereas Rabbie was more unassuming. So some 258 00:14:20,160 --> 00:14:23,200 Speaker 1: combination of these three factors, according to Brown.
So Tajfel 259 00:14:23,240 --> 00:14:26,320 Speaker 1: was quite aware of these studies, but suspected that 260 00:14:26,400 --> 00:14:29,920 Speaker 1: the opposite was true, and was already experimenting with 261 00:14:29,920 --> 00:14:33,040 Speaker 1: social comparison theory. So fast forward to the nineteen seventies 262 00:14:33,080 --> 00:14:37,080 Speaker 1: and the first MGP experiments. I'm not going to bust 263 00:14:37,440 --> 00:14:40,800 Speaker 1: these experiments out blow by blow necessarily, but certainly hitting 264 00:14:40,800 --> 00:14:45,680 Speaker 1: the really important parts, the basics of the MGP experiments. 265 00:14:46,280 --> 00:14:49,360 Speaker 1: So the first part is you have subjects carry out 266 00:14:49,360 --> 00:14:51,960 Speaker 1: a task, and the task is often described as something 267 00:14:52,000 --> 00:14:54,680 Speaker 1: like estimating the number of dots on an image or 268 00:14:54,760 --> 00:14:58,480 Speaker 1: answering an opinion question about a work of abstract art. 269 00:14:58,720 --> 00:14:59,200 Speaker 2: All right. 270 00:14:59,440 --> 00:15:04,360 Speaker 1: Next, presumably based on these results, subjects are placed into groups. 271 00:15:04,800 --> 00:15:07,480 Speaker 1: But known only to the researchers, not to the subjects, 272 00:15:07,960 --> 00:15:10,600 Speaker 1: is the fact that the group assignment is actually random. 273 00:15:11,080 --> 00:15:14,200 Speaker 2: Okay.
So, for example, if the question you were given 274 00:15:14,240 --> 00:15:16,680 Speaker 2: had to do with, like, estimating the number of dots, 275 00:15:16,960 --> 00:15:21,520 Speaker 2: you might break people into groups, say, and tell them that, okay, 276 00:15:21,520 --> 00:15:24,960 Speaker 2: Group A is the people who overestimated the number of 277 00:15:25,000 --> 00:15:27,280 Speaker 2: dots in the image, and Group B is the 278 00:15:27,280 --> 00:15:30,280 Speaker 2: people who underestimated the number of dots in the image. 279 00:15:30,880 --> 00:15:33,480 Speaker 2: Or with the question about art, you might separate people 280 00:15:33,600 --> 00:15:36,800 Speaker 2: into different taste categories. You say, like, oh, you were 281 00:15:36,840 --> 00:15:39,440 Speaker 2: the people who preferred the art by this artist, and 282 00:15:39,600 --> 00:15:41,960 Speaker 2: Group B is the people who preferred the art by 283 00:15:41,960 --> 00:15:42,760 Speaker 2: this other artist. 284 00:15:43,440 --> 00:15:46,000 Speaker 1: Yeah. Yeah, you can certainly break it down like that. 285 00:15:46,040 --> 00:15:47,440 Speaker 1: But I think on the other end, you could also 286 00:15:47,600 --> 00:15:51,720 Speaker 1: just not explain what the methodology is at all. Like, 287 00:15:52,080 --> 00:15:54,720 Speaker 1: you could just put people into groups, and it's just 288 00:15:54,800 --> 00:15:58,680 Speaker 1: the idea that there's something about data that originates in 289 00:15:58,720 --> 00:16:02,600 Speaker 1: you that informed this choice, when it's actually random. But 290 00:16:02,680 --> 00:16:04,560 Speaker 1: you don't want people to think it's random. You 291 00:16:04,600 --> 00:16:07,520 Speaker 1: want them to think that it's based on something.
Yes. 292 00:16:08,000 --> 00:16:11,280 Speaker 1: Now, there's no interaction between the resulting groups, no room 293 00:16:11,320 --> 00:16:14,680 Speaker 1: for interpersonal bonds. You know, not enough to be like, hey, 294 00:16:14,720 --> 00:16:18,400 Speaker 1: those are the people who apparently guessed differently from me about 295 00:16:18,520 --> 00:16:21,240 Speaker 1: jelly beans in a jar. They seem a little stuck 296 00:16:21,280 --> 00:16:22,760 Speaker 1: up, or they seem a little stingy. You know, there's 297 00:16:22,760 --> 00:16:24,760 Speaker 1: no room for that at all, or likewise, no room 298 00:16:24,760 --> 00:16:26,640 Speaker 1: for you to say, well, they seem like decent people, 299 00:16:26,960 --> 00:16:29,680 Speaker 1: even if they count dots differently or estimate dots differently 300 00:16:29,680 --> 00:16:30,000 Speaker 1: than I do. 301 00:16:30,480 --> 00:16:33,280 Speaker 2: In fact, group membership was anonymous, right? So you 302 00:16:33,320 --> 00:16:35,680 Speaker 2: didn't know who was in them. You couldn't, like, look 303 00:16:35,720 --> 00:16:38,360 Speaker 2: around the room and see the Group A people. 304 00:16:38,960 --> 00:16:41,840 Speaker 1: Yeah, and that's important to stress too, because yeah, it's 305 00:16:41,880 --> 00:16:45,640 Speaker 1: just really the beauty of this experiment, and the attractiveness 306 00:16:45,640 --> 00:16:48,600 Speaker 1: of it, is that it does just strip everything else away, 307 00:16:49,160 --> 00:16:52,800 Speaker 1: everything that you could use and also could therefore muddy 308 00:16:52,840 --> 00:16:54,160 Speaker 1: and complicate the findings. 309 00:16:54,400 --> 00:16:57,440 Speaker 2: Okay, so people are assigned into these random groups. They 310 00:16:57,480 --> 00:17:00,000 Speaker 2: think there is a reason for the assignment. They don't 311 00:17:00,120 --> 00:17:02,040 Speaker 2: know who's in the groups.
They just know they're in 312 00:17:02,080 --> 00:17:02,760 Speaker 2: one of them. 313 00:17:03,160 --> 00:17:06,560 Speaker 1: Right. So now it's time to get busy here. A 314 00:17:06,680 --> 00:17:09,159 Speaker 1: second task is assigned in which subjects had to 315 00:17:09,200 --> 00:17:14,040 Speaker 1: assign rewards to anonymous individuals, either two from the 316 00:17:14,119 --> 00:17:16,560 Speaker 1: in-group, two from the out-group, or one from 317 00:17:16,600 --> 00:17:19,760 Speaker 1: each. These individuals would be marked by a code number, 318 00:17:20,040 --> 00:17:22,679 Speaker 1: and your code number would never come up, so it 319 00:17:22,760 --> 00:17:24,879 Speaker 1: wasn't completely self-serving. 320 00:17:25,080 --> 00:17:28,800 Speaker 2: Right. And these would be what are known as allocation tasks, 321 00:17:28,920 --> 00:17:31,399 Speaker 2: tasks that are used in a number of different experiments 322 00:17:31,440 --> 00:17:34,879 Speaker 2: to try to see what people value or reward. And 323 00:17:35,480 --> 00:17:38,440 Speaker 2: generally these are just experiments where subjects play some kind 324 00:17:38,480 --> 00:17:42,840 Speaker 2: of game that involves distributing rewards, often monetary rewards like 325 00:17:42,880 --> 00:17:46,360 Speaker 2: a number of dollars, or tokens of some kind 326 00:17:46,400 --> 00:17:50,320 Speaker 2: that can be exchanged for something, to these anonymous players 327 00:17:50,359 --> 00:17:53,080 Speaker 2: belonging to the in-group or the out-group or both. 328 00:17:53,760 --> 00:17:56,439 Speaker 1: Yeah, and you might think that this would all favor 329 00:17:56,720 --> 00:17:59,399 Speaker 1: even-handed distribution, since there's just so little to go 330 00:17:59,480 --> 00:18:02,800 Speaker 1: on aside from group affiliation in divvying it up.
331 00:18:03,359 --> 00:18:06,560 Speaker 2: And I've read that in some cases many subjects did 332 00:18:06,600 --> 00:18:10,240 Speaker 2: try to distribute things kind of fairly. Like one of 333 00:18:10,040 --> 00:18:13,560 Speaker 2: the later reviews of the minimal group paradigm I was looking 334 00:18:13,600 --> 00:18:17,840 Speaker 2: at, by Sabine Otten from twenty sixteen, said that basically 335 00:18:17,960 --> 00:18:23,160 Speaker 2: quote fairness concerns strongly guided intergroup allocations, but that didn't 336 00:18:23,200 --> 00:18:27,160 Speaker 2: always hold true. There were a number of exceptions. 337 00:18:27,440 --> 00:18:32,520 Speaker 1: Yeah. Ultimately, subjects consistently engaged in in group bias. 338 00:18:33,520 --> 00:18:36,119 Speaker 1: So the groups were again entirely made up by chance. 339 00:18:36,480 --> 00:18:39,800 Speaker 1: There was no contact here, but it was enough to 340 00:18:39,880 --> 00:18:43,240 Speaker 1: generate a sense of group belonging. It created an us 341 00:18:43,240 --> 00:18:46,600 Speaker 1: and it also created a them, which you then see borne 342 00:18:46,680 --> 00:18:50,680 Speaker 1: out in the study results. So that's the big take 343 00:18:50,720 --> 00:18:53,560 Speaker 1: home from the minimal group paradigm: even without factors such 344 00:18:53,600 --> 00:18:58,320 Speaker 1: as religion, race, nationality, socioeconomic class, even without things like 345 00:18:58,359 --> 00:19:01,000 Speaker 1: what people in the outside group 346 00:19:01,119 --> 00:19:03,359 Speaker 1: look like or act like, or, you know, what do 347 00:19:03,480 --> 00:19:05,760 Speaker 1: I have in common with the people around me? Even 348 00:19:05,800 --> 00:19:09,240 Speaker 1: stripping all of that away, humans rather swiftly 349 00:19:09,400 --> 00:19:11,679 Speaker 1: form factions and discriminate against others.
350 00:19:12,160 --> 00:19:14,960 Speaker 2: So to offer a little bit more detail from that 351 00:19:15,080 --> 00:19:18,399 Speaker 2: later piece I mentioned, this was a paper called The 352 00:19:18,440 --> 00:19:21,840 Speaker 2: Minimal Group Paradigm and its Maximal Impact in Research on 353 00:19:21,880 --> 00:19:25,600 Speaker 2: Social Categorization. This was published in Current Opinion in Psychology 354 00:19:25,640 --> 00:19:29,440 Speaker 2: in twenty sixteen by Sabine Otten. One thing Otten mentions 355 00:19:29,680 --> 00:19:32,880 Speaker 2: is that when Tajfel and colleagues first came up 356 00:19:32,920 --> 00:19:38,560 Speaker 2: with the minimal group paradigm, their original intention was apparently 357 00:19:38,640 --> 00:19:43,840 Speaker 2: to investigate whether people would display in group favoritism even 358 00:19:43,880 --> 00:19:47,560 Speaker 2: in situations where there was no actual conflict for resources 359 00:19:47,600 --> 00:19:50,080 Speaker 2: between the two groups. So their original question was a 360 00:19:50,080 --> 00:19:53,840 Speaker 2: little bit different, but as a preliminary avenue of research 361 00:19:53,960 --> 00:19:58,120 Speaker 2: to that project, to address this other question, first they 362 00:19:58,160 --> 00:20:01,600 Speaker 2: wanted to just find out the minimum criteria that 363 00:20:01,680 --> 00:20:05,160 Speaker 2: could be leveraged to cause people to show in group favoritism. 364 00:20:05,480 --> 00:20:08,440 Speaker 2: So this was originally supposed to be just like trying 365 00:20:08,480 --> 00:20:11,080 Speaker 2: to establish what they would need to do in this 366 00:20:11,160 --> 00:20:14,520 Speaker 2: other test. Otten writes, quote, they planned to start with 367 00:20:14,600 --> 00:20:18,399 Speaker 2: a most minimal setup and to successively add elements to 368 00:20:18,440 --> 00:20:24,000 Speaker 2: the design until intergroup discrimination would emerge.
So they started 369 00:20:24,040 --> 00:20:27,080 Speaker 2: with these novel social categorizations based on things that they 370 00:20:27,119 --> 00:20:30,560 Speaker 2: expected to have no power of social cohesion at all, 371 00:20:30,640 --> 00:20:34,679 Speaker 2: like tendencies in, you know, the numerical estimation game, the 372 00:20:34,720 --> 00:20:39,160 Speaker 2: dots game, or preferences for the types of paintings. And again, 373 00:20:39,200 --> 00:20:41,680 Speaker 2: these were only pretenses; people were actually assigned to 374 00:20:41,720 --> 00:20:45,280 Speaker 2: groups randomly in most if not all cases. And instead what 375 00:20:45,359 --> 00:20:48,720 Speaker 2: they found was that these fake made up bases for 376 00:20:49,080 --> 00:20:53,520 Speaker 2: social categorization were good enough to kick start in group favoritism. 377 00:20:54,000 --> 00:20:56,960 Speaker 2: So they were trying to find the minimum criteria, and 378 00:20:57,040 --> 00:20:59,000 Speaker 2: it turns out they just didn't really have to look 379 00:20:59,119 --> 00:21:01,320 Speaker 2: very hard. There's barely a minimum at all. 380 00:21:01,880 --> 00:21:04,840 Speaker 1: Yeah, and that, again, I think, is the thing that 381 00:21:04,640 --> 00:21:07,560 Speaker 1: floored everybody and still floors people when 382 00:21:07,600 --> 00:21:09,399 Speaker 1: they hear about it for the first time or are 383 00:21:09,480 --> 00:21:10,199 Speaker 1: reminded of it. 384 00:21:20,480 --> 00:21:25,600 Speaker 2: Otten identifies three experimental features for recognizing what authors in 385 00:21:25,640 --> 00:21:29,320 Speaker 2: this experimental domain call a mere categorization effect. That's what's 386 00:21:29,320 --> 00:21:31,840 Speaker 2: going on in the minimal group paradigm.
It's like people 387 00:21:31,880 --> 00:21:37,040 Speaker 2: are behaving in ways that indicate group favoritism, but only 388 00:21:37,080 --> 00:21:40,080 Speaker 2: based on merely being categorized in a group, and like 389 00:21:40,160 --> 00:21:43,920 Speaker 2: nothing happening in the real world. The three features, as 390 00:21:43,960 --> 00:21:48,159 Speaker 2: Otten lists them, are: Number one, categorization is novel and arbitrary. 391 00:21:48,400 --> 00:21:52,280 Speaker 2: No history of experiences with in group and/or out group, so 392 00:21:52,359 --> 00:21:54,840 Speaker 2: it's got to be all new in the experiment. Number two, 393 00:21:54,920 --> 00:21:59,080 Speaker 2: categorization is anonymous, no face to face interaction between group 394 00:21:59,160 --> 00:22:02,359 Speaker 2: members, because you can obviously see how that would introduce 395 00:22:02,440 --> 00:22:08,840 Speaker 2: complications. And number three, no utilitarian self interests can be directly 396 00:22:08,880 --> 00:22:13,120 Speaker 2: served by intergroup evaluations or allocations. So you don't want 397 00:22:13,119 --> 00:22:16,800 Speaker 2: to complicate your study by having people have the ability 398 00:22:16,840 --> 00:22:20,760 Speaker 2: to pay money out to themselves, because that would obviously 399 00:22:20,840 --> 00:22:22,840 Speaker 2: add in new variables. 400 00:22:22,760 --> 00:22:25,119 Speaker 1: Right right, That would be the self interest kicking in 401 00:22:25,240 --> 00:22:25,639 Speaker 1: for sure. 402 00:22:26,040 --> 00:22:31,560 Speaker 2: But a really interesting thing emerges with the allocation tasks 403 00:22:31,880 --> 00:22:34,399 Speaker 2: that is along the lines of self interest. Instead of 404 00:22:34,480 --> 00:22:39,160 Speaker 2: individual self interest, it is in group interest.
So Otten says, 405 00:22:39,359 --> 00:22:42,919 Speaker 2: as we've discussed, there were fairness concerns that did guide 406 00:22:42,960 --> 00:22:46,560 Speaker 2: some intergroup allocations, but also there was evidence of in 407 00:22:46,640 --> 00:22:50,040 Speaker 2: group favoritism even when the group had just been formed, 408 00:22:50,160 --> 00:22:53,560 Speaker 2: it meant essentially nothing, and the members were anonymous. And 409 00:22:53,640 --> 00:22:57,760 Speaker 2: here's a really interesting thing. In some cases quote the 410 00:22:57,880 --> 00:23:02,040 Speaker 2: tendency to positively differentiate the in group from the 411 00:23:02,080 --> 00:23:07,080 Speaker 2: out group was stronger than the tendency to maximize the 412 00:23:07,160 --> 00:23:11,760 Speaker 2: in group's profit. So the example Otten gives here would 413 00:23:11,800 --> 00:23:16,000 Speaker 2: be instead of giving twelve dollars to the in group 414 00:23:16,080 --> 00:23:19,760 Speaker 2: and eleven dollars to the out group, some subjects would 415 00:23:19,840 --> 00:23:23,520 Speaker 2: select a strategy that gave eleven dollars to the in 416 00:23:23,680 --> 00:23:28,879 Speaker 2: group and nine dollars to the outgroup. So everybody gets less, 417 00:23:29,040 --> 00:23:32,400 Speaker 2: but the difference between the rewards of the two groups 418 00:23:32,520 --> 00:23:35,320 Speaker 2: is greater, and if you were on top of 419 00:23:35,359 --> 00:23:39,280 Speaker 2: that difference, even if you got less, some people preferred that.
420 00:23:39,840 --> 00:23:44,480 Speaker 2: Oh wow, and this thing about sacrificing the overall objective 421 00:23:44,600 --> 00:23:48,800 Speaker 2: gains of the in group for a greater distinction in 422 00:23:48,920 --> 00:23:52,199 Speaker 2: gains between the in group and out group made me 423 00:23:52,200 --> 00:23:54,199 Speaker 2: think about a thing I read in the context of 424 00:23:54,200 --> 00:23:57,280 Speaker 2: a different paper exploring a different theory, but it was 425 00:23:57,320 --> 00:24:01,920 Speaker 2: one by the Harvard psychologist Jim Sidanius and co authors 426 00:24:02,000 --> 00:24:06,960 Speaker 2: called Vladimir's Choice and the Distribution of Social Resources: A 427 00:24:07,040 --> 00:24:10,840 Speaker 2: Group Dominance Perspective. This was exploring a different theory called 428 00:24:11,200 --> 00:24:14,760 Speaker 2: social dominance theory, but it starts off with this anecdote 429 00:24:14,880 --> 00:24:20,240 Speaker 2: that apparently comes from an Eastern European fable. The authors 430 00:24:20,680 --> 00:24:24,359 Speaker 2: relate it as follows: One day, God came down to Vladimir, 431 00:24:24,480 --> 00:24:27,320 Speaker 2: a poor peasant, and said, Vladimir, I will grant you 432 00:24:27,359 --> 00:24:31,560 Speaker 2: one wish. Anything you want will be yours. However, God added, 433 00:24:31,720 --> 00:24:34,720 Speaker 2: there is one condition. Anything I give to you will 434 00:24:34,720 --> 00:24:38,879 Speaker 2: be granted to your neighbor twice over. Vladimir 435 00:24:38,920 --> 00:24:42,280 Speaker 2: immediately answered, saying, okay, take out one of my eyes. 436 00:24:43,000 --> 00:24:45,640 Speaker 1: Oh, that's grim, that's very grim.
437 00:24:45,720 --> 00:24:48,680 Speaker 2: Now, the stakes in these minimal group paradigm experiments are 438 00:24:48,720 --> 00:24:51,439 Speaker 2: certainly not that high, but I think we can all 439 00:24:51,480 --> 00:24:55,680 Speaker 2: think of examples where, you know, sometimes you just see 440 00:24:55,840 --> 00:24:59,119 Speaker 2: a case where what appears to be spite or maybe 441 00:24:59,119 --> 00:25:04,679 Speaker 2: something else like that overrides a person's own objective self interest, 442 00:25:04,880 --> 00:25:09,640 Speaker 2: like they would rather have a higher degree of advantage 443 00:25:09,720 --> 00:25:14,320 Speaker 2: over a known neighbor or adversary than a greater objective 444 00:25:14,400 --> 00:25:15,800 Speaker 2: advantage overall. 445 00:25:16,240 --> 00:25:18,160 Speaker 1: It'd be like if you sort of had 446 00:25:18,160 --> 00:25:20,480 Speaker 1: it in for your buddy, and it was 447 00:25:20,480 --> 00:25:23,280 Speaker 1: your turn to pick the type of pizza 448 00:25:23,359 --> 00:25:25,080 Speaker 1: you get, and you make sure you get a flavor 449 00:25:25,119 --> 00:25:28,280 Speaker 1: that you weren't crazy about, but you knew that 450 00:25:28,480 --> 00:25:33,600 Speaker 1: your friend hated. You're willing to choke it down just 451 00:25:33,680 --> 00:25:37,000 Speaker 1: because more refreshing to you, more delicious to you, 452 00:25:37,560 --> 00:25:41,080 Speaker 1: is the fact that they are going to dislike 453 00:25:41,160 --> 00:25:44,119 Speaker 1: it more than you do, which again is illogical. It 454 00:25:44,119 --> 00:25:46,280 Speaker 1: shouldn't be a thing that someone would do. But I 455 00:25:46,280 --> 00:25:49,320 Speaker 1: think we can all easily imagine a scenario where someone's 456 00:25:49,359 --> 00:25:52,720 Speaker 1: pettiness and spite would lead to such an occurrence.
And 457 00:25:52,720 --> 00:25:54,560 Speaker 1: maybe that's a version of it that 458 00:25:54,600 --> 00:25:56,919 Speaker 1: is a little more real world accurate, as opposed 459 00:25:56,920 --> 00:25:57,640 Speaker 1: to the blinding. 460 00:25:57,920 --> 00:26:02,840 Speaker 2: The gulf between your okayness and your friend's misery is 461 00:26:02,920 --> 00:26:06,160 Speaker 2: more valuable to you than the extra pleasure you would 462 00:26:06,160 --> 00:26:08,200 Speaker 2: get from getting a topping you really liked. 463 00:26:08,720 --> 00:26:10,840 Speaker 1: Right, And for some reason, this whole scenario 464 00:26:10,840 --> 00:26:14,360 Speaker 1: makes more sense concerning friends than it does, like, enemies 465 00:26:14,359 --> 00:26:18,080 Speaker 1: of any sort. I don't know. It perhaps suggests a 466 00:26:18,080 --> 00:26:20,879 Speaker 1: lot about the way relationships work. 467 00:26:21,240 --> 00:26:23,119 Speaker 2: Now there's some caveats to this that I want to 468 00:26:23,119 --> 00:26:25,359 Speaker 2: get into in a second, because to come back to 469 00:26:25,440 --> 00:26:29,360 Speaker 2: that paper by Otten, one thing I was interested in 470 00:26:29,840 --> 00:26:33,720 Speaker 2: was criticisms of the minimal group paradigm. It does seem 471 00:26:33,760 --> 00:26:38,119 Speaker 2: that the MGP findings have been widely replicated with a 472 00:26:38,160 --> 00:26:41,080 Speaker 2: lot of superficial variations. So it does look to me 473 00:26:41,200 --> 00:26:44,520 Speaker 2: like the finding is robust. But while the finding itself 474 00:26:44,560 --> 00:26:46,880 Speaker 2: is sound, you could argue that people might be drawing 475 00:26:46,920 --> 00:26:50,480 Speaker 2: the wrong conclusions from it, and so there are a 476 00:26:50,560 --> 00:26:55,080 Speaker 2: number of criticisms along those lines.
One thing that comes 477 00:26:55,160 --> 00:26:57,800 Speaker 2: up in Otten's paper here is: is the minimal group 478 00:26:57,840 --> 00:27:01,720 Speaker 2: paradigm really revealing something about how people would behave 479 00:27:01,840 --> 00:27:06,399 Speaker 2: in the real world, or does the experiment quote merely 480 00:27:06,520 --> 00:27:13,040 Speaker 2: create a situation in which social category information receives unrealistic attention. 481 00:27:13,960 --> 00:27:16,600 Speaker 2: I was like, oh, I think that's interesting because, okay, 482 00:27:16,600 --> 00:27:20,440 Speaker 2: you're in a contrived laboratory scenario. Your membership in one 483 00:27:20,520 --> 00:27:23,800 Speaker 2: group or the other is highlighted to you, people are 484 00:27:23,800 --> 00:27:26,600 Speaker 2: telling you about it, and the situation is stripped of 485 00:27:26,640 --> 00:27:30,240 Speaker 2: a lot of other contextual information that would exist in 486 00:27:30,240 --> 00:27:34,320 Speaker 2: the real world that would normally inform your behavior. Maybe 487 00:27:34,320 --> 00:27:38,120 Speaker 2: people are placing undue weight on group membership even though 488 00:27:38,160 --> 00:27:42,320 Speaker 2: it's arbitrary, because it's really like the only variable they're 489 00:27:42,359 --> 00:27:46,080 Speaker 2: aware of in this situation.
On the other hand, 490 00:27:46,760 --> 00:27:48,919 Speaker 2: while that criticism makes a lot of sense to me, 491 00:27:50,000 --> 00:27:53,119 Speaker 2: I think these experiments are just as valuable if you 492 00:27:53,160 --> 00:27:56,640 Speaker 2: think about them with that caveat in mind. Like, they 493 00:27:56,760 --> 00:28:00,639 Speaker 2: show a certain irrational way that some people behave, showing 494 00:28:00,720 --> 00:28:06,040 Speaker 2: in group preference for utterly arbitrary groups when group membership 495 00:28:06,200 --> 00:28:09,200 Speaker 2: is made salient, when it is brought to your attention 496 00:28:09,440 --> 00:28:12,560 Speaker 2: and people are talking about it, which is something that 497 00:28:12,600 --> 00:28:14,880 Speaker 2: does happen in the real world all the time. Actually, 498 00:28:14,920 --> 00:28:19,919 Speaker 2: like, there is some category distinction between people that was 499 00:28:20,000 --> 00:28:24,879 Speaker 2: maybe not previously much noted, and for some reason, suddenly 500 00:28:25,080 --> 00:28:28,399 Speaker 2: it is made salient. People start paying attention to this 501 00:28:28,520 --> 00:28:32,000 Speaker 2: difference and talking about it. It seems to me that 502 00:28:32,200 --> 00:28:36,480 Speaker 2: in reality, this is enough to trigger minimal group paradigm effects.
503 00:28:38,240 --> 00:28:40,760 Speaker 2: This is only partially related, but it reminds me of 504 00:28:40,800 --> 00:28:46,040 Speaker 2: that thing when an arbitrary factual question that previously had 505 00:28:46,080 --> 00:28:52,480 Speaker 2: no political valence suddenly becomes politicized for some reason, maybe 506 00:28:52,520 --> 00:28:56,040 Speaker 2: by like a prominent politician taking a stance one way 507 00:28:56,160 --> 00:29:00,400 Speaker 2: or another on this question, and now suddenly, like, what 508 00:29:00,480 --> 00:29:04,040 Speaker 2: you think about this question that previously involved no 509 00:29:04,200 --> 00:29:07,959 Speaker 2: political values now is a major part of your identity, 510 00:29:08,480 --> 00:29:11,800 Speaker 2: and people will factionalize on the basis of it. 511 00:29:12,240 --> 00:29:14,320 Speaker 1: Yeah, and sometimes it takes the form of just sort 512 00:29:14,360 --> 00:29:18,440 Speaker 1: of a, you know, fear mongering about something that normally 513 00:29:18,560 --> 00:29:21,120 Speaker 1: had no real kind of, like, fear weight to it. 514 00:29:21,560 --> 00:29:23,840 Speaker 1: Like, I instantly think of various things going on during, 515 00:29:23,880 --> 00:29:27,960 Speaker 1: say, the Satanic Panic, where, you know, suddenly there's 516 00:29:28,080 --> 00:29:30,360 Speaker 1: you know, some sort of an outrage over a 517 00:29:30,400 --> 00:29:34,800 Speaker 1: particular piece of music that is interpreted by somebody as 518 00:29:34,840 --> 00:29:38,960 Speaker 1: having some sort of subliminal, demonic message inside it, even 519 00:29:39,000 --> 00:29:41,240 Speaker 1: if there's little or no proof that that is even 520 00:29:41,360 --> 00:29:44,240 Speaker 1: possibly the case or certainly the intent of the artist.
521 00:29:44,520 --> 00:29:46,280 Speaker 1: It ends up picking up steam all its own, and 522 00:29:46,280 --> 00:29:48,800 Speaker 1: then where do you fall on this divide? Totally. 523 00:29:48,880 --> 00:29:52,160 Speaker 2: Now, to be fair, things like that are not purely 524 00:29:52,200 --> 00:29:55,080 Speaker 2: minimal group paradigm, because once you're talking about like cultural 525 00:29:55,240 --> 00:29:58,800 Speaker 2: artifacts and preferences, you do start bringing in like, well, 526 00:29:58,840 --> 00:30:02,600 Speaker 2: maybe that already touches certain things about, you know, cultural identity, 527 00:30:02,640 --> 00:30:05,280 Speaker 2: which people would have opinions about and would have some 528 00:30:05,400 --> 00:30:08,040 Speaker 2: in group out group associations and so forth. But it's 529 00:30:08,040 --> 00:30:09,080 Speaker 2: sort of halfway there. 530 00:30:09,240 --> 00:30:11,360 Speaker 1: I wonder if there might be a comparison to draw 531 00:30:11,400 --> 00:30:14,000 Speaker 1: here to a couple of things from recent years. 532 00:30:14,040 --> 00:30:15,800 Speaker 1: There was the whole, like, what color is this dress? 533 00:30:15,920 --> 00:30:16,640 Speaker 2: Right? Yeah? 534 00:30:16,680 --> 00:30:18,920 Speaker 1: And people were split over that. I don't know, I 535 00:30:18,960 --> 00:30:20,880 Speaker 1: mean not to the point where I guess you really 536 00:30:20,920 --> 00:30:26,080 Speaker 1: saw outgroup discrimination. But it was interesting to see how 537 00:30:26,160 --> 00:30:30,840 Speaker 1: quickly people stated what 538 00:30:30,920 --> 00:30:33,719 Speaker 1: their interpretation of it was and became a part of 539 00:30:33,760 --> 00:30:36,080 Speaker 1: that group that saw it a certain way.
I do not 540 00:30:36,160 --> 00:30:38,000 Speaker 1: remember what I thought of this dress or even who 541 00:30:38,000 --> 00:30:41,760 Speaker 1: wore it, so I just remember being amused that it 542 00:30:41,800 --> 00:30:42,520 Speaker 1: was a thing at all. 543 00:30:43,000 --> 00:30:44,959 Speaker 2: I think I remember when I first saw it, it 544 00:30:45,000 --> 00:30:48,520 Speaker 2: looked blue and black to me, so hate me if 545 00:30:48,560 --> 00:31:01,200 Speaker 2: you want. But anyway, coming back to this issue, so 546 00:31:01,480 --> 00:31:03,920 Speaker 2: it may be a good criticism that this has some 547 00:31:04,080 --> 00:31:06,560 Speaker 2: limitation in how it applies to the real world once 548 00:31:06,600 --> 00:31:08,880 Speaker 2: you bring in all the context of culture and all that. 549 00:31:10,120 --> 00:31:13,640 Speaker 2: But I do think it still probably highlights something very interesting, 550 00:31:13,840 --> 00:31:17,520 Speaker 2: which is that sort of in group favoritism can 551 00:31:17,560 --> 00:31:21,520 Speaker 2: emerge with minimal stimulation, just by like drawing a lot 552 00:31:21,520 --> 00:31:25,920 Speaker 2: of attention to the presence and differentiation of the groups. 553 00:31:27,160 --> 00:31:31,560 Speaker 2: Another interesting limitation that Otten mentions: subsequent research has shown 554 00:31:31,640 --> 00:31:36,160 Speaker 2: that in group favoritism with the minimal group paradigm is 555 00:31:36,600 --> 00:31:43,520 Speaker 2: quote mostly restricted to allocations of positive resources and to evaluations 556 00:31:43,560 --> 00:31:48,160 Speaker 2: regarding positive traits.
So when you're talking about things like 557 00:31:48,280 --> 00:31:55,120 Speaker 2: assigning actual punishments or negative personal assessments, it seems that 558 00:31:55,160 --> 00:32:01,320 Speaker 2: the mere categorization effect was no longer reliably produced, which 559 00:32:01,440 --> 00:32:02,680 Speaker 2: should be a good result. 560 00:32:02,520 --> 00:32:06,440 Speaker 1: Right, yeah, yeah, knowing that the eye gouging actually wouldn't 561 00:32:06,720 --> 00:32:08,560 Speaker 1: play out all that well in this scenario. 562 00:32:08,880 --> 00:32:11,760 Speaker 2: Right, So maybe experiments show the minimal group stuff is 563 00:32:11,880 --> 00:32:14,920 Speaker 2: enough to make you treat your in group better, 564 00:32:15,000 --> 00:32:18,800 Speaker 2: and maybe even in some cases prefer them to get 565 00:32:18,840 --> 00:32:21,800 Speaker 2: a better leg up over the other group as opposed 566 00:32:21,800 --> 00:32:26,400 Speaker 2: to more payout overall. But it doesn't extend to actually 567 00:32:26,440 --> 00:32:28,840 Speaker 2: wanting to hurt or punish the outgroup. 568 00:32:29,440 --> 00:32:33,160 Speaker 1: Yeah, though that's not to imply that just not wanting, 569 00:32:33,280 --> 00:32:35,520 Speaker 1: or not thinking about, actively hurting the group means 570 00:32:35,560 --> 00:32:39,360 Speaker 1: that in, like, the real world implications of the 571 00:32:39,360 --> 00:32:42,960 Speaker 1: minimal group paradigm, plenty of hurt might not be inflicted, 572 00:32:43,560 --> 00:32:45,520 Speaker 1: you know; like, any 573 00:32:45,600 --> 00:32:50,000 Speaker 1: kind of outgroup discrimination could of course have terrible effects 574 00:32:50,640 --> 00:32:51,840 Speaker 1: in the real world.
575 00:32:51,840 --> 00:32:54,640 Speaker 2: But in those situations you'd be going beyond the conditions 576 00:32:54,680 --> 00:32:57,080 Speaker 2: of the minimal group paradigm and sort of bringing in 577 00:32:57,120 --> 00:32:59,840 Speaker 2: the real world. Yeah. But anyway, I thought this was 578 00:32:59,840 --> 00:33:02,680 Speaker 2: an interesting dynamic. So people might be more willing to 579 00:33:02,800 --> 00:33:06,480 Speaker 2: allocate monetary payments to their own group, even if that 580 00:33:06,560 --> 00:33:11,320 Speaker 2: group is novel or arbitrary. But studies don't reliably show 581 00:33:11,480 --> 00:33:14,520 Speaker 2: people to be willing to dole out punishments or disparagements 582 00:33:14,560 --> 00:33:18,880 Speaker 2: against a novel, arbitrary outgroup. Why might this be? One 583 00:33:19,320 --> 00:33:22,640 Speaker 2: interpretation given in this paper is it's possible that the 584 00:33:23,200 --> 00:33:27,080 Speaker 2: in group favoritism in minimal group paradigm experiments shows up 585 00:33:27,200 --> 00:33:32,040 Speaker 2: because people have positive associations with themselves, and hey, I'm 586 00:33:32,080 --> 00:33:35,600 Speaker 2: part of the in group, so I'm good and deserving, 587 00:33:35,720 --> 00:33:38,200 Speaker 2: and I'm part of group A, and therefore group A 588 00:33:38,240 --> 00:33:40,760 Speaker 2: is good and deserving, and there might not really be 589 00:33:40,880 --> 00:33:45,480 Speaker 2: an equivalent mechanism of comparison with the out group. So 590 00:33:45,560 --> 00:33:48,880 Speaker 2: the same logic doesn't lead someone to conclude that group 591 00:33:49,040 --> 00:33:52,960 Speaker 2: B is bad and undeserving, so you might not actually 592 00:33:52,960 --> 00:33:56,440 Speaker 2: go so far as to select punishments and disparagements for them. 593 00:33:56,840 --> 00:34:00,440 Speaker 2: Yet this would raise interesting questions.
It would bring us back 594 00:34:00,480 --> 00:34:04,320 Speaker 2: to that thing about why people so often in these 595 00:34:04,360 --> 00:34:08,920 Speaker 2: experiments will sacrifice overall rewards of the in group to 596 00:34:09,000 --> 00:34:12,640 Speaker 2: get a bigger leg up on the out group. Because again, remember, 597 00:34:12,719 --> 00:34:15,399 Speaker 2: like, you know, a lot of these findings are: If 598 00:34:15,400 --> 00:34:18,719 Speaker 2: I'm in group A but not personally receiving any rewards, 599 00:34:19,360 --> 00:34:22,160 Speaker 2: I might choose a plan where group A gets ten 600 00:34:22,360 --> 00:34:25,399 Speaker 2: and group B gets seven instead of a plan where 601 00:34:25,400 --> 00:34:28,960 Speaker 2: group A gets twelve and group B gets eleven. If 602 00:34:28,960 --> 00:34:31,600 Speaker 2: this is not to be interpreted as an attempt to 603 00:34:31,719 --> 00:34:36,040 Speaker 2: punish group B, what does it mean? Uh, maybe it 604 00:34:36,160 --> 00:34:40,880 Speaker 2: means that some people sometimes interpret it as a greater 605 00:34:41,120 --> 00:34:45,839 Speaker 2: personal reward to get significantly more than your neighbor than 606 00:34:45,880 --> 00:34:49,160 Speaker 2: it would be to get a greater objective reward overall. 607 00:34:49,440 --> 00:34:52,880 Speaker 2: Like some people would just rather come in second place 608 00:34:52,920 --> 00:34:55,799 Speaker 2: and have Jeff come in sixth place, rather than come 609 00:34:55,840 --> 00:34:58,520 Speaker 2: in first place and have Jeff come in second. 610 00:34:59,400 --> 00:35:01,719 Speaker 1: Yeah, I seem to sort of crunch that and 611 00:35:02,040 --> 00:35:05,520 Speaker 1: try and apply it to some sort of, you know, 612 00:35:05,600 --> 00:35:09,480 Speaker 1: hunter gatherer scenario and try and figure out how 613 00:35:09,480 --> 00:35:14,000 Speaker 1: that makes sense even in those situations.
614 00:35:14,040 --> 00:35:15,560 Speaker 1: But yeah, I don't know, that is. 615 00:35:15,520 --> 00:35:17,600 Speaker 2: A weird little wrinkle in human nature. 616 00:35:17,880 --> 00:35:21,160 Speaker 1: Yeah, you know, sometimes I see this discussed and I 617 00:35:21,200 --> 00:35:22,719 Speaker 1: think of it in terms of, it's kind of like, 618 00:35:23,160 --> 00:35:26,000 Speaker 1: the idea here is that MGP is kind of like 619 00:35:26,040 --> 00:35:30,640 Speaker 1: a bedrock scenario, you know, and that again, when you 620 00:35:30,640 --> 00:35:32,400 Speaker 1: bring it into the real world, everything else is going to 621 00:35:32,400 --> 00:35:34,120 Speaker 1: be built on top of that bedrock. Or you could 622 00:35:34,160 --> 00:35:35,520 Speaker 1: think of it in terms of, like, just sort of 623 00:35:35,520 --> 00:35:39,359 Speaker 1: the initial laying out with stakes of what will 624 00:35:39,360 --> 00:35:43,759 Speaker 1: become a cathedral, and so you're not necessarily going 625 00:35:43,840 --> 00:35:45,839 Speaker 1: to get the full picture of the cathedral looking at 626 00:35:46,120 --> 00:35:48,760 Speaker 1: the basic shape that you've marked out in the dirt, 627 00:35:49,880 --> 00:35:51,840 Speaker 1: but you may be able to figure out some things, 628 00:35:52,000 --> 00:35:54,880 Speaker 1: some of, like, the sweeping ideas that will 629 00:35:54,920 --> 00:35:57,759 Speaker 1: be present in the final design. But then again, you 630 00:35:57,800 --> 00:36:01,440 Speaker 1: have no idea what all the different cultural structures 631 00:36:01,480 --> 00:36:03,960 Speaker 1: on top of it are ultimately going to produce. But 632 00:36:03,960 --> 00:36:06,319 Speaker 1: it's still an interesting exercise to sort of strip things 633 00:36:06,320 --> 00:36:08,440 Speaker 1: down to this level.
Now, I want to come back 634 00:36:08,680 --> 00:36:12,480 Speaker 1: to Brown for just one last thing here, because in 635 00:36:12,800 --> 00:36:16,360 Speaker 1: that paper, Brown stresses the historical context of MGP and 636 00:36:16,440 --> 00:36:20,000 Speaker 1: says it is also important to consider, especially as it regards 637 00:36:20,040 --> 00:36:23,560 Speaker 1: two major points. So first of all, he says that 638 00:36:23,680 --> 00:36:25,480 Speaker 1: during the mid to late sixties, there was a so 639 00:36:25,520 --> 00:36:29,480 Speaker 1: called crisis in social psychology in which North American scholars, 640 00:36:29,520 --> 00:36:32,959 Speaker 1: in particular, were questioning whether European studies involving a great 641 00:36:32,960 --> 00:36:36,920 Speaker 1: deal of laboratory experimentation could actually apply to real world 642 00:36:37,000 --> 00:36:39,719 Speaker 1: social issues of the time. So this led to a 643 00:36:39,719 --> 00:36:43,120 Speaker 1: lot of soul searching and changes in Western psychology in general. 644 00:36:43,440 --> 00:36:46,360 Speaker 1: And ironically, there were a lot of questions about quote 645 00:36:46,400 --> 00:36:50,680 Speaker 1: unquote experiments in a vacuum. Now it's ironic because, I mean, 646 00:36:50,719 --> 00:36:53,719 Speaker 1: as we've been discussing, the minimal group paradigm is very 647 00:36:53,800 --> 00:36:56,040 Speaker 1: much an experiment in a vacuum, like a lot 648 00:36:56,080 --> 00:36:59,520 Speaker 1: of effort goes into sucking all of the real world 649 00:36:59,560 --> 00:37:02,120 Speaker 1: complexity, all the air, out of the chamber of 650 00:37:02,160 --> 00:37:07,080 Speaker 1: this experiment.
But it also could be seen, especially in 651 00:37:07,080 --> 00:37:09,640 Speaker 1: the time period, as kind of a stripping down 652 00:37:10,160 --> 00:37:13,439 Speaker 1: to a new bedrock, to a new level upon which 653 00:37:13,480 --> 00:37:16,239 Speaker 1: to try and understand, like sort of like sweeping out, 654 00:37:16,480 --> 00:37:19,960 Speaker 1: removing all those other experiments that were potentially complicating things. 655 00:37:21,040 --> 00:37:24,600 Speaker 1: And Brown also stresses that prior to the minimal group paradigm, 656 00:37:24,760 --> 00:37:28,239 Speaker 1: the main ideas for why you had social prejudices in 657 00:37:28,280 --> 00:37:33,960 Speaker 1: the real world were tied to personality dynamics often connected 658 00:37:34,000 --> 00:37:38,960 Speaker 1: to things in your upbringing, built up frustration, and negative 659 00:37:39,040 --> 00:37:44,399 Speaker 1: interdependence among groups, and all of these ideas as sort 660 00:37:44,400 --> 00:37:49,400 Speaker 1: of sweeping definitions were challenged by experimental data. Instead, the 661 00:37:49,640 --> 00:37:53,360 Speaker 1: minimal group paradigm creates this again super stripped down, simple 662 00:37:53,480 --> 00:37:57,400 Speaker 1: experiment that does seem to reveal a lot about some 663 00:37:57,480 --> 00:38:00,439 Speaker 1: of the basic mechanics of how we think about our 664 00:38:00,480 --> 00:38:05,360 Speaker 1: group and outside groups, coming back to memes and so forth. 665 00:38:05,600 --> 00:38:07,919 Speaker 1: It also reminds me of a common thing I think we 666 00:38:07,920 --> 00:38:09,960 Speaker 1: still see, and that is people saying, well, there are 667 00:38:09,960 --> 00:38:11,520 Speaker 1: two types of people in the world. There are the 668 00:38:11,560 --> 00:38:13,799 Speaker 1: people that do or believe X and those who do 669 00:38:13,920 --> 00:38:17,560 Speaker 1: or believe Y.
And I guess the thing that often 670 00:38:17,560 --> 00:38:20,680 Speaker 1: makes them funny, or potentially makes 671 00:38:20,719 --> 00:38:22,880 Speaker 1: them funny, is that they'll hit on a division you 672 00:38:22,920 --> 00:38:26,640 Speaker 1: did not realize was a thing, but then suddenly you're 673 00:38:26,640 --> 00:38:29,000 Speaker 1: just presented with this spark of an idea that this 674 00:38:29,120 --> 00:38:32,920 Speaker 1: is truly a defining choice to make. And you know, 675 00:38:33,120 --> 00:38:35,799 Speaker 1: even though it's generally played for laughs, you know, you 676 00:38:35,840 --> 00:38:39,319 Speaker 1: can kind of feel it, sort of you can feel 677 00:38:39,320 --> 00:38:41,760 Speaker 1: the divide sort of moving and you're sort of forced 678 00:38:41,760 --> 00:38:44,400 Speaker 1: to step to one side or the other, even if 679 00:38:44,440 --> 00:38:47,520 Speaker 1: you don't actually engage with said meme or said conversation. 680 00:38:48,080 --> 00:38:50,000 Speaker 2: Well, I think one of the things that's interesting about 681 00:38:50,000 --> 00:38:54,279 Speaker 2: those memes is they tend to, it's kind of, I feel 682 00:38:54,280 --> 00:38:56,200 Speaker 2: like we should give an example. What do they say? 683 00:38:56,280 --> 00:38:58,839 Speaker 2: There are two types of people, those who peel back 684 00:38:58,880 --> 00:39:01,160 Speaker 2: the Slim Jim wrapper as they eat it, or those 685 00:39:01,160 --> 00:39:03,120 Speaker 2: who take the Slim Jim out in one go and 686 00:39:03,160 --> 00:39:05,160 Speaker 2: then hold it with their fingers. 687 00:39:05,560 --> 00:39:07,440 Speaker 2: You know, do you get your fingers greasy or not?
688 00:39:07,800 --> 00:39:10,600 Speaker 2: Those memes are funny because they ask people to read 689 00:39:10,760 --> 00:39:13,560 Speaker 2: a lot into a behavior that, on its face, we 690 00:39:13,600 --> 00:39:15,880 Speaker 2: would assume does not tell you much about a person. 691 00:39:16,520 --> 00:39:19,560 Speaker 2: And exactly, the humor is in trying to like extrapolate 692 00:39:19,640 --> 00:39:22,080 Speaker 2: everything you could possibly want to know about a person 693 00:39:22,120 --> 00:39:25,120 Speaker 2: from that one thing, though that is often kind of 694 00:39:25,160 --> 00:39:28,239 Speaker 2: what we do. Like you can imagine sitting in these 695 00:39:28,760 --> 00:39:33,280 Speaker 2: early experiments with Tajfel and saying like, you know, okay, 696 00:39:33,320 --> 00:39:35,600 Speaker 2: what about the people who counted the dots, you know, 697 00:39:35,680 --> 00:39:37,960 Speaker 2: the people who counted the dots differently? What does that 698 00:39:38,120 --> 00:39:41,840 Speaker 2: say about their personality? And trying to like work that 699 00:39:42,000 --> 00:39:44,320 Speaker 2: up into something meaningful about reality. 700 00:39:44,840 --> 00:39:46,640 Speaker 1: Yeah, I mean, we as humans, we tend to look 701 00:39:46,680 --> 00:39:48,920 Speaker 1: for the patterns in things. So even when there's a 702 00:39:49,000 --> 00:39:53,680 Speaker 1: random splitting, like if it's supposedly based 703 00:39:53,719 --> 00:39:56,040 Speaker 1: on how we counted the jelly beans in a jar, 704 00:39:56,200 --> 00:39:58,120 Speaker 1: how we saw the dots in some sort of 705 00:39:58,160 --> 00:39:59,839 Speaker 1: an array, we're going to think about all the ways 706 00:39:59,840 --> 00:40:02,359 Speaker 1: that that could potentially define who or what we are. 707 00:40:02,800 --> 00:40:05,960 Speaker 2: You would say that because you're a dot undercounter.
708 00:40:05,880 --> 00:40:09,399 Speaker 1: Probably, yeah. I mean it does make you think, like, oh, 709 00:40:09,440 --> 00:40:13,319 Speaker 1: does that mean I'm a pessimist? Does 710 00:40:13,360 --> 00:40:15,600 Speaker 1: that mean I'm just not that into sugar? What does 711 00:40:15,600 --> 00:40:18,120 Speaker 1: it mean? We can't help but try 712 00:40:18,120 --> 00:40:19,680 Speaker 1: and figure that out and come up with all sorts 713 00:40:19,719 --> 00:40:23,279 Speaker 1: of ridiculous theories as to what it says. All right, 714 00:40:23,320 --> 00:40:25,600 Speaker 1: we're gonna go and close it out right there, but 715 00:40:25,680 --> 00:40:28,640 Speaker 1: hopefully we gave you just a good taste of the 716 00:40:28,680 --> 00:40:32,120 Speaker 1: minimal group paradigm, like where it came from, 717 00:40:32,480 --> 00:40:35,919 Speaker 1: what it seems to mean, what it seems to tell 718 00:40:36,000 --> 00:40:38,719 Speaker 1: us about human nature. Obviously, we'd love to hear from 719 00:40:38,719 --> 00:40:41,880 Speaker 1: everyone out there. If you have some more great, you know, 720 00:40:41,920 --> 00:40:46,480 Speaker 1: fictional examples or real world examples of some 721 00:40:46,560 --> 00:40:49,200 Speaker 1: of what's going on here, write in, we'd love to 722 00:40:49,200 --> 00:40:52,960 Speaker 1: hear from you. We read listener mail every Monday in 723 00:40:53,000 --> 00:40:54,720 Speaker 1: the Stuff to Blow Your Mind podcast 724 00:40:54,719 --> 00:40:56,800 Speaker 1: feed in our Stuff to Blow Your Mind listener mail episodes. 725 00:40:56,800 --> 00:40:59,200 Speaker 1: On Wednesdays we do a short form artifact or monster fact.
726 00:40:59,320 --> 00:41:02,200 Speaker 1: Tuesdays and Thursdays are our core episodes, and on 727 00:41:02,239 --> 00:41:05,359 Speaker 1: Fridays we set aside most serious concerns to just watch 728 00:41:05,360 --> 00:41:06,120 Speaker 1: a weird film. 729 00:41:06,560 --> 00:41:09,880 Speaker 2: Huge thanks to our audio producer JJ Posway. If you 730 00:41:09,880 --> 00:41:11,920 Speaker 2: would like to get in touch with us with feedback 731 00:41:11,960 --> 00:41:14,719 Speaker 2: on this episode or any other, to suggest a topic for 732 00:41:14,760 --> 00:41:17,000 Speaker 2: the future, or just to say hello, you can email 733 00:41:17,080 --> 00:41:27,440 Speaker 2: us at contact at stuff to blow your mind dot com. 734 00:41:27,680 --> 00:41:30,600 Speaker 3: Stuff to Blow Your Mind is a production of iHeartRadio. For 735 00:41:30,680 --> 00:41:33,480 Speaker 3: more podcasts from iHeartRadio, visit the iHeartRadio app, 736 00:41:33,640 --> 00:41:50,920 Speaker 3: Apple Podcasts, or wherever you listen to your favorite shows.