1 00:00:10,320 --> 00:00:15,640 Speaker 1: Hi there, folks, it is Saturday, December twenty seventh. And no, ma'am, 2 00:00:16,040 --> 00:00:20,360 Speaker 1: I am not Barack Obama. And no, ma'am, I don't 3 00:00:20,440 --> 00:00:25,400 Speaker 1: often get mistaken for Barack Obama. It's just you who 4 00:00:25,520 --> 00:00:29,200 Speaker 1: can't tell one light-skinned brother from another. And with that, 5 00:00:29,800 --> 00:00:33,920 Speaker 1: welcome to this y'all all look alike episode of Amy 6 00:00:33,960 --> 00:00:38,640 Speaker 1: and TJ. Robes, not the episode we planned on doing necessarily, 7 00:00:38,880 --> 00:00:42,040 Speaker 1: but is this a fun, funny topic? Y'all all look alike? 8 00:00:42,159 --> 00:00:43,240 Speaker 2: We're all familiar. 9 00:00:43,040 --> 00:00:47,320 Speaker 3: Yes, yes, and look, I think it's only, it's funny 10 00:00:47,440 --> 00:00:50,479 Speaker 3: in that it's alarming, and all you can do is 11 00:00:50,560 --> 00:00:52,360 Speaker 3: laugh at a certain point, because what are you gonna 12 00:00:52,360 --> 00:00:54,200 Speaker 3: do about it? I mean, your other choice is to 13 00:00:54,200 --> 00:00:57,960 Speaker 3: be angry and offended, which is well within your right, 14 00:00:59,240 --> 00:01:00,400 Speaker 3: but you chose to laugh. 15 00:01:00,880 --> 00:01:03,560 Speaker 1: Yeah, and we are going to get into that, what 16 00:01:03,640 --> 00:01:06,480 Speaker 1: you just said: what do you do about it? There 17 00:01:06,560 --> 00:01:08,720 Speaker 1: is something that can be done about it. So folks, 18 00:01:08,800 --> 00:01:12,679 Speaker 1: ask yourself: have you ever found yourself in that position 19 00:01:12,760 --> 00:01:15,319 Speaker 1: where you mixed up a couple of folks from 20 00:01:15,319 --> 00:01:17,360 Speaker 1: a minority group? A couple of Asian women, you can't 21 00:01:17,400 --> 00:01:18,920 Speaker 1: tell the difference; a couple of black guys.
22 00:01:18,959 --> 00:01:21,320 Speaker 2: You don't know. They all look alike. Have you done that? 23 00:01:21,520 --> 00:01:23,520 Speaker 1: Now, I'll ask you first before we get into this 24 00:01:23,600 --> 00:01:26,920 Speaker 1: incident that prompted this episode. Do you think you have 25 00:01:27,120 --> 00:01:29,800 Speaker 1: ever done that before? Maybe not even said it to 26 00:01:29,800 --> 00:01:32,680 Speaker 1: somebody's face, but maybe even internally, or you mixed up. 27 00:01:32,760 --> 00:01:34,560 Speaker 4: I'm sure I have, yes. I mean, I don't. 28 00:01:34,600 --> 00:01:37,839 Speaker 3: I can't think of a specific example, but I would 29 00:01:37,880 --> 00:01:40,240 Speaker 3: never say, oh no, I've never done that. 30 00:01:40,440 --> 00:01:41,319 Speaker 4: I'm sure I have. 31 00:01:41,600 --> 00:01:42,360 Speaker 2: And a lot of people have. 32 00:01:42,440 --> 00:01:45,840 Speaker 1: And some people will immediately point to racism, and this 33 00:01:45,920 --> 00:01:47,880 Speaker 1: is where that came up: that word came out 34 00:01:47,920 --> 00:01:50,160 Speaker 1: immediately after an incident we had at the airport. 35 00:01:50,200 --> 00:01:50,960 Speaker 2: We'll explain, folks. 36 00:01:51,040 --> 00:01:53,920 Speaker 1: We are on a trip, me, Robes, and the girls, 37 00:01:54,320 --> 00:01:56,600 Speaker 1: at least Ava and Sabine. 38 00:01:56,720 --> 00:01:57,760 Speaker 2: So we were in. 39 00:01:58,240 --> 00:02:00,840 Speaker 1: Just yesterday, an airport lounge. I don't even know if we 40 00:02:00,920 --> 00:02:02,800 Speaker 1: should say the lounge or not, or what airport.
I don't 41 00:02:02,800 --> 00:02:04,880 Speaker 1: know if it matters, but the point is we're sitting 42 00:02:04,880 --> 00:02:08,239 Speaker 1: in the lounge all kind of lined up on this cushioned bench, 43 00:02:08,360 --> 00:02:12,720 Speaker 1: just enjoying ourselves, eating our food, drinking our mimosas, and 44 00:02:12,760 --> 00:02:16,880 Speaker 1: a woman, as we were talking, Robes, interrupts us as 45 00:02:16,919 --> 00:02:17,640 Speaker 1: we were talking. 46 00:02:18,120 --> 00:02:18,720 Speaker 2: You take it from here. 47 00:02:18,880 --> 00:02:21,600 Speaker 3: Yeah, it began rudely, because you were actually winding up 48 00:02:21,600 --> 00:02:23,480 Speaker 3: to tell me something. I was really, I couldn't wait 49 00:02:23,520 --> 00:02:25,959 Speaker 3: to hear. You're like, okay, I had this idea, and 50 00:02:26,480 --> 00:02:29,280 Speaker 3: don't, you know, don't judge right away, but I 51 00:02:29,320 --> 00:02:30,960 Speaker 3: think it could be fun. So I was like all 52 00:02:31,000 --> 00:02:33,680 Speaker 3: into it, leaning in, and this woman just started saying, 53 00:02:34,840 --> 00:02:36,840 Speaker 3: you know, we thought you were Barack Obama, and you 54 00:02:37,000 --> 00:02:39,120 Speaker 3: just turned your head and looked at her, and to 55 00:02:39,160 --> 00:02:41,240 Speaker 3: the point where she recognized how rude it was that 56 00:02:41,320 --> 00:02:44,280 Speaker 3: she cut us off. She didn't say excuse me, or 57 00:02:44,320 --> 00:02:46,320 Speaker 3: she didn't wait for us to finish talking. She just 58 00:02:46,440 --> 00:02:50,079 Speaker 3: blurted out, she couldn't wait to tell you that her husband, 59 00:02:50,440 --> 00:02:54,760 Speaker 3: she went on to say, was convinced you were Barack Obama. 60 00:02:54,200 --> 00:02:56,480 Speaker 1: To the point that this woman got up and walked 61 00:02:56,480 --> 00:02:58,440 Speaker 1: over to us. I couldn't see where her husband was.
62 00:02:58,760 --> 00:03:00,560 Speaker 3: Yes, I think he was off in the line, but yeah, 63 00:03:00,560 --> 00:03:03,320 Speaker 3: she, and she just couldn't wait to tell you that. 64 00:03:03,400 --> 00:03:05,960 Speaker 4: She told him, no, I don't think that. 65 00:03:06,080 --> 00:03:09,920 Speaker 3: Is Barack Obama, because look, he's thin, he's thinner and 66 00:03:10,160 --> 00:03:13,359 Speaker 3: lighter than Barack Obama. So she also kind of wanted 67 00:03:13,360 --> 00:03:16,520 Speaker 3: to kind of prove that she knew that you were 68 00:03:16,560 --> 00:03:18,960 Speaker 3: a little bit more light-skinned and a little bit thinner, 69 00:03:19,040 --> 00:03:20,919 Speaker 3: and that's why you couldn't be Barack Obama. 70 00:03:21,320 --> 00:03:23,360 Speaker 4: And you just had a smile on your face. You said, no, ma'am. 71 00:03:23,639 --> 00:03:25,320 Speaker 3: She's like, you must get that all the time, and 72 00:03:25,360 --> 00:03:26,799 Speaker 3: you said, and you just looked at her. She's like, 73 00:03:26,880 --> 00:03:29,000 Speaker 3: you get that all the time, right? And you said, actually, 74 00:03:29,000 --> 00:03:29,680 Speaker 3: I don't, ma'am. 75 00:03:29,960 --> 00:03:34,320 Speaker 4: And she then got upset, and she got upset that you. 76 00:03:34,240 --> 00:03:38,960 Speaker 3: Wouldn't acknowledge or validate the fact that she was convinced 77 00:03:39,040 --> 00:03:40,560 Speaker 3: that you might have been and could have been and 78 00:03:40,600 --> 00:03:43,160 Speaker 3: certainly must be mistaken for Barack Obama. 79 00:03:43,320 --> 00:03:44,560 Speaker 4: All the time she was. 80 00:03:44,920 --> 00:03:47,840 Speaker 1: Got flustered by the idea that we wouldn't go along 81 00:03:47,880 --> 00:03:50,600 Speaker 1: with her foolishness, which it was foolishness. And I will own it: 82 00:03:50,640 --> 00:03:53,040 Speaker 1: my tone never changed.
I actually got happier 83 00:03:53,400 --> 00:03:55,120 Speaker 1: talking to her because I was toying with her, and 84 00:03:55,160 --> 00:03:58,600 Speaker 1: I continued to toy with her. Part of 85 00:03:58,640 --> 00:04:01,520 Speaker 1: the reason is that, really, you think Barack Obama is 86 00:04:01,640 --> 00:04:05,600 Speaker 1: just hanging out here by himself, lounging in the 87 00:04:05,760 --> 00:04:10,440 Speaker 1: United Lounge? Okay, first, he's not. Second, I am sitting 88 00:04:10,440 --> 00:04:12,880 Speaker 1: here with my hand on the thigh of a white woman, 89 00:04:13,200 --> 00:04:17,200 Speaker 1: and you really think me and Barack, yes, you. 90 00:04:17,200 --> 00:04:17,800 Speaker 2: Mixed us up. 91 00:04:17,920 --> 00:04:20,560 Speaker 3: Yes, he's stepping out on Michelle with a white 92 00:04:20,600 --> 00:04:23,440 Speaker 3: woman and a whole white family, by the way, and. 93 00:04:23,440 --> 00:04:26,680 Speaker 4: Two white daughters. He's just, you know, he's adopted them. 94 00:04:26,880 --> 00:04:28,600 Speaker 3: Yes, you haven't heard a lot, but that's all been 95 00:04:28,640 --> 00:04:29,599 Speaker 3: going on behind the scenes. 96 00:04:29,640 --> 00:04:32,839 Speaker 1: So this ridiculous conversation is going on. And it went 97 00:04:32,880 --> 00:04:34,880 Speaker 1: on longer than it should have because I was giving 98 00:04:34,880 --> 00:04:35,280 Speaker 1: her hell. 99 00:04:35,360 --> 00:04:35,760 Speaker 4: Correct. 100 00:04:35,920 --> 00:04:38,200 Speaker 3: You kept toying with her, and she kept getting frustrated. 101 00:04:38,240 --> 00:04:40,240 Speaker 3: She's like, are you messing with me? Are you messing 102 00:04:40,279 --> 00:04:40,520 Speaker 3: with me? 103 00:04:40,839 --> 00:04:41,159 Speaker 4: Come on. 104 00:04:41,480 --> 00:04:43,680 Speaker 3: You know people do tell you that, right? And you 105 00:04:43,800 --> 00:04:46,360 Speaker 3: just kept going.
That's when you started saying, well, no, 106 00:04:46,400 --> 00:04:48,840 Speaker 3: not really, I'm here with three white women, so no, 107 00:04:48,960 --> 00:04:51,520 Speaker 3: it doesn't happen. She actually looked at me at one point. 108 00:04:52,040 --> 00:04:54,320 Speaker 4: I'm just witnessing it all. She said, you should break 109 00:04:54,400 --> 00:04:55,560 Speaker 4: up with him. 110 00:04:56,160 --> 00:04:59,839 Speaker 3: She was so annoyed that you wouldn't play along with 111 00:04:59,839 --> 00:05:04,880 Speaker 3: her or validate her way of thinking that she actually 112 00:05:04,920 --> 00:05:07,680 Speaker 3: looked at me and said, you should break up with him. 113 00:05:07,680 --> 00:05:10,840 Speaker 1: So still I'm laughing, and I'm happy, and we're still 114 00:05:10,839 --> 00:05:11,520 Speaker 1: having a good time. 115 00:05:11,600 --> 00:05:12,839 Speaker 2: I never got upset. 116 00:05:12,880 --> 00:05:16,440 Speaker 1: The woman finally walks away from this incident, which again went longer 117 00:05:16,440 --> 00:05:19,520 Speaker 1: than it should have, walks away, and at least 118 00:05:19,560 --> 00:05:23,960 Speaker 1: some members of our party immediately looked, and the word 119 00:05:24,120 --> 00:05:26,640 Speaker 1: racism came up, saying, isn't that a little. 120 00:05:26,440 --> 00:05:28,320 Speaker 2: Racist, what she just did? 121 00:05:28,600 --> 00:05:32,440 Speaker 1: I didn't necessarily think much of it at the time, 122 00:05:32,920 --> 00:05:36,120 Speaker 1: but it brings up, Robes, of course, something we 123 00:05:36,480 --> 00:05:40,159 Speaker 1: all are aware of: this idea that they all 124 00:05:40,200 --> 00:05:40,679 Speaker 1: look alike. 125 00:05:40,880 --> 00:05:43,000 Speaker 2: Yep, they all look alike. Who are they? 126 00:05:43,279 --> 00:05:47,159 Speaker 1: Oftentimes, us black folks: they all look alike. So is 127 00:05:47,200 --> 00:05:51,800 Speaker 1: there anything to that?
Scientists will actually tell you. 128 00:05:51,800 --> 00:05:53,880 Speaker 4: Yes, and that's fascinating. And. 129 00:05:53,800 --> 00:05:56,919 Speaker 1: It's not racism. Now, racism can get involved in this, 130 00:05:56,960 --> 00:05:59,599 Speaker 1: but it's not racism. So this is something they actually 131 00:05:59,600 --> 00:06:02,760 Speaker 1: call, I've heard a couple of names for it: it's 132 00:06:02,800 --> 00:06:07,680 Speaker 1: called the other-race effect, the cross-race effect, 133 00:06:07,160 --> 00:06:10,640 Speaker 2: or own-race bias is how they refer to them. 134 00:06:10,680 --> 00:06:12,200 Speaker 2: And what we're talking about, doesn't this make sense, 135 00:06:12,240 --> 00:06:14,800 Speaker 1: though? It's harder for people of one race to identify 136 00:06:15,240 --> 00:06:19,000 Speaker 1: people of other races. It's. 137 00:06:18,160 --> 00:06:20,200 Speaker 4: In your brain. Your brain's wired that way. 138 00:06:20,880 --> 00:06:21,840 Speaker 2: That does make sense. 139 00:06:21,880 --> 00:06:24,919 Speaker 1: We are born and the first people we see or 140 00:06:24,960 --> 00:06:29,200 Speaker 1: go to school with or in our homes look like us, 141 00:06:29,680 --> 00:06:31,960 Speaker 1: so we know and we get comfortable early on with 142 00:06:32,000 --> 00:06:34,720 Speaker 1: those features. The problem is that some people from some 143 00:06:34,800 --> 00:06:39,200 Speaker 1: cultures don't have as much meaningful interaction with people of 144 00:06:39,240 --> 00:06:42,960 Speaker 1: other races early on, so you miss 145 00:06:43,000 --> 00:06:47,200 Speaker 1: out on kind of registering what people look like. 146 00:06:47,279 --> 00:06:48,640 Speaker 2: It kind of makes sense.
147 00:06:48,480 --> 00:06:50,320 Speaker 3: That does, if you think about all the things that 148 00:06:50,360 --> 00:06:52,800 Speaker 3: are formed in your brain before you turn five. 149 00:06:52,920 --> 00:06:55,360 Speaker 3: A lot of it is language, so it would make 150 00:06:55,400 --> 00:06:57,120 Speaker 3: sense that sight would be a part of it as well, 151 00:06:57,400 --> 00:06:59,520 Speaker 3: and your ability to distinguish. Think about it: if you 152 00:06:59,680 --> 00:07:03,240 Speaker 3: don't learn certain languages or certain word patterns by. 153 00:07:03,680 --> 00:07:05,200 Speaker 4: I believe it's like ten or eleven. 154 00:07:05,480 --> 00:07:08,839 Speaker 3: You can't say your R's, if you're Asian, the way 155 00:07:08,880 --> 00:07:10,280 Speaker 3: we do here in America. 156 00:07:10,560 --> 00:07:10,800 Speaker 4: Can't. 157 00:07:10,840 --> 00:07:15,560 Speaker 3: You literally cannot get your brain to unform whatever connection 158 00:07:15,680 --> 00:07:17,960 Speaker 3: it's made and make a new connection. Like, you just 159 00:07:18,080 --> 00:07:20,320 Speaker 3: lose out or miss out on the ability to make 160 00:07:20,400 --> 00:07:24,840 Speaker 3: different connections or distinguish. I know it's true for language, 161 00:07:24,840 --> 00:07:26,400 Speaker 3: so it makes sense it would be true for sight. 162 00:07:26,640 --> 00:07:30,600 Speaker 1: It is in the brain, but what happens is, 163 00:07:30,800 --> 00:07:35,440 Speaker 1: early on, see, a white kid is surrounded by white folks, 164 00:07:35,440 --> 00:07:38,440 Speaker 1: a black kid surrounded by black folks growing up. But 165 00:07:38,520 --> 00:07:40,480 Speaker 1: the white kid is going to have a more difficult time.
166 00:07:40,560 --> 00:07:43,920 Speaker 1: Why? Because the white kid is in the majority group 167 00:07:44,320 --> 00:07:49,040 Speaker 1: and he is growing up with less exposure to those 168 00:07:49,080 --> 00:07:51,440 Speaker 1: minority groups. Me as a black kid, I see white 169 00:07:51,480 --> 00:07:54,160 Speaker 1: folks all the time. There's my teacher, there's my pastor, 170 00:07:54,560 --> 00:07:56,880 Speaker 1: there's this historical figure 171 00:07:56,600 --> 00:07:57,559 Speaker 2: that they're teaching me about. 172 00:07:57,680 --> 00:08:00,720 Speaker 1: We get more access, if you will. Even looking 173 00:08:00,720 --> 00:08:06,560 Speaker 1: at movies and whatnot, we are accustomed to studying 174 00:08:06,640 --> 00:08:10,560 Speaker 1: and seeing and understanding white features more so than the 175 00:08:10,600 --> 00:08:10,960 Speaker 1: other way. 176 00:08:10,960 --> 00:08:11,600 Speaker 2: Does that make sense? 177 00:08:11,680 --> 00:08:14,600 Speaker 3: Yes. I've never thought about it, but yes, that 178 00:08:14,680 --> 00:08:15,840 Speaker 3: does make perfect sense. 179 00:08:15,880 --> 00:08:18,559 Speaker 1: And the other thing for white folks, again, 180 00:08:18,600 --> 00:08:22,720 Speaker 1: the research says here, is that you all start to register 181 00:08:23,000 --> 00:08:26,400 Speaker 1: different features that you all have. Meaning there's a blond 182 00:08:26,440 --> 00:08:29,720 Speaker 1: haired girl, there is a redhead guy, there is a 183 00:08:29,880 --> 00:08:32,760 Speaker 1: blue-eyed boy, there is a green-eyed kid. You all 184 00:08:32,840 --> 00:08:35,840 Speaker 1: have different types of features. You see us? You see 185 00:08:35,880 --> 00:08:39,480 Speaker 1: brown skin, brown eyes, and black hair. There aren't, in 186 00:08:39,520 --> 00:08:42,640 Speaker 1: you all's minds, the same types of 187 00:08:42,679 --> 00:08:45,959 Speaker 1: distinguishing features.
For me, I know, the nose can look 188 00:08:46,000 --> 00:08:48,000 Speaker 1: this way, the lips can be that way, the ass 189 00:08:48,000 --> 00:08:48,839 Speaker 1: can be that way, the. 190 00:08:48,840 --> 00:08:49,400 Speaker 2: Ears could be. 191 00:08:49,559 --> 00:08:51,360 Speaker 1: I know all the features of black folks are different. 192 00:08:51,440 --> 00:08:53,280 Speaker 1: I can identify because I am familiar. 193 00:08:53,320 --> 00:08:54,160 Speaker 2: Does that make sense? 194 00:08:54,240 --> 00:08:55,040 Speaker 4: Yes, it does. 195 00:08:55,240 --> 00:08:57,520 Speaker 2: So we haven't hit racism yet necessarily, right? 196 00:08:57,679 --> 00:08:59,200 Speaker 4: No, it's exposure. 197 00:08:59,520 --> 00:09:01,560 Speaker 3: And I mean, in that sense you could say that 198 00:09:01,760 --> 00:09:05,720 Speaker 3: racist attitudes create that, because if you are isolated, you 199 00:09:05,760 --> 00:09:08,720 Speaker 3: are segregated by choice, because you prefer to be with 200 00:09:08,760 --> 00:09:11,840 Speaker 3: your own kind. You don't expose yourself, you don't expose 201 00:09:11,840 --> 00:09:15,000 Speaker 3: your children. So it's a choice in that sense. So 202 00:09:15,040 --> 00:09:17,840 Speaker 3: that does lead to something that maybe isn't intentional, but 203 00:09:17,880 --> 00:09:20,840 Speaker 3: it's still a byproduct of it. 204 00:09:20,600 --> 00:09:23,760 Speaker 1: Man, you are nailing it right now. What the research says, 205 00:09:23,800 --> 00:09:25,760 Speaker 1: you are going right along with this research that's going 206 00:09:25,800 --> 00:09:28,240 Speaker 1: to come out. So there have been, and I mean, 207 00:09:28,320 --> 00:09:30,920 Speaker 1: a bunch of studies on this, and they all come 208 00:09:30,960 --> 00:09:34,200 Speaker 1: out with the same result, which is that, yes, white 209 00:09:34,240 --> 00:09:39,200 Speaker 1: people have a more difficult time identifying features of minorities.
210 00:09:39,440 --> 00:09:42,199 Speaker 1: Now hear this: a very important study done out in 211 00:09:42,240 --> 00:09:43,600 Speaker 1: California in twenty nineteen. 212 00:09:44,120 --> 00:09:44,880 Speaker 2: White people were. 213 00:09:44,800 --> 00:09:46,760 Speaker 1: Shown, this is going to mess you up, Sabine, white people, 214 00:09:46,760 --> 00:09:49,880 Speaker 1: I mean, Sabine, Robes. White people were shown photos of 215 00:09:50,040 --> 00:09:54,240 Speaker 1: various faces of black and white people. Okay, right, so 216 00:09:54,440 --> 00:09:58,160 Speaker 1: then they altered the photo a little, by thirty percent, 217 00:09:58,240 --> 00:10:00,560 Speaker 1: then by fifty percent, then by seventy percent, then 218 00:10:00,559 --> 00:10:03,040 Speaker 1: by one hundred percent, meaning they show them a totally 219 00:10:03,040 --> 00:10:03,680 Speaker 1: different person. 220 00:10:04,480 --> 00:10:07,880 Speaker 2: Get this, and this: they put them through an MRI. 221 00:10:08,120 --> 00:10:09,760 Speaker 2: So they were looking at their brain function. 222 00:10:09,880 --> 00:10:10,160 Speaker 1: Wow. 223 00:10:10,240 --> 00:10:11,040 Speaker 2: Oh yeah. 224 00:10:11,160 --> 00:10:15,560 Speaker 1: So the result: people's brains reacted strongly to even the 225 00:10:15,600 --> 00:10:20,600 Speaker 1: slightest change in a white face. Brains didn't register reaction 226 00:10:21,320 --> 00:10:25,959 Speaker 1: even when a different black face was shown. Wow. If 227 00:10:26,000 --> 00:10:28,920 Speaker 1: you change the eye color on a white person, a 228 00:10:28,960 --> 00:10:32,560 Speaker 1: white person will go, my mind is firing. You 229 00:10:32,559 --> 00:10:36,360 Speaker 1: can show them a different black man and they do 230 00:10:36,480 --> 00:10:39,000 Speaker 1: not have the same brain reaction. Isn't that crazy? 231 00:10:39,360 --> 00:10:42,240 Speaker 4: Wow.
I mean, actually, so you can't see me, but 232 00:10:42,320 --> 00:10:43,360 Speaker 4: my jaw is dropped. 233 00:10:44,040 --> 00:10:45,400 Speaker 2: What? Okay, now hear this. 234 00:10:45,440 --> 00:10:48,280 Speaker 1: The research says that these white folks in the study 235 00:10:48,280 --> 00:10:54,200 Speaker 1: were essentially treating black faces as almost like they're not faces. 236 00:10:54,840 --> 00:10:58,800 Speaker 1: That's his quote. Wow, yikes, ouch. 237 00:10:58,960 --> 00:10:59,720 Speaker 2: That stings. 238 00:10:59,800 --> 00:11:03,840 Speaker 1: Now it's actually going to get worse with this. The 239 00:11:03,960 --> 00:11:07,199 Speaker 1: bias starts immediately when we. 240 00:11:07,160 --> 00:11:08,600 Speaker 2: See somebody, like as soon. 241 00:11:08,400 --> 00:11:12,840 Speaker 1: As you see anything, your brain starts registering that face. 242 00:11:13,720 --> 00:11:18,600 Speaker 1: The research shows: white folks, you see someone, they're not 243 00:11:18,880 --> 00:11:22,200 Speaker 1: like you, they're from another group, and that's where you 244 00:11:22,280 --> 00:11:26,960 Speaker 1: stop processing and giving a shit about anything else other 245 00:11:27,000 --> 00:11:31,440 Speaker 1: than black skin. Wow. This is a 246 00:11:31,520 --> 00:11:34,640 Speaker 1: California researcher. This is not some theory or something somebody 247 00:11:34,679 --> 00:11:38,680 Speaker 1: wrote in some paper. When you think about it that way, 248 00:11:39,040 --> 00:11:42,199 Speaker 1: our brains, immediately, when you see somebody's face, Robes, your 249 00:11:42,240 --> 00:11:45,280 Speaker 1: brain starts working. But the research shows that white people, 250 00:11:45,280 --> 00:11:48,319 Speaker 1: when they see a face not like theirs, immediately stop 251 00:11:48,400 --> 00:11:51,960 Speaker 1: working and don't even care to further investigate anything else.
252 00:11:51,840 --> 00:11:54,720 Speaker 3: About that, because, did they go 253 00:11:54,800 --> 00:11:56,839 Speaker 3: into the reason? Obviously, I know some of the 254 00:11:56,840 --> 00:12:00,440 Speaker 3: reasons why, because of early processing perhaps, but is it 255 00:12:00,480 --> 00:12:03,080 Speaker 3: because we've never had to, as white people? Because when 256 00:12:03,120 --> 00:12:06,360 Speaker 3: you're in the majority, it's not important enough, it doesn't 257 00:12:06,440 --> 00:12:08,800 Speaker 3: change your life enough, it's not necessary. 258 00:12:09,040 --> 00:12:09,920 Speaker 2: You nailed it. 259 00:12:10,160 --> 00:12:14,400 Speaker 1: And the best example is, take for example, your office. Right? 260 00:12:14,720 --> 00:12:18,600 Speaker 1: Everybody remembers the boss, the manager, because you have to, 261 00:12:18,679 --> 00:12:21,480 Speaker 1: right? People you don't even see a lot, you study 262 00:12:21,480 --> 00:12:24,360 Speaker 1: them because they mean something to you. Who's on the 263 00:12:24,400 --> 00:12:26,800 Speaker 1: lower end of the totem pole? You don't have to 264 00:12:26,800 --> 00:12:30,240 Speaker 1: give a damn about that person. So the hierarchy 265 00:12:30,400 --> 00:12:32,600 Speaker 1: there is a big part of it. His quote, this 266 00:12:32,720 --> 00:12:38,160 Speaker 1: researcher: white folks, they don't have the motivation to process 267 00:12:38,360 --> 00:12:42,280 Speaker 1: an individual more deeply when you see somebody that doesn't 268 00:12:42,320 --> 00:12:44,360 Speaker 1: look like you. This is his research, and a lot 269 00:12:44,400 --> 00:12:47,120 Speaker 1: of studies come to these conclusions. I was fascinated by it. 270 00:12:47,160 --> 00:12:49,640 Speaker 1: You're killing it. But we go from, we don't even 271 00:12:49,520 --> 00:12:49,959 Speaker 2: laugh about it.
272 00:12:49,960 --> 00:12:52,720 Speaker 1: Oh, they all look alike. Like, this is some deeper 273 00:12:52,760 --> 00:12:53,600 Speaker 1: and heavier stuff. 274 00:12:53,920 --> 00:12:56,839 Speaker 3: Yeah, there's science behind it. There's research behind it. 275 00:12:56,880 --> 00:12:59,560 Speaker 3: And you know what, as you're telling me this, it 276 00:12:59,600 --> 00:13:03,000 Speaker 3: makes sense. It's hard to, it's hard to accept, because 277 00:13:03,000 --> 00:13:05,960 Speaker 3: no one wants to consider themselves racist. No one wants 278 00:13:06,000 --> 00:13:10,400 Speaker 3: to consider themselves exclusionary or dismissive. That's a terrible thing 279 00:13:10,480 --> 00:13:13,360 Speaker 3: to imagine yourself as being. But yet you kind of 280 00:13:13,400 --> 00:13:15,920 Speaker 3: have to hear this, recognize it, and maybe see it 281 00:13:15,960 --> 00:13:18,240 Speaker 3: in someone else, and then look within and say, do 282 00:13:18,320 --> 00:13:18,800 Speaker 3: I do that? 283 00:13:19,120 --> 00:13:20,240 Speaker 4: And how can I do better? 284 00:13:21,000 --> 00:13:24,400 Speaker 1: And so, once again, because some people will hear 285 00:13:24,440 --> 00:13:27,520 Speaker 1: this and go, oh, it's not my fault. It's a 286 00:13:27,559 --> 00:13:30,319 Speaker 1: phenomenon I can't do anything about. Some people will look 287 00:13:30,320 --> 00:13:31,920 Speaker 1: at it that way. They absolutely will. 288 00:13:32,160 --> 00:13:32,840 Speaker 4: All right, you're right. 289 00:13:32,920 --> 00:13:37,200 Speaker 3: Oh see, it's not, I can't help it. This is 290 00:13:37,200 --> 00:13:38,960 Speaker 3: how I was born, this is how I was raised.
291 00:13:38,960 --> 00:13:41,520 Speaker 3: There's nothing I can do about it. But I would argue, 292 00:13:41,600 --> 00:13:43,560 Speaker 3: and I don't know what you're about to say or 293 00:13:43,559 --> 00:13:46,320 Speaker 3: tell us, that this is an aha moment that 294 00:13:46,320 --> 00:13:47,559 Speaker 3: can be actionable. 295 00:13:47,960 --> 00:13:49,079 Speaker 4: You can do something about it. 296 00:13:49,360 --> 00:13:53,880 Speaker 1: Yes, awareness first. But ask yourself, why is it you 297 00:13:53,920 --> 00:13:57,040 Speaker 1: look at this person and don't care about knowing anything 298 00:13:57,080 --> 00:13:58,640 Speaker 1: more than seeing the skin color? 299 00:13:58,720 --> 00:14:01,400 Speaker 3: Yeah, I'm thinking about it, going wow. I'm now, like, 300 00:14:01,640 --> 00:14:04,680 Speaker 3: as you're telling me this, I'm already imagining how I'm 301 00:14:04,720 --> 00:14:09,040 Speaker 3: going to now try to process and think about people 302 00:14:09,400 --> 00:14:12,120 Speaker 3: when I look at them, see them, meet them, and register; 303 00:14:12,600 --> 00:14:15,840 Speaker 3: don't do that thing that maybe you do that you 304 00:14:15,840 --> 00:14:18,920 Speaker 3: didn't intend to, or you didn't... actually, it wasn't something 305 00:14:18,960 --> 00:14:21,560 Speaker 3: that was conscious. But now that I am conscious, now 306 00:14:21,600 --> 00:14:25,080 Speaker 3: that I've woken up to this, I can now be better. 307 00:14:25,520 --> 00:14:26,640 Speaker 2: You woke now? Huh? 308 00:14:26,680 --> 00:14:31,120 Speaker 4: Oh God, why did I use the W word? 309 00:14:31,200 --> 00:14:31,240 Speaker 3: No? 310 00:14:31,600 --> 00:14:35,520 Speaker 1: But this is, okay, you used it appropriately. It's been 311 00:14:35,680 --> 00:14:37,440 Speaker 1: taken from us in a lot of ways. But this 312 00:14:37,560 --> 00:14:40,080 Speaker 1: is what we're talking about.
If you're just aware, a 313 00:14:40,080 --> 00:14:42,200 Speaker 1: lot of people aren't even aware of this. You're woke 314 00:14:42,280 --> 00:14:45,320 Speaker 1: by just being aware. You're aware, you've woken up to 315 00:14:45,440 --> 00:14:47,760 Speaker 1: something going on, and now you can make a difference 316 00:14:47,800 --> 00:14:50,680 Speaker 1: and improve. That's what woke is, by the way, people. So, yes, 317 00:14:50,720 --> 00:14:52,800 Speaker 1: you used it appropriately, but I knew you would laugh 318 00:14:53,120 --> 00:14:56,200 Speaker 1: when I said that. Okay, here's the problem, you kind 319 00:14:56,200 --> 00:14:59,680 Speaker 1: of hit on it, Robes. It's two problems. One thing you 320 00:14:59,760 --> 00:15:01,720 Speaker 1: were kind of dismissive of: what happens in the criminal 321 00:15:01,800 --> 00:15:04,520 Speaker 1: justice system. Ah, yeah, come in for a lineup. Yep, 322 00:15:04,520 --> 00:15:07,880 Speaker 1: that's him right there. That kind of a thing, mistaken identity. 323 00:15:08,000 --> 00:15:11,080 Speaker 1: Wrong guy gets arrested, wrong guy goes to trial, wrong 324 00:15:11,120 --> 00:15:13,160 Speaker 1: guy goes to jail because somebody identified. 325 00:15:14,280 --> 00:15:15,560 Speaker 2: Every day, that happens too often. 326 00:15:15,600 --> 00:15:17,200 Speaker 1: But the other thing is, it just sucks to be in 327 00:15:17,200 --> 00:15:19,360 Speaker 1: an office where somebody keeps calling you somebody else, 328 00:15:19,440 --> 00:15:23,440 Speaker 1: keeps getting you confused. Because why? You're actually telling me 329 00:15:23,840 --> 00:15:27,200 Speaker 1: that there's nothing unique or special about me, or nothing 330 00:15:27,240 --> 00:15:29,920 Speaker 1: about me that makes you even want to take a 331 00:15:30,240 --> 00:15:31,640 Speaker 1: moment to know who I am. 332 00:15:31,760 --> 00:15:32,240 Speaker 2: That hurts. 333 00:15:32,720 --> 00:15:33,960 Speaker 4: Yeah, that hurts.
334 00:15:34,080 --> 00:15:36,640 Speaker 3: I can only imagine, because, yeah, I was just, 335 00:15:37,040 --> 00:15:38,920 Speaker 3: or you can't be of service to me, you're of 336 00:15:39,000 --> 00:15:41,680 Speaker 3: no use to me. Well, 337 00:15:41,840 --> 00:15:45,120 Speaker 1: Robes, stay here. Not Robes, I know you're staying here. 338 00:15:45,880 --> 00:15:48,400 Speaker 1: I shouldn't be presumptuous. I hope you stay around, Robes, 339 00:15:48,480 --> 00:15:50,480 Speaker 1: but you all stay here, because when we come back, 340 00:15:50,840 --> 00:15:55,400 Speaker 1: we'll actually tell you the three individuals that I've actually 341 00:15:55,400 --> 00:15:59,680 Speaker 1: been mistaken for most in my life, and we'll also 342 00:15:59,720 --> 00:16:02,560 Speaker 1: hit you with a couple of high-profile incidents that 343 00:16:02,600 --> 00:16:04,680 Speaker 1: have happened to celebrities 344 00:16:04,200 --> 00:16:06,880 Speaker 2: who have gotten mixed up with other celebrities. 345 00:16:07,240 --> 00:16:10,680 Speaker 1: I.e., yeah, we're both black, yeah, we're both rich, 346 00:16:11,000 --> 00:16:12,480 Speaker 1: but that don't mean we're the same person. 347 00:16:12,520 --> 00:16:25,000 Speaker 2: Stay here. All right, we continue here on this 348 00:16:25,360 --> 00:16:27,920 Speaker 1: y'all all look alike episode of Amy and TJ. Robes, in the time 349 00:16:27,960 --> 00:16:31,120 Speaker 1: we've been together, when someone has come up to me 350 00:16:31,160 --> 00:16:33,480 Speaker 1: and gotten me mixed up with somebody else, I think, yes, 351 00:16:36,040 --> 00:16:37,200 Speaker 1: that's the one you get. 352 00:16:37,520 --> 00:16:41,000 Speaker 3: I've heard people come up to you and actually call 353 00:16:41,080 --> 00:16:41,760 Speaker 3: you Don Lemon. 354 00:16:42,200 --> 00:16:42,400 Speaker 2: See.
355 00:16:42,440 --> 00:16:44,680 Speaker 1: I can't, I can't remember the last time. It used 356 00:16:44,680 --> 00:16:46,800 Speaker 1: to happen a bunch when we were both at CNN. 357 00:16:46,800 --> 00:16:48,640 Speaker 1: We used to work at CNN on Saturday 358 00:16:48,640 --> 00:16:52,040 Speaker 1: and Sunday. I did mornings, he did evenings. That may 359 00:16:52,080 --> 00:16:53,960 Speaker 1: be a little. 360 00:16:54,200 --> 00:16:57,720 Speaker 3: But still, you don't look anything alike. What? You have 361 00:16:57,840 --> 00:16:58,960 Speaker 3: similar coloring. 362 00:17:00,240 --> 00:17:01,240 Speaker 2: That's it, you know. 363 00:17:01,320 --> 00:17:02,960 Speaker 1: The thing I used to say when I was when 364 00:17:03,000 --> 00:17:04,719 Speaker 1: people used to say that to me, I wouldn't 365 00:17:04,720 --> 00:17:08,199 Speaker 1: break stride, as you know. I would say, nah, he's 366 00:17:08,800 --> 00:17:12,359 Speaker 1: he's gay and I'm married to a woman. I would 367 00:17:12,359 --> 00:17:15,040 Speaker 1: say that and just keep it pushing, and that would 368 00:17:15,040 --> 00:17:16,880 Speaker 1: be the end of it. But Don, that one made sense. 369 00:17:17,400 --> 00:17:20,800 Speaker 1: I've only one other time gotten Barack Obama. And that was 370 00:17:20,840 --> 00:17:23,000 Speaker 1: literally when I was walking near the White House. I 371 00:17:23,040 --> 00:17:24,720 Speaker 1: was in a I remember the blue suit I was in. 372 00:17:24,880 --> 00:17:28,400 Speaker 1: Walking, somebody said, oh gosh. It's like, bro, you 373 00:17:28,440 --> 00:17:30,080 Speaker 4: look nothing like Barack Obama. 374 00:17:30,119 --> 00:17:32,480 Speaker 1: You know the response I gave, I said, I'm much 375 00:17:32,520 --> 00:17:33,119 Speaker 1: better dressed. 376 00:17:33,560 --> 00:17:34,320 Speaker 2: Never broke stride. 377 00:17:34,440 --> 00:17:36,639 Speaker 4: Oh that is what you told the woman. 
Yes, she 378 00:17:36,800 --> 00:17:39,040 Speaker 4: did not know what to do. 379 00:17:39,720 --> 00:17:42,679 Speaker 3: She was so confused, and she actually was thinking like 380 00:17:43,320 --> 00:17:45,800 Speaker 3: is he kidding? Is he not? She was You made 381 00:17:45,840 --> 00:17:50,360 Speaker 3: her uncomfortable, and that perhaps was the point. I thought 382 00:17:50,359 --> 00:17:51,080 Speaker 3: she would move on. 383 00:17:51,359 --> 00:17:51,880 Speaker 2: It was a joke. 384 00:17:51,960 --> 00:17:54,720 Speaker 1: I said, Oh, so I'm Barack Obama? Clearly I'm much better dressed. 385 00:17:55,000 --> 00:17:57,240 Speaker 2: Hahaha. I thought that would lighten the mood. 386 00:17:57,359 --> 00:17:59,200 Speaker 3: You were in a hoodie, right, and she and then 387 00:17:59,200 --> 00:18:00,639 Speaker 3: she wouldn't take no for an answer. 388 00:18:01,320 --> 00:18:02,120 Speaker 2: I will tell you 389 00:18:02,119 --> 00:18:04,000 Speaker 1: the other guy. 390 00:18:04,160 --> 00:18:06,000 Speaker 2: You won't know his name, but you'll know his face. 391 00:18:06,119 --> 00:18:06,640 Speaker 2: That I 392 00:18:08,400 --> 00:18:12,320 Speaker 1: get probably more recently and more often than Don Lemon in 393 00:18:12,359 --> 00:18:15,280 Speaker 1: recent years: Gary Dourdan. 394 00:18:15,160 --> 00:18:15,840 Speaker 2: You don't know the name. 395 00:18:16,080 --> 00:18:18,360 Speaker 1: I wouldn't have known it either, and nobody would say 396 00:18:18,200 --> 00:18:22,280 Speaker 2: that name to me. The black guy from CSI. 397 00:18:23,800 --> 00:18:26,640 Speaker 1: He had the hair and he had light eyes, and that 398 00:18:26,800 --> 00:18:30,600 Speaker 1: is the guy people would come to me and swear 399 00:18:30,680 --> 00:18:31,360 Speaker 1: I was that guy. 400 00:18:33,400 --> 00:18:34,359 Speaker 2: You remember what I'm talking about. 401 00:18:34,480 --> 00:18:35,000 Speaker 4: Yes, I'm trying. 
402 00:18:35,040 --> 00:18:38,159 Speaker 3: I'm just like, that's I just why do people need 403 00:18:38,200 --> 00:18:42,040 Speaker 3: to do that? It's so interesting people have this obsession 404 00:18:42,080 --> 00:18:45,800 Speaker 3: with saying and maybe it's that they recognize you that's hilarious. 405 00:18:45,880 --> 00:18:47,720 Speaker 4: Yes, I know exactly what you're talking about. He just 406 00:18:47,720 --> 00:18:48,480 Speaker 4: showed me the picture. 407 00:18:48,880 --> 00:18:52,639 Speaker 3: But maybe it's because they know, I actually think it's because 408 00:18:52,640 --> 00:18:54,720 Speaker 3: they know, they know your face and they're trying to 409 00:18:54,760 --> 00:18:56,920 Speaker 3: place it, so they just mix you up with every 410 00:18:56,960 --> 00:18:59,920 Speaker 3: other famous light skinned black man they know, or they're 411 00:19:00,240 --> 00:19:02,359 Speaker 3: off there, and that is sad. 412 00:19:02,560 --> 00:19:03,520 Speaker 2: So that there you go. 413 00:19:03,560 --> 00:19:05,800 Speaker 1: That takes me to the Samuel Jackson, Laurence Fishburne one. 414 00:19:05,840 --> 00:19:08,199 Speaker 1: He was doing an interview. Samuel Jackson was doing an interview 415 00:19:08,560 --> 00:19:10,840 Speaker 1: on, I think it was KTLA. I don't 416 00:19:10,880 --> 00:19:12,320 Speaker 1: want to mix it up, but one of the stations. 417 00:19:12,800 --> 00:19:14,760 Speaker 1: And a guy made a reference to him being in 418 00:19:14,760 --> 00:19:18,120 Speaker 1: a Super Bowl commercial, like, and he was thinking about 419 00:19:18,160 --> 00:19:20,280 Speaker 1: Laurence Fishburne, who was in a Super Bowl commercial. And 420 00:19:20,280 --> 00:19:22,880 Speaker 1: Samuel L. Jackson went off, like, yeah, we're both black 421 00:19:22,960 --> 00:19:25,000 Speaker 1: and rich, but we ain't the same person. We don't 422 00:19:25,040 --> 00:19:27,560 Speaker 1: all look alike. 
Actually went off on him at the 423 00:19:27,600 --> 00:19:32,240 Speaker 1: time about it. You remember America Ferrera, Gina Rodriguez? Yes, 424 00:19:32,400 --> 00:19:34,800 Speaker 1: Gina Rodriguez, I think, was nominated for a Golden Globe. 425 00:19:34,800 --> 00:19:35,680 Speaker 2: The Golden Globes on 426 00:19:35,600 --> 00:19:39,280 Speaker 1: their official Twitter account posted that and put out the 427 00:19:39,280 --> 00:19:41,000 Speaker 1: wrong picture between 428 00:19:40,800 --> 00:19:42,640 Speaker 2: America Ferrera and Gina Rodriguez. 429 00:19:42,680 --> 00:19:45,960 Speaker 1: Lucy Liu, Lisa Ling apparently get mixed up a bunch. 430 00:19:46,080 --> 00:19:50,600 Speaker 1: And another ugly incident: Daniel Kaluuya. He was 431 00:19:50,640 --> 00:19:54,560 Speaker 1: in Black Panther, but he won the Oscar for Judas 432 00:19:54,560 --> 00:19:55,320 Speaker 1: and the Black Messiah. 433 00:19:56,040 --> 00:19:56,640 Speaker 2: You remember that? 434 00:19:56,800 --> 00:20:01,040 Speaker 1: Yes? On the red carpet, a reporter asked, how do 435 00:20:01,119 --> 00:20:05,480 Speaker 1: you feel about being directed by Regina King? Regina King directed 436 00:20:05,600 --> 00:20:08,520 Speaker 1: One Night in Miami with Leslie 437 00:20:08,240 --> 00:20:09,480 Speaker 2: Odom Jr. Ouch. 438 00:20:10,160 --> 00:20:12,200 Speaker 1: Yeah, he had to apologize later. And the famous one: 439 00:20:12,280 --> 00:20:14,920 Speaker 1: James Blake, the tennis star. This was almost ten years 440 00:20:14,960 --> 00:20:16,480 Speaker 1: ago now in New York. We were at GMA at 441 00:20:16,480 --> 00:20:18,040 Speaker 1: the time. He came into the studio. 442 00:20:18,240 --> 00:20:21,800 Speaker 3: We remember, after this incident. Oh yes, I absolutely remember this, 443 00:20:21,920 --> 00:20:23,720 Speaker 3: but it was all on video, right. 
444 00:20:23,840 --> 00:20:26,720 Speaker 1: A police officer tackled him because they thought he was 445 00:20:26,800 --> 00:20:29,480 Speaker 1: some other guy, and they had to apologize. So those 446 00:20:29,560 --> 00:20:32,640 Speaker 1: high profile ones happened, but they hurt. Well. 447 00:20:32,680 --> 00:20:34,560 Speaker 3: And you know, it's funny you asked me, and I 448 00:20:34,680 --> 00:20:36,920 Speaker 3: just remembered something, because it's similar to what you're saying. 449 00:20:36,920 --> 00:20:39,160 Speaker 3: This happened at Good Morning America when I was doing 450 00:20:39,200 --> 00:20:39,840 Speaker 3: the Olympics. 451 00:20:40,680 --> 00:20:41,520 Speaker 4: Simone Biles. 452 00:20:41,560 --> 00:20:46,240 Speaker 3: Obviously, oh you remember this, yes. So we go to 453 00:20:46,280 --> 00:20:48,560 Speaker 3: bed and my producer, thank god. 454 00:20:49,960 --> 00:20:52,760 Speaker 4: We had recorded Simone Biles, Simone Biles. 455 00:20:54,200 --> 00:20:58,320 Speaker 3: They put Jordan Chiles in the video, lining up with 456 00:20:58,359 --> 00:21:01,960 Speaker 3: me saying Simone Biles, and my producer, I'm talking 457 00:21:02,119 --> 00:21:05,320 Speaker 3: minutes before it was supposed to air, caught that the 458 00:21:05,400 --> 00:21:09,159 Speaker 3: overnight editor did not know the difference between Simone Biles 459 00:21:09,200 --> 00:21:12,840 Speaker 3: and Jordan Chiles and switched the video. And that 460 00:21:12,880 --> 00:21:15,680 Speaker 3: would have been Can you imagine if that had aired 461 00:21:15,720 --> 00:21:19,800 Speaker 3: on Good Morning America with me saying Simone Biles and 462 00:21:19,880 --> 00:21:21,480 Speaker 3: the audience seeing Jordan Chiles? 463 00:21:21,600 --> 00:21:22,199 Speaker 2: I haven't. 464 00:21:22,800 --> 00:21:25,119 Speaker 1: I kind of remember that now. 
So throughout your career, 465 00:21:25,119 --> 00:21:27,560 Speaker 1: I know there's been so many incidents like that behind 466 00:21:27,560 --> 00:21:29,520 Speaker 1: the scenes, of mixing people up 467 00:21:29,560 --> 00:21:32,680 Speaker 2: and almost getting on the air, or getting on the air. 468 00:21:32,800 --> 00:21:38,200 Speaker 3: Yes, I had another co anchor accidentally say Jesse Jackson 469 00:21:38,440 --> 00:21:40,640 Speaker 3: Junior when it was actually Al Sharpton. 470 00:21:41,080 --> 00:21:45,000 Speaker 4: That was tough. That was tough. Yeah, that was really tough. 471 00:21:45,320 --> 00:21:46,480 Speaker 4: And I think it was 472 00:21:46,440 --> 00:21:48,560 Speaker 3: before the internet was crazy, otherwise that would be on 473 00:21:48,680 --> 00:21:49,400 Speaker 3: YouTube right now. 474 00:21:50,119 --> 00:21:54,960 Speaker 1: So look, I am not offended. I got a little 475 00:21:54,960 --> 00:21:56,520 Speaker 1: I was having fun with it. We were in a good mood. 476 00:21:56,560 --> 00:21:58,680 Speaker 1: We're vacationing, so maybe she caught me at a good time, 477 00:21:59,280 --> 00:22:04,199 Speaker 1: but I don't necessarily get offended by it. But I 478 00:22:04,240 --> 00:22:07,040 Speaker 1: think it's interesting that, we have to admit, she wasn't 479 00:22:07,080 --> 00:22:09,160 Speaker 1: willing to admit that she made a mistake and why 480 00:22:09,200 --> 00:22:10,560 Speaker 1: she made it. If she had, then we could have 481 00:22:10,600 --> 00:22:11,760 Speaker 1: had a meaningful conversation. 482 00:22:12,240 --> 00:22:13,600 Speaker 2: And that's okay. I'm not mad. 483 00:22:13,720 --> 00:22:17,000 Speaker 1: She didn't see anything about me that said anger. So 484 00:22:17,600 --> 00:22:22,560 Speaker 1: it's, it's okay, but it sucks that we 485 00:22:22,960 --> 00:22:25,280 Speaker 1: get put in the position that we have to make 486 00:22:25,800 --> 00:22:27,680 Speaker 1: her feel better about what she did. 
487 00:22:27,760 --> 00:22:29,159 Speaker 4: It's a good way to put it. And you know, 488 00:22:29,200 --> 00:22:30,240 Speaker 4: it's interesting too. 489 00:22:30,560 --> 00:22:33,679 Speaker 3: This has been a fascinating conversation, and fascinating to 490 00:22:33,720 --> 00:22:34,919 Speaker 3: hear the research behind it. 491 00:22:35,200 --> 00:22:38,960 Speaker 4: And I hope that white people who are 492 00:22:38,800 --> 00:22:42,199 Speaker 3: listening can hear this and say, not to 493 00:22:42,520 --> 00:22:45,920 Speaker 3: say, do I do that, but recognize, when you're meeting someone, 494 00:22:45,920 --> 00:22:48,760 Speaker 3: when you're looking at people, just recognize that there might 495 00:22:48,880 --> 00:22:51,160 Speaker 3: just be something in you that needs to be corrected, 496 00:22:51,160 --> 00:22:54,120 Speaker 3: that needs to be refocused, that needs to be made aware. 497 00:22:54,520 --> 00:22:56,880 Speaker 3: And I think this is, I am so appreciative 498 00:22:56,960 --> 00:23:00,479 Speaker 3: of actually seeing what happened yesterday and then seeing and 499 00:23:00,520 --> 00:23:02,600 Speaker 3: hearing the research you just gave me and saying, Okay, 500 00:23:03,119 --> 00:23:05,639 Speaker 3: I can do better. I can recognize that this is 501 00:23:05,680 --> 00:23:08,639 Speaker 3: an issue. Because you asked me, do people mistake you 502 00:23:08,720 --> 00:23:12,040 Speaker 3: for someone else? And I said, no, people tell me 503 00:23:12,080 --> 00:23:14,439 Speaker 3: I look like. Has anyone ever said you look like? 504 00:23:14,800 --> 00:23:17,720 Speaker 3: But no one's ever said, I swore it was this 505 00:23:17,800 --> 00:23:20,560 Speaker 3: person or that person, or you were someone else. Or 506 00:23:20,600 --> 00:23:21,760 Speaker 3: don't you get that all the time? 507 00:23:21,880 --> 00:23:24,480 Speaker 4: No, I have not had that experience. 
508 00:23:25,040 --> 00:23:27,520 Speaker 2: And we need to be clear here. 509 00:23:28,480 --> 00:23:31,760 Speaker 1: Plenty of research out there shows that any racial group 510 00:23:32,000 --> 00:23:36,080 Speaker 1: can have that kind of bias, but it is overwhelmingly 511 00:23:36,200 --> 00:23:38,280 Speaker 1: weighted towards white folks. 512 00:23:39,000 --> 00:23:40,920 Speaker 2: First, just simply weighted. 513 00:23:41,080 --> 00:23:43,600 Speaker 1: But you add to that the fact that we have 514 00:23:43,720 --> 00:23:47,240 Speaker 1: majority white spaces. Yes, and they don't have to improve, 515 00:23:47,359 --> 00:23:49,920 Speaker 1: don't have to get better, don't have to. And that 516 00:23:50,680 --> 00:23:52,640 Speaker 1: is why the focus in the research so often has 517 00:23:52,680 --> 00:23:55,920 Speaker 1: been specifically on white folks in the majority group. 518 00:23:56,040 --> 00:23:57,560 Speaker 4: Yep, and that does make sense. 519 00:23:57,680 --> 00:24:00,200 Speaker 1: Okay, so, you just I have a list here: how 520 00:24:00,200 --> 00:24:03,280 Speaker 1: can you do something about it? I think you actually 521 00:24:03,320 --> 00:24:04,600 Speaker 1: have already, on your own, hit on it. 522 00:24:04,760 --> 00:24:05,200 Speaker 2: I'm serious. 523 00:24:06,119 --> 00:24:08,040 Speaker 1: The first thing I have written down here: give 524 00:24:08,080 --> 00:24:10,320 Speaker 1: a shit. Care enough to pay attention, to dig deeper 525 00:24:10,320 --> 00:24:12,359 Speaker 1: about the individual. And you should do that with anybody, no 526 00:24:12,400 --> 00:24:12,960 Speaker 1: matter what race. 527 00:24:12,960 --> 00:24:13,359 Speaker 2: There you go. 528 00:24:13,720 --> 00:24:15,360 Speaker 1: Meaningful interactions can help. 529 00:24:16,000 --> 00:24:16,680 Speaker 4: That's cool. 530 00:24:16,920 --> 00:24:18,080 Speaker 2: Take time, have a 
531 00:24:18,040 --> 00:24:21,240 Speaker 1: meaningful conversation with somebody, beyond just somebody taking your order 532 00:24:21,359 --> 00:24:23,480 Speaker 1: that happens to be a minority, or somebody checking you 533 00:24:23,560 --> 00:24:25,320 Speaker 1: out at the grocery store that happens to be a minority, 534 00:24:25,400 --> 00:24:27,720 Speaker 1: or whatever it may be. Do that. The other one: 535 00:24:28,040 --> 00:24:30,960 Speaker 1: studying someone's face, like actually being conscious of it, like 536 00:24:31,000 --> 00:24:34,479 Speaker 1: you said. The other thing: being thoughtful can help. And 537 00:24:34,600 --> 00:24:39,359 Speaker 1: look at your friend group. Look at your friend group. 538 00:24:39,680 --> 00:24:42,680 Speaker 1: If you're constantly around people that don't look like you, 539 00:24:42,680 --> 00:24:46,399 Speaker 1: you are caring about someone, you are studying them, you 540 00:24:47,160 --> 00:24:50,280 Speaker 1: are caring enough to notice their features and whatnot. 541 00:24:51,280 --> 00:24:52,160 Speaker 2: Look at your friend group. 542 00:24:52,200 --> 00:24:55,000 Speaker 1: That can help a lot, because you'll have meaningful interactions, 543 00:24:55,000 --> 00:24:57,640 Speaker 1: obviously, with people of other races, and that can help. 544 00:24:57,680 --> 00:24:58,800 Speaker 2: But take a look at your friend group. 545 00:24:59,160 --> 00:25:00,560 Speaker 4: I love that, you know. 546 00:25:00,560 --> 00:25:04,200 Speaker 3: Look, it's something that happened that wasn't traumatic in any way, 547 00:25:04,240 --> 00:25:05,120 Speaker 3: but it, but it 548 00:25:05,080 --> 00:25:07,679 Speaker 4: was interesting, and it certainly was a moment. 549 00:25:07,800 --> 00:25:09,520 Speaker 3: And I love when you can take a moment like 550 00:25:09,600 --> 00:25:12,720 Speaker 3: that and recognize something and learn something. 
551 00:25:12,760 --> 00:25:14,119 Speaker 4: And that's exactly what happened. 552 00:25:14,160 --> 00:25:18,440 Speaker 3: So maybe I appreciate the incident that occurred because 553 00:25:18,240 --> 00:25:19,000 Speaker 4: I learned something. 554 00:25:19,160 --> 00:25:21,280 Speaker 1: Well, it's amazing. We had no idea and had no plan. 555 00:25:21,320 --> 00:25:22,600 Speaker 1: We were on vacation, we 556 00:25:22,440 --> 00:25:24,280 Speaker 3: were delayed. We had a flight delay. We wouldn't have 557 00:25:24,320 --> 00:25:26,400 Speaker 3: been there if the flight hadn't been delayed. We were, wow, well, 558 00:25:26,400 --> 00:25:28,600 Speaker 3: you know, we got there with enough time, and then 559 00:25:28,680 --> 00:25:30,199 Speaker 3: all of a sudden, we got delayed by 560 00:25:30,240 --> 00:25:32,000 Speaker 3: over an hour at least at the airport and then 561 00:25:32,040 --> 00:25:34,479 Speaker 3: another hour on the tarmac. But yeah, that was, and 562 00:25:34,520 --> 00:25:36,679 Speaker 3: it gave us good content. That delay gave us 563 00:25:36,760 --> 00:25:39,080 Speaker 3: really something good to talk about and something to think about, 564 00:25:39,640 --> 00:25:40,600 Speaker 3: and I appreciate it. 565 00:25:41,280 --> 00:25:43,720 Speaker 4: I always love learning something new about 566 00:25:43,800 --> 00:25:46,719 Speaker 3: it's fascinating how we work, why we do what we do, 567 00:25:46,840 --> 00:25:48,960 Speaker 3: and we always can do better. 568 00:25:49,000 --> 00:25:52,879 Speaker 1: Well, folks, always appreciate you spending some time with us. 569 00:25:53,040 --> 00:25:56,280 Speaker 1: For now, for my dear Amy Robach, I am Barack, 570 00:25:56,600 --> 00:25:59,239 Speaker 1: Don, TJ, Gary, whatever you want to call me. When 571 00:25:59,240 --> 00:26:01,960 Speaker 1: you see me out there, folks, just say hello and 572 00:26:01,960 --> 00:26:02,400 Speaker 1: be nice. 
573 00:26:02,400 --> 00:26:05,359 Speaker 2: I'd appreciate it. But we appreciate you listening to us 574 00:26:05,359 --> 00:26:06,960 Speaker 2: as always. We'll talk about it