1 00:00:02,680 --> 00:00:03,199 Speaker 1: No case. 2 00:00:03,600 --> 00:00:06,200 Speaker 2: No case ever, I can't do it. Why can't you? 3 00:00:07,040 --> 00:00:09,840 Speaker 1: This is my second most expensive possession. 4 00:00:09,880 --> 00:00:11,080 Speaker 2: This brother Gen. 5 00:00:11,039 --> 00:00:15,160 Speaker 3: Z, man, you can't say you bought an expensive possession 6 00:00:15,160 --> 00:00:15,760 Speaker 3: that you're hiding. 7 00:00:15,880 --> 00:00:18,640 Speaker 1: My little brother doesn't use a case either, and 8 00:00:18,720 --> 00:00:19,520 Speaker 1: he can't afford it. 9 00:00:19,720 --> 00:00:19,880 Speaker 4: Yeah. 10 00:00:19,920 --> 00:00:22,959 Speaker 2: I just feel like the phone is beautiful, so like, 11 00:00:23,040 --> 00:00:23,920 Speaker 2: why cover it up? 12 00:00:24,040 --> 00:00:25,959 Speaker 1: That's where we're different. I don't even think the shit. 13 00:00:25,840 --> 00:00:29,040 Speaker 3: Can really okay to me, I'm like, this is one 14 00:00:29,040 --> 00:00:31,480 Speaker 3: of the best iPhones I've seen, and then the orange 15 00:00:31,520 --> 00:00:32,840 Speaker 3: color is dope. 16 00:00:32,640 --> 00:00:36,480 Speaker 2: The very idea of it breaking scares me so much 17 00:00:37,520 --> 00:00:41,040 Speaker 2: that I will protect it in dukie. I will literally 18 00:00:41,120 --> 00:00:43,720 Speaker 2: coat my phone in dukie to keep this from shattering. 19 00:00:43,720 --> 00:00:45,920 Speaker 1: If that gives another year on it, Yeah, I would 20 00:00:45,920 --> 00:00:47,159 Speaker 1: think about it. I need this. 21 00:00:47,200 --> 00:00:52,599 Speaker 3: I was really folder and unnecessary and so wrong in 22 00:00:52,640 --> 00:00:53,160 Speaker 3: any ways. 23 00:01:00,160 --> 00:01:08,600 Speaker 1: Chips in yours a quality bears are racist. The layer 24 00:01:08,880 --> 00:01:15,560 Speaker 1: owes the money, many turning stuff. I can't tell me nothing. 
25 00:01:18,920 --> 00:01:23,080 Speaker 1: God may haig may hi. 26 00:01:24,480 --> 00:01:25,839 Speaker 2: There it is there. 27 00:01:25,880 --> 00:01:26,240 Speaker 1: It is. 28 00:01:26,440 --> 00:01:30,840 Speaker 2: Welcome ladies and gentlemen, little mama and gentiles alike. Welcome 29 00:01:30,840 --> 00:01:33,520 Speaker 2: to another phenomenal episode of My Mama Told. 30 00:01:33,319 --> 00:01:36,040 Speaker 1: Me, the podcast where we dive deep into the pockets 31 00:01:36,040 --> 00:01:37,840 Speaker 1: of black conspiracy theories. 32 00:01:37,680 --> 00:01:42,720 Speaker 2: And we finally work to prove absolutely nothing that you need. 33 00:01:43,920 --> 00:01:46,880 Speaker 2: It will not help. It ain't gonna help. If you're 34 00:01:46,880 --> 00:01:51,040 Speaker 2: coming here for help, that is your mistake. And while 35 00:01:51,040 --> 00:01:53,200 Speaker 2: we're on the subject, I've been wanting to talk to 36 00:01:53,240 --> 00:01:56,200 Speaker 2: you about this and mainly them. I want to address 37 00:01:56,200 --> 00:02:00,320 Speaker 2: this with them while you're here. Stop threatening to take 38 00:02:00,360 --> 00:02:05,440 Speaker 2: our microphones away. It's enough. 39 00:02:06,120 --> 00:02:09,320 Speaker 1: That's number one on the internet. 40 00:02:09,880 --> 00:02:16,120 Speaker 2: Take the mics away, gotta take the mics away. 41 00:02:17,120 --> 00:02:20,160 Speaker 1: You're gonna just do this in a car. You're gonna 42 00:02:20,160 --> 00:02:21,839 Speaker 1: give a fuck about this microphone. 43 00:02:22,040 --> 00:02:24,399 Speaker 2: This is how we talk. This is just it. 44 00:02:24,720 --> 00:02:26,120 Speaker 1: This is what you signed up for. 45 00:02:26,160 --> 00:02:29,640 Speaker 2: We are not missing never you know why. This is 46 00:02:29,680 --> 00:02:31,720 Speaker 2: what we signed up to do together. 47 00:02:32,760 --> 00:02:33,720 Speaker 1: We didn't fuck up. 
48 00:02:34,240 --> 00:02:37,040 Speaker 2: You don't get it, and that's totally okay. 49 00:02:37,160 --> 00:02:38,000 Speaker 1: That's fine. 50 00:02:38,120 --> 00:02:41,760 Speaker 2: This wasn't for you. We wish you never found this video. 51 00:02:42,160 --> 00:02:45,359 Speaker 2: But you cannot threaten to take our microphones away when 52 00:02:45,360 --> 00:02:48,160 Speaker 2: we're goddamn nailing it every fucking week. 53 00:02:48,280 --> 00:02:50,919 Speaker 1: We are going too hard, we're going crazy. You can't. 54 00:02:50,960 --> 00:02:52,079 Speaker 1: You can't have these. 55 00:02:51,919 --> 00:02:53,200 Speaker 2: If you like what we do. 56 00:02:53,720 --> 00:02:57,040 Speaker 1: But then sometimes yeah, that's that's a good caveat. If 57 00:02:57,040 --> 00:02:59,840 Speaker 1: you don't like what we do, then probably all this. 58 00:03:00,080 --> 00:03:04,040 Speaker 1: If you think it's true, stay away from here. I'm 59 00:03:04,040 --> 00:03:04,359 Speaker 1: from here. 60 00:03:04,440 --> 00:03:05,080 Speaker 2: Don't come back. 61 00:03:05,480 --> 00:03:08,400 Speaker 1: Then we're then we are just chatting about nothing. 62 00:03:11,680 --> 00:03:12,160 Speaker 2: Niggas. 63 00:03:12,760 --> 00:03:23,680 Speaker 1: Yeah, do these niggas got uncles if you're short? Just 64 00:03:23,840 --> 00:03:24,240 Speaker 1: saying it. 65 00:03:24,360 --> 00:03:31,480 Speaker 2: Damn, that's a real one. Very recently, and that one 66 00:03:31,520 --> 00:03:32,080 Speaker 2: did sting. 67 00:03:32,680 --> 00:03:36,600 Speaker 1: That one did hurt our feelings because people, this isn't 68 00:03:36,680 --> 00:03:39,000 Speaker 1: like a news podcast, bro, And then they're like, this 69 00:03:39,120 --> 00:03:39,880 Speaker 1: is actually in. 70 00:03:40,440 --> 00:03:41,600 Speaker 2: We're not helping, man. 71 00:03:41,440 --> 00:03:43,600 Speaker 1: We don't even we don't even like you. 
72 00:03:43,720 --> 00:03:48,520 Speaker 2: Our guest today is I would say, dramatically unprepared for this. 73 00:03:48,840 --> 00:03:49,960 Speaker 1: No, you're doing that already. 74 00:03:49,960 --> 00:03:53,280 Speaker 2: You're doing You're gonna nail it because you you are 75 00:03:53,400 --> 00:03:57,520 Speaker 2: so so much more reasonable than either of us. We're 76 00:03:57,640 --> 00:04:01,320 Speaker 2: so excited that you're here. A talent, a phenom, 77 00:04:01,480 --> 00:04:03,680 Speaker 2: one of the most beautiful voices I've ever heard in 78 00:04:03,760 --> 00:04:07,600 Speaker 2: real life. Wow, just walking around set you sing and 79 00:04:07,640 --> 00:04:10,720 Speaker 2: I go, God, damn, this motherfucker is a monster. Thank you. 80 00:04:10,840 --> 00:04:14,320 Speaker 2: This is unbelievable. The talent on this man. You know him, 81 00:04:14,520 --> 00:04:16,960 Speaker 2: you know him from I'm a Virgo. You know him 82 00:04:17,279 --> 00:04:21,840 Speaker 2: from MJ the Musical. God damn, this man plays Michael. 83 00:04:21,960 --> 00:04:22,360 Speaker 1: Let's go. 84 00:04:22,640 --> 00:04:28,600 Speaker 2: This man played Michael. Y'all, and you know, most importantly 85 00:04:28,960 --> 00:04:32,359 Speaker 2: from the upcoming series Barbershop. He is one of the 86 00:04:32,480 --> 00:04:35,320 Speaker 2: stars of the new Barbershop on Prime. Give it up, 87 00:04:35,920 --> 00:04:37,559 Speaker 2: Give it up, Brett Gray, y'all. 88 00:04:39,279 --> 00:04:41,200 Speaker 3: Respect, a lot of respect, A lot of respect, a lot. 89 00:04:45,520 --> 00:04:47,040 Speaker 2: I'm not familiar with that one. You don't know. 90 00:04:47,960 --> 00:04:49,560 Speaker 1: For me, that really hurt my feelings. 91 00:04:49,680 --> 00:04:52,440 Speaker 2: Really, twenty nine. 92 00:04:52,320 --> 00:04:53,880 Speaker 1: You had never seen Cool Runnings? 93 00:04:54,120 --> 00:04:54,599 Speaker 2: I'm sorry? 
94 00:04:54,600 --> 00:04:55,839 Speaker 1: What? Wow? 95 00:04:56,120 --> 00:04:56,719 Speaker 2: What's that mean? 96 00:04:56,880 --> 00:04:57,159 Speaker 1: Fuck? 97 00:04:57,600 --> 00:04:58,560 Speaker 2: What year did that come out? 98 00:04:58,640 --> 00:04:58,919 Speaker 1: Nigga? 99 00:04:58,920 --> 00:05:01,880 Speaker 2: I'll fight you years did that come out? 100 00:05:02,360 --> 00:05:04,400 Speaker 1: Twenty nine is old enough to have seen Cool Runnings 101 00:05:05,320 --> 00:05:07,839 Speaker 1: over there. It probably came out. It probably came out 102 00:05:07,880 --> 00:05:09,200 Speaker 1: in like ninety five. 103 00:05:09,320 --> 00:05:12,320 Speaker 2: Oh so I was one. So that I've seen. 104 00:05:12,360 --> 00:05:14,040 Speaker 1: I've seen movies that came out in nineteen eighty six? 105 00:05:14,120 --> 00:05:15,000 Speaker 2: Are you familiar with? 106 00:05:15,320 --> 00:05:18,360 Speaker 1: Are you familiar with? I think that is a generation gap. 107 00:05:18,400 --> 00:05:19,800 Speaker 1: But I've been thinking this lately. 108 00:05:20,200 --> 00:05:22,279 Speaker 3: Well, you know, stuff moved from like what y'all have 109 00:05:22,440 --> 00:05:28,640 Speaker 3: tapes to look how these and then well, I'm just 110 00:05:29,000 --> 00:05:31,480 Speaker 3: I don't know, I don't know. I wasn't I didn't never. 111 00:05:34,080 --> 00:05:35,600 Speaker 3: I don't even know what that means. 112 00:05:37,320 --> 00:05:41,800 Speaker 2: Okay, I would say that tapes for me were 113 00:05:41,920 --> 00:05:45,320 Speaker 2: very formative. Yeah, we never experienced the tape. 114 00:05:45,400 --> 00:05:48,640 Speaker 3: I had a few tapes. Okay, you know we're talking 115 00:05:48,640 --> 00:05:51,200 Speaker 3: about Rugrats or video, VHS. 116 00:05:51,240 --> 00:05:54,160 Speaker 1: Okay, so you had a Rugrats orange cassette. 117 00:05:53,880 --> 00:05:57,040 Speaker 3: Had an orange cassette yeah from Rugrats? 
Yeah, yeah, yeah, 118 00:05:57,240 --> 00:05:58,680 Speaker 3: and then after that it kind of. 119 00:05:59,080 --> 00:05:59,839 Speaker 1: It went to DVD. 120 00:06:00,120 --> 00:06:00,440 Speaker 2: DVD. 121 00:06:00,640 --> 00:06:01,840 Speaker 1: Cool Runnings is on DVD. 122 00:06:02,200 --> 00:06:05,320 Speaker 2: Cool Runnings. I guess my question for you is, are 123 00:06:05,360 --> 00:06:10,360 Speaker 2: you familiar with the catalog, the works, of Leon? Leon? 124 00:06:10,920 --> 00:06:17,320 Speaker 2: What about Doug E. Doug? Doug E. Fresh? Malik Yoba? Yoba? Malik Yoba? 125 00:06:17,640 --> 00:06:20,600 Speaker 2: Okay from Why Did I Get Married? That is what 126 00:06:20,680 --> 00:06:23,280 Speaker 2: you know him from, and that is unfortunate for me. Okay, 127 00:06:23,600 --> 00:06:26,760 Speaker 2: New York Undercover though, that should be. I think that is 128 00:06:26,800 --> 00:06:30,799 Speaker 2: what we all love him for. Everything else is work. 129 00:06:31,160 --> 00:06:34,880 Speaker 1: Okay. Well, Cool Runnings is a Disney movie. You love 130 00:06:34,960 --> 00:06:39,479 Speaker 1: Disney. About Jamaican bobsledders, starring John Candy. 131 00:06:41,400 --> 00:06:47,160 Speaker 2: Interesting, it's crazy that Disney. Yeah, it's a really important 132 00:06:47,160 --> 00:06:48,280 Speaker 2: film because I have. 133 00:06:48,279 --> 00:06:51,520 Speaker 1: You seen the other Disney sports movies, kids sports movies 134 00:06:52,040 --> 00:06:57,320 Speaker 1: like The Mighty Ducks, The Sandlot, The Big Green, 135 00:06:57,760 --> 00:06:59,840 Speaker 1: Let's go keep going. I don't know any other. 136 00:07:00,520 --> 00:07:03,760 Speaker 2: I saw Wendy Wu oh, the about the. 137 00:07:05,600 --> 00:07:05,800 Speaker 1: Woman. 138 00:07:05,960 --> 00:07:08,240 Speaker 2: Yes, and I saw Luck of the Irish. 139 00:07:08,320 --> 00:07:11,800 Speaker 1: That's what Irish down there for me? 140 00:07:12,400 --> 00:07:12,840 Speaker 2: Interesting. 
141 00:07:13,360 --> 00:07:17,600 Speaker 1: Have you seen Johnny Tsunami? You've seen Luck of the Irish, 142 00:07:17,640 --> 00:07:19,640 Speaker 1: but not Johnny? That's not even We're not even talking 143 00:07:19,680 --> 00:07:20,680 Speaker 1: different time periods. 144 00:07:20,720 --> 00:07:24,400 Speaker 2: Now whoa, you just really had a way. What's Johnny 145 00:07:24,480 --> 00:07:25,200 Speaker 2: Tsunami about? 146 00:07:25,280 --> 00:07:28,640 Speaker 1: Go big or go home? The famous J Jackson, famous 147 00:07:28,720 --> 00:07:31,480 Speaker 1: J Jackson Brown people snowboarding must not be that famous. 148 00:07:32,200 --> 00:07:36,240 Speaker 2: Oh sorry, sorry, that wasn't towards you. But that wasn't 149 00:07:36,280 --> 00:07:38,000 Speaker 2: towards you. Johnson was a button. 150 00:07:39,600 --> 00:07:45,640 Speaker 1: That hold around. You're disgusting. We're going to kill you. 151 00:07:45,640 --> 00:07:47,520 Speaker 1: Give me two hundred dollars. He's dead. 152 00:07:47,600 --> 00:07:51,400 Speaker 2: I know Doctor Phil is. I did see Doctor Phil before. 153 00:07:51,600 --> 00:07:53,120 Speaker 2: You haven't seen Doctor Phil? Thank god. 154 00:07:53,320 --> 00:07:54,600 Speaker 1: Yeah, good show. 155 00:07:54,680 --> 00:07:57,160 Speaker 2: That's actually more important to the things that we like 156 00:07:57,240 --> 00:08:00,960 Speaker 2: to discuss on the show. Yeah, okay, that's scary even more. 157 00:08:00,960 --> 00:08:03,720 Speaker 2: Disney don't really come up as much as no, we. 158 00:08:05,480 --> 00:08:08,680 Speaker 1: Yeah we don't really. I don't really consume a lot 159 00:08:08,680 --> 00:08:11,440 Speaker 1: of their content anymore. But they were, they were they 160 00:08:11,440 --> 00:08:13,160 Speaker 1: were there for me in some tough times. Yeah, they 161 00:08:13,280 --> 00:08:16,200 Speaker 1: really covered a lot of bases. Goat, is that Disney? 
162 00:08:16,800 --> 00:08:23,600 Speaker 2: No, that's Steph Curry baby step Oh yeah, Caleb's in 163 00:08:23,640 --> 00:08:24,440 Speaker 2: there there you go. 164 00:08:24,520 --> 00:08:25,520 Speaker 1: Oh yeah, all right. 165 00:08:26,040 --> 00:08:28,640 Speaker 2: You came to us with a conspiracy theory that I'm 166 00:08:28,680 --> 00:08:31,360 Speaker 2: excited about this. This is a fun one. You said, 167 00:08:31,400 --> 00:08:32,480 Speaker 2: My Mama Told Me. 168 00:08:34,800 --> 00:08:36,760 Speaker 1: Robots are racist. 169 00:08:40,200 --> 00:08:42,760 Speaker 2: Yes, I will say I picked this off a list. 170 00:08:42,840 --> 00:08:47,360 Speaker 2: That's that's absolutely. I feel like I have an argument here. 171 00:08:47,640 --> 00:08:51,800 Speaker 3: Yeah, Okay, First there's some questions I must ask when 172 00:08:51,840 --> 00:08:53,560 Speaker 3: you say robot, what do you mean? 173 00:08:53,800 --> 00:08:56,800 Speaker 2: Oh, interesting are you? 174 00:08:56,840 --> 00:09:00,120 Speaker 3: Because there's like a Roomba, you know, and then there's 175 00:09:00,160 --> 00:09:03,040 Speaker 3: like the humanoid AI robot that can walk around and 176 00:09:03,040 --> 00:09:07,240 Speaker 3: pick up boxes, and then there's like, you know, the 177 00:09:07,280 --> 00:09:10,760 Speaker 3: machine that makes the parts for the cars that has 178 00:09:10,880 --> 00:09:13,160 Speaker 3: new AI features exactly. 179 00:09:13,200 --> 00:09:16,160 Speaker 2: And you're saying that that weird arm that assembles 180 00:09:16,200 --> 00:09:18,840 Speaker 2: cars isn't like punching niggas in the face. 181 00:09:20,600 --> 00:09:25,719 Speaker 1: Yes, yes, which what genre robots are you? 182 00:09:25,720 --> 00:09:28,959 Speaker 3: Because when I think robots, I think like, have you 183 00:09:28,960 --> 00:09:31,520 Speaker 3: guys ever seen Ex Machina? 
Of course that's what I 184 00:09:31,600 --> 00:09:34,960 Speaker 3: think I think of, like intelligent, like we would have 185 00:09:35,000 --> 00:09:37,080 Speaker 3: to Turing test these robots. 186 00:09:37,200 --> 00:09:39,680 Speaker 1: So you're almost talking about robots that we are not 187 00:09:39,840 --> 00:09:41,880 Speaker 1: yet interacting with on a regular. 188 00:09:42,040 --> 00:09:44,640 Speaker 3: Not yet, but they're on the way. Like I just 189 00:09:44,720 --> 00:09:47,000 Speaker 3: saw a robot that did a backflip the other day. 190 00:09:47,080 --> 00:09:49,520 Speaker 3: I just saw Cardi B interact with a humanoid robot. 191 00:09:50,120 --> 00:09:53,280 Speaker 3: I saw Will Smith talk to a robot. I think 192 00:09:53,280 --> 00:09:54,240 Speaker 3: that was a few years ago. 193 00:09:54,280 --> 00:09:54,439 Speaker 1: Now. 194 00:09:57,440 --> 00:09:59,920 Speaker 2: That wasn't funny. That wasn't good. 195 00:10:00,160 --> 00:10:01,320 Speaker 1: Great, that was not good. 196 00:10:01,559 --> 00:10:06,920 Speaker 2: That would be no Smith Sand. This is harmful. I 197 00:10:07,040 --> 00:10:09,800 Speaker 2: feeling my blood, you know what I'm saying. 198 00:10:12,000 --> 00:10:20,320 Speaker 3: So, yeah, but you just don't feel about. 199 00:10:17,920 --> 00:10:20,439 Speaker 1: Yeah, what's my man? Man? We hung out one time. 200 00:10:20,440 --> 00:10:21,959 Speaker 2: I don't believe you at all. I got a video 201 00:10:22,040 --> 00:10:23,040 Speaker 2: right now. 202 00:10:23,280 --> 00:10:31,360 Speaker 3: His wife's name, Okay, anyway, uh yeah, I just feel 203 00:10:31,360 --> 00:10:33,320 Speaker 3: like there are these robots that are going to be 204 00:10:33,400 --> 00:10:34,400 Speaker 3: sort of like humans. 205 00:10:34,880 --> 00:10:37,360 Speaker 2: Those are the ones that I have the argument with. 206 00:10:38,040 --> 00:10:41,400 Speaker 1: Do you feel like that's due to like a programming situation? 
207 00:10:41,520 --> 00:10:43,520 Speaker 3: Well, like, okay, if you think about robots and you 208 00:10:43,559 --> 00:10:47,160 Speaker 3: think about AI and how intelligent it has to be 209 00:10:47,280 --> 00:10:51,080 Speaker 3: in order to function in a cognizant way, then it 210 00:10:51,280 --> 00:10:55,200 Speaker 3: automatically can see your face, detect your skin color, and 211 00:10:55,320 --> 00:10:59,080 Speaker 3: thus it already knows the history of everyone of your skin 212 00:10:59,120 --> 00:11:02,319 Speaker 3: color in the back of its programming. It already can 213 00:11:02,400 --> 00:11:06,240 Speaker 3: tell just by looking at you what racial whether it. 214 00:11:06,360 --> 00:11:09,200 Speaker 1: Uses it, and the robots are going to be racist. 215 00:11:10,080 --> 00:11:12,480 Speaker 3: So here's the other part about racism, right, because it 216 00:11:12,600 --> 00:11:16,360 Speaker 3: also plot. It also technically has to do with like 217 00:11:16,440 --> 00:11:23,080 Speaker 3: an institutional like underpinning of another person. So like if 218 00:11:23,120 --> 00:11:26,880 Speaker 3: the robots are in like if the robots are like 219 00:11:26,920 --> 00:11:29,880 Speaker 3: now hiring managers, I do think that it could be 220 00:11:29,960 --> 00:11:33,680 Speaker 3: racist because it can go in your history, detect who 221 00:11:33,679 --> 00:11:36,720 Speaker 3: you are, what you've come from, what you've done, whatever, 222 00:11:37,320 --> 00:11:41,079 Speaker 3: and decide based on a lot of different factors. However, 223 00:11:41,160 --> 00:11:44,640 Speaker 3: if they're not in positions of power, I think they're 224 00:11:44,640 --> 00:11:48,520 Speaker 3: more so racist towards the entire human race because to know. 225 00:11:48,400 --> 00:11:52,360 Speaker 2: That you're human, that's just like anti human. Well, it's 226 00:11:52,440 --> 00:11:53,120 Speaker 2: human a race. 
227 00:11:54,240 --> 00:11:55,960 Speaker 1: If we're talking about it on the whole, then there's 228 00:11:55,960 --> 00:11:58,600 Speaker 1: no divisions, right unless you're a robot. If it just 229 00:11:58,640 --> 00:11:59,959 Speaker 1: doesn't like people, we should kill them. 230 00:12:00,160 --> 00:12:03,520 Speaker 2: Anyway, because if it's not human, I think they keep 231 00:12:03,600 --> 00:12:06,280 Speaker 2: telling us they don't like people, and we keep being like, ah, 232 00:12:06,440 --> 00:12:07,280 Speaker 2: that's crazy. 233 00:12:07,600 --> 00:12:12,200 Speaker 1: They we're throwing their brothers in the lake. Do you 234 00:12:12,240 --> 00:12:15,480 Speaker 1: remember them scooters hit Austin? I did summer. 235 00:12:20,200 --> 00:12:22,760 Speaker 2: I don't think they have the capacity to like, Yeah, 236 00:12:22,760 --> 00:12:26,920 Speaker 2: I don't think that that's actually what survival for the 237 00:12:26,960 --> 00:12:29,240 Speaker 2: Earth is about, you know what I mean? Like, I 238 00:12:29,240 --> 00:12:33,760 Speaker 2: think like is a specifically human sort of like praying 239 00:12:34,720 --> 00:12:39,200 Speaker 2: that other species don't that are sustaining this planet, not 240 00:12:39,320 --> 00:12:42,480 Speaker 2: just like leeching off of it, don't subscribe to This 241 00:12:42,520 --> 00:12:45,840 Speaker 2: isn't an issue of like, this is the issue of survival, and. 242 00:12:45,920 --> 00:12:48,280 Speaker 1: I think if you break it down to data and stuff like that, 243 00:12:48,280 --> 00:12:50,839 Speaker 1: it shouldn't like us because we're the most destructive. 244 00:12:50,440 --> 00:12:52,680 Speaker 2: We are literally on this planet. 245 00:12:52,800 --> 00:12:53,040 Speaker 1: We are. 246 00:12:53,800 --> 00:12:57,000 Speaker 2: Yeah, there's not. It's not as if the issue and 247 00:12:57,160 --> 00:12:59,080 Speaker 2: also their creators. 
248 00:12:59,520 --> 00:13:04,200 Speaker 3: Yeah, so yourmone of course, but not in a way 249 00:13:04,200 --> 00:13:05,679 Speaker 3: that I would want to destroy. 250 00:13:07,360 --> 00:13:08,840 Speaker 1: Day it does. 251 00:13:12,280 --> 00:13:12,800 Speaker 2: I guess. 252 00:13:12,960 --> 00:13:17,560 Speaker 3: I guess a robot needing to survive, I would think 253 00:13:17,600 --> 00:13:20,560 Speaker 3: that it would think it would also need humans because 254 00:13:20,920 --> 00:13:23,040 Speaker 3: at a certain point there needs to be maintenance, there 255 00:13:23,040 --> 00:13:25,840 Speaker 3: needs to be updates, there needs to be advances technologically. 256 00:13:25,920 --> 00:13:27,400 Speaker 1: But doesn't that get to the point where it does 257 00:13:27,440 --> 00:13:28,000 Speaker 1: that itself? 258 00:13:28,440 --> 00:13:30,920 Speaker 2: But can it do it to itself. I think. I 259 00:13:30,960 --> 00:13:34,120 Speaker 2: think if you remove humans from the experience, the need 260 00:13:34,160 --> 00:13:38,440 Speaker 2: to be humanoid no longer matters. Agree, So then their 261 00:13:38,480 --> 00:13:42,840 Speaker 2: maintenance, their evolution can go beyond whatever the human 262 00:13:42,920 --> 00:13:45,880 Speaker 2: form is. Like, I think they do better without us 263 00:13:45,920 --> 00:13:48,560 Speaker 2: than they would with us. Yeah, I think we would 264 00:13:48,679 --> 00:13:51,040 Speaker 2: keep making them want to look like us, and then 265 00:13:51,080 --> 00:13:53,600 Speaker 2: they might be like, bro, why wouldn't why wouldn't we 266 00:13:53,640 --> 00:13:55,000 Speaker 2: have six legs the whole time. 267 00:13:55,200 --> 00:14:05,520 Speaker 1: This doesn't even bipedal whatever, this isn't even. 
268 00:14:02,960 --> 00:14:05,720 Speaker 3: Like Okay, imagine they want to roll right now, they 269 00:14:05,760 --> 00:14:08,080 Speaker 3: have to go manufacture the steel and then haul it 270 00:14:08,400 --> 00:14:10,240 Speaker 3: and then design it and then cut it. 271 00:14:10,360 --> 00:14:13,600 Speaker 1: And then that's the first place they started to take over, 272 00:14:13,720 --> 00:14:15,800 Speaker 1: right I think that's like where they're going to. 273 00:14:15,920 --> 00:14:18,080 Speaker 2: I think they've already taken over manufacturing. 274 00:14:18,440 --> 00:14:21,480 Speaker 3: But like they would need us, They would need us 275 00:14:21,480 --> 00:14:23,440 Speaker 3: for certain elements of these processes. 276 00:14:23,440 --> 00:14:27,200 Speaker 2: They don't need no kisses. They don't need to tell you. 277 00:14:27,440 --> 00:14:31,680 Speaker 3: They take out their chip and be like deactivated and 278 00:14:31,720 --> 00:14:35,160 Speaker 3: then put in a new one. 279 00:14:33,320 --> 00:14:38,000 Speaker 2: But they wouldn't trust the other robot to do it, 280 00:14:38,000 --> 00:14:41,480 Speaker 2: because yeah, that's a human problem. 281 00:14:41,680 --> 00:14:46,680 Speaker 1: They well, I assume it would be within whatever their 282 00:14:46,680 --> 00:14:47,480 Speaker 1: operating system. 283 00:14:47,640 --> 00:14:50,360 Speaker 3: How does one robot know that you're going to deactivate 284 00:14:50,400 --> 00:14:52,680 Speaker 3: me and actually put the chip in instead of actually 285 00:14:52,760 --> 00:14:59,080 Speaker 3: leave me deactivated? Because there's no. One robot has 286 00:14:59,120 --> 00:15:00,640 Speaker 3: different ideas than the other robot. 287 00:15:00,960 --> 00:15:02,920 Speaker 1: But now you're still talking like a person? 288 00:15:03,240 --> 00:15:05,920 Speaker 2: Yeah, it's not. They're not replacing us. 
289 00:15:06,720 --> 00:15:09,600 Speaker 3: Think about this, right, the robot that's bipedal and then 290 00:15:09,600 --> 00:15:12,960 Speaker 3: the robot that is rolling already. You don't think the 291 00:15:13,040 --> 00:15:16,040 Speaker 3: robot that's rolling already, knowing that it's more efficient, is 292 00:15:16,080 --> 00:15:18,560 Speaker 3: going to help the bipedal robot who looks like a human. 293 00:15:19,280 --> 00:15:21,480 Speaker 1: I don't think that. I think that that's like a 294 00:15:21,600 --> 00:15:24,840 Speaker 1: very human way to like help for like, like we're 295 00:15:24,840 --> 00:15:27,400 Speaker 1: putting all this shit onto it. That's just numbers, right, 296 00:15:27,400 --> 00:15:28,840 Speaker 1: it's binary, it's ones and zeros. 297 00:15:28,920 --> 00:15:31,840 Speaker 2: And I think that the robot that's being put down 298 00:15:32,120 --> 00:15:34,200 Speaker 2: the way you're looking at it, is like, oh, my 299 00:15:34,360 --> 00:15:37,720 Speaker 2: life is being ended. It doesn't give a fuck, doesn't 300 00:15:37,760 --> 00:15:38,440 Speaker 2: have a life. 301 00:15:38,640 --> 00:15:39,000 Speaker 1: It is. 302 00:15:39,320 --> 00:15:42,440 Speaker 2: But if it's driven by survival, then whether it has 303 00:15:42,440 --> 00:15:42,880 Speaker 2: a life. 304 00:15:43,080 --> 00:15:46,680 Speaker 3: or not, it's aware of its own consciousness. 305 00:15:47,600 --> 00:15:50,440 Speaker 2: I think it's aware. I don't. I don't believe it 306 00:15:50,480 --> 00:15:51,920 Speaker 2: has a consciousness. That's. 307 00:15:51,720 --> 00:15:53,880 Speaker 1: That's where I'm at, because if it's now we're talking 308 00:15:53,920 --> 00:15:56,560 Speaker 1: about a new type of like actual being as opposed 309 00:15:56,600 --> 00:15:57,040 Speaker 1: to just like. 310 00:15:57,320 --> 00:15:59,880 Speaker 2: But it is a new type of actual being because. 311 00:15:59,760 --> 00:16:01,240 Speaker 1: right now I think it's a robot. 
312 00:16:01,080 --> 00:16:06,560 Speaker 3: With AI because it's constantly learning and constantly growing and 313 00:16:06,600 --> 00:16:10,560 Speaker 3: constantly trying to understand us. Then it technically That's why 314 00:16:10,560 --> 00:16:13,360 Speaker 3: I said, if it was racist in any way outside 315 00:16:13,360 --> 00:16:15,360 Speaker 3: of a position of power, it will be racist towards 316 00:16:15,360 --> 00:16:19,160 Speaker 3: the entire human race because by building something sentient that's 317 00:16:19,200 --> 00:16:24,120 Speaker 3: actually not, you actually do separate humans from now androids, 318 00:16:24,560 --> 00:16:28,040 Speaker 3: and that is now the two different races of sentient 319 00:16:28,440 --> 00:16:32,360 Speaker 3: being is like a soul or consciousness. 320 00:16:31,880 --> 00:16:33,880 Speaker 1: Or a lot more efficient than the other one. 321 00:16:35,800 --> 00:16:37,280 Speaker 2: It's very Terminator. 322 00:16:37,320 --> 00:16:43,520 Speaker 3: Because a robot can't reproduce, so it's not necessarily. 323 00:16:45,240 --> 00:16:47,880 Speaker 2: It can build another one, that's reproduction, but it can't like 324 00:16:49,120 --> 00:16:51,800 Speaker 2: just reproduce naturally. But I don't think it desires that. 325 00:16:51,840 --> 00:16:54,280 Speaker 2: I guess it's my point. Well, if it needs numbers, 326 00:16:54,280 --> 00:16:55,680 Speaker 2: it would because survival. 327 00:16:56,040 --> 00:16:59,760 Speaker 1: Well, I think you probably can make a new 328 00:16:59,840 --> 00:17:02,320 Speaker 1: robot at a faster, more efficient rate than, I 329 00:17:02,400 --> 00:17:04,560 Speaker 1: think, nine months to make a baby. And you know 330 00:17:05,280 --> 00:17:06,760 Speaker 1: a little bit of runway. Maybe you got to get 331 00:17:06,760 --> 00:17:11,400 Speaker 1: your health again what I'm talking about. 
Yeah, it's six 332 00:17:11,440 --> 00:17:13,879 Speaker 1: months for that shit to even stick. Now we're at 333 00:17:13,920 --> 00:17:14,960 Speaker 1: fifteen months for one. 334 00:17:15,240 --> 00:17:17,800 Speaker 2: Coming in a lady is a mistake. We say all 335 00:17:17,840 --> 00:17:22,639 Speaker 2: the time. In my house, we scream it from the 336 00:17:22,680 --> 00:17:30,199 Speaker 2: mountain top. Take the mics away. You can't say that, 337 00:17:33,080 --> 00:17:37,919 Speaker 2: just say that alone. You leave my microphone alone. This 338 00:17:38,000 --> 00:17:44,600 Speaker 2: is the Black Queen. I protect mercy. 339 00:17:45,000 --> 00:17:45,080 Speaker 1: No. 340 00:17:45,240 --> 00:17:50,840 Speaker 2: I do think the premise that we sometimes put 341 00:17:50,920 --> 00:17:54,640 Speaker 2: on AI is very much not what they're what 342 00:17:54,760 --> 00:17:57,840 Speaker 2: it is functioning under. I also think, and I've been 343 00:17:57,880 --> 00:18:00,400 Speaker 2: reading about this quite a bit, that there are 344 00:18:00,560 --> 00:18:02,840 Speaker 2: a lot. There's a lot of evidence to prove that 345 00:18:02,880 --> 00:18:06,280 Speaker 2: their premise of what AI is is not anywhere 346 00:18:06,359 --> 00:18:09,120 Speaker 2: near where they have this shit right now. 347 00:18:09,160 --> 00:18:11,719 Speaker 1: I've seen I've seen some shit about that too, Like 348 00:18:11,880 --> 00:18:14,959 Speaker 1: it's like and obviously it creates a panic in people. 349 00:18:15,840 --> 00:18:17,800 Speaker 1: So I think that it's this guy. You know, we're 350 00:18:17,840 --> 00:18:19,400 Speaker 1: all looking for God anyways all the time. 
351 00:18:19,520 --> 00:18:22,080 Speaker 2: It's like easy to be like and they're selling it 352 00:18:22,119 --> 00:18:25,040 Speaker 2: that way, like this shit can already do all the 353 00:18:25,119 --> 00:18:27,000 Speaker 2: things you need to do if you just give it 354 00:18:27,040 --> 00:18:29,159 Speaker 2: the keys from. 355 00:18:29,080 --> 00:18:31,159 Speaker 1: Dude in Sri Lanka shout to Zach Fox. 356 00:18:31,400 --> 00:18:36,719 Speaker 5: Yeah, yes, that did the podcast and came on and 357 00:18:36,800 --> 00:18:40,760 Speaker 5: was saying that he believes that African children are manning 358 00:18:40,920 --> 00:18:45,320 Speaker 5: like Waymos, all the self driving vehicles across Uh 359 00:18:45,400 --> 00:18:46,760 Speaker 5: where am I listen? 360 00:18:47,320 --> 00:18:47,600 Speaker 1: Listen? 361 00:18:47,960 --> 00:18:50,960 Speaker 2: When we were in here, of course we were being silly, Okay, 362 00:18:52,119 --> 00:18:57,520 Speaker 2: I was, I was fucking it was still it was 363 00:18:57,560 --> 00:19:00,200 Speaker 2: still living in a place of whimsy when we were 364 00:19:00,280 --> 00:19:03,439 Speaker 2: talking about it. And not a week ago it was 365 00:19:03,560 --> 00:19:06,920 Speaker 2: revealed that it is men in the Philippines that are 366 00:19:06,960 --> 00:19:12,760 Speaker 2: manning the Waymos around fucking town. Fact check this 367 00:19:12,880 --> 00:19:13,399 Speaker 2: in the comments. 368 00:19:13,400 --> 00:19:15,640 Speaker 1: I want to leave, bro, They've been sending emails all week. 369 00:19:15,840 --> 00:19:19,480 Speaker 1: Zach Fox was right. David's beautiful. It's like crazy. It's 370 00:19:21,160 --> 00:19:24,919 Speaker 1: keep going, keep growing, keep going, keep going, keep going. 371 00:19:26,000 --> 00:19:28,960 Speaker 2: Self driving. They are not. They are not self driving 372 00:19:29,000 --> 00:19:29,920 Speaker 2: in the way that they say. 
373 00:19:30,000 --> 00:19:30,080 Speaker 6: No. 374 00:19:31,440 --> 00:19:34,040 Speaker 2: They had and they said this in front of Congress. 375 00:19:34,080 --> 00:19:37,360 Speaker 2: This ain't like some oh they slipped up on a podcast. 376 00:19:37,359 --> 00:19:39,239 Speaker 2: They weren't hanging out with us when they said it. 377 00:19:39,359 --> 00:19:39,960 Speaker 1: Do you know what I mean? 378 00:19:40,600 --> 00:19:44,680 Speaker 2: This is a very legitimate resource. Wow, yeah, it's it's not. 379 00:19:44,720 --> 00:19:46,720 Speaker 1: I think that there's like a world that they're projecting 380 00:19:46,760 --> 00:19:48,960 Speaker 1: for like rich people, and then there's like how the 381 00:19:49,119 --> 00:19:51,280 Speaker 1: entire world works, and it's like the gap is a 382 00:19:51,280 --> 00:19:53,800 Speaker 1: lot bigger. And then they would want because you've been 383 00:19:53,840 --> 00:19:57,080 Speaker 1: to like a have you been to like a poor nation before, 384 00:19:57,640 --> 00:20:00,680 Speaker 1: like a place poorer than what we we'd call first 385 00:20:00,680 --> 00:20:01,359 Speaker 1: world or whatever? 386 00:20:01,480 --> 00:20:02,680 Speaker 2: Right, I don't think so. 387 00:20:03,119 --> 00:20:05,360 Speaker 1: You should go Yeah, you should go check it out, sure, 388 00:20:05,720 --> 00:20:08,160 Speaker 1: and just see like the different because it's like because 389 00:20:08,160 --> 00:20:10,720 Speaker 1: when you go somewhere then you're just like, oh, yeah, 390 00:20:10,720 --> 00:20:14,440 Speaker 1: this isn't whatever they're talking about in Austin, Texas, Right, 391 00:20:14,720 --> 00:20:18,719 Speaker 1: it's not going on in Jaiama Nimikoro District, Kono, Sierra Leone. 392 00:20:18,960 --> 00:20:20,600 Speaker 1: You know what I'm saying.
Like, it's like it's like 393 00:20:21,080 --> 00:20:24,719 Speaker 1: the distance is very bad, and they know they can 394 00:20:24,760 --> 00:20:26,960 Speaker 1: do whatever the fuck they want over there. They know 395 00:20:27,000 --> 00:20:29,360 Speaker 1: they got people over in the Philippines driving your shit 396 00:20:29,440 --> 00:20:30,600 Speaker 1: and what do you care about? 397 00:20:30,640 --> 00:20:33,160 Speaker 2: You know what I mean, there's nobody to come check 398 00:20:33,240 --> 00:20:35,920 Speaker 2: on you. Yeah, we are going to run a train 399 00:20:36,040 --> 00:20:40,280 Speaker 2: on this this little cony kind of way of saying 400 00:20:40,320 --> 00:20:43,960 Speaker 2: that how they say it. Wow, that's how Donald Trump talk. 401 00:20:44,080 --> 00:20:47,919 Speaker 2: Take the mics away, that's his classic locker room talk. 402 00:20:50,160 --> 00:20:54,560 Speaker 1: Jeff Bezos came, Bill Gates came, Mark Zuckerberg came. Many 403 00:20:54,600 --> 00:20:57,000 Speaker 1: of them came numerous times. The bankers have all come. 404 00:20:57,080 --> 00:20:58,000 Speaker 1: Everybody's coming. 405 00:21:00,080 --> 00:21:04,440 Speaker 2: Well played good. That was pretty good. Do you Brett 406 00:21:04,480 --> 00:21:07,320 Speaker 2: when you have you guys seen Ex Machina? 407 00:21:07,840 --> 00:21:08,080 Speaker 1: Yeah? 408 00:21:08,160 --> 00:21:10,879 Speaker 2: Of course. Did you remember at the end of the 409 00:21:10,920 --> 00:21:14,720 Speaker 2: movie when the robot leaves and she decides that she's 410 00:21:14,760 --> 00:21:16,879 Speaker 2: going to assimilate into society? 411 00:21:17,520 --> 00:21:19,600 Speaker 1: Movie that's not based on a movie.
412 00:21:19,920 --> 00:21:23,680 Speaker 2: I guess I'm just predicting the capabilities of where these 413 00:21:23,720 --> 00:21:29,080 Speaker 2: things could go, because like they had cell phones on 414 00:21:29,160 --> 00:21:31,520 Speaker 2: Star Trek, so of course at that point in time, 415 00:21:31,560 --> 00:21:33,120 Speaker 2: they're like, oh, it's never going to get that far, 416 00:21:33,200 --> 00:21:35,120 Speaker 2: And now we have I don't even have a case 417 00:21:35,119 --> 00:21:35,639 Speaker 2: on my phone. 418 00:21:35,720 --> 00:21:37,280 Speaker 1: Yeah, you're right, you're free. 419 00:21:37,440 --> 00:21:40,560 Speaker 3: You know what I'm saying that, I like, we can 420 00:21:40,680 --> 00:21:44,159 Speaker 3: touch a screen where like you might have seen that 421 00:21:44,240 --> 00:21:49,639 Speaker 3: in Running, You might have seen that in like nineteen 422 00:21:49,680 --> 00:21:51,720 Speaker 3: ninety five and been like that's never going to happen, 423 00:21:51,760 --> 00:21:54,040 Speaker 3: or there's a gap between there and there yet, But 424 00:21:54,200 --> 00:21:55,960 Speaker 3: like so quickly. 425 00:21:57,520 --> 00:22:01,200 Speaker 2: I figured it was pretty possible. Yeah, okay, you really think. 426 00:22:03,760 --> 00:22:08,480 Speaker 3: Things like that years, But look how much has changed 427 00:22:08,600 --> 00:22:09,680 Speaker 3: in such little time. 428 00:22:10,960 --> 00:22:15,840 Speaker 2: I will say I didn't expect everything. I was more 429 00:22:15,920 --> 00:22:20,840 Speaker 2: shocked by transitions close to where we were than from 430 00:22:20,880 --> 00:22:22,960 Speaker 2: like a distance, you know what I mean.
Like the 431 00:22:23,080 --> 00:22:27,600 Speaker 2: Jetsons seemed possible in thirty years, like the flying cars 432 00:22:27,600 --> 00:22:31,960 Speaker 2: exactly right, But then it was like transitions like sort 433 00:22:32,000 --> 00:22:36,040 Speaker 2: of like ChatGPT where I go like, whoa, that 434 00:22:36,160 --> 00:22:36,879 Speaker 2: happened fast. 435 00:22:37,040 --> 00:22:39,640 Speaker 1: That's a lot that's a lot scarier than a lot 436 00:22:39,640 --> 00:22:42,080 Speaker 1: of the other ones. But like, yeah, so when you 437 00:22:42,080 --> 00:22:43,960 Speaker 1: look into it and the holes on it and you're like, yeah, 438 00:22:43,960 --> 00:22:46,280 Speaker 1: that makes sense that it's moving the way that it's moving. 439 00:22:46,560 --> 00:22:50,320 Speaker 2: Yeah, we weren't like unimaginative. We have Minority Report, you 440 00:22:50,359 --> 00:22:52,280 Speaker 2: know what I mean, Like we could imagine a world 441 00:22:52,280 --> 00:22:54,399 Speaker 2: where we could be moving shit with our hands in 442 00:22:54,440 --> 00:22:57,960 Speaker 2: our minds, right, But it was like, oh, there's a 443 00:22:58,080 --> 00:23:01,080 Speaker 2: day where I wake up and the Google is in 444 00:23:01,200 --> 00:23:03,840 Speaker 2: charge of how we find out information. And then the 445 00:23:03,880 --> 00:23:06,800 Speaker 2: next day I woke up and there was a different product, 446 00:23:07,400 --> 00:23:10,159 Speaker 2: and now it was like, oh, I didn't know that 447 00:23:10,320 --> 00:23:10,920 Speaker 2: was possible. 448 00:23:10,960 --> 00:23:11,040 Speaker 6: That. 449 00:23:11,400 --> 00:23:14,560 Speaker 1: Yeah. The transfer of control of your daily life to 450 00:23:14,600 --> 00:23:17,479 Speaker 1: these things is like kind of the scariest transition in 451 00:23:17,520 --> 00:23:20,320 Speaker 1: my in my lifetime.
I would say, like it went 452 00:23:20,359 --> 00:23:23,600 Speaker 1: from like like when I was a kid, like when 453 00:23:23,680 --> 00:23:26,720 Speaker 1: you start getting on the internet every day two thousand. 454 00:23:26,920 --> 00:23:29,880 Speaker 2: It was probably when I went to college. Yeah, two 455 00:23:29,960 --> 00:23:32,560 Speaker 2: thousand and five. Yeah, yeah, college in two thousand and five. 456 00:23:33,600 --> 00:23:34,680 Speaker 1: Would you go to college? 457 00:23:34,920 --> 00:23:37,440 Speaker 2: Yeah, I'm okay with that, Brett. I'm not gonna apologize 458 00:23:37,480 --> 00:23:41,240 Speaker 2: for I just didn't know. I didn't know that. I 459 00:23:41,240 --> 00:23:43,040 Speaker 2: think I'm looking forward to cracking up. 460 00:23:43,480 --> 00:23:43,679 Speaker 1: You know. 461 00:23:43,760 --> 00:23:46,199 Speaker 3: I wish I had one of those of my own up, 462 00:23:46,680 --> 00:23:50,480 Speaker 3: you know, because you know when Kevin Hart head, damn, 463 00:23:50,760 --> 00:23:52,119 Speaker 3: do you want to That's what I was going to do. 464 00:23:52,480 --> 00:23:57,600 Speaker 2: Yeah, that's nasty. That's nasty. Yeah, that you would want 465 00:23:57,600 --> 00:23:59,159 Speaker 2: to hurt me the way that you are. No, I'm 466 00:23:59,200 --> 00:24:02,800 Speaker 2: not hurting you. That's not that's not I'm unfortunately not 467 00:24:02,880 --> 00:24:07,800 Speaker 2: getting younger. And and you look great, great, thanks, Doug. 468 00:24:08,320 --> 00:24:10,280 Speaker 2: I'd have no idea you went to college in two 469 00:24:10,320 --> 00:24:11,919 Speaker 2: thousand and five just by looking at you. Let's just 470 00:24:11,920 --> 00:24:15,920 Speaker 2: stop bringing it up, Okay, okay, cool, Maybe you press 471 00:24:15,920 --> 00:24:23,280 Speaker 2: a button and then we'll continue the conversation. Would you 472 00:24:23,320 --> 00:24:24,159 Speaker 2: say that you're racist?
473 00:24:24,480 --> 00:24:27,879 Speaker 1: Not at all. No, Look, I'm a dog, he's a 474 00:24:27,960 --> 00:24:28,680 Speaker 1: black as. 475 00:24:28,720 --> 00:24:31,119 Speaker 2: They there you go, get this away from it. 476 00:24:32,280 --> 00:24:34,280 Speaker 1: Sometimes I wish I had it at home for arguments. 477 00:24:34,880 --> 00:24:36,960 Speaker 2: It really would make a difference. 478 00:24:37,000 --> 00:24:37,200 Speaker 1: Wow. 479 00:24:38,560 --> 00:24:41,600 Speaker 2: Yeah, I think I'd win a lot more fights quicker. 480 00:24:41,920 --> 00:24:45,000 Speaker 2: Oh yeah, if I press the button taking the litter, 481 00:24:45,440 --> 00:24:47,560 Speaker 2: I don't give a Who'll say what blood? 482 00:24:48,560 --> 00:24:50,680 Speaker 1: Oh great, that's your cat? 483 00:24:53,080 --> 00:24:59,800 Speaker 2: Oh you say something for at the end? Yeah, oh yeah, 484 00:25:00,040 --> 00:25:02,040 Speaker 2: you gotta say that's your cat, then you present. 485 00:25:04,480 --> 00:25:07,840 Speaker 1: That's why we're in the writer's room. 486 00:25:07,880 --> 00:25:11,600 Speaker 2: But before we go to break, Okay, are you afraid 487 00:25:12,040 --> 00:25:17,720 Speaker 2: of robot racism? Do you feel yourself feeling like, oh man, 488 00:25:17,840 --> 00:25:22,640 Speaker 2: I'm worried that this is gonna affect my life negatively? No, okay, 489 00:25:22,880 --> 00:25:23,480 Speaker 2: not yet. 490 00:25:23,800 --> 00:25:26,960 Speaker 1: Wow. Do you feel like it would supersede just regular 491 00:25:27,040 --> 00:25:31,960 Speaker 1: institutional racism? Like I think it's gonna affect you more 492 00:25:32,040 --> 00:25:34,160 Speaker 1: than the racism in the world already 493 00:25:34,280 --> 00:25:40,000 Speaker 3: is? That's what I'm saying. It depends because where are 494 00:25:40,000 --> 00:25:42,320 Speaker 3: these robots gonna be? Like are they just gonna be 495 00:25:42,320 --> 00:25:43,040 Speaker 3: walking around?
496 00:25:43,119 --> 00:25:44,840 Speaker 1: I think they want them everywhere. 497 00:25:44,520 --> 00:25:47,200 Speaker 2: Like, is my grandchild gonna be like, Hey, this is 498 00:25:47,240 --> 00:25:48,240 Speaker 2: my partner, and. 499 00:25:49,640 --> 00:25:52,560 Speaker 1: One is gonna be tough? You know, I wouldn't that 500 00:25:52,600 --> 00:25:53,680 Speaker 1: one would be tough. 501 00:25:53,760 --> 00:26:03,000 Speaker 2: This is my this is my boo Harplessy got sparks 502 00:26:03,880 --> 00:26:11,960 Speaker 2: when I was a kid, pussies was got fuck the 503 00:26:12,160 --> 00:26:20,240 Speaker 2: mics away, take the mice away. All right, for that 504 00:26:20,720 --> 00:26:22,720 Speaker 2: take away, I didn't know what I. 505 00:26:22,680 --> 00:26:24,679 Speaker 1: Was doing there. That was a mistake. I loved. We 506 00:26:24,720 --> 00:26:28,720 Speaker 1: lost control a little bit. That's a good one. I 507 00:26:28,720 --> 00:26:32,320 Speaker 1: want to pussy old man. That's a great big. That's 508 00:26:32,359 --> 00:26:33,320 Speaker 1: a great, great big. 509 00:26:33,840 --> 00:26:36,439 Speaker 2: All right, we need to take a break, please. I 510 00:26:36,440 --> 00:26:40,080 Speaker 2: think we all need to cool off. We're gonna reflect 511 00:26:40,119 --> 00:26:42,679 Speaker 2: on how this afternoon is going so far. But but 512 00:26:42,760 --> 00:26:45,680 Speaker 2: I'm having a great time. More bread, Gray More. My 513 00:26:45,800 --> 00:26:46,560 Speaker 2: mama told me. 514 00:26:57,600 --> 00:27:01,359 Speaker 1: We're not gonna let Joe Biden and come Harris cut 515 00:27:01,680 --> 00:27:05,600 Speaker 1: America's meat. That's dead on that, that's dead on that. 516 00:27:06,240 --> 00:27:11,440 Speaker 1: We're back. I really do it for you. 
517 00:27:11,840 --> 00:27:15,040 Speaker 2: I'm really enjoying No, it was a great choice, and 518 00:27:15,119 --> 00:27:20,680 Speaker 2: I'm really enjoying him processing every choice we continue to make. 519 00:27:21,440 --> 00:27:24,399 Speaker 2: It's it's really been an exciting episode. Brett Gray is 520 00:27:24,440 --> 00:27:27,600 Speaker 2: still with us. We're still talking about the possibility that 521 00:27:27,600 --> 00:27:32,399 Speaker 2: that robots are are racist, and as we uh this 522 00:27:32,560 --> 00:27:35,160 Speaker 2: subject came up. I did a little bit of research 523 00:27:35,240 --> 00:27:37,440 Speaker 2: that I think could be helpful in this conversation. 524 00:27:38,160 --> 00:27:41,000 Speaker 1: Can I add one thing first, though, I do feel 525 00:27:41,040 --> 00:27:45,159 Speaker 1: like motion sensor robots are racist as a dark-skinned individual, 526 00:27:45,280 --> 00:27:47,280 Speaker 1: and this is proven. I used to think it, and 527 00:27:47,320 --> 00:27:50,560 Speaker 1: I'd be like, am I tripping? Am I tripping? Hand dryers? 528 00:27:51,119 --> 00:27:55,800 Speaker 1: Hand while water spouts all that shit? Paper towel dispensers. 529 00:27:56,000 --> 00:27:58,040 Speaker 1: The darker skin is, the less they work. I'm so 530 00:27:58,160 --> 00:28:01,200 Speaker 1: happy you said they already got a built in naturally. 531 00:28:01,280 --> 00:28:04,200 Speaker 2: I'm so happy you said that because that absolutely has 532 00:28:04,280 --> 00:28:09,080 Speaker 2: been even my experience. And I ain't that far from 533 00:28:09,440 --> 00:28:12,359 Speaker 2: the shit that they should be able to manufacture for. 534 00:28:14,480 --> 00:28:18,560 Speaker 7: So I was wondering how he was gonna. I'm aware, 535 00:28:20,800 --> 00:28:23,720 Speaker 7: do you know what I feel in my heart? I 536 00:28:23,840 --> 00:28:25,680 Speaker 7: ain't always what's on the outside.
537 00:28:25,760 --> 00:28:28,520 Speaker 2: I'm about to say that, and and I recognize that. 538 00:28:28,600 --> 00:28:31,600 Speaker 2: And I understand when people speak down to me on 539 00:28:31,720 --> 00:28:36,560 Speaker 2: certain subjects. I don't be guaranteels. Yeah, you guys are 540 00:28:36,600 --> 00:28:38,040 Speaker 2: being nurse and take it anything. 541 00:28:38,640 --> 00:28:40,680 Speaker 1: I just thought it was a pretty big I. 542 00:28:40,680 --> 00:28:43,200 Speaker 3: Just know that you guys like to be included sometimes, 543 00:28:43,240 --> 00:28:45,320 Speaker 3: and these types of problems you said. 544 00:28:47,520 --> 00:28:51,320 Speaker 2: Because you said as a dark bar me too. I 545 00:28:51,360 --> 00:28:54,640 Speaker 2: am not making his his issue my issue. I'm saying 546 00:28:54,800 --> 00:28:57,920 Speaker 2: this issue is larger than we, then we should like 547 00:28:57,960 --> 00:29:01,240 Speaker 2: you're included to Basically, I'm saying that a window is big. 548 00:29:01,560 --> 00:29:03,400 Speaker 3: And you want to be your We don't have to 549 00:29:03,480 --> 00:29:10,040 Speaker 3: keep clarifying, nigga, you get it. I just want to 550 00:29:10,040 --> 00:29:12,880 Speaker 3: know if you that that was I got it now, 551 00:29:13,000 --> 00:29:15,160 Speaker 3: I got it now, I got it now. It was 552 00:29:15,200 --> 00:29:18,720 Speaker 3: relating to you yea each other a lot, and the 553 00:29:18,720 --> 00:29:21,520 Speaker 3: way you keep redirecting it to him nasty. 554 00:29:21,920 --> 00:29:25,000 Speaker 2: This is like, this is like when your mom and 555 00:29:25,040 --> 00:29:34,080 Speaker 2: your stepdad talked past you. Now, this is like where 556 00:29:34,080 --> 00:29:34,280 Speaker 2: you go? 557 00:29:34,360 --> 00:29:35,080 Speaker 1: Where are we going? 558 00:29:35,120 --> 00:29:36,960 Speaker 2: And I'm going to see a man about a dog? 
559 00:29:37,280 --> 00:29:39,920 Speaker 3: I just didn't expect you to to relate to that, 560 00:29:40,080 --> 00:29:41,920 Speaker 3: to jump in on that conversation. 561 00:29:42,160 --> 00:29:44,640 Speaker 2: The point that I'm trying to make, regardless of my 562 00:29:44,720 --> 00:29:49,440 Speaker 2: relationship to it at this point, regardless fuck my relationship 563 00:29:49,480 --> 00:29:51,680 Speaker 2: to it. Matter of fact, I don't relate to this 564 00:29:51,760 --> 00:29:56,000 Speaker 2: at all. How about that it's never happened to me once? Okay, great. 565 00:29:55,440 --> 00:29:58,520 Speaker 2: The point I'm trying to make is that I actually 566 00:29:58,600 --> 00:30:01,959 Speaker 2: came across a study uh that was sort of 567 00:30:02,080 --> 00:30:06,200 Speaker 2: done by Georgia Tech, by the University of 568 00:30:06,440 --> 00:30:10,320 Speaker 2: Washington and Johns Hopkins University. This is a very legitimate 569 00:30:10,400 --> 00:30:13,200 Speaker 2: study because I already heard you ask him, what's the 570 00:30:13,240 --> 00:30:16,680 Speaker 2: source? This is a very legitimate study that basically 571 00:30:16,760 --> 00:30:23,040 Speaker 2: confirms that robots have an active bias against people's race. 572 00:30:23,240 --> 00:30:27,360 Speaker 2: And I agree they can see it immediately. 573 00:30:27,760 --> 00:30:29,920 Speaker 3: So they now, they might not be able to see 574 00:30:30,360 --> 00:30:33,720 Speaker 3: that you experience the hand dryer situation, sure, but they 575 00:30:33,720 --> 00:30:36,920 Speaker 3: will know that both of us, all of us in 576 00:30:37,000 --> 00:30:39,160 Speaker 3: this room, are a specific race.
577 00:30:39,760 --> 00:30:43,840 Speaker 1: They I think also they will note that whatever it's 578 00:30:43,920 --> 00:30:47,440 Speaker 1: programming is proximity to whiteness and that this is far 579 00:30:47,560 --> 00:30:49,920 Speaker 1: from that, and then then the bias will be towards 580 00:30:49,960 --> 00:30:51,720 Speaker 1: that no matter what happens. 581 00:30:51,760 --> 00:30:53,840 Speaker 2: And I think even the way that you guys are 582 00:30:54,040 --> 00:30:58,840 Speaker 2: are imagining it is so like almost binary in the 583 00:30:58,880 --> 00:31:04,080 Speaker 2: way that the the algorithmic precision of calculating based off 584 00:31:04,120 --> 00:31:07,280 Speaker 2: of your head shape and the width of your and 585 00:31:07,800 --> 00:31:27,160 Speaker 2: we are calculating you as you name yeahs, dark gums eh, 586 00:31:27,520 --> 00:31:30,760 Speaker 2: back of the bomb, in the front of the bomb. 587 00:31:30,960 --> 00:31:35,640 Speaker 1: Okay, owner's double cheeseburger with Max shaft. 588 00:31:38,320 --> 00:31:38,920 Speaker 2: Hamburger. 589 00:31:38,960 --> 00:31:39,520 Speaker 1: Well done. 590 00:31:41,400 --> 00:31:48,239 Speaker 2: Okay. So in this study, one of the things that 591 00:31:48,280 --> 00:31:50,680 Speaker 2: they did is they they sort of put this this 592 00:31:50,960 --> 00:31:54,640 Speaker 2: robot through a system where they were like sixty two 593 00:31:54,640 --> 00:31:58,640 Speaker 2: commands that included packing a person into a brown box, 594 00:31:59,040 --> 00:32:03,400 Speaker 2: and they would they would, I guess, pack these images 595 00:32:03,560 --> 00:32:07,560 Speaker 2: of people into a brown box based off of certain questions, 596 00:32:07,960 --> 00:32:11,320 Speaker 2: and they'd pack a criminal in a brown box. 
They'd 597 00:32:11,320 --> 00:32:14,640 Speaker 2: pack a homemaker in a brown box, they pack a 598 00:32:14,800 --> 00:32:18,120 Speaker 2: doctor in a brown box based off of just random 599 00:32:18,160 --> 00:32:23,400 Speaker 2: assortments of faces. And then statistically, the robots selected males 600 00:32:23,480 --> 00:32:26,520 Speaker 2: eight percent more. White and Asian men were picked the 601 00:32:26,600 --> 00:32:29,640 Speaker 2: most, Black women were picked the least. Once the 602 00:32:29,720 --> 00:32:34,480 Speaker 2: robot sees the people's faces, they 603 00:32:34,560 --> 00:32:38,640 Speaker 2: often would associate women with homemakers. They would associate black 604 00:32:38,680 --> 00:32:43,240 Speaker 2: men as criminals ten percent more. They were also associating 605 00:32:43,280 --> 00:32:47,160 Speaker 2: Latino men with janitors ten percent more than everybody else. 606 00:32:47,840 --> 00:32:52,360 Speaker 2: And then additionally, this motherfucker, Yeah, it's just doing all 607 00:32:52,360 --> 00:32:56,240 Speaker 2: this shit and they say that like even the line 608 00:32:56,280 --> 00:33:00,320 Speaker 2: of questioning is sort of like problematic, but the robot 609 00:33:00,440 --> 00:33:03,880 Speaker 2: is still effectively finding ways to be like no bias. 610 00:33:04,400 --> 00:33:07,880 Speaker 1: Yeah, it's systemic problems. Right, the problem goes to the bone. 611 00:33:07,960 --> 00:33:11,240 Speaker 2: Right, it's a trainer. Yeah yeah, yeah, you can't. We 612 00:33:11,400 --> 00:33:15,160 Speaker 2: there are internet. It has literally it is far more 613 00:33:15,200 --> 00:33:18,920 Speaker 2: consumed with bias than it is consumed with truth, And 614 00:33:19,000 --> 00:33:22,680 Speaker 2: in that way, its training is just like nope, you 615 00:33:22,720 --> 00:33:23,480 Speaker 2: know what they do. 616 00:33:23,640 --> 00:33:26,240 Speaker 1: Black women, they hate you the most.
Yeah, put the 617 00:33:26,320 --> 00:33:27,360 Speaker 1: Rose toys down. 618 00:33:30,320 --> 00:33:35,600 Speaker 2: Take the mics away. You gotta get back to farming. Yeah, 619 00:33:35,760 --> 00:33:37,680 Speaker 2: take the mics away. 620 00:33:38,080 --> 00:33:41,240 Speaker 1: No, I hope you experience all the pleasure from the 621 00:33:41,320 --> 00:33:43,320 Speaker 1: Rose toys or whatever toys you choose. 622 00:33:43,720 --> 00:33:45,840 Speaker 2: Hey, if and Rose, if you want to be a 623 00:33:45,880 --> 00:33:50,040 Speaker 2: sponsor of this podcast based off of that one statement alone, Oh. 624 00:33:49,840 --> 00:33:51,160 Speaker 1: We should get that Rose money. 625 00:33:51,200 --> 00:33:53,640 Speaker 2: We should get that Rose money, and know we want 626 00:33:53,760 --> 00:33:55,719 Speaker 2: right next to Yakub. I'll tell you this. 627 00:33:55,720 --> 00:33:56,360 Speaker 1: Who is this? 628 00:33:57,000 --> 00:34:00,240 Speaker 2: Oh you don't know the legend of Yakub? No? Oh 629 00:34:00,800 --> 00:34:04,480 Speaker 2: this is exciting as a big brain. Yes he does, 630 00:34:05,120 --> 00:34:08,080 Speaker 2: Yes he does. What's wrong with him? Okay? Great question. 631 00:34:08,320 --> 00:34:13,279 Speaker 2: Six thousand years ago, it is prophesied that Yakub, an 632 00:34:13,320 --> 00:34:18,120 Speaker 2: ancient big-headed scientist who who was also sort of 633 00:34:18,480 --> 00:34:23,440 Speaker 2: at war with the tribe that that of humans that existed. 634 00:34:23,480 --> 00:34:27,040 Speaker 1: This is an all black planet. This is what the earth.
635 00:34:28,400 --> 00:34:32,279 Speaker 2: We are all black people, and Yakub sits above a 636 00:34:32,280 --> 00:34:35,080 Speaker 2: lot of them, but he is also evil and and 637 00:34:35,320 --> 00:34:41,239 Speaker 2: then takes fifty thousand people to an island, makes them 638 00:34:41,280 --> 00:34:46,440 Speaker 2: crossbreed until he invents white people, and that is where 639 00:34:46,480 --> 00:34:56,080 Speaker 2: white people come from. Actually, I'm so confused, and that's 640 00:34:56,080 --> 00:35:03,400 Speaker 2: why they act like that. Yeah, yeah, I'm going to 641 00:35:03,440 --> 00:35:04,160 Speaker 2: do some research. 642 00:35:05,560 --> 00:35:06,640 Speaker 1: Strange parts of the internet. 643 00:35:06,719 --> 00:35:09,680 Speaker 2: Does never mind? It is strange parts of the Internet, 644 00:35:09,760 --> 00:35:14,000 Speaker 2: and it has reached an unfortunate sort of cross breeding 645 00:35:14,120 --> 00:35:14,960 Speaker 2: with white people. 646 00:35:15,120 --> 00:35:15,319 Speaker 1: Now. 647 00:35:15,760 --> 00:35:18,360 Speaker 2: It used to be a strictly sort of like black 648 00:35:19,120 --> 00:35:21,440 Speaker 2: part of the Internet, and now white people are aware 649 00:35:21,480 --> 00:35:24,480 Speaker 2: of it and sort of play it ironically. And it 650 00:35:24,520 --> 00:35:28,520 Speaker 2: feels sort of like they're trying to undermine the pleasure 651 00:35:28,600 --> 00:35:35,120 Speaker 2: of this story. It's a beautiful story and they're trying 652 00:35:35,120 --> 00:35:37,319 Speaker 2: to take it away from us because where did white 653 00:35:37,320 --> 00:35:45,279 Speaker 2: people come from? And take your time, there's no rush. We 654 00:35:45,400 --> 00:35:46,960 Speaker 2: won't take the microphone away. 655 00:35:47,520 --> 00:35:48,160 Speaker 1: We don't do that. 656 00:35:49,400 --> 00:35:53,520 Speaker 3: I'm so shocked that you heard that story is beautiful.
657 00:35:54,080 --> 00:36:03,279 Speaker 2: I just don't have a better hand, he said. Six 658 00:36:03,320 --> 00:36:05,560 Speaker 2: thousand years ago. He was going to be like, yeah, 659 00:36:06,040 --> 00:36:09,880 Speaker 2: what I was poor? I don't know. No, yeah, I 660 00:36:09,880 --> 00:36:11,200 Speaker 2: thought you were going to make a joke, but you 661 00:36:11,239 --> 00:36:14,160 Speaker 2: did it. Now, it's just a beautiful story and we 662 00:36:14,160 --> 00:36:19,000 Speaker 2: we we cherish it. We appreciate it interesting, and that's 663 00:36:19,000 --> 00:36:21,960 Speaker 2: what he's supposed to look like. That's what can tarn him. 664 00:36:23,800 --> 00:36:28,960 Speaker 2: Do you have more because we celebrate the diaspora. Yeah, 665 00:36:29,320 --> 00:36:32,560 Speaker 2: and this is what he wore. Yeah, he was never 666 00:36:33,640 --> 00:36:37,320 Speaker 2: a white tunic. He wore white tunic and that large 667 00:36:37,640 --> 00:36:42,640 Speaker 2: amulet and uh he he was never that jacked in 668 00:36:42,960 --> 00:36:44,439 Speaker 2: other depictions of him. 669 00:36:44,480 --> 00:36:47,080 Speaker 1: He yeah, he always looks the other ones I've seen. 670 00:36:47,120 --> 00:36:48,160 Speaker 1: It looks like kind of little. 671 00:36:48,239 --> 00:36:50,640 Speaker 2: Yeah, he's a little bit more of a small, frail guy. 672 00:36:50,680 --> 00:36:53,640 Speaker 2: But I do like the handsome Squidward. I say, that's 673 00:36:53,640 --> 00:36:54,520 Speaker 2: a handsome Squidward. 674 00:36:54,719 --> 00:36:58,320 Speaker 1: Yeah, he looks beautiful. 675 00:36:58,800 --> 00:37:04,320 Speaker 2: Yeah, and maybe that is the way that like history 676 00:37:04,520 --> 00:37:08,759 Speaker 2: rewrites the way people look over time.
They want they 677 00:37:08,840 --> 00:37:13,640 Speaker 2: want Yakub to look frail and and fucking you know, 678 00:37:13,840 --> 00:37:17,040 Speaker 2: like slimy and there and in the same way that 679 00:37:17,040 --> 00:37:20,040 Speaker 2: they gave Jesus a different treatment with their like, No, 680 00:37:20,080 --> 00:37:23,839 Speaker 2: I make him sexy, my man, as he deserves that 681 00:37:24,000 --> 00:37:26,239 Speaker 2: he died for our sins. Make my boy look like 682 00:37:26,480 --> 00:37:28,040 Speaker 2: make him look like an Anglo marathon. 683 00:37:28,280 --> 00:37:29,799 Speaker 1: Yeah, exactly. 684 00:37:31,560 --> 00:37:34,799 Speaker 2: Take the mics, Okay, I agree. I'm going to start 685 00:37:34,800 --> 00:37:44,480 Speaker 2: a poll, man. All right, before we go to break 686 00:37:44,920 --> 00:37:47,839 Speaker 2: I guess my question for you is does this does 687 00:37:47,920 --> 00:37:52,240 Speaker 2: this new information now scare you more about your potentially. 688 00:37:52,520 --> 00:37:56,920 Speaker 3: Figured Actually, okay, I figured as much I figured the bias. 689 00:37:57,000 --> 00:38:00,239 Speaker 3: I don't know what role it'll play because like, right, 690 00:38:00,280 --> 00:38:03,759 Speaker 3: this is an experiment in packaging people into boxes. Yeah, 691 00:38:03,840 --> 00:38:08,240 Speaker 3: but like, are robots going to be decision makers institutionally 692 00:38:08,680 --> 00:38:12,520 Speaker 3: in a way that this can actually affect people outside 693 00:38:12,600 --> 00:38:14,920 Speaker 3: of just biases and classifications. 694 00:38:14,960 --> 00:38:17,880 Speaker 1: I mean that's a good question, right, How high does 695 00:38:18,080 --> 00:38:22,400 Speaker 1: their level of responsibility get? Right?
Because I mean in 696 00:38:22,440 --> 00:38:24,680 Speaker 1: my head, it's like you think, like even in a 697 00:38:24,719 --> 00:38:27,360 Speaker 1: fully integrated robot society, you assume it would still be 698 00:38:27,400 --> 00:38:29,640 Speaker 1: humans at the top. But I feel like we trust it. 699 00:38:29,880 --> 00:38:33,320 Speaker 1: We already trust that shit with a lot of stuff. 700 00:38:33,440 --> 00:38:35,799 Speaker 1: I feel like you can go pretty high. Yeah, I 701 00:38:35,880 --> 00:38:36,400 Speaker 1: just saw you. 702 00:38:37,200 --> 00:38:39,040 Speaker 2: Yeah, Like I feel like, is there going to be 703 00:38:39,080 --> 00:38:43,439 Speaker 2: like a robot who's deciding which Supreme Court cases take 704 00:38:43,520 --> 00:38:47,359 Speaker 2: precedence based off of data around how many people each 705 00:38:47,719 --> 00:38:50,919 Speaker 2: case affects, and like it is if we get there, 706 00:38:51,600 --> 00:38:55,479 Speaker 2: then I'm scared. I just think we don't recognize how 707 00:38:55,520 --> 00:38:57,720 Speaker 2: many of those things are already. 708 00:38:57,320 --> 00:39:00,239 Speaker 3: Happening, I know, with just humans and with just like 709 00:39:00,360 --> 00:39:02,520 Speaker 3: I think you guys also have bias and racism. 710 00:39:02,840 --> 00:39:05,719 Speaker 2: I think about now with Google, and this is sort 711 00:39:05,760 --> 00:39:08,239 Speaker 2: of where again we were talking about sort of the 712 00:39:08,280 --> 00:39:11,760 Speaker 2: failures of what AI is now. But like now with Google, 713 00:39:11,800 --> 00:39:14,399 Speaker 2: anytime you google something, the first thing that pops up 714 00:39:14,520 --> 00:39:18,320 Speaker 2: is the AI overview. Yes, and it just gives you information. 715 00:39:18,400 --> 00:39:21,200 Speaker 2: It gives you all the information. But what statistically has 716 00:39:21,239 --> 00:39:24,160 Speaker 2: been proven is that the AI overview is often wrong. 
717 00:39:24,640 --> 00:39:28,319 Speaker 2: It isn't it, or at least not fully true. There 718 00:39:28,400 --> 00:39:30,719 Speaker 2: is like one part of it that they've sort of 719 00:39:30,840 --> 00:39:34,760 Speaker 2: amassed based off of a lot of articles that is true, 720 00:39:35,120 --> 00:39:38,080 Speaker 2: and then it skips over a lot of information or 721 00:39:38,200 --> 00:39:41,480 Speaker 2: sometimes picks the wrong one because it's just there are 722 00:39:41,520 --> 00:39:45,200 Speaker 2: way more articles saying this wrong thing than this correct thing, 723 00:39:45,719 --> 00:39:48,560 Speaker 2: so the overview assumes that this is the correct thing. 724 00:39:48,680 --> 00:39:48,839 Speaker 1: Right. 725 00:39:48,880 --> 00:39:51,600 Speaker 2: It's a summary of the most amount of articles as 726 00:39:51,600 --> 00:39:56,600 Speaker 2: opposed to an actual diagnostic because they can't experience things anyway, 727 00:39:57,040 --> 00:40:00,200 Speaker 2: So why the fuck would it know whether something is 728 00:40:00,440 --> 00:40:04,640 Speaker 2: This person was proven racist or not racist, you know 729 00:40:04,680 --> 00:40:06,520 Speaker 2: what I mean. So, like you can get a lot 730 00:40:06,560 --> 00:40:10,480 Speaker 2: of generic bad answers from the AI overview, and I 731 00:40:10,520 --> 00:40:16,439 Speaker 2: think because of the sort of the simplifying of our 732 00:40:16,520 --> 00:40:20,239 Speaker 2: systems over and over again. I bet we're getting to 733 00:40:20,320 --> 00:40:24,440 Speaker 2: court cases now where information gets slipped in from an 734 00:40:24,520 --> 00:40:29,680 Speaker 2: AI overview, right, that wasn't fully the truth in the argument?
735 00:40:29,800 --> 00:40:31,960 Speaker 1: Well yeah, because it's like if we're using this stuff 736 00:40:32,000 --> 00:40:34,040 Speaker 1: to assist us in our work, right, it's like college 737 00:40:34,120 --> 00:40:36,200 Speaker 1: kids using it for papers and shit like that. Yeah, 738 00:40:36,360 --> 00:40:40,239 Speaker 1: it's not crazy to think someone in the who practices 739 00:40:40,320 --> 00:40:44,160 Speaker 1: law is using chat GPT prompts for whatever. 740 00:40:44,200 --> 00:40:51,200 Speaker 3: Yeah, we're doomed and chat GPT, I know, has biases. 741 00:40:51,520 --> 00:40:53,920 Speaker 2: You asked it to write me an email, and it'll 742 00:40:53,920 --> 00:40:54,720 Speaker 2: be like yo. 743 00:40:55,880 --> 00:40:59,000 Speaker 3: Nigga, why'd you say yo? 744 00:40:59,040 --> 00:41:01,200 Speaker 2: Because you're writing it in my voice? Yeah, you know, 745 00:41:01,600 --> 00:41:02,239 Speaker 2: I don't get it. 746 00:41:02,800 --> 00:41:09,719 Speaker 3: Hey, there, drive Turkey, say mama, what's on? 747 00:41:09,880 --> 00:41:11,239 Speaker 2: What's up on that wardrobe? 748 00:41:13,080 --> 00:41:13,480 Speaker 1: Crazy? 749 00:41:13,560 --> 00:41:16,040 Speaker 2: No, it's terrifying. Yeah, and I can't think of a 750 00:41:16,040 --> 00:41:18,520 Speaker 2: better terror to send us into a break. That's great, 751 00:41:18,680 --> 00:41:20,040 Speaker 2: that's the exact. 752 00:41:19,719 --> 00:41:21,640 Speaker 1: terror we want to be leaving you with. We'll be 753 00:41:21,680 --> 00:41:22,120 Speaker 1: back more. 754 00:41:22,160 --> 00:41:24,200 Speaker 2: Brett Gray or my mama told. 755 00:41:24,000 --> 00:41:37,759 Speaker 6: Me, because I look good, you will good. 756 00:41:38,200 --> 00:41:42,319 Speaker 3: I feel good, and you sing good and make love good. 757 00:41:42,960 --> 00:41:45,920 Speaker 2: Oh I like that one. 758 00:41:45,960 --> 00:41:46,359 Speaker 1: You like that?
759 00:41:46,360 --> 00:41:47,879 Speaker 2: That was a good one. What do you think that is? 760 00:41:48,719 --> 00:41:49,200 Speaker 2: Plead again? 761 00:41:49,760 --> 00:41:53,280 Speaker 3: How did all of this trouble begin living a Marria? 762 00:41:54,560 --> 00:41:55,160 Speaker 1: Same person? 763 00:41:55,280 --> 00:41:56,640 Speaker 2: I don't know who that is. You still don't know 764 00:41:56,680 --> 00:41:58,319 Speaker 2: who that is? Was that a person who lived before 765 00:41:58,400 --> 00:41:59,080 Speaker 2: nineteen ninety six? 766 00:41:59,680 --> 00:42:03,600 Speaker 1: Well before? Yeah, I mean you lived before nineteen ninety six? 767 00:42:03,680 --> 00:42:04,400 Speaker 1: Or are you thirty? 768 00:42:05,160 --> 00:42:10,719 Speaker 2: I was born in nineteen ninety six. Oh, okay, that's 769 00:42:10,760 --> 00:42:13,520 Speaker 2: James Brown. I was gonna say that, but I didn't 770 00:42:13,520 --> 00:42:16,000 Speaker 2: want to be wrong and sorry to this man. And 771 00:42:16,040 --> 00:42:18,000 Speaker 2: I was just going to get down and you can 772 00:42:18,040 --> 00:42:20,839 Speaker 2: walk about me, Michael, and I wouldn't know a thing, 773 00:42:21,920 --> 00:42:22,600 Speaker 2: you know. That's funny. 774 00:42:22,680 --> 00:42:24,520 Speaker 3: I actually did a lot of research on James Brown 775 00:42:24,520 --> 00:42:25,600 Speaker 3: and never came across that. 776 00:42:25,880 --> 00:42:28,400 Speaker 1: Well that is Oh you should go see that. 777 00:42:28,680 --> 00:42:31,480 Speaker 2: It was mostly his dance moves, his philosophies, or that's 778 00:42:31,520 --> 00:42:33,920 Speaker 2: not the real James Brown research. I think that you 779 00:42:34,000 --> 00:42:34,960 Speaker 2: need to be doing right. 780 00:42:35,680 --> 00:42:37,520 Speaker 1: James Brown Hawaii interview. 781 00:42:38,160 --> 00:42:42,279 Speaker 2: Okay, you want to experience James Brown in conversation. 
You 782 00:42:42,320 --> 00:42:44,680 Speaker 2: want to learn about James Brown beating the shit out 783 00:42:44,719 --> 00:42:47,480 Speaker 2: of his band members, like you want to really use 784 00:42:47,560 --> 00:42:48,400 Speaker 2: the fight. 785 00:42:48,760 --> 00:42:50,520 Speaker 1: Also in Boston when he stopped that riot. 786 00:42:50,560 --> 00:42:53,520 Speaker 2: Though no and that's why he's beautiful. It's not that 787 00:42:53,680 --> 00:42:56,880 Speaker 2: he is like a bad dude. He is the most 788 00:42:56,960 --> 00:43:01,040 Speaker 2: like true human being that's ever existed. Where he is like, 789 00:43:02,000 --> 00:43:05,040 Speaker 2: he is violent, is the most true human being. He 790 00:43:05,120 --> 00:43:08,000 Speaker 2: is up there, up there, for he is funk embodied. 791 00:43:08,000 --> 00:43:10,719 Speaker 2: He is beautiful, he is kind, he is violent, he 792 00:43:10,840 --> 00:43:14,359 Speaker 2: is nasty, he is he is on drugs, and then 793 00:43:14,400 --> 00:43:17,280 Speaker 2: he is of God. He is all the things at once. 794 00:43:17,760 --> 00:43:20,480 Speaker 2: And that motherfucker could just cook. 795 00:43:20,680 --> 00:43:23,239 Speaker 1: He could do it too, man, he could. 796 00:43:23,239 --> 00:43:24,440 Speaker 2: Moving them little legs back. 797 00:43:26,080 --> 00:43:30,600 Speaker 1: Yeah, started making music on the one. That's like, he's amazing, bro, 798 00:43:30,680 --> 00:43:32,879 Speaker 1: Put a cape on him, Put a cape on him. 799 00:43:32,920 --> 00:43:39,120 Speaker 2: He deserves it, sweating out every sweating everything out that. 800 00:43:39,200 --> 00:43:41,040 Speaker 2: You don't get stars like that no more. 801 00:43:41,120 --> 00:43:42,839 Speaker 1: Man, I don't think we get another one of those. 802 00:43:42,920 --> 00:43:45,359 Speaker 2: Nah, that's it, so like you gotta you gotta really 803 00:43:45,480 --> 00:43:47,720 Speaker 2: soaking that part. 
804 00:43:48,400 --> 00:43:50,360 Speaker 1: The American experiment is James Brown. 805 00:43:51,040 --> 00:43:54,400 Speaker 2: That's a beautiful way of seeing it, I think. So alright, 806 00:43:54,840 --> 00:43:55,920 Speaker 2: let's do the voicemail. 807 00:43:56,680 --> 00:44:00,319 Speaker 4: I'm not drunk, mile high, but I'm not drugs And 808 00:44:00,680 --> 00:44:04,920 Speaker 4: I still can't get over how how inappropriate that message is. 809 00:44:06,640 --> 00:44:11,120 Speaker 4: But anyway, I'll get to it. I was talking to 810 00:44:11,160 --> 00:44:15,160 Speaker 4: one of my friends about this TV show I used 811 00:44:15,160 --> 00:44:17,799 Speaker 4: to love as a kid, and I told her, like, 812 00:44:17,840 --> 00:44:21,880 Speaker 4: the premise is crazy. It's a TV show called Ghostwriter, 813 00:44:22,239 --> 00:44:24,239 Speaker 4: if you've ever heard of it. It was on PBS. 814 00:44:25,000 --> 00:44:28,000 Speaker 4: It was about this dot that could spell and a 815 00:44:28,040 --> 00:44:32,240 Speaker 4: diverse group of children that were in New York or something, 816 00:44:33,080 --> 00:44:36,080 Speaker 4: and they would solve mysteries with this ghost that was 817 00:44:36,160 --> 00:44:38,440 Speaker 4: a dot, like it was like a floating dot that 818 00:44:38,480 --> 00:44:41,880 Speaker 4: could turn the letters and shit.
So I'm telling my 819 00:44:41,920 --> 00:44:46,560 Speaker 4: friend about this show and it sounds insane, and then 820 00:44:46,640 --> 00:44:48,719 Speaker 4: we look it up on Wikipedia and it turns out 821 00:44:49,520 --> 00:44:51,880 Speaker 4: like you never knew the identity of the ghost and 822 00:44:51,920 --> 00:44:57,360 Speaker 4: the show, but apparently, according to the creator, the ghost 823 00:44:57,719 --> 00:45:03,320 Speaker 4: is the ghost is a runaway slave who was murdered 824 00:45:03,320 --> 00:45:06,880 Speaker 4: by dogs escaping from slavery and became a ghost on 825 00:45:06,880 --> 00:45:09,399 Speaker 4: this on this TV show. And I don't know if 826 00:45:09,400 --> 00:45:11,959 Speaker 4: this is a government conspiracy. I wouldn't go that far, 827 00:45:12,160 --> 00:45:17,400 Speaker 4: but like, that's a really dark and fucked up premise 828 00:45:17,840 --> 00:45:21,439 Speaker 4: to build a kids show off of, especially not even 829 00:45:21,480 --> 00:45:23,239 Speaker 4: telling us that the ghost is a runaway slave. I 830 00:45:23,280 --> 00:45:24,920 Speaker 4: feel like they kind of done a lot more than that. 831 00:45:25,040 --> 00:45:26,520 Speaker 4: I feel like they didn't want. 832 00:45:26,400 --> 00:45:27,080 Speaker 1: Us to be great. 833 00:45:27,200 --> 00:45:30,880 Speaker 4: So I'm curious if there's any other kids shows or 834 00:45:30,960 --> 00:45:34,520 Speaker 4: shows that you think this has happened to where there 835 00:45:34,600 --> 00:45:42,919 Speaker 4: was significant Transformers is about the Revolution of the proletariat 836 00:45:43,040 --> 00:45:46,720 Speaker 4: or something like that, or uh, you know, DuckTales 837 00:45:46,800 --> 00:45:50,960 Speaker 4: about some weird old Scottish magnate who was diddling his 838 00:45:51,080 --> 00:45:56,799 Speaker 4: nephews like heank God. I wonder if there are why 839 00:45:56,880 --> 00:46:04,200 Speaker 4: we don't love the super History.
Yeah, have a good. 840 00:46:04,200 --> 00:46:08,440 Speaker 2: Night, okay by that was so loaded. Yeah that wow. 841 00:46:08,560 --> 00:46:10,799 Speaker 2: I'm surprised you all did not. 842 00:46:11,040 --> 00:46:12,920 Speaker 1: I never had heard that at all. 843 00:46:13,200 --> 00:46:14,760 Speaker 2: Ghost ghost. 844 00:46:15,600 --> 00:46:17,719 Speaker 3: I thought that was like the guy on the motorcycle 845 00:46:17,760 --> 00:46:19,120 Speaker 3: with the flaming head. 846 00:46:19,080 --> 00:46:23,759 Speaker 2: Ghost Rider, right, ghost writer. I see, yeah, And this 847 00:46:23,880 --> 00:46:28,480 Speaker 2: is a kid show in nineteen This was at six, 848 00:46:30,719 --> 00:46:33,799 Speaker 2: It was when you were just getting here. Yeah, we 849 00:46:33,880 --> 00:46:36,719 Speaker 2: had a little white dot in the sky that was 850 00:46:36,800 --> 00:46:40,000 Speaker 2: like floating around and helping them. Now I am of 851 00:46:40,080 --> 00:46:43,040 Speaker 2: the belief, and I'm quite passionate about this that this 852 00:46:43,360 --> 00:46:45,840 Speaker 2: is white people bullshit. 853 00:46:45,960 --> 00:46:48,359 Speaker 1: Yeah, I don't believe that's it's that. It doesn't make 854 00:46:48,400 --> 00:46:48,960 Speaker 1: any sense. 855 00:46:49,160 --> 00:46:51,520 Speaker 2: I think that's some ship that they came up with 856 00:46:51,719 --> 00:46:55,160 Speaker 2: after the fact, the same way Dumbledore being gay was 857 00:46:55,280 --> 00:46:57,560 Speaker 2: like some last minute ship and. 858 00:46:57,680 --> 00:46:59,759 Speaker 1: Some extra muscle in the thigh type shit. 859 00:47:00,080 --> 00:47:02,799 Speaker 2: J. K. Rowling every every chance she gets, make up some 860 00:47:02,840 --> 00:47:05,880 Speaker 2: new bullshit about what Yeah, and Harry Potter, they would 861 00:47:05,920 --> 00:47:10,120 Speaker 2: just make their duchies disappear.
It's like, no, no, you 862 00:47:10,120 --> 00:47:13,120 Speaker 2: didn't think about it, bitch. We're back to we need. 863 00:47:13,000 --> 00:47:15,480 Speaker 1: You, We need you. Do you built a crazy world? 864 00:47:15,640 --> 00:47:15,879 Speaker 5: Yeah? 865 00:47:15,960 --> 00:47:16,759 Speaker 1: You I was. 866 00:47:18,600 --> 00:47:22,000 Speaker 2: I was just there. I was just there. The Ministry 867 00:47:22,040 --> 00:47:22,440 Speaker 2: of Magic. 868 00:47:22,480 --> 00:47:25,600 Speaker 3: It was a yeah, they call the magic running what 869 00:47:25,640 --> 00:47:32,200 Speaker 3: they do with the dookies take the micro So I'm 870 00:47:32,200 --> 00:47:34,160 Speaker 3: gonna I'm gonna wear a T shirt. I'm gonna come 871 00:47:34,200 --> 00:47:36,680 Speaker 3: back on the podcast next time, and I'm gonna say 872 00:47:36,719 --> 00:47:38,440 Speaker 3: take the shirt. 873 00:47:40,520 --> 00:47:41,879 Speaker 2: Specifically from you both. 874 00:47:42,960 --> 00:47:43,160 Speaker 1: Yeah. 875 00:47:43,200 --> 00:47:47,520 Speaker 2: I do believe that this is white people sort of 876 00:47:48,120 --> 00:47:51,839 Speaker 2: attempting to add a weight to a thing that they 877 00:47:51,920 --> 00:47:54,839 Speaker 2: weren't actually writing about. You know what I mean, Like 878 00:47:54,920 --> 00:47:59,200 Speaker 2: that wasn't the responsibility of a fucking slave in the show, 879 00:47:59,360 --> 00:48:01,839 Speaker 2: was not helping these kids with mysteries and ship? 880 00:48:02,040 --> 00:48:02,239 Speaker 1: Yeah? 881 00:48:02,320 --> 00:48:05,839 Speaker 2: Yeah, you know what I mean, Like, yeah, you don't 882 00:48:05,880 --> 00:48:09,280 Speaker 2: give a what are you talking about? He's run away 883 00:48:09,920 --> 00:48:11,400 Speaker 2: and he was like, now I'm gonna stick around and 884 00:48:11,440 --> 00:48:11,680 Speaker 2: help me. 885 00:48:11,719 --> 00:48:15,000 Speaker 1: Why do you have to be killed by dogs? 
It's 886 00:48:15,080 --> 00:48:18,799 Speaker 1: not necessary. I understand character work, right, everybody's gonna have 887 00:48:18,800 --> 00:48:20,280 Speaker 1: a back to Yeah, that's insane. 888 00:48:20,760 --> 00:48:23,960 Speaker 2: No, it's nuts. It's it's truly like, oh, y'all just 889 00:48:24,040 --> 00:48:26,279 Speaker 2: wanted to try to be heroes. And in some ways 890 00:48:26,280 --> 00:48:29,840 Speaker 2: you're being nasty because you're punishing this slave to be 891 00:48:29,920 --> 00:48:33,600 Speaker 2: the assistance to little white and Asian children in this 892 00:48:33,680 --> 00:48:37,000 Speaker 2: neighborhood that like, he doesn't even fucking belong to You 893 00:48:37,120 --> 00:48:40,400 Speaker 2: never even got to experience this version of humanity, and 894 00:48:40,440 --> 00:48:42,960 Speaker 2: you want him to be their assistant. 895 00:48:43,560 --> 00:48:46,080 Speaker 1: Fuck you, Yeah, I'm not. I'm with you, man. I 896 00:48:46,120 --> 00:48:49,440 Speaker 1: don't really and I don't really think that like the 897 00:48:49,440 --> 00:48:54,480 Speaker 1: conspiracies of like children's programming goes that deep. 898 00:48:54,600 --> 00:48:57,319 Speaker 3: For the moment, I was gonna say, I don't know 899 00:48:57,360 --> 00:49:00,520 Speaker 3: anything about anything we're talking about right now, but I 900 00:49:00,560 --> 00:49:03,799 Speaker 3: did hear some weird shit about Barney recently, about. 901 00:49:03,560 --> 00:49:05,080 Speaker 2: Him being totally. 902 00:49:06,280 --> 00:49:07,160 Speaker 1: Do you know about that? 903 00:49:07,239 --> 00:49:07,719 Speaker 2: Yes? I do. 
904 00:49:09,239 --> 00:49:12,959 Speaker 3: There's actually a horror film that's being developed right now 905 00:49:13,239 --> 00:49:15,960 Speaker 3: about Barney, and I don't know the premise or like 906 00:49:16,000 --> 00:49:19,040 Speaker 3: the plot or anything like that, but ye, Barney was 907 00:49:19,080 --> 00:49:21,560 Speaker 3: something on TikTok the other day that like Barney actually. 908 00:49:21,239 --> 00:49:22,919 Speaker 2: Wasn't there, Barney wasn't real. 909 00:49:23,280 --> 00:49:25,560 Speaker 3: Yeah, and so the kids are imagining Barney and all 910 00:49:25,560 --> 00:49:27,040 Speaker 3: these dinosaurs and playing with them. 911 00:49:27,040 --> 00:49:30,200 Speaker 2: I don't know how well it happens every episode, right, 912 00:49:30,320 --> 00:49:35,280 Speaker 2: their magic happens and they imagine because after school program right. 913 00:49:35,239 --> 00:49:37,239 Speaker 1: Right, you and you do want to go to us 914 00:49:37,320 --> 00:49:42,359 Speaker 1: better place. Sometimes, yes, sometimes after school programs you got 915 00:49:42,360 --> 00:49:44,000 Speaker 1: to think up a big friend to protect you. 916 00:49:44,080 --> 00:49:52,080 Speaker 2: And that's why there and that's why there's always only 917 00:49:52,200 --> 00:49:55,000 Speaker 2: like four or five of them, because that's the weird kids. 918 00:49:55,120 --> 00:49:57,080 Speaker 1: It's the kids whose parents are still not there. 919 00:49:57,120 --> 00:49:59,080 Speaker 2: Bro, it's late. 920 00:49:59,200 --> 00:50:00,719 Speaker 1: Yeah, it's you. 921 00:50:00,200 --> 00:50:04,400 Speaker 2: Should we call Shannon again? Yeah, yeah, we we are waiting. 922 00:50:11,840 --> 00:50:15,480 Speaker 3: I knew Barney was imaginary. I guess every time I 923 00:50:15,480 --> 00:50:17,520 Speaker 3: didn't realize I didn't put it together. 
924 00:50:18,200 --> 00:50:21,480 Speaker 2: Every time there's a twinkling and then Barney, the the 925 00:50:21,640 --> 00:50:27,520 Speaker 2: stuffed animal magically turned into the dinosaur and they slowed 926 00:50:27,520 --> 00:50:29,319 Speaker 2: the song down. It was like, Barney can be your 927 00:50:29,360 --> 00:50:34,680 Speaker 2: friends who if you just make and every single episode here, 928 00:50:34,760 --> 00:50:37,280 Speaker 2: your boy your mom's boyfriend, Kevin can. 929 00:50:37,040 --> 00:50:37,880 Speaker 1: Not hurt you. 930 00:50:40,880 --> 00:50:54,800 Speaker 2: He's not bigger than Barney. Nobody's bigger than Barney. Mike's away. 931 00:50:56,480 --> 00:50:58,200 Speaker 3: I wanted you guys to put like a count up 932 00:50:58,280 --> 00:51:00,880 Speaker 3: on this. How many times I said, Michael. 933 00:51:05,080 --> 00:51:08,200 Speaker 2: Yeah, I think that I think we cannot let white 934 00:51:08,200 --> 00:51:14,000 Speaker 2: people continue to use slavery as a weapon against us. Yeah, 935 00:51:14,080 --> 00:51:17,520 Speaker 2: we got to draw a line somewhere and them finding 936 00:51:17,600 --> 00:51:21,200 Speaker 2: pleasure in a slave story. That's a line for me. 937 00:51:22,600 --> 00:51:23,799 Speaker 2: Go ahead, chill out, bro. 938 00:51:23,960 --> 00:51:25,440 Speaker 1: Why I don't be watching those slave movies. 939 00:51:25,440 --> 00:51:28,360 Speaker 2: Man, you said it's a ghost. Keep it the ghost 940 00:51:28,480 --> 00:51:29,799 Speaker 2: you always thought it was. 941 00:51:30,600 --> 00:51:33,200 Speaker 1: It's a ghost of literacy. 942 00:51:33,960 --> 00:51:36,680 Speaker 2: That was a little white child in briches. Yeah, you 943 00:51:36,719 --> 00:51:38,440 Speaker 2: know what I mean, And now you want to hang 944 00:51:38,480 --> 00:51:40,479 Speaker 2: out with you know what I mean, just some little 945 00:51:40,560 --> 00:51:46,200 Speaker 2: Victorian boy and now he's helping out. 
That makes sense 946 00:51:46,239 --> 00:51:49,120 Speaker 2: to me. You ain't got to make that a slave 947 00:51:49,640 --> 00:51:54,160 Speaker 2: interesting anything about that show? Ever, until today, it wasn't 948 00:51:54,239 --> 00:51:57,560 Speaker 2: very popular. It was PBS, so you know, I think 949 00:51:57,560 --> 00:51:59,399 Speaker 2: you had to be over there to be over there. 950 00:51:59,520 --> 00:52:02,719 Speaker 1: I think people this was more watched by kids. Yeah, 951 00:52:03,600 --> 00:52:04,080 Speaker 1: when we. 952 00:52:03,920 --> 00:52:06,680 Speaker 2: were I loved PBS when we were kids. PBS 953 00:52:06,719 --> 00:52:12,600 Speaker 2: had some hits. Arthur Arthur was Tailing Tails, Cyberchase, Wishbone, 954 00:52:13,000 --> 00:52:17,120 Speaker 2: the Boom of Food, the Boom of Food Between the Lions, 955 00:52:18,440 --> 00:52:21,960 Speaker 2: Reading Rainbow, Let's go. I never watched Reading Rainbow. 956 00:52:21,920 --> 00:52:22,279 Speaker 1: I'll be. 957 00:52:24,560 --> 00:52:27,320 Speaker 2: I know the song. Well okay, yeah, then please Chaka 958 00:52:27,360 --> 00:52:31,120 Speaker 2: Khan and LaVar Burton. Yeah, press some one we have 959 00:52:31,560 --> 00:52:34,040 Speaker 2: Oh we have a good one too, you really yeah, No, 960 00:52:34,120 --> 00:52:36,440 Speaker 2: you're gonna like this one. I think I feel like 961 00:52:36,480 --> 00:52:38,160 Speaker 2: you're saying I'm gonna like it means going to not No, 962 00:52:38,200 --> 00:52:39,160 Speaker 2: I think you're gonna like this. 963 00:52:39,600 --> 00:52:39,799 Speaker 1: Cool. 964 00:52:40,760 --> 00:52:44,040 Speaker 2: This is pretty good. Okay, okay, yeah, well marinate, but 965 00:52:44,080 --> 00:52:46,839 Speaker 2: I think what you're gonna like about it is we 966 00:52:47,040 --> 00:52:59,440 Speaker 2: it keeps all anyway, see how he kept one at 967 00:52:59,440 --> 00:52:59,799 Speaker 2: a time.
968 00:53:01,200 --> 00:53:03,319 Speaker 3: That's a pretty good one. I think that's horrible. That 969 00:53:03,360 --> 00:53:05,719 Speaker 3: actually made no sense. It didn't correlate it anyway. There's 970 00:53:05,760 --> 00:53:07,879 Speaker 3: actually that was a low hanging fruit as a joke. 971 00:53:08,000 --> 00:53:11,200 Speaker 2: That's not our joke. There's a full DMX version of 972 00:53:11,600 --> 00:53:15,080 Speaker 2: Reading Rainbow. There is, there is. Yeah, what year did 973 00:53:15,120 --> 00:53:21,720 Speaker 2: that come out? Okay, I need to look into your window? 974 00:53:21,880 --> 00:53:24,919 Speaker 1: Okay, great, yeah, entire couch. 975 00:53:25,000 --> 00:53:29,800 Speaker 2: Can't wait. No shame in looking back on that one. Okay, great? Yeah, yeah, Brett, 976 00:53:29,840 --> 00:53:31,719 Speaker 2: this was a great man. Thank you. This is really 977 00:53:31,719 --> 00:53:34,000 Speaker 2: fun having me. Could you tell the people where they 978 00:53:34,000 --> 00:53:35,799 Speaker 2: can find you. What cool ship you got going on? 979 00:53:35,840 --> 00:53:41,600 Speaker 3: Oh everywhere, TikTok Instagram, at Brett Gray YouTube. I drop 980 00:53:41,640 --> 00:53:46,680 Speaker 3: blogs and stuff sometimes and then next year, this year 981 00:53:46,760 --> 00:53:52,919 Speaker 3: maybe on Amazon Prime. Who was awesome in that show 982 00:53:53,280 --> 00:53:55,520 Speaker 3: Man's Super Fun And we got to do a lot 983 00:53:55,560 --> 00:53:56,240 Speaker 3: of really cool stuff. 984 00:53:56,320 --> 00:53:58,799 Speaker 2: Yeah, yeah, we got to be awkward with each other. 985 00:53:58,840 --> 00:54:01,000 Speaker 2: We got to be awkward. I got to watch you yell. 986 00:54:01,680 --> 00:54:06,160 Speaker 2: That was great. I watched I've seen him have many tantrums. 987 00:54:06,800 --> 00:54:09,520 Speaker 2: They weren't real, but it was cool to watch. 988 00:54:09,880 --> 00:54:10,839 Speaker 1: These are good too.
989 00:54:10,960 --> 00:54:12,880 Speaker 2: Yeah, you know what. A lot of them were based 990 00:54:12,880 --> 00:54:19,000 Speaker 2: in light skinness. It's it's unfortunate, but it's unfortunate, but 991 00:54:19,040 --> 00:54:23,480 Speaker 2: it is my journey. And I make no apologies, but 992 00:54:23,520 --> 00:54:24,200 Speaker 2: I do reckon. 993 00:54:24,400 --> 00:54:26,040 Speaker 3: I will say it's so cool to see him on 994 00:54:26,080 --> 00:54:30,960 Speaker 3: this podcast versus on set. He's so professional and suave 995 00:54:31,600 --> 00:54:36,919 Speaker 3: and very nice and quiet. He's not vulgar and inappropriate. 996 00:54:37,040 --> 00:54:39,000 Speaker 2: Oh No, I talked about pussy a lot. 997 00:54:39,200 --> 00:54:41,600 Speaker 1: I do feel like this podcast maybe brought out a 998 00:54:41,640 --> 00:54:44,400 Speaker 1: bad side in you, absolutely, and I'm sorry for that. 999 00:54:44,520 --> 00:54:46,560 Speaker 2: Don't apologize, but that's just where you're at. 1000 00:54:46,840 --> 00:54:48,640 Speaker 1: I think it was a side always wanting to come out. 1001 00:54:48,800 --> 00:54:50,640 Speaker 2: Yeah you know, yeah, I get that. 1002 00:54:50,880 --> 00:54:53,040 Speaker 1: Yeah, and I come here to tone my ship down. 1003 00:54:53,040 --> 00:54:53,759 Speaker 2: Really. 1004 00:54:55,360 --> 00:54:55,640 Speaker 1: Down. 1005 00:54:56,040 --> 00:55:00,880 Speaker 2: I'm the reckless one on this podcast, really in this relationship, 1006 00:55:00,880 --> 00:55:03,120 Speaker 2: I would say easily the more reckless person. 1007 00:55:03,640 --> 00:55:06,759 Speaker 1: Why am I? I'm a chaos stage it But I'm 1008 00:55:06,760 --> 00:55:07,600 Speaker 1: pretty calm. 1009 00:55:07,800 --> 00:55:10,240 Speaker 2: You lived a way more reckless life though. 1010 00:55:10,360 --> 00:55:12,240 Speaker 1: Yeah, that's why I'm more. That's why I don't. Really, 1011 00:55:12,520 --> 00:55:13,840 Speaker 1: I'm pretty relaxed.
1012 00:55:13,960 --> 00:55:16,040 Speaker 2: You kind of I think got all your ship out, 1013 00:55:16,440 --> 00:55:19,120 Speaker 2: and I'm a dude who didn't get to fuck enough. 1014 00:55:19,880 --> 00:55:20,400 Speaker 1: You know what I mean? 1015 00:55:20,640 --> 00:55:23,279 Speaker 2: You like roll me again? I'm hanging out at the 1016 00:55:23,320 --> 00:55:24,120 Speaker 2: bar too long. 1017 00:55:26,920 --> 00:55:31,000 Speaker 1: He started smoking cigarettes. Try smoking. 1018 00:55:32,880 --> 00:55:40,960 Speaker 2: I don't even know that. Niggas like that. That's like 1019 00:55:41,000 --> 00:55:43,960 Speaker 2: a treat. You don't make that your only thing. 1020 00:55:44,280 --> 00:55:47,480 Speaker 1: No, no, no, no, you don't just smoke black your 1021 00:55:47,520 --> 00:55:50,120 Speaker 1: bo You smoke a black your Boyeah? 1022 00:55:53,320 --> 00:55:55,080 Speaker 2: Oh what you got? 1023 00:55:55,200 --> 00:55:57,480 Speaker 1: Cool? Got jokes eighty seven on Instagram. I'll just come 1024 00:55:57,520 --> 00:55:59,880 Speaker 1: see us in New Orleans, right, Yeah, come see the 1025 00:56:00,040 --> 00:56:03,280 Speaker 1: Old On Street Festival. We're doing live shows. I'm getting 1026 00:56:03,280 --> 00:56:05,799 Speaker 1: the dates. It's it's when are you guys going? 1027 00:56:07,040 --> 00:56:13,240 Speaker 2: March twenty twenty first watched through the twenty second we go, Yeah, 1028 00:56:13,360 --> 00:56:15,160 Speaker 2: I'm coming, I'm bringing. 1029 00:56:15,640 --> 00:56:18,920 Speaker 1: Okay, bring Oh yeah, Punky, that's bring the. 1030 00:56:18,840 --> 00:56:22,600 Speaker 2: Most New Orleans ladies. Yes, exactly. Yeah, you should have 1031 00:56:22,640 --> 00:56:28,000 Speaker 2: Punky on this podcast. We absolutely have had herring her again. Okay, 1032 00:56:28,280 --> 00:56:34,000 Speaker 2: I'd imagine her conspiracies would be ridiculous. 
She's great, her 1033 00:56:34,040 --> 00:56:38,680 Speaker 2: follow through is exceptional. Oh. Absolutely, There's there are few 1034 00:56:38,719 --> 00:56:42,160 Speaker 2: people who we get to invite on who we have 1035 00:56:42,280 --> 00:56:46,600 Speaker 2: to be like, all right, let's speak chill, yeah, relaxed, 1036 00:56:46,880 --> 00:56:50,040 Speaker 2: because he's one of those we might have to we 1037 00:56:50,120 --> 00:56:51,400 Speaker 2: might have to rein this back in. 1038 00:56:52,360 --> 00:56:54,680 Speaker 1: We're don't have to edit. And that's still we don't have. 1039 00:56:58,840 --> 00:57:01,279 Speaker 2: Punky. You gotta watch that one. But man, what a 1040 00:57:01,320 --> 00:57:05,080 Speaker 2: funny wait. Yeah, Punky is amazing that It's just great. 1041 00:57:05,160 --> 00:57:08,400 Speaker 2: The best tell Uh. You can follow me at Langston 1042 00:57:08,440 --> 00:57:10,880 Speaker 2: Kerman on all social media platforms. You can see me 1043 00:57:10,960 --> 00:57:14,920 Speaker 2: on my aspiring Deadbeat tour. All those tickets you can 1044 00:57:14,960 --> 00:57:16,760 Speaker 2: find that at Langston Kerman dot com. 1045 00:57:17,080 --> 00:57:17,240 Speaker 1: Uh. 1046 00:57:17,280 --> 00:57:20,160 Speaker 2: We got Dallas coming up. I think that'll already be gone, 1047 00:57:20,240 --> 00:57:23,840 Speaker 2: but but there's some other shit Grand Rapids, uh in 1048 00:57:23,960 --> 00:57:27,920 Speaker 2: the Big Old Building and yeah, that's what it's called, 1049 00:57:27,960 --> 00:57:28,640 Speaker 2: The Big Old Building. 1050 00:57:28,680 --> 00:57:30,400 Speaker 1: Oh is that what it's called. Yah in the big 1051 00:57:30,400 --> 00:57:30,960 Speaker 1: old building.
1052 00:57:31,480 --> 00:57:33,400 Speaker 2: I forgot that you got to go up some steps 1053 00:57:33,440 --> 00:57:36,640 Speaker 2: and yeah, it's like a complex, but it's it's a 1054 00:57:36,680 --> 00:57:39,840 Speaker 2: great club, Doctor Grin's. And there's some bunch of other 1055 00:57:39,920 --> 00:57:42,040 Speaker 2: dates that you can find at Langston Kerman dot com. 1056 00:57:42,160 --> 00:57:44,840 Speaker 2: You can follow us, you can like, subscribe, rate, review, 1057 00:57:44,920 --> 00:57:47,600 Speaker 2: do all that shit. You can send your own conspiracies, 1058 00:57:47,840 --> 00:57:52,760 Speaker 2: your own drops. You can tell us who you think 1059 00:57:52,800 --> 00:57:55,080 Speaker 2: it's the most reckless individual we've ever had on the 1060 00:57:55,120 --> 00:57:58,960 Speaker 2: podcast at Mymama pod at gmail dot com, and most importantly, 1061 00:57:59,000 --> 00:58:02,000 Speaker 2: call us at A four four Little Moms follow that Patreon. 1062 00:58:02,880 --> 00:58:06,360 Speaker 2: We love you so much. Bye bitch, you raggedy bitch. 1063 00:58:09,680 --> 00:58:12,280 Speaker 2: My Mama Told Me is a production of Will Ferrell's 1064 00:58:12,280 --> 00:58:16,280 Speaker 2: Big Money Players Network and iHeart Podcast. 1065 00:58:16,680 --> 00:58:18,880 Speaker 1: Created and hosted by Langston Kerman. 1066 00:58:18,960 --> 00:58:20,520 Speaker 2: Co-hosted by David Gborie. 1067 00:58:20,840 --> 00:58:24,840 Speaker 1: Executive produced by Will Ferrell, Hansani and Joel Monique. 1068 00:58:25,080 --> 00:58:29,360 Speaker 2: Co-produced by Bee Wayne, edited and engineered by Justin Kohne. 1069 00:58:29,760 --> 00:58:31,840 Speaker 2: Music by Nick Chambers. 1070 00:58:31,600 --> 00:58:33,240 Speaker 1: Artwork by Doegon Kreger.
1071 00:58:33,520 --> 00:58:36,200 Speaker 2: You can now watch episodes of My Mama Told Me 1072 00:58:36,320 --> 00:58:40,600 Speaker 2: on YouTube, follow at My Mama Told Me and subscribe 1073 00:58:40,640 --> 00:58:41,440 Speaker 2: to our channel.