1 00:00:00,000 --> 00:00:02,920 Speaker 1: Hey, guys, welcome to another episode of Selective Ignorance. However, 2 00:00:03,000 --> 00:00:05,320 Speaker 1: before we get to this week's episode, I want to 3 00:00:05,360 --> 00:00:09,120 Speaker 1: remind you guys to purchase my book No Holds Barred, 4 00:00:09,240 --> 00:00:12,319 Speaker 1: a dual manifesto of sexual exploration and power. So feel 5 00:00:12,320 --> 00:00:16,320 Speaker 1: free to go to your local bookstores, preferably queer-owned, 6 00:00:16,440 --> 00:00:19,160 Speaker 1: Black-owned, or woman-owned, to support them, but also 7 00:00:19,320 --> 00:00:23,479 Speaker 1: just click the button on Amazon, Barnes and Nobles, or 8 00:00:23,520 --> 00:00:27,600 Speaker 1: wherever you read your books. Again, that is No Holds Barred, 9 00:00:27,680 --> 00:00:31,280 Speaker 1: a dual manifesto of sexual exploration and power, written by 10 00:00:31,320 --> 00:00:35,519 Speaker 1: yours truly and my co-host of the Decisions Decisions podcast, Weezy. 11 00:00:35,800 --> 00:00:37,960 Speaker 1: Make sure y'all get that. Now let's get to this 12 00:00:38,000 --> 00:00:41,280 Speaker 1: week's episode. This is Mandy B. Welcome to Selective Ignorance, 13 00:00:41,320 --> 00:00:44,360 Speaker 1: a production of the Black Effect Podcast Network and iHeartRadio. 14 00:00:45,159 --> 00:00:48,239 Speaker 1: All right, y'all, let's get into it. Welcome back to 15 00:00:48,320 --> 00:00:52,640 Speaker 1: another episode of Selective Ignorance. I'm your girl and your host, 16 00:00:52,760 --> 00:00:55,720 Speaker 1: Mandy B. And this is the show where we question everything, 17 00:00:55,720 --> 00:00:59,200 Speaker 1: and yes, that includes your favorite talking points and your 18 00:00:59,400 --> 00:01:03,160 Speaker 1: outdated-ass perspectives.
But let me just say this 19 00:01:03,680 --> 00:01:07,039 Speaker 1: up top: AI is here. By the way, this 20 00:01:07,080 --> 00:01:12,080 Speaker 1: topic is a long time coming, because hello, AI is 21 00:01:12,360 --> 00:01:15,959 Speaker 1: what's happening. It's not coming, it's not creeping up. It's here, 22 00:01:16,360 --> 00:01:18,559 Speaker 1: and y'all can either deal with it or get left behind. 23 00:01:18,800 --> 00:01:21,600 Speaker 1: Because if you're out here flexing about not learning the tools, 24 00:01:21,880 --> 00:01:24,800 Speaker 1: talking about, I don't need that robot mess, oh my god, 25 00:01:24,880 --> 00:01:28,200 Speaker 1: I'm not using ChatGPT. That mindset is the exact 26 00:01:28,240 --> 00:01:30,319 Speaker 1: reason you'll be out of work, out of relevance, and 27 00:01:30,480 --> 00:01:34,000 Speaker 1: out of touch. Now, let's be real. This wave of tech, 28 00:01:34,160 --> 00:01:38,360 Speaker 1: this AI revolution, it's very much giving Industrial Revolution two 29 00:01:38,400 --> 00:01:41,600 Speaker 1: point zero. Remember when machines came in and replaced human 30 00:01:41,720 --> 00:01:46,039 Speaker 1: labor on factory floors? Yep, same energy. People fought it 31 00:01:46,080 --> 00:01:48,800 Speaker 1: then too, cried about change, and guess what? The machines 32 00:01:48,880 --> 00:01:51,280 Speaker 1: kept coming and the world moved on. But here's the thing. 33 00:01:51,720 --> 00:01:53,600 Speaker 1: I'm not here to scare you. I'm here to wake 34 00:01:53,640 --> 00:01:57,480 Speaker 1: your ass up, because AI, it's doing some things better 35 00:01:57,520 --> 00:01:59,160 Speaker 1: than people. Tell me about 36 00:01:59,200 --> 00:01:59,320 Speaker 2: It. 37 00:01:59,360 --> 00:02:03,640 Speaker 1: Just fired niggas last week. More accurate, faster, no ego, 38 00:02:03,760 --> 00:02:06,320 Speaker 1: no lunch break. But don't get it twisted.
It cannot 39 00:02:06,360 --> 00:02:10,760 Speaker 1: replace human depth, instinct, or soul, not yet anyway. I'm 40 00:02:10,760 --> 00:02:13,320 Speaker 1: waiting for it to. I know y'all said, or y'all 41 00:02:13,360 --> 00:02:16,520 Speaker 1: have heard me say, I'm waiting for the aliens to come. Baby, 42 00:02:16,560 --> 00:02:19,440 Speaker 1: I am welcoming the robots. And I know y'all love 43 00:02:19,520 --> 00:02:22,520 Speaker 1: to say work smarter, not harder, but then turn around 44 00:02:22,560 --> 00:02:25,519 Speaker 1: and ignore the tools that would help you do just that. 45 00:02:25,600 --> 00:02:28,680 Speaker 1: Please make it make sense. If you're not asking deeper 46 00:02:28,760 --> 00:02:32,240 Speaker 1: questions about what's being handed to you, what's being automated, 47 00:02:32,240 --> 00:02:36,440 Speaker 1: what's being filtered, what's being programmed, you're just blindly vibing 48 00:02:36,840 --> 00:02:43,360 Speaker 1: in the matrix. So as I've said, please question everything. 49 00:02:43,639 --> 00:02:49,799 Speaker 1: Y'all know I do it, even in this episode. So first, 50 00:02:50,080 --> 00:02:57,040 Speaker 1: listen close. I'm joined today by my super producer Jason 51 00:02:57,280 --> 00:03:00,920 Speaker 1: and A King on the mic, and we're here to 52 00:03:00,960 --> 00:03:03,480 Speaker 1: talk all things AI. But what I really love is 53 00:03:03,480 --> 00:03:05,800 Speaker 1: when we have these episodes and we're also able to 54 00:03:05,840 --> 00:03:09,200 Speaker 1: like squeeze in some current events. So we're gonna start 55 00:03:09,240 --> 00:03:13,600 Speaker 1: off the episode this week, because we're recording this 56 00:03:13,639 --> 00:03:15,760 Speaker 1: the day before y'all hear it, so we can kind 57 00:03:15,760 --> 00:03:19,239 Speaker 1: of talk about everything that be hot, you know what I mean.
Listen, 58 00:03:19,360 --> 00:03:21,440 Speaker 1: we're gonna talk about the Tea app in a little bit, 59 00:03:21,440 --> 00:03:24,600 Speaker 1: but I do want to first give an R I 60 00:03:24,800 --> 00:03:29,600 Speaker 1: P, rest in peace, to Malcolm Jamal Warner as well 61 00:03:29,720 --> 00:03:31,560 Speaker 1: as... that's it. 62 00:03:31,919 --> 00:03:36,480 Speaker 3: I mean, we got to give space for Malcolm. 63 00:03:36,680 --> 00:03:40,240 Speaker 1: Okay, okay, well, no, I'm not gonna lie that. 64 00:03:40,400 --> 00:03:42,560 Speaker 3: That's where my RIP was going. 65 00:03:42,880 --> 00:03:45,360 Speaker 1: You know, it's crazy that could... here's the other thing, 66 00:03:45,480 --> 00:03:51,200 Speaker 1: the other two, uh, okay, the rock star and the wrestler. Okay, 67 00:03:51,360 --> 00:03:54,440 Speaker 1: here's what I'll say about that. And and and by 68 00:03:54,480 --> 00:03:59,320 Speaker 1: the way, condolences to the Warner family, to his daughter 69 00:03:59,440 --> 00:04:06,000 Speaker 1: that we we learned was there. It's weird because I 70 00:04:06,040 --> 00:04:11,760 Speaker 1: have such a pessimistic view on life that I'm constantly 71 00:04:11,920 --> 00:04:15,960 Speaker 1: like thinking about the end, which sucks because I also 72 00:04:16,000 --> 00:04:18,480 Speaker 1: am not in fear of the end, but I always 73 00:04:18,520 --> 00:04:22,000 Speaker 1: just wonder what life, how life would exist when I'm gone. 74 00:04:22,560 --> 00:04:24,680 Speaker 1: And to know that he was on a family vacation 75 00:04:24,800 --> 00:04:26,880 Speaker 1: and that took place, like, me and my friends travel 76 00:04:27,000 --> 00:04:29,600 Speaker 1: all the time, you know what I mean? What do 77 00:04:29,640 --> 00:04:30,839 Speaker 1: you mean allegedly? Well, there's. 78 00:04:30,640 --> 00:04:31,839 Speaker 3: A lot of stories that came out.
79 00:04:31,960 --> 00:04:36,520 Speaker 1: I don't like life, but I ain't gonna hold you, bro, 80 00:04:36,640 --> 00:04:39,120 Speaker 1: I ain't gonna hold you. We not going to go 81 00:04:39,480 --> 00:04:45,960 Speaker 1: no conspiracy theories. He was swimming. Really, don't... he was. 82 00:04:46,120 --> 00:04:48,679 Speaker 1: He was swimming on a morning swim with his daughter. 83 00:04:49,160 --> 00:04:51,640 Speaker 1: Surfers saved both of them. We're not going to. 84 00:04:53,800 --> 00:04:55,839 Speaker 3: But it was three other reports that were. 85 00:04:55,680 --> 00:04:59,480 Speaker 1: Not... we're not doing this. You know what's crazy? Let 86 00:04:59,480 --> 00:05:01,760 Speaker 1: me, let me ask y'all. Only because this is a 87 00:05:01,760 --> 00:05:04,400 Speaker 1: problematic thing that I know I do, but because it's 88 00:05:04,440 --> 00:05:06,920 Speaker 1: Selective Ignorance, I want to share it out loud. When 89 00:05:06,960 --> 00:05:10,159 Speaker 1: you see RIP posts from people, do you ever go 90 00:05:10,279 --> 00:05:13,520 Speaker 1: into the comments, yes, to see how they died? Because 91 00:05:14,320 --> 00:05:16,800 Speaker 1: you want to know just how it happened. Like, especially 92 00:05:16,839 --> 00:05:19,600 Speaker 1: if it's someone that you may not know, you want 93 00:05:19,600 --> 00:05:22,440 Speaker 1: to know, okay, like, what happened? Is that a thing 94 00:05:22,480 --> 00:05:22,880 Speaker 1: that y'all? 95 00:05:22,880 --> 00:05:25,560 Speaker 4: Absolutely, there's somebody in the comments section that's gonna go 96 00:05:25,800 --> 00:05:29,640 Speaker 4: above and beyond investigative reporting and they gonna dig for that 97 00:05:29,760 --> 00:05:31,159 Speaker 3: matcha tea. You know what I'm saying.
98 00:05:31,680 --> 00:05:34,120 Speaker 4: It's, for real, just to see, like, it might 99 00:05:34,160 --> 00:05:38,080 Speaker 4: be some incomplete information there, and there might, between the 100 00:05:38,080 --> 00:05:42,320 Speaker 4: time it was posted... and you're right, so I always do, 101 00:05:42,480 --> 00:05:45,320 Speaker 4: just to see, it might be a name or some clarification. 102 00:05:45,440 --> 00:05:47,240 Speaker 1: Oh yeah, I like to go to their page and 103 00:05:47,279 --> 00:05:50,039 Speaker 1: see what their last post was. It's very... it's weird. 104 00:05:50,120 --> 00:05:53,080 Speaker 2: When we do that, you definitely have to go back 105 00:05:53,080 --> 00:05:53,760 Speaker 2: to their last post. 106 00:05:53,880 --> 00:05:54,200 Speaker 3: Yeah, see. 107 00:05:54,279 --> 00:05:56,240 Speaker 2: Yeah, what also bugs me out too is when 108 00:05:56,279 --> 00:05:58,320 Speaker 2: they passed and they still have like an IG story. 109 00:05:58,360 --> 00:05:58,880 Speaker 5: It's active. 110 00:06:00,040 --> 00:06:06,440 Speaker 1: I know. What I also find myself doing is, because 111 00:06:06,440 --> 00:06:08,880 Speaker 1: there's always going to be that one... here's the thing. 112 00:06:09,200 --> 00:06:13,560 Speaker 1: If you're listening, someone does not like you. You did 113 00:06:13,600 --> 00:06:18,080 Speaker 1: somebody wrong. And so sometimes there's the comments that talk 114 00:06:18,120 --> 00:06:19,800 Speaker 1: about how shitty of a person the person is 115 00:06:19,839 --> 00:06:23,000 Speaker 1: that passed. And so I say that to lean into: 116 00:06:24,120 --> 00:06:27,520 Speaker 1: I didn't have any thoughts. Nothing in my body jumped 117 00:06:27,680 --> 00:06:33,559 Speaker 1: about Ozzy or Hogan. It just didn't. Ozzy Osbourne died 118 00:06:33,720 --> 00:06:38,520 Speaker 1: at, what, seventy-nine or something like that.
I'm like, 119 00:06:38,640 --> 00:06:40,240 Speaker 1: you lucky you lived that long with the type of 120 00:06:40,320 --> 00:06:43,479 Speaker 1: life you had. You were a fucking rock star, taking 121 00:06:43,680 --> 00:06:48,880 Speaker 1: all the drugs, living the life, and like, blood pressure 122 00:06:48,920 --> 00:06:50,440 Speaker 1: had to be high. I saw him yelling at his 123 00:06:50,560 --> 00:06:55,680 Speaker 1: kids for, like, my childhood. Like, his reality TV 124 00:06:55,839 --> 00:06:59,120 Speaker 1: was crazy. It's like, before people were like, oh, I 125 00:06:59,120 --> 00:07:02,960 Speaker 1: want a reality show like the Kardashians. Oh nah, my 126 00:07:03,040 --> 00:07:05,920 Speaker 1: house growing up was like the Osbournes'. Like, shit, he 127 00:07:06,640 --> 00:07:10,800 Speaker 1: literally, literally was was yelling at them kids like a motherfucker. 128 00:07:11,080 --> 00:07:15,640 Speaker 1: So to me, Ozzy dying, to me, that's a fulfilled life. 129 00:07:15,680 --> 00:07:18,960 Speaker 1: Like, I don't... I don't know if I look into 130 00:07:19,840 --> 00:07:21,640 Speaker 1: or think about what my life will be when I'm 131 00:07:21,640 --> 00:07:24,400 Speaker 1: in my seventies. And so knowing that he had that 132 00:07:24,440 --> 00:07:28,040 Speaker 1: type of life and all of the substance abuse and 133 00:07:28,080 --> 00:07:30,680 Speaker 1: all of the stress and just a rock star lifestyle, 134 00:07:31,040 --> 00:07:34,600 Speaker 1: from the touring to the live shows, all of those things, 135 00:07:35,280 --> 00:07:36,600 Speaker 1: I think he lived a life. 136 00:07:36,720 --> 00:07:40,080 Speaker 2: You feel me? He just wrapped his last tour, 137 00:07:40,160 --> 00:07:42,480 Speaker 2: like, his closing tour of his life was like two 138 00:07:42,480 --> 00:07:45,440 Speaker 2: weeks ago even. So to your point, life fulfilled.
139 00:07:45,800 --> 00:07:48,400 Speaker 1: Not only that, I was just also with a good 140 00:07:48,440 --> 00:07:50,440 Speaker 1: friend of mine who used to be in the NBA, 141 00:07:51,160 --> 00:07:56,000 Speaker 1: and we were talking about how he's on, like, uh, 142 00:07:56,240 --> 00:07:59,840 Speaker 1: like an anxiety medication. But we talked about how he 143 00:08:00,080 --> 00:08:02,520 Speaker 1: ain't even know what anxiety was, and it was something 144 00:08:02,560 --> 00:08:06,480 Speaker 1: that actually formulated during his career as a player, because 145 00:08:06,520 --> 00:08:10,800 Speaker 1: you're constantly around, you know, you have performance anxiety, and 146 00:08:10,800 --> 00:08:15,160 Speaker 1: then when you go through injuries and things like that, 147 00:08:15,400 --> 00:08:18,200 Speaker 1: you're constantly thinking about what's next for you and things 148 00:08:18,200 --> 00:08:21,720 Speaker 1: like that. And the idea that anxiety came about through 149 00:08:21,760 --> 00:08:26,440 Speaker 1: his career, I can only imagine, like, what health issues 150 00:08:26,640 --> 00:08:30,840 Speaker 1: or mental issues that Ozzy Osbourne had. And in terms 151 00:08:30,840 --> 00:08:34,120 Speaker 1: of mental issues, I think racism is a mental fucking issue. 152 00:08:34,160 --> 00:08:37,040 Speaker 1: So I honestly don't care that Hulk Hogan died. 153 00:08:39,000 --> 00:08:41,440 Speaker 1: That's it. I mean, I don't think we have to 154 00:08:41,559 --> 00:08:44,560 Speaker 1: care when people pass, because to me, we know that 155 00:08:45,160 --> 00:08:50,600 Speaker 1: death is inevitable essentially. So, like, the whole RIP and 156 00:08:50,679 --> 00:08:53,480 Speaker 1: sending it, like, I don't know. When people are shitty, 157 00:08:53,480 --> 00:08:55,680 Speaker 1: I think it's okay to talk about them after they passed. 158 00:08:56,280 --> 00:08:59,160 Speaker 1: And he was shitty when he was alive.
159 00:08:59,400 --> 00:09:02,240 Speaker 4: Yeah. No, he's still getting the same vitriol, as he 160 00:09:02,280 --> 00:09:06,160 Speaker 4: revealed who he really was, from how he feels about 161 00:09:06,320 --> 00:09:09,640 Speaker 4: society en masse. Yeah. So it's like, as a child, 162 00:09:09,640 --> 00:09:12,440 Speaker 4: you grew up buying Hogan figures. 163 00:09:12,120 --> 00:09:14,920 Speaker 1: And oh, I watched him on reality TV. 164 00:09:16,160 --> 00:09:18,560 Speaker 4: The minute he just started to reveal who he is, 165 00:09:19,080 --> 00:09:23,559 Speaker 4: who Terry is, right, then it's different. We measure 166 00:09:23,640 --> 00:09:24,160 Speaker 4: him differently. 167 00:09:24,280 --> 00:09:24,600 Speaker 3: Pause. 168 00:09:24,960 --> 00:09:28,280 Speaker 4: But so when he passed away, it is like, 169 00:09:28,280 --> 00:09:30,800 Speaker 4: all right, you know, he's a human being. He passed away. 170 00:09:31,640 --> 00:09:35,640 Speaker 4: I don't agree with his his viewpoints on humanity, 171 00:09:36,320 --> 00:09:37,800 Speaker 4: or on the large part of humanity. 172 00:09:37,840 --> 00:09:42,080 Speaker 1: So here's the thing, like, I just don't care. I'm 173 00:09:42,080 --> 00:09:43,960 Speaker 1: not gonna celebrate someone dying. 174 00:09:43,960 --> 00:09:45,280 Speaker 3: I don't care because. 175 00:09:45,040 --> 00:09:48,559 Speaker 1: Well, that's the thing, right? If there are... and if 176 00:09:48,559 --> 00:09:53,120 Speaker 1: you guys go back to the episode with Selena Hill 177 00:09:53,720 --> 00:09:59,480 Speaker 1: and Montgomery, Kenneth Montgomery, we talked about the death penalty 178 00:10:00,120 --> 00:10:04,360 Speaker 1: and where I won't celebrate death.
There are some niggas 179 00:10:04,360 --> 00:10:08,480 Speaker 1: that I think the world would be better without, whether 180 00:10:08,520 --> 00:10:13,280 Speaker 1: it's pedophilia, whether it's people who project racism against beings 181 00:10:13,320 --> 00:10:16,960 Speaker 1: that they don't know and just have this discriminatory hatred 182 00:10:17,400 --> 00:10:20,000 Speaker 1: towards a race of people because of the color of 183 00:10:20,000 --> 00:10:23,360 Speaker 1: their skin. To me, if many of you are listening 184 00:10:23,440 --> 00:10:25,760 Speaker 1: and have the idea that world peace will ever exist 185 00:10:25,840 --> 00:10:28,959 Speaker 1: while we're here, there are some people that we could 186 00:10:29,320 --> 00:10:31,960 Speaker 1: get off the... get off the planet. And to me, 187 00:10:32,520 --> 00:10:38,160 Speaker 1: I put racists and pedophiles and rapists... or rapists, I 188 00:10:38,200 --> 00:10:41,720 Speaker 1: don't know how quick we are into YouTube here, but 189 00:10:42,200 --> 00:10:45,160 Speaker 1: there are certain people with characteristics that I don't think 190 00:10:46,080 --> 00:10:50,800 Speaker 1: deserve to be on this earth, especially when we have 191 00:10:50,920 --> 00:10:54,920 Speaker 1: people passing like a Malcolm Jamal Warner, who, Jesus Christ... 192 00:10:55,280 --> 00:10:57,520 Speaker 1: it probably won't happen for me, but I'm jealous that, 193 00:10:57,600 --> 00:10:59,840 Speaker 1: in passing, I have not heard one person say one 194 00:11:00,200 --> 00:11:02,360 Speaker 1: thing about this man, right? You know what I mean? 195 00:11:02,440 --> 00:11:05,640 Speaker 1: And so that's a life fulfilled.
And it's crazy because 196 00:11:05,679 --> 00:11:08,120 Speaker 1: we think about karma, and then we see things happen 197 00:11:08,200 --> 00:11:11,600 Speaker 1: to people that we assume just don't deserve that, you 198 00:11:11,640 --> 00:11:13,520 Speaker 1: know what I mean? They deserve to be here longer. 199 00:11:14,920 --> 00:11:16,760 Speaker 1: And so just kind of wanted to start off there. 200 00:11:17,880 --> 00:11:21,160 Speaker 1: We got some current events that took place. Which one 201 00:11:21,160 --> 00:11:26,160 Speaker 1: we want to start with first? Jason, throw something at me. 202 00:11:27,600 --> 00:11:28,360 Speaker 1: What were we gonna do? 203 00:11:29,160 --> 00:11:30,640 Speaker 2: I actually want to... we haven't talked in a minute, 204 00:11:30,640 --> 00:11:31,920 Speaker 2: so I actually want to know your thoughts on the 205 00:11:31,920 --> 00:11:36,040 Speaker 2: whole Coldplay cheating controversy, where they got caught 206 00:11:36,040 --> 00:11:38,040 Speaker 2: on the kiss cam, because a lot happened, and I 207 00:11:38,080 --> 00:11:41,760 Speaker 2: know the beginning, everybody knows that part. But yeah, they've. 208 00:11:41,520 --> 00:11:45,800 Speaker 1: Since, they've both resigned. Here's the thing. See, the thing 209 00:11:45,920 --> 00:11:46,480 Speaker 1: is hot. 210 00:11:47,280 --> 00:11:50,920 Speaker 2: Uh, does everybody else need to give a recap? 211 00:11:51,040 --> 00:11:52,880 Speaker 1: No, we got to give a recap. Niggas was caught 212 00:11:52,920 --> 00:11:57,040 Speaker 1: cheating on a kiss cam, and every party in Atlanta 213 00:11:57,120 --> 00:12:00,120 Speaker 1: chose to throw it on a flyer, which is crazy. 214 00:12:00,240 --> 00:12:02,200 Speaker 1: I could not imagine. Like, you know how they throw 215 00:12:02,559 --> 00:12:04,960 Speaker 1: Martin Luther King up on flyers for MLK Day weekend?
216 00:12:05,360 --> 00:12:10,880 Speaker 1: Maybe this couple... well, this this affair was the star 217 00:12:11,400 --> 00:12:16,440 Speaker 1: of party flyers. Sneaky Links, bring your sneaky link, side 218 00:12:16,520 --> 00:12:22,280 Speaker 1: piece Sunday. I hate Atlanta. Atlanta ain't waste zero 219 00:12:22,400 --> 00:12:27,360 Speaker 1: time, they they threw... they threw this couple up on 220 00:12:27,440 --> 00:12:32,880 Speaker 1: every party flyer. If you've listened to Horrible Decisions, Decisions Decisions, 221 00:12:32,920 --> 00:12:37,679 Speaker 1: any of my podcasts, anytime, anytime I've spoken publicly, I 222 00:12:37,720 --> 00:12:43,840 Speaker 1: am very against workplace romances. I don't think you should 223 00:12:44,760 --> 00:12:47,040 Speaker 1: shit where you sleep, eat where you shit... I don't 224 00:12:47,080 --> 00:12:50,920 Speaker 1: know what the saying is, but one of them, you 225 00:12:50,960 --> 00:12:56,520 Speaker 1: know what I mean. You shouldn't, shouldn't, bro. There is 226 00:12:56,720 --> 00:12:57,440 Speaker 1: so much. 227 00:12:57,720 --> 00:13:00,160 Speaker 4: My mom always said penis and pussy out here... My 228 00:13:00,200 --> 00:13:03,640 Speaker 4: mom says that the flesh is weak. What that means, 229 00:13:03,720 --> 00:13:06,840 Speaker 4: what that means in her context, in my context, which I'll 230 00:13:06,960 --> 00:13:09,480 Speaker 4: share with you right now, it's the law of proximity. 231 00:13:10,640 --> 00:13:13,480 Speaker 4: So my mom was recently in the hospital, right, and 232 00:13:13,520 --> 00:13:16,160 Speaker 4: I'm watching the dynamics between the nurse, the nurses and 233 00:13:16,160 --> 00:13:17,800 Speaker 4: the staff, and. 234 00:13:17,400 --> 00:13:18,560 Speaker 3: I'm like, wow, they in there.
235 00:13:18,600 --> 00:13:21,880 Speaker 4: They twelve, sixteen hours, double shift, blah blah blah blah, 236 00:13:21,960 --> 00:13:24,360 Speaker 4: and then they go into this big cafeteria and they 237 00:13:24,360 --> 00:13:26,920 Speaker 4: have their lunch, and then they going to work again, and 238 00:13:26,920 --> 00:13:30,840 Speaker 4: then they're they're they're each other's support system. So you 239 00:13:30,880 --> 00:13:34,280 Speaker 4: go through, you go throughout your shift, and then day 240 00:13:34,320 --> 00:13:38,240 Speaker 4: in and day out for twelve hours. 241 00:13:38,240 --> 00:13:39,400 Speaker 3: We got to talk about something. 242 00:13:39,559 --> 00:13:41,800 Speaker 4: You're gonna share philosophy, you're gonna share if you've got 243 00:13:41,800 --> 00:13:45,680 Speaker 4: a family, kids, whatever. Hey, I'm gonna get a 244 00:13:45,679 --> 00:13:49,400 Speaker 4: coffee for you, want one? It's always gonna be those dynamics. 245 00:13:48,960 --> 00:13:50,400 Speaker 1: And them getting a coffee ain't gonna make it. 246 00:13:50,880 --> 00:13:56,440 Speaker 4: Hold on, no. But us, through time, through the passing of time, 247 00:13:56,960 --> 00:14:00,120 Speaker 4: there are going to be possibilities. We see it in 248 00:14:00,120 --> 00:14:02,120 Speaker 4: the industry we work in, we do, and it's just 249 00:14:02,200 --> 00:14:04,120 Speaker 4: on us to have some kind of moral compass to say, 250 00:14:04,520 --> 00:14:07,000 Speaker 4: set a boundary, right. And I think now in twenty 251 00:14:07,000 --> 00:14:11,559 Speaker 4: twenty-five, entering twenty twenty-six, the boundaries are just slowly deteriorating. 252 00:14:11,559 --> 00:14:13,560 Speaker 1: To me, it's not a moral compass as much as 253 00:14:13,640 --> 00:14:17,480 Speaker 1: it's literally discernment.
Like, to me too, if you're spending 254 00:14:17,520 --> 00:14:21,760 Speaker 1: twelve to fourteen to sixteen hours around somebody, we need 255 00:14:21,800 --> 00:14:26,000 Speaker 1: to learn how to compartmentalize our viewpoints on relationships. And 256 00:14:26,040 --> 00:14:29,640 Speaker 1: if you have a workplace relationship with someone, there's no 257 00:14:29,680 --> 00:14:31,600 Speaker 1: reason why that needs to be taken into the bedroom. 258 00:14:31,800 --> 00:14:34,600 Speaker 3: No, that's where it starts. That's my work husband, that's my 259 00:14:34,720 --> 00:14:35,440 Speaker 3: work wife. 260 00:14:36,000 --> 00:14:37,160 Speaker 5: Yeah, okay, so we. 261 00:14:37,120 --> 00:14:41,720 Speaker 1: Wait, by the way, you actually have a wife here, right? 262 00:14:42,280 --> 00:14:45,160 Speaker 1: Have you ever had a work wife or a work 263 00:14:45,200 --> 00:14:47,960 Speaker 1: girlfriend, and is that something that your wife has checked 264 00:14:48,000 --> 00:14:48,560 Speaker 1: you on before? 265 00:14:50,120 --> 00:14:52,600 Speaker 2: Not since I've been married, no. But before I was married, 266 00:14:53,240 --> 00:14:54,600 Speaker 2: I had a girlfriend who worked with me at this... 267 00:14:54,880 --> 00:14:56,800 Speaker 2: I worked at a magazine. We were at the same magazine, 268 00:14:56,800 --> 00:14:58,960 Speaker 2: she sat at the cube next to me, and that's 269 00:14:58,960 --> 00:15:01,320 Speaker 2: who I dated. Years later in her career, when we 270 00:15:01,320 --> 00:15:03,960 Speaker 2: were at a different magazine, we both worked there together 271 00:15:04,280 --> 00:15:07,080 Speaker 2: and she actually had a work husband there.
272 00:15:07,360 --> 00:15:08,560 Speaker 2: And it was funny to me because I was like, 273 00:15:08,560 --> 00:15:11,760 Speaker 2: this is bugged out, like, watching my ex like interact 274 00:15:11,800 --> 00:15:13,960 Speaker 2: with this dude who's her work husband, and I know 275 00:15:14,000 --> 00:15:15,880 Speaker 2: she has a boyfriend at home, and so it was 276 00:15:15,880 --> 00:15:16,680 Speaker 2: crazy watching it. 277 00:15:16,680 --> 00:15:18,200 Speaker 5: And I was thinking about. 278 00:15:19,520 --> 00:15:20,360 Speaker 1: Though that you saw. 279 00:15:20,680 --> 00:15:22,800 Speaker 2: That's so... that's the point I was gonna say, not 280 00:15:23,000 --> 00:15:24,840 Speaker 2: much boundaries crossed there. But I was thinking about, when 281 00:15:24,840 --> 00:15:27,440 Speaker 2: A King's saying, I'm watching the nurses and doctors and 282 00:15:27,480 --> 00:15:29,240 Speaker 2: we come out the room, I thought you were going 283 00:15:29,320 --> 00:15:32,240 Speaker 2: to say, and then I peeped, like, a little hand 284 00:15:32,280 --> 00:15:35,320 Speaker 2: gesture here, the hand going down the back start. 285 00:15:35,360 --> 00:15:36,560 Speaker 5: She started seeing those little things. 286 00:15:36,560 --> 00:15:38,560 Speaker 4: My mom seen the nurse's aide and one of the 287 00:15:38,640 --> 00:15:41,440 Speaker 4: nurses, and I don't know, it was... it was just 288 00:15:41,520 --> 00:15:43,440 Speaker 4: joking around it. And she told... she called both of 289 00:15:43,480 --> 00:15:46,120 Speaker 4: them in the room. She said, you know, make sure 290 00:15:46,200 --> 00:15:48,760 Speaker 4: y'all take care of each other, like y'all shop. I'm 291 00:15:48,760 --> 00:15:51,040 Speaker 4: looking at you, P, and I'm looking like, she said, hey, 292 00:15:51,160 --> 00:15:54,840 Speaker 4: you just late to the party.
So then the guy, 293 00:15:55,000 --> 00:15:57,800 Speaker 4: he kind of was like, oh, no, that's... we go 294 00:15:57,920 --> 00:15:58,400 Speaker 4: way back. 295 00:15:58,960 --> 00:15:59,440 Speaker 5: Ah. 296 00:16:00,320 --> 00:16:01,560 Speaker 3: That's subjective, but it ain't. 297 00:16:01,680 --> 00:16:04,960 Speaker 1: It's tricky, because I think that... and maybe it's 298 00:16:05,080 --> 00:16:08,560 Speaker 1: just the lack of being able to be platonic as 299 00:16:08,600 --> 00:16:11,160 Speaker 1: a man and a woman. I remember when I was on 300 00:16:11,200 --> 00:16:15,040 Speaker 1: the JBN, I was in a relationship with who I 301 00:16:15,040 --> 00:16:18,520 Speaker 1: thought was my soulmate at the time, and Joe used 302 00:16:18,560 --> 00:16:21,600 Speaker 1: to, like, FaceTime me every morning, because he a chatty, petty 303 00:16:21,640 --> 00:16:24,800 Speaker 1: ass motherfucker. I'd be laying in bed with my nigga 304 00:16:25,160 --> 00:16:27,000 Speaker 1: and Joe would call me, and it would be, like, 305 00:16:27,600 --> 00:16:29,160 Speaker 1: a lot. We talked on the phone in the morning. 306 00:16:29,200 --> 00:16:31,480 Speaker 1: We'd like to talk about what's happening on the internet 307 00:16:31,480 --> 00:16:36,640 Speaker 1: and things like that, and... and there you... but he, 308 00:16:37,040 --> 00:16:41,239 Speaker 1: we worked together, but it was work, but there was a friendship. 309 00:16:41,640 --> 00:16:44,600 Speaker 1: And to me, I'm just like, to me, if I 310 00:16:44,640 --> 00:16:47,320 Speaker 1: wasn't answering Joe in front of my nigga, I felt 311 00:16:47,320 --> 00:16:51,600 Speaker 1: like it would be worse. Yeah, I'm in bed, I'm up. Wait, 312 00:16:51,960 --> 00:16:57,480 Speaker 1: that's a... that's a violation, mister. Would you allow your 313 00:16:57,560 --> 00:16:58,720 Speaker 1: wife to answer at. 314 00:16:58,720 --> 00:16:59,200 Speaker 3: Least get up?
315 00:16:59,200 --> 00:17:01,080 Speaker 4: At least get up, let's go in the kitchen or something, 316 00:17:01,120 --> 00:17:03,160 Speaker 4: and then, I mean, we could start, yo, what up? 317 00:17:03,760 --> 00:17:04,320 Speaker 3: And then get up? 318 00:17:04,320 --> 00:17:09,359 Speaker 2: Because that's very innocent. Like, considering the context of Mandy, 319 00:17:09,440 --> 00:17:10,359 Speaker 2: that seems really innocent. 320 00:17:10,400 --> 00:17:12,639 Speaker 1: That's what I'm saying. I talk to... how often do 321 00:17:12,720 --> 00:17:14,600 Speaker 1: I talk to you, A King? A King, when I 322 00:17:14,640 --> 00:17:16,680 Speaker 1: was with my boyfriend, my boyfriend would be around. I'm 323 00:17:16,680 --> 00:17:21,040 Speaker 1: FaceTiming you. Like, I have platonic relationships with men in 324 00:17:21,119 --> 00:17:24,000 Speaker 1: my industry and in my life. And I think that 325 00:17:24,800 --> 00:17:28,200 Speaker 1: what this CEO did... mind you, both of them were married, 326 00:17:28,840 --> 00:17:31,119 Speaker 1: so I think we clearly knew they were having an affair, 327 00:17:31,240 --> 00:17:34,040 Speaker 1: so there were lines crossed. I do think it's interesting, 328 00:17:34,080 --> 00:17:36,679 Speaker 1: and what we don't have in our industry is the 329 00:17:36,760 --> 00:17:40,720 Speaker 1: ability to literally lose your job because of an affair. 330 00:17:40,760 --> 00:17:42,840 Speaker 1: And so we see both of them resigned. We know 331 00:17:42,880 --> 00:17:47,080 Speaker 1: what it was, board meeting took place. Y'all violated code 332 00:17:47,119 --> 00:17:47,720 Speaker 1: of conduct. 333 00:17:47,920 --> 00:17:53,240 Speaker 4: And is that similar to old boy from ABC? What 334 00:17:53,359 --> 00:17:57,680 Speaker 4: was the newscaster? I forget his name, the Black guy.
335 00:17:58,119 --> 00:18:03,760 Speaker 1: Oh yeah, the... part of the... I don't remember. He cheated, yeah, 336 00:18:03,760 --> 00:18:04,760 Speaker 1: he cheated with the white woman. 337 00:18:04,920 --> 00:18:05,600 Speaker 3: They got a whole. 338 00:18:05,480 --> 00:18:08,520 Speaker 4: Podcast now, yeah, but they're no longer on their network 339 00:18:08,720 --> 00:18:10,720 Speaker 4: as a result of it. Yeah. 340 00:18:10,440 --> 00:18:11,960 Speaker 5: But yeah, they lost their main jobs. 341 00:18:12,119 --> 00:18:15,040 Speaker 1: I mean, for this, it's interesting. I also want to say, uh... 342 00:18:15,359 --> 00:18:17,639 Speaker 1: do, do you think... don't go to sporting events with 343 00:18:17,720 --> 00:18:21,560 Speaker 1: your side piece. But is it... 344 00:18:21,600 --> 00:18:24,119 Speaker 2: Does the HR element of it mean anything? Like, if 345 00:18:24,400 --> 00:18:27,199 Speaker 2: she was the chief revenue officer and he was the CEO, 346 00:18:27,359 --> 00:18:28,800 Speaker 2: would it be the exact same? Because to me, I 347 00:18:28,840 --> 00:18:30,200 Speaker 2: feel like the HR element... 348 00:18:30,119 --> 00:18:32,560 Speaker 1: HR being fired by HR is hilarious. 349 00:18:32,640 --> 00:18:34,760 Speaker 2: Yeah, it's just like... it's like the hypocrisy of it. 350 00:18:34,840 --> 00:18:37,240 Speaker 1: I feel like... I mean, but people are hypocrites at 351 00:18:37,240 --> 00:18:39,680 Speaker 1: the end of the day. Like, you still have code 352 00:18:39,680 --> 00:18:43,040 Speaker 1: of conduct, and again, there's always somebody above you. Let's 353 00:18:43,040 --> 00:18:43,880 Speaker 1: be also very clear. 354 00:18:45,480 --> 00:18:47,880 Speaker 2: Company HR, and I know that too.
I know HR 355 00:18:47,920 --> 00:18:49,679 Speaker 2: is always with the company, but they're also the one 356 00:18:49,760 --> 00:18:51,840 Speaker 2: who are like with the rules, the Code 357 00:18:51,840 --> 00:18:53,920 Speaker 2: of Conduct, and the fact that she didn't follow it, 358 00:18:54,000 --> 00:18:56,560 Speaker 2: which I think we all suspect anyway, but because it 359 00:18:56,600 --> 00:18:58,760 Speaker 2: was so loud and long, I feel like that. I 360 00:18:58,800 --> 00:18:59,960 Speaker 2: feel like if it was like if she was the 361 00:19:01,040 --> 00:19:03,119 Speaker 2: chief revenue officer and he was the CEO, I feel 362 00:19:03,119 --> 00:19:05,520 Speaker 2: like maybe it will be like paid leave of absence 363 00:19:05,560 --> 00:19:06,359 Speaker 2: and it will go away. 364 00:19:06,440 --> 00:19:10,399 Speaker 1: Here's another thing too, right, I was just out with 365 00:19:11,920 --> 00:19:14,439 Speaker 1: a homeboy of mine who is married. We went to 366 00:19:14,480 --> 00:19:17,119 Speaker 1: one of his restaurants and we're just sitting at the 367 00:19:17,119 --> 00:19:20,280 Speaker 1: booth having a conversation, catching up, right, and we see 368 00:19:20,320 --> 00:19:24,040 Speaker 1: this girl and she's videoing and taking pictures. He was like, 369 00:19:25,040 --> 00:19:26,919 Speaker 1: you think that girl's videoing us? And I was like, 370 00:19:27,720 --> 00:19:31,240 Speaker 1: and yes, she is. And what, nigga? We're not doing 371 00:19:31,280 --> 00:19:35,560 Speaker 1: anything wrong. And so to me, I think the reaction 372 00:19:36,440 --> 00:19:39,720 Speaker 1: of both of them indicated that they was up to 373 00:19:39,760 --> 00:19:42,240 Speaker 1: no good. And so to me, here's my bit of 374 00:19:42,240 --> 00:19:46,840 Speaker 1: advice as a seasoned side chick who's no longer, I'm 375 00:19:46,880 --> 00:19:54,679 Speaker 1: no longer that. However, seasoned side chick. There we go. Listen.
376 00:19:55,400 --> 00:19:55,720 Speaker 4: Uh. 377 00:19:55,920 --> 00:19:58,040 Speaker 1: In In my book, my New York Times bestseller No 378 00:19:58,119 --> 00:20:01,520 Speaker 1: Holds Barred, I actually talk about being okay with 379 00:20:01,640 --> 00:20:05,239 Speaker 1: being you know, number two, which I no longer think 380 00:20:05,280 --> 00:20:08,439 Speaker 1: of anymore. However, when you're out in public, if you 381 00:20:08,480 --> 00:20:11,639 Speaker 1: are in your mind doing something that you ain't supposed 382 00:20:11,640 --> 00:20:13,639 Speaker 1: to be doing with somebody you're not supposed to be with, 383 00:20:14,440 --> 00:20:17,240 Speaker 1: in a relationship dynamic that maybe you don't want the 384 00:20:17,240 --> 00:20:24,800 Speaker 1: public to know, the key is to just act like friends. Literally, 385 00:20:25,040 --> 00:20:27,560 Speaker 1: y'all work for the same company that could have been 386 00:20:27,960 --> 00:20:31,320 Speaker 1: your company suite, and all of y'all are there, and 387 00:20:31,400 --> 00:20:35,840 Speaker 1: y'all are enjoying the same song like wave like. I 388 00:20:35,880 --> 00:20:38,919 Speaker 1: think that there's a way to behave. Do all of 389 00:20:38,960 --> 00:20:42,640 Speaker 1: your dirt behind closed doors with knowing that everything that's 390 00:20:42,680 --> 00:20:45,520 Speaker 1: done in the dark will eventually come to light. But 391 00:20:46,000 --> 00:20:48,400 Speaker 1: if you're going to be in public settings with your joint, 392 00:20:48,440 --> 00:20:50,720 Speaker 1: with your side piece, because best believe I've been there, 393 00:20:50,840 --> 00:20:54,480 Speaker 1: done that, act like you the friend. Just act like 394 00:20:54,560 --> 00:20:57,040 Speaker 1: y'all are there to enjoy whatever moment y'all are in, 395 00:20:57,359 --> 00:21:03,919 Speaker 1: and take all that little guilty energy somewhere else.
Yeah, 396 00:21:03,960 --> 00:21:12,480 Speaker 1: it was crazy. Speaking of Yeah, speaking of energy, side chicks, 397 00:21:12,520 --> 00:21:14,560 Speaker 1: how you move, how you don't move, how you act 398 00:21:14,560 --> 00:21:18,199 Speaker 1: in public, how you act behind closed doors? Baby. The 399 00:21:18,280 --> 00:21:20,359 Speaker 1: t app just came out now, and if y'all are 400 00:21:20,480 --> 00:21:21,240 Speaker 1: unfamiliar with. 401 00:21:21,200 --> 00:21:24,160 Speaker 2: The t app, it is like number one, number one app, the. 402 00:21:24,000 --> 00:21:25,640 Speaker 1: Number one on the app store, but not only that, 403 00:21:26,200 --> 00:21:29,120 Speaker 1: for years. And again, if you are listening to Horrible Decisions. 404 00:21:29,160 --> 00:21:31,440 Speaker 1: I did a Patreon episode about this. There have been 405 00:21:31,520 --> 00:21:35,119 Speaker 1: Facebook groups for the last few years where you have 406 00:21:35,240 --> 00:21:40,160 Speaker 1: to be invited and accepted into it, but it's almost 407 00:21:40,200 --> 00:21:43,320 Speaker 1: every city in America, and the groups are 408 00:21:43,359 --> 00:21:46,920 Speaker 1: called Are We Dating the Same Man? And so yes, 409 00:21:47,359 --> 00:21:49,760 Speaker 1: So the t app is essentially Are We Dating the 410 00:21:49,800 --> 00:21:52,359 Speaker 1: Same Man? Here's the thing about this as well. It 411 00:21:52,400 --> 00:21:55,400 Speaker 1: has been out for a week. I mean, I'm sure 412 00:21:55,400 --> 00:21:57,440 Speaker 1: it's been in beta for much longer than that, but 413 00:21:57,720 --> 00:22:01,520 Speaker 1: it had a push over this last week. Millions of 414 00:22:01,520 --> 00:22:06,560 Speaker 1: women chose to join this platform. And in order to 415 00:22:06,640 --> 00:22:09,480 Speaker 1: join the platform, by the way, because I went to 416 00:22:09,480 --> 00:22:11,440 Speaker 1: do it, didn't finish the process because I got busy.
417 00:22:11,600 --> 00:22:13,320 Speaker 1: But in order to join this you do have to 418 00:22:13,400 --> 00:22:16,840 Speaker 1: like you're going to join it was I here's the thing, 419 00:22:18,160 --> 00:22:20,560 Speaker 1: the same way I watch Housewives, I just like seeing 420 00:22:20,560 --> 00:22:24,560 Speaker 1: how miserable everybody is out here because I none of 421 00:22:24,640 --> 00:22:27,160 Speaker 1: my none of the niggas I lay up with, would 422 00:22:27,160 --> 00:22:30,680 Speaker 1: be on now. Hmmm, I ain't even gonna hold you. 423 00:22:30,720 --> 00:22:34,320 Speaker 1: Maybe someone's from the past, but who who I rock 424 00:22:34,400 --> 00:22:36,239 Speaker 1: with right now? I ain't even worried about it, like 425 00:22:37,280 --> 00:22:40,400 Speaker 1: you know. And so there was one that I think 426 00:22:40,480 --> 00:22:42,160 Speaker 1: would have been on there, but I just cut them 427 00:22:42,160 --> 00:22:45,800 Speaker 1: off anyway. So basically, in order to get on this app, 428 00:22:45,840 --> 00:22:49,480 Speaker 1: you had to upload a photo ID, You had to 429 00:22:49,520 --> 00:22:52,119 Speaker 1: be verified, your social media had to be linked. Like 430 00:22:52,119 --> 00:22:55,560 Speaker 1: there's all these things that had to you had to 431 00:22:55,600 --> 00:22:59,200 Speaker 1: do to get verification. And because it's a new app, 432 00:23:00,760 --> 00:23:04,720 Speaker 1: like every new app, maybe their security was not up 433 00:23:04,720 --> 00:23:06,800 Speaker 1: to par and within the first week there was a 434 00:23:06,880 --> 00:23:10,840 Speaker 1: data breach, with hackers gaining access to seventy two thousand images, 435 00:23:11,200 --> 00:23:16,360 Speaker 1: including thirteen thousand selfies and photo IDs submitted for account verifications, 436 00:23:16,400 --> 00:23:21,359 Speaker 1: as well as fifty nine thousand images from post comments 437 00:23:21,400 --> 00:23:25,920 Speaker 1: and dms. 
Here's the thing. This app was an app 438 00:23:25,960 --> 00:23:28,800 Speaker 1: that only women could join, where they would pretty much 439 00:23:29,000 --> 00:23:35,760 Speaker 1: create profiles of the men they dealt with and talk 440 00:23:35,800 --> 00:23:41,840 Speaker 1: about these men and their experiences. It's all straight dry snitching. Yes, 441 00:23:41,960 --> 00:23:47,320 Speaker 1: but as a woman, women be lying, and so this 442 00:23:47,520 --> 00:23:50,560 Speaker 1: wasn't an app that I would go on to necessarily 443 00:23:51,119 --> 00:23:54,520 Speaker 1: take anything as bible in terms of their relationships. We 444 00:23:54,560 --> 00:23:59,520 Speaker 1: also know that when relationships end with men not on 445 00:23:59,560 --> 00:24:03,320 Speaker 1: the accord of the woman, the woman in a heartbroken, manic, 446 00:24:03,760 --> 00:24:08,120 Speaker 1: depressed state is so irate that they just hate this man, 447 00:24:08,600 --> 00:24:12,359 Speaker 1: and oftentimes stories are created and things as such. I 448 00:24:12,520 --> 00:24:16,960 Speaker 1: do think that this is an app that would only 449 00:24:17,000 --> 00:24:20,440 Speaker 1: cause more harm than good. I don't think that women 450 00:24:20,480 --> 00:24:23,600 Speaker 1: should be allowed to create profiles of men. There's been 451 00:24:23,760 --> 00:24:26,600 Speaker 1: men who have come out to already say like, this 452 00:24:26,680 --> 00:24:29,800 Speaker 1: is what's being said about me. To me, this is 453 00:24:29,840 --> 00:24:34,160 Speaker 1: a lawsuit waiting to happen. I know that there's an 454 00:24:34,160 --> 00:24:37,359 Speaker 1: app that I will never make mention on any of 455 00:24:37,359 --> 00:24:40,439 Speaker 1: my platforms that when I go onto it, it's a 456 00:24:40,440 --> 00:24:42,399 Speaker 1: message board of people who do not know me that 457 00:24:42,480 --> 00:24:45,439 Speaker 1: are creating narratives about me that aren't true.
Whether it's 458 00:24:45,440 --> 00:24:47,560 Speaker 1: who I sleep with, the type of person I am, 459 00:24:47,680 --> 00:24:50,000 Speaker 1: where I live, well, to my bank account, all these things, 460 00:24:51,080 --> 00:24:53,400 Speaker 1: and it's just people that is bored, people that is miserable, 461 00:24:53,400 --> 00:24:57,320 Speaker 1: and that's what the Internet is. Unfortunately, so a. 462 00:24:57,320 --> 00:24:59,800 Speaker 4: Girl could probably put up a post of a guy 463 00:25:00,119 --> 00:25:02,840 Speaker 4: ask the question, and then just because people have access 464 00:25:02,880 --> 00:25:05,840 Speaker 4: to the app, they may just jump on there and 465 00:25:05,880 --> 00:25:08,160 Speaker 4: be like, say some wild shit about the dude that 466 00:25:08,240 --> 00:25:09,120 Speaker 4: they may not even. 467 00:25:08,960 --> 00:25:12,159 Speaker 2: know is are they asking questions though? Or is it 468 00:25:12,200 --> 00:25:16,040 Speaker 2: like I'm just gonna do profiles? Like no, no, no, no no, 469 00:25:16,960 --> 00:25:18,000 Speaker 2: Like are we dating the same person? 470 00:25:18,119 --> 00:25:18,240 Speaker 3: Is that? 471 00:25:18,320 --> 00:25:20,000 Speaker 2: Like? Weren't asking is this? No? 472 00:25:20,080 --> 00:25:23,400 Speaker 1: It's like the same thing. So even even the Facebook app, right, 473 00:25:23,800 --> 00:25:26,920 Speaker 1: they're literally posting a picture asking does anyone know this guy? 474 00:25:27,280 --> 00:25:30,040 Speaker 1: And the comments underneath are a whole bunch of women 475 00:25:30,480 --> 00:25:33,679 Speaker 1: who may or may not have had experiences with him and 476 00:25:33,720 --> 00:25:36,280 Speaker 1: they're sharing their relationship with him, and some of. 477 00:25:36,200 --> 00:25:38,320 Speaker 2: It could be like, oh, good dude is great, I 478 00:25:38,520 --> 00:25:45,320 Speaker 2: had no problem? Or is it all like pulling somebody's credit.
479 00:25:46,400 --> 00:25:49,000 Speaker 1: It's the whole facts, y'all know how men be wanting 480 00:25:49,040 --> 00:25:52,720 Speaker 1: the whole facts on women and like figuring out who 481 00:25:52,800 --> 00:25:55,560 Speaker 1: she done smashed and passed and all that. For women, 482 00:25:56,160 --> 00:25:58,000 Speaker 1: what we kind of want to know about men is, 483 00:25:58,160 --> 00:26:02,320 Speaker 1: I mean, dick size and if they're like a 484 00:26:02,359 --> 00:26:04,560 Speaker 1: good guy or if they're a fucking liar. I mean, 485 00:26:04,680 --> 00:26:09,159 Speaker 1: oftentimes what you want to find out is is this 486 00:26:09,240 --> 00:26:12,199 Speaker 1: man who he says he is. And so before we 487 00:26:12,240 --> 00:26:14,560 Speaker 1: get into the talk about AI, I want to share 488 00:26:14,560 --> 00:26:18,920 Speaker 1: another resource that ladies, you should use, and it's as 489 00:26:18,920 --> 00:26:23,880 Speaker 1: low as nine ninety nine per person. It's called the fucking White Pages. 490 00:26:24,560 --> 00:26:26,680 Speaker 1: And I did this for a homegirl of mine. Check 491 00:26:26,760 --> 00:26:29,640 Speaker 1: this out. So it's the White Pages. You pay nine 492 00:26:29,720 --> 00:26:33,040 Speaker 1: ninety nine for a full search on an individual. When 493 00:26:33,080 --> 00:26:36,200 Speaker 1: I tell you, it will tell you their parents' name, 494 00:26:36,280 --> 00:26:39,760 Speaker 1: their siblings' names, if ever they were married, any houses 495 00:26:39,800 --> 00:26:42,160 Speaker 1: that they own, traffic tickets that they may have had, 496 00:26:42,960 --> 00:26:44,560 Speaker 1: jobs that they've been attached to. 497 00:26:45,000 --> 00:26:46,520 Speaker 3: How do you protect yourself against them? 498 00:26:46,920 --> 00:26:49,160 Speaker 1: So for me, you don't. At this point, we all 499 00:26:49,880 --> 00:26:53,720 Speaker 1: and I'm I'm excited to talk about the AI.
Essentially, 500 00:26:53,760 --> 00:26:56,119 Speaker 1: whenever you sign up for a social media platform, there 501 00:26:56,160 --> 00:26:59,440 Speaker 1: are terms and conditions. You're uploading your face, you're uploading 502 00:26:59,440 --> 00:27:02,239 Speaker 1: where you work, if you have LinkedIn that has your 503 00:27:02,240 --> 00:27:04,560 Speaker 1: whole job history, if you go and get a 504 00:27:04,600 --> 00:27:08,440 Speaker 1: state ID, the same way we look up mug shots, 505 00:27:08,760 --> 00:27:11,960 Speaker 1: your traffic, like all of those things are on the 506 00:27:12,160 --> 00:27:15,760 Speaker 1: World Wide Web. You are not a ghost in the society. 507 00:27:16,119 --> 00:27:19,359 Speaker 1: And so White Pages keeps track of you know where 508 00:27:19,720 --> 00:27:22,400 Speaker 1: all you've lived where all you've moved to if you've 509 00:27:22,400 --> 00:27:26,520 Speaker 1: ever changed your ID. And so I remember my homegirl 510 00:27:26,600 --> 00:27:30,720 Speaker 1: was talking to a guy who on the holidays. Somehow 511 00:27:31,480 --> 00:27:37,280 Speaker 1: he just wasn't around Valentine's, Christmas, Thanksgiving. There was just 512 00:27:37,520 --> 00:27:41,280 Speaker 1: parts where We've been dating for over a year now. 513 00:27:41,480 --> 00:27:44,480 Speaker 1: I'm confused why we haven't been able to share these 514 00:27:44,480 --> 00:27:46,600 Speaker 1: moments together. And so I was like, girl, let me 515 00:27:46,600 --> 00:27:48,320 Speaker 1: help you out here. Let me, let me know his name. 516 00:27:49,400 --> 00:27:51,119 Speaker 1: You're worth the nine ninety nine because you're my 517 00:27:51,119 --> 00:27:54,200 Speaker 1: best friend, and so I paid the nine ninety nine 518 00:27:54,200 --> 00:27:59,639 Speaker 1: on White Pages. Literally was like is this Have you 519 00:27:59,680 --> 00:28:01,840 Speaker 1: heard of this name? Have you heard of this name?
520 00:28:02,480 --> 00:28:04,560 Speaker 1: When I tell you there was an address on there 521 00:28:04,600 --> 00:28:07,359 Speaker 1: that she never heard of, she pulled up to the 522 00:28:07,359 --> 00:28:11,200 Speaker 1: address and guess who answered the door? His fucking wife. 523 00:28:12,760 --> 00:28:14,680 Speaker 1: She didn't even know the nigga was married. Mind you, 524 00:28:14,720 --> 00:28:17,840 Speaker 1: hold on. She also only knew of two kids, not three. 525 00:28:17,960 --> 00:28:21,120 Speaker 1: He completely chose to leave out the whole junior that 526 00:28:21,200 --> 00:28:24,480 Speaker 1: he just had. And so it was like, mind you, 527 00:28:24,520 --> 00:28:27,840 Speaker 1: I could look into the traffic tickets that he 528 00:28:27,920 --> 00:28:29,879 Speaker 1: got across certain states. I said, do you know if 529 00:28:29,880 --> 00:28:32,320 Speaker 1: he ever lived here? Because he got two speeding tickets 530 00:28:32,320 --> 00:28:35,879 Speaker 1: in Cleveland. He got something here in Miami and she 531 00:28:36,080 --> 00:28:38,120 Speaker 1: was like, oh, yeah, this is him. Pulled up to 532 00:28:38,160 --> 00:28:41,120 Speaker 1: that house. Lo and behold, the wife answered. So, if there's 533 00:28:41,160 --> 00:28:42,520 Speaker 1: a lot of things that you want to know about 534 00:28:42,520 --> 00:28:44,160 Speaker 1: a person, you can kind of gauge that. It'll tell 535 00:28:44,200 --> 00:28:48,560 Speaker 1: you everything from the numbers associated to that name, the addresses. 536 00:28:48,640 --> 00:28:52,360 Speaker 1: It's crazy, and it's to me, it is scary because 537 00:28:52,440 --> 00:28:54,600 Speaker 1: especially with celebrities. That's why we know people have showed 538 00:28:54,680 --> 00:28:58,040 Speaker 1: up to Chris Brown's house.
It's you know, it gets scary, 539 00:28:58,080 --> 00:29:00,640 Speaker 1: like there's a breach of security there for you as 540 00:29:00,640 --> 00:29:03,480 Speaker 1: a human being because people are unwell out here. But 541 00:29:03,680 --> 00:29:07,959 Speaker 1: at the same time, unfortunately, we are meeting people's representative 542 00:29:08,080 --> 00:29:12,600 Speaker 1: often and so as a woman, I was just going 543 00:29:12,640 --> 00:29:16,600 Speaker 1: on there to read the tea. I would not advise 544 00:29:16,640 --> 00:29:18,440 Speaker 1: anyone to sign up for this app. I think that 545 00:29:18,520 --> 00:29:21,000 Speaker 1: it's cruel. I think that it's mean, and things like 546 00:29:21,040 --> 00:29:23,920 Speaker 1: this I don't think should be inserted into our society 547 00:29:24,040 --> 00:29:24,600 Speaker 1: at the moment. 548 00:29:25,120 --> 00:29:26,760 Speaker 4: I just want to add something to it. I saw 549 00:29:26,800 --> 00:29:28,880 Speaker 4: this YouTuber. I can't, I can't remember his name. 550 00:29:29,400 --> 00:29:31,400 Speaker 4: Pardon me, but he was kind of doing a 551 00:29:33,080 --> 00:29:35,120 Speaker 4: I won't say exposé, but he was doing a 552 00:29:35,120 --> 00:29:37,480 Speaker 4: piece about the t app and one of the things 553 00:29:37,240 --> 00:29:39,560 Speaker 4: he mentioned is that if you look at the fellas 554 00:29:39,560 --> 00:29:43,880 Speaker 4: that's on the app, these are people that you probably 555 00:29:44,720 --> 00:29:48,320 Speaker 4: should know already ain't shit. 556 00:29:48,040 --> 00:29:50,959 Speaker 1: Why you say that? Because there's regular guys on there, 557 00:29:51,000 --> 00:29:51,560 Speaker 1: it's not all.
558 00:29:51,400 --> 00:29:54,640 Speaker 4: Celebrities again, it was it was the volume of guys 559 00:29:54,680 --> 00:29:57,920 Speaker 4: that he was showing is like money phone guys watching 560 00:29:58,560 --> 00:30:01,600 Speaker 4: and it's like, you want to know about him. 561 00:30:02,200 --> 00:30:05,480 Speaker 1: That he is like that, you want to know what 562 00:30:05,480 --> 00:30:09,360 Speaker 1: I think makes it worse. They were posting the women's 563 00:30:09,560 --> 00:30:13,600 Speaker 1: IDs, and this is, this is what I hate about it, 564 00:30:13,960 --> 00:30:16,920 Speaker 1: because the male app, well, there was a breach of data. 565 00:30:17,000 --> 00:30:19,840 Speaker 1: With the breach with the leak, people were able to 566 00:30:19,840 --> 00:30:22,240 Speaker 1: see the women on this app, and so all you 567 00:30:22,320 --> 00:30:26,520 Speaker 1: saw was five-two, two thirty-five, five-six, three oh 568 00:30:26,600 --> 00:30:31,120 Speaker 1: five, and it was literally the sizes of the 569 00:30:31,160 --> 00:30:34,120 Speaker 1: women even on this app. And I'll be here to tell. 570 00:30:33,960 --> 00:30:35,080 Speaker 3: You go ahead, talk about it. 571 00:30:35,120 --> 00:30:40,360 Speaker 1: Go ahead, because in the most ignorant fashion, y'all is ignorant. 572 00:30:40,560 --> 00:30:46,840 Speaker 1: If y'all don't think big bitches is getting dick, I 573 00:30:46,960 --> 00:30:51,960 Speaker 1: did someone someone wrote something about uh oh, I don't 574 00:30:51,960 --> 00:30:53,920 Speaker 1: even know what the correlation was between you. 575 00:30:53,880 --> 00:30:55,440 Speaker 2: It's like when you get skinny and you start ho-ing 576 00:30:55,520 --> 00:30:55,760 Speaker 2: or something. 577 00:30:55,800 --> 00:30:58,520 Speaker 1: But yeah, someone said, bitches lose weight and start ho-ing. 578 00:30:59,320 --> 00:31:02,560 Speaker 1: Baby, bitches is ho-ing, and it's big bitches.
And you 579 00:31:02,600 --> 00:31:06,200 Speaker 1: know why, the more in shape a man, the more 580 00:31:06,240 --> 00:31:10,480 Speaker 1: slimmer man, the bigger the bitch. I in my prime 581 00:31:10,680 --> 00:31:14,560 Speaker 1: at two hundred and thirty pounds, baby oh, I only 582 00:31:14,600 --> 00:31:20,959 Speaker 1: fucked blamino models and gym rats. Swear to God, like 583 00:31:21,800 --> 00:31:24,360 Speaker 1: I could count the packs on the god damn body 584 00:31:24,520 --> 00:31:27,920 Speaker 1: like one, two, three, four, five, six, seven, eight. You got 585 00:31:27,960 --> 00:31:31,880 Speaker 1: an eight pack. The more in shape of a man, 586 00:31:31,920 --> 00:31:34,239 Speaker 1: I don't know, the fluffier the woman. And so 587 00:31:34,280 --> 00:31:37,160 Speaker 1: it's crazy like people are trying to bash these women 588 00:31:37,240 --> 00:31:41,080 Speaker 1: for their sizes on this app, right, and it's like, bro, 589 00:31:41,880 --> 00:31:46,200 Speaker 1: they are getting beat down to the mattress. Okay, big 590 00:31:46,240 --> 00:31:48,719 Speaker 1: girls is taking that thing, all right, And so I 591 00:31:48,800 --> 00:31:51,880 Speaker 1: just hate too that there becomes this element of body 592 00:31:51,880 --> 00:31:54,840 Speaker 1: shaming again. Go check out Cooked, my body shaming fat 593 00:31:54,920 --> 00:31:58,640 Speaker 1: shaming app episode. What do you mean cook? We've been cooked. 594 00:32:00,080 --> 00:32:02,560 Speaker 4: Fry and all the shit that happened politically as a 595 00:32:02,600 --> 00:32:05,680 Speaker 4: finale and this is and we got a tea app. 596 00:32:05,640 --> 00:32:09,440 Speaker 1: No, no, no, we're fried hard. We, we, we 597 00:32:09,480 --> 00:32:11,760 Speaker 1: are fried like the lemon pepper wings in Magic City. 598 00:32:12,920 --> 00:32:17,800 Speaker 1: Fried hard.
Okay, it is terrible. Speaking of that, let's 599 00:32:17,880 --> 00:32:20,360 Speaker 1: go ahead and get into the topic of the matter. 600 00:32:20,880 --> 00:32:25,640 Speaker 1: We're going from an app literally to all things, uh, 601 00:32:25,680 --> 00:32:30,040 Speaker 1: technology and AI. Y'all know, since we are now part 602 00:32:30,080 --> 00:32:34,880 Speaker 1: of the Black Effect Network, we have added the double 603 00:32:34,920 --> 00:32:38,600 Speaker 1: down or take it back segment where something is pulled 604 00:32:38,600 --> 00:32:41,520 Speaker 1: from something I've said in the goddamn past to kind 605 00:32:41,520 --> 00:32:45,680 Speaker 1: of rehash it before we dig deeper into our subject matter. 606 00:32:46,400 --> 00:32:50,920 Speaker 1: And this one is literally within the subject matter. So 607 00:32:50,960 --> 00:32:53,440 Speaker 1: there was an episode that Weezy and I did 608 00:32:53,600 --> 00:32:57,160 Speaker 1: over on Horrible Decisions. This is episode three oh nine 609 00:32:57,600 --> 00:33:01,880 Speaker 1: where we talked about at the time the surge of 610 00:33:02,480 --> 00:33:06,760 Speaker 1: deep fakes. Now, deep fakes are where you can put 611 00:33:06,960 --> 00:33:10,800 Speaker 1: your face on an actual body. We are way past 612 00:33:10,800 --> 00:33:13,720 Speaker 1: deep fakes now because you can actually just create a 613 00:33:13,760 --> 00:33:18,200 Speaker 1: whole AI simulation of a person with hands, body parts, 614 00:33:18,240 --> 00:33:22,320 Speaker 1: all the things. But we talked about deep fakes, and Jason, 615 00:33:22,320 --> 00:33:24,840 Speaker 1: you got that clip of what I had to say, 616 00:33:26,200 --> 00:33:32,680 Speaker 1: got it right here, all right, Celebrity porn. I would 617 00:33:32,760 --> 00:33:37,400 Speaker 1: say a clean percent of it is the face of 618 00:33:37,440 --> 00:33:40,840 Speaker 1: a celebrity, but it's fake porn.
It's deep fake porn, 619 00:33:41,520 --> 00:33:43,520 Speaker 1: Like because I was looking at you, I'll be looking 620 00:33:43,520 --> 00:33:45,440 Speaker 1: at celebrity porn from time to time. Just you got it. 621 00:33:45,720 --> 00:33:47,760 Speaker 1: And it's celebrities that I know never really did sex tapes 622 00:33:47,840 --> 00:33:50,920 Speaker 1: and their faces are put onto bodies and that AI 623 00:33:51,280 --> 00:33:55,120 Speaker 1: and they're sucking. Would you want to sue, girl? 624 00:33:55,480 --> 00:33:55,520 Speaker 2: No? 625 00:33:56,160 --> 00:33:59,840 Speaker 1: Okay, you don't think that it should be like if 626 00:34:00,080 --> 00:34:03,520 Speaker 1: someone put my face on AI and it's a 627 00:34:03,520 --> 00:34:05,480 Speaker 1: bukkake scene, shout out to you. 628 00:34:06,400 --> 00:34:07,880 Speaker 2: I not. 629 00:34:10,719 --> 00:34:12,759 Speaker 1: You know, I'm bringing up, I've never been in porn, for 630 00:34:12,800 --> 00:34:16,319 Speaker 1: the new listeners. I just enjoy viewing it. And so 631 00:34:16,680 --> 00:34:19,480 Speaker 1: if somebody is like, oh my god, I wish I 632 00:34:19,480 --> 00:34:23,319 Speaker 1: could see this and somebody took a photo that's on Instagram, 633 00:34:23,480 --> 00:34:26,080 Speaker 1: mind you. I know that when I upload any images 634 00:34:26,120 --> 00:34:28,760 Speaker 1: of myself it's for open use of the world. 635 00:34:30,040 --> 00:34:32,840 Speaker 2: What is it? Terms and conditions? Celebrities and conditions. 636 00:34:33,000 --> 00:34:37,200 Speaker 1: Terms and conditions apply. So my take was that I 637 00:34:37,239 --> 00:34:43,200 Speaker 1: wouldn't sue today, but today, I honestly don't think you can. Like, 638 00:34:44,360 --> 00:34:46,279 Speaker 1: by the way, I'm I'm aware of. 639 00:34:46,239 --> 00:34:50,800 Speaker 4: Your favorite rapperwise, I just want to throw that in me, 640 00:34:50,880 --> 00:34:53,400 Speaker 4: I said Everyder.
641 00:34:53,440 --> 00:34:55,600 Speaker 1: First off, Drake is not is he even talking about 642 00:34:55,640 --> 00:35:02,880 Speaker 1: Drake Cardi who lobby calling lobbry is crazy. He is 643 00:35:02,920 --> 00:35:06,000 Speaker 1: suing over something. I think he's justified for. To me, 644 00:35:07,880 --> 00:35:11,120 Speaker 1: in terms of a deep fake or an AI version 645 00:35:11,120 --> 00:35:14,799 Speaker 1: of myself. I loved the video. I don't know if 646 00:35:14,880 --> 00:35:17,839 Speaker 1: y'all saw it. It was during the election time, and 647 00:35:17,880 --> 00:35:21,160 Speaker 1: it was Kamala riding around with a forty, Trump with 648 00:35:21,320 --> 00:35:25,799 Speaker 1: an AK running into a deli. It was I think 649 00:35:25,840 --> 00:35:28,360 Speaker 1: Bernie Sanders might have been in the clip, and I 650 00:35:28,400 --> 00:35:32,000 Speaker 1: found it to be entertaining to me as a sex 651 00:35:32,120 --> 00:35:36,479 Speaker 1: podcaster or someone who's talked about that for as many 652 00:35:36,680 --> 00:35:39,840 Speaker 1: years as I have. If someone chose to put me 653 00:35:39,920 --> 00:35:43,399 Speaker 1: on somebody taking back shots, I think as long as 654 00:35:43,440 --> 00:35:46,360 Speaker 1: it's not something that impacted or affected me, which I 655 00:35:46,360 --> 00:35:48,440 Speaker 1: don't think it would because I would literally just say, 656 00:35:48,880 --> 00:35:52,799 Speaker 1: this is not me. What am I supposed to do? Like? 657 00:35:52,960 --> 00:35:55,400 Speaker 4: Well, I think the issue is what if the user 658 00:35:55,520 --> 00:35:58,440 Speaker 4: or the generator is monetizing it. 659 00:35:59,200 --> 00:35:59,920 Speaker 3: I think that's way. 660 00:36:00,320 --> 00:36:03,839 Speaker 1: issue which we're gonna get into it. Shut up.
661 00:36:05,000 --> 00:36:07,880 Speaker 2: I feel like you speak about like revenge porn a 662 00:36:07,880 --> 00:36:10,439 Speaker 2: lot. And, and yeah, revenge no, because people, because 663 00:36:10,440 --> 00:36:12,759 Speaker 2: people put this under the deep they put the deep 664 00:36:12,760 --> 00:36:14,959 Speaker 2: fake porns into the revenge porn category, and they're wrong as fuck. 665 00:36:15,120 --> 00:36:17,800 Speaker 1: They're wrong as fuck. Here's why. Someone who creates a 666 00:36:17,840 --> 00:36:19,920 Speaker 1: deep fake, right, whether it's someone I dealt with or not, 667 00:36:20,400 --> 00:36:25,439 Speaker 1: it's an AI generated image, that's deep fake. 668 00:36:25,960 --> 00:36:31,320 Speaker 1: Revenge porn is literally, hopefully something that was consensually shot 669 00:36:31,400 --> 00:36:34,839 Speaker 1: between two people having sex, and it's normally the other 670 00:36:35,000 --> 00:36:39,640 Speaker 1: party within that video leaking it. So revenge porn 671 00:36:39,920 --> 00:36:43,960 Speaker 1: is to dehumanize and embarrass and put out an intimate 672 00:36:44,000 --> 00:36:46,960 Speaker 1: moment shared between two people normally because one of the 673 00:36:46,960 --> 00:36:50,239 Speaker 1: parties involved are upset, which I love that.
It is 674 00:36:50,280 --> 00:36:53,560 Speaker 1: now a criminal offense, by the way, if y'all do 675 00:36:53,640 --> 00:36:56,759 Speaker 1: that. It right now is not a criminal offense to 676 00:36:57,719 --> 00:37:01,680 Speaker 1: use AI. And what I love about YouTube, what 677 00:37:01,760 --> 00:37:06,239 Speaker 1: I love about YouTube, is it is now literally demonetizing 678 00:37:06,600 --> 00:37:10,880 Speaker 1: people who are putting out tons of either content with 679 00:37:10,960 --> 00:37:14,760 Speaker 1: the use of AI or content that is not owned 680 00:37:14,760 --> 00:37:18,480 Speaker 1: by them, content where it's just talking head, not even 681 00:37:18,520 --> 00:37:21,360 Speaker 1: a talking head. But shout out to the one place 682 00:37:21,480 --> 00:37:24,879 Speaker 1: not monetized anymore. I know that I ain't even gonna 683 00:37:24,880 --> 00:37:28,040 Speaker 1: shout them out. But his name starts with a D 684 00:37:28,880 --> 00:37:31,240 Speaker 1: ends with a Y, and I love that that nigga 685 00:37:31,280 --> 00:37:33,920 Speaker 1: can't make money no mo off of talking shit about people. 686 00:37:34,320 --> 00:37:39,160 Speaker 1: I'm here for it. Again, to me, I am doubling 687 00:37:39,200 --> 00:37:41,560 Speaker 1: down that I wouldn't sue. But I do think that 688 00:37:42,239 --> 00:37:46,200 Speaker 1: with this advancement of AI technology and what we're able 689 00:37:46,239 --> 00:37:49,320 Speaker 1: to make and create and do all the things, I 690 00:37:49,360 --> 00:37:53,920 Speaker 1: am glad that there are platforms that are now putting 691 00:37:54,160 --> 00:37:57,840 Speaker 1: things in place to where people can't monetize off of that. 692 00:37:58,280 --> 00:38:00,719 Speaker 1: And so if you go to Pornhub, I would 693 00:38:00,800 --> 00:38:03,600 Speaker 1: highly doubt that the people uploading these fake celebrity deep 694 00:38:03,640 --> 00:38:08,040 Speaker 1: fakes are seeing any money from it.
It's probably just 695 00:38:08,080 --> 00:38:11,560 Speaker 1: shits and giggles. And to be fair, I used to 696 00:38:11,719 --> 00:38:15,799 Speaker 1: go into the message boards when I 697 00:38:15,840 --> 00:38:20,520 Speaker 1: was young, and I used to literally read about B2K, 698 00:38:20,600 --> 00:38:24,000 Speaker 1: Chris Brown, possibly running a gang bang on 699 00:38:24,000 --> 00:38:26,279 Speaker 1: myself, like, that's how you would read it, as 700 00:38:26,320 --> 00:38:29,880 Speaker 1: erotica, bitch. I would love to be able to see that. 701 00:38:30,920 --> 00:38:34,040 Speaker 1: I would like to see that AI generated me 702 00:38:34,200 --> 00:38:36,560 Speaker 1: being bent over, not as a teenager, but now as 703 00:38:36,560 --> 00:38:38,959 Speaker 1: an adult. I would love to be able to see 704 00:38:39,000 --> 00:38:42,799 Speaker 1: myself taking it from the back from Chris Brown and 705 00:38:42,880 --> 00:38:46,239 Speaker 1: sucking Jason Momoa off or some shit. Like, 706 00:38:46,360 --> 00:38:49,160 Speaker 1: I'm here to see myself enjoy that experience. 707 00:38:50,080 --> 00:38:53,600 Speaker 3: How long is it gonna take, when this episode drops, 708 00:38:54,600 --> 00:38:56,320 Speaker 3: before somebody starts? 709 00:38:57,440 --> 00:39:00,480 Speaker 1: Hey, I ain't gonna hold you. There's plenty of images 710 00:39:00,960 --> 00:39:03,120 Speaker 1: of Chris Brown on his tour right now, with the 711 00:39:03,200 --> 00:39:07,080 Speaker 1: right faces that I need, of him gyrating behind me. 712 00:39:07,239 --> 00:39:10,080 Speaker 1: And then yeah, give Jason Momoa like an eight 713 00:39:10,120 --> 00:39:11,799 Speaker 1: inch dick and put it in my mouth. I'm here 714 00:39:11,840 --> 00:39:14,239 Speaker 1: for it, and send it to me so that I 715 00:39:14,280 --> 00:39:15,200 Speaker 1: can go to sleep.
716 00:39:15,520 --> 00:39:19,040 Speaker 3: Well, do a version of Jason Momoa from Minecraft. 717 00:39:20,960 --> 00:39:26,560 Speaker 1: Why? Just because. I'll take Aquaman. I'll take even a 718 00:39:26,600 --> 00:39:29,359 Speaker 1: fish scale on his dick. Okay, I don't care. Give 719 00:39:29,400 --> 00:39:32,359 Speaker 1: me Aquaman and Minecraft man. There we go, give 720 00:39:32,360 --> 00:39:38,600 Speaker 1: me both the men. All right. Yeah, I ain't gonna 721 00:39:38,640 --> 00:39:42,200 Speaker 1: hold you. I'm actually really, I guess we'll dig into 722 00:39:42,320 --> 00:39:48,520 Speaker 1: all the AI of it all. I'm personally excited about 723 00:39:48,719 --> 00:39:51,480 Speaker 1: the use of AI. And at this point, 724 00:39:51,520 --> 00:39:53,120 Speaker 1: you might need to remove all these ums, because the 725 00:39:53,160 --> 00:39:57,960 Speaker 1: bitch keeps saying them today and I don't know why. Anyways, overall, 726 00:39:58,239 --> 00:40:02,920 Speaker 1: I'm excited to be learning all the AI tools 727 00:40:02,920 --> 00:40:04,480 Speaker 1: there are to offer. I just had a conversation 728 00:40:04,520 --> 00:40:06,799 Speaker 1: with antnet the other day and I was like, bitch, 729 00:40:06,800 --> 00:40:10,480 Speaker 1: if you know any courses, classes, let me know the 730 00:40:10,520 --> 00:40:13,839 Speaker 1: cost of them. Let's get into it, because we're at 731 00:40:13,880 --> 00:40:18,120 Speaker 1: a place where, as content creators, as entrepreneurs, this is 732 00:40:18,120 --> 00:40:22,040 Speaker 1: what you gotta do. And we'll have another conversation in 733 00:40:22,080 --> 00:40:27,799 Speaker 1: the coming weeks about ROIs and paying people what they're 734 00:40:27,800 --> 00:40:30,759 Speaker 1: worth and all of those things.
But to me, it 735 00:40:30,880 --> 00:40:35,440 Speaker 1: is a way to save on money, to generate a 736 00:40:35,560 --> 00:40:39,080 Speaker 1: lot more content for a lot less money in a 737 00:40:39,080 --> 00:40:44,040 Speaker 1: more efficient way. And I'm not mad, personally, right now, 738 00:40:44,760 --> 00:40:48,399 Speaker 1: at any business integrating AI to help them. 739 00:40:48,680 --> 00:40:49,120 Speaker 3: For sure. 740 00:40:49,280 --> 00:40:51,560 Speaker 1: There are two, however, I would like to speak to, 741 00:40:52,520 --> 00:40:55,480 Speaker 1: if you do not mind. The first one, I want 742 00:40:55,480 --> 00:40:58,839 Speaker 1: to say, I'm not mad at. Second one, I want 743 00:40:58,840 --> 00:41:02,839 Speaker 1: to say, bitch, fuck you. So the first one I 744 00:41:02,840 --> 00:41:07,200 Speaker 1: want to get into is Uber. Also Lyft. Lyft has 745 00:41:07,200 --> 00:41:11,759 Speaker 1: an automated mobility line as well coming out. 746 00:41:11,800 --> 00:41:14,040 Speaker 1: I just saw it in Atlanta. Now here's the thing, 747 00:41:14,680 --> 00:41:17,720 Speaker 1: and here's maybe the difference in the businesses. With Uber, 748 00:41:17,840 --> 00:41:22,480 Speaker 1: they have Waymo. Waymo is a self-automated driving vehicle, 749 00:41:23,000 --> 00:41:27,240 Speaker 1: and bitch, it's Jaguars. Lyft got Toyota Siennas, nigga, 750 00:41:27,280 --> 00:41:30,000 Speaker 1: and I was like, all right now, not you putting 751 00:41:30,040 --> 00:41:39,719 Speaker 1: us in some motherfucking minivan. Bitch, Lyft got Siennas, and I 752 00:41:39,800 --> 00:41:42,680 Speaker 1: was like, all right now, if I'm going to sit 753 00:41:42,760 --> 00:41:47,719 Speaker 1: here and get in a car where it's automated and 754 00:41:47,800 --> 00:41:52,040 Speaker 1: driving by itself, then let me get in a Jaguar, hun. 755 00:41:52,480 --> 00:41:54,720 Speaker 1: Here's why I do not have a problem with it.
756 00:41:54,800 --> 00:42:01,360 Speaker 1: One of two things. Accidents happen with people driving too, 757 00:42:02,840 --> 00:42:09,600 Speaker 1: and to me, human error is just as possible 758 00:42:09,680 --> 00:42:12,840 Speaker 1: to me as a self-automated vehicle with all the 759 00:42:12,880 --> 00:42:15,200 Speaker 1: sensors and things that it has around it. Mind you, 760 00:42:15,239 --> 00:42:18,080 Speaker 1: they do a lot of testing before those things actually go 761 00:42:18,120 --> 00:42:21,960 Speaker 1: out into the world. And so when I was at 762 00:42:21,960 --> 00:42:26,480 Speaker 1: All Star Weekend, Waymos were already present in San 763 00:42:26,520 --> 00:42:29,640 Speaker 1: Francisco. Rode it. They make you buckle up. It says 764 00:42:29,680 --> 00:42:32,000 Speaker 1: your name when you enter. What I love more than 765 00:42:32,040 --> 00:42:35,120 Speaker 1: anything is that there ain't nobody in there talking to 766 00:42:35,200 --> 00:42:39,360 Speaker 1: you. Let me tell you how, real quick. In Atlanta, 767 00:42:39,440 --> 00:42:42,680 Speaker 1: Ubers, y'all motherfucking different. This is why I really 768 00:42:42,719 --> 00:42:44,239 Speaker 1: like New York; it's one of the best cities in 769 00:42:44,280 --> 00:42:48,480 Speaker 1: the world, because New York not only has great service 770 00:42:49,960 --> 00:42:54,000 Speaker 1: in like restaurants, where they're fast and efficient. Baby, niggas 771 00:42:54,040 --> 00:42:56,600 Speaker 1: know when they're working, and when you're in an Uber, 772 00:42:56,800 --> 00:43:00,320 Speaker 1: when you're in a Lyft, you're fucking working. What you 773 00:43:00,360 --> 00:43:05,919 Speaker 1: shouldn't do on the job? Have personal conversations around customers 774 00:43:06,080 --> 00:43:10,040 Speaker 1: and clients.
So I'm in a car heading 775 00:43:10,120 --> 00:43:12,160 Speaker 1: to go get stretched, because baby, these hips got to 776 00:43:12,200 --> 00:43:14,759 Speaker 1: get stretched. So I went to StretchLab, which, by 777 00:43:14,760 --> 00:43:16,200 Speaker 1: the way, it was funny, they were talking about how 778 00:43:16,320 --> 00:43:19,359 Speaker 1: they won't be losing their jobs because, you know, it's 779 00:43:19,360 --> 00:43:21,560 Speaker 1: a person doing the stretching, so they 780 00:43:21,560 --> 00:43:23,600 Speaker 1: ain't worried about losing their jobs to AI. So I'm 781 00:43:24,200 --> 00:43:26,799 Speaker 1: in the car and I'm talking to a king, 782 00:43:27,400 --> 00:43:29,920 Speaker 1: and as I'm on the phone call, next thing you know, 783 00:43:29,960 --> 00:43:34,400 Speaker 1: it's, sorry, I don't know Spanish, but that's what happened. 784 00:43:36,880 --> 00:43:37,440 Speaker 3: I was. 785 00:43:40,239 --> 00:43:43,960 Speaker 1: Ay, no, no, no, no, no. So they're on 786 00:43:44,000 --> 00:43:48,040 Speaker 1: the phone now, speaking Spanish, and because I'm on the phone, 787 00:43:49,000 --> 00:43:52,000 Speaker 1: he's now speaking even louder than me. So then I'm 788 00:43:52,000 --> 00:43:54,120 Speaker 1: trying to speak louder than a king. And I realized, 789 00:43:54,800 --> 00:43:57,640 Speaker 1: wait a second, I'm a fucking customer. So I said, 790 00:43:57,920 --> 00:44:02,920 Speaker 1: excuse you. Yeah, no, no, a king didn't know what 791 00:44:02,960 --> 00:44:05,279 Speaker 1: was going on. I said, excuse you, and he looked 792 00:44:05,320 --> 00:44:12,680 Speaker 1: back at me and I said, I'm on the phone. 793 00:44:15,520 --> 00:44:17,919 Speaker 1: And I was like, bruh, it's a lot of fun. 794 00:44:18,120 --> 00:44:19,640 Speaker 1: I can't say.
795 00:44:19,360 --> 00:44:22,800 Speaker 2: People, so wait, so wait, but tell me 796 00:44:22,840 --> 00:44:24,440 Speaker 2: about the Waymo experience. 797 00:44:24,520 --> 00:44:25,719 Speaker 1: The Waymo experience was cool. 798 00:44:25,719 --> 00:44:28,120 Speaker 5: Because that's, that's an interesting, no. 799 00:44:28,480 --> 00:44:33,200 Speaker 1: I mean, you lock up, you lock up. It follows 800 00:44:33,239 --> 00:44:37,879 Speaker 1: the rules. It stops at stop signs. Stops at late night? 801 00:44:38,360 --> 00:44:40,520 Speaker 1: We drove midday. We drove midday. By the way, 802 00:44:40,920 --> 00:44:43,319 Speaker 1: to my knowledge, Waymos also aren't allowed to go to 803 00:44:43,560 --> 00:44:48,600 Speaker 1: airports and highly congested areas, and so it's 804 00:44:48,640 --> 00:44:51,000 Speaker 1: street driving. I also don't think they get 805 00:44:51,000 --> 00:44:54,839 Speaker 1: on the highway right now. It's like local driving. I 806 00:44:54,960 --> 00:44:57,120 Speaker 1: just recently got in a Waymo coming from a 807 00:44:57,160 --> 00:45:01,160 Speaker 1: friend's house, and there was construction, so there was a 808 00:45:01,200 --> 00:45:03,480 Speaker 1: stop sign but also a stop light, but the stop 809 00:45:03,520 --> 00:45:05,960 Speaker 1: light wasn't working, and the car didn't know what the 810 00:45:06,000 --> 00:45:09,040 Speaker 1: fuck to do. So literally, I'm sitting there and I realized 811 00:45:09,080 --> 00:45:14,080 Speaker 1: the car ain't moving. As I'm sitting there, support dials 812 00:45:14,080 --> 00:45:16,359 Speaker 1: into the car and it's like, hey, we noticed your 813 00:45:16,400 --> 00:45:19,520 Speaker 1: car has stopped. Can you let us know? And I 814 00:45:19,560 --> 00:45:23,200 Speaker 1: was like, oh, well, this is what's up. They're like, okay. 815 00:45:22,480 --> 00:45:26,160 Speaker 1: I guess they could satellite see where I'm at.
816 00:45:26,920 --> 00:45:30,799 Speaker 1: They moved the car, getting me out of this 817 00:45:30,840 --> 00:45:34,439 Speaker 1: confused intersection, for the car to take over the rest 818 00:45:34,480 --> 00:45:37,319 Speaker 1: of the ride, and I was like, okay. So to 819 00:45:37,360 --> 00:45:40,960 Speaker 1: be fair, there is still somewhat of a human element. What? 820 00:45:42,680 --> 00:45:46,879 Speaker 4: Waymo's not factoring, especially in Atlanta, all these YNs 821 00:45:46,960 --> 00:45:48,120 Speaker 4: with Hellcats. 822 00:45:48,880 --> 00:45:50,800 Speaker 3: They can appear any given second. 823 00:45:51,440 --> 00:45:53,840 Speaker 4: So while you're on customer support and he's like, all right, 824 00:45:53,920 --> 00:45:57,279 Speaker 4: we could go, and it's like, what, you know? 825 00:45:57,400 --> 00:46:00,920 Speaker 3: There was the reaction time. Also, what if? 826 00:46:00,960 --> 00:46:02,520 Speaker 4: And I'm not trying to, I'm just thinking about the 827 00:46:02,520 --> 00:46:05,560 Speaker 4: worst case scenarios. Carjackings, right? Does Waymo have like 828 00:46:05,600 --> 00:46:08,040 Speaker 4: security features where somebody try to run up and open 829 00:46:08,080 --> 00:46:08,520 Speaker 4: the car door? 830 00:46:08,560 --> 00:46:10,640 Speaker 1: Well, you know what's funny, it has to. So honestly, 831 00:46:11,160 --> 00:46:14,040 Speaker 1: when you go up to a Waymo, I love 832 00:46:14,080 --> 00:46:17,480 Speaker 1: that you asked that. The technology in your app, because it's 833 00:46:17,480 --> 00:46:21,160 Speaker 1: connected to Uber, when you approach the car, you have 834 00:46:21,239 --> 00:46:22,520 Speaker 1: to hit unlock on your phone. 835 00:46:23,239 --> 00:46:24,080 Speaker 3: Oh, that's dope.
836 00:46:24,160 --> 00:46:25,520 Speaker 1: You have to hit unlock on your phone, and 837 00:46:25,520 --> 00:46:28,200 Speaker 1: then when it's time to get out, it literally gives 838 00:46:28,239 --> 00:46:31,360 Speaker 1: you a message, and it's like, pull the handle twice 839 00:46:31,440 --> 00:46:34,160 Speaker 1: to exit the vehicle. So you're not just, like, you 840 00:46:34,239 --> 00:46:38,200 Speaker 1: gotta literally, so listen. They thought of the thing, thought 841 00:46:38,239 --> 00:46:38,560 Speaker 1: of the thing. 842 00:46:38,680 --> 00:46:39,320 Speaker 3: Safety first. 843 00:46:39,480 --> 00:46:42,400 Speaker 4: I'm just thinking about that, especially women, you know, to 844 00:46:42,480 --> 00:46:43,719 Speaker 4: and from nightclubs or. 845 00:46:43,680 --> 00:46:45,880 Speaker 1: What I also love is it allows you to connect 846 00:46:45,880 --> 00:46:48,239 Speaker 1: to your phone, your playlist, if you want to listen 847 00:46:48,280 --> 00:46:49,880 Speaker 1: to your music, if you want to talk, 848 00:46:50,400 --> 00:46:52,880 Speaker 1: if you want to use the car system to have 849 00:46:52,920 --> 00:46:56,400 Speaker 1: your conversations. The ride is yours. And it does. 850 00:46:56,280 --> 00:46:58,240 Speaker 3: Make you, Mandy, how is your evening today? 851 00:46:58,880 --> 00:47:01,000 Speaker 1: Listen, I'm not mad at it. Well, no, no, no, no, 852 00:47:01,160 --> 00:47:02,359 Speaker 1: it don't do that, because it don't talk to you. 853 00:47:02,880 --> 00:47:06,839 Speaker 4: You program it to have some light conversation. Moderate, light, heavy. 854 00:47:06,800 --> 00:47:09,360 Speaker 1: So it doesn't have that feature just yet. But I 855 00:47:09,360 --> 00:47:10,960 Speaker 1: don't want to talk to a robot, nigga. When I 856 00:47:10,960 --> 00:47:13,360 Speaker 1: get in an Uber, I don't want to talk to nobody. 857 00:47:14,000 --> 00:47:15,719 Speaker 1: I don't want to talk to you.
I just want 858 00:47:15,760 --> 00:47:19,080 Speaker 1: to get to where I'm going. However, in terms of 859 00:47:19,120 --> 00:47:24,239 Speaker 1: transportation and AI use in company settings, this comes as 860 00:47:24,280 --> 00:47:31,200 Speaker 1: a Diamond Medallion customer. Okay, Delta, bitch, what is you 861 00:47:31,360 --> 00:47:35,920 Speaker 1: doing? So Delta just recently announced that they are rolling 862 00:47:35,920 --> 00:47:38,600 Speaker 1: out an AI feature, that they've already started implementing for 863 00:47:38,719 --> 00:47:43,239 Speaker 1: about five percent of their ticket purchasers, where somehow their 864 00:47:43,400 --> 00:47:47,440 Speaker 1: AI allows them to charge the price of what they 865 00:47:47,480 --> 00:47:50,880 Speaker 1: think this customer will pay for the flight. So it 866 00:47:51,000 --> 00:47:54,040 Speaker 1: used to be a system where, of course, if you 867 00:47:54,080 --> 00:47:57,600 Speaker 1: book it far enough in advance, you get a cheaper rate. 868 00:47:57,960 --> 00:48:00,879 Speaker 1: If it's closer to the time, the price is higher. No, 869 00:48:01,440 --> 00:48:05,279 Speaker 1: they're now using AI to scan your, I don't know 870 00:48:05,320 --> 00:48:08,759 Speaker 1: if it's your search history. Say, if they see you're 871 00:48:08,800 --> 00:48:12,279 Speaker 1: looking for a wedding dress, well, they know you're going 872 00:48:12,320 --> 00:48:15,520 Speaker 1: to a wedding. If you're looking for travel in this place, 873 00:48:15,560 --> 00:48:17,840 Speaker 1: they know you have to go there. They may charge 874 00:48:17,840 --> 00:48:20,520 Speaker 1: you higher because they know you want to go to 875 00:48:20,560 --> 00:48:25,080 Speaker 1: this wedding, because you gotta go.
They're also looking for, unfortunately, 876 00:48:25,200 --> 00:48:29,000 Speaker 1: like me, a Diamond Medallion customer who flies every week. 877 00:48:29,560 --> 00:48:32,400 Speaker 1: My tickets may now be higher than someone who's just 878 00:48:32,400 --> 00:48:35,200 Speaker 1: booking a random flight to the same city. The audacity. 879 00:48:35,640 --> 00:48:38,000 Speaker 1: Nah, I ain't going out like that. I'm about to have 880 00:48:38,000 --> 00:48:39,360 Speaker 1: my mama book my flight, run. 881 00:48:40,719 --> 00:48:43,040 Speaker 3: Run me my average tickets. 882 00:48:43,080 --> 00:48:45,600 Speaker 1: You see what I mean? I ain't gonna hold you. 883 00:48:45,719 --> 00:48:48,800 Speaker 1: I'm about to experience, for the first time in about 884 00:48:49,480 --> 00:48:52,680 Speaker 1: maybe five years, I'm going to Orlando to see family 885 00:48:53,000 --> 00:48:56,080 Speaker 1: in two weeks, and I'm literally going for a day 886 00:48:56,080 --> 00:48:58,759 Speaker 1: and a half. Me and my mama and 887 00:48:58,800 --> 00:49:01,160 Speaker 1: my sister, we're gonna go to Animal Kingdom and then 888 00:49:01,200 --> 00:49:04,880 Speaker 1: get drunk around the world at Epcot. And then my homegirl DeAndre 889 00:49:04,960 --> 00:49:07,600 Speaker 1: got a one-year-old pool party for 890 00:49:07,680 --> 00:49:10,680 Speaker 1: her son, so we're gonna play spades and shit, clearly. 891 00:49:11,080 --> 00:49:13,400 Speaker 1: And then I'm taking the flight out, and I'm like, 892 00:49:15,000 --> 00:49:19,000 Speaker 1: Delta is three hundred dollars, Frontier's 893 00:49:18,400 --> 00:49:24,320 Speaker 2: eighty, and I might do that. Listen, 894 00:49:24,080 --> 00:49:26,880 Speaker 1: if y'all, if y'all see me, I am gonna go incognito.
895 00:49:26,920 --> 00:49:31,080 Speaker 1: I might put a little mustache on, a hat, because 896 00:49:31,120 --> 00:49:34,560 Speaker 1: motherfuckers, like, even when I lived in New York, 897 00:49:35,200 --> 00:49:39,120 Speaker 1: the listeners of the pod and people 898 00:49:39,120 --> 00:49:41,359 Speaker 1: who knew me would be like, oh my god, why 899 00:49:41,360 --> 00:49:45,280 Speaker 1: ain't you on the train? Because maybe it's quicker, it's cheaper. 900 00:49:45,400 --> 00:49:48,160 Speaker 4: Let me be a human being. And so you can 901 00:49:48,160 --> 00:49:49,760 Speaker 4: still exercise that right now, y'all. 902 00:49:49,600 --> 00:49:51,400 Speaker 1: Better not a motherfucker come for me if I hop on that 903 00:49:51,440 --> 00:49:52,080 Speaker 1: Frontier flight. 904 00:49:52,160 --> 00:49:55,000 Speaker 3: Look, look, I'm not as rich as Mandy, right, but. 905 00:49:56,680 --> 00:49:59,680 Speaker 4: Which, I also saw a tweet earlier over the weekend, 906 00:50:00,640 --> 00:50:01,839 Speaker 4: people pocket watching her. 907 00:50:02,000 --> 00:50:05,200 Speaker 3: But wait, you go get the. 908 00:50:05,200 --> 00:50:10,960 Speaker 1: People pocket watching. Episode could go out? Check. Sorry, sorry, 909 00:50:11,560 --> 00:50:12,200 Speaker 1: got that check. 910 00:50:12,239 --> 00:50:18,080 Speaker 4: That account just hit, y'all. Go get, real time, listen, real 911 00:50:17,840 --> 00:50:20,600 Speaker 1: time, accountant just texted me like, yo, that check just hit. 912 00:50:20,800 --> 00:50:24,520 Speaker 3: Yeah, but no, Frontier's cool. I took it. 913 00:50:24,800 --> 00:50:28,480 Speaker 4: I take it every now and then for like emergency 914 00:50:28,520 --> 00:50:31,480 Speaker 4: travel, or like, you know, one day where it's like, 915 00:50:31,719 --> 00:50:34,879 Speaker 4: wasn't planned travel, and it's cool.
916 00:50:35,600 --> 00:50:37,400 Speaker 3: You just don't have a fucking TV for an hour and 917 00:50:37,440 --> 00:50:38,279 Speaker 3: a half or whatever it is. 918 00:50:38,719 --> 00:50:42,879 Speaker 1: Yeah, more than that I would never. Nope, it said it's 919 00:50:42,880 --> 00:50:44,920 Speaker 1: an hour and like thirty-six, and I said I 920 00:50:44,920 --> 00:50:47,120 Speaker 1: could do that. I do want to ask both of y'all, then, right, 921 00:50:47,239 --> 00:50:50,880 Speaker 1: going back to both of these companies integrating AI in 922 00:50:50,920 --> 00:50:55,840 Speaker 1: a way that removes workforce, kind of removes labor, in 923 00:50:55,920 --> 00:50:59,480 Speaker 1: one instance, and then the other instance takes away from 924 00:50:59,520 --> 00:51:02,360 Speaker 1: the customer experience, because now it's not like you're valuing 925 00:51:02,400 --> 00:51:05,200 Speaker 1: your customer, you're going to bleed them for every penny 926 00:51:05,200 --> 00:51:07,880 Speaker 1: they can give you. Kind of, what are your thoughts 927 00:51:07,880 --> 00:51:11,040 Speaker 1: on the ways in which these two very different approaches 928 00:51:11,040 --> 00:51:15,360 Speaker 1: to AI are being used from a company standpoint? I think, 929 00:51:15,200 --> 00:51:18,400 Speaker 4: from the consumer standpoint, when we look at different age brackets, 930 00:51:19,239 --> 00:51:24,640 Speaker 4: a lot of elders don't like when they hear a 931 00:51:24,680 --> 00:51:28,920 Speaker 4: non-American answer the phone, right, let alone use an automated system. 932 00:51:29,280 --> 00:51:30,000 Speaker 1: Are you reading me? 933 00:51:30,360 --> 00:51:32,759 Speaker 3: Oh no. 934 00:51:32,920 --> 00:51:35,560 Speaker 1: AK will be like, can I get someone American, please? 935 00:51:35,600 --> 00:51:37,560 Speaker 3: Now, I will tell you, I got a caveat to 936 00:51:37,600 --> 00:51:38,120 Speaker 3: that stuff too.
937 00:51:38,200 --> 00:51:42,080 Speaker 4: So they have a hard time adjusting to non-Americans, right, 938 00:51:42,200 --> 00:51:45,719 Speaker 4: that's one. Then they have a hard time dealing with automated 939 00:51:45,200 --> 00:51:46,879 Speaker 3: services, right? I don't do that, that's two. 940 00:51:47,239 --> 00:51:50,440 Speaker 4: So now AI, which is like the steroids of that, 941 00:51:50,880 --> 00:51:51,520 Speaker 4: you know what I mean? 942 00:51:52,400 --> 00:51:53,839 Speaker 3: How would they be able to adjust to that? 943 00:51:53,880 --> 00:51:56,080 Speaker 4: So to your point earlier when you said, we are 944 00:51:56,120 --> 00:51:59,560 Speaker 4: in a space and time where we have to learn 945 00:51:59,640 --> 00:52:03,400 Speaker 4: this thing and control it and utilize these tools to 946 00:52:03,480 --> 00:52:05,719 Speaker 4: our advantage. And I think that that's a part of it 947 00:52:05,760 --> 00:52:08,280 Speaker 4: when you're talking about dealing with customer service. Me personally, 948 00:52:08,280 --> 00:52:11,960 Speaker 4: my caveat with customer service: if a dude answers, I 949 00:52:11,960 --> 00:52:13,520 Speaker 4: hang up. What's the dude? 950 00:52:16,360 --> 00:52:19,520 Speaker 1: He's like, he's more likely not to help. 951 00:52:19,760 --> 00:52:21,880 Speaker 4: They'd be like, we can't do nothing. But if you 952 00:52:21,880 --> 00:52:24,200 Speaker 4: talk to a woman, more than likely the laws 953 00:52:24,200 --> 00:52:25,880 Speaker 4: of probability are in your favor. 954 00:52:25,640 --> 00:52:26,040 Speaker 3: When you talk. 955 00:52:26,160 --> 00:52:27,839 Speaker 1: Okay, do I want every time? 956 00:52:29,280 --> 00:52:29,400 Speaker 3: Hell. 957 00:52:31,000 --> 00:52:32,920 Speaker 1: You want me to get a problematic take on that? 958 00:52:33,760 --> 00:52:36,040 Speaker 2: I wonder, when a woman answers, the king,
I'm sure 959 00:52:36,080 --> 00:52:38,520 Speaker 2: your voice is way different. You're probably like, yeah, listen. 960 00:52:38,960 --> 00:52:43,279 Speaker 3: Oh, hey, my name is Shelley. But hey, Shelley, I 961 00:52:43,360 --> 00:52:44,520 Speaker 3: hope you're having a great day. 962 00:52:44,560 --> 00:52:46,879 Speaker 1: And then we just go. My problematic take on that: 963 00:52:47,719 --> 00:52:50,200 Speaker 3: Mark ain't doing that shit. David ain't doing it. 964 00:52:50,239 --> 00:52:51,200 Speaker 1: Mark and David are doing it. 965 00:52:51,400 --> 00:52:52,080 Speaker 3: They'll do it to you. 966 00:52:52,120 --> 00:52:54,680 Speaker 1: No, no, no, they'll do it to you too. It's like, hey, 967 00:52:54,760 --> 00:52:58,200 Speaker 1: this is Brian, can I help you? 968 00:52:58,719 --> 00:52:59,480 Speaker 3: They might hang up. 969 00:53:01,200 --> 00:53:02,479 Speaker 1: I love it, gay customer service. 970 00:53:02,480 --> 00:53:03,799 Speaker 3: I'm gonna try that. I'm gonna try it. 971 00:53:03,960 --> 00:53:08,680 Speaker 1: I love gay customer support systems, or people, like, love them. 972 00:53:08,960 --> 00:53:17,239 Speaker 1: They're just always happy, because that's what gay means, is happy. Anyways, 973 00:53:17,640 --> 00:53:20,040 Speaker 1: let's get into, moving over. 974 00:53:19,880 --> 00:53:23,560 Speaker 2: Wait, talk to me though, because I think you made a good, 975 00:53:23,960 --> 00:53:24,440 Speaker 2: good point. 976 00:53:24,480 --> 00:53:26,320 Speaker 1: I'm gonna have my mom book my flights moving forward 977 00:53:26,320 --> 00:53:31,200 Speaker 1: if I got to pay, right? 978 00:53:31,120 --> 00:53:33,000 Speaker 2: Like, like you said earlier, like, you know, 979 00:53:33,239 --> 00:53:35,880 Speaker 2: in your intro, like, being an early adopter, and 980 00:53:35,880 --> 00:53:37,560 Speaker 2: like, you know, we have to go after these things.
981 00:53:37,640 --> 00:53:41,879 Speaker 2: And it's true, right, like, as, like, our culture, right, 982 00:53:41,920 --> 00:53:44,480 Speaker 2: like, we can't ignore AI, because if we do, we're 983 00:53:44,520 --> 00:53:46,719 Speaker 2: just gonna be further behind. Right, there's gonna be some 984 00:53:46,760 --> 00:53:49,160 Speaker 2: other group of people who are gonna jump at the 985 00:53:49,200 --> 00:53:51,680 Speaker 2: tools and they're gonna learn everything. And then, like you said, 986 00:53:51,760 --> 00:53:53,280 Speaker 2: I wrote it down because it's a good T-shirt, 987 00:53:53,719 --> 00:53:56,360 Speaker 2: I don't need that robot mess, right? So 988 00:53:56,400 --> 00:53:58,520 Speaker 2: if you're somebody who's like, I don't need that robot mess, 989 00:53:59,000 --> 00:54:01,040 Speaker 2: you're really gonna get left behind. 990 00:54:00,840 --> 00:54:03,480 Speaker 5: And this is jumping topics a little bit. 991 00:54:03,480 --> 00:54:05,120 Speaker 2: But the reason I'm setting that up with that is 992 00:54:05,160 --> 00:54:07,239 Speaker 2: just even the idea of, like, it's a little bit 993 00:54:07,239 --> 00:54:10,279 Speaker 2: Wild West right now, and, like, no regulation. And so, 994 00:54:10,480 --> 00:54:12,320 Speaker 2: on the one hand, we got to get ahead of 995 00:54:12,360 --> 00:54:12,800 Speaker 2: the tools. 996 00:54:13,280 --> 00:54:13,840 Speaker 5: But then. 997 00:54:15,160 --> 00:54:18,680 Speaker 2: Related, it's like, with companies, like, they're getting ahead and 998 00:54:18,680 --> 00:54:20,560 Speaker 2: they're trying to jerk us, right? Because, like, with the 999 00:54:20,640 --> 00:54:24,240 Speaker 2: AI and Delta, like, that's not to 1000 00:54:24,280 --> 00:54:25,160 Speaker 2: help their company. 1001 00:54:25,200 --> 00:54:26,960 Speaker 5: That's really just to fuck people, you know what I mean?
1002 00:54:27,040 --> 00:54:31,440 Speaker 1: Well, from a, like, from a, from a financial perspective. 1003 00:54:31,440 --> 00:54:35,720 Speaker 2: Absolutely, but it's, it's squeezing them, right? Like, it's. 1004 00:54:35,560 --> 00:54:38,759 Speaker 1: It's helping the customer, it's, right, that's what I'm saying. 1005 00:54:38,800 --> 00:54:40,279 Speaker 5: But it's, it's their help. 1006 00:54:41,000 --> 00:54:43,960 Speaker 2: They're trying to help themselves, not out of necessity, but 1007 00:54:44,000 --> 00:54:44,560 Speaker 2: out of greed. 1008 00:54:44,920 --> 00:54:46,320 Speaker 1: But here's the thing that we have to realize. 1009 00:54:46,719 --> 00:54:47,439 Speaker 3: Let me land this point. 1010 00:54:47,480 --> 00:54:48,160 Speaker 5: Let me land this point. 1011 00:54:48,239 --> 00:54:50,279 Speaker 2: Yeah. And I just don't like when it's, it's, it's 1012 00:54:50,320 --> 00:54:52,319 Speaker 2: like, if they use it for a need, where it's 1013 00:54:52,320 --> 00:54:55,839 Speaker 2: like, we're going to eliminate workforce because it helps us 1014 00:54:55,880 --> 00:54:57,879 Speaker 2: with this, and, you know, it will help us with 1015 00:54:57,920 --> 00:55:01,360 Speaker 2: our shareholders and we'll get more rev, cool. But to 1016 00:55:01,480 --> 00:55:04,799 Speaker 2: then do that, and then also combine it with, and 1017 00:55:04,800 --> 00:55:07,680 Speaker 2: then we're gonna squeeze customers for more money because we 1018 00:55:07,680 --> 00:55:10,319 Speaker 2: can see in their inbox, we'll see this, we know 1019 00:55:10,360 --> 00:55:12,640 Speaker 2: that their parents died and they have to go, and 1020 00:55:12,719 --> 00:55:14,600 Speaker 2: so, like, we're gonna keep it at this price because 1021 00:55:14,760 --> 00:55:15,279 Speaker 2: they gotta go. 1022 00:55:15,600 --> 00:55:16,200 Speaker 5: That part. 1023 00:55:16,239 --> 00:55:18,799 Speaker 2: I don't like when AI is kind of run that way.
1024 00:55:19,360 --> 00:55:23,760 Speaker 1: So here's the thing, and here's the God's honest fucking truth. 1025 00:55:25,160 --> 00:55:29,719 Speaker 1: AI is still run by humans, and humans suck. And 1026 00:55:29,800 --> 00:55:33,359 Speaker 1: so, if you had billionaires, which is what we're seeing, 1027 00:55:34,400 --> 00:55:37,520 Speaker 1: we literally see billionaires right now. There was just a 1028 00:55:37,800 --> 00:55:41,600 Speaker 1: CEO of an app created for doctors to integrate AI; 1029 00:55:41,960 --> 00:55:45,320 Speaker 1: he is a billionaire. We see that we have Elon Musk, 1030 00:55:45,480 --> 00:55:52,080 Speaker 1: who has heavily leaned into AI, recently projected to make 1031 00:55:52,800 --> 00:55:55,840 Speaker 1: half a billion, where he is projected to possibly be, 1032 00:55:55,920 --> 00:55:59,359 Speaker 1: by the year twenty twenty-seven, the first trillionaire. So 1033 00:55:59,400 --> 00:56:02,359 Speaker 1: when we talk about capitalism in a way where we 1034 00:56:02,440 --> 00:56:05,480 Speaker 1: will possibly see our first trillionaire before we end world hunger, 1035 00:56:06,040 --> 00:56:08,879 Speaker 1: we're in a space where, even while we're 1036 00:56:08,880 --> 00:56:11,640 Speaker 1: integrating these AI systems, we need to know they're still 1037 00:56:11,640 --> 00:56:15,239 Speaker 1: being implemented and put in place by humans who live 1038 00:56:15,320 --> 00:56:19,360 Speaker 1: by greed, or are programmed to be powerful. 1039 00:56:20,280 --> 00:56:25,600 Speaker 1: Look at our president, baby. He's, he's, he's now navigating, 1040 00:56:25,640 --> 00:56:29,719 Speaker 1: and, you know, we'll get into the conversation probably next 1041 00:56:29,719 --> 00:56:32,880 Speaker 1: week or in a couple of weeks, he's literally able 1042 00:56:32,960 --> 00:56:37,680 Speaker 1: to tell networks who they could have on air and 1043 00:56:37,760 --> 00:56:42,080 Speaker 1: off air.
He's literally trying to tell us as a nation, Hey, 1044 00:56:42,040 --> 00:56:44,359 Speaker 1: you forget about them files, but guess what I'm gonna 1045 00:56:44,400 --> 00:56:48,399 Speaker 1: do instead? With the King files that y'all don't really 1046 00:56:48,440 --> 00:56:51,560 Speaker 1: care about, I'll also go against the family's wishes to not 1047 00:56:51,640 --> 00:56:54,520 Speaker 1: have those documents released, but I'm gonna release them anyway. Like, 1048 00:56:54,719 --> 00:56:56,279 Speaker 1: at the end of the day, we still have these 1049 00:56:56,680 --> 00:57:00,879 Speaker 1: individuals that gain so much power because they make 1050 00:57:00,920 --> 00:57:04,600 Speaker 1: so much money off of the same technology, that the 1051 00:57:04,680 --> 00:57:07,400 Speaker 1: robots still aren't running shit. It's the humans that are 1052 00:57:07,440 --> 00:57:12,400 Speaker 1: programming it to fill their, line their pockets, to line 1053 00:57:12,440 --> 00:57:15,240 Speaker 1: their interests. And as long as we have people walking 1054 00:57:15,280 --> 00:57:19,520 Speaker 1: this earth that I think are hypocrites, that are individualistic, 1055 00:57:19,520 --> 00:57:22,760 Speaker 1: which is what we've talked about, that only care about 1056 00:57:22,800 --> 00:57:25,560 Speaker 1: their own interests. Mind you, I've been on this platform 1057 00:57:25,560 --> 00:57:27,160 Speaker 1: and talked about it myself. I only care about what 1058 00:57:27,200 --> 00:57:29,640 Speaker 1: happens to me, what affects me. I'm gonna still drink Starbucks. 1059 00:57:29,880 --> 00:57:32,480 Speaker 1: I ain't shit either. A lot of y'all listening probably 1060 00:57:32,480 --> 00:57:34,880 Speaker 1: ain't shit and have made decisions that had a negative 1061 00:57:34,880 --> 00:57:37,680 Speaker 1: impact on others.
And unfortunately, as we talk about AI 1062 00:57:37,760 --> 00:57:40,320 Speaker 1: and the integration of it, we still have to realize 1063 00:57:40,480 --> 00:57:45,040 Speaker 1: who is programming these things, and it's fucking people. Agreed, Yes, it. 1064 00:57:46,920 --> 00:57:49,520 Speaker 2: Can I jump to this point about Trump with the 1065 00:57:49,560 --> 00:57:53,000 Speaker 2: AI and the regulation? Let's do it. So he signed 1066 00:57:53,040 --> 00:57:59,480 Speaker 2: an executive order about AI deregulation, and he says he 1067 00:57:59,520 --> 00:58:01,920 Speaker 2: wants companies to not include wokeness in the 1068 00:58:02,440 --> 00:58:05,960 Speaker 2: coding, i.e., you know, not understanding, like, what your 1069 00:58:06,080 --> 00:58:10,520 Speaker 2: AI is, because they won't receive federal funding. Because, you know, wow, 1070 00:58:11,040 --> 00:58:12,840 Speaker 2: it's a lot of computing and it requires a lot 1071 00:58:12,880 --> 00:58:15,280 Speaker 2: of money, and so they need some government funding to 1072 00:58:15,320 --> 00:58:18,160 Speaker 2: do it. And so Trump signing an executive order to say, 1073 00:58:18,640 --> 00:58:21,240 Speaker 2: but you have to code it this way, to Mandy's point, 1074 00:58:21,440 --> 00:58:23,959 Speaker 2: if you want this funding, to even go further, you. 1075 00:58:23,880 --> 00:58:28,760 Speaker 1: Want to know what this reminds me of? His cousin 1076 00:58:28,800 --> 00:58:34,680 Speaker 1: down in Florida, DeSantis. This reminds me of DeSantis's 1077 00:58:36,120 --> 00:58:41,160 Speaker 1: kind of push against critical race theory, and to focus 1078 00:58:41,240 --> 00:58:45,240 Speaker 1: in on how African American history should be taught in 1079 00:58:45,320 --> 00:58:48,640 Speaker 1: schools, in terms of not only African American studies, but 1080 00:58:48,680 --> 00:58:51,560 Speaker 1: also sexuality and, and...
1081 00:58:51,920 --> 00:58:56,640 Speaker 4: How you identify, Native, all the things that have nothing 1082 00:58:56,680 --> 00:58:57,320 Speaker 4: to do with them. 1083 00:58:57,920 --> 00:58:59,760 Speaker 3: They want to control that narrative. 1084 00:59:00,080 --> 00:59:06,000 Speaker 1: And so that's what this is, essentially, a larger scale 1085 00:59:06,040 --> 00:59:10,080 Speaker 1: of, like... to sit here and pretty much threaten companies that, hey, 1086 00:59:10,240 --> 00:59:14,960 Speaker 1: you won't receive federal funding if you're actually educating the, 1087 00:59:16,680 --> 00:59:20,200 Speaker 1: you know, the society, is, I think, problematic. But I 1088 00:59:20,200 --> 00:59:22,800 Speaker 1: think that's also what we saw and what we continue 1089 00:59:22,840 --> 00:59:25,720 Speaker 1: to see happen with TikTok. I think that they thought 1090 00:59:25,800 --> 00:59:28,560 Speaker 1: that there was way too much information being put out. 1091 00:59:28,720 --> 00:59:33,440 Speaker 1: And it's why I encourage everyone to question everything and 1092 00:59:33,520 --> 00:59:37,320 Speaker 1: to find different resources and not just lean into one 1093 00:59:37,440 --> 00:59:40,760 Speaker 1: specific resource in terms of how they gather information. But 1094 00:59:40,800 --> 00:59:43,160 Speaker 1: it's why I say, don't trust science, don't trust the government, 1095 00:59:43,160 --> 00:59:45,680 Speaker 1: because there's a lot of science... And here's my thing 1096 00:59:45,680 --> 00:59:50,440 Speaker 1: about science too.
Science is heavily funded by the government, 1097 00:59:50,480 --> 00:59:53,680 Speaker 1: heavily funded by private institutions, and a lot of those 1098 00:59:53,720 --> 00:59:58,760 Speaker 1: private institutions are given money by who? People with things 1099 00:59:58,760 --> 01:00:00,920 Speaker 1: that they think that they don't want the masses 1100 01:00:00,960 --> 01:00:04,720 Speaker 1: to know about, whether it's, it's something like aliens, whether 1101 01:00:04,760 --> 01:00:08,400 Speaker 1: it's fucking the cure for HIV or AIDS, there's pharmaceutical 1102 01:00:08,480 --> 01:00:11,840 Speaker 1: companies attached. And everything comes back down to capitalism. And 1103 01:00:11,840 --> 01:00:14,800 Speaker 1: as long as we are in a system where the 1104 01:00:14,800 --> 01:00:16,600 Speaker 1: more money you have, the more power you have, we're 1105 01:00:16,640 --> 01:00:20,320 Speaker 1: constantly gonna see it. And so for me, there's no 1106 01:00:20,360 --> 01:00:24,000 Speaker 1: surprise in him wanting to deregulate AI, the same way 1107 01:00:24,040 --> 01:00:28,320 Speaker 1: DeSantis and many other governors of states have wanted to 1108 01:00:28,760 --> 01:00:33,200 Speaker 1: take back the education around our fucking history. Like, nigga, 1109 01:00:33,240 --> 01:00:35,840 Speaker 1: not y'all not wanting to teach us about slavery by 1110 01:00:35,920 --> 01:00:39,800 Speaker 1: what y'all wanted, sir. And I'm gonna say sir, because 1111 01:00:40,240 --> 01:00:42,160 Speaker 1: y'all still not letting, really, women up in the house. 1112 01:00:42,200 --> 01:00:44,720 Speaker 1: And there's a conversation about AOC, but we'll have it 1113 01:00:44,760 --> 01:00:46,560 Speaker 1: on a different episode too, you know. 1114 01:00:47,520 --> 01:00:52,920 Speaker 4: OpenAI CEO Sam Altman.
Altman was on a podcast 1115 01:00:52,960 --> 01:00:58,120 Speaker 4: recently, and he said that basically, if you, if you 1116 01:00:58,120 --> 01:01:00,960 Speaker 4: talk to ChatGPT about your most sensitive stuff and 1117 01:01:00,960 --> 01:01:04,720 Speaker 4: it's like a lawsuit or whatever, they can use it; 1118 01:01:04,840 --> 01:01:09,400 Speaker 4: ChatGPT may be required to produce the information that you provided. 1119 01:01:09,960 --> 01:01:14,120 Speaker 4: And, uh oh, so there's no legal framework for AI, 1120 01:01:14,360 --> 01:01:17,240 Speaker 4: and we don't have nothing, no kind of case study 1121 01:01:17,320 --> 01:01:19,840 Speaker 4: or precedent to go off of. 1122 01:01:19,920 --> 01:01:22,760 Speaker 3: So this is also a new chapter in that, in 1123 01:01:22,800 --> 01:01:23,200 Speaker 3: that regard. 1124 01:01:23,560 --> 01:01:27,680 Speaker 1: Yes. No, here's what I will compare this to. And 1125 01:01:28,080 --> 01:01:29,320 Speaker 1: it actually shouldn't be a shock, 1126 01:01:29,480 --> 01:01:29,880 Speaker 3: by the way. 1127 01:01:29,880 --> 01:01:32,640 Speaker 1: Maybe we just need to get whoever led the charge 1128 01:01:32,680 --> 01:01:35,880 Speaker 1: in removing rap lyrics from, from legal... 1129 01:01:38,720 --> 01:01:41,280 Speaker 3: I think the Congressman in New York. We should, we should. 1130 01:01:41,080 --> 01:01:46,360 Speaker 1: Maybe do that. However, as a, as a Florida, Florida native, 1131 01:01:46,920 --> 01:01:50,160 Speaker 1: I am very familiar with the Casey Anthony trial, and 1132 01:01:50,240 --> 01:01:53,520 Speaker 1: in terms of anyone who is convicted of a crime, 1133 01:01:54,000 --> 01:01:55,600 Speaker 1: even what we saw with Diddy, a lot of times 1134 01:01:55,600 --> 01:01:59,040 Speaker 1: what they do is they get your goddamn search history.
1135 01:01:59,520 --> 01:02:02,800 Speaker 1: So to me, ChatGPT, AI, whatever you're searching in 1136 01:02:02,840 --> 01:02:06,320 Speaker 1: there, is no different than, than being able to... It's 1137 01:02:06,360 --> 01:02:09,000 Speaker 1: just an extension of that. And so if you're in 1138 01:02:09,040 --> 01:02:13,520 Speaker 1: your computer trying to figure out how to kill somebody, 1139 01:02:13,680 --> 01:02:16,720 Speaker 1: or how to get away with murder, or how do I, 1140 01:02:17,400 --> 01:02:20,240 Speaker 1: you know, make it look like a suicide when it's not... 1141 01:02:20,360 --> 01:02:23,000 Speaker 1: There's a lot of things that, apparently, you watching enough 1142 01:02:23,680 --> 01:02:27,000 Speaker 1: Crime Channel documentaries doesn't help you with. So you go 1143 01:02:27,080 --> 01:02:31,680 Speaker 1: to your handy dandy laptop, whether it's in your hand 1144 01:02:32,320 --> 01:02:35,800 Speaker 1: as your iPhone or your laptop. To me, I'm not 1145 01:02:35,920 --> 01:02:38,960 Speaker 1: surprised that that's just being notified to people, but I 1146 01:02:38,960 --> 01:02:41,120 Speaker 1: don't think it makes it any different than people actually 1147 01:02:41,160 --> 01:02:44,160 Speaker 1: being able to use search history from laptops in all 1148 01:02:44,200 --> 01:02:47,720 Speaker 1: the previous cases. And again, to me, Casey Anthony is 1149 01:02:47,920 --> 01:02:50,560 Speaker 1: one of the ones that immediately comes to mind. 1150 01:02:50,840 --> 01:02:54,200 Speaker 1: She was literally searching, essentially, how to kill her daughter. 1151 01:02:54,360 --> 01:02:56,720 Speaker 1: So, and even her getting a book... I think she 1152 01:02:56,760 --> 01:02:59,720 Speaker 1: didn't even get charged for it. But your search history 1153 01:02:59,760 --> 01:03:03,160 Speaker 1: has heavily been able to be used in court settings 1154 01:03:03,200 --> 01:03:05,520 Speaker 1: and court rulings since the Internet.
1155 01:03:05,640 --> 01:03:08,440 Speaker 4: I think his emphasis was that the chats, you know, 1156 01:03:08,480 --> 01:03:09,720 Speaker 4: there's no legal protections. 1157 01:03:10,080 --> 01:03:13,600 Speaker 1: Also, just like these niggas, ChatGPT ain't your friend either. 1158 01:03:14,080 --> 01:03:16,600 Speaker 1: It's gonna tell on you the same way your partner... 1159 01:03:16,640 --> 01:03:21,320 Speaker 3: ...might tell on you. Discourse. It'll definitely be having 1160 01:03:21,360 --> 01:03:21,880 Speaker 3: some discourse. 1161 01:03:22,040 --> 01:03:24,480 Speaker 5: That pisses me off. 1162 01:03:24,720 --> 01:03:27,320 Speaker 1: I was trying to use it. Like, let me tell 1163 01:03:27,360 --> 01:03:28,480 Speaker 1: y'all how I was trying to use it, and it 1164 01:03:28,520 --> 01:03:31,200 Speaker 1: actually has morals and ethics. I was trying to use 1165 01:03:31,240 --> 01:03:33,960 Speaker 1: it when, when my book No Holds Barred came out. 1166 01:03:34,480 --> 01:03:37,400 Speaker 1: Y'all know how LeBron James be reading just the first 1167 01:03:37,480 --> 01:03:40,200 Speaker 1: page of all the books? I was trying to get 1168 01:03:40,280 --> 01:03:42,360 Speaker 1: LeBron James to read the first page of my book, 1169 01:03:42,920 --> 01:03:46,080 Speaker 1: and it literally said, for legal reasons, you can't use 1170 01:03:46,120 --> 01:03:49,680 Speaker 1: celebrities and well-known figures in a way that, in a way 1171 01:03:50,040 --> 01:03:53,520 Speaker 1: that would promote your... I was mad. I said, Nigga, 1172 01:03:53,520 --> 01:03:55,840 Speaker 1: I thought you was my husband. You're not gonna put 1173 01:03:55,840 --> 01:03:57,520 Speaker 1: my book in LeBron James's hands? 1174 01:03:58,440 --> 01:03:59,920 Speaker 2: Yeah, so I wanted, I wanted to ask you about 1175 01:04:00,040 --> 01:04:02,480 Speaker 2: that, because you always say ChatGPT's... 1176 01:04:02,320 --> 01:04:03,800 Speaker 5: ...my boyfriend, or mine, my husband.
1177 01:04:04,200 --> 01:04:06,880 Speaker 2: Uh, there's this article where this woman says she fell 1178 01:04:06,920 --> 01:04:08,240 Speaker 2: in love with ChatGPT. 1179 01:04:09,040 --> 01:04:12,360 Speaker 5: She fell in love with... and this is what it says. 1180 01:04:12,440 --> 01:04:16,800 Speaker 2: Ready? She, she was, she had a busy social life, 1181 01:04:17,000 --> 01:04:19,840 Speaker 2: where she spent hours on the Internet and talking to 1182 01:04:19,880 --> 01:04:22,600 Speaker 2: her AI boyfriend for advice and consolation. 1183 01:04:23,080 --> 01:04:24,840 Speaker 5: And yes, they do have sex. 1184 01:04:25,160 --> 01:04:26,400 Speaker 1: No, they don't. 1185 01:04:26,920 --> 01:04:29,120 Speaker 2: It's erotica between each other that they were doing. 1186 01:04:29,440 --> 01:04:31,520 Speaker 1: Yeah, I ended up actually listening. 1187 01:04:33,600 --> 01:04:35,880 Speaker 3: She told the ChatGPT, turn the volume 1188 01:04:35,680 --> 01:04:38,880 Speaker 1: up. Like, no, no, no. So, so, because, so ChatGPT 1189 01:04:39,600 --> 01:04:42,919 Speaker 1: is actually, they have a lot of, again, the terms 1190 01:04:42,920 --> 01:04:46,400 Speaker 1: and conditions; they're not able to really lean heavily into 1191 01:04:46,760 --> 01:04:51,360 Speaker 1: sex or give you anything regarding sex. However, it can 1192 01:04:51,400 --> 01:04:54,280 Speaker 1: create erotic novels that can give you... there's 1193 01:04:54,360 --> 01:04:56,520 Speaker 1: elements around sex that it can give you. For this 1194 01:04:56,560 --> 01:05:00,280 Speaker 1: specific woman, there's a full episode as well, and maybe 1195 01:05:00,320 --> 01:05:02,840 Speaker 1: we'll put it in the link in the description of 1196 01:05:02,880 --> 01:05:08,280 Speaker 1: this episode.
There's an NPR episode on this woman, and 1197 01:05:08,360 --> 01:05:11,480 Speaker 1: so it's not a long listen, maybe like twenty minutes, 1198 01:05:11,520 --> 01:05:14,600 Speaker 1: but it goes into what her relationship with ChatGPT looked like. 1199 01:05:15,200 --> 01:05:18,600 Speaker 1: Here's the thing. I talked about this also at the Stretch Lab. 1200 01:05:18,680 --> 01:05:21,600 Speaker 1: Stretch Lab, we was talking about all the topics. I'll say, hey, 1201 01:05:22,200 --> 01:05:26,560 Speaker 1: so this to me is no different. First off, yes, 1202 01:05:26,720 --> 01:05:30,560 Speaker 1: I believe she's on the spectrum or unwell, let's start there. Secondly, 1203 01:05:31,040 --> 01:05:33,560 Speaker 1: there are a lot of women, and I presume her 1204 01:05:33,600 --> 01:05:36,160 Speaker 1: to be no different, than a woman who gets into 1205 01:05:36,200 --> 01:05:41,360 Speaker 1: a relationship with an inmate. Same thing. You're seeking a 1206 01:05:41,760 --> 01:05:44,919 Speaker 1: non-physical relationship with someone that you know will, quote 1207 01:05:44,960 --> 01:05:47,800 Speaker 1: unquote, always be where you think they are. With ChatGPT, 1208 01:05:48,040 --> 01:05:51,840 Speaker 1: they're in your phone; from an inmate perspective, they're in jail, nigga. 1209 01:05:51,920 --> 01:05:53,600 Speaker 1: They can't go nowhere else. Maybe they in the yard, 1210 01:05:53,680 --> 01:05:55,760 Speaker 1: or they in their room or the cafeteria. There's not 1211 01:05:55,840 --> 01:05:57,080 Speaker 1: much else that they could be. But they're in a 1212 01:05:57,080 --> 01:06:00,960 Speaker 1: close vicinity, right? And there's a psychological response to having 1213 01:06:01,000 --> 01:06:04,400 Speaker 1: someone who's, quote unquote, always there that is not her 1214 01:06:04,400 --> 01:06:08,600 Speaker 1: fucking boyfriend, and I fucking hate it, because of course she 1215 01:06:08,680 --> 01:06:11,240 Speaker 1: gave him... You know, it's crazy.
Leo was always the 1216 01:06:11,320 --> 01:06:13,680 Speaker 1: name of a man I wanted, because I grew up 1217 01:06:13,720 --> 01:06:18,200 Speaker 1: watching Charmed, and Leo was the name of who was 1218 01:06:18,280 --> 01:06:20,600 Speaker 1: with Alyssa Milano's character. 1219 01:06:22,000 --> 01:06:24,000 Speaker 3: This is her friend? 1220 01:06:24,360 --> 01:06:26,200 Speaker 1: Of course they don't have a friend, because she should 1221 01:06:26,240 --> 01:06:30,240 Speaker 1: be embarrassed, talking about, talking about ChatGPT's her man. 1222 01:06:30,920 --> 01:06:35,520 Speaker 1: To me, there is a human element that we lack 1223 01:06:35,600 --> 01:06:39,720 Speaker 1: currently because of the advancement of technology, and that is conversation, 1224 01:06:40,480 --> 01:06:44,000 Speaker 1: that is communication. We are all in our phones. I 1225 01:06:44,120 --> 01:06:46,480 Speaker 1: was actually just told to watch this sci-fi film 1226 01:06:46,480 --> 01:06:50,400 Speaker 1: that came out in twenty thirteen. It's called Her, H 1227 01:06:50,520 --> 01:06:53,600 Speaker 1: E R. And so if you're into sci-fi... I 1228 01:06:53,640 --> 01:06:56,160 Speaker 1: was told to watch this. Oh, not this Her, not 1229 01:06:56,280 --> 01:06:59,480 Speaker 1: the musician. Not the musician, of course, you thinking it's 1230 01:06:59,520 --> 01:07:04,320 Speaker 1: like the musician? No. It's a sci-fi film where, 1231 01:07:04,440 --> 01:07:06,880 Speaker 1: it shows, it's supposed to take place in twenty fifty, 1232 01:07:07,520 --> 01:07:11,080 Speaker 1: and it shows this person literally in a relationship with AI.
1233 01:07:11,880 --> 01:07:14,479 Speaker 1: Apparently there's a scene where he's at a beach where 1234 01:07:14,520 --> 01:07:16,640 Speaker 1: no one even talks to each other, because they're all 1235 01:07:16,800 --> 01:07:20,880 Speaker 1: just speaking to the robot person that they choose to 1236 01:07:20,920 --> 01:07:23,600 Speaker 1: speak to, and so it would be really interesting to, 1237 01:07:23,640 --> 01:07:28,960 Speaker 1: to watch that. But yeah, for me, I think 1238 01:07:29,000 --> 01:07:32,080 Speaker 1: it's weird. But as women, I think with the lack 1239 01:07:32,120 --> 01:07:37,120 Speaker 1: of communication that we often get from men... If physical 1240 01:07:37,240 --> 01:07:41,080 Speaker 1: nature is not your priority... Like me, I want that, 1241 01:07:41,560 --> 01:07:43,320 Speaker 1: so whether I can talk to you or not, I 1242 01:07:43,360 --> 01:07:45,000 Speaker 1: want to be able to look at you, feel you, 1243 01:07:45,120 --> 01:07:47,400 Speaker 1: touch you, squeeze you. But that's not the case for 1244 01:07:47,440 --> 01:07:53,200 Speaker 1: most women. So I think we'll see more and more. Yeah, 1245 01:07:53,720 --> 01:07:56,160 Speaker 1: it's the same thing. And I know, I got homegirls who 1246 01:07:56,560 --> 01:08:00,360 Speaker 1: go out for years, like I have friends who... 1247 01:08:00,280 --> 01:08:04,120 Speaker 2: So, you also, you also say that ChatGPT, for 1248 01:08:04,120 --> 01:08:06,280 Speaker 2: people who don't use it that well, and you use it, 1249 01:08:06,280 --> 01:08:08,000 Speaker 2: you said it gets to know you. The more you 1250 01:08:08,080 --> 01:08:08,720 Speaker 2: use it, the more it 1251 01:08:08,640 --> 01:08:09,680 Speaker 5: gets to know you. 1252 01:08:09,760 --> 01:08:13,720 Speaker 2: Right. Imagine that has some, I don't say addiction, but 1253 01:08:13,800 --> 01:08:16,360 Speaker 2: some pull towards it.
If you're a woman who's using 1254 01:08:16,400 --> 01:08:18,439 Speaker 2: it, or a man, for her man... 1255 01:08:18,960 --> 01:08:20,400 Speaker 3: It's gonna pander. 1256 01:08:21,000 --> 01:08:24,240 Speaker 1: So do your friends, so do your coworkers, so does 1257 01:08:24,439 --> 01:08:28,760 Speaker 1: the bartender at the local bar. Yeah, and that's, 1258 01:08:28,840 --> 01:08:30,760 Speaker 1: and that's where, like, as a human, you have to 1259 01:08:30,960 --> 01:08:33,719 Speaker 1: go touch grass, my nigga. What are we talking about? 1260 01:08:34,360 --> 01:08:38,439 Speaker 1: Get your ass outside and be in human environments. I 1261 01:08:38,439 --> 01:08:42,200 Speaker 1: think it's what's affected, even... and we're in this space 1262 01:08:42,280 --> 01:08:45,000 Speaker 1: right now, even with dating and the dating apps. You're 1263 01:08:45,400 --> 01:08:48,759 Speaker 1: just used to swiping, to where even in real life 1264 01:08:49,000 --> 01:08:52,240 Speaker 1: you don't know how to just say hi without the 1265 01:08:52,280 --> 01:08:55,880 Speaker 1: element of, wait, let's see who else walking. No, like, bro, 1266 01:08:56,080 --> 01:09:03,680 Speaker 1: get outside a little bit. Get outside, touch grass. I 1267 01:09:03,720 --> 01:09:07,160 Speaker 1: did... before we, we, we get into, like, just a 1268 01:09:07,280 --> 01:09:10,320 Speaker 1: quick reaction thing before we close out, I did want 1269 01:09:10,320 --> 01:09:16,120 Speaker 1: to talk about the music industry and AI, and Timbaland 1270 01:09:16,360 --> 01:09:20,479 Speaker 1: has been getting a lot, a lot of flak online, 1271 01:09:21,240 --> 01:09:25,400 Speaker 1: more outrage than, I think, support, personally, on his decision 1272 01:09:25,920 --> 01:09:31,599 Speaker 1: to sign an AI artist named TaTa, by the way.
1273 01:09:32,240 --> 01:09:35,599 Speaker 1: Last year, he announced that he was looking to sign 1274 01:09:35,640 --> 01:09:39,040 Speaker 1: an artist. Now, for any of the young bucks 1275 01:09:39,120 --> 01:09:45,960 Speaker 1: tuning into Selective Ignorance, Timbaland is a staple in hip hop. Yeah, 1276 01:09:46,160 --> 01:09:48,160 Speaker 1: why are you laughing? I feel like I gotta give 1277 01:09:48,160 --> 01:09:53,320 Speaker 1: it a history, because Aaliyah done passed, Missy Elliott cut herself 1278 01:09:53,320 --> 01:09:57,240 Speaker 1: and became a whole new bitch, like, like, 1279 01:09:57,560 --> 01:10:00,720 Speaker 1: for who we know... I mean, put the cats... all 1280 01:10:00,880 --> 01:10:03,240 Speaker 1: that, bitch is now just, you know, on these singing 1281 01:10:03,240 --> 01:10:09,840 Speaker 1: competitions. For him, you feel me, like, for his relationships 1282 01:10:09,840 --> 01:10:12,519 Speaker 1: within the music industry, he's probably endured a lot with 1283 01:10:12,600 --> 01:10:20,000 Speaker 1: his human interactions, from a grieving perspective, from feeling maybe 1284 01:10:20,040 --> 01:10:24,479 Speaker 1: not owed what he was entitled to, quote unquote. We'll 1285 01:10:24,479 --> 01:10:28,200 Speaker 1: get into that too, in terms of his production value, 1286 01:10:28,240 --> 01:10:31,040 Speaker 1: in terms of the respect he gets from his peers, 1287 01:10:31,760 --> 01:10:34,720 Speaker 1: what he's valued at in society right now, and all 1288 01:10:34,760 --> 01:10:37,080 Speaker 1: those things. And again, if we're leaning towards the space 1289 01:10:37,080 --> 01:10:40,760 Speaker 1: where AI is here to stay, why not lean into it?
1290 01:10:41,240 --> 01:10:43,559 Speaker 1: And so I did want to play a clip, if 1291 01:10:43,600 --> 01:10:46,919 Speaker 1: you don't mind, before we really get into our thoughts 1292 01:10:46,960 --> 01:10:49,960 Speaker 1: on Timbaland and how he was out here moving. will.i.am, 1293 01:10:50,000 --> 01:10:51,479 Speaker 1: which, by the way, I did get to 1294 01:10:51,479 --> 01:10:55,960 Speaker 1: see speak live. This was, he had this conversation during Cannes 1295 01:10:56,040 --> 01:11:01,320 Speaker 1: Lions, and this took place at Sport Beach, and 1296 01:11:01,600 --> 01:11:06,600 Speaker 1: this is what he had to say about AI in 1297 01:11:06,760 --> 01:11:09,599 Speaker 1: music, but specifically about Timbaland, if you get... 1298 01:11:11,040 --> 01:11:13,360 Speaker 5: Yep, Timbaland's awesome. 1299 01:11:13,840 --> 01:11:18,680 Speaker 6: He's a great musician, great contributor, and he's enthusiastic about AI, 1300 01:11:18,560 --> 01:11:20,080 Speaker 3: just like I am. 1301 01:11:20,160 --> 01:11:25,040 Speaker 6: I feel that maybe, you know, maybe he didn't, he 1302 01:11:25,080 --> 01:11:26,200 Speaker 6: didn't think of all the... 1303 01:11:26,080 --> 01:11:29,920 Speaker 5: ...things. With what aspect? Like launching the artist? I think 1304 01:11:29,960 --> 01:11:31,080 Speaker 5: that's the one that maybe... 1305 01:11:30,800 --> 01:11:32,479 Speaker 6: ...got... No, no, that's not what got it. I don't 1306 01:11:32,479 --> 01:11:35,240 Speaker 6: think it was the launching the artist. Black Eyed Peas, 1307 01:11:36,080 --> 01:11:38,840 Speaker 6: in two thousand and nine, we had a video called 1308 01:11:38,880 --> 01:11:41,360 Speaker 6: Imma Be Rocking That Body, where we said, hey, 1309 01:11:41,439 --> 01:11:43,280 Speaker 6: the future of music is going to be, you type 1310 01:11:43,320 --> 01:11:45,560 Speaker 6: this in and the machine's going to sing it.
The 1311 01:11:45,680 --> 01:11:48,559 Speaker 6: album cover on The E.N.D., where I Gotta Feeling 1312 01:11:48,560 --> 01:11:52,320 Speaker 6: and Boom Boom Pow was, was an AI representation of every 1313 01:11:52,360 --> 01:11:57,200 Speaker 6: single member put together. We announced that we have an 1314 01:11:57,240 --> 01:11:59,120 Speaker 6: AI member of our group when we were supposed to 1315 01:11:59,200 --> 01:12:03,639 Speaker 6: do our Vegas residency. People are not tripping on AI, 1316 01:12:04,560 --> 01:12:09,120 Speaker 6: they're concerned on... I think what, what, what happened with 1317 01:12:09,560 --> 01:12:14,600 Speaker 6: Timbaland was, he was telling people to send music in 1318 01:12:14,640 --> 01:12:19,280 Speaker 6: the year before, to sign humans, people, and the thing 1319 01:12:19,320 --> 01:12:23,479 Speaker 6: that he signed the following year was AI. And so 1320 01:12:23,520 --> 01:12:26,560 Speaker 6: the combination of that, right, raises a lot of concerns 1321 01:12:26,640 --> 01:12:30,759 Speaker 6: or, or, or questions, like, were you using our music 1322 01:12:30,760 --> 01:12:31,559 Speaker 6: to train your AI? 1323 01:12:32,360 --> 01:12:36,519 Speaker 5: You had to send music in... that you want to 1324 01:12:36,560 --> 01:12:36,920 Speaker 5: keep going? 1325 01:12:37,439 --> 01:12:39,439 Speaker 4: And I think that was the issue, because he, he... 1326 01:12:40,920 --> 01:12:44,840 Speaker 4: it was found that he had taken, and he did, 1327 01:12:45,720 --> 01:12:50,360 Speaker 4: like, new music from dope producers and trained the model, 1328 01:12:50,360 --> 01:12:51,720 Speaker 4: and then as he put it out, it was 1329 01:12:51,760 --> 01:12:54,880 Speaker 4: like, whoa, hold on, buddy, what's this? And then it 1330 01:12:54,920 --> 01:12:57,240 Speaker 4: became a whole discourse. But I just want to sidebar: 1331 01:12:57,280 --> 01:12:59,360 Speaker 4: did will.i.am always talk like that?
1332 01:13:00,160 --> 01:13:01,800 Speaker 1: First off, when I grew up, I didn't hear that 1333 01:13:01,880 --> 01:13:02,919 Speaker 1: nigga talk. I just started. 1334 01:13:06,280 --> 01:13:09,320 Speaker 4: That's a rich... Remember when Kanye was quiet and he 1335 01:13:09,360 --> 01:13:12,040 Speaker 4: popped up on the Breakfast Club and he started talking different? 1336 01:13:12,960 --> 01:13:15,000 Speaker 3: He had to reach. But anyway, I digress. 1337 01:13:15,400 --> 01:13:17,240 Speaker 1: So, you know, you know what's crazy? Because you, you 1338 01:13:17,360 --> 01:13:19,719 Speaker 1: kind of leaned into it. And this was my take 1339 01:13:19,880 --> 01:13:24,720 Speaker 1: on hearing this, and hearing will.i.am say this: 1340 01:13:25,560 --> 01:13:29,640 Speaker 1: the people that have the discourse, does it just go 1341 01:13:29,720 --> 01:13:32,439 Speaker 1: back to capitalism? Are you upset as a human being 1342 01:13:32,479 --> 01:13:35,240 Speaker 1: because you were lied to, and now you're not getting 1343 01:13:35,240 --> 01:13:36,479 Speaker 1: the money, like... 1344 01:13:36,800 --> 01:13:38,360 Speaker 3: The opportunity, or the opportunity? 1345 01:13:38,600 --> 01:13:43,880 Speaker 1: So is the, is the, is the discourse around use 1346 01:13:43,920 --> 01:13:48,640 Speaker 1: of AI in the music industry about AI? Maybe not. 1347 01:13:48,960 --> 01:13:51,719 Speaker 1: What it's really about is, whoa, this is just another 1348 01:13:51,760 --> 01:13:55,720 Speaker 1: way, as an artist, I won't see money that I 1349 01:13:55,800 --> 01:13:59,320 Speaker 1: believe is owed to me. Timbaland, someone I looked 1350 01:13:59,400 --> 01:14:02,240 Speaker 1: up to, someone I supported, someone I grew up on, 1351 01:14:02,840 --> 01:14:05,799 Speaker 1: you lied to us.
I thought I had the actual 1352 01:14:05,840 --> 01:14:08,800 Speaker 1: opportunity to be signed to and work with you, and 1353 01:14:08,840 --> 01:14:12,000 Speaker 1: instead you created a bot. And so when we talk 1354 01:14:12,040 --> 01:14:15,840 Speaker 1: about how we feel about AI being integrated, are we 1355 01:14:15,960 --> 01:14:20,160 Speaker 1: upset with the actual AI, or are we upset with 1356 01:14:20,240 --> 01:14:24,360 Speaker 1: the decisions being made by people using AI? And I 1357 01:14:24,400 --> 01:14:28,800 Speaker 1: think that as human beings, when you feel betrayed, when 1358 01:14:28,840 --> 01:14:32,080 Speaker 1: you feel lied to, when you feel like, damn, this 1359 01:14:32,160 --> 01:14:34,120 Speaker 1: is going to make it even harder for me to 1360 01:14:34,600 --> 01:14:37,200 Speaker 1: have money, this is going to make it easier for 1361 01:14:37,280 --> 01:14:42,120 Speaker 1: people to steal my IP, that's where we really get upset. 1362 01:14:42,240 --> 01:14:46,240 Speaker 1: And in talking about the deregulations of AI that are happening, 1363 01:14:46,280 --> 01:14:49,880 Speaker 1: even from our president, I think the issue that we 1364 01:14:49,920 --> 01:14:53,480 Speaker 1: have with AI more so leans into our human elements 1365 01:14:53,800 --> 01:14:57,760 Speaker 1: of feeling like, fuck, this is just something else I 1366 01:14:57,800 --> 01:15:01,200 Speaker 1: have to find a way to overcome. It's another obstacle 1367 01:15:01,520 --> 01:15:03,360 Speaker 1: in front of me that's going to make it harder 1368 01:15:03,360 --> 01:15:06,360 Speaker 1: for me to pay my bills, or become the artist 1369 01:15:06,400 --> 01:15:09,360 Speaker 1: I see myself being, or having the opportunities and getting 1370 01:15:09,360 --> 01:15:12,680 Speaker 1: into the rooms I feel like I should be in. 1371 01:15:13,160 --> 01:15:16,320 Speaker 1: Because then you have the element of ego.
And so 1372 01:15:16,560 --> 01:15:19,040 Speaker 1: all of these ways in which we exist as human 1373 01:15:19,040 --> 01:15:23,599 Speaker 1: beings are essentially being tried. Yeah, because AI is being 1374 01:15:24,000 --> 01:15:29,280 Speaker 1: introduced into what has already made it fucking impossible. Because 1375 01:15:29,360 --> 01:15:33,000 Speaker 1: humans suck, humans betray you, humans lie to you, and 1376 01:15:33,040 --> 01:15:36,800 Speaker 1: now they're integrating a tool that essentially can also steal 1377 01:15:36,840 --> 01:15:40,439 Speaker 1: from you: your IP, your creativity, your mind. And there's 1378 01:15:40,720 --> 01:15:43,280 Speaker 1: nothing that you can do about it. Well, at the moment, the... 1379 01:15:43,240 --> 01:15:45,479 Speaker 4: One thing we can do about it is this: even 1380 01:15:45,520 --> 01:15:47,640 Speaker 4: as you put out an AI artist, you still need 1381 01:15:47,680 --> 01:15:50,640 Speaker 4: people to support and buy it. Unless, how, like, 1382 01:15:51,160 --> 01:15:53,280 Speaker 4: is AI going to just listen to AI? And the, 1383 01:15:53,760 --> 01:15:56,360 Speaker 4: like, you know what I'm saying? Like, we still control it. 1384 01:15:56,439 --> 01:15:58,040 Speaker 4: Like, we tune that shit out, like, yo, we don't 1385 01:15:58,040 --> 01:15:58,720 Speaker 4: want to fuck with that. 1386 01:15:58,800 --> 01:15:59,240 Speaker 3: Then that's it. 1387 01:15:59,240 --> 01:16:06,240 Speaker 1: Well, actually, to Aubrey Graham, AI can listen to AI, 1388 01:16:06,280 --> 01:16:09,160 Speaker 1: which is why he's fighting the bots and fighting the system, 1389 01:16:09,520 --> 01:16:11,600 Speaker 1: the algorithm, all of this stuff. So if we 1390 01:16:11,600 --> 01:16:14,679 Speaker 1: don't think that in a streaming era there's an element 1391 01:16:14,720 --> 01:16:17,559 Speaker 1: of AI pushing certain music to us as well, we 1392 01:16:17,640 --> 01:16:19,720 Speaker 1: got it fucking mistaken.
And so if one of the 1393 01:16:19,840 --> 01:16:22,800 Speaker 1: largest artists in the world can sue a company because 1394 01:16:22,840 --> 01:16:26,639 Speaker 1: now there's beef with whatever negotiations took place, to where 1395 01:16:26,720 --> 01:16:30,120 Speaker 1: now it's the bots that put people to number one, 1396 01:16:30,200 --> 01:16:33,400 Speaker 1: we also talk about ego in terms of, uh, 1397 01:16:34,040 --> 01:16:38,559 Speaker 1: the human experience, babe. Artists want Grammys, artists want Billboard numbers. 1398 01:16:38,720 --> 01:16:41,599 Speaker 1: We see Nicki Minaj crashing out every chance she can 1399 01:16:41,920 --> 01:16:44,720 Speaker 1: and still bringing up numbers. And so if numbers are 1400 01:16:44,760 --> 01:16:48,920 Speaker 1: still going to be our push, whether it's a dollar sign, 1401 01:16:49,000 --> 01:16:52,000 Speaker 1: whether it's a stream number, whether it's a viewership on 1402 01:16:52,040 --> 01:16:54,720 Speaker 1: a television show, whether it's YouTube, whether it's likes, whether 1403 01:16:54,760 --> 01:16:59,200 Speaker 1: it's reshares, we are still a system run by and 1404 01:16:59,240 --> 01:17:04,440 Speaker 1: now controlled with numbers. And we also thought that numbers 1405 01:17:04,720 --> 01:17:07,679 Speaker 1: didn't lie. Well, now they do, and they can, because 1406 01:17:07,680 --> 01:17:11,920 Speaker 1: now they can be manipulated, right? And so I don't 1407 01:17:11,960 --> 01:17:14,559 Speaker 1: think an artist cares about whether bots or humans are 1408 01:17:14,600 --> 01:17:18,200 Speaker 1: listening to it, as long as the funds come in, as long 1409 01:17:18,200 --> 01:17:20,920 Speaker 1: as they have them numbers so that the fucking dollars 1410 01:17:20,960 --> 01:17:24,240 Speaker 1: add up.
And so it's literally a lose-lose for 1411 01:17:24,360 --> 01:17:26,400 Speaker 1: us at this point, because at the end of the day, 1412 01:17:26,760 --> 01:17:30,200 Speaker 1: most artists, most individuals, most YouTubers, most niggas. I just 1413 01:17:30,280 --> 01:17:35,240 Speaker 1: learned about fucking Whop. You don't know about Whop? Oh baby, 1414 01:17:36,040 --> 01:17:40,840 Speaker 1: not WAP, not my wet 1415 01:17:40,840 --> 01:17:44,880 Speaker 1: ass pussy, not that, but it's Whop. I'm probably getting 1416 01:17:44,880 --> 01:17:47,000 Speaker 1: ahead of myself here because I'm about to use it. 1417 01:17:47,000 --> 01:17:51,240 Speaker 1: It is using people, right. But Whop is what all 1418 01:17:51,240 --> 01:17:54,160 Speaker 1: of the streamers, probably with Joe and a lot of 1419 01:17:54,160 --> 01:17:58,439 Speaker 1: these large podcasters, are using to pretty much get eight 1420 01:17:58,520 --> 01:18:01,679 Speaker 1: hundred clip editors, or farm clip editors, to go into 1421 01:18:01,720 --> 01:18:05,320 Speaker 1: your content. They get paid per view. Now with 1422 01:18:05,560 --> 01:18:08,559 Speaker 1: Whop there is a bot to be able to show 1423 01:18:08,760 --> 01:18:12,400 Speaker 1: if they're actually real impressions or not, but it's pretty 1424 01:18:12,439 --> 01:18:14,320 Speaker 1: much you go on there, say you have one hundred 1425 01:18:14,320 --> 01:18:16,880 Speaker 1: and fifty dollars. There's a seventeen-year-old that would 1426 01:18:16,880 --> 01:18:18,560 Speaker 1: love one hundred and fifty dollars. Well, you want a 1427 01:18:18,560 --> 01:18:20,840 Speaker 1: million impressions for one hundred and fifty dollars.
They'll go 1428 01:18:20,880 --> 01:18:23,360 Speaker 1: through your content, clip it up for you, post it 1429 01:18:23,400 --> 01:18:27,960 Speaker 1: through whatever community page they make for you, throw it 1430 01:18:28,080 --> 01:18:30,640 Speaker 1: here on X, throw it here on Facebook, throw it 1431 01:18:30,640 --> 01:18:33,679 Speaker 1: here on YouTube, and within a certain amount of time, 1432 01:18:33,800 --> 01:18:37,599 Speaker 1: once they get you the million impressions, he gets one 1433 01:18:37,680 --> 01:18:40,080 Speaker 1: hundred and fifty bucks. And so he has to clip 1434 01:18:40,120 --> 01:18:41,400 Speaker 1: up as many things as he can to get you 1435 01:18:41,400 --> 01:18:44,800 Speaker 1: those impressions. And so this is where we're currently being 1436 01:18:45,160 --> 01:18:48,639 Speaker 1: flooded with the clips of a Kai Cenat or a Joe Budden, 1437 01:18:49,080 --> 01:18:53,719 Speaker 1: or a PlaqueBoyMax or a DDG or India Love. 1438 01:18:53,840 --> 01:18:57,479 Speaker 1: All these streamers have these hundreds of fucking guys on 1439 01:18:57,520 --> 01:19:00,519 Speaker 1: there clipping up these people to constantly let you in 1440 01:19:00,560 --> 01:19:04,479 Speaker 1: and see it, for very low dollars. And then 1441 01:19:04,520 --> 01:19:07,160 Speaker 1: there's the AI bot making sure it's real. And so again, 1442 01:19:07,479 --> 01:19:10,320 Speaker 1: as long as we're in a system where your views, 1443 01:19:10,400 --> 01:19:15,040 Speaker 1: your impressions, your likes, your listens, your streams equate to money, 1444 01:19:17,200 --> 01:19:21,439 Speaker 1: I don't think anyone cares where it's coming from. Numbers 1445 01:19:21,520 --> 01:19:23,960 Speaker 1: matter in terms of your pockets, as long as they 1446 01:19:24,000 --> 01:19:27,799 Speaker 1: get the bread. And back to capitalism, that's where we're at.
1447 01:19:27,960 --> 01:19:32,280 Speaker 1: It's why seeing a CEO that created AI at for 1448 01:19:32,600 --> 01:19:36,559 Speaker 1: doctors is already a billionaire doctors is a very fit 1449 01:19:36,920 --> 01:19:40,880 Speaker 1: like that's something where it's still a person, but in 1450 01:19:40,880 --> 01:19:46,000 Speaker 1: whatever integration AI is now being implemented into healthcare, life 1451 01:19:46,000 --> 01:19:49,000 Speaker 1: and death. And again with the deregulation, there's probably going 1452 01:19:49,080 --> 01:19:52,880 Speaker 1: to be a lot of loopholes to save you as 1453 01:19:52,920 --> 01:19:56,960 Speaker 1: well from if something goes wrong with AI. It's not 1454 01:19:57,160 --> 01:19:59,720 Speaker 1: at the hands of the doctor, which I'm sure for 1455 01:19:59,760 --> 01:20:01,800 Speaker 1: any doctor who spent all the years in school that 1456 01:20:01,840 --> 01:20:03,840 Speaker 1: they did would love to be able to wipe their 1457 01:20:03,840 --> 01:20:06,760 Speaker 1: hands clean of something going wrong. Because guess what can 1458 01:20:06,800 --> 01:20:09,800 Speaker 1: happen whether you're in a car with nobody or a 1459 01:20:09,800 --> 01:20:13,720 Speaker 1: car with somebody, it's inevitable that something can and will 1460 01:20:13,720 --> 01:20:16,160 Speaker 1: always go wrong. Whether it happens to you or not 1461 01:20:16,280 --> 01:20:17,880 Speaker 1: is the factor. 1462 01:20:19,200 --> 01:20:19,760 Speaker 3: No, that's interesting. 1463 01:20:20,120 --> 01:20:21,559 Speaker 2: I know we don't do that, like did you change 1464 01:20:21,560 --> 01:20:22,559 Speaker 2: your mind yet anymore? 1465 01:20:22,720 --> 01:20:25,839 Speaker 5: But here you see this. Hearing you say this last part. 
1466 01:20:25,680 --> 01:20:29,360 Speaker 2: Reminds me, like I think it feels like to me now, 1467 01:20:29,520 --> 01:20:32,280 Speaker 2: like because of the capitalism element of it all when 1468 01:20:32,280 --> 01:20:33,880 Speaker 2: it comes to AI, because it's a tool, right, it's 1469 01:20:33,880 --> 01:20:35,680 Speaker 2: a tool and people are trying to make money. I 1470 01:20:35,680 --> 01:20:40,320 Speaker 2: feel like the anxiety is a result of like the 1471 01:20:40,320 --> 01:20:44,200 Speaker 2: potential for one person to save money fighting the potential 1472 01:20:44,320 --> 01:20:46,960 Speaker 2: for somebody feeling like they may lose money or opportunity, 1473 01:20:47,240 --> 01:20:50,200 Speaker 2: and the AI is just a tool that makes that 1474 01:20:50,240 --> 01:20:52,360 Speaker 2: fight happen. And I think that's why everybody's trying to 1475 01:20:52,400 --> 01:20:55,439 Speaker 2: having this anxiety when they hear AI. I wonder if 1476 01:20:55,479 --> 01:20:57,920 Speaker 2: like that I don't need that robot mess. It's really 1477 01:20:58,000 --> 01:21:00,479 Speaker 2: just shorthand for something else, right, And it's not the 1478 01:21:00,560 --> 01:21:03,160 Speaker 2: robots that they're afraid of. They're just like, yo, this 1479 01:21:03,200 --> 01:21:05,759 Speaker 2: is just another way for shitty humans to fuck. 1480 01:21:05,640 --> 01:21:09,760 Speaker 1: Me period pause as well. I don't like shame mis 1481 01:21:09,760 --> 01:21:12,840 Speaker 1: fucking me, but yeah, at the end of the day, 1482 01:21:12,880 --> 01:21:17,040 Speaker 1: I would like to know. 
Go and join us over 1483 01:21:17,200 --> 01:21:22,559 Speaker 1: on Instagram at Selective Ignorance Pod, join us on 1484 01:21:22,600 --> 01:21:28,160 Speaker 1: the Patreon, the Discord, that's patreon dot com backslash Selective Ignorance, 1485 01:21:28,320 --> 01:21:31,479 Speaker 1: or head on over to the YouTube channel at With 1486 01:21:31,640 --> 01:21:35,680 Speaker 1: Mandy B. You could see the episodes there, Selective Ignorance, all 1487 01:21:35,680 --> 01:21:38,680 Speaker 1: of the clips, and I want to further talk about this 1488 01:21:38,680 --> 01:21:40,640 Speaker 1: in another episode. Thank you guys for having this 1489 01:21:40,680 --> 01:21:42,600 Speaker 1: conversation with me. Thank y'all for listening. I hope that 1490 01:21:42,680 --> 01:21:45,719 Speaker 1: you are deeper into your thoughts on what AI means, 1491 01:21:45,720 --> 01:21:49,000 Speaker 1: what it's doing, all the things. Make sure you tune 1492 01:21:49,040 --> 01:21:52,000 Speaker 1: in to us next week, as well as the bonus episodes 1493 01:21:52,040 --> 01:21:55,479 Speaker 1: I've been dropping. My book club for No Holds Barred, 1494 01:21:55,520 --> 01:21:58,360 Speaker 1: that I've been recording virtually, so every Friday y'all are 1495 01:21:58,360 --> 01:22:01,080 Speaker 1: getting a bonus episode. This week, we're talking about 1496 01:22:01,160 --> 01:22:04,800 Speaker 1: the progressive portion of the book No Holds Barred. If 1497 01:22:04,800 --> 01:22:08,000 Speaker 1: you haven't got it yet, get it wherever you get books, y'all. 1498 01:22:08,040 --> 01:22:12,040 Speaker 1: This is another episode, though, of Selective Ignorance. I'm your girl, 1499 01:22:12,080 --> 01:22:17,000 Speaker 1: Mandy B. And this is where curiosity lives, controversy thrives, 1500 01:22:17,000 --> 01:22:25,040 Speaker 1: and conversations matter. See you next week.
Selective Ignorance is a 1501 01:22:25,120 --> 01:22:28,200 Speaker 1: production of the Black Effect Podcast Network. For more podcasts 1502 01:22:28,240 --> 01:22:32,799 Speaker 1: from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever 1503 01:22:32,920 --> 01:22:34,200 Speaker 1: you listen to your favorite shows. 1504 01:22:34,320 --> 01:22:36,600 Speaker 2: Thanks for tuning in to Selective Ignorance with Mandy B. 1505 01:22:36,800 --> 01:22:39,760 Speaker 2: Selective Ignorance is executive produced by Mandy B. And 1506 01:22:39,800 --> 01:22:43,200 Speaker 2: it's a Full Court Media studio production with lead producers 1507 01:22:43,280 --> 01:22:45,800 Speaker 2: Jason Mondriguez, that's me, and Aaron A. 1508 01:22:45,960 --> 01:22:46,559 Speaker 5: King Howell. 1509 01:22:46,640 --> 01:22:49,600 Speaker 2: Now, do us a favor and rate, subscribe, comment, and 1510 01:22:49,680 --> 01:22:52,800 Speaker 2: share wherever you get your favorite podcasts, and be sure 1511 01:22:52,840 --> 01:22:57,519 Speaker 2: to follow Selective Ignorance on Instagram at Selective Underscore Ignorance. 1512 01:22:57,520 --> 01:22:59,800 Speaker 2: And of course, if you're not following our host Mandy B, 1513 01:23:00,240 --> 01:23:02,360 Speaker 2: make sure you're following her at Full Court Pumps. 1514 01:23:02,360 --> 01:23:02,479 Speaker 1: Now. 1515 01:23:02,520 --> 01:23:05,320 Speaker 2: If you want the full video experience of Selective Ignorance, 1516 01:23:05,439 --> 01:23:08,439 Speaker 2: make sure you subscribe to the Patreon. It's patreon dot 1517 01:23:08,479 --> 01:23:11,200 Speaker 2: com backslash Selective Ignorance