1 00:00:00,080 --> 00:00:03,400 Speaker 1: Native Land Pod is a production of iHeartRadio in partnership 2 00:00:03,440 --> 00:00:05,359 Speaker 1: with Reasoned Choice Media. 3 00:00:05,559 --> 00:00:08,320 Speaker 2: Welcome, Welcome, Welcome, Welcome, Welcome, Welcome. 4 00:00:10,240 --> 00:00:13,600 Speaker 3: All right, everybody, welcome home. This is Native Land Pod. 5 00:00:13,720 --> 00:00:15,680 Speaker 3: It is a mini pod, and we are so thrilled 6 00:00:15,760 --> 00:00:19,080 Speaker 3: to have some very, very special guests joining us today. 7 00:00:19,800 --> 00:00:22,840 Speaker 3: Isaac Hayes is the founder and creator of Fanbase, 8 00:00:23,200 --> 00:00:25,959 Speaker 3: and he is joined today by a producer who recently 9 00:00:26,000 --> 00:00:31,360 Speaker 3: had a very viral conversation with him, Tamisha Harris, about 10 00:00:31,840 --> 00:00:36,120 Speaker 3: Fanbase and why we should be utilizing our own platforms, 11 00:00:36,280 --> 00:00:37,800 Speaker 3: especially in this day and age. We just had a 12 00:00:37,800 --> 00:00:40,400 Speaker 3: whole conversation about fascism. We know a little bit about that. 13 00:00:40,479 --> 00:00:43,320 Speaker 3: We are witnessing it all together. So we are grateful 14 00:00:43,360 --> 00:00:45,479 Speaker 3: for you all coming here and joining us today. 15 00:00:46,320 --> 00:00:47,360 Speaker 4: Thanks for having us. 16 00:00:48,479 --> 00:00:50,440 Speaker 1: I'd love if we could get started. One, give us 17 00:00:50,479 --> 00:00:52,120 Speaker 1: a little background 18 00:00:51,760 --> 00:00:54,960 Speaker 5: on each of you. What brings you really to this work? 19 00:00:56,680 --> 00:00:58,360 Speaker 1: Mister Hayes, you can imagine that there are a lot 20 00:00:58,400 --> 00:00:59,120 Speaker 1: of fans out 21 00:00:58,960 --> 00:01:00,680 Speaker 5: there of your, of your name.
22 00:01:01,240 --> 00:01:07,280 Speaker 1: Uh, but yeah, but your family, obviously. And uh, is 23 00:01:07,319 --> 00:01:10,720 Speaker 1: it Tanisha? Tamisha? Yeah, I thought it was an M, 24 00:01:11,000 --> 00:01:15,640 Speaker 1: but I thought I wrote it down wrong. Yeah. Tamisha, 25 00:01:15,800 --> 00:01:18,039 Speaker 1: I'm curious, because I saw your video as well, and 26 00:01:18,080 --> 00:01:23,240 Speaker 1: I was telling the ladies before how not alarmist you sounded. 27 00:01:23,280 --> 00:01:25,080 Speaker 1: I know you referenced, you know, I'm not, you know, 28 00:01:25,120 --> 00:01:27,960 Speaker 1: I'm not being sensational. I'm telling 29 00:01:27,959 --> 00:01:31,679 Speaker 1: you that we're under threat. Give us and our audience 30 00:01:31,720 --> 00:01:33,800 Speaker 1: a little bit of background as to what brings you, 31 00:01:34,280 --> 00:01:37,080 Speaker 1: one, to that point, and I think, basically, through your background, 32 00:01:37,080 --> 00:01:39,440 Speaker 1: they'll understand it a little bit better. Then obviously, same 33 00:01:39,440 --> 00:01:40,600 Speaker 1: for you, Mr. Hayes. 34 00:01:40,880 --> 00:01:44,360 Speaker 4: Thanks, Andrew. So I think, for me, I have been 35 00:01:45,160 --> 00:01:49,440 Speaker 4: a news producer, local and network news, since, I 36 00:01:49,480 --> 00:01:55,560 Speaker 4: want to say, twenty eleven, and so, you know, watching 37 00:01:55,600 --> 00:01:59,640 Speaker 4: how stories are being told, watching how our community is 38 00:01:59,680 --> 00:02:02,600 Speaker 4: impacted by the way we tell stories or the way 39 00:02:02,640 --> 00:02:07,000 Speaker 4: that we don't tell stories, has always been of concern 40 00:02:07,120 --> 00:02:10,040 Speaker 4: to me. But, you know, we keep working, you know, 41 00:02:10,080 --> 00:02:11,560 Speaker 4: because you have a job to do, so you just 42 00:02:11,639 --> 00:02:14,200 Speaker 4: keep working and you do what you're supposed to do.
43 00:02:14,560 --> 00:02:20,600 Speaker 4: For me, I started by making sure we brought women 44 00:02:20,720 --> 00:02:25,280 Speaker 4: on air, that our lineup of subject matter experts was diverse, 45 00:02:25,680 --> 00:02:27,600 Speaker 4: so we can hear from their point of view. Even 46 00:02:27,639 --> 00:02:31,120 Speaker 4: an Angela Rye. Angela, you've been on several of my shows. 47 00:02:31,400 --> 00:02:36,600 Speaker 4: So it's bringing on guests who can speak from 48 00:02:36,639 --> 00:02:40,680 Speaker 4: a certain perspective that our community can understand. That's important 49 00:02:40,680 --> 00:02:46,000 Speaker 4: to me. And then from there it started to become 50 00:02:46,040 --> 00:02:51,200 Speaker 4: a more critical issue for me. Once 51 00:02:51,240 --> 00:02:58,000 Speaker 4: the networks started losing their base, their viewers, to social media, 52 00:02:58,240 --> 00:03:01,960 Speaker 4: it's becoming clear that's where we're getting our news, 53 00:03:03,200 --> 00:03:06,760 Speaker 4: and it's a good time for us to take hold 54 00:03:06,800 --> 00:03:13,079 Speaker 4: of it and get our legacy anchors, the Tiffany Crosses, 55 00:03:13,200 --> 00:03:16,600 Speaker 4: get them, get them in one space, so that we're 56 00:03:16,600 --> 00:03:18,880 Speaker 4: not all spread out, so we can come as a 57 00:03:18,919 --> 00:03:22,520 Speaker 4: collective and be incredibly powerful.
Now that we can own 58 00:03:22,680 --> 00:03:26,600 Speaker 4: the space and get organized, so that when anything does 59 00:03:26,720 --> 00:03:29,640 Speaker 4: come down, because we all know that things get organized, 60 00:03:29,680 --> 00:03:33,920 Speaker 4: things are being organized, that we are also organized and 61 00:03:33,960 --> 00:03:36,760 Speaker 4: we are forceful, and that we're bringing the correct information 62 00:03:37,240 --> 00:03:39,920 Speaker 4: to the people who need it most, to the people 63 00:03:39,960 --> 00:03:45,640 Speaker 4: who, uh, mainstream media may not focus on, because 64 00:03:45,680 --> 00:03:49,640 Speaker 4: they have other focuses. They have a wide group of 65 00:03:49,640 --> 00:03:52,920 Speaker 4: people that they need to focus on. But we, we 66 00:03:52,960 --> 00:03:57,320 Speaker 4: have a collective. We have specific 67 00:03:57,440 --> 00:04:01,280 Speaker 4: challenges in our community that we have to address, and 68 00:04:01,320 --> 00:04:04,920 Speaker 4: so that's what brings me. That's why I say Fanbase 69 00:04:05,000 --> 00:04:08,840 Speaker 4: is a good opportunity for us. I 70 00:04:08,840 --> 00:04:10,960 Speaker 4: don't have a vested interest in Fanbase. I have 71 00:04:11,000 --> 00:04:17,440 Speaker 4: a vested interest in getting our stories out, 72 00:04:17,640 --> 00:04:19,600 Speaker 4: getting our stories told accurately. 73 00:04:20,600 --> 00:04:20,800 Speaker 3: To me. 74 00:04:20,880 --> 00:04:21,080 Speaker 5: Sure. 75 00:04:21,120 --> 00:04:23,880 Speaker 3: I love that you said that you don't have a 76 00:04:24,000 --> 00:04:28,160 Speaker 3: vested interest in Fanbase, because there's an article that 77 00:04:28,320 --> 00:04:32,800 Speaker 3: just came out about creators who have been paid by 78 00:04:32,800 --> 00:04:35,680 Speaker 3: a platform, and there are all of these allegations.
So the 79 00:04:35,720 --> 00:04:37,840 Speaker 3: fact that you're like, hey, let me disclose this, this 80 00:04:37,920 --> 00:04:39,560 Speaker 3: is what I just believe is the right thing to do. 81 00:04:39,640 --> 00:04:42,320 Speaker 3: And to that point, we actually were challenged by a 82 00:04:42,440 --> 00:04:45,480 Speaker 3: Native Land Pod viewer who said, why aren't y'all on Fanbase? 83 00:04:45,480 --> 00:04:46,839 Speaker 3: And we were on the pod and I was like, 84 00:04:46,920 --> 00:04:49,239 Speaker 3: you know what, we need to be on Fanbase. 85 00:04:49,279 --> 00:04:52,200 Speaker 3: So I don't know if Tiff and Andrew started 86 00:04:52,240 --> 00:04:53,919 Speaker 3: their accounts yet, but I did mine. 87 00:04:54,000 --> 00:04:56,039 Speaker 4: I opened up your... go ahead. 88 00:04:56,080 --> 00:04:59,800 Speaker 3: Good. So I started mine. We have a Native 89 00:04:59,839 --> 00:05:02,680 Speaker 3: Land one. Isaac, I want to hear from you about 90 00:05:02,680 --> 00:05:05,719 Speaker 3: why this is so important. Your biases, of course: you 91 00:05:05,760 --> 00:05:07,719 Speaker 3: are the owner of this platform. But why is it 92 00:05:07,760 --> 00:05:10,080 Speaker 3: important for us to have our own, be in our 93 00:05:10,160 --> 00:05:12,359 Speaker 3: own space, especially in a violent time like this? 94 00:05:13,640 --> 00:05:14,839 Speaker 5: Well, he's ahead of the curve. 95 00:05:15,960 --> 00:05:18,840 Speaker 2: Yeah. Well, from the beginning of social media, I think 96 00:05:18,920 --> 00:05:22,440 Speaker 2: Black people have always influenced the culture.
If we go 97 00:05:22,520 --> 00:05:26,280 Speaker 2: back to BlackPlanet, a BlackPlanet page looks very similar 98 00:05:26,279 --> 00:05:30,560 Speaker 2: to a MySpace page, and between now and then, we 99 00:05:30,640 --> 00:05:33,920 Speaker 2: haven't really had a social media platform founded by someone 100 00:05:33,960 --> 00:05:37,320 Speaker 2: that's African American, or at least one that's founded 101 00:05:37,320 --> 00:05:39,960 Speaker 2: by someone African American that has 102 00:05:40,000 --> 00:05:42,640 Speaker 2: achieved the level of success that Fanbase has and 103 00:05:42,680 --> 00:05:47,040 Speaker 2: the opportunity there. I'm more interested in infrastructure, because the 104 00:05:47,040 --> 00:05:50,040 Speaker 2: infrastructures of these social media platforms reach hundreds of billions 105 00:05:50,080 --> 00:05:53,200 Speaker 2: of dollars, and the people that own those platforms, the 106 00:05:53,240 --> 00:05:56,520 Speaker 2: people that monetize those platforms, take that wealth 107 00:05:56,800 --> 00:05:59,400 Speaker 2: and they pass agendas that work against Black people. 108 00:06:00,320 --> 00:06:02,720 Speaker 5: So Elon has X, there's 109 00:06:02,520 --> 00:06:06,160 Speaker 2: Truth Social, Zuckerberg tends to flip-flop between whatever it 110 00:06:06,240 --> 00:06:08,919 Speaker 2: is, and whatever's going to happen with TikTok, I'm pretty 111 00:06:08,920 --> 00:06:10,719 Speaker 2: sure the Trump administration is going to have their hand 112 00:06:10,760 --> 00:06:13,039 Speaker 2: in that. And so the design for Fanbase was to 113 00:06:13,040 --> 00:06:15,119 Speaker 2: build something that was Black-owned, but for every single 114 00:06:15,160 --> 00:06:17,279 Speaker 2: person on the planet to use, because we have to 115 00:06:17,279 --> 00:06:18,800 Speaker 2: look out for ourselves.
We have to make sure that 116 00:06:18,800 --> 00:06:22,400 Speaker 2: we're not getting shadow banned, our accounts aren't getting deprioritized, 117 00:06:22,720 --> 00:06:25,720 Speaker 2: we're not getting pushed off these platforms, and then, more importantly, 118 00:06:26,480 --> 00:06:30,560 Speaker 2: owning these platforms. Because, again, the idea behind creating a 119 00:06:30,600 --> 00:06:32,760 Speaker 2: social media platform that people could actually have equity in 120 00:06:33,160 --> 00:06:35,120 Speaker 2: is because we know that Black culture plus social media 121 00:06:35,200 --> 00:06:38,520 Speaker 2: equals billion-dollar platforms, but we never have equity and 122 00:06:38,560 --> 00:06:41,120 Speaker 2: ownership of these platforms. And then we beg to be respected, 123 00:06:41,400 --> 00:06:43,520 Speaker 2: we beg to be let in, we beg to be acknowledged, 124 00:06:43,560 --> 00:06:45,640 Speaker 2: we beg to be equal, and no one has ever 125 00:06:45,720 --> 00:06:48,240 Speaker 2: gone and just taken the step of, let's build the infrastructure. 126 00:06:48,640 --> 00:06:51,320 Speaker 2: Like, I always look at Black culture like vibranium. This 127 00:06:51,360 --> 00:06:54,159 Speaker 2: is our vibranium. We're the only people on planet Earth 128 00:06:54,160 --> 00:06:56,080 Speaker 2: that can make this. We're the smallest culture group on 129 00:06:56,120 --> 00:06:59,320 Speaker 2: the planet and we're the most influential: African American people. 130 00:06:59,520 --> 00:07:01,359 Speaker 2: When you pour Black culture into a shoe, 131 00:07:01,480 --> 00:07:03,960 Speaker 2: you get a Jordan. You pour Black culture into 132 00:07:03,960 --> 00:07:06,440 Speaker 2: a record player, you get DJ culture. When you pour 133 00:07:06,520 --> 00:07:09,200 Speaker 2: Black culture into a phone, you get social media.
But 134 00:07:09,240 --> 00:07:10,880 Speaker 2: we have to think about who's going to own those 135 00:07:10,880 --> 00:07:14,400 Speaker 2: infrastructures first, because they're literally scaling these things to hundreds 136 00:07:14,400 --> 00:07:16,800 Speaker 2: of billions of dollars, and we have to start creating 137 00:07:16,880 --> 00:07:19,840 Speaker 2: generational wealth, and on top of that, we have to 138 00:07:19,840 --> 00:07:23,720 Speaker 2: start creating wealth collectively. I am happy for everybody that's 139 00:07:23,760 --> 00:07:26,400 Speaker 2: Black in this country that has had individual success, all 140 00:07:26,440 --> 00:07:29,600 Speaker 2: the millionaires and billionaires, but that is doing nothing to 141 00:07:29,640 --> 00:07:30,760 Speaker 2: pull Black people out of 142 00:07:30,680 --> 00:07:31,280 Speaker 5: the wealth gap. 143 00:07:31,520 --> 00:07:33,160 Speaker 2: So what we need to start to do is invest 144 00:07:33,200 --> 00:07:37,400 Speaker 2: in infrastructures that are technologically based, so tech-founded, 145 00:07:37,480 --> 00:07:40,960 Speaker 2: tech-based, that need Black culture to survive, and then 146 00:07:41,080 --> 00:07:44,040 Speaker 2: invest in those by the thousands, and then scale those 147 00:07:44,040 --> 00:07:47,280 Speaker 2: platforms to hundreds of billions of dollars. Not a billion, 148 00:07:47,640 --> 00:07:50,600 Speaker 2: not twenty billion: two hundred, three hundred billion. Like, ByteDance 149 00:07:50,680 --> 00:07:54,320 Speaker 2: is worth three hundred and thirty billion dollars. Instagram, separated 150 00:07:54,320 --> 00:07:57,160 Speaker 2: from Meta, is a four hundred billion dollar company. Meta 151 00:07:57,200 --> 00:08:00,360 Speaker 2: itself has a one point eight five trillion dollar market cap. 152 00:08:00,640 --> 00:08:03,320 Speaker 2: We need something like that in our community that we own.
153 00:08:03,520 --> 00:08:06,440 Speaker 2: So rather than having one Zuckerberg, we can have 154 00:08:06,480 --> 00:08:10,040 Speaker 2: twenty-three thousand multimillionaires and billionaires that are invested in 155 00:08:10,040 --> 00:08:11,480 Speaker 2: a platform and scale it up. 156 00:08:11,880 --> 00:08:13,600 Speaker 4: I know you guys want to, I know you guys 157 00:08:13,680 --> 00:08:15,120 Speaker 4: have to jump in. I just want to say one 158 00:08:15,200 --> 00:08:19,160 Speaker 4: thing real quick. If I'm looking at it just as 159 00:08:19,160 --> 00:08:22,400 Speaker 4: a viewer, as a person who watches the news, 160 00:08:22,480 --> 00:08:26,600 Speaker 4: and I'm watching Tiffany, and I trust Tiffany. Tiffany is 161 00:08:26,640 --> 00:08:29,080 Speaker 4: the person that I watch every single day, every single night. 162 00:08:29,120 --> 00:08:31,760 Speaker 4: I trust, I trust Angela. These are the faces that 163 00:08:31,800 --> 00:08:36,760 Speaker 4: I'm used to seeing, so I trust them. Andrew, 164 00:08:37,880 --> 00:08:43,440 Speaker 4: if, for whatever reason, you all are shadow banned, the 165 00:08:43,559 --> 00:08:46,320 Speaker 4: trusted people that I need to get to, I can't 166 00:08:46,400 --> 00:08:50,080 Speaker 4: get to you, especially when there's information that needs to 167 00:08:50,080 --> 00:08:54,240 Speaker 4: get out. That's my big push with Fanbase right now, 168 00:08:54,720 --> 00:08:58,240 Speaker 4: particularly with this administration.
If there's something that needs to 169 00:08:58,240 --> 00:09:03,080 Speaker 4: get out, and for whatever reason these tech founders are 170 00:09:03,280 --> 00:09:09,080 Speaker 4: acquiescing to this administration, we, our community, can't afford 171 00:09:09,160 --> 00:09:11,080 Speaker 4: not to hear what Angela has to say, not 172 00:09:11,120 --> 00:09:13,520 Speaker 4: to hear what Tiffany has to say, and what Andrew 173 00:09:13,520 --> 00:09:14,840 Speaker 4: has to say. We can't afford it. 174 00:09:14,880 --> 00:09:15,960 Speaker 5: We have to know what to do. 175 00:09:16,559 --> 00:09:19,199 Speaker 4: So that's why it's important 176 00:09:19,200 --> 00:09:22,520 Speaker 4: for us to be in on it, on a platform 177 00:09:22,600 --> 00:09:25,880 Speaker 4: that will not silence us, regardless of how you feel 178 00:09:25,920 --> 00:09:29,840 Speaker 4: about Isaac or whatever. It's bigger. 179 00:09:30,360 --> 00:09:32,960 Speaker 6: I wanted to jump in here on one, just 180 00:09:33,000 --> 00:09:35,679 Speaker 6: for context for our viewers. Isaac referenced ByteDance. That is, 181 00:09:35,720 --> 00:09:41,000 Speaker 6: of course, the parent company of TikTok, which 182 00:09:41,080 --> 00:09:43,720 Speaker 6: the DOJ, the FTC, all the people said 183 00:09:43,760 --> 00:09:46,800 Speaker 6: has to be sold in order for it to 184 00:09:47,080 --> 00:09:49,360 Speaker 6: operate here in the United States. So it's caught up 185 00:09:49,400 --> 00:09:51,800 Speaker 6: in a lot of policy. So we'll keep our 186 00:09:51,840 --> 00:09:54,199 Speaker 6: eyes on it and keep our viewers informed on what happens there.
187 00:09:54,480 --> 00:09:57,560 Speaker 6: My question, I want to go to you, Tamisha, 188 00:09:57,640 --> 00:10:01,280 Speaker 6: on this, because long before I was ever on air, 189 00:10:01,520 --> 00:10:04,680 Speaker 6: I was a producer and executive producer, bureau chief, and 190 00:10:04,720 --> 00:10:07,120 Speaker 6: so I would trade war stories with you, my friend, 191 00:10:07,160 --> 00:10:09,760 Speaker 6: as a fellow survivor of network news, because I know 192 00:10:09,840 --> 00:10:12,400 Speaker 6: how challenging that is as a Black woman, navigating 193 00:10:12,480 --> 00:10:14,880 Speaker 6: some of those spaces and advocating for stories, as I 194 00:10:14,880 --> 00:10:19,760 Speaker 6: imagine you did in your long career. Part of 195 00:10:19,920 --> 00:10:23,000 Speaker 6: what you're describing I have experienced as a double-edged 196 00:10:23,000 --> 00:10:28,200 Speaker 6: sword when people began getting their news from social media. 197 00:10:28,720 --> 00:10:32,600 Speaker 6: One, I understood it, because mainstream media summarily dismissed our 198 00:10:32,679 --> 00:10:36,480 Speaker 6: lived experience. If the only time you saw outrage about 199 00:10:36,920 --> 00:10:40,200 Speaker 6: a kid in a hoodie being shot was on a meme, 200 00:10:40,360 --> 00:10:42,680 Speaker 6: or a twelve-year-old being shot, or a Black 201 00:10:42,679 --> 00:10:46,080 Speaker 6: woman winding up dead after being arrested; if the only 202 00:10:46,160 --> 00:10:49,000 Speaker 6: place that your interests and your stories and your outrage 203 00:10:49,040 --> 00:10:52,839 Speaker 6: were reflected back to you was on Facebook or Instagram, 204 00:10:53,000 --> 00:10:57,560 Speaker 6: of course that is where you were going to gather.
However, 205 00:10:57,760 --> 00:11:01,600 Speaker 6: what we saw, particularly leading up to the twenty sixteen 206 00:11:01,640 --> 00:11:06,880 Speaker 6: election, is bad actors, foreign adversaries, completely infiltrated these places 207 00:11:06,960 --> 00:11:11,439 Speaker 6: and pretended to be, quote unquote, journalists, influencers, commentators, 208 00:11:11,440 --> 00:11:14,320 Speaker 6: et cetera. And they were gaining traction. Of course, that 209 00:11:14,440 --> 00:11:18,079 Speaker 6: was the IRA, the Internet Research Agency, who spent millions 210 00:11:18,080 --> 00:11:21,920 Speaker 6: of dollars on these campaigns, reaching millions of Black people, 211 00:11:21,960 --> 00:11:24,600 Speaker 6: and we certainly saw the results. Now you cut to 212 00:11:24,720 --> 00:11:28,400 Speaker 6: years later. I imagine China has vested interests in the information 213 00:11:28,480 --> 00:11:31,080 Speaker 6: we get. I imagine Iran has vested interests in the 214 00:11:31,080 --> 00:11:34,880 Speaker 6: information we get. So as social media became more democratized 215 00:11:34,920 --> 00:11:37,120 Speaker 6: in who has a voice, I do fear that people 216 00:11:38,160 --> 00:11:41,080 Speaker 6: missed who is an actual reporter, who is a journalist, 217 00:11:41,200 --> 00:11:44,320 Speaker 6: because having an opinion does not make you a journalist. 218 00:11:44,520 --> 00:11:48,199 Speaker 6: I mean, Native Land Pod, we are technically under a news category, 219 00:11:48,240 --> 00:11:50,880 Speaker 6: but we are not reporting. I try to inform. I 220 00:11:50,880 --> 00:11:52,960 Speaker 6: think we all try to inform, but it's a lot 221 00:11:52,960 --> 00:11:56,679 Speaker 6: of opining.
And so, because someone comes out and, you know, 222 00:11:57,440 --> 00:12:01,079 Speaker 6: because someone has two million followers, their podcast gets two 223 00:12:01,080 --> 00:12:04,280 Speaker 6: million viewers, people will take a Joe Rogan like, oh, 224 00:12:04,360 --> 00:12:08,200 Speaker 6: you're a voice of authority, and that to me has 225 00:12:08,280 --> 00:12:13,040 Speaker 6: created a danger. It has aided misinformation and disinformation. So 226 00:12:13,160 --> 00:12:16,200 Speaker 6: as you're promoting people getting their news on social media, 227 00:12:16,320 --> 00:12:19,959 Speaker 6: how do you balance that with who to identify as 228 00:12:20,000 --> 00:12:22,880 Speaker 6: a trusted voice? How to know the difference between someone 229 00:12:22,880 --> 00:12:26,360 Speaker 6: who is a commentator offering an opinion, someone who's an 230 00:12:26,360 --> 00:12:30,160 Speaker 6: actual reporter who did the reporting themselves, someone who's citing 231 00:12:30,200 --> 00:12:34,319 Speaker 6: someone else's reporting? Because those are all very crucial things, 232 00:12:34,559 --> 00:12:36,920 Speaker 6: especially at a time like this, when fascism is on 233 00:12:36,960 --> 00:12:39,319 Speaker 6: the rise because of misinformation and disinformation. 234 00:12:41,360 --> 00:12:45,400 Speaker 4: I love that question so much, Tiffany. And here's why: 235 00:12:46,600 --> 00:12:49,200 Speaker 4: we're moving to social media. Whether we like it, we're here. 236 00:12:49,720 --> 00:12:52,280 Speaker 4: So whether we like it or not, we're already here. 237 00:12:52,679 --> 00:12:57,920 Speaker 4: This is mainstream TV, right? There is a clear difference, 238 00:12:58,000 --> 00:13:00,720 Speaker 4: if someone wants the real information.
There's a 239 00:13:00,760 --> 00:13:06,760 Speaker 4: clear difference in a Tiffany Cross as opposed to someone 240 00:13:06,800 --> 00:13:09,440 Speaker 4: who just started as a journalist or someone who doesn't 241 00:13:09,480 --> 00:13:13,120 Speaker 4: have the information, because Tiffany is going to give us 242 00:13:13,160 --> 00:13:16,680 Speaker 4: the background story. Tiffany has done the research. You hear 243 00:13:16,720 --> 00:13:19,040 Speaker 4: it and you know it, Tiffany. You know when you 244 00:13:19,200 --> 00:13:21,679 Speaker 4: hear a real journalist. That person is going to give 245 00:13:21,679 --> 00:13:24,520 Speaker 4: you both sides. That person is going to bring on 246 00:13:24,760 --> 00:13:27,640 Speaker 4: guests so we can get the full picture. There's context there. 247 00:13:28,000 --> 00:13:31,120 Speaker 4: A person knows, if they want the real information. It's 248 00:13:31,400 --> 00:13:36,199 Speaker 4: very clear what Angela sounds like. It's very clear what 249 00:13:36,280 --> 00:13:37,600 Speaker 4: you sound like. That's not... 250 00:13:40,600 --> 00:13:46,839 Speaker 6: Sophisticated enough? As we continue to cord-cut from 251 00:13:47,000 --> 00:13:51,120 Speaker 6: cable news, right, after being ignored, younger generations are not 252 00:13:51,320 --> 00:13:54,120 Speaker 6: reading newspapers. Like, literally. I hear so many people 253 00:13:54,160 --> 00:13:56,000 Speaker 6: say I want to be a journalist, and they have 254 00:13:56,120 --> 00:13:58,679 Speaker 6: no idea what that means. They think I'll get 255 00:13:58,720 --> 00:14:00,520 Speaker 6: to go on TV and talk, and I'm like, journalists 256 00:14:00,520 --> 00:14:03,319 Speaker 6: don't talk. They listen. Like, you are not giving an 257 00:14:03,320 --> 00:14:06,800 Speaker 6: opinion as a journalist. So I don't know that...
258 00:14:07,040 --> 00:14:10,880 Speaker 4: The same way we were taught, Tiffany. The same way 259 00:14:10,880 --> 00:14:13,679 Speaker 4: we were taught: what is your source? We always 260 00:14:13,760 --> 00:14:17,160 Speaker 4: ask that question, what's the source? And then, who's paying them? 261 00:14:17,520 --> 00:14:20,360 Speaker 4: We have to know. We have to teach people how 262 00:14:20,400 --> 00:14:23,760 Speaker 4: to learn, teach people how to source, if this is 263 00:14:23,840 --> 00:14:25,760 Speaker 4: real, if this is all the 264 00:14:25,800 --> 00:14:28,200 Speaker 4: information that they need, if this is a true source. 265 00:14:28,200 --> 00:14:31,240 Speaker 4: We teach them, because they're just learning. You know, this 266 00:14:31,280 --> 00:14:34,360 Speaker 4: is a new space and they need to know 267 00:14:34,600 --> 00:14:38,400 Speaker 4: how okay this information is, right? We tell them, we 268 00:14:38,440 --> 00:14:43,200 Speaker 4: break things down, start from the backstory. We teach them 269 00:14:43,760 --> 00:14:48,480 Speaker 4: how to source news so they can 270 00:14:48,560 --> 00:14:51,000 Speaker 4: become sophisticated the same way we did. It's just a 271 00:14:51,000 --> 00:14:59,880 Speaker 4: different platform. 272 00:15:00,480 --> 00:15:02,600 Speaker 3: To that point, Tamisha, you said we are here. We 273 00:15:02,640 --> 00:15:05,080 Speaker 3: are in the land of social media. This is where 274 00:15:05,080 --> 00:15:08,000 Speaker 3: people are getting... I get stuff from social media on 275 00:15:08,240 --> 00:15:10,480 Speaker 3: our news sites. I'm like, this is easier, to pull 276 00:15:10,520 --> 00:15:12,080 Speaker 3: this clip from here than it is here. Let me 277 00:15:12,120 --> 00:15:14,600 Speaker 3: send this over to somebody, right?
But I think that 278 00:15:14,960 --> 00:15:17,640 Speaker 3: what also is important about what Tiff was raising is 279 00:15:18,240 --> 00:15:21,040 Speaker 3: the way in which these folks have been validated is 280 00:15:21,160 --> 00:15:25,000 Speaker 3: by the number of followers. There have been recent reports 281 00:15:25,000 --> 00:15:29,840 Speaker 3: that talk about the number of conservative hosts or influencers 282 00:15:30,040 --> 00:15:33,960 Speaker 3: who pay for their followers. They actually didn't earn those people. 283 00:15:33,960 --> 00:15:34,600 Speaker 6: They're not real. 284 00:15:34,640 --> 00:15:37,640 Speaker 3: There are these audiences that have been curated and built 285 00:15:37,880 --> 00:15:40,600 Speaker 3: because they were able to pay for them. So, Isaac, 286 00:15:40,640 --> 00:15:43,080 Speaker 3: can you talk about how Fanbase is different in 287 00:15:43,080 --> 00:15:46,240 Speaker 3: that regard? Can you buy followers on Fanbase? If not, 288 00:15:46,360 --> 00:15:48,600 Speaker 3: how are you preventing that from happening? 289 00:15:49,320 --> 00:15:51,720 Speaker 5: No, you can't buy followers on Fanbase. 290 00:15:51,840 --> 00:15:55,640 Speaker 2: But more importantly, I want to be very clear that 291 00:15:57,000 --> 00:16:01,280 Speaker 2: linear television is dying, and even the way that we 292 00:16:01,320 --> 00:16:05,280 Speaker 2: consume all media, and I mean music, I mean podcasts, 293 00:16:05,440 --> 00:16:09,400 Speaker 2: I mean Netflix, Hulu, anything that you consume media-wise, 294 00:16:09,920 --> 00:16:12,320 Speaker 2: is going to be distributed through a large social network 295 00:16:12,320 --> 00:16:14,320 Speaker 2: in the future. We're not going to 296 00:16:14,360 --> 00:16:16,240 Speaker 2: get on our TVs and go to Netflix. We're not 297 00:16:16,280 --> 00:16:18,400 Speaker 2: going to go to Hulu.
We're not going to turn 298 00:16:18,440 --> 00:16:22,040 Speaker 2: on DirecTV or Xfinity. We're going to go and 299 00:16:22,120 --> 00:16:24,960 Speaker 2: watch all of these things, listen to these things, through 300 00:16:24,960 --> 00:16:28,960 Speaker 2: a Facebook, through an Instagram, through a Twitter, through a TikTok, 301 00:16:29,240 --> 00:16:31,600 Speaker 2: or a Fanbase. That's how all media is going 302 00:16:31,640 --> 00:16:33,680 Speaker 2: to be consumed. It's going to be pushed all into 303 00:16:33,720 --> 00:16:35,920 Speaker 2: one place, and that's how we're going to receive all 304 00:16:35,960 --> 00:16:39,400 Speaker 2: of our information, be it entertainment, fun, community, all these things. 305 00:16:39,400 --> 00:16:41,480 Speaker 2: And so, you know, I didn't build a 306 00:16:41,480 --> 00:16:46,360 Speaker 2: platform to silence people, or even to pretend or buy fake followers. 307 00:16:46,480 --> 00:16:48,560 Speaker 2: I actually built a platform so that people can have 308 00:16:48,600 --> 00:16:51,400 Speaker 2: the reach that they actually have on these platforms. You're 309 00:16:51,400 --> 00:16:57,360 Speaker 2: suppressed on Instagram. You're suppressed on platforms like X and TikTok. 310 00:16:57,400 --> 00:17:00,800 Speaker 2: You're suppressed because your reach means money. So if you're 311 00:17:00,800 --> 00:17:03,360 Speaker 2: someone that has one hundred million followers on Instagram, they're 312 00:17:03,400 --> 00:17:05,280 Speaker 2: not letting you reach one hundred million people, because one 313 00:17:05,359 --> 00:17:07,960 Speaker 2: hundred million people garners the ability to charge eight million 314 00:17:08,000 --> 00:17:10,399 Speaker 2: dollars for a one-minute commercial, like we do on 315 00:17:10,440 --> 00:17:11,040 Speaker 2: the Super Bowl. 316 00:17:11,440 --> 00:17:13,119 Speaker 5: So really, ever since...
317 00:17:12,960 --> 00:17:15,520 Speaker 2: Video got added to social media, it turned every single 318 00:17:15,560 --> 00:17:17,760 Speaker 2: person on the planet into a television network. 319 00:17:17,960 --> 00:17:19,440 Speaker 5: We are all TV networks. 320 00:17:19,640 --> 00:17:22,760 Speaker 2: If you have a social media platform and you have video, 321 00:17:23,119 --> 00:17:26,080 Speaker 2: you are a network. Now, on Fanbase, it's free 322 00:17:26,119 --> 00:17:27,679 Speaker 2: to download, free to use, 323 00:17:27,720 --> 00:17:30,520 Speaker 2: and even free to post content. But you can simultaneously 324 00:17:30,640 --> 00:17:33,639 Speaker 2: take content and put it behind a paywall and charge 325 00:17:33,680 --> 00:17:35,720 Speaker 2: as well. So it now gives the freedom for every 326 00:17:35,800 --> 00:17:38,240 Speaker 2: single person on the planet to be their own content 327 00:17:38,280 --> 00:17:40,320 Speaker 2: provider to a community of people that are going to 328 00:17:40,560 --> 00:17:44,040 Speaker 2: consume content as a community. We're going to watch shows together, 329 00:17:44,280 --> 00:17:46,120 Speaker 2: and we're going to talk about them in audio rooms, 330 00:17:46,119 --> 00:17:48,080 Speaker 2: and we're going to chat about them in real time, 331 00:17:48,320 --> 00:17:50,119 Speaker 2: the same way young people are doing on Twitch. 332 00:17:50,200 --> 00:17:51,880 Speaker 5: And so I'm ahead of the curve. 333 00:17:51,960 --> 00:17:54,600 Speaker 2: I understand that this is where everything is going, that 334 00:17:54,640 --> 00:17:57,959 Speaker 2: the new networks of the future are human beings and 335 00:17:58,040 --> 00:18:00,840 Speaker 2: not networks themselves. So the infrastructure just has to 336 00:18:00,880 --> 00:18:02,840 Speaker 2: be Fanbase again.
337 00:18:03,040 --> 00:18:07,600 Speaker 6: Regular, like, because you're saying, like, every, well, because everybody, 338 00:18:07,640 --> 00:18:11,560 Speaker 6: if everybody is their own network, and you're saying, like, yeah, 339 00:18:11,800 --> 00:18:14,760 Speaker 6: you know, it's free to download, like, something I find 340 00:18:14,880 --> 00:18:18,040 Speaker 6: very curious, like Angela saying that these conservative people, they 341 00:18:18,080 --> 00:18:21,760 Speaker 6: buy their viewers. It's very true. They also frolic there, 342 00:18:22,440 --> 00:18:25,439 Speaker 6: whether they're bots or they're real people who, like, 343 00:18:25,640 --> 00:18:28,600 Speaker 6: you know, Blue Sky was a platform that, you know, 344 00:18:28,760 --> 00:18:30,879 Speaker 6: was, I think, skewed to the left, and so a 345 00:18:30,880 --> 00:18:32,680 Speaker 6: lot of people were there, and then as it grew 346 00:18:32,720 --> 00:18:35,919 Speaker 6: in popularity, this huge surge of MAGA people came and 347 00:18:36,000 --> 00:18:40,439 Speaker 6: started, you know, dropping fake information there, harassing people. How 348 00:18:40,480 --> 00:18:45,120 Speaker 6: do you regulate? Like, the regulation determines, you know, what 349 00:18:45,359 --> 00:18:49,480 Speaker 6: constitutes hate language or hate speech, what's appropriate, or 350 00:18:49,480 --> 00:18:52,280 Speaker 6: somebody uploads, you know, naked pictures or porn. If somebody 351 00:18:52,320 --> 00:18:56,280 Speaker 6: is spreading disinformation that you know is disinformation, what 352 00:18:56,320 --> 00:18:58,240 Speaker 6: does the regulation look like on fan Base? 353 00:19:00,600 --> 00:19:04,960 Speaker 2: Well, artificial intelligence is a real big part of how 354 00:19:05,000 --> 00:19:05,919 Speaker 2: you monitor content. 355 00:19:06,040 --> 00:19:08,160 Speaker 5: Now, human moderation is hard.
356 00:19:08,160 --> 00:19:11,680 Speaker 2: It's impossible to moderate content with just human beings. So 357 00:19:11,960 --> 00:19:15,480 Speaker 2: when you think about content ingestion engines that can see 358 00:19:15,480 --> 00:19:18,240 Speaker 2: what's written, hear what's spoken, see what's in the video, 359 00:19:18,960 --> 00:19:20,840 Speaker 2: there's a lot of platforms that use that, and it's 360 00:19:20,840 --> 00:19:24,000 Speaker 2: called, like, moderation software, and AI makes it faster and 361 00:19:24,080 --> 00:19:26,919 Speaker 2: easier to do. And we actually use that software on 362 00:19:26,960 --> 00:19:30,000 Speaker 2: fan Base, and we actually are building a stronger recommendation 363 00:19:30,080 --> 00:19:32,880 Speaker 2: engine to actually serve people content that they want. So 364 00:19:33,040 --> 00:19:35,399 Speaker 2: artificial intelligence is definitely going to help. You know, you 365 00:19:35,440 --> 00:19:38,879 Speaker 2: can fact check a lot of things through artificial intelligence. 366 00:19:39,160 --> 00:19:43,120 Speaker 2: Now, I wouldn't necessarily trust all artificial intelligence 367 00:19:43,119 --> 00:19:45,320 Speaker 2: platforms, maybe not the one that Elon built per se, 368 00:19:45,640 --> 00:19:48,719 Speaker 2: but the ones that can provide factual information and links, 369 00:19:48,920 --> 00:19:51,040 Speaker 2: and the way that you train artificial intelligence to be 370 00:19:51,040 --> 00:19:54,000 Speaker 2: absolutely truthful. Then there's no getting around the truth when 371 00:19:54,040 --> 00:19:54,560 Speaker 2: the truth is 372 00:19:54,560 --> 00:19:58,760 Speaker 1: there. You know, Isaac, just picking up where you left off,
373 00:19:58,760 --> 00:20:01,719 Speaker 1: I think it's also important that in these communities 374 00:20:01,760 --> 00:20:05,240 Speaker 1: we police each other, that there's accountability by the folks 375 00:20:05,280 --> 00:20:09,560 Speaker 1: who are signing up and subscribing, who are contributors. I 376 00:20:09,600 --> 00:20:11,879 Speaker 1: think, you know, and Tiffany, you know as well, I 377 00:20:11,880 --> 00:20:17,280 Speaker 1: think just the democratization of access through these technologies to 378 00:20:17,400 --> 00:20:22,280 Speaker 1: people has, yes, inflated a lot of people's personalities. And 379 00:20:22,440 --> 00:20:24,080 Speaker 1: you know, there's a lot of mess out there, and 380 00:20:24,119 --> 00:20:26,080 Speaker 1: a lot you don't, you know, you don't want to 381 00:20:26,080 --> 00:20:28,879 Speaker 1: get into the, uh, into the ears and the eyes 382 00:20:28,920 --> 00:20:32,200 Speaker 1: of your kids and friends. 383 00:20:33,280 --> 00:20:36,080 Speaker 1: But I'm curious to know, as you talk 384 00:20:36,119 --> 00:20:39,960 Speaker 1: about building wealth for our community and ownership for our community, 385 00:20:40,000 --> 00:20:44,600 Speaker 1: and certainly the opportunity for authentic and real truth tellers 386 00:20:44,640 --> 00:20:48,160 Speaker 1: to be heard, not censored, not reduced to the algorithms 387 00:20:48,440 --> 00:20:51,760 Speaker 1: so that you never get to hear their voices and 388 00:20:51,880 --> 00:20:55,640 Speaker 1: access their content.
How, you know, it's one thing for 389 00:20:56,040 --> 00:20:58,800 Speaker 1: you to make it, and it's another thing for you 390 00:20:58,880 --> 00:21:01,600 Speaker 1: to have a community of your folks. Because I think, frankly, 391 00:21:01,640 --> 00:21:05,280 Speaker 1: more begets more, uh, the more of yous that exist, 392 00:21:05,320 --> 00:21:07,760 Speaker 1: the more of us that are out here demanding it, 393 00:21:08,040 --> 00:21:09,800 Speaker 1: and then we start to expect it, and then we 394 00:21:09,840 --> 00:21:12,480 Speaker 1: start to say, if it isn't this, then I don't 395 00:21:12,520 --> 00:21:15,560 Speaker 1: want it. What's the community look like of folks who 396 00:21:15,600 --> 00:21:19,159 Speaker 1: are also building, you know, frankly, as you all climb? 397 00:21:20,080 --> 00:21:20,600 Speaker 1: Are there more? 398 00:21:20,760 --> 00:21:20,959 Speaker 5: I mean? 399 00:21:21,680 --> 00:21:24,399 Speaker 2: Yeah. So fan Base is just an organic community, right? 400 00:21:24,560 --> 00:21:30,199 Speaker 2: Of course, it started with seed investment from myself and 401 00:21:30,240 --> 00:21:33,040 Speaker 2: then investment from people that decided that they want to 402 00:21:33,080 --> 00:21:33,480 Speaker 2: own the 403 00:21:33,440 --> 00:21:34,520 Speaker 5: future of social media. 404 00:21:35,520 --> 00:21:36,240 Speaker 4: Martin Brock. 405 00:21:36,240 --> 00:21:40,840 Speaker 2: Community is heavily African American. Yeah, of course, absolutely, shout out, 406 00:21:41,200 --> 00:21:44,280 Speaker 2: shout out to Roland. But it's people that believe that 407 00:21:44,359 --> 00:21:46,879 Speaker 2: we actually really need to be able to say the 408 00:21:46,920 --> 00:21:51,080 Speaker 2: things that we want and have a way to speak.
Now, 409 00:21:51,480 --> 00:21:54,600 Speaker 2: my concern in talking to you guys, and I get 410 00:21:54,600 --> 00:21:56,679 Speaker 2: a little concerned with black media, 411 00:21:56,760 --> 00:21:59,439 Speaker 2: is that I feel like in this era, black media 412 00:21:59,520 --> 00:22:02,800 Speaker 2: does not do what black media should do a lot 413 00:22:02,800 --> 00:22:07,560 Speaker 2: of times in highlighting the stories that can uplift our 414 00:22:07,600 --> 00:22:09,800 Speaker 2: community or bring attention to things like that. 415 00:22:09,840 --> 00:22:12,240 Speaker 5: You mentioned Blue Sky. Just the sheer 416 00:22:11,960 --> 00:22:16,639 Speaker 2: fact that twenty five million people went from X to 417 00:22:16,680 --> 00:22:20,000 Speaker 2: Blue Sky in sixty days raised the value of that 418 00:22:20,000 --> 00:22:22,760 Speaker 2: company to seven hundred million dollars. It was just a 419 00:22:22,840 --> 00:22:25,160 Speaker 2: sheer volume of people deciding that we want to leave 420 00:22:25,280 --> 00:22:28,600 Speaker 2: one place and go to another place, and so it's 421 00:22:28,640 --> 00:22:30,240 Speaker 2: just the actual act of downloading. 422 00:22:30,240 --> 00:22:33,600 Speaker 5: That's how Tamisha and I got together. 423 00:22:33,680 --> 00:22:35,560 Speaker 2: Tamisha said, Isaac, I need to come talk to you, 424 00:22:35,920 --> 00:22:38,480 Speaker 2: and she was like, fan Base is an emergency. If 425 00:22:38,480 --> 00:22:41,760 Speaker 2: we don't get on this platform, we're not going to 426 00:22:41,800 --> 00:22:46,080 Speaker 2: have a place where we can have our conversations, meet, 427 00:22:46,320 --> 00:22:49,320 Speaker 2: organize virtually, you know what I'm saying, to create physical 428 00:22:49,400 --> 00:22:54,600 Speaker 2: organization and activation. And so that's the part that's extremely important.
429 00:22:54,800 --> 00:22:58,240 Speaker 2: I understand that social media was a toy from two 430 00:22:58,280 --> 00:23:00,480 Speaker 2: thousand and four, you know, all the way up to 431 00:23:00,520 --> 00:23:04,080 Speaker 2: maybe two thousand and sixteen. But now it is a tool, 432 00:23:04,240 --> 00:23:06,439 Speaker 2: and we have to take control of the tools that 433 00:23:06,480 --> 00:23:09,600 Speaker 2: disseminate the information to our community. And so it's open 434 00:23:09,640 --> 00:23:11,480 Speaker 2: for everybody. Like I said, we're in one hundred and 435 00:23:11,600 --> 00:23:15,120 Speaker 2: ninety countries and territories, and we're on both iOS 436 00:23:15,119 --> 00:23:15,680 Speaker 2: and Android. 437 00:23:15,680 --> 00:23:17,040 Speaker 5: So this is a global platform. 438 00:23:17,280 --> 00:23:20,159 Speaker 2: And we talk about Africa coming online widely to the 439 00:23:20,160 --> 00:23:25,040 Speaker 2: Internet in the next few years. It's almost two billion 440 00:23:25,040 --> 00:23:28,480 Speaker 2: people that will be platformed, and who's 441 00:23:28,520 --> 00:23:31,120 Speaker 2: going to own the infrastructure of the platforms that they 442 00:23:31,240 --> 00:23:34,399 Speaker 2: use to communicate and do that? And so black people 443 00:23:34,480 --> 00:23:37,639 Speaker 2: have to own our own sources of media and have 444 00:23:37,720 --> 00:23:41,040 Speaker 2: those conversations. So when I say, if you can't say 445 00:23:41,040 --> 00:23:43,720 Speaker 2: what you want, how you want, how often you want, 446 00:23:43,800 --> 00:23:45,960 Speaker 2: the way you want, that's what I call media. 447 00:23:46,480 --> 00:23:47,320 Speaker 5: That's what I call it, where
448 00:23:47,320 --> 00:23:50,280 Speaker 2: it's like, if there's an injustice happening anywhere, it doesn't 449 00:23:50,280 --> 00:23:52,320 Speaker 2: matter if it's not good for ratings, 450 00:23:52,359 --> 00:23:54,600 Speaker 2: if it's not good for whatever it is. If truck 451 00:23:54,680 --> 00:23:57,560 Speaker 2: drivers are going missing, if black teen girls 452 00:23:57,560 --> 00:24:01,639 Speaker 2: are getting sex trafficked, if young boys are getting 453 00:24:02,000 --> 00:24:04,640 Speaker 2: harassed by the police in DC, the only way 454 00:24:04,680 --> 00:24:06,840 Speaker 2: that we know this is by them pulling out their 455 00:24:06,880 --> 00:24:09,560 Speaker 2: own cell phones and sharing it to a community like 456 00:24:09,640 --> 00:24:12,120 Speaker 2: fan Base or social media. And that's what we need. 457 00:24:12,160 --> 00:24:15,360 Speaker 2: And that's the urgency and importance of having this platform. 458 00:24:15,480 --> 00:24:15,919 Speaker 5: I love it. 459 00:24:15,960 --> 00:24:17,320 Speaker 1: I know we've got to run and we can't keep 460 00:24:17,359 --> 00:24:21,959 Speaker 1: y'all all day, but we leave people with takeaways, admonishments, 461 00:24:22,000 --> 00:24:24,760 Speaker 1: things that they must do as a result of this conversation. 462 00:24:24,840 --> 00:24:29,520 Speaker 1: And I'm curious, for both Tamisha, you and Isaac, what 463 00:24:29,560 --> 00:24:32,679 Speaker 1: instruction do you have for folks? What should they take 464 00:24:32,720 --> 00:24:36,359 Speaker 1: away from this conversation as an admonishment to go and 465 00:24:36,400 --> 00:24:36,919 Speaker 1: do something? 466 00:24:37,520 --> 00:24:41,280 Speaker 4: Thank you, Andrew. Just real quick, what I see is 467 00:24:41,320 --> 00:24:44,440 Speaker 4: that our stories are not a priority.
Whereas you saw 468 00:24:44,440 --> 00:24:49,240 Speaker 4: in twenty twenty, we were seeing all kinds of stories 469 00:24:49,280 --> 00:24:53,480 Speaker 4: on police brutality, we were seeing stories that impacted black 470 00:24:53,520 --> 00:24:58,439 Speaker 4: people come to the forefront in the news. Now we 471 00:24:58,520 --> 00:25:03,080 Speaker 4: are really fighting to even get a story that needs 472 00:25:03,160 --> 00:25:06,720 Speaker 4: to be an urgent story that impacts us. We are 473 00:25:06,720 --> 00:25:10,679 Speaker 4: fighting for that in our production meetings, and for me, 474 00:25:13,160 --> 00:25:16,800 Speaker 4: that impacts us greatly. That's my concern. We need to 475 00:25:16,840 --> 00:25:19,760 Speaker 4: be able to get our stories out, to be able 476 00:25:19,800 --> 00:25:22,600 Speaker 4: to make sure that we are helping our people, 477 00:25:22,720 --> 00:25:28,879 Speaker 4: our communities. So my last words are for the media people, 478 00:25:30,040 --> 00:25:36,200 Speaker 4: for the anchors, the reporters: join fan Base. I say, join 479 00:25:36,280 --> 00:25:39,280 Speaker 4: fan Base, so that we know we have a specific 480 00:25:39,400 --> 00:25:43,600 Speaker 4: place for where you're going to be, that we trust, 481 00:25:43,600 --> 00:25:47,760 Speaker 4: because we trust your voice, we trust your research. We 482 00:25:47,800 --> 00:25:51,359 Speaker 4: just need you all in one place. That's what I'm saying. 483 00:25:51,600 --> 00:25:54,800 Speaker 2: I love that, thank you. And for me, I 484 00:25:54,840 --> 00:25:58,840 Speaker 2: would say invest. You know, again, I love 485 00:25:58,880 --> 00:26:01,119 Speaker 2: the individual success. I love all the black 486 00:26:01,280 --> 00:26:03,479 Speaker 2: billionaires we have in the United States of America.
But 487 00:26:03,520 --> 00:26:06,560 Speaker 2: if we have fourteen black billionaires, we need one 488 00:26:06,640 --> 00:26:09,720 Speaker 2: hundred and forty more, or we need thousands and thousands 489 00:26:09,720 --> 00:26:13,639 Speaker 2: of millionaires, because agendas are bought, right? These agendas are purchased. 490 00:26:13,640 --> 00:26:16,119 Speaker 2: Elon Musk bought a whole social media platform so he 491 00:26:16,160 --> 00:26:18,080 Speaker 2: could win an election. And what I'm saying is that 492 00:26:18,359 --> 00:26:21,320 Speaker 2: I tell people that if you want to 493 00:26:21,480 --> 00:26:23,199 Speaker 2: own a piece of fan Base, you can do that 494 00:26:23,280 --> 00:26:23,440 Speaker 5: now. 495 00:26:23,480 --> 00:26:25,520 Speaker 2: I know that Angela talked about investing. I think you 496 00:26:25,560 --> 00:26:28,159 Speaker 2: guys have talked about investing. But anybody that wants to 497 00:26:28,200 --> 00:26:31,119 Speaker 2: invest and actually own part of this platform, go to 498 00:26:31,160 --> 00:26:33,439 Speaker 2: startengine.com/fanbase and invest; the 499 00:26:33,480 --> 00:26:36,720 Speaker 2: minimum is three hundred and ninety nine dollars, and I 500 00:26:36,760 --> 00:26:39,520 Speaker 2: feel like we have over twenty three thousand investors. And again, 501 00:26:39,600 --> 00:26:42,320 Speaker 2: we have to take this opportunity to own the infrastructure 502 00:26:42,640 --> 00:26:44,679 Speaker 2: of the things that we make great. If not, we 503 00:26:44,720 --> 00:26:47,880 Speaker 2: will be customers to our creations. We will 504 00:26:47,920 --> 00:26:50,680 Speaker 2: just be the talent and we will not be the infrastructure. 505 00:26:51,000 --> 00:26:53,919 Speaker 2: And that is my ask, that people actually invest 506 00:26:53,920 --> 00:26:55,119 Speaker 2: in fan Base and own a part of it. 507 00:26:55,560 --> 00:26:56,000 Speaker 5: I love that.
508 00:26:56,160 --> 00:26:58,520 Speaker 1: I love that. As you all mentioned at the onset 509 00:26:58,560 --> 00:27:00,840 Speaker 1: of this, black people, you know, we outbox our weight. 510 00:27:01,640 --> 00:27:05,280 Speaker 1: We've always outboxed our weight since our arrival here 511 00:27:05,320 --> 00:27:05,920 Speaker 1: in this land. 512 00:27:05,960 --> 00:27:06,280 Speaker 5: And so, 513 00:27:08,080 --> 00:27:11,440 Speaker 1: uh, if we make the choice, we then set 514 00:27:11,480 --> 00:27:13,960 Speaker 1: the curve for everything that then comes after. It's just, 515 00:27:14,280 --> 00:27:16,400 Speaker 1: it's a fact. I think, Tiffany, I can say it's 516 00:27:16,400 --> 00:27:20,320 Speaker 1: a fact. You're the founder of that phrase. Uh, and Angela, I 517 00:27:20,359 --> 00:27:21,879 Speaker 1: know you brought us in and you want to close 518 00:27:21,960 --> 00:27:23,680 Speaker 1: us out, but I just want to say, on behalf 519 00:27:23,720 --> 00:27:25,480 Speaker 1: of our listeners, I want to thank you all for this, 520 00:27:25,720 --> 00:27:29,760 Speaker 1: for sounding the alarm, but also for setting the agenda. 521 00:27:29,520 --> 00:27:29,879 Speaker 5: The plate. 522 00:27:30,160 --> 00:27:31,920 Speaker 1: You know what we have to look forward to. I'm 523 00:27:31,960 --> 00:27:32,639 Speaker 1: looking forward to it. 524 00:27:32,720 --> 00:27:36,879 Speaker 6: And before, I just want to say really quickly, Isaac, 525 00:27:36,920 --> 00:27:40,000 Speaker 6: when you hear me say on a platform or in 526 00:27:40,000 --> 00:27:44,040 Speaker 6: public spaces that we will be customers of our creation, I'm 527 00:27:44,040 --> 00:27:46,600 Speaker 6: giving you credit in my heart. 528 00:27:46,680 --> 00:27:52,200 Speaker 1: Yes, because she will say it out loud, I swear. 529 00:27:53,000 --> 00:27:56,480 Speaker 5: We freestyled greatness. 530 00:27:57,080 --> 00:27:59,639 Speaker 1: When you hear that again, you won't get credit, but
531 00:28:00,280 --> 00:28:02,639 Speaker 6: in my heart, because you dropped like three bars and 532 00:28:02,720 --> 00:28:05,440 Speaker 6: I'm just, I'm taking them. 533 00:28:05,560 --> 00:28:08,000 Speaker 3: Well, at some point, even when we borrow from each other, 534 00:28:08,080 --> 00:28:11,280 Speaker 3: we shall all be free someday. So we thank y'all 535 00:28:11,320 --> 00:28:14,680 Speaker 3: for joining us, Isaac Hayes and Tamisha Harris. We're really, 536 00:28:14,720 --> 00:28:16,280 Speaker 3: really grateful for this conversation. 537 00:28:16,440 --> 00:28:17,880 Speaker 4: You all are always welcome back. 538 00:28:18,200 --> 00:28:21,040 Speaker 1: Welcome home, y'all. Welcome home. Get on base. 539 00:28:21,200 --> 00:28:27,000 Speaker 3: Thank you, Flavor Flavor Boy, I tell you by my damn. 540 00:28:29,200 --> 00:28:30,160 Speaker 5: Thank you, guys. 541 00:28:45,000 --> 00:28:47,959 Speaker 3: Native Lamp Pod is a production of iHeartRadio in partnership 542 00:28:47,960 --> 00:28:51,240 Speaker 3: with Reason Choice Media. For more podcasts from iHeartRadio, 543 00:28:51,320 --> 00:28:54,000 Speaker 3: visit the iHeartRadio app, Apple Podcasts, or wherever you 544 00:28:54,040 --> 00:28:55,640 Speaker 3: listen to your favorite shows.