Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with me, your girl, Danielle Moodie, recording from the Home Bunker. Folks, I'm very excited to welcome back to Woke AF Daily, friend of the show and colleague on Outspoken for iHeart, Bridget Todd, who is the host of There Are No Girls on the Internet. And Bridget and I get into a really interesting conversation as we enter into the last week of Pride month, but not the last week of Pride, where we're talking about, you know, artificial intelligence, which, you cannot escape a segment or news story headline these days without somebody talking about artificial intelligence and ChatGPT and whether or not we think that AI is going to destroy the world, or whether we think that it's going to be something that is going to aid in our progress. And the conversation that I get into with Bridget is one that I've been grappling with in my mind for quite some time, which is that if the people, right, behind these technological advances are of the same mind, meaning are the same white, cis, hetero, you know, male people, if they're still the same people that are in control, then how do we not think that bias, discrimination, outright bigotry is going to be embedded in these new systems? Right? Like, we continue to believe and look at different advancements and think, like, this is going to be the thing that takes us to the next level. But we haven't fundamentally dealt with the shit that has kept us, the country, and the world under the thumb, under the knee of white supremacy. Right. So when we have these conversations, and, you know, I reference in my talk with Bridget a documentary that I've talked about on Woke AF before called Coded Bias, that was on Netflix. You know, when we talk about technology, when we talk about, you know, these technocrats, it is still white, cis, hetero men that are in control of the future, right? And they are the same ones that are in control of our past.
Speaker 1: And what do I mean by that? Well, look no further than Governor Ron DeSantis and Greg Abbott and others who are dictating what it is students and people in their state can learn about and know about our nation's history. So when you have these same biased, racist, misogynistic, homophobic, transphobic, Islamophobic people, and that mentality, that is still in charge, how do you think that we advance out of racism? It's like this idea, which is so fucking stupid, that, like, oh, well, when the racists die off. They don't. They fucking reproduce, right? They groom their children in their ideology, then they rise up and they get into positions of power, and they continue to blanket society through their industry in that ideology. So it's like, if you don't get to the root, it doesn't matter what is born on the tree, because it's always going to be fucking poisonous. And that's the thing in America that we continue to just not get right. Like, we never want to go to the root of anything. And that was my same understanding with, you know, talking about the Juneteenth holiday. It's like, you want to give a federal holiday to something that you don't even understand. People want to say, oh, well, the enslaved were just late in finding out. No, it was a strategy. It was part of keeping people enslaved and tied to production. So it was about money and free labor. We continue to have conversations about, you know, oh, should we have reparations or should we not? And who is having that conversation? Who gets to decide? The very people whose ancestors did the fucking enslaving, and the very people who got to benefit from the torture and the rape and the enslavement and the selling off of humans, and the breaking up of family, the breaking up of culture, the wiping away of history, of knowing.
Speaker 1: So until we want to have real conversations about power, until we want to have real conversations about the root of the root, then anything, anything that is about quote unquote technological advancement or human advancement is always gonna be steeped in the shit we refuse to discuss, recognize, and acknowledge. So coming up next, my conversation with podcast host Bridget Todd. Folks, I am so excited to welcome back to Woke AF our friend Bridget Todd, who is the creator and host of There Are No Girls on the Internet. Bridget, Happy Pride.

Speaker 2: Happy Pride, Danielle, thank you for having me here. I'm so excited to be back on one of my favorite shows.

Speaker 1: I love when you come on. I'm so excited that you too are on Outspoken and on iHeart. I think that they are doing a really great job in terms of finding such diverse talent, showcasing, you know, the breadth and depth of the community. So I'm very excited about that, and to have you back. So I want to jump in first with talking about the thing that has been keeping me up at night, which is AI, artificial intelligence. Everybody is all hip and in love with ChatGPT. It is making life so very easy. But funny enough, a group of CEOs, ranging from, I think it is, Coca-Cola to Xerox to Zoom CEOs, gathered together for a Yale summit, and they did a poll, and forty-two percent of them believe that AI is going to cause our impending doom, you know, the end of society as we know it, in the next five to ten years. Not the next fifty years, the next five to ten. And so I wanted to talk to you about all of the conversation around AI, the politics around trying to regulate, trying to have our octogenarian politicians regulate something when they barely understand social media. So, you know, please give me your thoughts on all things.

Speaker 2: Oh my gosh, what a good topic. I don't even know where to start. I guess I would say it's kind of a good news, bad news situation.
Speaker 2: In my book, I think that I at first was coming from a place very much where you're coming from too, right, this idea of, like, oh my god, I'm hearing that AI might change everything, we might all be out of jobs, it might cause our doom, not in one hundred years, but in five years. But I actually did a really interesting interview with a tech critic named Paris Marx, who hosts the podcast Tech Won't Save Us, and that interview really helped me put things in perspective. One of the things that they said was that AI, the people who make that technology, kind of need for there to be a big hype cycle, where even if we're talking about it in terms of doom and gloom, that is actually helpful for them, because it helps them market this idea that AI is going to change everything. AI is going to take all of our jobs. AI is going to be this big shift in all of our thinking and the way that we do everything. Even if we're talking about it in a way that is scary, that seems alarmist, that is actually good for them, because it helps us all get comfortable with this idea that AI is here to stay, we're all gonna have to get comfortable with it. It's kind of, if you look at the way that people talked about cryptocurrency maybe, like, a couple of years earlier, it was a similar kind of hype cycle, where it was very easy to believe this is going to change everything financially. You know, a couple of years from now, we're all going to be using crypto, our banks are going to be using crypto. That didn't really pan out. And so it is important to think about, in what ways are we all feeding into, like, a really common hype cycle by just sort of, like, giving in to the idea that AI is going to change everything. So that's one. So I don't think there's a need to be quite so doom and gloom, although I completely get where that comes from.
Speaker 2: However, here's the bad news, which is that AI most certainly is going to change a lot of things, right? And one of the things that Paris told me is that they believe quite strongly that AI is simply going to, it's not going to replace workers, it's just going to make life more miserable for workers. So you'll have more surveillance of workers. You'll have more expectations of workers. You'll have workers being paid less for the work they've been doing already. You'll have knowledge workers having that work be devalued, because the people who make financial decisions think, oh, well, AI can do it better, even though we're not quite there yet. Right? If you asked ChatGPT to write an episode of Woke AF, you would probably laugh at what it came up with, right? So they still need commentators, podcasters, knowledge workers like you and I, right? And so one of the things I really do think is that when the people who make decisions about how money is made, how people are employed, what their employment looks like, when they are buying into hype cycles around AI and what it can do, that's when I really get worried. And it's kind of exactly what you're saying. I don't think that, you know, our lawmakers really have enough understanding of AI to be making laws that will actually regulate it. I don't know if you saw, a bunch of lawmakers pulled the head of ChatGPT, Sam Altman, in for a hearing, and before that hearing, they reportedly had a dinner where Sam was showing these lawmakers all these different tricks that AI could do, and it seemed like a trade show, right, and they were all really, like, oohing and ahhing. And so, you know, that's a much more cozy relationship between the people who are meant to be regulating and the people who make the thing that is meant to be regulated than I would like to see.
Speaker 2: And so it's not all, I don't think that you necessarily need to be coming from a place of, like, doom and gloom. But it's not all good, because we know the people who make decisions, the people who are supposed to be, you know, regulating this technology, we really can't always trust them to make decisions that are good for all of us, you know.

Speaker 1: And I think that one of the things that you bring up that is worth unpacking too is about workers, right? And you're talking about knowledge workers versus, you know, I guess, hands-on workers, factory workers, delivery-type people, who, right, we at one time, we wanted to say during the pandemic, these people were, what were they, essential? Right? But we don't pay them like they're essential, and we don't respect them like they are essential. But nonetheless, when it comes to more information, more knowledge-based kinds of careers and professions, those will not be at risk yet, because the technology just is not there. To your point about podcasting, to the point about writing a sonnet, or, you know, developing something that does require creativity, innovation, not just the know-how of doing it, but really the intellectual strategy and understanding of how to do it. AI is not strategizing, right? It is just doing and pulling information. But when we look at these more skilled, hands-on jobs, if I'm looking at an Amazon factory, for instance, right, if I'm looking at a Starbucks, for instance, there is something to be concerned about with what happens to those jobs. Right? If I'm building a world where smart cars, God willing, hopefully not the ones Elon Musk built, but where the smart cars are now the Amazon driver and delivery person, where I'm going into 7-Eleven or any convenience store and instead of having a checkout person, they're all just one-personed, right, where I can go in and do this. And this already exists in Japan, it already exists in parts of Asia.
Speaker 1: What do you think then happens to those workers, and how do we, even if it's not about setting the guardrails yet of this future technology that's here, but we don't really understand the implications of the damage that it can do, we're only being told really about the pomp and the, look at this, you know, good things. How do we talk about it from a worker's perspective and the harm that can happen to our economy?

Speaker 2: That's exactly the conversation that we need to be having. I think that you're exactly right. I think we're seeing so many people get really wrapped up in the rah-rah, or even, I think, the other side of that coin is the real doom and gloom, and erasing the fact that it is workers who are the ones who I think are going to be impacted. And so what I really get concerned about is people who make decisions being swept up in the rah-rah AI hype cycle and then thinking, oh, well, we could have a robot car replace the Amazon delivery driver, we can have a robot barista replace the human barista, without thinking about all of the ways that that is not only going to threaten workers, but make all of our lives more miserable, because the technology is not there yet. And so I really think that breaking out of these hype cycles is super important, and making sure that the people who are signing the checks, hiring the workers, making the decisions, you know, really understand that when we talk about technology, it really has to come from a place of people first, and that means breaking away from the hype cycle that can be so tempting to get wrapped up in. But you're exactly right. I think that the people who are the most threatened by this are human workers, and so making sure that we have conversations that are centered in that reality, that are not just pie-in-the-sky science fiction, you know. But again, it is so easy to get swept up in that.
Speaker 2: I think a lot of our lawmakers have gotten swept up in that. I think a lot of the people who make decisions and write checks, and, like, the ones who have power, I think that we were already in a place where they have just abandoned this understanding of, how can we have this conversation in a way that centers people, humans, and workers, you know.

Speaker 1: The other thing too, and I believe that we've talked about this before when I've had you on, in the Netflix documentary Coded Bias, right, there were all sorts of conversations about how we are literally coding bias, taking what we know as racial discrimination, misogyny, you know, anti-Blackness, anti-LGBTQ+, you know, sentiments, and placing them, right, we're making them technologically advanced. And so again, in order to protect what I see, just on its face, already crumbling, right, like our morals and our values as a country, the dignity and respect that all people are deserving of. And then we're looking to the future, and I'm just like, who you think is creating this technology? It's like, so how do we also know? It's like, you had these billionaires that were doing their rocket ship race to get, you know, to get out to space, and I'm like, all you're doing with this dick-measuring contest is gonna export, you know, into the atmosphere, like, export off the planet the racism, the misogyny, and hatred that is here.

Speaker 2: Oh my gosh, I, like, yes, yes, yes, yes. This is what keeps me up at night. Right? First of all, these are the people who, for whatever reason, believe that they should be in charge of shaping what the future looks like for all of us. The fact that we are being told that if our planet does not survive because of climate collapse, we need to put all of our collective futures into the hands of Elon Musk to take us to another planet? Oh dear, are you kidding me? Are you kidding me? And it's exactly what you said.
Speaker 2: I think that the tech billionaires, who, let's be real, are mostly cis, mostly straight, mostly white, these are the people who would like us to believe that the technology that they build is neutral, that the technology doesn't have any bias, right? But what I know is that that's complete BS. Technology is not neutral. It comes with all the different biases coded into it: homophobia, transphobia, misogyny, racism, fatphobia, all of those different things are coded into that technology. And so until we get to a place where those people are able to really see and, like, meaningfully grapple with all of the ways that they are propagating these bad things in our society and building it into the technology, and then being able to be like, oh, well, it's neutral, like, the technology can't be racist, we know that's BS, until we do that, we will never actually meaningfully get a handle on this. And yeah, I think that when it comes to AI, there are so many examples. If anybody listening played with things like the Lensa app, where you fed in a couple of selfies and it gave you AI-generated images of yourself, well, come to find out, though, it wasn't AI-generated necessarily. A lot of that was just pulling from existing images out.

Speaker 1: On the web.

Speaker 2: And so when Black women fed their selfies into it, some of them would say, oh, they made my features look a lot more European, they lightened my skin, or they made me more kind of conventionally attractive. That's not coming from nowhere. That's coming from us humans. And so until we as humans get a handle on these, you know, negative things and dynamics in our society, it is simply going to be, like, recreated and made that much worse by this technology. And it's such a drag, because this technology could be so powerful if the people who are making it actually did examine that, and I could imagine all kinds of ways it could be used to break away from these harmful binaries and biases.
Speaker 2: But until these people really examine these biases and how they are replicating them, we're just going to continue replicating the same harmful crap that we already see in society.

Speaker 1: And I think that that's right. I think that, you know, much like anything, right, the people who have the power to create them, their product is only as good as their intentions, right? And so you can't create from a place of neutrality, because there is no neutrality, right? Not in this country, not in the world. And so unless you're going to do some real internal examination of self and how your own lens, you know, sheds light on how you see others, you are going to continue to code in bias, right, without having any type of intentionality around it. And I do, you know, I'm not, I mean, I am a doom-and-gloomer, but I also am somebody that has seen the ways in which technology obviously has been able to bridge disability gaps, build, you know, build community that wouldn't normally have been, right, to be able to expose people to images, movements, and issues outside of their own front door. We got that, you know, that connectivity because of technology. And it just, it worries me, because these billionaires, the ones that are in control, these technocrats, are not ones with solid intentions about bettering humanity. And that's where I think, like, the conversation needs to be.

Speaker 2: Absolutely. I would even go further. I agree that they are not people who are meaningfully interested in bettering society and using their technology to do that. I would argue that what some of these people see as a good, full, meaningful life is so radically different than what you or I, or I would say the majority of the world, would see as a full and meaningful life.
Speaker 2: Like, there's this very popular engineer, Lex Fridman, who I once was listening to an interview with, and he was talking about how he has this fantasy of, when you get up in the middle of the night at three o'clock in the morning and you go to the kitchen to, like, sneak a snack, you know, eating ice cream because you can't sleep, in his fantasy, he would have a smart refrigerator, and he's able to share this as a meaningful moment between him and his smart refrigerator. He wants to have a genuine connection. And I was like, I don't, what if, what if I want to connect with my partner and my family and my friends? What if, what if what looks like a good life to me is not a life where I am spending more time with screens? Exactly, exactly. And so you're, you're asking people who, let's just be real, don't always have the most healthy outlooks on what connection looks like and what a good and a full life looks like, we are asking them to design our future. And so I would, I would really say, like, the future that they see as a good future and the future that the rest of us probably see as a good future are two vastly different things. Then we got to ask, well, who put them in charge of designing what our collective future will look like?
Speaker 1: Yeah, because when I think about the ways that we could utilize technology in order to stave off our climate crisis, when I think about the ways that we could utilize technology in order to better coexist inside of nature, how we can use, you know, the technology that trees use to talk to one another to build fertile ground, and be able to translate that into something for people, like, I think about, wow, amazing. You know, I think about the sci-fi that I read written by Black women like Nnedi Okorafor and Octavia Butler, when I'm thinking about, like, where there is opportunity. It seems very rich, but not if it is the same cis, white, hetero, straight men that are in charge of it. It is just going to be the same shit, but in robot form.

Speaker 2: Absolutely. And it breaks my heart, because there is so much innovative, creative, interesting, out-there thinking being done by Black and brown and queer technologists and people imagining futures that we can't even see with our eyes yet. There is so much of that out there, and yet the people who have the power and have the money and take up the most space, they're not having those conversations. What they're saying are really harmful ideologies like longtermism, which says, like, people like you and me and probably ninety percent of the people listening, we are too focused on the day-to-day of, like, paying our bills to think long term. And so what we need to do is gather all the white techie billionaire folks and think about what we're going to do when climate collapse comes for us.
Speaker 2: And they're the only ones left, right? They're thinking about the future in these ways that are so regressive and harmful, sometimes outright eugenicist, that just, like, leave us out in the cold, while they're like, oh, well, this is going to be better for everybody, because we're the smartest and we're the most genetically advanced, and so a future where we're okay and y'all are doing God knows what is better for everybody. And it breaks my heart that that is the thinking that takes up the most space, and that is the thinking that, it seems like, these people have just determined, like, they know best for us. And they certainly...

Speaker 1: Do not. No, absolutely not. Switching gears, with just a few minutes that we have left, I do want to ask you, you know, how you are feeling about Pride during this time and this season. You know, I'm interviewing folks, you know, all month long with our, you know, theme around the fact that you can't ban queer joy. We partner with GLAAD in order to elevate, you know, voices of folks, and that is their theme as well. How do you feel this month?

Speaker 2: I had a very, so, I'm in DC, our Pride parade was last weekend, which I attended. I had a very, a lot of emotions around Pride. I think it is, I used to be one of those people who was like, Pride is so corporate, does our community really need this anymore? How wrong I was. You know, this year, when Pride is being threatened, when Pride is being used as a way to attack our communities, when Pride is being used to smear and demonize our trans siblings, particularly trans youth, I really, celebrating it felt different this year, and I don't really know what to do with that.
Speaker 2: And I remember going to, like, a queer dance party, and I have to say, like, on the one hand, I was happy to be experiencing joy and be around other queer people who were also experiencing joy, but I was a little bit, like, I was like, I want to know where the exits are, right? I want to, like, I want to, like. And that was the first, probably the first time celebrating a Pride where I felt that way, where I felt like, I just felt like, the kind of euphoria and joy that you, that we all deserve to feel, that Pride is about, I was not comfortable experiencing that, because of this dynamic that is pushed by a small but vocal number of extremists. And so, yeah, it was, I think, I think that it showed me why we need Pride, yeah, and the importance of celebrating and commemorating our communities. But it was, it was kind of a privilege and a luxury to be annoyed by how corporate Pride is a few years ago, and this time around, I really felt how, how significant it is. And yeah, it just really alarms me how much ground we have lost, and I hate to center the conversation around corporations and, like, is Target celebrating Pride or, you know, whatever. But I do think that that can sometimes be a barometer for the public, and the fact that a small handful of extremists have been able to sway that, because I think GLAAD has put out a survey that shows that most Americans are cool with LGBTQ people, right? It's like, we are talking about the way that a small handful of extremists have been able to essentially hold the country hostage from this progress, because they know they are losing, because they know their sentiments are unpopular. And it really is alarming.

Speaker 1: Yeah, yeah, it is. And I feel the same. I think that, again, it was a privilege to only have to focus on the corporatization of Pride.
Speaker 1: A couple of years ago, I would have never thought that we would be in the place that we are right now, which is a place of, you know, of danger, and particularly for our trans brothers and sisters. Last thing for you, tell us about the new season of your show on Outspoken.

Speaker 2: Yes, so we are in our fourth season. We just launched it, super excited about it. We are doing a series that we're calling Present Future, and so, as we're all thinking, as our conversation about AI kind of demonstrates, we are in this weird moment when it comes to things like AI, social media platforms. I don't know if you saw that GLAAD report that just came out about how Twitter is doing such a terrible job of amplifying and supporting LGBTQ voices.

Speaker 1: Like, I didn't need a report, but I...

Speaker 2: Surprise, reporting what we already knew. So it does feel like we're in this really weird new place when it comes to technology, and so it seems like this far-off future that we've all been referencing is here right now. So on my podcast on Outspoken, There Are No Girls on the Internet, we are exploring what it means, you know, when it comes to technology, social platforms, media, and the future, and making sure that marginalized voices, women, queer folks, trans folks, people of color do not get lost in those conversations, that we are centered in those conversations, because we belong at the forefront. We belong, you know, amplified and centered, and having our stories and experiences and voices be grounded when we talk about what the future of technology looks like in our world. So please check it out.

Speaker 1: Amazing. Folks, do check it out. I'm always so excited when I have the opportunity to speak with Bridget, and just the work that you do, the conversations that you have on your pod, there's always such good cross-pollination. So thank you, friend, for making the time to join me on Woke AF Daily. I appreciate you.
Speaker 2: Oh, Danielle, I appreciate you so much. Thanks for having me.

Speaker 1: That is it for me today, dear friends, on Woke AF. As always, power to the people, and to all the people, power. Get woke and stay woke AF.