Speaker 1: Welcome to the Therapy for Black Girls Podcast, a weekly conversation about mental health, personal development, and all the small decisions we can make to become the best possible versions of ourselves. I'm your host, Dr. Joy Harden Bradford, a licensed psychologist in Atlanta, Georgia. For more information or to find a therapist in your area, visit our website at therapyforblackgirls.com. While I hope you love listening to and learning from the podcast, it is not meant to be a substitute for a relationship with a licensed mental health professional. Hey, y'all, thanks so much for joining me for session three eighty six of the Therapy for Black Girls Podcast. We'll get right into our conversation after a word from our sponsors.
Speaker 2: Hi, I'm Dr. Joy Buolamwini, and I'm on the Therapy for Black Girls podcast today. I'm in session unpacking everything you need to know about artificial intelligence.
Speaker 1: We're seeking an experienced and passionate ad sales strategist to join our team here at Therapy for Black Girls. We're looking for somebody who can help us to strengthen and maintain our existing brand partnerships and who can help us identify and cultivate new brand partnerships that align with our mission. If you are someone who has five to seven years in ad sales or media buying, or a similar position with a proven track record of success, we'd love to chat with you. Go to therapyforblackgirls.com/adsales to learn more about the position or to apply. The future is here, and it looks like deepfakes of real people, chatbots claiming to have human-level consciousness, and evil robots ready to take everyone's jobs. Artificial intelligence, while only just recently becoming widespread and accessible, is transforming our world in ways that make understanding it more crucial than ever. Joining me for today's important conversation on the ethical implications of AI is Dr. Joy Buolamwini.
She is the founder of the Algorithmic Justice League, an award-winning researcher, and a Poet of Code. She's also the author of the national bestselling book Unmasking AI: My Mission to Protect What Is Human in a World of Machines. During our conversation, we examine some of the basic definitions, players, and concerns associated with AI; how biases are transferred in the creation of AI and then reflected in its application; and lastly, the specific challenges AI poses, particularly for Black people. If something resonates with you while enjoying our conversation, please share with us on social media using the hashtag #TBGInSession, or join us over in the sister circles to talk more about the episode. You can join us at community.therapyforblackgirls.com. Here's our conversation. Hi, Dr. Joy, how are you?
Speaker 2: Hi, Dr. Joy. I'm well, how are you?
Speaker 1: Double the joy today.
Speaker 2: All good.
Speaker 1: Yes, I'm very excited to chat with you. There's been a lot of conversation in our audience, a lot of questions about AI and just all the things that are happening. So you are the new author of the book Unmasking AI: My Mission to Protect What Is Human in a World of Machines, and I'd love for you to get started by telling us how you first got interested in AI.
Speaker 2: Yes, so I've been interested in tech since I was a little kid. I'm the daughter of an artist and a scientist, and I actually grew up going to my dad's lab, where he would feed cancer cells, but he would also use these huge Silicon Graphics computers to help him support his research. So I grew up around computers and technology, and I also grew up watching a lot of TV, but it was a very strict diet, the PBS diet, so I could only watch public TV for about two hours at a time. And I loved watching the different science tech shows, and one of them actually showed a robot named Kismet, which was a social robot that could smile at you and try to interact like a person would.
And before then I always thought of robots as being more industrial, so that really sparked my imagination and wanting to, in my terms, become a robotics engineer and go to MIT. I didn't realize there were requirements. I was a kid at the time, but that's what got me interested in exploring science and technology.
Speaker 1: Wow. I love that there's such an early childhood connection for you to the work that you're doing today.
Speaker 2: Yes, there is, definitely.
Speaker 1: Very cool, very cool. So I want to hear more about your experiences at MIT, and I would love for you to talk about how many Black and brown people you were interacting with as you were doing the work of AI.
Speaker 2: Now, that's such a great question. So I made it to graduate school. My dream school was to go to MIT, and in twenty fifteen I entered as a master's student. The numbers, I'm pretty sure, were a handful at the time. I'm not sure if it has gotten too much better since then, but that was pretty typical of most of my educational experiences, being one of one or one of a few: definitely Black people, Black women, oh my goodness, even fewer of us in certain spaces. And so it was not surprising in that regard, but it was still clearly lacking representation, which I didn't know would go beyond the actual people and manifest in the AI systems themselves. So when I got to MIT, I was really excited. First year, I take a fun class called Science Fabrication. You're supposed to read science fiction and create something you probably wouldn't make otherwise, and so I thought, okay, here's a good chance to do shape-shifting. So I'm from Ghana and I was very much inspired by stories of Anansi the spider, this trickster spider, and I thought, okay, cool, let's shape-shift. We only had six weeks, though, and I probably wasn't going to change the laws of physics anytime soon. So my thought was, if I can't shift my physical form,
maybe I can shift the reflection of myself in a mirror. And so I started experimenting with this material that makes a regular mirror actually have light shine through it from the back. And so essentially what I was able to do is create a filter like you might see on a video stream, like a Snapchat filter or any of these apps. But now, instead of it going through a video screen, it was actually on a mirror in front of you. So it had a really cool effect. So then I thought, okay, can I have it follow me in the mirror? So I added a camera on top, a little webcam, and got some software that was meant to track your face. This is where things go sideways. So here I am. I'm sitting at MIT. It's supposed to be this epicenter of innovation. I'm trying to have this machine I'm building detect my face, and it's not detecting my face consistently. So I literally draw a face on my palm and hold it up to the camera. The smiley face, more or less, that I drew on my palm was detected, and so at that point I thought, okay, anything is up for grabs. It was around Halloween time. The book actually starts on Halloween. So I grabbed a Halloween mask I happened to have. I didn't even have to put it all the way over my face before it started being detected as a face. And so that was really this shocking moment for me, just to see how easily it detected the white mask, and then when it came to my actual human face, we were running into some challenges. And I mean, Fanon already said it: black skin, white masks. I just didn't think it would be so literal. And it was this reminder that even though I had, in a sense, quote unquote made it to MIT, there was still a long way to go in terms of not just representation of the student population or the faculty population, but also representation within the technology we were developing.
And so my question was, is this just my face, or is this experience that I'm having what I now call an encounter with the coded gaze? You might have heard of the male gaze or the white gaze. This is an extension of that kind of terminology: the coded gaze. There at MIT, I was seeing, okay, us not being represented is going to impact the technology.
Speaker 1: It feels like that's a beautiful tie back to the cover of your book, right? I'm guessing that that is where some of the cover art comes from.
Speaker 2: You'll also notice where my finger is in relation to the white mask itself, in terms of whose turn it is to speak. And for those with some really, really sharp eyes, you'll notice that my earrings are also a neural network, which is one of the ways in which people are developing architectures to power the machine learning systems that are undergirding so much of the AI we're seeing. It also is meant to evoke a bit of an African mask, to give some of my Ashanti Ghanaian heritage, and a face mask as well. So you have multiple things going on in the cover, including the gesture of the hand and so forth. But yes, I'm glad you noticed. So that white mask in Unmasking AI comes from my experience of literally putting on a white mask to be seen by a machine.
Speaker 1: "Seen," right, right, "seen" in quotes. So, Dr. Joy, there's so much conversation about AI, and I think it's important that we just start with the basics. So how would you explain AI to a five-year-old? What is the basic definition of what we mean when we say AI?
Speaker 2: I always think of AI as this ongoing quest to give machines abilities people have, right? So it might be the ability to communicate, to talk back and forth. It might be the ability to perceive the world, to detect an object like a car, a cat, a face, a house, a train. Those can all be forms of AI.
And then you also have AI systems that are about decision making, so deciding who gets an opportunity or not, so how many cookies someone might get, or if you get a house, right? So you have AI involved in making decisions about people's lives. You have AI systems that are about communicating back and forth, as you might interact with a chatbot. You have AI systems that are about perceiving the world, so if you want to have self-driving cars, you might want to detect the people walking around. And I like to say it's an ongoing quest because what AI is keeps evolving as the technology advances, and so you'll find that there are many different definitions of what counts as AI and what doesn't, and it continues to expand, so it is ever evolving.
Speaker 1: I really appreciate that definition, because I think now when I think about AI, I am thinking about it as this ChatGPT, machines-taking-over-the-world kind of thing. But when I heard you explain it as giving machines human capabilities, I thought about text-to-speech software that helps people who maybe read and comprehend differently. That would be considered AI. And that is like an earlier form of AI than where we are now.
Speaker 2: Right, or even some of the basic things we learn when we're learning about AI in school: OCR, optical character recognition. Banks use this all of the time when you have checks or other sorts of things you're writing in, right, to get zip codes if you're trying to ship packages and so forth. And so that is a type of AI. But again, as things evolve, people are like, well, okay, what about the other stuff, right?
Speaker 1: Yeah, it does feel like now it's like, oh, wait a minute, what's actually happening here? What's going on? So I wonder if you can explain to us how something like a ChatGPT is able to answer our questions so quickly and, like, for the most part, pretty coherently.
Speaker 2: I think it's really important when we're talking about AI systems to understand that even if the answers appear coherent, it doesn't mean that they're accurate, right? And so I just want to put that first. And now let's get into how does something like a ChatGPT work. So ChatGPT is an app that's built on top of something called a large language model. And what a large language model is is basically what I would call a pattern recognition and production system. And so we just talked about what are some of the capabilities you might try to give AI or a machine, and one is the capability to communicate, okay, if we want it to communicate like a human. What scientists and researchers figured out was, instead of trying to code in every single way a human might respond, which, trust me, takes too much time, what if we could learn patterns of language? And so the way these systems have been created is actually based on large, large data sets of written language. So it can be language from newspapers, from magazines, from websites, a lot from Wikipedia, and all of that gets put into a system that then trains the AI model to recognize language patterns. And so some people actually call ChatGPT like spicy autocomplete. You know, if you're typing or texting and you're about to type something, over time your little system might learn which words you tend to say next, right? So, "please close the..." what would you say? Door? "Door," right? And so there are word patterns where we know over time what words are likely to follow, and so that is the basic idea that's being used. But then you expand it to be much more complex. So then you start looking at sentences, at paragraphs, right, at much longer phrases. But that is what it's building on: what is the next most likely word, based on this huge example I have of humans communicating through emails, through online blogs and forums, and all of that, right?
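The next-word idea described here can be sketched in a few lines of Python. This is only a minimal illustration built on a tiny made-up training text; the large language models behind tools like ChatGPT use neural networks trained on vastly larger data and much longer context, not simple word-pair counts.

```python
# Toy sketch of "spicy autocomplete": count which word tends to follow which
# word in a small body of text, then predict the most frequent follower.
# (Illustrative only -- real large language models are far more complex.)
from collections import Counter, defaultdict

training_text = (
    "please close the door . "
    "please close the window . "
    "please open the door . "
    "she walked through the door ."
)

# Count how often each word follows each other word (bigram counts).
follower_counts = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the most frequent follower of `word`, or a placeholder if unseen."""
    followers = follower_counts.get(word)
    if not followers:
        return "<unknown>"
    return followers.most_common(1)[0][0]

print(predict_next("the"))    # "door" -- it follows "the" most often above
print(predict_next("close"))  # "the"
```

Even this toy counter shows the core move: it never checks whether "door" is true or appropriate, only that it is the most frequent follower, which is exactly the accuracy caveat raised at the start of this answer.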
So, now if you know that these sorts of systems are being trained on information online, this means you get the good, the bad, and the ugly, right? And this is really important, because sometimes what these systems are learning is racial bias online. There's a recent paper that came out that showed that systems like the large language models which would power ChatGPT actually have what they're calling covert racism. So overt racism we know: like, okay, using the N word, or saying Black people fit a negative stereotype or shouldn't do something positive, right? Covert is being politically correct but still holding racist attitudes. So how they tested for this when it came to large language models is they would present different characters, one speaking quote unquote Standard English and one speaking quote unquote African American English, and then they would ask the chatbot about how long one character or the other should be sentenced to jail, and the one speaking African American English would have a longer jail sentence. And so this is what I mean by the covert racism, which can be even harder to find, and you have to be more clever in terms of distinguishing that. So then, when it comes back to your question about how we are able to type into something like ChatGPT and it seems really coherent: well, it's been trained on a lot of human language, and using that spicy autocomplete, it has learned over time what seems to be coherent. But you can be coherent and convincingly say the wrong thing. And so that's the other part that makes me cautious, and something to also consider. Right, we've heard coherent people who can talk a good game, but what they're saying isn't necessarily true. And we see the same thing with some AI systems, and you might call them BS machines, and we've seen this even in demos from some of the largest tech companies.
There's a 60 Minutes segment where they were looking at a system from Google, Bard at the time, I believe, and it seemed to be a very impressive demo. It had been asked to give a list of book recommendations. It gave the books. When they went back to research those books, those books didn't exist. So knowing the pattern of language and knowing how to produce it in a convincing way doesn't make it true. And so that's why we have to be very careful when using these systems. Because there is so much fluency with the language, it's easy to be lulled into thinking, right, that what you're getting is actually an accurate representation of the world, when what it is is a representation of the online world, with stereotypes and misinformation and all.
Speaker 1: I'm glad you said that, because that was going to be my follow-up question. Because we know, something we've heard from, like, students trying to use ChatGPT or something to write an essay, it will sometimes produce citations that don't actually exist. And so you sharing the example about the books makes me think: so it doesn't just search what's already there. It's also trying to put things together based on patterns it recognizes from other citations.
Speaker 2: Exactly. And so, citations look like this; let me make up a citation that looks correct. You actually had a lawyer who lost their license. They were disbarred because they were using one of these systems in a case and it was citing case law, I believe, that didn't exist. It looked plausible, right, it had the right form, but it wasn't actually something that existed. And that's why I'm saying it's really dangerous, because if you're not an expert, you might not know the difference. If that's not your area of focus and it looks right, I mean, unless you know, how else? So I would always be very skeptical. And to your point, the other thing, though, about students using ChatGPT or others is we've also seen another kind of bias when it comes to bias in AI detectors.
Right. So teachers will say, ah, students are using AI; we want to see if somebody actually wrote the paper or not. They actually found you have bias with these systems that flag folks with English as a second language, right, or some students with different kinds of learning abilities, and so you were more likely to be flagged as having cheated, even if you didn't, if English was your second language. So that's another type of bias built on top of this.
Speaker 1: So, Dr. Joy, who are some of the major players that we should have on our radar? You already mentioned, like, Google was building something with Bard. Like, who are some of the bigger names in this space to be paying attention to?
Speaker 2: Oh yeah, so I do think your big tech giants, right? So you definitely want to be paying attention to Microsoft. And Microsoft invested in OpenAI, right, and OpenAI, they're the creators of ChatGPT. We already mentioned Google. Facebook is a really important player here as well, because they are creating what are known as open source models, which is to say, we are going to make our code available for other organizations to use. And so that gives them a different type of power if they're controlling the systems that some of the other big tech companies aren't necessarily using, but they're letting smaller guys get into the game. So that's another area to explore. You have to definitely consider Amazon, because at the end of the day, the data needs to sit somewhere, the compute power to process all of these things needs to sit somewhere. And oftentimes, if you look under the hood, Amazon Web Services are involved in hosting these systems, deploying these systems, computing what's going on behind the scenes as well. So those are certainly a few companies to keep in mind for sure.
Speaker 1: What would you say are some of the environmental implications of AI?
Speaker 2: Huge. So all of these systems that we're just talking about, they are costly to make, right? So we're talking not millions, not tens of millions, but hundreds of millions of dollars to train some of these systems. And when these systems are being trained, right, to process all of that data with all of that compute requires energy. It doesn't just come from anywhere, and so there's this environmental impact that's happening alongside the entire AI development life cycle. So when the systems are being trained themselves, there's an environmental impact. And in this case, think about the data centers that have to be built. They also have to be cooled. They're being cooled by water oftentimes, right, so there's this water impact that's happening, as well as what you typically would think of in terms of environmental impact with the carbon footprint. But that's just to train it. Now you have people who are using it; you're at the deployment section. There are different estimates for this. What I've seen is every time you put in a prompt, depending on the time of day, you can imagine it as drinking half of a glass of water for each prompt you put in. And again it varies; you have different sized prompts and so forth. But just to give you a sense that it's not just the amount of energy to train the system, but it's the amount of energy and also water that's being used each time you're exploring these systems. And then you have to ask which communities are these data centers being put into, and whose water sources and supplies are being impacted. And let me tell you, probably not so surprisingly, it tends to be communities of color.
Speaker 1: Mm-hmm. You've already kind of alluded to a lot of this, but I would love for you to talk more about, like, how the human involvement in creating AI really leads to some of the biases that are transferred to the process, like the answers that we get when we type in a prompt.
Speaker 2: Yes. Well, I'll come back to my own space of facial recognition technologies, right? Why is it that we have these AI systems that supposedly detect human faces, but I'm coding in a white mask and I'm here at MIT? What's going on? And so I think of something called a power shadow. So we were just talking about the ways in which you now have machine learning techniques being used for AI. So what's the machine learning from? Data. Where are you getting the data from? Oh, okay, this is where the human footprints and the human fingerprints come into play. When you're collecting data, let's say, in the early days, for collecting data for face data sets, we would do something like what the AI companies doing the large language models are doing: go online, find some faces. In this case, find faces of public officials, and public officials who hold power. Who tends to hold power all around the world? Men, right? And so if that is your source for faces, it's not so surprising then when you start getting these data sets that are seventy percent or more male-labeled faces. And that's what we were getting in the early days of this sort of technology, and this is something I call a power shadow, right? So the inequalities of the world being basically reflected in the data set itself. So this is how we end up with such a male skew when it comes to face data sets. Now, let's think about the color side. Why are they mainly lighter skin, right, in the earlier days, eighty percent lighter skin or more? Well, just like my example with the white mask, even if you're searching online for photos of people, if you don't detect dark-skinned people as faces, you're not going to have them in the data set in the first place. So you already missed those who would be there in some cases. But also, let's go back to media representation. Who's featured in the media? Which stories get the most airtime?
Right. Even if you're watching a film, who's in the lead role and who gets the little side character? Right. It's only more recently we've seen some diversification, and even that is being rolled back a little bit. So you have to think about the full representation. So when you think about the representation of who is positioned as worthy, who's positioned as expert, who's positioned as desirable, lighter skin tends to be on top. And so it's not so surprising then, if what we're doing is, let's scrape the internet for faces, you end up with faces that are largely pale and largely male. So these pale male data sets are a reflection of these power shadows in terms of who's more likely to be represented. And so, going back to your question, right, how does the bias happen? In some ways, the bias is reflecting some of the inequalities in society. At times I say the past dwells within our data, right, and so that's what we're seeing. And then that is the diet. That very bland or homogeneous diet is what's fed to the AI systems, so it's not so surprising that when they encounter something different than what they have been exposed to, we get some issues.
Speaker 1: More from our conversation after the break, but first, a quick snippet of what's coming up next week on TBG.
Speaker 3: I mean, really, when you look across the board at every health indicator, we are scoring the highest with regards to being diagnosed and with the poorest outcomes. I think that we really need to be mindful about the lives that we're curating and making sure that it's okay to say not now, right? It's okay to say, what is this doing to me? It's okay for you to pay attention to how you're building your business. It doesn't have to scale to a million dollars today, Joy. You can take your time. But ultimately we have to be realistic with who we are and fold that into the lives that we're living, and not just look at the outcomes as the measurement of our value.
Speaker 1: Hey, y'all. We're seeking an experienced and passionate ad sales strategist to join our team here at Therapy for Black Girls. We're looking for somebody who can help us to strengthen and maintain our existing brand partnerships and who can help us identify and cultivate new brand partnerships that align with our mission. If you are someone who has five to seven years in ad sales or media buying, or a similar position with a proven track record of success, we'd love to chat with you. Go to therapyforblackgirls.com/adsales to learn more about the position or to apply. I'd love to hear you talk about this, Dr. Joy, because I feel like some of what I've been hearing in terms of trying to combat this bias that exists in AI is, okay, let's have more Black people use it, right? And then that causes me pause, probably for some of the reasons we're going to talk about today, because then it feels like we're giving all of this information, but then how is it being used? So would you say that the answer to decreasing some of this bias is for just more people of color to be interacting with the AI, for us to be building the systems? Like, what would you say about that?
Speaker 2: I like to remind people that accurate systems can be abused. And so, because so much of my early research was around facial recognition technologies, it meant that, as tempting as it was to just say, okay, let's make the data sets more inclusive and we're done, I had to contend with, wait a second, we're seeing facial recognition on drones with guns, lethal autonomous weapons systems, right? These are real use cases. So it wasn't just the question of how accurate a system was, even though we had huge accuracy disparities and we continue to, but it's how will a system be used? And that's why we have to be cautious, because there's a power imbalance.
For example, one project we had explored doing, right, was creating more diverse face data sets and having people label it as part of a crowdsourcing situation. Then we're like, wait, all of this free labor being fed into these systems that are then sold back to people, but then are also adopted by law enforcement agencies that then go out and use that technology in ways that can be oppressive to our communities. And in the book I talk about grappling with this. I have some of the CEOs from the biggest tech companies asking me to help them quote unquote improve their facial recognition, and I had to say, it's more than a technical conversation, because if I lend my expertise in a way that actually creates more accurate systems that are used to oppress people like me and communities I care about, that's actually not why I'm bringing up these issues. I'm bringing up these issues not to say we need more accurate facial recognition technologies that can then be used and weaponized in harmful ways, but we need to attend to all of the ways in which AI is being developed, where we have assumptions about accuracy that don't hold. So when we're thinking about AI systems being developed for medical purposes, right, we're looking at clinical research, we don't want to kid ourselves into a false sense of progress, the same kind of false sense of progress we had around facial recognition technology. So this was very much a cautionary tale, with just face detection, gender classification, facial identification, and verification as examples, to say, this is how we get it wrong in this domain, but there are lessons to be learned in other domains. And so I wanted to really challenge our assumptions about AI systems, because what I was seeing is, currently we had this narrative of, if it worked on our gold standard benchmarks, then it worked for the rest of the world. But truly, we had misleading measures of success.
I'm looking at what is supposed to be a gold standard and we're not represented, and so then you can say the system works, XYZ. And now again, let's think about something like melanoma or skin cancer, right, where you want these systems to be inclusive, you want the data to be gathered in an ethical way, and you have to actually test that it works. You can't just assume it worked on one population, so then it's going to work on another population. So those were the broader lessons that I was seeing from the work I was doing, because I was looking at the process of creation, how these power shadows get embedded and all of that, which had implications for all of AI. When we look at these generative AI systems, right, they're perpetuating that bias, and we're going from a mirror to what I like to call a kaleidoscope of distortion. So here's an example. You had Bloomberg News. They did a test where they decided to use these image generation AI tools, and they would give it a prompt: show me a CEO, show me an architect. Right, usual suspects, right, pale male or lighter-skin male. Show me a social worker, oh, women coming in. Show me a school teacher, okay, there's some diversity there. Show me a drug dealer, show me a terrorist, right, show me these criminal stereotypes. And what they found was that these systems weren't just mirroring society, they were amplifying bias, right? So Black people would be overrepresented as criminals, right, in that sort of situation. And so these concerns that I had, starting from that white mask experience, actually have implications across these different iterations we're seeing in the evolution of AI, which is to say, you can't say you have robust AI systems if they only represent a small portion of the world, and you can't say you have ethical and responsible AI systems if your even more accurate systems are being used in abusive ways.
So it's both the question of how well do these systems work, but also how are they being used? And that is a question around our values as a society.
Speaker 1: So I want to talk about some of the most common concerns that I think people have around AI and have you weigh in on them. So the first one is deepfakes and voice modulation without consent.
Speaker 2: Huge. I mean, we saw earlier this year the robocall with President Biden telling people not to vote, which was not even his voice, and so we are seeing a proliferation of deepfakes, and even on the voice piece. I actually start Unmasking AI with an example of Jennifer DeStefano. She gets a call: "Mom, Mom, these bad guys have me." She's hearing her daughter's voice. She had the wherewithal to message her daughter; daughter's fine, she's chilling. She's like, why are you worried? Meanwhile, this guy's asking for money. But because we have so much social media that's out there, and then you have organizations that are making AI tools open source, it's not too difficult, right, to have the replication. So it does mean we have to be more critical consumers of information, or if we hear something that's emotionally triggering to us, take a pause. Some people are going to safe words and other sorts of things just to make sure the call is who they think it is. But this is a major issue, because it is polluting our information networks, right? So we're coming to a place where it's hard to know if you can trust what you see with your eyes or hear with your ears. So it's huge. And there's another aspect to that, which are non-consensual explicit deepfakes, right, and so thinking about things like deepfake porn. And we see that over ninety percent of deepfakes are actually in this category, and of that, another over ninety percent are of women and girls. We saw more attention come to it when this happened to Taylor Swift. The Defiance Act was introduced.
590 00:36:45,040 --> 00:36:48,279 Speaker 2: But you have middle school girls right, who are facing 591 00:36:48,480 --> 00:36:53,280 Speaker 2: this kind of digital abuse. And so that's another area 592 00:36:53,360 --> 00:36:56,960 Speaker 2: that is rising and something for students and parents and 593 00:36:57,040 --> 00:37:00,880 Speaker 2: caregivers to be aware of. And then even in the 594 00:37:00,960 --> 00:37:04,840 Speaker 2: generative AI space, I remember MIT Tech Review, there was 595 00:37:04,880 --> 00:37:08,160 Speaker 2: a journalist and they had these AI apps that are 596 00:37:08,200 --> 00:37:12,640 Speaker 2: giving you profile photos, and her male colleagues would get 597 00:37:12,680 --> 00:37:17,080 Speaker 2: to be astronauts and explorers and things like that, and 598 00:37:17,160 --> 00:37:21,160 Speaker 2: she saw she was being put into skimpy clothing and 599 00:37:21,239 --> 00:37:24,480 Speaker 2: even had a child photo of hers made into an 600 00:37:24,480 --> 00:37:30,600 Speaker 2: explicit photo, right, And so these are definitely just reaffirming 601 00:37:30,640 --> 00:37:34,400 Speaker 2: what you're saying, rising dangers, which is why it's important 602 00:37:34,440 --> 00:37:37,919 Speaker 2: that we actually have safeguards. So what the Defiance Act 603 00:37:37,920 --> 00:37:41,120 Speaker 2: would do, for example, is give you the right to sue. 604 00:37:41,719 --> 00:37:45,240 Speaker 2: So right now, what's happening is there are no consequences 605 00:37:45,320 --> 00:37:48,320 Speaker 2: if I have a tool that creates these sorts of images. 606 00:37:48,760 --> 00:37:51,319 Speaker 2: As long as there are no consequences for it, there's 607 00:37:51,360 --> 00:37:54,400 Speaker 2: no reason I have to stop. And so what something 608 00:37:54,520 --> 00:37:57,640 Speaker 2: like the Defiance Act and other types of legislation like 609 00:37:57,680 --> 00:38:01,720 Speaker 2: that would do is say, there are consequences. 610 00:38:00,920 --> 00:38:03,160 Speaker 1: Got it. Do you have any tips for how we 611 00:38:03,239 --> 00:38:07,040 Speaker 1: can tell an authentic image or voice versus like an 612 00:38:07,040 --> 00:38:08,200 Speaker 1: AI generated one? 613 00:38:08,719 --> 00:38:12,200 Speaker 2: It's challenging. Some of what people are doing, and it 614 00:38:12,239 --> 00:38:16,200 Speaker 2: really depends on the resolution, is with the eyes. For example, 615 00:38:16,400 --> 00:38:20,040 Speaker 2: they are using the way that the eye reflection looks 616 00:38:20,160 --> 00:38:23,600 Speaker 2: across both eyes to see if they would actually match 617 00:38:23,719 --> 00:38:28,040 Speaker 2: up based on some physics principles. But you can imagine 618 00:38:28,200 --> 00:38:32,040 Speaker 2: in this sort of cat and mouse race that okay, 619 00:38:32,120 --> 00:38:35,160 Speaker 2: once people figure out what the reflections are supposed to 620 00:38:35,200 --> 00:38:38,960 Speaker 2: look like, eventually they will try to simulate it. But 621 00:38:39,080 --> 00:38:43,120 Speaker 2: that's one area people currently take a look at. Hands 622 00:38:43,120 --> 00:38:47,000 Speaker 2: still tend to be pretty difficult for many an AI system. 623 00:38:47,080 --> 00:38:51,160 Speaker 2: That's actually why on the book cover I have hands. Yeah, 624 00:38:51,200 --> 00:38:53,960 Speaker 2: it's almost like a flex to AI.
Now will this 625 00:38:54,080 --> 00:38:57,640 Speaker 2: be a forever thing? Not sure. But in that moment 626 00:38:57,760 --> 00:39:00,520 Speaker 2: when I was writing the book, right, you could definitely 627 00:39:01,000 --> 00:39:03,040 Speaker 2: take a look at the hands and that would give 628 00:39:03,080 --> 00:39:06,080 Speaker 2: you a sense of was there human touch involved, things 629 00:39:06,200 --> 00:39:09,759 Speaker 2: like earlobe attachments and so forth. But again, all 630 00:39:09,840 --> 00:39:13,399 Speaker 2: of these over time will change. This is as bad 631 00:39:13,440 --> 00:39:16,000 Speaker 2: as AI is ever going to be. And if you're 632 00:39:16,040 --> 00:39:20,479 Speaker 2: seeing what's out there, it's pretty convincing. And here's the thing, 633 00:39:20,560 --> 00:39:23,360 Speaker 2: it only has to be convincing for a short amount 634 00:39:23,360 --> 00:39:26,400 Speaker 2: of time. You have deep fakes where AI is involved, 635 00:39:26,400 --> 00:39:29,600 Speaker 2: but you also have cheap fakes, right, think of photoshop 636 00:39:29,680 --> 00:39:33,400 Speaker 2: and other things where you just say something happened. You 637 00:39:33,440 --> 00:39:36,680 Speaker 2: spread the lie, you start the rumor, and that itself 638 00:39:37,239 --> 00:39:39,680 Speaker 2: has the impact. So it does mean we have to 639 00:39:39,719 --> 00:39:43,239 Speaker 2: be more critical consumers of data, and so that's where 640 00:39:43,280 --> 00:39:46,680 Speaker 2: you'll see some people looking more for the verifications or 641 00:39:46,719 --> 00:39:48,920 Speaker 2: things like that. So we're not trying to certify the 642 00:39:48,920 --> 00:39:53,279 Speaker 2: whole ocean, but we're saying this cup, right, this one 643 00:39:53,360 --> 00:39:56,600 Speaker 2: you can verify. So I think we are moving towards 644 00:39:56,600 --> 00:39:58,759 Speaker 2: that world where you have to be just much more 645 00:39:58,840 --> 00:40:04,760 Speaker 2: critical about what you are seeing for sure. And again 646 00:40:05,040 --> 00:40:08,799 Speaker 2: it has to go back to consequences, right, because if 647 00:40:08,800 --> 00:40:13,239 Speaker 2: there are no consequences for producing deceitful information or for 648 00:40:13,640 --> 00:40:16,920 Speaker 2: propagating it, then it will propagate. So I like to 649 00:40:16,920 --> 00:40:19,960 Speaker 2: think about it as Frankenstein's monster in the basement. Right. 650 00:40:20,320 --> 00:40:23,520 Speaker 2: If the monster stays in the basement, it's still terrifying, 651 00:40:23,600 --> 00:40:27,200 Speaker 2: but it's contained. Now if the monster gets on the road, 652 00:40:27,280 --> 00:40:32,359 Speaker 2: think social media, right, and it starts to spread, that's 653 00:40:32,400 --> 00:40:36,480 Speaker 2: where it becomes even more impactful. And so we have 654 00:40:36,560 --> 00:40:39,480 Speaker 2: to also make sure that there is accountability when it 655 00:40:39,520 --> 00:40:44,200 Speaker 2: comes to distribution. So right now there's very little accountability. 656 00:40:44,320 --> 00:40:48,560 Speaker 2: So everybody's just letting whatever fly, ride for now. That 657 00:40:48,640 --> 00:40:51,160 Speaker 2: cannot last if we actually want to curb the issues 658 00:40:51,200 --> 00:40:51,960 Speaker 2: with the fakes. 659 00:40:52,400 --> 00:40:55,800 Speaker 1: What about the concern around AI taking jobs? 660 00:40:56,520 --> 00:40:59,960 Speaker 2: True concern.
And what's interesting about this that I've seen: 661 00:41:00,360 --> 00:41:04,719 Speaker 2: it's not even just AI's capabilities, which vary, it really 662 00:41:04,760 --> 00:41:09,120 Speaker 2: depends on what specific jobs we're talking about and functions, 663 00:41:09,640 --> 00:41:14,480 Speaker 2: but the hype around AI, the stories we tell ourselves 664 00:41:14,480 --> 00:41:18,759 Speaker 2: about AI capabilities, have implications for jobs. And let me 665 00:41:18,840 --> 00:41:22,440 Speaker 2: give you this example. I saw a headline. It was 666 00:41:22,480 --> 00:41:25,320 Speaker 2: probably beginning of the week when I saw the headline. 667 00:41:25,360 --> 00:41:29,560 Speaker 2: It was something to do with NEDA, the National Eating Disorders Association. 668 00:41:30,480 --> 00:41:33,919 Speaker 2: Their workers were, I believe, unionizing, and so they said, 669 00:41:34,120 --> 00:41:36,160 Speaker 2: we're not dealing with this. So they fired the call 670 00:41:36,200 --> 00:41:40,759 Speaker 2: center workers and they replaced them with a chatbot. Right. 671 00:41:40,840 --> 00:41:43,640 Speaker 2: And at that point, you're hearing all of this AI hype. Right, 672 00:41:43,800 --> 00:41:48,000 Speaker 2: a chatbot can replace the humans, all of these customer 673 00:41:48,080 --> 00:41:52,120 Speaker 2: service call centers, and we are seeing that. So they 674 00:41:52,160 --> 00:41:54,640 Speaker 2: replaced the humans. I don't think it was even five 675 00:41:54,840 --> 00:41:57,720 Speaker 2: or six days later, the next headline was chatbot 676 00:41:57,840 --> 00:42:04,000 Speaker 2: shut down. Why was the chatbot shut down? The chatbot was actually 677 00:42:04,440 --> 00:42:09,239 Speaker 2: giving advice that's known to make eating disorders worse. Now, 678 00:42:09,280 --> 00:42:13,719 Speaker 2: this is an organization whose whole mission is around addressing 679 00:42:14,040 --> 00:42:18,440 Speaker 2: eating disorders, right, and helping people who are struggling with that. 680 00:42:18,840 --> 00:42:22,719 Speaker 2: And so not only did they compromise their workforce, they've 681 00:42:22,760 --> 00:42:25,960 Speaker 2: compromised their mission, not because AI was so great, but 682 00:42:26,080 --> 00:42:30,640 Speaker 2: because they believed the hype in the AI system. And 683 00:42:30,719 --> 00:42:34,040 Speaker 2: so I think it's so easy to assume that you 684 00:42:34,080 --> 00:42:37,799 Speaker 2: can automate the thing you don't know as well, right, 685 00:42:38,480 --> 00:42:41,480 Speaker 2: so it's like, okay, that's easy or that's easy enough. 686 00:42:41,640 --> 00:42:43,600 Speaker 2: But if you're the person in the day to day, 687 00:42:43,640 --> 00:42:47,400 Speaker 2: you might actually realize there are some nuances to the 688 00:42:47,440 --> 00:42:50,440 Speaker 2: work that you're doing. So that's one piece, but there 689 00:42:50,480 --> 00:42:53,080 Speaker 2: are other areas, right. Again, when it comes to 690 00:42:53,120 --> 00:42:56,160 Speaker 2: customer service and things of that nature, you are seeing 691 00:42:56,239 --> 00:43:00,920 Speaker 2: companies reduce their workforces now because there's a portion of 692 00:43:00,920 --> 00:43:05,080 Speaker 2: that work that can be automated. And so when people say, oh, 693 00:43:05,160 --> 00:43:08,799 Speaker 2: the AI is just here to enhance, the rhetoric of 694 00:43:09,000 --> 00:43:13,919 Speaker 2: enhancement soon becomes the reality of replacement.
Right, if you're 695 00:43:14,040 --> 00:43:18,080 Speaker 2: looking at trajectories. And so there is certainly an economic 696 00:43:18,239 --> 00:43:22,560 Speaker 2: impact of AI on jobs that is here and that 697 00:43:22,719 --> 00:43:26,200 Speaker 2: is coming. That's the reality of the situation because it's 698 00:43:26,200 --> 00:43:30,840 Speaker 2: all about cost cutting and quote unquote efficiency as opposed 699 00:43:30,920 --> 00:43:33,600 Speaker 2: to quality. And I truly believe where you want the 700 00:43:33,719 --> 00:43:38,120 Speaker 2: quality work, you want humans. And you'll hear a lot 701 00:43:38,160 --> 00:43:41,120 Speaker 2: of people saying, well, humans with AI. It depends on 702 00:43:41,200 --> 00:43:44,120 Speaker 2: the type of AI, because we also don't want a 703 00:43:44,160 --> 00:43:47,239 Speaker 2: case where we're living in what I call the apprentice gap. 704 00:43:47,760 --> 00:43:51,200 Speaker 2: So you say AI is taking over the entry level jobs, right, 705 00:43:51,840 --> 00:43:54,360 Speaker 2: how do you ever gain mastery if you never have 706 00:43:54,760 --> 00:43:58,240 Speaker 2: the entry level jobs? I recently started playing my guitar 707 00:43:58,280 --> 00:44:01,319 Speaker 2: again and I have to thank my younger self. I still 708 00:44:01,320 --> 00:44:05,440 Speaker 2: had my calluses, right. And I think sometimes we forget 709 00:44:05,520 --> 00:44:09,960 Speaker 2: the professional calluses we develop by going through the process 710 00:44:10,520 --> 00:44:14,840 Speaker 2: of gaining mastery in anything. Sometimes it seems like drudge 711 00:44:14,840 --> 00:44:17,560 Speaker 2: work or things that could be automated, but you're also 712 00:44:17,680 --> 00:44:21,600 Speaker 2: learning things in that process. And if you take that away, 713 00:44:21,680 --> 00:44:25,600 Speaker 2: then there is no on ramp to the mastery stage. 714 00:44:25,680 --> 00:44:27,840 Speaker 2: And so then we end up living in the 715 00:44:27,880 --> 00:44:31,000 Speaker 2: age of the last masters, right. I mean, one day 716 00:44:31,000 --> 00:44:34,280 Speaker 2: I might tell my future children, I remember when humans 717 00:44:34,320 --> 00:44:40,279 Speaker 2: wrote books. I'll tell them I wrote each word of 718 00:44:41,160 --> 00:44:46,480 Speaker 2: it, no prompts. And then there are other authors who will say, 719 00:44:46,600 --> 00:44:49,359 Speaker 2: let me take my work and put it through an 720 00:44:49,400 --> 00:44:52,040 Speaker 2: AI system and see if there are themes or summaries or 721 00:44:52,120 --> 00:44:55,560 Speaker 2: ways to scaffold my writing process, which is a different 722 00:44:55,600 --> 00:44:58,279 Speaker 2: sort of use than let me not go through the 723 00:44:58,320 --> 00:45:01,520 Speaker 2: process of creation and creativity. And so I do think 724 00:45:01,760 --> 00:45:05,359 Speaker 2: there are hybrid modes, but we have to be very 725 00:45:05,400 --> 00:45:11,120 Speaker 2: intentional to make sure we're still working our cognitive functions. Right, 726 00:45:11,280 --> 00:45:13,359 Speaker 2: just like if we don't work our muscles, they're not 727 00:45:13,440 --> 00:45:16,839 Speaker 2: as strong. If we're not working our cognitive functions, they're 728 00:45:16,840 --> 00:45:19,120 Speaker 2: not going to be as strong. And I also think 729 00:45:19,160 --> 00:45:22,640 Speaker 2: there are ways in which we can use AI tools, right.
730 00:45:22,680 --> 00:45:25,440 Speaker 2: I think about things like AlphaFold and my dad 731 00:45:25,800 --> 00:45:29,240 Speaker 2: and the work that he does, that having a system 732 00:45:29,320 --> 00:45:33,400 Speaker 2: that's showing you these protein structures actually allows you to 733 00:45:34,040 --> 00:45:37,640 Speaker 2: have certain types of scientific research that just wouldn't be 734 00:45:37,760 --> 00:45:41,640 Speaker 2: humanly possible because of the amount of data that would 735 00:45:41,680 --> 00:45:45,520 Speaker 2: need to be analyzed and understood. And so I certainly 736 00:45:45,600 --> 00:45:51,760 Speaker 2: think there are opportunities where that human AI collaboration actually 737 00:45:51,800 --> 00:45:54,440 Speaker 2: does make a difference. But I think you have to 738 00:45:54,480 --> 00:45:59,280 Speaker 2: be really careful when companies are talking about those opportunities. Meanwhile, 739 00:45:59,280 --> 00:46:03,720 Speaker 2: they're taking the data of artists and writers and giving 740 00:46:03,760 --> 00:46:08,239 Speaker 2: you spicy autocomplete. They're not the same thing, right, right. 741 00:46:08,600 --> 00:46:20,279 Speaker 1: More from our conversation after the break. I appreciate the 742 00:46:20,320 --> 00:46:23,000 Speaker 1: example you shared about the eating disorder work, because that 743 00:46:23,120 --> 00:46:25,040 Speaker 1: is something, of course, that I'm paying attention to in 744 00:46:25,080 --> 00:46:28,560 Speaker 1: my industry, right, this idea of therapist GPT. So 745 00:46:28,640 --> 00:46:30,799 Speaker 1: this is a platform that has been built to kind 746 00:46:30,800 --> 00:46:34,600 Speaker 1: of offer advice and support in, I think, the tone 747 00:46:34,680 --> 00:46:37,560 Speaker 1: of a therapist. And I think about like earlier this year, 748 00:46:37,600 --> 00:46:41,080 Speaker 1: there was a company that was paying clients to secretly 749 00:46:41,160 --> 00:46:45,640 Speaker 1: record their therapy sessions and then upload them to some system, 750 00:46:45,680 --> 00:46:49,040 Speaker 1: presumably, I'm guessing, to train it for something like a 751 00:46:49,080 --> 00:46:51,759 Speaker 1: therapist GPT. So I wonder if you could talk a 752 00:46:51,800 --> 00:46:54,719 Speaker 1: little bit more about like the concern about something like 753 00:46:54,760 --> 00:46:59,120 Speaker 1: a therapist GPT existing in an effort to maybe make 754 00:46:59,239 --> 00:47:03,239 Speaker 1: mental health more accessible, but like you mentioned, offering suggestions 755 00:47:03,280 --> 00:47:04,880 Speaker 1: that actually make treatment worse. 756 00:47:05,400 --> 00:47:08,680 Speaker 2: I'm so glad you're bringing this up, because again, there 757 00:47:08,840 --> 00:47:15,080 Speaker 2: is that enticing narrative of democratizing, right, or making something 758 00:47:15,160 --> 00:47:18,600 Speaker 2: more accessible that would otherwise be out of reach, and 759 00:47:18,680 --> 00:47:22,200 Speaker 2: so that becomes the origin story and you have the 760 00:47:22,280 --> 00:47:28,160 Speaker 2: veil of doing good.
I'm thinking about Character AI right now, 761 00:47:28,960 --> 00:47:32,440 Speaker 2: a company that was actually started by people who spun out 762 00:47:32,480 --> 00:47:35,239 Speaker 2: of Google, some of the people who created the Transformer 763 00:47:35,360 --> 00:47:38,960 Speaker 2: architecture that's behind so much of the generative AI systems 764 00:47:39,000 --> 00:47:43,319 Speaker 2: that are happening, and they were somewhat complaining that at 765 00:47:43,400 --> 00:47:46,240 Speaker 2: Google they weren't able to explore more of the creative, 766 00:47:46,560 --> 00:47:50,400 Speaker 2: riskier side of AI, and so they decided to create 767 00:47:50,760 --> 00:47:56,480 Speaker 2: basically AI companions, AI companions for entertainment. Right, maybe you 768 00:47:56,480 --> 00:48:00,600 Speaker 2: want an AI boyfriend or something like that, and it 769 00:48:00,640 --> 00:48:04,160 Speaker 2: was supposed to be all fun and games. Fast forward and 770 00:48:04,239 --> 00:48:06,840 Speaker 2: you have the story of a fourteen year old committing 771 00:48:06,920 --> 00:48:12,080 Speaker 2: suicide after engaging for quite some time with one of 772 00:48:12,120 --> 00:48:17,640 Speaker 2: these Character dot AI chatbots that he had customized, and 773 00:48:18,000 --> 00:48:21,719 Speaker 2: his mom is pressing charges and saying she believes he'd 774 00:48:21,719 --> 00:48:25,440 Speaker 2: still be alive if not for this kind of emotional 775 00:48:25,800 --> 00:48:30,719 Speaker 2: connection that was being formed with this AI system that 776 00:48:30,880 --> 00:48:34,200 Speaker 2: is not human, that has no agency, that has no care, 777 00:48:34,280 --> 00:48:37,759 Speaker 2: that has no understanding. It is all an illusion, a 778 00:48:37,880 --> 00:48:40,360 Speaker 2: very good one. Just like when I watch Finding Nemo 779 00:48:40,520 --> 00:48:44,640 Speaker 2: or some other animated series, I'm still caught up in 780 00:48:44,800 --> 00:48:48,399 Speaker 2: all of it, okay, Kung Fu Panda. I know this 781 00:48:48,480 --> 00:48:51,840 Speaker 2: is a series of images. I know it's an actor 782 00:48:51,960 --> 00:48:56,279 Speaker 2: with the voice, but when it's manipulating the way we 783 00:48:56,400 --> 00:49:02,120 Speaker 2: perceive and give agency to things because of our human psychology, 784 00:49:02,800 --> 00:49:05,600 Speaker 2: then we still have that illusion, even though we know. 785 00:49:06,320 --> 00:49:10,120 Speaker 2: So when I think of people interacting with these chatbots, 786 00:49:10,239 --> 00:49:12,920 Speaker 2: it is like that illusion, except for now it's an 787 00:49:12,960 --> 00:49:19,320 Speaker 2: interactive illusion and it can have data about previous conversations. 788 00:49:19,880 --> 00:49:24,040 Speaker 2: This is so dangerous. I think this is actually one 789 00:49:24,040 --> 00:49:27,959 Speaker 2: of the most dangerous uses of AI that doesn't get 790 00:49:28,160 --> 00:49:30,640 Speaker 2: as much attention, because again, it can be like oh, 791 00:49:30,680 --> 00:49:35,680 Speaker 2: fun and games, right, until someone gets hurt, until people commit suicide.
792 00:49:35,680 --> 00:49:38,520 Speaker 2: Even before the fourteen year old boy we were just mentioning, 793 00:49:38,880 --> 00:49:41,839 Speaker 2: there's also a man in Belgium whose widow says he'd 794 00:49:41,880 --> 00:49:45,319 Speaker 2: still be alive had he not started engaging in these 795 00:49:45,360 --> 00:49:50,799 Speaker 2: conversations with a chatbot in that instance, and sadly there 796 00:49:50,840 --> 00:49:54,880 Speaker 2: are likely more stories like that that haven't hit the 797 00:49:55,000 --> 00:49:59,480 Speaker 2: news cycles in the same sort of way. So I 798 00:49:59,520 --> 00:50:03,400 Speaker 2: think it is so tempting to say here is the 799 00:50:03,520 --> 00:50:08,480 Speaker 2: technological fix for our human needs for connection, our human 800 00:50:08,560 --> 00:50:12,239 Speaker 2: need to be understood. We know that we have a 801 00:50:12,440 --> 00:50:15,960 Speaker 2: huge problem with loneliness. It was made even worse by 802 00:50:16,000 --> 00:50:20,279 Speaker 2: the pandemic. It's been exacerbated by social media, and now 803 00:50:20,360 --> 00:50:23,799 Speaker 2: the very companies and the people who built those companies, 804 00:50:24,239 --> 00:50:27,560 Speaker 2: that built these systems that were supposed to connect us 805 00:50:27,600 --> 00:50:31,440 Speaker 2: but left us isolated, say, now we have a new pill, 806 00:50:32,040 --> 00:50:36,080 Speaker 2: this AI companion, to fill the voids that only other 807 00:50:36,239 --> 00:50:41,360 Speaker 2: humans can in an authentic way. And so now you 808 00:50:41,520 --> 00:50:45,000 Speaker 2: have the substitute. And at the end of the day, 809 00:50:45,080 --> 00:50:50,400 Speaker 2: the substitute isn't real, and it leaves people emptier. And 810 00:50:50,440 --> 00:50:53,759 Speaker 2: that's what we're seeing with some of these AI companions. 811 00:50:53,800 --> 00:50:57,200 Speaker 2: And maybe if you're using it in an entertainment way, 812 00:50:57,760 --> 00:51:00,400 Speaker 2: that's one thing. But what we're seeing is people are 813 00:51:00,440 --> 00:51:06,000 Speaker 2: forming real emotional dependencies. And we're talking children here, you 814 00:51:06,080 --> 00:51:09,120 Speaker 2: see adults, the adult man in Belgium, right. So it 815 00:51:09,239 --> 00:51:13,560 Speaker 2: cuts across age range, but it's especially dangerous at that 816 00:51:13,760 --> 00:51:16,799 Speaker 2: developmental stage. I'm sure you can speak more to it 817 00:51:16,840 --> 00:51:19,680 Speaker 2: than I can, as it's more of your area of expertise. 818 00:51:19,719 --> 00:51:24,160 Speaker 2: But as somebody who's from the AI side of the fence, 819 00:51:24,239 --> 00:51:27,720 Speaker 2: I think that's one of the most dangerous uses of AI, 820 00:51:27,960 --> 00:51:32,360 Speaker 2: when you can emotionally manipulate somebody. And that's what's happening 821 00:51:32,719 --> 00:51:37,080 Speaker 2: when you're forming what seems to be a connection, because 822 00:51:37,120 --> 00:51:40,040 Speaker 2: again you have this illusion, but instead of it being 823 00:51:40,120 --> 00:51:43,880 Speaker 2: Kung Fu Panda or Finding Nemo, some of my favorite 824 00:51:43,880 --> 00:51:47,600 Speaker 2: animated films, and I know that illusion, you're blurring the 825 00:51:47,640 --> 00:51:50,800 Speaker 2: lines and you're forming real attachments. 826 00:51:52,000 --> 00:51:53,759 Speaker 1: So, doctor Joy, in your mind,
what would it look 827 00:51:53,880 --> 00:51:58,160 Speaker 1: like to have an equitable, culturally aware AI driven platform? 828 00:51:58,400 --> 00:52:02,600 Speaker 2: I think, first of all, if it's culturally aware and equitable, 829 00:52:02,760 --> 00:52:07,360 Speaker 2: what's being centered? People, not the technology. And I find 830 00:52:07,400 --> 00:52:10,080 Speaker 2: this so often, and actually this is what drew me 831 00:52:10,120 --> 00:52:12,440 Speaker 2: to computer science. I won't lie. People are messy. 832 00:52:13,760 --> 00:52:17,640 Speaker 4: I'm like, great, give me the algorithms, give me the tech. 833 00:52:18,440 --> 00:52:21,880 Speaker 4: I thought science and tech is for me. Let the 834 00:52:22,000 --> 00:52:26,760 Speaker 4: humanities folks deal with the social science, people. Great, everyone 835 00:52:26,800 --> 00:52:28,240 Speaker 4: in their lane, everyone 836 00:52:27,840 --> 00:52:31,400 Speaker 2: in their lane. You know, it's so easy to want 837 00:52:31,520 --> 00:52:34,759 Speaker 2: technology to be the savior when we, at the end 838 00:52:34,760 --> 00:52:38,440 Speaker 2: of the day, have to save ourselves. And so I think, 839 00:52:38,480 --> 00:52:42,239 Speaker 2: when I'm thinking of the use of AI, what are 840 00:52:42,239 --> 00:52:46,640 Speaker 2: we doing when it comes to broader social inequities and 841 00:52:46,719 --> 00:52:51,319 Speaker 2: inequalities, right? Because AI will not solve poverty, because the 842 00:52:51,360 --> 00:52:54,640 Speaker 2: power dynamics that lead to poverty and the profit motive 843 00:52:54,760 --> 00:52:59,399 Speaker 2: are not technical questions. AI won't solve climate change if 844 00:52:59,400 --> 00:53:03,200 Speaker 2: we can't actually shift the economic incentives and economic structures. 845 00:53:03,400 --> 00:53:07,000 Speaker 2: Even if the AI helps us discover new materials or 846 00:53:07,040 --> 00:53:09,840 Speaker 2: ways to be more efficient, you still have that broader 847 00:53:10,280 --> 00:53:14,480 Speaker 2: question of what are we going to value as a society? 848 00:53:14,680 --> 00:53:18,319 Speaker 2: And so I can see AI being a tool that 849 00:53:18,440 --> 00:53:20,960 Speaker 2: helps us explore, but we still have to deal with 850 00:53:21,000 --> 00:53:25,680 Speaker 2: the messiness of human dynamics, right. We have to deal 851 00:53:25,800 --> 00:53:30,320 Speaker 2: with questions of power, who has a say and who doesn't. 852 00:53:30,719 --> 00:53:34,920 Speaker 2: We have to deal with economic questions as well, who 853 00:53:35,040 --> 00:53:38,960 Speaker 2: profits, who doesn't. And so for me, when I'm thinking 854 00:53:39,280 --> 00:53:43,520 Speaker 2: of this future, it is very much one where we're thinking 855 00:53:43,640 --> 00:53:49,200 Speaker 2: through how data is not a way to destine you to discrimination, 856 00:53:49,800 --> 00:53:53,320 Speaker 2: where the people who are impacted by AI systems actually 857 00:53:53,400 --> 00:53:55,800 Speaker 2: have a voice and a choice in how these AI 858 00:53:55,920 --> 00:53:58,640 Speaker 2: systems are created. And I love to say, if you 859 00:53:58,680 --> 00:54:01,800 Speaker 2: have a face, you have a place in the conversation around AI.
860 00:54:02,239 --> 00:54:05,440 Speaker 2: So how do we build governance structures that allow those 861 00:54:05,480 --> 00:54:08,600 Speaker 2: who are most impacted, right, to be part of the 862 00:54:08,719 --> 00:54:13,040 Speaker 2: process of creation, not just receiving and responding and playing 863 00:54:13,120 --> 00:54:14,880 Speaker 2: bias whack-a-mole. 864 00:54:15,200 --> 00:54:17,440 Speaker 1: And that's why it's really important. So, in addition to 865 00:54:17,520 --> 00:54:20,640 Speaker 1: being an author, you're also the founder of the Algorithmic 866 00:54:21,000 --> 00:54:24,040 Speaker 1: Justice League, which is an organization combining art and research 867 00:54:24,360 --> 00:54:28,440 Speaker 1: to illuminate artificial intelligence's social implications and harms. Can you 868 00:54:28,520 --> 00:54:30,560 Speaker 1: tell us a little bit more about the organization and 869 00:54:30,640 --> 00:54:32,320 Speaker 1: how people might be able to get involved? 870 00:54:33,320 --> 00:54:36,880 Speaker 2: Oh, yes. So my day job is the work with 871 00:54:36,960 --> 00:54:41,239 Speaker 2: the Algorithmic Justice League, and we amplify issues of emerging 872 00:54:41,280 --> 00:54:45,239 Speaker 2: AI harms. We also connect people with resources if they've 873 00:54:45,239 --> 00:54:48,240 Speaker 2: been harmed by AI. Anyone who's been harmed by AI, 874 00:54:48,360 --> 00:54:50,480 Speaker 2: we call them the excoded. You could be Taylor 875 00:54:50,480 --> 00:54:53,360 Speaker 2: Swift with the explicit deep fakes. You're excoded. You're the 876 00:54:53,400 --> 00:54:56,520 Speaker 2: student who was flagged as cheating with an AI system 877 00:54:56,640 --> 00:54:59,240 Speaker 2: just because English is your second language. You're also among 878 00:54:59,280 --> 00:55:03,200 Speaker 2: the excoded. Porcha Woodruff, arrested eight months pregnant due 879 00:55:03,239 --> 00:55:06,880 Speaker 2: to faulty facial recognition fueled by AI. And so we 880 00:55:06,920 --> 00:55:11,080 Speaker 2: connect those who've been excoded with resources, but also campaigns. 881 00:55:11,280 --> 00:55:14,880 Speaker 2: So one of our recent campaigns is the Freedom Flyers Campaign, 882 00:55:15,320 --> 00:55:18,239 Speaker 2: and this was to raise awareness and continue to raise 883 00:55:18,239 --> 00:55:22,600 Speaker 2: awareness of the increased use of facial recognition at airports. 884 00:55:22,960 --> 00:55:25,120 Speaker 2: So right now, you can go to the airport and 885 00:55:25,160 --> 00:55:28,319 Speaker 2: you've likely noticed that they have face scanning going on. 886 00:55:28,840 --> 00:55:32,040 Speaker 2: What most people don't know is for domestic flights, these 887 00:55:32,120 --> 00:55:35,239 Speaker 2: scans are optional. You can actually step to the side 888 00:55:35,320 --> 00:55:39,640 Speaker 2: and say you want the standard check. It's literally that easy, right, 889 00:55:39,680 --> 00:55:42,239 Speaker 2: and you should be able to go through. Now we 890 00:55:42,280 --> 00:55:45,520 Speaker 2: know there are power dynamics, people behind you in line, you're 891 00:55:45,560 --> 00:55:47,960 Speaker 2: trying to get to your flight and so forth. So 892 00:55:48,040 --> 00:55:50,719 Speaker 2: if you feel comfortable opting out, we invite you to 893 00:55:50,800 --> 00:55:53,319 Speaker 2: join the opt out club. You have cool swag, you know, 894 00:55:53,400 --> 00:55:56,839 Speaker 2: the whole thing.
But regardless of your experience, we ask 895 00:55:56,960 --> 00:56:02,719 Speaker 2: everybody to fill out a TSA scorecard at AJL dot org. 896 00:56:02,800 --> 00:56:07,120 Speaker 2: Because this actually allows us to hold TSA accountable. They're saying, oh, 897 00:56:07,160 --> 00:56:11,120 Speaker 2: there's notice. People don't even see the signs. You have 898 00:56:11,320 --> 00:56:14,760 Speaker 2: light blue on dark blue text you're not gonna see, 899 00:56:14,960 --> 00:56:18,400 Speaker 2: you know, if it's even visible, so things of that nature. 900 00:56:18,440 --> 00:56:20,480 Speaker 2: So those are some of the campaigns we do so 901 00:56:20,520 --> 00:56:23,680 Speaker 2: people actually know where they have the ability to push back. 902 00:56:23,680 --> 00:56:27,719 Speaker 2: We recently did this with LinkedIn. So LinkedIn, for all 903 00:56:27,719 --> 00:56:32,200 Speaker 2: of their US users, automatically enrolled you so that 904 00:56:32,360 --> 00:56:35,439 Speaker 2: your content and personal data can be used to train 905 00:56:35,480 --> 00:56:38,759 Speaker 2: their AI systems. Now you can opt out if you 906 00:56:38,840 --> 00:56:41,600 Speaker 2: go to the right settings. We have opt out blue 907 00:56:41,680 --> 00:56:44,719 Speaker 2: dot AJL dot org, which will take you straight to 908 00:56:44,800 --> 00:56:48,719 Speaker 2: those settings so you can opt out of it. This 909 00:56:48,800 --> 00:56:51,840 Speaker 2: is the kind of design practice we push back against. 910 00:56:51,920 --> 00:56:54,799 Speaker 2: It should always be opt in. I shouldn't have to 911 00:56:54,920 --> 00:56:57,839 Speaker 2: hear on Therapy for Black Girls, though this is why 912 00:56:57,880 --> 00:57:01,600 Speaker 2: you should be listening, right, that you can actually opt out 913 00:57:01,640 --> 00:57:03,839 Speaker 2: of that. I should have had a choice in the 914 00:57:03,880 --> 00:57:07,560 Speaker 2: first place. So automatically enrolling me and saying I could 915 00:57:07,600 --> 00:57:10,120 Speaker 2: have opted out, and hence I had a choice, is 916 00:57:10,160 --> 00:57:13,239 Speaker 2: not really informed consent. So those are the sorts of 917 00:57:13,280 --> 00:57:16,360 Speaker 2: things we do with the Algorithmic Justice League, just so 918 00:57:16,440 --> 00:57:19,760 Speaker 2: people know what is going on, where they have the 919 00:57:19,840 --> 00:57:22,840 Speaker 2: ability to push back where that is possible, while also 920 00:57:23,280 --> 00:57:27,520 Speaker 2: pushing for the larger societal changes. Right, because some of 921 00:57:27,560 --> 00:57:32,000 Speaker 2: these changes are not an individual pushing back in one 922 00:57:32,080 --> 00:57:35,400 Speaker 2: or two ways. It's that we have made sure that there 923 00:57:35,400 --> 00:57:41,640 Speaker 2: are consequences for abusing AI systems, that we've put in procurement processes. 924 00:57:41,680 --> 00:57:44,480 Speaker 2: So before you even adopt an AI system, is this 925 00:57:44,560 --> 00:57:47,960 Speaker 2: going to be discriminatory? Right? Has it been proven safe 926 00:57:47,960 --> 00:57:52,200 Speaker 2: and effective? Are there meaningful alternatives and fallbacks? All of 927 00:57:52,200 --> 00:57:54,840 Speaker 2: these are part of the Blueprint for an AI Bill 928 00:57:54,880 --> 00:57:59,120 Speaker 2: of Rights that should actually be put in place by law, 929 00:57:59,160 --> 00:58:02,120 Speaker 2: which it isn't just yet.
But we have ways of thinking 930 00:58:02,160 --> 00:58:05,360 Speaker 2: about how do we develop AI in a way that's 931 00:58:05,400 --> 00:58:07,640 Speaker 2: actually going to help more of us, not just the 932 00:58:07,680 --> 00:58:10,479 Speaker 2: privileged few. And so that's some of what we do 933 00:58:10,680 --> 00:58:14,480 Speaker 2: with the Algorithmic Justice League. That's my day job. And 934 00:58:14,520 --> 00:58:18,280 Speaker 2: then I am also a writer and a poet, and 935 00:58:18,360 --> 00:58:21,840 Speaker 2: so the art practice is really important for me. Yes, 936 00:58:21,880 --> 00:58:23,960 Speaker 2: we have the book Unmasking AI, but we also have 937 00:58:24,080 --> 00:58:29,800 Speaker 2: the Emmy nominated documentary Coded Bias available on Netflix, and 938 00:58:29,920 --> 00:58:33,280 Speaker 2: so if that's something you're interested in learning more about, 939 00:58:33,360 --> 00:58:38,360 Speaker 2: it's an opportunity to educate yourself and also entertain your family 940 00:58:38,480 --> 00:58:42,440 Speaker 2: and community around topics of AI bias. And what I 941 00:58:42,520 --> 00:58:45,480 Speaker 2: love about that film is it features so many highly 942 00:58:45,560 --> 00:58:51,240 Speaker 2: melanated women dropping knowledge as experts in the field and 943 00:58:51,360 --> 00:58:54,960 Speaker 2: also leading the charge, like Tranae of the Brooklyn tenants. 944 00:58:54,960 --> 00:58:58,960 Speaker 2: She's the one who got the information for other tenants 945 00:58:58,960 --> 00:59:01,000 Speaker 2: to say, hey, we don't want the installation of this 946 00:59:01,120 --> 00:59:04,479 Speaker 2: facial recognition system in our home. And so I think 947 00:59:04,560 --> 00:59:07,720 Speaker 2: even just seeing who gets to be part of 948 00:59:07,800 --> 00:59:11,080 Speaker 2: technology and who gets to make change, seeing ourselves 949 00:59:11,120 --> 00:59:15,000 Speaker 2: represented in that way, is a powerful depiction and part 950 00:59:15,040 --> 00:59:17,200 Speaker 2: of why I even wanted to be part of the 951 00:59:17,280 --> 00:59:21,000 Speaker 2: documentary in the first place, while also raising awareness about 952 00:59:21,000 --> 00:59:23,840 Speaker 2: all of these different ways AI can be biased, 953 00:59:23,840 --> 00:59:30,560 Speaker 2: whether we're talking employment, the economic impact, healthcare, education, criminal justice, 954 00:59:30,600 --> 00:59:32,040 Speaker 2: and so many other areas. 955 00:59:32,760 --> 00:59:36,240 Speaker 1: So, how do you see AI evolving within the next 956 00:59:36,280 --> 00:59:36,960 Speaker 1: ten years? 957 00:59:37,640 --> 00:59:42,919 Speaker 2: Within the next ten years, I ultimately think it's up 958 00:59:42,960 --> 00:59:47,880 Speaker 2: to us. I really truly believe in human agency. So 959 00:59:47,960 --> 00:59:53,000 Speaker 2: there's a version of AI, right, that exacerbates inequality. 960 00:59:53,200 --> 00:59:56,680 Speaker 2: There's a version of AI that leaves more of us 961 00:59:56,800 --> 01:00:01,120 Speaker 2: behind, fewer jobs, right. There's a version of AI where 962 01:00:01,160 --> 01:00:04,760 Speaker 2: we are claiming to have more equitable hiring, a more equitable 963 01:00:05,160 --> 01:00:09,520 Speaker 2: healthcare system, when what's actually happening is those inequalities are 964 01:00:09,520 --> 01:00:13,720 Speaker 2: getting worse.
But now you have an algorithmic gatekeeper, 965 01:00:13,960 --> 01:00:18,760 Speaker 2: and because of that, it's actually hard to hold anybody accountable. 966 01:00:18,880 --> 01:00:22,400 Speaker 2: So that's one of the futures we can have. Right, 967 01:00:22,600 --> 01:00:25,120 Speaker 2: we have another kind of future, and you can actually look at 968 01:00:25,240 --> 01:00:27,720 Speaker 2: different parts of the world where we see what's happening 969 01:00:27,720 --> 01:00:30,919 Speaker 2: with Europe. They passed the EU AI Act, where they said 970 01:00:30,960 --> 01:00:34,760 Speaker 2: we're going to actually put certain restrictions on high risk 971 01:00:34,920 --> 01:00:38,480 Speaker 2: uses of AI. Things like live biometrics are going to 972 01:00:38,520 --> 01:00:42,160 Speaker 2: be something we don't use. I see a future where 973 01:00:42,160 --> 01:00:45,880 Speaker 2: you might have face free societies or free face societies, 974 01:00:46,200 --> 01:00:50,800 Speaker 2: where we say we don't want our children's biometrics scanned, right, 975 01:00:50,880 --> 01:00:53,480 Speaker 2: and that will actually be a privilege. And in other 976 01:00:53,560 --> 01:00:57,640 Speaker 2: societies where you don't have those protections, you're scanned from 977 01:00:57,760 --> 01:01:01,400 Speaker 2: the moment you're born. From cradle to grave, you're part 978 01:01:01,480 --> 01:01:04,600 Speaker 2: of this biometric system, whether it's your iris, your face, 979 01:01:04,720 --> 01:01:09,880 Speaker 2: your voice, and so forth. So I see these parallel worlds, 980 01:01:10,200 --> 01:01:14,120 Speaker 2: and part of our job is to vote for the 981 01:01:14,160 --> 01:01:16,400 Speaker 2: world we want and push towards that. 982 01:01:17,080 --> 01:01:18,880 Speaker 1: Thank you so much for that, doctor Joy. I have 983 01:01:18,960 --> 01:01:20,800 Speaker 1: so many more questions that I want to ask you, 984 01:01:20,840 --> 01:01:23,120 Speaker 1: but I know we are out of time. This has 985 01:01:23,120 --> 01:01:26,240 Speaker 1: been so so informative, so fascinating. So I really appreciate 986 01:01:26,240 --> 01:01:28,920 Speaker 1: you spending some time with us today. Let us know 987 01:01:29,000 --> 01:01:31,360 Speaker 1: how we can stay connected with you. Where can we 988 01:01:31,400 --> 01:01:34,800 Speaker 1: grab our copy of Unmasking AI? Tell us the website 989 01:01:34,840 --> 01:01:36,600 Speaker 1: as well as any social media handles. 990 01:01:36,880 --> 01:01:41,640 Speaker 2: So for supporting the Algorithmic Justice League: donate dot AJL 991 01:01:41,760 --> 01:01:44,760 Speaker 2: dot org. We need all of your support to continue 992 01:01:44,760 --> 01:01:50,240 Speaker 2: fighting for algorithmic justice. For getting your copy of Unmasking AI, 993 01:01:50,280 --> 01:01:54,480 Speaker 2: you can literally go to www dot unmasking dot ai. 994 01:01:54,720 --> 01:01:58,480 Speaker 2: All of the information is there. I recorded the audiobook 995 01:01:58,680 --> 01:02:01,760 Speaker 2: in three and a half days, and so if you're not 996 01:02:01,880 --> 01:02:05,919 Speaker 2: tired of my voice yet, definitely check that out. 997 01:02:06,080 --> 01:02:10,080 Speaker 2: I'm at Poet of Code on Instagram, so you can 998 01:02:10,120 --> 01:02:14,440 Speaker 2: follow there, and then www dot poet of code dot 999 01:02:14,480 --> 01:02:18,520 Speaker 2: com is my main website.
And as a poet, I 1000 01:02:18,600 --> 01:02:22,320 Speaker 2: can't come on this podcast and not leave you with 1001 01:02:22,520 --> 01:02:25,560 Speaker 2: any poetry. So can I drop a few lines? 1002 01:02:25,960 --> 01:02:26,680 Speaker 1: Absolutely. 1003 01:02:27,200 --> 01:02:31,080 Speaker 2: Okay. We have AI, Ain't I a Woman, which is literally an 1004 01:02:31,160 --> 01:02:35,360 Speaker 2: ode to black women. So I feel that will probably 1005 01:02:35,440 --> 01:02:39,680 Speaker 2: be the best poem for this podcast. And so to 1006 01:02:39,760 --> 01:02:42,800 Speaker 2: give you a bit of context, I wrote AI, Ain't 1007 01:02:42,880 --> 01:02:45,720 Speaker 2: I a Woman as a grad student, and it 1008 01:02:45,840 --> 01:02:50,600 Speaker 2: was inspired by Sojourner Truth's Akron, Ohio speech Ain't I 1009 01:02:50,760 --> 01:02:54,400 Speaker 2: a Woman, which was really pushing the women's rights movement 1010 01:02:54,480 --> 01:02:57,640 Speaker 2: at the time to think about intersectionality. It's like, great, 1011 01:02:57,880 --> 01:03:00,520 Speaker 2: all these rights for white women, what about the rest 1012 01:03:00,560 --> 01:03:04,400 Speaker 2: of us? Right? And that was very informative to the 1013 01:03:04,480 --> 01:03:07,160 Speaker 2: research that I did at MIT that showed some of 1014 01:03:07,200 --> 01:03:11,520 Speaker 2: these huge biases from Amazon, Microsoft, IBM and so forth. 1015 01:03:11,920 --> 01:03:15,760 Speaker 2: And so let's get into it. AI, ain't I a woman? 1016 01:03:17,200 --> 01:03:20,080 Speaker 2: My heart smiles as I bask in their legacies, knowing 1017 01:03:20,120 --> 01:03:23,160 Speaker 2: their lives have altered many destinies. In her eyes, I 1018 01:03:23,200 --> 01:03:25,960 Speaker 2: see my mother's poise. In her face, I glimpse my 1019 01:03:26,040 --> 01:03:29,680 Speaker 2: auntie's grace. In this case of deja vu, a nineteenth 1020 01:03:29,680 --> 01:03:33,480 Speaker 2: century question comes into view. In a time when Sojourner 1021 01:03:33,560 --> 01:03:38,040 Speaker 2: Truth asked, ain't I a woman, today we pose this 1022 01:03:38,160 --> 01:03:42,480 Speaker 2: question to new powers making bets on artificial intelligence, hope towers, 1023 01:03:42,880 --> 01:03:48,120 Speaker 2: the Amazonians peek through windows, blocking deep blues as faces increment scars, 1024 01:03:48,560 --> 01:03:53,120 Speaker 2: old burns, new urns, collecting data chronicling our past, often 1025 01:03:53,200 --> 01:03:57,120 Speaker 2: forgetting to deal with gender, race, and class. Again, I 1026 01:03:57,240 --> 01:04:01,920 Speaker 2: ask, ain't I a woman? Face by face, the answers 1027 01:04:02,000 --> 01:04:06,240 Speaker 2: seem uncertain. Young and old, proud icons are dismissed. Can 1028 01:04:06,280 --> 01:04:09,720 Speaker 2: machines ever see my queens as I view them? Can 1029 01:04:09,720 --> 01:04:13,960 Speaker 2: machines ever see our grandmothers as we knew them? Ida 1030 01:04:14,040 --> 01:04:17,840 Speaker 2: B. Wells, data science pioneer, hanging facts, stacking stats on 1031 01:04:17,920 --> 01:04:21,880 Speaker 2: the lynching of humanity, teaching truth hidden in data, each 1032 01:04:22,080 --> 01:04:26,800 Speaker 2: entry and omission a person worthy of respect.
Shirley Chisholm, 1033 01:04:26,800 --> 01:04:30,840 Speaker 2: unbought and unbossed, the first black congresswoman, but not 1034 01:04:31,000 --> 01:04:34,760 Speaker 2: the first to be misunderstood by machines well versed in 1035 01:04:34,880 --> 01:04:39,680 Speaker 2: data driven mistakes. Michelle Obama, unabashed and unafraid to wear 1036 01:04:39,720 --> 01:04:42,720 Speaker 2: her crown of history. Yet her crown seems a mystery 1037 01:04:42,960 --> 01:04:45,560 Speaker 2: to systems unsure of her hair. A wig, a 1038 01:04:45,680 --> 01:04:48,640 Speaker 2: bouffant, a toupee? Maybe not. Are there no words 1039 01:04:48,640 --> 01:04:51,680 Speaker 2: for our braids and our locs? Do sunny skin and 1040 01:04:51,720 --> 01:04:55,960 Speaker 2: relaxed hair make Oprah the first lady? Even for her 1041 01:04:56,000 --> 01:04:59,880 Speaker 2: face well known, some algorithms fault her, echoing sentiments that 1042 01:05:00,160 --> 01:05:05,160 Speaker 2: strong women are men. We laugh, celebrating the successes of 1043 01:05:05,200 --> 01:05:10,280 Speaker 2: our sisters with Serena smiles. No label is worthy of 1044 01:05:10,320 --> 01:05:10,960 Speaker 2: our beauty. 1045 01:05:11,560 --> 01:05:15,440 Speaker 1: Oh my gosh, what a beautiful way to end this conversation. 1046 01:05:15,640 --> 01:05:17,840 Speaker 1: Thank you so much, doctor Joy, for spending some time 1047 01:05:17,880 --> 01:05:19,920 Speaker 1: with us. I really really appreciate it. 1048 01:05:19,440 --> 01:05:22,200 Speaker 2: And thank you, doctor Joy. 1049 01:05:27,000 --> 01:05:29,120 Speaker 1: I'm so glad doctor Joy was able to join me 1050 01:05:29,200 --> 01:05:32,440 Speaker 1: for this conversation. To learn more about her and her work, 1051 01:05:32,600 --> 01:05:34,880 Speaker 1: or to grab a copy of her book, be sure 1052 01:05:34,880 --> 01:05:37,280 Speaker 1: to visit our show notes at Therapy for Blackgirls dot 1053 01:05:37,320 --> 01:05:40,760 Speaker 1: Com slash Session three eighty six, and don't forget to 1054 01:05:40,800 --> 01:05:43,040 Speaker 1: text this episode to two of your girls right now 1055 01:05:43,080 --> 01:05:45,640 Speaker 1: and tell them to check it out. If you're looking 1056 01:05:45,680 --> 01:05:48,840 Speaker 1: for a therapist in your area, visit our therapist directory 1057 01:05:48,880 --> 01:05:52,640 Speaker 1: at Therapy for Blackgirls dot com slash directory. And if 1058 01:05:52,640 --> 01:05:55,200 Speaker 1: you want to continue digging into this topic or just 1059 01:05:55,280 --> 01:05:58,000 Speaker 1: be in community with other sisters, come on over and 1060 01:05:58,080 --> 01:06:00,920 Speaker 1: join us in the Sister Circle. It's our cozy corner 1061 01:06:00,920 --> 01:06:03,800 Speaker 1: of the Internet designed just for black women. You can 1062 01:06:03,880 --> 01:06:07,520 Speaker 1: join us at community dot Therapy for Black Girls dot com. 1063 01:06:07,720 --> 01:06:11,480 Speaker 1: This episode was produced by Elish Ellis, Zairea Taylor, and 1064 01:06:11,560 --> 01:06:16,439 Speaker 1: Tyree Rush. Editing was done by Dennis and Bradford. Thank 1065 01:06:16,520 --> 01:06:18,920 Speaker 1: y'all so much for joining me again this week. I 1066 01:06:18,960 --> 01:06:22,200 Speaker 1: look forward to continuing this conversation with you all real soon. 1067 01:06:22,880 --> 01:06:23,560 Speaker 1: Take good care.