1 00:00:00,160 --> 00:00:03,920 Speaker 1: To make a real difference in the world. Today's episode 2 00:00:04,480 --> 00:00:08,800 Speaker 1: is a powerful one, and I am fortunate to participate 3 00:00:08,880 --> 00:00:13,680 Speaker 1: in this important discussion. I think there is no way 4 00:00:13,720 --> 00:00:18,480 Speaker 1: to express how tumultuous twenty twenty has been that hasn't 5 00:00:18,560 --> 00:00:22,200 Speaker 1: already been said a million times by this point. The 6 00:00:22,280 --> 00:00:26,400 Speaker 1: United States and the world at large continue to grapple 7 00:00:26,520 --> 00:00:31,680 Speaker 1: with enormous problems, ranging from the COVID nineteen pandemic to 8 00:00:31,920 --> 00:00:37,200 Speaker 1: economic recessions to climate change. But one issue that really 9 00:00:37,640 --> 00:00:40,639 Speaker 1: ties into all of those when you look into it, 10 00:00:40,920 --> 00:00:45,320 Speaker 1: and frankly has for centuries, has been the need for 11 00:00:45,479 --> 00:00:50,040 Speaker 1: racial justice in the United States. The stories of George Floyd, 12 00:00:50,320 --> 00:00:55,680 Speaker 1: Breonna Taylor, Eric Garner, and so many more have prompted 13 00:00:55,720 --> 00:01:00,000 Speaker 1: crowds to protest in the streets and call for systemic change. 14 00:01:00,360 --> 00:01:03,920 Speaker 1: And it's not that the underlying issues of systemic racism 15 00:01:04,000 --> 00:01:07,840 Speaker 1: are new. In fact, what makes them systemic is that 16 00:01:07,920 --> 00:01:12,520 Speaker 1: these issues are woven into the policies and social structures 17 00:01:12,560 --> 00:01:16,400 Speaker 1: of the United States. Rather, it is that they are 18 00:01:16,560 --> 00:01:21,560 Speaker 1: undeniable and we have a responsibility to address that.
To 19 00:01:21,720 --> 00:01:25,560 Speaker 1: that end, within IBM, a group of employees began to 20 00:01:25,640 --> 00:01:29,720 Speaker 1: formulate a means to lean on the company's enormous spectrum 21 00:01:29,760 --> 00:01:34,240 Speaker 1: of technologies towards the goal of attaining true racial justice. 22 00:01:34,760 --> 00:01:38,120 Speaker 1: The company's history of programs and initiatives that aim to 23 00:01:38,160 --> 00:01:41,240 Speaker 1: make the world a better place served as a sort 24 00:01:41,240 --> 00:01:45,640 Speaker 1: of launching ground for the Call for Code for Racial Justice. 25 00:01:46,360 --> 00:01:50,400 Speaker 1: Today you'll hear a conversation I had with three important 26 00:01:50,480 --> 00:01:54,559 Speaker 1: members of that initiative. Dale Davis Jones is a Vice 27 00:01:54,600 --> 00:01:58,440 Speaker 1: President and Distinguished Engineer at IBM and the leader of 28 00:01:58,520 --> 00:02:03,080 Speaker 1: IBM's GTS Architect Community. Lisa Banks is a 29 00:02:03,120 --> 00:02:07,200 Speaker 1: Distinguished Engineer working within the CTO office of the IBM 30 00:02:07,240 --> 00:02:12,040 Speaker 1: Cloud Engagement Hub. And Brittany Lonesome is a creative architect 31 00:02:12,240 --> 00:02:16,640 Speaker 1: with deep experience in cloud systems. Listen to their 32 00:02:16,680 --> 00:02:19,840 Speaker 1: stories and how they and other IBMers were 33 00:02:19,880 --> 00:02:23,680 Speaker 1: able to take the incredible emotional response to the racial 34 00:02:23,800 --> 00:02:28,320 Speaker 1: justice crisis and turn it into actions that anyone can 35 00:02:28,400 --> 00:02:32,560 Speaker 1: contribute to. We have a lot to cover today. We've 36 00:02:32,600 --> 00:02:35,480 Speaker 1: got a really big and important topic.
But before we 37 00:02:35,560 --> 00:02:38,200 Speaker 1: jump into that, I would love to get a little 38 00:02:38,200 --> 00:02:42,280 Speaker 1: more information about my guests today and to learn about 39 00:02:42,280 --> 00:02:46,960 Speaker 1: your professional background and what brought you into the jobs 40 00:02:46,960 --> 00:02:49,840 Speaker 1: that you currently hold and what you find exciting 41 00:02:49,880 --> 00:02:53,200 Speaker 1: about it and why it matters to you and really 42 00:02:53,520 --> 00:02:56,560 Speaker 1: to the whole world. But let's start with you, Dale. 43 00:02:56,639 --> 00:02:59,000 Speaker 1: Can you tell me a little bit about what it 44 00:02:59,080 --> 00:03:00,760 Speaker 1: is you do and how you got to where you 45 00:03:00,760 --> 00:03:04,640 Speaker 1: are today? Okay, so hi, I'm Dale Davis Jones, and 46 00:03:05,280 --> 00:03:09,160 Speaker 1: I'm in the Global Technology Services part of, um, IBM, 47 00:03:09,240 --> 00:03:13,960 Speaker 1: where we serve more than four thousand clients in transforming 48 00:03:14,440 --> 00:03:18,720 Speaker 1: their IT infrastructure, which means we look at what 49 00:03:18,760 --> 00:03:21,400 Speaker 1: they're doing to run the infrastructure to ensure that they 50 00:03:21,400 --> 00:03:26,880 Speaker 1: are continuously evolving to meet the needs of the world today, 51 00:03:26,960 --> 00:03:30,360 Speaker 1: and we help them on that journey with cloud, with AI, 52 00:03:30,480 --> 00:03:33,520 Speaker 1: with automation. For those of us who are not 53 00:03:33,600 --> 00:03:37,080 Speaker 1: in IT, um, you go to the store and you 54 00:03:37,120 --> 00:03:39,400 Speaker 1: know the air conditioning works in your grocery store, the food 55 00:03:39,440 --> 00:03:45,320 Speaker 1: is, um, is not tainted, the hospitals are running with 56 00:03:45,400 --> 00:03:49,360 Speaker 1: the right systems, and the patient records are in order.
57 00:03:49,800 --> 00:03:53,600 Speaker 1: And, um, I lead the architects who work with clients 58 00:03:53,680 --> 00:03:56,720 Speaker 1: on these engagements, and I'm also the global leader for 59 00:03:56,720 --> 00:04:01,320 Speaker 1: the architect team. My background really briefly is undergrad I 60 00:04:01,360 --> 00:04:05,920 Speaker 1: was, um, a math major with a minor in computer science. Um, 61 00:04:06,200 --> 00:04:12,640 Speaker 1: I did graduate school in, um, in systems, um, and controls. 62 00:04:13,200 --> 00:04:16,120 Speaker 1: And I've had, I would say, a checkered career at IBM, 63 00:04:16,200 --> 00:04:19,320 Speaker 1: in that I have had many roles in many different 64 00:04:19,320 --> 00:04:22,560 Speaker 1: parts of the business, from the mainframe organization to consulting, 65 00:04:23,080 --> 00:04:27,119 Speaker 1: technology services, and a stint in corporate headquarters. Um. As someone 66 00:04:27,160 --> 00:04:30,560 Speaker 1: from a small island, which is Trinidad and Tobago, it's 67 00:04:30,600 --> 00:04:33,240 Speaker 1: really exciting to me to get up and know in 68 00:04:33,279 --> 00:04:35,880 Speaker 1: a given day I can be talking to clients and 69 00:04:35,960 --> 00:04:39,200 Speaker 1: IBMers in, um, you know, almost any country 70 00:04:39,240 --> 00:04:42,000 Speaker 1: in the world. Yeah, Lisa, can I hear a little 71 00:04:42,040 --> 00:04:45,799 Speaker 1: bit about you? Sure? Absolutely so. I am a computer 72 00:04:45,920 --> 00:04:52,320 Speaker 1: engineer and mathematician by degree. I've reinvented myself several times 73 00:04:52,880 --> 00:04:57,280 Speaker 1: throughout my career in IBM. I started as a mainframe developer. 74 00:04:57,480 --> 00:05:01,240 Speaker 1: I've worked in corporate technical strategy, where I had the 75 00:05:01,279 --> 00:05:03,719 Speaker 1: privilege to meet and work with Dale.
As a matter 76 00:05:03,720 --> 00:05:06,240 Speaker 1: of fact, and one thing she did not tell you 77 00:05:06,279 --> 00:05:09,400 Speaker 1: about her professional background is that she's actually the first 78 00:05:09,760 --> 00:05:15,840 Speaker 1: Black female Distinguished Engineer in IBM. Uh, she saw something 79 00:05:15,880 --> 00:05:20,159 Speaker 1: in me, put her arm around me, mentored me, and 80 00:05:20,440 --> 00:05:23,640 Speaker 1: I'm very happy to say that with her support, I 81 00:05:23,720 --> 00:05:28,359 Speaker 1: became the second Black female Distinguished Engineer in IBM. But 82 00:05:28,440 --> 00:05:31,240 Speaker 1: as I mentioned, I've, I've reinvented myself several times. So, 83 00:05:31,320 --> 00:05:35,160 Speaker 1: like I said, starting in mainframe development, moving on to 84 00:05:35,279 --> 00:05:41,360 Speaker 1: corporate technical strategy, I've worked as an industry solutions architect, 85 00:05:41,680 --> 00:05:46,960 Speaker 1: um, in cloud for IBM. I've helped IBM with several 86 00:05:47,000 --> 00:05:52,400 Speaker 1: acquisitions including Ustream, Clearleap, the Weather Company, and helped 87 00:05:52,400 --> 00:05:59,400 Speaker 1: define a new business unit, um, around those acquisitions within IBM. 88 00:05:59,440 --> 00:06:04,080 Speaker 1: I've worked with teams to do DevOps transformations, um, to 89 00:06:04,920 --> 00:06:10,160 Speaker 1: adopt cloud technologies and architectures for legacy applications, you know.
90 00:06:10,279 --> 00:06:13,000 Speaker 1: Right now, I am on a team called our Cloud 91 00:06:13,000 --> 00:06:18,800 Speaker 1: Engagement Hub where we guide and advise IBM's top clients 92 00:06:18,839 --> 00:06:22,680 Speaker 1: on their journey to cloud, um, making sure that we're 93 00:06:22,720 --> 00:06:29,560 Speaker 1: able to help them rethink, re-engineer, reimagine their legacy 94 00:06:29,600 --> 00:06:33,760 Speaker 1: application portfolios, uh, to leverage and take advantage of the 95 00:06:33,880 --> 00:06:36,920 Speaker 1: values of some of the newer technologies that are out now. 96 00:06:37,400 --> 00:06:41,600 Speaker 1: And I focus primarily right now on mainframe modernization and 97 00:06:41,640 --> 00:06:45,680 Speaker 1: how we can bring a cloud native developer experience, uh, 98 00:06:45,720 --> 00:06:48,120 Speaker 1: to the mainframe. So it's really exciting to have 99 00:06:48,240 --> 00:06:52,080 Speaker 1: a job that fuses both my, my, you know, base 100 00:06:52,160 --> 00:06:56,400 Speaker 1: fundamental experience in IBM and all of the newer things 101 00:06:56,400 --> 00:07:00,880 Speaker 1: I've learned in some of my more recent roles. Wonderful, 102 00:07:01,080 --> 00:07:03,920 Speaker 1: and Brittany, please, I would love to hear about your 103 00:07:03,960 --> 00:07:07,200 Speaker 1: background as well. Sure, sure, I wish I could say 104 00:07:07,240 --> 00:07:10,680 Speaker 1: I was, I was the third, uh, Black female, but 105 00:07:11,600 --> 00:07:14,160 Speaker 1: not the case. But very, very honored, happy to be, 106 00:07:14,480 --> 00:07:16,280 Speaker 1: you know, to have worked and been in the company 107 00:07:16,360 --> 00:07:20,240 Speaker 1: of, of, of Dale and Lisa. Um, I'm a, I'm 108 00:07:20,280 --> 00:07:23,520 Speaker 1: an engineer by trade actually, and I 109 00:07:23,720 --> 00:07:29,040 Speaker 1: joined, um, IBM after getting my, uh, Tech MBA from 110 00:07:29,160 --> 00:07:33,800 Speaker 1: Johns Hopkins.
I came in through the IBM Summit, uh, 111 00:07:33,880 --> 00:07:39,160 Speaker 1: Sales, Sales Organization, um, so I was a client-facing 112 00:07:39,360 --> 00:07:43,440 Speaker 1: architect, um, for virtually most of my career. I started 113 00:07:43,600 --> 00:07:47,600 Speaker 1: with the CAMSS portfolio back when it was Cloud, 114 00:07:47,680 --> 00:07:50,920 Speaker 1: Analytics, Mobile, Social, Security, um, and then moved, 115 00:07:51,080 --> 00:07:55,520 Speaker 1: um, solely into the cloud organization for, um, the last 116 00:07:55,520 --> 00:08:00,360 Speaker 1: six years of my career. So I helped clients with 117 00:08:00,440 --> 00:08:03,840 Speaker 1: their cloud strategy, with adopting cloud, with being successful on 118 00:08:03,960 --> 00:08:07,320 Speaker 1: the cloud in their organizations, um. Lisa and I actually 119 00:08:07,320 --> 00:08:10,200 Speaker 1: did cross paths very briefly, um. I think it was 120 00:08:10,240 --> 00:08:12,520 Speaker 1: about five years ago when I was working on a 121 00:08:13,640 --> 00:08:17,480 Speaker 1: first of a kind project with IBM Research. Um, it 122 00:08:17,560 --> 00:08:22,600 Speaker 1: was a, it was a, a high-GPU streaming cloud 123 00:08:22,600 --> 00:08:25,760 Speaker 1: platform for, for, for various use cases in the 124 00:08:25,840 --> 00:08:31,280 Speaker 1: gaming and healthcare and some other industries. So yeah, I'm 125 00:08:31,280 --> 00:08:33,760 Speaker 1: just really excited and happy to be here. Um. But 126 00:08:33,840 --> 00:08:37,960 Speaker 1: the special thing about IBM as well is, this is 127 00:08:38,320 --> 00:08:41,760 Speaker 1: the flexibility and control that I've essentially had over, over, 128 00:08:41,840 --> 00:08:44,880 Speaker 1: over my career. I was able to, in previous roles, 129 00:08:45,360 --> 00:08:49,120 Speaker 1: kind of create my own roles while within IBM.
So, um, 130 00:08:49,120 --> 00:08:51,640 Speaker 1: that's, that's really what made me choose IBM over other, 131 00:08:51,760 --> 00:08:54,640 Speaker 1: other companies, was just being able to 132 00:08:54,679 --> 00:08:57,439 Speaker 1: be in the driver's seat of, of, of your experience. 133 00:08:57,520 --> 00:09:02,199 Speaker 1: So my next question is one I feel like 134 00:09:02,320 --> 00:09:06,320 Speaker 1: the importance is sort of evident on the face of it, 135 00:09:06,360 --> 00:09:09,119 Speaker 1: but I feel that this is one we need to address. 136 00:09:09,240 --> 00:09:13,240 Speaker 1: So in your own words, can you explain why, since 137 00:09:13,280 --> 00:09:15,840 Speaker 1: we're talking about Call for Code for Racial Justice, why 138 00:09:15,880 --> 00:09:19,719 Speaker 1: everyone should care about racial justice? Why is this an 139 00:09:19,720 --> 00:09:23,440 Speaker 1: issue for everybody? When you ask the question, why does 140 00:09:23,480 --> 00:09:26,600 Speaker 1: it matter to everyone? I think, I think the 141 00:09:26,640 --> 00:09:29,400 Speaker 1: first thing is to understand what we mean when we 142 00:09:29,480 --> 00:09:36,840 Speaker 1: say racial justice, um, and what I have learned and 143 00:09:36,880 --> 00:09:40,080 Speaker 1: read and believe is that racial justice is the systemic 144 00:09:40,640 --> 00:09:45,600 Speaker 1: and, um, systematically fair treatment of people of any and 145 00:09:45,679 --> 00:09:51,440 Speaker 1: all races that results in equal and equitable opportunities for 146 00:09:51,480 --> 00:09:55,840 Speaker 1: all people as well as outcomes for them.
And I 147 00:09:55,880 --> 00:09:59,080 Speaker 1: think if we are clear on that, then it is 148 00:09:59,080 --> 00:10:03,960 Speaker 1: not about Black versus white, or, um, other, you know, 149 00:10:04,120 --> 00:10:07,400 Speaker 1: any race, you know, versus another. But it is the 150 00:10:07,440 --> 00:10:12,120 Speaker 1: systemic fair treatment of all people. And if you think 151 00:10:12,160 --> 00:10:16,240 Speaker 1: of this as we're all humans living in a civilized society, 152 00:10:17,040 --> 00:10:22,760 Speaker 1: then it is incumbent on any human who wants to 153 00:10:22,840 --> 00:10:29,440 Speaker 1: preserve a civil, civilized society to understand that injustice for 154 00:10:29,640 --> 00:10:33,880 Speaker 1: any race takes a toll on the society. Um, it 155 00:10:34,000 --> 00:10:40,880 Speaker 1: leads to unrest. It affects, um, everything that we hold 156 00:10:41,080 --> 00:10:45,800 Speaker 1: dear as human beings, a right to life, to a 157 00:10:45,880 --> 00:10:50,960 Speaker 1: quality of life, and a right to good health, a 158 00:10:51,120 --> 00:10:55,760 Speaker 1: right to justice and fair treatment, a right to education, 159 00:10:55,840 --> 00:11:00,480 Speaker 1: a right to vote. And once you introduce, you know, 160 00:11:00,720 --> 00:11:06,640 Speaker 1: um, racism and racial bias into a system, you, you, 161 00:11:06,400 --> 00:11:11,800 Speaker 1: you create an imbalance that really damages the fabric of 162 00:11:11,880 --> 00:11:19,880 Speaker 1: how that entire society functions and creates, uh, an 163 00:11:20,000 --> 00:11:26,040 Speaker 1: environment where it is not possible for anyone who is 164 00:11:26,080 --> 00:11:29,960 Speaker 1: a civilized human being to function, um, the way that 165 00:11:30,040 --> 00:11:33,480 Speaker 1: they need to.
And, and, and there's one other thing, right. 166 00:11:33,520 --> 00:11:40,319 Speaker 1: It's not just a matter of not discriminating and not, um, 167 00:11:40,400 --> 00:11:45,600 Speaker 1: you know, promoting inequities. It's also being deliberate and ensuring 168 00:11:46,320 --> 00:11:51,800 Speaker 1: that we support others and each other as human beings 169 00:11:52,360 --> 00:11:58,960 Speaker 1: to achieve a sustained level of racial equity, um, to 170 00:11:59,160 --> 00:12:03,880 Speaker 1: achieve the rights that a civilized society expects, um, as 171 00:12:03,880 --> 00:12:06,440 Speaker 1: a human being. And when we get rid of that, 172 00:12:07,160 --> 00:12:11,160 Speaker 1: when we devalue or we don't value each other as 173 00:12:11,240 --> 00:12:17,280 Speaker 1: human beings equally, we create an instability that has a 174 00:12:17,320 --> 00:12:21,680 Speaker 1: long-lasting impact on our health, our safety, our well 175 00:12:21,760 --> 00:12:25,160 Speaker 1: being as human beings. That's my view. Well, I think 176 00:12:25,240 --> 00:12:28,880 Speaker 1: that's a good segue for us to talk about Call 177 00:12:29,000 --> 00:12:31,600 Speaker 1: for Code for Racial Justice. Can you tell us a 178 00:12:31,600 --> 00:12:35,760 Speaker 1: little bit about what that is and how it even, 179 00:12:35,800 --> 00:12:37,679 Speaker 1: how it even came to be, and how each of 180 00:12:37,720 --> 00:12:42,520 Speaker 1: you became involved with this project. So, um, with the 181 00:12:42,880 --> 00:12:46,640 Speaker 1: death of George Floyd and the protests, which again came 182 00:12:46,679 --> 00:12:50,960 Speaker 1: about via video, um, that's when the conversations really, you know, 183 00:12:51,080 --> 00:12:54,280 Speaker 1: started to come, to come into the workplace, you know, finally, 184 00:12:54,320 --> 00:12:59,120 Speaker 1: after, after many years.
And so the way I got 185 00:12:59,120 --> 00:13:02,800 Speaker 1: involved in, in, in the project, that, that first week, um, 186 00:13:02,880 --> 00:13:05,400 Speaker 1: we, you know, we had a lot of town hall, 187 00:13:05,720 --> 00:13:09,760 Speaker 1: town hall discussions, ask-me-anythings, um, just various 188 00:13:09,800 --> 00:13:14,800 Speaker 1: discussions across IBM with the Black community, allied communities, um, 189 00:13:14,840 --> 00:13:17,520 Speaker 1: not just in the US, but, but literally across the world. 190 00:13:17,760 --> 00:13:21,480 Speaker 1: And, um, and, and as part of those discussions, um, 191 00:13:21,520 --> 00:13:25,280 Speaker 1: you know, the Black community asked for, you know, a 192 00:13:25,280 --> 00:13:28,960 Speaker 1: call for code, you know, an initiative to apply technology. 193 00:13:28,960 --> 00:13:32,120 Speaker 1: Can, can we, you know, put our heads together, our 194 00:13:32,160 --> 00:13:36,760 Speaker 1: expertise together to, to, to, um, to apply technology to 195 00:13:36,800 --> 00:13:41,280 Speaker 1: this issue. So, um, [name unclear], who 196 00:13:41,360 --> 00:13:46,000 Speaker 1: is also, um, one of our leaders for this initiative, um, 197 00:13:46,040 --> 00:13:48,000 Speaker 1: I worked with him on, on the Cloud Advisor, 198 00:13:48,559 --> 00:13:53,720 Speaker 1: um, team. So he called myself, Lisa, Dale, um, and 199 00:13:53,800 --> 00:13:56,400 Speaker 1: asked us to be, you know, a part of this 200 00:13:56,480 --> 00:13:59,040 Speaker 1: initiative and just, and to build it, and to build 201 00:13:59,080 --> 00:14:02,320 Speaker 1: it into what, you know, the program is today. So 202 00:14:02,920 --> 00:14:05,040 Speaker 1: when he called, it was a no, it was 203 00:14:05,040 --> 00:14:06,800 Speaker 1: a no-brainer. There wasn't even a thought. It 204 00:14:06,840 --> 00:14:09,480 Speaker 1: was like, of course, this is a lived experience.
We're 205 00:14:09,480 --> 00:14:11,800 Speaker 1: Black in the office, we're Black outside of it. So 206 00:14:12,440 --> 00:14:14,679 Speaker 1: it was something we couldn't, we, we couldn't pass up. 207 00:14:15,360 --> 00:14:18,440 Speaker 1: You know, Brittany's right, right? I feel there, there was 208 00:14:18,480 --> 00:14:25,320 Speaker 1: a sense of anger, of pain, of frustration, of just, 209 00:14:25,320 --> 00:14:29,080 Speaker 1: just so much emotion, right, from the Black IBM community 210 00:14:29,600 --> 00:14:34,600 Speaker 1: in the wake of recent events George Floyd, Breonna Taylor, 211 00:14:34,920 --> 00:14:39,680 Speaker 1: countless others. Things kind of came to a head and 212 00:14:39,720 --> 00:14:43,880 Speaker 1: the Black community spoke up and IBM listened, you know. 213 00:14:44,000 --> 00:14:47,360 Speaker 1: And, and I have to stop and say that I'm 214 00:14:47,400 --> 00:14:52,920 Speaker 1: incredibly proud of what IBM's response has been. Within two 215 00:14:53,000 --> 00:15:03,160 Speaker 1: weeks of George Floyd's murder, IBM was taking action and responding. 216 00:15:03,640 --> 00:15:08,800 Speaker 1: They were listening. There were a series of town halls, 217 00:15:09,000 --> 00:15:16,960 Speaker 1: of small round tables, fireside chats, discussions, in, in team meetings, 218 00:15:17,000 --> 00:15:23,880 Speaker 1: in Slack and email. IBM wanted to understand. And the 219 00:15:24,000 --> 00:15:30,120 Speaker 1: Black community spoke up and IBM heard the pain. IBM 220 00:15:30,200 --> 00:15:35,359 Speaker 1: listened to the frustrations. We took all of that information, 221 00:15:35,520 --> 00:15:39,280 Speaker 1: all of that feedback, and we synthesized it.
And when 222 00:15:39,280 --> 00:15:42,080 Speaker 1: I say we, I mean a group of Black IBM- 223 00:15:42,160 --> 00:15:46,880 Speaker 1: ers, uh, in concert, uh, you know, with the, you know, 224 00:15:47,000 --> 00:15:50,880 Speaker 1: senior executive sponsor of the Black community in IBM. We 225 00:15:51,040 --> 00:15:55,920 Speaker 1: worked together. We synthesized it. We used design thinking to 226 00:15:56,080 --> 00:15:59,640 Speaker 1: really understand what are the common themes, the common threads, 227 00:16:00,560 --> 00:16:04,640 Speaker 1: and we found some, right? I think we found maybe, 228 00:16:04,720 --> 00:16:06,960 Speaker 1: you know, Brittany can keep me honest here, she's been 229 00:16:07,000 --> 00:16:10,000 Speaker 1: in this since the start with me. Um, you know, 230 00:16:10,160 --> 00:16:16,280 Speaker 1: several themes, maybe ten, fifteen different common central themes around 231 00:16:17,000 --> 00:16:23,360 Speaker 1: systemic racism in our society and how we wanted to 232 00:16:23,400 --> 00:16:28,040 Speaker 1: combat those. Uh, we distilled them down. We put them 233 00:16:28,080 --> 00:16:31,960 Speaker 1: to a vote to the Black community in IBM and said, look, 234 00:16:32,000 --> 00:16:34,760 Speaker 1: we listened, we heard. Let us play back to you 235 00:16:34,880 --> 00:16:37,960 Speaker 1: what we heard. Are these the right things? Are these 236 00:16:38,080 --> 00:16:42,520 Speaker 1: the, the activities, the initiatives that matter to you? Are 237 00:16:42,560 --> 00:16:45,520 Speaker 1: these the problems you want to see us as an 238 00:16:45,560 --> 00:16:49,800 Speaker 1: IBM community as a whole tackle? And we put them 239 00:16:49,800 --> 00:16:52,280 Speaker 1: to a vote because we know that we can't do 240 00:16:52,560 --> 00:16:55,440 Speaker 1: everything as much as we might want to.
But if 241 00:16:55,480 --> 00:16:59,480 Speaker 1: we bring focus to a few things, then we can 242 00:16:59,600 --> 00:17:03,080 Speaker 1: move the needle. We can apply our technology, our best 243 00:17:03,120 --> 00:17:06,560 Speaker 1: and brightest, to try to solve some subset of the 244 00:17:06,600 --> 00:17:10,320 Speaker 1: problems that we face as a society. Uh, we put 245 00:17:10,359 --> 00:17:13,320 Speaker 1: them to a vote. They voted, and those are the 246 00:17:13,359 --> 00:17:17,920 Speaker 1: three themes that we have, uh, externalized as part of 247 00:17:18,000 --> 00:17:23,480 Speaker 1: Call for Code for Racial Justice: diverse representation, uh, police 248 00:17:24,640 --> 00:17:31,639 Speaker 1: and judicial reform, and policy and legislation reform. Dale, you 249 00:17:31,680 --> 00:17:36,520 Speaker 1: want to share yours? As Lisa and Brittany shared, 250 00:17:36,600 --> 00:17:41,600 Speaker 1: [name unclear], um, IBM Fellow, pulled me into our Call 251 00:17:41,720 --> 00:17:46,000 Speaker 1: for Code Challenge team and I started to say, wow, 252 00:17:46,400 --> 00:17:49,560 Speaker 1: I have, you know, this other project. I'm running this 253 00:17:49,640 --> 00:17:55,520 Speaker 1: other initiative, and it's, it's broader than just IBM. And 254 00:17:55,560 --> 00:17:58,680 Speaker 1: then I had a really hard talking-to with myself 255 00:17:58,760 --> 00:18:04,000 Speaker 1: and said, IBM is listening, as Lisa said, right, um, 256 00:18:04,040 --> 00:18:08,359 Speaker 1: we have IBM's attention. The world is looking for help, 257 00:18:08,560 --> 00:18:11,840 Speaker 1: the US is looking for help. I could 258 00:18:12,000 --> 00:18:16,080 Speaker 1: choose to lose sleep over the fear I'm feeling, the 259 00:18:16,119 --> 00:18:21,440 Speaker 1: frustration I'm feeling, or I can channel my energy, right, 260 00:18:21,720 --> 00:18:26,240 Speaker 1: um, and my waking hours to doing something.
So I said, okay, 261 00:18:26,320 --> 00:18:29,040 Speaker 1: I'm gonna jump in and I'm going to help. And 262 00:18:29,400 --> 00:18:34,680 Speaker 1: it worked out that he had some challenges that took 263 00:18:34,760 --> 00:18:37,200 Speaker 1: him away from this right at the end of August, 264 00:18:37,720 --> 00:18:41,600 Speaker 1: and I ended up having to manage both of these. 265 00:18:41,640 --> 00:18:45,480 Speaker 1: But Jonathan, I will tell you this, right? I said 266 00:18:45,520 --> 00:18:47,440 Speaker 1: at the beginning that one of the best things in 267 00:18:47,560 --> 00:18:50,280 Speaker 1: IBM is waking up every morning and knowing I can 268 00:18:50,320 --> 00:18:54,920 Speaker 1: talk to people in many countries of the world, and clients. 269 00:18:54,960 --> 00:18:57,040 Speaker 1: But one of the things I didn't say is that 270 00:18:57,160 --> 00:19:02,000 Speaker 1: the people in IBM from all walks of life came together 271 00:19:02,760 --> 00:19:07,840 Speaker 1: to help me and him and Lisa with both 272 00:19:07,920 --> 00:19:10,760 Speaker 1: of those initiatives. Right, I have a team of sixty 273 00:19:10,840 --> 00:19:14,560 Speaker 1: people who have been working with me on the 274 00:19:14,680 --> 00:19:20,159 Speaker 1: IT Language initiative, and the passion and the drive is 275 00:19:21,040 --> 00:19:26,159 Speaker 1: unbelievable that they bring to the table, all volunteering to help. 276 00:19:26,720 --> 00:19:30,600 Speaker 1: We have more than five hundred people from IBM plus 277 00:19:30,680 --> 00:19:34,640 Speaker 1: Red Hat who have come together to work on the 278 00:19:34,760 --> 00:19:38,879 Speaker 1: Call for Code initiative. And for me, it was cathartic. 279 00:19:38,920 --> 00:19:41,879 Speaker 1: It felt like I was able to do something. I, 280 00:19:42,359 --> 00:19:44,680 Speaker 1: I was.
I was going around, I would almost say, 281 00:19:44,680 --> 00:19:47,800 Speaker 1: in a daze, and I turned, I turned 282 00:19:47,800 --> 00:19:51,120 Speaker 1: my emotions on and off for survival. And we got 283 00:19:51,200 --> 00:19:54,760 Speaker 1: people from, you know, all ages, right, um, all different 284 00:19:54,800 --> 00:19:57,440 Speaker 1: parts of the company, all different kinds of skills coming 285 00:19:57,480 --> 00:20:01,600 Speaker 1: together to build this out. I think it was cathartic for 286 00:20:01,640 --> 00:20:04,920 Speaker 1: all of us, and it helped us to channel what 287 00:20:04,960 --> 00:20:11,840 Speaker 1: we're doing, our frustration, our numbness, our exhaustion, um, 288 00:20:11,880 --> 00:20:14,840 Speaker 1: and our fear. But I will say for me, Call 289 00:20:14,960 --> 00:20:19,200 Speaker 1: for Code really helped me to feel that I could 290 00:20:19,280 --> 00:20:23,280 Speaker 1: contribute in a tangible way and bring others along. And 291 00:20:23,359 --> 00:20:27,200 Speaker 1: everyone who's been working with us on this has been 292 00:20:28,280 --> 00:20:34,680 Speaker 1: really touched by the power of our technology, the support 293 00:20:34,720 --> 00:20:38,520 Speaker 1: of our leadership, Bob Lord, Arvind Krishna, our CEO, to 294 00:20:38,640 --> 00:20:41,480 Speaker 1: create a space for us to make this happen.
And 295 00:20:41,520 --> 00:20:43,200 Speaker 1: I know it's a long answer, but I really want 296 00:20:43,200 --> 00:20:46,440 Speaker 1: you to understand, you know, what, what it meant 297 00:20:46,480 --> 00:20:50,080 Speaker 1: for me, right, because I was shutting down, literally shutting 298 00:20:50,080 --> 00:20:53,240 Speaker 1: down and operating in, you know, sort of an automatic 299 00:20:53,320 --> 00:20:57,520 Speaker 1: mode just for, you know, my, my personal and mental 300 00:20:58,000 --> 00:21:02,560 Speaker 1: and emotional survival, and Call for Code for Racial Justice 301 00:21:02,720 --> 00:21:06,800 Speaker 1: allowed me to feel and then channel those feelings into 302 00:21:06,880 --> 00:21:14,000 Speaker 1: helping the team. I'm really, really incredibly impressed with what 303 00:21:14,160 --> 00:21:17,840 Speaker 1: I'm hearing, not just by, you know, knowing that this 304 00:21:17,880 --> 00:21:21,360 Speaker 1: is such a critical topic in the first place. I mean, 305 00:21:21,359 --> 00:21:26,760 Speaker 1: the fact that we have IBMers dedicating precious 306 00:21:26,840 --> 00:21:31,360 Speaker 1: time to really tackling this, that really speaks to how 307 00:21:31,400 --> 00:21:34,040 Speaker 1: critical an issue it is. It also speaks highly of 308 00:21:34,119 --> 00:21:36,879 Speaker 1: the character of you and all the other IBM- 309 00:21:36,960 --> 00:21:40,720 Speaker 1: ers who have, uh, participated in the Call for 310 00:21:40,800 --> 00:21:46,480 Speaker 1: Code for Racial Justice, and I'm, I'm humbled again to 311 00:21:46,720 --> 00:21:50,720 Speaker 1: be part of this conversation. Can you talk a little 312 00:21:50,720 --> 00:21:53,399 Speaker 1: bit about some of the actual work that you have 313 00:21:53,520 --> 00:21:57,040 Speaker 1: created with Call for Code for Racial Justice so far? Sure.
314 00:21:57,119 --> 00:21:59,159 Speaker 1: So as Lisa mentioned in the beginning of 315 00:21:59,160 --> 00:22:03,040 Speaker 1: the podcast, we synthesized, um, you know, input from the 316 00:22:03,040 --> 00:22:06,600 Speaker 1: Black community down to three themes, and they're police and 317 00:22:06,680 --> 00:22:11,959 Speaker 1: judicial reform and accountability, policy and legislative reform, and, um, 318 00:22:12,040 --> 00:22:16,760 Speaker 1: diverse representation. So for police and judicial reform, we're looking 319 00:22:16,760 --> 00:22:20,040 Speaker 1: at how can we use technology, um, to better analyze 320 00:22:20,080 --> 00:22:24,800 Speaker 1: broad data, provide insights, and make recommendations that will drive 321 00:22:25,280 --> 00:22:30,280 Speaker 1: racial equality and reform across criminal justice and public safety. 322 00:22:30,760 --> 00:22:34,840 Speaker 1: For policy and legislation reform, the question is how 323 00:22:34,840 --> 00:22:39,760 Speaker 1: can we use technology to analyze, inform, and develop policy 324 00:22:40,240 --> 00:22:45,520 Speaker 1: to reform the workplace, products, public safety, legislation, and 325 00:22:46,000 --> 00:22:49,639 Speaker 1: society at large. And then for diverse representation, we're looking 326 00:22:49,680 --> 00:22:54,000 Speaker 1: at the prevention, detection, and the remediation of bias and 327 00:22:54,080 --> 00:22:58,520 Speaker 1: misrepresentation in workplace, um, products, in society. And like we've 328 00:22:58,520 --> 00:23:01,159 Speaker 1: been saying before, you know, for corporations and for the, 329 00:23:01,280 --> 00:23:03,760 Speaker 1: you know, for the world to succeed, it's critical 330 00:23:04,200 --> 00:23:08,280 Speaker 1: to have the right representation at every level. Um. So within 331 00:23:08,400 --> 00:23:13,719 Speaker 1: the Police and Judicial Reform theme, we're gonna release 332 00:23:14,080 --> 00:23:18,520 Speaker 1: a couple of solutions.
One provides capabilities for the contribution, 333 00:23:19,200 --> 00:23:23,040 Speaker 1: management, and analysis of categorized, trusted information about 334 00:23:23,080 --> 00:23:27,000 Speaker 1: incidents for both the police and their stakeholders. So for 335 00:23:27,160 --> 00:23:30,919 Speaker 1: Police and Judicial Reform and Accountability, one solution is 336 00:23:30,920 --> 00:23:33,840 Speaker 1: a web app that allows a defense attorney or public 337 00:23:33,920 --> 00:23:38,760 Speaker 1: defender to upload information about a case and their defendant, 338 00:23:39,160 --> 00:23:42,080 Speaker 1: and we'll send this data to a bias detection engine, 339 00:23:42,400 --> 00:23:45,080 Speaker 1: and using the results from that, we can create a 340 00:23:45,160 --> 00:23:49,240 Speaker 1: report outlining the possible charges, the range of possible sentences 341 00:23:49,320 --> 00:23:54,960 Speaker 1: per charge, evidence of past bias, and de-biased recommendations 342 00:23:54,960 --> 00:23:59,919 Speaker 1: for plea bargains and sentences. So again, how can we transform 343 00:24:00,080 --> 00:24:03,600 Speaker 1: the process when someone is going to trial? We 344 00:24:03,680 --> 00:24:08,200 Speaker 1: also have another solution that predicts how likely it is 345 00:24:08,280 --> 00:24:12,000 Speaker 1: that a charge or sentence would have been different if the 346 00:24:12,040 --> 00:24:16,240 Speaker 1: convicted person was of a different race. A third solution for 347 00:24:16,840 --> 00:24:20,679 Speaker 1: Police and Judicial Reform is a content management application that 348 00:24:20,720 --> 00:24:25,800 Speaker 1: will allow civilians to contribute statements and evidence to 349 00:24:25,920 --> 00:24:29,800 Speaker 1: police incident reports and create a tamper-proof record with 350 00:24:29,880 --> 00:24:34,000 Speaker 1: all accounts of the incident.
For policy and legislative reform, 351 00:24:34,280 --> 00:24:37,800 Speaker 1: we have a web application to enable and empower Black 352 00:24:37,880 --> 00:24:40,960 Speaker 1: people to exercise their right to vote by ensuring that 353 00:24:41,000 --> 00:24:43,200 Speaker 1: their voice is heard. It's a virtual one-stop 354 00:24:43,240 --> 00:24:46,560 Speaker 1: shop for Black voters to assist them with all 355 00:24:46,560 --> 00:24:50,199 Speaker 1: of their voting needs. Another solution for policy and 356 00:24:50,280 --> 00:24:54,120 Speaker 1: legislative reform is a web-based application to help advocates 357 00:24:54,119 --> 00:24:57,840 Speaker 1: find legislation of interest based on the advocate's preferences for 358 00:24:58,040 --> 00:25:03,639 Speaker 1: impact areas and geographical location. And finally, we have 359 00:25:03,720 --> 00:25:09,360 Speaker 1: a platform capable of storing curated legal information as 360 00:25:09,359 --> 00:25:12,560 Speaker 1: determined by the community. It provides a mobile-friendly 361 00:25:12,600 --> 00:25:16,240 Speaker 1: way for users to examine that information, increasing their legal 362 00:25:16,280 --> 00:25:19,680 Speaker 1: awareness, and to allow them to communicate their reactions 363 00:25:19,680 --> 00:25:23,199 Speaker 1: and thoughts via the recording of video testimonials to be 364 00:25:23,240 --> 00:25:25,840 Speaker 1: shared with the community and the people responsible for the 365 00:25:26,040 --> 00:25:29,439 Speaker 1: creation of that legislation. And for diverse representation, I'll turn it 366 00:25:29,440 --> 00:25:31,040 Speaker 1: over to them to talk about the work that we're 367 00:25:31,040 --> 00:25:37,520 Speaker 1: doing there. So diverse representation is a theme that 368 00:25:37,600 --> 00:25:41,040 Speaker 1: really cuts across all three themes. And if 369 00:25:41,080 --> 00:25:45,480 Speaker 1: you think about racial injustice and bias,
if you 370 00:25:46,080 --> 00:25:54,320 Speaker 1: don't have a society that is diverse, 371 00:25:54,520 --> 00:25:59,560 Speaker 1: and you don't have a way of 372 00:26:00,560 --> 00:26:07,160 Speaker 1: eliminating both explicit and implicit bias from your society, then 373 00:26:07,160 --> 00:26:10,280 Speaker 1: you run into some of the challenges we see, where 374 00:26:10,359 --> 00:26:18,000 Speaker 1: policy and legislative reform, or police reform 375 00:26:18,040 --> 00:26:23,840 Speaker 1: and accountability, isn't performed in an equitable way. So 376 00:26:23,960 --> 00:26:27,800 Speaker 1: for the diverse representation solution, what 377 00:26:27,840 --> 00:26:32,600 Speaker 1: we have really focused on is providing a means for 378 00:26:32,720 --> 00:26:43,800 Speaker 1: us to detect bias in content that promotes negative stereotypes 379 00:26:44,160 --> 00:26:51,680 Speaker 1: about the Black community, that promotes, in both an 380 00:26:51,680 --> 00:27:01,320 Speaker 1: overt and a subtle way, racist language and themes 381 00:27:01,320 --> 00:27:09,800 Speaker 1: that continue to reinforce the perception of one race, in 382 00:27:09,840 --> 00:27:15,280 Speaker 1: this case Black people, as lesser than or of less value than 383 00:27:16,280 --> 00:27:21,639 Speaker 1: another race or other races. And also to provide a 384 00:27:21,680 --> 00:27:27,960 Speaker 1: way for the community to come together to work on 385 00:27:28,280 --> 00:27:35,040 Speaker 1: this issue, which is in many ways a lot 386 00:27:35,480 --> 00:27:40,600 Speaker 1: more difficult to tackle than overt racism, 387 00:27:40,760 --> 00:27:47,639 Speaker 1: because it gets into the language we've accepted as 388 00:27:48,600 --> 00:27:52,439 Speaker 1: part of the norm for a very long time.
Can 389 00:27:52,600 --> 00:27:56,000 Speaker 1: people get involved and build upon the work that you 390 00:27:56,560 --> 00:27:59,879 Speaker 1: are doing with Call for Code for Racial Justice? So 391 00:28:00,720 --> 00:28:06,680 Speaker 1: we will be announcing the solutions next week. We 392 00:28:06,760 --> 00:28:11,240 Speaker 1: have a Call for Code for Racial Justice site 393 00:28:11,840 --> 00:28:16,240 Speaker 1: where, if you're a developer, you can register, you can 394 00:28:16,280 --> 00:28:21,199 Speaker 1: get involved, you can start contributing code and ideas. 395 00:28:22,000 --> 00:28:27,320 Speaker 1: If you are an NGO or a private sector company 396 00:28:27,480 --> 00:28:34,240 Speaker 1: or another interested community entity, like a community government, you can 397 00:28:34,960 --> 00:28:39,720 Speaker 1: engage and share that you are interested in partnering 398 00:28:39,760 --> 00:28:42,680 Speaker 1: with us, maybe as a tester, maybe as an end user, 399 00:28:42,720 --> 00:28:45,560 Speaker 1: a deployer, or someone who wants to take what we're 400 00:28:45,560 --> 00:28:48,880 Speaker 1: doing and build a community around it to implement it with 401 00:28:49,040 --> 00:28:52,560 Speaker 1: their own developers. So we have created a site. These 402 00:28:52,560 --> 00:28:56,680 Speaker 1: solutions will be featured on our websites 403 00:28:56,960 --> 00:29:01,280 Speaker 1: under IBM Developer, Call for Code for Racial Justice, 404 00:29:01,960 --> 00:29:05,120 Speaker 1: and we will be rolling these solutions out, and 405 00:29:05,280 --> 00:29:10,520 Speaker 1: any developer or ecosystem partner can join 406 00:29:10,680 --> 00:29:14,719 Speaker 1: us to be part of the journey of taking 407 00:29:14,760 --> 00:29:18,800 Speaker 1: these solutions and bringing them to our communities in America 408 00:29:18,840 --> 00:29:21,760 Speaker 1: and the world.
And all they have to do is 409 00:29:21,800 --> 00:29:24,520 Speaker 1: click on a link and sign up. And if they'd 410 00:29:24,560 --> 00:29:27,520 Speaker 1: like to learn about it at All Things Open, 411 00:29:27,680 --> 00:29:34,800 Speaker 1: in the inclusion and diversity 412 00:29:35,120 --> 00:29:39,880 Speaker 1: tracks, they can hear us talk 413 00:29:39,920 --> 00:29:44,320 Speaker 1: about those solutions in a number of sessions. Lisa 414 00:29:44,520 --> 00:29:46,080 Speaker 1: or Brittany, I don't know if there's anything else 415 00:29:46,120 --> 00:29:48,400 Speaker 1: you wanted to add. Yeah, I just wanted to add 416 00:29:48,440 --> 00:29:50,960 Speaker 1: one thing, which is partly why it's so 417 00:29:51,040 --> 00:29:55,760 Speaker 1: important that we've open sourced these starter kits, these solutions, 418 00:29:55,880 --> 00:29:59,280 Speaker 1: and also how people can get involved. Yes, this 419 00:29:59,400 --> 00:30:02,520 Speaker 1: is technology, but you don't just have to be 420 00:30:02,600 --> 00:30:06,560 Speaker 1: a developer. I think the net of it 421 00:30:06,640 --> 00:30:09,360 Speaker 1: really comes down to the fact that, as I 422 00:30:09,360 --> 00:30:12,360 Speaker 1: think I mentioned before, we want to 423 00:30:12,400 --> 00:30:16,640 Speaker 1: address the community, and we want to apply technical 424 00:30:17,200 --> 00:30:20,160 Speaker 1: know-how as well. And so that's really the 425 00:30:20,680 --> 00:30:25,120 Speaker 1: premise around open sourcing these: we 426 00:30:25,240 --> 00:30:31,080 Speaker 1: recognize that there is a diverse representation issue in the 427 00:30:31,120 --> 00:30:35,959 Speaker 1: tech industry.
And I think by open sourcing these, 428 00:30:36,120 --> 00:30:40,200 Speaker 1: it allows us to open the aperture and get a broader 429 00:30:41,000 --> 00:30:46,000 Speaker 1: set of contributors working with us, whether it's developers, 430 00:30:46,080 --> 00:30:48,760 Speaker 1: whether it's people in the community who want to make 431 00:30:48,800 --> 00:30:52,280 Speaker 1: a difference, who have an idea or see a problem 432 00:30:52,320 --> 00:30:55,480 Speaker 1: that they think technology can help solve. And so I think 433 00:30:55,560 --> 00:30:59,280 Speaker 1: open sourcing this really helps us to bring 434 00:30:59,480 --> 00:31:04,000 Speaker 1: all different types of skill sets, all different types 435 00:31:04,080 --> 00:31:10,720 Speaker 1: of people together to really form a community. And 436 00:31:10,800 --> 00:31:13,400 Speaker 1: for folks to get involved, please don't think 437 00:31:13,400 --> 00:31:15,560 Speaker 1: that you have to be a developer, that you have 438 00:31:15,760 --> 00:31:18,960 Speaker 1: to be a coder. There's room for everyone 439 00:31:19,840 --> 00:31:24,080 Speaker 1: and all types of skills to participate in 440 00:31:24,080 --> 00:31:28,800 Speaker 1: Call for Code for Racial Justice. It's 441 00:31:28,800 --> 00:31:31,160 Speaker 1: hard for me to put into words how inspirational I 442 00:31:31,200 --> 00:31:34,640 Speaker 1: have found this conversation. You look at a problem as 443 00:31:34,800 --> 00:31:41,720 Speaker 1: enormous as addressing racial injustice, and I think 444 00:31:41,760 --> 00:31:44,800 Speaker 1: for a lot of people, the first reaction is: this 445 00:31:44,840 --> 00:31:48,560 Speaker 1: problem is too big for me to do anything that 446 00:31:48,760 --> 00:31:52,000 Speaker 1: will make any difference.
And I think that this initiative 447 00:31:52,200 --> 00:31:55,760 Speaker 1: proves that wrong, and it gives people that opportunity and 448 00:31:55,800 --> 00:31:58,000 Speaker 1: that hope that I think a lot of people are 449 00:31:58,040 --> 00:32:01,800 Speaker 1: really searching for right now. So, from the bottom of 450 00:32:01,840 --> 00:32:04,840 Speaker 1: my heart, thank you all so much, not just for 451 00:32:04,960 --> 00:32:07,880 Speaker 1: being on the show, but for doing this incredible work 452 00:32:08,400 --> 00:32:11,760 Speaker 1: and for opening up the doors for other people 453 00:32:12,120 --> 00:32:14,480 Speaker 1: to be a part of it and to make the 454 00:32:14,520 --> 00:32:17,720 Speaker 1: world a better place. It is a truly phenomenal story, 455 00:32:17,800 --> 00:32:19,760 Speaker 1: and I am so thankful that I get to be 456 00:32:19,920 --> 00:32:22,160 Speaker 1: one of the people to tell it. No, thank you 457 00:32:22,240 --> 00:32:25,800 Speaker 1: so much for having us. Thank you, Jonathan. We really 458 00:32:25,800 --> 00:32:31,320 Speaker 1: appreciate this opportunity. Yes, thank you. I have covered the 459 00:32:31,360 --> 00:32:35,200 Speaker 1: tech industry for more than ten years. In that time, 460 00:32:35,320 --> 00:32:41,120 Speaker 1: I've talked about topics ranging from the inconsequential to critically important issues. 461 00:32:41,760 --> 00:32:46,840 Speaker 1: Racial justice clearly belongs in that second category, and I 462 00:32:46,880 --> 00:32:50,280 Speaker 1: mean it when I say it is a problem so large, 463 00:32:50,760 --> 00:32:55,600 Speaker 1: so deeply ingrained in the systems that run our society, 464 00:32:55,600 --> 00:32:57,760 Speaker 1: that it was really hard for me to get a 465 00:32:57,800 --> 00:33:01,800 Speaker 1: feel for what the average person like myself could do 466 00:33:02,040 --> 00:33:05,240 Speaker 1: in response to it.
The Call for Code for Racial 467 00:33:05,280 --> 00:33:09,520 Speaker 1: Justice is a great example of how anyone, whether they 468 00:33:09,560 --> 00:33:12,920 Speaker 1: are a developer or not, can get involved to make 469 00:33:13,040 --> 00:33:16,920 Speaker 1: real-world change happen. The work that comes out of 470 00:33:16,960 --> 00:33:20,600 Speaker 1: this project will help address wrongs that have been part 471 00:33:20,640 --> 00:33:24,160 Speaker 1: of our world for far too long, and in the end, 472 00:33:24,720 --> 00:33:30,880 Speaker 1: righting those wrongs will help everyone. Systemic racism really does 473 00:33:31,000 --> 00:33:34,920 Speaker 1: harm everyone within a society, and when there is real 474 00:33:35,120 --> 00:33:39,640 Speaker 1: justice that is equitable across the board, we all benefit. 475 00:33:40,760 --> 00:33:45,880 Speaker 1: To get involved, please visit developer dot IBM dot com 476 00:33:45,880 --> 00:33:52,200 Speaker 1: slash call for code slash racial dash justice. You don't 477 00:33:52,200 --> 00:33:54,840 Speaker 1: need to be a developer to be a part of this. 478 00:33:55,760 --> 00:33:58,680 Speaker 1: You just need to have the desire to make the 479 00:33:58,720 --> 00:34:02,720 Speaker 1: world a more just and a better place. That's all 480 00:34:02,760 --> 00:34:06,120 Speaker 1: for today. Thank you for listening. I'll talk to you 481 00:34:06,200 --> 00:34:14,880 Speaker 1: again really soon. TechStuff is an iHeartRadio production. 482 00:34:15,120 --> 00:34:17,920 Speaker 1: For more podcasts from iHeartRadio, visit the 483 00:34:18,040 --> 00:34:21,279 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 484 00:34:21,320 --> 00:34:22,240 Speaker 1: your favorite shows.