1 00:00:04,880 --> 00:00:13,039 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Today, we 2 00:00:13,080 --> 00:00:16,040 Speaker 1: are witness to one of those rare moments in history, 3 00:00:16,440 --> 00:00:19,639 Speaker 1: the rise of an innovative technology with the potential to 4 00:00:19,760 --> 00:00:24,520 Speaker 1: radically transform business and society forever. That technology, of course, 5 00:00:25,000 --> 00:00:28,560 Speaker 1: is artificial intelligence, and it's the central focus for this 6 00:00:28,720 --> 00:00:32,720 Speaker 1: new season of Smart Talks with IBM. Join hosts from 7 00:00:32,760 --> 00:00:36,479 Speaker 1: your favorite Pushkin podcasts as they talk with industry experts 8 00:00:36,479 --> 00:00:40,080 Speaker 1: and leaders to explore how businesses can integrate AI into 9 00:00:40,120 --> 00:00:43,440 Speaker 1: their workflows and help drive real change in this new 10 00:00:43,560 --> 00:00:47,239 Speaker 1: era of AI, and of course, host Malcolm Gladwell will 11 00:00:47,240 --> 00:00:49,560 Speaker 1: be there to guide you through the season and throw 12 00:00:49,640 --> 00:00:52,559 Speaker 1: in his two cents as well. Look out for new 13 00:00:52,560 --> 00:00:55,440 Speaker 1: episodes of Smart Talks with IBM every other week on 14 00:00:55,520 --> 00:00:59,720 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, 15 00:01:00,040 --> 00:01:04,640 Speaker 1: and learn more at IBM dot com slash smart Talks. 16 00:01:07,400 --> 00:01:11,039 Speaker 2: Hello, Hello, Welcome to Smart Talks with IBM, a podcast 17 00:01:11,040 --> 00:01:17,080 Speaker 2: from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell.
This season, 18 00:01:17,080 --> 00:01:20,160 Speaker 2: we're diving back into the world of artificial intelligence, but 19 00:01:20,240 --> 00:01:26,360 Speaker 2: with a focus on the powerful concept of open: its possibilities, implications, 20 00:01:26,400 --> 00:01:29,720 Speaker 2: and misconceptions. We'll look at openness from a variety of 21 00:01:29,760 --> 00:01:33,800 Speaker 2: angles and explore how the concept is already reshaping industries, 22 00:01:34,240 --> 00:01:38,039 Speaker 2: ways of doing business, and our very notion of what's possible. 23 00:01:38,840 --> 00:01:42,080 Speaker 2: On today's episode, doctor Laurie Santos sat down with two 24 00:01:42,080 --> 00:01:46,600 Speaker 2: women at the forefront of AI in education. Justina Nixon- 25 00:01:46,680 --> 00:01:51,720 Speaker 2: Saintil is vice president and Chief Impact Officer of IBM 26 00:01:51,880 --> 00:01:56,440 Speaker 2: Corporate Social Responsibility, and April Dawson is an Associate Dean 27 00:01:56,480 --> 00:02:00,280 Speaker 2: of Technology and Innovation and Professor of Law at North 28 00:02:00,280 --> 00:02:05,160 Speaker 2: Carolina Central University School of Law. Together, they explore the 29 00:02:05,200 --> 00:02:10,040 Speaker 2: transformative impact of AI on education and the workforce. As 30 00:02:10,120 --> 00:02:15,000 Speaker 2: technology rapidly evolves, industries are being reshaped and the demand 31 00:02:15,040 --> 00:02:18,440 Speaker 2: for new skills is at an all-time high. This 32 00:02:18,600 --> 00:02:23,120 Speaker 2: is opening up opportunities for diverse talent, enabling individuals from 33 00:02:23,200 --> 00:02:26,200 Speaker 2: various backgrounds to excel in roles they might not have 34 00:02:26,280 --> 00:02:31,200 Speaker 2: previously considered. They also address the ethical considerations of AI, 35 00:02:31,720 --> 00:02:37,280 Speaker 2: emphasizing the importance of maintaining a human-centered approach.
Whether 36 00:02:37,320 --> 00:02:39,880 Speaker 2: you're a teacher or a student, or someone interested in the 37 00:02:39,919 --> 00:02:43,040 Speaker 2: future of work, it's essential to embrace the role of 38 00:02:43,080 --> 00:02:47,360 Speaker 2: AI in the education landscape. AI is not only changing 39 00:02:47,360 --> 00:02:50,400 Speaker 2: the way we work, but also how we learn, making 40 00:02:50,560 --> 00:02:55,840 Speaker 2: education more accessible, personalized, and aligned with the demands of 41 00:02:55,880 --> 00:02:57,080 Speaker 2: the modern job market. 42 00:03:02,240 --> 00:03:04,200 Speaker 3: Justina and April, so great to meet both of you. 43 00:03:04,280 --> 00:03:05,800 Speaker 3: I'm so excited for this conversation. 44 00:03:06,600 --> 00:03:07,639 Speaker 4: Thank you for having me. 45 00:03:07,440 --> 00:03:09,360 Speaker 5: And thank you for having me. 46 00:03:10,520 --> 00:03:13,120 Speaker 3: Justina, to start, could you share some insights on your 47 00:03:13,200 --> 00:03:16,560 Speaker 3: journey to becoming IBM's Chief Impact Officer and how your 48 00:03:16,560 --> 00:03:20,079 Speaker 3: background in engineering shapes your approach to corporate social responsibility. 49 00:03:20,520 --> 00:03:23,200 Speaker 4: So I've had an interesting journey. I'm an immigrant. 50 00:03:23,280 --> 00:03:26,400 Speaker 4: I was one of the only black women who graduated 51 00:03:26,400 --> 00:03:30,519 Speaker 4: from my school's mechanical engineering program many, many years ago. 52 00:03:30,960 --> 00:03:34,040 Speaker 4: I started my engineering career at a nuclear facility that's 53 00:03:34,080 --> 00:03:37,360 Speaker 4: around forty five miles outside of Buffalo, New York, and 54 00:03:37,440 --> 00:03:40,560 Speaker 4: eventually worked for one of the largest telecommunications companies in 55 00:03:40,600 --> 00:03:45,120 Speaker 4: the world in engineering, marketing, and eventually in corporate social responsibility.
56 00:03:45,680 --> 00:03:48,920 Speaker 4: I was hired to lead the organization away from traditional 57 00:03:48,920 --> 00:03:53,720 Speaker 4: philanthropy to creating platforms and solutions that leveraged four G 58 00:03:53,840 --> 00:03:58,960 Speaker 4: and five G technologies to positively impact disadvantaged communities, and 59 00:03:59,000 --> 00:04:01,280 Speaker 4: that is what has led me to the work that 60 00:04:01,320 --> 00:04:04,120 Speaker 4: I do at IBM today. I have the honor of 61 00:04:04,160 --> 00:04:07,240 Speaker 4: being the company's first Chief Impact Officer, and it is such 62 00:04:07,280 --> 00:04:11,160 Speaker 4: a privilege and a responsibility to be at IBM, which 63 00:04:11,200 --> 00:04:16,360 Speaker 4: has such a huge history in sustainability, in social and 64 00:04:16,400 --> 00:04:19,800 Speaker 4: in the ethical space as well. When I consider how 65 00:04:19,880 --> 00:04:23,320 Speaker 4: my background in engineering ties into the work that I do, 66 00:04:24,000 --> 00:04:29,720 Speaker 4: I actually think engineers are very skilled at analyzing data 67 00:04:29,760 --> 00:04:33,880 Speaker 4: and at innovative problem solving. The other thing where there's 68 00:04:33,880 --> 00:04:37,400 Speaker 4: a lot of alignment with my engineering background is really 69 00:04:37,440 --> 00:04:41,039 Speaker 4: around how I think about using technology to solve 70 00:04:41,080 --> 00:04:44,000 Speaker 4: some of the biggest issues that we have in society. 71 00:04:44,360 --> 00:04:48,039 Speaker 4: And I get very excited about innovating and creating and 72 00:04:48,160 --> 00:04:52,599 Speaker 4: leveraging technologies like AI and hybrid cloud to really bring 73 00:04:52,640 --> 00:04:54,920 Speaker 4: those into the work that we do and to solve 74 00:04:54,960 --> 00:04:57,919 Speaker 4: some of those big challenges that we have in society 75 00:04:57,960 --> 00:05:00,000 Speaker 4: today around sustainability and education.
76 00:05:00,760 --> 00:05:03,400 Speaker 3: That's fabulous, April. Tell me about your path to becoming 77 00:05:03,440 --> 00:05:06,000 Speaker 3: Associate Dean of Technology and Innovation as well as a 78 00:05:06,000 --> 00:05:06,840 Speaker 3: professor of law. 79 00:05:07,800 --> 00:05:12,480 Speaker 5: So I am a child of an educator, actually educators. 80 00:05:12,480 --> 00:05:15,559 Speaker 5: Both my parents are educators. I went to high school 81 00:05:15,560 --> 00:05:18,359 Speaker 5: where my mom taught, and it was in the eighties 82 00:05:18,400 --> 00:05:21,719 Speaker 5: and it was during that time period when teachers were 83 00:05:21,760 --> 00:05:24,880 Speaker 5: given Apple computers, so they were brand new. My mom 84 00:05:24,920 --> 00:05:27,919 Speaker 5: brought one home. I started playing with it. Then I 85 00:05:27,960 --> 00:05:30,720 Speaker 5: just kind of fell in love with the technology. I 86 00:05:30,760 --> 00:05:33,880 Speaker 5: received my undergraduate degree in computer science because of that 87 00:05:33,960 --> 00:05:37,960 Speaker 5: early exposure. I went to Bennett College here in Greensboro, 88 00:05:38,040 --> 00:05:43,120 Speaker 5: North Carolina. It's an HBCU, a historically black college and university. 89 00:05:43,760 --> 00:05:47,760 Speaker 5: I was a programmer after graduating from Bennett, and I've 90 00:05:47,800 --> 00:05:51,520 Speaker 5: always loved technology, but I also had a love for 91 00:05:51,600 --> 00:05:54,599 Speaker 5: the law. So after being a programmer for a couple 92 00:05:54,600 --> 00:05:57,200 Speaker 5: of years, I decided to go to law school. And 93 00:05:57,279 --> 00:06:01,599 Speaker 5: even as a lawyer, I leveraged technology in my private practice.
94 00:06:02,000 --> 00:06:05,359 Speaker 5: When I decided to begin teaching almost twenty years ago, 95 00:06:06,080 --> 00:06:09,000 Speaker 5: I would ask myself, how could I leverage technology 96 00:06:09,160 --> 00:06:13,000 Speaker 5: to enhance my teaching to help the students better understand 97 00:06:13,040 --> 00:06:17,920 Speaker 5: the material. And so when our dean at the time, 98 00:06:18,000 --> 00:06:21,400 Speaker 5: Browne Lewis, was able to facilitate a five 99 00:06:21,400 --> 00:06:25,080 Speaker 5: million dollar grant to North Carolina Central University School of Law, 100 00:06:25,880 --> 00:06:30,039 Speaker 5: we created the Technology Law and Policy Center, and she 101 00:06:30,200 --> 00:06:32,800 Speaker 5: asked me if I would be interested in serving as 102 00:06:32,839 --> 00:06:37,520 Speaker 5: the inaugural Associate Dean of Technology and Innovation. So suffice 103 00:06:37,560 --> 00:06:39,719 Speaker 5: it to say, I'm in my dream job. I'm able 104 00:06:39,800 --> 00:06:43,200 Speaker 5: to combine my love of technology, my love of law, 105 00:06:43,279 --> 00:06:47,000 Speaker 5: my love of education, and so it's really an exciting 106 00:06:47,080 --> 00:06:49,520 Speaker 5: time to be in a position like I have. 107 00:06:50,520 --> 00:06:51,040 Speaker 1: I love that. 108 00:06:51,200 --> 00:06:53,880 Speaker 3: April, what inspired you to integrate AI and technology into 109 00:06:53,920 --> 00:06:55,359 Speaker 3: your law curriculum? 110 00:06:55,960 --> 00:07:00,080 Speaker 5: It's interesting.
As I mentioned before, I've always used it 111 00:07:00,080 --> 00:07:05,000 Speaker 5: personally as an educator, but the thought of teaching 112 00:07:05,080 --> 00:07:08,839 Speaker 5: a class that really kind of focused on technology and 113 00:07:08,880 --> 00:07:13,920 Speaker 5: the legal implications of that really occurred because Ray Thomas, 114 00:07:13,920 --> 00:07:16,880 Speaker 5: who was an IP lawyer and worked at IBM at 115 00:07:16,920 --> 00:07:19,760 Speaker 5: the time, in twenty twenty, so around the pandemic, he 116 00:07:19,960 --> 00:07:23,440 Speaker 5: encouraged us to take advantage of the IBM Skills Build 117 00:07:23,560 --> 00:07:27,600 Speaker 5: training program, the Train the Trainer program. So really not 118 00:07:27,800 --> 00:07:31,120 Speaker 5: until that time period did I even really think 119 00:07:31,160 --> 00:07:35,640 Speaker 5: about teaching a tech-focused legal class. And during that 120 00:07:35,720 --> 00:07:38,000 Speaker 5: time period, a couple of my other colleagues and I, 121 00:07:38,120 --> 00:07:41,800 Speaker 5: we did the Train the Trainer Blockchain course. I did 122 00:07:41,800 --> 00:07:45,360 Speaker 5: the Data Science course, and then that next summer we 123 00:07:45,520 --> 00:07:49,120 Speaker 5: team taught the Blockchain for Lawyers class, which we designed, 124 00:07:49,640 --> 00:07:52,920 Speaker 5: and then I taught a Data Science for Lawyers class, 125 00:07:53,560 --> 00:07:56,040 Speaker 5: and so that was, you know, really kind of the 126 00:07:56,040 --> 00:08:00,520 Speaker 5: first iteration of us really being intentional about teaching technology 127 00:08:00,640 --> 00:08:04,200 Speaker 5: and law. And then one of my other colleagues, doctor 128 00:08:04,240 --> 00:08:07,480 Speaker 5: Savon Da Grady, she is a professor at the School 129 00:08:07,480 --> 00:08:11,880 Speaker 5: of Library and Information Sciences here at NCCU.
She reached 130 00:08:11,880 --> 00:08:13,880 Speaker 5: out to me and said, would you be interested in 131 00:08:13,960 --> 00:08:17,560 Speaker 5: teaching a joint AI and the law class that would 132 00:08:17,600 --> 00:08:21,360 Speaker 5: include her Masters of Information Science students and my law students. 133 00:08:21,720 --> 00:08:25,960 Speaker 5: So it's a wonderful interdisciplinary class where you have master's 134 00:08:26,000 --> 00:08:29,680 Speaker 5: students and law students and we talk about the foundations 135 00:08:29,680 --> 00:08:33,640 Speaker 5: of AI, we talk about the legal implications, the policy implications, 136 00:08:34,120 --> 00:08:36,720 Speaker 5: and so really, you know, this kind of all started 137 00:08:36,840 --> 00:08:41,800 Speaker 5: because of the resources that IBM has made available to NCCU. 138 00:08:43,080 --> 00:08:45,280 Speaker 3: That's so cool, and that class sounds amazing. I wish 139 00:08:45,360 --> 00:08:47,199 Speaker 3: I could like drop out of being a professor and 140 00:08:47,480 --> 00:08:50,199 Speaker 3: attend this class. This sounds awesome. And so as a 141 00:08:50,280 --> 00:08:52,120 Speaker 3: question for both of you, in this age of AI 142 00:08:52,160 --> 00:08:55,559 Speaker 3: and open technology, does the role of education change? Are 143 00:08:55,559 --> 00:08:57,520 Speaker 3: we kind of at a different spot with what education 144 00:08:57,559 --> 00:08:58,520 Speaker 3: should be doing now? 145 00:08:59,320 --> 00:09:02,560 Speaker 4: When I look at the role of education today 146 00:09:02,640 --> 00:09:06,040 Speaker 4: from the corporate point of view, I think it does change.
147 00:09:06,120 --> 00:09:10,040 Speaker 4: I was having a discussion earlier today with some members 148 00:09:10,080 --> 00:09:15,000 Speaker 4: of my team, and we were discussing early professional hires, 149 00:09:15,000 --> 00:09:18,520 Speaker 4: so people we would want to hire right out of college, 150 00:09:18,960 --> 00:09:21,440 Speaker 4: and one of the first things that I shared was 151 00:09:21,559 --> 00:09:24,959 Speaker 4: some of the tasks that they would have done previously 152 00:09:25,360 --> 00:09:28,880 Speaker 4: will be automated. We will be using AI for those 153 00:09:29,400 --> 00:09:32,480 Speaker 4: basic tasks for which in the past we would have hired an 154 00:09:32,520 --> 00:09:37,400 Speaker 4: intern or a recent college graduate to do. And it's 155 00:09:37,480 --> 00:09:42,200 Speaker 4: so critical now that we look at higher-level types 156 00:09:42,200 --> 00:09:44,920 Speaker 4: of tasks that we will need college graduates to do. 157 00:09:45,360 --> 00:09:49,360 Speaker 4: And I can't foresee in the future hiring someone from 158 00:09:49,440 --> 00:09:53,120 Speaker 4: college who does not have at least a basic understanding 159 00:09:53,160 --> 00:09:56,040 Speaker 4: of AI. There will be some roles where they will 160 00:09:56,040 --> 00:09:58,880 Speaker 4: have to have an advanced understanding, especially if they're in 161 00:09:59,000 --> 00:10:03,360 Speaker 4: an engineering role or computer science role, but across the 162 00:10:03,400 --> 00:10:06,760 Speaker 4: board they will need to understand AI.
So when I 163 00:10:06,800 --> 00:10:10,200 Speaker 4: think about the way that education is changing, whether you're 164 00:10:10,200 --> 00:10:14,120 Speaker 4: a college student, whether you are an adult professional, you 165 00:10:14,200 --> 00:10:17,160 Speaker 4: will need to be a lifelong learner and you will 166 00:10:17,200 --> 00:10:21,880 Speaker 4: need to understand how to continuously upskill and reskill yourself 167 00:10:22,360 --> 00:10:25,600 Speaker 4: to be able to understand technologies like AI, because of 168 00:10:25,679 --> 00:10:29,040 Speaker 4: the rapid acceleration of these types of technologies. And I 169 00:10:29,080 --> 00:10:32,559 Speaker 4: think that's very important. I think everyone has to be prepared, 170 00:10:32,760 --> 00:10:35,760 Speaker 4: if they're not doing it today, to upskill and reskill themselves. 171 00:10:36,120 --> 00:10:39,600 Speaker 4: And I can't foresee any roles in the future where 172 00:10:39,800 --> 00:10:44,280 Speaker 4: candidates will not need to have a very basic understanding 173 00:10:44,320 --> 00:10:46,959 Speaker 4: of AI or even advanced understanding of AI. 174 00:10:48,360 --> 00:10:50,480 Speaker 3: That's great, April. Let me ask you a slightly different 175 00:10:50,520 --> 00:10:52,800 Speaker 3: version of the question, what is the significance of AI 176 00:10:52,840 --> 00:10:54,880 Speaker 3: for students and young professionals today? 177 00:10:55,679 --> 00:10:59,280 Speaker 5: When we think about the disruption that gen AI especially has 178 00:10:59,360 --> 00:11:03,960 Speaker 5: caused within the legal profession, students have to be more adept 179 00:11:04,000 --> 00:11:08,400 Speaker 5: when it comes to feeling comfortable being uncomfortable, and learning 180 00:11:08,440 --> 00:11:11,120 Speaker 5: something new.
The other thing that I would just kind 181 00:11:11,120 --> 00:11:15,480 Speaker 5: of emphasize from an educational standpoint is this also means 182 00:11:15,520 --> 00:11:19,719 Speaker 5: that educators have to approach teaching differently. You know, I've 183 00:11:19,720 --> 00:11:24,600 Speaker 5: been teaching for going on twenty years and things are 184 00:11:24,679 --> 00:11:28,080 Speaker 5: kind of being turned on their heads somewhat, right, and 185 00:11:28,960 --> 00:11:33,240 Speaker 5: I have had to upskill and reskill. We can't teach 186 00:11:33,320 --> 00:11:35,840 Speaker 5: what we don't know. We can't monitor what 187 00:11:35,960 --> 00:11:38,640 Speaker 5: we don't know. Just as the students have to understand 188 00:11:38,679 --> 00:11:42,040 Speaker 5: generative AI, the educators have to understand it as well. 189 00:11:42,600 --> 00:11:44,960 Speaker 3: Yeah, this is something I've felt in the classroom myself 190 00:11:45,000 --> 00:11:48,600 Speaker 3: as a psychology professor, right, is that I'm realizing how 191 00:11:48,800 --> 00:11:50,840 Speaker 3: much I need to kind of go back to school 192 00:11:50,880 --> 00:11:53,440 Speaker 3: and learn about all these AI tools, not just so 193 00:11:53,440 --> 00:11:54,720 Speaker 3: that I can teach it, but just so I can 194 00:11:54,840 --> 00:11:57,640 Speaker 3: understand how my students are using these things, right, but 195 00:11:57,679 --> 00:11:59,960 Speaker 3: also to figure out how I can enhance the educa 196 00:12:00,000 --> 00:12:02,760 Speaker 3: tional experience of my own students in psychology right by 197 00:12:02,800 --> 00:12:05,360 Speaker 3: giving them access to these tools. And so, yeah, I'm 198 00:12:05,360 --> 00:12:08,440 Speaker 3: curious in your experience, how does AI actually enhance the 199 00:12:08,640 --> 00:12:11,320 Speaker 3: educational experience for your law students.
And I'm curious if 200 00:12:11,360 --> 00:12:13,120 Speaker 3: you could give an example of the type of thing 201 00:12:13,160 --> 00:12:14,280 Speaker 3: you do in your classroom. 202 00:12:15,000 --> 00:12:15,200 Speaker 4: Yeah. 203 00:12:15,360 --> 00:12:18,640 Speaker 5: So one of the things that I tell my students 204 00:12:18,760 --> 00:12:21,400 Speaker 5: is you've got to get your hands dirty. You can't 205 00:12:21,520 --> 00:12:24,000 Speaker 5: understand these tools if you don't kind of dig in 206 00:12:24,120 --> 00:12:27,199 Speaker 5: and just see how they work. So one, giving them 207 00:12:27,240 --> 00:12:30,680 Speaker 5: permission and encouraging them to do it in terms of 208 00:12:30,760 --> 00:12:34,320 Speaker 5: how they might be able to use these tools to 209 00:12:34,440 --> 00:12:38,000 Speaker 5: help them learn better. I encourage them to do that as they're 210 00:12:38,040 --> 00:12:42,880 Speaker 5: wrestling maybe with concepts that are confusing, that they haven't completely 211 00:12:42,920 --> 00:12:45,240 Speaker 5: wrapped their heads around. And when we think about 212 00:12:45,320 --> 00:12:49,560 Speaker 5: large language models, these tools are really helpful in that sense. Right, 213 00:12:49,920 --> 00:12:52,600 Speaker 5: if there's a passage in the book and you're not 214 00:12:52,720 --> 00:12:55,960 Speaker 5: quite following it, or there's a case right and you 215 00:12:56,080 --> 00:13:00,720 Speaker 5: need some assistance in breaking it down, running that information 216 00:13:00,800 --> 00:13:03,560 Speaker 5: through a large language model and then asking questions about 217 00:13:03,559 --> 00:13:06,920 Speaker 5: it can be really beneficial. Also in the law school, 218 00:13:07,040 --> 00:13:10,640 Speaker 5: in legal contexts, large language models are really helpful for 219 00:13:10,720 --> 00:13:13,280 Speaker 5: that as well.
But one thing I do caution my 220 00:13:13,320 --> 00:13:16,040 Speaker 5: students is that any understanding that you think you have 221 00:13:16,240 --> 00:13:19,280 Speaker 5: gained through the use of these tools, you need to 222 00:13:19,320 --> 00:13:21,679 Speaker 5: circle back to your professor and make sure that your 223 00:13:21,760 --> 00:13:23,040 Speaker 5: understanding is correct. 224 00:13:23,840 --> 00:13:25,960 Speaker 3: I love that and I've seen the importance of that 225 00:13:26,000 --> 00:13:28,400 Speaker 3: in my own classroom too. You mentioned so many of 226 00:13:28,440 --> 00:13:30,360 Speaker 3: the things that these tools are great at, but I 227 00:13:30,360 --> 00:13:32,600 Speaker 3: think another thing that AI in the classroom can help 228 00:13:32,640 --> 00:13:36,760 Speaker 3: us with is democratizing the classroom. And so, Justina, I'm curious, 229 00:13:36,800 --> 00:13:39,600 Speaker 3: in what ways do you think integrating AI into education 230 00:13:39,760 --> 00:13:41,680 Speaker 3: is going to help us bridge these gaps and actually 231 00:13:41,679 --> 00:13:44,120 Speaker 3: democratize access to education even more? 232 00:13:44,520 --> 00:13:47,319 Speaker 4: Yeah, I think it's going to really make a difference 233 00:13:47,400 --> 00:13:51,280 Speaker 4: in providing access to education in many different ways. I 234 00:13:51,320 --> 00:13:53,880 Speaker 4: want to give you an example. Through our IBM Skills 235 00:13:53,880 --> 00:13:59,040 Speaker 4: Build program, we're infusing AI technology into the platform to 236 00:13:59,120 --> 00:14:03,920 Speaker 4: create a more personalized, enhanced experience for our learners in 237 00:14:04,000 --> 00:14:08,080 Speaker 4: every language.
So we are creating personalized learning pathways, we 238 00:14:08,160 --> 00:14:13,080 Speaker 4: are tailoring the access to our learners to meet their 239 00:14:13,160 --> 00:14:17,479 Speaker 4: individual needs, and we're also using AI to answer questions 240 00:14:17,520 --> 00:14:20,520 Speaker 4: in a more timely and accurate manner. If you really 241 00:14:20,600 --> 00:14:23,280 Speaker 4: think about it, you will need a significant staff to 242 00:14:23,360 --> 00:14:26,920 Speaker 4: be able to respond quickly to questions to make sure 243 00:14:26,920 --> 00:14:31,440 Speaker 4: the answers are accurate. With AI, we can answer questions immediately, 244 00:14:31,960 --> 00:14:34,640 Speaker 4: we can answer them in a more sophisticated way than 245 00:14:34,680 --> 00:14:37,280 Speaker 4: we did in the past, and we can also offer 246 00:14:37,440 --> 00:14:42,000 Speaker 4: course recommendations and learning pathways that meet their needs. We 247 00:14:42,120 --> 00:14:45,840 Speaker 4: have courses such as AI Ethics and Prompt Writing and 248 00:14:45,920 --> 00:14:49,480 Speaker 4: Getting Started with Machine Learning all the way to actually 249 00:14:49,640 --> 00:14:53,880 Speaker 4: using coding to help create these large language models.
250 00:14:53,880 --> 00:14:56,680 Speaker 4: So when you think about the average learner that we 251 00:14:56,720 --> 00:14:59,840 Speaker 4: are working with, they may want just an introductory course 252 00:15:00,080 --> 00:15:04,600 Speaker 4: on AI ethics or understanding how to use AI in 253 00:15:04,680 --> 00:15:07,680 Speaker 4: their day-to-day work, or they actually may want 254 00:15:07,720 --> 00:15:11,160 Speaker 4: to understand how do you really leverage or code for 255 00:15:11,240 --> 00:15:13,480 Speaker 4: a large language model, and I think it's important to 256 00:15:13,520 --> 00:15:16,880 Speaker 4: give them all the different options and create those personalized 257 00:15:16,920 --> 00:15:20,880 Speaker 4: learning pathways for them. The other thing around really democratizing 258 00:15:20,960 --> 00:15:25,240 Speaker 4: opportunity is to provide free access to this kind of learning, 259 00:15:25,600 --> 00:15:27,960 Speaker 4: and we do that again through our Skills Build program. 260 00:15:28,480 --> 00:15:33,760 Speaker 4: If you have courses that you can only pay to access, 261 00:15:33,800 --> 00:15:37,560 Speaker 4: then you're really not giving the opportunity for everyone to 262 00:15:37,680 --> 00:15:41,080 Speaker 4: advance and to learn. So by leveraging AI on our 263 00:15:41,120 --> 00:15:44,760 Speaker 4: platform but also providing that free access, we're really helping 264 00:15:44,800 --> 00:15:48,000 Speaker 4: to bridge the gap for learners and make sure they 265 00:15:48,000 --> 00:15:51,880 Speaker 4: can upskill and reskill themselves and help them also increase 266 00:15:51,920 --> 00:15:53,640 Speaker 4: social and economic mobility. 267 00:15:54,880 --> 00:15:57,400 Speaker 3: It sounds like an amazing program, Justina.
Can you describe 268 00:15:57,440 --> 00:16:00,400 Speaker 3: the vision behind IBM Skills Build and how it's 269 00:16:00,480 --> 00:16:02,520 Speaker 3: built to reach so many learners around the world? 270 00:16:02,840 --> 00:16:06,520 Speaker 4: Yeah. So, IBM has always been committed to investing in 271 00:16:06,560 --> 00:16:11,640 Speaker 4: the future of work and we've offered educational experiences for many, 272 00:16:11,680 --> 00:16:16,560 Speaker 4: many years. And IBM Skills Build is a program. Again, 273 00:16:16,640 --> 00:16:19,920 Speaker 4: it's free, it's open, anyone can access it. But it's 274 00:16:19,960 --> 00:16:24,080 Speaker 4: really around getting access to the right technical skills and 275 00:16:24,240 --> 00:16:27,920 Speaker 4: workplace learning skills so that you could be prepared for 276 00:16:28,480 --> 00:16:33,000 Speaker 4: a career in technology, but in any industry and any field. 277 00:16:33,400 --> 00:16:38,600 Speaker 4: We know now that understanding technology, understanding AI or cybersecurity 278 00:16:39,160 --> 00:16:42,560 Speaker 4: or any of those tech topics is needed whether you're 279 00:16:42,640 --> 00:16:45,520 Speaker 4: working in a tech company, or whether you're working in 280 00:16:45,560 --> 00:16:49,040 Speaker 4: retail or in legal or any of these different industries. 281 00:16:49,080 --> 00:16:50,640 Speaker 4: So we want to make sure we can provide that 282 00:16:50,760 --> 00:16:54,080 Speaker 4: access to learners. In twenty twenty one, we launched a 283 00:16:54,120 --> 00:16:57,320 Speaker 4: global commitment to skill thirty million people by twenty thirty, 284 00:16:57,760 --> 00:17:01,680 Speaker 4: and we are making significant progress against that goal.
Just 285 00:17:01,800 --> 00:17:04,840 Speaker 4: last year we reported that we skilled eleven point five 286 00:17:04,920 --> 00:17:08,480 Speaker 4: million learners around the world, and these are learners that 287 00:17:08,720 --> 00:17:13,600 Speaker 4: enrolled in IBM courses, including access to our platform, IBM 288 00:17:13,640 --> 00:17:17,800 Speaker 4: Skills Build, and it's really the cornerstone of our education 289 00:17:17,920 --> 00:17:21,960 Speaker 4: work at IBM. We're really focused on scaling our work 290 00:17:22,240 --> 00:17:27,280 Speaker 4: through partnerships. So we partner with historically black colleges and universities, 291 00:17:27,280 --> 00:17:29,679 Speaker 4: and that's how, of course, we got the chance to 292 00:17:29,680 --> 00:17:34,160 Speaker 4: meet April. We partner with nonprofit organizations across the globe. 293 00:17:34,320 --> 00:17:37,640 Speaker 4: We also partner with governments to make sure we provide 294 00:17:37,680 --> 00:17:40,960 Speaker 4: that free access to the communities that are aligned with 295 00:17:41,000 --> 00:17:44,600 Speaker 4: their national agenda around skilling, and those communities that are 296 00:17:44,600 --> 00:17:48,639 Speaker 4: most in need. It's really important that we scale the 297 00:17:48,720 --> 00:17:53,520 Speaker 4: program through those premier partnerships, so that's extremely important to us. 298 00:17:54,520 --> 00:17:58,720 Speaker 2: The vision behind IBM Skills Build is truly inspiring. In 299 00:17:58,760 --> 00:18:02,760 Speaker 2: a world where technology is changing every industry, having access 300 00:18:02,760 --> 00:18:06,439 Speaker 2: to these crucial skills is more important than ever. This 301 00:18:06,560 --> 00:18:10,280 Speaker 2: initiative is breaking down barriers and ensuring that people from 302 00:18:10,359 --> 00:18:13,920 Speaker 2: all walks of life can participate in the future of work.
303 00:18:14,880 --> 00:18:19,000 Speaker 2: In order to effectively scale a platform, the strategic collaborations 304 00:18:19,000 --> 00:18:24,440 Speaker 2: with educational institutions, nonprofits and governments are key. It's clear 305 00:18:24,480 --> 00:18:28,800 Speaker 2: that IBM is deeply invested in creating long-lasting change 306 00:18:28,880 --> 00:18:33,000 Speaker 2: in communities around the world. This approach will strengthen the 307 00:18:33,040 --> 00:18:37,320 Speaker 2: workforce globally, helping to bridge the digital divide and create 308 00:18:37,440 --> 00:18:40,199 Speaker 2: more equitable opportunities in the tech space. 309 00:18:41,800 --> 00:18:43,600 Speaker 3: So now we're shifting gears to think a little bit 310 00:18:43,600 --> 00:18:46,480 Speaker 3: about the real-world insights. Justina, what can you tell 311 00:18:46,520 --> 00:18:49,159 Speaker 3: us about the skills-first movement? This seems to be 312 00:18:49,240 --> 00:18:52,000 Speaker 3: an open approach to attracting top talent. What are you 313 00:18:52,040 --> 00:18:53,720 Speaker 3: hearing from students and partners? 314 00:18:54,440 --> 00:18:58,160 Speaker 4: Yeah, so IBM has been leading the skills-first movement 315 00:18:58,280 --> 00:19:01,399 Speaker 4: for quite some time. And the thing that we realized, 316 00:19:01,440 --> 00:19:04,560 Speaker 4: and we actually tested this out, is that you don't 317 00:19:04,560 --> 00:19:08,239 Speaker 4: always need a four-year degree to be successful at 318 00:19:08,280 --> 00:19:11,520 Speaker 4: a tech job. So when we looked at the job 319 00:19:11,560 --> 00:19:15,000 Speaker 4: postings that we had, we decided to make a commitment 320 00:19:15,400 --> 00:19:19,280 Speaker 4: to have at least fifty percent of our job postings 321 00:19:19,359 --> 00:19:23,680 Speaker 4: not requiring a four-year degree.
And when we started 322 00:19:23,800 --> 00:19:27,120 Speaker 4: hiring people without a four year degree in certain roles, 323 00:19:27,760 --> 00:19:31,720 Speaker 4: we realized that they were as successful as those with 324 00:19:31,800 --> 00:19:34,320 Speaker 4: a four year degree. Now, this doesn't work across the board, 325 00:19:34,400 --> 00:19:37,560 Speaker 4: but this is really a way to get access to 326 00:19:37,640 --> 00:19:41,600 Speaker 4: what I consider to be untapped talent that are skilled 327 00:19:41,600 --> 00:19:45,439 Speaker 4: in different ways. Maybe they've had some experiences already, maybe 328 00:19:45,440 --> 00:19:48,720 Speaker 4: they have a different set of badges and certificates or 329 00:19:48,760 --> 00:19:52,399 Speaker 4: other credentials that can support them getting access to some 330 00:19:52,440 --> 00:19:55,639 Speaker 4: of the roles that are offered by companies. So this 331 00:19:55,840 --> 00:19:59,520 Speaker 4: is really a way to help address the opportunity gap 332 00:19:59,560 --> 00:20:02,639 Speaker 4: and provide a pathway for diverse talent. 333 00:20:03,840 --> 00:20:06,199 Speaker 3: What impact do you think AI has had on global 334 00:20:06,280 --> 00:20:08,200 Speaker 3: learning standards broadly so far? 335 00:20:09,400 --> 00:20:12,119 Speaker 5: I think from the perspective of a law student, realizing 336 00:20:12,160 --> 00:20:14,879 Speaker 5: that this little universe in which we kind of thought 337 00:20:14,920 --> 00:20:18,639 Speaker 5: we might operate has expanded. When we think about AI 338 00:20:18,760 --> 00:20:21,560 Speaker 5: and we think about the implications of AI, it goes 339 00:20:21,640 --> 00:20:24,960 Speaker 5: far beyond our state, our nation. I mean, you have to 340 00:20:25,000 --> 00:20:28,800 Speaker 5: have an understanding of what's going on in other countries.
341 00:20:28,880 --> 00:20:31,840 Speaker 5: So even when we're thinking about the regulation of AI 342 00:20:31,920 --> 00:20:35,879 Speaker 5: and the governance of AI and policies surrounding AI, it 343 00:20:35,920 --> 00:20:39,200 Speaker 5: means you have to be open to learning about what's 344 00:20:39,240 --> 00:20:43,240 Speaker 5: happening in other countries where AI is disrupting those spaces 345 00:20:43,320 --> 00:20:46,960 Speaker 5: as well. So again, I think it really underscores for 346 00:20:47,280 --> 00:20:50,719 Speaker 5: our law students how you have to see yourself as 347 00:20:50,800 --> 00:20:54,159 Speaker 5: part of a larger team. Lawyers don't work in isolation, 348 00:20:54,480 --> 00:20:58,520 Speaker 5: and it's really good that law students are recognizing that 349 00:20:58,680 --> 00:20:59,919 Speaker 5: while they're still in school. 350 00:21:00,920 --> 00:21:03,320 Speaker 3: So it really seems like these technologies are kind of 351 00:21:03,440 --> 00:21:06,399 Speaker 3: changing the learning experience in law by making it kind of 352 00:21:06,400 --> 00:21:09,239 Speaker 3: broader and maybe more global. Justina, can you share an 353 00:21:09,240 --> 00:21:11,920 Speaker 3: example of how IBM Skills Build has made a significant 354 00:21:11,960 --> 00:21:14,240 Speaker 3: difference in other kinds of learning journeys? 355 00:21:14,359 --> 00:21:14,600 Speaker 5: Yeah? 356 00:21:14,600 --> 00:21:16,880 Speaker 4: Absolutely, I mean this is one of the most rewarding 357 00:21:16,920 --> 00:21:19,720 Speaker 4: parts of my job.
What I get excited about is 358 00:21:19,760 --> 00:21:22,960 Speaker 4: when I travel and I meet with students who have 359 00:21:23,040 --> 00:21:25,359 Speaker 4: been a part of IBM Skills Build and they have 360 00:21:25,560 --> 00:21:30,160 Speaker 4: been able to use the learning, the certificates, the opportunities 361 00:21:30,560 --> 00:21:33,560 Speaker 4: that we've provided them around mentorship as well to be 362 00:21:33,600 --> 00:21:36,199 Speaker 4: able to move into a better paying job or a 363 00:21:36,200 --> 00:21:40,040 Speaker 4: new job that they did not have the opportunity for previously. 364 00:21:40,440 --> 00:21:43,040 Speaker 4: We had one of our learners. His name was Oscar 365 00:21:43,400 --> 00:21:46,720 Speaker 4: and he arrived in California from Mexico when he was 366 00:21:46,760 --> 00:21:49,520 Speaker 4: around five years old and he worked and he attended 367 00:21:49,560 --> 00:21:53,560 Speaker 4: college full time. But during his last semester he was 368 00:21:53,640 --> 00:21:56,680 Speaker 4: introduced to the IBM Skills Build program through the Hispanic 369 00:21:56,800 --> 00:22:01,199 Speaker 4: Heritage Foundation, one of our partners, and through the career 370 00:22:01,240 --> 00:22:04,520 Speaker 4: assessment tool of the program, he identified areas where he 371 00:22:04,560 --> 00:22:08,160 Speaker 4: could excel and it allowed him to dig deeper into 372 00:22:08,280 --> 00:22:11,720 Speaker 4: learning paths that matched his interests and his skills. So 373 00:22:11,800 --> 00:22:16,399 Speaker 4: he started taking courses such as AI Fundamentals, he earned credentials, 374 00:22:16,680 --> 00:22:19,840 Speaker 4: and he was able to get a better role when 375 00:22:19,840 --> 00:22:23,720 Speaker 4: he graduated from college.
So we have so many beneficiaries 376 00:22:23,720 --> 00:22:26,719 Speaker 4: of the program who have been able to access the training, 377 00:22:27,080 --> 00:22:30,480 Speaker 4: also access the mentorship that we provide through the program, 378 00:22:30,960 --> 00:22:33,600 Speaker 4: and able to get a better paying or new job 379 00:22:33,640 --> 00:22:34,240 Speaker 4: because of it. 380 00:22:35,040 --> 00:22:38,200 Speaker 3: That's fabulous, April. I know your students have used IBM 381 00:22:38,280 --> 00:22:40,560 Speaker 3: Skills Build. Can you give us an example of how 382 00:22:40,560 --> 00:22:42,840 Speaker 3: it's made an important impact on a student's journey? 383 00:22:42,920 --> 00:22:47,560 Speaker 5: Yes, absolutely. So, I mentioned that we taught a Blockchain 384 00:22:47,640 --> 00:22:51,959 Speaker 5: for Lawyers class and one of the students had a 385 00:22:51,960 --> 00:22:56,760 Speaker 5: big interest in blockchain and cryptocurrency. He actually also had a 386 00:22:56,880 --> 00:22:59,920 Speaker 5: master's in information science and so he was a dual 387 00:23:00,040 --> 00:23:03,159 Speaker 5: degree student. He was also in my AI and the 388 00:23:03,240 --> 00:23:06,480 Speaker 5: law class. So he not only got the blockchain certificate, 389 00:23:06,960 --> 00:23:10,520 Speaker 5: he got the AI Foundation certificate. He wound up being 390 00:23:10,520 --> 00:23:13,360 Speaker 5: the editor in chief of the law journal, and he 391 00:23:13,440 --> 00:23:16,200 Speaker 5: is a legal tech lawyer. And so this kind of 392 00:23:16,240 --> 00:23:19,119 Speaker 5: goes back to what Justina was saying about making sure 393 00:23:19,200 --> 00:23:23,160 Speaker 5: that the talent that's there has access to the resources. 394 00:23:23,240 --> 00:23:25,360 Speaker 5: It really does make a big difference in so many 395 00:23:25,400 --> 00:23:26,480 Speaker 5: of our students' lives.
396 00:23:27,480 --> 00:23:30,840 Speaker 3: That's such an inspiring story, Justina. I'm curious what impact 397 00:23:30,880 --> 00:23:33,160 Speaker 3: Skills Build has had on the communities you work with, 398 00:23:33,240 --> 00:23:34,760 Speaker 3: maybe even beyond just students. 399 00:23:35,440 --> 00:23:40,560 Speaker 4: Yeah, so it has had a tremendous impact in our communities. 400 00:23:40,960 --> 00:23:44,240 Speaker 4: I think one of the big things about digital skills 401 00:23:44,280 --> 00:23:49,200 Speaker 4: and upskilling and reskilling is that it's not just in certain areas. 402 00:23:49,240 --> 00:23:52,840 Speaker 4: For example, I mentioned the story of Oscar who was 403 00:23:52,920 --> 00:23:56,200 Speaker 4: graduating from college got access to Skills Build. It helped 404 00:23:56,600 --> 00:23:59,840 Speaker 4: him get a better paying job. But we have programs 405 00:24:00,080 --> 00:24:03,760 Speaker 4: in sustainability as well where we are working with farmers in 406 00:24:03,800 --> 00:24:07,399 Speaker 4: the middle of Texas and we are providing access to 407 00:24:07,520 --> 00:24:10,320 Speaker 4: Skills Build as well so that they can use the 408 00:24:10,359 --> 00:24:14,080 Speaker 4: technology and understand the technology that we are bringing to 409 00:24:14,119 --> 00:24:18,359 Speaker 4: them through our Sustainability Accelerator program. And what's so interesting about 410 00:24:18,359 --> 00:24:22,080 Speaker 4: this is we need to upskill and reskill them as well.
411 00:24:22,560 --> 00:24:24,600 Speaker 4: So if you think about certain jobs where you just 412 00:24:24,640 --> 00:24:28,240 Speaker 4: need to better understand the data or the technology, our 413 00:24:28,320 --> 00:24:30,879 Speaker 4: partnerships with nonprofits allow us to bring it to 414 00:24:31,000 --> 00:24:34,280 Speaker 4: people in different fields, and sustainability is one that we 415 00:24:34,359 --> 00:24:37,520 Speaker 4: focus on as well that has been inspiring to me. 416 00:24:38,119 --> 00:24:41,360 Speaker 4: We also have programs where we focus on girls, especially 417 00:24:41,359 --> 00:24:44,520 Speaker 4: in India, and make sure we're giving them access to 418 00:24:44,600 --> 00:24:48,040 Speaker 4: this kind of training and mentorship again to make them 419 00:24:48,520 --> 00:24:51,159 Speaker 4: competitive in the marketplace, to make sure that they have 420 00:24:51,800 --> 00:24:54,159 Speaker 4: an opportunity at a good paying job and that they 421 00:24:54,200 --> 00:24:58,320 Speaker 4: can be independent. So our global partners work with us 422 00:24:58,440 --> 00:25:03,159 Speaker 4: on leveraging Skills Build, curating it in a way that 423 00:25:03,240 --> 00:25:06,320 Speaker 4: makes sense for their communities that they want to impact, 424 00:25:06,320 --> 00:25:09,800 Speaker 4: and we focus on women who have left the workforce 425 00:25:09,960 --> 00:25:12,679 Speaker 4: and they want to return. We focus on veterans, we 426 00:25:12,800 --> 00:25:16,840 Speaker 4: focus on black communities in the US or Hispanic communities.
427 00:25:17,040 --> 00:25:20,840 Speaker 4: So we really look at those really great global partnerships 428 00:25:20,880 --> 00:25:24,040 Speaker 4: and make sure we are bringing in people who would 429 00:25:24,080 --> 00:25:26,919 Speaker 4: have been otherwise left out of the tech field and 430 00:25:27,200 --> 00:25:30,439 Speaker 4: giving them the opportunity to reskill and upskill themselves and 431 00:25:30,600 --> 00:25:33,679 Speaker 4: helping them, through our partnerships, connect to good paying jobs 432 00:25:33,720 --> 00:25:34,160 Speaker 4: as well. 433 00:25:34,880 --> 00:25:37,959 Speaker 3: So so far we've been focused on students and their learning, 434 00:25:38,000 --> 00:25:40,480 Speaker 3: but now I want to turn to both of your learning. 435 00:25:40,880 --> 00:25:43,439 Speaker 3: I'm curious, what are some challenges that you've faced in 436 00:25:43,480 --> 00:25:45,560 Speaker 3: your careers and how have you overcome them? 437 00:25:46,040 --> 00:25:49,840 Speaker 5: Yeah. Sure. So one of the things that I quickly 438 00:25:49,880 --> 00:25:53,359 Speaker 5: found out was that law school was not as I envisioned. 439 00:25:53,440 --> 00:25:55,320 Speaker 5: You kind of go in, you think it's one thing, 440 00:25:55,359 --> 00:25:58,920 Speaker 5: it's another. The curriculum can be very surprising. It's not 441 00:25:59,080 --> 00:26:02,840 Speaker 5: like the undergraduate curriculum. And I just had 442 00:26:02,880 --> 00:26:07,199 Speaker 5: to kind of reach out and develop mentors. And I 443 00:26:07,359 --> 00:26:11,639 Speaker 5: was very lucky in that I had a number of 444 00:26:11,880 --> 00:26:15,640 Speaker 5: individuals who provided me with a tremendous amount of support.
445 00:26:15,960 --> 00:26:17,720 Speaker 5: And I think that's one of the reasons why I 446 00:26:17,800 --> 00:26:21,160 Speaker 5: love teaching so much is to be able to support 447 00:26:21,240 --> 00:26:24,160 Speaker 5: the students and just help them kind of build their 448 00:26:24,200 --> 00:26:27,640 Speaker 5: community and their network so they can excel, and then 449 00:26:27,640 --> 00:26:30,280 Speaker 5: they can reach back and help others excel as well. 450 00:26:31,400 --> 00:26:34,120 Speaker 3: I love that. Justina, same question. What are some key 451 00:26:34,160 --> 00:26:36,280 Speaker 3: challenges that you've faced in your career and how have 452 00:26:36,359 --> 00:26:37,040 Speaker 3: you overcome them? 453 00:26:37,119 --> 00:26:41,639 Speaker 4: Yeah, I'm smiling because what April mentioned is exactly the 454 00:26:41,680 --> 00:26:44,520 Speaker 4: experience I've had. I was one of the only black 455 00:26:44,560 --> 00:26:48,760 Speaker 4: women to graduate from my school's mechanical engineering program, 456 00:26:48,840 --> 00:26:52,879 Speaker 4: and when my children were very young, I also stepped 457 00:26:52,880 --> 00:26:56,200 Speaker 4: away from the workforce for several years to focus on them. 458 00:26:57,119 --> 00:27:00,840 Speaker 4: And I don't think I would be successful today without 459 00:27:00,840 --> 00:27:03,840 Speaker 4: the help of mentors. They're the ones that really helped 460 00:27:03,920 --> 00:27:08,160 Speaker 4: me to be successful, to understand the corporate environment, to 461 00:27:08,240 --> 00:27:12,480 Speaker 4: connect me with other opportunities, and I think it's important 462 00:27:12,480 --> 00:27:16,240 Speaker 4: to me to make myself available to others, and that's 463 00:27:16,520 --> 00:27:18,520 Speaker 4: a really big part of what I do.
I want 464 00:27:18,560 --> 00:27:22,800 Speaker 4: to make myself and my field more representative of the 465 00:27:22,800 --> 00:27:24,440 Speaker 4: work that we do, and I want to make sure 466 00:27:24,520 --> 00:27:28,199 Speaker 4: that I provide access to others and give others the 467 00:27:28,200 --> 00:27:31,479 Speaker 4: same types of opportunities I have and that's why I 468 00:27:31,520 --> 00:27:34,639 Speaker 4: do enjoy leading this type of work at IBM. 469 00:27:35,160 --> 00:27:37,240 Speaker 3: Hear, hear to both of you, giving back to the 470 00:27:37,240 --> 00:27:40,840 Speaker 3: students that we were back in the day. It's so important. Justina, 471 00:27:40,960 --> 00:27:43,720 Speaker 3: IBM has a goal of equipping thirty million learners with 472 00:27:43,800 --> 00:27:46,760 Speaker 3: technology skills by twenty thirty as part of the IBM 473 00:27:46,920 --> 00:27:51,080 Speaker 3: Skills Build programming. Why is this initiative important and how 474 00:27:51,160 --> 00:27:53,280 Speaker 3: exactly is IBM planning to achieve this? 475 00:27:54,040 --> 00:27:56,520 Speaker 4: Yeah, we believe the talent gap is one of the 476 00:27:56,520 --> 00:28:00,480 Speaker 4: biggest challenges that we face in society today. So AI 477 00:28:00,920 --> 00:28:04,240 Speaker 4: of course is accelerating this movement and there's more of 478 00:28:04,280 --> 00:28:07,760 Speaker 4: a sense of urgency.
However, we know that there is 479 00:28:07,800 --> 00:28:11,199 Speaker 4: a significant talent gap and that there are many people 480 00:28:11,400 --> 00:28:14,240 Speaker 4: that are disadvantaged, who are not getting access to the 481 00:28:14,320 --> 00:28:17,480 Speaker 4: right opportunities, and that's why we made the commitment to 482 00:28:17,520 --> 00:28:20,640 Speaker 4: skill thirty million people by twenty thirty, and that's why 483 00:28:20,680 --> 00:28:24,200 Speaker 4: we're providing free access to programs like IBM Skills Build 484 00:28:24,880 --> 00:28:28,040 Speaker 4: with over a thousand courses in twenty languages, to make 485 00:28:28,040 --> 00:28:31,439 Speaker 4: them accessible to all and to give others the chance 486 00:28:31,600 --> 00:28:35,440 Speaker 4: to be successful. Last year, we also announced the commitment 487 00:28:35,560 --> 00:28:39,120 Speaker 4: to train two million people in AI over the next 488 00:28:39,160 --> 00:28:43,080 Speaker 4: three years, because again, we understand the importance of AI 489 00:28:43,480 --> 00:28:48,000 Speaker 4: and understanding it to be successful in any job, especially 490 00:28:48,040 --> 00:28:51,560 Speaker 4: an entry level job. So we're continuing to expand our 491 00:28:51,600 --> 00:28:57,000 Speaker 4: AI offerings because we know that it is exacerbating the 492 00:28:57,160 --> 00:29:00,400 Speaker 4: talent gap and we know that these skills will be 493 00:29:00,480 --> 00:29:03,720 Speaker 4: in demand significantly by corporations. 494 00:29:04,440 --> 00:29:07,040 Speaker 3: So, April, Justina just mentioned, you know, all the changes 495 00:29:07,040 --> 00:29:09,280 Speaker 3: that we're seeing in AI. I'm curious what role you 496 00:29:09,320 --> 00:29:12,400 Speaker 3: think educators play in terms of making students aware of 497 00:29:12,440 --> 00:29:15,920 Speaker 3: all these technological and societal changes happening in their fields.
498 00:29:16,480 --> 00:29:19,840 Speaker 5: Yeah, educators are so vital. And one of the things 499 00:29:19,880 --> 00:29:23,680 Speaker 5: that I've noticed is that students who have not engaged 500 00:29:23,720 --> 00:29:27,720 Speaker 5: with the tech have not done so either because an educator, 501 00:29:27,800 --> 00:29:31,280 Speaker 5: a teacher or professor, has told them not to, 502 00:29:31,520 --> 00:29:33,640 Speaker 5: you know, they just say, no, you can't 503 00:29:33,760 --> 00:29:36,200 Speaker 5: use it, or they haven't said anything at all. They 504 00:29:36,200 --> 00:29:39,960 Speaker 5: haven't encouraged them to look into it, to try it. 505 00:29:40,320 --> 00:29:45,320 Speaker 5: And we have to encourage students to become familiar with 506 00:29:45,400 --> 00:29:48,400 Speaker 5: these tools for all the reasons that Justina mentioned in 507 00:29:48,480 --> 00:29:52,360 Speaker 5: terms of what the workforce is demanding, but also if 508 00:29:52,400 --> 00:29:55,960 Speaker 5: we don't provide them with guidance, then there's the real 509 00:29:56,120 --> 00:29:59,800 Speaker 5: chance that they will use them inappropriately, so we have 510 00:29:59,880 --> 00:30:04,360 Speaker 5: to provide them with permission to dive in. We 511 00:30:04,520 --> 00:30:07,600 Speaker 5: have to teach them how to use these tools ethically, 512 00:30:08,200 --> 00:30:12,240 Speaker 5: with integrity, what are the best practices? And again that 513 00:30:12,320 --> 00:30:14,920 Speaker 5: kind of goes back to something I mentioned before, which 514 00:30:14,960 --> 00:30:18,760 Speaker 5: I speak about a lot, is that it requires educators 515 00:30:18,800 --> 00:30:22,320 Speaker 5: to themselves learn about these tools.
And that's one of 516 00:30:22,360 --> 00:30:26,040 Speaker 5: the reasons why I was so appreciative of the trainer 517 00:30:26,120 --> 00:30:30,360 Speaker 5: program because again, we started offering courses at the law school, 518 00:30:30,440 --> 00:30:34,840 Speaker 5: because these courses were provided free of charge, of course, 519 00:30:34,920 --> 00:30:39,120 Speaker 5: to our faculty, so we were able to upskill and 520 00:30:39,240 --> 00:30:42,960 Speaker 5: reskill and then turn around and share that with our students. 521 00:30:43,160 --> 00:30:46,640 Speaker 5: So educators are vital. But I also think that we 522 00:30:46,720 --> 00:30:49,920 Speaker 5: need to make sure we do a better job as 523 00:30:49,920 --> 00:30:54,080 Speaker 5: a society of supporting our educators so that they can 524 00:30:54,120 --> 00:30:57,800 Speaker 5: gain the knowledge and then pay that forward to the students. 525 00:30:58,480 --> 00:31:01,440 Speaker 3: Right, because not everybody's provided the kinds of free resources 526 00:31:01,440 --> 00:31:05,560 Speaker 3: that IBM provides for teachers who really need it. April, 527 00:31:05,720 --> 00:31:08,720 Speaker 3: in what ways has IBM Skills Build changed your perspective 528 00:31:08,800 --> 00:31:11,440 Speaker 3: on the potential of AI in education? 529 00:31:11,720 --> 00:31:14,000 Speaker 5: Well, as far as the potential, it makes it so 530 00:31:14,080 --> 00:31:17,880 Speaker 5: much easier, right? I mean, it lightens the lift for educators. 531 00:31:17,960 --> 00:31:25,120 Speaker 5: If I had to design the AI Foundations class from the ground up, 532 00:31:25,520 --> 00:31:27,600 Speaker 5: there's no way I could have done that.
And if 533 00:31:27,600 --> 00:31:32,120 Speaker 5: we're thinking about exposing students, regardless of their area of 534 00:31:32,200 --> 00:31:38,120 Speaker 5: study, to AI or to technology, those that are experts 535 00:31:38,200 --> 00:31:42,000 Speaker 5: in those particular spaces, they're not going to be able 536 00:31:42,280 --> 00:31:46,640 Speaker 5: to build those courses. So having something like IBM Skills 537 00:31:46,680 --> 00:31:49,960 Speaker 5: Build available so that we can, you know, design a 538 00:31:50,040 --> 00:31:53,160 Speaker 5: course around those modules that are already put together, is 539 00:31:53,280 --> 00:31:57,080 Speaker 5: incredibly helpful. And so it means the potential of providing 540 00:31:57,160 --> 00:32:02,240 Speaker 5: AI education to all students just really increases the possibility, 541 00:32:02,240 --> 00:32:04,040 Speaker 5: which is good for all of us. 542 00:32:05,520 --> 00:32:07,880 Speaker 3: Justina, as you think about your work at IBM, how 543 00:32:07,880 --> 00:32:10,800 Speaker 3: do you balance the need for technological innovation with the 544 00:32:10,800 --> 00:32:14,000 Speaker 3: importance of maintaining a human centered approach in education? 545 00:32:14,560 --> 00:32:19,160 Speaker 4: I really like how April touched on ethics earlier, because 546 00:32:19,240 --> 00:32:22,560 Speaker 4: it is so important that we continue to make sure 547 00:32:22,640 --> 00:32:24,760 Speaker 4: the human is at the center of everything that we 548 00:32:24,800 --> 00:32:28,520 Speaker 4: do and that we are protecting people even as we 549 00:32:28,640 --> 00:32:32,840 Speaker 4: foster innovation with AI. And the way that IBM has 550 00:32:32,920 --> 00:32:37,280 Speaker 4: done that is we've had reasonable policies and guardrails in place 551 00:32:37,800 --> 00:32:41,240 Speaker 4: around everything that we do around AI.
I'm actually a 552 00:32:41,240 --> 00:32:43,600 Speaker 4: part of our AI Ethics Board. We meet on a 553 00:32:43,640 --> 00:32:49,160 Speaker 4: regular basis to discuss cases, to discuss technology, and we 554 00:32:49,240 --> 00:32:53,920 Speaker 4: actually have discussions and make decisions on what is the 555 00:32:54,000 --> 00:32:57,600 Speaker 4: right thing to do, and we are always considering a 556 00:32:57,680 --> 00:33:00,680 Speaker 4: human centered approach. How do we make sure that we 557 00:33:00,720 --> 00:33:03,280 Speaker 4: are protecting people and how do we make sure that 558 00:33:03,880 --> 00:33:06,760 Speaker 4: we have their voice in every decision that we make? 559 00:33:07,280 --> 00:33:11,760 Speaker 4: We have three principles around trust and transparency, and the 560 00:33:11,760 --> 00:33:16,560 Speaker 4: first is the purpose of AI is to augment human intelligence, 561 00:33:16,720 --> 00:33:20,280 Speaker 4: not replace it. The second is that data and insights 562 00:33:20,360 --> 00:33:23,320 Speaker 4: belong to their creators. So with anyone that we work with, 563 00:33:23,760 --> 00:33:27,120 Speaker 4: we make sure that we protect their data and insights, and 564 00:33:27,160 --> 00:33:29,960 Speaker 4: they belong to them, not to us. And 565 00:33:30,000 --> 00:33:35,400 Speaker 4: then any new technology, including any AI products, systems, platforms, 566 00:33:35,840 --> 00:33:40,320 Speaker 4: must be transparent and explainable. So I think it's important 567 00:33:40,320 --> 00:33:42,680 Speaker 4: to have those types of principles in place.
I'm proud 568 00:33:42,680 --> 00:33:45,000 Speaker 4: to be a part of the AI Ethics Board making 569 00:33:45,080 --> 00:33:50,200 Speaker 4: decisions around how AI is deployed, and I think making 570 00:33:50,240 --> 00:33:53,440 Speaker 4: sure that we continue to keep humans, people, at the 571 00:33:53,480 --> 00:33:56,680 Speaker 4: center of every decision we make around innovation is how 572 00:33:56,680 --> 00:33:58,120 Speaker 4: we protect them. 573 00:33:58,560 --> 00:34:01,040 Speaker 3: So we've talked so much about all the changes that 574 00:34:01,080 --> 00:34:03,160 Speaker 3: are happening right now. Justina, I kind of want 575 00:34:03,160 --> 00:34:05,520 Speaker 3: you to put on your, like, you know, future prediction cap. 576 00:34:05,840 --> 00:34:08,520 Speaker 3: What future developments do you anticipate in the realm of 577 00:34:08,560 --> 00:34:09,360 Speaker 3: open education? 578 00:34:10,239 --> 00:34:12,960 Speaker 4: I think that, and I've been in education a very 579 00:34:13,000 --> 00:34:16,200 Speaker 4: long time, and I remember us talking about personalized learning 580 00:34:17,080 --> 00:34:20,680 Speaker 4: maybe ten, fifteen years ago, and I'm not sure 581 00:34:21,200 --> 00:34:24,759 Speaker 4: it ever came to fruition in the way that we imagined. 582 00:34:25,360 --> 00:34:29,360 Speaker 4: And we know that the teacher will always be the guide. 583 00:34:29,440 --> 00:34:31,719 Speaker 4: They will always be the one that's needed. I don't 584 00:34:31,719 --> 00:34:35,439 Speaker 4: think any technology will ever replace teachers, but I think 585 00:34:35,480 --> 00:34:39,600 Speaker 4: what AI can do is enhance that experience by really 586 00:34:39,640 --> 00:34:45,720 Speaker 4: creating personalized learning content and experiences in the education space.
587 00:34:46,160 --> 00:34:47,880 Speaker 4: I think that is one of the things that I 588 00:34:47,880 --> 00:34:51,080 Speaker 4: would say should be something we see in the very 589 00:34:51,120 --> 00:34:54,280 Speaker 4: near future around the acceleration of AI. 590 00:34:55,440 --> 00:34:58,000 Speaker 3: April, you've done so much elegant work teaching your students 591 00:34:58,040 --> 00:35:01,480 Speaker 3: about AI and technology. I'm curious what advice you have 592 00:35:01,600 --> 00:35:04,840 Speaker 3: for other educators and technologists looking to advocate for a 593 00:35:04,920 --> 00:35:08,080 Speaker 3: skills first approach or more AI training for their students. 594 00:35:08,200 --> 00:35:10,320 Speaker 3: What advice would you have for them? 595 00:35:10,520 --> 00:35:13,040 Speaker 5: The first piece of advice that I always give is 596 00:35:13,120 --> 00:35:16,759 Speaker 5: don't feel overwhelmed, because you can. I mean, there's a 597 00:35:16,800 --> 00:35:19,279 Speaker 5: lot going on. It's hard to keep up with how 598 00:35:19,320 --> 00:35:21,400 Speaker 5: fast things are moving, even for those of us that 599 00:35:21,800 --> 00:35:24,760 Speaker 5: love this space. You don't have to do everything at once, 600 00:35:25,080 --> 00:35:28,680 Speaker 5: just, you know, baby steps, and that's absolutely fine. 601 00:35:28,960 --> 00:35:29,200 Speaker 2: Thank you. 602 00:35:29,280 --> 00:35:30,759 Speaker 3: As a professor, I have to say I needed to 603 00:35:30,800 --> 00:35:33,560 Speaker 3: hear that, so giving myself grace, taking that one to heart. 604 00:35:33,960 --> 00:35:37,600 Speaker 5: In fact, I have in my PowerPoint presentation the first 605 00:35:37,600 --> 00:35:39,880 Speaker 5: slide I put up is of a turtle and it 606 00:35:39,920 --> 00:35:42,239 Speaker 5: says slow your roll.
And it's like, I'm going to 607 00:35:42,280 --> 00:35:44,279 Speaker 5: be talking about a lot of things, but I want 608 00:35:44,320 --> 00:35:47,440 Speaker 5: you to remember this slide, just slow your roll. It's okay. 609 00:35:47,760 --> 00:35:51,640 Speaker 5: The other thing that I encourage professors to do is 610 00:35:51,680 --> 00:35:55,120 Speaker 5: to join an educator community group. And there are a 611 00:35:55,280 --> 00:35:58,200 Speaker 5: lot that have popped up as a result of gen 612 00:35:58,280 --> 00:36:01,920 Speaker 5: AI and the disruption we're seeing just in the education space, 613 00:36:02,200 --> 00:36:06,759 Speaker 5: and so how can we crowdsource our advice? Without a doubt, 614 00:36:06,800 --> 00:36:09,319 Speaker 5: if you're thinking about a particular assignment and how you 615 00:36:09,440 --> 00:36:14,040 Speaker 5: might use gen AI in crafting that assignment or incorporating it 616 00:36:14,719 --> 00:36:17,919 Speaker 5: in the assessment, there is a professor out there who 617 00:36:18,400 --> 00:36:21,839 Speaker 5: has either already done it or they're also thinking about it. 618 00:36:21,920 --> 00:36:25,480 Speaker 5: So, you know, let's be more collaborative. And I will 619 00:36:25,480 --> 00:36:30,120 Speaker 5: say that's been really wonderful for me as a law professor, 620 00:36:30,600 --> 00:36:35,880 Speaker 5: being able to collaborate with professors from other disciplines.
And 621 00:36:35,960 --> 00:36:39,040 Speaker 5: the last thing that I would say, you know, sometimes 622 00:36:39,080 --> 00:36:42,759 Speaker 5: it can be hard to convince your colleagues within your 623 00:36:42,800 --> 00:36:48,000 Speaker 5: institution to be progressive, and if you can bring an 624 00:36:48,080 --> 00:36:51,400 Speaker 5: outside speaker to come in and kind of just share 625 00:36:51,400 --> 00:36:55,719 Speaker 5: what's going on, that can oftentimes get people moving even 626 00:36:55,760 --> 00:36:58,480 Speaker 5: if you within the building aren't able to get that 627 00:36:58,560 --> 00:37:01,440 Speaker 5: same traction. So those are kind of the three pieces 628 00:37:01,440 --> 00:37:03,719 Speaker 5: of advice that I'll typically give professors. 629 00:37:04,840 --> 00:37:07,160 Speaker 3: So this has been a fabulous conversation, and we are 630 00:37:07,200 --> 00:37:10,080 Speaker 3: reaching the end of our time. But before we wrap, 631 00:37:10,360 --> 00:37:15,280 Speaker 3: let's do a speed round. Ready? First question, April first, 632 00:37:15,640 --> 00:37:19,560 Speaker 3: complete this sentence. In five years, AI will blank. 633 00:37:20,719 --> 00:37:27,239 Speaker 5: In five years, AI will be more fully leveraged to 634 00:37:27,360 --> 00:37:32,200 Speaker 5: help lawyers better serve their clients more efficiently and will 635 00:37:32,239 --> 00:37:34,800 Speaker 5: help close the access to justice gap. 636 00:37:36,160 --> 00:37:38,680 Speaker 3: Nice. Justina, same question. 637 00:37:39,160 --> 00:37:43,960 Speaker 4: In five years, AI will have disrupted every industry and 638 00:37:44,080 --> 00:37:47,480 Speaker 4: there will have been significant advancements made in education and 639 00:37:47,520 --> 00:37:49,399 Speaker 4: sustainability with the use of AI.
What is the number 641 00:37:53,600 --> 00:37:56,400 Speaker 3: one thing that people misunderstand about AI? 642 00:37:56,840 --> 00:38:00,600 Speaker 4: Justina, you first. The number one misunderstanding about AI 643 00:38:00,719 --> 00:38:03,680 Speaker 4: is that it's going to destroy everyone's jobs. I think 644 00:38:03,719 --> 00:38:08,120 Speaker 4: that people with AI skills or understanding of AI will 645 00:38:08,120 --> 00:38:09,840 Speaker 4: have some advantages in the workplace. 646 00:38:10,480 --> 00:38:16,080 Speaker 5: April, the number one thing people misunderstand about AI is 647 00:38:16,120 --> 00:38:23,040 Speaker 5: that only computer scientists or mathematicians or engineers can understand it. 648 00:38:23,719 --> 00:38:26,960 Speaker 5: You can gain an understanding again through baby steps, and 649 00:38:27,000 --> 00:38:31,759 Speaker 5: there are so many resources available. If you explore the 650 00:38:31,840 --> 00:38:35,080 Speaker 5: information in bite sized pieces, you can begin to wrap 651 00:38:35,080 --> 00:38:35,799 Speaker 5: your head around it. 652 00:38:36,640 --> 00:38:39,560 Speaker 3: Okay, next speed round question, what advice would you give 653 00:38:39,600 --> 00:38:42,960 Speaker 3: yourself ten years ago to better prepare you for today? 654 00:38:43,239 --> 00:38:44,480 Speaker 3: Justina, you first. 655 00:38:44,760 --> 00:38:47,760 Speaker 4: The advice I would give myself ten years ago is 656 00:38:47,800 --> 00:38:53,120 Speaker 4: to continue learning. I always love understanding technology. I always 657 00:38:53,600 --> 00:38:56,719 Speaker 4: dove deep into whether it's machine learning or four G 658 00:38:56,800 --> 00:39:01,320 Speaker 4: and five G technologies.
Understanding AI and hybrid cloud today 659 00:39:01,440 --> 00:39:04,719 Speaker 4: is something that I also enjoy doing, so I would 660 00:39:04,760 --> 00:39:09,960 Speaker 4: say continue learning, continue diving into these technologies, continue understanding 661 00:39:10,000 --> 00:39:13,600 Speaker 4: what it means for you and your future career. April? 662 00:39:14,680 --> 00:39:19,200 Speaker 5: Be more interdisciplinary, so stay current with the evolution of 663 00:39:19,239 --> 00:39:24,839 Speaker 5: computer science, but also incorporate the study of data and 664 00:39:24,960 --> 00:39:30,000 Speaker 5: ethics and sociology, because the challenges that are posed by AI, 665 00:39:30,520 --> 00:39:35,319 Speaker 5: they're multifaceted, and you have to have an understanding in 666 00:39:35,400 --> 00:39:39,680 Speaker 5: these areas to really address the promise and the challenges 667 00:39:39,680 --> 00:39:40,080 Speaker 5: of AI. 668 00:39:40,920 --> 00:39:44,360 Speaker 3: Final speed round question: how are you already using AI 669 00:39:44,520 --> 00:39:46,680 Speaker 3: in your day to day life today? April? 670 00:39:47,400 --> 00:39:50,200 Speaker 5: So, I use it in my teaching. The other way 671 00:39:50,280 --> 00:39:52,520 Speaker 5: that I plan on using it in the future is 672 00:39:52,640 --> 00:39:56,040 Speaker 5: surveying the students and then using the data analysis tool 673 00:39:56,480 --> 00:40:00,439 Speaker 5: to help me gather that information and figure out how 674 00:40:00,520 --> 00:40:04,080 Speaker 5: best to address the information that I've received from my students. 675 00:40:04,680 --> 00:40:06,719 Speaker 3: Nice. Justina? Yeah.
676 00:40:06,480 --> 00:40:09,560 Speaker 4: So the way that we're using AI today is to 677 00:40:09,600 --> 00:40:15,040 Speaker 4: actually analyze complex and large data sets in our sustainability 678 00:40:15,040 --> 00:40:19,600 Speaker 4: work to provide insights to some of our partners on 679 00:40:19,760 --> 00:40:24,279 Speaker 4: how they can increase crop yield, for example, or how 680 00:40:24,320 --> 00:40:27,920 Speaker 4: they can deliver clean energy solutions to rural areas. So 681 00:40:28,360 --> 00:40:31,399 Speaker 4: we're actively using it in the programs that we have 682 00:40:31,920 --> 00:40:36,359 Speaker 4: within our corporate social responsibility portfolio and also integrating it 683 00:40:36,400 --> 00:40:38,000 Speaker 4: into our SkillsBuild platform. 684 00:40:39,239 --> 00:40:41,440 Speaker 3: Well, thank you both so much. You did excellently in 685 00:40:41,520 --> 00:40:44,239 Speaker 3: the speed round, but it was just so fabulous to 686 00:40:44,280 --> 00:40:45,840 Speaker 3: talk to you both today. I think this is a 687 00:40:45,880 --> 00:40:49,400 Speaker 3: time of so many exciting challenges in the field of education, 688 00:40:49,520 --> 00:40:52,120 Speaker 3: and it was fabulous to hear more about how AI 689 00:40:52,200 --> 00:40:54,799 Speaker 3: and IBM SkillsBuild and so many technologies can help 690 00:40:54,880 --> 00:40:57,160 Speaker 3: us out. Thank you both so much for this fun conversation. 691 00:40:57,440 --> 00:40:59,120 Speaker 4: Thank you for having us. It was great. 692 00:40:59,400 --> 00:41:01,120 Speaker 5: Yes, thank you. Thank you. 693 00:41:03,960 --> 00:41:07,880 Speaker 2: What an insightful conversation with Justina and April. This discussion 694 00:41:08,000 --> 00:41:12,759 Speaker 2: demonstrated how technology and education can intersect to create a 695 00:41:12,840 --> 00:41:17,800 Speaker 2: meaningful impact in today's educational landscape.
Students must utilize AI 696 00:41:17,960 --> 00:41:21,239 Speaker 2: in the classroom in order to prepare for the modern workforce, 697 00:41:21,760 --> 00:41:25,799 Speaker 2: and educators must use the technology, including IBM SkillsBuild, 698 00:41:26,160 --> 00:41:31,399 Speaker 2: to train students for the complexities of tomorrow's challenges. As 699 00:41:31,440 --> 00:41:35,360 Speaker 2: April and Justina emphasized, impact starts by centering the humans 700 00:41:35,480 --> 00:41:39,520 Speaker 2: using the tool. Ensuring they're empowered to access, adopt, and 701 00:41:39,600 --> 00:41:43,319 Speaker 2: excel with the technology is just as critical as the 702 00:41:43,360 --> 00:41:47,719 Speaker 2: power of the tool itself. Justina and April's work is 703 00:41:47,760 --> 00:41:51,360 Speaker 2: a powerful reminder that as we continue to integrate AI 704 00:41:51,480 --> 00:41:56,080 Speaker 2: technology into our educational systems, we have the opportunity to 705 00:41:56,160 --> 00:42:01,000 Speaker 2: create more equitable and accessible learning environments. It's clear that 706 00:42:01,040 --> 00:42:04,160 Speaker 2: the future of learning and technology is bright, and the 707 00:42:04,200 --> 00:42:10,720 Speaker 2: adoption of AI is crucial in shaping that future. Smart 708 00:42:10,719 --> 00:42:13,960 Speaker 2: Talks with IBM is produced by Matt Romano, Joey Fishground, 709 00:42:14,040 --> 00:42:18,520 Speaker 2: Amy Gaines McQuaid, and Jacob Goldstein, and edited by Lydia 710 00:42:18,719 --> 00:42:22,800 Speaker 2: Jean Kott. Our engineers are Sarah Bruguiere and Ben Tolliday. 711 00:42:23,200 --> 00:42:26,480 Speaker 2: Theme song by Gramoscope. Special thanks to the 8 Bar 712 00:42:26,520 --> 00:42:29,720 Speaker 2: and IBM teams, as well as the Pushkin marketing team.
713 00:42:29,920 --> 00:42:33,080 Speaker 2: Smart Talks with IBM is a production of Pushkin Industries 714 00:42:33,320 --> 00:42:38,160 Speaker 2: and Ruby Studio at iHeartMedia. To find more Pushkin podcasts, 715 00:42:38,440 --> 00:42:43,400 Speaker 2: listen on the iHeartRadio app, Apple Podcasts, or wherever you 716 00:42:43,520 --> 00:42:48,120 Speaker 2: listen to podcasts. I'm Malcolm Gladwell. This is a paid 717 00:42:48,160 --> 00:42:52,960 Speaker 2: advertisement from IBM. The conversations on this podcast don't necessarily 718 00:42:52,960 --> 00:42:56,920 Speaker 2: represent IBM's positions, strategies, or opinions.