1 00:00:00,200 --> 00:00:04,400 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. This season 2 00:00:04,400 --> 00:00:08,000 Speaker 1: on Smart Talks with IBM, Malcolm Gladwell is back, and 3 00:00:08,000 --> 00:00:10,680 Speaker 1: this time he's taking the show on the road. Malcolm 4 00:00:10,720 --> 00:00:14,800 Speaker 1: is stepping outside the studio to explore how IBM clients 5 00:00:14,840 --> 00:00:18,880 Speaker 1: are using artificial intelligence to solve real-world challenges and 6 00:00:18,960 --> 00:00:23,080 Speaker 1: transform the way they do business, from accelerating scientific breakthroughs 7 00:00:23,160 --> 00:00:27,600 Speaker 1: to reimagining education. It's a fresh look at innovation in action, 8 00:00:28,080 --> 00:00:31,760 Speaker 1: where big ideas meet cutting-edge solutions. You'll hear from 9 00:00:31,800 --> 00:00:36,080 Speaker 1: industry leaders, creative thinkers, and of course, Malcolm Gladwell himself 10 00:00:36,360 --> 00:00:40,000 Speaker 1: as he guides you through each story. New episodes of 11 00:00:40,040 --> 00:00:43,360 Speaker 1: Smart Talks with IBM drop every month on the iHeartRadio app, 12 00:00:43,560 --> 00:00:47,320 Speaker 1: Apple Podcasts, or wherever you get your podcasts. Learn more 13 00:00:47,400 --> 00:00:50,000 Speaker 1: at IBM dot com slash Smart Talks. 14 00:00:51,960 --> 00:00:54,640 Speaker 2: In the world of educational research, there is a famous 15 00:00:54,720 --> 00:00:58,120 Speaker 2: video of a boy named Sean. I don't mean famous 16 00:00:58,120 --> 00:01:00,360 Speaker 2: in the sense that it has a million views on YouTube. 17 00:01:00,680 --> 00:01:02,639 Speaker 2: I mean that in the circle of people who think 18 00:01:02,680 --> 00:01:06,200 Speaker 2: about teaching and how to make teaching better, the video 19 00:01:06,240 --> 00:01:09,760 Speaker 2: has been written about in journal articles and shown over 20 00:01:09,800 --> 00:01:13,360 Speaker 2: and over again in college classrooms. It's a ten-minute clip 21 00:01:13,800 --> 00:01:17,080 Speaker 2: of a third-grade class somewhere in Michigan. It was 22 00:01:17,080 --> 00:01:19,840 Speaker 2: filmed in January of nineteen ninety, so the video is 23 00:01:19,880 --> 00:01:23,520 Speaker 2: a bit grainy. The teacher's name is Deborah Loewenberg Ball. 24 00:01:23,880 --> 00:01:26,880 Speaker 2: She's a professor at Michigan State University who, as part 25 00:01:26,920 --> 00:01:29,679 Speaker 2: of her research, teaches a one-hour math class at 26 00:01:29,680 --> 00:01:33,240 Speaker 2: a local elementary school. On the day in question, Miss 27 00:01:33,319 --> 00:01:36,960 Speaker 2: Ball begins by asking her students about the previous day's lesson, 28 00:01:37,360 --> 00:01:40,120 Speaker 2: which was about even and odd numbers. 29 00:01:40,280 --> 00:01:42,039 Speaker 3: I would like to hear from as many people as 30 00:01:42,080 --> 00:01:45,039 Speaker 3: possible what comments you had, reactions you had to being 31 00:01:45,080 --> 00:01:46,120 Speaker 3: in that meeting yesterday. 32 00:01:47,000 --> 00:01:49,840 Speaker 2: A little boy with black hair raises his hand. His 33 00:01:49,960 --> 00:01:50,559 Speaker 2: name is Sean. 34 00:01:51,320 --> 00:01:54,800 Speaker 4: I don't have anything about the meeting yesterday. I 35 00:01:55,000 --> 00:01:56,680 Speaker 4: was just thinking about six, um. 36 00:01:56,880 --> 00:01:59,040 Speaker 2: Sean was thinking about the number six.
37 00:02:00,280 --> 00:02:02,440 Speaker 4: I was thinking that it's a... it's an even number. It 38 00:02:02,440 --> 00:02:04,480 Speaker 4: can be an odd number, too, because there could 39 00:02:04,280 --> 00:02:13,680 Speaker 5: be two, two, four, six... two, three twos and two threes. 40 00:02:13,760 --> 00:02:14,680 Speaker 4: It will be an odd. 41 00:02:14,720 --> 00:02:22,040 Speaker 5: And, um, three twos make it, and it takes two threes. 42 00:02:21,639 --> 00:02:25,360 Speaker 2: Now, Sean doesn't understand what odd and even mean. He 43 00:02:25,440 --> 00:02:27,720 Speaker 2: thinks that just because you can break down six into 44 00:02:27,800 --> 00:02:31,440 Speaker 2: an odd number of parts and an even number of parts, 45 00:02:31,480 --> 00:02:35,560 Speaker 2: six must exist in some magical middle category. And 46 00:02:35,600 --> 00:02:38,360 Speaker 2: when you listen to the Sean videotape, you keep waiting 47 00:02:38,360 --> 00:02:41,640 Speaker 2: for the teacher to say, oh, no, Sean, you misunderstand. 48 00:02:42,240 --> 00:02:46,840 Speaker 2: But Deborah Ball doesn't do that. She never tells him he's wrong. Instead, 49 00:02:47,280 --> 00:02:50,360 Speaker 2: she simply asks him to explain his thinking. 50 00:02:50,720 --> 00:02:52,520 Speaker 6: And the two things that you put together to make 51 00:02:52,520 --> 00:02:55,440 Speaker 6: it were odd, right? Three and three are each odd. 52 00:02:56,440 --> 00:02:57,079 Speaker 4: And I think... 53 00:02:59,160 --> 00:03:03,440 Speaker 2: Ball then asked the class to give their views. 54 00:03:03,800 --> 00:03:06,960 Speaker 2: Other students jump up and explain their theories on the blackboard. 55 00:03:07,160 --> 00:03:10,239 Speaker 2: For the next fifteen minutes, she deftly guides the class 56 00:03:10,280 --> 00:03:13,440 Speaker 2: through an in-depth investigation of what she calls Sean 57 00:03:13,600 --> 00:03:17,680 Speaker 2: numbers, until Sean himself realizes that the real meaning of 58 00:03:17,680 --> 00:03:20,440 Speaker 2: odd and even is something different than he had imagined. 59 00:03:20,960 --> 00:03:22,680 Speaker 2: And now he gets it. 60 00:03:23,120 --> 00:03:25,760 Speaker 4: Um, thank you for bringing it up. 61 00:03:26,680 --> 00:03:29,160 Speaker 2: I don't want to focus just on how little Sean 62 00:03:29,520 --> 00:03:32,040 Speaker 2: finally made his own way to the right answer. I'm 63 00:03:32,080 --> 00:03:34,359 Speaker 2: interested in what his teacher did to get him there. 64 00:03:35,120 --> 00:03:39,080 Speaker 2: Deborah Ball worked magic. She never told Sean the right answer. 65 00:03:39,720 --> 00:03:41,720 Speaker 2: She just led him to a place where he could 66 00:03:41,800 --> 00:03:46,680 Speaker 2: discover it for himself. My name is Malcolm Gladwell. This 67 00:03:46,800 --> 00:03:49,640 Speaker 2: is season six of Smart Talks with IBM, where we 68 00:03:49,680 --> 00:03:52,280 Speaker 2: offer our listeners a glimpse behind the curtain of the 69 00:03:52,280 --> 00:03:57,560 Speaker 2: world of technology and artificial intelligence. In this season, we're 70 00:03:57,560 --> 00:04:00,760 Speaker 2: going to visit companies as varied as L'Oréal and Ferrari 71 00:04:01,000 --> 00:04:04,880 Speaker 2: and tell stories of how they're using artificial intelligence and 72 00:04:05,040 --> 00:04:09,360 Speaker 2: data to transform the way they do business.
This episode 73 00:04:09,720 --> 00:04:12,080 Speaker 2: is about the promise of a radical new idea called 74 00:04:12,120 --> 00:04:15,480 Speaker 2: responsive teaching, the kind of teaching that took place that 75 00:04:15,600 --> 00:04:19,839 Speaker 2: day in Sean's classroom, and whether artificial intelligence can help 76 00:04:19,920 --> 00:04:22,760 Speaker 2: us train the next generation of teachers to be as 77 00:04:22,760 --> 00:04:31,200 Speaker 2: good as Deborah Ball. Before we talk about how AI 78 00:04:31,279 --> 00:04:33,960 Speaker 2: could transform the way we train teachers, I want to 79 00:04:34,000 --> 00:04:36,480 Speaker 2: go back for a moment to the famous video of Sean. 80 00:04:37,360 --> 00:04:40,680 Speaker 2: In the video, the teacher, Deborah Ball, doesn't have a 81 00:04:40,720 --> 00:04:45,200 Speaker 2: predetermined plan that she's imposing on the class. First, she's improvising, 82 00:04:45,920 --> 00:04:49,320 Speaker 2: making up her approach as she goes along, responding to 83 00:04:49,360 --> 00:04:53,600 Speaker 2: her student's odd theory about the number six. Second, she's 84 00:04:53,640 --> 00:04:59,080 Speaker 2: taking Sean seriously. She's not dismissing his theory. She's listening 85 00:04:59,080 --> 00:05:01,440 Speaker 2: to him and trying to understand the problem from 86 00:05:01,560 --> 00:05:06,400 Speaker 2: his perspective. And thirdly, and most importantly, she's not force 87 00:05:06,480 --> 00:05:10,520 Speaker 2: feeding him the right answer. She's being patient. She's waiting 88 00:05:10,520 --> 00:05:13,560 Speaker 2: to see if, with just the right subtle hints, he 89 00:05:13,640 --> 00:05:17,400 Speaker 2: can get to the right answer on his own. Improvisation, 90 00:05:18,040 --> 00:05:22,080 Speaker 2: empathy, patience. That's responsive teaching. 91 00:05:22,360 --> 00:05:25,080 Speaker 7: What I think about in terms of responsiveness is more like, 92 00:05:26,160 --> 00:05:28,760 Speaker 7: I think that students need to have a sense of 93 00:05:29,760 --> 00:05:34,120 Speaker 7: agency in what happens in the classroom, and like authentic 94 00:05:34,240 --> 00:05:39,360 Speaker 7: agency, where they can be legitimized as knowers. 95 00:05:40,080 --> 00:05:43,480 Speaker 2: I spoke to a physicist at Seattle Pacific University named 96 00:05:43,520 --> 00:05:47,920 Speaker 2: Amy Robertson, a longtime advocate for responsive teaching. She uses 97 00:05:47,960 --> 00:05:49,560 Speaker 2: the Sean video in her classroom. 98 00:05:49,760 --> 00:05:52,000 Speaker 7: You have to trust that kids have a way of 99 00:05:52,040 --> 00:05:55,240 Speaker 7: doing that, and that, like, her... what she mostly did 100 00:05:55,320 --> 00:05:57,919 Speaker 7: was to facilitate a conversation. And to do that, you have 101 00:05:57,960 --> 00:05:59,239 Speaker 7: to listen to them talk. 102 00:06:00,080 --> 00:06:02,599 Speaker 2: She never told him he was wrong. That's right. And then he goes, 103 00:06:02,920 --> 00:06:05,640 Speaker 2: he goes, I didn't think of it that way. 104 00:06:05,800 --> 00:06:09,160 Speaker 4: Again, I thank you for bringing it up. 105 00:06:09,560 --> 00:06:13,440 Speaker 2: You've expanded my understanding. Thank you for bringing it up again. 106 00:06:13,480 --> 00:06:17,159 Speaker 2: It's like... ah, I love it. I know.
107 00:06:18,000 --> 00:06:20,360 Speaker 7: Responsive teaching, as I think about it, is kind of 108 00:06:20,560 --> 00:06:24,320 Speaker 7: rooted in, like, Eleanor Duckworth's work around The Having 109 00:06:24,320 --> 00:06:27,080 Speaker 7: of Wonderful Ideas, where she says, like, the goal of 110 00:06:27,279 --> 00:06:30,760 Speaker 7: education is for students to have wonderful ideas and to 111 00:06:30,800 --> 00:06:32,040 Speaker 7: have a good time having them. 112 00:06:32,640 --> 00:06:35,560 Speaker 2: I love that. I've never heard that. What a beautiful, 113 00:06:36,160 --> 00:06:39,679 Speaker 2: succinct way of summing up the purpose of education. Yes, 114 00:06:40,600 --> 00:06:44,320 Speaker 2: responsive teaching is beautiful. It's rare to find a new 115 00:06:44,320 --> 00:06:48,000 Speaker 2: teaching idea that everyone loves. This is one of those 116 00:06:48,080 --> 00:06:51,240 Speaker 2: rare ideas. Watching the Deborah Ball classroom, all I could 117 00:06:51,240 --> 00:06:53,960 Speaker 2: think was, I really, really hope my daughters get to 118 00:06:54,000 --> 00:06:57,240 Speaker 2: experience a math class like that. Far too many kids 119 00:06:57,440 --> 00:07:00,600 Speaker 2: are convincing themselves at far too young an age that 120 00:07:00,680 --> 00:07:04,479 Speaker 2: math isn't for them, and responsive teaching is a way 121 00:07:04,520 --> 00:07:08,760 Speaker 2: to solve that problem. But here is the issue. It's 122 00:07:09,040 --> 00:07:12,920 Speaker 2: really, really hard to teach responsive teaching. Robertson says that 123 00:07:13,000 --> 00:07:16,120 Speaker 2: teaching exists in a cultural environment where the teacher is 124 00:07:16,200 --> 00:07:19,280 Speaker 2: expected to be the source of truth, where teaching is 125 00:07:19,280 --> 00:07:22,520 Speaker 2: about the immediate correction of error and not letting a 126 00:07:22,600 --> 00:07:27,320 Speaker 2: child wander down the pathway of their own misunderstanding. Responsive 127 00:07:27,360 --> 00:07:31,679 Speaker 2: teaching is deeply counterintuitive, and the only way to understand 128 00:07:31,720 --> 00:07:35,320 Speaker 2: its beauty is to do it over and over again. 129 00:07:36,200 --> 00:07:41,120 Speaker 2: Aspiring teachers need a way to practice. For as long 130 00:07:41,160 --> 00:07:44,040 Speaker 2: as there has been technology, people have turned to digital 131 00:07:44,080 --> 00:07:48,080 Speaker 2: machines to solve problems. My father was a mathematician, and 132 00:07:48,120 --> 00:07:50,720 Speaker 2: I remember him coming home in the nineteen seventies with 133 00:07:50,800 --> 00:07:53,840 Speaker 2: a big stack of computer cards in his briefcase that 134 00:07:53,920 --> 00:07:57,400 Speaker 2: he used to program the mainframe back at the office. Today, 135 00:07:57,680 --> 00:08:01,840 Speaker 2: with the rise of artificial intelligence, the scale and complexity 136 00:08:02,080 --> 00:08:05,280 Speaker 2: of the problems technology can help us solve has jumped 137 00:08:05,280 --> 00:08:08,520 Speaker 2: by many orders of magnitude. You must have worked with, 138 00:08:08,520 --> 00:08:12,040 Speaker 2: what, a million customers who are experimenting with 139 00:08:12,200 --> 00:08:15,400 Speaker 2: LLMs. Has there been one use case that you were like, whoa, 140 00:08:15,440 --> 00:08:19,280 Speaker 2: I had no idea? Or just simply, that's clever?
I'm 141 00:08:19,320 --> 00:08:23,040 Speaker 2: speaking to Brian Bissell, who works out of IBM's Manhattan office. 142 00:08:23,440 --> 00:08:26,640 Speaker 2: He helps IBM customers discover how best to get AI 143 00:08:26,760 --> 00:08:27,640 Speaker 2: to work for them. 144 00:08:28,360 --> 00:08:30,000 Speaker 8: There is one, but I don't think I can talk 145 00:08:30,040 --> 00:08:30,920 Speaker 8: about it, unfortunately. 146 00:08:31,200 --> 00:08:33,439 Speaker 2: Wait, wait, you can't tease me like that, can you? 147 00:08:34,640 --> 00:08:34,880 Speaker 4: Wait. 148 00:08:35,640 --> 00:08:38,760 Speaker 2: Disguise it, disguise it for me. Just give me the general idea. 149 00:08:39,040 --> 00:08:40,760 Speaker 8: It was about the ability to pull certain types of 150 00:08:40,840 --> 00:08:44,520 Speaker 8: information out of documents that you wouldn't think you 151 00:08:44,559 --> 00:08:47,880 Speaker 8: would be able to get the model to do, and 152 00:08:48,040 --> 00:08:51,040 Speaker 8: be able to do that at a very large scale. 153 00:08:51,360 --> 00:08:54,000 Speaker 2: Bissell's point was that we are well past the stage 154 00:08:54,000 --> 00:08:57,560 Speaker 2: where anyone wonders whether AI can be useful. The real 155 00:08:57,640 --> 00:09:00,679 Speaker 2: question now is what problems we want to use 156 00:09:00,720 --> 00:09:03,280 Speaker 2: it to solve, where it can make the biggest difference, 157 00:09:03,960 --> 00:09:07,360 Speaker 2: and Bissell saw lots of opportunities in education. 158 00:09:08,280 --> 00:09:11,400 Speaker 8: I have two kids, one in middle school and one 159 00:09:11,400 --> 00:09:14,720 Speaker 8: who just graduated high school, and I'm well aware of 160 00:09:14,760 --> 00:09:18,400 Speaker 8: students using things like ChatGPT to do their homework, 161 00:09:18,480 --> 00:09:24,640 Speaker 8: and it's very easy to take tools like that and 162 00:09:24,679 --> 00:09:28,360 Speaker 8: even IBM's own large language models and just take a problem, 163 00:09:28,720 --> 00:09:32,440 Speaker 8: a piece of homework, something you want written, and drop 164 00:09:32,480 --> 00:09:34,360 Speaker 8: it into that and have it generate the answer for 165 00:09:34,440 --> 00:09:38,800 Speaker 8: you. And the student, the user in that case, hasn't 166 00:09:38,800 --> 00:09:41,280 Speaker 8: done any work; they haven't put any real thought into it. 167 00:09:41,840 --> 00:09:45,760 Speaker 2: To Bissell, that's the wrong use of AI. That's technology 168 00:09:46,000 --> 00:09:49,840 Speaker 2: making us dumber. What we really want is technology that 169 00:09:49,920 --> 00:09:52,839 Speaker 2: makes us smarter. Bissell explained to me that there are 170 00:09:52,840 --> 00:09:57,079 Speaker 2: now two big tools being used for AI productivity: AI agents 171 00:09:57,559 --> 00:10:01,880 Speaker 2: and AI assistants. Let's start with the AI agents. AI 172 00:10:01,960 --> 00:10:05,920 Speaker 2: agents can reason, plan, and collaborate with other AI tools 173 00:10:06,200 --> 00:10:10,200 Speaker 2: to autonomously perform tasks for a user. Bissell gave 174 00:10:10,240 --> 00:10:12,720 Speaker 2: me an example of how a college freshman might use an 175 00:10:12,720 --> 00:10:14,080 Speaker 2: AI agent. 176 00:10:13,960 --> 00:10:16,360 Speaker 8: As a new student, you may not know, how do I 177 00:10:16,400 --> 00:10:18,360 Speaker 8: deal with my health and wellness issue,
or how many 178 00:10:18,400 --> 00:10:20,960 Speaker 8: credits am I going to get for this given class? You 179 00:10:21,040 --> 00:10:24,400 Speaker 8: could talk to someone and find out some of that, 180 00:10:25,400 --> 00:10:26,959 Speaker 8: but maybe it's a little bit sensitive and you don't 181 00:10:27,000 --> 00:10:27,520 Speaker 8: want to do that. 182 00:10:28,040 --> 00:10:31,200 Speaker 2: Bissell told me you could build an AI agent, a 183 00:10:31,280 --> 00:10:34,720 Speaker 2: resource for new students that helps them navigate a new campus, 184 00:10:35,040 --> 00:10:38,240 Speaker 2: register for classes, access the services they need, and even 185 00:10:38,280 --> 00:10:42,079 Speaker 2: schedule appointments on their behalf, which in turn buys them 186 00:10:42,120 --> 00:10:44,440 Speaker 2: more time to focus on their actual schoolwork. 187 00:10:44,760 --> 00:10:49,680 Speaker 8: We can see patterns of how agents and assistants can 188 00:10:49,760 --> 00:10:54,160 Speaker 8: help employees and customers and end users be more productive, 189 00:10:54,360 --> 00:10:58,240 Speaker 8: automate workflows so they're not doing certain types of repetitive 190 00:10:58,280 --> 00:11:02,480 Speaker 8: work over and over again, and streamline their lives by 191 00:11:02,520 --> 00:11:06,440 Speaker 8: making data more accessible to them twenty-four hours a day. 192 00:11:06,800 --> 00:11:10,520 Speaker 2: But Bissell says you can also use AI assistants in 193 00:11:10,559 --> 00:11:14,679 Speaker 2: the education space. AI assistants are reactive, as opposed to 194 00:11:14,720 --> 00:11:19,120 Speaker 2: AI agents, which are proactive. AI assistants only perform tasks 195 00:11:19,160 --> 00:11:24,040 Speaker 2: at your request. They're programmed to answer your questions. And, 196 00:11:24,080 --> 00:11:27,200 Speaker 2: as it turns out, AI assistants are now being used 197 00:11:27,520 --> 00:11:31,320 Speaker 2: to further the responsive teaching revolution, which is why I 198 00:11:31,360 --> 00:11:34,840 Speaker 2: found myself on a beautiful Georgia spring day not long ago, 199 00:11:34,960 --> 00:11:38,600 Speaker 2: on the campus of Kennesaw State University, sitting in a 200 00:11:38,600 --> 00:11:42,760 Speaker 2: classroom with two researchers, one of them Professor Dabae Lee. 201 00:11:43,000 --> 00:11:45,439 Speaker 2: Let's go into the journey of building this thing. You 202 00:11:46,040 --> 00:11:48,640 Speaker 2: started by taking a course. What was the course you took? 203 00:11:49,320 --> 00:11:54,319 Speaker 5: Yeah, so it was offered by Coursera. It was designed 204 00:11:54,360 --> 00:11:58,600 Speaker 5: by IBM. It was AI Foundations for Everyone. 205 00:12:01,880 --> 00:12:04,560 Speaker 2: In her AI Foundations course, Lee learned how to build 206 00:12:04,559 --> 00:12:08,280 Speaker 2: an AI assistant using IBM watsonx. That course took 207 00:12:08,280 --> 00:12:10,480 Speaker 2: how long to take? 208 00:12:10,520 --> 00:12:12,760 Speaker 5: Not too long. It was like fourteen weeks.
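To make Bissell's distinction concrete, here is a minimal sketch in Python of a reactive assistant next to a proactive agent. It is an illustration only: the FAQ entries, the plan steps, and every function name are invented for this example, and none of it is an IBM watsonx API.

```python
# Bissell's distinction, in miniature: an assistant is reactive (it answers
# when asked and does nothing else), while an agent is proactive (it plans
# steps toward a goal and carries them out). All names and data here are
# hypothetical, made up for illustration.

FAQ = {
    "health and wellness": "Counseling services are in the student center.",
    "credits": "Most standard lecture courses carry three credit hours.",
}

def assistant_reply(question: str) -> str:
    """Reactive: waits for a question, answers it, and stops."""
    q = question.lower()
    for topic, answer in FAQ.items():
        if topic in q:
            return answer
    return "Sorry, I don't know about that yet."

def agent_run(goal: str) -> list[str]:
    """Proactive: decomposes a goal into steps and performs each one."""
    plans = {
        "settle in on campus": [
            "look up the campus map and building hours",
            "register for classes",
            "schedule an advising appointment",
        ],
    }
    completed = []
    for step in plans.get(goal, []):
        # A real agent would call out to other tools or services here.
        completed.append(f"done: {step}")
    return completed

if __name__ == "__main__":
    print(assistant_reply("How do I deal with a health and wellness issue?"))
    for line in agent_run("settle in on campus"):
        print(line)
```

The design point is simply the direction of initiative: the assistant waits to be asked, while the agent takes a goal and acts on the student's behalf.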
209 00:12:13,760 --> 00:12:17,199 Speaker 2: Lee's idea was to train an AI assistant on classroom 210 00:12:17,280 --> 00:12:20,640 Speaker 2: data to play the role of a Sean: a digital persona 211 00:12:20,880 --> 00:12:23,360 Speaker 2: of a nine-year-old who likes math but doesn't 212 00:12:23,360 --> 00:12:27,440 Speaker 2: always understand math. And that AI assistant, she thought, could 213 00:12:27,440 --> 00:12:30,560 Speaker 2: be used to train pre-service teachers, or teachers in training, 214 00:12:30,800 --> 00:12:33,400 Speaker 2: who are preparing to enter one of the most challenging 215 00:12:33,440 --> 00:12:35,200 Speaker 2: professions in the modern world. 216 00:12:36,120 --> 00:12:39,160 Speaker 5: So when you think about teacher education, a 217 00:12:39,320 --> 00:12:45,000 Speaker 5: major challenge that teacher education faces is that we need 218 00:12:45,240 --> 00:12:49,960 Speaker 5: children to practice with. We need instructors who will give 219 00:12:50,080 --> 00:12:55,120 Speaker 5: the instruction on the pedagogical skills. So when you look 220 00:12:55,160 --> 00:12:59,360 Speaker 5: at the teacher education program, we have coursework and field experience, 221 00:13:00,000 --> 00:13:04,280 Speaker 5: and in those two areas there is something missing all 222 00:13:04,320 --> 00:13:04,760 Speaker 5: the time. 223 00:13:05,960 --> 00:13:09,320 Speaker 2: Lee says that pre-service teachers often lack access to 224 00:13:09,360 --> 00:13:12,920 Speaker 2: both students and experienced teachers during their education. 225 00:13:13,559 --> 00:13:17,920 Speaker 5: So what we try to resolve is that we have 226 00:13:18,080 --> 00:13:21,280 Speaker 5: this virtual student for pre-service teachers to work with, 227 00:13:22,080 --> 00:13:26,319 Speaker 5: so that they can practice their responsive teaching skills. 228 00:13:26,559 --> 00:13:30,600 Speaker 2: The first AI assistant Lee created, Jiwoo, 229 00:13:30,760 --> 00:13:34,679 Speaker 2: emulates the persona of a nine-year-old third-grade girl. Then, 230 00:13:34,760 --> 00:13:37,440 Speaker 2: with the help of one of her collaborators, a researcher 231 00:13:37,480 --> 00:13:41,760 Speaker 2: at Kennesaw named Sean English, she created two more AI assistants, 232 00:13:42,160 --> 00:13:47,200 Speaker 2: Gabriel and Noah, each of which has its own distinctive characteristics. 233 00:13:47,480 --> 00:13:50,720 Speaker 2: So how are Gabriel and Noah different from 234 00:13:50,640 --> 00:13:56,720 Speaker 5: Jiwoo? Gabriel, my first one, is very short-answered. 235 00:13:56,920 --> 00:13:59,800 Speaker 5: If you ask an open-ended question, he will answer 236 00:13:59,800 --> 00:14:04,240 Speaker 5: it in a closed way. So I used that characteristic. 237 00:14:04,320 --> 00:14:09,280 Speaker 5: And that's a problem that most teachers actually face. There 238 00:14:09,360 --> 00:14:12,600 Speaker 5: are children who are shy, who are reserved, and who 239 00:14:12,600 --> 00:14:17,280 Speaker 5: would not share much of their thoughts. So we wanted 240 00:14:17,320 --> 00:14:21,080 Speaker 5: that characteristic in some characters, and we used Gabriel to 241 00:14:21,400 --> 00:14:22,560 Speaker 5: have that characteristic. 242 00:14:24,000 --> 00:14:26,520 Speaker 2: And Noah. What's Noah's personality? 243 00:14:27,880 --> 00:14:31,560 Speaker 3: Noah? He's playful, cheery, bright, and energetic. 244 00:14:32,400 --> 00:14:38,320 Speaker 2: That's Sean English, Professor Lee's fellow researcher. And Jiwoo?
245 00:14:37,720 --> 00:14:43,120 Speaker 5: She is articulate and kind of smart, but she has her 246 00:14:43,200 --> 00:14:44,360 Speaker 5: own way of thinking. 247 00:14:44,760 --> 00:14:47,120 Speaker 2: I would end up spending a lot of time with Jiwoo. 248 00:14:47,640 --> 00:14:50,520 Speaker 2: She's something of a character. I asked Sean about the 249 00:14:50,560 --> 00:14:54,160 Speaker 2: process of creating these AI assistants. What does building the 250 00:14:54,240 --> 00:14:57,680 Speaker 2: content side of the AI assistant entail, 251 00:14:58,840 --> 00:14:59,080 Speaker 4: Sean? 252 00:14:59,480 --> 00:15:02,040 Speaker 3: It sets up a series of actions, effectively, which are 253 00:15:02,680 --> 00:15:05,240 Speaker 3: response cases. You can kind of think of them as 254 00:15:05,520 --> 00:15:08,280 Speaker 3: you have a series of questions that you tie to 255 00:15:09,480 --> 00:15:13,280 Speaker 3: an intent, and then that intent has reactions from the bot. 256 00:15:13,600 --> 00:15:16,720 Speaker 3: And so effectively, if we were looking to, say, make 257 00:15:16,760 --> 00:15:19,080 Speaker 3: a hello action, we would have all the different ways 258 00:15:19,080 --> 00:15:21,440 Speaker 3: that people could say hello: hello, what's up, how you doing, 259 00:15:21,440 --> 00:15:22,440 Speaker 3: and all that kind of stuff. 260 00:15:22,760 --> 00:15:26,640 Speaker 2: Sean says the longer the list of potential responses, the better. 261 00:15:27,160 --> 00:15:31,040 Speaker 2: But the AI's responses don't just follow the list. The AI 262 00:15:31,120 --> 00:15:34,680 Speaker 2: assistant uses those suggested responses to come up with a 263 00:15:34,800 --> 00:15:38,680 Speaker 2: universe of other responses, and in that process sometimes it 264 00:15:38,720 --> 00:15:41,040 Speaker 2: comes up with things that just don't make sense. 265 00:15:41,120 --> 00:15:44,600 Speaker 3: And from a technological standpoint, while AI is a fantastic tool, 266 00:15:44,720 --> 00:15:47,360 Speaker 3: AI can hallucinate, which means it just gives things that it's 267 00:15:47,400 --> 00:15:50,320 Speaker 3: just straight-up made up. There's a famous example of 268 00:15:50,360 --> 00:15:52,320 Speaker 3: this called the three R's, where you ask a 269 00:15:52,320 --> 00:15:55,200 Speaker 3: popular large language model how many R's are in strawberry, 270 00:15:55,480 --> 00:15:57,480 Speaker 3: and it gives you the wrong answer, and it repeats 271 00:15:57,520 --> 00:16:00,360 Speaker 3: that result repeatedly. You always want to have a human 272 00:16:00,880 --> 00:16:02,840 Speaker 3: interacting with the system to be able to go, hey, 273 00:16:03,480 --> 00:16:05,640 Speaker 3: that's a little crazy. I don't think that's exactly what 274 00:16:05,640 --> 00:16:06,440 Speaker 3: we're going for here. 275 00:16:07,120 --> 00:16:09,120 Speaker 2: That's why it's good to have someone like Sean English 276 00:16:09,120 --> 00:16:11,200 Speaker 2: around to step in and get the model back on track. 277 00:16:11,600 --> 00:16:14,760 Speaker 2: And over time, when the model has enough training, it's 278 00:16:14,800 --> 00:16:21,120 Speaker 2: ready for the teachers in training. One of the rollouts 279 00:16:21,160 --> 00:16:24,280 Speaker 2: of Jiwoo, Gabriel, and Noah was with the teacher training 280 00:16:24,320 --> 00:16:26,280 Speaker 2: program at the University of Missouri.
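Sean English's "actions" are a familiar pattern in conversational software: sample utterances tied to an intent, and a pool of persona responses per intent. Here is a toy sketch of that shape, with naive word-overlap matching standing in for the real natural-language classifier; the intent names and Jiwoo's lines are invented for illustration and are not taken from the actual system.

```python
# A toy version of the intent/response-case setup described above: example
# utterances map to an intent, and each intent has persona responses. The
# word-overlap "classifier" is a deliberate simplification; the data is
# hypothetical.

import random

INTENTS = {
    "hello": {
        "utterances": ["hello", "hi", "what's up", "how are you doing"],
        "responses": ["Hi! I'm Jiwoo. I like math!",
                      "Hello! Are we doing fractions today?"],
    },
    "compare_fractions": {
        "utterances": ["which is bigger", "is it larger than",
                       "is it smaller than one half"],
        "responses": ["Hmm, let me draw a rectangle and color in the pieces."],
    },
}

def classify(message: str) -> str | None:
    """Return the intent whose sample utterances best overlap the message."""
    words = set(message.lower().replace("?", "").replace(",", "").split())
    best, best_score = None, 0
    for name, spec in INTENTS.items():
        score = max(len(words & set(u.split())) for u in spec["utterances"])
        if score > best_score:
            best, best_score = name, score
    return best

def persona_reply(message: str) -> str:
    """Answer in persona, with an honest fallback when nothing matches."""
    intent = classify(message)
    if intent is None:
        return "I'm confused."
    return random.choice(INTENTS[intent]["responses"])

if __name__ == "__main__":
    print(persona_reply("Hi, how are you doing?"))
    print(persona_reply("Is three sixths bigger or smaller than one half?"))
```

The longer the utterance and response lists, the better the matching works, which is exactly Sean's point; and because the deployed model generalizes beyond those lists, a human reviewer stays in the loop to catch the nonsense cases.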
281 00:16:26,600 --> 00:16:28,960 Speaker 6: I was just kind of excited to see what the 282 00:16:29,040 --> 00:16:31,440 Speaker 6: program was and what it was going to be doing. 283 00:16:31,880 --> 00:16:34,960 Speaker 2: This is Logan Hovis, a junior at Missouri on the 284 00:16:35,000 --> 00:16:37,320 Speaker 2: path to becoming an elementary school teacher. 285 00:16:37,680 --> 00:16:40,200 Speaker 6: Obviously I was a little skeptical when he said it was supposed to, 286 00:16:40,240 --> 00:16:43,160 Speaker 6: you know, be like talking to a student. You're like, 287 00:16:43,200 --> 00:16:45,600 Speaker 6: there's no way this AI thing is going to totally 288 00:16:45,640 --> 00:16:48,320 Speaker 6: sound like a second grader or a third grader. Like, 289 00:16:48,360 --> 00:16:50,880 Speaker 6: it's going to sound like an adult, or it's going 290 00:16:50,960 --> 00:16:52,800 Speaker 6: to sound like a robot that knows all the answers. 291 00:16:53,280 --> 00:16:56,040 Speaker 6: And it really didn't. It really was like talking to 292 00:16:56,080 --> 00:16:58,720 Speaker 6: a child. It was very, very well developed in the 293 00:16:58,760 --> 00:17:00,640 Speaker 6: way that it responded, and you feel like 294 00:17:00,640 --> 00:17:01,680 Speaker 6: you're talking to a kid. 295 00:17:02,240 --> 00:17:05,040 Speaker 2: Her point wasn't that Jiwoo and her fellow avatars 296 00:17:05,280 --> 00:17:08,960 Speaker 2: were equivalent to real kids. Of course not. But for 297 00:17:09,040 --> 00:17:11,880 Speaker 2: someone starting out, someone who is already nervous about being 298 00:17:11,920 --> 00:17:14,879 Speaker 2: plunged into a classroom of nine-year-olds, Jiwoo was 299 00:17:14,920 --> 00:17:16,960 Speaker 2: like a warm-up before a baseball game. 300 00:17:17,359 --> 00:17:19,479 Speaker 6: What I can think of is like, you know, how 301 00:17:19,840 --> 00:17:22,159 Speaker 6: when you're at batting practice for baseball or softball, you 302 00:17:22,200 --> 00:17:25,160 Speaker 6: have those automatic pitchers that throw them, because you're working 303 00:17:25,160 --> 00:17:27,800 Speaker 6: on your skill as the hitter. What can I do differently? 304 00:17:27,840 --> 00:17:30,879 Speaker 6: What am I doing wrong? But that doesn't replace the 305 00:17:30,920 --> 00:17:32,639 Speaker 6: game and what you do in a game. But this 306 00:17:32,760 --> 00:17:35,560 Speaker 6: is you getting to practice your own skills to be 307 00:17:35,560 --> 00:17:37,040 Speaker 6: better when you go in a game. And I think 308 00:17:37,080 --> 00:17:40,120 Speaker 6: that's kind of what the AI software feels like for us. 309 00:17:42,000 --> 00:17:44,679 Speaker 2: In batting practice, the pitches don't come as hard and 310 00:17:44,680 --> 00:17:47,239 Speaker 2: fast as the pitches in a real game, but 311 00:17:47,280 --> 00:17:49,119 Speaker 2: you get to stand at the plate and the pitcher 312 00:17:49,200 --> 00:17:52,840 Speaker 2: throws you dozens of balls over and over again in 313 00:17:52,880 --> 00:17:55,680 Speaker 2: a concentrated block that allows you to work on your 314 00:17:55,720 --> 00:17:58,280 Speaker 2: swing closely and carefully. 315 00:17:58,880 --> 00:18:01,320 Speaker 6: There's a lot less stimuli going on around, because the 316 00:18:01,320 --> 00:18:04,119 Speaker 6: classroom is very, very busy.
It's wonderful, it's beautiful, but 317 00:18:04,160 --> 00:18:06,639 Speaker 6: it's very, very busy, so sometimes it's hard to keep, 318 00:18:07,160 --> 00:18:10,440 Speaker 6: you know, that focus in on the tasks that they're 319 00:18:10,440 --> 00:18:13,040 Speaker 6: doing at hand. And also, in the teacher setting, you're 320 00:18:13,040 --> 00:18:15,720 Speaker 6: also kind of always looking around, making sure that other 321 00:18:15,760 --> 00:18:17,800 Speaker 6: students are doing what they're supposed to be doing, but 322 00:18:17,840 --> 00:18:20,000 Speaker 6: also, like, if they need any help, if everything's going 323 00:18:20,040 --> 00:18:24,800 Speaker 6: okay in the classroom. So being on the Jiwoo chat, 324 00:18:25,800 --> 00:18:27,679 Speaker 6: it was just nice that you didn't have to do 325 00:18:27,840 --> 00:18:30,160 Speaker 6: any of the extra work to keep the focus on there. 326 00:18:30,600 --> 00:18:34,159 Speaker 6: And it also helped that you didn't have to feel the 327 00:18:34,200 --> 00:18:37,800 Speaker 6: student's nervousness of being one-on-one with you. And 328 00:18:38,000 --> 00:18:40,560 Speaker 6: also, as a teacher, it was a lot less pressure too, 329 00:18:40,600 --> 00:18:43,239 Speaker 6: because I was like, okay, I'm taking this seriously. This 330 00:18:43,280 --> 00:18:46,440 Speaker 6: is a student I'm questioning, but I also know I'm 331 00:18:46,440 --> 00:18:49,080 Speaker 6: probably not going to hurt someone's feelings right now. And 332 00:18:49,119 --> 00:18:51,160 Speaker 6: that's terrifying, to think I'm going to ask the wrong 333 00:18:51,240 --> 00:18:54,960 Speaker 6: question and upset the child, because I've done that. 334 00:18:55,960 --> 00:18:58,439 Speaker 2: We think of the typical use of AI as a 335 00:18:58,480 --> 00:19:01,240 Speaker 2: tool for speeding things up. That's what we always hear: 336 00:19:01,520 --> 00:19:04,399 Speaker 2: that the introduction of AI to problem X gave an 337 00:19:04,440 --> 00:19:08,680 Speaker 2: answer in minutes when solving problem X used to take weeks. 338 00:19:09,280 --> 00:19:13,160 Speaker 2: But we shouldn't forget another use: that it allows us 339 00:19:13,480 --> 00:19:16,880 Speaker 2: to slow things down. Hovis, if she wanted to, could 340 00:19:16,920 --> 00:19:20,080 Speaker 2: spend a whole weekend practicing with Jiwoo. A real 341 00:19:20,200 --> 00:19:22,879 Speaker 2: nine-year-old will get frustrated or bored with the 342 00:19:22,880 --> 00:19:27,240 Speaker 2: fumbling novice after ten minutes, but Jiwoo will 343 00:19:27,240 --> 00:19:29,919 Speaker 2: happily answer questions for as long as it takes for 344 00:19:29,960 --> 00:19:33,119 Speaker 2: the people who want to learn to be responsive to 345 00:19:33,240 --> 00:19:37,399 Speaker 2: learn how to be responsive. At the end of my 346 00:19:37,480 --> 00:19:41,000 Speaker 2: time at Kennesaw State, Sean and Dabae led me to 347 00:19:41,040 --> 00:19:43,880 Speaker 2: a small table where Dabae had set up her laptop. 348 00:19:44,480 --> 00:19:46,800 Speaker 2: In the corner of the screen was a chat box 349 00:19:47,080 --> 00:19:49,479 Speaker 2: of the sort we've all seen and used a thousand times. 350 00:19:50,280 --> 00:19:53,800 Speaker 2: Jiwoo began. She had been given a math problem. 351 00:19:53,920 --> 00:19:59,120 Speaker 4: Martin used three fourths of a cup of flour. Okay. 352 00:19:59,200 --> 00:20:05,800 Speaker 4: Then another three sixths of a cup.
353 00:20:06,480 --> 00:20:11,800 Speaker 4: Is the total amount of flour they used greater 354 00:20:12,160 --> 00:20:16,680 Speaker 4: than, or less than, one cup? How much flour 355 00:20:17,000 --> 00:20:17,720 Speaker 4: did they use? 356 00:20:18,040 --> 00:20:21,040 Speaker 2: That's a simulation of Jiwoo speaking. We pause it 357 00:20:21,119 --> 00:20:26,600 Speaker 2: for a second. So Jiwoo is trying to solve this problem. 358 00:20:26,680 --> 00:20:29,320 Speaker 2: And the first thing she does is she draws a 359 00:20:29,359 --> 00:20:32,960 Speaker 2: rectangle on the screen. This is a common tactic of 360 00:20:33,040 --> 00:20:37,119 Speaker 2: nine-year-olds: try to visualize the fractions. And she 361 00:20:37,240 --> 00:20:42,600 Speaker 2: divides it into four pieces. And now she's gonna color 362 00:20:42,640 --> 00:20:45,280 Speaker 2: in three of the four pieces. Yes, so she's representing... 363 00:20:45,320 --> 00:20:48,320 Speaker 2: this is quite good... she's representing three quarters on the screen. 364 00:20:48,320 --> 00:20:53,280 Speaker 4: Okay, this is three sixths. 365 00:20:55,560 --> 00:21:01,080 Speaker 2: So now Jiwoo does another rectangle with six boxes and 366 00:21:01,160 --> 00:21:02,240 Speaker 2: colors in three of 367 00:21:02,200 --> 00:21:09,520 Speaker 4: them. Okay, together it makes six tenths of a cup. 368 00:21:11,080 --> 00:21:15,080 Speaker 2: So then she counts up all the colored boxes and 369 00:21:15,320 --> 00:21:18,399 Speaker 2: that's her numerator, and counts up the total number of 370 00:21:18,400 --> 00:21:22,080 Speaker 2: boxes and that's her denominator. Jiwoo had counted the 371 00:21:22,119 --> 00:21:25,720 Speaker 2: colored boxes and landed on an answer. When you add 372 00:21:25,920 --> 00:21:28,639 Speaker 2: three quarters of a cup and three sixths of a cup, 373 00:21:29,200 --> 00:21:32,320 Speaker 2: you get six tenths of a cup. So, according to 374 00:21:32,440 --> 00:21:35,359 Speaker 2: Jiwoo, Martin has less than one cup. And she 375 00:21:35,440 --> 00:21:36,600 Speaker 2: thinks she's solved the problem. 376 00:21:36,720 --> 00:21:39,440 Speaker 5: Yes, okay, so it's less than one cup. 377 00:21:39,760 --> 00:21:43,000 Speaker 2: Yeah, so she says it's less than one cup. Now, 378 00:21:43,119 --> 00:21:46,040 Speaker 2: oh my god, this is hard. So the question is, 379 00:21:46,040 --> 00:21:49,399 Speaker 2: what do I, as a teacher, say to Jiwoo? We 380 00:21:49,400 --> 00:21:52,480 Speaker 2: were off. The rules were simple. I couldn't give 381 00:21:52,560 --> 00:21:55,119 Speaker 2: Jiwoo the answer or explain to her what she was 382 00:21:55,160 --> 00:21:58,040 Speaker 2: doing wrong. I had to be Deborah Ball. I had 383 00:21:58,080 --> 00:22:01,439 Speaker 2: to help her find the way herself. The chat box 384 00:22:01,560 --> 00:22:03,600 Speaker 2: in the corner of the screen was waiting for my 385 00:22:03,640 --> 00:22:06,919 Speaker 2: first question. I thought for a moment and started typing: 386 00:22:07,200 --> 00:22:10,280 Speaker 2: do you think the boxes in the red rectangle are 387 00:22:10,320 --> 00:22:13,359 Speaker 2: the same size as the boxes in the blue rectangle? 388 00:22:14,160 --> 00:22:16,560 Speaker 2: Then I turned to Sean and Dabae: is that a 389 00:22:16,560 --> 00:22:17,119 Speaker 2: good question? 390 00:22:17,359 --> 00:22:21,240 Speaker 5: Yeah, seriously. Yeah, that's a good question. 391 00:22:21,800 --> 00:22:25,920 Speaker 2: Jiwoo doesn't mess around. She answers immediately.
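For readers keeping score, here is the arithmetic behind this exchange, written out. As best it can be reconstructed from the audio, the problem asks whether three fourths of a cup of flour plus three sixths of a cup comes to more or less than one cup.

```latex
% Jiwoo's method: add the numerators and add the denominators (the misconception)
\frac{3}{4} + \frac{3}{6} \neq \frac{3+3}{4+6} = \frac{6}{10}
% Correct method: rewrite both fractions over a common denominator first
\frac{3}{4} + \frac{3}{6} = \frac{9}{12} + \frac{6}{12} = \frac{15}{12} = \frac{5}{4} > 1
```

So the total is more than one cup, which is the answer Jiwoo eventually finds her way to by the end of the session.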
So Jiwoo says 392 00:22:25,920 --> 00:22:28,080 Speaker 2: the blue and red pieces are not the same size. 393 00:22:28,920 --> 00:22:33,199 Speaker 5: Oh, so you understand. Now Jiwoo knows that size difference. 394 00:22:34,400 --> 00:22:36,160 Speaker 2: So she's pretty smart here. 395 00:22:36,320 --> 00:22:37,040 Speaker 5: Yeah. 396 00:22:37,119 --> 00:22:39,880 Speaker 2: Then I asked, if they are not the same size, 397 00:22:40,200 --> 00:22:43,520 Speaker 2: do you think you can add them together? Jiwoo answered 398 00:22:43,560 --> 00:22:47,240 Speaker 2: right away. Jiwoo says, I have learned that I 399 00:22:47,280 --> 00:22:50,119 Speaker 2: could add any numbers in grade two. So three plus 400 00:22:50,280 --> 00:22:52,240 Speaker 2: three is six and four plus six is ten. 401 00:22:52,560 --> 00:22:57,040 Speaker 5: Yeah, so she is using the knowledge of adding integers 402 00:22:57,040 --> 00:22:58,800 Speaker 5: in adding fractions. 403 00:22:59,359 --> 00:23:03,320 Speaker 2: Now I'm stuck. So now I have to somehow lead 404 00:23:03,440 --> 00:23:05,960 Speaker 2: her to figure out a way to get her to 405 00:23:06,080 --> 00:23:10,679 Speaker 2: understand that we're dealing with a different kind of problem, 406 00:23:10,960 --> 00:23:14,200 Speaker 2: a harder problem. Amy Robertson had told me that learning 407 00:23:14,200 --> 00:23:17,959 Speaker 2: how to do responsive teaching properly was really hard, and 408 00:23:18,000 --> 00:23:20,920 Speaker 2: now I understood why. I had to put my mind 409 00:23:21,320 --> 00:23:23,400 Speaker 2: inside the mind of a nine-year-old. I had 410 00:23:23,400 --> 00:23:26,960 Speaker 2: to internalize her knowledge base and assumptions. And keep in mind, 411 00:23:27,280 --> 00:23:30,760 Speaker 2: I haven't been nine for a very long time. I 412 00:23:30,800 --> 00:23:34,000 Speaker 2: honestly had no idea what to say next. I thought 413 00:23:34,040 --> 00:23:36,600 Speaker 2: for a moment, then asked what I quickly realized was 414 00:23:36,600 --> 00:23:40,320 Speaker 2: a hopelessly convoluted question. Dabae and Sean had built a 415 00:23:40,359 --> 00:23:44,640 Speaker 2: mentor into the system, an experienced responsive teacher who supervises 416 00:23:44,640 --> 00:23:47,800 Speaker 2: the session and offers advice. My mentor noticed that I 417 00:23:47,840 --> 00:23:54,320 Speaker 2: was struggling and told me to simplify my question. Dabae 418 00:23:54,440 --> 00:23:56,919 Speaker 2: was trying to help me too. She suggested, why not 419 00:23:57,080 --> 00:24:00,560 Speaker 2: just ask Jiwoo if three quarters is bigger or 420 00:24:00,600 --> 00:24:01,760 Speaker 2: smaller than one half? 421 00:24:02,160 --> 00:24:05,600 Speaker 5: So we are trying to help her to think about 422 00:24:05,680 --> 00:24:07,800 Speaker 5: fractions in a more conceptual way. 423 00:24:08,280 --> 00:24:13,240 Speaker 2: This time, Jiwoo understood. She wrote back: three quarters is 424 00:24:13,440 --> 00:24:17,000 Speaker 2: larger than one half. I wrote back: is three sixths 425 00:24:17,080 --> 00:24:21,440 Speaker 2: of a cup bigger or smaller than one half? Jiwoo said, 426 00:24:22,040 --> 00:24:25,440 Speaker 2: I'm confused. Oh no, I've confused Jiwoo. 427 00:24:25,760 --> 00:24:30,680 Speaker 5: It's good. She's understanding. She's realizing her misconception. So she's 428 00:24:30,720 --> 00:24:31,440 Speaker 5: getting confused. 429 00:24:31,480 --> 00:24:34,040 Speaker 2: She says, I'm confused.
Three quarters is pretty close to one, 430 00:24:34,119 --> 00:24:37,880 Speaker 2: and adding three sixths would make it go over one. Oh, 431 00:24:37,920 --> 00:24:41,040 Speaker 2: so she's got the answer. Yeah. But then she says, 432 00:24:41,560 --> 00:24:43,240 Speaker 2: but there are six pieces out of ten, which is 433 00:24:43,320 --> 00:24:44,680 Speaker 2: less than one, so I don't get it. 434 00:24:45,200 --> 00:24:48,720 Speaker 5: So she's at the point of, oh, I have something 435 00:24:48,960 --> 00:24:49,560 Speaker 5: wrong here. 436 00:24:49,880 --> 00:24:50,840 Speaker 7: That's a good sign. 437 00:24:51,359 --> 00:24:52,080 Speaker 2: She's getting there. 438 00:24:52,160 --> 00:24:53,440 Speaker 5: Yeah, she's getting there, but... 439 00:24:53,440 --> 00:24:56,640 Speaker 2: I still have to get her... she has to get 440 00:24:56,800 --> 00:24:59,160 Speaker 2: the six pieces out of ten out of her head. Yeah, 441 00:24:59,280 --> 00:25:03,560 Speaker 2: and I have no idea how to do that. And she 442 00:25:03,680 --> 00:25:07,880 Speaker 2: thinks she's confused when she has actually figured out 443 00:25:07,880 --> 00:25:10,840 Speaker 2: the answer. Yeah, she did. So we have advanced. Even 444 00:25:10,880 --> 00:25:14,840 Speaker 2: in my stumbling and bumbling, we've made some progress, very 445 00:25:14,880 --> 00:25:23,480 Speaker 2: notable progress. My conversation with Jiwoo went on for 446 00:25:23,520 --> 00:25:27,280 Speaker 2: some time, and eventually we got there. Jiwoo found 447 00:25:27,320 --> 00:25:30,400 Speaker 2: her way to the right answer. She said, I have 448 00:25:30,520 --> 00:25:33,320 Speaker 2: more than one cup of flour. The mentor chimed in. 449 00:25:33,560 --> 00:25:35,840 Speaker 2: I got a little emoji that made me feel good. 450 00:25:36,080 --> 00:25:38,919 Speaker 2: And when it was over, I realized two things. The 451 00:25:38,960 --> 00:25:42,879 Speaker 2: first was I needed more batting practice, much more, and 452 00:25:42,920 --> 00:25:46,399 Speaker 2: that batting practice was really, really easy to do, because 453 00:25:46,440 --> 00:25:48,879 Speaker 2: someone had gone to the trouble of building me my 454 00:25:49,040 --> 00:25:52,200 Speaker 2: very own baseball diamond and giving me a pitcher who 455 00:25:52,200 --> 00:25:56,360 Speaker 2: would throw me baseballs all day long. My second thought 456 00:25:56,680 --> 00:26:00,080 Speaker 2: was that I've been thinking about AI all wrong. I 457 00:26:00,119 --> 00:26:02,159 Speaker 2: have interpreted a lot of the talk about the promise 458 00:26:02,200 --> 00:26:05,800 Speaker 2: of AI to be about replacing human expertise. I had 459 00:26:05,840 --> 00:26:08,880 Speaker 2: actually thought, when I first heard about Dabae's project, that 460 00:26:08,880 --> 00:26:12,000 Speaker 2: that's what Dabae and Sean were doing: creating an AI 461 00:26:12,119 --> 00:26:15,600 Speaker 2: to teach students, bypassing the teacher altogether. But if 462 00:26:15,600 --> 00:26:17,920 Speaker 2: you did it that way, you would miss the magic 463 00:26:17,960 --> 00:26:22,199 Speaker 2: of the classroom. Remember Eleanor Duckworth's quote: the goal of 464 00:26:22,320 --> 00:26:25,879 Speaker 2: education is for students to have wonderful ideas and have 465 00:26:25,920 --> 00:26:29,280 Speaker 2: a good time having them.
I think we often focus 466 00:26:29,400 --> 00:26:32,480 Speaker 2: on the first part of that formulation, the wonderful ideas, 467 00:26:32,800 --> 00:26:37,920 Speaker 2: but neglect the second, the good time having them. Real 468 00:26:38,000 --> 00:26:42,800 Speaker 2: learning is born in pleasure, in community, in playful discussion, 469 00:26:43,240 --> 00:26:45,879 Speaker 2: in a group of kids coming together to solve a problem. 470 00:26:46,240 --> 00:26:49,639 Speaker 2: And all of that magic only comes from human interaction, 471 00:26:50,440 --> 00:26:53,159 Speaker 2: from a teacher who is skilled enough to inspire a 472 00:26:53,160 --> 00:26:56,560 Speaker 2: class of nine-year-olds. We don't want AI assistants 473 00:26:56,960 --> 00:27:00,720 Speaker 2: to replace the teacher. We want AI assistants to help 474 00:27:00,760 --> 00:27:19,720 Speaker 2: teachers turn themselves into even better teachers. Smart Talks with 475 00:27:19,800 --> 00:27:24,280 Speaker 2: IBM is produced by Matt Romano, Amy Gaines McQuade, Lucy Sullivan, 476 00:27:24,600 --> 00:27:28,280 Speaker 2: and Jake Harper. We're edited by Lacy Roberts. Engineering by 477 00:27:28,359 --> 00:27:32,960 Speaker 2: Nina Bird Lawrence. Mastering by Sarah Bruguiere. Music by Gramoscope. 478 00:27:33,440 --> 00:27:37,720 Speaker 2: Special thanks to Tatiana Lieberman and Cassidy Meyer. Smart Talks 479 00:27:37,720 --> 00:27:40,720 Speaker 2: with IBM is a production of Pushkin Industries and Ruby 480 00:27:40,800 --> 00:27:45,480 Speaker 2: Studio at iHeartMedia. To find more Pushkin podcasts, listen on 481 00:27:45,520 --> 00:27:49,800 Speaker 2: the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. 482 00:27:50,440 --> 00:27:54,360 Speaker 2: I'm Malcolm Gladwell. This is a paid advertisement from IBM. 483 00:27:54,640 --> 00:28:00,000 Speaker 2: The conversations on this podcast don't necessarily represent IBM's positions, strategies, 484 00:28:00,680 --> 00:28:01,400 Speaker 2: or opinions.