Speaker 1: In the world of educational research, there is a famous video of a boy named Sean. I don't mean famous in the sense that it has a million views on YouTube. I mean it in the circle of people who think about teaching and how to make teaching better. The video has been written about in journal articles and shown over and again in college classrooms. It's a ten-minute clip of a third-grade class somewhere in Michigan. It was filmed in January of nineteen ninety, so the video is a bit grainy. The teacher's name is Deborah Loewenberg Ball. She's a professor at Michigan State University who, as part of her research, teaches a one-hour math class at a local elementary school. On the day in question, Miss Ball begins by asking her students about the previous day's lesson, which was about even and odd numbers.

Speaker 2: And I would like to hear from as many people as possible what comments you had, reactions you had to the meeting yesterday.

Speaker 1: A little boy with black hair raises his hand. His name is Sean.
Speaker 3: I know I don't have anything about the meeting yesterday, but I was just thinking about...

Speaker 1: Sean was thinking about the number six.

Speaker 3: I was thinking that it's a... it can be an odd number, too, because there could be two, two, four, six, two threes, and it would be an odd and an even number.

Speaker 2: Six?

Speaker 3: To make it, it takes two threes.

Speaker 1: Sean doesn't understand what odd and even mean. He thinks that just because you can break six down into an odd number of parts and an even number of parts, six must exist in some magical middle category. And when you listen to the Sean videotape, you keep waiting for the teacher to say, oh no, Sean, you misunderstand. But Deborah Ball doesn't do that. She never tells him he's wrong. Instead, she simply asks him to explain his thinking.

Speaker 4: And the two things that you put together to make it were odd, right? Three and three?

Speaker 1: Ball then asked the class to give their views.
Other students jump up and explain their theories on the blackboard. For the next fifteen minutes, she deftly guides the class through an in-depth investigation of what she calls "Sean numbers," until Sean himself realizes that the real meaning of odd and even is something different than he had imagined. And now he gets it.

Speaker 3: Thank you for bringing it up.

Speaker 1: I don't want to focus just on how little Sean finally made his own way to the right answer. I'm interested in what his teacher did to get him there. Deborah Ball worked magic. She never told Sean the right answer; she just led him to a place where he could discover it for himself.

My name is Malcolm Gladwell. This is season six of Smart Talks with IBM, where we offer our listeners a glimpse behind the curtain of the world of technology and artificial intelligence. In this season, we're going to visit companies as varied as Lail and Ferrari and tell stories of how they're using artificial intelligence and data to transform the way they do business.
This episode is about the promise of a radical new idea called responsive teaching, the kind of teaching that took place that day in Sean's classroom, and whether artificial intelligence can help us train the next generation of teachers to be as good as Deborah Ball.

Before we talk about how AI could transform the way we train teachers, I want to go back for a moment to the famous video of Sean. In the video, the teacher, Deborah Ball, doesn't have a predetermined plan that she's imposing on the class. First, she's improvising, making up her approach as she goes along, responding to her student's odd theory about the number six. Second, she's taking Sean seriously. She's not dismissing his theory; she's listening to him and trying to understand the problem from his perspective. And third, and most importantly, she's not force-feeding him the right answer. She's being patient. She's waiting to see if, with just the right subtle hints, he can get to the right answer on his own. Improvisation, empathy, patience. That's responsive teaching.
Speaker 5: What I think about in terms of responsiveness is more like... I think that students need to have a sense of agency in what happens in the classroom, and like authentic agency, where they can be legitimized as knowers.

Speaker 1: I spoke to a physicist at Seattle Pacific University named Amy Robertson, a longtime advocate for responsive teaching. She uses the Sean video in her classroom.

Speaker 5: You have to trust that kids have a way of doing that. What she mostly did was to facilitate a conversation, and to do that, you have to listen to them talk.

Speaker 1: No one told him he was wrong.

Speaker 5: That's right.

Speaker 1: And then he goes, he goes, "I didn't think of it that way. You've expanded my understanding. Thank you for bringing it up again."

Speaker 6: It's like... I love this.

Speaker 5: You know, responsive teaching, as I think about it, is kind of rooted in Eleanor Duckworth's work around "The Having of Wonderful Ideas," where she says the goal of education is for students to have wonderful ideas and to have a good time having them.
Speaker 1: I love that. I've never heard that. What a beautiful, succinct way of summing up the purpose of education.

Yes, responsive teaching is beautiful. It's rare to find a new teaching idea that everyone loves. This is one of those rare ideas. Watching the Ball classroom, all I could think was, I really, really hope my daughters get to experience a math class like that. Far too many kids are convincing themselves, at far too young an age, that math isn't for them, and responsive teaching is a way to solve that problem. But here is the issue: it's really, really hard to teach responsive teaching. Robertson says that teaching exists in a cultural environment where the teacher is expected to be the source of truth, where teaching is about the immediate correction of error, not letting a child wander down the pathway of their own misunderstanding. Responsive teaching is deeply counterintuitive, and the only way to understand its beauty is to do it over and over again. Aspiring teachers need a way to practice.
For as long as there has been technology, people have turned to digital machines to solve problems. My father was a mathematician, and I remember him coming home in the nineteen seventies with a big stack of computer cards in his briefcase that he used to program the mainframe back at the office. Today, with the rise of artificial intelligence, the scale and complexity of the problems technology can help us solve has jumped by many orders of magnitude. You must have worked with a million customers who are experimenting with LLMs. Has there been one use case where you were like, whoa, I had no idea, or just simply, that's clever? I'm speaking to Brian Bissell, who works out of IBM's Manhattan office. He helps IBM customers discover how best to get AI to work for them.

Speaker 7: There is one, but I don't think I can talk about it, unfortunately.

Speaker 1: Wait, you can't tease me like that, can you? Wait... disguise it for me.
Speaker 1: Just give me a general...

Speaker 7: It was about the ability to pull certain types of information out of documents that you wouldn't think you would be able to get the model to do, and to be able to do that at a very large scale.

Speaker 1: Bissell's point was that we are well past the stage where anyone wonders whether AI can be useful. The real question now is what problems we want to use it to solve, where it can make the biggest difference. And Bissell saw lots of opportunities in education.

Speaker 7: I have two kids, one in middle school and one who just graduated high school, and I'm well aware of students using things like ChatGPT to do their homework. And it's very easy to take tools like that, and even IBM's own large language models, and just take a problem, a piece of homework, something you want written, and drop it into that and have it generate the answer for you. And the student, the user in that case, hasn't done any work; they haven't put any real thought into it.

Speaker 1: To Bissell, that's the wrong use of AI. That's technology making us dumber.
What we really want is technology that makes us smarter. Bissell explained to me that there are now two big tools being used for AI productivity: AI agents and AI assistants. Let's start with AI agents. AI agents can reason, plan, and collaborate with other AI tools to autonomously perform tasks for a user. Bissell gave me an example of how a college freshman might use an AI agent.

Speaker 7: As a new student, you may not know: how do I deal with my health and wellness issue? How many credits are you going to get for this given class? You could talk to someone and find out some of that, but maybe it's a little bit sensitive and you don't want to do that.

Speaker 1: Bissell told me you could build an AI agent, a resource for new students that helps them navigate a new campus, register for classes, access the services they need, and even schedule appointments on their behalf, which in turn buys them more time to focus on their actual schoolwork.
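The two patterns Bissell describes can be sketched in a few lines of Python. This is purely illustrative: the campus FAQ, the plan steps, and the function names are all invented for this example, and none of it reflects IBM's actual implementation. The point is only the shape of the difference, an assistant reacts to one request at a time, while an agent carries out a multi-step goal on its own.

```python
# Toy sketch of an AI "assistant" (reactive) versus an AI "agent"
# (proactive). All names and data here are hypothetical.

CAMPUS_FAQ = {  # a made-up campus knowledge base
    "health": "The student wellness center is open weekdays in Building C.",
    "credits": "Your credit total for each class is listed on the registrar portal.",
}

def assistant(question: str) -> str:
    """Reactive: answers only when asked, one request at a time."""
    for topic, answer in CAMPUS_FAQ.items():
        if topic in question.lower():
            return answer
    return "I don't know about that yet. Try the student help desk."

def agent(goal: str) -> list[str]:
    """Proactive: breaks a goal into steps and performs each one itself."""
    plan = ["look up the right office", "find an open time slot", "book the appointment"]
    # In a real system each step would call out to another tool or service;
    # here the "work" is just a log entry.
    return [f"{goal}: {step} (done)" for step in plan]
```

In the systems Bissell describes, each of the agent's plan steps would hand off to another AI tool or campus service, which is what makes it autonomous rather than merely conversational.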
Speaker 7: We can see patterns of how agents and assistants can help employees and customers and end users be more productive: automating workflows so they're not doing certain types of repetitive work over and over again, streamlining their lives, and making data more accessible to them twenty-four hours a day.

Speaker 1: But Bissell says you can also use AI assistants in the education space. AI assistants are reactive, as opposed to AI agents, which are proactive. AI assistants only perform tasks at your request; they're programmed to answer your questions. And as it turns out, AI assistants are now being used to further the responsive teaching revolution, which is why I found myself, on a beautiful Georgia spring day not long ago, on the campus of Kennesaw State University, sitting in a classroom with two researchers, one of them Professor Dabe Lee. Let's go into the journey of building this thing. You started by taking a course. What was the course you took?

Speaker 2: It was offered by Coursera; it was designed by IBM. It was "AI Foundations for Everyone."
Speaker 1: In her AI Foundations course, Lee learned how to build an AI assistant using IBM watsonx. That course took how long?

Speaker 2: It was not too long. It was like fourteen weeks.

Speaker 1: Lee's idea was to train an AI assistant on classroom data to play the role of a Sean: a digital persona of a nine-year-old who likes math but doesn't always understand math. And that AI assistant, she thought, could be used to train preservice teachers, teachers in training who are preparing to enter one of the most challenging professions in the modern world.

Speaker 2: So when you think about teacher education, a major challenge that teacher education faces is that we need children to practice with. We need instructors who give the instruction on the pedagogical skills. So when you look at the teacher education program, we have coursework and field experience, and in those two areas there is something missing all the time.

Speaker 1: Lee says that preservice teachers often lack access to both students and experienced teachers during their education.
Speaker 2: So what we try to resolve is that we have this virtual student for preservice teachers to work with, so that they can practice their responsive teaching skills.

Speaker 1: The first AI assistant Lee created is Jiwu. Jiwu emulates the persona of a nine-year-old third-grade girl. Then, with the help of one of her collaborators, a researcher at Kennesaw named Sean English, she created two more AI assistants, Gabriel and Noah, each of which has its own distinctive characteristics. So how are Gabriel and Noah different from Jiwu?

Speaker 2: Gabriel, my first one: he gives very short answers. If you ask an open-ended question, he will answer it in a closed way. So I used that characteristic. And that's a problem that most teachers actually face: they teach children who are shy, who are reserved, and who would not share much of their thoughts. So we wanted that characteristic in some characters, and we used Gabriel to have that characteristic.

Speaker 1: And Noah? What's Noah's personality?

Speaker 6: Noah? He's playful, cheerful, and energetic.

Speaker 1: That's Sean English, Professor Lee's fellow researcher.
Speaker 2: And Jiwu... Jiwu is articulate and kind of smart, but she has her own way of thinking.

Speaker 1: I would end up spending a lot of time with Jiwu. She's something of a character. I asked Sean about the process of creating these AI assistants: what does building the content side of an AI assistant entail?

Speaker 6: You set up a series of actions, effectively, which are response cases. You can kind of think of it this way: you have a series of questions that you tie to an intent, and then that intent has reactions from the bot. So effectively, if we were looking to, say, make a hello action, we would have all the different ways that people could say hello: "Hello," "What's up," "How you doing," and all that kind of stuff.

Speaker 1: Sean says the longer the list of potential responses, the better. But the AI's responses don't just follow the list. The AI assistant uses those suggested responses to come up with a universe of other responses, and in that process it sometimes comes up with things that just don't make sense.
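The structure Sean English describes, example utterances tied to an intent and reactions tied back to that intent, can be sketched in a few lines. This is only a toy: a production assistant such as IBM's watsonx Assistant uses trained classifiers rather than the crude word-overlap match below, and the Jiwu-flavored intents and responses here are invented for illustration.

```python
# Minimal sketch of intent-based response cases: example questions map
# to an intent, and each intent maps to the bot's possible reactions.
import random

INTENTS = {  # hypothetical content for a Jiwu-style assistant
    "greet": {
        "examples": ["hello", "hi there", "what's up", "how are you doing"],
        "responses": ["Hi! I'm Jiwu.", "Hello! Ready to do some math?"],
    },
    "math_help": {
        "examples": ["can you solve this problem", "let's work on fractions"],
        "responses": ["Okay, let me think...", "Can I draw a picture first?"],
    },
}

def classify(utterance: str) -> str:
    """Pick the intent whose example utterances share the most words."""
    words = set(utterance.lower().split())
    def overlap(intent: str) -> int:
        return max(len(words & set(ex.split())) for ex in INTENTS[intent]["examples"])
    return max(INTENTS, key=overlap)

def respond(utterance: str) -> str:
    """Answer with one of the reactions tied to the matched intent."""
    return random.choice(INTENTS[classify(utterance)]["responses"])
```

The longer the lists of examples and responses, the better the matching and the less repetitive the bot, which is the point Sean makes about building out the content side.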
Speaker 6: And from a technological standpoint, while AI is a fantastic tool, AI can hallucinate, by which I mean it gives you things that are just straight-up made up. There's a famous example of this called the three Rs, where you ask a popular large language model how many Rs are in "strawberry," and it gives you the wrong answer and repeats that result over and over. You always want to have a human interacting with the system to be able to go, hey, that's a little crazy, I don't think that's exactly what we're going for here.

Speaker 1: That's why it's good to have someone like Sean English around to step in and get the model back on track. And over time, when the model has had enough training, it's ready for the teachers in training.

One of the rollouts of Jiwu, Gabriel, and Noah was with the teacher training program at the University of Missouri.

Speaker 4: I was just kind of excited to see what the program was and what it was going to be doing.

Speaker 1: This is Logan Hovis, a junior at Missouri on the path to becoming an elementary school teacher.
Speaker 4: Obviously I was a little skeptical when he said it was supposed to, you know, be like talking to a student. You're like, there's no way this AI thing is going to totally sound like a second grader or a third grader; it's going to sound like an adult, or it's going to sound like a robot that knows all the answers. And it really didn't. It really was like talking to a child. It was very, very well developed, in the way that you really sit there and you feel like you're talking to a kid.

Speaker 1: Her point wasn't that Jiwu and her fellow avatars were equivalent to real kids, of course not. But for someone starting out, someone who was already nervous about being plunged into a classroom of nine-year-olds, Jiwu was like a warm-up before a baseball game.

Speaker 4: What I can think of is, like, you know how, when you're at batting practice for baseball or softball, you have those automatic pitchers that throw to you, because you're working on your skill as the hitter: What can I do differently? What am I doing wrong?
But that doesn't replace the game and what you do in a game. This is you getting to practice your own skills to be better when you go into a game. And I think that's kind of what the AI software feels like for us.

Speaker 1: In batting practice, the pitches don't come as hard and fast as the pitches in a real game, but you get to stand at the plate, and the pitcher throws you dozens of balls, over and over again, in a concentrated block that allows you to work on your swing closely and carefully.

Speaker 4: There's a lot less stimulus going on around you, because the classroom is very, very busy. It's wonderful, it's beautiful, but it's very, very busy, so sometimes it's hard to keep, you know, that focus in on the tasks they're doing at hand. And in the teacher setting, you're also kind of always looking around, making sure that other students are doing what they're supposed to be doing, but also, like, whether they need any help, whether everything's going okay in the classroom.
So being on the Jiwu chat, it was just nice that you didn't have to do any of the extra work to keep the focus on there. And you also didn't have to feel the student's nervousness at being one-on-one with you. And also, as a teacher, it was a lot less pressure too, because I was like, okay, I'm taking this seriously. This is a student I'm questioning.

Speaker 8: But I also know I'm probably not going to hurt someone's feelings right now. And that's terrifying: that I'm going to ask the wrong question and upset the child, because I've done that.

Speaker 1: We think of the typical use of AI as a tool for speeding things up. That's what we always hear: that the introduction of AI to problem X gave an answer in minutes, when solving problem X used to take weeks. But we shouldn't forget another use: it allows us to slow things down. Hovis, if she wanted to, could spend a whole weekend practicing with Jiwu.
A real nine-year-old will get frustrated or bored with the fumbling novice after ten minutes. But Jiwu... Jiwu will happily answer questions for as long as it takes for the people who want to learn to be responsive to learn how to be responsive.

At the end of my time at Kennesaw State, Sean and Dabe led me to a small table where Dabe had set up her laptop. In the corner of the screen was a chat box of the sort we've all seen and used a thousand times. Jiwu began; she had been given a math problem.

Speaker 3: Martin added three-fourths of a cup of flour to the bowl. Then he added another three-sixths of a cup. Is the total amount of flour greater than or less than one cup? How much flour did he use?

Speaker 1: That's a simulation of Jiwu speaking. We'll pause her for a second. So Jiwu is trying to solve this problem, and the first thing she does is she draws a rectangle on the screen. This is a common tactic of nine-year-olds: try to visualize the fractions. And she divides it into four pieces.
And now she's gonna color 349 00:19:52,840 --> 00:19:55,480 Speaker 1: in three of the four pieces. Yes, so she's representing. 350 00:19:55,520 --> 00:19:58,520 Speaker 1: This is quite good. She's representing three quarters on the screen. 351 00:20:00,200 --> 00:20:00,480 Speaker 2: This is. 352 00:20:02,600 --> 00:20:03,480 Speaker 3: Three sixths. 353 00:20:05,760 --> 00:20:11,240 Speaker 1: So now Jiwu draws another rectangle with six boxes and 354 00:20:11,359 --> 00:20:12,600 Speaker 1: colors in three of them. 355 00:20:12,920 --> 00:20:19,720 Speaker 3: Okay, together that makes six tenths of a cup. 356 00:20:21,280 --> 00:20:25,240 Speaker 1: So then she counts up all the colored boxes and 357 00:20:25,520 --> 00:20:28,600 Speaker 1: that's her numerator, and counts up the total number of 358 00:20:28,600 --> 00:20:32,280 Speaker 1: boxes and that's her denominator. Jiwu had counted the 359 00:20:32,320 --> 00:20:35,919 Speaker 1: colored boxes and landed on an answer. When you add 360 00:20:36,119 --> 00:20:38,800 Speaker 1: three quarters of a cup and three sixths of a cup, 361 00:20:39,400 --> 00:20:42,520 Speaker 1: you get six tenths of a cup. So, according to 362 00:20:42,640 --> 00:20:45,560 Speaker 1: Jiwu, Martin has less than one cup. And she 363 00:20:45,640 --> 00:20:46,800 Speaker 1: thinks she's solved the problem. 364 00:20:46,920 --> 00:20:49,600 Speaker 2: Yes, okay, so it's less than one cup. 365 00:20:49,960 --> 00:20:53,200 Speaker 1: Yeah, so she says it's less than one cup. Now, 366 00:20:53,320 --> 00:20:56,240 Speaker 1: oh my god, this is hard. So the question is, 367 00:20:56,240 --> 00:20:59,600 Speaker 1: what do I, as a teacher, say to Jiwu? We 368 00:20:59,600 --> 00:21:02,679 Speaker 1: were off. The rules were simple. I couldn't give 369 00:21:02,760 --> 00:21:05,320 Speaker 1: Jiwu the answer or explain to her what she was 370 00:21:05,359 --> 00:21:08,240 Speaker 1: doing wrong. I had to be Deborah Ball.
I had 371 00:21:08,280 --> 00:21:11,639 Speaker 1: to help her find the way herself. The chat box 372 00:21:11,720 --> 00:21:13,800 Speaker 1: in the corner of the screen was waiting for my 373 00:21:13,840 --> 00:21:17,119 Speaker 1: first question. I thought for a moment and started typing: 374 00:21:17,400 --> 00:21:20,480 Speaker 1: do you think the boxes in the red rectangle are 375 00:21:20,520 --> 00:21:23,560 Speaker 1: the same size as the boxes in the blue rectangle? 376 00:21:24,359 --> 00:21:26,760 Speaker 1: Then I turned to Sean and Dabe: is that a 377 00:21:26,760 --> 00:21:27,280 Speaker 1: good question? 378 00:21:27,560 --> 00:21:28,399 Speaker 6: Yeah. 379 00:21:28,480 --> 00:21:29,000 Speaker 7: Serious thing. 380 00:21:30,000 --> 00:21:31,439 Speaker 2: Yeah, that's a good question. 381 00:21:32,000 --> 00:21:36,080 Speaker 1: Jiwu doesn't mess around. She answers immediately. So Jiwu says, 382 00:21:36,119 --> 00:21:38,280 Speaker 1: the blue and red pieces are not the same size. 383 00:21:39,080 --> 00:21:43,400 Speaker 2: Oh, so you understand, now Jiwu knows that size difference. 384 00:21:44,600 --> 00:21:48,440 Speaker 1: So she's pretty smart here. Yeah. Then I asked, if 385 00:21:48,480 --> 00:21:51,000 Speaker 1: they are not the same size, do you think you 386 00:21:51,000 --> 00:21:55,720 Speaker 1: can add them together? Jiwu answered right away. Jiwu says, 387 00:21:56,760 --> 00:21:58,720 Speaker 1: I have learned that I could add any numbers in 388 00:21:58,800 --> 00:22:01,600 Speaker 1: grade two. So three plus three is six and four 389 00:22:01,640 --> 00:22:02,440 Speaker 1: plus six is ten. 390 00:22:02,760 --> 00:22:07,199 Speaker 2: Yeah. So she is applying the knowledge of adding integers 391 00:22:07,240 --> 00:22:09,160 Speaker 2: to adding fractions. 392 00:22:09,560 --> 00:22:13,520 Speaker 1: Now I'm stumped.
So now I have to somehow lead 393 00:22:13,640 --> 00:22:16,159 Speaker 1: her to figure out a way to get her to 394 00:22:16,280 --> 00:22:20,840 Speaker 1: understand that we're dealing with a different kind of problem, 395 00:22:21,160 --> 00:22:24,359 Speaker 1: a harder problem. Amy Robertson had told me that learning 396 00:22:24,400 --> 00:22:28,159 Speaker 1: how to do responsive teaching properly was really hard, and 397 00:22:28,160 --> 00:22:31,119 Speaker 1: now I understood why. I had to put my mind 398 00:22:31,520 --> 00:22:33,600 Speaker 1: inside the mind of a nine-year-old. I had 399 00:22:33,600 --> 00:22:37,159 Speaker 1: to internalize her knowledge base and assumptions. And keep in mind, 400 00:22:37,440 --> 00:22:40,959 Speaker 1: I haven't been nine for a very long time. I 401 00:22:41,000 --> 00:22:44,200 Speaker 1: honestly had no idea what to say next. I thought 402 00:22:44,240 --> 00:22:46,760 Speaker 1: for a moment, then asked what I quickly realized was 403 00:22:46,800 --> 00:22:50,520 Speaker 1: a hopelessly convoluted question. Dabe and Sean had built a 404 00:22:50,560 --> 00:22:54,800 Speaker 1: mentor into the system, an experienced responsive teacher who supervises 405 00:22:54,840 --> 00:22:58,000 Speaker 1: the session and offers advice. My mentor noticed that I 406 00:22:58,040 --> 00:23:01,000 Speaker 1: was struggling and told me to simplify my question. 407 00:23:01,320 --> 00:23:03,160 Speaker 2: Remember, she is a third grader. 408 00:23:04,200 --> 00:23:07,000 Speaker 1: Dabe was trying to help me too. She suggested, why 409 00:23:07,040 --> 00:23:10,240 Speaker 1: not just ask Jiwu if three quarters is bigger 410 00:23:10,640 --> 00:23:11,920 Speaker 1: or smaller than one half? 411 00:23:12,320 --> 00:23:15,800 Speaker 2: So we are trying to help her to think about 412 00:23:15,880 --> 00:23:18,000 Speaker 2: fractions in a more conceptual way. 413 00:23:18,480 --> 00:23:23,439 Speaker 1: This time, Jiwu understood.
She wrote back: three quarters is 414 00:23:23,640 --> 00:23:27,200 Speaker 1: larger than one half. I wrote back: is three sixths 415 00:23:27,280 --> 00:23:31,679 Speaker 1: of a cup bigger or smaller than one half? Jiwu said, 416 00:23:32,240 --> 00:23:35,159 Speaker 1: I'm confused. Oh no, I've confused her. 417 00:23:35,240 --> 00:23:39,920 Speaker 2: Oh, Jiwu. It's good, she's understanding, she's realizing her misconception, 418 00:23:40,240 --> 00:23:41,600 Speaker 2: so she's getting confused. 419 00:23:41,680 --> 00:23:43,960 Speaker 1: She says, I'm confused. Three quarters is pretty close to 420 00:23:44,040 --> 00:23:48,080 Speaker 1: one, and adding three sixths would make it go over one. Oh, 421 00:23:48,119 --> 00:23:51,240 Speaker 1: so she's got the answer. Yeah. But then she says, 422 00:23:51,760 --> 00:23:53,439 Speaker 1: but there are six pieces out of ten, which is 423 00:23:53,480 --> 00:23:54,880 Speaker 1: less than one, so I don't get it. 424 00:23:55,400 --> 00:23:58,920 Speaker 2: So she's at the point of, oh, I have something 425 00:23:59,119 --> 00:24:01,040 Speaker 2: wrong here. That's a good sign. 426 00:24:01,560 --> 00:24:02,280 Speaker 1: She's getting there. 427 00:24:02,560 --> 00:24:03,560 Speaker 2: She's getting there. 428 00:24:03,480 --> 00:24:06,520 Speaker 1: But I still have to get her there. She has 429 00:24:06,560 --> 00:24:08,520 Speaker 1: to get the six pieces out of ten out of 430 00:24:08,520 --> 00:24:10,800 Speaker 1: her head. Yeah, I have no idea how to do that. 431 00:24:13,480 --> 00:24:17,560 Speaker 1: And she thinks she's confused when, actually, she's 432 00:24:17,600 --> 00:24:20,760 Speaker 1: figured out the answer. Yeah, she did. So we have advanced. 433 00:24:20,840 --> 00:24:24,679 Speaker 1: Even in my stumbling and bumbling, we've made some progress. 434 00:24:24,880 --> 00:24:25,719 Speaker 6: Very notable progress.
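The whole exchange turns on one small piece of arithmetic: Jiwu's box-counting rule yields 3/4 + 3/6 = 6/10, which is less than a cup, while the correct common-denominator sum is 5/4, just over a cup. For readers who want to check it, here is a minimal Python sketch (not part of the Jiwu system; the helper name is illustrative) contrasting the two:

```python
from fractions import Fraction

# Jiwu's rule from the transcript: count all the colored boxes (the
# numerators) and all the boxes (the denominators) and stack them,
# with no common denominator.
def box_counting_add(n1, d1, n2, d2):
    return n1 + n2, d1 + d2

num, den = box_counting_add(3, 4, 3, 6)
print(f"{num}/{den}")        # 6/10 -- "six tenths", less than one cup

# The correct sum rewrites both terms over a common denominator,
# which Fraction does automatically: 3/4 + 1/2 = 5/4.
total = Fraction(3, 4) + Fraction(3, 6)
print(total, total > 1)      # 5/4 True -- just over one cup
```

This is why Jiwu can hold both beliefs at once: six colored pieces out of ten really is less than one, so her picture seems to confirm the wrong answer even as her estimate ("three quarters is pretty close to one") tells her the sum must go over.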
435 00:24:25,920 --> 00:24:34,240 Speaker 1: My conversation with Jiwu went on for some time, 436 00:24:34,560 --> 00:24:37,840 Speaker 1: and eventually I got there. Jiwu found her way 437 00:24:37,920 --> 00:24:41,040 Speaker 1: to the right answer. She said, I have more than 438 00:24:41,119 --> 00:24:44,040 Speaker 1: one cup of flour. The mentor chimed in. I got 439 00:24:44,080 --> 00:24:46,560 Speaker 1: a little emoji that made me feel good. And when 440 00:24:46,600 --> 00:24:49,720 Speaker 1: it was over, I realized two things. The first was 441 00:24:50,000 --> 00:24:53,600 Speaker 1: I needed more batting practice, much more, and that batting 442 00:24:53,640 --> 00:24:57,159 Speaker 1: practice was really, really easy to do, because someone had 443 00:24:57,200 --> 00:24:59,560 Speaker 1: gone to the trouble of building me my very own 444 00:24:59,560 --> 00:25:02,800 Speaker 1: baseball diamond and given me a pitcher who would throw 445 00:25:02,800 --> 00:25:07,080 Speaker 1: me baseballs all day long. My second thought was that 446 00:25:07,119 --> 00:25:10,960 Speaker 1: I'd been thinking about AI all wrong. I had interpreted 447 00:25:11,000 --> 00:25:12,840 Speaker 1: a lot of the talk about the promise of AI 448 00:25:13,040 --> 00:25:16,640 Speaker 1: to be about replacing human expertise. I had actually thought, 449 00:25:16,800 --> 00:25:19,359 Speaker 1: when I first heard about Dabe's project, that that's what 450 00:25:19,480 --> 00:25:22,800 Speaker 1: Dabe and Sean were doing: creating an AI to teach 451 00:25:22,840 --> 00:25:26,080 Speaker 1: students, bypassing the teacher altogether. But if you did 452 00:25:26,119 --> 00:25:29,040 Speaker 1: it that way, you would miss the magic of the classroom.
453 00:25:29,400 --> 00:25:33,639 Speaker 1: Remember Eleanor Duckworth's quote: the goal of education is for 454 00:25:33,680 --> 00:25:36,680 Speaker 1: students to have wonderful ideas and have a good time 455 00:25:36,720 --> 00:25:40,160 Speaker 1: having them. I think we often focus on the first 456 00:25:40,200 --> 00:25:44,240 Speaker 1: part of that formulation, the wonderful ideas, but neglect the second, 457 00:25:44,840 --> 00:25:49,800 Speaker 1: the good time having them. Real learning is born in pleasure, 458 00:25:50,400 --> 00:25:54,280 Speaker 1: in community, in playful discussion, in a group of kids 459 00:25:54,320 --> 00:25:57,160 Speaker 1: coming together to solve a problem. And all of that 460 00:25:57,240 --> 00:26:01,720 Speaker 1: magic only comes from human interaction, from a teacher who 461 00:26:01,800 --> 00:26:04,199 Speaker 1: is skilled enough to inspire a class of 462 00:26:04,240 --> 00:26:08,320 Speaker 1: nine-year-olds. We don't want AI assistants to replace the teacher. 463 00:26:08,920 --> 00:26:13,480 Speaker 1: We want AI assistants to help teachers turn themselves into 464 00:26:13,520 --> 00:26:30,919 Speaker 1: even better teachers. Smart Talks with IBM is produced by 465 00:26:30,960 --> 00:26:35,560 Speaker 1: Matt Ramano, Amy Gains McQuaid, Lucy Sullivan, and Jake Harper. 466 00:26:35,880 --> 00:26:39,480 Speaker 1: We're edited by Lacy Roberts, engineering by Nina Bird Lawrence, 467 00:26:40,000 --> 00:26:44,320 Speaker 1: mastering by Sarah Bugier, music by Gramoscope. Special thanks to 468 00:26:44,359 --> 00:26:48,520 Speaker 1: Tatiana Lieberman and Cassidy Meyer. Smart Talks with IBM is 469 00:26:48,560 --> 00:26:52,600 Speaker 1: a production of Pushkin Industries and Ruby Studio at iHeartMedia. 470 00:26:53,119 --> 00:26:56,879 Speaker 1: To find more Pushkin podcasts, listen on the iHeartRadio app, 471 00:26:57,280 --> 00:27:01,520 Speaker 1: Apple Podcasts, or wherever you get your podcasts.
I'm Malcolm Gladwell. 472 00:27:02,040 --> 00:27:05,680 Speaker 1: This is a paid advertisement from IBM. The conversations 473 00:27:05,720 --> 00:27:10,520 Speaker 1: on this podcast don't necessarily represent IBM's positions, strategies, 474 00:27:11,000 --> 00:27:21,760 Speaker 1: or opinions.