Speaker 1: Pushkin. There's a good chance you've heard of Khan Academy, about how Sal Khan started out more than a decade ago tutoring his young cousins and then started posting simple tutoring videos on YouTube, like how to solve a quadratic equation, that kind of thing. He founded Khan Academy in two thousand and eight, and it grew into this big thing beloved by Bill Gates and the TED talk set. Today, Khan Academy has tens of millions of monthly users and provides not just those YouTube videos, but thousands of practice questions across math and science and the humanities, along with software that monitors students' progress. All this has been in service of a big idea: using technology to give everyone the huge benefit that Sal Khan's cousins got from one-on-one tutoring. But in spite of all that, Khan Academy has not been able to fully deliver on that idea. The technology just hasn't been there to match what a human can do. But now Sal thinks that may be about to change. Earlier this year, Khan Academy launched something they call Khanmigo, an AI tutor built on top of GPT-4. I'm Jacob Goldstein, and this is What's Your Problem. My guest today is Sal Khan. He's the founder and CEO of Khan Academy. Sal's problem is this: how do you use AI to bring students the benefit of working with a human tutor? Sal says his real work with AI started last summer, when he heard from the co-founders of the company that built GPT.

Speaker 2: About a year ago, summer of twenty twenty-two, I get an email from Sam Altman and Greg Brockman, the founders of OpenAI, who I knew, you know, I bumped into them at different events and things like that, and they said, hey, you know, we're working on our next model, and we would love to talk to you all about a few things that we might be able to collaborate on.
I was skeptical that it would really have any implications for Khan Academy, but as a nerd, I said, oh, this must be GPT-4 that they must be working on. And I have a lot of respect for Greg and Sam and what they had already accomplished, so I was like, yeah, I'm just curious, let's meet. They actually at that point hadn't finished training what would eventually be GPT-4, they were about two weeks away from it, but they said, look, we think this is going to be the model, the generative AI, that really opens people's minds to what's possible here. And because of that, it's going to be exciting and a little bit scary, and so we want to launch with some socially positive use cases. The first one that came to our mind is education, and the first organization that came to our mind is Khan Academy, because y'all...

Speaker 1: They want you to be the warm, fuzzy, happy face of AI.

Speaker 2: Yeah, to see if we'd be interested. And they, you know, they don't want it to just be optics. I think they genuinely want AI to have warm, fuzzy applications. I believe that, though of course there are dark and sordid applications too.

Speaker 1: They clearly do, right. They clearly recognized the downside.

Speaker 2: Yeah, exactly. But the other reason they actually reached out to us, and I connected the dots later, they said, you know, we really want it to be good at academic things. That's where GPT-3 really had no solid handle on knowledge, and we think GPT-4 will, and y'all have a lot of items. We were just talking about how deep of an item bank we have at Khan Academy, across subjects and grades. And they were particularly interested in AP Bio, which I later learned from Bill Gates was because when he saw GPT-3, he essentially told them, look, this is cute, guys, but I'll be impressed if this could pass an AP Biology exam. So I think Greg and Sam literally said, okay, you know, Bill Gates, that's a pretty good benchmark.
Can this pass the AP Bio exam? So I think that's the other thing. They said, hey, can we use some of your items to test, to evaluate the model?

Speaker 1: So they wanted, you have, like, whatever, thousands of questions about all these different subjects, kind of test questions, test-like questions about AP Biology and calculus and lots of other subjects, and they wanted to train the model on your questions, because you have the questions and the answers, and that's exactly what you would want if you were training an AI model on sort of high school, college level knowledge.

Speaker 2: Well, it's interesting, out of the gate they were actually more interested in just the evaluation side. So once you produce a model, you want to see how good it is, so you literally make it take questions and see how many it gets right. So I was like, yeah, you know, maybe. You know, I wasn't sure if this was really going to be something we could use, but I still wanted to see the demo when it was available. So two weeks later they said, oh, we finished training, we'd love to show you a demo. And I remember that meeting. They put it up, and, you know, this isn't that mind-blowing to folks now because they've gotten used to this, but in summer of twenty twenty-two, we were on a video conference, and on Greg's screen he showed an AP Bio question and he said, Sal, what do you think the answer is? And I read the question, it was about osmosis. I'm like, oh, C. And then they asked the AI, and it said C. I'm like, okay, that's kind of cool. And then I said, well, you know, ask it why that's the answer. It explained it very well, and that's when I was like, wow, that is interesting. I said, well, ask why the other choices aren't the answer. It explained that very well. I'm like, okay, yeah, this is getting

Speaker 1: kind of wow. Right, that's still, like, I know it's not really cognition.
Maybe it doesn't understand, but something is happening there, right? Something, something.

Speaker 2: Well, I think it's starting to make all of us start to parse words. Like, we start to realize that sentience and cognition and intelligence are not necessarily, in fact they aren't, the same thing, and we can separate these things. Something can be intelligent. And I think it's going to challenge our words, our semantics, for some of these things. But then I said, hey, can it write another question like this? And it did it, and I'm like, that's a pretty good question. And then I said, write ten more questions like this, and it did it. And, you know, at that moment...

Speaker 1: Was that your moment where you were like, oh no, AI is coming for my job?

Speaker 2: I kind of thought that a little bit. But then what really opened my mind, they're like, oh, well, would you want access to it, to at least try it out, see how you might be able to use it? And I was like, yeah, I want access, I want to try out this thing. And so myself, and actually eventually our whole organization, got under a non-disclosure agreement with OpenAI back in August, and we started testing it that first weekend. And I remember, it was myself...

Speaker 1: That August, twenty twenty-two.

Speaker 2: Yes. Myself, our Chief Learning Officer Kristen, and our CTO Paul were the first people to have access to it, and I think we even got access to it before many folks at OpenAI had access to it. And we were just playing around with it, and we were also Slacking with the OpenAI team, like, hey, do you have any ideas? We're trying to get it to do this, or we got it close, but it's still... And they were giving us some really good tips, and by the end of that weekend we had it being able to take on personas and modeling, you know, pretty good tutor behavior.
And that's when it really, when I started saying, okay, this is a game changer. Science fiction has talked about artificially intelligent tutors forever, most famously the Young Lady's Illustrated Primer in The Diamond Age. Neal Stephenson wrote about this in the nineteen nineties, about, you know, an intelligent tutor being able to educate, essentially. I mean, this book was set in the not-too-far future, and it educates all these young girls in China living on barges who were orphans, and then they take over the world, because this tutor was able to empower them so much.

Speaker 1: Sidebar: Neal Stephenson, amazingly prescient, right? Like, he's the prescient guy for this, he's the prescient guy for crypto, right? Like, it's weird, right?

Speaker 2: Oh, he's pretty good. And, you know, The Diamond Age was nineteen ninety-four, so yeah, almost as an aside, it was a tablet app, so he was prescient for tablet and mobile. But that's when, when I saw that it could take on personas and act as a tutor, and not superficially but actually do some things that I would view as actually quite thoughtful as a tutor, I said, wow, I think we're at the cusp of something here. And I didn't know if it was going to happen in, like, three months or three years, but I'm like, we've got to work on this.

Speaker 1: Just to pause there and kind of go broader. Like, the big idea behind Khan Academy the whole time, right, since its inception, whatever, more than a decade ago now, is the power of a tutor, right? That, like, a tutor is profoundly valuable, and that's sort of empirically clear, right?

Speaker 2: That's exactly right. It all got started back in two thousand and four with me tutoring cousins just on the side. One cousin, Nadia, needed help, and then I started tutoring her brothers.
Word spreads in my family, free tutoring is going on, and I saw with my own cousins, just on an anecdotal level, that with everyone I was tutoring, and I was able to put in, you know, thirty, forty minutes a day with all of them collectively, it was dramatically accelerating them. And so to a large degree, when I started making exercises and software for my cousins, and then eventually making videos for my cousins that obviously many, many more folks ended up using, I was always in the mindset of: how can I help scale my tutoring? How can I start making that tutoring a little bit more self-service, so that my cousins and eventually other people could help themselves? So that's been the journey of Khan Academy for the last, I mean, it's been almost nineteen years since I started tutoring my cousins, and so this technology held the potential to take it that much further.

Speaker 1: Yeah, so you're playing with this thing. It's very early. The world doesn't even know about GPT-4 yet. What are you doing? What are you working on within Khan Academy?

Speaker 2: At this point we started to see, wow, this could really work, this could be really powerful, and we started figuring out ways to minimize some of the rough spots of gen AI, like hallucinations, where it can make things up, like it making math errors, which is obviously a problem if we want it to be a math tutor. So we started working through that. At the same time, we started having some pretty intense debates inside of our organization. Roughly speaking, half the organization was like, this is the most important technological advancement ever, or at least in our lifetimes. We've got to go all in on this, like it's our duty.
And then the other half of the team, not that they disagreed, but they said, look, Khan Academy stands for a lot, and we're there to help students, and if we go out there with something that's either not well baked, or it leads to something that's suspicious or shady or just scary for folks, because gen AI could be scary, it's going to be bad for us. So we started having those debates. But within a couple of months, as we just kept prototyping, two things happened. One is, I think, a consensus. I don't know if it was a consensus, but I started to drive alignment around the idea that these fears and these risks people are articulating are real, we should not ignore them, but there are ways to turn them into features that not only mitigate risks, they actually can be enhancements. So, for example, we said, well, what if, on what would eventually be Khanmigo, we didn't call it that then, all the sessions of a student who's under eighteen are recorded and accessible by parents or teachers? What if we have a second AI that moderates the conversations, and if the conversations go anywhere shady, it can actively notify the parent or teacher? What if our AI doesn't just give you the answer? Even before, you know, this was before ChatGPT came out, we knew that this could be used for cheating, that the raw technology could. We said, we're not going to do that, but it can support you. So we started thinking through all of these. You know, we aren't going to use information to train the AI, at least at this stage where people aren't sure how it might go. We weren't going to let any personally identifiable information go between the student and the artificially intelligent model. And so in a lot of ways, it's like we're going to put more safeguards on gen AI than frankly exist on the Internet, when kids are just randomly on the Internet.

Speaker 1: That's a pretty low bar.
You definitely want to be higher than random, let's be honest.

Speaker 2: Yeah, we made it a much higher bar, because we knew this was going to, you know, people were going to have mixed feelings about it.

Speaker 1: There are a lot of ways it could go wrong. There's a lot of ways.

Speaker 2: That's right, that's right. Then, I would say, the other big thing that happened: end of November, ChatGPT comes out, and that captured everyone's imagination. We all remember those first few weeks and months where, you know, everyone was getting on ChatGPT and taking screenshots and putting them on social media of what it was doing. And it was doing some amazing things. It was doing some very imperfect things too, the hallucinations, the math errors, et cetera. I was worried initially, because the narrative in education immediately became: this thing is error prone, this thing can be used to cheat, it's the end of term papers, homework, et cetera. School districts started banning ChatGPT. I was like, oh no, we're working so much on this, and GPT-4 is so much better, and what was eventually going to become Khanmigo was so much better. Like, I hope the baby doesn't get thrown out with the bathwater. In hindsight, that was a good thing, because it made us even internally say, look, the genie's out of the bottle, now it's how we use the genie. We're going to be a force for good, hopefully, but we have to work feverishly to show that it can be used well. And then I think by the time GPT-4 came out, and Khanmigo came out with it in March, the education system, and frankly society, had had a chance to process it, and they had come around, like, well, the technology isn't bad, we just need reasonable guardrails, and we just need tools that are built for the actual use cases. And then we were able to show up with, like, you mean like this? You mean like Khanmigo? And then most people were like, yes, exactly like that.
And now we're seeing, I mean, honestly, more schools and districts and parents want it for their students and their children than we can currently handle.

Speaker 1: So you mentioned some of the guardrails you built into it, you know, not accepting personal information, having a second AI monitoring for anything shady, not just giving answers. There's another piece of the process of building Khanmigo that I've heard you discuss elsewhere that is really interesting to me, and that's about having the AI sort of think about its answers, think about its responses. I probably haven't articulated that well, but you can, so tell me about that piece of it.

Speaker 2: As you can imagine, one of the hardest things to resolve was that even GPT-4, which is dramatically better than GPT-3.5 at math, was making an uncomfortable number of math errors, especially in the tutoring use case, where, let's say, there's an algebra problem and I, the student, take a step, and maybe I use the distributive property incorrectly. Will it recognize it? Will it not? How will it provide that feedback to the student? And out of the box, it wasn't doing that well. It was making mistakes unacceptably often. Then we had a...

Speaker 1: It's really bad if the tutor is getting the problem wrong.

Speaker 2: No, a tutor... yeah, that's not acceptable. And then an OpenAI researcher gave us an idea, which is, instead of just having the AI respond immediately to the student, what if you were to have the AI, essentially on its own, not showing this part to the student, generate what it thinks could be a reasonable response to the problem, and then use that, plus the conversation with the student, to then respond to the student.
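The idea Sal describes here, having the model work the problem privately before it answers the student, can be sketched in a few lines of Python. This is a minimal illustration under assumptions, not Khan Academy's implementation: the prompts are invented, and the calls use the public OpenAI chat-completions client with a GPT-4-class model.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def tutor_reply(problem: str, conversation: list[dict]) -> str:
    # Step 1: hidden scratch work. The model solves the problem on its own;
    # this text is never shown to the student.
    scratch = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Solve the problem step by step and state the final answer."},
            {"role": "user", "content": problem},
        ],
    ).choices[0].message.content

    # Step 2: answer the student with the scratch work as private context.
    # If the student's result disagrees, the tutor asks them to explain their
    # reasoning instead of declaring them wrong, since the scratch work itself
    # may be off.
    messages = [
        {"role": "system", "content":
            "You are a math tutor. Your own private working for this problem is:\n"
            + scratch
            + "\nNever reveal this working or the final answer directly. "
            + "If the student's result differs from yours, say you got something "
            + "different and ask them to walk through their reasoning."},
        {"role": "user", "content": problem},
    ] + conversation  # the student-tutor exchange so far
    reply = client.chat.completions.create(model="gpt-4", messages=messages)
    return reply.choices[0].message.content

The point is just the two-pass structure: the model's hidden answer gives it something to compare the student's work against, and when the two disagree, it asks rather than asserts, which is what Sal describes next.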
Speaker 1: Yeah, and that seems like the next level, right? Like, a teacher would always be doing that, but there's no reason to think that just a raw large language model would be.

Speaker 2: That's right. But you know what's interesting about this? We did it, and it dramatically improved the math. And then we started tweaking it more and more, and we got it better and better. Now we do a bunch of fancy stuff along those lines to get it a lot better. But then we realized, to your point, that's exactly what a teacher would do. So that's what Khanmigo does. It works on its own, on its own scratch paper, so to speak, and compares the student's response. If they got something different, Khanmigo doesn't assume the student is wrong, because Khanmigo can sometimes be wrong even on its own work. It says, hey, I got something different, can you explain your reasoning? And then when the student explains the reasoning, and that's something that's really good for large language models, it's actually able to understand that. And what's interesting, not only is that very pedagogically good, we've been getting a lot of strong feedback from students that they really appreciate that type of an interaction. It eerily feels human. You were so used to computers being so perfect, like, you're wrong, you're right, that's what a robot would do. But no, this is what a tutor would do. It's like, hey, I didn't get the same thing, let's work through it together, let's see who made the mistake.

Speaker 1: In a minute, how Khanmigo compares to ChatGPT today, and what it might be like a few years from now.

Speaker 1: When I was getting ready for this interview, I played around with Khanmigo and ChatGPT. I put them side by side in different tabs in my browser, and I asked each one a basic calculus question: how do you take a derivative?
This is a thing that I knew how to do a long time ago, but I forgot, also a long time ago, so it seemed like a good test. And it was striking how different the responses were. ChatGPT gave me something like the Wikipedia entry about derivatives: a bunch of text with some rules, some equations. Then I tried it on Khanmigo, and first it asked me what I knew, and I said I knew algebra. And then it explained one rule, a sort of first rule, the power rule for finding derivatives, and it gave me a problem to try. I got that problem wrong, and it sort of asked me what I was thinking, and I tried it again and I got it right. And then Khanmigo asked if I wanted to try another problem, and I said yes, and it gave me another practice problem. So clearly this is a very different experience than plain vanilla ChatGPT. But it was still a little bit awkward and a little bit hard to follow. And when I closed the tab and came back, it didn't remember our previous conversation. So overall, very good, but not quite there yet. I asked Sal if that seemed about right to him.

Speaker 2: Yeah, I think that's pretty accurate. I think where it is today, it's like, if you really wanted to learn how to take a derivative, or if you're learning calculus for the first time, I would say go to the calculus course on Khan Academy, start watching some of those videos, do those practice problems. And what Khanmigo is going to be really good at is, if with those videos, or the practice problems, or the articles we have, or the hints we provide, there's still some itchy conceptual thing you're not getting, Khanmigo is really good at trying to unlock that one conceptual thing.

Speaker 1: Like, for, I think, really kind of a narrow problem, like you're almost there, but you just need, like, one more explanation, or a different kind of explanation or something.
384 00:19:46,156 --> 00:19:49,316 Speaker 2: Exactly, or there's just some conceptual dimension that maybe the 385 00:19:49,356 --> 00:19:52,316 Speaker 2: video didn't address that you're curious about or you want 386 00:19:52,316 --> 00:19:54,996 Speaker 2: to connect it. You're learning about entropy and chemistry, and 387 00:19:54,996 --> 00:19:57,356 Speaker 2: you're like, I've heard this word in computer science, how 388 00:19:57,356 --> 00:20:00,156 Speaker 2: are they similar? Con Migo is great for that. I 389 00:20:00,156 --> 00:20:04,876 Speaker 2: don't think conmego by itself is a place where you 390 00:20:04,916 --> 00:20:08,396 Speaker 2: would just say, start being my calculus tutor for the 391 00:20:08,436 --> 00:20:11,316 Speaker 2: next year year and you're gonna work through calculus with me. 392 00:20:11,716 --> 00:20:14,716 Speaker 1: So that's that's where you are today. Where do you 393 00:20:14,716 --> 00:20:18,436 Speaker 1: think you're going to be in a year or five years? 394 00:20:18,636 --> 00:20:22,356 Speaker 2: Yeah, one year, I'm quite confident we're going to have 395 00:20:22,636 --> 00:20:25,436 Speaker 2: I mean it might be six months memory. So kind 396 00:20:25,436 --> 00:20:26,796 Speaker 2: of migo is going to be able to know about 397 00:20:26,796 --> 00:20:29,756 Speaker 2: previous conversations. And memory isn't just even about that, it's 398 00:20:29,756 --> 00:20:31,796 Speaker 2: also about being able to report back to teachers, so 399 00:20:31,836 --> 00:20:33,956 Speaker 2: the teachers can say, Hey, con Migo, what have you 400 00:20:34,076 --> 00:20:36,716 Speaker 2: been working on with my students? Have you noticed any 401 00:20:36,756 --> 00:20:40,076 Speaker 2: general patterns amongst my students, any conceptual gaps? In fact 402 00:20:40,076 --> 00:20:42,796 Speaker 2: that functionality we're already prototyped that one and we're going 403 00:20:42,836 --> 00:20:45,356 Speaker 2: to launch that in the coming months, but it can 404 00:20:45,396 --> 00:20:48,316 Speaker 2: also develop insights about the students, like hey, you know 405 00:20:48,436 --> 00:20:53,076 Speaker 2: Mary is really into anime. Whatever I give an anime example, 406 00:20:53,156 --> 00:20:56,596 Speaker 2: she lights up, or you know, Billy really likes money 407 00:20:56,956 --> 00:20:59,956 Speaker 2: and so and so we're developing were already have a 408 00:20:59,956 --> 00:21:02,476 Speaker 2: prototype of that where it develops these insights, but we're 409 00:21:02,516 --> 00:21:05,236 Speaker 2: making it transparent to the user, like these are the 410 00:21:05,236 --> 00:21:07,516 Speaker 2: insights that it's collected about you, so the. 411 00:21:07,556 --> 00:21:11,116 Speaker 1: User, so people don't freak out like why does this 412 00:21:11,196 --> 00:21:12,476 Speaker 1: machine know that I like anime? 413 00:21:12,836 --> 00:21:15,076 Speaker 2: Or so doesn't climb the ladder of inference, which unfortunately 414 00:21:15,076 --> 00:21:17,796 Speaker 2: a lots of humans do about other people, where the 415 00:21:17,836 --> 00:21:20,956 Speaker 2: student can say no, actually, I just I used to 416 00:21:20,996 --> 00:21:23,276 Speaker 2: be into anime. I'm not into it anymore. Or no, 417 00:21:24,156 --> 00:21:26,316 Speaker 2: I know you think I really like that, but that's 418 00:21:26,356 --> 00:21:28,076 Speaker 2: not you just you just went a little bit too 419 00:21:28,116 --> 00:21:30,436 Speaker 2: far with that. 
So you're going to have that, I think, within a year. I don't know if we're going to have this out to hundreds of thousands or millions of people yet, but you're going to have the ability to talk to Khanmigo much as you would talk to, you know, your Amazon or Apple devices. But it's going to be far more intelligent than those devices. It's going to know all your context on Khan Academy, even if you're not talking to it.

Speaker 1: So you basically made a voice interface instead of a typing interface. Just to be clear, when you say talk, you mean speak?

Speaker 2: Yes. And I think the other thing that will surprise folks is how human-like the voice will be in a year. A year from now, I think that whole loop of teachers developing assignments, creating rubrics, assigning them to students, students doing the assignment with an AI, the AI reporting back to the teacher that, one, yes, the student really did the work with me, like, we worked together, it didn't just get copied and pasted from ChatGPT, so that hopefully addresses the cheating issue, and gives students more support, and then also the AI giving a preliminary grade to the teacher, I think you're going to see that whole workflow in about a year, and you're going to see it beyond Khan Academy. We made an announcement a couple of weeks ago with Instructure, the people who make, you know, the biggest learning management system for K-12 and higher ed. You might start seeing Khanmigo there and other places.

Speaker 1: Meaning in schools that use this particular platform.

Speaker 2: Yeah. Right now, Khanmigo is only on Khan Academy's website. I think in a year you're going to see Khanmigo sit on other websites as well, not just on Khan Academy's website.

Speaker 1: That's a year, and those all seem like sort of within this new universe we've learned to live in of where generative AI is. Now, what's the five year?
Where's it going to be in five years?

Speaker 2: Yeah, the five year is kind of wild.

Speaker 1: Yeah, the five years are so wild, right? Like, we might not need you anymore in five years, right? That's part of what I was thinking about, like, truly. Like, I don't know, what is the five year?

Speaker 2: I think in five years you're going to be able to have an interview with a gen AI version of Sal that will look like this interview you're doing with me right now. And for those listening, you can see me right now, we're on a video conference, so you can actually see me. I actually think...

Speaker 1: At some point, that AI could teach me, whatever, derivatives as well as you could teach me derivatives, at some point in the future. Do you have a sense of how far off that is?

Speaker 2: I think in the five year timeframe. It's an engineering problem more than a science problem at this point, where you just have to make the stuff fit together so it feels seamless and it feels really natural and magical. And that's what we're spending a lot of time doing, so that the memory feels natural, so that how it holds you accountable feels natural.

Speaker 1: It needs to understand me a little bit. It needs to understand better than it understands now, based on my very limited experience. Or communicate its understanding, or something.

Speaker 2: I think it's more, even with GPT-4, about the prompting and the communication and what you're passing to it.

Speaker 1: Okay, fair.

Speaker 2: I mean, in five years you're probably going to have GPT-5 or 6, and you're going to have these other large language models, because, you know, you have at least five or six major groups throwing billions at it.

Speaker 1: Will Khanmigo be basically redundant to GPT-X, whatever, GPT-6? And then, like, where do teachers fit in? Right? Those seem like two logical questions there.
Speaker 2: I've been thinking a lot about this and speaking a lot about this. I think you're going to see job disruption in a lot of places, but I do not think it's going to happen in teaching. I think, if I told every teacher on the planet, hey, all of a sudden we discovered all this money and we're going to hire three teaching assistants for every one of you to go into your classroom, and they're going to help you with lesson planning, they're going to help you grade papers, they're going to help you write progress reports, and while you're in the class session, they're going to circulate and answer any questions the students have and then report back to you, I mean, I think every teacher would be like, finally.

Speaker 1: So that's the teacher side. What about the Khan Academy side? Like, could AI render Khan Academy obsolete?

Speaker 2: Yeah, I think in five years, honestly, gen AI might be able to make real-time videos very similar to Khan Academy videos, but they feel like real-time explanations. It will be that much better at potentially creating exercises and things like that. But I think it's still going to be better when it's anchored on a framework, on a scope and sequence, just as a teacher, a real teacher, right? A real teacher can do all of these things, but they are better when they have a curriculum, when they have textbooks, when they have Khan Academy, when they have all of these other tools around them. I think the same thing is going to be true of gen AI for a very long time.

Speaker 1: We'll be back in a minute with the lightning round, in which I try to get Sal to reach back into his days as the frontman of a heavy metal band.

Speaker 1: Let's do a lightning round. What's the first Bollywood movie I should watch, if I've never watched a Bollywood movie?
Speaker 2: You know, my standard disclaimer with all Bollywood movies is that there will be moments in almost every Bollywood movie that will make you cringe. There's going to be a little bit of overacting, people are going to be dancing at inappropriate times. But Aamir Khan, who's one of the more famous Indian actors, and he also produces movies, he has this movie called Three Idiots. That one I recommend, and it has some, you know, to, I think, Western sensibilities, some cringeworthy moments, but it's about education. You know, not to be too self-aggrandizing, but the main story is about this guy, it's actually based loosely on a real person, who was very unsatisfied with the education system.

Speaker 1: Is it based on you? Yes or no?

Speaker 2: No, it's not based on me. It's not based on me. But he ends up eventually starting his own school. But he also ends up falling in love with a doctor and marrying her. And I'm like, that's my life.

Speaker 1: What was the name of the heavy metal band you were in in your youth?

Speaker 2: The name was Malignancy, until we realized someone else...

Speaker 1: Had that good name. It sounds like a heavy metal band. Can you give me a few bars of one of your songs?

Speaker 2: No. But come on, I have a certain brand now. I can't be growling into a microphone anymore. A whole series of lyrics just went through my head, and my podcast filter vetoed all of them.

Speaker 1: Just one phrase, one phrase. You don't even have to sing it, just say it.

Speaker 2: No. I was angry.

Speaker 1: Yeah, yeah. What's the last non-work thing you used ChatGPT for?

Speaker 2: Oh, last non-work thing? I partially used it to plan several vacations, actually, and it worked.

Speaker 1: It was helpful? Yeah.
So, you know, one of the things that's interesting to me, looking at the arc, or the ramp, of Khan Academy from the outside, is that it's been clearly, like, profoundly successful, and kind of a darling of this sort of Bill Gates, TED talk, Silicon Valley universe, and reasonably so, rightly so. But I'm curious, like, what on the inside of building the organization was hard? Harder than it looked?

Speaker 2: I think when you scale an organization, you know, I was a one-person shop for many years, and then you start scaling. And I definitely think when we got to between twenty people and one hundred and fifty, we had a lot of growing pains. A lot of organizations do, I now see. I mean, when you're in it, you're like, is it just us that can't seem to figure this out? But I realized there's something that happens when you get to that size. My old management philosophy was: just get the smartest, most passionate people in the room, and then we'll figure it out. And I've now realized it's: get the smartest, most passionate, most aligned folks in the room, and work constantly to align around a true north. So yes, I think, like many startups, when we were in startup mode, we probably zigged and zagged more than we necessarily had to. But maybe that's part of just natural growing pains. That's probably been the biggest source of tension over the years, of, like, our mission is so big, should we just do one part of it? What about international? What about domestic? What about, you know, English language arts? What about math? And so we've been pulled in so many different directions, and we violated a lot of basic business strategy. Basic business strategy would be, like, focus on just one thing, and I'd be like, I'm impatient. I mean, to your point, I used to be the young guy on the scene, you know, with this. I started out in my early thirties, and now I'm approaching fifty.
I'm about to turn forty-seven years old, and I'm like, wow, if I'm lucky, I only have another good twenty, twenty-five years. And I started tutoring, you know, nineteen years ago, like, we've got to get on this. We've got to do all the subjects, all the grades, all the countries, as soon as possible. I want Khan Academy to be around for hundreds of years, thousands of years. I read a lot of science fiction books, and so, you know, even if it is around in one hundred years, two hundred years, and serving this mission of a free, world-class education, how does it do that when I'm not there, or when people who knew me are no longer there? And I don't want it to just exist. I mean, there is a success scenario where in a hundred years it's successful, but then it's the incumbent, it's just like every other large publisher, and that's a lame scenario too. You know, if I were reincarnated at that time, I'd want to disrupt that incumbent. So how does it stay innovative and not take itself too seriously, but really be, hopefully at that point, serving most of humanity?

Speaker 1: That is wildly ambitious.

Speaker 2: It is.

Speaker 1: Sal Khan is the founder and CEO of Khan Academy. Today's show was produced by Gabriel Hunter Chang and Edith Russlo, and it was edited by Sarah Nix and engineered by Amanda K. Wong. I'm Jacob Goldstein, and we'll be back next week with another episode of What's Your Problem.