Pushkin.

Jacob Goldstein: Artificial intelligence is this weird, big phrase that suddenly seems to be everywhere, and it can be hard to know exactly what it means. But when businesses say they're using artificial intelligence, they usually mean one particular thing: automated systems that can take in lots and lots of data and use that data to make predictions. This is called machine learning, and it's spreading everywhere. Drug companies use it to predict which molecules are likely to work as medicines. Hedge funds use it to predict which stocks are going to go up or down. Instagram uses it to predict which ads I'm most likely to click on. For the record, the machine has learned that I will often click on ads for overpriced workout clothes. Anyway, if you want to understand what's happening with business and technology today, you really have to understand machine learning.

I'm Jacob Goldstein. This is What's Your Problem, the show where entrepreneurs and engineers talk about how they're going to change the world once they've solved a few problems. My guest today is Luis von Ahn, the founder and CEO of Duolingo. Duolingo is both a wildly popular language app and a hardcore tech company built on machine learning. Luis used to be a professor of computer science at Carnegie Mellon, and in our conversation he was really candid about the technical limits of what Duolingo can do today. The app is good at teaching people to read and to understand, he said, but it is not as good at teaching people to speak a new language. And solving that problem turns out to be part of a great, big, interesting frontier problem, one that is relevant not just for Duolingo but for the whole field of artificial intelligence.

We started out talking about the origins of Duolingo, which go back to a different problem, one that Luis discovered before he'd ever heard of machine learning. It was a problem he saw all around him when he was growing up in Guatemala.
Luis von Ahn: I was fortunate that my mother basically spent essentially her entire net worth on my education, and so I was fortunate that I got a good education. But then I could see that the people who got a public education barely learned how to read and write. This is just what would happen. And you cannot expect that these people are going to become, you know, the CEO of a public company or anything like that, because they kind of won't.

Jacob Goldstein: Often people talk about education as an engine for reducing inequality, but what you're describing is the exact opposite: education, when you have to pay for it, is a mechanism for perpetuating inequality.

Luis von Ahn: And I really believe that, and I believe that's true in most countries in the world. There may be some countries, you know, like the Scandinavian countries, where pretty much everybody gets the same education.

Jacob Goldstein: Right, yeah.

Luis von Ahn: There may be some countries like that. But in the vast majority of countries, if you have money, you can get a much better education. So I wanted to do something that would give equal access to education to everybody, and we started with that. But then we started thinking: okay, education is pretty general, so let's start by teaching one thing. Eventually we settled on teaching languages, for a number of reasons, the biggest one of which is that learning English in particular can completely change people's lives. If you know English, you can double your income potential in most countries.

Jacob Goldstein: Wow. It's just as simple as that? Why is that?

Luis von Ahn: Basically, it opens things up: for almost any job, you can get a better version of that job. For example, you could be a waiter, or you could be a waiter at a five-star hotel. You could be an executive assistant, or you could be an executive assistant for a multinational CEO.

Jacob Goldstein: Yeah, yeah. Okay. So the core reducing-inequality dream here is really teaching English to people, largely in poorer countries.
Luis von Ahn: Yes. So what we wanted to do was teach English. But you know, if you're going to teach English, you may as well teach other languages too. So teach those, and do so for free.

Jacob Goldstein: So that's what Luis did, and it worked. Today, tens of millions of people use Duolingo every month to learn English and dozens of other languages, for free. The company makes money by selling ads and premium subscriptions. It went public in 2021 and is currently worth billions of dollars. And the company really is built on machine learning. Luis gave me a few key examples of the way the company uses the technology.

Luis von Ahn: So let me tell you a few of the things that we do. One of the things we do that we like very much: we have data from whenever people use Duolingo. We record every exercise that they do and whether they got it right or wrong, and if they got it wrong, why they got it wrong. With all of this data, we're able to do certain things with artificial intelligence. For example, for every exercise that we're about to give you, we're able to predict the probability that you're going to get this exercise right or wrong.

Jacob Goldstein: So, in a sense, that's a thing that a teacher in a classroom could do fairly easily, right? A teacher with twenty students. But you're able to do it with... how many people use your app actively?

Luis von Ahn: Forty-two million per month.

Jacob Goldstein: So the machine can do that for all forty-two million people at the same time?

Luis von Ahn: More or less, yes. And very accurately.
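The show doesn't go into how that prediction is made, but the general idea, turning a learner's history with an exercise into a predicted probability of a correct answer, can be sketched with a simple logistic model. Everything below (the features, the weights) is invented for illustration and is not Duolingo's actual system:

```python
import math

def predict_success(features, weights, bias=0.0):
    """Toy logistic model: map learner/exercise features to P(correct).

    The feature set and weights are hypothetical; the real model
    and its inputs are not described in the conversation.
    """
    score = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-score))  # squash the score into (0, 1)

# Made-up features: [times practiced, days since last practice, past accuracy]
features = [5.0, 2.0, 0.9]
weights = [0.3, -0.4, 2.0]  # invented weights, for illustration only

print(f"P(correct) = {predict_success(features, weights):.2f}")  # ~0.92
```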
Luis von Ahn: Part of the secret sauce of Duolingo is that we realized that if we were to only give you things that you're not very good at, we'd basically be giving you lessons from hell every time. We can't do that, because that frustrates users. So whenever you start a lesson in Duolingo, we're actually trying to optimize for two things at the same time. We're trying to teach you things that you're not very good at, but we're also trying to keep you motivated and engaged. And the way we do that is we try to give you exercises for which we know you have about an eighty percent chance of getting them right.

Jacob Goldstein: Huh. And have you found that to be the sweet spot? I mean, have you done experiments and sort of turned the dial?

Luis von Ahn: We've done that, and you know, we're not the first to figure this out; there's a lot of literature in psychology and so on. And the number is not exactly eighty percent. It's a little higher than that, like eighty-three percent or something. But there is a number, and it really is the case that if that number is higher, these things are a little easier for you, and then you get a little bored. You feel like you're not learning.

Jacob Goldstein: Right. If I'm getting ninety-five percent right, I'm like, what, I'm just wasting my time?

Luis von Ahn: And you feel bored because it's like a game that you always win. That's nice at the very beginning, but then you're just not going to play it. And if the number is lower than that, it means things are too hard for you; you get very frustrated and you go away. And there are a lot of tricks that, you know, app developers certainly play, and we play as well. So I'll tell you another, kind of similar trick. We end up applying it to language, but the easiest way to understand this trick is with a slot machine. When you get two out of three, it's: you almost got it, you've got to do one more, you've got to do one more. So there's this "you're so close" psychological trick that we play. It's like, oh, two out of three, almost got it.

Jacob Goldstein: But you knew I was going to get two out of three.
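Given per-exercise predictions like the one sketched above, aiming for that roughly eighty percent sweet spot can be as simple as ranking candidate exercises by how close their predicted success probability is to the target. A minimal sketch, with made-up exercises and probabilities; the real selection logic surely balances many more factors:

```python
TARGET = 0.80  # Luis says the real number is slightly higher, around 0.83

def pick_exercises(candidates, k, target=TARGET):
    """Return the k candidates whose predicted success probability
    sits closest to the target sweet spot.

    candidates: (exercise_id, p_correct) pairs, where p_correct would
    come from a model like the one sketched earlier.
    """
    return sorted(candidates, key=lambda c: abs(c[1] - target))[:k]

# Hypothetical predictions for one learner
candidates = [
    ("ser_vs_estar", 0.55),  # too hard: frustrating
    ("greetings", 0.97),     # too easy: boring
    ("food_vocab", 0.81),
    ("past_tense", 0.70),
    ("numbers", 0.84),
]

for exercise, p in pick_exercises(candidates, k=3):
    print(f"{exercise}: {p:.2f}")  # food_vocab, numbers, past_tense
```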
Jacob Goldstein: Yes, sure: you gave me two easy ones and one that was super hard.

Luis von Ahn: That's exactly right. So we play this type of trick, where people are like, "almost got it," and that gets them to do another one. So you know, in our case, we basically spend a lot of time training computers to figure out what it is that makes people use Duolingo for longer, and also what teaches them more. So that's a major use for artificial intelligence. The main use is just in teaching better.

Jacob Goldstein: After the break: a big problem Luis and Duolingo are still trying to solve, a problem that turns out to be a big frontier problem for all of artificial intelligence.

That's the end of the ads. Now we're going back to the show. So let's talk now about problems you haven't solved yet. You know, like, what are you trying to figure out? What are you working on that isn't quite working yet?

Luis von Ahn: So Duolingo is very effective at teaching you all kinds of things. But if you look under the hood at what it is that you're learning: you're learning reading really well. You're learning writing pretty well, but not as well as reading. You're learning listening pretty well. But you're not learning spontaneous speaking very well. This, by the way, is also something you're not learning very well in university semesters; you're basically not learning that well either on Duolingo or in university semesters.

Jacob Goldstein: Okay.

Luis von Ahn: It's just harder to teach in a sort of classroom. What you need to do to teach that is basically have you really interact with, well, for now, another human, and just practice that a lot. Now, here's the thing about that. I know how to get you to interact with another human: just put another human there. The problem is that about eighty percent of our users just do not want to talk to a stranger in a language that they're not very good at.
Luis von Ahn: So the problem that we're trying to solve here is: how do we let you practice kind of spontaneous conversation, but without having a human on the other side? And we've been working on that, and you know, we're not there yet.

Jacob Goldstein: Can I just interrupt? Because, I mean, we were talking about artificial intelligence, right? The most famous test that I know of, the most famous idea I know of, of artificial intelligence in a computer, is: can a computer hold one end of a conversation? That's the classic Turing test. You're going to have this chat conversation, and can you tell whether the person on the other end is a person or a machine? That's the OG artificial intelligence idea, right? I mean, are you telling me that's what you're trying to solve?

Luis von Ahn: Not quite. I mean, it would be awesome if we solved that.

Jacob Goldstein: But that's the dream, right? The solution to that?

Luis von Ahn: That is the dream. But notice that in our case, we don't actually care if the human can tell that there's a computer on the other side. As long as it lets you practice, and as long as it's able to carry on a conversation in a way that seems a little natural or something, it's okay if it, you know, goes off the rails every now and then.

Jacob Goldstein: So tell me, what is it that you're trying to build? This is exciting. Like, what are you trying to do?

Luis von Ahn: We're starting with text, by the way. So just basically a texting conversation. Think of it as like a chatbot in, you know, Spanish, where you're just having a little conversation.

Jacob Goldstein: Lots of people have had experience with chatbots, right? Like, you go to whatever, to cancel your cable, and they want you to text, and then you realize you're texting with a machine. So that's step one?

Luis von Ahn: So that's the idea. That's step one.
Luis von Ahn: Of course, a lot of those experiences with chatbots are very geared at whatever it is you're trying to do. So, for example, that chatbot may be very good at canceling your cable, but only that.

Jacob Goldstein: In my experience, they're not even good at that.

Luis von Ahn: No, they're not that great. So we're trying to do that, and you know, we're not there yet, I don't think.

Jacob Goldstein: So you're, like, out on the frontier.

Luis von Ahn: We are. We are.

Jacob Goldstein: You're trying to...

Luis von Ahn: Yes, and we're not there yet. I mean, this is something that's going to take, not just us, I mean the whole academic community and the technology industry, just a few more years.

Jacob Goldstein: So let me ask you this. Can we try and just go one level into sort of what you're trying to do, and what works and what doesn't work, and why it's hard?

Luis von Ahn: Yeah. The first thing you think of, by the way, if you're trying to make a chatbot is: okay, I'm just going to program the computer. Forget about artificial intelligence. I'm just going to program the computer to respond to specific questions. How many possible questions could there be? You start thinking: okay, well, when the person says "hi," we're going to program the thing to say "hi" back. When the person says "how are you doing," we're going to program the thing to say "I'm doing pretty well, how about you?"

Jacob Goldstein: Yeah, V-zero of a chatbot.

Luis von Ahn: And this, you know, comes from fifty years ago; this is what you start doing. The problem is that there are billions of things that people can say. So we may have programmed the thing with what to say to "how are you doing," and we can respond. But instead of asking that, they may ask, like, "Hey, did you watch the game last night?" And we just have no idea how to respond to that.
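The V-zero chatbot Luis is describing is essentially a lookup table from anticipated messages to canned replies. A minimal sketch, with rules made up to mirror his examples; it fails in exactly the way he describes the moment it sees an unscripted question:

```python
# A "V-zero" rule-based chatbot: canned replies keyed on the exact
# messages somebody anticipated. (Hypothetical rules, for illustration.)
RULES = {
    "hi": "Hi!",
    "how are you doing?": "I'm doing pretty well. How about you?",
}

def reply(message: str) -> str:
    # Normalize the message, then look it up; anything unscripted fails.
    return RULES.get(message.strip().lower(),
                     "Sorry, I don't know how to respond to that.")

print(reply("Hi"))                                  # scripted: works
print(reply("Did you watch the game last night?"))  # unscripted: fails
```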
Jacob Goldstein: About a decade ago, Luis says, AI researchers started trying a really different approach. Rather than trying to teach computers every rule, they started throwing massive amounts of documents and text at computers and essentially telling the computers: figure out the patterns in all these documents. So when somebody writes something like "did you watch the game last night?", the computer should be able to predict what kinds of answers might follow. This strategy clearly has not entirely worked yet; that's why this is still a problem to solve. It will take both more text and more clever algorithms to help computers make sense of that text. But, Luis says, you can see progress every time you open your Gmail or a Google Doc.

Luis von Ahn: I don't know if you've used Google Docs or Gmail lately, for example; it finishes off your sentences now. And basically the way this works is, you know, this system has looked at a ton of text that has been written by a lot of people. In the case of Google Docs, I actually don't know what they look at, but I wouldn't be surprised if they look at everything that has ever been written in Google Docs.

Jacob Goldstein: I'm going to tell you one that happened to me in Google Docs today. When I was typing notes for this interview, I typed "zone of pr," and you know how it completed it?

Luis von Ahn: Proximal development.

Jacob Goldstein: Yes. It knew I was going to write "zone of proximal development."

Luis von Ahn: Yep. No, this is amazing. And they just see that if you write "zone of pr," there's like a ninety-five percent chance that it ends in "proximal development."
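Google hasn't spelled out how its autocomplete works, and the conversation only gestures at the statistical idea: look at a huge amount of text and suggest the continuation that most often follows what you've typed so far. A toy sketch of that idea, with a tiny invented corpus tuned so the numbers echo Luis's ninety-five percent example:

```python
from collections import Counter

# Tiny made-up corpus; in reality this would be billions of documents.
corpus = (["zone of proximal development"] * 95
          + ["zone of protection"] * 5)

def complete(prefix: str):
    """Suggest the most common continuation of `prefix` in the corpus,
    along with the share of the matching text it accounts for."""
    matches = Counter(text for text in corpus if text.startswith(prefix))
    if not matches:
        return None, 0.0
    best, count = matches.most_common(1)[0]
    return best, count / sum(matches.values())

suggestion, share = complete("zone of pr")
print(f"{suggestion!r} ({share:.0%} of matches)")
# 'zone of proximal development' (95% of matches)
```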
Jacob Goldstein: What is the zone of proximal development?

Luis von Ahn: You know, in teaching, there's this concept of keeping you at this zone of proximal development, which means always kind of challenging you, giving you things that you don't know, but things that are fair to give you.

Jacob Goldstein: Proximal means something like "close to" or "next to," right? So the idea is: you know a thing, and what you want to teach the person is the very next thing.

Luis von Ahn: That's right. It's like the frontier of your knowledge.

Jacob Goldstein: I like it, because it applies to the way you teach, but also to your work, right? It's just a nice life idea: the next thing, you want the next thing. And I feel like the chatbot is maybe a version of that at the level of your company.

Luis von Ahn: Yeah, it really is. It really is a nice idea. And if you think about it, this is what a great teacher does. I've said this inside the company at Duolingo: all we need to do is first figure out what you know (by the way, not that easy to figure out what you know), then just take you to that zone of proximal development. Because now we know what you know, just take you to the frontier, and then keep expanding it as fast as possible. That's all we need to do. Of course, this is easily said, hard to do.
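Neither Luis nor the show spells out how you would compute that frontier. One toy way to formalize it is a prerequisite graph, where the zone of proximal development is every skill you haven't mastered whose prerequisites you have. Purely illustrative, with invented skills; this is not Duolingo's actual curriculum model:

```python
# Skills form a prerequisite graph; the "zone of proximal development"
# is every unmastered skill whose prerequisites are all mastered.
PREREQS = {
    "greetings": [],
    "present_tense": ["greetings"],
    "food_vocab": ["greetings"],
    "past_tense": ["present_tense"],
}

def frontier(mastered):
    return {skill for skill, reqs in PREREQS.items()
            if skill not in mastered and all(r in mastered for r in reqs)}

print(frontier({"greetings"}))  # {'present_tense', 'food_vocab'}
```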
Jacob Goldstein: And is there a limit to what you can do with a computer? Is there anything a teacher can do that a computer will never be able to do?

Luis von Ahn: You know, of course, at Duolingo we love teachers. If they are a good teacher and also have the time, they are much more able to adapt to their students than a computer is. But I don't believe that will always be the case. And it's not just teachers; teachers are just one example. At some point, my belief, and this is of course just my belief, not everybody agrees, is that computers will be able to do every single thing that humans can. Now, you may start asking really tough questions, like: can they love? Yeah, I don't know. I don't know whether they can love or not, but from the outside it will look just as if they love. So who knows, who knows what's going on inside?

Jacob Goldstein: Those are big philosophical questions that I'm not here to answer today.

Luis von Ahn: That's right, nor am I. But in terms of input-output behavior, I don't see any reason why computers won't be able to do everything that humans can.

Jacob Goldstein: So they can teach, but they can also write computer code, they can also run companies, they can also make podcasts, they can do everything?

Luis von Ahn: They should be able to do that. I think they should be able to do that. I don't know when that'll happen, but they should be able to do that.

Jacob Goldstein: In a minute, the lightning round, where we'll hear what job Luis would love to do but thinks he wouldn't be very good at, and the real reason treasure chests keep showing up in Duolingo.

And now, back to the show. We're going to finish with a lightning round. Not counting Duolingo, what's your favorite app on your phone?

Luis von Ahn: Spotify.

Jacob Goldstein: What have you been listening to on Spotify?

Luis von Ahn: I'm always a huge fan of the band called Chvrches, "churches" spelled with a V. So that's what I was listening to this morning on my walk to work.

Jacob Goldstein: If you have a ten-minute break in the middle of the day, what do you do to relax?

Luis von Ahn: I play this game called Clash Royale. A lot of the gaming mechanics that we use for Duolingo come from gaming companies.

Jacob Goldstein: Like the treasure chests.

Luis von Ahn: Exactly right, the treasure chests. If you've ever played Clash Royale, they have the treasure chests.

Jacob Goldstein: If somebody's going to visit Guatemala for the first time, what's one thing they should definitely do?

Luis von Ahn: Oh, Tikal, the Mayan ruins. You know, and I feel very strongly about this: I've been to southern Mexico, where they have Chichén Itzá. It's a joke compared to the Mayan ruins in Guatemala. There's one pyramid in Chichén Itzá.
Luis von Ahn: There are four hundred in Tikal, in Guatemala. So yeah, that's what they should do.

Jacob Goldstein: I like that. Not only are you recommending Tikal, you're also taking a shot at Chichén Itzá.

Luis von Ahn: I have no trouble with Chichén Itzá. It's just that they are very good at marketing.

Jacob Goldstein: Amazing. What would you do if you couldn't do the job you do now?

Luis von Ahn: Well, there's what I would actually do and what I would like to do. I would love to be a writer; I don't think I'd be a very good one. So if I wasn't doing the job that I'm doing right now, you know, I'd probably go back to being a professor.

Jacob Goldstein: How will you know when it's time to retire?

Luis von Ahn: I'm never retiring.

Jacob Goldstein: That's what everybody says.

Luis von Ahn: Well, maybe I will. But I mean, right now, I don't want to do that.

Jacob Goldstein: Luis von Ahn is the founder and CEO of Duolingo. Today's show was produced by Edith Russelo. It was edited by Kate Parkinson-Morgan and Robert Smith, and it was engineered by Amanda Kay Wong. Theme music by Luis Guerra. Our development team is Lital Molad and Justine Lang. A huge team of people makes What's Your Problem possible. That team includes, but is not limited to, Jacob Weisberg, Mia Lobel, Heather Fain, John Schnars, Kerry Brodie, Carly Migliori, Christina Sullivan, Jason Gambrell, Brand Hayes, Eric Sandler, Maggie Taylor, Morgan Ratner, Nicole Morano, Mary Beth Smith, Royston Deserve, Maya Kanig, Daniello Lakhan, Kazia Tan, and David Clever. What's Your Problem is a co-production of Pushkin Industries and iHeartMedia. To find more Pushkin podcasts, listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. I'm Jacob Goldstein, and I'll be back next week with another episode of What's Your Problem.