Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and love all things tech, and it's time for a classic TechStuff episode. This episode originally published on June twenty-third, two thousand fourteen. It is titled "Passing the Turing Test," something that we frequently associate with artificial intelligence and, you know, machines actually thinking, or at least appearing to think. So let's listen in to this classic episode.

Speaker 1: The Turing test is named after Alan Turing, and we've done a full episode on Alan Turing way back when, back in November. Yeah, phenomenal person, amazing thinker. Yeah, one of the, like, the grandfathers of computer science. Also a tragic life story, which we went into in detail back in that episode. And so you might wonder, right, what does the guy who was essentially the father of computer science, or grandfather of computer science, have to do with the story about a computer program in June two thousand fourteen, a computer program with an interesting name, Eugene Goostman, passing the Turing test? What does that all have to do with each other? Well, to answer that, we have to ask: what is a Turing test? Well, as it turns out, way back in the nineteen fifties, he started envisioning a thought experiment. Yeah, he published a paper in a journal called Mind in nineteen fifty called "Computing Machinery and Intelligence," and he sort of laid out his thought experiment there. It's interesting because what he did was he took this idea of a party game and then adapted it for computers. Right. The party game that it's based on would have three participants, an interrogator, a man, and a woman, all situated so that they can't see one another, and the interrogator is supposed to ask questions to try to figure out which of the participants is the dude and which is the lady. Exactly.
Speaker 1: And it's the dude's job to try and mislead the interrogator into believing that he, in fact, is the lady and the other one is the man. It's the lady's job to say, hey, I want you to get this right: I'm the lady, that other one is the dude. And so the interrogator has to ask questions. Now, obviously they can't see each other, because if they could see each other, that would probably give things away. Probably. They really shouldn't be able to hear each other either, because that could also give things away. Sure, and if you use handwritten notes, then the interrogator might be able to make judgments based on the handwriting style, so it should really be typewritten. Right, so you want to remove as many easily identifiable traits from this game as possible, to make it all about the questions and the answers. Now, Turing said, what if we were to take the same basic premise, but instead of having two human interviewees, replace one of those humans with a machine? Now, if that machine can convince the interrogator that the machine itself is a human being, it would be a pretty phenomenal achievement. Or, you know, just generally, if the interrogator wasn't sure which of the two interviewees was which, right, which one is the human and which one is the machine. Or there could even be a case where you don't know; it may be that you have two humans that you're interrogating, and if you don't know for a fact that one of them is a machine, that makes it even harder, right, at least assuming that the computer program is sophisticated enough. Now, Turing was saying that we don't have any machines right now that can do this, but I envision a time when computers will be able to do such a thing, where if you were to interrogate a computer, you would get back responses that would be convincing enough to make it difficult to determine if it were man or machine.
Speaker 1: And so he predicted that in fifty years' time, which would be the year two thousand, computers and software would be sophisticated enough that interrogators would only be able to guess correctly seventy percent of the time, meaning that they would be fooled by the computer thirty percent of the time. Right. So this was just kind of a thing he was coming up with, like an idea, a prediction, not necessarily a test. Although, very much like Moore's observation became Moore's law, this became what is known as the Turing test. People talk about a Turing test as a machine capable of fooling people into thinking it's another person at least thirty percent of the time after five minutes of conversation. Very good point. Yes, it needs to be five minutes of conversation. If you are just getting maybe two or three responses, that might not be enough for you to be able to draw a conclusion that you feel good about. If after five minutes you still are not entirely certain, then that might say that this machine has, in fact, passed the Turing test. And this has been extrapolated to mean something about machine intelligence, because Turing himself tied the idea of how we perceive a machine's intelligence directly to artificial intelligence, and even to how we perceive human intelligence. Because here's where Turing's idea gets a little cheeky, and I love the fact that it's so cheeky. So Turing kind of said, would you say such a machine is intelligent? If it appears to be intelligent, is it fair to say that it is intelligent? And Turing said, well, why not? Because I'm only able to tell that I am intelligent because of my own experience. I can only have my own personal experience; I can't experience what someone else's life is like. Right, sitting across from each other, Jonathan and I can only assume that the other one is intelligent, right. And it's because the other person is displaying traits that we associate with intelligence.
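To make that thirty-percent, five-minute criterion concrete, here is a minimal sketch in Python — not anything from the episode or from Turing's paper — of the pass/fail check as the hosts describe it. The function name and the judge counts are purely illustrative.

```python
# Minimal sketch of the Turing test threshold described above
# (illustrative names and numbers, not from the episode or Turing's paper).

def passes_turing_threshold(times_judged_human: int, total_conversations: int,
                            threshold: float = 0.30) -> bool:
    """True if the machine was mistaken for a human often enough to 'pass'."""
    if total_conversations == 0:
        return False
    return times_judged_human / total_conversations >= threshold

# Turing's prediction restated: judges guess right 70% of the time,
# so they are fooled the other 30% -- right at the boundary.
print(passes_turing_threshold(30, 100))   # True

# A machine that fools 10 of 30 judges (about 33%) also clears the bar.
print(passes_turing_threshold(10, 30))    # True
```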
Speaker 1: They seem to be able to take in information, respond to it, make decisions. And based on the fact that we ourselves also do those things, we go ahead and say, all right, well, they clearly have the same features that I have, which include intelligence. Now, he says, why would we not extend that same courtesy to a machine if it also appeared to display those same features? He says it doesn't matter if the computer is quote unquote thinking, if it can fool you. Yeah, if it's able to simulate it well enough, you might as well say it's intelligent, because "simulate" is probably a kinder phrase for that than "fool." Yes. Yeah, well, it is fooling, essentially, I mean, because ultimately you're talking about a computer programmer who's making this happen. So nowadays we think of this as the Turing test: can a machine, thirty percent of the time or more, fool someone after five minutes of conversation into thinking it's a human? Now, this is really hard to do. This is non-trivial. I mean, it sounds almost simple. Yeah, the concept is simple; the execution is incredibly difficult. Because here's the thing: human language is varied. We have unstructured, unpredictable ways to say things. Like, if I were to tell you that it's a hundred degrees outside and it is humid, like the humidity is way up there, and you go out there, then I'm sure all of our listeners would have slightly different ways to express their thoughts on the conditions outside. Some of them would probably contain colorful metaphors. Mine very likely would contain colorful metaphors, particularly if I had to be outside for any length of time. But that's the point. We would all have different ways of saying this. So how do you make a computer program able to interpret all the myriad ways we can all express the same thought, let alone any thought? All right.
Speaker 1: This is what's referred to as natural language recognition, and it's a really huge problem in artificial intelligence and a lot of other speech-related computer programming. Exactly. Yeah, this is where a computer program has to be able to parse the language so it recognizes things like: this word is a noun, this word is a verb, this word alters this other word. And not only does it need to be able to recognize it, it needs to be able to respond in kind, right, to respond appropriately in some way or another. Sure. Yeah, you could have a computer program that's literally making up sentences, or, you know, the approximation of a sentence, randomly, where it's just pulling strings of words and placing them in a sequence and then presenting them. But that wouldn't be convincing at all. If I were to say, "Hello, how are you today?" and Lauren were to say "blue panther pickup jump down street," I'd be like, what? And even that was closer to being a sentence than some of the random stuff you would see if it were just truly, completely random. So it also has to be able to endure a five-minute-long conversation, like we said, in order to pass the Turing test. So you can't have too much repetition, or that gives it away. Absolutely. If you've ever been playing a video game and all of the NPCs say the same thing over and over and over again... Yeah, if all the chatbot says is "Hey, listen," then you're clearly playing Zelda and you're not actually having a decent conversation. That being said, there is someone I know who plays a fairy at the Georgia Renaissance Festival, and "Hey, listen" is heavily represented in her repertoire. It's pretty amazing. No, it's pretty awesome. But at any rate, yeah, so these are big problems. You have to build a database of words, and you have to be able to figure out what kind of syntax you are going for.
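As a rough illustration of that "this word is a noun, this word is a verb" step, here is a toy sketch, not anything the hosts built or cite, that tags each word of a message against a tiny hand-made lexicon. Real natural language systems use far larger dictionaries and statistical models; every name and entry here is invented for the example.

```python
# Toy part-of-speech tagging sketch (illustrative only; the lexicon is made up).
TOY_LEXICON = {
    "hello": "INTERJECTION", "how": "ADVERB", "are": "VERB", "you": "PRONOUN",
    "today": "ADVERB", "i": "PRONOUN", "am": "VERB", "fine": "ADJECTIVE",
    "blue": "ADJECTIVE", "panther": "NOUN", "pickup": "NOUN", "jump": "VERB",
    "down": "PREPOSITION", "street": "NOUN",
}

def tag_words(message: str) -> list[tuple[str, str]]:
    """Pair every word with a part of speech, or 'UNKNOWN' if it's not listed."""
    words = message.lower().replace("?", "").replace(",", "").split()
    return [(word, TOY_LEXICON.get(word, "UNKNOWN")) for word in words]

print(tag_words("Hello, how are you today?"))
# [('hello', 'INTERJECTION'), ('how', 'ADVERB'), ('are', 'VERB'),
#  ('you', 'PRONOUN'), ('today', 'ADVERB')]

print(tag_words("blue panther pickup jump down street"))
# Each word tags fine in isolation, which is exactly why a chatbot also needs
# syntax and response rules on top of a word list.
```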
Speaker 1: It's a wide-open, huge problem, and solving this problem can be really beneficial in lots of ways. We'll talk a little bit about that later. It's beyond just making a program that seems human, right. That's one way of looking at it, but there are a lot of other benefits that come along with it, which we'll chat about towards the end of the show. But for right now, let's go into the story behind Eugene Goostman and whether or not it was actually the first chatbot to pass the Turing test. Yeah. So, first of all, we've got three programmers in this story: Vladimir Veselov, Eugene Demchenko, and Sergey Ulasen. As you may guess from their names, they all hail from Russia and Ukraine, although not all of them still live there. But starting in two thousand one, they got to work on this, right. Yeah, they were trying to design a computer program that would pose specifically as a thirteen-year-old boy from Odessa in Ukraine. Yeah, and that meant that they had specific parameters that they could work within. It automatically helped reduce some of that unpredictability and that lack of restriction that you would have if you were to just say, this is a fluent adult speaker of a given language. Yeah. Giving him these traits, you know, setting up the expectation for the judges that this is a non-native English speaker, that it's a kid, essentially, means that they'll expect him to have limited knowledge of the world and of different subject areas, and a limited understanding of English vocabulary and grammar and all that kind of stuff. So you're already managing expectations. That's going to come into play when we talk about some of the criticisms of this. Although I do think it's a very clever approach, and a lot of previous chatbots have had similar kinds of approaches.
Speaker 1: Yeah, because like we said, if you were to take a quote unquote pure approach to this, it's really, really challenging. So yeah, by limiting this, the judges have an idea of, well, this could be a thirteen-year-old boy or it could be a computer program. It means that the computer program doesn't have to be as sophisticated as one that would be completely fluent and have, you know, an adult's experiences and ability to communicate. So that was step one. And Eugene Goostman took part in the competition, which had five total chatbots; it was one of five. It took place on the sixtieth anniversary of Turing's death, and the program managed to fool thirty-three percent of the judges into thinking it was actually a person. And there were thirty judges, from what I understand. So that was where you got all the headlines of "chatbot beats Turing test" or "computer beats Turing test," which already is not accurate. There were some slight hiccups in a little bit of the news reporting; we'll get into that part of the story later. But you know, the key takeaways, I think, are that this was a competition that was celebrating Turing, awesome, and that there were thirty judges, some of whom were celebrities, yes, including an actor who had appeared on Red Dwarf. Right. Yes. And then, a couple of years before that, because you said they started working on this in two thousand one, this was not the first time that Goostman had entered competition. Two years previously, that same software had convinced twenty-nine percent of judges at a similar competition that was held at Bletchley Park. Now, Bletchley Park, that's where Turing helped crack the Enigma machine, the encoding device that the German military was using during World War Two. So this was a big celebration; it was the centennial celebrating his birth, and the program ended up falling just short of passing the Turing test.
Speaker 1: The organizer of the more recent event, the one in which Goostman ran away with quote unquote beating the Turing test, that organizer was Kevin Warwick. That name may sound familiar to some of our listeners if you've ever heard us talk about cyborgs. He's the guy who had an RFID chip surgically implanted into him. His wife also did at one point. Yes, they could communicate with each other through them. Some unflattering news media sometimes refers to him as Captain Cyborg. Yeah, there are some critics who say that he courts publicity in a manner that is unbecoming of a scientist. Yes, really of anybody. Yes. Those are the critics who say that, by the way; I just want to make that clear. At any rate, he said that this was the first time a chatbot had passed the Turing test at an event where the conversation was open-ended, meaning that they had not previously decided upon a specific topic or line of questioning; the judges were allowed to say whatever they wanted to the chatbot and get a response, which obviously makes it harder, because you have to have a much wider breadth of potential responses. Yeah. Because again, if you were to say, all right, this chatbot is just going to talk about, I don't know, sporting events from last year, well, then you can prepare pretty well for that. Yeah, exactly. So it's one of those things where the unrestricted nature adds a degree of difficulty. It's time for us to take a quick break, but we'll be right back. So why would you need to make the qualification that this is an open-ended approach and this is the first chatbot to manage it? That would be because, despite what you may have heard, Eugene Goostman was not the first program to beat the Turing test, not by a long shot. So, work on these sorts of chatbots, these kinds of artificial conversationalists...
Speaker 1: That's real recent, right? I mean, they just started doing that, like, what, maybe three or four years ago? Or two thousand one at the earliest, I mean, that's when they started with Goostman. Yeah. No, in the nineteen sixties and seventies. Say what? Back in the mid nineteen sixties there was Eliza, which was written by Joseph Weizenbaum, if you're pronouncing it in the correct German. That's correct. Excellent, I'm finally learning. I'm sure he pronounced it "Wisenbaum," but yeah, it would be "Weizenbaum." At any rate, this was a program that would respond to human conversation in what ideally would be a relevant way. Yes, it was obviously an early attempt. It was not meant to be a program that takes on the Turing test. It was really, again, kind of like a thought experiment, the idea of, what does it take to create a piece of software that can react to questions and make it make sense? At that point, it was more of a "can we do this?" than a "let's do this for real." Yeah, and these are the sort of foundations that you have to lay in order for other things, like the Eugene Goostman program, to be successful. So he created a language analyzer. Now, this specifically would look at words that users put in and then compare them against a database of words that were stored in the computer's memory. And he also created scripts. Now, in this case, the scripts were sets of rules; they're kind of like, you know, a protocol or an algorithm, in a way. These rules dictated how Eliza would respond to messages, in order to cut down on that huge, massive number of variables we were talking about, the whole unrestricted, unpredictable thing. And so they would have different, kind of like, overlays. Think of them as overlays that would guide Eliza's responses. And the most well-known one was called Doctor, which put Eliza in the role of a Rogerian psychiatrist.
Speaker 1: This is the person who responds to everything with a question. All right, it's that passive interview style where, you know, it repeats things back. You know, if I go, "Oh man, I'm really sad about my cat," it goes, "What is it about your cat that makes you sad?" And that's one of those things where, as a conversation starts to wind down, it then has another line of questions: "So tell me more about your mother." Like the whole "tell me about your mother" thing, that's really coming back to this kind of model of psychiatrist. But yeah, if you've ever heard the joke about them responding to anything with another question, just taking the last thing you said and turning it into a question: "What am I paying you for?" "Why do you think you're paying me for?" That kind of thing. So that's how Eliza operated. And in fact, you can find examples of Eliza transcripts, and even the actual program; there are ports, you could say, where people have essentially created their own version of Eliza just using the original Eliza as a guide. You can find tons of them on the web, and you can attempt to have a conversation with them. It's not terribly compelling, but it's kind of fun. Usually within maybe four or five exchanges, you've already run into something where you're like, well, this can't be a person, or if it's a person, it's the weirdest person I've ever conversed with. But again, it wasn't really an earnest attempt to create something that would pass the quote unquote Turing test, right, which I don't think it was being referred to as at that point yet. Yeah, people knew about Turing's prediction, but it wasn't so much called a Turing test.
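Here is a compact sketch in the spirit of the Doctor script as described: match a keyword pattern in the user's message, reflect the pronouns, and hand back a question. It is an illustration of the technique, not Weizenbaum's original code, and the rules and helper names are invented for the example.

```python
import re

# Eliza/Doctor-style reflection sketch (illustrative rules, not the original).
REFLECTIONS = {"i": "you", "my": "your", "me": "you", "am": "are",
               "your": "my", "you": "I"}

RULES = [
    (re.compile(r"i(?:'m| am) (?:really )?(sad|upset|worried) about (.+)", re.I),
     "What is it about {1} that makes you {0}?"),
    (re.compile(r"my (mother|father|family)", re.I),
     "Tell me more about your {0}."),
    (re.compile(r"(.+)", re.I),                    # fallback: echo as a question
     "Why do you say that {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the echo reads naturally."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def doctor_reply(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*(reflect(group) for group in match.groups()))
    return "Please go on."

print(doctor_reply("I'm really sad about my cat"))
# -> What is it about your cat that makes you sad?
print(doctor_reply("My mother never calls me"))
# -> Tell me more about your mother.
```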
Speaker 1: Another chatbot that premiered in the early nineteen seventies went even further and actually was an attempt to try and pass the Turing test with a very specific approach, kind of like, you know, how Eugene Goostman is a very specific approach to narrow down those parameters. In this case, this one was called Parry, P-A-R-R-Y. Right, and Kenneth Colby created it to emulate a patient who has paranoid schizophrenia. Yeah, someone who has, you know, sort of a persecution complex. They imagine that there are people or other entities that are out to get them. And in this specific case, he really embraced this approach. It reminds me of people who create, like, a Dungeons and Dragons character, but then give their character an entire backstory. Yeah. Yeah, this Parry persona was an entire persona. It was a twenty-eight-year-old man with a job as a post office clerk, who was single, had no brothers and sisters, and rarely saw his parents. He had specific hobbies: he liked to go to the movies and to horse racing. Yeah, he liked to bet on the horses. And he'd placed bets with bookies in the past, right, and he realized later that bookies have an association with the criminal underworld, and that therefore the mafia knew about him and were out to get him. Now, all of that might sound very ridiculous to you. If you've never had any kind of interaction with someone who suffers from paranoid schizophrenia, that can seem, well, cartoonish. But no, this kind of thinking is not uncommon, you know, whether it's a criminal organization or the government or some other, even unnamed, entity. Oh, absolutely, it's very much realistic in terms of that kind of diagnosis. And if I was speaking about it with humor in my voice a moment ago, it's only because I am absolutely tickled that the programmers of this program built it all in.
Speaker 1: Yeah, I mean, it's actually pretty entertaining that they went so far as to make this whole backstory, because that's what gives it the believability. And then they ended up testing it in conversation with humans. Parry would gradually start to introduce his quote unquote thoughts about being persecuted, and would respond sensitively to anything said about his appearance, family, or religious beliefs. I've actually seen lots of transcripts of conversations with Parry, and sure enough, it's one of those things where, you know, you might have a few exchanges and then Parry ends up saying something that seems really odd, but not so odd as to seem artificial. It just seems like a non sequitur, you know, something like, "Well, that's what they want you to think." And if you were to say, "Who are they?" "The mafia. Who did you think I was talking about?" "Like, the mafia is after you?" "Of course they're after me. They know who I am." That kind of stuff. And it's disturbing, like, you know, when you know what it is, it's kind of amusing, but if you were in the middle of a conversation you'd be going, oh, this poor person. Yeah. And so in order to test it, Colby did a couple of different things. He did one test where eight psychiatrists interviewed both Parry and a human patient via teletypewriter, so in both cases the psychiatrists could not see who they were interviewing. This is going back to the original Turing test idea, or at least Turing's proposed experiment. And in this case, only two of the eight were able to identify which of the interviewees was human and which was a machine. In a second test, Colby presented a group of a hundred psychiatrists — almost said psychics there — with transcripts of interviews. Yeah, he had pairs of them, exactly. I foresee... no.
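As a rough sketch of the behavior just described — a scripted persona with touchy topics and a paranoia that gradually surfaces — here is a toy, hypothetical bot. It is not Colby's actual Parry program; the keyword lists, the "suspicion" counter, and the canned lines are all made up for illustration.

```python
# Toy Parry-like persona sketch (invented rules, not Colby's original program).
SENSITIVE_TOPICS = {
    "family":  "I'd rather not discuss my family, if it's all the same to you.",
    "parents": "I'd rather not discuss my family, if it's all the same to you.",
    "religion": "Why are you asking about my beliefs? Who wants to know?",
    "look":    "What does how I look have to do with anything?",
}

BACKSTORY = {
    "job":   "I work as a clerk at the post office.",
    "hobby": "I like going to the movies, and I follow the horses.",
    "horse": "I used to bet on the horses. Placed bets with bookies, even.",
}

class ParryLike:
    def __init__(self) -> None:
        self.suspicion = 0  # rises as the interview goes on

    def reply(self, message: str) -> str:
        text = message.lower()
        self.suspicion += 1
        for topic, response in SENSITIVE_TOPICS.items():
            if topic in text:
                self.suspicion += 2        # touchy subjects raise suspicion faster
                return response
        for topic, response in BACKSTORY.items():
            if topic in text:
                return response
        if self.suspicion >= 4:            # eventually the persecution theme surfaces
            return "The bookies talk to the mafia, you know. They know who I am."
        return "Maybe. That's what they want you to think."

bot = ParryLike()
print(bot.reply("Do you have a job?"))              # backstory answer
print(bot.reply("Tell me about your family."))      # defensive answer, suspicion rises
print(bot.reply("Why would anyone be after you?"))  # suspicion high: persecution theme
```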
Speaker 1: But they gave these psychiatrists transcripts of interviews between an interviewer and Parry, and between an interviewer and a human patient, and forty out of the hundred responded. I don't know if the other sixty just never got it or simply didn't respond; response rates are variable, right. So of the forty who responded, nineteen of them guessed incorrectly. So that's almost fifty percent, you know, right up there, a pretty impressive amount. Now, again, we have to look at the fact that Parry is operating under a very restricted set of rules. We're talking about a paranoid schizophrenic, someone who we would assume would occasionally give non-normative answers — exactly — to conversational prompts. And again, it's a limited time that you're having with this person, or this entity in this case, this program. And because the psychiatrists had a specific expectation of the type of interactions they were going to see, that could have affected their answers, right. So ideally, in the perfect situation, you would have this interview happening where you have no expectation as to what the answers should be. In other words, you don't know ahead of time that the interviewee has any kind of restrictions upon them, so you would be interviewing them like any average person. But that's obviously not what we're talking about here, nor was it the case for Eugene Goostman. So I've also seen, by the way, transcripts of people who set up Eliza and Parry to talk to each other.
Speaker 1: So you have Eliza acting as the Rogerian psychiatrist and Parry the paranoid schizophrenic, having bizarre conversations, and they usually don't last very long because Parry gets upset. And obviously, by "gets upset," I just mean that Parry ends up essentially shutting down the conversation, because Eliza just wants to ask questions and Parry gets suspicious of people who are asking questions. And by "gets suspicious," again, I mean it's following specific rules that make it feel like this computer program is getting suspicious. But they are entertaining. If you ever do a search online, just look for Eliza-Parry transcripts. There are a few of them, and they're all pretty entertaining. But so, since that time, obviously, lots and lots of chatbots have been created, for multiple reasons. Oh yeah. Well, you know, some of them are trying to take on the Turing test, and others are trying to fool you into giving out your credit card information or clicking on a link that leads to malicious software. Yeah, anyone who's been on any kind of chat program, specifically like AIM, has probably encountered this at least once or twice, where they're getting an unsolicited message from someone. Or something. Or something, yeah. And after you type a couple of times, you realize, oh, this is not actually a person; this is an attempt to either get information from me or have me click on a link. Yeah, that's a thing. But there are some examples of, I guess saying "legitimate" is weird, but there are some examples of more scholarly attempts. Yeah, like PC Therapist. That one was by Joseph Weintraub, and it fooled fifty percent of its judges into thinking that it was human. Out of ten judges, of course, so, if I'm doing my math correctly, that's five. I believe you are correct. Unless we're talking about quantum judges, in which case these judges were both right and wrong at the same time. And it was a whimsical program.

Speaker 1: I guess you could say, yeah. I think it's fair to say its answers could be pretty smart-ass. Also, I read some of these transcripts, and it actually surprised me that enough judges thought that it was a person. Maybe they thought it was a person who was purposefully attempting to fool them into thinking it was a computer, huh. You know, because it would say things like "I compute, therefore I am," that kind of stuff, where it was being deliberately cute. Yeah. So you're looking at this stuff and you're thinking, all right, well, maybe this is kind of going back to that original Turing test party game, where the effort is for the person who's being interviewed to try to throw everybody off, and that's perfectly within the rules, unless you state upfront, no, just be honest in your answers. There's nothing in here, by the way, that says that the interviewee has to tell the truth necessarily, unless you just state that as a parameter at the beginning. So in other words, you could be like, I'm totally the computer and you're the human being interviewed. I don't know if that's a fair way of saying that the device won or lost, but it is a possibility. Then we have two thousand eleven. Now, this one is a really pretty impressive one. And this was Cleverbot, which was made by a fellow named Rollo Carpenter, and it fooled fifty-nine point three percent of a live audience at an event in India with more than a thousand people. Yeah, the way this worked was that the audience watched as interviewers interacted via text with either Cleverbot or a human over the course of a four-minute interview. So it's a little shorter than what Turing had said, but not by a whole lot. Four minutes is still a good amount of time. Sure, though that is twenty percent less. That's true, that's true. So keep that in mind.
Speaker 1: But at any rate, it was, you know, a pretty interesting experiment. And also, from what I read, they sometimes misidentified the human; they thought that the human was a computer sixty percent of the time, because they didn't necessarily know ahead of time whether they were watching a computer or a human. So now we see that there are a few examples of chatbots quote unquote passing the Turing test. So what does that mean? Does it mean that the machines are actually thinking? No. I mean, that's not to say that computers don't have a certain amount of machine intelligence, but there's absolutely a distinction between that and what we consider to be human intelligence. That's true. The programmers themselves have said that this doesn't mean a machine is able to think. They're just able to interpret commands and then follow a set of rules to make a response, which is still pretty cool. And it certainly doesn't mean that the Turing test is worthless as an exercise. No. In fact, it's improving our ability to create programs that can understand, or at least respond to, natural language. Natural language recognition is one of those big things where, if you're really able to crack it, some amazing opportunities open up, and we've seen this recently with stuff like Siri. Oh, absolutely. Being able to speak to your computer rather than having to... I mean, even if you could only speak to your computer through a keyboard and have it understand what you're saying. I mean, it's the reason why Google spends so much time and money on its search algorithms, trying to figure out what you really mean when you search for a certain phrase. Because traditionally, you know, before we really got into the natural language recognition era, it meant that in order to work with a computer, you had to work with a computer on the computer's terms.
Speaker 1: You had to learn the commands, you had to learn the way to navigate a computer system in order for it to do what you wanted it to do. Once you get to a point where natural language recognition software is robust enough, the computer is working on your terms. You can put in however you're thinking, like whatever mental exercise you've gone through to ask this computer to do something, whatever you do to express a thought as a command to this computer, the computer can then interpret it and respond. And it's not just for serving you back whatever information you happen to be looking for. I mean, we're talking about being able to just look at a computer and say, you know, I really want a graph that looks blue and has these percentages in it and is about this thing, and it just does it. Yeah, like, I want to see what the population distribution of Atlanta is in a bar chart or something, and then it goes out, finds that information, puts it into a bar chart, and, yeah, that's pretty phenomenal stuff. We have a little bit more to say about passing the Turing test, but before we get to that, let's take another quick break. We see other examples of machine intelligence everywhere, things like pattern recognition, probabilistic predictions. For example, Pandora, you know, the Music Genome Project. Yeah, that's pattern recognition. Yeah, it's looking for elements of songs that you say you like, and then looking for other stuff that's not in the specific category or the specific examples you mentioned, but there's something else you probably will like, because you like these other things that also have this stuff in them. You know, sometimes that's less functional than other times. It makes me think of how Patton Oswalt has a great routine about TiVo, which does the same sort of thing, where he says, you know, TiVo is great.
I mean, I like, 571 00:33:12,920 --> 00:33:15,760 Speaker 1: I love Westerns, and I'll have it set to TiVo 572 00:33:15,840 --> 00:33:17,600 Speaker 1: a Western for me, and I come back, and then 573 00:33:17,640 --> 00:33:20,120 Speaker 1: there'll be all these other Westerns suggested that 574 00:33:20,160 --> 00:33:22,960 Speaker 1: I didn't even know about. Thank you, TiVo. But then 575 00:33:23,040 --> 00:33:25,560 Speaker 1: sometimes TiVo gets it wrong and I come back and 576 00:33:25,600 --> 00:33:29,600 Speaker 1: everything has got horses in it, because Westerns have horses 577 00:33:29,600 --> 00:33:31,960 Speaker 1: in them. So I've got My Little Pony and cartoons 578 00:33:31,960 --> 00:33:34,760 Speaker 1: with horses and unicorns and things, and I have to say, no, TiVo, 579 00:33:34,960 --> 00:33:38,040 Speaker 1: that's a bad TiVo. But TiVo says, but you said 580 00:33:38,080 --> 00:33:41,480 Speaker 1: you liked horses. Same sort of thing. Like, when you 581 00:33:41,560 --> 00:33:45,920 Speaker 1: get more sophisticated, the computer program starts to 582 00:33:45,960 --> 00:33:50,200 Speaker 1: anticipate things and makes these probabilistic models, these models 583 00:33:50,200 --> 00:33:56,000 Speaker 1: where there are certain percentages associated with various responses, 584 00:33:56,080 --> 00:33:58,640 Speaker 1: and it goes with whichever one seems to be the 585 00:33:58,640 --> 00:34:01,720 Speaker 1: most prevalent, assuming it meets a threshold. If this sounds 586 00:34:01,720 --> 00:34:06,239 Speaker 1: familiar to you, it's because that's how IBM's Watson worked, right, 587 00:34:06,320 --> 00:34:10,880 Speaker 1: which is a really good example of natural language recognition. Absolutely, 588 00:34:10,920 --> 00:34:13,840 Speaker 1: because not only was it able to recognize natural language, 589 00:34:14,160 --> 00:34:18,440 Speaker 1: it had to interpret things like the wordplay that Jeopardy uses. 590 00:34:18,600 --> 00:34:21,080 Speaker 1: This is the machine that went up on Jeopardy and 591 00:34:21,120 --> 00:34:26,239 Speaker 1: beat the returning champions or former champions. And you know, 592 00:34:26,280 --> 00:34:29,520 Speaker 1: if you've ever played Jeopardy or watched Jeopardy, you know 593 00:34:29,600 --> 00:34:32,960 Speaker 1: that there are categories that depend on things like puns 594 00:34:33,160 --> 00:34:37,200 Speaker 1: or homonyms or other forms of wordplay. So it 595 00:34:37,239 --> 00:34:39,879 Speaker 1: has to parse all of that, and that's even more 596 00:34:39,920 --> 00:34:42,719 Speaker 1: complicated than just taking a simple sentence and figuring out, 597 00:34:43,080 --> 00:34:46,440 Speaker 1: all right, what are the potential responses to whatever this 598 00:34:46,440 --> 00:34:50,640 Speaker 1: phrase is. So, great, great example. You know, 599 00:34:50,680 --> 00:34:53,280 Speaker 1: it would end up coming up with a potential answer. 600 00:34:53,760 --> 00:34:57,160 Speaker 1: It would assign a percentage of how quote unquote sure 601 00:34:57,320 --> 00:34:59,680 Speaker 1: it was that that was the right answer, and if 602 00:34:59,680 --> 00:35:02,359 Speaker 1: the percentage was higher than its threshold, which I think 603 00:35:02,400 --> 00:35:05,919 Speaker 1: was something like that, it would buzz in and give 604 00:35:05,920 --> 00:35:09,239 Speaker 1: that as a guess.
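To make that buzz-in logic a little more concrete, here is a minimal sketch of the idea being described: score a set of candidate answers, take the best one, and only answer at all if its confidence clears a threshold. The candidate answers, the scores, and the 0.5 threshold below are all invented for illustration; this is not IBM's actual Watson code, just the general technique.

```python
# Minimal sketch of confidence-threshold answer selection.
# The candidates, scores, and threshold are made up for illustration;
# a real system would compute confidences from many evidence sources.

def best_guess(candidates, threshold=0.5):
    """Return the highest-confidence answer if it clears the threshold,
    otherwise return None (i.e., stay quiet and don't buzz in)."""
    if not candidates:
        return None
    answer, confidence = max(candidates.items(), key=lambda item: item[1])
    return answer if confidence >= threshold else None

# Hypothetical candidate answers for one clue, each with a confidence score.
candidates = {
    "Who is Alan Turing?": 0.82,
    "Who is Charles Babbage?": 0.11,
    "Who is John von Neumann?": 0.07,
}

guess = best_guess(candidates)
if guess is not None:
    print("Buzz in with:", guess)   # Buzz in with: Who is Alan Turing?
else:
    print("Not confident enough to buzz in.")
```

The same shape of logic, pick the top-scoring option but only act when the score is high enough, is what the hosts describe both for the recommendation examples and for Watson's buzzer.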
Sometimes it was wrong, but it 605 00:35:09,280 --> 00:35:12,120 Speaker 1: was right a lot of the time, so that's kind 606 00:35:12,120 --> 00:35:17,799 Speaker 1: of cool. So, getting back to Eugene Gene, the 607 00:35:17,800 --> 00:35:19,759 Speaker 1: main machine, as I called him in my notes 608 00:35:19,800 --> 00:35:23,560 Speaker 1: at one point. And of course, I'm anthropomorphizing when 609 00:35:23,560 --> 00:35:25,640 Speaker 1: I say him; it's an it. Well, 610 00:35:25,880 --> 00:35:27,879 Speaker 1: it has a dude name. Yeah, it has a dude 611 00:35:27,960 --> 00:35:31,520 Speaker 1: name and a dude persona. But it's ultimately an it, isn't it? 612 00:35:33,440 --> 00:35:37,440 Speaker 1: Would you say that perhaps some of the reporting around 613 00:35:37,440 --> 00:35:40,960 Speaker 1: this was maybe a little misleading, or at least 614 00:35:41,680 --> 00:35:47,040 Speaker 1: hyped? Well, you know, okay, the entire Eugene Goostman 615 00:35:47,239 --> 00:35:51,080 Speaker 1: chatbot sounds really cool. I haven't met it personally. No, 616 00:35:51,360 --> 00:35:54,440 Speaker 1: I haven't either, although you can. There is an Internet version, 617 00:35:54,560 --> 00:35:56,399 Speaker 1: and I'm not sure that it's the same version that's 618 00:35:56,400 --> 00:35:59,160 Speaker 1: being used in competition, because I've seen some transcripts from 619 00:35:59,160 --> 00:36:02,480 Speaker 1: the Internet version and they don't seem good at all. 620 00:36:02,800 --> 00:36:05,160 Speaker 1: They seem bad, right. I guess you'd have to talk 621 00:36:05,200 --> 00:36:09,080 Speaker 1: to some actual thirteen-year-old boys. Actually, yeah, 622 00:36:09,120 --> 00:36:12,760 Speaker 1: that's part of it. You know, there's certainly 623 00:36:12,800 --> 00:36:18,520 Speaker 1: been some questions among natural language AI enthusiasts 624 00:36:18,520 --> 00:36:22,400 Speaker 1: online about whether we're really just lowering our expectations for 625 00:36:22,520 --> 00:36:26,040 Speaker 1: human communication. Which, yeah, that's a totally different way 626 00:36:26,040 --> 00:36:29,480 Speaker 1: of looking at it, and a depressing one, to say that, oh, well, 627 00:36:29,520 --> 00:36:32,040 Speaker 1: if you come from this place and if you are 628 00:36:32,160 --> 00:36:34,719 Speaker 1: of this age, then I expect you to only be 629 00:36:34,800 --> 00:36:40,280 Speaker 1: able to communicate at this level, right, which is depressing. 630 00:36:40,800 --> 00:36:42,680 Speaker 1: It certainly is. Well, but you know, it's a 631 00:36:42,760 --> 00:36:45,000 Speaker 1: valid point, I think. I think it's a good thing 632 00:36:45,000 --> 00:36:48,319 Speaker 1: to be thinking about in this kind of situation. 633 00:36:48,560 --> 00:36:51,359 Speaker 1: You know, beyond that, though, and I certainly don't want 634 00:36:51,400 --> 00:36:57,279 Speaker 1: to downplay the apparent achievements of its programmers, because I 635 00:36:57,320 --> 00:37:02,239 Speaker 1: haven't programmed any capable chatbots lately, it's been forever 636 00:37:02,480 --> 00:37:05,560 Speaker 1: since I did. But there are a few things that 637 00:37:05,600 --> 00:37:08,520 Speaker 1: are just a little bit shady about the news.
638 00:37:08,960 --> 00:37:12,520 Speaker 1: First off, the original press release, which came out of 639 00:37:12,560 --> 00:37:16,400 Speaker 1: the University of Reading, I believe, stated that 640 00:37:16,560 --> 00:37:22,600 Speaker 1: a quote supercomputer had achieved this feat. Perhaps, charitably, 641 00:37:22,719 --> 00:37:25,680 Speaker 1: it was a mistake or misunderstanding on the part of 642 00:37:25,920 --> 00:37:28,440 Speaker 1: the writer of the press release, but some skeptics have 643 00:37:28,600 --> 00:37:32,640 Speaker 1: suggested that it was in fact a purposeful publicity play, 644 00:37:32,680 --> 00:37:34,719 Speaker 1: one that in fact worked, because a whole lot of news 645 00:37:34,800 --> 00:37:39,680 Speaker 1: headlines around the interwebs repeated the error very excitedly. Yes, 646 00:37:39,680 --> 00:37:42,000 Speaker 1: because it was not a supercomputer. It was a computer 647 00:37:42,160 --> 00:37:44,440 Speaker 1: running a piece of software. It's a program. Yes, the 648 00:37:44,440 --> 00:37:46,640 Speaker 1: program that did the work. I mean, the computer just 649 00:37:46,760 --> 00:37:50,040 Speaker 1: provided the horsepower. Right, it's the software that did all 650 00:37:50,040 --> 00:37:52,200 Speaker 1: the actual work, and it wasn't on 651 00:37:52,239 --> 00:37:58,719 Speaker 1: a supercomputer. Little-known fact: supercomputers have better things 652 00:37:58,760 --> 00:38:03,560 Speaker 1: to do than run chatbot software, generally. Yeah. Yeah, 653 00:38:03,600 --> 00:38:07,440 Speaker 1: we're talking about things like figuring out global weather and climate 654 00:38:07,840 --> 00:38:10,239 Speaker 1: change patterns and things like that, you know, or 655 00:38:10,280 --> 00:38:14,839 Speaker 1: the way that money works. Chatbots are low on their 656 00:38:14,840 --> 00:38:19,399 Speaker 1: priority list. It's like number seven, at least. And 657 00:38:19,400 --> 00:38:23,120 Speaker 1: also, this press release in question was largely 658 00:38:23,280 --> 00:38:28,240 Speaker 1: a quotation from Dr. Kevin Warwick. Kevin Warwick, of course, 659 00:38:28,280 --> 00:38:33,600 Speaker 1: being the fellow who organized this entire competition, who 660 00:38:33,640 --> 00:38:36,359 Speaker 1: is an engineer and a futurist, and also the 661 00:38:36,440 --> 00:38:41,040 Speaker 1: instigator and/or enjoyer of a certain amount of hype 662 00:38:41,120 --> 00:38:47,440 Speaker 1: and debate about future technologies. Yeah, he is, obviously, you 663 00:38:47,440 --> 00:38:49,759 Speaker 1: can tell. This is the guy who elected to have 664 00:38:49,800 --> 00:38:53,120 Speaker 1: surgery performed on him so he could have that 665 00:38:53,200 --> 00:38:55,239 Speaker 1: RFID chip and call himself a cyborg. This 666 00:38:55,280 --> 00:39:01,239 Speaker 1: is someone who not only embraces these ideas of futurism, 667 00:39:01,239 --> 00:39:04,799 Speaker 1: but is actively trying to promote them and get us to 668 00:39:04,840 --> 00:39:06,799 Speaker 1: them. And we're not even saying that that's a bad thing. 669 00:39:06,840 --> 00:39:09,879 Speaker 1: What we are saying is that that may give him 670 00:39:09,920 --> 00:39:13,200 Speaker 1: somewhat of a bias when it comes to proclaiming a 671 00:39:13,239 --> 00:39:17,800 Speaker 1: piece of computer software to be an amazing achievement 672 00:39:17,840 --> 00:39:20,760 Speaker 1: that beat the Turing test, right, sure.
I mean, he 673 00:39:21,080 --> 00:39:24,319 Speaker 1: admits basically to being a provocateur. He says 674 00:39:24,360 --> 00:39:26,360 Speaker 1: that that's really his job, you know, it is to 675 00:39:26,440 --> 00:39:31,160 Speaker 1: get people excited about tech and engineering and the future. 676 00:39:31,320 --> 00:39:34,200 Speaker 1: And we get that, like, we agree that's 677 00:39:34,200 --> 00:39:36,560 Speaker 1: our job too. We think it's rad. We think that 678 00:39:36,760 --> 00:39:40,040 Speaker 1: perhaps, and I don't want to put words into your mouth, Lauren, 679 00:39:40,200 --> 00:39:42,759 Speaker 1: I think perhaps that there's a different way of going 680 00:39:42,760 --> 00:39:46,319 Speaker 1: about it where you can still be excited, but you 681 00:39:46,360 --> 00:39:49,080 Speaker 1: can be a little more grounded in the way you 682 00:39:49,160 --> 00:39:51,920 Speaker 1: present things. Because I also think the achievement of 683 00:39:51,920 --> 00:39:56,080 Speaker 1: creating a chatbot that could be convincing is a 684 00:39:56,120 --> 00:39:58,759 Speaker 1: fantastic achievement. I mean, it's something that, even under any 685 00:39:58,840 --> 00:40:01,879 Speaker 1: number of qualifications, is incredibly challenging to do, no matter how 686 00:40:01,960 --> 00:40:05,480 Speaker 1: you frame it. I do think, however, that if 687 00:40:05,520 --> 00:40:09,319 Speaker 1: you seem to overinflate the achievement, you run the 688 00:40:09,400 --> 00:40:12,839 Speaker 1: danger of making people feel jaded about it later, which 689 00:40:12,880 --> 00:40:18,080 Speaker 1: I think is like crying computer wolf. Exactly. Yeah, exactly. So it's one 690 00:40:18,120 --> 00:40:20,879 Speaker 1: of those things where, you know, you have to take 691 00:40:20,880 --> 00:40:25,160 Speaker 1: the context into account, right, and don't downplay the achievement, 692 00:40:25,200 --> 00:40:27,399 Speaker 1: but don't sit there and say, like, aha, now 693 00:40:27,440 --> 00:40:30,480 Speaker 1: we have intelligent computers everywhere. That's not true either. 694 00:40:31,320 --> 00:40:35,480 Speaker 1: I saw there's a great Wired article that specifically went 695 00:40:35,520 --> 00:40:40,120 Speaker 1: into kind of debunking the whole beating the Turing 696 00:40:40,160 --> 00:40:42,960 Speaker 1: test thing and again kind of saying the same thing 697 00:40:42,960 --> 00:40:45,920 Speaker 1: we're saying, like, take the context into account. And as 698 00:40:45,920 --> 00:40:50,000 Speaker 1: part of that article, they ended up asking a cognitive 699 00:40:50,080 --> 00:40:53,959 Speaker 1: scientist named Gary Marcus of NYU about this, 700 00:40:54,120 --> 00:40:57,480 Speaker 1: and Marcus proposed a new version of the Turing test, 701 00:40:57,560 --> 00:41:00,120 Speaker 1: because he says the old version is not really a 702 00:41:00,200 --> 00:41:04,759 Speaker 1: measurement of machine intelligence. It does kind of illustrate 703 00:41:04,920 --> 00:41:08,640 Speaker 1: ways of creating natural language recognition and clever ways to 704 00:41:08,840 --> 00:41:12,520 Speaker 1: fool the human side, huh. And that it was very 705 00:41:12,600 --> 00:41:16,440 Speaker 1: valid historically at the time, because, you know, textual 706 00:41:16,480 --> 00:41:20,080 Speaker 1: communication was new and exciting and it was, you know, 707 00:41:20,160 --> 00:41:22,759 Speaker 1: it pushed the field forward, it really did.
But now 708 00:41:22,800 --> 00:41:25,400 Speaker 1: we've gotten to a point where fooling the person on 709 00:41:25,440 --> 00:41:29,000 Speaker 1: the other side of a keyboard is not necessarily the 710 00:41:29,040 --> 00:41:31,520 Speaker 1: goal that we should be looking at. He proposes that the 711 00:41:31,560 --> 00:41:34,440 Speaker 1: next version of the Turing test should be that a 712 00:41:34,440 --> 00:41:38,280 Speaker 1: piece of computer software, any kind of program that wants 713 00:41:38,280 --> 00:41:40,120 Speaker 1: to beat it, what it has to do is first 714 00:41:40,400 --> 00:41:45,280 Speaker 1: quote unquote watch a movie, television show, YouTube video, something, 715 00:41:45,680 --> 00:41:48,000 Speaker 1: some kind of video media, and then be able to 716 00:41:48,080 --> 00:41:51,239 Speaker 1: respond to questions about it. So sort of like, here, 717 00:41:51,320 --> 00:41:55,000 Speaker 1: let me show you this ten-minute video on car safety, 718 00:41:55,440 --> 00:41:58,719 Speaker 1: and then asking questions specifically about the video, what 719 00:41:58,840 --> 00:42:01,960 Speaker 1: happened after they fastened the seat belt, that kind of thing, 720 00:42:02,239 --> 00:42:04,520 Speaker 1: and if the computer program is able to answer it, 721 00:42:04,960 --> 00:42:08,160 Speaker 1: then that would be a much more convincing Turing test 722 00:42:08,239 --> 00:42:12,600 Speaker 1: than just kind of spewing out a script, which is, 723 00:42:12,840 --> 00:42:16,000 Speaker 1: you know, adding another layer of difficulty on top 724 00:42:16,040 --> 00:42:18,520 Speaker 1: of an already difficult task. But that's the 725 00:42:18,520 --> 00:42:20,600 Speaker 1: only way you can go forward. Otherwise we're just going 726 00:42:20,640 --> 00:42:24,799 Speaker 1: to see increasingly sophisticated chatbots. Yeah, and that 727 00:42:24,960 --> 00:42:28,359 Speaker 1: is actually a very difficult and interesting problem. Wasn't 728 00:42:28,360 --> 00:42:31,000 Speaker 1: it just recently that there was a computer that we 729 00:42:31,000 --> 00:42:33,240 Speaker 1: taught, I mean, not us personally, but humanity, 730 00:42:33,920 --> 00:42:38,440 Speaker 1: some researchers, taught to identify cats, pictures 731 00:42:38,440 --> 00:42:41,359 Speaker 1: of cats. That was the AI program that essentially went 732 00:42:41,400 --> 00:42:44,480 Speaker 1: through thousands and thousands of, I think it was, 733 00:42:44,600 --> 00:42:48,440 Speaker 1: images and videos and then became able to identify cats. 734 00:42:48,440 --> 00:42:51,759 Speaker 1: It essentially defined what a cat was, because no one 735 00:42:51,880 --> 00:42:56,080 Speaker 1: taught it, right. It learned what cats are based upon 736 00:42:56,120 --> 00:42:59,200 Speaker 1: their appearance, and it can look at pictures 737 00:42:59,200 --> 00:43:01,880 Speaker 1: of cats and say, that is totally a cat, essentially 738 00:43:01,960 --> 00:43:04,239 Speaker 1: saying that thing that is in that video is the 739 00:43:04,320 --> 00:43:06,719 Speaker 1: same as this other thing that's in this picture, that's 740 00:43:06,719 --> 00:43:08,400 Speaker 1: the same as this other thing that's in this totally 741 00:43:08,440 --> 00:43:12,919 Speaker 1: different video, which sounds trivial and hilarious, and it kind 742 00:43:13,000 --> 00:43:15,719 Speaker 1: of is hilarious.
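The episode doesn't get into how that cat experiment actually worked (it used large-scale unsupervised learning over video stills), but the final step being described, deciding that the thing in one image is the same kind of thing as the thing in another, can be sketched very roughly as comparing feature vectors for similarity. Everything in the toy example below, the hand-written feature vectors and the 0.9 cutoff, is an assumption for illustration only.

```python
# Toy sketch: decide whether two images show "the same kind of thing"
# by comparing feature vectors. In the real research the features were
# learned automatically from millions of video frames; here they are
# just made-up numbers to show the comparison step.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_kind_of_thing(features_1, features_2, cutoff=0.9):
    """True if the two feature vectors are similar enough."""
    return cosine_similarity(features_1, features_2) >= cutoff

# Hypothetical feature vectors for three images.
cat_photo = [0.9, 0.1, 0.8, 0.2]
cat_frame = [0.85, 0.15, 0.75, 0.25]   # a still from a different video
dog_photo = [0.2, 0.9, 0.1, 0.7]

print(same_kind_of_thing(cat_photo, cat_frame))  # True  -> "that is totally a cat"
print(same_kind_of_thing(cat_photo, dog_photo))  # False
```

In the actual work the features themselves were learned rather than written by hand; the sketch only shows why a similarity comparison lets the same "cat" be recognized across different photos and videos.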
Also easy, because, I mean, come on, 743 00:43:15,800 --> 00:43:18,720 Speaker 1: everything on the internet has cats in it. Yeah, that 744 00:43:18,600 --> 00:43:20,239 Speaker 1: is. That is kind of a gimme, isn't it? But 745 00:43:20,239 --> 00:43:22,399 Speaker 1: still, no, it is. It is cool. I mean, 746 00:43:22,640 --> 00:43:25,399 Speaker 1: because just that level of image recognition, I mean, 747 00:43:25,800 --> 00:43:27,680 Speaker 1: being able to take an object and look at it 748 00:43:27,760 --> 00:43:31,359 Speaker 1: from a different angle than you were taught, or one that's 749 00:43:31,360 --> 00:43:35,080 Speaker 1: a different color, a different size. Yeah, all these things. 750 00:43:35,120 --> 00:43:37,880 Speaker 1: All of these things are easy for us, hard for computers, 751 00:43:37,880 --> 00:43:42,400 Speaker 1: so seeing something make that breakthrough is really exciting. Anyway, 752 00:43:42,600 --> 00:43:44,440 Speaker 1: we thought we would take that story and kind of 753 00:43:44,440 --> 00:43:47,280 Speaker 1: break it down for you guys, explain how it's still 754 00:43:47,320 --> 00:43:49,759 Speaker 1: cool, but maybe not as cool as the way some 755 00:43:49,800 --> 00:43:52,759 Speaker 1: of the headlines are saying. Right, and also say, hey, 756 00:43:52,840 --> 00:43:57,239 Speaker 1: internet journalists, step up your game. Yeah, I understand 757 00:43:57,719 --> 00:43:59,759 Speaker 1: that you want people to read your stuff. Oh yeah, 758 00:43:59,800 --> 00:44:02,440 Speaker 1: and you're under deadline pressure and that's terrible. But 759 00:44:02,520 --> 00:44:07,319 Speaker 1: let's look harder. Let's represent reality, shall we? Don't just 760 00:44:07,360 --> 00:44:09,600 Speaker 1: spit out press releases the way that you 761 00:44:09,680 --> 00:44:12,680 Speaker 1: found them. Yeah, and it would be criminal 762 00:44:12,840 --> 00:44:17,440 Speaker 1: to neglect to mention Noel. Our producer reminded me 763 00:44:18,040 --> 00:44:21,600 Speaker 1: that obviously this is a very important field of study, 764 00:44:21,760 --> 00:44:24,240 Speaker 1: because we want to be able to tell the difference 765 00:44:24,280 --> 00:44:28,640 Speaker 1: between computers and humans when the future of Blade Runner 766 00:44:28,800 --> 00:44:32,600 Speaker 1: becomes our reality and you're chasing down a replicant and 767 00:44:32,640 --> 00:44:34,759 Speaker 1: you have to determine if it's actually a replicant or a 768 00:44:34,840 --> 00:44:38,000 Speaker 1: human being. And that wraps up this classic episode about 769 00:44:38,040 --> 00:44:41,759 Speaker 1: passing the Turing Test. Honestly, the whole Turing Test thing, 770 00:44:41,880 --> 00:44:47,919 Speaker 1: that general concept, has morphed over the decades, and it's 771 00:44:47,960 --> 00:44:51,319 Speaker 1: more like shorthand, right. Passing the Turing test isn't 772 00:44:51,320 --> 00:44:55,560 Speaker 1: so much about passing a specific hypothetical test. It's more 773 00:44:55,680 --> 00:45:00,239 Speaker 1: about this idea of creating a machine that appears 774 00:45:00,280 --> 00:45:03,920 Speaker 1: to be indistinguishable from a human as far as, you know, 775 00:45:04,040 --> 00:45:07,960 Speaker 1: processing and communicating information, so that you feel like the 776 00:45:08,000 --> 00:45:11,640 Speaker 1: machine is truly intelligent.
I think the Turing test is 777 00:45:11,680 --> 00:45:16,439 Speaker 1: one of those concepts that changes over time, and 778 00:45:17,200 --> 00:45:19,279 Speaker 1: it's more like you start to find out what it's 779 00:45:19,360 --> 00:45:22,160 Speaker 1: not rather than what it is. It's an interesting thing, 780 00:45:22,320 --> 00:45:25,719 Speaker 1: very much like artificial intelligence itself, or even machine consciousness. 781 00:45:26,280 --> 00:45:30,360 Speaker 1: These are all concepts that are kind of wibbly wobbly, 782 00:45:30,560 --> 00:45:33,640 Speaker 1: as Doctor Who might say. Well, if you have suggestions 783 00:45:33,640 --> 00:45:36,480 Speaker 1: for topics I should cover in future episodes of tech Stuff, 784 00:45:36,520 --> 00:45:38,440 Speaker 1: reach out to me. The best way to do that 785 00:45:38,600 --> 00:45:41,040 Speaker 1: is over on Twitter. The handle for the show is 786 00:45:41,080 --> 00:45:45,440 Speaker 1: tech Stuff HSW and I'll talk to you again really soon. 787 00:45:52,080 --> 00:45:55,120 Speaker 1: Tech Stuff is an I Heart Radio production. For more 788 00:45:55,200 --> 00:45:58,600 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 789 00:45:58,719 --> 00:46:01,880 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.