Welcome to TechStuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to TechStuff. I am your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. It is time for a TechStuff classic episode, which means it's Friday, so I hope you're all celebrating. This episode originally published back in February 2013. It is called "TechStuff Gets Domestic Robots." Lauren Vogelbaum and I sat down to talk about domestic robots and where we're going with them. This is another one of those episodes that probably merits a follow-up, but I thought it'd be fun to listen to the original one from back in 2013, so enjoy.

This is a topic that a listener of ours wanted to hear more about, so we thought, hey, we haven't really covered domestic robots, so yeah, let's do that crazy thing. Yeah, and when people think of domestic robots, I think the image that immediately pops into most people's minds is Rosie, the robot from The Jetsons. Yeah, the documentary series. The documentary series from the sixties about what the future will be like. This was created in the early sixties and set in 2062, by the way, if you didn't know that. So we've got some cracking to get on. Yeah, I think it's okay, because their vision of the future did not incorporate some pretty amazing technologies that we have at our disposal right now. That's fair, that's fair. But they did have this life-size humanoid robot maid that could, you know, sit around and drink milkshakes with you, and have a conversation, and raise your kids for you. You're too busy, space parents, right? So yeah, anyway, Rosie the Robot kind of was this idea of the robot maid or the robot butler that could take care of all the menial chores that most of us do not like to do.
Like, it just seems like it's one of those things that takes up valuable time that you could spend fragging people on Halo 4, or doing literally anything else other than washing dishes. Again: dishes, folding laundry, cleaning windows. These are chores that take up time that, you know, we just wish someone else would take care of every now and then. And sometimes we hire people for that, which is in fact probably enormously cheaper and easier than building a robot to do it, because a lot of these things are actually really hard to do. But we'll talk about that later.

Yeah, so let's start off by talking about where the word robot comes from. And I'm sure a lot of you out there already know this story, but there was a Czech writer, as in someone who was from Czechoslovakia (that's what it was at the time; now it's the Czech Republic), Karel Čapek, who wrote a play called R.U.R., and R.U.R. stands for Rossum's Universal Robots. This was back in 1921. Yeah, and the word robot comes from the Czech word robota, which means forced labor or servitude, and that word in fact comes from another word, rab, r-a-b. That word means slave. Now, in the original form, that did not necessarily mean an artificial life form of any type. A robot could be a person. It could be someone who has been forced to conform to a very specific set of behaviors and to perform those behaviors for the benefit of some larger entity, and this was kind of the crux of the play. It was exploring whether or not it was okay for machines to be used the way that we use people. And yeah, so eventually the term robot became more about a device, a mechanism that can perform functions in an automated way, and perhaps even in an autonomous way, autonomous and automated being two different things.
Autonomous means that it can do it under its own direction; it doesn't need someone there to press a button for it to be able to do whatever it does. Right, which is why we don't call, for example, our dishwashers, or our toaster ovens, or our washing machines robots, even though technically they are robotic. Yeah, they have features that are common to robots, though they themselves are not robots. Kind of like how I have features that are similar to a human being and yet defy explanation. I'm gonna leave that one right there. All right, that's fair.

So before we get into the whole domestic robot history, the sort of things that have developed over the years in the field of domestic robots, and where we are today and where we expect to go, I thought it'd be interesting to talk a little bit about a survey that was conducted a few years ago by a company called Persuadable Research. This was the last one that I saw; it was from 2012. Yeah, that was the most recent one. This survey asked people: hey, if you had the opportunity to purchase a robotic device that would perform your chores for you, would that be of interest to you? And sixty percent of the people surveyed said, why yes, it would. I would love to have a robot that could do things like clean windows or do laundry. Also, I would like a robot that could move heavy stuff from one place to another so that I don't have to do it, or provide better home security, or personal assistant duties. I think health monitoring was up there too. Yeah. And a lot of them thought that this sort of robot would be a very useful device, and in most cases, I think they were thinking of a single robot capable of doing multiple things: not a whole bunch of different robots that are specialized, but more of a general-purpose robot.
Now, that was interesting enough, but they were also asking more questions, like: what form would you want this robot to be in? And it seemed like most people wanted to have a humanoid robot. So we're talking about robots that have arms and legs and a head, you know, things that we would generally associate with the human form. And they also did not want their robots to necessarily have a gender, so the voice needed to be neither male nor female. Right, so kind of that stereotypical robot voice. Yes, the one that we all have. We all have a robot voice, I think, and we all have an Internet comments voice, a voice that we use when we describe Internet comments. "You guys are awful. Your show was bad and you should feel bad." That's my Internet voice. That's not a very kind Internet voice, Jonathan. I'm not saying that everyone on the Internet leaves comments that sound like that. I'm saying the people who leave those kinds of comments sound like that. That's entirely fair. If they can insult me, then I can insult them. You can, yes. Right, I have a medium eye for an eye. It's a harsh podcast landscape, guys. It's like Donkey Kong. But no, that's my Internet voice. My robot voice is, you know, of course, like: "How can I help you?" That's sort of... I actually do not possess a robot voice, I don't think. What? I'm sorry, I'll work on one. Okay, next time. Next time, we must get Vogelbaum a robot voice. I don't really do voices; this is about it. Okay, well, we'll break you of that habit sooner or later.

So yeah. And also, the people who responded said they would be willing to pay fifteen thousand dollars or more for a comprehensive robotic assistant. That's a lot of money. Oh well, I mean, when you think about it, that's, you know, the mid range for a decent car.
Right. Yeah, and you're talking about something that's running around your house doing everything you do. But if it's a robot that's folding laundry, you're like: how much is the fact that I don't have to fold laundry worth? Fifteen grand? I guess it all depends on how much you hate laundry. Most people said that their limit was lower; anything that was a thousand dollars or more was kind of outside the range. I think that's interesting, because there are robots on the market right now, domestic robots that have very specific uses (like, they don't do things outside of whatever it is they were meant for) that are more expensive than that. There are pool robots, and we'll talk about those in a bit, that are fifteen hundred dollars. So this is perhaps unrealistic of people, to expect that sort of price point. And as you were mentioning before, it would end up being cheaper to hire someone to come and do this work for you, at least over a certain period of time. Maybe you could eventually figure out how many years it would take for a robot to make financial sense compared to hiring a human being. Somebody go do that math for us; I do not feel like doing it at all. Well, also, we don't have all the variables, right? Because you would need to know: what is the average life span for a domestic robot? So without that knowledge, we can't really say. And if you build in maintenance costs and repair costs, then that adds to it, right? You don't necessarily have to repair someone who comes in and cleans your stuff. You just hire someone else, because, you know, you're a heartless person. I mean, I'm just talking about me here, right, my perspective. I'm like, wow, it's really sad that the people I hired to come in and clean my windows broke their legs. Thank goodness there's this other cleaning service I can hire instead. I can just leave those first guys right there on the sidewalk and just keep going. This is turning into a really terrible... well, you should see the piles of people outside my home. You think this is bad? You should really take a look at them.
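To put that do-the-math challenge in concrete terms, here is a toy break-even sketch in Python. Every number in it is a made-up assumption, purely for illustration: the survey's fifteen-thousand-dollar robot, a hypothetical maintenance budget, and a hypothetical cleaning-service rate.

    # Toy break-even estimate: buy the robot vs. keep hiring a service.
    # EVERY number here is a made-up assumption, purely for illustration.
    robot_price = 15_000.00          # the survey's "comprehensive assistant"
    robot_upkeep_per_year = 500.00   # hypothetical maintenance/repair budget
    service_visit = 100.00           # hypothetical cleaning-service rate
    visits_per_year = 26             # a visit every other week

    service_per_year = service_visit * visits_per_year

    for year in range(1, 16):
        robot_total = robot_price + robot_upkeep_per_year * year
        service_total = service_per_year * year
        if robot_total <= service_total:
            print(f"breaks even in year {year}: "
                  f"${robot_total:,.0f} vs. ${service_total:,.0f}")
            break
    else:
        print("never breaks even within 15 years on these assumptions")

On these invented numbers the robot pays for itself around year eight, and only if it actually survives that long, which is exactly the lifespan variable the hosts point out is missing.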
But anyway, the takeaway that I got from this Persuadable Research Corporation survey was that what people want from their domestic robots is for them to be really cute humanoid mobile PCs in about that price range, because a thousand dollars is about what you would pay for a really good PC, for a decent PC. And Bill Gates actually, in 2006, wrote an essay called "A Robot in Every Home" that was basically saying the same thing. If I can quote, he said we may be on the verge of a new era, when the PC will get up off the desktop and allow us to see, hear, touch and manipulate objects in places where we are not physically present. And this was kind of his vision for the future. If you guys have seen Farscape, it's sort of like the DRDs, more so than Rosie the Robot. Yeah, I don't know what a DRD is. A DRD: it stands for Diagnostic Repair Drone. And these were these little bug-shaped, trilobite-looking critters that would run around on Farscape and do minor repair work and play the 1812 Overture, et cetera, et cetera, as you needed them to. Okay, I can see the pressing need for the 1812 Overture when I'm... look, it depends on how crazy you get while you're lost in space. Okay, Lost in Space, now, I can talk about that show, but that had a different robot. Anyway, I'm getting off track.
So it's interesting that you mention that whole trilobite idea, because in fact the very first domestic robot that I came across in my research, as far as the consumer level goes, the first robot that the average person could go out and buy for their home, was a very specific-use robot. It was the Electrolux Trilobite, and it was a robot that looked kind of like a trilobite, and it was a vacuum robot. You've probably seen different vacuum robots, and we will talk about a very famous one in a little bit, but the Electrolux Trilobite was the first robotic vacuum cleaner, and it hit the market in 2001. The original model used ultrasonic sensors to navigate through an environment. So essentially it's shooting out ultrasonic sounds; these are signals that we cannot detect, they're outside the range of human hearing. The signal would emit from the robot, and if it encountered anything, it would bounce back. Like a vacuum bat. Yeah, it was a vacuum bat. It was kind of like what Batman would be if he were a domestic cleaning person as opposed to a crime fighter. He would dress up as a bat and clean. What the robot would do is send out the signal, the signal would reflect back, and that's how it would know how far away it was from some other object. So when you have it set down to clean your living room and you've got all this furniture there, as it would approach the furniture, the signal would go out and bounce back, and the robot would know: all right, I have to stop in three inches or I'm going to run into something, like a couch or whatever. And so the earliest model had these ultrasonic sensors, and that was pretty much it.
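To put numbers on that echo-and-bounce idea, here is a minimal sketch in Python of the time-of-flight arithmetic this kind of sensor relies on. The five-millisecond reading is invented for the example; the real content is the physics: sound travels at roughly 343 meters per second in room-temperature air, and the pulse makes a round trip, so you halve it.

    # Time-of-flight ranging, the way an ultrasonic sensor does it:
    # the pulse travels out AND back, so halve the round trip.
    SPEED_OF_SOUND_M_PER_S = 343.0  # in air at about 20 degrees C

    def distance_from_echo(round_trip_seconds: float) -> float:
        """One-way distance to whatever reflected the pulse."""
        return SPEED_OF_SOUND_M_PER_S * round_trip_seconds / 2.0

    # Hypothetical reading: the echo comes back after 5 milliseconds.
    print(f"{distance_from_echo(0.005):.3f} m")  # -> 0.858 m to the couch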
You could also lay down magnetic strips along the borders of your room so that it would not go across them, which is important if, for instance, you happen to have a staircase and you don't want your extremely expensive Trilobite robot to take a tumble down the stairs. Yes, I imagine that those first ones were not at all inexpensive. I'm sure they were quite expensive, yes. And I think it's still fairly expensive to get most of these sorts of robots, because as the years go on, the technology gets more sophisticated, so each model that comes out has more features. The second round of the Electrolux Trilobite also included infrared sensors, which made it a little easier for the robot to sense its environment, because one of the problems with the ultrasonic ones was that if it came across something like a really sharp curve, the ultrasonic responses wouldn't be accurate enough for it.

So that was the first one. And it also had a base unit that it could automatically find, again using ultrasonic sensors. The base unit sends out a signal, and the robot homes in on that signal so that it can plug back in and charge. Exactly. So I've seen uses of this in toys as well. For example, Lego has a series of robots that you can build yourself, robot kits, right, and the robot kits come with a programmable base. The programmable base is actually really cool and lets you plug and play robot commands into your robots. So essentially it's like if-then commands: if this robot encounters this situation, then the robot should behave this way. And one of the things you can do is buy different types of sensors to put on your Lego robot, and one of them is the ultrasonic sensor, which allows a robot to really zero in on objects.
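That plug-and-play, if-this-then-that style of robot programming might look something like the sketch below. The sensor readings and thresholds are invented; the shape of it, an ordered list of condition-action rules checked on every tick of a control loop with the first match winning, is the idea being described.

    import random

    def read_distance_cm() -> float:
        """Stand-in for an ultrasonic sensor read (random for the demo)."""
        return random.uniform(0.0, 100.0)

    # Ordered if-then rules: (condition on the reading, behavior to run).
    rules = [
        (lambda d: d < 8.0,  lambda: print("too close -> back up and turn")),
        (lambda d: d < 30.0, lambda: print("obstacle ahead -> slow down")),
        (lambda d: True,     lambda: print("all clear -> keep driving")),
    ]

    for _ in range(5):                  # a few ticks of the control loop
        reading = read_distance_cm()
        for condition, behavior in rules:
            if condition(reading):      # first matching rule wins
                behavior()
                break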
And here at How Stuff Works, for our big holiday party, we had someone come in with these robots and show them off. One of the robots had just simple infrared sensors that would allow it to detect when it was coming close to the edge of the play area, and it would turn around and come back in. The other robot had the infrared sensors and the ultrasonic ones, which allowed it to zero in on its opponent, so it could go right after the other robot. Clearly, in the domestic situation, we're talking more about the robot avoiding damage, and also avoiding damaging other stuff in the environment, like your furniture. I think this is a good time to take a quick break and thank our sponsor.

And now back to the show. Getting back into domestic robots, there's a company that you pretty much have to talk about, called iRobot. Which is kind of funny; iRobot makes me think of something else. Yes, yeah, we didn't even mention Isaac Asimov earlier. I think this is a perfect opportunity before we dive into the company. Yeah, a science fiction writer working in, oh, you might be better at this than I am, the sixties, is that correct? And earlier? Yes, and earlier. But he wrote the Three Laws of Robotics. Yeah, these were the laws that, within Asimov's stories, guided the robots' behavior so that they would try to follow ethical programming. The basic three rules (I don't have them written in front of me, so I apologize for the fact that I'm going to get these out of order and that I'm paraphrasing) were that the robot could not do anything to harm itself, unless not doing so would mean that it would allow others to come to harm. So, in other words: a robot couldn't harm itself, a robot could not harm other people, and there was another one. I think it's property that robots can't hurt.
But anyway, the most important rule of all of them was that a robot could not hurt a person, or allow a person to come to harm through inaction. And that overruled all other robotic rules. So if it meant that the robot would end up being damaged in the process of preventing a living thing from being hurt, the robot would go ahead and override the self-preservation rule and allow itself to be damaged so that it could save the person. Most of Asimov's stories involving robots had to do with the unintended consequences of these apparently ironclad rules, right? Because when you give a logical rule set to a very sophisticated computer, it may or may not find ways around it. And it's not necessarily that the computer is trying to behave in a malevolent way. Here's another example; this is one of those things about the robot apocalypse that you always hear science fiction writers talk about, and some futurists as well. Let's say that you have a super-intelligent computer robot, and you tell the robot: I want you to, you know, you are smarter than we are, bring about world peace. It's possible that the robot could come to the conclusion that the reason why there's not world peace is that there are people, all of those pesky humans running around causing wars. So if we just kill all humans, then you've got peace. So yeah, that's the classic example. But you don't have to worry about your domestic vacuum robot doing that. Probably not. I've known an iRobot Roomba, and it did not seem malevolent, very, very malevolent, to me. My dog was not so sure. The dog might have a completely different opinion about the intentions of the iRobot Roomba.
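That priority scheme, with human safety outranking the robot's self-preservation, can be sketched as a strict ordered check. Below is a toy illustration in Python of just the ranking logic, not anything a real robot runs:

    # Toy version of the strict law ordering: rank candidate actions by how
    # badly they violate the laws; the lowest violation wins. Letting a
    # person come to harm (even through inaction) always outranks the
    # robot damaging itself.
    def violation_rank(allows_human_harm: bool, damages_self: bool) -> int:
        if allows_human_harm:
            return 2    # worst outcome: a person gets hurt
        if damages_self:
            return 1    # bad, but acceptable if it protects a person
        return 0        # no violation at all

    # Two options: stand by safely, or get damaged saving someone.
    stand_by  = violation_rank(allows_human_harm=True,  damages_self=False)
    intervene = violation_rank(allows_human_harm=False, damages_self=True)
    print("intervene" if intervene < stand_by else "stand by")  # -> intervene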
All right, well, let's talk about iRobot before we talk about the Roomba. What's interesting to me is that it's a company that was formed in 1990 by three MIT associates. Those three MIT associates were Rod Brooks, Colin Angle and Helen Greiner, and they wanted to bring robots out of the realm of academia and industry. Those were the two places where you would find a robot pretty much before 1990, with some exceptions. There were some toys. I had a very annoying robot toy when I was a young child; oh, all of the noises. My aunt, who didn't have kids yet, got it for me. Of course, it's always the relative who does that. Yeah. So there were toys, and hobbyists had some access to some robotics, and a lot of people were doing DIY stuff, but much like the field of early computers, most people were just kind of going: what is this thing good for? Yeah, what can we do with it? Consumers didn't have an option to purchase one as of yet; this is way back. First, well, first their company was called IS Robotics, but they would change that further down the line, and they originally built their business plan on the idea of space exploration robots. So their first designs were for NASA, and they created some robots that were meant for things like lunar exploration, and spacecraft-robot-type creations, and planetary exploration robots, and these designs allowed them to get some contracts with the government. And the whole point of them making these designs, besides the fact that this was something they were genuinely interested in, was that it would allow them to gain the capital they would need to invest in a consumer robot, because that's not a small undertaking. Remember, this is a decade before we see the first consumer robot on shelves. So, as Lauren was saying before, it's not the most efficient or economically feasible way for you to take care of minor chores around the house. It could be a very expensive thing.
So they needed to raise some capital first before they could start getting into consumer robots; you know, they couldn't just hit the ground running doing that. So first they started working on robots for space exploration missions. They also developed a robot called the iRobot 510 PackBot, and this was a search-and-rescue and bomb-disposal robot. It was used by the military, and they still do a bunch of military work, I believe. So yeah, you've got to hope that your Roomba does not get mixed up with the PackBot 510. Yes, fingers crossed. To be fair, the 510 PackBot is not a military robot in the sense that it's weaponized. Most of these are in fact going on recon missions, or defusing bombs, or possibly, like I said, doing search and rescue. So for first responders: if there's a building, for example, that's been the target of a bomb, then sending a robot in to look for survivors could mean saving the lives of first responders. You know, it's one of those things where you think, well, this is a really expensive robot; but compared to a human life, it is negligible. So iRobot began to make those as well, and again, this helped iRobot gain the capital they needed to go into the consumer market. And one of the other big projects they did before they got into consumer robots was called AutoCleaner. AutoCleaner was made for S.C. Johnson Wax, and it was an industrial cleaning robot. And this is kind of the project that inspired a couple of the engineers, once the project was winding down, to look into a way of building a consumer model of it, and that was what would eventually become the Roomba. That's when they first started working on it, and the Roomba would not hit store shelves until September 2002.
So by then, the Electrolux Trilobite had been on the market for almost a year, and then iRobot, a totally different company, obviously, launched the Roomba, which has, I think, become almost the brand that's synonymous with the whole category. Basically, yeah, exactly. Like Xerox, or Jell-O, or fridge. Yeah, it's just one of those things where the term has almost just substituted for the idea of a vacuum robot; Roomba is the shorthand. As of 2011, over six million of the buggers had been sold. So there's a whole bunch of jokes that are coming to mind, and I'm going to leave them all alone. Excellent. But you're all welcome. iRobot does make several other kinds of things. Yes, they do. They make a robot called the Scooba, which is a mop. Yes, the Scooba actually has an interesting approach. Well, it's good to compare it to the Roomba. So the Roomba uses essentially three different kinds of brushes. There's one brush that just sort of helps sweep particles into the path of the Roomba. But then it has these two brushes that are used to pick up the larger types of debris, and these two brushes spin in a way where one is clockwise and the other is counterclockwise (I'm gesticulating, and that's not helping). Yeah, they're using this opposite rotating motion to flick stuff up into the bin that is inside the Roomba, the refuse bin, if you prefer, the trash bin. And then it also has a vacuum that uses suction to suck up the finer particles, finer being smaller, not of higher quality. And then the Scooba. Its method is: it has a vacuum as well, to suck up loose particles, and then it sprays the floor with a combination of water and cleaning solution, then uses a rotating brush to scrub the floor, then it sucks up the dirty water into a waste bin that is separate from the water-and-cleaning-solution bin. So there are two different containers inside the Scooba for this stuff, so that way you don't mix the two; clearly you don't want dirty water going back in there.
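That wash cycle, vacuum, spray, scrub, then recover the dirty water into its own tank, is a fixed pipeline with one hard rule: the two tanks never mix. Here is a minimal sketch in Python of that bookkeeping; the class, the tank sizes and the per-patch amounts are all invented for illustration.

    # Sketch of the described wash cycle: clean solution goes out, dirty
    # water comes back into a SEPARATE tank, and the two never remix.
    class WashCycleBot:
        def __init__(self, clean_tank_ml: int = 500):
            self.clean_tank_ml = clean_tank_ml  # water + cleaning solution
            self.waste_tank_ml = 0              # recovered dirty water

        def clean_patch(self, spray_ml: int = 20) -> None:
            spray = min(spray_ml, self.clean_tank_ml)
            # 1. vacuum loose particles    2. spray water + solution
            # 3. scrub with rotating brush 4. suck dirty water into waste tank
            self.clean_tank_ml -= spray
            self.waste_tank_ml += spray

    bot = WashCycleBot()
    for _ in range(10):
        bot.clean_patch()
    print(bot.clean_tank_ml, bot.waste_tank_ml)  # 300 200 -- never remixed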
And then it uses a technology called iAdapt to monitor and respond to its environment. So the Roomba and the Scooba both have to be able to maneuver through an environment and cover an entire floor, because, I mean, it wouldn't do you any good if the Roomba just kind of wandered aimlessly and then went to its charging station. You'd go in and be like: well, this one meaningless path is clean, but everything else is filthy. So it has to be able to make its way around in a space that was not necessarily intended for a robot to move around in, right. Because, yeah, well, humans can instinctively look at a coffee table and go, I shouldn't walk directly into that. Robots don't necessarily innately know that; they have to actually be programmed not to do that. We'll get into more of that in a second, too. But yeah, those are the two big ones, the Roomba and the Scooba. Yeah, and, I think it was at CES, or it might have been, they also introduced the iRobot Looj, which is a gutter cleaner. Yeah, I've seen examples of the Looj for a couple of years, but it's essentially this thing that you put in your gutter, and it just kind of goes; it goes down like a bullet and then just starts scooping out gunk. So do not stand near your gutter while that sucker is going. Who's going to stand in your gutter? Near your gutter. Like, or under it. Yeah. Gross. Remember, though: if your head's in the gutter, your eyes are looking at the stars. That's just a paraphrase. That is deep. Oh, that's pretty... you know, I can get pretty deep in the gutter. Anyway.
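Back on that coverage point for a second: robots of this generation didn't build a map of the floor at all. A bump-and-turn walk, run long enough, covers most of an open floor on its own. Here is a toy grid simulation of that idea in Python; the room size, start position and step count are all arbitrary.

    import random

    # Bump-and-turn coverage of a 10x10 grid "room": no map, no plan.
    # Drive straight until you hit a wall, then pick a new direction.
    W = H = 10
    visited = set()
    x, y = 5, 5
    dx, dy = 1, 0

    random.seed(1)
    for _ in range(2000):                 # enough ticks to cover most cells
        visited.add((x, y))
        nx, ny = x + dx, y + dy
        if 0 <= nx < W and 0 <= ny < H:   # open floor: keep going straight
            x, y = nx, ny
        else:                             # "bump": turn a random new way
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])

    print(f"covered {len(visited)} of {W * H} cells")

Real robots layer smarter behaviors on top, like spiraling and wall-following, but the lesson is the same: persistence can substitute for a map.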
So there are other types of domestic robots as well. I mean, these were probably the best known, because it's a huge brand and they've been around for a few years. But there are robotic lawnmowers; I think iRobot actually does one of those as well. Yeah, they might. I know there are quite a few brands that do have robotic lawnmowers, and these are, again, using those same technologies. They're using collision detection to make sure that they don't bang into anything. Most of them have little kill switches inside them, so if they're tilted or turned over, the blade stops spinning immediately; that way you don't endanger anyone, like a little kid wondering, what is this thing? I'll put my hand on it. Yeah, that can be dangerous. And also, you know, you can define the parameters of your yard so that the lawnmower works within those parameters and doesn't just suddenly go: well, all done here, let's do the neighbor's yard, and then the next one, and the next one. So there are examples of that, and they work on very similar technologies as the vacuum robots. Another similar robot: window-cleaning robots, which use suction to attach themselves to a window, and then, again, work with cleaning solution and rinsing water. I haven't seen any of these; they sound terrifying. There was one unveiled at CES called the Winbot. I wish I had seen that now. Yeah, it's supposed to debut later this year for under four hundred dollars, supposedly, and it has reusable pads to wash and dry; it has a little squeegee in there. So I guess you set it on the window and then it goes. Yeah, that's the thing. This is another example of why people are a little disappointed, not that they should be, but they're disappointed in domestic robots, because it's not a robot that just comes out of a closet, takes a look around, and says: oh, these are the four things that need to be done in this room.
I need to sweep and mop the floors, I need to clean the windows, I need to fold the laundry, and I need to pull the kid off the TV because he's been on it too long. Which is, you know, what Rosie the Robot could do.

It's Jonathan again. The robot's still beeping, so we're going to take another break while I try to figure out how to turn this thing off.

Yeah, I've got one called the Dressman. The Dressman looks like kind of a mannequin torso, but essentially the robot itself inflates and uses hot air to dry and press shirts. It's a robotic iron. That's terrific. Yeah, I could use one. I definitely tend to wear wrinkled stuff. Yeah, I think I iron approximately never. There are also robots in pet care: robotic litter boxes that scoop themselves (that's a valuable service) and machines for refilling food and water trays. There are robot pool cleaners, which we talked about; you know, I mentioned those earlier. And these are all at different levels of sophistication. Not all of them need all the sensors that other ones need, because the job they do doesn't require it. So not every single robot is going to be decked out with sensors. However, future robots might be decked out with sensors. We've got a lot of development in the robot space from people who want to create general-purpose robots that can tackle different tasks, so that you don't have, you know, eight different robots to do all of your chores; you've got one robot that can do all of them, right. Willow Garage is one company that has been making a lot of waves. Yeah, they've got a robot called the PR2, which stands for Personal Robot 2, the number two, and it's essentially a research and development platform. It's not necessarily meant to be a robot that a consumer would go out and buy. First of all, it's got something like a four-hundred-thousand-dollar price tag, prohibitively expensive for most of us, including myself.
But what this is meant to do is allow people to build apps, robot apps, that would increase the functionality of the robot. So you might create an app that's a laundry-folding app, or an app that's a fry-an-egg app. And the robot is humanoid in the sense that it has arms, and it's got sensors that are in a head-like thing. Yeah, it looks like a torso with arms, and wheels; it doesn't have legs, it's got a wheeled base. And it's a little, from what I saw... I didn't see an actual scale, like next to a human being, but to me it looked like they came up a little higher than your waist, maybe the mid-chest area. It could be that they're much taller than that, but that's not what it seemed to be when I was looking at the videos. All of the robot videos that I've watched in the past twenty-four hours are running together in my head, so I've got no... You were watching a lot of robots that had nothing to do with this. Just robots, just... it's Transformers. Michael Bay. Stuff blew up. Anyway, much, much better robot videos than those, come on. Nice, well done. You shall live. So the PR2 robot is really meant for developers to create software for this robot, right. It's all open platform; open-source developing is highly encouraged, and it's mostly trying to get people excited about it. Also, Willow Garage is doing the TurtleBot, which works together with, well, iRobot makes something called the Create, which is a Roomba base that has a loading dock for whatever other stuff you want to kind of shove in there, and it's for home developers. You can hook it up with a Kinect, with a netbook, anything else you want, and teach it how to do your own stuff. Neat. Yeah, things that allow hackers more tools to play with stuff are always cool.
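The robot-app idea, a common platform that new skills get installed onto, can be sketched as a simple registry. The app names below are the hosts' own examples; the platform class and everything else is invented for illustration.

    from typing import Callable, Dict

    # Sketch of the "robot app" idea: the platform exposes a registry,
    # and developers ship new skills (fold laundry, fry an egg) as entries.
    class RobotPlatform:
        def __init__(self) -> None:
            self.apps: Dict[str, Callable[[], None]] = {}

        def install(self, name: str, app: Callable[[], None]) -> None:
            self.apps[name] = app

        def run(self, name: str) -> None:
            self.apps[name]()  # KeyError if that skill was never installed

    pr2 = RobotPlatform()
    pr2.install("fold_laundry", lambda: print("folding a towel... slowly"))
    pr2.install("fry_egg", lambda: print("frying an egg"))
    pr2.run("fold_laundry")

The real PR2 ecosystem did this through ROS packages rather than a Python dictionary, but the plug-in shape is the same.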
And the PR2 has some pretty hefty hardware on it. It's got two computers on it; each computer has an eight-core processor, twenty-four gigs of RAM, and two terabytes of disk space. And according to one of the articles I was reading, all of that heavy-duty hardware gives the PR2 the capability of folding a towel in just six minutes, which is down from twenty-four minutes, which just means it's not the most efficient towel-folding device ever. Well, sure, but towels are tricksy. Towels are so much trickier than, say, t-shirts. Well, I mean, you know, if you're looking at it as a human, then no, towels are pretty simple. But if you're looking at it as a robot, towels are floppy. They move around; they don't stay put. And here's the rub. This is the problem: teaching a robot to do something that's simple for a human is incredibly complex. Yeah. I mean, if you teach a robot how to play chess, for example, chess is one of those if-then-statement kinds of things. There are only so many potential things you can do within a chess game. There are a lot of them, but there is a limit, an upper limit on the things you are capable of doing within the context of a chess game. If you ask a robot, however, to go to the kitchen, pick up a mug of tea, and bring it back into the podcast room, that's, I mean, that's incredibly complicated. Yeah, because you never know what could be in the way. I mean, if the environment is constantly the same and the parameters for that task are always identical, that's one thing. But when you're in an environment that's dynamic, that can change over time... Maybe there's a chair that's in the way that normally wouldn't be.
Maybe the mugs are all dirty, or maybe the only clean mugs are in the dishwasher, as if people use that thing. Come on, How Stuff Works people. Or maybe some mugs are more delicate than others, and it crushes the mug to a fine powder instead of picking it up, and then pours the tea on it and brings you a... what, a pile of mug powder? Which is not delicious. No, I recommend you do not drink ceramic powder. That would probably not go over well. But yeah, that's the thing: teaching a robot to deal with these kinds of changing environments is really challenging, and there are a lot of companies that are working on this with various types of artificial intelligence.

I mean, think about it this way. If I walk into a room that I have never been in before, but I have all of my normal senses, if the lights are on and I'm able to see, I've got my glasses on because I am nearsighted... let's say I walk into this room and I take a look around the room. I can get around that room fairly effectively, assuming there aren't, like, laser death traps or something in it, just a typical room, like there's some chairs and some other furniture around. As long as I'm not the one who built the room, right, I can get from point A to point B without killing myself or bumping into something or, you know, otherwise making an idiot of myself more so than I normally do. A robot goes into that room, if it's never been there before, it has to have a lot of sophisticated equipment to be able to map out that room and then maneuver through that map. And it also needs to be able to map dynamically, to be able to update this map in real time, and that takes processing power, a lot of processing power, and it requires fairly sensitive and sophisticated sensors. So we take it for granted because it's our natural way of life.
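To make that mapping-and-replanning point concrete, here is a toy sketch in Python. This is not any particular robot's algorithm, just a grid map with a breadth-first path search that has to be re-run the moment an unexpected obstacle, that chair, shows up; real systems do the same bookkeeping with SLAM and far richer sensor models.

    # Toy occupancy grid: 0 = free cell, 1 = blocked cell.
    from collections import deque

    def plan(grid, start, goal):
        """Breadth-first search over free cells; returns a path or None."""
        frontier = deque([(start, [start])])
        seen = {start}
        while frontier:
            (r, c), path = frontier.popleft()
            if (r, c) == goal:
                return path
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                        and grid[nr][nc] == 0 and (nr, nc) not in seen):
                    seen.add((nr, nc))
                    frontier.append(((nr, nc), path + [(nr, nc)]))
        return None

    grid = [[0] * 5 for _ in range(5)]
    path = plan(grid, (0, 0), (4, 4))
    grid[2][2] = 1                      # a chair appears where it usually isn't
    if path and (2, 2) in path:
        path = plan(grid, (0, 0), (4, 4))   # the map changed, so re-plan

The expensive part in real life isn't the search; it's keeping that grid honest from noisy sensors, continuously, which is where all that processing power goes.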
You know, we walk into a room and that's just the way things are. But for a robot, there is no natural way of life. We have to program all of that into the robot, and that is really, really tricky. And it gets even more tricky when you start talking about things like, can the robot navigate stairs? There are some robots that can. Honda's Asimo can go up and down stairs, but the stairs have to be programmed into Asimo. At least as of the last thing I read, Asimo could not navigate a set of stairs that it had not encountered before. It could get through a room it hadn't encountered before, because its sensors were sophisticated enough that if you said, all right, Asimo, I need you to pick up the red ball that's in this room, Asimo could navigate through a room, even one it had never seen before, and get to the object. And if you haven't seen one of these critters, it is bipedal. It looks more or less like a person. I think it's about four foot two, so it's small, as humans go. It's Lauren-sized. Thanks. You're welcome. Hey, I'm five two, I'm gigantic compared to that. You could totally take him. I keep saying "him," but Asimo doesn't technically have a gender. It is genderless. It is a robot. It looks like a little spaceman, though, doesn't it? Honestly, it kind of creeps me out. I'm kind of sort of not okay with humanoid robots. I mean, have you seen it run? It has this little hopping run. That was one of the things that was famous about Asimo: it was the first robot of that size to be able to run. And by definition, running is a method of conveyance where at some point both your feet are off the ground, and most robots had to have at least one foot in contact with the ground at all times
in order to support themselves. And it does mean that you have to program very sophisticated sensors into the robot to be able to have both feet come off the ground, and when a foot makes contact with the ground again, the robot has to be very carefully balanced so it can react in the right way and remain upright. They achieved that with Asimo, so when it runs, it's really this little hoppy run that looks sort of comical when you first see it. But Asimo also had a lot more sophistication than previous versions of robots did. I wrote a whole article about Asimo, and I got a chance to see Asimo in person. Actually, anyone could, because Asimo was at Disneyland's Innoventions, and it was interesting to see. Now again, Asimo is kind of an example of what a domestic robot might one day look like. But it really illustrates how challenging it is to build a robot that is bipedal and humanoid. It's just not the easiest design, all right? It doesn't necessarily make development sense. It's easier in the long run to build these kinds of small things, the Roomba, the window washer, et cetera, et cetera. I mean, because we can either build robots to interact with our environment, or we can build our environment to interact more easily with robots. And that would involve, you know, having an entire extra space in our kitchen so that a robot can access all of our food and cook for us, rather than, you know, setting it loose in there and just kind of seeing what it does. Yeah, there are a lot of challenges, and granted, there are very smart people working on these challenges to try and overcome them. There are other challenges as well. There are emotional challenges, because the more humanoid you make a robot, the more likely you are to develop some sort of emotional attachment.
There are people who find their Roombas to be perfectly charming. There are people who have named their Roombas. Apparently most people who have Roombas name them. Yeah, so it ends up becoming almost like a pet. Even though this is an object that has no sentience, there's no emotion in this object whatsoever, because they are animated, because they move around and go through an environment, they seem... I used to talk to mine all the time. And, to be fair, Lauren, you just talk to yourself all the time. Entirely fair. Yeah, so of course you talked to it all the time. However... I mean, you know, the way that I reacted to the videos I was watching of Asimo, I'm kind of creeped out by it. I'm not really okay with humanoid robots. Which brings us to the concept of the uncanny valley, right? The closer you approach the appearance of a human, the more likely you are to make it an unsettling experience for someone to look at. Right. Now, if you were to ever get to a point where the robot is indistinguishable from a human being, then theoretically you would bridge the uncanny valley and it would no longer be an issue. But the problem is when you get really close but not exactly there, so that there's something that's off, and it just really unsettles you in a very Freudian, itch-between-your-shoulder-blades kind of way. It gives you the willies, in the American sense of the word, please. Brits, still write to me, I'm not being rude. We're not trying to be. Anyway. Now I am, ha ha. Anyway, yeah, you get to these robots that look kind of human, but they're not quite right, and that definitely gives a kind of unsettling feeling. The example I always hear, and this isn't robots, this is computer-generated imagery, is the film The Polar Express.
Oh yeah, when the trailers came out for The Polar Express movie, the animated models looked very lifelike, but the eyes were not quite right, and they kind of gave that sort of dead-eyed look that gets a little creepy. Now, as we get better and better at this, there may be a time when we do bridge that gap. We get that David-from-Prometheus kind of thing, where all robots look like Michael Fassbender. That's pretty much okay. I was thinking back to A.I., personally, because I want my robots to look like Jude Law. That's entirely fair. We're allowed to disagree on that. Okay, all right, I'm glad we've reached that accord. Excellent.

We've kind of covered the whole topic here, the future of domestic robots. I think, more likely than not, we're going to see more specialized robots, you know, robots made for specific uses. It will still be quite a while before we get robots that can be general-purpose domestic robots, especially the humanoid ones. I think that's easily more than a decade away, just because the stuff that we could do now is incredible, but it requires so much computing horsepower and such a huge support system around it that it's not feasible for the home. Yeah, the hardware is kind of getting there. Hardware is cheap enough these days that I think that part is on the feasible end. But it's really the software; we do not have the capacity yet, you know, to run the programs necessary. Yeah. So, I mean, if you had a supercomputer in your home that could run the robot for you, so the robot would not have to carry around its own superhuman robot brain, that might help. But I don't know a lot of people who have supercomputers, you know, just stored away in their gaming room or whatever. Not too many of us. Nope, I definitely do not number among them.
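That supercomputer-in-the-closet idea is just offboard computation, and the shape of it is easy to sketch. This is a toy illustration in Python, not how any shipping robot works; the hostname, port, and one-line JSON protocol are invented for the example. The robot ships a sensor snapshot to a beefier machine on the home network and simply executes whatever command comes back.

    import json
    import socket

    # Hypothetical home "brain" server; all heavy planning happens there.
    BRAIN_ADDR = ('homeserver.local', 9000)

    def ask_brain(sensor_snapshot):
        """Send one JSON-encoded sensor reading, get one JSON command back."""
        with socket.create_connection(BRAIN_ADDR, timeout=1.0) as conn:
            conn.sendall(json.dumps(sensor_snapshot).encode() + b'\n')
            reply = conn.makefile().readline()   # e.g. '{"cmd": "turn_left"}'
        return json.loads(reply)

    # The robot itself stays simple: sense, ask, act, repeat.
    command = ask_brain({'bumper_hit': False, 'range_cm': 82})
    print(command['cmd'])

The obvious catch, and why this stayed hypothetical in 2013, is that the robot is now only as reliable as the home network between it and its brain.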
But anyway, that's kind of the lowdown on domestic robots as they stand now and how they may be in the future. I would love to see robots come out that have this sort of adaptability, like the kind of stuff that Willow Garage is working on, where you've got these robots that, depending upon what apps you download, can do different stuff.

And that wraps up another classic episode of TechStuff. Hope you guys enjoyed it. It's one of those topics that I think is fascinating, because the more I learned about what goes into designing these robots, the more I appreciate how complicated it is. Because you're not just designing the systems, you're not just engineering the robotics so that the thing does the thing you need it to do. You also have to start taking human psychology into account, and once you start looking at how humans interact with robots, things change in ways you probably didn't expect when you were just trying to work out the kinks of the mechanical aspect. I find that fascinating. So this is a topic I'm sure I will return to in the future. If you guys have suggestions for future topics I should cover, email me. The address is tech stuff at how stuff works dot com, or drop me a line on Facebook or Twitter. The handle at both of those is TechStuff HSW. You can go to our website, that's tech stuff podcast dot com. There's an archive there of every episode we've ever published. It's searchable. Go check that out. And there's also a link to our online store, where every purchase you make goes to support the show, and we greatly appreciate it. And I will talk to you again really soon.

TechStuff is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.