Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It is time for a TechStuff classic episode. This episode originally published in June 2015. It is titled "DARPA Robotics Challenge Review." It's about the DARPA Robotics Challenge, where they were simulating an emergency situation and putting a robot through the paces of doing certain basic tasks in an effort to see if the robots could do them, you know, more or less autonomously. Turns out that some things that you might think of as being incredibly easy to accomplish turned out to be pretty hard. Let's listen in.

So, to start off, we've got to talk about DARPA itself. Remember, that's the research and development arm of the Department of Defense. Technically it's the US Defense Advanced Research Projects Agency, and DARPA has been responsible for a lot of cutting-edge research, largely in the mode of military use. But we have seen the benefits of that research hit us in other ways.
Besides, you know, the military applications. So, for example, ARPANET, the predecessor to the Internet, was the platform upon which the technologies that allow the Internet to work were developed. And that was a DARPA project; that was back when they were first just called ARPA rather than DARPA. But we wouldn't have the Internet, at least not the way we have it today, if it were not for this agency. So the agency has done a lot of stuff that has benefited us in many ways. I'll talk about another one a little bit later on in the podcast, kind of draw parallels to what happened with the robotics challenge.

So here we have this agency. They're all about research and development. They don't actually have lots of labs with scientists working in them. No, what this agency does is it sends out proposals to other groups and asks them to help work on various projects. Sometimes they hold challenges. In these challenges, it's kind of an open invitation for any group that can participate to compete in some way, and the winner usually gets some sort of cash prize.
And quite often the cash prize will be less than the investment needed to actually participate in the competition, to build out the thing that you need to make in order to win, but it's still a very important and prestigious event to be part of if you are capable of competing. Now, we're talking mostly about research departments, engineering departments in universities, but there are also some private institutions, not just university-type things but actual research and development labs, that will participate in DARPA projects as well. And it's collectively that these teams are kind of... they're the ones that are pushing the technology forward, and DARPA is kind of the facilitator. They're offering up the possibility for people to actually participate, and not only the possibility, but the goals. They define the goals that have to be met. And as it turns out, defining those goals is a very important part of these challenges, because you have to tell the engineers what it is they need to be able to do before they can build the stuff that does it.
If you have a very vague description of what needs to be done, the possibilities of achieving it are so varied that it can often paralyze a project; nothing gets done. But if you make it very specific, then it narrows down the options and it gives more focus to the project. So the DARPA Robotics Challenge itself was inspired by a real-world disaster, specifically the Fukushima disaster in Japan. Now, this was the nuclear facility, the nuclear reactor, that was flooded and then suffered some massive problems because of damage to the infrastructure. And we're talking about a nuclear facility; radiation is a factor, so it's a very dangerous environment to send responders into. DARPA's approach was to create a scenario in which a robot would need to go through an environment similar to the one at Fukushima and be able to perform the tasks necessary to help avert true devastation and catastrophe without putting people at risk.
So the idea is that you can use a robot, which is not going to respond to radiation the way a living organism would, and be able to actually carry out the stuff you need to prevent a really huge problem from getting worse. So some people have referred to the DARPA Robotics Challenge as the Robolympics, because it was a series of tasks that a robot had to complete, and it had to all be the same robot. Teams could not build individual robots designed to complete specific tasks. If they could, it would be a way easier challenge, because most of the robots that are in use today have very narrow parameters. There's maybe just a single task, or perhaps just a couple of... you know, a small range of tasks that the robot has to do. It doesn't have to do anything outside of that. So when you're designing the robot, you just say, all right, what design elements will allow the robot to do what it has to do? Anything that's outside of that is unnecessary; we will not do it. Take the Roomba as an example. It's a great example.
It's a robot that vacuums, that cleans floors. Anything that's not necessary for the Roomba to do that is put aside. You'd want to make sure that all the design elements in your robot complement its purpose. But if you have to complete a lot of different tasks, and those tasks are not always similar to one another, you complicate the whole design process by incredible factors. It's hard to even express how much more difficult this is.

Now, if you've watched any of the videos that came out of the DARPA Robotics Challenge, you probably saw at least one or two that were montages of robots falling over, and it's a little funny to watch. You see these big robots just topple over, sometimes apparently for no reason. They're just standing, and then they collapse. And there's something comic about that; you can't deny that it's funny. But it also demonstrates that these robots are trying to do something that, while we as human beings might find it easy, is a real challenge from a robotics standpoint. So this DARPA Robotics Challenge had three phases.
The first one was a virtual challenge, in which teams were to design software that would allow a virtual robot to complete certain tasks within a virtual environment. So there were no real robotics here, but it was a test to see if the teams could actually build the software necessary to make the robot do the tasks that needed to be done. The next phase was a trial, a physical trial, where teams actually had to put a real robot through its paces to, you know, go to the next level. And then you had the finals, which were the ones that happened most recently, and I'll go through each of the tasks that they had to do in just a second. If we look back at the virtual challenge, there were twenty-six teams that were part of it to build that software. The robot they were controlling was a virtual Atlas robot, which is a humanoid robot and was a popular choice for a lot of the teams. When they moved on to the more physical trials, they still would use the Atlas robots.
The challenge, by the way, did not require the teams to build their own robots from scratch. They could use pre-existing robots, but they had to program them so that they could complete the necessary tasks. So not every robot out there would have been a good fit for this. Let's go back to the Roomba. The Roomba would have been a terrible choice to enter into the DARPA Robotics Challenge. We'll be back with more of this classic episode of TechStuff after this quick break.

So here are some of the virtual tasks that the virtual robots had to complete using this software. They had to walk through an area of uneven ground that had debris included there, so simulated debris. You had to have software that would allow the robot to maintain its balance even while tracking a path to a specific destination through this uneven terrain. They had to attach a hose to a spigot and turn a valve. They also were given some artificial limits, because in an emergency situation you cannot always count on your communication lines being perfectly sound, such as at the Fukushima disaster.
You need to be sure that your robot can contend with the fact that there might be lag between when you can issue a command and when the robot is ready for the next command. That meant that some of the teams started to really look at ways to create a semi-autonomous robot. So some of the teams built robots that were capable of taking on some tasks autonomously. Some teams avoided that entirely; they focused on using the robots as a direct extension of the controls that the humans were behind. So, in other words, you might think of those robots as enormous technological puppets. The puppets would respond to direct human commands and wouldn't be able to do anything on their own. They didn't have any autonomy. There were other ones that had semi-autonomy, where you show it a set of stairs and you send it the command to climb those stairs, and it could do the rest. It could calculate how high it needed to put its feet and how to shift its weight, that sort of stuff.
And there were a lot of different strategies employed, and to varying degrees of success. It wasn't like the autonomous robots were automatically better than the human-controlled ones, or vice versa; it all depended on the implementation. So that virtual trial ended up having one team winning pretty decisively: the Institute for Human and Machine Cognition, which is an organization in Pensacola, Florida. And eight other teams qualified for the trials that were going to be held in December. Now, by the time those trials happened, some of the teams had merged and some other teams were coming into it. Sixteen of the teams would go on to compete at the trials that were held at the Homestead Miami Speedway, and they had eight tasks they had to complete that were similar to the ones in the DARPA Robotics finals. And those tasks included manning a vehicle and walking through uneven terrain, or otherwise moving through uneven terrain. Robots did not have to be bipedal; they could move around on however many limbs they had.
They just also had to be able to do these other tasks. They had to be able to climb a ladder. They had to be able to remove debris from a door and then open the door. They had to be able to break through a wall. They had to be able to handle valves and to attach a hose to a spigot. The winning team of that set of trials was a group called SCHAFT, spelled S-C-H-A-F-T. This was a group out of Japan. Now, when this group was competing, they were this little independent group, but then they got acquired by a little company that we've talked about several times on this podcast. You know them; you may or may not love them: Google. So just before the trials, Google had acquired this group, this organization. Now, SCHAFT did really well, which, you know, makes sense. I mean, you've heard the theme song. But out of thirty-two points that it could possibly score...
It earned twenty-seven. So, a pretty good score for a robot, and it was also seven points more than the next closest competitor, which was from IHMC, the same group that won the virtual trial earlier. Third place was Tartan Rescue from Carnegie Mellon, and that will be important in a second when we get into the finals. Now, SCHAFT did not go on to compete in the DARPA Robotics Finals, because Google announced that they were refocusing the team, dedicating it to actual commercial Google projects. So they were no longer being dedicated to this challenge; they were being dedicated to a real-world product of some sort. Google's also had other involvement with some DARPA stuff, at least after the fact. Google does not tend to get involved in these challenges directly, probably because of the military association of DARPA, and Google is very careful to avoid those kinds of associations.
Now, the finals took place over June 5 and June 6 in Pomona, California, at the Fairplex, and there was quite a crowd watching these events. The reports I read were really interesting, and also the videos I watched were interesting, because the crowd was extremely enthusiastic: cheering on the robots, groaning every time a robot failed a task or fell over, gasping when that sort of stuff happened, because everyone wanted to see these robots succeed. No one wanted any group to fail. And also, the other interesting thing was that it was really hard to tell, if you were in the audience, when a robot was operating autonomously as opposed to being controlled by humans. That says two things: one, that the line between these two is getting further and further blurred; and two, that sometimes the robot would behave in a weird way, and you couldn't be sure if it was because the autonomous programming was lacking or because the control mechanisms were limited. So, in other words, it's not necessarily that the art has gotten so advanced that we can't tell the difference.
It may be that the art has not advanced enough for us to spot what is the cause of the incompetence. And I use the word incompetence mainly as a joke, because truthfully, it displays how difficult robotics as a discipline actually is, how challenging it is to design a robot that is capable of doing the same sorts of things that humans can do. It shows that the things that we take for granted as being pretty simple are fiendishly difficult when you get to a design and programming level, when you're actually building a robot that's going to be capable of doing the same thing. So here are the eight tasks that the robots had to complete, and one of these tasks was called Surprise. I'll get to it. But the reason it's called Surprise is that DARPA told all the teams: your robot will encounter a task similar to these other ones that you already know about, but we're not going to tell you what it is.
And it meant that the teams knew that there was going to be something on that list of tasks that was not defined, and that the robot would have to be able to contend with it in order to get a point for that particular task. And so the tasks... there were eight, like I said, eight of them. You were awarded a point if your robot could successfully complete that task. So the final scores, once everything was done, were determined by how many tasks were completed successfully and what the time of the robot's performance was. Now, when they were first designing the robots, teams were told they would have half an hour per task. So you have eight tasks, half an hour per task; that's four hours total. But at the actual competition, they were told they would have one hour to complete all eight tasks. And while you might think, how is that fair? That's a bait and switch. You also have to understand that this was all about simulating an emergency response. So under those emergency conditions, you can't expect to ask for more time. That's not realistic.
And it added an extra layer of pressure on the teams. So, those eight tasks: the first one was to drive. The robots had to drive a Polaris Ranger XP 900 vehicle. If you haven't seen them, they look like what you might call a Gator, or you might think of as a golf cart on steroids. These are vehicles that are similar to golf carts. They're largely open, they've got heavier wheels than golf carts do, more horsepower than a golf cart would have, but it's definitely in that range between golf cart and real car. Teams had five minutes to alter the vehicle, without using tools, to make sure that their robots could actually operate it. So what the robots had to do was drive from the starting point to a destination, and it was only considered a success if the vehicle had completely moved past a finishing mark.
The vehicle also had to go through essentially a driving course with obstacles and cones set up, and it was determined that if the robot were to collide with one of those obstacles or one of those cones and cause it to move as a result of that collision, the robot would not receive a point for that task. Also, the team could choose for the robot, instead of driving, to walk to the destination, but in that case they would not be awarded a point for that task; they would forfeit the point. But ideally the robot would be able to operate the accelerator and the steering wheel, and maybe even the brake and the gear shift. The vehicles were already started, the engines were already running, and they were already in high gear, because that was considered to be the smoothest way for the robots to maneuver the vehicle. But if teams wanted to, they could design a robot that would be capable of shifting gears. It was not a requirement; it was just something they could do if they wanted to.
At any rate, the robot needed to be able to drive safely from the starting point to the destination and then halt the vehicle, either by letting off of the accelerator so it coasted to a stop or by actually applying the brake. That was just task number one. The next task was called Egress, which is really just getting out of the car. It sounds incredibly simple, and for humans, for most of us, it really is pretty simple. You know, we intuitively know how to maintain our balance and to shift our weight so that we go from a seated position to standing. But that's not the case with the robots. You have to design the robot to do that. You have to program the robot to do that. You know, the robot has to be able to anticipate what a shift in its weight will do, how its momentum will carry it forward. As it turns out, this was one of the trickier tasks that teams had to complete. To get out of a vehicle was not an easy thing to do, but they were told that they could get out of the vehicle in either direction.
They didn't have to exit out the left 333 00:21:56,440 --> 00:21:58,760 Speaker 1: side versus the right side. The robot could get 334 00:21:58,800 --> 00:22:02,880 Speaker 1: out of either side, and again the task was considered 335 00:22:02,920 --> 00:22:05,919 Speaker 1: complete if the robot could get out of the vehicle 336 00:22:05,960 --> 00:22:09,520 Speaker 1: and maintain its balance and move on to the next challenge, 337 00:22:09,880 --> 00:22:12,919 Speaker 1: and if they could, it would receive a point. The 338 00:22:13,000 --> 00:22:17,199 Speaker 1: next one was door. Now egress was tough. Door was 339 00:22:17,240 --> 00:22:21,320 Speaker 1: surprisingly tough. You would think that opening a door would 340 00:22:21,359 --> 00:22:24,040 Speaker 1: be a pretty simple task. This was a push door 341 00:22:24,119 --> 00:22:27,080 Speaker 1: operated by a lever style handle. You had to push 342 00:22:27,160 --> 00:22:29,680 Speaker 1: down 343 00:22:29,840 --> 00:22:32,919 Speaker 1: on the handle and then push the door to 344 00:22:33,000 --> 00:22:36,440 Speaker 1: open it, then step through the door. The door did 345 00:22:36,440 --> 00:22:38,399 Speaker 1: not have a threshold, so there was nothing that you 346 00:22:38,440 --> 00:22:42,080 Speaker 1: had to step over. However, the door was a standard 347 00:22:42,119 --> 00:22:45,520 Speaker 1: thirty six inch wide doorway, and with the doorjamb 348 00:22:45,600 --> 00:22:48,320 Speaker 1: that's closer to thirty three and a half inches, and 349 00:22:48,359 --> 00:22:51,080 Speaker 1: some of these robots were too wide to walk through 350 00:22:51,119 --> 00:22:54,920 Speaker 1: the door, you know, with their torso facing forward. 351 00:22:55,160 --> 00:22:58,200 Speaker 1: They actually had to turn sideways and then shuffle through 352 00:22:58,240 --> 00:23:01,119 Speaker 1: the doorway.
And there were a lot of robots that 353 00:23:01,200 --> 00:23:07,600 Speaker 1: fell down at this stage of the challenge. They 354 00:23:08,040 --> 00:23:10,639 Speaker 1: would lose their balance either when pushing the door or 355 00:23:10,720 --> 00:23:15,560 Speaker 1: when trying to maneuver through the doorway. One robot fell 356 00:23:15,600 --> 00:23:18,320 Speaker 1: down at this stage and was able to pick itself 357 00:23:18,359 --> 00:23:24,280 Speaker 1: back up very, very slowly and deliberately, which is remarkable 358 00:23:24,320 --> 00:23:28,800 Speaker 1: because it's the only robot that managed to pick itself 359 00:23:28,840 --> 00:23:31,360 Speaker 1: back up after falling over, and that was the Carnegie 360 00:23:31,440 --> 00:23:35,000 Speaker 1: Mellon robot, the same one we mentioned in the Trials 361 00:23:35,160 --> 00:23:39,320 Speaker 1: part just a few moments ago. But anyway, that 362 00:23:39,359 --> 00:23:40,720 Speaker 1: was it. You just had to open the door and 363 00:23:40,760 --> 00:23:42,520 Speaker 1: walk through the doorway, and that was the end of 364 00:23:42,560 --> 00:23:46,320 Speaker 1: that task, and yet it was deceptively difficult. The next 365 00:23:46,320 --> 00:23:51,600 Speaker 1: one was Valve, which was not about the game company, 366 00:23:51,640 --> 00:23:54,800 Speaker 1: but rather about turning a control as if you were 367 00:23:54,840 --> 00:23:58,359 Speaker 1: turning a valve to operate a fire hose or to 368 00:23:58,960 --> 00:24:01,760 Speaker 1: shut down water to a specific part of the system, 369 00:24:01,800 --> 00:24:04,720 Speaker 1: which makes sense. In a nuclear facility, you might have 370 00:24:04,880 --> 00:24:08,200 Speaker 1: to shut off or open a valve.
In that case, 371 00:24:09,720 --> 00:24:13,480 Speaker 1: the valve had to be turned counterclockwise three sixty 372 00:24:13,560 --> 00:24:16,320 Speaker 1: degrees in order for the task to be complete, and 373 00:24:16,359 --> 00:24:19,159 Speaker 1: the team was only told that the valve would have 374 00:24:19,320 --> 00:24:23,399 Speaker 1: a size between four and sixteen inches in diameter, which 375 00:24:23,400 --> 00:24:25,920 Speaker 1: meant that you had to create a robot that would 376 00:24:25,920 --> 00:24:29,399 Speaker 1: be capable of gripping anything within that range and then 377 00:24:29,600 --> 00:24:34,240 Speaker 1: turning it in that counterclockwise direction three sixty degrees. Again, 378 00:24:34,280 --> 00:24:37,560 Speaker 1: trickier than it sounds. We've got more to say in 379 00:24:37,600 --> 00:24:40,920 Speaker 1: this classic episode of TechStuff after these quick messages. 380 00:24:48,760 --> 00:24:51,600 Speaker 1: The next task was called wall, in 381 00:24:51,600 --> 00:24:53,720 Speaker 1: which a robot had to pick up a cordless drill. 382 00:24:53,920 --> 00:24:57,760 Speaker 1: There were two of them available. They were not automatically on. 383 00:24:57,840 --> 00:25:00,040 Speaker 1: The robot had to operate the cordless drill and turn 384 00:25:00,160 --> 00:25:04,880 Speaker 1: it on by squeezing the trigger, and use the cordless 385 00:25:04,960 --> 00:25:08,320 Speaker 1: drill to cut through some drywall in a shape that 386 00:25:08,440 --> 00:25:12,120 Speaker 1: was drawn on the drywall. The drywall was a half 387 00:25:12,119 --> 00:25:15,600 Speaker 1: inch thick, and the robot had to hold the drill, 388 00:25:15,960 --> 00:25:18,440 Speaker 1: operate the drill, and move in this shape and then 389 00:25:18,480 --> 00:25:21,639 Speaker 1: remove the rubble.
The idea being that sometimes the robot 390 00:25:21,760 --> 00:25:25,200 Speaker 1: might have to cut through a surface in order to 391 00:25:25,200 --> 00:25:28,840 Speaker 1: get access to certain controls that might otherwise be 392 00:25:28,880 --> 00:25:33,119 Speaker 1: blocked by the collapse of a room or something along those lines. 393 00:25:34,720 --> 00:25:37,600 Speaker 1: Also pretty tricky, you know, using tools that were designed 394 00:25:37,640 --> 00:25:40,640 Speaker 1: to be held in human hands. You know, humans, we 395 00:25:40,640 --> 00:25:43,400 Speaker 1: have the ability to detect how much pressure we're 396 00:25:43,480 --> 00:25:48,760 Speaker 1: using and to change that based upon what's happening. 397 00:25:49,119 --> 00:25:51,000 Speaker 1: Robots are a little trickier. I mean, you can have 398 00:25:51,040 --> 00:25:56,159 Speaker 1: sensors that alert the operator to what's happening, 399 00:25:56,160 --> 00:25:59,159 Speaker 1: but it's not intuitive. Again, you have to program it. 400 00:26:00,520 --> 00:26:04,359 Speaker 1: After wall was the surprise. Now, the surprise was something 401 00:26:04,400 --> 00:26:07,280 Speaker 1: that DARPA could change out from day to day. There 402 00:26:07,280 --> 00:26:10,879 Speaker 1: were two days of this series of challenges, and I 403 00:26:10,960 --> 00:26:13,760 Speaker 1: know that on one day, I'm not sure if this 404 00:26:13,840 --> 00:26:15,640 Speaker 1: was the same for the other day, but on one 405 00:26:15,720 --> 00:26:19,399 Speaker 1: day it involved picking up a plug and plugging it 406 00:26:19,440 --> 00:26:24,280 Speaker 1: into a socket. I think they probably changed it 407 00:26:24,400 --> 00:26:26,440 Speaker 1: for the second day, but I never could find out 408 00:26:26,440 --> 00:26:29,640 Speaker 1: what the surprise was in that case.
But at any rate, 409 00:26:30,040 --> 00:26:32,280 Speaker 1: it also ended up being pretty tricky. I saw 410 00:26:32,400 --> 00:26:36,280 Speaker 1: one robot that attempted several times to plug the plug 411 00:26:36,320 --> 00:26:40,639 Speaker 1: into a socket, and it was sad and funny at 412 00:26:40,680 --> 00:26:43,280 Speaker 1: the same time. But yeah, again, it was one of 413 00:26:43,280 --> 00:26:46,000 Speaker 1: those things where it was similar to tasks the robots 414 00:26:46,000 --> 00:26:49,080 Speaker 1: had to do previously, but wasn't something they were 415 00:26:49,119 --> 00:26:52,680 Speaker 1: expressly told they would have to do during the actual challenge, 416 00:26:52,720 --> 00:26:56,800 Speaker 1: so that's what made it really hard. The next task 417 00:26:56,880 --> 00:27:00,159 Speaker 1: was rubble, in which the robot had to walk through 418 00:27:00,240 --> 00:27:03,359 Speaker 1: a debris field or on top of the debris, either 419 00:27:03,440 --> 00:27:06,440 Speaker 1: through it or on top of it. Again, very challenging 420 00:27:06,480 --> 00:27:09,760 Speaker 1: for robots to maintain their balance. There were lots of, 421 00:27:10,760 --> 00:27:13,919 Speaker 1: you know, Boston Dynamics robots that were pretty good at 422 00:27:13,960 --> 00:27:16,239 Speaker 1: doing this, but still pretty tricky. You saw a lot 423 00:27:16,240 --> 00:27:19,480 Speaker 1: of robots fall over at this point too.
And then 424 00:27:19,520 --> 00:27:22,480 Speaker 1: there were stairs. The final task was to 425 00:27:22,520 --> 00:27:24,720 Speaker 1: climb up a set of stairs that had a rail on the 426 00:27:24,800 --> 00:27:28,320 Speaker 1: left side but no rail on the right side. And 427 00:27:28,440 --> 00:27:32,000 Speaker 1: once the robot was completely on the top of the stairs, 428 00:27:32,359 --> 00:27:34,879 Speaker 1: it was considered to have completed the task and the 429 00:27:34,880 --> 00:27:41,440 Speaker 1: course, and its time would be logged. So again, 430 00:27:41,520 --> 00:27:43,760 Speaker 1: those tasks, for the most part, pretty simple for your 431 00:27:43,800 --> 00:27:47,000 Speaker 1: average human to do. But imagine that it's your job 432 00:27:47,040 --> 00:27:49,320 Speaker 1: to build a robot that could do all of those things. 433 00:27:50,480 --> 00:27:53,200 Speaker 1: It has to have some form of perception. It 434 00:27:53,320 --> 00:27:56,359 Speaker 1: has to be able to see, either for the human 435 00:27:56,359 --> 00:28:01,440 Speaker 1: operators to pick up various tools or navigate through 436 00:28:01,800 --> 00:28:06,080 Speaker 1: certain areas, or for it to see and operate 437 00:28:06,200 --> 00:28:09,200 Speaker 1: autonomously, and it has to be able to perceive its environment 438 00:28:09,960 --> 00:28:13,040 Speaker 1: and understand depth. In fact, a lot of teams had trouble 439 00:28:13,080 --> 00:28:16,240 Speaker 1: with the wall task because it was very hard 440 00:28:16,320 --> 00:28:19,439 Speaker 1: to perceive how far away the drill was from a 441 00:28:19,520 --> 00:28:23,720 Speaker 1: hand to pick it up and then use it. Again, 442 00:28:23,800 --> 00:28:27,160 Speaker 1: things that are pretty easy for most of us are very 443 00:28:27,200 --> 00:28:32,680 Speaker 1: hard from a robotic standpoint.
And like I said, 444 00:28:32,680 --> 00:28:35,280 Speaker 1: the door was one of the hardest ones. A 445 00:28:35,320 --> 00:28:37,560 Speaker 1: lot of robots had a lot of issues just walking 446 00:28:37,560 --> 00:28:40,880 Speaker 1: through a doorway, which is both funny and really 447 00:28:40,920 --> 00:28:43,520 Speaker 1: does bring to light that we've got a long way 448 00:28:43,520 --> 00:28:48,760 Speaker 1: to go with robotics. As 449 00:28:48,800 --> 00:28:52,080 Speaker 1: someone in the crowd said, if the robotic uprising happens, 450 00:28:52,160 --> 00:28:56,040 Speaker 1: just close the door and you'll be fine. Because 451 00:28:56,080 --> 00:28:59,680 Speaker 1: it's tricky stuff. It really shows how tricky automation is. We 452 00:28:59,760 --> 00:29:02,600 Speaker 1: take a lot of it for granted because the examples 453 00:29:02,600 --> 00:29:05,720 Speaker 1: we see tend to be pretty elegant, because they're designed 454 00:29:06,200 --> 00:29:09,560 Speaker 1: for a specific purpose. But as we start looking at 455 00:29:09,560 --> 00:29:13,200 Speaker 1: a more general purpose robot, we begin to understand exactly 456 00:29:13,280 --> 00:29:17,400 Speaker 1: how complicated we human beings are, so to design a 457 00:29:17,440 --> 00:29:21,160 Speaker 1: machine that can operate within our human world seamlessly is 458 00:29:21,200 --> 00:29:26,200 Speaker 1: an incredible challenge. So who won? Well, first place 459 00:29:26,240 --> 00:29:29,920 Speaker 1: went to a South Korean team, KAIST. 460 00:29:31,000 --> 00:29:34,760 Speaker 1: Its robot, DRC-Hubo, 461 00:29:34,840 --> 00:29:38,920 Speaker 1: completed all eight tasks in forty four minutes and twenty 462 00:29:39,000 --> 00:29:43,000 Speaker 1: eight seconds.
Those eight tasks in forty four minutes and twenty 463 00:29:43,000 --> 00:29:45,920 Speaker 1: eight seconds. Remember, that's driving from one point to another, 464 00:29:46,040 --> 00:29:49,640 Speaker 1: a very short distance, actually getting out of that vehicle, 465 00:29:50,640 --> 00:29:53,720 Speaker 1: walking through a doorway, cutting a hole through a wall, 466 00:29:53,760 --> 00:29:58,760 Speaker 1: turning a valve, plugging in a plug, walking over 467 00:29:58,800 --> 00:30:01,400 Speaker 1: some debris, and climbing some stairs. For a healthy person, 468 00:30:02,080 --> 00:30:06,600 Speaker 1: you know, if it took them more 469 00:30:06,640 --> 00:30:08,720 Speaker 1: than ten minutes, you would wonder what was going on. 470 00:30:09,840 --> 00:30:14,000 Speaker 1: So again, the winning team did it in 471 00:30:14,040 --> 00:30:17,080 Speaker 1: forty four minutes and twenty eight seconds. There's also a 472 00:30:17,120 --> 00:30:20,160 Speaker 1: great preparation video for the DRC-Hubo that 473 00:30:20,280 --> 00:30:23,360 Speaker 1: was put out by the South Korean team that I 474 00:30:23,520 --> 00:30:27,000 Speaker 1: loved because it was like a training montage from Rocky, 475 00:30:27,200 --> 00:30:29,760 Speaker 1: and it was clear the team had a 476 00:30:29,800 --> 00:30:31,960 Speaker 1: sense of humor about this because it was showing the 477 00:30:32,040 --> 00:30:35,840 Speaker 1: robot not just completing tasks that would be similar to 478 00:30:35,840 --> 00:30:38,240 Speaker 1: the ones that were in the actual challenge, including things 479 00:30:38,240 --> 00:30:41,920 Speaker 1: like climbing a ladder or lifting a tool, 480 00:30:42,880 --> 00:30:46,600 Speaker 1: but also doing your typical, you know, training montage stuff 481 00:30:46,640 --> 00:30:49,920 Speaker 1: like doing push ups or taking a fighting stance like 482 00:30:50,120 --> 00:30:54,480 Speaker 1: you're a kung
fu master, and that really tickled me. 483 00:30:54,480 --> 00:30:57,400 Speaker 1: I thought it was pretty entertaining, but also very impressive 484 00:30:57,440 --> 00:30:59,640 Speaker 1: in terms of what they were capable of doing with that robot. 485 00:31:00,080 --> 00:31:02,760 Speaker 1: The second place team was IHMC, that 486 00:31:02,880 --> 00:31:05,320 Speaker 1: Pensacola, Florida group we talked about a minute ago. 487 00:31:05,480 --> 00:31:11,200 Speaker 1: They were using the Boston Dynamics Running Man Atlas, but 488 00:31:11,240 --> 00:31:15,760 Speaker 1: it took about fifty minutes and twenty six 489 00:31:15,760 --> 00:31:18,440 Speaker 1: seconds to complete all eight tasks. After it did so, 490 00:31:18,520 --> 00:31:21,400 Speaker 1: after it had climbed the stairs, it lifted its arms 491 00:31:21,400 --> 00:31:27,560 Speaker 1: in victory and then fell over. So I guess that's hubris. 492 00:31:27,600 --> 00:31:31,760 Speaker 1: The third place team was Carnegie Mellon, their team Tartan Rescue, 493 00:31:31,920 --> 00:31:34,720 Speaker 1: the one that had come in third place in the 494 00:31:34,760 --> 00:31:38,640 Speaker 1: trials as well. Their robot was called Chimp, which had 495 00:31:38,800 --> 00:31:41,680 Speaker 1: very long arms. It's a red robot with arms that 496 00:31:41,720 --> 00:31:44,600 Speaker 1: looked kind of freakishly long if you compare it to 497 00:31:44,680 --> 00:31:49,360 Speaker 1: a human, and it also had treads, like 498 00:31:49,400 --> 00:31:51,680 Speaker 1: a treadmill type thing, on both of its legs and 499 00:31:51,720 --> 00:31:54,040 Speaker 1: on both of its arms.
And that was the robot 500 00:31:54,040 --> 00:31:56,400 Speaker 1: that fell over in the door challenge and was able 501 00:31:56,400 --> 00:32:00,640 Speaker 1: to right itself by itself, the only one that could 502 00:32:00,640 --> 00:32:02,360 Speaker 1: get up by itself and didn't need a team of 503 00:32:02,440 --> 00:32:05,520 Speaker 1: humans to actually lift it up. Now, those were the 504 00:32:05,560 --> 00:32:08,960 Speaker 1: only three teams that actually completed all of the eight tasks, 505 00:32:09,040 --> 00:32:12,000 Speaker 1: and they won the prizes. First place was two million dollars, 506 00:32:12,080 --> 00:32:14,680 Speaker 1: second place was one million, and third place was five hundred thousand. 507 00:32:15,520 --> 00:32:18,200 Speaker 1: The other teams broke down like this. Four of them 508 00:32:18,400 --> 00:32:22,400 Speaker 1: were able to complete seven of the tasks, only one 509 00:32:22,880 --> 00:32:26,800 Speaker 1: was capable of completing just six tasks. You had two 510 00:32:26,800 --> 00:32:31,560 Speaker 1: teams that completed five, two teams that completed four, four 511 00:32:31,640 --> 00:32:36,080 Speaker 1: teams that completed three, two that completed two, one that 512 00:32:36,120 --> 00:32:39,840 Speaker 1: completed one, and four teams weren't able to complete any 513 00:32:39,920 --> 00:32:43,719 Speaker 1: of the tasks successfully. Now, the whole point of 514 00:32:43,720 --> 00:32:47,880 Speaker 1: this was to really push the state of the art forward, 515 00:32:48,360 --> 00:32:51,480 Speaker 1: to have engineers think, what would you need to do 516 00:32:51,600 --> 00:32:55,000 Speaker 1: to build a robot capable of actually responding to real 517 00:32:55,040 --> 00:32:59,440 Speaker 1: world situations?
What are the challenges that are in the 518 00:32:59,480 --> 00:33:03,240 Speaker 1: way? How can we advance the technology, both in 519 00:33:03,280 --> 00:33:07,360 Speaker 1: the hardware and in the software, to overcome these challenges? 520 00:33:08,000 --> 00:33:11,360 Speaker 1: And the goal was not to build a super robot 521 00:33:11,440 --> 00:33:13,800 Speaker 1: that's going to save the world. This is not, you know, 522 00:33:13,920 --> 00:33:16,920 Speaker 1: Avengers: Age of Ultron. That's not what's going on here. 523 00:33:16,960 --> 00:33:21,680 Speaker 1: It's all about incremental improvements in the art and discipline 524 00:33:21,680 --> 00:33:25,800 Speaker 1: of robotics so that the next generation of robots can 525 00:33:25,920 --> 00:33:29,360 Speaker 1: benefit from the research and development done in this first 526 00:33:29,400 --> 00:33:32,840 Speaker 1: generation. And for a good way of seeing how this plays 527 00:33:32,880 --> 00:33:35,840 Speaker 1: out in the real world, because, you know, you might think, well, 528 00:33:35,880 --> 00:33:39,239 Speaker 1: that was a clever demonstration, but what can we 529 00:33:39,280 --> 00:33:43,280 Speaker 1: expect here in our day to day lives, take a 530 00:33:43,320 --> 00:33:47,680 Speaker 1: look at how autonomous cars are coming along, because of course, 531 00:33:47,760 --> 00:33:51,480 Speaker 1: DARPA held the Grand Challenge back in the mid to 532 00:33:51,560 --> 00:33:54,880 Speaker 1: late two thousands, and that was the challenge in which 533 00:33:55,040 --> 00:33:58,480 Speaker 1: different groups tried to build self driving cars that could 534 00:33:58,520 --> 00:34:02,120 Speaker 1: complete a course, whether it was in the desert or a 535 00:34:02,240 --> 00:34:07,560 Speaker 1: simulated urban environment.
And we are now seeing people who 536 00:34:07,720 --> 00:34:11,719 Speaker 1: worked on the various teams that competed in that 537 00:34:11,800 --> 00:34:16,320 Speaker 1: DARPA Grand Challenge building what will be the first generation 538 00:34:16,320 --> 00:34:19,120 Speaker 1: of driverless cars that will eventually make it to 539 00:34:19,120 --> 00:34:24,560 Speaker 1: the consumer market. Google has people on its team that 540 00:34:24,640 --> 00:34:30,440 Speaker 1: competed in those Grand Challenges. So we're seeing the expertise 541 00:34:30,520 --> 00:34:34,240 Speaker 1: that was developed as engineers were given a problem 542 00:34:34,280 --> 00:34:36,560 Speaker 1: and told, find a way to solve this, 543 00:34:36,560 --> 00:34:39,120 Speaker 1: come into play in the real 544 00:34:39,160 --> 00:34:41,640 Speaker 1: world now. And sure, it's gonna be a few 545 00:34:41,680 --> 00:34:46,520 Speaker 1: more years before we get autonomous cars as a reality 546 00:34:46,560 --> 00:34:50,520 Speaker 1: that an actual human being can go out and purchase, 547 00:34:50,600 --> 00:34:54,040 Speaker 1: as opposed to a representative of a company or, you know, 548 00:34:54,440 --> 00:34:58,000 Speaker 1: someone who is there to demonstrate 549 00:34:58,080 --> 00:35:00,560 Speaker 1: the viability of the technology. It's going to be a 550 00:35:00,600 --> 00:35:02,880 Speaker 1: few more years before this becomes something that you or 551 00:35:02,920 --> 00:35:06,920 Speaker 1: I could go to the dealership and actually purchase. But 552 00:35:07,280 --> 00:35:10,000 Speaker 1: the reality is there on the horizon. Same here with 553 00:35:10,080 --> 00:35:15,280 Speaker 1: the DARPA Robotics Challenge.
It will likely take a decade 554 00:35:15,360 --> 00:35:19,040 Speaker 1: or more to develop a humanoid robot, or a robot 555 00:35:19,080 --> 00:35:24,360 Speaker 1: capable of operating within a world designed by humanoids for 556 00:35:24,520 --> 00:35:29,960 Speaker 1: humanoids, and do so seamlessly. It will take a lot 557 00:35:29,960 --> 00:35:33,480 Speaker 1: of time and a lot more development, but the foundation 558 00:35:33,560 --> 00:35:38,160 Speaker 1: has been laid, so I'm particularly excited to see what 559 00:35:38,280 --> 00:35:42,200 Speaker 1: the future holds for us based upon these results. And 560 00:35:42,239 --> 00:35:45,800 Speaker 1: while those montages of robots falling over are really funny, 561 00:35:46,200 --> 00:35:50,640 Speaker 1: I am actually optimistic about what robotics will be able 562 00:35:50,880 --> 00:35:54,600 Speaker 1: to accomplish in the future. I'm just also realistic that 563 00:35:54,640 --> 00:35:57,880 Speaker 1: it's going to take time. We're not on 564 00:35:57,920 --> 00:36:02,600 Speaker 1: the verge of a Terminator-like future. The state of 565 00:36:02,640 --> 00:36:06,640 Speaker 1: the art in artificial intelligence has gone very far 566 00:36:06,719 --> 00:36:09,759 Speaker 1: in certain realms of computer science, but when it comes 567 00:36:09,760 --> 00:36:12,040 Speaker 1: to robotics, there are a lot of problems 568 00:36:12,080 --> 00:36:15,080 Speaker 1: that are still very difficult. You know, vision is 569 00:36:15,080 --> 00:36:19,920 Speaker 1: a big one. And again, just being able to 570 00:36:19,960 --> 00:36:22,680 Speaker 1: maintain balance, again one of those things that we take 571 00:36:22,680 --> 00:36:27,760 Speaker 1: for granted, is really a challenging problem. I hope you enjoyed 572 00:36:27,800 --> 00:36:32,239 Speaker 1: that classic episode about the DARPA Robotics Challenge.
If 573 00:36:32,320 --> 00:36:34,640 Speaker 1: you have suggestions for topics I should cover in future 574 00:36:34,680 --> 00:36:37,440 Speaker 1: episodes of TechStuff, please reach out and let me 575 00:36:37,480 --> 00:36:40,960 Speaker 1: know what you're thinking, because otherwise I'll never know. The 576 00:36:41,000 --> 00:36:43,279 Speaker 1: best way to reach me is to use Twitter. The 577 00:36:43,400 --> 00:36:47,799 Speaker 1: show's handle is TechStuff HSW, and I'll talk to 578 00:36:47,800 --> 00:36:56,839 Speaker 1: you again really soon. TechStuff is an iHeartRadio 579 00:36:56,920 --> 00:37:00,680 Speaker 1: production. For more podcasts from iHeartRadio, visit 580 00:37:00,719 --> 00:37:03,759 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 581 00:37:03,840 --> 00:37:09,719 Speaker 1: listen to your favorite shows.