Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech, and you are listening to a classic episode of TechStuff. This one is about the sad tale of hitchBOT, a robot designed to hitchhike across North America, and how it journeyed across Canada and then met a rather untimely end in the United States. Spoiler alert. And this episode features guest host Scott Benjamin, who joins us to talk about hitchBOT and its somewhat tragicomic tale. I hope you enjoy this classic episode.

In July two thousand fourteen, a hitchhiker began a historic journey from Halifax, Nova Scotia, to get to Victoria, British Columbia, on the other side of Canada. We're talking crossing the entire width of Canada. And if you were to do that on the most efficient route possible, if you got to choose, the route would be at minimum around three thousand, six hundred forty-four miles, or five thousand, eight hundred sixty-four kilometers. However, hitchhikers rarely have the ability to call exactly what route needs to be taken. They are at the mercy of the drivers that pick them up. Sure, I'm just... I'm headed west. Exactly, take me as far as you're going. And not that I condone hitchhiking or anything like that. That's kind of dangerous. It can be, and we will definitely get into some dangerous territory as part of this conversation. Well, the full trip took closer to ten thousand kilometers, or about sixty-two hundred miles. And here's the weird part: the hitchhiker was a robot. Ah, that is weird. Yeah, not a person. I've never picked up a hitchhiker. In fact, for a long time, I had never even seen one. I saw a lot in Hawaii. Surfer culture is still going strong. Yeah, there's certain parts of the United States where you can expect to see more hitchhikers than other parts.
Right. And, uh, Hawaii, I guess, would be one of those places. I've been there too, and I know exactly what we're talking about. I think I've seen more hitchhikers in Hawaii standing at bus stops attempting to hitch a ride, just hoping to catch a ride before that bus shows up. You know, it's like one or the other. Eventually a bus will come, but in the meantime, if I can just get a free ride down the road, that's what I'll do. I've just seen lots and lots of surfers trying to get to the beach, trying to catch that next wave. That's right, man. You can't... the waves wait for no one.

And so we're talking about hitchBOT, which a lot of you have probably heard about. hitchBOT first made the news in two thousand fourteen during this historic attempt to get a robot to hitchhike across all of Canada. Successful? It was successful, so spoiler alert there. And then it went on to do this again in Germany. It went all over Germany, and it also took a little vacation in the Netherlands. And then finally there was an attempt for this robot to hitchhike its way across the United States, which was cut short. Just like the robot. This all sounds so nice. What could possibly go wrong?

So you might be wondering, I've heard about this, but what actually is going on? The first thing I want to say is, while we're talking about a hitchhiking robot, honestly, if I were to describe this, I would not have used the word robot. Yeah, they're using the term very loosely here. I think it's because the form factor makes it look sort of like a robot. And it definitely had the benefit of having a light display that made a very simple smiley face, so you had kind of a, you know, a head that you could identify. But really we're talking about a hitchhiking computer. It's just a hitchhiking computer that was in a more or less static robot body.
It's simple, but they found a way to anthropomorphize this thing to the point where people looked at it and said, oh, that's kind of cute. Yeah, because it has a torso, it's got arms and legs. The arms and legs don't move, and the torso is static. The torso is a bucket. Um, literally a bucket? Yeah, literally a bucket. It does have solar panels that are arranged on the outside of the bucket, so that's one of the ways that the robot gets electricity. The other is that it will ask people to plug it into the lighter socket in a car. Yeah, it's only about three feet tall, so it's very small. It's waterproof, which is kind of surprising. It's waterproof, but... well, and also, I mean, it wears waterproof boots. Every single article, every article without fail, talks about the fact that it wears wellies, the brand-name boot. Yeah, okay, like a rubber boot. It's named after the Duke of Wellington. Its arms are what, pool noodles? So, you know, it's not like they went to great lengths to try to make this thing look human or anything like that. But it does have arms and legs. It does have a face, as you mentioned. It has, um, what almost looks like a Tupperware container on the top that looks like a hat, a beret or something like that. Um, that's designed to actually protect the electronics, the tablet computer that is running the software that the robot uses. Likely part of the waterproofing overall, I guess. And, um, it's GPS-equipped. The thing weighs about twenty-five pounds total, so it's not that heavy. Right, and it has its own built-in seat. It's like a car seat, almost. Well, it is a car seat, a kid's car seat, that you can then put in your car and buckle in, so it's secure when it's in there. It's not gonna fly around the vehicle loose if something were to happen. Right. And its legs are not powered; they are static.
So the way that you would set this up, when you are done carrying it as far as you want to carry it, is that the seat also has essentially a lever that can fold down into a tripod-like position, so the two legs act as two of the legs of the tripod and this lever acts as the third. And if you were to pick it up, you could fold that arm back up against the seat, which would allow you to put it into your vehicle and secure the car seat. And that's if you were in an area where you wanted to set it up on the side of the road that was, you know, like a field or something, where there's nowhere for it to sit, like on a bench or maybe on a wall or something. Right, right. And, uh, you know, like I said, it was running on essentially a tablet PC, and if you looked at all the equipment, according to the website, uh, it cost about a thousand dollars, maybe a little less. And that was a calculated decision. They, being the team behind this, and I'll talk about them in a second, wanted the robot to be inexpensive enough that it would not be an obvious target for someone to just steal the components out of it. They wanted it to be, uh, accessible. They wanted it to be cute. They wanted it to be something that people would want to interact with, and to have enough of an ability to have interactions, including holding a conversation. Sort of. Yeah, kind of. Yeah, we're being real generous with the term conversation. That's one thing that you and I talked about off air, is that every conversation we've ever seen with this, you know, between a human and hitchBOT, it was awkward, to say the least. I mean, it kind of picked up on what the human was saying, but not entirely. It didn't quite get the gist of the conversation, and it would respond in an awkward way.
Yeah, it had a microphone so it could pick up on what people were saying, and a speaker so it could then communicate back. And a big problem with that was probably the vast array of dialects it was dealing with. Right, and just the fact that our spoken language is incredibly plastic and adaptive, and there are so many different ways to say the same thing, that it can be difficult. You know, it's like a non-native speaker of any language: you might be taught how to say or ask for something in a very specific way, and that specific way is still correct, it's not incorrect, but it's just one way to say that, to express that thought. And in most languages there are lots of different ways to express the same thought, and you're familiar with one of them, so if anyone comes up to you and uses a different one, you could be completely confused, even though you know one way of saying it that you understand; all these other ways you don't. Same thing with computers. If you train a computer using machine learning on what certain phrases mean, that's great, the computer might be able to identify that. But if someone were to ask for the same sort of thing but word it slightly differently, that can be enough to throw a computer off, because these are subtle things that we humans can intuitively grasp, but computers lack intuition. And there's not only the word problem; there's also the way that it's said. So, you know, the regions, the zones. Like, you know, here in the South, people talk different than they do in the Pacific Northwest, so I've heard, or the Northeast, or, you know, the South. People in the Northeast talk faster than I can hear. Yeah. So it's different. It's different, not only, um, you know, just the different languages. Like, of course this thing had to learn German, had to learn, uh, you know, Dutch I guess, had to know French to get through all of Canada.
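To make that wording problem concrete, here is a toy sketch in Python. It is not hitchBOT's actual software; the phrases and intent labels are invented for illustration. A matcher that only knows exact wordings returns nothing for a valid paraphrase, while even a crude word-overlap score generalizes a little:

    # Toy illustration of the phrase-variation problem; not hitchBOT's real code.
    # The known phrases and intent labels below are made up for this example.
    KNOWN_PHRASES = {
        "can you take me west": "request_ride",
        "please plug me into the lighter socket": "request_charge",
    }

    def exact_intent(utterance):
        """Understands only the exact wordings it was given."""
        return KNOWN_PHRASES.get(utterance.lower().strip())

    def overlap_intent(utterance, threshold=0.3):
        """Crude generalization: score each known phrase by shared words (Jaccard)."""
        words = set(utterance.lower().split())
        best_intent, best_score = None, 0.0
        for phrase, intent in KNOWN_PHRASES.items():
            ref = set(phrase.split())
            score = len(words & ref) / len(words | ref)
            if score > best_score:
                best_intent, best_score = intent, score
        return best_intent if best_score >= threshold else None

    print(exact_intent("could you take me out west please"))    # None: wording differs
    print(overlap_intent("could you take me out west please"))  # request_ride

Real systems use statistical language models rather than simple word overlap, but the failure mode the hosts describe, a perfectly valid rephrasing falling outside what the system was trained on, is the same.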
Sure, yeah, exactly right, and of course English. And, you know, not only that, but the different dialects along the way. Sure, yeah. So lots of challenges here. But the whole goal... the goal, really, I don't think, was to have a robot hitchhike from one end of Canada to another. That was sort of the face of this project. The actual goal was more of an artistic expression, as well as an experiment in robot-human interactions. Because the thing could have been picked up right at the start, right on the very first day, and driven all the way across by one person, you know, on a long-haul truck or something, and where would be the adventure in that, where would be the fun? The idea was that this relies on human interaction and human kindness to get this thing from one place to the next, to kind of take care of this thing, and along the way have kind of a checklist of things that they wanted it to do. Now, I don't know if the Canadian trip had a checklist. I don't think it did. Uh, the USA trip did have a checklist, which was started, um... But the Canadian trip, um, you know, just for instance, it took twenty-six days to get across, um, you know, the entire nation there. But it did things like attend a wedding, um, it was dancing in Saskatchewan. It met some of Canada's First Nations people, uh, some of the Aboriginal people, um, that were, um, you know, native to Canada. And it did all kinds of things. I mean, it went to, you know, parks, went to scenic locations, and all the time it was snapping photographs, because it was programmed to take a photo every twenty minutes, and it could tweet out that information. So its interactions were, uh, doubled, in that it could interact in person with people who were around it and actually attempt to have a conversation and interject. In fact, I read one report of people who had picked up the robot, and they said, yeah, it was weird.
There were three of us in the car plus the robot, and the three of us would be talking, and the robot would interject and interrupt us, and often say something that was completely not connected to any of the rest of the conversation. And I thought, I've ridden in cars with people like that, you know? Someone just pipes up with a total non sequitur and you think, are we in the car with a crazy person? What's that have to do with the price of eggs in China? Yeah, something along those lines. And so, uh, you know, that was definitely part of it. I think there was certainly an element of let's see how humans treat robots, and also let's see how we can design a robot that from the get-go is meant to interact with humans. Because one of the things we're starting to see increasingly in technology is the robotic sphere and the human sphere colliding. And by design: we want robots in our lives. People have named their Roombas, for example. They have given their Roombas, uh... you know, they've imprinted upon them this idea of a personality. If something happens to that Roomba, you know, I'm sad. Well, I'd be sad to be out, you know, two hundred bucks on that. That's another thing, there's that side. But you know, if something happens to Fred... Yeah, don't name your Roomba. Right. Yeah, but it's true, a lot of people do name these robots. People get emotionally invested in these machines, and so there is this growing field of research into human-robot interactions. How can we, one, capitalize on this need for humans to have an emotional attachment to these otherwise emotionless beings, these beings that lack consciousness, lack emotions? How can we capitalize on that so that the interactions are useful and meaningful in some way, even if it's only meaningful for the human?
If it's impossible for it to be meaningful for the robot, that's okay, if it's still meaningful for the human. Or, how do we design robots that are specifically meant to not evoke that reaction, because it would just be, you know, another distraction from whatever the robot is supposed to do? Or maybe whatever the robot is supposed to do is inherently dangerous, and you don't want to encourage human interaction. If a robot is meant to do something like dig into rubble, you don't want people to, you know, worry about the robot. The whole reason the robot's digging into rubble in the first place is likely to look for survivors in the fallout of a building collapse or something. Sure, or, well, just to prevent having to have a human do the same thing. Yeah, exactly. And that's really, I mean, the way that a lot of robotics experts see robots really taking off, at least in the near future, is that they'll be used to do jobs that are either too dirty, dull, or dangerous for humans. So jobs that are incredibly repetitive and don't require much thought, robots are perfect for that. Also, robots don't sustain, like, repetitive injuries. You know, you do have to continuously maintain them. You can't just expect them to work forever. But they don't get carpal tunnel syndrome, for instance. Unless they can fix themselves. Right, then we will get to that point eventually. And dangerous: obviously you would want to be able to use a robot in a dangerous situation so that you're not putting human life at risk, but with the goal of being able to use it again and again and again. Right. But those robots probably don't need to have a lot of human interactivity. They're designed to do something where they're replacing a human, not interacting with a human.
But at the same time, we are seeing this growing industry of robots that are designed to be around us in our daily lives, either as a telepresence-style robot, where the robot is standing in as a surrogate for an actual person, and you might have like an iPad or something like that as a head where someone can Skype in. And this is always creepy whenever I see it done anywhere, but I keep being told it's the way of the future. I have never actually interacted directly with one in an official capacity, but I've seen them at CES. Now I'm gonna make a reference to something that I have no personal interaction with. I believe my wife was telling me about a movie recently called Her, and it was about a man who fell in love with the operating system, and, uh, the voice that that operating system had. Now, I can see something like that. Let's say you get a refrigerator, and refrigerators have screens on them now. Ours does, here at How Stuff Works. It's got a screen, but it doesn't talk to us. But it does have a screen you can interact with, and there's a lot of different things you can do with that screen, including putting a wacky background image on it, as someone did when they put on the, um, the symbol for the evil organization from Lost. Um, I could see, if it was talking to you every day and had a likable voice, something that you felt comfortable interacting with, um, that, you know, I could see somebody saying, why, I'd be sad to get rid of that refrigerator in five years. Well, especially if you would come home after a long day and your refrigerator says, hi, would you like a frosty adult beverage? I mean, you know you're gonna have a bond with that machine immediately. Yeah. So there's this whole discipline that's coming up, like, how do we define these interactions? How do we shape them?
And a lot of it means you have to do study on both sides. You have to do a study on the robot side, like what works and what doesn't, and you actually have to study human psychology: how do humans respond to robots, and at what point do humans end up treating robots as if they are alive, as if they're living creatures? And for a while people were thinking, um, well, the robot's gonna need to look like something biological, like it's gonna have to be like a robot dog, or a robot, you know, android-type person, almost like a crash test dummy, where it looks like a human but you can tell it's not a real human. And it turns out that's not necessarily true, because, as we've already said, people have been naming their Roombas, people get emotionally invested. It turns out that if it looks animate, if it appears to behave based upon its own decisions, whether that's true or not, if it looks like it's doing that, we start to kind of, in our minds, give it these qualities.

Scott and I have a lot more to talk about in this classic TechStuff episode, but first let's take a quick break to thank our sponsor.

So a lot of this really was studying that, like, this idea of the way people and machines are interacting, and how that is becoming defined over time, and what we might need to think about in that respect. And also it's just kind of, you know, a happy story about how people find joy in, uh, a silly... I mean, really, when you get down to it, it's a silly robot. Not a bad robot, it's a silly robot. And the experience of discovery and sharing that with other people, that was a big part of this project too. And it was really successful for three out of the four big things that it did. The one that wasn't so successful was the United States. So, um, really quickly, before we get into the US stuff,
I was going to talk about some of the folks who designed and came up with this idea. Uh, the two leads who first came up with the concept for hitchBOT were David Harris Smith and Frauke Zeller, and I probably am mispronouncing Ms. Zeller's first name. It's a name that I was not familiar with. Totally new for me. And they're out of Port Credit, Ontario. Yes. And, uh, Smith is an assistant professor at McMaster University in the Department of Communication Studies. Zeller is an assistant professor in the School of Professional Communication at Ryerson University. Communications professors. Now, this makes perfect sense, because they're fishing for the way people interact with this. They want to find out exactly how people respond to this, how it responds to them. Uh, this interaction is really, really interesting for these people in particular, I'm sure. Right. And Zeller, she got her PhD, and her thesis was on human-robot interaction. Well, this is it, then. Yeah. And they were joined by a lot of other people. I've just got a couple of names I'll mention, but the team itself is quite large. You can actually read up on all of them on the website. It's funny, because the way the website is written, it's written from hitchBOT's perspective. So it's hitchBOT saying, oh, this is the person who helped me learn how to talk, and, it's very cute, this is the person that takes care of my electronics on a daily basis. I think there are about fourteen or fifteen people on that team. It's a big team. It is a large team, not just the two leads here. Right. So you've got people like Colin Gagich, who is a developer of hitchBOT, and he's also a McMaster University student. He helped design and test hitchBOT to make sure it would be able to withstand the various environments that it would encounter. Keep in mind, this was summer in Canada, so it wasn't going to have to deal with a Canadian winter, just the summer.
Yeah, it's a little bit different from the Atlanta summers. Slightly less warm and humid. About sixty degrees cooler. Fahrenheit, that is, not Celsius; that would be pretty incredible. Uh, so then you had Davin Bigelow, who was an undergraduate student at McMaster who worked on the conversational skills of this robot. Karen Veal birth Fish, who was another person who worked on hitchBOT's language skills. Dominic kal Kinn, who was an undergraduate student at McMaster whose job was to monitor hitchBOT's status and make sure the robot was okay. So again, the robot was fitted with GPS and 3G capability to essentially report back home, saying here's where I'm at at any given time. And every twenty minutes he's getting a photograph sent from this robot to him, to kind of update the status of where it is right now. Yeah. And then, uh, there was the big brother robot to hitchBOT, kulturBOT, K-U-L-T-U-R. Yeah, this was a robot that preceded hitchBOT. This was a different human-robot interaction experiment. kulturBOT's job was to attend artistic exhibitions, take images of what was going on, tweet them, and critique them. It was a robotic art critic. Interesting. Anyway, so it would actually do the critique on the fly, like right there at the event. That's how it was described, but I didn't read enough into it to find out how this actually worked. Like, I don't know if it was capable of stringing together any words just based upon what it was seeing. I don't know if it had human intervention, where the human was the one actually providing the caption. I don't know the answer to that. But I do know that that was essentially another project that was being performed by much of the same team, and hitchBOT was kind of the next step. Not directly connected; it was just one of those ideas that Smith and Zeller came up with that they thought was a really interesting concept.
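As a rough sketch of the kind of check-in loop described here, every twenty minutes the robot would need to read a GPS fix, grab a photo, and report home over the cellular link. This is an illustration only, not hitchBOT's actual software; the endpoint URL and the hardware helper functions are hypothetical placeholders.

    # Sketch of a periodic status report: GPS fix + photo every 20 minutes.
    # Not hitchBOT's real code; the URL and hardware helpers are placeholders.
    import json
    import time
    from urllib import request

    CHECKIN_URL = "https://example.org/hitchbot/checkin"  # hypothetical endpoint
    INTERVAL_SECONDS = 20 * 60  # one report every twenty minutes

    def read_gps():
        """Placeholder: return (latitude, longitude) from the GPS module."""
        return 44.6488, -63.5752  # Halifax, the Canadian trip's starting point

    def capture_photo():
        """Placeholder: trigger the camera and return the saved image path."""
        return "/tmp/hitchbot_latest.jpg"

    def check_in():
        lat, lon = read_gps()
        payload = {"time": time.time(), "lat": lat, "lon": lon, "photo": capture_photo()}
        req = request.Request(
            CHECKIN_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        request.urlopen(req)  # send the report over the cellular (3G) link

    while True:
        try:
            check_in()
        except Exception as exc:  # keep hitchhiking even if one report fails
            print("check-in failed:", exc)
        time.sleep(INTERVAL_SECONDS)

The design choice worth noting is that the monitoring is one-way telemetry: the team watched the robot's location and photos, but the robot itself did nothing with that data beyond sending it home.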
So after going through Canada, it went to Germany. It had a lot of adventures in Germany: went to castles, went to another wedding. There's a great picture of a bride giving hitchBOT a little kiss. That's just adorable. Yeah. And, uh, lots of stories of people... All of hitchBOT's journeys, by the way, are chronicled on the website. There are blog posts that tell what happened on each day. Some of them also have embedded videos of the stuff that went on, and also photographs. It's very cute. Then, after Germany, it went to the Netherlands for a brief while in the early summer, for a bunch of activities and events that I cannot pronounce. Yes, I'm not even going to attempt it, a series of festivals with unpronounceable names, at least for the American tongue. And then, uh, it moved over to the good old US of A. Yeah, it started in Boston, right? It was gonna go from Boston to San Francisco. That was the goal. That was the goal. And it also had a bucket list, which is appropriate, since it was a bucket. I have the bucket list in front of me now. Uh, the bucket list has a couple of check marks on it now. One check mark was, um, to do the wave at a sports game, anywhere, it didn't matter where it was, just to do that. Uh, the other one was to see the lights in Times Square, of course, in New York City. And there were others, there's other stuff along the way, and I'll just mention a few of these, because there's probably... the Grand Canyon has to be on there, I'd guess. Let's see, Grand Canyon, um, uh, you know, I'll have to check. Yes: see the jaw-dropping views of the Grand Canyon, that is one, yes, in Arizona. Um, pose with the Lincoln statue in D.C. was another one. Tan at Myrtle Beach. Um, experience the magic of Walt Disney World in Florida. So it was going to need to actually go south along the Eastern Seaboard.
I mean, for people who are not from the United States and aren't familiar with our geography, if you were going Boston to San Francisco, you would essentially be setting your sights west. Yeah, just go west, just keep on adjusting your journey in order for you to get to California. Exactly. But with this list, it means that you first would need to go south, because you would have to go south from Boston, well, to get to New York City, but also to D.C. and to Florida. And then this goes all over the place, I mean all over the Midwest. So there are things to do in Illinois, like explore the Cloud Gate in Millennium Park, um, stand under the Gateway Arch in Missouri, just all kinds of things like this. And again, it's a relatively long list with many different states and many different activities, and it checked off two items on that list. Yeah. And the reason that only two items were checked off is that hitchBOT met an untimely demise at the hands of a vandal. Maliciously murdered? Yes, decapitated. Can we just say murdered when it's a robot? I mean, it was... I mean, disassembled. Yeah, that's what Johnny Five was scared about. No disassemble, Johnny Five. Um, yeah. So I'm guessing disassembled is probably the best way of putting it. At some point, you figure there's gonna be another robot getting a delivery with a cardboard box, and it'll just be, what's in the box? Uh, and they'll have the binary code for seven as the title of that. Very clever, very clever. All right. So this is a weird, weird ending. So, um, it is the night prior to this. So it's what, the end of July, right? Because I think this all went down on August first, that's when we heard. Exactly, so July thirty-first, um, I think it was a Saturday night, and hitchBOT was in Philadelphia, hanging out with a vlogger by the name of Jesse Wellens.
And it is well documented what they did during their evening, because, you know, Wellens, being a YouTube personality, um, took it out on the town, kind of did a lot of different things with it. There were other people involved. There was another guy, um, there with him. His name was Ed Bassmaster, who was another YouTube personality, and they had kind of a fun evening. Yeah, the whole idea was that this was, I mean, it was elevating hitchBOT's profile and elevating the YouTubers' profiles. This is a dream come true for a YouTube personality, because it gives you the chance to interact with a meme while it's happening. You're not capitalizing on it afterwards. This is like a once-in-a-lifetime type deal. It can garner you international attention immediately. Yeah, it really would, because people were tracking this thing. People were watching exactly where this was, and they knew, you know, when it was in their city. They knew where it was. They could walk... I mean, if it said it's been sitting here at this corner of, you know, Main and Elm Street for the last twenty minutes, you could go down to Main and Elm Street and look at this thing, or pick it up and give it a ride yourself. Or two... In fact, in the early part of the trip, we didn't talk about this, but it took a long time for it to leave the Boston area. People in Boston were taking it to different parks and different, you know, they were taking it out on boats and things, and taking, you know, selfies with it. And, um, it took, I think, more than seven days for it to get out of the Boston area, which I think the team would have found wonderful, because what was happening was the robot was gathering a series of rich experiences, and the people were getting the experience of interacting with the robot, which is the purpose of this thing in the first place.
So having it take a really long time to get out of any area would not be considered an impediment on the part of the people running the project. They, I'm sure, loved it. Oh yeah, in no way is that a failure. That's a win, in fact. So, you know, here it is after this fun evening on the thirty-first, and they placed it on a bench in the middle of the night. You know, it's late at night, of course it's dark. They placed it on a park bench, or not a park bench, but a bench on the city street there. And that's it. I mean, you see a cab driver arrive, I think, and that's it. I mean, that's the end of the video interaction with hitchBOT at that point. And the next day we wake up to the news that hitchBOT has been murdered. Yes, its head had been removed from its torso and its arms ripped off. I gotta ask you this: when you saw the photograph of hitchBOT, because, now, this is a dramatic photograph, I mean, it did look a lot like a true crime scene photo, in that, horrific as this may be, I mean, hitchBOT's arms were pulled off and placed above its head, and, um, it was lying in a pile of leaves, headless at this point. I mean, there are real crime scene photos like that. Now, understand, this was starting to feel like the robot equivalent of a serial killer crime scene. It was laid out in this, like, staged manner, and, uh, very, very eerie, I guess. And there were actually websites, or, you know, blogs, that would say, I don't really feel comfortable showing you this image. Which is weird, because here it's just a bucket with a couple of noodle arms and some rubber boots. Right. If you saw this same collection of stuff in a hardware store, you would just think, oh, somebody just left their random shopping right here. That's pretty funny. I never thought of it that way.
Yeah, it's like, yeah, you could gather the stuff up at the Walmart and put it on the floor and not think twice about it. But now that we know that this is this, uh, this thing that they've created that has a name, uh, now it takes on a different twist, doesn't it? It takes on a different feel. Which, again, you might say, while it brings that particular part of the experiment to an end, it also says a lot, right? It also tells you a lot about robot-human interactions and where they can go. It sure does. And right away you would think, well, of course, Wellens and Bassmaster have done this. Those are the two people, the two characters, that were involved with this last. And then, lo and behold, three days later, two days later, or whatever, there comes this, uh, secret surveillance video that was taken from a nearby store that shows what happened. And someone wanders up in a football jersey and just kicks the living heck out of this thing and destroys it. But it's shown just off screen. We don't actually see the person kicking and destroying this thing. We see him kicking and destroying in the area of the bench where hitchBOT was known to be last. Right. And then later, uh, everyone was reporting that this video was essentially itself staged. Yeah, it's a fake video, because you can go to that scene and look at exactly where it was taken from, the same perspective. There's no camera there. Yeah, so it's a fake. And so what's going on here? Because they never have found the head of hitchBOT. I mean, they never found, I guess, the CPU, the tablet PC, the thing that would tell them what is happening, like, what happened to it. Yeah. So the best guess is just that somebody scavenged it.
But 594 00:32:05,760 --> 00:32:07,840 Speaker 1: you know, in fact, the people behind the scenes, 595 00:32:07,880 --> 00:32:11,320 Speaker 1: the people behind the project, have said, we 596 00:32:11,400 --> 00:32:13,880 Speaker 1: don't care to identify the person who did it or 597 00:32:13,920 --> 00:32:16,440 Speaker 1: why they did it. That's not important to what we 598 00:32:16,440 --> 00:32:19,320 Speaker 1: were trying to do. With it taking photographs every twenty minutes, 599 00:32:19,360 --> 00:32:21,720 Speaker 1: I wonder if it captured something and sent it without 600 00:32:21,800 --> 00:32:25,280 Speaker 1: the person, you know, the perp, knowing what 601 00:32:25,440 --> 00:32:28,480 Speaker 1: had happened, or if it just happened to 602 00:32:28,520 --> 00:32:31,200 Speaker 1: fall within that time frame between when the photos are 603 00:32:31,200 --> 00:32:34,480 Speaker 1: taken and didn't capture anything, and it's just like everybody else, 604 00:32:34,480 --> 00:32:38,400 Speaker 1: they don't know anything. It's possible either way. You know, 605 00:32:38,720 --> 00:32:42,680 Speaker 1: of course, there was a huge reaction to this, both ways. Yeah. Yeah, 606 00:32:42,680 --> 00:32:45,400 Speaker 1: there were people who were saying, this is awful, this 607 00:32:45,440 --> 00:32:48,680 Speaker 1: is the worst. You know, it really reflects poorly on 608 00:32:48,720 --> 00:32:53,120 Speaker 1: the United States that a robot that was capable of 609 00:32:53,160 --> 00:32:56,280 Speaker 1: safely traveling from one coast of Canada to the other, 610 00:32:56,480 --> 00:32:59,560 Speaker 1: and also in Germany and also in the Netherlands, gets 611 00:33:00,200 --> 00:33:02,960 Speaker 1: barely into its journey here in the United States before 612 00:33:03,040 --> 00:33:12,040 Speaker 1: it's destroyed. Yeah, that was a telling condemnation, 613 00:33:12,200 --> 00:33:14,520 Speaker 1: if you will, of the United States in general and 614 00:33:14,560 --> 00:33:17,640 Speaker 1: Philadelphia in particular. You can think of all the different 615 00:33:17,640 --> 00:33:20,160 Speaker 1: comments that were immediately happening afterwards. A lot of people 616 00:33:20,320 --> 00:33:22,440 Speaker 1: said, well, this happens to real hitchhikers as well, 617 00:33:23,360 --> 00:33:25,240 Speaker 1: it happens to people. Well, and then there were a 618 00:33:25,240 --> 00:33:27,840 Speaker 1: ton of comments that said, well, of course this happened 619 00:33:27,840 --> 00:33:30,240 Speaker 1: in Philadelphia. Yeah, there's that. And there's another group of 620 00:33:30,240 --> 00:33:33,080 Speaker 1: people that would just respond with something like, big deal, 621 00:33:33,120 --> 00:33:37,240 Speaker 1: it was a bucket of bolts anyway, it was a machine. Yeah, 622 00:33:37,280 --> 00:33:41,520 Speaker 1: and I mean, again, this ties right back 623 00:33:41,560 --> 00:33:45,400 Speaker 1: into those robot human interactions. You might think the experiment's 624 00:33:45,400 --> 00:33:48,160 Speaker 1: over. I would argue that the team probably says, no, 625 00:33:48,320 --> 00:33:51,360 Speaker 1: it's still going. It doesn't matter if the robot is gone.
626 00:33:52,000 --> 00:33:57,280 Speaker 1: The continuing conversation around this is still informing us and 627 00:33:57,400 --> 00:34:02,240 Speaker 1: still giving us a lot more data about how humans 628 00:34:02,320 --> 00:34:07,560 Speaker 1: view robots, how we can identify with them, how we can 629 00:34:07,600 --> 00:34:10,640 Speaker 1: imprint an emotional response onto them. And in fact, they 630 00:34:10,680 --> 00:34:12,799 Speaker 1: have tweeted out, you know, continuing to tweet out as 631 00:34:12,800 --> 00:34:15,640 Speaker 1: the robot, saying the robot still loves people, still loved 632 00:34:15,640 --> 00:34:19,200 Speaker 1: its adventures, which I think needles it even more, right. 633 00:34:19,239 --> 00:34:21,640 Speaker 1: You made the robot this innocent creature. It reminds me 634 00:34:21,680 --> 00:34:25,680 Speaker 1: a lot of the various Mars rovers, or 635 00:34:25,719 --> 00:34:29,480 Speaker 1: the Phoenix Lander specifically, when it was 636 00:34:29,640 --> 00:34:31,800 Speaker 1: nearing the end of its mission. Like, it had gone 637 00:34:31,840 --> 00:34:35,520 Speaker 1: well beyond what the mission parameters were, and it was 638 00:34:35,680 --> 00:34:38,120 Speaker 1: to a point where it was no longer going to 639 00:34:38,160 --> 00:34:41,480 Speaker 1: get enough sunlight to recharge its batteries, and essentially 640 00:34:41,480 --> 00:34:44,800 Speaker 1: the social media team at NASA sent out a final 641 00:34:44,840 --> 00:34:48,200 Speaker 1: tweet from the robot, keeping in mind the robot was 642 00:34:48,360 --> 00:34:51,120 Speaker 1: never tweeting directly. It was always a human being taking 643 00:34:51,200 --> 00:34:55,040 Speaker 1: data from the robot and then messaging it out. But 644 00:34:55,120 --> 00:34:58,360 Speaker 1: people identified with that robot, and when that last tweet 645 00:34:58,440 --> 00:35:01,520 Speaker 1: came out, people cried. Isn't that crazy? Now they're 646 00:35:01,560 --> 00:35:04,680 Speaker 1: watching this thing and they're thinking, they're almost making it 647 00:35:04,760 --> 00:35:06,640 Speaker 1: into, like, it's like you're watching a dog or 648 00:35:06,680 --> 00:35:09,440 Speaker 1: something like that that's dying. That thing right there 649 00:35:09,520 --> 00:35:12,279 Speaker 1: is dying and I'm watching it happen. And they're not 650 00:35:12,440 --> 00:35:14,960 Speaker 1: understanding that. It's like, now, I know it's way 651 00:35:14,960 --> 00:35:17,800 Speaker 1: more complex than this, but it's not like, well, it's time to 652 00:35:17,800 --> 00:35:20,239 Speaker 1: get a new toaster. You know, it's not like a 653 00:35:20,280 --> 00:35:23,360 Speaker 1: machine that you don't really have any kind of personal 654 00:35:23,760 --> 00:35:27,680 Speaker 1: attachment to. Yeah, attachment. Like if my microwave were 655 00:35:27,760 --> 00:35:30,879 Speaker 1: to malfunction, I'd think, ah, what a pain in the butt, 656 00:35:30,880 --> 00:35:32,000 Speaker 1: I have to go and get a new one. But 657 00:35:32,040 --> 00:35:35,040 Speaker 1: you're not gonna cry. No. But again, that 658 00:35:35,719 --> 00:35:38,800 Speaker 1: goes right back into that idea of how do robots 659 00:35:38,800 --> 00:35:41,799 Speaker 1: and humans interact?
And we do need to think 660 00:35:41,840 --> 00:35:45,279 Speaker 1: about this, because if we enter into a world where 661 00:35:45,280 --> 00:35:49,640 Speaker 1: we start treating these relationships as really casual, when stuff 662 00:35:49,719 --> 00:35:56,400 Speaker 1: happens and people go through an actual grieving experience, we 663 00:35:56,480 --> 00:35:58,360 Speaker 1: won't be ready for it. But if we know ahead 664 00:35:58,360 --> 00:36:00,200 Speaker 1: of time, we can say, all right, you know what, 665 00:36:00,360 --> 00:36:02,799 Speaker 1: we know this about ourselves. This is something that 666 00:36:02,880 --> 00:36:07,520 Speaker 1: is innately human, at least for many people. Realize that. Yeah, 667 00:36:07,600 --> 00:36:10,480 Speaker 1: and then you think, all right, now I 668 00:36:10,520 --> 00:36:13,920 Speaker 1: can design a product and market a product 669 00:36:14,040 --> 00:36:18,440 Speaker 1: and do it in a responsible way that doesn't 670 00:36:19,200 --> 00:36:22,680 Speaker 1: say this is a weird, you know, aberration or anything. No, 671 00:36:22,840 --> 00:36:25,040 Speaker 1: this is a very human kind of trait that a 672 00:36:25,080 --> 00:36:28,239 Speaker 1: lot of people have. Sure, whatever elicits that response from 673 00:36:28,239 --> 00:36:30,439 Speaker 1: the humans, that's what you're looking for. Yeah, and so, 674 00:36:30,800 --> 00:36:33,040 Speaker 1: or at least that you account for it. Even if 675 00:36:33,080 --> 00:36:35,520 Speaker 1: that's not the purpose of whatever it is you're making, 676 00:36:36,000 --> 00:36:38,239 Speaker 1: you at least account for the fact that it exists. 677 00:36:38,920 --> 00:36:41,120 Speaker 1: We've got a little bit more to chat about when 678 00:36:41,120 --> 00:36:44,560 Speaker 1: it comes to Hitchbot, and to really kind of delve 679 00:36:44,600 --> 00:36:50,000 Speaker 1: into the psychology behind human and robot relations. And we'll 680 00:36:50,080 --> 00:36:52,240 Speaker 1: get right to that in just a second, but first 681 00:36:52,360 --> 00:37:02,680 Speaker 1: let's take another quick break to thank our sponsor. In 682 00:37:02,719 --> 00:37:06,359 Speaker 1: the fallout of all of this, we've seen not just 683 00:37:06,760 --> 00:37:10,920 Speaker 1: the condemnation of an act of violence against what seemed 684 00:37:10,960 --> 00:37:16,359 Speaker 1: to be an innocent creature that loved adventure and meeting people, or, 685 00:37:16,400 --> 00:37:20,160 Speaker 1: as Smith had called it, a story collecting and story 686 00:37:20,360 --> 00:37:24,600 Speaker 1: generating machine, like that was its purpose. We've also seen 687 00:37:25,880 --> 00:37:29,480 Speaker 1: people come together to say, we can't let this be 688 00:37:29,520 --> 00:37:32,000 Speaker 1: the end of the story. There are a 689 00:37:32,080 --> 00:37:35,879 Speaker 1: couple of different groups in Philadelphia that have said, let 690 00:37:36,120 --> 00:37:40,840 Speaker 1: us build a robot to continue on in the spirit 691 00:37:40,880 --> 00:37:44,680 Speaker 1: of Hitchbot, because we can't stand for Hitchbot's story to 692 00:37:44,760 --> 00:37:47,040 Speaker 1: have ended in the city that we call home. Sure, 693 00:37:47,160 --> 00:37:50,080 Speaker 1: so let's have a chance to make this right. Yeah, 694 00:37:50,160 --> 00:37:53,320 Speaker 1: and we're going to do it.
Yeah, so 695 00:37:53,160 --> 00:37:56,640 Speaker 1: the tech community in Philadelphia has responded with this and 696 00:37:56,719 --> 00:38:02,399 Speaker 1: actually received a more or less tentative thumbs up from 697 00:38:02,440 --> 00:38:06,040 Speaker 1: some of the members of the Hitchbot team to give 698 00:38:06,120 --> 00:38:10,080 Speaker 1: this a go. And so there's a group that 699 00:38:10,120 --> 00:38:14,319 Speaker 1: gathered at the Hacktory. The Hacktory. Yeah, it's like 700 00:38:14,320 --> 00:38:18,399 Speaker 1: a factory, the Hacktory, in West Philadelphia. This was 701 00:38:18,440 --> 00:38:21,600 Speaker 1: very recent when this happened. And they 702 00:38:21,680 --> 00:38:23,640 Speaker 1: have come up with an idea that they're calling the 703 00:38:23,760 --> 00:38:28,560 Speaker 1: Philly LoveBot, which, I don't like the sound of that. Yeah, 704 00:38:29,160 --> 00:38:32,680 Speaker 1: an odd choice for a name. Wait, but it's the city of 705 00:38:32,719 --> 00:38:35,239 Speaker 1: brotherly love. Okay, I don't want to take 706 00:38:35,239 --> 00:38:37,160 Speaker 1: this in the wrong direction or anything, but we're not 707 00:38:37,200 --> 00:38:39,400 Speaker 1: talking about a sexbot. Okay, okay, a sexbot. 708 00:38:39,400 --> 00:38:40,799 Speaker 1: I didn't want to just come right out and say it. 709 00:38:40,840 --> 00:38:42,719 Speaker 1: But it seems like I've heard of products like this. 710 00:38:42,800 --> 00:38:45,920 Speaker 1: Yeah, no, not that kind of love. But this 711 00:38:46,000 --> 00:38:49,360 Speaker 1: is more of a love for all mankind and robots 712 00:38:49,440 --> 00:38:51,480 Speaker 1: kind of thing. What kind of podcast have I wandered 713 00:38:51,520 --> 00:38:53,480 Speaker 1: into? Yeah, no, I'm not gonna do another bait and 714 00:38:53,520 --> 00:38:55,799 Speaker 1: switch on you, Scott. I already promised you I wasn't 715 00:38:55,800 --> 00:39:00,279 Speaker 1: gonna do that. Not doing it this time, honest. So 716 00:39:00,800 --> 00:39:04,680 Speaker 1: they said that the idea they have is they would 717 00:39:04,680 --> 00:39:08,399 Speaker 1: build a robot that was designed to be passed from 718 00:39:08,400 --> 00:39:11,239 Speaker 1: one person to another. So it's not designed to hitchhike 719 00:39:11,360 --> 00:39:15,799 Speaker 1: from one location to another location. There's no location requirement, 720 00:39:15,920 --> 00:39:19,759 Speaker 1: at least in their initial approach. Instead, what they want 721 00:39:20,080 --> 00:39:22,840 Speaker 1: is to design a robot such that when you 722 00:39:22,880 --> 00:39:25,520 Speaker 1: take possession of it, when someone gives it to you, 723 00:39:25,520 --> 00:39:29,400 Speaker 1: you are tasked with performing a good deed, however you 724 00:39:29,480 --> 00:39:33,280 Speaker 1: define it, and it gets documented by the robot itself. 725 00:39:33,360 --> 00:39:35,840 Speaker 1: The robot you take along with you to do whatever 726 00:39:35,880 --> 00:39:39,160 Speaker 1: this good deed may be, and then you pass it on. 727 00:39:39,239 --> 00:39:41,320 Speaker 1: It's like a pay it forward. You pass the robot 728 00:39:41,320 --> 00:39:43,759 Speaker 1: on to someone else, and it is their duty now to 729 00:39:43,840 --> 00:39:46,280 Speaker 1: go out and do a good deed.
And the idea is 730 00:39:46,400 --> 00:39:50,080 Speaker 1: to kind of atone for the horrible murder of 731 00:39:50,200 --> 00:39:54,839 Speaker 1: Hitchbot by promoting good deeds, and the robot is 732 00:39:54,920 --> 00:39:57,480 Speaker 1: kind of, almost, like a totem for that. I 733 00:39:57,520 --> 00:39:59,120 Speaker 1: was gonna say, I like this idea, but you could 734 00:39:59,160 --> 00:40:02,000 Speaker 1: do the same thing with a carved stick. You 735 00:40:02,040 --> 00:40:05,080 Speaker 1: could hand a carved stick to somebody. Now that you're 736 00:40:05,120 --> 00:40:06,920 Speaker 1: in possession of the stick, it's your duty to do 737 00:40:06,920 --> 00:40:09,160 Speaker 1: a good deed and pass the stick on to somebody else. 738 00:40:09,200 --> 00:40:11,480 Speaker 1: It doesn't have to be something that collects and 739 00:40:11,520 --> 00:40:13,920 Speaker 1: gathers the information. But I guess that keeps everybody kind 740 00:40:13,920 --> 00:40:16,160 Speaker 1: of honest, doesn't it. Well, yeah, and I think also, 741 00:40:16,239 --> 00:40:18,920 Speaker 1: you know, they're also planning on having 742 00:40:19,040 --> 00:40:21,880 Speaker 1: the robot, which I am going to guess is 743 00:40:21,880 --> 00:40:24,080 Speaker 1: going to be really another computer, not so 744 00:40:24,160 --> 00:40:26,440 Speaker 1: much a robot, they're going to have it capable of 745 00:40:26,480 --> 00:40:30,319 Speaker 1: interacting with you, just as Hitchbot could. So in 746 00:40:30,360 --> 00:40:33,960 Speaker 1: other words, there will still be that robot human interaction 747 00:40:34,040 --> 00:40:38,640 Speaker 1: element that will play a part in this experiment, but 748 00:40:39,360 --> 00:40:44,160 Speaker 1: the nature of the overall experiment, the perceived 749 00:40:44,280 --> 00:40:47,080 Speaker 1: purpose, will be different. Now, it's funny, because I 750 00:40:47,160 --> 00:40:49,160 Speaker 1: wonder what some people are going to consider a good 751 00:40:49,239 --> 00:40:51,600 Speaker 1: deed, too, because there might be some comical examples of 752 00:40:51,600 --> 00:40:54,560 Speaker 1: what people consider to be their good deed for humanity. 753 00:40:54,960 --> 00:40:57,799 Speaker 1: Yeah, I imagine we would see 754 00:40:57,800 --> 00:41:00,279 Speaker 1: everything from someone saying, all right, I'm gonna take this 755 00:41:01,120 --> 00:41:04,080 Speaker 1: robot with me while my company and I 756 00:41:04,120 --> 00:41:06,399 Speaker 1: go out and we clean up a neighborhood. That could 757 00:41:06,440 --> 00:41:08,399 Speaker 1: be one, or it could be, I'm going to set 758 00:41:08,400 --> 00:41:10,240 Speaker 1: this robot here on the corner so it can watch 759 00:41:10,280 --> 00:41:13,000 Speaker 1: me as I stop traffic so this mama duck and 760 00:41:13,000 --> 00:41:15,080 Speaker 1: her baby ducks can get across the street. It could 761 00:41:15,160 --> 00:41:18,239 Speaker 1: be anything. Yeah, like, oh man, you almost spilled your beer, 762 00:41:18,280 --> 00:41:21,920 Speaker 1: but I saved you. I'm passing this thing on. 763 00:41:22,040 --> 00:41:25,920 Speaker 1: And see, that also, again, look, 764 00:41:25,960 --> 00:41:28,160 Speaker 1: if you look at it as an experiment, 765 00:41:29,120 --> 00:41:32,760 Speaker 1: that's still meaningful data. Right. That's true.
It's interesting, 766 00:41:32,960 --> 00:41:35,200 Speaker 1: you know, it's kind of like a joke, but it's 767 00:41:35,239 --> 00:41:38,880 Speaker 1: also, that's humanity too, that's true. And that's exactly what 768 00:41:39,040 --> 00:41:41,440 Speaker 1: the initial goal of this whole thing was. I mean, 769 00:41:41,440 --> 00:41:44,080 Speaker 1: it's to see what happens. It wasn't just the goal of 770 00:41:44,120 --> 00:41:46,200 Speaker 1: getting this thing across Canada. They could put it in 771 00:41:46,239 --> 00:41:48,480 Speaker 1: a box and ship it if they want, or just have 772 00:41:48,560 --> 00:41:50,279 Speaker 1: a trucking company haul it. Like we said, you know, 773 00:41:50,360 --> 00:41:52,720 Speaker 1: one shot all the way straight across. But the idea 774 00:41:52,880 --> 00:41:55,120 Speaker 1: was to see what happens along the way. It's like 775 00:41:55,160 --> 00:41:58,279 Speaker 1: the journey is better than the destination. Exactly. Yeah, and 776 00:41:58,320 --> 00:42:01,799 Speaker 1: it's those experiences that were important, and that's 777 00:42:02,160 --> 00:42:05,719 Speaker 1: documenting culture. And we're talking about an emerging culture now, 778 00:42:05,800 --> 00:42:09,719 Speaker 1: not just tradition, not just the embedded culture that's been 779 00:42:09,760 --> 00:42:13,680 Speaker 1: around for generations. We're talking about an emerging culture of 780 00:42:13,719 --> 00:42:18,000 Speaker 1: technology and our daily lives intermingling on a level that 781 00:42:18,120 --> 00:42:21,560 Speaker 1: is unprecedented. We've never seen it like that, 782 00:42:21,960 --> 00:42:27,879 Speaker 1: and it grows every day. So fascinating, really a fascinating experiment. 783 00:42:28,440 --> 00:42:30,560 Speaker 1: I wouldn't call it a failure at all. I mean, 784 00:42:30,600 --> 00:42:33,440 Speaker 1: I'm sad that Hitchbot didn't get further along in 785 00:42:33,480 --> 00:42:36,080 Speaker 1: its journey so that more people could experience it and 786 00:42:36,120 --> 00:42:40,960 Speaker 1: that we could have more stories. But it's all right, 787 00:42:41,040 --> 00:42:44,879 Speaker 1: because the story continues. It's just that the Hitchbot chapter is over. 788 00:42:45,560 --> 00:42:49,240 Speaker 1: So when you look at it that way, it's actually 789 00:42:49,600 --> 00:42:52,480 Speaker 1: really interesting and inspiring. And you know, of course, you 790 00:42:52,560 --> 00:42:55,880 Speaker 1: might say, well, I hope that the next robot meets 791 00:42:55,880 --> 00:42:58,920 Speaker 1: with more success and doesn't have the same kind of encounter. 792 00:42:59,480 --> 00:43:01,879 Speaker 1: But if we do see these kinds of encounters happen 793 00:43:01,920 --> 00:43:04,440 Speaker 1: again and again, then we have new questions to ask, like, 794 00:43:04,800 --> 00:43:07,960 Speaker 1: why is this happening? You know, what 795 00:43:08,040 --> 00:43:11,319 Speaker 1: are the motivations behind it? Are there things we need 796 00:43:11,360 --> 00:43:14,400 Speaker 1: to look at as a society, not 797 00:43:14,719 --> 00:43:18,160 Speaker 1: because we want to protect robots, but are there 798 00:43:18,280 --> 00:43:22,160 Speaker 1: underlying issues that this is just an indicator of, 799 00:43:22,280 --> 00:43:24,319 Speaker 1: and maybe there are things we need to fix.
Is it 800 00:43:24,320 --> 00:43:28,600 Speaker 1: real anger, some deep-seated anger against robots, or 801 00:43:28,760 --> 00:43:32,880 Speaker 1: even just one of those situations where clearly the 802 00:43:33,320 --> 00:43:35,960 Speaker 1: person who was trying to scavenge it wanted to get 803 00:43:35,960 --> 00:43:38,480 Speaker 1: it for the parts to sell for some reason? And 804 00:43:38,520 --> 00:43:40,879 Speaker 1: well, if that's in fact the answer, then 805 00:43:40,920 --> 00:43:43,200 Speaker 1: you might say, all right, you know, this 806 00:43:43,280 --> 00:43:48,200 Speaker 1: is yet another indicator that there are conditions that maybe 807 00:43:48,239 --> 00:43:50,759 Speaker 1: we should look at and really talk about. And yes, 808 00:43:50,840 --> 00:43:53,759 Speaker 1: this is a kind of trivial way of highlighting that, 809 00:43:53,840 --> 00:43:56,960 Speaker 1: and it's stuff that we already know, but it's another 810 00:43:57,040 --> 00:44:00,360 Speaker 1: way to say, think about this. I mean, we're really 811 00:44:00,800 --> 00:44:03,680 Speaker 1: talking about compassion, and on a level that 812 00:44:03,840 --> 00:44:06,800 Speaker 1: is a very, you know, human trait, a very innate 813 00:44:06,840 --> 00:44:09,600 Speaker 1: trait in us. Maybe we should apply that to our 814 00:44:09,640 --> 00:44:13,120 Speaker 1: fellow humans, not just to the robots. Even 815 00:44:13,120 --> 00:44:16,000 Speaker 1: the humans that caused damage to the robots, 816 00:44:16,200 --> 00:44:19,920 Speaker 1: we should show compassion to, because we don't know the 817 00:44:20,000 --> 00:44:23,160 Speaker 1: reason behind it, and there may be reasons that we 818 00:44:23,239 --> 00:44:26,239 Speaker 1: can't even identify with because we're not in that situation, 819 00:44:26,760 --> 00:44:29,680 Speaker 1: and that's all the more reason to show compassion. And 820 00:44:29,719 --> 00:44:32,080 Speaker 1: that's exactly what... Again, I keep going back to this, 821 00:44:32,160 --> 00:44:34,359 Speaker 1: but that's exactly what they were looking for when they 822 00:44:34,400 --> 00:44:37,080 Speaker 1: started this whole experiment a couple of years ago. Yeah. 823 00:44:37,120 --> 00:44:40,000 Speaker 1: So this has really been fascinating and I can't wait 824 00:44:40,040 --> 00:44:43,120 Speaker 1: to see what the next phase will bring us. 825 00:44:43,160 --> 00:44:44,960 Speaker 1: Can I ask you one question before we leave here? 826 00:44:45,000 --> 00:44:47,680 Speaker 1: And I think we had discussed this, but only briefly, 827 00:44:47,719 --> 00:44:50,920 Speaker 1: and we didn't really get into much detail. But 828 00:44:51,400 --> 00:44:54,279 Speaker 1: had you not known about Hitchbot, right, had you never 829 00:44:54,360 --> 00:44:56,640 Speaker 1: heard of this whole thing, and you passed it on 830 00:44:56,719 --> 00:44:59,880 Speaker 1: the city streets driving, would you have stopped 831 00:44:59,880 --> 00:45:03,160 Speaker 1: to pick it up? Not a chance at all that 832 00:45:03,200 --> 00:45:06,600 Speaker 1: I would stop. And you made me think about 833 00:45:06,600 --> 00:45:09,600 Speaker 1: this, because I was coming at it from the perspective 834 00:45:09,640 --> 00:45:13,000 Speaker 1: of knowing about Hitchbot. If I saw Hitchbot, I'd think, 835 00:45:13,600 --> 00:45:17,200 Speaker 1: holy crap, there's the hitchhiking robot.
We've got to take 836 00:45:17,239 --> 00:45:19,520 Speaker 1: part in this. This is something special. And I feel 837 00:45:19,560 --> 00:45:22,759 Speaker 1: the exact same way. But not knowing about Hitchbot, not 838 00:45:22,880 --> 00:45:26,600 Speaker 1: knowing about it and seeing a bucket that has electronics 839 00:45:26,719 --> 00:45:31,080 Speaker 1: attached to it, even with the happy face, maybe particularly 840 00:45:31,120 --> 00:45:34,600 Speaker 1: with the happy face, I might think, whoa, this 841 00:45:34,680 --> 00:45:37,120 Speaker 1: is like a suspicious device of some sort. I thought, 842 00:45:37,320 --> 00:45:39,080 Speaker 1: I said, it looks a lot, an awful lot, like 843 00:45:39,080 --> 00:45:41,799 Speaker 1: an IED. And I thought, it's not too 844 00:45:41,840 --> 00:45:43,520 Speaker 1: far off in the description. I know that they're a 845 00:45:43,520 --> 00:45:47,160 Speaker 1: little more, I guess, camouflaged in the way that 846 00:45:47,200 --> 00:45:50,279 Speaker 1: they typically do those things. But this just seems to 847 00:45:50,320 --> 00:45:53,160 Speaker 1: me like not a good idea, to pick something like 848 00:45:53,200 --> 00:45:54,839 Speaker 1: this up on the street and, 849 00:45:54,960 --> 00:45:57,320 Speaker 1: you know, strap it in the car next to your kids. Yeah, 850 00:45:57,800 --> 00:46:00,520 Speaker 1: not really. But I mean, knowing what it is, yeah, 851 00:46:00,600 --> 00:46:02,040 Speaker 1: of course you want to do that. You'd want to, 852 00:46:02,080 --> 00:46:03,759 Speaker 1: you know, it'd be a great experience for you and 853 00:46:03,760 --> 00:46:05,920 Speaker 1: your family, you know, to do something with it, 854 00:46:05,920 --> 00:46:08,480 Speaker 1: even if you drive it a mile or five miles 855 00:46:08,560 --> 00:46:11,239 Speaker 1: or whatever. Just take a quick photograph with it and 856 00:46:11,280 --> 00:46:13,319 Speaker 1: say you were part of that journey. That's kind of cool. 857 00:46:13,440 --> 00:46:17,680 Speaker 1: It's actually kind of interesting to me that Hitchbot spent 858 00:46:17,760 --> 00:46:23,480 Speaker 1: seven days in Boston, because Boston is also where we 859 00:46:23,600 --> 00:46:28,440 Speaker 1: had the Aqua Teen Hunger Force Mooninite bomb scare. It 860 00:46:28,480 --> 00:46:30,680 Speaker 1: was in two thousand seven and it was in Boston. 861 00:46:30,880 --> 00:46:32,920 Speaker 1: Oh, it was in Boston. Yeah, yeah, that's with the 862 00:46:33,440 --> 00:46:35,080 Speaker 1: neon, or not neon, I'm sorry, LED. 863 00:46:35,200 --> 00:46:37,880 Speaker 1: Yeah, well, yeah, you can tell them 864 00:46:37,880 --> 00:46:39,560 Speaker 1: what it was. Yeah, the LED signs there. 865 00:46:39,600 --> 00:46:42,719 Speaker 1: There were these two characters from Aqua Teen Hunger Force, these 866 00:46:42,719 --> 00:46:47,080 Speaker 1: two Mooninites from the Moon. They look like, yeah, 867 00:46:47,239 --> 00:46:50,080 Speaker 1: the eight-bit characters from a really crappy video game, 868 00:46:50,120 --> 00:46:53,480 Speaker 1: which is what they're specifically made to look like. They're two-dimensional. 869 00:46:53,560 --> 00:46:55,759 Speaker 1: When they turn sideways, you don't see them 870 00:46:55,760 --> 00:46:59,080 Speaker 1: anymore because they're gone. And also they're hilarious. They are hilarious.
871 00:46:59,080 --> 00:47:02,719 Speaker 1: They are incredibly inappropriate, as is everything on Aqua Teen Hunger Force, 872 00:47:03,000 --> 00:47:09,200 Speaker 1: but they are hilarious. And there was a publicity stunt 873 00:47:09,440 --> 00:47:13,359 Speaker 1: where these LED signs of the two characters were 874 00:47:13,360 --> 00:47:17,440 Speaker 1: put up in various locations, and in Boston it caused 875 00:47:17,480 --> 00:47:20,319 Speaker 1: a bomb scare. People thought that maybe it was the 876 00:47:20,360 --> 00:47:26,080 Speaker 1: indication of an explosive device nearby, and so they were dismantled, 877 00:47:26,160 --> 00:47:29,920 Speaker 1: and it very quickly became kind of a joke slash 878 00:47:29,960 --> 00:47:33,319 Speaker 1: a discussion about how you have to be very careful in 879 00:47:33,320 --> 00:47:36,680 Speaker 1: the way you present these kinds of guerrilla marketing attempts, 880 00:47:36,719 --> 00:47:39,600 Speaker 1: because in a post nine eleven world, they 881 00:47:39,600 --> 00:47:42,399 Speaker 1: can be misinterpreted. It was post nine eleven and pre 882 00:47:43,160 --> 00:47:45,840 Speaker 1: marathon bombing too, yeah, so it's kind of in between. 883 00:47:46,040 --> 00:47:48,480 Speaker 1: But they were on high alert there for a while 884 00:47:48,560 --> 00:47:52,440 Speaker 1: about these signs, and then, you know, sheepishly, Cartoon Network 885 00:47:52,480 --> 00:47:54,840 Speaker 1: had to say, oh, that was us, yeah, and 886 00:47:54,880 --> 00:47:57,240 Speaker 1: here's what happened. But in fact they were a little 887 00:47:57,239 --> 00:48:00,480 Speaker 1: reluctant to say that was us. Well, but you know, 888 00:48:00,520 --> 00:48:04,280 Speaker 1: then again, bad press is still press. That's true. So 889 00:48:04,680 --> 00:48:08,200 Speaker 1: I'm actually amazed that Hitchbot didn't meet with 890 00:48:08,239 --> 00:48:11,800 Speaker 1: any hitches in Boston based on that. And 891 00:48:11,920 --> 00:48:14,480 Speaker 1: when you asked that question, you gave 892 00:48:14,520 --> 00:48:16,759 Speaker 1: the qualifier, hey, you've never heard of Hitchbot and you 893 00:48:16,800 --> 00:48:19,120 Speaker 1: see this thing on the side of the road. I 894 00:48:19,200 --> 00:48:21,400 Speaker 1: definitely would have wondered what the heck it was, and 895 00:48:21,400 --> 00:48:23,640 Speaker 1: I probably would have thought I might not want to 896 00:48:23,640 --> 00:48:26,759 Speaker 1: get too close to that, just in case. Yeah. Sure. 897 00:48:26,800 --> 00:48:29,840 Speaker 1: And imagine if you were, you know, somewhere in Canada, 898 00:48:29,920 --> 00:48:33,120 Speaker 1: where it's wide open farmland and, 899 00:48:33,160 --> 00:48:35,440 Speaker 1: you know, there's a mile between houses, and 900 00:48:35,520 --> 00:48:38,000 Speaker 1: this thing is propped up on its legs 901 00:48:38,000 --> 00:48:41,160 Speaker 1: and its tripod seat there out in the middle of nowhere. 902 00:48:41,719 --> 00:48:44,120 Speaker 1: I don't think I'd stop.
And in that case, I 903 00:48:44,160 --> 00:48:47,680 Speaker 1: probably would stop, only because I would think, who the 904 00:48:47,760 --> 00:48:51,960 Speaker 1: heck would set up something sinister in the middle of 905 00:48:52,120 --> 00:48:55,319 Speaker 1: nowhere, where you are not likely to affect much of 906 00:48:55,400 --> 00:48:59,520 Speaker 1: anything at all? And that's how they get you. Whereas 907 00:48:59,520 --> 00:49:02,560 Speaker 1: I would be more concerned about the city location, 908 00:49:02,640 --> 00:49:07,120 Speaker 1: where the opportunity is higher, you know. Yeah, I 909 00:49:07,160 --> 00:49:09,040 Speaker 1: just see it as, like, and that was the last 910 00:49:09,080 --> 00:49:11,560 Speaker 1: thing that he thought. Well, I already told you that 911 00:49:11,560 --> 00:49:13,960 Speaker 1: I was worried that my obituary from yesterday was going 912 00:49:14,000 --> 00:49:17,200 Speaker 1: to say choked to death on Twizzlers. So it's actually 913 00:49:17,239 --> 00:49:20,719 Speaker 1: much worse. Yeah, no, especially since I hate Twizzlers. All right. Well, 914 00:49:20,760 --> 00:49:22,600 Speaker 1: at any rate, this was really a lot of fun 915 00:49:22,640 --> 00:49:25,160 Speaker 1: to talk about, and it was fun to kind of, 916 00:49:25,320 --> 00:49:28,719 Speaker 1: you know, think about the weird adventures, which are all, 917 00:49:28,760 --> 00:49:30,960 Speaker 1: like I said, documented. You can go to the Hitchbot 918 00:49:30,960 --> 00:49:34,400 Speaker 1: website and read up on the different days and events 919 00:49:34,440 --> 00:49:37,120 Speaker 1: and things that it encountered, the people it met, so 920 00:49:37,160 --> 00:49:39,880 Speaker 1: many successful journeys and so many events and things that 921 00:49:39,880 --> 00:49:42,680 Speaker 1: happened, and it posted about all 922 00:49:42,719 --> 00:49:45,359 Speaker 1: that stuff, and all its interactions are recorded in some way, 923 00:49:45,560 --> 00:49:47,719 Speaker 1: which is great, so you can actually go back and 924 00:49:47,760 --> 00:49:51,680 Speaker 1: relive those journeys. And I do hope that we see 925 00:49:52,000 --> 00:49:56,080 Speaker 1: some further experiments that are in the same spirit, 926 00:49:56,239 --> 00:49:58,920 Speaker 1: whether or not it's another hitchhiking thing or it's like 927 00:49:58,960 --> 00:50:02,960 Speaker 1: the Philly LoveBot or so. I know, it's like, 928 00:50:03,120 --> 00:50:05,439 Speaker 1: it's like I'm ten years old. You can't 929 00:50:05,480 --> 00:50:07,960 Speaker 1: say that and I still giggle every time. But you know, 930 00:50:08,280 --> 00:50:10,839 Speaker 1: I see what you mean. I anticipate a bunch of 931 00:50:11,120 --> 00:50:15,560 Speaker 1: copycat Hitchbots popping up. Here's another little thing that 932 00:50:15,600 --> 00:50:17,239 Speaker 1: we didn't get to really talk about, 933 00:50:17,239 --> 00:50:18,719 Speaker 1: but I think the very next thing we're going to 934 00:50:18,760 --> 00:50:21,160 Speaker 1: see out of this, unfortunately, and this is just my 935 00:50:21,280 --> 00:50:24,880 Speaker 1: gut feeling. You said somebody probably scrapped the head, you know, 936 00:50:24,960 --> 00:50:28,400 Speaker 1: the control unit.
I have a feeling we're going to 937 00:50:28,480 --> 00:50:30,600 Speaker 1: see a photograph of that show up somewhere that's gonna 938 00:50:30,600 --> 00:50:33,839 Speaker 1: be sent to the creators of Hitchbot, which 939 00:50:34,120 --> 00:50:38,799 Speaker 1: to the creators probably would just be yet another data point. Probably, yeah. 940 00:50:38,800 --> 00:50:41,120 Speaker 1: But I see it as going the way 941 00:50:41,239 --> 00:50:44,560 Speaker 1: a real crime against a human would have gone, 942 00:50:44,600 --> 00:50:46,879 Speaker 1: you know, and that there will next be a kind 943 00:50:46,920 --> 00:50:49,520 Speaker 1: of taunting note sent to them as well, which, 944 00:50:49,840 --> 00:50:53,520 Speaker 1: you know, I hope I'm wrong. If that does happen, though, 945 00:50:53,640 --> 00:50:56,719 Speaker 1: it is interesting, because whoever does it 946 00:50:56,840 --> 00:51:00,719 Speaker 1: obviously would think of it as, and I would 947 00:51:00,760 --> 00:51:03,400 Speaker 1: guess this is armchair psychology, but I would imagine that 948 00:51:03,400 --> 00:51:05,680 Speaker 1: they would think of it as like a joke. That, 949 00:51:05,760 --> 00:51:07,719 Speaker 1: you know, I'm treating this as if it were a 950 00:51:07,719 --> 00:51:10,839 Speaker 1: real person. Meanwhile, there are other people who think that's 951 00:51:10,920 --> 00:51:13,880 Speaker 1: sick, because they do think of Hitchbot as at least 952 00:51:14,200 --> 00:51:17,239 Speaker 1: in some way similar to a person. And this 953 00:51:17,280 --> 00:51:20,200 Speaker 1: is, again, so that we're clear here, I read a 954 00:51:20,239 --> 00:51:22,080 Speaker 1: lot of true crime. So it's not that I'm just 955 00:51:22,120 --> 00:51:24,640 Speaker 1: thinking about this all the time. I mean, I'm just 956 00:51:24,680 --> 00:51:28,239 Speaker 1: saying, around the one year anniversary, just pay attention to 957 00:51:28,239 --> 00:51:30,640 Speaker 1: what's going on in the news. It might happen, might not happen. 958 00:51:30,719 --> 00:51:32,520 Speaker 1: I don't have any inside info or anything like that. 959 00:51:32,800 --> 00:51:34,680 Speaker 1: It could just as easily be that someone saw it 960 00:51:34,680 --> 00:51:36,680 Speaker 1: and thought, oh, I want a tablet PC. Could be, 961 00:51:36,719 --> 00:51:38,360 Speaker 1: and it could have ended up in a dumpster 962 00:51:38,640 --> 00:51:41,600 Speaker 1: half a block away. Yeah, yeah, it's hard to say, 963 00:51:41,719 --> 00:51:43,560 Speaker 1: but this was a lot of fun. Scott, thank you 964 00:51:43,600 --> 00:51:46,040 Speaker 1: for coming on the show. Thank you again. I appreciate it, 965 00:51:46,080 --> 00:51:48,440 Speaker 1: and likewise, I had a good time doing it. Fantastic. 966 00:51:48,480 --> 00:51:51,719 Speaker 1: If you guys want to listen to more amazing content 967 00:51:51,840 --> 00:51:54,600 Speaker 1: that Scott and Ben generate all the time, you've got 968 00:51:54,719 --> 00:51:57,560 Speaker 1: to check out CarStuff. It's a great show, it's 969 00:51:57,680 --> 00:51:59,520 Speaker 1: tons of fun. I've had so much fun 970 00:51:59,560 --> 00:52:02,120 Speaker 1: listening to those episodes.
Uh, I thought you were gonna 971 00:52:02,120 --> 00:52:04,279 Speaker 1: say those clowns. Right there, he said, I've had so 972 00:52:04,360 --> 00:52:07,359 Speaker 1: much fun listening to, and there's a little pause, and, well, 973 00:52:07,840 --> 00:52:10,680 Speaker 1: that's true, outside of the room, like in the 974 00:52:10,800 --> 00:52:13,160 Speaker 1: studio, or in the office rather. Well, I appreciate it. 975 00:52:13,200 --> 00:52:15,359 Speaker 1: Thank you very much. Yeah. So, guys, if you have 976 00:52:15,400 --> 00:52:17,840 Speaker 1: any suggestions for future topics, or you have, you know, 977 00:52:17,960 --> 00:52:20,719 Speaker 1: your own perspective on Hitchbot and what that experiment was 978 00:52:20,760 --> 00:52:22,920 Speaker 1: all about, let me know. Send me an email. The 979 00:52:22,960 --> 00:52:26,120 Speaker 1: address is tech stuff at how stuff works dot com, or 980 00:52:26,160 --> 00:52:28,920 Speaker 1: you can drop me a line on Facebook or Twitter. 981 00:52:29,040 --> 00:52:31,480 Speaker 1: The handle for both of those is tech Stuff H S 982 00:52:31,680 --> 00:52:41,880 Speaker 1: W, and I'll talk to you again really soon. For 983 00:52:42,040 --> 00:52:44,360 Speaker 1: more on this and thousands of other topics, visit 984 00:52:44,440 --> 00:52:53,320 Speaker 1: how stuff works dot com.