1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,840 --> 00:00:14,160 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,280 --> 00:00:16,920 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio, 4 00:00:17,000 --> 00:00:19,760 Speaker 1: and how the tech are you? It is a Friday, 5 00:00:19,800 --> 00:00:22,720 Speaker 1: which means it's time for a TechStuff classic episode. 6 00:00:23,480 --> 00:00:27,520 Speaker 1: This episode is called The Sad Tale of hitchBOT, 7 00:00:28,120 --> 00:00:32,960 Speaker 1: and in many ways it's another example of comparing the 8 00:00:33,080 --> 00:00:37,280 Speaker 1: culture of Canada with that of the United States. Scott 9 00:00:37,360 --> 00:00:40,360 Speaker 1: Benjamin was on hand to serve as a guest host 10 00:00:40,440 --> 00:00:44,159 Speaker 1: in this episode, and it originally published on August twenty-fourth, 11 00:00:44,440 --> 00:00:48,720 Speaker 1: two thousand fifteen. Let's listen in. Let me paint you 12 00:00:48,960 --> 00:00:56,000 Speaker 1: a word picture, Scott. In July two thousand fourteen, a hitchhiker 13 00:00:56,080 --> 00:01:01,680 Speaker 1: began a historic journey from Halifax, Nova Scotia, to get 14 00:01:01,840 --> 00:01:06,480 Speaker 1: to Victoria, British Columbia, on the other side of Canada. 15 00:01:06,640 --> 00:01:11,640 Speaker 1: We're talking crossing the entire width of Canada. And if 16 00:01:11,680 --> 00:01:15,560 Speaker 1: you were to do that on the most efficient route possible, 17 00:01:15,600 --> 00:01:18,800 Speaker 1: if you got to choose the route, that would be 18 00:01:18,880 --> 00:01:22,440 Speaker 1: at minimum around three thousand six hundred forty-four miles, 19 00:01:22,520 --> 00:01:26,880 Speaker 1: or five thousand eight hundred sixty-four kilometers. However, hitchhikers 20 00:01:27,000 --> 00:01:32,319 Speaker 1: rarely have the ability to call exactly which route needs 21 00:01:32,360 --> 00:01:34,319 Speaker 1: to be taken. They are at the mercy of the 22 00:01:34,400 --> 00:01:38,120 Speaker 1: drivers that pick them up. I'm just, I'm headed west. Exactly, 23 00:01:38,160 --> 00:01:40,520 Speaker 1: take me as far as you're going. And not that 24 00:01:40,600 --> 00:01:43,080 Speaker 1: I condone hitchhiking or anything like that. That's kind of dangerous. 25 00:01:43,360 --> 00:01:45,640 Speaker 1: It can be, and we will definitely get into some 26 00:01:45,720 --> 00:01:50,680 Speaker 1: dangerous territory as part of this conversation. Well, the full trip 27 00:01:51,200 --> 00:01:54,440 Speaker 1: took closer to ten thousand kilometers, or about sixty-two 28 00:01:54,600 --> 00:01:58,960 Speaker 1: hundred miles. And here's the weird part. The hitchhiker 29 00:01:59,040 --> 00:02:01,840 Speaker 1: was a robot. That is weird. Yeah, not a person. 30 00:02:02,800 --> 00:02:04,640 Speaker 1: I've never picked up a hitchhiker. In fact, for a 31 00:02:04,680 --> 00:02:07,760 Speaker 1: long time, I had never even seen one. I saw 32 00:02:07,800 --> 00:02:11,799 Speaker 1: a lot in Hawaii. Surfer culture is still going strong. Yeah, 33 00:02:11,800 --> 00:02:14,400 Speaker 1: there are certain parts of the United States where you can 34 00:02:14,400 --> 00:02:17,600 Speaker 1: expect to see more hitchhikers than in other parts, right, 35 00:02:17,760 --> 00:02:20,200 Speaker 1: and, uh, Hawaii, I guess, would be one of 36 00:02:20,240 --> 00:02:22,560 Speaker 1: those places.
I've been there too, and I know exactly 37 00:02:22,560 --> 00:02:25,640 Speaker 1: what we're talking about. I think I've seen more hitchhikers 38 00:02:25,639 --> 00:02:28,840 Speaker 1: in Hawaii standing at bus stops attempting to hitch a ride, 39 00:02:29,160 --> 00:02:31,680 Speaker 1: just hoping to catch a ride before that bus shows up. 40 00:02:31,840 --> 00:02:33,280 Speaker 1: You know, it's like one or the other. Eventually that 41 00:02:33,400 --> 00:02:35,760 Speaker 1: bus will come, but in the meantime, if I 42 00:02:35,760 --> 00:02:37,600 Speaker 1: can just get a free ride down the road, that's 43 00:02:37,600 --> 00:02:39,519 Speaker 1: what I'll do. I've just seen lots and lots of 44 00:02:39,560 --> 00:02:42,440 Speaker 1: surfers trying to get to the beach, trying 45 00:02:42,440 --> 00:02:45,200 Speaker 1: to catch that next wave. That's right, man. 46 00:02:45,520 --> 00:02:48,440 Speaker 1: The waves wait for no one. And so we're talking 47 00:02:48,560 --> 00:02:51,919 Speaker 1: about hitchBOT, which a lot of you have probably heard about. 48 00:02:52,080 --> 00:02:54,560 Speaker 1: hitchBOT first made the news in two 49 00:02:54,600 --> 00:02:57,600 Speaker 1: thousand fourteen during this historic attempt to get a 50 00:02:57,680 --> 00:03:01,960 Speaker 1: robot to hitchhike across all of Canada. It was successful, 51 00:03:02,000 --> 00:03:04,679 Speaker 1: so spoiler alert there, and then it went on to do 52 00:03:04,760 --> 00:03:08,880 Speaker 1: this again in Germany. It went all over Germany, and 53 00:03:09,000 --> 00:03:12,080 Speaker 1: it also took a little vacation in the Netherlands. And 54 00:03:12,120 --> 00:03:15,040 Speaker 1: then finally there was an attempt for this robot to 55 00:03:15,120 --> 00:03:18,760 Speaker 1: hitchhike its way across the United States, which was cut 56 00:03:18,880 --> 00:03:21,320 Speaker 1: short, just like the robot. This all sounds so nice. 57 00:03:21,360 --> 00:03:24,959 Speaker 1: What could possibly go wrong? So you might be wondering, 58 00:03:25,600 --> 00:03:28,600 Speaker 1: I've heard about this, but what actually is going on? The 59 00:03:28,639 --> 00:03:31,000 Speaker 1: first thing I want to say is, while we're talking 60 00:03:31,000 --> 00:03:35,200 Speaker 1: about a hitchhiking robot, honestly, if I were to describe this, 61 00:03:35,560 --> 00:03:38,600 Speaker 1: I would not have used the word robot. Yeah, 62 00:03:38,440 --> 00:03:41,600 Speaker 1: they're using the term very loosely here. I 63 00:03:41,640 --> 00:03:44,560 Speaker 1: think it's because the form factor makes it look sort 64 00:03:44,600 --> 00:03:48,240 Speaker 1: of like a robot. Uh, and it definitely had the 65 00:03:48,280 --> 00:03:51,320 Speaker 1: benefit of having, like, a light display that made 66 00:03:51,320 --> 00:03:54,000 Speaker 1: a very simple smiley face, so you had kind of 67 00:03:54,040 --> 00:03:57,480 Speaker 1: a, you know, a head that you could identify. But really, 68 00:03:58,280 --> 00:04:01,200 Speaker 1: we're talking about a hitchhiking computer. It's just a 69 00:04:01,280 --> 00:04:04,160 Speaker 1: hitchhiking computer that was in a more or less static 70 00:04:04,360 --> 00:04:06,800 Speaker 1: robot body.
It's simple, but they found a way to 71 00:04:06,960 --> 00:04:10,080 Speaker 1: anthropomorphize this thing to the point where people looked at 72 00:04:10,080 --> 00:04:12,160 Speaker 1: it and said, oh, that's kind of cute. Yeah, because 73 00:04:12,320 --> 00:04:15,360 Speaker 1: it has a torso, it's got arms and legs. The arms 74 00:04:15,360 --> 00:04:18,760 Speaker 1: and legs don't move. The torso is static. The torso is a 75 00:04:18,800 --> 00:04:23,640 Speaker 1: bucket. Um, literally a bucket? Yeah, literally a bucket. It 76 00:04:23,680 --> 00:04:27,440 Speaker 1: does have solar panels that are arranged on the outside 77 00:04:27,440 --> 00:04:29,720 Speaker 1: of the bucket, so that's one of the ways that 78 00:04:29,880 --> 00:04:33,039 Speaker 1: the robot gets electricity. The other is that it will 79 00:04:33,120 --> 00:04:37,479 Speaker 1: ask people to plug it into the lighter socket 80 00:04:37,560 --> 00:04:40,080 Speaker 1: in a car. Yeah, it's only about three feet tall, 81 00:04:40,160 --> 00:04:43,880 Speaker 1: so it's very small. It's waterproof, which is kind of surprising. 82 00:04:43,880 --> 00:04:46,120 Speaker 1: It's waterproof. It is. Well, and also, I mean, it 83 00:04:46,240 --> 00:04:51,880 Speaker 1: wears waterproof boots. Every single article, without fail, 84 00:04:52,040 --> 00:04:55,840 Speaker 1: talks about the fact that it wears Wellies, the brand-name 85 00:04:55,880 --> 00:04:58,480 Speaker 1: boot. Yeah, okay, like a rubber boot. 86 00:04:58,560 --> 00:05:02,000 Speaker 1: It's named after the Duke of Wellington. Well, its 87 00:05:02,120 --> 00:05:05,200 Speaker 1: arms are, what, pool noodles? So, you know, it's not 88 00:05:05,440 --> 00:05:07,279 Speaker 1: like they went to great lengths to 89 00:05:07,279 --> 00:05:09,560 Speaker 1: try to make this thing look human or anything like that. 90 00:05:09,640 --> 00:05:11,880 Speaker 1: But it does have arms and legs. It does have 91 00:05:11,920 --> 00:05:15,240 Speaker 1: a face, as you mentioned. It has, um, it almost 92 00:05:15,279 --> 00:05:17,120 Speaker 1: looks like a Tupperware container on the top, like 93 00:05:17,120 --> 00:05:20,880 Speaker 1: a hat, a beret or something like that. That's designed 94 00:05:20,880 --> 00:05:25,839 Speaker 1: to actually protect the electronics, the tablet computer 95 00:05:25,960 --> 00:05:29,800 Speaker 1: that is running the software that the robot uses. 96 00:05:29,960 --> 00:05:33,279 Speaker 1: Likely part of the waterproofing overall, I guess. And, um, 97 00:05:33,279 --> 00:05:36,040 Speaker 1: it's GPS equipped. The thing weighs about twenty-five pounds total, 98 00:05:36,080 --> 00:05:38,599 Speaker 1: so it's not that heavy. It really has its own built 99 00:05:38,600 --> 00:05:41,839 Speaker 1: in seat. It's like a car seat, almost. Well, 100 00:05:41,839 --> 00:05:44,039 Speaker 1: it is a car seat, a kid's car seat, 101 00:05:44,360 --> 00:05:46,800 Speaker 1: that you can then put in your car and buckle in, 102 00:05:46,800 --> 00:05:48,279 Speaker 1: so it's secure when it's in there. It's not gonna 103 00:05:48,279 --> 00:05:50,480 Speaker 1: fly around the vehicle loose, you know, if something were 104 00:05:50,520 --> 00:05:53,480 Speaker 1: to happen, right. And its legs are not powered; 105 00:05:53,680 --> 00:05:56,239 Speaker 1: they are static.
So the way that you would set 106 00:05:56,240 --> 00:05:58,120 Speaker 1: this up, when you are done carrying it as far 107 00:05:58,160 --> 00:06:01,119 Speaker 1: as you want to carry it, is the seat also 108 00:06:01,200 --> 00:06:04,960 Speaker 1: has essentially a lever that can fold down into a 109 00:06:05,000 --> 00:06:09,440 Speaker 1: tripod-like position, so the two legs act as two 110 00:06:09,440 --> 00:06:11,800 Speaker 1: of the legs of the tripod. This lever acts as the 111 00:06:11,839 --> 00:06:13,240 Speaker 1: third, and if you were to pick it up, you 112 00:06:13,240 --> 00:06:16,880 Speaker 1: could fold that arm back up against the seat, so 113 00:06:16,920 --> 00:06:19,159 Speaker 1: that would allow you to put it into your vehicle 114 00:06:19,640 --> 00:06:22,080 Speaker 1: and secure the car seat there. And that's if you 115 00:06:22,120 --> 00:06:23,520 Speaker 1: were in an area where you wanted to set it 116 00:06:23,600 --> 00:06:25,160 Speaker 1: up on the side of the road that was, you know, 117 00:06:25,200 --> 00:06:27,400 Speaker 1: like a field or something, where there's nowhere for it 118 00:06:27,440 --> 00:06:28,880 Speaker 1: to sit, like on a bench or maybe on a 119 00:06:28,920 --> 00:06:31,919 Speaker 1: wall or something. Right, that's right. And, uh, you know, 120 00:06:32,120 --> 00:06:35,039 Speaker 1: like I said, it was running on essentially a tablet 121 00:06:35,080 --> 00:06:38,240 Speaker 1: PC. If you looked at all the equipment, according 122 00:06:38,279 --> 00:06:42,840 Speaker 1: to the website, uh, it cost about a thousand dollars, 123 00:06:42,839 --> 00:06:45,880 Speaker 1: maybe a little less. And that was a calculated decision. 124 00:06:46,360 --> 00:06:48,919 Speaker 1: They, being the team behind this, and 125 00:06:48,920 --> 00:06:51,880 Speaker 1: I'll talk about them in a second, wanted the robot 126 00:06:51,960 --> 00:06:54,920 Speaker 1: to be inexpensive enough that it would not be an 127 00:06:54,920 --> 00:06:58,200 Speaker 1: obvious target for someone to just steal the components out 128 00:06:58,200 --> 00:07:01,599 Speaker 1: of it. They wanted it to be, uh, accessible. 129 00:07:01,600 --> 00:07:03,880 Speaker 1: They wanted it to be cute. They wanted it to 130 00:07:03,880 --> 00:07:06,720 Speaker 1: be something that people would want to interact with, and 131 00:07:06,800 --> 00:07:10,960 Speaker 1: to have enough of an ability to have interactions, including 132 00:07:11,320 --> 00:07:15,560 Speaker 1: holding a conversation. Sort of. Yeah, kind of. Yeah, uh, 133 00:07:16,160 --> 00:07:19,320 Speaker 1: we're being real generous with the term conversation. That's one 134 00:07:19,360 --> 00:07:21,000 Speaker 1: thing that you and I talked about off air: 135 00:07:21,080 --> 00:07:23,880 Speaker 1: every conversation we've ever seen with this, you know, 136 00:07:23,920 --> 00:07:28,560 Speaker 1: between a human and hitchBOT, it was awkward, to 137 00:07:28,560 --> 00:07:30,760 Speaker 1: say the least. I mean, it 138 00:07:30,840 --> 00:07:34,040 Speaker 1: kind of picked up on what the human 139 00:07:34,160 --> 00:07:37,040 Speaker 1: was saying, but not entirely. It didn't quite get the gist 140 00:07:37,080 --> 00:07:39,960 Speaker 1: of the conversation, and it would respond in an awkward way.
Yeah, 141 00:07:40,000 --> 00:07:42,400 Speaker 1: it had a microphone, so it could pick 142 00:07:42,480 --> 00:07:44,760 Speaker 1: up on what people were saying, and a speaker, so 143 00:07:44,920 --> 00:07:48,640 Speaker 1: it could then return and communicate back. And a big 144 00:07:48,680 --> 00:07:51,960 Speaker 1: problem with that was probably the vast array of 145 00:07:52,000 --> 00:07:55,040 Speaker 1: dialects it was dealing with, right, and just the fact 146 00:07:55,040 --> 00:07:59,560 Speaker 1: that our spoken language is incredibly plastic and adaptive, and 147 00:07:59,600 --> 00:08:01,880 Speaker 1: there are so many different ways to say the same 148 00:08:01,920 --> 00:08:05,480 Speaker 1: thing, that it can be difficult. You know, it's 149 00:08:05,520 --> 00:08:08,440 Speaker 1: like a non-native speaker of any language: you might 150 00:08:08,480 --> 00:08:11,360 Speaker 1: be taught how to say or ask for something a 151 00:08:11,440 --> 00:08:14,800 Speaker 1: very specific way, and that specific way is still correct, 152 00:08:14,880 --> 00:08:18,640 Speaker 1: it's not incorrect, but it's just one way to say that, 153 00:08:18,720 --> 00:08:22,480 Speaker 1: to express that thought. And in most languages there are 154 00:08:22,600 --> 00:08:26,120 Speaker 1: lots of different ways to express the same thought, and 155 00:08:26,280 --> 00:08:29,000 Speaker 1: you're familiar with one of them. So if anyone comes 156 00:08:29,080 --> 00:08:31,679 Speaker 1: up to you and uses a different one, you could 157 00:08:31,680 --> 00:08:34,800 Speaker 1: be completely confused, even though you know one way of 158 00:08:34,840 --> 00:08:37,960 Speaker 1: saying it that you understand; all these other ways you don't. 159 00:08:38,120 --> 00:08:41,840 Speaker 1: Same thing with computers. If you train a computer using 160 00:08:41,840 --> 00:08:46,160 Speaker 1: machine learning on what certain phrases mean, that's great; the 161 00:08:46,200 --> 00:08:48,720 Speaker 1: computer might be able to identify that. But if someone 162 00:08:48,800 --> 00:08:51,600 Speaker 1: were to ask for the same sort of thing but 163 00:08:51,960 --> 00:08:54,880 Speaker 1: word it slightly differently, that can be enough to throw 164 00:08:54,920 --> 00:08:58,800 Speaker 1: a computer off, because these are subtle things that we 165 00:08:59,000 --> 00:09:03,320 Speaker 1: humans can intuitively grasp, but computers lack intuition. And 166 00:09:03,360 --> 00:09:06,920 Speaker 1: there's not only the word problem. There's also the way 167 00:09:06,960 --> 00:09:09,400 Speaker 1: that it's said. So, you know, the regions, 168 00:09:09,400 --> 00:09:11,600 Speaker 1: the zones. Like, you know, here in the South, people 169 00:09:11,600 --> 00:09:13,400 Speaker 1: talk differently than they do in the 170 00:09:13,400 --> 00:09:17,920 Speaker 1: Pacific Northwest, or the Northeast. The 171 00:09:18,320 --> 00:09:22,480 Speaker 1: Northeast, they talk faster than I can hear. Yeah, 172 00:09:22,480 --> 00:09:25,040 Speaker 1: so it's different. Not only, um, you know, 173 00:09:25,120 --> 00:09:27,480 Speaker 1: just the different languages. Like, of course, this thing had 174 00:09:27,480 --> 00:09:30,400 Speaker 1: to learn German, had to learn, uh, you know, Dutch, 175 00:09:30,440 --> 00:09:32,320 Speaker 1: I guess, and it had to know French to get through all 176 00:09:32,360 --> 00:09:35,079 Speaker 1: of Canada.
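To make the brittleness described above concrete, here is a minimal sketch of that failure mode: an intent matcher that only recognizes the exact phrasings it was trained on. The phrases and intent names here are hypothetical illustrations of the general problem, not hitchBOT's actual software.

```python
# A toy intent matcher that only knows exact trained phrasings.
# Hypothetical illustration; this is not hitchBOT's actual code.

TRAINED_PHRASES = {
    "where are you headed": "ask_destination",
    "can you plug me in": "request_charging",
    "tell me about yourself": "smalltalk_bio",
}

def classify(utterance: str) -> str:
    """Map an utterance to a known intent, or fall back to a canned reply."""
    key = utterance.lower().strip(" ?!.")
    return TRAINED_PHRASES.get(key, "fallback_non_sequitur")

print(classify("Where are you headed?"))    # -> ask_destination
print(classify("So where are you going?"))  # -> fallback_non_sequitur, even
                                            # though the meaning is the same
```

The same request in unfamiliar wording, or mangled upstream by a speech recognizer struggling with an accent, drops straight through to the fallback, which is exactly the awkward, off-topic interjection the hosts describe.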
Sure, yeah, exactly right, and of course English. 177 00:09:35,080 --> 00:09:36,920 Speaker 1: And, you know, not only that, but the different 178 00:09:36,920 --> 00:09:40,560 Speaker 1: dialects along the way. Sure, yeah. So lots of challenges here. 179 00:09:40,640 --> 00:09:44,480 Speaker 1: But the goal, really, I 180 00:09:44,520 --> 00:09:47,120 Speaker 1: don't think, was to have a robot hitchhike from one 181 00:09:47,200 --> 00:09:51,199 Speaker 1: end of Canada to the other. That was sort of 182 00:09:51,200 --> 00:09:55,240 Speaker 1: the face of this project. The actual goal was more 183 00:09:55,360 --> 00:09:59,040 Speaker 1: of an artistic expression, as well as an experiment in 184 00:09:59,280 --> 00:10:02,800 Speaker 1: robot-human interactions. Yeah, because the thing could have been 185 00:10:02,800 --> 00:10:05,000 Speaker 1: picked up right at the start, right 186 00:10:05,000 --> 00:10:07,160 Speaker 1: at the very first day, and driven all the way 187 00:10:07,200 --> 00:10:09,880 Speaker 1: across by one person, you know, on a long-haul 188 00:10:09,920 --> 00:10:12,480 Speaker 1: truck or something, and what would be the adventure in 189 00:10:12,520 --> 00:10:14,480 Speaker 1: that? What would be the fun? The idea was that 190 00:10:14,559 --> 00:10:18,680 Speaker 1: this relies on, um, human interaction and human kindness to 191 00:10:18,720 --> 00:10:20,400 Speaker 1: get this thing from one place to the next, kind 192 00:10:20,400 --> 00:10:23,120 Speaker 1: of to take care of this thing, and along the 193 00:10:23,120 --> 00:10:25,280 Speaker 1: way it had kind of a checklist of things that they 194 00:10:25,320 --> 00:10:26,800 Speaker 1: wanted it to do. Now, I don't know if the Canadian 195 00:10:26,840 --> 00:10:29,560 Speaker 1: trip had a checklist. I don't think it did. Uh, 196 00:10:29,640 --> 00:10:32,120 Speaker 1: the USA trip did have a checklist, which was 197 00:10:32,160 --> 00:10:35,480 Speaker 1: started, um. But the Canadian trip, um, you know, just 198 00:10:35,520 --> 00:10:38,840 Speaker 1: for instance, took twenty-six days to get across, um, 199 00:10:38,880 --> 00:10:42,480 Speaker 1: you know, the entire nation there. But it did things 200 00:10:42,480 --> 00:10:45,760 Speaker 1: like attend a wedding. Um, it was dancing in Saskatchewan. 201 00:10:45,840 --> 00:10:49,040 Speaker 1: It met some of Canada's First Nations people, uh, 202 00:10:49,080 --> 00:10:52,599 Speaker 1: some of the Aboriginal people, um, that were, um, you know, 203 00:10:52,800 --> 00:10:56,120 Speaker 1: native to Canada. And it did all kinds of things. 204 00:10:56,120 --> 00:10:58,160 Speaker 1: I mean, it went to, you know, parks; it went 205 00:10:58,200 --> 00:11:01,160 Speaker 1: to scenic locations, um, all the time. It was snapping 206 00:11:01,200 --> 00:11:04,600 Speaker 1: photographs, because it was programmed to take a photo every twenty 207 00:11:04,600 --> 00:11:08,480 Speaker 1: minutes, and it could tweet out that information. So its 208 00:11:08,520 --> 00:11:13,400 Speaker 1: interactions were, uh, doubled, in that it could interact in 209 00:11:13,480 --> 00:11:17,240 Speaker 1: person with people who were around it, and actually attempt 210 00:11:17,320 --> 00:11:20,800 Speaker 1: to have a conversation and interject. In fact, I read 211 00:11:20,840 --> 00:11:23,160 Speaker 1: one report of people who had picked up the robot, 212 00:11:23,160 --> 00:11:26,559 Speaker 1: and they said, yeah, it was weird.
There were three 213 00:11:26,640 --> 00:11:29,240 Speaker 1: of us in the car plus the robot, and the 214 00:11:29,320 --> 00:11:31,680 Speaker 1: three of us would be talking, and the robot 215 00:11:31,720 --> 00:11:35,840 Speaker 1: would interject and interrupt us, and often say something that 216 00:11:36,000 --> 00:11:38,760 Speaker 1: is completely not connected to any of the rest of 217 00:11:38,800 --> 00:11:42,080 Speaker 1: the conversation. And I thought, I've ridden in cars with 218 00:11:42,120 --> 00:11:45,960 Speaker 1: people like that, you know; someone just pipes 219 00:11:46,080 --> 00:11:48,160 Speaker 1: up with a total non sequitur, and you think, 220 00:11:48,880 --> 00:11:51,400 Speaker 1: are we in a car with a crazy person? 221 00:11:51,520 --> 00:11:53,000 Speaker 1: What's that have to do with the price of eggs 222 00:11:53,040 --> 00:11:56,840 Speaker 1: in China? Yeah, something along those lines. And so, uh, 223 00:11:56,880 --> 00:11:58,840 Speaker 1: you know, that was definitely part of it. I 224 00:11:58,880 --> 00:12:02,800 Speaker 1: think there was certainly an element of, let's see 225 00:12:02,880 --> 00:12:06,200 Speaker 1: how humans treat robots, and also, let's see how we 226 00:12:06,240 --> 00:12:10,360 Speaker 1: can design a robot that, from the get-go, is 227 00:12:10,440 --> 00:12:13,720 Speaker 1: meant to interact with humans. Because one of the things 228 00:12:13,720 --> 00:12:17,840 Speaker 1: we're starting to see increasingly in technology is the robotic 229 00:12:18,400 --> 00:12:22,240 Speaker 1: sphere and the human sphere colliding, and 230 00:12:22,640 --> 00:12:26,000 Speaker 1: by design. We want robots in our lives. People have 231 00:12:26,600 --> 00:12:29,160 Speaker 1: named their Roombas, for example. They have given their 232 00:12:29,200 --> 00:12:32,520 Speaker 1: Roombas, uh, you know, they've imprinted upon them 233 00:12:32,600 --> 00:12:37,320 Speaker 1: this idea of a personality. When something happens to the Roomba, 234 00:12:37,360 --> 00:12:39,160 Speaker 1: you know, I'm sad. You're going to be... well, 235 00:12:39,200 --> 00:12:41,920 Speaker 1: I'd be sad to be out, you know, two hundred bucks. 236 00:12:42,080 --> 00:12:46,800 Speaker 1: There's that side. But, you know, if something 237 00:12:46,880 --> 00:12:50,440 Speaker 1: happens to Fred... Yeah, don't name your Roomba. Right, yeah, 238 00:12:50,679 --> 00:12:52,520 Speaker 1: but it's true. A lot of people do name 239 00:12:52,880 --> 00:12:56,600 Speaker 1: these robots. People get emotionally invested in these machines. 240 00:12:57,120 --> 00:13:00,880 Speaker 1: And so there is this growing field of research on 241 00:13:00,960 --> 00:13:07,360 Speaker 1: human-robot interactions. How can we, one, capitalize on this 242 00:13:07,800 --> 00:13:11,760 Speaker 1: need for humans to have an emotional attachment to these 243 00:13:11,760 --> 00:13:16,920 Speaker 1: otherwise emotionless beings, these beings that lack consciousness, 244 00:13:17,040 --> 00:13:20,960 Speaker 1: lack emotions?
How can we capitalize on that so that 245 00:13:21,000 --> 00:13:23,960 Speaker 1: the interactions are useful and meaningful in some way, 246 00:13:24,040 --> 00:13:26,679 Speaker 1: even if it's only meaningful for the human? If it's 247 00:13:26,679 --> 00:13:28,680 Speaker 1: impossible for it to be meaningful for the robot, that's 248 00:13:28,679 --> 00:13:31,960 Speaker 1: okay, if it's still meaningful for the human. Or, how 249 00:13:31,960 --> 00:13:35,800 Speaker 1: do we design robots that are specifically meant to not 250 00:13:36,200 --> 00:13:40,120 Speaker 1: evoke that reaction, because it would just be, you know, 251 00:13:40,200 --> 00:13:43,640 Speaker 1: another distraction from whatever the robot is supposed to do? 252 00:13:44,600 --> 00:13:46,800 Speaker 1: Or maybe whatever the robot is supposed to do 253 00:13:47,240 --> 00:13:51,120 Speaker 1: is inherently dangerous, and, you know, you don't 254 00:13:51,120 --> 00:13:54,160 Speaker 1: want to encourage human interaction. If a robot is 255 00:13:54,160 --> 00:13:57,160 Speaker 1: meant to do something like dig into rubble, you don't 256 00:13:57,240 --> 00:14:00,960 Speaker 1: want people to, you know, worry about the robot. The 257 00:14:01,000 --> 00:14:03,200 Speaker 1: whole reason the robot's digging into rubble in the first 258 00:14:03,200 --> 00:14:06,240 Speaker 1: place is likely to look for survivors in 259 00:14:06,320 --> 00:14:10,160 Speaker 1: the fallout of a building collapse or something. Or, 260 00:14:10,280 --> 00:14:12,400 Speaker 1: well, just to prevent having to have a human do 261 00:14:12,480 --> 00:14:16,200 Speaker 1: the same thing. Yeah, exactly. Yeah, and that's really, 262 00:14:16,240 --> 00:14:19,640 Speaker 1: I mean, the way that a lot of robotics experts 263 00:14:20,000 --> 00:14:23,680 Speaker 1: see robots really taking off, at least in the near future, 264 00:14:24,360 --> 00:14:27,120 Speaker 1: is they'll be used to do jobs that are either 265 00:14:27,200 --> 00:14:32,240 Speaker 1: too dirty, dull, or dangerous for humans. So jobs that 266 00:14:32,240 --> 00:14:36,760 Speaker 1: are incredibly repetitive and don't require much thought, robots are 267 00:14:36,760 --> 00:14:41,800 Speaker 1: perfect for that. Also, robots don't sustain, like, repetitive injuries. 268 00:14:42,120 --> 00:14:44,400 Speaker 1: You know, you do have to continuously maintain them. You 269 00:14:44,440 --> 00:14:47,200 Speaker 1: can't just expect them to work forever. But they don't 270 00:14:47,280 --> 00:14:50,920 Speaker 1: get carpal tunnel syndrome. Well, unless they fix themselves. Right, 271 00:14:51,120 --> 00:14:53,600 Speaker 1: and we will get to that point eventually. And 272 00:14:53,760 --> 00:14:56,160 Speaker 1: dangerous: obviously, you would want 273 00:14:56,200 --> 00:14:58,200 Speaker 1: to be able to use a robot in a dangerous situation, 274 00:14:58,320 --> 00:15:00,360 Speaker 1: so that you're not putting human life at risk. Yeah, 275 00:15:00,440 --> 00:15:01,880 Speaker 1: but with the goal of being able to use it 276 00:15:01,920 --> 00:15:06,000 Speaker 1: again and again and again. Right. So, those robots 277 00:15:06,000 --> 00:15:08,600 Speaker 1: probably don't need to have a lot of human interactivity. 278 00:15:08,720 --> 00:15:12,520 Speaker 1: They're designed to do something where they're replacing a human, 279 00:15:12,600 --> 00:15:15,400 Speaker 1: not interacting with a human.
But at the same time, 280 00:15:15,400 --> 00:15:18,400 Speaker 1: we are seeing this growing industry of robots that are 281 00:15:18,440 --> 00:15:21,320 Speaker 1: designed to be around us in our daily lives, either 282 00:15:21,480 --> 00:15:25,680 Speaker 1: as a telepresence-style robot, where the robot is standing 283 00:15:25,720 --> 00:15:28,440 Speaker 1: in as a surrogate for an actual person, and you 284 00:15:28,520 --> 00:15:31,360 Speaker 1: might have, like, an iPad or something like that as 285 00:15:31,400 --> 00:15:35,400 Speaker 1: a head, where someone can Skype in. And this is 286 00:15:35,440 --> 00:15:38,960 Speaker 1: always creepy whenever I see it done anywhere, but I 287 00:15:39,080 --> 00:15:41,320 Speaker 1: keep being told it's the way of the future. I 288 00:15:41,360 --> 00:15:45,000 Speaker 1: have never actually interacted directly with one in an official capacity, 289 00:15:45,040 --> 00:15:46,840 Speaker 1: but I've seen them at CES. Now, I'm 290 00:15:46,840 --> 00:15:48,880 Speaker 1: gonna make a reference to something that 291 00:15:48,880 --> 00:15:53,160 Speaker 1: I have no personal interaction with. I believe my wife 292 00:15:53,160 --> 00:15:56,240 Speaker 1: was telling me about a movie recently called Her, and 293 00:15:56,280 --> 00:15:57,640 Speaker 1: it was about a man who fell in love with an 294 00:15:57,680 --> 00:16:01,440 Speaker 1: operating system and the voice that that 295 00:16:01,480 --> 00:16:04,160 Speaker 1: operating system had. Now, I can see something like that. 296 00:16:04,160 --> 00:16:06,600 Speaker 1: Let's say you get a refrigerator, and refrigerators have 297 00:16:06,720 --> 00:16:08,800 Speaker 1: screens on them now. Ours does, here at HowStuffWorks, 298 00:16:08,840 --> 00:16:10,640 Speaker 1: and it's got a screen, but it doesn't talk to us. 299 00:16:11,160 --> 00:16:12,800 Speaker 1: But it does have a screen you can interact with, and 300 00:16:12,800 --> 00:16:14,120 Speaker 1: there's a lot of different things you can do with 301 00:16:14,120 --> 00:16:18,480 Speaker 1: that screen, including putting a wacky background image on it, 302 00:16:18,560 --> 00:16:21,880 Speaker 1: as someone did when they put in, um, the 303 00:16:22,800 --> 00:16:26,920 Speaker 1: symbol for the evil organization from Lost. Um, I could 304 00:16:26,920 --> 00:16:28,680 Speaker 1: see, if it was talking to you every day 305 00:16:28,720 --> 00:16:31,520 Speaker 1: and it had a likable voice, something that you felt 306 00:16:31,560 --> 00:16:35,520 Speaker 1: comfortable interacting with, um, that, you know, I could see 307 00:16:35,560 --> 00:16:37,320 Speaker 1: somebody saying, why, I'd be sad to get rid of 308 00:16:37,360 --> 00:16:40,920 Speaker 1: that refrigerator in five years. Well, especially if you 309 00:16:40,920 --> 00:16:44,080 Speaker 1: would come home after a long day and your refrigerator says, hi, 310 00:16:44,440 --> 00:16:47,480 Speaker 1: would you like a frosty adult beverage? I mean, you 311 00:16:47,480 --> 00:16:50,480 Speaker 1: know you're gonna have a bond with that machine immediately. Yeah. 312 00:16:50,840 --> 00:16:54,280 Speaker 1: So there's this whole discipline that's coming up, like, how 313 00:16:54,320 --> 00:16:57,280 Speaker 1: do we define these interactions? How do 314 00:16:57,360 --> 00:17:00,520 Speaker 1: we shape them? Uh, and a lot of it means 315 00:17:00,600 --> 00:17:03,000 Speaker 1: you have to do study on both sides.
You have 316 00:17:03,040 --> 00:17:05,000 Speaker 1: to do a study on the robot side, like what 317 00:17:05,080 --> 00:17:07,639 Speaker 1: works and what doesn't, and you actually have to study 318 00:17:07,760 --> 00:17:12,840 Speaker 1: human psychology: how do humans respond to robots, and at 319 00:17:12,880 --> 00:17:16,160 Speaker 1: what point do humans end up treating robots as if 320 00:17:16,240 --> 00:17:20,119 Speaker 1: they are alive, as if they're living creatures? And for 321 00:17:20,160 --> 00:17:22,920 Speaker 1: a while people were thinking, um, well, the robot's gonna 322 00:17:22,920 --> 00:17:26,200 Speaker 1: need to look like something biological already. Like, it's gonna 323 00:17:26,200 --> 00:17:28,200 Speaker 1: have to be like a robot dog, or a robot, 324 00:17:28,359 --> 00:17:31,160 Speaker 1: you know, android-type person, almost like a crash test dummy, 325 00:17:31,480 --> 00:17:33,560 Speaker 1: where it looks like a human but you can tell 326 00:17:33,600 --> 00:17:35,520 Speaker 1: it's not a real human. And it turns out that's 327 00:17:35,560 --> 00:17:38,200 Speaker 1: not necessarily true, because, as we've already said, people have 328 00:17:38,280 --> 00:17:40,600 Speaker 1: been naming their Roombas; people get emotionally invested. It 329 00:17:40,680 --> 00:17:44,159 Speaker 1: turns out that if it looks animate, if 330 00:17:44,200 --> 00:17:48,280 Speaker 1: it appears to behave based upon its own decisions, whether 331 00:17:48,400 --> 00:17:50,480 Speaker 1: that's true or not, if it looks like 332 00:17:50,600 --> 00:17:53,920 Speaker 1: it's doing that, we start to kind of, in our 333 00:17:54,000 --> 00:17:57,760 Speaker 1: minds, give it these qualities. We'll be back with more 334 00:17:57,960 --> 00:18:10,359 Speaker 1: of the sad tale of hitchBOT after these messages. So 335 00:18:11,160 --> 00:18:14,639 Speaker 1: a lot of this really was studying that, like this 336 00:18:15,080 --> 00:18:18,480 Speaker 1: idea of the way people and machines are interacting, and 337 00:18:18,840 --> 00:18:21,920 Speaker 1: how that is becoming defined over time, and what we 338 00:18:22,040 --> 00:18:24,720 Speaker 1: might need to think about in that respect. And also, 339 00:18:24,880 --> 00:18:27,240 Speaker 1: just kind of, you know, it's a happy story 340 00:18:27,440 --> 00:18:34,400 Speaker 1: about how people find joy in, uh, a silly... I mean, 341 00:18:34,520 --> 00:18:36,639 Speaker 1: really, when you get down to it, it's a silly robot, 342 00:18:37,280 --> 00:18:41,000 Speaker 1: not a bad robot. It's a silly robot. And 343 00:18:41,160 --> 00:18:46,639 Speaker 1: the experience of discovery and sharing that with other people, 344 00:18:46,800 --> 00:18:49,280 Speaker 1: that was a big part of this project too. And 345 00:18:49,520 --> 00:18:51,760 Speaker 1: it was really successful for three out of the four 346 00:18:53,359 --> 00:18:56,399 Speaker 1: big things that it did. The one that wasn't so 347 00:18:56,480 --> 00:19:00,400 Speaker 1: successful was the United States. So, um, really quickly, 348 00:19:00,400 --> 00:19:02,359 Speaker 1: before we get into the US stuff, I was going 349 00:19:02,400 --> 00:19:05,440 Speaker 1: to talk about some of the folks who designed and 350 00:19:05,720 --> 00:19:09,880 Speaker 1: came up with this idea. Uh, the two leads who 351 00:19:10,160 --> 00:19:13,960 Speaker 1: first came up with the concept for hitchBOT
were David 352 00:19:14,040 --> 00:19:19,560 Speaker 1: Harris Smith and Frauke Zeller. And I probably am mispronouncing 353 00:19:19,640 --> 00:19:24,439 Speaker 1: Ms. Zeller's first name, Frauke. It's a name that I 354 00:19:24,520 --> 00:19:27,399 Speaker 1: was not familiar with, totally new for me. And 355 00:19:27,560 --> 00:19:31,600 Speaker 1: they're out of Port Credit, Ontario. Yes. And, uh, Smith 356 00:19:31,960 --> 00:19:34,840 Speaker 1: is an assistant professor at McMaster University in the Department of 357 00:19:34,880 --> 00:19:39,600 Speaker 1: Communication Studies. Zeller is an assistant professor in the School 358 00:19:39,640 --> 00:19:44,080 Speaker 1: of Professional Communication at Ryerson University. Communications professors. Now, this 359 00:19:44,200 --> 00:19:48,440 Speaker 1: makes perfect sense, because they're fishing for the way 360 00:19:48,480 --> 00:19:51,040 Speaker 1: people interact with this. They want to find out exactly 361 00:19:51,480 --> 00:19:55,719 Speaker 1: how people respond to this, and how it responds to them. Uh, 362 00:19:55,840 --> 00:19:59,520 Speaker 1: this interaction is really, really interesting for these people in 363 00:19:59,600 --> 00:20:03,000 Speaker 1: particular, I'm sure. Right. And Zeller, when she got her PhD, 364 00:20:03,600 --> 00:20:07,640 Speaker 1: her thesis was on human-robot interaction, wasn't it? Yeah. 365 00:20:07,760 --> 00:20:09,680 Speaker 1: And they were joined by a lot of other people. 366 00:20:09,800 --> 00:20:12,240 Speaker 1: I've just got a couple of names I'll mention, but 367 00:20:12,359 --> 00:20:14,640 Speaker 1: the team itself is quite large. You can actually read 368 00:20:14,720 --> 00:20:17,240 Speaker 1: up on all of them on the website. It's funny, 369 00:20:17,240 --> 00:20:19,800 Speaker 1: because the way the website is written, it's written from 370 00:20:19,880 --> 00:20:24,960 Speaker 1: hitchBOT's perspective. So it's hitchBOT saying, oh, this is 371 00:20:25,040 --> 00:20:26,879 Speaker 1: the person who helped me learn how to talk, and 372 00:20:27,040 --> 00:20:28,919 Speaker 1: it's very cute. This is the person that takes care 373 00:20:29,000 --> 00:20:31,560 Speaker 1: of my electronics. Yes, on a daily basis. I think 374 00:20:31,560 --> 00:20:33,960 Speaker 1: there are about fourteen or fifteen people on that team. 375 00:20:34,240 --> 00:20:36,159 Speaker 1: It's a big team. It is a large team, not just 376 00:20:36,240 --> 00:20:38,960 Speaker 1: the two leads here, right. So you've got people 377 00:20:39,000 --> 00:20:43,200 Speaker 1: like Colin Gagich, who is a developer of hitchBOT, 378 00:20:43,359 --> 00:20:46,879 Speaker 1: and he's also a McMaster University student. He helped design 379 00:20:46,920 --> 00:20:49,040 Speaker 1: and test hitchBOT to make sure it would be able 380 00:20:49,119 --> 00:20:53,840 Speaker 1: to withstand the various environments that it would encounter. Keep 381 00:20:53,840 --> 00:20:55,840 Speaker 1: in mind, this was summer in Canada, so it wasn't 382 00:20:55,840 --> 00:20:59,880 Speaker 1: going to have to deal with a Canadian winter, just the summer. Yeah, 383 00:21:00,240 --> 00:21:04,840 Speaker 1: it's a little bit different from the Atlanta summers. Slightly 384 00:21:05,000 --> 00:21:09,720 Speaker 1: less warm and humid, about sixty degrees cooler. Fahrenheit, 385 00:21:09,800 --> 00:21:13,080 Speaker 1: that is, um, not Celsius. That would be pretty incredible.
386 00:21:13,440 --> 00:21:15,720 Speaker 1: Uh, so then you had Davin Bigelow, who was an 387 00:21:15,800 --> 00:21:19,399 Speaker 1: undergraduate student at McMaster who worked on the conversational skills 388 00:21:19,520 --> 00:21:23,359 Speaker 1: of this robot. Karen Veal birth Fish, who was another 389 00:21:23,400 --> 00:21:27,840 Speaker 1: person who worked on hitchBOT's language skills. Dominik Kaukinen, 390 00:21:28,240 --> 00:21:31,080 Speaker 1: an undergraduate student at McMaster whose job was to monitor 391 00:21:31,160 --> 00:21:35,040 Speaker 1: hitchBOT's status and make sure the robot was okay. So again, 392 00:21:35,119 --> 00:21:38,240 Speaker 1: the robot was fitted with GPS and 3G capability 393 00:21:38,359 --> 00:21:41,280 Speaker 1: to essentially report back home, saying, here's where I'm at. 394 00:21:41,600 --> 00:21:43,560 Speaker 1: At any given time, every twenty minutes, he's getting a 395 00:21:43,600 --> 00:21:45,960 Speaker 1: photograph sent from this robot to him, to kind of 396 00:21:46,520 --> 00:21:49,639 Speaker 1: update its status, where it is right now. Yeah. And then, uh, 397 00:21:49,920 --> 00:21:53,359 Speaker 1: there was the big brother robot to hitchBOT, kulturBOT, 398 00:21:54,000 --> 00:21:58,000 Speaker 1: K-U-L-T-U-R. But yeah, this was 399 00:21:58,440 --> 00:22:01,639 Speaker 1: a robot that preceded hitchBOT. This was a 400 00:22:01,760 --> 00:22:06,960 Speaker 1: different human-robot interaction experiment. kulturBOT's job was to 401 00:22:07,160 --> 00:22:12,680 Speaker 1: attend artistic exhibitions, take images of what was going on, 402 00:22:13,119 --> 00:22:16,760 Speaker 1: tweet them, and critique them. It was a robotic art 403 00:22:16,840 --> 00:22:21,040 Speaker 1: critic. Interesting. Anyway, so it would actually do the critique 404 00:22:21,080 --> 00:22:22,960 Speaker 1: on the fly, like right there at the event. 405 00:22:23,760 --> 00:22:26,400 Speaker 1: That's how it was described. But I didn't read enough 406 00:22:26,480 --> 00:22:29,040 Speaker 1: into it to find out how this actually worked. Like, 407 00:22:29,119 --> 00:22:33,480 Speaker 1: I don't know if it was capable of stringing together 408 00:22:33,800 --> 00:22:36,800 Speaker 1: any words just based upon what it was seeing. I 409 00:22:36,840 --> 00:22:39,480 Speaker 1: don't know if it had human intervention, where the human 410 00:22:39,600 --> 00:22:42,320 Speaker 1: was the one actually providing the caption. I don't know 411 00:22:42,480 --> 00:22:45,119 Speaker 1: the answer to that. Fascinating. But I do know that 412 00:22:45,240 --> 00:22:49,040 Speaker 1: that was essentially another project, uh, that 413 00:22:49,200 --> 00:22:51,920 Speaker 1: was being performed by much of the same team, 414 00:22:52,480 --> 00:22:54,960 Speaker 1: and hitchBOT was kind of the next step, 415 00:22:55,480 --> 00:22:57,760 Speaker 1: not directly connected. It was just one of those ideas 416 00:22:57,920 --> 00:23:02,359 Speaker 1: that Smith and Zeller came up with that they 417 00:23:02,480 --> 00:23:07,800 Speaker 1: thought was a really interesting concept. So after going through Canada, 418 00:23:08,280 --> 00:23:10,640 Speaker 1: they went to Germany. It had a lot of adventures 419 00:23:10,640 --> 00:23:13,080 Speaker 1: in Germany, went to castles, went to another wedding. There's 420 00:23:13,080 --> 00:23:17,359 Speaker 1: a great picture of a bride giving hitchBOT a little kiss.
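The status reporting described a moment ago, a GPS fix plus a photo pushed home over the cellular link every twenty minutes, amounts to a simple periodic loop. Here is a minimal sketch under stated assumptions: the read_gps(), take_photo(), and post_update() helpers are hypothetical placeholders for the hardware and network calls, not the team's actual code.

```python
import time

REPORT_INTERVAL_SECONDS = 20 * 60  # one status report every twenty minutes

def read_gps():
    """Placeholder for the GPS module: return (latitude, longitude)."""
    return (44.6488, -63.5752)  # Halifax, for illustration

def take_photo():
    """Placeholder for the onboard camera: return captured image bytes."""
    return b"<jpeg bytes>"

def post_update(position, photo):
    """Placeholder for the 3G uplink: send position and photo back home."""
    print(f"reporting from {position} with a {len(photo)}-byte photo")

while True:  # run for the life of the trip
    post_update(read_gps(), take_photo())
    time.sleep(REPORT_INTERVAL_SECONDS)
```

A fixed-interval loop like this is what fed both the public location tracking and the photo stream, and it is also why, as comes up later in the episode, the timing of those photos mattered when people wondered whether hitchBOT's camera caught its own vandal.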
421 00:23:19,320 --> 00:23:23,240 Speaker 1: Yeah, and, um, and, uh, lots of stories of people. 422 00:23:23,640 --> 00:23:26,600 Speaker 1: All of hitchBOT's journeys, by the way, are chronicled on 423 00:23:26,640 --> 00:23:29,399 Speaker 1: the website. There are blog posts that tell what happened 424 00:23:29,440 --> 00:23:32,520 Speaker 1: on each day. Some of them also have embedded videos 425 00:23:33,200 --> 00:23:37,000 Speaker 1: of the stuff that went on, and also photographs. It's 426 00:23:37,119 --> 00:23:40,840 Speaker 1: very cute. Then after Germany, it went to the Netherlands 427 00:23:40,960 --> 00:23:44,960 Speaker 1: for a brief while in the early summer, for 428 00:23:45,040 --> 00:23:48,360 Speaker 1: a bunch of activities and events that I cannot pronounce. Yes, 429 00:23:48,800 --> 00:23:52,080 Speaker 1: I'm not even going to attempt it. A series of festivals 430 00:23:52,119 --> 00:23:55,600 Speaker 1: with unpronounceable names, at least for the American tongue. 431 00:23:55,960 --> 00:23:57,920 Speaker 1: And then, uh, it moved over to the 432 00:23:58,000 --> 00:24:00,760 Speaker 1: good old US of A. Yeah, it started in Boston, right. 433 00:24:00,800 --> 00:24:02,840 Speaker 1: It was gonna go from Boston 434 00:24:02,960 --> 00:24:05,480 Speaker 1: to San Francisco. That was the goal. That was the goal. 435 00:24:05,600 --> 00:24:08,560 Speaker 1: And it also had a bucket list, which is appropriate, 436 00:24:08,640 --> 00:24:10,760 Speaker 1: since it was a bucket. I have the bucket list 437 00:24:10,800 --> 00:24:14,440 Speaker 1: in front of me. Uh, well, the bucket list has 438 00:24:14,440 --> 00:24:16,240 Speaker 1: a couple of check marks on it now. One check 439 00:24:16,280 --> 00:24:18,879 Speaker 1: mark was, um, uh, to do the wave at a 440 00:24:18,920 --> 00:24:21,080 Speaker 1: sports game anywhere; it didn't matter where it was to 441 00:24:21,160 --> 00:24:23,119 Speaker 1: do that. Uh, the other one was to see the 442 00:24:23,200 --> 00:24:25,320 Speaker 1: lights in Times Square, of course, in New York City. 443 00:24:25,920 --> 00:24:28,159 Speaker 1: And there were others, other stuff along the way, 444 00:24:28,200 --> 00:24:29,800 Speaker 1: and I'll just mention a few of these, because there's 445 00:24:29,840 --> 00:24:32,960 Speaker 1: probably, again, different things. The Grand Canyon has to be 446 00:24:33,040 --> 00:24:37,200 Speaker 1: on there. Let's see. Grand Canyon, um, uh, you know, 447 00:24:37,480 --> 00:24:39,560 Speaker 1: I'll have to check. Yes, it does: see the 448 00:24:39,840 --> 00:24:42,160 Speaker 1: jaw-dropping views of the Grand Canyon. That is one. Yes, 449 00:24:42,200 --> 00:24:45,200 Speaker 1: in Arizona. Um, pose with the Lincoln statue in D.C. 450 00:24:45,320 --> 00:24:48,440 Speaker 1: was another one. Tan at Myrtle Beach. Um, experience 451 00:24:48,480 --> 00:24:50,800 Speaker 1: the magic of Walt Disney World in Florida. So it 452 00:24:50,920 --> 00:24:53,520 Speaker 1: was going to need to actually go south along the 453 00:24:53,600 --> 00:24:56,320 Speaker 1: Eastern Seaboard. I mean, for people who are not from 454 00:24:56,320 --> 00:24:58,760 Speaker 1: the United States and aren't familiar with our geography, if 455 00:24:58,800 --> 00:25:01,560 Speaker 1: you were going Boston to San Francisco, you would essentially 456 00:25:01,600 --> 00:25:04,760 Speaker 1: be setting your sights west.
Yeah, just go west. Just 457 00:25:04,840 --> 00:25:08,840 Speaker 1: go, yeah, just go west, and just keep on adjusting 458 00:25:08,960 --> 00:25:11,720 Speaker 1: your journey in order for you to get 459 00:25:11,800 --> 00:25:15,480 Speaker 1: to California. Exactly. But with this list, it means that 460 00:25:15,600 --> 00:25:17,840 Speaker 1: you first would need to go south, 461 00:25:18,400 --> 00:25:20,520 Speaker 1: because you would have to go south from Boston, well, 462 00:25:21,000 --> 00:25:22,639 Speaker 1: to get to New York City, but also to D.C. 463 00:25:22,800 --> 00:25:26,000 Speaker 1: and to Florida. And then this goes all over 464 00:25:26,080 --> 00:25:27,679 Speaker 1: the place, I mean, all over the Midwest. So there 465 00:25:27,720 --> 00:25:31,119 Speaker 1: are things to do in Illinois, like explore the Cloud 466 00:25:31,160 --> 00:25:34,359 Speaker 1: Gate in Millennium Park. Um, stand under the Gateway Arch 467 00:25:34,440 --> 00:25:37,000 Speaker 1: in Missouri. Just all kinds of things like this. And 468 00:25:37,200 --> 00:25:39,280 Speaker 1: again, it's a relatively long list, with 469 00:25:39,440 --> 00:25:44,640 Speaker 1: many different states, many different activities. And it checked off two 470 00:25:44,720 --> 00:25:47,480 Speaker 1: items on that list. Yeah, and the reason that only 471 00:25:47,560 --> 00:25:51,680 Speaker 1: two items were checked off is that hitchBOT met an 472 00:25:51,920 --> 00:25:57,639 Speaker 1: untimely demise at the hands of a vandal. Maliciously murdered. Yes, decapitated. 473 00:25:58,080 --> 00:25:59,960 Speaker 1: Can you just, can you say murdered when it's a robot? 474 00:26:00,000 --> 00:26:02,840 Speaker 1: What, I mean, it was. I mean, disassembled? Yeah, that's 475 00:26:02,880 --> 00:26:07,480 Speaker 1: what Johnny Five was scared about. No disassemble, Johnny Five. Um, yeah. 476 00:26:07,560 --> 00:26:09,600 Speaker 1: So I'm guessing disassembled is probably the best way of 477 00:26:09,640 --> 00:26:11,760 Speaker 1: putting it. At some point, you figure there's gonna be 478 00:26:12,200 --> 00:26:16,359 Speaker 1: another robot getting a delivery with a cardboard box, and 479 00:26:16,440 --> 00:26:20,600 Speaker 1: it'll just be, what's in the box? Uh, and they'll 480 00:26:20,600 --> 00:26:24,280 Speaker 1: have the binary, uh, code for seven as the title 481 00:26:24,320 --> 00:26:27,200 Speaker 1: of that. Very clever, very clever. All right. So this 482 00:26:27,359 --> 00:26:30,159 Speaker 1: is a weird, weird ending. So, um, it is the 483 00:26:30,480 --> 00:26:32,439 Speaker 1: night prior to this, so it's, what, the end 484 00:26:32,440 --> 00:26:34,440 Speaker 1: of July, right? Because I think this all went down 485 00:26:34,520 --> 00:26:37,120 Speaker 1: on August first; that's when we heard. Exactly, so July 486 00:26:37,240 --> 00:26:39,600 Speaker 1: thirty-first. Um, I think it was a Saturday night, 487 00:26:40,080 --> 00:26:45,879 Speaker 1: and hitchBOT was in Philadelphia, hanging out with a 488 00:26:46,280 --> 00:26:49,399 Speaker 1: vlogger by the name of Jesse Wellens. And it is 489 00:26:49,440 --> 00:26:51,560 Speaker 1: well documented what they did during their 490 00:26:51,560 --> 00:26:55,199 Speaker 1: evening, because, you know, Wellens being a YouTube personality, um, 491 00:26:55,600 --> 00:26:57,199 Speaker 1: he took it on the town.
He kind of did 492 00:26:57,240 --> 00:26:58,639 Speaker 1: a lot of different things with it. There were other 493 00:26:58,720 --> 00:27:01,920 Speaker 1: people involved. There was another guy, um, there with him. His 494 00:27:02,080 --> 00:27:05,600 Speaker 1: name was Ed Bassmaster. He was another YouTube personality, 495 00:27:05,960 --> 00:27:08,520 Speaker 1: and they had kind of a fun evening. Yeah. The 496 00:27:08,560 --> 00:27:10,879 Speaker 1: whole idea was that this is, I 497 00:27:10,960 --> 00:27:14,480 Speaker 1: mean, it was elevating hitchBOT's profile and elevating 498 00:27:14,520 --> 00:27:18,280 Speaker 1: the YouTubers' profile. This is a dream come 499 00:27:18,359 --> 00:27:22,080 Speaker 1: true for a YouTube personality, because it gives you the 500 00:27:22,200 --> 00:27:26,840 Speaker 1: chance to interact with a meme while 501 00:27:26,920 --> 00:27:30,280 Speaker 1: it's happening. You're not capitalizing on it afterwards. This is 502 00:27:30,520 --> 00:27:32,320 Speaker 1: like a once-in-a-lifetime type deal. 503 00:27:32,480 --> 00:27:36,920 Speaker 1: It can garner you international attention immediately. Yeah, 504 00:27:37,400 --> 00:27:39,720 Speaker 1: it really would, because people were tracking this thing. People 505 00:27:39,720 --> 00:27:42,399 Speaker 1: were watching exactly where it was, and they knew, you know, 506 00:27:42,480 --> 00:27:44,239 Speaker 1: when it was in their city. They 507 00:27:44,280 --> 00:27:46,239 Speaker 1: knew where it was. They could walk... I mean, if 508 00:27:46,320 --> 00:27:49,080 Speaker 1: it said it's been sitting here at this corner of, 509 00:27:49,440 --> 00:27:51,879 Speaker 1: you know, Main and Elm Street for the last twenty minutes, 510 00:27:52,240 --> 00:27:53,760 Speaker 1: you could go down to Main and Elm Street and 511 00:27:53,800 --> 00:27:55,120 Speaker 1: look at this thing, or pick it up and give 512 00:27:55,119 --> 00:27:56,760 Speaker 1: it a ride yourself if you wanted to. In fact, 513 00:27:57,560 --> 00:27:59,480 Speaker 1: in the early part of the trip, we didn't talk about this, 514 00:27:59,560 --> 00:28:01,720 Speaker 1: but it took a long time for it to leave 515 00:28:01,760 --> 00:28:05,280 Speaker 1: the Boston area. People in Boston were taking it to 516 00:28:05,880 --> 00:28:08,159 Speaker 1: different parks and, you know, they were taking it out on 517 00:28:08,280 --> 00:28:10,600 Speaker 1: boats and things and taking, you know, selfies with it. 518 00:28:10,720 --> 00:28:12,879 Speaker 1: And, um, it took, I think, more than 519 00:28:13,000 --> 00:28:15,520 Speaker 1: seven days for it to get out of the Boston area, 520 00:28:15,600 --> 00:28:19,639 Speaker 1: which I think the team would have found wonderful, because 521 00:28:19,880 --> 00:28:23,000 Speaker 1: what was happening was the robot was gathering 522 00:28:23,800 --> 00:28:28,359 Speaker 1: a series of rich experiences, and the people were gathering 523 00:28:28,440 --> 00:28:30,600 Speaker 1: the experience of interacting with the robot, which was the 524 00:28:30,720 --> 00:28:34,480 Speaker 1: purpose for this thing in the first place. So having 525 00:28:34,520 --> 00:28:36,800 Speaker 1: it take a really long time to get out of any 526 00:28:36,880 --> 00:28:41,280 Speaker 1: area would not be considered an impediment on 527 00:28:41,480 --> 00:28:44,800 Speaker 1: the behalf of the people running the project.
They, I'm 528 00:28:44,880 --> 00:28:47,040 Speaker 1: sure, loved it. Oh yeah, in no way is that 529 00:28:47,120 --> 00:28:49,560 Speaker 1: a failure. That's a win, in fact. So, 530 00:28:50,080 --> 00:28:51,880 Speaker 1: you know, here it is after this fun evening on 531 00:28:52,000 --> 00:28:54,720 Speaker 1: the thirty-first, and they placed it on a bench in the 532 00:28:55,360 --> 00:28:56,880 Speaker 1: middle of the night. You know, it's late at night, 533 00:28:56,960 --> 00:28:59,080 Speaker 1: of course dark. They placed it on a park bench, 534 00:28:59,200 --> 00:29:00,680 Speaker 1: or not a park bench, but a bench on the 535 00:29:00,720 --> 00:29:03,400 Speaker 1: city street there. And that's it. I mean, you see 536 00:29:03,440 --> 00:29:07,000 Speaker 1: a cab driver arrive, I think. Uh, and that's it. 537 00:29:07,080 --> 00:29:09,680 Speaker 1: I mean, that's the end of the video interaction with 538 00:29:10,120 --> 00:29:13,840 Speaker 1: hitchBOT at that point. And the next day we 539 00:29:13,960 --> 00:29:17,360 Speaker 1: wake up to the news that hitchBOT has been murdered. Yes, 540 00:29:18,400 --> 00:29:20,760 Speaker 1: its head had been removed from its torso and its 541 00:29:20,880 --> 00:29:23,200 Speaker 1: arms ripped off. And I gotta ask you this: when 542 00:29:23,240 --> 00:29:26,280 Speaker 1: you saw the photograph of hitchBOT... because now, this is 543 00:29:26,320 --> 00:29:29,040 Speaker 1: a dramatic photograph. I mean, it did look 544 00:29:29,040 --> 00:29:31,960 Speaker 1: a lot like a true crime scene photo, in that, 545 00:29:32,640 --> 00:29:35,200 Speaker 1: uh, and as horrific as this may be, I mean, hitchBOT's 546 00:29:35,280 --> 00:29:38,440 Speaker 1: arms were pulled off and placed above its head, and, um, 547 00:29:38,520 --> 00:29:40,560 Speaker 1: it was lying in a pile of leaves, headless 548 00:29:40,600 --> 00:29:43,400 Speaker 1: at this point. I mean, there are real crime scene 549 00:29:43,440 --> 00:29:45,840 Speaker 1: photos like that. Now, understand, it was 550 00:29:45,920 --> 00:29:48,760 Speaker 1: starting to feel like this is the robot 551 00:29:48,800 --> 00:29:51,480 Speaker 1: equivalent of a serial killer crime scene. So it 552 00:29:51,600 --> 00:29:53,880 Speaker 1: was laid out in this, like, staged manner, 553 00:29:54,600 --> 00:29:57,920 Speaker 1: and, uh, very, very showy, I guess. And there 554 00:29:57,960 --> 00:30:00,800 Speaker 1: were actually websites or, you know, blogs that would say, 555 00:30:01,400 --> 00:30:03,480 Speaker 1: I don't really feel comfortable showing you this image. Which 556 00:30:03,520 --> 00:30:06,120 Speaker 1: is weird, because here it's just a bucket with a 557 00:30:06,160 --> 00:30:08,920 Speaker 1: couple of noodle arms and some rubber boots. Right? 558 00:30:08,960 --> 00:30:12,080 Speaker 1: If you saw this same collection of stuff in a 559 00:30:12,560 --> 00:30:15,680 Speaker 1: hardware store, you would just think, oh, somebody just left 560 00:30:15,760 --> 00:30:18,440 Speaker 1: their random shopping right here. That's pretty funny. I never 561 00:30:18,480 --> 00:30:20,680 Speaker 1: thought of it that way. Yeah, it's like, yeah, you 562 00:30:20,760 --> 00:30:23,080 Speaker 1: could gather the stuff up at Walmart and put it 563 00:30:23,160 --> 00:30:25,400 Speaker 1: on the floor and not think twice about it.
But 564 00:30:25,520 --> 00:30:27,680 Speaker 1: now that we know that this is, uh, this 565 00:30:27,840 --> 00:30:30,840 Speaker 1: thing that they've created that has a name, uh, 566 00:30:31,040 --> 00:30:33,080 Speaker 1: now it takes on a different twist, doesn't it? 567 00:30:33,120 --> 00:30:36,720 Speaker 1: It takes on a different feel. Which, again, you might say, 568 00:30:37,560 --> 00:30:41,080 Speaker 1: while it brings that particular part of 569 00:30:41,120 --> 00:30:45,320 Speaker 1: the experiment to an end, it also says a lot, right? 570 00:30:45,480 --> 00:30:48,480 Speaker 1: It also tells you a lot about robot-human interactions 571 00:30:48,520 --> 00:30:50,440 Speaker 1: and where they can go. It sure does. And 572 00:30:51,080 --> 00:30:53,280 Speaker 1: right away you would think, well, of course Wellens and 573 00:30:53,400 --> 00:30:55,920 Speaker 1: Bassmaster have done this. Those are the two people, 574 00:30:56,000 --> 00:30:58,520 Speaker 1: the two characters that were involved with this last. And 575 00:30:58,600 --> 00:31:01,640 Speaker 1: then, lo and behold, three days later, two days 576 00:31:01,720 --> 00:31:04,959 Speaker 1: later or whatever, there comes this, uh, secret surveillance 577 00:31:05,040 --> 00:31:07,080 Speaker 1: video that was taken from a nearby store 578 00:31:07,160 --> 00:31:09,480 Speaker 1: that shows what happened. And someone wanders up in a 579 00:31:09,520 --> 00:31:13,600 Speaker 1: football jersey and just kicks the living heck out of 580 00:31:13,640 --> 00:31:16,920 Speaker 1: this thing and destroys it. But it's shown just 581 00:31:17,040 --> 00:31:20,040 Speaker 1: off screen. We don't actually see the person kicking and 582 00:31:20,160 --> 00:31:23,240 Speaker 1: destroying this thing. We see him kicking and destroying 583 00:31:23,320 --> 00:31:26,480 Speaker 1: in the area of the bench where hitchBOT was 584 00:31:26,600 --> 00:31:30,560 Speaker 1: known to be last, right. And then later, uh, everyone 585 00:31:30,720 --> 00:31:34,680 Speaker 1: was reporting that this video itself was essentially staged. Yeah, 586 00:31:34,720 --> 00:31:36,280 Speaker 1: it's a fake video, because you can go to that 587 00:31:36,400 --> 00:31:38,600 Speaker 1: scene and look at exactly where it was taken from, the 588 00:31:38,640 --> 00:31:41,920 Speaker 1: same perspective. There's no camera there. Yeah, so it's a fake. 589 00:31:42,360 --> 00:31:44,640 Speaker 1: And so what's going on here? Because they never have 590 00:31:44,840 --> 00:31:47,640 Speaker 1: found the head of hitchBOT. I mean, they never found 591 00:31:47,720 --> 00:31:50,960 Speaker 1: the, I guess, the CPU, the thing that would 592 00:31:51,000 --> 00:31:54,120 Speaker 1: tell them... The tablet PC. Yeah, that would tell them what 593 00:31:54,360 --> 00:31:57,120 Speaker 1: was happening, like, what happened to it. Yeah. So 594 00:31:57,760 --> 00:32:01,720 Speaker 1: the best guess is just that somebody scavenged it. But, 595 00:32:01,920 --> 00:32:03,960 Speaker 1: you know, in fact, the people behind the scenes, 596 00:32:04,040 --> 00:32:07,440 Speaker 1: the people behind the project, have said, we don't 597 00:32:07,560 --> 00:32:10,000 Speaker 1: care to identify the person who did it or 598 00:32:10,080 --> 00:32:12,520 Speaker 1: why they did it.
That's not important to what we 599 00:32:12,600 --> 00:32:15,480 Speaker 1: were trying to do. With it taking photographs every twenty minutes, 600 00:32:15,520 --> 00:32:17,800 Speaker 1: I wonder if it captured something and sent it without 601 00:32:17,960 --> 00:32:21,440 Speaker 1: the person, you know, the perp, knowing what 602 00:32:21,600 --> 00:32:24,560 Speaker 1: had happened, right? Or if it just happened 603 00:32:24,560 --> 00:32:27,200 Speaker 1: to fall within that time frame between when the photos 604 00:32:27,240 --> 00:32:29,920 Speaker 1: are taken and didn't capture anything, right, and it's just 605 00:32:30,000 --> 00:32:33,360 Speaker 1: like everybody else: they don't know anything. It's possible either way. Uh, 606 00:32:33,600 --> 00:32:36,400 Speaker 1: you know, of course, there was a huge reaction 607 00:32:36,480 --> 00:32:39,560 Speaker 1: to this, both ways. Yeah. Yeah, there were people who 608 00:32:39,680 --> 00:32:42,760 Speaker 1: were saying, this is awful, this is the worst. You know, 609 00:32:43,360 --> 00:32:47,000 Speaker 1: it really reflects poorly on the United States that a 610 00:32:47,280 --> 00:32:51,280 Speaker 1: robot that was capable of safely traveling from one coast 611 00:32:51,320 --> 00:32:53,840 Speaker 1: of Canada to the other, and also in Germany and 612 00:32:53,920 --> 00:32:57,959 Speaker 1: also in the Netherlands, gets barely into its journey here 613 00:32:57,960 --> 00:33:01,520 Speaker 1: in the United States before it's destroyed. That's seven days. Yeah, 614 00:33:01,680 --> 00:33:08,880 Speaker 1: that was a telling, um, condemnation, if you will, 615 00:33:09,160 --> 00:33:12,120 Speaker 1: of the United States in general and Philadelphia in particular. 616 00:33:12,560 --> 00:33:14,520 Speaker 1: You can think of all the different comments that were 617 00:33:14,560 --> 00:33:16,880 Speaker 1: immediately happening afterwards. A lot of people would say, like, well, 618 00:33:16,960 --> 00:33:20,000 Speaker 1: this happens to real hitchhikers as well, it happens to 619 00:33:20,080 --> 00:33:22,160 Speaker 1: people. Well, and then there were a ton of comments 620 00:33:22,240 --> 00:33:25,040 Speaker 1: that said, well, of course this happened in Philadelphia. Yeah, 621 00:33:25,080 --> 00:33:27,040 Speaker 1: there's that. And there's another group of people that would 622 00:33:27,080 --> 00:33:29,440 Speaker 1: just respond with something like, big deal, it was 623 00:33:29,480 --> 00:33:33,400 Speaker 1: a bucket of bolts anyway, it was a machine. Yeah, 624 00:33:33,480 --> 00:33:37,680 Speaker 1: and I mean, again, this ties right back 625 00:33:37,720 --> 00:33:41,720 Speaker 1: into those robot-human interactions. You might think the experiment's over. 626 00:33:41,840 --> 00:33:44,680 Speaker 1: I would argue that the team probably says, no, it's 627 00:33:44,680 --> 00:33:47,480 Speaker 1: still going. It doesn't matter if the robot is gone.
628 00:33:48,160 --> 00:33:53,360 Speaker 1: The continuing conversation around this is still informing us and 629 00:33:53,560 --> 00:33:58,320 Speaker 1: still giving us a lot more data about how humans 630 00:33:58,480 --> 00:34:03,680 Speaker 1: view robots, how we can identify with them, how we can 631 00:34:03,760 --> 00:34:06,760 Speaker 1: imprint an emotional response onto them. And in fact, they 632 00:34:06,840 --> 00:34:08,920 Speaker 1: have tweeted out, you know, continuing to tweet out as 633 00:34:08,960 --> 00:34:11,799 Speaker 1: the robot, saying the robot still loves people, still loved 634 00:34:11,840 --> 00:34:15,359 Speaker 1: its adventures, which I think needles it even more, right? 635 00:34:15,400 --> 00:34:17,759 Speaker 1: You made the robot this innocent creature. It reminds me 636 00:34:17,840 --> 00:34:21,839 Speaker 1: a lot of, um, the various Mars rovers, or 637 00:34:21,880 --> 00:34:25,600 Speaker 1: the Phoenix lander specifically, when it was 638 00:34:25,800 --> 00:34:27,919 Speaker 1: nearing the end of its mission. Like, it had gone 639 00:34:28,000 --> 00:34:31,600 Speaker 1: well beyond what the mission parameters were, and it got 640 00:34:31,840 --> 00:34:34,200 Speaker 1: to a point where it was no longer going to 641 00:34:34,320 --> 00:34:37,600 Speaker 1: get enough sunlight to recharge its batteries, and essentially 642 00:34:37,680 --> 00:34:40,920 Speaker 1: the social media team at NASA sent out a final 643 00:34:41,040 --> 00:34:44,320 Speaker 1: tweet from the robot, keeping in mind the robot was 644 00:34:44,520 --> 00:34:47,280 Speaker 1: never tweeting directly. It was always a human being taking 645 00:34:47,360 --> 00:34:51,160 Speaker 1: data from the robot and then messaging it out. But 646 00:34:51,280 --> 00:34:54,520 Speaker 1: people identified with that robot, and when that last tweet 647 00:34:54,600 --> 00:34:58,480 Speaker 1: came out, people cried. They're watching this thing and 648 00:34:58,520 --> 00:35:01,640 Speaker 1: they're thinking, they're almost making it into, like, it's 649 00:35:01,640 --> 00:35:03,759 Speaker 1: like you're watching a dog or something like that 650 00:35:03,880 --> 00:35:06,279 Speaker 1: that's dying. That thing right there is dying and I'm 651 00:35:06,320 --> 00:35:10,480 Speaker 1: watching it happen. And, um, now, 652 00:35:10,520 --> 00:35:12,680 Speaker 1: I know it's way more complex, but it's not like, 653 00:35:13,400 --> 00:35:15,320 Speaker 1: well, it's time to get a new toaster, right? You know, 654 00:35:15,440 --> 00:35:17,880 Speaker 1: it's not like a machine that you don't really have 655 00:35:18,000 --> 00:35:22,520 Speaker 1: any kind of personal attachment to. Like, if my 656 00:35:22,680 --> 00:35:26,600 Speaker 1: microwave were to malfunction, I'd think, what a pain 657 00:35:26,640 --> 00:35:27,680 Speaker 1: in the butt, I have to go and get a 658 00:35:27,719 --> 00:35:29,960 Speaker 1: new one. But you're not gonna cry. No. But, and 659 00:35:30,200 --> 00:35:33,880 Speaker 1: again, that goes right back into that idea of 660 00:35:34,000 --> 00:35:37,279 Speaker 1: how do robots and humans interact? And we do 661 00:35:37,480 --> 00:35:40,719 Speaker 1: need to think about this, because if we enter into 662 00:35:40,800 --> 00:35:44,480 Speaker 1: a world where we start treating these relationships as really casual,
663 00:35:45,080 --> 00:35:51,280 Speaker 1: when stuff happens and people go through an actual grieving experience, 664 00:35:52,480 --> 00:35:54,120 Speaker 1: we won't be ready for it. But if we know 665 00:35:54,239 --> 00:35:56,120 Speaker 1: ahead of time, we can say, all right, you know 666 00:35:56,239 --> 00:35:58,799 Speaker 1: what, we know this about ourselves. This is something 667 00:35:58,840 --> 00:36:03,640 Speaker 1: that is innately human, at least for many people. Yeah, 668 00:36:03,760 --> 00:36:06,600 Speaker 1: and then you think, all right, now I 669 00:36:06,680 --> 00:36:10,040 Speaker 1: can design a product and market a product, 670 00:36:10,200 --> 00:36:14,600 Speaker 1: and do it in a responsible way that doesn't 671 00:36:15,360 --> 00:36:18,840 Speaker 1: say this is a weird, you know, aberration or anything. No, 672 00:36:19,000 --> 00:36:21,120 Speaker 1: this is a very human kind of trait that a 673 00:36:21,239 --> 00:36:25,000 Speaker 1: lot of people have. Whatever it elicits from the humans, that's 674 00:36:25,000 --> 00:36:27,439 Speaker 1: what you're looking for. Yeah. And so, or at least 675 00:36:27,520 --> 00:36:29,719 Speaker 1: that you account for it, even if that's not the 676 00:36:29,920 --> 00:36:32,600 Speaker 1: purpose of whatever it is you're making, you at least 677 00:36:32,640 --> 00:36:35,560 Speaker 1: account for the fact that it exists. Scott and I 678 00:36:35,640 --> 00:36:37,839 Speaker 1: will be right back with more about the sad tale 679 00:36:37,840 --> 00:36:49,920 Speaker 1: of Hitchbot after this. In the fallout of all of this, 680 00:36:50,960 --> 00:36:54,960 Speaker 1: we've seen not just the condemnation of an act of 681 00:36:55,080 --> 00:36:59,520 Speaker 1: violence against what seemed to be an innocent creature that 682 00:36:59,719 --> 00:37:03,560 Speaker 1: loved adventure and meeting people, or, as Smith had called it, 683 00:37:03,920 --> 00:37:08,000 Speaker 1: a story-collecting and story-generating machine, like that was 684 00:37:08,120 --> 00:37:13,680 Speaker 1: its purpose. We've also seen people come together to say, 685 00:37:14,000 --> 00:37:16,280 Speaker 1: we can't let this be the end of the story. 686 00:37:16,840 --> 00:37:18,920 Speaker 1: Um, there are a couple of different groups in 687 00:37:18,960 --> 00:37:24,239 Speaker 1: Philadelphia that had said, let us build a robot to 688 00:37:25,360 --> 00:37:28,480 Speaker 1: continue on in the spirit of Hitchbot, because I can't 689 00:37:28,600 --> 00:37:31,760 Speaker 1: stand for Hitchbot's story to have ended in the city 690 00:37:31,840 --> 00:37:34,799 Speaker 1: that I call home. Let's have a chance 691 00:37:34,840 --> 00:37:37,160 Speaker 1: to make this right. Yeah, and we're gonna do it. Yeah. 692 00:37:37,320 --> 00:37:41,759 Speaker 1: So the tech community in Philadelphia has responded 693 00:37:41,840 --> 00:37:46,960 Speaker 1: with this, and actually received a more or less tentative 694 00:37:47,280 --> 00:37:49,879 Speaker 1: thumbs up from some of the members of the Hitchbot 695 00:37:49,920 --> 00:37:54,000 Speaker 1: team to give this a go. And so there's 696 00:37:54,040 --> 00:37:59,640 Speaker 1: a group, um, that gathered at the Hacktory. The Hacktory, 697 00:38:00,000 --> 00:38:03,680 Speaker 1: it's like a hack factory in West Philadelphia. Um, this 698 00:38:03,920 --> 00:38:06,279 Speaker 1: was very recent, when this happened.
Uh, 699 00:38:06,920 --> 00:38:09,000 Speaker 1: and they have come up with an idea that they're 700 00:38:09,040 --> 00:38:12,759 Speaker 1: calling the Philly Love Bot, which... I don't like the 701 00:38:12,800 --> 00:38:17,480 Speaker 1: sound of that. Yeah, an odd choice for a name. Wait, 702 00:38:18,000 --> 00:38:20,040 Speaker 1: but it's the city of brotherly love. Okay. I don't want to... 703 00:38:20,520 --> 00:38:22,279 Speaker 1: I don't wanna take this in the wrong direction or anything, 704 00:38:22,360 --> 00:38:24,600 Speaker 1: but we're not talking about a sexbot. Okay, okay, 705 00:38:24,840 --> 00:38:26,239 Speaker 1: a sexbot. I didn't want to just come right out 706 00:38:26,280 --> 00:38:27,719 Speaker 1: and say it, but it seems like I've heard of 707 00:38:27,760 --> 00:38:30,680 Speaker 1: products like this. Yeah, no, not that kind of love. 708 00:38:30,760 --> 00:38:34,400 Speaker 1: This is more of a love-for-all-mankind- 709 00:38:34,520 --> 00:38:36,880 Speaker 1: and-robots kind of love. What kind of podcast 710 00:38:36,880 --> 00:38:38,719 Speaker 1: have I wandered into? Yeah, no, I'm not going to do 711 00:38:38,800 --> 00:38:41,200 Speaker 1: another bait and switch on you, Scott. I already promised 712 00:38:41,239 --> 00:38:45,200 Speaker 1: you I wasn't gonna do that. Not doing it this time, honest. 713 00:38:45,880 --> 00:38:50,120 Speaker 1: So they said, um, the idea they have is 714 00:38:50,200 --> 00:38:53,400 Speaker 1: they would build a robot that was designed to be 715 00:38:53,560 --> 00:38:56,240 Speaker 1: passed from one person to another, so it's not designed 716 00:38:56,360 --> 00:39:00,080 Speaker 1: to hitchhike from one location to another location. There's no 717 00:39:00,320 --> 00:39:04,960 Speaker 1: location requirement, at least in their initial approach. Instead, what 718 00:39:05,160 --> 00:39:08,120 Speaker 1: they want is to design a robot so that when you 719 00:39:08,400 --> 00:39:10,480 Speaker 1: take possession of it, when someone gives it 720 00:39:10,560 --> 00:39:14,279 Speaker 1: to you, you are tasked with performing a good deed, 721 00:39:14,680 --> 00:39:18,120 Speaker 1: however you define it, and it gets documented by the 722 00:39:18,280 --> 00:39:20,960 Speaker 1: robot itself. The robot you take along with you to 723 00:39:21,120 --> 00:39:24,160 Speaker 1: do whatever this good deed may be, and then 724 00:39:24,239 --> 00:39:26,240 Speaker 1: you pass it on. It's like a pay-it-forward. 725 00:39:26,320 --> 00:39:28,279 Speaker 1: You pass the robot on to someone else, and it is 726 00:39:28,360 --> 00:39:31,000 Speaker 1: their duty now to go out and do a good deed. 727 00:39:31,480 --> 00:39:34,560 Speaker 1: And the idea is to kind of atone for the 728 00:39:34,719 --> 00:39:39,520 Speaker 1: horrible murder of Hitchbot by promoting good deeds, and the 729 00:39:39,680 --> 00:39:42,320 Speaker 1: robot is kind of, almost like, a totem 730 00:39:42,880 --> 00:39:44,600 Speaker 1: for that. I was gonna say, I like this idea, 731 00:39:44,640 --> 00:39:46,120 Speaker 1: but you could do the same thing with a 732 00:39:46,200 --> 00:39:49,640 Speaker 1: carved stick. You could hand a carved stick to somebody: 733 00:39:50,400 --> 00:39:52,239 Speaker 1: now that you're in possession of the stick, it's your 734 00:39:52,320 --> 00:39:54,080 Speaker 1: duty to do a good deed and pass the stick 735 00:39:54,160 --> 00:39:56,719 Speaker 1: on to somebody else.
It doesn't have to be, uh, something 736 00:39:56,760 --> 00:39:58,800 Speaker 1: that collects and gathers the information. But I guess that 737 00:39:58,960 --> 00:40:01,040 Speaker 1: keeps everybody kind of honest, doesn't it? Well, yeah. And 738 00:40:01,120 --> 00:40:03,759 Speaker 1: I think also, you know, they're also 739 00:40:03,840 --> 00:40:07,279 Speaker 1: planning on having the robot, which I am going to 740 00:40:07,360 --> 00:40:09,600 Speaker 1: guess is going to be really another computer, 741 00:40:09,680 --> 00:40:11,560 Speaker 1: not so much a robot, they're going to have it 742 00:40:11,719 --> 00:40:15,920 Speaker 1: capable of interacting with you, just as Hitchbot could. So, 743 00:40:16,040 --> 00:40:19,200 Speaker 1: in other words, there will still be that robot-human 744 00:40:19,280 --> 00:40:23,799 Speaker 1: interaction element that will play a part in this experiment, 745 00:40:24,400 --> 00:40:28,759 Speaker 1: but the nature of the overall experiment, the 746 00:40:29,320 --> 00:40:32,839 Speaker 1: perceived purpose, will be different. Now, isn't this funny? Because 747 00:40:32,840 --> 00:40:34,800 Speaker 1: I wonder what some people are going to consider a 748 00:40:34,880 --> 00:40:37,320 Speaker 1: good deed, too, because there might be some comical examples 749 00:40:37,360 --> 00:40:40,360 Speaker 1: of what people consider to be their good deed for humanity. 750 00:40:40,840 --> 00:40:43,439 Speaker 1: Like... yeah, I imagine we would 751 00:40:43,480 --> 00:40:45,839 Speaker 1: see everything from someone saying, all right, I'm gonna take 752 00:40:45,920 --> 00:40:49,719 Speaker 1: this robot with me while my company and I 753 00:40:49,840 --> 00:40:52,040 Speaker 1: go out and we clean up a neighborhood, that 754 00:40:52,080 --> 00:40:54,000 Speaker 1: could be one, or it could be, I'm going to 755 00:40:54,080 --> 00:40:55,759 Speaker 1: set this robot here on the corner so it can 756 00:40:55,840 --> 00:40:58,719 Speaker 1: watch me as I stop traffic so this mama duck 757 00:40:58,760 --> 00:41:00,759 Speaker 1: and her baby ducks can get across the street. It 758 00:41:00,760 --> 00:41:03,720 Speaker 1: could be anything. Yeah, like, oh man, you almost spilled 759 00:41:03,760 --> 00:41:07,480 Speaker 1: your beer, but I saved you. I'm passing this 760 00:41:07,560 --> 00:41:11,759 Speaker 1: thing on. And see, that also, again... look, 761 00:41:11,800 --> 00:41:13,960 Speaker 1: if you look at it as an experiment, 762 00:41:15,000 --> 00:41:18,600 Speaker 1: that's still meaningful data, right? That's true. It's interesting, 763 00:41:18,840 --> 00:41:21,040 Speaker 1: you know, it's kind of like a joke, but it's 764 00:41:21,080 --> 00:41:24,680 Speaker 1: also... that's humanity too. That's true. And that's exactly what 765 00:41:24,920 --> 00:41:27,279 Speaker 1: the initial goal of this whole thing was. I mean, 766 00:41:27,320 --> 00:41:29,880 Speaker 1: it's to see what happens. It wasn't the goal of 767 00:41:30,000 --> 00:41:32,080 Speaker 1: getting this thing across Canada, because they could put it in 768 00:41:32,120 --> 00:41:34,320 Speaker 1: a box and ship it if they wanted, or just have 769 00:41:34,440 --> 00:41:36,120 Speaker 1: a trucking company haul it, like we said, you know, 770 00:41:36,200 --> 00:41:39,360 Speaker 1: one shot, all the way straight across. But the idea was to 771 00:41:39,440 --> 00:41:41,560 Speaker 1: see what happens along the way.
It's like the journey 772 00:41:41,680 --> 00:41:44,920 Speaker 1: is better than the destination. Exactly. Yeah. And it's 773 00:41:44,960 --> 00:41:49,360 Speaker 1: those experiences that were important, and that documenting of culture. And 774 00:41:49,640 --> 00:41:52,600 Speaker 1: we're talking about an emerging culture now, not just tradition, 775 00:41:52,840 --> 00:41:56,759 Speaker 1: not just the embedded culture that's been around for generations. 776 00:41:57,080 --> 00:42:00,640 Speaker 1: We're talking about an emerging culture of technology and 777 00:42:01,160 --> 00:42:04,399 Speaker 1: our daily lives intermingling on a level that is 778 00:42:04,600 --> 00:42:08,040 Speaker 1: unprecedented. We've never seen it like that, and it 779 00:42:08,400 --> 00:42:14,320 Speaker 1: grows every day. So fascinating, really a fascinating experiment. I 780 00:42:14,360 --> 00:42:16,600 Speaker 1: wouldn't call it a failure at all. I mean, I'm 781 00:42:16,680 --> 00:42:19,440 Speaker 1: sad that Hitchbot didn't get further along in its 782 00:42:19,520 --> 00:42:22,040 Speaker 1: journey, so that more people could experience it and that 783 00:42:22,719 --> 00:42:27,200 Speaker 1: we could have more stories. But it's all right, because 784 00:42:27,320 --> 00:42:30,680 Speaker 1: the story continues. It's just the Hitchbot chapter is over. 785 00:42:31,440 --> 00:42:35,040 Speaker 1: So when you look at it that way, it's actually 786 00:42:35,480 --> 00:42:38,320 Speaker 1: really interesting and inspiring. And, you know, of course, you 787 00:42:38,400 --> 00:42:41,680 Speaker 1: might say, well, I hope that the next robot meets 788 00:42:41,760 --> 00:42:44,720 Speaker 1: with more success and doesn't have the same kind of encounter. 789 00:42:45,360 --> 00:42:47,759 Speaker 1: But if we do see these kinds of encounters happen 790 00:42:47,800 --> 00:42:50,239 Speaker 1: again and again, then we have new questions to ask, like, 791 00:42:50,640 --> 00:42:53,800 Speaker 1: why is this happening? Uh, you know, what 792 00:42:53,880 --> 00:42:57,160 Speaker 1: are the motivations behind it? Are there things we need 793 00:42:57,239 --> 00:43:00,239 Speaker 1: to look at as a society, not 794 00:43:00,600 --> 00:43:04,040 Speaker 1: because we want to protect robots, but are there 795 00:43:04,120 --> 00:43:07,960 Speaker 1: underlying issues that this is just an indicator of, 796 00:43:08,120 --> 00:43:10,040 Speaker 1: and maybe there are things we need to fix? Is it 797 00:43:10,120 --> 00:43:14,239 Speaker 1: some real anger, some deep-seated anger against robots? Or 798 00:43:14,600 --> 00:43:18,719 Speaker 1: even just one of those situations where clearly the 799 00:43:19,160 --> 00:43:21,799 Speaker 1: person who was trying to scavenge it wanted to get 800 00:43:21,840 --> 00:43:24,279 Speaker 1: it for the parts to sell for some reason? And 801 00:43:24,400 --> 00:43:26,719 Speaker 1: well, if that's in fact the answer, then 802 00:43:26,760 --> 00:43:28,960 Speaker 1: you might say, all right, you know, this 803 00:43:29,120 --> 00:43:34,040 Speaker 1: is yet another indicator that there are conditions that maybe 804 00:43:34,120 --> 00:43:36,600 Speaker 1: we should look at and really talk about.
And yes, 805 00:43:36,719 --> 00:43:39,520 Speaker 1: this is a kind of trivial way of highlighting that, 806 00:43:39,680 --> 00:43:42,760 Speaker 1: and it's stuff that we already know, but it's another 807 00:43:42,920 --> 00:43:46,160 Speaker 1: way to say, think about this. I mean, we're really 808 00:43:46,680 --> 00:43:49,520 Speaker 1: talking about compassion, on a level that 809 00:43:49,719 --> 00:43:52,640 Speaker 1: is a very, you know, human trait, a very innate 810 00:43:52,680 --> 00:43:55,440 Speaker 1: trait in us. Maybe we should apply that to our 811 00:43:55,520 --> 00:43:58,920 Speaker 1: fellow humans, not just to the robots. Even 812 00:43:59,000 --> 00:44:01,800 Speaker 1: the human that caused damage to the robot, 813 00:44:02,080 --> 00:44:05,759 Speaker 1: we should show compassion to, because we don't know the 814 00:44:05,880 --> 00:44:08,880 Speaker 1: reason behind it, and there may be reasons that we 815 00:44:09,120 --> 00:44:12,040 Speaker 1: can't even identify with because we're not in that situation, 816 00:44:12,640 --> 00:44:15,480 Speaker 1: and that's all the more reason to show compassion. And 817 00:44:15,560 --> 00:44:17,920 Speaker 1: that's exactly what they... again, I keep going back to this, 818 00:44:18,040 --> 00:44:20,200 Speaker 1: but that's exactly what they were looking for when they 819 00:44:20,239 --> 00:44:23,040 Speaker 1: started this whole experiment a couple of years ago. So 820 00:44:23,120 --> 00:44:25,960 Speaker 1: this has really been fascinating, and I can't wait to 821 00:44:26,040 --> 00:44:29,160 Speaker 1: see what the next phase will bring us. Can 822 00:44:29,160 --> 00:44:30,880 Speaker 1: I ask you one question before we leave here? 823 00:44:30,960 --> 00:44:33,719 Speaker 1: We had discussed this, but only briefly, and we 824 00:44:33,760 --> 00:44:37,560 Speaker 1: didn't really get into much detail. But, um, had you 825 00:44:37,719 --> 00:44:40,480 Speaker 1: not known about Hitchbot, right, had you never heard of 826 00:44:40,560 --> 00:44:42,960 Speaker 1: this whole thing, and you passed it on the 827 00:44:43,040 --> 00:44:46,279 Speaker 1: city streets driving, would you stop to pick it up? 828 00:44:46,640 --> 00:44:50,080 Speaker 1: Not a chance at all that I would stop. Because, 829 00:44:51,320 --> 00:44:53,400 Speaker 1: and you made me think about this, because I was 830 00:44:53,480 --> 00:44:56,880 Speaker 1: coming at it from the perspective of knowing about Hitchbot. 831 00:44:57,360 --> 00:45:00,880 Speaker 1: If I saw Hitchbot, I'd think, holy crap, there's 832 00:45:00,920 --> 00:45:03,520 Speaker 1: the hitchhiking robot. We've got to take part in this. 833 00:45:03,840 --> 00:45:06,439 Speaker 1: This is something special. And I feel the exact same way. 834 00:45:06,560 --> 00:45:09,520 Speaker 1: But not knowing about Hitchbot, not knowing about it, and 835 00:45:09,680 --> 00:45:14,279 Speaker 1: seeing a bucket that has electronics attached to it, even 836 00:45:14,400 --> 00:45:17,839 Speaker 1: with the happy face, maybe particularly with the happy face, 837 00:45:18,480 --> 00:45:21,440 Speaker 1: I might think, oh, this is like a suspicious 838 00:45:21,480 --> 00:45:24,239 Speaker 1: device or something. I thought, I said, it looks an 839 00:45:24,280 --> 00:45:26,680 Speaker 1: awful lot like an I.E.D. Yeah, I thought, 840 00:45:27,040 --> 00:45:28,880 Speaker 1: it's not too far off in the description.
I know 841 00:45:29,000 --> 00:45:32,719 Speaker 1: that they're a little more, um, I guess, camouflaged in 842 00:45:32,719 --> 00:45:35,320 Speaker 1: the way that they typically do those things. But this 843 00:45:35,560 --> 00:45:38,160 Speaker 1: just seems to me like not a good idea, to 844 00:45:38,239 --> 00:45:40,000 Speaker 1: pick up something, you know, something like this, up 845 00:45:40,040 --> 00:45:41,560 Speaker 1: on the street, and, you know, strap it in the 846 00:45:41,640 --> 00:45:44,560 Speaker 1: car next to your kids. Yeah, not really. But, I mean, 847 00:45:44,640 --> 00:45:47,200 Speaker 1: knowing what it is, yeah, of course you'd want to 848 00:45:47,239 --> 00:45:48,480 Speaker 1: do that. You'd want to, you know, it'd be a 849 00:45:48,560 --> 00:45:50,440 Speaker 1: great experience for you and your family, you know, to 850 00:45:50,800 --> 00:45:52,520 Speaker 1: do something with it. Even if you drive it 851 00:45:52,920 --> 00:45:55,960 Speaker 1: a mile or five miles or whatever, just take a 852 00:45:56,080 --> 00:45:57,840 Speaker 1: quick photograph with it and say you were part of 853 00:45:57,920 --> 00:45:59,960 Speaker 1: that journey. That's kind of cool. It's actually kind of 854 00:46:00,080 --> 00:46:04,640 Speaker 1: interesting to me that Hitchbot spent seven days in Boston, 855 00:46:05,840 --> 00:46:11,719 Speaker 1: because Boston is also where we had the Aqua Teen Hunger 856 00:46:11,800 --> 00:46:15,120 Speaker 1: Force Mooninite bomb scare. It was in two thousand 857 00:46:15,239 --> 00:46:18,319 Speaker 1: seven, and it was in Boston. It was in Boston, yeah, yeah, 858 00:46:18,400 --> 00:46:21,719 Speaker 1: that's with the neon... or no, I'm sorry, LED. Yeah, 859 00:46:21,840 --> 00:46:24,279 Speaker 1: um, well, yeah, you can tell them what it was. Yeah, 860 00:46:24,320 --> 00:46:26,319 Speaker 1: the LED... There were these two 861 00:46:26,480 --> 00:46:30,600 Speaker 1: characters from Aqua Teen Hunger Force, these two Mooninites from the Moon. 862 00:46:30,719 --> 00:46:34,600 Speaker 1: They look like, uh, like 8-bit characters from 863 00:46:34,640 --> 00:46:37,279 Speaker 1: a really crappy video game. They're specifically made to 864 00:46:37,360 --> 00:46:40,200 Speaker 1: look like that. They're two-dimensional. When they 865 00:46:40,280 --> 00:46:42,879 Speaker 1: turn sideways, you don't see them anymore because they're gone. 866 00:46:42,960 --> 00:46:46,600 Speaker 1: And they're hilarious. They are hilarious. They are incredibly inappropriate, 867 00:46:46,800 --> 00:46:49,760 Speaker 1: as is everything on Aqua Teen Hunger Force, but they are hilarious. 868 00:46:50,200 --> 00:46:56,880 Speaker 1: And there was a publicity stunt where, uh, these LED 869 00:46:56,920 --> 00:46:59,560 Speaker 1: signs of the two characters were put up 870 00:46:59,600 --> 00:47:03,960 Speaker 1: in various locations, and in Boston it caused a bomb scare.
871 00:47:04,040 --> 00:47:07,640 Speaker 1: People thought that maybe it was the indication of an 872 00:47:07,719 --> 00:47:12,279 Speaker 1: explosive device nearby, and so they were dismantled, and it 873 00:47:12,560 --> 00:47:16,840 Speaker 1: very quickly became kind of a joke slash a discussion 874 00:47:16,880 --> 00:47:19,440 Speaker 1: about how you have to be very careful in the way 875 00:47:19,480 --> 00:47:23,680 Speaker 1: you present these kinds of guerrilla marketing attempts, because in 876 00:47:23,800 --> 00:47:26,480 Speaker 1: a post-nine-eleven world, they can be misinterpreted. 877 00:47:26,560 --> 00:47:30,399 Speaker 1: It was post-nine-eleven and pre-marathon-bombing too, yeah, 878 00:47:30,560 --> 00:47:32,799 Speaker 1: so it's kind of in between. But, um, they were 879 00:47:33,000 --> 00:47:35,280 Speaker 1: on high alert there for a while about these signs, 880 00:47:35,320 --> 00:47:38,920 Speaker 1: and then, you know, sheepishly, Cartoon Network had to say, oh, 881 00:47:39,360 --> 00:47:41,560 Speaker 1: that was us, yeah, and here's what happened. But 882 00:47:41,880 --> 00:47:43,920 Speaker 1: in fact, they were a little reluctant to say, that 883 00:47:44,040 --> 00:47:47,840 Speaker 1: was us. Well, but, you know, then again, bad press 884 00:47:47,920 --> 00:47:51,600 Speaker 1: is still press. That's true. So I'm actually amazed that 885 00:47:52,000 --> 00:47:55,560 Speaker 1: Hitchbot didn't meet with any hitches in Boston, 886 00:47:56,280 --> 00:47:58,879 Speaker 1: based on that. Um, and when 887 00:47:58,920 --> 00:48:01,480 Speaker 1: you asked that question, you gave the qualifier, hey, you've 888 00:48:01,520 --> 00:48:03,239 Speaker 1: never heard of Hitchbot and you see this thing on 889 00:48:03,320 --> 00:48:06,120 Speaker 1: the side of the road. I definitely would have wondered 890 00:48:06,160 --> 00:48:07,920 Speaker 1: what the heck it was, and I probably would have 891 00:48:07,960 --> 00:48:10,279 Speaker 1: thought, I might not want to get too close to 892 00:48:10,360 --> 00:48:14,400 Speaker 1: that, just in case. Yeah, sure. And imagine if you were, uh, 893 00:48:14,480 --> 00:48:16,560 Speaker 1: you know, somewhere in Canada, you know, where it's 894 00:48:16,600 --> 00:48:19,920 Speaker 1: wide open farmland, and, uh, you know, it's 895 00:48:19,920 --> 00:48:22,440 Speaker 1: a mile between houses, and this thing is propped up 896 00:48:22,440 --> 00:48:25,239 Speaker 1: on its legs and its tripod seat there, 897 00:48:25,960 --> 00:48:28,480 Speaker 1: out in the middle of nowhere. I don't think I'd stop. 898 00:48:28,880 --> 00:48:31,640 Speaker 1: And in that case, I probably would stop, only because 899 00:48:31,680 --> 00:48:35,200 Speaker 1: I would think, who the heck would set up something 900 00:48:35,840 --> 00:48:39,480 Speaker 1: sinister in the middle of nowhere, where you are not 901 00:48:39,719 --> 00:48:42,480 Speaker 1: likely to affect much of anything at all? And that's 902 00:48:42,520 --> 00:48:46,520 Speaker 1: how they get you. Whereas I would be more concerned 903 00:48:46,560 --> 00:48:50,959 Speaker 1: about the city location, where the opportunity is higher. 904 00:48:52,080 --> 00:48:54,160 Speaker 1: Yeah, yep. I just see it as, like... and that 905 00:48:54,360 --> 00:48:56,920 Speaker 1: was the last thing he thought.
Well, I already 906 00:48:56,960 --> 00:48:59,000 Speaker 1: told you that I was worried that my obituary from 907 00:48:59,120 --> 00:49:01,640 Speaker 1: yesterday was going to say, choked to death on Twizzlers. 908 00:49:02,239 --> 00:49:04,960 Speaker 1: So this is actually much worse. Yeah, no, especially since I 909 00:49:05,000 --> 00:49:07,560 Speaker 1: hate Twizzlers. All right. Well, at any rate, this was 910 00:49:07,680 --> 00:49:09,880 Speaker 1: really a lot of fun to talk about, and it 911 00:49:10,080 --> 00:49:12,400 Speaker 1: was fun to kind of, you know, think about the 912 00:49:13,160 --> 00:49:15,719 Speaker 1: weird adventures, which are all, like I said, documented. You 913 00:49:15,800 --> 00:49:18,080 Speaker 1: can go to the Hitchbot website and read up on 914 00:49:18,239 --> 00:49:21,480 Speaker 1: the different days and events and things that it encountered, 915 00:49:21,520 --> 00:49:24,279 Speaker 1: and the people it met. So many successful journeys, and 916 00:49:24,360 --> 00:49:26,880 Speaker 1: so many events and things that happened, 917 00:49:26,920 --> 00:49:29,200 Speaker 1: and it posted about all that stuff, and all 918 00:49:29,320 --> 00:49:32,040 Speaker 1: its interactions are recorded in some way, which is great, 919 00:49:32,120 --> 00:49:35,000 Speaker 1: so you can actually go back and relive those journeys. 920 00:49:35,600 --> 00:49:39,799 Speaker 1: And I do hope that we see some further experiments 921 00:49:39,920 --> 00:49:42,640 Speaker 1: that are in the same spirit, whether or not 922 00:49:42,800 --> 00:49:45,879 Speaker 1: it's another hitchhiking thing or it's like the Philly Love Butt... 923 00:49:46,480 --> 00:49:50,080 Speaker 1: or, I know, it's like I'm ten years old. 924 00:49:50,120 --> 00:49:52,640 Speaker 1: I can't... you can't say that, and I still giggle 925 00:49:52,719 --> 00:49:54,799 Speaker 1: every time. But, you know, I see what you mean. 926 00:49:54,840 --> 00:49:58,920 Speaker 1: I anticipate a bunch of copycat Hitchbots popping up. 927 00:49:59,360 --> 00:50:02,400 Speaker 1: Um, here's another little thing that we didn't 928 00:50:02,400 --> 00:50:04,120 Speaker 1: really get to talk about, but I think the very next 929 00:50:04,160 --> 00:50:06,279 Speaker 1: thing we're going to see out of this, unfortunately, and 930 00:50:06,480 --> 00:50:08,640 Speaker 1: this is just my gut feeling: you said somebody probably 931 00:50:08,680 --> 00:50:13,480 Speaker 1: scavenged the head, you know, the control unit. I have 932 00:50:13,560 --> 00:50:15,279 Speaker 1: a feeling we're going to see a photograph of that 933 00:50:15,320 --> 00:50:17,160 Speaker 1: show up somewhere, that's going to be sent to the 934 00:50:17,800 --> 00:50:20,239 Speaker 1: creators of Hitchbot, which to the 935 00:50:20,320 --> 00:50:24,600 Speaker 1: creators would probably just be yet another data point. Probably, yeah. 936 00:50:24,680 --> 00:50:26,920 Speaker 1: But I see it going the way 937 00:50:27,080 --> 00:50:30,359 Speaker 1: a real crime against a human would have gone.
Um, 938 00:50:30,480 --> 00:50:32,719 Speaker 1: you know, and that there will next be a kind 939 00:50:32,760 --> 00:50:35,319 Speaker 1: of a taunting note sent to them as well, which, 940 00:50:35,719 --> 00:50:39,319 Speaker 1: you know, I hope I'm wrong. If that does happen, though, 941 00:50:39,840 --> 00:50:42,600 Speaker 1: it is interesting, because whoever does it 942 00:50:42,680 --> 00:50:46,040 Speaker 1: obviously would think of it as, uh... or at least, 943 00:50:46,239 --> 00:50:48,799 Speaker 1: I would guess, this is armchair psychology, but I would 944 00:50:48,800 --> 00:50:50,799 Speaker 1: imagine that they would think of it as like a joke: 945 00:50:51,320 --> 00:50:53,440 Speaker 1: you know, I'm treating this as if it were 946 00:50:53,480 --> 00:50:56,319 Speaker 1: a real person. Meanwhile, there are other people who think 947 00:50:56,520 --> 00:50:59,200 Speaker 1: that's sick, because they do think of Hitchbot as 948 00:50:59,280 --> 00:51:02,600 Speaker 1: at least in some way similar to a person. 949 00:51:02,719 --> 00:51:06,360 Speaker 1: And again, here, I read a lot of 950 00:51:06,400 --> 00:51:08,480 Speaker 1: true crime, so it's not that I'm just thinking about 951 00:51:08,520 --> 00:51:11,600 Speaker 1: this all the time. I mean, I'm just saying, around 952 00:51:11,640 --> 00:51:14,480 Speaker 1: the one-year anniversary, just pay attention to what's going 953 00:51:14,480 --> 00:51:16,600 Speaker 1: on in the news. It might happen, might not happen. I 954 00:51:16,640 --> 00:51:18,680 Speaker 1: don't have any inside info or anything like that. It 955 00:51:18,719 --> 00:51:20,839 Speaker 1: could just as easily be that someone saw it and thought, 956 00:51:20,920 --> 00:51:22,719 Speaker 1: oh, I want a tablet PC. Could be, and it 957 00:51:22,719 --> 00:51:24,680 Speaker 1: could have ended up in a dumpster half 958 00:51:24,719 --> 00:51:27,080 Speaker 1: a block away. Yeah, yeah, it's hard to say, 959 00:51:27,600 --> 00:51:29,400 Speaker 1: but this was a lot of fun. Scott, thank you 960 00:51:29,440 --> 00:51:31,880 Speaker 1: for coming on the show. Thank you again. I appreciate it, 961 00:51:31,920 --> 00:51:34,239 Speaker 1: and likewise, I had a good time doing it. I 962 00:51:34,320 --> 00:51:37,080 Speaker 1: hope you enjoyed that classic episode of TechStuff, The 963 00:51:37,360 --> 00:51:40,920 Speaker 1: Sad Tale of Hitchbot. This is why we cannot have 964 00:51:41,480 --> 00:51:43,840 Speaker 1: nice things. If you'd like to get in touch with me, 965 00:51:44,120 --> 00:51:46,960 Speaker 1: let me know what you would like me to talk about, 966 00:51:47,120 --> 00:51:50,560 Speaker 1: maybe give feedback on the show. Be gentle, I'm a 967 00:51:50,600 --> 00:51:52,880 Speaker 1: delicate flower. You can do so in a couple of 968 00:51:52,880 --> 00:51:55,120 Speaker 1: different ways. One way is to download the iHeartRadio 969 00:51:55,200 --> 00:51:57,759 Speaker 1: app. It's free to download, free to use. You 970 00:51:57,840 --> 00:52:00,560 Speaker 1: just navigate over to TechStuff. There's a little microphone 971 00:52:00,719 --> 00:52:02,200 Speaker 1: icon there. If you click on that, you can leave 972 00:52:02,200 --> 00:52:05,160 Speaker 1: a voice message up to thirty seconds in length. Otherwise, 973 00:52:05,200 --> 00:52:07,360 Speaker 1: if you prefer, you can reach out on Twitter.
The 974 00:52:07,440 --> 00:52:09,960 Speaker 1: handle for the show is TechStuff HSW, and 975 00:52:10,000 --> 00:52:18,640 Speaker 1: I'll talk to you again really soon. Yeah. TechStuff 976 00:52:18,640 --> 00:52:22,080 Speaker 1: is an iHeartRadio production. For more podcasts 977 00:52:22,120 --> 00:52:24,879 Speaker 1: from iHeartRadio, visit the iHeartRadio app, 978 00:52:25,040 --> 00:52:28,160 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.