Get in touch with technology with TechStuff from howstuffworks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with Stuff Media and iHeartRadio, and I love all things tech. And today, dear listeners, we have something very special for all of you, because you've become so accustomed to having your lonesome hero out there on the wastelands covering technology. Today I've got a partner in crime, Adam Doud from Android Authority. Thank you so much for joining me for this episode.

I'm excited to be here. Thank you so much for having me. And you know what, I enjoy your lonesome-man routines, so I'm just glad that I can contribute.

It's great. I have gotten very good at doing that tennis ball thing from Stalag 17. That's a film reference; all you youngsters out there, go check that out. But it's nice to have someone else to chat tech with. And just some background, guys: Adam reached out to me heading into CES. Originally we had hoped to record a podcast at CES itself.
That didn't quite work out, but CES was nuts; you've heard me talk about it. And so, how many CESes have you been to? Was this your first one, or had you been before?

This was my second one, but they were three years apart. So my first one was in 2016, and then I was supposed to go in 2017 and it got canceled last minute, 2018 I just couldn't swing it, and 2019, blammo, here we go.

So you've been twice. I've been ten times. And that's not me bragging; there are journalists who have been covering CES for longer than I've been alive. But you know, CES is a chaotic, busy, you're-constantly-running-from-one-point-to-another experience. Nothing is close to anything else.

So many Ubers, so many.

And you look at a map of the Las Vegas Strip and you think it can't possibly take very long to get from there to there. Oh boy, can it take a long time to get even just a few casinos away.
You can probably walk it in about the same time, but you're going to be exhausted when you get there.

My favorite story to tell regarding the distances between places in Las Vegas was this year. I was at the Flamingo Hotel attending a robotics conference, which we will undoubtedly talk about shortly, and from there I had to go to Pepcom, which was in the Mirage Hotel. Now, if you look at a map of Las Vegas, it's only about a ten-minute walk from the Flamingo over to the Mirage. However, once you hit the front door of the Mirage, you have to walk literally another ten minutes from the front door of the Mirage to the ballroom in the Mirage where Pepcom is actually being held. You just hate yourself by the time you get there.

Yeah. Part of this is because of the design of casinos, right? The whole idea is to try and entice people to gamble and to spend more money. So there's no easy, direct path for a lot of these sorts of journeys you go through.
You have to go through an entire casino floor, and that usually means diverting around some centralized area. Usually it's the high rollers' suite or whatever in the middle, or it's a restaurant or something. They don't make it easy to get around.

And so we didn't get a chance to meet up during CES. But you were covering robotics at CES. It's a subject near and dear to my heart, which still remains human, don't worry.

Well, you know, I gotta take that test every now and then.

But we're going to talk about consumer robots, because I've covered robotics quite a bit on TechStuff. I've talked about robotics from prototyping to military robots; I did the episodes about Boston Dynamics and how many years they spent just trying to get a robot to hop in a circle properly. We're going to talk about the consumer side of things and what you saw at CES, because you were right there in the thick of it, whereas I was in an ivory tower, next to a clawfoot bathtub, next to my bed.
For reasons I still don't quite grasp.

You know what, it gives me a wonderful mental picture, though, so I really appreciate that.

So I didn't take a photo of me in the tub, but I was fully clothed and there was no water in there. It's just, I was like, I've got to experience this, right?

Oh, totally, totally.

So yeah, just a little background. You know, I work for Android Authority, and we're all things smartphones and tablets and whatnot. But I was asked by my editors if I could take a look at anything else aside from smartphones, because CES is not really huge for smartphones, to be honest.

That's more Mobile World Congress territory, right?

Right, right, which is coming up in just a month. But so they asked me, what else would you like to take a look at? Would you like to look at robots? And I said, sure, that sounds great, let's do that. Because you know, I'm a big Jetsons fan.
I love Rosie. I love the robot from Lost in Space. I love all that stuff. So I figured, let's go take a look at that. And then I realized just how broad the robotics industry is. It's everything from autonomous cars to personal robots to Roombas to everything in between. So I sat down with my editors and I said, we need to focus; we need to narrow this thing down a little bit. And they said, well, what do you want to look at? And I said, I want something that will bring me my beer. Can we do that? And they said, absolutely, let's do that. So that's where I kind of ended up: looking at the robots that, not so much the ones that you snuggle with to help you sleep, but the ones that will, you know, do things for you.

Yeah. I mean, I remember a couple of years ago, one of the robots that got a lot of buzz on the floor was a clothes-folding robot, and it didn't look anything like a robot.
When people hear the word robot, I think more often than not they're thinking of an anthropomorphic figure, right? Something vaguely human shaped. Unless they're someone who's owned things like Roombas and that sort of stuff, where they think, oh no, robots can come in all shapes and sizes. This thing looked like a box. It looked like just a box that you put clothing in, and it would, you know, spit folded clothing out. And it was there this year, too. So that just tells you that robots come in all shapes and sizes and for all sorts of functions. And a lot of them are dedicated functions, like a very specific, one-function kind of robot. Because as you learn more about robotics, you learn that the more things you want your robot to do, the harder it is to design the robot to do any of them well.

Now, if you have a development community behind you that will build all these different skills for different robots... but I could be referring to another podcast that we might have just finished recording.
Yes, you will definitely have to listen to that episode over at Android Authority, because we talked a lot about the Misty robot platform, and we won't go over that material here, because I think it's really important you guys go out and check out that episode.

It was a great episode. I really enjoyed being on it. Thank you, by the way, for inviting me to be on that show.

And I think you guys out there, TechStuff fans, you should definitely go check that out, because that will be a nice complement to this episode. I thought something that'd be fun to do is just run down a quick list of some of the famous consumer robots that have come out over the years, and kind of give you an idea of the evolution of consumer robots over the past decades. So, Adam, I don't know how old you are. I am not going to ask you that question unless you feel comfortable mentioning it.

I am of the tender age of forty-three years old.

And you got me, all right, there we go. So you remember the eighties, right?

Vaguely, yeah.
Yeah, it was a crazy time for all of us. But so, in the eighties, I remember distinctly there was one thing I really wish I could have asked for as a Christmas present, but I just knew it was beyond my stars, as Shakespeare would say. It was something that was to be wished for but never owned. That was the infamous Omnibot. I don't know if you ever saw an Omnibot or know what I'm talking about. The original Omnibot had a dome-like head. It was tiny; it was maybe two feet tall, maybe three. I don't think it was three feet tall, probably two feet tall. And it had a cassette tape player built into its chest, and you could actually put programming on cassette, you know, record a program onto magnetic tape on cassette, put that into its chest, and it would follow it. It would also just play regular audio cassettes if you wanted. So if you wanted to have your robot shred to AC/DC, you could. And you should, I think. And for those about to bot, we salute you.
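As an aside for the programmers listening: the Omnibot's cassette "programming" was essentially record-and-replay of timed commands. Here's a minimal sketch of that idea in Python; the command names and timing scheme are hypothetical, not the Omnibot's actual tape format.

```python
import time

# Toy record-and-replay controller in the spirit of the Omnibot's
# cassette programming: commands are captured with the delays between
# them, then played back in order with those same delays.

class TapeRecorder:
    def __init__(self):
        self.tape = []       # list of (delay_seconds, command) pairs
        self._last = None    # timestamp of the previously recorded command

    def record(self, command):
        """Capture a command along with the delay since the last one."""
        now = time.monotonic()
        delay = 0.0 if self._last is None else now - self._last
        self._last = now
        self.tape.append((delay, command))

    def play(self, execute, speed=1.0):
        """Replay the tape, calling execute(command) for each entry."""
        for delay, command in self.tape:
            time.sleep(delay / speed)
            execute(command)


# Usage: record a short routine, then "play the tape" into a motor driver
# (here just a print; hypothetical command names).
if __name__ == "__main__":
    rec = TapeRecorder()
    for cmd in ["forward", "turn_left", "stop"]:
        rec.record(cmd)
    rec.play(print, speed=10.0)
```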
You could also program it with a remote control, and you could do voice commands over the remote control. It was done as, like, a programmable robot that you could own, and I thought it looked phenomenal. These days, if you look at it, you're gonna think, wow, how quaint. That was an early example, and by all measure of today's robotics it was certainly a primitive example. But to a kid in the nineteen eighties, this was the next best thing to having your own R2-D2 in your house.

Oh, for sure, absolutely. I was gonna say, the Omnibot did kind of look like R2-D2's idiot cousin at the time, but yes, for sure.

So this kind of leads into the next robot I want to talk about, which was the Robotic Operating Buddy, or R.O.B. R.O.B. was a justification to sell Nintendo Entertainment Systems. So when the NES first came out in the United States, stores did not want to carry a video game console, because the video game crash in the United States had totally decimated that market. Actually, decimated is the wrong term; that's just reducing it by a tenth. That market was gone.
So no one wanted to carry a video game console. So Nintendo, in an effort to make it more attractive to toy stores in the United States, had developed this accessory, this peripheral called R.O.B., that was a "robot," and I use big air quotes on that, that would go along with the console. It had maybe two titles that worked with the robot, and that was enough to get toy stores convinced to carry it, because robotics was starting to become a thing, even though video game consoles no longer were really thought of as being marketable. And it worked. And then, as soon as Nintendo saw it was working, they dropped R.O.B., because there was no reason to keep making this very expensive peripheral when they could just sell the video game console.

Yeah, I was an Atari family, so we didn't really get into the whole Nintendo thing until much later.

Same here, buddy. I was Atari, and later on I inherited an Intellivision. Those were my early consoles.
And then we're skipping ahead a lot here, more than a decade, actually. I guess I could argue that was also when we saw an amazing robot in the, uh, incredible documentary film Rocky IV. But the less we say about that, the better.

It wasn't really a consumer robot that anyone could just go out and buy, unless you had a quarter of a million dollars on you.

So, Lego introduced Mindstorms, the modular design system that allowed people to play with concepts of robotics and programming as well as robotic design, which was kind of cool. Sony released the first AIBO robotic dog, which it continues to iterate upon, so we occasionally see new AIBOs that look more, I don't know if the word is realistic. It doesn't really look more like a dog, but it looks less nineties-robot style, which is an improvement, I think we can all agree.

I think so.

2002 was when we first got the Roomba.

I didn't even realize it was that old.

But yeah, iRobot, which had been making military robots up to that point, introduced a consumer vacuum cleaner robot in 2002.
I'm pretty sure that's the actual plot to Terminator, but I could be wrong about that.

I have seen it mapping out all of our houses and sending them over.

Yeah, and so we have essentially, like, Google Maps, except it's everybody's home.

Because the Roomba's like, this is a particularly tricky corner; I get stuck in this corner because of the way they have the chairs set up, and you just want to be aware of that.

That's why I live in a split-level house.

Yes, yeah. I've got lots of stairs in my home. There's no chance for a Roomba in my place, so I don't know.

But then in 2005 they came out with the Scooba, which was the mopping robot version. 2006 was a robot that's no longer in production, called the Dirt Dog. It was a robotic shop vac, but it was discontinued. In 2007 they introduced the Looj gutter-cleaning robot, which was actually a remote-controlled device, and then the Verro pool-cleaning robot was also released in 2007.
And I remember this because I was at the CES where Parrot first showed off their quadcopter. Now, that I would argue is more of a radio-controlled device, but it kind of figuratively and literally launched the consumer drone craze, which later on developed into models that we saw have at least some rudimentary robotic elements to them, things like being able to follow a person around, right? Facial recognition and following. We've seen some pretty cool ones. Sphero was first shown off in '11, I remember that as well, and 2.0 came out after that. And there is no shortage of robots at CES, past and present. They range from the incredibly sophisticated, we're talking, like, autonomous-car-type technology, to things that are called robots but aren't really robots. They're more like fun little toys, or sometimes just diversions. I think of, like, the little dancing flower: it takes audio and it starts to move. There are a lot of those as well.

Sure. It's such a broad field.
Like we mentioned at the beginning of the show, there's just so much that you could consider robotics. One robot that you actually left out of your list, if you don't mind, this was my robot when I was a kid: the Radio Shack Armatron. I don't know if you remember that. That was a little robotic arm. It sat on a pedestal and had two little joysticks that you could use to manipulate the claw on the end, and it came with, like, little balls and a little box that you could open up and put the balls in. It was tons of fun, and we used that thing for days and days and days.

Yeah, I mean, I never owned one, but I remember those as well. I remember when that came out, and I remember thinking that was a really cool-looking device, from that same sort of era of my childhood.
So we've really seen an evolution of these products over the years. Like, there are certain ones that are always going to be in that sort of diversion-toy category, and they're never gonna get particularly sophisticated, and that's fine, because that's not their intention. They're meant to be toys. They're meant to be these fun little things that are not terribly expensive, that can do something vaguely entertaining.

Hey guys, Jonathan here. Adam and I have a lot more to say about consumer robotics, but before we get into that, let's take a quick break to thank our sponsor.

We've seen a lot of advances in other areas of consumer robotics technology, and so I'm curious, what was your experience covering robotics at CES? What were some of the things that you saw that really stood out to you as being particularly interesting? Or even if there was something that was maybe confounding, where you're just like, I don't even get what they were going for with this. Because sometimes you run into that, too.
Oh, sure. Oh, definitely. One of the overall impressions that I came away from CES with was I was kind of surprised how far along we already are, because I saw legitimate consumer products that someone could go out and buy and have driving around their house. Now, you'd have to be rich, you'd have to be a lot richer than I am, but you could still have this robot driving around your house and, you know, following you with music or bringing you drinks, things like that. One robot in particular that stood out, there were two, really, that I think were kind of the crème de la crème, but the one that really stood out to me as being a little bit more realistic at this day and age would be the Temi robot, which was kind of like, um, kind of like a rolling serving tray that could roll around. In theory, it would meet you at the door when you came in. It would not play Stone Cold Steve Austin's intro music.
Unfortunately. Big, big check against that robot, right?

So that's the one box that didn't get checked. But it did have an integrated wireless charging dock, so you could set your phone down on it and start charging your phone as soon as you walked in the door. And then it would follow you. It had tracking software where you could turn away from it and walk away, and it would follow you around. And what was really cool was, if someone walked in between you and the robot, it didn't freak out. It was able to pick you back up and keep following you. And you could ask it various Alexa-type things: how's the weather outside, what's my schedule for today, stuff like that. It had a screen, so it could show you what you wanted to see. It could play videos for you, things like that. And it had a tray on the back, so you could set things down and have it carried around for you.
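For the curious, that "doesn't freak out when someone walks between you and the robot" behavior can be sketched in a few lines: if the tracked person vanishes for a frame or two, the robot keeps its last known position and re-acquires the nearest detection rather than resetting. This is an illustrative toy in Python, not Temi's actual tracking stack, which isn't public; all names and thresholds here are made up.

```python
from dataclasses import dataclass
import math

# Toy person-following tracker with simple occlusion tolerance:
# if the target disappears briefly, re-acquire the detection closest
# to the last known position instead of giving up immediately.

@dataclass
class Detection:
    x: float  # target position in meters, robot-relative
    y: float

class FollowMeTracker:
    def __init__(self, max_lost_frames=10, reacquire_radius=1.5):
        self.last_pos = None                    # (x, y) of the target
        self.lost = 0                           # consecutive frames without target
        self.max_lost_frames = max_lost_frames
        self.reacquire_radius = reacquire_radius  # meters

    def update(self, detections):
        """Return the detection believed to be the target, or None."""
        if self.last_pos is None:               # acquire: lock onto first person seen
            if detections:
                self.last_pos = (detections[0].x, detections[0].y)
                return detections[0]
            return None
        # pick the detection nearest the last known position
        best, best_d = None, float("inf")
        for d in detections:
            dist = math.hypot(d.x - self.last_pos[0], d.y - self.last_pos[1])
            if dist < best_d:
                best, best_d = d, dist
        if best is not None and best_d <= self.reacquire_radius:
            self.last_pos = (best.x, best.y)    # target found again
            self.lost = 0
            return best
        self.lost += 1                          # occluded this frame
        if self.lost > self.max_lost_frames:
            self.last_pos = None                # target truly lost; re-acquire later
        return None
```

A passer-by far from the target's last position is ignored for up to `max_lost_frames` frames, which is what makes the follow behavior survive a brief occlusion.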
I don't know if you're particularly lazy, but 346 00:19:16,280 --> 00:19:19,080 Speaker 1: this robot kind of stood out to me just 347 00:19:19,160 --> 00:19:22,200 Speaker 1: because it was the type of robot that I could 348 00:19:22,200 --> 00:19:24,440 Speaker 1: see myself using if I had a lot of money. 349 00:19:25,040 --> 00:19:28,800 Speaker 1: Um, I don't, so I won't. But that's 350 00:19:28,880 --> 00:19:30,760 Speaker 1: what really kind of stuck out to me: this 351 00:19:30,840 --> 00:19:33,000 Speaker 1: is something that I could see, in two thousand nineteen, 352 00:19:33,119 --> 00:19:38,000 Speaker 1: a home actually having in it and not being stupid, 353 00:19:38,680 --> 00:19:41,520 Speaker 1: if that makes sense. That's a legit thing, though. 354 00:19:41,560 --> 00:19:44,520 Speaker 1: I mean, we're getting to a point where the 355 00:19:44,520 --> 00:19:47,359 Speaker 1: technology is reaching a level of sophistication where we can 356 00:19:47,400 --> 00:19:54,040 Speaker 1: actually see applications beyond that very singular approach. Every time 357 00:19:54,280 --> 00:19:56,359 Speaker 1: you want a robot to do something in 358 00:19:56,400 --> 00:20:00,040 Speaker 1: addition to whatever its primary function is, you've got to 359 00:20:00,160 --> 00:20:03,800 Speaker 1: increase the complexity of the build, the complexity of the 360 00:20:03,840 --> 00:20:07,320 Speaker 1: software, sometimes the complexity of the interface.
And obviously you 361 00:20:07,320 --> 00:20:09,720 Speaker 1: can only do that for so long before it gets 362 00:20:09,840 --> 00:20:15,160 Speaker 1: unmanageable, or the robot's just crappy at everything, right? 363 00:20:15,240 --> 00:20:18,719 Speaker 1: Sure, it in theory can 364 00:20:18,800 --> 00:20:20,639 Speaker 1: do all these things, but it doesn't do any of 365 00:20:20,680 --> 00:20:23,119 Speaker 1: them well, and it's more frustrating to use than if 366 00:20:23,160 --> 00:20:26,680 Speaker 1: I just did them myself. That's a hard 367 00:20:26,840 --> 00:20:30,440 Speaker 1: barrier to get past, because it requires so much sophistication 368 00:20:30,520 --> 00:20:35,280 Speaker 1: from multiple perspectives, from multiple starting points. It's funny you 369 00:20:35,280 --> 00:20:38,159 Speaker 1: should mention that, because, since I 370 00:20:38,200 --> 00:20:41,320 Speaker 1: was covering robotics at CES, I was particularly interested in robotics, 371 00:20:41,640 --> 00:20:44,679 Speaker 1: and as part of my daily tech news roundup podcast, 372 00:20:44,720 --> 00:20:48,840 Speaker 1: I reported on a hotel in Japan that 373 00:20:48,960 --> 00:20:54,600 Speaker 1: was retiring half of its two-hundred-and-thirty-odd robotic workforce, 374 00:20:55,160 --> 00:21:00,840 Speaker 1: mainly because the people required to keep those robots 375 00:21:00,880 --> 00:21:04,960 Speaker 1: on task basically made it kind of silly to 376 00:21:05,040 --> 00:21:08,199 Speaker 1: have the robots themselves. So people would, like, walk up 377 00:21:08,240 --> 00:21:11,160 Speaker 1: to those robots and say, where can I catch a cab? 378 00:21:11,640 --> 00:21:13,800 Speaker 1: And the robot would be like, I have no idea 379 00:21:13,840 --> 00:21:15,960 Speaker 1: what you just said. And then people would say, well, 380 00:21:15,960 --> 00:21:17,920 Speaker 1: where's a good place to eat?
And the robot would 381 00:21:17,960 --> 00:21:20,080 Speaker 1: be like, I have no idea what you want. And 382 00:21:20,119 --> 00:21:22,280 Speaker 1: so they ended up having to turn to a person 383 00:21:22,359 --> 00:21:26,960 Speaker 1: to answer those same questions, because, once again, the 384 00:21:27,000 --> 00:21:29,520 Speaker 1: people that were coming into the hotel had this expectation 385 00:21:29,560 --> 00:21:31,719 Speaker 1: that a robot will be able to answer whatever questions 386 00:21:31,720 --> 00:21:33,879 Speaker 1: they might have. Personally, I think they should have 387 00:21:33,920 --> 00:21:35,560 Speaker 1: just replaced it with a Google Home and called it 388 00:21:35,600 --> 00:21:38,800 Speaker 1: a day. But you know what, beggars can't be choosers, 389 00:21:38,840 --> 00:21:41,320 Speaker 1: I suppose. But they actually had to retire half of 390 00:21:41,359 --> 00:21:43,960 Speaker 1: their robots; the term used in the article 391 00:21:44,040 --> 00:21:47,959 Speaker 1: was that they laid off half of their robotic workforce, because 392 00:21:48,320 --> 00:21:51,040 Speaker 1: they had to have so many people supporting those robots 393 00:21:51,359 --> 00:21:53,359 Speaker 1: that it made it silly to have the 394 00:21:53,440 --> 00:21:55,800 Speaker 1: robots in the first place. This is the robotic 395 00:21:56,080 --> 00:22:00,520 Speaker 1: equivalent of using a vacuum on a carpet, and 396 00:22:00,520 --> 00:22:03,000 Speaker 1: you're running it over a piece of fluff that's on 397 00:22:03,040 --> 00:22:05,600 Speaker 1: the carpet, and the vacuum runs over it, and you 398 00:22:05,600 --> 00:22:07,400 Speaker 1: pull the vacuum back, and the fluff is still there. 399 00:22:07,840 --> 00:22:09,840 Speaker 1: So you push the vacuum over it again, you pull 400 00:22:09,840 --> 00:22:12,040 Speaker 1: it back, it's still there. You bend down.
You pick 401 00:22:12,119 --> 00:22:14,119 Speaker 1: up the fluff, you put it back down on the ground, 402 00:22:14,400 --> 00:22:16,640 Speaker 1: you roll over it with the vacuum cleaner, and you're 403 00:22:16,680 --> 00:22:19,399 Speaker 1: just, you're spending so much time just trying to 404 00:22:19,440 --> 00:22:21,440 Speaker 1: get the thing that does the thing to do the 405 00:22:21,440 --> 00:22:23,680 Speaker 1: thing, when you could have just done the thing yourself. 406 00:22:24,240 --> 00:22:25,840 Speaker 1: I was really hoping that you were going to talk 407 00:22:25,840 --> 00:22:27,960 Speaker 1: about picking up the fluff and then putting it back down, 408 00:22:28,080 --> 00:22:32,959 Speaker 1: because exactly, it's a universal experience. I think just 409 00:22:33,040 --> 00:22:35,920 Speaker 1: about everyone who's ever used a vacuum cleaner has done 410 00:22:35,960 --> 00:22:38,640 Speaker 1: this thing where, you know, it's not just pick 411 00:22:38,720 --> 00:22:40,480 Speaker 1: up the fluff and throw it away. It's, no, this 412 00:22:40,800 --> 00:22:43,080 Speaker 1: vacuum cleaner is supposed to do this thing, and I'm 413 00:22:43,080 --> 00:22:46,280 Speaker 1: gonna keep at it until it works. Uh, yeah, we've 414 00:22:46,280 --> 00:22:49,840 Speaker 1: all had that experience. So seeing that with robotics, it's 415 00:22:50,160 --> 00:22:53,359 Speaker 1: absolutely the same sort of thing, just in a 416 00:22:53,400 --> 00:22:56,040 Speaker 1: different application. The thing that really impressed me about what 417 00:22:56,080 --> 00:22:58,919 Speaker 1: you were saying was the idea of a robot that 418 00:22:58,960 --> 00:23:02,560 Speaker 1: can follow a specific person and not 419 00:23:02,640 --> 00:23:06,000 Speaker 1: lose its way if something breaks its line of sight, 420 00:23:06,080 --> 00:23:08,159 Speaker 1: like if someone else comes in between it.
And the 421 00:23:08,200 --> 00:23:10,919 Speaker 1: reason why I find that really interesting: I 422 00:23:10,960 --> 00:23:14,480 Speaker 1: went to a talk at South by Southwest a few 423 00:23:14,560 --> 00:23:18,280 Speaker 1: years ago, and it 424 00:23:18,440 --> 00:23:21,080 Speaker 1: was all about robotics. The person giving the talk was 425 00:23:21,119 --> 00:23:24,679 Speaker 1: talking about how they were teaching a robot how to 426 00:23:24,760 --> 00:23:29,360 Speaker 1: open a door. And this is a non-trivial problem, 427 00:23:29,400 --> 00:23:33,720 Speaker 1: because we don't have a single standard for doors in 428 00:23:33,720 --> 00:23:38,440 Speaker 1: the human world. There are all these different kinds of handles. 429 00:23:38,560 --> 00:23:40,919 Speaker 1: Some of them you have to push something specific so 430 00:23:40,960 --> 00:23:42,679 Speaker 1: that the door will unlatch and you can open it, 431 00:23:42,800 --> 00:23:45,040 Speaker 1: or you have to turn a doorknob, or you 432 00:23:45,080 --> 00:23:47,639 Speaker 1: have to press down on a little thumb lever, or 433 00:23:47,640 --> 00:23:51,640 Speaker 1: whatever it may be, and there's pushing versus pulling. There's 434 00:23:51,680 --> 00:23:55,280 Speaker 1: all these sorts of different features that could be in 435 00:23:55,359 --> 00:23:59,639 Speaker 1: place for any given door. And so we humans, we 436 00:23:59,760 --> 00:24:02,080 Speaker 1: encounter this once or twice, and we can then 437 00:24:02,080 --> 00:24:06,040 Speaker 1: extrapolate to other situations and very quickly figure out how 438 00:24:06,080 --> 00:24:08,800 Speaker 1: to use a door, even if we've never gone through 439 00:24:08,800 --> 00:24:13,040 Speaker 1: that specific door before. Robots are not good 440 00:24:13,040 --> 00:24:16,400 Speaker 1: at this.
Robots typically can do what you program them 441 00:24:16,400 --> 00:24:19,240 Speaker 1: to do, and even with machine learning, they have to 442 00:24:19,240 --> 00:24:22,560 Speaker 1: go through lots of training in order to actually learn 443 00:24:22,640 --> 00:24:26,680 Speaker 1: what the quote-unquote correct approach is. And so 444 00:24:26,760 --> 00:24:29,800 Speaker 1: she talked about how they had this robot and they 445 00:24:29,800 --> 00:24:32,719 Speaker 1: were training it how to open doors, different doors, and 446 00:24:32,760 --> 00:24:38,280 Speaker 1: the robot would sit motionless, staring at a specific door 447 00:24:38,320 --> 00:24:41,520 Speaker 1: they had in this one hallway in their research facility, 448 00:24:41,680 --> 00:24:46,120 Speaker 1: and it would just be processing this image for hours 449 00:24:46,160 --> 00:24:51,400 Speaker 1: at a time, just staring at a door. One thinks wistfully, perhaps, 450 00:24:51,440 --> 00:24:55,200 Speaker 1: but that's projecting. And this is before it ever makes an attempt. 451 00:24:55,600 --> 00:24:58,159 Speaker 1: And the problem, one of the many problems, was 452 00:24:58,200 --> 00:25:00,879 Speaker 1: that if someone were to walk down the hallway and 453 00:25:00,960 --> 00:25:03,480 Speaker 1: break the line of sight between the robot and the door, 454 00:25:03,880 --> 00:25:07,240 Speaker 1: that would confuse the robot, and it would effectively 455 00:25:07,440 --> 00:25:10,640 Speaker 1: increase the amount of time it would require to study 456 00:25:10,680 --> 00:25:14,400 Speaker 1: the door before it could make an attempt.
So someone 457 00:25:14,520 --> 00:25:19,000 Speaker 1: just walking down the hall could mess up an entire experiment, 458 00:25:19,800 --> 00:25:22,040 Speaker 1: or at least make it go on twice as long 459 00:25:22,080 --> 00:25:26,920 Speaker 1: as they had intended, because of that unexpected interruption in 460 00:25:27,040 --> 00:25:29,760 Speaker 1: line of sight. So to have a consumer 461 00:25:29,880 --> 00:25:33,440 Speaker 1: robot that is capable of following a specific person even 462 00:25:33,440 --> 00:25:36,720 Speaker 1: if that line of sight is broken is a phenomenal achievement. 463 00:25:36,760 --> 00:25:39,080 Speaker 1: It doesn't sound like it, because to us 464 00:25:39,240 --> 00:25:43,359 Speaker 1: humans that's easy, but to a machine, that's very hard. But 465 00:25:43,480 --> 00:25:45,920 Speaker 1: think about it. You're spending all this time trying to 466 00:25:45,920 --> 00:25:48,399 Speaker 1: teach a robot how to open a door. That's also 467 00:25:48,680 --> 00:25:51,879 Speaker 1: very simple. It's something that we do dozens of times 468 00:25:51,920 --> 00:25:54,480 Speaker 1: every day. And the thing about technology, or at least 469 00:25:54,520 --> 00:25:56,440 Speaker 1: this is how I am with technology, is I will 470 00:25:56,440 --> 00:25:58,320 Speaker 1: get a new piece of technology and I will start 471 00:25:58,320 --> 00:26:00,560 Speaker 1: pushing those limits, because I want to know where 472 00:26:00,560 --> 00:26:03,359 Speaker 1: those borders actually are. So if you have 473 00:26:03,400 --> 00:26:05,560 Speaker 1: a device that can watch Netflix, fine, I'm going to 474 00:26:05,600 --> 00:26:08,400 Speaker 1: ask for a very specific episode of a very specific 475 00:26:08,400 --> 00:26:10,560 Speaker 1: show on Netflix. And if you can't serve that up 476 00:26:10,560 --> 00:26:13,199 Speaker 1: to me, okay, I know that that's the limitation.
I 477 00:26:13,200 --> 00:26:15,439 Speaker 1: can have you open Netflix, but I can't ask 478 00:26:15,520 --> 00:26:19,040 Speaker 1: you to play episode three of season two of Stranger Things. 479 00:26:19,040 --> 00:26:21,359 Speaker 1: I'm going to know that. And that's the 480 00:26:21,400 --> 00:26:23,240 Speaker 1: thing is, if I get a robot in my house, 481 00:26:23,280 --> 00:26:26,720 Speaker 1: oh my lord, I'm going to be pushing so many boundaries. 482 00:26:26,760 --> 00:26:29,879 Speaker 1: Like, go get me a beer. Okay, you can't do that. Okay, 483 00:26:29,880 --> 00:26:32,240 Speaker 1: so go to the refrigerator. Okay, you don't know where 484 00:26:32,240 --> 00:26:36,240 Speaker 1: the refrigerator is. Okay, follow me, and then we're gonna 485 00:26:36,280 --> 00:26:38,360 Speaker 1: figure it out. And the problem is that you're 486 00:26:38,359 --> 00:26:40,240 Speaker 1: eventually going to get down to a point where you're 487 00:26:40,280 --> 00:26:41,600 Speaker 1: just like, well, why do I have this thing? It 488 00:26:41,680 --> 00:26:46,760 Speaker 1: can't do anything. And that's the biggest obstacle 489 00:26:46,960 --> 00:26:51,080 Speaker 1: that robotics and roboticists are facing, especially when it comes 490 00:26:51,119 --> 00:26:54,960 Speaker 1: to consumer applications. Sure. Yeah, there are so many things that 491 00:26:55,280 --> 00:27:01,640 Speaker 1: go into designing good robotics in general. Uh, everything from 492 00:27:01,680 --> 00:27:04,840 Speaker 1: just the mechanics of how the robot moves. That is 493 00:27:05,840 --> 00:27:09,040 Speaker 1: an art form unto itself. People have spent 494 00:27:09,240 --> 00:27:13,720 Speaker 1: years trying to do things like replicate the way a biological 495 00:27:13,800 --> 00:27:16,480 Speaker 1: leg works in order to have a robot that can 496 00:27:16,520 --> 00:27:19,320 Speaker 1: walk on two legs.
The DARPA Robotics Challenge of a 497 00:27:19,359 --> 00:27:24,960 Speaker 1: few years ago, where the robots were doing a rescue mission, 498 00:27:25,400 --> 00:27:28,600 Speaker 1: a simulated rescue mission, as if it were in the wake 499 00:27:28,680 --> 00:27:33,480 Speaker 1: of a Fukushima-style disaster: they were using 500 00:27:33,560 --> 00:27:36,040 Speaker 1: legged robots, and the reason they did was because the 501 00:27:36,119 --> 00:27:38,240 Speaker 1: robot had to be able to do multiple things that 502 00:27:38,320 --> 00:27:41,840 Speaker 1: humans can do, including operate a vehicle, get out of the vehicle, 503 00:27:42,359 --> 00:27:44,480 Speaker 1: open a door, go through the door, pick up a 504 00:27:44,520 --> 00:27:47,760 Speaker 1: power tool, use it, all this other stuff. So 505 00:27:48,640 --> 00:27:51,240 Speaker 1: that meant that the robots had to fall into 506 00:27:51,280 --> 00:27:56,399 Speaker 1: a certain area of design. Well, building a robot that 507 00:27:56,480 --> 00:28:00,880 Speaker 1: can move the way a human can, before you even 508 00:28:00,880 --> 00:28:04,920 Speaker 1: get to the software part, just building the complexity necessary 509 00:28:05,359 --> 00:28:08,879 Speaker 1: in the robotic limbs is incredibly complicated. That's why we 510 00:28:08,880 --> 00:28:11,240 Speaker 1: see a lot of robots that rely on things like wheels. 511 00:28:11,280 --> 00:28:15,800 Speaker 1: That's much easier to implement than limbs. And, you know, 512 00:28:15,960 --> 00:28:18,439 Speaker 1: when we talk about easier to implement, 513 00:28:18,480 --> 00:28:20,960 Speaker 1: that also means it brings the price down, which makes 514 00:28:20,960 --> 00:28:24,000 Speaker 1: it more accessible as well. You couldn't have 515 00:28:24,160 --> 00:28:28,120 Speaker 1: a Boston Dynamics BigDog-style robot on the consumer market.
516 00:28:28,160 --> 00:28:31,200 Speaker 1: It would cost a million bucks at least. Uh, so 517 00:28:31,760 --> 00:28:33,959 Speaker 1: you go much simpler. We've got a little bit 518 00:28:33,960 --> 00:28:38,080 Speaker 1: more we're going to say about consumer robotics. Really excited 519 00:28:38,120 --> 00:28:40,200 Speaker 1: for you guys to hear this episode. But before we 520 00:28:40,240 --> 00:28:42,960 Speaker 1: get into that, let's take another quick break to thank 521 00:28:42,960 --> 00:28:53,760 Speaker 1: our sponsor. So anytime you're making a consumer robot, you 522 00:28:53,760 --> 00:28:57,480 Speaker 1: have to start with these considerations, like, how complicated is 523 00:28:57,520 --> 00:29:00,280 Speaker 1: the machinery going to be? That alone is going to 524 00:29:00,360 --> 00:29:02,719 Speaker 1: impact the price I'm gonna have to charge for this 525 00:29:02,720 --> 00:29:04,760 Speaker 1: thing, even if I just 526 00:29:04,760 --> 00:29:08,880 Speaker 1: want to recoup costs; that's before I even think about profit. Sure. 527 00:29:09,400 --> 00:29:11,600 Speaker 1: Then, what is it supposed to do? How am I 528 00:29:11,600 --> 00:29:13,600 Speaker 1: going to make sure it does that really well, well 529 00:29:13,720 --> 00:29:16,680 Speaker 1: enough that it justifies the purchase of the robot? These 530 00:29:16,680 --> 00:29:20,000 Speaker 1: are all tricky things. So you saw this 531 00:29:20,000 --> 00:29:22,600 Speaker 1: robot that could follow people around. I know what 532 00:29:22,640 --> 00:29:24,320 Speaker 1: I would use it for. If I could program it, 533 00:29:24,360 --> 00:29:26,080 Speaker 1: I would program it so that it would hold my 534 00:29:26,120 --> 00:29:29,360 Speaker 1: phone in the morning, and my phone is what I use as 535 00:29:29,560 --> 00:29:33,320 Speaker 1: my alarm clock. So I'd have my alarm clock set on 536 00:29:33,360 --> 00:29:36,440 Speaker 1: the robot.
When my alarm goes off and the 537 00:29:36,520 --> 00:29:38,880 Speaker 1: robot sees me stirring in bed, it knows to just 538 00:29:38,920 --> 00:29:41,600 Speaker 1: slowly back away so that I have to get up 539 00:29:42,160 --> 00:29:44,880 Speaker 1: to go after the phone, because that guarantees I actually 540 00:29:44,960 --> 00:29:47,760 Speaker 1: get out of bed and get on my way. 541 00:29:47,840 --> 00:29:49,720 Speaker 1: So it's the opposite of what it would 542 00:29:49,720 --> 00:29:52,320 Speaker 1: normally do: instead of following me, it's retreating a little bit. 543 00:29:52,520 --> 00:29:54,719 Speaker 1: That's a pretty crazy niche case, but we'll go with it, 544 00:29:54,840 --> 00:29:58,200 Speaker 1: you know. I mean, to me, it 545 00:29:58,240 --> 00:30:00,760 Speaker 1: means that I actually get started on my day. So really, 546 00:30:00,760 --> 00:30:03,400 Speaker 1: when you think about it, my time is invaluable. So 547 00:30:03,880 --> 00:30:06,840 Speaker 1: there you go. That's how I'd justify it to the 548 00:30:07,000 --> 00:30:09,720 Speaker 1: IRS. Exactly. Yeah. And as long as I count 549 00:30:09,760 --> 00:30:12,400 Speaker 1: it off as a work expense, everything's cool, right? Right. Yeah. 550 00:30:12,400 --> 00:30:16,000 Speaker 1: So what else did you see at 551 00:30:16,040 --> 00:30:21,000 Speaker 1: CES? So, actually getting back into the concept 552 00:30:21,000 --> 00:30:24,200 Speaker 1: of walking robots, as opposed to wheels, there was 553 00:30:24,240 --> 00:30:27,520 Speaker 1: a robot there by a company called UBTech, which 554 00:30:28,360 --> 00:30:32,000 Speaker 1: is actually appropriately named Walker. And it's called Walker because 555 00:30:32,080 --> 00:30:34,520 Speaker 1: this is a robot that walks around.
It's similar to 556 00:30:34,680 --> 00:30:38,600 Speaker 1: that Sony robot whose name escapes me that fell down 557 00:30:38,640 --> 00:30:41,760 Speaker 1: on stage famously back in the late eighties, 558 00:30:41,760 --> 00:30:44,880 Speaker 1: early nineties. Similar look to that, but this is a 559 00:30:45,000 --> 00:30:47,400 Speaker 1: robot that's about, I want to say, about 560 00:30:47,480 --> 00:30:49,440 Speaker 1: four and a half feet tall, so it's a pretty 561 00:30:49,480 --> 00:30:53,560 Speaker 1: big robot, too, that can walk around. And they put 562 00:30:53,640 --> 00:30:56,520 Speaker 1: on a demonstration for me where, and granted, this is 563 00:30:56,520 --> 00:30:59,880 Speaker 1: a very specific set of steps that were set up 564 00:31:00,200 --> 00:31:03,520 Speaker 1: to demonstrate the robot's capabilities, but basically what it did 565 00:31:03,680 --> 00:31:06,040 Speaker 1: was it came to the door to greet you, and 566 00:31:06,080 --> 00:31:08,680 Speaker 1: it took your coat, hung the coat up on a wall, 567 00:31:09,560 --> 00:31:11,360 Speaker 1: and then, oh, I should also mention, it 568 00:31:11,400 --> 00:31:15,200 Speaker 1: opened the door before you walked in. Then it 569 00:31:15,320 --> 00:31:18,360 Speaker 1: followed you to the middle of the room. The demonstrator 570 00:31:18,400 --> 00:31:20,600 Speaker 1: went down and sat on the couch. So it went 571 00:31:20,680 --> 00:31:24,280 Speaker 1: and got a tasty beverage out of the refrigerator: opened 572 00:31:24,280 --> 00:31:26,880 Speaker 1: the door of the refrigerator, pulled out 573 00:31:26,880 --> 00:31:29,760 Speaker 1: the can, I think it was soda or whatever, and 574 00:31:29,800 --> 00:31:33,240 Speaker 1: then closed the refrigerator.
Went over and picked up a 575 00:31:33,320 --> 00:31:37,440 Speaker 1: can of chips that come in a can, a brand that will go nameless, 576 00:31:37,480 --> 00:31:41,480 Speaker 1: but it rhymes with Flingles, and then brought it over 577 00:31:41,600 --> 00:31:43,280 Speaker 1: to the guy on the couch, 578 00:31:43,720 --> 00:31:46,000 Speaker 1: and it said, by the way, you have a hot 579 00:31:46,120 --> 00:31:48,760 Speaker 1: date coming up tonight, but it's supposed to rain. Can 580 00:31:48,840 --> 00:31:50,840 Speaker 1: I get you your umbrella? And he said, yeah, I'd love 581 00:31:50,880 --> 00:31:52,720 Speaker 1: for you to get my umbrella. So then it walked 582 00:31:52,760 --> 00:31:55,560 Speaker 1: over and picked up the umbrella, came back and handed 583 00:31:55,600 --> 00:31:58,959 Speaker 1: over the umbrella, and shuffled him out the door. So 584 00:31:59,040 --> 00:32:02,600 Speaker 1: I mean, it was a lot of very complex motions 585 00:32:02,640 --> 00:32:05,040 Speaker 1: that it had to get down. Now, you could tell 586 00:32:05,120 --> 00:32:08,880 Speaker 1: that this was a very specific demo. Um, there 587 00:32:08,920 --> 00:32:14,480 Speaker 1: were targets all over the floor, like barcode-looking targets, 588 00:32:14,520 --> 00:32:16,760 Speaker 1: that kind of told the robot, this is where this 589 00:32:16,800 --> 00:32:18,840 Speaker 1: item will be. So it didn't really have to know 590 00:32:19,280 --> 00:32:22,160 Speaker 1: where all this stuff was, or I should say, it didn't 591 00:32:22,160 --> 00:32:24,000 Speaker 1: have to find all this stuff. It already knew where 592 00:32:24,000 --> 00:32:26,080 Speaker 1: all this stuff was.
So, you know, the 593 00:32:26,160 --> 00:32:28,520 Speaker 1: soda was put in a very specific place, the Pringles were 594 00:32:28,520 --> 00:32:30,600 Speaker 1: put in a very, oh, I said the name, were put 595 00:32:30,600 --> 00:32:33,600 Speaker 1: in a very specific place, the umbrella was in a 596 00:32:33,600 --> 00:32:37,600 Speaker 1: specific place, yada, yada, yada. But still, the fact 597 00:32:37,680 --> 00:32:40,600 Speaker 1: that this robot was up and around and walking and 598 00:32:40,640 --> 00:32:45,360 Speaker 1: talking and not falling over was impressive. It did a little dance at 599 00:32:45,400 --> 00:32:47,960 Speaker 1: the end of the demo, so that was kind of cute. 600 00:32:48,560 --> 00:32:51,960 Speaker 1: It was very apparent that, first of all, 601 00:32:52,000 --> 00:32:56,080 Speaker 1: this was going to be a very expensive consumer model 602 00:32:56,680 --> 00:32:58,360 Speaker 1: when it does come out. I was 603 00:32:58,400 --> 00:33:01,800 Speaker 1: talking with one of the PR guys, and they anticipated 604 00:33:01,840 --> 00:33:05,080 Speaker 1: that as the technology evolves, the cost will eventually come 605 00:33:05,120 --> 00:33:08,680 Speaker 1: down to a place where it's reasonable for, I don't know, 606 00:33:08,760 --> 00:33:12,440 Speaker 1: not me, but more affluent people. Right. But the fact 607 00:33:12,480 --> 00:33:15,480 Speaker 1: that this is on legs means that it can handle stairs. 608 00:33:15,640 --> 00:33:18,479 Speaker 1: It can handle, you know, my split-level house wouldn't 609 00:33:18,520 --> 00:33:23,160 Speaker 1: bother it all that much. And it has, you know, fingers, 610 00:33:23,200 --> 00:33:25,920 Speaker 1: you know, manipulable digits and thumbs, so it can 611 00:33:25,960 --> 00:33:29,239 Speaker 1: pick up things and carry them around.
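The barcode-like floor targets described in this demo effectively reduce "finding" an object to a table lookup: each marker ID is tied in advance to an item and a position, so the robot never has to search for anything. Here is a minimal sketch of that idea; the marker IDs, item names, and coordinates are invented for illustration and do not reflect UBTech's actual system.

```python
# Toy fiducial-marker map, in the spirit of the floor targets in the
# Walker demo: the robot doesn't find objects, it looks up where a
# marker says they are. All IDs and poses below are hypothetical.

MARKER_MAP = {
    17: {"item": "soda can", "pose": (2.0, 0.5)},
    23: {"item": "chip can", "pose": (2.4, 1.1)},
    42: {"item": "umbrella", "pose": (0.3, 3.0)},
}

def locate(detected_marker_ids):
    """Return known items whose markers the camera currently sees."""
    return {MARKER_MAP[m]["item"]: MARKER_MAP[m]["pose"]
            for m in detected_marker_ids if m in MARKER_MAP}
```

With markers, the hard perception problem shrinks to detecting which IDs are in view; everything else is a dictionary lookup, which is exactly why such demos can look so smooth.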
So it's, 612 00:33:29,320 --> 00:33:34,000 Speaker 1: potentially, farther down the road, a lot more useful than something 613 00:33:34,040 --> 00:33:37,200 Speaker 1: like a Temi, which is basically a rolling serving tray. 614 00:33:37,240 --> 00:33:41,200 Speaker 1: But because it's that much more complicated, you 615 00:33:41,280 --> 00:33:43,000 Speaker 1: know that the costs associated with it are going to 616 00:33:43,040 --> 00:33:47,440 Speaker 1: be huge, at least at the start. Yeah, sure, yeah, absolutely. 617 00:33:47,520 --> 00:33:50,480 Speaker 1: And the idea of having the barcodes there to 618 00:33:50,760 --> 00:33:53,200 Speaker 1: give the robot the points of reference it needs 619 00:33:53,240 --> 00:33:56,800 Speaker 1: in order to do the demonstration, that is not surprising 620 00:33:56,840 --> 00:34:00,400 Speaker 1: to me. It is incredibly challenging to create a robot 621 00:34:00,440 --> 00:34:04,040 Speaker 1: that is capable of not just sensing its environment. That 622 00:34:04,120 --> 00:34:07,280 Speaker 1: part we've got down. You know, we've got amazing 623 00:34:07,320 --> 00:34:09,920 Speaker 1: sensors out there where robots can tell when they're getting 624 00:34:09,920 --> 00:34:13,600 Speaker 1: close to, say, an obstacle of some sort. But to 625 00:34:13,880 --> 00:34:17,880 Speaker 1: be able to do pathfinding, that's far more challenging and 626 00:34:17,960 --> 00:34:21,160 Speaker 1: requires a lot of artificial intelligence. You may 627 00:34:21,200 --> 00:34:26,000 Speaker 1: remember ASIMO, the robot that Honda made years and years 628 00:34:26,000 --> 00:34:29,160 Speaker 1: ago. Fantastic. It was the first robot that could 629 00:34:29,160 --> 00:34:32,160 Speaker 1: ever run.
And by run, it means that during part 630 00:34:32,160 --> 00:34:35,359 Speaker 1: of its gait both feet are off the ground for 631 00:34:35,440 --> 00:34:39,120 Speaker 1: a split second, so its run actually looks like it's 632 00:34:39,120 --> 00:34:42,520 Speaker 1: doing a weird little hoppy jog. I 633 00:34:42,520 --> 00:34:45,520 Speaker 1: always thought that it looked like ASIMO really needed to 634 00:34:45,640 --> 00:34:49,919 Speaker 1: use the facilities. Yeah, you've never seen 635 00:34:49,960 --> 00:34:54,040 Speaker 1: me run either, so I'm not so much passing judgment as 636 00:34:54,040 --> 00:34:57,080 Speaker 1: just making an observation. I wrote How ASIMO Works, 637 00:34:57,080 --> 00:34:59,359 Speaker 1: so I had to watch a lot of video of it. 638 00:34:59,800 --> 00:35:04,040 Speaker 1: But as amazing as the robot was, and 639 00:35:04,080 --> 00:35:10,000 Speaker 1: it truly was remarkable, it required programmers to work very 640 00:35:10,000 --> 00:35:13,160 Speaker 1: closely with it so that it would follow pre-programmed routes. 641 00:35:13,640 --> 00:35:16,480 Speaker 1: It could climb stairs, it could descend stairs, it could run, 642 00:35:16,560 --> 00:35:19,360 Speaker 1: but it had to have all of that environmental information 643 00:35:19,400 --> 00:35:22,400 Speaker 1: programmed into it ahead of time so that it would know 644 00:35:22,440 --> 00:35:25,600 Speaker 1: what route to take. It could not make that determination 645 00:35:25,640 --> 00:35:27,720 Speaker 1: on its own. You couldn't put it at the entrance 646 00:35:27,960 --> 00:35:31,319 Speaker 1: of a room on one side and say, exit the 647 00:35:31,400 --> 00:35:33,560 Speaker 1: room on the other side. You would have to program 648 00:35:33,600 --> 00:35:37,200 Speaker 1: that route into the robot. And, you know, we're slowly 649 00:35:37,280 --> 00:35:40,960 Speaker 1: moving beyond that.
We're moving to the point where artificial intelligence, 650 00:35:41,040 --> 00:35:44,840 Speaker 1: machine learning, pathfinding technology, all of these sorts of things 651 00:35:44,880 --> 00:35:47,480 Speaker 1: are getting more sophisticated, to the point where we might 652 00:35:47,480 --> 00:35:51,719 Speaker 1: be able to have robots find numerous ways to get 653 00:35:51,760 --> 00:35:54,440 Speaker 1: from point A to point B: the most efficient or 654 00:35:54,480 --> 00:35:57,120 Speaker 1: most effective ways, the most power-efficient ways, whatever it 655 00:35:57,160 --> 00:35:59,759 Speaker 1: may be. We're getting to that point, but 656 00:36:00,080 --> 00:36:04,960 Speaker 1: it's a very difficult problem to solve, not just 657 00:36:05,000 --> 00:36:09,840 Speaker 1: for robotics but for artificial intelligence in general. So 658 00:36:10,160 --> 00:36:13,080 Speaker 1: I give a lot of slack to companies for these 659 00:36:13,080 --> 00:36:17,720 Speaker 1: sorts of demonstrations, because they typically are saying, here's one 660 00:36:17,800 --> 00:36:21,240 Speaker 1: element of this technology that has really come a long way, 661 00:36:21,320 --> 00:36:26,640 Speaker 1: and here's how we're envisioning it in the future. And meanwhile, 662 00:36:26,680 --> 00:36:29,399 Speaker 1: you have this other set of challenges you still 663 00:36:29,440 --> 00:36:32,120 Speaker 1: have to overcome before that's truly achievable. But you want 664 00:36:32,160 --> 00:36:35,160 Speaker 1: to be able to show something off. Right, right, exactly. 665 00:36:35,800 --> 00:36:39,040 Speaker 1: It's important to recognize the advances that have been made 666 00:36:39,360 --> 00:36:41,640 Speaker 1: while recognizing that, you know, there's still a long way 667 00:36:41,680 --> 00:36:44,239 Speaker 1: to go. Exactly. It's like concept cars at 668 00:36:44,239 --> 00:36:46,560 Speaker 1: a car show.
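The point-A-to-point-B planning discussed here, where a robot weighs many possible routes and picks an efficient one, is classically done with graph search such as A*. Below is a minimal textbook sketch on a 2D grid map; it is the generic algorithm, not ASIMO's or any vendor's actual planner, and the grid representation of the room is an assumption.

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 4-connected grid of 0 (free) / 1 (blocked).

    Returns a list of (row, col) cells from start to goal, or None
    if the goal is unreachable. Textbook algorithm for illustration.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    frontier = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier,
                               (cost + 1 + h((nr, nc)), cost + 1,
                                (nr, nc), path + [(nr, nc)]))
    return None  # no route exists
```

Swapping the uniform per-step cost for an energy or time estimate would give the "most power-efficient" or "most effective" variants mentioned above; the search machinery stays the same.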
You know, when you see a concept 669 00:36:46,640 --> 00:36:48,960 Speaker 1: car, that is not what you're going to see on 670 00:36:49,000 --> 00:36:52,239 Speaker 1: the show floor at your car dealership down the road. 671 00:36:52,320 --> 00:36:55,800 Speaker 1: You're almost certain not to see that exact 672 00:36:55,840 --> 00:36:59,439 Speaker 1: same design, but elements of it are going to find 673 00:36:59,440 --> 00:37:03,319 Speaker 1: their way into cars down the line, hopefully, and 674 00:37:03,360 --> 00:37:07,319 Speaker 1: you'll get the best implementation of those ideas in 675 00:37:07,440 --> 00:37:10,520 Speaker 1: future models of cars. Same sort of thing here. These 676 00:37:10,520 --> 00:37:14,080 Speaker 1: demonstrations we see, they're meant to wow people, and they're 677 00:37:14,080 --> 00:37:16,239 Speaker 1: meant to show off. Yes, we've come a long way 678 00:37:16,280 --> 00:37:18,920 Speaker 1: in this particular fashion. We still have a ways to go, 679 00:37:19,280 --> 00:37:23,000 Speaker 1: but we want to make sure we're inspiring people, because 680 00:37:23,400 --> 00:37:27,239 Speaker 1: that helps drive the development process. Otherwise, if there's 681 00:37:27,280 --> 00:37:30,200 Speaker 1: no interest, there's 682 00:37:30,200 --> 00:37:33,120 Speaker 1: no money. If there's no money, there's no development. Well, 683 00:37:33,200 --> 00:37:35,399 Speaker 1: it also should be mentioned that a lot 684 00:37:35,440 --> 00:37:42,600 Speaker 1: of the capabilities that robots have today are also contributing 685 00:37:42,719 --> 00:37:45,720 Speaker 1: to other advancements that have been made outside the field 686 00:37:45,760 --> 00:37:50,560 Speaker 1: of robotics.
So, you know, for example, the software algorithms 687 00:37:50,560 --> 00:37:53,080 Speaker 1: that a robot can use to find the path from 688 00:37:53,120 --> 00:37:56,320 Speaker 1: A to B, that can also be used in autonomous driving. 689 00:37:56,360 --> 00:37:59,600 Speaker 1: That can also be used in even something like, you 690 00:37:59,640 --> 00:38:03,600 Speaker 1: know, warehouse picking, you know, finding various products 691 00:38:04,080 --> 00:38:06,479 Speaker 1: within a warehouse. There was another robot that I saw, 692 00:38:06,560 --> 00:38:09,239 Speaker 1: also by UBTECH, that was designed to be more 693 00:38:09,239 --> 00:38:11,560 Speaker 1: of a retail presence, where it would greet you at 694 00:38:11,600 --> 00:38:13,759 Speaker 1: the door and you would say, you know, you're at 695 00:38:13,760 --> 00:38:16,080 Speaker 1: Home Depot, I'm here to get a hammer, and then 696 00:38:16,120 --> 00:38:17,879 Speaker 1: you would push the hammer button and it would take 697 00:38:17,880 --> 00:38:19,600 Speaker 1: you over to where the hammers are, and then it 698 00:38:19,600 --> 00:38:21,520 Speaker 1: would show you a video on why this hammer is 699 00:38:21,560 --> 00:38:25,640 Speaker 1: the best. So a lot of that stuff that is 700 00:38:25,800 --> 00:38:28,440 Speaker 1: integrated into that robot is also going to be used 701 00:38:28,480 --> 00:38:33,640 Speaker 1: in, you know, various AI applications and location finding, you know, 702 00:38:33,719 --> 00:38:35,880 Speaker 1: who knows. It might even be used to implement, like, 703 00:38:35,960 --> 00:38:38,879 Speaker 1: a sort of indoor GPS type system. 704 00:38:38,920 --> 00:38:41,920 Speaker 1: One of the great things about technology is 705 00:38:41,920 --> 00:38:45,480 Speaker 1: that everything helps everything.
So if 706 00:38:45,560 --> 00:38:48,880 Speaker 1: you make an advancement in, you know, A, it 707 00:38:48,920 --> 00:38:52,040 Speaker 1: will also help in B, C and D. And as 708 00:38:52,080 --> 00:38:54,200 Speaker 1: you get more of these robots out in the real 709 00:38:54,239 --> 00:38:58,360 Speaker 1: world and you're able to get data on how they're doing, 710 00:38:58,640 --> 00:39:01,560 Speaker 1: then obviously that can go into the design process to 711 00:39:01,640 --> 00:39:04,960 Speaker 1: make the next generation of robots better or more effective, 712 00:39:05,120 --> 00:39:08,040 Speaker 1: or to do firmware updates to improve the operation of 713 00:39:08,120 --> 00:39:10,520 Speaker 1: robots that are already out there. Did you get 714 00:39:10,520 --> 00:39:14,719 Speaker 1: a chance to see the Samsung robot that takes your 715 00:39:14,760 --> 00:39:18,000 Speaker 1: pulse and tells you if you're healthy or not, or dead? 716 00:39:18,480 --> 00:39:22,600 Speaker 1: I saw the demonstration at the press conference; I 717 00:39:22,640 --> 00:39:25,160 Speaker 1: didn't actually get a chance to go face to face 718 00:39:25,200 --> 00:39:27,880 Speaker 1: with it, as it were. Usually things that are 719 00:39:27,920 --> 00:39:32,120 Speaker 1: at that level are normally hands off, I've found. Yes. Yeah, 720 00:39:32,160 --> 00:39:36,040 Speaker 1: typically it's like you might see a couple of friendly, 721 00:39:37,239 --> 00:39:41,360 Speaker 1: semi-approachable handlers who will be happy to answer questions 722 00:39:41,560 --> 00:39:43,799 Speaker 1: as long as you keep your filthy mitts off of it, 723 00:39:44,080 --> 00:39:48,080 Speaker 1: and they'll be happy to put their fingers on it. The fingerprints.
Yes, exactly, exactly. 724 00:39:48,719 --> 00:39:51,160 Speaker 1: But the thing about Samsung, and you know, 725 00:39:51,200 --> 00:39:53,720 Speaker 1: Samsung released, I want to say it was four different 726 00:39:53,840 --> 00:39:57,520 Speaker 1: bots at CES this year. They had 727 00:39:57,560 --> 00:40:04,040 Speaker 1: the Samsung Bot Care, the Bot Retail, the Bot Air, 728 00:40:04,880 --> 00:40:06,319 Speaker 1: and I want to say there was, maybe it was 729 00:40:06,360 --> 00:40:09,560 Speaker 1: only three, anyway. But one thing that I thought was 730 00:40:09,680 --> 00:40:12,280 Speaker 1: really interesting about the fact that they introduced three different 731 00:40:12,320 --> 00:40:17,080 Speaker 1: robots was they're kind of tacitly admitting that, right now, 732 00:40:17,200 --> 00:40:20,360 Speaker 1: this robot is designed for your health, and that's it. 733 00:40:20,840 --> 00:40:24,480 Speaker 1: This robot is designed for a retail experience, and that's it. 734 00:40:24,920 --> 00:40:27,680 Speaker 1: And this robot is designed to test the quality of 735 00:40:27,680 --> 00:40:29,279 Speaker 1: the air in your house or filter the air in 736 00:40:29,320 --> 00:40:32,920 Speaker 1: your house, and that is it. So one of 737 00:40:32,960 --> 00:40:36,800 Speaker 1: the biggest complications, especially when it comes to consumer robots, 738 00:40:37,080 --> 00:40:40,600 Speaker 1: is finding that robot that can do a hundred different things, 739 00:40:40,680 --> 00:40:43,000 Speaker 1: because you don't want a robot that's just going to 740 00:40:43,000 --> 00:40:45,200 Speaker 1: wake you up when your alarm's going off. You're 741 00:40:45,200 --> 00:40:46,880 Speaker 1: gonna want a robot that will wake you up 742 00:40:46,880 --> 00:40:48,600 Speaker 1: when your alarm's going off.
Then it'll go get your 743 00:40:48,600 --> 00:40:50,719 Speaker 1: toothbrush and your toothpaste, and then get the paper from 744 00:40:50,719 --> 00:40:53,400 Speaker 1: out in front. I'm just kidding, nobody reads papers anymore. But 745 00:40:53,480 --> 00:40:55,560 Speaker 1: then you want it to go, you know, start 746 00:40:55,600 --> 00:40:57,960 Speaker 1: making your breakfast, turn on the oven for you, stuff 747 00:40:58,000 --> 00:40:59,839 Speaker 1: like that. You want a robot that's gonna do all 748 00:40:59,880 --> 00:41:03,959 Speaker 1: those different things. And Samsung's robots were designed for, 749 00:41:04,600 --> 00:41:07,560 Speaker 1: pretty much, you have one job to do. Yeah, and again, 750 00:41:08,040 --> 00:41:10,239 Speaker 1: by focusing on the one job, you can try and 751 00:41:10,280 --> 00:41:13,000 Speaker 1: make sure you're making a thing that does what it's 752 00:41:13,000 --> 00:41:16,360 Speaker 1: supposed to do really, really well. But as 753 00:41:16,520 --> 00:41:19,520 Speaker 1: you point out, no one wants to sit down 754 00:41:19,520 --> 00:41:22,640 Speaker 1: and say, yeah, but do I want a hundred things 755 00:41:22,719 --> 00:41:25,120 Speaker 1: that each do one thing really, really well and clutter 756 00:41:25,200 --> 00:41:27,520 Speaker 1: up my house? And my house just becomes a house of robots. 757 00:41:27,560 --> 00:41:30,480 Speaker 1: And really, at that point, it just becomes, how can 758 00:41:30,520 --> 00:41:33,280 Speaker 1: I get another outlet installed so I can charge everything? 759 00:41:33,360 --> 00:41:36,839 Speaker 1: You know, it gets to a point where you start 760 00:41:36,880 --> 00:41:40,440 Speaker 1: to ask, well, is this useful enough to be worth 761 00:41:40,560 --> 00:41:43,200 Speaker 1: the investment it would take for me to have it 762 00:41:43,280 --> 00:41:46,839 Speaker 1: in my life?
And we're still at 763 00:41:46,880 --> 00:41:49,200 Speaker 1: that level where that's still going to be a struggle. 764 00:41:49,320 --> 00:41:54,360 Speaker 1: And again, it comes into play with how complicated 765 00:41:54,800 --> 00:41:59,080 Speaker 1: this whole approach is. In the sister episode to this one, 766 00:41:59,160 --> 00:42:01,840 Speaker 1: the one that we did on Android Authority, we 767 00:42:01,960 --> 00:42:05,120 Speaker 1: talked a bit about Misty, and I'm really not going 768 00:42:05,160 --> 00:42:07,319 Speaker 1: to dive into it here, but just to give a 769 00:42:07,400 --> 00:42:11,200 Speaker 1: quick hint to my listeners what that's all about: Misty, 770 00:42:11,239 --> 00:42:15,239 Speaker 1: they were creating a robot development platform, not 771 00:42:15,320 --> 00:42:18,480 Speaker 1: just a robot, although they did build that too. They 772 00:42:18,520 --> 00:42:23,759 Speaker 1: created a platform for developers to create different applications that 773 00:42:23,800 --> 00:42:28,080 Speaker 1: the robot could potentially do, and thus open this up 774 00:42:28,160 --> 00:42:32,040 Speaker 1: to a much broader array of developers. It's no longer 775 00:42:32,120 --> 00:42:35,520 Speaker 1: just an internal team that is part of a single 776 00:42:35,600 --> 00:42:38,080 Speaker 1: company that is trying to, you know, suss out, well, 777 00:42:38,080 --> 00:42:40,040 Speaker 1: how can we make the robot do this one thing 778 00:42:40,080 --> 00:42:42,400 Speaker 1: really well, or do this second thing just as well 779 00:42:42,440 --> 00:42:44,799 Speaker 1: as it does the first thing?
Now it's opened up 780 00:42:44,840 --> 00:42:47,800 Speaker 1: to an entire community with a sort of open-source 781 00:42:47,800 --> 00:42:53,399 Speaker 1: approach, almost, to really see where the limitations are. 782 00:42:53,440 --> 00:42:56,520 Speaker 1: Almost like what you were saying earlier, thinking, I have 783 00:42:56,719 --> 00:43:00,360 Speaker 1: practically a blank slate in front of me, 784 00:43:01,440 --> 00:43:04,359 Speaker 1: what's the most out-there thing I can make 785 00:43:04,400 --> 00:43:08,080 Speaker 1: it do? And then you start seeing all these different 786 00:43:08,080 --> 00:43:11,680 Speaker 1: applications arise as a result of that. That could go 787 00:43:11,760 --> 00:43:15,960 Speaker 1: a very long way toward getting closer to the robot 788 00:43:16,000 --> 00:43:18,400 Speaker 1: that can do a hundred things, as opposed to the 789 00:43:18,480 --> 00:43:22,000 Speaker 1: robot that does the one thing. And the overarching philosophy 790 00:43:22,080 --> 00:43:24,920 Speaker 1: behind the Misty II robot was to take an almost 791 00:43:24,960 --> 00:43:29,240 Speaker 1: Amazon Alexa type approach, where you can download different skills 792 00:43:29,320 --> 00:43:32,120 Speaker 1: depending on, you know, what you want 793 00:43:32,160 --> 00:43:34,799 Speaker 1: your robot to do. You could tell 794 00:43:35,400 --> 00:43:37,640 Speaker 1: Misty was really taking a long view 795 00:43:38,400 --> 00:43:42,640 Speaker 1: in terms of that. So anyway, that was 796 00:43:42,920 --> 00:43:46,279 Speaker 1: an indication of Misty taking the long approach, the it's 797 00:43:46,280 --> 00:43:48,840 Speaker 1: a marathon, not a sprint approach.
So you 798 00:43:48,840 --> 00:43:51,959 Speaker 1: could go buy a robot, and one of the beautiful things, 799 00:43:52,000 --> 00:43:53,560 Speaker 1: like you were saying before, is, you know, you could 800 00:43:53,560 --> 00:43:55,239 Speaker 1: take a robot and you can say, what's the most 801 00:43:55,280 --> 00:43:58,280 Speaker 1: out-there thing I want it to do? Well, Misty's approach 802 00:43:58,360 --> 00:44:01,239 Speaker 1: is to send out five hundred different robots to five 803 00:44:01,320 --> 00:44:04,000 Speaker 1: hundred different developers and have them each ask the question, 804 00:44:04,480 --> 00:44:06,400 Speaker 1: what's the most out-there thing I could imagine this 805 00:44:06,520 --> 00:44:09,000 Speaker 1: robot doing? And then they'll build it on their own, 806 00:44:10,000 --> 00:44:12,640 Speaker 1: and then you'll suddenly have a robot that's capable of 807 00:44:12,640 --> 00:44:15,640 Speaker 1: doing five hundred different things depending on what you want 808 00:44:15,640 --> 00:44:20,720 Speaker 1: to download. That's a really compelling thing about the Misty 809 00:44:20,760 --> 00:44:24,080 Speaker 1: II platform. Yeah, I love that approach. 810 00:44:24,120 --> 00:44:26,399 Speaker 1: It is something that I think is going to take 811 00:44:26,400 --> 00:44:30,520 Speaker 1: a few generations for us to really see what 812 00:44:31,000 --> 00:44:33,879 Speaker 1: truly stands out.
I mean, I'm guessing that the Misty team, 813 00:44:33,880 --> 00:44:37,719 Speaker 1: when they see these different approaches, they're gonna say, you know, 814 00:44:37,840 --> 00:44:41,719 Speaker 1: when we develop the next generation of this 815 00:44:41,760 --> 00:44:45,719 Speaker 1: particular platform, we're gonna change the design slightly, because we've 816 00:44:45,760 --> 00:44:48,480 Speaker 1: noticed a trend towards such and such, and we just 817 00:44:48,520 --> 00:44:50,960 Speaker 1: think that by changing the form factor a little bit, 818 00:44:51,440 --> 00:44:55,480 Speaker 1: we will facilitate that in a better way. And that's 819 00:44:55,560 --> 00:45:01,040 Speaker 1: a really cool, organic approach to developing consumer robotics. 820 00:45:01,600 --> 00:45:04,560 Speaker 1: And so we've seen both. We've seen the 821 00:45:04,640 --> 00:45:08,759 Speaker 1: very focused approach, where we're gonna go for something that's 822 00:45:08,760 --> 00:45:11,319 Speaker 1: not going to be terribly expensive, it's gonna do one 823 00:45:11,360 --> 00:45:14,280 Speaker 1: thing reasonably well. I mean, I would argue the Roomba 824 00:45:14,320 --> 00:45:17,279 Speaker 1: is a great example of this. It's, you know, 825 00:45:17,320 --> 00:45:21,399 Speaker 1: not cheaper than a vacuum cleaner, but it does 826 00:45:21,480 --> 00:45:24,440 Speaker 1: automate that process, if you have the money and 827 00:45:24,520 --> 00:45:26,360 Speaker 1: you have the right kind of space for it. My 828 00:45:26,440 --> 00:45:29,759 Speaker 1: house would be hopeless. I live in a townhouse. Yeah, 829 00:45:29,840 --> 00:45:31,920 Speaker 1: I'm just like, yeah, it would be nice to have, 830 00:45:32,000 --> 00:45:34,000 Speaker 1: except for the fact that I would have to own three 831 00:45:34,040 --> 00:45:36,680 Speaker 1: of them, and I like to have all my doors 832 00:45:36,719 --> 00:45:40,200 Speaker 1: open all the time.
Yeah, it just wouldn't. It's 833 00:45:40,200 --> 00:45:42,560 Speaker 1: not practical for my use, but for the right 834 00:45:42,600 --> 00:45:44,440 Speaker 1: type of home, you can make the 835 00:45:44,440 --> 00:45:46,880 Speaker 1: use case argument for it. And I 836 00:45:46,920 --> 00:45:48,880 Speaker 1: think that's fine, and we're gonna keep 837 00:45:48,920 --> 00:45:51,560 Speaker 1: seeing those; we're gonna see improvements of those over time. 838 00:45:51,960 --> 00:45:56,040 Speaker 1: But I also like this more almost experimental approach, where 839 00:45:56,480 --> 00:45:58,840 Speaker 1: we don't know what we're going to get yet, 840 00:45:58,960 --> 00:46:02,600 Speaker 1: and that's the beautiful thing about it. It's great a company 841 00:46:02,680 --> 00:46:05,720 Speaker 1: embraces that. It's a beautiful thing, and it's a scary 842 00:46:05,800 --> 00:46:08,719 Speaker 1: thing, because you don't know. It's the type 843 00:46:08,760 --> 00:46:11,719 Speaker 1: of thing where it could very easily blow up. But 844 00:46:11,840 --> 00:46:14,879 Speaker 1: I just think that, based on what I've seen so far, 845 00:46:14,920 --> 00:46:16,600 Speaker 1: and you'll have to listen to the first episode to 846 00:46:16,640 --> 00:46:19,040 Speaker 1: really get into what we're talking about in terms 847 00:46:19,040 --> 00:46:21,959 Speaker 1: of, like, the developer community that they're building, they've really 848 00:46:21,960 --> 00:46:24,239 Speaker 1: got the right approach, and I'm very optimistic about what 849 00:46:24,320 --> 00:46:27,040 Speaker 1: Misty is going to be doing in the future. I'm 850 00:46:27,080 --> 00:46:30,359 Speaker 1: really excited to see as well. I was 851 00:46:30,400 --> 00:46:32,640 Speaker 1: thrilled to hear about it.
Since I didn't get the 852 00:46:32,719 --> 00:46:35,920 Speaker 1: chance to go to the robotics parts of the 853 00:46:35,920 --> 00:46:39,279 Speaker 1: show floor, my CES experience this past year was much 854 00:46:39,320 --> 00:46:43,080 Speaker 1: more limited than it typically is, for better 855 00:46:43,120 --> 00:46:46,080 Speaker 1: and for worse. The better being, I didn't feel like 856 00:46:46,160 --> 00:46:49,440 Speaker 1: I had been through a twelve-round fight with 857 00:46:49,480 --> 00:46:53,000 Speaker 1: a championship boxer every day, which is typically how I 858 00:46:53,000 --> 00:46:54,680 Speaker 1: feel at the end of the day at CES. 859 00:46:55,040 --> 00:46:57,560 Speaker 1: But the bad part is I had a very limited 860 00:46:57,600 --> 00:47:00,520 Speaker 1: amount of stuff that I got to see in person, 861 00:47:00,560 --> 00:47:02,680 Speaker 1: although I got to hear about a lot of really 862 00:47:02,719 --> 00:47:05,359 Speaker 1: interesting stuff while I was there. And now you've 863 00:47:05,400 --> 00:47:07,799 Speaker 1: heard about more of it. Yeah. I'm so glad you 864 00:47:07,840 --> 00:47:10,600 Speaker 1: were able to join the show and to give us 865 00:47:10,640 --> 00:47:14,840 Speaker 1: your experience, your insight on this topic. And it was 866 00:47:15,080 --> 00:47:17,320 Speaker 1: a pleasure having you on. It was also a pleasure 867 00:47:17,400 --> 00:47:21,600 Speaker 1: being on your show, Android Authority. So Adam, 868 00:47:21,840 --> 00:47:25,719 Speaker 1: please tell my listeners where they can find your great work. Well, 869 00:47:25,760 --> 00:47:28,840 Speaker 1: thank you. I actually am a podcast 870 00:47:28,840 --> 00:47:31,719 Speaker 1: producer over with Digit Studios, which is associated with 871 00:47:31,800 --> 00:47:35,680 Speaker 1: Android Authority.
Obviously, we've already talked about the Android Authority podcast, 872 00:47:35,719 --> 00:47:38,120 Speaker 1: which I think would be 873 00:47:38,160 --> 00:47:40,080 Speaker 1: wonderful for your listeners to come over and take a 874 00:47:40,120 --> 00:47:43,680 Speaker 1: listen to. I think even more appropriate for your 875 00:47:44,000 --> 00:47:47,360 Speaker 1: more general look at technology is our Digit 876 00:47:47,440 --> 00:47:50,919 Speaker 1: Daily podcast, which is a podcast produced by me and 877 00:47:51,080 --> 00:47:55,360 Speaker 1: one other awesome dude named Sam. We 878 00:47:55,440 --> 00:47:57,799 Speaker 1: do a daily podcast, which is all the tech news 879 00:47:57,880 --> 00:48:00,239 Speaker 1: that you need to know, and it comes out every day 880 00:48:00,320 --> 00:48:02,920 Speaker 1: right around noon Central time, because I live in Central 881 00:48:02,960 --> 00:48:05,560 Speaker 1: Time and that's how I roll. 882 00:48:05,960 --> 00:48:08,279 Speaker 1: It is based on our newsletter, which is also really cool. 883 00:48:08,280 --> 00:48:11,080 Speaker 1: We work very well with the newsletter team; 884 00:48:11,080 --> 00:48:12,799 Speaker 1: we work together, we kind of play off each other. 885 00:48:12,800 --> 00:48:15,440 Speaker 1: It's really cool. So I actually think that, with the 886 00:48:15,480 --> 00:48:18,879 Speaker 1: more generalized nature of your audience, the Digit 887 00:48:18,960 --> 00:48:21,560 Speaker 1: Daily would probably be a good fit. And I 888 00:48:21,560 --> 00:48:23,520 Speaker 1: think the Android Authority would be an awesome fit for 889 00:48:23,560 --> 00:48:25,919 Speaker 1: those who are interested in mobile technology, because we break 890 00:48:25,920 --> 00:48:29,279 Speaker 1: down all the top news of Android every week.
891 00:48:29,480 --> 00:48:31,560 Speaker 1: So it's a 892 00:48:31,560 --> 00:48:34,480 Speaker 1: pretty great thing. Since you mentioned Parrot 893 00:48:34,480 --> 00:48:37,279 Speaker 1: earlier in the episode, the third person on 894 00:48:37,320 --> 00:48:41,080 Speaker 1: our Android Authority team is Jonathan Feist, who runs Drone Rush, 895 00:48:41,080 --> 00:48:43,400 Speaker 1: which is also a sister site of Android Authority, so 896 00:48:43,440 --> 00:48:45,920 Speaker 1: he's all drones, all the time. You've got to love it. 897 00:48:45,920 --> 00:48:47,680 Speaker 1: It's a really diverse group that we 898 00:48:47,680 --> 00:48:51,400 Speaker 1: have over there. Excellent. Yes, I urge my listeners 899 00:48:51,480 --> 00:48:54,520 Speaker 1: to go and check out those shows and those sites, 900 00:48:54,560 --> 00:48:58,319 Speaker 1: because it's always nice to run into other people in 901 00:48:58,360 --> 00:49:01,319 Speaker 1: the field who have this same sort of passion for 902 00:49:01,360 --> 00:49:04,960 Speaker 1: tech, whether it's a very specific niche in tech 903 00:49:05,360 --> 00:49:08,480 Speaker 1: or a general approach to tech. It's just great to 904 00:49:08,560 --> 00:49:13,040 Speaker 1: find communicators who really love it and love sharing 905 00:49:13,320 --> 00:49:17,360 Speaker 1: that enthusiasm and that knowledge with an audience. I value 906 00:49:17,440 --> 00:49:21,759 Speaker 1: that quite a bit, so it's great. This has been 907 00:49:21,800 --> 00:49:24,640 Speaker 1: really fun. Yeah, we'll have to do it again soon, 908 00:49:25,560 --> 00:49:28,600 Speaker 1: and I'm gonna do that. Hey, please do.
I mean, 909 00:49:28,600 --> 00:49:30,839 Speaker 1: I'm always happy to have someone to chat with on 910 00:49:30,920 --> 00:49:33,439 Speaker 1: this show, and goodness knows, I'm going to be having 911 00:49:33,480 --> 00:49:38,880 Speaker 1: some fun episode topics coming up in the near future. 912 00:49:39,640 --> 00:49:44,720 Speaker 1: There's a great spy story, and by great spy story 913 00:49:44,719 --> 00:49:48,719 Speaker 1: I mean a terrifying story about surveillance, that I'll be 914 00:49:48,760 --> 00:49:51,440 Speaker 1: covering pretty soon, that doesn't even have to do with 915 00:49:51,480 --> 00:49:53,920 Speaker 1: FaceTime. I'll be doing that one as well. So 916 00:49:53,960 --> 00:49:56,560 Speaker 1: there's a couple of surveillance episodes. Oh, you're doing a 917 00:49:56,600 --> 00:49:59,319 Speaker 1: FaceTime episode. That's great. Yes, yeah, I'll be doing one 918 00:49:59,320 --> 00:50:01,400 Speaker 1: of those as well. But the other one is 919 00:50:01,560 --> 00:50:04,480 Speaker 1: unrelated to that, or at least not directly related 920 00:50:04,520 --> 00:50:07,279 Speaker 1: to that. But we'll get into it later this week, 921 00:50:07,320 --> 00:50:10,600 Speaker 1: when I sit down and dive into this seedy, 922 00:50:10,840 --> 00:50:14,759 Speaker 1: dark world of spies. But that wraps up this episode 923 00:50:14,960 --> 00:50:18,320 Speaker 1: of Tech Stuff. If you guys have any questions, or 924 00:50:18,360 --> 00:50:21,239 Speaker 1: you have a suggestion for a future topic, maybe there's 925 00:50:21,280 --> 00:50:23,680 Speaker 1: someone you want me to have on the show. Maybe 926 00:50:23,719 --> 00:50:26,080 Speaker 1: you're all thinking, can you please get Adam back on here? 927 00:50:26,160 --> 00:50:28,200 Speaker 1: I definitely want to hear that. I know Adam would 928 00:50:28,200 --> 00:50:30,440 Speaker 1: love to hear that too. So you can write me.
929 00:50:30,560 --> 00:50:33,360 Speaker 1: The email address is tech stuff at how stuff works 930 00:50:33,440 --> 00:50:35,840 Speaker 1: dot com. You can pop on over to our website, 931 00:50:35,840 --> 00:50:38,880 Speaker 1: that's tech stuff podcast dot com. You'll find the archive 932 00:50:38,960 --> 00:50:40,879 Speaker 1: of all the old episodes there. You'll find the ways 933 00:50:40,920 --> 00:50:44,439 Speaker 1: to contact me on social media over there. And also, 934 00:50:44,640 --> 00:50:46,759 Speaker 1: don't forget to check out our merchandise store that's over 935 00:50:46,800 --> 00:50:49,720 Speaker 1: at t public dot com slash tech stuff. Every purchase 936 00:50:49,760 --> 00:50:51,840 Speaker 1: you make goes to help the show, and we greatly 937 00:50:51,880 --> 00:50:55,440 Speaker 1: appreciate it. And I'll talk to you again really soon. 938 00:51:01,160 --> 00:51:03,600 Speaker 1: For more on this and thousands of other topics, 939 00:51:03,640 --> 00:51:10,239 Speaker 1: visit how stuff works dot com.