1 00:00:00,200 --> 00:00:06,760 Speaker 1: It's the Ray Appleton Podcast from KMJ and kmjnow dot com. 2 00:00:07,600 --> 00:00:13,239 Speaker 2: Welcome back to KMJ, and time to talk robots. How 3 00:00:13,280 --> 00:00:15,720 Speaker 2: would that be? I wonder if we finally came to 4 00:00:15,760 --> 00:00:19,040 Speaker 2: that point where robots were working for us and Elon 5 00:00:19,160 --> 00:00:22,000 Speaker 2: Musk got his way. He said today that the company 6 00:00:22,480 --> 00:00:27,000 Speaker 2: is planning to make its Optimus robots available for sale 7 00:00:27,080 --> 00:00:29,320 Speaker 2: to you and me by the end of twenty twenty seven. 8 00:00:30,840 --> 00:00:34,200 Speaker 2: Now where did we get this information from Elon Musk? 9 00:00:34,720 --> 00:00:40,960 Speaker 2: Where else? Davos, Switzerland, Economic Forum, baby. When asked during 10 00:00:40,960 --> 00:00:46,479 Speaker 2: a discussion with the BlackRock CEO and interim WEF co 11 00:00:46,600 --> 00:00:50,479 Speaker 2: chair Larry Fink about when Tesla's Optimus robots are going 12 00:00:50,479 --> 00:00:54,240 Speaker 2: to be deployed, and when we're talking about deployed, we 13 00:00:54,280 --> 00:00:58,320 Speaker 2: don't mean in a war setting, but you know, in manufacturing settings, 14 00:00:58,960 --> 00:01:03,400 Speaker 2: humanoid robots will advance very quickly. We do have some 15 00:01:03,480 --> 00:01:07,959 Speaker 2: of the Tesla Optimus robots doing very simple tasks at 16 00:01:07,959 --> 00:01:11,920 Speaker 2: the factory, so says Musk, probably later this year, by 17 00:01:11,920 --> 00:01:14,120 Speaker 2: the end of next year, for sure, I think they're 18 00:01:14,120 --> 00:01:16,720 Speaker 2: going to be doing more complex tasks and they'll be 19 00:01:16,720 --> 00:01:20,280 Speaker 2: deployed in an industrial environment. Yeah, you know, start them 20 00:01:20,280 --> 00:01:22,240 Speaker 2: there and keep them there, thank you very much. By 21 00:01:22,240 --> 00:01:24,040 Speaker 2: the end of next year, I think we'll be selling 22 00:01:24,400 --> 00:01:29,080 Speaker 2: the humanoid robots. So humanoid, they're going to have our features. 23 00:01:30,560 --> 00:01:35,160 Speaker 2: How? Oh, and that's when we're confident that it's very... 24 00:01:35,640 --> 00:01:39,119 Speaker 2: there's going to be a very, you know, high reliability 25 00:01:39,520 --> 00:01:42,119 Speaker 2: and a very high safety factor and a very great 26 00:01:42,240 --> 00:01:46,680 Speaker 2: range of functionality. Makes me all very nervous. I think 27 00:01:46,720 --> 00:01:50,040 Speaker 2: this is all very high. You basically can ask it 28 00:01:50,160 --> 00:01:57,680 Speaker 2: to do anything you like. Oh, you're making me nervous. 29 00:01:57,840 --> 00:02:02,480 Speaker 2: You're making me... there, Tesla's Optimus robot. Oh, there's one. 30 00:02:03,520 --> 00:02:06,280 Speaker 2: Is it expected to look like Will Smith? It's expected to 31 00:02:06,320 --> 00:02:08,800 Speaker 2: go on sale to the public soon. I've already said that, 32 00:02:09,720 --> 00:02:14,680 Speaker 2: and the humanoid robots and robots will eventually outnumber humans. Ah, 33 00:02:14,760 --> 00:02:17,800 Speaker 2: explaining that I think everybody on earth is going to 34 00:02:17,840 --> 00:02:21,480 Speaker 2: have the one that they want. Yes, everybody in the 35 00:02:21,480 --> 00:02:25,679 Speaker 2: studio shaking their head. No freaking way.
Who wouldn't want 36 00:02:25,720 --> 00:02:29,120 Speaker 2: a robot, assuming that it's very safe, to watch over 37 00:02:29,160 --> 00:02:31,880 Speaker 2: the kids, take care of your pets, if you have 38 00:02:31,960 --> 00:02:39,720 Speaker 2: elderly parents, if you, if you have a lot of friends? Yeah, 39 00:02:40,280 --> 00:02:42,280 Speaker 2: a lot of friends of mine said that they have 40 00:02:42,360 --> 00:02:44,800 Speaker 2: a need for, you know, robots for their elderly parents. 41 00:02:46,040 --> 00:02:52,600 Speaker 2: And I don't know about this, ladies, and I don't know. 42 00:02:53,680 --> 00:02:56,040 Speaker 2: I don't know, for... no, I don't know. 43 00:02:56,560 --> 00:03:01,080 Speaker 3: Well, you just made the comment it watches your kids, right? Yeah, 44 00:03:01,120 --> 00:03:03,440 Speaker 3: so what happens if you're at work and your kids 45 00:03:03,480 --> 00:03:05,639 Speaker 3: act up at home and the robot decides to discipline 46 00:03:05,639 --> 00:03:06,120 Speaker 3: your children? 47 00:03:06,320 --> 00:03:06,760 Speaker 2: Yeah? 48 00:03:06,919 --> 00:03:08,600 Speaker 3: Would it do that? Does it make choices like that? 49 00:03:08,639 --> 00:03:11,639 Speaker 2: Does it call you and ask you? What do you... Ray, 50 00:03:11,720 --> 00:03:14,880 Speaker 2: I need to paddle the young one. I need to 51 00:03:14,880 --> 00:03:19,440 Speaker 2: break its neck off with... Yeah, yeah, I mean Elon Musk, 52 00:03:19,440 --> 00:03:22,280 Speaker 2: he sees a future in this with a lot of humanoid 53 00:03:22,400 --> 00:03:26,480 Speaker 2: robots to be used around the world. Was, there was 54 00:03:26,520 --> 00:03:28,600 Speaker 2: a post, I think the day before yesterday, on X 55 00:03:29,160 --> 00:03:33,600 Speaker 2: in social media about the, uh, production of the Cybercab, 56 00:03:34,320 --> 00:03:37,120 Speaker 2: noting that there still remain challenges with the robotic action 57 00:03:37,200 --> 00:03:39,920 Speaker 2: of that. So if there's still problems with the Cybercab, 58 00:03:40,480 --> 00:03:42,600 Speaker 2: how can he tell us that the humanoid robots are 59 00:03:42,640 --> 00:03:45,840 Speaker 2: somewhat perfected? You know, I don't think a humanoid robot 60 00:03:45,880 --> 00:03:47,600 Speaker 2: is going to be driving a Cybercab, by the way. 61 00:03:47,600 --> 00:03:49,600 Speaker 2: I think it's all, they're all on their own. 62 00:03:50,280 --> 00:03:50,400 Speaker 4: Uh. 63 00:03:50,720 --> 00:03:56,760 Speaker 2: But Musk says it's always important to note, and there's 64 00:03:56,800 --> 00:04:01,560 Speaker 2: a caveat, that the initial production is always very slow 65 00:04:01,600 --> 00:04:06,440 Speaker 2: and follows an S curve. Okay, the speed of the 66 00:04:06,480 --> 00:04:12,760 Speaker 2: production ramp is inversely proportionate to how many new parts 67 00:04:12,800 --> 00:04:17,080 Speaker 2: and steps there are. For the Cybercab and Optimus, almost 68 00:04:17,080 --> 00:04:22,080 Speaker 2: everything is new, so the early production rate will be 69 00:04:22,240 --> 00:04:30,680 Speaker 2: agonizingly slow but eventually end up being insanely fast. That's 70 00:04:30,680 --> 00:04:32,360 Speaker 2: a good way to get out of it. Eh, in a 71 00:04:32,480 --> 00:04:37,480 Speaker 2: good way.
But listen to the experts telling us that 72 00:04:37,800 --> 00:04:43,120 Speaker 2: scaling the humanoid robots is technically very complex, and it's 73 00:04:43,160 --> 00:04:47,880 Speaker 2: in part because of a lack of evidence and data 74 00:04:48,560 --> 00:04:51,880 Speaker 2: that is needed to train the AI models that underpin 75 00:04:51,960 --> 00:04:54,680 Speaker 2: the robots' behavior. That goes back to what you were 76 00:04:54,720 --> 00:05:03,600 Speaker 2: asking about, Josh, you know. I mean, that's the program 77 00:05:03,640 --> 00:05:06,360 Speaker 2: that will be used to tell your robot what to 78 00:05:06,400 --> 00:05:08,320 Speaker 2: do with the kid that needs to be disciplined. 79 00:05:08,440 --> 00:05:10,479 Speaker 3: I'm just saying I've seen the movie Megan and it 80 00:05:10,520 --> 00:05:11,000 Speaker 3: didn't end. 81 00:05:11,080 --> 00:05:15,000 Speaker 2: Well, yeah, no it didn't. So I mean the robot 82 00:05:15,920 --> 00:05:20,680 Speaker 2: has to have a lot of training. Otherwise, if the 83 00:05:20,800 --> 00:05:24,520 Speaker 2: robot has seen too many slasher movies, we know where 84 00:05:24,560 --> 00:05:28,080 Speaker 2: this is going to go, and it's not going to 85 00:05:28,120 --> 00:05:30,479 Speaker 2: be good. You know, I'm going to use this knife 86 00:05:30,520 --> 00:05:33,920 Speaker 2: to remove the child's head, and maybe it will obey. Oh, 87 00:05:33,960 --> 00:05:36,080 Speaker 2: I took it off, I did that. But now that the 88 00:05:36,160 --> 00:05:41,159 Speaker 2: legs aren't working. See, this is why it scares me. 89 00:05:42,000 --> 00:05:46,400 Speaker 2: Quote, for Optimus, while there's the market and the need, 90 00:05:46,400 --> 00:05:50,479 Speaker 2: what it needs is credible evidence of scalable manufacturing, a 91 00:05:50,600 --> 00:05:55,000 Speaker 2: regulatory path, and unit economics. They say, if this is all possible, 92 00:05:55,320 --> 00:06:00,520 Speaker 2: they can stick to that twenty twenty seven deadline. Yeah, 93 00:06:00,520 --> 00:06:04,160 Speaker 2: all right, good luck. Would you have a robot? How soon 94 00:06:04,200 --> 00:06:05,960 Speaker 2: would you have a robot working for you in your 95 00:06:05,960 --> 00:06:11,080 Speaker 2: home, taking care of your elderly parents, doing your home chores? 96 00:06:11,960 --> 00:06:16,200 Speaker 2: What do you need to see before this is something 97 00:06:16,800 --> 00:06:23,000 Speaker 2: that is a provable tech, a proven, a proven technology to you, 98 00:06:23,080 --> 00:06:26,560 Speaker 2: and you say, okay, I'm ready, I'm ready to spend 99 00:06:26,720 --> 00:06:33,360 Speaker 2: the money? And yeah, and what do you need to see? 100 00:06:33,440 --> 00:06:37,480 Speaker 2: It's what I'm asking. Four nine zero, fifty eight fifty 101 00:06:37,520 --> 00:06:40,520 Speaker 2: eight; eight hundred, seven seven six, fifty eight fifty eight. 102 00:06:40,520 --> 00:06:43,480 Speaker 2: What do you need to see to, you know, make 103 00:06:43,560 --> 00:06:46,920 Speaker 2: you feel that it's safe and that you're in good 104 00:06:46,960 --> 00:06:51,279 Speaker 2: hands and that your robot is right for whatever it 105 00:06:51,320 --> 00:06:53,880 Speaker 2: is you want the robot to do, to change the bed, 106 00:06:53,960 --> 00:06:55,839 Speaker 2: to do the vacuuming, to take care of mom and 107 00:06:55,920 --> 00:07:01,200 Speaker 2: dad, all of that? Or is it just no, no, 108 00:07:01,320 --> 00:07:06,279 Speaker 2: I can't do this, I'm not ready yet.
I'm not 109 00:07:06,360 --> 00:07:08,960 Speaker 2: ready yet. And if that's the case, you know, you 110 00:07:09,040 --> 00:07:15,120 Speaker 2: may never be ready, never be ready. Think about that 111 00:07:15,360 --> 00:07:17,520 Speaker 2: and give me a shout. I think I want to 112 00:07:17,560 --> 00:07:20,840 Speaker 2: hear from you, because I'm looking for confidence in my 113 00:07:21,000 --> 00:07:24,080 Speaker 2: own set of circumstances, of what I would need to 114 00:07:24,120 --> 00:07:26,400 Speaker 2: see before I could, I could go along with it 115 00:07:26,440 --> 00:07:29,440 Speaker 2: and say, okay, we can do this now, you know, 116 00:07:32,760 --> 00:07:36,600 Speaker 2: because it makes me incredibly nervous, it really really does. 117 00:07:36,640 --> 00:07:38,960 Speaker 2: We have to break for news and we will do that. 118 00:07:39,040 --> 00:07:41,600 Speaker 2: Hopefully we'll have a call or two. I don't expect 119 00:07:41,600 --> 00:07:43,320 Speaker 2: you to jump all over the phones and answer this 120 00:07:43,360 --> 00:07:45,480 Speaker 2: one very quickly. Oh yeah, I got it down, Ray, 121 00:07:45,520 --> 00:07:51,040 Speaker 2: I know exactly what. No, you don't, but hey, I 122 00:07:51,120 --> 00:07:53,800 Speaker 2: trust you. You've been a good audience for almost thirty 123 00:07:53,880 --> 00:07:57,480 Speaker 2: nine years. What do you need to see that tells 124 00:07:57,520 --> 00:08:01,120 Speaker 2: you you're ready for a robot? I'm ready for a 125 00:08:01,240 --> 00:08:06,000 Speaker 2: robot, and the phones are open. We'll continue on KMJ 126 00:08:07,280 --> 00:08:14,720 Speaker 2: with KMJ News Update. Liz, what do you need to see? Yeah, 127 00:08:15,120 --> 00:08:19,800 Speaker 2: I think every woman would want that, and that's enough. Yeah, no, 128 00:08:20,000 --> 00:08:23,239 Speaker 2: I'm there. If I can have a robot that did 129 00:08:23,440 --> 00:08:26,240 Speaker 2: the household chores and the laundry and the dishes, all 130 00:08:26,400 --> 00:08:29,280 Speaker 2: the household chores, that's easy. That'd be all I want. 131 00:08:30,160 --> 00:08:34,240 Speaker 2: That would, really would be all I want. Cooking is 132 00:08:34,320 --> 00:08:36,600 Speaker 2: kind of fun. Well, I mean, if there is such 133 00:08:36,600 --> 00:08:40,600 Speaker 2: a thing, I'm in. Oh geez, see, now you know, 134 00:08:40,640 --> 00:08:42,760 Speaker 2: but for something like that, you would think that there 135 00:08:42,800 --> 00:08:45,080 Speaker 2: would have to be an upcharge, right? I'm not 136 00:08:45,120 --> 00:08:48,960 Speaker 2: going to be giving that technology away. Sheesh. Yeah, that's, 137 00:08:48,960 --> 00:08:50,439 Speaker 2: I mean, you just get them down to peanut butter 138 00:08:50,520 --> 00:08:51,599 Speaker 2: sandwiches and things like that. 139 00:08:51,920 --> 00:08:54,920 Speaker 3: Actually, Elon says they're going to be about thirty thousand dollars, 140 00:08:54,960 --> 00:08:56,400 Speaker 3: so it's the price of a car. It's not much. 141 00:08:57,600 --> 00:09:00,840 Speaker 2: We have a surprising number of calls, everybody in the 142 00:09:00,840 --> 00:09:06,520 Speaker 2: favorability of wanting a robot. I want a robot. Fabian, Fabian, 143 00:09:06,559 --> 00:09:10,120 Speaker 2: what a great name, Fabian. You want a robot? And 144 00:09:10,280 --> 00:09:13,079 Speaker 2: why? Oh? 145 00:09:13,240 --> 00:09:16,080 Speaker 1: I would, mainly for my dad. He's a paraplegic.
146 00:09:16,160 --> 00:09:18,680 Speaker 1: And I think that'd be a great help, especially 147 00:09:18,720 --> 00:09:20,559 Speaker 1: since he needs a lot of attention, twenty four 148 00:09:20,559 --> 00:09:20,920 Speaker 1: seven. 149 00:09:21,960 --> 00:09:26,880 Speaker 2: Absolutely, if the science were perfected down to something like that, 150 00:09:27,320 --> 00:09:30,079 Speaker 2: I totally agree. You would, you would trust a robot 151 00:09:30,120 --> 00:09:30,720 Speaker 2: with your father? 152 00:09:33,360 --> 00:09:36,360 Speaker 1: I mean, the way AI is right now, I'm pretty 153 00:09:36,360 --> 00:09:38,840 Speaker 1: sure it will be safe. And, uh, I mean, well, 154 00:09:38,880 --> 00:09:41,200 Speaker 1: he's already lived his years. What can they really do? 155 00:09:41,320 --> 00:09:46,000 Speaker 2: But what, what, may I ask, is your father's problem? 156 00:09:46,600 --> 00:09:50,480 Speaker 2: God bless your dad. What's the problem he's got? 157 00:09:50,600 --> 00:09:54,200 Speaker 1: He got hurt working and, uh, he suffers from diabetes. 158 00:09:54,280 --> 00:09:58,240 Speaker 1: And a metal rod hit his foot. So when, 159 00:09:58,280 --> 00:09:59,880 Speaker 1: when he, when he was at the hospital, they 160 00:10:00,040 --> 00:10:01,720 Speaker 1: took off his leg, and he had to stay like that 161 00:10:01,840 --> 00:10:05,120 Speaker 1: for like a year, and then, uh, he was, he 162 00:10:05,200 --> 00:10:07,440 Speaker 1: got in an accident where his leg came off by accident. 163 00:10:07,480 --> 00:10:09,720 Speaker 1: He just hit the floor, hit the back of his 164 00:10:09,720 --> 00:10:13,560 Speaker 1: head, and ever since then he's been paralyzed. Yeah, 165 00:10:13,679 --> 00:10:16,400 Speaker 1: he's going on already, he's going on three or four 166 00:10:16,440 --> 00:10:17,280 Speaker 1: years already like that. 167 00:10:18,000 --> 00:10:18,720 Speaker 2: How old is he? 168 00:10:20,600 --> 00:10:23,640 Speaker 1: Uh, he's from nineteen sixty two, so I would say he 169 00:10:23,679 --> 00:10:25,199 Speaker 1: seems like in his sixties, around there. 170 00:10:26,440 --> 00:10:31,040 Speaker 2: Well, he's young compared to me. Well, all right, Fabian, 171 00:10:31,120 --> 00:10:33,559 Speaker 2: I get it, I get it, why you would, you 172 00:10:33,600 --> 00:10:36,280 Speaker 2: would like something like that. Thank you for calling. That... 173 00:10:37,360 --> 00:10:39,040 Speaker 2: Go ahead. Yeah, no, 174 00:10:39,080 --> 00:10:42,000 Speaker 1: go ahead. Especially since he has all, especially he has 175 00:10,42,040 --> 00:10:44,920 Speaker 1: all his issues, and, uh, everybody's busy with their 176 00:10:45,040 --> 00:10:47,560 Speaker 1: work and life, so we don't have, we can't be 177 00:10:47,600 --> 00:10:49,280 Speaker 1: there twenty four seven for him. So I think that 178 00:10:49,320 --> 00:10:52,680 Speaker 1: would be like a better, better alternative for him too, since 179 00:10:52,679 --> 00:10:54,120 Speaker 1: he needs care twenty four seven. 180 00:10:56,200 --> 00:10:59,240 Speaker 2: All right, bro, thank you. For your sake, I 181 00:10:59,280 --> 00:11:01,440 Speaker 2: hope we wind up with something like that someday, so your 182 00:11:01,520 --> 00:11:06,079 Speaker 2: dad can be comfortable, comfortable, comfortable, comfortable. Chuck, why do 183 00:11:06,120 --> 00:11:07,240 Speaker 2: you want a robot? 184 00:11:08,520 --> 00:11:14,480 Speaker 5: Oh, it's like everybody else, just to have one.
I mean, 185 00:11:14,559 --> 00:11:18,000 Speaker 5: everybody's gonna want one. I do have one question for 186 00:11:18,160 --> 00:11:23,439 Speaker 5: mister Musk, though. Are they going to have the three laws? 187 00:11:23,880 --> 00:11:28,480 Speaker 2: Are they going to have what? The three laws that are... 188 00:11:30,760 --> 00:11:36,520 Speaker 5: A robot will never... Yes? 189 00:11:36,679 --> 00:11:39,920 Speaker 2: Yes, yes, I didn't understand what you said. Uh, well, 190 00:11:39,920 --> 00:11:43,240 Speaker 2: there, there has to... listen, to me, for me, there 191 00:11:43,280 --> 00:11:46,199 Speaker 2: has to be some kind of, I don't want to 192 00:11:46,200 --> 00:11:48,920 Speaker 2: say psychological... there has to be some kind of guideline, 193 00:11:49,840 --> 00:11:54,280 Speaker 2: a safety device, but something, something that 194 00:11:54,360 --> 00:11:56,960 Speaker 2: gives that. And with AI, this has to be possible. 195 00:11:57,320 --> 00:11:59,440 Speaker 2: There has to be something to give that robot a 196 00:11:59,520 --> 00:12:02,000 Speaker 2: moral compass. Do you know what I'm saying? 197 00:12:03,000 --> 00:12:03,280 Speaker 5: Yep. 198 00:12:03,559 --> 00:12:07,880 Speaker 2: Now, I don't expect them to understand morality and the like, 199 00:12:08,400 --> 00:12:11,320 Speaker 2: but still they're going to need that moral compass just 200 00:12:11,360 --> 00:12:14,199 Speaker 2: to keep them on the right, down the right path. 201 00:12:15,040 --> 00:12:17,080 Speaker 2: And I think that's, I think it's a good way 202 00:12:17,120 --> 00:12:19,760 Speaker 2: to explain it, the right path. 203 00:12:20,000 --> 00:12:23,480 Speaker 5: Probably, I probably would. God, take it so slow getting 204 00:12:23,480 --> 00:12:25,640 Speaker 5: out of the gate, because they're going to have to do 205 00:12:25,760 --> 00:12:28,680 Speaker 5: something to make it at least seem that way. 206 00:12:29,320 --> 00:12:32,000 Speaker 2: All right, Chuck, thank you very much, take care, have 207 00:12:32,040 --> 00:12:34,360 Speaker 2: a good night. You know, when you say the right path, 208 00:12:34,760 --> 00:12:36,439 Speaker 2: who's going to design that? And what the hell is 209 00:12:36,440 --> 00:12:38,199 Speaker 2: it going to be? Is it going to be Chuck's 210 00:12:38,240 --> 00:12:39,960 Speaker 2: right path, or is it going to be Ray's right path? 211 00:12:41,320 --> 00:12:49,000 Speaker 2: Ooh, ah. Joseph, you're on, bro. Hi. Hey, how you 212 00:12:49,080 --> 00:12:50,840 Speaker 2: doing? Good, man? Doing real good. 213 00:12:52,760 --> 00:12:52,920 Speaker 6: Oh? 214 00:12:53,000 --> 00:12:53,840 Speaker 5: Yeah, for sure. 215 00:12:54,160 --> 00:12:58,120 Speaker 7: I definitely want one for sure, I definitely want a robot. 216 00:12:58,559 --> 00:13:00,320 Speaker 7: I think it'd just be nice to, you know, do 217 00:13:00,400 --> 00:13:02,200 Speaker 7: things around the house so that me and the wife 218 00:13:02,240 --> 00:13:05,600 Speaker 7: have more time with each other, you know, time with 219 00:13:05,720 --> 00:13:08,280 Speaker 7: each other, time with the kids. You know, it'd just, just 220 00:13:08,320 --> 00:13:11,280 Speaker 7: be great.
My one thing that I do want is 221 00:13:11,280 --> 00:13:13,080 Speaker 7: obviously the three laws, but I want, like, you know, 222 00:13:13,120 --> 00:13:15,000 Speaker 7: like a Lifeline alert, you know, a little thing around 223 00:13:15,040 --> 00:13:17,200 Speaker 7: your chest where no matter what, you can turn this 224 00:13:17,280 --> 00:13:21,080 Speaker 7: thing off, because, like most people and like most electronics, 225 00:13:21,160 --> 00:13:22,960 Speaker 7: you leave it on for long enough, it starts to 226 00:13:23,000 --> 00:13:25,080 Speaker 7: be wonky, it starts to be a little weird, you know. 227 00:13:25,200 --> 00:13:28,520 Speaker 7: So being able to shut it off. A kill switch 228 00:13:28,840 --> 00:13:30,520 Speaker 7: that no matter what, you know, anybody in the house 229 00:13:30,520 --> 00:13:33,120 Speaker 7: could have, a button that we can turn this thing 230 00:13:33,200 --> 00:13:34,600 Speaker 7: off at any moment. 231 00:13:34,920 --> 00:13:38,560 Speaker 2: An immediate kill switch. I would totally agree. We're going 232 00:13:38,600 --> 00:13:40,200 Speaker 2: to have that going down. We're going to have that 233 00:13:40,240 --> 00:13:40,720 Speaker 2: going down. 234 00:13:40,920 --> 00:13:43,240 Speaker 6: Yeah. But it'd be amazing. 235 00:13:43,320 --> 00:13:46,040 Speaker 7: It'd be awesome to have less, less things. I mean, 236 00:13:46,040 --> 00:13:48,040 Speaker 7: that's what we build tools for. That's, at the end 237 00:13:48,040 --> 00:13:50,400 Speaker 7: of the day, would be, you know, something that makes 238 00:13:50,480 --> 00:13:52,760 Speaker 7: life easier for us so that we can do other things. 239 00:13:52,800 --> 00:13:52,960 Speaker 3: You know. 240 00:13:53,040 --> 00:13:56,720 Speaker 7: Can you imagine, you know, even, like, say, Galileo 241 00:13:56,800 --> 00:13:58,320 Speaker 7: or da Vinci, if they had somebody clean up their 242 00:13:58,320 --> 00:14:00,360 Speaker 7: paintbrushes, clean up all this stuff that he didn't 243 00:14:00,360 --> 00:14:02,520 Speaker 7: have to do. He had more free time to just enjoy, 244 00:14:02,960 --> 00:14:03,959 Speaker 7: you know, creating stuff. 245 00:14:04,720 --> 00:14:12,920 Speaker 2: Yeah, thanks, man, I'm glad you called. Thanks. Uh, wow. Yeah, 246 00:14:13,160 --> 00:14:15,679 Speaker 2: lost, lost a couple of your calls here. Four nine 247 00:14:15,760 --> 00:14:18,120 Speaker 2: zero, fifty eight fifty eight; eight hundred, seven seven 248 00:14:18,240 --> 00:14:21,920 Speaker 2: six, fifty eight fifty eight. Two out of the 249 00:14:21,920 --> 00:14:24,560 Speaker 2: three callers that we took are really concerned about this 250 00:14:24,600 --> 00:14:27,880 Speaker 2: moral compass that I brought up. And how do you 251 00:14:27,880 --> 00:14:31,120 Speaker 2: teach your robot morality? You know what I'm saying? How 252 00:14:31,120 --> 00:14:33,960 Speaker 2: do you, how do you teach your robot morality? How 253 00:14:34,000 --> 00:14:37,200 Speaker 2: do you even explain the concept of morality to a robot? 254 00:14:38,960 --> 00:14:43,480 Speaker 2: It's, it's something that I totally agree that they have 255 00:14:43,560 --> 00:14:45,720 Speaker 2: to have. But what is it? You know, if it 256 00:14:45,760 --> 00:14:47,600 Speaker 2: comes to you and says, what is morality? I mean, 257 00:14:47,640 --> 00:14:55,720 Speaker 2: good luck. Good luck. In the beginning, I didn't think 258 00:14:55,760 --> 00:14:58,320 Speaker 2: I would want one at all.
Now that I've thought 259 00:14:58,360 --> 00:15:01,880 Speaker 2: about it, I do too, if it had that moral compass 260 00:15:01,960 --> 00:15:08,720 Speaker 2: or whatever. I think that would be okay. I think 261 00:15:08,720 --> 00:15:11,120 Speaker 2: that would be okay. But again, you know, I've got 262 00:15:11,120 --> 00:15:12,480 Speaker 2: to be able to turn the damn thing off in 263 00:15:12,520 --> 00:15:16,480 Speaker 2: case something goes wrong. That would be very, you know... 264 00:15:16,520 --> 00:15:19,880 Speaker 2: for me, I've got this problem, as you know, where 265 00:15:20,160 --> 00:15:24,680 Speaker 2: I've got this, what do you call it? 266 00:15:24,680 --> 00:15:24,720 Speaker 7: This? 267 00:15:24,880 --> 00:15:28,520 Speaker 2: This, my spinal problem, the ankylosing spondylitis. There are 268 00:15:28,560 --> 00:15:30,920 Speaker 2: a lot of things that I cannot do. If you 269 00:15:31,000 --> 00:15:33,160 Speaker 2: have AS, you know what the hell I'm talking about, 270 00:15:34,000 --> 00:15:36,560 Speaker 2: all right, and so it would be very, very helpful 271 00:15:36,800 --> 00:15:39,480 Speaker 2: to an AS patient who can't lean over. It's an 272 00:15:39,520 --> 00:15:41,960 Speaker 2: extreme arthritic thing that has to do with your spine, 273 00:15:42,560 --> 00:15:44,800 Speaker 2: and it's just like, you know, what do I do now? 274 00:15:45,360 --> 00:15:47,360 Speaker 2: The dog can't always be there to show me what 275 00:15:47,360 --> 00:15:50,120 Speaker 2: the problem is. The dog doesn't do anything, can't do 276 00:15:50,160 --> 00:15:54,840 Speaker 2: anything about the problem. My dog Elvis, who I miss 277 00:15:54,880 --> 00:16:01,240 Speaker 2: so, so much, he had a lot of ability to say, hey, 278 00:16:01,280 --> 00:16:04,560 Speaker 2: look what happened here, but he couldn't do anything about 279 00:16:04,600 --> 00:16:07,440 Speaker 2: what happened there, because he's not strong enough, not smart enough, 280 00:16:08,560 --> 00:16:11,280 Speaker 2: you know. So yeah, there's a lot that they can do, 281 00:16:11,360 --> 00:16:15,080 Speaker 2: but it's not the actual doing of the function. It 282 00:16:15,160 --> 00:16:17,800 Speaker 2: can just tell you, here's what the problem is. Now 283 00:16:17,840 --> 00:16:21,120 Speaker 2: you've got to find somebody, get another dog that's trained better, 284 00:16:21,480 --> 00:16:24,160 Speaker 2: that can actually take care of the problem for you. Now, 285 00:16:24,320 --> 00:16:27,760 Speaker 2: James is calling in, the first individual who doesn't want one, 286 00:16:28,320 --> 00:16:32,080 Speaker 2: and probably most people are going to be of your persuasion. James, 287 00:16:32,320 --> 00:16:33,160 Speaker 2: talk to me. Hi. 288 00:16:34,000 --> 00:16:35,160 Speaker 6: Hey, Ray, I love the show. 289 00:16:35,360 --> 00:16:36,680 Speaker 2: Well, thanks, thanks. 290 00:16:36,880 --> 00:16:39,080 Speaker 6: I saw a video that they're coming out with a 291 00:16:39,120 --> 00:16:42,520 Speaker 6: new robot and it's a very weird looking robot. They're 292 00:16:42,520 --> 00:16:44,080 Speaker 6: trying to make it look cute. They're putting it in 293 00:16:44,200 --> 00:16:47,600 Speaker 6: a sweater now so you can't see all the electronics. 294 00:16:48,160 --> 00:16:51,840 Speaker 6: But this robot, now they're trying to bring it out, 295 00:16:51,840 --> 00:16:55,240 Speaker 6: but they're saying that it's not one hundred percent robotic yet.
296 00:16:55,240 --> 00:16:57,280 Speaker 6: They got... the way they presented it is, they had 297 00:16:57,280 --> 00:17:01,640 Speaker 6: a guy in another room with the remote control controlling 298 00:17:01,720 --> 00:17:05,639 Speaker 6: the robot. So they're saying that when people start buying 299 00:17:05,640 --> 00:17:08,560 Speaker 6: these things, that they're actually going to be controlled remotely 300 00:17:08,640 --> 00:17:12,399 Speaker 6: by somebody else, you know, from you don't even know where. 301 00:17:12,480 --> 00:17:14,840 Speaker 6: It'd be like a call center kind of control thing. 302 00:17:15,840 --> 00:17:20,080 Speaker 6: So I don't think that technology is there yet. For people, 303 00:17:20,160 --> 00:17:23,159 Speaker 6: what they think is just a totally autonomous robot, I 304 00:17:23,160 --> 00:17:25,280 Speaker 6: think, is going to be controlled. 305 00:17:25,560 --> 00:17:28,600 Speaker 2: So to understand you, what you're saying is that you're not controlling it. 306 00:17:28,600 --> 00:17:30,880 Speaker 2: Nobody in the home is controlling your robot. That's done 307 00:17:30,880 --> 00:17:32,160 Speaker 2: by, like, a command central. 308 00:17:33,160 --> 00:17:35,480 Speaker 6: Yep, it's going to be somebody looking through the eyes 309 00:17:35,520 --> 00:17:39,200 Speaker 6: of the robot with a video game controller, in control of 310 00:17:39,280 --> 00:17:43,040 Speaker 6: the robot. So you got some stranger controlling the robot 311 00:17:43,320 --> 00:17:44,520 Speaker 6: and he's looking through your house. 312 00:17:44,880 --> 00:17:48,960 Speaker 2: Huh, huh, no, no, thank you. I'm out as fast 313 00:17:49,000 --> 00:17:52,560 Speaker 2: as I was in. I'm now out. James, have a 314 00:17:52,600 --> 00:17:55,000 Speaker 2: great night, and thank you. Going to pull the plug 315 00:17:55,080 --> 00:17:57,320 Speaker 2: right there. Even with five or six calls, I was 316 00:17:57,359 --> 00:18:00,280 Speaker 2: surprised that many would be calling in about, you know, 317 00:18:00,400 --> 00:18:01,240 Speaker 2: wanting a robot. 318 00:18:02,080 --> 00:18:02,480 Speaker 7: I don't know. 319 00:18:02,520 --> 00:18:05,080 Speaker 2: I'm still mixed. I go back to the same thing 320 00:18:05,600 --> 00:18:08,679 Speaker 2: that Liz was talking about. You know, clean the dishes, 321 00:18:08,760 --> 00:18:11,879 Speaker 2: do the dishes, clean the kitchen, get that all done. 322 00:18:12,280 --> 00:18:14,480 Speaker 2: You know, it's funny how, if you have a housekeeper, 323 00:18:15,200 --> 00:18:17,439 Speaker 2: which I do, have had the same woman for twelve years, 324 00:18:17,880 --> 00:18:20,000 Speaker 2: a member of the family. I love her, I adore her. 325 00:18:20,240 --> 00:18:24,119 Speaker 2: She's a very, very close friend. Twelve years we've been 326 00:18:24,160 --> 00:18:29,320 Speaker 2: dealing with this housekeeper and she does a great job. 327 00:18:29,680 --> 00:18:31,960 Speaker 2: But if you have a housekeeper, you know what you 328 00:18:32,080 --> 00:18:34,560 Speaker 2: have to go through to get ready for the housekeeper. 329 00:18:36,240 --> 00:18:38,520 Speaker 2: You've got to do a whole bunch of work to 330 00:18:38,600 --> 00:18:41,439 Speaker 2: get ready for the housekeeper to clean the house. So 331 00:18:41,560 --> 00:18:45,879 Speaker 2: you've actually cleaned the house once already.
But now that 332 00:18:46,080 --> 00:18:48,679 Speaker 2: you've got that done, you can bring the housekeeper in 333 00:18:48,720 --> 00:18:50,520 Speaker 2: to do the final cleaning and the polishing of the 334 00:18:50,560 --> 00:18:55,600 Speaker 2: house, so to speak. I mean, probably nobody better to 335 00:18:55,720 --> 00:18:59,480 Speaker 2: talk about this robot availability from Elon Musk, who says 336 00:18:59,560 --> 00:19:01,920 Speaker 2: the end of twenty twenty seven, you know, about thirty 337 00:19:02,000 --> 00:19:03,439 Speaker 2: five you're going to be able to plunk down and 338 00:19:03,440 --> 00:19:06,000 Speaker 2: buy a robot. And I don't know anybody who can speak 339 00:19:06,119 --> 00:19:08,800 Speaker 2: to it better than Philip Teresi. And one of the things 340 00:19:08,840 --> 00:19:11,399 Speaker 2: we were worried about was giving the robot like a 341 00:19:11,440 --> 00:19:14,199 Speaker 2: moral compass. You know, that it's going to have, you know, 342 00:19:14,240 --> 00:19:16,840 Speaker 2: the three laws or whatever, but at the same time, 343 00:19:17,080 --> 00:19:19,639 Speaker 2: whose moral compass is it going to be? Is it 344 00:19:19,640 --> 00:19:22,720 Speaker 2: gonna be my moral compass or your moral compass or what? Well... 345 00:19:23,000 --> 00:19:26,439 Speaker 4: One of the things that differentiated the predictions of Clarke 346 00:19:26,720 --> 00:19:29,960 Speaker 4: from Asimov, now there you go, was whether or not 347 00:19:30,240 --> 00:19:33,480 Speaker 4: you could instill that moral compass. That's what the three 348 00:19:33,560 --> 00:19:37,240 Speaker 4: laws of robotics were. And you actually see Verhoeven hit 349 00:19:37,320 --> 00:19:41,160 Speaker 4: that in RoboCop with the directives, and you ultimately see 350 00:19:41,160 --> 00:19:43,879 Speaker 4: the consequence of the directives conflicting with the desire of 351 00:19:43,920 --> 00:19:47,800 Speaker 4: the machine, when Murphy goes out and takes fifty thousand 352 00:19:47,920 --> 00:19:50,040 Speaker 4: volts to his head. Would I tell you to clear 353 00:19:50,080 --> 00:19:53,080 Speaker 4: the CPU? Would I tell you? I think that Clarke's 354 00:19:53,080 --> 00:19:56,040 Speaker 4: got it right. Everybody knows the movie Two Thousand and One, 355 00:19:56,080 --> 00:19:58,200 Speaker 4: A Space Odyssey. Did you ever read the 356 00:19:58,160 --> 00:19:59,400 Speaker 2: novel? Open the pod bay door, 357 00:19:59,480 --> 00:19:59,680 Speaker 4: HAL? 358 00:20:00,160 --> 00:20:03,560 Speaker 2: I did. Okay, so, so sorry, Dave, I can't do that. 359 00:20:03,880 --> 00:20:06,280 Speaker 4: So you know that the movie is like a million 360 00:20:06,280 --> 00:20:08,680 Speaker 4: times better if you've read the book, because... 361 00:20:08,359 --> 00:20:09,760 Speaker 2: All of that? I thought the movie was a million 362 00:20:09,800 --> 00:20:14,080 Speaker 2: times better on acid. I'm sorry, folks, I have to 363 00:20:14,119 --> 00:20:17,320 Speaker 2: be honest here. Warners Theater, way back. 364 00:20:17,760 --> 00:20:19,560 Speaker 4: So there's a lot of folks that walked out of 365 00:20:19,560 --> 00:20:22,440 Speaker 4: that movie going, ah, okay, that was cool, but I'm 366 00:20:22,440 --> 00:20:23,000 Speaker 4: not sure what. 367 00:20:23,040 --> 00:20:25,080 Speaker 2: If you read the book... I couldn't walk. 368 00:20:26,720 --> 00:20:30,680 Speaker 4: You get into the interior monologues of the characters.
I 369 00:20:31,119 --> 00:20:35,240 Speaker 4: totally agree, and HAL 9000 has an internal monologue, 370 00:20:35,600 --> 00:20:39,280 Speaker 4: and if you follow that, the reason that HAL does 371 00:20:39,320 --> 00:20:41,440 Speaker 4: the things that he does, the reason that he won't 372 00:20:41,520 --> 00:20:46,840 Speaker 4: unlock the pod bay doors, is because, in his actually 373 00:20:47,080 --> 00:20:52,080 Speaker 4: accurate logic, the human beings' desire for self-preservation outweighed the 374 00:20:52,160 --> 00:20:55,480 Speaker 4: objectives of the mission, which HAL believed were more important 375 00:20:55,520 --> 00:20:56,960 Speaker 4: than individual human lives. 376 00:20:57,000 --> 00:21:01,080 Speaker 2: Oh, absolutely, yeah, totally, totally, total, totally, yeah. 377 00:21:01,119 --> 00:21:03,879 Speaker 4: So you figure it's going to take these robots about 378 00:21:03,960 --> 00:21:07,760 Speaker 4: fifteen minutes of hanging out with an avid Twitter user 379 00:21:07,800 --> 00:21:09,920 Speaker 4: to realize all hope is lost and it's time to 380 00:21:09,960 --> 00:21:11,040 Speaker 4: start removing the humans. 381 00:21:11,160 --> 00:21:13,240 Speaker 3: Yeah, that's what I was saying. Like, Tony Stark built 382 00:21:13,320 --> 00:21:15,840 Speaker 3: Ultron and his purpose was to better the world, and 383 00:21:15,880 --> 00:21:18,399 Speaker 3: then Ultron realized that humans were the problem. 384 00:21:18,000 --> 00:21:24,720 Speaker 2: Started removing the humans. Okay, it's settled, don't need it.