1 00:00:00,280 --> 00:00:02,920 Speaker 1: Brought to you by the reinvented two thousand twelve Camry. 2 00:00:03,160 --> 00:00:07,600 Speaker 1: It's ready. Are you? Welcome to Stuff You Should Know 3 00:00:08,200 --> 00:00:11,560 Speaker 1: from HowStuffWorks.com. Stuff You Should Know 4 00:00:11,680 --> 00:00:13,920 Speaker 1: is brought to you by Visa. We all have things 5 00:00:13,920 --> 00:00:16,680 Speaker 1: we like to think about. Online fraud shouldn't be one 6 00:00:16,720 --> 00:00:20,440 Speaker 1: of them, because with every purchase, Visa prevents, detects 7 00:00:20,640 --> 00:00:26,480 Speaker 1: and resolves online fraud. Safe, secure, Visa. Hey, and welcome 8 00:00:26,520 --> 00:00:29,240 Speaker 1: to the podcast. I'm Josh Clark, staff writer here at 9 00:00:29,240 --> 00:00:32,080 Speaker 1: HowStuffWorks.com. With me, as always, is 10 00:00:32,159 --> 00:00:35,200 Speaker 1: Charles "Chuck" Bryant, also a staff writer here at HowStuff 11 00:00:35,240 --> 00:00:39,040 Speaker 1: Works.com. What up, staff writer Chuck? Well, Josh, 12 00:00:39,040 --> 00:00:41,400 Speaker 1: I think, as you can see, I'm just snuggled up 13 00:00:41,400 --> 00:00:43,960 Speaker 1: with my RealDoll here. I did notice, Chuck. I 14 00:00:44,120 --> 00:00:46,360 Speaker 1: wasn't going to bring it up. She is a looker, 15 00:00:46,479 --> 00:00:49,360 Speaker 1: I will give you that. Um, but she is what, 16 00:00:49,440 --> 00:00:53,680 Speaker 1: polyethylene? Maybe. I don't ask her and she doesn't answer, 17 00:00:53,760 --> 00:00:57,640 Speaker 1: so I'm not sure. That reminds me of a movie. Um, 18 00:00:57,680 --> 00:01:00,200 Speaker 1: I haven't seen it. I've been told by several people I will 19 00:01:00,360 --> 00:01:02,440 Speaker 1: definitely see it. Lars and the Real Girl. Yep, I've 20 00:01:02,440 --> 00:01:04,120 Speaker 1: seen it. It's really good. It's this good 21 00:01:04,120 --> 00:01:06,920 Speaker 1: little indie. It's got Ryan Gosling. He's a really great actor. 22 00:01:07,440 --> 00:01:10,399 Speaker 1: No spoilers, Chuck, I haven't seen it yet. Okay, okay, 23 00:01:10,400 --> 00:01:13,840 Speaker 1: go ahead. But the basic plot outline is that he 24 00:01:13,880 --> 00:01:16,200 Speaker 1: plays this lonely guy who gets a RealDoll. And 25 00:01:16,520 --> 00:01:18,800 Speaker 1: it's a real girl doll, right? Yes. Is that what 26 00:01:18,840 --> 00:01:22,240 Speaker 1: they're called? And uh, I had asked Samantha, but 27 00:01:22,280 --> 00:01:25,119 Speaker 1: she wouldn't answer. Um. And so he has this doll, 28 00:01:25,160 --> 00:01:27,000 Speaker 1: and the town kind of just accepts it 29 00:01:27,440 --> 00:01:30,479 Speaker 1: as being his girlfriend. And it's worth... Yeah, it's really... 30 00:01:30,480 --> 00:01:32,280 Speaker 1: it sounds really creepy, but it's actually kind of a 31 00:01:32,280 --> 00:01:35,319 Speaker 1: sweet movie. Well, you know, that's uh, that's actually not 32 00:01:35,480 --> 00:01:38,880 Speaker 1: too far off, um, from what some people are predicting 33 00:01:39,280 --> 00:01:42,160 Speaker 1: is going to be, uh, part of the future of humanity. 34 00:01:42,920 --> 00:01:46,760 Speaker 1: Not too far off at all: robot-human weddings. Um. 35 00:01:46,840 --> 00:01:49,040 Speaker 1: And we're not even talking real girls here. We're talking 36 00:01:49,080 --> 00:01:54,920 Speaker 1: about unreal girls that are, like, super unreal girls, yes, exactly. Um.
37 00:01:54,960 --> 00:01:58,720 Speaker 1: And apparently there's this guy named David Levy, who is, uh, 38 00:01:58,760 --> 00:02:01,200 Speaker 1: I think some sort of futurist, maybe a roboticist. 39 00:02:01,240 --> 00:02:04,520 Speaker 1: But he, he wrote a paper, um, based on research 40 00:02:04,560 --> 00:02:08,519 Speaker 1: he did in philosophy, um, sexology, which, to tell you 41 00:02:08,560 --> 00:02:11,800 Speaker 1: the truth, until I read his article, um, I didn't 42 00:02:11,840 --> 00:02:15,680 Speaker 1: realize was an actual discipline. Um, robotics, of course, 43 00:02:15,720 --> 00:02:18,840 Speaker 1: all sorts of other stuff. Uh, sexology. It's like the 44 00:02:18,880 --> 00:02:22,160 Speaker 1: greatest word ever. It sounds like an MTV show, it 45 00:02:22,560 --> 00:02:26,600 Speaker 1: really does, or like a drink you get in Cancun, right. Um. 46 00:02:26,680 --> 00:02:31,120 Speaker 1: So Levy combines all these disciplines together and comes up 47 00:02:31,160 --> 00:02:37,880 Speaker 1: with the notion that by, um, some states, starting with Massachusetts, 48 00:02:37,880 --> 00:02:43,919 Speaker 1: he posits, will allow human-robot weddings, marriages legally recognized. Yeah. 49 00:02:43,960 --> 00:02:47,320 Speaker 1: In fact, I believe he said, quote, it is inevitable. Yes, 50 00:02:47,360 --> 00:02:51,520 Speaker 1: he did. Even before that, though, another roboticist predicted that 51 00:02:51,560 --> 00:02:55,519 Speaker 1: by, uh, not too far off from now, people will 52 00:02:55,520 --> 00:02:58,640 Speaker 1: be having sex with robots, which, I mean, once you 53 00:02:58,680 --> 00:03:01,520 Speaker 1: start having sex with something, somebody 54 00:03:01,520 --> 00:03:04,920 Speaker 1: eventually wants to marry it. We are a moral species. Yeah, 55 00:03:04,960 --> 00:03:07,400 Speaker 1: you could argue that we're not, but yeah, generally you're right. 56 00:03:07,520 --> 00:03:09,840 Speaker 1: Just like in Lars and the Real Girl, he had a 57 00:03:09,840 --> 00:03:12,519 Speaker 1: lot of respect for his RealDoll, and I don't 58 00:03:12,520 --> 00:03:14,480 Speaker 1: want to ruin it. Please don't, Chuck. But it was, 59 00:03:14,520 --> 00:03:17,480 Speaker 1: it was a relationship based on, if not mutual, at 60 00:03:17,520 --> 00:03:20,880 Speaker 1: least one-way respect. Right, right. But, um, with this, 61 00:03:21,000 --> 00:03:23,760 Speaker 1: it's, uh, with a robot, it's going to be 62 00:03:23,880 --> 00:03:28,119 Speaker 1: much more, um, mutual. I don't know about, necessarily, feelings 63 00:03:28,200 --> 00:03:30,040 Speaker 1: for robots. I don't think we're going to 64 00:03:30,120 --> 00:03:32,080 Speaker 1: reach that point, but it will appear a lot more 65 00:03:32,200 --> 00:03:36,080 Speaker 1: mutual, because they can program respect, or at least things 66 00:03:36,080 --> 00:03:38,840 Speaker 1: that it can say that would indicate it. Right, right. Now, um, 67 00:03:39,480 --> 00:03:41,920 Speaker 1: one of the things that's, that's leading the way, that's 68 00:03:41,960 --> 00:03:45,000 Speaker 1: going to be allowing things like robots that people would 69 00:03:45,080 --> 00:03:48,240 Speaker 1: want to have sex with or marry, um, is, like, 70 00:03:48,280 --> 00:03:50,720 Speaker 1: this skin that's being developed. There's a guy who 71 00:03:50,800 --> 00:03:53,440 Speaker 1: used to work for Disney. He's a roboticist who, uh, 72 00:03:53,640 --> 00:03:56,920 Speaker 1: created the skin that bunches and wrinkles.
And when you 73 00:03:57,000 --> 00:04:02,400 Speaker 1: have lifelike skin, you can convey emotions through lifelike 74 00:04:02,480 --> 00:04:06,600 Speaker 1: facial expressions. Right. Once you start having that, you've 75 00:04:06,640 --> 00:04:10,000 Speaker 1: got a really realistic-looking robot. Yeah. I don't think 76 00:04:10,000 --> 00:04:12,080 Speaker 1: I'll ever go to the Hall of Presidents again with 77 00:04:12,120 --> 00:04:16,120 Speaker 1: the same eyes. No, exactly. Abraham Lincoln, or the head 78 00:04:16,160 --> 00:04:19,559 Speaker 1: of Abraham Lincoln. Right. Um. This whole, this whole issue 79 00:04:19,600 --> 00:04:23,320 Speaker 1: of possibly marrying robots and definitely having sex with robots 80 00:04:23,920 --> 00:04:27,760 Speaker 1: has brought to the attention of some people, uh, 81 00:04:27,800 --> 00:04:30,520 Speaker 1: the concept of robot rights. Have you heard of 82 00:04:30,640 --> 00:04:33,039 Speaker 1: robot rights? I mean, the movement behind it? Yeah, I know 83 00:04:33,120 --> 00:04:35,479 Speaker 1: Japan is kind of at the forefront of this 84 00:04:35,520 --> 00:04:37,919 Speaker 1: whole thing. They are now, but they were lagging for 85 00:04:37,960 --> 00:04:40,600 Speaker 1: a little while. Um. At first it was just South Korea, 86 00:04:40,920 --> 00:04:47,920 Speaker 1: um, and Europe. Basically Japan was conspicuously absent from the table. 87 00:04:48,120 --> 00:04:50,600 Speaker 1: And they're, they're at the forefront of robotics. They needed 88 00:04:50,600 --> 00:04:52,240 Speaker 1: to be there, so they finally caught up and now 89 00:04:52,279 --> 00:04:56,840 Speaker 1: they're, they're all about robot rights. Um. And a lot 90 00:04:56,839 --> 00:04:58,599 Speaker 1: of people are kind of like, what is this? Why 91 00:04:58,640 --> 00:05:02,360 Speaker 1: would we even have robot rights? This is ridiculous, it's silly. Uh, 92 00:05:02,400 --> 00:05:06,479 Speaker 1: and to those people, proponents of robot rights say, you 93 00:05:06,480 --> 00:05:10,120 Speaker 1: ever heard of animal rights? Uh, yeah. Isn't that, technically, 94 00:05:10,279 --> 00:05:13,080 Speaker 1: you know, a silly idea in the same vein? But 95 00:05:13,440 --> 00:05:16,200 Speaker 1: if you, if you really think about it, these are animals, 96 00:05:16,200 --> 00:05:19,360 Speaker 1: but we've, we humans have established rights for how we 97 00:05:19,400 --> 00:05:21,520 Speaker 1: interact with them, how we allow them to interact with us, 98 00:05:21,640 --> 00:05:23,479 Speaker 1: and it's accepted now. And I think a lot of 99 00:05:23,480 --> 00:05:25,320 Speaker 1: people years ago might have thought the same thing about 100 00:05:25,320 --> 00:05:27,760 Speaker 1: animal rights, right, as they do about robot rights. Right, 101 00:05:27,839 --> 00:05:31,480 Speaker 1: and I think, um, with robots, especially with robots that 102 00:05:31,520 --> 00:05:35,960 Speaker 1: can give, like, a lifelike appearance, really awful 103 00:05:36,000 --> 00:05:38,080 Speaker 1: things are going to come out of humans. You know, 104 00:05:38,200 --> 00:05:41,960 Speaker 1: like if, if you have a robot that can recoil 105 00:05:42,040 --> 00:05:45,000 Speaker 1: in horror or wince in pain, there's going to be 106 00:05:45,040 --> 00:05:46,760 Speaker 1: people out there who are going to want to, like, 107 00:05:46,880 --> 00:05:48,880 Speaker 1: kill them, like a drifter or something like that. Right.
108 00:05:48,920 --> 00:05:51,360 Speaker 1: The potential for abuse is big, and it's, yeah, not real 109 00:05:51,400 --> 00:05:55,080 Speaker 1: abuse, it's on a robot, but there's still something sociopathic about it, definitely. 110 00:05:55,200 --> 00:05:58,839 Speaker 1: But I, I predict that once, um, lifelike robots are 111 00:05:58,880 --> 00:06:02,800 Speaker 1: available, um, and produced en masse, I think there's going 112 00:06:02,839 --> 00:06:05,320 Speaker 1: to be a lot of awful stuff. And I think 113 00:06:05,360 --> 00:06:08,040 Speaker 1: that it's good that we're preparing for this now, because 114 00:06:08,240 --> 00:06:10,400 Speaker 1: I think the first half of the century is going 115 00:06:10,400 --> 00:06:14,359 Speaker 1: to see an explosion in advancements in robotics. Yeah. I 116 00:06:14,360 --> 00:06:17,080 Speaker 1: think South Korea said a robot in every house by 117 00:06:17,120 --> 00:06:20,120 Speaker 1: the year... Yeah. In that same year, actually, the US 118 00:06:20,160 --> 00:06:23,799 Speaker 1: has said that it plans to, uh, supplement one fifth 119 00:06:23,839 --> 00:06:28,320 Speaker 1: of its battalions with robots, which raises a whole other question. 120 00:06:28,360 --> 00:06:32,360 Speaker 1: I mean, like, do robots that, um, kill people, are 121 00:06:32,400 --> 00:06:36,039 Speaker 1: designed to kill people, are they afforded any kind of rights? 122 00:06:36,080 --> 00:06:38,640 Speaker 1: Should they be? Exactly. Should they be free from harm? That 123 00:06:38,760 --> 00:06:41,400 Speaker 1: kind of thing. Right. It's really, it's, it's too much 124 00:06:41,480 --> 00:06:44,160 Speaker 1: for my brain, to be honest. Let me get in 125 00:06:44,200 --> 00:06:48,360 Speaker 1: the driver's seat here, Chuck. The problem... So basically the 126 00:06:48,400 --> 00:06:51,880 Speaker 1: main argument is that, no, uh, a robot that's programmed 127 00:06:51,880 --> 00:06:54,840 Speaker 1: to kill should be able to have harm done to it. Right. 128 00:06:55,279 --> 00:06:57,680 Speaker 1: Most robots that we're going to interact with aren't going 129 00:06:57,720 --> 00:07:00,160 Speaker 1: to be designed to kill. If you see a robot 130 00:07:00,160 --> 00:07:01,720 Speaker 1: that you know is designed to kill, you should turn 131 00:07:01,760 --> 00:07:04,919 Speaker 1: and run really fast. Um, because it only needs to be programmed 132 00:07:04,920 --> 00:07:08,440 Speaker 1: to do one thing, right? Yeah, kill, kill, kill. Um. 133 00:07:08,520 --> 00:07:11,120 Speaker 1: So most of the robots we're going to interact with 134 00:07:11,160 --> 00:07:16,200 Speaker 1: will be helping around the house, serving in the sex trade. Um, 135 00:07:16,320 --> 00:07:19,320 Speaker 1: they already have those. Not the sex trade, although they may. 136 00:07:19,360 --> 00:07:21,560 Speaker 1: But you know, the little, the robots that clean the 137 00:07:21,560 --> 00:07:23,480 Speaker 1: floor and do things like that. Well, yeah, the 138 00:07:23,560 --> 00:07:27,360 Speaker 1: Roomba and the Scooba and all that kind of concept. Right, 139 00:07:27,480 --> 00:07:30,120 Speaker 1: these things are going to look a lot more lifelike. Like, 140 00:07:30,160 --> 00:07:33,760 Speaker 1: if you want a Mrs. Doubtfire, you can have Mrs. 141 00:07:33,760 --> 00:07:36,520 Speaker 1: Doubtfire working for you if you shell out enough cash.
142 00:06:37,120 --> 00:06:40,120 Speaker 1: So let's say you, you have Mrs. Doubtfire as 143 00:06:40,160 --> 00:06:42,520 Speaker 1: a household robot, and Mrs. Doubtfire is bringing you 144 00:06:42,560 --> 00:06:46,600 Speaker 1: a hot cup of coffee. Unfortunately, Mrs. Doubtfire trips and, 145 00:06:46,760 --> 00:06:49,840 Speaker 1: uh, spills the hot coffee on you, and you get 146 00:06:49,920 --> 00:06:53,520 Speaker 1: up and react by smacking Mrs. Doubtfire across the face. 147 00:06:54,080 --> 00:06:56,920 Speaker 1: Should you be penalized for that? Should you be punished? Right? 148 00:06:56,960 --> 00:06:59,240 Speaker 1: And I think that's what Japan and South 149 00:06:59,320 --> 00:07:02,560 Speaker 1: Korea and some others are trying to work out: the parameters of 150 00:07:02,600 --> 00:07:06,840 Speaker 1: what's allowed and what's not allowed, and whether or not 151 00:07:06,920 --> 00:07:09,320 Speaker 1: robots should have rights just like you and I. Right. Now, 152 00:07:09,400 --> 00:07:12,880 Speaker 1: let's say Mrs. Doubtfire, um, doesn't trip, but she 153 00:07:13,640 --> 00:07:15,720 Speaker 1: walks up to you and pours the hot coffee on you. 154 00:07:16,160 --> 00:07:20,240 Speaker 1: Who's responsible for that? Legally, the manufacturer of the robot, 155 00:07:20,360 --> 00:07:23,320 Speaker 1: in my mind. I'm no lawyer. Right, well, 156 00:07:23,440 --> 00:07:25,480 Speaker 1: no one has any idea. All these questions are totally 157 00:07:25,600 --> 00:07:26,960 Speaker 1: up in the air right now, and they're trying to 158 00:07:27,000 --> 00:07:29,400 Speaker 1: hammer them out. Um, and, and the whole thing kind 159 00:07:29,400 --> 00:07:32,839 Speaker 1: of goes both ways, actually. Humans are going to also 160 00:07:32,960 --> 00:07:36,200 Speaker 1: need protection from robots, which is where Mr. Isaac Asimov 161 00:07:36,320 --> 00:07:40,040 Speaker 1: comes in, right? Yeah, science fiction writer Isaac Asimov. I 162 00:07:40,040 --> 00:07:42,520 Speaker 1: think, uh, he was really one of the first 163 00:07:42,559 --> 00:07:45,920 Speaker 1: people to talk about robots and humans living together and 164 00:07:46,200 --> 00:07:50,079 Speaker 1: getting along, or not getting along. And, uh, he established 165 00:07:50,120 --> 00:07:51,920 Speaker 1: the Three Laws of Robotics, I think in one of 166 00:07:51,960 --> 00:07:54,800 Speaker 1: his short stories, Runaround. I think, yeah, Runaround, which 167 00:07:54,880 --> 00:07:58,040 Speaker 1: was actually in a collection of short stories called I, Robot, 168 00:07:58,120 --> 00:08:01,280 Speaker 1: which Will Smith, as you know, made into a pretty 169 00:08:01,520 --> 00:08:05,040 Speaker 1: substandard film, which borrowed from a couple of these, 170 00:08:05,120 --> 00:08:07,760 Speaker 1: and he actually used the Three Laws of Robotics. They 171 00:08:07,840 --> 00:08:09,600 Speaker 1: refer to those in the film as well. You want 172 00:08:09,640 --> 00:08:12,319 Speaker 1: to give them the three laws? I will. A robot 173 00:08:12,440 --> 00:08:17,000 Speaker 1: may not injure a human being, uh, or, through inaction, 174 00:08:17,160 --> 00:08:21,160 Speaker 1: allow a human to come to harm. Great first law. Yeah. 175 00:08:21,160 --> 00:08:24,080 Speaker 1: A robot must obey orders given it by humans, 176 00:08:24,080 --> 00:08:26,600 Speaker 1: except where such orders would conflict with the First Law.
177 00:09:27,520 --> 00:09:30,199 Speaker 1: And, uh, a robot must protect its own existence as 178 00:09:30,200 --> 00:09:33,240 Speaker 1: long as such protection does not conflict with the First 179 00:09:33,320 --> 00:09:36,360 Speaker 1: or Second Law. Right. And these laws just sound like 180 00:09:36,400 --> 00:09:39,520 Speaker 1: there's no way around them. Yeah. I was reading a 181 00:09:39,559 --> 00:09:44,720 Speaker 1: critical analysis of Asimov, um, and basically the author pointed 182 00:09:44,720 --> 00:09:47,760 Speaker 1: out that he didn't think Asimov thought 183 00:09:47,800 --> 00:09:49,880 Speaker 1: these things were watertight. He basically liked 184 00:09:50,000 --> 00:09:53,520 Speaker 1: to use them as a theme to show how even 185 00:09:53,559 --> 00:10:00,199 Speaker 1: these really great, cohesive, uh, closed-system laws could screw up. Um. 186 00:10:00,240 --> 00:10:02,920 Speaker 1: Which is why nobody's going to go to Asimov to 187 00:10:02,920 --> 00:10:06,280 Speaker 1: figure out how to program robots in the future. But 188 00:10:06,400 --> 00:10:09,640 Speaker 1: we do need some, some level of protection. Like, um, 189 00:10:10,000 --> 00:10:12,920 Speaker 1: are robots just kept from interacting with humans altogether? Like, 190 00:10:12,960 --> 00:10:15,840 Speaker 1: a robot cannot touch a human. Is that something that, 191 00:10:15,840 --> 00:10:18,720 Speaker 1: that we would do? Um, and there's already, you know, 192 00:10:18,720 --> 00:10:23,920 Speaker 1: there's been casualties. Um, death by robot has happened already. Uh. 193 00:10:23,960 --> 00:10:27,160 Speaker 1: The first one happened in... when a guy was 194 00:10:27,320 --> 00:10:30,320 Speaker 1: crushed on a factory line by, you know, a big 195 00:10:30,400 --> 00:10:36,079 Speaker 1: robotic arm. Pretty... not really, but yeah, kind of. Um. 196 00:10:36,160 --> 00:10:37,880 Speaker 1: And then since then a lot of people have died. 197 00:10:37,920 --> 00:10:40,439 Speaker 1: Actually, one, one guy, the worst death by robot that 198 00:10:40,480 --> 00:10:43,360 Speaker 1: I've heard so far was, um, uh, a guy had 199 00:10:43,440 --> 00:10:48,040 Speaker 1: enough molten aluminum poured on 200 00:10:48,120 --> 00:10:51,079 Speaker 1: him by a robot that it killed him. I wonder 201 00:10:51,080 --> 00:10:53,200 Speaker 1: if he was trying to make a little robot buddy. 202 00:10:53,840 --> 00:10:56,559 Speaker 1: I don't, I don't know. Maybe so. We're hoping not. 203 00:10:57,280 --> 00:11:00,240 Speaker 1: We're talking about industrial robots. The thing is, these 204 00:11:00,240 --> 00:11:02,960 Speaker 1: are isolated incidents. But what happens when there is a 205 00:11:03,040 --> 00:11:05,760 Speaker 1: robot in every house in not only Korea but the world? 206 00:11:06,120 --> 00:11:09,200 Speaker 1: These accidents could step up quite a bit, and, you know, 207 00:11:09,280 --> 00:11:11,840 Speaker 1: we need to figure out how to, how to address 208 00:11:11,880 --> 00:11:15,160 Speaker 1: them now, before it happens. Um. And also, I think 209 00:11:15,160 --> 00:11:18,360 Speaker 1: a lot of roboticists are really worried about the moment 210 00:11:18,440 --> 00:11:21,640 Speaker 1: when robots are equipped with systems that allow them to 211 00:11:21,760 --> 00:11:26,000 Speaker 1: learn. Right.
When that happens, they lose all predictability whatsoever, 212 00:11:26,240 --> 00:11:29,040 Speaker 1: and, and we won't be able to tell what they're 213 00:11:29,080 --> 00:11:31,800 Speaker 1: about to do, what they won't do. They'll be, 214 00:11:31,840 --> 00:11:34,240 Speaker 1: they'll be as unpredictable as humans. And, you know, 215 00:11:34,600 --> 00:11:36,640 Speaker 1: when you're on the subway with somebody you don't really 216 00:11:36,640 --> 00:11:39,560 Speaker 1: trust, you've got your, like, your muscles tensed and 217 00:11:39,559 --> 00:11:42,880 Speaker 1: your ribs braced for a knifing. You know, the same thing 218 00:11:42,880 --> 00:11:45,120 Speaker 1: would happen with humans and robots. And I think 219 00:11:45,160 --> 00:11:47,880 Speaker 1: it's probably even a little more creepy, because while 220 00:11:47,960 --> 00:11:50,880 Speaker 1: humans are unpredictable, with robots, you don't know what they're programmed 221 00:11:50,920 --> 00:11:53,240 Speaker 1: to do if it's not your robot. Right. And 222 00:11:53,280 --> 00:11:56,760 Speaker 1: there's also issues of morality that would factor into that. Um. 223 00:11:57,000 --> 00:12:00,520 Speaker 1: You know, you like to think that most humans would 224 00:12:00,600 --> 00:12:03,679 Speaker 1: stop themselves from stabbing you even if they wanted to, 225 00:12:04,160 --> 00:12:07,360 Speaker 1: because they have some, some sort of moral judgment. How 226 00:12:07,360 --> 00:12:10,960 Speaker 1: do you program morals into robots? You know? Um, it's 227 00:12:11,040 --> 00:12:16,400 Speaker 1: a, it's a very big, sticky ball of questions. I 228 00:12:16,480 --> 00:12:18,839 Speaker 1: have to say, I am pretty glad that the people 229 00:12:18,840 --> 00:12:21,199 Speaker 1: who are trying to figure this out now are figuring 230 00:12:21,240 --> 00:12:23,240 Speaker 1: it out now, and they are a lot smarter than 231 00:12:23,400 --> 00:12:25,719 Speaker 1: I am. Exactly. It's, it's kind of one of those 232 00:12:25,800 --> 00:12:28,880 Speaker 1: situations where I would kick back and say, go to it. Yeah, 233 00:12:28,880 --> 00:12:31,560 Speaker 1: I would have no idea. Yeah, yeah. Hopefully they won't 234 00:12:31,640 --> 00:12:35,840 Speaker 1: use their advantage to make us their slaves, though. Yeah, 235 00:12:35,840 --> 00:12:39,680 Speaker 1: I agree. So that's about it for robot marriages. But 236 00:12:39,760 --> 00:12:42,320 Speaker 1: there's even more in the article that I wrote on 237 00:12:42,400 --> 00:12:45,439 Speaker 1: the site, Will Robots Get Married?, on HowStuffWorks 238 00:12:45,480 --> 00:12:48,000 Speaker 1: .com. And stick around to find out how to 239 00:12:48,120 --> 00:12:50,680 Speaker 1: get water from a beach if you're ever stranded on 240 00:12:50,720 --> 00:12:55,160 Speaker 1: a deserted island, right after this. Stuff You Should Know 241 00:12:55,280 --> 00:12:57,480 Speaker 1: is brought to you by Visa. We all have things 242 00:12:57,480 --> 00:12:59,800 Speaker 1: to think about, like, say, what's the best site to 243 00:12:59,840 --> 00:13:02,120 Speaker 1: buy a new leather jacket, or whether to buy the 244 00:13:02,160 --> 00:13:05,600 Speaker 1: three- or six-megapixel camera.
But thankfully we don't need 245 00:13:05,679 --> 00:13:08,760 Speaker 1: to think about online fraud, because for every purchase you make, 246 00:13:09,160 --> 00:13:11,800 Speaker 1: Visa keeps an eye out for fraud with real-time 247 00:13:11,840 --> 00:13:15,000 Speaker 1: fraud monitoring and by making sure you're not liable for 248 00:13:15,040 --> 00:13:19,000 Speaker 1: any unauthorized purchases. How's that for peace of mind? Safe, 249 00:13:19,320 --> 00:13:26,360 Speaker 1: secure, Visa. Way to stick around. It may save your life. So, Chuck, 250 00:13:26,400 --> 00:13:28,760 Speaker 1: I'm going to tell these people... I already know you know 251 00:13:28,840 --> 00:13:30,480 Speaker 1: how to do this, because it's based on one of 252 00:13:30,480 --> 00:13:33,000 Speaker 1: your articles. But I'm gonna tell everybody how to get 253 00:13:33,120 --> 00:13:36,240 Speaker 1: water out of a beach on a deserted island. Right, 254 00:13:36,240 --> 00:13:38,800 Speaker 1: it's called a beach well. Yeah, and it can come 255 00:13:38,800 --> 00:13:43,040 Speaker 1: in really handy if you're, you know, like a castaway. Castaway, yeah, exactly. 256 00:13:43,120 --> 00:13:44,959 Speaker 1: Go ahead and let them know. I will. So basically, 257 00:13:44,960 --> 00:13:48,240 Speaker 1: if you, you find a sand dune, right, and uh, 258 00:13:48,640 --> 00:13:51,320 Speaker 1: right behind it, you dig about a three to five 259 00:13:51,320 --> 00:13:55,160 Speaker 1: foot hole. That's deep, right, or wide? Uh, deep. Okay, 260 00:13:55,160 --> 00:13:57,400 Speaker 1: three to five deep, probably about a foot wide, just so you 261 00:13:57,440 --> 00:13:59,439 Speaker 1: can reach down in there. If you can, you want 262 00:13:59,600 --> 00:14:02,439 Speaker 1: to line the sides of the hole with wood to 263 00:14:02,559 --> 00:14:05,120 Speaker 1: keep it from collapsing. You want to place some rocks 264 00:14:05,120 --> 00:14:07,560 Speaker 1: in the, in the bottom of it, right, and uh, 265 00:14:07,880 --> 00:14:10,680 Speaker 1: you basically just walk away for a few hours. Right. 266 00:14:10,880 --> 00:14:14,679 Speaker 1: You come back and, presto, you've got about five gallons 267 00:14:14,720 --> 00:14:18,560 Speaker 1: of pure water right there. Presto chango. It's not saltwater, 268 00:14:18,840 --> 00:14:21,720 Speaker 1: you know, and you recommended in the article that if it was, 269 00:14:21,960 --> 00:14:23,760 Speaker 1: if it was a little salty, just move back a 270 00:14:23,760 --> 00:14:27,720 Speaker 1: little further. And if you have a size twelve foot 271 00:14:27,760 --> 00:14:30,560 Speaker 1: and you ever find yourself stranded on a desert island, 272 00:14:30,840 --> 00:14:33,080 Speaker 1: just measure off a hundred feet and start digging there. 273 00:14:33,160 --> 00:14:36,000 Speaker 1: That's what Chuck recommends. And drink out of your shoe. Yeah, 274 00:14:36,080 --> 00:14:39,040 Speaker 1: that's a good one. Also, you can find a myriad of 275 00:14:39,160 --> 00:14:42,000 Speaker 1: other ways to save your own life, not only in 276 00:14:42,080 --> 00:14:44,800 Speaker 1: Chuck's article, How to Find Water in the Wild, but 277 00:14:44,920 --> 00:14:49,600 Speaker 1: on the Adventure channel on HowStuffWorks.com. 278 00:14:50,040 --> 00:14:52,440 Speaker 1: For more on this and thousands of other topics, 279 00:14:52,480 --> 00:14:55,360 Speaker 1: visit HowStuffWorks.com. Let us know what 280 00:14:55,440 --> 00:14:58,520 Speaker 1: you think.
Send an email to podcast at HowStuff 281 00:14:58,560 --> 00:15:02,640 Speaker 1: Works.com. Brought to you by the 282 00:15:02,680 --> 00:15:05,960 Speaker 1: reinvented two thousand twelve Camry. It's ready. Are you?