1 00:00:11,720 --> 00:00:14,520 Speaker 1: Hi, Oz Woloshyn here, and Karah Preiss. We're taking the 2 00:00:14,560 --> 00:00:15,000 Speaker 1: week off. 3 00:00:15,120 --> 00:00:18,480 Speaker 2: Indeed, we'll be back with new episodes starting September tenth. 4 00:00:18,600 --> 00:00:21,079 Speaker 1: In the meantime, instead of leaving this feed empty, we 5 00:00:21,120 --> 00:00:24,400 Speaker 1: wanted to share an episode from earlier this year. This week, 6 00:00:24,440 --> 00:00:28,520 Speaker 1: we're reairing my conversation with Nathaniel Rich from January twenty ninth. 7 00:00:28,800 --> 00:00:32,400 Speaker 1: He's a novelist, essayist, and writer at large for the 8 00:00:32,400 --> 00:00:36,720 Speaker 1: New York Times Magazine. We discuss a NASA experiment where 9 00:00:36,760 --> 00:00:40,280 Speaker 1: civilians take a simulated trip to Mars and try to 10 00:00:40,360 --> 00:00:43,800 Speaker 1: handle the isolation. Hope you enjoy it, and thanks for listening. 11 00:00:44,560 --> 00:00:47,159 Speaker 3: NASA has a punch list of eight hundred problems that 12 00:00:47,240 --> 00:00:52,199 Speaker 3: must be solved before the first mission to Mars is launched. 13 00:00:52,960 --> 00:00:56,400 Speaker 3: Very few of them have to do with problems of 14 00:00:56,840 --> 00:01:01,280 Speaker 3: human psychology, or really even of survival, which is 15 00:01:01,320 --> 00:01:05,640 Speaker 3: the subject of this experiment that I wrote about, called CHAPEA. 16 00:01:06,000 --> 00:01:10,960 Speaker 1: This particular experiment began with a rather intriguing announcement on the 17 00:01:11,000 --> 00:01:11,800 Speaker 1: NASA website.
18 00:01:12,800 --> 00:01:15,760 Speaker 3: Yeah, it was a little bit like the Wonka Factory, 19 00:01:16,000 --> 00:01:19,360 Speaker 3: the golden ticket, that you know, four civilians would be 20 00:01:19,440 --> 00:01:24,200 Speaker 3: chosen to go to Mars, asterisk, not really Mars, but 21 00:01:24,319 --> 00:01:27,440 Speaker 3: a habitat that was built on essentially a stage set 22 00:01:27,920 --> 00:01:31,520 Speaker 3: to look exactly like what they expect the first mission 23 00:01:31,520 --> 00:01:35,520 Speaker 3: to Mars to look like. And it generated enormous excitement, 24 00:01:35,600 --> 00:01:38,520 Speaker 3: and people from all over the country rushed to apply. 25 00:01:39,160 --> 00:01:42,640 Speaker 3: They wanted the golden ticket to live out, in most cases, 26 00:01:42,680 --> 00:01:46,479 Speaker 3: I think, a kind of childhood fantasy of space exploration, 27 00:01:46,800 --> 00:01:51,640 Speaker 3: and to see if they could withstand psychologically the challenges of 28 00:01:51,760 --> 00:01:54,920 Speaker 3: living away from everyone else they've 29 00:01:54,920 --> 00:01:56,200 Speaker 3: ever known or met. 30 00:02:00,280 --> 00:02:03,960 Speaker 1: Welcome to TechStuff: The Story. I'm Oz Woloshyn, and 31 00:02:04,040 --> 00:02:06,440 Speaker 1: each week we bring you an in depth interview with 32 00:02:06,520 --> 00:02:09,440 Speaker 1: one of the brightest and farthest seeing minds in and 33 00:02:09,520 --> 00:02:17,680 Speaker 1: about tech. Karah, I'm excited to bring you this interview 34 00:02:17,720 --> 00:02:20,200 Speaker 1: with Nathaniel Rich. When we ask people to come on 35 00:02:20,240 --> 00:02:23,120 Speaker 1: the show, it's always because one or the other of us 36 00:02:23,200 --> 00:02:26,359 Speaker 1: has been fascinated by something they've said, something they've done, 37 00:02:26,800 --> 00:02:27,720 Speaker 1: or something they've written.
38 00:02:27,840 --> 00:02:31,400 Speaker 2: Well, Nathaniel kind of had me at Mars asterisk. 39 00:02:30,919 --> 00:02:34,919 Speaker 1: Me too. You can't really understand tech today without understanding, 40 00:02:35,000 --> 00:02:38,440 Speaker 1: or at least investigating, the dreams and the fantasies of 41 00:02:38,520 --> 00:02:42,359 Speaker 1: the tech titans. Colonizing space is such an important touchstone 42 00:02:42,400 --> 00:02:45,440 Speaker 1: for Elon Musk and Jeff Bezos in particular, and was also 43 00:02:45,520 --> 00:02:49,040 Speaker 1: mentioned by Trump at his inauguration as quote the pursuit 44 00:02:49,120 --> 00:02:51,000 Speaker 1: of our manifest destiny. 45 00:02:50,600 --> 00:02:52,519 Speaker 2: He said, plant the stars and, what did he say? 46 00:02:52,520 --> 00:02:54,600 Speaker 2: Plant the red, white, and blue, the stars and stripes, on 47 00:02:55,160 --> 00:02:56,079 Speaker 2: Mars. 48 00:02:56,639 --> 00:02:58,440 Speaker 1: So when I came across this article in the New 49 00:02:58,480 --> 00:03:02,760 Speaker 1: York Times Magazine under the headline Can Humans Withstand the 50 00:03:02,800 --> 00:03:06,359 Speaker 1: Psychological Torture of Mars?, I had to know more. In fact, 51 00:03:06,360 --> 00:03:09,120 Speaker 1: I remember reading it and just getting goosebumps, and so 52 00:03:09,440 --> 00:03:11,799 Speaker 1: I kind of wanted to talk to Nathaniel about how 53 00:03:11,840 --> 00:03:14,960 Speaker 1: realistic the dreams of getting to Mars are, and what 54 00:03:15,080 --> 00:03:18,119 Speaker 1: some of the practical, dare I say, technical steps required 55 00:03:18,200 --> 00:03:18,959 Speaker 1: to achieve them are. 56 00:03:19,080 --> 00:03:20,600 Speaker 2: Before you get too excited, can you just tell me 57 00:03:20,600 --> 00:03:21,600 Speaker 2: who Nathaniel Rich is? 58 00:03:21,680 --> 00:03:22,080 Speaker 4: Sorry. 59 00:03:22,400 --> 00:03:26,360 Speaker 1: Nathaniel is an author.
He's written novels like The Mayor's Tongue, 60 00:03:26,440 --> 00:03:30,720 Speaker 1: Odds Against Tomorrow, and King Zeno, but also nonfiction books 61 00:03:30,760 --> 00:03:34,120 Speaker 1: primarily about the environment, such as Losing Earth: A Recent 62 00:03:34,240 --> 00:03:38,680 Speaker 1: History and Second Nature: Scenes from a World Remade. One 63 00:03:38,720 --> 00:03:42,720 Speaker 1: critic actually said Rich is a gifted caricaturist and a 64 00:03:42,760 --> 00:03:47,520 Speaker 1: gifted apocalyptist. It's his talent for describing the apocalypse which 65 00:03:47,560 --> 00:03:50,280 Speaker 1: brought him in some ways to reporting on the Mars 66 00:03:50,360 --> 00:04:01,200 Speaker 1: Dune Alpha project, which I asked him about. Why did 67 00:04:01,200 --> 00:04:02,120 Speaker 1: you decide to write the piece? 68 00:04:03,440 --> 00:04:07,080 Speaker 3: The NASA part of it almost came secondarily. I 69 00:04:07,120 --> 00:04:14,160 Speaker 3: had become obsessed with this history of isolation research, and 70 00:04:14,200 --> 00:04:19,560 Speaker 3: particularly by this incredible story of a man named Michel 71 00:04:19,720 --> 00:04:25,320 Speaker 3: Siffre, who had launched a series of cave experiments to 72 00:04:25,600 --> 00:04:30,560 Speaker 3: test the endurance of people in isolation, in environments where 73 00:04:30,560 --> 00:04:32,680 Speaker 3: they're completely cut off from the world.
And so he 74 00:04:32,720 --> 00:04:35,360 Speaker 3: had run a series of these experiments that culminated with 75 00:04:35,400 --> 00:04:38,480 Speaker 3: this experiment by the first female participant in the series, 76 00:04:38,480 --> 00:04:41,520 Speaker 3: a woman named Véronique Le Guen. This was in the 77 00:04:41,600 --> 00:04:46,040 Speaker 3: late eighties, and she went underground and ended up setting 78 00:04:46,080 --> 00:04:48,080 Speaker 3: the record at the time of one hundred and eleven 79 00:04:48,120 --> 00:04:51,880 Speaker 3: days underground. And she kept a journal and she wrote 80 00:04:51,920 --> 00:04:54,920 Speaker 3: about everything she was thinking about and feeling, and ultimately 81 00:04:54,960 --> 00:04:57,279 Speaker 3: what happened was she went a little bit insane, but 82 00:04:57,360 --> 00:05:03,520 Speaker 3: also had these moments of great enlightenment. And it's 83 00:05:03,560 --> 00:05:06,920 Speaker 3: a tragic story though, because she came out finally, and 84 00:05:06,960 --> 00:05:10,520 Speaker 3: after being celebrated and becoming a kind of national celebrity 85 00:05:10,560 --> 00:05:13,800 Speaker 3: for a period of time, entered into this great depression 86 00:05:13,880 --> 00:05:17,800 Speaker 3: and ultimately killed herself within a year. And she had 87 00:05:17,839 --> 00:05:21,039 Speaker 3: said before her death something to the effect of, you know, 88 00:05:21,040 --> 00:05:23,080 Speaker 3: I never was more alive than I was down 89 00:05:23,600 --> 00:05:25,960 Speaker 3: underground when I was all by myself. And that led 90 00:05:26,000 --> 00:05:29,640 Speaker 3: me into a whole obsession with these types of experiments.
91 00:05:29,680 --> 00:05:32,880 Speaker 3: And I wanted to see if anyone was doing these 92 00:05:32,880 --> 00:05:37,200 Speaker 3: things now, because on one level, they're completely unethical, 93 00:05:37,440 --> 00:05:41,360 Speaker 3: because basically what you'd expect happens, which is most people 94 00:05:41,800 --> 00:05:45,120 Speaker 3: struggle and often lose their hold on reality. And I 95 00:05:45,160 --> 00:05:47,280 Speaker 3: found that no one was really doing these experiments for 96 00:05:47,320 --> 00:05:52,159 Speaker 3: that reason, except for NASA, who had continued under the 97 00:05:52,160 --> 00:05:54,640 Speaker 3: guise of this Martian project. 98 00:05:54,800 --> 00:05:58,640 Speaker 1: So on the one hand, NASA was putting out the call for applicants, 99 00:05:59,360 --> 00:06:02,160 Speaker 1: but on the other hand, they had to build Mars, 100 00:06:02,279 --> 00:06:04,479 Speaker 1: or at least a Martian colony, on Earth. 101 00:06:05,600 --> 00:06:09,960 Speaker 3: Yeah, they had to build, or actually print, using a 102 00:06:10,000 --> 00:06:13,680 Speaker 3: three D printer, a habitat, which is, by the way, 103 00:06:13,760 --> 00:06:17,320 Speaker 3: how they will do it when we get to Mars. 104 00:06:17,360 --> 00:06:21,000 Speaker 3: You can't travel thirty three million miles with a house, 105 00:06:21,200 --> 00:06:24,600 Speaker 3: you know, towing a house behind you. 106 00:06:26,240 --> 00:06:27,920 Speaker 3: So they can't quite do that, or they don't have 107 00:06:27,920 --> 00:06:30,200 Speaker 3: the technology to do that. It's not efficient. And so 108 00:06:30,279 --> 00:06:33,000 Speaker 3: what they will do is they will just lug a 109 00:06:33,000 --> 00:06:37,920 Speaker 3: three D printer up there and use Martian rock, regolith, 110 00:06:38,200 --> 00:06:41,239 Speaker 3: as ink for this three D printer. 111 00:06:41,480 --> 00:06:43,599 Speaker 1: So they'll turn the sand into cement somehow.
112 00:06:44,000 --> 00:06:45,000 Speaker 4: Yeah, and they can do that. 113 00:06:45,040 --> 00:06:47,599 Speaker 3: They do that on this planet too. And 114 00:06:47,839 --> 00:06:50,159 Speaker 3: you can find online some habitats that have been built, 115 00:06:50,200 --> 00:06:52,280 Speaker 3: some houses that have been built this way, not using 116 00:06:52,320 --> 00:06:57,480 Speaker 3: Martian rock obviously, but terrestrial rock. And they will construct 117 00:06:57,520 --> 00:07:01,680 Speaker 3: this house. It's a seventeen hundred square foot habitat, and 118 00:07:02,440 --> 00:07:05,719 Speaker 3: they built it in a warehouse at the Johnson Space 119 00:07:05,720 --> 00:07:11,360 Speaker 3: Center in Houston, and there are four little bedrooms and 120 00:07:11,480 --> 00:07:17,120 Speaker 3: a lounge and, you know, a small indoor garden and 121 00:07:17,480 --> 00:07:21,320 Speaker 3: some computers and desks and like a little relaxation space. 122 00:07:22,480 --> 00:07:27,480 Speaker 3: And that seventeen hundred square foot habitat was where they 123 00:07:27,480 --> 00:07:31,160 Speaker 3: were going to send four people for more than a year. 124 00:07:31,800 --> 00:07:35,720 Speaker 1: And this habitat resembles exactly what they intend to build 125 00:07:36,160 --> 00:07:37,720 Speaker 1: on Mars when they get there. 126 00:07:38,320 --> 00:07:42,280 Speaker 3: Yeah, I'm sure subject to change, and I suppose part 127 00:07:42,280 --> 00:07:46,560 Speaker 3: of this experiment was to determine whether this particular model 128 00:07:46,760 --> 00:07:48,000 Speaker 3: would work best. 129 00:07:48,080 --> 00:07:49,920 Speaker 4: But yeah, this is the plan. 130 00:07:50,360 --> 00:07:53,679 Speaker 1: And the kind of simulated colony in the Johnson Space 131 00:07:53,720 --> 00:07:55,800 Speaker 1: Center had quite a romantic
132 00:07:55,440 --> 00:07:59,480 Speaker 3: name. Yeah, Mars Dune Alpha is the name of the habitat, 133 00:07:59,600 --> 00:08:03,880 Speaker 3: and the mission is named CHAPEA, which is I 134 00:08:03,880 --> 00:08:06,160 Speaker 3: guess NASA's idea of a sexy name. 135 00:08:07,880 --> 00:08:11,080 Speaker 1: And so, okay. So the call goes out for some 136 00:08:11,240 --> 00:08:14,840 Speaker 1: volunteers to go to Mars Dune Alpha. One of the people 137 00:08:14,880 --> 00:08:20,520 Speaker 1: who sees the advertisement is Nathan Jones. Who's Nathan? 138 00:08:20,880 --> 00:08:25,520 Speaker 3: Yeah, Nathan Jones is in many ways the most fascinating 139 00:08:25,680 --> 00:08:28,240 Speaker 3: figure for me in reporting the piece. He's an emergency 140 00:08:28,320 --> 00:08:34,880 Speaker 3: room physician from Springfield, Illinois, father of three boys, married. 141 00:08:35,200 --> 00:08:39,720 Speaker 3: And Nathan, like basically everyone I spoke to for 142 00:08:39,800 --> 00:08:43,360 Speaker 3: the story, was a kind of self professed NASA geek 143 00:08:43,440 --> 00:08:49,280 Speaker 3: or obsessive, and had always dreamed of doing something special, 144 00:08:49,400 --> 00:08:53,480 Speaker 3: bigger, with his life. He was obsessed with space travel, 145 00:08:53,520 --> 00:08:58,680 Speaker 3: and when he saw this posting, he applied immediately and 146 00:08:59,040 --> 00:09:01,880 Speaker 3: then told his wife, who was, I think it's 147 00:09:01,920 --> 00:09:03,720 Speaker 3: safe to say, appalled. 148 00:09:04,200 --> 00:09:07,520 Speaker 1: That sequence seems a little off, speaking as a married man. 149 00:09:09,280 --> 00:09:11,960 Speaker 3: Yeah, I don't, that wouldn't have flown in my house.
150 00:09:13,240 --> 00:09:15,240 Speaker 3: But he was unique actually in that he was the 151 00:09:15,280 --> 00:09:19,600 Speaker 3: only one of the finalists who had children, and as 152 00:09:19,600 --> 00:09:23,719 Speaker 3: the father of two small children myself, I felt for 153 00:09:24,720 --> 00:09:26,720 Speaker 3: the family. And he was fully aware he was going 154 00:09:26,760 --> 00:09:29,280 Speaker 3: to miss out on a lot. You miss a year 155 00:09:29,320 --> 00:09:31,240 Speaker 3: with your children, you're missing a lot, and you come 156 00:09:31,280 --> 00:09:34,360 Speaker 3: back and the children look like different people. So there 157 00:09:34,400 --> 00:09:39,120 Speaker 3: was another dimension of an emotional challenge with him. But 158 00:09:39,160 --> 00:09:40,240 Speaker 3: he was determined to do it. 159 00:09:40,520 --> 00:09:41,520 Speaker 1: And how did he prepare? 160 00:09:42,400 --> 00:09:48,760 Speaker 3: He prepared very dutifully. He and his wife had 161 00:09:48,880 --> 00:09:51,280 Speaker 3: a whole series, I was fascinated by this, a whole 162 00:09:51,280 --> 00:09:55,040 Speaker 3: series of preparations that they did. He wrote little letters 163 00:09:55,160 --> 00:09:57,760 Speaker 3: that he placed around the house in secret hiding 164 00:09:57,800 --> 00:10:00,760 Speaker 3: spots that the kids and his wife might 165 00:10:00,800 --> 00:10:03,960 Speaker 3: find over the course of the year. Sometimes little, like, 166 00:10:04,720 --> 00:10:08,360 Speaker 3: notes of encouragement. Like, he put a note in the 167 00:10:08,400 --> 00:10:11,480 Speaker 3: fuse box for, like, the first time the lights went 168 00:10:11,520 --> 00:10:13,600 Speaker 3: out, and said, you know, you can do this. I 169 00:10:13,600 --> 00:10:16,120 Speaker 3: trust you, just flip this switch. And so there are all 170 00:10:16,120 --> 00:10:19,600 Speaker 3: these sort of sweet and somewhat poignant preparations.
171 00:10:19,600 --> 00:10:21,720 Speaker 1: It's almost like the script of a movie where somebody 172 00:10:21,720 --> 00:10:22,520 Speaker 1: knows they're going to die. 173 00:10:23,400 --> 00:10:27,640 Speaker 3: Yeah, but the poignancy is somewhat compromised, I 174 00:10:27,720 --> 00:10:32,080 Speaker 3: found, by the fact that it was all a contrived 175 00:10:32,280 --> 00:10:36,560 Speaker 3: scenario he was in. There's a kind of bathos to 176 00:10:36,600 --> 00:10:39,720 Speaker 3: the fact that, well, he wasn't actually going to Mars. 177 00:10:39,760 --> 00:10:43,120 Speaker 3: It's not quite the Matthew McConaughey Interstellar scenario, where he's missing 178 00:10:43,160 --> 00:10:45,720 Speaker 3: his children for this major mission. He's just going 179 00:10:45,760 --> 00:10:48,000 Speaker 3: to sit on a stage set for a year. But 180 00:10:48,200 --> 00:10:51,040 Speaker 3: that tension between the kind of absurdity of the whole 181 00:10:51,080 --> 00:10:57,120 Speaker 3: proposition and then the real emotion that attended every aspect 182 00:10:57,160 --> 00:11:00,000 Speaker 3: of this process, for me, that was really the heart 183 00:11:00,600 --> 00:11:01,280 Speaker 3: of the story. 184 00:11:01,920 --> 00:11:03,800 Speaker 1: All you have to do is watch the video of 185 00:11:03,920 --> 00:11:09,040 Speaker 1: him about to go into the Mars Dune Alpha. What did 186 00:11:09,080 --> 00:11:11,240 Speaker 1: you feel when you watched somebody you'd spent time with 187 00:11:11,320 --> 00:11:13,920 Speaker 1: as a source in such distress? 188 00:11:14,559 --> 00:11:18,400 Speaker 4: Yeah, that was striking. He had predicted it. 189 00:11:18,559 --> 00:11:22,679 Speaker 3: But sure enough, when it came time to enter this habitat, 190 00:11:23,040 --> 00:11:26,400 Speaker 3: they had this dramatic ceremony. They were filmed right in 191 00:11:26,440 --> 00:11:29,520 Speaker 3: front of the main portal, which is basically just a door.
192 00:11:30,120 --> 00:11:33,240 Speaker 3: It wasn't like some major, like you're entering a submarine 193 00:11:33,320 --> 00:11:35,360 Speaker 3: or something. But they gave a 194 00:11:35,400 --> 00:11:37,520 Speaker 3: little press conference, and each one of them had to 195 00:11:37,520 --> 00:11:41,600 Speaker 3: give a talk, give a little statement, and he broke down. 196 00:11:41,640 --> 00:11:45,000 Speaker 3: He couldn't finish it because he was so overcome by 197 00:11:45,320 --> 00:11:49,320 Speaker 3: the thought of saying goodbye finally to his family for 198 00:11:49,360 --> 00:11:50,680 Speaker 3: this long period of time. 199 00:11:50,880 --> 00:11:53,400 Speaker 5: But I believe that tomorrow will only be possible because 200 00:11:53,440 --> 00:11:57,840 Speaker 5: we step into Mars Dune Alpha today. And with that 201 00:11:57,920 --> 00:12:00,160 Speaker 5: in mind, I also want to take a moment to 202 00:12:00,160 --> 00:12:04,200 Speaker 5: sincerely thank the great many people who've worked tirelessly and 203 00:12:04,679 --> 00:12:08,720 Speaker 5: so many countless hours to get us to this point. Also, 204 00:12:08,800 --> 00:12:11,520 Speaker 5: thank you to our families and friends for their sacrifices. 205 00:12:12,320 --> 00:12:16,280 Speaker 5: We see, we know those sacrifices. We couldn't be here 206 00:12:16,320 --> 00:12:26,720 Speaker 5: without your love and support. Sorry. Sorry to my wife 207 00:12:27,760 --> 00:12:28,320 Speaker 5: and kids. 208 00:12:29,120 --> 00:12:32,839 Speaker 1: I love you to the moon, I'm sorry, to Mars and back. 209 00:12:33,640 --> 00:12:36,920 Speaker 3: And it's very moving and upsetting and sort of sweet 210 00:12:37,440 --> 00:12:39,679 Speaker 3: and horrible in some ways as well. It's something that 211 00:12:39,760 --> 00:12:42,200 Speaker 3: he brought upon himself.
But I think what's key to 212 00:12:42,320 --> 00:12:46,800 Speaker 3: understand is that everybody in the mission, from the administrators 213 00:12:46,840 --> 00:12:51,959 Speaker 3: to the participants, felt very certain that what they were 214 00:12:52,000 --> 00:12:57,319 Speaker 3: doing was a critical next step towards this wonderful dream 215 00:12:57,480 --> 00:13:02,040 Speaker 3: of humanity's next chapter. They felt that there is no Mars, 216 00:13:02,080 --> 00:13:05,720 Speaker 3: there is no exploration of Mars, unless you have the 217 00:13:05,720 --> 00:13:07,040 Speaker 3: CHAPEA experiment. 218 00:13:07,360 --> 00:13:09,880 Speaker 4: I'm not convinced that's true at all. 219 00:13:09,920 --> 00:13:12,080 Speaker 3: I mean, I wrote about that, but they certainly were, 220 00:13:12,440 --> 00:13:16,679 Speaker 3: and so they did feel that they were sacrificing, making 221 00:13:16,880 --> 00:13:21,280 Speaker 3: a major personal sacrifice towards achieving a great goal for 222 00:13:21,320 --> 00:13:23,800 Speaker 3: all of humanity, which 223 00:13:23,640 --> 00:13:25,720 Speaker 1: may have kept them safe. And the woman you mentioned 224 00:13:25,720 --> 00:13:28,280 Speaker 1: at the beginning, the French woman who took her own life, 225 00:13:28,600 --> 00:13:30,080 Speaker 1: did she have that same sense of mission? 226 00:13:31,000 --> 00:13:32,160 Speaker 4: That's a great point. 227 00:13:32,480 --> 00:13:35,360 Speaker 3: There is some commonality, in that there was this idea 228 00:13:35,640 --> 00:13:39,320 Speaker 3: that they were on a kind of different frontier of 229 00:13:40,240 --> 00:13:44,680 Speaker 3: human psychology. But I don't 230 00:13:44,720 --> 00:13:47,800 Speaker 3: think it was quite as ennobling, or the stakes quite 231 00:13:47,840 --> 00:13:50,120 Speaker 3: as high, as what you see with NASA and all the 232 00:13:50,160 --> 00:13:51,200 Speaker 3: trappings of NASA.
233 00:13:51,679 --> 00:13:56,080 Speaker 1: And also she was totally alone, whereas Nathan had three companions. 234 00:13:55,800 --> 00:13:58,400 Speaker 3: Right, right. And so there are some distinctions there, although I 235 00:13:58,440 --> 00:14:01,440 Speaker 3: will say that in the long history of experiments in 236 00:14:01,480 --> 00:14:06,920 Speaker 3: which people are together in isolation, they suffer also. I mean, 237 00:14:07,200 --> 00:14:10,000 Speaker 3: maybe it's not quite as extreme, but you know, in 238 00:14:10,080 --> 00:14:14,120 Speaker 3: conducting the research for the piece I spoke with a 239 00:14:14,160 --> 00:14:18,559 Speaker 3: bunch of psychiatrists and historians of science and historians of psychology, 240 00:14:19,240 --> 00:14:25,760 Speaker 3: and I learned that the definition of isolation is not 241 00:14:26,040 --> 00:14:31,200 Speaker 3: necessarily being alone. It's being removed from your normal life 242 00:14:31,200 --> 00:14:33,720 Speaker 3: and from the people close to you. So you can 243 00:14:33,760 --> 00:14:37,600 Speaker 3: be in isolation with other people, and in fact, many 244 00:14:37,600 --> 00:14:41,840 Speaker 3: of the same psychological effects are experienced whether or not 245 00:14:41,880 --> 00:14:44,600 Speaker 3: you're with other people, if you're cut off 246 00:14:44,600 --> 00:14:46,440 Speaker 3: from the people who are most important to you. 247 00:14:47,360 --> 00:14:49,520 Speaker 1: When I think about the history of space movies, there's 248 00:14:49,520 --> 00:14:53,040 Speaker 1: obviously the famous Houston, we have a problem. Could Nathan 249 00:14:53,120 --> 00:14:55,680 Speaker 1: and co stay in touch with home base, and even with 250 00:14:55,720 --> 00:14:58,360 Speaker 1: their families, while they were in Mars Dune Alpha? 251 00:14:59,040 --> 00:14:59,320 Speaker 4: Yeah.
252 00:15:00,320 --> 00:15:04,600 Speaker 3: They were very scrupulous about imitating the reality, the expected reality, 253 00:15:04,600 --> 00:15:09,520 Speaker 3: which is that there's this time lapse for any communication 254 00:15:10,280 --> 00:15:12,640 Speaker 3: from Mars, because it's far away and you're dealing with 255 00:15:12,680 --> 00:15:16,800 Speaker 3: the limits of the speed of light and technology. And 256 00:15:16,840 --> 00:15:19,040 Speaker 3: it depends on where it is in 257 00:15:19,040 --> 00:15:22,160 Speaker 3: the orbit, but essentially there's like a twenty nine minute lapse, 258 00:15:22,840 --> 00:15:27,400 Speaker 3: and so you can't have a conversation, any kind of 259 00:15:27,480 --> 00:15:31,360 Speaker 3: normal conversation, but they can send messages. But the other 260 00:15:31,440 --> 00:15:36,200 Speaker 3: problem is that every form of electronic communication from the 261 00:15:36,280 --> 00:15:41,840 Speaker 3: habitat has to go through the same channel. So that 262 00:15:41,880 --> 00:15:44,880 Speaker 3: includes any kind of data that the habitat is sending 263 00:15:44,880 --> 00:15:47,720 Speaker 3: back to Earth about, I don't know, oxygen levels or 264 00:15:48,240 --> 00:15:52,200 Speaker 3: what's happening in the experiments, or any kind of computer connections. 265 00:15:52,240 --> 00:15:56,000 Speaker 3: And so that's sort of the best case scenario, and 266 00:15:56,040 --> 00:15:59,600 Speaker 3: actually the lag can be much longer. And the 267 00:15:59,800 --> 00:16:02,440 Speaker 3: larger the audio file or the text file, the 268 00:16:02,440 --> 00:16:05,400 Speaker 3: computer file, the longer it takes. Sending a short 269 00:16:05,480 --> 00:16:09,360 Speaker 3: video even in low resolution could take days, whereas sending 270 00:16:09,360 --> 00:16:11,960 Speaker 3: a one line text message maybe takes only half an 271 00:16:11,960 --> 00:16:15,600 Speaker 3: hour or so.
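The arithmetic behind those delays can be sketched with a short back-of-envelope calculation: the propagation lag is just distance divided by the speed of light, and delivery time adds how long the file takes to serialize onto the link. A minimal Python sketch; the 1 kbit/s bandwidth and the file sizes are hypothetical illustrative values, not NASA figures.

```python
# Back-of-envelope Mars communication math, as described in the interview.
# The bandwidth and file sizes below are illustrative assumptions only.

C = 299_792_458.0  # speed of light, m/s

def one_way_delay_s(distance_m: float) -> float:
    """Light-travel time for a radio signal crossing distance_m."""
    return distance_m / C

def delivery_time_s(size_bytes: float, bandwidth_bps: float, distance_m: float) -> float:
    """Time until the last bit arrives: serialization time plus propagation delay."""
    return (size_bytes * 8) / bandwidth_bps + one_way_delay_s(distance_m)

# Earth-Mars distance varies with orbital position:
CLOSEST_M = 54.6e9   # ~54.6 million km at closest approach
FARTHEST_M = 401e9   # ~401 million km near superior conjunction

print(f"one-way delay: {one_way_delay_s(CLOSEST_M)/60:.1f} to "
      f"{one_way_delay_s(FARTHEST_M)/60:.1f} minutes")

# On a hypothetical 1 kbit/s link at maximum distance, a one-line text
# lands in under half an hour, while even a small low-res video takes days:
text_s = delivery_time_s(200, 1_000, FARTHEST_M)    # 200-byte message
video_s = delivery_time_s(50e6, 1_000, FARTHEST_M)  # 50 MB video
print(f"text: {text_s/60:.0f} min, video: {video_s/86400:.1f} days")
```

The propagation delay alone spans roughly three to twenty-two minutes one way depending on orbital geometry, which is why a round-trip exchange takes on the order of the "twenty nine minute lapse" mentioned here, and why bulk data, not light speed, dominates for anything larger than a text message.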
So they could communicate, but only in 272 00:16:15,600 --> 00:16:23,040 Speaker 3: this clipped way, with all of these ellipses essentially between communications. 273 00:16:23,040 --> 00:16:26,720 Speaker 3: So if there's an emergency, say back at home, they 274 00:16:26,760 --> 00:16:31,040 Speaker 3: couldn't just start having a conversation with them. Now, in reality, 275 00:16:31,400 --> 00:16:34,440 Speaker 3: since they were on a stage set, they could break 276 00:16:34,480 --> 00:16:37,240 Speaker 3: the experiment at any time, if someone just, like, I 277 00:16:37,240 --> 00:16:40,120 Speaker 3: don't know, cut off their finger or something. But they 278 00:16:40,160 --> 00:16:43,040 Speaker 3: would do anything to avoid breaking 279 00:16:43,040 --> 00:16:46,360 Speaker 3: the experiment, obviously. So yeah, they were reduced to these 280 00:16:46,400 --> 00:16:51,320 Speaker 3: sort of intermittent text messages essentially that would be relayed 281 00:16:51,320 --> 00:16:53,200 Speaker 3: at unpredictable intervals. 282 00:16:53,960 --> 00:16:56,360 Speaker 1: How did you choose the headline for the story? 283 00:16:56,600 --> 00:17:00,520 Speaker 4: I don't choose the headlines. Not a lot 284 00:17:00,560 --> 00:17:00,840 Speaker 4: of writers do. 285 00:17:00,880 --> 00:17:02,720 Speaker 3: I can consult on them, and I can say this 286 00:17:02,760 --> 00:17:03,840 Speaker 3: one's worse than the other one. 287 00:17:03,880 --> 00:17:05,919 Speaker 1: But the headline the New York Times Magazine went with 288 00:17:06,240 --> 00:17:09,639 Speaker 1: was, can humans withstand the psychological torture? 289 00:17:11,600 --> 00:17:12,800 Speaker 4: I mean, it's pretty good. I can't complain. 290 00:17:14,240 --> 00:17:17,440 Speaker 3: Yes, yes. And that's also what it's about, basically, can 291 00:17:17,440 --> 00:17:21,320 Speaker 3: we, can people survive this?
Because most of what NASA 292 00:17:21,359 --> 00:17:24,960 Speaker 3: has been asking over the course of its space program 293 00:17:25,080 --> 00:17:28,080 Speaker 3: is, can we physically get people into space? Can we 294 00:17:28,119 --> 00:17:31,320 Speaker 3: physically put them on another planet? Very little thought has 295 00:17:31,359 --> 00:17:37,440 Speaker 3: been given to, can human beings, once they're there, survive psychologically, emotionally? 296 00:17:38,000 --> 00:17:41,760 Speaker 3: And that's what this experiment is, at least ostensibly, about. 297 00:17:42,160 --> 00:17:44,000 Speaker 3: And it's definitely what the story that I wrote 298 00:17:44,080 --> 00:17:44,800 Speaker 3: is about. 299 00:17:45,720 --> 00:17:48,800 Speaker 1: When we come back, more from Nathaniel Rich on why 300 00:17:48,840 --> 00:17:52,040 Speaker 1: we're so obsessed with going to Mars, and how historically 301 00:17:52,400 --> 00:18:01,480 Speaker 1: attitudes towards Mars have always revealed deeper cultural undercurrents. How 302 00:18:01,480 --> 00:18:05,560 Speaker 1: close is NASA to putting humans on Mars? They've been 303 00:18:05,560 --> 00:18:07,760 Speaker 1: predicting for many years that it's just around the corner. 304 00:18:07,840 --> 00:18:10,000 Speaker 1: They keep pushing back the window. 305 00:18:10,160 --> 00:18:12,719 Speaker 3: Even a few years ago, I think by twenty eighteen 306 00:18:13,480 --> 00:18:16,320 Speaker 3: they had predicted that it would be no later than 307 00:18:16,359 --> 00:18:20,960 Speaker 3: the end of the twenty twenties. I think now they're 308 00:18:20,960 --> 00:18:25,159 Speaker 3: looking more to the middle of the next decade. But 309 00:18:25,280 --> 00:18:28,760 Speaker 3: they are full speed ahead, and I think they're very 310 00:18:29,520 --> 00:18:33,080 Speaker 3: confident that they will get people to the planet in 311 00:18:33,119 --> 00:18:36,680 Speaker 3: a fairly short amount of time.
The technical problems that 312 00:18:36,800 --> 00:18:40,440 Speaker 3: lie before them that we referenced are not seen as 313 00:18:40,680 --> 00:18:45,080 Speaker 3: intimidatingly difficult. They're just math problems to be worked out, 314 00:18:45,320 --> 00:18:47,360 Speaker 3: is the sense that I got from speaking with one 315 00:18:47,359 --> 00:18:52,400 Speaker 3: of these senior propulsion engineers. So there is, and there 316 00:18:52,400 --> 00:18:55,359 Speaker 3: has been for quite a while within NASA, quite a 317 00:18:55,400 --> 00:18:57,040 Speaker 3: lot of optimism that this is going to happen. 318 00:18:57,040 --> 00:18:58,080 Speaker 4: It's going to happen pretty soon. 319 00:18:58,720 --> 00:19:01,160 Speaker 1: And why? Why Mars? 320 00:19:01,000 --> 00:19:03,960 Speaker 3: Well, that's the million dollar, that's the million dollar question. I mean, 321 00:19:04,440 --> 00:19:06,800 Speaker 3: there's a lot of different rationales. The main one you 322 00:19:06,800 --> 00:19:11,560 Speaker 3: hear from NASA is that it represents scientific progress. It's the 323 00:19:11,600 --> 00:19:16,280 Speaker 3: next step for human exploration of the universe, and certainly 324 00:19:16,320 --> 00:19:20,560 Speaker 3: human progress in space exploration. There's also the rationale 325 00:19:20,560 --> 00:19:25,680 Speaker 3: that through the kind of innovation that's necessary to put 326 00:19:25,680 --> 00:19:27,840 Speaker 3: people on Mars, or to reach any new milestone in 327 00:19:27,880 --> 00:19:32,000 Speaker 3: space exploration, there will be some kind of 328 00:19:32,040 --> 00:19:37,040 Speaker 3: unpredictable benefits, technological benefits, that can be applied for all 329 00:19:37,080 --> 00:19:40,439 Speaker 3: of humanity, so that maybe they'll invent new materials or 330 00:19:40,520 --> 00:19:43,960 Speaker 3: new types of devices that can then make our life 331 00:19:44,000 --> 00:19:46,800 Speaker 3: on Earth easier.
And there are plenty of examples, I 332 00:19:46,800 --> 00:19:50,800 Speaker 3: think, of that in the past. And then there's a 333 00:19:50,880 --> 00:19:54,399 Speaker 3: kind of political rationale, which is to say that we 334 00:19:54,480 --> 00:19:55,960 Speaker 3: need to do it before someone else does. 335 00:19:56,080 --> 00:19:57,840 Speaker 4: There's national pride on the line. 336 00:19:58,200 --> 00:20:00,680 Speaker 1: I mean, it's like in the sixties, when we wanted 337 00:20:00,720 --> 00:20:02,600 Speaker 1: to put a man on the moon first. Is there 338 00:20:02,640 --> 00:20:05,760 Speaker 1: a parallel to the sixties in that respect? Yeah, I 339 00:20:05,720 --> 00:20:07,320 Speaker 3: would say not only is there a parallel, but I 340 00:20:07,359 --> 00:20:10,600 Speaker 3: think NASA, in its whole frame of thinking, if you 341 00:20:10,640 --> 00:20:13,680 Speaker 3: can speak of something the size of an agency, the 342 00:20:13,720 --> 00:20:16,679 Speaker 3: size of NASA, as personified in some way, 343 00:20:16,760 --> 00:20:20,400 Speaker 3: I think the whole enterprise is really stuck in the sixties, 344 00:20:20,440 --> 00:20:23,000 Speaker 3: if not the nineteen fifties, when it was created. So 345 00:20:23,080 --> 00:20:25,560 Speaker 3: it's very much, it's, you know, you see this sort 346 00:20:25,560 --> 00:20:29,640 Speaker 3: of vestigial, almost Cold War mentality that I think informs 347 00:20:29,680 --> 00:20:31,960 Speaker 3: almost every aspect of the whole enterprise. 348 00:20:32,359 --> 00:20:34,240 Speaker 1: What does it say to you that in the sixties 349 00:20:34,280 --> 00:20:39,359 Speaker 1: it was the president, JFK, sort of outlining this national 350 00:20:40,040 --> 00:20:43,840 Speaker 1: mission to put a man on the moon, and now 351 00:20:44,280 --> 00:20:47,080 Speaker 1: in the twenty twenties it's Elon Musk and to a 352 00:20:47,080 --> 00:20:48,320 Speaker 1: certain extent Jeff Bezos?
353 00:20:49,240 --> 00:20:52,399 Speaker 3: Yeah, I think you can learn all you need to 354 00:20:52,480 --> 00:20:55,280 Speaker 3: know about a culture or a society by studying its 355 00:20:55,280 --> 00:20:59,840 Speaker 3: attitudes about Mars. You know, certainly now there 356 00:21:00,040 --> 00:21:02,920 Speaker 3: are a few different strands. 357 00:21:02,920 --> 00:21:06,840 Speaker 3: There's a kind of private enterprise strand, but that is 358 00:21:06,880 --> 00:21:10,320 Speaker 3: often, including in the case of Musk, closely alloyed with 359 00:21:10,880 --> 00:21:16,080 Speaker 3: a libertarian fantasy of a lawless world in which people 360 00:21:16,160 --> 00:21:19,040 Speaker 3: can stake their claim, a kind of wild West, and 361 00:21:19,080 --> 00:21:23,800 Speaker 3: not have regulation and oversight. There are groups of Mars 362 00:21:24,440 --> 00:21:29,880 Speaker 3: enthusiasts out there that are very much explicitly libertarian ideologues 363 00:21:29,960 --> 00:21:32,560 Speaker 3: who hope to start a libertarian society on Mars. 364 00:21:32,640 --> 00:21:34,040 Speaker 4: So that exists. 365 00:21:33,960 --> 00:21:35,760 Speaker 3: If you go back to the fifties and sixties, we're at this 366 00:21:36,400 --> 00:21:39,480 Speaker 3: very different place in our culture, obviously, and society, a 367 00:21:39,520 --> 00:21:44,359 Speaker 3: place of tremendous global cooperation, relatively, that gave birth to 368 00:21:44,400 --> 00:21:46,879 Speaker 3: the entire sort of modern space race, even though you 369 00:21:46,920 --> 00:21:49,840 Speaker 3: have a competition between the Cold War powers.
But you 370 00:21:49,880 --> 00:21:52,840 Speaker 3: can even go back further, and if you look at 371 00:21:53,000 --> 00:21:59,800 Speaker 3: the late nineteenth century, when Schiaparelli, a Milanese astronomer, observed 372 00:21:59,800 --> 00:22:02,320 Speaker 3: that there were canals on Mars, there was this great 373 00:22:02,520 --> 00:22:06,160 Speaker 3: fascination for decades about, are people living on Mars? Are 374 00:22:06,200 --> 00:22:10,280 Speaker 3: Martians building canals? And it was very much an expression. 375 00:22:10,760 --> 00:22:14,160 Speaker 3: You can find a very clear correlation with the kind 376 00:22:14,200 --> 00:22:16,960 Speaker 3: of excitement of the industrial age, and there was a 377 00:22:16,960 --> 00:22:19,960 Speaker 3: period where people were competing with Mars to build more 378 00:22:20,000 --> 00:22:23,400 Speaker 3: canals as fast as possible; this was also of course 379 00:22:23,400 --> 00:22:25,439 Speaker 3: the period of the digging of the Suez Canal. 380 00:22:25,480 --> 00:22:27,680 Speaker 4: So this was, you know, this is the New York Times. 381 00:22:27,680 --> 00:22:30,199 Speaker 3: This is not just some, like, weird thing; it 382 00:22:30,280 --> 00:22:32,560 Speaker 3: was at the time generally accepted that we were in this 383 00:22:32,640 --> 00:22:35,800 Speaker 3: race against the Martians. So Mars has always been a kind 384 00:22:35,840 --> 00:22:41,600 Speaker 3: of repository for the kind of subconscious of the 385 00:22:41,640 --> 00:22:45,360 Speaker 3: culture that observes it. And I think that's true today. 386 00:22:45,480 --> 00:22:48,240 Speaker 3: And I think as our society changes, probably our view 387 00:22:48,240 --> 00:22:51,040 Speaker 3: of Mars will change in tandem with it.
388 00:22:51,320 --> 00:22:54,960 Speaker 1: You've written that future Mars voyagers will have to want 389 00:22:54,960 --> 00:22:57,119 Speaker 1: to travel to Mars more than almost anyone else in 390 00:22:57,160 --> 00:22:59,560 Speaker 1: the world. They'll have to embrace the knowledge that for 391 00:22:59,560 --> 00:23:02,560 Speaker 1: at least five hundred and seventy days, they will be 392 00:23:02,600 --> 00:23:06,120 Speaker 1: the most isolated human beings in the history of the universe. 393 00:23:07,080 --> 00:23:10,560 Speaker 3: Yes, they will have to, because that's what they're signing 394 00:23:10,640 --> 00:23:11,520 Speaker 3: up for. 395 00:23:11,800 --> 00:23:12,760 Speaker 1: What will that do to them? 396 00:23:13,200 --> 00:23:15,160 Speaker 3: You know, I think a distinction has to be made 397 00:23:15,200 --> 00:23:17,919 Speaker 3: between the kind of person who wants to be an 398 00:23:17,920 --> 00:23:20,679 Speaker 3: astronaut and wants to go on a mission like this, 399 00:23:21,080 --> 00:23:23,960 Speaker 3: like the people I wrote about, like Nathan Jones. But 400 00:23:24,040 --> 00:23:28,920 Speaker 3: then once we start talking about a permanent settlement or colonies, 401 00:23:29,400 --> 00:23:31,960 Speaker 3: we're talking about a very different group of people. So 402 00:23:32,080 --> 00:23:35,719 Speaker 3: you have these sort of zealot astronauts who 403 00:23:35,760 --> 00:23:38,960 Speaker 3: are perfectly fit, who are the most stable people you've 404 00:23:38,960 --> 00:23:45,040 Speaker 3: ever met, enormous reserves of concentration and self-reliance 405 00:23:45,080 --> 00:23:47,360 Speaker 3: and all the rest, and then the rest of us, right, 406 00:23:47,480 --> 00:23:51,280 Speaker 3: and for a colony to exist, it has to look very different.
407 00:23:51,359 --> 00:23:54,439 Speaker 3: And a major criticism that I encountered in researching the 408 00:23:54,440 --> 00:23:57,560 Speaker 3: piece, from close watchers of the NASA program, is that 409 00:23:58,440 --> 00:24:02,720 Speaker 3: even if this experiment has some value to predict the 410 00:24:02,760 --> 00:24:07,119 Speaker 3: ability of, say, astronauts to survive in this setting, it 411 00:24:07,160 --> 00:24:10,280 Speaker 3: will have no value for the rest of us, for whom, 412 00:24:10,520 --> 00:24:12,720 Speaker 3: you know, all kinds of other considerations would have to 413 00:24:12,720 --> 00:24:16,280 Speaker 3: be made. And so we're certainly not at the stage 414 00:24:16,280 --> 00:24:19,360 Speaker 3: where we're asking, can people have families up there? Can 415 00:24:19,400 --> 00:24:23,439 Speaker 3: people give birth? There are some major biological challenges there. What 416 00:24:23,520 --> 00:24:27,560 Speaker 3: happens if someone gets sick, what happens if someone misses home, 417 00:24:27,720 --> 00:24:30,440 Speaker 3: you know, enters a depression? We're 418 00:24:30,480 --> 00:24:33,680 Speaker 3: nowhere near those kinds of questions yet, but I think 419 00:24:33,760 --> 00:24:36,800 Speaker 3: if they continue to hit these benchmarks, that's where 420 00:24:36,840 --> 00:24:38,280 Speaker 3: this is ultimately heading. 421 00:24:38,800 --> 00:24:42,800 Speaker 1: So when you wrote the piece, Nathan and co. were in 422 00:24:42,880 --> 00:24:46,200 Speaker 1: the Mars habitat, and since publication, they've of course come back. 423 00:24:47,000 --> 00:24:49,080 Speaker 1: Do you know what the experience was like for Nathan? 424 00:24:49,720 --> 00:24:50,560 Speaker 4: No, they're not. 425 00:24:50,800 --> 00:24:55,280 Speaker 3: They're basically sworn to secrecy.
And the level 426 00:24:55,320 --> 00:24:59,840 Speaker 3: of secrecy that shrouded just about every aspect of the 427 00:25:00,160 --> 00:25:07,600 Speaker 3: experiment was somewhat astounding, a surprise for me. 428 00:25:07,640 --> 00:25:10,159 Speaker 3: Reporting the story, at least talking to the 429 00:25:10,240 --> 00:25:14,520 Speaker 3: NASA people and to some extent the participants themselves, you'd 430 00:25:14,520 --> 00:25:17,480 Speaker 3: think I was investigating, I don't know, Abu Ghraib or 431 00:25:17,480 --> 00:25:20,600 Speaker 3: something, the way that it was talked about: extremely confidential. Now, 432 00:25:20,640 --> 00:25:23,399 Speaker 3: their justification was that they want to run the experiment 433 00:25:23,480 --> 00:25:27,960 Speaker 3: multiple times, and they don't want prospective applicants to know 434 00:25:28,200 --> 00:25:30,880 Speaker 3: anything about what they're going to do, 435 00:25:30,920 --> 00:25:34,320 Speaker 3: because it would, I guess, diminish the value of 436 00:25:34,359 --> 00:25:36,840 Speaker 3: what they find if people already know, like, these are 437 00:25:36,840 --> 00:25:38,360 Speaker 3: the kinds of things we're going to do when we're there. 438 00:25:38,520 --> 00:25:39,720 Speaker 4: This is what happened to people. 439 00:25:40,200 --> 00:25:43,800 Speaker 3: It struck me as slightly ridiculous because, on the one hand, 440 00:25:44,280 --> 00:25:48,159 Speaker 3: very similar experiments have been conducted many times, including by NASA, 441 00:25:48,240 --> 00:25:51,000 Speaker 3: and those results are public. 442 00:25:51,040 --> 00:25:52,720 Speaker 1: NASA hasn't published any results of this? 443 00:25:53,160 --> 00:25:55,680 Speaker 3: Not that I'm aware of, no, and you know, they 444 00:25:55,720 --> 00:25:57,119 Speaker 3: release these very anodyne statements. 445 00:25:57,160 --> 00:25:59,120 Speaker 4: It's a success.
Everyone had a great time. 446 00:25:59,280 --> 00:26:01,439 Speaker 1: And you put the story in the context of the 447 00:26:01,480 --> 00:26:05,840 Speaker 1: history of isolation research. But more specifically, it seems like 448 00:26:05,920 --> 00:26:10,320 Speaker 1: this particular simulation of life on Mars has happened multiple 449 00:26:10,359 --> 00:26:13,760 Speaker 1: times in the past and is also being replicated multiple 450 00:26:13,760 --> 00:26:15,719 Speaker 1: times right now all around the world. Can you kind 451 00:26:15,760 --> 00:26:20,000 Speaker 1: of describe the spread of this type of experiment being run? 452 00:26:21,080 --> 00:26:24,520 Speaker 3: Yeah, I guess it depends on how narrowly you want 453 00:26:24,560 --> 00:26:28,840 Speaker 3: to define the experiment. But NASA has been 454 00:26:29,200 --> 00:26:33,080 Speaker 3: conducting some version of this experiment since before NASA was 455 00:26:33,119 --> 00:26:35,119 Speaker 3: even called NASA. I mean, some of the 456 00:26:35,160 --> 00:26:41,280 Speaker 3: early astronauts did isolation experiments. They would put them 457 00:26:41,280 --> 00:26:45,399 Speaker 3: in little pods for long periods of time, sometimes in 458 00:26:45,440 --> 00:26:50,520 Speaker 3: fairly brutal configurations and sometimes completely in isolation, especially back 459 00:26:50,600 --> 00:26:53,399 Speaker 3: in the fifties when they thought that astronauts would have 460 00:26:53,520 --> 00:26:56,879 Speaker 3: to be propelled in tiny little vessels for months at 461 00:26:56,880 --> 00:27:00,960 Speaker 3: a time into outer space.
But there was another similar 462 00:27:01,000 --> 00:27:04,879 Speaker 3: experiment called HI-SEAS, which was the subject of a 463 00:27:04,920 --> 00:27:07,920 Speaker 3: really fascinating book by the writer Kate Greene, who was 464 00:27:07,960 --> 00:27:10,440 Speaker 3: one of the original crew members. They ran that experiment, 465 00:27:11,040 --> 00:27:13,439 Speaker 3: I don't know, I think a dozen times. That was 466 00:27:13,480 --> 00:27:16,480 Speaker 3: a similar idea, in a habitat that was built on 467 00:27:16,720 --> 00:27:22,040 Speaker 3: Mauna Loa in Hawaii, and it was four people, 468 00:27:22,600 --> 00:27:25,639 Speaker 3: or sometimes six, put into this environment for months at 469 00:27:25,640 --> 00:27:30,240 Speaker 3: a time, and Greene writes very elegantly and movingly about 470 00:27:30,320 --> 00:27:32,919 Speaker 3: the experience and about the kind of madness of it 471 00:27:32,960 --> 00:27:35,200 Speaker 3: and what it did to her life. 472 00:27:35,840 --> 00:27:38,280 Speaker 4: The book is called Once Upon a Time I Lived on Mars. 473 00:27:38,119 --> 00:27:42,280 Speaker 3: And then there was a crazy experiment called Mars 474 00:27:42,320 --> 00:27:47,240 Speaker 3: five hundred that was administered by a Russian agency 475 00:27:47,280 --> 00:27:50,520 Speaker 3: which has a name that I love, the Institute 476 00:27:50,520 --> 00:27:53,960 Speaker 3: of Biomedical Problems. So of course that's who did this 477 00:27:55,160 --> 00:27:58,080 Speaker 3: completely barbaric experiment where they locked six male crew members 478 00:27:58,119 --> 00:28:01,199 Speaker 3: together for five hundred and twenty days.
Wow. That was 479 00:28:01,240 --> 00:28:04,800 Speaker 3: in twenty ten and eleven, in a kind of fake 480 00:28:05,040 --> 00:28:10,520 Speaker 3: spacecraft on a fake Mars, and that was pretty well 481 00:28:10,560 --> 00:28:14,080 Speaker 3: studied, and participants lost their hair and lost weight. 482 00:28:14,800 --> 00:28:17,679 Speaker 3: But then there's NASA; they have something like a 483 00:28:17,720 --> 00:28:21,800 Speaker 3: dozen different versions of this going on at all times, 484 00:28:21,840 --> 00:28:24,480 Speaker 3: in all different configurations, different amounts of time, different 485 00:28:24,520 --> 00:28:25,600 Speaker 3: numbers of participants. 486 00:28:25,600 --> 00:28:27,920 Speaker 1: So did you, did you say to NASA, why 487 00:28:27,920 --> 00:28:28,800 Speaker 1: do you need to keep doing that? 488 00:28:28,880 --> 00:28:30,960 Speaker 3: Yes, that was one of my big questions: why do 489 00:28:31,040 --> 00:28:34,280 Speaker 3: we keep doing this, and don't we know what happens? 490 00:28:34,400 --> 00:28:37,119 Speaker 3: Even before the NASA history, there's this whole other history 491 00:28:37,119 --> 00:28:43,000 Speaker 3: of people doing similar isolation experiments. And their official answer was, yes, 492 00:28:43,040 --> 00:28:47,040 Speaker 3: we've done some similar experiments, but actually there's no substitute; 493 00:28:47,320 --> 00:28:52,520 Speaker 3: this is far closer to the expected reality, and experimentally, scientifically, 494 00:28:53,040 --> 00:28:56,240 Speaker 3: all of the previous experiments are essentially useless, and this 495 00:28:56,400 --> 00:29:00,080 Speaker 3: is the only one that will matter. Now, if you 496 00:29:00,120 --> 00:29:05,120 Speaker 3: believe that, you also have to then wonder, and 497 00:29:05,160 --> 00:29:06,960 Speaker 3: this is what some of the people who study this 498 00:29:07,000 --> 00:29:10,440 Speaker 3: pointed out to me.
Yes, okay, this experiment, even if 499 00:29:10,440 --> 00:29:14,720 Speaker 3: it's an exact simulation, a perfect simulation of what the 500 00:29:14,800 --> 00:29:17,960 Speaker 3: first Mars expedition is going to be, you're only testing 501 00:29:18,720 --> 00:29:21,080 Speaker 3: a group of four people, an n of 502 00:29:21,200 --> 00:29:26,920 Speaker 3: four, right, experimentally speaking, and so the statistical value of 503 00:29:26,960 --> 00:29:29,920 Speaker 3: this experiment is close to nil. You'd have to run 504 00:29:30,000 --> 00:29:35,880 Speaker 3: this experiment thousands of times for it to be statistically reliable, 505 00:29:36,000 --> 00:29:37,560 Speaker 3: and of course they're not going to do that. So 506 00:29:37,680 --> 00:29:41,360 Speaker 3: even if you grant them this sort of scientific argument 507 00:29:41,400 --> 00:29:44,320 Speaker 3: that this experiment is unlike all the other ones, even 508 00:29:44,360 --> 00:29:46,680 Speaker 3: though they all basically have the same results, it doesn't 509 00:29:46,720 --> 00:29:49,560 Speaker 3: actually have much scientific value unless they would do it, 510 00:29:49,760 --> 00:29:51,240 Speaker 3: you know, fifty. 511 00:29:50,880 --> 00:29:52,800 Speaker 4: Times, or a thousand times. 512 00:29:52,840 --> 00:29:55,800 Speaker 3: I'm not sure where the probability charts cut off, but 513 00:29:56,760 --> 00:29:58,960 Speaker 3: as it stands, they're probably going to do it one 514 00:29:59,040 --> 00:30:02,000 Speaker 3: or two more times, at which point they'll be ready 515 00:30:02,040 --> 00:30:03,200 Speaker 3: to hurl people. 516 00:30:03,000 --> 00:30:06,000 Speaker 1: Up to Mars.
But from that point of view, was 517 00:30:06,040 --> 00:30:10,640 Speaker 1: this about understanding if humans can withstand isolation, or, 518 00:30:10,640 --> 00:30:13,640 Speaker 1: we talked at the beginning about the technical 519 00:30:13,640 --> 00:30:15,959 Speaker 1: problems NASA has to solve, were there 520 00:30:16,000 --> 00:30:18,440 Speaker 1: any technical problems they were looking to solve with this? 521 00:30:18,680 --> 00:30:22,240 Speaker 3: That was probably the point where I, 522 00:30:22,520 --> 00:30:26,040 Speaker 3: I mean, that's where I sort 523 00:30:26,040 --> 00:30:29,040 Speaker 3: of laughed in the reporting, although it's kind of horrible. So, yes, 524 00:30:29,080 --> 00:30:30,600 Speaker 3: the official line is, we want to test the 525 00:30:30,640 --> 00:30:33,440 Speaker 3: human side of this. We have all these divisions doing 526 00:30:34,280 --> 00:30:38,000 Speaker 3: the science and the technology, and this is the human 527 00:30:38,040 --> 00:30:40,080 Speaker 3: research side. And in fact, there is a Human Research 528 00:30:40,160 --> 00:30:45,840 Speaker 3: division within NASA that was administering the experiment. However, they 529 00:30:45,840 --> 00:30:50,200 Speaker 3: were partnered with two other divisions, and the division that 530 00:30:50,280 --> 00:30:55,320 Speaker 3: oversaw the whole experiment was actually run by someone named 531 00:30:55,400 --> 00:30:59,760 Speaker 3: Rachel McCauley, who is a propulsion engineer. She's the one 532 00:30:59,800 --> 00:31:04,040 Speaker 3: who decides which rocket will do the job best. And 533 00:31:04,080 --> 00:31:08,120 Speaker 3: in order to make that determination, she needs to nail 534 00:31:08,120 --> 00:31:10,000 Speaker 3: down a bunch of variables.
And one of the main 535 00:31:10,080 --> 00:31:14,640 Speaker 3: variables is how much weight needs to be carried by 536 00:31:14,720 --> 00:31:16,840 Speaker 3: the rocket ship. And so what that means is, of 537 00:31:16,920 --> 00:31:19,360 Speaker 3: course, the weight of the people, but also how much 538 00:31:19,360 --> 00:31:23,080 Speaker 3: food do they have to take? And so when I 539 00:31:23,120 --> 00:31:28,280 Speaker 3: talked to her, she was very blithely kind of 540 00:31:28,320 --> 00:31:32,920 Speaker 3: dismissive of the whole human psychological aspect of the thing, 541 00:31:33,000 --> 00:31:36,800 Speaker 3: and instead she focused on, how much food are they 542 00:31:36,800 --> 00:31:40,040 Speaker 3: going to eat? Like, what's the weight? How much waste 543 00:31:40,040 --> 00:31:43,840 Speaker 3: are they going to produce? And once I have those figures, 544 00:31:43,920 --> 00:31:47,200 Speaker 3: then I will know exactly what kind of propulsion device 545 00:31:47,240 --> 00:31:47,640 Speaker 3: to use. 546 00:31:48,080 --> 00:31:49,680 Speaker 1: And so then you were a little bit dubious. 547 00:31:49,920 --> 00:31:51,440 Speaker 3: Yeah, and so I was like, what? No, I mean, 548 00:31:51,480 --> 00:31:53,480 Speaker 3: I believed her, because she was running the experiment. 549 00:31:54,480 --> 00:31:56,680 Speaker 4: She's a solid propulsion systems engineer. 550 00:31:57,000 --> 00:32:00,040 Speaker 3: And so then I went back to the sort of 551 00:32:00,120 --> 00:32:02,560 Speaker 3: human research people, and they're like, oh, no, no, no, it's 552 00:32:02,560 --> 00:32:05,960 Speaker 3: all about human psychology. But in fact the person they 553 00:32:06,000 --> 00:32:08,200 Speaker 3: were reporting to, the person who was running the whole thing, 554 00:32:09,040 --> 00:32:11,840 Speaker 3: said that was not the case.
And so actually, I 555 00:32:11,920 --> 00:32:14,760 Speaker 3: think if you follow the money, you start to wonder, well, 556 00:32:14,840 --> 00:32:18,080 Speaker 3: is this whole human aspect side of it part of 557 00:32:18,120 --> 00:32:22,800 Speaker 3: the marketing, and is it frankly irrelevant to what NASA's real 558 00:32:22,880 --> 00:32:25,560 Speaker 3: concern is, which is, yeah, how many pounds of food 559 00:32:25,600 --> 00:32:26,600 Speaker 3: do we need to put on this thing? 560 00:32:27,120 --> 00:32:27,400 Speaker 2: Geez. 561 00:32:31,480 --> 00:32:34,440 Speaker 1: Stay with us for more from Nathaniel Rich on why 562 00:32:34,520 --> 00:32:38,600 Speaker 1: dreams of Mars and dreams of AI are inextricably linked, 563 00:32:39,120 --> 00:32:43,120 Speaker 1: and why some techno-optimists theorize that humans would evolve 564 00:32:43,160 --> 00:32:50,280 Speaker 1: into AI-powered Martians. There was a part of your 565 00:32:50,400 --> 00:32:55,120 Speaker 1: story that really stuck out to me, which was that NASA's 566 00:32:55,200 --> 00:33:00,240 Speaker 1: chief research scientist, Dennis Bushnell, said that as colonizing 567 00:33:00,280 --> 00:33:04,960 Speaker 1: Mars becomes more feasible, colonists themselves will evolve into Martians. 568 00:33:05,600 --> 00:33:06,080 Speaker 4: Yes. 569 00:33:06,880 --> 00:33:08,120 Speaker 1: Did that surprise you? 570 00:33:10,280 --> 00:33:12,440 Speaker 4: Yes, although a little bit. 571 00:33:12,640 --> 00:33:14,800 Speaker 3: It surprised me to see him write about that 572 00:33:15,080 --> 00:33:19,440 Speaker 3: so openly.
Yes. This chief scientist at the 573 00:33:19,480 --> 00:33:23,800 Speaker 3: Langley Research Center, who, I think he recently retired, 574 00:33:23,800 --> 00:33:26,880 Speaker 3: had been at NASA for sixty years, published 575 00:33:26,880 --> 00:33:30,880 Speaker 3: this sort of opus about the institutional view of deep 576 00:33:31,000 --> 00:33:34,240 Speaker 3: space exploration, and he said what I think a lot 577 00:33:34,280 --> 00:33:38,080 Speaker 3: of scientists have predicted, which is that if people are able 578 00:33:38,120 --> 00:33:42,640 Speaker 3: to survive on Mars for any extended amount of time, 579 00:33:44,480 --> 00:33:48,840 Speaker 3: with oxygen and all the rest, ultimately their bodies 580 00:33:48,920 --> 00:33:53,240 Speaker 3: will change. That over time, because of the radiation exposure, 581 00:33:53,880 --> 00:33:56,880 Speaker 3: because of the reduced gravity, there will be real 582 00:33:57,360 --> 00:34:01,080 Speaker 3: physiological changes to their bodies. There's no way out of that. 583 00:34:01,440 --> 00:34:03,840 Speaker 3: So essentially one of the kind of tricks for surviving 584 00:34:03,880 --> 00:34:07,120 Speaker 3: Mars is to live there long enough so that people 585 00:34:07,160 --> 00:34:10,880 Speaker 3: evolve into Martians, and they look different, and they probably 586 00:34:10,880 --> 00:34:14,560 Speaker 3: have elongated heads and maybe different diets and all the rest of it. 587 00:34:14,400 --> 00:34:17,759 Speaker 1: Evolved means, of course, natural selection. Survival of 588 00:34:17,800 --> 00:34:20,200 Speaker 1: the fittest on Mars. Exactly. 589 00:34:20,600 --> 00:34:24,840 Speaker 3: We're talking about a generational, no, it's a generational shift. 590 00:34:24,880 --> 00:34:25,000 Speaker 2: Now.
591 00:34:25,040 --> 00:34:27,759 Speaker 3: Of course, they have to solve inconvenient things 592 00:34:27,800 --> 00:34:29,960 Speaker 3: like procreation on Mars and all the rest of that. 593 00:34:30,080 --> 00:34:32,880 Speaker 3: But yes, that's the long-term view, is that we 594 00:34:32,960 --> 00:34:35,520 Speaker 3: won't have to solve every problem perfectly, because people will 595 00:34:35,560 --> 00:34:38,360 Speaker 3: just start to, there'll be natural selection, and they'll be 596 00:34:38,400 --> 00:34:41,920 Speaker 3: forced to evolve into these other Martian creatures. And that 597 00:34:42,360 --> 00:34:44,640 Speaker 3: seems to be NASA's view. 598 00:34:47,320 --> 00:34:49,560 Speaker 1: There's another piece you wrote in The New York Times recently, 599 00:34:49,640 --> 00:34:53,720 Speaker 1: which was a review of Ray Kurzweil's book The Singularity 600 00:34:53,920 --> 00:34:57,239 Speaker 1: Is Nearer. Can you talk about who Ray Kurzweil is, 601 00:34:58,040 --> 00:35:02,759 Speaker 1: that book, and how reviewing that book syncs up with 602 00:35:02,840 --> 00:35:04,400 Speaker 1: your writing on this experiment? 603 00:35:05,160 --> 00:35:07,880 Speaker 3: Yeah, Kurzweil is a kind of god of AI, who's 604 00:35:08,120 --> 00:35:12,560 Speaker 3: called the godfather of AI, who for many decades 605 00:35:12,560 --> 00:35:17,680 Speaker 3: has been predicting the rise of artificial intelligence and ultimately 606 00:35:19,040 --> 00:35:23,719 Speaker 3: the singularity.
But yes, his idea is that there will 607 00:35:23,760 --> 00:35:29,400 Speaker 3: be nanobots powered by artificial intelligence that we will inject 608 00:35:29,400 --> 00:35:32,880 Speaker 3: into our bodies, and that they will swim through our 609 00:35:32,880 --> 00:35:36,640 Speaker 3: bloodstream into our brains and connect our neocortex to the cloud, 610 00:35:37,280 --> 00:35:39,759 Speaker 3: linking us up to, I guess, the Internet, or 611 00:35:39,800 --> 00:35:45,000 Speaker 3: really the global repository of all human information and civilization, 612 00:35:45,400 --> 00:35:47,480 Speaker 3: and so at that point, when we're just kind of 613 00:35:47,560 --> 00:35:54,480 Speaker 3: wired into intelligence, electronic intelligence, that for him is the singularity, 614 00:35:54,920 --> 00:35:58,400 Speaker 3: and he thinks that's coming very soon, basically by the 615 00:35:58,480 --> 00:35:59,320 Speaker 3: end of the decade. 616 00:35:59,560 --> 00:36:02,320 Speaker 1: Well, but there's something to me which is very striking, 617 00:36:02,320 --> 00:36:05,560 Speaker 1: in the sense that Ray Kurzweil, the so-called godfather 618 00:36:05,600 --> 00:36:08,640 Speaker 1: of AI, on the one hand, and on the other hand, 619 00:36:08,960 --> 00:36:13,759 Speaker 1: Dennis Bushnell, the NASA chief scientist, are both saying in 620 00:36:13,800 --> 00:36:18,000 Speaker 1: one way or another that within our lifetimes, the technological 621 00:36:18,080 --> 00:36:22,560 Speaker 1: future will mean that we no longer conform to the 622 00:36:22,600 --> 00:36:24,880 Speaker 1: current definition of what it is to be human. 623 00:36:25,760 --> 00:36:28,240 Speaker 3: Yeah, although I think you'd be hard pressed to find 624 00:36:28,400 --> 00:36:32,560 Speaker 3: a definition that would be universally agreed 625 00:36:32,600 --> 00:36:35,240 Speaker 3: to on what it means to be human.
True. Now, 626 00:36:35,360 --> 00:36:39,120 Speaker 3: and that's part of Kurzweil's argument, is that 627 00:36:39,160 --> 00:36:43,319 Speaker 3: we already outsource so much of our mind and identity 628 00:36:43,400 --> 00:36:48,000 Speaker 3: to technology: we rely on the Internet to remember 629 00:36:48,040 --> 00:36:51,560 Speaker 3: things for us, our digital record; a lot of our 630 00:36:52,400 --> 00:36:57,759 Speaker 3: powers are only possible through technology. And if we were 631 00:36:57,800 --> 00:37:00,400 Speaker 3: just put in the wilderness, most of us would be 632 00:37:00,440 --> 00:37:04,000 Speaker 3: able to survive a couple of weeks. But yes, both 633 00:37:04,040 --> 00:37:07,840 Speaker 3: of these visions, they're both kind of technologically 634 00:37:07,840 --> 00:37:13,960 Speaker 3: optimistic views of the world, and there's this kind of viscerally 635 00:37:14,760 --> 00:37:19,680 Speaker 3: disturbing aspect to them, which is that they require us 636 00:37:19,760 --> 00:37:23,319 Speaker 3: to reimagine physically what we'll look like, you know, 637 00:37:23,400 --> 00:37:26,520 Speaker 3: even putting aside all the sort of mental, psychological aspect 638 00:37:26,560 --> 00:37:28,839 Speaker 3: of it, that we're going to be morphed into these 639 00:37:28,880 --> 00:37:31,759 Speaker 3: other different kinds of creatures that are going to be 640 00:37:31,760 --> 00:37:35,000 Speaker 3: like physically in some ways unrecognizable.
And Kurzweil has this 641 00:37:35,040 --> 00:37:37,880 Speaker 3: whole thing about how soon people will be able to design 642 00:37:37,920 --> 00:37:40,120 Speaker 3: their own bodies the way you can design, like, a 643 00:37:40,200 --> 00:37:43,239 Speaker 3: virtual avatar, and we'll have people with 644 00:37:43,280 --> 00:37:48,920 Speaker 3: wings and tusks and whatever you want, you know, feathers, 645 00:37:49,040 --> 00:37:53,600 Speaker 3: and that part of it tends not to be spoken 646 00:37:53,680 --> 00:37:57,160 Speaker 3: aloud or advertised as much as the part about, you know, 647 00:37:57,360 --> 00:38:01,160 Speaker 3: improving our intelligence. But I think what was striking 648 00:38:01,160 --> 00:38:03,239 Speaker 3: to me about Kurzweil's book and what I wanted to 649 00:38:03,239 --> 00:38:06,200 Speaker 3: write about is, let's not forget the part where 650 00:38:07,160 --> 00:38:09,799 Speaker 3: the prerequisite for all of these future predictions is that 651 00:38:09,840 --> 00:38:14,400 Speaker 3: we're injecting microscopic robots into our brains and our bloodstream. 652 00:38:14,719 --> 00:38:19,319 Speaker 3: Let's not lose track of that part of it. So yes, 653 00:38:19,400 --> 00:38:21,120 Speaker 3: I think you're right to draw a kind of 654 00:38:21,160 --> 00:38:25,279 Speaker 3: parallel with the Mars visions. They tend to collide in 655 00:38:25,320 --> 00:38:28,480 Speaker 3: the realm of artificial intelligence. It's not surprising that Elon Musk, 656 00:38:28,560 --> 00:38:31,440 Speaker 3: you know, is obsessed with both Mars and AI.
657 00:38:32,440 --> 00:38:35,760 Speaker 1: You used the phrase earlier in our conversation, mourning, 658 00:38:36,000 --> 00:38:38,120 Speaker 1: and one of the pieces of Kurzweil's book that you 659 00:38:38,239 --> 00:38:41,760 Speaker 1: draw out is him talking about basically making an AI 660 00:38:41,920 --> 00:38:44,880 Speaker 1: version of his father, who passed away in nineteen seventy, 661 00:38:45,000 --> 00:38:48,000 Speaker 1: to be able to talk to him about music. And 662 00:38:48,040 --> 00:38:50,680 Speaker 1: one of the other things I noticed in the piece 663 00:38:50,880 --> 00:38:55,040 Speaker 1: about Mars was the crop garden in the Mars Dune 664 00:38:55,080 --> 00:38:58,719 Speaker 1: Alpha habitat, which wouldn't be for eating, but rather for 665 00:38:58,800 --> 00:39:02,880 Speaker 1: the mental health of the participants. You know, I 666 00:39:02,880 --> 00:39:04,799 Speaker 1: guess it makes me think of that whole sort of 667 00:39:05,000 --> 00:39:07,759 Speaker 1: cliched thing about the fisherman who becomes a millionaire and 668 00:39:07,760 --> 00:39:10,800 Speaker 1: then returns to where he lived to fish. The craving 669 00:39:10,960 --> 00:39:15,239 Speaker 1: for the kind of things which are the touchstones of 670 00:39:15,480 --> 00:39:19,200 Speaker 1: what we think about as our human experience is also 671 00:39:19,280 --> 00:39:21,759 Speaker 1: present in these future fantasies. 672 00:39:22,840 --> 00:39:23,480 Speaker 4: Absolutely. 673 00:39:23,920 --> 00:39:27,439 Speaker 3: That's another major point of convergence, I think: 674 00:39:28,400 --> 00:39:34,760 Speaker 3: that once you peel back this techno-optimistic fantasy of 675 00:39:34,960 --> 00:39:39,240 Speaker 3: how things are going to be, you find this deep 676 00:39:39,520 --> 00:39:44,240 Speaker 3: sense of longing for how things once were.
You certainly 677 00:39:44,239 --> 00:39:47,640 Speaker 3: see it in Kurzweil, where after hundreds of pages of 678 00:39:47,719 --> 00:39:51,839 Speaker 3: talking about all the wonders of this new technology, all 679 00:39:51,880 --> 00:39:55,240 Speaker 3: the conveniences, and how we can have beach holidays 680 00:39:55,239 --> 00:39:57,759 Speaker 3: without leaving our houses through virtual reality and all the 681 00:39:57,840 --> 00:40:03,600 Speaker 3: rest of it, his ultimate goal is to reanimate his 682 00:40:04,040 --> 00:40:07,480 Speaker 3: dead father, who was a composer of some renown 683 00:40:07,560 --> 00:40:11,600 Speaker 3: and a conductor in New York. And he's already gone 684 00:40:11,600 --> 00:40:15,040 Speaker 3: so far as to program an AI version of his 685 00:40:15,080 --> 00:40:18,799 Speaker 3: father, trained on his father's letters and writings and 686 00:40:18,840 --> 00:40:22,840 Speaker 3: personal documents and his music. In the pages 687 00:40:22,840 --> 00:40:24,960 Speaker 3: of the book, there's a transcript of 688 00:40:24,960 --> 00:40:28,880 Speaker 3: a conversation that Kurzweil has with his dead father, 689 00:40:29,400 --> 00:40:31,920 Speaker 3: and that's his great hope: 690 00:40:31,920 --> 00:40:36,080 Speaker 3: to bring back his dad.
In the same way, with Mars, 691 00:40:36,640 --> 00:40:41,640 Speaker 3: I was struck by the mournful quality of this whole enterprise. 692 00:40:42,239 --> 00:40:46,080 Speaker 3: And everyone I asked, every sort of expert I interviewed, 693 00:40:46,120 --> 00:40:48,640 Speaker 3: I said, there's something just a little bit 694 00:40:49,560 --> 00:40:52,000 Speaker 3: upsetting about all of this, you know? And 695 00:40:52,000 --> 00:40:54,040 Speaker 3: many people kind of agreed, but 696 00:40:54,080 --> 00:40:56,080 Speaker 3: they couldn't put their finger on it until I spoke 697 00:40:56,160 --> 00:41:02,200 Speaker 3: to this one historian of isolation experiments, Matthias at Cornell, 698 00:41:02,440 --> 00:41:05,279 Speaker 3: and he said this thing that for me is the 699 00:41:05,280 --> 00:41:07,000 Speaker 3: heart of the story, and to some extent, it's the 700 00:41:07,000 --> 00:41:09,239 Speaker 3: heart of the Kurzweil and even the AI story, which is: 701 00:41:09,239 --> 00:41:12,920 Speaker 3: the urge to try to recreate a perfect world is 702 00:41:12,920 --> 00:41:16,560 Speaker 3: always going to be about rehearsing what we got wrong here. 703 00:41:17,040 --> 00:41:20,520 Speaker 3: He told me, we're not chasing Mars, we're mourning Earth. 
704 00:41:21,440 --> 00:41:24,239 Speaker 3: That struck a chord with me because I feel like 705 00:41:24,360 --> 00:41:29,000 Speaker 3: that is the through line here, that there's this attempt 706 00:41:29,040 --> 00:41:33,160 Speaker 3: to chase something that we've lost. And you know, for Matthias, 707 00:41:33,160 --> 00:41:37,840 Speaker 3: he was talking about essentially a world ruined by climate 708 00:41:37,920 --> 00:41:42,560 Speaker 3: change and environmental degradation, and that the ultimate fulfillment of 709 00:41:42,600 --> 00:41:46,040 Speaker 3: the Mars fantasy, at least in our age, seems to 710 00:41:46,080 --> 00:41:50,400 Speaker 3: be to terraform the planet and create a kind of 711 00:41:50,440 --> 00:41:53,920 Speaker 3: idyllic second Earth that won't be marred by all the 712 00:41:53,960 --> 00:41:57,960 Speaker 3: mistakes that we've made here. And the AI fantasy has 713 00:41:57,960 --> 00:41:59,719 Speaker 3: the same component. It's, you know, we'll all be young 714 00:41:59,719 --> 00:42:05,160 Speaker 3: and free of sin, in a way. And 715 00:42:06,000 --> 00:42:09,839 Speaker 3: I think that's true, and I think 716 00:42:09,880 --> 00:42:12,440 Speaker 3: we lose something when we just assume that all of 717 00:42:12,440 --> 00:42:16,000 Speaker 3: these stories are about what they're advertised as. 718 00:42:16,000 --> 00:42:17,000 Speaker 4: It's like progress. 719 00:42:17,040 --> 00:42:20,240 Speaker 3: I think there's also a kind of mourning 720 00:42:20,280 --> 00:42:22,319 Speaker 3: of something that we've lost that we're trying to get back, 721 00:42:22,360 --> 00:42:25,160 Speaker 3: and we don't quite know how to do it, and 722 00:42:25,239 --> 00:42:28,879 Speaker 3: so we're trying to build a fancy new sports car 723 00:42:29,080 --> 00:42:31,040 Speaker 3: to get us there, but we can't. 
724 00:42:40,600 --> 00:42:42,400 Speaker 2: The thing that I found the most interesting about this 725 00:42:42,520 --> 00:42:45,320 Speaker 2: piece that you did was this idea that, like, isolation 726 00:42:46,080 --> 00:42:49,600 Speaker 2: is not about being alone. Yes, isolation is about being 727 00:42:49,600 --> 00:42:54,400 Speaker 2: away from community, absolutely, and you can be with a 728 00:42:54,400 --> 00:42:57,799 Speaker 2: community of people in a place that isn't home and 729 00:42:57,880 --> 00:42:59,080 Speaker 2: be very isolated. 730 00:42:59,400 --> 00:43:01,319 Speaker 1: Well, and another thing. You know, one of the questions 731 00:43:01,360 --> 00:43:03,319 Speaker 1: I didn't ask Nathaniel, but which I kind of wish 732 00:43:03,400 --> 00:43:07,600 Speaker 1: that I had, was about this interest in isolation research. Like, 733 00:43:08,280 --> 00:43:11,759 Speaker 1: we are constantly bombarded with this idea of the loneliness epidemic, 734 00:43:11,840 --> 00:43:14,120 Speaker 1: and like, even though we're more connected, we're more isolated 735 00:43:14,160 --> 00:43:16,120 Speaker 1: than ever. And I was wondering if there was a 736 00:43:16,200 --> 00:43:18,239 Speaker 1: kind of another thread that I actually didn't pull on, 737 00:43:18,280 --> 00:43:21,000 Speaker 1: but perhaps should have done, about, you know, why this 738 00:43:21,080 --> 00:43:23,000 Speaker 1: cultural moment is so interested in isolation. 739 00:43:23,320 --> 00:43:26,319 Speaker 2: That's right. And I think that, you know, I mean, 740 00:43:26,360 --> 00:43:27,799 Speaker 2: I think about it all the time when I'm sitting 741 00:43:27,840 --> 00:43:30,560 Speaker 2: at home on the couch on my phone, feeling incredibly 742 00:43:30,560 --> 00:43:33,960 Speaker 2: connected to people, and like how I could survive that way, 743 00:43:34,360 --> 00:43:37,319 Speaker 2: but also questioning, like, do I want to live that way? 
Right, 744 00:43:37,360 --> 00:43:39,240 Speaker 2: you know, and sort of how do I force myself 745 00:43:39,239 --> 00:43:39,600 Speaker 2: out of that? 746 00:43:40,000 --> 00:43:40,160 Speaker 1: Now. 747 00:43:40,160 --> 00:43:43,000 Speaker 2: That really has nothing to do with going to Mars, asterisk. 748 00:43:43,719 --> 00:43:46,239 Speaker 1: But you are somebody who grew up as a lover 749 00:43:46,320 --> 00:43:50,160 Speaker 1: of science fiction. Your father was a science fiction author. Yes. So, 750 00:43:50,880 --> 00:43:53,040 Speaker 1: I mean, some people like to be very dismissive of 751 00:43:53,360 --> 00:43:56,200 Speaker 1: Musk and Bezos and their dreams of space. You know, I 752 00:43:56,880 --> 00:44:01,480 Speaker 1: think they are two characters who can probably deal 753 00:44:01,560 --> 00:44:04,520 Speaker 1: with a bit of stick. But I don't think it's 754 00:44:04,600 --> 00:44:09,160 Speaker 1: wrong to dream, and even plan, about space exploration. 755 00:44:10,200 --> 00:44:13,000 Speaker 2: Well, I think part of it is a colonizer's instinct. 756 00:44:13,960 --> 00:44:17,279 Speaker 2: But I also think this idea of, like, what is 757 00:44:17,320 --> 00:44:22,000 Speaker 2: outside of our reach is always something that will fascinate 758 00:44:22,080 --> 00:44:25,879 Speaker 2: writers of science fiction, will always fascinate even, you know, 759 00:44:26,520 --> 00:44:29,920 Speaker 2: the most practical technologists, because it's something that in a 760 00:44:29,960 --> 00:44:32,880 Speaker 2: certain way is a fantasy. Like, even the idea of, 761 00:44:32,880 --> 00:44:36,440 Speaker 2: like, having to bring a 3D printer to Mars 762 00:44:36,480 --> 00:44:39,399 Speaker 2: because we can't lug certain things there. I mean, these 763 00:44:39,440 --> 00:44:42,279 Speaker 2: are such far-out concepts, you know. 764 00:44:42,320 --> 00:44:44,520 Speaker 1: I find them exciting. 
I find them exciting, I think, 765 00:44:44,760 --> 00:44:47,040 Speaker 1: but I also did find it very tragic, this idea 766 00:44:47,080 --> 00:44:50,879 Speaker 1: of, like, the compulsion to repeat these quite damaging experiments, 767 00:44:50,920 --> 00:44:54,600 Speaker 1: of sending people to simulate life on Mars and hurting 768 00:44:54,680 --> 00:44:56,160 Speaker 1: them in the process, in their life on Earth. 769 00:44:56,239 --> 00:44:58,600 Speaker 2: Yeah. Of course, we just had Trump, on day one 770 00:44:58,800 --> 00:45:02,719 Speaker 2: of his second term, simultaneously sign an executive order to 771 00:45:02,840 --> 00:45:06,840 Speaker 2: drop out of the Paris Climate Accords and declare that 772 00:45:06,880 --> 00:45:10,680 Speaker 2: we will launch astronauts into space and, I quote, plant 773 00:45:10,760 --> 00:45:15,560 Speaker 2: the stars and stripes on the planet Mars. So this 774 00:45:16,160 --> 00:45:19,399 Speaker 2: twinning of saying goodbye to Earth and embracing Mars actually 775 00:45:19,440 --> 00:45:21,359 Speaker 2: feels very salient and very right now. 776 00:45:21,400 --> 00:45:24,279 Speaker 1: Well, that's true. But all of this leaves me with 777 00:45:24,400 --> 00:45:29,600 Speaker 1: the question about you: is there anything that could be 778 00:45:29,680 --> 00:45:31,800 Speaker 1: done, that I could offer, to induce you to spend 779 00:45:32,000 --> 00:45:34,120 Speaker 1: three hundred and fifty days in a simulated Mars? 780 00:45:34,239 --> 00:45:38,320 Speaker 2: Now, I went to space camp, you'll remember. Or maybe you won't remember, 781 00:45:38,719 --> 00:45:40,560 Speaker 2: but I do remember: I did go to space camp. 782 00:45:41,040 --> 00:45:48,480 Speaker 2: I am an intellectual explorer. I am not a physical explorer. 783 00:45:48,600 --> 00:45:50,719 Speaker 1: You're not a psychonaut either. 784 00:45:50,520 --> 00:45:53,360 Speaker 2: No, I'm definitely not a psychonaut. And I did. 
I found the 785 00:45:53,400 --> 00:45:58,320 Speaker 2: story of the woman really, really tragic. 786 00:45:58,960 --> 00:46:02,759 Speaker 2: And I do think that what's interesting is that in 787 00:46:03,080 --> 00:46:07,240 Speaker 2: moments of, you know, innovation or exploration, we do test 788 00:46:07,320 --> 00:46:10,680 Speaker 2: people's psychological limits. Do we have to? I don't know, 789 00:46:10,920 --> 00:46:14,400 Speaker 2: you know. But I think that, for me personally, I 790 00:46:14,480 --> 00:46:20,120 Speaker 2: am not compelled by living for that long outside of, 791 00:46:20,360 --> 00:46:24,080 Speaker 2: sort of, my normal life. No. Are you? 792 00:46:24,120 --> 00:46:28,319 Speaker 1: No, no, I'm not. But that sense that we talked 793 00:46:28,360 --> 00:46:32,080 Speaker 1: about, of these experiments in some ways being a kind 794 00:46:32,080 --> 00:46:35,799 Speaker 1: of psychological mourning for what we're losing, did make 795 00:46:35,840 --> 00:46:39,520 Speaker 1: me think about environmental degradation. And you know, 796 00:46:39,520 --> 00:46:43,560 Speaker 1: I've seen these kinds of techno-fantasy illustrations of 797 00:46:43,719 --> 00:46:46,440 Speaker 1: what life on Mars might look like, and they're basically 798 00:46:46,480 --> 00:46:51,040 Speaker 1: these biospheres into which you have crammed, like, the Swiss Alps, 799 00:46:51,080 --> 00:46:54,600 Speaker 1: the Grand Canyon, the Mediterranean Sea, like, beautiful animals. 800 00:46:54,480 --> 00:46:58,440 Speaker 2: I also just think we're still human beings, right? Well, 801 00:46:58,880 --> 00:47:02,120 Speaker 2: for now, anyway. But you know, we still project all of 802 00:47:02,120 --> 00:47:04,440 Speaker 2: our fantasies onto the creature comforts 803 00:47:04,440 --> 00:47:06,840 Speaker 2: that we want. Do I want to ski on Mars? 804 00:47:06,840 --> 00:47:09,279 Speaker 2: I guess, right, because I like skiing here. 
805 00:47:09,480 --> 00:47:13,080 Speaker 1: You know, it makes you remember just how wonderful, you know, 806 00:47:13,160 --> 00:47:16,479 Speaker 1: this earth of ours is. And what I loved about 807 00:47:16,480 --> 00:47:18,920 Speaker 1: this interview, and took away from it, is when you 808 00:47:18,960 --> 00:47:22,160 Speaker 1: play out the fantasy, and when you actually ask, you know, 809 00:47:22,239 --> 00:47:25,359 Speaker 1: one of the chief research scientists at NASA what this 810 00:47:25,400 --> 00:47:28,200 Speaker 1: looks like in the future, it's not just going to Mars. 811 00:47:28,320 --> 00:47:31,960 Speaker 1: It's evolving into a new species with a different shape of head, 812 00:47:32,239 --> 00:47:35,360 Speaker 1: with a different reaction to radiation. And what that says 813 00:47:35,360 --> 00:47:38,479 Speaker 1: to me is this is not just, you know, going 814 00:47:38,520 --> 00:47:41,439 Speaker 1: on a fun trip. This is essentially saying that there's 815 00:47:41,440 --> 00:47:45,279 Speaker 1: going to be a fundamental, categorical shift in us as 816 00:47:45,280 --> 00:47:49,160 Speaker 1: a species in order to colonize Mars. And it's just 817 00:47:49,200 --> 00:47:52,080 Speaker 1: a very weird and, I find, disturbing thought. 818 00:47:52,960 --> 00:47:54,239 Speaker 2: Again, not something I would do. 819 00:47:54,760 --> 00:47:56,400 Speaker 1: That's a good place to leave it. That's it for 820 00:47:56,480 --> 00:48:03,560 Speaker 1: Tech Stuff today. Today's episode was produced by Sina Ozaki, 821 00:48:03,719 --> 00:48:07,880 Speaker 1: Eliza Dennis, Victoria Dominguez, and Lizzie Jacobs. It was executive 822 00:48:07,920 --> 00:48:11,520 Speaker 1: produced by me, Oz Woloshyn, Kara Price, and Kate Osborne for 823 00:48:11,560 --> 00:48:15,640 Speaker 1: Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 
The engineer is 824 00:48:15,680 --> 00:48:19,560 Speaker 1: Beheit Fraser, and Kyle Murdoch wrote our theme song. Join us on Friday 825 00:48:19,560 --> 00:48:22,239 Speaker 1: for Tech Stuff's The Week in Tech, when we'll explore the 826 00:48:22,280 --> 00:48:27,280 Speaker 1: origin story of our current obsession with step counting. Please rate, review, 827 00:48:27,360 --> 00:48:29,920 Speaker 1: and reach out to us at tech stuff podcast at 828 00:48:29,960 --> 00:48:32,160 Speaker 1: gmail dot com. We want to hear what's on your mind.