Speaker 1: NASA has a punch list of eight hundred problems that must be solved before the first mission to Mars is launched. Very few of them have to do with problems of human psychology, or really even of human survival, which is the subject of this experiment that I wrote about, called CHAPEA.

Speaker 2: This particular experiment began with a rather intriguing announcement on the NASA website.

Speaker 1: Yeah, it was a little bit like the Wonka factory, the golden ticket: four civilians would be chosen to go to Mars. Asterisk: not really Mars, but a habitat that was built, essentially a stage set, to look exactly like what they expect the first mission to Mars to look like. And it generated enormous excitement, and people from all over the country rushed to apply. They wanted the golden ticket to live out, in most cases, I think, a kind of childhood fantasy of space exploration, and to see if they could withstand psychologically the challenges of living away from everyone else they've ever known or met.

Speaker 2: Welcome to Tech Stuff: The Story.
I'm Oz Woloshyn, and each week we bring you an in-depth interview with one of the brightest and farthest-seeing minds in and about tech.

Speaker 2: Kara, I'm excited to bring you this interview with Nathaniel Rich. When we ask people to come on the show, it's always because one or other of us has been fascinated by something they've said, something they've done, or something they've written.

Speaker 3: Well, Nathaniel kind of had me at "Mars, asterisk."

Speaker 2: Me too. You can't really understand tech today without understanding, or at least investigating, the dreams and the fantasies of the tech titans. Colonizing space is such an important touchstone for Elon Musk and Jeff Bezos in particular, and it was also mentioned by Trump at his inauguration as, quote, "the pursuit of our manifest destiny."

Speaker 3: He said... what did he say? Put red, white, and blue, or stars and stripes, on Mars?

Speaker 2: Yeah. So when I came across this article in The New York Times Magazine, under the headline "Can Humans Withstand the Psychological Torture of Mars?", I had to know more.
In fact, I remember reading it and just getting goosebumps. And so I kind of wanted to talk to Nathaniel about how realistic the dreams of getting to Mars are, and what some of the practical, dare I say technical, steps required to achieve them might be.

Speaker 3: Before you get too excited, can you just tell me who Nathaniel Rich is?

Speaker 1: Sorry?

Speaker 2: Nathaniel is an author. He's written novels like The Mayor's Tongue, Odds Against Tomorrow, and King Zeno, but also nonfiction books, primarily about the environment, such as Losing Earth: A Recent History and Second Nature: Scenes from a World Remade. One critic actually said Rich is "a gifted caricaturist and a gifted apocalypticist." It's his talent for describing the apocalypse which brought him, in some ways, to reporting on the Mars Dune Alpha project, which I asked him about. Why did you decide to write the piece?

Speaker 1: The NASA part of it almost came secondarily.
I had become obsessed with this history of isolation research, and particularly with this incredible story of a man named Michel Siffre, who had launched a series of cave experiments to test the endurance of people in isolation, in environments where they're completely cut off from the world. And so he had run a series of these experiments that culminated with this experiment by the first female participant in the series, a woman named Véronique Le Guen. This was in the late eighties, and she went underground and ended up setting the record at the time of one hundred and eleven days underground. And she kept a journal, and she wrote about everything she was thinking about and feeling. And ultimately what happened was she went a little bit insane, but also had these moments of great euphoria and enlightenment. It's a tragic story, though, because she finally came out, and after being celebrated and becoming a kind of national celebrity for a period of time, she entered into this great depression and ultimately killed herself within a year.
And she had said before her death something to the effect of, you know, "I never was more alive than I was down underground, when I was all by myself." And that led me into a whole obsession with these types of experiments, and I wanted to see if anyone was doing these things now. Because on one level, they're completely unethical, because basically what you'd expect happens, which is most people struggle and often lose their hold on reality. And I found that no one was really doing these experiments, for that reason, except for NASA, who had continued under the guise of this Martian project.

Speaker 2: So on the one hand, NASA is putting out the call for applicants, but on the other hand, they had to build Mars, or at least a Martian colony, on Earth.

Speaker 1: Yeah, they had to build, or actually print, using a 3D printer, a habitat, which is, by the way, how they will do it when we get to Mars. You can't travel thirty-three million miles with a house, you know, towing a house behind you. Yeah, so they can't quite do that, or they don't have the technology to do that.
It's not efficient. And so what they will do is they will just lug a 3D printer up there and use Martian rock, regolith, as ink for this 3D printer.

Speaker 2: So they'll turn the sand into cement somehow.

Speaker 1: Yeah, and they can do that. They do that on this planet too, and you can find online some habitats that have been built, some houses that have been built this way, not using Martian rock, obviously, but terrestrial rock. And they will construct this house. It's a seventeen-hundred-square-foot habitat, and they built it in a warehouse at the Johnson Space Center in Houston. And there are four little bedrooms and a lounge and, you know, a small indoor garden and some computers and desks and, like, a little relaxation space. And that seventeen-hundred-square-foot habitat was where they were going to send four people for more than a year.

Speaker 2: And this habitat resembles exactly what they intend to build on Mars when they get there.
Speaker 1: Yeah, I'm sure it's subject to change, and I suppose part of this experiment was to determine whether this particular model would work best. But yeah, this is the plan.

Speaker 2: And the kind of simulated colony in the Johnson Space Center had quite a romantic name.

Speaker 1: Yeah, Mars Dune Alpha is the name of the habitat, and the mission is named CHAPEA, which is, I guess, NASA's idea of a sexy name.

Speaker 2: And so, okay, so the call goes out for some volunteers to go to Mars Dune Alpha. One of the people who sees the advertisement is Nathan Jones. Who's Nathan?

Speaker 1: Yeah, Nathan Jones is, in many ways, the most fascinating figure for me in reporting the piece. He's an emergency room physician from Springfield, Illinois; a father of three boys; married. And Nathan, like basically everyone I spoke to for the story, was a kind of self-professed NASA geek, and had always dreamed of doing something special, bigger, with his life.
He was obsessed with space travel, and when he saw this posting, he applied immediately, and then told his wife, who was, I think it's safe to say...

Speaker 2: Appalled? The sequence there seems a little off, speaking as a married man.

Speaker 1: Yeah, that wouldn't have flown in my house. But he was unique, actually, in that he was the only one of the finalists who had children, and as the father of two small children myself, I felt for the family. And he was fully aware he was going to miss out on a lot. You miss a year with your children, you're missing a lot, and you come back and the children look like different people. So there was another dimension of emotional challenge with him. But he was determined to do it.

Speaker 2: And how did he prepare?

Speaker 1: He prepared very dutifully. He and his wife had a whole series, I was fascinated by this, a whole series of preparations that they did.
He wrote little letters that he placed around the house in secret hiding spots that the kids and his wife, Casey, might find over the course of the year. Sometimes little notes of encouragement: like, he put a note in the fuse box for the first time the lights went out, and it said, you know, "You can do this. I trust you. Just flip this switch." And so there are all these sort of sweet and somewhat poignant touches.

Speaker 2: It's almost like the script of a movie where somebody knows they're going to die.

Speaker 1: Yeah, but the poignancy is somewhat compromised, I found, by the fact that it was all a contrived scenario. There's a kind of bathos to the fact that, well, he wasn't actually going to Mars. It's not quite the Matthew McConaughey Interstellar scenario, where he's missing his children for this major mission. He's just going to sit on a stage set for a year.
But that tension between the kind of absurdity of the whole proposition and then the real emotion that attended every aspect of this process, for me, that was really the heart of the story.

Speaker 2: All you have to do is watch the video of him about to go into Mars Dune Alpha. What did you feel when you watched somebody you spent time with as a source in such distress?

Speaker 1: Yeah, that was striking. He had predicted it. But sure enough, when it came time to enter this habitat, they had this dramatic ceremony. They were filmed right in front of the main portal, which is basically just a door; it wasn't like some major thing, like you're entering a submarine or something. But they gave a little press conference, and each one of them had to give a talk, give a little statement, and he broke down. He couldn't finish it, because he was so overcome by the thought of saying goodbye, finally, to his family for this long period of time.
Speaker 4: But I believe that tomorrow will only be possible because we step into Mars Dune Alpha today. And with that in mind, I also want to take a moment to sincerely thank the great many people who've worked tirelessly and so many countless hours to get us to this point. Also, thank you to our families and friends for their sacrifices. We see, we know those sacrifices. We couldn't be here without your love and support. Sorry. Sorry. To my wife and kids: I love you to the moon, I'm sorry, Mars, and back.

Speaker 1: And it's very moving and upsetting and sort of sweet and horrible in some ways as well. It's something that he brought upon himself. But I think what's key to understand is that everybody in the mission, from the administrators to the participants, felt very certain that what they were doing was a critical next step towards this wonderful dream of humanity's next chapter. They felt that there is no Mars, there is no exploration of Mars, unless you have the CHAPEA experiment. I'm not convinced that's true at all.
I mean, I wrote about that. But they certainly were convinced, and so they did feel that they were sacrificing, making a major personal sacrifice, towards achieving a great goal for all of humanity.

Speaker 2: Which may have kept them safe. And the woman you mentioned at the beginning, the French woman who took her own life, did she have that same sense of mission?

Speaker 1: That's a great point. There is some commonality, in that there was this idea that they were on a kind of different frontier of human psychology. But I don't think it was quite as ennobling, or the stakes quite as high, as you see with NASA and all the trappings of NASA.

Speaker 2: And also she was totally alone, whereas Nathan had three companions.

Speaker 1: Right, right. And so there are some distinctions there, although I will say that in the long history of experiments in which people are together in isolation, they suffer.
Also, I mean, maybe it's not quite as extreme. But, you know, in conducting the research for the piece, I spoke with a bunch of psychiatrists and historians of science and historians of psychology, and I learned that the definition of isolation is not necessarily being alone. It's being removed from your normal life and from the people close to you. So you can be in isolation with other people, and in fact, many of the same psychological effects are experienced whether or not you're with other people. You're cut off from the people who are most important to you.

Speaker 2: When I think about the history of space movies, there's obviously the famous "Houston, we have a problem." Could Nathan and co. stay in touch with home base, and even with their families, while they were in Mars Dune Alpha?
Speaker 1: Yeah. So they were very scrupulous about imitating the reality, the expected reality, which is that there's this time lapse for any communication from Mars, because it's far away and you're dealing with the limits of the speed of light and of the technology. And so, it depends on where Mars is in its orbit, but essentially there's like a twenty-nine-minute lapse, and so you can't have a conversation, any kind of normal conversation, but they can send messages. But the other problem is that every form of electronic communication from the habitat has to go through the same channel. So that includes any kind of data that the habitat is sending back to Earth about, I don't know, oxygen levels, or what's happening in the experiments, or any kind of computer connections. And so that's sort of the best-case scenario, and actually the lag can be much longer. And the larger the audio file or the text file, the computer file, the longer it takes.
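The baseline lapse described here is just light travel time: radio signals cross the Earth-to-Mars gap at the speed of light, so the one-way delay swings with the planets' orbital positions, and a round trip doubles it. A rough sketch of the arithmetic (the distances below are approximate round figures, not mission data):

```python
# Rough one-way radio delay between Earth and Mars at the speed of light.
SPEED_OF_LIGHT_KM_S = 299_792.458  # km per second, exact by definition

def one_way_delay_minutes(distance_km: float) -> float:
    """Minutes for a radio signal to cover the given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

# Approximate Earth-Mars distances; the true values vary with each orbit.
CLOSEST_KM = 54.6e6    # a very close approach
FARTHEST_KM = 401.0e6  # near conjunction, on opposite sides of the Sun

for label, d in [("closest", CLOSEST_KM), ("farthest", FARTHEST_KM)]:
    one_way = one_way_delay_minutes(d)
    # A round trip (question out, answer back) doubles the wait.
    print(f"{label}: {one_way:.1f} min one-way, {2 * one_way:.1f} min round trip")
```

At the far end of that range, a single question-and-answer exchange already takes the better part of an hour, which is why the crew fell back on message-style communication rather than live conversation.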
So sending a short video, even in low resolution, could take days, whereas sending a one-line text message maybe takes only half an hour or so. So they could communicate, but only in this clipped way, with all of these ellipses, essentially, between communications. So if there's an emergency, say, back at home, they couldn't just start having a conversation with them. Now, in reality, since they were on a stage set, they could break the experiment at any time, if someone just, like, I don't know, cut off their finger or something. But they would do anything to avoid breaking the experiment. So yeah, they were reduced to these sort of intermittent text messages, essentially, that would be relayed at unpredictable intervals.

Speaker 2: How did you choose the headline for your story?

Speaker 1: I don't choose the headlines. I'm not allowed. I can consult on them, and I can say this one's worse than the other one.

Speaker 2: But the headline The New York Times Magazine went with was "Can Humans Withstand the Psychological Torture of Mars?"
Speaker 1: I mean, it's a pretty good headline. Yes, yes. And that's also what it's about, basically: can people survive this? Because most of what NASA has been asking over the course of its space program is, can we physically get people into space? Can we physically put them on another planet? Very little thought has been given to whether human beings, once they're there, can survive psychologically, emotionally. And that's what this experiment is, at least ostensibly, about, and it's definitely what the story I wrote is about.

Speaker 2: When we come back: more from Nathaniel Rich on why we're so obsessed with going to Mars, and how, historically, attitudes towards Mars have always revealed deeper cultural undercurrents.

Speaker 2: How close is NASA to putting humans on Mars? They've been predicting for many years that it's just around the corner, and they keep pushing back the window.

Speaker 1: Even a few years ago, I think by twenty eighteen, they had predicted that it would be no later than the end of the twenty twenties. I think now they're looking more to the middle of the next decade.
But they are full speed ahead, and I think they're very confident that they will get people to the planet in a fairly short amount of time. The technical problems that lie before them, the ones we referenced, are not seen as intimidatingly difficult. They're just math problems to be worked out; that is the sense that I got from speaking with one of their senior propulsion engineers. So there is, and there has been for quite a while within NASA, quite a lot of optimism that this is going to happen, and it's going to happen pretty soon.

Speaker 2: And why? Why Mars?

Speaker 1: Well, that's the million-dollar question. I mean, there are a lot of different rationales. The main one you hear from NASA is that it represents scientific progress. It's the next step for human exploration of the universe, and certainly for human progress in space exploration.
There's also the rationale that, through the kind of innovation that's necessary to put people on Mars, or to reach any new milestone in space expeditions, there will be some kind of unpredictable benefits, technological benefits, that can be applied for all of humanity. So maybe they'll invent new materials or new types of devices that can then make our life on Earth easier. And there are plenty of examples, I think, of that in the past. And then there's a kind of political rationale, which is to say that we need to do it before someone else does. There's national pride on the line.

Speaker 2: I mean, is this like in the sixties, when JFK wanted to put a man on the moon first? Is there a parallel to the sixties in that respect?

Speaker 1: Yeah, I would say not only is there a parallel, but I think NASA, in its whole frame of thinking, if you can speak of an agency the size of NASA as personified in some way, I think the whole enterprise is really stuck in the sixties, if not the nineteen fifties, when it was created.
So it's very much... you know, you see this sort of vestigial, almost Cold War mentality that I think informs almost every aspect of the whole enterprise.

Speaker 2: What does it say to you that in the sixties it was the president, JFK, sort of outlining this national mission to put a man on the moon, and now, in the twenty twenties, it's Elon Musk and, to a certain extent, Jeff Bezos?

Speaker 1: Yeah, I think you can learn all you need to know about a culture or a society by studying its attitudes about Mars. You know, certainly now there are a few different strands. There's a kind of private-enterprise strand, but that is often, including in the case of Musk, closely alloyed with a libertarian fantasy of a lawless world in which people can stake their claim, a kind of Wild West, and not have regulation and oversight. There are groups of Mars enthusiasts out there that are very much explicitly libertarian ideologues, who hope to start a libertarian society on Mars.
So 349 00:20:48,760 --> 00:20:51,280 Speaker 1: that exists. If you go back to the fifties and sixties, 350 00:20:51,280 --> 00:20:54,199 Speaker 1: we were at this very different place in our culture, obviously, 351 00:20:54,280 --> 00:20:59,600 Speaker 1: in society, a place of tremendous global cooperation, relatively, that 352 00:20:59,720 --> 00:21:02,400 Speaker 1: gave birth to the entire sort of modern space race, 353 00:21:02,440 --> 00:21:04,960 Speaker 1: even though you had a competition between the Cold War powers. 354 00:21:05,640 --> 00:21:08,159 Speaker 1: But you can even go back further, and if you 355 00:21:08,240 --> 00:21:14,960 Speaker 1: look at the late nineteenth century, when Schiaparelli, a Milanese astronomer, 356 00:21:15,359 --> 00:21:17,919 Speaker 1: observed that there were canals on Mars, there was this 357 00:21:18,040 --> 00:21:21,639 Speaker 1: great fascination for decades about, are people living on Mars? 358 00:21:22,040 --> 00:21:26,280 Speaker 1: Are Martians building canals? And it was very much an expression. 359 00:21:26,760 --> 00:21:30,160 Speaker 1: You can find a very clear correlation with the kind 360 00:21:30,200 --> 00:21:32,960 Speaker 1: of excitement of the industrial age, and there was a 361 00:21:32,960 --> 00:21:35,960 Speaker 1: period where people were competing with Mars to build more 362 00:21:36,000 --> 00:21:39,400 Speaker 1: canals as fast as possible. This was also, of course, during 363 00:21:39,400 --> 00:21:41,399 Speaker 1: the same period as the digging of the Suez Canal. 364 00:21:41,480 --> 00:21:43,680 Speaker 1: So this was, you know, this is the New York Times. 365 00:21:43,680 --> 00:21:46,240 Speaker 1: This is not just some, like, weird thing. This 366 00:21:46,280 --> 00:21:48,560 Speaker 1: was, at the time, generally accepted, that we're in this 367 00:21:48,640 --> 00:21:51,800 Speaker 1: race against the Martians.
So it's always been a kind 368 00:21:51,840 --> 00:21:57,600 Speaker 1: of repository, Mars, for the kind of subconscious of the 369 00:21:57,640 --> 00:22:01,360 Speaker 1: culture that observes it. And I think that's true today, 370 00:22:01,480 --> 00:22:04,240 Speaker 1: and I think as our society changes, probably our view 371 00:22:04,240 --> 00:22:06,880 Speaker 1: of Mars will change in tandem with it. 372 00:22:07,320 --> 00:22:10,960 Speaker 2: You've written that future Mars voyagers will have to want 373 00:22:11,000 --> 00:22:13,119 Speaker 2: to travel to Mars more than almost anyone else in 374 00:22:13,160 --> 00:22:15,600 Speaker 2: the world. They'll have to embrace the knowledge that for 375 00:22:15,600 --> 00:22:18,560 Speaker 2: at least five hundred and seventy days, they will be 376 00:22:18,600 --> 00:22:22,120 Speaker 2: the most isolated human beings in the history of the universe. 377 00:22:23,080 --> 00:22:26,560 Speaker 1: Yes, they will have to, because that's what they're signing 378 00:22:26,640 --> 00:22:27,520 Speaker 1: up for. 379 00:22:27,800 --> 00:22:28,879 Speaker 2: What will that do to them? 380 00:22:29,200 --> 00:22:29,439 Speaker 3: You know? 381 00:22:29,520 --> 00:22:31,600 Speaker 1: I think a distinction has to be made between the 382 00:22:31,680 --> 00:22:35,080 Speaker 1: kind of person who wants to be an astronaut and 383 00:22:35,119 --> 00:22:37,280 Speaker 1: wants to go on a mission like this, like the 384 00:22:37,280 --> 00:22:40,560 Speaker 1: people I wrote about, like Nathan Jones. But then once 385 00:22:40,560 --> 00:22:45,600 Speaker 1: we start talking about a permanent settlement or colonies, we're 386 00:22:45,600 --> 00:22:48,159 Speaker 1: talking about a very different group of people.
So you 387 00:22:48,280 --> 00:22:51,920 Speaker 1: have these sort of zealot astronauts, who are, 388 00:22:52,119 --> 00:22:54,960 Speaker 1: you know, perfectly fit, who are the most stable people you've 389 00:22:55,000 --> 00:23:01,040 Speaker 1: ever met, enormous reserves of concentration and self-reliance 390 00:23:01,080 --> 00:23:03,360 Speaker 1: and all the rest, and then the rest of us, right? 391 00:23:03,480 --> 00:23:07,280 Speaker 1: And for a colony to exist, it has to look very different. 392 00:23:07,400 --> 00:23:10,439 Speaker 1: And a major criticism that I encountered in researching the 393 00:23:10,480 --> 00:23:13,560 Speaker 1: piece, from close watchers of the NASA program, is that 394 00:23:14,440 --> 00:23:18,720 Speaker 1: even if this experiment has some value to predict the 395 00:23:18,760 --> 00:23:23,119 Speaker 1: ability of, say, astronauts to survive in this setting, it 396 00:23:23,160 --> 00:23:26,280 Speaker 1: will have no value for the rest of us, for whom, 397 00:23:26,520 --> 00:23:28,720 Speaker 1: you know, all kinds of other considerations would have to 398 00:23:28,720 --> 00:23:32,280 Speaker 1: be made. And so we're certainly not at the stage 399 00:23:32,280 --> 00:23:35,359 Speaker 1: where we're asking, can people have families up there? Can 400 00:23:35,400 --> 00:23:39,439 Speaker 1: people give birth? There are some major biological challenges there. What 401 00:23:39,520 --> 00:23:43,560 Speaker 1: happens if someone gets sick? What happens if someone misses home, 402 00:23:43,720 --> 00:23:46,840 Speaker 1: you know, enters a depression? None of that. We're nowhere 403 00:23:46,840 --> 00:23:50,200 Speaker 1: near those kinds of questions yet. But I think, 404 00:23:50,520 --> 00:23:52,919 Speaker 1: if they continue to hit these benchmarks, that's where this 405 00:23:53,080 --> 00:23:54,280 Speaker 1: is ultimately heading.
406 00:23:54,800 --> 00:23:58,840 Speaker 2: So when you wrote the piece, Nathan and co were in 407 00:23:58,880 --> 00:24:02,200 Speaker 2: the Mars habitat, and since publication, they've of course come back. 408 00:24:03,000 --> 00:24:05,080 Speaker 2: Do you know what the experience was like for Nathan? 409 00:24:05,720 --> 00:24:10,879 Speaker 1: No, they're basically sworn to secrecy. And the 410 00:24:11,000 --> 00:24:15,879 Speaker 1: level of secrecy that shrouded just about every aspect of 411 00:24:15,920 --> 00:24:23,600 Speaker 1: this experiment was somewhat astounding. For me, it was, 412 00:24:23,640 --> 00:24:26,159 Speaker 1: as I was reporting the story, at least talking to the 413 00:24:26,240 --> 00:24:30,520 Speaker 1: NASA people and to some extent the participants themselves, you'd 414 00:24:30,520 --> 00:24:33,480 Speaker 1: think I was investigating, I don't know, Abu Ghraib or 415 00:24:33,480 --> 00:24:36,639 Speaker 1: something, the way that it was talked about, extremely confidential. Now, 416 00:24:36,640 --> 00:24:39,440 Speaker 1: their justification was that they want to run the experiment 417 00:24:39,480 --> 00:24:43,960 Speaker 1: multiple times, and they don't want prospective applicants to know 418 00:24:44,200 --> 00:24:46,880 Speaker 1: anything about what they're going to do. They don't want 419 00:24:46,920 --> 00:24:50,320 Speaker 1: to, because it would, I guess, diminish the value of 420 00:24:50,359 --> 00:24:52,840 Speaker 1: what they find if people already know, like, these are 421 00:24:52,880 --> 00:24:54,360 Speaker 1: the kinds of things they're going to do when we're there, 422 00:24:54,440 --> 00:24:56,760 Speaker 1: or this is what happened to people.
It struck me 423 00:24:56,760 --> 00:25:00,840 Speaker 1: as slightly ridiculous because, on the one hand, very similar 424 00:25:00,880 --> 00:25:04,359 Speaker 1: experiments have been conducted many times, including by NASA, and 425 00:25:04,440 --> 00:25:06,680 Speaker 1: those results are public. 426 00:25:06,480 --> 00:25:08,720 Speaker 2: So NASA haven't published any results of this? 427 00:25:09,160 --> 00:25:11,680 Speaker 1: Not that I'm aware of, no, and you know, they 428 00:25:11,720 --> 00:25:13,200 Speaker 1: release these very anodyne statements. 429 00:25:13,200 --> 00:25:13,840 Speaker 2: It's a success. 430 00:25:13,880 --> 00:25:15,120 Speaker 1: Everyone had a great time. 431 00:25:15,280 --> 00:25:17,439 Speaker 2: And you put the story in the context of the 432 00:25:17,480 --> 00:25:21,840 Speaker 2: history of isolation research. But more specifically, it seems like 433 00:25:21,920 --> 00:25:26,320 Speaker 2: this particular simulation of life on Mars has happened multiple 434 00:25:26,359 --> 00:25:29,760 Speaker 2: times in the past and has also been replicated multiple 435 00:25:29,760 --> 00:25:31,760 Speaker 2: times right now all around the world. Can you kind 436 00:25:31,760 --> 00:25:36,000 Speaker 2: of describe the spread of this type of experiment being run? 437 00:25:37,080 --> 00:25:40,520 Speaker 1: Yeah, I guess it depends on how narrowly you want 438 00:25:40,560 --> 00:25:44,840 Speaker 1: to define the experiment. But NASA has been doing some version, 439 00:25:45,200 --> 00:25:49,080 Speaker 1: conducting some version of this experiment since before NASA was 440 00:25:49,119 --> 00:25:51,119 Speaker 1: even called NASA. I mean, some of the 441 00:25:51,160 --> 00:25:57,280 Speaker 1: early first astronauts did isolation experiments.
They would put them 442 00:25:57,280 --> 00:26:01,280 Speaker 1: in little pods for long periods of time, sometimes 443 00:26:01,320 --> 00:26:06,280 Speaker 1: in fairly brutal configurations and sometimes completely in isolation, especially 444 00:26:06,280 --> 00:26:09,320 Speaker 1: back in the fifties, when they thought that astronauts would 445 00:26:09,359 --> 00:26:12,720 Speaker 1: have to be propelled in tiny little vessels for months 446 00:26:12,760 --> 00:26:15,600 Speaker 1: at a time into outer space. But there was another 447 00:26:16,560 --> 00:26:20,760 Speaker 1: similar experiment called HI-SEAS, which was the subject of 448 00:26:20,800 --> 00:26:23,840 Speaker 1: a really fascinating book by the writer Kate Greene, who 449 00:26:23,880 --> 00:26:26,440 Speaker 1: was one of the original crew members. They ran that experiment, 450 00:26:27,040 --> 00:26:29,439 Speaker 1: I don't know, I think a dozen times. That was 451 00:26:29,480 --> 00:26:32,480 Speaker 1: a similar idea, in a habitat that was built on 452 00:26:32,720 --> 00:26:38,040 Speaker 1: Mauna Loa in Hawaii, and it was four people, 453 00:26:38,600 --> 00:26:41,639 Speaker 1: or sometimes six, put into this environment for months at 454 00:26:41,680 --> 00:26:46,240 Speaker 1: a time. And Greene writes very elegantly and movingly about 455 00:26:46,320 --> 00:26:48,919 Speaker 1: the experience and about the kind of madness of it 456 00:26:48,960 --> 00:26:52,520 Speaker 1: and what it did to her life. The book, Once 457 00:26:52,600 --> 00:26:55,359 Speaker 1: Upon a Time I Lived on Mars, it's called. And 458 00:26:55,400 --> 00:26:59,040 Speaker 1: then there was a crazy experiment called Mars five hundred 459 00:26:59,200 --> 00:27:04,000 Speaker 1: that was administered by a Russian agency which has 460 00:27:04,040 --> 00:27:08,000 Speaker 1: a name that I love, called the Institute of Biomedical Problems.
461 00:27:08,160 --> 00:27:12,520 Speaker 1: So of course that's who did this completely barbaric experiment 462 00:27:12,520 --> 00:27:14,800 Speaker 1: where they locked six male crew members together for five 463 00:27:14,880 --> 00:27:16,000 Speaker 1: hundred and twenty days. 464 00:27:16,200 --> 00:27:16,600 Speaker 2: Wow. 465 00:27:16,920 --> 00:27:20,400 Speaker 1: That was in twenty ten and eleven, in a kind 466 00:27:20,440 --> 00:27:26,000 Speaker 1: of fake spacecraft on a fake Mars, and that was 467 00:27:26,000 --> 00:27:29,399 Speaker 1: pretty well studied, and participants lost their hair and 468 00:27:29,480 --> 00:27:33,520 Speaker 1: lost weight. But then there's NASA. They have something like 469 00:27:33,600 --> 00:27:37,760 Speaker 1: a dozen different versions of this going on at all times. 470 00:27:37,840 --> 00:27:40,480 Speaker 1: There are all different configurations, different amounts of time, different 471 00:27:40,520 --> 00:27:41,600 Speaker 1: numbers of participants. 472 00:27:41,600 --> 00:27:44,000 Speaker 2: So did you say to NASA, why do 473 00:27:44,000 --> 00:27:44,800 Speaker 2: you need to keep doing that? 474 00:27:44,920 --> 00:27:46,960 Speaker 1: Yes, that was one of my big questions. Why do 475 00:27:47,040 --> 00:27:50,280 Speaker 1: we keep doing this? And don't we know what happened?
476 00:27:50,400 --> 00:27:53,119 Speaker 1: Even before the NASA history, there's this whole other history 477 00:27:53,119 --> 00:27:59,000 Speaker 1: of people doing similar isolation experiments. And their official answer was, yes, 478 00:27:59,040 --> 00:28:03,040 Speaker 1: we've done some similar experiments, but actually there's no substitute; 479 00:28:03,320 --> 00:28:06,760 Speaker 1: this is far closer to the expected reality, and 480 00:28:06,920 --> 00:28:11,800 Speaker 1: experimentally, scientifically, all of the previous experiments are essentially useless 481 00:28:11,800 --> 00:28:15,119 Speaker 1: and this is the only one that will matter. Now, 482 00:28:15,760 --> 00:28:19,879 Speaker 1: if you believe that, you also have to then wonder, well, 483 00:28:21,000 --> 00:28:22,720 Speaker 1: and this is what some of the people that study 484 00:28:22,760 --> 00:28:26,280 Speaker 1: this pointed out to me, yes, okay, this experiment, even 485 00:28:26,280 --> 00:28:30,359 Speaker 1: if it's an exact simulation, a perfect simulation of what 486 00:28:30,680 --> 00:28:33,520 Speaker 1: the first Mars expedition is going to be, you're only 487 00:28:33,560 --> 00:28:38,040 Speaker 1: testing a group of four people, an n of four, right, 488 00:28:38,160 --> 00:28:43,040 Speaker 1: experimentally speaking, and so the statistical value of this 489 00:28:43,120 --> 00:28:46,080 Speaker 1: experiment is close to nil. You'd have to run this 490 00:28:46,200 --> 00:28:51,880 Speaker 1: experiment thousands of times for it to be statistically reliable, 491 00:28:52,040 --> 00:28:53,560 Speaker 1: and of course they're not going to do that.
So 492 00:28:53,680 --> 00:28:57,360 Speaker 1: even if you grant them this sort of scientific argument 493 00:28:57,400 --> 00:29:00,320 Speaker 1: that this experiment is unlike all the other ones, even 494 00:29:00,360 --> 00:29:02,680 Speaker 1: though they all basically have the same results, it doesn't 495 00:29:02,720 --> 00:29:05,360 Speaker 1: actually have much scientific value unless they would do it, 496 00:29:05,360 --> 00:29:08,800 Speaker 1: I don't know, fifty times or a thousand times. 497 00:29:08,840 --> 00:29:11,800 Speaker 1: I'm not sure where the probability charts cut off, but 498 00:29:12,760 --> 00:29:14,960 Speaker 1: as it stands, they're probably going to do it one 499 00:29:15,040 --> 00:29:18,000 Speaker 1: or two more times, at which point they'll be ready 500 00:29:18,040 --> 00:29:19,560 Speaker 1: to hurl people up to Mars. 501 00:29:20,000 --> 00:29:23,480 Speaker 2: But from that point of view, was this about understanding 502 00:29:23,600 --> 00:29:27,680 Speaker 2: if humans can withstand isolation? Or, we 503 00:29:27,720 --> 00:29:30,600 Speaker 2: talked at the beginning about the technical problems NASA has 504 00:29:30,680 --> 00:29:33,240 Speaker 2: to solve, were there any technical problems 505 00:29:33,240 --> 00:29:34,440 Speaker 2: they were looking to solve with this? 506 00:29:34,680 --> 00:29:38,240 Speaker 1: That was probably the, that was the point where I 507 00:29:38,520 --> 00:29:42,040 Speaker 1: was most, I mean, that's where I sort 508 00:29:42,040 --> 00:29:45,040 Speaker 1: of laughed in the reporting, although it's kind of horrible. So, yes, 509 00:29:45,080 --> 00:29:46,560 Speaker 1: the official line is, we want to test the 510 00:29:46,640 --> 00:29:49,440 Speaker 1: human side of this.
We have all these divisions doing 511 00:29:50,280 --> 00:29:54,000 Speaker 1: the science and the technology, and this is the human 512 00:29:54,040 --> 00:29:56,080 Speaker 1: research side. And in fact, there is a human research 513 00:29:56,160 --> 00:30:01,840 Speaker 1: division within NASA that was administering the experiment. However, they 514 00:30:01,840 --> 00:30:06,200 Speaker 1: were partnered with two other divisions, and the division that 515 00:30:06,280 --> 00:30:11,320 Speaker 1: oversaw the whole experiment was actually run by someone named 516 00:30:11,400 --> 00:30:15,880 Speaker 1: Rachel McCauley, who is a propulsion engineer. She's the one 517 00:30:15,880 --> 00:30:20,040 Speaker 1: who decides which rocket will do the job best, and 518 00:30:20,080 --> 00:30:24,120 Speaker 1: in order to make that determination, she needs to nail 519 00:30:24,160 --> 00:30:26,000 Speaker 1: down a bunch of variables. And one of the main 520 00:30:26,080 --> 00:30:30,640 Speaker 1: variables is how much weight needs to be carried by 521 00:30:30,720 --> 00:30:32,840 Speaker 1: the rocket ship. And so what that means is, of 522 00:30:32,920 --> 00:30:35,360 Speaker 1: course, the weight of the people, but also how much 523 00:30:35,360 --> 00:30:39,080 Speaker 1: food do they have to take. And so when I 524 00:30:39,120 --> 00:30:44,240 Speaker 1: talked to her, she was, like, very blithely kind of 525 00:30:44,320 --> 00:30:48,920 Speaker 1: dismissive of the whole human psychological aspect of the thing, 526 00:30:49,000 --> 00:30:52,800 Speaker 1: and instead she focused on how much food are they 527 00:30:52,800 --> 00:30:56,040 Speaker 1: going to eat? Like, what's the weight? How much waste 528 00:30:56,080 --> 00:30:59,760 Speaker 1: are they going to produce? And once I have those figures, 529 00:31:00,280 --> 00:31:03,640 Speaker 1: I will know exactly what kind of propulsion device to use.
530 00:31:04,080 --> 00:31:06,120 Speaker 1: And so then I went, you know, a little bit dubious, yeah, 531 00:31:06,160 --> 00:31:07,560 Speaker 1: and so I was like, what? No, I mean, I 532 00:31:07,560 --> 00:31:10,840 Speaker 1: believed her, because she was running the experiment. She's a 533 00:31:10,960 --> 00:31:14,920 Speaker 1: solid propulsion systems engineer. And so then I went back 534 00:31:15,000 --> 00:31:17,960 Speaker 1: to the sort of human research people, and they're like, oh, no, no, no, 535 00:31:18,440 --> 00:31:21,880 Speaker 1: it's all about human psychology. But in fact, the person 536 00:31:21,880 --> 00:31:23,640 Speaker 1: they were reporting to, the person who was running the 537 00:31:23,640 --> 00:31:27,000 Speaker 1: whole thing, said that was not the case. And so, 538 00:31:27,360 --> 00:31:30,120 Speaker 1: actually, I think if you follow the money, you start 539 00:31:30,160 --> 00:31:33,680 Speaker 1: to wonder, well, is this whole human aspect side of 540 00:31:33,680 --> 00:31:37,680 Speaker 1: it part of the marketing? And it's frankly irrelevant to 541 00:31:37,760 --> 00:31:40,920 Speaker 1: what NASA's real concern is, which is, yeah, how many 542 00:31:40,920 --> 00:31:42,640 Speaker 1: pounds of food do we need to put on this thing? 543 00:31:43,080 --> 00:31:50,120 Speaker 2: Stay with us for more from Nathaniel Rich on 544 00:31:50,240 --> 00:31:54,920 Speaker 2: why dreams of Mars and dreams of AI are inextricably linked, 545 00:31:55,120 --> 00:31:59,120 Speaker 2: and why some techno-optimists theorize that humans would evolve 546 00:31:59,160 --> 00:32:06,280 Speaker 2: into AI-powered Martians.
There was a part of your 547 00:32:06,400 --> 00:32:11,080 Speaker 2: story that really stuck out to me, which was that NASA's 548 00:32:11,200 --> 00:32:16,560 Speaker 2: chief research scientist, Dennis Bushnell, said that as colonizing Mars 549 00:32:16,560 --> 00:32:21,000 Speaker 2: becomes more feasible, colonists themselves will evolve into Martians. 550 00:32:21,600 --> 00:32:22,040 Speaker 1: Yes. 551 00:32:22,920 --> 00:32:24,080 Speaker 2: Did that surprise you? 552 00:32:26,280 --> 00:32:29,560 Speaker 1: Yes, although only a little bit. It surprised me to 553 00:32:29,600 --> 00:32:33,800 Speaker 1: see him write about that so openly. This chief 554 00:32:33,840 --> 00:32:38,640 Speaker 1: scientist at the Langley Research Center, who had been, I 555 00:32:38,680 --> 00:32:41,760 Speaker 1: think he recently retired, had been at NASA for sixty years, 556 00:32:41,800 --> 00:32:45,520 Speaker 1: published this sort of opus about the institutional 557 00:32:45,640 --> 00:32:49,760 Speaker 1: view of deep space exploration, and he said what I 558 00:32:49,760 --> 00:32:53,200 Speaker 1: think a lot of scientists have predicted, which is that if 559 00:32:53,200 --> 00:32:57,920 Speaker 1: people are able to survive on Mars for any extended 560 00:32:57,960 --> 00:33:02,640 Speaker 1: amount of time, with oxygen and all the rest, then 561 00:33:03,160 --> 00:33:07,840 Speaker 1: ultimately their bodies will change. That over time, because of 562 00:33:07,920 --> 00:33:12,280 Speaker 1: the radiation exposure, because of the reduced gravity, there 563 00:33:12,280 --> 00:33:16,320 Speaker 1: will be real physiological changes to their bodies. That there's 564 00:33:16,320 --> 00:33:18,440 Speaker 1: no way out of that.
So essentially one of the 565 00:33:18,520 --> 00:33:21,600 Speaker 1: kind of tricks for surviving Mars is to live there 566 00:33:21,640 --> 00:33:25,760 Speaker 1: long enough so that people evolve into Martians, and they 567 00:33:25,800 --> 00:33:28,960 Speaker 1: look different, and they probably have elongated heads and maybe 568 00:33:29,040 --> 00:33:30,560 Speaker 1: different diets and all the rest of it. 569 00:33:30,400 --> 00:33:33,760 Speaker 2: Evolved meaning, of course, natural selection. Survival of 570 00:33:33,800 --> 00:33:36,120 Speaker 2: the fittest on Mars, exactly. 571 00:33:36,320 --> 00:33:40,960 Speaker 1: Yes, we're talking about a generational shift. Now, 572 00:33:41,040 --> 00:33:43,760 Speaker 1: of course, they have to solve inconvenient things 573 00:33:43,800 --> 00:33:45,960 Speaker 1: like procreation on Mars and all the rest of that. 574 00:33:46,080 --> 00:33:48,880 Speaker 1: But yes, that's the long term view, that we 575 00:33:48,960 --> 00:33:51,560 Speaker 1: won't have to solve every problem perfectly, because people will 576 00:33:51,560 --> 00:33:54,360 Speaker 1: just start to, there'll be natural selection, and they'll be 577 00:33:54,400 --> 00:33:57,920 Speaker 1: forced to evolve into these other Martian creatures, and that 578 00:33:58,360 --> 00:34:00,640 Speaker 1: seems to be NASA's view. 579 00:34:03,320 --> 00:34:05,560 Speaker 2: There's another piece you wrote in The New York Times recently, 580 00:34:05,640 --> 00:34:09,720 Speaker 2: which was a review of Ray Kurzweil's book The Singularity 581 00:34:09,920 --> 00:34:13,279 Speaker 2: Is Nearer. Can you talk about who Ray Kurzweil is, 582 00:34:14,040 --> 00:34:18,799 Speaker 2: that book, and how reviewing that book syncs up with 583 00:34:18,840 --> 00:34:20,360 Speaker 2: your writing on this experiment. 584 00:34:21,160 --> 00:34:21,359 Speaker 3: Yeah.
585 00:34:21,520 --> 00:34:24,319 Speaker 1: Kurzweil is a kind of god of AI, who's called 586 00:34:24,320 --> 00:34:28,759 Speaker 1: the godfather of AI, who for many decades has 587 00:34:28,800 --> 00:34:36,800 Speaker 1: been predicting the rise of artificial intelligence and ultimately the singularity. 588 00:34:37,480 --> 00:34:41,440 Speaker 1: But yes, his idea is that there will be nanobots 589 00:34:42,320 --> 00:34:46,320 Speaker 1: powered by artificial intelligence that we will inject into our bodies, 590 00:34:46,719 --> 00:34:49,799 Speaker 1: and that they will swim through our bloodstream into our 591 00:34:49,840 --> 00:34:53,839 Speaker 1: brains and connect our neocortex to the cloud, linking us 592 00:34:53,920 --> 00:34:56,200 Speaker 1: up to, I guess, the Internet, or really, like, 593 00:34:56,200 --> 00:35:01,640 Speaker 1: the global repository of all human information and civilization. And so 594 00:35:01,680 --> 00:35:06,000 Speaker 1: at that point, when we're just kind of wired into intelligence, 595 00:35:06,880 --> 00:35:11,160 Speaker 1: electronic intelligence, that for him is the singularity, and he 596 00:35:11,280 --> 00:35:14,719 Speaker 1: thinks that's coming very soon, basically by the end of 597 00:35:14,719 --> 00:35:15,240 Speaker 1: the decade.
598 00:35:15,560 --> 00:35:18,320 Speaker 2: Well, but there's something to me which is very striking, 599 00:35:18,320 --> 00:35:21,560 Speaker 2: in the sense that Ray Kurzweil, the godfather 600 00:35:21,600 --> 00:35:24,640 Speaker 2: of AI, on the one hand, and on the other hand, 601 00:35:24,960 --> 00:35:29,759 Speaker 2: Dennis Bushnell, the NASA chief scientist, are both saying in 602 00:35:29,800 --> 00:35:34,000 Speaker 2: one way or another that within our lifetimes, the technological 603 00:35:34,080 --> 00:35:38,560 Speaker 2: future will mean that we no longer conform to the 604 00:35:38,600 --> 00:35:40,880 Speaker 2: current definition of what it is to be human. 605 00:35:41,760 --> 00:35:44,240 Speaker 1: Yes, although I think you'd be hard pressed to find 606 00:35:44,400 --> 00:35:48,600 Speaker 1: a definition that would be universally agreed 607 00:35:48,600 --> 00:35:51,240 Speaker 1: to on what it means to be human. True. Now, 608 00:35:51,360 --> 00:35:54,520 Speaker 1: we are already, and that's part of Kurzweil's argument, 609 00:35:54,880 --> 00:35:57,680 Speaker 1: is that we've already outsourced so much of our mind 610 00:35:58,480 --> 00:36:03,440 Speaker 1: and identity to technology, that we rely on the Internet 611 00:36:03,480 --> 00:36:07,239 Speaker 1: to remember things for us, our digital record; a lot 612 00:36:07,280 --> 00:36:13,520 Speaker 1: of our powers are only possible through technology, and if 613 00:36:13,520 --> 00:36:16,040 Speaker 1: we were just put in the wilderness, most of us 614 00:36:16,040 --> 00:36:19,560 Speaker 1: wouldn't be able to survive a couple of weeks. But yes, 615 00:36:19,760 --> 00:36:22,919 Speaker 1: both of these visions, they're both kind of these 616 00:36:22,960 --> 00:36:28,080 Speaker 1: technologically optimistic views of the world.
There's this kind of 617 00:36:29,360 --> 00:36:34,560 Speaker 1: viscerally disturbing aspect to them, which is that they require 618 00:36:35,520 --> 00:36:39,200 Speaker 1: us to reimagine physically what we'll look like, you know, 619 00:36:39,400 --> 00:36:42,520 Speaker 1: even putting aside all the sort of mental, psychological aspect 620 00:36:42,560 --> 00:36:44,839 Speaker 1: of it, that we're going to morph into these 621 00:36:44,880 --> 00:36:47,759 Speaker 1: other, different kinds of creatures that are going to be, 622 00:36:47,760 --> 00:36:51,000 Speaker 1: like, physically in some ways unrecognizable. And Kurzweil has this 623 00:36:51,040 --> 00:36:53,880 Speaker 1: whole thing about how soon people will be able to design 624 00:36:53,920 --> 00:36:56,080 Speaker 1: their own bodies the way you can design, like, a 625 00:36:56,200 --> 00:36:59,239 Speaker 1: virtual avatar, and that we could have people with 626 00:36:59,280 --> 00:37:04,920 Speaker 1: wings and tusks and whatever you want, you know, feathers, 627 00:37:05,080 --> 00:37:09,600 Speaker 1: and that part of it tends not to be spoken 628 00:37:09,680 --> 00:37:13,120 Speaker 1: aloud or advertised as much as the part about, you know, 629 00:37:13,360 --> 00:37:17,279 Speaker 1: improving our intelligence. But I think what was striking to 630 00:37:17,320 --> 00:37:19,479 Speaker 1: me about Kurzweil's book, and what I wanted to write 631 00:37:19,480 --> 00:37:23,239 Speaker 1: about, is, let's not forget the part where the 632 00:37:23,239 --> 00:37:26,040 Speaker 1: prerequisite for all of these future predictions is that we're 633 00:37:26,040 --> 00:37:30,919 Speaker 1: injecting microscopic robots into our brains and our bloodstream. Let's 634 00:37:30,920 --> 00:37:35,319 Speaker 1: not lose track of that part of it.
So that, yes, 635 00:37:35,400 --> 00:37:37,600 Speaker 1: I think you're right to draw a kind of parallel 636 00:37:37,680 --> 00:37:41,400 Speaker 1: with the Mars visions. They tend to collide in the 637 00:37:41,440 --> 00:37:44,480 Speaker 1: realm of artificial intelligence. It's not surprising that Elon Musk, 638 00:37:44,560 --> 00:37:47,440 Speaker 1: you know, is obsessed with both Mars and AI. 639 00:37:48,440 --> 00:37:51,760 Speaker 2: You used a phrase earlier in our conversation about mourning, 640 00:37:52,040 --> 00:37:54,160 Speaker 2: and one of the pieces of Kurzweil's book that you 641 00:37:54,239 --> 00:37:57,760 Speaker 2: draw out is him talking about basically making an AI 642 00:37:57,960 --> 00:38:00,880 Speaker 2: version of his father, who passed away in nineteen seventy, 643 00:38:01,000 --> 00:38:04,000 Speaker 2: to be able to talk to him about music. And 644 00:38:04,040 --> 00:38:06,680 Speaker 2: one of the other things I noticed in the piece 645 00:38:06,920 --> 00:38:11,040 Speaker 2: about Mars was the crop garden in the Mars Dune 646 00:38:11,080 --> 00:38:14,719 Speaker 2: Alpha colony, which wouldn't be for eating, but rather for 647 00:38:14,800 --> 00:38:18,880 Speaker 2: the mental health of the participants. You know, I 648 00:38:18,920 --> 00:38:20,839 Speaker 2: guess it makes me think of that whole sort of 649 00:38:21,000 --> 00:38:23,759 Speaker 2: clichéd thing about the fisherman who becomes a millionaire and 650 00:38:23,760 --> 00:38:26,800 Speaker 2: then returns to where he lived to fish. The craving 651 00:38:26,960 --> 00:38:31,239 Speaker 2: for the kind of things which are the touchstones of 652 00:38:31,480 --> 00:38:35,200 Speaker 2: what we think about as our human experience is also 653 00:38:35,280 --> 00:38:37,759 Speaker 2: present in these future fantasies. 654 00:38:38,840 --> 00:38:43,160 Speaker 1: Absolutely.
That's another major point of convergence, I think, which is 655 00:38:43,239 --> 00:38:49,960 Speaker 1: that once you peel back this techno-optimistic fantasy 656 00:38:50,440 --> 00:38:54,719 Speaker 1: of how things are going to be, you find this 657 00:38:54,920 --> 00:39:00,239 Speaker 1: deep sense of longing for how things once were. You 658 00:39:00,280 --> 00:39:03,680 Speaker 1: see it in Kurzweil, where after hundreds of pages of 659 00:39:03,719 --> 00:39:07,839 Speaker 1: talking about all the wonders of this new technology, all 660 00:39:07,880 --> 00:39:11,240 Speaker 1: the conveniences, and how we can travel, have beach holidays 661 00:39:11,239 --> 00:39:13,840 Speaker 1: without leaving our houses through virtual reality, and all the 662 00:39:13,840 --> 00:39:19,600 Speaker 1: rest of it, his ultimate goal is to reanimate his 663 00:39:20,040 --> 00:39:23,480 Speaker 1: dead father, who was a composer of some renown 664 00:39:23,560 --> 00:39:27,600 Speaker 1: and a conductor in New York. And he's already gone 665 00:39:27,600 --> 00:39:31,040 Speaker 1: so far as to program an AI version of his 666 00:39:31,080 --> 00:39:34,799 Speaker 1: father, trained on his father's letters and writings and 667 00:39:34,840 --> 00:39:39,239 Speaker 1: personal documents and his music. In the pages of the book, 668 00:39:40,120 --> 00:39:43,600 Speaker 1: there's a transcript of a conversation that Kurzweil has with 669 00:39:44,040 --> 00:39:46,960 Speaker 1: his dead father, and that to him is his 670 00:39:47,040 --> 00:39:50,360 Speaker 1: great hope, to bring back his dad.
In the 671 00:39:50,400 --> 00:39:55,240 Speaker 1: same way with Mars, I was struck by the mournful 672 00:39:55,320 --> 00:40:00,799 Speaker 1: quality of this whole enterprise, and everyone I asked, every 673 00:40:00,800 --> 00:40:03,000 Speaker 1: sort of expert I interviewed, I asked, isn't there something, 674 00:40:03,000 --> 00:40:06,839 Speaker 1: something just a little bit upsetting about all of this, 675 00:40:07,040 --> 00:40:09,000 Speaker 1: like, what is it, you know? And many 676 00:40:09,000 --> 00:40:11,040 Speaker 1: people kind of agreed, but they couldn't put their finger 677 00:40:11,080 --> 00:40:14,440 Speaker 1: on it, until I spoke to this one historian of 678 00:40:14,480 --> 00:40:20,560 Speaker 1: isolation experiments, Mattias, at Cornell, and he said this thing 679 00:40:20,600 --> 00:40:22,160 Speaker 1: that for me is the heart of the story, and 680 00:40:22,200 --> 00:40:24,239 Speaker 1: to some extent it's the heart of the Kurzweil story and 681 00:40:24,280 --> 00:40:27,280 Speaker 1: even the AI story, which is that the urge to try to recreate 682 00:40:27,320 --> 00:40:30,799 Speaker 1: a perfect world is always going to be about rehearsing 683 00:40:30,880 --> 00:40:34,440 Speaker 1: what we got wrong here. He told me, we're not 684 00:40:34,560 --> 00:40:39,279 Speaker 1: chasing Mars, we're mourning Earth. That struck a chord with me, 685 00:40:39,360 --> 00:40:43,240 Speaker 1: because I feel like that is the through line here, 686 00:40:43,360 --> 00:40:47,720 Speaker 1: that there's this attempt to chase something that we've lost.
687 00:40:47,800 --> 00:40:51,279 Speaker 1: And you know, for Matthias, he was talking about essentially 688 00:40:51,960 --> 00:40:56,279 Speaker 1: a world ruined by climate change and environmental degradation, and 689 00:40:56,960 --> 00:41:00,640 Speaker 1: that the ultimate fulfillment of the Mars fantasy, at least 690 00:41:00,680 --> 00:41:04,480 Speaker 1: in our age, seems to be to terraform the planet 691 00:41:04,800 --> 00:41:08,960 Speaker 1: and create a kind of idyllic second Earth that won't 692 00:41:09,000 --> 00:41:11,760 Speaker 1: be marred by all the mistakes that we've made here. 693 00:41:12,120 --> 00:41:15,120 Speaker 1: And the AI fantasy has the same component. It's, you know, 694 00:41:15,120 --> 00:41:18,000 Speaker 1: we'll all be young and beautiful and free of sin, 695 00:41:18,400 --> 00:41:23,799 Speaker 1: in a way. And I think that's true, and 696 00:41:23,840 --> 00:41:26,960 Speaker 1: I think we lose something when we 697 00:41:27,360 --> 00:41:30,120 Speaker 1: just assume that all of these stories are about 698 00:41:30,880 --> 00:41:33,440 Speaker 1: the way they're advertised, as progress. I think 699 00:41:33,480 --> 00:41:36,759 Speaker 1: there's also a kind of mourning of something that 700 00:41:36,800 --> 00:41:38,520 Speaker 1: we've lost that we're trying to get back, and we 701 00:41:38,560 --> 00:41:42,480 Speaker 1: don't quite know how to do it, and so we're 702 00:41:42,520 --> 00:41:45,399 Speaker 1: trying to build a fancy new sports car to get 703 00:41:45,480 --> 00:41:47,040 Speaker 1: us there, but we can't. 704 00:41:56,600 --> 00:41:58,400 Speaker 3: The thing that I found the most interesting about this 705 00:41:58,520 --> 00:42:01,320 Speaker 3: piece that you did was this idea that, like, isolation 706 00:42:02,080 --> 00:42:05,600 Speaker 3: is not about being alone.
Yes, isolation is about being 707 00:42:05,600 --> 00:42:10,400 Speaker 3: away from community, absolutely, and you can be with a 708 00:42:10,400 --> 00:42:13,839 Speaker 3: community of people in a place that isn't home and 709 00:42:13,880 --> 00:42:15,080 Speaker 3: be very isolated. 710 00:42:15,440 --> 00:42:17,319 Speaker 2: Well, not for nothing, you know. One of the questions 711 00:42:17,360 --> 00:42:19,319 Speaker 2: I didn't ask Nathaniel, but which I kind of wish 712 00:42:19,400 --> 00:42:23,600 Speaker 2: that I had, was about this interest in isolation research. Like, 713 00:42:24,320 --> 00:42:27,759 Speaker 2: we are constantly bombarded with this idea of the loneliness epidemic, 714 00:42:27,840 --> 00:42:30,120 Speaker 2: and like, even though we're more connected, we're more isolated 715 00:42:30,160 --> 00:42:32,120 Speaker 2: than ever. And I was wondering if there was 716 00:42:32,200 --> 00:42:33,959 Speaker 2: another thread that I didn't 717 00:42:33,960 --> 00:42:36,200 Speaker 2: pull on but perhaps should have, about, you know, 718 00:42:36,440 --> 00:42:39,000 Speaker 2: why this cultural moment is so interested in isolation. 719 00:42:39,320 --> 00:42:42,319 Speaker 3: That's right. And I think that, you know, I mean, 720 00:42:42,360 --> 00:42:43,799 Speaker 3: I think about it all the time when I'm sitting 721 00:42:43,840 --> 00:42:46,560 Speaker 3: at home on the couch on my phone, feeling incredibly 722 00:42:46,560 --> 00:42:49,919 Speaker 3: connected to people, and like, how I could survive that way, 723 00:42:50,360 --> 00:42:52,600 Speaker 3: but also questioning, like, do I want to live that 724 00:42:52,640 --> 00:42:54,600 Speaker 3: way, right, you know, and sort of how do I 725 00:42:54,680 --> 00:42:55,640 Speaker 3: force myself out of that? 726 00:42:56,000 --> 00:42:56,160 Speaker 2: Now? 727 00:42:56,160 --> 00:42:59,000 Speaker 3: That really has nothing to do with going to Mars Asterisk.
728 00:42:59,680 --> 00:43:02,239 Speaker 2: But you are somebody who grew up as a lover 729 00:43:02,320 --> 00:43:06,160 Speaker 2: of science fiction. Your father was a science fiction author. Yes. So, 730 00:43:06,880 --> 00:43:09,040 Speaker 2: I mean, some people like to be very dismissive of 731 00:43:09,360 --> 00:43:11,280 Speaker 2: Musk and Bezos and their dreams of space. 732 00:43:11,840 --> 00:43:12,080 Speaker 3: You know. 733 00:43:12,160 --> 00:43:17,279 Speaker 2: I think they are two characters who can 734 00:43:17,320 --> 00:43:20,319 Speaker 2: probably deal with a bit of stick. But I don't think 735 00:43:20,320 --> 00:43:25,239 Speaker 2: it's wrong to dream, and even plan, about space exploration. 736 00:43:26,200 --> 00:43:29,000 Speaker 3: Well, I think part of it is a colonizer's instinct. 737 00:43:29,960 --> 00:43:33,279 Speaker 3: But I also think this idea of, like, what is 738 00:43:33,320 --> 00:43:38,000 Speaker 3: outside of our reach is always something that will fascinate 739 00:43:38,080 --> 00:43:41,879 Speaker 3: writers of science fiction, will always fascinate even, you know, 740 00:43:42,520 --> 00:43:45,920 Speaker 3: the most practical technologists, because it's something that in a 741 00:43:45,960 --> 00:43:48,880 Speaker 3: certain way is a fantasy. Like, even the idea of 742 00:43:48,880 --> 00:43:52,480 Speaker 3: having to bring a 3D printer to Mars 743 00:43:52,480 --> 00:43:55,399 Speaker 3: because we can't lug certain things there. I mean, these 744 00:43:55,440 --> 00:43:58,279 Speaker 3: are such far out concepts, you know. 745 00:43:58,320 --> 00:44:00,520 Speaker 2: I find them exciting.
I find them exciting too, 746 00:44:00,760 --> 00:44:03,040 Speaker 2: but I also did find it very tragic, this idea 747 00:44:03,080 --> 00:44:06,879 Speaker 2: of, like, the compulsion to repeat these quite damaging experiments, 748 00:44:06,920 --> 00:44:10,600 Speaker 2: of sending people to simulate life on Mars and hurting 749 00:44:10,680 --> 00:44:12,160 Speaker 2: them in the process in their life on Earth. 750 00:44:12,239 --> 00:44:14,600 Speaker 3: Yeah, of course. We just had Trump, on day one 751 00:44:14,800 --> 00:44:18,719 Speaker 3: of his second term, simultaneously make an executive order to 752 00:44:18,840 --> 00:44:22,840 Speaker 3: drop out of the Paris Climate Accords and declare that 753 00:44:22,880 --> 00:44:26,680 Speaker 3: we will launch astronauts into space and, I quote, plant 754 00:44:26,760 --> 00:44:30,839 Speaker 3: the stars and stripes on the planet Mars. Wow. So 755 00:44:31,360 --> 00:44:35,120 Speaker 3: this twinning of saying goodbye to Earth and embracing Mars 756 00:44:35,160 --> 00:44:37,720 Speaker 3: actually feels very salient and very right now. 757 00:44:37,960 --> 00:44:40,440 Speaker 2: Well, that's true. But all of this leaves me with the 758 00:44:40,560 --> 00:44:45,919 Speaker 2: question about you. Is there anything that could be done, 759 00:44:45,920 --> 00:44:48,160 Speaker 2: that I could offer, to induce you to spend three 760 00:44:48,239 --> 00:44:50,160 Speaker 2: hundred and fifty days in a simulated Mars? 761 00:44:50,239 --> 00:44:52,280 Speaker 3: No. I went to space camp, 762 00:44:52,320 --> 00:44:55,280 Speaker 2: you'll remember. Or maybe not, but I do remember. 763 00:44:55,320 --> 00:44:59,880 Speaker 3: No, I did go to space camp. I am an intellectual 764 00:45:00,040 --> 00:45:04,480 Speaker 3: explorer. I am not a physical explorer. 765 00:45:04,600 --> 00:45:06,719 Speaker 2: You're not a psychological one either. 766 00:45:06,520 --> 00:45:09,360 Speaker 3: No, I'm definitely not a psychological explorer.
And I found the 767 00:45:09,400 --> 00:45:14,320 Speaker 3: story of the woman who was at leguinea really, really tragic. 768 00:45:14,960 --> 00:45:18,759 Speaker 3: And I do think that what's interesting is that in 769 00:45:19,080 --> 00:45:23,240 Speaker 3: moments of, you know, innovation or exploration, we do test 770 00:45:23,320 --> 00:45:26,680 Speaker 3: people's psychological limits. Do we have to? I don't know, 771 00:45:26,920 --> 00:45:30,400 Speaker 3: you know. But I think that, for me personally, I 772 00:45:30,480 --> 00:45:36,120 Speaker 3: am not compelled by living for that long outside of 773 00:45:36,360 --> 00:45:40,080 Speaker 3: sort of my normal life. No. Are you? 774 00:45:40,120 --> 00:45:44,319 Speaker 2: No, no, I'm not. But that sense that we talked 775 00:45:44,360 --> 00:45:48,080 Speaker 2: about, of these experiments in some ways being a kind 776 00:45:48,080 --> 00:45:51,799 Speaker 2: of psychological mourning for what we're losing, did make 777 00:45:51,840 --> 00:45:55,520 Speaker 2: me think about environmental degradation. And, you know, 778 00:45:55,520 --> 00:45:59,520 Speaker 2: I've seen these kind of techno fantasy illustrations of, 779 00:45:59,560 --> 00:46:01,960 Speaker 2: like, what life on Mars might look like, and they're 780 00:46:02,000 --> 00:46:06,160 Speaker 2: basically these biospheres into which you have crammed, like, the 781 00:46:06,200 --> 00:46:10,320 Speaker 2: Swiss Alps, the Grand Canyon, the Mediterranean Sea, like, beautiful animals. 782 00:46:10,480 --> 00:46:14,480 Speaker 3: I also just think we're still human beings, right? Well, 783 00:46:14,880 --> 00:46:18,120 Speaker 3: for now. But, you know, we still project all of 784 00:46:18,120 --> 00:46:20,440 Speaker 3: our fantasies onto the world of the creature comforts 785 00:46:20,440 --> 00:46:22,840 Speaker 3: that we want. Do I want to ski on Mars?
786 00:46:22,880 --> 00:46:25,279 Speaker 3: I guess, right? Because I like skiing here. 787 00:46:25,480 --> 00:46:29,080 Speaker 2: You know, it makes you remember just how wonderful, you know, 788 00:46:29,160 --> 00:46:32,479 Speaker 2: this earth of ours is. And what I loved about 789 00:46:32,480 --> 00:46:34,920 Speaker 2: this interview, and took away from it, is, when you 790 00:46:34,960 --> 00:46:38,160 Speaker 2: play out the fantasy, and when you actually ask, you know, 791 00:46:38,239 --> 00:46:41,359 Speaker 2: one of the chief research scientists at NASA what this 792 00:46:41,400 --> 00:46:44,200 Speaker 2: looks like in the future, it's not just going to Mars. 793 00:46:44,320 --> 00:46:47,960 Speaker 2: It's evolving into a new species with a different shape of head, 794 00:46:48,239 --> 00:46:51,360 Speaker 2: with a different reaction to radiation. And what that says 795 00:46:51,400 --> 00:46:54,479 Speaker 2: to me is, this is not just, you know, going 796 00:46:54,520 --> 00:46:57,439 Speaker 2: on a fun trip. This is essentially saying that there's 797 00:46:57,440 --> 00:47:01,279 Speaker 2: going to be a fundamental categorical shift in us as 798 00:47:01,280 --> 00:47:05,120 Speaker 2: a species in order to colonize Mars. And it's just 799 00:47:05,200 --> 00:47:08,080 Speaker 2: a very weird and, I find, disturbing thought. 800 00:47:08,960 --> 00:47:10,239 Speaker 3: Again, not something I would do. 801 00:47:10,760 --> 00:47:12,400 Speaker 2: That's a good place to leave it. That's it for 802 00:47:12,480 --> 00:47:19,600 Speaker 2: Tech Stuff today. Today's episode was produced by Sina Ozaki, 803 00:47:19,719 --> 00:47:23,920 Speaker 2: Eliza Dennis, Victoria Dominguez, and Lizzie Jacobs. It was executive 804 00:47:23,920 --> 00:47:27,520 Speaker 2: produced by me, Os Voloshin, Kara Price, and Kate Osborne for 805 00:47:27,560 --> 00:47:32,560 Speaker 2: Kaleidoscope, and Katrina Norvel for iHeart Podcasts.
The engineer is Biheit Fraser. 806 00:47:32,800 --> 00:47:35,600 Speaker 2: Kyle Murdoch wrote our theme song. Join us on Friday for 807 00:47:35,719 --> 00:47:38,239 Speaker 2: Tech Stuff's The Week in Tech, when we'll explore the 808 00:47:38,280 --> 00:47:43,280 Speaker 2: origin story of our current obsession with step counting. Please rate, review, 809 00:47:43,360 --> 00:47:45,960 Speaker 2: and reach out to us at tech stuff podcast at 810 00:47:45,960 --> 00:47:48,160 Speaker 2: gmail dot com. We want to hear what's on your mind.