Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

And I'm Joe McCormick, and it is vault time. We shall enter. The door hangs open, the void calls. What have we got today, Robert?

Oh, well, this week we have cyborgs. This is an episode that I did with Christian back in April, April of 2016 to be specific. And yeah, it's a fun discussion of just this kind of amorphous idea of cyborgs, right? Because we all have our favorite, right? The movie Cyborg, the Albert Pyun movie Cyborg with Jean-Claude Van Damme.

Yes, clearly that's everybody's favorite. But then also a few other, you know, small independent films, like RoboCop and the Terminator films. And also, by mentioning that, I mean some of you are probably already arguing in your mind: well, was RoboCop really a cyborg, or was he just a robot that they put a face on as a testament to a fallen cop? Do you and our esteemed colleague Christian Sager address these questions in the episode?

Yeah, a little bit, because ultimately it is kind of an amorphous term, this term cyborg. So we talked about where it came from, some of its origins in our considerations of travel beyond our planet, of changing astronauts to better survive the harsh conditions of outer space, as well as how the idea of the cyborg has been taken up as kind of a metaphor for, say, feminist identity.

Oh, interesting. I don't think I've ever heard of it that way. Now I feel like I've got to listen to this. Well, I'll figure out what you guys talked about.

Well, yeah, let's all dive in and explore the world of cyborgs.

Welcome to Stuff to Blow Your Mind, from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

And I am Christian Sager. So, Robert: cyborg. The word cyborg. You hear that, and what does it mean to you? What is it?
What pops up immediately?

Well, you know, it kind of depends on how far back in my own timeline I go. Like, I can't help but instantly go back to being a kid, where cyborg meant Terminator, cyborg meant RoboCop.

And you and I are both children of the eighties, and that's when cyborgs were at the height of their popularity, probably, right?

Yeah, this idea that there's a machine, but it's got at least a little bit of humanity to it. But nothing was going to hold it back too much from being, you know, a terrorizing robot or this brutal metal badass.

Yeah. I teased this a little bit on social media, but for me, I immediately go to a comic book character named Cyborg. It's a character that was first created in the seventies, and that's, you know, seventies, eighties, that's when I was reading these comic books. He is an African American character who becomes a cyborg because he's in some kind of athletic accident or a car accident or something, and his dad is, like, a cybernetics genius and rebuilds his body, and he becomes a superhero and joins the Teen Titans. A lot of people out there may know this character from the Teen Titans cartoon show in the last decade.

Yeah, my nephew was telling me all about Cyborg. I was hanging out with him in the past few months, and I was impressed, because it sounds like Teen Titans has done a good job of giving a thoughtful treatment of Cyborg. Like, what does it mean that this character is a little bit machine and a little bit human?

I mean, the cartoon is more of a comedy. But the caveat I wanted to place on this is, you know, DC is rolling out its big summer blockbuster universe of superhero movies, and Cyborg is going to have his own movie, and he's going to be in the Justice League movies. And I haven't seen it, but I guess that Batman v. Superman movie, spoilers, hints at him somewhere.
So I kept thinking, as we were doing the research for this episode, which, if you guys out there haven't guessed by now, is about cyborgs, I kept thinking: you know, the people who are writing and doing all the preproduction on that Cyborg tentpole movie right now, I really hope they listen to this episode, because we've got a lot of interesting themes going on here with the idea of cyborgs in general. And that is what this episode is going to revolve around.

Now, certainly we've had episodes in the past that have dealt with, sort of, mind-machine interfaces, including one Joe and I did in the past few months. I'll make sure we link to that on the landing page for this episode. And we'll be doing episodes in the future, I'm sure, about cybernetic enhancements, prosthetic limbs, etcetera. But this episode, as the title implies, is about what we think about when we think about cyborgs. What is the meaning of cyborg as a word, and as a trope, and as a metaphor for understanding the human experience?

Yeah, and what I especially got out of it is that, with the cyborg in general, there are a lot of philosophical arguments to make that we're already cyborgs, and that it is sort of the natural evolution toward transhumanism, which is another thing we've talked about on this show quite a bit. And I want to read a quote by Donna Haraway, who we're going to talk about later, but this really struck me as being crucial to our thinking throughout the episode. She says: "Technology is not neutral. We are inside of what we make, and it is inside of us. We're living in a world of connections, and it matters which ones get made and unmade."

So she's not just talking about, you know, pop sci-fi, plug-a-USB-port-into-your-ear cyborg, that kind of connection. She's talking about cultural connections as well, and sort of how we define reality based on that.
Yeah, so I think, just in general, let's try to hold onto that while we're talking about all this stuff, you too out there.

Yeah. Now, that doesn't mean, you know, cast aside your sci-fi ideas, because they're helpful.

Yeah, I think she even... I can't remember if she mentions RoboCop in her manifesto, but she definitely mentions some of the sci-fi visions of the cyborg, because that's part of the metaphor. And if we haven't thoroughly satiated your pop culture cyborg references, we'll make some more throughout the episode. But there are too many, I think, maybe, to get them all.

Yeah, yeah, they're just rampant, especially in the wake of Terminator and RoboCop. Just dealing with films alone, there are so many fabulously horrible B movies with cyborgs.

Yeah, totally. But before we get into the word cyborg and where that comes from, let's talk a little bit about cybernetics. One of the core papers here comes from MIT mathematician Norbert Wiener, who wrote Cybernetics: Or Control and Communication in the Animal and the Machine back in 1948. And this is a work that dealt with information theory, with a focus on feedback and the similarities between a vast group of different phenomena, everything from throwing a ball to running a company to launching a missile. It's all about: you're doing things, you're getting feedback, and that allows for rational control of everything from machines to economic systems to communities, and even, arguably, a way to tackle wicked problems. I use the term wicked problems, but essentially that was kind of the area he was getting into.

Yeah, and I think one thing that's important to keep in mind about Wiener's research, or, I guess it's not really research so much as a general pitch for the future, right: saying this is a field, this is an approach that we can use to advance and to understand.
He's coming right on the heels of World War II, and he is very much, in particular, considering cybernetic systems as being constituted by flows of information.

And there's a really great article by a woman named Katherine Hayles, and she's at, or at least this is hosted at, UCLA. She may not be there anymore, but she's faculty. She's basically looking at the idea of Wiener's version of cyborgism and how it mixes with sort of liberal humanism as well. And she is, in my understanding, maybe not a student of, but a disciple of, Donna Haraway, who we're going to talk about extensively later. But basically, the Wiener version goes like this, right, and we're gonna use this analogy a lot: if a blind man is using a cane, is he a cyborg? And the Wiener argument would say yes, because it's about the flow of information, right? The flow of information going through the cane is building reality for the blind man; therefore it makes him a cyborg. The other argument he would probably make: is a deaf person using a hearing aid a cyborg? Right.

Yeah, and that is going to be a recurring theme: to what extent is this individual a cyborg? And I think there's a strong case to be made that, yeah, even basic tool use is inherently cybernetic, and that we're inherently cybernetic organisms. I'm wearing contact lenses right now; that probably makes me a cyborg.

Yeah, yeah. And that also involves, like, dental work. You know, you're wearing a timepiece on your arm, wearable computing. And this isn't even getting into the smartphone.

Yeah, the whole smartphone thing is just, like, mind-blowingly cybernetic.

I should also throw in that he took the name cybernetics from the Greek word kybernetes, which means steersman.
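To make the steersman image concrete: here is a minimal sketch, in Python, of the kind of feedback loop Wiener is describing. You measure the output, compare it to the goal, and feed the error back into the next correction. The boat, the course, and the gain value are invented for illustration; nothing here comes from Wiener's book or from the episode.

    # A toy steersman: proportional feedback toward a desired course.
    # All numbers are illustrative.
    def steer(heading, course, gain=0.5, steps=10):
        for _ in range(steps):
            error = course - heading   # sense: how far off course are we?
            heading += gain * error    # act: apply a partial correction
        return heading

    # Starting 60 degrees off course, the loop converges toward 90.
    print(steer(heading=30.0, course=90.0))  # roughly 89.94

The same sense-compare-correct loop, with different sensors and actuators, describes a thermostat, a hearing aid, or a guided missile, and that generality is exactly what made the pitch so broad.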
And you can think of just sort of the classic image of, you know, a helmsman at a boat, you know, taking the old raft across the river Styx.

Yeah, that's what I was thinking of immediately. Why is it that, like, the first helmsman I go to is the one taking people to their death?

Yeah, I mean, he's a great helmsman, and that's important, because he always gets across.

Yeah, the coxswain of the dead. But the essential point here is that the steersman is depending on a constant flow of information, and that governs the interface.

So, yeah, one of Wiener's big arguments, and I don't think he makes this explicit, but basically it comes across as: are cyborg modifications things intended to compensate for deficiencies? I just mentioned my contacts: deficient eyesight, so I wear contacts, right? Or are they interventions that are designed to enhance our normal functioning? I mean, I don't know of this, but are there a lot of people who wear contact lenses to give themselves better-than-normal vision?

Well, it's not necessarily good if that happens, because my eyesight changed, and apparently, occasionally, eyesight improves, which throws your contacts out of whack. And that's what happened to me, so I actually had my prescription taken down a little bit.

Oh, okay. Yeah, well, that's interesting. Okay, so, but yeah, basically we're looking at this: is it both? I think it is. I think it's both. It's both enhancing our preexisting abilities but also compensating for deficiencies that we have.

Yeah, but then there's a blur somewhere in between, because how do you define deficiency, and how do you define what is the ideal human experience that we are either correcting for or going beyond? Like, there's no template for the human. There's no basic human, right? So that line is just always going to be distorted.
Yeah, and Haraway, who we're going to talk about later, is very important to this. She adds in a distinction: that cybernetics is beyond anything that fuses a device with a biological organism. It replaces cognition and neural feedback. So it challenges the difference between us as humans and us as animals. So maybe that's a way that we can draw the deficiency-enhancement line, although, again, it depends on what animal you're comparing yourself to. There are plenty of examples of animals that use tools, including many primates, and crows, which Joe and I discussed in recent episodes. So yeah, even when you start applying tool use to the scenario, you can make a case that there are plenty of cybernetic, um, you know, animals out there.

Absolutely. Yeah, that's something to keep in mind as well. It's not just a human phenomenon.

So the last bit from Hayles that I think is important to consider when you're looking at Wiener is: she makes the argument that Wiener is sort of conflicted between the somewhat humanistic endeavor that he envisions for cybernetics and his use, or proposal of the use, of these systems as effective killing machines for the military. So there's a little bit of a contradiction there that she points out.

And again, I would say, well, can they be both? And clearly the Department of Defense would hope so, because, as we'll talk about, millions and millions of dollars have gone into developing cyborgs for warfare.

Well, it kind of gets down to the fact that if you're going to repair or augment the human form, you're also augmenting everything that is human, both the good stuff and the bad stuff. So, hey, a blind man can read a book again, perhaps, but also maybe a blind man can shoot lasers out of his eyes at the enemy.

Oh yeah, absolutely. I mean, a lot of the research that we're seeing that's sort of in its infancy with brain-computer interfaces...
That's where it's at right now, right? We're developing this so that maybe a person who's missing a limb can move a robotic limb with their mind. But the application moves on from there, right? A person could control a missile or something with their mind.

Now, it's important to note that nowadays very few people call themselves cyberneticists in the original sense of the word, because cybernetics kind of petered out as a scientific discipline for a few different reasons. It branched off into more promising fields like cognitive science and robotics, but it also lost out on funding. It couldn't close the ultimate gap between organic and mechanical mechanisms of control and communication.

And the first cyborg recorded in history was a white lab rat that was experimented on at New York's Rockland State Hospital in the late fifties, so that's a good ten years after Wiener made his proposals. Basically, it had a tiny osmotic pump implanted inside its body that injected controlled doses of chemicals to regulate its physical systems.

This is interesting, though, because it draws us back again to the question of what a cyborg is. Because, just mentioning the pharmaceuticals: the first human to take a drug, be it something they found in the woods or certainly our modern pharmacological world, that is kind of inherently cybernetic. You're changing who you are and creating this new, perhaps idealized, version of who you are.

So, I'm on a Penny Dreadful kick, as you guys out there might know. I've been talking about it a lot on the show lately. And there's this great quote where the Victor Frankenstein on the show is a drug addict, and I'm assuming it's heroin or some opiate that he's constantly injecting into himself.
He gives this big speech about why it's okay, because basically the body is just a bunch of biological and chemical processes, and all he's doing is either accelerating or decelerating those processes with, you know, the narcotics he's applying to himself. So you could sort of look at drug use in general as cybernetics, as you're saying.

Yeah, indeed. All right, now, at this point in the narrative we're going to fast forward to 1960. An important note here: this is a year before Earth put the first human being in space, or, more specifically, before the Soviet Union put the first human in space. It's especially important because what we're going to be talking about is a paper by Manfred E. Clynes and Nathan S. Kline. They have very similar last names, the former with a C, the latter with a K. And it's called "Cyborgs and Space." And I know that sounds like it would be a joke, but hey, it was 1960, and they were pitching a legitimate idea for making space travel easier.

Yeah, and pitching an overall idea, specific ideas, and sort of a philosophy of how to approach taking humans into space. Because the model that ultimately won out, and the model we're still using today for sending humans into space, is: all right, look at the human being. Look at Homo sapiens. This is an organism that has evolved not only to live on Earth, but to live in a very slim layer of Earth's atmosphere, under certain conditions. There are places on Earth where we go and we die.

Yes. So the environmental constraints are very important for human life.

So what we've been doing is taking humans and sending them into space in a capsulized version of their own environment, or as much of one as we can manage: trying to replicate Earth's environment and take that replication with us.
Yeah, and in a way, it's kind of like: I'm gonna move from Ohio to Florida, but I'm going to make sure that I'm bringing Ohio with me, and it's going to be, like, an encapsulated Ohio in Florida.

Yeah, you dig up a chunk of Ohio and then move it to Florida and plant it there.

My thermostat is always going to keep things Ohio. So what Clynes and Kline were arguing here is: well, how about we do the opposite? How about, instead of bringing Ohio to Florida, as much as is humanly possible, or transhumanly possible, you become a Floridian? To what extent can we take a human, send them into space, and change the human so that they can actually live in space, or at least better manage what is an imperfect representation of Earth's environment?

Yeah, and what we're getting at here, and those of you out there who are sci-fi fans are probably well aware of this, is a trope that has been used in science fiction probably since before these guys pitched this idea: sort of the am-I-man-or-am-I-machine conundrum, the where-do-I-begin-and-where-does-the-machine-end kind of thing that we've seen in pop culture fiction for decades now, right?

But these guys, what's fascinating to me about this is, they pitched this whole thing as, like, this is a great way to go to space, and there's not one moment where they think about the ethical quandary of what's left over of the human being that they're putting all this stuff into.

Well, they do, and I think part of it is, you know, you have to bear in mind the time period, because putting humans in space was, and still is, a just tremendously difficult endeavor. And they were saying: hey, you want to climb the mountain? Here is a way. Here are some possibilities. These are some ways you can climb the mountain.
And, you know, it's very matter-of-fact. Now, granted, we've steered away from what they outlined, but I still think it's a valid argument. Maybe it's an argument that ultimately defeats the idea of sending humans into space, you know, long term.

But are they human anymore, right? Or are they cyborg? Or, if you're having to make all these changes to the human body, then why are you doing it to begin with?

They're saying that this is a means to an end. That if you want humans to go into space, if you want us to expand beyond this world, then you have to change what humans are. Not that we want to become cyborgs, but if this is what you want to be, this is what you have to do. Transcending Earth's boundaries is the most important thing for us, and we should be willing to commit these acts.

And the way that they start is with a very basic idea, which is respiration, right? We breathe. And they say, well, for instance, you wear scuba masks when you go swimming underwater. Why wouldn't you change your respiration somehow for outer space? Their example, or metaphor, I guess, is: what if a fish were intelligent enough to engineer itself something that allowed it to live on land and breathe air?

And what was fascinating to me about that was, and I've probably talked about this on the show before, or at least talked to you about it, there's this really great Japanese horror manga called Gyo that's all about fish climbing up out of the ocean. They're strapped into these exoskeletons that keep them alive, and they scuttle around and attack people, and it is one of the most horrifying images I can ever think of. So these guys back in 1960 were basically pitching that, saying, yeah, let's do that, but to human beings.
Yeah. And it comes down to efficiency, right? These guys were ultimately about efficiently getting into space, especially when you consider, and we've talked about this before, especially in our space mirrors episode and also our episode about space weapons, how tremendously expensive it is to propel any mass into outer space.

Right. And they talk about human fuel as being sort of a detriment. And when they say human fuel, what they mean is precisely ten pounds per day: that's two pounds of oxygen to breathe, four pounds of fluids to drink, and four pounds of food to eat. So they look at it the same way you would look at fuel for the space shuttle, right, how much that weighs and how much it will cost to fly it up. They're considering the human fuel.

Yeah. And it's ultimately the human engineering problem. It's not only the engineering problem of the vessel, but the engineering problem of the human.

Yeah. And space, I mean, I gotta say, I wrote an episode for our video series BrainStuff one time that was all about what space would do to the human body, and all the horrible ways in which you would die if you were just exposed to space without a suit or anything like that, and it's pretty vicious. So for them, what they were looking at was not just the purpose of the cyborg being to mitigate those effects, but also to take care of those problems automatically and unconsciously, right? The cyborg wouldn't be thinking about doing it as they were doing it. And this paper is where the word cyborg comes from. They coined it. So, as we're talking about their work here, realize that these guys are the granddaddies of all the ridiculous, or, um, you know, real-world ideas that come out of this.
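That ten-pounds-a-day figure lends itself to a quick back-of-the-envelope sketch. The per-day breakdown is the one Clynes and Kline give; the crew size and mission length below are hypothetical, chosen only to show how fast the consumable mass piles up.

    # Clynes and Kline's "human fuel": 2 lb oxygen + 4 lb fluids + 4 lb
    # food per astronaut per day. Crew and duration are assumed values.
    oxygen_lb, fluids_lb, food_lb = 2, 4, 4
    per_day = oxygen_lb + fluids_lb + food_lb    # 10 lb per person per day

    crew, days = 3, 500                          # hypothetical long mission
    total = per_day * crew * days
    print(f"{total:,} lb of consumables")        # 15,000 lb

Given how tremendously expensive it is to lift each pound into orbit, you can see why re-engineering the astronaut, rather than hauling all of that mass, looked tempting to them.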
But we can see that we're already getting away from Wiener's idea of the cyborg just being about the flow of information, right? So I'm gonna read a quick quote from "Cyborgs and Space" to give you a taste here.

It feels like it should be Muppet-style, like, "Cyyyborgs in Spaaace."

Yeah, well, maybe someone can put some sort of echo on it, hopefully. Quote: "What are some of the devices necessary for creating self-regulating man-machine systems? This self-regulation must function without the benefit of consciousness in order to cooperate with the body's own autonomous homeostatic controls. For the exogenously extended organizational complex functioning as an integrated homeostatic system unconsciously, we propose the term 'cyborg.' The cyborg deliberately incorporates exogenous components extending the self-regulatory control function of the organism in order to adapt to new environments."

So that's basically their thesis statement. They're starting off and saying: all right, this is what we're proposing. It may seem a little outlandish, but here, let us give you some examples. And when you read through the document, they go through, one by one, like: here are some cool things we could do to the human body. Right?

Yeah, it's a very readable document, so I encourage anyone who's interested to seek it out for themselves. We'll include a link on the landing page for this episode. But some of the ideas that they roll out involve the following. First off, drug-induced wakefulness, which is actually one of the things that we see utilized in modern human space travel. And this next one calls back to that white rat: they want to implant osmotic pressure pump capsules in the body that could sense and control mechanisms to automatically administer everything from astronaut speed to hibernation-inducing pituitary drugs. And certainly some of these pharmaceutical products are utilized by astronauts, but this would be a situation where they wouldn't have to think about taking them. It would just happen to their body.
I wonder what astronaut speed is. It just sounds like the best speed.

It's like, but it's dried. It's like astronaut ice cream.

Yeah, well, they're going to get the good stuff. I did a blog post years back, and I'll have to link to it on the landing page for this episode, because there was a list available of the various pharmaceuticals that are available, say, on board the then space shuttle or the ISS.

Okay. All right. The next recommendation: replace the lung with inverse fuel cells. They also talked about altering our body's plumbing, I'm assuming so that wastewater goes through a filter and right back into your blood.

Sounds kind of like a stillsuit to me.

Yeah, yeah, definitely. Like a bioengineered stillsuit. They also talked about enzyme tinkering to create anaerobic organisms, in other words, astronauts that don't require air or can live in different atmospheres. They would also drain your ear fluid, or fill it up, to put up with weightlessness. Okay. Also, electric-slash-drug cardiovascular control: drugs that would prevent muscle atrophy.

I don't know enough about that topic, but I wonder if that's something they do.

I haven't looked at the research recently, but that's of course still very much an area of interest.

It seems like it would be, especially for those guys who are up there for, like, a year at a time.

They also talked about engineering lower body pressure in the human body, kind of, I like to think, to facilitate naked spacewalks, sort of. But essentially saying: all right, maybe we can't actually put a person out there in the void, because the void is just, I mean, the void is death, but maybe we can make the human body a little less, uh, you know, explosion-y.

God. Yeah. Space does not... space will kill you.
Next, the engineering of a light-sensitive, chemically regulated system which would adjust its own reflectance so as to maintain the temperature desired. We're basically talking about a light-regulation system for the temperature of the body. And that's another thing, because space can go from being incredibly hot to being so cold it'll freeze you dead, just in the blink of a shadow, right?

Yeah, you need to be able to absorb solar radiation when necessary, but also to reflect it when it's just going to cook you.

Yeah. Wow, I'm trying to imagine what this cyborg would look like. I wonder if, over the years, anybody has taken their recipe for the space cyborg and developed it out somehow in some fan art or something like that. I don't know. I would love to see it.

One piece of fiction that always comes to mind when I think about this paper is a Clifford Simak novel that came out called The Werewolf Principle. And it's essentially a space werewolf story. But the idea here is that we engineered a human that would go into space and would rapidly adapt to life on other worlds. And so the space traveler goes to other worlds, adapts into these different forms that allow him to live in these strange environments, and then, when he returns to Earth, he will sometimes shift into these forms. So he's changing into a quote-unquote werewolf, but the werewolf is actually a form that he adapted on another world in order to live there, and it's no longer acceptable in Earth's environment and society.

Right, like, because it probably makes him eat people or something.

Yeah. It's been a while since I've read it, but it's a pretty trippy book. It also has flying houses, and brownies, as in, like, the little fairies, those brownies. Because it just turns out that, oh yeah, brownies exist.
It's like humans advanced to the point where they realize: oh yeah, there are brownies. They live out there in the woods, and occasionally we can glimpse them.

Yeah, this does sound fascinating.

Right. All right. So, needless to say, as we already pointed out, NASA did not take all these recommendations to heart, and so there's a certain amount of space between "Cyborgs and Space" and where we are now. The Atlantic magazine's Alexis C. Madrigal caught up with co-author Manfred Clynes back in 2010, and, by the way, as of this recording, Clynes is still kicking at age ninety. That's Clynes with a C. One of the things that Madrigal points out is that many uses of cyborg seem to view the human-machine hybrid as an end point: like, we're gonna get to the point where we become the cyborg, and maybe as a compromise as well. But Clynes saw it as that means to an end, quote, "a way of enlarging the human experience."

Yeah, I highly recommend, if you're interested in what we're talking about in this episode, that you go hunt down Madrigal's Atlantic piece. It's really interesting. Basically, what he gets from talking with Clynes is that Clynes saw cyborgs as a means to enlarging the human experience. It wasn't just about space for him, and he was focused in particular on expanding our brain's relationship with the world. And to me, like I wrote in my notes: isn't that transhumanism? This guy sounds like he's the father of transhumanism to me. And I know a lot of people out there have been asking us to do an episode; I think, if we do, we're gonna have to do a two-parter on transhumanism. It's just such a deep topic, and it clearly interests us.

And so his focus, after the whole space proposal, was on humans communicating without words, because, as he put it, language is messy and ambiguous.
And what struck me about this is, if you know anything about the sixties, around this time that he was making these proposals is when post-structuralism really erupted in communication studies and linguistics. And, oh God, I can't simplify post-structuralism into one sentence, but I would say his statement that language is messy and ambiguous is a nice lead-in for post-structuralism. So he was thinking about cyborg ways to get us there, and that leads us to Donna Haraway eventually.

But he also invented this machine that he talks to Madrigal about, called the Computer of Average Transients. And apparently what this thing did was cancel noise impulses in the brain and translate them into averages of those impulses. And the argument is basically: when we're talking about these electrical impulses, he means language, like how language is encoded in the brain. But words don't have averages, right? You can't say the word cyborg is five point x, so let's round it up to six. Or, you know, the average of whatever two different numbers gives you a number in between; it doesn't give you a word in between two words, right? So not only does it seem like he was talking about transhumanism at a very early stage, and talking about post-structuralism, but then he's talking about math as language, which is really interesting.

Yeah, and that's certainly essential to any kind of bridge between the organic and the machine, definitely.

Yeah, which both connects back to Wiener's whole flow-of-information thing about being a cyborg, and leads us further down the road of sort of the philosophy of what it would mean to be a cyborg.
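The averaging idea behind the CAT is easy to demonstrate. In the sketch below, a repeated "transient" signal is buried in random noise on every individual trial; averaging many trials cancels the uncorrelated noise and leaves the repeated signal standing. The signal shape, noise level, and trial counts are all invented for illustration, not taken from Clynes's device.

    # Averaging repeated noisy trials, the principle behind the Computer
    # of Average Transients: the repeated signal survives, random noise
    # cancels out.
    import random

    def trial(t):
        signal = 1.0 if 40 <= t < 60 else 0.0   # the repeated transient
        return signal + random.gauss(0, 1.0)    # buried in heavy noise

    n_trials, length = 500, 100
    avg = [sum(trial(t) for _ in range(n_trials)) / n_trials
           for t in range(length)]

    # Any single trial is mostly noise, but in the average the samples
    # inside the transient sit near 1.0 and the rest sit near 0.0.
    print(round(avg[50], 2), round(avg[10], 2))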
Um, I would say, as I've said, 585 00:33:00,080 --> 00:33:02,680 Speaker 1: virtually all tool use counts, like not only because we're 586 00:33:02,680 --> 00:33:04,959 Speaker 1: picking up and using something, but a lot 587 00:33:04,960 --> 00:33:07,080 Speaker 1: of it comes down to our body schema, our brain's 588 00:33:07,120 --> 00:33:10,560 Speaker 1: conception of our body's position in space. Uh. You know, 589 00:33:10,680 --> 00:33:14,200 Speaker 1: just that alone entails some pretty complex mental processing, you know, 590 00:33:14,280 --> 00:33:16,280 Speaker 1: just to say this is where I am, this is 591 00:33:16,640 --> 00:33:18,480 Speaker 1: the space around me. And I have kind 592 00:33:18,480 --> 00:33:20,800 Speaker 1: of like this virtual version in my head, 593 00:33:20,880 --> 00:33:23,120 Speaker 1: a virtual idea of what my body consists of, 594 00:33:23,200 --> 00:33:25,720 Speaker 1: what are its limits, uh, you know, what are my 595 00:33:25,800 --> 00:33:29,600 Speaker 1: limits of control. Uh? So our brains are constantly processing 596 00:33:29,680 --> 00:33:32,560 Speaker 1: sensory feedback to establish where our limbs are at any 597 00:33:32,560 --> 00:33:35,360 Speaker 1: given moment. And here's the crazy part. When we wield 598 00:33:35,360 --> 00:33:38,920 Speaker 1: a hammer, when we wield a sword, when we, you know, 599 00:33:39,040 --> 00:33:41,680 Speaker 1: use one of those reachy claw things to get a 600 00:33:41,760 --> 00:33:45,280 Speaker 1: can off of a shelf, extensions, yeah, our body schema 601 00:33:45,480 --> 00:33:49,440 Speaker 1: updates to include that as part of our bodies. So 602 00:33:49,680 --> 00:33:54,440 Speaker 1: on a very neuroscientific level, we're already cyborgs, and 603 00:33:54,480 --> 00:33:59,360 Speaker 1: likewise our memory adapts to use the Internet via transactive memory. 604 00:33:59,400 --> 00:34:03,280 Speaker 1: We effortlessly outsource the remembrance of data. This is 605 00:34:03,320 --> 00:34:05,680 Speaker 1: something that I thought about a lot as we were 606 00:34:05,720 --> 00:34:10,000 Speaker 1: researching this episode, that, uh, I don't know about you 607 00:34:10,239 --> 00:34:13,239 Speaker 1: or the listeners, but I have definitely found in the 608 00:34:13,320 --> 00:34:18,440 Speaker 1: last ten to fifteen years that like, not only do 609 00:34:18,520 --> 00:34:21,480 Speaker 1: I have more information available at my fingertips than I 610 00:34:21,560 --> 00:34:24,440 Speaker 1: ever would have before, but at the same time, like, 611 00:34:25,600 --> 00:34:28,040 Speaker 1: because my brain only has so much RAM, I have 612 00:34:28,120 --> 00:34:30,640 Speaker 1: to offload some of that into the cloud, right? And 613 00:34:30,680 --> 00:34:33,200 Speaker 1: I'm like, well, I can't particularly remember that right now. 614 00:34:33,239 --> 00:34:36,200 Speaker 1: I'll put it in Google Docs, or I'll let 615 00:34:36,280 --> 00:34:39,080 Speaker 1: Wikipedia hold onto that for me for now, and I 616 00:34:39,080 --> 00:34:43,800 Speaker 1: won't memorize, I don't know, like, uh, you know, where 617 00:34:44,400 --> 00:34:48,080 Speaker 1: Rod Stewart's from or something, some like casual bit of 618 00:34:48,120 --> 00:34:51,560 Speaker 1: trivial knowledge. Yeah, it's the same phenomenon that allows 619 00:34:51,920 --> 00:34:57,000 Speaker 1: or enables, like, one member of a, you know, romantic couple,
620 00:34:57,040 --> 00:35:00,120 Speaker 1: say, to forget, like, an important date. They 621 00:35:00,160 --> 00:35:02,680 Speaker 1: forget it because, on like a subconscious level, they know 622 00:35:02,760 --> 00:35:05,920 Speaker 1: the other individual will remember it. So why should they? It's 623 00:35:05,920 --> 00:35:09,359 Speaker 1: just pure economics, right? Why should all members of this 624 00:35:09,560 --> 00:35:13,200 Speaker 1: group of interconnected humans, all part of this network, 625 00:35:13,600 --> 00:35:16,640 Speaker 1: why should all nodes on the network carry that data? 626 00:35:17,040 --> 00:35:20,400 Speaker 1: It doesn't make sense. They should collectively carry it. Yeah. Absolutely. 627 00:35:20,480 --> 00:35:24,120 Speaker 1: And subsequently we end up with iCal or Google Calendar 628 00:35:24,239 --> 00:35:27,719 Speaker 1: or whatever your platform of choice is. Yes. Right, if 629 00:35:27,760 --> 00:35:28,759 Speaker 1: you want to know what it's like to have a 630 00:35:28,800 --> 00:35:31,440 Speaker 1: cybernetic implant in your brain, you already have it. It's 631 00:35:31,440 --> 00:35:33,200 Speaker 1: called spell check. I was on my way to work 632 00:35:33,239 --> 00:35:35,840 Speaker 1: this morning. I'm on the train, riding in, my phone 633 00:35:35,880 --> 00:35:38,400 Speaker 1: buzzes and I pick it up. I'm like, oh, is this 634 00:35:38,520 --> 00:35:40,840 Speaker 1: a text message? Nope, it was my phone reminding me 635 00:35:40,880 --> 00:35:43,120 Speaker 1: that we were recording this episode this morning, because it's 636 00:35:43,120 --> 00:35:45,080 Speaker 1: on my calendar. And of course this isn't even getting 637 00:35:45,080 --> 00:35:48,520 Speaker 1: into the whole realm of, say, biomedical implants, etcetera. Yeah, 638 00:35:48,560 --> 00:35:53,520 Speaker 1: I mean, Clynes and Kline had a very particular obsession. And 639 00:35:53,719 --> 00:35:55,319 Speaker 1: again we go back to this, we talked about it 640 00:35:55,320 --> 00:35:59,160 Speaker 1: with Wiener, right, that there's an obsession, for science 641 00:35:59,400 --> 00:36:03,360 Speaker 1: and the military, with this combination of machine and 642 00:36:03,600 --> 00:36:06,640 Speaker 1: man, and what it comes down to is, how can 643 00:36:06,680 --> 00:36:11,279 Speaker 1: we escape our annoying bodies? Basically? Right? And uh, man, 644 00:36:11,440 --> 00:36:16,120 Speaker 1: this was a great time for ideas. Uh. Not only 645 00:36:16,360 --> 00:36:19,040 Speaker 1: does their paper coincide with the, you know, what I 646 00:36:19,080 --> 00:36:21,880 Speaker 1: was talking about with post-structuralism, but it also coincides 647 00:36:21,880 --> 00:36:24,040 Speaker 1: with something so completely on the other side of it, 648 00:36:24,040 --> 00:36:26,799 Speaker 1: which is the Silver Age of comic books. Uh. And 649 00:36:27,600 --> 00:36:29,520 Speaker 1: in particular,
You know, I'm not going to go into 650 00:36:29,560 --> 00:36:32,480 Speaker 1: a whole rant about the Silver Age and explain comics, 651 00:36:32,760 --> 00:36:36,600 Speaker 1: but the Silver Age was very much about superheroes that 652 00:36:36,640 --> 00:36:41,080 Speaker 1: were science oriented, that were sort of above and beyond 653 00:36:41,360 --> 00:36:43,480 Speaker 1: what the human body could do, right? And so, like, 654 00:36:43,520 --> 00:36:46,359 Speaker 1: the first Silver Age superhero character is often cited as 655 00:36:46,400 --> 00:36:49,520 Speaker 1: being the Flash, and of course the dream 656 00:36:49,600 --> 00:36:52,080 Speaker 1: of the Flash, right, is that he can move faster, 657 00:36:52,120 --> 00:36:55,040 Speaker 1: he can do everything faster than we can, because, oh, 658 00:36:55,120 --> 00:36:58,920 Speaker 1: these bodies are so slow, they're limited, right. Uh. And 659 00:36:58,960 --> 00:37:01,719 Speaker 1: it's bringing industry and man together in a way 660 00:37:01,760 --> 00:37:04,000 Speaker 1: that it sounds like the military was very interested in. 661 00:37:04,440 --> 00:37:08,479 Speaker 1: This is when we get into, uh, Clynes and Kline. Their 662 00:37:08,560 --> 00:37:11,839 Speaker 1: paper had a lot of influence, and millions of US 663 00:37:11,880 --> 00:37:17,520 Speaker 1: Air Force dollars were spent developing exoskeletons, robot arms, biofeedback devices, 664 00:37:17,560 --> 00:37:20,640 Speaker 1: and more. I mean, we are obsessed with this dream. 665 00:37:20,680 --> 00:37:23,160 Speaker 1: That's why it showed up in our pop culture over 666 00:37:23,239 --> 00:37:28,200 Speaker 1: and over again. Uh, The Six Million Dollar Man, The 667 00:37:28,280 --> 00:37:30,719 Speaker 1: Bionic Woman. Right. I think if we were both a 668 00:37:30,760 --> 00:37:33,600 Speaker 1: little older, like, those would be the examples. Yeah. Yeah, 669 00:37:33,640 --> 00:37:36,239 Speaker 1: for me, like, uh, you know, you mentioned RoboCop. I 670 00:37:36,239 --> 00:37:39,640 Speaker 1: think of, like, um, surely Star Trek and Doctor Who 671 00:37:39,719 --> 00:37:43,319 Speaker 1: had their own iterations of cyborgs. But again, like, 672 00:37:43,400 --> 00:37:46,799 Speaker 1: it ultimately comes down to that whole, like, uh, 673 00:37:46,840 --> 00:37:49,759 Speaker 1: they're agonizing over the am I Man or am 674 00:37:49,760 --> 00:37:52,320 Speaker 1: I Machine? Where do I, you know, where do I exist? 675 00:37:52,560 --> 00:37:55,800 Speaker 1: In comics, uh, this character that was just recently portrayed 676 00:37:55,800 --> 00:37:58,840 Speaker 1: by Paul Bettany in the Avengers movie, the Vision, is 677 00:37:58,880 --> 00:38:02,799 Speaker 1: an android, and there's this classic comic book cover 678 00:38:03,200 --> 00:38:10,520 Speaker 1: with the Vision and it says, Even an Android Can Cry. Ah. Yeah, 679 00:38:10,520 --> 00:38:12,080 Speaker 1: I mean, it all comes down to, to 680 00:38:12,120 --> 00:38:15,719 Speaker 1: what extent is a cyborg either an advancement 681 00:38:15,760 --> 00:38:18,560 Speaker 1: of the human condition, a lessening of the human condition, 682 00:38:18,760 --> 00:38:21,799 Speaker 1: or somewhere nicely neutral in the middle. Yeah. I want 683 00:38:21,800 --> 00:38:24,160 Speaker 1: to mention one more thing from that Atlantic piece.
Um, 684 00:38:25,280 --> 00:38:28,920 Speaker 1: the author spoke with Clynes, and Clynes presented just another 685 00:38:28,960 --> 00:38:31,799 Speaker 1: wonderful example of what it might mean to be a 686 00:38:31,800 --> 00:38:35,080 Speaker 1: cyborg, and what it means to perhaps, you know, already 687 00:38:35,080 --> 00:38:38,560 Speaker 1: be a cyborg, in that he presented the example of 688 00:38:38,600 --> 00:38:42,640 Speaker 1: a cyborg implant that's part of our naturally occurring anatomy. 689 00:38:42,800 --> 00:38:46,520 Speaker 1: I've never heard of this before. Really, I was like, oh, wow, 690 00:38:46,680 --> 00:38:48,520 Speaker 1: that's creepy. I'll give you a second to see if 691 00:38:48,520 --> 00:38:51,880 Speaker 1: you can guess what it is, listeners. It's the lens 692 00:38:52,000 --> 00:38:55,319 Speaker 1: of the eye. I'm going to read the quote. The 693 00:38:55,440 --> 00:38:58,120 Speaker 1: lens is not in any way part of the body, 694 00:38:58,239 --> 00:39:01,239 Speaker 1: except that it happens to be there. In fact, it 695 00:39:01,280 --> 00:39:05,160 Speaker 1: has no normal blood supply. It does have liquid surrounding it, 696 00:39:05,200 --> 00:39:07,480 Speaker 1: but there is no blood supply, because if you had 697 00:39:07,520 --> 00:39:10,440 Speaker 1: blood going through the lens, you wouldn't see too well. 698 00:39:10,719 --> 00:39:14,120 Speaker 1: Nature has taken care of it. The biological control and 699 00:39:14,200 --> 00:39:19,480 Speaker 1: invention of the lens is a beautiful and fantastic thing. Yeah, 700 00:39:19,520 --> 00:39:22,440 Speaker 1: and the way he talks about it, and 701 00:39:22,520 --> 00:39:24,440 Speaker 1: I don't know if this is the author of the 702 00:39:24,440 --> 00:39:28,680 Speaker 1: Atlantic piece or Clynes himself, but he basically says that our 703 00:39:28,719 --> 00:39:31,880 Speaker 1: control over the lens of our eye is the nearest 704 00:39:32,040 --> 00:39:37,200 Speaker 1: thing that we have to telekinesis. Wow. Yeah, because we're talking, 705 00:39:38,480 --> 00:39:40,799 Speaker 1: you know, it's a conscious movement of the body, but 706 00:39:40,880 --> 00:39:43,080 Speaker 1: it's the only one not tied to the brain by 707 00:39:43,120 --> 00:39:46,080 Speaker 1: neural feedback. You see the results of thinking at something, 708 00:39:46,200 --> 00:39:48,279 Speaker 1: you're thinking at it, and it happens, but there's no 709 00:39:48,360 --> 00:39:52,480 Speaker 1: muscular feedback from those muscles that activate the curvature 710 00:39:52,520 --> 00:39:54,720 Speaker 1: of the lens. It's exactly what they were talking about 711 00:39:54,760 --> 00:39:56,880 Speaker 1: with their whole space proposal, right, that we have no 712 00:39:57,000 --> 00:40:00,759 Speaker 1: knowledge of it operating automatically on its own, and just 713 00:40:00,800 --> 00:40:03,360 Speaker 1: because we think a thing, it happens. There's no feedback, 714 00:40:03,400 --> 00:40:06,239 Speaker 1: it just does. It thinks, and that 715 00:40:06,400 --> 00:40:09,560 Speaker 1: big gun pops out of it. Exactly right. Yeah, all right, 716 00:40:09,600 --> 00:40:11,160 Speaker 1: we're gonna take a quick break, and when we come 717 00:40:11,200 --> 00:40:15,600 Speaker 1: back we will launch into some ethical quandaries about cyborgs 718 00:40:15,680 --> 00:40:28,239 Speaker 1: and even, indeed, into the idea of cyborg feminism.
719 00:40:28,280 --> 00:40:30,120 Speaker 1: All right, we're back, and we're gonna talk a little 720 00:40:30,160 --> 00:40:35,560 Speaker 1: bit about cyborg ethics. Um, there is a British cyberneticist 721 00:40:35,680 --> 00:40:38,400 Speaker 1: by the name of Kevin Warwick, and a number of 722 00:40:38,400 --> 00:40:41,799 Speaker 1: you that follow sort of transhuman topics, you may 723 00:40:41,800 --> 00:40:44,360 Speaker 1: be familiar with him thanks to his series of Captain 724 00:40:44,400 --> 00:40:48,400 Speaker 1: Cyborg experiments, and these generally involved, like, placing chips in 725 00:40:48,440 --> 00:40:51,760 Speaker 1: his body and sort of exploring, you know, early examples 726 00:40:51,760 --> 00:40:54,120 Speaker 1: of what it is to be a human cyborg based 727 00:40:54,160 --> 00:40:57,719 Speaker 1: on circuitry. But he's also done some thinking about the 728 00:40:57,760 --> 00:41:01,520 Speaker 1: ethics of it. Particularly, he's asked if humans will 729 00:41:01,560 --> 00:41:04,960 Speaker 1: one day be required to upgrade to a cybernetic state, 730 00:41:05,360 --> 00:41:08,000 Speaker 1: to become cyborgs, or if they'll be able to 731 00:41:08,000 --> 00:41:11,400 Speaker 1: live their lives in a primitive state, which he likens 732 00:41:11,440 --> 00:41:13,680 Speaker 1: to that of a chimpanzee living in the shadow of 733 00:41:13,680 --> 00:41:17,760 Speaker 1: a human. I imagine that that is probably like 734 00:41:17,760 --> 00:41:19,839 Speaker 1: the heart flutter that a lot of people had when 735 00:41:19,960 --> 00:41:22,840 Speaker 1: Google Glass hit the scene a couple of years ago. Remember, like, 736 00:41:24,040 --> 00:41:26,720 Speaker 1: oh God, I'm gonna have to wear that to interact 737 00:41:26,760 --> 00:41:30,280 Speaker 1: with society. No, no, I'll go live on a farm 738 00:41:30,360 --> 00:41:32,719 Speaker 1: somewhere and I will not participate in Google Glass. And 739 00:41:32,800 --> 00:41:35,200 Speaker 1: luckily it didn't pan out for anybody. It's kind of 740 00:41:35,239 --> 00:41:38,200 Speaker 1: like when you encounter somebody who doesn't have a smartphone, 741 00:41:38,680 --> 00:41:41,359 Speaker 1: which, to a large extent, I applaud those 742 00:41:41,400 --> 00:41:45,719 Speaker 1: individuals who do that, but you also sort 743 00:41:45,719 --> 00:41:47,200 Speaker 1: of ask yourself, like, how do you 744 00:41:47,200 --> 00:41:50,360 Speaker 1: live your life like that? How? You know? How? Like 745 00:41:50,800 --> 00:41:53,360 Speaker 1: we grow so accustomed to being just constantly plugged in, 746 00:41:53,440 --> 00:41:56,399 Speaker 1: and when we're not plugged in, it takes something 747 00:41:56,440 --> 00:41:58,440 Speaker 1: out of you. Like when I went on vacation at 748 00:41:58,480 --> 00:42:01,359 Speaker 1: the beginning of the year and my smartphone didn't work 749 00:42:01,400 --> 00:42:04,880 Speaker 1: for a week, 750 00:42:04,920 --> 00:42:06,799 Speaker 1: I felt a little panicky at first. I had to 751 00:42:06,800 --> 00:42:10,359 Speaker 1: sort of adjust to this new freedom of not being 752 00:42:10,440 --> 00:42:13,920 Speaker 1: shackled to this device and all its augmentation.
I remember 753 00:42:13,960 --> 00:42:16,120 Speaker 1: being in my early twenties, and I didn't have a 754 00:42:16,160 --> 00:42:19,440 Speaker 1: cell phone, and I was like, I hate that everybody's 755 00:42:19,440 --> 00:42:21,160 Speaker 1: on these cell phones all the time and they just 756 00:42:21,200 --> 00:42:24,360 Speaker 1: walk around talking and texting and not paying attention 757 00:42:24,400 --> 00:42:26,640 Speaker 1: to the world around them. And I said something to 758 00:42:26,680 --> 00:42:28,759 Speaker 1: a friend one time, like, I'll never get a cell 759 00:42:28,800 --> 00:42:32,719 Speaker 1: phone until they can implant them in your skull. And then, like, 760 00:42:32,880 --> 00:42:35,560 Speaker 1: you know, cut to fifteen years later, and I've got 761 00:42:35,600 --> 00:42:38,399 Speaker 1: a smartphone just like everybody else, and it is kind 762 00:42:38,400 --> 00:42:41,600 Speaker 1: of implanted. Yeah, exactly, it pretty much is. Yeah. But 763 00:42:41,680 --> 00:42:44,560 Speaker 1: a lot of individuals have studied, have written 764 00:42:44,600 --> 00:42:47,520 Speaker 1: about the idea of cyborg ethics. One cool paper that 765 00:42:47,560 --> 00:42:51,520 Speaker 1: we ran across is one titled Cyborgs and Moral Identity 766 00:42:51,680 --> 00:42:54,719 Speaker 1: by Grant Gillett, published in two thousand six in the 767 00:42:54,760 --> 00:42:58,840 Speaker 1: Journal of Medical Ethics, and it explored several different ethical 768 00:42:59,080 --> 00:43:02,799 Speaker 1: cyborg quandaries. It's a fun paper, very readable. The author 769 00:43:02,880 --> 00:43:07,200 Speaker 1: lays out some, quote unquote, fanciful cases. They often kind 770 00:43:07,200 --> 00:43:09,600 Speaker 1: of tread into Black Mirror kind of territory. I'm glad 771 00:43:09,640 --> 00:43:11,640 Speaker 1: that you noticed that as well. Yeah. As I was 772 00:43:11,880 --> 00:43:13,640 Speaker 1: reading through it, I was like, man, this guy is 773 00:43:13,680 --> 00:43:17,760 Speaker 1: just pitching Black Mirror episode after episode. Yeah, yeah, yeah. 774 00:43:17,880 --> 00:43:20,400 Speaker 1: And they all kind of come down to the same question. 775 00:43:20,480 --> 00:43:24,800 Speaker 1: If we cybernetically enhance a human, if we cybernetically enhance 776 00:43:24,880 --> 00:43:28,000 Speaker 1: the brain, then to what extent is the resulting mind, 777 00:43:28,680 --> 00:43:32,520 Speaker 1: the resulting person, still human? 778 00:43:32,520 --> 00:43:35,000 Speaker 1: I'm gonna roll through some of the examples here. Um, 779 00:43:35,280 --> 00:43:37,040 Speaker 1: I know your favorite one is the last one. Yeah, 780 00:43:37,040 --> 00:43:38,320 Speaker 1: the last one is the one we'll talk about in 781 00:43:38,440 --> 00:43:41,040 Speaker 1: more depth, because that's the problematic one, right. So he 782 00:43:41,120 --> 00:43:45,040 Speaker 1: discusses neuro-reconstruction of a three-year-old's severe 783 00:43:45,080 --> 00:43:47,640 Speaker 1: brain injury. It ends up changing who the three-year-old 784 00:43:47,640 --> 00:43:50,120 Speaker 1: is, but the three-year-old gets 785 00:43:50,160 --> 00:43:52,920 Speaker 1: to live, okay. And I think most people can 786 00:43:52,920 --> 00:43:55,240 Speaker 1: get okay with that one, because it's like, you saved 787 00:43:55,239 --> 00:43:59,560 Speaker 1: a life, right, and it's not perfect, but you saved a life. Alright.
788 00:43:59,600 --> 00:44:03,120 Speaker 1: The next: cybernetic eyes for the blind, the Geordi La Forge scenario. 789 00:44:03,440 --> 00:44:07,400 Speaker 1: Everybody's cool with Geordi. Another one: extensive brain injury and 790 00:44:07,440 --> 00:44:11,280 Speaker 1: then replacement with micro-networks. So this is like the 791 00:44:11,360 --> 00:44:15,560 Speaker 1: natural evolution of brain-computer interfaces. Yeah, and saying, like, oh, 792 00:44:15,560 --> 00:44:17,400 Speaker 1: there's damage to the brain, but we can fix it 793 00:44:17,920 --> 00:44:20,560 Speaker 1: with this new technology, all right. And again, we're 794 00:44:21,239 --> 00:44:25,200 Speaker 1: treating a wound. We're treating an injury. Everybody's generally 795 00:44:25,200 --> 00:44:30,200 Speaker 1: okay with that. An unborn child with an incompletely formed brain, 796 00:44:30,600 --> 00:44:33,960 Speaker 1: and then doctors grow that brain out with cybernetic techniques 797 00:44:33,960 --> 00:44:36,600 Speaker 1: to ensure the child is born with a working brain. 798 00:44:37,239 --> 00:44:39,239 Speaker 1: This one's a lot. That sounds like the beginning of 799 00:44:39,239 --> 00:44:41,640 Speaker 1: a horror story to me. Well, yeah, I guess it 800 00:44:41,680 --> 00:44:44,319 Speaker 1: could be. But you could also say that 801 00:44:44,360 --> 00:44:49,640 Speaker 1: it forestalls the real-life horror story. Yeah, absolutely. Yeah, 802 00:44:49,880 --> 00:44:52,480 Speaker 1: so I think this one, yeah, this is 803 00:44:52,520 --> 00:44:56,440 Speaker 1: a little more problematic, because you're changing 804 00:44:56,440 --> 00:45:00,080 Speaker 1: the human in utero, creating a cyborg in utero. And 805 00:45:00,120 --> 00:45:03,120 Speaker 1: then you have to ask to what extent 806 00:45:03,239 --> 00:45:06,680 Speaker 1: is the resulting child still the child? But yeah, right, 807 00:45:07,480 --> 00:45:10,160 Speaker 1: but then there's the Peggy story. And you want 808 00:45:10,160 --> 00:45:12,759 Speaker 1: to take this one? Sure, yeah. This 809 00:45:12,840 --> 00:45:14,959 Speaker 1: is the one that most felt like a Black 810 00:45:15,000 --> 00:45:18,600 Speaker 1: Mirror episode to me. And genuinely, as I'm reading 811 00:45:18,600 --> 00:45:24,000 Speaker 1: this academic article, the twist to this story, I went, oh, 812 00:45:24,160 --> 00:45:26,080 Speaker 1: like, it sent shivers down my spine, like 813 00:45:26,080 --> 00:45:29,879 Speaker 1: the twist to one of those old, like, horror comics. 814 00:45:29,920 --> 00:45:33,799 Speaker 1: All right, so, uh, basically, the idea here is that 815 00:45:34,120 --> 00:45:37,680 Speaker 1: Bob and Peggy are a couple and they have problems. Uh, 816 00:45:37,920 --> 00:45:42,800 Speaker 1: Peggy's depressed, right, and uh, so they rent cyber help, 817 00:45:43,000 --> 00:45:46,080 Speaker 1: which is like an android that's customized to come in 818 00:45:46,120 --> 00:45:48,640 Speaker 1: and be compassionate and take care of Peggy and kind 819 00:45:48,640 --> 00:45:52,440 Speaker 1: of, like, help her get beyond her depression. Her presence 820 00:45:52,520 --> 00:45:58,600 Speaker 1: cheers everybody up, and Peggy undergoes neuropsychiatric treatments and 821 00:45:58,760 --> 00:46:01,440 Speaker 1: becomes her old self. Right.
And one of the keys 822 00:46:01,480 --> 00:46:04,920 Speaker 1: here is that the android has these, like, symbols 823 00:46:04,920 --> 00:46:06,880 Speaker 1: on the back of her skull that show, like, 824 00:46:07,160 --> 00:46:09,680 Speaker 1: which features she's been loaded with. And I think it's 825 00:46:10,239 --> 00:46:13,239 Speaker 1: like there's some for, like, artistic abilities, and the most 826 00:46:13,280 --> 00:46:17,759 Speaker 1: important one is one that allows the android 827 00:46:17,920 --> 00:46:23,759 Speaker 1: to show compassion. Yeah, so Bob and Peggy, they say, hey, 828 00:46:23,800 --> 00:46:29,560 Speaker 1: everything's fine now. Peggy's fine, she had this procedure, it was good. Yeah, 829 00:46:29,560 --> 00:46:31,760 Speaker 1: so they say, well, we don't need the android anymore. 830 00:46:31,760 --> 00:46:35,640 Speaker 1: They send her back to the plant. Uh. And at 831 00:46:35,680 --> 00:46:38,400 Speaker 1: the end of it, Bob is, like, stroking Peggy's hair 832 00:46:38,480 --> 00:46:41,359 Speaker 1: or something like that, and he notices that she's got 833 00:46:41,360 --> 00:46:45,440 Speaker 1: the raised, embossed indentations of lettering the same way that 834 00:46:45,480 --> 00:46:47,839 Speaker 1: the android did on the back of her head. So 835 00:46:47,880 --> 00:46:50,319 Speaker 1: it's implied, wait, what did that android do to my 836 00:46:50,360 --> 00:46:52,360 Speaker 1: wife when I wasn't home? Right? Oh yeah, there's like 837 00:46:52,360 --> 00:46:54,960 Speaker 1: a thing in there where he takes an extended business 838 00:46:55,000 --> 00:46:57,600 Speaker 1: trip, and when he gets back is 839 00:46:57,600 --> 00:46:59,560 Speaker 1: when he notices that. So it's a little 840 00:46:59,560 --> 00:47:02,040 Speaker 1: bit like The Stepford Wives. Right, there's very 841 00:47:02,200 --> 00:47:05,880 Speaker 1: much like a sexist theme here as well. But 842 00:47:06,080 --> 00:47:09,440 Speaker 1: ultimately this one is more problematic, right, because what happened 843 00:47:09,440 --> 00:47:12,120 Speaker 1: to Peggy, what happened to the old Peggy? Was that 844 00:47:12,200 --> 00:47:16,200 Speaker 1: the real Peggy? And is the new Peggy the real Peggy? Yeah? 845 00:47:16,239 --> 00:47:19,000 Speaker 1: Where's Peggy? It also says something, and I don't know if 846 00:47:19,000 --> 00:47:21,480 Speaker 1: this is just Grant Gillett or us, how we 847 00:47:21,560 --> 00:47:25,200 Speaker 1: approach this topic, but, like, how we think of depression too, 848 00:47:25,600 --> 00:47:29,160 Speaker 1: as like that's a thing you cure. Yeah, we have 849 00:47:29,200 --> 00:47:33,520 Speaker 1: a robot come in and just fix it. Yeah. He 850 00:47:33,600 --> 00:47:36,600 Speaker 1: makes two main observations with all of these. Uh, one, 851 00:47:36,880 --> 00:47:39,200 Speaker 1: we are less concerned with the cybernetic components 852 00:47:39,480 --> 00:47:42,080 Speaker 1: of the person if they seem peripheral or 853 00:47:42,120 --> 00:47:46,640 Speaker 1: somewhat incidental to their psychological identity or character. Okay, so 854 00:47:46,760 --> 00:47:50,240 Speaker 1: we're cool with augmentations. That's no big deal, because certainly 855 00:47:50,320 --> 00:47:52,880 Speaker 1: you apply that to real life. We augment ourselves all 856 00:47:52,920 --> 00:47:55,440 Speaker 1: the time.
That cup of coffee is an augmentation, a pair of 857 00:47:55,480 --> 00:47:58,960 Speaker 1: glasses is an augmentation, but generally, you know, people may joke 858 00:47:59,040 --> 00:48:01,120 Speaker 1: about not being themselves until they have that cup 859 00:48:01,120 --> 00:48:04,160 Speaker 1: of coffee, but nobody actually believes that. You know what's 860 00:48:04,200 --> 00:48:07,520 Speaker 1: cybernetic for me? Taking a shower. Yeah. Like, I was 861 00:48:07,560 --> 00:48:10,560 Speaker 1: thinking about that this weekend, like, what did people do 862 00:48:11,040 --> 00:48:13,719 Speaker 1: before showers? Because if I don't have a shower, I 863 00:48:13,760 --> 00:48:18,560 Speaker 1: feel exhausted and tired and gross. But then, like, you're 864 00:48:18,560 --> 00:48:20,960 Speaker 1: getting that shower and it's just, boom, I'm ready for 865 00:48:21,000 --> 00:48:22,520 Speaker 1: the day. I don't know what it is. I'm the 866 00:48:22,560 --> 00:48:27,040 Speaker 1: same way. His second observation is that, quote, we are 867 00:48:27,120 --> 00:48:30,839 Speaker 1: more concerned where a non-human mode of relationship or 868 00:48:31,040 --> 00:48:36,160 Speaker 1: reaction or response to others emerges. So that's pretty basic, 869 00:48:36,239 --> 00:48:40,160 Speaker 1: right? When the result seems 870 00:48:40,360 --> 00:48:43,520 Speaker 1: non-human, or the relationship seems non-human, then 871 00:48:43,560 --> 00:48:46,279 Speaker 1: we're saying, okay, what's wrong? This is not a 872 00:48:46,320 --> 00:48:49,799 Speaker 1: cybernetic scenario I can get behind. So Gillett, like, proposing 873 00:48:49,840 --> 00:48:53,080 Speaker 1: all of these fictional scenarios, is basically getting at his 874 00:48:53,120 --> 00:48:58,239 Speaker 1: big question, which is, how should we morally treat a cyborg? Right, 875 00:48:58,560 --> 00:49:01,399 Speaker 1: we still treat each other well, we like to think 876 00:49:01,400 --> 00:49:04,560 Speaker 1: we do, as moral agents when we're interacting with each 877 00:49:04,600 --> 00:49:09,920 Speaker 1: other through diaries or computers or even antidepressants. Right now, again, 878 00:49:09,960 --> 00:49:11,879 Speaker 1: I say I like to think we do, and then, 879 00:49:11,960 --> 00:49:14,320 Speaker 1: like, go take a look at some YouTube comments. Sometimes 880 00:49:14,360 --> 00:49:17,479 Speaker 1: I don't know that there are necessarily moral agents at play there. 881 00:49:18,080 --> 00:49:23,200 Speaker 1: But his argument is, if we're ethical and moral to 882 00:49:23,239 --> 00:49:25,959 Speaker 1: one another through those things, shouldn't we do the same 883 00:49:26,000 --> 00:49:29,799 Speaker 1: thing if our brain is somehow connected to technology? Yeah, 884 00:49:29,960 --> 00:49:34,080 Speaker 1: it's, uh, the Peggy example, especially the more 885 00:49:34,080 --> 00:49:36,600 Speaker 1: you chew on it, because you have the android, who 886 00:49:36,719 --> 00:49:38,840 Speaker 1: is fake and is just, you know, a servant that 887 00:49:38,920 --> 00:49:41,920 Speaker 1: is then turned back over. But then the same things 888 00:49:41,960 --> 00:49:45,200 Speaker 1: that make her seem genuine, like the same 889 00:49:45,239 --> 00:49:48,719 Speaker 1: sort of, uh, you know, emotional programming.
If that same 890 00:49:48,719 --> 00:49:52,600 Speaker 1: programming is used to, quote unquote, fix Peggy, then is 891 00:49:52,640 --> 00:49:56,320 Speaker 1: Peggy fake now too? And then, if 892 00:49:56,360 --> 00:49:59,280 Speaker 1: the reverse is true, then was the android a real 893 00:49:59,320 --> 00:50:01,200 Speaker 1: person as well? What does it mean to be human? 894 00:50:01,239 --> 00:50:03,560 Speaker 1: What does it mean to be non-human? You know what? 895 00:50:04,120 --> 00:50:10,359 Speaker 1: This is a perfect segue into the Donna Haraway, uh, conundrum, 896 00:50:10,560 --> 00:50:15,200 Speaker 1: the Cyborg Manifesto. We're talking about Peggy and her depression 897 00:50:15,360 --> 00:50:18,920 Speaker 1: and her female identity and whether or not it changes 898 00:50:19,040 --> 00:50:22,520 Speaker 1: or is the same if she gets somehow computerized. That 899 00:50:22,600 --> 00:50:26,520 Speaker 1: leads right into Haraway. Yes, we are now somewhere around 900 00:50:26,600 --> 00:50:32,759 Speaker 1: nineteen eighty-five, and we're talking about A Cyborg Manifesto by 901 00:50:32,960 --> 00:50:37,520 Speaker 1: Donna J. Haraway. She is Distinguished Professor Emerita of the 902 00:50:37,640 --> 00:50:41,200 Speaker 1: History of Consciousness Department and Feminist Studies Department at the 903 00:50:41,280 --> 00:50:44,239 Speaker 1: University of California, Santa Cruz. They have a History of 904 00:50:44,320 --> 00:50:48,840 Speaker 1: Consciousness department, or they did, who knows if their funding 905 00:50:48,960 --> 00:50:52,960 Speaker 1: is still available. But that's pretty cool. So let's just 906 00:50:53,239 --> 00:50:56,160 Speaker 1: get this out of the way. We both read, or 907 00:50:56,480 --> 00:50:59,520 Speaker 1: in my case, attempted to read, the Cyborg Manifesto. It's 908 00:50:59,640 --> 00:51:02,800 Speaker 1: dense reading, I will warn you out there, people. Uh. 909 00:51:03,640 --> 00:51:07,640 Speaker 1: Haraway has written this in a very kind of postmodern 910 00:51:07,719 --> 00:51:11,960 Speaker 1: philosophical styling, so it doesn't necessarily read like your traditional 911 00:51:12,000 --> 00:51:14,879 Speaker 1: academic paper, in that it, you know, doesn't set 912 00:51:14,920 --> 00:51:17,560 Speaker 1: up a methodology for you and then walk through an 913 00:51:17,560 --> 00:51:20,560 Speaker 1: experiment and tell you what the conclusions were. A lot 914 00:51:20,680 --> 00:51:24,919 Speaker 1: of it is her riffing on the ideas of what 915 00:51:25,080 --> 00:51:27,640 Speaker 1: being a cyborg means. Yeah, and you kind of have 916 00:51:27,680 --> 00:51:31,160 Speaker 1: to unravel what she means and what her argument here 917 00:51:31,239 --> 00:51:35,400 Speaker 1: is as well. But ultimately it's, uh, it's a very 918 00:51:35,400 --> 00:51:40,879 Speaker 1: compelling argument, and one that probably beautifully transforms and illuminates 919 00:51:41,080 --> 00:51:43,879 Speaker 1: the idea of the cyborg that we've been discussing this whole episode. Yeah, 920 00:51:43,960 --> 00:51:48,600 Speaker 1: I find it particularly useful. Uh. And in ninety-seven, 921 00:51:49,160 --> 00:51:52,359 Speaker 1: uh, Robert found an article in Wired written by 922 00:51:52,400 --> 00:52:01,400 Speaker 1: Hari Kunzru that basically deconstructs Haraway's Cyborg Manifesto and 923 00:52:01,480 --> 00:52:04,560 Speaker 1: explains it much better.
Uh, let's see if we can 924 00:52:04,600 --> 00:52:06,040 Speaker 1: take a stab at it here. But you know what, 925 00:52:06,160 --> 00:52:08,320 Speaker 1: let's start with that quote, just to give our listeners 926 00:52:08,320 --> 00:52:10,879 Speaker 1: an idea of what kind of reading material it is. 927 00:52:11,520 --> 00:52:15,200 Speaker 1: Quote: By the late twentieth century, our time, a mythic time, 928 00:52:15,520 --> 00:52:20,640 Speaker 1: we are all chimeras, theorized and fabricated hybrids of machine and organism. 929 00:52:20,719 --> 00:52:25,040 Speaker 1: In short, we are cyborgs. The cyborg is our ontology; it 930 00:52:25,080 --> 00:52:28,440 Speaker 1: gives us our politics. The cyborg is a condensed image 931 00:52:28,440 --> 00:52:33,960 Speaker 1: of both imagination and material reality. The relation between organism 932 00:52:34,000 --> 00:52:38,480 Speaker 1: and machine has been a border war. Yeah, okay, I 933 00:52:38,480 --> 00:52:40,320 Speaker 1: think it's fair to say, and this is written in 934 00:52:40,360 --> 00:52:44,440 Speaker 1: the notes here, what the hell does that mean? Uh? 935 00:52:44,760 --> 00:52:46,880 Speaker 1: This is what I got out of reading it directly. 936 00:52:47,000 --> 00:52:49,080 Speaker 1: But then let's turn to that Wired article to see 937 00:52:49,120 --> 00:52:50,759 Speaker 1: if we can unpack it a little bit more. So, 938 00:52:50,840 --> 00:52:54,200 Speaker 1: first of all, she sees cyborgs everywhere, and keep in 939 00:52:54,239 --> 00:52:57,440 Speaker 1: mind, she's writing this in eighty-five, in which, you know, as we 940 00:52:57,480 --> 00:52:59,640 Speaker 1: talked about earlier, that's when we were growing up seeing 941 00:52:59,680 --> 00:53:02,120 Speaker 1: them in pop culture everywhere, right. But she sees them 942 00:53:02,160 --> 00:53:05,399 Speaker 1: in war. She sees them in sex, she sees them 943 00:53:05,400 --> 00:53:09,440 Speaker 1: in medicine. Uh. And her thesis is essentially that cyborgs 944 00:53:09,840 --> 00:53:14,040 Speaker 1: are a fiction that maps out our social and bodily 945 00:53:14,120 --> 00:53:19,759 Speaker 1: reality and that can suggest what she calls fruitful couplings. 946 00:53:19,800 --> 00:53:22,560 Speaker 1: And I think what she means by fruitful couplings is 947 00:53:22,600 --> 00:53:27,360 Speaker 1: sort of like a redefinition of identity in such a 948 00:53:27,400 --> 00:53:30,759 Speaker 1: way that is beneficial to the individual. We'll 949 00:53:30,760 --> 00:53:33,040 Speaker 1: see if I'm right about that or not. But yeah, 950 00:53:33,239 --> 00:53:36,520 Speaker 1: we're all chimeras, right, and in particular that's important to her. 951 00:53:36,600 --> 00:53:39,360 Speaker 1: And this is where, you know, cyber feminism or 952 00:53:39,360 --> 00:53:41,839 Speaker 1: cyborg feminism, it's not her term, but it comes out 953 00:53:41,840 --> 00:53:46,000 Speaker 1: of this paper. Cyborgs are post-gender beings, right, or 954 00:53:46,000 --> 00:53:50,320 Speaker 1: at least they're capable of being so. Uh. And going 955 00:53:50,400 --> 00:53:53,200 Speaker 1: back all the way to us talking about Wiener and 956 00:53:53,360 --> 00:53:59,399 Speaker 1: Katherine Hayles's paper about him, Haraway sees cyborgs as being 957 00:53:59,440 --> 00:54:04,600 Speaker 1: inherently a confluence of both militarism and capitalism. And she 958 00:54:04,680 --> 00:54:07,879 Speaker 1: breaks it down.
She says there's three boundaries that come into 959 00:54:07,880 --> 00:54:11,880 Speaker 1: play when we're talking about cyborgs. There's the human versus 960 00:54:11,960 --> 00:54:16,120 Speaker 1: animal boundary, there's the organism versus machine boundary, and then 961 00:54:16,160 --> 00:54:19,480 Speaker 1: there's the physical and non-physical boundary. And I guess, 962 00:54:19,520 --> 00:54:22,040 Speaker 1: like, that one to me gets us back to that, 963 00:54:22,080 --> 00:54:26,359 Speaker 1: like, flow of information thing, right? Yeah, that's the 964 00:54:26,440 --> 00:54:30,239 Speaker 1: non-physical. Yeah, the flow of information, the network of information. Um, 965 00:54:30,520 --> 00:54:33,360 Speaker 1: that's very essential to all of this. But again, a 966 00:54:33,440 --> 00:54:35,120 Speaker 1: lot of you are probably wondering, what the hell does 967 00:54:35,120 --> 00:54:38,160 Speaker 1: that mean? Uh? And so let's turn to that Kunzru 968 00:54:38,280 --> 00:54:42,480 Speaker 1: article. He profiled and chatted with Haraway for that 969 00:54:42,560 --> 00:54:46,480 Speaker 1: piece, and one of the more useful examples that 970 00:54:46,520 --> 00:54:48,360 Speaker 1: he brings up, and this is one where he's talking 971 00:54:48,360 --> 00:54:52,320 Speaker 1: to her about it, is the example of doping in sports. Okay. 972 00:54:52,480 --> 00:54:57,880 Speaker 1: So Haraway sees this as just irrelevant, because, quote, 973 00:54:57,880 --> 00:55:02,080 Speaker 1: training and technology make every Olympian a node 974 00:55:02,200 --> 00:55:06,840 Speaker 1: in an international technocultural network. So winning an Olympic foot 975 00:55:06,920 --> 00:55:10,840 Speaker 1: race isn't just about running fast, or running faster because 976 00:55:10,880 --> 00:55:15,319 Speaker 1: you took this particular medication. It's about, quote, the interaction 977 00:55:15,400 --> 00:55:21,160 Speaker 1: of medicine, diet, training practices, clothing and equipment manufacture, 978 00:55:21,320 --> 00:55:25,360 Speaker 1: visualization and timekeeping. In other words, like, 979 00:55:25,360 --> 00:55:28,640 Speaker 1: that Olympic runner is the product of this vast 980 00:55:28,800 --> 00:55:32,640 Speaker 1: interconnected system, these ideas of what a runner is and 981 00:55:32,680 --> 00:55:34,640 Speaker 1: all of these technologies that make it possible, and it's 982 00:55:34,640 --> 00:55:38,080 Speaker 1: all artificial to her, right? Like, all of that is 983 00:55:38,080 --> 00:55:40,440 Speaker 1: an example of being a cyborg. She even goes so 984 00:55:40,480 --> 00:55:42,919 Speaker 1: far as to point out that before the Civil War, 985 00:55:43,000 --> 00:55:46,200 Speaker 1: I didn't know this, before the Civil War there weren't 986 00:55:46,360 --> 00:55:50,319 Speaker 1: right and left shoes. You just had shoes. Uh. And 987 00:55:50,520 --> 00:55:52,920 Speaker 1: the invention of a right shoe and a 988 00:55:53,080 --> 00:55:57,680 Speaker 1: left shoe, you know, was essentially, yes, it was for comfort, 989 00:55:57,760 --> 00:56:00,560 Speaker 1: but also to, you know, maximize walking and running.
Yeah, 990 00:56:00,840 --> 00:56:05,239 Speaker 1: to say nothing of Reebok Pumps. Exactly, all 991 00:56:05,239 --> 00:56:09,680 Speaker 1: those sweet hoops. And so Haraway actually addresses 992 00:56:10,200 --> 00:56:13,200 Speaker 1: the feminism thing here, and it basically comes down to 993 00:56:13,280 --> 00:56:16,560 Speaker 1: that she doesn't buy into that version of feminism that 994 00:56:16,680 --> 00:56:20,760 Speaker 1: she calls, quote, goddess feminism. Where, man, I really 995 00:56:20,760 --> 00:56:23,720 Speaker 1: want Kristen and Caroline to weigh in on this, our colleagues 996 00:56:23,719 --> 00:56:25,400 Speaker 1: who do Stuff Mom Never Told You. Now, I thought about that. I 997 00:56:25,480 --> 00:56:27,360 Speaker 1: did a quick search. I don't think they've covered this 998 00:56:27,400 --> 00:56:29,279 Speaker 1: topic before, but I think it would be perfect, it 999 00:56:29,320 --> 00:56:32,000 Speaker 1: would be awesome. Uh, so, anyway, she didn't 1000 00:56:32,000 --> 00:56:34,600 Speaker 1: buy into the kind of idea that 1001 00:56:35,440 --> 00:56:38,160 Speaker 1: you shake off the modern world and somehow connect 1002 00:56:38,200 --> 00:56:42,040 Speaker 1: to Mother Earth. Right. Instead, she sees that the realities 1003 00:56:42,120 --> 00:56:47,799 Speaker 1: of modern life include a relationship between people and technology, 1004 00:56:47,920 --> 00:56:51,680 Speaker 1: and this is such an intimate relationship between those things 1005 00:56:51,680 --> 00:56:54,480 Speaker 1: that it's impossible to tell where we end and the machines begin. 1006 00:56:54,560 --> 00:56:58,040 Speaker 1: So again we're getting back to that science fiction cyborg 1007 00:56:58,440 --> 00:57:00,799 Speaker 1: and thinking of, like, an episode of Star Trek: The Next 1008 00:57:00,840 --> 00:57:03,480 Speaker 1: Generation with Data, right, where he's like, am I human? 1009 00:57:03,680 --> 00:57:06,960 Speaker 1: Am I an android? As opposed to the Borg model, 1010 00:57:07,000 --> 00:57:09,239 Speaker 1: which is very much like, look, you can see the 1011 00:57:09,320 --> 00:57:11,840 Speaker 1: human part is the white skin stuff, and then the 1012 00:57:11,920 --> 00:57:22,960 Speaker 1: rest is just all, you know, trip to lows madness. 1013 00:57:23,000 --> 00:57:27,160 Speaker 1: So for her, one of the fundamentals about cyborgs and 1014 00:57:27,200 --> 00:57:30,160 Speaker 1: how we're connected to modern society is one of our 1015 00:57:30,160 --> 00:57:33,160 Speaker 1: most important commodities. It's a commodity that you're listening to 1016 00:57:33,280 --> 00:57:34,919 Speaker 1: right now, and that Robert and I make a living 1017 00:57:34,920 --> 00:57:40,600 Speaker 1: off of: information. Uh, cyborgs are information machines, right? So 1018 00:57:40,640 --> 00:57:42,160 Speaker 1: I think, like, in a way we could say, like, 1019 00:57:42,960 --> 00:57:45,280 Speaker 1: if you out there right now are listening to podcasts 1020 00:57:45,320 --> 00:57:47,320 Speaker 1: like I do, you've got your phone, there's some kind 1021 00:57:47,320 --> 00:57:51,840 Speaker 1: of platform on it, it's running, uh, the MP3 file, 1022 00:57:51,960 --> 00:57:55,360 Speaker 1: and you're listening to us talk about cyborgs while you're 1023 00:57:55,400 --> 00:57:59,120 Speaker 1: doing whatever: your laundry, your commute, uh, you know, 1024 00:57:59,440 --> 00:58:03,560 Speaker 1: your exercising. Uh.
That is making you into an 1025 00:58:03,640 --> 00:58:08,840 Speaker 1: information machine. And we're part of that information machine. Yeah, indeed. Um. So, 1026 00:58:09,120 --> 00:58:10,760 Speaker 1: like, one of the ideas here too is that 1027 00:58:10,800 --> 00:58:13,920 Speaker 1: there's no longer a dichotomy of natural and artificial in 1028 00:58:13,920 --> 00:58:17,760 Speaker 1: our world. Everything is chimera, everything is cyborg. And here's 1029 00:58:17,800 --> 00:58:20,880 Speaker 1: the thing. There's no natural order. There's only the order 1030 00:58:20,920 --> 00:58:24,520 Speaker 1: of reinvention. We are all the new Peggy. Um, and 1031 00:58:24,600 --> 00:58:27,000 Speaker 1: we can be any version of Peggy that we want 1032 00:58:27,040 --> 00:58:30,560 Speaker 1: to be. Yeah. And this, uh, this is where it 1033 00:58:30,800 --> 00:58:35,080 Speaker 1: gets really relevant, I think, to modern-day society. Right. So, 1034 00:58:35,440 --> 00:58:39,880 Speaker 1: Haraway further goes into it by talking about erotic fascination 1035 00:58:40,000 --> 00:58:44,280 Speaker 1: with cyborgs. She refers to, quote, the violation of boundaries 1036 00:58:44,320 --> 00:58:48,520 Speaker 1: by a cyborg as a pleasurable tight coupling between parts 1037 00:58:48,560 --> 00:58:51,800 Speaker 1: that are not supposed to touch. And I read that, 1038 00:58:51,840 --> 00:58:54,000 Speaker 1: and I thought of, and I hope you haven't seen this, 1039 00:58:54,040 --> 00:58:55,840 Speaker 1: and I don't wish it upon any of our audience, 1040 00:58:55,840 --> 00:58:59,000 Speaker 1: but the episode of Torchwood, the TV show, that is 1041 00:58:59,040 --> 00:59:03,160 Speaker 1: called Cyberwoman. Have you seen this? I may have 1042 00:59:03,200 --> 00:59:07,160 Speaker 1: watched this one. It's just, like, first, yeah, first season, 1043 00:59:08,520 --> 00:59:11,480 Speaker 1: it's terrible. Yeah, yeah, I did watch this one, and 1044 00:59:11,520 --> 00:59:14,320 Speaker 1: it's, you know, basically the premise is, in the 1045 00:59:14,440 --> 00:59:19,040 Speaker 1: Doctor Who universe there are these cyborgs called Cybermen, and they 1046 00:59:19,080 --> 00:59:22,560 Speaker 1: just look like big, kind of like, robots, but they've 1047 00:59:22,600 --> 00:59:25,160 Speaker 1: got, like, human brains in them or some organic 1048 00:59:25,200 --> 00:59:28,240 Speaker 1: parts in them. And uh, somewhere along the line, 1049 00:59:28,240 --> 00:59:31,760 Speaker 1: this woman was made into a Cyberwoman, and so 1050 00:59:31,800 --> 00:59:35,600 Speaker 1: she's, like, conflicted between man and machine. But the 1051 00:59:35,840 --> 00:59:38,480 Speaker 1: design that they did for this episode is just so 1052 00:59:38,560 --> 00:59:43,840 Speaker 1: insulting to this poor actress. She's basically wearing, like, a cyborg bikini. Uh. 1053 00:59:43,920 --> 00:59:46,600 Speaker 1: And it really, to me, I was like, oh, 1054 00:59:46,680 --> 00:59:51,400 Speaker 1: there's that erotic fascination with the pleasurable tight coupling of 1055 00:59:51,400 --> 00:59:55,400 Speaker 1: the cyborg, right? Like, clearly somebody who had access 1056 00:59:55,400 --> 00:59:59,120 Speaker 1: to the BBC's, uh, finances was like, this is what 1057 00:59:59,280 --> 01:00:02,440 Speaker 1: our viewers want. They want to see this half-naked 1058 01:00:02,440 --> 01:00:06,200 Speaker 1: cyborg lady.
And to me, that leads us to the 1059 01:00:06,600 --> 01:00:11,280 Speaker 1: real heart of it, the heart of Haraway's argument, the 1060 01:00:11,280 --> 01:00:14,320 Speaker 1: transhumanism, everything we've been talking about here today. And 1061 01:00:14,360 --> 01:00:18,560 Speaker 1: this comes via Hayles, and she says, the cyborg becomes 1062 01:00:18,600 --> 01:00:23,960 Speaker 1: the stage on which are performed contestations about body boundaries 1063 01:00:24,000 --> 01:00:29,400 Speaker 1: that have often marked class, ethnic, and cultural differences. So 1064 01:00:29,440 --> 01:00:33,360 Speaker 1: we're looking at a complex hybridization that's going to get 1065 01:00:33,480 --> 01:00:37,720 Speaker 1: rid of our old-fashioned concepts of what is natural 1066 01:00:37,920 --> 01:00:42,000 Speaker 1: versus what is artificial. Right. Like, so again, this 1067 01:00:42,040 --> 01:00:44,680 Speaker 1: is what I imagined Haraway was thinking of. Breast implants 1068 01:00:44,760 --> 01:00:46,320 Speaker 1: were probably what she was thinking of when she was 1069 01:00:46,360 --> 01:00:50,680 Speaker 1: thinking about, like, the erotic fascination of cyborgs. Uh. 1070 01:00:50,920 --> 01:00:55,080 Speaker 1: But it throws away binary concepts like gender. Right. And 1071 01:00:55,200 --> 01:00:59,439 Speaker 1: as we're recording this, it made me think of what's 1072 01:00:59,480 --> 01:01:02,680 Speaker 1: going on in North Carolina right now with this law 1073 01:01:02,760 --> 01:01:04,960 Speaker 1: that's got a lot of people upset on both sides 1074 01:01:05,080 --> 01:01:11,400 Speaker 1: about transgender people and public restrooms. Uh. And you know, 1075 01:01:11,440 --> 01:01:16,280 Speaker 1: wherever you fall on that, that is a transition from 1076 01:01:16,280 --> 01:01:22,080 Speaker 1: a binary duality that is totally freaking people out, right. Uh. 1077 01:01:22,120 --> 01:01:26,440 Speaker 1: And that's just the beginning. Like, when you think about 1078 01:01:26,440 --> 01:01:31,560 Speaker 1: the cyborg transition that our whole world is going through 1079 01:01:31,680 --> 01:01:37,040 Speaker 1: right now, get ready for infinite identities, like, any possible combination. 1080 01:01:37,720 --> 01:01:41,560 Speaker 1: We're just squeamish right now about something that doesn't fit 1081 01:01:41,600 --> 01:01:46,680 Speaker 1: into one of our two categories, right, for restrooms. Cyborgism 1082 01:01:46,880 --> 01:01:54,880 Speaker 1: makes an infinite possibility of identities available, or genders available, right. Yeah, indeed. 1083 01:01:55,160 --> 01:01:58,480 Speaker 1: I mean, it also reminds me of the recent episode 1084 01:01:58,520 --> 01:02:01,240 Speaker 1: that we did on hyper-religion. Uh. And, um, I 1085 01:02:01,320 --> 01:02:04,720 Speaker 1: think we've had conversations about this as well, about religious 1086 01:02:04,760 --> 01:02:06,440 Speaker 1: beliefs that are, you know, kind of the salad bar 1087 01:02:06,480 --> 01:02:09,360 Speaker 1: approach to religion. It's like a lot of us are 1088 01:02:09,360 --> 01:02:12,760 Speaker 1: engaging in a kind of cybernetic religion, instead of saying, like, 1089 01:02:12,920 --> 01:02:15,760 Speaker 1: this is an absolute truth.
And instead of saying this 1090 01:02:15,840 --> 01:02:18,560 Speaker 1: is an absolute truth, we're saying, you know, I'm going 1091 01:02:18,640 --> 01:02:21,320 Speaker 1: to build my truth out of this element and this 1092 01:02:21,360 --> 01:02:24,520 Speaker 1: element and this element, and create the kind of 1093 01:02:24,680 --> 01:02:30,439 Speaker 1: cyborg, um, worldview that makes the most sense to me. Yeah, totally. Uh. 1094 01:02:30,480 --> 01:02:33,560 Speaker 1: And Haraway actually has a quote that actually makes 1095 01:02:33,560 --> 01:02:37,760 Speaker 1: sense to me, uh, with regards to this, especially to absolutes. 1096 01:02:38,120 --> 01:02:42,200 Speaker 1: She says: good or bad, nature or nurture, right or wrong, 1097 01:02:42,680 --> 01:02:46,000 Speaker 1: it's messier than that. And that's a great way 1098 01:02:46,000 --> 01:02:48,560 Speaker 1: to put it. Like, it's messier than that. So if 1099 01:02:48,600 --> 01:02:52,040 Speaker 1: you're gnashing your teeth one way or the other over 1100 01:02:52,120 --> 01:02:55,200 Speaker 1: what's going on in North Carolina right now, it's messier 1101 01:02:55,200 --> 01:02:59,280 Speaker 1: than that. And then there's the networked aspect of all 1102 01:02:59,280 --> 01:03:02,480 Speaker 1: of this. So we're not isolated individuals within our own 1103 01:03:02,480 --> 01:03:05,960 Speaker 1: skulls; we're essentially part of the Matrix. We're all 1104 01:03:05,960 --> 01:03:09,000 Speaker 1: part of that massive battery. And that's not a 1105 01:03:09,040 --> 01:03:10,880 Speaker 1: bad thing. That's one of the things that she drives 1106 01:03:10,880 --> 01:03:12,760 Speaker 1: home, is that we 1107 01:03:12,800 --> 01:03:16,320 Speaker 1: are all networked together, and that's something that, uh, 1108 01:03:16,400 --> 01:03:20,920 Speaker 1: we should pay more attention to and not discard. Yeah. 1109 01:03:21,000 --> 01:03:25,040 Speaker 1: So that, subsequently, out of all of Haraway's ideas, is 1110 01:03:25,080 --> 01:03:28,280 Speaker 1: where we get cyber feminism from. And this is not 1111 01:03:28,440 --> 01:03:31,320 Speaker 1: her term, that's right. And this means that there is 1112 01:03:31,480 --> 01:03:35,960 Speaker 1: no, quote unquote, natural role for a female in society, 1113 01:03:36,000 --> 01:03:38,400 Speaker 1: that we're past that, and that we're already kind 1114 01:03:38,400 --> 01:03:41,320 Speaker 1: of post-human in that respect. And I think 1115 01:03:41,360 --> 01:03:44,000 Speaker 1: Kunzru sums it up nicely in that Wired piece from 1116 01:03:44,040 --> 01:03:47,960 Speaker 1: ninety-seven, quote: Feminists around the world have seized on this possibility. 1117 01:03:48,280 --> 01:03:51,960 Speaker 1: Cyber feminism is based on the idea that, in conjunction 1118 01:03:52,000 --> 01:03:56,000 Speaker 1: with technology, it's possible to construct your identity, your sexuality, 1119 01:03:56,240 --> 01:03:59,440 Speaker 1: even your gender, just as you please. And that is 1120 01:03:59,640 --> 01:04:03,720 Speaker 1: kind of, I think, the appeal of transhumanism. This 1121 01:04:03,800 --> 01:04:06,080 Speaker 1: is the point where I think we're moving from cyborg 1122 01:04:06,600 --> 01:04:11,120 Speaker 1: to transhuman, right, or at least the conceptions are.
Maybe 1123 01:04:11,120 --> 01:04:13,560 Speaker 1: they're the same thing when you get down to it, right? 1124 01:04:13,560 --> 01:04:16,000 Speaker 1: I'd be curious what Haraway's take is on that. 1125 01:04:16,400 --> 01:04:18,400 Speaker 1: But yeah, that's what we're talking about, is really kind 1126 01:04:18,400 --> 01:04:23,680 Speaker 1: of evolving your identity beyond what is considered your natural state. Right. 1127 01:04:23,880 --> 01:04:25,880 Speaker 1: And of course this also ties into the whole area 1128 01:04:25,920 --> 01:04:30,240 Speaker 1: of race and racial identity, um, which certainly 1129 01:04:30,240 --> 01:04:33,919 Speaker 1: falls under that messy category that Haraway, um, laid out 1130 01:04:34,080 --> 01:04:37,760 Speaker 1: in her paper. Because, you know, there are aspects of 1131 01:04:38,000 --> 01:04:41,320 Speaker 1: racial and trans-racial identity that we're very open to 1132 01:04:41,360 --> 01:04:45,360 Speaker 1: exploring in our modern culture. There are other 1133 01:04:45,440 --> 01:04:47,640 Speaker 1: areas that are a lot more taboo, like, for instance, 1134 01:04:47,800 --> 01:04:51,560 Speaker 1: identifying as African American if you are in fact of 1135 01:04:51,640 --> 01:04:55,920 Speaker 1: Caucasian descent. Like, this brings to mind, uh, the story 1136 01:04:55,960 --> 01:04:59,600 Speaker 1: of NAACP leader Rachel Dolezal that came 1137 01:04:59,640 --> 01:05:03,640 Speaker 1: out in recent years. Yeah, the specific example here being 1138 01:05:03,680 --> 01:05:07,320 Speaker 1: that she was born Caucasian to a Caucasian family, 1139 01:05:07,520 --> 01:05:10,760 Speaker 1: but that she was portraying herself as, yeah, and said that 1140 01:05:10,800 --> 01:05:14,560 Speaker 1: she identified as, African American. Uh. Yeah, and this was 1141 01:05:14,600 --> 01:05:18,680 Speaker 1: an idea that basically nobody was comfortable with. Yeah, everybody, 1142 01:05:19,400 --> 01:05:23,160 Speaker 1: the media. And so Haraway, you know, she says, well, 1143 01:05:23,200 --> 01:05:28,240 Speaker 1: everything can be reconstructed between technology and biology, right? 1144 01:05:28,720 --> 01:05:33,120 Speaker 1: So then everything's up for grabs identity-wise, so all 1145 01:05:33,280 --> 01:05:38,320 Speaker 1: basic assumptions about, quote unquote, how things are come into question. 1146 01:05:38,800 --> 01:05:42,880 Speaker 1: So, you know, whether we're talking about identity, ethnicity, gender, 1147 01:05:43,160 --> 01:05:46,320 Speaker 1: all of it is fluid. And here's the thing, right? Like, 1148 01:05:46,320 --> 01:05:48,920 Speaker 1: we see examples like that pop up, or the 1149 01:05:48,920 --> 01:05:52,040 Speaker 1: North Carolina restroom thing pop up, and it's like 1150 01:05:52,080 --> 01:05:55,960 Speaker 1: they seem like they're blips, but we cannot escape this. This 1151 01:05:56,040 --> 01:05:59,640 Speaker 1: is where humanity is heading. And it's just, these are 1152 01:05:59,720 --> 01:06:03,680 Speaker 1: kind of, like, I guess, growing pains along the way. Yeah, 1153 01:06:03,880 --> 01:06:07,320 Speaker 1: I mean, pretty much any 1154 01:06:07,400 --> 01:06:10,640 Speaker 1: kind of transhuman topic makes me think of 1155 01:06:10,800 --> 01:06:15,120 Speaker 1: Iain M. Banks's Culture series. In that setting, the humans 1156 01:06:15,120 --> 01:06:18,400 Speaker 1: of the Culture live these extra-long lives.
There, 1157 01:06:19,040 --> 01:06:23,360 Speaker 1: they're able to consciously, and perhaps subconsciously, administer various levels 1158 01:06:23,400 --> 01:06:25,840 Speaker 1: of pharmaceuticals into their own bodies to meet 1159 01:06:25,880 --> 01:06:28,400 Speaker 1: whatever their needs are. But also, throughout their long lives, 1160 01:06:28,400 --> 01:06:31,160 Speaker 1: they'll change their own gender. 1161 01:06:31,760 --> 01:06:33,720 Speaker 1: They may decide they need wings. They might want to 1162 01:06:33,720 --> 01:06:36,520 Speaker 1: sort of change species. They might want to live in 1163 01:06:36,520 --> 01:06:39,640 Speaker 1: a virtual environment instead of a physical one. And 1164 01:06:39,840 --> 01:06:42,600 Speaker 1: I can't remember a specific example, but 1165 01:06:42,640 --> 01:06:45,800 Speaker 1: I can well imagine an individual in the Culture changing 1166 01:06:45,800 --> 01:06:49,000 Speaker 1: their race and it being no big deal either. But 1167 01:06:49,760 --> 01:06:53,040 Speaker 1: we're not quite there yet, for a number of reasons. 1168 01:06:53,040 --> 01:06:55,080 Speaker 1: Yeah, I don't know, maybe I'm stretching a 1169 01:06:55,120 --> 01:06:57,400 Speaker 1: little too far here, but maybe that will be... We 1170 01:06:57,400 --> 01:06:59,680 Speaker 1: visit this idea often on the show: a couple 1171 01:06:59,680 --> 01:07:01,600 Speaker 1: of hundred years from now, people will look back 1172 01:07:01,640 --> 01:07:04,800 Speaker 1: at us and be like, they were just so uptight. Yeah, 1173 01:07:04,800 --> 01:07:09,760 Speaker 1: they were so stuck on their identities, their singular identities, 1174 01:07:10,600 --> 01:07:14,080 Speaker 1: and now we're all cyborgs. Well put, well put. Now, 1175 01:07:14,560 --> 01:07:16,480 Speaker 1: we don't have time to go into all of this 1176 01:07:16,480 --> 01:07:18,040 Speaker 1: in this episode, but I do want to 1177 01:07:18,080 --> 01:07:21,800 Speaker 1: mention that Haraway's work has been tremendously influential on 1178 01:07:21,880 --> 01:07:25,800 Speaker 1: individuals in a number of different disciplines. She's influenced 1179 01:07:26,040 --> 01:07:31,280 Speaker 1: views on science, economics, computer development, thermodynamics, information theory. So yeah, 1180 01:07:31,320 --> 01:07:32,960 Speaker 1: you can go online and look 1181 01:07:33,040 --> 01:07:36,000 Speaker 1: up cyborg economics, and it is a thing that people 1182 01:07:36,000 --> 01:07:40,680 Speaker 1: have written about rather exhaustively. Yeah, and so, you know, 1183 01:07:40,960 --> 01:07:42,320 Speaker 1: here's how I want to close it out: 1184 01:07:42,360 --> 01:07:47,120 Speaker 1: how we started off. So somebody out there is writing 1185 01:07:47,200 --> 01:07:52,880 Speaker 1: a screenplay right now for DC's Cyborg movie, and I 1186 01:07:52,920 --> 01:07:56,320 Speaker 1: think that there's a lot of potential there.
I don't know. 1187 01:07:56,400 --> 01:07:59,160 Speaker 1: Knowing what I know about superhero movies, and in particular 1188 01:07:59,200 --> 01:08:01,640 Speaker 1: Warner Brothers superhero movies, 1189 01:08:01,640 --> 01:08:03,400 Speaker 1: I don't have a lot of high hopes that they're going 1190 01:08:03,400 --> 01:08:06,720 Speaker 1: to particularly address Donna Haraway's themes, for instance, in the 1191 01:08:06,800 --> 01:08:10,000 Speaker 1: Cyborg movie. But hey, if you're listening and you're working 1192 01:08:10,040 --> 01:08:13,880 Speaker 1: on the Cyborg movie, maybe think about the fluidity of 1193 01:08:13,960 --> 01:08:18,200 Speaker 1: identity that Cyborg could have, or the flow of information. 1194 01:08:18,280 --> 01:08:20,400 Speaker 1: That seems to me like something that they'll probably tap 1195 01:08:20,439 --> 01:08:24,240 Speaker 1: into; they're, like, real excited about the idea of humanity 1196 01:08:24,600 --> 01:08:28,840 Speaker 1: connecting with machines and using information as, like, power in 1197 01:08:28,880 --> 01:08:31,559 Speaker 1: a way. Yeah, I mean, correct me if I'm wrong, 1198 01:08:31,600 --> 01:08:34,960 Speaker 1: but the character is African American, right? Is that explored 1199 01:08:35,000 --> 01:08:36,800 Speaker 1: at all in the comics, like the idea that 1200 01:08:36,840 --> 01:08:40,640 Speaker 1: the transformation of... Really, I don't remember it 1201 01:08:40,680 --> 01:08:46,120 Speaker 1: ever being explored like that, not specifically ethnically, no. 1202 01:08:46,600 --> 01:08:50,840 Speaker 1: There is a point where he transcends being human and 1203 01:08:50,880 --> 01:08:54,280 Speaker 1: he sort of becomes like the T-1000; he 1204 01:08:54,320 --> 01:08:58,599 Speaker 1: turns into this, like, gold liquid metal. And that, 1205 01:08:58,800 --> 01:09:01,680 Speaker 1: I think, is maybe somebody saying, like, 1206 01:09:01,720 --> 01:09:04,519 Speaker 1: oh well, the natural extension for the cyborg thing would 1207 01:09:04,520 --> 01:09:08,320 Speaker 1: be that it would be beyond the identity 1208 01:09:08,560 --> 01:09:11,280 Speaker 1: of being African American, or even of being human. Right. 1209 01:09:11,600 --> 01:09:14,519 Speaker 1: But he was so recognizable in the other form that 1210 01:09:14,520 --> 01:09:17,240 Speaker 1: they brought him back around to the form 1211 01:09:17,320 --> 01:09:19,360 Speaker 1: that will be in the movie. You gotta put something 1212 01:09:19,360 --> 01:09:23,200 Speaker 1: on the comic book cover. Oh yeah, yeah, exactly. 1213 01:09:23,240 --> 01:09:25,040 Speaker 1: I wonder if he's going to go into outer space, 1214 01:09:25,120 --> 01:09:28,479 Speaker 1: if there's gonna be a little nod to Clynes and Kline there 1215 01:09:28,560 --> 01:09:31,320 Speaker 1: with the outer space stuff. And then, you know, surely 1216 01:09:31,920 --> 01:09:35,160 Speaker 1: I would imagine they would talk about the ethics 1217 01:09:35,240 --> 01:09:39,320 Speaker 1: surrounding it, how he will be treated. What I'm concerned about, though, 1218 01:09:39,360 --> 01:09:41,599 Speaker 1: is that it's going to end up being like all 1219 01:09:41,640 --> 01:09:44,559 Speaker 1: that nineteen-eighties cyborg fiction we grew up with, which 1220 01:09:44,600 --> 01:09:49,639 Speaker 1: is just, you know, agonizing over humanity. Is he human anymore? 1221 01:09:49,800 --> 01:09:52,080 Speaker 1: Is he not?
And a gun comes out of his legs, 1222 01:09:52,240 --> 01:09:55,840 Speaker 1: right, exactly. All right, well, we will see, we will see. Yeah, yeah, 1223 01:09:55,840 --> 01:09:57,559 Speaker 1: I hope they incorporate some of those ideas; that would 1224 01:09:57,560 --> 01:09:59,920 Speaker 1: be very cool. Well, you know, 1225 01:10:00,200 --> 01:10:03,000 Speaker 1: I think we've done a pretty good job of tackling 1226 01:10:03,160 --> 01:10:08,639 Speaker 1: the theory, the philosophy of cyborgs. We maybe didn't get 1227 01:10:08,640 --> 01:10:12,160 Speaker 1: into the technology that you might be interested in, but, 1228 01:10:12,360 --> 01:10:14,680 Speaker 1: you know, out there, let us know 1229 01:10:14,720 --> 01:10:18,559 Speaker 1: what you know about cyborgs that we missed. You know, 1230 01:10:18,600 --> 01:10:22,960 Speaker 1: what do you think about Haraway and cyberfeminism, or 1231 01:10:23,000 --> 01:10:27,160 Speaker 1: the ethics surrounding this, the Black Mirror-style 1232 01:10:27,280 --> 01:10:32,040 Speaker 1: android-as-wife scenario? Yeah, indeed. And one thing I 1233 01:10:32,040 --> 01:10:34,840 Speaker 1: would love to hear, first of all: I would 1234 01:10:34,840 --> 01:10:37,000 Speaker 1: love to hear people take what we've talked 1235 01:10:37,000 --> 01:10:39,559 Speaker 1: about in this episode and apply that to, like, a 1236 01:10:39,680 --> 01:10:42,720 Speaker 1: bad nineteen-eighties cyborg and give us a, like, 1237 01:10:42,880 --> 01:10:47,559 Speaker 1: nice, intelligent read on that simple character. Or likewise, if 1238 01:10:47,600 --> 01:10:50,120 Speaker 1: you can think of an example of a really intelligent 1239 01:10:50,200 --> 01:10:54,719 Speaker 1: treatment of cyborgs in fiction that does tie into this material, 1240 01:10:55,080 --> 01:10:58,200 Speaker 1: I would definitely want to hear about that. Man, 1241 01:10:58,200 --> 01:11:01,120 Speaker 1: there was a missed opportunity when they remade RoboCop; 1242 01:11:01,439 --> 01:11:03,600 Speaker 1: they had a lot of opportunity to dive into some 1243 01:11:03,640 --> 01:11:05,600 Speaker 1: of this stuff, but they just kind of remade it. 1244 01:11:05,640 --> 01:11:07,679 Speaker 1: Oh yeah, I haven't seen that one yet. It's not bad, 1245 01:11:07,720 --> 01:11:10,960 Speaker 1: but it's just, you know, it's pretty much just a remake. 1246 01:11:11,000 --> 01:11:12,639 Speaker 1: And then they have more CGI, so there's 1247 01:11:12,640 --> 01:11:14,840 Speaker 1: a lot more crazy gunplay. Because I remember the 1248 01:11:14,840 --> 01:11:17,360 Speaker 1: original explored it a little bit, the whole bit. Maybe 1249 01:11:17,400 --> 01:11:19,000 Speaker 1: it was in the sequel where they were like, oh, well, 1250 01:11:19,000 --> 01:11:21,400 Speaker 1: this isn't him, this is just a tribute 1251 01:11:21,439 --> 01:11:24,240 Speaker 1: to him. Yeah, yeah, yeah, totally, that wasn't there. So hey, 1252 01:11:24,280 --> 01:11:25,519 Speaker 1: you want to get in touch with us? Pull that 1253 01:11:25,600 --> 01:11:28,360 Speaker 1: cybernetic enhancement out of your pocket. Head on over to 1254 01:11:28,360 --> 01:11:30,120 Speaker 1: Stuff to Blow Your Mind dot com. That's where you'll 1255 01:11:30,120 --> 01:11:32,920 Speaker 1: find all the episodes.
You'll find videos, you'll find blog posts, 1256 01:11:32,960 --> 01:11:37,200 Speaker 1: you'll find links out to our social media accounts like Twitter, Tumblr, Facebook, 1257 01:11:37,400 --> 01:11:41,280 Speaker 1: and Instagram. And you could use the old-fashioned cybernetic 1258 01:11:41,280 --> 01:11:43,960 Speaker 1: way of getting in touch with us: write us an email. 1259 01:11:44,160 --> 01:11:46,439 Speaker 1: You could do it on a desktop computer or a 1260 01:11:46,439 --> 01:11:49,880 Speaker 1: mobile device, or maybe with your mind. We are at 1261 01:11:50,160 --> 01:12:02,439 Speaker 1: blow the mind at how stuff works dot com. For 1262 01:12:02,560 --> 01:12:04,880 Speaker 1: more on this and thousands of other topics, visit 1263 01:12:04,960 --> 01:12:13,040 Speaker 1: how stuff works dot com.