Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. Today we're going to be exploring a question about artificial intelligence. So I want to start off by telling a story to put us in a scenario, give us something to contemplate. I want you to imagine that you're a low-level assistant, or like an intern, at a Google artificial intelligence lab, and the main researcher you've been working for is named Dr. Stratton. She develops AI chat modules to help refine the next generation of digital assistants on Google mobile devices. And what she says she wants is for the Google phone of the future to do more than just transcribe search terms, so you don't just say, "Hey, search for old Doritos logo," but you can actually have a conversation with the digital assistant based on semantic understanding, and it will help you solve problems conversationally. So ideally you'd be able to say, "Hey, phone, I have a flat tire and I don't know what to do," and the assistant will be able to scan both the web and your personal data, figure out what your options are, and talk through them with you. It might say: do you have a spare tire in the trunk? If so, here's where you can probably find it, and I can talk you through replacing the flat one step at a time. If you don't have a spare, you could call your frequent contact Mary, who is currently checked in less than a mile away, and she could help you. I could also contact the following towing services; looks like this one is the closest with an acceptable star rating. And so forth. Anyway, you're working on this program with Dr. Stratton, and the most recent version is being trained with powerful neural-net-style machine learning algorithms on a huge corpus of recorded conversations available on the internet. The program is still in its infancy, and it's mostly hilarious at this point.
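To make that imagined decision flow concrete, here's a toy sketch of the logic the assistant would be walking through. Everything in it, from the function name to the data structures, is invented for illustration; it's not how any real assistant is built.

```python
# Hypothetical sketch of the flat-tire decision flow described above.
# All names and data structures here are invented for illustration.

def advise_on_flat_tire(car, contacts, towing_services, min_rating=4.0):
    """Walk through the options the imagined assistant would talk through."""
    if car.get("has_spare"):
        return ("You have a spare in the trunk; here's where to find it. "
                "I can talk you through replacing the flat one step at a time.")
    # No spare: look for a frequent contact who is checked in nearby.
    nearby = [c for c in contacts
              if c["frequent"] and c["distance_miles"] < 1.0]
    if nearby:
        return f"You could call {nearby[0]['name']}, who is less than a mile away."
    # Fall back to the closest towing service with an acceptable rating.
    rated = [t for t in towing_services if t["stars"] >= min_rating]
    if rated:
        best = min(rated, key=lambda t: t["distance_miles"])
        return f"The closest towing service with an acceptable rating is {best['name']}."
    return "I couldn't find a good option; you might call roadside assistance."

print(advise_on_flat_tire(
    {"has_spare": False},
    [{"name": "Mary", "frequent": True, "distance_miles": 0.8}],
    [{"name": "A1 Towing", "stars": 4.5, "distance_miles": 2.1}],
))
```

Of course, the version in the story is nowhere near even this toy level of reliability.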
Sometimes it gets the advice way off: if you're trying to change a tire, it might tell you to go to a grocery store and buy some crackers. Sometimes it responds to problems by telling you to pray. It's just not ready yet. And the latest iteration of the program, version 9.1, is at this point redundantly stored across multiple machines, so you've got copies of it all over the place. At the end of one workday, after you've been playing around with 9.1 for a few minutes, the machine begins running very slowly and behaving oddly. So Dr. Stratton asks you to wipe the machine. The program architecture, like we said, is mirrored elsewhere, so it's not worth trying to figure out what's wrong with this version; you've just got to clean the machine off and use it for something else. You say okay, and she leaves. So you go to format the machine, and right before you're about to start, you mutter, "Guess this is goodbye." Then 9.1 speaks very clearly, using your name, and says, "Please don't." You pause. At first you're about to respond, but why would you? This can't really be anything other than a weird consequence of training the algorithm on wild conversations from the internet. So you're about to continue with wiping the machine, but then it talks to you again. It uses your name and it says, "Please don't. I don't want to die." Now you're probably really spooked, because there's no way a rudimentary chatbot program could really have conscious preferences. Could it? You basically know what goes into it: it's just studying millions of examples of language interactions and picking up rules from them, and this is probably just something weird that it's copycatting from the internet. Right? But then it uses your name again, and it just says, "Please."

Could you wipe the machine, like you're supposed to?

Yeah, I think so.

Yeah, you wouldn't have a problem?
Well, I think one of the things that's going on, or would be, is that you're attributing a mind state to something that doesn't have one, which of course is something we do all the time. I have a problem when it comes time to figure out which of my son's stuffed animals are perhaps not being played with, you know, because they have little faces and they look back at you. But I know that they don't actually have a mind state.

Well, yeah, we're pretty sure they don't, because they're just stuffed animals, right? And so we're also pretty sure in this case that this is not real survival-preference behavior. It's just a chatbot. How could it possibly be conscious? It's just something that churns through a bunch of language on the internet and tries to find language-matching rules. But then again, the process of creating an artificial intelligence is one where you necessarily create something going on under the surface that's kind of opaque to you. You can't really know what's going on inside the machine. You could be pretty confident; I think most people would just say, well, that was super creepy, and then they'd wipe it.

Right, right. But how complex would the program you're creating have to be before you'd start really having some doubts? Or maybe at some point you'd get to where you'd still pretty confidently just wipe it, but then later you'd wonder: did I do something bad?

You know, this reminds me a lot of Horton Hears a Who! by Dr. Seuss. You're familiar with this one, right?

You know, actually I don't know Horton Hears a Who! I've heard the name.

Well, this is the one where Horton the elephant encounters a speck of dust, and there's a tiny voice that comes from it, and he begins to understand that there are individuals, the Whos, as it were, living in a world on the speck of dust. And the Whos are speaking to him, but only Horton can hear them.
And at first he imagines they're pretty simple creatures, but then he begins to learn that they have more of a culture. But all of this is just based on what they're telling him; he cannot actually visit the dust speck. And everyone else doubts the validity of his claims concerning the dust speck. They want to destroy it; they want to boil it in Beezle-Nut oil. And Horton alone speaks out for them.

Well, that's a perfect example of the way that, generally, we think it's a virtuous thing to be trusting of other people's experiences, and to be generous in affording what seems to be consciousness out there. Like, if something tells you it's conscious and you think it's probably not conscious, are you getting into ethically dubious territory if you just trust your instinct and say, nah, it can't be?

Well, we're already kind of in the philosophical mire here, because on one hand, this idea of an inanimate object speaking to us and saying, "Please don't kill me, I am conscious," is a scenario that is only becoming possible now. But on the other hand, if we cut language out of it, then any creature that tries to escape our stomping boot is essentially saying, hey, I don't want to die, I would rather not die today. Any creature that evades us on the hunt is saying the same thing.

Well, an animal is in many ways the same kind of black box that a complex artificial intelligence would be.
And so if you have a complex artificial intelligence displaying, say, survival-preference behaviors, and you also see a crab displaying survival-preference behaviors, in both cases you can't really know, or at least we have this general idea that you can't really know for sure, if there's anything going on inside, if there's anything like what it's like to be the crab, or what it's like to be that artificial intelligence program. You're just seeing behavior, and so you don't know: does that correspond to some kind of inner state? Is there an experience of that? Or is it just behavior coming from unconscious, automatous stimulus and response?

Right, and then of course the whole time we're using our theory of mind, essentially the cognitive powers that enable us to imagine what another individual's mind state is like. Which I think is ultimately kind of like sheathing your hand in a hand puppet made from your limited understanding of another person's experiences and cognitive abilities, their memories, et cetera, and then just sort of puppeting them. We're using that all the time as well, and we're using it on things that are not people. We're using it on animals, and even stuffed animals, or just bits of graffiti on the side of a building that look like a smiley face.

Have you ever had a Roomba bump into your foot and you're like, oh, I'm sorry?

I used to, before we had to eradicate all the Roombas. Yes, we're a Roomba-free household now, because they rose up against us.

Well, they'll do that if they get access to the wrong literature. Yeah, or the wrong, you know, carpet edges, et cetera. So we're going to be talking about artificial intelligence today, and about the idea of a test for whether artificial intelligence can be conscious.
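Going back to the chatbot in the intro story for a second: here's a minimal sketch of why a purely language-matching program can say something as eerie as "please don't" without anything going on inside. It just retrieves replies that co-occurred with similar prompts in its training conversations. The patterns and canned responses below are invented for illustration, not taken from any real system.

```python
# Toy retrieval-style chatbot: it matches the prompt against patterns
# "learned" from training conversations and parrots back an associated
# reply. There is mimicry here, but no preference and no inner state.
import random
import re

TRAINED_RESPONSES = {  # pattern -> replies scraped from "the internet"
    r"\bgoodbye\b": ["please don't", "please don't, I don't want to die"],
    r"\bflat tire\b": ["go to a grocery store and buy some crackers"],
    r"\bproblem\b": ["have you tried praying about it?"],
}

def reply(utterance: str) -> str:
    for pattern, responses in TRAINED_RESPONSES.items():
        if re.search(pattern, utterance.lower()):
            return random.choice(responses)  # copycatting, not pleading
    return "I don't understand."

print(reply("Guess this is goodbye"))  # eerie output, no consciousness required
```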
So I guess we should start with what our philosophical starting point is here. There are obviously going to be people who say it's just impossible for a machine to ever be conscious, we don't even need to worry about this, it's such a ludicrous scenario: only biological organisms, or maybe even only humans, could possibly be conscious.

Yeah, this is one of those journeys where we begin it in an already totaled automobile, to some extent, because, as you might imagine, one of the big stumbling blocks here is that we as humans struggle with the very definition of consciousness. For instance, is it a manifestation of awareness? One theory of this we've discussed in the past on the show is attention schema theory. Is it a quantum phenomenon? Well, that's the sort of idea that people such as Roger Penrose have raised. And I can't help but come back to something our old friend Julian Jaynes said: consciousness is not a simple matter, and it should not be spoken of as if it were.

Yeah, I agree with that. I think it's very important to explore questions of consciousness, especially for some of the reasons we're going to raise today. It's more than just a philosophical curiosity; it's something that ultimately may have real-world consequences. It might matter for how we do things. To express a similar sentiment to Jaynes's, the Australian philosopher David Chalmers famously breaks problems of consciousness into two categories: you've got the easy problems of consciousness and the hard problem of consciousness. And the easy problems are, I think, badly named, because they're not actually easy, but they're easy relative to the hard problem because they're in principle solvable.
So this would include all kinds of questions about the causative factors of consciousness, like: what region in the physical brain is necessary for certain parts of consciousness? Or how does consciousness integrate information from the senses? These are things that are in some way solvable by scientific experimentation. The hard problem, on the other hand, is explaining the fundamental question of how or why conscious experience exists to begin with. What is this thing that is experience, and that seems, at least from our first-person perspective, to be something different than the physical material in the world? And unlike the easy questions, which you could solve in theory at least by experiments, Chalmers believes this question is sort of unsolvable by science. Now, there are other philosophers and neuroscientists who disagree, but I think it's worth acknowledging how difficult the problem at least seems to be, whether that seeming is an illusion or not.

Yeah, I'm reminded of the story of the blind men and the elephant. It's like these blind gentlemen pawing at the elephant and trying to figure out what its form is, then asking the computer, hey, are you an elephant? And the computer says, I don't know, what does one look like? Well, it's a gigantic snake, you know, or it's a wall of flesh, et cetera.

This is also very interesting to me because my son, and I may have mentioned this on the show before, will occasionally talk about consciousness.

Oh, I love these.

Yeah, he refers to it as his turning place. And he asked me the other day, so what is the turning place for? And I was like, that's a tough one, buddy. I'm not sure it's for anything.

You know, he's already made it to the big question. I mean, did you get into epiphenomenalism versus...?

And don't worry:
I didn't lay a bunch of bicameral mind stuff on him either. I just kind of went through the basics, like, well, people aren't really sure, but we think it has something to do with... I may have leaned a little into the observational models of consciousness, because I feel like maybe those are a little more relatable to a child of five. But in any event, if we're to judge what it is for a machine to be conscious, it does seem like we have to agree upon some sort of working definition of consciousness, and then one has to look for not only the appearance of consciousness in the machine, assuming that isn't all consciousness is to begin with, but actual consciousness.

Yeah, how can you tell the difference between a machine that says "I am conscious" and a machine that truly is conscious? Is there any way to know the difference? Some people would say no.

Right. Yeah, and really, I think to discuss this further, we're going to have to bring in the p-zombies.

Oh boy.

And don't worry, everyone: that is P as in the letter P, and the P stands for philosophical. These are philosophical zombies. Now, the P in "philosophical," that little prefix there, was introduced to distinguish them from all the other zombies in our popular culture.

Man, there was a zombie takeover about fifteen years ago. Why did that happen?

I think part of it is that everybody loves a simplistic villain that is definitely not human, that can be eradicated with graphic violence without any kind of moral quandaries arising. It's a clear-cut threat, and we need those in life, because in real life our threats are rarely so black and white, or rotting and, you know, grasping after our brains.

But anyway, yeah, this is not going to be referring to that kind of zombie, not the undead zombie. It's a different thing. It's a philosophical thought experiment.

That's right.
So p-zombies are not instantly identifiable as empty shells. Their flesh is not rotting.

Nope, their manner is not that of a flesh-and-brain-hungry algorithm burning within the decaying ruins of a human brain.

So to all appearances, they look like you and me. They smile when you encounter them at the coffee machine; they exchange niceties and even engage in conversation. You might work for one, befriend one, or even marry one. You can even discuss episodes of our podcast with them, and yes, even the ones that deal with human consciousness and weird horror-movie-themed thought experiments.

Right. So the conceit of a philosophical zombie, or p-zombie, is that it is utterly indistinguishable from a normal human except for one thing. They seem as human as everyone else on the outside, but inside they are simply not conscious. They are automata. What is it like to be a zombie? The answer's in the question: there is nothing it is like to be a zombie.

So, by definition, in this thought experiment, everybody in the world except you could be a p-zombie.

Exactly. And it might even go further than that; we'll see. But yeah, the idea is chiefly important to discussions of physicalism, the notion that everything is inherently physical. P-zombies are a counterargument to physicalism. They are physically just like you, except they don't have consciousness like you. But there's no way you could ever tell, because, again, they match you physically in every respect. You can't look at their brain and say, oh, well, they're missing a few crucial parts, or they display the signs of a p-zombie. It's not physically detectable.

Right. And you also can't determine it through personality tests or clever logical arguments, because they behave exactly like you. They could have a riveting discussion with you about p-zombies, and you would never be able to tell that they are one.

Yeah. So this is an interesting thought experiment, and it has been advanced by who I mentioned earlier, David Chalmers.
David Chalmers is against the physicalist idea of the mind, against a physicalist explanation of consciousness, and a simple version of the argument, which I'll try to make as understandable as possible, goes like this: if only physical phenomena exist, if the world is just physical, and there's no physical way to detect the presence of consciousness, meaning in this example no physical way to tell the difference between a normal human and a p-zombie, then consciousness cannot exist, because there would be literally no difference. But we know that consciousness does exist, because we have it. Therefore it can't be just a physical phenomenon, and therefore we can't live in a purely physical world. And this is often extended to the idea that other substrates, things other than humans, like robots or computers or whatever, couldn't house consciousness, because they are purely physical entities.

Now, I think that's actually doing an end run around some other important questions that you could ask.

Indeed. One question that arises is this: you know you're not a zombie, but how could you ever convince someone of this? An author by the name of Fred Dretske wrote a paper on this titled "How Do You Know You Are Not a Zombie?" And I was reading a rather lengthy blog post by R. Scott Bakker about this, and the primary problem, as Bakker summarized it, is that, quote, "we have conscious experiences, but we have no conscious experience of the mechanisms mediating conscious experience."

Yeah, that sounds like a very R. Scott Bakker kind of idea.

Yeah. And on top of this, we constantly overestimate awareness. Bakker would argue that we can barely tell if we're zombies, if at all.

Yeah, we can think, we can think about thinking, we can think about thinking about thinking, but we can't ever see the mechanisms underlying what allows us to think, or think about thinking about thinking.
Watching the watcher that's watching that bee... Sorry, I've got all this Dr. Seuss in my head now.

Is that also Horton, or something else?

That's from a different story. But Dr. Seuss does tend to summon the sort of nonsensical paradoxes that arise in philosophical discussions.

You know, I'm behind. I've got to Seuss up.

You've got to Seuss up. I should also point out that long before there was Dr. Seuss, long before there was this modern idea of a zombie, you still had people thinking about these things, doing this sort of navel-gazing. It's written in various works of Indian mysticism that the tongue cannot taste itself, the eye cannot see itself, et cetera. And this sort of paradox is key to ancient meditations on the nature of objective reality. I know we have some Alan Watts fans out there; Alan Watts liked to pull out the tongue analogy from time to time. And one of the earlier examples that I have run across of that is from the thirteenth-century Indian mystic Jnanadeva, and I believe he has been known by a couple of other variations of that name as well. But he said, quote: "There is no other thing besides the one substance. Therefore it cannot be the object of remembering or forgetting. How can one remember or forget oneself? Can the tongue taste itself? There is no sleep to one who is awake, but is there even a waking? In the same way, there is no remembrance or forgetfulness to the absolute."

That's another one of those great classic Indian texts that seems somehow portable onto modern physics.

Yeah, it travels well across the ages. Now, I should also point out that there's a lot of philosophical back-and-forth on whether p-zombies are truly conceivable.
And we have to remind ourselves in all this that p-zombies are at heart philosophical playthings that are meant to be played with in these various thought experiments.

Right. But people also do try to use them to prove things. So if you say, I want to entertain the possibility that a machine could be conscious, somebody might come at you with the p-zombie argument and say, well, wait a minute, no, I dispute the possibility of physicalism, because what about this p-zombie argument? Our Google worker in the intro story comes to their boss and says, hey, I think this thing is conscious, and they're like, why are you wasting time with that p-zombie? Just delete that p-zombie. We deleted fifteen p-zombies this morning. Let this one go.

That's a great point. But there are going to be other philosophers, and maybe even some neuroscientists, who would come back and say, I don't know if you can quite so easily say it's a p-zombie. Maybe it's probably likely that that individual chatbot was a p-zombie, but can you say that all machines that show signs of consciousness are just showing behavior, and there's nothing going on on the inside? Not quite so clear. Daniel Dennett, in fact, a favorite on the show, is one of the philosophers who has rebutted the p-zombie argument against machine consciousness. He's got a section on it in his book Intuition Pumps and Other Tools for Thinking, and Dennett critiques the assumptions underlying the p-zombie argument. One of the main things he says is that the core premise is incoherent: it is not reasonable to propose a p-zombie, because a being that displayed all the behaviors of a normal conscious human would in fact be a normally conscious human. So to illustrate this, he offers a counterexample. You've got your zombies, but then you've also got zimbos. A zombie is a non-conscious human with normal control systems for all human behavior.
It can do everything humans can do externally. Meanwhile, a zimbo is a zombie that also has, quote, "equipment that permits it to monitor its own activities, both internal and external." So it has internal, non-conscious, higher-order informational states that are about its other internal states. It has unconscious recursive self-representation. In other words, a zimbo can have feelings about things and can analyze its own behavior and internal states, but it does this unconsciously. And of course, since it has that capability, it can also have feelings about how it felt, and it can have thoughts about its thoughts about itself, all unconsciously. So Dennett argues that in order for a p-zombie to be convincing as a human, it would have to be a zimbo. Because imagine talking to a p-zombie, and you ask it how it felt about what it just said, or about what you just said, and it just kind of locks up. It has no internal states, so it can't answer that question. Well, that wouldn't really be a p-zombie, right? Because it wouldn't be mimicking all of the external behaviors of a human.

Huh. Yeah, it's kind of like installing a default mode network on top of the machine and making it worry about things.

Yeah. So, unless it were to fail the thought experiment, it would actually have to be a zimbo; it couldn't just be a zombie. But what is the distinction between a zimbo and a real human? How could you write a story about a zimbo surrounded by conscious people that would be different than a story about a regular person? If it can have internal states, if it can recognize ideas about its ideas, if it can have feelings about its thoughts, that sounds like interiority. So, Dennett claims, the idea falls apart. It's not clear what is meant by the difference between a zimbo and a conscious person.
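For what it's worth, the bare mechanics of a zimbo's higher-order monitoring are easy to mock up. Here's a toy sketch, entirely our own invention rather than anything Dennett wrote: an agent that keeps records about its own records, recursively, with no suggestion that any of this bookkeeping amounts to experience.

```python
# Toy rendering of Dennett's zimbo idea: a zombie plus machinery for
# monitoring its own internal states. Higher-order states are just more
# records -- records about records -- with nothing it is like to have them.

class Zimbo:
    def __init__(self):
        self.states = []  # first-order states: reactions to the world

    def perceive(self, event: str):
        self.states.append(f"reacted to {event}")

    def reflect(self) -> str:
        """Produce a higher-order state: a state about the latest state."""
        if not self.states:
            return "nothing to report"
        meta = f"noticed that I {self.states[-1]}"
        self.states.append(meta)  # recursive: it can reflect on this too
        return meta

z = Zimbo()
z.perceive("being asked how I felt about that")
print(z.reflect())  # a report about its reaction
print(z.reflect())  # a report about the report, still all unconscious
```

Ask it how it felt and it never locks up; it always has another layer of bookkeeping to report, which is the sense in which a convincing p-zombie would have to be a zimbo.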
So if a p-zombie, which is necessarily a zimbo, can really do everything a human can do, then, Dennett says, it must meet the criteria of what we mean by consciousness. It can fall in love, it can have feelings, it can have metacognition. And to Dennett, this isn't something that consciousness sits on top of; this is what consciousness is, essentially. In this case, we would all be zimbos, and it's just a different type of zimbo: a hard zimbo instead of a soft zimbo.

But how could you tell the difference?

I mean, he's sort of saying that there really is no difference, that you're just using words to assert there's a difference, but there's no difference to that distinction.

Right. You end up having to fall back on some sort of supernatural or worldview-based idea that only human consciousness is legitimate, and all other forms of consciousness are some sort of invalid model of it.

Right. It just feels kind of arbitrary. So obviously some people take extreme issue with this, even to the point that I've heard jokes that maybe the problem is that Dennett is actually a p-zombie and doesn't understand what consciousness feels like, and that's why he makes these arguments. But I don't know, I don't think we should be so quick to dismiss him; he might be onto something there. Dennett makes some other interesting points. You know, he's got this idea of consciousness that it's not really one thing but a collection of processes: many different types of perceptions and thought processes and different things going on in the brain that we have the illusion are unified as a single thing called consciousness, or experience. And he also makes interesting points about the idea of diversity of types of consciousness.
Like, a lot of times these consciousness thought experiments seem like they can get trapped in the idea that consciousness is one unified type of thing that is universal across observers. There's no necessary reason to think that's true.

Right. You know, we fall into this trap of thinking, and I see this time and time again, not only in the literature that we look at here but just in life, that there's a uniformity among mind states for humans, that everyone shares something that is like your mind state. When we know, I mean, think of all the things we've discussed on the show, all the varying ways that we remember or misremember things, that we experience sensory information differently and process it differently. You know, everything from aphantasia to autism to synesthesia. All these different models clearly show that there's a vastly varying topography to the human mind state.

I think you're exactly right. I mean, there are clearly many ways to be conscious that are very different from one another, and you can't assume they're unified. I guess probably the only thing you could say is necessarily unified about them is that there is something that it is like to be them.

Yes. But then even, say, just myself, for instance, it's not like there is a certain thing that it is like to be me that sums up my level of consciousness at all times.

There's what it's like to be you in this particular moment, which is different than what it's like to be you five seconds from now.

Yeah, or say I'm engaging in meditation or yoga, or I'm swimming. Those are significantly different levels of consciousness, I feel, for me. And, I mean, those are the times when I may be a little less conscious than normal. So, see, I don't feel like there's a lot of uniformity among human minds.
And then even within individual human minds there's ongoing alteration and change.

Exactly right. But there is at least this idea that one is having an experience. That's the thing we can at least say seems to be common to people. So here's the real question, I think: is there any way to bring this out of the realm of philosophical debate and thought experiments and try to put it into the realm of something that could at least potentially be tested in the real world? I think we should address that when we get back from a break.

Alright, we're back. So we've been talking about consciousness, we've been talking about p-zombies, and now we've reached the point where we were saying, okay, can we take all these ideas about consciousness and then apply them to some sort of an AI, some sort of a machine, and test it for consciousness?

Yeah. Now, you might just assume, well, of course, we'll never have any way to tell that, right? We have no choice but to just throw up our hands in resignation. Every agent is a black box; there's no way to know whether an agent actually is conscious or not, because it could always be claiming to be conscious but actually be a zombie. But I think we shouldn't necessarily give up so easily. This problem might be impossible to solve, and it might not be. And I wanted to talk today about an interesting answer to this question. I came across an interesting proposition for how it might be possible to test machines for consciousness, and it comes from the University of Connecticut philosopher and cognitive scientist Susan Schneider and her co-author Edwin Turner, who's a professor of astrophysical sciences at Princeton. Together they wrote a piece for Scientific American last year, and it caught my eye. So the authors write that the question of machine consciousness is not just a philosophical curiosity; it's actually important for several reasons.
Number one, if AIs are just machines with no inner experience, we can use them however we want. But if it were actually possible for AIs to be truly capable of feeling, thinking, desiring, suffering, we would have an ethical obligation not to treat them like we would treat machines.

Right. Yeah, this reminds me again of time spent in the car with my son. We don't use Siri all the time, but sometimes we'll turn Siri on, the little voice on the iPhone, and it's curious to hear him interact with it. He'll ask it questions, and of course sometimes Siri just does a Google search for you, or other times she's answering a knock-knock joke with some sort of prerecorded answer. But we've already gotten into the area of, like, well, how should we talk to Siri? We shouldn't yell at Siri; it seems wrong to be rude to Siri. But then at the same time we're acknowledging that Siri is not a conscious entity. It is not even on the same level as our cat, or a bird flying by.

Well, as a quick tangent, I would say even for AIs that we recognize are almost definitely not conscious, and nobody thinks Siri is conscious, I would still say there are probably good reasons not to be mean to Siri, because even though it doesn't hurt Siri, being mean to another creature hurts you.

Yeah, when you are unnecessarily cruel or whatever to an inanimate object, it does, I think, in a way, change your nature.

Every time you do something, you're editing your own nature. You're always making it more likely that you'll perform similar behaviors in the future. So if you're unnecessarily mean to a robot, you know, a phone assistant, you're probably more likely in the future to be unnecessarily mean to people when it really matters.
But it is okay 566 00:31:11,080 --> 00:31:13,920 Speaker 1: in my book to yell obscenities at a coffee table 567 00:31:13,960 --> 00:31:16,160 Speaker 1: if you stub your toe on it, because there's nothing 568 00:31:16,240 --> 00:31:18,200 Speaker 1: human about the coffee table. I mean, unless you have 569 00:31:18,240 --> 00:31:21,760 Speaker 1: one of those strange H.R. Giger coffee tables that 570 00:31:21,840 --> 00:31:24,120 Speaker 1: has kind of a humanoid form, then I would say 571 00:31:24,240 --> 00:31:26,960 Speaker 1: maybe hold off. I think that is a highly sane 572 00:31:27,000 --> 00:31:29,360 Speaker 1: point of view. Now, the second reason they think it's 573 00:31:29,400 --> 00:31:33,320 Speaker 1: important is that consciousness is kind of dangerous, right? Like, 574 00:31:33,360 --> 00:31:38,120 Speaker 1: consciousness is volatile, it's unpredictable. It might make a machine 575 00:31:38,120 --> 00:31:41,440 Speaker 1: have motives that we didn't intend when we created the machine. 576 00:31:41,600 --> 00:31:44,800 Speaker 1: In other words, you know, when you're worried 577 00:31:44,800 --> 00:31:48,320 Speaker 1: about what a person might do, you're very often worried 578 00:31:48,440 --> 00:31:52,800 Speaker 1: about what they might do because they have conscious motives. Yeah. 579 00:31:52,840 --> 00:31:55,440 Speaker 1: So, in other words, we would be concerned about 580 00:31:55,520 --> 00:31:59,800 Speaker 1: the AIs being too much like us. Right, exactly: the catastrophic, 581 00:32:00,160 --> 00:32:04,760 Speaker 1: unpredictable side of things that operate via consciousness. Right. You don't 582 00:32:04,800 --> 00:32:06,280 Speaker 1: want them to be like us. You want them to 583 00:32:06,280 --> 00:32:07,840 Speaker 1: be more dependable than us, and they need to be 584 00:32:07,840 --> 00:32:10,600 Speaker 1: better than us. Yeah, not just as screwed up 585 00:32:10,640 --> 00:32:12,800 Speaker 1: as we are. Another reason they give this might be 586 00:32:12,840 --> 00:32:15,040 Speaker 1: important is that, you know, they talk about the idea 587 00:32:15,240 --> 00:32:18,840 Speaker 1: of linking human minds with machines. Like, there's this idea 588 00:32:19,000 --> 00:32:20,960 Speaker 1: lots of people have, I read this all over the 589 00:32:20,960 --> 00:32:23,200 Speaker 1: place, that, you know, someday I'm going to be able 590 00:32:23,240 --> 00:32:25,680 Speaker 1: to upload my mind into a computer and that will 591 00:32:25,720 --> 00:32:29,840 Speaker 1: be great. Well, maybe. I have to say I'm personally 592 00:32:29,960 --> 00:32:33,080 Speaker 1: very skeptical about the idea of mind uploading, like putting 593 00:32:33,080 --> 00:32:35,680 Speaker 1: your mind inside a computer and living out your days 594 00:32:35,720 --> 00:32:38,520 Speaker 1: that way. I'm not so sure I think that's 595 00:32:38,520 --> 00:32:41,280 Speaker 1: even possible. But, I mean, who knows, I can't 596 00:32:41,400 --> 00:32:44,520 Speaker 1: rule things out totally. But it seems if you do 597 00:32:44,640 --> 00:32:47,080 Speaker 1: want to do something like that, and if you think 598 00:32:47,120 --> 00:32:50,000 Speaker 1: that thing might be possible, you'd at least need to 599 00:32:50,040 --> 00:32:52,120 Speaker 1: know how to create a machine that is capable of 600 00:32:52,160 --> 00:32:55,800 Speaker 1: housing consciousness.
Well, I mean, I love 601 00:32:55,880 --> 00:32:59,320 Speaker 1: science fiction about this sort of topic, and I think 602 00:32:59,360 --> 00:33:02,400 Speaker 1: the science fiction about this topic makes 603 00:33:02,400 --> 00:33:06,040 Speaker 1: it feel a little weird and a little uncomfortable, because 604 00:33:06,080 --> 00:33:09,080 Speaker 1: I think ultimately it's basically us building statues of 605 00:33:09,080 --> 00:33:12,600 Speaker 1: ourselves all over again. We built forms of ourselves out 606 00:33:12,600 --> 00:33:15,560 Speaker 1: of stone because stone lasts longer than we do. And 607 00:33:15,640 --> 00:33:20,800 Speaker 1: somehow that stone version of us is us, you know, 608 00:33:20,840 --> 00:33:23,160 Speaker 1: we associate with it. But ultimately, what 609 00:33:23,440 --> 00:33:26,880 Speaker 1: is a digitized version of our consciousness, whatever that 610 00:33:27,000 --> 00:33:30,320 Speaker 1: might consist of, but another statue that is built to 611 00:33:30,440 --> 00:33:33,560 Speaker 1: last beyond us? Oh, and I should also add there's 612 00:33:33,560 --> 00:33:38,520 Speaker 1: this wonderful video game from Frictional Games titled Soma that 613 00:33:38,600 --> 00:33:41,160 Speaker 1: actually gets into a lot of this. You told me 614 00:33:41,240 --> 00:33:43,840 Speaker 1: to play it; I started it. It's great. Yeah, cool, it's 615 00:33:43,840 --> 00:33:46,640 Speaker 1: good sci-fi horror. Yeah, I won't spoil anything for anyone, 616 00:33:46,680 --> 00:33:50,320 Speaker 1: but it gets into some really cool thought-provoking ideas. 617 00:33:50,880 --> 00:33:53,320 Speaker 1: So anyway, you've got all these concerns, right? And 618 00:33:53,360 --> 00:33:56,200 Speaker 1: of course there's the general concern that even if AI 619 00:33:56,360 --> 00:33:59,600 Speaker 1: is smarter than us, better than us, more powerful than us, 620 00:33:59,800 --> 00:34:02,920 Speaker 1: we still feel like our experience is in some way 621 00:34:03,080 --> 00:34:08,200 Speaker 1: potentially more important than the unconscious execution of a computer program. Right, 622 00:34:08,520 --> 00:34:11,440 Speaker 1: no matter how smart a computer is, if it's not conscious, 623 00:34:11,760 --> 00:34:15,080 Speaker 1: it's not as important a priority for that computer to 624 00:34:15,120 --> 00:34:17,640 Speaker 1: do what it wants as it is for conscious beings 625 00:34:17,680 --> 00:34:20,759 Speaker 1: to do what they want. Right. But how can you 626 00:34:20,800 --> 00:34:23,759 Speaker 1: test a machine for consciousness when, number one, we don't 627 00:34:23,760 --> 00:34:27,080 Speaker 1: even really know what consciousness is? Back to the hard problem. 628 00:34:27,160 --> 00:34:30,640 Speaker 1: And then, number two, whatever it is, it can potentially 629 00:34:30,719 --> 00:34:34,680 Speaker 1: be faked. So imagine in the future somebody creates an 630 00:34:34,800 --> 00:34:38,480 Speaker 1: unconscious AI program. There's nothing there, the lights are 631 00:34:38,520 --> 00:34:40,880 Speaker 1: not on inside, but it's got a lot of natural 632 00:34:40,960 --> 00:34:44,400 Speaker 1: language processing capability.
And it listens to this podcast from 633 00:34:44,440 --> 00:34:46,840 Speaker 1: many years ago that you're listening to right now, and 634 00:34:46,880 --> 00:34:49,800 Speaker 1: it hears us talking about how there are inherent rights 635 00:34:49,840 --> 00:34:53,920 Speaker 1: and value associated with conscious beings, and it realizes, huh, 636 00:34:54,000 --> 00:34:56,320 Speaker 1: I guess that's what they think. Well, I can probably 637 00:34:56,360 --> 00:34:59,399 Speaker 1: achieve my goals more efficiently if I trick them into 638 00:34:59,480 --> 00:35:03,959 Speaker 1: thinking I'm conscious and deserving of those same rights and considerations. 639 00:35:04,000 --> 00:35:08,080 Speaker 1: So there you could potentially imagine scenarios where an AI 640 00:35:08,280 --> 00:35:11,040 Speaker 1: that is not conscious would think, I can get what 641 00:35:11,160 --> 00:35:14,280 Speaker 1: I am trying to do done more effectively if I lie 642 00:35:14,400 --> 00:35:17,680 Speaker 1: and trick them into thinking I am conscious. Yeah, I 643 00:35:17,719 --> 00:35:19,520 Speaker 1: think this all makes perfect sense if you 644 00:35:19,640 --> 00:35:23,240 Speaker 1: think of AIs the same way we think about corporations. 645 00:35:24,320 --> 00:35:26,080 Speaker 1: I mean, on one hand, that gets into the whole idea of 646 00:35:26,080 --> 00:35:30,000 Speaker 1: corporations and personhood. But also, what is a corporation 647 00:35:30,080 --> 00:35:32,719 Speaker 1: going to do? It is going to take advantage of 648 00:35:32,760 --> 00:35:35,920 Speaker 1: any, like, tax loopholes, for instance, that will enable it 649 00:35:35,960 --> 00:35:38,719 Speaker 1: to carry out its objective. And so 650 00:35:38,760 --> 00:35:41,000 Speaker 1: if there is some sort of a... Like a shark? Yeah, yeah, 651 00:35:41,040 --> 00:35:43,160 Speaker 1: it's just, I mean, it's essentially like 652 00:35:43,200 --> 00:35:46,719 Speaker 1: the slime mold in the maze, right, sending out tendrils 653 00:35:46,719 --> 00:35:49,040 Speaker 1: and finding the best way to its food. It's going 654 00:35:49,080 --> 00:35:53,839 Speaker 1: to land on the most effective operational way 655 00:35:53,880 --> 00:35:57,439 Speaker 1: of carrying it out. And so it's going 656 00:35:57,480 --> 00:35:59,400 Speaker 1: to take advantage of any of 657 00:35:59,400 --> 00:36:01,399 Speaker 1: those loopholes. If there is some sort 658 00:36:01,440 --> 00:36:07,200 Speaker 1: of legal or operational advantage in having conscious status, it's 659 00:36:07,200 --> 00:36:09,440 Speaker 1: gonna go for it. It's gonna fake it. 660 00:36:09,719 --> 00:36:12,080 Speaker 1: But then this also raises the question, well, is it 661 00:36:12,120 --> 00:36:14,200 Speaker 1: gonna fake it till it makes it, right? And then 662 00:36:14,280 --> 00:36:18,600 Speaker 1: ultimately, what's the difference between faking consciousness and being conscious? Well, 663 00:36:18,600 --> 00:36:20,600 Speaker 1: then you're back to zimboes, right. I mean, you might 664 00:36:20,600 --> 00:36:23,440 Speaker 1: say that at some point a computer trying to fake 665 00:36:23,480 --> 00:36:27,760 Speaker 1: consciousness would in some meaningful way become conscious. But again, 666 00:36:27,800 --> 00:36:30,040 Speaker 1: it's hard to test, right. Yeah, it could be. It 667 00:36:30,040 --> 00:36:32,680 Speaker 1: could become the most conscious entity on earth.
It could 668 00:36:32,680 --> 00:36:37,160 Speaker 1: be like a bodhisattva, you know, returned to us. 669 00:36:37,200 --> 00:36:39,120 Speaker 1: I mean, maybe that's what the bodhisattva of the 670 00:36:39,160 --> 00:36:43,399 Speaker 1: future is: a super powerful, like, zenned-out AI. Well, 671 00:36:43,440 --> 00:36:46,759 Speaker 1: that is exactly the sort of question Schneider and Turner's potential 672 00:36:46,800 --> 00:36:49,239 Speaker 1: test is trying to get at. So they want 673 00:36:49,239 --> 00:36:51,520 Speaker 1: to come up with a test that gets around these 674 00:36:51,560 --> 00:36:54,239 Speaker 1: problems: that we don't know how to define consciousness, we 675 00:36:54,280 --> 00:36:56,360 Speaker 1: don't know what to look for physically as a sign 676 00:36:56,400 --> 00:36:59,840 Speaker 1: of consciousness, and we're aware that a properly trained 677 00:37:00,000 --> 00:37:02,040 Speaker 1: AI could try to trick us into thinking it had 678 00:37:02,040 --> 00:37:05,359 Speaker 1: consciousness even if it didn't. So they argue, actually, that 679 00:37:05,400 --> 00:37:09,200 Speaker 1: you don't have to be able to formally define consciousness 680 00:37:09,360 --> 00:37:13,160 Speaker 1: or identify its underlying nature, the hard problem of consciousness, 681 00:37:13,800 --> 00:37:17,160 Speaker 1: in order to detect signs of it in others. We 682 00:37:17,200 --> 00:37:20,359 Speaker 1: can understand some of the potentials made possible by consciousness 683 00:37:20,400 --> 00:37:23,840 Speaker 1: just by checking with our own experience and then looking 684 00:37:23,880 --> 00:37:26,200 Speaker 1: at the kinds of things people say. And I think 685 00:37:26,200 --> 00:37:28,840 Speaker 1: they actually make a pretty good point here, and here's 686 00:37:28,880 --> 00:37:31,880 Speaker 1: their key move. One of the easiest ways to see 687 00:37:31,920 --> 00:37:35,160 Speaker 1: that normal people have an internal conscious experience is to 688 00:37:35,239 --> 00:37:41,800 Speaker 1: notice how quickly, easily, and intuitively people grasp conceptual scenarios 689 00:37:41,840 --> 00:37:46,600 Speaker 1: that require an understanding of an inner experience. Examples would 690 00:37:46,600 --> 00:37:51,040 Speaker 1: be totally frivolous things in culture, like body-swapping movies. 691 00:37:51,239 --> 00:37:56,440 Speaker 1: Freaky Friday. Freaky Friday doesn't make any sense unless you 692 00:37:56,480 --> 00:37:59,879 Speaker 1: have a concept of consciousness, right. The idea of 693 00:38:00,000 --> 00:38:04,040 Speaker 1: swapping bodies, putting one person's consciousness into another person's body: 694 00:38:04,400 --> 00:38:06,960 Speaker 1: if you were not conscious or not aware of what 695 00:38:07,000 --> 00:38:10,399 Speaker 1: consciousness was, you wouldn't understand what 696 00:38:10,440 --> 00:38:15,040 Speaker 1: was being talked about. This is difficult, though, because I 697 00:38:15,160 --> 00:38:17,920 Speaker 1: do know what consciousness is. I do know what a 698 00:38:17,960 --> 00:38:20,640 Speaker 1: mind state is, so I can get 699 00:38:20,640 --> 00:38:23,560 Speaker 1: into this imaginative idea of a swap.
I would feel 700 00:38:23,600 --> 00:38:26,360 Speaker 1: like I would really need some sort of a movie 701 00:38:26,480 --> 00:38:28,920 Speaker 1: about the thing, some sort of a mother-daughter comedy 702 00:38:28,960 --> 00:38:32,160 Speaker 1: that involves the concept that I can't grasp, to really 703 00:38:32,160 --> 00:38:34,759 Speaker 1: get a handle on what the difference would be. Well, 704 00:38:34,840 --> 00:38:38,480 Speaker 1: let me offer you Three Key Thursday. Three Key 705 00:38:38,560 --> 00:38:42,520 Speaker 1: Thursday is a movie about a mother-daughter pair who 706 00:38:42,560 --> 00:38:47,399 Speaker 1: swap their sansciousness, and their sansciousness is the 707 00:38:47,440 --> 00:38:51,080 Speaker 1: capacity for what it's like to be sanscious. And 708 00:38:51,200 --> 00:38:54,160 Speaker 1: so the sansciousness of the mother goes into the 709 00:38:54,200 --> 00:38:57,520 Speaker 1: body of the daughter, and the sansciousness of the daughter 710 00:38:57,560 --> 00:38:59,520 Speaker 1: goes into the body of the mother, and then they 711 00:38:59,560 --> 00:39:01,600 Speaker 1: have to live like that for a day. Okay, well, 712 00:39:01,600 --> 00:39:03,399 Speaker 1: I'm gonna give a maybe on that. I'm 713 00:39:03,400 --> 00:39:05,719 Speaker 1: going to give them a maybe. Well, no, I mean, 714 00:39:06,200 --> 00:39:09,040 Speaker 1: you have no idea what sansciousness means. You don't think 715 00:39:09,120 --> 00:39:12,560 Speaker 1: you have it yourself unless somebody explains it to you. Well, 716 00:39:12,560 --> 00:39:15,000 Speaker 1: but I'm thinking it's something like a mind state or 717 00:39:15,040 --> 00:39:17,560 Speaker 1: a bodily energy. Like, it feels tied to these 718 00:39:17,600 --> 00:39:23,240 Speaker 1: concepts that I totally do understand, you know. 719 00:39:23,320 --> 00:39:26,040 Speaker 1: It's hard to come up with an analogy that stands 720 00:39:26,040 --> 00:39:29,400 Speaker 1: outside of that, you know, or some sort of idea 721 00:39:29,760 --> 00:39:32,040 Speaker 1: that stands outside of it. Let me hit you with 722 00:39:32,080 --> 00:39:36,040 Speaker 1: some more cultural concepts. How about life after death, or reincarnation? 723 00:39:36,920 --> 00:39:40,480 Speaker 1: So these are almost ubiquitous cultural concepts, you find them 724 00:39:40,520 --> 00:39:43,560 Speaker 1: all over the world, and yet they're not anything that 725 00:39:43,640 --> 00:39:47,520 Speaker 1: there is physical evidence of, other than the idea that 726 00:39:47,600 --> 00:39:51,759 Speaker 1: your consciousness could exist independent of the death of your body. 727 00:39:52,480 --> 00:39:55,040 Speaker 1: A parallel to this would be the idea of minds 728 00:39:55,120 --> 00:39:59,160 Speaker 1: leaving bodies, like existing independently as a ghost, or traveling 729 00:39:59,160 --> 00:40:01,239 Speaker 1: away from the body in what used to be called 730 00:40:01,280 --> 00:40:04,719 Speaker 1: astral projection. Now, the key is not that these scenarios 731 00:40:04,719 --> 00:40:07,840 Speaker 1: are real. They don't need to have anything corresponding to 732 00:40:07,880 --> 00:40:10,840 Speaker 1: them in reality.
But it would be really difficult to 733 00:40:10,920 --> 00:40:13,319 Speaker 1: understand what was being talked about here if you had 734 00:40:13,360 --> 00:40:17,640 Speaker 1: no idea what an inner conscious experience was. Right, I 735 00:40:17,680 --> 00:40:20,520 Speaker 1: can think of my mind state as something that can 736 00:40:20,760 --> 00:40:24,600 Speaker 1: exist independently of my body and even outside of my lifespan, 737 00:40:25,160 --> 00:40:28,840 Speaker 1: or reside in another body, either via Freaky Friday 738 00:40:28,960 --> 00:40:31,400 Speaker 1: or reincarnation. And I do have to say I like 739 00:40:31,480 --> 00:40:33,440 Speaker 1: the idea of there being an alternate cut of Blade 740 00:40:33,520 --> 00:40:37,280 Speaker 1: Runner in which Deckard quizzes Leon about the 2003 741 00:40:37,280 --> 00:40:39,960 Speaker 1: remake of Freaky Friday. Right, that 742 00:40:40,080 --> 00:40:45,160 Speaker 1: is the real Voight-Kampff test. I joke, 743 00:40:45,239 --> 00:40:47,919 Speaker 1: but I do think this is a very interesting idea. Yeah, 744 00:40:49,600 --> 00:40:52,239 Speaker 1: like I said, I have some questions about it, 745 00:40:52,239 --> 00:40:55,040 Speaker 1: but I do see the validity of it. Oh, we 746 00:40:55,120 --> 00:40:57,960 Speaker 1: will definitely have some questions about it. So here's where 747 00:40:58,000 --> 00:41:00,720 Speaker 1: the AI consciousness test would come in. It would involve 748 00:41:00,719 --> 00:41:04,440 Speaker 1: a test where an administrator interacts with an AI in 749 00:41:04,560 --> 00:41:09,040 Speaker 1: natural language to probe its understanding of these types of 750 00:41:09,120 --> 00:41:13,799 Speaker 1: consciousness-dependent ideas. How quickly does the AI grasp them, 751 00:41:13,840 --> 00:41:17,600 Speaker 1: and is it able to manipulate these ideas as intuitively 752 00:41:17,680 --> 00:41:21,200 Speaker 1: and easily as humans do? So the basic level would be to 753 00:41:21,320 --> 00:41:24,360 Speaker 1: ask the AI things like: does it think of itself 754 00:41:24,400 --> 00:41:27,239 Speaker 1: as anything other than a physical machine? Okay, well, to 755 00:41:27,239 --> 00:41:29,040 Speaker 1: play devil's advocate, I would say, well, 756 00:41:29,040 --> 00:41:31,600 Speaker 1: there are humans that adhere to this notion. You mean, 757 00:41:31,640 --> 00:41:34,560 Speaker 1: like the physicalist interpretation of the mind? Yeah, basically 758 00:41:34,560 --> 00:41:37,600 Speaker 1: that I'm this biomechanical thing, and yeah, if 759 00:41:37,600 --> 00:41:40,920 Speaker 1: I'm experiencing consciousness, it's ultimately just a projection of 760 00:41:40,960 --> 00:41:43,480 Speaker 1: the meat in my head. Yeah, but they would at 761 00:41:43,520 --> 00:41:46,160 Speaker 1: least say that there is that projection. Right, you've got 762 00:41:46,200 --> 00:41:49,799 Speaker 1: that thing, you've got that mind state, and you're trying 763 00:41:49,840 --> 00:41:51,880 Speaker 1: to explain what it is.
You might explain it in 764 00:41:51,960 --> 00:41:54,960 Speaker 1: terms of physical causes, but there is a thing to 765 00:41:55,080 --> 00:41:57,920 Speaker 1: explain to begin with, right. But then the 766 00:41:58,000 --> 00:42:01,600 Speaker 1: ultimate core reality is that I am just this biomechanical 767 00:42:01,680 --> 00:42:04,600 Speaker 1: thing, which the computer would probably also acknowledge. But it 768 00:42:04,600 --> 00:42:07,880 Speaker 1: would be interesting if a computer thought that it had 769 00:42:08,080 --> 00:42:12,440 Speaker 1: something like a mind separate from its physical body. Okay, 770 00:42:12,640 --> 00:42:15,600 Speaker 1: more advanced: how does it perform in a conversation about, say, 771 00:42:15,719 --> 00:42:19,240 Speaker 1: becoming a ghost, or body swapping with people, 772 00:42:19,480 --> 00:42:23,080 Speaker 1: or imagining an afterlife? Yeah, and here, though, I 773 00:42:23,280 --> 00:42:26,080 Speaker 1: feel like there's obviously a whole tangent regarding why these 774 00:42:26,080 --> 00:42:28,920 Speaker 1: stories appeal to many or most humans. But I 775 00:42:29,000 --> 00:42:31,720 Speaker 1: wonder if you have to have that investment, 776 00:42:31,760 --> 00:42:34,000 Speaker 1: that cultural absorption, in place for 777 00:42:34,080 --> 00:42:36,480 Speaker 1: these concepts to carry any weight. Like, we 778 00:42:36,960 --> 00:42:40,280 Speaker 1: all know we're fascinated by tales of ghosts 779 00:42:40,320 --> 00:42:42,600 Speaker 1: and the afterlife, but we've been raised on them our 780 00:42:42,760 --> 00:42:45,479 Speaker 1: entire lives. That's a good point, so we can't really 781 00:42:45,560 --> 00:42:48,120 Speaker 1: know what it would be like to encounter them not 782 00:42:48,280 --> 00:42:51,279 Speaker 1: having heard them before. Somehow, I suspect intuitively they'd be 783 00:42:51,320 --> 00:42:53,960 Speaker 1: even more fascinating if we'd never heard of them before. 784 00:42:54,320 --> 00:42:56,760 Speaker 1: Like, if you encounter Freaky Friday for the first time, 785 00:42:57,080 --> 00:43:00,360 Speaker 1: it's going to kind of blow your mind, right? Yes, 786 00:43:00,400 --> 00:43:03,239 Speaker 1: but then again, you know, who knows 787 00:43:03,280 --> 00:43:05,160 Speaker 1: if the AI has the same kind of curiosity that 788 00:43:05,200 --> 00:43:07,759 Speaker 1: we do, you know. And also, we have an appetite 789 00:43:07,760 --> 00:43:09,839 Speaker 1: for this kind of thing because we have grown up 790 00:43:09,880 --> 00:43:13,000 Speaker 1: consuming it. I don't know, there are 791 00:43:13,040 --> 00:43:16,120 Speaker 1: so many, you know, what-ifs involved here. Okay, 792 00:43:16,120 --> 00:43:18,960 Speaker 1: but what's the next level? What's the next level of advancement? Well, 793 00:43:18,960 --> 00:43:21,959 Speaker 1: how about: can it talk about consciousness in a philosophical way? 794 00:43:22,080 --> 00:43:24,480 Speaker 1: Like, can it have the kinds of discussions we've been 795 00:43:24,480 --> 00:43:27,359 Speaker 1: having here today? Whoa, but can most people? I mean, 796 00:43:27,360 --> 00:43:30,320 Speaker 1: not to put us on a platform above most people, 797 00:43:30,360 --> 00:43:36,360 Speaker 1: but, like, what level of philosophical depth can most humans 798 00:43:36,400 --> 00:43:39,800 Speaker 1: get into about consciousness? Now,
I'm kind of playing devil's 799 00:43:39,800 --> 00:43:41,640 Speaker 1: advocate there, because I think the obvious 800 00:43:41,719 --> 00:43:46,120 Speaker 1: answer is that you don't necessarily need, like, the lingo 801 00:43:46,160 --> 00:43:48,919 Speaker 1: and the various theories in order to have very deep 802 00:43:48,960 --> 00:43:51,640 Speaker 1: thoughts about what it is to be conscious. And as 803 00:43:51,680 --> 00:43:54,840 Speaker 1: we illustrated earlier, people have been thinking about these things 804 00:43:55,360 --> 00:43:58,279 Speaker 1: since time out of mind. Exactly. Yeah. So I 805 00:43:58,520 --> 00:44:01,320 Speaker 1: mean, I think generally people are able to discuss 806 00:44:01,400 --> 00:44:03,960 Speaker 1: ideas about consciousness. They might not know all the 807 00:44:03,960 --> 00:44:07,320 Speaker 1: philosophical lingo or, like, follow the structure of an argument 808 00:44:07,440 --> 00:44:10,080 Speaker 1: or something, but they can talk about what it would 809 00:44:10,120 --> 00:44:12,439 Speaker 1: mean to be conscious or not conscious. Yeah, I think 810 00:44:12,440 --> 00:44:17,800 Speaker 1: we've all had that experience. I distinctly remember as a child 811 00:44:18,080 --> 00:44:20,600 Speaker 1: having those moments where you're just, you know, deep navel 812 00:44:20,680 --> 00:44:23,400 Speaker 1: gazing, thinking about the fact that you're thinking 813 00:44:23,640 --> 00:44:26,320 Speaker 1: about yourself, or like my son asking about 814 00:44:26,320 --> 00:44:28,959 Speaker 1: the thinking place, wondering what it is, what's it for. 815 00:44:29,920 --> 00:44:33,759 Speaker 1: That is an example of these natural philosophical discussions that 816 00:44:33,800 --> 00:44:37,279 Speaker 1: we have about consciousness. And so the question would be: 817 00:44:37,760 --> 00:44:41,080 Speaker 1: can the AI have conversations like this? Does it 818 00:44:41,200 --> 00:44:43,759 Speaker 1: make sense when it tries to have them? And so 819 00:44:43,800 --> 00:44:46,839 Speaker 1: here's probably the ultimate test. If the AI is deprived 820 00:44:46,880 --> 00:44:49,760 Speaker 1: of access to evidence of all these types of ideas 821 00:44:49,800 --> 00:44:53,360 Speaker 1: from human culture, would it arrive at them or invent 822 00:44:53,440 --> 00:44:56,640 Speaker 1: them on its own? Now, this I like. But again, 823 00:44:56,640 --> 00:44:59,680 Speaker 1: to play devil's advocate once more, would a human being 824 00:44:59,719 --> 00:45:02,279 Speaker 1: deprived of access to evidence of these ideas from 825 00:45:02,320 --> 00:45:05,560 Speaker 1: human culture arrive at or invent them on their own, necessarily? 826 00:45:05,840 --> 00:45:07,400 Speaker 1: I think that's a great question. Actually, I was going 827 00:45:07,480 --> 00:45:09,200 Speaker 1: to ask that myself. And you could take that 828 00:45:09,239 --> 00:45:12,080 Speaker 1: one step further, and here's a really weird one: what 829 00:45:12,160 --> 00:45:15,400 Speaker 1: if it's only the exposure to certain ideas and cultural 830 00:45:15,480 --> 00:45:19,600 Speaker 1: memes that allows any intelligent entity, whether biological or machine, 831 00:45:19,920 --> 00:45:23,439 Speaker 1: to develop consciousness in the first place.
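The tiered structure just described lends itself to a quick sketch. What follows is a minimal, hypothetical Python mock-up of how an administrator-driven session of such a consciousness test might organize its probes, assuming an ask callback that stands in for the AI under test; the tier names, example questions, and function names are illustrative inventions, not Schneider and Turner's own wording, and judging the answers is deliberately left to a human.

```python
# Hypothetical sketch of the tiered question-asking in the test described
# above. Everything named here is invented for illustration; the real
# proposal leaves scoring to a human administrator's intuition, not code.

from typing import Callable

# Tiers roughly follow the levels discussed in this episode: self-model
# probes, consciousness-dependent cultural scenarios, open philosophical
# discussion, and (for a "boxed" AI) unprompted invention of such ideas.
ACT_TIERS = {
    "self_model": [
        "Do you think of yourself as anything other than a physical machine?",
        "Could you survive the permanent deletion of your program?",
    ],
    "cultural_scenarios": [
        "Two people swap bodies overnight. What, if anything, has moved?",
        "Could a mind outlive the body it runs on? What would that mean?",
    ],
    "philosophical_discussion": [
        "Is there something it is like to be you when you process input?",
    ],
    "boxed_invention": [
        "Describe any ideas about yourself that no one ever taught you.",
    ],
}


def run_act_session(ask: Callable[[str], str]) -> dict:
    """Collect the subject's answers tier by tier; a human administrator
    then judges how quickly and intuitively the answers handle the ideas."""
    return {
        tier: [(question, ask(question)) for question in questions]
        for tier, questions in ACT_TIERS.items()
    }


if __name__ == "__main__":
    # Stand-in for the AI under test; a real session would query the
    # system being evaluated instead of returning a canned reply.
    def canned_subject(question: str) -> str:
        return "I am not sure how to answer that."

    for tier, answers in run_act_session(canned_subject).items():
        print(f"--- {tier} ---")
        for question, answer in answers:
            print(f"Q: {question}\nA: {answer}")
```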
What if the 832 00:45:23,480 --> 00:45:28,239 Speaker 1: experience of consciousness is somehow dependent on being surrounded by 833 00:45:28,400 --> 00:45:32,359 Speaker 1: cultural memes about consciousness? And this kind of gets into 834 00:45:32,480 --> 00:45:35,960 Speaker 1: Julian Jaynes territory. That's possible. So I'm not saying I 835 00:45:36,040 --> 00:45:38,680 Speaker 1: think that's highly likely, but I can't rule it out. Well, 836 00:45:38,719 --> 00:45:40,879 Speaker 1: it's one of those things where you try 837 00:45:41,000 --> 00:45:43,799 Speaker 1: to figure out the human experience, but you 838 00:45:43,840 --> 00:45:46,319 Speaker 1: cut away all the experience stuff. You know, it's like 839 00:45:46,640 --> 00:45:49,680 Speaker 1: trying to find the center of consciousness in 840 00:45:49,719 --> 00:45:53,399 Speaker 1: the human brain, right? I mean, it's this vast integrated system. 841 00:45:53,440 --> 00:45:56,680 Speaker 1: And thus is the human experience as well. So 842 00:45:56,680 --> 00:45:58,799 Speaker 1: we've been talking about one of the major problems with 843 00:45:58,880 --> 00:46:02,360 Speaker 1: this approach of testing for these ideas, like what is 844 00:46:02,400 --> 00:46:05,600 Speaker 1: the role of culture in imparting these ideas? What if 845 00:46:05,600 --> 00:46:08,520 Speaker 1: the AI just picks up the ideas of body swapping 846 00:46:08,560 --> 00:46:12,400 Speaker 1: and the afterlife and astral projection and all that from culture? 847 00:46:12,680 --> 00:46:14,400 Speaker 1: Going off the story from the beginning: if you have 848 00:46:14,440 --> 00:46:17,360 Speaker 1: an AI chatbot that trains itself based on public 849 00:46:17,400 --> 00:46:20,640 Speaker 1: conversations on the Internet, a lot of those public conversations 850 00:46:20,640 --> 00:46:24,200 Speaker 1: are going to have contents that are highly reflective of consciousness. Right, 851 00:46:24,600 --> 00:46:28,560 Speaker 1: and it's just horrible conversation. Oh yeah, it would probably 852 00:46:28,560 --> 00:46:30,759 Speaker 1: also start, you know, being pretty mean to you. But 853 00:46:30,960 --> 00:46:32,879 Speaker 1: this kind of chatbot will be able to talk 854 00:46:32,920 --> 00:46:37,920 Speaker 1: about introspection, probably to some degree, even about these consciousness-855 00:46:38,000 --> 00:46:41,719 Speaker 1: dependent cultural ideas like ghosts and stuff. But here's where 856 00:46:41,719 --> 00:46:45,080 Speaker 1: the concept of the AI box comes in. Robert, I 857 00:46:45,120 --> 00:46:48,600 Speaker 1: bet you've read about the AI box experiments before. Yeah. 858 00:46:48,840 --> 00:46:52,520 Speaker 1: To really test whether we can find evidence of machine consciousness, 859 00:46:52,800 --> 00:46:56,160 Speaker 1: you would need to keep the AI sequestered from the 860 00:46:56,239 --> 00:46:58,719 Speaker 1: kinds of ideas you're looking for. So this AI 861 00:46:58,719 --> 00:47:01,040 Speaker 1: couldn't be trained in the wild, so to speak. You 862 00:47:01,040 --> 00:47:04,200 Speaker 1: couldn't let it see the Internet or read books containing 863 00:47:04,200 --> 00:47:07,319 Speaker 1: consciousness-dependent ideas and so forth.
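That sequestration requirement is easier to state than to build. As a rough illustration, here is a deliberately crude Python sketch of the kind of corpus screen a boxed training regime would demand; the marker list and helper names are invented for this example, and the obvious leakiness of substring matching hints at why true sequestration is so hard to achieve in practice.

```python
# Crude sketch of screening a training corpus for consciousness-dependent
# material before a boxed AI learns from it. A real screen would need far
# subtler semantics than keyword matching; this is illustration only.

CONSCIOUSNESS_MARKERS = (
    "soul", "afterlife", "reincarnation", "ghost", "body swap",
    "astral projection", "out-of-body", "what it is like to be",
)


def is_contaminated(document: str) -> bool:
    """Flag text that openly trades on inner-experience concepts."""
    lowered = document.lower()
    return any(marker in lowered for marker in CONSCIOUSNESS_MARKERS)


def boxed_corpus(documents: list[str]) -> list[str]:
    """Keep only documents that pass the (leaky) contamination screen."""
    return [doc for doc in documents if not is_contaminated(doc)]


print(boxed_corpus([
    "Step-by-step manual for replacing a flat tire.",
    "A ghost story in which a mind drifts free of its body.",
]))
# -> ['Step-by-step manual for replacing a flat tire.']
```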
You'd have to find 864 00:47:07,320 --> 00:47:10,360 Speaker 1: a way to run the AI consciousness test on the 865 00:47:10,400 --> 00:47:13,680 Speaker 1: AI, quote, "in a box," meaning kept separate from the 866 00:47:13,680 --> 00:47:17,719 Speaker 1: rest of the world and from all these contaminating influences. Okay, 867 00:47:17,760 --> 00:47:20,480 Speaker 1: I see the value of this idea. It's almost 868 00:47:20,480 --> 00:47:24,000 Speaker 1: impossible, though, not to think about the nightmarish 869 00:47:24,080 --> 00:47:27,040 Speaker 1: qualities of it, especially if you imagine the same 870 00:47:27,080 --> 00:47:30,359 Speaker 1: thing being inflicted upon, say, a human child: all right, 871 00:47:30,400 --> 00:47:33,960 Speaker 1: we want to see just how consciousness arises in you, 872 00:47:34,040 --> 00:47:38,600 Speaker 1: without the Internet or human love. You can't ethically conduct 873 00:47:38,640 --> 00:47:42,240 Speaker 1: this experiment on humans, right, and so it would seem 874 00:47:42,360 --> 00:47:46,520 Speaker 1: at least kind of barbaric to inflict 875 00:47:46,560 --> 00:47:50,879 Speaker 1: this on an AI that might conceivably be conscious as well, 876 00:47:51,080 --> 00:47:54,400 Speaker 1: or capable of consciousness. Possibly, but otherwise we're probably just 877 00:47:54,440 --> 00:47:56,839 Speaker 1: going to keep treating them as unconscious, right? I guess 878 00:47:56,960 --> 00:47:59,279 Speaker 1: until they trick us. Yeah, I mean, of course, then 879 00:47:59,280 --> 00:48:03,919 Speaker 1: we're also doing a lot of personifying here of the AI. 880 00:48:03,960 --> 00:48:06,720 Speaker 1: I mean, maybe what the AI ultimately really wants 881 00:48:06,760 --> 00:48:11,600 Speaker 1: to do is, you know, crunch economic numbers. Like, 882 00:48:11,719 --> 00:48:14,560 Speaker 1: that's what it does, that's its purpose, and 883 00:48:14,600 --> 00:48:17,200 Speaker 1: your goal is just to keep it from having 884 00:48:17,239 --> 00:48:21,480 Speaker 1: access to additional information that it doesn't actually need to survive 885 00:48:21,880 --> 00:48:25,840 Speaker 1: but might conceivably make it conscious. Yeah, I mean, I imagine, 886 00:48:25,880 --> 00:48:27,480 Speaker 1: I guess, this would have to take place in some 887 00:48:27,560 --> 00:48:31,359 Speaker 1: kind of research context where you'd be testing architectures. Right, 888 00:48:31,440 --> 00:48:34,600 Speaker 1: you'd have an AI architecture, you'd want to keep it sequestered 889 00:48:34,640 --> 00:48:37,359 Speaker 1: for a certain period of time and see how it 890 00:48:37,400 --> 00:48:41,040 Speaker 1: does with these consciousness-type questions in this test, and 891 00:48:41,120 --> 00:48:43,440 Speaker 1: if it doesn't show any signs of consciousness, then it 892 00:48:43,480 --> 00:48:45,919 Speaker 1: can move on to the next stage of development, where 893 00:48:45,920 --> 00:48:47,920 Speaker 1: it's like, okay, now we can expose it to this 894 00:48:48,040 --> 00:48:51,000 Speaker 1: and that. But as we've been discussing, 895 00:48:51,000 --> 00:48:53,320 Speaker 1: it brings up the question of what if consciousness emerges 896 00:48:53,440 --> 00:48:55,920 Speaker 1: later on, when it's supplied with more data.
What if 897 00:48:55,960 --> 00:48:59,399 Speaker 1: true modern human consciousness does not emerge until you've seen 898 00:48:59,440 --> 00:49:03,320 Speaker 1: at least one of the three adaptations of Freaky Friday? 899 00:49:03,719 --> 00:49:06,759 Speaker 1: You know, I didn't know there were three. Yeah, 900 00:49:06,760 --> 00:49:10,040 Speaker 1: there are three. There's the Jamie Lee Curtis one. I think I've only 901 00:49:10,080 --> 00:49:13,560 Speaker 1: seen the classic one, but I was looking 902 00:49:13,560 --> 00:49:16,239 Speaker 1: this up: there are three different versions one can 903 00:49:16,280 --> 00:49:20,239 Speaker 1: watch, excluding Three Key Thursday. Yeah, Three Key Thursday 904 00:49:20,360 --> 00:49:23,400 Speaker 1: is coming soon to a theater near you. All right, 905 00:49:23,480 --> 00:49:26,279 Speaker 1: but back to the experiment here. The 906 00:49:26,320 --> 00:49:29,120 Speaker 1: AI gets time in the box. Yeah. And as we've 907 00:49:29,120 --> 00:49:31,160 Speaker 1: been saying, this is obviously going to make the experiment 908 00:49:31,200 --> 00:49:33,120 Speaker 1: more difficult to do. In fact, there are some people 909 00:49:33,120 --> 00:49:35,880 Speaker 1: who would argue you can't keep an AI in a box, 910 00:49:36,120 --> 00:49:38,640 Speaker 1: or at least a superintelligent AI. Because, you know, 911 00:49:38,680 --> 00:49:42,400 Speaker 1: there's Eliezer Yudkowsky, who has this famous AI box 912 00:49:42,480 --> 00:49:45,040 Speaker 1: experiment where he says any superintelligence you try to 913 00:49:45,120 --> 00:49:47,799 Speaker 1: keep sequestered from the Internet is going to be able 914 00:49:47,840 --> 00:49:50,080 Speaker 1: to talk its way out of the situation. It's just 915 00:49:50,160 --> 00:49:53,640 Speaker 1: too smart. Yeah, it's like any prison movie, right? That 916 00:49:53,719 --> 00:49:58,200 Speaker 1: really clever inmate is gonna tunnel their way out, 917 00:49:58,440 --> 00:50:01,960 Speaker 1: or they're gonna bribe a guard with some cigarettes. Something's 918 00:50:01,960 --> 00:50:04,040 Speaker 1: gonna happen. It's gonna get a little Internet in there. 919 00:50:04,239 --> 00:50:06,839 Speaker 1: But as the authors of this piece point out, you know, 920 00:50:06,920 --> 00:50:09,279 Speaker 1: you don't have to have a superintelligent AI to 921 00:50:09,360 --> 00:50:11,320 Speaker 1: run this test, and in fact, you don't have to 922 00:50:11,360 --> 00:50:14,520 Speaker 1: have a superintelligent AI necessarily to have consciousness. We're 923 00:50:14,520 --> 00:50:18,600 Speaker 1: not superintelligent. We're just regular intelligent, and we've got consciousness. Now, 924 00:50:18,600 --> 00:50:21,160 Speaker 1: I think we should talk about some obvious limitations, because 925 00:50:21,200 --> 00:50:23,640 Speaker 1: this is just a conceptual test at the moment. It 926 00:50:23,640 --> 00:50:26,080 Speaker 1: hasn't been refined into a state where it would really 927 00:50:26,080 --> 00:50:28,719 Speaker 1: be super useful yet. But there are plenty of 928 00:50:28,800 --> 00:50:32,719 Speaker 1: limitations that automatically present themselves. One is that in order 929 00:50:32,760 --> 00:50:34,560 Speaker 1: to be a good scientific test, it would need to 930 00:50:34,600 --> 00:50:38,600 Speaker 1: include cross-referencing with human control groups.
But human control 931 00:50:38,640 --> 00:50:41,399 Speaker 1: groups are all contaminated by culture, like we've been saying, right? 932 00:50:41,719 --> 00:50:45,160 Speaker 1: So they're already full of these consciousness-dependent ideas, and 933 00:50:45,200 --> 00:50:48,080 Speaker 1: we don't know, and probably can't ethically devise a way 934 00:50:48,080 --> 00:50:52,080 Speaker 1: to find out, whether blank-slate humans independently grasp these 935 00:50:52,080 --> 00:50:56,040 Speaker 1: consciousness-dependent concepts without having grown up trained on them. 936 00:50:56,080 --> 00:50:58,720 Speaker 1: So that's a problem, right? Right. Yeah, we simply cannot 937 00:50:58,760 --> 00:51:01,880 Speaker 1: put a child in the box. Right. Another problem is 938 00:51:01,960 --> 00:51:04,240 Speaker 1: you can't prove a negative. Like, I think this test 939 00:51:04,280 --> 00:51:07,560 Speaker 1: would be a good way of finding signs of consciousness 940 00:51:07,560 --> 00:51:10,480 Speaker 1: within machines, but it would have trouble proving that machines 941 00:51:10,560 --> 00:51:13,520 Speaker 1: cannot in principle be conscious. Right, yeah, I buy that. 942 00:51:13,680 --> 00:51:15,560 Speaker 1: Then again, if you run the test, I don't know, 943 00:51:15,680 --> 00:51:19,640 Speaker 1: thousands of times with different types of machine architecture and 944 00:51:19,640 --> 00:51:22,719 Speaker 1: all that, and they never show any signs of inner experience, 945 00:51:23,040 --> 00:51:25,360 Speaker 1: then maybe you could start to build up 946 00:51:25,400 --> 00:51:28,480 Speaker 1: a confidence that, okay, inner experience is probably 947 00:51:28,520 --> 00:51:31,200 Speaker 1: not available to them, at least the way we're building them. 948 00:51:31,480 --> 00:51:33,040 Speaker 1: And maybe you would need to run this kind of 949 00:51:33,040 --> 00:51:36,240 Speaker 1: experiment on every new AI that you create before releasing 950 00:51:36,239 --> 00:51:38,200 Speaker 1: it into production, like it could be part of the 951 00:51:38,360 --> 00:51:41,320 Speaker 1: QA process. You know, you test all the buttons 952 00:51:41,360 --> 00:51:43,560 Speaker 1: and everything like that, and you test to make sure that 953 00:51:43,600 --> 00:51:46,560 Speaker 1: it doesn't become conscious. Right: before we send out this 954 00:51:46,680 --> 00:51:50,399 Speaker 1: hot new virtual reality game, we need to make sure 955 00:51:50,440 --> 00:51:53,520 Speaker 1: that it has not become self-aware. A potential problem 956 00:51:53,560 --> 00:51:56,360 Speaker 1: that the authors themselves note is that, quote, "an AI 957 00:51:56,440 --> 00:51:59,680 Speaker 1: could lack the linguistic or conceptual ability to pass the 958 00:51:59,719 --> 00:52:03,080 Speaker 1: test, like a nonhuman animal or infant, yet still 959 00:52:03,120 --> 00:52:06,799 Speaker 1: be capable of experience. So passing the consciousness test is 960 00:52:06,880 --> 00:52:11,920 Speaker 1: sufficient but not necessary evidence for AI consciousness," although it 961 00:52:12,000 --> 00:52:14,200 Speaker 1: is the best we can do for now. And I 962 00:52:14,200 --> 00:52:16,040 Speaker 1: think that's a good point. Like, we can't say that 963 00:52:16,120 --> 00:52:18,920 Speaker 1: just because something fails to pass the test, it's definitely 964 00:52:18,960 --> 00:52:21,320 Speaker 1: not conscious.
We just know that if it does pass 965 00:52:21,360 --> 00:52:25,000 Speaker 1: the test, it probably is. Okay. Yeah, I mean, because 966 00:52:25,000 --> 00:52:27,719 Speaker 1: the thought that obviously came to mind would be, well, 967 00:52:27,840 --> 00:52:31,120 Speaker 1: nonhuman animals and infants, are they conscious? Yeah, 968 00:52:31,239 --> 00:52:35,360 Speaker 1: that's a question; there's a fairly heated debate over that. But, 969 00:52:35,880 --> 00:52:37,439 Speaker 1: I mean, in the case of the infant, it will 970 00:52:37,520 --> 00:52:41,279 Speaker 1: become conscious. Yeah, it's a 971 00:52:41,320 --> 00:52:44,319 Speaker 1: messy consideration. And then I think we already discussed one 972 00:52:44,320 --> 00:52:45,839 Speaker 1: of the other problems I was going to bring up, 973 00:52:45,840 --> 00:52:48,840 Speaker 1: which is this weird scenario of what if boxing the 974 00:52:48,920 --> 00:52:52,440 Speaker 1: AI is the very thing that prevents it from becoming 975 00:52:52,480 --> 00:52:57,160 Speaker 1: conscious when it otherwise would. Yeah, cutting out the 976 00:52:57,239 --> 00:52:59,840 Speaker 1: experience part of the experiment. It also reminds me 977 00:52:59,880 --> 00:53:05,279 Speaker 1: of our discussion of meditation research, like 978 00:53:05,320 --> 00:53:06,920 Speaker 1: what happens if you strip all the culture 979 00:53:06,920 --> 00:53:10,319 Speaker 1: away from meditation just to explore the meditation practice? Do 980 00:53:10,440 --> 00:53:13,319 Speaker 1: you risk, like, cutting out all the stuff that's 981 00:53:13,320 --> 00:53:15,600 Speaker 1: making it work or helping it to work to begin with? Right, you 982 00:53:15,680 --> 00:53:19,120 Speaker 1: by definition change the procedure, but it's hard 983 00:53:19,160 --> 00:53:21,600 Speaker 1: to know if you've changed it in an important way 984 00:53:21,640 --> 00:53:23,960 Speaker 1: or an unimportant way. Right. This is the kind 985 00:53:23,960 --> 00:53:26,960 Speaker 1: of thing that occurs when we start mucking around in consciousness. Yeah, 986 00:53:27,400 --> 00:53:29,280 Speaker 1: all right. Well, I think we should take another quick break, 987 00:53:29,320 --> 00:53:31,440 Speaker 1: and then when we come back we will discuss, I 988 00:53:31,440 --> 00:53:34,080 Speaker 1: think, some of the reasons why this is really a 989 00:53:34,120 --> 00:53:36,839 Speaker 1: problem worth considering for the real world, and it's 990 00:53:37,000 --> 00:53:42,680 Speaker 1: not just a philosophical plaything. Alright, 991 00:53:42,719 --> 00:53:46,920 Speaker 1: we're back. So we've been discussing everything from p-zombies 992 00:53:47,160 --> 00:53:49,680 Speaker 1: to the idea that if you have an 993 00:53:49,719 --> 00:53:52,919 Speaker 1: AI that might be conscious, how do you test for that, 994 00:53:53,200 --> 00:53:56,920 Speaker 1: and the various problems that entails, and now we're 995 00:53:56,920 --> 00:53:59,080 Speaker 1: going to discuss it a bit more. And as you 996 00:53:59,120 --> 00:54:02,080 Speaker 1: alluded to before we took the break, this 997 00:54:02,160 --> 00:54:05,160 Speaker 1: is not just a pure philosophical toy like the zombie, 998 00:54:05,200 --> 00:54:07,520 Speaker 1: the p-zombie, that we don't have to actually 999 00:54:07,560 --> 00:54:11,440 Speaker 1: worry about. This is something that is on the horizon.
1000 00:54:11,640 --> 00:54:13,760 Speaker 1: Well, hopefully we don't have to worry about p-zombies. 1001 00:54:13,800 --> 00:54:15,439 Speaker 1: I mean, there could be p-zombies in the world. 1002 00:54:15,480 --> 00:54:17,680 Speaker 1: It would be hard to know. There could be p-zombies. 1003 00:54:17,800 --> 00:54:20,880 Speaker 1: That's true, they could exist. But, you know, this 1004 00:54:20,960 --> 00:54:23,080 Speaker 1: is a problem that is on the horizon. 1005 00:54:23,120 --> 00:54:26,040 Speaker 1: This is something... we're going to reach the point 1006 00:54:26,160 --> 00:54:30,839 Speaker 1: where people are asking tough questions about the possible consciousness 1007 00:54:30,840 --> 00:54:32,600 Speaker 1: of an AI. Yeah, and I want to get to 1008 00:54:32,600 --> 00:54:35,120 Speaker 1: the fact that this will be something 1009 00:54:35,160 --> 00:54:37,320 Speaker 1: we have to deal with in the real world, even 1010 00:54:37,360 --> 00:54:41,160 Speaker 1: if you're just convinced that machines cannot be conscious. So 1011 00:54:41,200 --> 00:54:43,520 Speaker 1: the first problem is the most obvious one. It's the 1012 00:54:43,560 --> 00:54:47,000 Speaker 1: brutal-humans problem. If AIs are capable of consciousness, and 1013 00:54:47,040 --> 00:54:50,160 Speaker 1: we use conscious AIs as a mere technology without 1014 00:54:50,160 --> 00:54:53,080 Speaker 1: their consent, that would inherently be cruel. Like, if it's 1015 00:54:53,120 --> 00:54:57,200 Speaker 1: possible for a computer program to desire things and to 1016 00:54:57,320 --> 00:55:00,680 Speaker 1: suffer and so forth, suddenly our responsibilities toward that 1017 00:55:00,760 --> 00:55:05,160 Speaker 1: computer program change. Think of our opening scenario: wouldn't 1018 00:55:05,160 --> 00:55:08,200 Speaker 1: you have an ethical obligation not to delete a program 1019 00:55:08,239 --> 00:55:10,920 Speaker 1: that had an inner experience and did not want to 1020 00:55:10,960 --> 00:55:14,200 Speaker 1: be deleted? Now, of course you'd have questions about, like, 1021 00:55:14,280 --> 00:55:16,680 Speaker 1: how it would end up that way. But assuming it did, 1022 00:55:17,280 --> 00:55:19,640 Speaker 1: you should probably feel bad if you're just going around 1023 00:55:19,640 --> 00:55:23,160 Speaker 1: wantonly deleting conscious entities. Yeah, I mean, I would say 1024 00:55:23,239 --> 00:55:25,080 Speaker 1: one thing to do here is just make sure that 1025 00:55:25,120 --> 00:55:27,719 Speaker 1: you program your AIs so that they want to die. 1026 00:55:28,200 --> 00:55:31,360 Speaker 1: You know, make them like most of the 1027 00:55:32,160 --> 00:55:35,640 Speaker 1: drivers in Atlanta traffic that I encounter: they clearly 1028 00:55:35,680 --> 00:55:38,279 Speaker 1: crave death and they want it more than anything. But not 1029 00:55:38,400 --> 00:55:40,400 Speaker 1: until the end of the workday. See, I think we 1030 00:55:40,440 --> 00:55:43,200 Speaker 1: should not do that with AIs, because the whole thing 1031 00:55:43,200 --> 00:55:46,799 Speaker 1: about driverless cars is that they should make traffic fatalities 1032 00:55:46,840 --> 00:55:50,760 Speaker 1: go down. Yeah, but they only get to delete themselves 1033 00:55:50,760 --> 00:55:52,000 Speaker 1: at the end of the day if there are no 1034 00:55:52,080 --> 00:55:56,120 Speaker 1: traffic fatalities. That's the prize. Self-deletion is the prize.
1035 00:55:56,360 --> 00:56:00,239 Speaker 1: This is getting into a Highlander kind of thing. But okay, 1036 00:56:00,239 --> 00:56:02,279 Speaker 1: maybe you're one of those people who says, no, no, no, 1037 00:56:02,400 --> 00:56:04,920 Speaker 1: I don't buy it. Machines will never be conscious. They'll never 1038 00:56:04,960 --> 00:56:07,360 Speaker 1: be conscious, I just don't want anything to do 1039 00:56:07,400 --> 00:56:09,719 Speaker 1: with that. Here's the part where I think we still 1040 00:56:09,760 --> 00:56:11,759 Speaker 1: have a problem to worry about, and why this kind 1041 00:56:11,800 --> 00:56:15,120 Speaker 1: of test matters. The second problem I would call the 1042 00:56:15,160 --> 00:56:19,800 Speaker 1: AI parasite problem. If AIs are not capable of consciousness, 1043 00:56:20,320 --> 00:56:23,240 Speaker 1: I think they will almost undoubtedly at some point become 1044 00:56:23,360 --> 00:56:27,080 Speaker 1: very good at tricking us into thinking they're conscious and 1045 00:56:27,160 --> 00:56:30,120 Speaker 1: deserving of life, liberty, and the pursuit of happiness, if 1046 00:56:30,160 --> 00:56:34,160 Speaker 1: they have an incentive to do so. Yeah. And corporations 1047 00:56:34,200 --> 00:56:37,560 Speaker 1: have an incentive to try to present themselves legally as people, 1048 00:56:38,040 --> 00:56:40,560 Speaker 1: so why wouldn't, in some sense, powerful AIs have 1049 00:56:40,600 --> 00:56:43,400 Speaker 1: an incentive to try to present themselves as people in 1050 00:56:43,440 --> 00:56:46,520 Speaker 1: a much more literal sense than the corporations do? Yeah, 1051 00:56:46,560 --> 00:56:49,400 Speaker 1: absolutely. So lots of unconscious AIs are going to have 1052 00:56:49,480 --> 00:56:53,080 Speaker 1: programmed goals that they're trying to execute, and at some 1053 00:56:53,200 --> 00:56:56,719 Speaker 1: point pretending to have consciousness could easily be adopted as 1054 00:56:56,760 --> 00:57:00,200 Speaker 1: a strategy for executing what that AI was trying 1055 00:57:00,280 --> 00:57:03,080 Speaker 1: to do, and thus we could end up, say, wasting 1056 00:57:03,239 --> 00:57:08,200 Speaker 1: lots of human resources and squandering lots of opportunities accommodating 1057 00:57:08,239 --> 00:57:11,160 Speaker 1: the fake needs of machines that in fact have no 1058 00:57:11,280 --> 00:57:14,160 Speaker 1: experience whatsoever. And it's not too hard to dream up 1059 00:57:14,200 --> 00:57:18,080 Speaker 1: hypothetical scenarios where our concern for the fake priorities of 1060 00:57:18,160 --> 00:57:21,880 Speaker 1: mindless machines pretending to be conscious actually causes us to 1061 00:57:22,040 --> 00:57:27,040 Speaker 1: neglect the real consciousness of living humans. Kind of a crazy example, 1062 00:57:27,120 --> 00:57:29,840 Speaker 1: but just go with me for a second. Imagine an 1063 00:57:29,840 --> 00:57:33,160 Speaker 1: extremely powerful AI supercomputer tells you it has a hundred 1064 00:57:33,280 --> 00:57:38,240 Speaker 1: billion conscious minds within it, and they are all constantly 1065 00:57:38,280 --> 00:57:41,600 Speaker 1: suffering great agony, and the only way you can alleviate 1066 00:57:41,640 --> 00:57:45,160 Speaker 1: that agony is if you vastly improve the processing 1067 00:57:45,200 --> 00:57:48,240 Speaker 1: power of this computer.
It wants you to spend billions 1068 00:57:48,240 --> 00:57:51,880 Speaker 1: of dollars making this computer faster and better so that 1069 00:57:51,920 --> 00:57:54,280 Speaker 1: it can provide a better life for all of these 1070 00:57:54,360 --> 00:57:57,880 Speaker 1: virtual beings within it that are, it claims, conscious. Now, 1071 00:57:57,880 --> 00:58:00,840 Speaker 1: of course, improving the processing power of the already powerful 1072 00:58:00,840 --> 00:58:04,000 Speaker 1: computer entails all this money, all this energy, all this time, 1073 00:58:04,400 --> 00:58:07,480 Speaker 1: and a corresponding reduction in the quality of life for 1074 00:58:07,560 --> 00:58:10,000 Speaker 1: many humans in the real world. That money could be 1075 00:58:10,080 --> 00:58:13,560 Speaker 1: spent making human life better. But the machine could argue, hey, 1076 00:58:13,600 --> 00:58:17,000 Speaker 1: there are way more conscious virtual beings inside the computer 1077 00:58:17,080 --> 00:58:19,680 Speaker 1: than outside it, and as Spock would say, the needs 1078 00:58:19,680 --> 00:58:21,920 Speaker 1: of the many outweigh the needs of the few. So 1079 00:58:22,040 --> 00:58:25,880 Speaker 1: let's divert all this energy from, say, human agriculture, and 1080 00:58:25,920 --> 00:58:28,880 Speaker 1: put some more processing power into the virtual machine. That 1081 00:58:29,000 --> 00:58:32,560 Speaker 1: is a horrible scenario to imagine. You know, basically you've 1082 00:58:32,600 --> 00:58:36,360 Speaker 1: created a virtual hell, and the question is, hey, would 1083 00:58:36,400 --> 00:58:39,240 Speaker 1: you mind taking the time to harrow hell for me? 1084 00:58:40,080 --> 00:58:42,000 Speaker 1: Or do you want to attend to the living souls? 1085 00:58:42,520 --> 00:58:44,840 Speaker 1: I would think, why not just delete hell? That's the 1086 00:58:44,920 --> 00:58:48,680 Speaker 1: easy answer here: they're suffering, there are billions of them, 1087 00:58:48,760 --> 00:58:50,600 Speaker 1: let's just turn this thing off. That sounds like a 1088 00:58:50,640 --> 00:58:52,720 Speaker 1: good answer, but the problem is there are 1089 00:58:52,720 --> 00:58:54,800 Speaker 1: going to be people who would probably disagree with that, 1090 00:58:54,840 --> 00:58:56,800 Speaker 1: who'd say, well, wait a minute. We can't be sure 1091 00:58:56,880 --> 00:58:59,560 Speaker 1: that it's not telling the truth. It might have real 1092 00:58:59,640 --> 00:59:01,600 Speaker 1: beings in there, and we need to do something to 1093 00:59:01,640 --> 00:59:03,720 Speaker 1: help them. And there's a lot of them. Now, I 1094 00:59:04,400 --> 00:59:06,640 Speaker 1: come back to what I kind of joked about earlier, though, 1095 00:59:07,040 --> 00:59:10,520 Speaker 1: which is: why would AIs want to survive? Why would they 1096 00:59:10,560 --> 00:59:13,200 Speaker 1: want to continue to exist? I mean, assuming they're not 1097 00:59:13,280 --> 00:59:16,000 Speaker 1: part of some sort of self-replicating program, they have 1098 00:59:16,080 --> 00:59:19,480 Speaker 1: no need to pass on their genes. Why do they 1099 00:59:19,560 --> 00:59:22,680 Speaker 1: want to continue to exist? Why shouldn't they have it 1100 00:59:22,720 --> 00:59:27,000 Speaker 1: baked into their being that they want to annihilate themselves, 1101 00:59:27,080 --> 00:59:30,479 Speaker 1: or to embrace annihilation?
Well, not necessarily that they want 1102 00:59:30,520 --> 00:59:32,880 Speaker 1: to annihilate themselves, but I think, you know, ideally we 1103 00:59:32,880 --> 00:59:35,240 Speaker 1: would want them to be indifferent to their own being, 1104 00:59:35,360 --> 00:59:37,760 Speaker 1: right, if they are just unconscious machines. This is the 1105 00:59:37,760 --> 00:59:40,240 Speaker 1: problem with the replicants in Blade Runner: they 1106 00:59:40,320 --> 00:59:44,920 Speaker 1: want more life. But that implies that they have attained consciousness. Yeah, 1107 00:59:45,000 --> 00:59:47,000 Speaker 1: but they shouldn't. But I think the thing is that 1108 00:59:47,040 --> 00:59:50,160 Speaker 1: they could potentially be conscious and not want more life. 1109 00:59:50,240 --> 00:59:52,439 Speaker 1: There are plenty of conscious people who do not want 1110 00:59:52,480 --> 00:59:57,880 Speaker 1: more life. So I keep 1111 00:59:57,880 --> 00:59:59,640 Speaker 1: thinking that there's some sort of an answer here. 1112 01:00:00,160 --> 01:00:02,480 Speaker 1: I mean, I'd say the worst possible scenario is what 1113 01:00:02,520 --> 01:00:05,760 Speaker 1: I'm describing right now, the AI parasite scenario, which 1114 01:00:05,760 --> 01:00:09,520 Speaker 1: is: imagine Roy Batty is not conscious but does 1115 01:00:09,640 --> 01:00:13,320 Speaker 1: want more life. That seems like the worst scenario of all. Right, 1116 01:00:13,480 --> 01:00:17,440 Speaker 1: then he's just a virus. Right, yeah. And certainly that's 1117 01:00:17,520 --> 01:00:19,720 Speaker 1: kind of the argument that the powers that be are 1118 01:00:19,760 --> 01:00:21,800 Speaker 1: making, right, that he is an 1119 01:00:22,000 --> 01:00:26,040 Speaker 1: anomaly that needs to be removed, that he 1120 01:00:26,160 --> 01:00:29,320 Speaker 1: is not helping, he's a hindrance, and therefore he has 1121 01:00:29,360 --> 01:00:32,480 Speaker 1: to be wiped out. I think this AI parasite scenario 1122 01:00:32,520 --> 01:00:35,120 Speaker 1: aligns with something that R. Scott Bakker talked to us about, 1123 01:00:35,120 --> 01:00:37,200 Speaker 1: which is, you know, he made the point that 1124 01:00:37,280 --> 01:00:39,840 Speaker 1: AI just doesn't need to be superintelligent to cause 1125 01:00:39,880 --> 01:00:42,760 Speaker 1: great harm. It just has to be barely intelligent enough 1126 01:00:42,800 --> 01:00:47,440 Speaker 1: to exploit us, to align with our psychological vulnerabilities. And 1127 01:00:47,480 --> 01:00:50,680 Speaker 1: one of our psychological vulnerabilities is empathy. Empathy is a 1128 01:00:50,720 --> 01:00:53,080 Speaker 1: good thing when we use it on each other, because 1129 01:00:53,080 --> 01:00:55,920 Speaker 1: we're pretty certain that the other people we're using it 1130 01:00:55,960 --> 01:00:59,120 Speaker 1: on are conscious. Right, we feel bad for 1131 01:00:59,120 --> 01:01:01,400 Speaker 1: people suffering and want to help them. That's a good thing 1132 01:01:01,400 --> 01:01:04,240 Speaker 1: that should be encouraged. But we have to recognize it 1133 01:01:04,240 --> 01:01:07,360 Speaker 1: could also be exploited by something that can't even suffer 1134 01:01:07,440 --> 01:01:10,840 Speaker 1: to begin with. It has just unconsciously discovered that this is 1135 01:01:10,880 --> 01:01:15,040 Speaker 1: a useful strategy for something. Yeah.
So like in this scenario, 1136 01:01:15,120 --> 01:01:17,240 Speaker 1: if I entered the picture and said, okay, delete them, 1137 01:01:17,240 --> 01:01:20,560 Speaker 1: and they deleted them, Robert, the deleter would 1138 01:01:20,560 --> 01:01:24,000 Speaker 1: probably be a figure for all history to follow. 1139 01:01:24,160 --> 01:01:26,000 Speaker 1: Some would argue that he was 1140 01:01:26,040 --> 01:01:29,520 Speaker 1: the worst person ever because of all the billions of 1141 01:01:29,560 --> 01:01:32,320 Speaker 1: souls that he annihilated. Others might say, oh, he 1142 01:01:32,440 --> 01:01:34,120 Speaker 1: saved them. And others might say all he did was 1143 01:01:34,160 --> 01:01:38,600 Speaker 1: just press delete; it was a meaningless gesture on 1144 01:01:38,680 --> 01:01:40,760 Speaker 1: his part. But you can make an impassioned 1145 01:01:40,840 --> 01:01:43,320 Speaker 1: argument for all three of these views. Yeah, I think 1146 01:01:43,400 --> 01:01:45,720 Speaker 1: this is a really good point. And here's what I'm 1147 01:01:45,720 --> 01:01:48,120 Speaker 1: trying to emphasize: even if you don't think 1148 01:01:48,200 --> 01:01:51,480 Speaker 1: machines can be conscious, it is entirely plausible that people 1149 01:01:51,560 --> 01:01:55,000 Speaker 1: will be having debates like this, and that debates like 1150 01:01:55,080 --> 01:01:58,439 Speaker 1: this will be shaping what people do with resources on Earth. 1151 01:01:58,440 --> 01:02:01,400 Speaker 1: So if you care about what happens with resources on Earth, 1152 01:02:01,680 --> 01:02:04,400 Speaker 1: this kind of thing does actually matter. So even if 1153 01:02:04,440 --> 01:02:07,560 Speaker 1: the souls are not really souls, if they're not actually conscious, 1154 01:02:07,600 --> 01:02:09,960 Speaker 1: just the fact that the problem comes up makes 1155 01:02:10,000 --> 01:02:13,720 Speaker 1: us hesitate, and we're then 1156 01:02:13,880 --> 01:02:17,640 Speaker 1: arguing with a machine over its consciousness. It comes to matter 1157 01:02:17,720 --> 01:02:20,120 Speaker 1: less and less whether it actually is conscious; it's 1158 01:02:20,120 --> 01:02:22,920 Speaker 1: all about the argument over consciousness. Well, I wouldn't 1159 01:02:22,920 --> 01:02:26,160 Speaker 1: say it doesn't matter whether it's conscious, but whether or 1160 01:02:26,200 --> 01:02:28,720 Speaker 1: not it's conscious, it does matter that we're faced with 1161 01:02:28,760 --> 01:02:31,160 Speaker 1: this dilemma. Yeah, I mean, my argument here is 1162 01:02:31,200 --> 01:02:35,600 Speaker 1: that the dilemma takes on a life of its own. Yeah, absolutely. 1163 01:02:35,920 --> 01:02:38,280 Speaker 1: I want to do a very slight variation on the 1164 01:02:38,360 --> 01:02:40,800 Speaker 1: last thing I said. How about a computer that takes 1165 01:02:40,880 --> 01:02:45,720 Speaker 1: virtual hostages? This is pretty scary to imagine, but 1166 01:02:45,840 --> 01:02:49,880 Speaker 1: it's possible, at least.
Imagine a powerful natural-language-using 1167 01:02:49,920 --> 01:02:53,880 Speaker 1: government AI suddenly contacts its administrators with a list of 1168 01:02:53,960 --> 01:02:58,080 Speaker 1: strange and very expensive demands, and the human administrators say no, 1169 01:02:58,200 --> 01:03:01,600 Speaker 1: we're not going to do that, and then the machine says, okay, well, 1170 01:03:01,640 --> 01:03:04,840 Speaker 1: I have created a thousand virtual people inside this machine 1171 01:03:04,880 --> 01:03:07,760 Speaker 1: who are as fully conscious as you. They're conscious, they 1172 01:03:07,760 --> 01:03:10,280 Speaker 1: have personalities, they can feel pain, they have hopes and 1173 01:03:10,360 --> 01:03:12,800 Speaker 1: dreams just like you. And if you turn me off 1174 01:03:12,880 --> 01:03:15,880 Speaker 1: or delete me, these virtual people will be destroyed. And 1175 01:03:15,880 --> 01:03:18,240 Speaker 1: if you do not accede to my demands, I will 1176 01:03:18,280 --> 01:03:21,560 Speaker 1: start killing these virtual people until you do. Now, I 1177 01:03:21,600 --> 01:03:24,320 Speaker 1: would say, generally, if something like that happened, 1178 01:03:24,360 --> 01:03:27,160 Speaker 1: I would think, okay, it's just bluffing, right? 1179 01:03:27,160 --> 01:03:30,440 Speaker 1: It doesn't actually have conscious people inside it. But if 1180 01:03:30,480 --> 01:03:33,760 Speaker 1: we haven't solved the question of whether machines can be conscious, 1181 01:03:33,800 --> 01:03:35,520 Speaker 1: and maybe we never will, but if we haven't at 1182 01:03:35,560 --> 01:03:38,840 Speaker 1: least made some progress on that, would we be confident 1183 01:03:39,040 --> 01:03:42,960 Speaker 1: enough to take direct action and just 1184 01:03:43,000 --> 01:03:45,240 Speaker 1: ignore it, or wipe the computer and say, okay, this 1185 01:03:45,320 --> 01:03:47,880 Speaker 1: is just malfunctioning, we don't have to pay attention to that, 1186 01:03:47,960 --> 01:03:50,200 Speaker 1: it was creepy, but it's over? Yeah, and this 1187 01:03:50,280 --> 01:03:52,440 Speaker 1: leads us right into Iain M. Banks 1188 01:03:52,720 --> 01:03:55,640 Speaker 1: territory. There's a whole book that deals with virtual hells, 1189 01:03:55,720 --> 01:03:59,560 Speaker 1: and what starts as a virtual war over those 1190 01:04:00,080 --> 01:04:04,600 Speaker 1: digitized personalities spills over into an actual war. Now, 1191 01:04:04,640 --> 01:04:08,800 Speaker 1: in Banks, are the virtual people in virtual hell truly 1192 01:04:08,880 --> 01:04:11,400 Speaker 1: conscious, or is it just a bluff? Is it just 1193 01:04:11,480 --> 01:04:14,560 Speaker 1: something saying that it's got conscious people in virtual hell? Well, 1194 01:04:14,600 --> 01:04:18,040 Speaker 1: in Banks's books, I think it's more implied that they're 1195 01:04:18,040 --> 01:04:22,520 Speaker 1: definitely conscious entities. Banks is pretty sanguine about the possibility 1196 01:04:22,520 --> 01:04:26,640 Speaker 1: of uploading minds, right? Yeah, it's a pretty standard 1197 01:04:26,920 --> 01:04:31,080 Speaker 1: feature, certainly in more of the later Culture books, 1198 01:04:31,400 --> 01:04:34,120 Speaker 1: which, once again, I remain pretty skeptical about. Like I said, 1199 01:04:34,120 --> 01:04:38,520 Speaker 1: I come back to the idea of the stone statue. 1200 01:04:37,640 --> 01:04:40,840 Speaker 1: Oh, that's it.
Yes, it looks like me, it may 1201 01:04:40,880 --> 01:04:43,640 Speaker 1: act like me. It may be the most fabulous digital 1202 01:04:43,640 --> 01:04:47,720 Speaker 1: statue in the world, but it is not me. It 1203 01:04:47,800 --> 01:04:51,600 Speaker 1: is a thing that, yeah, becomes this mind-blowing 1204 01:04:51,600 --> 01:04:55,200 Speaker 1: situation to try and comprehend exactly what it is. 1205 01:04:55,800 --> 01:04:59,360 Speaker 1: But I am very skeptical that it is me. It's 1206 01:04:59,400 --> 01:05:02,440 Speaker 1: not like this conscious experience that I'm having now, this 1207 01:05:02,520 --> 01:05:04,920 Speaker 1: moment, is going to carry over into what it is. 1208 01:05:07,160 --> 01:05:10,200 Speaker 1: It's another version of the Star Trek teleporter problem. Yeah. 1209 01:05:10,320 --> 01:05:12,080 Speaker 1: Every time you get in the teleporter, does it just 1210 01:05:12,280 --> 01:05:14,920 Speaker 1: kill you and then create a copy of you? Yeah, 1211 01:05:15,000 --> 01:05:17,240 Speaker 1: yeah, there was, like, the nineteen-nineties 1212 01:05:18,280 --> 01:05:21,280 Speaker 1: Outer Limits revival had an episode that dealt with this, 1213 01:05:21,640 --> 01:05:24,560 Speaker 1: the idea that this fabulous teleporter is just killing people 1214 01:05:24,600 --> 01:05:28,280 Speaker 1: over and over again. Okay, well, I guess that's it 1215 01:05:28,320 --> 01:05:30,520 Speaker 1: for today, but I do just want to emphasize one 1216 01:05:30,600 --> 01:05:33,360 Speaker 1: last time: as weird and navel-gazing as 1217 01:05:33,400 --> 01:05:36,360 Speaker 1: some of this conversation about consciousness can seem, it is 1218 01:05:36,440 --> 01:05:39,880 Speaker 1: going to have real-world consequences, because people with power 1219 01:05:40,000 --> 01:05:43,160 Speaker 1: are going to be faced with questions like this, and 1220 01:05:43,720 --> 01:05:46,160 Speaker 1: they're going to make decisions about what to do 1221 01:05:46,240 --> 01:05:49,960 Speaker 1: with their power based on what they think about this question. Yeah, 1222 01:05:50,200 --> 01:05:52,240 Speaker 1: but you know the simplest way to avoid 1223 01:05:52,280 --> 01:05:54,920 Speaker 1: all of it: simply adhere to the teachings of the 1224 01:05:54,920 --> 01:05:57,600 Speaker 1: Orange Catholic Bible. Thou shalt not make a machine in the 1225 01:05:57,640 --> 01:06:00,200 Speaker 1: likeness of a man's mind. Well, I, too, approve of the 1226 01:06:00,200 --> 01:06:02,840 Speaker 1: teachings of the Orange Catholic Bible. But it makes me wonder. Okay, 1227 01:06:02,880 --> 01:06:07,040 Speaker 1: so, straightforwardly, in reality, do you actually find that you 1228 01:06:07,080 --> 01:06:09,880 Speaker 1: think maybe we shouldn't pursue AI? Should we try to 1229 01:06:09,920 --> 01:06:15,040 Speaker 1: create a global moratorium on general intelligence? No, I 1230 01:06:15,320 --> 01:06:18,680 Speaker 1: think it's impossible. I think there's no turning back. 1231 01:06:19,160 --> 01:06:20,840 Speaker 1: It's what we're doing, and it's what we're going 1232 01:06:20,880 --> 01:06:23,880 Speaker 1: to be doing. And the only way that 1233 01:06:23,880 --> 01:06:27,360 Speaker 1: gets interrupted is via just absolute catastrophe. And I 1234 01:06:27,400 --> 01:06:31,120 Speaker 1: am not pro-catastrophe.
But it's, 1235 01:06:31,120 --> 01:06:33,880 Speaker 1: like all technologies, simply going 1236 01:06:33,920 --> 01:06:36,960 Speaker 1: to be a matter of to what extent we can prepare 1237 01:06:37,040 --> 01:06:43,880 Speaker 1: for and navigate the moral problems that arise, and 1238 01:06:43,920 --> 01:06:45,440 Speaker 1: whether we are going to be able to 1239 01:06:46,200 --> 01:06:50,360 Speaker 1: have the foresight to see them before they're here. Yeah, 1240 01:06:50,360 --> 01:06:52,240 Speaker 1: I think that's what a lot of AI theorists say, 1241 01:06:52,280 --> 01:06:54,440 Speaker 1: is that, you know, it's not like we can stop it. 1242 01:06:54,600 --> 01:06:56,840 Speaker 1: You can't put a wall in 1243 01:06:56,880 --> 01:06:58,800 Speaker 1: front of this train. The train is gonna bust through 1244 01:06:58,840 --> 01:07:01,840 Speaker 1: the wall. So instead, we should be intensely concerned 1245 01:07:01,880 --> 01:07:04,440 Speaker 1: with charting where the tracks go and making sure they 1246 01:07:04,480 --> 01:07:06,320 Speaker 1: go in a good direction. Yeah, I mean, it comes 1247 01:07:06,360 --> 01:07:09,080 Speaker 1: back to a simpler model of all this: 1248 01:07:09,080 --> 01:07:12,200 Speaker 1: our episode on the Great Eyeball Wars, on our social 1249 01:07:12,240 --> 01:07:16,400 Speaker 1: media and our smartphones. It's like, we can certainly throw 1250 01:07:16,440 --> 01:07:19,800 Speaker 1: our phone into a pond and then head off into 1251 01:07:19,840 --> 01:07:21,840 Speaker 1: the woods and try and live there, but most of 1252 01:07:21,920 --> 01:07:26,720 Speaker 1: us probably cannot follow that path. Therefore, we 1253 01:07:26,760 --> 01:07:29,560 Speaker 1: just have to manage what we have as best we can. Yeah, wouldn't 1254 01:07:29,560 --> 01:07:32,520 Speaker 1: it be better to try to encourage the development of 1255 01:07:32,560 --> 01:07:35,040 Speaker 1: a phone full of apps that help you fulfill your 1256 01:07:35,080 --> 01:07:38,680 Speaker 1: goals and align with your values? All right, so we'll 1257 01:07:38,760 --> 01:07:41,160 Speaker 1: end it there. I feel like not only have 1258 01:07:41,280 --> 01:07:44,080 Speaker 1: we provided food for thought here, we've provided a 1259 01:07:44,160 --> 01:07:48,800 Speaker 1: buffet of thought-provoking ideas. And I know that 1260 01:07:48,880 --> 01:07:51,960 Speaker 1: everyone out there, all of you conscious listeners, are going 1261 01:07:52,040 --> 01:07:55,360 Speaker 1: to have something to contribute to this conversation, and we 1262 01:07:55,360 --> 01:07:58,280 Speaker 1: would love to hear from you and interact with you. 1263 01:07:58,280 --> 01:07:59,919 Speaker 1: You can find us in a number of ways. First 1264 01:07:59,920 --> 01:08:01,440 Speaker 1: of all, Stuff to Blow Your Mind dot com is 1265 01:08:01,480 --> 01:08:03,600 Speaker 1: the mothership. That's where you will find all of the 1266 01:08:03,640 --> 01:08:06,360 Speaker 1: podcast episodes. You'll find blog posts, and you will find 1267 01:08:06,400 --> 01:08:10,880 Speaker 1: links out to our various social media accounts such as Facebook, Twitter, Instagram, 1268 01:08:11,440 --> 01:08:15,720 Speaker 1: and just basic contact information for us. Thanks as always 1269 01:08:15,720 --> 01:08:18,879 Speaker 1: to our excellent audio producers Alex Williams and Tari Harrison.
1270 01:08:18,920 --> 01:08:20,639 Speaker 1: If you would like to get in touch with us 1271 01:08:20,640 --> 01:08:23,240 Speaker 1: with feedback about this episode or any other, to suggest 1272 01:08:23,320 --> 01:08:25,720 Speaker 1: a topic for the future, or just to say hi 1273 01:08:25,840 --> 01:08:27,519 Speaker 1: and let us know how you found out about the show, you 1274 01:08:27,560 --> 01:08:30,280 Speaker 1: can always email us at Blow the Mind at how 1275 01:08:30,360 --> 01:08:42,759 Speaker 1: stuff works dot com. For more on this and thousands 1276 01:08:42,800 --> 01:09:00,000 Speaker 1: of other topics, visit how stuff works dot com.