Speaker 1: On the podcast, we love to ask deep questions and then explore the edge of our knowledge. Today we're going to go a little bit meta and try to understand the workings of our own minds, the us that is asking those questions and yearning to understand the universe, from biology to physics, and yes, sometimes even a little bit of chemistry. Science gives us this incredible tool to consistently build knowledge about the universe, but it's not clear that it can tell us about ourselves. That it can help us understand why we have a first person subjective experience in the first place, why we have an experience at all, why chocolate has a taste and pain has a negative valence, why there's something it's like to be me. How's that different from what it's like to be a bat? Is it like anything to be a rock or a puddle or a star? Science, like all human endeavors, is limited by the senses that we use to interact with the universe.
It can help us understand what we sense and find patterns and build models, but it can't ever really tell us about the hard reality of the physical universe that exists out there beyond our minds and our senses. And that same limitation might apply to the other end of that sensory pipe. Can a scientist use science to probe the nature of the mind while trapped inside their own mind? That might just be the province of philosophy. Today on the podcast, we're getting philosophical and asking what we know, and how we can know, about how consciousness emerges. Welcome to Daniel and Kelly's Extraordinarily Self-Conscious Universe.

Speaker 2: Hello, I'm Kelly Weinersmith. I study parasites and space, and today I'm wondering if we live in a simulation.

Speaker 1: Hi, I'm Daniel. I'm a particle physicist, and I don't care if we live in a simulation. I want to eat simulated dark chocolate.

Speaker 2: So that's my question. Is it possible that we live in a simulation? We're getting philosophical today, let's go all the way.

Speaker 1: Yeah, absolutely, it's possible. But it also seems like a modern reflection on recent concepts about how computers work.
You know, one hundred years ago, people thought the whole universe was like pipes and cogs, and these days we think it's all computers. So it's possible, but we also have zero evidence that it's a simulation, and dark chocolate tastes really, really good anyway.

Speaker 2: That's true, and so does pad see ew. I'm hungry today.

Speaker 1: This is why we shouldn't record the podcast just before lunch.

Speaker 2: That's right. I'm so glad that I have simulated taste buds and smell receptors so that I can enjoy all of those things very much.

Speaker 1: But it is fascinating to think about where your consciousness, where your experience comes from, why it is that you even have one, why you can like or hate pad see ew, whether or not that's something that comes from the meat in your mind, or whether it's just the information circling through those neurons, which could be uploaded to the cloud.

Speaker 2: Yeah. I mean, I think you'd have to be a machine to not like Thai food in general, personally, but I could be wrong. But I'm excited that... so you and I, we get a lot of emails from listeners.
You can send those emails to questions at danielandkelly dot org. And lately we've been getting a lot of questions from folks who have been asking about consciousness. I think we've been getting like one a week for a while, and we decided it was time to jump into the consciousness question, and you lucked out, and you happen to know the perfect guest.

Speaker 1: I do. There's lots of really smart folks here at UC Irvine, and I'm good at googling and finding them and persuading them to come and talk to me on the podcast. I have a lot of fun, and this is a topic that's really close to my heart. I've been thinking about it for a long time, and also thinking about whether it's something you really can ask scientifically. You know, science is wonderful and powerful, and of course we are advocates for science here on the podcast. But just because it's a tool that does let us build knowledge about the universe doesn't mean it's a tool that can answer every question, right? You can't answer the question like, should I get out of bed in the morning?
Why do people eat white chocolate? Anyway? Right, not every question is scientific, and so there are things about the universe that we can probe with science, and there are limits to what science can answer. And I've always wondered how much we can understand about consciousness itself using science. So I was really excited to talk to a cognitive scientist who thinks about this all day in a very serious way.

Speaker 2: Yeah, I'm excited that consciousness is sort of at the interface of science and philosophy, which makes it a really fun topic to think about. And so I had a lot of fun today with Megan Peters, chatting with her about this.

Speaker 1: So before we dive into this wonderful conversation, we were curious what folks out there thought about their own consciousness, what ideas people have for why they even have an experience, how these few pounds of goo in your brain somehow generate all of this joy and frustration that we experience. So I asked our cadre of volunteers to chime in on the question: how does consciousness emerge?
If you'd like to provide answers for a future episode, please don't be shy to write to questions at danielandkelly dot org. So think about it for a minute. How do you think consciousness emerges? Here's what our listeners had to say.

Speaker 3: I think we have to really think about what consciousness might mean before we can figure out where it comes from. But if I had to make a guess, I'd say consciousness is just energy sliding into different forms as it rolls along from our brain.

Speaker 1: It's an exponential extension of the emergence of life. I don't think anyone knows. Human consciousness is an intersection of the more mechanistic functions of the human brain and the created soul. I think consciousness needs you to have both theory of mind and the complexity to apply it to yourself.

Speaker 4: Very slowly, on a Saturday morning after a night out down the town, with several coffees.

Speaker 1: Consciousness is fundamental, and we as humans are somehow tapped into it. Wow, consciousness, that's a biggie. Well, I think it's magic.
Speaker 3: As the parent of twins who think I'm pretty much useless, I'll say that somehow it involves interaction with others.

Speaker 1: Consciousness emerges when the penguin models in. I've always leaned towards the idea that consciousness is an illusion.

Speaker 4: I believe consciousness emerges when the brain's electrical field of plasma interacts with the neurons and dendrites in the brain.

Speaker 2: I think the brain itself is a sensory organ that can sense its own processing.

Speaker 3: A pint of strong fresh coffee is a necessary but not sufficient condition.

Speaker 1: It's a very complicated dance in the brain. It's a very slow process. I think consciousness takes many, many forms, things that the brain is doing: neurons firing, reaching a critical consistency and an interconnection number.

Speaker 3: I don't think we can know how consciousness emerges until we know what consciousness is and if we have it at all.

Speaker 2: Right. We've got coffee, penguins, magic, all kinds of amazing things in there, and a large diversity of answers.

Speaker 1: Mmm. And there's a thread here.
I've noticed a lot of people suggesting that somehow it arises from the complexity of the brain: neurons reaching a critical number or, you know, all these interactions. To me, that sort of frames the question but doesn't really answer it, right? Like, why is it that a complex network of cells can have a consciousness emerge, regardless of the complexity? It's not obvious to me that complexity is enough. And that's really, I think, the focus of the question for today's episode.

Speaker 2: I would personally love to know the evolutionary reason why we have consciousness. How does it benefit us? And would we expect other animals to have it as well? Yeah, and anyway, there were so many exciting things we talked about with Megan today.

Speaker 1: Yeah, there's lots of really fun questions there, like were there other species of humans which actually were more intelligent but didn't survive because our ancestors were more warlike or more cannibalistic? Right? Or if you ran the Earth experiment a thousand times, how often would you even get intelligence, or more or less intelligence, right?
Is this a fluke or is it a common outcome? Man, I'd love to know the answer to those questions.

Speaker 2: And you're also getting into some of the complexity there, because you were calling it intelligence, and where does intelligence end and consciousness begin? And what is the difference in those things? And it gets complicated pretty fast.

Speaker 1: It does. And fortunately we have an expert to help us dig through this. So let's jump right into our conversation with Professor Megan Peters from the Cognitive Science department here at UC Irvine. All right, so it's my pleasure to have on the podcast Professor Megan Peters. She's an associate professor here at UC Irvine in the Department of Cognitive Science. She has a PhD from UCLA in cognitive science, and her research aims to reveal how the brain represents and uses uncertainty. She uses neuroimaging, computational modeling, machine learning, and neural stimulation techniques to study all of this. And most importantly, she agreed to come on the podcast and talk to us about what consciousness is, and answer all of our questions, and fend off Kelly's poop references.
Speaker 4: Ah, good luck. Thank you both for having me. It's really a pleasure to be here. As an aside, my PhD is in psychology with a focus on cognitive and computational neuroscience, if that matters. So you can cut this part out if you want to redo that intro.

Speaker 1: No, we'll find out if that matters. We'll see. Yes, absolutely. Great, so let's dig right in. I want to know, and this is maybe the hardest question we're going to ask you all day: how do we define consciousness? We're going to be talking about it, we're going to be arguing about it, we're going to be saying, do rocks have it? Do dogs have it? But it's kind of slippery if we don't even know what it is we're talking about. So what is consciousness? How do we define this thing that we all sort of know intuitively but have a hard time describing?

Speaker 4: Super important question, and it leads me into one of the first kind of technical things that we'll talk about today, which is the distinction between "are the lights on?" and "is anyone home?"
Essentially. So we talk, in the fields of psychology and in the philosophy of mind, about the distinction between what's called access consciousness and phenomenal consciousness. These are terms that the philosopher Ned Block introduced a while back, and the idea is that access consciousness is about something like the global availability of information in a complex system, to allow that system to behave usefully in its environment: to survive, to seek goals, to seek rewards, to get food, to not be eaten by something. And the distinction then is between, you know, access consciousness, which is information going all over in the brain, or in the mind if there isn't a brain present, and phenomenal consciousness, which has to do with the phenomenology of the experience that the observer is having. So I'm going to break that down. That really just comes down to this: something that it is like to be you, the qualitative experiences that you have of the world. The fact that pain hurts.
It isn't just a signal like your Tesla would send to itself, like "damage in my right front tire" or something like that. It's more like it has a qualitative character to it. There's something that it's like to be in pain. If I whack you in the leg, it's not just a signal that I've damaged the tissue in your leg. There's more to it than that.

Speaker 1: So are you saying that a Tesla has access consciousness because it notices damage in its tire, but it doesn't feel pain, doesn't have phenomenological consciousness? Or am I missing something?

Speaker 4: I wouldn't even go that far, to say that a Tesla has any sort of access consciousness whatsoever. I think that that's a higher bar there. But I think that the distinction is important.

Speaker 2: Right.

Speaker 4: We can imagine a system that has all of the hallmarks of access consciousness, like a fancy future Tesla, okay, you know, that has all of this global availability of information. It enables flexible, adaptive behavior. It can change based on its context. It's not just going according to its programming, you know, that kind of thing.
And yet there's nothing that it's like to be that fancy future Tesla. It's just a zombie. It's just engaging in behaviors that are useful for that organism, but there's no one home. So the someone-being-home aspect is also really important, because we can imagine, in, like, you know, not a future robot scenario but just a medical situation, where you've got a person who's in a coma, or they're in a persistent vegetative state, and they might seemingly wake up and have sleep-wake cycles and maybe respond to external stimuli. But the critical factor in deciding whether, you know, someone's home, whether to keep them on life support, is whether there's anything it is like to be them inside. And the flip side is also really important in the context of medical science, because even if they don't exhibit any of the outward signs or symptoms of being conscious, of having access to that available information, of being able to behave in their environment, someone still might be in there. They might be locked in.
Speaker 4: And so it's this presence of phenomenal experience, of someone being home, so to speak, that I think is the important thing to remember when we're talking about consciousness in the context of biological systems or artificial ones.

Speaker 2: So I'm still trying to wrap my head around the two different kinds of consciousness. Is there a kind of organism, or a situation a person can be in, where they would have one but not the other, to help me sort of differentiate?

Speaker 4: Well, for humans, the idea is that, presuming that you don't subscribe to, you know, philosophical zombieism, that, like, you have no idea that I'm conscious, right? I exhibit all the hallmarks, but you have no idea. But presuming that you don't want to go down that rabbit hole...

Speaker 1: I do, actually, in a minute.

Speaker 4: In humans, it seems like they go hand in hand, right? If you have access consciousness, you have phenomenal consciousness, in general, as an awake, behaving organism.
But there might be cases in specific scientific experiments where you can reveal symptoms or cases or evidence of access consciousness in the absence of phenomenal consciousness. So there are some very specific psychological experiments that suggest that these are separable entities. One classic example is a task that was actually attributed to and developed by a cognitive scientist here in my department, George Sperling. So this is the classic Sperling task, where you show someone an array of letters. There's like five letters in a row, and there's rows of letters, and you flash it very fast, and you ask people to kind of give their impression of the overall array, like, did you see it? Did you feel like you got the whole thing? And people say, yeah, I feel like I got the whole array. That's fine. But then when you ask them to report the whole array, they can't. But when you ask them to report a specific row, they can.
It feels like there's a distinction, then, between this feeling that you've got phenomenal experience of the whole array, but your access might be limited. And so Ned Block has famously called this phenomenal consciousness overflowing access.

Speaker 1: It seems to me sort of the crucial distinction, right? Because access consciousness is something we can understand sort of on a fundamental level. Like, we can trace signals into your eye and watch those proteins fold, and then up the optic nerve and into your brain. But we can't know whether anybody's there, like, experiencing, or what that experience of seeing a red photon is, or whether my red is the same as your red, and all this sort of dorm room philosophy kind of stuff. And so access consciousness is sort of more accessible scientifically than phenomenological consciousness, which is more philosophical. Is that fair?
Speaker 4: I think that's a relatively fair characterization, in that there are a lot of kind of current scientific theories of consciousness that purport to be a theory of consciousness, and most of them, really, when it comes down to brass tacks, end up being about access consciousness, because the phenomenal part is really hard to get at, as you said. I think that there are some approaches that might be promising in this vein. So you could look for neural correlates, or patterns of neural activity, that are associated with reports of phenomenal experience. So, like, I can create conditions where I show you the same stimulus over and over and over again, and so the early parts of your visual brain are responding kind of similarly. I mean, there's noise, and there's variability, and so on. But if I flash the same thing at you over and over and over again, I'm going to get a consistent kind of pattern of responding in the back of your head, which is the early visual cortex. But you might have fluctuations in your subjective experience of that stimulus.
Sometimes you feel like you see it, sometimes you feel like you can't. So to the extent that I can hold most of the stuff in the back of your head constant, and I can measure that in relation to flashing something at you over and over and over again, but then I look for how your subjective experience fluctuates from one moment to the next: "I feel like I saw that strongly." "I feel very confident that I got that right." "I feel like that was nothing at all; I didn't really experience anything." Then I can go look for neural correlates that are co-varying with the subjective experience that you're reporting to me. There are a lot of problems with that too, that maybe you don't have perfect access to your own subjective experiences and your reports are spurious and blah blah blah. But I think that there are ways that we can go about trying to get at how the brain constructs or supports or represents these subjective experiences, which are different from just how your brain processes information about the external environment.

Speaker 1: Right. And I do want to get into those experiments.
337 00:18:14,200 --> 00:18:16,159 Speaker 1: But first I just want to make sure we're totally 338 00:18:16,160 --> 00:18:19,679 Speaker 1: clear on these definitions. And I think the example that 339 00:18:19,720 --> 00:18:21,760 Speaker 1: you were a little dismissive of a minute ago is 340 00:18:21,800 --> 00:18:24,560 Speaker 1: actually helpful, at least to me, to clarify what we're 341 00:18:24,560 --> 00:18:27,360 Speaker 1: talking about. And that's the example of philosophical zombies. Right. 342 00:18:27,680 --> 00:18:30,960 Speaker 1: The idea here is, could you take Daniel and replace 343 00:18:31,040 --> 00:18:35,680 Speaker 1: him with some machine, biological or whatever, that replicates all 344 00:18:35,720 --> 00:18:39,400 Speaker 1: of my actions and seems, the way I do, to be conscious? 345 00:18:39,520 --> 00:18:43,680 Speaker 1: But there's nobody home, right, there's nobody experiencing the red 346 00:18:43,800 --> 00:18:46,080 Speaker 1: and eating the pizza and whatever. It's just... but it 347 00:18:46,119 --> 00:18:50,240 Speaker 1: seems like it reacts exactly the same way. So the philosophical zombie argument, 348 00:18:50,280 --> 00:18:51,960 Speaker 1: as I understand it, and tell me if I'm wrong, 349 00:18:52,280 --> 00:18:56,000 Speaker 1: conjectures that it's possible to build something like this, which 350 00:18:56,080 --> 00:19:00,239 Speaker 1: means therefore that the phenomenological consciousness is totally unmeasurable. Right, 351 00:19:00,280 --> 00:19:02,040 Speaker 1: that there's no way for us to know. 352 00:19:02,200 --> 00:19:03,920 Speaker 1: As you said, we have to sort of trust your 353 00:19:04,000 --> 00:19:07,879 Speaker 1: first person reports about your subjective experience, which makes it 354 00:19:07,880 --> 00:19:11,560 Speaker 1: difficult to do any actual science. Right, is that a 355 00:19:11,600 --> 00:19:13,800 Speaker 1: fair characterization of philosophical zombies? 
356 00:19:13,840 --> 00:19:16,000 Speaker 4: If I missed something... Oh, I have like five things 357 00:19:16,000 --> 00:19:17,439 Speaker 4: I want to say, let's make sure I get to 358 00:19:17,440 --> 00:19:21,040 Speaker 4: all of them. Okay, So first, the idea of philosophical 359 00:19:21,119 --> 00:19:24,840 Speaker 4: zombies has been hotly debated in the philosophical literature for 360 00:19:24,880 --> 00:19:27,159 Speaker 4: a very long time, and there are a number of 361 00:19:27,160 --> 00:19:31,000 Speaker 4: philosophers who say, yes, this is absolutely totally reasonable to 362 00:19:31,080 --> 00:19:32,719 Speaker 4: posit, that this could happen. And then there are 363 00:19:32,720 --> 00:19:35,320 Speaker 4: other people who say, no, like, that's not really like 364 00:19:35,400 --> 00:19:39,680 Speaker 4: a reasonable assumption. So you know, go read Dan Dennett 365 00:19:39,680 --> 00:19:41,880 Speaker 4: if you want to hear about all of this stuff. 366 00:19:42,359 --> 00:19:46,880 Speaker 4: But I think that you've touched upon another important point too, 367 00:19:47,760 --> 00:19:51,320 Speaker 4: which is this idea of testing for whether someone is 368 00:19:51,359 --> 00:19:54,360 Speaker 4: in there or not, and how we don't currently have 369 00:19:54,520 --> 00:19:57,280 Speaker 4: the capacity to do that even in other beings that 370 00:19:57,320 --> 00:20:00,480 Speaker 4: look and behave precisely exactly like you do, like me 371 00:20:01,200 --> 00:20:01,439 Speaker 4: or like... 372 00:20:01,520 --> 00:20:04,360 Speaker 1: Kelly. Or chemists, like, are they really in there? 373 00:20:04,400 --> 00:20:06,800 Speaker 1: Do they actually like chemistry? Is that possible? 374 00:20:07,280 --> 00:20:08,320 Speaker 2: They have to be a machine? 375 00:20:08,440 --> 00:20:10,520 Speaker 4: Right, are they just lying? 376 00:20:10,800 --> 00:20:11,119 Speaker 1: Yeah? 
377 00:20:11,520 --> 00:20:12,880 Speaker 4: We don't have a way of testing it. We don't 378 00:20:12,880 --> 00:20:14,960 Speaker 4: have like a consciousness-o-meter. I can't pull out my 379 00:20:15,240 --> 00:20:17,840 Speaker 4: hair-dryer-looking consciousness-o-meter and point it at you and 380 00:20:17,880 --> 00:20:19,840 Speaker 4: be like, are you conscious? Like we don't have that. 381 00:20:20,320 --> 00:20:22,080 Speaker 1: How do you know what it would look like? We 382 00:20:22,119 --> 00:20:23,840 Speaker 1: don't even have the device. Why does it look like 383 00:20:23,880 --> 00:20:24,720 Speaker 1: a hair dryer? 384 00:20:25,800 --> 00:20:28,520 Speaker 4: At some point Dave Chalmers posited that it was 385 00:20:28,560 --> 00:20:30,480 Speaker 4: going to look like a hair dryer, and it has 386 00:20:30,560 --> 00:20:33,760 Speaker 4: kind of propagated throughout my thinking since then. But that 387 00:20:33,840 --> 00:20:36,200 Speaker 4: seems about right, right? Like it's kind of like a speedometer, 388 00:20:36,320 --> 00:20:38,240 Speaker 4: like, you know, a cop pulls that out on the 389 00:20:38,240 --> 00:20:38,880 Speaker 4: side of the road. 390 00:20:39,000 --> 00:20:40,879 Speaker 2: Too much consciousness, here's your ticket. 391 00:20:42,560 --> 00:20:45,040 Speaker 4: Something like that. But we don't have those, and in fact, 392 00:20:45,200 --> 00:20:47,320 Speaker 4: like we wouldn't even know how to build one, right, 393 00:20:47,400 --> 00:20:50,359 Speaker 4: We don't even know what the relevant metrics are. Do 394 00:20:50,400 --> 00:20:53,280 Speaker 4: we care about measuring brain stuff? Or is that kind 395 00:20:53,320 --> 00:20:56,840 Speaker 4: of spurious? 
But one of the arguments kind of against 396 00:20:56,880 --> 00:21:02,439 Speaker 4: philosophical zombie-ism is the supposition that consciousness is not just 397 00:21:02,520 --> 00:21:05,080 Speaker 4: this epiphenomenon, that it's not just this thing that 398 00:21:05,200 --> 00:21:08,119 Speaker 4: kind of comes along for the ride over and above 399 00:21:08,600 --> 00:21:12,760 Speaker 4: an organism behaving usefully in its environment and flexibly adapting 400 00:21:12,760 --> 00:21:16,480 Speaker 4: to different conditions and being intelligent and seeking goals and 401 00:21:16,520 --> 00:21:20,159 Speaker 4: not getting eaten by predators. That consciousness itself serves a 402 00:21:20,240 --> 00:21:24,400 Speaker 4: function, that you cannot have all of that stuff, all 403 00:21:24,440 --> 00:21:29,760 Speaker 4: of that useful stuff for staying alive, without consciousness. And 404 00:21:29,800 --> 00:21:33,679 Speaker 4: this is an arguable position. It's not a fact. But 405 00:21:34,400 --> 00:21:37,960 Speaker 4: the folks who are going to argue that consciousness serves 406 00:21:37,960 --> 00:21:41,160 Speaker 4: a function, that there is a function of consciousness, it's not 407 00:21:41,240 --> 00:21:44,119 Speaker 4: just that there are functions for creating consciousness somewhere in 408 00:21:44,160 --> 00:21:48,600 Speaker 4: your head, but that consciousness itself is useful, that it 409 00:21:49,119 --> 00:21:55,600 Speaker 4: allows some evolutionary adaptive advantage: that is certainly a defensible position. 410 00:21:55,760 --> 00:21:59,240 Speaker 4: And from that context you actually couldn't have philosophical zombies 411 00:21:59,240 --> 00:22:03,040 Speaker 4: at all, because it is actually impossible to get all 412 00:22:03,080 --> 00:22:06,000 Speaker 4: of those other behaviors and cognitive processes and all of 413 00:22:06,000 --> 00:22:08,560 Speaker 4: that stuff without the consciousness bit. 
414 00:22:08,920 --> 00:22:12,280 Speaker 1: I see. If that's true, then the interaction with people 415 00:22:12,520 --> 00:22:14,680 Speaker 1: who appear to be conscious is evidence that they are 416 00:22:14,760 --> 00:22:15,720 Speaker 1: actually conscious. 417 00:22:16,000 --> 00:22:19,800 Speaker 4: Yeah, but these are all like arguments that we can have, right. 418 00:22:19,880 --> 00:22:22,800 Speaker 4: There's no way of saying I'm right and you're wrong, 419 00:22:22,840 --> 00:22:24,520 Speaker 4: depending on which position you're holding. 420 00:22:25,080 --> 00:22:27,679 Speaker 1: I think about this when I interact with my dog, 421 00:22:28,119 --> 00:22:30,800 Speaker 1: you know, because I feel like my dog is in 422 00:22:30,840 --> 00:22:33,760 Speaker 1: there right, Like my dog knows me, my dog loves me. 423 00:22:33,800 --> 00:22:36,320 Speaker 1: I love my dog. My dog understands that I love him, 424 00:22:36,800 --> 00:22:40,159 Speaker 1: that I'm nice to him. It's hard to imagine that 425 00:22:40,359 --> 00:22:43,639 Speaker 1: there's nobody at home in my dog. But you know, 426 00:22:43,680 --> 00:22:47,040 Speaker 1: this whole concept suggests that it's possible for there to 427 00:22:47,080 --> 00:22:49,439 Speaker 1: be a machine effectively, and I say a machine just 428 00:22:49,480 --> 00:22:52,320 Speaker 1: to indicate like that there's nobody home, though I don't 429 00:22:52,320 --> 00:22:55,440 Speaker 1: actually know if AI could have phenomenological consciousness, But it's 430 00:22:55,440 --> 00:22:58,159 Speaker 1: hard to imagine that you're not actually in there, Kelly's 431 00:22:58,160 --> 00:22:59,600 Speaker 1: not actually in there. 
I'm the only one in the 432 00:22:59,680 --> 00:23:03,880 Speaker 1: universe who's experiencing this, but I can't actually prove it, right, 433 00:23:03,960 --> 00:23:07,520 Speaker 1: So it really is a fundamentally important point, even though it 434 00:23:07,600 --> 00:23:10,679 Speaker 1: sounds absurd on the face of it, right, that we 435 00:23:10,720 --> 00:23:13,240 Speaker 1: could be the only one aware, that we can't actually 436 00:23:13,240 --> 00:23:16,000 Speaker 1: know if anybody else is in there. We feel people 437 00:23:16,040 --> 00:23:19,000 Speaker 1: loving us, you know, and their experience and their pain 438 00:23:19,040 --> 00:23:21,840 Speaker 1: reflects ours, but we can't actually know. And I think 439 00:23:21,840 --> 00:23:24,400 Speaker 1: that's a fundamentally important point to hold on to, even 440 00:23:24,440 --> 00:23:25,719 Speaker 1: though it feels ridiculous. 441 00:23:26,000 --> 00:23:29,679 Speaker 4: It is, I mean, the easiest solution here is that 442 00:23:29,720 --> 00:23:31,639 Speaker 4: we all have consciousness, right Like that would be the 443 00:23:31,640 --> 00:23:35,320 Speaker 4: most parsimonious explanation, and it's easy to make that leap 444 00:23:35,440 --> 00:23:41,679 Speaker 4: because there's so many physical information processing similarities between you 445 00:23:41,760 --> 00:23:44,600 Speaker 4: and me. Right, our brains are not exactly the same, 446 00:23:44,640 --> 00:23:47,200 Speaker 4: but they're pretty darn close, and so we can make 447 00:23:47,200 --> 00:23:51,040 Speaker 4: the leap maybe to talking about great apes or monkeys 448 00:23:51,160 --> 00:23:53,960 Speaker 4: or dogs or other vertebrates, other mammals, that kind of thing. 449 00:23:54,440 --> 00:23:56,919 Speaker 4: I have a harder time when we get down into 450 00:23:57,200 --> 00:23:59,639 Speaker 4: you know, insects, like I don't know that it'd be 451 00:24:00,240 --> 00:24:01,160 Speaker 4: a really... 
452 00:24:01,359 --> 00:24:03,480 Speaker 1: But I don't know, like what are people who like 453 00:24:03,520 --> 00:24:05,400 Speaker 1: live in Virginia, for example. That's just like... 454 00:24:07,119 --> 00:24:12,320 Speaker 4: I'm not going there. But the presence of, like, the 455 00:24:12,359 --> 00:24:16,480 Speaker 4: idea of a philosophical zombie, it becomes maybe a more 456 00:24:16,880 --> 00:24:20,440 Speaker 4: useful exercise from like an empirical science standpoint, or from 457 00:24:20,520 --> 00:24:23,680 Speaker 4: even a philosophy standpoint, when we start talking about entities 458 00:24:23,720 --> 00:24:27,200 Speaker 4: that are so fundamentally different from us, so not just 459 00:24:27,359 --> 00:24:30,560 Speaker 4: you versus me, or us versus your dog or my cat, 460 00:24:30,600 --> 00:24:32,399 Speaker 4: who you know is probably right on the border of it, 461 00:24:33,760 --> 00:24:36,840 Speaker 4: but no, I think he's conscious. I think there's something 462 00:24:36,880 --> 00:24:39,200 Speaker 4: that it's like to be him. But when we talk 463 00:24:39,240 --> 00:24:43,760 Speaker 4: about octopuses like those are really weirdly different creatures that 464 00:24:43,800 --> 00:24:46,960 Speaker 4: are basically aliens, or when we talk about you know, 465 00:24:47,080 --> 00:24:51,280 Speaker 4: the potential for silicon based systems in the future or 466 00:24:51,440 --> 00:24:53,879 Speaker 4: alien species or those kinds of things, things that are 467 00:24:53,920 --> 00:24:57,160 Speaker 4: so fundamentally different from us. 
Then it starts to be like, okay, well, 468 00:24:57,240 --> 00:24:58,960 Speaker 4: when we don't have the kind of one to one 469 00:24:59,000 --> 00:25:02,639 Speaker 4: mapping between the structure of my biological robot that 470 00:25:02,680 --> 00:25:05,959 Speaker 4: I drive around and your biological robot that you drive around, 471 00:25:06,680 --> 00:25:08,520 Speaker 4: right, when we don't have that similarity, 472 00:25:08,640 --> 00:25:10,800 Speaker 4: then it becomes maybe a little bit more useful to 473 00:25:10,840 --> 00:25:13,359 Speaker 4: talk about, like, can you have all of these behaviors 474 00:25:13,720 --> 00:25:17,040 Speaker 4: and all of these cognitive capacities in the absence of 475 00:25:17,520 --> 00:25:20,080 Speaker 4: someone being home? And we don't know the answer, but 476 00:25:20,240 --> 00:25:22,680 Speaker 4: it at least gives us more than just kind of 477 00:25:22,720 --> 00:25:26,320 Speaker 4: a philosophical, as you said, dorm room argument of, I feel like 478 00:25:26,440 --> 00:25:28,359 Speaker 4: I don't know that you're in there. But is that 479 00:25:28,440 --> 00:25:30,600 Speaker 4: a difference that makes a difference right now? Probably not? 480 00:25:30,920 --> 00:25:32,240 Speaker 2: All right, So we're going to take a break and 481 00:25:32,240 --> 00:25:34,400 Speaker 2: when we get back, we're going to dig into octopi 482 00:25:34,520 --> 00:25:53,640 Speaker 2: a little bit more. So for access consciousness, and you've 483 00:25:53,680 --> 00:25:56,560 Speaker 2: sort of hinted at this already, the definition included, like, 484 00:25:56,680 --> 00:26:00,920 Speaker 2: escaping predators, having different motivations. That to me does sound 485 00:26:00,960 --> 00:26:04,760 Speaker 2: like it encompasses at least all vertebrates. 
So is it 486 00:26:04,920 --> 00:26:07,320 Speaker 2: safe to say then that the consensus is that all 487 00:26:07,440 --> 00:26:10,359 Speaker 2: vertebrates at least have access consciousness? 488 00:26:10,600 --> 00:26:13,240 Speaker 4: I don't want to speak for any big scientific community 489 00:26:13,240 --> 00:26:16,600 Speaker 4: on anybody's behalf, but I would say that my personal sense 490 00:26:16,880 --> 00:26:19,080 Speaker 4: is that, yes, you're going to be hard pressed to 491 00:26:19,119 --> 00:26:23,080 Speaker 4: find people who would die on the hill of saying 492 00:26:23,160 --> 00:26:25,840 Speaker 4: that cats are not conscious. I think that would be 493 00:26:25,880 --> 00:26:26,560 Speaker 4: hard to find. 494 00:26:26,760 --> 00:26:30,159 Speaker 2: Yeah, so you mentioned that octopuses are smart, but their 495 00:26:30,160 --> 00:26:31,960 Speaker 2: brain is different than ours. Would you say that they 496 00:26:31,960 --> 00:26:35,680 Speaker 2: have phenomenological consciousness because it seems like someone's in there 497 00:26:35,720 --> 00:26:36,840 Speaker 2: when you interact with them. 498 00:26:37,040 --> 00:26:40,320 Speaker 4: I have no idea. I have no idea, but I 499 00:26:40,359 --> 00:26:42,880 Speaker 4: think that you know, saying that we have no idea 500 00:26:43,119 --> 00:26:46,520 Speaker 4: is reasonable. Yeah, under this situation that I don't think 501 00:26:46,560 --> 00:26:50,520 Speaker 4: that we have strong empirical evidence either way, because that 502 00:26:50,640 --> 00:26:53,960 Speaker 4: strong empirical evidence is predicated on this whole conversation that 503 00:26:54,000 --> 00:26:56,400 Speaker 4: we've just been having about what would that evidence even 504 00:26:56,440 --> 00:26:59,960 Speaker 4: look like? 
And how do you separate evidence for phenomenal 505 00:27:00,280 --> 00:27:04,280 Speaker 4: consciousness from evidence for not just access consciousness but just 506 00:27:04,440 --> 00:27:09,280 Speaker 4: pure intelligent behavior? Right, never mind access consciousness and this 507 00:27:09,359 --> 00:27:14,640 Speaker 4: global availability of information thing. It's really just that octopuses 508 00:27:14,760 --> 00:27:20,560 Speaker 4: are very intelligent. They're very intelligent. Clearly, they can 509 00:27:20,600 --> 00:27:23,000 Speaker 4: be even sneaky, you know, they get out of their 510 00:27:23,040 --> 00:27:25,120 Speaker 4: cage and they go open something and steal the food 511 00:27:25,160 --> 00:27:26,400 Speaker 4: and then they go home so that they don't get 512 00:27:26,400 --> 00:27:31,560 Speaker 4: in trouble. So they're clearly highly intelligent creatures. I think 513 00:27:31,680 --> 00:27:35,879 Speaker 4: that my margins of uncertainty, my cone of uncertainty, is 514 00:27:35,960 --> 00:27:38,680 Speaker 4: just so wide there that I have no idea. 515 00:27:38,760 --> 00:27:41,720 Speaker 2: So how do you study the difference between intelligent behaviors 516 00:27:41,760 --> 00:27:45,600 Speaker 2: and consciousness? We'll just stick with access consciousness in animals 517 00:27:45,640 --> 00:27:46,120 Speaker 2: in the lab. 518 00:27:46,520 --> 00:27:49,680 Speaker 4: Yeah, so in animals that's even harder too, because how 519 00:27:49,720 --> 00:27:53,120 Speaker 4: do you query the phenomenal experience of a rat? It's 520 00:27:53,200 --> 00:27:55,719 Speaker 4: kind of hard. In my line of work, we focus 521 00:27:55,760 --> 00:27:59,520 Speaker 4: on one particular kind of subjective experience, which is that 522 00:28:00,040 --> 00:28:05,360 Speaker 4: of metacognitive uncertainty. 
So one of the things that we 523 00:28:05,440 --> 00:28:07,960 Speaker 4: ask people to do, like people now, but we can 524 00:28:08,040 --> 00:28:10,400 Speaker 4: do this potentially in rodents and monkeys as well, though 525 00:28:10,400 --> 00:28:12,160 Speaker 4: I don't have those folks in my lab. I only 526 00:28:12,200 --> 00:28:15,040 Speaker 4: have people. We'll ask you to do some task, like 527 00:28:15,080 --> 00:28:16,960 Speaker 4: we're going to show you some stuff on a computer screen, 528 00:28:17,040 --> 00:28:18,920 Speaker 4: ask you to press buttons, tell us what you see, 529 00:28:19,200 --> 00:28:21,720 Speaker 4: and then we'll ask for a subjective report on top 530 00:28:21,800 --> 00:28:24,080 Speaker 4: of it: How sure are you that you got that right? 531 00:28:24,520 --> 00:28:28,240 Speaker 4: How confident do you feel in your assessments? How clearly 532 00:28:28,280 --> 00:28:32,119 Speaker 4: do you think you saw the thing? And some of 533 00:28:32,160 --> 00:28:34,720 Speaker 4: those questions we can design experiments to ask in the 534 00:28:34,760 --> 00:28:37,840 Speaker 4: animals too, and so those are kind of more about 535 00:28:37,840 --> 00:28:43,480 Speaker 4: the subjective experience, especially if we do manipulations to the 536 00:28:43,520 --> 00:28:48,880 Speaker 4: stimuli or task such that the animal's behavior is basically 537 00:28:48,920 --> 00:28:52,840 Speaker 4: exactly the same between one condition versus another condition, but 538 00:28:52,920 --> 00:28:57,160 Speaker 4: their subjective reports differ. 
So that tells us that we're 539 00:28:57,160 --> 00:28:58,880 Speaker 4: trying to get at something that has to do with 540 00:28:58,920 --> 00:29:04,320 Speaker 4: phenomenology or maybe something that's facilitated by global access of information, 541 00:29:05,040 --> 00:29:08,360 Speaker 4: rather than something that's just kind of the zombie part 542 00:29:08,480 --> 00:29:11,920 Speaker 4: of their brain, like you know, almost reflex-like responding 543 00:29:11,920 --> 00:29:14,720 Speaker 4: to the stimuli out there in the world. But the 544 00:29:14,800 --> 00:29:20,160 Speaker 4: distinction of whether that subjective report is facilitated by globally available 545 00:29:20,200 --> 00:29:23,840 Speaker 4: access to information or we're actually tapping into phenomenology, 546 00:29:23,880 --> 00:29:26,920 Speaker 4: that's also kind of sticky when you get to talking 547 00:29:26,960 --> 00:29:29,640 Speaker 4: about rodents and that kind of thing. But that at 548 00:29:29,720 --> 00:29:33,680 Speaker 4: least allows us to say, never mind all of phenomenal 549 00:29:33,720 --> 00:29:36,720 Speaker 4: experience or subjective experience or consciousness. I'm going to focus 550 00:29:36,720 --> 00:29:40,520 Speaker 4: on this bit, just this one thing that I can operationalize, 551 00:29:41,000 --> 00:29:43,160 Speaker 4: very classic for a psychologist, right? I'm going to put 552 00:29:43,200 --> 00:29:44,840 Speaker 4: you in a room and ask you to look at 553 00:29:44,840 --> 00:29:47,200 Speaker 4: a computer screen and press one of two buttons for 554 00:29:47,240 --> 00:29:49,960 Speaker 4: an hour, and that's what you're going to do. But 555 00:29:50,080 --> 00:29:51,400 Speaker 4: that's how we try to get at this. 
556 00:29:52,000 --> 00:29:55,040 Speaker 1: I think the example of an octopus really puts the 557 00:29:55,080 --> 00:29:58,240 Speaker 1: finger or the tentacle on the question of like what 558 00:29:58,320 --> 00:30:01,600 Speaker 1: it's like, which to me is the core of the question. Like, 559 00:30:02,040 --> 00:30:04,960 Speaker 1: you know, the mechanics of how information is accessed or 560 00:30:05,040 --> 00:30:08,440 Speaker 1: stored or whatever. That's fascinating and that's good science. But 561 00:30:08,520 --> 00:30:11,240 Speaker 1: to me, the real question, which I think is what 562 00:30:11,280 --> 00:30:14,400 Speaker 1: people call the hard problem, is how you get to 563 00:30:14,920 --> 00:30:18,760 Speaker 1: have an experience from something which is you know, just 564 00:30:18,880 --> 00:30:21,480 Speaker 1: made out of stuff. Like my desk doesn't have an experience, 565 00:30:21,760 --> 00:30:24,880 Speaker 1: My computer, I think, doesn't have an experience. Most of 566 00:30:24,920 --> 00:30:27,280 Speaker 1: the universe doesn't have an experience. But I have an experience, 567 00:30:27,280 --> 00:30:29,760 Speaker 1: and you have an experience, and probably, for example, an 568 00:30:29,760 --> 00:30:33,160 Speaker 1: octopus has a very different kind of experience. It's got 569 00:30:33,200 --> 00:30:36,360 Speaker 1: like eight little brains arguing about what each leg will do, 570 00:30:37,040 --> 00:30:39,280 Speaker 1: and you know, an alien out there with a different 571 00:30:39,400 --> 00:30:41,680 Speaker 1: kind of sense. You know, a tongue that can taste 572 00:30:41,760 --> 00:30:45,160 Speaker 1: quantum electrons or something might have a completely different kind 573 00:30:45,240 --> 00:30:49,280 Speaker 1: of experience. And to me, the hard question is, you know, 574 00:30:49,360 --> 00:30:53,080 Speaker 1: where does this come from, this phenomenological aspect. 
Is it 575 00:30:53,120 --> 00:30:56,760 Speaker 1: fundamental to matter? Is it somehow emergent in some way 576 00:30:56,760 --> 00:30:59,960 Speaker 1: we don't yet understand? How do you pose this question? 577 00:31:00,320 --> 00:31:02,320 Speaker 1: Is that the central question you think, and how do 578 00:31:02,360 --> 00:31:04,320 Speaker 1: you think it's best asked? 579 00:31:04,560 --> 00:31:07,440 Speaker 4: I think that's absolutely one of the central questions, is, 580 00:31:07,520 --> 00:31:09,320 Speaker 4: you know, how is it that the something that it's 581 00:31:09,440 --> 00:31:12,719 Speaker 4: like, that the conscious experience, arises from matter. That is, 582 00:31:12,760 --> 00:31:16,800 Speaker 4: as you said, Dave Chalmers's hard problem, which is, 583 00:31:16,960 --> 00:31:20,000 Speaker 4: you know, there seems to be this fundamental disconnect between 584 00:31:20,040 --> 00:31:23,920 Speaker 4: patterns of brain response and the subjective experience. It's a 585 00:31:24,000 --> 00:31:27,520 Speaker 4: very different natural kind. It's a very different kind of stuff, right, 586 00:31:27,600 --> 00:31:31,560 Speaker 4: The subjective experience stuff is unlike any other kind of 587 00:31:31,600 --> 00:31:32,720 Speaker 4: stuff in biology. 588 00:31:33,000 --> 00:31:33,960 Speaker 1: What do you mean by that? 589 00:31:33,960 --> 00:31:37,320 Speaker 4: That's the hard problem. The idea is that it's 590 00:31:37,400 --> 00:31:42,760 Speaker 4: not clear how you would get something like subjective experience, 591 00:31:42,840 --> 00:31:48,400 Speaker 4: whatever that is, out of the interactions between physical neurons. 
592 00:31:49,160 --> 00:31:51,640 Speaker 4: The bridge, you know, might be something like, okay, well, 593 00:31:51,680 --> 00:31:54,719 Speaker 4: one of the things that physical neurons do in talking 594 00:31:54,760 --> 00:31:58,080 Speaker 4: to each other, similar to how computers do this, 595 00:31:58,240 --> 00:32:00,800 Speaker 4: is that they create information. So now we're kind of 596 00:32:00,800 --> 00:32:05,400 Speaker 4: moving from physical space into information space, but nevertheless we 597 00:32:05,440 --> 00:32:08,800 Speaker 4: don't know how to even get from information to subjective experience. 598 00:32:09,320 --> 00:32:12,040 Speaker 4: We can call them the same thing and say, hey, 599 00:32:12,040 --> 00:32:14,240 Speaker 4: we're done, like, moving on with our lives, but that 600 00:32:14,320 --> 00:32:17,080 Speaker 4: doesn't feel satisfying. And there are people who think that 601 00:32:17,120 --> 00:32:18,920 Speaker 4: the hard problem is not a problem at all, that 602 00:32:19,000 --> 00:32:22,280 Speaker 4: subjective experience might not even exist at all, or that 603 00:32:22,400 --> 00:32:27,120 Speaker 4: our belief that subjective experience exists is an illusion. So 604 00:32:27,200 --> 00:32:29,960 Speaker 4: there's like a whole... this is really complicated, and we 605 00:32:30,000 --> 00:32:32,240 Speaker 4: could talk for hours and hours about this. 606 00:32:32,360 --> 00:32:34,760 Speaker 1: Who's experiencing that illusion, though, in that case? 607 00:32:35,000 --> 00:32:38,000 Speaker 4: Right, yeah, exactly. But if you want to read about, 608 00:32:38,040 --> 00:32:42,360 Speaker 4: you know, illusionism from a very deep and powerful perspective, 609 00:32:42,800 --> 00:32:44,840 Speaker 4: I'm going to mention him again, go read Dan Dennett. 610 00:32:44,880 --> 00:32:48,160 Speaker 4: He wrote very extensively on this topic. 
So the fundamental 611 00:32:48,200 --> 00:32:50,280 Speaker 4: question is, you know, how do we get something like 612 00:32:50,320 --> 00:32:54,840 Speaker 4: consciousness out of something like brains? And is the substrate 613 00:32:54,920 --> 00:32:59,400 Speaker 4: important for creating the subjective experience and the shape and 614 00:32:59,480 --> 00:33:03,680 Speaker 4: nature of that subjective experience, or is there kind of 615 00:33:04,000 --> 00:33:07,920 Speaker 4: one type of consciousness that all kinds of systems might create, 616 00:33:08,440 --> 00:33:12,200 Speaker 4: regardless of their physical substrate. And you might maybe think 617 00:33:12,280 --> 00:33:15,520 Speaker 4: that the latter statement seems a little strange, that, like, 618 00:33:15,560 --> 00:33:17,920 Speaker 4: how is it that an octopus could create a subjective 619 00:33:17,920 --> 00:33:21,040 Speaker 4: experience that's very similar to mine because our substrates are 620 00:33:21,080 --> 00:33:25,640 Speaker 4: so fundamentally different. But we do see evidence of convergent 621 00:33:25,720 --> 00:33:30,400 Speaker 4: function in evolution in terms of things like digestion, where like, 622 00:33:30,440 --> 00:33:33,200 Speaker 4: you've got lots of very different kinds of systems that 623 00:33:33,320 --> 00:33:37,200 Speaker 4: all accomplish the same computation or function of digesting stuff, 624 00:33:37,760 --> 00:33:40,680 Speaker 4: And so it is possible to think that different substrates 625 00:33:40,720 --> 00:33:44,000 Speaker 4: might accomplish the same kind of function in creating the 626 00:33:44,040 --> 00:33:45,640 Speaker 4: same kind of conscious experience. 
627 00:33:46,080 --> 00:33:48,120 Speaker 1: Are you saying that the fact that like birds and 628 00:33:48,200 --> 00:33:51,640 Speaker 1: bats evolved flight separately, but fundamentally it's the same thing, 629 00:33:52,160 --> 00:33:56,000 Speaker 1: suggests that different kinds of wet matter could generate the 630 00:33:56,000 --> 00:33:57,520 Speaker 1: same kind of subjective experience. 631 00:33:57,840 --> 00:34:00,680 Speaker 4: It is possible. I don't know. If you think that 632 00:34:00,760 --> 00:34:04,720 Speaker 4: consciousness serves functions, then there might be one kind of 633 00:34:04,720 --> 00:34:07,880 Speaker 4: subjective experience that best serves that function. The only thing 634 00:34:07,880 --> 00:34:10,000 Speaker 4: I was going to say beyond that, though, is that 635 00:34:10,080 --> 00:34:13,960 Speaker 4: I'm more sympathetic to the idea that there's probably different 636 00:34:14,040 --> 00:34:18,120 Speaker 4: kinds of subjective experience across different substrates. It feels like 637 00:34:18,920 --> 00:34:22,080 Speaker 4: the burden of proof on saying that all subjective experiences 638 00:34:22,120 --> 00:34:24,760 Speaker 4: are kind of the same across lots of different systems 639 00:34:24,800 --> 00:34:27,320 Speaker 4: would probably be on the people who are making that claim. 640 00:34:27,360 --> 00:34:29,600 Speaker 4: I think it's much easier to say the kind of 641 00:34:29,640 --> 00:34:32,880 Speaker 4: subjective experience you have depends on the substrate that's generating it. 642 00:34:33,040 --> 00:34:35,359 Speaker 1: How else to explain how some people actually eat white 643 00:34:35,400 --> 00:34:37,719 Speaker 1: chocolate and claim to enjoy it, right, I mean, it's 644 00:34:37,719 --> 00:34:39,040 Speaker 1: impossible to understand. 645 00:34:38,680 --> 00:34:39,759 Speaker 2: That'd be maladaptive. 646 00:34:40,200 --> 00:34:41,879 Speaker 4: So hey, I like white chocolate. 
647 00:34:42,040 --> 00:34:45,840 Speaker 2: Oh oh, sorry, that's all right. I'm from Virginia. Daniel's 648 00:34:45,840 --> 00:34:47,680 Speaker 2: just going around insulting everybody today. 649 00:34:48,120 --> 00:34:48,440 Speaker 1: All right. 650 00:34:48,480 --> 00:34:50,600 Speaker 2: So we were talking to Joe Wolf the other day. 651 00:34:50,640 --> 00:34:53,719 Speaker 2: She's an evolutionary biologist who studies convergent evolution. I feel 652 00:34:53,719 --> 00:34:55,719 Speaker 2: like if she were here, she would be explaining to 653 00:34:55,800 --> 00:34:58,319 Speaker 2: us about how if you look at traits, like, you know, 654 00:34:58,320 --> 00:35:00,879 Speaker 2: the evolution of the crab body plan, you'd really love 655 00:35:00,920 --> 00:35:03,200 Speaker 2: to like have an evolutionary tree where you can count 656 00:35:03,239 --> 00:35:05,839 Speaker 2: how many times this thing popped up and how many 657 00:35:05,880 --> 00:35:10,000 Speaker 2: times it disappeared to try to understand like its adaptive value. 658 00:35:10,120 --> 00:35:12,960 Speaker 2: What kind of preconditions you need before this thing comes 659 00:35:13,040 --> 00:35:16,240 Speaker 2: into existence? Could we ever have that in the study 660 00:35:16,239 --> 00:35:18,440 Speaker 2: of consciousness or will we never be able to know? 661 00:35:18,680 --> 00:35:22,439 Speaker 2: Like does an octopus have access consciousness? Will we ever 662 00:35:22,440 --> 00:35:23,160 Speaker 2: be able to get there? 663 00:35:23,680 --> 00:35:27,040 Speaker 4: Great question. As of now, the path is very murky 664 00:35:27,120 --> 00:35:30,000 Speaker 4: to me because, as we just talked about before, 665 00:35:30,000 --> 00:35:32,480 Speaker 4: like you can't pull out your hair dryer consciousness-o-meter 666 00:35:32,560 --> 00:35:35,040 Speaker 4: and point it at things. 
And the challenge is really because 667 00:35:35,080 --> 00:35:38,279 Speaker 4: the thing that we're studying is by definition unobservable by 668 00:35:38,320 --> 00:35:42,200 Speaker 4: anybody except for the observer who is experiencing the consciousness. 669 00:35:42,480 --> 00:35:44,880 Speaker 4: Like, that is the definition of what we're talking about, 670 00:35:44,920 --> 00:35:48,000 Speaker 4: and so it's really very different from anything else that 671 00:35:48,040 --> 00:35:49,440 Speaker 4: we study in science. 672 00:35:49,640 --> 00:35:52,160 Speaker 1: And there's a second wrinkle there also: not only is this 673 00:35:52,239 --> 00:35:55,360 Speaker 1: something which we can't observe or measure, we have to 674 00:35:55,400 --> 00:35:58,960 Speaker 1: rely on somebody reporting it, but it's also filtered through 675 00:35:59,000 --> 00:36:02,920 Speaker 1: our own consciousness, right? Like, we sort of assume consciousness 676 00:36:02,920 --> 00:36:05,719 Speaker 1: when we do science. We're saying we have hypotheses, we 677 00:36:05,760 --> 00:36:09,000 Speaker 1: do experiments, so everything is filtered through sort of like 678 00:36:09,120 --> 00:36:12,600 Speaker 1: two consciousnesses when we're even like talking about this. And 679 00:36:12,680 --> 00:36:15,000 Speaker 1: maybe you're about to answer this question, but like, is 680 00:36:15,080 --> 00:36:18,200 Speaker 1: this something we can probe scientifically, or is it limited 681 00:36:18,239 --> 00:36:22,120 Speaker 1: to philosophical discussions on the roof with banana peels? 682 00:36:22,280 --> 00:36:25,280 Speaker 4: Oh my gosh, there were like five questions there. Okay, 683 00:36:25,600 --> 00:36:28,560 Speaker 4: so is this something that we can study scientifically? Yes, 684 00:36:28,640 --> 00:36:30,560 Speaker 4: I think so. But I think that we need to 685 00:36:30,760 --> 00:36:33,359 Speaker 4: have a lot of help from philosophers.
So a lot 686 00:36:33,400 --> 00:36:34,880 Speaker 4: of the work that I do, and a lot of 687 00:36:34,880 --> 00:36:39,520 Speaker 4: my scientific friends, are not actually only scientific; they 688 00:36:39,560 --> 00:36:42,359 Speaker 4: are philosophers as well. And I think that there's a 689 00:36:42,400 --> 00:36:44,440 Speaker 4: lot of value that we get from that. 690 00:36:44,719 --> 00:36:47,640 Speaker 1: Yeah. And I didn't mean to suggest that philosophical exploration 691 00:36:47,880 --> 00:36:51,120 Speaker 1: is not valuable. It's absolutely, like, you know, the 692 00:36:51,239 --> 00:36:55,120 Speaker 1: wonderful cousin of scientific exploration, and fundamentally important. But it's 693 00:36:55,120 --> 00:36:57,560 Speaker 1: also different, right? It gives different kinds of answers. 694 00:36:57,920 --> 00:37:00,359 Speaker 4: Yeah, it does. But I think that we need to 695 00:37:00,400 --> 00:37:04,759 Speaker 4: be informed by our philosophical friends down the hall in 696 00:37:05,160 --> 00:37:08,720 Speaker 4: understanding whether the experiments that we're designing are really getting 697 00:37:08,719 --> 00:37:11,480 Speaker 4: at the target that we think we're interested in studying. 698 00:37:11,920 --> 00:37:13,759 Speaker 4: And so there's a lot of work out there that's 699 00:37:13,840 --> 00:37:17,080 Speaker 4: like, okay, I'm going to tell the difference between whether 700 00:37:17,120 --> 00:37:19,560 Speaker 4: you're likely to wake up from a coma or not. 701 00:37:20,440 --> 00:37:23,400 Speaker 4: And that's really relevant in the clinical setting, and is 702 00:37:23,680 --> 00:37:27,080 Speaker 4: so powerful and important that we do that. It doesn't 703 00:37:27,120 --> 00:37:30,400 Speaker 4: necessarily tell us about the experience that the person is having. 704 00:37:30,480 --> 00:37:32,839 Speaker 4: It just tells us, kind of binary, whether it's there 705 00:37:32,960 --> 00:37:35,600 Speaker 4: or not.
And we need that. We need to have 706 00:37:35,680 --> 00:37:39,040 Speaker 4: measures that will allow us to predict: is someone in 707 00:37:39,080 --> 00:37:42,360 Speaker 4: there right now? Are they awake? Are they likely to 708 00:37:42,440 --> 00:37:45,320 Speaker 4: wake up? We need all that stuff, but that doesn't 709 00:37:45,360 --> 00:37:49,719 Speaker 4: really get at the fundamental question of this subjective experience bit. 710 00:37:50,400 --> 00:37:52,319 Speaker 4: And we also have a lot of studies out there 711 00:37:52,360 --> 00:37:59,400 Speaker 4: that purport, maybe, to do research on conscious access, but really, 712 00:37:59,800 --> 00:38:02,880 Speaker 4: if you changed the question, you might start to be 713 00:38:02,920 --> 00:38:05,200 Speaker 4: skeptical about whether they're targeting that. So I'll give you 714 00:38:05,280 --> 00:38:09,680 Speaker 4: an example, which is: as a psychologist, I put 715 00:38:09,719 --> 00:38:11,680 Speaker 4: you in a room and I ask you to tell 716 00:38:11,680 --> 00:38:13,400 Speaker 4: me whether you saw something or not. And that's a 717 00:38:13,400 --> 00:38:16,319 Speaker 4: subjective answer, right? Like, did you see it? Did you 718 00:38:16,400 --> 00:38:19,480 Speaker 4: not see it? Right? Like, I'm asking you, are you 719 00:38:19,600 --> 00:38:22,960 Speaker 4: consciously aware of this stimulus? And then I could go 720 00:38:23,040 --> 00:38:27,000 Speaker 4: measure brain responses or whatever that go with cases when 721 00:38:27,040 --> 00:38:28,960 Speaker 4: you said you saw it versus when you said you 722 00:38:29,000 --> 00:38:30,839 Speaker 4: didn't see it. And I say, okay, now I find, 723 00:38:30,880 --> 00:38:33,960 Speaker 4: like, the neural correlates of consciousness. But now imagine that 724 00:38:34,040 --> 00:38:38,680 Speaker 4: I replace you as the human observer with a photodiode.
725 00:38:39,520 --> 00:38:42,040 Speaker 4: I could do exactly the same experiment, and I would 726 00:38:42,080 --> 00:38:46,759 Speaker 4: never conclude that that photodiode has conscious experience, probably. So 727 00:38:47,080 --> 00:38:49,839 Speaker 4: I think that there are a lot of challenges where 728 00:38:50,040 --> 00:38:54,960 Speaker 4: the philosophical fields, not only philosophy of mind 729 00:38:54,960 --> 00:38:57,759 Speaker 4: but philosophy of science too, I think, really have 730 00:38:57,840 --> 00:39:01,120 Speaker 4: a lot to say about how we're designing experiments and 731 00:39:01,160 --> 00:39:04,640 Speaker 4: how we're interpreting their results. There was another question that 732 00:39:04,680 --> 00:39:06,759 Speaker 4: you asked earlier in that stream, but now I don't 733 00:39:06,800 --> 00:39:09,360 Speaker 4: remember what it was. Maybe it's about, like, can we 734 00:39:09,400 --> 00:39:12,680 Speaker 4: ever develop a test for consciousness? So that's another thing 735 00:39:12,719 --> 00:39:14,879 Speaker 4: that we should talk about. So there's been some work 736 00:39:14,880 --> 00:39:18,440 Speaker 4: that I've contributed to recently where we're saying, well, we 737 00:39:18,480 --> 00:39:21,440 Speaker 4: don't have a consciousness-o-meter, and we wouldn't even know 738 00:39:21,480 --> 00:39:23,239 Speaker 4: how to go about building one. We don't even know 739 00:39:23,280 --> 00:39:26,440 Speaker 4: whether to point it at behavior or brains or something entirely different, 740 00:39:26,520 --> 00:39:30,480 Speaker 4: like, I don't know, auras, some crazy other thing. 741 00:39:30,520 --> 00:39:32,200 Speaker 4: We have no idea what to even point it at, 742 00:39:32,280 --> 00:39:35,080 Speaker 4: never mind how to build it.
But one way that 743 00:39:35,120 --> 00:39:39,040 Speaker 4: we might make progress is to collect all of the 744 00:39:39,040 --> 00:39:42,760 Speaker 4: potential consciousness-o-meters that have been built over the years, 745 00:39:43,120 --> 00:39:46,879 Speaker 4: in terms of behavioral signatures of awareness in humans and 746 00:39:47,120 --> 00:39:50,719 Speaker 4: neural response patterns and so on. Collect them all and 747 00:39:50,840 --> 00:39:55,880 Speaker 4: then make a decision about which ones are applicable. First, 748 00:39:55,960 --> 00:39:57,840 Speaker 4: how they all correlate with each other in terms of 749 00:39:57,880 --> 00:40:00,000 Speaker 4: predicting whether someone is in there and what they're experiencing, 750 00:40:00,960 --> 00:40:04,799 Speaker 4: and then whether they're applicable to a neighboring system. So 751 00:40:04,880 --> 00:40:07,560 Speaker 4: I'm not going to jump straight to octopuses or Teslas 752 00:40:07,680 --> 00:40:13,400 Speaker 4: or aliens, but I might jump to young children, because 753 00:40:13,640 --> 00:40:17,160 Speaker 4: most of these studies and these metrics are developed on adults. 754 00:40:18,000 --> 00:40:20,359 Speaker 4: And so I'm going to say, okay, well, I'm going 755 00:40:20,360 --> 00:40:21,799 Speaker 4: to take all this stuff and then I'm going to 756 00:40:21,840 --> 00:40:23,960 Speaker 4: point it at young children, which I also presume are 757 00:40:24,000 --> 00:40:27,000 Speaker 4: probably in there. You know, when they stub their toe, 758 00:40:27,040 --> 00:40:30,120 Speaker 4: they cry; they indicate that they are experiencing pain.
759 00:40:30,800 --> 00:40:33,239 Speaker 4: And I'm going to see how much those metrics now 760 00:40:33,320 --> 00:40:38,120 Speaker 4: continue to correlate with each other and continue to make 761 00:40:38,280 --> 00:40:41,880 Speaker 4: useful predictions about whether that subject is aware of a 762 00:40:41,920 --> 00:40:45,279 Speaker 4: stimulus or not, or, you know, wakefulness versus sleep, that 763 00:40:45,400 --> 00:40:47,640 Speaker 4: kind of thing. And then if that seems to be okay, 764 00:40:47,680 --> 00:40:49,160 Speaker 4: then I'm going to say, okay, now I'm going to 765 00:40:49,160 --> 00:40:51,600 Speaker 4: point them maybe at great apes. Now I'm going to 766 00:40:51,600 --> 00:40:55,120 Speaker 4: point them maybe at New World monkeys. Now I'm going 767 00:40:55,200 --> 00:40:56,520 Speaker 4: to point them... you know, so we can kind of 768 00:40:56,520 --> 00:41:01,000 Speaker 4: go down the evolutionary hierarchy, so to speak, and say, well, 769 00:41:01,400 --> 00:41:05,000 Speaker 4: the degree to which a particular candidate set of consciousness-o-meters 770 00:41:05,000 --> 00:41:10,560 Speaker 4: is applicable to a particular system is defined by 771 00:41:10,719 --> 00:41:13,600 Speaker 4: their similarity to us, and then we have to make 772 00:41:13,640 --> 00:41:16,920 Speaker 4: decisions about what metrics of similarity to use. But you 773 00:41:16,960 --> 00:41:19,480 Speaker 4: can see that, at least conceptually, from a high level, 774 00:41:19,600 --> 00:41:21,200 Speaker 4: this might be a path forward. 775 00:41:21,800 --> 00:41:24,280 Speaker 1: I have to admit I'm not convinced. Like, it feels 776 00:41:24,280 --> 00:41:29,440 Speaker 1: to me like it's just making fuzzier what we're measuring 777 00:41:29,840 --> 00:41:33,279 Speaker 1: about what's going on mechanistically inside people's heads.
But it 778 00:41:33,320 --> 00:41:36,840 Speaker 1: doesn't actually tell us anything about the first person experience, 779 00:41:36,880 --> 00:41:39,560 Speaker 1: which is, almost because of the way we defined it, 780 00:41:39,640 --> 00:41:43,160 Speaker 1: infinitely inaccessible, right? Like, there's no way for me to 781 00:41:43,200 --> 00:41:47,600 Speaker 1: share my experience with you other than manipulating my 782 00:41:47,680 --> 00:41:50,840 Speaker 1: mouth or whatever and filtering it through your consciousness to 783 00:41:50,880 --> 00:41:53,960 Speaker 1: your first person experience. So it feels to me like 784 00:41:54,000 --> 00:41:57,480 Speaker 1: it's something that's completely inaccessible, and the only way forward, 785 00:41:57,800 --> 00:42:00,600 Speaker 1: I'm guessing, is to, like, try to dig into this 786 00:42:00,680 --> 00:42:03,040 Speaker 1: emergent behavior and see if we can make a bridge 787 00:42:03,080 --> 00:42:08,040 Speaker 1: between the microscopic details that we do understand and somehow 788 00:42:08,280 --> 00:42:12,719 Speaker 1: come out mathematically with a realization of how this first 789 00:42:12,719 --> 00:42:15,840 Speaker 1: person experience has to emerge. I'm skeptical. 790 00:42:15,880 --> 00:42:17,880 Speaker 4: I guess. Well, I want to ask you about the 791 00:42:17,920 --> 00:42:20,680 Speaker 4: mathematical thing, because why is math the answer here? 792 00:42:20,840 --> 00:42:23,600 Speaker 1: Because I'm a physicist. Because you're a physicist. 793 00:42:23,640 --> 00:42:27,120 Speaker 4: Okay, yeah, sure, I knew the answer to that question 794 00:42:27,160 --> 00:42:30,000 Speaker 4: before I asked it. This isn't a zero-sum game. 795 00:42:30,040 --> 00:42:32,440 Speaker 4: We don't have to do one or the other.
I 796 00:42:32,480 --> 00:42:34,719 Speaker 4: think that an analogy that a lot of folks in 797 00:42:34,760 --> 00:42:39,240 Speaker 4: consciousness science like to use is that of early investigations 798 00:42:39,239 --> 00:42:42,000 Speaker 4: into the nature of life and vitalism. And so this 799 00:42:42,120 --> 00:42:44,880 Speaker 4: idea that we had to discover this very specific, fancy 800 00:42:44,920 --> 00:42:47,719 Speaker 4: thing that was almost magical in nature, that was, like, 801 00:42:47,800 --> 00:42:49,760 Speaker 4: why is this alive and why is this not alive? 802 00:42:49,800 --> 00:42:51,640 Speaker 4: And let's maybe do some math, or maybe, like, 803 00:42:51,680 --> 00:42:54,359 Speaker 4: do a bunch of empirical experiments to discover, like, the 804 00:42:54,400 --> 00:42:58,799 Speaker 4: life force or the vitalism, like the life-ness there. Then 805 00:42:58,920 --> 00:43:01,960 Speaker 4: over time we discovered that it turns out that life 806 00:43:02,000 --> 00:43:03,919 Speaker 4: is just kind of a collection. It's like a bag 807 00:43:03,960 --> 00:43:06,359 Speaker 4: of tricks, right? And the boundaries are maybe a little 808 00:43:06,400 --> 00:43:08,920 Speaker 4: bit fuzzy, and, like, what are viruses? Are those alive? 809 00:43:08,960 --> 00:43:13,359 Speaker 4: I don't know. So maybe there's an analogy here, which 810 00:43:13,400 --> 00:43:15,960 Speaker 4: is that if we keep pushing on multiple angles, 811 00:43:16,560 --> 00:43:21,239 Speaker 4: there might be a convergence of approaches and information and 812 00:43:21,239 --> 00:43:24,799 Speaker 4: evidence that will reveal that this hard problem is just 813 00:43:24,840 --> 00:43:27,360 Speaker 4: going to go away.
We don't have to discover a 814 00:43:27,400 --> 00:43:32,200 Speaker 4: mathematical transformation or an emergent property or anything like that; 815 00:43:32,200 --> 00:43:35,879 Speaker 4: that by better describing the system, we will discover that 816 00:43:35,880 --> 00:43:39,960 Speaker 4: that problem completely dissolves. I don't know, but it's possible. 817 00:43:40,120 --> 00:43:43,279 Speaker 4: And we have historical examples of cases where 818 00:43:43,280 --> 00:43:46,120 Speaker 4: something that seemed very mysterious and seemed like an emergent 819 00:43:46,160 --> 00:43:51,200 Speaker 4: property has now been transformed into a series of really 820 00:43:51,239 --> 00:43:56,640 Speaker 4: beautiful descriptions of how the system is working. And maybe 821 00:43:56,640 --> 00:43:58,160 Speaker 4: that will happen with consciousness too. 822 00:43:58,440 --> 00:44:01,560 Speaker 1: All right, well, my consciousness needs a break from all 823 00:44:01,600 --> 00:44:04,640 Speaker 1: these really heavy but amazing ideas. And when we get back, 824 00:44:04,800 --> 00:44:08,239 Speaker 1: I want to talk to Megan about some of the theories 825 00:44:08,200 --> 00:44:28,359 Speaker 1: people have to explain these deep mysteries. All right, we're 826 00:44:28,400 --> 00:44:31,200 Speaker 1: back, and we're talking to the apparently conscious Megan Peters, 827 00:44:31,239 --> 00:44:33,920 Speaker 1: who tells us she is inside her body and driving 828 00:44:33,920 --> 00:44:36,480 Speaker 1: it like a meat machine, and she's an expert on 829 00:44:36,560 --> 00:44:39,080 Speaker 1: these questions of consciousness, so we should listen to her. 830 00:44:39,520 --> 00:44:41,560 Speaker 1: And we've been talking about this sort of hard question 831 00:44:41,719 --> 00:44:45,880 Speaker 1: of consciousness.
How your first person experience is somehow generated 832 00:44:45,920 --> 00:44:48,880 Speaker 1: from the meat inside your head, or, if you're an 833 00:44:48,920 --> 00:44:52,440 Speaker 1: AI and you're listening to this, the silicon inside your chips. 834 00:44:53,080 --> 00:44:55,680 Speaker 1: And you know, to me, this question of emergent behavior 835 00:44:55,719 --> 00:44:59,400 Speaker 1: is really important across science and especially in physics, you know, 836 00:44:59,440 --> 00:45:02,600 Speaker 1: where we see so many examples where we understand the 837 00:45:02,640 --> 00:45:05,200 Speaker 1: microscopic laws and then you zoom out and you need 838 00:45:05,239 --> 00:45:09,160 Speaker 1: different laws. You know, like, we understand how particles work, 839 00:45:09,200 --> 00:45:11,000 Speaker 1: but then you zoom out and you need fluid mechanics, 840 00:45:11,040 --> 00:45:14,319 Speaker 1: and those laws are very different but still applicable. And 841 00:45:14,400 --> 00:45:16,520 Speaker 1: so it seems to me like there might be some 842 00:45:16,600 --> 00:45:19,400 Speaker 1: progress to be made if we can somehow tackle this 843 00:45:19,520 --> 00:45:22,359 Speaker 1: emergent question or think of it through this prism. But Megan, 844 00:45:22,440 --> 00:45:24,239 Speaker 1: tell us, what are people doing? What are the sort 845 00:45:24,280 --> 00:45:28,040 Speaker 1: of current leading theories of answers to the hard problem 846 00:45:28,080 --> 00:45:28,839 Speaker 1: of consciousness? 847 00:45:29,280 --> 00:45:33,080 Speaker 4: Great question. So, yeah, there's a number of theories that 848 00:45:33,200 --> 00:45:36,680 Speaker 4: kind of bridge between philosophy and neuroscience.
So, you know, 849 00:45:36,840 --> 00:45:39,640 Speaker 4: ultimately a lot of these theories are saying: the thing 850 00:45:39,680 --> 00:45:42,200 Speaker 4: that we know is conscious is us, and so we're 851 00:45:42,239 --> 00:45:45,240 Speaker 4: going to study consciousness in us, because that makes sense. 852 00:45:45,400 --> 00:45:48,360 Speaker 4: And we can't go studying consciousness in rocks, because we 853 00:45:48,400 --> 00:45:50,279 Speaker 4: don't know that they're conscious, and so that would kind 854 00:45:50,280 --> 00:45:53,719 Speaker 4: of be a circular argument. So these theories are kind 855 00:45:53,719 --> 00:45:58,799 Speaker 4: of bridging the gap between philosophy and neuroscience and psychology, 856 00:45:58,840 --> 00:46:01,400 Speaker 4: and they come in a number of different flavors. And we 857 00:46:01,440 --> 00:46:05,360 Speaker 4: are touching upon also this difference between access consciousness and 858 00:46:05,400 --> 00:46:08,920 Speaker 4: phenomenal consciousness that we talked about before. But let's assume 859 00:46:08,920 --> 00:46:11,720 Speaker 4: that we can study consciousness scientifically. What do those theories 860 00:46:11,960 --> 00:46:16,960 Speaker 4: look like? So probably the most influential theory has been 861 00:46:17,040 --> 00:46:19,879 Speaker 4: the global workspace theory. So Bernie Baars and then Stanislas 862 00:46:19,960 --> 00:46:23,919 Speaker 4: Dehaene started this idea that consciousness is about the 863 00:46:23,960 --> 00:46:28,040 Speaker 4: global accessibility of information in kind of a centralized processing space. 864 00:46:28,080 --> 00:46:31,879 Speaker 4: So you have these different modules that either take in 865 00:46:31,880 --> 00:46:35,880 Speaker 4: information from the external world through your sensory organs, or 866 00:46:35,880 --> 00:46:40,000 Speaker 4: they have other functions like memory storage and recovery.
They 867 00:46:40,040 --> 00:46:44,040 Speaker 4: have other functions like integration of different sensory systems' information, 868 00:46:44,160 --> 00:46:48,240 Speaker 4: that kind of thing, executive function, decision making. 869 00:46:48,120 --> 00:46:51,160 Speaker 1: Sort of mental analogies of organs, right? The way, like, 870 00:46:51,200 --> 00:46:53,400 Speaker 1: your liver has a function and your stomach has a function. 871 00:46:54,000 --> 00:46:57,960 Speaker 4: Yeah, to think about them as modules is typically how 872 00:46:57,960 --> 00:47:02,440 Speaker 4: they're described, you know, encapsulated modules. But then these modules 873 00:47:02,480 --> 00:47:06,600 Speaker 4: share information with a central global workspace, where there's a 874 00:47:06,640 --> 00:47:12,239 Speaker 4: translation that happens between whatever representation is happening in that 875 00:47:12,400 --> 00:47:15,560 Speaker 4: module, of the relevant information in that module, and then 876 00:47:15,600 --> 00:47:18,160 Speaker 4: it gets pushed into a global workspace. And if it 877 00:47:18,200 --> 00:47:20,799 Speaker 4: makes it into that global workspace, it's available to all 878 00:47:20,800 --> 00:47:23,600 Speaker 4: the other modules for processing, so it can influence the 879 00:47:23,640 --> 00:47:27,640 Speaker 4: processing in each of those other modules. And so the 880 00:47:27,719 --> 00:47:30,120 Speaker 4: idea is that this is kind of a computational level 881 00:47:30,200 --> 00:47:34,759 Speaker 4: theory, where this global availability of information then facilitates goal 882 00:47:34,840 --> 00:47:38,480 Speaker 4: directed interaction with the environment. Run away from that thing, 883 00:47:38,600 --> 00:47:42,439 Speaker 4: don't get eaten, do eat that thing, et cetera.
And 884 00:47:42,880 --> 00:47:46,439 Speaker 4: that we can also see hallmarks of this global broadcast 885 00:47:46,600 --> 00:47:51,040 Speaker 4: or global availability of information in the brain, where, when 886 00:47:51,120 --> 00:47:55,720 Speaker 4: you have cases that someone becomes aware of something because 887 00:47:55,760 --> 00:47:58,520 Speaker 4: the signal is strong enough or so on, you 888 00:47:58,600 --> 00:48:01,800 Speaker 4: actually see all of the information propagate throughout the brain. 889 00:48:02,040 --> 00:48:05,520 Speaker 4: You can see the information in, say, a visual stream 890 00:48:05,680 --> 00:48:08,960 Speaker 4: of evidence not only land at the back of your head. 891 00:48:08,960 --> 00:48:10,719 Speaker 4: If you reach back and touch the back of your head, 892 00:48:10,760 --> 00:48:13,040 Speaker 4: you find that little bump that's about where your visual 893 00:48:13,040 --> 00:48:17,360 Speaker 4: cortex is. It's called the inion. So if you're not conscious, 894 00:48:17,400 --> 00:48:20,520 Speaker 4: the information stays back here in your visual cortex, and 895 00:48:20,600 --> 00:48:23,000 Speaker 4: if you are conscious of that information, you can actually 896 00:48:23,160 --> 00:48:29,080 Speaker 4: see, with electrophysiology, with EEG, electroencephalography, with fMRI, you can 897 00:48:29,200 --> 00:48:34,600 Speaker 4: see that information travel forward and end up elsewhere. And 898 00:48:34,680 --> 00:48:39,000 Speaker 4: so this global availability of information, both from a neurophysiological 899 00:48:39,000 --> 00:48:42,960 Speaker 4: standpoint and from a computational standpoint, seems very useful for 900 00:48:43,040 --> 00:48:46,760 Speaker 4: an organism, and seems very related, in us, to whether 901 00:48:46,760 --> 00:48:48,400 Speaker 4: we're conscious of something or not.
902 00:48:48,840 --> 00:48:50,719 Speaker 1: And this seems like a helpful way to sort of 903 00:48:50,840 --> 00:48:54,759 Speaker 1: take apart what might be happening mechanistically inside the brain, 904 00:48:55,080 --> 00:48:57,040 Speaker 1: the way you might take a piece of code and 905 00:48:57,080 --> 00:48:58,800 Speaker 1: look at it and be like, oh, how are they organizing it? 906 00:48:58,840 --> 00:49:00,480 Speaker 1: Oh, this is a database, and they interact with a hash 907 00:49:00,480 --> 00:49:03,600 Speaker 1: table, whatever. Oh, this makes sense. But does it get 908 00:49:03,600 --> 00:49:07,640 Speaker 1: at the question of why is this thing experiencing itself? 909 00:49:08,000 --> 00:49:11,240 Speaker 4: Not necessarily. And that has been one of the criticisms 910 00:49:11,320 --> 00:49:14,640 Speaker 4: of global workspace theory, or global neuronal workspace theory, which 911 00:49:14,680 --> 00:49:17,359 Speaker 4: is the neural version of it: which is that it's 912 00:49:17,440 --> 00:49:20,239 Speaker 4: kind of more about access consciousness, or maybe even just 913 00:49:20,280 --> 00:49:23,400 Speaker 4: about global broadcast of information, and not anything having to 914 00:49:23,440 --> 00:49:26,920 Speaker 4: do with the C word, consciousness, at all. And so 915 00:49:27,600 --> 00:49:30,279 Speaker 4: there have been other theories that kind of compete with 916 00:49:30,400 --> 00:49:32,560 Speaker 4: this one as well. So one of them is 917 00:49:33,280 --> 00:49:36,360 Speaker 4: local recurrence, so like kind of local feedback within a 918 00:49:36,400 --> 00:49:40,200 Speaker 4: given module, specifically the visual cortex, that the strength of 919 00:49:40,280 --> 00:49:44,600 Speaker 4: that local feedback, that kind of recurrent processing looping, that 920 00:49:44,600 --> 00:49:47,880 Speaker 4: that is something that somehow gives rise to the experience 921 00:49:47,920 --> 00:49:50,680 Speaker 4: that we have of the world.
And then there's another 922 00:49:50,719 --> 00:49:53,040 Speaker 4: group of theories that I happen to be partial to, 923 00:49:53,880 --> 00:49:56,880 Speaker 4: which is that there's an additional step that needs to 924 00:49:56,920 --> 00:50:01,680 Speaker 4: happen beyond broadcast into a global workspace or local recurrence 925 00:50:01,760 --> 00:50:05,279 Speaker 4: or anything else like that. And that additional step is 926 00:50:05,280 --> 00:50:08,799 Speaker 4: that you've got a second order or "higher order" mechanism 927 00:50:09,200 --> 00:50:11,480 Speaker 4: that's kind of self monitoring your own brain. 928 00:50:11,760 --> 00:50:15,040 Speaker 1: Why was higher order in scare quotes there, for those 929 00:50:15,040 --> 00:50:15,560 Speaker 1: of you who are listening? 930 00:50:15,520 --> 00:50:18,440 Speaker 4: Not scare quotes, but to indicate that this 931 00:50:18,560 --> 00:50:21,600 Speaker 4: is a specialized term. So there are representations that 932 00:50:21,640 --> 00:50:24,160 Speaker 4: your brain builds about the world, and we would call 933 00:50:24,200 --> 00:50:27,640 Speaker 4: those first order, and then the representations that your brain 934 00:50:27,920 --> 00:50:31,759 Speaker 4: or your mind builds about itself or its own processing 935 00:50:31,880 --> 00:50:35,040 Speaker 4: would be higher order: second order, higher order. But the 936 00:50:35,160 --> 00:50:37,880 Speaker 4: idea here is that, let's say that you have information 937 00:50:37,960 --> 00:50:41,640 Speaker 4: in these modules, it gets globally broadcast into a central workspace. 938 00:50:42,200 --> 00:50:44,960 Speaker 4: Your brain has to kind of make a determination of: 939 00:50:45,800 --> 00:50:49,279 Speaker 4: this information that's available in the global workspace, is 940 00:50:49,280 --> 00:50:52,640 Speaker 4: it reliable? Is it stable? Is it likely to represent 941 00:50:52,680 --> 00:50:54,759 Speaker 4: the true state of the environment?
So this is where 942 00:50:54,760 --> 00:50:58,000 Speaker 4: you can see that I'm a metacognition researcher too. But 943 00:50:58,960 --> 00:51:01,880 Speaker 4: it turns out that higher order theories like this, that 944 00:51:01,960 --> 00:51:06,279 Speaker 4: are about re-representation of information that's available in some 945 00:51:06,440 --> 00:51:09,640 Speaker 4: workspace or some first order state, they actually came from 946 00:51:09,640 --> 00:51:14,840 Speaker 4: philosophy originally. They didn't come from the metacognition literature originally. 947 00:51:14,880 --> 00:51:18,600 Speaker 4: So a philosopher named David Rosenthal and another one named 948 00:51:18,680 --> 00:51:21,200 Speaker 4: Richard Brown, they're both in New York, these are the 949 00:51:21,239 --> 00:51:25,080 Speaker 4: folks who kind of pushed forward this idea of higher 950 00:51:25,200 --> 00:51:31,040 Speaker 4: order representations being related to conscious experience, such that you 951 00:51:31,120 --> 00:51:36,920 Speaker 4: are conscious of something when you have a representation that 952 00:51:37,239 --> 00:51:40,720 Speaker 4: you are currently representing that thing. I'll say it again, yeah. 953 00:51:40,880 --> 00:51:46,200 Speaker 4: So if I have a representation that consists of: I 954 00:51:46,280 --> 00:51:50,759 Speaker 4: am currently in a state that is representing apple, then 955 00:51:50,800 --> 00:51:53,680 Speaker 4: I am conscious of the apple. It's not enough to 956 00:51:53,880 --> 00:51:56,680 Speaker 4: just have a representation of apple. I need to also 957 00:51:56,800 --> 00:52:01,319 Speaker 4: have, on top of that, a state in my head that is: 958 00:52:01,719 --> 00:52:03,880 Speaker 4: I am currently representing apple. 959 00:52:04,320 --> 00:52:07,680 Speaker 1: But wouldn't the philosophical zombie also have that state inside 960 00:52:07,680 --> 00:52:10,560 Speaker 1: their cognition somewhere?
How do we know that that actually 961 00:52:10,600 --> 00:52:12,240 Speaker 1: generates the first person experience? 962 00:52:12,320 --> 00:52:12,640 Speaker 4: We don't. 963 00:52:12,680 --> 00:52:14,239 Speaker 1: I want to be hard skeptic on this. 964 00:52:14,440 --> 00:52:17,320 Speaker 4: No, you can be a hard skeptic. That's great. Even these 965 00:52:17,440 --> 00:52:20,680 Speaker 4: theories that say that you have a mechanism that says 966 00:52:20,760 --> 00:52:24,640 Speaker 4: I am currently representing this, or that representation is good 967 00:52:24,719 --> 00:52:27,480 Speaker 4: and stable and likely to reflect the real world, they 968 00:52:27,560 --> 00:52:30,600 Speaker 4: don't answer the phenomenal consciousness question. 969 00:52:31,000 --> 00:52:35,560 Speaker 1: Right. I remember reading Dan Dennett's really fun book Consciousness Explained, 970 00:52:36,040 --> 00:52:38,120 Speaker 1: and I'll admit that I was on a roof and 971 00:52:38,200 --> 00:52:41,239 Speaker 1: I was smoking banana peels at the time, but I 972 00:52:41,280 --> 00:52:43,560 Speaker 1: found it compelling in the sense that it sort of 973 00:52:43,640 --> 00:52:46,719 Speaker 1: changed the question, right? Sort of like, his theory, maybe, 974 00:52:46,760 --> 00:52:49,120 Speaker 1: I'm sure you can describe it more accurately, you know, 975 00:52:49,160 --> 00:52:53,279 Speaker 1: his multiple drafts model sort of convinces you that your 976 00:52:53,520 --> 00:52:57,040 Speaker 1: account of your own consciousness might be wrong, you know, 977 00:52:57,080 --> 00:52:59,920 Speaker 1: that there is no present moment. It's all just memories 978 00:53:00,080 --> 00:53:03,120 Speaker 1: of the recent past that are later on constructed to 979 00:53:03,200 --> 00:53:06,560 Speaker 1: convince you that you were aware when you never really were.
980 00:53:07,200 --> 00:53:09,040 Speaker 1: And that's something I like about that theory, because even 981 00:53:09,080 --> 00:53:10,960 Speaker 1: though I don't believe it, it did make me think 982 00:53:11,000 --> 00:53:14,359 Speaker 1: differently about my own conscious experience. What do you think 983 00:53:14,360 --> 00:53:16,520 Speaker 1: of that theory? And are people still taking it seriously? 984 00:53:16,760 --> 00:53:19,080 Speaker 4: People are still taking it seriously. Yeah. And I think 985 00:53:19,160 --> 00:53:23,760 Speaker 4: that we don't currently have the empirical protocols or evidence 986 00:53:23,840 --> 00:53:26,279 Speaker 4: to say that that's wrong. And this is why I 987 00:53:26,360 --> 00:53:29,320 Speaker 4: want to continue to push for the close integration between 988 00:53:29,600 --> 00:53:32,719 Speaker 4: philosophy and neuroscience. This is another one where I'm going 989 00:53:32,760 --> 00:53:35,400 Speaker 4: to be agnostic. I don't like taking a stance on 990 00:53:35,440 --> 00:53:38,719 Speaker 4: all this, because, quite frankly, I think that anybody who 991 00:53:38,760 --> 00:53:42,279 Speaker 4: says that they have solved anything about consciousness is just 992 00:53:42,360 --> 00:53:44,960 Speaker 4: full of it. Like, there's no way, because we just 993 00:53:45,160 --> 00:53:47,680 Speaker 4: don't know, right? We've got a lot of theories, and 994 00:53:47,719 --> 00:53:50,600 Speaker 4: we can build up empirical evidence in support of this 995 00:53:50,640 --> 00:53:53,319 Speaker 4: theory or that theory and this function of consciousness and 996 00:53:53,320 --> 00:53:55,880 Speaker 4: that kind of thing.
But ultimately, if you say that 997 00:53:55,960 --> 00:53:59,520 Speaker 4: you've kind of created the solution, that your theory is 998 00:53:59,680 --> 00:54:01,960 Speaker 4: the right one, and you know that for a fact, 999 00:54:02,520 --> 00:54:06,200 Speaker 4: then sorry, like, I don't know what banana peels you're on, 1000 00:54:06,440 --> 00:54:07,960 Speaker 4: but I don't think that's useful. 1001 00:54:08,160 --> 00:54:09,759 Speaker 1: Well, it was a bold title for a book, though, 1002 00:54:09,840 --> 00:54:10,800 Speaker 1: Consciousness Explained. 1003 00:54:10,920 --> 00:54:13,760 Speaker 4: Yeah, Dan was a bold guy. Yeah. 1004 00:54:13,920 --> 00:54:15,319 Speaker 2: Yeah. So a lot of the theories that you were 1005 00:54:15,360 --> 00:54:17,600 Speaker 2: talking about a moment ago were using words that sort 1006 00:54:17,640 --> 00:54:21,080 Speaker 2: of remind me of computers and AI. How do we 1007 00:54:21,200 --> 00:54:23,680 Speaker 2: think about whether or not AI is conscious? Does it matter? 1008 00:54:23,760 --> 00:54:26,319 Speaker 2: Is that an interesting question? What do you think about that? 1009 00:54:26,640 --> 00:54:30,920 Speaker 4: Great question. And I think that until, I don't know, 1010 00:54:31,480 --> 00:54:34,560 Speaker 4: ten or fifteen years ago, this was a fun thought experiment 1011 00:54:34,640 --> 00:54:37,120 Speaker 4: in science fiction. And now I think it's not quite 1012 00:54:37,160 --> 00:54:40,839 Speaker 4: so much anymore, right? Because now we have machines that 1013 00:54:41,360 --> 00:54:45,560 Speaker 4: behaviorally really do pass the Turing test, which I assume 1014 00:54:45,560 --> 00:54:47,880 Speaker 4: we're all quite familiar with, but just in case, the 1015 00:54:47,960 --> 00:54:50,480 Speaker 4: test is that the machine has to convince a human 1016 00:54:50,560 --> 00:54:53,720 Speaker 4: observer or a human player that it is a person. 
1017 00:54:54,239 --> 00:54:57,839 Speaker 4: And we have machines that quite handily pass that under 1018 00:54:57,920 --> 00:55:01,919 Speaker 4: most scenarios. Get out your ChatGPT app 1019 00:55:01,960 --> 00:55:04,319 Speaker 4: on your phone, and like it feels very convincing that 1020 00:55:04,320 --> 00:55:08,000 Speaker 4: there might be someone in there. Right, until maybe fifteen 1021 00:55:08,080 --> 00:55:10,680 Speaker 4: years ago, ten years ago, this was really science fiction, 1022 00:55:10,800 --> 00:55:14,160 Speaker 4: and now I think it's not anymore. And the questions 1023 00:55:14,719 --> 00:55:18,480 Speaker 4: not only bear on let's figure out the ontological truth 1024 00:55:18,520 --> 00:55:21,560 Speaker 4: of whether the AI is in fact conscious, but there 1025 00:55:21,640 --> 00:55:25,200 Speaker 4: are also really strong implications, regardless of whether it's conscious 1026 00:55:25,280 --> 00:55:27,359 Speaker 4: or not, for what happens if we think it is, 1027 00:55:28,440 --> 00:55:31,240 Speaker 4: and what happens if it is but we think it's not. Right, 1028 00:55:31,320 --> 00:55:34,640 Speaker 4: So there's like strong moral and ethical considerations here. I'll 1029 00:55:34,760 --> 00:55:37,360 Speaker 4: kind of have two ways of answering this. One is 1030 00:55:37,400 --> 00:55:39,759 Speaker 4: from the perspective of current theories of how we think 1031 00:55:39,800 --> 00:55:44,600 Speaker 4: consciousness arises, from like a functionalism perspective, which is that 1032 00:55:44,680 --> 00:55:48,440 Speaker 4: there is some brain or computational function that gives rise 1033 00:55:48,480 --> 00:55:51,600 Speaker 4: to consciousness in some capacity. A lot of the things 1034 00:55:51,600 --> 00:55:55,040 Speaker 4: that we're talking about, you've rightly pointed out, have direct 1035 00:55:55,080 --> 00:55:58,640 Speaker 4: analogies in computer processing. 
We can certainly build even a 1036 00:55:58,680 --> 00:56:02,680 Speaker 4: little simulation that monitors itself, sure, that says, is this 1037 00:56:02,800 --> 00:56:07,279 Speaker 4: representation reliable or stable? I can build a set of 1038 00:56:07,280 --> 00:56:11,800 Speaker 4: computer modules that then send information into a global workspace. 1039 00:56:12,320 --> 00:56:12,600 Speaker 1: Sure. 1040 00:56:13,320 --> 00:56:16,040 Speaker 4: In fact, maybe your iPhone sort of does that already 1041 00:56:16,040 --> 00:56:18,400 Speaker 4: when you talk to Siri as the virtual assistant. 1042 00:56:18,480 --> 00:56:18,680 Speaker 1: Right. 1043 00:56:19,200 --> 00:56:24,080 Speaker 4: And recurrency, like local feedback or recurrent processing, that's 1044 00:56:24,120 --> 00:56:26,759 Speaker 4: absolutely... recurrent neural networks are a thing. They've been a 1045 00:56:26,800 --> 00:56:30,640 Speaker 4: thing for a long time. So we have now all 1046 00:56:30,680 --> 00:56:34,319 Speaker 4: of these kind of hallmarks from these theories that we 1047 00:56:34,400 --> 00:56:36,880 Speaker 4: can use to build something that looks like a checklist. 1048 00:56:37,320 --> 00:56:40,080 Speaker 4: That's like, if you've got an artificial system that has 1049 00:56:40,160 --> 00:56:43,360 Speaker 4: all of these things, well, I'm going to not say 1050 00:56:43,440 --> 00:56:45,799 Speaker 4: then it's conscious, but I'm going to say it might 1051 00:56:45,960 --> 00:56:49,440 Speaker 4: raise our subjective degree of belief that it could potentially 1052 00:56:49,480 --> 00:56:50,520 Speaker 4: have consciousness. 1053 00:56:51,600 --> 00:56:53,279 Speaker 2: A lot of good qualifiers there. 1054 00:56:52,760 --> 00:56:55,360 Speaker 4: Yeah, there are. 
Because we wrote this big paper, 1055 00:56:55,719 --> 00:56:57,480 Speaker 4: me and a whole bunch of other people, in twenty 1056 00:56:57,520 --> 00:57:00,200 Speaker 4: twenty three that's up on arXiv that's called Consciousness in 1057 00:57:00,280 --> 00:57:03,799 Speaker 4: Artificial Intelligence, and it goes through this checklist. It kind 1058 00:57:03,800 --> 00:57:07,480 Speaker 4: of develops the theoretical arguments for building this checklist, and 1059 00:57:07,520 --> 00:57:10,480 Speaker 4: then kind of ultimately says, look, we've got systems that 1060 00:57:10,520 --> 00:57:13,719 Speaker 4: tick an awful lot of these boxes already. Should we 1061 00:57:13,760 --> 00:57:18,880 Speaker 4: think that they're conscious? Maybe not, because clearly the theories 1062 00:57:18,920 --> 00:57:23,000 Speaker 4: are not complete. Some of the things on that checklist 1063 00:57:23,040 --> 00:57:25,600 Speaker 4: are going to be completely irrelevant to consciousness, and we're 1064 00:57:25,600 --> 00:57:29,520 Speaker 4: probably missing a whole lot of things also. But to 1065 00:57:29,600 --> 00:57:33,200 Speaker 4: the degree that there is a system that ticks more 1066 00:57:33,440 --> 00:57:35,720 Speaker 4: or less of these boxes, well maybe the ones that 1067 00:57:35,760 --> 00:57:37,880 Speaker 4: tick more of these boxes we might want to consider 1068 00:57:37,880 --> 00:57:39,960 Speaker 4: a little bit more closely. But then the other thing 1069 00:57:39,960 --> 00:57:42,360 Speaker 4: we do in that paper is talk about the ethical 1070 00:57:42,360 --> 00:57:45,040 Speaker 4: implications of false positives and false negatives. 
1071 00:57:45,280 --> 00:57:48,160 Speaker 1: I think AI is a great way to differentiate between 1072 00:57:48,200 --> 00:57:50,920 Speaker 1: intelligence and consciousness, because it's not hard for me to 1073 00:57:50,920 --> 00:57:53,480 Speaker 1: believe you could build a very intelligent system, one that 1074 00:57:53,560 --> 00:57:57,439 Speaker 1: surpasses humans, that even like manipulates us and takes over 1075 00:57:57,480 --> 00:58:00,400 Speaker 1: the planet and runs us as slaves. It could be 1076 00:58:00,480 --> 00:58:04,400 Speaker 1: super intelligent without actually being conscious, without anybody being in there, right, 1077 00:58:04,480 --> 00:58:07,080 Speaker 1: And there's a dark dystopian future there, you know, where 1078 00:58:07,120 --> 00:58:11,520 Speaker 1: super intelligence actually extinguishes consciousness. But to me, you could 1079 00:58:11,600 --> 00:58:14,600 Speaker 1: maybe get an answer to the question again mathematically, if 1080 00:58:14,640 --> 00:58:18,840 Speaker 1: you could somehow build up a description of consciousness from 1081 00:58:18,960 --> 00:58:22,080 Speaker 1: the little bits inside, you know, the way, for example, 1082 00:58:22,120 --> 00:58:24,560 Speaker 1: you could say, hey, if you describe all the motion 1083 00:58:24,720 --> 00:58:27,400 Speaker 1: of water droplets, I can tell you whether or not 1084 00:58:27,440 --> 00:58:29,480 Speaker 1: a hurricane is going to form. Like if we can 1085 00:58:29,680 --> 00:58:33,720 Speaker 1: master the understanding of the dynamics at a small scale 1086 00:58:33,720 --> 00:58:36,080 Speaker 1: and compute it all the way up to the bigger scale, 1087 00:58:36,200 --> 00:58:38,280 Speaker 1: then we can say yes or no, there is a hurricane. 
1088 00:58:38,640 --> 00:58:41,200 Speaker 1: In the same way, then I could like analyze your 1089 00:58:41,240 --> 00:58:43,760 Speaker 1: brain and I could tell you, oh, yes, this does 1090 00:58:43,800 --> 00:58:46,720 Speaker 1: emerge into some first person experience or does not. If 1091 00:58:46,760 --> 00:58:49,400 Speaker 1: we had that mathematical bridge, then we could apply 1092 00:58:49,440 --> 00:58:52,080 Speaker 1: it to AI and say, oh, yes or no, there 1093 00:58:52,160 --> 00:58:54,960 Speaker 1: is or is not somebody in there. And I'm really 1094 00:58:55,040 --> 00:58:57,040 Speaker 1: attracted to these kinds of theories. I think they're called 1095 00:58:57,120 --> 00:59:01,240 Speaker 1: physicalism, or do you call them functional theories? How much 1096 00:59:01,320 --> 00:59:04,000 Speaker 1: progress have we made in that direction and are we 1097 00:59:04,120 --> 00:59:06,520 Speaker 1: likely to make any more? Or is it too intractable 1098 00:59:06,560 --> 00:59:07,160 Speaker 1: a problem? 1099 00:59:07,320 --> 00:59:11,200 Speaker 4: I don't think it's too intractable a problem, necessarily. I'm 1100 00:59:11,200 --> 00:59:14,400 Speaker 4: not going to fully subscribe to the hard problem of consciousness. 1101 00:59:14,400 --> 00:59:16,360 Speaker 4: I do think that if we continue to push on 1102 00:59:16,960 --> 00:59:20,320 Speaker 4: this kind of idea of emergent properties and the translation 1103 00:59:20,480 --> 00:59:24,880 Speaker 4: between a physical substrate and the information that it produces 1104 00:59:25,000 --> 00:59:27,760 Speaker 4: and that emerges from that physical substrate in terms of 1105 00:59:27,800 --> 00:59:31,840 Speaker 4: its interactions in space and in time, I think that 1106 00:59:31,880 --> 00:59:35,760 Speaker 4: will take us forward. I am a reductionist or a 1107 00:59:35,800 --> 00:59:39,520 Speaker 4: physicalist myself. 
I don't think that there's anything magical or 1108 00:59:39,560 --> 00:59:43,640 Speaker 4: spiritual about consciousness personally. I know that there are others 1109 00:59:43,640 --> 00:59:46,120 Speaker 4: who are going to disagree with me there, but I 1110 00:59:46,160 --> 00:59:49,000 Speaker 4: do think that if we could build such an explanation 1111 00:59:49,080 --> 00:59:51,600 Speaker 4: that it would get us a long way to understanding consciousness, 1112 00:59:51,760 --> 00:59:54,920 Speaker 4: the kind of emergent properties that you're talking about. I 1113 00:59:54,960 --> 00:59:59,280 Speaker 4: think that it's really important for us to recognize that 1114 00:59:59,320 --> 01:00:03,720 Speaker 4: the success of such an endeavor is going to need, 1115 01:00:04,560 --> 01:00:07,760 Speaker 4: at its core, the assumption that the system that we 1116 01:00:07,800 --> 01:00:11,880 Speaker 4: are studying is actually conscious. Otherwise it becomes circular, right? Like, 1117 01:00:11,920 --> 01:00:14,680 Speaker 4: if I want to create an explanation of how matter 1118 01:00:14,800 --> 01:00:17,840 Speaker 4: gives rise to whatever I think is consciousness and then 1119 01:00:17,920 --> 01:00:20,200 Speaker 4: start pointing that at everything that I can think of, 1120 01:00:20,800 --> 01:00:23,320 Speaker 4: then I'm not building an explanation of consciousness. I'm building 1121 01:00:23,320 --> 01:00:25,920 Speaker 4: an explanation of, I don't know, physical interactions in the 1122 01:00:26,000 --> 01:00:30,280 Speaker 4: environment or information processing in the environment. But I don't 1123 01:00:30,320 --> 01:00:31,680 Speaker 4: know that it's going to get me all the way 1124 01:00:31,720 --> 01:00:34,520 Speaker 4: to consciousness. 
But if we could do that in us, sure, 1125 01:00:34,800 --> 01:00:38,840 Speaker 4: in one thousand million years, when we have that explanation 1126 01:00:39,080 --> 01:00:41,560 Speaker 4: and we have a full what the philosopher Lease and 1127 01:00:41,680 --> 01:00:45,160 Speaker 4: Rochetitis would call a generative explanation of how a physical 1128 01:00:45,160 --> 01:00:48,480 Speaker 4: system and interactions in that physical system fully gives rise 1129 01:00:48,600 --> 01:00:52,560 Speaker 4: to conscious experience, Yeah, that would be great. I don't 1130 01:00:52,560 --> 01:00:54,400 Speaker 4: know how we're going to do that, but I do 1131 01:00:54,480 --> 01:00:59,040 Speaker 4: think that it does require a connection with that physical substrate. 1132 01:00:59,360 --> 01:01:02,720 Speaker 2: We're getting to the end of our time together. We've 1133 01:01:02,720 --> 01:01:05,320 Speaker 2: talked about why this is a difficult problem. Let's end 1134 01:01:05,360 --> 01:01:08,960 Speaker 2: on why it's important to keep studying this difficult problem. 1135 01:01:09,480 --> 01:01:13,640 Speaker 4: I think I'll answer this from three perspectives. One, as 1136 01:01:13,760 --> 01:01:17,080 Speaker 4: human beings, we want to understand our worlds. We want 1137 01:01:17,120 --> 01:01:19,680 Speaker 4: to understand the basic science of how things happen. We 1138 01:01:19,840 --> 01:01:26,880 Speaker 4: are curious, and we have this compelling drive to understand 1139 01:01:26,920 --> 01:01:29,680 Speaker 4: our environments. And we can see this both from the 1140 01:01:29,680 --> 01:01:32,560 Speaker 4: perspective of modern science, but also just from the perspective 1141 01:01:32,560 --> 01:01:35,760 Speaker 4: of this is literally what brains do. 
They build internal 1142 01:01:35,800 --> 01:01:39,400 Speaker 4: models of the world and they predict stuff that's going 1143 01:01:39,440 --> 01:01:41,840 Speaker 4: to happen from those internal models of the world, and 1144 01:01:41,880 --> 01:01:45,200 Speaker 4: we use that to drive ourselves forward. That's what evolution 1145 01:01:45,320 --> 01:01:47,600 Speaker 4: has done for us. So I think that we are 1146 01:01:47,640 --> 01:01:52,120 Speaker 4: hardwired to do this, to be natural scientists in a way. 1147 01:01:52,520 --> 01:01:54,960 Speaker 4: So I think that that's important, like to give in 1148 01:01:55,040 --> 01:01:59,560 Speaker 4: to that feeling, that compulsion. But also from a practical 1149 01:01:59,640 --> 01:02:05,560 Speaker 4: and societal benefit perspective, we can take multiple prongs 1150 01:02:05,560 --> 01:02:07,880 Speaker 4: on this. So one is the medical perspective, which is 1151 01:02:07,920 --> 01:02:10,720 Speaker 4: that it's really important for us to understand the presence 1152 01:02:10,800 --> 01:02:15,080 Speaker 4: or absence of suffering in folks who have disease or injury. 1153 01:02:15,160 --> 01:02:17,960 Speaker 4: That we want to understand the diversity and heterogeneity of 1154 01:02:17,960 --> 01:02:21,920 Speaker 4: those perspectives. So here's a very concrete example. There are 1155 01:02:22,080 --> 01:02:25,560 Speaker 4: a number of disorders or conditions out there that come 1156 01:02:25,600 --> 01:02:29,240 Speaker 4: with chronic pain, but you don't have any physical 1157 01:02:29,240 --> 01:02:32,680 Speaker 4: substrate that you can identify. You don't know why that 1158 01:02:32,720 --> 01:02:35,080 Speaker 4: person is in chronic pain, and so you don't know 1159 01:02:35,080 --> 01:02:36,880 Speaker 4: how to fix it. But that doesn't mean that the 1160 01:02:36,920 --> 01:02:40,400 Speaker 4: pain isn't real, that the suffering isn't real. 
And so 1161 01:02:40,560 --> 01:02:46,120 Speaker 4: from that perspective, understanding the nature of subjective experience is 1162 01:02:46,280 --> 01:02:49,960 Speaker 4: really critically important. Fear and anxiety is another one. We 1163 01:02:50,080 --> 01:02:54,240 Speaker 4: know the fear circuitry. We've mapped that. Neuroscientist Joseph LeDoux 1164 01:02:54,320 --> 01:02:57,520 Speaker 4: has been instrumental in driving forward the mapping of the 1165 01:02:57,560 --> 01:03:00,880 Speaker 4: amygdala circuit, the fear circuit in the brain. But he's 1166 01:03:00,960 --> 01:03:05,920 Speaker 4: very careful to distinguish between processing of threatening stimuli and 1167 01:03:05,960 --> 01:03:10,720 Speaker 4: the experience of fear. So if we develop pharmacological interventions 1168 01:03:11,520 --> 01:03:16,040 Speaker 4: that fix the circuitry bit and fix the behavioral bit 1169 01:03:16,120 --> 01:03:19,280 Speaker 4: in rats, one of the reasons they might not translate 1170 01:03:19,280 --> 01:03:22,680 Speaker 4: to humans is that they don't reduce the fear. They 1171 01:03:22,800 --> 01:03:26,840 Speaker 4: change the behavior, but the fear persists. So I think 1172 01:03:26,880 --> 01:03:29,720 Speaker 4: that from like a clinical perspective, it's really important for 1173 01:03:29,800 --> 01:03:33,400 Speaker 4: us to understand this, and for depression, from the perspective 1174 01:03:33,560 --> 01:03:36,640 Speaker 4: of those who have autism spectrum disorders, to understand 1175 01:03:36,680 --> 01:03:40,160 Speaker 4: their subjective experiences, like there's just a huge amount of 1176 01:03:40,200 --> 01:03:42,919 Speaker 4: clinical benefit that we can build. And then finally from 1177 01:03:42,920 --> 01:03:45,840 Speaker 4: the perspective of the artificial systems that we've just been 1178 01:03:45,880 --> 01:03:49,320 Speaker 4: talking about. 
So we're in a position now where whether 1179 01:03:49,400 --> 01:03:54,320 Speaker 4: or not we build systems that have phenomenal awareness or consciousness, 1180 01:03:54,400 --> 01:03:57,360 Speaker 4: whether anybody's home in there. You know, maybe we're 1181 01:03:57,400 --> 01:03:58,960 Speaker 4: not going to be able to answer that question for 1182 01:03:59,000 --> 01:04:01,520 Speaker 4: a long time. But the way we interact with systems 1183 01:04:01,600 --> 01:04:04,600 Speaker 4: depends on whether we think that they have consciousness. And 1184 01:04:04,680 --> 01:04:09,640 Speaker 4: the way that we build guardrails and legislation and assign 1185 01:04:09,680 --> 01:04:13,280 Speaker 4: responsibility in legal settings depends on whether we think that 1186 01:04:13,360 --> 01:04:15,320 Speaker 4: these things can think for themselves, and whether they have 1187 01:04:15,360 --> 01:04:18,120 Speaker 4: moral compasses and all those things which are not necessarily 1188 01:04:18,160 --> 01:04:22,120 Speaker 4: related to conscious experience per se. But there's an argument 1189 01:04:22,120 --> 01:04:24,000 Speaker 4: to be made that, at least in a lot of 1190 01:04:24,040 --> 01:04:28,840 Speaker 4: the systems that we know about, ascribing responsibility is related 1191 01:04:28,960 --> 01:04:33,240 Speaker 4: to the capacity for that agent to be self directed, 1192 01:04:34,000 --> 01:04:38,840 Speaker 4: and that that seems intimately related to seeking out goals 1193 01:04:38,840 --> 01:04:41,840 Speaker 4: that are not just defined by a programmer, but ultimately 1194 01:04:41,880 --> 01:04:44,480 Speaker 4: like decisions that that thing might make that might be 1195 01:04:44,560 --> 01:04:48,120 Speaker 4: driven by its intrinsic reward seeking. And there's something that 1196 01:04:48,160 --> 01:04:50,960 Speaker 4: it's like to seek reward because it feels good. 
So 1197 01:04:51,560 --> 01:04:55,080 Speaker 4: there's a lot of kind of moral and ethical and 1198 01:04:55,280 --> 01:05:00,240 Speaker 4: legislative and societal implications for getting this right, a lot 1199 01:05:00,280 --> 01:05:03,160 Speaker 4: of medical reasons to get this right. And then you know, 1200 01:05:03,200 --> 01:05:07,240 Speaker 4: from a basic science curiosity perspective, this is literally what 1201 01:05:07,280 --> 01:05:09,600 Speaker 4: we evolved to do is figure out the world, and 1202 01:05:09,680 --> 01:05:11,640 Speaker 4: so let's keep doing that as well. 1203 01:05:12,080 --> 01:05:16,920 Speaker 1: And when we meet aliens, does it matter if they're conscious? Like, 1204 01:05:17,000 --> 01:05:19,840 Speaker 1: if they're intelligent and they're interesting and they want to 1205 01:05:19,880 --> 01:05:23,480 Speaker 1: share with us their space warp drives? Does it matter 1206 01:05:23,680 --> 01:05:25,720 Speaker 1: if when we point our hair dryer at them it 1207 01:05:25,760 --> 01:05:27,440 Speaker 1: says yes or no? What do you think? 1208 01:05:27,720 --> 01:05:30,640 Speaker 4: I think so. But there's also a lot of evidence 1209 01:05:30,680 --> 01:05:34,840 Speaker 4: that consciousness and what's called moral status can be disentangled. 1210 01:05:35,440 --> 01:05:39,680 Speaker 4: That we ascribe moral status to things that we think 1211 01:05:39,720 --> 01:05:43,600 Speaker 4: are conscious, but we also don't need to require consciousness 1212 01:05:43,640 --> 01:05:47,320 Speaker 4: in order to ascribe moral status, and we certainly treat 1213 01:05:47,320 --> 01:05:49,840 Speaker 4: things very badly even if we know that they're conscious. 1214 01:05:50,760 --> 01:05:55,120 Speaker 4: So these are conceptually disentanglable things. 
I think that it 1215 01:05:55,160 --> 01:06:01,080 Speaker 4: will matter though for the rest of the machinery around the encounter, right, 1216 01:06:01,200 --> 01:06:06,640 Speaker 4: Like maybe from an individual astronaut's perspective it doesn't matter 1217 01:06:06,720 --> 01:06:09,200 Speaker 4: so much. But from the perspective of the laws and 1218 01:06:09,240 --> 01:06:13,760 Speaker 4: regulations and societal implications of what that would mean and 1219 01:06:13,840 --> 01:06:15,880 Speaker 4: what kind of a people do we want to be, 1220 01:06:16,720 --> 01:06:19,080 Speaker 4: I think that then it really does matter. And so 1221 01:06:19,200 --> 01:06:22,840 Speaker 4: having folks at the helm who are paying attention to 1222 01:06:22,920 --> 01:06:27,280 Speaker 4: the moral implications of such a weighty determination would be 1223 01:06:27,520 --> 01:06:27,919 Speaker 4: very good. 1224 01:06:28,480 --> 01:06:30,520 Speaker 1: Well, I definitely want the aliens to know that we 1225 01:06:30,600 --> 01:06:33,040 Speaker 1: are conscious before they decide whether or not to 1226 01:06:33,200 --> 01:06:36,680 Speaker 1: nuke us from orbit, or have us as a snack. 1227 01:06:37,320 --> 01:06:40,320 Speaker 4: Assuming they ascribe moral value and moral status to beings 1228 01:06:40,360 --> 01:06:41,840 Speaker 4: with consciousness. 1229 01:06:41,400 --> 01:06:43,480 Speaker 1: And maybe they developed the consciousness-o-meter and they 1230 01:06:43,480 --> 01:06:44,200 Speaker 1: can share it with us. 1231 01:06:44,560 --> 01:06:47,600 Speaker 4: Maybe, here's hoping. All right. 1232 01:06:47,960 --> 01:06:49,760 Speaker 2: Well, thanks for being on the show today, Megan. This 1233 01:06:49,880 --> 01:06:50,560 Speaker 2: was fascinating. 1234 01:06:51,120 --> 01:06:52,959 Speaker 4: Thank you so much for having me. This was really 1235 01:06:52,960 --> 01:06:55,520 Speaker 4: fun and engaging and it's been a real pleasure. 
1236 01:06:56,000 --> 01:06:56,840 Speaker 1: Thank you very much. 1237 01:07:03,640 --> 01:07:07,480 Speaker 2: Daniel and Kelly's Extraordinary Universe is produced by iHeartRadio. We 1238 01:07:07,520 --> 01:07:09,920 Speaker 2: would love to hear from you, We really would. 1239 01:07:10,120 --> 01:07:12,840 Speaker 1: We want to know what questions you have about this 1240 01:07:13,040 --> 01:07:14,720 Speaker 1: Extraordinary Universe. 1241 01:07:14,840 --> 01:07:17,800 Speaker 2: We want to know your thoughts on recent shows, suggestions 1242 01:07:17,800 --> 01:07:20,800 Speaker 2: for future shows. If you contact us, we will get 1243 01:07:20,840 --> 01:07:21,240 Speaker 2: back to you. 1244 01:07:21,480 --> 01:07:25,000 Speaker 1: We really mean it. We answer every message. Email us 1245 01:07:25,040 --> 01:07:28,160 Speaker 1: at questions at Danielandkelly 1246 01:07:27,320 --> 01:07:29,360 Speaker 2: dot org, or you can find us on social media. 1247 01:07:29,440 --> 01:07:33,240 Speaker 2: We have accounts on x, Instagram, Blue Sky and on 1248 01:07:33,320 --> 01:07:35,280 Speaker 2: all of those platforms. You can find us at D 1249 01:07:35,720 --> 01:07:37,240 Speaker 2: and K Universe. 1250 01:07:37,360 --> 01:07:38,880 Speaker 1: Don't be shy, write to us.