1 00:00:05,280 --> 00:00:09,320 Speaker 1: What is consciousness? Do you perceive the color red the 2 00:00:09,320 --> 00:00:11,960 Speaker 1: same way that I do on the inside? What is 3 00:00:12,000 --> 00:00:16,599 Speaker 1: wrong with the standard textbook version of vision? Why do 4 00:00:16,640 --> 00:00:19,319 Speaker 1: you have so many feedback loops in the brain? And 5 00:00:19,360 --> 00:00:21,600 Speaker 1: what does any of this have to do with Ernest 6 00:00:21,640 --> 00:00:26,040 Speaker 1: Hemingway or Plato's cave or artificial neural networks that 7 00:00:26,079 --> 00:00:27,560 Speaker 2: see dogs everywhere? 8 00:00:31,000 --> 00:00:34,000 Speaker 1: Welcome to Inner Cosmos with me, David Eagleman. I'm a 9 00:00:34,040 --> 00:00:36,720 Speaker 1: neuroscientist and author at Stanford. 10 00:00:36,280 --> 00:00:38,360 Speaker 2: And in these episodes we sail 11 00:00:38,159 --> 00:00:41,839 Speaker 1: deeply into our three-pound universe to understand why and 12 00:00:41,960 --> 00:00:55,400 Speaker 1: how our lives look the way they do. Today's episode 13 00:00:55,760 --> 00:00:59,560 Speaker 1: is about the sense of being alive, and in a moment, 14 00:00:59,600 --> 00:01:02,680 Speaker 1: I'm going to bring in my colleague Anil Seth, who 15 00:01:02,680 --> 00:01:06,479 Speaker 1: wrote a great book called Being You, which is all 16 00:01:06,520 --> 00:01:10,040 Speaker 1: about this neuroscientific problem. Now, by way of setting the 17 00:01:10,040 --> 00:01:14,000 Speaker 1: table for this, you may say, what is the neuroscientific 18 00:01:14,080 --> 00:01:18,959 Speaker 1: problem of consciousness? Well, it's simply this. It feels like 19 00:01:19,160 --> 00:01:24,240 Speaker 1: something to be you, and that feeling flickers to life 20 00:01:24,240 --> 00:01:26,679 Speaker 1: when you wake up in the morning, and it's not 21 00:01:26,840 --> 00:01:30,520 Speaker 1: there when you're in a deep sleep or under anesthesia, 22 00:01:31,240 --> 00:01:34,760 Speaker 1: and we're not sure how that happens. Our brains are 23 00:01:34,760 --> 00:01:39,520 Speaker 1: made up of tens of billions of very sophisticated processing 24 00:01:39,600 --> 00:01:43,679 Speaker 1: units, neurons, that are all operating together in a giant network. 25 00:01:44,040 --> 00:01:46,520 Speaker 1: But just because something has a lot of pieces and 26 00:01:46,600 --> 00:01:50,680 Speaker 1: parts doesn't tell you anything about why it's conscious, why 27 00:01:50,720 --> 00:01:55,360 Speaker 1: there's any subjective experience. If you get the right tools 28 00:01:55,400 --> 00:01:58,560 Speaker 1: and take the cover off your iPhone, you'll find that 29 00:01:58,640 --> 00:02:03,800 Speaker 1: it has a chip which has nineteen billion transistors. Now 30 00:02:04,000 --> 00:02:07,400 Speaker 1: just think about the interactions and the almost speed-of- 31 00:02:07,560 --> 00:02:12,080 Speaker 1: light signaling in that rich, sweeping electronic landscape. 32 00:02:12,600 --> 00:02:15,919 Speaker 2: But we don't have any meaningful reason 33 00:02:16,000 --> 00:02:19,840 Speaker 1: to believe that your phone has consciousness or that it 34 00:02:19,880 --> 00:02:23,800 Speaker 1: would be like something to be a phone.
In other words, 35 00:02:24,040 --> 00:02:27,359 Speaker 1: when your phone plays a funny video on the screen, 36 00:02:28,040 --> 00:02:31,320 Speaker 1: do you think it feels amused, or is it more 37 00:02:31,440 --> 00:02:34,680 Speaker 1: likely that it's zeros and ones moving around in a 38 00:02:34,760 --> 00:02:38,880 Speaker 1: deterministic way through these billions of pathways? When it gets 39 00:02:38,960 --> 00:02:42,600 Speaker 1: an email from your boss, does it feel stressed? When 40 00:02:42,639 --> 00:02:46,840 Speaker 1: it registers the receipt of a text message, does your 41 00:02:46,919 --> 00:02:51,360 Speaker 1: phone have the capacity to feel sad? Probably not. But 42 00:02:51,480 --> 00:02:55,280 Speaker 1: how do we make a more rigorous assessment of the question? 43 00:02:55,480 --> 00:02:58,600 Speaker 1: How could we know what is conscious and what is 44 00:02:58,680 --> 00:03:01,960 Speaker 1: not until we have tighter constraints on what 45 00:03:02,160 --> 00:03:07,320 Speaker 1: consciousness is? This is the problem that neuroscience faces. How 46 00:03:07,440 --> 00:03:10,120 Speaker 1: is it that all our billions of cells hook up 47 00:03:10,120 --> 00:03:12,880 Speaker 1: in just such a way that we have consciousness? In 48 00:03:12,919 --> 00:03:16,079 Speaker 1: other words, it feels like something to be us. Now, 49 00:03:16,120 --> 00:03:19,400 Speaker 1: not so long ago in neuroscience, like thirty years ago, 50 00:03:20,000 --> 00:03:25,880 Speaker 1: this problem of subjective experience was essentially not talked about. 51 00:03:26,360 --> 00:03:29,720 Speaker 1: People generally felt it was too squishy, and they talked 52 00:03:29,720 --> 00:03:32,400 Speaker 1: about how the brain worked, but not about why we 53 00:03:32,520 --> 00:03:37,000 Speaker 1: have subjective experience. But things started to change around the 54 00:03:37,040 --> 00:03:41,760 Speaker 1: nineteen nineties when some great minds started devoting themselves to 55 00:03:41,920 --> 00:03:46,320 Speaker 1: taking this problem seriously. And two of those minds happened 56 00:03:46,400 --> 00:03:50,480 Speaker 1: to be Nobel laureates in San Diego. One was Francis 57 00:03:50,600 --> 00:03:53,839 Speaker 1: Crick and one was Gerald Edelman, and they worked in 58 00:03:54,160 --> 00:03:57,240 Speaker 1: neighboring institutions. And it happened that when I was a 59 00:03:57,280 --> 00:03:59,920 Speaker 1: young postdoc, I got to work with Crick, and 60 00:04:00,160 --> 00:04:03,800 Speaker 1: just across the way was another young postdoc working with Edelman. 61 00:04:04,040 --> 00:04:07,360 Speaker 1: His name was Anil Seth, and we both were reared 62 00:04:07,400 --> 00:04:10,600 Speaker 1: in this environment where it made sense to tackle the 63 00:04:10,680 --> 00:04:14,840 Speaker 1: problem of subjective experience. The mission was: how can we 64 00:04:14,920 --> 00:04:20,160 Speaker 1: work towards a scientific understanding of what consciousness is and 65 00:04:20,279 --> 00:04:22,160 Speaker 1: how brains give rise to it? 66 00:04:22,839 --> 00:04:24,159 Speaker 2: Anil Seth is now 67 00:04:24,000 --> 00:04:28,120 Speaker 1: a professor of cognitive and computational neuroscience at the University 68 00:04:28,120 --> 00:04:31,200 Speaker 1: of Sussex, where he also directs the Sussex Center for 69 00:04:31,279 --> 00:04:34,719 Speaker 1: Consciousness Science.
In twenty seventeen he gave a very popular 70 00:04:34,800 --> 00:04:39,279 Speaker 1: TED Talk called Your Brain Hallucinates Your Conscious Reality, and 71 00:04:39,320 --> 00:04:42,480 Speaker 1: in twenty twenty one he published a book called Being You: 72 00:04:42,760 --> 00:04:44,800 Speaker 2: A New Science of Consciousness. 73 00:04:45,040 --> 00:04:47,960 Speaker 1: So I called him up to share his views on perception, 74 00:04:48,400 --> 00:04:49,760 Speaker 1: consciousness and reality. 75 00:04:50,200 --> 00:04:57,440 Speaker 2: Here's my interview with Anil Seth. What is consciousness? 76 00:04:57,440 --> 00:04:59,520 Speaker 3: Well, of course you and I could discuss 77 00:04:59,760 --> 00:05:04,360 Speaker 3: just this for many, many hours, and philosophers have for centuries, 78 00:05:04,400 --> 00:05:07,200 Speaker 3: and so you've got to be pragmatic in how you define consciousness. 79 00:05:07,200 --> 00:05:09,280 Speaker 3: So I follow the philosopher Thomas Nagel, who I 80 00:05:09,320 --> 00:05:12,719 Speaker 3: think put it very beautifully and very simply, and his 81 00:05:12,880 --> 00:05:16,039 Speaker 3: idea was that for a conscious organism, there is something 82 00:05:16,240 --> 00:05:19,560 Speaker 3: it is like to be that organism. It feels like 83 00:05:19,640 --> 00:05:22,039 Speaker 3: something to be me, and it feels like something to 84 00:05:22,080 --> 00:05:25,440 Speaker 3: be you. It's a bit circular, right? There's experiencing happening. 85 00:05:26,800 --> 00:05:29,880 Speaker 3: But I think there's something useful about 86 00:05:29,920 --> 00:05:33,279 Speaker 3: that because it doesn't mix up consciousness with other things 87 00:05:33,360 --> 00:05:37,320 Speaker 3: like intelligence or language or a particular sense of identity. 88 00:05:37,360 --> 00:05:39,960 Speaker 3: It's just any kind of experience whatsoever. So at least 89 00:05:39,960 --> 00:05:41,760 Speaker 3: we know what we're talking about. You know, it's also 90 00:05:41,839 --> 00:05:45,280 Speaker 3: what goes away under something like general anesthesia and then 91 00:05:45,320 --> 00:05:48,440 Speaker 3: comes back. That, for me, is a nice starting point 92 00:05:48,480 --> 00:05:51,480 Speaker 3: for what consciousness is. So we know roughly what we're 93 00:05:51,520 --> 00:05:54,000 Speaker 3: talking about. And then the approach is, well, 94 00:05:54,000 --> 00:06:00,000 Speaker 3: how do we explain it in terms of neuroscience, biology, physics, whatever? 95 00:06:00,600 --> 00:06:02,760 Speaker 3: How does it happen? Why is it the way it is? 96 00:06:03,440 --> 00:06:08,520 Speaker 3: And here my approach has been again quite pragmatic. Instead 97 00:06:08,560 --> 00:06:11,160 Speaker 3: of trying to find some sort of magic Eureka solution 98 00:06:11,360 --> 00:06:16,040 Speaker 3: that magics conscious experience out of neurons or atoms or 99 00:06:16,160 --> 00:06:20,440 Speaker 3: quantum fields or whatever it might be, let's just take 100 00:06:20,480 --> 00:06:23,560 Speaker 3: a different approach and accept that consciousness exists, because there'll 101 00:06:23,600 --> 00:06:26,360 Speaker 3: be some philosophers that try and tell you it doesn't 102 00:06:26,400 --> 00:06:30,039 Speaker 3: even really exist, and accept that it has certain properties.
Different 103 00:06:30,080 --> 00:06:33,920 Speaker 3: experiences feel different ways, and an emotion feels different from a 104 00:06:34,040 --> 00:06:38,760 Speaker 3: visual experience. So let's try to understand how and why 105 00:06:39,120 --> 00:06:42,120 Speaker 3: these experiences are the way they are. And every experience 106 00:06:42,160 --> 00:06:44,719 Speaker 3: has various things in common too, like every experience is 107 00:06:44,880 --> 00:06:48,400 Speaker 3: unified, more than the sum of its many parts. 108 00:06:49,240 --> 00:06:53,000 Speaker 3: So my approach is to try to find ways of 109 00:06:53,160 --> 00:06:57,919 Speaker 3: bridging between descriptions of the brain in some way, collections 110 00:06:57,960 --> 00:07:01,680 Speaker 3: of neurons or areas or whatever, and descriptions of experience. 111 00:07:01,800 --> 00:07:05,120 Speaker 3: And this is where this idea of controlled hallucination comes in. 112 00:07:05,240 --> 00:07:07,479 Speaker 3: And I'm sure we'll dig more into it, but very 113 00:07:07,560 --> 00:07:12,040 Speaker 3: very simply, the idea is that instead of our experience 114 00:07:12,080 --> 00:07:15,520 Speaker 3: of the world and the body sort of pouring itself 115 00:07:15,560 --> 00:07:18,679 Speaker 3: into the brain through the transparent windows of the senses, 116 00:07:19,640 --> 00:07:22,440 Speaker 3: in fact perception works the other way around. It's not 117 00:07:22,520 --> 00:07:26,720 Speaker 3: a new idea, it's very old, that perception is 118 00:07:26,760 --> 00:07:30,560 Speaker 3: a process of inference. The brain is locked inside this 119 00:07:30,600 --> 00:07:33,720 Speaker 3: bony vault of a skull. It's dark and silent in there, 120 00:07:34,720 --> 00:07:37,280 Speaker 3: and so it has to make sense of sensory signals 121 00:07:37,280 --> 00:07:41,440 Speaker 3: which don't have colors or shapes or labels, and they're 122 00:07:41,560 --> 00:07:45,160 Speaker 3: uncertain, they're ambiguous and noisy with respect to whatever's 123 00:07:45,160 --> 00:07:47,440 Speaker 3: going on in the world and the body. So the 124 00:07:47,440 --> 00:07:50,840 Speaker 3: brain's always casting out predictions about the causes of its 125 00:07:50,880 --> 00:07:55,000 Speaker 3: sensory signals, and using sensory signals to calibrate, to update, 126 00:07:55,040 --> 00:07:59,520 Speaker 3: these predictions. And the claim here is that that's 127 00:07:59,560 --> 00:08:02,440 Speaker 3: what we experience. The brain doesn't read out the world 128 00:08:02,800 --> 00:08:07,040 Speaker 3: from the outside in. It's always actively constructing the world 129 00:08:07,400 --> 00:08:10,760 Speaker 3: from the top down or the inside out. This is 130 00:08:10,800 --> 00:08:14,360 Speaker 3: why I think the term controlled hallucination is useful, because 131 00:08:14,680 --> 00:08:17,360 Speaker 3: we tend to think of hallucinations as things that are 132 00:08:17,400 --> 00:08:20,760 Speaker 3: internally generated, and I think that that's true for all 133 00:08:20,800 --> 00:08:25,280 Speaker 3: our experiences. It's just that our normal perceptual experiences are 134 00:08:25,320 --> 00:08:29,760 Speaker 3: controlled by, calibrated by, yoked to, geared to the world 135 00:08:29,840 --> 00:08:33,320 Speaker 3: and the body in ways that are useful. So perception 136 00:08:33,480 --> 00:08:37,360 Speaker 3: is a controlled hallucination, and consciousness is a collection of perceptions.
137 00:08:37,480 --> 00:08:40,160 Speaker 1: So let's double click on a couple things there. So 138 00:08:41,040 --> 00:08:44,520 Speaker 1: first let's go back to the history. Let's say Kant 139 00:08:44,600 --> 00:08:47,960 Speaker 1: and Helmholtz and think about what were the first clues 140 00:08:48,040 --> 00:08:52,040 Speaker 1: that people got in thinking about this idea that we're 141 00:08:52,080 --> 00:08:55,480 Speaker 1: not seeing reality as it is out there, but instead 142 00:08:56,000 --> 00:08:57,600 Speaker 1: it's something of a construction. 143 00:08:58,000 --> 00:08:58,679 Speaker 4: I think you're right. 144 00:08:58,920 --> 00:09:01,560 Speaker 3: The history of this is super fascinating and it's, I 145 00:09:01,559 --> 00:09:03,640 Speaker 3: think, not given enough credit sometimes. And you can go 146 00:09:03,760 --> 00:09:06,800 Speaker 3: right back to Plato, I think, in his allegory of 147 00:09:06,840 --> 00:09:10,320 Speaker 3: the cave, where you have all these prisoners in a cave. 148 00:09:10,400 --> 00:09:13,000 Speaker 3: They're chained to the walls, and all they can see 149 00:09:13,280 --> 00:09:16,560 Speaker 3: are shadows cast on the wall of the cave by 150 00:09:16,679 --> 00:09:20,360 Speaker 3: the light of a fire, and they take the shadows 151 00:09:20,400 --> 00:09:23,080 Speaker 3: to be real because that's all they have access to, 152 00:09:23,720 --> 00:09:26,240 Speaker 3: and they don't really know that there's anything out there, 153 00:09:27,280 --> 00:09:30,280 Speaker 3: you know, that is actually responsible for the shadows. So 154 00:09:30,920 --> 00:09:34,080 Speaker 3: I think that's a great starting point because our brain 155 00:09:34,240 --> 00:09:36,040 Speaker 3: is in a bit of a similar situation to the 156 00:09:36,040 --> 00:09:38,280 Speaker 3: prisoners in the cave. You know, the brain 157 00:09:38,320 --> 00:09:41,760 Speaker 3: doesn't have any direct access to anything really, the body, 158 00:09:41,840 --> 00:09:45,280 Speaker 3: the world, whatever, so it has to make its best 159 00:09:45,320 --> 00:09:48,880 Speaker 3: guess of what's going on on the basis of things 160 00:09:49,000 --> 00:09:52,880 Speaker 3: like shadows. And then Kant, I think, for me is 161 00:09:52,880 --> 00:09:55,280 Speaker 3: always the reference point. I always keep coming back to 162 00:09:55,320 --> 00:09:58,800 Speaker 3: his idea of the noumenon. So there is a reality. 163 00:09:59,160 --> 00:10:03,679 Speaker 3: I'm sometimes misunderstood as denying that there's a real 164 00:10:03,760 --> 00:10:05,880 Speaker 3: world because I've used this word hallucination. 165 00:10:06,000 --> 00:10:08,360 Speaker 4: But no, not at all. There is a real world. 166 00:10:08,440 --> 00:10:11,760 Speaker 3: I think there is objective reality, though at least to answer 167 00:10:11,760 --> 00:10:14,640 Speaker 3: that question, you'd better ask a physicist rather than me. 168 00:10:15,800 --> 00:10:20,960 Speaker 3: But the world that we experience is never identical to 169 00:10:21,040 --> 00:10:26,600 Speaker 3: that world. It's always an interpretation. In Kant's way of thinking, 170 00:10:28,120 --> 00:10:32,319 Speaker 3: the noumenon, reality as it really is, is always hidden 171 00:10:32,360 --> 00:10:36,360 Speaker 3: behind a sensory veil or a kind of inferential curtain, 172 00:10:36,400 --> 00:10:40,200 Speaker 3: as I might describe it now.
And then, yeah, there 173 00:10:40,240 --> 00:10:42,720 Speaker 3: were many hints, I think initially in vision. I think 174 00:10:42,720 --> 00:10:45,600 Speaker 3: most of the early work on this was done in vision. 175 00:10:45,640 --> 00:10:50,320 Speaker 3: There's Ibn al-Haytham, the Arab scientist and polymath, who did a 176 00:10:50,320 --> 00:10:54,960 Speaker 3: lot of work hundreds of years ago basically arguing that 177 00:10:55,559 --> 00:10:58,959 Speaker 3: perception cannot be this direct readout of the world because 178 00:10:58,960 --> 00:11:01,520 Speaker 3: the relation is so indirect, and you know, things 179 00:11:02,880 --> 00:11:08,199 Speaker 3: seem to obey regularities that can't be explained purely in 180 00:11:08,280 --> 00:11:10,400 Speaker 3: terms of the sensory data. Like when you take a 181 00:11:10,400 --> 00:11:13,839 Speaker 3: piece of white paper from inside a room to outside a room, 182 00:11:13,880 --> 00:11:14,839 Speaker 3: it still seems white. 183 00:11:14,920 --> 00:11:16,480 Speaker 4: You know, how does that happen? 184 00:11:16,480 --> 00:11:19,000 Speaker 3: It's because the brain isn't just reading off the light 185 00:11:19,040 --> 00:11:21,880 Speaker 3: that comes into the eyes. It's trying to figure out 186 00:11:22,160 --> 00:11:23,679 Speaker 3: what's causing the light. 187 00:11:24,520 --> 00:11:25,400 Speaker 2: Just to double click on that. 188 00:11:25,679 --> 00:11:27,960 Speaker 1: So if you're under, let's say, fluorescent lights, and then 189 00:11:28,000 --> 00:11:31,960 Speaker 1: you walk under the sunlight, what's actually bouncing off the 190 00:11:31,960 --> 00:11:34,480 Speaker 1: paper and hitting your eyes is a different wavelength, and 191 00:11:34,600 --> 00:11:36,640 Speaker 1: yet we see it as white. Well, this is 192 00:11:36,920 --> 00:11:38,360 Speaker 1: color constancy. 193 00:11:38,640 --> 00:11:42,160 Speaker 3: The brain is always after utility. You know, we perceive 194 00:11:42,240 --> 00:11:46,120 Speaker 3: the world in a way that evolution has decided is 195 00:11:46,200 --> 00:11:48,560 Speaker 3: useful for us. The novelist Anaïs Nin put it 196 00:11:48,600 --> 00:11:50,760 Speaker 3: beautifully when she said, you know, we do not see 197 00:11:50,800 --> 00:11:53,800 Speaker 3: things as they are. We see them as we are. 198 00:11:54,720 --> 00:11:56,080 Speaker 3: That's what the brain is gunning for. 200 00:11:56,920 --> 00:12:00,480 Speaker 3: And then there's this trade-off between the brain's prior 201 00:12:01,559 --> 00:12:05,080 Speaker 3: expectations or beliefs about what's going on, and how much 202 00:12:05,120 --> 00:12:10,400 Speaker 3: the brain decides to update its beliefs, expectations, predictions with 203 00:12:10,559 --> 00:12:14,679 Speaker 3: new sensory data. Sometimes it can update quite quickly, pay 204 00:12:14,720 --> 00:12:17,600 Speaker 3: a lot of attention to this data. Sometimes it can 205 00:12:17,640 --> 00:12:21,559 Speaker 3: pay less attention. In fact, attention is exactly that process, 206 00:12:21,679 --> 00:12:25,200 Speaker 3: in my mind anyway. Attention is exactly the process of 207 00:12:25,440 --> 00:12:30,920 Speaker 3: balancing how much incoming sensory information is able to update 208 00:12:31,040 --> 00:12:35,880 Speaker 3: the brain's best guess, controlled hallucination, of what's actually happening.
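One way to picture the balancing act Seth describes is as a precision-weighted average: the prior belief and the sensory evidence each pull on the final estimate in proportion to how reliable they are. Here is a minimal sketch in Python under simple Gaussian assumptions; the white-paper numbers are made up for illustration, and treating attention as a boost to sensory precision is just the interpretation offered above, not any specific published model.

```python
def fuse(prior_mean, prior_var, sense_mean, sense_var):
    """Combine a prior belief with a sensory reading, each weighted
    by its precision (inverse variance)."""
    prior_precision = 1.0 / prior_var
    sense_precision = 1.0 / sense_var
    return (prior_precision * prior_mean + sense_precision * sense_mean) / (
        prior_precision + sense_precision
    )

# Prior: "the paper is white" (brightness 1.0). Sensor: a dimmer reading (0.7).
print(fuse(1.0, 0.1, 0.7, 0.4))   # little attention paid: stays near the prior (0.94)
print(fuse(1.0, 0.1, 0.7, 0.02))  # high attention, precise signal: data dominate (0.75)
```

Paying more attention corresponds to the second call: the same sensory reading, now treated as highly precise, drags the percept toward the data.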
209 00:12:36,320 --> 00:12:39,040 Speaker 1: Right. So people like Kant and Helmholtz and others all 210 00:12:39,040 --> 00:12:41,360 Speaker 1: the way back to Plato have pointed out, have thought about, hey, 211 00:12:41,400 --> 00:12:43,640 Speaker 1: maybe we're not seeing reality as it is, but we're 212 00:12:43,679 --> 00:12:45,080 Speaker 1: seeing something 213 00:12:45,280 --> 00:12:47,360 Speaker 2: that we are having 214 00:12:47,160 --> 00:12:51,880 Speaker 1: to construct on the inside, because the brain is isolated. So 215 00:12:51,920 --> 00:12:57,360 Speaker 1: then by the nineteen sixties, the neuroscientist Donald MacKay noticed something. 216 00:12:57,440 --> 00:12:59,120 Speaker 1: I don't know if other people had noticed this before, 217 00:12:59,160 --> 00:13:01,880 Speaker 1: but he noticed that, you know, the amount 218 00:13:01,960 --> 00:13:04,000 Speaker 1: of input to the visual cortex in the back of 219 00:13:04,040 --> 00:13:08,240 Speaker 1: the brain that's coming from the eyes is actually really small. 220 00:13:09,240 --> 00:13:11,640 Speaker 1: I think the estimates now are that five percent 221 00:13:11,720 --> 00:13:13,960 Speaker 1: of the input to the visual cortex comes from the 222 00:13:14,040 --> 00:13:17,880 Speaker 1: eyeballs, and all the rest is this sort of feedback. 223 00:13:18,559 --> 00:13:21,680 Speaker 1: So let's come back to this issue of how you think 224 00:13:21,720 --> 00:13:26,240 Speaker 1: about this, with predictions and what the brain is doing. 225 00:13:26,400 --> 00:13:28,880 Speaker 3: That's a great place to start, because that's such a paradox, 226 00:13:28,920 --> 00:13:30,679 Speaker 3: isn't it? I mean, when I was starting out, and 227 00:13:30,920 --> 00:13:33,400 Speaker 3: I think you and I started in neuroscience around the 228 00:13:33,440 --> 00:13:38,240 Speaker 3: same time, the textbooks I remember reading just described 229 00:13:38,240 --> 00:13:41,720 Speaker 3: perception as this bottom-up, outside-in process. You know, 230 00:13:41,800 --> 00:13:45,199 Speaker 3: signals would come into the retina, early parts of the 231 00:13:45,320 --> 00:13:48,080 Speaker 3: visual cortex right at the back of the brain would 232 00:13:48,160 --> 00:13:51,680 Speaker 3: fish out really simple features like lines and edges, and 233 00:13:51,720 --> 00:13:55,920 Speaker 3: then you got this picture of information marching deeper and 234 00:13:55,960 --> 00:13:59,080 Speaker 3: deeper into the brain, sort of more complex things being 235 00:13:59,120 --> 00:14:02,000 Speaker 3: fished out. And then somehow the brain put all these 236 00:14:02,040 --> 00:14:06,480 Speaker 3: pieces back together, and that led to this experience 237 00:14:05,960 --> 00:14:07,880 Speaker 4: of the world. 238 00:14:07,920 --> 00:14:10,959 Speaker 3: If you open a textbook now, you will probably 239 00:14:11,040 --> 00:14:13,720 Speaker 3: see something quite similar. I think this has been a very, 240 00:14:13,800 --> 00:14:18,559 Speaker 3: very pervasive idea despite this long and rich history from 241 00:14:18,720 --> 00:14:21,720 Speaker 3: Plato to Kant to Helmholtz and so on. 242 00:14:21,880 --> 00:14:25,240 Speaker 1: It's been sixty-something years since MacKay published his paper.
243 00:14:25,320 --> 00:14:29,440 Speaker 3: It's sixty years since MacKay's observation, which is hard to 244 00:14:29,560 --> 00:14:33,200 Speaker 3: understand in this classical view, right? If perception is done 245 00:14:33,240 --> 00:14:36,680 Speaker 3: in this bottom-up direction, why do you have so 246 00:14:36,920 --> 00:14:40,560 Speaker 3: many connections going the other way, from the brain 247 00:14:40,640 --> 00:14:41,720 Speaker 3: back out to the senses? 248 00:14:41,760 --> 00:14:42,800 Speaker 4: It doesn't make any sense. 249 00:14:43,480 --> 00:14:46,720 Speaker 3: But from the perspective of perception as this constructive process, 250 00:14:46,760 --> 00:14:50,520 Speaker 3: it makes much more sense. So digging into it a 251 00:14:50,560 --> 00:14:53,960 Speaker 3: bit more: so far, you know, we've kind of outlined 252 00:14:54,000 --> 00:14:58,360 Speaker 3: that the brain has to make this inference about what's 253 00:14:58,400 --> 00:15:01,680 Speaker 3: out there on the basis of noisy and ambiguous sensory 254 00:15:01,720 --> 00:15:06,920 Speaker 3: signals without labels. Now, in mathematics we call this Bayesian inference. 255 00:15:07,280 --> 00:15:11,400 Speaker 3: It's a process of reasoning under conditions of uncertainty, or, 256 00:15:11,960 --> 00:15:15,920 Speaker 3: in general, how you should update what you believe when 257 00:15:15,960 --> 00:15:18,200 Speaker 3: you get new data. And this is a very, very 258 00:15:18,400 --> 00:15:23,080 Speaker 3: general formulation. The Reverend Thomas Bayes and Laplace figured out 259 00:15:23,080 --> 00:15:26,120 Speaker 3: the maths hundreds of years ago, and it's been used 260 00:15:26,120 --> 00:15:28,120 Speaker 3: to do all kinds of things, like figure out where 261 00:15:28,160 --> 00:15:31,760 Speaker 3: to look for missing nuclear submarines, or even figure out 262 00:15:31,840 --> 00:15:33,840 Speaker 3: how likely you are to have a disease if you've 263 00:15:33,880 --> 00:15:37,040 Speaker 3: got particular symptoms. It's always when things are uncertain that 264 00:15:37,080 --> 00:15:41,440 Speaker 3: this mathematics is useful. It's the same deal with perception. 265 00:15:41,560 --> 00:15:44,440 Speaker 3: The brain is trying to figure out what it should 266 00:15:44,520 --> 00:15:48,720 Speaker 3: believe now about what's out there, given some new information 267 00:15:48,800 --> 00:15:53,040 Speaker 3: from the senses. Now, the problem is that actually doing 268 00:15:53,200 --> 00:15:57,040 Speaker 3: Bayesian inference is really, really hard. In fact, it's almost 269 00:15:57,040 --> 00:16:00,840 Speaker 3: impossible to do it exactly. So the brain, in a good 270 00:16:01,440 --> 00:16:06,040 Speaker 3: biological evolutionary hack, has figured out an approximation, a very 271 00:16:06,160 --> 00:16:12,720 Speaker 3: general approximation. And this approximation, in the business, we 272 00:16:12,840 --> 00:16:16,760 Speaker 3: call predictive coding or predictive processing. And what this 273 00:16:16,880 --> 00:16:21,480 Speaker 3: means is that the brain effectively has some kind of 274 00:16:21,680 --> 00:16:25,280 Speaker 3: model of the world and the body, and it uses 275 00:16:25,320 --> 00:16:31,320 Speaker 3: that model to generate predictions about the sensory information that 276 00:16:31,400 --> 00:16:34,920 Speaker 3: should be coming in.
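The disease-and-symptom example is the easiest place to see the arithmetic Seth is gesturing at. A minimal sketch of Bayes' rule in Python; the prevalence, sensitivity, and false-positive rate below are invented for illustration, not figures from the episode.

```python
# Bayes' rule: P(disease | symptom) =
#     P(symptom | disease) * P(disease) / P(symptom)
# All numbers are made up for illustration.
prior = 0.01            # P(disease): 1% of people have it
sensitivity = 0.90      # P(symptom | disease)
false_positive = 0.05   # P(symptom | no disease)

# Total probability of showing the symptom at all
p_symptom = sensitivity * prior + false_positive * (1 - prior)

# Updated belief after observing the symptom
posterior = sensitivity * prior / p_symptom
print(f"P(disease | symptom) = {posterior:.3f}")  # about 0.154
```

Even a fairly reliable symptom only lifts the belief to about fifteen percent here, because the prior is so low; that same tug-of-war between prior belief and new evidence is what perception is described as approximating.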
And these predictions, they cascade in 277 00:16:34,960 --> 00:16:39,280 Speaker 3: this top-down direction and back out to the sensory 278 00:16:39,280 --> 00:16:43,560 Speaker 3: surfaces through all these connections that MacKay identified, and then 279 00:16:44,320 --> 00:16:48,600 Speaker 3: the sensory signals, instead of being read out by the brain, 280 00:16:48,760 --> 00:16:52,680 Speaker 3: they just serve as prediction errors. They report the difference 281 00:16:52,720 --> 00:16:55,760 Speaker 3: between what the brain expects and what it gets, you know, 282 00:16:56,000 --> 00:16:59,280 Speaker 3: at every level, from the retina to the early parts 283 00:16:59,320 --> 00:17:02,040 Speaker 3: of the visual cortex all the way up. And if 284 00:17:02,080 --> 00:17:06,040 Speaker 3: the brain then follows a very, very simple rule, which 285 00:17:06,119 --> 00:17:10,080 Speaker 3: is just either make an action or update its prediction 286 00:17:10,240 --> 00:17:13,480 Speaker 3: to try and minimize these prediction errors, so it's just 287 00:17:13,600 --> 00:17:17,600 Speaker 3: trying to minimize, trying to reduce, these prediction-error 288 00:17:17,760 --> 00:17:24,240 Speaker 3: sensory signals, then collectively its predictions will be a very, 289 00:17:24,440 --> 00:17:29,320 Speaker 3: very good approximation to Bayesian inference. So we get 290 00:17:29,359 --> 00:17:33,840 Speaker 3: a picture in which the brain is constantly casting out 291 00:17:33,920 --> 00:17:38,360 Speaker 3: predictions into the world, the sensory signals update these predictions, 292 00:17:39,000 --> 00:17:42,320 Speaker 3: and there's always this simple mechanism by which the brain is 293 00:17:42,640 --> 00:17:46,640 Speaker 3: just trying to minimize prediction error. And as a result, 294 00:17:47,119 --> 00:17:49,879 Speaker 3: what happens is the brain is able to make a 295 00:17:49,960 --> 00:17:53,320 Speaker 3: best guess about what's out there in the world or 296 00:17:53,359 --> 00:17:56,119 Speaker 3: in the body, and then the claim is, well, 297 00:17:56,119 --> 00:17:57,240 Speaker 3: that's what we perceive. 298 00:18:13,480 --> 00:18:14,840 Speaker 2: So let me just summarize. 299 00:18:14,920 --> 00:18:18,160 Speaker 1: So the idea is the brain can't know exactly what's 300 00:18:18,160 --> 00:18:21,280 Speaker 1: out there, so it's making its best guesses and then saying, oh wow, 301 00:18:21,320 --> 00:18:23,520 Speaker 1: that guess was really off, that guess was a little closer, 302 00:18:23,560 --> 00:18:26,680 Speaker 1: and so on, and as it matches the incoming data, 303 00:18:27,040 --> 00:18:31,840 Speaker 1: it gets to refine its model that way until it 304 00:18:31,880 --> 00:18:34,439 Speaker 1: has a reasonably good model. And so one of the 305 00:18:34,480 --> 00:18:36,840 Speaker 1: ideas that's been floating around in neuroscience for a while, 306 00:18:36,920 --> 00:18:38,480 Speaker 1: but again I don't think this makes it into the 307 00:18:38,520 --> 00:18:41,359 Speaker 1: textbooks, is this idea that all of the data that 308 00:18:41,400 --> 00:18:44,800 Speaker 1: we see in the sensory cortices is actually the error. 309 00:18:44,880 --> 00:18:47,440 Speaker 1: It's the part that you didn't get right. And if 310 00:18:47,440 --> 00:18:50,720 Speaker 1: you were actually getting everything one hundred percent right, you'd have, 311 00:18:50,840 --> 00:18:53,240 Speaker 1: you know, golden silence going on in there.
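A minimal sketch of the error-minimization loop being described, under toy assumptions: one hidden cause in the world, one noisy sensor, and an arbitrary learning rate. This is the shape of the idea rather than anyone's actual model; the point is that the belief is only ever nudged by the mismatch between what was predicted and what arrived.

```python
import random

true_state = 5.0      # the hidden cause out in the world
belief = 0.0          # the brain's current best guess (its prediction)
learning_rate = 0.1   # how strongly each prediction error updates the guess

for _ in range(100):
    sensory_signal = true_state + random.gauss(0.0, 0.5)  # noisy, ambiguous input
    prediction_error = sensory_signal - belief            # all the senses "report"
    belief += learning_rate * prediction_error            # update to shrink the error

print(f"final belief: {belief:.2f}")  # settles near 5.0; the belief is the percept
```

Notice what flows forward at every step is the error, not the raw signal: if the guesses were ever perfect, the error channel would go quiet, which is exactly the golden silence just mentioned.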
But the 312 00:18:53,280 --> 00:18:56,000 Speaker 1: world is complicated and things are always happening that 313 00:18:56,000 --> 00:18:58,440 Speaker 1: you can't predict away. But this is the 314 00:18:58,480 --> 00:19:02,440 Speaker 1: frame shift that's really required to see this sort of thing. 315 00:19:03,000 --> 00:19:07,400 Speaker 1: So coming back to why you call this a controlled hallucination, 316 00:19:10,160 --> 00:19:12,880 Speaker 1: let's do it this way, which is: we don't need 317 00:19:12,920 --> 00:19:15,400 Speaker 1: our eyes at all to have rich visual experience, right? 318 00:19:15,560 --> 00:19:18,400 Speaker 1: You have dreams when your eyes are closed, right? 319 00:19:18,880 --> 00:19:24,680 Speaker 1: And so the idea is that maybe you are dreaming, 320 00:19:24,720 --> 00:19:27,040 Speaker 1: you're producing all the stuff on the inside, but you use 321 00:19:27,080 --> 00:19:30,640 Speaker 1: the word controlled to indicate, look, you're anchored to what's going 322 00:19:30,440 --> 00:19:31,040 Speaker 2: on on the outside. 323 00:19:31,080 --> 00:19:33,600 Speaker 1: You've got this, you know, let's say five percent of 324 00:19:33,720 --> 00:19:36,879 Speaker 1: data coming in from the external world, and that anchors 325 00:19:36,920 --> 00:19:40,520 Speaker 1: you somewhat. And that's what you mean by controlled hallucination: 326 00:19:40,720 --> 00:19:42,880 Speaker 1: controlled dreaming, in a sense. 327 00:19:43,200 --> 00:19:47,439 Speaker 3: That's exactly right. In fact, if you describe perception as 328 00:19:47,440 --> 00:19:50,600 Speaker 3: a kind of controlled hallucination, then you can just flip 329 00:19:50,600 --> 00:19:53,360 Speaker 3: it around and you can say, when you're hallucinating as 330 00:19:53,359 --> 00:19:55,600 Speaker 3: we would normally use it, you know, when someone sees 331 00:19:55,640 --> 00:19:58,680 Speaker 3: something that's not there, indeed, when they're dreaming, well you 332 00:19:58,720 --> 00:20:02,080 Speaker 3: can just call that uncontrolled perception. And I think the 333 00:20:02,160 --> 00:20:05,480 Speaker 3: point here is that there's a continuity. They're not completely 334 00:20:05,520 --> 00:20:08,520 Speaker 3: different categories. So, you know, we're used to thinking of 335 00:20:08,600 --> 00:20:12,359 Speaker 3: dreams and hallucinations as coming more from the inside than 336 00:20:12,400 --> 00:20:16,080 Speaker 3: the outside. And it's true that in normal perception the 337 00:20:16,119 --> 00:20:21,080 Speaker 3: outside world plays a bigger role. But fundamentally, I think 338 00:20:21,520 --> 00:20:25,159 Speaker 3: it's the same kind of process. It's the same dance, 339 00:20:25,320 --> 00:20:29,320 Speaker 3: this exchange of prediction and prediction error, but you change 340 00:20:29,440 --> 00:20:32,160 Speaker 3: exactly how this dance plays out, and you can sort 341 00:20:32,160 --> 00:20:35,200 Speaker 3: of sweep through all these different ways of experiencing, whether 342 00:20:35,240 --> 00:20:39,600 Speaker 3: it's normal perception or dreaming or hallucination or something else. 343 00:20:40,119 --> 00:20:45,440 Speaker 1: Now, you used the terms controlled and uncontrolled, but in fact 344 00:20:45,480 --> 00:20:48,679 Speaker 1: it's more of a spectrum,
I'd imagine, right? Where a 345 00:20:48,760 --> 00:20:52,359 Speaker 1: controlled hallucination is: I'm looking for something on my desk, 346 00:20:52,600 --> 00:20:55,359 Speaker 1: I have to really pay attention, and so in that sense, 347 00:20:55,359 --> 00:20:58,040 Speaker 1: it's really controlled by what's in the outside world. Here 348 00:20:58,040 --> 00:21:01,960 Speaker 1: it's more controlled, whereas other times it's less and less 349 00:21:02,000 --> 00:21:06,679 Speaker 1: controlled, when I'm, you know, imagining something or just seeing whatever. 350 00:21:06,720 --> 00:21:10,399 Speaker 1: And you can imagine in cases like Charles Bonnet syndrome, 351 00:21:10,400 --> 00:21:13,760 Speaker 1: as people are going blind, they have formed visual hallucinations. 352 00:21:13,800 --> 00:21:16,840 Speaker 1: They think they see somebody walking in, or a 353 00:21:16,840 --> 00:21:19,040 Speaker 1: bunch of dancers in the street or something, even though 354 00:21:19,040 --> 00:21:22,120 Speaker 1: that's not there. So it's sort of like a spectrum 355 00:21:22,600 --> 00:21:26,560 Speaker 1: of how much control the outside world has. 356 00:21:26,359 --> 00:21:29,800 Speaker 3: That's exactly right. And in fact, one of the powers 357 00:21:29,840 --> 00:21:32,000 Speaker 3: of this approach is that you can use it to 358 00:21:32,119 --> 00:21:35,720 Speaker 3: better understand what's going on in conditions like Charles Bonnet, 359 00:21:36,000 --> 00:21:39,000 Speaker 3: where people have hallucinations. One of the things we did 360 00:21:39,080 --> 00:21:43,240 Speaker 3: in my research group last year was we built computational 361 00:21:43,320 --> 00:21:47,600 Speaker 3: models of this process of prediction and prediction error and 362 00:21:47,640 --> 00:21:50,359 Speaker 3: so on, and we tweaked the model in different ways 363 00:21:50,440 --> 00:21:55,320 Speaker 3: to try and simulate different kinds of hallucination. So Charles 364 00:21:55,320 --> 00:22:00,240 Speaker 3: Bonnet syndrome, where indeed people often see patterns, but also 365 00:22:00,480 --> 00:22:05,800 Speaker 3: people wandering around. Then in Parkinson's disease, people often 366 00:22:05,840 --> 00:22:10,960 Speaker 3: have quite rich and complex hallucinations. And then in psychedelics 367 00:22:11,000 --> 00:22:14,400 Speaker 3: you get all kinds of different hallucinations too that 368 00:22:14,640 --> 00:22:17,959 Speaker 3: sometimes seem to emerge out of things that are already 369 00:22:17,960 --> 00:22:21,040 Speaker 3: there in the environment. Clouds can become animals or people, 370 00:22:21,119 --> 00:22:23,639 Speaker 3: things like that. So we've been able to use this 371 00:22:23,720 --> 00:22:29,200 Speaker 3: approach to really drill down and understand not only how 372 00:22:29,240 --> 00:22:33,119 Speaker 3: these hallucinations happen, but how and why they differ 373 00:22:33,160 --> 00:22:36,520 Speaker 3: from each other. And what we did was we actually 374 00:22:36,600 --> 00:22:39,679 Speaker 3: went to people who have these kinds of hallucinations in 375 00:22:39,720 --> 00:22:42,640 Speaker 3: real life and we asked them to judge the output 376 00:22:42,640 --> 00:22:45,640 Speaker 3: of the model so that we could test whether our 377 00:22:46,359 --> 00:22:49,280 Speaker 3: you know, our computational model of these hallucinations was right.
378 00:22:49,320 --> 00:22:52,960 Speaker 3: You know, would somebody with Parkinson's disease pick the output 379 00:22:53,200 --> 00:22:57,280 Speaker 3: that we generated when we were trying to simulate Parkinson's 380 00:22:57,280 --> 00:23:00,879 Speaker 3: hallucinations? And more or less, that worked well. So 381 00:23:00,960 --> 00:23:05,359 Speaker 3: we're beginning to be able to really characterize, at 382 00:23:05,400 --> 00:23:08,800 Speaker 3: the level of what the brain is doing, these different 383 00:23:08,880 --> 00:23:10,600 Speaker 3: kinds of hallucinations. 384 00:23:10,840 --> 00:23:13,480 Speaker 1: You know, one of the things that I talked about 385 00:23:13,520 --> 00:23:16,320 Speaker 1: in my book Incognito was this possibility that all of 386 00:23:16,400 --> 00:23:20,640 Speaker 1: us might be having hallucinations all the time, but we 387 00:23:20,800 --> 00:23:23,399 Speaker 1: don't know it. So, for example, you know, I see something 388 00:23:23,400 --> 00:23:26,159 Speaker 1: on my desk here, or, you know, I think my 389 00:23:26,240 --> 00:23:28,520 Speaker 1: dog's over there when he's not, and so on, and 390 00:23:28,720 --> 00:23:33,639 Speaker 1: we only notice it when there's a clear indication that 391 00:23:34,600 --> 00:23:36,639 Speaker 1: it's not true, either because someone else tells us, or 392 00:23:36,720 --> 00:23:38,359 Speaker 1: I go look for my dog and realize that was 393 00:23:38,359 --> 00:23:40,160 Speaker 1: a bag on the floor, not my dog, or so on. 394 00:23:40,440 --> 00:23:44,000 Speaker 1: But probably this is happening all the time. I want 395 00:23:44,040 --> 00:23:46,359 Speaker 1: to drill in on what you said about how you 396 00:23:46,400 --> 00:23:49,560 Speaker 1: go out to different people with psychedelics or with Parkinson's 397 00:23:49,560 --> 00:23:53,639 Speaker 1: disease and ask them and find out the degree to 398 00:23:53,680 --> 00:23:56,320 Speaker 1: which this matches. What you're doing is taking things like, 399 00:23:56,600 --> 00:23:58,480 Speaker 1: for example, in your TED Talk a while ago, 400 00:23:58,720 --> 00:24:00,000 Speaker 2: you used Google Deep 401 00:24:00,080 --> 00:24:04,679 Speaker 1: Dream to take footage, and Google Deep Dream sort of 402 00:24:04,680 --> 00:24:07,520 Speaker 1: has these psychedelic hallucinations where it sees dogs, faces 403 00:24:07,520 --> 00:24:09,440 Speaker 1: and everything. So tell us about that. 404 00:24:09,440 --> 00:24:10,280 Speaker 4: That's exactly right. 405 00:24:10,480 --> 00:24:12,560 Speaker 3: So this is, I think, a nice story, because 406 00:24:13,440 --> 00:24:16,280 Speaker 3: we basically did it on our own to start with, just because 407 00:24:16,320 --> 00:24:18,480 Speaker 3: it was a lot of fun. You know, we had 408 00:24:18,480 --> 00:24:21,840 Speaker 3: this virtual reality lab.
One of my postdocs is 409 00:24:21,920 --> 00:24:24,800 Speaker 3: very good at coding these things up, and we just thought, 410 00:24:25,880 --> 00:24:29,960 Speaker 3: why can't we try and make a situation where, you know, 411 00:24:30,000 --> 00:24:32,359 Speaker 3: at the time, people had been just taking photos of 412 00:24:32,400 --> 00:24:35,440 Speaker 3: bowls of pasta and then putting them through Google Deep Dream, 413 00:24:35,440 --> 00:24:37,920 Speaker 3: and they'd all become, you know, they'd grow, loads of 414 00:24:37,960 --> 00:24:40,400 Speaker 3: puppy heads would be there in the bowl 415 00:24:40,440 --> 00:24:42,600 Speaker 3: of pasta. And we just thought, should we just try 416 00:24:42,640 --> 00:24:46,520 Speaker 3: and do this in virtual reality, just because? And so 417 00:24:46,640 --> 00:24:49,960 Speaker 3: we took a panoramic movie, so a three hundred and sixty 418 00:24:49,960 --> 00:24:54,240 Speaker 3: degree movie, and we put every frame through an adaptation 419 00:24:54,320 --> 00:24:56,720 Speaker 3: of this Deep Dream algorithm so that you would get 420 00:24:56,960 --> 00:24:59,720 Speaker 3: frame-to-frame continuity. So it was not an 421 00:24:59,720 --> 00:25:00,480 Speaker 3: easy thing to do. 422 00:25:00,880 --> 00:25:03,920 Speaker 1: Can I, for just one second, just make this clear? 423 00:25:04,600 --> 00:25:08,840 Speaker 1: The reason Google Deep Dream sees puppy faces everywhere is 424 00:25:08,880 --> 00:25:13,000 Speaker 1: because that's its expectation. Its Bayesian prior is that 425 00:25:13,080 --> 00:25:15,840 Speaker 1: it's looking for puppy faces, and so that's 426 00:25:15,880 --> 00:25:17,280 Speaker 1: why it sees them everywhere. 427 00:25:17,400 --> 00:25:20,480 Speaker 2: If you even vaguely have like two dots 428 00:25:20,040 --> 00:25:21,919 Speaker 1: and a line or something, it says, oh, there's a 429 00:25:21,960 --> 00:25:22,600 Speaker 1: puppy face. 430 00:25:22,880 --> 00:25:24,320 Speaker 2: Okay, keep doing that. 431 00:25:24,320 --> 00:25:26,760 Speaker 3: That's exactly right. And of course it doesn't have to 432 00:25:26,760 --> 00:25:29,720 Speaker 3: be puppies. You could give the network another 433 00:25:30,520 --> 00:25:35,520 Speaker 3: expectation and basically fix another node or part of the network, 434 00:25:35,680 --> 00:25:38,480 Speaker 3: and then it's expecting something else. And that's actually the key. 435 00:25:38,520 --> 00:25:41,280 Speaker 3: So once we'd done this with Sussex campus, and we 436 00:25:41,320 --> 00:25:45,960 Speaker 3: did it with the dog expectations, suddenly people would 437 00:25:46,000 --> 00:25:49,360 Speaker 3: have all these strange experiences of Sussex University campus, all 438 00:25:49,359 --> 00:25:51,760 Speaker 3: these dogs coming out of the walls and the windows 439 00:25:51,800 --> 00:25:54,959 Speaker 3: and the sky. And we tried it and it 440 00:25:55,000 --> 00:25:57,040 Speaker 3: was fun, and it just struck us when we were 441 00:25:57,040 --> 00:26:01,560 Speaker 3: thinking about this: would there be any actual scientific 442 00:26:01,680 --> 00:26:05,040 Speaker 3: utility in it? And what we realized was that in 443 00:26:05,080 --> 00:26:09,399 Speaker 3: a lot of psychology experiments, people focus, for good reasons, 444 00:26:09,440 --> 00:26:12,479 Speaker 3: on very constrained situations.
You know, they ask people, was 445 00:26:13,119 --> 00:26:15,400 Speaker 3: the dot moving to the left or the right, or 446 00:26:15,600 --> 00:26:17,800 Speaker 3: was the patch of light there or not there, or 447 00:26:17,840 --> 00:26:20,680 Speaker 3: something very, very simple, so that you can make very 448 00:26:20,680 --> 00:26:23,800 Speaker 3: precise measurements. But these things are very far from the 449 00:26:23,880 --> 00:26:28,639 Speaker 3: richness of everyday conscious experience. In the end, that's what 450 00:26:28,680 --> 00:26:31,600 Speaker 3: we want to understand. So what we realized was that what 451 00:26:31,640 --> 00:26:35,439 Speaker 3: we built was not a model of any kind of 452 00:26:35,480 --> 00:26:39,040 Speaker 3: cognition, of what people think, or any kind of behavior, 453 00:26:39,080 --> 00:26:42,360 Speaker 3: what they do. What we built was a model of experience. 454 00:26:42,760 --> 00:26:46,439 Speaker 3: We built a model of a particular way of encountering the world, 455 00:26:47,280 --> 00:26:50,320 Speaker 3: and not many people have really been doing that. In fact, 456 00:26:50,680 --> 00:26:54,040 Speaker 3: it was probably one of the first examples of doing 457 00:26:54,080 --> 00:26:56,000 Speaker 3: something like this, and it was really just for fun. 458 00:26:56,359 --> 00:27:01,000 Speaker 3: So from that starting point, then, your point is exactly right. 459 00:27:01,080 --> 00:27:06,720 Speaker 3: So that initial network was set up to project expectations 460 00:27:06,720 --> 00:27:09,760 Speaker 3: of dogs into everything. So what we then did 461 00:27:10,000 --> 00:27:13,240 Speaker 3: was we moved on from the Google Deep Dream algorithm 462 00:27:13,280 --> 00:27:17,600 Speaker 3: to some more complicated neural network architecture that we could 463 00:27:17,680 --> 00:27:21,600 Speaker 3: tweak in different ways so that we could begin to 464 00:27:21,640 --> 00:27:26,200 Speaker 3: simulate different kinds of hallucinations. So some hallucinations are very 465 00:27:26,320 --> 00:27:30,640 Speaker 3: rich and very complex, others are very simple and very geometric. 466 00:27:31,760 --> 00:27:36,560 Speaker 3: Some hallucinations appear out of nowhere, you know, they just spontaneously 467 00:27:36,560 --> 00:27:41,600 Speaker 3: arise; other hallucinations are transformations of things that are already there. 468 00:27:41,640 --> 00:27:44,920 Speaker 3: So we were able to kind of create this space 469 00:27:45,000 --> 00:27:48,040 Speaker 3: we could move around in to generate these different kinds 470 00:27:48,040 --> 00:27:49,440 Speaker 3: of experience. 471 00:27:49,960 --> 00:27:51,280 Speaker 2: So tell us what you learned from that. 472 00:27:51,600 --> 00:27:53,919 Speaker 3: So we were able to do this with 473 00:27:54,520 --> 00:27:57,760 Speaker 3: my colleagues David Schwartzman and Keisuke Suzuki. 474 00:27:57,840 --> 00:27:59,320 Speaker 4: They did all this work. By the way, I want 475 00:27:59,359 --> 00:28:00,480 Speaker 4: to make that very clear. 476 00:28:01,640 --> 00:28:06,520 Speaker 3: We went to people who have these hallucinations in real life, 477 00:28:06,520 --> 00:28:10,600 Speaker 3: people with Parkinson's disease, people with another condition you mentioned, 478 00:28:10,720 --> 00:28:14,639 Speaker 3: Charles Bonnet syndrome, people who've had psychedelic experiences, not that 479 00:28:14,720 --> 00:28:16,120 Speaker 3: they were having them then and
480 00:28:16,080 --> 00:28:18,040 Speaker 4: there, but they'd had them. 481 00:28:18,520 --> 00:28:23,439 Speaker 3: And we asked them to pick examples from our models 482 00:28:23,480 --> 00:28:27,280 Speaker 3: that were most similar to the experiences that they had, 483 00:28:27,800 --> 00:28:33,040 Speaker 3: and that way we could test our hypotheses about the 484 00:28:33,119 --> 00:28:38,240 Speaker 3: computational basis of these different kinds of hallucinations. So eventually, 485 00:28:38,280 --> 00:28:41,240 Speaker 3: and we're not there yet, but eventually the idea is, 486 00:28:41,280 --> 00:28:43,600 Speaker 3: by doing this, we'll be able to make some predictions 487 00:28:43,960 --> 00:28:46,560 Speaker 3: about what we might see if we put these people 488 00:28:46,560 --> 00:28:50,479 Speaker 3: in brain imaging scanners and image what's happening while they 489 00:28:50,480 --> 00:28:55,040 Speaker 3: have their hallucinations. That's the next step. And overall the 490 00:28:55,080 --> 00:28:57,600 Speaker 3: goal is, I think, to put it in this larger frame: 491 00:28:58,000 --> 00:29:00,720 Speaker 3: you know, when you want to understand something like how 492 00:29:00,760 --> 00:29:05,200 Speaker 3: we experience the world, the nature of perception, it's often 493 00:29:05,200 --> 00:29:08,160 Speaker 3: a very good idea to look at those situations where 494 00:29:08,320 --> 00:29:11,080 Speaker 3: things are a little bit strange, you know, where people 495 00:29:11,120 --> 00:29:13,640 Speaker 3: are experiencing things differently. You poke around in it and 496 00:29:13,680 --> 00:29:17,240 Speaker 3: see what happens when things are a little bit out of whack. 497 00:29:17,680 --> 00:29:21,680 Speaker 3: So the utility of studying hallucinations, for me, it's firstly 498 00:29:21,760 --> 00:29:24,120 Speaker 3: for the people that have them. We can help them 499 00:29:24,160 --> 00:29:29,360 Speaker 3: understand their lived experience better. But it also reflects back 500 00:29:29,440 --> 00:29:33,640 Speaker 3: on our understanding of perception in general, because, as we've 501 00:29:33,680 --> 00:29:36,160 Speaker 3: been saying, fundamentally, it's the same process. 502 00:29:38,160 --> 00:29:41,480 Speaker 1: Yes. Now, let me drill in on this for a minute, 503 00:29:41,480 --> 00:29:46,320 Speaker 1: because you and I are both fascinated by individual differences 504 00:29:46,560 --> 00:29:50,160 Speaker 1: in the reality inside different heads, and I've made many 505 00:29:50,200 --> 00:29:53,960 Speaker 1: episodes on this topic. For example, the spectrum from 506 00:29:54,120 --> 00:29:57,760 Speaker 1: aphantasia to hyperphantasia, or the internal voice, or what happens 507 00:29:57,760 --> 00:30:01,240 Speaker 1: with synesthesia of different types. All these sorts of things indicate 508 00:30:01,280 --> 00:30:03,960 Speaker 1: that we're experiencing different realities in ways that we can study. 509 00:30:04,400 --> 00:30:04,560 Speaker 4: You, 510 00:30:05,160 --> 00:30:09,440 Speaker 1: with your colleague Fiona Macpherson, launched the Perception Census. 511 00:30:10,000 --> 00:30:12,120 Speaker 1: Tell us about that and tell us what you've learned. 512 00:30:12,680 --> 00:30:13,720 Speaker 4: Oh, thank you for asking this. 513 00:30:13,760 --> 00:30:15,840 Speaker 3: I mean, I'm glad we're talking about this because I 514 00:30:15,880 --> 00:30:18,840 Speaker 3: think it is probably one of our strongest overlaps.
And 515 00:30:18,880 --> 00:30:20,440 Speaker 3: I have to say, a lot of my interest in 516 00:30:20,480 --> 00:30:25,640 Speaker 3: this area was inspired by conversations with you dating back, scarily, 517 00:30:25,720 --> 00:30:28,080 Speaker 3: I think well over twenty years now, when we first 518 00:30:28,080 --> 00:30:31,600 Speaker 3: started talking about these things, certainly the work 519 00:30:31,600 --> 00:30:35,800 Speaker 3: in synesthesia. Individual differences fall out as a consequence 520 00:30:35,840 --> 00:30:38,160 Speaker 3: of this way of thinking, and I think this is 521 00:30:38,200 --> 00:30:41,400 Speaker 3: worth saying first. If we start with a textbook view 522 00:30:41,560 --> 00:30:47,000 Speaker 3: of visual perception, it's kind of easy to think that 523 00:30:47,080 --> 00:30:49,840 Speaker 3: we will all see the world, hear the world, in 524 00:30:49,960 --> 00:30:54,280 Speaker 3: roughly the same way. And that's also how perceptual experience 525 00:30:54,280 --> 00:30:56,760 Speaker 3: feels. It feels like I just see the world 526 00:30:56,800 --> 00:30:59,640 Speaker 3: as it is. It doesn't seem to me as though 527 00:30:59,880 --> 00:31:02,880 Speaker 3: it depends all that much on my brain, or certainly 528 00:31:02,880 --> 00:31:05,320 Speaker 3: not on the specifics of my brain compared to yours. 529 00:31:06,240 --> 00:31:09,560 Speaker 3: But how it seems is never a particularly good guide 530 00:31:09,560 --> 00:31:13,800 Speaker 3: to how things are. And in this view of perception, 531 00:31:13,920 --> 00:31:20,400 Speaker 3: as this prediction, this controlled hallucination, this generative process, then 532 00:31:21,040 --> 00:31:22,840 Speaker 3: it's going to be different for each and every one 533 00:31:22,880 --> 00:31:24,640 Speaker 3: of us. You know, we all differ on the outside 534 00:31:24,640 --> 00:31:27,800 Speaker 3: in skin color and height and body shape, and we 535 00:31:27,840 --> 00:31:31,520 Speaker 3: all have slightly different brains too, and so we should 536 00:31:31,560 --> 00:31:35,400 Speaker 3: all experience the world to some extent differently. But the 537 00:31:35,480 --> 00:31:38,560 Speaker 3: key difference here is it's easy to tell whether people 538 00:31:38,600 --> 00:31:40,880 Speaker 3: are different heights, even if the difference is quite small. 539 00:31:40,920 --> 00:31:42,440 Speaker 3: You know, I can just look at two people standing 540 00:31:42,480 --> 00:31:45,080 Speaker 3: next to each other and I'll see if one's 541 00:31:45,120 --> 00:31:50,160 Speaker 3: taller than the other. But if you experience red slightly 542 00:31:50,200 --> 00:31:55,680 Speaker 3: differently from me, or time slightly differently from me, how 543 00:31:55,840 --> 00:31:59,440 Speaker 3: will we ever know? Because your experience is private, subjective 544 00:31:59,480 --> 00:32:02,280 Speaker 3: to you. We'll probably use the same words. You know, 545 00:32:02,440 --> 00:32:06,240 Speaker 3: it's red, that lasted about a second. It's so much 546 00:32:06,240 --> 00:32:08,920 Speaker 3: harder to tell. And so we end up, I think, 547 00:32:09,080 --> 00:32:15,240 Speaker 3: overestimating the similarity of our inner worlds.
And so this 548 00:32:15,360 --> 00:32:19,680 Speaker 3: project of the Perception Census, with Fiona Macpherson and many 549 00:32:19,720 --> 00:32:22,560 Speaker 3: other colleagues, Reny Baykova, a postdoc who really drove 550 00:32:22,600 --> 00:32:27,440 Speaker 3: it here at Sussex, we wanted to study these individual differences, 551 00:32:27,440 --> 00:32:30,920 Speaker 3: but we wanted to do it at scale. So this 552 00:32:31,120 --> 00:32:33,440 Speaker 3: is not a new idea, David. You've done a lot 553 00:32:33,440 --> 00:32:36,040 Speaker 3: of this work, contributed a lot to this literature already. 554 00:32:37,120 --> 00:32:41,000 Speaker 3: But many of these studies focus on one or two 555 00:32:41,080 --> 00:32:45,640 Speaker 3: or three aspects of perception. Maybe synesthesia, which I'm sure 556 00:32:45,640 --> 00:32:47,640 Speaker 3: your listeners will know about because you wrote the book 557 00:32:47,680 --> 00:32:51,880 Speaker 3: on this, literally several books. Or maybe 558 00:32:51,880 --> 00:32:58,120 Speaker 3: something else, like time, or the ability to discriminate different musical notes. 559 00:32:58,840 --> 00:33:01,000 Speaker 3: We wanted to look at lots of things together, 560 00:33:01,040 --> 00:33:03,360 Speaker 3: and we wanted to look at a lot of people, 561 00:33:03,440 --> 00:33:07,840 Speaker 3: and people from many places and of many ages, so 562 00:33:07,920 --> 00:33:12,680 Speaker 3: we got quite ambitious. We put together over fifty different 563 00:33:12,720 --> 00:33:17,280 Speaker 3: tasks rather than just two or three, fifty different experiments 564 00:33:17,320 --> 00:33:21,280 Speaker 3: that people could do, lots of visual illusions, sound. It had 565 00:33:21,320 --> 00:33:23,040 Speaker 3: to be things people could do at home, so we 566 00:33:23,080 --> 00:33:27,560 Speaker 3: couldn't do things like smell, whatever. But fifty different things, 567 00:33:28,240 --> 00:33:32,280 Speaker 3: and overall we were able to get around forty thousand 568 00:33:32,320 --> 00:33:38,280 Speaker 3: people engaged in the census, from ages eighteen to 569 00:33:38,440 --> 00:33:41,400 Speaker 3: well over seventy, and from one hundred and twenty-seven 570 00:33:41,560 --> 00:33:42,680 Speaker 3: different countries. 571 00:33:43,000 --> 00:33:43,320 Speaker 1: Wow. 572 00:33:43,440 --> 00:33:47,640 Speaker 3: So it's been a huge data-gathering effort and I 573 00:33:47,680 --> 00:33:49,800 Speaker 3: really see this as a resource, and I hope it's 574 00:33:49,840 --> 00:33:53,600 Speaker 3: going to be an important resource for the whole community, 575 00:33:53,680 --> 00:33:56,600 Speaker 3: because we're going to make the data entirely open and 576 00:33:56,640 --> 00:33:59,080 Speaker 3: it can be a bit of a sandpit for testing 577 00:34:00,440 --> 00:34:03,200 Speaker 3: ideas or hypotheses that people might have. You might want 578 00:34:03,240 --> 00:34:07,680 Speaker 3: to ask, oh, does somebody with more vivid mental imagery, 579 00:34:07,840 --> 00:34:12,480 Speaker 3: you know, see more different shades of color? 580 00:34:13,400 --> 00:34:16,600 Speaker 3: What things tend to go together?
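Once the data are open, a question like "what things tend to go together?" starts as nothing fancier than a correlation across people. A sketch with entirely fabricated numbers standing in for two census-style measures; the variable names are hypothetical, not the census's actual task names.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people = 1000

# Fabricated stand-ins for two census-style scores, one value per person.
imagery_vividness = rng.normal(50.0, 10.0, size=n_people)  # e.g., a 0-100 scale
color_shades_seen = 0.3 * imagery_vividness + rng.normal(0.0, 10.0, size=n_people)

# "Does somebody with more vivid mental imagery see more shades of color?"
r = np.corrcoef(imagery_vividness, color_shades_seen)[0, 1]
print(f"correlation across {n_people} people: r = {r:.2f}")
```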
And as part of 581 00:34:16,640 --> 00:34:20,080 Speaker 3: the census, we also have some data on people, 582 00:34:20,719 --> 00:34:27,000 Speaker 3: where they reside on some of the more clinical dimensions too, autism, ADHD, 583 00:34:27,840 --> 00:34:32,160 Speaker 3: things like that, so we can start to understand how 584 00:34:32,320 --> 00:34:37,920 Speaker 3: these conditions relate to the normal spectrum of variability. So 585 00:34:37,960 --> 00:34:42,439 Speaker 3: I've been using the term here perceptual diversity rather than 586 00:34:42,760 --> 00:34:46,520 Speaker 3: what many people have heard, neurodiversity. And I want to 587 00:34:46,560 --> 00:34:49,440 Speaker 3: just dwell on that for a second, because this idea 588 00:34:49,520 --> 00:34:52,480 Speaker 3: of neurodiversity has been very important, and it's led to 589 00:34:52,880 --> 00:34:57,879 Speaker 3: a lot of recognition that people with conditions like autism, 590 00:34:58,000 --> 00:35:00,399 Speaker 3: which is the one that's most commonly used here, but others 591 00:35:00,400 --> 00:35:05,399 Speaker 3: as well, ADHD, that they experience things differently, and that 592 00:35:05,440 --> 00:35:09,440 Speaker 3: can cause some problems, can give some benefits, but it's different. 593 00:35:09,800 --> 00:35:15,040 Speaker 3: But ironically, in my mind, the idea of neurodiversity has 594 00:35:15,760 --> 00:35:19,800 Speaker 3: kind of reinforced the idea that if you're not neurodivergent 595 00:35:19,960 --> 00:35:24,080 Speaker 3: in some way, then you're neurotypical and you see the 596 00:35:24,080 --> 00:35:29,759 Speaker 3: world as it is, and it's underplayed, I think, the reality, 597 00:35:30,120 --> 00:35:32,920 Speaker 3: or certainly the reality we're exploring with the census. 598 00:35:33,000 --> 00:35:35,200 Speaker 4: Let's see if it's true that 599 00:35:35,560 --> 00:35:40,000 Speaker 3: there's just variation, and maybe when you get somewhere towards 600 00:35:40,080 --> 00:35:43,319 Speaker 3: the extreme of a distribution, a label is slapped on 601 00:35:43,360 --> 00:35:48,000 Speaker 3: it and it becomes a neurodivergent condition. But I think 602 00:35:48,000 --> 00:35:52,200 Speaker 3: that if we all understood that each of us experiences 603 00:35:52,239 --> 00:35:57,400 Speaker 3: the world in our own unique individual way, it would help. 604 00:35:57,880 --> 00:36:01,920 Speaker 3: This is not at all to diminish or minimize neurodivergent conditions. 605 00:36:01,960 --> 00:36:05,279 Speaker 3: I just want to understand them as part of the 606 00:36:05,360 --> 00:36:10,520 Speaker 3: spectrum in which there's variation among all of us, just 607 00:36:10,560 --> 00:36:13,719 Speaker 3: as there is in height, body shape, anything else. That's 608 00:36:13,719 --> 00:36:17,640 Speaker 3: what the Perception Census is empirically trying to look at, 609 00:36:17,640 --> 00:36:20,399 Speaker 3: because we just don't have this data yet. How much 610 00:36:20,480 --> 00:36:22,520 Speaker 3: variation is out there, what does it look like, how 611 00:36:22,520 --> 00:36:23,280 Speaker 3: does it correlate? 612 00:36:40,000 --> 00:36:41,600 Speaker 1: By the way, just finally, I'll tell you about 613 00:36:41,600 --> 00:36:44,480 Speaker 1: my idiothesis. Idiothesis is the term we use in my 614 00:36:44,600 --> 00:36:48,120 Speaker 1: lab for an idiotic hypothesis.
But what I've been 615 00:36:48,160 --> 00:36:52,040 Speaker 1: interested in is, as you know, I'm a lover of literature, 616 00:36:52,440 --> 00:36:57,560 Speaker 1: and I've noticed that authors like Ernest Hemingway and, let's say, 617 00:36:57,560 --> 00:37:00,440 Speaker 1: Thomas Hardy have very different ways of writing. And I 618 00:37:00,520 --> 00:37:05,680 Speaker 1: think this is, I'm calling this, retrospective brain scanning, 619 00:37:05,760 --> 00:37:10,080 Speaker 1: because I think we can tell that Hemingway was probably 620 00:37:10,160 --> 00:37:14,359 Speaker 1: an aphantasic, meaning he didn't picture details in his head 621 00:37:14,480 --> 00:37:17,040 Speaker 1: and didn't care about them. But somebody like Thomas Hardy 622 00:37:17,160 --> 00:37:22,200 Speaker 1: or, you know, Fenimore Cooper or anybody like that, put 623 00:37:22,320 --> 00:37:25,680 Speaker 1: so many details on the page: the red curtains billowed, 624 00:37:25,719 --> 00:37:28,759 Speaker 1: and the flowers were these flowers in this arrangement, and 625 00:37:28,800 --> 00:37:32,000 Speaker 1: so on. And I happen to be on the 626 00:37:32,120 --> 00:37:34,879 Speaker 1: aphantasic end of things, and so I can't stand those 627 00:37:34,880 --> 00:37:37,120 Speaker 1: authors who give tons and tons of detail 628 00:37:37,120 --> 00:37:37,960 Speaker 2: that don't matter. 629 00:37:38,239 --> 00:37:40,719 Speaker 1: Now, the reason this is an idiothesis is because I 630 00:37:40,760 --> 00:37:43,040 Speaker 1: can't ever prove this is true, and there may be 631 00:37:43,120 --> 00:37:45,239 Speaker 1: other reasons why they wrote in the styles they did, 632 00:37:45,280 --> 00:37:49,120 Speaker 1: but I like the idea that this is yet another 633 00:37:49,239 --> 00:37:52,000 Speaker 1: correlation in the world: by looking at people's outputs, 634 00:37:52,040 --> 00:37:54,200 Speaker 1: the way they talk, the way they describe the world, 635 00:37:54,400 --> 00:37:57,280 Speaker 1: that might give us some even very rough insight 636 00:37:57,320 --> 00:37:58,840 Speaker 1: about what's happening internally. 637 00:37:59,360 --> 00:38:02,040 Speaker 3: I think there's something to that. One 638 00:38:02,040 --> 00:38:04,200 Speaker 3: thing that's well known, I mean, you'll correct me if 639 00:38:04,200 --> 00:38:08,960 Speaker 3: I'm wrong, is that some authors were clearly synesthetic. So 640 00:38:09,000 --> 00:38:13,040 Speaker 3: I think Vladimir Nabokov was well known as a synaesthete. 641 00:38:13,200 --> 00:38:17,000 Speaker 3: And that's interesting not only from the perspective of how 642 00:38:17,040 --> 00:38:19,640 Speaker 3: he might have described the world around him, whether it 643 00:38:19,680 --> 00:38:23,520 Speaker 3: was detailed, but whether it affected other things, whether there 644 00:38:23,600 --> 00:38:27,120 Speaker 3: was sort of just more cross fertilization, more associativity, in 645 00:38:27,520 --> 00:38:31,560 Speaker 3: the way he was writing compared to other people. Because synesthesia, 646 00:38:31,640 --> 00:38:35,880 Speaker 3: which again is something that, well, you actually did, I 647 00:38:35,880 --> 00:38:39,160 Speaker 3: think, the first large scale survey of synesthesia 648 00:38:39,239 --> 00:38:43,360 Speaker 3: with the Synesthesia Battery.
So we're looking at synesthesia again, 649 00:38:43,640 --> 00:38:46,720 Speaker 3: but of course in the same group of people, we're 650 00:38:46,719 --> 00:38:48,680 Speaker 3: looking at all kinds of other things too. So I'm 651 00:38:48,760 --> 00:38:53,160 Speaker 3: really excited to see what synesthesia is associated with. Unfortunately, 652 00:38:53,160 --> 00:38:55,640 Speaker 3: we didn't ask people whether they'd written any great works 653 00:38:55,640 --> 00:38:56,240 Speaker 3: of literature. 654 00:38:57,080 --> 00:38:59,480 Speaker 1: You know, the truth is I've always struggled with this question 655 00:38:59,520 --> 00:39:05,759 Speaker 1: about whether synesthesia is associated with higher creativity, because in 656 00:39:05,800 --> 00:39:08,640 Speaker 1: a sense, it's not creative. If I always see M 657 00:39:08,680 --> 00:39:11,399 Speaker 1: as purple and B as orange and so on, that's 658 00:39:11,440 --> 00:39:12,480 Speaker 1: not me being particularly creative. 659 00:39:12,520 --> 00:39:14,040 Speaker 2: It's just an association that I have. 660 00:39:15,239 --> 00:39:17,960 Speaker 1: In the case of Vladimir Nabokov, 661 00:39:18,120 --> 00:39:21,680 Speaker 1: you know, there are little clues that we have. 662 00:39:22,000 --> 00:39:25,080 Speaker 1: Just as one example, he was a lepidopterist, which means, 663 00:39:25,120 --> 00:39:29,840 Speaker 1: you know, he studied butterflies, and for him, his favorite 664 00:39:29,840 --> 00:39:33,400 Speaker 1: butterfly had a particular pattern of yellow, black, yellow, and 665 00:39:34,320 --> 00:39:38,200 Speaker 1: so his novel Ada happens to map onto 666 00:39:38,239 --> 00:39:42,840 Speaker 1: those colors of those letters, where, you know, A is 667 00:39:42,920 --> 00:39:44,440 Speaker 1: yellow and D is black. 668 00:39:44,239 --> 00:39:45,000 Speaker 2: And A is yellow. 669 00:39:45,120 --> 00:39:46,960 Speaker 1: So we see little clues like that. But I don't 670 00:39:47,000 --> 00:39:48,960 Speaker 1: know if it makes a person more creative or not. 671 00:39:49,280 --> 00:39:51,640 Speaker 3: Yeah, there's another example of that. I actually had that, 672 00:39:51,719 --> 00:39:55,319 Speaker 3: writing a piece for a gallery catalog about an 673 00:39:55,400 --> 00:39:56,719 Speaker 3: artist, Yayoi Kusama. 674 00:39:56,840 --> 00:39:57,600 Speaker 4: She's still alive. 675 00:39:57,640 --> 00:40:01,799 Speaker 3: She's in her nineties in Japan, and she's very very 676 00:40:01,800 --> 00:40:07,160 Speaker 3: well known for these artworks that have things like red 677 00:40:07,239 --> 00:40:11,000 Speaker 3: polka dots everywhere, and there are these sorts of patterns 678 00:40:11,880 --> 00:40:16,640 Speaker 3: over everything, representations of landscapes and so on, and it's 679 00:40:16,640 --> 00:40:19,000 Speaker 3: a very distinctive style. She also does these 680 00:40:19,040 --> 00:40:23,439 Speaker 3: mirror infinity rooms. But it seems that she has this 681 00:40:23,880 --> 00:40:26,160 Speaker 3: way of experiencing the world in which she literally does 682 00:40:26,280 --> 00:40:31,040 Speaker 3: see red dots everywhere at some times. There's, I forget 683 00:40:31,040 --> 00:40:33,319 Speaker 3: the technical name for it, but there's a, you know, 684 00:40:33,360 --> 00:40:38,160 Speaker 3: a sort of visual condition where this happens.
And 685 00:40:38,440 --> 00:40:44,040 Speaker 3: so exactly to your point, maybe her artistic innovation was 686 00:40:44,080 --> 00:40:47,680 Speaker 3: to some extent a direct transcription of what she was 687 00:40:47,920 --> 00:40:51,480 Speaker 3: just seeing. Now, I think that is not the full story. 688 00:40:51,520 --> 00:40:54,040 Speaker 3: You know, I think it 689 00:40:54,120 --> 00:40:58,120 Speaker 3: provides the raw materials that people can use to generate 690 00:40:59,000 --> 00:41:02,880 Speaker 3: pieces of art, whether that's paintings, music, novels, that have 691 00:41:03,000 --> 00:41:05,520 Speaker 3: a creative and artistic impact. So I don't think it 692 00:41:05,560 --> 00:41:12,200 Speaker 3: reduces it. But I also agree with you about that dissociation. 693 00:41:12,400 --> 00:41:15,960 Speaker 3: We have to be careful, because if it's automatic and normal, 694 00:41:16,200 --> 00:41:18,239 Speaker 3: you know, where is the creativity? The creativity is in 695 00:41:18,239 --> 00:41:20,560 Speaker 3: what you do with it. It's not the thing itself. 696 00:41:20,520 --> 00:41:20,960 Speaker 2: That's right. 697 00:41:21,000 --> 00:41:24,560 Speaker 1: And by the way, Kandinsky, the painter, he had 698 00:41:24,600 --> 00:41:27,160 Speaker 1: a very rich synesthesia that was triggered by sound, and 699 00:41:27,200 --> 00:41:29,680 Speaker 1: so what he would do is crank up his phonograph 700 00:41:29,719 --> 00:41:32,560 Speaker 1: and stand in front of the canvas and paint what 701 00:41:32,680 --> 00:41:36,120 Speaker 1: he was seeing. And what's interesting about that, that's an 702 00:41:36,120 --> 00:41:41,520 Speaker 1: example where he's just transcribing what his perception is. What's 703 00:41:41,520 --> 00:41:44,719 Speaker 1: interesting is, given that we all have different internal worlds, 704 00:41:45,160 --> 00:41:48,120 Speaker 1: it's nice to find ways to share like that. And 705 00:41:48,200 --> 00:41:52,120 Speaker 1: I do think in, let's say, literature and novels, you 706 00:41:52,200 --> 00:41:55,560 Speaker 1: are getting to step into the shoes of someone who 707 00:41:55,640 --> 00:41:57,839 Speaker 1: sees the world differently than you do, and that's why 708 00:41:57,880 --> 00:41:59,760 Speaker 1: we love to share stories with one another. 709 00:42:00,080 --> 00:42:03,480 Speaker 3: Yeah, I think again, something that you and I no 710 00:42:03,520 --> 00:42:06,680 Speaker 3: doubt agree on is that, you know, when we 711 00:42:06,719 --> 00:42:09,520 Speaker 3: face the challenge of understanding consciousness, you know, in the 712 00:42:09,640 --> 00:42:12,839 Speaker 3: large, in the sort of most expansive way, we want 713 00:42:12,880 --> 00:42:16,319 Speaker 3: to understand what it is like to be me or 714 00:42:16,360 --> 00:42:21,040 Speaker 3: to be you. And sure we can study visual perception 715 00:42:21,360 --> 00:42:23,799 Speaker 3: or auditory perception or something like that, but for a 716 00:42:23,800 --> 00:42:27,120 Speaker 3: lot of people, the experience of being who they are 717 00:42:28,200 --> 00:42:30,840 Speaker 3: is tied up with the self, that's tied up with 718 00:42:30,920 --> 00:42:34,759 Speaker 3: some kind of internal narrative, a distinctive way of being in 719 00:42:34,800 --> 00:42:37,719 Speaker 3: the world. And literature has done, I think, more than 720 00:42:37,760 --> 00:42:42,000 Speaker 3: science to explore and examine that aspect of consciousness.
721 00:42:42,120 --> 00:42:43,960 Speaker 1: I'm so glad you brought this up because I was 722 00:42:44,000 --> 00:42:46,480 Speaker 1: going to ask you about this. So in your book 723 00:42:46,520 --> 00:42:49,360 Speaker 1: Being You, you talked about the way that we build 724 00:42:49,400 --> 00:42:53,000 Speaker 1: a model of the outside world by minimizing the prediction errors, 725 00:42:53,360 --> 00:42:56,400 Speaker 1: but also we build models of the inside world. 726 00:42:56,400 --> 00:42:57,359 Speaker 2: So tell us about that. 727 00:42:57,680 --> 00:43:01,840 Speaker 3: So this is another inversion, if you like, a conceptual 728 00:43:02,600 --> 00:43:06,080 Speaker 3: upside down move. We've already had one, which is the 729 00:43:06,160 --> 00:43:09,279 Speaker 3: idea that instead of experiencing the world in this kind 730 00:43:09,360 --> 00:43:12,919 Speaker 3: of outside in direction, where the world just pours into 731 00:43:12,960 --> 00:43:17,160 Speaker 3: the mind, it's this act of construction, this controlled hallucination. Now, 732 00:43:17,239 --> 00:43:20,920 Speaker 3: another common assumption, or sort of what might seem an 733 00:43:20,960 --> 00:43:23,839 Speaker 3: obvious way of thinking about things, is when we think 734 00:43:23,840 --> 00:43:28,200 Speaker 3: about the self, and it might seem just intuitive to 735 00:43:28,280 --> 00:43:31,919 Speaker 3: say that the self, what it is to be me, well, 736 00:43:31,960 --> 00:43:35,319 Speaker 3: that's the thing that does the perceiving. There's something, there's 737 00:43:35,320 --> 00:43:39,160 Speaker 3: some essence of me inside my skull that is the 738 00:43:39,200 --> 00:43:44,760 Speaker 3: recipient of all these perceptions, however they're constructed, that takes 739 00:43:44,760 --> 00:43:47,720 Speaker 3: them in, figures out what to do next, does something, 740 00:43:47,800 --> 00:43:50,320 Speaker 3: and we sense, we think, we act. 741 00:43:50,480 --> 00:43:51,839 Speaker 4: And go round and round and round. 742 00:43:53,040 --> 00:43:58,120 Speaker 3: Now, I think this is not the way to think 743 00:43:58,160 --> 00:44:00,880 Speaker 3: about things. I think it's very different. Rather than the 744 00:44:00,960 --> 00:44:04,640 Speaker 3: self being that which does the perceiving, I think it's 745 00:44:04,800 --> 00:44:08,640 Speaker 3: more correct to say that the self itself is a 746 00:44:08,760 --> 00:44:14,759 Speaker 3: kind of perception. So there's just the same process going on. 747 00:44:14,880 --> 00:44:20,720 Speaker 3: The brain is making inferences about sensory signals from different 748 00:44:20,719 --> 00:44:24,320 Speaker 3: places and over different time scales. Some of those inferences 749 00:44:24,400 --> 00:44:29,120 Speaker 3: underlie our experiences of the world, and some of those inferences, 750 00:44:29,560 --> 00:44:34,839 Speaker 3: some of those controlled hallucinations, are what the self actually is. 751 00:44:34,880 --> 00:44:40,719 Speaker 3: The self is this collection of perceptions. And 752 00:44:40,760 --> 00:44:43,040 Speaker 3: I think this is the key, you know. So why 753 00:44:43,160 --> 00:44:45,799 Speaker 3: is the self different from the world? What's special about 754 00:44:45,800 --> 00:44:48,600 Speaker 3: the self? Well, one of the things that's special about 755 00:44:48,600 --> 00:44:52,560 Speaker 3: the self is the body.
So perceptions of self, for 756 00:44:52,640 --> 00:44:58,640 Speaker 3: me anyway, are rooted in the brain's predictions about 757 00:44:59,239 --> 00:45:03,920 Speaker 3: the body, and all aspects of self are built on that. 758 00:45:05,560 --> 00:45:08,759 Speaker 1: Great. So your brain is monitoring the outside world, and 759 00:45:08,800 --> 00:45:11,640 Speaker 1: it's monitoring the inside world, inside the body. 760 00:45:12,200 --> 00:45:15,320 Speaker 2: And you made the argument that 761 00:45:15,320 --> 00:45:20,879 Speaker 1: it not only tries to predict, but eventually becomes good 762 00:45:20,960 --> 00:45:23,600 Speaker 1: at saying, okay, look, my model is that, 763 00:45:23,680 --> 00:45:26,200 Speaker 1: let's say, the body temperature should be at this level, so 764 00:45:26,320 --> 00:45:28,960 Speaker 1: if it fluctuates, this is what gives you the ability 765 00:45:29,000 --> 00:45:31,880 Speaker 1: to do homeostasis, to keep things in order, because you 766 00:45:32,200 --> 00:45:34,400 Speaker 1: have a well established model at this point. 767 00:45:34,640 --> 00:45:35,640 Speaker 4: That's absolutely right. 768 00:45:35,920 --> 00:45:39,360 Speaker 3: And I know you recently had Lisa Feldman Barrett on 769 00:45:39,840 --> 00:45:43,680 Speaker 3: the podcast, and she and I have a pretty similar 770 00:45:43,760 --> 00:45:45,680 Speaker 3: view about this, but I think we came to it 771 00:45:45,760 --> 00:45:48,080 Speaker 3: from very different directions. I mean, Lisa has always been 772 00:45:48,120 --> 00:45:53,000 Speaker 3: interested specifically in emotion and pushing back against these Darwinian 773 00:45:53,040 --> 00:45:57,200 Speaker 3: ideas of hard coded emotion circuits in the brain. I 774 00:45:57,320 --> 00:46:01,399 Speaker 3: came to these kinds of ideas by just thinking about, well, 775 00:46:01,400 --> 00:46:04,240 Speaker 3: how does the brain perceive the outside world? Well, maybe 776 00:46:04,239 --> 00:46:08,719 Speaker 3: there's something similar happening with emotion. Building actually on some 777 00:46:08,840 --> 00:46:12,359 Speaker 3: very early ideas of William James and Carl Lange, who 778 00:46:12,440 --> 00:46:16,680 Speaker 3: talked about emotion as a process of the brain's appraisal 779 00:46:16,800 --> 00:46:20,880 Speaker 3: or interpretation of the physiological condition of the body. But 780 00:46:21,000 --> 00:46:23,520 Speaker 3: where we end up is somewhere similar. It's not exactly the same. 781 00:46:23,800 --> 00:46:26,480 Speaker 3: There are differences between how Lisa and I see things, 782 00:46:26,880 --> 00:46:32,760 Speaker 3: but where we end up is one fundamental realization about 783 00:46:32,760 --> 00:46:37,520 Speaker 3: what brains are for that really, I think, casts everything 784 00:46:37,600 --> 00:46:41,560 Speaker 3: we think about perception and consciousness in a different light, and 785 00:46:41,640 --> 00:46:45,600 Speaker 3: this is that brains are not for creating art, or 786 00:46:45,640 --> 00:46:50,080 Speaker 3: writing poetry, or doing complex math, or even having conversations. 787 00:46:50,440 --> 00:46:53,000 Speaker 3: If you think about what the primary duty of any 788 00:46:53,040 --> 00:46:56,920 Speaker 3: brain is, it's to keep the body and itself alive. 789 00:46:57,440 --> 00:47:00,480 Speaker 3: If you don't do that, you don't do anything. And 790 00:47:00,600 --> 00:47:03,200 Speaker 3: to keep an organism alive is a difficult thing.
791 00:47:03,239 --> 00:47:06,040 Speaker 3: I mean, that's why evolution has shaped all these weird 792 00:47:06,040 --> 00:47:09,960 Speaker 3: and wonderful creatures. But it usually, well, in fact, it 793 00:47:10,000 --> 00:47:16,480 Speaker 3: always requires keeping certain physiological variables like heart rate, blood pressure, 794 00:47:16,760 --> 00:47:21,560 Speaker 3: blood oxygenation within very tight ranges. You've got to regulate 795 00:47:21,560 --> 00:47:24,960 Speaker 3: your physiology. If your blood oxygen drops, you won't last 796 00:47:25,080 --> 00:47:28,520 Speaker 3: very long if it drops too much. And so how 797 00:47:28,560 --> 00:47:32,040 Speaker 3: does the brain do this? Well, any control engineer will 798 00:47:32,080 --> 00:47:35,239 Speaker 3: tell you that a good way to regulate something, to 799 00:47:35,360 --> 00:47:38,400 Speaker 3: keep it where it needs to be, is to have 800 00:47:38,440 --> 00:47:41,440 Speaker 3: a predictive model about it, because if you have a 801 00:47:41,440 --> 00:47:46,360 Speaker 3: predictive model, you can anticipate deviations. When we both finish 802 00:47:46,440 --> 00:47:50,040 Speaker 3: recording and we stand up, the brain is anticipating that 803 00:47:50,120 --> 00:47:53,560 Speaker 3: our blood pressure will drop a bit, so it transiently 804 00:47:53,600 --> 00:47:58,040 Speaker 3: increases the blood pressure. It constricts our vessels, so our 805 00:47:58,080 --> 00:48:01,319 Speaker 3: blood pressure in fact remains pretty much exactly the same. 806 00:48:01,360 --> 00:48:05,680 Speaker 3: We don't faint. So the ability to control and regulate 807 00:48:06,360 --> 00:48:12,120 Speaker 3: is done best through prediction. And so for me, 808 00:48:12,120 --> 00:48:16,839 Speaker 3: that's the fundamental reason why brains work this way, why 809 00:48:17,040 --> 00:48:20,440 Speaker 3: perception works this way. It's all built on this fundamental 810 00:48:20,440 --> 00:48:26,640 Speaker 3: imperative of the brain to regulate, homeostatically, allostatically, the physiology 811 00:48:26,880 --> 00:48:28,960 Speaker 3: of the body. And I think the way I like 812 00:48:29,000 --> 00:48:30,640 Speaker 3: to put it in my book is we experience the 813 00:48:30,680 --> 00:48:34,399 Speaker 3: world and the self with, through, and because of our 814 00:48:34,520 --> 00:48:35,760 Speaker 3: living bodies. 815 00:48:35,840 --> 00:48:37,800 Speaker 2: And so unpack that just a little bit more. 816 00:48:37,719 --> 00:48:42,640 Speaker 3: So, I mean, this whole question of what emotions are, 817 00:48:42,880 --> 00:48:45,440 Speaker 3: how they come about, what they're for, is super super interesting. 818 00:48:45,800 --> 00:48:49,200 Speaker 3: But it kind of polarizes into two extremes. At one extreme 819 00:48:49,600 --> 00:48:53,920 Speaker 3: you have Darwin, or at least a kind of caricature 820 00:48:53,960 --> 00:48:57,920 Speaker 3: of Darwin, which says that there are certain, there's a 821 00:48:57,960 --> 00:49:03,640 Speaker 3: certain repertoire of emotions: happiness, anger, sadness, disgust, things like this. 822 00:49:04,080 --> 00:49:09,520 Speaker 3: They're pretty much hard wired into our brains.
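The control-engineering point above, that regulation works best with a predictive model, can be sketched in a few lines. This is a toy simulation under invented numbers, not physiology and not anything from the episode: a purely reactive controller only corrects blood pressure after it has already dropped, while an anticipatory controller that predicts the drop on standing cancels it before it appears.

```python
# Toy comparison of reactive vs. anticipatory (model-based) regulation.
# All numbers are arbitrary illustration, not a physiological model.

def simulate(anticipate: bool):
    pressure, setpoint = 100.0, 100.0
    trace = []
    for t in range(12):
        disturbance = -15.0 if t == 5 else 0.0     # "standing up" at t = 5
        correction = 0.5 * (setpoint - pressure)   # reactive feedback
        if anticipate and t == 5:
            correction += 15.0                     # predicted drop, pre-empted
        pressure += disturbance + correction
        trace.append(round(pressure, 1))
    return trace

print("reactive:    ", simulate(anticipate=False))
print("anticipatory:", simulate(anticipate=True))
# The anticipatory trace stays at the setpoint; the reactive one dips
# at t = 5 and only climbs back gradually, which is when you'd faint.
```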
They're kind 823 00:49:09,560 --> 00:49:13,479 Speaker 3: of different from cognition, from thinking, you know, they're more 824 00:49:14,360 --> 00:49:20,279 Speaker 3: physiological, like survival reflexes that feel particular ways, and they're 825 00:49:20,280 --> 00:49:25,640 Speaker 3: pretty fixed and conserved, the idea is, across time, across cultures, 826 00:49:26,239 --> 00:49:28,640 Speaker 3: and this has been kind of the mainstream view, I think, 827 00:49:28,680 --> 00:49:31,160 Speaker 3: in one way or another for a long time. And 828 00:49:31,200 --> 00:49:33,040 Speaker 3: then on the other hand, you've got the idea of 829 00:49:33,800 --> 00:49:38,839 Speaker 3: emotions as being somewhat constructed. Now this has also got 830 00:49:38,840 --> 00:49:41,520 Speaker 3: a long history. William James, we mentioned a minute ago, 831 00:49:41,600 --> 00:49:49,600 Speaker 3: he realized over one hundred years ago that emotions were very, 832 00:49:49,719 --> 00:49:53,360 Speaker 3: very deeply embodied. You know, he proposed the idea 833 00:49:53,400 --> 00:49:56,959 Speaker 3: that if, say, a bear comes charging into the room, 834 00:49:57,520 --> 00:50:00,520 Speaker 3: we will feel afraid and then we might run away, 835 00:50:00,760 --> 00:50:05,040 Speaker 3: and it might be normal to think that, you know, 836 00:50:05,760 --> 00:50:08,920 Speaker 3: it's the sight of the bear that causes the feeling 837 00:50:08,960 --> 00:50:11,360 Speaker 3: of fear, and then the feeling of the fear causes 838 00:50:11,400 --> 00:50:15,680 Speaker 3: me to run away. And James, as is becoming a common 839 00:50:15,719 --> 00:50:19,120 Speaker 3: theme in our conversation, now flips this around, right. So, 840 00:50:20,200 --> 00:50:22,759 Speaker 3: for William James and also Carl Lange, the bear comes 841 00:50:22,800 --> 00:50:26,800 Speaker 3: charging into the room, my brain registers the presence of 842 00:50:27,200 --> 00:50:31,879 Speaker 3: the bear. This puts my body into a particular physiological state, 843 00:50:31,960 --> 00:50:36,680 Speaker 3: and adrenaline shoots up, cortisol starts racing around. And 844 00:50:36,719 --> 00:50:42,400 Speaker 3: then my brain perceives this change happening in the body 845 00:50:42,520 --> 00:50:45,800 Speaker 3: in the context of a bear being there, and that 846 00:50:46,160 --> 00:50:49,680 Speaker 3: is the emotion of fear. So fear in this case 847 00:50:49,840 --> 00:50:52,160 Speaker 3: follows, or is at least part of, the change in 848 00:50:52,200 --> 00:50:54,839 Speaker 3: the body. It's not that fear then causes the change 849 00:50:54,880 --> 00:50:57,160 Speaker 3: in the body. The experience of fear is the perception 850 00:50:57,239 --> 00:51:01,000 Speaker 3: of what's happening in the body in this context of 851 00:51:01,400 --> 00:51:05,600 Speaker 3: something dangerous happening. So that's the way I think 852 00:51:05,640 --> 00:51:08,600 Speaker 3: of that. I've come to think about emotions as grounded 853 00:51:08,640 --> 00:51:13,239 Speaker 3: in this imperative for regulation. And you know, just 854 00:51:13,320 --> 00:51:17,600 Speaker 3: as with the perception census, there will be variation, but 855 00:51:18,320 --> 00:51:20,480 Speaker 3: there will also be a lot of similarity. There's a 856 00:51:20,480 --> 00:51:24,239 Speaker 3: lot of similarity in the way we see things.
So 857 00:51:24,920 --> 00:51:26,799 Speaker 3: you know, I tend to always land in this 858 00:51:26,840 --> 00:51:30,919 Speaker 3: sort of happy or unhappy medium, where there may be 859 00:51:30,960 --> 00:51:34,880 Speaker 3: some amount of biological conservation going on. There are sort of 860 00:51:34,880 --> 00:51:37,920 Speaker 3: good reasons for that, but there might also be more 861 00:51:38,000 --> 00:51:40,480 Speaker 3: variation than we might think. 862 00:51:41,320 --> 00:51:43,400 Speaker 1: Given the way that you are thinking about consciousness, what 863 00:51:43,440 --> 00:51:45,719 Speaker 1: do you think about other animals having consciousness? 864 00:51:45,760 --> 00:51:49,640 Speaker 2: And what's your take on AI becoming conscious? 865 00:51:48,920 --> 00:51:53,319 Speaker 3: Two super important and increasingly timely questions. I mean, there's 866 00:51:53,360 --> 00:51:57,799 Speaker 3: so much excitement, hype, and some amount of anxiety and 867 00:51:57,840 --> 00:51:59,840 Speaker 3: fear about AI, and 868 00:52:01,480 --> 00:52:02,280 Speaker 4: things are changing. 869 00:52:02,840 --> 00:52:05,200 Speaker 3: Animals, of course, have been around for a very long time, 870 00:52:06,920 --> 00:52:12,680 Speaker 3: and we've had over history varying views about their status 871 00:52:12,760 --> 00:52:16,200 Speaker 3: as conscious creatures or not, and I think they pose 872 00:52:16,239 --> 00:52:22,239 Speaker 3: us very different challenges. So we humans, we tend to 873 00:52:22,280 --> 00:52:25,399 Speaker 3: see the world through the lens of being human. We're 874 00:52:25,480 --> 00:52:31,160 Speaker 3: very anthropocentric, and not only that, we're very anthropomorphic. We 875 00:52:31,239 --> 00:52:35,480 Speaker 3: tend to project human like things into other systems on 876 00:52:35,520 --> 00:52:39,200 Speaker 3: the basis of what might be superficial similarities. So this 877 00:52:39,320 --> 00:52:42,560 Speaker 3: is to say that when it comes to other animals, 878 00:52:42,600 --> 00:52:48,000 Speaker 3: if they're different from us, we tend to withhold from 879 00:52:48,040 --> 00:52:51,840 Speaker 3: them things that we think are distinctively human or matter 880 00:52:51,920 --> 00:52:56,680 Speaker 3: to our humanity, like intelligence and consciousness. And I think 881 00:52:56,719 --> 00:52:59,239 Speaker 3: it's this combination of biases that can get us into 882 00:52:59,280 --> 00:53:02,600 Speaker 3: trouble here. Because we see things through a human lens, 883 00:53:02,680 --> 00:53:06,680 Speaker 3: we've tended over history to associate something like consciousness with 884 00:53:06,880 --> 00:53:10,400 Speaker 3: other things that we think of as distinctively human, like 885 00:53:10,600 --> 00:53:16,000 Speaker 3: intelligence and language. So if we do that and we 886 00:53:16,040 --> 00:53:20,240 Speaker 3: look at non human animals, we tend to reserve consciousness 887 00:53:20,239 --> 00:53:24,640 Speaker 3: for those animals that seem the smartest. And in fact, 888 00:53:25,040 --> 00:53:27,000 Speaker 3: you know, we've done worse than that. It wasn't that 889 00:53:27,080 --> 00:53:30,839 Speaker 3: long ago that people didn't give babies anesthesia. I mean, 890 00:53:30,880 --> 00:53:34,320 Speaker 3: this seems crazy now, but it was not common practice 891 00:53:34,400 --> 00:53:38,799 Speaker 3: until a few decades ago. There was this sort of assumption
892 00:53:38,840 --> 00:53:41,400 Speaker 3: that babies didn't really feel pain, and I think that just 893 00:53:41,440 --> 00:53:48,880 Speaker 3: shows how deeply assumptions like this can affect our practice. 894 00:53:49,200 --> 00:53:53,120 Speaker 3: And we have exactly the opposite situation now with artificial intelligence. 895 00:53:53,680 --> 00:53:58,560 Speaker 3: So AI systems like ChatGPT or Claude can speak 896 00:53:58,600 --> 00:54:01,399 Speaker 3: to us fluently. Whether they count as conversations, I'm much 897 00:54:01,480 --> 00:54:03,640 Speaker 3: less sure. I mean, Sherry Turkle has talked about this 898 00:54:03,680 --> 00:54:05,920 Speaker 3: beautifully and said, when we converse with a machine, we 899 00:54:06,000 --> 00:54:09,400 Speaker 3: unthinkingly simplify what we mean by a conversation. It's a 900 00:54:09,440 --> 00:54:13,960 Speaker 3: different form of human machine interaction. But because words are 901 00:54:14,000 --> 00:54:17,560 Speaker 3: being exchanged, and some of the things that language models 902 00:54:17,600 --> 00:54:22,359 Speaker 3: can say really are quite surprisingly articulate and informative, 903 00:54:23,080 --> 00:54:29,120 Speaker 3: we tend to project qualities like consciousness into these machines, 904 00:54:29,320 --> 00:54:32,160 Speaker 3: because if it was a human being talking to us 905 00:54:32,200 --> 00:54:35,160 Speaker 3: like that, well, that human being would clearly be conscious. 906 00:54:35,480 --> 00:54:36,120 Speaker 4: But it's not. 907 00:54:36,400 --> 00:54:40,360 Speaker 3: It's something very, very, very different. So I think the 908 00:54:40,360 --> 00:54:42,680 Speaker 3: first thing we have to do is recognize how much 909 00:54:43,880 --> 00:54:49,399 Speaker 3: our intuitions about the circle of consciousness, how far these 910 00:54:49,440 --> 00:54:52,480 Speaker 3: intuitions are shaped by our biases, and that we can't 911 00:54:52,520 --> 00:54:55,319 Speaker 3: just crawl out from under them. We can't get away 912 00:54:55,360 --> 00:54:59,000 Speaker 3: from them. These biases might be what we might call 913 00:54:59,080 --> 00:55:02,640 Speaker 3: cognitively impenetrable, like some visual illusions. You know, there are 914 00:55:02,640 --> 00:55:06,200 Speaker 3: some illusions that, even when you know two lines are 915 00:55:06,440 --> 00:55:09,440 Speaker 3: the same length, they will always look different. This is 916 00:55:09,480 --> 00:55:12,960 Speaker 3: the Müller-Lyer illusion. So even if we know we're biased 917 00:55:13,560 --> 00:55:17,360 Speaker 3: in these anthropocentric ways, we will still be unable to 918 00:55:17,440 --> 00:55:21,920 Speaker 3: resist these intuitions. So we need to just surface that 919 00:55:22,840 --> 00:55:26,600 Speaker 3: and then ask the question, so what's actually most likely 920 00:55:26,640 --> 00:55:32,240 Speaker 3: the case? And here I think there's extremely good reason 921 00:55:32,320 --> 00:55:37,600 Speaker 3: to believe that consciousness is pretty widespread in non human animals. 922 00:55:38,120 --> 00:55:41,200 Speaker 3: If you look through all mammals, whether it's a tree 923 00:55:41,239 --> 00:55:46,400 Speaker 3: shrew or an orangutan, they have the same basic neural architecture 924 00:55:46,440 --> 00:55:49,120 Speaker 3: that we know is important in human beings for consciousness.
925 00:55:49,360 --> 00:55:52,760 Speaker 3: I think things get really tricky when we look at birds, 926 00:55:52,800 --> 00:55:55,680 Speaker 3: when we look at fish, when we look at insects. 927 00:55:55,800 --> 00:55:58,440 Speaker 3: And here I just think we have to admit that 928 00:55:58,440 --> 00:56:00,680 Speaker 3: we don't know one way or the other. 929 00:56:01,520 --> 00:56:05,400 Speaker 3: But we just need to keep updating our credence in 930 00:56:05,520 --> 00:56:08,239 Speaker 3: consciousness in these things. And I think the key thing 931 00:56:08,280 --> 00:56:10,040 Speaker 3: is we also have to ask, well, what kind of 932 00:56:10,040 --> 00:56:13,080 Speaker 3: consciousness matters. It may well be the case that not 933 00:56:13,200 --> 00:56:20,600 Speaker 3: many non human animals have fully developed reflexive, reflective experiences 934 00:56:20,640 --> 00:56:23,239 Speaker 3: of being particular individuals. I mean, we don't know yet, 935 00:56:23,239 --> 00:56:28,080 Speaker 3: but it seems unlikely. But ethically what matters is whether 936 00:56:28,120 --> 00:56:32,279 Speaker 3: they have the capacity to suffer, to feel pain. This 937 00:56:32,320 --> 00:56:34,840 Speaker 3: is a very utilitarian perspective, but I think it's a 938 00:56:34,880 --> 00:56:41,040 Speaker 3: sensible one. So I think we probably underestimate the extent 939 00:56:41,080 --> 00:56:45,360 Speaker 3: of animal consciousness, and I think we overestimate the likelihood 940 00:56:45,560 --> 00:56:49,799 Speaker 3: of AI being conscious. And there are many reasons why 941 00:56:49,800 --> 00:56:55,480 Speaker 3: I'm very skeptical of this, but fundamentally, my main basis 942 00:56:55,480 --> 00:56:59,400 Speaker 3: for the skepticism is I think we've just overused the 943 00:56:59,520 --> 00:57:03,520 Speaker 3: metaphor of the brain as a computer. It's a beautiful, brilliant, 944 00:57:03,760 --> 00:57:07,920 Speaker 3: powerful metaphor, but it's a metaphor. We've always used the 945 00:57:07,960 --> 00:57:10,600 Speaker 3: dominant technology of the day as a metaphor for the brain, 946 00:57:11,920 --> 00:57:14,960 Speaker 3: and we always get into trouble, or we usually get 947 00:57:15,000 --> 00:57:17,840 Speaker 3: into trouble, when we start confusing a metaphor with the 948 00:57:17,840 --> 00:57:18,520 Speaker 3: thing itself. 949 00:57:19,200 --> 00:57:22,360 Speaker 1: What do you see as the biggest unanswered question in 950 00:57:22,520 --> 00:57:26,160 Speaker 1: consciousness research, and what excites you the most about where 951 00:57:26,160 --> 00:57:27,000 Speaker 1: the field is headed? 952 00:57:27,520 --> 00:57:31,720 Speaker 3: Oh wow, I mean the biggest unanswered question is 953 00:57:31,760 --> 00:57:36,800 Speaker 3: still the old question. How does it happen? We still 954 00:57:37,280 --> 00:57:39,960 Speaker 3: don't really know, I don't think. I mean, there's progress, 955 00:57:40,000 --> 00:57:42,520 Speaker 3: and to be honest, I don't know, I'd be 956 00:57:42,560 --> 00:57:44,600 Speaker 3: interested if you feel the same way, David. 957 00:57:44,600 --> 00:57:47,240 Speaker 3: We've been doing this more or less for the same 958 00:57:47,280 --> 00:57:52,720 Speaker 3: amount of time, a long time now, decades. Some days I 959 00:57:52,800 --> 00:57:55,960 Speaker 3: feel like, ah, it's still a complete mystery. You know, 960 00:57:56,000 --> 00:58:01,680 Speaker 3: we have neurons, chemicals, electrical signals.
Why do I experience anything? 961 00:58:01,720 --> 00:58:04,200 Speaker 3: Why is there experience in the world, in the universe, 962 00:58:04,240 --> 00:58:07,120 Speaker 3: at all? On those days, you know, we'll have a cup 963 00:58:07,120 --> 00:58:09,360 Speaker 3: of coffee and get on with things. But then on 964 00:58:09,400 --> 00:58:11,960 Speaker 3: other days I think back to where we were 965 00:58:12,040 --> 00:58:14,720 Speaker 3: when I started studying, you know, in the 966 00:58:14,880 --> 00:58:18,880 Speaker 3: early nineteen nineties. Consciousness wasn't even on the menu if 967 00:58:18,920 --> 00:58:21,000 Speaker 3: you wanted to look at psychology and neuroscience. 968 00:58:21,040 --> 00:58:21,280 Speaker 2: It was. 969 00:58:21,440 --> 00:58:25,040 Speaker 3: It was ostracized completely. I mean, we met because I 970 00:58:25,120 --> 00:58:27,439 Speaker 3: came to San Diego back in the early 971 00:58:27,480 --> 00:58:29,360 Speaker 3: two thousands, which was at the time one of the 972 00:58:29,360 --> 00:58:34,320 Speaker 3: only places where you could study consciousness legitimately, with the 973 00:58:34,360 --> 00:58:38,000 Speaker 3: approval of a PI, in a lab, and it was still 974 00:58:38,000 --> 00:58:41,320 Speaker 3: a pretty rare thing to do. And so looking back 975 00:58:41,400 --> 00:58:43,480 Speaker 3: through that lens and looking at the theories that we 976 00:58:43,600 --> 00:58:45,919 Speaker 3: now have and the kinds of experiments that have been done, 977 00:58:46,560 --> 00:58:49,840 Speaker 3: and how we've learned more about different kinds of consciousness, 978 00:58:50,760 --> 00:58:54,320 Speaker 3: you know, I then worry less that this big metaphysical, 979 00:58:54,600 --> 00:58:58,000 Speaker 3: existential question is still waiting in the wings. 980 00:58:58,320 --> 00:58:59,960 Speaker 4: And so I think that, 981 00:59:01,800 --> 00:59:03,320 Speaker 3: maybe the way to put it, I haven't thought about 982 00:59:03,320 --> 00:59:05,440 Speaker 3: it in quite this way before, but maybe the biggest 983 00:59:06,080 --> 00:59:10,560 Speaker 3: unanswered question in consciousness research is whether there will turn 984 00:59:10,600 --> 00:59:14,760 Speaker 3: out to be a big unanswered question in consciousness research 985 00:59:14,880 --> 00:59:19,439 Speaker 3: or not. In other words, as we just understand more 986 00:59:19,480 --> 00:59:24,640 Speaker 3: and more about different kinds of experience, what happens in anesthesia, 987 00:59:24,800 --> 00:59:27,200 Speaker 3: maybe the question of consciousness will turn out to be 988 00:59:27,240 --> 00:59:30,400 Speaker 3: a little bit like the question of life. 989 00:59:30,480 --> 00:59:30,640 Speaker 1: You know. 990 00:59:30,640 --> 00:59:34,280 Speaker 3: One hundred and fifty years ago, many people thought that 991 00:59:34,360 --> 00:59:37,560 Speaker 3: life was something beyond the reach of science, that there 992 00:59:37,600 --> 00:59:40,800 Speaker 3: had to be something supernatural, some élan vital, some spark 993 00:59:40,840 --> 00:59:43,040 Speaker 3: of life, to explain the difference between the living and 994 00:59:43,040 --> 00:59:46,240 Speaker 3: the non living. And of course we don't think that anymore. 995 00:59:46,280 --> 00:59:49,920 Speaker 3: I mean, life is still not a completely written book. 996 00:59:49,960 --> 00:59:54,760 Speaker 3: We don't understand everything, but there's no conceptual mystery.
That 997 00:59:54,840 --> 00:59:58,800 Speaker 3: life is part of nature, and even its origin seems 998 00:59:59,040 --> 01:00:03,440 Speaker 3: increasingly within reach to explain and understand. And what happened 999 01:00:03,480 --> 01:00:06,560 Speaker 3: wasn't that anybody proved that life didn't exist or 1000 01:00:07,160 --> 01:00:10,400 Speaker 3: found the spark of life in some Eureka moment. No, 1001 01:00:11,280 --> 01:00:14,880 Speaker 3: biologists just started to make progress, explaining this property of 1002 01:00:14,920 --> 01:00:20,440 Speaker 3: living systems, homeostasis, this property, reproduction, this property, metabolism. 1003 01:00:21,200 --> 01:00:24,840 Speaker 3: And little by little we learned about the nature of 1004 01:00:24,920 --> 01:00:29,360 Speaker 3: life, and we stopped worrying that there was 1005 01:00:29,560 --> 01:00:34,480 Speaker 3: something fundamentally inexplicable about it. And progress in science is 1006 01:00:34,480 --> 01:00:37,520 Speaker 3: often like this. You know, sometimes it's not only the 1007 01:00:37,560 --> 01:00:40,840 Speaker 3: answers that change, but it's the questions that change too. 1008 01:00:41,520 --> 01:00:43,360 Speaker 3: So that's my own view. That's what gives 1009 01:00:43,360 --> 01:00:45,800 Speaker 3: me hope. I think we understand more when the questions 1010 01:00:45,840 --> 01:00:47,720 Speaker 3: we ask start to look different. 1011 01:00:49,160 --> 01:00:51,480 Speaker 2: I totally agree with you. And what's interesting, 1012 01:00:51,640 --> 01:00:54,280 Speaker 1: what's been interesting for me, is I also feel the 1013 01:00:54,280 --> 01:00:57,880 Speaker 1: way you do, many mornings where I think, gosh, yeah, 1014 01:00:57,880 --> 01:00:59,640 Speaker 1: when you and I were in San Diego in the 1015 01:00:59,640 --> 01:01:04,400 Speaker 1: early two thousands, the questions that were being asked were mostly 1016 01:01:04,400 --> 01:01:07,480 Speaker 1: the same, but things have changed slowly. And just as 1017 01:01:07,560 --> 01:01:11,720 Speaker 1: one example, at that time, 1018 01:01:11,920 --> 01:01:16,040 Speaker 1: all neuroscientists were sort of snarky about artificial intelligence because 1019 01:01:16,040 --> 01:01:18,280 Speaker 1: they felt like, okay, we've been at this for a 1020 01:01:18,320 --> 01:01:19,240 Speaker 1: long time, it's 1021 01:01:19,120 --> 01:01:20,320 Speaker 2: not really happening. 1022 01:01:20,560 --> 01:01:24,920 Speaker 1: There were artificial neural networks with the same principles that 1023 01:01:25,000 --> 01:01:28,800 Speaker 1: we have now, but it seemed like it was never gonna happen. 1024 01:01:28,840 --> 01:01:31,439 Speaker 1: And now we look at, you know, GPT every day 1025 01:01:31,920 --> 01:01:35,960 Speaker 1: and do experiments on it, and we find, wow, 1026 01:01:36,000 --> 01:01:37,800 Speaker 1: it's actually working in a way. 1027 01:01:37,840 --> 01:01:39,360 Speaker 2: Now, who knows if it's conscious. 1028 01:01:39,440 --> 01:01:42,440 Speaker 1: I think we share the view that it's probably not conscious, 1029 01:01:42,840 --> 01:01:49,160 Speaker 1: but boy is it impressive.
And so because of improvements 1030 01:01:49,200 --> 01:01:53,360 Speaker 1: in our technology, in our ability to measure things in biology, 1031 01:01:53,760 --> 01:01:56,360 Speaker 1: in, you know, just data gathering and ways of 1032 01:01:56,400 --> 01:01:59,920 Speaker 1: doing things bigger and better, we find different views 1033 01:02:00,160 --> 01:02:02,080 Speaker 1: now than we did a quarter century ago. 1034 01:02:02,360 --> 01:02:03,200 Speaker 4: I think that that's right. 1035 01:02:03,240 --> 01:02:06,440 Speaker 3: By the way, I have to mention just one other aspect 1036 01:02:06,440 --> 01:02:08,440 Speaker 3: of the whole AI thing that I think is really 1037 01:02:09,040 --> 01:02:13,760 Speaker 3: useful to think about. So there's this whole debate about 1038 01:02:13,800 --> 01:02:16,680 Speaker 3: whether ChatGPT or language models might be conscious. People 1039 01:02:16,720 --> 01:02:18,680 Speaker 3: ask the question, right, and some people think that they are. 1040 01:02:19,200 --> 01:02:22,760 Speaker 3: But just yesterday I was lucky enough to visit Deep 1041 01:02:22,800 --> 01:02:27,760 Speaker 3: Mind in London, and that's where AlphaFold was developed, 1042 01:02:27,800 --> 01:02:31,240 Speaker 3: which is another AI system that's able to predict protein 1043 01:02:31,960 --> 01:02:35,760 Speaker 3: structure from amino acid sequences. The protein folding 1044 01:02:35,800 --> 01:02:38,760 Speaker 3: problem has been one of the biggest challenges in biology 1045 01:02:38,760 --> 01:02:42,280 Speaker 3: for a long time. AlphaFold basically sorts that out. Now, 1046 01:02:42,360 --> 01:02:46,000 Speaker 3: isn't it interesting that no one thinks that AlphaFold 1047 01:02:46,040 --> 01:02:49,680 Speaker 3: is conscious? I've not heard anybody suggest to me that 1048 01:02:49,760 --> 01:02:51,760 Speaker 3: AlphaFold might have experience. 1049 01:02:52,280 --> 01:02:52,560 Speaker 2: Yet. 1050 01:02:53,920 --> 01:02:57,480 Speaker 3: You know, there are some differences, but they're very, very similar, 1051 01:02:57,480 --> 01:02:59,280 Speaker 3: the architectures. I mean, they're made out of the same stuff. 1052 01:02:59,440 --> 01:03:04,200 Speaker 3: They're both computer algorithms, they're neural network based, with 1053 01:03:04,280 --> 01:03:07,360 Speaker 3: some other stuff; they even have transformer architectures. They're 1054 01:03:07,400 --> 01:03:10,800 Speaker 3: really not that different. Yet our intuitions about the two 1055 01:03:10,800 --> 01:03:12,840 Speaker 3: are so different. I think that really, to me, highlights 1056 01:03:12,880 --> 01:03:17,920 Speaker 3: how much of what we think is driven by our 1057 01:03:17,960 --> 01:03:21,640 Speaker 3: psychological biases. I think the other thing is that, yeah, 1058 01:03:21,640 --> 01:03:24,040 Speaker 3: there can be this interesting nonlinearity here. 1059 01:03:24,120 --> 01:03:24,280 Speaker 4: Right. 1060 01:03:24,720 --> 01:03:29,080 Speaker 3: AI seemed to flatline for a long time, and there's 1061 01:03:29,080 --> 01:03:34,400 Speaker 3: this sudden rush of discovery and invention and increase in competence.
1062 01:03:35,280 --> 01:03:37,480 Speaker 3: Maybe this is going to happen in the neuroscience of 1063 01:03:37,480 --> 01:03:42,200 Speaker 3: consciousness too. And there are some super exciting developments over 1064 01:03:42,280 --> 01:03:44,840 Speaker 3: in your part of the world, at Stanford. There's so 1065 01:03:44,920 --> 01:03:48,720 Speaker 3: much exciting work going on in optogenetics and in synthetic biology. 1066 01:03:49,240 --> 01:03:54,040 Speaker 3: People are developing brain organoids, these collections of brain cells 1067 01:03:54,080 --> 01:03:58,080 Speaker 3: in dishes that are derived from human embryonic stem cells, 1068 01:04:00,240 --> 01:04:03,440 Speaker 3: things that really, as I think you said, give 1069 01:04:03,520 --> 01:04:08,080 Speaker 3: us different perspectives, different views, different tools, and science often 1070 01:04:08,120 --> 01:04:10,960 Speaker 3: needs that. Yeah, you can have ideas, you can have theories, 1071 01:04:11,520 --> 01:04:15,000 Speaker 3: but a lot of advance historically in science is when 1072 01:04:15,480 --> 01:04:20,480 Speaker 3: people acquire or invent new tools, and I think we're 1073 01:04:20,520 --> 01:04:23,240 Speaker 3: seeing a lot of that in neuroscience now. So I 1074 01:04:23,240 --> 01:04:25,080 Speaker 3: am actually pretty optimistic for the future. 1075 01:04:29,600 --> 01:04:32,480 Speaker 1: That was my interview with Anil Seth, professor of cognitive and 1076 01:04:32,520 --> 01:04:36,680 Speaker 1: Computational neuroscience at Sussex and the author of Being You. 1077 01:04:37,600 --> 01:04:42,040 Speaker 1: We explored here how our brains create our reality, from 1078 01:04:42,400 --> 01:04:47,000 Speaker 1: controlled hallucinations to the rich diversity of individual perceptions. 1079 01:04:47,120 --> 01:04:47,680 Speaker 2: What I want to 1080 01:04:47,720 --> 01:04:49,520 Speaker 1: leave you with is an idea that you've heard me 1081 01:04:49,560 --> 01:04:52,640 Speaker 1: talk about many times before on this podcast, which is 1082 01:04:52,680 --> 01:04:57,000 Speaker 1: that consciousness is not just about directly experiencing the world, 1083 01:04:57,000 --> 01:05:02,240 Speaker 1: but instead it's about actively constructing it. While consciousness remains 1084 01:05:02,520 --> 01:05:06,120 Speaker 1: one of the central mysteries in neuroscience, today's conversation, I 1085 01:05:06,160 --> 01:05:08,960 Speaker 1: hope, gives us a lens through which to view it 1086 01:05:09,040 --> 01:05:13,360 Speaker 1: not as something mystical or otherworldly, but as a grounded 1087 01:05:13,440 --> 01:05:18,760 Speaker 1: biological phenomenon, rooted in the brain's continuous dance of sensory 1088 01:05:18,840 --> 01:05:23,480 Speaker 1: input and its own predictions. Now, thinking about our perception 1089 01:05:23,640 --> 01:05:26,800 Speaker 1: of the world not as a direct reflection of reality, 1090 01:05:26,800 --> 01:05:30,200 Speaker 1: but as a construction of the brain, the inner world meeting 1091 01:05:30,240 --> 01:05:33,919 Speaker 1: the outer world, this is really the single reasonable way 1092 01:05:33,960 --> 01:05:36,320 Speaker 1: to understand it, and in a sense, it's a little 1093 01:05:36,360 --> 01:05:40,960 Speaker 1: strange that this is still rare in textbooks or in 1094 01:05:41,120 --> 01:05:44,400 Speaker 1: common knowledge.
And note that this process of trying to 1095 01:05:44,640 --> 01:05:47,440 Speaker 1: predict signals applies not just to our perception of the 1096 01:05:47,480 --> 01:05:51,640 Speaker 1: outside world, but to the swirling world inside our bodies 1097 01:05:51,680 --> 01:05:56,240 Speaker 1: as well, which sheds light on how we experience emotions 1098 01:05:56,280 --> 01:06:00,560 Speaker 1: and maintain homeostasis. In this way, the self is not 1099 01:06:00,600 --> 01:06:04,640 Speaker 1: a fixed entity, but an ever changing construct. And one 1100 01:06:04,680 --> 01:06:07,800 Speaker 1: of the other themes that emerged today is that each 1101 01:06:07,840 --> 01:06:11,600 Speaker 1: of us carries around a slightly different version of reality, 1102 01:06:12,080 --> 01:06:16,040 Speaker 1: shaped by our biology and our experiences. This is a 1103 01:06:16,080 --> 01:06:21,080 Speaker 1: reminder of how subjective and varied human experience can be, 1104 01:06:21,720 --> 01:06:24,600 Speaker 1: and how much we still have to learn about the 1105 01:06:24,680 --> 01:06:30,000 Speaker 1: minds of others. That includes other humans and animals, and 1106 01:06:30,120 --> 01:06:35,080 Speaker 1: possibly, at some point, AI. Are we overestimating the sentience 1107 01:06:35,120 --> 01:06:40,280 Speaker 1: of machines or underestimating that of animals? So, in closing, 1108 01:06:40,400 --> 01:06:44,840 Speaker 1: as Anil mentioned, just as we once demystified life itself, 1109 01:06:44,920 --> 01:06:48,360 Speaker 1: step by step, through the patient work of science, we 1110 01:06:48,440 --> 01:06:53,280 Speaker 1: may one day come to understand consciousness in a similar manner. 1111 01:06:53,760 --> 01:06:56,680 Speaker 1: The key, as it is so often in science, is 1112 01:06:56,720 --> 01:07:00,560 Speaker 1: not just about answering the big question, but continually 1113 01:07:00,640 --> 01:07:07,680 Speaker 1: asking smaller, better ones. Go to Eagleman dot com slash 1114 01:07:07,720 --> 01:07:10,800 Speaker 1: podcast for more information and to find further reading. 1115 01:07:11,560 --> 01:07:12,160 Speaker 2: Send me an 1116 01:07:12,040 --> 01:07:15,040 Speaker 5: email at podcasts at eagleman dot com with questions or 1117 01:07:15,080 --> 01:07:19,200 Speaker 5: discussion, and check out and subscribe to Inner Cosmos on YouTube 1118 01:07:19,240 --> 01:07:22,600 Speaker 5: for videos of each episode and to leave comments. 1119 01:07:24,280 --> 01:07:25,040 Speaker 2: Until next time. 1120 01:07:25,160 --> 01:07:29,840 Speaker 1: I'm David Eagleman and this is Inner Cosmos.