1 00:00:00,120 --> 00:00:04,400 Speaker 1: The Craig Ferguson Pants on Fire Tour is on sale now. 2 00:00:04,559 --> 00:00:07,880 Speaker 1: It's a new show, it's new material, but I'm afraid 3 00:00:07,920 --> 00:00:12,440 Speaker 1: it's still only me, Craig Ferguson on my own, standing 4 00:00:12,480 --> 00:00:15,760 Speaker 1: on a stage telling comedy words. Come and see me, 5 00:00:16,040 --> 00:00:18,520 Speaker 1: buy tickets, bring your loved ones, or don't come and 6 00:00:18,560 --> 00:00:21,400 Speaker 1: see me. Don't buy tickets and don't bring your loved ones. 7 00:00:21,480 --> 00:00:23,880 Speaker 1: I'm not your dad. You come or don't come, but 8 00:00:24,120 --> 00:00:26,120 Speaker 1: you should at least know what's happening, and it is. 9 00:00:26,440 --> 00:00:30,120 Speaker 1: The tour kicks off late September and goes through the 10 00:00:30,320 --> 00:00:33,960 Speaker 1: end of the year and beyond. Tickets are available at 11 00:00:34,040 --> 00:00:37,440 Speaker 1: the Craig Ferguson Show dot com slash tour. They're available 12 00:00:37,440 --> 00:00:40,720 Speaker 1: at the Craig Ferguson Show dot com slash tour, or 13 00:00:40,760 --> 00:00:45,080 Speaker 1: at your local outlet in your region. My name is 14 00:00:45,080 --> 00:00:49,879 Speaker 1: Craig Ferguson. The name of this podcast is Joy. I 15 00:00:50,040 --> 00:00:53,600 Speaker 1: talk to interesting people about what brings them happiness. 16 00:00:56,400 --> 00:01:00,680 Speaker 1: On the podcast today, my guest is David Eagleman. 17 00:01:01,200 --> 00:01:06,000 Speaker 2: Professor David Eagleman, to be precise. Professor Eagleman, or David 18 00:01:06,000 --> 00:01:09,240 Speaker 2: as I call him, is a neuroscientist. He knows a 19 00:01:09,240 --> 00:01:11,280 Speaker 2: lot about the human brain, but as it turns out, 20 00:01:11,280 --> 00:01:13,679 Speaker 2: he knows a lot about artificial brains too. 21 00:01:13,880 --> 00:01:15,240 Speaker 3: He's just a very brainy guy. 22 00:01:15,440 --> 00:01:17,280 Speaker 4: So I'm going to sound a bit more stupid than 23 00:01:17,360 --> 00:01:29,279 Speaker 4: usual. Join me. David, let me just say this before 24 00:01:29,280 --> 00:01:29,800 Speaker 4: we start. 25 00:01:30,120 --> 00:01:32,959 Speaker 1: Do I call you David? Do I call you mister Eagleman? 26 00:01:33,080 --> 00:01:35,440 Speaker 1: Do I call you doctor Eagleman? Or do I call 27 00:01:35,480 --> 00:01:36,839 Speaker 1: you professor Eagleman? 28 00:01:37,520 --> 00:01:41,480 Speaker 3: Or sir? Please call me David. All right? Is it 29 00:01:42,000 --> 00:01:44,600 Speaker 3: doctor, by the way? You're a doctor? Yeah? Doctor Eagleman 30 00:01:44,680 --> 00:01:48,320 Speaker 3: would be what, yes, but my mother would call me, yes. 31 00:01:50,320 --> 00:01:53,240 Speaker 1: Now I gotta apologize, because I'm going to, not so 32 00:01:53,360 --> 00:01:55,320 Speaker 1: much to you personally, because I think you'll probably be 33 00:01:55,400 --> 00:01:59,360 Speaker 1: able to handle it, but to people who already know 34 00:01:59,600 --> 00:02:03,320 Speaker 1: and and understand what you do, I'm going to come 35 00:02:03,360 --> 00:02:07,520 Speaker 1: across as someone who doesn't know anything about what you do. Now, 36 00:02:07,840 --> 00:02:11,440 Speaker 1: let's just say that I actually do know what you're doing. 37 00:02:11,560 --> 00:02:16,440 Speaker 1: I know everything about your studies and and what you 38 00:02:16,560 --> 00:02:20,239 Speaker 1: kind of do.
And I'm not an idiot, but everybody 39 00:02:20,280 --> 00:02:26,079 Speaker 1: knows I kind of am, a little bit. Because neuroscience is... 40 00:02:26,320 --> 00:02:30,720 Speaker 1: I barely understand the dictionary definition of neuroscience. Correct me 41 00:02:30,720 --> 00:02:34,720 Speaker 1: if I'm wrong. I think it is the study of 42 00:02:34,800 --> 00:02:38,160 Speaker 1: how the brain works due to the physicality of the brain. 43 00:02:38,240 --> 00:02:42,639 Speaker 3: Is that... Yeah, you don't even need the second half. 44 00:02:42,800 --> 00:02:44,320 Speaker 3: Just trying to figure out how the brain works, trying 45 00:02:44,320 --> 00:02:46,440 Speaker 3: to figure out what's going on with the brain. And 46 00:02:46,480 --> 00:02:50,800 Speaker 3: it can be anything from understanding how vision works, or hearing, 47 00:02:51,160 --> 00:02:55,399 Speaker 3: to understanding decision making, to understanding emotions, to understanding why 48 00:02:55,400 --> 00:02:59,000 Speaker 3: we have consciousness or how we perceive time. Any of 49 00:02:59,040 --> 00:03:02,440 Speaker 3: that falls under the umbrella of neuroscience. 50 00:03:02,840 --> 00:03:04,880 Speaker 1: But it's interesting because it seems to me to be 51 00:03:05,000 --> 00:03:08,720 Speaker 1: something that... it's an interesting science because it seems to 52 00:03:08,800 --> 00:03:13,160 Speaker 1: kind of wander into theology and metaphysics, and because everything 53 00:03:13,240 --> 00:03:16,880 Speaker 1: is perception. Even the study of neuroscience is perception. So 54 00:03:18,160 --> 00:03:21,760 Speaker 1: do you feel like you're kind of examining yourself 55 00:03:21,800 --> 00:03:22,760 Speaker 1: from the inside? 56 00:03:23,240 --> 00:03:25,919 Speaker 3: Yeah, that's right. Well, I would say at the edges, 57 00:03:26,040 --> 00:03:30,960 Speaker 3: neuroscience scratches lots of things, certainly philosophy, maybe theology, but 58 00:03:31,160 --> 00:03:36,640 Speaker 3: the, you know, the way you can do things is 59 00:03:36,720 --> 00:03:39,720 Speaker 3: set things up objectively in the real world where you 60 00:03:39,760 --> 00:03:44,120 Speaker 3: can verify. Look, I have, you know, three circles that 61 00:03:44,160 --> 00:03:47,560 Speaker 3: are the color red, you know, projecting this wavelength on 62 00:03:47,600 --> 00:03:50,480 Speaker 3: the screen, and you know, what are people seeing. That's 63 00:03:50,840 --> 00:03:52,920 Speaker 3: a silly example, but the point is we can set things 64 00:03:53,000 --> 00:03:58,680 Speaker 3: up in the world and understand what people are individually experiencing. 65 00:03:59,240 --> 00:04:00,840 Speaker 3: I'll give you an example. One of the things I 66 00:04:00,840 --> 00:04:03,640 Speaker 3: study is called synesthesia, and that's where people have a 67 00:04:04,200 --> 00:04:07,400 Speaker 3: blending of the senses. So they might look at letters 68 00:04:07,480 --> 00:04:10,160 Speaker 3: on the page and it triggers a color experience for them, 69 00:04:10,440 --> 00:04:13,800 Speaker 3: so they'll see J is purple, and Y is blue, 70 00:04:13,880 --> 00:04:18,400 Speaker 3: and M is red, and so on, and you know, 71 00:04:18,480 --> 00:04:21,080 Speaker 3: it's just an internal experience that they're having. We can 72 00:04:21,240 --> 00:04:23,040 Speaker 3: verify what's going on in the real world. We can 73 00:04:23,040 --> 00:04:25,680 Speaker 3: compare people to each other.
About three percent of the 74 00:04:25,680 --> 00:04:29,240 Speaker 3: population has synesthesia. But there are lots of things like 75 00:04:29,279 --> 00:04:33,400 Speaker 3: this that we do where we study across people to 76 00:04:33,520 --> 00:04:38,240 Speaker 3: understand how perception differs. You know, there are other things, 77 00:04:38,320 --> 00:04:43,480 Speaker 3: like if I ask you, Craig, to imagine, you know, 78 00:04:43,560 --> 00:04:46,960 Speaker 3: an ant crawling on a red and white tablecloth towards 79 00:04:47,440 --> 00:04:50,760 Speaker 3: a jar of purple jelly. How do you perceive that in 80 00:04:50,760 --> 00:04:52,760 Speaker 3: your head? Is it clear like a movie, or is 81 00:04:52,800 --> 00:04:55,680 Speaker 3: it you don't really see anything at all, it's just conceptual? 82 00:04:56,080 --> 00:04:59,080 Speaker 1: Does it involve you then? And like, for example, you 83 00:04:59,160 --> 00:05:02,039 Speaker 1: just gave me the the ant on the... If you say, 84 00:05:02,120 --> 00:05:05,240 Speaker 1: is it clear like a movie, it's such a weirdly... uh, 85 00:05:05,720 --> 00:05:08,400 Speaker 1: it's semantics, because I mean, do I imagine 86 00:05:08,440 --> 00:05:10,800 Speaker 1: a clear iPhone film? 87 00:05:11,120 --> 00:05:15,320 Speaker 3: Is it shot on, is it, sixteen millimeter or print? 88 00:05:15,480 --> 00:05:17,000 Speaker 3: Is it? Do you know what I mean? Is it 89 00:05:17,040 --> 00:05:19,159 Speaker 3: black and white? Right? So, if you had to have... 90 00:05:19,200 --> 00:05:22,000 Speaker 3: if there was a spectrum from no picture at all 91 00:05:22,080 --> 00:05:24,840 Speaker 3: in your head to a movie at, you know, at 92 00:05:24,880 --> 00:05:27,120 Speaker 3: the other end of the extreme, where would, where would 93 00:05:27,160 --> 00:05:30,000 Speaker 3: you be with the ant on the paper? Yeah, the... 94 00:05:30,040 --> 00:05:30,880 Speaker 3: on, on the tablecloth. 95 00:05:31,120 --> 00:05:33,680 Speaker 1: I think I could get myself right up to IMAX 96 00:05:33,920 --> 00:05:35,200 Speaker 1: with the ant on the paper. 97 00:05:36,279 --> 00:05:39,120 Speaker 3: Okay, that's, that's amazing. So it turns out that there 98 00:05:39,200 --> 00:05:42,640 Speaker 3: is a spectrum across the population. Everyone's spread pretty evenly 99 00:05:43,120 --> 00:05:48,039 Speaker 3: across this in terms of how, how visually you imagine 100 00:05:48,240 --> 00:05:50,920 Speaker 3: things on the inside. And again, this is something that 101 00:05:50,920 --> 00:05:53,400 Speaker 3: we can test across the population, and we can also 102 00:05:53,520 --> 00:05:57,920 Speaker 3: test it objectively using brain imaging to see how much 103 00:05:57,960 --> 00:06:00,400 Speaker 3: activity there is in the visual part of the brain. 104 00:06:00,880 --> 00:06:04,880 Speaker 3: And we see that, across anything we measure, people exist 105 00:06:05,000 --> 00:06:08,560 Speaker 3: on a spectrum. For example, if I ask you how 106 00:06:09,120 --> 00:06:12,440 Speaker 3: loud or intrusive is your internal voice? You know, everyone 107 00:06:12,480 --> 00:06:15,640 Speaker 3: has a dialogue with themselves, right right? Are you aware 108 00:06:15,680 --> 00:06:17,920 Speaker 3: of your internal voice all the time or hardly ever? 109 00:06:18,600 --> 00:06:22,400 Speaker 1: I would say that that is very much situationally dependent. 110 00:06:22,520 --> 00:06:26,320 Speaker 1: When I'm, you know, when I'm calm, not at all.
111 00:06:26,480 --> 00:06:31,200 Speaker 1: When I'm angry, probably not at all. And when I'm, 112 00:06:31,320 --> 00:06:33,919 Speaker 1: you know, when I'm trying to make a difficult or 113 00:06:34,040 --> 00:06:39,960 Speaker 1: political decision, a lot, you know. So, or, or am 114 00:06:39,960 --> 00:06:41,480 Speaker 1: I thinking about the wrong thing? 115 00:06:41,680 --> 00:06:44,359 Speaker 3: No? No, that's, that's right. That's a good observation, that 116 00:06:44,480 --> 00:06:47,640 Speaker 3: it differs moment to moment. But across the population, we 117 00:06:47,720 --> 00:06:50,240 Speaker 3: also find, you know, some people are really overwhelmed by 118 00:06:50,279 --> 00:06:55,880 Speaker 3: their internal radio. Other people have what's called anendophasia, 119 00:06:55,920 --> 00:06:59,000 Speaker 3: which means no internal voice. They just know it's totally 120 00:06:59,040 --> 00:07:03,400 Speaker 3: silent in there. Essentially, across anything we measure, we find 121 00:07:03,400 --> 00:07:05,799 Speaker 3: that people have very different results. Or how about your memory? 122 00:07:05,839 --> 00:07:08,600 Speaker 3: Are you, do you have a great autobiographical memory? Can 123 00:07:08,640 --> 00:07:11,320 Speaker 3: you remember exactly what you were doing at this time 124 00:07:11,400 --> 00:07:13,000 Speaker 3: last year or five years ago? 125 00:07:14,560 --> 00:07:17,720 Speaker 1: And I think it's deteriorating as well. Actually, as I age, 126 00:07:17,760 --> 00:07:20,720 Speaker 1: I really am beginning to think there may be something 127 00:07:20,760 --> 00:07:21,040 Speaker 1: in that. 128 00:07:21,760 --> 00:07:24,280 Speaker 3: Yeah, yeah, you just said that three minutes ago. No, 129 00:07:24,280 --> 00:07:28,240 Speaker 3: I'm kidding. Yeah, that's the thing. But you know, 130 00:07:28,280 --> 00:07:31,080 Speaker 3: some people, like Marilu Henner the actress, you 131 00:07:31,080 --> 00:07:36,200 Speaker 3: know, have this, you know, untaxable autobiographical memory, 132 00:07:36,600 --> 00:07:37,800 Speaker 3: and everywhere in between. 133 00:07:38,200 --> 00:07:40,280 Speaker 1: I talked to Marilu Henner recently and asked her 134 00:07:40,280 --> 00:07:43,480 Speaker 1: about that, and she... I tested her, because I had 135 00:07:43,520 --> 00:07:46,400 Speaker 1: worked with her years before on a different show. This 136 00:07:46,520 --> 00:07:48,360 Speaker 1: was... and I talked to her on late night, and 137 00:07:48,400 --> 00:07:50,000 Speaker 1: she had been on the drink, you know why. I 138 00:07:50,000 --> 00:07:51,880 Speaker 1: can't even remember why I was talking to her about it. 139 00:07:51,960 --> 00:07:55,760 Speaker 1: But she really, she really could do that. It's kind 140 00:07:55,800 --> 00:07:59,960 Speaker 1: of, it's like a weird trick. It's very, it's impressive. 141 00:08:00,120 --> 00:08:03,040 Speaker 1: I guess, is it genetic? I guess it's genetics. 142 00:08:02,640 --> 00:08:06,400 Speaker 3: Right, yeah, yeah, and so, and actually so are all 143 00:08:06,440 --> 00:08:08,440 Speaker 3: these things, at least as far as we can tell, 144 00:08:08,520 --> 00:08:11,520 Speaker 3: you know, all this. But what it goes to illustrate 145 00:08:11,520 --> 00:08:13,960 Speaker 3: is that people are very different from one another on 146 00:08:14,040 --> 00:08:16,160 Speaker 3: the inside.
And one of the... this has been one 147 00:08:16,160 --> 00:08:18,680 Speaker 3: of my areas of interest in neuroscience, is figuring out 148 00:08:19,520 --> 00:08:23,160 Speaker 3: why. What are the genes, or what are the experiences, 149 00:08:23,480 --> 00:08:25,120 Speaker 3: or, you know, what is the thing that has led 150 00:08:25,160 --> 00:08:28,880 Speaker 3: to different circuitry in Marilu's brain and in your brain, 151 00:08:29,360 --> 00:08:31,880 Speaker 3: that would lead you to have different experiences of what 152 00:08:31,960 --> 00:08:32,559 Speaker 3: memory is. 153 00:08:33,240 --> 00:08:37,079 Speaker 1: What would happen, or have you ever encountered anything, David... 154 00:08:37,320 --> 00:08:40,680 Speaker 1: I can imagine that it could be potentially very explosive 155 00:08:41,400 --> 00:08:45,000 Speaker 1: if you say, you know, well, you know, because it 156 00:08:45,000 --> 00:08:48,040 Speaker 1: could lead you into horrible areas of racism. Like you say, 157 00:08:48,920 --> 00:08:52,040 Speaker 1: Mediterranean people, their brains tend to be like this, 158 00:08:52,480 --> 00:08:54,760 Speaker 1: or Nordic people, their brains tend to 159 00:08:54,760 --> 00:08:56,800 Speaker 3: be like this. Is that a real thing, or is 160 00:08:56,840 --> 00:09:03,400 Speaker 3: that made up by Nazis? The general story is, you know, 161 00:09:03,520 --> 00:09:06,520 Speaker 3: Homo sapiens have only spread around the earth very recently. 162 00:09:06,880 --> 00:09:10,040 Speaker 3: So you know, we originated in Africa about two hundred 163 00:09:10,040 --> 00:09:12,000 Speaker 3: fifty thousand years ago. We came out the top and 164 00:09:12,040 --> 00:09:14,600 Speaker 3: half the people turned left and half turned right and 165 00:09:14,760 --> 00:09:19,720 Speaker 3: became, you know, Europeans or Asians. But on the inside, 166 00:09:20,000 --> 00:09:22,640 Speaker 3: the organ that we have, that three pound mission control 167 00:09:22,679 --> 00:09:27,120 Speaker 3: center, hasn't changed. That's the same, same thing. Why? Because 168 00:09:27,120 --> 00:09:30,120 Speaker 3: two hundred fifty thousand years just isn't enough time. In 169 00:09:30,160 --> 00:09:32,800 Speaker 3: the same way that people's hearts and lungs and kidneys 170 00:09:32,880 --> 00:09:35,360 Speaker 3: are the same as you go around the world. So 171 00:09:36,000 --> 00:09:40,480 Speaker 3: while there's an enormous amount of difference between people, there 172 00:09:40,520 --> 00:09:43,800 Speaker 3: aren't between groups of people, on average. You find this 173 00:09:44,000 --> 00:09:45,560 Speaker 3: giant distribution everywhere. 174 00:09:46,440 --> 00:09:50,520 Speaker 1: Well, what about the idea... I mean, I presume this study, 175 00:09:51,000 --> 00:09:54,440 Speaker 1: is it driven... I mean, it's knowledge driven, I guess, 176 00:09:54,520 --> 00:09:58,360 Speaker 1: it's science. But is it a medical... is that what 177 00:09:58,320 --> 00:09:58,880 Speaker 3: we're looking for? 178 00:09:59,360 --> 00:10:04,040 Speaker 1: Is it to try and solve problems? Like, one thing 179 00:10:04,040 --> 00:10:06,960 Speaker 1: this brings to mind is dementia, obviously, and Alzheimer's, which 180 00:10:07,000 --> 00:10:11,080 Speaker 1: is a real kind of cognitive problem. 181 00:10:11,280 --> 00:10:15,520 Speaker 3: Yes, yeah, exactly right.
So you know, the field of 182 00:10:15,520 --> 00:10:22,280 Speaker 3: neuroscience traditionally studies diseases and disorders and what happens with 183 00:10:22,320 --> 00:10:24,640 Speaker 3: the brain to make it change, for example, in cases 184 00:10:24,679 --> 00:10:27,680 Speaker 3: of dementia. I also study that. A lot of the 185 00:10:27,720 --> 00:10:30,920 Speaker 3: things I do have to do with those areas, but 186 00:10:31,280 --> 00:10:33,880 Speaker 3: just personally, I got very interested in the topic of, 187 00:10:34,320 --> 00:10:38,719 Speaker 3: you know, how does the brain run in everyone under 188 00:10:38,800 --> 00:10:42,880 Speaker 3: normal circumstances, and again, what are the differences between people? 189 00:10:42,920 --> 00:10:45,240 Speaker 3: And, by the way, how does it matter for society? 190 00:10:45,320 --> 00:10:47,000 Speaker 3: So one of the things I do is I run this 191 00:10:47,880 --> 00:10:52,240 Speaker 3: Center for Neuroscience and Law, which is, all the things 192 00:10:52,240 --> 00:10:54,839 Speaker 3: that we're learning in neuroscience, how does this affect the 193 00:10:55,280 --> 00:10:57,920 Speaker 3: legal system and how we think about things there. 194 00:10:58,400 --> 00:11:03,000 Speaker 1: I mean, so if there's a behavioral problem, for example, 195 00:11:03,520 --> 00:11:06,880 Speaker 1: excuse me, if you find out that someone reacts a 196 00:11:06,880 --> 00:11:10,400 Speaker 1: certain way being triggered by a certain stimulus. Like, off 197 00:11:10,440 --> 00:11:11,679 Speaker 1: the top of my head, I'm not a doctor and 198 00:11:11,679 --> 00:11:13,640 Speaker 1: I don't know what I'm talking about, but say that 199 00:11:14,000 --> 00:11:16,560 Speaker 1: I have a brain that, if you touch me, I 200 00:11:16,679 --> 00:11:20,480 Speaker 1: get very, very upset. Is that something that you could 201 00:11:20,480 --> 00:11:23,200 Speaker 1: bring into the legal world, where you say, well, this 202 00:11:23,679 --> 00:11:26,360 Speaker 1: person behaved very badly when they were being arrested, but 203 00:11:26,520 --> 00:11:28,200 Speaker 1: now we found out they've got the type of brain 204 00:11:28,320 --> 00:11:29,840 Speaker 1: if you touch them, they get very upset. 205 00:11:30,360 --> 00:11:35,200 Speaker 3: So here's the thing, great question. It turns out none 206 00:11:35,240 --> 00:11:38,160 Speaker 3: of this lets people off the hook. So if you 207 00:11:38,440 --> 00:11:40,760 Speaker 3: break the law, if you cross a societal line, you 208 00:11:41,120 --> 00:11:43,680 Speaker 3: still have to confront the legal system for it. But 209 00:11:44,600 --> 00:11:46,400 Speaker 3: one of the things that tells us a lot about 210 00:11:47,160 --> 00:11:49,560 Speaker 3: is new methods for rehabilitation. So what we do right 211 00:11:49,600 --> 00:11:51,160 Speaker 3: now as a society, and it is true around the 212 00:11:51,160 --> 00:11:54,480 Speaker 3: world, we treat incarceration as a one size fits all solution, 213 00:11:55,400 --> 00:11:57,720 Speaker 3: but in fact, we know so much about the brain 214 00:11:57,760 --> 00:12:01,040 Speaker 3: now that if you come in with this particular disorder 215 00:12:01,080 --> 00:12:03,520 Speaker 3: where touching you on your shoulder makes you react badly.
216 00:12:03,640 --> 00:12:05,400 Speaker 3: You know, maybe there's something we can do to help 217 00:12:05,480 --> 00:12:12,000 Speaker 3: you out, at least such that you know at least 218 00:12:13,400 --> 00:12:15,319 Speaker 3: exactly what we're not gonna do. And then it turns 219 00:12:15,320 --> 00:12:17,560 Speaker 3: out that, you know, we might be able to help 220 00:12:17,600 --> 00:12:19,720 Speaker 3: you for the next time. Now, again, it doesn't let 221 00:12:19,720 --> 00:12:22,640 Speaker 3: people off the hook. It's not that you go without punishment, 222 00:12:22,679 --> 00:12:25,320 Speaker 3: but that's, that's the idea, and there are lots of 223 00:12:25,360 --> 00:12:29,160 Speaker 3: ways we can do this. Just one example. I've been 224 00:12:29,400 --> 00:12:33,760 Speaker 3: a strong advocate for having specialized mental health courts. So 225 00:12:33,800 --> 00:12:35,160 Speaker 3: what we do right now is everyone goes to the 226 00:12:35,160 --> 00:12:37,680 Speaker 3: same court system, but if somebody has mental illness, and, 227 00:12:37,720 --> 00:12:39,720 Speaker 3: you know, quite a high number of people 228 00:12:39,720 --> 00:12:42,160 Speaker 3: with mental illness end up on the wrong side of 229 00:12:42,240 --> 00:12:44,559 Speaker 3: the legal system, you know, you have judges and juries that 230 00:12:44,600 --> 00:12:47,760 Speaker 3: maybe don't know anything about, let's say, schizophrenia, or take 231 00:12:48,160 --> 00:12:52,240 Speaker 3: drug rehabilitation. Yeah, most judges and juries don't know a 232 00:12:52,280 --> 00:12:55,479 Speaker 3: great deal about what options are available. So having specialized 233 00:12:55,480 --> 00:12:57,800 Speaker 3: mental health courts, specialized drug courts, things like this are 234 00:12:57,800 --> 00:13:00,760 Speaker 3: really helpful, where you have people with that expertise; they know 235 00:13:01,000 --> 00:13:02,840 Speaker 3: the strategies available. 236 00:13:03,360 --> 00:13:07,560 Speaker 1: It's an interesting thing that you mentioned, excuse me, schizophrenia 237 00:13:08,000 --> 00:13:12,480 Speaker 1: and drug rehabilitation, because schizophrenia... remember, I know nothing about this. 238 00:13:12,920 --> 00:13:16,160 Speaker 1: You know, you're the Brain, I'm Pinky, right? So 239 00:13:17,280 --> 00:13:20,760 Speaker 1: my understanding of it, or the tiny amount I know 240 00:13:20,800 --> 00:13:23,800 Speaker 1: about it, is that schizophrenia is a condition which you 241 00:13:24,120 --> 00:13:25,920 Speaker 1: develop. 242 00:13:25,679 --> 00:13:29,240 Speaker 3: From a genetic position. Right. It's, it's an illness that 243 00:13:29,280 --> 00:13:32,320 Speaker 3: is born within you. Right. It has a strong genetic 244 00:13:32,400 --> 00:13:35,160 Speaker 3: component to it. It's not entirely... Can it be brought 245 00:13:35,240 --> 00:13:40,520 Speaker 3: on by outside stimulus? It certainly can be exacerbated 246 00:13:40,520 --> 00:13:43,560 Speaker 3: and brought on early by things like drugs, for example. 247 00:13:43,760 --> 00:13:45,760 Speaker 3: So for example, this is a real problem with 248 00:13:45,920 --> 00:13:49,520 Speaker 3: young people using marijuana, which has a much higher percentage 249 00:13:49,559 --> 00:13:53,040 Speaker 3: of THC now than it used to in earlier strains.
250 00:13:53,600 --> 00:13:57,240 Speaker 3: A lot of young people are getting psychotic breaks as 251 00:13:57,280 --> 00:13:59,520 Speaker 3: a result of that, a much higher percentage than there used to be. 252 00:14:00,120 --> 00:14:02,240 Speaker 1: That happened to me when I took marijuana when I 253 00:14:02,320 --> 00:14:05,360 Speaker 1: was, when I was young. I stopped taking it. 254 00:14:05,400 --> 00:14:05,800 Speaker 3: I started. 255 00:14:05,800 --> 00:14:09,199 Speaker 1: I took marijuana from when I was about eighteen through until 256 00:14:09,200 --> 00:14:12,240 Speaker 1: I was about twenty, which is a terrible thing to do. 257 00:14:12,400 --> 00:14:15,199 Speaker 1: And I didn't do tons of it, but there was... 258 00:14:15,240 --> 00:14:17,440 Speaker 1: at one point I took some and it, it was 259 00:14:18,040 --> 00:14:22,359 Speaker 1: one of the most horrendous experiences of my life from marijuana, 260 00:14:22,400 --> 00:14:24,520 Speaker 1: and it's very hard to explain that to people who 261 00:14:25,240 --> 00:14:26,520 Speaker 1: don't get affected 262 00:14:26,160 --> 00:14:29,520 Speaker 3: by it that way. Right, right, exactly. And there's probably 263 00:14:29,520 --> 00:14:31,600 Speaker 3: a genetic component to that in terms of who gets 264 00:14:31,600 --> 00:14:34,560 Speaker 3: affected and who doesn't. But yeah. So back to your 265 00:14:34,600 --> 00:14:37,720 Speaker 3: question about schizophrenia though, right. So, what were you going 266 00:14:37,800 --> 00:14:40,600 Speaker 3: to... So it's mostly genetic, but there are environmental things 267 00:14:40,600 --> 00:14:41,560 Speaker 3: that exacerbate it. 268 00:14:48,920 --> 00:14:51,280 Speaker 1: Well, I was going to ask you about drug addiction, 269 00:14:52,080 --> 00:14:55,560 Speaker 1: which I think, I don't know. Is that the same? 270 00:14:55,720 --> 00:15:00,360 Speaker 1: Is it a higher percentage genetic for schizophrenia than it is for 271 00:15:00,440 --> 00:15:03,880 Speaker 1: drug addiction? Is it more environmental and learned behavior for 272 00:15:04,000 --> 00:15:07,640 Speaker 1: drug and alcohol addiction? I can't imagine there is peer 273 00:15:07,680 --> 00:15:10,760 Speaker 1: pressure to become schizophrenic, you know. It seems like it's 274 00:15:10,800 --> 00:15:17,840 Speaker 1: almost accidental. Whereas, I speak as a sober alcoholic. I mean, 275 00:15:18,280 --> 00:15:21,560 Speaker 1: I didn't set out for that to happen, you know, 276 00:15:21,600 --> 00:15:23,760 Speaker 1: I set out for the sober thing to happen, basically, 277 00:15:23,760 --> 00:15:26,080 Speaker 1: but it kind of crept up on me. Is there 278 00:15:26,120 --> 00:15:29,000 Speaker 1: a genetic predisposition to both of these things? Is what 279 00:15:29,040 --> 00:15:29,520 Speaker 1: I'm saying. 280 00:15:29,760 --> 00:15:33,440 Speaker 3: There is, although completely separate genetics for the two. But yes, 281 00:15:33,480 --> 00:15:36,680 Speaker 3: there is a genetic predisposition for addiction, for having an 282 00:15:36,720 --> 00:15:42,600 Speaker 3: addictive personality. That is clearly a thing. But there are 283 00:15:42,720 --> 00:15:45,600 Speaker 3: very few things really that can be separated cleanly into 284 00:15:45,760 --> 00:15:48,920 Speaker 3: nature versus nurture, because there are influences on both. I'll 285 00:15:48,960 --> 00:15:50,520 Speaker 3: just give you an example.
It turns out that, with 286 00:15:50,680 --> 00:15:55,600 Speaker 3: back to schizophrenia, one of the things that influences whether 287 00:15:55,680 --> 00:15:58,000 Speaker 3: someone has a psychotic break in part has to do 288 00:15:58,040 --> 00:16:00,640 Speaker 3: with whether they are living in a place where they 289 00:16:00,640 --> 00:16:03,120 Speaker 3: are in their culture, in their language, or whether they 290 00:16:03,160 --> 00:16:06,560 Speaker 3: have immigrated somewhere else. And when you're living somewhere else, 291 00:16:07,200 --> 00:16:10,920 Speaker 3: there are things like, you know, you can't, for example, 292 00:16:10,920 --> 00:16:14,160 Speaker 3: you can't make jokes in your new language as well, 293 00:16:14,280 --> 00:16:17,200 Speaker 3: or you can't fit in exactly as well. And it 294 00:16:17,240 --> 00:16:20,200 Speaker 3: turns out that more people have psychotic breaks. In my 295 00:16:20,200 --> 00:16:22,440 Speaker 3: book Incognito, I talked about this as, you know, 296 00:16:22,480 --> 00:16:25,560 Speaker 3: one of the risks of getting schizophrenia is the color 297 00:16:25,600 --> 00:16:30,120 Speaker 3: of your passport. So you know, that's, that's a surprising 298 00:16:30,320 --> 00:16:32,920 Speaker 3: social aspect of it that people have discovered. 299 00:16:33,240 --> 00:16:35,960 Speaker 1: Well, that's, that's kind of fascinating to me. Then does 300 00:16:36,040 --> 00:16:40,920 Speaker 1: that, does that lead you to study more mental illness? 301 00:16:41,000 --> 00:16:44,200 Speaker 1: Because clearly there are things like... it's not a mental 302 00:16:44,280 --> 00:16:47,640 Speaker 1: illness, but clearly if someone is dyslexic, they're born dyslexic, right? 303 00:16:47,720 --> 00:16:50,160 Speaker 1: It's not something you learn as a little baby, that's right. 304 00:16:50,200 --> 00:16:53,760 Speaker 1: But if someone's left handed or right handed? Is 305 00:16:53,760 --> 00:16:55,440 Speaker 1: that, do you learn 306 00:16:55,520 --> 00:16:56,280 Speaker 3: that? Is it? 307 00:16:56,360 --> 00:16:58,800 Speaker 1: I mean, how much information do you get right in the beginning, 308 00:16:58,840 --> 00:16:59,840 Speaker 1: I think, is what I'm saying. 309 00:17:00,120 --> 00:17:02,320 Speaker 3: Yeah, yeah, these are great questions. The fact is, when 310 00:17:02,320 --> 00:17:05,080 Speaker 3: it comes to nature versus nurture, the answer is almost 311 00:17:05,119 --> 00:17:09,200 Speaker 3: always both. There are a very tiny number of things 312 00:17:09,720 --> 00:17:11,560 Speaker 3: that are one or the other. For example, the first 313 00:17:11,600 --> 00:17:14,720 Speaker 3: gene that was pulled for a disease was Huntington's disease, 314 00:17:15,880 --> 00:17:19,600 Speaker 3: and everyone thought, great, if you have this gene, you're 315 00:17:19,640 --> 00:17:21,800 Speaker 3: gonna get Huntington's, that's it. And everyone thought this is 316 00:17:21,840 --> 00:17:23,359 Speaker 3: going to be easy. But it turns out it's one 317 00:17:23,359 --> 00:17:27,040 Speaker 3: of the few monogenetic diseases that exist, meaning, you know, 318 00:17:27,280 --> 00:17:29,200 Speaker 3: if you have this gene, blah blah. But everything else turns 319 00:17:29,200 --> 00:17:33,439 Speaker 3: out to be more complicated. Other diseases involve genetics. They 320 00:17:33,480 --> 00:17:36,240 Speaker 3: have lots of different genes, whole families of genes.
321 00:17:36,240 --> 00:17:37,800 Speaker 3: We're still trying to get to the bottom of them. 322 00:17:37,960 --> 00:17:42,679 Speaker 3: But also, most of everything involves what's going on societally too. 323 00:17:42,840 --> 00:17:44,320 Speaker 3: Let me give you an example of this. This came 324 00:17:44,359 --> 00:17:48,520 Speaker 3: out some years ago. The question is, are there genes 325 00:17:48,560 --> 00:17:51,680 Speaker 3: for depression? Well, it turns out if you're a carrier 326 00:17:51,720 --> 00:17:55,320 Speaker 3: of particular genes, the question is, okay, are you more 327 00:17:55,400 --> 00:17:57,560 Speaker 3: likely to get depression? And the answer is that totally 328 00:17:57,600 --> 00:18:01,160 Speaker 3: depends on the number of really traumatic life events you have. 329 00:18:01,320 --> 00:18:03,879 Speaker 3: So let's say, you know, a terrible car accident or 330 00:18:03,920 --> 00:18:05,960 Speaker 3: the death of a loved one, things like that. If 331 00:18:05,960 --> 00:18:09,879 Speaker 3: you've had a lot of traumatic life experiences and you 332 00:18:09,960 --> 00:18:12,960 Speaker 3: carry these genes, then your chance of getting depression is 333 00:18:13,040 --> 00:18:15,760 Speaker 3: much higher than someone who's had the same number of 334 00:18:15,760 --> 00:18:18,600 Speaker 3: traumatic life experiences but doesn't carry the genes. But if 335 00:18:18,640 --> 00:18:21,080 Speaker 3: you don't have, let's say, any or just a few 336 00:18:21,160 --> 00:18:24,680 Speaker 3: life experiences that are bad, your chances are no different 337 00:18:24,680 --> 00:18:28,280 Speaker 3: than anyone else. So this is... now we refer to 338 00:18:28,320 --> 00:18:34,240 Speaker 3: this as gene times environment, you know, gene x environment interactions. 339 00:18:34,640 --> 00:18:37,160 Speaker 3: So it depends on both things. 340 00:18:37,640 --> 00:18:41,040 Speaker 1: So it's kind of like a recipe then, right, yeah, yeah. 341 00:18:41,240 --> 00:18:44,439 Speaker 1: Like, so I want to say, like, some, some 342 00:18:44,600 --> 00:18:45,919 Speaker 1: bit of this and some bit of that. 343 00:18:46,040 --> 00:18:47,359 Speaker 3: You know, you get a little bit of sugar and 344 00:18:47,400 --> 00:18:48,800 Speaker 3: a little bit of salt and a little 345 00:18:48,560 --> 00:18:51,560 Speaker 1: bit of trauma, and you get, you know, you get 346 00:18:51,560 --> 00:18:53,280 Speaker 1: a special thing, exactly. 347 00:18:53,280 --> 00:18:55,400 Speaker 3: And what this points to is the complexity of both 348 00:18:55,400 --> 00:18:59,560 Speaker 3: biology and life, right, you know? Yeah, born... yeah, things 349 00:18:59,560 --> 00:19:01,639 Speaker 3: can happen to you that were unexpected, and you can 350 00:19:01,720 --> 00:19:04,080 Speaker 3: have genes that interact in unexpected ways. 351 00:19:04,359 --> 00:19:08,080 Speaker 1: Yes. And so I imagine there's probably an almost 352 00:19:08,200 --> 00:19:11,239 Speaker 1: infinite amount of variables in all these different things. So 353 00:19:11,520 --> 00:19:14,040 Speaker 1: predicting how someone is ever going to be 354 00:19:14,080 --> 00:19:16,680 Speaker 1: just remains as elusive as ever then, right? That 355 00:19:17,280 --> 00:19:18,000 Speaker 1: is exactly right.
356 00:19:18,040 --> 00:19:21,560 Speaker 3: So you take a movie like Minority Report, where the 357 00:19:21,640 --> 00:19:23,640 Speaker 3: shtick was that you could predict who's going to commit 358 00:19:23,680 --> 00:19:26,920 Speaker 3: a crime in the future. It's total fantasy and it'll 359 00:19:26,960 --> 00:19:29,479 Speaker 3: never happen. In other words, people think, hey, as we 360 00:19:29,520 --> 00:19:32,159 Speaker 3: get better with brain imaging or with AI, won't we 361 00:19:32,200 --> 00:19:34,919 Speaker 3: get to that point someday? But the answer is it'll never happen. 362 00:19:34,960 --> 00:19:40,240 Speaker 3: Why? Because your brain is changing and rewiring every second 363 00:19:40,280 --> 00:19:43,920 Speaker 3: of your life depending on your interactions. So, for example, 364 00:19:44,520 --> 00:19:47,440 Speaker 3: your brain and my brain are different than they were 365 00:19:47,480 --> 00:19:51,080 Speaker 3: five minutes ago, just from conversing with each other. Right? So, 366 00:19:51,520 --> 00:19:54,320 Speaker 3: and this is the notion of brain plasticity, which is 367 00:19:54,359 --> 00:19:57,840 Speaker 3: that fundamentally, what mother nature has done is built a 368 00:19:57,960 --> 00:20:04,320 Speaker 3: system that absorbs the world and is constantly reconfiguring itself. 369 00:20:05,000 --> 00:20:07,840 Speaker 1: So I kind of... you mentioned the AI there, so 370 00:20:07,960 --> 00:20:10,960 Speaker 1: I kind of find that fascinating, because the idea I 371 00:20:11,040 --> 00:20:13,480 Speaker 1: suppose of AI... Again, I know not of AI either, 372 00:20:13,800 --> 00:20:17,359 Speaker 1: but the idea is that it mimics the learning pattern 373 00:20:17,480 --> 00:20:19,439 Speaker 1: of a human brain, right, because that's all it can do, 374 00:20:19,520 --> 00:20:21,440 Speaker 1: given the fact that's all we have to build it with. 375 00:20:22,080 --> 00:20:24,080 Speaker 3: Sort of. So this is the really interesting thing. So 376 00:20:24,720 --> 00:20:28,760 Speaker 3: AI launched many decades ago, and the idea was, okay, look, 377 00:20:28,800 --> 00:20:32,320 Speaker 3: the brain is super complicated, but fundamentally you've got these units, 378 00:20:32,320 --> 00:20:35,120 Speaker 3: and you've got these connections between the units in the brain. 379 00:20:35,200 --> 00:20:37,240 Speaker 3: These are neurons, and these are all the connections between 380 00:20:37,280 --> 00:20:40,000 Speaker 3: the units. So people said, look, what if we just 381 00:20:40,040 --> 00:20:42,199 Speaker 3: make a cartoon version of this, where you just have 382 00:20:42,400 --> 00:20:45,120 Speaker 3: these, you know, little units, and you've got these connections, 383 00:20:45,119 --> 00:20:47,040 Speaker 3: and you're just changing the strength of those connections 384 00:20:47,040 --> 00:20:52,560 Speaker 3: across the big network. So that's where artificial neural networks 385 00:20:52,880 --> 00:20:55,280 Speaker 3: took off and went in that direction, and it turns 386 00:20:55,320 --> 00:21:00,000 Speaker 3: out that's become incredibly successful. We've got this great renaissance 387 00:21:00,160 --> 00:21:04,240 Speaker 3: going on of AI. But it's actually not that much 388 00:21:04,480 --> 00:21:07,959 Speaker 3: like the human brain. It's, it's quite different.
So there 389 00:21:07,960 --> 00:21:10,960 Speaker 3: are many, many things that the human brain does that 390 00:21:11,000 --> 00:21:15,000 Speaker 3: AI simply can't, and, at least in its current architecture, 391 00:21:16,119 --> 00:21:19,119 Speaker 3: it won't do anytime soon. I'll give you an example 392 00:21:19,119 --> 00:21:21,080 Speaker 3: of that. But I'll also say really quickly that it's 393 00:21:21,119 --> 00:21:24,600 Speaker 3: not to say that we can't build artificial neural networks 394 00:21:24,600 --> 00:21:27,120 Speaker 3: that are just like the brain and someday, maybe five 395 00:21:27,200 --> 00:21:29,440 Speaker 3: years from now, maybe fifty years from now, do everything a 396 00:21:29,480 --> 00:21:33,360 Speaker 3: brain does. But our current stuff, like ChatGPT, for example, 397 00:21:33,400 --> 00:21:36,560 Speaker 3: does not have an internal model of the world. So 398 00:21:36,760 --> 00:21:40,520 Speaker 3: if I ask GPT, hey, when Craig Ferguson walks into 399 00:21:40,560 --> 00:21:44,080 Speaker 3: a room, does his nose come with him? It won't 400 00:21:44,160 --> 00:21:46,880 Speaker 3: know the answer to that, because it has no model 401 00:21:47,000 --> 00:21:50,200 Speaker 3: of the world. You know, does, does his spleen come 402 00:21:50,200 --> 00:21:50,520 Speaker 3: with him? 403 00:21:50,520 --> 00:21:50,680 Speaker 2: Well? 404 00:21:50,800 --> 00:21:53,240 Speaker 3: How does it know? It doesn't. Why? Because the 405 00:21:53,280 --> 00:21:56,199 Speaker 3: way ChatGPT is trained, it's read everything in the 406 00:21:56,240 --> 00:22:00,520 Speaker 3: world and it's just doing statistical games on what word 407 00:22:00,600 --> 00:22:03,000 Speaker 3: is likely to come next. That's all GPT does. It's 408 00:22:03,040 --> 00:22:08,280 Speaker 3: an enormous, enormous network. Yeah, that's just why I did ask. 409 00:22:08,720 --> 00:22:11,919 Speaker 1: I asked ChatGPT to write me a short Craig 410 00:22:11,960 --> 00:22:17,720 Speaker 1: Ferguson stand-up comedy routine. And you know, I feel 411 00:22:17,720 --> 00:22:21,720 Speaker 1: like either I'm a terrible writer but my delivery is great, 412 00:22:22,280 --> 00:22:26,240 Speaker 1: or I'm just a terrible comedian, or ChatGPT has 413 00:22:26,280 --> 00:22:27,840 Speaker 1: got a way to go yet, and it should maybe, 414 00:22:27,880 --> 00:22:29,960 Speaker 1: you know, work on its material in some clubs. 415 00:22:30,440 --> 00:22:32,159 Speaker 3: So, so, okay, this is a really good point. And 416 00:22:32,240 --> 00:22:36,480 Speaker 3: ChatGPT is terrible at humor, at making up new jokes. Okay, 417 00:22:36,520 --> 00:22:40,520 Speaker 3: why? It's because it's just a statistical parrot. And what 418 00:22:40,560 --> 00:22:44,280 Speaker 3: it realizes is that humor is all about the violation 419 00:22:44,359 --> 00:22:48,720 Speaker 3: of expectation, but it doesn't know how to violate it well. 420 00:22:48,920 --> 00:22:50,600 Speaker 3: So if you ask it, tell me a joke about, 421 00:22:50,640 --> 00:22:52,440 Speaker 3: you know, three guys who walk into a bar and, 422 00:22:52,440 --> 00:22:54,439 Speaker 3: you know, do blah blah blah.
It'll say the 423 00:22:54,440 --> 00:22:56,359 Speaker 3: first guy does this, the second guy does this, and 424 00:22:56,400 --> 00:22:58,800 Speaker 3: the third guy does this, and it'll say something that 425 00:22:58,840 --> 00:23:01,640 Speaker 3: doesn't make any sense, because it knows the third thing 426 00:23:01,680 --> 00:23:03,840 Speaker 3: is supposed to break the pattern, but it doesn't know 427 00:23:03,880 --> 00:23:06,200 Speaker 3: how to do it in a funny way. 428 00:23:06,880 --> 00:23:11,360 Speaker 5: I feel, in Spain... I, I think there's an argument 429 00:23:11,440 --> 00:23:13,560 Speaker 5: to be made for having a comedy night in a 430 00:23:13,600 --> 00:23:16,920 Speaker 5: club where comedians have to tell jokes written by computers. 431 00:23:17,160 --> 00:23:20,399 Speaker 1: And in fact, I am going to put that together 432 00:23:20,440 --> 00:23:21,480 Speaker 1: as soon as possible. 433 00:23:24,600 --> 00:23:27,000 Speaker 3: So it's funny that you mentioned this, because I'm 434 00:23:27,040 --> 00:23:31,360 Speaker 3: actually working on, on a television documentary now. I'm 435 00:23:31,400 --> 00:23:34,440 Speaker 3: writing this up with my colleagues. It's called Bits and Giggles, 436 00:23:34,800 --> 00:23:37,640 Speaker 3: and it's exactly about this. It's about a comedian who 437 00:23:37,680 --> 00:23:41,399 Speaker 3: goes on a road trip with... no... Yeah. We've actually 438 00:23:41,440 --> 00:23:45,560 Speaker 3: built a little bot that does, you know, speech to text, 439 00:23:45,560 --> 00:23:47,600 Speaker 3: it goes off to ChatGPT, and then does, you 440 00:23:47,640 --> 00:23:50,600 Speaker 3: know, text to speech. So you can have a dialogue 441 00:23:50,680 --> 00:23:53,359 Speaker 3: back and forth with this little bot. And the question 442 00:23:53,600 --> 00:23:56,159 Speaker 3: is, what does AI mean for us? Will it be 443 00:23:56,280 --> 00:23:59,119 Speaker 3: funny? Can it take the place of community? Can 444 00:23:59,200 --> 00:24:02,960 Speaker 3: it perform on stage with a comedian? Yeah? 445 00:24:03,119 --> 00:24:06,320 Speaker 1: I think, I think it's quite interesting because, because 446 00:24:06,359 --> 00:24:09,639 Speaker 1: of my own history with comedy, I love it, and 447 00:24:09,720 --> 00:24:13,640 Speaker 1: I feel that it's a very human connectivity thing. It's 448 00:24:13,680 --> 00:24:16,320 Speaker 1: a very... in an odd way, it's a very intimate thing, 449 00:24:17,160 --> 00:24:19,880 Speaker 1: even although, you know, it's one person and an audience 450 00:24:20,000 --> 00:24:22,040 Speaker 1: in the way that I do it anyway. And I 451 00:24:22,119 --> 00:24:27,240 Speaker 1: wonder, if that happens, then do you have your robot lover, 452 00:24:27,440 --> 00:24:31,520 Speaker 1: do you have your robot husband, your robot wife, your 453 00:24:31,640 --> 00:24:34,520 Speaker 1: robot spouse, your robot children? 454 00:24:34,720 --> 00:24:38,760 Speaker 3: I mean, is it? Is it possible? Yeah? So okay, 455 00:24:38,840 --> 00:24:40,760 Speaker 3: so this is funny, right. So my wife has been 456 00:24:40,840 --> 00:24:42,760 Speaker 3: joking about this for a long time.
She's been joking 457 00:24:42,760 --> 00:24:45,800 Speaker 3: about the five percent better David, by which she means, 458 00:24:46,400 --> 00:24:48,720 Speaker 3: what if she had an AI David that had all my 459 00:24:48,760 --> 00:24:52,080 Speaker 3: good qualities but never got distracted or angry or, 460 00:24:52,119 --> 00:24:55,879 Speaker 3: you know, looked at my phone when it beeps or whatever. So, uh, 461 00:24:56,400 --> 00:24:59,600 Speaker 3: we've talked about this a lot and what this means. 462 00:24:59,680 --> 00:25:02,359 Speaker 3: And you know, the issue now is that lots of 463 00:25:02,400 --> 00:25:04,879 Speaker 3: young people are getting AI girlfriends, and to some lesser 464 00:25:04,920 --> 00:25:09,639 Speaker 3: extent girls getting AI boyfriends. In Japan, apparently this 465 00:25:09,680 --> 00:25:13,840 Speaker 3: is becoming a bigger thing, where people have these AI relationships. 466 00:25:14,640 --> 00:25:17,760 Speaker 3: We can imagine perhaps the downsides of this, but I 467 00:25:17,800 --> 00:25:20,520 Speaker 3: do want to note I think an upside for young 468 00:25:20,600 --> 00:25:24,159 Speaker 3: people is you might be able to learn how to 469 00:25:24,240 --> 00:25:27,520 Speaker 3: navigate relationships a little bit better, and, you know, you 470 00:25:27,600 --> 00:25:31,359 Speaker 3: kind of get your sandbox, your practice relationship, and if 471 00:25:31,400 --> 00:25:34,480 Speaker 3: the AI gives you good feedback, you might actually 472 00:25:34,560 --> 00:25:36,400 Speaker 3: be a better person in relationships. 473 00:25:36,840 --> 00:25:39,600 Speaker 1: Yeah, but then you don't get the requisite amount of 474 00:25:39,640 --> 00:25:42,520 Speaker 1: trauma to make you human. I mean, no one wants 475 00:25:42,560 --> 00:25:45,840 Speaker 1: to wish trauma on anyone, but junior high school may 476 00:25:45,920 --> 00:25:49,520 Speaker 1: be an essential component of making you a better person. 477 00:25:50,560 --> 00:25:54,200 Speaker 3: It's an interesting conundrum. So I agree with you, and 478 00:25:54,880 --> 00:25:56,680 Speaker 3: it turns out the way to do this, I think, 479 00:25:56,760 --> 00:26:00,320 Speaker 3: is to make the AI bot be traumatic in the 480 00:26:00,400 --> 00:26:03,080 Speaker 3: sense of, you know, if you say something wrong to it, 481 00:26:03,080 --> 00:26:05,320 Speaker 3: it's not kind or something. You know, it gets its 482 00:26:05,320 --> 00:26:09,359 Speaker 3: feelings hurt. Obviously it's just statistically impersonating this, but the 483 00:26:09,440 --> 00:26:12,480 Speaker 3: point is, instead of having an AI that just says, oh, 484 00:26:12,520 --> 00:26:14,399 Speaker 3: that was so good, that was so funny and nice, 485 00:26:14,760 --> 00:26:17,000 Speaker 3: instead it gives you real feedback, tough love. 486 00:26:17,160 --> 00:26:20,520 Speaker 1: And that, that might actually work better than the parenting 487 00:26:20,560 --> 00:26:24,359 Speaker 1: that my generation inflicted on the next generation, which I 488 00:26:24,359 --> 00:26:27,399 Speaker 1: think was a little too positive. I don't know, something 489 00:26:27,440 --> 00:26:30,119 Speaker 1: went wrong. So let me try it.
Let me steer 490 00:26:30,160 --> 00:26:32,600 Speaker 1: you around back to the brain and perception a little bit, 491 00:26:32,640 --> 00:26:36,320 Speaker 1: because I'm fascinated by the idea... in my own life, 492 00:26:36,320 --> 00:26:39,480 Speaker 1: I'm fascinated by the idea, and I think most people 493 00:26:39,520 --> 00:26:42,200 Speaker 1: are, of what it's all about. What's the meaning of life? 494 00:26:42,520 --> 00:26:43,960 Speaker 1: Is there a God? 495 00:26:44,480 --> 00:26:44,719 Speaker 3: You know? 496 00:26:45,200 --> 00:26:49,080 Speaker 1: And I wonder, if in the study of the brain, 497 00:26:49,280 --> 00:26:53,159 Speaker 1: which is, you know, it's information central, it's information and 498 00:26:53,200 --> 00:26:57,280 Speaker 1: control, everything's passing through there, does it lead you in 499 00:26:57,400 --> 00:27:01,560 Speaker 1: any direction personally, for yourself? Does it lead you in 500 00:27:01,600 --> 00:27:02,920 Speaker 1: an atheistic direction? 501 00:27:03,040 --> 00:27:06,639 Speaker 3: Does it lead you in a faith-based direction? But 502 00:27:06,800 --> 00:27:09,600 Speaker 3: does it do anything to you personally? Sure. I mean, 503 00:27:09,640 --> 00:27:12,359 Speaker 3: I've spent my life in science, and I feel like 504 00:27:12,560 --> 00:27:17,359 Speaker 3: the main lesson that one can derive is to really 505 00:27:17,440 --> 00:27:22,879 Speaker 3: understand the vastness of our ignorance. And so the more 506 00:27:22,960 --> 00:27:26,800 Speaker 3: you reach down into science in the world and the cosmos, 507 00:27:26,920 --> 00:27:29,399 Speaker 3: you find that there's so much that we don't know, 508 00:27:29,520 --> 00:27:31,679 Speaker 3: we don't understand. So what does that mean for me? 509 00:27:32,119 --> 00:27:37,520 Speaker 3: I am neither an atheist nor religious, because atheism, at 510 00:27:37,600 --> 00:27:40,879 Speaker 3: least in its harshest form, in its strictest form, kind 511 00:27:40,880 --> 00:27:44,000 Speaker 3: of often pretends, hey, we've got this all figured out, 512 00:27:44,000 --> 00:27:46,240 Speaker 3: we know what's going on here, but it's clear that 513 00:27:46,240 --> 00:27:49,119 Speaker 3: we don't know what's going on here. On the flip side, 514 00:27:49,880 --> 00:27:54,320 Speaker 3: all the traditional religions also pretend to have certainty about stuff, 515 00:27:54,359 --> 00:27:58,080 Speaker 3: and they're all making it up. And so that puts 516 00:27:58,119 --> 00:28:00,879 Speaker 3: me in the middle. I don't call myself agnostic, 517 00:28:00,960 --> 00:28:05,439 Speaker 3: because agnostic often means I don't know if there's a 518 00:28:05,480 --> 00:28:08,159 Speaker 3: guy with a beard on a cloud or not. But 519 00:28:08,480 --> 00:28:11,040 Speaker 3: I call myself something else. I call myself a possibilian. 520 00:28:11,359 --> 00:28:15,760 Speaker 3: And the idea with possibilianism is an active exploration of 521 00:28:15,800 --> 00:28:19,320 Speaker 3: the possibility space, trying to figure out what is going 522 00:28:19,359 --> 00:28:22,560 Speaker 3: on in this great, big cosmos that we're in.
And 523 00:28:22,640 --> 00:28:27,080 Speaker 3: so the idea is, you know, to take a scientific 524 00:28:27,200 --> 00:28:30,719 Speaker 3: mindset to this question, which is to say, you know, 525 00:28:30,800 --> 00:28:34,200 Speaker 3: science always has a broad table and allows lots of 526 00:28:34,280 --> 00:28:37,920 Speaker 3: hypotheses on, and says, okay, maybe this, maybe that, cool. 527 00:28:37,920 --> 00:28:41,120 Speaker 3: I'll let anything on the table. But, you know, we 528 00:28:41,280 --> 00:28:44,040 Speaker 3: use the tools of science to rule out parts of 529 00:28:44,040 --> 00:28:47,280 Speaker 3: the possibility space. So if you come and say, hey, look, 530 00:28:47,360 --> 00:28:49,000 Speaker 3: I think, you know, this thing is going on with 531 00:28:49,120 --> 00:28:51,959 Speaker 3: crystals, or this, or ESP, or whatever, we can actually 532 00:28:52,000 --> 00:28:54,080 Speaker 3: test that and rule things out. And we, and we... 533 00:28:54,280 --> 00:28:55,880 Speaker 3: that's what we do all the time. There's lots of 534 00:28:55,920 --> 00:28:59,040 Speaker 3: stuff that's sort of off the table at this point. 535 00:28:59,600 --> 00:29:01,920 Speaker 3: For example, you know, if you were to say, like, 536 00:29:02,040 --> 00:29:04,480 Speaker 3: you know, traditional religious books, hey, the earth is six 537 00:29:04,520 --> 00:29:08,640 Speaker 3: thousand years old. You know, that's a problem, because, you know, 538 00:29:09,040 --> 00:29:12,800 Speaker 3: the Japanese were, you know, making pottery six thousand years 539 00:29:12,840 --> 00:29:15,520 Speaker 3: before that, and so that can't be true. So we 540 00:29:16,040 --> 00:29:18,480 Speaker 3: can use the tools of science to open up new 541 00:29:18,520 --> 00:29:22,400 Speaker 3: folds in the possibility space and to rule things out. 542 00:29:22,720 --> 00:29:25,040 Speaker 3: But what it allows is a big space where we 543 00:29:25,080 --> 00:29:27,080 Speaker 3: can shine a flashlight around and say, all right, look, 544 00:29:27,160 --> 00:29:29,440 Speaker 3: we're not going to pretend that we know for sure 545 00:29:29,520 --> 00:29:33,560 Speaker 3: that nothing exists, or that this particular made-up story exists. Instead, 546 00:29:33,600 --> 00:29:35,840 Speaker 3: we're going to explore. It's fascinating to me. 547 00:29:35,880 --> 00:29:37,920 Speaker 1: I imagine at some point in your life you ran 548 00:29:37,960 --> 00:29:40,720 Speaker 1: across The Varieties of Religious Experience. 549 00:29:41,080 --> 00:29:44,240 Speaker 3: The William James lectures, did you run into them? I 550 00:29:44,360 --> 00:29:45,880 Speaker 3: have not read that. I've heard of it. 551 00:29:46,520 --> 00:29:50,200 Speaker 1: Yeah, it's interesting, because I'm kind of, like, climbing through it 552 00:29:50,320 --> 00:29:52,520 Speaker 1: right now, for, I don't know, I guess that's what 553 00:29:52,640 --> 00:29:56,160 Speaker 1: I do for entertainment. There was one of the lectures 554 00:29:56,160 --> 00:30:00,600 Speaker 1: he gave where he talks about how, when people believe 555 00:30:00,640 --> 00:30:03,440 Speaker 1: something and it makes them feel good, then they are 556 00:30:03,520 --> 00:30:08,160 Speaker 1: convinced it's true. He's talking and I thought, that's fascinating 557 00:30:08,200 --> 00:30:09,800 Speaker 1: to me.
I knew I was going to be talking 558 00:30:09,800 --> 00:30:12,120 Speaker 1: to you today as well, and I thought that that's 559 00:30:12,160 --> 00:30:16,920 Speaker 1: an interesting position to be in, that the religious experience... 560 00:30:17,360 --> 00:30:23,000 Speaker 1: If something, a ceremony, or a story, or a particular 561 00:30:23,200 --> 00:30:26,479 Speaker 1: piece of dogma, gives you a sense of euphoria, is 562 00:30:26,520 --> 00:30:31,880 Speaker 1: euphoria something that breeds verisimilitude? Is it something that 563 00:30:31,920 --> 00:30:35,840 Speaker 1: you say, I feel good, therefore this must be true? 564 00:30:36,080 --> 00:30:38,680 Speaker 3: Yeah. I think there are lots of reasons why people 565 00:30:39,280 --> 00:30:43,800 Speaker 3: believe things about any religious story. One of them, of course, 566 00:30:43,920 --> 00:30:47,640 Speaker 3: is that often people don't apply real rigor in what 567 00:30:47,680 --> 00:30:50,120 Speaker 3: they call true or not true in the first place. 568 00:30:50,160 --> 00:30:52,760 Speaker 3: But secondly, there's a huge social component to this. If 569 00:30:52,760 --> 00:30:55,960 Speaker 3: your friends, your family, or loved ones go and they 570 00:30:56,000 --> 00:30:59,640 Speaker 3: pray to this deity in this situation, then people feel like, hey, 571 00:30:59,680 --> 00:31:03,800 Speaker 3: you know, that's something that is meaningful to me too. 572 00:31:04,360 --> 00:31:06,360 Speaker 3: So I think there are a lot of reasons why 573 00:31:06,400 --> 00:31:08,960 Speaker 3: people believe in things. One of them is that people, 574 00:31:09,240 --> 00:31:12,280 Speaker 3: you know, don't necessarily apply rigorous tools when they're deciding 575 00:31:12,280 --> 00:31:15,640 Speaker 3: what to believe or not. But more than that, there's 576 00:31:15,680 --> 00:31:19,680 Speaker 3: a huge social component to religion or faith of any sort, 577 00:31:19,680 --> 00:31:22,040 Speaker 3: which is to say, if your friends and loved ones 578 00:31:22,680 --> 00:31:25,200 Speaker 3: believe in a particular thing, we tend to be compelled 579 00:31:25,200 --> 00:31:27,440 Speaker 3: that way. And if you live in a place where 580 00:31:27,480 --> 00:31:31,520 Speaker 3: everyone around you believes whatever deity and whatever crazy thing, 581 00:31:31,840 --> 00:31:33,800 Speaker 3: then you grow up that way and you think, 582 00:31:33,800 --> 00:31:35,480 Speaker 3: of course it must be true, because these people that 583 00:31:35,520 --> 00:31:39,160 Speaker 3: I love and respect, they believe that. So there are 584 00:31:39,200 --> 00:31:42,880 Speaker 3: many different things that compel people. And you know, maybe 585 00:31:42,920 --> 00:31:44,800 Speaker 3: someone says, okay, I'm going to finally break from my 586 00:31:44,840 --> 00:31:48,720 Speaker 3: religion, and they go to some other religion, and whatever... 587 00:31:48,760 --> 00:31:53,920 Speaker 3: there are, you know, attractive people there, or a compelling narrator 588 00:31:53,920 --> 00:31:56,160 Speaker 3: who tells them something, and so they feel like, hey, 589 00:31:56,200 --> 00:31:58,440 Speaker 3: that fits with what I need in my life. That's 590 00:31:58,480 --> 00:32:01,600 Speaker 3: the message I need right now. But none of this... yeah, 591 00:32:02,120 --> 00:32:05,320 Speaker 3: none of this qualifies as good reasons to believe. It's 592 00:32:05,360 --> 00:32:06,400 Speaker 3: just why people believe.
593 00:32:07,160 --> 00:32:10,480 Speaker 1: It's interesting though, because the only, the ultimate measuring tool 594 00:32:10,520 --> 00:32:12,520 Speaker 1: that you have in front of you, or that you 595 00:32:12,600 --> 00:32:16,320 Speaker 1: have to use, is in fact your own perception and 596 00:32:16,360 --> 00:32:19,480 Speaker 1: the perception of your contemporaries. Right, So if you set 597 00:32:19,560 --> 00:32:23,760 Speaker 1: up even the most rigorous academic test, you're still looking 598 00:32:23,800 --> 00:32:26,040 Speaker 1: at it with your eyes and thinking about it with 599 00:32:26,200 --> 00:32:26,920 Speaker 1: your brain. 600 00:32:27,440 --> 00:32:29,280 Speaker 3: I don't, I don't think so. I think that we can 601 00:32:29,320 --> 00:32:32,640 Speaker 3: actually use the tools of science, really. So yeah. So, 602 00:32:33,680 --> 00:32:37,080 Speaker 3: for example, I mentioned the way that science opens up 603 00:32:37,120 --> 00:32:40,880 Speaker 3: new folds in the possibility space as we discover things, 604 00:32:41,240 --> 00:32:43,360 Speaker 3: for example, about the size of the... I mean, look, 605 00:32:43,400 --> 00:32:46,280 Speaker 3: you know, poor, you know, Galileo had to spend the 606 00:32:46,320 --> 00:32:49,800 Speaker 3: last part of his life imprisoned because he suggested that 607 00:32:49,880 --> 00:32:52,320 Speaker 3: maybe the Earth is going around the Sun and not 608 00:32:52,440 --> 00:32:54,920 Speaker 3: vice versa. But as we discover more and more about 609 00:32:54,920 --> 00:32:58,720 Speaker 3: the cosmos and understand the absolute enormity of it, and 610 00:32:59,160 --> 00:33:02,960 Speaker 3: that our galaxy has one hundred billion stars, any number, 611 00:33:03,040 --> 00:33:05,320 Speaker 3: you know, any one of which has a number of planets 612 00:33:05,360 --> 00:33:07,960 Speaker 3: rolling around it, and, and our galaxy is one of 613 00:33:08,080 --> 00:33:11,840 Speaker 3: one hundred billion galaxies in the cosmos. And as we, 614 00:33:11,960 --> 00:33:14,479 Speaker 3: as we understand these things, I think that opens us 615 00:33:14,560 --> 00:33:17,720 Speaker 3: up to a very different kind of faith, so that 616 00:33:17,720 --> 00:33:20,360 Speaker 3: we don't have to think about, okay, my little local 617 00:33:20,400 --> 00:33:23,320 Speaker 3: deity can beat your local deity and so on. First 618 00:33:23,360 --> 00:33:25,560 Speaker 3: of all, science opens up these things. But then the 619 00:33:25,600 --> 00:33:28,240 Speaker 3: other thing I mentioned is that science rules things out, 620 00:33:29,760 --> 00:33:32,960 Speaker 3: you know, whether that's the age of the Earth or 621 00:33:33,000 --> 00:33:36,239 Speaker 3: the idea that, you know, your deity, you know, did 622 00:33:36,680 --> 00:33:39,160 Speaker 3: some little things, some little magic trick, and you can 623 00:33:39,600 --> 00:33:43,280 Speaker 3: rule that stuff in or out. I think we probably 624 00:33:43,320 --> 00:33:46,600 Speaker 3: have a very different perspective on the world than we 625 00:33:46,680 --> 00:33:50,280 Speaker 3: did even three hundred years ago, when people considered, hey, 626 00:33:50,320 --> 00:33:53,880 Speaker 3: do I think this deity represents the truth?
And, 627 00:33:53,880 --> 00:33:57,480 Speaker 3: and, you know, for example, we're so global now that 628 00:33:57,560 --> 00:34:01,480 Speaker 3: we see there are two thousand religions on the planet, 629 00:34:01,920 --> 00:34:05,520 Speaker 3: and so it becomes harder to believe, oh, the thing 630 00:34:05,520 --> 00:34:07,760 Speaker 3: that I grew up with has to be the right one, 631 00:34:07,760 --> 00:34:10,719 Speaker 3: because you now see that there are two thousand other 632 00:34:10,960 --> 00:34:14,320 Speaker 3: versions of this stuff. So anyway, all these things point 633 00:34:14,360 --> 00:34:17,560 Speaker 3: to the idea that as we become smarter as a society, I think 634 00:34:17,600 --> 00:34:22,480 Speaker 3: we can develop notions that maybe are more appropriate to 635 00:34:22,560 --> 00:34:31,879 Speaker 3: a deeper view, along the lines of what you were saying. 636 00:34:31,920 --> 00:34:34,480 Speaker 1: I remember before my first kid was born, I said 637 00:34:35,000 --> 00:34:39,840 Speaker 1: to the obstetrician, how much do you actually know about 638 00:34:39,920 --> 00:34:43,760 Speaker 1: what's going on in pregnancy? And she said, 639 00:34:44,160 --> 00:34:47,239 Speaker 1: if you'd asked me that question ten years ago, I'd 640 00:34:47,239 --> 00:34:51,880 Speaker 1: have said about fifty percent. But we've learned so much 641 00:34:52,480 --> 00:34:55,160 Speaker 1: that now I would say about twenty five percent. 642 00:34:55,920 --> 00:34:57,759 Speaker 3: Excellent, very good. 643 00:34:58,160 --> 00:35:01,720 Speaker 1: And I think that that's a fascinating thought, because 644 00:35:01,719 --> 00:35:07,040 Speaker 1: if you follow the logical, mathematical route of that, literally, 645 00:35:07,120 --> 00:35:08,960 Speaker 1: the more you know, the less you know. 646 00:35:09,560 --> 00:35:13,280 Speaker 3: Yeah, that's exactly it. I mean, there used to be people 647 00:35:13,360 --> 00:35:17,560 Speaker 3: called pansophists, which meant, you know, someone who knows everything 648 00:35:17,640 --> 00:35:20,280 Speaker 3: there is to be known. And you know, back in ancient Greece 649 00:35:21,200 --> 00:35:23,680 Speaker 3: it was plausible to have somebody who was a pansophist, 650 00:35:23,719 --> 00:35:27,239 Speaker 3: and now it's totally impossible. It has been for centuries. 651 00:35:29,200 --> 00:35:32,160 Speaker 3: So you know, that's lovely. And what I love, by 652 00:35:32,200 --> 00:35:34,040 Speaker 3: the way, about this moment in time right now, is 653 00:35:34,080 --> 00:35:39,640 Speaker 3: we've got AI that has consumed, you know, every single 654 00:35:39,680 --> 00:35:43,600 Speaker 3: thing ever written by humans, and so that provides a 655 00:35:43,640 --> 00:35:47,400 Speaker 3: whole new way of interacting and learning humankind's knowledge, 656 00:35:47,400 --> 00:35:49,880 Speaker 3: which is a sphere that is now much too large 657 00:35:49,880 --> 00:35:52,440 Speaker 3: for any of us to ever hope to even get into. 658 00:35:52,840 --> 00:35:55,480 Speaker 3: But what we can do is find some doorway that 659 00:35:55,560 --> 00:35:58,239 Speaker 3: interests us and enter the sphere that way, and by 660 00:35:58,320 --> 00:36:02,440 Speaker 3: talking to the AI just, you know, learn all about 661 00:36:02,480 --> 00:36:04,879 Speaker 3: the world by asking questions that are relevant to us.
662 00:36:04,880 --> 00:36:06,560 Speaker 3: And I think this is going to really change our 663 00:36:06,680 --> 00:36:11,359 Speaker 3: educational systems, for schooling, because right now, you know, kids 664 00:36:11,480 --> 00:36:13,919 Speaker 3: in classrooms, it's too fast for half the kids, too 665 00:36:13,920 --> 00:36:18,400 Speaker 3: slow for half the kids. But we can finally achieve 666 00:36:18,520 --> 00:36:24,600 Speaker 3: this dream of real individualized education where everyone, you know, 667 00:36:24,680 --> 00:36:27,000 Speaker 3: has an AI tutor, which is, by the way, how 668 00:36:27,000 --> 00:36:28,920 Speaker 3: it used to go. You know, Alexander the Great was 669 00:36:28,960 --> 00:36:31,960 Speaker 3: tutored by Aristotle, and you know, you'd sit there and 670 00:36:32,120 --> 00:36:35,520 Speaker 3: have conversations. And I think we'll return to that. But 671 00:36:35,560 --> 00:36:38,160 Speaker 3: that's, I mean, that's fantastic for learning. 672 00:36:38,200 --> 00:36:40,919 Speaker 1: But I mean what, what you're talking 673 00:36:40,960 --> 00:36:43,560 Speaker 1: about as well, even when we were talking about religion, is 674 00:36:43,600 --> 00:36:49,080 Speaker 1: that your religion, or your propensity to certain depressions, or, 675 00:36:49,320 --> 00:36:50,040 Speaker 1: or, or... 676 00:36:51,880 --> 00:36:53,040 Speaker 3: Different traumas. 677 00:36:53,080 --> 00:36:58,239 Speaker 1: All of that's to do with socialization, and to have 678 00:36:58,400 --> 00:37:02,960 Speaker 1: an AI... at a certain point, I have to ask myself, 679 00:37:02,960 --> 00:37:04,680 Speaker 1: I mean, I'm just speaking. 680 00:37:04,719 --> 00:37:07,240 Speaker 3: I love that. What's the point if it just mimics 681 00:37:07,280 --> 00:37:09,480 Speaker 3: everything that we've already got? We've already got all that. 682 00:37:10,560 --> 00:37:12,880 Speaker 3: So okay, great. Those are two types of questions. So 683 00:37:12,920 --> 00:37:15,680 Speaker 3: as far as socialization, I think that's what school will become. 684 00:37:15,719 --> 00:37:20,120 Speaker 3: I think, you know, you're exactly right that that's such 685 00:37:20,160 --> 00:37:22,920 Speaker 3: an important part of growing up. And this was the 686 00:37:23,800 --> 00:37:26,960 Speaker 3: terrible thing for parents about COVID, seeing your young 687 00:37:27,080 --> 00:37:30,200 Speaker 3: children having to stay home and not wrestling and rolling 688 00:37:30,200 --> 00:37:33,239 Speaker 3: around and jumping on trampolines with other kids. So we'll 689 00:37:33,239 --> 00:37:35,480 Speaker 3: always have that, but school will become more about that, 690 00:37:35,800 --> 00:37:38,480 Speaker 3: and instead of having the teacher drone on to the kids, 691 00:37:38,480 --> 00:37:41,319 Speaker 3: it'll be, you know, the kids put on headphones and do that. 692 00:37:43,440 --> 00:37:47,360 Speaker 3: At least as it stands now, what humans are really 693 00:37:47,400 --> 00:37:55,160 Speaker 3: great at doing is creativity and also understanding which creative 694 00:37:55,239 --> 00:37:59,640 Speaker 3: moves matter. So, for example, I can say to the AI, 695 00:37:59,719 --> 00:38:03,840 Speaker 3: hey, generate one hundred pictures of, you know, Craig sitting 696 00:38:03,840 --> 00:38:07,040 Speaker 3: in an avocado chair holding a poodle, and it'll do that.
697 00:38:07,440 --> 00:38:10,840 Speaker 3: But it doesn't know which of those pictures are better 698 00:38:11,360 --> 00:38:13,680 Speaker 3: than another. But a human looks at it and says, oh, Craig, 699 00:38:13,680 --> 00:38:15,359 Speaker 3: that's a really good one, and that one over there, these 700 00:38:15,600 --> 00:38:20,840 Speaker 3: staying over here, or whatever. And so humans are actually necessary, 701 00:38:20,840 --> 00:38:23,560 Speaker 3: at least at the moment, for doing this next step, 702 00:38:23,560 --> 00:38:26,800 Speaker 3: for figuring out, okay, I can ask, I can query 703 00:38:26,840 --> 00:38:28,799 Speaker 3: the AI, but what do I do with that? What's 704 00:38:28,840 --> 00:38:32,719 Speaker 3: the next step? I'll give you a specific example: science. 705 00:38:33,080 --> 00:38:38,239 Speaker 3: So AI can tell me incredible things like, hey, I 706 00:38:38,280 --> 00:38:40,840 Speaker 3: need to understand, you know, these facts that are scattered 707 00:38:40,840 --> 00:38:44,080 Speaker 3: around all these different journal papers across fifty years. Give 708 00:38:44,080 --> 00:38:46,719 Speaker 3: me a summary of this. It's trivial for it to 709 00:38:46,760 --> 00:38:48,960 Speaker 3: do that, and that's super useful. But what it can't 710 00:38:49,040 --> 00:38:54,120 Speaker 3: do is generate new sorts of science in the way that, 711 00:38:54,200 --> 00:38:57,479 Speaker 3: let's say, Albert Einstein says, okay, what if I were 712 00:38:57,880 --> 00:39:00,600 Speaker 3: riding on a photon of light? What would that be 713 00:39:00,719 --> 00:39:02,360 Speaker 3: like if we were moving at the speed of light? And 714 00:39:02,520 --> 00:39:04,520 Speaker 3: he thinks through that, he says, oh, and he comes 715 00:39:04,600 --> 00:39:07,840 Speaker 3: up with the special theory of relativity. That's progress. He 716 00:39:07,880 --> 00:39:09,680 Speaker 3: won the Nobel Prize for this sort of stuff. 717 00:39:10,000 --> 00:39:12,400 Speaker 3: That's the kind of thing that, at least at the moment, 718 00:39:12,800 --> 00:39:16,880 Speaker 3: AI does not do. So in answer to your question, the 719 00:39:18,040 --> 00:39:21,680 Speaker 3: ultimate perfect thing is if we have AI copilots 720 00:39:21,719 --> 00:39:24,960 Speaker 3: with us who can tell us lots of information, and 721 00:39:25,000 --> 00:39:29,880 Speaker 3: then we use our creativity and our extrapolation and simulation 722 00:39:29,960 --> 00:39:34,520 Speaker 3: of possible futures to put that together to make, to 723 00:39:34,560 --> 00:39:37,000 Speaker 3: make something that's the next step for our civilization. 724 00:39:37,360 --> 00:39:41,480 Speaker 1: All right, So if that can happen, and let's imagine 725 00:39:41,480 --> 00:39:46,680 Speaker 1: that it can, is there a possibility at a certain point, 726 00:39:46,840 --> 00:39:52,440 Speaker 1: if we can find the genetic and chemical recipe for 727 00:39:52,560 --> 00:39:58,840 Speaker 1: any individual's personality, that that can be... I'm asking the 728 00:39:58,880 --> 00:40:02,680 Speaker 1: singularity question. Can you put the mind and the soul 729 00:40:02,800 --> 00:40:10,240 Speaker 1: of a cognizant, coherent, sentient being inside something which is digital? 730 00:40:10,719 --> 00:40:13,520 Speaker 3: Okay, so that's the question of can you upload your brain 731 00:40:13,680 --> 00:40:16,719 Speaker 3: so you don't have to die. Here's how it would work.
732 00:40:16,800 --> 00:40:20,760 Speaker 3: It would work by taking a scan of your brain 733 00:40:21,239 --> 00:40:23,640 Speaker 3: at the kind of resolution that we can't even dream 734 00:40:23,760 --> 00:40:27,160 Speaker 3: of now. Right now, our very fancy brain imaging, 735 00:40:27,200 --> 00:40:31,640 Speaker 3: what we call fMRI, functional magnetic resonance imaging, is very crude. Okay, 736 00:40:31,960 --> 00:40:36,120 Speaker 3: but cut to one hundred years from now, our great 737 00:40:36,160 --> 00:40:39,160 Speaker 3: grandchildren are sitting around having a podcast with each other, 738 00:40:39,440 --> 00:40:42,399 Speaker 3: and the question is, could you scan a brain at 739 00:40:42,400 --> 00:40:44,879 Speaker 3: the resolution where you know every single neuron and all 740 00:40:44,880 --> 00:40:47,720 Speaker 3: the connections, and perhaps everything going on inside the neuron, 741 00:40:48,000 --> 00:40:51,560 Speaker 3: and reproduce that algorithm on a computer? The answer is 742 00:40:51,640 --> 00:40:54,560 Speaker 3: probably, probably you could do that, and therefore you could 743 00:40:54,600 --> 00:40:59,759 Speaker 3: download it and run Craig, or Craig's great-grandchild, such 744 00:40:59,840 --> 00:41:02,839 Speaker 3: that you really couldn't tell a difference. So I say 745 00:41:02,960 --> 00:41:06,120 Speaker 3: to the computer, hey, Craig's great-grandchild, are you in there? 746 00:41:06,440 --> 00:41:09,000 Speaker 3: And she says, yeah, I'm here, what's up? And you know, 747 00:41:09,040 --> 00:41:12,640 Speaker 3: we have a conversation. That's fascinating to me. 748 00:41:12,800 --> 00:41:16,359 Speaker 1: So the idea is that, I mean, I wouldn't hold 749 00:41:16,400 --> 00:41:20,920 Speaker 1: you to it, but sort of theoretically, sort of, kind 750 00:41:20,960 --> 00:41:22,120 Speaker 1: of, maybe. 751 00:41:22,719 --> 00:41:27,279 Speaker 3: Oh, theoretically yes. And it's because, as best we can 752 00:41:27,320 --> 00:41:30,959 Speaker 3: tell, this is just a machine in here. It's the 753 00:41:31,000 --> 00:41:34,440 Speaker 3: most complex, sophisticated thing that we have ever come across 754 00:41:34,480 --> 00:41:38,160 Speaker 3: in our universe, the human brain. But it's just a machine. 755 00:41:38,200 --> 00:41:41,360 Speaker 3: It's just built out of eighty-six billion neurons and 756 00:41:41,400 --> 00:41:44,080 Speaker 3: about the same number of glial cells, and it's, you know, 757 00:41:44,239 --> 00:41:46,520 Speaker 3: every neuron in your head is popping off these little 758 00:41:46,600 --> 00:41:51,760 Speaker 3: signals tens to hundreds of times per second. So it's 759 00:41:52,280 --> 00:41:55,799 Speaker 3: unbelievably complex. You've got something like two hundred trillion connections 760 00:41:55,840 --> 00:41:58,440 Speaker 3: between these neurons, and as I said, it's, it's like 761 00:41:58,480 --> 00:42:02,000 Speaker 3: a forest that's reconfiguring with every experience that you have. 762 00:42:02,040 --> 00:42:05,080 Speaker 3: So it's unbelievably complex. But it's a machine. And so 763 00:42:05,160 --> 00:42:07,160 Speaker 3: there's no reason that we should not be able to 764 00:42:07,280 --> 00:42:11,000 Speaker 3: reproduce that on silicon or whatever. I mean, in theory, 765 00:42:11,080 --> 00:42:13,920 Speaker 3: I could reproduce your brain out of beer cans and 766 00:42:13,960 --> 00:42:14,680 Speaker 3: tennis balls.
767 00:42:14,960 --> 00:42:18,839 Speaker 1: And if it's... definitely yeah, I don't think you need 768 00:42:18,880 --> 00:42:19,560 Speaker 1: the tennis balls. 769 00:42:21,560 --> 00:42:25,560 Speaker 3: And if it's doing the same algorithm, then, then it's you. 770 00:42:25,600 --> 00:42:27,560 Speaker 3: And if I say, hey, Craig, how you doing, you say, ah, 771 00:42:27,600 --> 00:42:31,560 Speaker 3: I'm a little hungry, whatever, but it's, it's you. Because 772 00:42:31,600 --> 00:42:33,759 Speaker 3: all we are is, you know, these vast machines. And 773 00:42:33,840 --> 00:42:35,200 Speaker 3: by the way, the reason we know that, I'm not 774 00:42:35,239 --> 00:42:36,920 Speaker 3: just saying this as an assertion. The reason we know 775 00:42:37,040 --> 00:42:41,239 Speaker 3: that is because of centuries of studying brain damage. If 776 00:42:41,239 --> 00:42:43,560 Speaker 3: you damage even a very tiny bit of your brain, 777 00:42:44,800 --> 00:42:47,879 Speaker 3: that can completely change your personality or decision making, 778 00:42:47,920 --> 00:42:52,560 Speaker 3: your ability to recognize animals, or see colors, or hear music, 779 00:42:52,840 --> 00:42:55,160 Speaker 3: or, you know, a thousand other things that we see 780 00:42:55,200 --> 00:42:57,360 Speaker 3: in the clinics every day. And that's how we know 781 00:42:58,120 --> 00:43:00,759 Speaker 3: that you are the operation of your brain, and when 782 00:43:00,800 --> 00:43:03,080 Speaker 3: little things change, that changes you. And by the way, 783 00:43:03,160 --> 00:43:06,560 Speaker 3: drugs and alcohol are just invisibly small molecules that get 784 00:43:06,600 --> 00:43:08,560 Speaker 3: in your bloodstream and change the functioning of your 785 00:43:08,600 --> 00:43:13,719 Speaker 3: brain, and that changes you. You know, when you sleep each night, 786 00:43:13,719 --> 00:43:15,640 Speaker 3: you go into deep sleep and then you're not even 787 00:43:15,640 --> 00:43:17,880 Speaker 3: there anymore, and then, you know, when you wake up, 788 00:43:17,920 --> 00:43:20,080 Speaker 3: it sort of reboots the whole system and so on. 789 00:43:20,320 --> 00:43:22,719 Speaker 3: But the point is it's all happening in these three 790 00:43:22,800 --> 00:43:25,680 Speaker 3: pounds here. That is you. That is fascinating. 791 00:43:25,920 --> 00:43:30,760 Speaker 1: So I have to ask you personally, what... does anything 792 00:43:30,920 --> 00:43:32,120 Speaker 1: frighten you about this? 793 00:43:33,520 --> 00:43:37,840 Speaker 6: I don't think so. I mean, just because... what you're saying, 794 00:43:37,840 --> 00:43:39,759 Speaker 6: I think what you're saying would frighten a lot 795 00:43:39,800 --> 00:43:45,760 Speaker 6: of people who are committed to different theological or psychological 796 00:43:45,960 --> 00:43:50,760 Speaker 6: or philosophical ideas, and you're, and you're saying some stuff 797 00:43:50,800 --> 00:43:51,839 Speaker 6: that would challenge that. 798 00:43:51,920 --> 00:43:55,560 Speaker 3: I think from a scientific point of view, yeah. I mean, 799 00:43:55,640 --> 00:43:58,160 Speaker 3: I guess I can't say that... people have all kinds 800 00:43:58,160 --> 00:44:00,520 Speaker 3: of, you know, with eight point three billion people on 801 00:44:00,560 --> 00:44:02,960 Speaker 3: the earth, there are that many views on the world. 802 00:44:03,040 --> 00:44:09,799 Speaker 3: But this view, I would assert, is the only 803 00:44:09,840 --> 00:44:13,280 Speaker 3: one that's defensible.
I mean, you could have whatever faith 804 00:44:13,360 --> 00:44:15,439 Speaker 3: you have, whatever deity you have, but if you walk 805 00:44:15,480 --> 00:44:18,240 Speaker 3: into a neurology ward and you see patients with different 806 00:44:18,280 --> 00:44:20,640 Speaker 3: brain damage and they have different things going on, I 807 00:44:20,640 --> 00:44:25,319 Speaker 3: don't know how one would explain that otherwise except to say, yeah, 808 00:44:25,680 --> 00:44:29,520 Speaker 3: you are your brain. I guess the part that frightens me, 809 00:44:29,880 --> 00:44:33,560 Speaker 3: but it's sort of a calm, mellow fright, is just 810 00:44:33,920 --> 00:44:37,920 Speaker 3: how fragile we are as creatures. But everybody knows that anyway. 811 00:44:37,960 --> 00:44:41,080 Speaker 3: All it takes is a stroke, you know, a little 812 00:44:41,320 --> 00:44:44,480 Speaker 3: clot that gets in there, or traumatic brain injury or 813 00:44:44,520 --> 00:44:47,000 Speaker 3: a brain tumor or whatever, and then you're not even 814 00:44:47,040 --> 00:44:50,239 Speaker 3: you anymore. So I guess that part is frightening, but 815 00:44:50,320 --> 00:44:54,520 Speaker 3: I'm so used to thinking about that. Yeah, it's 816 00:44:54,680 --> 00:44:56,440 Speaker 3: fascinating, endlessly. 817 00:44:56,719 --> 00:44:58,719 Speaker 1: I think that must be one of the attractions of it, 818 00:44:58,800 --> 00:45:02,840 Speaker 1: surely, is the fact that it, it's endless. It's endless, 819 00:45:02,880 --> 00:45:06,080 Speaker 1: there's no, there's no endpoint where you go, well, that's 820 00:45:06,120 --> 00:45:08,480 Speaker 1: the brain done, let's move on to the kidneys. 821 00:45:08,640 --> 00:45:12,640 Speaker 1: There's just... it's a long way to go. 822 00:45:13,960 --> 00:45:18,520 Speaker 3: Yeah, that's exactly right. And you know what's so cool 823 00:45:18,719 --> 00:45:21,600 Speaker 3: is, so I've been in the field now for, wow, 824 00:45:21,920 --> 00:45:25,240 Speaker 3: well over a quarter century, and the progress that I've seen. 825 00:45:25,600 --> 00:45:28,920 Speaker 3: But I would say exactly what the, you know, what 826 00:45:28,960 --> 00:45:31,759 Speaker 3: the obstetrician said about pregnancy, I would say the same 827 00:45:31,800 --> 00:45:34,600 Speaker 3: thing about the brain, which is, we have so 828 00:45:34,760 --> 00:45:37,799 Speaker 3: much knowledge now, we know less and less as a percentage. 829 00:45:38,719 --> 00:45:40,399 Speaker 3: You know, we have this book in the field called 830 00:45:40,440 --> 00:45:43,720 Speaker 3: Principles of Neuroscience, and it's enormous. It's about a thousand 831 00:45:43,800 --> 00:45:45,760 Speaker 3: pages long at this point, you know, it's the umpteenth 832 00:45:45,920 --> 00:45:48,760 Speaker 3: edition of this, of this book. But what's very funny 833 00:45:48,760 --> 00:45:51,160 Speaker 3: about it is it keeps growing longer with each edition. 834 00:45:51,680 --> 00:45:54,520 Speaker 3: And it's not principles, because if it were principles, it 835 00:45:54,520 --> 00:45:58,160 Speaker 3: would be like a pamphlet. But instead, as we get 836 00:45:58,239 --> 00:46:00,919 Speaker 3: more and more data, we just keep putting stuff in there. 837 00:46:01,000 --> 00:46:02,640 Speaker 3: We say, oh, well, but there's also this, and there's 838 00:46:02,680 --> 00:46:04,360 Speaker 3: that, and these kinds of cells and those kinds of genes.
839 00:46:04,719 --> 00:46:07,400 Speaker 3: And what it demonstrates is we don't have the principles 840 00:46:07,480 --> 00:46:11,200 Speaker 3: yet that allow us to have some sort of compression 841 00:46:11,320 --> 00:46:11,680 Speaker 3: of this. 842 00:46:12,560 --> 00:46:14,640 Speaker 1: Does it have an effect on you on a personal level? 843 00:46:14,719 --> 00:46:18,120 Speaker 1: Do you drink alcohol? Do you? Do you take drugs? 844 00:46:18,160 --> 00:46:20,880 Speaker 3: Do you? I mean, you don't have to... in a 845 00:46:20,960 --> 00:46:23,520 Speaker 3: vague way? Do you know? No, I don't do any 846 00:46:23,560 --> 00:46:29,120 Speaker 3: of that, actually. Is that because of brain damage? Yeah? Yeah, no, 847 00:46:29,200 --> 00:46:32,719 Speaker 3: I think, right. I think it's because I'm sort of 848 00:46:32,719 --> 00:46:34,840 Speaker 3: a control freak about my brain. I just want to 849 00:46:34,920 --> 00:46:37,200 Speaker 3: keep this as healthy as I can. Yeah, so I 850 00:46:37,200 --> 00:46:39,400 Speaker 3: don't do any of that. I'm the last guy in 851 00:46:39,440 --> 00:46:42,680 Speaker 3: Silicon Valley that hasn't done psychedelics and had interesting trips 852 00:46:42,680 --> 00:46:45,480 Speaker 3: and so forth, and everyone asks me about that around here, 853 00:46:46,600 --> 00:46:48,960 Speaker 3: but I, yeah, I just, I haven't taken it. It's 854 00:46:49,000 --> 00:46:51,600 Speaker 3: probably, it's probably so low risk to do that, but 855 00:46:51,680 --> 00:46:58,239 Speaker 3: I just... maybe that's a fear. It's the... my consciousness 856 00:46:58,280 --> 00:47:00,640 Speaker 3: is a very particular thing, and I know that if 857 00:47:00,680 --> 00:47:04,440 Speaker 3: I stick in these invisibly small molecules, it'll change the, 858 00:47:05,719 --> 00:47:08,840 Speaker 3: you know, the receptors or whatever. It'll change the activity 859 00:47:09,120 --> 00:47:12,480 Speaker 3: just a few percent, and I'll be talking to silver leprechauns. 860 00:47:13,120 --> 00:47:15,200 Speaker 3: But I don't. I don't want to change that. I 861 00:47:15,200 --> 00:47:18,319 Speaker 3: don't want to mess up the system because this is 862 00:47:18,360 --> 00:47:22,279 Speaker 3: all I've got. And what about physical health, do you? 863 00:47:22,520 --> 00:47:24,959 Speaker 1: I mean, it kind of, obviously your body feeds your brain, 864 00:47:25,080 --> 00:47:28,680 Speaker 1: I mean everything. So I mean, do you find yourself 865 00:47:29,640 --> 00:47:34,720 Speaker 1: avoiding certain foods, or, or avoiding certain activities? 866 00:47:34,320 --> 00:47:37,799 Speaker 3: Or, or... yeah. Basically, I keep in really 867 00:47:37,800 --> 00:47:42,680 Speaker 3: good shape and I, I eat healthfully. Yeah, I just, 868 00:47:42,760 --> 00:47:45,400 Speaker 3: I try to make sure that I'm optimizing everything I 869 00:47:45,440 --> 00:47:46,360 Speaker 3: can on that front. 870 00:47:47,040 --> 00:47:48,759 Speaker 1: Are you hyper aware of it? Like if you 871 00:47:48,760 --> 00:47:51,680 Speaker 1: eat, if you eat some sugar, do you, do you 872 00:47:51,719 --> 00:47:52,080 Speaker 1: feel it? 873 00:47:52,120 --> 00:47:54,400 Speaker 3: Do you know what it's doing? Yeah, I know what 874 00:47:54,440 --> 00:47:58,359 Speaker 3: it's doing. I, we, let's see. I don't think I'm 875 00:47:58,400 --> 00:48:01,239 Speaker 3: hyper aware of it. But I, I'm not even attracted 876 00:48:01,280 --> 00:48:04,840 Speaker 3: to sugary things. I don't even like that.
But I 877 00:48:04,920 --> 00:48:07,160 Speaker 3: think, I would say, not anymore. Obviously when I was a child, 878 00:48:07,200 --> 00:48:10,280 Speaker 3: I did. But the more, the more I care about 879 00:48:10,280 --> 00:48:12,920 Speaker 3: optimizing the whole system, the less I'm even interested in 880 00:48:12,920 --> 00:48:14,600 Speaker 3: those sorts of things. 881 00:48:14,800 --> 00:48:18,400 Speaker 1: Let's, very quickly, because I've taken up an enormous amount 882 00:48:18,520 --> 00:48:19,000 Speaker 1: of your time. 883 00:48:19,040 --> 00:48:20,080 Speaker 3: But I'm fascinated by this. 884 00:48:20,480 --> 00:48:23,719 Speaker 1: What does... because sugar is a thing for me. What 885 00:48:23,800 --> 00:48:26,360 Speaker 1: does sugar do in your brain? 886 00:48:27,440 --> 00:48:31,360 Speaker 3: Well, you know, so you increase blood sugar in your body, 887 00:48:31,400 --> 00:48:33,880 Speaker 3: and there are all kinds of bad effects that that 888 00:48:34,000 --> 00:48:38,239 Speaker 3: can have just physically. In the brain, it's actually not 889 00:48:38,320 --> 00:48:40,920 Speaker 3: a bad thing, because it's, it's an energy source. It's 890 00:48:40,920 --> 00:48:43,279 Speaker 3: a quick energy source. So you know, if you're, if 891 00:48:43,280 --> 00:48:45,239 Speaker 3: you're really tired and you need to do something, it's 892 00:48:45,239 --> 00:48:48,800 Speaker 3: probably, it's probably not a bad idea. But on the rest 893 00:48:48,840 --> 00:48:51,040 Speaker 3: of the body, it wreaks havoc over time. 894 00:48:51,280 --> 00:48:55,240 Speaker 1: Yeah, it's, well, it's a, it's a fascinating thing, David. 895 00:48:55,360 --> 00:48:58,160 Speaker 1: And, and thank you for being so patient with me, 896 00:48:58,160 --> 00:49:01,560 Speaker 1: because I really do know nothing about this, but I 897 00:49:01,560 --> 00:49:03,480 Speaker 1: feel like I know a little more now. And now 898 00:49:03,520 --> 00:49:05,560 Speaker 1: that I know a little more, I really know how 899 00:49:05,640 --> 00:49:06,160 Speaker 1: little I 900 00:49:06,160 --> 00:49:10,520 Speaker 3: know about what I was talking about. But it's a 901 00:49:10,560 --> 00:49:11,360 Speaker 3: fascinating subject. 902 00:49:11,560 --> 00:49:13,000 Speaker 1: And I hope you'll come back and talk to us 903 00:49:13,040 --> 00:49:18,720 Speaker 1: again, because it's, it really is an endlessly interesting piece 904 00:49:18,760 --> 00:49:22,040 Speaker 1: of the world that you're involved in. Great, probably the 905 00:49:22,040 --> 00:49:25,839 Speaker 1: most important piece, I guess. I kind of think so. 906 00:49:25,920 --> 00:49:27,680 Speaker 3: I mean, it is the center of who we are. 907 00:49:27,760 --> 00:49:31,320 Speaker 3: There's really... if you want to understand something about the self, 908 00:49:31,400 --> 00:49:34,080 Speaker 3: one can take, you know, spiritual classes, psychology classes and 909 00:49:34,080 --> 00:49:37,040 Speaker 3: stuff like that, and those are probably good inroads, but fundamentally, 910 00:49:37,560 --> 00:49:40,840 Speaker 3: this is the perceptual machinery by which you view the 911 00:49:40,880 --> 00:49:44,680 Speaker 3: whole world.
So this is probably the best inroad there 912 00:49:44,719 --> 00:49:47,080 Speaker 3: is to understand what the, what the heck we're doing here, 913 00:49:47,200 --> 00:49:49,319 Speaker 3: what your perception of the world is, and why you 914 00:49:49,360 --> 00:49:51,360 Speaker 3: react the way you do, why you have the feelings 915 00:49:51,440 --> 00:49:53,319 Speaker 3: and emotions you do, why you think the way you do, 916 00:49:53,440 --> 00:49:57,200 Speaker 3: and so on. And by the way, you know, you know, 917 00:49:57,280 --> 00:50:00,120 Speaker 3: I think... I've got this podcast called Inner Cosmos, and 918 00:50:00,200 --> 00:50:02,359 Speaker 3: what I do is every week I talk about this 919 00:50:02,440 --> 00:50:07,680 Speaker 3: intersection between the brain and daily life and why we 920 00:50:07,920 --> 00:50:09,319 Speaker 3: experience the world the way we do. 921 00:50:10,120 --> 00:50:13,439 Speaker 1: It is a fascinating idea, and I will watch your 922 00:50:13,480 --> 00:50:17,359 Speaker 1: Inner Cosmos and listen to your inner cosmos and investigate 923 00:50:17,400 --> 00:50:19,120 Speaker 1: your inner cosmos. 924 00:50:19,560 --> 00:50:21,239 Speaker 3: And see if it can help me. But I am 925 00:50:21,239 --> 00:50:23,279 Speaker 3: fascinated by it, and it really is great. David, thank 926 00:50:23,280 --> 00:50:25,520 Speaker 3: you so much for being around. Great, thanks Craig, it's 927 00:50:25,520 --> 00:50:28,880 Speaker 3: such a pleasure to be here. Thanks, buddy,