Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you, listeners? We have a treat today. We've got a guest on the show. He's a neuroscientist, he's a technologist. Most importantly, he's a fellow podcaster: David Eagleman, host of Inner Cosmos with David Eagleman. Welcome to the show.

Speaker 3: Thanks, Jonathan. Great to be here.

Speaker 1: David, it's a pleasure to have you on. I think you may only be the first or second person I've ever had on the show who has done a TED talk. I've had film directors on the show. I've had production designers for things like Stranger Things on the show. But it's rare that I get a chance to sit down with a neuroscientist. So this is going to be a real pleasure, a great pleasure for me too. Well, let's start off by talking a bit about Inner Cosmos, because, you know, this show has launched, it's got thirty episodes out as of the time we're recording this episode, thirty I think published the day before we're recording this. That's right. And first I want to applaud you for your restraint, that it took seventeen episodes before you gave us the definitive answer on what is consciousness.

Speaker 3: Yeah, well, you know, the show, as you well know, it's about, you know, the intersection of the brain with our lives, our daily lives, and right, so it took a little while to work up to consciousness. Unfortunately, we still do not have a definitive answer to that, and it's, you know, this is really at the center of the mystery of neuroscience: how do you put together eighty-six billion neurons and have all this, you know, three-pound glob of wet biological stuff that is you? But why does it feel like anything to be you? Why aren't you just, you know, my Mac laptop here has a whole bunch of transistors, but presumably it doesn't find things funny or interesting or painful or curious. Yeah.

Speaker 2: Yeah.

Speaker 1: I often talk about it in the sense of the ability to have an experience versus the ability of just being able to process information, and a lot of what you talk about has obvious crossover into the world of technology. But beyond that, like when we start talking about things like artificial intelligence or artificial neural networks, that's where we start really having this interesting cross talk, because I get the sense that often the technologist side has maybe a superficial understanding or sometimes a complete misunderstanding of the neuroscience, but they're trying to use neuroscience to describe their technologies, and meanwhile you have neuroscientists who are often trying to straighten out the concepts, and it can become a complicated conversation where people are talking on two different levels. So one of the things I wanted to ask you actually relates to this, which is, as a neuroscientist and as a technologist, how do you actually define artificial intelligence? Because it's such an enormous discipline, and I feel like a lot of the reporting around it gets very reductive.

Speaker 3: Yeah, all great questions. Here's the thing, you know, we don't even know exactly how to define natural intelligence. What I mean is, people have known, for example, since the nineteen twenties that the IQ test measures something useful and it correlates with one's success in life along various metrics, career and so on. But there's a big debate about what intelligence is in humans. Is it the ability to better simulate possible futures and evaluate them? Is it the ability to squelch distractors, and so on? There's lots of different ideas about what intelligence might be, but we don't know. It's probably a word that has too much semantic weight on it, meaning that it's not just one thing. Anyway, so when you ask the question of what is artificial intelligence, it can mean a lot of things, and people mean different things by it. And so the way we're building AI is the way we build any sort of machine where you're trying to get the machine to do something for you that you don't want to do. So, for example, you know, Google Maps knows every street, every place I should go, and the best way to get there. That is artificial intelligence in the sense that it's way better than I would be at memorizing an entire atlas, and so that's a very useful machine for me. You know, ChatGPT compiling the world's data and being able to tell it to you in the structure of a human speaking, it's sort of like what we appreciated about Google, which is this compilation of all the world's data, but now it can just spit it out to you in various ways. Lots of ways to think about what artificial intelligence is. But I do want to say something, which is that, you know, AI budded off from neuroscience, because neuroscientists figured out a while ago, look, you know, you've got neurons. These neurons are connected to one another. Now, as it turns out, neurons are really complicated. Each neuron is about as complicated as a city. It's got the entire human genome in it. It's trafficking millions of proteins around very complicated cascades, and it's constantly plugging and unplugging and seeking and plugging into other neurons and so on. And so people a while ago came up with this idea of an artificial neural network, where they said, look, what if we just have units that are connected to each other, and those connections have particular strengths that can change. And so, as it turns out, that has been the trick that has led to this renaissance in AI, that's been incredible, but it's also departed from the biology.
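To make the idea concrete, the "units whose connection strengths can change" that Eagleman describes can be sketched in a few lines of Python. This is a generic toy illustration, not code from his lab or any product mentioned here; the layer sizes, learning rate, and task are arbitrary choices for the example.

```python
# Minimal illustration of an "artificial neural network": units connected by
# weights (connection strengths) that change with training. Toy sketch only.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR, which a single layer of units cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of "units"; W1 and W2 hold the connection strengths.
W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass: activity flows through the connections.
    h = sigmoid(X @ W1)       # hidden unit activations
    out = sigmoid(h @ W2)     # output unit activation

    # Backward pass: work out how each connection strength should change.
    delta_out = (out - y) * out * (1 - out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)

    # "Learning" is nothing more than nudging the connection strengths.
    W2 -= 0.5 * (h.T @ delta_out)
    W1 -= 0.5 * (X.T @ delta_h)

print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))  # approaches [0, 1, 1, 0]
```

The contrast with the biology that he draws next is visible even here: nothing in this sketch has a genome or trafficks proteins; learning is just repeated small adjustments to a matrix of numbers.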
Speaker 3: I would say they're now sort of off in different directions. It's a very cartoonish version of the biology. In some ways it is much better than the biology, and in other ways it's much worse. Just as an example, AI, even given the absolutely mind-blowing things it's doing right now, it's not as good as, let's say, a five-year-old is at navigating a complex room and socially manipulating adults to get what she wants, and, you know, all these things that, you know, getting food to her mouth when she's hungry and so on. There's a lot of things that humans can do. You know, I should say that for AIs, there's still gonna be work ahead to get there.

Speaker 1: Yeah, yeah, I remember one of my favorite stories was from a few years back when I was at South by Southwest, and a roboticist was talking about a project in which they had a robot that was capable of moving through human environments, and they were attempting to have the robot figure out how to open a door. And they said the process of that had the robot standing in a corridor for like three days, just analyzing the door and getting in the way of people who needed to go places. Like everyone hated the robot by the end of this. But they were explaining, like, this is something where, as a human, once you have the experience with a door, you can start to reason out how to interact with other doors, even if the mechanism is different. Right, you can start to interpret how to interact with this barrier, and it may mean that you have to try a couple of different things before you successfully open the door, but you have an idea. But in artificial intelligence, that's something that we're only starting to approach right now. Another great example was when DARPA did the Robotics Challenge, where they had the humanoid robots that needed to complete a series of tasks like driving a vehicle, getting out of it, walking through a door, picking up a tool, using it, and they found that one of the toughest challenges was walking through the door. There's so many videos. Don't look it up, people, it will break your heart. But so many videos of very expensive robots face-planting while trying to walk through a doorway. And it just shows that these things that can come very easily to humans do not necessarily come easily to machines, whether we're talking about AI software running in the background or a robot.

Speaker 3: That's true, but let me jump in with one question about what we mean by come easily to humans, because what's interesting is we've had millions of years of evolution building this incredibly complex structure, the most complex thing we've ever found on the planet, the human brain. And so what seems easy to us is actually not easy at all, of course, for Mother Nature to have built. But yes, I totally agree with you. Anything physical is something that AI still has a long way to go to get.

Speaker 1: Yeah, and to your point, yes, the human brain is a phenomenon of evolution, and it's what makes this comparatively easy for us when you, you know, compare that against a machine. I had mentioned earlier about your TED talk. I loved it because it was all about using technology to expand human senses. And also, you put it extremely well about how our understanding of the world around us, our understanding of the universe, is obviously funneled through our senses, which means when you look at what our senses are capable of, you realize we're experiencing the tiniest fraction of what's actually going on out there. That there are entire frequencies of the electromagnetic spectrum that we cannot observe directly, and that's just part of the limitations we have on ourselves. And when we think about that, and we think about technology having the capacity to expand that, that's really a fascinating thing. And I thought it was great because I had also done stories about biohacking and grinders, people who are kind of taking the underground auto-mechanic-garage approach to this sort of thing, where you might be talking about implanting magnets in your fingertips so you could sense electromagnetic fields. Usually it means directing stimuli in a way that you would pick up with one of your traditional senses, not creating a new one necessarily, sort of the same way that, you would say, like infrared goggles translate information into the visual spectrum so we can see it. But I'm curious, since that talk, since you've given that talk, where you were looking into different ways to bring stimuli in and have the brain interpret it and perhaps even augment or create new senses, have you done any more work along those lines since you gave that talk? I would love to hear about it.
Speaker 3: Yeah, great, great. So maybe I should back up just for one second, for those listeners who haven't heard the talk. It was just about this issue of how the brain is locked in silence and darkness inside your skull. It doesn't see any of the world, doesn't hear any of the world. All it ever has are these little electrical spikes zipping around among neurons, and then release of neurotransmitters. But that is to say, it's just this electrochemical world buzzing around in the dark, and yet it constructs this whole cosmos for you, you know, the colors and the lights and the joy and the smells and the tastes and the touch and all that. So this is, you know, we actually mentioned this at the beginning about the mystery of consciousness. But this is what brains are good at doing, is extracting patterns and assigning meaning to those and figuring out how to navigate itself in the world. So what I got interested in is, you know, if I were to show you a little piece of, you know, of the brain, let's say I could do this in real life with, you know, like drilling a little hole in there, and you get to see all the action, the activity, inside that little piece of brain. And I said to you, hey, what part of the brain is this? Is this visual information or auditory or touch or smell? You wouldn't be able to tell me. I wouldn't be able to tell you, because it looks the same everywhere. The cortex, the wrinkly bit on the outside of the brain, is actually the same structure all across the brain, even though we tend to call this part, oh, that's your visual center, and here's your auditory system, and here's your somatosensory system for touch, and so on. In fact, it's all exactly the same structure. It's just a matter of which data cables are getting plugged into there. So that's the background. Whatever data you send in there, it figures out what to do with it and how to make it relevant. So I got interested in this question of sensory substitution, which is, can we feed completely new kinds of senses into the brain? And so what I presented at that talk, as you know, of course, Jonathan, is a vest with vibratory motors on it, and we were able to show, for example, that deaf people, people who could not hear, who are profoundly or severely deaf, they could come to understand the auditory world just by patterns of vibration on their skin. This is what your inner ear does. You know, when you capture air compression waves on your eardrum, that goes through a few stages, then it gets broken up from high to low frequency, then shipped off to your brain as patterns of spikes. All we're doing is capturing sound on microphones and breaking it up from high to low frequency and putting that into different parts of the skin as vibrations. And it turns out that people who are deaf could come to hear this way. And so what we have done, to answer your question, what we've done since then, is we have shrunk the vest down to a wristband, and I spun this company off from my lab called Neosensory, and the wristband does exactly the same thing as the vest. It's capturing sound in here, it's got full computational capacity, and it has these vibratory motors along the inside of the wrist, multiple motors, and it's turning that sound into patterns of vibration on the skin of your wrist, and people who are deaf can come to hear this way. The really wild part is that after about six months, when I ask them, hey, when you hear a sound, what is it? You know, do you hear a dog bark and you feel a buzz on your wrist and you think, oh, what was that? Must be a dog bark? And they say, no, I just hear the dog. And I say, well, what do you mean by that? They say, I just, I hear the dog out there. Which is wild, right? Except that's all that your ears are doing too. I mean, we've long stopped appreciating that because you've been using your ears since you were a baby. But for goodness' sakes, all you have are spikes running around, and yet you think, oh, I'm hearing Eagleman's voice here, from out there, whereas in fact all of the sound is just happening in your head. It's the interpretation of an auditory signal that's happening in your consciousness. So it turns out it doesn't matter how you push the information in there, you get the same thing happening there. And we also, we have several products on the market now, and we're on wrists all over the world. One of them is, you know, to replace hearing aids.
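A rough sketch of the pipeline described above (capture sound, split it from low to high frequency, drive one vibration motor per band) might look like the following. The sample rate, band count, and motor interface are invented for illustration; this is not Neosensory's actual firmware or algorithm.

```python
# Sketch of the sensory-substitution idea described above: capture sound,
# split it into frequency bands, and drive one vibration motor per band.
# Sample rate, band count, and the "motor" interface are illustrative only.
import numpy as np

SAMPLE_RATE = 16_000   # Hz, assumed microphone rate
FRAME_SIZE = 512       # samples per analysis frame (~32 ms)
NUM_MOTORS = 4         # hypothetical motors around the wrist

# Log-spaced frequency bands, loosely like the cochlea's high-to-low breakdown.
band_edges = np.logspace(np.log10(100), np.log10(8000), NUM_MOTORS + 1)

def frame_to_motor_levels(frame: np.ndarray) -> np.ndarray:
    """Return one vibration intensity in [0, 1] per motor for an audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    levels = np.zeros(NUM_MOTORS)
    for i in range(NUM_MOTORS):
        in_band = (freqs >= band_edges[i]) & (freqs < band_edges[i + 1])
        levels[i] = spectrum[in_band].sum()
    levels = np.log1p(levels)                 # compress dynamic range
    return levels / levels.max() if levels.max() > 0 else levels

# Demo: a 200 Hz tone should mostly drive the lowest-frequency motor.
t = np.arange(FRAME_SIZE) / SAMPLE_RATE
print(np.round(frame_to_motor_levels(np.sin(2 * np.pi * 200 * t)), 2))
```

Each analysis frame ends up as a handful of motor intensities, which is roughly the "patterns of vibration on the skin" standing in for what the cochlea would otherwise send to the brain as spikes.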
Speaker 3: So most people, as they get older, they lose their high frequencies, because, yes, high frequencies, you know, that has the most energy, and it kills the little cells inside your ear eventually, so they stop hearing high frequencies. And that's why often, as people get older, they have a harder time understanding women and children, because their voices tend to be a slightly higher register. And so what we're doing is using exactly the same wristband. It's using cutting-edge AI to listen in real time for different high-frequency parts of speech called phonemes. And when it detects, oh, there was an S, there was a K, there was a V, there was a T, and so on, it buzzes in different ways to tell you. And so what's happening is that somebody's ear is doing fine at the medium and low frequencies, and the wristband is simply clarifying what is happening at the high frequencies. And so that turns out to be a really great solution. It takes about a week or two for people to get used to it, to understand how to use it. It's all unconscious learning, but they just get better and better at understanding what's being said. And I'll just mention one other thing, which was sort of, it's fascinating that it worked, which is for tinnitus, which is ringing in the ears. What we do is we have people wear the wristband and we play, just for ten minutes a day, we play a series of tones, boom, boo, boo boo, boom boom, boom boom, and as you're hearing the tones, you're feeling the buzz on the wristband. And it turns out this drives down tinnitus as well as any solution that exists. And the reason is, there are complicated arguments for this, but I think the simple reason this works is because you're teaching the brain what is a real sound, boom boom boom boom, because you're getting verification on the wristband, but the internal ringing in the ears doesn't get any confirmation, and so the brain says, okay, that's fake news, and drives it down.

Speaker 1: I'm going to need to get one of these wristbands. I spent a lot of my youth going to the University of Georgia and all the little rock clubs in the surrounding area, and I have paid for my decisions. Kids, use earbuds, or not earbuds, earplugs. Use earplugs when you go to the shows, because trust me. Yes, trust me. We're gonna have to take a quick break, but when we come back, we have more of my conversation with David Eagleman.

Speaker 1: The other thing I wanted to point out, for folks who may not have picked up on it, is that this solution where you're using this tactile feedback, where it's taking in sound and creating this, not only is it unconscious learning, it obviously does not require invasive surgery. It can be orders of magnitude less expensive and thus far more accessible, which means that it's a technology that can help the hard of hearing, or beyond hard of hearing, to experience the world on a greater scale than they otherwise could, which is always a wonderful story to tell when you're talking about tech, because often we're talking about tech solutions that are not really solutions. They are conveniences for a select few.

Speaker 3: Yeah, thank you. I mean, this is why we're on wrists all over the world now, and lots of deaf schools in really impoverished countries as well, precisely because, you know, the only other option for somebody who is deaf is a cochlear implant, which is one hundred thousand dollars and an invasive surgery. So yeah, this makes a big difference, and it works so well just to push the information into the brain via an unusual channel.

Speaker 2: It's fascinating to me.
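The high-frequency cueing Eagleman describes a moment earlier (detecting sounds like S, K, V, and T and buzzing in distinct ways) reportedly relies on learned phoneme detection; a much cruder stand-in still shows the shape of the idea. The cutoff frequency, thresholds, and pattern names below are all invented for illustration.

```python
# Toy stand-in for the high-frequency cueing described above. The real device
# reportedly uses learned phoneme detection; here we only flag frames whose
# energy sits mostly above ~4 kHz (where S, K, V, T sounds live) and map them
# to a named buzz pattern. Cutoff, thresholds, and pattern names are invented.
import numpy as np

SAMPLE_RATE = 16_000
HIGH_CUTOFF_HZ = 4_000

def high_frequency_cue(frame: np.ndarray) -> str | None:
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    total = spectrum.sum()
    if total < 1e-9:
        return None                    # silence: nothing to cue
    high_fraction = spectrum[freqs >= HIGH_CUTOFF_HZ].sum() / total
    if high_fraction > 0.6:
        return "buzz-pattern-A"        # strongly high-frequency, e.g. a long "s"
    if high_fraction > 0.25:
        return "buzz-pattern-B"        # mixed content, e.g. a "t" or "k" burst
    return None                        # low/mid frequencies: the ear handles these

# Demo: broadband noise gets a cue, a low vowel-like tone does not.
rng = np.random.default_rng(1)
t = np.arange(512) / SAMPLE_RATE
print(high_frequency_cue(rng.normal(size=512)),
      high_frequency_cue(np.sin(2 * np.pi * 250 * t)))
```

That division of labor is the point he makes: the ear keeps handling the low and mid range it still hears well, and the wrist fills in only the top of the spectrum.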
Speaker 1: The more you think about it, the more you start to come to the conclusion of, well, of course it works, when you start to break it down, as the brain is creating everything we're experiencing. It's just bringing in information from various sensors, and the nature of the sensor doesn't matter as much as how the brain can synthesize that data and turn it into something that we can experience as something meaningful.

Speaker 3: Exactly. I mean, those two spheres on the front of your head, those are just capturing photons and turning them into spikes. Your ear, your very sophisticated ear, is just capturing air compression waves and turning them into spikes. And your nose and your mouth are capturing mixtures of molecules and turning those into spikes. But it's all ending up in the same common currency.

Speaker 1: Well, David, one of the other things I wanted to mention to you. I recently was doing some episodes about the Ig Nobel Prizes, which, you know, are obviously a series of joke prizes that came out of a satirical science magazine. It's meant to either celebrate or sometimes razz people who have done various things in science. One of the things I thought was really interesting was they gave an Ig Nobel Prize to a group of people, Craig Bennett, Abigail Baird, Michael Miller, and George Wolford, who wrote a fascinating paper. The paper is titled "Neural Correlates of Interspecies Perspective Taking in the Post-Mortem Atlantic Salmon: An Argument for Proper Multiple Comparisons Correction." So this study was all about how, using very, very sensitive equipment, mostly fMRI actually, and through misuse of statistics, it could be very easy to get false positives with things of this nature. And thus what it means is that we need to be very, very careful when designing controls when we're using things like fMRI to draw conclusions. And I'm curious, in your work as a neuroscientist, in your work as a technologist, have you encountered this sort of thing, where there's always this emphasis on moving fast, you know, move fast and break things, and to get to the end product as quickly as possible? But in the process you have to make sure that the science is right, or else the conclusions you draw, the solutions you make, aren't really addressing anything. Have you encountered things of that nature?

Speaker 3: Sure. I mean, spending a career in science, one deals with this all the time, which is to say, you know, science constantly tries to drive forward, but it's always two steps forward, one step back, in the sense, I mean, this fMRI paper with the salmon, for anyone who doesn't know, it's, you know, they put a dead salmon in the scanner and were able to demonstrate that if you do the math in a certain way, it looks like there's activity in the salmon's brain, even though there can't possibly be. So this is actually not an indictment of functional magnetic resonance imaging, the brain imaging technology that we use. Instead, it's just an indictment of how you do the statistics and so on. This is an incredibly complicated device, fMRI, and, you know, the way you analyze the data requires, you know, a long time to get up to speed on it. But what it allows us to do is see what's happening in the brain, in the live brain of a human, when they're doing a task or thinking about something. It's been absolutely incredible, the amount of data that's come out of this. But what their paper shows is that, hey, you just have to make sure you're doing the statistics right and so on.
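The statistical point behind the salmon paper can be illustrated with a short simulation: run a significance test on thousands of pure-noise "voxels" and, uncorrected, a few hundred will look active at p < 0.05. This is a generic sketch, not the authors' analysis; the voxel and scan counts are arbitrary, and Bonferroni is just the simplest of the multiple-comparisons corrections their title refers to.

```python
# Illustration of the multiple-comparisons point behind the dead-salmon paper:
# test thousands of pure-noise "voxels" and, uncorrected, roughly 5% will look
# "active" at p < 0.05 even though no signal exists anywhere. Counts are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_voxels = 10_000          # pretend brain volume
n_scans = 20               # pretend measurements per voxel
alpha = 0.05

data = rng.normal(size=(n_voxels, n_scans))   # pure noise, no real signal anywhere

# One-sample t-test per voxel against a true mean of zero.
t_vals, p_vals = stats.ttest_1samp(data, popmean=0.0, axis=1)

print("uncorrected 'active' voxels:", int(np.sum(p_vals < alpha)))                     # ~500
print("Bonferroni-corrected 'active' voxels:", int(np.sum(p_vals < alpha / n_voxels)))  # ~0
```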
Speaker 3: And you know, like every field in science, you find lots and lots of papers that are publishing right at the p equals point zero five, you know, statistically significant value, meaning that, you know, people are saying, oh, we've just spent a year doing the study, and maybe it's not quite statistically significant, so there's some little amount of fraud that happens, look, let's do it this way. My mentor was Francis Crick, the guy who discovered the structure of DNA with Watson, and he, you know, he would read the journals every single week when they came out, and he said to me that he just assumes that twenty-five percent of what he reads is wrong, whether by fraud or accident or whatever, doesn't matter. But this is the thing: science is the only human endeavor we have that is constantly knocking down its own walls and being self-correcting. So, you know, of course there's mistakes and fraud and terrible stuff happens all the time in science, but at least that gets ironed out and straightened out, whereas in every other field of human endeavor it doesn't.

Speaker 1: Yeah, I thought the paper was incredibly valuable. It was one of those where, like, I get the idea of the Ig Nobel Prize going to it, because the whole idea of it makes you laugh and then it makes you think. But my argument was, now this is a statement to other researchers saying, just make sure that you're absolutely positive you're doing everything, you know, correctly, with scientific rigor, that it's replicable, that it's replicable by other scientists who are not directly working on whatever it is you're working on, so that you ultimately are getting to the right conclusion. Because otherwise you're going to feel that you're sort of wandering off the path, and perhaps, worst case scenario, you could say you're wasting time, or truly, worst case scenario, you could be misleading people. You know, you combine that with the world of, like, startups and Silicon Valley and the general philosophy that they have, and you can see that there's not parity between the two. And so I imagine it requires a lot of balancing to make certain that the science is there before the real big moves in technology. Otherwise you would run the danger of running into something like a Theranos, which is obviously a poster child for bad science.

Speaker 3: Yeah, exactly. I mean, again, the good news about science or startups is that if somebody is doing something that doesn't work, the truth outs. Yes. And you know, unfortunately, in science and in startups, this stuff happens all the time, again, you know, sometimes by fraud and often by mistake or accident. But you can't keep that ship running for very long. I mean, look, you know, the big thing that happened more recently is this issue about Alzheimer's disease, and you've got these plaques and tangles and so on. It turns out it was discovered that, about thirteen years ago, this guy who published the papers had actually fraudulently doctored the photographs to make it look like something was going on there. Cut to thirteen years of research and countless millions of dollars going into researching this thing that was all predicated on a fraud, and that sucks. But, you know, thirteen years is a long time, but yeah, it gets figured out at some point. Nothing's gonna last.

Speaker 1: Yeah, yeah, which is, that is something of a comfort, to know that in the long run it all gets hashed out. I mean, obviously, if you are someone who's caught up in it, in the middle of it, it can be life-changing in a really negative way. But there is comfort to know that the very nature of the process of science is one that will weed this stuff out. And I also argue all the time, like, you know, that science works, because if science didn't work, none of your technology would work. Your technology is proof science works. It is the actual manifestation, physical manifestation, of science. And it's also why I get very, let's say, passionate when I encounter people who are in some ways science deniers. It's unthinkable to me, because I think of how much of your life revolves around your interaction with things that literally would not work if science, if the scientific process, didn't work.

Speaker 3: I agree with you. But let me play devil's advocate here for just a second, which is, you know, given that science is wrong some of the time, and, you know, thank goodness it's self-correcting and we'll get it straightened out, I always have a little bit of, I give a little wiggle room to people who say, well, how do we know that's true? You know, maybe these scientists are influenced by politics or something. And that's true too, that happens all the time. And in fact, the grant money goes towards things that any society at any moment in time finds politically interesting or expedient to do. And, you know, science money doesn't go to those things that society doesn't care about, and so on. Anyway, whenever I hear somebody saying, hey, I don't think this is true, or that, how do we know climate change, how do we know the vaccines, whatever, I always feel like, good, man, keep asking the questions, and, you know, the truth will out as a result of that, and one side or the other will find out who's wrong. Right?

Speaker 1: Yeah, I'm certainly in favor of asking questions. I think I get frustrated with people who, when faced with an answer that they don't like, they're not interested in the process that arrived at that answer. They're interested in the answer changing. Right. Yeah, yeah, fair enough, fair enough.
Speaker 3: There's lots of wrong reasons to criticize science.

Speaker 1: But I mean, to your point, the whole of science is all about, like, I discovered this thing. I think it's a thing. Can you look at this thing and tell me if you also think it's a thing? And if enough people think it's a thing, then we have a scientific consensus about it being a thing, and then we can say we're fairly comfortable this is a thing. Maybe sometime in the future we'll discover, oh, we were totally wrong, that wasn't a thing at all. But for now we're comfortable in calling it a thing, which is a crazy generic way of describing it.

Speaker 3: No, I like the way you describe it, but it is important to, look, I'm totally on your side on this, but just to fill out the other side for a moment, all we ever have in science is the weight of evidence. So we never can conclude this is the truth or this is the thing, and so all we say is, look, all the data seems to indicate, seems to point in this direction. For example, we all believe in quantum mechanics because it's the best theory we have. It predicts experiments correctly to fourteen decimal places. We don't know it's true. We don't know it's true. I mean, you know, there may be something one hundred years from now we discover, oh, that's way better, by the way, it predicts experiments to thirty decimal places, and it's a totally different thing. And I mean, in the same way that, let's take Newtonian physics. You know, you drop a ball or, you know, shoot a cannonball or something, and it works pretty well, but it doesn't expand to deal with things at the very big or very small. Sure, and so we say, okay, well, it's good enough for here, it's fine. But yeah, yeah, I don't know. There's a little piece in my heart that always is fine with people questioning anything, because really, the whole history of science is that in every generation people have thought, okay, we've got all the puzzle pieces here, we sort of know the answer, and it's never been true once, ever. And just imagine trying to explain, you know, how the heart works before the discovery of electricity, or how the northern lights work before the discovery of, you know, photons and the magnetosphere around the planet and stuff like that. You'd be making up theories, but you would be doomed to be incorrect. And so whoever were the naysayers in those days, good on them for saying, I don't know, you know, just because all the scientists agree on this doesn't necessitate that it's correct.

Speaker 1: Well, and I do think that curiosity is probably one of the most important human qualities to have, right? You need to have curiosity, you need to have that desire to learn. And, like, I remember I was chatting with my niece, who was asking very deep questions for an eight-year-old, and I was explaining, in big terms, things like Newtonian physics and then quantum physics. And then she said, at what stage do you swap from describing something in Newtonian physics to quantum physics? I said, you have hit upon a question that no one has an answer to, and I know that's not satisfying to you, but this is where you have to know that adults don't know everything.

Speaker 3: God, exactly. But you know what the funny part is? It's not that your niece was asking deep questions, it's that she's just asking good questions. Yes, as someone looking at the whole thing with fresh eyes, like, wait, why do you guys believe in this and that? Yeah?

Speaker 1: Yeah, it's like the simpler way. I mean, when I was a kid, I was asking, well, what makes that a hill and that a mountain if they're both the same height, which just gets into semantics, right, right. But it's great, like, I'm glad that she's thinking of things in her way and that she's gonna totally leave me in the dust, which is fantastic, that's what I want to see. But exactly, you know, it's that the question seems deep to us only because we have forgotten to ask it anymore.

Speaker 3: That's the frustrating part as we get older, because, look, the job of the brain is to make an internal model of the outside world, to say, okay, this is how this works, this is how that works. And as we get older and older, we say, okay, I've kind of got this figured out. I know how the world works. I know how people respond to this. I know how to ride a bicycle, I know where my door is. But as a result of this, we, you know, we stop asking questions. We lose the curiosity that a child has, because we kind of think we know the answers to things. And what makes good science, or I'd say, you know, like a good scientist, is anyone who just retains the naivete, the youthful, you know, acting like a child their entire lives and saying, wait a minute, how do we know that?

Speaker 1: Yeah. So I like that. I like that a lot. And those moments where it hits me, which are, well, I'll say they're rare, like, it does hit me occasionally, but it's always wonderful when it does, and I ask a question that I just didn't even think to ask before, and then going down that rabbit hole can be really informative and entertaining. Even if you don't arrive at an answer that you can fully apply to whatever the question was, it's the journey along the way that can be really fascinating. It's great because, like, I have discussions with different kinds of people. So I've had discussions with scientists and I've had discussions with engineers, and it can be fun to see the different approaches people take, because engineers tend to see the world as a series of problems and they're always thinking about solutions, and scientists tend to look at the world and just ask questions about the very nature of it, and how they can have a better understanding of it, and how does that fit within our current understanding? Does it fit, or does it mean that we have to actually adjust our current understanding? And I learned so much by talking to these two different groups of people, and it's always a joy.
So I've had discussions with scientists and I've 609 00:32:30,080 --> 00:32:33,520 Speaker 1: had discussions with engineers, and it can be fun to 610 00:32:33,520 --> 00:32:36,480 Speaker 1: see the different approaches people take, because engineers tend to 611 00:32:36,480 --> 00:32:38,480 Speaker 1: see the world as a series of problems and they're 612 00:32:38,480 --> 00:32:42,000 Speaker 1: always thinking about solutions, and scientists tend to look at 613 00:32:42,040 --> 00:32:44,960 Speaker 1: the world and just ask questions about the very nature 614 00:32:45,000 --> 00:32:47,400 Speaker 1: of it and how they can have a better understanding 615 00:32:47,440 --> 00:32:50,400 Speaker 1: of it, and how does that fit within our current understanding? 616 00:32:51,000 --> 00:32:52,960 Speaker 1: Does it fit, or does it mean that we have 617 00:32:53,040 --> 00:32:57,200 Speaker 1: to actually adjust our current understanding? And I learned so 618 00:32:57,400 --> 00:33:00,440 Speaker 1: much by talking to these two different groups of people, 619 00:33:00,680 --> 00:33:02,600 Speaker 1: and it's always a joy. 620 00:33:02,960 --> 00:33:05,440 Speaker 3: Yeah. And you know, one of the things with neuroscience 621 00:33:05,480 --> 00:33:08,960 Speaker 3: in particular is it's fundamentally the question of how we 622 00:33:09,600 --> 00:33:12,400 Speaker 3: perceive reality at all. So, just as an example, you 623 00:33:12,440 --> 00:33:13,920 Speaker 3: may know this, but I've spent a big chunk of 624 00:33:13,960 --> 00:33:17,760 Speaker 3: my career studying the perception of time. And you know, 625 00:33:17,800 --> 00:33:19,840 Speaker 3: this is one of those things that just seems obvious, like, oh, yeah, 626 00:33:19,880 --> 00:33:21,880 Speaker 3: five minutes passed, no problem, you know, or this happened 627 00:33:21,920 --> 00:33:24,040 Speaker 3: before that, or, you know, this is blinking at a 628 00:33:24,080 --> 00:33:26,840 Speaker 3: particular rate or whatever, like you don't even think to 629 00:33:26,920 --> 00:33:30,280 Speaker 3: question it. But you know, I've spent, whatever, twenty-five 630 00:33:30,440 --> 00:33:35,440 Speaker 3: years studying this really hard, and just the deeper you 631 00:33:35,480 --> 00:33:38,200 Speaker 3: reach down into this system, the more it sometimes feels like, 632 00:33:38,560 --> 00:33:43,720 Speaker 3: oh my gosh, it's so weird, you know. Or, you know, 633 00:33:43,800 --> 00:33:47,120 Speaker 3: we mentioned consciousness before, like the colors. Colors don't exist 634 00:33:47,120 --> 00:33:50,000 Speaker 3: in the outside world, you know, there's just electromagnetic radiation 635 00:33:50,040 --> 00:33:53,120 Speaker 3: of different wavelengths, and your brain assigns what we call 636 00:33:53,160 --> 00:33:56,520 Speaker 3: qualia to it, so we say, oh, that's red, that's blue, 637 00:33:56,560 --> 00:33:59,040 Speaker 3: that's green. But the colors don't exist. It's all 638 00:33:59,280 --> 00:34:02,320 Speaker 3: in your head, you know. And sound doesn't exist as such. 639 00:34:02,360 --> 00:34:06,560 Speaker 3: It's just air compression waves. So the world outside your 640 00:34:06,600 --> 00:34:08,920 Speaker 3: head, you can't even really picture it. I mean, it's 641 00:34:08,960 --> 00:34:12,680 Speaker 3: just electromagnetic radiation and air compression waves and whatever.
642 00:34:12,760 --> 00:34:17,360 Speaker 3: But we experience it as this colorful, cacophonous, you 643 00:34:17,400 --> 00:34:22,279 Speaker 3: know, place. So anyway, the more you reach down into 644 00:34:22,320 --> 00:34:26,120 Speaker 3: this stuff, sometimes it feels like, oh my gosh, 645 00:34:26,360 --> 00:34:28,799 Speaker 3: it's not even clear what's true or not true. And then, 646 00:34:28,840 --> 00:34:32,920 Speaker 3: of course, with AI having had this explosion the 647 00:34:32,960 --> 00:34:35,919 Speaker 3: way it has recently, you know, in Silicon Valley, there's 648 00:34:35,960 --> 00:34:39,719 Speaker 3: been this real shift from people saying they're atheists to 649 00:34:39,840 --> 00:34:42,600 Speaker 3: being creationists. Creationists meaning, like, you know, we're in an AI 650 00:34:42,680 --> 00:34:48,400 Speaker 3: simulation by who the heck knows. And, you know, suddenly 651 00:34:48,520 --> 00:34:50,440 Speaker 3: that idea, you know, we've all talked about it 652 00:34:50,440 --> 00:34:52,000 Speaker 3: our whole careers, hey, how do we know 653 00:34:52,000 --> 00:34:54,000 Speaker 3: if we're in a simulation, and so on. But all 654 00:34:54,040 --> 00:34:55,759 Speaker 3: of a sudden, it seems like, whoa, that's not as 655 00:34:55,800 --> 00:34:57,160 Speaker 3: crazy as it once seemed. 656 00:34:57,440 --> 00:35:00,719 Speaker 1: Yeah. David and I have got a little more to talk about, 657 00:35:00,800 --> 00:35:03,280 Speaker 1: but before we can get to that, let's take another 658 00:35:03,400 --> 00:35:16,520 Speaker 1: quick break. If it is possible to build a system 659 00:35:17,200 --> 00:35:22,160 Speaker 1: that is capable of creating a synthetic reality where the 660 00:35:22,200 --> 00:35:25,520 Speaker 1: inhabitants of that synthetic reality would assume it is real, 661 00:35:25,640 --> 00:35:28,839 Speaker 1: if that is in fact possible, then the chances are 662 00:35:28,920 --> 00:35:32,839 Speaker 1: that some civilization at some point has done that, and 663 00:35:32,880 --> 00:35:35,600 Speaker 1: it is incredibly egotistical to suggest that we are the 664 00:35:35,640 --> 00:35:39,920 Speaker 1: civilization that will eventually get to that point. The more you 665 00:35:39,960 --> 00:35:42,440 Speaker 1: look at it, as you're taking 666 00:35:42,480 --> 00:35:45,040 Speaker 1: all these things into consideration, 667 00:35:45,120 --> 00:35:48,200 Speaker 1: the odds are diminishingly small that we're in a real 668 00:35:48,239 --> 00:35:49,719 Speaker 1: reality rather than a simulated one. 669 00:35:50,040 --> 00:35:52,319 Speaker 3: Exactly, the odds are that we're in a simulated reality. Yeah, 670 00:35:52,320 --> 00:35:54,960 Speaker 3: this is Nick Bostrom's argument. So the whole notion of 671 00:35:54,960 --> 00:35:56,880 Speaker 3: what is reality, and why do 672 00:35:56,880 --> 00:35:58,880 Speaker 3: you experience things the way you do? I feel like 673 00:35:58,960 --> 00:36:01,400 Speaker 3: that has complexified even from the time I 674 00:36:01,520 --> 00:36:04,040 Speaker 3: was, whatever, from the time I was a student till now, 675 00:36:04,680 --> 00:36:08,600 Speaker 3: just trying to understand what reality even is, a basic question.
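The exchange above compresses Nick Bostrom's simulation argument, which is at bottom a counting argument: if synthetic realities can be built at all and civilizations build many of them, simulated observers end up vastly outnumbering observers in the one base reality. Here is a minimal sketch of that counting logic; the numbers are purely illustrative assumptions, not anything from the conversation.

```python
# Illustrative sketch of the counting logic behind the simulation argument.
# All numbers are assumptions for demonstration, not claims from the episode.

def probability_base_reality(num_simulations: int) -> float:
    """Chance that a randomly chosen observer lives in base reality, assuming
    each simulated reality holds roughly as many observers as the base one."""
    realities = 1 + num_simulations  # one base reality plus its simulations
    return 1 / realities

for n in (0, 1, 10, 1_000_000):
    print(f"{n:>9,} simulations -> P(base reality) = {probability_base_reality(n):.7f}")

# With a million simulations, any given observer has roughly a one-in-a-million
# chance of being in base reality, which is the "it's egotistical to assume
# we're the real one" intuition voiced above.
```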
676 00:36:08,800 --> 00:36:11,560 Speaker 3: So it's interesting, because the engineers have a simpler job 677 00:36:11,560 --> 00:36:13,040 Speaker 3: in a sense, which is, okay, if I launch this 678 00:36:13,080 --> 00:36:16,200 Speaker 3: cannonball and it doesn't land there, great, I can, you know, refine 679 00:36:16,320 --> 00:36:20,160 Speaker 3: and change the angle, and terrific. But the scientists are 680 00:36:20,160 --> 00:36:23,040 Speaker 3: facing really, really tough questions. 681 00:36:23,120 --> 00:36:26,520 Speaker 1: Yeah, I would urge my listeners not to think about 682 00:36:26,560 --> 00:36:29,319 Speaker 1: it for too long or too hard, or you'll find 683 00:36:29,320 --> 00:36:30,360 Speaker 1: yourself in a spiral. 684 00:36:30,640 --> 00:36:32,640 Speaker 3: I've always found it interesting, by the way, the bounce back, 685 00:36:32,680 --> 00:36:35,840 Speaker 3: which is to say, you know, no matter how deeply 686 00:36:35,920 --> 00:36:40,480 Speaker 3: you study something about reality or colors or whatever, you know, 687 00:36:40,680 --> 00:36:42,960 Speaker 3: you stand up from your desk where you're reading this 688 00:36:43,000 --> 00:36:45,960 Speaker 3: book or whatever, and you go out, and then suddenly think, oh, 689 00:36:46,040 --> 00:36:48,920 Speaker 3: that honeysuckle smells awesome, it puts me in such a good mood. 690 00:36:48,960 --> 00:36:51,919 Speaker 3: And oh, this sky is so blue, it's beautiful. Oh, 691 00:36:51,960 --> 00:36:55,200 Speaker 3: and here's my beautiful wife and I feel so happy 692 00:36:55,239 --> 00:36:57,760 Speaker 3: to see her, and here are my children, I'm so happy 693 00:36:57,760 --> 00:36:59,839 Speaker 3: to, you know... all the stuff that you just 694 00:37:00,000 --> 00:37:03,560 Speaker 3: concluded was all illusory. You know, you can't help but 695 00:37:04,080 --> 00:37:04,960 Speaker 3: experience that. 696 00:37:05,200 --> 00:37:05,480 Speaker 2: Yeah. 697 00:37:05,680 --> 00:37:08,200 Speaker 3: In other words, if I explained to you, if I 698 00:37:08,280 --> 00:37:10,560 Speaker 3: had a way of diagramming and showing with equations and so 699 00:37:10,640 --> 00:37:13,600 Speaker 3: on, the whole reason why you like, you know, chocolate 700 00:37:13,640 --> 00:37:16,719 Speaker 3: ice cream, it would actually make no difference in your 701 00:37:16,760 --> 00:37:18,920 Speaker 3: life to your enjoyment of chocolate ice cream. You would 702 00:37:19,040 --> 00:37:21,200 Speaker 3: say, wow, I can't believe it's all mechanical, 703 00:37:21,239 --> 00:37:23,719 Speaker 3: and here's dopamine and serotonin and this. But then you eat it, 704 00:37:23,760 --> 00:37:25,240 Speaker 3: and you're like, wow, that sure is delicious. 705 00:37:25,280 --> 00:37:25,600 Speaker 2: Yeah. 706 00:37:25,680 --> 00:37:28,880 Speaker 1: Yeah, yeah, the knowing does not diminish the experience. 707 00:37:29,160 --> 00:37:29,759 Speaker 2: Yeah. 708 00:37:30,000 --> 00:37:32,600 Speaker 1: I get very pragmatic about it. I think if it's 709 00:37:32,600 --> 00:37:36,080 Speaker 1: a synthetic reality, if what we're in is a simulation, 710 00:37:36,719 --> 00:37:39,960 Speaker 1: it makes no difference to my day-to-day existence. 711 00:37:40,080 --> 00:37:43,799 Speaker 1: Like, I still want to make sure... like, my motto is, 712 00:37:44,280 --> 00:37:46,799 Speaker 1: I want to leave things better than they were when 713 00:37:46,840 --> 00:37:48,640 Speaker 1: I got here.
If I can do that at 714 00:37:48,960 --> 00:37:54,040 Speaker 1: whatever level I can, whether the people are simulated 715 00:37:54,160 --> 00:37:57,239 Speaker 1: or perhaps there's some outer reality where people are looking 716 00:37:57,280 --> 00:37:57,520 Speaker 1: at it and 717 00:37:57,520 --> 00:37:59,880 Speaker 2: saying, like, he did his best. He didn't quite 718 00:38:00,080 --> 00:38:02,560 Speaker 2: get it right, but he tried really hard. Well, I'm 719 00:38:02,560 --> 00:38:04,359 Speaker 2: going to consider that a win in the long run. 720 00:38:05,120 --> 00:38:06,320 Speaker 3: Excellent. Excellent. 721 00:38:06,640 --> 00:38:09,440 Speaker 1: Yeah, and I wish I could keep you here to 722 00:38:09,480 --> 00:38:12,319 Speaker 1: talk about all of the different topics you've covered on 723 00:38:12,360 --> 00:38:15,280 Speaker 1: Inner Cosmos, but obviously there's a whole podcast that exists 724 00:38:15,320 --> 00:38:19,480 Speaker 1: for that very purpose, and it would be rather ridiculous of 725 00:38:19,480 --> 00:38:21,520 Speaker 1: me to have you repeat it all. I mean, there 726 00:38:21,560 --> 00:38:23,799 Speaker 1: are topics that you talk about that I used to 727 00:38:23,800 --> 00:38:26,800 Speaker 1: discuss on a show I did called Forward Thinking, about 728 00:38:26,920 --> 00:38:30,440 Speaker 1: things like the concept that some people have bandied about, 729 00:38:30,480 --> 00:38:34,480 Speaker 1: about the idea of uploading yourself into some sort 730 00:38:34,480 --> 00:38:38,319 Speaker 1: of computer system in order to maintain some semblance of 731 00:38:38,320 --> 00:38:41,799 Speaker 1: digital immortality. I don't know how you specifically feel about that. 732 00:38:41,880 --> 00:38:44,520 Speaker 1: I've always felt that this kind of is in the 733 00:38:44,560 --> 00:38:49,320 Speaker 1: realm of techno billionaires who are absolutely terrified at the 734 00:38:49,400 --> 00:38:52,719 Speaker 1: thought of ceasing to exist. But I'm curious about your 735 00:38:52,760 --> 00:38:53,359 Speaker 1: own take. 736 00:38:54,080 --> 00:38:56,239 Speaker 3: It's not just techno billionaires, they're just the ones who can do something about it, 737 00:38:56,239 --> 00:38:59,320 Speaker 3: but some fraction of the population really cares about dying. 738 00:39:00,480 --> 00:39:02,880 Speaker 3: They're really horrified by that. I just happen to be 739 00:39:02,920 --> 00:39:04,640 Speaker 3: at the other end of the spectrum, where I really 740 00:39:04,680 --> 00:39:08,960 Speaker 3: don't care. It's just, I mean, I just understand mortality, 741 00:39:09,040 --> 00:39:10,520 Speaker 3: and, you know, I know that I'll die at some 742 00:39:10,560 --> 00:39:13,520 Speaker 3: point and that's that. But some people really care about 743 00:39:13,560 --> 00:39:17,160 Speaker 3: figuring out, look, could I, for example, freeze my whole 744 00:39:17,200 --> 00:39:20,359 Speaker 3: body or just freeze my head? And I'm actually doing 745 00:39:20,440 --> 00:39:23,479 Speaker 3: another episode on this coming up. Okay, here's a question 746 00:39:23,520 --> 00:39:26,839 Speaker 3: for you. The first guy to freeze himself in the 747 00:39:26,880 --> 00:39:30,000 Speaker 3: hope of getting rebooted in the future, you know, cut 748 00:39:30,000 --> 00:39:32,040 Speaker 3: to five hundred years from now when people know, ah, 749 00:39:32,080 --> 00:39:36,000 Speaker 3: here's how we can revivify a frozen body.
What year 750 00:39:36,160 --> 00:39:39,240 Speaker 3: was that guy born, the first person who was frozen 751 00:39:39,280 --> 00:39:39,560 Speaker 3: like that? 752 00:39:39,640 --> 00:39:44,720 Speaker 1: Gosh, oh, that's a great question. Okay, so I'm gonna 753 00:39:45,040 --> 00:39:47,560 Speaker 1: just give a shot in the dark. I'm gonna say 754 00:39:47,600 --> 00:39:48,600 Speaker 1: nineteen twenty. 755 00:39:48,880 --> 00:39:52,080 Speaker 3: Okay, eighteen ninety-eight is the first guy, and he 756 00:39:52,120 --> 00:39:55,439 Speaker 3: froze himself in the late sixties. Yeah, and I mean, 757 00:39:55,440 --> 00:39:57,160 Speaker 3: people have been thinking about this kind of thing 758 00:39:57,200 --> 00:39:59,400 Speaker 3: for a long time. Or, hey, could I freeze myself 759 00:39:59,440 --> 00:40:01,920 Speaker 3: and do a Hail Mary pass into the future and 760 00:40:02,080 --> 00:40:04,359 Speaker 3: just hope that people then will know a lot more 761 00:40:04,360 --> 00:40:06,200 Speaker 3: than we know now and they'll figure it out. Because, you know, 762 00:40:06,280 --> 00:40:08,759 Speaker 3: you can revivify the body of someone who's, let's say, 763 00:40:08,760 --> 00:40:12,080 Speaker 3: frozen in a lake. You know, they've fallen through the ice, 764 00:40:12,160 --> 00:40:14,960 Speaker 3: they freeze, their heart stops, their vitals have all stopped, 765 00:40:14,960 --> 00:40:17,320 Speaker 3: and, you know, you can warm them up slowly and 766 00:40:17,360 --> 00:40:20,600 Speaker 3: get them going again. And so we know this is possible. 767 00:40:20,719 --> 00:40:24,240 Speaker 3: And yeah, so a lot of people are really interested 768 00:40:24,239 --> 00:40:25,480 Speaker 3: in this. There are two ways to do it. One is 769 00:40:25,480 --> 00:40:27,400 Speaker 3: you freeze the body. The other one is this issue 770 00:40:27,440 --> 00:40:32,439 Speaker 3: of could you somehow extract all the interesting structure of 771 00:40:32,560 --> 00:40:34,720 Speaker 3: what's going on in somebody's brain, you know, the whole 772 00:40:34,800 --> 00:40:39,080 Speaker 3: configuration of their network that represents their life. Could you 773 00:40:39,120 --> 00:40:42,440 Speaker 3: reconstruct that digitally? It would save a lot of space 774 00:40:42,480 --> 00:40:44,400 Speaker 3: and so on. And then you, you know, 775 00:40:44,480 --> 00:40:47,360 Speaker 3: essentially run the person in a simulation. So that's 776 00:40:47,480 --> 00:40:50,399 Speaker 3: where we might already be. But in any case, we're 777 00:40:50,440 --> 00:40:53,600 Speaker 3: trying to do a simulation inside a simulation of saying, 778 00:40:54,640 --> 00:40:57,120 Speaker 3: you know, could we reboot you in some 779 00:40:57,200 --> 00:41:00,480 Speaker 3: other world? And the thing is, about the computational capacity 780 00:41:01,040 --> 00:41:04,480 Speaker 3: of the planet, the estimate is still that to 781 00:41:04,560 --> 00:41:07,120 Speaker 3: capture all the information in a brain, all the connections 782 00:41:07,160 --> 00:41:09,680 Speaker 3: and exactly the three-D structure, would be about a 783 00:41:09,760 --> 00:41:14,680 Speaker 3: zettabyte of information, which is about four times more than 784 00:41:14,680 --> 00:41:17,160 Speaker 3: the computational capacity of the entire planet right now. So 785 00:41:17,520 --> 00:41:20,480 Speaker 3: it's an enormous thing.
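For a sense of where a figure like "about a zettabyte" could come from, here is a rough back-of-the-envelope sketch; the brain volume, voxel resolution, and bytes-per-voxel below are assumptions chosen for illustration, not numbers from the episode.

```python
# Back-of-the-envelope: bytes needed to capture a brain's full 3D structure.
# All inputs are rough illustrative assumptions, not figures from the episode.

BRAIN_VOLUME_MM3 = 1.2e6  # human brain volume, roughly 1.2 million cubic millimetres
VOXEL_SIDE_NM = 10        # record structure at about 10 nanometre resolution per side
BYTES_PER_VOXEL = 1       # assume one byte stored per voxel

nm_per_mm = 1e6                                   # 1 mm = 1,000,000 nm
voxels_per_mm3 = (nm_per_mm / VOXEL_SIDE_NM) ** 3
total_bytes = BRAIN_VOLUME_MM3 * voxels_per_mm3 * BYTES_PER_VOXEL

print(f"~{total_bytes:.1e} bytes, about {total_bytes / 1e21:.1f} zettabytes")
# -> ~1.2e21 bytes, on the order of the zettabyte figure mentioned above.
```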
But boy, you know, in 786 00:41:20,640 --> 00:41:22,880 Speaker 3: fifty years we won't even care. You'll be wearing a 787 00:41:23,080 --> 00:41:24,560 Speaker 3: zettabyte on your wrist or something. 788 00:41:24,640 --> 00:41:28,279 Speaker 1: Well, we know, we know in the short term 789 00:41:28,280 --> 00:41:31,399 Speaker 1: who the people will be who would go for that, 790 00:41:31,440 --> 00:41:33,279 Speaker 1: because it would be the people who could afford to 791 00:41:33,320 --> 00:41:33,880 Speaker 1: set aside 792 00:41:33,960 --> 00:41:34,640 Speaker 2: that zettabyte of 793 00:41:35,239 --> 00:41:38,920 Speaker 1: of processing power and computer data storage. 794 00:41:39,080 --> 00:41:41,919 Speaker 3: But what's interesting is, if you're the tech billionaire who 795 00:41:41,920 --> 00:41:44,560 Speaker 3: gets yourself uploaded into some simulation, you are now, you 796 00:41:45,320 --> 00:41:49,120 Speaker 3: know, you're now a slave to the programmers. Yes, right. 797 00:41:49,160 --> 00:41:52,080 Speaker 3: And so the programmers might say, look, we're gonna do 798 00:41:52,200 --> 00:41:54,279 Speaker 3: something terrible to you here, or whatever, you know, 799 00:41:54,280 --> 00:41:56,040 Speaker 3: you need to send some money to us, 800 00:41:56,040 --> 00:41:59,000 Speaker 3: because we're now controlling your entire world. Yeah, I don't know, 801 00:41:59,000 --> 00:41:59,959 Speaker 3: it wouldn't be such a great place to be. 802 00:42:00,719 --> 00:42:03,200 Speaker 1: No, I think it ends up being myopic, which is 803 00:42:03,280 --> 00:42:06,319 Speaker 1: sort of how I felt at the very start of 804 00:42:06,320 --> 00:42:09,879 Speaker 1: this conversation too, about that whole concept. Like, I do 805 00:42:09,920 --> 00:42:14,080 Speaker 1: think it's an intense fear of mortality that ends up 806 00:42:14,160 --> 00:42:16,640 Speaker 1: driving this. I think, actually, if I'm being honest, I 807 00:42:16,640 --> 00:42:18,920 Speaker 1: think a lot of futurists kind of fall into that 808 00:42:19,000 --> 00:42:23,239 Speaker 1: category too, specifically the ones who have talked about 809 00:42:23,320 --> 00:42:27,000 Speaker 1: concepts like the singularity and transhumanism and things, elements of 810 00:42:27,040 --> 00:42:30,800 Speaker 1: which we're seeing play out to some extent, but certainly 811 00:42:30,840 --> 00:42:35,759 Speaker 1: not in the form or scope that has been talked 812 00:42:35,760 --> 00:42:38,279 Speaker 1: about for a couple of decades. And it's not to 813 00:42:38,320 --> 00:42:40,360 Speaker 1: say that we're not going to get to a point 814 00:42:40,400 --> 00:42:44,799 Speaker 1: where remarkable change is happening at a scale where we 815 00:42:44,840 --> 00:42:47,080 Speaker 1: can't even talk about the present. Maybe that does happen, 816 00:42:47,560 --> 00:42:51,160 Speaker 1: but it just seems to me that there's this underlying 817 00:42:51,239 --> 00:42:54,680 Speaker 1: theme in a lot of those writings that seems 818 00:42:54,719 --> 00:42:57,040 Speaker 1: to suggest a fear of mortality.
This could just be 819 00:42:57,160 --> 00:43:00,719 Speaker 1: armchair psychology on my part, and maybe it's just a theory I 820 00:43:00,760 --> 00:43:04,520 Speaker 1: have, which I totally will own, but it just keeps 821 00:43:04,520 --> 00:43:07,120 Speaker 1: coming back to me, because I spent five years doing 822 00:43:07,320 --> 00:43:12,560 Speaker 1: a show kind of exploring futurists and futurist thinking and 823 00:43:12,640 --> 00:43:15,439 Speaker 1: that sort of thing, and those were the things 824 00:43:15,440 --> 00:43:19,080 Speaker 1: that sort of... they weren't even subtext, but 825 00:43:19,120 --> 00:43:22,840 Speaker 1: they seemed to be kind of an underlying motivating factor. 826 00:43:23,239 --> 00:43:26,000 Speaker 3: You know, I'm not sure I see that, only because 827 00:43:26,080 --> 00:43:28,480 Speaker 3: I happen to not care about the mortality issue. Yeah, 828 00:43:28,520 --> 00:43:30,640 Speaker 3: and yet I'm very interested in the future. I know, 829 00:43:30,680 --> 00:43:32,680 Speaker 3: I spend a lot of my time thinking about futurism 830 00:43:32,719 --> 00:43:36,160 Speaker 3: and so on. So I don't think they 831 00:43:36,239 --> 00:43:41,200 Speaker 3: necessarily go together. Being thoughtful and creative about what's happening 832 00:43:41,239 --> 00:43:44,120 Speaker 3: next and where things are going feels to me like, 833 00:43:44,200 --> 00:43:46,320 Speaker 3: you know, this is the best part about human brains, 834 00:43:46,400 --> 00:43:49,840 Speaker 3: is our capacity to do this, to say, wow, look, 835 00:43:50,239 --> 00:43:53,640 Speaker 3: I'm going to simulate forward. Given all the 836 00:43:53,680 --> 00:43:55,560 Speaker 3: stuff that I know, the future is probably not going 837 00:43:55,640 --> 00:43:58,560 Speaker 3: to be just like the past. Instead, you're gonna have 838 00:43:58,600 --> 00:44:01,200 Speaker 3: these things come together and then this other technology merge 839 00:44:01,239 --> 00:44:03,040 Speaker 3: with that, and then, wow, look at all these things 840 00:44:03,040 --> 00:44:06,000 Speaker 3: it leads to. I mean, we're always wrong, of course, right, sure, 841 00:44:06,239 --> 00:44:08,880 Speaker 3: terribly wrong. But if you look at the world's fairs 842 00:44:09,840 --> 00:44:14,719 Speaker 3: through the decades, like the nineteen sixty World's Fair had displays 843 00:44:14,760 --> 00:44:18,680 Speaker 3: on, like, underwater hotels or using lasers to cut down 844 00:44:18,719 --> 00:44:21,200 Speaker 3: trees or things like that. They had all kinds of 845 00:44:21,239 --> 00:44:25,680 Speaker 3: amazing stuff, but nobody even thought of the Internet. I 846 00:44:25,680 --> 00:44:28,440 Speaker 3: mean, really, one of the biggest changes 847 00:44:28,440 --> 00:44:30,400 Speaker 3: of all, and no one thought of that. And you get the 848 00:44:30,400 --> 00:44:33,799 Speaker 3: most creative thinkers together, and yet they just miss 849 00:44:33,920 --> 00:44:37,120 Speaker 3: what are the really big changes, like cell phones or something. 850 00:44:37,800 --> 00:44:38,800 Speaker 2: Yeah, yeah, I mean. 851 00:44:38,680 --> 00:44:41,879 Speaker 1: Well, it's like before the transistor, no one could think 852 00:44:41,920 --> 00:44:45,480 Speaker 1: about an era of miniaturization.
Right. Like, if you were 853 00:44:45,480 --> 00:44:48,480 Speaker 1: around before the transistor, you would never have thought that a 854 00:44:48,520 --> 00:44:50,879 Speaker 1: computer would be something that someone could own in their home, 855 00:44:50,960 --> 00:44:54,480 Speaker 1: let alone carry in their pocket. And there'd be no way, like, 856 00:44:54,520 --> 00:44:58,120 Speaker 1: you would have no frame of reference to say that, because 857 00:44:58,160 --> 00:45:01,200 Speaker 1: in your experience, the way computers got more powerful was 858 00:45:01,239 --> 00:45:04,800 Speaker 1: that they were larger, right? So, like, if your computer 859 00:45:04,920 --> 00:45:09,400 Speaker 1: took up an entire floor underneath Stanford, 860 00:45:09,840 --> 00:45:12,080 Speaker 1: that was a really good computer. There's no way you 861 00:45:12,080 --> 00:45:14,720 Speaker 1: could have one of those in your home. And I'm 862 00:45:14,880 --> 00:45:16,759 Speaker 1: very much the same way. I'll think back occasionally and 863 00:45:16,840 --> 00:45:19,160 Speaker 1: I'll go, wow, those knuckleheads got it all wrong, and 864 00:45:19,160 --> 00:45:20,560 Speaker 1: then I think, yeah, but if I were back then, I 865 00:45:20,560 --> 00:45:22,879 Speaker 1: would have gotten it wrong too, because I wouldn't have had 866 00:45:22,960 --> 00:45:26,840 Speaker 1: the knowledge that the transistor was right around the corner. Exactly. 867 00:45:26,920 --> 00:45:28,160 Speaker 3: By the way, you know, when you said that, 868 00:45:28,239 --> 00:45:31,000 Speaker 3: just suddenly a thought crossed my mind, which is, it's 869 00:45:31,040 --> 00:45:33,600 Speaker 3: funny that we still call this a PC, a personal 870 00:45:33,680 --> 00:45:37,120 Speaker 3: computer, to distinguish it from, you know... like, wow, 871 00:45:37,120 --> 00:45:38,520 Speaker 3: I can take one home myself. 872 00:45:39,320 --> 00:45:41,360 Speaker 1: Well, I also grew up in the seventies and eighties, 873 00:45:41,400 --> 00:45:44,040 Speaker 1: so for me, like, it's cemented in my head because 874 00:45:44,040 --> 00:45:47,279 Speaker 1: that's how I was exposed to it. And as 875 00:45:47,280 --> 00:45:49,640 Speaker 1: I'm sure you're aware, sometimes we have a tendency to 876 00:45:49,719 --> 00:45:53,680 Speaker 1: kind of arrest definitions in our head based upon when 877 00:45:53,680 --> 00:45:58,200 Speaker 1: we encountered the thing for the first time. Absolutely. Well, David, 878 00:45:58,239 --> 00:45:59,960 Speaker 1: before we go, can you tell us a little bit 879 00:46:00,040 --> 00:46:02,400 Speaker 1: more about Inner Cosmos for those who, I mean, obviously 880 00:46:02,440 --> 00:46:06,560 Speaker 1: you cover tons of different topics related to the brain, 881 00:46:06,719 --> 00:46:09,719 Speaker 1: but give our listeners a little insight into what the 882 00:46:09,719 --> 00:46:12,560 Speaker 1: show's all about, because I want Tech Stuff listeners to 883 00:46:12,600 --> 00:46:13,680 Speaker 1: definitely go and check it out. 884 00:46:14,400 --> 00:46:15,200 Speaker 3: Thank you. 885 00:46:15,200 --> 00:46:15,400 Speaker 2: You know. 886 00:46:15,600 --> 00:46:18,279 Speaker 3: Part of what's really hard about doing the podcast, but 887 00:46:18,360 --> 00:46:21,719 Speaker 3: so rewarding, is each week is a monologue. I just, 888 00:46:21,760 --> 00:46:23,799 Speaker 3: you know, for forty-five minutes or an hour, I 889 00:46:23,840 --> 00:46:27,239 Speaker 3: talk about some topic.
But it's like I'm doing storytelling. 890 00:46:27,280 --> 00:46:28,920 Speaker 3: What I'm doing is I'm pulling together threads of 891 00:46:29,239 --> 00:46:33,480 Speaker 3: history and culture and science and extrapolation into the future 892 00:46:33,880 --> 00:46:37,400 Speaker 3: to address questions of various sorts, like this question of, 893 00:46:37,440 --> 00:46:39,960 Speaker 3: you know, could we ever upload our brains? Or will 894 00:46:40,000 --> 00:46:42,479 Speaker 3: we ever do mind reading? There's all kinds of stuff 895 00:46:42,480 --> 00:46:45,600 Speaker 3: in the media about, you know, reading minds. Or sometimes 896 00:46:45,600 --> 00:46:48,200 Speaker 3: I address mysteries, like, you know, why do some people 897 00:46:48,239 --> 00:46:52,759 Speaker 3: think the northern lights make noise? And a quick hint on 898 00:46:52,800 --> 00:46:55,719 Speaker 3: that is it has to do with synesthesia, which is 899 00:46:55,760 --> 00:46:58,400 Speaker 3: where someone has a blending of the senses. Or I 900 00:46:58,440 --> 00:47:02,719 Speaker 3: talk about various issues about our perception of time, like, 901 00:47:02,840 --> 00:47:05,600 Speaker 3: does time actually run in slow motion when you're in 902 00:47:05,640 --> 00:47:08,120 Speaker 3: fear for your life? This is something that happened to 903 00:47:08,120 --> 00:47:09,759 Speaker 3: me when I was a kid and almost died. I 904 00:47:09,760 --> 00:47:12,359 Speaker 3: fell off of a house under construction, and then 905 00:47:12,400 --> 00:47:14,520 Speaker 3: I ended up studying this as a neuroscientist; no one had 906 00:47:14,600 --> 00:47:18,120 Speaker 3: ever studied this question before, about this time slowdown. 907 00:47:18,640 --> 00:47:20,840 Speaker 3: And so it's all about things like this. And actually, 908 00:47:20,880 --> 00:47:23,200 Speaker 3: in the episode that's coming out next week, I'm talking 909 00:47:23,239 --> 00:47:26,040 Speaker 3: about the dress. You remember the dress? Was it, you 910 00:47:26,120 --> 00:47:28,840 Speaker 3: know, blue and black or white and gold? Yeah. Like, 911 00:47:28,880 --> 00:47:32,920 Speaker 3: I'm really unpacking that and other memes that you 912 00:47:32,960 --> 00:47:35,719 Speaker 3: may remember, like, you know, this thing: does it sound 913 00:47:35,760 --> 00:47:38,520 Speaker 3: like Brainstorm or Green Needle? It's the same audio file 914 00:47:38,600 --> 00:47:41,200 Speaker 3: but can sound like either one. Anyway, I'm putting a 915 00:47:41,239 --> 00:47:43,439 Speaker 3: whole bunch of things together. What I realized, I think, 916 00:47:43,600 --> 00:47:46,200 Speaker 3: is that, to my knowledge, no one's ever sort of 917 00:47:46,239 --> 00:47:49,560 Speaker 3: put all these together and served it up in a 918 00:47:49,600 --> 00:47:54,319 Speaker 3: way where you really get something deeper out 919 00:47:54,360 --> 00:47:55,920 Speaker 3: of it, like, oh yeah, I saw that meme, I 920 00:47:55,960 --> 00:47:57,719 Speaker 3: saw that meme. But what does this all mean to us? 921 00:47:57,719 --> 00:47:59,640 Speaker 3: What does this mean for our brains, the way we 922 00:47:59,680 --> 00:48:02,200 Speaker 3: see the world? What does it mean for who we 923 00:48:02,239 --> 00:48:06,359 Speaker 3: are in terms of our neural networks? One theme, 924 00:48:06,719 --> 00:48:08,680 Speaker 3: if you've listened to several of the episodes, you might have 925 00:48:08,719 --> 00:48:11,240 Speaker 3: already been picking up on it:
I'm very interested 926 00:48:11,239 --> 00:48:13,280 Speaker 3: in the fact that we are essentially like a team 927 00:48:13,360 --> 00:48:17,200 Speaker 3: of rivals underneath the hood, meaning you're not just one thing. 928 00:48:17,239 --> 00:48:21,760 Speaker 3: You've got all these different networks running and they're often 929 00:48:21,840 --> 00:48:24,840 Speaker 3: in conflict. And so this is why humans are nuanced 930 00:48:24,880 --> 00:48:27,399 Speaker 3: and subtle and interesting. Because, you know, if I put 931 00:48:27,440 --> 00:48:30,160 Speaker 3: some chocolate chip cookies in front of you, part of 932 00:48:30,160 --> 00:48:32,600 Speaker 3: your brain says, hey, that's a good energy source, let's 933 00:48:32,600 --> 00:48:34,120 Speaker 3: eat it, and part of your brain says, no, don't eat it, 934 00:48:34,120 --> 00:48:35,640 Speaker 3: you'll get fat, and part of your brain says, okay, I'll 935 00:48:35,640 --> 00:48:37,160 Speaker 3: eat it, but I'll go to the gym tonight. And, 936 00:48:37,800 --> 00:48:40,160 Speaker 3: you know, you can start contradicting yourself and arguing with 937 00:48:40,200 --> 00:48:43,520 Speaker 3: yourself, and, like, who's talking to whom here? It's all you, 938 00:48:43,600 --> 00:48:46,759 Speaker 3: but it's different parts of you. And anyway, so all 939 00:48:46,800 --> 00:48:49,799 Speaker 3: this stuff is of great interest to me, and I 940 00:48:49,840 --> 00:48:55,920 Speaker 3: think is the material for storytelling and really diving into 941 00:48:55,960 --> 00:48:58,560 Speaker 3: this stuff. So I just love that people care about 942 00:48:58,600 --> 00:48:59,120 Speaker 3: this stuff. 943 00:48:59,320 --> 00:48:59,560 Speaker 2: Yeah. 944 00:49:00,080 --> 00:49:02,719 Speaker 1: To me, David, it's like... it reminds me of 945 00:49:02,800 --> 00:49:06,320 Speaker 1: conversations that, for some reason, I associate with when I was a teenager. 946 00:49:06,840 --> 00:49:10,480 Speaker 1: Because I was the nerdy teenager, I didn't, you know, 947 00:49:10,640 --> 00:49:12,680 Speaker 1: drink or anything. I would hang out with my friends 948 00:49:13,040 --> 00:49:16,560 Speaker 1: and we'd just start asking questions we did not know 949 00:49:16,600 --> 00:49:19,799 Speaker 1: the answers to, and, like, you get into all the, 950 00:49:20,000 --> 00:49:25,080 Speaker 1: like, the stereotypical ones, like, when I see red and 951 00:49:25,120 --> 00:49:27,600 Speaker 1: I experience the color red, how do I know that 952 00:49:27,640 --> 00:49:30,279 Speaker 1: when you're looking at something that's red it looks the 953 00:49:30,320 --> 00:49:32,920 Speaker 1: same to you as it does to me? Or maybe 954 00:49:32,960 --> 00:49:36,239 Speaker 1: your red looks like blue to me, but we've both 955 00:49:36,280 --> 00:49:39,479 Speaker 1: agreed that this thing is red. Therefore that's what red 956 00:49:39,520 --> 00:49:41,239 Speaker 1: is to me, it's what red is to you, but 957 00:49:41,320 --> 00:49:45,080 Speaker 1: our experiences are different. And you just get into this deep conversation. 958 00:49:45,880 --> 00:49:47,920 Speaker 3: Yeah, exactly. And by the way, this is just like 959 00:49:48,000 --> 00:49:50,560 Speaker 3: with your niece, who you said is eight years old. 960 00:49:51,000 --> 00:49:53,839 Speaker 3: I mean, this is the same thing. It's that as 961 00:49:53,880 --> 00:49:57,120 Speaker 3: we age, we forget to ask those questions. We just 962 00:49:57,280 --> 00:50:00,000 Speaker 3: somehow stop asking.
We think, oh, I don't know the answer, 963 00:50:00,120 --> 00:50:03,040 Speaker 3: and that's the end of it. And so, you know, 964 00:50:03,080 --> 00:50:05,359 Speaker 3: it's funny that you mentioned that, because I often think, 965 00:50:05,520 --> 00:50:08,560 Speaker 3: for example, you know, a few of my episodes have 966 00:50:08,560 --> 00:50:11,120 Speaker 3: been about visual illusions, and this one about the dress 967 00:50:11,160 --> 00:50:14,120 Speaker 3: and so on includes some issues about that. But I often 968 00:50:14,120 --> 00:50:16,960 Speaker 3: think that visual illusions are really interesting to ten-year-olds 969 00:50:16,960 --> 00:50:19,960 Speaker 3: and then to neuroscientists when they grow up. But 970 00:50:20,000 --> 00:50:22,239 Speaker 3: that's it. Like, everyone else sort of forgets about visual 971 00:50:22,280 --> 00:50:25,480 Speaker 3: illusions unless they're, you know, taking three seconds to appreciate 972 00:50:25,520 --> 00:50:28,760 Speaker 3: a meme on the internet. But, like, why does that happen? 973 00:50:29,560 --> 00:50:32,960 Speaker 3: It's an incredibly powerful inroad into what is happening in 974 00:50:32,960 --> 00:50:36,840 Speaker 3: the brain, to understand what's happening with an illusion. Anyway, 975 00:50:36,920 --> 00:50:39,080 Speaker 3: so what I'm trying to do with Inner Cosmos is 976 00:50:39,120 --> 00:50:44,440 Speaker 3: bring all these threads together so we can really understand ourselves, 977 00:50:44,520 --> 00:50:46,640 Speaker 3: understand who we are and what our reality is. 978 00:50:46,719 --> 00:50:47,120 Speaker 2: I love it. 979 00:50:47,160 --> 00:50:49,880 Speaker 1: I mean, it's a tall ask for a podcast, but 980 00:50:49,920 --> 00:50:51,640 Speaker 1: I think you're up to the challenge. 981 00:50:51,880 --> 00:50:52,279 Speaker 3: Thank you. 982 00:50:52,560 --> 00:50:52,799 Speaker 2: Yeah. 983 00:50:53,080 --> 00:50:55,919 Speaker 1: I highly recommend that y'all check it out. That's Inner 984 00:50:55,960 --> 00:50:58,160 Speaker 1: Cosmos with David Eagleman. It's one of those things that 985 00:50:58,200 --> 00:51:03,360 Speaker 1: I listen to just as a fan. Thank you. Terrific. Well, David, 986 00:51:03,400 --> 00:51:04,560 Speaker 1: thank you for coming on the show. 987 00:51:04,560 --> 00:51:05,560 Speaker 2: I really appreciate it. 988 00:51:06,239 --> 00:51:07,880 Speaker 3: Thank you for having me. It's such a pleasure to 989 00:51:07,920 --> 00:51:08,560 Speaker 3: be here and talk with you. 990 00:51:08,640 --> 00:51:11,200 Speaker 2: Jonathan. Fantastic. Guys, that's it. 991 00:51:11,760 --> 00:51:13,840 Speaker 1: Go check out the show and I'll talk to you 992 00:51:13,880 --> 00:51:23,920 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production. For 993 00:51:24,000 --> 00:51:28,839 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 994 00:51:28,960 --> 00:51:31,000 Speaker 1: or wherever you listen to your favorite shows.